Synthetic Aperture Radar (SAR) Basics#

What is Synthetic Aperture Radar?#

Active radar-based satellite imagery, better known as SAR (Synthetic Aperture Radar), is not a new technology, but it has gained popularity over the last 4 years with the launch of the Sentinel-1 satellites, which provide high-resolution SAR images of the Earth every few days. The technology is radically different from “traditional” satellite imagery, which uses optical sensors, making it quite difficult to use and understand.

However, the advantages of this technology are huge! SAR imagery can provide information even in cloudy weather or in the middle of the night, and it is extremely good at characterising materials and topography. It opens new avenues of innovation for agriculture that can propel satellite technology to the top of the must-have list of every digital agriculture company. It can provide soil composition information, soil moisture data, information about farming practices like tilling, and much more!

Let’s have a look at how it works.

How does SAR work?#

Radar 101#

First the basics: Radar is a detection system that uses radio waves to determine the range, angle, or velocity of objects. Active radar sensors send radio waves from a transmitter, the waves reflect off an object and return to the receiver, giving information about the object’s location and speed.

Radar detection

Credit: Wikipedia#

So SAR imagery is the signal received from the reflected radio waves, and in the case of Sentinel-1, the receiver is a satellite orbiting about 700 km above the ground. Pretty cool, right?

This explains why SAR imagery is not affected by cloud coverage or nighttime. The waves penetrate through the clouds, bounce off the ground, and travel back to the satellite. No light needed.
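The basic geometry is simple: the radar measures the round-trip travel time of each pulse, and since radio waves travel at the speed of light, that time converts directly to distance. A minimal sketch (the 4.67 ms delay is an illustrative value chosen to match a roughly 700 km slant range, not an actual Sentinel-1 parameter):

```python
# Range from echo delay: the radar measures the round-trip time of the
# pulse, so the one-way distance is c * t / 2.
C = 299_792_458  # speed of light, m/s

def slant_range(round_trip_seconds: float) -> float:
    """One-way distance from the radar to the target, in metres."""
    return C * round_trip_seconds / 2

# An echo returning after ~4.67 ms corresponds to a slant range
# of roughly 700 km, comparable to Sentinel-1's orbital altitude.
print(round(slant_range(4.67e-3) / 1000), "km")
```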


The second very important concept to understand with SAR is backscatter. It describes the way the reflected waves return to the receiver. While traditional optical imagery records wavelengths like green or near-infrared, SAR records backscatter, and depending on the surface the waves bounce off, it can vary greatly.

Interpreting this information is definitely a challenge, but it is also a great asset. SAR sensors are sensitive to moisture, which means it is possible to detect humidity variations in the soil from the backscatter signal received by the satellite. It also means the backscatter carries information about soil texture, so you should be able to tell the difference between a sandy soil and a silty one, right? Well, with the proper calibration, you can!
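Because backscatter values span several orders of magnitude between smooth and rough surfaces, they are usually converted to a logarithmic (decibel) scale before interpretation. A minimal sketch of that conversion (the example values are illustrative orders of magnitude, not measured data):

```python
import math

def sigma0_to_db(sigma0_linear: float) -> float:
    """Convert linear backscatter (sigma0) to decibels."""
    return 10 * math.log10(sigma0_linear)

# Illustrative values: calm water scatters the beam away from the sensor
# (low backscatter), while rough or vegetated surfaces return more energy.
print(sigma0_to_db(0.01))  # roughly -20 dB, typical of a smooth surface
print(sigma0_to_db(0.25))  # a stronger return, roughly -6 dB
```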

SAR polarization scattering

Credit: NASA#


The next logical questions are: what about crops in the field? Won't they affect the backscatter? And for moisture I need underground information, not just the surface. How is that handled?

To answer these questions, we need to introduce a third concept (it’s the last one, promise!): the type of radio wave band. The four bands most commonly used for SAR applications are referred to as P, L, C and X.
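Each band corresponds to a frequency range, and longer wavelengths penetrate deeper into vegetation and soil. A quick sketch converting frequency to wavelength; the band centre frequencies below are representative values within the standard IEEE band designations (only the 5.405 GHz C-band figure comes from this document):

```python
C = 299_792_458  # speed of light, m/s

def wavelength_cm(freq_ghz: float) -> float:
    """Wavelength in centimetres for a given frequency in GHz."""
    return C / (freq_ghz * 1e9) * 100

# Representative frequencies for the common SAR bands (P, L, C, X).
# Longer wavelengths (P, L) penetrate canopy and soil more deeply.
bands = {"P": 0.35, "L": 1.25, "C": 5.405, "X": 9.6}
for band, freq in bands.items():
    print(f"{band}-band: {freq} GHz -> {wavelength_cm(freq):.1f} cm")
```

For Sentinel-1's C-band at 5.405 GHz, this gives the ~5.55 cm wavelength quoted below.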

SAR wavelength penetration

Sentinel-1 SAR in SpaceSense’s library#

Sentinel-1 uses C-band radar (5.405 GHz frequency, or a 5.55 cm wavelength). Its primary objectives include mapping glaciers, ice sheets, and sea ice, as well as land surface topography. Important secondary objectives include land cover monitoring, snow cover mapping, surface soil moisture retrievals, and ocean wave science, among many others. You can read more on the specific details of the Sentinel-1 mission, and its objectives, here.

Besides radar frequency, polarization is also an important aspect of SAR data. The polarization, or orientation, of the radar wave can give us important information about the properties of the target. Applications using this polarimetric information include crop identification, soil moisture retrievals, geological mapping, shoreline detection, biomass estimation, ocean wave measurements, and much more. You can read about more polarization use cases here.

Radar waves can be polarized in either the horizontal or the vertical direction, indicated by the letters H and V respectively. If a sensor emits and receives a radar signal in a single polarization, the observation is noted VV or HH. Additionally, the radar wave can be emitted in one polarization and observed in the other, such as a wave emitted with vertical polarization but received with horizontal polarization. This type of observation is denoted VH (or HV in the inverse case).

Sentinel-1 provides single and dual polarization; however, as this analysis-ready data comes from level-1 Interferometric Wide swath (IW) products, only the VV and VH polarizations are retrieved. You can read more about S1 acquisition modes here.
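Having both VV and VH makes simple band combinations possible: the cross-polarized VH return is driven largely by volume scattering in vegetation, so the VH/VV ratio is a commonly used indicator of canopy structure. A minimal sketch (the input values are illustrative, not measured Sentinel-1 data):

```python
import math

def cross_pol_ratio_db(vh: float, vv: float) -> float:
    """VH/VV backscatter ratio in dB, from linear sigma0 values.

    Higher (less negative) values typically indicate more volume
    scattering, e.g. from a developed crop canopy.
    """
    return 10 * math.log10(vh / vv)

# Illustrative values: bare soil tends to have a lower VH/VV ratio
# than a field with dense vegetation.
print(cross_pol_ratio_db(0.002, 0.05))  # bare-soil-like pair
print(cross_pol_ratio_db(0.02, 0.08))   # vegetated-field-like pair
```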

Finally, the Local Incidence Angle, or LIA, is the angle between the radar beam and an imaginary line normal to the surface (i.e. taking the local topography into account). Here is an image illustrating this angle. It is important because different uses of SAR data may need to restrict which angles are used, or even normalize the radar backscatter based on this angle, as in Kaplan et al., 2021.
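One simple normalisation scheme scales the backscatter by the ratio of cosines between a chosen reference angle and the local incidence angle, so observations taken at different geometries become comparable. A hedged sketch, assuming a cosine correction and a hypothetical 40° reference angle (other schemes exist, and the exact method in Kaplan et al., 2021 may differ):

```python
import math

def normalize_backscatter(sigma0: float, lia_deg: float,
                          ref_deg: float = 40.0) -> float:
    """Cosine-correct linear backscatter to a reference incidence angle.

    lia_deg is the local incidence angle of the observation; ref_deg is
    an arbitrary reference angle (40 degrees here, purely illustrative).
    """
    return sigma0 * math.cos(math.radians(ref_deg)) / math.cos(math.radians(lia_deg))

# At the reference angle the value is unchanged; steeper observations
# (smaller LIA, stronger return) are scaled down toward the reference.
print(normalize_backscatter(0.1, 40.0))
print(normalize_backscatter(0.1, 20.0))
```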

SpaceSense SAR Processing Pipeline#

SpaceSense provides Sentinel-1 SAR analysis-ready data (ARD) by processing the original level-1 data. Processing starts from level-1 Interferometric Wide swath (IW) products.

Pre-processing steps are applied to this S1 data through the snappy Graph Processing Framework (GPF) tool. The processing steps and the GPF parameters used are given below.

  1. Border noise removal

    GPF “Remove-GRD-Border-Noise” process set to True, with the “borderLimit” set to 2000, and the “trimThreshold” set to 0.5

  2. Thermal noise removal

    GPF “removeThermalNoise” process set to True

  3. Radiometric calibration

    GPF “outputSigmaBand” set to True, and “outputImageScaleInDb” set to False

  4. Terrain correction

    GPF options of:

“demName” set to “SRTM 3Sec”

    “demResamplingMethod” set to “BILINEAR_INTERPOLATION”

    “imgResamplingMethod” set to “BILINEAR_INTERPOLATION”

    “saveProjectedLocalIncidenceAngle” set to True

    “saveSelectedSourceBand” set to True

    “pixelSpacingInMeter” set to “resolution”

    “alignToStandardGrid” set to True

    “standardGridOriginX” and “standardGridOriginY” set to 0

    The “mapProjection” was set using the following projection:

    proj = (
        'GEOGCS["WGS84(DD)",'
        'DATUM["WGS84",'
        'SPHEROID["WGS84", 6378137.0, 298.257223563]],'
        'PRIMEM["Greenwich", 0.0],'
        'UNIT["degree", 0.017453292519943295],'
        'AXIS["Geodetic longitude", EAST],'
        'AXIS["Geodetic latitude", NORTH]]'
    )
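The four steps above can be sketched as an ordered operator chain. Below, the operator names and parameters from the list above are stored as plain Python dicts; in the real pipeline each pair would be handed to snappy's `GPF.createProduct(operator, parameters, product)` (the helper below only chains plain dicts so the sketch stays self-contained, and the `ThermalNoiseRemoval` / `Calibration` / `Terrain-Correction` operator names are assumptions based on standard SNAP operators, not confirmed by this document):

```python
# Plain-dict sketch of the S1 pre-processing chain described above.
# In production, each (operator, parameters) pair would be passed to
# snappy's GPF.createProduct instead of being wrapped in a dict.
PIPELINE = [
    ("Remove-GRD-Border-Noise", {"borderLimit": 2000, "trimThreshold": 0.5}),
    ("ThermalNoiseRemoval", {"removeThermalNoise": True}),
    ("Calibration", {"outputSigmaBand": True, "outputImageScaleInDb": False}),
    ("Terrain-Correction", {
        "demName": "SRTM 3Sec",
        "demResamplingMethod": "BILINEAR_INTERPOLATION",
        "imgResamplingMethod": "BILINEAR_INTERPOLATION",
        "saveProjectedLocalIncidenceAngle": True,
        "saveSelectedSourceBand": True,
        "alignToStandardGrid": True,
        "standardGridOriginX": 0,
        "standardGridOriginY": 0,
    }),
]

def apply_pipeline(product, pipeline=PIPELINE):
    """Chain the operators in order, each consuming the previous output."""
    for operator, parameters in pipeline:
        product = {"source": product,
                   "operator": operator,
                   "parameters": parameters}
    return product

result = apply_pipeline("S1_IW_GRD_product")
print(result["operator"])  # the last step applied: Terrain-Correction
```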