Remote sensing! It may sound like the stuff of science fiction, but even those of us outside the world of science would be lost without the remote sensing tools that we use daily. What would we do without GPS, trackers, security cameras and webcams? And don’t let me forget to mention X-rays, CAT scans and MRI scans!
Google Earth is a great example of a free remote sensing tool that can be easily accessed and navigated. It is an application that combines satellite imagery with a host of layers, including 3D buildings, road maps, tourist images and environmental data, while Google’s camera cars drive around photographing streets so that you can zoom into a street-level view of almost any city in the world. Together, this creates a multi-layered, three-dimensional representation of the world around us.
Google Earth is an excellent illustration of just how multi-applicable remote sensing can be. It provides easy-to-use street maps, area-measuring tools, weather maps, historical maps and political borders, to name only a few of its functions.
There are a number of definitions for remote sensing, mostly penned in complicated scientific speak. Using the least confusing definition, it comes down to the following:
Remote sensing is the act of acquiring data such as an image, from a stipulated target using a detection device or sensor. The word “remote” implies that you do not have direct contact with the target – you are “remotely” observing the target from a distance (Rice, 2012).
In other words, it means getting information about places on earth, in the form of images, without actually having to be there. Very convenient if you don’t want to climb to the top of a mountain, dive to the bottom of an ocean, or knock on the door of an international terrorist’s lair. Remote sensing is an opportune way of using technology to monitor, analyse and make decisions – or even spy!
Let’s get back to our scientific definition. In order to understand the definition of remote sensing, we can break it down into the following components:
What is the meaning of data?
Simply put, data is the information that you get from an image. The image is formed by capturing and recording light energy, much as an analogue camera does. Computers can’t see light, but they can see numbers. So the light image is converted into numbers, and those numbers are what we call ‘data’.
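To make the idea concrete, here is a minimal sketch of that numbers-not-light view. The tiny 3×3 “image” below is invented for illustration; real scenes have millions of pixels, but the principle is the same.

```python
# A remotely sensed "image" is just a grid of numbers: each pixel stores
# how much light was recorded there. These 3x3 values are made up;
# brighter spots on the ground get higher numbers.
image = [
    [ 12,  40,  35],
    [ 88, 200, 190],
    [ 90, 210, 255],
]

# Everything a computer "sees" is arithmetic on these numbers.
brightest = max(max(row) for row in image)
average = sum(sum(row) for row in image) / 9

print(brightest)  # 255
print(round(average, 1))
```

Once the image is numbers, questions like “where is the brightest spot?” or “what is the average brightness?” become simple calculations.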
In remote sensing, the target is an object or place on earth that you want to see. The data about the target is often captured by satellite or aerial photography, hand-held thermal scopes or even drones.
From the data you can look for particular characteristics of the target, such as surface temperatures, soil types, topography and the size of the area (Rice, 2012). This is done using Geographic Information Systems (GIS). GIS software uses the data to arrange the image into layers. Each layer shows a separate characteristic of the data, which we can view separately or all together. This helps us to extract more information about the target than our eyes can perceive. But remote sensing is not just about satellite images. Nowadays targets can include images of cucumber crops infected with mildew, chickens that have lost too many feathers, or even the source of an elephant’s sore left foot.
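The layering idea can be sketched in a few lines of code. The two tiny “layers” below (surface temperature and soil type) are invented values for the same 2×2 patch of ground; the point is that because the layers line up pixel by pixel, we can ask questions across them together.

```python
# Two GIS-style layers describing the SAME grid of ground locations.
# All values here are invented for illustration.
surface_temp_c = [
    [31.0, 29.5],
    [24.0, 22.5],
]
soil_type = [
    ["sand", "sand"],
    ["clay", "loam"],
]

# Viewing layers together: find cells that are clay AND cooler than 25 C.
matches = []
for row in range(2):
    for col in range(2):
        if soil_type[row][col] == "clay" and surface_temp_c[row][col] < 25.0:
            matches.append((row, col))

print(matches)  # [(1, 0)]
```

Real GIS packages do exactly this kind of cell-by-cell overlay, just on much larger rasters and with many more layers.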
Sensors are devices that can be mounted on a satellite, an aeroplane, a drone, a pole or even a selfie stick. They measure photons of electromagnetic energy that travel to the target on earth and are reflected back to the sensor. The sensor measures the energy it receives and records it as data, which is beamed back down to earth. There are many different types of sensor, used for a wide variety of purposes, ranging from the grand to the minor. They include the lidar sensor mounted on the International Space Station, which helps to analyse biodiversity in environments on earth, and the drone-mounted thermal night-vision scopes used by anti-poaching teams. See the footage below about thermal technology fighting the scourge of poaching.
The science of remote sensing
Electromagnetic energy comes from the sun:
Much as a prism splits a ray of light, white light can be separated into a number of different parts. These parts include visible light and other waves that are invisible to the naked eye. At one end of the spectrum are short, energetic waves – gamma rays, X-rays and ultraviolet – and at the other are longer, lazier waves – infrared, microwaves and radio waves (Short, 2012).
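The “short and energetic versus long and lazy” contrast can be put into numbers with the standard photon-energy relation E = hc/λ: the shorter the wavelength, the more energy each photon carries. The wavelengths below are rough, representative values chosen for illustration.

```python
# Photon energy falls as wavelength grows: E = h * c / wavelength.
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_energy_joules(wavelength_m):
    return H * C / wavelength_m

# Representative wavelengths (approximate, for illustration only):
bands = {
    "x-ray":           1e-10,
    "ultraviolet":     3e-7,
    "visible (green)": 5.5e-7,
    "infrared":        1e-5,
    "microwave":       1e-2,
}

for name, wavelength in bands.items():
    print(f"{name}: {photon_energy_joules(wavelength):.2e} J per photon")
```

Running this shows the X-ray photon carrying around a hundred million times the energy of the microwave photon, which is why the short-wavelength end of the spectrum is the harmful one.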
Light passes through the atmosphere, where the ozone layer immediately diverts some of the ultraviolet waves straight back into space. This is fortunate, because these rays are harmful and sometimes even fatal to life on earth. Water vapour absorbs most of the long infrared and short microwave wavelengths, warming the air. Some infrared light is scattered back into space by dust and gas particles; the rest is absorbed by the Earth and re-emitted as heat.
The problem we face today is that we humans have messed with this cycle. Man-made pollution has filled the atmosphere with extra carbon dioxide and methane, strengthening what we call the greenhouse effect. Basically, this means that more heat gets trapped inside the atmosphere and cannot escape into space. This makes the planet warmer, driving the erratic weather we see nowadays – what we call climate change.
Light hits Earth:
When the light has successfully managed to avoid all the pitfalls in the atmosphere and has arrived at the Earth’s surface, some of it is absorbed by the ground, some is transmitted – meaning it passes through materials such as leaves or water – and the rest is reflected back into space.
Satellite sensors capture the reflected light, including the bands of light not visible to the human eye. For example, red and blue light waves are absorbed by leaves during photosynthesis. Green light is not welcomed by the leaf, so it is either transmitted through the leaf or reflected off the leaf surface – that is why plants look green to us (NASA, 2010). Healthy leaves also reflect near-infrared light very strongly. Sensors record these wavelengths, and we can use the records to calculate indices such as the Normalised Difference Vegetation Index (NDVI). This is very useful to farmers: it helps them see whether their crops are healthy or not.
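NDVI itself is a one-line formula: (NIR − Red) / (NIR + Red). Because healthy vegetation reflects strongly in the near-infrared and absorbs red, vigorous plants score close to +1 while bare soil sits near 0. The reflectance values below are invented for illustration, not taken from any real scene.

```python
# NDVI = (NIR - Red) / (NIR + Red): compares near-infrared and red reflectance.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

# Invented reflectance values (fraction of light reflected, 0-1):
healthy_crop  = ndvi(nir=0.50, red=0.08)  # strong NIR reflection, red absorbed
stressed_crop = ndvi(nir=0.30, red=0.20)  # weaker contrast between bands
bare_soil     = ndvi(nir=0.25, red=0.22)  # almost no contrast

print(round(healthy_crop, 2))   # 0.72
print(round(stressed_crop, 2))  # 0.2
print(round(bare_soil, 2))      # 0.06
```

A farmer scanning an NDVI map simply looks for patches where the value drops: the red/NIR contrast fades before the stress is visible to the eye.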
Light from the target is reflected back towards the satellite’s sensor:
Satellites orbit the earth in different configurations. The oldest and most beloved satellite to us earth dwellers is the moon. The journey a satellite takes to go around the earth once is called an orbit. Orbits are at different altitudes, orientations and rotations relative to earth, depending on the type and mission of the satellite and the sensor it carries. Time of day, time of year and weather conditions control when the satellite captures an image and the quality of the image. Different sensors placed on different satellites are designed to capture various aspects of the light spectrum.
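Altitude alone tells you a lot about an orbit. For a circular orbit, Kepler’s third law gives the period as T = 2π√(a³/μ), where a is the distance from the Earth’s centre (radius plus altitude) and μ is the Earth’s gravitational parameter. The sketch below uses the mean Earth radius, so the results are approximate.

```python
import math

# Orbital period of a circular orbit: T = 2*pi*sqrt(a^3 / mu).
MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH  = 6_371_000.0      # mean Earth radius, m (approximation)

def orbital_period_minutes(altitude_m):
    a = R_EARTH + altitude_m          # distance from Earth's centre
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

# A Landsat-style imaging orbit at ~705 km circles the earth in ~99 minutes,
# while a geostationary satellite at ~35,786 km takes roughly a full day.
print(round(orbital_period_minutes(705_000)))     # ~99
print(round(orbital_period_minutes(35_786_000)))  # ~1436
```

That difference is the trade-off satellite designers face: low orbits see fine detail but only revisit a spot every few days, while the day-long geostationary orbit stares at one hemisphere continuously.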
Reflected light is captured and converted into numerical data:
All the light that bounces back towards space may be detected by satellite sensors. Once the light reaches the sensor, it is “digitised” into a matrix of numbers, because computers understand the world of numbers rather than the world of light. This numerical information, or data, is then transmitted to a ground receiving station.
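“Digitising” just means mapping a continuous measurement onto a fixed range of whole numbers. An 8-bit sensor, for instance, has 256 levels (0–255). The calibration range in this sketch is invented; real sensors publish their own rescaling coefficients.

```python
# Map a continuous radiance reading onto integer digital numbers (DNs).
# min_rad/max_rad are an invented calibration range for illustration.
def to_digital_number(radiance, min_rad=0.0, max_rad=30.0, bits=8):
    levels = 2**bits - 1                               # 255 for an 8-bit sensor
    clipped = min(max(radiance, min_rad), max_rad)     # saturate out-of-range input
    return round((clipped - min_rad) / (max_rad - min_rad) * levels)

measured = [0.0, 7.5, 15.0, 30.0, 42.0]   # made-up radiance readings
dns = [to_digital_number(r) for r in measured]
print(dns)  # [0, 64, 128, 255, 255]
```

Note the last reading: anything brighter than the sensor’s maximum simply saturates at 255, which is why over-bright targets lose detail in satellite images.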
At the end of the day, remote sensing isn’t all that complicated. It’s simply the process of capturing light using some type of sensor, converting that light into data, and doing something good with it. Stick around: next, we are going to look at how we interpret that data.
REFERENCES & FURTHER READING
Buckey, D. J. (n.d.). Spatial data layers – vertical data organization. NISL Ecological Informatics. http://planet.botany.uwc.ac.za/nisl/gis/gis_primer/page_29.htm
Church, J. S., Cook, N. J., & Schaefer, A. L. (2009). Recent applications of infrared thermography. InfraMation 2009 Proceedings.
DigitalGlobe. (2012). Google Earth image. http://www.google.com/earth/index.html
Earthdata (EOSDIS). (2018, December 11). Remote sensors. https://earthdata.nasa.gov/user-resources/remote-sensors
GIS Geography. (2019, January 5). What is Geographic Information Systems (GIS)? https://gisgeography.com/what-gis-geographic-information-systems/
Lewis, J. C. (1970). Wildlife census methods: A resume. Journal of Wildlife Diseases, 6, 356–364.
Lillesand, T. M., Kiefer, R. W., & Chipman, J. W. (2008). Remote Sensing and Image Interpretation (6th ed.). John Wiley & Sons.
National Aeronautics and Space Administration. (2005). Satellite data to track wildlife: Elephants in space. http://www.nasa.gov/vision/earth/lookingatearth/elephants_space.html. Accessed 23/01/2012.
National Aeronautics and Space Administration, Science Mission Directorate. (2010). Reflected near-infrared waves. Mission:Science website. Accessed 22/01/2012.
Port Technology International. (2011). FLIR Systems launches new thermal imaging camera.
Rice, B. (2012). Barr’s Remote Sensing Page – What is remote sensing. http://www.sarracenia.com/astronomy/remotesensing/primer0110.html. Accessed 22/01/2012.
Taylor, F. (2009). About Google Earth imagery. http://www.gearthblog.com/blog/archives/2009/03/about_google_earth_imagery_1.html. Accessed 22/01/2012.