DC-24 - Unmanned Aerial Systems (UAS)


Unmanned Aerial Systems (UAS) are revolutionizing how GIS&T researchers and practitioners model and analyze our world. Compared to traditional remote sensing approaches, UAS provide an inexpensive, flexible, and relatively easy-to-use platform for capturing geospatial data at high spatial and temporal resolution. Developments in computer vision, specifically Structure from Motion (SfM), enable processing of UAS-captured aerial images to produce three-dimensional point clouds and orthophotos. However, many challenges persist, including restrictive legal environments for UAS flight, extensive data processing times, and the need for further basic research. Despite their transformative potential, UAS still face societal hesitance due to privacy concerns and liability issues.

Author and Citation Info: 

Mathews, A. J., and Frazier, A. E. (2017). Unmanned Aerial Systems. The Geographic Information Science & Technology Body of Knowledge (2nd Quarter 2017 Edition), John P. Wilson (ed.). DOI: 10.22224/gistbok/2017.2.4

This entry was first published on June 9, 2017. No earlier editions exist.

Topic Description: 
  1. Definitions
  2. Operations
  3. Sensors and Data Capture
  4. Data Processing and Analysis
  5. Applications
  6. UAS and Society

 

1. Definitions

Unmanned Aerial System (UAS): an aircraft without an onboard pilot that is operated autonomously or manually by a remote-control operator. The terms unmanned aerial vehicle (UAV), unmanned aircraft systems/vehicles, remotely piloted aircraft (RPA), and drone are often used interchangeably. UAS platforms typically adopted by geospatial researchers are considered small UAS (sUAS), weighing between 0.55 lbs (~0.25 kg) and 55 lbs (~25 kg) as designated by the U.S. Federal Aviation Administration (FAA; weight limits may vary in other countries).

Rotary-Wing (RW): single or multirotor copter with upward-mounted propeller(s) that generate lift, allowing the aircraft to take off and land vertically and to hover during flight. RW platforms typically provide more maneuverability than fixed-wing aircraft.

Fixed-Wing (FW): platform with a stationary wing that generates lift and forward-mounted propeller(s) that provide thrust, continuously moving the aircraft forward at varying pitch angles. FW platforms can fly at higher speeds and for longer durations (40 minutes to several hours), increasing aerial coverage in comparison to RW platforms.

Structure from Motion (SfM): a computer vision technique comprising algorithms that process digital photos into three-dimensional point clouds and subsequent geospatial data products such as digital terrain models, digital surface models, and orthophotos. SfM is a broad term that often also encompasses multi-view stereo (MVS) techniques (e.g., SfM-MVS).

 

2. Operations

UAS operators must adhere to civil aviation authority policies when collecting data. In the U.S., the FAA governs UAS operations, requiring aircraft to be registered and operators to obtain a remote pilot certification. Operating rules include flying only during daylight hours, below 400 feet (~120 m) above ground level, no closer than 5 mi (~8 km) to any airport, and not over populated areas. Additionally, operators must maintain visual line-of-sight with the aircraft and yield to manned aircraft during flight.
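
As a rough illustration, the sketch below encodes these operating rules as an automated preflight check. The FlightPlan fields and thresholds simply restate the rules above; the names are hypothetical and this is not drawn from any official FAA tool.

```python
# A minimal preflight-check sketch encoding the FAA operating rules
# summarized above. Field and function names are hypothetical.
from dataclasses import dataclass

FT_PER_M = 3.28084

@dataclass
class FlightPlan:
    altitude_m: float           # planned altitude above ground level (m)
    airport_distance_km: float  # distance to the nearest airport (km)
    is_daylight: bool           # flight occurs during daylight hours
    over_populated_area: bool   # flight path crosses a populated area

def preflight_violations(plan: FlightPlan) -> list:
    """Return a list of rule violations (empty means no issues found)."""
    issues = []
    if plan.altitude_m * FT_PER_M > 400:
        issues.append("altitude exceeds 400 ft (~120 m) AGL")
    if plan.airport_distance_km < 8:
        issues.append("within 5 mi (~8 km) of an airport")
    if not plan.is_daylight:
        issues.append("flight is not during daylight hours")
    if plan.over_populated_area:
        issues.append("flight path crosses a populated area")
    return issues

print(preflight_violations(FlightPlan(150, 10, True, False)))
# ['altitude exceeds 400 ft (~120 m) AGL']
```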

Many UAS are hindered by even slightly windy conditions, requiring frequent confirmation of weather forecasts at or near the study site. Although platform dependent, FW aircraft are often flown into and with the wind to minimize side-to-side movement, whereas RW aircraft are less restricted in flight direction. FW platforms require a larger staging area than RW platforms for launch and skid landings. During data collection missions, flightlines should be organized to ensure stereoscopic coverage. UAS-based image capture requires considerable overlap (80-90% endlap and 60% sidelap recommended) to ensure effective image matching because of the larger distortions introduced by lower flying altitudes and platform instability (Harwin et al., 2015). Nadir-facing images are commonly collected, although convergent views (i.e., integrating oblique images) are recommended (James & Robson, 2014). A worked example of translating these overlap targets into mission parameters follows.
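
The sketch below shows how the recommended endlap and sidelap translate into exposure spacing and flightline spacing under a simple pinhole-camera footprint model. The flying height, focal length, and sensor dimensions are illustrative assumptions, and the sensor's short axis is assumed to point along-track.

```python
# Translate overlap targets (85% endlap, 60% sidelap) into mission
# parameters using the pinhole-camera ground footprint. Camera values
# below (16 mm lens, APS-C sensor, 100 m flying height) are assumptions.

def footprint_m(flying_height_m, focal_mm, sensor_mm):
    """Ground coverage of one image dimension at a given flying height."""
    return flying_height_m * sensor_mm / focal_mm

H = 100.0                        # flying height above ground (m)
focal = 16.0                     # focal length (mm), assumed
sensor_w, sensor_h = 23.5, 15.6  # APS-C sensor dimensions (mm), assumed

across = footprint_m(H, focal, sensor_w)  # across-track footprint (m)
along = footprint_m(H, focal, sensor_h)   # along-track footprint (m)

endlap, sidelap = 0.85, 0.60
photo_spacing = along * (1 - endlap)      # distance between exposures
line_spacing = across * (1 - sidelap)     # distance between flightlines

print(f"footprint: {across:.0f} m x {along:.0f} m")
print(f"trigger every {photo_spacing:.1f} m; flightlines {line_spacing:.1f} m apart")
```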

 

3. Sensors and Data Capture

3.1 Image Capture

UAS are mainly utilized to capture imagery, and off-the-shelf, point-and-shoot digital cameras are a popular sensor option (see Toth et al. [2015] for a comparison of cameras). Wide-angle lenses (e.g., GoPro Hero) are avoided due to high image distortion, and parsing video into still images is not recommended because frames may contain blur. Off-the-shelf cameras typically have limited spectral resolution, and reflectance calibration can be challenging, although removal of the internal hot mirror permits capture of near-infrared wavelengths (Mathews 2015). Spectral targets with known reflectance properties placed in situ are commonly used to calibrate optical sensor measurements (see the sketch below). Alternatively, purpose-built sensors such as the Tetracam ADC Lite capture imagery with spectral bands matching certain Landsat bands, thereby facilitating comparisons; the Parrot Sequoia and MicaSense RedEdge are also commonly used.
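
A minimal sketch of target-based calibration via the empirical line method: mean digital numbers (DNs) over in-scene targets are regressed against their known reflectances, and the fitted linear relation is applied to each band. All numbers below are illustrative assumptions, not measured values.

```python
# Empirical line calibration: fit reflectance = gain * DN + offset per band
# from dark/mid/bright targets of known reflectance, then apply image-wide.
import numpy as np

# Mean DNs extracted over three calibration targets (illustrative)
target_dn = np.array([31.0, 118.0, 212.0])
# Field- or lab-measured reflectance (0-1) of the same targets (assumed)
target_reflectance = np.array([0.04, 0.32, 0.81])

# Least-squares linear fit of the DN-to-reflectance relation
gain, offset = np.polyfit(target_dn, target_reflectance, 1)

# Stand-in image band; a real workflow would read the sensor band here
band = np.random.randint(0, 256, size=(100, 100)).astype(float)
reflectance = np.clip(gain * band + offset, 0.0, 1.0)
print(f"gain={gain:.5f}, offset={offset:.4f}")
```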

Georeferencing schemes for UAS-acquired imagery include: (1) direct, which uses known camera locations through GNSS-enabled cameras or onboard GNSS and IMU measurements stored and attached to captured images, (2) indirect, which uses GNSS-located ground control points (GCPs), and (3) a combination of direct and indirect.
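
For the indirect scheme, the core computation is a 3D similarity transform (scale, rotation, translation) estimated from GCPs and applied to the model. The sketch below implements one standard solution (an SVD-based Umeyama/Procrustes fit) with made-up coordinates; SfM packages handle this internally, often alongside bundle adjustment.

```python
# A minimal sketch of GCP-based (indirect) georeferencing: estimate a 3D
# similarity transform (scale s, rotation R, translation t) from point
# pairs, then apply it to the whole model. Coordinates are made up.
import numpy as np

def similarity_transform(src, dst):
    """Least-squares fit of dst ~ s * R @ src + t (Umeyama/Procrustes)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    A, B = src - src_c, dst - dst_c
    U, S, Vt = np.linalg.svd(B.T @ A)               # cross-covariance SVD
    d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    D = np.diag([1.0, 1.0, d])                      # guard against reflection
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / np.sum(A ** 2)
    t = dst_c - s * R @ src_c
    return s, R, t

# GCPs identified in the arbitrary SfM model frame...
model_pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.1],
                      [0.0, 2.0, -0.1], [1.5, 1.5, 0.2]])
# ...and their GNSS-surveyed world coordinates (an exact match here)
world_pts = 10.0 * model_pts + np.array([500000.0, 4000000.0, 250.0])

s, R, t = similarity_transform(model_pts, world_pts)
georeferenced = s * (model_pts @ R.T) + t   # apply to any model-frame points
print(np.abs(georeferenced - world_pts).max())  # residual check (~0 here)
```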

3.2 Non-image Data Capture

Non-image applications of UAS include collecting measurements of temperature, pressure, humidity, and wind for atmospheric sampling and meteorology, as well as environmental surveillance using sensors that detect CO2, methane, and other gases for pipeline monitoring. Lidar sensors have been employed for terrain and 3D mapping, but sensor size, weight, and cost remain restrictive for many applications. However, advances are being made toward developing low-cost, miniaturized sensing devices.

3.3 Coordinated Data Capture

A benefit of deploying sensors onboard UAS is the potential for coordinated, self-organized data capture between two or more vehicles. Algorithms for coordinating, controlling, and systematizing distributed networks of airborne sensors are developing rapidly, allowing multiple UAS flying in a network to communicate with each other and ground stations to coordinate capture of optimally distributed spatial datasets (Namuduri et al., 2013). This type of ‘smart’, mobile UAS network will permit adaptive sampling schemes not possible with fixed, ground-based networks.

 

4. Data Processing and Analysis

The Structure from Motion (SfM) computer vision technique incorporates a series of algorithms (e.g., Scale Invariant Feature Transform—SIFT [Lowe, 2004], Bundler [Snavely et al., 2008], patch-based multi-view stereopsis—PMVS [Furukawa & Ponce, 2010]) to match overlapping areas across multiple images with differing perspectives, identifying keypoints of the same features (equivalent to photogrammetric tie points) to generate sparse and dense point cloud reconstructions of 3D space. SfM point clouds are not inherently georeferenced; known locations of cameras or GCPs must be incorporated to transform the point cloud to real-world coordinates (as in the similarity-transform sketch above). SfM point clouds are similar to lidar datasets with the addition of RGB information for each point. Commonly used SfM desktop software packages include Agisoft PhotoScan, Pix4D, and VisualSfM; cloud-based alternatives (e.g., DroneDeploy) are also available.
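
The keypoint-matching stage that SfM builds on can be sketched with OpenCV's SIFT implementation, as below. The image file names are placeholders, and a complete pipeline would additionally run bundle adjustment and dense multi-view stereo.

```python
# Keypoint detection and matching between two overlapping UAS photos,
# using OpenCV (pip install opencv-python). File names are placeholders.
import cv2

img1 = cv2.imread("photo_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("photo_002.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)  # keypoints + 128-d descriptors
kp2, des2 = sift.detectAndCompute(img2, None)

# Nearest-neighbor matching with Lowe's ratio test to reject ambiguous matches
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(good)} tie-point candidates between the two images")
```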

Images can also be processed to produce very high spatial resolution orthophotos. Proper orthophoto production requires radiometric corrections (e.g., removing vignetting, normalizing image-to-image brightness variation, converting to reflectance values; see Mathews, 2015) and geometric corrections (e.g., lens distortion, relief displacement; see Kelcey & Lucieer, 2012). Geometric corrections remain especially challenging when using uncalibrated sensors at low altitudes where distortions are magnified (Mathews, 2015).
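
A minimal sketch of the lens-distortion part of the geometric correction, using OpenCV's Brown-Conrady model: in practice the camera matrix and distortion coefficients come from a camera calibration (e.g., cv2.calibrateCamera), whereas the values below are illustrative assumptions.

```python
# Remove radial/tangential lens distortion from a UAS photo with OpenCV.
# The intrinsics and distortion coefficients here are assumed values,
# not a real calibration; file names are placeholders.
import cv2
import numpy as np

img = cv2.imread("photo_001.jpg")
h, w = img.shape[:2]

# Camera matrix: focal lengths and principal point in pixels (assumed)
K = np.array([[3600.0, 0.0, w / 2],
              [0.0, 3600.0, h / 2],
              [0.0, 0.0, 1.0]])
# Distortion coefficients [k1, k2, p1, p2, k3] (assumed)
dist = np.array([-0.12, 0.05, 0.0005, -0.0003, 0.0])

undistorted = cv2.undistort(img, K, dist)
cv2.imwrite("photo_001_undistorted.jpg", undistorted)
```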

Advanced analyses use lidar data filtering and classification techniques to extract height information from SfM point clouds, either by generating Digital Terrain Models (DTMs) and Digital Surface Models (DSMs) (Fonstad et al., 2013) or by computing height metrics to characterize vegetation canopy structure and biomass (see Dandois & Ellis, 2013; Mathews & Jensen, 2013). The temporal flexibility of UAS data collection allows for 3D/volumetric change analyses via DTM/DSM differencing and/or direct point cloud comparison (see the differencing sketch below). Object-based image analysis techniques are commonly applied to analyze and extract useful vector data from 2D orthophotos (Laliberte et al., 2010).
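
The sketch below illustrates one common differencing product: subtracting a bare-earth DTM from a DSM to obtain a height (e.g., canopy height) model. File names are placeholders, rasterio is assumed as the raster I/O library, and both grids are assumed to be co-registered on the same cell size and extent.

```python
# DSM - DTM differencing to derive a canopy height model (CHM).
# Requires rasterio (pip install rasterio); nodata handling omitted.
import numpy as np
import rasterio

with rasterio.open("dsm.tif") as dsm_src, rasterio.open("dtm.tif") as dtm_src:
    dsm = dsm_src.read(1).astype(float)
    dtm = dtm_src.read(1).astype(float)
    profile = dsm_src.profile  # reuse georeferencing metadata for output

chm = dsm - dtm                    # per-cell height above ground
chm = np.where(chm < 0, 0.0, chm)  # clamp small negative artifacts

profile.update(dtype="float32")
with rasterio.open("chm.tif", "w", **profile) as dst:
    dst.write(chm.astype("float32"), 1)

print(f"mean canopy height: {np.nanmean(chm):.2f} m")
```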

Data collection and processing standards as well as comprehensive accuracy assessments remain in early stages. Some solutions have been proposed to optimize workflow efficiency and improve model accuracy (see Turner et al., 2012; James & Robson, 2014), but considerable uncertainty remains.

 

5. Applications

Most GIS&T research using UAS has been applied in nature. Table 1 outlines several major application areas along with related works for further reading.

Table 1. Geospatial applications for UAS and related research
Application | Related Works
Terrain modeling | Stefanik et al., 2011; Fonstad et al., 2013
Geomorphological and fluvial processes | Flener et al., 2013; Dietrich, 2016
Vegetation structure, forestry, and ecosystem modeling | Wallace et al., 2012; Dandois & Ellis, 2013
Precision agriculture | Baluja et al., 2012; Mathews & Jensen, 2013
Land and natural resource management | Rango et al., 2009; Laliberte et al., 2010
Animal habitat and monitoring | Chabot et al., 2014
Natural disasters (wildfire, landslides) | Ambrosia et al., 2003; Niethammer et al., 2012
Meteorology | Frew et al., 2012
Cultural features and archaeology | Eisenbeiss & Sauerbier, 2011

 

6. UAS and Society

As UAS usage becomes more common, ongoing discourse surrounding their role in ‘citizen mapping’ (e.g., participatory mapping and citizen science projects), as well as their role in ‘mapping citizens,’ is critical. As of 2016, few citizen science and participatory mapping projects were engaging with aerial platforms (see Cummings et al., 2017), but as scientists increasingly recognize the utility of complementing traditional remote sensing with sensors onboard UAS, the use of UAS in citizen science projects will rise. Websites such as OpenAerialMap (https://openaerialmap.org/) and Dronestagram (http://www.dronestagr.am/) allow users to share UAS-acquired images online. The Humanitarian OpenStreetMap Team has crowdsourced (i.e., microtasked) the digitization of UAS-acquired imagery to support disaster recovery, and many small civilian UAS projects have been completed around the world for similar purposes.

Conversely, ‘mapping citizens’ has implications for location privacy, which concerns individuals’ claim to determine when, how, and to what extent information about themselves and their location is communicated to others (Kerski, 2016). These locational privacy protections face new challenges given the ease with which very high resolution imagery and other data can be captured from UAS. Privacy concerns surrounding UAS are fluid and evolving, and it will be important for GIS&T researchers to remain engaged in these societal questions as adoption of UAS technology increases.

References: 

Ambrosia, V.G., Wegener, S. S., Sullivan, D. V., Buechel, S.W., Dunagan, S. E., Brass, J. A., ... & Schoenung, S. M. (2003). Demonstrating UAV-acquired real-time thermal data over fires. Photogrammetric Engineering & Remote Sensing, 69(4), 391-402. DOI: 10.14358/PERS.69.4.391

Baluja, J., Diago, M. P., Balda, P., Zorer, R., Meggio, F., Morales, F., & Tardaguila, J. (2012). Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrigation Science, 30(6), 511-522. DOI: 10.1007/s00271-012-0382-9

Colomina, I., & Molina, P. (2014). Unmanned aerial systems for photogrammetry and remote sensing: a review. ISPRS Journal of Photogrammetry and Remote Sensing, 92, 79-97. DOI: 10.1016/j.isprsjprs.2014.02.013

Cummings, A.R., Cummings, G. R., Harner, E., Moses, P., Norman, Z., Captain, V., ... & Butler, K. (2017). Developing a UAV-Based Monitoring Program with Indigenous Peoples. Journal of Unmanned Vehicle Systems. DOI: 10.1139/juvs-2016-0022

Dandois, J. P., & Ellis, E. C. (2013). High spatial resolution three-dimensional mapping of vegetation spectral dynamics using computer vision. Remote Sensing of Environment, 136, 259-276. DOI: 10.1016/j.rse.2013.04.005

Dietrich, J.T. (2016). Bathymetric structure-from-motion: extracting shallow stream bathymetry from multi-view stereo photogrammetry. Earth Surface Processes and Landforms, 42(2), 355-364. DOI: 10.1002/esp.4060

Eisenbeiss, H., & Sauerbier, M. (2011). Investigation of UAV systems and flight modes for photogrammetric applications. The Photogrammetric Record 26(136), 400-421. DOI: 10.1111/j.1477-9730.2011.00657.x

Flener, C., Vaaja, M., Jaakkola, A., Krooks, A., Kaartinen, H., Kukko, A., ... & Alho, P. (2013). Seamless Mapping of River Channels at High Resolution Using Mobile LiDAR and UAV-Photography. Remote Sensing, 5(12), 6382-6407. DOI: 10.3390/rs5126382

Fonstad, M.A., Dietrich, J. T., Courville, B. C., Jensen, J. L. and Carbonneau, P.E. (2013). Topographic structure from motion: A new development in photogrammetric measurement. Earth Surface Processes and Landforms, 38(4), 421-430. DOI: 10.1002/esp.3366

Frew, E., Elston, J., Argrow, B., Houston, A., & Rasmussen, E. (2012). Sampling severe local storms and related phenomena using unmanned aircraft systems. IEEE Robotics & Automation Magazine, 19(1), 85-95. DOI: 10.1109/mra.2012.2184193

Furukawa, Y., & Ponce, J. (2010). Accurate, dense and robust multiview stereopsis. IEEE Transactions on Pattern Analysis and Machine Intelligence 32(8), 1362-1376. DOI: 10.1109/tpami.2009.161

Harwin, S., Lucieer, A., and Osborn, J. (2015). The impact of calibration method on the accuracy of point clouds derived using unmanned aerial vehicle multi-view stereopsis. Remote Sensing, 7(9), 11933-11953. DOI: 10.3390/rs70911933

James, M.R., & Robson, S. (2014). Mitigating systematic error in topographic models derived from UAV and ground-based image networks. Earth Surface Processes and Landforms, 39, 1413-1420. DOI: 10.1002/esp.3609

Kelcey, J., & Lucieer, A. (2012). Sensor Correction of a 6-Band Multispectral Imaging Sensor for UAV Remote Sensing. Remote Sensing, 4(5), 1462-1493. DOI: 10.3390/rs4051462

Kerski, J. (2016). Location Privacy. The Geographic Information Science & Technology Body of Knowledge (3rd Quarter 2016 Edition), John P. Wilson (ed.). DOI: 10.22224/gistbok/2016.3.2

Laliberte, A., Herrick, J. E., Rango, A., & Winters, C. (2010). Acquisition, orthorectification, and object-based classification of unmanned aerial vehicle (UAV) imagery for rangeland monitoring. Photogrammetric Engineering & Remote Sensing, 76(6), 661-672. DOI: 10.14358/PERS.76.6.661

Lowe, D. (2004). Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 60(2), 91-110. DOI: 10.1023/B:VISI.0000029664.99615.94

Mathews, A. J., & Jensen, J. L. (2013). Visualizing and quantifying vineyard canopy LAI using an unmanned aerial vehicle (UAV) collected high density structure from motion point cloud. Remote Sensing, 5(5), 2164-2183. DOI: 10.3390/rs5052164

Mathews, A. J. (2015). A practical UAV remote sensing methodology to generate multispectral orthophotos for vineyards: Estimation of spectral reflectance using compact digital cameras. International Journal of Applied Geospatial Research, 6(4), 65-87. DOI: 10.4018/ijagr.2015100104

Namuduri, K., Wan, Y., & Gomathisankaran, M. (2013). Mobile Ad Hoc Networks in the Sky: State of the Art, Opportunities, and Challenges. Proceedings of the Second ACM MobiHoc Workshop on Airborne Networks and Communications (ANC '13), Bangalore, India. DOI: 10.1145/2491260.2491265

Niethammer, U., James, M. R., Rothmund, S., Travelletti, J., & Joswig, M. (2012). UAV-based remote sensing of the Super-Sauze landslide: Evaluation and results. Engineering Geology, 128, 2-11. DOI: 10.1016/j.enggeo.2011.03.012

Rango, A., Laliberte, A., Herrick, J. E., Winters, C., Havstad, K., Steele, C., & Browning, D. (2009). Unmanned aerial vehicle-based remote sensing for rangeland assessment, monitoring, and management. Journal of Applied Remote Sensing, 3(1), 033542. DOI: 10.1117/1.3216822

Snavely, N., Seitz, S. M., & Szeliski, R. (2008). Modeling the world from internet photo collections. International Journal of Computer Vision, 80(2), 189-210. DOI: 10.1007/s11263-007-0107-3

Stefanik, K. V., Gassaway, J. C., Kochersberger, K., & Abbott, A. L. (2011). UAV-based stereo vision for rapid aerial terrain mapping. GIScience & Remote Sensing, 48(1), 24-49. DOI: 10.2747/1548-1603.48.1.24

Toth, C., Jozkow, G., & Grejner-Brzezinska, D. (2015). Mapping with small UAS: a point cloud accuracy assessment. Journal of Applied Geodesy, 9(4), 213-226. DOI: 10.1515/jag-2015-0017

Turner, D., Lucieer, A., & Watson, C. (2012). An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds. Remote Sensing, 4(5), 1392-1410. DOI: 10.3390/rs4051392

Wallace, L., Lucieer, A., Watson, C., & Turner, D. (2012). Development of a UAV-lidar system with application to forest inventory. Remote Sensing, 4(6), 1519-1543. DOI: 10.3390/rs4061519

Learning Objectives: 
  • Define UAS.
  • Summarize the predominant UAS platform types.
  • Understand the requirements for legal operation of UAS for data collection purposes.
  • Describe how to execute a successful UAS data capture mission.
  • List commonly used sensors for capturing remotely sensed data via UAS.
  • Describe a common data workflow for UAS-collected aerial imagery.
  • Acquire knowledge of how UAS technology is applied in geospatial research.
  • Understand the societal issues surrounding UAS data capture.
Instructional Assessment Questions: 
  1. How does GIS&T benefit from the incorporation of UAS technology? Be specific about the benefits of spatial and temporal data resolution.
  2. Identify the two UAS platform types and describe the strengths/weaknesses of each platform regarding their ability to collect data.
  3. You would like to acquire terrain data for a small (~0.5 ha) open area in a rural location.
    1. What type of UAS would you require? Why?
    2. What sensor(s) would you utilize?
    3. Describe your mission plan.
    4. What legal regulations must you consider before you fly?
  4. How are UAS-captured aerial images processed to generate geospatial data?
    1. What is Structure from Motion?
    2. What data products can be produced from UAS-captured imagery?
    3. How are UAS-acquired data properly georeferenced?
  5. Discuss two geospatial applications of UAS technology. Why is the integration of UAS technology especially beneficial to these application areas?
  6. Regarding the adoption of UAS as a data collection platform, describe a societal challenge facing civilian usage of UAS.

 

Additional Resources: 
  • Academy of Model Aeronautics (AMA) - www.modelaircraft.org
  • Association for Unmanned Vehicle Systems International - www.auvsi.org
  • Federal Aviation Administration (FAA) Unmanned Aircraft Systems Homepage - www.faa.gov/uas
  • United States Geological Survey (USGS) National Unmanned Aircraft Systems Project Office - uas.usgs.gov