Unmanned Aerial Systems (UAS) are revolutionizing how GIS&T researchers and practitioners model and analyze our world. Compared to traditional remote sensing approaches, UAS provide a relatively inexpensive, flexible, and easy-to-use platform for capturing high spatial and temporal resolution geospatial data. Developments in computer vision, specifically Structure from Motion (SfM), enable processing of UAS-captured aerial images into three-dimensional point clouds and orthophotos. However, many challenges persist, including restrictive legal environments for UAS flight, extensive data processing times, and the need for further basic research. Despite its transformative potential, UAS adoption still faces societal hesitance due to privacy concerns and liability issues.
- Sensors and Data Capture
- Data Processing and Analysis
- UAS and Society
Unmanned Aerial System (UAS) – an aircraft without an onboard pilot that is operated autonomously or manually by a remote-control operator. The terms unmanned aerial vehicle (UAV), unmanned aircraft systems/vehicles, remotely piloted aircraft (RPA), and drone are often used interchangeably. UAS platforms typically adopted by geospatial researchers are considered small UAS (sUAS), weighing between 0.5 lbs (~0.2 kg) and 55 lbs (~25 kg) as designated by the U.S. Federal Aviation Administration (FAA; weight limits may vary in other countries).
Rotary-Wing (RW) – single or multirotor copter with upward-mounted propeller(s) that generate lift allowing aircraft to take off and land vertically and hover during flight. RW platforms typically provide more maneuverability than fixed-wing aircraft.
Fixed-Wing (FW) – platform with a stationary wing and forward-mounted propeller(s) that generate lift and continuously move the aircraft forward at varying pitch angles. FW platforms can fly at higher speeds and for longer durations (40 minutes to several hours), increasing aerial coverage in comparison to RW platforms.
Structure from Motion (SfM) – a set of computer vision algorithms that process digital photos into three-dimensional point clouds and subsequent geospatial data products such as digital terrain and surface models and orthophotos. SfM is a broad term that often also encompasses multi-view stereo techniques (e.g., MVS, SfM-MVS).
UAS operators must adhere to civil aviation authority policies when collecting data. In the U.S., the FAA governs UAS operations, requiring aircraft to be registered and operators to obtain a remote pilot certification. Operating rules include flying only during daylight hours, below 400 feet (~120 m) above ground level, not within 5 mi (~8 km) of an airport, and not over populated areas. Additionally, operators must maintain visual line-of-sight and yield to manned aircraft during flight.
Many UAS are hindered by even slightly windy conditions, requiring operators to check weather forecasts at/near the study site frequently. Although platform dependent, FW aircraft are often flown into and with the wind to minimize side-to-side movement, whereas RW aircraft are less restricted in flight direction. FW platforms require a larger staging area than RW platforms for launch and skid landings. During data collection missions, flightlines should be organized to ensure stereoscopic coverage. UAS-based image capture requires considerable overlap (80-90% endlap and 60% sidelap recommended) to ensure effective image matching due to the larger distortions introduced by lower flying altitudes and platform instability (Harwin et al. 2015). Nadir-facing images are commonly collected, although convergent views are recommended (i.e., integrating oblique images; James and Robson 2014).
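The overlap targets above translate directly into flightline spacing and camera trigger distances. The sketch below computes both from flying height and a nominal camera model; all parameter values (sensor dimensions, focal length, overlap percentages) are assumptions for illustration, not recommendations.

```python
# Illustrative flight-plan geometry: derive photo spacing and flightline
# spacing from flying height and a nominal camera model. All values here
# are assumptions for the sketch, not mission-planning guidance.

def ground_footprint(height_m, focal_mm, sensor_w_mm, sensor_h_mm):
    """Ground coverage (width, height in m) of one nadir image, scale = H / f."""
    scale = height_m / (focal_mm / 1000.0)
    return (sensor_w_mm / 1000.0 * scale, sensor_h_mm / 1000.0 * scale)

def spacing(height_m, focal_mm, sensor_w_mm, sensor_h_mm,
            endlap=0.85, sidelap=0.60):
    """Along-track photo base and across-track flightline spacing (m),
    assuming the sensor's short axis points along the flight direction."""
    gw, gh = ground_footprint(height_m, focal_mm, sensor_w_mm, sensor_h_mm)
    base = gh * (1.0 - endlap)   # distance between exposures
    line = gw * (1.0 - sidelap)  # distance between flightlines
    return base, line

# Example: 100 m AGL with an assumed 24 x 16 mm sensor and 16 mm lens
base, line = spacing(100.0, 16.0, 24.0, 16.0)
print(f"photo base ~{base:.1f} m, flightline spacing ~{line:.1f} m")
```

With these assumed values the 85% endlap forces an exposure every 15 m along track, illustrating why UAS missions capture far more images per unit area than conventional aerial surveys.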
3.1 Image Capture
UAS are mainly utilized to capture imagery, and off-the-shelf, point-and-shoot digital cameras are a popular sensor option (see Toth et al. 2015 for a comparison of cameras). Wide-angle lenses (e.g., GoPro Hero) are avoided due to high image distortion, and parsing video into still images is not recommended because frames may contain blur. Off-the-shelf cameras typically have limited spectral resolution, and reflectance calibration can be challenging, but removal of the internal hot mirror permits capture of near-infrared wavelengths (Mathews 2015). Spectral targets with known reflectance properties placed in situ are commonly used to calibrate optical sensor measurements. Alternatively, sensors such as the Tetracam ADC Lite capture images with spectral bands matching certain Landsat bands, thereby facilitating comparisons. Other commonly used sensors include the Parrot Sequoia and MicaSense RedEdge.
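Calibration against in-situ targets is typically done with an empirical line approach: fit a per-band linear model from raw digital numbers (DN) to the targets' known reflectance, then apply it to the whole band. A minimal numpy sketch, with made-up target values:

```python
import numpy as np

# Empirical line calibration (illustrative): fit a per-band linear model
# mapping raw digital numbers (DN) to surface reflectance using in-situ
# targets of known reflectance. The target values below are invented.

target_dn = np.array([30.0, 120.0, 210.0])          # mean DN over each target
target_reflectance = np.array([0.05, 0.45, 0.80])   # measured reflectance

# Least-squares line: reflectance ~ gain * DN + offset
gain, offset = np.polyfit(target_dn, target_reflectance, 1)

def dn_to_reflectance(dn):
    """Apply the fitted DN -> reflectance model to an image band."""
    return gain * np.asarray(dn, dtype=float) + offset

band = np.array([[30.0, 120.0], [210.0, 60.0]])
print(dn_to_reflectance(band))
```

In practice one model is fitted per spectral band, since sensor response and illumination differ across wavelengths.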
Georeferencing schemes for UAS-acquired imagery include: (1) direct, which uses known camera locations through GNSS-enabled cameras or onboard GNSS and IMU measurements stored and attached to captured images, (2) indirect, which uses GNSS-located ground control points (GCPs), and (3) a combination of direct and indirect.
3.2 Non-image Data Capture
Non-imagery applications of UAS include, for example, collecting measurements of temperature, pressure, humidity, and wind for atmospheric sampling and meteorology or environmental surveillance using sensors that can detect CO2, methane, and other gases for pipeline monitoring. Lidar sensors have been employed for terrain and 3D mapping, but sensor size, weight, and cost remain restrictive for many applications. However, advances are being made toward developing low-cost, miniaturized sensing devices.
3.3 Coordinated Data Capture
A benefit of deploying sensors onboard UAS is the potential for coordinated, self-organized data capture between two or more vehicles. Algorithms for coordinating, controlling, and systematizing distributed networks of airborne sensors are developing rapidly, allowing multiple UAS flying in a network to communicate with each other and ground stations to coordinate capture of optimally distributed spatial datasets (Namuduri et al. 2013). This type of ‘smart’, mobile UAS network will permit adaptive sampling schemes not possible with fixed, ground-based networks.
The Structure from Motion (SfM) computer vision technique incorporates a series of algorithms (e.g., Scale Invariant Feature Transform—SIFT [Lowe 2004], Bundler [Snavely et al. 2008], patch-based multi-view stereopsis—PMVS [Furukawa and Ponce 2010]) to match overlapping areas across multiple images with differing perspectives (identifying keypoints of the same features—equivalent to photogrammetric tie points) to generate sparse and dense point cloud reconstructions of 3D space. SfM point clouds are not inherently georeferenced, and known locations of cameras or GCPs must be incorporated to transform the point cloud to real-world coordinates. SfM point clouds are similar to lidar datasets with the addition of RGB information for each point. Commonly used SfM desktop software packages include Agisoft PhotoScan, Pix4D, and VisualSfM. Cloud-based alternatives (e.g., DroneDeploy) are also available.
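Transforming an SfM point cloud into real-world coordinates amounts to estimating a seven-parameter similarity transform (scale, rotation, translation) from GCPs observed both in the arbitrary model coordinates and in a projected coordinate system. A minimal sketch using the closed-form Umeyama/Procrustes solution, with synthetic coordinates standing in for real GCPs:

```python
import numpy as np

# Illustrative georeferencing of an SfM point cloud: estimate a 7-parameter
# similarity transform (scale, rotation, translation) from GCPs known in
# both model and world coordinates. Coordinates below are synthetic.

def similarity_transform(model_pts, world_pts):
    """Return (scale, R, t) such that world ~ scale * R @ model + t."""
    mu_m = model_pts.mean(axis=0)
    mu_w = world_pts.mean(axis=0)
    A = model_pts - mu_m
    B = world_pts - mu_w
    H = A.T @ B
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    scale = np.trace(np.diag(S) @ D) / (A ** 2).sum()
    t = mu_w - scale * R @ mu_m
    return scale, R, t

# Synthetic check: world GCPs are a scaled, shifted copy of model GCPs
model = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
world = 2.0 * model + np.array([500000.0, 4000000.0, 250.0])
s, R, t = similarity_transform(model, world)
georeferenced = s * model @ R.T + t   # apply the transform to every point
```

SfM packages solve this (and more general bundle-adjustment refinements) internally; the sketch only shows why at least three non-collinear GCPs are needed to fix scale, orientation, and position.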
Images can also be processed to produce very high spatial resolution orthophotos. Proper orthophoto production requires removal of radiometric effects (e.g., vignetting, brightness variation from image-to-image, conversion to reflectance values; see Mathews 2015) and geometric effects (e.g., lens distortion, relief displacement; see Kelcey and Lucieer 2012). Geometric corrections remain especially challenging when using uncalibrated sensors at low altitudes where distortions are magnified (Mathews 2015).
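Lens distortion is commonly described with the Brown radial polynomial, whose coefficients SfM packages estimate during self-calibration. The sketch below applies and then inverts a two-coefficient radial model on normalized image coordinates; the coefficient values are invented for illustration:

```python
import numpy as np

# Illustrative geometric correction: a two-coefficient Brown radial
# distortion model on normalized image coordinates. The coefficients
# are made up; real values come from camera (self-)calibration.

k1, k2 = -0.12, 0.025   # assumed radial distortion coefficients

def distort(xy):
    """Apply radial distortion to normalized image coordinates (N x 2)."""
    xy = np.asarray(xy, dtype=float)
    r2 = (xy ** 2).sum(axis=1, keepdims=True)
    return xy * (1.0 + k1 * r2 + k2 * r2 ** 2)

def undistort(xy, iters=20):
    """Invert the radial model by fixed-point iteration."""
    xy = np.asarray(xy, dtype=float)
    und = xy.copy()
    for _ in range(iters):
        r2 = (und ** 2).sum(axis=1, keepdims=True)
        und = xy / (1.0 + k1 * r2 + k2 * r2 ** 2)
    return und

pts = np.array([[0.3, -0.2], [0.8, 0.6]])
restored = undistort(distort(pts))
```

Because low-altitude imagery fills the frame with nearby terrain, even small radial errors displace features noticeably, which is why uncalibrated consumer sensors make orthophoto production difficult.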
Advanced analyses use lidar data filtering and classification techniques to extract height information from SfM point clouds, either by generating Digital Terrain Models (DTMs) and Digital Surface Models (DSMs) (Fonstad et al. 2013) or computing height metrics to characterize vegetation canopy structure and biomass (see Dandois and Ellis 2013; Mathews and Jensen 2013). The temporal flexibility of data collected by UAS allows for 3D/volumetric change analyses via DTM/DSM differencing and/or direct point cloud comparison. Object-based image analysis techniques are commonly applied to analyze and extract useful vector data from 2D orthophotos (Laliberte et al. 2010).
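DTM/DSM differencing reduces to raster arithmetic once the models are co-registered on a common grid: subtracting the DTM from a DSM yields per-cell object or canopy heights, and subtracting DSMs from two dates yields elevation change, which multiplied by cell area gives volume change. A toy numpy example with made-up grids:

```python
import numpy as np

# Illustrative raster differencing: a canopy height model (CHM) from
# DSM - DTM, and volumetric change from DSM differencing between two
# survey dates. The tiny grids and cell size below are invented.

cell_area = 0.25  # m^2 per cell (assumed 0.5 m grid resolution)

dsm_t1 = np.array([[12.0, 14.0], [11.0, 13.0]])  # surface elevations, date 1
dsm_t2 = np.array([[12.5, 13.5], [11.0, 13.5]])  # surface elevations, date 2
dtm    = np.array([[10.0, 10.0], [10.0, 10.0]])  # bare-earth terrain

chm = dsm_t1 - dtm                            # per-cell object/canopy height (m)
dz = dsm_t2 - dsm_t1                          # elevation change between dates (m)
volume_change = float(dz.sum() * cell_area)   # net volumetric change (m^3)

print(chm)
print(f"net volume change: {volume_change:.3f} m^3")
```

Real workflows add steps this sketch omits, notably co-registration of the two surveys and propagation of vertical uncertainty before interpreting small differences as change.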
Data collection and processing standards as well as comprehensive accuracy assessments remain in early stages. Some solutions have been proposed to optimize workflow efficiency and improve model accuracy (see Turner et al. 2012; James and Robson 2014), but considerable uncertainty remains.
Most GIS&T research using UAS has been applied. Table 1 outlines several major application areas with some related works for further reading.
| Application area | Related works |
| --- | --- |
| Terrain modeling | Stefanik et al. 2011; Fonstad et al. 2013 |
| Geomorphological and fluvial processes | Flener et al. 2013; Dietrich 2016 |
| Vegetation structure, forestry, and ecosystem modeling | Wallace et al. 2012; Dandois and Ellis 2013 |
| Precision agriculture | Baluja et al. 2012; Mathews and Jensen 2013 |
| Land and natural resource management | Rango et al. 2009; Laliberte et al. 2010 |
| Animal habitat and monitoring | Chabot et al. 2014 |
| Natural disasters (wildfire, landslides) | Ambrosia et al. 2003; Niethammer et al. 2012 |
| Meteorology | Frew et al. 2012 |
| Cultural features and archaeology | Eisenbeiss and Sauerbier 2011 |
As UAS usage becomes more common, ongoing discourse surrounding their role in ‘citizen mapping’ such as participatory mapping and citizen science projects as well as their role in ‘mapping citizens’ is critical. As of 2016, few citizen science and participatory mapping projects were engaging with aerial platforms (see Cummings et al. 2017), but as scientists increasingly recognize the utility of complementing traditional remote sensing with sensors onboard UAS, the use of UAS in citizen science projects will rise. Websites such as OpenAerialMap (https://openaerialmap.org/) and Dronestagram (http://www.dronestagr.am/) allow users to share UAS-acquired images online. The Humanitarian OpenStreetMap Team has crowdsourced (i.e., microtasked) the digitization of UAS-acquired imagery to support disaster recovery, and many small civilian UAS projects have been successfully completed around the world for similar purposes.
Conversely, ‘mapping citizens’ has implications for location privacy, which concerns individuals’ claim to determine when, how, and to what extent information about themselves and their location is communicated to others (Kerski 2016). These locational privacy protections face new challenges given the ease with which very high resolution imagery and other data can be captured from UAS. Privacy concerns of citizens surrounding UAS are fluid and evolving, and it will be important for GIS&T researchers to remain engaged in these societal questions as adoption of UAS technology increases.
Ambrosia, V. G., Wegener, S. S., Sullivan, D. V., Buechel, S. W., Dunagan, S. E., Brass, J. A., Stoneburner, J., and Schoenung, S. M. (2003). Demonstrating UAV-acquired real-time thermal data over fires. Photogrammetric Engineering & Remote Sensing, 69(4), 391-402. doi: 10.14358/PERS.69.4.391
Baluja, J., Diago, M. P., Balda, P., Zorer, R., Meggio, F., Morales, F. and Tardaguila, J. (2012). Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrigation Science, 30(6): 511-522. doi: 10.1007/s00271-012-0382-9
Colomina, I., and Molina, P. (2014). Unmanned aerial systems for photogrammetry and remote sensing: a review. ISPRS Journal of Photogrammetry and Remote Sensing, 92: 79-97. doi:10.1016/j.isprsjprs.2014.02.013
Cummings, A.R., Cummings, G. R., Harner, E., Moses, P., Norman, Z., Captain, V., Bento, R., and Butler, K. (2017). Developing a UAV-Based Monitoring Program with Indigenous Peoples. Journal of Unmanned Vehicle Systems. doi:10.1139/juvs-2016-0022.
Dandois, J. P., and Ellis, E. C. (2013). High spatial resolution three-dimensional mapping of vegetation spectral dynamics using computer vision. Remote Sensing of Environment, 136, 259-276. doi: 10.1016/j.rse.2013.04.005
Dietrich, J.T. (2016). Bathymetric structure-from-motion: extracting shallow stream bathymetry from multi-view stereo photogrammetry. Earth Surface Processes and Landforms, 42(2), 355-364. doi: 10.1002/esp.4060
Eisenbeiss, H., and Sauerbier, M. (2011). Investigation of UAV systems and flight modes for photogrammetric applications. The Photogrammetric Record 26(136), 400-421. doi: 10.1111/j.1477-9730.2011.00657.x
Flener, C., Vaaja, M., Jaakkola, A., Krooks, A., Kaartinen, H., Kukko, A., Kasvi, E., Hyyppä, H., Hyyppä, J., and Alho, P. (2013). Seamless Mapping of River Channels at High Resolution Using Mobile LiDAR and UAV-Photography. Remote Sensing, 5(12), 6382-6407. doi: 10.3390/rs5126382
Fonstad, M.A., Dietrich, J. T., Courville, B. C., Jensen, J. L. and Carbonneau, P.E. (2013). Topographic structure from motion: A new development in photogrammetric measurement. Earth Surface Processes and Landforms, 38(4), 421-430. doi: 10.1002/esp.3366
Frew, E., Elston, J., Argrow, B., Houston, A., and Rasmussen, E. (2012). Sampling severe local storms and related phenomena using unmanned aircraft systems. IEEE Robotics & Automation Magazine, 19(1), 85-95. doi: 10.1109/mra.2012.2184193
Furukawa, Y., and Ponce, J. (2010). Accurate, dense and robust multiview stereopsis. IEEE Transactions on Pattern Analysis and Machine Intelligence 32(8), 1362-1376. doi:10.1109/tpami.2009.161
Harwin, S., Lucieer, A., and Osborn, J. (2015). The impact of calibration method on the accuracy of point clouds derived using unmanned aerial vehicle multi-view stereopsis. Remote Sensing, 7(9), 11933–11953. doi: 10.3390/rs70911933
James, M.R., and Robson, S. (2014). Mitigating systematic error in topographic models derived from UAV and ground-based image networks. Earth Surface Processes and Landforms 39, 1413-1420. doi: 10.1002/esp.3609
Kelcey, J., and Lucieer, A. (2012). Sensor Correction of a 6-Band Multispectral Imaging Sensor for UAV Remote Sensing. Remote Sensing, 4(5), 1462-1493. doi: 10.3390/rs4051462
Kerski, J. (2016). Location Privacy. The Geographic Information Science & Technology Body of Knowledge (3rd Quarter 2016 Edition), John P. Wilson (ed.). doi: 10.22224/gistbok/2016.3.2
Laliberte, A., Herrick, J. E., Rango, A., and Winters, C. (2010). Acquisition, orthorectification, and object-based classification of unmanned aerial vehicle (UAV) imagery for rangeland monitoring. Photogrammetric Engineering & Remote Sensing, 76(6): 661-672. doi: 10.14358/PERS.76.6.661
Lowe, D. (2004). Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision 60(2), 91–110. doi: 10.1023/B:VISI.0000029664.99615.94
Mathews, A. J., and Jensen, J. L. (2013). Visualizing and quantifying vineyard canopy LAI using an unmanned aerial vehicle (UAV) collected high density structure from motion point cloud. Remote Sensing 5(5), 2164–2183. doi: 10.3390/rs5052164
Mathews, A. J. (2015). A practical UAV remote sensing methodology to generate multispectral orthophotos for vineyards: Estimation of spectral reflectance using compact digital cameras. International Journal of Applied Geospatial Research 6(4), 65-87. doi:10.4018/ijagr.2015100104.
Namuduri, K., Wan, Y., and Gomathisankaran, M. (2013). Mobile Ad Hoc Networks in the Sky: State of the Art, Opportunities, and Challenges. Proceedings of the second ACM MobiHoc workshop on Airborne networks and communications - ANC 13. Bangalore, India. doi: 10.1145/2491260.2491265
Niethammer, U., James, M. R., Rothmund, S., Travelletti, J. and Joswig, M. (2012). UAV-based remote sensing of the Super-Sauze landslide: Evaluation and results. Engineering Geology 128, 2-11. doi: 10.1016/j.enggeo.2011.03.012
Rango, A., Laliberte, A., Herrick, J. E., Winters, C., Havstad, K., Steele, C., and Browning, D. (2009). Unmanned aerial vehicle-based remote sensing for rangeland assessment, monitoring, and management. Journal of Applied Remote Sensing, 3(1), 033542. doi: 10.1117/1.3216822
Snavely, N., Seitz, S. M., and Szeliski, R. (2008). Modeling the world from internet photo collections. International Journal of Computer Vision, 80(2), 189-210. doi: 10.1007/s11263-007-0107-3
Stefanik, K.V., Gassaway, J. C., Kochersberger, K. and Abbott, A. L. (2011). UAV-based stereo vision for rapid aerial terrain mapping. GIScience & Remote Sensing 48(1), 24-49. doi: 10.2747/1548-1603.48.1.24
Toth, C., Jozkow, G., and Grejner-Brzezinska, D. (2015). Mapping with small UAS: a point cloud accuracy assessment. Journal of Applied Geodesy 9(4), 213-226. doi: 10.1515/jag-2015-0017
Turner, D., Lucieer, A., and Watson, C. (2012). An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds. Remote Sensing 4(5), 1392–1410. doi:10.3390/rs4051392
Wallace, L., Lucieer, A., Watson, C. and Turner, D. (2012). Development of a UAV-lidar system with application to forest inventory. Remote Sensing 4(6), 1519-1543. doi:10.3390/rs4061519
- How does GIS&T benefit from the incorporation of UAS technology? Be specific about the benefits of spatial and temporal data resolution.
- Identify the two UAS platform types and describe the strengths/weaknesses of each platform regarding their ability to collect data.
You would like to acquire terrain data for a small (~0.5 ha) open area in a rural location.
- What type of UAS would you require? Why?
- What sensor(s) would you utilize?
- Describe your mission plan.
- What legal regulations must you consider before you fly?
How are UAS-captured aerial images processed to generate geospatial data?
- What is Structure from Motion?
- What data products can be produced from UAS-captured imagery?
- How are UAS-acquired data properly georeferenced?
- Discuss two geospatial applications of UAS technology. Why is the integration of UAS technology especially beneficial to these application areas?
- Regarding the adoption of UAS as a data collection platform, describe a societal challenge facing civilian usage of UAS.
- Academy of Model Aeronautics (AMA) - www.modelaircraft.org
- Association for Unmanned Vehicle Systems International - www.auvsi.org
- Federal Aviation Administration (FAA) Unmanned Aircraft Systems Homepage - www.faa.gov/uas
- United States Geological Survey (USGS) National Unmanned Aircraft Systems Project Office - uas.usgs.gov