









UNIVERSITY OF LAGOS AKOKA
NAME: TAIWO JOKOTOLA. T

MATRIC NO: 090906052

PROGRAM: M.Sc. GEOGRAPHY (Geomorphology option)

COURSE CODE: GRY 813
Topic: EMERGING REMOTE SENSING SYSTEMS – UAVs (UNMANNED AERIAL VEHICLES)

COURSE TITLE: REMOTE SENSING SYSTEMS
LECTURER: Dr M.J. Fasona
EMERGING REMOTE SENSING SYSTEMS – UAVs
(UNMANNED AERIAL VEHICLES)
1.0 Introduction
Remote sensing data acquired from satellites and piloted aircraft are powerful tools for measuring the state and trends of environmental changes associated with natural processes and human-induced alterations of the environment. For many situations, data from these platforms provide the only way to measure features or processes on the Earth's surface and in the atmosphere, and to evaluate how they are changing. To address the growing demand for data on the state of the environment, the science of remote sensing is continually evolving (Whitehead 2014).
The conventional airborne and satellite remote sensing platforms upon which most sensors are mounted have not always met the demands of researchers and environmental professionals. For many environmental applications these platforms pose challenges or require trade-offs, such as high cost, lack of operational flexibility, limited versatility, and (or) poor spatial and temporal resolution.
1.0.1 Relevance of this issue
Although many types of satellite imagery are readily available – low resolution for free online (Landsat, MODIS) and high resolution for purchase (WorldView, QuickBird) – they sometimes cannot offer sufficiently high resolution, cover the specific area of study, or capture the time series necessary to fulfil the entire purpose of a project. For several types of situations, satellite imagery and remote sensing analysis are the only way to see what has occurred on the ground, but sometimes the information collected is not adequate. If the image resolution is not high enough to see exact areas of devastation or change, coverage of an entire affected area is not available, or imagery is simply too expensive to acquire, then an analysis will be difficult to complete. The generally low-cost, high-resolution image capture capability of eco-drones creates the potential for them to fill the data gap between satellites and ground surveying in the aforementioned cases. In addition, eco-drones can do much more than image acquisition, occasionally making them advantageous over typical satellite or aircraft image acquisition.

In addition to image capture, eco-drones can function as a real-time monitoring mechanism for disaster events or illegal resource extraction, distribute broadcast messages and collect and transmit meteorological data (CielMap, 2012). Drones can also fly in riskier and more treacherous areas than humans or manned aircraft can traverse, such as inaccessible shorelines or hurricanes (Nagai et al., 2008; Watts et al., 2012; NOAA, 2012). Due to the size and aerodynamics of drones, they are able to fly at lower altitudes, collecting more precise information than manned aircraft or satellites. This also means that they can fly below clouds making them advantageous in tropical areas where clouds can often impede satellite image collection. A typical UAS can capture images at about 6 cm spatial resolution when flying at an altitude slightly over 200 m (Rango and Laliberte, 2010). With changing ecosystems and disaster dynamics caused by climate change and urbanisation, as well as the elusive presence of environmental crime, on-demand aerial data collection and real-time environmental monitoring will become increasingly important.

The ability to identify, measure, and forecast environmental changes requires remote sensing data that match the resolution of the changes and the associated processes. Often data acquired from conventional remote sensing platforms do not have the resolution and operational flexibility to address this challenge effectively or affordably. Attempts have been made with different types of low-cost platforms to overcome this gap (e.g., telescoping masts, Schlitz 2004; balloons, Vierling et al. 2006; and kites, Wundram and Loffler 2007), but limited adoption suggests these platforms have not met the demands of the research and professional communities.
We suspect this is because they involve customised designs, are operationally impractical for some environments, or rely on manual control. One area of recent innovation in terms of remote sensing platforms is the development and application of small unmanned aircraft systems (UASs), also known as drones, unmanned aerial vehicles (UAVs), or remotely piloted aircraft.
1.1 Definition of the UAV
A UAS is a system of systems—that is, a set of complementary technologies brought together to fulfil a specific task—and, as such, there currently exists a wealth of different systems: it may be said that there is one for every combination of technologies. At the highest level of the UAS technology breakdown, three main UAS components are commonly identified: the unmanned aerial vehicle, the ground control station and the communication data link.


1.2 COMPONENTS OF A UAS
It is important to clarify that a UAS consists of several components (Eisenbeiss 2009):
the aircraft or UAV,
a ground control system,
a pilot or navigator who operates the UAS from the ground control system, and
one or more spotters who monitor the UAS and other aircraft and hazards in the area.

1.3 TYPES OF DRONES

UAVs may be either:
fixed wing, or
rotary wing.
Fixed-wing UASs typically have greater speed and longer range. Rotary-wing UASs include miniature helicopters and multirotor platforms. Typically, they have shorter flight durations but offer greater manoeuvrability. Fixed-wing UASs are typically launched by hand or by catapult, and land with or without some form of arresting mechanism, such as a parachute, or by flying into a net. Rotary-wing UASs typically require some manual operation for take-off, and may or may not require manual operation for landing.

There are three general classes of drones (close range, short range and endurance, with close range generally the most cost-effective) and many different models that vary in size, flight time, camera capability, take-off and landing needs, and flight altitude. Drones can vary in wingspan from about 0.5 m to more than 35 m. Altitude is often restricted by government regulations, but some small drones fly as low as a few hundred metres while others fly as high as 6,000 m. Large-scale drones built for extreme endurance can fly as high as 20 km.









Characteristic | Close range | Short range | Endurance
Range | ~50 km | ~200 km | > 200 km
Flight time | 30 min – 2 hrs | 8–10 hrs | > 24 hrs
Weight | < 5 kg | < 5,000 kg | < 105 t
Speed | ~60 km/h | < 485 km/h | < 730 km/h
Altitude | < 6 km | < 16 km | < 20 km
Cost (USD) | $500–$70,000 | < $8 million | < $123 million

Table 1: Characteristics of the three general classes of drones: close range, short range and endurance (Sources: Lucintel 2011; Koh and Wich 2012; CielMap 2012; t = metric tonnes).

Different models of drones are advantageous for different applications. For example, a fixed wing unmanned aircraft (Figure 1A) is best when an extended flight time is required over a long distance. If a small area is being mapped, a multicopter (Figure 1B), which can remain stationary, is best to use. Multicopters such as quad-rotors are also useful in areas of rugged terrain (Niethammer et al., 2012).

Figure 1a: Fixed-wing UAV

Figure 1b: Multicopter

Figure 1: Types of unmanned aircraft (drones).

1.4 Names and Acronyms According to Countries and Organisations

UAS are known under various names and acronyms, such as "Unmanned Aerial Vehicle" (UAV), "aerial robot" or simply "drone," with "UAV" and "drone" being the most popular terms. The term UAS was adopted by the US Department of Defense (DoD) and the Civil Aviation Authority (CAA) of the UK. The International Civil Aviation Organization (ICAO) has introduced the concept of the "Remotely-Piloted Aircraft System" (RPAS), a particular class of UAS, in ICAO Circular 328 (ICAO, 2011). UAS refers to the system comprising an unmanned aircraft (UA), a ground control station (GCS) and a communications data link for the UA command and control (C2) from the ground control station. A drone can be defined as 'an aircraft with the capacity to fly semi- or fully autonomously thanks to an on-board computer and sensors' (CielMap, 2012).

This term paper will refer to this system as the UAV (unmanned aerial vehicle), which consists of:
the unmanned aircraft (UA),
the ground control station (GCS), and
the communication data link (CDL).


1.5 Early developments
Balloons are probably the oldest platforms for aerial observation. As a matter of fact, aerial photographs of Paris were already being captured in 1858 by Tournachon aboard a hot-air balloon. Later on, and thanks to the simplification of camera technology, other means such as kites (used by the English meteorologist E.D. Archibald in 1882) and rockets (as used by the Swedish inventor Alfred Nobel in 1897) were used for aerial photography.
Perhaps one of the most exciting early experiments was the use of small cameras mounted on the breasts of pigeons from the Bavarian Pigeon Corps, as proposed by J. Neubronner in 1903. Thus, Tournachon's adventures aside, one may conclude that the oldest form of aerial remote sensing was actually based on remotely-piloted vehicles.
As mentioned earlier, the UAS is a system of systems—that is, a set of complementary technologies brought together to fulfil a specific task—and, as such, there currently exists a wealth of different systems: it may be said that there is one for every combination of technologies. The categorization of the existing unmanned aircraft has been a constant and necessary exercise among the UAS community throughout its history "to bring order into chaos" (Everaerts, 2009). As a result of this need, there is a myriad of classifications of UAS, according to the various characteristics of the aerial platform (size and weight, endurance, aerodynamics, etc.) or the system operation (mission range or flying altitude, nature of its application, etc.). We review some of the current work on UAS classification hereafter. The work presented in Eisenbeiss (2009) with respect to UAS categorization is noteworthy; it considers powered and non-powered, heavier- or lighter-than-air platforms and performs an assessment of range, endurance, weather and wind dependency, and manoeuvrability. It additionally defines a proprietary classification based on price and payload, including on-board navigation sensor grade and, thus, geo-referencing or real-time capabilities, and application accuracy requirements.

Van Blyenburgh (2013) provides a vast inventory of world-wide UAS, including categorizations based on size, weight, operating range and certification potential. The classification based on size and operating range, with the corresponding number of inventoried platforms, is of special interest. At the upper end, the medium-range, endurance and exo-stratospheric groups comprise the largest UAS with the highest operating altitudes; these often carry fine and complex avionics and are only allowed to fly under special regulations by specific teams, generally military units. Together, they total 179 referenced UAS. Secondly, close-, short- and medium-range UAS are characterized by a Maximum Take-Off Weight (MTOW) between 150 and 1,250 kg and an operating range between 10 and 70 km. This group includes fixed- and rotary-wing developments from well-established companies (Raytheon, Northrop Grumman, Saab), generally equipped with remote sensing technologies, and totals 546 developments. Finally, the nano-micro-mini UAS class is defined by low weights and payload sizes, low flying altitudes and quick operational deployments, and concentrates up to 728 developments. More specifically, mini UAS, which are defined by an operating range of less than 10 km, are allowed to fly lower than the national ceilings of segregated airspaces, and feature less than two hours of endurance and less than thirty kilograms of MTOW (although the MTOW specification may vary from country to country), form the largest group, with 490 referenced systems.

Figure 2: A typical Landsat TM/SPOT 5 5 m pan-sharpened image, south-east of Ilorin.

2.0 The Sensing payload

A recent in-depth survey on optical remote sensing instruments for terrestrial static, terrestrial kinematic, airborne and space image acquisition systems can be found in Remondino (2011). Specifically for UAS, van Blyenburgh (2013) identifies 406 imaging and ranging instruments including active and passive systems, optical—from the visible band, to the Near Infrared (NIR) up to the Thermal Infrared (TIR)—and microwave systems.
This section concentrates on remote sensing instruments which might be suitable for micro, mini and tactical UA payloads, particularly distinguishing between visible-band, near-infrared, multispectral, hyperspectral, thermal, laser scanners and synthetic aperture radar.

Tables 3–8 describe fundamental characteristics of some common and/or representative sensors, and aim to complement sensor integrators' knowledge of available systems.

2.1 Visible-band, near-infrared and multi-spectral cameras
The photogrammetric and remote sensing communities have benefited from the strength of the mass market and other professional markets, leveraging these to design high-resolution remote sensing instruments.

Table 3: Common and/or representative small format (SF) and medium format (MF) visible-band cameras.

Manufacturer and model | Format and sensor | Resolution (MPx) | Size (mm²) | Pixel size (µm) | Weight (kg) | Frame rate (fps) | Shutter speed (s⁻¹)
Phase One iXA 180 | MF CCD | 80 | 53.7 × 40.4 | 5.2 | 1.70 | 0.7 | 4000 (fp), 1600 (ls)
Trimble | MF CCD | 80 | 53.7 × 40.4 | 5.2 | 1.50 | – | 1000 (ls)
Sony NEX-7 | SF CMOS (MILC) | 24.3 | 23.5 × 15.6 | 3.9 | 0.35 | 2.3 | 4000 (fp)
Ricoh GXR A16 | SF CMOS (IUC) | 16.2 | 23.6 × 15.7 | – | – | – | –

fp: focal-plane shutter; ls: leaf shutter.
(Source: Colomina and Molina, 2014)
Table 4: Common and/or representative multispectral cameras for UAS.

Manufacturer and model | Sensor | Resolution (MPx) | Size (mm²) | Pixel size (µm) | Weight (kg) | Spectral range (nm)
Tetracam MiniMCA-6 | CMOS | 1.3 | 6.66 × 5.32 | 5.2 | 0.7 | 450–1050
Quest Innovations Condor-5 UAV-285 | CCD | 1.4 | 10.2 × 8.3 | 7.5 × 8.1 | 0.8 | 400–1000

(Source: Colomina and Molina, 2014)

Table 5: Common and/or representative hyperspectral cameras for UAS.

Manufacturer and model | Sensor | Size (mm²) | Pixel size (µm) | Weight (kg) | Spectral range (nm) | Spectral bands and resolution
Rikola Ltd. Hyperspectral Camera | CMOS | 5.5 × 5.6 | – | 0.6 | 500–900 | 40 bands, 10 nm
Headwall Photonics Micro-Hyperspec X-series NIR | – | 9.6 × 9.6 | 30 | 1.025 | 900–1700 | 62 bands, 12.9 nm

(Source: Colomina and Molina, 2014)

Table 6: Common and/or representative thermal cameras for UAS.

Manufacturer and model | Size (mm²) | Pixel size (µm) | Weight (kg) | Spectral range (µm) | Thermal sensitivity (mK)
FLIR TAU2 640 | 10.8 × 8.7 | 17 | 0.07 | 7.5–13.5 | 50
Thermoteknix Systems Ltd | 16 | 25 | 0.105 | 8–12 | 50

(Source: Colomina and Molina, 2014)



Table 7: Common and/or representative laser scanners for UAS.

Manufacturer and Model
Scanning pattern
Range(m)
Weight(kg)
Angular resolution
(deg)
FOV
(deg)
Laser class and WL(nm)
Frequency
(Kp/s)
Application
Ibeo automotive systems
4 scanning
200
1
(H) 0.125
(H) 110
Class A
22
A
IBEO LUX
Parallel lines


(V) 0.8

905


Velodyne
32 laser detector
100
2
(H)

Class A
700
MM
HDL-32E
Pairs


(V) 1.33
(V) 41
905


RIEGL
1 Scanning
1000

(H) 0.01
(H)
60
Class 3B
200
H
A: automotive; MM: terrestrial mobile mapping; H: hydrography.
(Source: Colomina, and Molina 2014)

Table 8: Common and/or representative synthetic aperture radars for UAS.

Manufacturer and model | Spectral bands | Weight (kg) | Transmitted power (W) | Resolution (m)
IMSAR | X and Ku | 1.58 | 1 | 0.3–5
Fraunhofer FHR | W | – | 0.1 | 0.15
NASA JPL | L | 200 | 2000 | 2
Selex Galileo | X | 10 | – | 1

(Source: Colomina and Molina, 2014)
Choice of radar frequency
Application factors:
– Radar wavelength should be matched to the size of the surface features to be discriminated.
– e.g., ice discrimination (small features): use X-band
– e.g., geology mapping (large features): use L-band
– e.g., foliage penetration (better at lower frequencies): use P-band
In general, C-band is a good compromise.
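As a toy illustration of this matching rule, the sketch below picks the band whose approximate centre wavelength is closest to the feature size of interest. The wavelength figures are standard approximations; the helper function and its names are our own, not a standard API.

```python
# Approximate radar band centre wavelengths in centimetres.
BANDS = {"X": 3.1, "C": 5.6, "S": 9.4, "L": 23.5, "P": 70.0}

def suggest_band(feature_size_cm, foliage_penetration=False):
    """Pick the band whose wavelength is closest to the feature size;
    prefer long wavelengths when canopy penetration matters."""
    if foliage_penetration:
        return "P"
    return min(BANDS, key=lambda b: abs(BANDS[b] - feature_size_cm))

print(suggest_band(3))          # small ice features -> "X"
print(suggest_band(25))         # large geological structures -> "L"
print(suggest_band(5, True))    # imaging under canopy -> "P"
```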

The maturation of visible-spectrum cameras has influenced several side-technologies, for example, mobile phones. Current smartphones come equipped with high-quality cameras at reasonably low cost. Yun et al. (2012) report on a fixed-wing UAS equipped with Samsung Galaxy S and S2 smartphones to produce a digital elevation model (DEM) of a construction area in South Korea. More recently, the French company Lehman Aviation presented the LA300 UAS, featuring an on-board 41-megapixel Nokia Lumia 1020 optical camera. Moving to the upper portion of the spectrum, several small, commercial multispectral cameras exist and have already been used in UAS for photogrammetry and remote sensing (PaRS). This is the case of the mini-MCA series from Tetracam, which is available in three configurations to capture four, six and twelve channels (Tetracam, 2012). These cameras are able to discriminate spectral reflectance, which is an important indicator in applications related to vegetation health. Bendig et al. (2012) and Lucieer et al. (2012) describe developments using mini-MCA cameras, and Kelcey and Lucieer (2012) report on the geometric and radiometric calibration of a mini-MCA. Table 3 summarizes common and/or representative RGB cameras suitable for UAS PaRS, as reported in several experiments. Bäumker and Przybilla (2011) report on the Ricoh GXR using the A16 lens, featuring 16.2 MPx and variable focal length (24–85 mm), and Küng et al. (2011) report on the use of the Sony NEX-5. In the same product family, the Sony NEX-7 is present in commercial systems such as the Falcon 8, from Ascending Technologies, or the Aeromapper, from Aeromao. In addition, some sensors are described as potential candidates in view of their specifications. For ease of comparison, we follow the same structure as in Remondino (2011) for the visible-band cameras. Finally, Table 4 describes some of the most common developments in the field of multispectral sensing for UAS, and Table 5 provides information on recent relevant developments in hyperspectral sensors for UAS.


3.0 UAS regulations and remote sensing

There are several characteristics of remote sensing surveys with small UASs that stand out from those performed by satellites or manned aircraft. For the time being, many of the differences are driven by aviation regulations that are in place to address safety issues arising from civil uses of UASs. However, the regulations also place restrictions on how UASs are operated, which in turn has a major impact on the types of data that can be acquired from these platforms. In this section, we outline some of the key regulatory criteria that distinguish remote sensing data acquired by UASs.

At the time of writing, many countries and organisations are in the process of establishing or
revamping regulatory frameworks for integrating UASs into civil airspace (e.g., United States; Hugenholtz et al. 2012), which makes it difficult to define a consistent set of regulations affecting UAS-based remote sensing. Nevertheless, according to existing rules in countries like Canada, the United States, and the United Kingdom, there are three criteria that, in our estimation, are likely to persist into the future: limited flying altitude, flying within visual range, and proximity to built-up areas. In Canada and the United Kingdom, operators of small UASs are required to fly below 400 feet (122 m) above ground level unless otherwise specified. In the United States, similar height restrictions generally apply, although Rango and Laliberte (2010) were able to obtain permission to fly at up to 1,000 feet (305 m) above ground level. For remote sensing data, the low-flying height enables the acquisition of ultra-high-resolution imagery (centimetre scale), which is a major benefit for some applications, but also introduces a trade-off in that a large number of images, perhaps several hundred, may be required to completely cover the area of interest. The trade-off emerges in the image processing, such that variations in viewing geometry, as well as the roll, pitch, and yaw of the aircraft, can yield radiometric and geometric distortions in the final image mosaic.
Requirements that the aircraft be in visual range at all times place an effective limit on the distance between the operator and the aircraft, which varies according to the shape and colour of the aircraft and the atmospheric conditions during flight. This requirement places a limit on the size of the survey area, thereby often necessitating extra flights to cover larger areas. Future advances in sense-and-avoid technology may permit UAS flights beyond visual range, thus enabling UAS-based remote sensing of larger areas. The final regulatory criterion that is likely to persist into the future is the restriction of UAS flights near built-up areas. Without doubt, the high resolution of UAS surveys would be of benefit for many engineering and construction projects in urban environments. However, even with the development of reliable sense-and-avoid technology, public safety (and possibly privacy) considerations are likely to rule out UAS remote sensing in this context.

4.0 UAS remote sensing
4.1 Survey and flight planning
The remote sensing workflow for small UASs is essentially an adaptation of the same steps and processes used for piloted aircraft surveys, and in both cases, aviation regulations place certain restrictions on how the surveys are configured. Though each UAS survey is unique in nature, the same generic workflow is normally followed. Typically, a UAS survey starts with flight planning (Hugenholtz et al. 2013). This stage relies on specialised flight-planning software and uses a background map or satellite image to define the survey area.
Additional information is then added, such as the desired flying height, the focal length and orientation of the camera, the desired amount of overlap between images, and the desired flight direction. The flight-planning software will then calculate the optimal solution to obtain overlapping stereo imagery covering the area of interest. During this process, the various parameters can be adjusted until the operator is satisfied with the flight plan.
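To make the underlying geometry concrete, the image footprint, ground sample distance (GSD), exposure spacing and flight-line spacing can all be derived from the camera parameters and flying height with a simple pinhole model. The sketch below is illustrative only: the function and parameter names are our own, and real flight-planning software additionally accounts for terrain, turns and battery constraints.

```python
def flight_plan(altitude_m, focal_mm, sensor_w_mm, sensor_h_mm,
                img_w_px, img_h_px, forward_overlap, side_overlap):
    """Footprint, ground sample distance (GSD), exposure spacing and
    flight-line spacing for a nadir-pointing frame camera (pinhole model)."""
    scale = altitude_m / (focal_mm / 1000.0)      # image-to-ground scale factor
    foot_w = (sensor_w_mm / 1000.0) * scale       # across-track footprint (m)
    foot_h = (sensor_h_mm / 1000.0) * scale       # along-track footprint (m)
    gsd_cm = foot_w / img_w_px * 100.0            # ground sample distance (cm/px)
    along = foot_h * (1.0 - forward_overlap)      # distance between exposures (m)
    across = foot_w * (1.0 - side_overlap)        # distance between flight lines (m)
    return gsd_cm, foot_w, foot_h, along, across

# Example: 16 mm lens on an APS-C sensor (23.5 x 15.6 mm, 6000 x 4000 px),
# flown at 120 m with 80 % forward and 60 % side overlap.
gsd, fw, fh, d_photo, d_line = flight_plan(120, 16, 23.5, 15.6,
                                           6000, 4000, 0.80, 0.60)
print(f"GSD {gsd:.1f} cm/px; footprint {fw:.0f} x {fh:.0f} m; "
      f"trigger every {d_photo:.0f} m; lines {d_line:.0f} m apart")
```

With these assumed numbers the plan works out to roughly a 3 cm GSD, camera triggers about every 23 m along track, and flight lines about 70 m apart, which is the kind of solution the planning software iterates on as the operator adjusts parameters.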

As part of the planning stage, the camera shutter speed may need to be set manually, although many consumer-grade cameras now use predefined automatic modes for different lighting conditions. When setting the shutter manually, experience tends to trump any given rule of thumb, because the setting depends on the ground surface cover and the ambient light levels. If the exposure time is too long, the imagery will be blurred or over-exposed; if it is too short, the imagery might be too dark to discriminate key features of interest. As a rough check, a fixed-wing UAS flying at 15 m/s with a 1/1000 s exposure smears each image point by about 1.5 cm across the ground, which is already on the order of the ground sample distance itself.
Once a flight plan has been generated, it is uploaded to the UAS autopilot. The instructions contained in the flight plan are used by the autopilot to calculate the necessary climb rates and positional adjustments that enable the aircraft to follow the planned course as closely as possible.



Fig. 3: Flight planning example (a) showing image waypoints and flight lines and (b) image footprints with overlap. The home point denotes the location of the ground control system and the location used for take-off and landing.
On completion of the flight, a log file is usually downloaded from the aircraft autopilot (Hugenholtz et al. 2013). This file contains details about the recorded aircraft position and attitude throughout the flight, as well as details about when the camera was triggered. This log file is typically used to provide initial estimates for image centre positions and camera orientations, which are then used as inputs to the photogrammetric process. These initial estimates will reflect the accuracy of on-board instrumentation. For example, low-cost UAS using inexpensive mapping-grade navigational sensors will typically have positional accuracies in the 2–5 m range (Turner et al. 2014). Further errors will be introduced if there is uncertainty in the timing of the camera shutter release. Such errors can be significant, with a one second delay potentially resulting in an error of 30–50 m in the direction of flight for a fast-flying fixed-wing UAS.
4.2 Photogrammetry
Other than the collection of airborne video footage, the most common non-military application of UASs to date has been for large-scale photogrammetric mapping (e.g., Haala et al. 2011; d'Oleire-Oltmanns et al. 2012; Hugenholtz et al. 2013; Whitehead et al. 2013). Issues such as platform stability and the use of nonmetric cameras usually mean that the geometry of the imagery collected is of a lower quality than that obtained during traditional photogrammetric surveys carried out from manned aircraft (Hardin and Jensen 2011). UAS surveys also tend to collect images with large amounts of overlap. This is partly because the low flying height and comparatively low accuracy of on-board navigational sensors can lead to significant differences between the image footprints estimated during flight planning and the actual ground coverage of each image, especially in undulating terrain (Haala et al. 2011; Zhang et al. 2011). Image footprints can also drift from expectation because of changes in the roll, pitch, and yaw of the aircraft caused by wind and navigation corrections. In spite of these drawbacks, the low flying heights normally make it possible to gather imagery with sub-decimetre spatial resolution. This level of detail combined with low costs, flexibility in the timing of image acquisition, and short turn-around times makes UAS-based photogrammetry an attractive option for many potential users across a broad spectrum of research and professional applications.

Within the field of aerial photography in general, the last twenty years have seen high-resolution digital imagery largely replace analogue aerial photography, as well as the development of on-board navigational systems that provide accurate positional and attitude information. This has spurred the parallel development of automated photogrammetric processing packages, such as Inpho (e.g., Haala et al. 2011; Hugenholtz et al. 2013; Whitehead et al. 2013) and LPS (e.g., Laliberte and Rango 2011; d'Oleire-Oltmanns et al. 2012). These software packages provide a semi-automated workflow, allowing for the production of digital elevation models and orthophoto mosaics with limited operator
intervention. The photogrammetric processing chain for a typical UAS survey is described in detail by Hugenholtz et al. (2013) and by Whitehead et al. (2013), who describe processing using Trimble's Inpho software. The process is, however, the same for most photogrammetric software packages. The log file from the UAS autopilot is used to provide initial estimates for the position and orientation of each image. In addition, it is usual to include a number of accurately surveyed ground control points (GCPs) in the photogrammetric adjustment (see Fig. 3). These usually consist of specially placed targets that are surveyed with a GNSS at the time of the UAS survey (e.g., Hugenholtz et al. 2013).

Aerial triangulation refers to the process by which the true positions and orientations of the
images from an aerial survey are re-established. This process includes project setup, measurement of GCPs and manual tie points, and bundle-block adjustment (Hugenholtz et al. 2013; Whitehead et al. 2013). During aerial triangulation, a large number of automated tie points are generated for conjugate points identified across multiple images.
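For reference, the underlying geometric model is the collinearity condition: a ground point, the camera's perspective centre, and the corresponding image point lie on one straight line. In a common formulation (the notation here is ours), with $R$ the rotation matrix from object space to camera space:

$$\begin{pmatrix} u \\ v \\ w \end{pmatrix} = R \begin{pmatrix} X - X_c \\ Y - Y_c \\ Z - Z_c \end{pmatrix}, \qquad x = x_0 - c\,\frac{u}{w}, \quad y = y_0 - c\,\frac{v}{w},$$

where $(X, Y, Z)$ is the ground point, $(X_c, Y_c, Z_c)$ the perspective centre, $c$ the principal distance (focal length), and $(x_0, y_0)$ the principal point. The bundle-block adjustment finds the camera positions and rotations (and the ground coordinates of tie points) that minimise the reprojection residuals of all GCP and tie-point measurements across the block.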
After aerial triangulation, the oriented images may be used to generate a digital surface model
(DSM), which provides a detailed representation of the terrain surface, including the elevations of raised objects, such as trees and buildings. The DSM production process creates a dense point cloud by matching features across multiple image pairs (Whitehead et al. 2013). Another product that can be generated at this stage is a digital terrain model (DTM), which is often referred to as a bare-Earth model. For most purposes a DTM is a more useful product than a surface model, because the high frequency noise associated with vegetation cover is removed. A DTM can be produced in a number of ways, including filtering of the dense point cloud used to produce a DSM, or interpolation of a sparse point cloud (Arefi et al. 2009).
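As a minimal illustration of the point-cloud filtering route to a DTM (a sketch under simplifying assumptions, not any package's actual algorithm), a crude bare-earth filter can keep only the points close to the lowest elevation in each grid cell:

```python
import numpy as np

def simple_ground_filter(points, cell=5.0, tol=0.3):
    """Crude bare-earth filter: keep points within `tol` metres of the
    lowest point in each `cell` x `cell` metre grid cell.
    `points` is an (N, 3) array of x, y, z coordinates."""
    mins = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        if key not in mins or z < mins[key]:
            mins[key] = z  # track the minimum elevation per cell
    ground = [(x, y, z) for x, y, z in points
              if z <= mins[(int(x // cell), int(y // cell))] + tol]
    return np.array(ground)
```

Production workflows use considerably more robust filters (slope-based rules or progressive TIN densification), but the principle of separating low "ground" returns from elevated vegetation and buildings is the same.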

DTMs often require manual editing to remove the influence of larger buildings and heavily vegetated areas, which are generally not adequately filtered. Break lines and additional points are often added during this process, to augment the quality of the final DTM.
While often used interchangeably, the term digital elevation model as used in this paper is considered to be generic, and can thus refer to either a DSM or a DTM. After a DTM has been created, it can then be used to orthorectify the original images. Orthorectification refers to the removal of distortions caused by relief, which result from the central-perspective geometry associated with photography. Once orthorectified, the images have an orthogonal geometry and can be used for direct measurement. After orthorectification, the individual images can be combined into a mosaic, to provide a seamless image of the survey area at the desired resolution. Orthorectification can also be carried out using a DSM, but the amount of noise associated with dense vegetation can often cause the resulting orthoimage to have a choppy and irregular appearance.
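The orthorectification logic can be sketched as an inverse mapping: for each output ground cell, look up the terrain height, project the 3-D point into the source image through the collinearity equations given earlier, and sample the pixel. The code below is a toy nearest-neighbour version with conventions and names of our own choosing; real packages add lens-distortion correction, proper resampling kernels and seamline logic.

```python
import numpy as np

def project(ground_pt, cam):
    """Collinearity projection of a ground point to pixel coordinates.
    `cam` holds R (object-to-camera rotation), pos, focal length f (m),
    principal point pp (px) and pixel pitch (m/px) -- our own convention."""
    d = cam["R"] @ (np.asarray(ground_pt) - cam["pos"])
    x = -cam["f"] * d[0] / d[2]                # image-plane coordinates (m)
    y = -cam["f"] * d[1] / d[2]
    col = cam["pp"][0] + x / cam["pitch"]
    row = cam["pp"][1] - y / cam["pitch"]
    return row, col

def orthorectify(image, dtm, dtm_origin, dtm_res, cam,
                 out_shape, out_origin, out_res):
    """Fill each output cell with the nearest source pixel (toy example;
    origins are top-left corners, y decreasing with rows)."""
    ortho = np.zeros(out_shape, dtype=image.dtype)
    for r in range(out_shape[0]):
        for c in range(out_shape[1]):
            X = out_origin[0] + c * out_res    # ground coordinates of the cell
            Y = out_origin[1] - r * out_res
            i = int((dtm_origin[1] - Y) / dtm_res)   # nearest DTM cell
            j = int((X - dtm_origin[0]) / dtm_res)
            if not (0 <= i < dtm.shape[0] and 0 <= j < dtm.shape[1]):
                continue
            Z = dtm[i, j]                      # terrain height at this cell
            row, col = project((X, Y, Z), cam)
            if 0 <= int(row) < image.shape[0] and 0 <= int(col) < image.shape[1]:
                ortho[r, c] = image[int(row), int(col)]
    return ortho
```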
5.0 CHALLENGES INVOLVED IN THE PROCESSING OF UAV IMAGERY
There are a few challenges associated with using drones, such as the smaller image footprint (image area), but with further technological exploration and field testing, solutions can be found. Table 9 summarizes the main advantages and challenges.
Advantages | Challenges
Lightweight and easy to transport | Limited flight time, depending on model
Low-cost, high resolution images | Limited by camera weight
Low-cost operations | Air space limitations and restrictions
Can fly at a variety of altitudes depending on data collection needs | Can be limited by wind speed and gusts
Can map areas not accessible by car, boat, etc. on an on-demand time schedule | Limited amount of appropriate software
Video recording capabilities | Time intensive to create ortho-mosaics with minimal geographic reference errors
Quick availability of raw data | Due to small image footprint, numerous images must be captured

Table 9. Advantages and challenges associated with drones (Sources: Hardin and Hardin, 2010; Niethammer et al., 2012; CielMap, 2012; Rango and Laliberte, 2010).
UAS imagery is subject to:
variable scales,
high amounts of overlap,
variable image orientations, and
often high amounts of relief displacement arising from the low flying heights relative to the variation in topographic relief.
Direct georeferencing is still comparatively rare. Because of the extreme sensitivity such systems have to timing errors, they tend to use slow-moving vertical take-off and landing platforms, rather than fixed-wing UASs (e.g., Bláha et al. 2011; Turner et al. 2014). Direct georeferencing systems also require the use of high-end survey-grade components, and to achieve good results they also need to carry heavier cameras.

These factors mean that such systems are generally expensive, heavy, and have limited range (e.g., Nagai et al. 2009; Turner et al. 2014), but we surmise that UASs with these capabilities will become more common in the near future. These processing challenges are detailed and explained below:
Image classification
UASs can be used to gather aerial images at a considerably higher spatial resolution than has
hitherto been achieved, often to centimetre level (e.g., d'Oleire-Oltmanns et al. 2012; Harwin and Lucieer 2012; Turner et al. 2014). While this resolution offers a number of advantages, the amount of detail presents new challenges from the point of view of image classification. The brightness of an individual pixel represents an aggregate of the reflected solar radiation from the different cover types making up that pixel. Traditionally in remote sensing, the low resolutions of satellite imagery and high-altitude aerial imagery have tended to result in comparatively homogeneous clusters of pixels, which are well suited to pixel-based analysis techniques. However, at resolutions of only a few centimetres, the individual component parts of plants and trees often become apparent, with separate pixels representing leaves, branches, and underlying ground cover. Because of the high contrast differences between these features, mixed pixels, comprising various combinations of these components, will also tend to show greater variation than would be apparent for lower resolution imagery.
In such circumstances, pixel-based image classification algorithms are unlikely to give good results.
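One widely used response is to move from pixels to image objects: segment the image first, then classify segment-level statistics. The sketch below uses superpixel segmentation via scikit-image's SLIC as an illustrative choice, not the specific method of the studies cited here.

```python
import numpy as np
from skimage.segmentation import slic

def segment_features(rgb, n_segments=500):
    """Group pixels into spectrally similar segments and return
    per-segment mean colour, a basic object-based feature set."""
    labels = slic(rgb, n_segments=n_segments, compactness=10)
    feats = np.array([rgb[labels == s].mean(axis=0)
                      for s in np.unique(labels)])
    return labels, feats   # classify `feats` instead of raw pixels
```

Averaging over segments suppresses the leaf/branch/shadow variability within a canopy, restoring something closer to the homogeneous units that classical classifiers expect.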
Illumination issues
One factor that has thus far received little attention in the published literature is the effect of
variable illumination on the photogrammetric processing of UAS imagery. Differences between sunlit and shaded areas can be significant on a bright sunny day, especially where there are cumulus clouds overhead, which give sharp well-defined shadows. From our combined experience, such conditions can pose significant challenges for the automated image matching algorithms used in both triangulation and digital elevation model generation. Where clouds are moving rapidly, shaded areas can vary significantly between images obtained during the same flight, potentially causing the aerial triangulation process to fail for some images, and also resulting in errors in automatically generated digital elevation models. Moreover, patterns of light and shade across images can confuse automated colour balancing algorithms used in the creation of image mosaics. This can result in output mosaics of poor visual quality, which have obvious variations in contrast or colour and that may be excessively dark or light in places.
The best recommendation is to avoid flying under such conditions; however, there may be no other time available to carry out the survey.
Another commonly seen illumination effect is the presence of image hotspots, where a bright spot appears in the image. These are due to the effects of bidirectional reflectance, which is dependent on the relative position of the image sensor and the sun (Hakala et al. 2010; Grenzdörffer and Niemeyer 2011; Laliberte et al. 2011). Hotspots occur at the antisolar point, which is the point where the line defined by the sensor position and the sun intersects with the ground.
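The location of the hotspot can be computed directly from the sun's position. Below is a toy calculation assuming flat terrain at z = 0; the function and variable names are our own.

```python
import math

def antisolar_point(cam_xyz, sun_azimuth_deg, sun_elevation_deg):
    """Ground intersection of the line from the sun through the sensor,
    i.e., where the hotspot appears (sun must be above the horizon)."""
    Xc, Yc, Zc = cam_xyz
    a = math.radians(sun_azimuth_deg)    # azimuth clockwise from north
    e = math.radians(sun_elevation_deg)
    d = Zc / math.tan(e)                 # horizontal offset from the nadir point
    return Xc - d * math.sin(a), Yc - d * math.cos(a)

# Sensor 120 m above ground, sun due south (azimuth 180 deg) at 45 deg elevation:
print(antisolar_point((0.0, 0.0, 120.0), 180, 45))   # hotspot ~120 m north of nadir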
Relief displacement
Because of the low flying heights used, UAS imagery is particularly prone to the effects of relief displacement (Eisenbeiss 2009; Mozas-Calvache et al. 2012; Niethammer et al. 2012). For non-vegetated areas, such displacement is removed during the ortho-rectification process, assuming that the DSM or DTM used correctly represents the terrain. The situation is more complicated when dealing with trees and buildings (e.g., Mozas-Calvache et al. 2012). In such cases, local displacement is often considerable and there can often be hidden areas where no data has been captured. If a DSM is used to orthorectify such images, the result can often be a choppy looking, irregular image, due to the noise present in the surface. Using a DTM will typically result in a smoother looking image; however, locally elevated features will often still be subject to the effects of relief displacement. As such it is often difficult to produce a true orthoimage, which accurately represents all features.

There are a number of work-arounds to this problem. These include obtaining high overlaps and only using the centres of each image, flying higher, and using a longer focal length. All of these options will help to reduce, but not eliminate, the effects of relief displacement. Problems with relief displacement will often surface at the mosaicking stage, where images with different amounts and directions of relief displacement are combined (Fig. 6). This can result in features being displaced in the final image, with linear features in rapidly changing areas, such as in the case of a road surrounded by high trees, showing sudden jumps in horizontal position.
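The standard vertical-photograph relationship makes the size of the effect, and why these work-arounds help, explicit:

$$d = \frac{r\,h}{H},$$

where $d$ is the relief displacement in the image, $r$ the radial distance of the feature from the nadir point in the image, $h$ the feature's height above the datum, and $H$ the flying height above the datum. For example, a 15 m tree imaged 80 mm from the nadir point from a flying height of 120 m is displaced by 80 × 15 / 120 = 10 mm in the image; flying higher increases $H$, and using only the image centres keeps $r$ small, both of which shrink $d$.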
Mosaic artifacts
The production of mosaics can lead to additional problems, beyond those that result from vignetting, relief displacement, misregistration, and ghosting. Image artifacts are often created where the colour-balancing algorithms fail to work properly. These can occur where the contrasts of individual image bands fall outside the range of the image histograms used for image matching.
Another common occurrence is striping on the final mosaic. This will often occur where there is insufficient overlap between flight lines to allow colour matching to be carried out successfully. Geometric artifacts may also occur where poorly orthorectified adjacent images are used as inputs to the final mosaic. This can result in mismatching of features across the mosaic, as well as holes in the final mosaic. In general such features are indicative of problems at the orthorectification stage.

Figure: A mosaic of thermal imagery acquired by a forward-looking infrared camera on board a UAV quadcopter. The imagery was used to enhance the waterline (outlined in red) along the river channel.

6.0 Benefits of high resolution images captured by an eco-drone
Using drones enables researchers to detect changes, such as in surface fissures, and draw conclusions they would have had difficulty reaching from the use of satellite imagery. Figure 4 shows a drone prototype and an image it took with a 12 megapixel (MP) camera. Future prototypes from CielMap will test 16 MP cameras as well as a three-band (RGB) sensor.

Figure 4(a) and (b): A prototype of a CielMap drone and a sample high resolution image taken with the 12 MP camera on board.

6.1 Innovative Field Applications
Eco-drones have the potential to supplement data collection efforts and contribute to ecosystem inventory and accounting. Specific environmental and ecosystem applications suitable for the use of a drone range from precision agriculture, to mapping coastline or soil erosion, to species and habitat monitoring. Drones can be launched into the eye of a hurricane to measure wind speeds at altitudes and in conditions in which a manned aircraft could not operate, and they can fly over the Arctic to observe sea ice conditions and track seal populations (NOAA, 2008). One of the most common civil applications is rangeland management (Rango and Laliberte, 2010). Other environmental applications are described in Table 10.
Change Mapping | Disaster Risk Management | Disaster Risk Mitigation | Illegal Activity | Monitoring
River erosion | Flooding risk | Map impacted areas | Poaching | Migration patterns
Deforestation | Landslide risk | Broadcast messages | Illegal fishing | Endangered species status
Urban expansion and planning | Volcano eruption risk | Monitor forest fire spread | Illegal trade | Agriculture

Table 10. Various environmental applications suitable for the use of a drone
(Rango and Laliberte, 2010).

6.1.1 Agricultural and environmental applications
Remote sensing is a well-established tool for agricultural and environmental analysis. Vegetation and/or biodiversity monitoring has traditionally been performed using aerial and/or satellite imagery, resulting in high expenses when fine resolution is requested. UAS have successfully introduced the smaller, cheaper-to-operate platform paradigm to the remote sensing community. The range of available sensors is widening as a natural attempt to adapt to smaller platforms, in which weight and dimension restrictions hold, as opposed to manned aerial platforms, and also to adapt to user and application needs.
Several researchers and/or companies have balanced the requirements of the payload and aerial platform to enable the operation of small, fast and easily deployable systems and cover small or medium-size areas. For example, Rufino and Moccia (2005) used a radio-controlled fixed-wing model to fly a thermal imager and a hyperspectral sensor in visible-NIR bands targeting forest fire monitoring. Another example is Zarco-Tejada and Berni (2012), in which a miniaturized hyperspectral camera mounted on a fixed-wing auto-piloted platform of 6 kg MTOW is described. In Bendig et al. (2012), a mini-UAS MK-Okto by HiSystems GmbH equipped with either a NEC F30 IS thermal imaging system or a Tetracam Mini MCA-4 is described for successful Normalized Difference Vegetation Index (NDVI) computation. Gini et al. (2012) describes the use of a Pentax Optio A40 for RGB photos and a Sigma DP1 modified to acquire the NIR band, on board a Microdrones md4-200, for tree classification based on different vegetation indices.
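Since several of these studies compute NDVI from multispectral frames, a minimal band-arithmetic sketch (assuming co-registered red and NIR arrays) looks like this:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).
    `nir` and `red` are co-registered reflectance (or DN) arrays."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)  # eps avoids division by zero
```

Healthy vegetation reflects strongly in the NIR and absorbs red light, so NDVI values approach 1 over dense canopy and fall near or below 0 over soil and water.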
Agüera et al. (2011) describes the use of the same platform, equipped with an ADC Lite Tetracam, to compare aerial and ground measurements and vegetation indices. Lucieer et al. (2012) uses an Oktokopter with optical and hyperspectral cameras to analyze the Antarctic moss beds, and Jensen et al. (2012) presents an RGB, NIR and thermal-vision concept to monitor stream temperatures. Costa et al. (2012) presents a combination of UAS and a ground wireless sensor network to proceed with crop fertilizing missions. In it, the UAS route is modified depending on the inputs from the ground network, which can measure the amount of fertilizer applied. In Grenzdörffer and Niemeyer (2011), the use of UAS for Bidirectional Reflectance Distribution Function (BRDF) measurements is proposed as an alternative to costly and cumbersome field goniometer measurement campaigns, in the context of agricultural applications. An example of environmental application is presented in Wich and Koh (2012), in which small fixed-wing UAS carrying photo- or video-cameras are used in missions in Switzerland, the Netherlands, Indonesia, Malaysia and Nepal, to perform detection of several species such as orangutans, elephants or rhinos and provide information on density and circulation of animals. The use of small UAS to detect animals is also the motivation behind (Israel, 2011), in which a thermal camera Tau640 from FLIR is flown on-board a Falcon 8 from Ascending Technologies GmbH to detect small fawns in meadows before mowing machines could harm them. Grenzdörffer (2013) reports on the use of UAS for bird counts.
In the geological field, Eisenbeiss (2009) reports on the use of a rotary-wing Copter 1B equipped with a Nikon D2Xs to perform oblique-view mapping of the mountainous area of Randa, Switzerland, with the mission of analyzing tectonic fractures. In Delacourt et al. (2009), a coastal management application, related to the quantification of morpho-sedimentary changes of the coastal fringe, is carried out with a rotary-wing platform carrying a commercial digital reflex camera to generate a DEM for hydrodynamics numerical modelling. Eck and Imbach (2011) report on the use of a high-resolution 3-axis magnetic sensor, mounted on an autonomous Scout B1-100 helicopter, to generate detailed magnetic maps. Cox et al. (2006) is a report by NASA's Civil UAV Team that compiles NASA-funded science mission experiences since 1995, such as clear-air radiation measurements, cumulus electrification measurements, harvest optimization, coastal mapping, atmospheric chemistry and many others. It also provides an assessment of Earth science, land management and homeland security missions versus their required capabilities (access to regulated airspace, long endurance, quick deployment, etc.).

The use of drones for monitoring destructive activities such as poaching and illegal logging has been notable in Africa, Asia and South America. At the end of 2012, Google awarded a US$5 million grant to the World Wildlife Fund (WWF) to use drones, alongside other technologies, to monitor illicit trade in Africa by tracking poachers and the wildlife they are pursuing. The growing occurrences of deforestation and forest degradation worldwide could be more precisely monitored and measured with the use of drones. Member countries of programmes such as the United Nations collaborative initiative on Reducing Emissions from Deforestation and Forest Degradation in Developing Countries (REDD+) could benefit from the monitoring capabilities of drones to measure and report deforestation, using the data to calculate forest carbon emissions more accurately. The option of a low-cost drone would be extremely beneficial to governments or organisations with small budgets seeking to fulfil REDD+ requirements.

6.2 Early warning applications
Quick and easy deployment and the ability to enter hazardous areas make drones a beneficial tool for collecting real-time data about atmospheric conditions and for mapping disaster impacts as they occur, as well as their aftermath. This information can be incorporated into current and future early warning systems. Drones can provide information to emergency planners by monitoring evacuations, identifying where environmental conditions are worsening (i.e. flood spreading) and contributing to rescue efforts as an emergency response mechanism.
Drones can be used to monitor highways vulnerable to landslides, using high resolution cameras to detect cracks that may indicate the onset of a landslide and sensors to detect changes in stress. Once detected, data collected from the drone can be used by authorities to initiate early warning allowing people currently in the area to escape and those travelling to the area to avoid the disaster event before it occurs.
The use of drones for early warning of forest fires has been tested by several federal agencies in the USA. By collecting data about forest fires, the public can be alerted to impending danger and firefighters can better plan how to attack the fires. While helicopters and manned planes could collect similar information, pilot projects conducted by the United States Forest Service (USFS) have shown that UAS technology has a place in wildland fire-fighting, especially when considering flight costs, contract requirements, regulations and operations (Hinkley and Zajkowski, 2011).
The United States Department of Agriculture (USDA) and NASA have used a drone named Ikhana to gather information that helps fight raging forest fires in California, USA (NASA, 2010). Ikhana has a wingspan of about 20 m, is 11 m in length and can carry more than 180 kg of sensors internally and over 900 kg in pods under its wings. It is designed for long endurance flights at higher altitudes, typically flying at an altitude of 12 km, allowing it to gather a significant amount of data over a long period of time and at an altitude high enough to stay out of the heat of the fires (NASA, 2007). Ikhana is also being used to test new capabilities and advance its technological capabilities to improve the design and function of drones.



6.3 What are the implications for policy?
As UAVs become more prevalent in the public and private sectors for research and non-military surveillance, many policy considerations will need to be addressed. According to a 2012 United States Government Accountability Office (GAO) report, the number of countries with a UAV system for military, commercial, or civil use grew from 41 countries in 2004 to 76 countries by 2011 (GAO, 2012). Tremendous cooperation between nations in regard to airspace jurisdiction will be necessary in the future as eco-drones, and other research drones, become more commonplace. Policy creation and enforcement for the demarcation of eco-drones is necessary to communicate to people on the ground that the drones are safe and used only for research, and to ensure the drones are clearly visible to other air traffic.

Future regulations will need to address weight and size of the drones. Data sharing standards will need to be created if data is collected in airspace not native to the research team. Regulations specific to drones used for environmental modeling or research will need to be developed, implemented and enforced with heavy regard to public safety and privacy (Rango and Laliberte, 2010).

To fly a drone, the public sector must obtain a Certificate of Waiver or Authorisation (COA) and the private sector must obtain an experimental airworthiness certificate in addition to following many regulations (FAA, 2013). For both sectors, drones may not be flown over densely populated areas to ensure public safety.

On a global scale, UAV manufacturing and export as well as licensing will need to be regulated to ensure that purpose remains ethical and legitimate and does not violate public safety and privacy (GAO, 2012). European regulations generally coincide with those of the USA, but requirements among most other countries vary considerably (Watts et al., 2012).










References

Agüera, F., Carvajal, F., Pérez, M., 2011. Measuring sunflower nitrogen status from an unmanned aerial vehicle-based system and an on the ground device. ISPRS – Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. XXXVIII-1/C22, 33–37.

Bendig, J., Bolten, A., Bareth, G., 2012. Introducing a low-cost mini-UAV for thermal and
multispectral-imaging. ISPRS – Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. XXXIX-B1, 345–349.

Blázquez, M., Colomina, I., 2012a. Fast AT: a simple procedure for quasi direct orientation. ISPRS J. Photogramm. Remote Sens. 71, 1–11.

Blázquez, M., Colomina, I., 2012b. Relative INS/GNSS aerial control in integrated sensor orientation: models and performance. ISPRS J. Photogramm. Remote Sens. 67, 120–133.

Costa, F.G., Ueyama, J., Braun, T., Pessin, G., Osorio, F.S., Vargas, P.A., 2012. The use of unmanned aerial vehicles and wireless sensor network in agricultural applications. In: Geoscience and Remote Sensing Symposium (IGARSS), 2012 IEEE International, 22–27 July, pp. 5045–5048. http://dx.doi.org/10.1109/IGARSS.2012.6352477.

Cox, T., Somers, I., Fratello, S., 2006. Earth Observation and Role of UAVs: Capabilities Assessment, Version 1.1. Technical Report. Civil UAV Team, NASA.

CielMap, (2012). Low-Cost Unmanned Aircraft Systems for Development Projects: A quiet revolution. Presentation, Sciences Po Paris.

Cohen, M.J., (2011). Brazilian Eyes in the Sky Focus on the Disappearing Rainforest. Scientific American. (accessed 25.04.2013).

Delacourt, C., Allemand, P., Jaud, M., Grandjean, P., Deschamps, A., Ammann, J., Cuq, V., Suanez, S., 2009. DRELIO: An unmanned helicopter for imaging coastal areas. J. Coast. Res. 2, 1489–1493.

Eck, C., Imbach, B., 2011. Aerial magnetic sensing with an UAV helicopter. ISPRS –
Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. XXXVIII-1/C22, 81–85.

Eisenbeiss, H., 2009. UAV Photogrammetry. Ph.D. Thesis. Institut für Geodesie und
Photogrammetrie, ETH-Zürich. Zürich, Switzerland

Grenzdörffer, G., Niemeyer, F., 2011. UAV-based BRDF-measurements of agricultural surfaces with PFIFFikus. ISPRS – Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. XXXVIII-1/C22, 229–234.

Gini, R., Passoni, D., Pinto, L., Sona, G., 2012. Aerial images from a UAV system: 3D modelling and tree species classification in a park area. ISPRS – Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. XXXIX-B1, 361–366.

GAO, (2012). Agencies Could Improve Information Sharing and End-Use Monitoring on Unmanned Aerial Vehicle Exports. United States Government Accountability Office. Available online: http://www.gao.gov/assets/600/593131.pdf.

Hinkley, E.A. and Zajkowski, T., (2011). USDA forest service – NASA: unmanned aerial systems demonstrations – pushing the leading edge in fire mapping. Geocarto International. 26(2), 103–111.

Hakala, T., Suomalainen, J., and Peltoniemi, J.I. (2010). Acquisition of bidirectional reflectance factor dataset using a micro unmanned aerial vehicle and a consumer camera. Remote Sens. 2(3): 819–832. doi: 10.3390/rs2030819

Hardin, P.J. and Hardin, T.J., (2010). Small-scale remotely piloted vehicles in environmental research. Geography Compass. 4(9), 1297-1311.

Harwin, S., and Lucieer, A. (2012). Assessing the accuracy of georeferenced point clouds produced via multi-view stereopsis from unmanned aerial vehicle (UAV) imagery. Remote Sens. 4(6): 1573–1599. doi: 10.3390/rs4061573

Hugenholtz, C.H., Whitehead, K., Barchyn, T.E., Brown, O.W., Moorman, B.J., LeClair, A., Hamilton, T., and Riddell, K. (2013). Geomorphological mapping with a small unmanned aircraft system (sUAS): feature detection and accuracy assessment of a photogrammetrically-derived digital terrain model. Geomorphology. 194: 16–24. doi: 10.1016/j.geomorph.2013.03.023

Israel, M., 2011. A UAV-based roe deer fawn detection system. ISPRS – Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. XXXVIII-1/C22, 51–55.

Jensen, A.M., Neilson, B.T., McKee, M., Chen, Y., 2012. Thermal remote sensing with an autonomous unmanned aerial remote sensing platform for surface stream temperatures. In: Geoscience and Remote Sensing Symposium (IGARSS), 2012 IEEE International, pp. 5049–5052. http://dx.doi.org/10.1109/IGARSS.2012.6352476.

Lucieer, A., Robinson, S., Turner, D., Harwin, S., Kelcey, J., 2012. Using a micro-UAV for ultra-high resolution multi-sensor observations of antarctic moss beds. ISPRS – Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. XXXIX-B1, 429–433.

Colomina, I., Molina, P., 2014. Unmanned aerial systems for photogrammetry and remote sensing: a review. ISPRS J. Photogramm. Remote Sens. 92, 79–97.

Molina, P., Colomina, I., 2014. Drones to the Rescue! Unmanned aerial search missions based on thermal imaging and reliable navigation. Inside GNSS 7, 36–47.

NASA, (2007). Ikhana Unmanned Science and Research Aircraft System. National Aeronautics and Space Administration. (accessed 19.04.2013).

NASA, (2010). NASA's Ikhana Aircraft Helps Fight 1,000+ California Fires. National Aeronautics and Space Administration. (accessed 18.04.2013).

NASA, (2013). NASA flies Dragon Eye unmanned aircraft into volcanic plume. National Aeronautics and Space Administration. (accessed 19.04.2013).


NOAA, n.d. NOAA Unmanned Aircraft Systems Program. National Oceanic and Atmospheric Administration. (accessed 18.04.2013).

NOAA, (2008). NOAA Invests $3 Million for Unmanned Aircraft System Testing Pilotless Craft Gather Data for Hurricane Forecasts, Climate, West Coast Flood Warnings. National Oceanic and Atmospheric Administration. (accessed 18.04.2013).

NOAA, (2012). NOAA Scientists Part of NASA-Led Mission to Study the Damaging Storms with Unmanned Aircraft, New Instruments. National Oceanic and Atmospheric Administration. (accessed 19.04.2013).

Bláha, M., Eisenbeiss, H., Grimm, D., and Limpach, P. (2011). Direct georeferencing of UAVs. Proceedings of Conference on Unmanned Aerial Vehicle in Geomatics, Zurich, Switzerland, 14–16 September 2011. 38(1/C22): 1–6.

D'Oleire-Oltmanns, S., Marzolff, I., Peter, K.D., and Ries, J.B. (2012). Unmanned aerial vehicle (UAV) for monitoring soil erosion in Morocco. Remote Sens. 4(11): 3390–3416. doi: 10.3390/rs4113390

Duan, S.B., Li, Z.L., Tang, B.H., Wu, H., Ma, L., Zhao, E., and Li, C. (2013). Land surface reflectance retrieval from hyperspectral data collected by an unmanned aerial vehicle over the Baotou test site. PLoS One. 8(6): e66972. PMID: 23785513. doi: 10.1371/journal.pone.0066972

Eisenbeiss, H. (2006). Applications of photogrammetric processing using an autonomous model helicopter. International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences. 36(185): 51–56


Jensen, J.R. (2000). Remote sensing of the environment: an earth resource perspective. Prentice Hall.

Laliberte, A.S., Goforth, M.A., Steele, C.M., and Rango, A. (2011). Multispectral remote sensing from unmanned aircraft: image processing workflows and applications for rangeland environments. Remote Sens. 3(11): 2529–2551. doi: 10.3390/rs3112529

Laliberte, A.S., Herrick, J.E., Rango, A., and Winters, C. (2010). Acquisition, ortho-rectification, and object-based classification of unmanned aerial vehicle (UAV) imagery for rangeland monitoring. Photogramm. Eng. Remote Sens. 76(6): 661–672. doi: 10.14358/PERS.76.6.661

Laliberte, A.S., and Rango, A. (2011). Image processing and classification procedures for analysis of sub-decimeter imagery acquired with an unmanned aircraft over arid rangelands. GIScience & Remote Sensing. 48(1): 4–23. doi: 10.2747/1548-1603.48.1.4

Madsen, S.N., Hensley, S., Wheeler, K., Sadowy, G.A., Miller, T., Muellerschoen, R., Lou, Y., and Rosen, P.A. (2005). UAV-based L-band SAR with precision flight path control. Fourth International Asia-Pacific Environmental Remote Sensing Symposium 2004: Remote Sensing of the Atmosphere, Ocean, Environment, and Space. International Society for Optics and Photonics. doi: 10.1117/12.578373

Niethammer, U., James, M.R., Rothmund, S., Travelletti, J., and Joswig, M. (2012). UAV-based remote sensing of the Super-Sauze landslide: evaluation and results. Eng. Geol. 128(0): 2–11. doi: 10.1016/j.enggeo.2011.03.012

Rango, A. and Laliberte, A., (2010). Impact of flight regulations on effective use of unmanned aircraft systems for natural resources applications. Journal of Applied Remote Sensing. 4(1).

Rufino, G., Moccia, A., 2005. Integrated VIS-NIR Hyperspectral/thermal-IR Electro-optical Payload System for a Mini-UAV. American Institute of Aeronautics and Astronautics, Arlington, VA, USA, pp. 647–664.

Watts, A.C., Ambrosia, V.G., Hinkley, E.A., (2012). Unmanned Aircraft Systems in Remote Sensing and Scientific Research: Classification and Considerations of Use. Remote Sensing. 4(6), 1671 – 1692.

Wich, S., Koh, L., 2012. Conservation drones: the use of unmanned aerial vehicles by ecologists. GIM Int. 26, 29–33.

Yun, M., Kimb, J., Seo, D., Lee, J., Choi, C., 2012. Application possibility of smartphone as payload for photogrammetric UAV system. ISPRS – Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. XXXIX-B4, 349–352

Zarco-Tejada, P., Berni, J., 2012. Vegetation monitoring using a micro-hyperspectral imaging sensor onboard an unmanned aerial vehicle (UAV). In: Proceedings of the EuroCOW 2012, European Spatial Data Research (EuroSDR), Castelldefels, Spain.
