Integration of a broad range of sensors is a key enabler for many functions of the SHERPA system.
First of all, each individual platform will have to navigate at least partly autonomously in difficult environments where GPS reception may be blocked by trees, or where visible-light cameras are of limited use due to extremely homogeneous surfaces (snow), heavy precipitation, or darkness. These deficits of commonly used sensors pose challenges to the sensory system and the related algorithms, e.g. concerning the construction of a dynamic shared map and victim localization.
We thus suggest relying on a palette of complementary sensors, which the platforms are equipped with, along with dedicated fusion algorithms to be developed in the project.
Inside the consortium, there is experience in processing images for keypoint-based localization and mapping in unstructured environments, even in the presence of rather homogeneous surfaces. Thus, at least monocular cameras are foreseen on all platforms.
A stereo camera attached to the end-effector of the “Donkey” will enable precise visual servoing. The FW-UAV will exploit its inherently large baseline by carrying cameras on the wingtips (3 m baseline). Furthermore, we plan to equip the FW-UAV and possibly other platforms with thermal cameras, enabling vision-based navigation even in the dark and likewise providing a powerful tool for locating people.
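To illustrate why the wide wingtip baseline matters, the following sketch evaluates the standard stereo triangulation relation z = f·b/d and its first-order depth uncertainty. The focal length and disparity noise values are hypothetical, chosen only for illustration; they are not SHERPA specifications.

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulated depth of a rectified stereo pair: z = f * b / d."""
    return focal_px * baseline_m / disparity_px

def depth_error(focal_px: float, baseline_m: float,
                depth_m: float, disparity_noise_px: float) -> float:
    """First-order depth uncertainty: dz ~= z**2 / (f * b) * dd."""
    return depth_m ** 2 / (focal_px * baseline_m) * disparity_noise_px

# Hypothetical comparison at 60 m depth, 1000 px focal length,
# 0.5 px disparity noise: 3 m wingtip baseline vs. a narrow 0.12 m rig.
err_wide = depth_error(1000.0, 3.0, 60.0, 0.5)     # ~0.6 m
err_narrow = depth_error(1000.0, 0.12, 60.0, 0.5)  # ~15 m
```

Under these assumptions, the wide baseline reduces depth uncertainty at 60 m by a factor of 25, which is the motivation for mounting the cameras at the wingtips.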
The SICK LMS511 LIDAR (LIght Detection And Ranging) sensor installed on the Rmax helicopter (RW-UAV) is equipped with a rotating mirror that deflects the emitted laser beam. With this mechanism, the LIDAR allows time-of-flight measurements not just of a single point, but of a 2D slice of the environment.
In bad weather conditions (rain or snow), a laser pulse can be reflected by a raindrop or a snowflake, preventing measurement of the object of interest.
To improve LIDAR performance in bad weather conditions, multi-echo technology has been introduced. When a laser pulse is emitted, its energy propagates through the environment in a cone shape, which is why typically not all of it is reflected by a single raindrop or snowflake.
In a multi-echo LIDAR (here: up to five echoes), several echoes of a single emitted pulse can be measured, increasing the probability of hitting the desired target (usually the last echo). The LIDAR offers a 190° scanning angle with a resolution down to 0.166° at a range of 0.8 to 80 m. The achieved accuracies are on the order of 5 cm.
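The last-echo selection described above can be sketched as follows. The function and data layout are illustrative only, not the SICK driver API; the range limits are those quoted for the LMS511.

```python
def select_last_echo(echo_ranges_m):
    """Pick the farthest valid echo of one pulse.

    Earlier echoes are typically clutter from rain or snow close to the
    sensor; the last echo usually corresponds to the solid target behind it.
    Returns None if no echo falls inside the sensor's 0.8-80 m range.
    """
    valid = [r for r in echo_ranges_m if 0.8 <= r <= 80.0]
    return max(valid) if valid else None

# One pulse with two precipitation returns and one terrain return:
print(select_last_echo([2.3, 6.1, 41.7]))  # -> 41.7
```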
At a weight of 3.7 kg, this sensor is unfortunately not suitable for integration on any of the other SHERPA UAVs.
Lighter laser scanners may be explored as 3D sensors in order to trade off accuracy, range, weight, and power consumption. The Hokuyo UTM-30LX, to name one example, would be suitable for integration even on board the SHERPA UAVs, at a mass of 210 grams and a power consumption below 10 watts.
These sensors provide active sensing at ranges of 30 to 50 meters even in bright sunlight, which can dramatically increase localization and mapping robustness close to the terrain, at the expense of reduced endurance. Furthermore, the inclusion of such devices goes hand in hand with the multi-sensor tight-fusion approach for localization and mapping outlined below.
Accelerometer and gyroscope MEMS sensors are powerful tools for motion tracking due to their short-term accuracy. When fused appropriately, they furthermore render the roll and pitch angles globally observable.
Beyond localization, the use of these sensors therefore increases map accuracy and consistency.
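The roll/pitch observability argument can be made concrete with a basic complementary filter, shown here for the roll axis: the gyroscope provides accurate short-term angular rates, while the accelerometer's gravity reference removes the long-term drift. The gain and signal names are illustrative, not the estimator planned for the project.

```python
import math

def complementary_roll(prev_roll, gyro_x, acc_y, acc_z, dt, alpha=0.98):
    """One complementary-filter step for the roll angle (radians).

    gyro_roll: integrate the body-rate measurement, accurate over short
               horizons but subject to drift.
    acc_roll:  recover roll from the gravity direction, drift-free but
               noisy and corrupted by linear accelerations.
    The blend keeps the best of both.
    """
    gyro_roll = prev_roll + gyro_x * dt
    acc_roll = math.atan2(acc_y, acc_z)
    return alpha * gyro_roll + (1.0 - alpha) * acc_roll
```

With a stationary vehicle (zero gyro rate) the estimate converges geometrically to the accelerometer-derived angle, which is the sense in which roll becomes globally observable.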
We foresee a GPS receiver for all platforms. Additionally, the UAVs will carry MEMS altimeters and compasses – again not only tackling the localization problem, but also ameliorating map quality.
The FW-UAV is furthermore equipped with an airspeed sensor, allowing precise estimation of the wind vector, which is crucial for UAV planning at every level. In fact, we regard a pure inertial/GPS/compass/pressure system combined with digital elevation maps as a back-up UAV localization solution for cases where the vision systems fail.
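The wind estimate the airspeed sensor enables follows the classic wind-triangle relation: the wind vector is the GPS ground velocity minus the air-relative velocity. A 2D sketch with illustrative values (not the FW-UAV estimator, which would additionally filter over time):

```python
import math

def estimate_wind(ground_vel_ne, airspeed_mps, heading_rad):
    """Wind vector (north, east) in m/s from the wind triangle:
    wind = ground velocity (GPS) - air velocity (airspeed + heading)."""
    air_vel_ne = (airspeed_mps * math.cos(heading_rad),
                  airspeed_mps * math.sin(heading_rad))
    return (ground_vel_ne[0] - air_vel_ne[0],
            ground_vel_ne[1] - air_vel_ne[1])

# Flying due north at 15 m/s airspeed while GPS shows (14, 4) m/s over
# ground: a 1 m/s headwind plus a 4 m/s crosswind from the west.
wind = estimate_wind((14.0, 4.0), 15.0, 0.0)  # -> (-1.0, 4.0)
```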
To maximize detection rates, we will complement visual classifiers (on visible-light and thermal images) with state-of-the-art localization techniques based on avalanche beacons and mobile phone tracking.