The HTC Vive virtual reality equipment is up and running in the Ljungberg laboratory. Oscar was the first student to walk around inside a terrestrial laser scanning point cloud. Thanks to Mikael Hertz for generating the VR content from the point cloud.
Two weeks ago, we did several flights with the lab’s new Parrot Sequoia multispectral camera. We have summarized how the camera can be mounted on a 3DR Solo drone. We also generated a first quick 3D point cloud from the NIR photos:
The orthophoto in the NIR band looks like this:
The drone-optimized multispectral camera, the Parrot Sequoia, arrived at the lab today! The camera collects images in four defined wavelength bands and also carries a normal RGB sensor. More information about the camera.
In contrast to many other users, we plan to use the camera to capture data over forest. As soon as we have some footage to show, we will publish it here on rslab.se.
The camera captures images (1.2 Mpx) in the following wavelength bands:
- Green 550 nm (40 nm bandwidth)
- Red 660 nm (40 nm bandwidth)
- Red edge 735 nm (10 nm bandwidth)
- Near infrared 790 nm (40 nm bandwidth)
In addition to the four narrow wavelength bands, the Sequoia has an RGB camera with 16 Mpx resolution.
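The red and near-infrared bands are the inputs to common vegetation indices such as NDVI, which is one reason a camera like this is interesting over forest. A minimal sketch of the NDVI formula (the reflectance values below are hypothetical, not Sequoia output):

```python
import numpy as np

# Hypothetical reflectance values for two pixels; real Sequoia bands would
# be read from the camera's image files after radiometric calibration.
red = np.array([0.08, 0.30])   # red band, 660 nm
nir = np.array([0.45, 0.35])   # near-infrared band

# NDVI = (NIR - Red) / (NIR + Red); dense green vegetation approaches 1
ndvi = (nir - red) / (nir + red)
print(ndvi)  # vegetated pixel ~0.70, sparse/bare pixel ~0.08
```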
The scientists at the Ljungberg Lab at the Swedish University of Agricultural Sciences in Umeå went on a two-day field excursion to test thermal cameras and new drones. We joined the students of the Fire Management course at the Forest Faculty, who were going to carry out a prescribed burning of a 20 ha clear-cut. We wanted to test our new Solo helicopter drone from 3D Robotics and to capture thermal video and images with our FLIR Vue camera. We also wanted to test the thermal camera in our fixed-wing Smartplane drone.
We started by capturing RGB images from 200 meters above ground with the Smartplane before the fire was started. From this imagery we created an orthorectified image mosaic with 7 cm ground sampling distance (GSD), which could be used for describing the pre-fire state of the area.
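The GSD follows directly from the flying height via the standard photogrammetric relation GSD = height × pixel pitch / focal length. A small sketch of that arithmetic; the pixel pitch and focal length below are hypothetical placeholders, not the Smartplane camera's actual specification:

```python
def ground_sampling_distance(height_m, pixel_pitch_um, focal_length_mm):
    """Ground sampling distance in metres per pixel on flat ground."""
    return height_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

# Hypothetical sensor: 5.5 um pixels behind a 16 mm lens, flown at 200 m,
# which lands close to the ~7 cm GSD mentioned above.
gsd = ground_sampling_distance(200, 5.5, 16)
print(gsd)  # metres per pixel, roughly 0.07
```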
The video streams were later synchronized and fused to a side-by-side video using a software made by engineering students (an earlier project at the Ljungberg Lab).
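Once the streams are synchronized, the side-by-side step itself is conceptually simple: each thermal frame is upsampled to the RGB frame size and the two frames are concatenated horizontally. A toy sketch with synthetic frames (the students' actual software is not shown here, and frame sizes are made up):

```python
import numpy as np

def side_by_side(rgb, thermal):
    """Upsample the low-resolution thermal frame to the RGB frame size
    with nearest-neighbour sampling, then concatenate horizontally."""
    h, w = rgb.shape[:2]
    th, tw = thermal.shape[:2]
    rows = np.arange(h) * th // h   # nearest-neighbour row indices
    cols = np.arange(w) * tw // w   # nearest-neighbour column indices
    upsampled = thermal[rows][:, cols]
    return np.hstack([rgb, upsampled])

# Synthetic stand-ins: a 480x640 RGB frame and a 120x160 thermal frame
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
thermal = np.full((120, 160, 3), 255, dtype=np.uint8)
fused = side_by_side(rgb, thermal)
print(fused.shape)  # (480, 1280, 3)
```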
Forest fire, fusion of thermal and RGB camera
Forest fire, thermal and RGB camera side-by-side
During the controlled burning we flew the thermal camera multiple times, with the purpose of acquiring aerial thermal and visual (RGB) images to describe the burn process and to test how useful a thermal camera is for finding hotspots or ground fire hours after the fire front has passed an area.
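Hotspot detection from a temperature-calibrated thermal raster can be as simple as thresholding. A toy example, assuming calibrated per-pixel temperatures (the raster and threshold below are invented, not FLIR Vue data):

```python
import numpy as np

# Hypothetical 5x5 temperature raster in degrees Celsius
temps = np.array([
    [18, 19, 20, 21, 19],
    [19, 85, 90, 20, 18],
    [18, 88, 95, 21, 19],
    [19, 20, 21, 20, 18],
    [18, 19, 18, 19, 20],
], dtype=float)

HOTSPOT_THRESHOLD = 60.0  # arbitrary cut-off for smouldering ground fire
hotspots = np.argwhere(temps > HOTSPOT_THRESHOLD)  # (row, col) pairs
print(len(hotspots), "hotspot pixels")  # 4 hotspot pixels
```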
This will be evaluated later, when all data sets have been processed into orthorectified imagery and 3D point clouds.
Jonas Bohlin and Mattias Nyström
Mattias visited the UASForum Conference in Linköping, Sweden, on Tuesday and Wednesday (24-25 May). Mattias presented how drones can be used in forestry as well as how SLU in Umeå has implemented drones in its education.
Mattias’s presentation can be found here.
More information about the conference: http://www.uasforumsweden.se
Tuesday April 19 at 13.00 in the room “Älven”, Daniel Bertilsson will present his MSc thesis work “Estimations of forest density using TanDEM-X”.
Examiner: Mats Nilsson
Main advisor: Heather Reese
Co-advisor: Henrik Persson
On Thursday the 14th of April, at 10 am, in Viken, Edward and André will present their master theses.
The title of Edward’s work: “Analysis of seasonal variations for estimation of forest variables with InSAR technology”
The title of André’s work: “Forecasting of ALS data using TanDEM-X”
Supervisor is Dr. Henrik Persson and examiner is Prof. Håkan Olsson.
Tomorrow, Friday, Adrian Straker will present his master thesis “Comparison of forest fire severity classification models based on aerial images and Landsat 8 OLI/TIRS images of a forest fire area in central Sweden”.
Room: Årsringen, next to the Ljungberg lab
Time: Friday 8th April, 9.00 am
Next week, Edward and André will present their theses.
At the Ljungberg laboratory, we have started a project where we combine the traditional dowsing rod with the latest drone technology. The goal is to improve the existing and important depth-to-water map available online. At the Ljungberg lab, a teaching lab for advanced 3D remote sensing, we always aim to make a wide range of sensors available for our students, and this project is a bold step in that direction.