This week's seminar is about centimeter-level positioning inside forests using a GNSS (Global Navigation Satellite System) receiver. Thomas Hörnlund from Svartbergets Field Research Station will share his experiences from measurements with the Trimble GeoXR 6000 GNSS receiver.
This is the last seminar for this year and we will be back with more seminars in January 2017. If you have suggestions for topics, please contact Mattias Nyström.
Next week's seminar (24 November) will be given by Kenneth Olofsson. He has developed methods to automatically detect tree locations and shapes from stationary terrestrial laser scanning data. The title of the seminar is: “Terrestrial Laser Scanning: Can we see the wood for the trees?”
Here is a short video from this week's seminar where Henrik tries out the HTC Vive (virtual reality). We have converted a point cloud created from drone images into the virtual reality environment. On the screen in the background you can see what Henrik sees in the headset. With the headset on your head, you see it all in 3D as well.
Next week's seminar will be given by students from Umeå University who are developing a platform for a mobile laser scanning system. They will give an overview of their work so far. More information about the seminars.
Swedish Radio interviewed Mattias Nyström about the recent court ruling that a camera on a drone counts as a surveillance camera. This means that you need a permit from the county administrative board to use a camera on a drone. Since this permit is normally only granted for crime prevention, it is uncertain whether we can get one. Hopefully the law will be updated, but this can take some time. Luckily, we have about 600 GB of photos collected this summer and autumn that we can use in the meantime.
The seminar on the 3rd of November (14:30-15:00) will be a demonstration of the virtual reality equipment in the lab. We will also demonstrate portable systems. Come and try out a birch forest scanned with our high-resolution terrestrial laser scanner and a colorized point cloud generated from drone images.
Flakaliden Research Park, where effects of climate and fertilization are studied, was surveyed by a drone taking about 700 pictures. The drone was equipped with a consumer camera (Sony a5100), and the images were processed into a very dense point cloud and a very high-resolution orthophoto (GSD 2.3 cm). A fly-through of the point cloud is presented in the video. More information about the research at Flakaliden.
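Ground sampling distance follows directly from the flight and camera geometry: GSD = altitude × pixel pitch / focal length. A minimal sketch of the calculation, using the Sony a5100's published sensor width (23.5 mm, 6000 px across); the 20 mm focal length and 120 m altitude are assumptions for illustration, not the actual flight parameters:

```python
def ground_sampling_distance(altitude_m, focal_length_mm,
                             sensor_width_mm, image_width_px):
    """Ground sampling distance in meters per pixel:
    altitude * (pixel pitch / focal length)."""
    pixel_pitch_mm = sensor_width_mm / image_width_px
    return altitude_m * pixel_pitch_mm / focal_length_mm

# Sony a5100 sensor geometry; lens and altitude are assumed values.
gsd = ground_sampling_distance(altitude_m=120, focal_length_mm=20,
                               sensor_width_mm=23.5, image_width_px=6000)
print(f"{gsd * 100:.2f} cm")  # in the same ballpark as the 2.3 cm mosaic
```

Halving the altitude (or doubling the focal length) halves the GSD, which is why low-altitude drone surveys reach centimeter resolution with consumer cameras.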
The HTC Vive Virtual Reality equipment is up and running in the Ljungberg Laboratory. Oscar was the first student to walk around in a terrestrial laser scanning point cloud. Thanks to Mikael Hertz for generating the VR content from the point cloud.
Two weeks ago, we did several flights with the lab's new Parrot Sequoia multispectral camera. We have summarized how the camera can be mounted on a 3DR Solo drone. We also made a first quick 3D point cloud from the NIR photos:
The drone-optimized multispectral camera, Parrot Sequoia, arrived at the lab today! The camera collects images in four defined wavelength bands and also has a normal RGB sensor. More information about the camera.
In contrast to many other users, we plan to use the camera to capture data of forests. As soon as we have some footage to show, we will publish it here on rslab.se.
The camera captures images (1.2 Mpx) in the following wavelength bands:
Green 550 nm (40 nm bandwidth)
Red 660 nm (40 nm bandwidth)
Red edge (10 nm bandwidth)
Near infrared (40 nm bandwidth)
In addition to the four narrow wavelength bands, the Sequoia has a RGB camera with 16 Mpx resolution.
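This band set is a typical input for vegetation indices such as NDVI, which contrasts the near-infrared band (strongly reflected by healthy vegetation) with the red band (strongly absorbed by chlorophyll). A minimal sketch; the reflectance values are made up for illustration, not Sequoia measurements:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index, in [-1, 1].
    Higher values indicate denser, healthier vegetation."""
    return (nir - red) / (nir + red)

# Hypothetical reflectances: healthy foliage reflects NIR and absorbs red.
print(round(ndvi(nir=0.50, red=0.08), 2))  # prints 0.72
# Bare soil reflects the two bands more evenly, giving a low NDVI.
print(round(ndvi(nir=0.30, red=0.25), 2))  # prints 0.09
```

The same ratio can be computed per pixel over whole images once the Sequoia bands are co-registered and radiometrically calibrated.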
The scientists at the Ljungberg Lab at the Swedish University of Agricultural Sciences in Umeå went on a two-day field excursion to test thermal cameras and new drones. We joined the students of the Fire Management course at the Faculty of Forest Sciences, who were going to carry out a prescribed burn of a 20 ha clear-cut. We wanted to test our new Solo quadcopter drone from 3D Robotics and to capture thermal video and images with our FLIR Vue camera. We also wanted to test the thermal camera in our fixed-wing Smartplane drone.
We started by capturing RGB images from 200 meters above ground with the Smartplane before the fire was started. From this imagery we created a 7 cm ground sampling distance (GSD) orthorectified image mosaic, which could be used to describe the pre-fire state of the area.
During the controlled burn, we flew the thermal camera multiple times to acquire aerial thermal and visual (RGB) images, both to describe the burn process and to test how useful a thermal camera is for finding hotspots or ground fire hours after the fire front has passed an area.
This will be evaluated later, when all data sets have been processed into orthorectified imagery and 3D point clouds.