From the 1st of July there will be new regulations for operating drones in the airspace. The regulations will be the same within the EU. In order to inform the drone users within the university, and to discuss how we should implement the new regulations in the organisation, we held an internal seminar. However, the information could be of interest to others, so we have published a video of the seminar here.
A PDF of the presentation can be downloaded here: 200529 New drone regulations seminar
As part of a Design-Build-Test course, five engineering students are building an albedometer sensor system for the new drone in the Lab. The albedometer will consist of two MS-80U pyranometers from EKO Instruments and a lidar unit (LIDAR-Lite v3) from Garmin.
Albedometers are mostly used to measure the reflectance of the Earth's surface, using two pyranometers: one facing up towards the sky and one facing down towards the surface. From the ratio of incoming and reflected radiation the albedo can be calculated. Albedo is often used in climate models but is usually measured using a fixed sensor, or a mobile sensor attached to a pole carried over the surface of interest. With a drone-borne albedometer we can collect data over forests and other vegetation types without walking through them, and also cover larger areas.
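The ratio described above is simple enough to sketch in a few lines of code. This is a minimal illustration, not the students' actual software; the function name and example irradiance values are assumptions for demonstration.

```python
# Minimal sketch of the albedo calculation from a two-pyranometer
# albedometer. Names and example values are illustrative only.

def albedo(incoming_wm2: float, reflected_wm2: float) -> float:
    """Albedo = reflected / incoming shortwave irradiance (W/m^2)."""
    if incoming_wm2 <= 0:
        raise ValueError("incoming irradiance must be positive")
    return reflected_wm2 / incoming_wm2

# Hypothetical readings: 800 W/m^2 down-welling from the sky sensor,
# 120 W/m^2 measured by the downward-facing sensor over a forest canopy.
print(albedo(800.0, 120.0))  # 0.15
```

In practice each pyranometer reading would be taken from the sensor's calibrated output, and the lidar unit would supply the height above the canopy for each sample.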
The project will be finished in mid January 2020.
Victor Kingstad and Mårten Tovedal presented their master thesis projects at the RIU conference in Skinnskatteberg, 13-14 November. The conference is an annual event gathering people from forest planning and forest inventory. Both students are doing drone-based forest inventory. Victor's topic is assessing cleaning need using rapid orthophotos. Mårten is looking at methods for applying models from national ALS (airborne laser scanning) forest mapping to point clouds generated from drone imagery, with the aim of omitting new field inventory.
Now we continue with the second half of the course. The first part gave an overview of drone systems and regulations as well as drone flying and flight planning. Now we continue with processing the collected images into geographic data, i.e. ortho images and 3D point clouds, and then look at how those products can be transformed into forest data and at their current use in forestry.
Wednesday 19th of September, 13:00–16:00. Stereo-photogrammetry, production of ortho image mosaics and 3D point clouds.
Wednesday 26th of September, 13:00–16:00. Estimation of forest variables and use of drones in forestry.
Applications to the course should be sent to email@example.com no later than September 18th. The number of students on the course is limited, and students who have taken the first part have priority.
The course aims to provide students with knowledge of how to conduct drone-based image acquisition and forest inventory. An introduction will be given on drone systems and regulations, as well as practical knowledge on how to operate drones to acquire images for mapping. The first part will focus on acquiring the images. The second part of the course, in the autumn, will cover data processing to produce ortho-mosaics and 3D models and to extract forest information.
Wednesday 23rd of May, 13:00–16:00. Drone and camera systems and regulations. In the Ljungberg lab.
Wednesday 30th of May, 12:30–17:00. Field trip to fly drones and acquire images.
Applications to the course should be sent to firstname.lastname@example.org no later than May 21st. The number of students on the course is limited.
Carl Jansson presents his master thesis titled "Identifiering av fullskiktade bestånd med stereomatchande flygbilder och laserskanning" (Identification of fully layered stands using stereo-matched aerial images and laser scanning). Carl has investigated whether fully layered stands managed with selection felling can be distinguished from single-layered stands managed with rotation forestry, using remote sensing and methods that can be automated and applied over large areas.
The presentation will be held in Årsringen on floor 1, 14:00–14:30. Mats Nilsson is the examiner and Eva Lindberg is the supervisor.
Lisa Wennerlund will present her master thesis:
“Evaluating the need of cleaning using 3D point clouds derived from high resolution images collected with a drone”
On Thursday the 8th of March at 15:00 in the Ljungberg lab.
Supervisor is Jonas Bohlin and co-supervisor is Jonas Jonzén. Håkan Olson is the examiner.
Today, Magnus Persson will present his master thesis titled “Tree species classification using multi-temporal Sentinel-2 data”.
This year we have had four students doing master thesis work in the Ljungberglab, so more interesting presentations will follow.
Here are some quick results from the drone camera tests we did two days ago. We flew four different cameras under similar conditions. The data set should ideally be used by a student for a project and a deeper evaluation of the cameras. But we know that many in the forest industry are wondering about the difference between the DJI Phantom 4 Pro camera, which has a global shutter and a larger sensor, and the DJI Mavic Pro, which is smaller and cheaper and has a camera that is poorer for photogrammetry.
The drones (and cameras) were flown in the evening with a low sun behind thin clouds, generating no shadows but also not bright light. This lighting condition should push the cameras a bit and at the same time give fewer problems with bright, sunlit tree crowns against a dark, shadowed ground.
An orthophoto was made in the Pix4D cloud service using default settings. Here are the results:
The Phantom 4 Pro camera (on the left) has a higher dynamic range and is also less overexposed on the road compared with the Mavic Pro (right image).
For the final test, the Sony a5100 and the Parrot Sequoia need to be added, and the images need to be compared in more depth, as does their performance when generating point clouds.
We decided to take our drones to the forest and test some cameras today. We flew two 3DR Solos, one with a Parrot Sequoia and one with a new Escadrone SoloMapper (Sony QX1 camera), and two DJI drones, one Phantom 4 Pro and one Mavic Pro. This was done because we had a great opportunity to acquire data from four forest patches that had just been scanned with a good terrestrial laser scanner. The scans were multi-scans with 16 scan positions in an 8 × 2 grid. The point cloud from the laser scanner can be used for evaluating the point clouds generated from the drone cameras.
We tried to fly slowly (5 m/s) and with lots of overlap (90/90). Sadly, we ran into trouble with the new SoloMapper: it did not acquire more than 15 images before the system stopped. This was the first time we tested the camera system on the drone, so it was probably an installation error on our part. We switched cameras to a Sony a5100 that we have used before; it does not have a gimbal, but we still got good-quality images.
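The flight settings mentioned above (5 m/s, 90/90 overlap) determine how often the camera must trigger. The sketch below shows the standard geometry; the altitude, sensor size and focal length are illustrative assumptions, not the actual settings from this flight.

```python
# Sketch: photo trigger spacing for a given forward overlap.
# Altitude, sensor height and focal length below are assumed example
# values, not the settings used on the flights described in the post.

def ground_footprint(altitude_m: float, sensor_mm: float,
                     focal_mm: float) -> float:
    """Ground coverage (m) of one image dimension at nadir."""
    return altitude_m * sensor_mm / focal_mm

def trigger_spacing(footprint_m: float, overlap: float) -> float:
    """Distance between exposures needed for the desired forward overlap."""
    return footprint_m * (1.0 - overlap)

altitude = 80.0    # m above ground, assumed
sensor_h = 8.8     # mm, roughly a 1-inch sensor's height, assumed
focal = 8.8        # mm focal length, assumed

fp = ground_footprint(altitude, sensor_h, focal)  # along-track footprint
spacing = trigger_spacing(fp, 0.90)               # 90 % forward overlap
interval_s = spacing / 5.0                        # at 5 m/s ground speed
print(fp, spacing, interval_s)
```

With these assumed numbers the camera has to fire every few seconds, which is one reason slow flight speeds are used when high overlap is wanted.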
This data set, with four different cameras, would be perfect for some student to use for a project or thesis.