Here are some quick results from the drone camera tests we did two days ago. We flew four different cameras under similar conditions. The data set should ideally be used by a student for a project and a deeper evaluation of the cameras. We know that many in the forest industry are wondering about the difference between the DJI Phantom 4 Pro camera, which has a global shutter and a larger sensor, and the DJI Mavic Pro, which is smaller, cheaper and has a camera that is poorer for photogrammetry.
The drones (and cameras) were flown in the evening with a low sun behind thin clouds, generating no shadows but also not bright light. This lighting condition should push the cameras a bit and at the same time give fewer problems with bright sunlit tree crowns against a dark shadowed ground.
An orthophoto was made in the Pix4D cloud service using default settings. Here are the results:
The Phantom 4 Pro camera (left) has a higher dynamic range and is also less overexposed on the road compared with the Mavic Pro (right image).
For the final test, the Sony a5100 and the Parrot Sequoia need to be added, and the images need to be compared in more depth, as does their performance when generating point clouds.
We decided to take our drones to the forest and test some cameras today. We flew two 3DR Solos, one with a Parrot Sequoia and one with a new Escadrone SoloMapper (Sony QX1 camera), and two DJI drones, one Phantom 4 Pro and one Mavic Pro. This was done because we had a great opportunity to acquire data from four forest patches that had just been scanned with a good terrestrial laser scanner. The scans were multi-scans with 16 scans in an 8 x 2 grid. The point cloud from the laser scanner can be used for evaluating the point clouds generated from the drone cameras.
We tried to fly slowly (5 m/s) and with lots of overlap (90/90). Sadly we ran into trouble with the new SoloMapper: it did not acquire more than 15 images before the system stopped. This was the first time we had tested the camera system on the drone, so it was probably an installation error on our part. We switched cameras to a Sony a5100 that we have used before; it doesn't have a gimbal, but we still got good quality images.
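For readers planning similar flights, the relation between forward overlap and photo spacing can be sketched as follows. This is a minimal illustration; the altitude, focal length and sensor size below are assumed example values (roughly matching a 1-inch sensor like the Phantom 4 Pro's), not the actual parameters used in this test.

```python
def trigger_interval(altitude_m, focal_mm, sensor_h_mm, overlap, speed_ms):
    """Distance (m) and time (s) between photos for a given forward overlap.

    Assumes a nadir-pointing camera and flat ground; a simple pinhole
    footprint model is used.
    """
    footprint = altitude_m * sensor_h_mm / focal_mm  # along-track ground footprint (m)
    spacing = footprint * (1.0 - overlap)            # distance between exposures (m)
    return spacing, spacing / speed_ms               # metres, seconds

# Example: 80 m altitude, 8.8 mm focal length, 8.8 mm sensor height,
# 90 % forward overlap, 5 m/s ground speed.
spacing, interval = trigger_interval(80, 8.8, 8.8, 0.9, 5)
print(spacing, interval)  # 8.0 m between photos, one photo every 1.6 s
```

With 90 % overlap the camera only advances 10 % of its footprint between shots, which is why a slow flight speed is needed to keep the trigger interval within what the camera can sustain.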
This data set, with four different cameras, would be perfect for some student to use for a project or thesis.
The Ljungbergslab has received 3 million Swedish crowns from Ljungbergsfonden for the coming three years (2017-2019). The goal of the project is to become a world-leading education lab for 3D remote sensing.
The current phase of the lab ended in November 2016, and the new project will take the lab to a broader community. Our goal is to make the lab even more visible on the Internet, for example through YouTube tutorials, short online courses, example data sets and hands-on examples of remote sensing cases. We will also develop tools for Virtual Reality to import and explore 3D data. The goal is to give a deeper understanding of the data you're working with.
-We don't see "world leading" as a competition, says Mattias Nyström, researcher at the Swedish University of Agricultural Sciences. There is often a gap between research and education, and between engineers and foresters. Our lab will narrow this gap and provide the latest sensors, software and tools to make this possible. We don't want to compete with other labs, we want to collaborate and share knowledge. Everything we do in the lab will be made available on the Internet and in English.
-To become a world leading educational lab is not about competition, it is about who can share and spread knowledge and provide the latest tools and sensors, says Jonas Bohlin, teacher at the Swedish University of Agricultural Sciences.
This week's seminar is about centimeter positioning inside forests using a GNSS (Global Navigation Satellite System) receiver. Thomas Hörnlund from Svartbergets Field Research Station will share his experiences from measurements with the Trimble GeoXR 6000 GNSS receiver.
This is the last seminar for this year and we will be back with more seminars in January 2017. If you have suggestions for topics, please contact Mattias Nyström.
Next week's seminar (24 November) will be given by Kenneth Olofsson. He has developed methods to automatically detect tree locations and shapes from stationary terrestrial laser scanning data. The title of the seminar is: "Terrestrial Laser Scanning: Can we see the wood for the trees?"