The Ljungbergslab received 3 million Swedish crowns for the coming three years (2017-2019) from Ljungbergsfonden. The goal of the project is to become a world-leading education lab for 3D remote sensing.
The current phase of the lab ended in November 2016 and the new project will take the lab to a broader community. Our goal is to make the lab even more visible on the Internet, for example through YouTube tutorials, short online courses, example data sets and hands-on remote sensing cases. We will also develop Virtual Reality tools to import and explore 3D data. The goal is to give a deeper understanding of the data you’re working with.
-We don’t see “world leading” as a competition, says Mattias Nyström, researcher at the Swedish University of Agricultural Sciences. There is often a gap between research and education, and between engineers and foresters. Our lab will narrow this gap and provide the latest sensors, software and tools to make this possible. We don’t want to compete with other labs, we want to collaborate and share knowledge. Everything we do in the lab will be made available on the Internet and in English.
-To become a world-leading educational lab is not about competition; it is about who can share and spread knowledge and provide the latest tools and sensors, says Jonas Bohlin, teacher at the Swedish University of Agricultural Sciences.
This week’s seminar is about centimeter positioning inside forests using a GNSS (Global Navigation Satellite System) receiver. Thomas Hörnlund from Svartbergets Field Research Station will share his experiences from measurements with the Trimble GeoXR 6000 GNSS receiver.
This is the last seminar for this year and we will be back with more seminars in January 2017. If you have suggestions for topics, please contact Mattias Nyström.
Next week’s seminar (24 November) will be given by Kenneth Olofsson. He has developed methods to automatically detect tree locations and shapes from stationary terrestrial laser scanning data. The title of the seminar is: “Terrestrial Laser Scanning: Can we see the wood for the trees?”
Here is a short video from this week’s seminar where Henrik tries out the HTC Vive (Virtual Reality). We have converted a point cloud created from drone images into the virtual reality environment. On the screen in the background you can see what Henrik sees in the headset. With the headset on your head, you see it all in 3D as well.
Next week’s seminar will be given by students from Umeå University who are developing a platform for a mobile laser scanning system. They will give an overview of their work so far. More information about the seminars.
Swedish radio interviewed Mattias Nyström about the recent court ruling that a camera on a drone counts as a surveillance camera. This means that you need a permit from the county administrative board to use a camera on a drone. Such permits are normally only granted for crime prevention, so it is unclear whether we can get one. Hopefully the law will be updated, but this can take some time. Luckily we have about 600 GB of photos collected this summer and autumn that we can use in the meantime.
The seminar on the 3rd of November (14:30-15:00) will be a demonstration of the Virtual Reality equipment in the lab. We will also demonstrate portable systems. Come and try out a birch forest scanned with our high-resolution terrestrial laser scanner and a colorized point cloud generated from drone images.
Flakaliden Research Park, where the effects of climate and fertilization are studied, was surveyed by a drone taking about 700 pictures. The drone was equipped with a consumer camera (Sony a5100) and the images were processed into a very dense point cloud and a very high-resolution orthophoto (GSD 2.3 cm). A fly-through of the point cloud is presented in the video. More information about the research at Flakaliden.
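As a rough sanity check of a GSD figure like this, ground sample distance can be estimated from the camera and flight geometry. A minimal sketch, with assumed values not stated in the post: the Sony a5100 has a roughly 23.5 mm wide sensor and 6000 px image width, and a hypothetical 20 mm lens at about 117 m altitude is one combination that lands near 2.3 cm per pixel.

```python
def gsd_m(altitude_m, focal_length_m, sensor_width_m, image_width_px):
    """Ground sample distance (meters per pixel) for a nadir-pointing camera."""
    return altitude_m * sensor_width_m / (focal_length_m * image_width_px)

# Assumed example values: 20 mm lens, 117 m flight altitude (not from the post).
gsd = gsd_m(altitude_m=117, focal_length_m=0.020,
            sensor_width_m=0.0235, image_width_px=6000)
print(f"{gsd * 100:.1f} cm")  # prints "2.3 cm"
```

Halving the altitude halves the GSD, which is why survey drones fly as low as the mission safely allows.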
The HTC Vive Virtual Reality equipment is up and running in the Ljungbergslaboratory. Oscar was the first student to walk around in a terrestrial laser scanning point cloud. Thanks to Mikael Hertz for generating the VR content from the point cloud.
Two weeks ago, we did several flights with the lab’s new Parrot Sequoia multispectral camera. We have summarized how the camera can be mounted on a 3DR Solo drone. We also did a first quick 3D point cloud from the NIR photos:
The drone-optimized multispectral camera, Parrot Sequoia, arrived at the lab today! The camera collects images in four defined wavelength bands and also has a normal RGB sensor. More information about the camera.
In contrast to many other users, we plan to use the camera to capture data over forests. As soon as we have some footage to show, we will publish it here on rslab.se.
The camera captures images (1.2 Mpx) in the following wavelength bands:
Green 550 nm (40 nm bandwidth)
Red 660 nm (40 nm bandwidth)
Red edge (10 nm bandwidth)
Near infrared (40 nm bandwidth)
In addition to the four narrow wavelength bands, the Sequoia has an RGB camera with 16 Mpx resolution.
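With separate red and near-infrared bands, a vegetation index such as NDVI can be computed per pixel. A minimal NumPy sketch, assuming the bands are already co-registered and radiometrically calibrated (a real Sequoia workflow requires both steps); the array values below are made up for illustration.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, computed per pixel."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + 1e-12)  # epsilon avoids division by zero

# Made-up reflectance-like values: vegetation is bright in NIR, dark in red.
nir_band = np.array([[200.0, 180.0], [60.0, 55.0]])
red_band = np.array([[50.0, 45.0], [50.0, 52.0]])
print(ndvi(nir_band, red_band))  # high values (~0.6) in the top row suggest vegetation
```

The same function applies unchanged to full image arrays, since NumPy broadcasts the arithmetic over every pixel.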