Arvid Axelsson will give a presentation on his ongoing work with tree species classification using multi-spectral lidar.
The Ljungbergslab has received 3 million Swedish crowns for the coming three years (2017–2019) from Ljungbergsfonden. The goal of the project is to become a world-leading education lab for 3D remote sensing.
The current phase of the lab ended in November 2016, and the new project will take the lab to a broader community. Our goal is to make the lab even more visible on the Internet, for example through YouTube tutorials, short online courses, example data sets, and hands-on remote sensing cases. We will also develop Virtual Reality tools to import and explore 3D data. The goal is to give a deeper understanding of the data you’re working with.
-We don’t see “world leading” as a competition, says Mattias Nyström, researcher at the Swedish University of Agricultural Sciences. There is often a gap between research and education, and between engineers and foresters. Our lab will narrow this gap and provide the latest sensors, software, and tools to make this possible. We don’t want to compete with other labs; we want to collaborate and share knowledge. Everything we do in the lab will be made available on the Internet and in English.
-To become a world-leading educational lab is not about competition; it is about who can share and spread knowledge and provide the latest tools and sensors, says Jonas Bohlin, teacher at the Swedish University of Agricultural Sciences.
Here is a press release in Swedish describing the new project.
We wish you all a Merry Christmas and a Happy New Year!
This week’s seminar is about centimeter positioning inside forests using a GNSS (Global Navigation Satellite System) receiver. Thomas Hörnlund from Svartbergets Field Research Station will share his experiences from measurements with the Trimble GeoXR 6000 GNSS receiver.
This is the last seminar of the year, and we will be back with more seminars in January 2017. If you have suggestions for topics, please contact Mattias Nyström.
This week’s seminar was given by Johan Holmgren, who shared the results from field tests with three different mobile laser scanners. The three scanners were:
- ZEB1 with 3D SLAM
- Velodyne VLP 16 with stereo video cameras and 3D SLAM
- Leica Geosystems Pegasus Backpack with 3D SLAM
The presentation can be found here.
Next week’s seminar (24 November) will be given by Kenneth Olofsson. He has developed methods to automatically detect tree locations and shapes from stationary terrestrial laser scanning data. The title of the seminar is: “Terrestrial Laser Scanning: Can we see the wood for the trees?”
Here is a short video from this week’s seminar where Henrik tries out the HTC Vive (Virtual Reality). We have converted a point cloud created from drone images to the virtual reality environment. On the screen in the background you can see what Henrik sees in the headset. With the headset on your head, you see it all in 3D as well.
Next week’s seminar will be given by students from Umeå University who are developing a platform for a mobile laser scanning system. They will give an overview of their work so far. More information about the seminars.
Swedish Radio interviewed Mattias Nyström about a recent court ruling that interprets a camera on a drone as a surveillance camera. This means that you need permission from the county government to use a camera on a drone. This permission is normally only granted to prevent crime, so it is uncertain whether we can get it. Hopefully the law will be updated, but this can take some time. Luckily we have about 600 GB of photos collected this summer and autumn that we can use in the meantime.
Link to interview in the Swedish radio, in Swedish
Link to report in ATL (Lantbrukets Affärstidning), in Swedish
The seminar on the 3rd of November (14:30–15:00) will be a demonstration of the Virtual Reality equipment in the lab. We will also demonstrate portable systems. Come and try out a birch forest scanned with our high-resolution terrestrial laser scanner, and a colorized point cloud generated from drone images.
Flakaliden Research Park, where effects of climate and fertilization are studied, was surveyed by a drone taking about 700 pictures. The drone was equipped with a consumer camera (Sony a5100), and the images were processed into a very dense point cloud and a very high-resolution orthophoto (GSD 2.3 cm). A fly-through of the point cloud is presented in the video. More information about the research at Flakaliden.
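For readers curious where a GSD in that range comes from, it follows directly from the camera geometry. A minimal sketch: the Sony a5100 sensor width and pixel count are real specifications, but the focal length and flight altitude below are assumed values for illustration, not the actual flight parameters from the post:

```python
# Ground sampling distance (GSD): the ground footprint of one image pixel.
# Sony a5100: APS-C sensor ~23.5 mm wide, 6000 px across (real specs).
sensor_width_m = 23.5e-3
image_width_px = 6000
pixel_pitch_m = sensor_width_m / image_width_px  # ~3.9 micrometres

focal_length_m = 20e-3  # ASSUMED lens focal length
altitude_m = 120.0      # ASSUMED flight height above ground

# Similar triangles: ground distance per pixel = altitude * pitch / focal length
gsd_m = altitude_m * pixel_pitch_m / focal_length_m
print(f"GSD = {gsd_m * 100:.2f} cm per pixel")
```

With these assumed values the formula lands close to the 2.3 cm reported for the Flakaliden orthophoto; in practice flying lower or using a longer lens shrinks the GSD proportionally.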
The HTC Vive Virtual Reality equipment is up and running in the Ljungbergslaboratory. Oscar was the first student to walk around in a terrestrial laser scanning point cloud. Thanks to Mikael Hertz for generating the VR content from the point cloud.
Two weeks ago, we did several flights with the lab’s new Parrot Sequoia multispectral camera. We have summarized how the camera can be mounted on a 3DR Solo drone. We also produced a first quick 3D point cloud from the NIR photos:
The orthophoto in the NIR band looks like this:
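Since the Sequoia records separate narrow bands, aligned red and NIR orthophotos can be combined into a vegetation index such as NDVI with a simple per-pixel ratio. A minimal sketch using tiny made-up reflectance arrays (not the lab's actual data) in place of real band rasters:

```python
import numpy as np

# Made-up 2x2 reflectance values (0..1) standing in for aligned
# red and NIR orthophoto bands from a multispectral camera.
red = np.array([[0.08, 0.10], [0.30, 0.05]])
nir = np.array([[0.45, 0.50], [0.32, 0.40]])

# NDVI = (NIR - red) / (NIR + red); healthy vegetation is near +1
# because leaves reflect strongly in NIR and absorb red light.
ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 2))
```

The same expression works unchanged on full orthophoto rasters loaded as NumPy arrays, as long as the bands are co-registered pixel for pixel.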