The lab has developed software to view and analyze point clouds in Virtual Reality. The application is written in Unity and currently supports only the HTC Vive. We have also released the source code we wrote ourselves, so you can modify it and add support for other headsets.
We will not develop the application further at the moment, but feel free to download the source code and contribute! Please add a comment on the repository if you continue developing the application.
Read more about the application here and about the original plan for our development here.
Placing ground control points (GCPs) for drone flights can be a very time-consuming task, which is why we have now placed permanent GCPs around the property at Innertavle. A total of 11 GCPs are placed along the border of the property. The GCPs help in post-processing: the known locations of these points serve as references when georeferencing the rest of the points in the data.
On the 13th of February we will hold the first workshop of the semester. We previously called these sessions open-lab, but now we prefer to call them workshops. The idea is to start at 13:00 with a short introductory presentation on a remote sensing topic, for example the use of drones in forestry. The presentation is 15–20 minutes long, and afterwards the lab is open for you to experiment hands-on and work with remote sensing data. We will always supply data for you to experiment with, but you can of course bring your own, or use the lab's equipment to collect new data. The lab closes at 15:00.
The topic for this first workshop is airborne laser scanning.
Today we release the first beta version of our application for viewing and analyzing point clouds in Virtual Reality. We have named the application PointCloud XR, and you can read more about the development and download PointCloud XR. Please leave a comment on our Facebook page with your thoughts after you have tried it!
In the current version (2018-12-14) you can do the following:
Open point clouds in LAS 1.2 format.
Depending on your hardware, you can open around 15 million points and still move around fluently.
You can “fly” around in the point cloud using your hand controller.
Change the size of the points.
Color the points by RGB, class, height, or intensity. Intensity can always be blended with any of the other coloring modes.
Measure distances. You can snap to points and restrict the measurement to the y-direction or the xz-plane.
Select points using a sphere. The selected points can then be removed.
Save the edited point cloud in PCXR format. A LAS exporter is not implemented yet, but we are working on it.
Set a new start position. This only works with a file saved in PCXR format.
Change the throttle (how fast you fly when you press the trigger).
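For the curious: the LAS 1.2 files the application opens start with a fixed-layout public header. Below is a minimal sketch, not the PointCloud XR loader itself, of how such a header can be parsed to find the fields a viewer needs (point count, record format, scale factors); field offsets follow the ASPRS LAS 1.2 specification, and the demo values are hypothetical.

```python
import struct

def parse_las_header(buf: bytes) -> dict:
    """Extract the fields a point-cloud viewer needs before streaming points."""
    if buf[0:4] != b"LASF":  # LAS file signature
        raise ValueError("not a LAS file")
    return {
        "version": (buf[24], buf[25]),                         # major, minor
        "point_data_offset": struct.unpack_from("<I", buf, 96)[0],
        "point_format": buf[104],
        "point_record_length": struct.unpack_from("<H", buf, 105)[0],
        "point_count": struct.unpack_from("<I", buf, 107)[0],
        # scale factors convert the stored int32 coordinates to real units
        "scale_xyz": struct.unpack_from("<3d", buf, 131),
    }

# Demo on a synthetic 227-byte LAS 1.2 header (all values made up):
hdr = bytearray(227)
hdr[0:4] = b"LASF"
hdr[24], hdr[25] = 1, 2                      # version 1.2
struct.pack_into("<I", hdr, 96, 227)         # points start right after header
hdr[104] = 2                                 # format 2 = XYZ + intensity + RGB
struct.pack_into("<H", hdr, 105, 26)         # record length for format 2
struct.pack_into("<I", hdr, 107, 15_000_000)
struct.pack_into("<3d", hdr, 131, 0.01, 0.01, 0.01)

info = parse_las_header(bytes(hdr))
print(info["version"], info["point_count"])  # prints: (1, 2) 15000000
```

In practice a ready-made reader such as the laspy library does this (and the point records) for you; the sketch only shows why LAS 1.2 is straightforward for a viewer to support.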
Here is a clip of a few students trying an earlier version of the application.
Now we continue with the second half of the course. The first part gave an overview of drone systems and regulations, as well as drone flying and flight planning. Now we continue with processing the collected images into geographic data, i.e. orthoimages and 3D point clouds, and then with how those products can be transformed into forest data and their current use in forestry.
Wednesday, 19th of September, 13:00–16:00. Stereo-photogrammetry, production of orthoimage mosaics and 3D point clouds.
Wednesday, 26th of September, 13:00–16:00. Estimation of forest variables and the use of drones in forestry.
Applications to the course should be sent to email@example.com no later than September 18th. The number of students on the course is limited, and students who have taken the first part have priority.
The course aims to provide students with knowledge of how to conduct drone-based image acquisition and forest inventory. An introduction will be given to drone systems and regulations, along with practical knowledge of how to operate drones to acquire images for mapping. The first part will focus on acquiring the images. The second part of the course, in the autumn, will cover data processing to produce ortho-mosaics and 3D models and to extract forest information.
Wednesday, 23rd of May, 13:00–16:00. Drone and camera systems and regulations. In the Ljungberg lab.
Wednesday, 30th of May, 12:30–17:00. Field trip to fly drones and acquire images.
Applications to the course should be sent to firstname.lastname@example.org no later than May 21st. The number of students on the course is limited.
Carl Jansson presents his thesis, titled "Identifiering av fullskiktade bestånd med stereomatchande flygbilder och laserskanning" (Identification of fully stratified stands using stereo-matched aerial images and laser scanning). Carl has investigated whether fully stratified, selection-felled stands can be distinguished from single-storied stands managed with clear-cutting, using remote sensing and methods that can be automated and applied over large areas.
The presentation takes place in Årsringen on floor 1, 14:00–14:30. Mats Nilsson is the examiner and Eva Lindberg is the supervisor.
We will take a break from the open-lab Wednesdays. We will announce here on the website when we open again. Meanwhile, you can have a look at the updated list of equipment and software. We will shortly provide more tutorials and instructions.