Final Data Products (AT309)

 Pix4D Final Data Products

Michael Holland

Introduction

Throughout the fall semester of AT309, my lab partner, Saurav Dalvi, and I completed multiple 3D scans and mapping missions using both a Skydio S2+ and a DJI Mavic 2 Pro to collect data. For these 3D scans and mapping missions, the drone flew a predetermined path and captured individual images of the area we wanted to scan. After collecting the images, Saurav and I saved them to a personal external hard drive. From there, we imported them into a mapping software called Pix4Dmapper, which stitched the individual images together into one large 3D model. If the scan was a mapping mission, we could then use a software called ArcGIS Pro to turn our processed data into a cartographically correct map. While collecting and processing the data, we were able to see the differences between the Skydio S2+ and Mavic 2 Pro platforms. The following sections show some of the scans we completed, along with an analysis of the data collection, the platform, and the final product.

 

Results/Discussion

Week 3 Data: 3D Object Scan

One of the first 3D scans my partner and I completed was a 3D object scan with the Skydio S2+ during week 3. The object we chose to scan was my lab partner's car. We set this scan up using Skydio's ground control app, which automatically determined the flight path and picture locations needed to gather enough imagery of the object to create a 3D model of it. To scan the car, the Skydio S2+ took 526 images. We then used the Pix4Dmapper software to process these pictures into a 3D model. Figure 1, Figure 2, and Figure 3 show screenshots of the point cloud of the 3D model after processing. The point cloud of our processed scan shows a large number of spots with no imagery. A point cloud displays many individual points of imagery but does not connect them. I believe the gaps in the point cloud are caused by the sun's reflection off of the car, which appears differently to the UAS as it changes angles to gather more imagery. When we activate the triangulation view, which fills in the gaps of the point cloud, the scan looks more realistic and complete, as shown in Figure 4.

      

             Figure 1- 3D Object Scan Point Cloud                         Figure 2- 3D Object Scan Point Cloud

       

               Figure 3- 3D Object Scan Point Cloud                      Figure 4- 3D Object Scan Triangulation
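
To illustrate what the triangulation step does outside of Pix4Dmapper, below is a minimal sketch using the open-source Open3D Python library. It is only a conceptual example: the file name, search radius, and reconstruction depth are hypothetical, and Pix4Dmapper performs its own meshing internally.

# Minimal sketch: turning a point cloud into a triangle mesh with Open3D.
# This only illustrates the point-cloud-vs-mesh concept; it is not part of
# the Pix4Dmapper workflow. The file name and parameters are hypothetical.
import open3d as o3d

# Load an exported point cloud (e.g., a .ply file exported from Pix4Dmapper).
pcd = o3d.io.read_point_cloud("object_scan_point_cloud.ply")

# Surface reconstruction needs per-point normals.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30)
)

# Poisson reconstruction connects the individual points into triangles,
# filling small gaps much like the triangulation view shown in Figure 4.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9
)

o3d.visualization.draw_geometries([mesh])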


Week 3 Data: 3D Tower Scan

Along with the 3D object scan, my partner and I also completed a 3D tower scan in week 3. We chose a tall light pole at the intramural fields at Purdue. This scan was also set up and completed using the Skydio S2+. Once the scan was set up, the Skydio flew circles around the tower, climbing as it gathered imagery from every direction. Overall, it took 649 images of the light pole. Figure 5 and Figure 6 show the complete processed model from the 3D scan. The Skydio and the processing software did a good job of capturing the shape and color of the light pole and the terrain below it. One thing that did not turn out well was the top of the light pole: the lights themselves are missing imagery, as shown in Figure 7. This is because the Skydio's gimbal was facing downward as it flew around the pole, so it could not gather imagery looking up into the lights. If I could redo the scan, I would set the Skydio to use a more horizontal gimbal angle as it gathered imagery so it could see the lights at the top of the pole.


                   Figure 5- 3D Tower Scan Point Cloud                       Figure 6- 3D Tower Scan Point Cloud

Figure 7- 3D Tower Scan Point Cloud Issue Area

 

Week 3 Data: 2D Mapping Mission

During week 3, my lab partner and I completed a 2D mapping mission of a soccer field at Purdue using the Skydio S2+. The scan was easy to set up and did not take long. The Skydio took 254 pictures as it flew at an altitude of 51 feet. The model from this map was very good, showing the contour of the ground with a high level of detail. I created a map of the scan area using ArcGIS Pro, as shown in Figure 8. The only problem we experienced was that the soccer net came out somewhat distorted, as the drone had a hard time capturing imagery of the thin strings of the net blowing in the wind. Overall, the Skydio platform was very easy to use for this scan and did a good job of capturing the data.

Figure 8- 2D Mapping Mission Orthomosaic Map
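
For a rough sense of how the 51-foot altitude relates to the level of detail in the map, ground sample distance (GSD) can be estimated from the camera geometry. The short Python sketch below shows the standard calculation; the sensor width, focal length, and image width are assumed values for illustration, not verified Skydio S2+ specifications.

# Rough ground-sample-distance (GSD) estimate for a nadir mapping flight.
# The camera values below are assumptions for illustration, not verified
# Skydio S2+ specifications.
SENSOR_WIDTH_MM = 6.17      # assumed ~1/2.3" sensor width
FOCAL_LENGTH_MM = 3.7       # assumed lens focal length
IMAGE_WIDTH_PX = 4000       # assumed image width in pixels

ALTITUDE_FT = 51            # flight altitude from the week 3 mapping mission
altitude_m = ALTITUDE_FT * 0.3048

# GSD in cm/pixel: (sensor width * altitude) / (focal length * image width)
gsd_cm = (SENSOR_WIDTH_MM * altitude_m * 100) / (FOCAL_LENGTH_MM * IMAGE_WIDTH_PX)
print(f"Estimated GSD at {ALTITUDE_FT} ft: {gsd_cm:.2f} cm/pixel")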

 

 

 

 

Week 6 Data: Parallel Lawn Mower Grid with Mavic 2 Pro

During week 6, Saurav and I completed two scans of the same area at the William H. Daniel Turfgrass Research and Diagnostic Center using the DJI Mavic 2 Pro. Since we had previously completed scans using the Skydio S2+, this was our first time using the Mavic platform, which gave us a chance to get to know both. To set up the 2D mapping scans with the Mavic, we used a third-party ground control app from Pix4D, which guided the Mavic so it knew where to take its pictures. This app is different from Pix4Dmapper, which we used to process the scans. The first 2D mapping scan used parallel passes of the Mavic, which we referred to as a lawn mower type scan. I then used ArcGIS Pro to create a map from the processed scan, as shown in Figure 9.

 

Figure 9- Parallel Lawn Mower Grid Orthomosaic Map

 

 

Week 6 Data: Opposing Grid Lines with Mavic 2 Pro

The second scan we completed during week 6 with the Mavic 2 Pro covered the same area, but with the Mavic making both parallel and perpendicular passes, which we referred to as crosshatched passes. Flying passes in two directions takes roughly twice as many pictures: our lawn mower type scan took 132 images, while our crosshatched scan of the same area took 275 images. Crosshatched flight paths provide higher detail because they allow the drone to capture everything from different angles. I made a map of this crosshatched scan, shown in Figure 10. Comparing the map in Figure 9 from the lawn mower scan with the map in Figure 10 from the crosshatched scan, the difference is minor. In the top-down view, both maps have almost the same amount of detail. The only noticeable difference appears when you open the 3D model of the mapped area and angle the view to look at the sides of an object; there it becomes obvious that the crosshatched scan captured more imagery and data of the area.


  Figure 10- Opposing Grid Lines Orthomosaic Map
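
As a sanity check on why the crosshatched plan roughly doubled the image count (132 versus 275), the sketch below estimates photo counts for a single lawn mower grid and for the same grid flown again at 90 degrees. The area size, image footprint, and overlap percentages are illustrative assumptions, not the actual settings from our Pix4D flight plan.

import math

# Illustrative sketch of why a crosshatched (double-grid) plan takes roughly
# twice as many photos as a single lawn-mower grid. All numbers below are
# assumptions for illustration, not our actual flight-plan settings.

def grid_image_count(area_w_m, area_l_m, footprint_w_m, footprint_l_m,
                     front_overlap, side_overlap):
    """Estimate photo count for one lawn-mower grid over a rectangular area."""
    line_spacing = footprint_w_m * (1 - side_overlap)      # distance between passes
    trigger_spacing = footprint_l_m * (1 - front_overlap)  # distance between photos
    n_lines = math.ceil(area_w_m / line_spacing)
    n_per_line = math.ceil(area_l_m / trigger_spacing)
    return n_lines * n_per_line

# Hypothetical 120 m x 80 m field, ~45 m x 34 m image footprint, 75%/65% overlap.
single = grid_image_count(80, 120, 45, 34, front_overlap=0.75, side_overlap=0.65)

# A crosshatched plan flies the same grid again rotated 90 degrees,
# so the total comes out to roughly double the single-grid count.
crosshatch = single + grid_image_count(120, 80, 45, 34, 0.75, 0.65)

print(f"single grid ~ {single} images, crosshatch ~ {crosshatch} images")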

 

Week 7 Data: Skydio S2+ Mapping Mission

Our objective in week 7 was to perform 3D scans and mapping missions with both the Skydio and the Mavic so we could compare the two platforms. We started by scanning a small parking lot area using the Skydio S2+. Having already completed scans with the Skydio, this was very quick and easy to set up. I turned the processed data into a map, and the level of detail is high, as shown in Figure 11. There is a small amount of distortion toward the bottom left of the map, but other than that, the platform worked perfectly. While assembling the map, it was necessary to trim the edges that were outside of the scan area because of distorted and incorrect imagery.


Figure 11- Skydio S2+ Mapping Mission Orthomosaic Map

 

Week 7 Data: Mavic Mapping Mission (Opposing Grid Lines)

The same parking lot area was also scanned with the Mavic 2 Pro, and the results, shown in Figure 12, were similar. The main difference between the maps from the two platforms is the large shadow across Figure 12, taken by the Mavic. This scan was completed at a different time of day, when the sun was blocked by trees. That caused poor lighting in the area, but the scan and processing still turned out as expected. Another noticeable feature of the map is that toward the top right of Figure 12 there is a parking spot with a car that appears very faded. This is because the car drove away halfway through the scan, so it is in some of the images but not others. As a result, the processed 3D model made by Pix4Dmapper only partly shows the car in the parking space.


Figure 12- Mavic 2 Pro Mapping Mission Orthomosaic Map

 

Week 7 Data: Skydio Accident Scene

During week 7, Saurav and I also completed two scans of a staged car accident scene to better understand the Skydio's and Mavic's capabilities. Using the Skydio, we selected the scan area and the UAS flew all around it to capture different angles of the accident scene. The processed 3D model is shown in Figure 13, Figure 14, and Figure 15. We experienced the same issue as in our original 3D object scan shown in Figure 1: missing point cloud data on the cars, most likely caused by the reflection of the sun. Another issue, shown clearly in Figure 15, is a false echo of the point cloud around the cars. You can see where a car actually is, but there are also points showing part of the car outside of where it should be. These were the most notable issues with the 3D scan, but overall, the Skydio did a good job scanning the scene.

                              

         Figure 13- Skydio 3D Model of Accident Scene      Figure 14- Skydio 3D Model of Accident Scene

Figure 15- Skydio 3D Model of Accident Scene

Week 7 Data: Mavic Accident Scene 3D Orbit

Figure 16, Figure 17, and Figure 18 show the same accident scene scanned by the Mavic 2 Pro. The 3D model from the Mavic's scan has many more holes of missing point cloud data, creating black spots throughout the model. This was to be expected, as the Skydio was much better suited for this kind of scan. The Skydio had settings to set up a scan for this sort of project, while the DJI Mavic only offered a 3D orbit. The 3D orbit stayed at the same altitude and flew in a circle, so it did not capture many different angles of the area. The Mavic's scan totaled 36 images, compared to the Skydio's 211 images. Overall, the Skydio's 3D model was higher quality than the DJI Mavic 2 Pro's.

      

Figure 16- Mavic 3D Model of Accident Scene        Figure 17- Mavic 3D Model of Accident Scene

Figure 18- Mavic 3D Model of Accident Scene

Conclusion

Throughout these scans, my lab partner and I were able to use different platforms to create 3D models. The process started with choosing a platform to gather the imagery. The Skydio S2+ produced higher-quality scans with more images and a greater level of detail. The DJI Mavic 2 Pro was user-friendly to set up, but its data output and the 3D models created from its scans usually did not compare to the Skydio's. The Mavic 2 Pro performed well for 2D mapping missions, but not for 3D object scans or orbit scans. Overall, if I were to choose one platform to use in the future, I would choose the Skydio. To create a map of an area, we would use the 2D mapping scan feature; for a 3D model, we would complete a 3D object scan with the Skydio or an orbit scan with the DJI Mavic 2 Pro. Once a scan was completed, we would use Pix4Dmapper to process it and then use ArcGIS Pro to create a map if the scan was a 2D mapping mission.







