Testing Pix4Dmapper for UAV-based 3D Modeling

The Department of Geosciences at Mansfield University recently approved the purchase of Pix4Dmapper, software that renders 3D models photogrammetrically through a proprietary algorithm. In practice, this means Pix4D can assemble an array of aerial images from a UAV into a three-dimensional representation of the photographed landscape. This approach is significantly cheaper and more accessible than LiDAR or other 3D remote sensing technologies and, according to the DJI specs, should be nearly as accurate. Happily, Pix4D also has a dedicated smartphone app that fully integrates with our DJI Phantom 2 Vision+ UAV, providing coverage assistance to the pilot or fully automatic piloting specifically for mapping purposes.
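
Pix4D's reconstruction algorithm is proprietary, but the core photogrammetric idea, matching features between overlapping photos and triangulating their 3D positions, can be sketched with open-source tools. Below is a minimal two-view structure-from-motion sketch in Python using OpenCV; the image filenames and the camera intrinsic matrix K are placeholder assumptions, not values measured from our Phantom.

```python
# Minimal two-view structure-from-motion sketch (not Pix4D's actual pipeline).
import cv2
import numpy as np

# Hypothetical inputs: two overlapping aerial photos and assumed intrinsics.
img1 = cv2.imread("belknap_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("belknap_002.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[2300.0,    0.0, 2000.0],   # fx, cx (placeholder values)
              [   0.0, 2300.0, 1500.0],   # fy, cy
              [   0.0,    0.0,    1.0]])

# 1. Detect and match keypoints between the two photos.
orb = cv2.ORB_create(5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 2. Estimate the relative camera pose from the matched points.
E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)

# 3. Triangulate the matches into a sparse 3D point cloud.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # first camera at origin
P2 = K @ np.hstack([R, t])                         # second camera, recovered pose
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
pts3d = (pts4d[:3] / pts4d[3]).T                   # homogeneous -> Euclidean
print(f"Triangulated {len(pts3d)} sparse points from one image pair.")
```

A full pipeline repeats this across hundreds of overlapping pairs and refines the result with bundle adjustment; the sketch only shows the geometry of a single pair.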

To do a preliminary test of the new software, MU Geosciences students Wes Glowitz, Nate Harpster and Stephen Prinzo used the Phantom to photograph Belknap Hall on the Mansfield campus, home to the department.


Wes, Nate and Stephen fly the Phantom to acquire images of Belknap.
The students flew the UAV for the duration of one battery charge, about 15 minutes, acquiring 113 photos of the building and surrounding areas using the smartphone app's guidance.

Diagram of the UAV’s location during each shot.

We knew that this coverage would ultimately be insufficient; the application recommends around 1,000 photos at full resolution for a study area of this size. But for a test, roughly 10% coverage would provide a basic illustration of how the software works.
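
As a back-of-the-envelope check on those numbers, a simple footprint-and-overlap calculation estimates how many photos a grid survey needs. The area size, photo footprint, and overlap targets below are illustrative assumptions, not the app's actual flight parameters.

```python
# Rough photo-count estimate for a grid survey (illustrative numbers only).
import math

area_w, area_h = 400.0, 300.0             # survey area in meters (assumed)
footprint_w, footprint_h = 60.0, 45.0     # ground footprint of one photo, m (assumed)
front_overlap, side_overlap = 0.85, 0.75  # common photogrammetry targets

# New ground gained per photo along a flight line, and per line across the grid.
step_along = footprint_h * (1.0 - front_overlap)
step_across = footprint_w * (1.0 - side_overlap)

photos_per_line = math.ceil(area_h / step_along) + 1
num_lines = math.ceil(area_w / step_across) + 1
total = photos_per_line * num_lines

print(f"{num_lines} lines x {photos_per_line} photos = {total} photos")
print(f"Our 113 photos cover about {113 / total:.0%} of that plan.")
```

With these assumed values the plan comes out near the roughly 1,000 photos the app recommends, and 113 photos land at about the 10% figure mentioned above.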

Then, using Pix4Dmapper, we processed the imagery into a 3D environment. We used the free download version (which does not allow exporting to other formats) because we're still waiting for the university to finalize the purchase of the professional license.


Rendering

The software is pretty resource-intensive. The i5 quad-core machines in the Belknap GIS lab couldn't complete the processing, so I ultimately finished it at home on my wife's souped-up graphics machine (she's a photographer). The rendering took around four hours and produced a point cloud, an apparently interpolated "densified" point cloud, and a triangulated mesh representing the area. Below are the images from the rendering process.

10% Rendered

Point cloud looking at Belknap from the southwest.


Point cloud looking at Belknap from the southeast. In each of these photos, the shape of the building is coming into focus.


26% Rendered


Looking at Belknap from the south. There's definitely a visible difference in fill between the 10% and 26% stages of rendering.


45% Rendered

Again from the south. Another big difference.

60% Rendered

Looking from the east — now, we’re seeing more of the surrounding area come into focus. Interestingly, the yellow stripe on Sullivan Street (US Highway 6) seems to be easier for the software to render.


72% Rendered (showing area of point cloud coverage)

Not a lot changed by 72%, so I decided to show the coverage of the model’s point cloud here. This is looking from the east again, and Belknap is that very bright spot toward the right. Even though the camera’s shots were all focused strictly on Belknap, there are a lot of points showing up elsewhere on campus and in the surrounding neighborhoods.

100% Rendered

Looking at Belknap from the south again. The maple tree on the building’s southwest corner is definitely playing tricks with the software’s ability to grab the geometry.

Results

From the sparse point cloud, the software interpolated a densified point cloud as well as a triangulated mesh. The densified point cloud views have a distinctively pointillistic feel to them.
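
Pix4D's densification step is proprietary, but a comparable effect can be approximated with the open-source Open3D library: fit a surface to the sparse cloud, then resample that surface at a much higher point count. The filename and point counts here are placeholder assumptions.

```python
# Approximate densification of a sparse photogrammetric point cloud
# (an open-source stand-in, not Pix4D's actual method).
import open3d as o3d

# Hypothetical sparse cloud exported from the reconstruction step.
sparse = o3d.io.read_point_cloud("belknap_sparse.ply")
sparse.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=1.0, max_nn=30))
sparse.orient_normals_consistent_tangent_plane(30)

# Fit a Poisson surface to the sparse points...
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(sparse, depth=9)

# ...then resample it far more densely than the original cloud.
dense = mesh.sample_points_uniformly(number_of_points=len(sparse.points) * 20)
o3d.io.write_point_cloud("belknap_densified.ply", dense)
print(f"{len(sparse.points)} sparse points -> {len(dense.points)} densified points")
```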


Densified Point Cloud

Looking from the east. Sullivan Street is clearly visible on the left side of Belknap. More interesting, though, is the amount of modeling the software did for surrounding areas. No, those mountain point clouds in the distance wouldn’t be capable of producing a detailed 3D model, but the fact that they show up at all is somewhat impressive.

Looking down from the north. More detail of neighboring Retan Center (left of Belknap), plus a bit of Grant Science Center (just up from Belknap) and Butler Hall (way up the hill).

Looking from the northeast. You can see the iconic North Hall Library (the old main building) way in the distance behind Belknap. Notice the smoke coming from the steam plant’s smokestack?


Triangulated Mesh

Another way the software interpolated the data was by constructing a triangulated mesh. Using the points rendered in the point cloud, the software created a set of interconnected vectors, over which it draped a composite of the captured imagery. This is similar to the method Google Earth has used to represent 3D buildings since 2013, and landforms since its inception.
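
The exact meshing algorithm is Pix4D's own, but for terrain-like data the classic approach is a 2.5D Delaunay triangulation: triangulate the points' horizontal positions and let the elevations ride along. Here is a minimal sketch with SciPy; the input file is a placeholder.

```python
# 2.5D Delaunay triangulation of a point cloud into a mesh
# (a classic terrain-meshing approach, not necessarily Pix4D's).
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical N x 3 array of (x, y, z) points from the densified cloud.
points = np.loadtxt("belknap_points.xyz")

# Triangulate in the horizontal (x, y) plane only; z rides along.
tri = Delaunay(points[:, :2])

# Each row of tri.simplices indexes three points forming one mesh face;
# texturing would then map each face back to pixels in the source photos.
print(f"Meshed {len(points)} points into {len(tri.simplices)} triangles")
```

A pure 2.5D triangulation struggles with vertical surfaces like building walls and trees, which may be part of why those features look rough in the views that follow.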

Looking from the east. Oddly, some of the cars seem to be melting into the parking lots…


The resolution of this mesh could obviously use some work; but remember, we used only 10% of the recommended number of images.

Video Tour

Of course, the model doesn't really show its "3D-ness" in still imagery. This past Saturday, I recorded a flyby video of the rendered model and brought it into iMovie to add some titles and music. Watching at 720p (selectable via the gear button) gives you the full effect.


Next Step

Now that we’ve got a successful test run out of the way, we’re going to tackle a few more things, including a fully modeled version of Belknap and a test of mapping a local cemetery. If the application works in these environments, we have another really fun piece of mapping technology at our disposal.


Author: Andrew Shears

Andrew Shears is an Assistant Professor of Geography at Mansfield University in Mansfield, Pennsylvania. His research interests lie at the human-environment nexus and include branches of mapping, technological, memorialization, and urban geographies. He lives in Wellsboro, Pennsylvania with his wife Amy, a professional photographer.

1 thought on “Testing Pix4Dmapper for UAV-based 3D Modeling”

  1. Nice work. I've been experimenting with the Pix4Dmapper trial version using the cloud, and I'm thinking of upgrading to the full desktop version.

     Before I invest in the full version of the Pix4Dmapper software for, say, a three-month period and buy an i7 computer for faster processing, I would like to know if there is a market for these types of models. Maybe I could start out as a hobby and then figure out how to sell later on? Or I could use Pix4Dmapper Mesh, the DJI version, which is a lot cheaper.

     Unfortunately, when I do my test UAV flights for mapping, my flight path is not automated. But I made a model with 140 pictures just from eyeballing the flight path and got great results. Currently I use an Autel drone, which is not compatible with Pix4Dcapture. I upgraded to a DJI Phantom 4 Professional, but that drone is not yet compatible with the Pix4Dcapture app either.
