Sunday, December 13, 2020

How to Turn Physical Products into Realistic 3D Models for AR

Available Methods

There are a few different options to consider when it comes to creating a 3D model from a real-world object.



Option One: Photogrammetry

Using a series of photos taken of a real-world object, photogrammetry software can create an accurate, high-density mesh of most objects. The “mesh” is a group of triangles that defines the shape of the object. Along with the mesh, the software also creates texture images that define the colour of the object.



These photos also define how light interacts with the object, letting the program know how rough or smooth its surface is. However, while photogrammetry can be extremely effective for some objects, it can be highly ineffective for others.
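To make the mesh-plus-texture idea concrete, here is a minimal sketch of how that kind of data could be represented on the app side, assuming SceneKit (which we use later for the AR app). The vertex positions, UV coordinates and the "albedo.jpg" texture name are placeholder values, not real photogrammetry output.

import SceneKit

// A photogrammetry result boils down to vertex positions, triangle
// indices and texture (UV) coordinates, plus an image for the colour.
// The values below are placeholders, not real scan output.
let vertices: [SCNVector3] = [
    SCNVector3(0, 0, 0),
    SCNVector3(1, 0, 0),
    SCNVector3(0, 1, 0)
]
let uvs: [CGPoint] = [
    CGPoint(x: 0, y: 0),
    CGPoint(x: 1, y: 0),
    CGPoint(x: 0, y: 1)
]
let indices: [Int32] = [0, 1, 2]   // one triangle

let vertexSource = SCNGeometrySource(vertices: vertices)
let uvSource = SCNGeometrySource(textureCoordinates: uvs)
let triangles = SCNGeometryElement(indices: indices, primitiveType: .triangles)
let geometry = SCNGeometry(sources: [vertexSource, uvSource], elements: [triangles])

// The texture image produced alongside the mesh defines the colour.
let material = SCNMaterial()
material.diffuse.contents = "albedo.jpg"   // hypothetical texture file name
geometry.materials = [material]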


Below are examples of good and bad candidates for photogrammetry:


Good Candidates

rough surface

opaque

lots of visual patterns on the surface (e.g. colour changes, texture, depth)


Bad Candidates

smooth surface

transparent

reflective/shiny

featureless (i.e. one solid colour; no visual patterns to detect)



Option Two: 3D Scanning

3D scanning is similar to photogrammetry, but uses more specialized hardware. This technology shares many of photogrammetry's pitfalls. Though it can be highly accurate, it produces a non-optimized model and texture set. This means the file size will be larger than necessary, and the result will potentially require manual work to make it ready for use. Furthermore, it can be expensive to buy a good 3D scanner or to have your object 3D scanned elsewhere.



Option Three: 3D Modeling Programs

In this option an artist starts with a blank digital space and creates the model from scratch. This can be a time-consuming process and needs the skills of an experienced modeller to get right. However, the results can be visually accurate and fully optimized for purposes such as ours.

The majority of our products were modeled using a 3D modeling program called Maya. They were then brought into Substance Painter or Mudbox for texture painting. For the few products we saw as good candidates for photogrammetry, we used a program called RealityCapture.


Process:

Step One: Taking Reference Photos and Measurements

The first step for each product is taking good reference photos. We were lucky enough to have Magnolia send us each product, which was a huge help during the entire process.


Long focal length: it’s important to use a lens with a long focal length. Otherwise the photo will be skewed by perspective distortion that makes things closer to the camera appear much larger. This kind of photo is not ideal to model against.



Varying views: Usually front, back, left side, right side and bottom are sufficient to create an accurate model.


These photos are then imported into the Maya scene for reference when we build the model. The goal here is accuracy. If the model does not reflect the real-world proportions of the product, the AR representation becomes misleading.

Then we take careful measurements of the height, length and width of each part. Sometimes this requires drawing up an extensive diagram, depending on how complex the object is.


Step Two: Modeling


After we have our measurements and scene file set up, we start with a primitive shape (i.e. a sphere, cylinder or cube) and add detail until we have an accurate representation of the product. It’s also important to remember that the mesh complexity has to stay relatively low so that the app can load it quickly.
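As a rough way to keep that budget honest, an exported model can be inspected programmatically. The sketch below, assuming SceneKit, counts the triangles in a loaded file; the "product.dae" file name and the 10,000-triangle budget are illustrative values, not our actual numbers.

import SceneKit

// Sketch: count the triangles in an exported model to check it stays
// within a polygon budget. File name and budget are illustrative.
if let scene = SCNScene(named: "product.dae") {
    var triangleCount = 0
    scene.rootNode.enumerateHierarchy { node, _ in
        guard let geometry = node.geometry else { return }
        for element in geometry.elements where element.primitiveType == .triangles {
            triangleCount += element.primitiveCount
        }
    }
    if triangleCount > 10_000 {
        print("Mesh exceeds the budget: \(triangleCount) triangles")
    }
}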


After the modeling portion is finished, we export two files: one with a low-density mesh and one with a high-density mesh. We import both into Substance Painter, where we add the textures.


Step Three: Painting the Textures


The high-density mesh is used to generate smooth texture maps. Substance Painter then uses these initial texture maps to generate various effects, like edge wear, scratches and rust.


Then we adjust our Substance Painter viewport to match the environmental lighting the models will be viewed in. We do this using a 360° photo of the office.
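There is an analogous step on the app side, at least when rendering with SceneKit: the same kind of 360° image can be set as the scene's lighting environment so the model picks up similar lighting and reflections. A minimal sketch, with "office_360.jpg" standing in for a hypothetical equirectangular photo in the app bundle:

import SceneKit

// Sketch: use a 360° photo as image-based lighting so the rendered
// model is lit by the same environment used while texturing.
let scene = SCNScene()
scene.lightingEnvironment.contents = "office_360.jpg"  // hypothetical image
scene.lightingEnvironment.intensity = 1.0

// Optionally show the same photo behind the model when previewing.
scene.background.contents = "office_360.jpg"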


Substance works a lot like Photoshop: you add detail, textures, and colour adjustments in layers. What makes a model look real is capturing the imperfections: the layers of scratches, fingerprints, and chipped paint of the real-world object. Rarely is anything in real life one colour, completely clean, or perfectly reflective.



Prepping Models for AR

Before the models can be used, they must be exported in a format supported by your 3D engine of choice. In our case, we used the Collada (DAE) format, as we were building a native Swift app in Xcode, and the textures were exported separately as JPEGs.
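As a rough illustration of how one of these exports might be loaded on the app side, the sketch below assumes SceneKit with physically based materials; the "product.dae" file, the "product" node name and the texture file names are all hypothetical.

import SceneKit
import UIKit

// Sketch: load an exported Collada model and apply its JPEG textures.
// The file name, node name and texture names are hypothetical.
guard let scene = SCNScene(named: "product.dae"),
      let productNode = scene.rootNode.childNode(withName: "product", recursively: true) else {
    fatalError("Model not found in the app bundle")
}

let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.diffuse.contents = UIImage(named: "product_albedo.jpg")
material.roughness.contents = UIImage(named: "product_roughness.jpg")
material.metalness.contents = UIImage(named: "product_metalness.jpg")
productNode.geometry?.materials = [material]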



An additional texture is needed for the product’s contact shadow. Without shadows, products look like they’re floating above the surface. These shadows are generated beforehand in Maya and saved as a texture to be displayed underneath the product mesh.
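One way this can be wired up in SceneKit is to put the baked shadow texture on a flat plane and parent it under the product node. In the sketch below, the shadow image name, plane size and productNode are all placeholders.

import SceneKit
import UIKit

// Sketch: show a pre-baked contact-shadow texture on a plane directly
// underneath the product mesh. Names and sizes are placeholders.
let shadowPlane = SCNPlane(width: 0.6, height: 0.6)
let shadowMaterial = SCNMaterial()
shadowMaterial.diffuse.contents = UIImage(named: "product_shadow.png") // image with alpha
shadowMaterial.lightingModel = .constant      // the baked shadow should not be re-lit
shadowMaterial.writesToDepthBuffer = false    // avoid z-fighting with the real surface
shadowPlane.materials = [shadowMaterial]

let shadowNode = SCNNode(geometry: shadowPlane)
shadowNode.eulerAngles.x = -.pi / 2           // lay the plane flat
shadowNode.position = SCNVector3(0, 0.001, 0) // just above the detected surface
// productNode.addChildNode(shadowNode)       // productNode is hypothetical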





References:

https://medium.com/shopify-vr/how-we-turn-physical-products-into-realistic-3d-models-for-ar-13f9dc20d964
