
Meta launches Garment3DGen: 3D clothing can be customized based on images

Author: Sina XR

Recently, the Meta research team proposed Garment3DGen, a new method that synthesizes 3D clothing assets from a base mesh, using a single input image as guidance.


According to the introduction, Garment3DGen stylizes both the geometry and the texture of a base 3D garment mesh to match a 2D image; the resulting garments can be fitted to parametric body models and physically simulated, enabling 3D garment generation and downstream applications.


Given an RGB image, the research team first performs a 3D reconstruction using a single-view-to-multi-view consistent image generation technique. The reconstructed mesh then serves as a 3D pseudo-ground truth that guides a mesh deformation process, deforming the input mesh geometry to match the target. The output geometry preserves the structure and topology of the input mesh: it retains the openings at the neck, arms, and waist so the garment can fit a body, and its mesh quality remains suitable for physics simulation. Finally, a texture estimation module generates high-fidelity texture maps that are globally and locally consistent and faithfully capture the input guidance, so the resulting 3D asset can be rendered.
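The idea of deforming a base mesh toward a reconstructed pseudo-ground truth while leaving its connectivity untouched can be illustrated with a toy sketch. This is not the Garment3DGen implementation, just a minimal NumPy example: each source vertex is iteratively pulled toward its nearest vertex in a target point set, so the vertex positions change but the mesh topology (faces, openings) would remain exactly as in the input.

```python
# Toy sketch of pseudo-ground-truth-guided deformation (an assumption-laden
# illustration, NOT Garment3DGen's actual deformation objective): pull each
# source vertex toward its nearest neighbor in a target point set.
import numpy as np

def deform_toward_target(src, tgt, steps=100, lr=0.2):
    """Move src vertices (N, 3) toward tgt vertices (M, 3).

    Only positions change; any face/connectivity data associated with
    src would be carried over unchanged, preserving topology.
    """
    verts = src.copy()
    for _ in range(steps):
        # squared distance from every source vertex to every target vertex
        d2 = ((verts[:, None, :] - tgt[None, :, :]) ** 2).sum(axis=-1)
        # nearest target vertex for each source vertex
        nearest = tgt[d2.argmin(axis=1)]
        # gradient step shrinking the distance to that nearest neighbor
        verts += lr * (nearest - verts)
    return verts

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.normal(size=(50, 3))            # toy "base garment" vertices
    tgt = src * 1.5 + np.array([0.0, 1.0, 0.0])  # toy "reconstructed" target
    out = deform_toward_target(src, tgt)
    # after optimization, every vertex sits close to the target point set
    d2 = ((out[:, None, :] - tgt[None, :, :]) ** 2).sum(axis=-1)
    print(d2.min(axis=1).max())
```

A real pipeline would add regularization terms (e.g. penalizing distortion of the original mesh) so the garment stays simulation-ready rather than collapsing onto the target points; the sketch keeps only the data-fitting term for clarity.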


With Garment3DGen, users can generate textured 3D garments of their choice without artist intervention. Users can also provide a text prompt describing the garment they want in order to generate simulation-ready 3D assets.
