DreamMat: High-quality PBR Material Generation
with Geometry- and Light-aware Diffusion Models


Yuqing Zhang1*, Yuan Liu2,3*, Zhiyu Xie1, Lei Yang3, Zhongyuan Liu3, Mengzhou Yang3, Runze Zhang3, Qilong Kou3, Cheng Lin3, Wenping Wang4, Xiaogang Jin1†

1State Key Lab of CAD&CG, Zhejiang University    2The University of Hong Kong     3Tencent Games     4Texas A&M University
*The first two authors contributed equally.    †Corresponding author.


DreamMat generates PBR materials for 3D meshes from text prompts.

Recent advancements in 2D diffusion models allow appearance generation on untextured raw meshes. These methods create RGB textures by distilling a 2D diffusion model, but the resulting textures often contain unwanted baked-in shading effects that produce unrealistic renderings in downstream applications. Generating Physically Based Rendering (PBR) materials instead of plain RGB textures is a promising solution. However, directly distilling PBR material parameters from 2D diffusion models still suffers from incorrect material decomposition, such as shading effects baked into the albedo. We introduce DreamMat, an approach that resolves this problem to generate high-quality PBR materials from text descriptions. We find that the main cause of incorrect material distillation is that large-scale 2D diffusion models are trained only to generate final shading colors, providing insufficient constraints on material decomposition during distillation. To tackle this, we first finetune a new light-aware 2D diffusion model conditioned on a given lighting environment, so that it generates shading results under that specific lighting. Then, by applying the same environment lights during material distillation, DreamMat generates high-quality PBR materials that are not only consistent with the given geometry but also free from any baked-in shading effects in the albedo. Extensive experiments demonstrate that the materials produced by our method are more visually appealing to users and achieve significantly better rendering quality than baseline methods, making them preferable for downstream tasks such as game and film production.

Generated PBR materials and relighting results

The input text prompts are "A red apple", "A cute striped kitten", "A vintage-style green sewing machine with golden buttons", "A wooden stool", "The earth", "A vintage rotary telephone with red base and brass handset", "A blue pleated plaid skirt" and "A cupcake with marshmallow and chocolate drizzle topping".


DreamMat distills a diffusion model to generate PBR materials. We first use Monte Carlo sampling to render images of the object from its material representation under a randomly selected, predefined environment light. Then we train the material representation with a CSD loss on the rendered images, guided by a geometry- and light-aware diffusion model.
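The Monte Carlo rendering step above can be illustrated with a minimal sketch. Assumptions: the full method evaluates a complete PBR BRDF per surface point, while this toy version estimates the radiance of a single Lambertian pixel under an environment light via uniform sphere sampling; `render_pixel_mc` and `env_light` are hypothetical names, not the paper's code.

```python
import numpy as np

def render_pixel_mc(albedo, normal, env_light, n_samples=1024, rng=None):
    """Monte Carlo estimate of outgoing radiance at one surface point.

    Simplified Lambertian stand-in for the full PBR shading used during
    distillation.  `env_light` maps a unit direction (3,) to RGB radiance (3,).
    """
    rng = np.random.default_rng(rng)
    acc = np.zeros(3)
    for _ in range(n_samples):
        # Sample a uniform direction on the unit sphere.
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)
        cos_t = d @ normal
        if cos_t <= 0.0:
            continue  # direction below the surface contributes nothing
        # Lambertian BRDF = albedo / pi; pdf of a uniform sphere = 1 / (4*pi).
        acc += (albedo / np.pi) * env_light(d) * cos_t / (1.0 / (4.0 * np.pi))
    return acc / n_samples
```

For a constant white environment light, the hemisphere integral of `(albedo/pi) * cos` is exactly `albedo`, so the estimate should converge to the albedo value; this sanity check is a common way to validate a Monte Carlo shading estimator.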

Geometry- and Light-aware Diffusion Model

Our geometry- and light-aware diffusion model takes an object's normal and depth maps as geometry conditions, and renders of six predefined materials under a given environment light as lighting conditions. The model generates images that align with both the given geometry and the environment light.
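The conditioning inputs described above can be sketched as a simple channel stack. Assumptions: the exact layout fed to the diffusion model is not specified here, so `build_condition` is a hypothetical helper that just concatenates the normal map, depth map, and the six material-probe renders along the channel axis (ControlNet-style conditioning would consume such a stack).

```python
import numpy as np

def build_condition(normal_map, depth_map, probe_renders):
    """Assemble a conditioning tensor for a geometry- and light-aware model.

    normal_map:    (H, W, 3) unit normals in [-1, 1]
    depth_map:     (H, W)    normalized depth in [0, 1]
    probe_renders: six (H, W, 3) RGB renders of predefined materials
                   under the target environment light
    """
    assert len(probe_renders) == 6, "six predefined material probes expected"
    channels = [
        (normal_map + 1.0) * 0.5,   # remap normals to [0, 1]
        depth_map[..., None],       # depth as a single channel
    ]
    channels.extend(probe_renders)  # 6 * 3 = 18 lighting channels
    return np.concatenate(channels, axis=-1)  # (H, W, 3 + 1 + 18)
```

The key idea is that the lighting condition is not a raw environment map but renders of known materials under that light, which ties the model's output to a concrete, reproducible lighting setup.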

Material Generation of Complex Objects

Generating Diverse Materials

Material Generation for Object Sets

For a cluttered scene or a highly detailed avatar, we generate materials for each component separately and then combine the parts to render the final appearance in Blender.


@article{zhang2024dreammat,
  title={DreamMat: High-quality PBR Material Generation with Geometry- and Light-aware Diffusion Models},
  author={Zhang, Yuqing and Liu, Yuan and Xie, Zhiyu and Yang, Lei and Liu, Zhongyuan and Yang, Mengzhou and Zhang, Runze and Kou, Qilong and Lin, Cheng and Wang, Wenping and Jin, Xiaogang},
}