Workflow: UV texture map generation with ControlNet Image Segmentation #204
Replies: 11 comments · 19 replies
-
Thank you!
-
Holy shit.
-
This is marvelous and delicious! And I'm not only talking about that gorgeous packaging, haha! This is the type of SD application I want to see more of, please share your new ideas! 🥇🥇🥇🥇🥇
-
Wow, incredible
-
@AugmentedRealityCat Would you please help me with that?
-
Wonderful! Are there more complex examples? For example, a car mesh or an animal mesh?
-
@AugmentedRealityCat Can you provide me with the .gltf model file of this packaging box? Thanks!!!
-
Honestly, this... shouldn't work :) If it does, I will have to revisit my whole knowledge of this space and go into hiding to meditate.
-
Is it possible to somehow influence which parts of the segmentation map the prompt applies to? E.g., reddish colors are brick, blue is grass.
-
Where can I find the seg difference model? I can't really find it anywhere.
-
Here is how to use ControlNet with the Image Segmentation model to generate unwrapped UV texture maps for your 3D objects.
This video shows 50 different packaging textures for this box, all synthesized in a few minutes. The textures were applied in Cinema 4D, and the model used for this example was taken from its asset library, but this should work with any 3D application.
unwrappedbox_render_1.mp4
Workflow
Create or import your model and unwrap its UV coordinates. It is essential to have clean UVs for this to work.
Create a segmentation map by applying flat, solid colors to the different surfaces of your object. Keep it simple. Reuse the same color for surfaces sharing the same properties.
Export that segmentation map as a PNG. It does not have to be square, but it's a good idea to use a resolution that works well with SD. For this example it was 2048x1024, a nice 2:1 ratio that you can easily downscale to 1024x512 if required.
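The two steps above can be sketched in a few lines of Python with Pillow. The colors, region boundaries, and filenames here are made up for illustration; in practice you would paint the map over your actual UV layout in your 3D app.

```python
# Sketch: build a flat-color segmentation map in code (hypothetical regions).
from PIL import Image, ImageDraw

W, H = 2048, 1024  # 2:1 ratio, downscales cleanly to 1024x512
seg = Image.new("RGB", (W, H), (4, 200, 3))  # background region color
draw = ImageDraw.Draw(seg)
draw.rectangle([0, 0, W // 2, H], fill=(120, 120, 80))      # one face of the box
draw.rectangle([W // 2, 0, W, H // 2], fill=(6, 230, 230))  # another face

seg.save("segmentation_map.png")
# NEAREST keeps the colors flat - bilinear resampling would blur region borders.
seg.resize((1024, 512), Image.NEAREST).save("segmentation_map_half.png")
```

Using NEAREST for the downscale matters: the Seg model keys off exact flat colors, so you don't want antialiased gradients along region edges.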
Start Stable Diffusion and enable the ControlNet extension.
Load your segmentation map as an input for ControlNet. Make sure you set the resolution to match the ratio of the texture you want to synthesize. I went for half-resolution here, with 1024x512.
Leave the Preprocessor set to None. Since we already created our own segmentation map, there is no need to preprocess it.
Set the ControlNet Model to the one containing the word Seg (for image SEGmentation).
You may have to adjust the weight parameter - in this example I cranked it up to 2.
Write a simple prompt describing the texture you want to generate, for example Grape Juice Packaging. You can also change the sampler, the steps, the CFG scale and the SD model. For this example I used the plain ema-only 1.5 model just to show you don't need anything fancy. You can also use other extensions; in this case I wanted the texture to be seamless on the X axis, so I used the Asymmetric Tiling extension to achieve that.
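If you prefer driving the webui from a script rather than the UI, the same settings map onto a txt2img API payload. This is only a sketch: it assumes a local AUTOMATIC1111 webui started with --api and the ControlNet extension installed, and the exact model name string depends on your install.

```python
import base64, json

# In real use, read your exported segmentation map:
#   seg_bytes = open("segmentation_map.png", "rb").read()
seg_bytes = b"...png bytes of your segmentation map..."  # placeholder
seg_b64 = base64.b64encode(seg_bytes).decode()

payload = {
    "prompt": "Grape Juice Packaging",
    "width": 1024,   # match the ratio of your UV map (here 2:1)
    "height": 512,
    "steps": 20,
    "cfg_scale": 7,
    "alwayson_scripts": {
        "controlnet": {
            "args": [{
                "input_image": seg_b64,
                "module": "none",          # no preprocessor: we made the map ourselves
                "model": "control_sd15_seg",  # exact name depends on your install
                "weight": 2.0,             # cranked up, as in the example above
            }]
        }
    },
}

# Uncomment to send it to a locally running webui (started with --api):
# import requests
# r = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload)
# images = r.json()["images"]

print(json.dumps({k: payload[k] for k in ("prompt", "width", "height")}))
```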
Optional: you can also do this in IMG2IMG mode and use a picture to influence the texture map generation process.
Optional: if you want to synthesize many variations, it can be a good idea to use wildcards and other dynamic prompting methods.
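The wildcard idea can be mimicked in a few lines of Python if you generate prompt lists outside the webui. The __fruit__ list below is hypothetical; the actual wildcards/Dynamic Prompts extensions read such lists from text files like wildcards/fruit.txt.

```python
import random
import re

# Hypothetical wildcard lists - the wildcards extension normally
# loads these from text files, one entry per line.
WILDCARDS = {
    "fruit": ["grape", "orange", "mango", "kiwi", "pomegranate"],
}

def expand(prompt: str, rng: random.Random) -> str:
    """Replace each __name__ token with a random entry from its list."""
    return re.sub(
        r"__(\w+)__",
        lambda m: rng.choice(WILDCARDS[m.group(1)]),
        prompt,
    )

rng = random.Random(0)
for _ in range(3):
    print(expand("__fruit__ juice packaging", rng))
```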
Press the Generate button and repeat until you get an image that satisfies you. Here is one example of the results I got. I made hundreds of them over a few minutes by using __fruit__ juice packaging as a wildcard prompt.
Import that picture as a texture map and apply it to your 3D object using the same UV coordinates as those used to generate the Image Segmentation map.
Press render and voilà!
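Since the Asymmetric Tiling extension is supposed to make the result seamless on the X axis, a quick sanity check on a generated texture is to compare its left and right edge columns. This is just a sketch; the filename is hypothetical.

```python
from PIL import Image

def x_seam_error(path: str) -> float:
    """Mean absolute per-channel difference between the left and right
    edge columns. Near zero means the texture tiles cleanly on X."""
    img = Image.open(path).convert("RGB")
    w, h = img.size
    left = [img.getpixel((0, y)) for y in range(h)]
    right = [img.getpixel((w - 1, y)) for y in range(h)]
    diffs = [abs(a - b) for lp, rp in zip(left, right) for a, b in zip(lp, rp)]
    return sum(diffs) / len(diffs)

# A large value for x_seam_error("texture.png") means the texture will show
# a visible vertical seam when wrapped around the object on the X axis.
```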