Baking workflow revamp #68925
(This design doc is work in progress)
Baking is hard to use and requires maps to be baked one at a time, each time configuring the appropriate settings and selecting the right objects and shader nodes.
Blender should natively have a baking workflow that is more efficient and powerful.
Multiple add-ons exist to make this workflow easier.
A patch for object-based baking and long discussion about various other options is here:
D3203: Baking system overhaul: Move baking settings from render settings into BakePasses
Use Cases
- Baking a set of maps for game assets.
Typically includes maps like base color, metallic, roughness, normal map, AO. The assets may use complex procedural texture nodes, multiple blended shader nodes and multiple materials, all to be baked down to a single set of maps and material that the game engine understands.
Exporters like FBX or glTF would typically want to export this baked material representation. Efficiently re-baking objects, inspecting the results in the viewport, and applying the same settings to other assets is important.
- Baking to optimize rendering
Some part of a shading network may be baked down for more efficient rendering, either as a viewport approximation or for the final render: a complex procedural texture, a static background, or ray-traced AO/curvature. Light maps may also be considered part of this, though I personally would not recommend using them for Cycles or Eevee, and would not focus on this use case.
- One-off bakes
Examples include creating a base to start texture or vertex painting from, baking data for use in geometry nodes, etc. The current workflow is reasonably well suited to this, but automatic creation of images or vertex color layers and a preview of the result would be helpful.
** Perhaps best integrated as an operator/tool in paint or edit modes.
Open questions:
- Light map baking for export: how important is this still in modern game engines?
- Baking as part of a geometry nodes network?
- Baking to transfer attributes between objects: use the Data Transfer modifier instead?
- ... ?
There is not a single workflow that is ideal for all use cases. I suggest having two workflows that I think will handle the most important ones well, and that can work for others with some extra effort. The Bake panel in the scene render properties would be removed.
Quick Bakes
- New operator to quickly bake to images or attributes
- The operator pops up a dialog with a choice of bake type and relevant settings
- No settings stored in the scene; only operator properties, remembered for the next operator execution
- The operator creates new images or attributes if none exist, or bakes to the active one
- Likely exposed in vertex, sculpt and texture paint modes, and in the vertex colors panel
- No selected-to-active baking support here
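As a rough illustration of the "create new images or bake to the active one" behavior, here is a plain-Python sketch; all names are hypothetical, not actual Blender API:

```python
# Hypothetical sketch of the quick-bake operator's target handling:
# reuse the active image if one exists, otherwise create a new one
# named after the object and bake type.

def get_or_create_bake_target(images, active_name, object_name, bake_type,
                              size=(1024, 1024)):
    """Return (image_name, created) for the bake target."""
    if active_name is not None and active_name in images:
        return active_name, False  # reuse the active image
    name = f"{object_name}_{bake_type}"  # e.g. "Suzanne_AO"
    if name not in images:
        images[name] = {"size": size, "pixels": None}
    return name, True
```

The operator properties (bake type, size) would be remembered between executions, so repeating a bake on another object needs no reconfiguration.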
Material Based Bakes
- New Bake Texture shader node
** Bake type and relevant settings
** One bake type is an input socket with arbitrary shader nodes plugged in
** Image or attribute to bake to
** Bake/Clear button
- New Material datablock settings
** Bake panel in material properties / shader node editor, visible when there are any Bake Texture nodes
** Bake/Clear button for the entire material
** Collection property for "selected to active" baking, no use of selection state
** Any other shared settings
With this system, it's possible to create a reusable shader node group for a baked material, or manually add a few Bake Texture nodes for specific cases.
A big advantage of this approach is that the baked result is also immediately previewable in the viewport and usable in the final render. No need to set up bake settings in one place, and a preview shader node setup in another place.
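To make the preview advantage concrete, here is a small conceptual model (plain Python, not Blender code) of a Bake Texture node that passes its input through while unbaked and returns the stored result once baked, so the same node tree drives both setup and preview:

```python
# Conceptual model of a Bake Texture node: before baking it forwards
# its live input, after baking it returns the stored values, so the
# node tree previews both states without any extra setup.

class BakeTextureNode:
    def __init__(self, input_fn):
        self.input_fn = input_fn  # upstream shader evaluation
        self.baked = None         # stored bake result; None = unbaked

    def bake(self, samples):
        # evaluate the upstream nodes once and store the result
        self.baked = [self.input_fn(s) for s in samples]

    def clear(self):
        self.baked = None

    def evaluate(self, index, sample):
        if self.baked is None:
            return self.input_fn(sample)  # live preview of unbaked input
        return self.baked[index]          # previously baked value
```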
More details to be worked out
- Likely need an Image socket type, to make shader node groups really reusable?
- How to handle baking multiple materials into one, special material slot that all materials bake into?
** Potentially confusing to have two ways to bake, using either this special slot or a regular material slot
- Precise workflow to clear bakes and decide whether the baked or unbaked result is shown in the viewport and render
We expect add-ons will continue to be needed for more advanced use cases and automation. Currently these use the Bake operator, but a lower-level Python API function could be added that does not rely on selection state, but rather accepts specific objects, materials, attributes, etc. Such an API could also cover:
- Metallic, Base Color and possibly other bake types
- Packing bakes into specific channels, and multiple maps into one image or attribute
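As an illustration of the channel-packing point, here is a hedged sketch of combining three grayscale maps into the R/G/B channels of one buffer, using the flat [r, g, b, a, ...] layout that Blender images expose via image.pixels (the function name and signature are hypothetical):

```python
# Hypothetical channel packing: e.g. AO / Roughness / Metallic maps
# interleaved into a single flat RGBA pixel buffer.

def pack_rgba(r_map, g_map, b_map, alpha=1.0):
    """Interleave three grayscale maps into a flat [r, g, b, a, ...] list."""
    assert len(r_map) == len(g_map) == len(b_map)
    pixels = []
    for r, g, b in zip(r_map, g_map, b_map):
        pixels.extend((r, g, b, alpha))
    return pixels
```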
Please add denoising.
It's a huge time-saver for the Combined, Glossy (direct/indirect), Diffuse (direct/indirect), Ambient Occlusion and Shadow passes.
I have a design concern about bake materials as first proposed by @brecht. If there are other nodes between the bake-to-texture node and the output/BSDF nodes, and the "bake material" is supposed to be able to bake other materials on the objects it's assigned to, then from an engineering standpoint, how would the baker know what to write to the texture, or to any attribute representing the texture?
In my baking material, let's say I have:
UV tex coordinate -> write-to-texture node -> multiply*2 -> modulo 5 -> bsdf, albedo input -> output node.
From my understanding, this would require the system to perform my multiply*2 and modulo 5 in reverse. Logically there can be only one possible outcome resulting in the expected behaviour, but how would one achieve this reverse calculation?
This question arose for me when reading:
In D3203#174437, @Rawalanche wrote:
But if you look at the average baking workflow from the high level user stand point, then what people are mostly interested is just "I have this thing, and I want to bake these specific material channels into this particular UV channel and save them as the textures in this format at this file path.". That's pretty much all in 99% of cases.
In your workflow you join everything you want to bake together into one object. But from what I understand, there are two users here arguing it needs to work for multiple objects too. Not multiple texture sets on one object, but the same texture set applied to multiple objects. One of the main reasons people asked for multi-object editing in Blender was to be able to pack UVs for multiple objects together; this is not uncommon.
The other things you're not addressing are how to preview baked materials in Blender, how to export materials to file formats like glTF or USD, and how to handle the case where you want to bake materials for use in Blender itself.
So I just don't see how this covers 99% of use cases; I wouldn't generalize so easily.
In D3203#174473, @joules-2 wrote:
You have to consider multiple textures for a single mesh object. The way I approached it with my addon (I might show screenshots tomorrow) is how I approached materials on objects: with "Material Groups". The addon cycles through the groups and bakes out "skin" textures.
I think a general concept of material groups is too much. There's already much user confusion around the difference between materials and material slots, adding another layer to that would make this worse.
To me it seems better to do something specifically for baking. Let me expand a bit on what the UI for a "baking material" could look like. In the material properties or shader editor, you'd have an "Add Baked Material" button. This would create a new material datablock containing a Principled BSDF node and one or more Bake Texture nodes connected to it (or not connected, if that doesn't make sense, e.g. for Curvature).
Maybe this would just be a fixed default set, maybe it would look at the existing materials to see which parts need textures and which parts have a fixed value, or maybe there's a popup that lets you easily choose which channels to bake. Once this is added, you can go and edit parameters in the various nodes. Due to the presence of Bake Texture nodes, there would be a Baking panel visible with material-wide baking settings and a Bake button.
If you want to change the channels to bake afterwards, you can remove or add Bake Texture nodes. The Add node menu could have a category for that, for common channels like Base Color, Metallic, Roughness, Normal Map, Curvature, etc. Adding a bake texture to an input of the Principled BSDF could also auto-detect the right type of channel to bake. Image file names could also be set up automatically based on the object/material name and channel.
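The auto-detection idea could be as simple as a lookup table keyed by the Principled BSDF socket name. A hypothetical sketch — the socket names are real Principled BSDF inputs, but the keys, settings and naming scheme are purely illustrative:

```python
# Hypothetical mapping from a Principled BSDF input socket to the
# default settings a newly added Bake Texture node could pick up.

CHANNEL_DEFAULTS = {
    "Base Color": {"type": "BASE_COLOR", "colorspace": "sRGB"},
    "Metallic":   {"type": "METALLIC",  "colorspace": "Non-Color"},
    "Roughness":  {"type": "ROUGHNESS", "colorspace": "Non-Color"},
    "Normal":     {"type": "NORMAL",    "colorspace": "Non-Color"},
}

def detect_channel(socket_name, object_name, material_name):
    """Suggest bake settings and an image name for a connected socket."""
    settings = CHANNEL_DEFAULTS.get(socket_name)
    if settings is None:
        return None  # fall back to a generic bake type
    # auto-generate the image name from object/material name and channel
    image_name = f"{object_name}_{material_name}_{settings['type'].lower()}"
    return {**settings, "image": image_name}
```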
The baked material would be added outside of the regular list of material slots. The exact UI for that I'm not sure about. Maybe it's in the material slot list separate from the rest, with a specific icon, or in a completely separate material slot list where you can toggle between input and baked material slots. But regardless, this would be an actual material that can be used by the viewport, renderers and exporters.
From this I understood that "baking materials" would be a separate material kind that could really be idealised as a baking node graph, except that it's an evolved shader graph merely categorised as a "baking material", allowing it to bake from "regular" materials existing on the same objects.
I do love the concept of this. As you said, it would allow for easy previewing of the material (there could be a button next to each baking material that lets you preview it, like an eye icon, similar to show/hide in the outliner). There wouldn't be a need to set up a bake node graph and a separate material to preview the results individually, and it would also allow for easier exports when using an export system capable of reading basic Blender shaders (Godot's import tools for Blender already do this, by the way).
But going back to the beginning of this post, how would this actually be implemented?
I would be willing to expand on your concept, make some visual representations of what this would actually look like, and suggest a more finalised design (including where all the buttons and such would go), as I got some ideas for this myself when reading the https://developer.blender.org/D3203 thread. However, I would like to be sure this is actually implementable first.
I also wrote a summary of all the features requested in https://developer.blender.org/D3203, with some suggestions of my own added:
[Summary of ideas from D3203.rtf](https://archive.blender.org/developer/F10047122/Summary_of_ideas_from_D3203.rtf)
It took me 6-7 hours to read through all of that and reflect on it, and many features were suggested several times over. There were also no concrete design suggestions that would promise an overall workable solution, so I suggest that anyone considering adding more ideas on use cases read either the full thread or at least my summary first, to make sure it hasn't already been suggested.
I would also like to suggest having one thread for discussion on this topic and a separate thread for raw and complete design suggestions, so that it's easier to compare different approaches to this enormous task, and so that contributors who stop by won't get overwhelmed by a long discussion of feature suggestions, but can instead just read what design suggestions have been made in a smaller thread.
Something else that hasn't been discussed is how non-Cycles bakers would integrate with this. I believe it would make sense to have one internal (and preferably also Python-exposed) "API" that any renderer can hook into, kept as simple as possible. Perhaps the baker would only be called from the baking system as part of a baking job, and would simply return an attribute/texture that then gets processed by a universal system for applying post-process effects and managing export, file naming, etc.
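A minimal sketch of what such a renderer-agnostic hook could look like, assuming a hypothetical BakeEngine interface and a shared job that runs universal post-processing afterwards (every name here is an assumption, not existing Blender API):

```python
# Hypothetical renderer-agnostic baking hook: each engine implements
# one bake() entry point; a shared BakeJob applies the universal
# post-processing steps (denoise, margin fill, ...) to the result.

from abc import ABC, abstractmethod

class BakeEngine(ABC):
    @abstractmethod
    def bake(self, objects, channel, width, height):
        """Return a flat pixel buffer for the requested channel."""

class BakeJob:
    def __init__(self, engine, post_fns=()):
        self.engine = engine
        self.post_fns = post_fns  # universal post-process steps

    def run(self, objects, channel, width, height):
        pixels = self.engine.bake(objects, channel, width, height)
        for fn in self.post_fns:  # applied regardless of the engine
            pixels = fn(pixels)
        return pixels
```

Any renderer would then only need to implement bake(); export and file naming stay in one shared place.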
In #68925#1154299, @Olliver wrote:
UV tex coordinate -> write-to-texture node -> multiply*2 -> modulo 5 -> bsdf, albedo input -> output node.
From my understanding this would require from the system to perform my multiply*2 modulo 5 in reverse.
No, any nodes after the bake node would have no impact on what is written to the texture. The bake node would typically be placed right in front of the BSDF, unless you wanted to bake one texture and then make unbaked variations of it in the shader.
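This can be modeled in a few lines: the bake node records whatever arrives at its own input, and the downstream multiply/modulo only transform the value on its way to the BSDF, so nothing needs to be inverted. Plain Python, purely illustrative:

```python
# Model of the questioned chain:
#   UV -> bake node -> *2 -> % 5 -> BSDF input
# The bake node writes whatever arrives at ITS input; the downstream
# *2 and % 5 never need to be reversed.

def evaluate_chain(uv_value, bake_record):
    value = uv_value           # UV tex coordinate
    bake_record.append(value)  # bake node stores its input as-is
    value = value * 2          # downstream multiply
    value = value % 5          # downstream modulo
    return value               # what the BSDF sees
```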
About baking. Request/important.
ATM, baking another object to an existing atlas will produce margin overlap (when the new margin overlaps the old one).
Desired solution: the ability to bake a mask that will prevent this effect.
ATM I am forced to join all high-poly and all low-poly parts into single meshes to bake with wide, non-intersecting margins.
Pre-joining is an expensive operation and is not always possible, for example when some parts are baked from the high-poly mesh and other parts are baked from themselves.
It is also faster to change, fix and re-bake only a small part of the atlas.
So that is why I suggest baking a mask (1 sample, no AA, directly from the shader as emission) that would limit the margin for each object and/or each island (optional),
because you may want to fix and re-bake only a small part (a glove, for example).
Each part of the mask would be colored. The mask could be saved separately or cached (fake user?) for subsequent bakes/fixes.
Some Blender add-ons do pre-joining, like BakeLab2 (a pretty cool add-on, clean and nice),
but it still requires a full atlas re-bake.