Always write unused IDs on save #61209

Open
opened 2019-02-05 15:12:57 +01:00 by Bastien Montagne · 144 comments

#61141 raises again the issue of not writing IDs with zero users on file save. This has also been a long-known pain point for artists, especially when starting to work with Blender.

The historical reason for this behavior was that it used to be practically impossible to properly delete an ID while Blender was running, and undoing such a deletion was impossible. Those issues have been fixed for several years now.

Proposal

    • Do write datablocks with zero users on save (the only exceptions could be library and linked data-blocks, I think?).
    • When deleting an ID, also delete its 'dependencies' if they are no longer used at all.
      • Main case: when deleting an object, also delete its object data, shape keys, particle systems, ... (this solves most of the 'free large amounts of memory' issue)
      • Likewise, when deleting any ID, also delete its animdata's Action if it is not used anywhere else.
      • These could also be options in the user preferences (similar to what we currently have to control what gets duplicated).
    • Make Purge accessible from the File menu and the Blender File view in the outliner. (done)
    • Add an option to the Preferences to run recursive purge on file save (see also #87490 (Recursive Purge Orphans datablocks on file save)).
    • Changes to the Purge operator itself:
      • Make purge recursive (i.e. also delete IDs that are currently used, but would have no users once currently unused ones get deleted). (done)
      • Before it runs, show a popup with info on the data types that will be deleted (5 materials, 3 meshes, etc.) and allow users to confirm or cancel. (done)
      • Ideally, it should also allow an expanded view to see exactly which datablocks?
      • Add an option to select which data types to purge.
      • After running, show an "X datablocks deleted" message in the status bar. (done)
      • Make it not save and reload? (done)
      • Support undoing ID purge/deletion. (done)
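The recursive purge item above can be sketched as a simple fixed-point loop over a toy user-count graph. This is a hypothetical Python illustration of the algorithm only (the names, the `uses` mapping, and the `protected` set are all made up here), not Blender's actual C implementation:

```python
def recursive_purge(ids, uses, protected=frozenset()):
    """Delete IDs whose user count is zero, repeating until stable.

    ids:       set of deletable ID names
    uses:      maps any ID (deletable or not, e.g. the scene) to the IDs it references
    protected: IDs exempt from purging (e.g. fake-user or asset-marked ones)
    Returns the set of purged IDs.
    """
    alive = set(ids)
    purged = set()
    changed = True
    while changed:
        changed = False
        # An ID's user count = number of surviving referencers, where
        # non-deletable roots (like the scene) always count as alive.
        users = {i: 0 for i in alive}
        for src, deps in uses.items():
            if src in alive or src not in ids:
                for dep in deps:
                    if dep in users:
                        users[dep] += 1
        for i, n in users.items():
            if n == 0 and i not in protected:
                alive.discard(i)
                purged.add(i)
                changed = True
    return purged

# Illustrative data: an unused material ('OldMat') is the only user of a texture.
uses = {
    "Scene":  ["Cube"],
    "Cube":   ["CubeMesh", "Metal"],
    "OldMat": ["NoiseTex"],
}
ids = {"Cube", "CubeMesh", "Metal", "OldMat", "NoiseTex"}
```

A single non-recursive pass would only catch `OldMat`; the loop is what also frees `NoiseTex`, which only becomes unused once `OldMat` is gone.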

About Fake User

    • Tweak the ID template: remove the 'fake user' button, and have an 'unlink' button (X icon) and a delete button (trashcan icon).

Alternative 1

    • Keep fake user mostly for use in the Outliner (the orphaned view allows batch deletion via the purge button while keeping some IDs protected).
    • Enable fake user by default for some data-block types:
      • Materials, Textures, Images, Node groups.
      • Brushes (already done iirc), Palettes…
      • Others?

Alternative 2

    • Fully remove Fake User, and instead rely on the new marked as asset feature to protect IDs from purge deletion.
      • This would imply a do_version to convert existing 'fake users' to 'marked as assets' IDs.
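Such a do_version pass could look roughly like the following sketch. Everything here is a simplified stand-in: the `LIB_FAKEUSER` value, the `ID` class, and the `asset_data` field are illustrative only, not Blender's actual DNA layout or versioning code:

```python
LIB_FAKEUSER = 1 << 9  # illustrative flag value, not Blender's actual one

class ID:
    def __init__(self, name, flag=0):
        self.name = name
        self.flag = flag
        self.asset_data = None  # becomes non-None once marked as asset

def do_version_fake_user_to_asset(all_ids):
    """On file load, convert the legacy 'fake user' flag into an asset mark."""
    for id_ in all_ids:
        if id_.flag & LIB_FAKEUSER:
            id_.flag &= ~LIB_FAKEUSER  # drop the legacy flag
            if id_.asset_data is None:
                id_.asset_data = {"note": "converted from fake user"}

ids = [ID("Metal", flag=LIB_FAKEUSER), ID("Cube")]
do_version_fake_user_to_asset(ids)
```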

Notes

  • Code-wise, changes should be very minimal, unless we have to do a lot of UI work (which I would not expect).
Bastien Montagne self-assigned this 2019-02-05 15:12:57 +01:00
Author
Owner
Added subscribers: @mont29, @WilliamReynish, @brecht, @ideasman42, @Sergey

My main concern here is that we end up in a situation where users have to press Purge often to keep memory usage under control. And then if they learn to do that, they still end up accidentally losing data in the same way as before.

I think we could tweak behavior a bit depending on the data type:

  • Enable fake user by default for materials, textures, images, node groups and maybe a few others.
  • Alternatively, if we keep unused IDs, give the Purge operator settings to delete only certain data types, and disable materials and similar by default.
  • When deleting an object, also immediately delete its object data, shape keys, action if they have no other users. This is probably what users expect to happen and saves memory immediately.
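The last point, cascading the deletion of an object to its now-unused dependencies, can be illustrated with toy reference counting. All names here are assumptions for illustration, not Blender's API:

```python
from collections import Counter

def delete_object_cascade(obj, objects, refs):
    """Delete obj and any of its data IDs that no remaining object uses.

    objects: list of object names; refs: maps an object to the data IDs it
    references (mesh, shape keys, action, ...). Returns the set of freed IDs.
    """
    objects.remove(obj)
    # Recount users of every data ID from the remaining objects.
    users = Counter(dep for o in objects for dep in refs.get(o, ()))
    return {obj} | {dep for dep in refs.get(obj, ()) if users[dep] == 0}

# Cube and Cube.001 share a mesh; only Cube has an action.
objects = ["Cube", "Cube.001"]
refs = {"Cube": ["CubeMesh", "CubeAction"], "Cube.001": ["CubeMesh"]}
freed = delete_object_cascade("Cube", objects, refs)
```

Deleting `Cube` frees its action (no other user) but keeps `CubeMesh`, which `Cube.001` still uses, matching the "if they have no other users" condition above.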

This hopefully then makes running Purge less important in typical cases, while also being a safer operation. There are a few other things that could be improved:

  • Make it accessible from the File menu and Blender File view in the outliner.
  • Before it runs, show a popup with info on the data types that will be deleted (5 materials, 3 meshes, etc.) and allow users to confirm or cancel. Ideally it should also allow an expanded view to see exactly which datablocks will be affected.
  • After running, show "X datablocks deleted" message in the status bar.
  • Make it not save and reload?

Good initiative to take a look at this. Here are my thoughts:

I definitely think we should remove the auto-purge feature. It causes lots of confusion, complexity and loss of user data, which I don't think is acceptable. It's too easy to lose real work, and going through and managing this is a pain.

This is going to sound hyperbolic, but I find the concept of auto-purging downright user-hostile. User data should be treated as sacred, not to be auto-purged, just because a material happens to not be in use.

The biggest problems with Fake Users & auto-purging are:

  • It's difficult to avoid losing data (if you forget to enable Fake User for a datablock)
  • It's actually difficult to remove something intentionally, because you first have to make sure it has zero users.

So, it's hard to keep data and hard to remove data with this system. I think the entire concept of Fake User should be abolished. If we don't do auto-purge, it is not needed anyway.

To me it seems as if we have an opportunity to change this while we introduce the Asset Manager. As part of the Asset Manager design doc, we came up with the idea of having a category for the current blend file, to give users an overview of all the datablocks inside the current blend:

Screenshot 2019-02-05 at 18.19.32.png

Here we could add a section called 'Unused Data', which displays all datablocks which aren't in use. Users can then purge this data from here.

This kind of thing would be helpful if we make this change, to give users a better overview of what data they have inside the file.

So, my proposal would be to bundle this behavior change in when the Asset Manager is added, to give extra clarity for users who can get a better overview of their document data and usage.


The outliner already has an "Orphaned Data" view for this. Maybe it should be renamed to "Unused Data", or folded into "Blender File" somehow (which could also use a rename).

I'm not sure the asset manager should become a place to manage Blender data in the current file.


Yes, the Orphaned Data in the Outliner is ok, but doesn't give very much information about who is using the data where.

The nice thing about a section like that in the Asset Manager is that users might want a nice overview of their materials/node groups/worlds/textures etc. inside the current document anyway, as part of the asset manager. Not all projects need external libraries or folder structures. Most other 3D DCC apps have some way to get a visual overview of your materials, textures etc.

And once we have that, we can use it for this purpose too. The nice thing is that, if you were to select a datablock here, we could have a way to tell users where it is being used, by which object, etc.

The Asset Manager is strictly speaking a separate thing, but it can be used to get a better overview of user data, which is exactly what is needed if we get rid of auto-purge and fake users.

So it just happens to be good timing, that's all.


Added subscriber: @DuarteRamos

Member

Added subscriber: @jendrzych


Added subscriber: @michaelknubben


@WilliamReynish I definitely agree. Having the current file visible in the asset manager also means there's now an easy overview of all used materials, or all of your meshes, and an easy way to mark them for storage in the Asset Manager itself.

Contributor

Added subscriber: @Rawalanche

Contributor

Keep in mind that this needs to be managed on a per-case basis, and some common sense needs to be used. Generally, the way this is handled should be aligned with users' general expectations. Let me show you a few examples:

1:
User spends some time creating the material.
User decides not to apply the material yet.
User saves the file.
User re-opens the file and material is gone.
That is wrong.

2:
User creates a material by accident.
There is no simple, straightforward way to delete the material except one very obscure mode of the outliner, or restarting Blender.
That is wrong.

3:
User creates a mesh model in the scene.
Later in the process, user no longer needs the mesh.
User deletes the mesh object from the scene.
User saves the scene, and proceeds to continue working on the scene file until the mesh object deletion is pushed out of the undo stack.
The mesh data still remains in the scene file and gets deleted only on scene close/reload.
That is wrong.

4:
With the newly proposed behavior, when the user deletes all objects referencing a particular mesh datablock, the mesh datablock still stays in the scene. Meshes especially are quite heavy datablocks.
This now requires the user to frequently drop their work to do the chore of manually cleaning up unused mesh datablocks.
That is wrong.

As the few examples above show, William's idea of treating user data as sacred cannot be applied in the same manner to all the different types of data blocks.

It all comes down to the main underlying issue: Blender treats both low-level and high-level data as data blocks, even though the two require quite different workflows. In many other packages, users do not even come into contact with low-level data. In Blender they do. It's sometimes a blessing, though mostly a curse.

Here's how it should work:
High-level data, such as scene objects, materials or texture maps, should be managed explicitly. Meaning they can be created and deleted by users, and they never get purged automatically. True deletion of these should be easily accessible from the same place they are created and managed.

Low-level data such as mesh data should be managed implicitly. Meaning that, for example, as soon as a mesh data block becomes orphaned it gets purged; not immediately, but as soon as it drops out of the undo stack. Otherwise, yes, users will run into memory issues.
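The "purge once it leaves the undo stack" idea could be modeled roughly like this. This is a sketch only, with invented names; Blender's actual undo system works quite differently:

```python
from collections import deque

class UndoHistory:
    """Keep orphaned heavy data alive only while an undo step can restore it."""

    def __init__(self, max_steps):
        self.steps = deque(maxlen=max_steps)  # oldest steps fall off automatically

    def push(self, referenced_ids):
        self.steps.append(set(referenced_ids))

    def purge_orphans(self, orphans):
        """Return the orphans no remaining undo step references (safe to free)."""
        reachable = set().union(*self.steps) if self.steps else set()
        return {o for o in orphans if o not in reachable}

history = UndoHistory(max_steps=2)
history.push({"CubeMesh"})                  # step in which the mesh was deleted
kept = history.purge_orphans({"CubeMesh"})  # still restorable, so nothing freed
history.push(set())
history.push(set())                         # deletion step pushed out of the stack
freed = history.purge_orphans({"CubeMesh"}) # now safe to free
```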

So again, the source of most of these issues is that Blender treats both high-level and low-level data as the same type of data block, despite the two requiring different kinds of management.

If you think about it, unless the user decides to pack external assets within the blend file, mesh data is really the only data block type that has any significant size. Most of the other data block types are procedural. Mesh datablocks are probably the only ones that require the kind of implicit management that is present in Blender now. I'd perhaps go as far as to suggest users should not be able to interact with mesh data at all; it should happen under the hood.

Right now, the exposure of mesh data to users only brings issues, especially having to name both scene objects and mesh data blocks, since exporters mostly use the names of mesh data blocks rather than of scene objects.

After all these measures, the dreaded fake user button can finally go away.

Author
Owner

I’d rather leave the asset thing outside of this discussion; though related, it's still kind of off-topic (and should not be used to do current .blend file management; IDs and assets are not the same things). :)

@brecht your propositions make sense… I wasn’t even aware that Purge was doing a mere save/reload; that can be changed immediately to proper on-the-fly ID removal, will do that right away.

Author
Owner

Edited task description to add suggestions from @brecht, and to reflect work already done on Purge operator.

Also fixed a crash when undoing ID deletion (since yes, new code should allow to undo ID deletion/purge ;) ).


@mont29 As I understand, your proposal is to only display the Fake User toggle inside the Outliner. If Fake User must be kept, which I'm not sure it must, then I agree that demoting it to the Outliner is nicer. Because then at least Fake User is not something most users ever have to worry about.

I think it would be a shame not to take the opportunity to simplify things for users. If the IDs are all saved in the blend file, there's no need to keep clicking on the Fake User button. Removing this toggle from the ID selector is a nice way to simplify things for users.


I think Fake User could be removed, but only if we add options to the Purge operator to specify types of datablocks to delete. Because you still need a way to reclaim memory (at least until deeper design changes happen), but without deleting datablocks like materials.

As for icons, I suggest to have:

  • Remove from list: -
  • Unlink: X
  • Delete: trashcan
Member

Regarding the "-" (remove from the list) and "X" (unlink): let's imagine that the Fake User toggle is demoted to the Outliner and all IDs are kept. What's the difference between those two functions then? Wouldn't the "unlink" be enough? This part of the UI seems counterintuitive from my point of view.

Author
Owner

@jendrzych I think @brecht was speaking more at the general UI level: - is to remove entries from lists (aka UI lists, like for vgroups, UVMaps, etc.), while X and trashcan would be for IDs (unlink removes the ID usage represented by the IDTemplate UI but does not remove the used ID itself, while trashcan fully deletes that used ID). Also, probably for historical reasons it's better that way (we have used X to unlink IDs for ages).

@WilliamReynish yes, fake user would be for the outliner only. I would keep it though; it's a nice way to fine-grain-select the IDs you want to protect from Purge (even if/when we add a by-type option to purge). And thinking forward a little bit, it could also be a handy way to import existing .blend files used as libraries into some asset management system.
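The unlink-vs-delete distinction described above can be shown with a toy user-count model. This is a simplified illustration with invented names, not Blender's real API:

```python
class Datablock:
    def __init__(self, name):
        self.name = name
        self.users = 0  # how many slots currently reference this block

def link(owner, slot, block):
    unlink(owner, slot)  # release whatever was there before
    owner[slot] = block
    block.users += 1

def unlink(owner, slot):
    """The 'X' button: drop one usage; the block itself survives."""
    block = owner.get(slot)
    if block is not None:
        block.users -= 1
        owner[slot] = None

def delete(block, owners):
    """The trashcan button: remove every usage, so the block is fully gone."""
    for owner in owners:
        for slot in list(owner):
            if owner[slot] is block:
                unlink(owner, slot)

metal = Datablock("Metal")
cube, sphere = {"material": None}, {"material": None}
link(cube, "material", metal)
link(sphere, "material", metal)
unlink(cube, "material")      # one usage removed, block still has a user
delete(metal, [cube, sphere]) # every usage removed, all slots cleared
```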


@mont29 Ok, in that case I don't mind keeping the Fake User then. We could even probably rename it to 'Resist Purge' , 'Don't Purge' or 'Exclude from Purge'. Something like that.

The term 'Fake User' is also just a confusing name - both the term 'fake' and 'user' don't make any intuitive sense in this context at all.

Contributor

In #61209#617807, @WilliamReynish wrote:
@mont29 Ok, in that case I don't mind keeping the Fake User then. We could even probably rename it to 'Resist Purge' , 'Don't Purge' or 'Exclude from Purge'. Something like that.

The term 'Fake User' is also just a confusing name - both the term 'fake' and 'user' don't make any intuitive sense in this context at all.

To be honest, there's no other software I know of that has any concept of purging. It's a concept unique to Blender. All the truly modern pieces of software dedicated to digital creation manage these types of data automatically. So if you are a new user and expect Blender to be just another digital creation software, then a button with a tooltip saying "resist purge" will be just as confusing as a button saying "fake user". No win here.

If you want to keep this feature (which I am against), and want to introduce this unusual mechanism to new users, then I would rename Purge to "Data Cleanup" and label the button something like "Ignore Data Cleanup".


Remember we are not talking about auto-purging, but manual purging inside the Outliner or Asset Browser.

If we are to keep the concept of manually purging unused datablocks, which I think is reasonable and useful, then a toggle to make items resist this purging is OK. We just should not call it Fake User then - that name no longer makes any sense (if it ever did).

Unlike the current Fake User toggle, this toggle is not something users would have to worry about, unless they do manual purging of unused data.

As @mont29 suggests, it will only be visible inside the Orphaned Data section in the Outliner.

Contributor

Alright, I will wait and see how it works out in practice :)


Added subscriber: @PetterLundh


@mont29:

Likewise, when deleting any ID, also delete its animdata's Action if not used anywhere else.

Eh, I'm not sure about that. Just like with materials, users might build up a library of actions. Deleting objects should not delete their actions, otherwise you will lose important user-created data.

Author
Owner

'Fake User' is actually an excellent example of naming that totally makes sense at the technical level (how it is implemented in code), but is absolute nonsense for any normal user. :) Regarding a new name, I would not reference purge here; something as simple as 'always keep', or 'don't auto-delete', or in that area? Anyway, this is a bit of a detail for now ;)

@WilliamReynish thing is, actions can also become very heavy at some point, and you could have that argument for meshes as well anyway… I'd still expect the 'Fake User' flag (whatever would be its new name) to be set on those data-blocks then. On the other hand, I wouldn't mind adding a setting to control which ID types to auto-delete either.

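A tiny sketch of what such a per-ID-type setting could look like, in pure Python. All names here are hypothetical illustrations of the idea, not the Blender API or any agreed design:

```python
# Hypothetical per-ID-type preference: which dependency types are freed
# together with a deleted ID (as suggested above), and which are kept.
AUTO_DELETE = {"mesh": True, "shapekey": True, "action": False, "material": False}

def deps_to_free(deps):
    """deps: list of (name, id_type, remaining_users) after the owner is gone.
    Returns the names that the preference says should be freed too."""
    return [n for n, t, u in deps if u == 0 and AUTO_DELETE.get(t, False)]

deps = [("Cube.mesh", "mesh", 0), ("WalkCycle", "action", 0), ("Metal", "material", 2)]
print(deps_to_free(deps))   # -> ['Cube.mesh']  (the action is spared, the material is still used)
```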
Those names would be fine with me too. And yes, I do understand the technical reason for the old name, but you have to have quite an intimate knowledge of the technical details for it to make sense.

As for actions:
Currently, actions actually do have Fake User enabled by default, at least armature actions do. Although in this new proposal I thought the idea was that it would not auto-purge, so there's no need for Fake User to be enabled for anything, surely?

If your file gets too big you could always go and purge unused.

Enabling Fake User by default for certain things doesn't seem to make sense in this new world where auto-purging goes away. It seems arbitrary that some things are born with fake user enabled, and it's not necessary if Blender stops deleting user data without their consent.

Member

Added subscriber: @Poulpator

Added subscriber: @SebastianWolf

Added subscriber: @Rickyx

"A computer shall not harm your work or, through inaction, allow your work to come to harm".

I always teach my students how to use the fake user button (on meshes, actions, etc.) and everyone appreciates that Blender takes care of delaying the deletion of user data:
we, my colleagues and I, approve that Blender does not delete datablocks itself. Having a purge command that makes clear what you purge would be a nice improvement, making explicit what happens.

Moreover, I am convinced that there is no series-A data (objects, materials) or series-B data (meshes, or other): it would only create confusion not to have a unique behavior.
We believe that Blender's linking (e.g. object-mesh-material-texture-image, etc.) is unique and better than that of other software, giving artists more freedom through explicit commands.

@Rickyx Nice Jef Raskin quote.

I also don't find it completely clear why some ID types are to be given 'fake users' by default and others not. I don't see how we can rationalise what data the user prefers to keep from purging.

The user may have created a material or texture by mistake and wish to purge materials - why do they have fake user enabled but other things not? I don't get that.

The good thing is that fake users aren't needed to keep Blender from lynching your work.

@WilliamReynish Ton suggested I read The Humane Interface 12 years ago: super, you got it right away!

Taking the precautionary approach, I suggest considering all datablocks equally important (I don't think any modern PC gets bogged down because I created material.001 by mistake). There is no rationale for a data hierarchy: too many hours are spent creating 3D content (actions, textures, node groups, armatures, texts, worlds, etc.).

I do agree with removing fake user and creating a purge procedure.

E.g. Inkscape does that: File -> Clean up Document.
If you have created a pattern and removed the "father" object, the pattern stays in the XML until you manually purge: unused definitions go away. And it is clever.
From the guide:
"Many of the no-longer-used gradients, patterns, and markers (more precisely, those which you edited manually) remain in the corresponding palettes and can be reused for new objects. However if you want to optimize your document, use the Clean up Document command in File menu. It will remove any gradients, patterns, or markers which are not used by anything in the document, making the file smaller. "

Added subscribers: @william-70, @fabioroldan

I agree with @WilliamReynish , a better user experience would be to manage everything from the asset manager.

Member

@brecht - You ask, You get.
![purgre-remove.png](https://archive.blender.org/developer/F6631401/purgre-remove.png)

In #61209#619305, @WilliamReynish wrote:
@Rickyx Nice Jef Raskin quote.

I also don't find it completely clear why some ID types are to be given 'fake users' by default and others not. I don't see how we can rationalise what data the user prefers to keep from purging.

The user may have created a material or texture by mistake and wish to purge materials - why do they have fake user enabled but other things not? I don't get that.

Currently if you delete objects, end up with unused object-data which can take a lot of memory & space in the blend file.

Brecht suggested: "When deleting an object, also immediately delete its object data..." ~ if that's done it's not an issue.

If you change the mesh an object references some other way (besides explicitly deleting), it's unlikely you will want it to be kept.

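The "delete an object's now-unused data together with it" behavior discussed above can be modeled in a few lines. This is a toy reference-counting sketch, not the real Blender API; all names are made up:

```python
# Toy model (NOT the bpy API) of the proposed behavior: deleting an ID
# also frees its dependencies once their user count reaches zero, unless
# an "always keep" flag (today's Fake User) protects them.

class DataBlock:
    def __init__(self, name, deps=()):
        self.name = name
        self.deps = list(deps)          # datablocks this ID references
        self.users = 0                  # reference count
        self.use_fake_user = False      # "always keep" flag
        for d in self.deps:
            d.users += 1

def delete(block, pool):
    """Remove `block` and recursively free dependencies with zero users."""
    pool.discard(block)
    for dep in block.deps:
        dep.users -= 1
        if dep.users == 0 and not dep.use_fake_user and dep in pool:
            delete(dep, pool)

mesh = DataBlock("Cube.mesh")
mat = DataBlock("Metal.mat")
mat.use_fake_user = True                # the user asked to keep this one
obj = DataBlock("Cube", deps=[mesh, mat])

pool = {mesh, mat, obj}
delete(obj, pool)
print(sorted(b.name for b in pool))     # -> ['Metal.mat']
```

The mesh disappears with its object, while the flagged material survives with zero users, which is exactly the split the thread is debating.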
Member

This comment was removed by @jendrzych

@ideasman42 it may happen that you delete an object but don't want to delete its material.
So we come back to the same question: is there a hierarchy between the data? Why keep my object's material but not my mesh, or my walk-cycle animation, so carefully crafted?
And my node-group?

What if I remap the mesh (read: material, texture, data-block) of an object? Will the old one be deleted as soon as it is unlinked?
The data-block linking/keeping is so powerful!

If necessary I can provide some real usage cases of keeping data and work.

Thank you,
Riccardo

@Rickyx not sure about the situation w/ materials, I assumed it would have a fake-user.

Responding to the point @WilliamReynish was making about why you might want some data-blocks to have fake users and not others.

If some object data isn't used anywhere (which can happen, even if it's less likely once ob-data is removed on deletion), it's unlikely you will want to keep it. Hence there is a reason you might not want fake users for object-data by default, but have them for materials.

Added subscriber: @0o00o0oo

Contributor

How's this moving along? :) Given the huge value of this in everyday workflows of pretty much everyone, I think it deserves priority.

Priority is fixing bugs and getting 2.80 released with the planned targets.

@Rawalanche: in general, please don't do "thread bumping" on developer.blender.org. If all users start doing this it will be a mess.

Contributor

In #61209#632775, @brecht wrote:
Priority is fixing bugs and getting 2.80 released with the planned targets.

@Rawalanche: in general, please don't do "thread bumping" on developer.blender.org. If all users start doing this it will be a mess.

Okay. I understand.

Member

Added subscriber: @JulienKaspar

Added subscriber: @a.monti

Added subscriber: @justastatue

Added subscriber: @DominikR

Added subscriber: @YegorSmirnov

Added subscriber: @1D_Inc

In #61209#613966, @brecht wrote:

  • Enable fake user by default for materials, textures, images, node groups and maybe a few others.

Does it mean that all the trash will be left in the file if it was once placed there? =)
This violates the "always keep the blend file clean" strategy.

Or is it supposed that I, as a project manager, should clean up all the mess after artists instead of the software?

In #61209#615458, @Rawalanche wrote:

With the few examples above, you can't see that Williams idea of treating user data as sacred can not be applied in the same manner to all different types of data blocks.

Agree. Nice examples by the way.

Here's how it should work:

The problem is that this cannot be expressed with a simple rule like "everything that is not used will be removed."
As a result you have to remember what will be purged and when, which will make management difficult.

Right now, exposure of mesh data to users only brings issues, especially having to do naming for both scene objects and mesh data blocks, since exporters mostly use names of mesh data blocks rather than scene objects.

No, mesh data allow to control instancing, and exposes real objects structure. Hiding it will bring the same confusion like in max or maya, where instances management is pretty painful.

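For reference, the "make purge recursive" item from the proposal amounts to deleting zero-user IDs until a fixpoint, so IDs kept alive only by other unused IDs go too. A minimal sketch with toy data (this is illustrative Python, not the Blender API):

```python
# Toy recursive purge: keep deleting zero-user datablocks until nothing
# changes. `keep` stands in for the "always keep" / fake-user flag.

def recursive_purge(users, deps, keep=frozenset()):
    """users: name -> user count; deps: name -> names it references.
    Returns the names deleted, in deletion order."""
    users = dict(users)                 # don't mutate the caller's data
    deleted = []
    changed = True
    while changed:
        changed = False
        for name in sorted(users):
            if users[name] == 0 and name not in keep:
                for dep in deps.get(name, ()):
                    users[dep] -= 1     # release references held by `name`
                del users[name]
                deleted.append(name)
                changed = True
    return deleted

# An unused object keeps its mesh and action alive; a single pass would
# free only the object, the recursive purge frees the whole chain
# (except the action the user flagged to keep).
deps = {"Cube": ["Cube.mesh", "WalkCycle"], "Cube.mesh": []}
users = {"Cube": 0, "Cube.mesh": 1, "WalkCycle": 1}
print(recursive_purge(users, deps, keep={"WalkCycle"}))   # -> ['Cube', 'Cube.mesh']
```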
On massive projects, gigabytes of potential trash pass through our files daily during the creation process.
What is the point of keeping it in the file?
Having 200 GB blend files instead of 2?

@1D_Inc Totally agree that "mesh data allow to control instancing, and exposes real objects structure" is a massive advantage and a key improving difference from other software.

If this topic is still active...
after much thought this year, the countdown is not such a bad system: it allows a slow elimination of obsolete content, keeping a balance between distracted users and huge files.

Author
Owner

We already have tools in the outliner to batch-delete orphaned data-blocks; the proposal here is rather to not do that implicitly on .blend file save, which is fairly unexpected behavior (especially for new users)...

Furthermore, keeping too much trash data is always better than losing valuable data, imho.

In #61209#1079385, @mont29 wrote:
Further more, keeping too much trash data is always better than losing valuable one imho.

You stop losing valuable data when you start to care about it.
Those are basics that deserve to be learned before starting any kind of massive production, because there is no other way to stop massive pollution there.
And there is no simple way to learn it, so it had better be done at the very beginning, when there is nothing valuable anyway.

Added subscriber: @ErickNyanduKabongo

In #61209#1079391, @1D_Inc wrote:

You stop losing valuable data, when start to care about it.

The problem is they come from a different culture and they don't understand how practical the actual Blender structure is, which is sad :(

In my opinion the problem is not automatic garbage collecting, but the lack of notification (which could be optional, so as not to annoy experienced users who have already mastered such a workflow tool as controlled losses).

Also, considering garbage collection, it would be nice to have a recursive cleanup to perform garbage cleanup faster.

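The notification idea above, and the proposal's "show a popup with info on data types that will be deleted (5 materials, 3 meshes, etc)", could be as simple as counting zero-user blocks per type before anything is removed. A hedged pure-Python sketch with made-up data (not the Blender API):

```python
# Summarise what a purge WOULD delete, grouped by ID type, without
# deleting anything - the text a confirmation popup could show.
from collections import Counter

def purge_report(blocks):
    """blocks: iterable of (name, id_type, users) tuples."""
    counts = Counter(t for _, t, u in blocks if u == 0)
    return ", ".join(f"{n} {t}" for t, n in sorted(counts.items()))

blocks = [
    ("Cube.001", "mesh", 0),
    ("Sphere", "mesh", 0),
    ("Metal", "material", 0),
    ("Wood", "material", 3),   # still used, so not reported
]
print(purge_report(blocks))    # -> "1 material, 2 mesh"
```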
Contributor

No, the main problem is that there is no proper, easy to access and easy to use UI space to manage datablocks. That's exactly why I wrote my post in the datablock selector redesign thread. Of course no one is going to care about datablock management when the entire datablock management is hidden inside one, clumsy, frustrating to use outliner mode. That's why asset manager should be a datablock manager, primarily. You are doing your best to justify Blender's convoluted way of managing datablocks, while game engines such as Unreal or Unity have been proving for over a decade now, that asset manager based data asset management is the best, easiest and most straightforward way of doing this task.

Users themselves should decide whether or not the unused materials should be removed, because only users themselves know which materials they want to remove, and which ones they want to keep for later, even unused. No AI, let alone a simple heuristic based on references, can decide that for the user. But if you have such a bad solution as Blender currently has, where Blender keeps unused materials but does not give users any acceptable user interface to manage them, you end up with exactly the dumpster fire we have now.

In #61209#1079561, @Rawalanche wrote:
Users themselves should decide whether or not should the unused materials be removed or not, because only users themselves known which material they want to remove, and which ones they want to keep for later, even unused.

They already do this, by flagging datablocks.
However, from our management experience, artists never know what they really need, except everything at once. This is a huge problem.

No, the main problem is that there is no proper, easy to access and easy to use UI space to manage datablocks.

Sounds like a lot of design work.

Contributor

In #61209#1079638, @1D_Inc wrote:

In #61209#1079561, @Rawalanche wrote:

They already doing this, by flagging datablocks.
However, from our management experience, artists never know what they really need, except everything at once. This is a huge problem.

Yes, because something as trivial as material management should not have any learning curve, let alone one as confusing as Blender's. Blender is one of the very few 3D software packages where material management is something that is complicated.

In #61209#1079638, @1D_Inc wrote:

In #61209#1079561, @Rawalanche wrote:
No, the main problem is that there is no proper, easy to access and easy to use UI space to manage datablocks.

Sounds like a lot of design work.

No, not at all. I mean it's a lot of design work if you overcomplicate it, as is tradition for Blender. But otherwise, it can be as simple as having a visual editor which can browse the currently active .blend file, very similar to how the append/link dialog can:
![image.png](https://archive.blender.org/developer/F9521809/image.png)
![image.png](https://archive.blender.org/developer/F9521812/image.png)
Then, all that would be needed is a function to display the number of references for each datablock, so the user knows whether it's used. That way users could manage their materials and other datablocks as easily as they manage files in the file browsers of Windows and Linux. Any materials or other datablocks without references would simply get a little warning icon in the corner indicating they are unused. And alongside that, this asset manager would have a simple operator to remove unused assets, with 3 modes:

  1. Remove in the entire .blend file, similar to current purge.
  2. Remove in current asset browser folder.
  3. Remove in current asset browser folder and all its subfolders.

There, it's that simple. A material and other datablock management with close to 0 learning curve and/or confusion. And then, fake user could finally go to hell, where it belongs.


In #61209#1079561, @Rawalanche wrote:

Blender is one of the very few 3D software packages where material management is something that is complicated.

I am not sure, because I remember multimaterial in 3dsmax. You also have explanation video about its material editors on your youtube channel. That was a learning curve.

it can be as simple as having a visual editor which can browse currently active .blend file, very similar to how append/link dialog can:
![image.png](https://archive.blender.org/developer/F9521812/image.png)

M_Building_windows? Such self-explanatory information.
Have you worked with BIM imports? There will mostly be things like Building_Windows_025645656 and Building_Windows_778546588.
This is not even trash on your screenshots, so you can manage it by names in lists.

Removed subscriber: @a.monti

Contributor

In #61209#1079682, @1D_Inc wrote:

In #61209#1079561, @Rawalanche wrote:

Blender is one of the very few 3D software packages where material management is something that is complicated.

I am not sure, because I remember multimaterial in 3dsmax. You also have explanation video about its material editors on your youtube channel. That was a learning curve.

WTF are you talking about? That video is over 5 years old. I have not used 3ds Max for over 3 years now, and when I said "Blender is one of the very few 3D software packages where material management is something that is complicated" I'd also count 3ds Max among those.

it can be as simple as having a visual editor which can browse currently active .blend file, very similar to how append/link dialog can:
![image.png](https://archive.blender.org/developer/F9521812/image.png)

M_Building_windows? Such a self-representing information.
Have you worked with BIM imports? There will be mostly things like Building_Windows_025645656 and Building_Windows_778546588
This is not even trash on your screenshots.

So? What's your point? There's absolutely no reason the asset manager should not have options to change how entities within the folder get displayed:
![image.png](https://archive.blender.org/developer/F9521836/image.png)
Exactly in the same way it can be changed in the asset browsers of mainstream game engines or the file browsers of mainstream operating systems. Pretty much anything beats the horribleness of delegating the entire scene-wide material management to one dropdown UI element with a modal popup and a vertically scrolling floater.

Added subscriber: @APEC


Can this option be toggled somewhere in preferences (Save & Load is a perfect place)? Checkbox with option to save/purge trash data when saving file?


In #61209#1079710, @APEC wrote:
Can this option be toggled somewhere in preferences (Save & Load is a perfect place)? Checkbox with option to save/purge trash data when saving file?

No. If you send a file to your team, they will purge its content if they have a different setup. (Teamwork issue.)

Author
Owner

Added subscriber: @all

Author
Owner

@ All Please remember that this tracker is not a forum, but a tool to help the development team.

Please strictly stick to the topic of this task, and avoid endless ping-pong wall-of-text discussions, those should happen elsewhere.

Author
Owner

Removed subscriber: @all


This comment was removed by @1D_Inc


That was even more off-topic; please respect the rules of this website. developer.blender.org is the workplace of Blender developers. Please treat it with the same kind of respect as would be expected when walking into an office where developers are working.


In #61209#1080029, @brecht wrote:
Please treat it with the same kind of respect as would be expected when walking into an office where developers are working.

Sure. I am also an add-on developer, and I know how hard this work is and how important concentration is.
But please be careful with workflow design, otherwise you will completely push us out of production.


Design process in a nutshell, pretty accurate.
Over the course of several iterations, when important data becomes garbage and garbage becomes important at any moment, sooner or later there comes a point when it simply cannot be taken care of anymore, especially at deadlines.
Project cleanliness in business always has a lower priority than results (because results are what is paid for), and this is a problem of the entire industry.
![image.png](https://archive.blender.org/developer/F9555531/image.png)


Added subscriber: @TheRedWaxPolice

Contributor

Added subscriber: @RedMser


Added subscriber: @finirpar


This whole proposal looks strange. It looks like you are trying to design a part of the engine without considering the whole engine.


I think it's like turning an automatic transmission into a manual.
The manual purge approach is familiar from the Autodesk family of products, along with all of its longstanding problems.


As I understand it, this is because with the new Asset Browser we need to store, for example, an unused material with its nodes, and not have it purged after closing Blender.


The Asset Browser just lets you view some data, and that's it; its presence cannot affect whether unused data is saved in the file.


Just checked, and indeed when we mark an asset it gets a fake user, so why we would need to store all the trash data is unclear.
![2.93.0 AB.png](https://archive.blender.org/developer/F9841669/2.93.0_AB.png)


It is logical when we mark something as an asset in a library file, because this way we confirm that we need it.


I want to try to clarify the issue about auto/manual purge and garbage collection.

The problem is that users purge or optimize things only when the situation turns critical, for example when the HDD is full.
This is a suitable load for regular users (for example, freelancers), but not for companies, which generate a lot of files during production.

This is the same problem we faced in AutoCAD, whose DWG files can also store unused datablocks but where only manual cleanup is provided.
The situation became critical only when corporate hard drives were full, which means there was far more data to purge and clean up.
So Autodesk provided a free application able to batch-purge files, but it rewrites all purged files to a specified version.
Since different versions of AutoCAD contain different subsets of features, and earlier versions contain features that were later cut, batch purge is not a suitable solution, especially considering that tens of thousands of files are generated during corporate production.

The situation is complicated by the fact that, unlike DWG files, which usually weigh 10-20 megabytes each, blend files can easily reach gigabytes, and the amount of unused data can also reach gigabytes per file, which wastes hard drive space extremely fast and is especially critical at the corporate level. Also, the file size hides the amount of unused datablocks it contains, so it is hard to say whether the computer you work on can even open such a file given only its size, turning files into black boxes. Such an issue is present, for example, in 3ds Max, which can save larger files than it can later open on the same computer.

So I want to clarify that proper autopack, autopurge and automatic garbage collection design (like any other massive data management solution across the production lifecycle) is a corporate-level requirement.

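The recursive purge behavior proposed at the top of this task (delete IDs that would have no users once currently unused ones are gone) is essentially a reachability pass over the ID dependency graph. The following is a toy Python sketch of that idea, not Blender code; the ID names and graph are made up for illustration:

```python
# Toy sketch of a recursive purge (NOT Blender code): an ID survives only
# if it is reachable from a root that really keeps it alive (a scene
# object, a fake user, an asset mark). Everything unreachable is purged.
def recursive_purge(ids, deps, roots):
    """ids: all ID names; deps: {id: set of ids it uses};
    roots: ids kept alive directly.
    Returns the set of ids a recursive purge would delete."""
    reachable = set()
    stack = list(roots)
    while stack:
        cur = stack.pop()
        if cur not in reachable:
            reachable.add(cur)
            stack.extend(deps.get(cur, ()))
    return ids - reachable

# A material used only by an unused mesh gets purged together with it:
ids = {"Cube", "Cube.mesh", "OldMesh", "OldMat"}
deps = {"Cube": {"Cube.mesh"}, "OldMesh": {"OldMat"}}
print(sorted(recursive_purge(ids, deps, roots={"Cube"})))  # → ['OldMat', 'OldMesh']
```

Note how "OldMat" is deleted even though it currently has one user ("OldMesh"): that user is itself unreachable, which is exactly the difference between a recursive purge and the current one-level purge.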

Added subscriber: @nikitron


Working in ArchiCAD, I used to resave files due to the garbage it collects. It is a cruel thing. Every half a year, 10 GB files needed to be completely rewritten in teamwork.
I don't want the same in Blender.
My Blender pipeline has some custom purge buttons that clear user-free data, so I don't need to check whether a node tree is unused; I just clean it once. I know what I use, and I fake-user it directly.


There is a huge and massive industry-level problem: project file congestion.
And the point is that Blender's implementation is the only solution on the market that solves that issue in practice.

Yes, it has "issues", but in fact they are just the only way that works.


This comment was removed by @1D_Inc


It is quite a typical project management case.

Person A: writes unused IDs into the project file.
Person B: opens the project file, makes some changes (puts in and replaces heavyweight assets according to design changes), and purges unused data because the file became too heavy.
Person A: "Where is my data? I saved it into the file, it was important."
Person B: "There was some unused data, so I purged it to be able to work with the project file, so I am not even sorry."

There are two solutions after such a situation occurs: always lose unused ID data as a rule for everybody, or never purge anything ever again until the project is finished. The first solution is the current b3d behavior; the second one (or any other) is not possible during production.

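The proposal's second point (when deleting an object, also delete its object data, shape keys, actions, and so on if nothing else uses them) boils down to reference counting with cascade. A minimal Python model of that behavior, with made-up ID names and assuming acyclic use relations; this is an illustration, not Blender's actual implementation:

```python
# Minimal model (NOT Blender code) of "when deleting an object, also
# delete its dependencies if unused": each ID counts its users, and
# deletion cascades to dependencies whose user count drops to zero.
class ID:
    def __init__(self, name, uses=()):
        self.name, self.uses, self.users = name, list(uses), 0
        for dep in self.uses:
            dep.users += 1  # register ourselves as a user of each dependency

def delete(block, alive):
    alive.discard(block)
    for dep in block.uses:
        dep.users -= 1
        if dep.users == 0:  # no one else needs it: cascade the deletion
            delete(dep, alive)

mesh = ID("Suzanne.mesh")
action = ID("WalkAction")
obj = ID("Suzanne", uses=[mesh, action])
alive = {obj, mesh, action}
delete(obj, alive)
print(sorted(b.name for b in alive))  # → [] (mesh and action went too)
```

Crucially, a mesh shared by two objects would survive the deletion of one of them, since its user count only drops to 1; the cascade fires only when data is truly orphaned.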

Added subscriber: @ChinoD


This comment was removed by @ChinoD


Multiple fake users will require lots of unnecessary additional management, at the cost of not solving the original problems: project file congestion, and different people using the unused-IDs trash bin, mixing trash with non-trash.

If some data was marked with a fake user, that means some user already paid attention and marked it in order to preserve it; this is the most important part.

Of course, it is possible to give names to fake users, but from the moment the trash was marked, it is no longer trash available for automated deletion.

Of course, no one will remember what the "Ralph" or "Important" or "Important1111" fake users were about after a month of project work, because fake-usered unused data is dynamic context data that is hard to label by definition: it is a self-representative data type (the context is represented by the data itself) which cannot be properly described by names.


Added subscriber: @Daniel_KL


I also think that auto-faking unused data is a bad idea, because when people tag a fake user, they separate the trash that needs to be lost from the important data that needs to be preserved.
Doing this manually is the only correct way, otherwise it will be difficult later to sort and clean up the autofaked data.

Author
Owner

Added subscribers: @Mets, @eyecandy

Author
Owner

Thanks for the feedback; I updated the design proposal. Note that this is still in the discussion phase, not yet finalized and approved.


In #61209#1119450, @APEC wrote:
Just checked, and indeed when we mark asset it become a fake user, so why need to store all trash data? - unclear.

In #61209#1236390, @Daniel_KL wrote:
Doing this manually is the only correct way, otherwise it will be difficult later to sort and clean up the autofaked data.

That is the original question: defining what is "trash".

I do agree with @Daniel_KL.
For the moment I don't think there is a clear vision of how to filter the garbage from the work (more than 3 years of comments...): this division is the result of work dynamics external to the software, and human intelligence must be used.

Maybe then the problem shifts: the solution is to find the best method to assist the human in deleting, as quickly as possible, what he considers to be junk.

Thank you,
Riccardo

Member

I suggest a pop-up warning when confirming a deletion, to make sure users know what they're deleting. By name when it's a few things, and just the number of things if it's a lot of things (eg. when deleting a collection and all its contents).

If the "Delete" buttons would actually delete things instead of unlinking them, the problem that sparked #87490 would be sidestepped, which is super nice. That means that this proposal would solve #87490 even without the extra option to purge on file save. 👍

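The confirmation popup in the proposal ("5 materials, 3 meshes, etc.") and the suggestion above to fall back to counts when many items are involved amount to a per-type tally of the IDs scheduled for deletion. A hedged sketch in plain Python (the data shapes and names are illustrative, not Blender API):

```python
# Sketch of the purge-confirmation summary (illustrative, not Blender
# API): group the IDs scheduled for deletion by type and report counts.
from collections import Counter

def deletion_summary(doomed):
    """doomed: iterable of (type_name, id_name) pairs scheduled for deletion.
    Returns a short summary string like '2 materials, 2 meshes'."""
    counts = Counter(kind for kind, _ in doomed)
    return ", ".join(f"{n} {kind}" for kind, n in sorted(counts.items()))

doomed = [("materials", "OldMat"), ("materials", "Temp.001"),
          ("meshes", "OldMesh"), ("meshes", "Plane.004")]
print(deletion_summary(doomed))  # → 2 materials, 2 meshes
```

The same `doomed` list could back the expanded view mentioned in the proposal (listing exact datablock names) and the post-run "X datablocks deleted" status message, so all three UI surfaces would share one data source.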
Contributor

In #61209#1245089, @Mets wrote:
I suggest a pop-up warning when confirming a deletion, to make sure users know what they're deleting. By name when it's a few things, and just the number of things if it's a lot of things (eg. when deleting a collection and all its contents).

If the "Delete" buttons would actually delete things instead of unlinking them, the problem that sparked #87490 would be sidestepped, which is super nice. That means that this proposal would solve #87490 even without the extra option to purge on file save. 👍

The last thing Blender needs is more deletion confirmation pop-ups (aside from file management). There should be one simple rule enforced: a deletion confirmation popup should exist ONLY for operations that cannot be undone. That's it. Datablock deletion can usually be undone. Overwriting a file cannot be undone, so a popup is warranted there. Deletion of a material can, so a popup doesn't belong there.

Regarding the overall topic: in my opinion, most datablock types should never be auto-deleted, with very few exceptions, such as mesh datablocks. And the fake user thing should be completely removed.

The reason this mess exists is that, unlike other common 3D software, Blender does not have any proper, easy-to-use place for scene data management. The closest to it is the Outliner in "Blend File" mode. The problem with this solution is that it only allows datablock inspection and deletion, not creation. In other common 3D software, you'd expect some sort of space where you can, for example, both create and delete (not unlink, actually delete) materials in your file, as well as assign them to objects. In Blender, you create material datablocks in this weird, confusing datablock UI element, which you can also unlink them from, but you have to delete them in a completely different place. You also have to visit this completely different place for mass management of multiple datablocks.

I think game engines such as Unreal, Unity, Godot and so on are a great example of how data management should work. There should be some asset browser mode which, instead of browsing the asset library, would also allow you to browse your active blend file and offer complete datablock functionality: creating, renaming, editing, assigning, and deleting, all in one place. This would allow for the following:

  • Removal of fake user workflow
  • Removal of any worry of unintended data loss
  • Removal of the whole purge data concept
  • Removal of messy scenes with many unused fake user datablocks caused by blender not having good tools that would encourage users to care and manage the data stored in their scene files

Unfortunately, addressing this in a way I just described would require bigger changes, because it would open an ugly can of worms of bad legacy design in parts of Blender. For example the fact that material editing is almost exclusively tied to viewport object selection (except the hacky operator to "pin" current material). It would require more fundamental redesign of some concepts, so that you could for example edit materials in material editor regardless of what you have selected. And material assignment would be completely decoupled from material editing.

But I think this would be actually helpful in the end. I mean, how many of us got frustrated when we accidentally changed the material assigned to our object because we switched the active material in the shader editor and forgot it's tied to the active material slot on the active selected object :) It's not a good design.


In #61209#1245135, @Rawalanche wrote:
And the fake user thing should be completely removed.

This is a solution suitable for 3dsmax, Maya, C4D and other software which does not have the ability to pack media into a file yet.

And material assignment would be completely decoupled from material editing.

This will require material IDs, multimaterial, or other legacy 3ds Max solutions that Blender managed to avoid.


Added subscriber: @TheCharacterhero-4


In #61209#1245135, @Rawalanche wrote:

In #61209#1245089, @Mets wrote:

The reason this mess exists is that unlike other common 3D software, Blender does not have any proper, easy to use place for scene data management.

This looks like the system from 3ds Max, where the node editor was put on top of the slots editor, so unlike in Blender it wasn't designed for flexible material editing.
In 3ds Max, in complex scenes you have to clean your slots and pick materials from objects, so it is object-oriented anyway.

And it is very difficult for a beginner to set up 3ds Max materials; in Blender it is much easier to figure out.


Removed subscriber: @ChinoD


Removed subscriber: @justastatue

Contributor

In #61209#1245230, @1D_Inc wrote:

In #61209#1245135, @Rawalanche wrote:
And the fake user thing should be completely removed.

This is a solution suitable for 3dsmax, Maya, C4D and other software which does not have the ability to pack media into a file yet.

And material assignment would be completely decoupled from material editing.

This will require material id, multimaterial or other legacy 3dsmax solutions that Blender managed to avoid.

Please - not just here, but anywhere where both I and you post - just please do not respond to my posts. You never ever do mental work to figure out what I wrote, and just start spewing nonsense.

You did not even read my post, otherwise you wouldn't have written what you wrote. I mentioned game engines as a great example of data management. Game engines usually have some sort of project the data is contained within, both internally and externally. Blender does the exact same thing, with the only exception that the blend file is an internal folder-structure archive which cannot be browsed using the OS file explorer. That's about it. When you look at most game engines, their architecture is quite similar to Blender's, with the only exception that, unlike Blender, they have much better tools for managing these "datablocks", generally in the form of some content browser, which allows you to create them, edit them and delete them from a single place. There is no such thing as a multi-sub-object material, and neither does there have to be in Blender.

The decoupling of material editing from object selection would not change much about the way materials are used on objects in Blender. You could still create material slots on objects exactly the same way you do now, and assign materials to them again in the exactly same way as you do now. The only difference would be that if you changed the active material in the shader editor, it would NOT change the material assigned to the currently active material slot on the object. You could optionally lock them together (in the same way you can sync UV editor selection to 3D view selection), to behave like they do now, but it would not be default.

This could then allow more comfortable workflows, such as that you could drag and drop material from any object material slot onto the shader editor and it would become the active material in the shader editor, and you could drag and drop any material from shader editor onto existing object material slot to assign it. Of course you could still also assign materials to material slots on objects by using the search datablock button, and you could do the same for the shader editor. It's just that these two would not be tied together.

Here's a great example of this ridiculousness: Sometimes I have some materials in the scene that I need to be there for later use, but they are not assigned to any material slot of any object. To be able to edit that material, I first need to create some dummy object in the scene, like a cube, then assign that material to that cube, and only then I can actually use a material editor to edit it. And then delete the temporary cube once I am done. That's just preposterous.

On top of that, this concept is not anything new to Blender. For example, in the Image Editor, when you select an image to view or edit, you are not actually assigning that image to any selected object; you can just edit it regardless of what's selected in the viewport, and the unlink (X) button in the datablock UI element just "clears" it from the Image Editor but does not actually remove its assignment from any object or material. See? That's already possible in Blender without needing any sort of "multi-sub image map". I am proposing that the other editors used for editing datablocks work the same way, so that there is one unified way of editing datablocks, which allows you to edit any datablock you want, whenever you want.


Well, unlike an image, which can be used as a texture or a brush, a material has only one possible host: an object.
Editing a shader without a connection to an object means editing it with no feedback in the 3D viewport.


In #61209#1245135, @Rawalanche wrote:

In #61209#1245089, @Mets wrote:
I suggest a pop-up warning when confirming a deletion, to make sure users know what they're deleting. By name when it's a few things, and just the number of things if it's a lot (e.g. when deleting a collection and all its contents).

Datablock deletion can usually be undone.

I was unclear, I was talking about indirect deletions. If you delete an object, you would sometimes delete the object data along with it, but sometimes not, depending on whether that object data is used by anything else. This is when I think a pop-up should be shown to make sure the user realizes they are deleting things other than what they have selected (which would be the case under the current proposal).
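The indirect-deletion rule under discussion ("delete the object data too, but only if nothing else uses it") can be sketched with a toy reference-count model (assumed structure, not Blender's implementation):

```python
# Sketch of cascade deletion: when an ID is deleted, dependencies whose
# user count drops to zero are deleted too. Toy refcounting, not Blender
# internals; the pop-up discussed above would list the indirect victims.

class ID:
    def __init__(self, name, deps=()):
        self.name = name
        self.deps = list(deps)      # IDs this one uses (obdata, action, ...)
        self.users = 0
        for d in self.deps:
            d.users += 1

def delete(id_, deleted):
    deleted.add(id_.name)
    for dep in id_.deps:
        dep.users -= 1
        if dep.users == 0:          # indirect deletion -> warn the user first
            delete(dep, deleted)

mesh = ID("Mesh.Cube")
obj_a = ID("Object.A", deps=[mesh])
obj_b = ID("Object.B", deps=[mesh])

gone = set()
delete(obj_a, gone)                 # mesh still used by Object.B, so it stays
assert gone == {"Object.A"}
delete(obj_b, gone)                 # now the mesh cascades too
assert gone == {"Object.A", "Object.B", "Mesh.Cube"}
```

The pop-up Mets asks for would be shown exactly when the recursion reaches a dependency with zero remaining users: sometimes deleting an object drags its mesh along, sometimes not.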


In #61209#1245621, @Mets wrote:

In #61209#1245135, @Rawalanche wrote:

In #61209#1245089, @Mets wrote:
I suggest a pop-up warning when confirming a deletion, to make sure users know what they're deleting. By name when it's a few things, and just the number of things if it's a lot of things (eg. when deleting a collection and all its contents).

Datablock deletion can be usually undone.

I was unclear, I was talking about indirect deletions. If you delete an object, you would sometimes delete the object data along with it, but sometimes not, depending on whether that object data is used by anything else. This is when I think a pop-up should be shown to make sure the user realizes they are deleting things other than what they have selected (which would be the case under the current proposal).

Cascade delete is very, very dangerous.
At the beginning of this very long thread it was suggested that only some data types should be cascade-deleted:
there is no rational basis for deciding which (e.g. meshes, materials, actions, etc.).

A pop-up can't work with bulk deletes or big project management (faced with a list of 75 pop-ups, you can't really control what you are deleting...).
RG


In #61209#1245828, @Rickyx wrote:
A pop-up can't work with bulk delete or big project management

It already exists in some sense. I'm just cleaning up the Sprite Fright production files, and the "Recursive Purge Unused Datablocks" operator gives me a number of each datablock type that will be deleted. Of course at very high numbers it becomes pretty meaningless, but even just seeing the datablock types is very reassuring. For example if I just deleted some static meshes and the pop-up tells me I'm about to delete 3 Actions, I may get suspicious, and not go ahead with the deletion until I double-check what those Actions are exactly.

There may be more robust solutions of course, but I think this would be a good start that could be easy to implement.
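The per-type summary Mets describes ("I'm about to delete 3 Actions") is essentially a group-by-type count of the would-be-purged blocks. A minimal sketch, assuming a flat list of (type, name, user count) tuples:

```python
# Sketch of the confirmation pop-up contents: count would-be-purged
# datablocks per type ("2 Material, 1 Mesh, ...") before deleting.
# Data layout here is assumed for illustration, not Blender's API.
from collections import Counter

def purge_preview(datablocks):
    """datablocks: list of (type, name, user_count) tuples."""
    counts = Counter(t for t, _, users in datablocks if users == 0)
    return ", ".join(f"{n} {t}" for t, n in sorted(counts.items()))

blocks = [
    ("Material", "Gold", 0),
    ("Material", "Skin", 2),        # still used -> not listed
    ("Material", "Rust", 0),
    ("Mesh", "Cube", 0),
    ("Action", "WalkCycle", 0),
]
print(purge_preview(blocks))        # -> "1 Action, 2 Material, 1 Mesh"
```

An unexpected type in this summary (an Action after deleting only static meshes) is exactly the cue to cancel and double-check, as described above.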


In #61209#1245888, @Mets wrote:

In #61209#1245828, @Rickyx wrote:
A pop-up can't work with bulk delete or big project management

For example if I just deleted some static meshes and the pop-up tells me I'm about to delete 3 Actions, I may get suspicious, and not go ahead with the deletion until I double-check what those Actions are exactly.

I am not sure how it is possible to check the Actions of a deleted host, but I agree that there is a strong psychological aspect here.

A similar issue comes up, for example, in stock trading: everybody wants to earn 100 USD, but if the rules are slightly different - you still get the same 100 USD while the stock gets 300 - lots of people refuse the deal.
The problem occurs when you know that you are, in a sense, losing 300 USD.

Clearing out a music library, for example, is incredibly difficult, because you have to accept that you will never listen to it again and let it go.
People naturally dislike seeing that they are losing something even more than actually losing it.

There may be more robust solutions of course, but I think this would be a good start that could be easy to implement.

We have been trying to figure this out for quite a decade, but it is really hard.
There are known implementations in other software that are more pleasant to use, but in practice they are quite useless and do not solve the original problem of massive project-file congestion.


Maybe add a check box when saving your blend file.

When unchecked, Orphan Data is deleted.

1.png

When checked, Orphan Data is saved.

2.png

The saved Orphan Data defaults to Fake User.

For team collaborations, we could add the ability to name Orphan Data saves to help prevent team members from accidentally purging each other's saved Orphan Data.

3.png

Or choose from already saved Orphan Data from a list.

4.png

This saved Orphan Data can be purged at any time from the Edit Menu. You can choose which saved Orphan Data to purge.

5.png

The above is pretty much for the first time you save the blend file or do a Save As.

Perhaps include a warning prompt on interim saves if new unsaved Orphan Data was created.

6.png
7.png


This was already discussed; things do not work like that.
If someone sets the save option, they never purge trash, drowning the project files and making the trash untouchable; and if someone sets the purge option, they corrupt files by shredding data that another project member expected to be saved.
This system-design-level problem is not solvable by a setting.


What if orphan data were saved in a separate file?
And if the main *.blend file is loaded without this orphan data file, there would be a message - "Orphan data file is missing. Load blend file with purged orphan data?" (like for missing textures).
A blend file loaded without its orphan data file would then match a blend file saved after a manual purge.


100 project files would generate 200 total files.
Every version increment would have a corresponding copy.
Too much to maintain.


It already creates two files: *.blend and *.blend1.
Let it save orphan data to the .blend1 instead of the main file.


.blend1 files are backups, nothing to do with this.


Removed subscriber: @DominikR


https://devtalk.blender.org/t/blender-deleted-my-un-assigned-materials-how-is-that-a-feature-fake-user/22715/8
Just wanted to redirect this discussion there,
and to make sure @1D_Inc has read that topic and explained to users why it does not work in a team.


Added subscriber: @ChinoD


In #61209#1298314, @1D_Inc wrote:
This system design level problem is not solvable by setup.

Why so?

The concept provided by @ChinoD is, in my opinion, almost perfect.
The only thing missing is setting your default user name somewhere in the Preferences (when working in a team) and using this name for the fake user on save.
If another team member (with their own user name) wants to purge orphan data, it purges only data with their user name, or (if the user name was left at the default "Fake User") also data with "Fake User".
To delete all users' orphan data you would shift+click the purge button, and it would ask "Are you sure you want to delete all users' orphan data?" - "Yes" - "Cancel"

Example of how a custom user could be added, with its preferences:
Set User Name.png
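APEC's per-user fake-user idea can be modeled in a few lines. This is a sketch of the proposal only (no such feature exists in Blender; names and layout are assumed): each shielded datablock records who shielded it, and a purge drops only blocks shielded by the purging user or by the anonymous default "Fake User".

```python
# Toy model of the proposed per-user fake-user shields (hypothetical
# feature, assumed data layout): purge respects other users' shields
# unless explicitly asked to purge everything.

def purge(blocks, who, everything=False):
    """blocks: dict of name -> owner of the fake-user shield (or None)."""
    def drops(owner):
        if everything:
            return True             # shift+click: purge all, after confirming
        return owner in (None, "Fake User", who)
    return {n: o for n, o in blocks.items() if not drops(o)}

orphans = {
    "Mat.Temp": None,               # no shield at all
    "Mat.Anon": "Fake User",        # default, anonymous shield
    "Mat.Mine": "APEC",
    "Mat.Steve": "Steve",
}
left = purge(orphans, who="APEC")
assert left == {"Mat.Steve": "Steve"}   # Steve's shielded data survives
assert purge(orphans, who="APEC", everything=True) == {}
```

Whether this actually helps in a team is exactly what is disputed below: the shields only work if every member configures a name and uses it consistently.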

> In #61209#1298314, @1D_Inc wrote: > This system design level problem is not solvable by setup. Why so? The concept provided by @ChinoD, in my opinion, is almost perfect. The only thing missing is set your default user name somewhere in preferences (when working with team) and when saving, use this name for fake user. If other team member (with its own user name) wanted to purge orphan data it purge only data with his user name or (if the user name was by default and not changed "Fake User") purges also data with "Fake User". To delete all users orphan data you need to hold shift+click purge button and it ask "Are you sure you want to delete all users orphan data?" - "Yes" - "Cancel" Example how to add a custom user with it prefs: ![Set User Name.png](https://archive.blender.org/developer/F12852089/Set_User_Name.png)

In #61209#1302913, @APEC wrote:

In #61209#1298314, @1D_Inc wrote:
This system design level problem is not solvable by setup.

Why so?

The concept provided by @ChinoD, in my opinion, is almost perfect.
The only thing missing is set your default user name somewhere in preferences (when working with team) and when saving, use this name for fake user.
If other team member (with its own user name) wanted to purge orphan data it purge only data with his user name or (if the user name was by default and not changed "Fake User") purges also data with "Fake User".
To delete all users orphan data you need to hold shift+click purge button and it ask "Are you sure you want to delete all users orphan data?" - "Yes" - "Cancel"

Example how to add a custom user with it prefs:
Set User Name.png

This will add a massive configuration burden on users but won't solve much. The main issue is that Blender right now does not have a good editor for datablock management, and this alone won't address that.


Added subscriber: @Harley


@Harley recently submitted this [D14030 ](https://developer.blender.org/D14030) UI WIP: User-Managed Unused Data


In #61209#1302896, @APEC wrote:
https://devtalk.blender.org/t/blender-deleted-my-un-assigned-materials-how-is-that-a-feature-fake-user/22715/8
Just wanted to redirect this discussion there
and to be sure @1D_Inc read this topic and explained to users why it does not work in team.

It was also redirected here, as far as I remember.

Naming (labeling) unused data was also discussed there. Auto-labeled trash is untouchable, has no expiration date, and is as unmanageable as regular autosaving: I can pack in whatever I want, no matter whether it is needed, and auto-labeling will keep it forever. Whoever eventually has to purge it will have to take responsibility for all the consequences.

A flexible setup provokes conflicts.
If someone sets the save-all option, they never purge trash, drowning the project files and making the trash untouchable - and if someone sets even a partial purge option, they will corrupt files by shredding data that another project member expected to be saved. Send the file to Steve and you lose materials; send it to Jennifer and you lose textures. Or something else. Or everything. Or not.

There should be a simple rule for everyone, compatible with massive production.


After looking at @APEC's mockup, why not have a checkbox in the Preferences that allows to Auto-Save Orphan Data? A singular checkbox. Checked, It saves all Orphan Data. Unchecked it doesn't save Orphan Data. Keep the Fake User as is, shield toggles and all, but have the Auto-Save Orphan Data checkbox checked by default. This would prevent new Users from accidentally losing their work and at the same time allow experienced Users to uncheck the checkbox and go about saving their work with the Fake User method, business as usual. If you work on a Team that requires careful data management then you can require your artists to uncheck the Auto-Save Orphan Data checkbox.

1.png

While the Auto-Save Orphan Data checkbox is checked, all the Shield icons will automatically show as checked and be grayed out and un-toggleable.

2.png

If, with Auto-Save Orphan Data enabled, the file becomes bloated, the artist can purge the unwanted data. The Orphan Data area of the Outliner could have checkboxes next to each Orphan Data item. The checked items are purged on click of the Purge button; the unchecked items are not.

EDIT: Maybe the checkboxes in the Orphan Data Area of the Outliner aren't even necessary. An Artist who was using the Auto-Save Orphan Data could simply uncheck it in the Preferences and then manually manage the Orphan Data by un-toggling the Shield icon next to each item they want to Purge in the Orphan Data Area of the Outliner then click the Purge Button.

The idea is that as long as the Auto-Save Orphan Data checkbox is checked in the Preferences no Orphan Data will be purged unless done manually. If the Auto-Save Orphan Data checkbox is unchecked in the Preferences all Orphan Data that is not assigned to a Fake User is Purged as it works currently.


Because of the same teamwork issue.


As soon as there is any Auto-Save Orphan Data option, you will be forced to enable it permanently in order to make sure you do not lose any data in files from your colleagues or in any file downloaded from the internet.
It does not matter how much trash was accidentally packed in - it will indeed stay forever, since the only way to separate trash from non-trash among unused data is to contact the author and clarify it personally.
As a result, any Auto-Save Orphan Data option is a channel for delivering and spreading random trash - which is currently cleared automatically - across the Blender ecosystem. A kind of analogue of the destruction of a dam.


In #61209#1303663, @Daniel_KL wrote:
As a result, any Autosave Orphan data option is a channel to deliver and spread random trash across Blender ecosystem which is currently cleared automatically. A kind of analogue of the destruction of the dam.

I agree that if there is an option, there will be no actual choice.

At the moment the rule is quite simple and uniform - you "lose unused", so anything unused is treated as trash.
This rule does a lot to protect against trash flooding, which is essential for project management.
Sure, we lost a couple of materials a decade ago, because in Blender this rule is presented to the user in an incredibly harsh form - you have to lose some data completely unexpectedly in order to discover the rule.
But we are not very concerned about that couple of materials now, when tens of particle systems, hundreds of textures and thousands of materials iterate through our projects leaving no trash at all, even with autopacking, no matter how many members are involved in the projects.


There should at least be a warning when closing Blender.

1.png

Clicking Manage Data opens a window as @Harley set up in [D14030 ](https://developer.blender.org/D14030). The only change I suggest for the Orphan Data is to display a Shield next to the item type to identify when there are items without a user in the list tree.

2.png

3.png

This at least offers an opportunity to examine the data and assign a Fake User before it gets purged.


I agree about the warning - it would be nice to make it optional, but turned on by default. The warning will not explain the reason, though, but it will at least make the system explicit.


The idea I got (and it has changed over time thanks to the many reflections you have shared) is that:

  • the user needs to be clear about Blender's autopurge mechanism for unused datablocks. The current one is a decent and balanced way to handle garbage and offers a nice little error tolerance (one "layer" of data is deleted per save, e.g. if I delete a mesh with a material with a texture, I have to save three times to lose everything...).
  • there must be a way to manage the project data massively, surely highlighting the various links. The Clean Up menu is definitely too simplistic.

That said, artists should take care of keeping the files in order, and managers should be responsible for managing the files and the data, possibly even deleting them...
No algorithm can replace this process, which comes from contact with the artists and an overall vision of the project.
Thank you.
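The "one layer per save" tolerance mentioned above can be simulated: a plain (non-recursive) purge only removes datablocks with zero users at that moment, so an orphaned mesh → material → texture chain takes three saves to fully disappear, while the recursive purge proposed in this task would do it in one pass. A minimal sketch with an assumed users-per-block layout:

```python
# Sketch of non-recursive purge-on-save: each pass removes only the
# blocks that have zero users *right now*. Toy model, not Blender code.

def purge_pass(blocks):
    """blocks: dict of name -> set of user names. One non-recursive pass."""
    dead = {n for n, users in blocks.items() if not users}
    return {n: users - dead for n, users in blocks.items() if n not in dead}

# The cube object was deleted, so the chain below it orphans layer by layer.
chain = {
    "Mesh.Cube": set(),             # already unused
    "Mat.Gold": {"Mesh.Cube"},      # used only by the orphaned mesh
    "Tex.Noise": {"Mat.Gold"},      # used only by the orphaned material
}
saves = 0
while chain:
    chain = purge_pass(chain)
    saves += 1
assert saves == 3                   # a recursive purge would need just one
```

This is why the error tolerance exists at all, and also why the proposal's recursive purge needs a confirmation step: it collapses those three chances to recover into one.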


![1.png](https://archive.blender.org/developer/F12855022/1.png)

I feel like I have to clarify what orphan data is and stands for.

In 3dsmax, when you delete a cube - it disappears. Immediately.
In AutoCAD, when you delete a line - it disappears. Immediately.
In Inkscape, when you delete a circle - it disappears. Immediately.
In Photoshop, when you delete a layer - it disappears. Immediately.
In Blender, when you delete a cone - it disappears, but not immediately. It goes to a temporary trash bin called Orphan Data.

Historical reason for this behavior was that it used to be pretty impossible to properly delete an ID while Blender was running, and that undoing such deletion was impossible.

No-user data collected as Orphan Data is essentially "data already deleted by the user", with the only difference that in Blender it is deleted in a kind of delayed slow motion, just after the file is saved,
so the user can explore this trash bin for some purpose (like giving a fake user to interesting datablocks) instead of losing the data immediately, as in other programs. Quite a feature.
But when people create no-user data (because they are able to), they are creating data in the middle of this slow-motion deletion, and predictably (but not obviously) lose it after saving the file.

From this point of view, this warning shows a weird "data already deleted by you will be lost" message.
Also, the "Always write unused IDs on save" task is quite similar to "never lose already deleted data".
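The "fake-using interesting datablocks" workflow mentioned above amounts to protecting zero-user data before a save. A hedged sketch of that logic - the `FakeID` class below is a stand-in stub of my own, not part of the bpy API:

```python
# Sketch of "protect before save": give a fake user to every datablock that
# currently has zero users, so the purge-on-save never collects it.

class FakeID:
    def __init__(self, name, users):
        self.name = name
        self.users = users            # how many things reference this ID
        self.use_fake_user = False    # the "shield" toggle in Blender's UI

def protect_unused(ids):
    """Mark every zero-user datablock with a fake user; return what changed."""
    protected = []
    for idb in ids:
        if idb.users == 0 and not idb.use_fake_user:
            idb.use_fake_user = True
            protected.append(idb.name)
    return protected

mats = [FakeID("Steel", 2), FakeID("Glass", 0), FakeID("Wood", 0)]
print(protect_unused(mats))  # ['Glass', 'Wood']
```

Inside a real Blender session you would iterate a collection such as `bpy.data.materials` and set `use_fake_user` the same way (names taken from Blender's Python API; verify against the API docs for your version).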

To be clear: this "slow motion" allows some very nice workflows.
In Blender you can relink data on the fly, and the data-association mechanism is explicit (Duplicate Linked...): this is a key differentiator!

Here are a few examples among many...

![Relink.png](https://archive.blender.org/developer/F12855437/Relink.png)

Added subscriber: @Michael-Drake

In any other software, when you delete an object, it is instantly doomed.
When the trigger is pressed, the bullet is shot and hits the target immediately.

In Blender, when you delete an object, it is sentenced to be doomed.
The trigger is pressed, the bullet is shot, but the target is only hit at the moment the file is saved, executing the sentence written in Orphan Data, so you have time to make important decisions. It was made this way because of programming limitations, and became a feature even before The Matrix.

In short, this thread looks like an attempt to make bullets never hit their targets, because some illegal but possible activity during that bullet time causes quite predictable problems.

So I've read this whole thread and indeed I can see the difficulties around having auto-protection turned on by default for certain types. Depending on who is using it and their settings etc. Having lots of trash data build up is a bad thing for sure. What about this as a solution, riffing on what has been outlined above:

  • Add a confirmation window (which can be turned off in preferences / in the window itself) that warns of orphaned non-protected data being deleted at close. Similar to what @ChinoD has mocked up. There could be three buttons, one to go to the data-block viewer to see the data that has no users, and no protection, where the user could protect those items they wanted to protect. The second button would cancel the close. And the third would "Delete Data" and close blender. This way it could prevent accidental data loss due to orphaned data not being protected.
  • Add a data-block viewer or better yet a mode to the asset browser, that allows for a complete view of the current blender file. All of the data-blocks would be viewed / relevant blocks would have previews (e.g. textures, materials, objects). It could have filtering and sorting based on type / properties etc. and have search. This would also be the place where items like materials could be created without having to associated them with an object, and be applied to a whole selection, instead of just a single object, renamed, deleted etc. The asset browser would have three modes, one to see all assets marked as such, one for assets inside this file that are marked and shared outside this file, and one for all data blocks inside the current file. From here you could see what the data-blocks were linked to etc.
  • Allow for materials to be edited without being associated to the current selection.
  • Rename Fake User to Protect or something like that. Fake User as someone new to Blender doesn't make any sense. Renaming Users to Links would also help clarify what it actually is. The object centric paradigm of Blender seems to make Materials and Textures etc. worthless if they're not associated with their betters, the Objects ;)

Also, what do you people think about how the delete menu option in the outliner works? It doesn't send those items to Orphan Data, it removes them outright (when doing so from the Orphan Data view). Is that true deletion occurring?
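The confirmation window suggested above needs a summary of what would be deleted, as the proposal at the top of this task also asks ("5 materials, 3 meshes, etc"). A minimal sketch of building that summary string - a hypothetical helper, not an existing Blender function:

```python
from collections import Counter

def purge_summary(blocks):
    """blocks: iterable of (type_name, user_count) pairs.
    Returns a human-readable count of zero-user datablocks per type."""
    counts = Counter(type_name for type_name, users in blocks if users == 0)
    return ", ".join(f"{n} {t}{'s' if n > 1 else ''}"
                     for t, n in sorted(counts.items()))

data = [("material", 0), ("material", 0), ("material", 0),
        ("mesh", 0), ("mesh", 2)]
print(purge_summary(data))  # "3 materials, 1 mesh"
```

The real dialog would gather the `(type, users)` pairs from the file's ID collections; only zero-user blocks are counted, so the mesh that still has users stays out of the warning.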

Personally, I like everything you've suggested here. To answer your question regarding the delete menu option in the outliner's Orphan Data view: I think it is a true deletion. That said, my understanding of Blender's data management could be incorrect. I think deleting an item this way is like doing a single-item purge. In fact, it's more potent, and will even delete items with Fake User activated.
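For contrast with the single-item purge discussed here, the proposal at the top of this task also asks to make Purge recursive (delete IDs that would become unused once the current orphans go). A toy reachability sketch of my own, not Blender code:

```python
def recursive_purge(deps, roots):
    """deps: {block: set of blocks it uses}; roots: blocks with real users
    (or a fake user). Returns the set of blocks that survive a recursive
    purge - i.e. everything still reachable from a root."""
    alive, stack = set(), list(roots)
    while stack:
        block = stack.pop()
        if block in alive:
            continue
        alive.add(block)
        stack.extend(deps.get(block, ()))
    return alive

deps = {"object": {"mesh"}, "mesh": {"material"}, "material": {"texture"}}

# With the object deleted, nothing roots the chain: one recursive purge
# removes all of it at once, instead of one layer per save.
print(sorted(recursive_purge(deps, roots=[])))          # nothing survives
print(sorted(recursive_purge(deps, roots=["object"])))  # all four survive
```

This is the same dependency-chasing idea as garbage collection: a block survives only if some rooted chain of users leads to it.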

The proposal looks interesting.
It is a complex system-design question, so it is hard to predict whether different measures will be efficient, which ones actually will be, or all the possible consequences that could make the situation even worse, so it is definitely better not to solve such problems in a rush.
Proper solution may take several iterations.

There is a quite ancient industry-level project-management problem.
Due to lack of overall care, you basically have to live in garbage, and at some point, when the amount of garbage becomes huge, you also lose or doom someone's data.
Such data-management debt quickly turns critical in massive production.

![image.png](https://archive.blender.org/developer/F12869673/image.png)

In Blender, this problem has been solved by distributing the data management load between editing sessions.
As a result, a garbage-free system was obtained.

![image.png](https://archive.blender.org/developer/F12869677/image.png)

As soon as you have a system designed to be garbage-free, it has a very interesting property - it is able to manage any kind of garbage, including making it *physical*.
As a result, it is possible to pack external data into files without consequences.
This way Blender can afford packing external data into files, and even the Autopack External Data feature, which is compensated by the autopurge data-management solution, making blend files flexible and always complete.

![image.png](https://archive.blender.org/developer/F12869684/image.png)

There are many different data management systems on the market, but none of them are ideal.
There are various software solutions that allow you to pack data into files, such as Photoshop or Substance Painter, but the problem is that these programs were designed for local personal workflows, such as asset creation, and were never intended for processing large-scale projects.
As far as we know, Blender is the only program that has successfully scaled the packing/autopacking feature up to mass production, because of the system-design solutions that allow it.

@1D_Inc

I'm definitely on board with helping keep team projects efficient and garbage-free. It seems like the options are quite binary at the moment: you either have garbage building up if people are not organized, or people losing data if they're not careful to mark the items they want to keep around.

A couple questions:

  • What constitutes garbage? Is that anything that isn't being currently used in the scene?
  • Would you be up to having the fake user system revamped to be renamed something more appropriate like "Mark Persistent" and having users be instead "Links"? So in this way items that didn't have any links could be marked as a persistent asset while also having zero links. It doesn't make sense to have items that have zero links and are on the chopping block marked with a fake link, when they still have no links. Instead mark them as protected / persistent items
  • What other alternatives are there to this binary situation where you're either trusting the human to keep things organized and clean, or where you're not trusting them and throwing out potentially important data?
  • If the issue is external data being packed into the file, couldn't those items be the ones that should be automatically referenced instead of packed into the file?

To clarify, my problem with the current setup is that blender treats materials, for instance, as second class citizens to objects, and that those objects also need to be used in the scene otherwise they're thrown out. Basically things like materials have no right to exist on their own. They're always subservient to objects.

What are your thoughts?

In #61209#1308778, @Michael-Drake wrote:
@1D_Inc

I'm definitely on board for helping keep team projects efficient and garbage free. It seems like the options are quite binary at the moment where you either have garbage building up if people are not organized, or people loosing data if they're not careful to mark those items as something they want to stick around.

A couple questions:

  2. Would you be up to having the fake user system revamped to be renamed something more appropriate like "Mark Persistent" and having users be instead "Links"? So in this way items that didn't have any links could be marked as a persistent asset while also having zero links. It doesn't make sense to have items that have zero links and are on the chopping block marked with a fake link, when they still have no links. Instead mark them as protected / persistent items

Renaming is more a matter of terminology than of system design, since a renamed thing remains the same. I am not very fluent in English anyway. And it is probably better to discuss the system first.

  4. If the issue is external data being packed into the file, couldn't those items be the ones that should be automatically referenced instead of packed into the file?

This sounds similar to cancelling external-data packing - systems without the ability to pack external data behave like that. Losing the autopacking ability would be quite damaging, and would also break backward file compatibility.

  1. What constitutes garbage? Is that anything that isn't being currently used in the scene?

Yes.
I think it is important to look at the reasons behind this solution.

Users come to the software to create content, which is quite logical - we are not familiar with any artist who came to software for things like data management instead.
The market is also built around the representative part, such as portfolios and artistic skills (pictures, models, animations - the representative product).
To build portfolios and learn software skills, users usually work as individuals.
As soon as they step into any kind of collaboration, in any kind of software, their data management is not personal any more, and they are forced to follow almost the same rules to avoid conflicts.
This is usually achieved through fairly rigid corporate management mechanisms.
The point is that in any software those rules are just the same - "all the unused data will be lost during the very first massive purge" - the only difference is that Blender users follow it from the very beginning and are prepared to avoid losses.

As a result, the question of a data-management approach is a question of individuals vs collaboration; the point is that Blender's data-management infrastructure was originally built in the service of collaboration.
It is probably not a problem to make a choice, but before making it, it is important to realize that such a choice exists.
![image.png](https://archive.blender.org/developer/F12870335/image.png)

  3. What other alternatives are there to this binary situation where you're either trusting the human to keep things organized and clean, or where you're not trusting them and throwing out potentially important data?

To clarify, my problem with the current setup is that blender treats materials, for instance, as second class citizens to objects, and that those objects also need to be used in the scene otherwise they're thrown out.
Basically things like materials have no right to exist on their own. They're always subservient to objects.

Those questions have been drilling our brains for years. Otherwise, we would have already proposed a solution.
It is generally unknown how to make a system designed for collaboration comfortable for individuals, or how to make a system designed for individuals comfortable for collaboration.
We are not familiar with any massive production system outside this binary in practice (I can't even remember any other software with an Autopack feature), and would like to know if some software has taken such a risk.
At the moment we know that we can trust data management to any Blender user, even a not very experienced one.

I want to share our system integration experience.
There are 4 data-management complexity levels, sorted by software use and type of data input; data-management demands depend only on the level you belong to:

  • Individual use + vanilla data
  • Individual use + imported data
  • Collaboration + vanilla data
  • Collaboration + imported data

As an architectural company we belong to the fourth level, so we have to deal with integrating different software around some "common denominator" - the software final projects are combined in.

Since Maya is officially promoted as software designed around collaboration ("a place where modellers meet developers meet VFX artists" and so on), we originally decided to use Maya+Arnold for such purposes, and to use Blender with 3dsmax+Corona at the local level.
The resulting system looked similar to this.
![image.png](https://archive.blender.org/developer/F12902623/image.png)

Maya and 3dsmax have pretty much the same well-known data-management systems and structure (slate material editor, no external-data packing/purging, global node editor, and so on).
The problem we faced is that, due to the huge number of various data sources generating lots of data, and the amount of change during the design process in teamwork, our Maya division spent most of its time trying to handle and sort the garbage data which inevitably fell into the common denominator from all those sources.

At some point we decided to restructure and stop using Maya completely (for a number of reasons and limitations, including these), using Blender+Cycles as the common denominator instead, since its data-management approach had no such problems at all.
Our assumption worked pretty well and the resulting system looks something like this.

![image.png](https://archive.blender.org/developer/F12902625/image.png)

Of course, our example is far from universal, but I think it's important to mention the conditions in which we use Blender's garbage-free data management system and the reasons we prefer it in practice.

Removed subscriber: @ChinoD

Added subscriber: @Raimund58

Philipp Oeser removed the Interest/User Interface label 2023-02-10 09:25:56 +01:00
Bastien Montagne added the Module/Core label and removed the Module/User Interface and Interest/Core labels 2023-10-07 13:09:40 +02:00
Reference: blender/blender#61209