Ensure 'virtual' linked override IDs generated by the recursive resync
process are tagged as indirectly linked data.
This is needed to avoid the 'missing data' messages on those virtual
data-blocks after saving and reloading.
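A minimal sketch of the idea, using simplified stand-ins rather than
Blender's actual ID struct and tag bits (the real tagging happens on the
ID's tag field in the resync code):

    /* Simplified stand-ins for illustration only; not Blender's actual types. */
    #include <stddef.h>

    enum {
      ID_TAG_INDIRECT = 1 << 0, /* Data-block is linked indirectly, via another library. */
    };

    typedef struct ID {
      struct ID *next;
      int tag;
    } ID;

    /* Tag every 'virtual' override created by recursive resync as indirectly
     * linked, so it is not reported as missing data on the next file load. */
    static void tag_virtual_overrides_as_indirect(ID *virtual_overrides)
    {
      for (ID *id = virtual_overrides; id != NULL; id = id->next) {
        id->tag |= ID_TAG_INDIRECT;
      }
    }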
This commit fixes two different issues:
* In some cases, when an object was added to a sub-collection and used
  in a different sub-collection, and the root common collection did not
  need to be resynced, multiple overrides of the new object would end up
  being created. This was affecting both normal and recursive resync.
* In the recursive resync case, the barrier code defining what is part
  of an override group/hierarchy (and what is not) was wrong.
Note that the current solution for the first issue is sub-optimal (it
goes back to the root of the override hierarchy and resyncs the whole
thing); a better solution is TODO for now.
We need to re-evaluate what needs to be resynced after each step of
processing overrides from a given 'indirect level' of libraries.
Otherwise, recursive overrides (overrides of linked overrides) won't
work.
Note that this should not change too much in practice currently, since
there are still other issues with recursive overrides.
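A rough sketch of the intended per-level flow, using hypothetical helper
names (max_library_indirect_level, collect_overrides_needing_resync,
resync_override) standing in for the actual resync functions:

    /* Illustration only: process libraries from the deepest indirect level
     * down to level 0, and re-collect what needs a resync after every level,
     * so that overrides of linked overrides are picked up by the next pass. */
    enum { MAX_RESYNC_IDS = 1024 };

    typedef struct OverrideID OverrideID;

    /* Hypothetical helpers, not actual Blender API. */
    extern int max_library_indirect_level(void);
    extern int collect_overrides_needing_resync(int level, OverrideID **r_ids, int max_ids);
    extern void resync_override(OverrideID *id);

    static void resync_all_indirect_levels(void)
    {
      for (int level = max_library_indirect_level(); level >= 0; level--) {
        OverrideID *ids[MAX_RESYNC_IDS];
        /* Re-evaluate after each level: resyncing level N can make overrides
         * at level N - 1 out of date again. */
        const int ids_num = collect_overrides_needing_resync(level, ids, MAX_RESYNC_IDS);
        for (int i = 0; i < ids_num; i++) {
          resync_override(ids[i]);
        }
      }
    }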
Also, the added checks (CLOG errors) show that some IDs (node trees)
seem to be detected as needing a resync even right after having been
resynced; this still needs further investigation. It could be due to the
limitations currently set on node trees though, those are always
complicated snowflakes to deal with...
This is not supposed to happen, but better be safe than sorry, and
assume it is beyond unlikely that someone would use chains of over 10k
linked libraries.
Very stupid mistake in the libraries' indirect-level building code: it
was not skipping 'loop-back' ID pointers.
Note that we also need some level of checking for the case where there
would be an actual dependency loop between libraries; this is not
supposed to be possible, but better be safe than sorry. Will add it in
the next commit.
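A minimal sketch of the indirect-level building, with simplified types
(the struct and names are illustrative, not Blender's actual data):

    /* Illustration only: compute how indirectly used each library is, and cap
     * the recursion so an (unsupported) dependency loop between libraries
     * cannot recurse forever. 'Loop-back' ID pointers must not be followed
     * when building the dependency lists. */
    #define MAX_LIBRARY_INDIRECT_LEVEL 10000 /* Deeper chains are assumed to be a bug/loop. */

    typedef struct Library {
      struct Library **deps; /* Libraries this one links data from (loop-backs excluded). */
      int deps_num;
      int indirect_level;    /* Initialized to -1; 0 means directly used by the working file. */
    } Library;

    static void library_indirect_level_set(Library *lib, int level)
    {
      if (level > MAX_LIBRARY_INDIRECT_LEVEL) {
        /* Not supposed to be possible, but better safe than infinite recursion. */
        return;
      }
      if (level <= lib->indirect_level) {
        return; /* Already reached through an equal or deeper path. */
      }
      lib->indirect_level = level;
      for (int i = 0; i < lib->deps_num; i++) {
        library_indirect_level_set(lib->deps[i], level + 1);
      }
    }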
Recursive resync also means resyncing overrides that are linked from
other library files into the current working file.
Note that this allows getting 'working' files even when their
dependencies are out of sync. However, since linked data is never
written/saved, this has to be re-done every time the working file is
loaded, until said dependencies are updated properly.
NOTE: This is still missing the 'report' side of things, which is part
of a larger task to enhance reports regarding both linking and
liboverrides (see T88393).
----------
Technical notes:
Implementing this proved to be slightly more challenging than expected,
mainly because one of the key aspects of the feature had never been done
in Blender before: manipulating and re-creating linked data.
This ended up moving the whole resync code to using temp IDs out of
bmain, which is better in the long run anyway (and more aligned with
what we generally want to do when manipulating temp ID data). It should
also give a marginal performance improvement for regular resync.
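A rough sketch of that pattern, with hypothetical helpers (not actual
Blender API): the resync work happens on a temporary ID copy kept
outside of the main database, and the result is only inserted once it is
complete.

    #include <stdbool.h>

    typedef struct ID ID;
    typedef struct Main Main; /* The main database of data-blocks. */

    /* Hypothetical helpers, not actual Blender API. */
    extern ID *id_copy_outside_main(const ID *id); /* Copy that belongs to no Main. */
    extern void id_free_outside_main(ID *id);
    extern void main_id_add(Main *bmain, ID *id);
    extern bool override_resync_apply(ID *temp_id);

    static bool resync_using_temp_id(Main *bmain, const ID *linked_id)
    {
      ID *temp_id = id_copy_outside_main(linked_id);
      if (!override_resync_apply(temp_id)) {
        /* Failure is cheap to discard, since bmain was never touched. */
        id_free_outside_main(temp_id);
        return false;
      }
      main_id_add(bmain, temp_id);
      return true;
    }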
This commit also had to carefully 'sort' libraries by level of indirect
usage, as we want to resync the least directly used libraries first,
i.e. the libraries that are most used by other libraries.
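A minimal sketch of that ordering, with an illustrative Library struct
(not Blender's actual one):

    /* Illustration only: resync order is highest indirect level first, so the
     * libraries most used by other libraries are resynced before the libraries
     * (and eventually the working file) that depend on them. */
    #include <stdlib.h>

    typedef struct Library {
      const char *filepath;
      int indirect_level;
    } Library;

    static int library_indirect_level_cmp(const void *a, const void *b)
    {
      const Library *lib_a = a;
      const Library *lib_b = b;
      /* Descending: deepest (most indirectly used) libraries first. */
      return lib_b->indirect_level - lib_a->indirect_level;
    }

    static void sort_libraries_for_resync(Library *libraries, size_t libraries_num)
    {
      qsort(libraries, libraries_num, sizeof(Library), library_indirect_level_cmp);
    }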
This is a follow-up to rB2bd85d9cc623: we cannot forcefully delete
obsolete overrides of object data (meshes etc.), as this also implies
deleting their user object, which might still be a perfectly valid
override, albeit in conflict regarding its obdata ID pointer...
This is the opposite of the previous code, which would keep those
'deprecated' overrides around (often in a dedicated collection) when
they were detected as user-edited.
While this is a safe-ish way to (try to) preserve user-edited data, it
tends to add too much 'trash' data to production scenes, whose cleanup
becomes a burden.
Note that users will get warnings in those cases, and can always choose
not to save the current blend file and go fix the library issue instead.
Code detecting overrides whose referenced linked data is missing was
actually missing many cases, leading to too much garbage data being kept
around after the resync process.
One of the current annoying limitations of Blender regarding
Collections/Objects is that objects must be instantiated in at least one
collection.
The code ensuring this as a post-processing step of override
creation/resync operations was a bit too eager to add those objects to
an external 'ad-hoc' collection, which poses several issues (both in
terms of keeping the scene well organized, and related to override
hierarchy handling).
So now, be very conservative and only generate and use an external
'storage' collection for those objects when it is absolutely mandatory.
In practice, this means it should never happen anymore with any decently
organized data source.
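A minimal sketch of the conservative fallback, with hypothetical helper
names (scene_collections_object_contains,
override_storage_collection_ensure, collection_object_add) rather than
the actual Blender functions:

    /* Illustration only: after override creation or resync, only fall back to
     * the ad-hoc 'storage' collection when an object really is not
     * instantiated anywhere in the scene. */
    #include <stdbool.h>

    typedef struct Scene Scene;
    typedef struct Object Object;
    typedef struct Collection Collection;

    /* Hypothetical helpers, not actual Blender API. */
    extern bool scene_collections_object_contains(const Scene *scene, const Object *object);
    extern Collection *override_storage_collection_ensure(Scene *scene);
    extern void collection_object_add(Collection *collection, Object *object);

    static void object_ensure_instantiated(Scene *scene, Object *object)
    {
      if (scene_collections_object_contains(scene, object)) {
        /* Already instantiated somewhere: do NOT add it to an extra
         * collection, that only clutters well-organized scenes. */
        return;
      }
      /* Absolute last resort: create/use the hidden 'storage' collection. */
      Collection *storage = override_storage_collection_ensure(scene);
      collection_object_add(storage, object);
    }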
Move the detection/decision of whether an ID pointer should be taken
into account in library override hierarchy processing to the LibQuery
area of code, by introducing a new callback flag.
This allows factorizing the test logic, being explicit in liboverride
code about the ID relationships that can be ignored when exploring the
override hierarchy, and adds the possibility to do more checks on
pointers to be tagged as non-overridable in the future.
Note that all but the 'special' ID pointers (loop-back, embedded, etc.)
should be overridable. If one is not, the relevant IDType's 'foreach_id'
callback code is responsible for tagging it properly.
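A minimal sketch of the idea, with simplified flag and callback names
(the real flags and callback signature live in the LibQuery code, this
is only illustrative):

    /* Illustration only: the IDType 'foreach_id' callback tags 'special'
     * pointers (loop-back, embedded, non-overridable) with flags, and the
     * override-hierarchy walker simply skips anything carrying them. */
    enum {
      CB_FLAG_NOT_OVERRIDABLE = 1 << 0, /* Must be ignored by liboverride hierarchy code. */
      CB_FLAG_LOOPBACK = 1 << 1,
      CB_FLAG_EMBEDDED = 1 << 2,
    };

    typedef struct ID ID;

    /* Called for every ID pointer of an ID, with flags set by its type's
     * 'foreach_id' callback; returns whether the pointer should be followed. */
    static int override_hierarchy_walk_cb(ID **id_pointer, int cb_flag, void *user_data)
    {
      (void)id_pointer;
      (void)user_data;
      if (cb_flag & (CB_FLAG_NOT_OVERRIDABLE | CB_FLAG_LOOPBACK | CB_FLAG_EMBEDDED)) {
        /* The decision now lives with the LibQuery flags instead of being
         * re-implemented in every liboverride code path. */
        return 0;
      }
      /* ... follow the pointer and add it to the override hierarchy ... */
      return 1;
    }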
Python-defined IDProperties, however, are not systematically overridable
(yet), so this should allow us to detect that case and act accordingly
in an upcoming commit.
No behavioral change expected in this commit.
Multiple overrides of the same linked ID in the same override hierarchy
are currently not supported and can cause all kinds of issues.
In some cases they could lead to an infinite loop trying to resync the
same ID over and over; this is now prevented.
Found in some Blender studio production files.
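A minimal sketch of the safeguard, with hypothetical helpers (id_set_add,
override_resync_single) standing in for the actual resync code:

    /* Illustration only: keep a set of already-processed IDs so duplicate
     * overrides of the same linked ID cannot make the resync loop process
     * the same ID over and over again. */
    #include <stdbool.h>

    typedef struct ID ID;
    typedef struct IDSet IDSet; /* Hypothetical hash set of ID pointers. */

    extern bool id_set_add(IDSet *processed, ID *id); /* false if already present. */
    extern void override_resync_single(ID *id);

    static void override_resync_once(IDSet *processed, ID *id)
    {
      if (!id_set_add(processed, id)) {
        /* Already handled in this pass: a duplicate override of the same
         * linked ID in the same hierarchy, skip it instead of looping. */
        return;
      }
      override_resync_single(id);
    }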
Linked overrides were not properly ignored in some parts of the code,
leading to invalid resync results in some cases with recursive overrides
(i.e. overrides of overrides).
Reported by Andy @eyecandy from the studio.
We do not want to copy existing override data from the linked ID when
creating its local override (be it a template, or because the linked ID
is itself an override of other library data).
Note that this was not a very serious issue; it would just cause some
memory leaks, since override data is re-created on the newly copied
local data anyway.
These use cases have been very little tested so far, but both complex
production pipelines and the future restrictive workflow will make them
fairly common...
This commit essentially touches the post-processing of collections and
objects after the resync itself has been done, to ensure their proper
instantiation in the scene:
- Remove a lot of the process in the resync case (resynced data is
  assumed to already be instantiated in the scene, unlike in the
  override creation case).
- For auto-resync, only do the post-processing once, after all overrides
  have been resynced (doing it after each individual resync was causing
  a lot of instantiation glitches, with many unwanted extra objects and
  collections being added to the master collection).
It also deals in a much more reliable way with the detection of objects
missing from the scene, by using the new `BKE_scene_objects_as_gset`
utility.
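A minimal sketch of that detection, with stand-ins for the GSet-based
utility mentioned above (the helper names here are illustrative, only
`BKE_scene_objects_as_gset` is the real function):

    /* Illustration only: build one set of every object instantiated in the
     * scene, then any resynced override object missing from that set still
     * needs to be added to a collection. */
    #include <stdbool.h>
    #include <stddef.h>

    typedef struct Scene Scene;
    typedef struct Object Object;
    typedef struct ObjectSet ObjectSet; /* Stand-in for the real GSet. */

    /* Hypothetical helpers standing in for BKE_scene_objects_as_gset & co. */
    extern ObjectSet *scene_objects_as_set(Scene *scene);
    extern bool object_set_contains(const ObjectSet *set, const Object *object);
    extern void object_set_free(ObjectSet *set);

    static void handle_objects_missing_from_scene(Scene *scene,
                                                  Object **resynced_objects,
                                                  size_t objects_num,
                                                  void (*handle_missing)(Object *))
    {
      ObjectSet *scene_objects = scene_objects_as_set(scene);
      for (size_t i = 0; i < objects_num; i++) {
        if (!object_set_contains(scene_objects, resynced_objects[i])) {
          /* Object is not instantiated anywhere in the scene. */
          handle_missing(resynced_objects[i]);
        }
      }
      object_set_free(scene_objects);
    }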
As a bonus, this makes the auto-resync process slightly faster (only by
a few percent, but that's always good to get).
This flag is set for liboverride IDs that the resync process detects as
no longer needed, but that have been user-edited, so the auto-handling
code cannot silently delete them.
Exposing those to users will be part of the new incoming Override
Outliner view.
Part of D10855.
This is functionality that isn't accessible via the user interface. The
API allows the creation and modification of an override template that
holds the rules that need to be checked when overriding the asset.
The API is set up so that it cannot be changed after creation. Later on,
when the system is more mature, we will allow changing override
operations.
NOTE: This is an experimental feature and should not be used in
production.
Reviewed By: mont29, sebbas
Differential Revision: https://developer.blender.org/D10792
Purely local IDs would not always be used as proper barriers when
generating override hierarchy data, leading to improperly trying to
resync override IDs from other hierarchies, which would break or cause
issues when they were duplicate overrides of the same linked data.
Reported by Pablo from Sprite team, thanks.
We do not want to re-generate auto-overrides until everything that
needed to be resynced has been resynced. Otherwise it's a waste of time
and a guaranteed loss of some override properties.
Mistake in rB9947f2095610 earlier today.
Instead of storing those in the scene's master collection, which is
fairly annoying, we now add them to a (hidden) dedicated collection.
Easy to ignore, or to check and clean up.
This is not only potentially extremely expensive, it is also fairly
futile, and the code is not designed to handle it currently anyway (it
could easily end up in infinite loops and other crashes).
Problem is, when a collection is excluded from the scene, none of its
objects are technically instantiated.
This should not happen when *creating* an override, but can be fairly
common during resync process.
For now, use a less precise check in the resync case, relying only on
the object's user count. This might leave some objects without any
collection in some rare, weird cases, but that cannot really be avoided
currently.
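A minimal sketch of that fallback check, with a simplified Object type
(not Blender's actual struct or logic):

    /* Illustration only: in the resync case, fall back to a check based
     * purely on the object's user count, since objects of excluded
     * collections are not technically instantiated in the scene. */
    #include <stdbool.h>

    typedef struct Object {
      int users; /* Reference count from other data-blocks (collections, etc.). */
    } Object;

    /* True when the resynced object still has to be linked into a collection.
     * Less precise than walking the scene's collections, but excluded
     * collections would otherwise give false positives during resync. */
    static bool object_needs_instantiation_after_resync(const Object *object)
    {
      return object->users == 0;
    }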
Attempt to work around some full-corruption cases created at the studio.
It is not clear how those were created, so this is not really fixing
anything, just detecting and 'solving' as best as possible some heavy
corruption of local overrides.
This is good to have in general anyway, and it might help prevent
further corruption from happening too.