Memory not freed after rendering with 2.79a #54287

Closed
opened 5 years ago by nobi08 · 38 comments
nobi08 commented 5 years ago

System Information
on Debian stable arm64 with Intel chipset
with GTX 1060
rendered with or without GPU doesn't matter

Blender Version
Broken: 2.79a, latest build
Worked: 2.79

Short description of error
After rendering a scene that consumes a lot of memory while rendering, the memory is not freed once rendering finishes.
When I then load a new scene with "File > New", the memory usage seems to stay at the high value.
(In addition, with the same scene, the memory usage of 2.79a seems to be higher than with 2.79.)
There's something wrong with this behaviour.

Exact steps for others to reproduce the error
Load the "fishy cat" demo scene and render it 3-5 times.
The memory usage on my system rises after every render job.
When starting with a fresh scene (without quitting Blender), the memory usage stays the same.
After quitting Blender, the memory is freed (so no permanent memory leak).

Conclusion
I have to use 2.79 for now, because with 2.79a I have to restart Blender after every second render job - show stopper ;-(

nobi08 commented 5 years ago
Poster

Added subscriber: @nobi08

brecht commented 5 years ago
Owner

Added subscriber: @brecht

brecht commented 5 years ago
Owner

I can't reproduce the problem with the fishy cat scene on Ubuntu with 2.79a.

  • If you have any addons enabled, try disabling them since we have had an addon cause this kind of problem before.
  • Where are you checking the memory usage, in the Blender top header itself? In the system monitor?
  • What kind of memory usage numbers are you seeing? A little bit is expected due to fragmentation, undo steps, etc, but it should be relatively small and stabilize after a few renders.
  • Are there any errors or warnings in the console when rendering?
  • How are you rendering, Shift+Z in the viewport, F12 final render, animation, .. ?
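As a toy illustration of the distinction brecht draws here (the allocator's own accounting, like Blender's header readout, versus what the system monitor reports), here is a hedged Python sketch. This is not Blender code and the function name is invented; Python's `tracemalloc` stands in for the internal readout, which drops back down after a big buffer is freed even though the OS-level RSS may stay high for a while.

```python
# Toy illustration (not Blender code): contrast internal allocator
# accounting with OS-level memory, as discussed in this thread.
import tracemalloc

def internal_usage_after_cycle(n_bytes=5_000_000):
    """Allocate and free a large buffer, returning the interpreter's own
    accounting of live memory afterwards (analogous to Blender's header
    number, not to the system monitor's RSS)."""
    tracemalloc.start()
    buf = bytearray(n_bytes)   # stands in for render buffers
    del buf                    # freed internally...
    current, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    # ...so "current" drops back down even though the OS may still charge
    # the process for the pages, which later comments in this thread
    # attribute to jemalloc's dirty-page handling.
    return current, peak

current, peak = internal_usage_after_cycle()
print(current < peak)  # internal accounting drops once the buffer is freed
```

The takeaway matches brecht's questions: a stable internal number with a high system-monitor number points at the allocator or the OS, not at a leak inside the application.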
nobi08 commented 5 years ago
Poster

Changed status from 'Open' to: 'Resolved'

nobi08 closed this issue 5 years ago
nobi08 self-assigned this 5 years ago
nobi08 commented 5 years ago
Poster

Thanks for the answer. The hint to compare the Blender memory readout with the system memory usage gave the solution.
The culprit was not Blender but a system update installed without restarting the system (which normally is not a problem if there is no kernel update).
I watched the Blender memory usage - it was OK. The system usage wasn't.
Restarted, tested... everything works fine - thanks for the fast reply.
The show can go on now :-)

nobi08 commented 5 years ago
Poster

Changed status from 'Resolved' to: 'Open'

nobi08 reopened this issue 5 years ago
nobi08 commented 5 years ago
Poster

Sorry to bother you again, but the problem seems NOT to be solved.
Again: Blender shows normal memory usage in the header information, but my system doesn't.
This is strange and appears to happen
.. with or without addons enabled,
.. with or without GPU rendering,
.. independent of starting the render with Shift+Z or F12.

Memory usage with one of my bigger scenes:

2.79:
scene only: 2GB
while rendering: ~9GB
after rendering: back to ~3GB
after loading a new scene: ~2GB

2.79a:
scene only: 2GB
while rendering: ~13GB
after rendering: back to ~11GB
after loading a new scene: ~11GB

After rendering the scene, system memory usage stays at a high level and is not freed. Repeated renders eat up my system's memory (with a big scene, already after the second render).
A funny thing is the 4GB higher memory consumption of the 2.79a version while rendering the same scene.

In addition, I can't see any warnings/errors in the console.

It seems to be a strange problem.
Any help appreciated.

nobi08 changed title from Memory not freed after rendering on 2.79a to Memory not freed after rendering with 2.79a 5 years ago
Collaborator

Added subscriber: @LazyDodo

Collaborator

I can't seem to repro this behavior on windows either. What exact build of 2.79a are you running? (the splash screen should have a hash on it, that would help narrowing this down)

edit: also is this a blender.org build or something from the ubuntu repositories?

nobi08 commented 5 years ago
Poster

Hash is: 8928d99270 (as of 2018-02-21 10:41)

What makes me wonder is that 2.79 works fine, but 2.79a shows this behaviour with exactly the same settings and scene.

Collaborator

I tried with both 2.79a and a recent master build; I can't seem to trigger the issue with fishy_cat_cpu.blend. Maybe it's scene related? Also, I was probably too late editing my post, but where did you obtain the Blender build - from blender.org, or did it come from Ubuntu's repositories?

nobi08 commented 5 years ago
Poster

LazyDodo, thanks for trying.
IMO it's not scene related, because it does work normally with 2.79.
The build is from blender.org, and I am not using Ubuntu but Debian stable. Until the 2.79a release, I never had any problems like this one.

brecht commented 5 years ago
Owner

It can be scene related also if it works normally in 2.79, since there can be a new bug in 2.79a that is only triggered by specific scene setups. So ideally we want statistics from a scene that we can test ourselves, to figure out where exactly the difference is.

I can see differences between Blender versions, but some randomness and delay in releasing the memory to the operating system is expected. The memory statistics in the system monitor do not tell you the actual memory that is used and available, only an approximation that can be off quite a bit. What is a problem is if re-rendering the same scene increases memory usage each time and eventually fails due to running out of memory, and I have not been able to reproduce that.

What you can try is running Blender like this, and tell us how it affects memory usage. It will make Blender release memory to the operating system quicker and eliminates some of the randomness.

MALLOC_CONF="dirty_decay_ms:0" ./blender
YAFU commented 5 years ago

Added subscriber: @YAFU

YAFU commented 5 years ago

I am able to reproduce the problem with Blender 2.79a official release and this scene:
High RAM scene.blend (https://archive.blender.org/developer/F2439528/High_RAM_scene.blend)

My system: Kubuntu Linux 18.04 64bits - GTX 960 4GB - i7 3770 - 15.6GB RAM available.

Steps (All the procedure is without closing Blender):

*2.79a
Open the High RAM scene.blend scene and render the image. After it finishes, System Monitor reports Blender still uses about 12.6GB.

File > New > Reload Start-Up File. Switch to Cycles. Render image (default cube). After it finishes, System Monitor reports Blender still uses about 11.9GB.

File > Open Recent > High RAM scene.blend. Render image. After it finishes, System Monitor reports Blender still uses about 12.8GB.

So apparently the memory is never freed, but it does not seem to be cumulative either.

*2.79
The behavior is different: while rendering, the scene simply reaches a peak of 13GB, but the memory is freed when rendering ends.

Collaborator

I had to dial down the particle system a tiny bit because of RAM issues, but this is the memory curve I got on Windows with 2.79a with that file:

image.png (https://archive.blender.org/developer/F2439555/image.png)

brecht commented 5 years ago
Owner

Added subscriber: @Sergey

brecht commented 5 years ago
Owner

@YAFU, does running Blender 2.79a like this cause the reported memory usage to be reduced after rendering?

MALLOC_CONF="dirty_decay_ms:0" ./blender

@Sergey, did the jemalloc version change for 2.79a?

YAFU commented 5 years ago

@LazyDodo: Maybe it's just a Linux issue? My system monitor graph with Blender 2.79a is basically a permanent horizontal line at approximately 13.5GB :)
Just in case: I'm using the 4.15 Linux kernel here.

@brecht: I had forgotten to try it. Running Blender 2.79a with MALLOC_CONF="dirty_decay_ms:0" ./blender works fine; the memory is freed.

nobi08 commented 5 years ago
Poster

@brecht: also for me, MALLOC_CONF="dirty_decay_ms:0" blender works fine.
No accumulation of allocated memory in the system monitor or the Blender readout. Memory is freed after rendering and after loading a new scene.
Seems to work fine with that environment variable set.

Sergey commented 5 years ago
Owner

@brecht, jemalloc indeed changed for 2.79a. The reason was that the previous update introduced a rather dramatic slowdown for things like the Draw Manager in 2.8. That slowdown was fixed by jemalloc 5.0.1, but it seems it introduced some other issues?

brecht commented 5 years ago
Owner

It seems the dirty page purging totally changed:
https://github.com/jemalloc/jemalloc/releases/tag/5.0.0
https://github.com/jemalloc/jemalloc/issues/325

Purging dirty pages is slower than before when opening a big .blend and then doing file > new, but it still happens after some time in that case. This can be explained by the default 10s delay:
http://jemalloc.net/jemalloc.3.html#opt.dirty_decay_ms

However for rendering we allocate memory in different threads, and according to this comment that can be problematic:

(a lot of your arenas only have one thread in them -- if that thread is sleeping or otherwise not interacting with the allocator much, it might never get far enough into the arena code to purge).

And indeed, if we render a simple scene afterwards a few times it wakes up the render threads, and memory goes down a bit each time. Similar behavior happens for Cycles, Blender Internal, baking, etc. The suggested solution seems to be enabling jemalloc background threads:

diff --git a/source/creator/creator.c b/source/creator/creator.c
index a59a45f..15b4169 100644
--- a/source/creator/creator.c
+++ b/source/creator/creator.c
@@ -201,6 +201,10 @@ char **environ = NULL;
 #  endif
 #endif
 
+/* If jemalloc is used, it reads this global variable and enables background
+ * threads to purge dirty pages. Otherwise we release memory too slowly. */
+const char *malloc_conf = "background_thread:true";
+
 /**
  * Blender's main function responsibilities are:
  * - setup subsystems.

With that option it seems to purge all the dirty pages within about 10s after rendering as expected. It's not entirely clear to me if this would solve all potential problems though. Suppose we allocate and free a bunch of memory in one thread, and then do the same in another thread soon afterwards. Can it now run out of memory if the sum of what the two threads allocated is too high? The background threads are not going to be able to purge the pages quick enough, so will jemalloc do it in some other way to avoid running out of memory?
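The decay mechanism described above can be sketched with a deliberately simplified toy model (an assumption-laden illustration, not jemalloc's real algorithm; the class and field names are invented): pages freed by the application stay "dirty" and keep counting toward RSS until they are purged, either immediately (dirty_decay_ms:0), after the decay window elapses, or by a periodic background purger.

```python
# Toy model of dirty-page decay, illustrating the behaviours compared in
# this thread. Not jemalloc code; names are invented for the sketch.
class ToyArena:
    def __init__(self, dirty_decay_ms):
        self.dirty_decay_ms = dirty_decay_ms
        self.dirty = []          # timestamp (ms) at which each page was freed
        self.rss_pages = 0       # pages the OS still charges to the process

    def alloc(self, pages):
        self.rss_pages += pages

    def free(self, pages, now_ms):
        # Freed internally, but still counted in RSS until purged.
        self.dirty += [now_ms] * pages
        if self.dirty_decay_ms == 0:
            self.purge(now_ms)   # immediate purge, like dirty_decay_ms:0

    def purge(self, now_ms):
        # Drop dirty pages older than the decay window. A background thread
        # would call this periodically even while app threads sleep, which
        # is what background_thread:true provides.
        keep = [t for t in self.dirty if now_ms - t < self.dirty_decay_ms]
        self.rss_pages -= len(self.dirty) - len(keep)
        self.dirty = keep

lazy = ToyArena(dirty_decay_ms=10_000)   # jemalloc 5 default window
eager = ToyArena(dirty_decay_ms=0)       # the MALLOC_CONF workaround
for arena in (lazy, eager):
    arena.alloc(100)
    arena.free(100, now_ms=0)
print(lazy.rss_pages, eager.rss_pages)   # lazy still holds pages, eager is at 0
```

The model mirrors the symptom in the reports: with a lazy decay and an idle render thread, nothing ever calls purge, so the system monitor keeps showing the peak until some later activity (or a background thread) runs it.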

Sergey commented 5 years ago
Owner

@brecht, i propose the following:

  • We make 2.79b with the previous version of jemalloc, just so we are safe.
  • We use your solution in the master branch and see how it behaves. We shouldn't be locked to an older jemalloc version; lots of distros are moving to newer versions now.

I'm also quite sure jemalloc is used in other software, including Firefox, with all its separate threads for JIT compilation and such. We can totally check how it's used there or ask their developers.

brecht commented 5 years ago
Owner

Sounds like a good plan.

Firefox did not upgrade to jemalloc 4 or 5, they're using a customized earlier version of jemalloc.
https://bugzilla.mozilla.org/show_bug.cgi?id=1363992

Sergey commented 5 years ago
Owner

@brecht, 2.79b builds are done. If you can verify they're fine it'll be great! The buildbot is back to jemalloc 5.0.1, so this issue we need to address in one way or another.

Sergey commented 5 years ago
Owner

Just verified with Campbell. The issue is solved in 2.79b by using older jemalloc.

Buildbot is on newer jemalloc, so let's see what we can do there.

@brecht, Using MALLOC_CONF="dirty_decay_ms:0" solves the issue. Tried your patch, but this gives me:

<jemalloc>: Error in dlsym(RTLD_NEXT, "pthread_create")
Aborted

right at startup.

brecht commented 5 years ago
Owner

There's a bug report about that error here, I guess you are using 5.0.1 with the fix already?
https://github.com/jemalloc/jemalloc/issues/907

It seems there are multiple issues depending on static / shared libraries and linking order, and not all of them may be solved. It worked for me with a static jemalloc 5.0.1 and no changes to the build system, not sure how your setup is different.

I think dirty_decay_ms:0 is not intended for production use and performance is expected to be poor, not sure how much it matters in practice but probably it's not a good idea to use it.

Sergey commented 5 years ago
Owner

@brecht, i'm dynamically linking against libjemalloc v5.0.1 here. Did you talk to those folks about how to make newer jemalloc usable for Blender? :)


Added subscriber: @christian


Added subscriber: @lsstratmann


test.blend (https://archive.blender.org/developer/F3530896/test.blend)

I can confirm this or a similar issue with 2.80 (build "2.80-f8908f0d434-linux-glibc219-x86_64"). When I try to render the roughly 300 frames of the default cube animation above, my system memory usage increases by about 7 GiB while Blender itself only reports about 100 MB of memory being used.
The output format in this file is set to "FFmpeg video"; my FFmpeg version is 4.0.

nobi08 commented 5 years ago
Poster

I changed the render samples to 1 and the render size to 50% to speed things up.
I started rendering. System memory consumption rises to >16GB after a short time while Blender displays memory usage of around 70MB.
I can confirm Lukas' observation. I used blender-2.80-fbd614f1faf-linux-glibc219-x86_64 on a Debian stable system with a GTX 1060.
Seems as if the jemalloc problem is here again??


Added subscriber: @Metal3d


This seems to be related to what I reported here: https://developer.blender.org/T56825. I'll try the MALLOC option.


I confirm that MALLOC_CONF changed the behavior on my computer with 20GB RAM and an NVidia 1060 Ti (3GB) card on Fedora 28. Memory seems to be freed, and my computer no longer freezes after several renders.

YAFU commented 4 years ago

I still have the problem in 2.79 and 2.8 Buildbot builds. When I compile Blender myself, the problem does not happen. I suppose that is due to the version of jemalloc installed on my system (3.6.0).

Collaborator

This issue was referenced by d2da3af073a63b6ae25119ebbd2e0d3c9a1b6823

brecht commented 4 years ago
Owner

Changed status from 'Open' to: 'Resolved'

brecht closed this issue 4 years ago
Reference: blender/blender#54287