Remesh Voxel Size can be much too small by default (leading to crashes). 'Edit Voxel Size' operator can have wrong range #77868

Open
opened 2020-06-14 23:04:19 +02:00 by RedAssassin · 84 comments

System Information
Operating system: Windows-10-10.0.17763-SP0 64 Bits
Graphics card: GeForce GTX 1660/PCIe/SSE2 NVIDIA Corporation 4.5.0 NVIDIA 432.00
Processor: AMD Ryzen 7 3800x 8-Core Processor 3.90 GHz

Blender Version
Broken: version: 2.90.0 Alpha, branch: master, commit date: 2020-06-12 17:01, hash: fd8d245e6a
(Also with Blender 2.83 BETA)

Short description of error
Adding the "Remesh" modifier, selecting "Voxel", and setting the Voxel Size below "0.01m" = PC crash. Needed to restart the PC.

Exact steps for others to reproduce the error

  1. Start Blender.
  2. Leave the default Cube as is and add the "Remesh" modifier.
  3. Make the "Voxel Size" below "0.01m".
  4. Did it crash?
    (Found this in my own Blender project, but I then tested it in a default scene. The same thing happened.)

(Why did I set the Voxel Size below "0.01m"?
It happens accidentally: when it is at, say, "0.42m" and I hold left click and slide to the left, I accidentally go below "0.01m".)

{F8619727, size = full}

Good Luck!

Author

Added subscriber: @RedAssassin44

#100877 was marked as duplicate of this issue

#99466 was marked as duplicate of this issue

#98347 was marked as duplicate of this issue

#93318 was marked as duplicate of this issue

#90272 was marked as duplicate of this issue

#88634 was marked as duplicate of this issue

#87946 was marked as duplicate of this issue

#82239 was marked as duplicate of this issue

#82072 was marked as duplicate of this issue

#78825 was marked as duplicate of this issue

#78808 was marked as duplicate of this issue

blender/blender-addons#77664 was marked as duplicate of this issue

#77556 was marked as duplicate of this issue

Added subscriber: @rjg

Changed status from 'Needs Triage' to: 'Needs User Info'

If your operating system crashes, there seems to be an issue outside of Blender. Please update the [graphics driver](https://www.nvidia.com/Download/driverResults.aspx/159887/en-us) and check whether the crash still occurs. Be aware that remeshing is very memory intensive; you may be running out of memory (see similar reports blender/blender-addons#77664 and #77556).
Ankit Meel changed title from 'Blender crashes PC. Modifier Remesh Voxel Size issue.' to 'Blender crashes PC. Modifier Remesh Voxel Size issue.' 2020-06-15 08:20:30 +02:00
Member

Added subscribers: @car313, @Alaska
Member

Added subscribers: @visionarymind, @HDMaster84, @MeshVoid
Member

Added subscriber: @ankitm
Member

Changed status from 'Needs User Info' to: 'Confirmed'
Member

{[F8620460](https://archive.blender.org/developer/F8620460/Screenshot_2020-06-15_at_11.59.39.jpeg), layout=inline}{[F8620461](https://archive.blender.org/developer/F8620461/Screenshot_2020-06-15_at_11.59.50.jpeg), layout=inline}{[F8620462](https://archive.blender.org/developer/F8620462/Screenshot_2020-06-15_at_12.00.09.jpeg), layout=inline}{[F8620463](https://archive.blender.org/developer/F8620463/Screenshot_2020-06-15_at_12.01.44.jpeg), layout=inline}{[F8620464](https://archive.blender.org/developer/F8620464/Screenshot_2020-06-15_at_12.01.17.jpeg), layout=inline}
The memory usage goes wild when reducing the voxel size to just under 0.01 (use the "reduce" button beside the slider).
Member

An easy fix would be to put a soft limit of 0.01 in the slider UI itself, and let people manually type in whatever value they want.
Member

Currently there is only a soft limit for the voxel size at 0.0001, but you can also type in something as low as 0.00002. A value of 0.001 on the default cube still works on my computer, although it takes very long to compute; but if I scale the cube down in edit mode by 0.01, then even a voxel size of 0.0001 works. I think that the solution should not be a hard limit of 0.01 in the UI.

If there should be a limit, then it should be calculated at runtime of the modifier. The modifier would check the bounding box size and make sure that memory usage stays under some reasonable amount. If the modifier did clamp the voxel size, it would give an error message to the user indicating that. Keep in mind this is about the modifier version of voxel remesh, and it makes no sense to have an extraordinarily slow modifier in the stack. So for very high resolutions the normal remesh feature in the mesh data properties would always be preferred.

IMO the optimal solution would be logarithmic/nonlinear sliders, similar to the brush radius slider in Krita. This way you would not go into the dangerous region with one click, but rather descend smoothly.
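For illustration, the logarithmic slider mapping suggested above could look like this (a hypothetical sketch, not Blender code; the function name and default range are my assumptions):

```python
def log_slider_value(t, soft_min=0.0001, soft_max=2.0):
    """Map a slider position t in [0, 1] to a value on a logarithmic scale.

    Equal drag distances multiply the value by equal factors, so the
    dangerous low end is approached gradually instead of in one click.
    """
    return soft_min * (soft_max / soft_min) ** t

# The endpoints map to the soft limits; the midpoint lands at the
# geometric mean of the range rather than the arithmetic one.
```

With this mapping, dragging halfway yields about 0.014 instead of about 1.0, so most of the slider's travel is spent in the small-value region where precision matters.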

@ankitm I can't reproduce a crash on my system with 0.001 on a subdivided Suzanne, it only consumes a large amount of memory. That in itself is not surprising, since the smaller the voxel the more memory it needs and the memory consumption grows cubically. The issue is that the `soft_min` is `0.0001` (or `9.999999747378752e-05` to be precise), which means that sliding over the value can easily result in OOM. ~~I wouldn't change the `hard_limit`, since someone with more memory or a use case like @HDMaster84 described, may want to use really small values and should be able to type them in manually.~~ Edit: I would change the `hard_limit` too, in order to avoid the arithmetic error reported in #77878.
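For intuition on why memory blows up so quickly, here is a rough back-of-envelope estimator (my own sketch; the narrow-band width and bytes-per-voxel figures are assumptions, based on the idea that a sparse grid only stores voxels near the surface):

```python
def estimate_remesh_memory_gb(surface_area, voxel_size,
                              band_width=3, bytes_per_voxel=4):
    """Rough estimate of active-voxel memory for a sparse narrow-band grid.

    Active voxels hug the surface, so their count scales with
    surface_area / voxel_size**2: halving the voxel size roughly
    quadruples the active voxels (a dense grid would grow cubically).
    """
    active_voxels = (surface_area / voxel_size ** 2) * band_width
    return active_voxels * bytes_per_voxel / 1e9

# Default 2 m cube (surface area 24 m^2):
#   voxel_size 0.01   -> a few MB
#   voxel_size 0.0001 -> tens of GB, in the ballpark of the 27 GB quoted above
```

The point is the 1/h² scaling: stepping from 0.01 straight to the soft minimum of 0.0001 multiplies the estimate by 10,000.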

*This comment was removed by @rjg*

> In #77868#953889, @HDMaster84 wrote:
> Currently there is only a soft limit for the voxel size at 0.0001, but you can also type in something as low as 0.00002. A value of 0.001 on the default cube still works on my computer, although it takes very long to compute; but if I scale the cube down in edit mode by 0.01, then even a voxel size of 0.0001 works. I think that the solution should not be a hard limit of 0.01 in the UI.
>
> If there should be a limit, then it should be calculated at runtime of the modifier. The modifier would check the bounding box size and make sure that memory usage stays under some reasonable amount. If the modifier did clamp the voxel size, it would give an error message to the user indicating that. Keep in mind this is about the modifier version of voxel remesh, and it makes no sense to have an extraordinarily slow modifier in the stack. So for very high resolutions the normal remesh feature in the mesh data properties would always be preferred.
>
> IMO the optimal solution would be logarithmic/nonlinear sliders, similar to the brush radius slider in Krita. This way you would not go into the dangerous region with one click, but rather descend smoothly.

I agree with Henrick. I'm just finishing sculpting quite a hefty model in Blender. A hard limit of 0.01 is not a solution, because you definitely use such small values when you scale a model down: the details are small, but you still need that geometric density to sculpt. It all depends on the type of model and its scale, so a hard limit is a hacky solution. To be honest, I would argue that your computer running out of memory because you created geometry that is too dense is not a bug; that is your fault.

Having no notification that the geometry might get too dense and that you'll most likely crash (as ZBrush gives once you subdivide a model too much) should be considered more of a UI paper cut, I guess. You can crash any sculpting software like that if you don't know what you're doing and which values are viable. You can easily crash ZBrush the same way if you overdo Dynamesh values or overly subdivide your model.
Member

@rjg for me `bpy.context.object.modifiers["Remesh"].bl_rna.properties["voxel_size"].soft_min - 0.0001` is basically zero, so the `soft_min` matches. The small error is due to a float not being able to store that number exactly (see https://www.h-schmidt.net/FloatConverter/IEEE754.html).
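The rounding can be reproduced in plain Python by forcing the value through 32-bit storage (a standalone illustration using the standard `struct` module, no Blender required):

```python
import struct

def as_float32(x):
    """Round-trip a Python float through IEEE 754 single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

# 0.0001 has no exact binary representation, so the stored soft_min is
# the nearest 32-bit float instead:
print(as_float32(0.0001))  # 9.999999747378752e-05
```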

@HDMaster84 You're right, I misread it and thought there was one erroneous extra zero digit. The actual deviation is obviously coming from floating point precision.
Member

Edited the comment up there: #77868#953819. It felt surprising to me that going from 0.03 > 0.02 > 0.01 > (the number here) created such a huge increase. What I had was the default cube with nothing else.

> I can't reproduce a crash on my system with 0.001 on a subdivided Suzanne, it only consumes a large amount of memory.

Just to clarify: is anyone opposing the confirmation of the report?
Member

@ankitm No, it definitely crashes when decreasing from 0.01 with the left arrow, which goes straight to the `soft_min` at 0.0001. But only on the default cube and objects of similar size or bigger.
Member

#77878 is related to the decision what the hard limit should be.

@HDMaster84 What is a suitable `soft_min` when the required memory is relative to the size of the object? Remeshing a 50 m cube with a voxel size of 0.01 will have the same effect, allocating about 27 GB. In a perfect world the modifier would base the limits on the available resources. The next best thing would be dynamic limits based on the object's bounding box. I'm not sure there is a good solution that fits well with Blender's current concept of limits on properties.

~~If I'm not mistaken, #77878 is a duplicate of #72747, fixed in 6a54969cf1.~~ Never mind, that is for the voxel remesher, but generally related. Making too many mistakes this morning.
Member

@rjg

> In #77868#953889, @HDMaster84 wrote:
> Currently there is only a soft limit for the voxel size at 0.0001, but you can also type in something as low as 0.00002. A value of 0.001 on the default cube still works on my computer, although it takes very long to compute; but if I scale the cube down in edit mode by 0.01, then even a voxel size of 0.0001 works. I think that the solution should not be a hard limit of 0.01 in the UI.
>
> **If there should be a limit, then it should be calculated at runtime of the modifier. The modifier would check the bounding box size and make sure that memory usage stays under some reasonable amount. If the modifier did clamp the voxel size, it would give an error message to the user indicating that. Keep in mind this is about the modifier version of voxel remesh, and it makes no sense to have an extraordinarily slow modifier in the stack. So for very high resolutions the normal remesh feature in the mesh data properties would always be preferred.**
>
> IMO the optimal solution would be logarithmic/nonlinear sliders, similar to the brush radius slider in Krita. This way you would not go into the dangerous region with one click, but rather descend smoothly.

@HDMaster84 I definitely shouldn't be allowed on the internet without caffeine. Sorry for missing so many obvious things today. Do you know of any modifier that currently implements dynamic limits? I don't think that's the case, but given how many things I was already wrong about today ...
Member

@rjg I don't think there is a modifier with dynamic limits yet. The general assumption is that you know which values you are allowed to enter without crashing Blender. There are lots of modifiers that can crash Blender in a similar way (Array, Screw, Bevel, Subdivision Surface). All of those other modifiers, including the other modes of Remesh, are designed so that performance degrades slowly as you click or drag the slider, so you can intuitively see where the limits of your computer are. The voxel size is the only field that jumps to a crash very rapidly (literally one click). That's exactly what I was getting at in the last paragraph of my comment #77868#953889 when talking about nonlinear sliders.

What should also be considered is that, if your object is very big, adding the modifier will already crash Blender. In that case an error message raised by a dynamic limit would be a lifesaver.

@HDMaster84 The issue with dynamic limits based on available memory is that it is not easy to know how much is actually available, due to virtual memory, swap, and other programs (de)allocating alongside Blender. I don't think this is the right approach. As you've said, there are already plenty of ways to run out of memory; it just shouldn't be this easy to do so unintentionally.

The different scaling of the increments is a nice idea. What I meant in my comment was that instead of an absolute voxel size, one could use a factor that determines the voxel size based on the object's bounding box. This would solve the problem, including the case where the modifier is added to a large object.
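The bounding-box-relative idea could be sketched like this (hypothetical, purely to illustrate; the function name and factor value are made up):

```python
def voxel_size_from_factor(bbox_dims, factor):
    """Derive an absolute voxel size from a dimensionless factor.

    The factor is relative to the largest bounding-box dimension, so the
    voxel count per axis stays bounded no matter how large the object is.
    """
    return factor * max(bbox_dims)

# A factor of 0.005 yields 200 voxels along the longest axis, whether
# the cube is 2 m or 50 m wide.
```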
Member

@rjg no, please not that! The bounding box size changes all the time when using different modifier combinations or shape keys, and I don't like the effect the other remesh modes give me. The other remesh modes do exactly that: they have an octree depth and a scale to scale and subdivide the bounding box. The issue is that this would render the new voxel remesh as useless as the others for metaball-like animations. I want the grid to stay where I specified it, with a certain size that I can easily control. I don't think there is a reasonable way around nonlinear sliders.

Also, nonlinear sliders, once implemented, could be used everywhere in Blender (e.g. all merge thresholds, the voxel size for remesh in the properties and in sculpt mode, most scale sliders, the radius in painting and sculpt mode, ...), so it would be a very useful addition to Blender in general.

Added subscriber: @PabloDobarro

@HDMaster84 I see, that's a valid point. I like the non-linear slider idea. Perhaps there is an even better solution for this particular modifier, but I'm not familiar with the implementation of the remesher and OpenVDB.

@PabloDobarro since you've added the voxel mode to the modifier, perhaps you could take a look at this?
Author

**Robert Guetzkow (rjg)** (Can't tag you; whenever I try, I tag someone else.) To the first comment you sent: I updated my graphics driver and tried it again. It crashed, and I needed to restart my PC. I have 16 GB of RAM, so I don't think that is the problem, because going upwards I can go higher than something like 20000000000m. Also, going from 0.1 down to 0.01 is smooth; it works OK. But then suddenly, when I go below 0.01, it just freezes and I need to hard-reset my PC to be able to restart it. So, with an updated graphics driver and recommended hardware/memory specs, it still goes down.

@RedAssassin44 As discussed in the comments above, going below 0.01 can very easily exceed the available memory. At 0.0001, which is one step after 0.01, it takes approx. 27 GB with the default cube. Depending on the available swap space, this may result in Blender crashing or being frozen for a long time. Your OS may react very slowly too, until either Blender finishes processing or the OS terminates Blender for running out of memory. Note that the OS freezing/reacting slowly is not the same as a crash. A crash means that it terminates unexpectedly; it could be displaying a blue screen, or the computer simply turns off/restarts on its own.
Member

Added subscriber: @daendew

Added subscriber: @arsdever

Added subscriber: @zelfit

Happened a couple of times for me too. Crashed with 64 GB of RAM.
Maybe it would be better to show some kind of warning before proceeding past a certain threshold. A similar example from Maya: if you try to subdivide a dense mesh you receive a warning message, from which you can confirm or cancel.

In 3ds Max there is a handy option to cancel the currently executing task by pressing `ESC`.
Member

@rjg I just found out that the Array modifier has a limit at which it raises an error without crashing Blender. It limits the vertex count to 67 million. We could do the same for the voxel remesher, so it would abort if it would generate more than, say, 2^32 voxels (computed from the bounding box volume). For a simple cube that would be 24 million vertices with 2000 voxels per dimension (2000^2*6=24*10^6). That example still works on my 8 GiB computer. For more complex objects it would be more vertices, up to almost the voxel count (or a fraction of it? I am not entirely sure about the upper limit for vertices here). Then it will crash on my computer due to lack of memory, but may still work for others.
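A pre-flight check along these lines could look like this (my own illustration, not the actual modifier code; the 2^32 budget is the figure suggested above, and the dense bounding-box count is used as a conservative upper bound on what the grid could allocate):

```python
import math

MAX_VOXELS = 2 ** 32  # hypothetical budget, per the suggestion above

def exceeds_voxel_budget(bbox_dims, voxel_size, max_voxels=MAX_VOXELS):
    """Return True if a dense voxel grid over the bounding box would
    exceed the budget, in which case the modifier could abort with an
    error instead of running out of memory."""
    counts = [max(1, math.ceil(d / voxel_size)) for d in bbox_dims]
    total = counts[0] * counts[1] * counts[2]
    return total > max_voxels

# Default 2 m cube: within budget at 0.01 (~200^3 voxels),
# far over budget at 0.0001 (~20000^3 voxels).
```

The same check would also catch the case mentioned earlier where merely adding the modifier to a very large object crashes Blender.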

@HDMaster84 Sounds like a good idea for a heuristic. Since I'm not familiar with the algorithm used for the voxel remesher and currently busy with my thesis, this should be looked at by @PabloDobarro or other developers familiar with the memory requirements and runtime complexity.

Added subscriber: @mano-wii

It doesn't crash here, but the computation time is still inconvenient.
Perhaps the modifier could raise warning messages, but it also seems like a good idea to restrict the slider's soft range when click-dragging with the mouse.

```diff
diff --git a/source/blender/makesrna/intern/rna_modifier.c b/source/blender/makesrna/intern/rna_modifier.c
index a891194550f..4c715bbb747 100644
--- a/source/blender/makesrna/intern/rna_modifier.c
+++ b/source/blender/makesrna/intern/rna_modifier.c
@@ -5557,7 +5557,7 @@ static void rna_def_modifier_remesh(BlenderRNA *brna)
   prop = RNA_def_property(srna, "voxel_size", PROP_FLOAT, PROP_DISTANCE);
   RNA_def_property_float_sdna(prop, NULL, "voxel_size");
   RNA_def_property_range(prop, 0.0001f, FLT_MAX);
-  RNA_def_property_ui_range(prop, 0.0001, 2, 0.1, 3);
+  RNA_def_property_ui_range(prop, 0.01, 2, 0.1, 3);
   RNA_def_property_ui_text(prop,
                            "Voxel Size",
                            "Size of the voxel in object space used for volume evaluation. Lower "
```

Member

@mano-wii The reason we more or less agreed that this was not the solution is that 0.0001 is quite reasonable when your bounding box is 0.02 instead of 2. There are lots of people (that I know) who don't follow the convention of using a unit system where objects are "close" to a size of 1. They would want to use 0.0001 then.

If you want to implement a quick solution, doing it this way would be nice:

In #77868#982749, @HDMaster84 wrote:
@rjg I just found out that the Array Modifier has a limit at which it reports an error instead of crashing Blender: it caps the vertex count at 67 million. We could do the same for the voxel remesher, so it would abort if it would generate more than, say, 2^32 voxels (computed from the bounding box volume). For a simple cube with 2000 voxels per dimension that would be about 24 million vertices (2000^2 * 6 = 24 * 10^6). That example still works on my 8 GiB computer. For more complex objects it would be more vertices and can go up to almost the voxel count (or a fraction of it? I am not entirely sure about the upper limit for vertices here). Then it will crash on my computer due to lack of memory, but may still work for others.

Added subscriber: @Marinus

I am having the same issue on macOS Mojave and Blender 2.90.
It is not exactly the same, but out of nowhere Blender, and eventually the entire computer, hangs on simply adding the Remesh modifier, even with the object hidden. The standard voxel size of 0.1 m wasn't an issue yesterday, but now, even after a reboot, it doesn't want to work. It even crashes other programs when I add the Remesh modifier.

Added subscriber: @Sandstedt

I have the same problem: Blender crashes while trying to apply a Remesh modifier to a fairly complex mesh. The problem first occurred after cranking the regular Octree Depth up to around 9 and then switching to Voxel to see what it was. Now the default is the Voxel mode with a 0.1 m voxel size, which crashes the program every time.

I tried to uninstall Blender and install the latest 2.90.1, but it seems some settings files were left behind, because it was still stuck on the Voxel preset every time I applied the modifier to a new mesh.

Managed to get around it by applying the modifier in Edit Mode, and then switching back to Sharp instead.

Member

Added subscribers: @Mashimaro7, @manzanilla

Added subscriber: @rocketman

Added subscriber: @Dragosh

Remeshing a cube of 20,000 m x 20,000 m x 2,000 m with a voxel size of 0.1 crashes Blender after it devours all of my 32 GB of RAM, running the latest Blender release.

Added subscriber: @gert7

Added subscribers: @Wiczus, @Calra

It may not be my place to suggest, but I feel that this would be quickly solved by replacing "Voxel Size" with its reciprocal, "Voxels Per Unit".

For instance: voxels per unit = 100 would be the same as voxel size = 0.01.

When larger numbers correspond to greater complexity, there's little chance of accidentally crashing Blender with a single value. You don't even have to bother with non-linear sliders.

Member

@rocketman One problem I can see with that is versioning. The way to do it with correct versioning would be to add a combo box with multiple choices: [voxel size, voxels per unit, voxel count, etc.]. That is how "Points to Volume" in Geometry Nodes solved it. That also solved the problem of bad defaults by using voxel amount as the default option. Just making it reciprocal would not change the fact that adding the modifier to a mesh can already crash Blender (out of memory), because the default cannot be good for all situations.

Removed subscriber: @daendew

Added subscriber: @SquidlyPerson77

I ran into this problem today by accident. I have:
Ryzen 3400G
16 GB (2x8 GB) RAM

I slid the slider to 0 by accident and Blender stopped responding. My video (playing in a browser) stopped and my system started to lag badly (my mouse would stop responding at times). However, I was able to slowly make my way to the Task Manager, where I found that Blender had driven my RAM usage up to ~90-95%. I killed the program and my RAM usage went down. My theory is that nothing limits RAM usage here, which crashes most PCs with a low amount of RAM.

Added subscribers: @Microweb4, @DarkKnight

Added subscriber: @ideasman42

@HDMaster84 @ideasman42 Can this issue be closed, as 3.0.0 will have the logarithmic slider?

Member

The problem described by the user is fixed.

The open problem, mentioned by multiple users in the comments here, is that you could apply the modifier to an object which is out of scale (like 1000 bu in size). This could be solved by

  1. adding options like in the equivalent Geometry Nodes node to scale with size (as an additional new default option)
  2. changing the default to Smooth or Blocks etc., because they only use relative sizes
  3. adding a constraint on how many voxels the modifier is allowed to create (similar to how it is done in the Array Modifier)
Added subscriber: @brandonglockaby

Accidentally dragging the slider to the left crashes hard on the default cube, and I have 32GB of memory. Maybe the lower limit is too low...

Member

Added subscribers: @samrrr, @lichtwerk

Member

Should this also take the scene's unit scale into account?

Added subscriber: @ignacio.borderes

I accidentally typed a negative value into that modifier and it crashed. I reproduced the steps with the modifier disabled in the viewport and realized that any value below 0.0001 mm, including negatives, is clamped to the minimum of 0.0001 mm voxel size. This could still happen with a logarithmic slider (I never use the slider), so there might be a better foolproof way of implementing this. Ideally I would love an advanced configuration option where I could set default, minimum, or maximum values for modifiers; that would be very useful.

Added subscriber: @AndrewPalmer

I've also noticed that simply having a very large object will cause Blender to crash or hang for a long time when using Remesh. I work with Unreal a lot, and sometimes export meshes from Unreal to tweak in Blender. Since the default unit in Unreal is centimetres, objects come into Blender looking the correct size, but the scale is set to 0.01, so an object that looks about 4 meters wide is actually 400 meters wide when the scale is set to 1.0, and putting a Remesh modifier on it will hang or crash Blender. I think part of the reason it's so frustrating is that there doesn't appear to be any way to change a modifier's settings without adding it first.

I think Henrik's suggestion of adding an option to take into account scene and object scale and enabling it by default would go a long way to avoid frustrating users who did not notice that their object was actually enormous because it was scaled.

However, to solve this issue for a wider variety of cases, how about checking how many voxels will be needed to remesh the object and use this number to determine whether or not to intervene in some way? This could be in the form of adding a prompt that allows the user to cancel and set a larger voxel size, or setting the voxel size based on some predefined maximum voxel count and the dimensions of the object in order to make sure it doesn't crash the user's machine. Another option would be to add the modifier in disabled state if it looks like it was going to be heavy and showing a note on the modifier itself until the user takes some action. I know these ideas may sound a bit heavy handed, but if it prevents Blender from crashing and the user losing time and possibly work, I think it's worth doing.

Member

Added subscriber: @Kendall-3

Philipp Oeser changed title from Blender crashes PC. Modifier Remesh Voxel Size issue. to Remesh Voxel Size can be much to small by default (leading to crashes). 'Edit Voxel Size' operator can have wrong range 2022-05-24 09:06:51 +02:00

Added subscriber: @SRZ

Member

Added subscribers: @donutblender, @PratikPB2123

Removed subscriber: @arsdever

Philipp Oeser removed the Interest: Sculpt, Paint & Texture label 2023-02-10 09:12:29 +01:00
Reference: blender/blender#77868