Crash when rendering large (>2GB) voxel data image sequence on Windows, macOS, Ubuntu #46696
System Information
Windows 7 x64 - NVIDIA Quadro K5100M
32GB RAM
Blender Version
2.76
48f7dd6
Short description of error
Blender crashes when rendering a voxel data image sequence larger than 2GB.
Exact steps for others to reproduce the error
1. On a cube, add a material/surface and set Diffuse Intensity and Shading Emit to 0.
2. Add a Voxel Data texture of type Image Sequence and import an image sequence with a total size above 2GB; set the number of images to use.
3. Set Mapping to Generated and Projection to Cube, and set the Influence on Diffuse Intensity and Shading Emit to 1.
4. Display the cube in the 3D View in Rendered preview mode: memory usage climbs to around 13GB and then Blender crashes.
I used an image sequence of 643 .tif files at 2041x2041 pixels, 2.45GB in total. Loading only 500 files (below 2GB) works.
The behaviour is exactly the same on another Windows 7 x64 PC and on a macOS iMac (Radeon R9 M395X). On Ubuntu (14.04 LTS) it crashes immediately when I try to open the image sequence, with the image number set to either 643 or 500.
I tried with .jpg files and got the same crash.
Thank you for your help.
grundraisin
Changed status to: 'Open'
Added subscriber: @grundraisin
Added subscriber: @mont29
I do not have enough memory to fully test that here (2GB of 8-bit images means 8GB of float voxels…). I would suspect some int vs size_t mismatch, but the code looks OK here.
In fact, I'd say that the fact it climbs to 13GB before crashing makes it look like a failed allocation (finding a contiguous chunk of 10GB of free memory can be challenging, even with 32GB available, especially after the machine has been running for a while). Did you try right after a cold start of your machine?
And finally, as usual, it would be simpler if we had at least a .blend file (even better if you can share the voxel data too, of course).
Ok, I will send you the files when back home.
In the meantime, regarding memory usage: I ran Blender right after turning on the second Windows 7 PC (64GB RAM), and did the same on the Ubuntu PC.
On the two Windows 7 PCs, RAM usage increases to around 13GB, then drops, then increases again, drops again, increases, and finally crashes (a saw-blade histogram).
Here is the link to download the image sequence (I just edited it; the previous link was not good):
https://filesender.switch.ch/filesender/?vid=27912cf3-f73f-4068-62fd-000023afe697
And here the blend file:
LiverMito.blend
Added subscriber: @LazyDodo
calloc in load_frame_image_sequence (voxeldata.c) returns null and we dereference it without checking, hence the crash.
However, it shouldn't matter how long a machine has been up: the virtual memory manager will give you a contiguous space, backing it with the paging file if necessary, as long as there is a contiguous block available in the virtual address space. I checked with VMMap: there is a ~8TB chunk of unfragmented free memory sitting there, calloc just refuses to take it.
OK, the null is my bad (we should probably still check for it, though). I replaced the calloc with a VirtualAlloc call, which still failed, but at least gave me a more descriptive error (ERROR_COMMITMENT_LIMIT, 1455 (0x5AF): "The paging file is too small for this operation to complete."). Increased my paging file, all good! Until it started loading, and crapped out at frame 525.
https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/HEAD:/source/blender/render/intern/source/voxeldata.c#l197
BLI_VOXEL_INDEX(x, y, z, vd->resol)
x, y, z and vd->resol are all ints, so the result is an int, and a negative one once you hit high numbers.
https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/HEAD:/source/blender/render/intern/source/voxeldata.c#l158
Change int to size_t and we should be good to go.
And wrong once more: most of the code in blenlib/intern/voxel.c also uses ints to index the array.
This issue was referenced by
1ffdb1b472
Changed status from 'Open' to: 'Resolved'
A null check here
https://git.blender.org/gitweb/gitweb.cgi/blender.git/blob/HEAD:/source/blender/render/intern/source/voxeldata.c#l181
would have been nice.
Dear all,
Thank you for your work: you are really fast and efficient!
I just finished compiling Blender from the latest source. It works for my image sequence of 643 images at 2041x2041 pixels, but crashes immediately when I try to load a bigger image sequence of 643 images at 4041x4041 (total size of 9.78GB).
Cheers,
grundraisin
Added subscriber: @ideasman42
@grundraisin, hi, with very large voxels it's possible this is running out of memory (instead of an int overflow).
Can you run Blender from the command line and see if you get a warning? e.g.
Calloc returns null
or
Malloc returns null
Downloading such large files is an option, but it would be good to rule out a simple failed allocation first.
The internal format for voxels is float (4 bytes per voxel), so your dataset is actually 643*4041*4041*4 = 41999939532 bytes (nearly 42GB). So yeah, probably a memory allocation failure. That being said, it would be nicer to handle this situation with slightly more grace and check the pointer before we dereference it, so we don't blow up in the user's face.
Dear Campbell, dear LazyDodo,
TRUE! The bug is really fixed! Sorry for my incomplete test. My computer has 32GB and crashes when I load this 9GB image sequence, but another computer with 64GB can load the data. It takes a really long time to load, but works.
Do you know if there is some improvement planned, as Campbell suggests in this post:
http://blender.stackexchange.com/questions/27690/why-does-blender-use-so-much-memory-for-large-textures
Thank you for all your work, it's impressive! Chapeau!
Jean