Import OBJ > 10 million faces: bug or crash #114154

Closed
opened 2023-10-25 17:13:00 +02:00 by David-ROMEUF · 26 comments

System Information
Operating system: Windows 10 22H2, i9, 64 GB RAM
Graphics card: RTX A3000, 12 GB

Blender Version
Broken: 3.6.4 LTS 2023-09-25 13:24
Worked: (newest version of Blender that worked as expected)

Short description of error
Hello, I create photogrammetric models with MetaShape software. My photogrammetric models can contain 10 million or even 150 million faces. When I import a 3D model with 10 million faces and a 4K texture via Import -> OBJ, Blender displays it and I can work on it without any problem. When I try to load a 60-million-face OBJ model, Blender loads it, but then I can't save the project: it takes a very long time and crashes. When I try to import a model with 132 million faces and a 16K PNG texture, Blender crashes. Of course I can decimate the model down to 10 million faces with a 4K texture, but all the finesse of the reconstruction is lost. Do you think there's a solution?

Exact steps for others to reproduce the error
Based on the default startup or an attached .blend file (as simple as possible).

David-ROMEUF added the Severity: Normal, Type: Report, Status: Needs Triage labels 2023-10-25 17:13:00 +02:00

Pretty sure that is a limitation of int32 indices.

Author

Can you use int64 in your code?
It's a real pity not to be able to import such detailed models into the excellent Blender.

Author

I can manipulate, visualize, and create video animations in MetaShape with 150-million-face mesh models with 32k x 32k pixel textures, but unfortunately without ray tracing. I want to produce videos for scientific articles. Best regards,


Well, maybe you can split your mesh into parts?

Member

Blender can't support more than 2 billion or so face corners in the same mesh. If you're running into that limit, or a memory usage limit, there isn't much we can do. But if it's something else, maybe there is. We would need a way to reproduce the issue though, "Exact steps for others to reproduce the error" is not complete in your report.
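
That limit can be sketched with a quick back-of-the-envelope check (assumptions: signed 32-bit element counts, so the bound is 2^31 - 1, and pure triangle meshes, so 3 face corners per face; the helper name is made up for illustration):

```python
# Sketch: per-mesh element counts bounded by a signed 32-bit integer.
INT32_MAX = 2**31 - 1  # 2_147_483_647, roughly the "2 billion" mentioned above

def fits_int32(count: int) -> bool:
    """True if an element count is representable as a signed int32."""
    return count <= INT32_MAX

# Face-corner (loop) counts for the meshes in this report, assuming
# pure triangle meshes (3 corners per face):
for faces in (10_000_000, 60_000_000, 132_000_000):
    corners = faces * 3
    print(f"{faces:>11} faces -> {corners:>11} corners, fits int32: {fits_int32(corners)}")
```

Note that even the 132-million-face model stays well under the corner-count limit itself, so a crash here would point at something else, such as memory pressure or a derived byte size overflowing.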

Hans Goudey added Status: Needs Information from User and removed Status: Needs Triage labels 2023-10-25 21:16:59 +02:00

I laughed out loud. Replacing int, easy-peasy ... 😂


Don't try to do that 😂.


You should first attach the crash files, which will typically be written into your TEMP directory. They will be named `blender.crash.txt` or `<name of file>.crash.txt`.

Those will hopefully allow us to see the code path that's crashing. There are existing bugs for large meshes already, or it could be something new.

And yes, a way to generate an equivalent .obj file locally, without other big apps, instead of doing a large transfer would also be nice.
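
A small hypothetical helper for locating those crash logs (the `find_crash_logs` name is made up; the file-name pattern and TEMP location come from the comment above):

```python
# Locate Blender crash logs: they are typically written to the system
# TEMP directory as "blender.crash.txt" or "<name of file>.crash.txt".
import glob
import os
import tempfile

def find_crash_logs(temp_dir=None):
    """Return sorted paths of *.crash.txt files in the given (or system) temp dir."""
    temp_dir = temp_dir or tempfile.gettempdir()
    return sorted(glob.glob(os.path.join(temp_dir, "*.crash.txt")))

print(find_crash_logs())
```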

Author

> I laughed out loud. Replacing int, easy-peasy ... 😂

I'm super happy to make you laugh, because life can be so sad sometimes. I understand from your message and your reaction that Blender is coded in 32 bits and that converting it to 64 bits would be a very big job? Best regards.

Author

> You should first attach the crash files, which will typically be written into your TEMP directory. They will be named `blender.crash.txt` or `<name of file>.crash.txt`.
>
> Those will hopefully allow us to see the code path that's crashing. There are existing bugs for large meshes already, or it could be something new.
>
> And yes, a way to generate an equivalent .obj file locally, without other big apps, instead of doing a large transfer would also be nice.

Thanks for your reply; here is the crash file from when I try to load an OBJ with 132 million faces and 16 4K texture files. Unfortunately I can't share these OBJ + PNG files as they haven't been published yet.
Best regards,


Not enough memory (it seems), initialization of nullptr.

Author

Illiya, I'm working on the surface morphology of the comet 67P nucleus. In the attachment is an Agisoft MetaShape anaglyph red-cyan Full HD 1920 rendering. I want to produce stereoscopic videos by adding solar light and ray-traced rendering. All the best.

Author

> Not enough memory (it seems), initialization of nullptr.

Memory usage just 1 s after the crash when loading the 132M OBJ model.


Hm, can you check 4.0 or 4.1?


The crash there does look like one of our existing bugs/design issues. Do you know how many verts are in the mesh? Not faces but actual vertices?

Author

> The crash there does look like one of our existing bugs/design issues. Do you know how many verts are in the mesh? Not faces but actual vertices?

Hi Jesse, MetaShape shows: 132,092,546 faces and 66,043,050 vertices. Vertex colors: 3 bands, uint8. Textures: 16 x 4096x4096.

Best regards, David.


Have you tried importing bare geometry, without the textures?
Blender doesn't have mipmapping, so all your textures are loaded into VRAM at full size. Maybe this is causing the crash?


@David-ROMEUF

> Hm, can you check 4.0 or 4.1?

Author

> @David-ROMEUF
>
> Hm, can you check 4.0 or 4.1?

Iliya, it crashes with 4.1 with the same OBJ + texture.

Author

> Have you tried importing bare geometry, without the textures?
> Blender doesn't have mipmapping, so all your textures are loaded into VRAM at full size. Maybe this is causing the crash?

It crashes too.


The crash remains the same as in #111575, which is the existing bug/design issue I mentioned above. Without an actual file we cannot confirm that. But yes, based on your crash file this is not due to the images, and it's probably not due to running out of memory.

I won't be looking at this issue any further unless a file can be provided, but it's most likely the same issue.

Author

> The crash remains the same as in #111575, which is the existing bug/design issue I mentioned above. Without an actual file we cannot confirm that. But yes, based on your crash file this is not due to the images, and it's probably not due to running out of memory.
>
> I won't be looking at this issue any further unless a file can be provided, but it's most likely the same issue.

Thank you, Jesse, for your expertise. Unfortunately I can't share this model with you, as the article in a refereed journal has not yet been published. I'll try to find an older model that causes the same problems with Blender and pass it on to you (through which channel?).

Otherwise, in my initial message I described another problem. This model decimated to 66 million faces loads, but afterwards it's impossible to save it as a Blender file. Blender doesn't crash, but it stays in a loop and the save never finishes...

Author

> The crash remains the same as in #111575, which is the existing bug/design issue I mentioned above. Without an actual file we cannot confirm that. But yes, based on your crash file this is not due to the images, and it's probably not due to running out of memory.
>
> I won't be looking at this issue any further unless a file can be provided, but it's most likely the same issue.

Jesse, I've prepared an OBJ with texture that crashes the OBJ import into Blender in the same way. There are 13 GB of data. Can I send you a download link, and to what email address?


You can DM me (user `deadpin`) on https://blender.chat/ where I can provide an address.

Author

> You can DM me (user `deadpin`) on https://blender.chat/ where I can provide an address.

I sent you the DM link by chat. Best regards, David.


OK, I can confirm the issue is unfortunately the same as #111575. The file is parsed into the following components:

```
totvert:   50000000
totedge:   149996085
faces_num: 99996086
totloop:   299988258
```

When it comes time to commit that mesh to the internal Blender file structure, things break because of the design issue above. It tries to store 299988258 * 8 bytes worth of UV coordinates and crashes (each UV coordinate is 8 bytes: two 4-byte values). This is because 299988258 * 8 exceeds the size of the datatype being used, and it's not very easy to just change it.
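
The arithmetic behind that diagnosis can be checked directly (a sketch; the signed 32-bit bound is assumed from the linked design issue, and the per-UV size comes from the comment above):

```python
# Sketch of the overflow: the UV buffer size in bytes no longer fits
# in a signed 32-bit integer for this mesh.
INT32_MAX = 2**31 - 1            # 2_147_483_647

totloop = 299_988_258            # face corners parsed from the OBJ
bytes_per_uv = 8                 # one UV per corner: 2 floats x 4 bytes

buffer_size = totloop * bytes_per_uv
print(buffer_size)               # 2_399_906_064
print(buffer_size > INT32_MAX)   # True: the byte size overflows int32
```

So the element count itself is fine; it is the derived byte size of the UV attribute that exceeds the 32-bit range.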

Blender Bot added Status: Archived and removed Status: Needs Information from User labels 2023-11-11 05:44:43 +01:00
Jesse Yurkovich added Status: Duplicate and removed Status: Archived labels 2023-11-11 05:44:53 +01:00
Reference: blender/blender#114154