Export FBX doubles vertices for Bevel Harden Normals #104526

Closed
opened 2023-04-04 10:59:32 +02:00 by Grigoriy Titaev · 10 comments

System Information
Operating system: Windows 10

Blender Version
Broken: 3.5.0, Add-on FBX 4.37.5
Broken: 3.6.0, Add-on FBX 5.2.0

Short description of error
Export FBX doubles vertices for Bevel Harden Normals

Exact steps for others to reproduce the error

  1. Create Cube
  2. Delete UVMap from Cube
  3. Add Bevel modifier with one segment, enable Harden Normals
  4. Export cube to FBX file "Cube 1.fbx"
  5. Import "Cube 1.fbx"
  6. Export imported cube to new FBX file "Cube 2.fbx"

Open these two FBX files in another program (Unreal Engine, Marmoset Toolbag, Microsoft 3D Viewer) and look at the number of vertices.

  • Cube from "Cube 2.fbx" has 24 vertices.
  • Cube from "Cube 1.fbx" has 48 vertices, although it should be 24.

Unnecessary vertices are undesirable for game optimization.

"Harden Normals" creates Sharps, and perhaps "Export FBX" saves two identical normals for each vertex.

Grigoriy Titaev added the
Priority
Normal
Type
Report
Status
Needs Triage
labels 2023-04-04 10:59:33 +02:00

I can't reproduce the problem with the steps described. For one thing, 24 vertices cannot represent the described geometry.

In this case, 24 would represent the corners of a cube without the bevel (6 * 4).

48 are the corners of the described and triangulated geometry.

The custom normals probably need to be provided.

Germano Cavalcante added
Status
Needs Information from User
and removed
Status
Needs Triage
labels 2023-04-04 19:04:51 +02:00

Here is what it looks like.

Files "Cube 1 (3.5.0) - 48 vertices.fbx" and "Cube 2 (3.5.0) - 24 vertices.fbx" are from the attached zip file.

Unreal Engine:
![Cubes - UE5.png](/attachments/4a540c45-23b6-494f-a32a-b67a13aafcd3)

Marmoset Toolbag:
![Cubes - Marmoset Toolbag.png](/attachments/8c7593bf-1728-4cc2-ae13-c4c137178a5e)

Can you provide the `Cube 1 (3.5.0) - 48 vertices.fbx` file?

> Can you provide the `Cube 1 (3.5.0) - 48 vertices.fbx` file?

Here are all the FBX files: [Test Files - Export Cube - Harden Normals.zip](/attachments/fb697b50-9001-4d5e-8177-95a756ba953d)

I converted both files to JSON for parsing and I could see they both report 27 vertices.
They are even the same size.
I haven't found anything that might indicate a problem with Blender.

Attached are the files as JSON in case you want to analyze them yourself.

Perhaps your converter is not working correctly, or it is doing some kind of optimization.

I used the Filestar converter: https://filestar.com/skills/fbx/convert-fbx-to-json

[Cube 1 (3.5.0) - 48 vertices - filestar.json](/attachments/7f353dc9-1a0a-4688-af85-f44a38595f91)
length `meshes.vertices` = 144
length `meshes.normals` = 144
144 / 3 = 48

[Cube 2 (3.5.0) - 24 vertices - filestar.json](/attachments/885b8c0a-56a4-41e9-96c4-c513ac42fb5f)
length `meshes.vertices` = 72
length `meshes.normals` = 72
72 / 3 = 24
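The divide-by-3 arithmetic above can be checked mechanically. A minimal sketch, assuming a Filestar-style mesh dict whose `vertices` field is a flat `[x, y, z, x, y, z, ...]` array (the field layout is inferred from the lengths reported above):

```python
# Hypothetical sketch: count vertices in a Filestar-style mesh dict,
# where "vertices" is a flat [x, y, z, x, y, z, ...] array.
def count_vertices(mesh):
    assert len(mesh["vertices"]) % 3 == 0  # one vertex per 3 components
    return len(mesh["vertices"]) // 3

print(count_vertices({"vertices": [0.0] * 144}))  # 48 (Cube 1)
print(count_vertices({"vertices": [0.0] * 72}))   # 24 (Cube 2)
```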
Member

The `fbx2json.py` converter in the `blender\scripts\addons\io_scene_fbx` folder of the FBX add-on is pretty much a 1:1 conversion, but the Filestar converter seems to be converting to some different 3D model format that happens to be represented as .json.
I get 24 vertices for all four of the .fbx files. I ran them through `fbx2json.py` and counted the number of elements in the "Vertices" arrays, divided by 3, since there is an x, y, and z component for each vertex.

The behaviour you are seeing of the number of vertices increasing is consistent with software that can't store normals per-loop (per-polygon-corner) and so must store them per-vertex instead.

What appears to be happening is that Unreal/Marmoset is able to see that in some cases, multiple loops that use the same vertex also have an identical (or close enough to identical) normal, so it doesn't need to duplicate the vertex for these loops.
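That splitting behaviour can be sketched in a few lines. The `split_vertices` helper and its loop format below are hypothetical, not Unreal's or Marmoset's actual code; they only illustrate why differing per-loop normals force vertex duplication:

```python
# Hypothetical sketch of an importer that stores normals per-vertex:
# loops (polygon corners) sharing a vertex reuse it only when their
# normals compare equal; otherwise the vertex is duplicated.
def split_vertices(loops, same_normal):
    """loops: (vertex_index, normal) pairs, one per polygon corner.
    Returns the number of output vertices after splitting."""
    seen = {}  # vertex_index -> list of distinct normals encountered
    for v, n in loops:
        normals = seen.setdefault(v, [])
        if not any(same_normal(n, m) for m in normals):
            normals.append(n)
    return sum(len(ns) for ns in seen.values())

exact = lambda a, b: a == b
# Identical normals on a shared vertex: no duplication.
print(split_vertices([(0, (0.0, 1.0, 0.0)), (0, (0.0, 1.0, 0.0))], exact))  # 1
# Differing normals on the same vertex: the vertex is split in two.
print(split_vertices([(0, (0.0, 1.0, 0.0)), (0, (1.0, 0.0, 0.0))], exact))  # 2
```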

In your Cube 1 export, the per-loop normals for the large square faces on each side appear to have no floating-point inaccuracy; they're all made up of clean-looking `0.0`, `1.0`, and `-1.0` values. The rest of the faces have small inaccuracies, so they end up with values like `2.142786979675293e-05` where they should be `0.0`, or `-0.9999999403953552` where they should be `-1.0`.

When you've re-imported Cube 1 and exported it as Cube 2, you've introduced additional floating-point inaccuracies that have affected most of the clean looking normals.

My suspicion would be that, when Unreal/Marmoset compares the normals to determine if they're the same, it sees `0.0` and `2.142786979675293e-05` and considers them different, but when it sees `2.79843807220459e-05` and `1.4007091522216797e-05` it considers them close enough to be the same. `0.0` can cause problems when comparing the similarity of floating-point values, particularly in cases where a relative tolerance is used.
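Python's `math.isclose` illustrates the pitfall (the actual comparison Unreal/Marmoset use is unknown; this only shows why an exact `0.0` defeats a purely relative tolerance):

```python
import math

zero, noisy = 0.0, 2.142786979675293e-05  # normal components from Cube 1

# Relative comparison: |x - y| <= rel_tol * max(|x|, |y|).
# max(|x|, |y|) is tiny here, so nothing nonzero is ever "close" to 0.0.
print(math.isclose(zero, noisy, rel_tol=1e-3))  # False

# An absolute tolerance floor handles the near-zero case:
print(math.isclose(zero, noisy, abs_tol=1e-4))  # True
```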

The 1st loop normal of Cube 1 is `0.0, -1.0, 0.0`.
The 22nd loop normal of Cube 1 is `2.142786979675293e-05, -0.9999999403953552, -4.500150680541992e-06`.
I suspect that Unreal/Marmoset considers these to be different and thus creates an extra vertex.

The 1st loop normal of Cube 2 is `2.79843807220459e-05, -0.9999999403953552, 1.3947486877441406e-05`.
The 22nd loop normal of Cube 2 is `1.4007091522216797e-05, -1.0, -2.8073787689208984e-05`.
I suspect that Unreal/Marmoset considers these to be the same, so it discards one of the normals instead of creating a new vertex.

There are a total of 24 loop normals for all of the large square faces (one for each corner of each face), all of which contain at least one component that is `0.0` in Cube 1. Unreal/Marmoset produces 24 extra vertices for Cube 1, which would match up with what I'm suspecting.


Here is a simpler example:

![Test Plane - Harden Normals.png](/attachments/e4e198ac-53d8-4a89-8112-289c9d7bce4e)
Files: [Test Plane - Harden Normals.zip](/attachments/f50f30dc-64b7-43bf-ad71-079efbeda49e)

Something is wrong with "Bevel - Harden Normals":

  • the modifier creates unnecessary sharps and a double normal per vertex,
  • or Export FBX saves double normals,
  • or Import FBX loses data.
Member

Marmoset/Unreal are creating additional vertices upon importing the .fbx that do not exist in the .fbx itself. This is the expected behaviour for a lot of software when they see split normals.

You can import the .fbx straight back into Blender to see that the number of vertices is not increased from when the mesh was exported. Unity also imports all of the original Cube 1 and Cube 2 .fbx with 24 vertices as expected.

I've attached the original `Test Export Cube - Harden Normals.blend` but with the cube rotated slightly, along with the exported .fbx. Try importing that rotated cube into Unreal/Marmoset; I suspect you will only get 24 vertices, because none of the xyz components of the normals will be exactly `0.0`. If so, I suspect this is a bug in Unreal and Marmoset Toolbag being unable to properly account for `0.0` when comparing the similarity of normals (since Unity has no problems).

Possibly, there could also be an issue in the Bevel Modifier that is incurring additional floating-point inaccuracy for some faces, or possibly in the `Mesh.calc_normals_split()` function that is used to calculate the split (per-loop) normals that are then exported. But I wouldn't say there is anything the FBX exporter can do about this.


Since the original problem is actually the way other software handles the normals in corners, I am closing the report.

Problems with float precision in modifiers are not considered a bug, it's more of an improvement request.

It is understandable that `Mark Sharp` is disregarded by FBX when normals are defined explicitly: the sharp mark is only used to split normals, or by the modifiers.

If you think you found a bug, please submit a new report and carefully follow the instructions. Be sure to provide a .blend file with exact steps to reproduce the problem.

Blender Bot added
Status
Archived
and removed
Status
Needs Information from User
labels 2023-04-05 18:09:37 +02:00

Reference: blender/blender-addons#104526