Cycles render test: Failing oneAPI tests #125088

Open · opened 2024-07-19 17:13:16 +02:00 by Alaska (Member) · 5 comments

System Information
Operating system: Windows-10-10.0.22631-SP0 64 Bits
Graphics card: Intel(R) Arc(TM) A750 Graphics Intel 4.6.0 - Build 32.0.101.5762
Blender version: 4.3.0 Alpha, branch: main, commit date: 2024-07-17 23:28, hash: 0b70a9edc56b

Short description of error
Blender has a collection of render tests that can be used to test for breaking changes. On the Cycles side this can also be used to verify that GPU render results generally align with the CPU.

I have noticed that a few tests are failing when running the render tests with oneAPI, one of which is specific to Embree GPU, and the Cycles team would like these issues to be fixed when possible.

This task is to catalog failing render tests for developers to use as reference when fixing the issues.

Exact steps for others to reproduce the error

  1. [Build Blender locally](https://developer.blender.org/docs/handbook/building_blender/) with [GPU support](https://developer.blender.org/docs/handbook/building_blender/cycles_gpu_binaries/)
  2. Set the `CYCLES_TEST_DEVICE` CMake option for Blender to `ONEAPI-RT`
  3. Run Blender's tests. This can generally be done with `make test` from the Blender source code folder, but more information can be found [here](https://developer.blender.org/docs/handbook/testing/setup/)
  4. The automated tests will report that some tests are failing, and direct you to a local web page to review them.

Note: After running `make test` at least once, you can switch to running `ctest -R cycles -C release` inside the Blender build folder (e.g. `/blender-git/build_windows`) to only run the Cycles tests for faster iteration.
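For reference, here is a rough shell sketch of these steps (paths and folder names are just examples; `CYCLES_TEST_DEVICE` can also be edited in `CMakeCache.txt` or via `cmake-gui` instead of on the command line):

```bash
# Sketch of the reproduction steps above, assuming a GPU-enabled build
# directory at /blender-git/build_windows next to the source checkout.

# Point the Cycles render tests at the oneAPI device (CMake cache variable).
cmake -D CYCLES_TEST_DEVICE=ONEAPI-RT /blender-git/build_windows

# Run the full test suite once from the source folder.
cd /blender-git/blender
make test

# Afterwards, iterate faster by running only the Cycles tests from the build folder.
cd /blender-git/build_windows
ctest -R cycles -C release
```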


Note: All file paths are provided relative to the render tests folder.
Locally this folder can be found at `blender_repo_folder/tests/data/render/`. This folder can also be found online in the `render` folder of the [blender/blender-test-data](https://projects.blender.org/blender/blender-test-data) repo.

List of failing tests (a sketch for re-running just these tests follows the list):

  • Noise differences
    • The tests listed here have differences in noise compared to the CPU render. Nothing appears obviously broken, so fixing these is not a high priority. These issues occur with Embree GPU both enabled and disabled.
      • `/light/all_light_types`
      • `/light_linking/light_link_distant_multi_tree`
      • `/light_linking/light_link_distant_tree`
      • `/light_linking/light_link_surface_in_volume`
  • Incorrect rendering
    • One test renders incorrectly when Embree GPU is enabled. Shadows of some semi-transparent objects show artifacts, making the shadow brighter or darker than in the CPU render. A simplified file that reproduces the issue can be found in #125093. Just open the file and decrease the transparent bounces to 1023 or lower.
      • `/light_linking/shadow_link_transparency` - Fixed by d4ceade5ea
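To iterate on just these tests rather than the whole Cycles suite, a filter along these lines should work. The exact registered test names are an assumption here, so list them first with `ctest -N`:

```bash
# List the Cycles tests registered in this build and pick out the ones above
# (test names are an assumption; adjust the pattern to match what -N prints).
cd /blender-git/build_windows
ctest -N -R cycles | grep -i -E "light_linking|all_light"

# Then run only that subset, e.g. the light_linking tests.
ctest -R "cycles_light_linking" -C release
```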
Alaska added the Type: To Do, Module: Render & Cycles, Severity: Normal labels 2024-07-19 17:13:17 +02:00
Alaska added this to the Render & Cycles project 2024-07-19 17:13:17 +02:00
Alaska (Author · Member)

CC @xavierh so you know about these issues.

Nikita Sirgienko was assigned by Xavier Hallade 2024-07-19 17:37:59 +02:00
Xavier Hallade self-assigned this 2024-07-19 17:38:07 +02:00
Alaska (Author · Member)

The incorrect rendering in `/light_linking/shadow_link_transparency` seems to be fixed by !125739
Alaska (Author · Member)

In a recent [Cycles meeting](https://devtalk.blender.org/t/2024-09-17-render-cycles-meeting/36668) there was this note:

  • Cycles render test: Failing oneAPI tests #125088
  • Nikita will check if increasing samples properly converges

I ran my own tests with `all_light_types`, `light_link_distant_multi_tree`, and `light_link_distant_tree`, and the CPU and GPU seem to converge when increasing the sample count.

I didn't test `light_link_surface_in_volume`.


@Alaska, this is interesting, because I conducted a similar experiment with the "light link distant multi tree" test, and for me the CPU and Intel GPU generated images didn't converge even at a doubled resolution (256x256 pixels, 100% resolution scale) and with 100 samples (see the ctest report below).
But maybe increasing the resolution was a bad idea; let me review your data and retest on my side.
[Attachment: ctest report screenshot, 1.1 MiB]
Alaska (Author · Member)

Ah, my results are a bit misleading. I forgot that the Cycles render tests show 16x the absolute difference between the images, while I was showing 1x the difference.

If I apply the same 16x scale to my 128 samples per pixel test, my results are very similar to yours.
But as the sample count continues to increase (e.g. to 2048 samples per pixel), the difference between CPU and GPU decreases to the point where you can't really tell.
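For anyone comparing renders by hand, the same 16x-scaled absolute difference can be approximated with ImageMagick (a rough sketch; `cpu.png` and `gpu.png` are placeholder names for the two renders):

```bash
# Per-pixel absolute difference of the two renders, multiplied by 16 to match
# the scaling used by the Cycles render test report.
magick cpu.png gpu.png -compose difference -composite -evaluate Multiply 16 diff_16x.png
```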
