Compare commits

309 Commits

Author SHA1 Message Date
1f3233374b Build: update various libraries for 2.93, fixing bugs and security issues
This is based on similar updates in 3.3 and 3.4 from D16269.

expat 2.5.0
ffmpeg 4.4.3
flac 1.3.4
freetype 2.12.1
imath 3.1.5
numpy 1.22.0
ogg 1.3.5
openexr 2.5.8
openjpeg 2.5.0
python 3.9.15
sndfile 1.1.0
sqlite 3.37.2
ssl 1.1.1q
tiff 4.4.0
vorbis 1.3.7
vpx 1.11.0
webp 1.2.2
xml2 2.10.3
zlib 1.2.13
2022-11-18 17:40:34 +01:00
e2ffa58320 Build: get make deps working with Xcode command line tools
Deduplicating code with Xcode detection for Blender builds.
2022-11-07 15:43:44 +01:00
c9df21fac7 deps_builder: harden the package download process
During the 3.3 release some packages were missing from SVN, yet re-running
the `make source_archive_complete` command after it failed initially still
built the release tarball without issues. The tarball, however, had 0-byte
files for the missing packages... not good.

This diff hardens the download process by:

1) Validating that all required variables are set. This
catches the erroneous attempt at downloading the
nanovdb package even though we had removed it
from versions.cmake but neglected to remove it
from download.cmake.

2) When a download fails (due to either a missing
package or a bad download URL), FILE DOWNLOAD will
warn about a hash mismatch but will carry on
happily; you then have to go into the file system
and delete the 0-byte file to retry the download.
We know for a fact the file is bad when it is 0
bytes, so just delete it.

3) When we are using the Blender repository
(and likely building a source archive), explicitly
validate the hash of all packages. Normally the
build process does this; however, when building
a source archive the build does not actually
run for a dependency. So perform this check during the
configuration stage.

Reviewed By: brecht

Differential Revision: https://developer.blender.org/D16124
2022-11-07 15:42:48 +01:00
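A minimal Python sketch of the hardened download logic described above (the real code lives in CMake's download.cmake; `fetch_package` and its arguments are illustrative only):

```lang=python
import hashlib
import os
import urllib.request

def fetch_package(url, dest, expected_sha256):
    # A 0-byte file is known-bad (a previously failed download), so delete
    # it instead of forcing the user to clean it up manually.
    if os.path.exists(dest) and os.path.getsize(dest) == 0:
        os.remove(dest)
    if not os.path.exists(dest):
        urllib.request.urlretrieve(url, dest)
    # Validate the hash at configure time too, so a source archive built
    # without running the per-dependency builds still catches bad files.
    with open(dest, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if digest != expected_sha256:
        raise RuntimeError(f"hash mismatch for {dest}: got {digest}")
```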
137eae06ad Cleanup: remove unnecessary argument to else() in CMake
We have moved away from duplicating arguments in else() and endif()
commands.
2022-11-07 15:42:26 +01:00
3c3f4aceaf Cleanup: correct indentation of harvest.cmake 2022-11-07 15:41:07 +01:00
7bc5193f5e deps_builder: Add support for cve-bin-tool
This change adds support for Intel's cve-bin-tool [1]
in the deps builder. It adds 2 new targets to the
builder that do not build automatically but can be
built on demand when required.

`make cve_check` will output to the console.
`make cve_check_html` will output an HTML file that
can be shared with other people.

Requirements:

- A working installation of cve-bin-tool on the system

Not required but highly recommended:

- Obtaining a key from the NVD [2] to speed up the
  database download. You can pass the key to CMake
  using `-DCVE_CHECK_NVD_KEY=your_api_key`.

[1] https://github.com/intel/cve-bin-tool
[2] https://nvd.nist.gov/developers/request-an-api-key

Reviewed By: brecht

Differential Revision: https://developer.blender.org/D16160
2022-11-07 15:41:07 +01:00
d52a0fa115 Bump version cycle for Blender 2.93.11 release. 2022-10-04 20:12:35 +02:00
dfcc8086fb Fix: GPencil animated layer transforms evaluate wrong when identity
Due to (optimization) checks in `BKE_gpencil_prepare_eval_data` &
`BKE_gpencil_update_layer_transforms`, updates were skipped if the animation
reached exact identity transforms.

Now additionally check if the matrix has changed, to get proper updates.
Unsure if this is the cheapest way to check for the animated state of
layer transforms tbh, but I see similar checks elsewhere.

Fixes T101164.

Maniphest Tasks: T101164

Differential Revision: https://developer.blender.org/D16018
2022-10-03 21:19:04 +02:00
1e7e1b151c Fix T101109: Animation on nodes problems when dealing with Node Groups
Whenever animation on nodes was transferred to/from nodegroups (grouping/
ungrouping/separating) via BKE_animdata_transfer_by_basepath, it was
possible to create new nodes with the same name (in the formerly same
path -- see report for an example of this) and animation from the
original node was still performed on them. Issue went away after save/
reload.

In order to fully update the action, a depsgraph update is now performed on
the action (similar to what is done when renaming, for example).

Maniphest Tasks: T101109

Differential Revision: https://developer.blender.org/D15987
2022-10-03 21:00:03 +02:00
3dce2469ee Fix T101046: missing DEG update changing bone layers in editmode
Following {T54811} (and {rBbb92edd1c802}), DEG updates have been added
to the various operators. This has also been done for the layers
operators (see {rBf998bad211ae}, `ARMATURE_OT_bone_layers` has been
marked done in T54811). However, instead of `ARMATURE_OT_bone_layers`,
the update tagging actually happened for `POSE_OT_bone_layers`.

Now do this for `ARMATURE_OT_bone_layers` as well (keep it for
`POSE_OT_bone_layers`, don't think this is wrong there either).

Maniphest Tasks: T101046

Differential Revision: https://developer.blender.org/D15969
2022-10-03 20:59:23 +02:00
331fe92e46 Fix EEVEE: Screen Space Refraction Artefacts caused by viewport aspect ratio
This was caused by the vertical/horizontal classification being done in
NDC space, which wasn't respecting the aspect ratio.

Multiplying the test vector by the target size fixes the issue.
2022-10-03 20:58:32 +02:00
170e03aa98 Fix T101138: remove console spam when hovering over toolbar in uv editor
Reviewers: Campbell Barton <ideasman42>, Ryan Inch <Imaginer>

Differential Revision: https://developer.blender.org/D16015
2022-10-03 20:57:36 +02:00
5c20363e37 Fix T100606: Apply object transform fails with delta quaternion rotation
Apply transform failed to clear delta quaternion & axis-angle rotation.
2022-10-03 20:51:26 +02:00
Pratik Borhade
4b3b8156ee Fix T99070: Apply transform fails to clear delta transform values
Clear delta transform value after applying transform.
Include delta location while applying transform.
Use `copy_v3_fl` for resetting object scale

Reviewed By: mano-wii

Maniphest Tasks: T99070

Differential Revision: https://developer.blender.org/D15270
2022-10-03 20:48:39 +02:00
8c148a4f3b Fix T100851: Sync markers does not work for numinput
special_aftertrans_update would always use TransInfo values (not
the values_final -- we need the final values to follow numinput, snapping,
etc).

Maniphest Tasks: T100851

Differential Revision: https://developer.blender.org/D15893
2022-09-19 14:46:27 +02:00
fd248e6bf9 Fix T100578: Surface Deform modifier displays wrong in editmode
This was the case when the "Show in Editmode" option was used and a
vertex group affected the areas.

Probably an oversight in {rBdeaff945d0b9}? It seems deforming
modifiers always need to call `BKE_mesh_wrapper_ensure_mdata` in
`deformVertsEM` when a vertex group is used.

Maniphest Tasks: T100578

Differential Revision: https://developer.blender.org/D15756
2022-09-19 14:45:44 +02:00
52a5c80313 Fix T100191: Crash with the wave modifier using normals in edit-mode 2022-09-19 14:43:01 +02:00
6d24de6c43 Fix T99979: GPencil strokes cannot be edited after set origin
The stroke points were changed but the bounding box was not recalculated,
which produced a problem in any bounding box check done by
different tools.
2022-09-19 14:39:21 +02:00
238cd2b2a4 Release cycle: Blender 2.93.11 candidate. 2022-09-19 14:36:14 +02:00
0a65e1a8e7 Release cycle: Blender 2.93.10 release. 2022-08-02 20:21:24 +02:00
977a5bbee1 Fix T99820: missing 'no more missing' tagging on reloaded libraries.
Backport of rB4d8018948ddb for 2.93 LTS.
2022-08-02 15:11:06 +02:00
d03a5fab7a Python: restrict name-space access for restricted evaluation
From [0], restrict namespace access to anything with an underscore
prefix since these may be undocumented.

[0]: 00c7e760b3
2022-08-02 10:42:39 +02:00
ddffd1bc9f Fix (studio-reported) crash in some rare cases in blendfile read code.
The crash would happen when a linked ID became missing, one that was
'pre-declared' and used only once as a 'weak link' in another library
stored before the one it came from.

In that case, the place-holder generated in read code would be freed in
read_library_clear_weak_links, when handling its 'owner' library, but
since all previous libraries in the list had already been 'lib_linked'
and their filedata (and related libmap) freed, the update of the libmaps
in read_library_clear_weak_links would not apply to data from those
previous libraries, leading to ID pointers there pointing to freed
memory.

This fix should also be backported to 2.93.
2022-08-02 10:33:29 +02:00
bc9d461ab0 Fix Python SystemExit exceptions silently exiting
Any script that raised SystemExit when called via the --python or
--python-expr command line args, or by executing the text block, would exit
without printing a message. This caused the error from T99966 to be hidden.

Add explicit handling for SystemExit to ensure the message is always
shown before exiting.

More details noted in code-comments.
2022-08-02 10:16:44 +02:00
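A minimal Python sketch of the idea (illustrative only, not Blender's actual handler; `run_script` is a made-up helper):

```lang=python
import sys

def run_script(path):
    with open(path) as f:
        code = compile(f.read(), path, "exec")
    try:
        exec(code, {"__name__": "__main__", "__file__": path})
    except SystemExit as ex:
        # Print the exit message/code before propagating, so the error
        # (e.g. raised by argparse) is not silently hidden.
        if ex.code not in (None, 0):
            print("SystemExit:", ex.code, file=sys.stderr)
        raise
```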
41689bb310 Fix: Incorrect coordinates used in BLI_rct*_isect_segment functions
Ref D15330.
2022-08-02 10:05:03 +02:00
a2cadb786f API doc changelog generation: fix issue with version/path.
`BLENDER_VERSION_DOTS` has changed since 2.93; in that version of the
codebase it cannot be used directly to get the API doc directory
matching the Blender version.
2022-07-06 14:46:02 +02:00
3d4cfd228a Py API doc: Add some more dependency version requirements.
Copied over from the ones for the manual for 2.93.

This is an attempt to fix API doc generation for 2.93 LTS.
2022-07-06 12:26:08 +02:00
Gaia Clary
d7901ed607 COLLADA: Support for alpha color in vertex data.
Many thanks to the original Author of this patch: Christian Aguilera

The COLLADA importer was silently ignoring the alpha component in the
vertex data.

The `stride` variable holds the component count (3 for RGB; 4 for RGBA),
and can be used for honouring the alpha channel in the vertex data.

Test plan:
- Open Blender.
- Clear the scene.
- Add a plane.
- Enter **Vertex Paint** mode.
- Switch to the **Erase Alpha** blending mode.
- Select a tone of gray.
- Turn strength down to less than 1
- Paint [some of] the vertices of the plane.
- Export project as a COLLADA file (`.dae`).
- Clear the scene.
- Re-import the COLLADA file again.
- Export the project again (with different name).

**Without** this patch, the second exported project will have lost the
alpha component in its vertex data:

```lang=xml, counterexample
<float_array id="Plane-mesh-colors-Col-array" count="24">1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1</float_array>
```

**With** the patch, the first and the second exported projects retain
the alpha values painted previously:

```lang=xml
<float_array id="Plane-mesh-colors-Col-array" count="24">1 1 1 1 1 1 1 0.5490196 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0.5490196</float_array>
```

Reviewed By: cristian64, SonnyCampbell_Unity
Authored by: Christian Aguilera

Differential Revision: https://developer.blender.org/D14246
2022-07-02 22:30:38 +02:00
7ec5ef4e1e Py API Doc: add runtime changelog generation to sphinx_doc_gen.py.
Optionally use `sphinx_changelog_gen.py` to dump the current version of the
API to a JSON file, and use the closest previous one listed in a given index
file to create a changelog RST page for Sphinx.

Part of {T97663}.
2022-07-01 11:14:24 +02:00
5e4a8fa12e Py API Doc: refactor changelog generation script.
Main change is to make it use JSON format for its dump files, instead of
some Python code.

It also introduces an index for those API dump files, mapping a blender
version to the relevant file path.

This is then used to automatically select the most recent (version-number
wise) previous API dump to compare against the current one when generating
the changelog RST file.

Part of {T97663}.
2022-07-01 11:12:12 +02:00
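A rough Python sketch of the index lookup described above (the JSON layout and the helper name are assumptions, not the actual script):

```lang=python
import json

def previous_api_dump(index_path, current_version):
    """Return the dump path for the most recent version older than current."""
    def as_tuple(version):
        return tuple(int(x) for x in version.split("."))

    with open(index_path) as f:
        index = json.load(f)  # e.g. {"2.93.4": "api_dump_2_93_4.json", ...}

    older = [v for v in index if as_tuple(v) < as_tuple(current_version)]
    if not older:
        return None
    return index[max(older, key=as_tuple)]
```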
d57f57717a Fix T98699: Face dot colors in UV editor was using wrong color from theme 2022-06-16 11:00:49 +02:00
8f530d6a47 Fix (unreported) bad memory access in read/write code of MeshDeform modifier.
This abuse of one size value to handle another allocated array of a
different size is bad in itself, but at least now the read/write code of
this modifier should not risk invalid memory access anymore.

NOTE: invalid memory access would in practice only happen in case an endian
switch were performed at read time, I think (those switches only check
for the given length being non-zero, not for a NULL data pointer...).
2022-06-16 09:30:12 +02:00
354c22b28c Version bump: 2.93.10-rc 2022-04-20 12:08:38 +02:00
31712ce77a Version bump: 2.93.9-release 2022-04-19 16:25:25 +02:00
d2ca2c6e74 Fix T85467: Mask transform center doesn't take parent Track into account
Coordinate checks in `spline_under_mouse_get` need to take place with
the evaluated mask to get the right center. This is now more in line with
how this is done in `ED_mask_point_find_nearest`.

Note: similar issues are reported with box/circle/lasso selection in
T97135, will tackle these separately though.

Maniphest Tasks: T85467

Differential Revision: https://developer.blender.org/D14598
2022-04-19 16:11:21 +02:00
59e68d8031 Fix T97135: Fix selection issues with parented masks in the MCE
Box, Circle and Lasso select were not taking into account whether a
mask (point) was parented; selection was only succeeding in the original
place.

Now check coordinates from the evaluated mask (points) instead while
setting selection flags; DEG tagging still happens on the original ID.

Maniphest Tasks: T97135

Differential Revision: https://developer.blender.org/D14651
2022-04-19 16:07:22 +02:00
f1173749c2 Fix T85756: Adjust Last Operation panel is slow.
Extremely subtle bug that would only appear in some specific
circumstances; it would cause memfile undo writing code to falsely detect
some ID as changed because it would get the wrong 'starting point' of
comparison with the existing previous memfile step.

See T85756 for detailed explanation and reproducible case.
2022-04-19 16:00:59 +02:00
49e1144832 Correct error in 405bff7fd8
Was adding 1 to dietime twice in init_particle_interpolation.
2022-04-19 15:59:09 +02:00
d0bcb6efea Fix T68290: Baked particles don't render in final frame
Particles baked into memory would never load the final frame because
of an off-by-one error calculating the particles `dietime`.

This value indicates the frame at which the particle ceases to exist, but it
was being set to the end-frame, which caused this bug as the scene's
end-frame is inclusive.

While the last frame was properly written and read from memory,
the `dietime` was set to the last frame causing all the particles to be
considered dead when calculating the cached particle system.
2022-04-19 15:58:43 +02:00
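A small worked example of the off-by-one (illustrative numbers only, assuming a particle counts as alive while the current frame is below `dietime`):

```lang=python
end_frame = 250                # scene end frame, which is inclusive
dietime_wrong = end_frame      # particle already counts as dead on the last frame
dietime_right = end_frame + 1  # particle still exists on the last frame

frame = end_frame
print(frame < dietime_wrong)   # False: particle skipped on the final frame
print(frame < dietime_right)   # True:  particle rendered on the final frame
```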
7dd15b7113 Fix T96790: Gpencil Inverted Fill makes extra stroke at origin
The problem was that the stroke was created in the inverted loop before checking if the total number of points is 0 and exiting the loop.

Also some code cleanup.
2022-04-19 15:57:38 +02:00
Pratik Borhade
69cea4fce1 Fix T96624: NLA crash when reordering tracks if no object is selected.
Caused by rBc0bd240ad0a1.
To avoid the crash, make the boolean value false if the active object data is NULL.

Should be backported to 2.93 LTS and 3.1 corrective releases.
2022-03-24 10:48:11 +01:00
Ethan-Hall
3dc36b32a1 GPU: Allow the user to set an anisotropic filtering setting below the implementation-defined value of GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT

This bug-fix is also needed for 2.93 LTS.

Reviewed By: fclem

Differential Revision: https://developer.blender.org/D14392
2022-03-24 10:47:18 +01:00
b6039cd82f Fix T96243: Workbench Curvature not rendering in background.
When rendering using the command line, the curvature wasn't rendered. The reason
was that the ui_scale wasn't initialized and therefore the same pixels were
sampled to detect the curvature. This is fixed by setting the ui_scale to 1 for any
image render.
2022-03-24 10:45:43 +01:00
c3fed26f37 Fix memory leak evaluating PyDrivers
Missed decref in 686ab4c940 caused every
driver evaluation to create the BPy_StructRNA depsgraph without freeing
the previously allocated depsgraph.
2022-03-24 10:44:33 +01:00
Ethan-Hall
fe6999352a Fix T96195: f-curve factorized polynomial generator broken UI
The polynomial parameters were not shown correctly.

Differential Revision: https://developer.blender.org/D14254
2022-03-24 10:43:41 +01:00
fc56ef5f97 Fix broken shapekeys: check for 'NULL' from pointer too.
Add check for `NULL` `from` pointer to `BLO_main_validate_shapekeys`,
and delete these shapekeys, as they are fully invalid and impossible to
recover.

Found in a studio production file (`animation
test/snow_parkour/shots/0040/0040.lighting.blend`, svn rev `1111`).
Would be nice to know how this was generated too...
2022-02-21 11:43:51 +01:00
6d6bb04c19 Fix T95787: Texture paint: Apply Camera Image crash for certain images
This does not happen with just **any** image, but only with images that have
ID properties.

ID properties are used to store view projection matrices (e.g. for
reprojection with `Image from View` or `Quick Edit` -- these are the
ones we are interested in), but of course they can be used for anything
else, too. The images in the file from the report have ID properties from
an Addon for example.

So the crash can reliably be reproduced with **any** image doing the
following:
```
bpy.data.images['myImage']['myIDprop'] = "foo"
```
This would lead code in `texture_paint_camera_project_exec` to think the
needed `view_data` is on the image (but in reality it was just some
other IDprop).

Solution is simple: just check `view_data` is really valid after getting
it from the IDprops.

Maniphest Tasks: T95787

Differential Revision: https://developer.blender.org/D14116
2022-02-21 11:42:44 +01:00
Bastien Montagne
3a9619af80 Fix T95601: Missing handling of keyingsets ID pointers in lib_query/foreach_id code

This will have to be backported to 2.93, and to 2.83 if possible.
2022-02-21 11:40:35 +01:00
1ee4e6bf31 Fix T89542: Crash when loading certain .hdr files
The direct cause of the bug in question was passing in the raw memory
buffer to sscanf. It should be called with a null-terminated buffer;
which isn't guaranteed when blindly trusting the file data.

When attempting to fuzz this code path, a variety of other crashes were
discovered and fixed.

Differential Revision: https://developer.blender.org/D11952
2022-02-21 10:54:40 +01:00
87b77a97b9 Fix dna_genfile error converting signed int types to floating point
Regression in 51befa4108
caused negative values to overflow into a uint64_t.

Resolve by casting to a signed int from originally signed types.

This caused D14033 not to work properly.
2022-02-21 10:37:00 +01:00
05f631b7ad Cleanup: use an intermediate value for cast_primitive_type
Assign the actual value before casting to large uint64_t/double types.

This improves readability, especially in cases where both pointer
and integer casts were used in one expression; to make matters worse,
clang-format treated these casts as a multiplication.

This also made debugging/printing the values more of a hassle.

No functional changes (GCC produced identical output).
2022-02-21 10:30:43 +01:00
3cfccf1199 Fix T95137: Spline calc_length not working with just 1 NURB point
The NURB case did not properly handle a curve with only 1 point.

Ref D13904
2022-02-21 10:23:46 +01:00
df5da16aa1 Cleanup: Avoid possible NULL pointer error
In normal conditions, `gpf` always has a value, but it is better to move it inside the NULL check.
2022-02-21 10:12:07 +01:00
6994f9a978 Version bump: 2.93.9-rc 2022-02-02 13:26:56 +01:00
09da7f489a Version bump: 2.93.8-release 2022-02-01 22:14:24 +01:00
3735e82c27 Fix T94197: Applying boolean with fast solver clears bevel weights
For boolean operations only one of the meshes was checked to determine
if bevel weights should be created.

Now initialize custom data from both meshes' flags.
Note that this is a localized fix to be back-ported; further changes
will be made so edit-mode conversion accounts for this
without the caller needing explicit checks for custom-data flags.
2022-02-01 22:09:23 +01:00
fb21201d42 Fix T94837: curve tilt on a 2-point-curve is wrong
2-point-curves are treated separately from 3plus-point-curves (assume a
lot of the twisting reduction can be skipped, so there is a dedicated
function for single segment curves).

And while using the 3plus-point-curves function [`make_bevel_list_3D`]
would actually work in this case, the dedicated function
`make_bevel_list_segment_3D` would only consider the tilt of the second
point and would just copy over the quat to the first point as well. Don't
see a reason for this; now consider the first point's tilt as well.

Maniphest Tasks: T94837

Differential Revision: https://developer.blender.org/D13813
2022-01-17 14:24:12 +01:00
49f25ae23c Fix T94089: GPencil drawings don't update after paste in Dopesheet
When pasting new frames, the datablock needs to be tagged to update the drawings.
2022-01-17 14:11:51 +01:00
4cd881c70f Fix T94903: GPencil: Copying keys doesn't preserve Keyframe Type
When a new frame is created, ensure the keytype of the source key is used.
2022-01-17 14:10:55 +01:00
caf362422e Fix crash caused by exception in Python gizmo target get handler 2022-01-17 14:10:01 +01:00
ccb2456df6 Fix T94799: GPencil Strokes drawn at 0.0 Strength still visible
There was a clamp with a value greater than 0.
2022-01-17 14:08:30 +01:00
Richard Antalik
8b749260fb Fix T94768: Crash in VSE prefetching
Only the fix part of rBf2fb9a0c59a applied (P2726).
2022-01-17 14:03:58 +01:00
8e2ee4a3b7 Fix T91005: Autosplit produces unusable files
Audio PTS was reset for each new file. This caused misalignment of video
and audio streams. In Blender, these files can't be loaded; other
players will fail to align audio and video.

Since timestamps are reset intentionally, reset also video stream
timestamps.

There were other bugs:
After the timestamp was reset for audio, write_audio_frames started
encoding from the timeline start until the target frame, so each split video
had more audio than it should.

Also, audio for the last frame before splitting was written into the new file.

Differential Revision: https://developer.blender.org/D13280
2022-01-17 14:00:54 +01:00
1c04d22ebf Fix T93995: Cycles camera motion blur not working in right stereo view
Thanks to Michael (michael64) for identifying the solution.

Ref D13567
2022-01-11 15:52:18 +01:00
Aleksi Juvani
66346a6852 Fix PSYS_GLOBAL_HAIR stripped even if connecting the hair fails
After disconnecting hair on an object, if you then hide the particle system, and try connecting the hair again, the operator is cancelled due to `remap_hair_emitter` returning `false` because `target_psmd->mesh_final` is NULL, but `connect_hair` will still strip the `PSYS_GLOBAL_HAIR` flag, which will cause the hair in the hidden particle system to be positioned incorrectly. The correct behavior is to strip the flag only if `remap_hair_emitter` succeeds.

Differential Revision: https://developer.blender.org/D13703
2022-01-11 15:46:14 +01:00
bbad834f1c Fix T94661: Out-of-bounds memory access due to malformed DDS image file
Harden bounds check in the stream reader avoiding integer overflow.
2022-01-11 15:34:51 +01:00
d294d19d21 Fix T86952: Buffer overflow reading specific DDS images
Add a data boundary check in the flipping code.

This code now also communicates the number of mipmap levels
it processed with an intent to avoid GPU texture from using
more levels than there are in the DDS data.

Differential Revision: https://developer.blender.org/D13755
2022-01-11 15:33:37 +01:00
e07f16776b Fix T94629: The IMB_flip API would fail with large images
Fix IMB_flip[xy] to handle cases where integer overflow might occur when
given sufficiently large image dimensions.

All of these fixes were of a similar class where the intermediate
sub-expression would overflow silently. Widen the types as necessary.

Differential Revision: https://developer.blender.org/D13744
2022-01-11 15:25:49 +01:00
Aleksi Juvani
e38a0eea5c Fix: connecting hair fails on meshes with no generative modifiers
Fixes a bug introduced in rB5dedb39d447b. `mesh_original` is not set if the
mesh has no generative modifiers, in which case we can use `mesh_final`, which
would seem to be consistent with the rest of the particle code. An alternative
approach would be to make sure that `mesh_original` is always set in
`deformVerts`.

Differential Revision: https://developer.blender.org/D13754
2022-01-11 14:48:14 +01:00
Aleksi Juvani
db9ddb8a45 Fix T54488: hair disconnect/reconnect not working with modifiers
Take the Use Modifier Stack setting into account when connecting hair, and
fix wrong results when using deforming modifiers as well.

Differential Revision: https://developer.blender.org/D13704
2022-01-11 14:46:52 +01:00
1f16b5d794 Fix T94137: GPencil: Eraser does not erase first point
The eraser checks the current, previous and next point (and sets pc0,
pc1 & pc2 corresponding to that for further occlusion/brush/clipping
checks). For the very first point, it sets pc0 to pc1 [which makes sense,
there is no previous point, so we should assume the previous segment is
"visible" as soon as the first point is], but does so *before* pc1 is
even calculated. This makes the following occlusion/brush/clipping checks
work with zero values [which leads to no erasing in most cases].

Now *first* calculate pc1, *then* set pc0 to pc1.

Maniphest Tasks: T94137

Differential Revision: https://developer.blender.org/D13593
2022-01-11 14:42:30 +01:00
9336edd060 Fix T94635: Sculpt Smooth in Surface mode with Anchored Stroke crash
Sculpt Smooth in Surface mode (as opposed to Laplacian) needs a cache
initialized the first time. In anchored stroke mode with spherical falloff
this was skipped though (because this starts off with no PBVH nodes and
an early return checks for this) and `first_time` was set to false before
cache initialization.

Now move the cache initialization to happen earlier (same as the cache
initialization for automasking).

Maniphest Tasks: T94635

Differential Revision: https://developer.blender.org/D13746
2022-01-11 14:39:29 +01:00
a597164ea6 Fix T94685: python error adding Space handlers for Spreadsheet
Oversight in {rB9cb5f0a2282a}.

The above commit added an entry in `rna_Space_refine()`, but the entry in
`rna_Space_refine_reverse()` was missing (and this is what Python uses
for the Space callbacks).

Maniphest Tasks: T94685

Differential Revision: https://developer.blender.org/D13751
2022-01-11 14:31:48 +01:00
3a79c1d8f5 Fix T94564: Mirror clipping is not properly placed in sculpt mode
If a mirror object is used in a mirror modifier, sculpt mode did not take
this into account (and instead always clipped on the sculpt object's
local axis).

Now take this into account by storing a matrix in the preparation
function `sculpt_init_mirror_clipping` and use that later in
`SCULPT_clip`.

Maniphest Tasks: T94564

Differential Revision: https://developer.blender.org/D13711
2022-01-11 14:29:49 +01:00
8a901d4925 Fix T94366: Grease Pencil Automerge no immediate UI update
Just an oversight in rBe9607f45d85d.
Now add notifier that toolsettings changed.

Maniphest Tasks: T94366

Differential Revision: https://developer.blender.org/D13723
2022-01-11 14:17:15 +01:00
9bae5988ba Fix T94544: crash removing image used as camera background via python
Since 2.8, background images are tied to cameras (in 2.79 these were
tied to a View3D I think).
Code in `BKE_library_id_can_use_idtype` wasn't taking this relation
between `Camera` and `Image` into account, thus leading to ID deletion/
unlinking not working properly -- in particular `libblock_remap_data`
not doing its thing (and leaving the camera as a user of the image),
then things went downhill from there...

Now make the "Camera-can-use-an-Image" relation clear in
`BKE_library_id_can_use_idtype`.

Maniphest Tasks: T94544

Differential Revision: https://developer.blender.org/D13722
2022-01-11 14:16:00 +01:00
20d0cddc60 Fix T94262: Grease Pencil Blur Effect DoF mode wrong
This was visible outside of camera view and was not respecting the
"Depth of Field" checkbox on the Camera properties.

Now return early if DoF should not be visible.

Maniphest Tasks: T94262

Differential Revision: https://developer.blender.org/D13631
2022-01-11 14:14:36 +01:00
dfdc7c3f81 Fix T94422: Shading/Normals break on array modifier caps
The array modifier does not necessarily tag normals dirty.
If it doesn't, normals are recalculated "internally" using the offset object
transform. This was happening for the array items, but not for the caps.

Now do the same thing for caps.

Maniphest Tasks: T94422

Differential Revision: https://developer.blender.org/D13681
2022-01-11 14:12:27 +01:00
81879680e9 Fix T94375: Python error when trying to add Grease Pencil brush preset
The prop name was wrong.
2022-01-11 14:10:59 +01:00
42ab4b60bd Fix T91012: Scene strip doesn't play audio
The issue was caused by adding the `seq->sound` check in ded68fb102 in
the function `BKE_sound_scene_add_scene_sound`, as the `offset_time` field was
introduced to resolve sub-frame a/v misalignment.

Scene strips don't have `bSound` allocated but also don't suffer from
a/v misalignment.

Remove `seq->sound` check and don't apply any offset for scene strips.

Reviewed By: zeddb, sergey

Differential Revision: https://developer.blender.org/D12819
2022-01-11 14:09:31 +01:00
Germano Cavalcante
5c6418b077 Fix T94109: 3d cursor crash when using shortcut
Operator was erroneously starting edge_slide operation.

Revert part of the changes in rB3fab16fe8eb4 as obedit_type was being
confused with object_mode.
2022-01-11 14:05:41 +01:00
2926f10ffe Fix/workaround macOS Rosetta crash running Cycles AVX tests
Just disable these tests on macOS for now as fixing seems hard, and we want to
be able to cross-compile and test x86_64 on Arm machines on the buildbot.
2022-01-05 21:22:08 +01:00
deca1be480 Fix Cycles AVX test failure with x86_64 build running on Arm
Don't create const avx vectors before validating if CPU supports AVX.
2022-01-05 20:10:55 +01:00
59a48cc43d Version bump: 2.93.8-rc 2021-12-15 15:37:02 +01:00
a5b7f9dc90 Version bump: 2.93.7-release 2021-12-14 17:35:31 +01:00
d23303e27d LineArt: Shifting fix for different camera fitting.
FOV was expanded to cover the shifting range,
rather than to precisely cut at the image border. Now fixed.

Reviewed By: Sebastian Parborg (zeddb)

Differential Revision: https://developer.blender.org/D11523
2021-12-14 17:31:34 +01:00
39ebb8be4d Fix T91680: viewport selection broken in macOS x86 build with Xcode 13
There is an apparent compiler bug here, tweak the code to avoid it. This did
not affect official builds as we were still using Xcode 12.
2021-12-08 15:44:23 +01:00
fb762eedbe Fix T92043: Relax/Push Pose does nothing for quaternion rotation.
As can be confirmed by checking generic code for this operation,
it is supposed to blend between the result of Breakdown based on
actual frame range, and the current pose. However for some reason
the quaternion specific code was blending between the current pose
and the current keyframed pose. This means that the operation does
nothing if invoked without modifying the pose first.

This rewrites the code to match the non-quaternion behavior.

Differential Revision: https://developer.blender.org/D13030
2021-12-07 15:43:56 +01:00
ef381ffd05 Fix T93563: Crash subdividing with overlapping tri and quad
The first loop was left out when finding the split edge boundary.

Error from f2138686d9.
2021-12-07 15:14:43 +01:00
Bastien Montagne
ca6acceb60 Fix T93353: Reload Library Override file loses Constraints, take II.
When adding `INSERT` operations over RNACollection items, RNA diffing
code did not properly report the properties as not being equal.

This in turn triggered the 'purge unused existing override properties'
mechanism, thus deleting the existing (valid) insert override property
operation.

NOTE: This should also be backported to 2.93, and probably 2.83.

Reviewed By: sybren, jbakker

Maniphest Tasks: T93353

Differential Revision: https://developer.blender.org/D13426
2021-12-07 15:11:57 +01:00
654967b0e0 Fix T89564: Spline IK breaks when it is far away from the world origin
The isect_line_sphere algorithm became very imprecise when the line and
the sphere were reasonably far away from the world origin.

This would lead to no intersections being reported even if there was a
guaranteed intersection (line crossing from inside the sphere to the
outside).

To fix this we now use the secant root finding method to get an
intersection point. This is much more stable and robust, it seems.
2021-12-07 15:07:36 +01:00
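An illustrative Python sketch of the secant method used here (generic root finding, not Blender's actual implementation):

```lang=python
def secant(f, t0, t1, tol=1e-9, max_iter=64):
    """Find t with f(t) ~= 0 without derivatives or a closed-form solution."""
    f0, f1 = f(t0), f(t1)
    for _ in range(max_iter):
        if abs(f1 - f0) < 1e-30:
            break  # secant is flat, cannot improve further
        t2 = t1 - f1 * (t1 - t0) / (f1 - f0)
        if abs(t2 - t1) < tol:
            return t2
        t0, f0, t1, f1 = t1, f1, t2, f(t2)
    return t1

# Example: where the line p(t) = o + t*d crosses a sphere |p - c| = r,
# find a root of f(t) = |o + t*d - c|^2 - r^2.
print(secant(lambda t: (1.0 + t) ** 2 - 4.0, 0.0, 2.0))  # ~1.0
```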
448c838cee Fix missing Blender logo in Windows store package
D9681 was not properly merged to all branches, leaving a path to a non-existent
icon file in the manifest.
2021-12-02 16:04:48 +01:00
5e2dc507e7 Fix T93290: Rotation without constraint after extrude has wrong axis
The default orientation of the mode was being indicated as overridden,
although the one of the constraint was used.
2021-11-28 16:47:13 +01:00
f3ba6ac182 Fix T89081: Freestyle noise seed of zero crash
This leads to division by zero in Freestyle's NoiseShader, which also
crashes Blender.

Not sure if we really need a do_version patch for old files; as an
alternative we could also force a positive number in the NoiseShader.
This patch does not do either, it just forces a positive range in RNA from
now on.

Maniphest Tasks: T89081

Differential Revision: https://developer.blender.org/D13332
2021-11-28 16:43:44 +01:00
b02bdd52a4 Fix T93130: Frame Selected with selected paint mask does not work
This broke with {rB20fac2eca723} (which landed in 2.63), so it is a
long-standing bug.

Convention for paint modes is:
- when no paint mask is active, `Frame Selected` will focus the last
stroke
- when paint mask is active, `Frame Selected` will focus the selected
mask faces

To check the right vert coords we have to offset with `mp->loopstart`.

Maniphest Tasks: T93130

Differential Revision: https://developer.blender.org/D13247
2021-11-28 16:40:23 +01:00
a29283f43a Fix T93117: Texture paint clone tool crash in certain situation
Caused by {rBaf162658e127}, so it is a long-standing bug.

When changing clone slots (the report involved a quite complicated sequence
of selecting textures and undo -- but I think this could happen in more
situations) the code checks for the UV of the new clone slot.
However, since the above commit the slot and the clone slot were mixed up,
so in this case the responsible NULL check (for when no UV is assigned)
wasn't working.
Now correct this (NULL-check the clone slot UV -- instead of the paint
slot UV).

Note: not sure why low-level CustomData functions don't do the
name NULL checks themselves (it seems callers are always responsible).

Maniphest Tasks: T93117

Differential Revision: https://developer.blender.org/D13378
2021-11-28 16:37:06 +01:00
c2971eea84 Fix T93338: Curve Guide force field crash
Caused by {rBcf2baa585cc8}.

For Curve Guide force fields to work, the `Path Animation` option has to
be enabled. With it disabled, we are lacking the necessary
`anim_path_accum_length` data initialized [done by
`BKE_anim_path_calc_data`] which `BKE_where_on_path` relies on since
above commit.

Now just check for this before using it - and return early otherwise.
Prior to said commit, `BKE_where_on_path` would equally return early
with a similar message, so that is expected behavior here.

Maniphest Tasks: T93338

Differential Revision: https://developer.blender.org/D13371
2021-11-28 16:23:16 +01:00
647578b143 Fix T93322: Freestyle Sinus Displacement Division by Zero Crash
This happens if the Wavelength is set to 0.0f.

Not sure if we really need a do_version patch for old files; as an
alternative we could also force a slight offset in the
SinusDisplacementShader. This patch does not do either, it just forces a
positive range from now on.

Maniphest Tasks: T93322

Differential Revision: https://developer.blender.org/D13329
2021-11-28 16:18:29 +01:00
5121773cc5 Fix T93320: Freestyle LineStyleModifier blend 'Minimum' error
This was just a typo in {rBb408d8af31c9}:
must be 'MINIMUM' (instead of 'MININUM').

Maniphest Tasks: T93320

Differential Revision: https://developer.blender.org/D13328
2021-11-28 16:03:17 +01:00
63aa10802b Fix T93007: Cycles not updating for animated Object properties like color 2021-11-23 09:14:35 +01:00
9f73c9d656 Fix T93198: Frame Selected in greasepencil curve editing does not work
Was not taking into account curve points at all.

Maniphest Tasks: T93198

Differential Revision: https://developer.blender.org/D13281
2021-11-23 09:12:01 +01:00
f043b0334b Fix T93194: greasepencil channel lists ignoring collection visibility
Same fix as rB0a3b4d4c64f1, but this time for greasepencil.

To repeat: the dopesheet in greasepencil mode was ignoring the temporary
visibility flag of collections. As a result, even though the dopesheet
was supposed to show animation data of visible greasepencils only, it was
still showing such data for greasepencils that were hidden by hiding
their collection.
2021-11-23 09:09:28 +01:00
Jeroen Bakker
419b4e0b00 Fix T89260: Eevee crashes with custom node sockets.
The cause of this issue is that the Custom Node Sockets info type was
initialized as SOCK_FLOAT when registering. Areas within the core that
would ignore custom socket types by checking their type would treat the
socket as being a float type.

When custom node sockets have a property called default_value, Blender
tries to store it as an internal default value, which failed in debug
builds.

This patch will set the socket type to SOCK_CUSTOM when registering a
custom socket type and allow, but skip, storage of custom default values.
In this case the default values should already be stored as custom
properties.

Reviewed By: campbellbarton, JacquesLucke

Maniphest Tasks: T89260

Differential Revision: https://developer.blender.org/D13174
2021-11-23 09:07:39 +01:00
215a6f1c10 Fix T92384: Wrong UV layers used with Boolean Modifier (Fast Solver)
Ensure the layers from the source mesh are used instead of the
object referenced by the boolean modifier.
2021-11-23 08:48:59 +01:00
bf8909eb5d Fix T88614: Mixdown crashes Blender 2.92.0 and 3.0.0 Alpha
The problem is caused by the most recent ffmpeg version (4.4) which
needs channels to be set when submitting a frame for encoding.
2021-11-23 08:44:48 +01:00
4a8f07a832 Fix T93066: Alembic export ignores Mantaflow particles
`ABCPointsWriter::is_supported` already checked for valid particle
system types (liquid, spray, foam, bubbles, ...).

`AbstractHierarchyIterator::make_writers_particle_systems` did not
create a writer for these though, so now bring these in line and also
create writers for these.
2021-11-22 16:47:55 +01:00
6cad8690c3 Fix T93074: Gpencil cutter not using flat caps in middle cuts
When cutting a stroke using the Flat Caps option, the flat cap was not applied if the cut was done in the middle of the stroke.

Now the flat cap is applied to the created segments, and also some cleanup of the code was done.
2021-11-22 16:35:51 +01:00
89adf516cd Fix T92760: Crash erasing GPencil when occlusion test is enabled
`pt0` was read when `NULL` and `is_occluded_pt1` could be read even if
it is not initialized.
2021-11-22 16:28:58 +01:00
4796865b46 GPencil: Speed up Occlude Eraser
This is an initial change to speed up the calculation of the Occlude eraser. In the future, we can add more optimizations, but at least this increases speed.

Instead of always checking the 3 points, the check is skipped if it's not required.

Based on a solution by Philipp Oeser.

This is related to T88412
2021-11-22 16:20:25 +01:00
1d8d6c2f62 Fix potential crash opening 3.0 blend files in older versions.
Affects insertion of constraints or NLA tracks in liboverrides. In some
cases, when opening newer post-3.0 .blend files, the source won't be
found anymore; override apply code then needs to fail properly instead
of crashing.

Related to the refactor from rB33c5e7bcd5e5.
2021-11-22 14:29:55 +01:00
0b89b94947 Fix T92090: Eevee crash with Intel HD 4000 and macOS 10.15.7
A recent security update to macOS 10.15.7 causes crashes when using Eevee and
various other 3D viewport features. It appears that glGenerateMipmap is
broken, causing a crash whenever its commands are flushed/submitted to the GPU.

Ideally this would be fixed in a driver update, however it's unlikely this will
happen. Earlier macOS versions have been receiving security updates for 2 years,
and that window has just passed for 10.15. Further, computers with these GPUs
can't upgrade to a newer macOS version.

As a workaround, disable mipmaps on these GPUs, by setting the mipmap max level
to 0 and not calling glGenerateMipmaps. Effects like depth of field also use
mipmaps, but fill in the mip levels by other means. In those cases we keep the
mipmap level.

Differential Revision: https://developer.blender.org/D13295
2021-11-20 17:51:09 +01:00
4ec6cc412d Version bump: 2.93.7-rc 2021-11-17 17:21:44 +01:00
c842a90e2f Version bump: 2.93.6-release 2021-11-16 15:54:38 +01:00
fa99d3e53b Fix T92807: Incorrect display of planar tracking.
Issue introduced in {7e66616b7e15} where the shader was replaced with a
2d image shader. This patch reverts several commits that removed the 3d
image shader.
2021-11-10 10:31:53 +01:00
3bd6b80d65 Fix T92515: Incorrect translation when scaling pose bones 2021-11-10 10:19:27 +01:00
1180b4ff4a Fix T89516: Crash on append
Caused by 37458798fa, was doing a NULL-pointer dereference because it used
the wrong pointer to check if the data-block is linked.
2021-11-04 09:11:32 +01:00
9a80455d87 Fix missing proper 'make local' call for liboverrides from outliner.
Also includes minor improvements to
`BKE_lib_override_library_make_local` itself.

This is a complement to rB37458798fa02c.
2021-11-02 14:29:33 +01:00
9a290dd657 LibOverride: Fix crash in ShapeKeys when making a mesh override local.
Weird 'embedded for overrides' flag of embedded IDs (including ShapeKeys
in override context) was not properly cleaned up when making an override
fully local.

Reported by studio, thanks.

@jbakker should be backported to 2.93LTS if possible.
2021-11-02 14:29:19 +01:00
619c51592f Fix T92608: Image Editor does not display stereo images
Caused by own {rB5aa3167e48b2}.
Related commit: {rBebaa3fcedd23}.

For stereo renders, `BKE_image_is_multilayer` is true; however, we seem
to get down to `space_image_gpu_texture_get` [where this is called]
from `IMAGE_cache_init` with a NULL Image->RenderResult.

So what then happens is that `BKE_image_multilayer_index` is called and
even though it has an appropriate codepath for stereo, it earlies out
and does not set multi_index correctly.

Still a bit puzzled why RenderResult is NULL for a render, but since
other places also check for a valid RenderResult before going down the
_multilayer_ route (and doing _multiview_ instead), now do the same
thing: BKE_image_multiview_index is now called in these cases (and it seems
to behave correctly; checked with layers and passes, and all seems to
display correctly, either in stereo or choosing individual eyes).

thx @jbakker & @brecht for double-checking.

Maniphest Tasks: T92608

Differential Revision: https://developer.blender.org/D13063
2021-11-02 13:25:51 +01:00
f9f6d8de58 Fix T89777 EEVEE: Contact Shadows causes wrong shading in Reflection Plane
The planar reflections, being rendered at the same resolution as the HiZ max
buffer, do not need any UV correction during raytracing.

However, the GTAO horizon buffer, being at output resolution, does need the
UV factors in order to match the pixels visible on screen. To avoid many
complications, we increase the size of the GTAO texture up to the HiZ buffer
size. This way, if planar reflections need GTAO, the texture is big enough.
We change the viewport of the GTAO framebuffer for the main view in order
to not have to modify UVs in many places.
2021-11-02 13:17:28 +01:00
Ankit Meel
8e237d83f2 Fix T88877: 2.93: Crash on recent OSX with a non-English locale.
Looks like OSX changed the default format of its locale, which is not
valid anymore for gettext/boost::locale.

Solution based on investigations and patch by Kieun Mun (@kieuns), with
some further tweaks by Ankit Meel (@ankitm), many thanks.

Also add an exception catcher on `std::runtime_error` in
`bl_locale_set()`, since in OSX catching the ancestor `std::exception`
does not work with `boost::locale::conv::conversion_error` and the like
for some reason.

Reviewed By: #platform_macos, brecht

Maniphest Tasks: T88877

Differential Revision: https://developer.blender.org/D13019
2021-11-02 13:15:45 +01:00
77104bf318 Fix T92355: Quadriflow crashes with zero length edges
Add a check for zero length edges to the manifold check as quadriflow
doesn't handle meshes with these.
2021-11-02 13:13:41 +01:00
28d581af92 Fix T91411: Outliner crash using contextmenu operators from a shortcut
Oversight in {rBb0741e1dcbc5}.

This was guarded by an assert in `get_target_element`, but it can be
valid to have these assigned to a shortcut (and then perform the action
without an active outliner element).

Now remove the assert and let the operator polls check if we really have
a target element.

Note: this basically makes `get_target_element` obsolete; we could call
`outliner_find_element_with_flag` instead in all cases.

Maniphest Tasks: T91411

Differential Revision: https://developer.blender.org/D12495
2021-11-02 13:10:44 +01:00
Jacob Lewallen
7d2b6a213f Pass correct array size to BKE_object_material_remap_calc
This was patch D12460 from jlewallen and fixes T91339 and T90818.
2021-11-02 12:57:51 +01:00
852d10bd3d Fix T92265: Outliner crash clicking override warning buttons
`outliner_draw_overrides_buts` uses `uiDefIconBlockBut`, but does so
without defining a function callback to actually build a block.
This will make the button go down the route of spawning a popup, but
without a menu. The crash then happens later accessing the (missing) menu in
`ui_handler_region_menu`.

So while we could dive into making this usage failsafe (carefully
checking `BUTTON_STATE_MENU_OPEN` in combination with
`uiHandleButtonData->menu` being NULL all over), it seems much more
straightforward to just use `uiDefIconBut` (instead of
`uiDefIconBlockBut`), since these Override Warning buttons do not seem to
intend spawning a menu anyway.

Maniphest Tasks: T92265

Differential Revision: https://developer.blender.org/D12917
2021-11-02 12:56:35 +01:00
2491bc5ec7 Fix T92314: Auto naming of the Vertex Group doesn't work for Grease Pencil

Not naming the auto-generated vertex group after the selected bone was
just confusing (since the group would not have an effect), so now use
similar code to what is used for meshes for greasepencil as well.

Maniphest Tasks: T92314

Differential Revision: https://developer.blender.org/D12906
2021-11-02 12:55:01 +01:00
5a5788ace2 Fix T92246: sculpt crash displaying statistics in certain situations
It seems possible to switch object selection (if `Lock Object Modes` is
turned off) and end up with an object that has a SculptSession but a
NULL PBVH.
(I was not able to repro from scratch, but the file from the report was
clearly in that state).

This would crash in displaying scene statistics.

While there might be a deeper fix (making sure PBVH is available early
enough -- possibly using `BKE_sculpt_object_pbvh_ensure`,
`sculpt_update_object` or friends), there are also many checks in tools
for PBVH, so the situation seems to be somewhat valid/expected also in
other places.
So to fix this, just check for a non-NULL PBVH, returning early
otherwise.
Note: this leaves us with displaying 0/0 Faces & Vertices in the borked
case until an operation takes place that updates the PBVH.

Maniphest Tasks: T92246

Differential Revision: https://developer.blender.org/D12904
2021-11-02 12:53:03 +01:00
554b1b1663 Fix missing null-terminator in BLI_string_join_arrayN
Although the documentation says it is added, the null-terminator was missing.
This could cause crashes when logging shader linking errors as shader
sources are empty in this case.
2021-11-02 12:50:31 +01:00
272cb6157d Fix T88766 EEVEE: Missing glossy reflections with Shader to RGB when SSR is active.
This was due to the shading evaluation being outdated inside the ShaderToRGBA
GLSL code.
2021-11-02 12:48:29 +01:00
71799d46b4 Fix T91398 Overlay: Camera BG jitter offset (regression)
This was caused by camera background being rendered in world space, causing
floating point imprecision issues when camera was far from origin.

Add a uniform to change the vertex shader to process everything in view space
to fix the problem.
2021-11-02 12:38:09 +01:00
dff17b0893 Fix T92185: GPencil memory leak removing stroke from python
The API was not removing the weights, points and triangulation data; it was only removing the stroke itself.
2021-11-02 12:35:22 +01:00
d3f0470289 Fix LLVM 12 symbol conflict with Mesa drivers, after recent Linux libs update 2021-11-02 12:33:35 +01:00
31dfdb6379 Add missing "CUDA_ERROR_UNSUPPORTED_PTX_VERSION" to CUEW
This is required for Cycles to report a meaningful error message when it fails to load a PTX module
created with a newer CUDA toolkit version than the driver supports.

Fix crash when kernel loading failed (T91879)

Ref T91879
2021-11-02 12:30:28 +01:00
Gottfried Hofmann
ac42e58e31 Expose Color Management as argument for gpu.types.GPUOffScreen.draw_view3d()
Fix for https://developer.blender.org/T84227

The problem was that https://developer.blender.org/rBe0ffb911a22bb03755687f45fc1a996870e059a8 turned color management for offscreen rendering off by default, which makes it non-color-managed in some cases. So the idea here is that script authors get the choice whether they want color-managed or non-color-managed output. Thus this patch introduces a new boolean argument do_color_management to gpu.types.GPUOffScreen.draw_view3d().

Reviewed By: jbakker

Differential Revision: https://developer.blender.org/D11645
2021-11-02 12:02:25 +01:00
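A hedged usage sketch (runs inside Blender with a 3D Viewport active; everything besides the new `do_color_management` flag follows the existing `draw_view3d()` signature as I understand it, so treat the exact argument list as an assumption):

```lang=python
import bpy
import gpu
from mathutils import Matrix

offscreen = gpu.types.GPUOffScreen(512, 512)
context = bpy.context
view3d = context.space_data   # assumes the active area is a 3D Viewport
region = context.region

offscreen.draw_view3d(
    context.scene,
    context.view_layer,
    view3d,
    region,
    Matrix.Identity(4),        # view matrix
    Matrix.Identity(4),        # projection matrix
    do_color_management=True,  # the new argument introduced by this patch
)
offscreen.free()
```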
71f354a825 Fix T89164: Sculpt "Smooth" brush crash with zero pressure
Caused by {rB3e5431fdf439}

The issue is that sculpting could start with using `SCULPT_smooth` and
(because of the pressure sensitivity dropping to zero) the code would switch
to `SCULPT_enhance_details_brush` at strength zero.
The issue with this, though, is that this can be in the middle or end of a
stroke and the necessary `ss->cache->detail_directions` are only
initialized for the first brush step (see
`SCULPT_stroke_is_first_brush_step` in `SCULPT_enhance_details_brush`).
With these missing, it could only go downhill from there.

Suggest to prevent the "mode-flip" from `SCULPT_smooth` to
`SCULPT_enhance_details_brush` (happening solely because of pressure
strength) by changing the condition.
Now do `SCULPT_enhance_details_brush` only if strength is **below** zero
and `SCULPT_smooth` otherwise (flipping from enhance_details to regular smooth
is fine).

If inverting the brush in the middle of the stroke will be supported at
some point, the codepath of `smooth` would have to inform the cache that
invert changed and detail_directions would have to be initialized then
(even if not at the start of the stroke).

Maniphest Tasks: T89164

Differential Revision: https://developer.blender.org/D12676
2021-11-02 11:54:35 +01:00
53fd7abfe7 Fix T91237: Wrong Editors could sync animation 'Visible Range'
This was reported for the Outliner.

It was possible to set 'show_locked_time' on any space (via python, not
sure if there are other ways to achieve this).
Navigating in an animation editor obviously ruined the layout in certain
Editors that are not made for this.

Now restrict syncing to editors that support it well (the ones that have
this setting exposed in their menus) and prevent setting this in RNA.

Maniphest Tasks: T91237

Differential Revision: https://developer.blender.org/D12512
2021-11-02 11:35:46 +01:00
aaa85ad2e0 VSE: Implement sanity check for files with more channels than supported
This is a follow up to 8fecc2a852.

This makes sure future .blend files that have more channels than the
limit won't break Blender.

For LTS this happens silently, without warning the users.

This is part of https://developer.blender.org/D12645

Differential Revision: https://developer.blender.org/D12648
2021-11-02 11:32:24 +01:00
5d8a7ba7de Fix T91728: Cycles render artifacts with motion blur and object attributes 2021-11-02 11:16:58 +01:00
6489f2212b Fix T88386: Continuous Grab occasionally jumping on Arm64 MacOS
During the processing of a continuous drag event, other mouse move
events may be in the queue waiting to be processed.

But when a mouse wrapping happens, these waiting mouse move events
become out of date as they report a mouse position prior to wrapping.

The current code ignores these events by comparing their `timestamp` to
the time recorded in the last mouse wrapping.

The bug happens because the computed value in
`mach_absolute_time() * 1e-9` for some reason is incompatible with the
value of `[event timestamp]`.

Since macOS 10.6, we have a new way to get the amount of time the
system has been awake. `[[NSProcessInfo processInfo] systemUptime]`.

Using this updated method fixed the problem.

Differential Revision: https://developer.blender.org/D12202
2021-11-02 10:58:14 +01:00
379ffa75aa Fix T90840: Can't duplicate or copy (Ctrl-C) object from linked file.
We need to separate the flag telling duplicate code to not handle
remapping to new IDs etc., from the one telling the code that we are
currently duplicating a 'root' ID (i.e. not a dependency of another
duplicated ID).

This whole duplicate code/logic is still fairly unsatisfying; I think it
will need further refactoring, or maybe even a re-design, at some point...
2021-11-01 09:59:20 +01:00
d56d5bfafc Fix T87703: Failed assert when dragging object data-block into 3D View
Talked with Bastien and we ended up looking into this. Issue is that the
dupliation through drag & drop should also be considered a
"sub-process", like Shift+D duplicating does. Added a comment explaining
why this is needed.
2021-11-01 09:43:30 +01:00
0930a70e81 Version bump: 2.93.6-rc 2021-10-06 12:52:36 +02:00
a791bdabd0 Version bump: 2.93.5-release 2021-10-05 14:04:58 +02:00
af42086e74 Fix building without audaspace 2021-09-27 08:30:29 +02:00
1ff10bb6d1 Fix 'WM_window_find_under_cursor'
This function was not working if the window is partially out of screen space.
2021-09-27 08:28:46 +02:00
4528c9a357 Fix default surface resolution U/V mis-match
The resolution for surfaces was 12 for U, 4 for V,
where both should have been set to 4.

Regression in 9a076dd95a
2021-09-27 08:24:52 +02:00
5b588282a6 Fix T91557: Texture Paint Stencil doesn't use assigned UV Layer
Choosing a UV layer would actually affect the overlay in the viewport
and also painting with the mask brush was in that UV space, but the
resulting stencil mask was always applied with the active UV (not the
explicitly selected stencil UV -- the one one is looking at in the
viewport!) to painting.

This has been like that as far back as I have checked (at least 2.79b);
I am surprised this has not come up before, but it does not seem to make
sense at all...

Now use the UV specified for the stencil layer when applying the mask for
painting, so it corresponds to the stencil mask one is looking at in the
viewport.

Maniphest Tasks: T91557

Differential Revision: https://developer.blender.org/D12583
2021-09-27 08:24:24 +02:00
25440e2a91 Fix T91607: GPencil Tint modifier "apply" removes the effect 2021-09-27 08:23:32 +02:00
93c7e83b2a Audaspace: porting upstream pulseaudio fixes.
Fixes T89045 and T91057.
2021-09-22 08:36:48 +02:00
Jake
651f1b5cf9 EEVEE: fix gloss low roughness error
Raise the lower clamp on spec_angle to prevent NaN from being generated on Intel GPUs at low roughness.

Fixes T88754

Reviewed By: fclem

Maniphest Tasks: T88754

Differential Revision: https://developer.blender.org/D12508
2021-09-22 08:36:27 +02:00
f2e96c2473 Fix T87801: Eevee ambient occlusion is incorrect on M1 macMini
The issue was caused by `textureSize()` returning the size of the level 0
even when the min texture level is higher than 0.

Using a uniform to pass the correct size fixes the issue.

This issue also affected the downsampling of radiance for reflections and
refractions.

This does not affect anything other than the recursive downsampling shaders.
2021-09-22 08:33:31 +02:00
934c724755 Fix T91534: GPencil interpolate Sequence fails if stroke has 0 points
In some cases the stroke has 0 points and this must be skipped in the interpolation.
2021-09-22 08:32:52 +02:00
cc31c159a1 Fix T91511: GPencil weight_get and Vertex Groups not working as expected
The API was checking the total number of weights against the first point of the stroke, and this was not valid because each point can have a different number of weight elements.
2021-09-22 08:31:44 +02:00
8235d4dea6 Fix T91448: GPencil Fill simplify not working in render
The simplify option was hardcoded to be disabled in render.
2021-09-22 08:31:12 +02:00
cbf18b6586 Fix straightline gesture snapping not working for modal tools
This was implemented in {rB14d56b4217f8} but was never working for
tools/operators other than the sculpt line mask tool.

To be precise, the preview actually snapped but the operations (e.g.
mesh bisect, vertex weight gradient) still happened "unsnapped" in
modal. For the sculpt line mask tool this wasn't a problem, because it
only draws a preview while modal, the actual mask was only applied
later.

This solves part one of T91320 (snapping), sculpting also introduced
flipping in {rB7ff6bfd1e0af} which does not make much sense for all
tools, but in bisect this could actually be supported, will add that in
a separate Diff.

ref T91320

Maniphest Tasks: T91320

Differential Revision: https://developer.blender.org/D12470
2021-09-22 08:30:41 +02:00
ec448d5a6a Fix memory leak if an error occurred assigning id-property sequence 2021-09-22 08:29:47 +02:00
e90aabdebf Fix T89241: Scale to fit overflows into a second line 2021-09-22 08:29:26 +02:00
597d9518ef Fix T91159: GPencil Smooth brush is using Affect Pressure but not used
The parameter was in the UI but was not used because it was replaced by Use Thickness.
2021-09-06 09:37:00 +02:00
333db02e5d Fix T88433: no grease pencil depsgraph evaluation with certain drivers
When the same stroke was used as a driver variable, this could make this
stroke already tagged as built in the course of building driver
variables (via `build_gpencil`), but then important stuff from
`build_object_data_geometry_datablock` could be missed later on (because
both of these functions use `checkIsBuiltAndTag`). Most importantly,
setting up operations such as GEOMETRY_EVAL would be skipped entirely.

`build_object_data_geometry_datablock` seems to cover greasepencil just
fine (does the same as `build_gpencil` and more). Proposed solution is to
remove `build_gpencil` entirely. In `build_id` it would then also call
`build_object_data_geometry_datablock` for `ID_GD` IDs. Now the covered
types that _call_ `build_object_data_geometry_datablock` match exactly
to what is covered _inside_ `build_object_data_geometry_datablock`.

Think this "duplication" of functionality was just overlooked in
rB66da2f537ae8 [`build_gpencil` existed long before and said commit made
greasepencil a real object with geometry and such].

thx @JacquesLucke for additional input!

Maniphest Tasks: T88433

Differential Revision: https://developer.blender.org/D12324
2021-09-06 09:36:36 +02:00
f0b3b6710c Fix T91060: GPencil Time Offset Modifier breaks evaluation time
Caused by {rBf3bf87e5887c}.

When using a GPencil Time Offset Modifier, the bGPDlayer->actframe can be
NULL. This can be determined though, but above optimization commit
skipped getting the active frame in this case entirely (with the
intention to only get it if framenumbers did not match).

Now also call BKE_gpencil_layer_frame_get() if actframe is NULL in order
to fetch a valid one if present.

Maniphest Tasks: T91060

Differential Revision: https://developer.blender.org/D12355
2021-09-06 09:36:16 +02:00
ae7ec34dfe Fix T88887: Audio causes issues with Playback when PC put to Sleep, Hibernate or when Screensaver appears
Porting WASAPI device reinitialization from upstream.
2021-09-06 09:35:14 +02:00
16637e7ff4 Fix error scaling thumbnails to zero dimensions
Follow up to fix for T89868.
2021-09-06 09:34:19 +02:00
d6facd44b5 Fix T88909: Win32 getTitle() UTF8 Support
In the Win32 platform our setTitle() can properly assign a Unicode
utf-8 window title. Unfortunately our getTitle() will only read regular
8-bit character strings. This means that we can never compare what we
set to what we get. This patch updates getTitle() to use Unicode-aware
GetWindowTextLengthW and GetWindowTextW.

see T88909 for an example of this affecting user experience.

Differential Revision: https://developer.blender.org/D11782

Reviewed by Ray Molenkamp
2021-09-06 09:33:45 +02:00
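For illustration, the same wide-character pattern can be exercised from Python with ctypes (a standalone, Windows-only sketch; not Blender's GHOST code):

    import ctypes

    def get_window_title(hwnd):
        # Use the Unicode-aware Win32 calls so non-ASCII titles round-trip.
        user32 = ctypes.windll.user32  # Windows only
        length = user32.GetWindowTextLengthW(hwnd)
        buf = ctypes.create_unicode_buffer(length + 1)
        user32.GetWindowTextW(hwnd, buf, length + 1)
        return buf.value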
933c6b7d8a Fix "Text to Object" creating invisible object
Newly created objects would not become visible until
another action forced a depsgraph update.
2021-09-06 09:30:29 +02:00
969d6d3a0f Fix invalid mask use for the UV-project modifier
Mistake in a30a817933.
2021-09-06 09:30:03 +02:00
515bdda3d9 Fix buffer size mismatch in SCRIPT_OT_python_file_run
Reading paths over 512 bytes would cause a buffer overrun.
2021-09-06 09:28:46 +02:00
74ba0f8bd8 Fix T88812: Child Windows on Vertical Monitors
This patch improves the positioning of child windows when on monitors
that are arranged vertically (one above another). When calculating a
window position in Ghost coordinates from GL coordinates we were using
monitor height, which can give incorrect values when desktop is taller
than any single monitor. So use desktop height instead.

See D10637 for more details and examples.

Differential Revision: https://developer.blender.org/D10637

Reviewed by Brecht Van Lommel
2021-09-06 09:27:41 +02:00
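A minimal sketch of the coordinate flip, assuming a bottom-left GL origin and a top-left screen origin (illustrative names and formula, not GHOST's actual code):

    def gl_to_screen_y(gl_y, window_height, desktop_height):
        # Flip the vertical axis using the combined desktop height rather
        # than the height of a single monitor, so vertically stacked
        # monitors get sensible child-window positions.
        return desktop_height - (gl_y + window_height)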
1341664e48 Audaspace: porting PulseAudio fixes from upstream. 2021-09-06 09:27:12 +02:00
c6c9e361e2 Audaspace: porting pulseaudio fixes from upstream. 2021-09-06 09:26:12 +02:00
1af6e0daaf Fix T90772: Image Editor not sampling color from the currently
selected pass

Caused by {rBebaa3fcedd23}.

Seems this above commit assumed an ImageUser's multi_index is only used
for Multiview/Stereo? This is not the case, multi_index also stores the
index for layer/pass combination.

If we call both BKE_image_multilayer_index and BKE_image_multiview_index
(even though this is not appropriate/needed for multilayer images?), we
might end up overwriting multi_index again.

note: looking at this I was also wondering why we update the ImageUser
in image-buffer-acquiring functions [and not from the UI, e.g.
template_image_layers, but that is a whole different story I guess, see
comment in T90772 as well]

note2: this could also use a utility function (this is not the only
place where this is done); this is for a cleanup commit.

Maniphest Tasks: T90772

Differential Revision: https://developer.blender.org/D12267
2021-09-06 09:24:38 +02:00
52fdf42d35 Fix T90651: camera reconstruction crash without scene camera
This was working differently in 2.79, tried tracking this down and it
seems this was wrong since the 2.8 beginning in {rB7907dfc40018}.

This would not only crash without an active scene camera, but would also
result in tracks from different cameras' constraints not being
selectable.

So the select id now depends on the corresponding camera; remove the dependency on
scene camera completely.

Maniphest Tasks: T90651

Differential Revision: https://developer.blender.org/D12230
2021-09-06 09:22:46 +02:00
ae1a57a05e Version bump: 2.93.5-rc 2021-09-01 13:53:24 +02:00
b7205031ce Version bump: 2.93.4-release 2021-08-31 11:23:20 +02:00
7ec351c0d5 FFMPEG: Fix building with older versions that need FFMPEG_USE_DURATION_WORKAROUND 2021-08-25 15:43:14 +02:00
85c08c9717 VSE: Flush audio encode after finishing video export
We didn't flush audio after encoding finished, which led to audio
packets being lost.

In addition to this the audio timestamps were wrong because we
incremented the current audio time before using it.

Reviewed By: Richard Antalik

Differential Revision: http://developer.blender.org/D11916
2021-08-25 15:43:01 +02:00
f53cdbdd27 Pipeline: Use more explicit cuda versions. 2021-08-23 13:50:02 +02:00
1a4122d441 Add sanity NULL checks when loading sound sequences
Would cause crashes in files that had lingering invalid sound sequences around.
For example our tests/render/volume/fire.blend test file.
2021-08-23 11:55:26 +02:00
e97a2c228e Fix T87967: M2T video seeking is broken
Bug caused by integer overflow in ffmpeg_generic_seek_workaround().
Function max_ii() was used to limit an int64_t value.

After fixing the issue there was another issue, where a near-infinite loop
was caused by requested_pos being very large and the stream being cut in a
way that it was missing a keyframe at the beginning.
This was fixed by checking if we are reading beyond file content.

Reviewed By: zeddb

Differential Revision: https://developer.blender.org/D11888
2021-08-23 11:54:30 +02:00
c634d859b2 VSE: Use lines to draw waveform
Refactor and improve waveform drawing.

Drawing now can use line strips to draw waveforms instead of only
triangle strips. This makes us able to properly visualize thin waveforms
as they would not be visible before. We now also draw the RMS value of
the waveform.

The waveform drawing is now also properly aligned to the screen pixels
to avoid flickering when transforming the strip.

Reviewed By: Richard Antalik

Differential Revision: https://developer.blender.org/D11184
2021-08-23 11:53:18 +02:00
489df7ac88 VSE: Fix audaspace not reading ffmpeg files with start offset correctly
The duration and start time for audio strips were not correctly read in
audaspace.

Some video files have a "lead in" section of audio that plays before the
video starts playing back. Before this patch, we would play this lead in
audio at the same time as the video started and thus the audio would not
be in sync anymore.

Now the lead in audio is cut off and the duration should be correctly
calculated with this in mind.

If the audio starts after the video, the audio strip is shifted to
account for this, but it will also lead to cut off audio which might not
be wanted. However we don't have a simple way to solve this at this
point.

Differential Revision: http://developer.blender.org/D11917
2021-08-23 11:21:31 +02:00
00dd68405d VSE: Fix seeking issues.
The seek pts was not correctly calculated.
In addition to that we were not seeking in the video pts time base.

Reviewed By: Richard Antalik

Differential Revision: http://developer.blender.org/D11921
2021-08-23 11:20:32 +02:00
d486d24868 VSE: Fix video strip duration calculation
The video duration was not read correctly from the video file.

It would use the global duration of the file which does in some cases
not line up with the actual duration of the video stream.
Now we take the video stream duration and start time into account when
calculating the strip duration.

Reviewed By: Richard Antalik

Differential Revision: http://developer.blender.org/D11920
2021-08-23 11:19:17 +02:00
54a821e8fd VSE: Fix memory leak when adding bad image/movie strips
If the add strip operator errored out, we wouldn't free custom data allocated

Reviewed By: Richard Antalik

Differential Revision: http://developer.blender.org/D11919
2021-08-23 11:18:35 +02:00
9511009438 VSE: Fix "off by one" error when encoding audio
Before we didn't encode the audio up until the current frame.
This led to us not encoding the last video frame of audio.

Reviewed By: Richard Antalik

Differential Revision: http://developer.blender.org/D11918
2021-08-23 11:17:55 +02:00
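A minimal sketch of the inclusive-range idea (illustrative names and math, not the actual encoder code):

    def audio_samples_to_encode(current_video_frame, fps, sample_rate):
        # Encode audio up to and including the end of the current video
        # frame, not just up to its start, so the final frame's audio is
        # not dropped.
        end_time_in_seconds = (current_video_frame + 1) / fps
        return int(round(end_time_in_seconds * sample_rate))

    # e.g. at 24 fps and 48 kHz, finishing frame 0 needs the first
    # 2000 samples rather than 0.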
ab572f3ac1 Fix: instances are made real when they shouldn't be
The original assumption that the `modifyMesh` function is only
called when the modifier is applied was wrong. There are still a
couple of other places calling it through `BKE_modifier_modify_mesh`.

Now there is an extra check that makes sure instances are only
realized when the modifier is actually applied.
2021-08-23 11:15:17 +02:00
3fa1de6ad6 Fix T90737: VSE adding nested strips could have non-unique names
Caused by {rBbbb1936411a5}.

When adding strips via the new SEQ_add_XXX_strip functions, the
`Editing->seqbasep` pointer was passed around.
Following in `seq_add_generic_update` this `seqbasep` pointer was used
to ensure a unique name.
But `seqbasep` is the pointer to the current list of seq's being edited
(**which can be limited to the ones within a meta strip**).

We need unique names across all strips though (since these are used for
RNA paths, FCurves as reported), so now use the scene's `Editing-
>seqbase` (**which is the list of the top-most sequences**) instead.

Unfortunately this might have screwed files to a borked state, not sure
if this could easily be fixed...

Maniphest Tasks: T90737

Differential Revision: https://developer.blender.org/D12256
2021-08-23 11:14:53 +02:00
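A rough sketch of the uniqueness rule using bpy-style attribute names (assumed here, not the actual C code):

    def all_strip_names(seqbase):
        # Walk the top-level strip list and recurse into meta strips, so a
        # new name is checked against every strip in the scene, not only
        # against the list currently being edited inside a meta strip.
        for strip in seqbase:
            yield strip.name
            if strip.type == 'META':
                yield from all_strip_names(strip.sequences)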
f311542182 Fix T90719: Boost sources download address needed to be updated. 2021-08-23 11:10:34 +02:00
Matteo F. Vescovi
9d94b358ca Fix FTBFS on mips64el architecture
While trying to get Blender 2.93.x LTS to build fine on all release architectures in Debian, I noticed that the misleading use of "mips" as an integer variable caused problems when compiling on mips64el. The patch should fix the issue.

Reviewed By: fclem

Differential Revision: https://developer.blender.org/D12194
2021-08-23 10:05:49 +02:00
9fb9bf5996 Fix: DNA struct alignment on 32 bit
Some of the DNA structs were not properly
aligned for 32-bit builds, causing issues
for some of the 32-bit platforms Debian builds
for.

Reviewed By: sergey, brecht
Differential Revision: https://developer.blender.org/D9389
2021-08-23 10:05:01 +02:00
70df9119f4 Makesdna: Fix detecting 32 bit padding issues.
Makesdna fails to detect issues in 32 bit code that can
only be resolved by adding a padding pointer.

We never noticed since we ourselves no longer build for
32 bit, but debian's 32 bit builds got bitten by this

A rather extensive explanation of why this alignment
requirement is there can be found in this comment:

https://developer.blender.org/D9389#233034

Differential Revision: https://developer.blender.org/D12188

Reviewed by: sergey, campbellbarton
2021-08-23 09:43:18 +02:00
06e3e2f21c Fix T90364: buttons (partially) behind animchannel search block search
When channels are scrolled to be (partially) behind the search bar,
their widget buttons would still be interactive, preventing the search
buttons from being usable.

We have to make sure the events are consumed by the search and don't
reach other UI blocks.
We can do so by flagging the block `UI_BLOCK_CLIP_EVENTS` -- but we also
have to make sure the bounds are calculated correctly (otherwise the
check relating to `UI_BLOCK_CLIP_EVENTS` in `ui_but_find_mouse_over_ex` won't
trigger properly).

Maniphest Tasks: T90364

Differential Revision: https://developer.blender.org/D12103
2021-08-23 09:34:42 +02:00
c2ba9be098 Fix NLA action cannot be unlinked in certain cases
The poll for unlinking calls `nla_panel_context` without providing an
adt pointer, and there is a check for this pointer in
`nla_panel_context` leading to never returning true if it is not
provided. (this is fine if there are tracks already, poll would succeed
in this case, `nla_panel_context` goes a different code path then)

Same call to `nla_panel_context` is also done in the beginning of the
corresponding unlink exec function (but this time providing the pointer
because it is used later), so it makes sense to do the same thing in the
poll function. Equal check is also done in the panel poll function, so
now these are all in sync.

Part of T87681.

Maniphest Tasks: T87681

Differential Revision: https://developer.blender.org/D11041
2021-08-23 09:34:07 +02:00
60dbbaac42 Fix T88498: 'Clear Parent' does not clear parent_bone
Clearing the parent from the UI using the X (or from python) clears the
`parsubstr` and set `partype` back to `PAROBJECT`.

Using the Clear Parent operator would leave the `parsubstr` (and thus
`parent_bone`) untouched even though this operator claims to "clear
parenting relationship completely" (it also removes parent deform
modifiers for example).

So now, also clear `parsubstr` and set back to `PAROBJECT` [which is
default].

Maniphest Tasks: T88498

Differential Revision: https://developer.blender.org/D11503
2021-08-23 09:33:24 +02:00
47618781d8 Fix T89805: NLA crash without active track
Was reported for a file which does not have an active track set in
AnimData even though it was in strip tweak mode (but this was accessed in
is_nlatrack_evaluatable()).

Root cause for this is not totally clear, but I assume the situation is
described as part of T87681 (and is fixed in D11052).

This patch here just prevents the crash for files that are already in the
borked state.

Reviewers: sybren

Maniphest Tasks: T89805

Differential Revision: https://developer.blender.org/D12085
2021-08-23 09:33:01 +02:00
c2f74d6243 BLI_math: Fix several division-by-zero cases.
Those were caused by various tools used on degenerate geometry, see
T79775.

Note that fixes are as low-level as possible, to ensure they cover as
much as possible of unreported issues too.

We still probably have many more of those hidden in BLI_math though.
2021-08-23 09:32:38 +02:00
0cd6c1d502 Fix T88998: GPencil not projecting to the most front surface
It was projecting from the stroke position.

The behavior has changed in {rB5400be9ffee2}.
2021-08-23 09:21:44 +02:00
d97e586f58 Geometry Nodes: Fix vector math project bug
Implementation is incorrect compared to Cycles/Eevee.

Reported by @DrDubosc in comments of T88922.

Differential Revision: https://developer.blender.org/D12029
2021-08-23 09:20:41 +02:00
690fa2ba86 PyAPI: resolve build error with Python 3.10
Resolves T89931
2021-08-23 09:19:53 +02:00
7b1e8e244c Fix T90493: Undo a knife-project operation crashes
The crash occurred because calling mesh_get_eval_final in edit-mode
freed all derived mesh data without tagging the object for updating.

However meshes in edit-mode weren't meant to be used as knife-project
source-data; adding support for multi-object edit-mode caused this.
2021-08-23 09:18:47 +02:00
06ecdab9bc Fix T88033: Python reference memory leaks for non main data-blocks
ID data-blocks that could be accessed from Python and weren't freed
using BKE_id_free_ex did not release the Python reference count.

Add BKE_libblock_free_data_py function to clear the Python reference
in this case.

Add asserts to ensure no Python reference is held in situations
when ID's are copied for internal use (not exposed through the RNA API),
to ensure these kinds of leaks don't go by unnoticed again.
2021-08-23 09:18:22 +02:00
85b28933f0 Fix T89241: 3D Text "Scale to Fit" wraps onto the second line
Disable wrapping when "scale to fit" is used, assert the error is
small so an invalid scale-to-fit value won't go by unnoticed.
2021-08-23 09:14:53 +02:00
e1e2abd4bf Fix memory leak in edit-mesh dissolve degenerate 2021-08-23 09:08:03 +02:00
2cbfa786f3 Fix T89306: GPencil selection doesn't work correctly with modifiers
The problem was introduced with Bezier modification because the selection code was using the original stroke and not the evaluated version.
2021-08-23 09:07:15 +02:00
9eb62b99bc Fix T90791: Knife project leaks memory with curve/text cutter 2021-08-23 09:06:44 +02:00
1e736d8a95 Cleanup: rename BKE_mesh_free -> BKE_mesh_free_data
It wasn't obvious this didn't free the memory of the mesh itself,
leading to memory leaks.
2021-08-23 09:05:20 +02:00
afde98a826 Cleanup: accidentally included printf 2021-08-23 08:54:42 +02:00
20f04ce62a Fix memory leak with building springs in the cloth simulator
Error in 2788b0261c.
2021-08-23 08:54:06 +02:00
63559a779c Version bump: 2.93.4-rc 2021-08-18 14:30:18 +02:00
8b80d19f36 Fix T88552: Cycles changing Render Passes in viewport does not work
Backporting this fixes T90599.
2021-08-17 20:30:22 +02:00
9fca86d549 Version bump: 2.93.3-release 2021-08-17 15:01:53 +02:00
1eeae9be15 [2.93 only] pose slide factor wrong in redo popup
Originally this was caused by {rBc71a8e837616}.
Above commit was renaming RNA definitions, but not the occurrences where
RNA values were set.

In 3.0, we had improvements to pose-sliding tools (D9054) where this was
somewhat ironed out (only partially, see D12187).

But since the pose-sliding improvements are not part of 2.93, we have to
correct this in 2.93 ONLY.

Maniphest Tasks: T89027

Differential Revision: https://developer.blender.org/D12191
2021-08-12 17:05:48 +02:00
ee486aa5ce Fix T83164: Spline IK joint_bindings parameter is broken.
Code freeing the array would not properly reset its length value to
zero.

Note that this corrupted data could also be saved in .blend files, so
had to bump fileversion and add some doversion code too.

Fix T90166: crash when creating a liboverride.
2021-08-09 08:29:55 +02:00
93aae90ab5 Cleanup/Fix RNA array length accessors returning non-zero values in invalid cases.
This was apparently done in two places only, with a very cryptic comment
(`/* for raw_access, untested */`), and... I cannot see how returning a
non-zero length value for an array that does not exist or is not
accessible at least, would be anything but an obvious source of issues.

Note that both commits adding those lines are from stone ages (2009):
rBcbc2c1886dee and rB50e3bb7f5f34.
2021-08-09 08:25:14 +02:00
45efd0b6a1 Fix T89835: Crash after Instancing to Scene after making linked Collection local.
Even though the ID itself remains the same after being made local, from
depsgraph point of view this is a different ID. Hence we need to tag all
of its users for COW update, as well as rebuild depsgraph relationships.

Should be also backported to LTS 2.93 (and 2.83 if possible).
2021-08-09 08:23:05 +02:00
2ccf4b15cc Fix T87041: Driver Editor not updated in realtime
Caused by {rBbbb2e0614fc3}

Since above commit only the playhead is updated as an overlay in
animation playback (was moved out of drawing of the main region for
performance reasons).
The driver value "debug" visualization is very useful to have during
playback though but was left in main region drawing as part of
`draw_fcurve` (thus does not update in realtime anymore).

Moving `graph_draw_driver_debug` into the overlay is not feasible
because it requires animation filtering which has significant overhead
which needs to be avoided in the overlay which is redrawn on every UI
interaction.

Now tag the whole main region for updates in the Driver Editor during
playback instead (which will make the Drivers Editor as slow during
playback as before rBbbb2e0614fc3 -- but with realtime updates of the
debug visualization).

Maniphest Tasks: T87041

Differential Revision: https://developer.blender.org/D12003
2021-08-09 08:22:34 +02:00
6e987ca32a Fix T78469: Output Metadata: Strip Name no longer accessible
Caused by rB7fc60bff14a6.

This has actually been reported and closed, but that was clearly a
misunderstanding (above commit changed a checkbox to be an enum, but a
second checkbox was simply removed)

Maniphest Tasks: T78469

Differential Revision: https://developer.blender.org/D12084
2021-08-09 08:19:14 +02:00
1fa73076ca Fix T88807: crash when there are multiple links between the same sockets
This commit does two things:

* Disallows creating more than one link from one socket to a multi socket input.
* Properly count links if there happen to be more than one link between the same sockets.

The new link counting should also be more efficient asymptotically.

Differential Revision: https://developer.blender.org/D11570
2021-08-09 08:18:26 +02:00
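A rough sketch of the duplicate-link counting using bpy-style attribute names (assumed here, not the actual node-tree update code):

    from collections import Counter

    def duplicate_links(node_tree):
        # Count links per (from socket, to socket) pair; anything above 1
        # is a duplicate connection that should be rejected or counted once.
        counts = Counter(
            (link.from_node.name, link.from_socket.identifier,
             link.to_node.name, link.to_socket.identifier)
            for link in node_tree.links
        )
        return {key: n for key, n in counts.items() if n > 1}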
4e14fe167d Fix particle system duplication duplicates all systems
Followup to rB3834dc2f7b38 (where getting the proper particle system was
fixed for the Adjust Last Operation panel in the Properties Editor). But
since this operator can also be called from the 3DView, get a current
particle system there as well.

Without this, _all_ particle systems would be copied when executing from
the 3DView (which was never really intended [operator description uses
singular] -- it just happens to use `copy_particle_systems_to_object`
internally as well -- same as the `Copy Active/All to Selected Objects`
operators).

ref. T83317

Maniphest Tasks: T83317

Differential Revision: https://developer.blender.org/D12033
2021-08-09 08:17:34 +02:00
Jagannadhan Ravi
2ae04254b3 Speedup rigid body "Copy from Active" operator
If there were lots of selected objects without an existing rigid body,
we would add rigid bodies to them one by one.

This would be slow in python, now we instead do this as a batch
operation in C.

On my (Intel) MacBook it used to take 60 seconds and with this change it
takes about 0.3 seconds.

Reviewed By: Sebastian Parborg

Differential Revision: http://developer.blender.org/D11957
2021-08-09 08:16:23 +02:00
Yann Lanthony
443ae0f22d Fix T89952: GPencil channel box selection offset
The channel box selection was offset for grease pencil layers.

This is a proposed fix by @yann-lty

Before:
{F10227973}

After:
{F10227974}

Reviewed By: #grease_pencil, antoniov

Maniphest Tasks: T89952

Differential Revision: https://developer.blender.org/D11962
2021-08-09 08:15:29 +02:00
69edb9c7d9 LineArt: Occlusion accuracy fix.
This patch fixes the occlusion function to handle one specific case (when an edge shares a point with a triangle) better, especially when there are overlapping edges in this case.
2021-08-09 08:14:06 +02:00
a818ad5a54 Fix memory leak with Python RNA property get callback errors
Failure to return a list of the expected size & type wasn't
decrementing the value, leaking a reference.

Caused by 127b5423d6 a workaround for the
real error that was fixed f5e020a7a6.
2021-08-09 08:13:28 +02:00
bc627fbf65 Fix gpu.types.GPUTexture crash when the size argument was too big
Missing length check on the size argument before copying it
into a fixed size buffer.
2021-08-09 08:13:05 +02:00
2be2aeaf0a Fix memory leaks in Python gizmo get/set handlers 2021-08-09 08:11:00 +02:00
db5501eb6e Revert "Fix spin-gizmo not allowing click events to select vertices"
This reverts commit 0b903755a9.

This caused T86030, left-mouse selection override clicking,
which is used for creating a full revolution.
2021-08-09 08:10:04 +02:00
168965835c Fix T85436: Separate by loose parts doesn't show new objects
Only the "changed" state from the last edit-object was used,
this meant the operator would not perform the necessary update
with multi-object edit-mode.

Use "changed" & "changed_multi" naming convention.
2021-08-09 08:09:30 +02:00
2bb38f3022 Fix T90417: font loading creates duplicate ID names
Also repair any errors in existing files.

Error from e0dd3fe587.
2021-08-09 08:06:52 +02:00
a1b687c7a4 Fix broken logic in Windows directory query function
Mistake in a5bbdd6998
2021-08-09 08:02:01 +02:00
f6665f506b Fix T89450: Crash slicing BMEditSelSeq
Slicing with indices greater than the length of the sequence would crash.
2021-08-09 08:01:19 +02:00
e862874606 Fix slicing with negative indices
Negative indices that remained negative after adding the sequence length
caused incorrect slicing.

With the default scene for example:

   bpy.context.scene.objects[-4:2]

Gave a different result to:

   tuple(bpy.context.scene.objects)[-4:2]

Clamp indices above zero so loops that step forward work as intended.
2021-08-09 08:00:16 +02:00
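A minimal sketch of the clamping rule, mirroring CPython slicing (illustrative, not the actual BMEditSelSeq code):

    def clamp_slice_index(index, length):
        # Negative indices count from the end; anything still negative
        # after adding the length clamps to zero so forward-stepping
        # loops behave like CPython slicing.
        if index < 0:
            index += length
            if index < 0:
                index = 0
        return min(index, length)

    # clamp_slice_index(-4, 3) -> 0, matching tuple(objects)[-4:2]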
4d7f10d614 Fix invalid index use for edit-mesh laplacian smooth
Only vertex indices were ensured to be correct.
2021-08-09 07:59:37 +02:00
ec00d218c9 Fix T90477: Cursor vertex snapping not working in UV editor
`t->obedit_type` is read to indicate if we are in uv edit mode.

Also fixes a crash introduced in {rBdd14ea18190ff27082009f73a556569a43377a71}.
2021-08-09 07:57:24 +02:00
fb7510087f Version bump: 2.93.3-rc 2021-08-04 16:17:57 +02:00
1eb06de260 Version bump: 2.93.2 release. 2021-08-03 07:58:55 +02:00
cb6d1b76dc Remove pipeline_config.json
File is replaced by pipeline_config.yaml.
2021-08-02 14:46:16 +02:00
45ed762124 Cycles: upgrade CUDA to 11.4
This fixes a performance regression on Ampere cards, on specific scenes like
classroom. For cycles-x there is little difference, but this is still helpful
for LTS releases, and we need to upgrade at some point anyway.
2021-07-27 14:05:40 +02:00
fdb811f030 Fix T89884: Cycles stuck on first sample in viewport render
This is a backport of 97f1e47, which was an optimization but also
fixes this bug.

Ref D11279
2021-07-26 18:05:40 +02:00
256ab174d8 Added pipeline config formatted in yaml. 2021-07-26 14:57:21 +02:00
Himanshi Kalra
60a0928f35 Fix T85517: Cannot type Space while holding Shift key in text-field like spaces.
Fix for T85517
Bug: Couldn't type a space while holding down the Shift key in text spaces (e.g. when saving a file or changing the name of an object).

Changes: Removing the key combination of Shift + Space in the `WM_event_is_ime_switch` method.

Reviewed By: harley, mont29

Maniphest Tasks: T85517

Differential Revision: https://developer.blender.org/D10452
2021-07-26 09:36:11 +02:00
Yann Lanthony
59b77cd688 Fix T89733: Py API: bpy.data.orphans_purge argument parsing
On Windows, using `bpy.data.orphans_purge` with some arguments (eg: `do_recursive=True`) does not produce the expected results. This is due to arguments not being parsed correctly on this platform with the current code.

The proposed fix is based on how other functions with boolean attributes are exposed to the Python API.

Reviewed By: #python_api, mont29

Maniphest Tasks: T89733

Differential Revision: https://developer.blender.org/D11963
2021-07-26 08:44:01 +02:00
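Typical usage then looks like this (do_recursive is the argument from the report; the other keyword names follow the current API docs and may differ between versions):

    import bpy

    # Purge unused data-blocks, including indirectly unused ones.
    bpy.data.orphans_purge(do_local_ids=True, do_linked_ids=True,
                           do_recursive=True)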
05ffe05ebc Fix T89982: Geometry Nodes: 'New' Button tries to create node_tree on active modifier, rather than button context
When done from the Properties Editor, the context's modifier should be
used (this is where the button is located), when done from elsewhere,
the active modifier is still the way to go (since the context modifier is
not available then)

Maniphest Tasks: T89982

Differential Revision: https://developer.blender.org/D11972
2021-07-26 08:42:53 +02:00
00b135a379 Fix T89981: missing refresh on the compositors render layer node when adding/removing AOVs
Just refresh the node's outputs via ntreeCompositUpdateRLayers().

Maniphest Tasks: T89981

Differential Revision: https://developer.blender.org/D11973
2021-07-26 08:41:53 +02:00
283481643c Fix T89736: Cycles error with persistent data, displacement and motion blur 2021-07-26 08:40:36 +02:00
3e6ea470ee Fix T89851: Geometry nodes: wrongly detected "Node group has unidentified
nodes or sockets" error

rBfe22635bf664 introduced a utility to check for this (but it was always
returning true).

This wasn't a problem in master (since it is unused there), but in the
2.93 branch this utility is actually used, and the error resulted in all
geometry node trees appearing with the "Node group has unidentified nodes
or sockets" message (and being unusable).

Now return false in has_undefined_nodes_or_sockets if all nodes and
sockets have been successfully checked.

This commit then needs to end up in the 2.93 branch.

Maniphest Tasks: T89851

Differential Revision: https://developer.blender.org/D11911
2021-07-26 08:39:24 +02:00
25a1b27a93 LineArt: Fix crash due to empty duplicollection. 2021-07-26 08:36:55 +02:00
81e12cd9b1 Fix T89455: Cycles crash when rendering a Mesh with autosmooth
The crash was caused by a mistake in 5f9677fe0c
where the pointers to the custom data layers would be overwritten with the one
for the first layer, as CustomData_duplicate_referenced_layer is only about the
first layer. customData_duplicate_referenced_layer_index should be used instead
to duplicate the right layer.
2021-07-26 08:35:33 +02:00
d070cce778 Fix T88756: crash when baking with autosmooth
When baking some data, we create a new Mesh with edits and modifiers applied.
However, in some cases (e.g. when there is no modifier), the returned Mesh is
actually referencing the original one and its data layers. When autosmooth is
enabled we also split the Mesh. However, since the new Mesh is referencing the
original one, although `BKE_mesh_split_faces` is creating new vertices and edges,
the reallocation of the custom data layers is preempted because of the
reference, so adding the new vertices and edges overwrites valid data.

To fix this we duplicate referenced layers before splitting the faces.

Reviewed By: brecht

Differential Revision: https://developer.blender.org/D11703
2021-07-26 08:32:26 +02:00
b529a84ec3 Fix T88015: Round end caps on Freestyle lines not shaped as documented
This might be an artistic choice, but round end caps are supposed to be
a "half circle centered at the end point of the line" as documented
here: https://docs.blender.org/manual/en/dev/render/freestyle/
parameter_editor/line_style/strokes.html#caps

They are a shashed half circle instead.

This patch makes these pure half circles [and also fixes the case where
the thickness of the beginning was used for both the beginning and end of
the stroke]

Maniphest Tasks: T88015

Differential Revision: https://developer.blender.org/D11340
2021-07-26 08:31:23 +02:00
2d32bf14e4 Fix T89765: boolean modifier collection refcount issue
The 'collection' property is flagged PROP_ID_REFCOUNT, so the
modifier's foreachIDLink functions should walk with IDWALK_CB_USER
(instead of IDWALK_CB_NOP).

Otherwise the modifier won't be included as a user of the collection
(e.g. on file read); removing the collection from the modifier will
decrement the user count though (which in the worst-case scenario makes the
collection an orphan and will result in data loss).

Maniphest Tasks: T89765

Differential Revision: https://developer.blender.org/D11877
2021-07-26 08:30:05 +02:00
563fdaaa39 Fix channel packed images display in the Image/Node editor
Channel packed images should not have their RGB affected by alpha.
Rendering in Cycles and Eevee was fine already, but displaying these was
not right in the Image and Node editors.

Not 100% sure what to do for the "Color and Alpha" mode, but I guess
this should stay like it was before (applying the alpha).

"Color", "R", "G", and "B" modes were changed to not have color be
affected by alpha though.

ref. T89034

Maniphest Tasks: T89034

Differential Revision: https://developer.blender.org/D11871
2021-07-26 08:27:10 +02:00
7b94e7cca4 Alembic export: evaluation mode option
This option will determine visibility based on either render or viewport
visibility. The same goes for modifier settings. So it will either evaluate the
depsgraph with DAG_EVAL_RENDER or DAG_EVAL_VIEWPORT.
This not only makes it more flexible, it is also a lot
clearer which visibility / modifier setting is taken into account (up
until now, this was always considered to be DAG_EVAL_RENDER)

This option was always present in the USD exporter, this just brings
Alembic in line with that.

ref. T89594

Maniphest Tasks: T89594

Differential Revision: https://developer.blender.org/D11820
2021-07-26 08:25:18 +02:00
cb1601ddf3 Alembic: remove non-functional "Renderable Objects" only option
When introduced in {rB61050f75b13e} this was actually working (meaning
it checked the Outliner OB_RESTRICT_RENDER flag and skipped the object if
desired).

Behavior has since then been commented in rBae6e9401abb7 and apparently
refactored out in rB2917df21adc8.

If checked, it seemed to be working (objects marked non-renderable in
the Outliner were pruned from the export), however unchecking that
option did not include them in the export.

Now it changed - for the worse if you like - in rBa95f86359673 which
made it so if "Renderable Objects" only is checked, it will still export
objects invisible in renders. So since we now have the non-functional
option with a broken/misleading default, it is better to just remove it
entirely.

In fact it has been superseded by the "Visible Objects" option (this
does the same thing: depsgraph is evaluated in render mode) and as a
second step (and to make this even clearer) a choice whether
Render or Viewport evaluation is used can be added (just like the USD
exporter has). When that choice is explicit, it's also clear which
visibility actually matters.

This is breaking API usage, should be in release notes.

ref. T89594

Maniphest Tasks: T89594

Differential Revision: https://developer.blender.org/D11808
2021-07-26 08:23:53 +02:00
5d1ef0efd0 Fix object "Set Origin" operating on linked library data
Regression in d25747ee75
2021-07-26 07:59:17 +02:00
41bd164e4c Fix T70356: Scaling up 1x1 pixel image reads past buffer bounds
Also resolve a crash when displaying thumbnails, see T89868.
2021-07-26 07:56:47 +02:00
062764d5d0 Fix T89861: Checking face selection breaks UV stitch operator 2021-07-26 07:55:18 +02:00
Jesse Yurkovich
d097d35be6 Fix T89868: Crash showing thumbnail of wide-aspect image
Scaling down images could create images with a width or height of zero.

Clamp at 1 to prevent a crash, also add an assert to scaling functions.

Ref D11956
2021-07-26 07:54:11 +02:00
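A minimal sketch of the clamping idea (illustrative names, not the actual IMB scaling code):

    def thumbnail_size(width, height, target):
        # Preserve the aspect ratio but never let either dimension hit
        # zero, e.g. a very wide image scaled to a small thumbnail.
        scale = target / max(width, height)
        return max(1, int(width * scale)), max(1, int(height * scale))

    # thumbnail_size(2048, 4, 256) -> (256, 1) instead of (256, 0)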
9e64fd461a Line Art Bug Fix: stroke default in-front should be on
See {D11550} for more information.
2021-07-07 07:52:38 +02:00
2659bc5457 Fix T89153: Follow Path for empty works only in negative values
The old code only clamped cyclic curves
2021-07-07 07:47:37 +02:00
aa1bbadb5b Fix T89571: Align Orientation to Target keeps rotation when toggled
Backport of {4546f176eb0f}
2021-07-07 07:41:00 +02:00
d3f7ed08e7 Cleanup: remove compilation warning in previous commit. 2021-06-30 09:45:07 +02:00
e9d6f2164e UI: Hide collection tab when scene master collection is active
CollectionLineart does not care about the configurations
in master collection.
Other options are not applicable to the master collection either.
Hence hiding it.

Reviewed by Dalai Felinto (dfelinto)

Differential Revision: https://developer.blender.org/D11702
2021-06-30 09:43:42 +02:00
bec8e436a1 Fix: VSE seeking with proxy strips would fail on certain frames
If the last decoded frame had the same timestamp as the GOP current
packet, then we would skip over this frame when fast forwarding and we
would seek until the end of the file.

This could only be triggered reliably in single-threaded mode.

Reviewed By: Richard Antalik

Differential Revision: http://developer.blender.org/D11601
2021-06-30 09:41:44 +02:00
444a8cbc2f Fix: VSE search in mpegts files would fail
ffmpeg_generic_seek_workaround did not work properly and our start pts
calculation was wrong.

Reviewed By: Richard Antalik

Differential Revision: http://developer.blender.org/D11562
2021-06-30 09:41:16 +02:00
7616f4ae57 Fix: VSE indexer seeking not working correctly
Because of the added sanity checks in rB14508ef100c9 (D11492), seeking
in proxies would not work correctly any more. This is because it wasn't
working as intended before, but in most cases this wouldn't be
noticeable. However now when the sanity checks are tripped it is very
noticeable that something is wrong.

The indexer tried to use dts values for time stamps when we used pts in
our decode functions to get the time positions. This would make it
start in the wrong GOP frames when searching. Now that we enforce no
crossing of GOP frames when decoding after seek, this would lead to
issues.

Now we correctly use pts (or dts if pts is not available) and thus we
don't have any seeking issues because of a time stamp format mismatch.

Reviewed By: Richard Antalik

Differential Revision: http://developer.blender.org/D11561
2021-06-30 09:40:54 +02:00
a68f5456e4 Fix: VSE timecodes being used even when turned off.
Reviewed By: Richard Antalik

Differential Revision: http://developer.blender.org/D11567
2021-06-30 09:39:02 +02:00
cea80f1add Fix: Prevent small memory leak in VSE indexer
We need to unref the packet to tell ffmpeg it is ok to free it after
use.
2021-06-30 09:38:01 +02:00
02a6be5443 Fix: Wrong logic for checking if we can reuse decoded frame
We should only check if the new pts value lies inside the duration of
the current frame.
2021-06-30 09:37:43 +02:00
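A minimal sketch of that check (illustrative names, not the actual FFmpeg wrapper code):

    def can_reuse_decoded_frame(requested_pts, frame_pts, frame_duration):
        # Reuse the already-decoded frame only when the requested timestamp
        # falls inside that frame's display interval.
        return frame_pts <= requested_pts < frame_pts + frame_duration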
7eb3e77b94 FFmpeg: Fix seeking not returning the correct frame when not using TC index
Fixed the logic for seeking in ffmpeg video files.
The main fix is that we now apply a small offset in ffmpeg_get_seek_pos
to make sure we don't get the frame in front of the seek position when
seeking backward.

The rest of the changes is general cleanup and untangling code.

Reviewed By: Richard Antalik

Differential Revision: http://developer.blender.org/D11492
2021-06-30 09:37:21 +02:00
13ab6b7bb6 Fix T57397: Movies are blurred after sws_scale
Images with 4:2:2 and 4:4:4 chroma subsampling were blurred when
`SWS_FAST_BILINEAR` interpolation is set for `anim->img_convert_ctx`.

Use `SWS_BILINEAR` interpolation for all movies, as performance is
not impacted by this change.

Reviewed By: sergey

Differential Revision: https://developer.blender.org/D11457
2021-06-30 09:34:48 +02:00
00ffe02820 FFmpeg: Update proxy settings
Changes in rBce649c73446e affected the established proxy codec presets.
Presets were not working and all presets were similar to `veryfast`.
Tunes are now working too, so `fastdecode` tune can be used. I have
measured little improvement, but I tested this only on 2 machines and
I have been informed that `fastdecode` tune does influence decoding
performance for some users.

Change preset from `slow` to `veryfast` and add tune `fastdecode`

Reviewed By: sergey

Differential Revision: https://developer.blender.org/D11454
2021-06-30 09:34:21 +02:00
19c0666d40 Fix T88623, T87044: Make encoded videos play correctly in VLC
The issue was twofold. We didn't properly:

1. Initialize the codec default values which would lead to VLC
   complaining because of garbage/wrong codec settings.

2. Calculate the time base for the video. FFmpeg would happily accept
  this but VLC seems to assume the time base value is at least somewhat
  correct and couldn't properly display the frames as the internal time
  base was huge. We are talking about 90k ticks (tbn) for one second of
  video!

This patch initializes all codecs to use their default values and fixes
the time base calculation so it follows the guidelines from ffmpeg.

Reviewed By: Sergey, Richard Antalik

Differential Revision: http://developer.blender.org/D11426
2021-06-30 09:33:49 +02:00
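A minimal sketch of the time-base rule (illustrative, not the actual FFmpeg setup code):

    from fractions import Fraction

    def video_time_base(fps_num, fps_den=1):
        # The stream time base should be the inverse of the frame rate,
        # e.g. 1/25 for 25 fps, instead of an arbitrary 1/90000 tick clock.
        return Fraction(fps_den, fps_num)

    print(video_time_base(25))           # 1/25
    print(video_time_base(30000, 1001))  # 1001/30000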
14308b0a5e Make encoded video fps correct with ffmpeg < 4.4
Before the FFmpeg commit: github.com/FFmpeg/FFmpeg/commit/1c0885334dda9ee8652e60c586fa2e3674056586
FFmpeg would use deprecated variables to calculate the video fps.

We don't use these deprecated variables anymore, so ensure that the
duration is correct in ffmpeg versions without this fix.

Reviewed By: Sergey, Richard Antalik

Differential Revision: http://developer.blender.org/D11417
2021-06-30 09:33:12 +02:00
fe4cbe62df Fix T87932: Failure to build movie strip proxy
We didn't initialize the scaled proxy frame properly.
This would lead to issues in ffmpeg 4.4 as it is more strict about the API being used properly.

Now we initialize the size and format of the frame.
2021-06-30 09:32:45 +02:00
f522e8db22 Cleanup: Remove deprecated variables and function calls from our ffmpeg code
There needs to be more cleanup for ffmpeg 4.5 (the ffmpeg master branch).

However this now compiles on ffmpeg 4.4 without any deprecation
warnings.

Reviewed By: Sergey, Richard Antalik

Differential Revision: http://developer.blender.org/D10338
2021-06-30 09:32:06 +02:00
3b7a520d88 Fixed issue in previous commit.
During development a test was disabled. Enabling it again.
2021-06-30 09:28:28 +02:00
2c7efe5ac0 Fix T89405: Viewport Render Preview glitching (AMD)
AMD Drivers didn't report an additional space in the renderer. This made
testing for the HQ workaround fail and the issue appeared back on
certain cards.

This fix will test with surrounding spaces or if the renderer name
ends with the given string. If any of these are the case, the hq normals
workaround will be enabled.
2021-06-30 09:28:06 +02:00
3c6a981fd9 Fix: Crash Requesting GPU_SHADER_GPENCIL_FILL builtin shader.
Shader doesn't have any shader code. Requesting the shader
would crash blender. Solved by removing the enum_value.
2021-06-30 09:27:33 +02:00
Pratik Borhade
46daaec425 Fix T88808: Set Origin missing from Text object in 2.93
Fix T88808.
Caused by {rB5f2c5e5bb8c15bf0d6679351e3482f9c38c00935}

The object type for `TEXT` objects was missing in the following check,
which is why the `Set Origin` option was lost from the object context menu.

Reviewed By: lichtwerk

Maniphest Tasks: T88808

Differential Revision: https://developer.blender.org/D11495
2021-06-30 09:26:22 +02:00
7573e45103 Fix T89004: Undefined Geometry nodes cause a crash when connected to a Group Output node.
Patch provided by Jacques Lucke.
2021-06-30 09:22:37 +02:00
Fen
129798cacb Fix T89247: Dereference arguments to comparison function correctly
`bm_face_len_cmp` incorrectly interpreted its arguments as `BMFace *`
instead of `BMFace **`, causing an out-of-bounds read.

Ref D11637
2021-06-30 09:19:29 +02:00
32658bb5c9 Fix invalid polygon normal array access building bake data
The precomputed normals index wasn't properly aligned.
Regression from 2ec00ea0c1.
2021-06-30 09:18:48 +02:00
174f39bd03 Fix T89265: Crash when tabbing through num inputs
Fix by reverting the part of ec30cf0b74
that assigned `but->editval` in `ui_numedit_begin_set_values`.

This caused access to freed memory when using Tab to switch
to a numeric input and then leaving the textbox by clicking outside.
This was because `ui_numedit_begin_set_values` shouldn't need to set
`but->editval` and overwrite the pointer.
This would set a pointer that had previously been freed,
causing a `NULL` check to fail later on.

Ref D11679
2021-06-30 09:18:22 +02:00
Piotr Makal
27e3265267 Fix edit-mesh random select regression in random seed use
Fix regression in 9c20228128
that wasn't using the random seed correctly.
2021-06-30 09:14:16 +02:00
cf9cacd091 Fix T89526: "Toggle Maximize Area" clears context screen properties
Removed in b787581c9c as its comment
noted it was bad code and the reason for its necessity was no longer valid.

Add this back with comment explaining why it's still needed.
2021-06-30 09:12:57 +02:00
03d5c8b4ed Blender v2.93.2 candidate version bump. 2021-06-23 14:44:02 +02:00
1b8d33b18c Blender v2.93.1 release version bump. 2021-06-22 07:57:15 +02:00
baa7a53974 Fix T86956: VSE shading mode ignores Grease Pencil Vertex colors.
The issue is due to the strange definition of render in grease pencil
(meaning it should be rendered similar to rendering). This included normal
viewport rendering in OB_RENDER and OpenGL render in OB_RENDER.

For other rendering modes the overlay vertex opacity would be used. This
patch sets this value to 1 when rendering via a scene strip override.

NOTE: this isn't a good solution, as I expect that users want to use
the opacity of the Grease pencil object. Perhaps the GPencil team has a
better solution for it.
2021-06-16 13:46:39 +02:00
b4a81d8053 Fix outdated face tessellation use when editing edit-mesh coordinates 2021-06-16 09:31:01 +02:00
b61f4fd8da Fix image space missing mask display panel 2021-06-16 09:30:39 +02:00
bc64801857 Fix modifier deform by armature check ignoring virtual modifiers
Regression in f00cb93dbe (fix for T63125)
2021-06-16 09:30:05 +02:00
ba7f110753 Docs: remove deprecated parameter from bmesh docs
The parameter itself was removed but the documentation wasn't updated.

Resolves T89013
2021-06-16 09:29:40 +02:00
e24765d7d3 LineArt: Camera marker update fix.
The original fix was probably flushed by some newer
line art commits. Fixed.

See https://developer.blender.org/T88464
2021-06-16 09:28:46 +02:00
5730112174 LineArt: Fix edge clipping index error.
A small bug was causing the edge count to be incorrect in the
final culled list; it was just offset by exactly 1 entry.

Reviewed By: Sebastian Parborg (zeddb)

Differential Revision: https://developer.blender.org/D11513
2021-06-09 08:57:59 +02:00
Philipp Oeser
d354e6a846 Texture Paint: changing paint slots and viewport could go out of sync
When changing to another texture paint slot, the texture displayed in
the viewport should change accordingly (as well as the image displayed
in the Image Editor).

The procedure to find the texture to display in the viewport
(BKE_texpaint_slot_material_find_node) could fail
though because it assumed iterating nodes would always happen in the
same order (it was index based). This is not the case though, nodes can
get sorted differently based on selection (see ED_node_sort).

Now check the actual image being referenced in the paint slot for
comparison.

ref T88788 (probably enough to call this a fix, the other issue(s)
mentioned in the report are more likely a feature request)

Reviewed By: mano-wii

Maniphest Tasks: T88788

Differential Revision: https://developer.blender.org/D11496
2021-06-09 08:49:32 +02:00
706c7d775d Fix T77651: Black screen on Blender startup on ChromeOS
Apparently `textureSize` doesn't work with
`sampler1DArray` on this OS.

Thanks to @dave1853 for finding the source of the
problem.
2021-06-09 08:49:01 +02:00
db909281f2 Fix T88813: Scalable allocator not used on win10
Due to the way we ship the CRT on windows TBB's
malloc proxy was unable to attach itself to
the memory management functions on windows 10.

This change moves ucrtbase.dll out of the blender.crt
folder and back into the main blender folder to side
step some undesirable behaviour on win10 making TBB
once more able to attach itself.

Having this work again, should give a speed
boost in memory allocation heavy workloads
such as mantaflow.

For details on how this only failed on Win10
see T88813
2021-06-09 08:48:16 +02:00
e1d2485fd3 EEVEE: AOVs not same as cycles.
EEVEE uses hashing to sync aov names and types with the gpu.
For the type a hashed value was overridden making `decalA`
and `decalB` choose the same hash. This patch fixes this
by removing the most significant bit.
2021-06-09 08:46:20 +02:00
0b577e977b Fix T88567: Cryptomatte only works for the first View Layer.
The view layer was always set to 0. This patch increments it.
2021-06-09 08:45:45 +02:00
5a612097ab Fix T88666: Cryptomatte: EXR sequence does not update when scrubbing the timeline.
Cause is that initializing the cryptomatte session would reset the
current frame of an image sequence. The solution is to always use the
scene current frame so it resets to the correct frame.

This was a todo that wasn't solved after it landed in master.
Needs to be backported to 2.93.
2021-06-09 08:43:40 +02:00
dbb002387d Fix T88625: Multiobject UV hiding/unhiding does not work with UV_SYNC_SELECTION
Oversight in {rB470f17f21c06}.

Hiding was only done for the first mesh, then the operator finished (in
case of UV_SYNC_SELECTION).
Now just continue to the next.

Maniphest Tasks: T88625

Differential Revision: https://developer.blender.org/D11413
2021-06-09 08:43:07 +02:00
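A rough sketch of the control-flow change, with hypothetical callbacks standing in for the real hiding code (which lives in C):

    def hide_selected_uvs(meshes, use_sync, hide_by_mesh, hide_by_uv):
        # Process every mesh in multi-object edit mode; previously the
        # sync-selection branch returned after the first mesh.
        for mesh in meshes:
            if use_sync:
                hide_by_mesh(mesh)
                continue  # was: return, which skipped the remaining meshes
            hide_by_uv(mesh)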
c1d67abcd8 Fix T88531: Mantaflow problem with geometry nodes
Objects modified by geometry nodes modifiers were not caught as being
"dynamic".

Now add this modifier type to the list of modifiers making them "dynamic"
in the eyes of mantaflow.

(noticed by @sebbas in chat)

Maniphest Tasks: T88531

Differential Revision: https://developer.blender.org/D11389
2021-06-09 08:42:16 +02:00
b38eb31050 Fix T88566: Mantaflow inflow with shapekeys is not working anymore
(regression)

Code was actually checking for shapekeys, but these were not detected
properly (some effects like shape keys are added as virtual modifiers
before the user created modifiers)

Now go over virtual modifiers as well.

Maniphest Tasks: T88566

Differential Revision: https://developer.blender.org/D11388
2021-06-09 08:41:42 +02:00
8eef2f83e9 Fix T88111: Skin modifier asserts with invalid face normals
The skin modifier was moving vertices without updating normals for the
connected faces; this happened when smoothing and welding vertices.

Reviewed By: mont29

Ref D11397
2021-06-09 08:41:00 +02:00
cc5392b4d9 Fix buffer overrun in paint_line_strokes_spacing
Error in 87cafe92ce
2021-06-09 08:39:33 +02:00
9d134104e4 Fix T88899: __file__ not set for text.as_module() 2021-06-09 08:39:00 +02:00
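Usage looks roughly like this (the text datablock name is hypothetical):

    import bpy

    text = bpy.data.texts["my_script.py"]  # hypothetical text datablock
    module = text.as_module()
    print(module.__file__)  # set after the fix instead of being missing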
7edbe463d2 Versionbump: 2.93.1 candidate. 2021-06-02 17:10:44 +02:00
ddb2d71438 Fix typo in Linux appdata 2021-06-02 15:05:09 +02:00
84da05a8b8 Docs: 2.93 release description for Linux appdata 2021-06-02 13:21:47 +02:00
383bc8d9bc Blender version bump 2.93 release. 2021-06-01 16:46:30 +02:00
4990 changed files with 184020 additions and 368406 deletions

View File

@@ -180,7 +180,6 @@ ForEachMacros:
- CTX_DATA_BEGIN_WITH_ID
- DEG_OBJECT_ITER_BEGIN
- DEG_OBJECT_ITER_FOR_RENDER_ENGINE_BEGIN
- DRW_ENABLED_ENGINE_ITER
- DRIVER_TARGETS_LOOPER_BEGIN
- DRIVER_TARGETS_USED_LOOPER_BEGIN
- FOREACH_BASE_IN_EDIT_MODE_BEGIN
@@ -256,7 +255,6 @@ ForEachMacros:
- SCULPT_VERTEX_DUPLICATES_AND_NEIGHBORS_ITER_BEGIN
- SCULPT_VERTEX_NEIGHBORS_ITER_BEGIN
- SEQ_ALL_BEGIN
- SEQ_ITERATOR_FOREACH
- SURFACE_QUAD_ITER_BEGIN
- foreach
- ED_screen_areas_iter
@@ -266,8 +264,4 @@ ForEachMacros:
- VECTOR_SET_SLOT_PROBING_BEGIN
StatementMacros:
- PyObject_HEAD
- PyObject_VAR_HEAD
MacroBlockBegin: "^BSDF_CLOSURE_CLASS_BEGIN$"
MacroBlockEnd: "^BSDF_CLOSURE_CLASS_END$"

View File

@@ -1,5 +0,0 @@
This repository is only used as a mirror of git.blender.org. Blender development happens on
https://developer.blender.org.
To get started with contributing code, please see:
https://wiki.blender.org/wiki/Process/Contributing_Code

View File

@@ -30,7 +30,7 @@ if(${CMAKE_SOURCE_DIR} STREQUAL ${CMAKE_BINARY_DIR})
"CMake generation for blender is not allowed within the source directory!"
"\n Remove \"${CMAKE_SOURCE_DIR}/CMakeCache.txt\" and try again from another folder, e.g.:"
"\n "
"\n rm -rf CMakeCache.txt CMakeFiles"
"\n rm CMakeCache.txt"
"\n cd .."
"\n mkdir cmake-make"
"\n cd cmake-make"
@@ -110,10 +110,6 @@ if(POLICY CMP0074)
cmake_policy(SET CMP0074 NEW)
endif()
# Install CODE|SCRIPT allow the use of generator expressions.
if(POLICY CMP0087)
cmake_policy(SET CMP0087 NEW)
endif()
#-----------------------------------------------------------------------------
# Load some macros.
include(build_files/cmake/macros.cmake)
@@ -156,15 +152,6 @@ get_blender_version()
option(WITH_BLENDER "Build blender (disable to build only the blender player)" ON)
mark_as_advanced(WITH_BLENDER)
if(APPLE)
# Currently this causes a build error linking, disable.
set(WITH_BLENDER_THUMBNAILER OFF)
elseif(WIN32)
option(WITH_BLENDER_THUMBNAILER "Build \"BlendThumb.dll\" helper for Windows explorer integration" ON)
else()
option(WITH_BLENDER_THUMBNAILER "Build \"blender-thumbnailer\" thumbnail extraction utility" ON)
endif()
option(WITH_INTERNATIONAL "Enable I18N (International fonts and text)" ON)
option(WITH_PYTHON "Enable Embedded Python API (only disable for development)" ON)
@@ -362,7 +349,7 @@ mark_as_advanced(WITH_SYSTEM_GLOG)
option(WITH_FREESTYLE "Enable Freestyle (advanced edges rendering)" ON)
# Misc
if(WIN32 OR APPLE)
if(WIN32)
option(WITH_INPUT_IME "Enable Input Method Editor (IME) for complex Asian character input" ON)
endif()
option(WITH_INPUT_NDOF "Enable NDOF input devices (SpaceNavigator and friends)" ON)
@@ -397,63 +384,46 @@ if(WITH_PYTHON_INSTALL)
set(PYTHON_REQUESTS_PATH "" CACHE PATH "Path to python site-packages or dist-packages containing 'requests' module")
mark_as_advanced(PYTHON_REQUESTS_PATH)
endif()
option(WITH_PYTHON_INSTALL_ZSTANDARD "Copy zstandard into the blender install folder" ON)
set(PYTHON_ZSTANDARD_PATH "" CACHE PATH "Path to python site-packages or dist-packages containing 'zstandard' module")
mark_as_advanced(PYTHON_ZSTANDARD_PATH)
endif()
option(WITH_CPU_SIMD "Enable SIMD instruction if they're detected on the host machine" ON)
mark_as_advanced(WITH_CPU_SIMD)
# Cycles
option(WITH_CYCLES "Enable Cycles Render Engine" ON)
option(WITH_CYCLES_OSL "Build Cycles with OpenShadingLanguage support" ON)
option(WITH_CYCLES_EMBREE "Build Cycles with Embree support" ON)
option(WITH_CYCLES_LOGGING "Build Cycles with logging support" ON)
option(WITH_CYCLES_STANDALONE "Build Cycles standalone application" OFF)
option(WITH_CYCLES_STANDALONE_GUI "Build Cycles standalone with GUI" OFF)
option(WITH_CYCLES_DEBUG_NAN "Build Cycles with additional asserts for detecting NaNs and invalid values" OFF)
option(WITH_CYCLES_NATIVE_ONLY "Build Cycles with native kernel only (which fits current CPU, use for development only)" OFF)
option(WITH_CYCLES_KERNEL_ASAN "Build Cycles kernels with address sanitizer when WITH_COMPILER_ASAN is on, even if it's very slow" OFF)
set(CYCLES_TEST_DEVICES CPU CACHE STRING "Run regression tests on the specified device types (CPU CUDA OPTIX HIP)" )
option(WITH_CYCLES "Enable Cycles Render Engine" ON)
option(WITH_CYCLES_STANDALONE "Build Cycles standalone application" OFF)
option(WITH_CYCLES_STANDALONE_GUI "Build Cycles standalone with GUI" OFF)
option(WITH_CYCLES_OSL "Build Cycles with OSL support" ON)
option(WITH_CYCLES_EMBREE "Build Cycles with Embree support" ON)
option(WITH_CYCLES_CUDA_BINARIES "Build Cycles CUDA binaries" OFF)
option(WITH_CYCLES_CUBIN_COMPILER "Build cubins with nvrtc based compiler instead of nvcc" OFF)
option(WITH_CYCLES_CUDA_BUILD_SERIAL "Build cubins one after another (useful on machines with limited RAM)" OFF)
mark_as_advanced(WITH_CYCLES_CUDA_BUILD_SERIAL)
set(CYCLES_TEST_DEVICES CPU CACHE STRING "Run regression tests on the specified device types (CPU CUDA OPTIX OPENCL)" )
set(CYCLES_CUDA_BINARIES_ARCH sm_30 sm_35 sm_37 sm_50 sm_52 sm_60 sm_61 sm_70 sm_75 sm_86 compute_75 CACHE STRING "CUDA architectures to build binaries for")
mark_as_advanced(CYCLES_CUDA_BINARIES_ARCH)
unset(PLATFORM_DEFAULT)
option(WITH_CYCLES_LOGGING "Build Cycles with logging support" ON)
option(WITH_CYCLES_DEBUG "Build Cycles with extra debug capabilities" OFF)
option(WITH_CYCLES_NATIVE_ONLY "Build Cycles with native kernel only (which fits current CPU, use for development only)" OFF)
option(WITH_CYCLES_KERNEL_ASAN "Build Cycles kernels with address sanitizer when WITH_COMPILER_ASAN is on, even if it's very slow" OFF)
mark_as_advanced(WITH_CYCLES_KERNEL_ASAN)
mark_as_advanced(WITH_CYCLES_CUBIN_COMPILER)
mark_as_advanced(WITH_CYCLES_LOGGING)
mark_as_advanced(WITH_CYCLES_DEBUG_NAN)
mark_as_advanced(WITH_CYCLES_DEBUG)
mark_as_advanced(WITH_CYCLES_NATIVE_ONLY)
# NVIDIA CUDA & OptiX
option(WITH_CYCLES_DEVICE_CUDA "Enable Cycles NVIDIA CUDA compute support" ON)
option(WITH_CYCLES_DEVICE_OPTIX "Enable Cycles NVIDIA OptiX support" ON)
option(WITH_CYCLES_DEVICE_CUDA "Enable Cycles CUDA compute support" ON)
option(WITH_CYCLES_DEVICE_OPTIX "Enable Cycles OptiX support" OFF)
option(WITH_CYCLES_DEVICE_OPENCL "Enable Cycles OpenCL compute support" ON)
option(WITH_CYCLES_NETWORK "Enable Cycles compute over network support (EXPERIMENTAL and unfinished)" OFF)
mark_as_advanced(WITH_CYCLES_DEVICE_CUDA)
mark_as_advanced(WITH_CYCLES_DEVICE_OPENCL)
mark_as_advanced(WITH_CYCLES_NETWORK)
option(WITH_CYCLES_CUDA_BINARIES "Build Cycles NVIDIA CUDA binaries" OFF)
set(CYCLES_CUDA_BINARIES_ARCH sm_30 sm_35 sm_37 sm_50 sm_52 sm_60 sm_61 sm_70 sm_75 sm_86 compute_75 CACHE STRING "CUDA architectures to build binaries for")
option(WITH_CYCLES_CUBIN_COMPILER "Build cubins with nvrtc based compiler instead of nvcc" OFF)
option(WITH_CYCLES_CUDA_BUILD_SERIAL "Build cubins one after another (useful on machines with limited RAM)" OFF)
option(WITH_CUDA_DYNLOAD "Dynamically load CUDA libraries at runtime (for developers, makes cuda-gdb work)" ON)
mark_as_advanced(CYCLES_CUDA_BINARIES_ARCH)
mark_as_advanced(WITH_CYCLES_CUBIN_COMPILER)
mark_as_advanced(WITH_CYCLES_CUDA_BUILD_SERIAL)
option(WITH_CUDA_DYNLOAD "Dynamically load CUDA libraries at runtime" ON)
mark_as_advanced(WITH_CUDA_DYNLOAD)
# AMD HIP
if(WIN32)
option(WITH_CYCLES_DEVICE_HIP "Enable Cycles AMD HIP support" ON)
else()
option(WITH_CYCLES_DEVICE_HIP "Enable Cycles AMD HIP support" OFF)
endif()
option(WITH_CYCLES_HIP_BINARIES "Build Cycles AMD HIP binaries" OFF)
set(CYCLES_HIP_BINARIES_ARCH gfx1010 gfx1011 gfx1012 gfx1030 gfx1031 gfx1032 gfx1034 CACHE STRING "AMD HIP architectures to build binaries for")
mark_as_advanced(WITH_CYCLES_DEVICE_HIP)
mark_as_advanced(CYCLES_HIP_BINARIES_ARCH)
# Draw Manager
option(WITH_DRAW_DEBUG "Add extra debug capabilities to Draw Manager" OFF)
mark_as_advanced(WITH_DRAW_DEBUG)
# LLVM
option(WITH_LLVM "Use LLVM" OFF)
if(APPLE)
@@ -634,6 +604,12 @@ if(WIN32)
option(WITH_WINDOWS_FIND_MODULES "Use find_package to locate libraries" OFF)
mark_as_advanced(WITH_WINDOWS_FIND_MODULES)
option(WINDOWS_USE_VISUAL_STUDIO_PROJECT_FOLDERS "Organize the visual studio projects according to source folder structure." ON)
mark_as_advanced(WINDOWS_USE_VISUAL_STUDIO_PROJECT_FOLDERS)
option(WINDOWS_USE_VISUAL_STUDIO_SOURCE_FOLDERS "Organize the source files in filters matching the source folders." ON)
mark_as_advanced(WINDOWS_USE_VISUAL_STUDIO_SOURCE_FOLDERS)
option(WINDOWS_PYTHON_DEBUG "Include the files needed for debugging python scripts with visual studio 2017+." OFF)
mark_as_advanced(WINDOWS_PYTHON_DEBUG)
@@ -646,23 +622,11 @@ if(WIN32)
option(WITH_WINDOWS_PDB "Generate a pdb file for client side stacktraces" ON)
mark_as_advanced(WITH_WINDOWS_PDB)
option(WITH_WINDOWS_STRIPPED_PDB "Use a stripped PDB file" ON)
option(WITH_WINDOWS_STRIPPED_PDB "Use a stripped PDB file" On)
mark_as_advanced(WITH_WINDOWS_STRIPPED_PDB)
endif()
if(WIN32 OR XCODE)
option(IDE_GROUP_SOURCES_IN_FOLDERS "Organize the source files in filters matching the source folders." ON)
mark_as_advanced(IDE_GROUP_SOURCES_IN_FOLDERS)
option(IDE_GROUP_PROJECTS_IN_FOLDERS "Organize the projects according to source folder structure." ON)
mark_as_advanced(IDE_GROUP_PROJECTS_IN_FOLDERS)
if (IDE_GROUP_PROJECTS_IN_FOLDERS)
set_property(GLOBAL PROPERTY USE_FOLDERS ON)
endif()
endif()
if(UNIX)
# See WITH_WINDOWS_SCCACHE for Windows.
option(WITH_COMPILER_CCACHE "Use ccache to improve rebuild times (Works with Ninja, Makefiles and Xcode)" OFF)
@@ -847,11 +811,6 @@ if(NOT WITH_CUDA_DYNLOAD)
endif()
endif()
if(WITH_CYCLES_DEVICE_HIP)
# Currently HIP must be dynamically loaded, this may change in future toolkits
set(WITH_HIP_DYNLOAD ON)
endif()
#-----------------------------------------------------------------------------
# Check if submodules are cloned
@@ -877,7 +836,7 @@ if(WITH_PYTHON)
# because UNIX will search for the old Python paths which may not exist.
# giving errors about missing paths before this case is met.
if(DEFINED PYTHON_VERSION AND "${PYTHON_VERSION}" VERSION_LESS "3.9")
message(FATAL_ERROR "At least Python 3.9 is required to build, but found Python ${PYTHON_VERSION}")
message(FATAL_ERROR "At least Python 3.9 is required to build")
endif()
file(GLOB RESULT "${CMAKE_SOURCE_DIR}/release/scripts/addons")
@@ -1625,9 +1584,6 @@ elseif(CMAKE_C_COMPILER_ID MATCHES "Clang")
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_UNUSED_PARAMETER -Wunused-parameter)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_WARN_ALL -Wall)
# Using C++20 features while having C++17 as the project language isn't allowed by MSVC.
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_CXX20_DESIGNATOR -Wc++20-designator)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_WARN_NO_AUTOLOGICAL_COMPARE -Wno-tautological-compare)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_WARN_NO_UNKNOWN_PRAGMAS -Wno-unknown-pragmas)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_WARN_NO_CHAR_SUBSCRIPTS -Wno-char-subscripts)
@@ -1747,26 +1703,24 @@ if(WITH_PYTHON)
elseif(WITH_PYTHON_INSTALL_REQUESTS)
find_python_package(requests "")
endif()
if(WIN32 OR APPLE)
# pass, we have this in lib/python/site-packages
elseif(WITH_PYTHON_INSTALL_ZSTANDARD)
find_python_package(zstandard "")
endif()
endif()
# Select C++17 as the standard for C++ projects.
set(CMAKE_CXX_STANDARD 17)
# If C++17 is not available, downgrading to an earlier standard is NOT OK.
set(CMAKE_CXX_STANDARD_REQUIRED ON)
# Do not enable compiler specific language extensions.
set(CMAKE_CXX_EXTENSIONS OFF)
# Make MSVC properly report the value of the __cplusplus preprocessor macro
# Available MSVC 15.7 (1914) and up, without this it reports 199711L regardless
# of the C++ standard chosen above.
if(MSVC AND MSVC_VERSION GREATER 1913)
string(APPEND CMAKE_CXX_FLAGS " /Zc:__cplusplus")
if(MSVC)
string(APPEND CMAKE_CXX_FLAGS " /std:c++17")
# Make MSVC properly report the value of the __cplusplus preprocessor macro
# Available MSVC 15.7 (1914) and up, without this it reports 199711L regardless
# of the C++ standard chosen above
if(MSVC_VERSION GREATER 1913)
string(APPEND CMAKE_CXX_FLAGS " /Zc:__cplusplus")
endif()
elseif(
CMAKE_COMPILER_IS_GNUCC OR
CMAKE_C_COMPILER_ID MATCHES "Clang" OR
CMAKE_C_COMPILER_ID MATCHES "Intel"
)
string(APPEND CMAKE_CXX_FLAGS " -std=c++17")
else()
message(FATAL_ERROR "Unknown compiler ${CMAKE_C_COMPILER_ID}, can't enable C++17 build")
endif()
# Visual Studio has all standards it supports available by default
@@ -1887,9 +1841,6 @@ elseif(WITH_CYCLES_STANDALONE)
if(WITH_CUDA_DYNLOAD)
add_subdirectory(extern/cuew)
endif()
if(WITH_HIP_DYNLOAD)
add_subdirectory(extern/hipew)
endif()
if(NOT WITH_SYSTEM_GLEW)
add_subdirectory(extern/glew)
endif()
@@ -1964,7 +1915,6 @@ if(FIRST_RUN)
info_cfg_option(WITH_IK_ITASC)
info_cfg_option(WITH_IK_SOLVER)
info_cfg_option(WITH_INPUT_NDOF)
info_cfg_option(WITH_INPUT_IME)
info_cfg_option(WITH_INTERNATIONAL)
info_cfg_option(WITH_OPENCOLLADA)
info_cfg_option(WITH_OPENCOLORIO)
@@ -2024,7 +1974,6 @@ if(FIRST_RUN)
endif()
info_cfg_option(WITH_PYTHON_INSTALL)
info_cfg_option(WITH_PYTHON_INSTALL_NUMPY)
info_cfg_option(WITH_PYTHON_INSTALL_ZSTANDARD)
info_cfg_option(WITH_PYTHON_MODULE)
info_cfg_option(WITH_PYTHON_SAFETY)


@@ -27,7 +27,7 @@
define HELP_TEXT
Blender Convenience Targets
Provided for building Blender (multiple targets can be used at once).
Provided for building Blender, (multiple at once can be used).
* debug: Build a debug binary.
* full: Enable all supported dependencies & options.
@@ -40,8 +40,6 @@ Blender Convenience Targets
* ninja: Use ninja build tool for faster builds.
* ccache: Use ccache for faster rebuilds.
Note: when passing in multiple targets their order is not important.
So for a fast build you can e.g. run 'make lite ccache ninja'.
Note: passing the argument 'BUILD_DIR=path' when calling make will override the default build dir.
Note: passing the argument 'BUILD_CMAKE_ARGS=args' lets you add cmake arguments.
@@ -65,7 +63,7 @@ Package Targets
* package_debian: Build a debian package.
* package_pacman: Build an arch linux pacman package.
* package_archive: Build an archive package.
Testing Targets
Not associated with building Blender.
@@ -169,7 +167,7 @@ endef
# This makefile is not meant for Windows
ifeq ($(OS),Windows_NT)
$(error On Windows, use "cmd //c make.bat" instead of "make")
endif
# System Vars
@@ -381,7 +379,7 @@ deps: .FORCE
@cmake -H"$(DEPS_SOURCE_DIR)" \
-B"$(DEPS_BUILD_DIR)" \
-DHARVEST_TARGET=$(DEPS_INSTALL_DIR)
@echo
@echo Building dependencies ...
@@ -458,8 +456,7 @@ project_eclipse: .FORCE
check_cppcheck: .FORCE
$(CMAKE_CONFIG)
cd "$(BUILD_DIR)" ; \
$(PYTHON) \
"$(BLENDER_DIR)/build_files/cmake/cmake_static_check_cppcheck.py" 2> \
$(PYTHON) "$(BLENDER_DIR)/build_files/cmake/cmake_static_check_cppcheck.py" 2> \
"$(BLENDER_DIR)/check_cppcheck.txt"
@echo "written: check_cppcheck.txt"
@@ -521,9 +518,8 @@ source_archive: .FORCE
python3 ./build_files/utils/make_source_archive.py
source_archive_complete: .FORCE
cmake \
-S "$(BLENDER_DIR)/build_files/build_environment" -B"$(BUILD_DIR)/source_archive" \
-DCMAKE_BUILD_TYPE_INIT:STRING=$(BUILD_TYPE) -DPACKAGE_USE_UPSTREAM_SOURCES=OFF
cmake -S "$(BLENDER_DIR)/build_files/build_environment" -B"$(BUILD_DIR)/source_archive" \
-DCMAKE_BUILD_TYPE_INIT:STRING=$(BUILD_TYPE) -DPACKAGE_USE_UPSTREAM_SOURCES=OFF
# This assumes CMake is still using a default `PACKAGE_DIR` variable:
python3 ./build_files/utils/make_source_archive.py --include-packages "$(BUILD_DIR)/source_archive/packages"
@@ -531,11 +527,9 @@ source_archive_complete: .FORCE
INKSCAPE_BIN?="inkscape"
icons: .FORCE
BLENDER_BIN=$(BLENDER_BIN) INKSCAPE_BIN=$(INKSCAPE_BIN) \
"$(BLENDER_DIR)/release/datafiles/blender_icons_update.py"
INKSCAPE_BIN=$(INKSCAPE_BIN) \
"$(BLENDER_DIR)/release/datafiles/prvicons_update.py"
INKSCAPE_BIN=$(INKSCAPE_BIN) \
"$(BLENDER_DIR)/release/datafiles/alert_icons_update.py"
"$(BLENDER_DIR)/release/datafiles/blender_icons_update.py"
BLENDER_BIN=$(BLENDER_BIN) INKSCAPE_BIN=$(INKSCAPE_BIN) \
"$(BLENDER_DIR)/release/datafiles/prvicons_update.py"
icons_geom: .FORCE
BLENDER_BIN=$(BLENDER_BIN) \
@@ -549,7 +543,7 @@ update_code: .FORCE
format: .FORCE
PATH="../lib/${OS_NCASE}_${CPU}/llvm/bin/:../lib/${OS_NCASE}_centos7_${CPU}/llvm/bin/:../lib/${OS_NCASE}/llvm/bin/:$(PATH)" \
$(PYTHON) source/tools/utils_maintenance/clang_format_paths.py $(PATHS)
# -----------------------------------------------------------------------------
@@ -559,9 +553,8 @@ format: .FORCE
# Simple version of ./doc/python_api/sphinx_doc_gen.sh with no PDF generation.
doc_py: .FORCE
ASAN_OPTIONS=halt_on_error=0:${ASAN_OPTIONS} \
$(BLENDER_BIN) \
--background -noaudio --factory-startup \
--python doc/python_api/sphinx_doc_gen.py
$(BLENDER_BIN) --background -noaudio --factory-startup \
--python doc/python_api/sphinx_doc_gen.py
sphinx-build -b html -j $(NPROCS) doc/python_api/sphinx-in doc/python_api/sphinx-out
@echo "docs written into: '$(BLENDER_DIR)/doc/python_api/sphinx-out/index.html'"
@@ -570,9 +563,8 @@ doc_doxy: .FORCE
@echo "docs written into: '$(BLENDER_DIR)/doc/doxygen/html/index.html'"
doc_dna: .FORCE
$(BLENDER_BIN) \
--background -noaudio --factory-startup \
--python doc/blender_file_format/BlendFileDnaExporter_25.py
$(BLENDER_BIN) --background -noaudio --factory-startup \
--python doc/blender_file_format/BlendFileDnaExporter_25.py
@echo "docs written into: '$(BLENDER_DIR)/doc/blender_file_format/dna.html'"
doc_man: .FORCE


@@ -56,7 +56,6 @@ else()
endif()
include(cmake/zlib.cmake)
include(cmake/zstd.cmake)
include(cmake/openal.cmake)
include(cmake/png.cmake)
include(cmake/jpeg.cmake)
@@ -82,11 +81,7 @@ if(UNIX)
endif()
include(cmake/openimageio.cmake)
include(cmake/tiff.cmake)
if(WIN32)
include(cmake/flexbison.cmake)
elseif(UNIX AND NOT APPLE)
include(cmake/flex.cmake)
endif()
include(cmake/flexbison.cmake)
include(cmake/osl.cmake)
include(cmake/tbb.cmake)
include(cmake/openvdb.cmake)
@@ -109,6 +104,8 @@ include(cmake/pugixml.cmake)
include(cmake/ispc.cmake)
include(cmake/openimagedenoise.cmake)
include(cmake/embree.cmake)
include(cmake/xml2.cmake)
if(NOT APPLE)
include(cmake/xr_openxr.cmake)
endif()
@@ -118,7 +115,7 @@ include(cmake/expat.cmake)
include(cmake/yamlcpp.cmake)
include(cmake/opencolorio.cmake)
if(BLENDER_PLATFORM_ARM)
if(APPLE AND ("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64"))
include(cmake/sse2neon.cmake)
endif()
@@ -149,7 +146,6 @@ if(NOT WIN32 OR ENABLE_MINGW64)
endif()
if(UNIX)
include(cmake/flac.cmake)
include(cmake/xml2.cmake)
if(NOT APPLE)
include(cmake/spnav.cmake)
include(cmake/jemalloc.cmake)
@@ -169,7 +165,7 @@ endif()
if(UNIX AND NOT APPLE)
include(cmake/libglu.cmake)
include(cmake/mesa.cmake)
include(cmake/wayland_protocols.cmake)
endif()
include(cmake/harvest.cmake)
include(cmake/cve_check.cmake)


@@ -29,7 +29,7 @@ set(BLOSC_EXTRA_ARGS
-DCMAKE_POSITION_INDEPENDENT_CODE=ON
)
# Prevent blosc from including its own local copy of zlib in the object file
# Prevent blosc from including it's own local copy of zlib in the object file
# and cause linker errors with everybody else.
set(BLOSC_EXTRA_ARGS ${BLOSC_EXTRA_ARGS}
-DPREFER_EXTERNAL_ZLIB=ON


@@ -18,12 +18,6 @@
set(BOOST_ADDRESS_MODEL 64)
if(BLENDER_PLATFORM_ARM)
set(BOOST_ARCHITECTURE arm)
else()
set(BOOST_ARCHITECTURE x86)
endif()
if(WIN32)
set(BOOST_TOOLSET toolset=msvc-14.1)
set(BOOST_COMPILER_STRING -vc141)
@@ -35,6 +29,7 @@ if(WIN32)
if(BUILD_MODE STREQUAL Release)
set(BOOST_HARVEST_CMD ${BOOST_HARVEST_CMD} && ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/boost/include/boost-${BOOST_VERSION_NODOTS_SHORT}/ ${HARVEST_TARGET}/boost/include/)
endif()
elseif(APPLE)
set(BOOST_CONFIGURE_COMMAND ./bootstrap.sh)
set(BOOST_BUILD_COMMAND ./b2)
@@ -98,7 +93,7 @@ ExternalProject_Add(external_boost
UPDATE_COMMAND ""
PATCH_COMMAND ${BOOST_PATCH_COMMAND}
CONFIGURE_COMMAND ${BOOST_CONFIGURE_COMMAND}
BUILD_COMMAND ${BOOST_BUILD_COMMAND} ${BOOST_BUILD_OPTIONS} -j${MAKE_THREADS} architecture=${BOOST_ARCHITECTURE} address-model=${BOOST_ADDRESS_MODEL} link=static threading=multi ${BOOST_OPTIONS} --prefix=${LIBDIR}/boost install
BUILD_COMMAND ${BOOST_BUILD_COMMAND} ${BOOST_BUILD_OPTIONS} -j${MAKE_THREADS} architecture=x86 address-model=${BOOST_ADDRESS_MODEL} link=static threading=multi ${BOOST_OPTIONS} --prefix=${LIBDIR}/boost install
BUILD_IN_SOURCE 1
INSTALL_COMMAND "${BOOST_HARVEST_CMD}"
)


@@ -0,0 +1,75 @@
# SPDX-License-Identifier: GPL-2.0-or-later
# CVE Check requirements
#
# - A working installation of Intel's cve-bin-tool [1] has to be available in
# your path.
#
# - Not strictly required, but highly recommended, is obtaining an NVD key from
# NIST, since it significantly speeds up downloading/updating the required
# databases. One can request a key on the following website:
# https://nvd.nist.gov/developers/request-an-api-key
# Bill of Materials construction
#
# This constructs a CSV that cve-bin-tool [1] can read and process. Sadly,
# cve-bin-tool at this point does not take a list of CPEs and output a check
# based on that list, so we need to pick apart each CPE, retrieve the vendor,
# product and version tokens, and generate a CSV.
#
# [1] https://github.com/intel/cve-bin-tool
# Because not all deps are downloaded (i.e. Python packages) but can still have an
# xxx_CPE declared, loop over all variables and look for variables ending in CPE.
set(SBOMCONTENTS)
get_cmake_property(_variableNames VARIABLES)
foreach (_variableName ${_variableNames})
if(_variableName MATCHES "CPE$")
string(REPLACE ":" ";" CPE_LIST ${${_variableName}})
string(REPLACE "_CPE" "_ID" CPE_DEPNAME ${_variableName})
list(GET CPE_LIST 3 CPE_VENDOR)
list(GET CPE_LIST 4 CPE_NAME)
list(GET CPE_LIST 5 CPE_VERSION)
set(${CPE_DEPNAME} "${CPE_VENDOR},${CPE_NAME},${CPE_VERSION}")
set(SBOMCONTENTS "${SBOMCONTENTS}${CPE_VENDOR},${CPE_NAME},${CPE_VERSION},,,\n")
endif()
endforeach()
configure_file(${CMAKE_SOURCE_DIR}/cmake/cve_check.csv.in ${CMAKE_CURRENT_BINARY_DIR}/cve_check.csv @ONLY)
# Custom Targets
#
# This defines two new custom targets one can run in the build folder:
# `cve_check`, which will output the report to the console, and `cve_check_html`,
# which will write out blender_dependencies.html in the build folder that one
# can share with other people or use to get more information on the
# reported CVEs.
#
# cve-bin-tool takes data from the NIST NVD database, which rate limits
# unauthenticated requests to 1 request per 6 seconds, making the database
# download take "quite a bit" of time.
#
# When adding -DCVE_CHECK_NVD_KEY=your_api_key_here to your cmake invocation,
# this key will be passed on to cve-bin-tool, speeding up the process.
#
if(DEFINED CVE_CHECK_NVD_KEY)
set(NVD_ARGS --nvd-api-key ${CVE_CHECK_NVD_KEY})
endif()
# This will just report to the console
add_custom_target(cve_check
COMMAND cve-bin-tool
${NVD_ARGS}
-i ${CMAKE_CURRENT_BINARY_DIR}/cve_check.csv
--affected-versions
SOURCES ${CMAKE_CURRENT_BINARY_DIR}/cve_check.csv
)
# This will write out blender_dependencies.html
add_custom_target(cve_check_html
COMMAND cve-bin-tool
${NVD_ARGS}
-i ${CMAKE_CURRENT_BINARY_DIR}/cve_check.csv
-f html
SOURCES ${CMAKE_CURRENT_BINARY_DIR}/cve_check.csv
)
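
For reference, a minimal standalone sketch (runnable with `cmake -P`) of the CPE-splitting indices used above; the ZLIB_CPE value is an illustrative placeholder, not necessarily the exact string declared by the deps builder:

# Illustration only: how a CPE string decomposes with the list indices above.
set(ZLIB_CPE "cpe:2.3:a:zlib:zlib:1.2.13:*:*:*:*:*:*:*")  # placeholder value
string(REPLACE ":" ";" CPE_LIST "${ZLIB_CPE}")
list(GET CPE_LIST 3 CPE_VENDOR)   # -> zlib
list(GET CPE_LIST 4 CPE_NAME)     # -> zlib
list(GET CPE_LIST 5 CPE_VERSION)  # -> 1.2.13
message(STATUS "${CPE_VENDOR},${CPE_NAME},${CPE_VERSION},,,")  # one CSV row

With the targets above in place, `cve_check` and `cve_check_html` are invoked from the dependency build folder (e.g. `make cve_check` with the Makefile generator).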

View File

@@ -0,0 +1,29 @@
vendor,product,version,cve_number,remarks,comment
@OPENJPEG_ID@,CVE-2016-9675,Ignored,issue in convert command line tool not used by blender
@PYTHON_ID@,CVE-2009-2940,Ignored,issue in pygresql not used by blender
@PYTHON_ID@,CVE-2020-29396,Ignored,issue in odoo not used by blender
@PYTHON_ID@,CVE-2021-32052,Ignored,issue in django not used by blender
@PYTHON_ID@,CVE-2009-3720,Ignored,already fixed in libexpat version used
@SSL_ID@,CVE-2009-1390,Ignored,issue in mutt not used by blender
@SSL_ID@,CVE-2009-3765,Ignored,issue in mutt not used by blender
@SSL_ID@,CVE-2009-3766,Ignored,issue in mutt not used by blender
@SSL_ID@,CVE-2009-3767,Ignored,issue in ldap not used by blender
@SSL_ID@,CVE-2019-0190,Ignored,issue in apache not used by blender
@TIFF_ID@,CVE-2022-2056,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-2057,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-2058,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-2519,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-2520,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-2521,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-2953,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-34526,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-3570,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-3597,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-3598,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-3599,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-3626,Ignored,issue in tiff command line tool not used by blender
@TIFF_ID@,CVE-2022-3627,Ignored,issue in tiff command line tool not used by blender
@XML2_ID@,CVE-2016-3709,Ignored,not affecting blender and not considered a security issue upstream
@GMP_ID@,CVE-2021-43618,Mitigated,patched using upstream commit 561a9c25298e
@SQLITE_ID@,CVE-2022-35737,Ignored,only affects SQLITE_ENABLE_STAT4 compile option not used by blender or python
@SBOMCONTENTS@
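
For illustration, the `@..._ID@` tokens above are filled in by the `configure_file(... @ONLY)` call in cve_check.cmake; a self-contained sketch of that substitution with placeholder values (runnable with `cmake -P`):

set(TIFF_ID "libtiff,libtiff,4.4.0")  # vendor,product,version - placeholder
file(WRITE ${CMAKE_BINARY_DIR}/row.csv.in
  "@TIFF_ID@,CVE-2022-2056,Ignored,issue in tiff command line tool not used by blender\n")
configure_file(${CMAKE_BINARY_DIR}/row.csv.in ${CMAKE_BINARY_DIR}/row.csv @ONLY)
# row.csv now reads: libtiff,libtiff,4.4.0,CVE-2022-2056,Ignored,issue in tiff command line tool not used by blender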


@@ -7,6 +7,20 @@ function(download_source dep)
else()
set(TARGET_URI https://svn.blender.org/svnroot/bf-blender/trunk/lib/packages/${TARGET_FILE})
endif()
# Validate all required variables are set and give an explicit error message
# rather than CMake erroring out later on with a more ambiguous error.
if (NOT DEFINED TARGET_FILE)
message(FATAL_ERROR "${dep}_FILE variable not set")
endif()
if (NOT DEFINED TARGET_HASH)
message(FATAL_ERROR "${dep}_HASH variable not set")
endif()
if (NOT DEFINED TARGET_HASH_TYPE)
message(FATAL_ERROR "${dep}_HASH_TYPE variable not set")
endif()
if (NOT DEFINED TARGET_URI)
message(FATAL_ERROR "${dep}_URI variable not set")
endif()
set(TARGET_FILE ${PACKAGE_DIR}/${TARGET_FILE})
message("Checking source : ${dep} (${TARGET_FILE})")
if(NOT EXISTS ${TARGET_FILE})
@@ -18,6 +32,36 @@ function(download_source dep)
SHOW_PROGRESS
)
endif()
if(EXISTS ${TARGET_FILE})
# Sometimes the download fails, but that is not a
# failure condition for "file(DOWNLOAD": it will warn about
# a CRC mismatch and just carry on. We need to explicitly
# catch this and remove the bogus 0 byte file so we can
# retry without having to go find the file and manually
# delete it.
file (SIZE ${TARGET_FILE} TARGET_SIZE)
if(${TARGET_SIZE} EQUAL 0)
file(REMOVE ${TARGET_FILE})
message(FATAL_ERROR "for ${TARGET_FILE} file size 0, download likely failed, deleted...")
endif()
# If we are using sources from the blender repo, also
# validate that the hashes match. This takes a
# little more time, but protects us when we are
# building a release package and one of the packages
# is missing or incorrect.
#
# For regular platform maintenance this is not needed,
# since the actual build of the dep will notify the
# platform maintainer if there is a problem with the
# source package and refuse to build.
if(NOT PACKAGE_USE_UPSTREAM_SOURCES OR FORCE_CHECK_HASH)
file(${TARGET_HASH_TYPE} ${TARGET_FILE} LOCAL_HASH)
if(NOT ${TARGET_HASH} STREQUAL ${LOCAL_HASH})
message(FATAL_ERROR "${TARGET_FILE} ${TARGET_HASH_TYPE} mismatch\nExpected\t: ${TARGET_HASH}\nActual\t: ${LOCAL_HASH}")
endif()
endif()
endif()
endfunction(download_source)
download_source(ZLIB)
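
For context, a hypothetical declaration of the `${dep}_*` variables the validation above checks (these are assumed to live elsewhere in the deps builder, e.g. a versions file; all values below are placeholders):

# Hypothetical dependency entry - names follow the ${dep}_* pattern.
set(FOO_VERSION 1.2.3)
set(FOO_URI https://example.org/foo-${FOO_VERSION}.tar.gz)
set(FOO_FILE foo-${FOO_VERSION}.tar.gz)
set(FOO_HASH 00000000000000000000000000000000)  # placeholder checksum
set(FOO_HASH_TYPE MD5)
# download_source(FOO) then passes the DEFINED checks, deletes a 0 byte
# download so it can be retried, and compares the MD5 against FOO_HASH
# whenever hash verification is enabled.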
@@ -48,7 +92,6 @@ download_source(OSL)
download_source(PYTHON)
download_source(TBB)
download_source(OPENVDB)
download_source(NANOVDB)
download_source(NUMPY)
download_source(LAME)
download_source(OGG)
@@ -70,7 +113,6 @@ endif()
download_source(SPNAV)
download_source(JEMALLOC)
download_source(XML2)
download_source(TINYXML)
download_source(YAMLCPP)
download_source(EXPAT)
download_source(PUGIXML)
@@ -87,10 +129,7 @@ download_source(LIBGLU)
download_source(MESA)
download_source(NASM)
download_source(XR_OPENXR_SDK)
download_source(WL_PROTOCOLS)
download_source(ISPC)
download_source(GMP)
download_source(POTRACE)
download_source(HARU)
download_source(ZSTD)
download_source(FLEX)


@@ -43,17 +43,11 @@ endif()
if(WIN32)
set(EMBREE_BUILD_DIR ${BUILD_MODE}/)
if(BUILD_MODE STREQUAL Debug)
list(APPEND EMBREE_EXTRA_ARGS
-DEMBREE_TBBMALLOC_LIBRARY_NAME=tbbmalloc_debug
-DEMBREE_TBB_LIBRARY_NAME=tbb_debug
)
endif()
else()
set(EMBREE_BUILD_DIR)
endif()
if(BLENDER_PLATFORM_ARM)
if(APPLE AND ("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64"))
ExternalProject_Add(external_embree
GIT_REPOSITORY ${EMBREE_ARM_GIT}
GIT_TAG "blender-arm"


@@ -30,14 +30,7 @@ if(WIN32)
--enable-w32threads
--disable-pthreads
--enable-libopenjpeg
--disable-mediafoundation
)
if("${CMAKE_SIZEOF_VOID_P}" EQUAL "4")
set(FFMPEG_EXTRA_FLAGS
${FFMPEG_EXTRA_FLAGS}
--x86asmexe=yasm
)
endif()
else()
set(FFMPEG_EXTRA_FLAGS
${FFMPEG_EXTRA_FLAGS}


@@ -1,28 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
ExternalProject_Add(external_flex
URL file://${PACKAGE_DIR}/${FLEX_FILE}
URL_HASH ${FLEX_HASH_TYPE}=${FLEX_HASH}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
PREFIX ${BUILD_DIR}/flex
CONFIGURE_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/flex/src/external_flex/ && ${CONFIGURE_COMMAND} --prefix=${LIBDIR}/flex
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/flex/src/external_flex/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/flex/src/external_flex/ && make install
INSTALL_DIR ${LIBDIR}/flex
)


@@ -19,13 +19,14 @@
set(FREETYPE_EXTRA_ARGS
-DCMAKE_RELEASE_POSTFIX:STRING=2ST
-DCMAKE_DEBUG_POSTFIX:STRING=2ST_d
-DWITH_BZip2=OFF
-DWITH_HarfBuzz=OFF
-DFT_WITH_HARFBUZZ=OFF
-DFT_WITH_BZIP2=OFF
-DCMAKE_DISABLE_FIND_PACKAGE_HarfBuzz=TRUE
-DCMAKE_DISABLE_FIND_PACKAGE_BZip2=TRUE
-DCMAKE_DISABLE_FIND_PACKAGE_BrotliDec=TRUE)
-DFT_DISABLE_BZIP2=ON
-DFT_DISABLE_HARFBUZZ=ON
-DFT_DISABLE_PNG=ON
-DFT_REQUIRE_BROTLI=OFF
-DFT_REQUIRE_ZLIB=ON
-DZLIB_LIBRARY=${LIBDIR}/zlib/lib/${ZLIB_LIBRARY}
-DZLIB_INCLUDE_DIR=${LIBDIR}/zlib/include
)
ExternalProject_Add(external_freetype
URL file://${PACKAGE_DIR}/${FREETYPE_FILE}
@@ -36,6 +37,11 @@ ExternalProject_Add(external_freetype
INSTALL_DIR ${LIBDIR}/freetype
)
add_dependencies(
external_freetype
external_zlib
)
if(BUILD_MODE STREQUAL Release AND WIN32)
ExternalProject_Add_Step(external_freetype after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/freetype ${HARVEST_TARGET}/freetype


@@ -25,12 +25,19 @@ else()
set(GMP_OPTIONS --enable-static --disable-shared )
endif()
if(APPLE AND NOT BLENDER_PLATFORM_ARM)
set(GMP_OPTIONS
${GMP_OPTIONS}
--with-pic
)
elseif(UNIX AND NOT APPLE)
if(APPLE)
if("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64")
set(GMP_OPTIONS
${GMP_OPTIONS}
--disable-assembly
)
else()
set(GMP_OPTIONS
${GMP_OPTIONS}
--with-pic
)
endif()
elseif(UNIX)
set(GMP_OPTIONS
${GMP_OPTIONS}
--with-pic
@@ -38,18 +45,12 @@ elseif(UNIX AND NOT APPLE)
)
endif()
if(BLENDER_PLATFORM_ARM)
set(GMP_OPTIONS
${GMP_OPTIONS}
--disable-assembly
)
endif()
ExternalProject_Add(external_gmp
URL file://${PACKAGE_DIR}/${GMP_FILE}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${GMP_HASH_TYPE}=${GMP_HASH}
PREFIX ${BUILD_DIR}/gmp
PATCH_COMMAND ${PATCH_CMD} -p 1 -d ${BUILD_DIR}/gmp/src/external_gmp < ${PATCH_DIR}/gmp.diff
CONFIGURE_COMMAND ${CONFIGURE_ENV_NO_PERL} && cd ${BUILD_DIR}/gmp/src/external_gmp/ && ${CONFIGURE_COMMAND} --prefix=${LIBDIR}/gmp ${GMP_OPTIONS} ${GMP_EXTRA_ARGS}
BUILD_COMMAND ${CONFIGURE_ENV_NO_PERL} && cd ${BUILD_DIR}/gmp/src/external_gmp/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV_NO_PERL} && cd ${BUILD_DIR}/gmp/src/external_gmp/ && make install


@@ -26,179 +26,172 @@ endif()
message("HARVEST_TARGET = ${HARVEST_TARGET}")
if(WIN32)
if(BUILD_MODE STREQUAL Release)
add_custom_target(Harvest_Release_Results
COMMAND # jpeg rename libfile + copy include
${CMAKE_COMMAND} -E copy ${LIBDIR}/jpg/lib/jpeg-static.lib ${HARVEST_TARGET}/jpeg/lib/libjpeg.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/jpg/include/ ${HARVEST_TARGET}/jpeg/include/ &&
# png
${CMAKE_COMMAND} -E copy ${LIBDIR}/png/lib/libpng16_static.lib ${HARVEST_TARGET}/png/lib/libpng.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/png/include/ ${HARVEST_TARGET}/png/include/ &&
# freeglut-> opengl
${CMAKE_COMMAND} -E copy ${LIBDIR}/freeglut/lib/freeglut_static.lib ${HARVEST_TARGET}/opengl/lib/freeglut_static.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/freeglut/include/ ${HARVEST_TARGET}/opengl/include/ &&
# glew-> opengl
${CMAKE_COMMAND} -E copy ${LIBDIR}/glew/lib/libglew32.lib ${HARVEST_TARGET}/opengl/lib/glew.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/glew/include/ ${HARVEST_TARGET}/opengl/include/ &&
# tiff
${CMAKE_COMMAND} -E copy ${LIBDIR}/tiff/lib/tiff.lib ${HARVEST_TARGET}/tiff/lib/libtiff.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/tiff/include/ ${HARVEST_TARGET}/tiff/include/
DEPENDS
)
endif()
else(WIN32)
function(harvest from to)
set(pattern "")
foreach(f ${ARGN})
set(pattern ${f})
endforeach()
if(pattern STREQUAL "")
get_filename_component(dirpath ${to} DIRECTORY)
get_filename_component(filename ${to} NAME)
install(
FILES ${LIBDIR}/${from}
DESTINATION ${HARVEST_TARGET}/${dirpath}
RENAME ${filename})
else()
install(
DIRECTORY ${LIBDIR}/${from}/
DESTINATION ${HARVEST_TARGET}/${to}
USE_SOURCE_PERMISSIONS
FILES_MATCHING PATTERN ${pattern}
PATTERN "pkgconfig" EXCLUDE
PATTERN "cmake" EXCLUDE
PATTERN "__pycache__" EXCLUDE
PATTERN "tests" EXCLUDE)
if(BUILD_MODE STREQUAL Release)
add_custom_target(Harvest_Release_Results
COMMAND # jpeg rename libfile + copy include
${CMAKE_COMMAND} -E copy ${LIBDIR}/jpg/lib/jpeg-static.lib ${HARVEST_TARGET}/jpeg/lib/libjpeg.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/jpg/include/ ${HARVEST_TARGET}/jpeg/include/ &&
# png
${CMAKE_COMMAND} -E copy ${LIBDIR}/png/lib/libpng16_static.lib ${HARVEST_TARGET}/png/lib/libpng.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/png/include/ ${HARVEST_TARGET}/png/include/ &&
# freeglut-> opengl
${CMAKE_COMMAND} -E copy ${LIBDIR}/freeglut/lib/freeglut_static.lib ${HARVEST_TARGET}/opengl/lib/freeglut_static.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/freeglut/include/ ${HARVEST_TARGET}/opengl/include/ &&
# glew-> opengl
${CMAKE_COMMAND} -E copy ${LIBDIR}/glew/lib/libglew32.lib ${HARVEST_TARGET}/opengl/lib/glew.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/glew/include/ ${HARVEST_TARGET}/opengl/include/ &&
# tiff
${CMAKE_COMMAND} -E copy ${LIBDIR}/tiff/lib/tiff.lib ${HARVEST_TARGET}/tiff/lib/libtiff.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/tiff/include/ ${HARVEST_TARGET}/tiff/include/
DEPENDS
)
endif()
endfunction()
harvest(alembic/include alembic/include "*.h")
harvest(alembic/lib/libAlembic.a alembic/lib/libAlembic.a)
harvest(alembic/bin alembic/bin "*")
harvest(boost/include boost/include "*")
harvest(boost/lib boost/lib "*.a")
harvest(ffmpeg/include ffmpeg/include "*.h")
harvest(ffmpeg/lib ffmpeg/lib "*.a")
harvest(fftw3/include fftw3/include "*.h")
harvest(fftw3/lib fftw3/lib "*.a")
harvest(flac/lib sndfile/lib "libFLAC.a")
harvest(freetype/include freetype/include "*.h")
harvest(freetype/lib/libfreetype2ST.a freetype/lib/libfreetype.a)
harvest(glew/include glew/include "*.h")
harvest(glew/lib glew/lib "*.a")
harvest(gmp/include gmp/include "*.h")
harvest(gmp/lib gmp/lib "*.a")
harvest(jemalloc/include jemalloc/include "*.h")
harvest(jemalloc/lib jemalloc/lib "*.a")
harvest(jpg/include jpeg/include "*.h")
harvest(jpg/lib jpeg/lib "libjpeg.a")
harvest(lame/lib ffmpeg/lib "*.a")
harvest(llvm/bin llvm/bin "clang-format")
if(BUILD_CLANG_TOOLS)
harvest(llvm/bin llvm/bin "clang-tidy")
harvest(llvm/share/clang llvm/share "run-clang-tidy.py")
endif()
harvest(llvm/include llvm/include "*")
harvest(llvm/bin llvm/bin "llvm-config")
harvest(llvm/lib llvm/lib "libLLVM*.a")
harvest(llvm/lib llvm/lib "libclang*.a")
harvest(llvm/lib/clang llvm/lib/clang "*.h")
if(APPLE)
harvest(openmp/lib openmp/lib "*")
harvest(openmp/include openmp/include "*.h")
endif()
if(BLENDER_PLATFORM_ARM)
harvest(sse2neon sse2neon "*.h")
endif()
harvest(ogg/lib ffmpeg/lib "*.a")
harvest(openal/include openal/include "*.h")
if(UNIX AND NOT APPLE)
harvest(openal/lib openal/lib "*.a")
harvest(blosc/include blosc/include "*.h")
harvest(blosc/lib blosc/lib "*.a")
harvest(zlib/include zlib/include "*.h")
harvest(zlib/lib zlib/lib "*.a")
harvest(xml2/include xml2/include "*.h")
harvest(xml2/lib xml2/lib "*.a")
harvest(wayland-protocols/share/wayland-protocols wayland-protocols/share/wayland-protocols/ "*.xml")
else()
harvest(blosc/lib openvdb/lib "*.a")
harvest(xml2/lib opencollada/lib "*.a")
endif()
harvest(opencollada/include/opencollada opencollada/include "*.h")
harvest(opencollada/lib/opencollada opencollada/lib "*.a")
harvest(opencolorio/include opencolorio/include "*.h")
harvest(opencolorio/lib opencolorio/lib "*.a")
harvest(opencolorio/lib/static opencolorio/lib "*.a")
harvest(openexr/include openexr/include "*.h")
harvest(openexr/lib openexr/lib "*.a")
harvest(openimageio/bin openimageio/bin "idiff")
harvest(openimageio/bin openimageio/bin "maketx")
harvest(openimageio/bin openimageio/bin "oiiotool")
harvest(openimageio/include openimageio/include "*")
harvest(openimageio/lib openimageio/lib "*.a")
harvest(openimagedenoise/include openimagedenoise/include "*")
harvest(openimagedenoise/lib openimagedenoise/lib "*.a")
harvest(embree/include embree/include "*.h")
harvest(embree/lib embree/lib "*.a")
harvest(openjpeg/include/openjpeg-2.3 openjpeg/include "*.h")
harvest(openjpeg/lib openjpeg/lib "*.a")
harvest(opensubdiv/include opensubdiv/include "*.h")
harvest(opensubdiv/lib opensubdiv/lib "*.a")
harvest(openvdb/include/openvdb openvdb/include/openvdb "*.h")
harvest(openvdb/lib openvdb/lib "*.a")
harvest(nanovdb/nanovdb nanovdb/include/nanovdb "*.h")
harvest(xr_openxr_sdk/include/openxr xr_openxr_sdk/include/openxr "*.h")
harvest(xr_openxr_sdk/lib xr_openxr_sdk/lib "*.a")
harvest(osl/bin osl/bin "oslc")
harvest(osl/include osl/include "*.h")
harvest(osl/lib osl/lib "*.a")
harvest(osl/share/OSL/shaders osl/share/OSL/shaders "*.h")
harvest(png/include png/include "*.h")
harvest(png/lib png/lib "*.a")
harvest(pugixml/include pugixml/include "*.hpp")
harvest(pugixml/lib pugixml/lib "*.a")
harvest(python/bin python/bin "python${PYTHON_SHORT_VERSION}")
harvest(python/include python/include "*h")
harvest(python/lib python/lib "*")
harvest(sdl/include/SDL2 sdl/include "*.h")
harvest(sdl/lib sdl/lib "libSDL2.a")
harvest(sndfile/include sndfile/include "*.h")
harvest(sndfile/lib sndfile/lib "*.a")
harvest(spnav/include spnav/include "*.h")
harvest(spnav/lib spnav/lib "*.a")
harvest(tbb/include tbb/include "*.h")
harvest(tbb/lib/libtbb_static.a tbb/lib/libtbb.a)
harvest(theora/lib ffmpeg/lib "*.a")
harvest(tiff/include tiff/include "*.h")
harvest(tiff/lib tiff/lib "*.a")
harvest(vorbis/lib ffmpeg/lib "*.a")
harvest(opus/lib ffmpeg/lib "*.a")
harvest(vpx/lib ffmpeg/lib "*.a")
harvest(webp/lib ffmpeg/lib "*.a")
harvest(x264/lib ffmpeg/lib "*.a")
harvest(xvidcore/lib ffmpeg/lib "*.a")
harvest(usd/include usd/include "*.h")
harvest(usd/lib/usd usd/lib/usd "*")
harvest(usd/plugin usd/plugin "*")
harvest(potrace/include potrace/include "*.h")
harvest(potrace/lib potrace/lib "*.a")
harvest(haru/include haru/include "*.h")
harvest(haru/lib haru/lib "*.a")
harvest(zstd/include zstd/include "*.h")
harvest(zstd/lib zstd/lib "*.a")
if(UNIX AND NOT APPLE)
harvest(libglu/lib mesa/lib "*.so*")
harvest(mesa/lib64 mesa/lib "*.so*")
endif()
function(harvest from to)
set(pattern "")
foreach(f ${ARGN})
set(pattern ${f})
endforeach()
if(pattern STREQUAL "")
get_filename_component(dirpath ${to} DIRECTORY)
get_filename_component(filename ${to} NAME)
install(
FILES ${LIBDIR}/${from}
DESTINATION ${HARVEST_TARGET}/${dirpath}
RENAME ${filename})
else()
install(
DIRECTORY ${LIBDIR}/${from}/
DESTINATION ${HARVEST_TARGET}/${to}
USE_SOURCE_PERMISSIONS
FILES_MATCHING PATTERN ${pattern}
PATTERN "pkgconfig" EXCLUDE
PATTERN "cmake" EXCLUDE
PATTERN "__pycache__" EXCLUDE
PATTERN "tests" EXCLUDE)
endif()
endfunction()
harvest(alembic/include alembic/include "*.h")
harvest(alembic/lib/libAlembic.a alembic/lib/libAlembic.a)
harvest(alembic/bin alembic/bin "*")
harvest(boost/include boost/include "*")
harvest(boost/lib boost/lib "*.a")
harvest(ffmpeg/include ffmpeg/include "*.h")
harvest(ffmpeg/lib ffmpeg/lib "*.a")
harvest(fftw3/include fftw3/include "*.h")
harvest(fftw3/lib fftw3/lib "*.a")
harvest(flac/lib sndfile/lib "libFLAC.a")
harvest(freetype/include freetype/include "*.h")
harvest(freetype/lib/libfreetype2ST.a freetype/lib/libfreetype.a)
harvest(glew/include glew/include "*.h")
harvest(glew/lib glew/lib "*.a")
harvest(gmp/include gmp/include "*.h")
harvest(gmp/lib gmp/lib "*.a")
harvest(jemalloc/include jemalloc/include "*.h")
harvest(jemalloc/lib jemalloc/lib "*.a")
harvest(jpg/include jpeg/include "*.h")
harvest(jpg/lib jpeg/lib "libjpeg.a")
harvest(lame/lib ffmpeg/lib "*.a")
harvest(llvm/bin llvm/bin "clang-format")
if(BUILD_CLANG_TOOLS)
harvest(llvm/bin llvm/bin "clang-tidy")
harvest(llvm/share/clang llvm/share "run-clang-tidy.py")
endif()
harvest(llvm/include llvm/include "*")
harvest(llvm/bin llvm/bin "llvm-config")
harvest(llvm/lib llvm/lib "libLLVM*.a")
harvest(llvm/lib llvm/lib "libclang*.a")
if(APPLE)
harvest(openmp/lib openmp/lib "*")
harvest(openmp/include openmp/include "*.h")
if("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64")
harvest(sse2neon sse2neon "*.h")
endif()
endif()
harvest(ogg/lib ffmpeg/lib "*.a")
harvest(openal/include openal/include "*.h")
if(UNIX AND NOT APPLE)
harvest(openal/lib openal/lib "*.a")
harvest(blosc/include blosc/include "*.h")
harvest(blosc/lib blosc/lib "*.a")
harvest(zlib/include zlib/include "*.h")
harvest(zlib/lib zlib/lib "*.a")
harvest(xml2/include xml2/include "*.h")
harvest(xml2/lib xml2/lib "*.a")
else()
harvest(blosc/lib openvdb/lib "*.a")
harvest(xml2/lib opencollada/lib "*.a")
endif()
harvest(opencollada/include/opencollada opencollada/include "*.h")
harvest(opencollada/lib/opencollada opencollada/lib "*.a")
harvest(opencolorio/include opencolorio/include "*.h")
harvest(opencolorio/lib opencolorio/lib "*.a")
harvest(opencolorio/lib/static opencolorio/lib "*.a")
harvest(openexr/include openexr/include "*.h")
harvest(openexr/lib openexr/lib "*.a")
harvest(openimageio/bin openimageio/bin "idiff")
harvest(openimageio/bin openimageio/bin "maketx")
harvest(openimageio/bin openimageio/bin "oiiotool")
harvest(openimageio/include openimageio/include "*")
harvest(openimageio/lib openimageio/lib "*.a")
harvest(openimagedenoise/include openimagedenoise/include "*")
harvest(openimagedenoise/lib openimagedenoise/lib "*.a")
harvest(embree/include embree/include "*.h")
harvest(embree/lib embree/lib "*.a")
harvest(openjpeg/include/openjpeg-2.3 openjpeg/include "*.h")
harvest(openjpeg/lib openjpeg/lib "*.a")
harvest(opensubdiv/include opensubdiv/include "*.h")
harvest(opensubdiv/lib opensubdiv/lib "*.a")
harvest(openvdb/include/openvdb openvdb/include/openvdb "*.h")
harvest(openvdb/lib openvdb/lib "*.a")
harvest(nanovdb/nanovdb nanovdb/include/nanovdb "*.h")
harvest(xr_openxr_sdk/include/openxr xr_openxr_sdk/include/openxr "*.h")
harvest(xr_openxr_sdk/lib xr_openxr_sdk/lib "*.a")
harvest(osl/bin osl/bin "oslc")
harvest(osl/include osl/include "*.h")
harvest(osl/lib osl/lib "*.a")
harvest(osl/share/OSL/shaders osl/share/OSL/shaders "*.h")
harvest(png/include png/include "*.h")
harvest(png/lib png/lib "*.a")
harvest(pugixml/include pugixml/include "*.hpp")
harvest(pugixml/lib pugixml/lib "*.a")
harvest(python/bin python/bin "python${PYTHON_SHORT_VERSION}")
harvest(python/include python/include "*h")
harvest(python/lib python/lib "*")
harvest(sdl/include/SDL2 sdl/include "*.h")
harvest(sdl/lib sdl/lib "libSDL2.a")
harvest(sndfile/include sndfile/include "*.h")
harvest(sndfile/lib sndfile/lib "*.a")
harvest(spnav/include spnav/include "*.h")
harvest(spnav/lib spnav/lib "*.a")
harvest(tbb/include tbb/include "*.h")
harvest(tbb/lib/libtbb_static.a tbb/lib/libtbb.a)
harvest(theora/lib ffmpeg/lib "*.a")
harvest(tiff/include tiff/include "*.h")
harvest(tiff/lib tiff/lib "*.a")
harvest(vorbis/lib ffmpeg/lib "*.a")
harvest(opus/lib ffmpeg/lib "*.a")
harvest(vpx/lib ffmpeg/lib "*.a")
harvest(webp/lib ffmpeg/lib "*.a")
harvest(x264/lib ffmpeg/lib "*.a")
harvest(xvidcore/lib ffmpeg/lib "*.a")
harvest(usd/include usd/include "*.h")
harvest(usd/lib/usd usd/lib/usd "*")
harvest(usd/plugin usd/plugin "*")
harvest(potrace/include potrace/include "*.h")
harvest(potrace/lib potrace/lib "*.a")
harvest(haru/include haru/include "*.h")
harvest(haru/lib haru/lib "*.a")
if(UNIX AND NOT APPLE)
harvest(libglu/lib mesa/lib "*.so*")
harvest(mesa/lib64 mesa/lib "*.so*")
endif()
endif()


@@ -35,7 +35,6 @@ elseif(APPLE)
else()
set(ISPC_EXTRA_ARGS_APPLE
-DBISON_EXECUTABLE=/usr/local/opt/bison/bin/bison
-DFLEX_EXECUTABLE=/usr/local/opt/flex/bin/flex
-DARM_ENABLED=Off
)
endif()
@@ -44,7 +43,6 @@ elseif(UNIX)
-DCMAKE_C_COMPILER=${LIBDIR}/llvm/bin/clang
-DCMAKE_CXX_COMPILER=${LIBDIR}/llvm/bin/clang++
-DARM_ENABLED=Off
-DFLEX_EXECUTABLE=${LIBDIR}/flex/bin/flex
)
endif()
@@ -84,9 +82,4 @@ if(WIN32)
external_ispc
external_flexbison
)
elseif(UNIX AND NOT APPLE)
add_dependencies(
external_ispc
external_flex
)
endif()


@@ -41,7 +41,7 @@ if(WIN32)
else()
set(JPEG_LIBRARY jpeg-staticd${LIBEXT})
endif()
else(WIN32)
else()
# cmake for unix
set(JPEG_EXTRA_ARGS
-DWITH_JPEG8=ON


@@ -16,7 +16,7 @@
#
# ***** END GPL LICENSE BLOCK *****
if(BLENDER_PLATFORM_ARM)
if(APPLE AND "${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64")
set(LLVM_TARGETS AArch64$<SEMICOLON>ARM)
else()
set(LLVM_TARGETS X86)
@@ -25,6 +25,7 @@ endif()
if(APPLE)
set(LLVM_XML2_ARGS
-DLIBXML2_LIBRARY=${LIBDIR}/xml2/lib/libxml2.a
-DLIBXML2_INCLUDE_DIR=${LIBDIR}/xml2/include/libxml2
)
set(LLVM_BUILD_CLANG_TOOLS_EXTRA ^^clang-tools-extra)
set(BUILD_CLANG_TOOLS ON)
@@ -66,11 +67,7 @@ ExternalProject_Add(ll
if(MSVC)
if(BUILD_MODE STREQUAL Release)
set(LLVM_HARVEST_COMMAND
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/llvm/lib ${HARVEST_TARGET}/llvm/lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/llvm/include ${HARVEST_TARGET}/llvm/include &&
${CMAKE_COMMAND} -E copy ${LIBDIR}/llvm/bin/clang-format.exe ${HARVEST_TARGET}/llvm/bin/clang-format.exe
)
set(LLVM_HARVEST_COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/llvm/ ${HARVEST_TARGET}/llvm/ )
else()
set(LLVM_HARVEST_COMMAND
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/llvm/lib/ ${HARVEST_TARGET}/llvm/debug/lib/ &&


@@ -38,6 +38,7 @@ ExternalProject_Add(external_numpy
PREFIX ${BUILD_DIR}/numpy
PATCH_COMMAND ${NUMPY_PATCH}
CONFIGURE_COMMAND ""
PATCH_COMMAND COMMAND ${PATCH_CMD} -p 1 -d ${BUILD_DIR}/numpy/src/external_numpy < ${PATCH_DIR}/numpy.diff
LOG_BUILD 1
BUILD_COMMAND ${PYTHON_BINARY} ${BUILD_DIR}/numpy/src/external_numpy/setup.py build ${NUMPY_BUILD_OPTION} install --old-and-unmanageable
INSTALL_COMMAND ""


@@ -20,6 +20,29 @@ if(UNIX)
set(OPENCOLLADA_EXTRA_ARGS
-DLIBXML2_INCLUDE_DIR=${LIBDIR}/xml2/include/libxml2
-DLIBXML2_LIBRARIES=${LIBDIR}/xml2/lib/libxml2.a)
# WARNING: the patch contains mixed UNIX and DOS line endings,
# as does the OPENCOLLADA package; if this can be corrected upstream that would be better.
# For now use `sed` to force UNIX line endings so the patch applies.
# Needed as neither ignoring white-space nor applying as a binary resolves this problem.
set(PATCH_MAYBE_DOS2UNIX_CMD
sed -i "s/\\r//"
${PATCH_DIR}/opencollada.diff
${BUILD_DIR}/opencollada/src/external_opencollada/CMakeLists.txt
${BUILD_DIR}/opencollada/src/external_opencollada/Externals/LibXML/CMakeLists.txt &&
)
else()
set(OPENCOLLADA_EXTRA_ARGS
-DCMAKE_DEBUG_POSTFIX=_d
-DLIBXML2_INCLUDE_DIR=${LIBDIR}/xml2/include/libxml2
)
if(BUILD_MODE STREQUAL Release)
list(APPEND OPENCOLLADA_EXTRA_ARGS -DLIBXML2_LIBRARIES=${LIBDIR}/xml2/lib/libxml2s.lib)
else()
list(APPEND OPENCOLLADA_EXTRA_ARGS -DLIBXML2_LIBRARIES=${LIBDIR}/xml2/lib/libxml2sd.lib)
endif()
set(PATCH_MAYBE_DOS2UNIX_CMD)
endif()
ExternalProject_Add(external_opencollada
@@ -27,17 +50,19 @@ ExternalProject_Add(external_opencollada
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${OPENCOLLADA_HASH_TYPE}=${OPENCOLLADA_HASH}
PREFIX ${BUILD_DIR}/opencollada
PATCH_COMMAND ${PATCH_CMD} -p 1 -N -d ${BUILD_DIR}/opencollada/src/external_opencollada < ${PATCH_DIR}/opencollada.diff
PATCH_COMMAND
${PATCH_MAYBE_DOS2UNIX_CMD}
${PATCH_CMD} -p 1 -N -d ${BUILD_DIR}/opencollada/src/external_opencollada < ${PATCH_DIR}/opencollada.diff
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/opencollada ${DEFAULT_CMAKE_FLAGS} ${OPENCOLLADA_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/opencollada
)
if(UNIX)
add_dependencies(
external_opencollada
external_xml2
)
endif()
unset(PATCH_MAYBE_DOS2UNIX_CMD)
add_dependencies(
external_opencollada
external_xml2
)
if(WIN32)
if(BUILD_MODE STREQUAL Release)
@@ -48,17 +73,7 @@ if(WIN32)
endif()
if(BUILD_MODE STREQUAL Debug)
ExternalProject_Add_Step(external_opencollada after_install
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencollada/lib/opencollada/buffer.lib ${HARVEST_TARGET}/opencollada/lib/opencollada/buffer_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencollada/lib/opencollada/ftoa.lib ${HARVEST_TARGET}/opencollada/lib/opencollada/ftoa_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencollada/lib/opencollada/GeneratedSaxParser.lib ${HARVEST_TARGET}/opencollada/lib/opencollada/GeneratedSaxParser_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencollada/lib/opencollada/MathMLSolver.lib ${HARVEST_TARGET}/opencollada/lib/opencollada/MathMLSolver_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencollada/lib/opencollada/OpenCOLLADABaseUtils.lib ${HARVEST_TARGET}/opencollada/lib/opencollada/OpenCOLLADABaseUtils_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencollada/lib/opencollada/OpenCOLLADAFramework.lib ${HARVEST_TARGET}/opencollada/lib/opencollada/OpenCOLLADAFramework_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencollada/lib/opencollada/OpenCOLLADASaxFrameworkLoader.lib ${HARVEST_TARGET}/opencollada/lib/opencollada/OpenCOLLADASaxFrameworkLoader_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencollada/lib/opencollada/OpenCOLLADAStreamWriter.lib ${HARVEST_TARGET}/opencollada/lib/opencollada/OpenCOLLADAStreamWriter_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencollada/lib/opencollada/pcre.lib ${HARVEST_TARGET}/opencollada/lib/opencollada/pcre_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencollada/lib/opencollada/UTF.lib ${HARVEST_TARGET}/opencollada/lib/opencollada/UTF_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencollada/lib/opencollada/xml.lib ${HARVEST_TARGET}/opencollada/lib/opencollada/xml_d.lib
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/opencollada/lib ${HARVEST_TARGET}/opencollada/lib
DEPENDEES install
)
endif()


@@ -36,7 +36,7 @@ set(OPENCOLORIO_EXTRA_ARGS
-Dyaml-cpp_ROOT=${LIBDIR}/yamlcpp
)
if(BLENDER_PLATFORM_ARM)
if(APPLE AND NOT("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "x86_64"))
set(OPENCOLORIO_EXTRA_ARGS
${OPENCOLORIO_EXTRA_ARGS}
-DOCIO_USE_SSE=OFF


@@ -45,6 +45,7 @@ ExternalProject_Add(external_openimagedenoise
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${OIDN_HASH_TYPE}=${OIDN_HASH}
PREFIX ${BUILD_DIR}/openimagedenoise
PATCH_COMMAND ${PATCH_CMD} -p 1 -N -d ${BUILD_DIR}/openimagedenoise/src/external_openimagedenoise < ${PATCH_DIR}/oidn.diff
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/openimagedenoise ${DEFAULT_CMAKE_FLAGS} ${OIDN_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/openimagedenoise
)


@@ -16,20 +16,15 @@
#
# ***** END GPL LICENSE BLOCK *****
if(APPLE)
set(OPENMP_PATCH_COMMAND ${PATCH_CMD} -p 1 -d ${BUILD_DIR}/openmp/src/external_openmp < ${PATCH_DIR}/openmp.diff)
else()
set(OPENMP_PATCH_COMMAND)
endif()
ExternalProject_Add(external_openmp
URL file://${PACKAGE_DIR}/${OPENMP_FILE}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${OPENMP_HASH_TYPE}=${OPENMP_HASH}
PREFIX ${BUILD_DIR}/openmp
PATCH_COMMAND ${OPENMP_PATCH_COMMAND}
PATCH_COMMAND ${PATCH_CMD} -p 1 -d ${BUILD_DIR}/openmp/src/external_openmp < ${PATCH_DIR}/openmp.diff
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/openmp ${DEFAULT_CMAKE_FLAGS}
INSTALL_COMMAND cd ${BUILD_DIR}/openmp/src/external_openmp-build && install_name_tool -id @rpath/libomp.dylib runtime/src/libomp.dylib && make install
INSTALL_COMMAND cd ${BUILD_DIR}/openmp/src/external_openmp-build && install_name_tool -id @executable_path/../Resources/lib/libomp.dylib runtime/src/libomp.dylib && make install
INSTALL_DIR ${LIBDIR}/openmp
)


@@ -32,7 +32,7 @@ message("BuildMode = ${BUILD_MODE}")
if(BUILD_MODE STREQUAL "Debug")
set(LIBDIR ${CMAKE_CURRENT_BINARY_DIR}/Debug)
else(BUILD_MODE STREQUAL "Debug")
else()
set(LIBDIR ${CMAKE_CURRENT_BINARY_DIR}/Release)
endif()
@@ -116,34 +116,16 @@ else()
set(LIBPREFIX "lib")
if(APPLE)
# Let's get the current Xcode dir, to support xcode-select
execute_process(
COMMAND xcode-select --print-path
OUTPUT_VARIABLE XCODE_DEV_PATH OUTPUT_STRIP_TRAILING_WHITESPACE
)
execute_process(
COMMAND xcodebuild -version -sdk macosx SDKVersion
OUTPUT_VARIABLE MACOSX_SDK_VERSION OUTPUT_STRIP_TRAILING_WHITESPACE)
if(NOT CMAKE_OSX_ARCHITECTURES)
execute_process(COMMAND uname -m OUTPUT_VARIABLE ARCHITECTURE OUTPUT_STRIP_TRAILING_WHITESPACE)
message(STATUS "Detected native architecture ${ARCHITECTURE}.")
set(CMAKE_OSX_ARCHITECTURES "${ARCHITECTURE}")
endif()
if("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "x86_64")
set(OSX_DEPLOYMENT_TARGET 10.13)
else()
set(OSX_DEPLOYMENT_TARGET 11.00)
endif()
set(OSX_SYSROOT ${XCODE_DEV_PATH}/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk)
# Use same Xcode detection as Blender itself.
include(../cmake/platform/platform_apple_xcode.cmake)
if("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64")
set(BLENDER_PLATFORM_ARM ON)
endif()
set(PLATFORM_CFLAGS "-isysroot ${OSX_SYSROOT} -mmacosx-version-min=${OSX_DEPLOYMENT_TARGET} -arch ${CMAKE_OSX_ARCHITECTURES}")
set(PLATFORM_CXXFLAGS "-isysroot ${OSX_SYSROOT} -mmacosx-version-min=${OSX_DEPLOYMENT_TARGET} -std=c++11 -stdlib=libc++ -arch ${CMAKE_OSX_ARCHITECTURES}")
set(PLATFORM_LDFLAGS "-isysroot ${OSX_SYSROOT} -mmacosx-version-min=${OSX_DEPLOYMENT_TARGET} -arch ${CMAKE_OSX_ARCHITECTURES}")
set(PLATFORM_CFLAGS "-isysroot ${CMAKE_OSX_SYSROOT} -mmacosx-version-min=${CMAKE_OSX_DEPLOYMENT_TARGET} -arch ${CMAKE_OSX_ARCHITECTURES}")
set(PLATFORM_CXXFLAGS "-isysroot ${CMAKE_OSX_SYSROOT} -mmacosx-version-min=${CMAKE_OSX_DEPLOYMENT_TARGET} -std=c++11 -stdlib=libc++ -arch ${CMAKE_OSX_ARCHITECTURES}")
set(PLATFORM_LDFLAGS "-isysroot ${CMAKE_OSX_SYSROOT} -mmacosx-version-min=${CMAKE_OSX_DEPLOYMENT_TARGET} -arch ${CMAKE_OSX_ARCHITECTURES}")
if("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "x86_64")
set(PLATFORM_BUILD_TARGET --build=x86_64-apple-darwin17.0.0) # OS X 10.13
else()
@@ -151,14 +133,10 @@ else()
endif()
set(PLATFORM_CMAKE_FLAGS
-DCMAKE_OSX_ARCHITECTURES:STRING=${CMAKE_OSX_ARCHITECTURES}
-DCMAKE_OSX_DEPLOYMENT_TARGET:STRING=${OSX_DEPLOYMENT_TARGET}
-DCMAKE_OSX_SYSROOT:PATH=${OSX_SYSROOT}
-DCMAKE_OSX_DEPLOYMENT_TARGET:STRING=${CMAKE_OSX_DEPLOYMENT_TARGET}
-DCMAKE_OSX_SYSROOT:PATH=${CMAKE_OSX_SYSROOT}
)
else()
if("${CMAKE_SYSTEM_PROCESSOR}" STREQUAL "aarch64")
set(BLENDER_PLATFORM_ARM ON)
endif()
set(PLATFORM_CFLAGS "-fPIC")
set(PLATFORM_CXXFLAGS "-std=c++11 -fPIC")
set(PLATFORM_LDFLAGS)
@@ -186,8 +164,8 @@ else()
set(BLENDER_CMAKE_CXX_FLAGS_RELWITHDEBINFO "-O2 -g -DNDEBUG ${PLATFORM_CXXFLAGS}")
set(CONFIGURE_ENV
export MACOSX_DEPLOYMENT_TARGET=${OSX_DEPLOYMENT_TARGET} &&
export MACOSX_SDK_VERSION=${OSX_DEPLOYMENT_TARGET} &&
export MACOSX_DEPLOYMENT_TARGET=${CMAKE_OSX_DEPLOYMENT_TARGET} &&
export MACOSX_SDK_VERSION=${CMAKE_OSX_DEPLOYMENT_TARGET} &&
export CFLAGS=${PLATFORM_CFLAGS} &&
export CXXFLAGS=${PLATFORM_CXXFLAGS} &&
export LDFLAGS=${PLATFORM_LDFLAGS}


@@ -20,10 +20,12 @@ if(WIN32)
set(OSL_CMAKE_CXX_STANDARD_LIBRARIES "kernel32${LIBEXT} user32${LIBEXT} gdi32${LIBEXT} winspool${LIBEXT} shell32${LIBEXT} ole32${LIBEXT} oleaut32${LIBEXT} uuid${LIBEXT} comdlg32${LIBEXT} advapi32${LIBEXT} psapi${LIBEXT}")
set(OSL_FLEX_BISON -DFLEX_EXECUTABLE=${LIBDIR}/flexbison/win_flex.exe -DBISON_EXECUTABLE=${LIBDIR}/flexbison/win_bison.exe)
set(OSL_SIMD_FLAGS -DOIIO_NOSIMD=1 -DOIIO_SIMD=sse2)
SET(OSL_PLATFORM_FLAGS -DLINKSTATIC=ON)
else()
set(OSL_CMAKE_CXX_STANDARD_LIBRARIES)
set(OSL_FLEX_BISON)
set(OSL_OPENIMAGEIO_LIBRARY "${LIBDIR}/openimageio/lib/${LIBPREFIX}OpenImageIO${LIBEXT};${LIBDIR}/openimageio/lib/${LIBPREFIX}OpenImageIO_Util${LIBEXT};${LIBDIR}/png/lib/${LIBPREFIX}png16${LIBEXT};${LIBDIR}/jpg/lib/${LIBPREFIX}jpeg${LIBEXT};${LIBDIR}/tiff/lib/${LIBPREFIX}tiff${LIBEXT};${LIBDIR}/openexr/lib/${LIBPREFIX}IlmImf${OPENEXR_VERSION_POSTFIX}${LIBEXT}")
SET(OSL_PLATFORM_FLAGS)
endif()
set(OSL_ILMBASE_CUSTOM_LIBRARIES "${LIBDIR}/openexr/lib/Imath${OPENEXR_VERSION_POSTFIX}.lib^^${LIBDIR}/openexr/lib/Half${OPENEXR_VERSION_POSTFIX}.lib^^${LIBDIR}/openexr/lib/IlmThread${OPENEXR_VERSION_POSTFIX}.lib^^${LIBDIR}/openexr/lib/Iex${OPENEXR_VERSION_POSTFIX}.lib")
@@ -49,13 +51,12 @@ set(OSL_EXTRA_ARGS
-DOpenImageIO_ROOT=${LIBDIR}/openimageio/
-DOSL_BUILD_TESTS=OFF
-DOSL_BUILD_MATERIALX=OFF
-DPNG_ROOT=${LIBDIR}/png
-DZLIB_LIBRARY=${LIBDIR}/zlib/lib/${ZLIB_LIBRARY}
-DZLIB_INCLUDE_DIR=${LIBDIR}/zlib/include/
${OSL_FLEX_BISON}
-DCMAKE_CXX_STANDARD_LIBRARIES=${OSL_CMAKE_CXX_STANDARD_LIBRARIES}
-DBUILD_SHARED_LIBS=OFF
-DLINKSTATIC=ON
${OSL_PLATFORM_FLAGS}
-DOSL_BUILD_PLUGINS=OFF
-DSTOP_ON_WARNING=OFF
-DUSE_LLVM_BITCODE=OFF
@@ -67,10 +68,16 @@ set(OSL_EXTRA_ARGS
-DINSTALL_DOCS=OFF
${OSL_SIMD_FLAGS}
-Dpugixml_ROOT=${LIBDIR}/pugixml
-DTIFF_ROOT=${LIBDIR}/tiff
-DJPEG_ROOT=${LIBDIR}/jpeg
-DUSE_PYTHON=OFF
-DCMAKE_CXX_STANDARD=14
)
# Apple arm64 uses LLVM 11, LLVM 10+ requires C++14
if (APPLE AND "${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64")
list(APPEND OSL_EXTRA_ARGS -DCMAKE_CXX_STANDARD=14)
endif()
ExternalProject_Add(external_osl
URL file://${PACKAGE_DIR}/${OSL_FILE}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
@@ -88,20 +95,10 @@ add_dependencies(
ll
external_openexr
external_zlib
external_flexbison
external_openimageio
external_pugixml
)
if(WIN32)
add_dependencies(
external_osl
external_flexbison
)
elseif(UNIX AND NOT APPLE)
add_dependencies(
external_osl
external_flex
)
endif()
if(WIN32)
if(BUILD_MODE STREQUAL Release)


@@ -22,7 +22,7 @@ set(PNG_EXTRA_ARGS
-DPNG_STATIC=ON
)
if(BLENDER_PLATFORM_ARM)
if(APPLE AND ("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64"))
set(PNG_EXTRA_ARGS ${PNG_EXTRA_ARGS} -DPNG_HARDWARE_OPTIMIZATIONS=ON -DPNG_ARM_NEON=on -DCMAKE_SYSTEM_PROCESSOR="aarch64")
endif()
@@ -40,6 +40,14 @@ add_dependencies(
external_zlib
)
if(WIN32 AND BUILD_MODE STREQUAL Release)
ExternalProject_Add_Step(external_png after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/png/include/ ${HARVEST_TARGET}/png/include/
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/png/lib/libpng16_static${LIBEXT} ${HARVEST_TARGET}/png/lib/libpng${LIBEXT}
DEPENDEES install
)
endif()
if(WIN32 AND BUILD_MODE STREQUAL Debug)
ExternalProject_Add_Step(external_png after_install
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/png/lib/libpng16_staticd${LIBEXT} ${LIBDIR}/png/lib/libpng16${LIBEXT}


@@ -31,9 +31,11 @@ if(WIN32)
endmacro()
set(PYTHON_EXTERNALS_FOLDER ${BUILD_DIR}/python/src/external_python/externals)
set(ZLIB_SOURCE_FOLDER ${BUILD_DIR}/zlib/src/external_zlib)
set(DOWNLOADS_EXTERNALS_FOLDER ${DOWNLOAD_DIR}/externals)
cmake_to_dos_path(${PYTHON_EXTERNALS_FOLDER} PYTHON_EXTERNALS_FOLDER_DOS)
cmake_to_dos_path(${ZLIB_SOURCE_FOLDER} ZLIB_SOURCE_FOLDER_DOS)
cmake_to_dos_path(${DOWNLOADS_EXTERNALS_FOLDER} DOWNLOADS_EXTERNALS_FOLDER_DOS)
ExternalProject_Add(external_python
@@ -41,11 +43,21 @@ if(WIN32)
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${PYTHON_HASH_TYPE}=${PYTHON_HASH}
PREFIX ${BUILD_DIR}/python
CONFIGURE_COMMAND ""
# Python will download its own deps and there's very little we can do about
# that beyond placing some code in their externals dir before it tries.
# The folder names *HAVE* to match the ones inside Python's get_externals.cmd.
# Python 3.10.8 still ships zlib 1.2.12, replace it with our 1.2.13
# copy until they update.
CONFIGURE_COMMAND mkdir ${PYTHON_EXTERNALS_FOLDER_DOS} &&
mklink /J ${PYTHON_EXTERNALS_FOLDER_DOS}\\zlib-1.2.12 ${ZLIB_SOURCE_FOLDER_DOS} &&
${CMAKE_COMMAND} -E copy ${ZLIB_SOURCE_FOLDER}/../external_zlib-build/zconf.h ${PYTHON_EXTERNALS_FOLDER}/zlib-1.2.12/zconf.h
BUILD_COMMAND cd ${BUILD_DIR}/python/src/external_python/pcbuild/ && set IncludeTkinter=false && call build.bat -e -p x64 -c ${BUILD_MODE}
INSTALL_COMMAND ${PYTHON_BINARY_INTERNAL} ${PYTHON_SRC}/PC/layout/main.py -b ${PYTHON_SRC}/PCbuild/amd64 -s ${PYTHON_SRC} -t ${PYTHON_SRC}/tmp/ --include-stable --include-pip --include-dev --include-launchers --include-venv --include-symbols ${PYTHON_EXTRA_INSTLAL_FLAGS} --copy ${LIBDIR}/python
)
add_dependencies(
external_python
external_zlib
)
else()
if(APPLE)
# Disable functions that can be in 10.13 sdk but aren't available on 10.9 target.


@@ -18,20 +18,14 @@
if(WIN32 AND BUILD_MODE STREQUAL Debug)
set(SITE_PACKAGES_EXTRA --global-option build --global-option --debug)
# zstandard is determined to build and link release mode libs in a debug
# configuration, the only way to make it happy is to bend to its will
# and give it a library to link with.
set(PIP_CONFIGURE_COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/python/libs/python${PYTHON_SHORT_VERSION_NO_DOTS}_d.lib ${LIBDIR}/python/libs/python${PYTHON_SHORT_VERSION_NO_DOTS}.lib)
else()
set(PIP_CONFIGURE_COMMAND echo ".")
endif()
ExternalProject_Add(external_python_site_packages
DOWNLOAD_COMMAND ""
CONFIGURE_COMMAND ${PIP_CONFIGURE_COMMAND}
CONFIGURE_COMMAND ""
BUILD_COMMAND ""
PREFIX ${BUILD_DIR}/site_packages
INSTALL_COMMAND ${PYTHON_BINARY} -m pip install ${SITE_PACKAGES_EXTRA} cython==${CYTHON_VERSION} idna==${IDNA_VERSION} charset-normalizer==${CHARSET_NORMALIZER_VERSION} urllib3==${URLLIB3_VERSION} certifi==${CERTIFI_VERSION} requests==${REQUESTS_VERSION} zstandard==${ZSTANDARD_VERSION} --no-binary :all:
INSTALL_COMMAND ${PYTHON_BINARY} -m pip install ${SITE_PACKAGES_EXTRA} cython==${CYTHON_VERSION} idna==${IDNA_VERSION} chardet==${CHARDET_VERSION} urllib3==${URLLIB3_VERSION} certifi==${CERTIFI_VERSION} requests==${REQUESTS_VERSION} --no-binary :all:
)
if(USE_PIP_NUMPY)


@@ -27,18 +27,11 @@ else()
set(SNDFILE_OPTIONS --enable-static --disable-shared )
endif()
if(UNIX)
set(SNDFILE_PATCH_CMD ${PATCH_CMD} --verbose -p 0 -d ${BUILD_DIR}/sndfile/src/external_sndfile < ${PATCH_DIR}/sndfile.diff)
else()
set(SNDFILE_PATCH_CMD)
endif()
ExternalProject_Add(external_sndfile
URL file://${PACKAGE_DIR}/${SNDFILE_FILE}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${SNDFILE_HASH_TYPE}=${SNDFILE_HASH}
PREFIX ${BUILD_DIR}/sndfile
PATCH_COMMAND ${SNDFILE_PATCH_CMD}
CONFIGURE_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/sndfile/src/external_sndfile/ && ${SNDFILE_ENV} ${CONFIGURE_COMMAND} ${SNDFILE_OPTIONS} --prefix=${mingw_LIBDIR}/sndfile
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/sndfile/src/external_sndfile/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/sndfile/src/external_sndfile/ && make install


@@ -64,7 +64,6 @@ ExternalProject_Add(external_sqlite
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${SQLITE_HASH_TYPE}=${SQLITE_HASH}
PREFIX ${BUILD_DIR}/sqlite
PATCH_COMMAND ${PATCH_CMD} -p 1 -d ${BUILD_DIR}/sqlite/src/external_sqlite < ${PATCH_DIR}/sqlite.diff
CONFIGURE_COMMAND ${SQLITE_CONFIGURE_ENV} && cd ${BUILD_DIR}/sqlite/src/external_sqlite/ && ${CONFIGURE_COMMAND} --prefix=${LIBDIR}/sqlite ${SQLITE_CONFIGURATION_ARGS}
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/sqlite/src/external_sqlite/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/sqlite/src/external_sqlite/ && make install


@@ -16,13 +16,15 @@
#
# ***** END GPL LICENSE BLOCK *****
ExternalProject_Add(external_sse2neon
GIT_REPOSITORY ${SSE2NEON_GIT}
GIT_TAG ${SSE2NEON_GIT_HASH}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
PREFIX ${BUILD_DIR}/sse2neon
CONFIGURE_COMMAND echo sse2neon - Nothing to configure
BUILD_COMMAND echo sse2neon - nothing to build
INSTALL_COMMAND mkdir -p ${LIBDIR}/sse2neon && cp ${BUILD_DIR}/sse2neon/src/external_sse2neon/sse2neon.h ${LIBDIR}/sse2neon
INSTALL_DIR ${LIBDIR}/sse2neon
)
if(APPLE AND ("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64"))
ExternalProject_Add(external_sse2neon
GIT_REPOSITORY ${SSE2NEON_GIT}
GIT_TAG ${SSE2NEON_GIT_HASH}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
PREFIX ${BUILD_DIR}/sse2neon
CONFIGURE_COMMAND echo sse2neon - Nothing to configure
BUILD_COMMAND echo sse2neon - nothing to build
INSTALL_COMMAND mkdir -p ${LIBDIR}/sse2neon && cp ${BUILD_DIR}/sse2neon/src/external_sse2neon/sse2neon.h ${LIBDIR}/sse2neon
INSTALL_DIR ${LIBDIR}/sse2neon
)
endif()


@@ -21,10 +21,9 @@ set(SSL_PATCH_CMD echo .)
if(APPLE)
set(SSL_OS_COMPILER "blender-darwin-${CMAKE_OSX_ARCHITECTURES}")
set(SSL_PATCH_CMD ${PATCH_CMD} --verbose -p 0 -d ${BUILD_DIR}/ssl/src/external_ssl < ${PATCH_DIR}/ssl.diff)
else()
if(BLENDER_PLATFORM_ARM)
set(SSL_OS_COMPILER "blender-linux-aarch64")
elseif("${CMAKE_SIZEOF_VOID_P}" EQUAL "8")
if("${CMAKE_SIZEOF_VOID_P}" EQUAL "8")
set(SSL_EXTRA_ARGS enable-ec_nistp_64_gcc_128)
set(SSL_OS_COMPILER "blender-linux-x86_64")
else()


@@ -8,11 +8,6 @@ my %targets = (
inherit_from => [ "linux-x86_64" ],
cflags => add("-fPIC"),
},
"blender-linux-aarch64" => {
inherit_from => [ "linux-aarch64" ],
cxxflags => add("-fPIC"),
cflags => add("-fPIC"),
},
"blender-darwin-x86_64" => {
inherit_from => [ "darwin64-x86_64-cc" ],
cflags => add("-fPIC"),


@@ -21,8 +21,6 @@ if(WIN32)
-DTBB_BUILD_TBBMALLOC=On
-DTBB_BUILD_TBBMALLOC_PROXY=On
-DTBB_BUILD_STATIC=Off
-DTBB_BUILD_TESTS=Off
-DCMAKE_DEBUG_POSTFIX=_debug
)
set(TBB_LIBRARY tbb)
set(TBB_STATIC_LIBRARY Off)
@@ -32,7 +30,6 @@ else()
-DTBB_BUILD_TBBMALLOC=On
-DTBB_BUILD_TBBMALLOC_PROXY=Off
-DTBB_BUILD_STATIC=On
-DTBB_BUILD_TESTS=Off
)
set(TBB_LIBRARY tbb_static)
set(TBB_STATIC_LIBRARY On)
@@ -45,7 +42,7 @@ ExternalProject_Add(external_tbb
URL_HASH ${TBB_HASH_TYPE}=${TBB_HASH}
PREFIX ${BUILD_DIR}/tbb
PATCH_COMMAND COMMAND ${CMAKE_COMMAND} -E copy ${PATCH_DIR}/cmakelists_tbb.txt ${BUILD_DIR}/tbb/src/external_tbb/CMakeLists.txt &&
${CMAKE_COMMAND} -E copy ${BUILD_DIR}/tbb/src/external_tbb/build/vs2013/version_string.ver ${BUILD_DIR}/tbb/src/external_tbb/build/version_string.ver.in &&
${CMAKE_COMMAND} -E copy ${BUILD_DIR}/tbb/src/external_tbb/build/vs2013/version_string.ver ${BUILD_DIR}/tbb/src/external_tbb/src/tbb/version_string.ver &&
${PATCH_CMD} -p 1 -d ${BUILD_DIR}/tbb/src/external_tbb < ${PATCH_DIR}/tbb.diff
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/tbb ${DEFAULT_CMAKE_FLAGS} ${TBB_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/tbb
@@ -56,17 +53,17 @@ if(WIN32)
ExternalProject_Add_Step(external_tbb after_install
# findtbb.cmake in some deps *NEEDS* to find tbb_debug.lib even if they are not going to use it
# to make that test pass, we place a copy with the right name in the lib folder.
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb.lib ${LIBDIR}/tbb/lib/tbb_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.lib ${LIBDIR}/tbb/lib/tbbmalloc_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/bin/tbb.dll ${LIBDIR}/tbb/bin/tbb_debug.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/bin/tbbmalloc.dll ${LIBDIR}/tbb/bin/tbbmalloc_debug.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb.lib ${HARVEST_TARGET}/tbb/lib/tbb_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb.dll ${HARVEST_TARGET}/tbb/lib/tbb_debug.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.dll ${HARVEST_TARGET}/tbb/lib/tbbmalloc_debug.dll
# Normal collection of build artifacts
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb.lib ${HARVEST_TARGET}/tbb/lib/tbb.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/bin/tbb.dll ${HARVEST_TARGET}/tbb/bin/tbb.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb.dll ${HARVEST_TARGET}/tbb/lib/tbb.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/bin/tbbmalloc.dll ${HARVEST_TARGET}/tbb/bin/tbbmalloc.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.dll ${HARVEST_TARGET}/tbb/lib/tbbmalloc.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc_proxy.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/bin/tbbmalloc_proxy.dll ${HARVEST_TARGET}/tbb/bin/tbbmalloc_proxy.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.dll ${HARVEST_TARGET}/tbb/lib/tbbmalloc_proxy.dll
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/tbb/include/ ${HARVEST_TARGET}/tbb/include/
DEPENDEES install
)
@@ -77,12 +74,11 @@ if(WIN32)
# to make that test pass, we place a copy with the right name in the lib folder.
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb_debug.lib ${LIBDIR}/tbb/lib/tbb.lib
# Normal collection of build artifacts
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb_debug.lib ${HARVEST_TARGET}/tbb/lib/tbb_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/bin/tbb_debug.dll ${HARVEST_TARGET}/tbb/bin/tbb_debug.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_debug.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy_debug.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc_proxy_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/bin/tbbmalloc_debug.dll ${HARVEST_TARGET}/tbb/bin/tbbmalloc_debug.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/bin/tbbmalloc_proxy_debug.dll ${HARVEST_TARGET}/tbb/bin/tbbmalloc_proxy_debug.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb_debug.lib ${HARVEST_TARGET}/tbb/lib/debug/tbb_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb_debug.dll ${HARVEST_TARGET}/tbb/lib/debug/tbb_debug.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc_proxy_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.dll ${HARVEST_TARGET}/tbb/lib/debug/tbbmalloc.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.dll ${HARVEST_TARGET}/tbb/lib/debug/tbbmalloc_proxy.dll
DEPENDEES install
)
endif()


@@ -45,6 +45,7 @@ ExternalProject_Add(external_tiff
add_dependencies(
external_tiff
external_zlib
external_jpeg
)
if(WIN32 AND BUILD_MODE STREQUAL Debug)


@@ -16,11 +16,20 @@
#
# ***** END GPL LICENSE BLOCK *****
set(ZLIB_VERSION 1.2.11)
# CPE's are used to identify dependencies, for more information on what they
# are please see https://nvd.nist.gov/products/cpe
#
# We use them in combination with cve-bin-tool to scan for known security issues.
#
# Not all of our dependencies are currently in the nvd database so not all
# dependencies have one assigned.
set(ZLIB_VERSION 1.2.13)
set(ZLIB_URI https://zlib.net/zlib-${ZLIB_VERSION}.tar.gz)
set(ZLIB_HASH 1c9f62f0778697a09d36121ead88e08e)
set(ZLIB_HASH 9b8aa094c4e5765dabf4da391f00d15c)
set(ZLIB_HASH_TYPE MD5)
set(ZLIB_FILE zlib-${ZLIB_VERSION}.tar.gz)
set(ZLIB_CPE "cpe:2.3:a:zlib:zlib:${ZLIB_VERSION}:*:*:*:*:*:*:*")
set(OPENAL_VERSION 1.20.1)
set(OPENAL_URI http://openal-soft.org/openal-releases/openal-soft-${OPENAL_VERSION}.tar.bz2)
@@ -33,12 +42,14 @@ set(PNG_URI http://prdownloads.sourceforge.net/libpng/libpng-${PNG_VERSION}.tar.
set(PNG_HASH 505e70834d35383537b6491e7ae8641f1a4bed1876dbfe361201fc80868d88ca)
set(PNG_HASH_TYPE SHA256)
set(PNG_FILE libpng-${PNG_VERSION}.tar.xz)
set(PNG_CPE "cpe:2.3:a:libpng:libpng:${PNG_VERSION}:*:*:*:*:*:*:*")
set(JPEG_VERSION 2.0.4)
set(JPEG_URI https://github.com/libjpeg-turbo/libjpeg-turbo/archive/${JPEG_VERSION}.tar.gz)
set(JPEG_HASH 44c43e4a9fb352f47090804529317c88)
set(JPEG_HASH_TYPE MD5)
set(JPEG_FILE libjpeg-turbo-${JPEG_VERSION}.tar.gz)
set(JPEG_CPE "cpe:2.3:a:d.r.commander:libjpeg-turbo:${JPEG_VERSION}:*:*:*:*:*:*:*")
set(BOOST_VERSION 1.73.0)
set(BOOST_VERSION_NODOTS 1_73_0)
@@ -47,6 +58,7 @@ set(BOOST_URI https://boostorg.jfrog.io/artifactory/main/release/${BOOST_VERSION
set(BOOST_HASH 4036cd27ef7548b8d29c30ea10956196)
set(BOOST_HASH_TYPE MD5)
set(BOOST_FILE boost_${BOOST_VERSION_NODOTS}.tar.gz)
set(BOOST_CPE "cpe:2.3:a:boost:boost:${BOOST_VERSION}:*:*:*:*:*:*:*")
# Using old version as recommended by OpenVDB build documentation.
set(BLOSC_VERSION 1.5.0)
@@ -54,6 +66,7 @@ set(BLOSC_URI https://github.com/Blosc/c-blosc/archive/v${BLOSC_VERSION}.tar.gz)
set(BLOSC_HASH 6e4a49c8c06f05aa543f3312cfce3d55)
set(BLOSC_HASH_TYPE MD5)
set(BLOSC_FILE blosc-${BLOSC_VERSION}.tar.gz)
set(BLOSC_CPE "cpe:2.3:a:c-blosc2_project:c-blosc2:${BLOSC_VERSION}:*:*:*:*:*:*:*")
set(PTHREADS_VERSION 3.0.0)
set(PTHREADS_URI http://prdownloads.sourceforge.net/pthreads4w/pthreads4w-code-v${PTHREADS_VERSION}.zip)
@@ -61,11 +74,12 @@ set(PTHREADS_HASH f3bf81bb395840b3446197bcf4ecd653)
set(PTHREADS_HASH_TYPE MD5)
set(PTHREADS_FILE pthreads4w-code-${PTHREADS_VERSION}.zip)
set(OPENEXR_VERSION 2.5.5)
set(OPENEXR_VERSION 2.5.8)
set(OPENEXR_URI https://github.com/AcademySoftwareFoundation/openexr/archive/v${OPENEXR_VERSION}.tar.gz)
set(OPENEXR_HASH 85e8a979092c9055d10ed103062d31a0)
set(OPENEXR_HASH_TYPE MD5)
set(OPENEXR_FILE openexr-${OPENEXR_VERSION}.tar.gz)
set(OPENEXR_CPE "cpe:2.3:a:openexr:openexr:${OPENEXR_VERSION}:*:*:*:*:*:*:*")
if(WIN32)
# Openexr started appending _d on its own so now
@@ -83,11 +97,12 @@ else()
set(OPENEXR_VERSION_POSTFIX)
endif()
set(FREETYPE_VERSION 2.10.2)
set(FREETYPE_VERSION 2.12.1)
set(FREETYPE_URI http://prdownloads.sourceforge.net/freetype/freetype-${FREETYPE_VERSION}.tar.gz)
set(FREETYPE_HASH b1cb620e4c875cd4d1bfa04945400945)
set(FREETYPE_HASH 8bc5c9c9df7ac12c504f8918552a7cf2)
set(FREETYPE_HASH_TYPE MD5)
set(FREETYPE_FILE freetype-${FREETYPE_VERSION}.tar.gz)
SET(FREETYPE_CPE "cpe:2.3:a:freetype:freetype:${FREETYPE_VERSION}:*:*:*:*:*:*:*")
set(GLEW_VERSION 1.13.0)
set(GLEW_URI http://prdownloads.sourceforge.net/glew/glew/${GLEW_VERSION}/glew-${GLEW_VERSION}.tgz)
@@ -106,6 +121,7 @@ set(ALEMBIC_URI https://github.com/alembic/alembic/archive/${ALEMBIC_VERSION}.ta
set(ALEMBIC_HASH effcc86e42fe6605588e3de57bde6677)
set(ALEMBIC_HASH_TYPE MD5)
set(ALEMBIC_FILE alembic-${ALEMBIC_VERSION}.tar.gz)
SET(FREETYPE_CPE "cpe:2.3:a:freetype:freetype:${FREETYPE_VERSION}:*:*:*:*:*:*:*")
# hash is for 3.1.2
set(GLFW_GIT_UID 30306e54705c3adae9fe082c816a3be71963485c)
@@ -139,6 +155,7 @@ set(SDL_URI https://www.libsdl.org/release/SDL2-${SDL_VERSION}.tar.gz)
set(SDL_HASH 783b6f2df8ff02b19bb5ce492b99c8ff)
set(SDL_HASH_TYPE MD5)
set(SDL_FILE SDL2-${SDL_VERSION}.tar.gz)
set(SDL_CPE "cpe:2.3:a:libsdl:sdl:${SDL_VERSION}:*:*:*:*:*:*:*")
set(OPENCOLLADA_VERSION v1.6.68)
set(OPENCOLLADA_URI https://github.com/KhronosGroup/OpenCOLLADA/archive/${OPENCOLLADA_VERSION}.tar.gz)
@@ -152,56 +169,68 @@ set(OPENCOLORIO_HASH 1a2e3478b6cd9a1549f24e1b2205e3f0)
set(OPENCOLORIO_HASH_TYPE MD5)
set(OPENCOLORIO_FILE OpenColorIO-${OPENCOLORIO_VERSION}.tar.gz)
set(LLVM_VERSION 12.0.0)
set(LLVM_URI https://github.com/llvm/llvm-project/releases/download/llvmorg-${LLVM_VERSION}/llvm-project-${LLVM_VERSION}.src.tar.xz)
set(LLVM_HASH 5a4fab4d7fc84aefffb118ac2c8a4fc0)
set(LLVM_HASH_TYPE MD5)
set(LLVM_FILE llvm-project-${LLVM_VERSION}.src.tar.xz)
if(APPLE AND ("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64"))
# Newer version required by ISPC with arm support.
set(LLVM_VERSION 11.0.1)
set(LLVM_URI https://github.com/llvm/llvm-project/releases/download/llvmorg-${LLVM_VERSION}/llvm-project-${LLVM_VERSION}.src.tar.xz)
set(LLVM_HASH e700af40ab83463e4e9ab0ba3708312e)
set(LLVM_HASH_TYPE MD5)
set(LLVM_FILE llvm-project-${LLVM_VERSION}.src.tar.xz)
if(APPLE)
# Cloth physics test is crashing due to this bug:
# https://bugs.llvm.org/show_bug.cgi?id=50579
set(OPENMP_VERSION 9.0.1)
set(OPENMP_URI https://github.com/llvm/llvm-project/releases/download/llvmorg-${OPENMP_VERSION}/openmp-${OPENMP_VERSION}.src.tar.xz)
set(OPENMP_HASH 6eade16057edbdecb3c4eef9daa2bfcf)
set(OPENMP_HASH_TYPE MD5)
set(OPENMP_FILE openmp-${OPENMP_VERSION}.src.tar.xz)
else()
set(OPENMP_VERSION ${LLVM_VERSION})
set(OPENMP_HASH ac48ce3e4582ccb82f81ab59eb3fc9dc)
endif()
set(OPENMP_URI https://github.com/llvm/llvm-project/releases/download/llvmorg-${OPENMP_VERSION}/openmp-${OPENMP_VERSION}.src.tar.xz)
set(OPENMP_HASH_TYPE MD5)
set(OPENMP_FILE openmp-${OPENMP_VERSION}.src.tar.xz)
set(LLVM_VERSION 9.0.1)
set(LLVM_URI https://github.com/llvm/llvm-project/releases/download/llvmorg-${LLVM_VERSION}/llvm-project-${LLVM_VERSION}.tar.xz)
set(LLVM_HASH b4268e733dfe352960140dc07ef2efcb)
set(LLVM_HASH_TYPE MD5)
set(LLVM_FILE llvm-project-${LLVM_VERSION}.tar.xz)
set(OPENIMAGEIO_VERSION 2.2.15.1)
set(OPENMP_URI https://github.com/llvm/llvm-project/releases/download/llvmorg-${LLVM_VERSION}/openmp-${LLVM_VERSION}.src.tar.xz)
set(OPENMP_HASH 6eade16057edbdecb3c4eef9daa2bfcf)
set(OPENMP_HASH_TYPE MD5)
set(OPENMP_FILE openmp-${LLVM_VERSION}.src.tar.xz)
endif()
set(LLVM_CPE "cpe:2.3:a:llvm:compiler:${LLVM_VERSION}:*:*:*:*:*:*:*")
set(OPENIMAGEIO_VERSION 2.1.15.0)
set(OPENIMAGEIO_URI https://github.com/OpenImageIO/oiio/archive/Release-${OPENIMAGEIO_VERSION}.tar.gz)
set(OPENIMAGEIO_HASH 3db5c5f0b3dc91597c75e5df09eb9072)
set(OPENIMAGEIO_HASH f03aa5e3ac4795af04771ee4146e9832)
set(OPENIMAGEIO_HASH_TYPE MD5)
set(OPENIMAGEIO_FILE OpenImageIO-${OPENIMAGEIO_VERSION}.tar.gz)
set(TIFF_VERSION 4.1.0)
set(TIFF_VERSION 4.4.0)
set(TIFF_URI http://download.osgeo.org/libtiff/tiff-${TIFF_VERSION}.tar.gz)
set(TIFF_HASH 2165e7aba557463acc0664e71a3ed424)
set(TIFF_HASH 376f17f189e9d02280dfe709b2b2bbea)
set(TIFF_HASH_TYPE MD5)
set(TIFF_FILE tiff-${TIFF_VERSION}.tar.gz)
set(TIFF_CPE "cpe:2.3:a:libtiff:libtiff:${TIFF_VERSION}:*:*:*:*:*:*:*")
set(OSL_VERSION 1.11.14.1)
set(OSL_VERSION 1.11.10.0)
set(OSL_URI https://github.com/imageworks/OpenShadingLanguage/archive/Release-${OSL_VERSION}.tar.gz)
set(OSL_HASH 1abd7ce40481771a9fa937f19595d2f2)
set(OSL_HASH dfdc23597aeef083832cbada62211756)
set(OSL_HASH_TYPE MD5)
set(OSL_FILE OpenShadingLanguage-${OSL_VERSION}.tar.gz)
set(PYTHON_VERSION 3.9.7)
set(PYTHON_VERSION 3.9.15)
set(PYTHON_SHORT_VERSION 3.9)
set(PYTHON_SHORT_VERSION_NO_DOTS 39)
set(PYTHON_URI https://www.python.org/ftp/python/${PYTHON_VERSION}/Python-${PYTHON_VERSION}.tar.xz)
set(PYTHON_HASH fddb060b483bc01850a3f412eea1d954)
set(PYTHON_HASH f0dc9000312abeb16de4eccce9a870ab)
set(PYTHON_HASH_TYPE MD5)
set(PYTHON_FILE Python-${PYTHON_VERSION}.tar.xz)
set(PYTHON_CPE "cpe:2.3:a:python:python:${PYTHON_VERSION}:-:*:*:*:*:*:*")
set(TBB_VERSION 2020_U2)
set(TBB_YEAR 2020)
set(TBB_VERSION ${TBB_YEAR}_U2)
set(TBB_URI https://github.com/oneapi-src/oneTBB/archive/${TBB_VERSION}.tar.gz)
set(TBB_HASH 1b711ae956524855088df3bbf5ec65dc)
set(TBB_HASH_TYPE MD5)
set(TBB_FILE oneTBB-${TBB_VERSION}.tar.gz)
set(TBB_CPE "cpe:2.3:a:intel:threading_building_blocks:${TBB_YEAR}:*:*:*:*:*:*:*")
set(OPENVDB_VERSION 8.0.1)
set(OPENVDB_URI https://github.com/AcademySoftwareFoundation/openvdb/archive/v${OPENVDB_VERSION}.tar.gz)
@@ -209,44 +238,47 @@ set(OPENVDB_HASH 01b490be16cc0e15c690f9a153c21461)
set(OPENVDB_HASH_TYPE MD5)
set(OPENVDB_FILE openvdb-${OPENVDB_VERSION}.tar.gz)
set(NANOVDB_GIT_UID dc37d8a631922e7bef46712947dc19b755f3e841)
set(NANOVDB_GIT_UID e62f7a0bf1e27397223c61ddeaaf57edf111b77f)
set(NANOVDB_URI https://github.com/AcademySoftwareFoundation/openvdb/archive/${NANOVDB_GIT_UID}.tar.gz)
set(NANOVDB_HASH e7b9e863ec2f3b04ead171dec2322807)
set(NANOVDB_HASH 90919510bc6ccd630fedc56f748cb199)
set(NANOVDB_HASH_TYPE MD5)
set(NANOVDB_FILE nano-vdb-${NANOVDB_GIT_UID}.tar.gz)
set(IDNA_VERSION 3.2)
set(CHARSET_NORMALIZER_VERSION 2.0.6)
set(URLLIB3_VERSION 1.26.7)
set(CERTIFI_VERSION 2021.10.8)
set(REQUESTS_VERSION 2.26.0)
set(CYTHON_VERSION 0.29.24)
set(ZSTANDARD_VERSION 0.15.2 )
set(IDNA_VERSION 2.10)
set(CHARDET_VERSION 4.0.0)
set(URLLIB3_VERSION 1.26.3)
set(URLLIB3_CPE "cpe:2.3:a:urllib3:urllib3:${URLLIB3_VERSION}:*:*:*:*:*:*:*")
set(CERTIFI_VERSION 2020.12.5)
set(REQUESTS_VERSION 2.25.1)
set(CYTHON_VERSION 0.29.21)
set(NUMPY_VERSION 1.21.2)
set(NUMPY_SHORT_VERSION 1.21)
set(NUMPY_VERSION 1.22.0)
set(NUMPY_SHORT_VERSION 1.22)
set(NUMPY_URI https://github.com/numpy/numpy/releases/download/v${NUMPY_VERSION}/numpy-${NUMPY_VERSION}.zip)
set(NUMPY_HASH 5638d5dae3ca387be562912312db842e)
set(NUMPY_HASH 252de134862a27bd66705d29622edbfe)
set(NUMPY_HASH_TYPE MD5)
set(NUMPY_FILE numpy-${NUMPY_VERSION}.zip)
set(NUMPY_CPE "cpe:2.3:a:numpy:numpy:${NUMPY_VERSION}:*:*:*:*:*:*:*")
set(LAME_VERSION 3.100)
set(LAME_URI http://downloads.sourceforge.net/project/lame/lame/3.100/lame-${LAME_VERSION}.tar.gz)
set(LAME_HASH 83e260acbe4389b54fe08e0bdbf7cddb)
set(LAME_HASH_TYPE MD5)
set(LAME_FILE lame-${LAME_VERSION}.tar.gz)
set(LAME_CPE "cpe:2.3:a:lame_project:lame:${LAME_VERSION}:*:*:*:*:*:*:*")
set(OGG_VERSION 1.3.4)
set(OGG_VERSION 1.3.5)
set(OGG_URI http://downloads.xiph.org/releases/ogg/libogg-${OGG_VERSION}.tar.gz)
set(OGG_HASH fe5670640bd49e828d64d2879c31cb4dde9758681bb664f9bdbf159a01b0c76e)
set(OGG_HASH 0eb4b4b9420a0f51db142ba3f9c64b333f826532dc0f48c6410ae51f4799b664)
set(OGG_HASH_TYPE SHA256)
set(OGG_FILE libogg-${OGG_VERSION}.tar.gz)
set(VORBIS_VERSION 1.3.6)
set(VORBIS_VERSION 1.3.7)
set(VORBIS_URI http://downloads.xiph.org/releases/vorbis/libvorbis-${VORBIS_VERSION}.tar.gz)
set(VORBIS_HASH 6ed40e0241089a42c48604dc00e362beee00036af2d8b3f46338031c9e0351cb)
set(VORBIS_HASH 0e982409a9c3fc82ee06e08205b1355e5c6aa4c36bca58146ef399621b0ce5ab)
set(VORBIS_HASH_TYPE SHA256)
set(VORBIS_FILE libvorbis-${VORBIS_VERSION}.tar.gz)
set(VORBIS_CPE "cpe:2.3:a:xiph.org:libvorbis:${VORBIS_VERSION}:*:*:*:*:*:*:*")
set(THEORA_VERSION 1.1.1)
set(THEORA_URI http://downloads.xiph.org/releases/theora/libtheora-${THEORA_VERSION}.tar.bz2)
@@ -254,17 +286,19 @@ set(THEORA_HASH b6ae1ee2fa3d42ac489287d3ec34c5885730b1296f0801ae577a35193d3affbc
set(THEORA_HASH_TYPE SHA256)
set(THEORA_FILE libtheora-${THEORA_VERSION}.tar.bz2)
set(FLAC_VERSION 1.3.3)
set(FLAC_VERSION 1.3.4)
set(FLAC_URI http://downloads.xiph.org/releases/flac/flac-${FLAC_VERSION}.tar.xz)
set(FLAC_HASH 213e82bd716c9de6db2f98bcadbc4c24c7e2efe8c75939a1a84e28539c4e1748)
set(FLAC_HASH 8ff0607e75a322dd7cd6ec48f4f225471404ae2730d0ea945127b1355155e737 )
set(FLAC_HASH_TYPE SHA256)
set(FLAC_FILE flac-${FLAC_VERSION}.tar.xz)
set(FLAC_CPE "cpe:2.3:a:flac_project:flac:${FLAC_VERSION}:*:*:*:*:*:*:*")
set(VPX_VERSION 1.8.2)
set(VPX_VERSION 1.11.0)
set(VPX_URI https://github.com/webmproject/libvpx/archive/v${VPX_VERSION}/libvpx-v${VPX_VERSION}.tar.gz)
set(VPX_HASH 8735d9fcd1a781ae6917f28f239a8aa358ce4864ba113ea18af4bb2dc8b474ac)
set(VPX_HASH 965e51c91ad9851e2337aebcc0f517440c637c506f3a03948062e3d5ea129a83)
set(VPX_HASH_TYPE SHA256)
set(VPX_FILE libvpx-v${VPX_VERSION}.tar.gz)
set(VPX_CPE "cpe:2.3:a:webmproject:libvpx:${VPX_VERSION}:*:*:*:*:*:*:*")
set(OPUS_VERSION 1.3.1)
set(OPUS_URI https://archive.mozilla.org/pub/opus/opus-${OPUS_VERSION}.tar.gz)
@@ -284,18 +318,20 @@ set(XVIDCORE_HASH abbdcbd39555691dd1c9b4d08f0a031376a3b211652c0d8b3b8aa9be1303ce
set(XVIDCORE_HASH_TYPE SHA256)
set(XVIDCORE_FILE xvidcore-${XVIDCORE_VERSION}.tar.gz)
set(OPENJPEG_VERSION 2.3.1)
set(OPENJPEG_SHORT_VERSION 2.3)
set(OPENJPEG_VERSION 2.5.0)
set(OPENJPEG_SHORT_VERSION 2.5)
set(OPENJPEG_URI https://github.com/uclouvain/openjpeg/archive/v${OPENJPEG_VERSION}.tar.gz)
set(OPENJPEG_HASH 63f5a4713ecafc86de51bfad89cc07bb788e9bba24ebbf0c4ca637621aadb6a9)
set(OPENJPEG_HASH 0333806d6adecc6f7a91243b2b839ff4d2053823634d4f6ed7a59bc87409122a)
set(OPENJPEG_HASH_TYPE SHA256)
set(OPENJPEG_FILE openjpeg-v${OPENJPEG_VERSION}.tar.gz)
set(OPENJPEG_CPE "cpe:2.3:a:uclouvain:openjpeg:${OPENJPEG_VERSION}:*:*:*:*:*:*:*")
set(FFMPEG_VERSION 4.4)
set(FFMPEG_VERSION 4.4.3)
set(FFMPEG_URI http://ffmpeg.org/releases/ffmpeg-${FFMPEG_VERSION}.tar.bz2)
set(FFMPEG_HASH 42093549751b582cf0f338a21a3664f52e0a9fbe0d238d3c992005e493607d0e)
set(FFMPEG_HASH_TYPE SHA256)
set(FFMPEG_HASH 695fad11f3baf27784e24cb0e977b65a)
set(FFMPEG_HASH_TYPE MD5)
set(FFMPEG_FILE ffmpeg-${FFMPEG_VERSION}.tar.bz2)
set(FFMPEG_CPE "cpe:2.3:a:ffmpeg:ffmpeg:${FFMPEG_VERSION}:*:*:*:*:*:*:*")
set(FFTW_VERSION 3.3.8)
set(FFTW_URI http://www.fftw.org/fftw-${FFTW_VERSION}.tar.gz)
@@ -309,17 +345,19 @@ set(ICONV_HASH 7d2a800b952942bb2880efb00cfd524c)
set(ICONV_HASH_TYPE MD5)
set(ICONV_FILE libiconv-${ICONV_VERSION}.tar.gz)
set(SNDFILE_VERSION 1.0.28)
set(SNDFILE_URI http://www.mega-nerd.com/libsndfile/files/libsndfile-${SNDFILE_VERSION}.tar.gz)
set(SNDFILE_HASH 646b5f98ce89ac60cdb060fcd398247c)
set(SNDFILE_VERSION 1.1.0)
set(SNDFILE_URI https://github.com/libsndfile/libsndfile/releases/download/1.1.0/libsndfile-${SNDFILE_VERSION}.tar.xz)
set(SNDFILE_HASH e63dead2b4f0aaf323687619d007ee6a)
set(SNDFILE_HASH_TYPE MD5)
set(SNDFILE_FILE libsndfile-${SNDFILE_VERSION}.tar.gz)
set(SNDFILE_CPE "cpe:2.3:a:libsndfile_project:libsndfile:${SNDFILE_VERSION}:*:*:*:*:*:*:*")
set(WEBP_VERSION 0.6.1)
set(WEBP_VERSION 1.2.2)
set(WEBP_URI https://storage.googleapis.com/downloads.webmproject.org/releases/webp/libwebp-${WEBP_VERSION}.tar.gz)
set(WEBP_HASH b49ce9c3e3e9acae4d91bca44bb85a72)
set(WEBP_HASH b5e2e414a8adee4c25fe56b18dd9c549)
set(WEBP_HASH_TYPE MD5)
set(WEBP_FILE libwebp-${WEBP_VERSION}.tar.gz)
set(WEBP_CPE "cpe:2.3:a:webmproject:libwebp:${WEBP_VERSION}:*:*:*:*:*:*:*")
set(SPNAV_VERSION 0.2.3)
set(SPNAV_URI http://downloads.sourceforge.net/project/spacenav/spacenav%20library%20%28SDK%29/libspnav%20${SPNAV_VERSION}/libspnav-${SPNAV_VERSION}.tar.gz)
@@ -333,64 +371,65 @@ set(JEMALLOC_HASH 3d41fbf006e6ebffd489bdb304d009ae)
set(JEMALLOC_HASH_TYPE MD5)
set(JEMALLOC_FILE jemalloc-${JEMALLOC_VERSION}.tar.bz2)
set(XML2_VERSION 2.9.10)
set(XML2_URI http://xmlsoft.org/sources/libxml2-${XML2_VERSION}.tar.gz)
set(XML2_HASH 10942a1dc23137a8aa07f0639cbfece5)
set(XML2_VERSION 2.10.3)
set(XML2_URI https://download.gnome.org/sources/libxml2/2.10/libxml2-${XML2_VERSION}.tar.xz)
set(XML2_HASH f9edac7fac232b3657a003fd9a5bbe42)
set(XML2_HASH_TYPE MD5)
set(XML2_FILE libxml2-${XML2_VERSION}.tar.gz)
set(TINYXML_VERSION 2_6_2)
set(TINYXML_VERSION_DOTS 2.6.2)
set(TINYXML_URI https://nchc.dl.sourceforge.net/project/tinyxml/tinyxml/${TINYXML_VERSION_DOTS}/tinyxml_${TINYXML_VERSION}.tar.gz)
set(TINYXML_HASH c1b864c96804a10526540c664ade67f0)
set(TINYXML_HASH_TYPE MD5)
set(TINYXML_FILE tinyxml_${TINYXML_VERSION}.tar.gz)
set(XML2_FILE libxml2-${XML2_VERSION}.tar.xz)
set(XML2_CPE "cpe:2.3:a:xmlsoft:libxml2:${XML2_VERSION}:*:*:*:*:*:*:*")
set(YAMLCPP_VERSION 0.6.3)
set(YAMLCPP_URI https://codeload.github.com/jbeder/yaml-cpp/tar.gz/yaml-cpp-${YAMLCPP_VERSION})
set(YAMLCPP_HASH b45bf1089a382e81f6b661062c10d0c2)
set(YAMLCPP_HASH_TYPE MD5)
set(YAMLCPP_FILE yaml-cpp-${YAMLCPP_VERSION}.tar.gz)
set(YAMLCPP "cpe:2.3:a:yaml-cpp_project:yaml-cpp:${YAMLCPP_VERSION}:*:*:*:*:*:*:*")
set(EXPAT_VERSION 2_2_10)
set(EXPAT_VERSION 2_5_0)
set(EXPAT_VERSION_DOTS 2.5.0)
set(EXPAT_URI https://github.com/libexpat/libexpat/archive/R_${EXPAT_VERSION}.tar.gz)
set(EXPAT_HASH 7ca5f09959fcb9a57618368deb627b9f)
set(EXPAT_HASH d375fa3571c0abb945873f5061a8f2e2)
set(EXPAT_HASH_TYPE MD5)
set(EXPAT_FILE libexpat-${EXPAT_VERSION}.tar.gz)
set(EXPAT_CPE "cpe:2.3:a:libexpat_project:libexpat:${EXPAT_VERSION_DOTS}:*:*:*:*:*:*:*")
set(PUGIXML_VERSION 1.10)
set(PUGIXML_URI https://github.com/zeux/pugixml/archive/v${PUGIXML_VERSION}.tar.gz)
set(PUGIXML_HASH 0c208b0664c7fb822bf1b49ad035e8fd)
set(PUGIXML_HASH_TYPE MD5)
set(PUGIXML_FILE pugixml-${PUGIXML_VERSION}.tar.gz)
set(PUGIXML_CPE "cpe:2.3:a:pugixml_project:pugixml:${PUGIXML_VERSION}:*:*:*:*:*:*:*")
set(FLEXBISON_VERSION 2.5.24)
set(FLEXBISON_VERSION 2.5.5)
set(FLEXBISON_URI http://prdownloads.sourceforge.net/winflexbison/win_flex_bison-${FLEXBISON_VERSION}.zip)
set(FLEXBISON_HASH 6b549d43e34ece0e8ed05af92daa31c4)
set(FLEXBISON_HASH d87a3938194520d904013abef3df10ce)
set(FLEXBISON_HASH_TYPE MD5)
set(FLEXBISON_FILE win_flex_bison-${FLEXBISON_VERSION}.zip)
set(FLEX_VERSION 2.6.4)
set(FLEX_URI https://github.com/westes/flex/releases/download/v${FLEX_VERSION}/flex-${FLEX_VERSION}.tar.gz)
set(FLEX_HASH 2882e3179748cc9f9c23ec593d6adc8d)
set(FLEX_HASH_TYPE MD5)
set(FLEX_FILE flex-${FLEX_VERSION}.tar.gz)
# Libraries to keep Python modules static on Linux.
# NOTE: bzip.org domain does no longer belong to BZip 2 project, so we download
# sources from Debian packaging.
#
# NOTE 2: This will *HAVE* to match the version python ships on windows which
# is hardcoded in pythons PCbuild/get_externals.bat. For compliance reasons there
# can be no exceptions to this.
set(BZIP2_VERSION 1.0.8)
set(BZIP2_URI http://http.debian.net/debian/pool/main/b/bzip2/bzip2_${BZIP2_VERSION}.orig.tar.gz)
set(BZIP2_HASH ab5a03176ee106d3f0fa90e381da478ddae405918153cca248e682cd0c4a2269)
set(BZIP2_HASH_TYPE SHA256)
set(BZIP2_FILE bzip2_${BZIP2_VERSION}.orig.tar.gz)
set(BZIP2_CPE "cpe:2.3:a:bzip:bzip2:${BZIP2_VERSION}:*:*:*:*:*:*:*")
# NOTE: This will *HAVE* to match the version python ships on windows which
# is hardcoded in pythons PCbuild/get_externals.bat. For compliance reasons there
# can be no exceptions to this.
set(FFI_VERSION 3.3)
set(FFI_URI https://sourceware.org/pub/libffi/libffi-${FFI_VERSION}.tar.gz)
set(FFI_HASH 72fba7922703ddfa7a028d513ac15a85c8d54c8d67f55fa5a4802885dc652056)
set(FFI_HASH_TYPE SHA256)
set(FFI_FILE libffi-${FFI_VERSION}.tar.gz)
set(FFI_CPE "cpe:2.3:a:libffi_project:libffi:${FFI_VERSION}:*:*:*:*:*:*:*")
set(LZMA_VERSION 5.2.5)
set(LZMA_URI https://tukaani.org/xz/xz-${LZMA_VERSION}.tar.bz2)
@@ -398,26 +437,26 @@ set(LZMA_HASH 5117f930900b341493827d63aa910ff5e011e0b994197c3b71c08a20228a42df)
set(LZMA_HASH_TYPE SHA256)
set(LZMA_FILE xz-${LZMA_VERSION}.tar.bz2)
if(BLENDER_PLATFORM_ARM)
# Need at least 1.1.1i for aarch64 support (https://github.com/openssl/openssl/pull/13218)
set(SSL_VERSION 1.1.1i)
set(SSL_URI https://www.openssl.org/source/openssl-${SSL_VERSION}.tar.gz)
set(SSL_HASH e8be6a35fe41d10603c3cc635e93289ed00bf34b79671a3a4de64fcee00d5242)
set(SSL_HASH_TYPE SHA256)
set(SSL_FILE openssl-${SSL_VERSION}.tar.gz)
else()
set(SSL_VERSION 1.1.1g)
set(SSL_URI https://www.openssl.org/source/openssl-${SSL_VERSION}.tar.gz)
set(SSL_HASH ddb04774f1e32f0c49751e21b67216ac87852ceb056b75209af2443400636d46)
set(SSL_HASH_TYPE SHA256)
set(SSL_FILE openssl-${SSL_VERSION}.tar.gz)
endif()
# NOTE: This will *HAVE* to match the version python ships on windows which
# is hardcoded in pythons PCbuild/get_externals.bat. For compliance reasons there
# can be no exceptions to this.
set(SSL_VERSION 1.1.1q)
set(SSL_URI https://www.openssl.org/source/openssl-${SSL_VERSION}.tar.gz)
set(SSL_HASH d7939ce614029cdff0b6c20f0e2e5703158a489a72b2507b8bd51bf8c8fd10ca)
set(SSL_HASH_TYPE SHA256)
set(SSL_FILE openssl-${SSL_VERSION}.tar.gz)
set(SSL_CPE "cpe:2.3:a:openssl:openssl:${SSL_VERSION}:*:*:*:*:*:*:*")
set(SQLITE_VERSION 3.31.1)
set(SQLITE_URI https://www.sqlite.org/2018/sqlite-src-3240000.zip)
set(SQLITE_HASH fb558c49ee21a837713c4f1e7e413309aabdd9c7)
# Note: This will *HAVE* to match the version python ships on windows which
# is hardcoded in pythons PCbuild/get_externals.bat for compliance reasons there
# can be no exceptions to this.
set(SQLITE_VERSION 3.37.2)
set(SQLLITE_LONG_VERSION 3370200)
set(SQLITE_URI https://www.sqlite.org/2022/sqlite-autoconf-${SQLLITE_LONG_VERSION}.tar.gz)
set(SQLITE_HASH e56faacadfb4154f8fbd0f2a3f827d13706b70a1)
set(SQLITE_HASH_TYPE SHA1)
set(SQLITE_FILE sqlite-src-3240000.zip)
set(SQLITE_FILE sqlite-autoconf-${SQLLITE_LONG_VERSION}.tar.gz)
set(SQLITE_CPE "cpe:2.3:a:sqlite:sqlite:${SQLITE_VERSION}:*:*:*:*:*:*:*")
set(EMBREE_VERSION 3.10.0)
set(EMBREE_URI https://github.com/embree/embree/archive/v${EMBREE_VERSION}.zip)
@@ -432,9 +471,9 @@ set(USD_HASH 1dd1e2092d085ed393c1f7c450a4155a)
set(USD_HASH_TYPE MD5)
set(USD_FILE usd-v${USD_VERSION}.tar.gz)
set(OIDN_VERSION 1.4.1)
set(OIDN_VERSION 1.3.0)
set(OIDN_URI https://github.com/OpenImageDenoise/oidn/releases/download/v${OIDN_VERSION}/oidn-${OIDN_VERSION}.src.tar.gz)
set(OIDN_HASH df4007b0ab93b1c41cdf223b075d01c0)
set(OIDN_HASH 301a5a0958d375a942014df0679b9270)
set(OIDN_HASH_TYPE MD5)
set(OIDN_FILE oidn-${OIDN_VERSION}.src.tar.gz)
@@ -444,47 +483,53 @@ set(LIBGLU_HASH 151aef599b8259efe9acd599c96ea2a3)
set(LIBGLU_HASH_TYPE MD5)
set(LIBGLU_FILE glu-${LIBGLU_VERSION}.tar.xz)
set(MESA_VERSION 21.1.5)
set(MESA_VERSION 20.3.4)
set(MESA_URI ftp://ftp.freedesktop.org/pub/mesa/mesa-${MESA_VERSION}.tar.xz)
set(MESA_HASH 022c7293074aeeced2278c872db4fa693147c70f8595b076cf3f1ef81520766d)
set(MESA_HASH_TYPE SHA256)
set(MESA_HASH 556338446aef8ae947a789b3e0b5e056)
set(MESA_HASH_TYPE MD5)
set(MESA_FILE mesa-${MESA_VERSION}.tar.xz)
set(MESA_CPE "cpe:2.3:a:mesa3d:mesa:${MESA_VERSION}:*:*:*:*:*:*:*")
set(NASM_VERSION 2.15.02)
set(NASM_URI https://github.com/netwide-assembler/nasm/archive/nasm-${NASM_VERSION}.tar.gz)
set(NASM_HASH aded8b796c996a486a56e0515c83e414116decc3b184d88043480b32eb0a8589)
set(NASM_HASH_TYPE SHA256)
set(NASM_FILE nasm-${NASM_VERSION}.tar.gz)
set(NASM_PCE "cpe:2.3:a:nasm:nasm:${NASM_VERSION}:*:*:*:*:*:*:*")
set(XR_OPENXR_SDK_VERSION 1.0.17)
set(XR_OPENXR_SDK_VERSION 1.0.14)
set(XR_OPENXR_SDK_URI https://github.com/KhronosGroup/OpenXR-SDK/archive/release-${XR_OPENXR_SDK_VERSION}.tar.gz)
set(XR_OPENXR_SDK_HASH bf0fd8828837edff01047474e90013e1)
set(XR_OPENXR_SDK_HASH 0df6b2fd6045423451a77ff6bc3e1a75)
set(XR_OPENXR_SDK_HASH_TYPE MD5)
set(XR_OPENXR_SDK_FILE OpenXR-SDK-${XR_OPENXR_SDK_VERSION}.tar.gz)
set(WL_PROTOCOLS_VERSION 1.21)
set(WL_PROTOCOLS_FILE wayland-protocols-${WL_PROTOCOLS_VERSION}.tar.gz)
set(WL_PROTOCOLS_URI https://gitlab.freedesktop.org/wayland/wayland-protocols/-/archive/${WL_PROTOCOLS_VERSION}/${WL_PROTOCOLS_FILE})
set(WL_PROTOCOLS_HASH af5ca07e13517cdbab33504492cef54a)
set(WL_PROTOCOLS_HASH_TYPE MD5)
set(ISPC_VERSION v1.16.0)
set(ISPC_URI https://github.com/ispc/ispc/archive/${ISPC_VERSION}.tar.gz)
set(ISPC_HASH 2e3abedbc0ea9aaec17d6562c632454d)
set(ISPC_HASH_TYPE MD5)
set(ISPC_FILE ispc-${ISPC_VERSION}.tar.gz)
if(APPLE AND ("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64"))
# Unreleased version with macOS arm support.
set(ISPC_URI https://github.com/ispc/ispc/archive/f5949c055eb9eeb93696978a3da4bfb3a6a30b35.zip)
set(ISPC_HASH d382fea18d01dbd0cd05d9e1ede36d7d)
set(ISPC_HASH_TYPE MD5)
set(ISPC_FILE f5949c055eb9eeb93696978a3da4bfb3a6a30b35.zip)
else()
set(ISPC_VERSION v1.14.1)
set(ISPC_URI https://github.com/ispc/ispc/archive/${ISPC_VERSION}.tar.gz)
set(ISPC_HASH 968fbc8dfd16a60ba4e32d2e0e03ea7a)
set(ISPC_HASH_TYPE MD5)
set(ISPC_FILE ispc-${ISPC_VERSION}.tar.gz)
endif()
set(GMP_VERSION 6.2.0)
set(GMP_URI https://gmplib.org/download/gmp/gmp-${GMP_VERSION}.tar.xz)
set(GMP_HASH a325e3f09e6d91e62101e59f9bda3ec1)
set(GMP_HASH_TYPE MD5)
set(GMP_FILE gmp-${GMP_VERSION}.tar.xz)
set(GMP_CPE "cpe:2.3:a:gmplib:gmp:${GMP_VERSION}:*:*:*:*:*:*:*")
set(POTRACE_VERSION 1.16)
set(POTRACE_URI http://potrace.sourceforge.net/download/${POTRACE_VERSION}/potrace-${POTRACE_VERSION}.tar.gz)
set(POTRACE_HASH 5f0bd87ddd9a620b0c4e65652ef93d69)
set(POTRACE_HASH_TYPE MD5)
set(POTRACE_FILE potrace-${POTRACE_VERSION}.tar.gz)
set(POTRACE_CPE "cpe:2.3:a:icoasoft:potrace:${POTRACE_VERSION}:*:*:*:*:*:*:*")
set(HARU_VERSION 2_3_0)
set(HARU_URI https://github.com/libharu/libharu/archive/RELEASE_${HARU_VERSION}.tar.gz)
@@ -492,11 +537,5 @@ set(HARU_HASH 4f916aa49c3069b3a10850013c507460)
set(HARU_HASH_TYPE MD5)
set(HARU_FILE libharu-${HARU_VERSION}.tar.gz)
set(ZSTD_VERSION 1.5.0)
set(ZSTD_URI https://github.com/facebook/zstd/releases/download/v${ZSTD_VERSION}/zstd-${ZSTD_VERSION}.tar.gz)
set(ZSTD_HASH 5194fbfa781fcf45b98c5e849651aa7b3b0a008c6b72d4a0db760f3002291e94)
set(ZSTD_HASH_TYPE SHA256)
set(ZSTD_FILE zstd-${ZSTD_VERSION}.tar.gz)
set(SSE2NEON_GIT https://github.com/DLTcollab/sse2neon.git)
set(SSE2NEON_GIT_HASH fe5ff00bb8d19b327714a3c290f3e2ce81ba3525)
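The dependency-version hunks above all follow the same shape: each library gets a version, download URI, hash, hash type, archive file name, and — where the NVD has an entry for the project — a CPE string so cve-bin-tool can match it against known vulnerabilities. A minimal sketch of that pattern; the FOO dependency, its URL, and its hash are hypothetical and not part of this diff:

set(FOO_VERSION 1.2.3)
set(FOO_URI https://example.org/foo/foo-${FOO_VERSION}.tar.gz)
# Hash of the release tarball, checked by the deps builder before extraction.
set(FOO_HASH 0123456789abcdef0123456789abcdef)
set(FOO_HASH_TYPE MD5)
set(FOO_FILE foo-${FOO_VERSION}.tar.gz)
# CPE identifier for cve-bin-tool; omitted when the project has none assigned.
set(FOO_CPE "cpe:2.3:a:foo_project:foo:${FOO_VERSION}:*:*:*:*:*:*:*")

The matching ExternalProject would then reference file://${PACKAGE_DIR}/${FOO_FILE} with URL_HASH ${FOO_HASH_TYPE}=${FOO_HASH}, as the per-dependency .cmake files above do.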


@@ -1,27 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
ExternalProject_Add(external_wayland_protocols
URL file://${PACKAGE_DIR}/${WL_PROTOCOLS_FILE}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${WL_PROTOCOLS_HASH_TYPE}=${WL_PROTOCOLS_HASH}
PREFIX ${BUILD_DIR}/wayland-protocols
CONFIGURE_COMMAND meson --prefix ${LIBDIR}/wayland-protocols . ../external_wayland_protocols -Dtests=false
BUILD_COMMAND ninja
INSTALL_COMMAND ninja install
)


@@ -20,16 +20,24 @@ if(WIN32)
set(X264_EXTRA_ARGS --enable-win32thread --cross-prefix=${MINGW_HOST}- --host=${MINGW_HOST})
endif()
if(BLENDER_PLATFORM_ARM)
set(X264_EXTRA_ARGS ${X264_EXTRA_ARGS} "--disable-asm")
if(APPLE)
if("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64")
set(X264_EXTRA_ARGS ${X264_EXTRA_ARGS} "--disable-asm")
set(X264_CONFIGURE_ENV echo .)
else()
set(X264_CONFIGURE_ENV
export AS=${LIBDIR}/nasm/bin/nasm
)
endif()
else()
set(X264_CONFIGURE_ENV echo .)
endif()
if((APPLE AND NOT BLENDER_PLATFORM_ARM) OR (UNIX AND NOT APPLE))
if(UNIX AND NOT APPLE)
set(X264_CONFIGURE_ENV
export AS=${LIBDIR}/nasm/bin/nasm
)
else()
set(X264_CONFIGURE_ENV echo .)
endif()
ExternalProject_Add(external_x264


@@ -16,21 +16,49 @@
#
# ***** END GPL LICENSE BLOCK *****
ExternalProject_Add(external_xml2
URL file://${PACKAGE_DIR}/${XML2_FILE}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${XML2_HASH_TYPE}=${XML2_HASH}
PREFIX ${BUILD_DIR}/xml2
CONFIGURE_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/xml2/src/external_xml2/ && ${CONFIGURE_COMMAND}
--prefix=${LIBDIR}/xml2
--disable-shared
--enable-static
--with-pic
--with-python=no
--with-lzma=no
--with-zlib=no
--with-iconv=no
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/xml2/src/external_xml2/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/xml2/src/external_xml2/ && make install
INSTALL_DIR ${LIBDIR}/xml2
)
if(WIN32)
set(XML2_EXTRA_ARGS
-DLIBXML2_WITH_ZLIB=OFF
-DLIBXML2_WITH_LZMA=OFF
-DLIBXML2_WITH_PYTHON=OFF
-DLIBXML2_WITH_ICONV=OFF
-DLIBXML2_WITH_TESTS=OFF
-DLIBXML2_WITH_PROGRAMS=OFF
-DBUILD_SHARED_LIBS=OFF
)
ExternalProject_Add(external_xml2
URL file://${PACKAGE_DIR}/${XML2_FILE}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${XML2_HASH_TYPE}=${XML2_HASH}
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/xml2 ${DEFAULT_CMAKE_FLAGS} ${XML2_EXTRA_ARGS}
PREFIX ${BUILD_DIR}/xml2
INSTALL_DIR ${LIBDIR}/xml2
)
else()
ExternalProject_Add(external_xml2
URL file://${PACKAGE_DIR}/${XML2_FILE}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${XML2_HASH_TYPE}=${XML2_HASH}
PREFIX ${BUILD_DIR}/xml2
CONFIGURE_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/xml2/src/external_xml2/ && ${CONFIGURE_COMMAND}
--prefix=${LIBDIR}/xml2
--disable-shared
--enable-static
--with-pic
--with-python=no
--with-lzma=no
--with-zlib=no
--with-iconv=no
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/xml2/src/external_xml2/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/xml2/src/external_xml2/ && make install
INSTALL_DIR ${LIBDIR}/xml2
)
endif()
if(WIN32 AND BUILD_MODE STREQUAL Release)
ExternalProject_Add_Step(external_xml2 after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/xml2/include ${HARVEST_TARGET}/xml2/include
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/xml2/lib/libxml2s.lib ${HARVEST_TARGET}/xml2/lib/libxml2s.lib
DEPENDEES install
)
endif()


@@ -1,51 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
set(ZSTD_EXTRA_ARGS
-DZSTD_BUILD_PROGRAMS=OFF
-DZSTD_BUILD_SHARED=OFF
-DZSTD_BUILD_STATIC=ON
-DZSTD_BUILD_TESTS=OFF
-DZSTD_LEGACY_SUPPORT=OFF
-DZSTD_LZ4_SUPPORT=OFF
-DZSTD_LZMA_SUPPORT=OFF
-DZSTD_MULTITHREAD_SUPPORT=ON
-DZSTD_PROGRAMS_LINK_SHARED=OFF
-DZSTD_USE_STATIC_RUNTIME=OFF
-DZSTD_ZLIB_SUPPORT=OFF
)
ExternalProject_Add(external_zstd
URL file://${PACKAGE_DIR}/${ZSTD_FILE}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH ${ZSTD_HASH_TYPE}=${ZSTD_HASH}
PREFIX ${BUILD_DIR}/zstd
SOURCE_SUBDIR build/cmake
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/zstd ${DEFAULT_CMAKE_FLAGS} ${ZSTD_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/zstd
)
if(WIN32)
if(BUILD_MODE STREQUAL Release)
ExternalProject_Add_Step(external_zstd after_install
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/zstd/lib/zstd_static${LIBEXT} ${HARVEST_TARGET}/zstd/lib/zstd_static${LIBEXT}
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/zstd/include/ ${HARVEST_TARGET}/zstd/include/
DEPENDEES install
)
endif()
endif()


@@ -1,38 +1,76 @@
strict graph {
graph[autosize = false, size = "25.7,8.3!", resolution = 300, overlap = false, splines = false, outputorder=edgesfirst ];
node [style=filled fillcolor=white];
external_alembic -- external_boost;
external_alembic -- external_zlib;
graph[autosize = false, size = "25.7,8.3!", resolution = 300];
external_alembic -- external_openexr;
external_alembic -- external_imath;
external_blosc -- external_zlib;
external_blosc -- external_pthreads;
external_boost -- Make_Python_Environment;
external_clang -- ll;
external_boost -- external_python;
external_boost -- external_numpy;
external_dpcpp -- external_python;
external_dpcpp -- external_python_site_packages;
external_dpcpp -- external_vcintrinsics;
external_dpcpp -- external_openclheaders;
external_dpcpp -- external_icdloader;
external_dpcpp -- external_mp11;
external_dpcpp -- external_level_zero;
external_dpcpp -- external_spirvheaders;
external_embree -- external_tbb;
external_ffmpeg -- external_zlib;
external_ffmpeg -- external_faad;
external_ffmpeg -- external_openjpeg;
external_ffmpeg -- external_xvidcore;
external_ffmpeg -- external_x264;
external_ffmpeg -- external_opus;
external_ffmpeg -- external_vpx;
external_ffmpeg -- external_theora;
external_ffmpeg -- external_vorbis;
external_ffmpeg -- external_ogg;
external_ffmpeg -- external_lame;
external_ffmpeg -- external_aom;
external_ffmpeg -- external_zlib_mingw;
external_numpy -- Make_Python_Environment;
external_ffmpeg -- external_nasm;
external_freetype -- external_brotli;
external_freetype -- external_zlib;
external_gmpxx -- external_gmp;
external_igc_llvm -- external_igc_opencl_clang;
external_igc_spirv_translator -- external_igc_opencl_clang;
external_igc -- external_igc_vcintrinsics;
external_igc -- external_igc_llvm;
external_igc -- external_igc_opencl_clang;
external_igc -- external_igc_vcintrinsics;
external_igc -- external_igc_spirv_headers;
external_igc -- external_igc_spirv_tools;
external_igc -- external_igc_spirv_translator;
external_igc -- external_flex;
external_ispc -- ll;
external_ispc -- external_python;
external_ispc -- external_flexbison;
external_ispc -- external_flex;
ll -- external_xml2;
ll -- external_python;
external_mesa -- ll;
external_numpy -- external_python;
external_numpy -- external_python_site_packages;
external_ocloc -- external_igc;
external_ocloc -- external_gmmlib;
external_opencollada -- external_xml2;
external_opencolorio -- external_boost;
external_opencolorio -- external_tinyxml;
external_opencolorio -- external_yamlcpp;
external_opencolorio -- external_expat;
external_opencolorio -- external_imath;
external_opencolorio -- external_pystring;
external_openexr -- external_zlib;
external_openimagedenoise -- external_tbb;
external_openimagedenoise -- external_ispc;
external_openimagedenoise -- external_python;
external_openimageio -- external_png;
external_openimageio -- external_zlib;
external_openimageio -- external_openexr;
external_openimageio -- external_openexr;
external_openimageio -- external_imath;
external_openimageio -- external_jpeg;
external_openimageio -- external_boost;
external_openimageio -- external_tiff;
external_openimageio -- external_opencolorio;
external_openimageio -- external_pugixml;
external_openimageio -- external_fmt;
external_openimageio -- external_robinmap;
external_openimageio -- external_openjpeg;
external_openimageio -- external_webp;
external_openimageio -- external_opencolorio_extra;
@@ -44,57 +82,37 @@ graph[autosize = false, size = "25.7,8.3!", resolution = 300, overlap = false, s
external_opensubdiv -- external_tbb;
openvdb -- external_tbb;
openvdb -- external_boost;
openvdb -- external_openexr;
openvdb -- external_openexr;
openvdb -- external_zlib;
openvdb -- external_blosc;
external_osl -- external_boost;
external_osl -- ll;
external_osl -- external_clang;
external_osl -- external_openexr;
external_osl -- external_openexr;
external_osl -- external_zlib;
external_osl -- external_flexbison;
external_osl -- external_openimageio;
external_osl -- external_pugixml;
external_osl -- external_flexbison;
external_osl -- external_flex;
external_png -- external_zlib;
external_python_site_packages -- Make_Python_Environment;
external_python -- external_bzip2;
external_python -- external_ffi;
external_python -- external_lzma;
external_python -- external_ssl;
external_python -- external_sqlite;
external_python -- external_zlib;
external_python_site_packages -- external_python;
external_sndfile -- external_ogg;
external_sndfile -- external_vorbis;
external_sndfile -- external_flac;
external_theora -- external_vorbis;
external_theora -- external_ogg;
external_tiff -- external_zlib;
external_usd -- external_tbb;
external_usd -- external_boost;
external_usd -- external_opensubdiv;
external_vorbis -- external_ogg;
blender-- external_ffmpeg;
blender-- external_alembic;
blender-- external_openjpeg;
blender-- external_opencolorio;
blender-- external_openexr;
blender-- external_opensubdiv;
blender-- openvdb;
blender-- external_osl;
blender-- external_boost;
blender-- external_jpeg;
blender-- external_png;
blender-- external_python;
blender-- external_sndfile;
blender-- external_iconv;
blender-- external_fftw3;
external_python-- external_python_site_packages;
external_python_site_packages-- requests;
external_python_site_packages-- idna;
external_python_site_packages-- chardet;
external_python_site_packages-- urllib3;
external_python_site_packages-- certifi;
external_python-- external_numpy;
external_usd-- external_boost;
external_usd-- external_tbb;
blender-- external_opencollada;
blender-- external_sdl;
blender-- external_freetype;
blender-- external_pthreads;
blender-- external_zlib;
blender-- external_openal;
blender-- external_usd;
external_wayland -- external_expat;
external_wayland -- external_xml2;
external_wayland -- external_ffi;
external_wayland_protocols -- external_wayland;
external_x264 -- external_nasm;
}
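The block above is a Graphviz strict graph describing the build ordering between the external projects. As an optional sketch — assuming Graphviz's dot binary is installed and the graph is saved as deps.dot next to the script, neither of which is part of this diff — it could be rendered to an image from CMake like so:

# Hypothetical helper, not part of the diff: render deps.dot to PNG when
# the Graphviz 'dot' executable is available.
find_program(DOT_EXECUTABLE dot)
if(DOT_EXECUTABLE)
  execute_process(
    COMMAND ${DOT_EXECUTABLE} -Tpng ${CMAKE_CURRENT_LIST_DIR}/deps.dot
            -o ${CMAKE_CURRENT_BINARY_DIR}/deps.png
    RESULT_VARIABLE _dot_result
  )
  if(NOT _dot_result EQUAL 0)
    message(WARNING "Graphviz failed to render deps.dot")
  endif()
else()
  message(STATUS "Graphviz 'dot' not found, skipping dependency graph render")
endif()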

File diff suppressed because it is too large


@@ -0,0 +1,18 @@
diff -Naur libaom-3.4.0/build/cmake/aom_configure.cmake external_aom/build/cmake/aom_configure.cmake
--- libaom-3.4.0/build/cmake/aom_configure.cmake 2022-06-17 11:46:18 -0600
+++ external_aom/build/cmake/aom_configure.cmake 2022-10-16 15:35:54 -0600
@@ -15,8 +15,12 @@
include(FindGit)
include(FindPerl)
-include(FindThreads)
-
+# Blender: This will drag in a dep on libwinpthreads which we prefer
+# not to have, aom will fallback on a native win32 thread wrapper
+# if pthreads are not found.
+if(NOT WIN32)
+ include(FindThreads)
+endif()
include("${AOM_ROOT}/build/cmake/aom_config_defaults.cmake")
include("${AOM_ROOT}/build/cmake/aom_experiment_deps.cmake")
include("${AOM_ROOT}/build/cmake/aom_optimization.cmake")


@@ -20,7 +20,7 @@
# ILMBASE_LIBRARIES - list of libraries to link against when using IlmBase.
# ILMBASE_FOUND - True if IlmBase was found.
# Other standard issue macros
# Other standarnd issue macros
include(FindPackageHandleStandardArgs)
include(FindPackageMessage)
include(SelectLibraryConfigurations)


@@ -22,7 +22,7 @@
# These are defined by the FindIlmBase module.
# OPENEXR_FOUND - True if OpenEXR was found.
# Other standard issue macros
# Other standarnd issue macros
include(SelectLibraryConfigurations)
include(FindPackageHandleStandardArgs)
include(FindPackageMessage)


@@ -1,32 +1,5 @@
cmake_minimum_required(VERSION 3.1 FATAL_ERROR)
if (POLICY CMP0048)
# cmake warns if loaded from a min-3.0-required parent dir, so silence the warning:
cmake_policy(SET CMP0048 NEW)
endif()
project (tbb CXX)
include(CheckCXXCompilerFlag)
include(CheckCXXSourceRuns)
if(POLICY CMP0058)
cmake_policy(SET CMP0058 NEW)
endif()
if(POLICY CMP0068)
cmake_policy(SET CMP0068 NEW)
endif()
if (POLICY CMP0078)
# swig standard target names
cmake_policy(SET CMP0078 NEW)
endif ()
if (POLICY CMP0086)
# UseSWIG honors SWIG_MODULE_NAME via -module flag
cmake_policy(SET CMP0086 NEW)
endif ()
cmake_minimum_required (VERSION 2.8)
project(tbb CXX)
if(NOT CMAKE_BUILD_TYPE AND NOT CMAKE_CONFIGURATION_TYPES)
message(STATUS "Setting build type to 'Release' as none was specified.")
@@ -35,36 +8,12 @@ if(NOT CMAKE_BUILD_TYPE AND NOT CMAKE_CONFIGURATION_TYPES)
"MinSizeRel" "RelWithDebInfo")
endif()
if(NOT TBB_INSTALL_RUNTIME_DIR)
set(TBB_INSTALL_RUNTIME_DIR bin)
endif()
if(NOT TBB_INSTALL_LIBRARY_DIR)
set(TBB_INSTALL_LIBRARY_DIR lib)
endif()
if(NOT TBB_INSTALL_ARCHIVE_DIR)
set(TBB_INSTALL_ARCHIVE_DIR lib)
endif()
if(NOT TBB_INSTALL_INCLUDE_DIR)
set(TBB_INSTALL_INCLUDE_DIR include)
endif()
if(NOT TBB_CMAKE_PACKAGE_INSTALL_DIR)
set(TBB_CMAKE_PACKAGE_INSTALL_DIR lib/cmake/tbb)
endif()
include_directories(include src src/rml/include ${CMAKE_CURRENT_BINARY_DIR})
include_directories(include src src/rml/include )
option(TBB_BUILD_SHARED "Build TBB shared library" ON)
option(TBB_BUILD_STATIC "Build TBB static library" ON)
option(TBB_BUILD_TBBMALLOC "Build TBB malloc library" ON)
option(TBB_BUILD_TBBMALLOC_PROXY "Build TBB malloc proxy library" ON)
option(TBB_BUILD_TESTS "Build TBB tests and enable testing infrastructure" ON)
option(TBB_NO_DATE "Do not save the configure date in the version string" OFF)
option(TBB_BUILD_PYTHON "Build TBB Python bindings" OFF)
option(TBB_SET_SOVERSION "Set the SOVERSION (shared library build version suffix)?" OFF)
# When this repository is part of a larger build system of a parent project
# we may not want TBB to set up default installation targets
option(TBB_INSTALL_TARGETS "Include build targets for 'make install'" ON)
if(APPLE)
set(CMAKE_MACOSX_RPATH ON)
@@ -90,143 +39,66 @@ set(tbbmalloc_proxy_src
src/tbbmalloc/proxy.cpp
src/tbbmalloc/tbb_function_replacement.cpp)
add_library (tbb_interface INTERFACE)
add_definitions(-DTBB_SUPPRESS_DEPRECATED_MESSAGES=1)
if (CMAKE_SYSTEM_PROCESSOR MATCHES "(i386|x86_64)")
if (NOT APPLE AND NOT MINGW)
target_compile_definitions(tbb_interface INTERFACE DO_ITT_NOTIFY)
endif()
endif()
if (APPLE)
if (NOT APPLE)
add_definitions(-DDO_ITT_NOTIFY)
else()
# Disable annoying "has no symbols" warnings
set(CMAKE_C_ARCHIVE_CREATE "<CMAKE_AR> Scr <TARGET> <LINK_FLAGS> <OBJECTS>")
set(CMAKE_CXX_ARCHIVE_CREATE "<CMAKE_AR> Scr <TARGET> <LINK_FLAGS> <OBJECTS>")
set(CMAKE_C_ARCHIVE_FINISH "<CMAKE_RANLIB> -no_warning_for_no_symbols -c <TARGET>")
set(CMAKE_CXX_ARCHIVE_FINISH "<CMAKE_RANLIB> -no_warning_for_no_symbols -c <TARGET>")
set(CMAKE_C_ARCHIVE_CREATE "<CMAKE_AR> Scr <TARGET> <LINK_FLAGS> <OBJECTS>")
set(CMAKE_CXX_ARCHIVE_CREATE "<CMAKE_AR> Scr <TARGET> <LINK_FLAGS> <OBJECTS>")
set(CMAKE_C_ARCHIVE_FINISH "<CMAKE_RANLIB> -no_warning_for_no_symbols -c <TARGET>")
set(CMAKE_CXX_ARCHIVE_FINISH "<CMAKE_RANLIB> -no_warning_for_no_symbols -c <TARGET>")
endif()
macro(CHECK_CXX_COMPILER_AND_LINKER_FLAGS _RESULT _CXX_FLAGS _LINKER_FLAGS)
set(CMAKE_REQUIRED_FLAGS ${_CXX_FLAGS})
set(CMAKE_REQUIRED_LIBRARIES ${_LINKER_FLAGS})
set(CMAKE_REQUIRED_QUIET TRUE)
check_cxx_source_runs("#include <iostream>\nint main(int argc, char **argv) { std::cout << \"test\"; return 0; }" ${_RESULT})
set(CMAKE_REQUIRED_FLAGS "")
set(CMAKE_REQUIRED_LIBRARIES "")
endmacro()
# Prefer libc++ in conjunction with Clang
if (CMAKE_CXX_COMPILER_ID MATCHES "Clang")
if (CMAKE_CXX_FLAGS MATCHES "-stdlib=libc\\+\\+")
message(STATUS "TBB: using libc++.")
else()
CHECK_CXX_COMPILER_AND_LINKER_FLAGS(HAS_LIBCPP "-stdlib=libc++" "-stdlib=libc++")
if (HAS_LIBCPP)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -stdlib=libc++ -D_LIBCPP_VERSION")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -stdlib=libc++")
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -stdlib=libc++")
message(STATUS "TBB: using libc++.")
else()
message(STATUS "TBB: NOT using libc++.")
endif()
endif()
endif()
set (CMAKE_CXX_STANDARD 11)
if (UNIX)
target_compile_definitions(tbb_interface INTERFACE USE_PTHREAD)
check_cxx_compiler_flag ("-mrtm -Werror" SUPPORTS_MRTM)
if (SUPPORTS_MRTM)
target_compile_options(tbb_interface INTERFACE "-mrtm")
endif ()
elseif(WIN32)
target_compile_definitions(tbb_interface INTERFACE USE_WINTHREAD _WIN32_WINNT=0x0600)
if (MSVC)
enable_language(ASM_MASM)
target_compile_options(tbb_interface INTERFACE /GS- /Zc:wchar_t /Zc:forScope)
check_cxx_compiler_flag ("/volatile:iso" SUPPORTS_VOLATILE_FLAG)
if (SUPPORTS_VOLATILE_FLAG)
target_compile_options(tbb_interface INTERFACE /volatile:iso)
endif ()
target_compile_options(tbb_interface INTERFACE $<$<COMPILE_LANGUAGE:CXX>:/wd4267 /wd4800 /wd4146 /wd4244 /wd4577 /wd4018>)
if (NOT CMAKE_SIZEOF_VOID_P)
message(FATAL_ERROR "'CMAKE_SIZEOF_VOID_P' is undefined. Please delete your build directory and rerun CMake again!")
endif()
if (CMAKE_SIZEOF_VOID_P EQUAL 8)
list(APPEND tbb_src src/tbb/intel64-masm/atomic_support.asm
src/tbb/intel64-masm/itsx.asm src/tbb/intel64-masm/intel64_misc.asm)
list(APPEND tbbmalloc_src src/tbb/intel64-masm/atomic_support.asm)
set(CMAKE_ASM_MASM_FLAGS "/DEM64T=1 ${CMAKE_ASM_MASM_FLAGS}")
else()
list(APPEND tbb_src src/tbb/ia32-masm/atomic_support.asm
src/tbb/ia32-masm/itsx.asm src/tbb/ia32-masm/lock_byte.asm)
# Enable SAFESEH feature for assembly (x86 builds only).
set(CMAKE_ASM_MASM_FLAGS "/safeseh ${CMAKE_ASM_MASM_FLAGS}")
endif()
elseif (MINGW)
target_compile_options(tbb_interface INTERFACE "-mthreads")
endif ()
endif()
if (MSVC)
set(ENABLE_RTTI "/EHsc /GR ")
set(DISABLE_RTTI "/EHs- /GR- ")
elseif (UNIX)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11 -DUSE_PTHREAD")
if(NOT CMAKE_CXX_FLAGS MATCHES "-mno-rtm")
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -mrtm")
endif()
if (APPLE)
set(ARCH_PREFIX "mac")
else()
set(ARCH_PREFIX "lin")
endif()
if (CMAKE_SIZEOF_VOID_P EQUAL 8)
set(ARCH_PREFIX "${ARCH_PREFIX}64")
else()
set(ARCH_PREFIX "${ARCH_PREFIX}32")
endif()
set(ENABLE_RTTI "-frtti -fexceptions ")
set(DISABLE_RTTI "-fno-rtti -fno-exceptions ")
endif ()
elseif(WIN32)
cmake_minimum_required (VERSION 3.1)
enable_language(ASM_MASM)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /GS- /Zc:wchar_t /Zc:forScope /DUSE_WINTHREAD")
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /D_CRT_SECURE_NO_DEPRECATE /D_WIN32_WINNT=0x0600 /volatile:iso")
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /wd4267 /wd4800 /wd4146 /wd4244 /wd4018")
##--------
# - Added TBB_USE_GLIBCXX_VERSION macro to specify the version of GNU
# libstdc++ when it cannot be properly recognized, e.g. when used
# with Clang on Linux* OS. Inspired by a contribution from David A.
if (NOT TBB_USE_GLIBCXX_VERSION AND UNIX AND NOT APPLE)
if ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "Clang")
# using Clang
string(REPLACE "." "0" TBB_USE_GLIBCXX_VERSION ${CMAKE_CXX_COMPILER_VERSION})
if (CMAKE_SIZEOF_VOID_P EQUAL 8)
list(APPEND tbb_src src/tbb/intel64-masm/atomic_support.asm
src/tbb/intel64-masm/itsx.asm src/tbb/intel64-masm/intel64_misc.asm)
list(APPEND tbbmalloc_src src/tbb/intel64-masm/atomic_support.asm)
set(CMAKE_ASM_MASM_FLAGS "/DEM64T=1")
set(ARCH_PREFIX "win64")
else()
list(APPEND tbb_src src/tbb/ia32-masm/atomic_support.asm
src/tbb/ia32-masm/itsx.asm src/tbb/ia32-masm/lock_byte.asm)
list(APPEND tbbmalloc_src src/tbb/ia32-masm/atomic_support.asm)
set(ARCH_PREFIX "win32")
endif()
set(ENABLE_RTTI "/EHsc /GR ")
set(DISABLE_RTTI "/EHs- /GR- ")
endif()
if (TBB_USE_GLIBCXX_VERSION)
target_compile_definitions(tbb_interface INTERFACE TBB_USE_GLIBCXX_VERSION=${TBB_USE_GLIBCXX_VERSION})
endif()
##-------
if ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "GNU")
check_cxx_compiler_flag ("-flifetime-dse=1" SUPPORTS_FLIFETIME)
if (SUPPORTS_FLIFETIME)
target_compile_options(tbb_interface INTERFACE -flifetime-dse=1)
endif()
include(CheckCXXCompilerFlag)
check_cxx_compiler_flag("-flifetime-dse=1" SUPPORTS_FLIFETIME)
if (SUPPORTS_FLIFETIME)
add_definitions(-flifetime-dse=1)
endif()
endif()
# Linker export definitions
if (APPLE)
set (ARCH_PREFIX "mac")
elseif(WIN32)
set (ARCH_PREFIX "win")
else()
set (ARCH_PREFIX "lin")
endif()
if (CMAKE_SIZEOF_VOID_P EQUAL 8)
set(ARCH_PREFIX "${ARCH_PREFIX}64")
else()
set(ARCH_PREFIX "${ARCH_PREFIX}32")
endif()
if (MINGW)
set (ARCH_PREFIX "${ARCH_PREFIX}-gcc")
# there's no win32-gcc-tbb-export.def, use lin32-tbb-export.def
execute_process (COMMAND ${CMAKE_COMMAND} -E copy ${CMAKE_CURRENT_SOURCE_DIR}/src/tbb/lin32-tbb-export.def ${CMAKE_CURRENT_SOURCE_DIR}/src/tbb/win32-gcc-tbb-export.def)
endif()
if (MSVC)
if (WIN32)
add_custom_command(OUTPUT tbb.def
COMMAND ${CMAKE_CXX_COMPILER} /TC /EP ${CMAKE_CURRENT_SOURCE_DIR}/src/tbb/${ARCH_PREFIX}-tbb-export.def -I ${CMAKE_CURRENT_SOURCE_DIR}/include > tbb.def
MAIN_DEPENDENCY ${CMAKE_CURRENT_SOURCE_DIR}/src/tbb/${ARCH_PREFIX}-tbb-export.def
@@ -238,15 +110,18 @@ if (MSVC)
MAIN_DEPENDENCY ${CMAKE_CURRENT_SOURCE_DIR}/src/tbbmalloc/${ARCH_PREFIX}-tbbmalloc-export.def
COMMENT "Preprocessing tbbmalloc.def"
)
list(APPEND tbb_src ${CMAKE_CURRENT_SOURCE_DIR}/src/tbb/tbb_resource.rc)
list(APPEND tbbmalloc_src ${CMAKE_CURRENT_SOURCE_DIR}/src/tbbmalloc/tbbmalloc.rc)
list(APPEND tbbmalloc_proxy_src ${CMAKE_CURRENT_SOURCE_DIR}/src/tbbmalloc/tbbmalloc.rc)
else()
add_custom_command(OUTPUT tbb.def
COMMAND ${CMAKE_CXX_COMPILER} -xc++ -std=c++11 -E ${CMAKE_CURRENT_SOURCE_DIR}/src/tbb/${ARCH_PREFIX}-tbb-export.def -I ${CMAKE_CURRENT_SOURCE_DIR}/include -o tbb.def
COMMAND ${CMAKE_CXX_COMPILER} -xc++ -E ${CMAKE_CURRENT_SOURCE_DIR}/src/tbb/${ARCH_PREFIX}-tbb-export.def -I ${CMAKE_CURRENT_SOURCE_DIR}/include -o tbb.def
MAIN_DEPENDENCY ${CMAKE_CURRENT_SOURCE_DIR}/src/tbb/${ARCH_PREFIX}-tbb-export.def
COMMENT "Preprocessing tbb.def"
)
add_custom_command(OUTPUT tbbmalloc.def
COMMAND ${CMAKE_CXX_COMPILER} -xc++ -std=c++11 -E ${CMAKE_CURRENT_SOURCE_DIR}/src/tbbmalloc/${ARCH_PREFIX}-tbbmalloc-export.def -I ${CMAKE_CURRENT_SOURCE_DIR}/include -o tbbmalloc.def
COMMAND ${CMAKE_CXX_COMPILER} -xc++ -E ${CMAKE_CURRENT_SOURCE_DIR}/src/tbbmalloc/${ARCH_PREFIX}-tbbmalloc-export.def -I ${CMAKE_CURRENT_SOURCE_DIR}/include -o tbbmalloc.def
MAIN_DEPENDENCY ${CMAKE_CURRENT_SOURCE_DIR}/src/tbbmalloc/${ARCH_PREFIX}-tbbmalloc-export.def
COMMENT "Preprocessing tbbmalloc.def"
)
@@ -257,80 +132,34 @@ add_custom_target(tbb_def_files DEPENDS tbb.def tbbmalloc.def)
# TBB library
if (TBB_BUILD_STATIC)
add_library(tbb_static STATIC ${tbb_src})
target_link_libraries(tbb_static PRIVATE tbb_interface)
target_include_directories(tbb_static INTERFACE "$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>" "$<INSTALL_INTERFACE:${TBB_INSTALL_INCLUDE_DIR}>")
set_property(TARGET tbb_static APPEND PROPERTY COMPILE_DEFINITIONS "__TBB_BUILD=1")
set_property(TARGET tbb_static APPEND PROPERTY COMPILE_DEFINITIONS "__TBB_SOURCE_DIRECTLY_INCLUDED=1")
set_property(TARGET tbb_static APPEND_STRING PROPERTY COMPILE_FLAGS ${ENABLE_RTTI})
if (TBB_INSTALL_TARGETS)
install(TARGETS tbb_static ARCHIVE DESTINATION ${TBB_INSTALL_ARCHIVE_DIR})
endif()
target_compile_definitions(tbb_static
PRIVATE
-D__TBB_BUILD=1
-D__TBB_DYNAMIC_LOAD_ENABLED=0
-D__TBB_SOURCE_DIRECTLY_INCLUDED=1)
if (MSVC)
target_compile_definitions(tbb_static
PUBLIC -D__TBB_NO_IMPLICIT_LINKAGE=1
PRIVATE -D_CRT_SECURE_NO_WARNINGS)
endif()
if (UNIX AND NOT APPLE)
target_link_libraries(tbb_static PUBLIC pthread dl)
endif()
install(TARGETS tbb_static ARCHIVE DESTINATION lib)
endif()
if (TBB_BUILD_SHARED)
add_library(tbb SHARED ${tbb_src})
target_link_libraries(tbb PRIVATE tbb_interface)
target_include_directories(tbb INTERFACE "$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>" "$<INSTALL_INTERFACE:${TBB_INSTALL_INCLUDE_DIR}>")
set_property(TARGET tbb APPEND PROPERTY COMPILE_DEFINITIONS "__TBB_BUILD=1")
set_property(TARGET tbb APPEND_STRING PROPERTY COMPILE_FLAGS ${ENABLE_RTTI})
if (TBB_SET_SOVERSION)
set_property(TARGET tbb PROPERTY SOVERSION 2)
endif ()
target_compile_definitions(tbb
PRIVATE -D__TBB_BUILD=1)
if (MSVC)
target_compile_definitions(tbb
PUBLIC -D__TBB_NO_IMPLICIT_LINKAGE=1
PRIVATE -D_CRT_SECURE_NO_WARNINGS)
endif()
add_dependencies(tbb tbb_def_files)
if (APPLE)
set_property(TARGET tbb APPEND PROPERTY LINK_FLAGS "-Wl,-exported_symbols_list,\"${CMAKE_CURRENT_BINARY_DIR}/tbb.def\"")
elseif (MSVC)
set_property(TARGET tbb APPEND PROPERTY LINK_FLAGS "/DEF:\"${CMAKE_CURRENT_BINARY_DIR}/tbb.def\"")
else ()
set_property(TARGET tbb APPEND PROPERTY LINK_FLAGS "-Wl,-version-script,\"${CMAKE_CURRENT_BINARY_DIR}/tbb.def\"")
set_property(TARGET tbb APPEND PROPERTY LINK_FLAGS "-Wl,-exported_symbols_list,${CMAKE_CURRENT_BINARY_DIR}/tbb.def")
elseif(UNIX)
set_property(TARGET tbb APPEND PROPERTY LINK_FLAGS "-Wl,-version-script,${CMAKE_CURRENT_BINARY_DIR}/tbb.def")
elseif(WIN32)
set_property(TARGET tbb APPEND PROPERTY LINK_FLAGS "/DEF:${CMAKE_CURRENT_BINARY_DIR}/tbb.def")
endif()
if (TBB_INSTALL_TARGETS)
install(TARGETS tbb EXPORT TBB
LIBRARY DESTINATION ${TBB_INSTALL_LIBRARY_DIR}
ARCHIVE DESTINATION ${TBB_INSTALL_ARCHIVE_DIR}
RUNTIME DESTINATION ${TBB_INSTALL_RUNTIME_DIR})
if (MSVC)
install(FILES $<TARGET_PDB_FILE:tbb> DESTINATION ${TBB_INSTALL_RUNTIME_DIR} OPTIONAL)
endif()
endif()
if (UNIX AND NOT APPLE)
target_link_libraries(tbb PUBLIC pthread dl)
install(TARGETS tbb DESTINATION lib)
if(WIN32)
set_target_properties(tbb PROPERTIES OUTPUT_NAME "tbb$<$<CONFIG:Debug>:_debug>")
endif()
endif()
if("${CMAKE_CXX_COMPILER_ID}" STREQUAL "GNU")
if(CMAKE_COMPILER_IS_GNUCC)
# Quench a warning on GCC
set_source_files_properties(${CMAKE_CURRENT_SOURCE_DIR}/src/tbb/governor.cpp COMPILE_FLAGS "-Wno-missing-field-initializers ")
elseif("${CMAKE_CXX_COMPILER_ID}" STREQUAL "Clang")
# Quench a warning on Clang
set_source_files_properties(${CMAKE_CURRENT_SOURCE_DIR}/src/tbb/itt_notify.cpp COMPILE_FLAGS "-Wno-varargs ")
elseif(MSVC)
# Quench a warning on MSVC
set_source_files_properties(${CMAKE_CURRENT_SOURCE_DIR}/src/tbb/scheduler.cpp COMPILE_FLAGS "/wd4458 ")
@@ -340,50 +169,24 @@ if(TBB_BUILD_TBBMALLOC)
# TBB malloc library
if (TBB_BUILD_STATIC)
add_library(tbbmalloc_static STATIC ${tbbmalloc_static_src})
target_link_libraries(tbbmalloc_static PRIVATE tbb_interface)
set_property(TARGET tbbmalloc_static APPEND PROPERTY COMPILE_DEFINITIONS "__TBBMALLOC_BUILD=1")
set_property(TARGET tbbmalloc_static APPEND PROPERTY COMPILE_DEFINITIONS "__TBB_DYNAMIC_LOAD_ENABLED=0")
set_property(TARGET tbbmalloc_static APPEND PROPERTY COMPILE_DEFINITIONS "__TBB_SOURCE_DIRECTLY_INCLUDED=1")
set_property(TARGET tbbmalloc_static APPEND_STRING PROPERTY COMPILE_FLAGS ${DISABLE_RTTI})
if (MSVC)
target_compile_definitions(tbbmalloc_static PUBLIC __TBB_NO_IMPLICIT_LINKAGE=1 __TBBMALLOC_NO_IMPLICIT_LINKAGE=1)
endif()
if (TBB_INSTALL_TARGETS)
install(TARGETS tbbmalloc_static ARCHIVE DESTINATION ${TBB_INSTALL_ARCHIVE_DIR})
endif()
install(TARGETS tbbmalloc_static ARCHIVE DESTINATION lib)
endif()
if (TBB_BUILD_SHARED)
add_library(tbbmalloc SHARED ${tbbmalloc_src})
target_link_libraries(tbbmalloc PRIVATE tbb_interface)
set_property(TARGET tbbmalloc APPEND PROPERTY COMPILE_DEFINITIONS "__TBBMALLOC_BUILD=1")
set_property(TARGET tbbmalloc APPEND_STRING PROPERTY COMPILE_FLAGS ${DISABLE_RTTI})
if (TBB_SET_SOVERSION)
set_property(TARGET tbbmalloc PROPERTY SOVERSION 2)
endif ()
add_dependencies(tbbmalloc tbb_def_files)
if (APPLE)
set_property(TARGET tbbmalloc APPEND PROPERTY LINK_FLAGS "-Wl,-exported_symbols_list,\"${CMAKE_CURRENT_BINARY_DIR}/tbbmalloc.def\"")
elseif (MSVC)
set_property(TARGET tbbmalloc APPEND PROPERTY LINK_FLAGS "/DEF:\"${CMAKE_CURRENT_BINARY_DIR}/tbbmalloc.def\"")
else ()
set_property(TARGET tbbmalloc APPEND PROPERTY LINK_FLAGS "-Wl,-version-script,\"${CMAKE_CURRENT_BINARY_DIR}/tbbmalloc.def\"")
endif()
if (MSVC)
target_compile_definitions(tbbmalloc PUBLIC __TBB_NO_IMPLICIT_LINKAGE=1 __TBBMALLOC_NO_IMPLICIT_LINKAGE=1)
endif()
if (TBB_INSTALL_TARGETS)
install(TARGETS tbbmalloc EXPORT TBB
LIBRARY DESTINATION ${TBB_INSTALL_LIBRARY_DIR}
ARCHIVE DESTINATION ${TBB_INSTALL_ARCHIVE_DIR}
RUNTIME DESTINATION ${TBB_INSTALL_RUNTIME_DIR})
if (MSVC)
install(FILES $<TARGET_PDB_FILE:tbbmalloc> DESTINATION ${TBB_INSTALL_RUNTIME_DIR} OPTIONAL)
endif()
endif()
if (UNIX AND NOT APPLE)
target_link_libraries(tbbmalloc PUBLIC pthread dl)
set_property(TARGET tbbmalloc APPEND PROPERTY LINK_FLAGS "-Wl,-exported_symbols_list,${CMAKE_CURRENT_BINARY_DIR}/tbbmalloc.def")
elseif(UNIX)
set_property(TARGET tbbmalloc APPEND PROPERTY LINK_FLAGS "-Wl,-version-script,${CMAKE_CURRENT_BINARY_DIR}/tbbmalloc.def")
elseif(WIN32)
set_property(TARGET tbbmalloc APPEND PROPERTY LINK_FLAGS "/DEF:${CMAKE_CURRENT_BINARY_DIR}/tbbmalloc.def")
endif()
install(TARGETS tbbmalloc DESTINATION lib)
endif()
endif()
@@ -391,298 +194,19 @@ if(TBB_BUILD_TBBMALLOC_PROXY)
# TBB malloc proxy library
if (TBB_BUILD_STATIC)
add_library(tbbmalloc_proxy_static STATIC ${tbbmalloc_proxy_src})
target_link_libraries(tbbmalloc_proxy_static PRIVATE tbb_interface)
set_property(TARGET tbbmalloc_proxy_static APPEND PROPERTY COMPILE_DEFINITIONS "__TBBMALLOC_BUILD=1")
set_property(TARGET tbbmalloc_proxy_static APPEND PROPERTY COMPILE_DEFINITIONS "__TBB_DYNAMIC_LOAD_ENABLED=0")
set_property(TARGET tbbmalloc_proxy_static APPEND PROPERTY COMPILE_DEFINITIONS "__TBB_SOURCE_DIRECTLY_INCLUDED=1")
set_property(TARGET tbbmalloc_proxy_static APPEND_STRING PROPERTY COMPILE_FLAGS ${DISABLE_RTTI})
if (TBB_INSTALL_TARGETS)
install(TARGETS tbbmalloc_proxy_static ARCHIVE DESTINATION ${TBB_INSTALL_ARCHIVE_DIR})
endif()
link_libraries(tbbmalloc_proxy_static tbbmalloc)
install(TARGETS tbbmalloc_proxy_static ARCHIVE DESTINATION lib)
endif()
if (TBB_BUILD_SHARED)
add_library(tbbmalloc_proxy SHARED ${tbbmalloc_proxy_src})
target_link_libraries(tbbmalloc_proxy PRIVATE tbb_interface)
set_property(TARGET tbbmalloc_proxy APPEND PROPERTY COMPILE_DEFINITIONS "__TBBMALLOC_BUILD=1")
set_property(TARGET tbbmalloc_proxy APPEND_STRING PROPERTY COMPILE_FLAGS ${DISABLE_RTTI})
if (TBB_SET_SOVERSION)
set_property(TARGET tbbmalloc_proxy PROPERTY SOVERSION 2)
endif ()
target_link_libraries(tbbmalloc_proxy PUBLIC tbbmalloc)
if (TBB_INSTALL_TARGETS)
install(TARGETS tbbmalloc_proxy EXPORT TBB
LIBRARY DESTINATION ${TBB_INSTALL_LIBRARY_DIR}
ARCHIVE DESTINATION ${TBB_INSTALL_ARCHIVE_DIR}
RUNTIME DESTINATION ${TBB_INSTALL_RUNTIME_DIR})
if (MSVC)
install(FILES $<TARGET_PDB_FILE:tbbmalloc_proxy> DESTINATION ${TBB_INSTALL_RUNTIME_DIR} OPTIONAL)
endif()
endif()
if (UNIX AND NOT APPLE)
target_link_libraries(tbbmalloc_proxy PUBLIC pthread dl)
endif()
target_link_libraries(tbbmalloc_proxy tbbmalloc)
install(TARGETS tbbmalloc_proxy DESTINATION lib)
endif()
endif()
if (TBB_INSTALL_TARGETS)
install(DIRECTORY include/tbb DESTINATION ${TBB_INSTALL_INCLUDE_DIR})
if (TBB_BUILD_SHARED)
install(EXPORT TBB DESTINATION ${TBB_CMAKE_PACKAGE_INSTALL_DIR} NAMESPACE TBB:: FILE TBBConfig.cmake)
endif()
endif()
# version file
if (TBB_INSTALL_TARGETS)
set (_VERSION_FILE ${CMAKE_CURRENT_SOURCE_DIR}/include/tbb/tbb_stddef.h)
file (STRINGS ${_VERSION_FILE} _VERSION_MAJOR_STRING REGEX ".*define[ ]+TBB_VERSION_MAJOR[ ]+[0-9]+.*")
file (STRINGS ${_VERSION_FILE} _VERSION_MINOR_STRING REGEX ".*define[ ]+TBB_VERSION_MINOR[ ]+[0-9]+.*")
string (REGEX REPLACE ".*TBB_VERSION_MAJOR[ ]+([0-9]+)" "\\1" TBB_MAJOR_VERSION ${_VERSION_MAJOR_STRING})
string (REGEX REPLACE ".*TBB_VERSION_MINOR[ ]+([0-9]+)" "\\1" TBB_MINOR_VERSION ${_VERSION_MINOR_STRING})
set (TBB_VERSION_STRING "${TBB_MAJOR_VERSION}.${TBB_MINOR_VERSION}")
include (CMakePackageConfigHelpers)
write_basic_package_version_file (TBBConfigVersion.cmake VERSION "${TBB_VERSION_STRING}" COMPATIBILITY AnyNewerVersion)
install (FILES ${CMAKE_CURRENT_BINARY_DIR}/TBBConfigVersion.cmake DESTINATION "${TBB_CMAKE_PACKAGE_INSTALL_DIR}")
endif()
# version_string.ver
if (UNIX AND NOT TBB_NO_DATE)
execute_process (COMMAND date "+%a, %d %b %Y %H:%M:%S %z"
OUTPUT_VARIABLE _configure_date
OUTPUT_STRIP_TRAILING_WHITESPACE)
elseif (WIN32 AND NOT TBB_NO_DATE)
execute_process (COMMAND cmd " /C date /T"
OUTPUT_VARIABLE _configure_date
OUTPUT_STRIP_TRAILING_WHITESPACE)
else ()
set (_configure_date "Unknown")
endif()
set (TBB_CONFIG_DATE "${_configure_date}" CACHE STRING "First time that TBB was configured")
set (_configure_date "${TBB_CONFIG_DATE}")
include_directories (${CMAKE_BINARY_DIR})
configure_file (build/version_string.ver.in version_string.ver @ONLY)
if (TBB_BUILD_TESTS)
enable_language (C)
enable_testing ()
find_library (LIBRT_LIBRARIES rt)
find_library (LIDL_LIBRARIES dl)
find_package (Threads)
if (NOT APPLE)
find_package (OpenMP)
endif()
macro (tbb_add_test testname)
set (full_testname tbb_test_${testname})
add_executable (${full_testname} src/test/test_${testname}.cpp)
target_link_libraries(${full_testname} PRIVATE tbb_interface)
if (TBB_BUILD_SHARED)
target_link_libraries (${full_testname} PRIVATE tbb tbbmalloc)
target_compile_definitions (${full_testname} PRIVATE __TBB_LIB_NAME=tbb)
else ()
target_link_libraries (${full_testname} PRIVATE tbb_static tbbmalloc_static)
target_compile_definitions (${full_testname} PRIVATE __TBB_LIB_NAME=tbb_static)
endif ()
if (LIBRT_LIBRARIES)
target_link_libraries (${full_testname} PRIVATE ${LIBRT_LIBRARIES})
endif ()
if (LIDL_LIBRARIES)
target_link_libraries (${full_testname} PRIVATE ${LIDL_LIBRARIES})
endif ()
if (Threads_FOUND)
target_link_libraries (${full_testname} PRIVATE ${CMAKE_THREAD_LIBS_INIT})
endif ()
if (OPENMP_FOUND AND "${testname}" MATCHES "openmp")
set_target_properties (${full_testname} PROPERTIES COMPILE_FLAGS "${OpenMP_CXX_FLAGS}")
set_target_properties (${full_testname} PROPERTIES LINK_FLAGS "${OpenMP_CXX_FLAGS}")
endif()
if (MINGW)
target_link_libraries (${full_testname} PRIVATE psapi)
endif ()
add_test (NAME ${full_testname} COMMAND ${full_testname})
endmacro ()
tbb_add_test (aggregator)
tbb_add_test (aligned_space)
tbb_add_test (assembly)
if (NOT WIN32)
tbb_add_test (async_msg) # msvc64/debug timeouts
endif()
tbb_add_test (async_node)
# tbb_add_test (atomic) # msvc64/debug timeouts: Compile-time initialization fails for static tbb::atomic variables
tbb_add_test (blocked_range2d)
tbb_add_test (blocked_range3d)
tbb_add_test (blocked_range)
tbb_add_test (broadcast_node)
tbb_add_test (buffer_node)
tbb_add_test (cache_aligned_allocator)
if (NOT WIN32)
tbb_add_test (cache_aligned_allocator_STL)
endif()
tbb_add_test (cilk_dynamic_load)
tbb_add_test (cilk_interop)
tbb_add_test (combinable)
tbb_add_test (composite_node)
tbb_add_test (concurrent_hash_map)
tbb_add_test (concurrent_lru_cache)
# tbb_add_test (concurrent_monitor) # too long
# tbb_add_test (concurrent_priority_queue)
if (NOT WIN32)
tbb_add_test (concurrent_queue) # msvc64/debug timeouts
endif()
# tbb_add_test (concurrent_queue_whitebox)
tbb_add_test (concurrent_unordered_map)
# tbb_add_test (concurrent_unordered_set)
tbb_add_test (concurrent_vector)
tbb_add_test (continue_node)
tbb_add_test (critical_section)
tbb_add_test (dynamic_link)
# tbb_add_test (eh_algorithms)
tbb_add_test (eh_flow_graph)
# tbb_add_test (eh_tasks)
tbb_add_test (enumerable_thread_specific)
tbb_add_test (examples_common_utility)
# tbb_add_test (fast_random)
tbb_add_test (flow_graph)
tbb_add_test (flow_graph_whitebox)
# tbb_add_test (fp) # mingw: harness_fp.h:66, assertion !checkConsistency || (ctl.mxcsr & SSE_RND_MODE_MASK) >> 3 == (ctl.x87cw & FE_RND_MODE_MASK): failed
# tbb_add_test (function_node) # mingw:random timeout
# tbb_add_test (global_control)
# tbb_add_test (global_control_whitebox)
tbb_add_test (halt)
tbb_add_test (handle_perror)
# tbb_add_test (hw_concurrency)
tbb_add_test (indexer_node)
tbb_add_test (inits_loop)
tbb_add_test (intrusive_list)
tbb_add_test (ittnotify)
# tbb_add_test (join_node) #msvc/64: fatal error C1128: number of sections exceeded object file format limit: compile with /bigob
tbb_add_test (lambda)
tbb_add_test (limiter_node)
# tbb_add_test (malloc_atexit)
# tbb_add_test (malloc_compliance) #mingw: Limits should be decreased for the test to work
tbb_add_test (malloc_init_shutdown)
# tbb_add_test (malloc_lib_unload)
# tbb_add_test (malloc_overload)
tbb_add_test (malloc_pools)
tbb_add_test (malloc_regression)
# tbb_add_test (malloc_used_by_lib)
# tbb_add_test (malloc_whitebox)
tbb_add_test (model_plugin)
# tbb_add_test (multifunction_node) # too long
tbb_add_test (mutex)
tbb_add_test (mutex_native_threads)
# tbb_add_test (opencl_node)
if (OPENMP_FOUND)
tbb_add_test (openmp)
endif ()
tbb_add_test (overwrite_node)
# tbb_add_test (parallel_do)
# This seems to fail on CI platforms (AppVeyor/Travis), perhaps because the VM exposes just 1 core?
tbb_add_test (parallel_for)
tbb_add_test (parallel_for_each)
tbb_add_test (parallel_for_vectorization)
tbb_add_test (parallel_invoke)
tbb_add_test (parallel_pipeline)
tbb_add_test (parallel_reduce)
tbb_add_test (parallel_scan)
tbb_add_test (parallel_sort)
tbb_add_test (parallel_while)
# tbb_add_test (partitioner_whitebox) # too long
tbb_add_test (pipeline)
# tbb_add_test (pipeline_with_tbf) # takes forever on appveyor
tbb_add_test (priority_queue_node)
tbb_add_test (queue_node)
tbb_add_test (reader_writer_lock)
# tbb_add_test (runtime_loader) # LINK : fatal error LNK1104: cannot open file 'tbbproxy.lib' [C:\projects\tbb\test_runtime_loader.vcxproj]
tbb_add_test (rwm_upgrade_downgrade)
# tbb_add_test (ScalableAllocator)
if (NOT WIN32)
tbb_add_test (ScalableAllocator_STL)
endif()
tbb_add_test (semaphore)
# tbb_add_test (sequencer_node) # msvc: timeout
tbb_add_test (source_node)
tbb_add_test (split_node)
tbb_add_test (static_assert)
tbb_add_test (std_thread)
tbb_add_test (tagged_msg)
# tbb_add_test (task_arena) # LINK : fatal error LNK1104: cannot open file '__TBB_LIB_NAME.lib' [C:\projects\tbb\test_task_arena.vcxproj]
# tbb_add_test (task_assertions)
tbb_add_test (task_auto_init)
tbb_add_test (task)
# tbb_add_test (task_enqueue) # too long
tbb_add_test (task_group)
# tbb_add_test (task_leaks)
# tbb_add_test (task_priority)
# tbb_add_test (task_scheduler_init) # msvc: test_task_scheduler_init.cpp:68, assertion !test_mandatory_parallelism || Harness::CanReachConcurrencyLevel(threads): failed
tbb_add_test (task_scheduler_observer)
tbb_add_test (task_steal_limit)
tbb_add_test (tbb_condition_variable)
tbb_add_test (tbb_fork)
# tbb_add_test (tbb_header)
tbb_add_test (tbb_thread)
# tbb_add_test (tbb_version)
tbb_add_test (tick_count)
tbb_add_test (tuple)
tbb_add_test (write_once_node)
tbb_add_test (yield)
endif ()
if (TBB_BUILD_PYTHON)
find_package(PythonInterp)
find_package(PythonLibs ${PYTHON_VERSION_STRING} EXACT)
find_package(SWIG 3)
if (PythonLibs_FOUND AND SWIG_FOUND AND TBB_BUILD_SHARED)
include (${SWIG_USE_FILE})
set_source_files_properties (python/tbb/api.i PROPERTIES CPLUSPLUS ON)
set (CMAKE_SWIG_FLAGS "-threads")
# swig_add_module is deprecated
if (CMAKE_VERSION VERSION_LESS 3.8)
swig_add_module (api python python/tbb/api.i)
else ()
swig_add_library (api LANGUAGE python SOURCES python/tbb/api.i)
endif ()
# UseSWIG generates now standard target names
if (CMAKE_VERSION VERSION_LESS 3.13)
set (module_target ${SWIG_MODULE_api_REAL_NAME})
else ()
set (module_target api)
endif ()
target_include_directories(${module_target} PRIVATE ${PYTHON_INCLUDE_DIRS})
target_link_libraries(${module_target} PRIVATE tbb)
if(WIN32)
target_link_libraries(${module_target} ${PYTHON_LIBRARIES})
elseif(APPLE)
set_target_properties(${module_target} PROPERTIES LINK_FLAGS "-undefined dynamic_lookup")
endif()
if (WIN32)
set (PYTHON_SITE_PACKAGES Lib/site-packages)
else ()
set (PYTHON_SITE_PACKAGES lib/python${PYTHON_VERSION_MAJOR}.${PYTHON_VERSION_MINOR}/site-packages)
endif ()
if (TBB_INSTALL_TARGETS)
install(FILES python/TBB.py
DESTINATION ${PYTHON_SITE_PACKAGES})
install(FILES python/tbb/__init__.py python/tbb/pool.py python/tbb/test.py python/tbb/__main__.py ${CMAKE_CURRENT_BINARY_DIR}/api.py
DESTINATION ${PYTHON_SITE_PACKAGES}/tbb)
install(TARGETS ${module_target}
DESTINATION ${PYTHON_SITE_PACKAGES}/tbb)
endif()
if(UNIX AND NOT APPLE)
add_library(irml SHARED python/rml/ipc_server.cpp python/rml/ipc_utils.cpp src/tbb/cache_aligned_allocator.cpp src/tbb/dynamic_link.cpp src/tbb/tbb_misc_ex.cpp src/tbb/tbb_misc.cpp)
target_compile_definitions(irml PRIVATE DO_ITT_NOTIFY=0 USE_PTHREAD=1)
target_link_libraries(irml PRIVATE tbb)
set_target_properties(irml PROPERTIES VERSION 1)
if (TBB_INSTALL_TARGETS)
install(TARGETS irml DESTINATION ${TBB_INSTALL_LIBRARY_DIR})
endif()
endif ()
endif ()
endif ()
install(DIRECTORY include/tbb DESTINATION include)


@@ -68,34 +68,18 @@
+
return ret;
}
--- a/libavcodec/rl.c
+++ b/libavcodec/rl.c
@@ -71,17 +71,19 @@
av_cold void ff_rl_init_vlc(RLTable *rl, unsigned static_size)
{
int i, q;
- VLC_TYPE table[1500][2] = {{0}};
+ VLC_TYPE (*table)[2] = av_calloc(sizeof(VLC_TYPE), 1500 * 2);
VLC vlc = { .table = table, .table_allocated = static_size };
- av_assert0(static_size <= FF_ARRAY_ELEMS(table));
+ av_assert0(static_size < 1500);
init_vlc(&vlc, 9, rl->n + 1, &rl->table_vlc[0][1], 4, 2, &rl->table_vlc[0][0], 4, 2, INIT_VLC_USE_NEW_STATIC);
diff --git a/libavcodec/x86/simple_idct.asm b/libavcodec/x86/simple_idct.asm
index dcf0da6df121..982b2f0bbba1 100644
--- a/libavcodec/x86/simple_idct.asm
+++ b/libavcodec/x86/simple_idct.asm
@@ -25,9 +25,9 @@
for (q = 0; q < 32; q++) {
int qmul = q * 2;
int qadd = (q - 1) | 1;
%include "libavutil/x86/x86util.asm"
- if (!rl->rl_vlc[q])
+ if (!rl->rl_vlc[q]){
+ av_free(table);
return;
+ }
-%if ARCH_X86_32
SECTION_RODATA
if (q == 0) {
qmul = 1;
@@ -113,4 +115,5 @@
rl->rl_vlc[q][i].run = run;
}
}
+ av_free(table);
}
+%if ARCH_X86_32
cextern pb_80
wm1010: dw 0, 0xffff, 0, 0xffff


@@ -0,0 +1,15 @@
--- a/mpz/inp_raw.c Tue Dec 22 23:49:51 2020 +0100
+++ b/mpz/inp_raw.c Thu Oct 21 19:06:49 2021 +0200
@@ -88,8 +88,11 @@
abs_csize = ABS (csize);
+ if (UNLIKELY (abs_csize > ~(mp_bitcnt_t) 0 / 8))
+ return 0; /* Bit size overflows */
+
/* round up to a multiple of limbs */
- abs_xsize = BITS_TO_LIMBS (abs_csize*8);
+ abs_xsize = BITS_TO_LIMBS ((mp_bitcnt_t) abs_csize * 8);
if (abs_xsize != 0)
{


@@ -0,0 +1,27 @@
diff --git a/numpy/distutils/system_info.py b/numpy/distutils/system_info.py
index ba2b1f4..b10f7df 100644
--- a/numpy/distutils/system_info.py
+++ b/numpy/distutils/system_info.py
@@ -2164,8 +2164,8 @@ class accelerate_info(system_info):
'accelerate' in libraries):
if intel:
args.extend(['-msse3'])
- else:
- args.extend(['-faltivec'])
+# else:
+# args.extend(['-faltivec'])
args.extend([
'-I/System/Library/Frameworks/vecLib.framework/Headers'])
link_args.extend(['-Wl,-framework', '-Wl,Accelerate'])
@@ -2174,8 +2174,8 @@ class accelerate_info(system_info):
'veclib' in libraries):
if intel:
args.extend(['-msse3'])
- else:
- args.extend(['-faltivec'])
+# else:
+# args.extend(['-faltivec'])
args.extend([
'-I/System/Library/Frameworks/vecLib.framework/Headers'])
link_args.extend(['-Wl,-framework', '-Wl,vecLib'])


@@ -0,0 +1,40 @@
diff -Naur oidn-1.3.0/cmake/FindTBB.cmake external_openimagedenoise/cmake/FindTBB.cmake
--- oidn-1.3.0/cmake/FindTBB.cmake 2021-02-04 16:20:26 -0700
+++ external_openimagedenoise/cmake/FindTBB.cmake 2021-02-12 09:35:53 -0700
@@ -332,20 +332,22 @@
${TBB_ROOT}/lib/${TBB_ARCH}/${TBB_VCVER}
${TBB_ROOT}/lib
)
-
# On Windows, also search the DLL so that the client may install it.
file(GLOB DLL_NAMES
${TBB_ROOT}/bin/${TBB_ARCH}/${TBB_VCVER}/${LIB_NAME}.dll
${TBB_ROOT}/bin/${LIB_NAME}.dll
+ ${TBB_ROOT}/lib/${LIB_NAME}.dll
${TBB_ROOT}/redist/${TBB_ARCH}/${TBB_VCVER}/${LIB_NAME}.dll
${TBB_ROOT}/redist/${TBB_ARCH}/${TBB_VCVER}/${LIB_NAME_GLOB1}.dll
${TBB_ROOT}/redist/${TBB_ARCH}/${TBB_VCVER}/${LIB_NAME_GLOB2}.dll
${TBB_ROOT}/../redist/${TBB_ARCH}/tbb/${TBB_VCVER}/${LIB_NAME}.dll
${TBB_ROOT}/../redist/${TBB_ARCH}_win/tbb/${TBB_VCVER}/${LIB_NAME}.dll
)
- list(GET DLL_NAMES 0 DLL_NAME)
- get_filename_component(${BIN_DIR_VAR} "${DLL_NAME}" DIRECTORY)
- set(${DLL_VAR} "${DLL_NAME}" CACHE PATH "${COMPONENT_NAME} ${BUILD_CONFIG} dll path")
+ if (DLL_NAMES)
+ list(GET DLL_NAMES 0 DLL_NAME)
+ get_filename_component(${BIN_DIR_VAR} "${DLL_NAME}" DIRECTORY)
+ set(${DLL_VAR} "${DLL_NAME}" CACHE PATH "${COMPONENT_NAME} ${BUILD_CONFIG} dll path")
+ endif()
elseif(APPLE)
set(LIB_PATHS ${TBB_ROOT}/lib)
else()
--- external_openimagedenoise/cmake/oidn_ispc.cmake 2021-02-15 17:29:34.000000000 +0100
+++ external_openimagedenoise/cmake/oidn_ispc.cmake2 2021-02-15 17:29:28.000000000 +0100
@@ -98,7 +98,7 @@
elseif(OIDN_ARCH STREQUAL "ARM64")
set(ISPC_ARCHITECTURE "aarch64")
if(APPLE)
- set(ISPC_TARGET_OS "--target-os=ios")
+ set(ISPC_TARGET_OS "--target-os=macos")
endif()
endif()


@@ -130,3 +130,28 @@ index 715d903..24423ce 100644
{
string id = node.attribute("id").value();
size_t line = node.line();
diff -Naur a/CMakeLists.txt b/CMakeLists.txt
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -274,7 +274,7 @@
add_subdirectory(${EXTERNAL_LIBRARIES}/UTF)
add_subdirectory(common/libBuffer)
add_subdirectory(${EXTERNAL_LIBRARIES}/MathMLSolver)
-add_subdirectory(${EXTERNAL_LIBRARIES}/zlib)
+#add_subdirectory(${EXTERNAL_LIBRARIES}/zlib)
# building OpenCOLLADA libs
add_subdirectory(COLLADABaseUtils)
@@ -284,10 +284,10 @@
add_subdirectory(COLLADAStreamWriter)
# building COLLADAValidator app
-add_subdirectory(COLLADAValidator)
+#add_subdirectory(COLLADAValidator)
# DAE validator app
-add_subdirectory(DAEValidator)
+#add_subdirectory(DAEValidator)
# Library export
install(EXPORT LibraryExport DESTINATION ${OPENCOLLADA_INST_CMAKECONFIG} FILE OpenCOLLADATargets.cmake)


@@ -34,3 +34,24 @@ diff -Naur orig/src/include/OpenImageIO/platform.h external_openimageio/src/incl
# include <windows.h>
#endif
diff -Naur orig/src/libutil/ustring.cpp external_openimageio/src/libutil/ustring.cpp
--- orig/src/libutil/ustring.cpp 2020-05-11 05:43:52.000000000 +0200
+++ external_openimageio/src/libutil/ustring.cpp 2020-11-26 12:06:08.000000000 +0100
@@ -337,6 +337,8 @@
// the std::string to make it point to our chars! In such a case, the
// destructor will be careful not to allow a deallocation.
+ // Disable internal std::string for Apple silicon based Macs
+#if !(defined(__APPLE__) && defined(__arm64__))
#if defined(__GNUC__) && !defined(_LIBCPP_VERSION) \
&& defined(_GLIBCXX_USE_CXX11_ABI) && _GLIBCXX_USE_CXX11_ABI
// NEW gcc ABI
@@ -382,7 +384,7 @@
return;
}
#endif
-
+#endif
// Remaining cases - just assign the internal string. This may result
// in double allocation for the chars. If you care about that, do
// something special for your platform, much like we did for gcc and


@@ -1,8 +1,47 @@
diff -Naur OpenShadingLanguage-Release-1.9.9/src/cmake/flexbison.cmake.rej external_osl/src/cmake/flexbison.cmake.rej
--- OpenShadingLanguage-Release-1.9.9/src/cmake/flexbison.cmake.rej 1969-12-31 17:00:00 -0700
+++ external_osl/src/cmake/flexbison.cmake.rej 2018-08-24 17:42:11 -0600
@@ -0,0 +1,11 @@
+--- src/cmake/flexbison.cmake 2018-05-01 16:39:02 -0600
++++ src/cmake/flexbison.cmake 2018-08-24 10:24:03 -0600
+@@ -77,7 +77,7 @@
+ DEPENDS ${${compiler_headers}}
+ WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR} )
+ ADD_CUSTOM_COMMAND ( OUTPUT ${flexoutputcxx}
+- COMMAND ${FLEX_EXECUTABLE} -o ${flexoutputcxx} "${CMAKE_CURRENT_SOURCE_DIR}/${flexsrc}"
++ COMMAND ${FLEX_EXECUTABLE} ${FLEX_EXTRA_OPTIONS} -o ${flexoutputcxx} "${CMAKE_CURRENT_SOURCE_DIR}/${flexsrc}"
+ MAIN_DEPENDENCY ${flexsrc}
+ DEPENDS ${${compiler_headers}}
+ WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR} )
diff -Naur OpenShadingLanguage-Release-1.9.9/src/include/OSL/llvm_util.h external_osl/src/include/OSL/llvm_util.h
--- OpenShadingLanguage-Release-1.9.9/src/include/OSL/llvm_util.h 2018-05-01 16:39:02 -0600
+++ external_osl/src/include/OSL/llvm_util.h 2018-08-25 14:05:00 -0600
@@ -33,6 +33,8 @@
+set (USE_OIIO_STATIC ON CACHE BOOL "If OIIO is built static")
+if (USE_OIIO_STATIC)
+ add_definitions ("-DOIIO_STATIC_BUILD=1")
+ add_definitions ("-DOIIO_STATIC_DEFINE=1")
+endif ()
set (OSL_NO_DEFAULT_TEXTURESYSTEM OFF CACHE BOOL "Do not use create a raw OIIO::TextureSystem")
if (OSL_NO_DEFAULT_TEXTURESYSTEM)
diff -Naur OpenShadingLanguage-1.12.6.2/src/cmake/externalpackages.cmake external_osl/src/cmake/externalpackages.cmake
--- OpenShadingLanguage-1.12.6.2/src/cmake/externalpackages.cmake 2022-09-30 17:43:53 -0600
+++ external_osl/src/cmake/externalpackages.cmake 2022-10-15 14:49:26 -0600
@@ -77,6 +77,7 @@
checked_find_package (ZLIB REQUIRED) # Needed by several packages
+checked_find_package (PNG REQUIRED) # Needed since OIIO needs it
# IlmBase & OpenEXR
checked_find_package (OpenEXR REQUIRED
diff -Naur OpenShadingLanguage-1.12.6.2/src/include/OSL/llvm_util.h external_osl/src/include/OSL/llvm_util.h
--- OpenShadingLanguage-1.12.6.2/src/include/OSL/llvm_util.h 2022-09-30 17:43:53 -0600
+++ external_osl/src/include/OSL/llvm_util.h 2022-10-15 15:37:24 -0600
@@ -9,6 +9,8 @@
#include <unordered_set>
#include <vector>
+#define OSL_HAS_BLENDER_CLEANUP_FIX
@@ -10,26 +49,14 @@ diff -Naur OpenShadingLanguage-Release-1.9.9/src/include/OSL/llvm_util.h externa
#ifdef LLVM_NAMESPACE
namespace llvm = LLVM_NAMESPACE;
#endif
@@ -487,6 +489,7 @@
std::string func_name (llvm::Function *f);
static size_t total_jit_memory_held ();
+ static void Cleanup ();
private:
class MemoryManager;
diff -Naur OpenShadingLanguage-Release-1.9.9/src/liboslexec/llvm_util.cpp external_osl/src/liboslexec/llvm_util.cpp
--- OpenShadingLanguage-Release-1.9.9/src/liboslexec/llvm_util.cpp 2018-05-01 16:39:02 -0600
+++ external_osl/src/liboslexec/llvm_util.cpp 2018-08-25 14:04:27 -0600
@@ -140,7 +140,10 @@
};
@@ -455,7 +457,7 @@
llvm::BasicBlock* masked_return_block() const;
bool is_masking_required() const { return m_is_masking_required; }
-
+void LLVM_Util::Cleanup ()
+{
+ if(jitmm_hold) jitmm_hold->clear();
+}
+ static void Cleanup ();
struct ScopedMasking {
ScopedMasking() {}
size_t
LLVM_Util::total_jit_memory_held ()
@@ -48,50 +75,19 @@ diff -Naur org/CMakeLists.txt external_osl/CMakeLists.txt
set (OSL_NO_DEFAULT_TEXTURESYSTEM OFF CACHE BOOL "Do not use create a raw OIIO::TextureSystem")
if (OSL_NO_DEFAULT_TEXTURESYSTEM)
diff --git a/CMakeLists.txt b/CMakeLists.txt
index 990f50d69..46ef7351d 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -252,11 +252,9 @@ install (EXPORT OSL_EXPORTED_TARGETS
FILE ${OSL_TARGETS_EXPORT_NAME}
NAMESPACE ${PROJECT_NAME}::)
-
-
-
-osl_add_all_tests()
-
+if (${PROJECT_NAME}_BUILD_TESTS AND NOT ${PROJECT_NAME}_IS_SUBPROJECT)
+ osl_add_all_tests()
+endif ()
if (NOT ${PROJECT_NAME}_IS_SUBPROJECT)
include (packaging)
diff -Naur external_osl_orig/src/cmake/externalpackages.cmake external_osl/src/cmake/externalpackages.cmake
--- external_osl_orig/src/cmake/externalpackages.cmake 2021-06-01 13:44:18 -0600
+++ external_osl/src/cmake/externalpackages.cmake 2021-06-28 07:44:32 -0600
@@ -80,6 +80,7 @@
checked_find_package (ZLIB REQUIRED) # Needed by several packages
+checked_find_package (PNG REQUIRED) # Needed since OIIO needs it
# IlmBase & OpenEXR
checked_find_package (OpenEXR REQUIRED
diff -Naur external_osl_orig/src/liboslcomp/oslcomp.cpp external_osl/src/liboslcomp/oslcomp.cpp
--- external_osl_orig/src/liboslcomp/oslcomp.cpp 2021-06-01 13:44:18 -0600
+++ external_osl/src/liboslcomp/oslcomp.cpp 2021-06-28 09:11:06 -0600
@@ -21,6 +21,13 @@
#if !defined(__STDC_CONSTANT_MACROS)
# define __STDC_CONSTANT_MACROS 1
diff --git a/src/liboslexec/llvm_util.cpp b/src/liboslexec/llvm_util.cpp
index 445f6400..3d468de2 100644
--- a/src/liboslexec/llvm_util.cpp
+++ b/src/liboslexec/llvm_util.cpp
@@ -3430,8 +3430,9 @@ LLVM_Util::call_function (llvm::Value *func, cspan<llvm::Value *> args)
#endif
//llvm_gen_debug_printf (std::string("start ") + std::string(name));
#if OSL_LLVM_VERSION >= 110
- OSL_DASSERT(llvm::isa<llvm::Function>(func));
- llvm::Value *r = builder().CreateCall(llvm::cast<llvm::Function>(func), llvm::ArrayRef<llvm::Value *>(args.data(), args.size()));
+ llvm::Value* r = builder().CreateCall(
+ llvm::cast<llvm::FunctionType>(func->getType()->getPointerElementType()), func,
+ llvm::ArrayRef<llvm::Value*>(args.data(), args.size()));
#else
llvm::Value *r = builder().CreateCall (func, llvm::ArrayRef<llvm::Value *>(args.data(), args.size()));
#endif
+
+// clang uses CALLBACK in its templates which causes issues if it is already defined
+#ifdef _WIN32 && defined(CALLBACK)
+# undef CALLBACK
+#endif
+
+//
#include <clang/Basic/TargetInfo.h>
#include <clang/Frontend/CompilerInstance.h>
#include <clang/Frontend/TextDiagnosticPrinter.h>


@@ -1,42 +0,0 @@
--- src/Makefile.in 2017-09-26 01:28:47.000000000 +0300
+++ src/Makefile.in 2017-09-26 01:19:06.000000000 +0300
@@ -513,7 +513,7 @@
libcommon_la_SOURCES = common.c file_io.c command.c pcm.c ulaw.c alaw.c \
float32.c double64.c ima_adpcm.c ms_adpcm.c gsm610.c dwvw.c vox_adpcm.c \
interleave.c strings.c dither.c cart.c broadcast.c audio_detect.c \
- ima_oki_adpcm.c ima_oki_adpcm.h alac.c chunk.c ogg.c chanmap.c \
+ ima_oki_adpcm.c ima_oki_adpcm.h alac.c chunk.c ogg.c chanmap.c \
windows.c id3.c $(WIN_VERSION_FILE)
@@ -719,10 +719,10 @@
$(AM_V_CCLD)$(LINK) $(GSM610_libgsm_la_OBJECTS) $(GSM610_libgsm_la_LIBADD) $(LIBS)
libcommon.la: $(libcommon_la_OBJECTS) $(libcommon_la_DEPENDENCIES) $(EXTRA_libcommon_la_DEPENDENCIES)
- $(AM_V_CCLD)$(LINK) $(libcommon_la_OBJECTS) $(libcommon_la_LIBADD) $(LIBS)
+ $(AM_V_CCLD)$(LINK) $(libcommon_la_OBJECTS) $(libcommon_la_LIBADD) $(LIBS) $(EXTERNAL_XIPH_LIBS)
libsndfile.la: $(libsndfile_la_OBJECTS) $(libsndfile_la_DEPENDENCIES) $(EXTRA_libsndfile_la_DEPENDENCIES)
- $(AM_V_CCLD)$(libsndfile_la_LINK) -rpath $(libdir) $(libsndfile_la_OBJECTS) $(libsndfile_la_LIBADD) $(LIBS)
+ $(AM_V_CCLD)$(libsndfile_la_LINK) -rpath $(libdir) $(libsndfile_la_OBJECTS) $(libsndfile_la_LIBADD) $(LIBS) $(EXTERNAL_XIPH_LIBS)
clean-checkPROGRAMS:
@list='$(check_PROGRAMS)'; test -n "$$list" || exit 0; \
@@ -924,7 +924,7 @@
@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(libsndfile_la_CPPFLAGS) $(CPPFLAGS) $(libsndfile_la_CFLAGS) $(CFLAGS) -c -o libsndfile_la-dwd.lo `test -f 'dwd.c' || echo '$(srcdir)/'`dwd.c
libsndfile_la-flac.lo: flac.c
-@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(libsndfile_la_CPPFLAGS) $(CPPFLAGS) $(libsndfile_la_CFLAGS) $(CFLAGS) -MT libsndfile_la-flac.lo -MD -MP -MF $(DEPDIR)/libsndfile_la-flac.Tpo -c -o libsndfile_la-flac.lo `test -f 'flac.c' || echo '$(srcdir)/'`flac.c
+@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(libsndfile_la_CPPFLAGS) $(CPPFLAGS) $(libsndfile_la_CFLAGS) $(CFLAGS) $(EXTERNAL_XIPH_CFLAGS) -MT libsndfile_la-flac.lo -MD -MP -MF $(DEPDIR)/libsndfile_la-flac.Tpo -c -o libsndfile_la-flac.lo `test -f 'flac.c' || echo '$(srcdir)/'`flac.c
@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/libsndfile_la-flac.Tpo $(DEPDIR)/libsndfile_la-flac.Plo
@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='flac.c' object='libsndfile_la-flac.lo' libtool=yes @AMDEPBACKSLASH@
@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@
@@ -1092,7 +1092,7 @@
@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(libsndfile_la_CPPFLAGS) $(CPPFLAGS) $(libsndfile_la_CFLAGS) $(CFLAGS) -c -o libsndfile_la-rf64.lo `test -f 'rf64.c' || echo '$(srcdir)/'`rf64.c
libsndfile_la-ogg_vorbis.lo: ogg_vorbis.c
-@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(libsndfile_la_CPPFLAGS) $(CPPFLAGS) $(libsndfile_la_CFLAGS) $(CFLAGS) -MT libsndfile_la-ogg_vorbis.lo -MD -MP -MF $(DEPDIR)/libsndfile_la-ogg_vorbis.Tpo -c -o libsndfile_la-ogg_vorbis.lo `test -f 'ogg_vorbis.c' || echo '$(srcdir)/'`ogg_vorbis.c
+@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(libsndfile_la_CPPFLAGS) $(CPPFLAGS) $(libsndfile_la_CFLAGS) $(CFLAGS) $(EXTERNAL_XIPH_CFLAGS) -MT libsndfile_la-ogg_vorbis.lo -MD -MP -MF $(DEPDIR)/libsndfile_la-ogg_vorbis.Tpo -c -o libsndfile_la-ogg_vorbis.lo `test -f 'ogg_vorbis.c' || echo '$(srcdir)/'`ogg_vorbis.c
@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/libsndfile_la-ogg_vorbis.Tpo $(DEPDIR)/libsndfile_la-ogg_vorbis.Plo
@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='ogg_vorbis.c' object='libsndfile_la-ogg_vorbis.lo' libtool=yes @AMDEPBACKSLASH@
@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@


@@ -1,14 +0,0 @@
Only in external_sqlite_orig: config.log
diff -ru external_sqlite_orig/config.sub external_sqlite/config.sub
--- external_sqlite_orig/config.sub 2020-07-10 14:06:42.000000000 +0200
+++ external_sqlite/config.sub 2020-07-10 14:10:24.000000000 +0200
@@ -314,6 +314,7 @@
# Recognize the basic CPU types with company name.
580-* \
| a29k-* \
+ | aarch64-* \
| alpha-* | alphaev[4-8]-* | alphaev56-* | alphaev6[78]-* \
| alpha64-* | alpha64ev[4-8]-* | alpha64ev56-* | alpha64ev6[78]-* \
| alphapca5[67]-* | alpha64pca5[67]-* | arc-* \
Only in external_sqlite: mksourceid
Only in external_sqlite: sqlite3session.h


@@ -0,0 +1,10 @@
--- ./test/v3ext.c 2022-07-05 11:08:33.000000000 +0200
+++ ./test/v3ext.c 2022-10-18 13:58:05.000000000 +0200
@@ -8,6 +8,7 @@
*/
#include <stdio.h>
+#include <string.h>
#include <openssl/x509.h>
#include <openssl/x509v3.h>
#include <openssl/pem.h>


@@ -10,15 +10,4 @@ index 7a8d06a0..886699d8 100644
+ #if (__cplusplus >= 201402L && (!defined(_MSC_VER) || _MSC_VER >= 1920))
#define __TBB_DEPRECATED [[deprecated]]
#define __TBB_DEPRECATED_MSG(msg) [[deprecated(msg)]]
#elif _MSC_VER
--- a/src/tbb/tools_api/ittnotify_config.h
+++ b/src/tbb/tools_api/ittnotify_config.h
@@ -162,7 +162,7 @@
# define ITT_ARCH ITT_ARCH_IA32E
# elif defined _M_IA64 || defined __ia64__
# define ITT_ARCH ITT_ARCH_IA64
-# elif defined _M_ARM || defined __arm__
+# elif defined _M_ARM || defined __arm__ || defined __aarch64__
# define ITT_ARCH ITT_ARCH_ARM
# elif defined __powerpc64__
# define ITT_ARCH ITT_ARCH_PPC64
#elif _MSC_VER


@@ -4,7 +4,7 @@
# Some are omitted here because they have special meanings below.
1750a | 580 \
| a29k \
+ | aarch64 \
+ | aarch64 \
| alpha | alphaev[4-8] | alphaev56 | alphaev6[78] | alphapca5[67] \
| alpha64 | alpha64ev[4-8] | alpha64ev56 | alpha64ev6[78] | alpha64pca5[67] \
| arc | arm | arm[bl]e | arme[lb] | armv[2345] | armv[345][lb] | avr \
@@ -12,7 +12,7 @@
# Recognize the basic CPU types with company name.
580-* \
| a29k-* \
+ | aarch64-* \
+ | aarch64-* \
| alpha-* | alphaev[4-8]-* | alphaev56-* | alphaev6[78]-* \
| alpha64-* | alpha64ev[4-8]-* | alpha64ev56-* | alpha64ev6[78]-* \
| alphapca5[67]-* | alpha64pca5[67]-* | arc-* \


@@ -53,147 +53,3 @@ diff -ru USD-20.11/pxr/base/tf/pxrLZ4/lz4.cpp external_usd/pxr/base/tf/pxrLZ4/lz
/*-******************************
* Compression functions
From 442d087962f762deeb8b6e49a0955753fcf9aeb9 Mon Sep 17 00:00:00 2001
From: Tsahi Zidenberg <tsahee@amazon.com>
Date: Sun, 15 Nov 2020 15:18:24 +0000
Subject: [PATCH 1/2] stackTrace: support aarch64/linux
stacktrace calls syscall directly via assembler. Create compatible
aarch64 code.
---
pxr/base/arch/stackTrace.cpp | 30 ++++++++++++++++++++++++------
1 file changed, 24 insertions(+), 6 deletions(-)
diff --git a/pxr/base/arch/stackTrace.cpp b/pxr/base/arch/stackTrace.cpp
index dcc1dfd46..c11aabeb1 100644
--- a/pxr/base/arch/stackTrace.cpp
+++ b/pxr/base/arch/stackTrace.cpp
@@ -583,7 +583,6 @@ nonLockingLinux__execve (const char *file,
char *const argv[],
char *const envp[])
{
-#if defined(ARCH_BITS_64)
/*
* We make a direct system call here, because we can't find an
* execve which corresponds with the non-locking fork we call
@@ -594,7 +593,27 @@ nonLockingLinux__execve (const char *file,
* hangs in a threaded app. (We use the non-locking fork to get
* around problems with forking when we have had memory
* corruption.) whew.
- *
+ */
+
+ unsigned long result;
+
+#if defined (__aarch64__)
+ {
+ register long __file_result asm ("x0") = (long)file;
+ register char* const* __argv asm ("x1") = argv;
+ register char* const* __envp asm ("x2") = envp;
+ register long __num_execve asm ("x8") = 221;
+ __asm__ __volatile__ (
+ "svc 0"
+ : "=r" (__file_result)
+ : "r"(__num_execve), "r" (__file_result), "r" (__argv), "r" (__envp)
+ : "memory"
+ );
+ result = __file_result;
+ }
+#elif defined(ARCH_CPU_INTEL) && defined(ARCH_BITS_64)
+
+ /*
* %rdi, %rsi, %rdx, %rcx, %r8, %r9 are args 0-5
* syscall clobbers %rcx and %r11
*
@@ -603,7 +622,6 @@ nonLockingLinux__execve (const char *file,
* constraints to gcc.
*/
- unsigned long result;
__asm__ __volatile__ (
"mov %0, %%rdi \n\t"
"mov %%rcx, %%rsi \n\t"
@@ -614,6 +632,9 @@ nonLockingLinux__execve (const char *file,
: "0" (file), "c" (argv), "d" (envp)
: "memory", "cc", "r11"
);
+#else
+#error Unknown architecture
+#endif
if (result >= 0xfffffffffffff000) {
errno = -result;
@@ -621,9 +642,6 @@ nonLockingLinux__execve (const char *file,
}
return result;
-#else
-#error Unknown architecture
-#endif
}
#endif
From a1dffe02519bb3c6ccbbe8c6c58304da5db98995 Mon Sep 17 00:00:00 2001
From: Tsahi Zidenberg <tsahee@amazon.com>
Date: Sun, 15 Nov 2020 15:22:52 +0000
Subject: [PATCH 2/2] timing: support aarch64/linux
The aarch64 arch-timer is directly accessible to userspace via two
registers:
CNTVCT_EL0 - holds the current counter value
CNTFRQ_EL0 - holds the counter frequency (in Hz)
---
pxr/base/arch/timing.cpp | 6 ++++++
pxr/base/arch/timing.h | 6 +++++-
2 files changed, 11 insertions(+), 1 deletion(-)
diff --git a/pxr/base/arch/timing.cpp b/pxr/base/arch/timing.cpp
index 27ad58fed..9022950c1 100644
--- a/pxr/base/arch/timing.cpp
+++ b/pxr/base/arch/timing.cpp
@@ -59,6 +59,11 @@ ARCH_HIDDEN
void
Arch_InitTickTimer()
{
+#ifdef __aarch64__
+ uint64_t counter_hz;
+ __asm __volatile("mrs %0, CNTFRQ_EL0" : "=&r" (counter_hz));
+ Arch_NanosecondsPerTick = double(1e9) / double(counter_hz);
+#else
// NOTE: Normally ifstream would be cleaner, but it causes crashes when
// used in conjunction with DSOs and the Intel Compiler.
FILE *in;
@@ -135,6 +140,7 @@ Arch_InitTickTimer()
}
Arch_NanosecondsPerTick = double(1e9) / double(cpuHz);
+#endif
}
#elif defined(ARCH_OS_WINDOWS)
diff --git a/pxr/base/arch/timing.h b/pxr/base/arch/timing.h
index 67ec0d15f..6dc3e85a0 100644
--- a/pxr/base/arch/timing.h
+++ b/pxr/base/arch/timing.h
@@ -36,7 +36,7 @@
/// \addtogroup group_arch_SystemFunctions
///@{
-#if defined(ARCH_OS_LINUX)
+#if defined(ARCH_OS_LINUX) && defined(ARCH_CPU_INTEL)
#include <x86intrin.h>
#elif defined(ARCH_OS_DARWIN)
#include <mach/mach_time.h>
@@ -69,6 +69,10 @@ ArchGetTickTime()
#elif defined(ARCH_CPU_INTEL)
// On Intel we'll use the rdtsc instruction.
return __rdtsc();
+#elif defined (__aarch64__)
+ uint64_t result;
+ __asm __volatile("mrs %0, CNTVCT_EL0" : "=&r" (result));
+ return result;
#else
#error Unknown architecture.
#endif


@@ -79,9 +79,6 @@ set STAGING=%BUILD_DIR%\S
rem for python module build
set MSSdk=1
set DISTUTILS_USE_SDK=1
rem if you let pip pick its own build dirs, it'll stick it somewhere deep inside the user profile
rem and cython will refuse to link due to a path that gets too long.
set TMPDIR=c:\t\
rem for python externals source to be shared between the various archs and compilers
mkdir %BUILD_DIR%\downloads\externals


@@ -1,4 +1,70 @@
Buildbot Configuration
=====================
Blender Buildbot
================
Files used by Buildbot's `compile-code` step.
Code signing
------------
Code signing is done as part of the INSTALL target, which makes it possible to sign
files which end up in a bundle but come from a non-signed source (such as the
libraries SVN).
This is achieved by specifying `worker_codesign.cmake` as a post-install script
run by CMake. This CMake script simply invokes a utility script written in
Python which takes care of the actual signing.
### Configuration
Client configuration doesn't need anything special, other than the variable
`SHARED_STORAGE_DIR` pointing to a location which is watched by the server.
This is done in the `config_builder.py` file and is stored in Git (which makes it
possible to have almost zero-configuration buildbot machines).
Server configuration requires copying `config_server_template.py` to
`config_server.py` and tweaking the values, which are platform-specific.
#### Windows configuration
There are two things needed on Windows in order for code signing to work
(a placeholder example follows the list):
- `TIMESTAMP_AUTHORITY_URL`, which is most likely set to http://timestamp.digicert.com
- `CERTIFICATE_FILEPATH`, which is a full file path to a PKCS #12 key (.pfx).
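As a rough illustration only (the authoritative set of settings is defined by
`config_server_template.py`, and the values below are placeholders), a Windows
`config_server.py` could look like:
```
TIMESTAMP_AUTHORITY_URL = 'http://timestamp.digicert.com'
CERTIFICATE_FILEPATH = 'C:\\codesign\\certificate.pfx'
```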
## Tips
### Self-signed certificate on Windows
It is easiest to test the configuration using a self-signed certificate.
The certificate manipulation utilities come with the Windows SDK.
Unfortunately, they are not added to PATH. Here is an example of how to make
sure they are easily available:
```
set PATH=C:\Program Files (x86)\Windows Kits\10\App Certification Kit;%PATH%
set PATH=C:\Program Files (x86)\Windows Kits\10\bin\10.0.18362.0\x64;%PATH%
```
Generate CA:
```
makecert -r -pe -n "CN=Blender Test CA" -ss CA -sr CurrentUser -a sha256 ^
-cy authority -sky signature -sv BlenderTestCA.pvk BlenderTestCA.cer
```
Import the generated CA:
```
certutil -user -addstore Root BlenderTestCA.cer
```
Create self-signed certificate and pack it into PKCS #12:
```
makecert -pe -n "CN=Blender Test SPC" -a sha256 -cy end ^
-sky signature ^
-ic BlenderTestCA.cer -iv BlenderTestCA.pvk ^
-sv BlenderTestSPC.pvk BlenderTestSPC.cer
pvk2pfx -pvk BlenderTestSPC.pvk -spc BlenderTestSPC.cer -pfx BlenderTestSPC.pfx
```


@@ -0,0 +1,127 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import argparse
import os
import re
import subprocess
import sys
def is_tool(name):
"""Check whether `name` is on PATH and marked as executable."""
# from whichcraft import which
from shutil import which
return which(name) is not None
class Builder:
def __init__(self, name, branch, codesign):
self.name = name
self.branch = branch
self.is_release_branch = re.match("^blender-v(.*)-release$", branch) is not None
self.codesign = codesign
# Buildbot runs from build/ directory
self.blender_dir = os.path.abspath(os.path.join('..', 'blender.git'))
self.build_dir = os.path.abspath(os.path.join('..', 'build'))
self.install_dir = os.path.abspath(os.path.join('..', 'install'))
self.upload_dir = os.path.abspath(os.path.join('..', 'install'))
# Detect platform
if name.startswith('mac'):
self.platform = 'mac'
self.command_prefix = []
elif name.startswith('linux'):
self.platform = 'linux'
if is_tool('scl'):
self.command_prefix = ['scl', 'enable', 'devtoolset-9', '--']
else:
self.command_prefix = []
elif name.startswith('win'):
self.platform = 'win'
self.command_prefix = []
else:
raise ValueError('Unknown platform for builder ' + name)
# Always 64 bit now
self.bits = 64
def create_builder_from_arguments():
parser = argparse.ArgumentParser()
parser.add_argument('builder_name')
parser.add_argument('branch', default='master', nargs='?')
parser.add_argument("--codesign", action="store_true")
args = parser.parse_args()
return Builder(args.builder_name, args.branch, args.codesign)
class VersionInfo:
def __init__(self, builder):
# Get version information
buildinfo_h = os.path.join(builder.build_dir, "source", "creator", "buildinfo.h")
blender_h = os.path.join(builder.blender_dir, "source", "blender", "blenkernel", "BKE_blender_version.h")
version_number = int(self._parse_header_file(blender_h, 'BLENDER_VERSION'))
version_number_patch = int(self._parse_header_file(blender_h, 'BLENDER_VERSION_PATCH'))
version_numbers = (version_number // 100, version_number % 100, version_number_patch)
self.short_version = "%d.%02d" % (version_numbers[0], version_numbers[1])
self.version = "%d.%02d.%d" % version_numbers
self.version_cycle = self._parse_header_file(blender_h, 'BLENDER_VERSION_CYCLE')
self.hash = self._parse_header_file(buildinfo_h, 'BUILD_HASH')[1:-1]
if self.version_cycle == "release":
# Final release
self.full_version = self.version
self.is_development_build = False
elif self.version_cycle == "rc":
# Release candidate
self.full_version = self.version + self.version_cycle
self.is_development_build = False
else:
# Development build
self.full_version = self.version + '-' + self.hash
self.is_development_build = True
def _parse_header_file(self, filename, define):
import re
regex = re.compile(r"^#\s*define\s+%s\s+(.*)" % define)
with open(filename, "r") as file:
for l in file:
match = regex.match(l)
if match:
return match.group(1)
return None
def call(cmd, env=None, exit_on_error=True):
print(' '.join(cmd))
# Flush to ensure correct order output on Windows.
sys.stdout.flush()
sys.stderr.flush()
retcode = subprocess.call(cmd, env=env)
if exit_on_error and retcode != 0:
sys.exit(retcode)
return retcode
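# A minimal sketch (not from the original file) of how the helpers above can be
# composed; the cmake invocation below is purely hypothetical.
if __name__ == '__main__':
    builder = create_builder_from_arguments()
    # Respect the per-platform command prefix (e.g. `scl enable devtoolset-9 --` on Linux).
    call(builder.command_prefix + ['cmake', '--build', builder.build_dir])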


@@ -0,0 +1,81 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
from dataclasses import dataclass
from pathlib import Path
from typing import List
@dataclass
class AbsoluteAndRelativeFileName:
"""
Helper class which keeps track of absolute file path for a direct access and
corresponding relative path against given base.
The relative part is used to construct a file name within an archive which
contains files which are to be signed or which have been signed already
(depending on whether the archive is addressed to the signing server or back
to the buildbot worker).
"""
# Base directory which is where relative_filepath is relative to.
base_dir: Path
# Full absolute path of the corresponding file.
absolute_filepath: Path
# Derived from full file path, contains part of the path which is relative
# to a desired base path.
relative_filepath: Path
def __init__(self, base_dir: Path, filepath: Path):
self.base_dir = base_dir
self.absolute_filepath = filepath.resolve()
self.relative_filepath = self.absolute_filepath.relative_to(
self.base_dir)
@classmethod
def from_path(cls, path: Path) -> 'AbsoluteAndRelativeFileName':
assert path.is_absolute()
assert path.is_file()
base_dir = path.parent
return AbsoluteAndRelativeFileName(base_dir, path)
@classmethod
def recursively_from_directory(cls, base_dir: Path) \
-> List['AbsoluteAndRelativeFileName']:
"""
Create list of AbsoluteAndRelativeFileName for all the files in the
given directory.
NOTE: Results will be pointing to resolved paths.
"""
assert base_dir.is_absolute()
assert base_dir.is_dir()
base_dir = base_dir.resolve()
result = []
for filename in base_dir.glob('**/*'):
if not filename.is_file():
continue
result.append(AbsoluteAndRelativeFileName(base_dir, filename))
return result
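# A minimal usage sketch (not part of the original file); the file name below is hypothetical.
if __name__ == '__main__':
    import tempfile

    with tempfile.TemporaryDirectory() as tmp_dir:
        base_dir = Path(tmp_dir).resolve()
        (base_dir / 'blender.exe').write_bytes(b'')

        # Wrap a single file: relative_filepath becomes 'blender.exe'.
        single = AbsoluteAndRelativeFileName.from_path(base_dir / 'blender.exe')
        print(single.relative_filepath)

        # Collect every file under the directory, keeping base-relative paths.
        for item in AbsoluteAndRelativeFileName.recursively_from_directory(base_dir):
            print(item.absolute_filepath, '->', item.relative_filepath)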


@@ -0,0 +1,245 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import dataclasses
import json
import os
from pathlib import Path
from typing import Optional
import codesign.util as util
class ArchiveStateError(Exception):
message: str
def __init__(self, message):
self.message = message
super().__init__(self.message)
@dataclasses.dataclass
class ArchiveState:
"""
Additional information (state) of the archive
Includes information like the expected file size of the archive file in the case
the archive file is expected to be successfully created.
If the archive can not be created, this state will contain an error message
indicating details of the error.
"""
# Size in bytes of the corresponding archive.
file_size: Optional[int] = None
# Non-empty value indicates that an error has happened.
error_message: str = ''
def has_error(self) -> bool:
"""
Check whether the archive is at error state
"""
return bool(self.error_message)
def serialize_to_string(self) -> str:
payload = dataclasses.asdict(self)
return json.dumps(payload, sort_keys=True, indent=4)
def serialize_to_file(self, filepath: Path) -> None:
string = self.serialize_to_string()
filepath.write_text(string)
@classmethod
def deserialize_from_string(cls, string: str) -> 'ArchiveState':
try:
object_as_dict = json.loads(string)
except json.decoder.JSONDecodeError:
raise ArchiveStateError('Error parsing JSON')
return cls(**object_as_dict)
@classmethod
def deserialize_from_file(cls, filepath: Path):
string = filepath.read_text()
return cls.deserialize_from_string(string)
class ArchiveWithIndicator:
"""
The idea of this class is to wrap around logic which takes care of keeping
track of a name of an archive and synchronization routines between buildbot
worker and signing server.
The synchronization is done based on creating a special file after the
archive file is known to be ready for access.
"""
# Base directory where the archive is stored (basically, the dirname() of
# the absolute archive file name).
#
# For example, 'X:\\TEMP\\'.
base_dir: Path
# Absolute file name of the archive.
#
# For example, 'X:\\TEMP\\FOO.ZIP'.
archive_filepath: Path
# Absolute name of a file which acts as an indication of the fact that the
# archive is ready and is available for access.
#
# This is how synchronization between buildbot worker and signing server is
# done:
# - First, the archive is created under archive_filepath name.
# - Second, the indication file is created under ready_indicator_filepath
# name.
# - Third, the colleague of whoever created the indicator watches for
# the indication file to appear, and once it's there it accesses the
# archive.
ready_indicator_filepath: Path
def __init__(
self, base_dir: Path, archive_name: str, ready_indicator_name: str):
"""
Construct the object from given base directory and name of the archive
file:
ArchiveWithIndicator(Path('X:\\TEMP'), 'FOO.ZIP', 'INPUT_READY')
"""
self.base_dir = base_dir
self.archive_filepath = self.base_dir / archive_name
self.ready_indicator_filepath = self.base_dir / ready_indicator_name
def is_ready_unsafe(self) -> bool:
"""
Check whether the archive is ready for access.
No guarding against possible network failures is done here.
"""
if not self.ready_indicator_filepath.exists():
return False
try:
archive_state = ArchiveState.deserialize_from_file(
self.ready_indicator_filepath)
except ArchiveStateError as error:
print(f'Error deserializing archive state: {error.message}')
return False
if archive_state.has_error():
# If an error happened during the codesign procedure there will be no
# corresponding archive file.
# The caller code will deal with the error further.
return True
# Sometimes on macOS the indicator file appears prior to the actual archive
# despite the order of creation and the os.sync() used in tag_ready().
# So consider the archive not ready if there is an indicator without an
# actual archive.
if not self.archive_filepath.exists():
print('Found indicator without actual archive, waiting for archive '
f'({self.archive_filepath}) to appear.')
return False
# Wait until the archive is fully stored.
actual_archive_size = self.archive_filepath.stat().st_size
if actual_archive_size != archive_state.file_size:
print('Partial/invalid archive size (expected '
f'{archive_state.file_size} got {actual_archive_size})')
return False
return True
def is_ready(self) -> bool:
"""
Check whether the archive is ready for access.
Will tolerate possible network failures: if there is a network failure
or if there is still no proper permission on a file, False is returned.
"""
# There is an intermittent problem happening at random which
# translates to "OSError : [WinError 59] An unexpected network error occurred".
# Some reports suggest it might be due to lack of permissions on the file,
# which might be applicable in our case since it's possible that the file is
# initially created with non-accessible permissions and gets chmod-ed
# after initial creation.
try:
return self.is_ready_unsafe()
except OSError as e:
print(f'Exception checking archive: {e}')
return False
def tag_ready(self, error_message='') -> None:
"""
Tag the archive as ready by creating the corresponding indication file.
NOTE: It is expected that the archive was never tagged as ready before
and that there are no subsequent tags of the same archive.
If it is violated, an assert will fail.
"""
assert not self.is_ready()
# Try our best to make sure everything is synced to the file system,
# to avoid any possibility of the stamp appearing on a network share prior
# to the actual file.
if util.get_current_platform() != util.Platform.WINDOWS:
os.sync()
archive_size = -1
if self.archive_filepath.exists():
archive_size = self.archive_filepath.stat().st_size
archive_info = ArchiveState(
file_size=archive_size, error_message=error_message)
self.ready_indicator_filepath.write_text(
archive_info.serialize_to_string())
def get_state(self) -> ArchiveState:
"""
Get state object for this archive
The state is read from the corresponding state file.
"""
try:
return ArchiveState.deserialize_from_file(self.ready_indicator_filepath)
except ArchiveStateError as error:
return ArchiveState(error_message=f'Error in information format: {error}')
def clean(self) -> None:
"""
Remove both archive and the ready indication file.
"""
util.ensure_file_does_not_exist_or_die(self.ready_indicator_filepath)
util.ensure_file_does_not_exist_or_die(self.archive_filepath)
def is_fully_absent(self) -> bool:
"""
Check whether both archive and its ready indicator are absent.
Is used for a sanity check during code signing process by both
buildbot worker and signing server.
"""
return (not self.archive_filepath.exists() and
not self.ready_indicator_filepath.exists())
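# Illustrative sketch (hypothetical helper names and paths, not part of the
# production flow): how the two sides of the handshake described above could
# use ArchiveWithIndicator. One side writes the archive and tags it as ready;
# the other side polls for the indicator, inspects the state and cleans up
# once it has consumed the data.
def _example_producer_side(shared_dir: Path) -> None:
    archive = ArchiveWithIndicator(shared_dir, 'request.tar', 'request.ready')
    # ... write shared_dir / 'request.tar' here ...
    archive.tag_ready()  # Creates 'request.ready' with size/error information.
def _example_consumer_side(shared_dir: Path) -> None:
    import time  # Local import, only needed by this sketch.
    archive = ArchiveWithIndicator(shared_dir, 'request.tar', 'request.ready')
    while not archive.is_ready():
        time.sleep(1)
    state = archive.get_state()
    if state.has_error():
        print(f'Request failed: {state.error_message}')
    # ... consume shared_dir / 'request.tar' here ...
    archive.clean()  # Removes both the archive and its READY indicator.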


@@ -0,0 +1,501 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Signing process overview.
#
# From buildbot worker side:
# - Files which need to be signed are collected from either a directory (to
# sign all signable files in there) or by the filename of a single file to sign.
# - Those files get packed into an archive and stored in a location which is
# watched by the signing server.
# - A marker READY file is created which indicates the archive is ready for
# access.
# - Wait for the server to provide an archive with signed files.
# This is done by watching for the READY file which corresponds to an archive
# coming from the signing server.
# - Unpack the signed files from the archive and replace the original ones.
#
# From code sign server:
# - Watch a special location for a READY file which indicates that there is an
# archive with files which are to be signed.
# - Unpack the archive to a temporary location.
# - Run codesign tool and make sure all the files are signed.
# - Pack the signed files and store them in a location which is watched by
# the buildbot worker.
# - Create a READY file which indicates that the archive with signed files is
# ready.
import abc
import logging
import shutil
import subprocess
import time
import tarfile
import uuid
from pathlib import Path
from tempfile import TemporaryDirectory
from typing import Iterable, List
import codesign.util as util
from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName
from codesign.archive_with_indicator import ArchiveWithIndicator
from codesign.exception import CodeSignException
logger = logging.getLogger(__name__)
logger_builder = logger.getChild('builder')
logger_server = logger.getChild('server')
def pack_files(files: Iterable[AbsoluteAndRelativeFileName],
archive_filepath: Path) -> None:
"""
Create tar archive from given files for the signing pipeline.
Is used by buildbot worker to create an archive of files which are to be
signed, and by signing server to send signed files back to the worker.
"""
with tarfile.TarFile.open(archive_filepath, 'w') as tar_file_handle:
for file_info in files:
tar_file_handle.add(file_info.absolute_filepath,
arcname=file_info.relative_filepath)
def extract_files(archive_filepath: Path,
extraction_dir: Path) -> None:
"""
Extract all files from the given archive into the given directory.
"""
# TODO(sergey): Verify files in the archive have relative path.
with tarfile.TarFile.open(archive_filepath, mode='r') as tar_file_handle:
tar_file_handle.extractall(path=extraction_dir)
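# One possible way to address the TODO above (an illustrative sketch with a
# hypothetical helper name, not used by the pipeline): reject archive members
# whose paths could escape the extraction directory before calling extractall().
def _verify_archive_members_are_relative(archive_filepath: Path) -> None:
    with tarfile.TarFile.open(archive_filepath, mode='r') as tar_file_handle:
        for member in tar_file_handle.getmembers():
            member_path = Path(member.name)
            if member_path.is_absolute() or '..' in member_path.parts:
                raise ValueError(f'Unsafe path in archive: {member.name}')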
class BaseCodeSigner(metaclass=abc.ABCMeta):
"""
Base class for a platform-specific signer of binaries.
Contains all the logic shared across platform-specific implementations, such
as synchronization and notification logic.
Platform specific bits (such as actual command for signing the binary) are
to be implemented as a subclass.
Provides utilities for code signing as a whole, including functionality needed
by a signing server and a buildbot worker.
The signer and builder may run on separate machines; the only requirement is
that they have access to a directory which is shared between them. For
security reasons this directory is to be hosted on a separate machine (or as
a Shared Folder configuration in VirtualBox). This directory might be
mounted under different base paths, but its underlying storage is to be
the same.
The code signer is short-lived on a buildbot worker side, and is living
forever on a code signing server side.
"""
# TODO(sergey): Find a neat way to have config annotated.
# config: Config
# Storage directory where builder puts files which are requested to be
# signed.
# Consider this an input of the code signing server.
unsigned_storage_dir: Path
# Storage where signed files are stored.
# Consider this an output of the code signer server.
signed_storage_dir: Path
# Platform the code is currently executing on.
platform: util.Platform
def __init__(self, config):
self.config = config
absolute_shared_storage_dir = config.SHARED_STORAGE_DIR.resolve()
# Unsigned (signing server input) configuration.
self.unsigned_storage_dir = absolute_shared_storage_dir / 'unsigned'
# Signed (signing server output) configuration.
self.signed_storage_dir = absolute_shared_storage_dir / 'signed'
self.platform = util.get_current_platform()
def cleanup_environment_for_builder(self) -> None:
# TODO(sergey): Revisit need of cleaning up the existing files.
# In practice it wasn't so helpful, and with multiple clients
# talking to the same server it becomes even more tricky.
pass
def cleanup_environment_for_signing_server(self) -> None:
# TODO(sergey): Revisit need of cleaning up the existing files.
# In practice it wasn't so helpful, and with multiple clients
# talking to the same server it becomes even more tricky.
pass
def generate_request_id(self) -> str:
"""
Generate a unique identifier for a code signing request.
"""
return str(uuid.uuid4())
def archive_info_for_request_id(
self, path: Path, request_id: str) -> ArchiveWithIndicator:
return ArchiveWithIndicator(
path, f'{request_id}.tar', f'{request_id}.ready')
def signed_archive_info_for_request_id(
self, request_id: str) -> ArchiveWithIndicator:
return self.archive_info_for_request_id(
self.signed_storage_dir, request_id)
def unsigned_archive_info_for_request_id(
self, request_id: str) -> ArchiveWithIndicator:
return self.archive_info_for_request_id(
self.unsigned_storage_dir, request_id)
############################################################################
# Buildbot worker side helpers.
@abc.abstractmethod
def check_file_is_to_be_signed(
self, file: AbsoluteAndRelativeFileName) -> bool:
"""
Check whether file is to be signed.
Is used by both single file signing pipeline and recursive directory
signing pipeline.
This is where the code signer is to check whether the file is to be signed
or not. This check might be based on a simple extension test or on an
actual test of whether the file already has a digital signature.
"""
def collect_files_to_sign(self, path: Path) \
-> List[AbsoluteAndRelativeFileName]:
"""
Get all files which need to be signed from the given path.
NOTE: The path might either be a file or directory.
This function is run from the buildbot worker side.
"""
# If there is a single file provided trust the buildbot worker that it
# is eligible for signing.
if path.is_file():
file = AbsoluteAndRelativeFileName.from_path(path)
if not self.check_file_is_to_be_signed(file):
return []
return [file]
all_files = AbsoluteAndRelativeFileName.recursively_from_directory(
path)
files_to_be_signed = [file for file in all_files
if self.check_file_is_to_be_signed(file)]
return files_to_be_signed
def wait_for_signed_archive_or_die(self, request_id) -> None:
"""
Wait until archive with signed files is available.
Will only return if the archive with signed files is available. If there
was an error during the code sign procedure a SystemExit exception is
raised, with the message set to the error reported by the codesign
server.
Will only wait for the configured time. If that time is exceeded and there
is still no response from the signing server the application will exit
with a non-zero exit code.
"""
signed_archive_info = self.signed_archive_info_for_request_id(
request_id)
unsigned_archive_info = self.unsigned_archive_info_for_request_id(
request_id)
timeout_in_seconds = self.config.TIMEOUT_IN_SECONDS
time_start = time.monotonic()
while not signed_archive_info.is_ready():
time.sleep(1)
time_slept_in_seconds = time.monotonic() - time_start
if time_slept_in_seconds > timeout_in_seconds:
signed_archive_info.clean()
unsigned_archive_info.clean()
raise SystemExit("Signing server didn't finish signing in "
f'{timeout_in_seconds} seconds, dying :(')
archive_state = signed_archive_info.get_state()
if archive_state.has_error():
signed_archive_info.clean()
unsigned_archive_info.clean()
raise SystemExit(
f'Error happened during codesign procedure: {archive_state.error_message}')
def copy_signed_files_to_directory(
self, signed_dir: Path, destination_dir: Path) -> None:
"""
Copy all files from signed_dir to destination_dir.
This function will overwrite any existing file. Permissions are copied
from the source files, but other metadata, such as timestamps, are not.
"""
for signed_filepath in signed_dir.glob('**/*'):
if not signed_filepath.is_file():
continue
relative_filepath = signed_filepath.relative_to(signed_dir)
destination_filepath = destination_dir / relative_filepath
destination_filepath.parent.mkdir(parents=True, exist_ok=True)
shutil.copy(signed_filepath, destination_filepath)
def run_buildbot_path_sign_pipeline(self, path: Path) -> None:
"""
Run all steps needed to make given path signed.
Path points to an unsigned file or a directory which contains unsigned
files.
If the path points to a single file then this file will be signed.
This is used to sign a final bundle such as .msi on Windows or .dmg on
macOS.
NOTE: The code signer implementation might actually reject signing the
file, in which case the file will be left unsigned. This is not to be
considered a failure situation; it just might happen when the buildbot
worker can not detect whether signing is really required in a specific
case or not.
If the path points to a directory then code signer will sign all
signable files from it (finding them recursively).
"""
self.cleanup_environment_for_builder()
# Make sure storage directory exists.
self.unsigned_storage_dir.mkdir(parents=True, exist_ok=True)
# Collect all files which need to be signed and pack them into a single
# archive which will be sent to the signing server.
logger_builder.info('Collecting files which are to be signed...')
files = self.collect_files_to_sign(path)
if not files:
logger_builder.info('No files to be signed, ignoring.')
return
logger_builder.info('Found %d files to sign.', len(files))
request_id = self.generate_request_id()
signed_archive_info = self.signed_archive_info_for_request_id(
request_id)
unsigned_archive_info = self.unsigned_archive_info_for_request_id(
request_id)
pack_files(files=files,
archive_filepath=unsigned_archive_info.archive_filepath)
unsigned_archive_info.tag_ready()
# Wait for the signing server to finish signing.
logger_builder.info('Waiting for the signing server to sign the files...')
self.wait_for_signed_archive_or_die(request_id)
# Extract signed files from archive and move files to final location.
with TemporaryDirectory(prefix='blender-buildbot-') as temp_dir_str:
unpacked_signed_files_dir = Path(temp_dir_str)
logger_builder.info('Extracting signed files from archive...')
extract_files(
archive_filepath=signed_archive_info.archive_filepath,
extraction_dir=unpacked_signed_files_dir)
destination_dir = path
if destination_dir.is_file():
destination_dir = destination_dir.parent
self.copy_signed_files_to_directory(
unpacked_signed_files_dir, destination_dir)
logger_builder.info('Removing archive with signed files...')
signed_archive_info.clean()
############################################################################
# Signing server side helpers.
def wait_for_sign_request(self) -> str:
"""
Wait for the buildbot to request signing of an archive.
Returns an identifier of signing request.
"""
# TODO(sergey): Support graceful shutdown on Ctrl-C.
logger_server.info(
f'Waiting for a request directory {self.unsigned_storage_dir} to appear.')
while not self.unsigned_storage_dir.exists():
time.sleep(1)
logger_server.info(
'Waiting for a READY indicator of any signing request.')
request_id = None
while request_id is None:
for file in self.unsigned_storage_dir.iterdir():
if file.suffix != '.ready':
continue
request_id = file.stem
logger_server.info(f'Found READY for request ID {request_id}.')
if request_id is None:
time.sleep(1)
unsigned_archive_info = self.unsigned_archive_info_for_request_id(
request_id)
while not unsigned_archive_info.is_ready():
time.sleep(1)
return request_id
@abc.abstractmethod
def sign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
"""
Sign all files from the given list.
NOTE: Signing should happen in-place.
"""
def run_signing_pipeline(self, request_id: str):
"""
Run the full signing pipeline starting from the point when the buildbot
worker has requested signing.
"""
# Make sure storage directory exists.
self.signed_storage_dir.mkdir(parents=True, exist_ok=True)
with TemporaryDirectory(prefix='blender-codesign-') as temp_dir_str:
temp_dir = Path(temp_dir_str)
signed_archive_info = self.signed_archive_info_for_request_id(
request_id)
unsigned_archive_info = self.unsigned_archive_info_for_request_id(
request_id)
logger_server.info('Extracting unsigned files from archive...')
extract_files(
archive_filepath=unsigned_archive_info.archive_filepath,
extraction_dir=temp_dir)
logger_server.info('Collecting all files which need signing...')
files = AbsoluteAndRelativeFileName.recursively_from_directory(
temp_dir)
logger_server.info('Signing all requested files...')
try:
self.sign_all_files(files)
except CodeSignException as error:
signed_archive_info.tag_ready(error_message=error.message)
unsigned_archive_info.clean()
logger_server.info('Signing is complete with errors.')
return
logger_server.info('Packing signed files...')
pack_files(files=files,
archive_filepath=signed_archive_info.archive_filepath)
signed_archive_info.tag_ready()
logger_server.info('Removing signing request...')
unsigned_archive_info.clean()
logger_server.info('Signing is complete.')
def run_signing_server(self):
logger_server.info('Starting new code signing server...')
self.cleanup_environment_for_signing_server()
logger_server.info('Code signing server is ready')
while True:
logger_server.info('Waiting for the signing request in %s...',
self.unsigned_storage_dir)
request_id = self.wait_for_sign_request()
logger_server.info(
f'Begin signing procedure for request ID {request_id}.')
self.run_signing_pipeline(request_id)
############################################################################
# Command executing.
#
# Abstracted to a degree that allows running commands from a foreign
# platform.
# The goal with this is to allow performing dry-run tests of code signer
# server from other platforms (for example, to test that macOS code signer
# does what it is supposed to after doing a refactor on Linux).
# TODO(sergey): What is the type annotation for the command?
def run_command_or_mock(self, command, platform: util.Platform) -> None:
"""
Run the given command if the current platform matches the given one.
If the platform is different then the command is only printed, allowing
to verify the logic of the code signing process.
"""
if platform != self.platform:
logger_server.info(
f'Will run command for {platform}: {command}')
return
logger_server.info(f'Running command: {command}')
subprocess.run(command)
# TODO(sergey): What is the type annotation for the command?
def check_output_or_mock(self, command,
platform: util.Platform,
allow_nonzero_exit_code=False) -> str:
"""
Run the given command if the current platform matches the given one.
If the platform is different then the command is only printed, allowing
to verify the logic of the code signing process.
If allow_nonzero_exit_code is true then the output will be returned
even if the application quit with a non-zero exit code.
Otherwise a subprocess.CalledProcessError exception will be raised
in such a case.
"""
if platform != self.platform:
logger_server.info(
f'Will run command for {platform}: {command}')
return
if allow_nonzero_exit_code:
process = subprocess.Popen(command,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
output = process.communicate()[0]
return output.decode()
logger_server.info(f'Running command: {command}')
return subprocess.check_output(
command, stderr=subprocess.STDOUT).decode()
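# Illustrative sketch (hypothetical helper name): the mocking behaviour
# described above makes it possible to exercise, say, the macOS signing logic
# from a Linux machine. The command below is only logged when the current
# platform is not macOS, and actually executed otherwise.
def _example_dry_run(signer: BaseCodeSigner) -> None:
    signer.run_command_or_mock(
        ['codesign', '--verify', '--verbose', 'Blender.app'],
        util.Platform.MACOS)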


@@ -0,0 +1,62 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Configuration of a code signer which is specific to the code running from
# buildbot's worker.
import sys
from pathlib import Path
import codesign.util as util
from codesign.config_common import *
platform = util.get_current_platform()
if platform == util.Platform.LINUX:
SHARED_STORAGE_DIR = Path('/data/codesign')
elif platform == util.Platform.WINDOWS:
SHARED_STORAGE_DIR = Path('Z:\\codesign')
elif platform == util.Platform.MACOS:
SHARED_STORAGE_DIR = Path('/Volumes/codesign_macos/codesign')
# https://docs.python.org/3/library/logging.config.html#configuration-dictionary-schema
LOGGING = {
'version': 1,
'formatters': {
'default': {'format': '%(asctime)-15s %(levelname)8s %(name)s %(message)s'}
},
'handlers': {
'console': {
'class': 'logging.StreamHandler',
'formatter': 'default',
'stream': 'ext://sys.stderr',
}
},
'loggers': {
'codesign': {'level': 'INFO'},
},
'root': {
'level': 'WARNING',
'handlers': [
'console',
],
}
}


@@ -0,0 +1,36 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
from pathlib import Path
# Timeout in seconds for the signing process.
#
# This is how long the buildbot packing step will wait for the signing
# server to perform signing.
#
# NOTE: Notarization could take a long time, hence the rather high value
# here. Might consider using different timeout for different platforms.
TIMEOUT_IN_SECONDS = 45 * 60 * 60
# Directory which is shared across buildbot worker and signing server.
#
# This is where worker puts files requested for signing as well as where
# server puts signed files.
SHARED_STORAGE_DIR: Path


@@ -0,0 +1,101 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Configuration of a code signer which is specific to the code signing server.
#
# NOTE: DO NOT put any sensitive information here, put it in an actual
# configuration on the signing machine.
from pathlib import Path
from codesign.config_common import *
CODESIGN_DIRECTORY = Path(__file__).absolute().parent
BLENDER_GIT_ROOT_DIRECTORY = CODESIGN_DIRECTORY.parent.parent.parent
################################################################################
# Common configuration.
# Directory where folders for codesign requests and signed result are stored.
# For example, /data/codesign
SHARED_STORAGE_DIR: Path
################################################################################
# macOS-specific configuration.
MACOS_ENTITLEMENTS_FILE = \
BLENDER_GIT_ROOT_DIRECTORY / 'release' / 'darwin' / 'entitlements.plist'
# Identity of the Developer ID Application certificate which is to be used for
# codesign tool.
# Use `security find-identity -v -p codesigning` to find the identity.
#
# NOTE: This identity is just an example from release/darwin/README.txt.
MACOS_CODESIGN_IDENTITY = 'AE825E26F12D08B692F360133210AF46F4CF7B97'
# User name (Apple ID) which will be used to request notarization.
MACOS_XCRUN_USERNAME = 'me@example.com'
# One-time application password which will be used to request notarization.
MACOS_XCRUN_PASSWORD = '@keychain:altool-password'
# Timeout in seconds within which the notarial office is supposed to reply.
MACOS_NOTARIZE_TIMEOUT_IN_SECONDS = 60 * 60
################################################################################
# Windows-specific configuration.
# URL to the timestamping authority.
WIN_TIMESTAMP_AUTHORITY_URL = 'http://timestamp.digicert.com'
# Full path to the certificate used for signing.
#
# The path and expected file format might vary depending on a platform.
#
# On Windows it is usually a PKCS #12 key (.pfx), so the path will look
# like Path('C:\\Secret\\Blender.pfx').
WIN_CERTIFICATE_FILEPATH: Path
################################################################################
# Logging configuration, common for all platforms.
# https://docs.python.org/3/library/logging.config.html#configuration-dictionary-schema
LOGGING = {
'version': 1,
'formatters': {
'default': {'format': '%(asctime)-15s %(levelname)8s %(name)s %(message)s'}
},
'handlers': {
'console': {
'class': 'logging.StreamHandler',
'formatter': 'default',
'stream': 'ext://sys.stderr',
}
},
'loggers': {
'codesign': {'level': 'INFO'},
},
'root': {
'level': 'WARNING',
'handlers': [
'console',
],
}
}


@@ -18,16 +18,9 @@
# <pep8 compliant>
class CodeSignException(Exception):
message: str
def __init__(self, message):
self.message = message
super().__init__(self.message)
import os
import sys
sys.path.append(os.path.dirname(os.path.realpath(__file__)))
from modules.mesh_test import BlendFileTest
geo_node_test = BlendFileTest("test_object", "expected_object", threshold=1e-4)
result = geo_node_test.run_test()
# Telling `ctest` about the failed test by raising Exception.
if result == False:
raise Exception("Failed {}".format(geo_node_test.test_name))


@@ -0,0 +1,72 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# NOTE: This is a no-op signer (since there isn't really a procedure to sign
# Linux binaries yet). Used to debug and verify the code signing routines on
# a Linux environment.
import logging
from pathlib import Path
from typing import List
from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName
from codesign.base_code_signer import BaseCodeSigner
logger = logging.getLogger(__name__)
logger_server = logger.getChild('server')
class LinuxCodeSigner(BaseCodeSigner):
def is_active(self) -> bool:
"""
Check whether this signer is active.
If it is inactive, no files will be signed.
Is used to be able to debug the code signing pipeline on Linux, where there
is no code signing happening in the actual buildbot and release
environment.
"""
return False
def check_file_is_to_be_signed(
self, file: AbsoluteAndRelativeFileName) -> bool:
if file.relative_filepath == Path('blender'):
return True
if (file.relative_filepath.parts[-3:-1] == ('python', 'bin') and
file.relative_filepath.name.startswith('python')):
return True
if file.relative_filepath.suffix == '.so':
return True
return False
def collect_files_to_sign(self, path: Path) \
-> List[AbsoluteAndRelativeFileName]:
if not self.is_active():
return []
return super().collect_files_to_sign(path)
def sign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
num_files = len(files)
for file_index, file in enumerate(files):
logger.info('Server: Signed file [%d/%d] %s',
file_index + 1, num_files, file.relative_filepath)
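# Illustrative sketch (hypothetical helper name and sample paths): which files
# from a Linux build the rules in check_file_is_to_be_signed() above select.
def _example_linux_rules(signer: LinuxCodeSigner, base_dir: Path) -> None:
    for relative in (Path('blender'),                     # Main executable: signed.
                     Path('2.93/python/bin/python3.9'),   # Bundled Python binary: signed.
                     Path('lib/libexample.so'),           # Shared library: signed.
                     Path('2.93/datafiles/splash.png')):  # Data file: not signed.
        file = AbsoluteAndRelativeFileName(base_dir, base_dir / relative)
        print(relative, signer.check_file_is_to_be_signed(file))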


@@ -0,0 +1,456 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import logging
import re
import stat
import subprocess
import time
from pathlib import Path
from typing import List
import codesign.util as util
from buildbot_utils import Builder
from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName
from codesign.base_code_signer import BaseCodeSigner
from codesign.exception import CodeSignException
logger = logging.getLogger(__name__)
logger_server = logger.getChild('server')
# NOTE: Check is done as filename.endswith(), so keep the dot
EXTENSIONS_TO_BE_SIGNED = {'.dylib', '.so', '.dmg'}
# Prefixes of a file (not directory) name which are to be signed.
# Used to sign extra executable files in Contents/Resources.
NAME_PREFIXES_TO_BE_SIGNED = {'python'}
class NotarizationException(CodeSignException):
pass
def is_file_from_bundle(file: AbsoluteAndRelativeFileName) -> bool:
"""
Check whether file is coming from an .app bundle
"""
parts = file.relative_filepath.parts
if not parts:
return False
if not parts[0].endswith('.app'):
return False
return True
def get_bundle_from_file(
file: AbsoluteAndRelativeFileName) -> AbsoluteAndRelativeFileName:
"""
Get AbsoluteAndRelativeFileName descriptor of bundle
"""
assert(is_file_from_bundle(file))
parts = file.relative_filepath.parts
bundle_name = parts[0]
base_dir = file.base_dir
bundle_filepath = file.base_dir / bundle_name
return AbsoluteAndRelativeFileName(base_dir, bundle_filepath)
def is_bundle_executable_file(file: AbsoluteAndRelativeFileName) -> bool:
"""
Check whether given file is an executable within an app bundle
"""
if not is_file_from_bundle(file):
return False
parts = file.relative_filepath.parts
num_parts = len(parts)
if num_parts < 3:
return False
if parts[1:3] != ('Contents', 'MacOS'):
return False
return True
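# Illustrative sketch (hypothetical helper name and paths): how the two checks
# above classify files relative to a build directory.
def _example_bundle_checks(base_dir: Path) -> None:
    inside = AbsoluteAndRelativeFileName(
        base_dir, base_dir / 'Blender.app' / 'Contents' / 'MacOS' / 'Blender')
    outside = AbsoluteAndRelativeFileName(base_dir, base_dir / 'README.txt')
    print(is_file_from_bundle(inside))        # True: first path part ends with '.app'.
    print(is_bundle_executable_file(inside))  # True: located under Contents/MacOS.
    print(is_file_from_bundle(outside))       # False: not inside a *.app bundle.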
def xcrun_field_value_from_output(field: str, output: str) -> str:
"""
Get value of a given field from xcrun output.
If the field is not found, an empty string is returned.
"""
field_prefix = field + ': '
for line in output.splitlines():
line = line.strip()
if line.startswith(field_prefix):
return line[len(field_prefix):]
return ''
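# Illustrative sketch (hypothetical helper name, shortened sample output): how
# the parser above behaves on a chunk of `xcrun altool` output.
def _example_xcrun_parsing() -> None:
    sample = ('RequestUUID: 11111111-2222-3333-4444-555555555555\n'
              '     Status: success\n'
              '     Status Message: Package Approved\n')
    assert xcrun_field_value_from_output('Status', sample) == 'success'
    assert xcrun_field_value_from_output('Status Message', sample) == 'Package Approved'
    assert xcrun_field_value_from_output('LogFileURL', sample) == ''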
class MacOSCodeSigner(BaseCodeSigner):
def check_file_is_to_be_signed(
self, file: AbsoluteAndRelativeFileName) -> bool:
if file.relative_filepath.name.startswith('.'):
return False
if is_bundle_executable_file(file):
return True
base_name = file.relative_filepath.name
if any(base_name.startswith(prefix)
for prefix in NAME_PREFIXES_TO_BE_SIGNED):
return True
mode = file.absolute_filepath.lstat().st_mode
if mode & stat.S_IXUSR != 0:
file_output = subprocess.check_output(
("file", file.absolute_filepath)).decode()
if "64-bit executable" in file_output:
return True
return file.relative_filepath.suffix in EXTENSIONS_TO_BE_SIGNED
def collect_files_to_sign(self, path: Path) \
-> List[AbsoluteAndRelativeFileName]:
# Include all files when signing an app or dmg bundle: all the files are
# needed to create a valid signature of the bundle.
if path.name.endswith('.app'):
return AbsoluteAndRelativeFileName.recursively_from_directory(path)
if path.is_dir():
files = []
for child in path.iterdir():
if child.name.endswith('.app'):
current_files = AbsoluteAndRelativeFileName.recursively_from_directory(
child)
else:
current_files = super().collect_files_to_sign(child)
for current_file in current_files:
files.append(AbsoluteAndRelativeFileName(
path, current_file.absolute_filepath))
return files
return super().collect_files_to_sign(path)
############################################################################
# Codesign.
def codesign_remove_signature(
self, file: AbsoluteAndRelativeFileName) -> None:
"""
Make sure the given file does not have a codesign signature.
This is needed because codesigning is not possible for a file which
already has a signature.
"""
logger_server.info(
'Removing codesign signature from %s...', file.relative_filepath)
command = ['codesign', '--remove-signature', file.absolute_filepath]
self.run_command_or_mock(command, util.Platform.MACOS)
def codesign_file(
self, file: AbsoluteAndRelativeFileName) -> None:
"""
Sign given file
NOTE: File must not have any signatures.
"""
logger_server.info(
'Codesigning %s...', file.relative_filepath)
entitlements_file = self.config.MACOS_ENTITLEMENTS_FILE
command = ['codesign',
'--timestamp',
'--options', 'runtime',
f'--entitlements={entitlements_file}',
'--sign', self.config.MACOS_CODESIGN_IDENTITY,
file.absolute_filepath]
self.run_command_or_mock(command, util.Platform.MACOS)
def codesign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
"""
Run codesign tool on all eligible files in the given list.
Will ignore all files which are not to be signed. For the rest it will
remove a possible existing signature and add a new one.
"""
num_files = len(files)
have_ignored_files = False
signed_files = []
for file_index, file in enumerate(files):
# Ignore file if it is not to be signed.
# Allows to manually construct ZIP of a bundle and get it signed.
if not self.check_file_is_to_be_signed(file):
logger_server.info(
'Ignoring file [%d/%d] %s',
file_index + 1, num_files, file.relative_filepath)
have_ignored_files = True
continue
logger_server.info(
'Running codesigning routines for file [%d/%d] %s...',
file_index + 1, num_files, file.relative_filepath)
self.codesign_remove_signature(file)
self.codesign_file(file)
signed_files.append(file)
if have_ignored_files:
logger_server.info('Signed %d files:', len(signed_files))
num_signed_files = len(signed_files)
for file_index, signed_file in enumerate(signed_files):
logger_server.info(
'- [%d/%d] %s',
file_index + 1, num_signed_files,
signed_file.relative_filepath)
def codesign_bundles(
self, files: List[AbsoluteAndRelativeFileName]) -> None:
"""
Codesign all .app bundles in the given list of files.
The bundle is deduced from the paths of the files, and every bundle is
only signed once.
"""
signed_bundles = set()
extra_files = []
for file in files:
if not is_file_from_bundle(file):
continue
bundle = get_bundle_from_file(file)
bundle_name = bundle.relative_filepath
if bundle_name in signed_bundles:
continue
logger_server.info('Running codesign routines on bundle %s',
bundle_name)
# It is not possible to remove signature from DMG.
if bundle.relative_filepath.name.endswith('.app'):
self.codesign_remove_signature(bundle)
self.codesign_file(bundle)
signed_bundles.add(bundle_name)
# Codesign on a bundle adds an extra folder with information.
# It needs to be copied to the source.
code_signature_directory = \
bundle.absolute_filepath / 'Contents' / '_CodeSignature'
code_signature_files = \
AbsoluteAndRelativeFileName.recursively_from_directory(
code_signature_directory)
for code_signature_file in code_signature_files:
bundle_relative_file = AbsoluteAndRelativeFileName(
bundle.base_dir,
code_signature_directory /
code_signature_file.relative_filepath)
extra_files.append(bundle_relative_file)
files.extend(extra_files)
############################################################################
# Notarization.
def notarize_get_bundle_id(self, file: AbsoluteAndRelativeFileName) -> str:
"""
Get bundle ID which will be used to notarize DMG
"""
name = file.relative_filepath.name
app_name = name.split('-', 2)[0].lower()
app_name_words = app_name.split()
if len(app_name_words) > 1:
app_name_id = ''.join(word.capitalize() for word in app_name_words)
else:
app_name_id = app_name_words[0]
# TODO(sergey): Consider using "alpha" for buildbot builds.
return f'org.blenderfoundation.{app_name_id}.release'
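# Illustrative example (hypothetical file name): for a DMG named
# 'blender-2.93.11-macos-x64.dmg' the code above takes everything before the
# first dash ('blender'), so the resulting bundle ID is
# 'org.blenderfoundation.blender.release'.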
def notarize_request(self, file) -> str:
"""
Request notarization of the given file.
Returns the UUID of the notarization request. If an error occurred, None
is returned instead of the UUID.
"""
bundle_id = self.notarize_get_bundle_id(file)
logger_server.info('Bundle ID: %s', bundle_id)
logger_server.info('Submitting file to the notarial office.')
command = [
'xcrun', 'altool', '--notarize-app', '--verbose',
'-f', file.absolute_filepath,
'--primary-bundle-id', bundle_id,
'--username', self.config.MACOS_XCRUN_USERNAME,
'--password', self.config.MACOS_XCRUN_PASSWORD]
output = self.check_output_or_mock(
command, util.Platform.MACOS, allow_nonzero_exit_code=True)
for line in output.splitlines():
line = line.strip()
if line.startswith('RequestUUID = '):
request_uuid = line[14:]
return request_uuid
# Check whether the package has been already submitted.
if 'The software asset has already been uploaded.' in line:
request_uuid = re.sub(
r'.*The upload ID is ([A-Fa-f0-9\-]+).*', '\\1', line)
logger_server.warning(
f'The package has been already submitted under UUID {request_uuid}')
return request_uuid
logger_server.error(output)
logger_server.error('xcrun command did not report RequestUUID')
return None
def notarize_review_status(self, xcrun_output: str) -> bool:
"""
Review status returned by xcrun's notarization info
Returns True if the notarization process has finished.
If there are errors during notarization, a NotarizationException
is raised with the status message from the notarial office.
"""
# Parse status and message
status = xcrun_field_value_from_output('Status', xcrun_output)
status_message = xcrun_field_value_from_output(
'Status Message', xcrun_output)
if status == 'success':
logger_server.info(
'Package successfully notarized: %s', status_message)
return True
if status == 'invalid':
logger_server.error(xcrun_output)
logger_server.error(
'Package notarization has failed: %s', status_message)
raise NotarizationException(status_message)
if status == 'in progress':
return False
logger_server.info(
'Unknown notarization status %s (%s)', status, status_message)
return False
def notarize_wait_result(self, request_uuid: str) -> None:
"""
Wait until the notarial office has a reply.
"""
logger_server.info(
'Waiting for a result from the notarization office.')
command = ['xcrun', 'altool',
'--notarization-info', request_uuid,
'--username', self.config.MACOS_XCRUN_USERNAME,
'--password', self.config.MACOS_XCRUN_PASSWORD]
time_start = time.monotonic()
timeout_in_seconds = self.config.MACOS_NOTARIZE_TIMEOUT_IN_SECONDS
while True:
xcrun_output = self.check_output_or_mock(
command, util.Platform.MACOS, allow_nonzero_exit_code=True)
if self.notarize_review_status(xcrun_output):
break
logger_server.info('Keep waiting for notarization office.')
time.sleep(30)
time_slept_in_seconds = time.monotonic() - time_start
if time_slept_in_seconds > timeout_in_seconds:
logger_server.error(
"Notarial office didn't reply in %f seconds.",
timeout_in_seconds)
def notarize_staple(self, file: AbsoluteAndRelativeFileName) -> bool:
"""
Staple notarial label on the file
"""
logger_server.info('Stapling notarial stamp.')
command = ['xcrun', 'stapler', 'staple', '-v', file.absolute_filepath]
self.check_output_or_mock(command, util.Platform.MACOS)
def notarize_dmg(self, file: AbsoluteAndRelativeFileName) -> bool:
"""
Run entire pipeline to get DMG notarized.
"""
logger_server.info('Begin notarization routines on %s',
file.relative_filepath)
# Submit file for notarization.
request_uuid = self.notarize_request(file)
if not request_uuid:
return False
logger_server.info('Received Request UUID: %s', request_uuid)
# Wait for the status from the notarization office.
if not self.notarize_wait_result(request_uuid):
return False
# Staple.
self.notarize_staple(file)
def notarize_all_dmg(
self, files: List[AbsoluteAndRelativeFileName]) -> bool:
"""
Notarize all DMG images from the input.
Images are supposed to be codesigned already.
"""
for file in files:
if not file.relative_filepath.name.endswith('.dmg'):
continue
if not self.check_file_is_to_be_signed(file):
continue
self.notarize_dmg(file)
############################################################################
# Entry point.
def sign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
# TODO(sergey): Handle errors somehow.
self.codesign_all_files(files)
self.codesign_bundles(files)
self.notarize_all_dmg(files)


@@ -0,0 +1,52 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import logging.config
import sys
from pathlib import Path
from typing import Optional
import codesign.config_builder
import codesign.util as util
from codesign.base_code_signer import BaseCodeSigner
class SimpleCodeSigner:
code_signer: Optional[BaseCodeSigner]
def __init__(self):
platform = util.get_current_platform()
if platform == util.Platform.LINUX:
from codesign.linux_code_signer import LinuxCodeSigner
self.code_signer = LinuxCodeSigner(codesign.config_builder)
elif platform == util.Platform.MACOS:
from codesign.macos_code_signer import MacOSCodeSigner
self.code_signer = MacOSCodeSigner(codesign.config_builder)
elif platform == util.Platform.WINDOWS:
from codesign.windows_code_signer import WindowsCodeSigner
self.code_signer = WindowsCodeSigner(codesign.config_builder)
else:
self.code_signer = None
def sign_file_or_directory(self, path: Path) -> None:
logging.config.dictConfig(codesign.config_builder.LOGGING)
self.code_signer.run_buildbot_path_sign_pipeline(path)


@@ -0,0 +1,54 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import sys
from enum import Enum
from pathlib import Path
class Platform(Enum):
LINUX = 1
MACOS = 2
WINDOWS = 3
def get_current_platform() -> Platform:
if sys.platform == 'linux':
return Platform.LINUX
elif sys.platform == 'darwin':
return Platform.MACOS
elif sys.platform == 'win32':
return Platform.WINDOWS
raise Exception(f'Unknown platform {sys.platform}')
def ensure_file_does_not_exist_or_die(filepath: Path) -> None:
"""
If the file exists, unlink it.
If the file path exists and is not a file an assert will trigger.
If the file path does not exists nothing happens.
"""
if not filepath.exists():
return
if not filepath.is_file():
# TODO(sergey): Provide information about what the filepath actually is.
raise SystemExit(f'{filepath} is expected to be a file, but is not')
filepath.unlink()


@@ -0,0 +1,117 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import logging
import subprocess
from pathlib import Path
from typing import List
import codesign.util as util
from buildbot_utils import Builder
from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName
from codesign.base_code_signer import BaseCodeSigner
from codesign.exception import CodeSignException
logger = logging.getLogger(__name__)
logger_server = logger.getChild('server')
# NOTE: Check is done as filename.endswith(), so keep the dot
EXTENSIONS_TO_BE_SIGNED = {'.exe', '.dll', '.pyd', '.msi'}
BLACKLIST_FILE_PREFIXES = (
'api-ms-', 'concrt', 'msvcp', 'ucrtbase', 'vcomp', 'vcruntime')
class SigntoolException(CodeSignException):
pass
class WindowsCodeSigner(BaseCodeSigner):
def check_file_is_to_be_signed(
self, file: AbsoluteAndRelativeFileName) -> bool:
base_name = file.relative_filepath.name
if any(base_name.startswith(prefix)
for prefix in BLACKLIST_FILE_PREFIXES):
return False
return file.relative_filepath.suffix in EXTENSIONS_TO_BE_SIGNED
def get_sign_command_prefix(self) -> List[str]:
return [
'signtool', 'sign', '/v',
'/f', self.config.WIN_CERTIFICATE_FILEPATH,
'/tr', self.config.WIN_TIMESTAMP_AUTHORITY_URL]
def run_codesign_tool(self, filepath: Path) -> None:
command = self.get_sign_command_prefix() + [filepath]
try:
codesign_output = self.check_output_or_mock(command, util.Platform.WINDOWS)
except subprocess.CalledProcessError as e:
raise SigntoolException(f'Error running signtool {e}')
logger_server.info(f'signtool output:\n{codesign_output}')
got_number_of_success = False
for line in codesign_output.split('\n'):
line_clean = line.strip()
line_clean_lower = line_clean.lower()
if line_clean_lower.startswith('number of warnings') or \
line_clean_lower.startswith('number of errors'):
number = int(line_clean_lower.split(':')[1])
if number != 0:
raise SigntoolException('Non-clean success of signtool')
if line_clean_lower.startswith('number of files successfully signed'):
got_number_of_success = True
number = int(line_clean_lower.split(':')[1])
if number != 1:
raise SigntoolException('Signtool did not consider codesign a success')
if not got_number_of_success:
raise SigntoolException('Signtool did not report number of files signed')
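# Illustrative (abridged, hypothetical) signtool summary which the parser
# above accepts as a clean success:
#
#   Successfully signed: C:\build\blender.exe
#
#   Number of files successfully signed: 1
#   Number of warnings: 0
#   Number of errors: 0
#
# Any non-zero warning/error count, or a signed-file count different from 1,
# raises SigntoolException.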
def sign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
# NOTE: Sign files one by one to avoid possible command line length
# overflow (which could happen if we ever decide to sign every binary
# in the install folder, for example).
#
# TODO(sergey): Consider doing batched signing of handful of files in
# one go (but only if this actually known to be much faster).
num_files = len(files)
for file_index, file in enumerate(files):
# Ignore file if it is not to be signed.
# Allows to manually construct ZIP of package and get it signed.
if not self.check_file_is_to_be_signed(file):
logger_server.info(
'Ignoring file [%d/%d] %s',
file_index + 1, num_files, file.relative_filepath)
continue
logger_server.info(
'Running signtool command for file [%d/%d] %s...',
file_index + 1, num_files, file.relative_filepath)
self.run_codesign_tool(file.absolute_filepath)


@@ -0,0 +1,37 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# NOTE: This is a no-op signer (since there isn't really a procedure to sign
# Linux binaries yet). Used to debug and verify the code signing routines on
# a Linux environment.
import logging.config
from pathlib import Path
from typing import List
from codesign.linux_code_signer import LinuxCodeSigner
import codesign.config_server
if __name__ == "__main__":
logging.config.dictConfig(codesign.config_server.LOGGING)
code_signer = LinuxCodeSigner(codesign.config_server)
code_signer.run_signing_server()


@@ -0,0 +1,41 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import logging.config
from pathlib import Path
from typing import List
from codesign.macos_code_signer import MacOSCodeSigner
import codesign.config_server
if __name__ == "__main__":
entitlements_file = codesign.config_server.MACOS_ENTITLEMENTS_FILE
if not entitlements_file.exists():
raise SystemExit(
f'Entitlements file {entitlements_file} does not exist.')
if not entitlements_file.is_file():
raise SystemExit(
f'Entitlements file {entitlements_file} is not a file.')
logging.config.dictConfig(codesign.config_server.LOGGING)
code_signer = MacOSCodeSigner(codesign.config_server)
code_signer.run_signing_server()


@@ -0,0 +1,11 @@
@echo off
rem This is an entry point of the codesign server for Windows.
rem It makes sure that signtool.exe is within the current PATH and can be
rem used by the Python script.
SETLOCAL
set PATH=C:\Program Files (x86)\Windows Kits\10\App Certification Kit;%PATH%
codesign_server_windows.py


@@ -0,0 +1,54 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Implementation of codesign server for Windows.
#
# NOTE: If signtool.exe is not in the PATH use codesign_server_windows.bat
import logging.config
import shutil
from pathlib import Path
from typing import List
import codesign.util as util
from codesign.windows_code_signer import WindowsCodeSigner
import codesign.config_server
if __name__ == "__main__":
logging.config.dictConfig(codesign.config_server.LOGGING)
logger = logging.getLogger(__name__)
logger_server = logger.getChild('server')
# TODO(sergey): Consider moving such sanity checks into
# CodeSigner.check_environment_or_die().
if not shutil.which('signtool.exe'):
if util.get_current_platform() == util.Platform.WINDOWS:
raise SystemExit("signtool.exe is not found in %PATH%")
logger_server.info(
'signtool.exe not found, '
'but will not be used on this foreign platform')
code_signer = WindowsCodeSigner(codesign.config_server)
code_signer.run_signing_server()


@@ -0,0 +1,551 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
import argparse
import re
import shutil
import subprocess
import sys
import time
from pathlib import Path
from tempfile import TemporaryDirectory, NamedTemporaryFile
from typing import List
BUILDBOT_DIRECTORY = Path(__file__).absolute().parent
CODESIGN_SCRIPT = BUILDBOT_DIRECTORY / 'worker_codesign.py'
BLENDER_GIT_ROOT_DIRECTORY = BUILDBOT_DIRECTORY.parent.parent
DARWIN_DIRECTORY = BLENDER_GIT_ROOT_DIRECTORY / 'release' / 'darwin'
# Extra size which is added on top of actual files size when estimating size
# of the destination DMG.
EXTRA_DMG_SIZE_IN_BYTES = 800 * 1024 * 1024
################################################################################
# Common utilities
def get_directory_size(root_directory: Path) -> int:
"""
Get size of directory on disk
"""
total_size = 0
for file in root_directory.glob('**/*'):
total_size += file.lstat().st_size
return total_size
################################################################################
# DMG bundling specific logic
def create_argument_parser():
parser = argparse.ArgumentParser()
parser.add_argument(
'source_dir',
type=Path,
help='Source directory which points to either an existing .app bundle '
'or to a directory with .app bundles.')
parser.add_argument(
'--background-image',
type=Path,
help="Optional background picture which will be set on the DMG."
"If not provided default Blender's one is used.")
parser.add_argument(
'--volume-name',
type=str,
help='Optional name of a volume which will be used for DMG.')
parser.add_argument(
'--dmg',
type=Path,
help='Optional argument which points to a final DMG file name.')
parser.add_argument(
'--applescript',
type=Path,
help="Optional path to applescript to set up folder looks of DMG."
"If not provided default Blender's one is used.")
parser.add_argument(
'--codesign',
action="store_true",
help="Code sign and notarize DMG contents.")
return parser
def collect_app_bundles(source_dir: Path) -> List[Path]:
"""
Collect all app bundles which are to be put into DMG
If the source directory points to FOO.app it will be the only app bundle
packed.
Otherwise all .app bundles from given directory are placed to a single
DMG.
"""
if source_dir.name.endswith('.app'):
return [source_dir]
app_bundles = []
for filename in source_dir.glob('*'):
if not filename.is_dir():
continue
if not filename.name.endswith('.app'):
continue
app_bundles.append(filename)
return app_bundles
def collect_and_log_app_bundles(source_dir: Path) -> List[Path]:
app_bundles = collect_app_bundles(source_dir)
if not app_bundles:
print('No app bundles found for packing')
return
print(f'Found {len(app_bundles)} app bundle(s) to pack:')
for app_bundle in app_bundles:
print(f'- {app_bundle}')
return app_bundles
def estimate_dmg_size(app_bundles: List[Path]) -> int:
"""
Estimate size of DMG to hold requested app bundles
The size is based on actual size of all files in all bundles plus some
space to compensate for different size-on-disk plus some space to hold
codesign signatures.
It is better to err on the high side since the empty space is compressed,
but a lack of space might cause silent failures later on.
"""
app_bundles_size = 0
for app_bundle in app_bundles:
app_bundles_size += get_directory_size(app_bundle)
return app_bundles_size + EXTRA_DMG_SIZE_IN_BYTES
def copy_app_bundles_to_directory(app_bundles: List[Path],
directory: Path) -> None:
"""
Copy all bundles to a given directory
This directory is what the DMG will be created from.
"""
for app_bundle in app_bundles:
print(f'Copying {app_bundle.name}...')
shutil.copytree(app_bundle, directory / app_bundle.name)
def get_main_app_bundle(app_bundles: List[Path]) -> Path:
"""
Get the main application bundle for the installation
"""
return app_bundles[0]
def create_dmg_image(app_bundles: List[Path],
dmg_filepath: Path,
volume_name: str) -> None:
"""
Create DMG disk image and put app bundles in it
No DMG configuration or codesigning is happening here.
"""
if dmg_filepath.exists():
print(f'Removing existing writable DMG {dmg_filepath}...')
dmg_filepath.unlink()
print('Preparing directory with app bundles for the DMG...')
with TemporaryDirectory(prefix='blender-dmg-content-') as content_dir_str:
# Copy all bundles to a clean directory.
content_dir = Path(content_dir_str)
copy_app_bundles_to_directory(app_bundles, content_dir)
# Estimate size of the DMG.
dmg_size = estimate_dmg_size(app_bundles)
print(f'Estimated DMG size: {dmg_size:,} bytes.')
# Create the DMG.
print(f'Creating writable DMG {dmg_filepath}')
command = ('hdiutil',
'create',
'-size', str(dmg_size),
'-fs', 'HFS+',
'-srcfolder', content_dir,
'-volname', volume_name,
'-format', 'UDRW',
dmg_filepath)
subprocess.run(command)
def get_writable_dmg_filepath(dmg_filepath: Path):
"""
Get file path for writable DMG image
"""
parent = dmg_filepath.parent
return parent / (dmg_filepath.stem + '-temp.dmg')
def mount_readwrite_dmg(dmg_filepath: Path) -> None:
"""
Mount writable DMG
The mount point will be /Volumes/<volume name>
"""
print(f'Mounting read-write DMG {dmg_filepath}')
command = ('hdiutil',
'attach', '-readwrite',
'-noverify',
'-noautoopen',
dmg_filepath)
subprocess.run(command)
def get_mount_directory_for_volume_name(volume_name: str) -> Path:
"""
Get directory under which the volume will be mounted
"""
return Path('/Volumes') / volume_name
def eject_volume(volume_name: str) -> None:
"""
Eject given volume, if mounted
"""
mount_directory = get_mount_directory_for_volume_name(volume_name)
if not mount_directory.exists():
return
mount_directory_str = str(mount_directory)
print(f'Ejecting volume {volume_name}')
# Figure out which device to eject.
mount_output = subprocess.check_output(['mount']).decode()
device = ''
for line in mount_output.splitlines():
if f'on {mount_directory_str} (' not in line:
continue
tokens = line.split(' ', 3)
if len(tokens) < 3:
continue
if tokens[1] != 'on':
continue
if device:
raise Exception(
f'Multiple devices found for mounting point {mount_directory}')
device = tokens[0]
if not device:
raise Exception(
f'No device found for mounting point {mount_directory}')
print(f'{mount_directory} is mounted as device {device}, ejecting...')
subprocess.run(['diskutil', 'eject', device])
def copy_background_if_needed(background_image_filepath: Path,
mount_directory: Path) -> None:
"""
Copy background to the DMG
If the background image is not specified it will not be copied.
"""
if not background_image_filepath:
print('No background image provided.')
return
print(f'Copying background image {background_image_filepath}')
destination_dir = mount_directory / '.background'
destination_dir.mkdir(exist_ok=True)
destination_filepath = destination_dir / background_image_filepath.name
shutil.copy(background_image_filepath, destination_filepath)
def create_applications_link(mount_directory: Path) -> None:
"""
Create link to /Applications in the given location
"""
print('Creating link to /Applications')
command = ('ln', '-s', '/Applications', mount_directory / ' ')
subprocess.run(command)
def run_applescript(applescript: Path,
volume_name: str,
app_bundles: List[Path],
background_image_filepath: Path) -> None:
"""
Run given applescript to adjust look and feel of the DMG
"""
main_app_bundle = get_main_app_bundle(app_bundles)
with NamedTemporaryFile(
mode='w', suffix='.applescript') as temp_applescript:
print('Adjusting applescript for volume name...')
# Adjust script to the specific volume name.
with open(applescript, mode='r') as input:
for line in input.readlines():
stripped_line = line.strip()
if stripped_line.startswith('tell disk'):
line = re.sub('tell disk ".*"',
f'tell disk "{volume_name}"',
line)
elif stripped_line.startswith('set background picture'):
if not background_image_filepath:
continue
else:
background_image_short = \
'.background:' + background_image_filepath.name
line = re.sub('to file ".*"',
f'to file "{background_image_short}"',
line)
line = line.replace('blender.app', main_app_bundle.name)
temp_applescript.write(line)
temp_applescript.flush()
print('Running applescript...')
command = ('osascript', temp_applescript.name)
subprocess.run(command)
print('Waiting for applescript...')
# NOTE: This is copied from bundle.sh. The exact reason for the sleep
# remains a mystery.
time.sleep(5)
def codesign(subject: Path):
"""
Codesign file or directory
NOTE: For DMG it will also notarize.
"""
command = (CODESIGN_SCRIPT, subject)
subprocess.run(command)
def codesign_app_bundles_in_dmg(mount_directory: str) -> None:
"""
Code sign all binaries and bundles in the mounted directory
"""
print(f'Codesigning all app bundles in {mount_directory}')
codesign(mount_directory)
def codesign_and_notarize_dmg(dmg_filepath: Path) -> None:
"""
Run codesign and notarization pipeline on the DMG
"""
print(f'Codesigning and notarizing DMG {dmg_filepath}')
codesign(dmg_filepath)
def compress_dmg(writable_dmg_filepath: Path,
final_dmg_filepath: Path) -> None:
"""
Compress temporary read-write DMG
"""
command = ('hdiutil', 'convert',
writable_dmg_filepath,
'-format', 'UDZO',
'-o', final_dmg_filepath)
if final_dmg_filepath.exists():
print(f'Removing old compressed DMG {final_dmg_filepath}')
final_dmg_filepath.unlink()
print('Compressing disk image...')
subprocess.run(command)
def create_final_dmg(app_bundles: List[Path],
dmg_filepath: Path,
background_image_filepath: Path,
volume_name: str,
applescript: Path,
codesign: bool) -> None:
"""
Create DMG with all app bundles
Takes care of configuring the background, signing all binaries and app bundles,
and notarizing the DMG.
"""
print('Running all routines to create final DMG')
writable_dmg_filepath = get_writable_dmg_filepath(dmg_filepath)
mount_directory = get_mount_directory_for_volume_name(volume_name)
# Make sure volume is not mounted.
# If it is mounted it will prevent removing old DMG files and could make
# it so app bundles are copied to the wrong place.
eject_volume(volume_name)
create_dmg_image(app_bundles, writable_dmg_filepath, volume_name)
mount_readwrite_dmg(writable_dmg_filepath)
# Run codesign first, prior to copying anything else.
#
# This allows recursing into the content of the bundles without worrying about
# possible interference from the Applications symlink.
if codesign:
codesign_app_bundles_in_dmg(mount_directory)
copy_background_if_needed(background_image_filepath, mount_directory)
create_applications_link(mount_directory)
run_applescript(applescript, volume_name, app_bundles,
background_image_filepath)
print('Ejecting read-write DMG image...')
eject_volume(volume_name)
compress_dmg(writable_dmg_filepath, dmg_filepath)
writable_dmg_filepath.unlink()
if codesign:
codesign_and_notarize_dmg(dmg_filepath)
def ensure_dmg_extension(filepath: Path) -> Path:
"""
Make sure the given file has a .dmg extension
"""
if filepath.suffix != '.dmg':
return filepath.with_suffix(f'{filepath.suffix}.dmg')
return filepath
def get_dmg_filepath(requested_name: Path, app_bundles: List[Path]) -> Path:
"""
Get full file path for the final DMG image
Will use the provided one when possible, otherwise will deduce it from the
app bundles.
If the name is deduced, the DMG is stored in the current directory.
"""
if requested_name:
return ensure_dmg_extension(requested_name.absolute())
# TODO(sergey): This is not necessarily the main one.
main_bundle = app_bundles[0]
# Strip .app from the name
return Path(main_bundle.name[:-4] + '.dmg').absolute()
def get_background_image(requested_background_image: Path) -> Path:
"""
Get effective filepath for the background image
"""
if requested_background_image:
return requested_background_image.absolute()
return DARWIN_DIRECTORY / 'background.tif'
def get_applescript(requested_applescript: Path) -> Path:
"""
Get effective filepath for the applescript
"""
if requested_applescript:
return requested_applescript.absolute()
return DARWIN_DIRECTORY / 'blender.applescript'
def get_volume_name_from_dmg_filepath(dmg_filepath: Path) -> str:
"""
Deduce the volume name from the DMG path
Will use the first part of the DMG file name prior to the dash.
"""
tokens = dmg_filepath.stem.split('-')
words = tokens[0].split()
return ' '.join(word.capitalize() for word in words)
def get_volume_name(requested_volume_name: str,
dmg_filepath: Path) -> str:
"""
Get effective name for DMG volume
"""
if requested_volume_name:
return requested_volume_name
return get_volume_name_from_dmg_filepath(dmg_filepath)
def main():
parser = create_argument_parser()
args = parser.parse_args()
# Get normalized input parameters.
source_dir = args.source_dir.absolute()
background_image_filepath = get_background_image(args.background_image)
applescript = get_applescript(args.applescript)
codesign = args.codesign
app_bundles = collect_and_log_app_bundles(source_dir)
if not app_bundles:
return
dmg_filepath = get_dmg_filepath(args.dmg, app_bundles)
volume_name = get_volume_name(args.volume_name, dmg_filepath)
print(f'Will produce DMG "{dmg_filepath.name}" (without quotes)')
create_final_dmg(app_bundles,
dmg_filepath,
background_image_filepath,
volume_name,
applescript,
codesign)
if __name__ == "__main__":
main()
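
For reference, the pipeline above reduces to a pair of hdiutil invocations: create a writable UDRW image from a staging directory, then convert it into a compressed UDZO image. A minimal standalone sketch of just that part, with hypothetical paths, no codesigning, and no applescript styling; unlike the script above, it omits -size and lets hdiutil size the image from -srcfolder:

import subprocess
from pathlib import Path

def build_plain_dmg(content_dir: Path, volume_name: str, final_dmg: Path) -> None:
    # Create a writable HFS+ image from the staging directory, then compress
    # it into the final read-only UDZO image, mirroring create_dmg_image()
    # and compress_dmg() above.
    writable_dmg = final_dmg.parent / (final_dmg.stem + '-temp.dmg')
    subprocess.run(['hdiutil', 'create',
                    '-fs', 'HFS+',
                    '-srcfolder', str(content_dir),
                    '-volname', volume_name,
                    '-format', 'UDRW', str(writable_dmg)], check=True)
    subprocess.run(['hdiutil', 'convert', str(writable_dmg),
                    '-format', 'UDZO', '-o', str(final_dmg)], check=True)
    writable_dmg.unlink()
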


@@ -0,0 +1,44 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# This is a script which is used as a POST-INSTALL step for the regular
# CMake INSTALL target.
# It is used by buildbot workers to sign every binary which is going into
# the final bundle.
# On Windows Python 3 only provides python.exe, there is no python3.exe.
#
# On other platforms it is possible to have python2 and python3, and a
# symbolic link from python to either of them. So on those platforms use
# an explicit Python version.
if(WIN32)
set(PYTHON_EXECUTABLE python)
else()
set(PYTHON_EXECUTABLE python3)
endif()
execute_process(
COMMAND ${PYTHON_EXECUTABLE} "${CMAKE_CURRENT_LIST_DIR}/worker_codesign.py"
"${CMAKE_INSTALL_PREFIX}"
WORKING_DIRECTORY ${CMAKE_CURRENT_LIST_DIR}
RESULT_VARIABLE exit_code
)
if(NOT exit_code EQUAL "0")
message(FATAL_ERROR "Non-zero exit code of codesign tool")
endif()
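
The same interpreter selection and exit-code check could be written directly in Python; a rough sketch, assuming the codesign script sits next to it and using a placeholder for the install prefix:

import subprocess
import sys
from pathlib import Path

# On Windows only python.exe exists; elsewhere use an explicit python3,
# matching the WIN32 branch in the CMake snippet above.
python_exe = 'python' if sys.platform == 'win32' else 'python3'
codesign_script = Path(__file__).parent / 'worker_codesign.py'
install_prefix = '/path/to/install/prefix'  # placeholder for CMAKE_INSTALL_PREFIX

result = subprocess.run([python_exe, str(codesign_script), install_prefix])
if result.returncode != 0:
    raise SystemExit('Non-zero exit code of codesign tool')
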


@@ -0,0 +1,74 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# Helper script which takes care of signing provided location.
#
# The location can either be a directory (in which case all eligible binaries
# will be signed) or a single file (in which case a single file will be signed).
#
# This script takes care of all the complexity of communicating between the
# process which requests a file to be signed and the code signing server.
#
# NOTE: Signing happens in-place.
import argparse
import sys
from pathlib import Path
from codesign.simple_code_signer import SimpleCodeSigner
def create_argument_parser():
parser = argparse.ArgumentParser()
parser.add_argument('path_to_sign', type=Path)
return parser
def main():
parser = create_argument_parser()
args = parser.parse_args()
path_to_sign = args.path_to_sign.absolute()
if sys.platform == 'win32':
# When the WIX packager is used to generate an .msi on Windows, CPack will
# install two different projects, each to a different
# installation prefix:
#
# - C:\b\build\_CPack_Packages\WIX\Blender
# - C:\b\build\_CPack_Packages\WIX\Unspecified
#
# The annoying part is: CMake's post-install script will only be run
# once, with the install prefix which corresponds to the project which
# was installed last. But we want to sign binaries from all projects.
# So we detect that we are running for a CPack project used for WIX and
# force the parent directory (which includes both projects) to be
# signed.
#
# Here we force both projects to be signed.
if path_to_sign.name == 'Unspecified' and 'WIX' in str(path_to_sign):
path_to_sign = path_to_sign.parent
code_signer = SimpleCodeSigner()
code_signer.sign_file_or_directory(path_to_sign)
if __name__ == "__main__":
main()
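
The WIX special case can be exercised in isolation; a small check using the CPack path from the comment above (pathlib's PureWindowsPath keeps it runnable on any platform):

from pathlib import PureWindowsPath

# For the 'Unspecified' WIX project, sign the parent directory so both
# CPack projects are covered, as done in main() above.
path_to_sign = PureWindowsPath(r'C:\b\build\_CPack_Packages\WIX\Unspecified')
if path_to_sign.name == 'Unspecified' and 'WIX' in str(path_to_sign):
    path_to_sign = path_to_sign.parent
print(path_to_sign)  # C:\b\build\_CPack_Packages\WIX
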


@@ -0,0 +1,135 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import os
import shutil
import buildbot_utils
def get_cmake_options(builder):
codesign_script = os.path.join(
builder.blender_dir, 'build_files', 'buildbot', 'worker_codesign.cmake')
config_file = "build_files/cmake/config/blender_release.cmake"
options = ['-DCMAKE_BUILD_TYPE:STRING=Release',
'-DWITH_GTESTS=ON']
if builder.platform == 'mac':
options.append('-DCMAKE_OSX_ARCHITECTURES:STRING=x86_64')
options.append('-DCMAKE_OSX_DEPLOYMENT_TARGET=10.9')
elif builder.platform == 'win':
options.extend(['-G', 'Visual Studio 16 2019', '-A', 'x64'])
if builder.codesign:
options.extend(['-DPOSTINSTALL_SCRIPT:PATH=' + codesign_script])
elif builder.platform == 'linux':
config_file = "build_files/buildbot/config/blender_linux.cmake"
optix_sdk_dir = os.path.join(builder.blender_dir, '..', '..', 'NVIDIA-Optix-SDK-7.1')
options.append('-DOPTIX_ROOT_DIR:PATH=' + optix_sdk_dir)
# Workaround to build sm_30 kernels with CUDA 10, since CUDA 11 no longer supports that architecture
if builder.platform == 'win':
options.append('-DCUDA10_TOOLKIT_ROOT_DIR:PATH=C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v10.1')
options.append('-DCUDA10_NVCC_EXECUTABLE:FILEPATH=C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v10.1/bin/nvcc.exe')
options.append('-DCUDA11_TOOLKIT_ROOT_DIR:PATH=C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v11.1')
options.append('-DCUDA11_NVCC_EXECUTABLE:FILEPATH=C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v11.1/bin/nvcc.exe')
elif builder.platform == 'linux':
options.append('-DCUDA10_TOOLKIT_ROOT_DIR:PATH=/usr/local/cuda-10.1')
options.append('-DCUDA10_NVCC_EXECUTABLE:FILEPATH=/usr/local/cuda-10.1/bin/nvcc')
options.append('-DCUDA11_TOOLKIT_ROOT_DIR:PATH=/usr/local/cuda-11.1')
options.append('-DCUDA11_NVCC_EXECUTABLE:FILEPATH=/usr/local/cuda-11.1/bin/nvcc')
options.append("-C" + os.path.join(builder.blender_dir, config_file))
options.append("-DCMAKE_INSTALL_PREFIX=%s" % (builder.install_dir))
return options
def update_git(builder):
# Do extra git fetch because not all platform/git/buildbot combinations
# update the origin remote, causing buildinfo to detect local changes.
os.chdir(builder.blender_dir)
print("Fetching remotes")
command = ['git', 'fetch', '--all']
buildbot_utils.call(builder.command_prefix + command)
def clean_directories(builder):
# Make sure no garbage remains from the previous run
if os.path.isdir(builder.install_dir):
shutil.rmtree(builder.install_dir)
# Make sure build directory exists and enter it
os.makedirs(builder.build_dir, exist_ok=True)
# Remove buildinfo files to force buildbot to re-generate them.
for buildinfo in ('buildinfo.h', 'buildinfo.h.txt', ):
full_path = os.path.join(builder.build_dir, 'source', 'creator', buildinfo)
if os.path.exists(full_path):
print("Removing {}" . format(buildinfo))
os.remove(full_path)
def cmake_configure(builder):
# CMake configuration
os.chdir(builder.build_dir)
cmake_cache = os.path.join(builder.build_dir, 'CMakeCache.txt')
if os.path.exists(cmake_cache):
print("Removing CMake cache")
os.remove(cmake_cache)
print("CMake configure:")
cmake_options = get_cmake_options(builder)
command = ['cmake', builder.blender_dir] + cmake_options
buildbot_utils.call(builder.command_prefix + command)
def cmake_build(builder):
# CMake build
os.chdir(builder.build_dir)
# NOTE: CPack will build an INSTALL target, which means that code
# signing will happen twice when using `make install` and CPack.
# The tricky bit here is that it is not possible to know whether the INSTALL
# target is used by CPack or by the buildbot itself. On top of
# this, on Windows it is required to build the INSTALL target in order
# to have unit test binaries to run.
# So on the one hand we do an extra unneeded code sign on Windows, but on
# the positive side we don't add complexity and don't make the build process
# more fragile trying to avoid this. The signing process is way faster than
# a clean build on the buildbot, especially with regression tests enabled.
if builder.platform == 'win':
command = ['cmake', '--build', '.', '--target', 'install', '--config', 'Release']
else:
command = ['make', '-s', '-j16', 'install']
print("CMake build:")
buildbot_utils.call(builder.command_prefix + command)
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
update_git(builder)
clean_directories(builder)
cmake_configure(builder)
cmake_build(builder)


@@ -0,0 +1,208 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Runs on buildbot worker, creating a release package using the build
# system and zipping it into buildbot_upload.zip. This is then uploaded
# to the master in the next buildbot step.
import os
import sys
from pathlib import Path
import buildbot_utils
def get_package_name(builder, platform=None):
info = buildbot_utils.VersionInfo(builder)
package_name = 'blender-' + info.full_version
if platform:
package_name += '-' + platform
if not (builder.branch == 'master' or builder.is_release_branch):
if info.is_development_build:
package_name = builder.branch + "-" + package_name
return package_name
def sign_file_or_directory(path):
from codesign.simple_code_signer import SimpleCodeSigner
code_signer = SimpleCodeSigner()
code_signer.sign_file_or_directory(Path(path))
def create_buildbot_upload_zip(builder, package_files):
import zipfile
buildbot_upload_zip = os.path.join(builder.upload_dir, "buildbot_upload.zip")
if os.path.exists(buildbot_upload_zip):
os.remove(buildbot_upload_zip)
try:
z = zipfile.ZipFile(buildbot_upload_zip, "w", compression=zipfile.ZIP_STORED)
for filepath, filename in package_files:
print("Packaged", filename)
z.write(filepath, arcname=filename)
z.close()
except Exception as ex:
sys.stderr.write('Create buildbot_upload.zip failed: ' + str(ex) + '\n')
sys.exit(1)
def create_tar_xz(src, dest, package_name):
# One extra character to remove the leading os.sep when computing package_root
ln = len(src) + 1
flist = list()
# Create list of tuples containing file and archive name
for root, dirs, files in os.walk(src):
package_root = os.path.join(package_name, root[ln:])
flist.extend([(os.path.join(root, file), os.path.join(package_root, file)) for file in files])
import tarfile
# Set UID/GID of archived files to 0, otherwise they'd be owned by whatever
# user compiled the package. If root then unpacks it into /usr/local/, you
# get a security issue.
def _fakeroot(tarinfo):
tarinfo.gid = 0
tarinfo.gname = "root"
tarinfo.uid = 0
tarinfo.uname = "root"
return tarinfo
package = tarfile.open(dest, 'w:xz', preset=9)
for entry in flist:
package.add(entry[0], entry[1], recursive=False, filter=_fakeroot)
package.close()
def cleanup_files(dirpath, extension):
for f in os.listdir(dirpath):
filepath = os.path.join(dirpath, f)
if os.path.isfile(filepath) and f.endswith(extension):
os.remove(filepath)
def pack_mac(builder):
info = buildbot_utils.VersionInfo(builder)
os.chdir(builder.build_dir)
cleanup_files(builder.build_dir, '.dmg')
package_name = get_package_name(builder, 'macOS')
package_filename = package_name + '.dmg'
package_filepath = os.path.join(builder.build_dir, package_filename)
release_dir = os.path.join(builder.blender_dir, 'release', 'darwin')
buildbot_dir = os.path.join(builder.blender_dir, 'build_files', 'buildbot')
bundle_script = os.path.join(buildbot_dir, 'worker_bundle_dmg.py')
command = [bundle_script]
command += ['--dmg', package_filepath]
if info.is_development_build:
background_image = os.path.join(release_dir, 'buildbot', 'background.tif')
command += ['--background-image', background_image]
if builder.codesign:
command += ['--codesign']
command += [builder.install_dir]
buildbot_utils.call(command)
create_buildbot_upload_zip(builder, [(package_filepath, package_filename)])
def pack_win(builder):
info = buildbot_utils.VersionInfo(builder)
os.chdir(builder.build_dir)
cleanup_files(builder.build_dir, '.zip')
# CPack will add the platform name
cpack_name = get_package_name(builder, None)
package_name = get_package_name(builder, 'windows' + str(builder.bits))
command = ['cmake', '-DCPACK_OVERRIDE_PACKAGENAME:STRING=' + cpack_name, '.']
buildbot_utils.call(builder.command_prefix + command)
command = ['cpack', '-G', 'ZIP']
buildbot_utils.call(builder.command_prefix + command)
package_filename = package_name + '.zip'
package_filepath = os.path.join(builder.build_dir, package_filename)
package_files = [(package_filepath, package_filename)]
if info.version_cycle == 'release':
# Installer only for final release builds, otherwise users will get
# 'this product is already installed' messages.
command = ['cpack', '-G', 'WIX']
buildbot_utils.call(builder.command_prefix + command)
package_filename = package_name + '.msi'
package_filepath = os.path.join(builder.build_dir, package_filename)
if builder.codesign:
sign_file_or_directory(package_filepath)
package_files += [(package_filepath, package_filename)]
create_buildbot_upload_zip(builder, package_files)
def pack_linux(builder):
blender_executable = os.path.join(builder.install_dir, 'blender')
info = buildbot_utils.VersionInfo(builder)
# Strip all unused symbols from the binaries
print("Stripping binaries...")
buildbot_utils.call(builder.command_prefix + ['strip', '--strip-all', blender_executable])
print("Stripping python...")
py_target = os.path.join(builder.install_dir, info.short_version)
buildbot_utils.call(
builder.command_prefix + [
'find', py_target, '-iname', '*.so', '-exec', 'strip', '-s', '{}', ';',
],
)
# Construct package name
platform_name = 'linux64'
package_name = get_package_name(builder, platform_name)
package_filename = package_name + ".tar.xz"
print("Creating .tar.xz archive")
package_filepath = builder.install_dir + '.tar.xz'
create_tar_xz(builder.install_dir, package_filepath, package_name)
# Create buildbot_upload.zip
create_buildbot_upload_zip(builder, [(package_filepath, package_filename)])
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
# Make sure install directory always exists
os.makedirs(builder.install_dir, exist_ok=True)
if builder.platform == 'mac':
pack_mac(builder)
elif builder.platform == 'win':
pack_win(builder)
elif builder.platform == 'linux':
pack_linux(builder)
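
The fakeroot tarball logic is the part most worth reusing on its own; a self-contained sketch of create_tar_xz() using pathlib instead of os.walk, with the same root-ownership trick:

import tarfile
from pathlib import Path

def make_fakeroot_tar_xz(src: Path, dest: Path, package_name: str) -> None:
    # Archive `src` under `package_name/` with every entry owned by root,
    # so unpacking as root into /usr/local/ leaves no files owned by the
    # build user.
    def _fakeroot(tarinfo):
        tarinfo.uid = tarinfo.gid = 0
        tarinfo.uname = tarinfo.gname = 'root'
        return tarinfo

    with tarfile.open(dest, 'w:xz', preset=9) as package:
        for path in sorted(src.rglob('*')):
            arcname = Path(package_name) / path.relative_to(src)
            package.add(str(path), arcname=str(arcname), recursive=False,
                        filter=_fakeroot)
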


@@ -0,0 +1,42 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import buildbot_utils
import os
import sys
def get_ctest_arguments(builder):
args = ['--output-on-failure']
if builder.platform == 'win':
args += ['-C', 'Release']
return args
def test(builder):
os.chdir(builder.build_dir)
command = builder.command_prefix + ['ctest'] + get_ctest_arguments(builder)
buildbot_utils.call(command)
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
test(builder)


@@ -0,0 +1,31 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import buildbot_utils
import os
import sys
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
os.chdir(builder.blender_dir)
# Run make update which handles all libraries and submodules.
make_update = os.path.join(builder.blender_dir, "build_files", "utils", "make_update.py")
buildbot_utils.call([sys.executable, make_update, '--no-blender', "--use-tests", "--use-centos-libraries"])


@@ -20,24 +20,8 @@ if(NOT CLANG_ROOT_DIR AND NOT $ENV{CLANG_ROOT_DIR} STREQUAL "")
set(CLANG_ROOT_DIR $ENV{CLANG_ROOT_DIR})
endif()
if(NOT LLVM_ROOT_DIR)
if(DEFINED LLVM_VERSION)
message(running llvm-config-${LLVM_VERSION})
find_program(LLVM_CONFIG llvm-config-${LLVM_VERSION})
endif()
if(NOT LLVM_CONFIG)
find_program(LLVM_CONFIG llvm-config)
endif()
execute_process(COMMAND ${LLVM_CONFIG} --prefix
OUTPUT_VARIABLE LLVM_ROOT_DIR
OUTPUT_STRIP_TRAILING_WHITESPACE)
set(LLVM_ROOT_DIR ${LLVM_ROOT_DIR} CACHE PATH "Path to the LLVM installation")
endif()
set(_CLANG_SEARCH_DIRS
${CLANG_ROOT_DIR}
${LLVM_ROOT_DIR}
/opt/lib/clang
)


@@ -34,7 +34,7 @@ FIND_PATH(EMBREE_INCLUDE_DIR
include
)
IF(NOT (("${CMAKE_SYSTEM_PROCESSOR}" STREQUAL "aarch64") OR (APPLE AND ("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64"))))
IF(NOT (APPLE AND ("${CMAKE_OSX_ARCHITECTURES}" STREQUAL "arm64")))
SET(_embree_SIMD_COMPONENTS
embree_sse42
embree_avx


@@ -472,7 +472,8 @@ if(NOT GFLAGS_FOUND)
gflags_report_not_found(
"Could not find gflags include directory, set GFLAGS_INCLUDE_DIR "
"to directory containing gflags/gflags.h")
endif()
endif(NOT GFLAGS_INCLUDE_DIR OR
NOT EXISTS ${GFLAGS_INCLUDE_DIR})
find_library(GFLAGS_LIBRARY NAMES gflags
PATHS ${GFLAGS_LIBRARY_DIR_HINTS}
@@ -483,7 +484,8 @@ if(NOT GFLAGS_FOUND)
gflags_report_not_found(
"Could not find gflags library, set GFLAGS_LIBRARY "
"to full path to libgflags.")
endif()
endif(NOT GFLAGS_LIBRARY OR
NOT EXISTS ${GFLAGS_LIBRARY})
# gflags typically requires a threading library (which is OS dependent), note
# that this defines the CMAKE_THREAD_LIBS_INIT variable. If we are able to
@@ -558,7 +560,8 @@ if(NOT GFLAGS_FOUND)
gflags_report_not_found(
"Caller defined GFLAGS_INCLUDE_DIR:"
" ${GFLAGS_INCLUDE_DIR} does not contain gflags/gflags.h header.")
endif()
endif(GFLAGS_INCLUDE_DIR AND
NOT EXISTS ${GFLAGS_INCLUDE_DIR}/gflags/gflags.h)
# TODO: This regex for gflags library is pretty primitive, we use lowercase
# for comparison to handle Windows using CamelCase library names, could
# this check be better?
@@ -568,7 +571,8 @@ if(NOT GFLAGS_FOUND)
gflags_report_not_found(
"Caller defined GFLAGS_LIBRARY: "
"${GFLAGS_LIBRARY} does not match gflags.")
endif()
endif(GFLAGS_LIBRARY AND
NOT "${LOWERCASE_GFLAGS_LIBRARY}" MATCHES ".*gflags[^/]*")
gflags_reset_find_library_prefix()


@@ -1,81 +0,0 @@
# - Find HIP compiler
#
# This module defines
# HIP_HIPCC_EXECUTABLE, the full path to the hipcc executable
# HIP_VERSION, the HIP compiler version
#
# HIP_FOUND, if the HIP toolkit is found.
#=============================================================================
# Copyright 2021 Blender Foundation.
#
# Distributed under the OSI-approved BSD 3-Clause License,
# see accompanying file BSD-3-Clause-license.txt for details.
#=============================================================================
# If HIP_ROOT_DIR was defined in the environment, use it.
if(NOT HIP_ROOT_DIR AND NOT $ENV{HIP_ROOT_DIR} STREQUAL "")
set(HIP_ROOT_DIR $ENV{HIP_ROOT_DIR})
endif()
set(_hip_SEARCH_DIRS
${HIP_ROOT_DIR}
)
find_program(HIP_HIPCC_EXECUTABLE
NAMES
hipcc
HINTS
${_hip_SEARCH_DIRS}
PATH_SUFFIXES
bin
)
if(HIP_HIPCC_EXECUTABLE AND NOT EXISTS ${HIP_HIPCC_EXECUTABLE})
message(WARNING "Cached or directly specified hipcc executable does not exist.")
set(HIP_FOUND FALSE)
elseif(HIP_HIPCC_EXECUTABLE)
set(HIP_FOUND TRUE)
set(HIP_VERSION_MAJOR 0)
set(HIP_VERSION_MINOR 0)
set(HIP_VERSION_PATCH 0)
# Get version from the output.
execute_process(COMMAND ${HIP_HIPCC_EXECUTABLE} --version
OUTPUT_VARIABLE HIP_VERSION_RAW
ERROR_QUIET
OUTPUT_STRIP_TRAILING_WHITESPACE)
# Parse parts.
if(HIP_VERSION_RAW MATCHES "HIP version: .*")
# Strip the HIP prefix and get list of individual version components.
string(REGEX REPLACE
".*HIP version: ([.0-9]+).*" "\\1"
HIP_SEMANTIC_VERSION "${HIP_VERSION_RAW}")
string(REPLACE "." ";" HIP_VERSION_PARTS "${HIP_SEMANTIC_VERSION}")
list(LENGTH HIP_VERSION_PARTS NUM_HIP_VERSION_PARTS)
# Extract components into corresponding variables.
if(NUM_HIP_VERSION_PARTS GREATER 0)
list(GET HIP_VERSION_PARTS 0 HIP_VERSION_MAJOR)
endif()
if(NUM_HIP_VERSION_PARTS GREATER 1)
list(GET HIP_VERSION_PARTS 1 HIP_VERSION_MINOR)
endif()
if(NUM_HIP_VERSION_PARTS GREATER 2)
list(GET HIP_VERSION_PARTS 2 HIP_VERSION_PATCH)
endif()
# Unset temp variables.
unset(NUM_HIP_VERSION_PARTS)
unset(HIP_SEMANTIC_VERSION)
unset(HIP_VERSION_PARTS)
endif()
# Construct full semantic version.
set(HIP_VERSION "${HIP_VERSION_MAJOR}.${HIP_VERSION_MINOR}.${HIP_VERSION_PATCH}")
unset(HIP_VERSION_RAW)
else()
set(HIP_FOUND FALSE)
endif()
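
The version parsing in the removed FindHIP module is plain string work; the same extraction in Python, using a made-up `hipcc --version` string as input:

import re

# Hypothetical `hipcc --version` output; the real string varies per release.
raw_output = 'HIP version: 4.5.21074-2a5bb376'

major = minor = patch = '0'
match = re.search(r'HIP version: ([.0-9]+)', raw_output)
if match:
    parts = match.group(1).split('.')
    major, minor, patch = (parts + ['0', '0', '0'])[:3]
print(f'{major}.{minor}.{patch}')  # 4.5.21074
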


@@ -40,7 +40,7 @@ FIND_PACKAGE_HANDLE_STANDARD_ARGS(NanoVDB DEFAULT_MSG
IF(NANOVDB_FOUND)
SET(NANOVDB_INCLUDE_DIRS ${NANOVDB_INCLUDE_DIR})
ENDIF()
ENDIF(NANOVDB_FOUND)
MARK_AS_ADVANCED(
NANOVDB_INCLUDE_DIR


@@ -46,7 +46,7 @@ SET(_opencollada_FIND_COMPONENTS
)
# Fedora openCOLLADA package links these statically
# note that order is important here or it won't link
# note that order is important here ot it wont link
SET(_opencollada_FIND_STATIC_COMPONENTS
buffer
ftoa


@@ -33,23 +33,11 @@ FIND_PATH(OPTIX_INCLUDE_DIR
include
)
IF(EXISTS "${OPTIX_INCLUDE_DIR}/optix.h")
FILE(STRINGS "${OPTIX_INCLUDE_DIR}/optix.h" _optix_version REGEX "^#define OPTIX_VERSION[ \t].*$")
STRING(REGEX MATCHALL "[0-9]+" _optix_version ${_optix_version})
MATH(EXPR _optix_version_major "${_optix_version} / 10000")
MATH(EXPR _optix_version_minor "(${_optix_version} % 10000) / 100")
MATH(EXPR _optix_version_patch "${_optix_version} % 100")
SET(OPTIX_VERSION "${_optix_version_major}.${_optix_version_minor}.${_optix_version_patch}")
ENDIF()
# handle the QUIETLY and REQUIRED arguments and set OPTIX_FOUND to TRUE if
# all listed variables are TRUE
INCLUDE(FindPackageHandleStandardArgs)
FIND_PACKAGE_HANDLE_STANDARD_ARGS(OptiX
REQUIRED_VARS OPTIX_INCLUDE_DIR
VERSION_VAR OPTIX_VERSION)
FIND_PACKAGE_HANDLE_STANDARD_ARGS(OptiX DEFAULT_MSG
OPTIX_INCLUDE_DIR)
IF(OPTIX_FOUND)
SET(OPTIX_INCLUDE_DIRS ${OPTIX_INCLUDE_DIR})
@@ -57,7 +45,6 @@ ENDIF()
MARK_AS_ADVANCED(
OPTIX_INCLUDE_DIR
OPTIX_VERSION
)
UNSET(_optix_SEARCH_DIRS)
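
The removed lines decode the packed OPTIX_VERSION integer (major * 10000 + minor * 100 + patch); the equivalent arithmetic in Python, with a hypothetical value:

# Hypothetical packed value of the OPTIX_VERSION define from optix.h.
optix_version = 70300

major = optix_version // 10000
minor = (optix_version % 10000) // 100
patch = optix_version % 100
print(f'{major}.{minor}.{patch}')  # 7.3.0
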


@@ -44,7 +44,7 @@ SET(PYTHON_LINKFLAGS "-Xlinker -export-dynamic" CACHE STRING "Linker flags for p
MARK_AS_ADVANCED(PYTHON_LINKFLAGS)
# if the user passes these defines as args, we don't want to overwrite
# if the user passes these defines as args, we dont want to overwrite
SET(_IS_INC_DEF OFF)
SET(_IS_INC_CONF_DEF OFF)
SET(_IS_LIB_DEF OFF)
@@ -143,7 +143,7 @@ IF((NOT _IS_INC_DEF) OR (NOT _IS_INC_CONF_DEF) OR (NOT _IS_LIB_DEF) OR (NOT _IS_
SET(_PYTHON_ABI_FLAGS "${_CURRENT_ABI_FLAGS}")
break()
ELSE()
# ensure we don't find values from 2 different ABI versions
# ensure we dont find values from 2 different ABI versions
IF(NOT _IS_INC_DEF)
UNSET(PYTHON_INCLUDE_DIR CACHE)
ENDIF()


@@ -1,66 +0,0 @@
# - Find Zstd library
# Find the native Zstd includes and library
# This module defines
# ZSTD_INCLUDE_DIRS, where to find zstd.h, Set when
# ZSTD_INCLUDE_DIR is found.
# ZSTD_LIBRARIES, libraries to link against to use Zstd.
# ZSTD_ROOT_DIR, The base directory to search for Zstd.
# This can also be an environment variable.
# ZSTD_FOUND, If false, do not try to use Zstd.
#
# also defined, but not for general use are
# ZSTD_LIBRARY, where to find the Zstd library.
#=============================================================================
# Copyright 2019 Blender Foundation.
#
# Distributed under the OSI-approved BSD License (the "License");
# see accompanying file Copyright.txt for details.
#
# This software is distributed WITHOUT ANY WARRANTY; without even the
# implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the License for more information.
#=============================================================================
# If ZSTD_ROOT_DIR was defined in the environment, use it.
IF(NOT ZSTD_ROOT_DIR AND NOT $ENV{ZSTD_ROOT_DIR} STREQUAL "")
SET(ZSTD_ROOT_DIR $ENV{ZSTD_ROOT_DIR})
ENDIF()
SET(_zstd_SEARCH_DIRS
${ZSTD_ROOT_DIR}
)
FIND_PATH(ZSTD_INCLUDE_DIR
NAMES
zstd.h
HINTS
${_zstd_SEARCH_DIRS}
PATH_SUFFIXES
include
)
FIND_LIBRARY(ZSTD_LIBRARY
NAMES
zstd
HINTS
${_zstd_SEARCH_DIRS}
PATH_SUFFIXES
lib64 lib
)
# handle the QUIETLY and REQUIRED arguments and set ZSTD_FOUND to TRUE if
# all listed variables are TRUE
INCLUDE(FindPackageHandleStandardArgs)
FIND_PACKAGE_HANDLE_STANDARD_ARGS(Zstd DEFAULT_MSG
ZSTD_LIBRARY ZSTD_INCLUDE_DIR)
IF(ZSTD_FOUND)
SET(ZSTD_LIBRARIES ${ZSTD_LIBRARY})
SET(ZSTD_INCLUDE_DIRS ${ZSTD_INCLUDE_DIR})
ENDIF()
MARK_AS_ADVANCED(
ZSTD_INCLUDE_DIR
ZSTD_LIBRARY
)


@@ -40,7 +40,7 @@ FIND_PACKAGE_HANDLE_STANDARD_ARGS(sse2neon DEFAULT_MSG
IF(SSE2NEON_FOUND)
SET(SSE2NEON_INCLUDE_DIRS ${SSE2NEON_INCLUDE_DIR})
ENDIF()
ENDIF(SSE2NEON_FOUND)
MARK_AS_ADVANCED(
SSE2NEON_INCLUDE_DIR


@@ -79,7 +79,7 @@ if(EXISTS ${SOURCE_DIR}/.git)
ERROR_QUIET)
if(NOT _git_below_check STREQUAL "")
# If there're commits between HEAD and upstream this means
# that we're reset-ed to older revision. Use its hash then.
# that we're reset-ed to older revision. Use it's hash then.
execute_process(COMMAND git rev-parse --short=12 HEAD
WORKING_DIRECTORY ${SOURCE_DIR}
OUTPUT_VARIABLE MY_WC_HASH


@@ -168,7 +168,7 @@ def function_parm_wash_tokens(parm):
# if tokens[-1].kind == To
# remove trailing char
if tokens[-1].kind == TokenKind.PUNCTUATION:
if tokens[-1].spelling in {",", ")", ";"}:
if tokens[-1].spelling in (",", ")", ";"):
tokens.pop()
# else:
# print(tokens[-1].spelling)
@@ -179,7 +179,7 @@ def function_parm_wash_tokens(parm):
t_spelling = t.spelling
ok = True
if t_kind == TokenKind.KEYWORD:
if t_spelling in {"const", "restrict", "volatile"}:
if t_spelling in ("const", "restrict", "volatile"):
ok = False
elif t_spelling.startswith("__"):
ok = False # __restrict
@@ -305,7 +305,7 @@ def file_check_arg_sizes(tu):
for i, node_child in enumerate(children):
children = list(node_child.get_children())
# skip if we don't have an index...
# skip if we dont have an index...
size_def = args_size_definition.get(i, -1)
if size_def == -1:
@@ -354,7 +354,7 @@ def file_check_arg_sizes(tu):
filepath # always the same but useful when running threaded
))
# we don't really care what we are looking at, just scan entire file for
# we dont really care what we are looking at, just scan entire file for
# function calls.
def recursive_func_call_check(node):


@@ -8,9 +8,6 @@ IGNORE_SOURCE = (
# specific source files
"extern/audaspace/",
# Use for `WIN32` only.
"source/creator/blender_launcher_win32.c",
# specific source files
"extern/bullet2/src/BulletCollision/CollisionDispatch/btBox2dBox2dCollisionAlgorithm.cpp",
"extern/bullet2/src/BulletCollision/CollisionDispatch/btConvex2dConvex2dAlgorithm.cpp",


@@ -82,7 +82,7 @@ def create_nb_project_main():
make_exe = cmake_cache_var("CMAKE_MAKE_PROGRAM")
make_exe_basename = os.path.basename(make_exe)
# --------------- NetBeans specific.
# --------------- NB specific
defines = [("%s=%s" % cdef) if cdef[1] else cdef[0] for cdef in defines]
defines += [cdef.replace("#define", "").strip() for cdef in cmake_compiler_defines()]

Some files were not shown because too many files have changed in this diff.