Compare commits


129 Commits

Author SHA1 Message Date
dbea6c3507 Fix symmetry in pose brush, add radius preview. 2019-08-28 14:21:27 +02:00
b386d5594a Mask filter: update only modified nodes 2019-08-21 16:06:26 +02:00
2ecb4540f3 Fix merge errors 2019-08-20 19:19:26 +02:00
8c509bb69c Merge branch 'master' into sculpt-mode-features 2019-08-20 19:13:25 +02:00
d6d51674a2 Sculpt: Pose brush
This commit also includes the MOUSE_INBETWEEN events fix for other
brushes.
2019-08-16 00:12:43 +02:00
ba17acf8f4 Tweak mesh preview and cursor line width 2019-08-12 03:36:59 +02:00
12dca31682 Grab active vertex and dynamic mesh preview 2019-08-12 03:35:13 +02:00
ddccb5411d Set pivot position: update to new sculpt API 2019-08-06 16:48:29 +02:00
194457f332 Fix transform tool mask value 2019-08-06 16:23:09 +02:00
de28610dfa Mask expand: symmetry support 2019-08-06 16:16:19 +02:00
e1b6180e41 Topology automasking: update to new sculpt API 2019-08-06 15:50:58 +02:00
26e1155ca9 Mesh filters: update to new sculpt API 2019-08-05 19:50:51 +02:00
966f02a8cf Mask filter: add auto iteration count for pie menu 2019-08-05 15:41:41 +02:00
0ec5917cd1 Mask filter: update to new sculpt API 2019-08-05 15:24:18 +02:00
f7979585e2 Sculpt edit mask pie menu 2019-08-03 14:53:38 +02:00
881eb3f866 Avoid allocating memory in neighbour iteration 2019-07-31 16:28:44 +02:00
f4cf8193cb Changes to intern/openvdb 2019-07-31 16:06:15 +02:00
1bdadef21c Change neighbour array default size 2019-07-30 18:16:27 +02:00
10b2c9051d Sculpt mesh API: replace gsets with static arrays 2019-07-30 18:12:46 +02:00
95767bade8 Sculpt: Mesh abstraction API [WIP]
Using this API, all new tools for the sculpt branch should work with
both mesh and bmesh. I also ported mask by normal, mask expand and the
transform operator (with its mesh filter cache) in order to test it.
2019-07-26 23:53:32 +02:00
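
A mesh/BMesh abstraction of this kind usually hides the storage backend behind a small vertex-access API so a tool can be written once and run on either representation. Below is a minimal illustrative sketch in C; all names (SculptSession, sculpt_vertex_co, the backend enum) are hypothetical and do not mirror the actual branch code.

/* Illustrative sketch of a backend-agnostic sculpt vertex API.
 * All names are hypothetical and do not mirror the branch code. */
#include <stdio.h>

typedef enum { BACKEND_MESH, BACKEND_BMESH } SculptBackend;

typedef struct SculptSession {
  SculptBackend backend;
  float (*mvert_co)[3]; /* regular mesh vertex coordinates */
  int totvert;
  /* A real implementation would also hold a BMesh pointer here. */
} SculptSession;

/* Return the coordinate of vertex i regardless of the storage backend. */
static const float *sculpt_vertex_co(const SculptSession *ss, int i)
{
  switch (ss->backend) {
    case BACKEND_MESH:
      return ss->mvert_co[i];
    case BACKEND_BMESH:
      /* Would look up the BMesh vertex here; mesh storage is reused
       * in this sketch so the example stays self-contained. */
      return ss->mvert_co[i];
  }
  return NULL;
}

int main(void)
{
  float coords[2][3] = {{0.0f, 0.0f, 0.0f}, {1.0f, 2.0f, 3.0f}};
  SculptSession ss = {BACKEND_MESH, coords, 2};
  const float *co = sculpt_vertex_co(&ss, 1);
  printf("vertex 1: %.1f %.1f %.1f\n", co[0], co[1], co[2]);
  return 0;
}
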
84516c806b Merge branch 'master' into sculpt-mode-features 2019-07-25 14:25:22 +02:00
2c6e50c6c0 Fix PBVH sculpt vertex color drawing 2019-07-15 19:12:17 +02:00
77ce16299d Sculpt vertex colors: Support for Unified paint settings and secondary
color
2019-07-15 19:11:53 +02:00
c7709c8eff Fix blueprint tool matrix calculation
This should fix some plane scaling problems
2019-07-15 18:27:19 +02:00
0a964a62b2 Merge branch 'master' into sculpt-mode-features 2019-07-15 18:24:07 +02:00
cf701b0d2a Fix normal radius versioning 2019-07-09 15:24:00 +02:00
9895675b9b Fix formatting 2019-07-09 15:20:29 +02:00
3539cd38a4 Fix mask extract Windows build 2019-07-08 16:21:54 +02:00
bfff8c8bf5 Fix qex cmake 2019-07-08 16:19:00 +02:00
66cdb4df51 Blueprint tool: Initial implementation
Work in progress

- Geometry generation code is only for testing the operator. It will triangulate any
generated ngon and the normal calculation can fail.
- Snapping options are not integrated with scene settings.
- Grid alignment is forced to the global X axis. This can cause problems
with rotated objects.
- The tool only works from object mode. It crashes in other modes.
- Some internal grid default values are not set up properly and can make
the operator crash.
- The grid preview is not integrated with the gizmo system; you need to
start the operator in order to see it.
2019-07-08 16:15:47 +02:00
0997eff30c Sculpt vertex colors: add color blend modes to the paint brush
This makes all color blend modes available from the paint mode dropdown.
Source alpha is currently hardcoded to 1 to avoid confusion. The
viewport can't display it properly.
2019-07-02 15:32:47 +02:00
4ef1b6e717 Remove old mask opacity control and property 2019-06-30 01:13:41 +02:00
81e9011f57 Fix color fill mask intensity
I added this multiplication to fix the pressure sensitivity of my broken
Wacom pen while testing the paint brush. I should add separate sculpt
and paint strength factors.
2019-06-30 00:36:43 +02:00
3a7b5693d1 Sculpt: Paint mask extract operator [WIP]
Currently the operator extracts a triangulated mesh from the mask and
applies a solidify modifier.
I would like to avoid using a shrinkwrap or duplicating the sculpt to
create the external edge loop. I think it should be possible to do
something similar to the relax brush to shrink the geometry to fit the
external loop within the mask.
2019-06-29 18:05:00 +02:00
ce83be4832 Cleanup and variable renaming 2019-06-29 17:35:19 +02:00
fa916b7470 Merge branch 'master' into sculpt-mode-features 2019-06-28 01:31:04 +02:00
8d17efaf7b Merge branch 'master' into sculpt-mode-features 2019-06-22 18:30:34 +02:00
5b19ecdc6b Add remesh panel to the topbar
This panel has the same options as the mesh properties remesh panel.
2019-06-16 23:16:07 +02:00
80b4987139 Fix crash with mesh filters and EEVEE 2019-06-16 23:11:24 +02:00
184b8d2bb3 Add front faces only option to the mask lasso tool 2019-06-16 23:05:15 +02:00
450e547298 Merge branch 'master' into sculpt-mode-features 2019-06-16 23:01:16 +02:00
d9fd3be084 Pose tool: Mask expand operator
This operator is designed to be used with the transform.rotate tool to
create a pose tool. It generates a mask from the active vertex,
following the topology of the mesh.

The operator has two modes. It can expand the mask to the vertex under
the cursor or to a certain number of iterations. All the iterations are
cached when the operator is invoked, so it should be usable with high
poly meshes.

I also included some hardcoded mask operations when the operator
finishes. In the future, this could be included as an operator macro if
it does not have a significant performance penalty (like redrawing all
nodes each time a new operator is invoked).

It does not support symmetry and it only works with meshes (no dyntopo
or multires).

I also fixed some bugs in the transform code that were breaking EEVEE
compatibility. The undo system still fails sometimes.
2019-06-11 15:28:10 +02:00
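
The "all iterations are cached on invoke" idea above maps naturally to a breadth-first flood fill that records, per vertex, the iteration at which it was reached; updating the mask afterwards is then just a threshold. A minimal sketch in C with a toy adjacency list and hypothetical names:

/* Sketch of the cache-on-invoke idea: a breadth-first flood fill from the
 * active vertex stores, per vertex, the iteration at which it was reached.
 * Data layout and names are illustrative, not the branch code. */
#include <stdio.h>

#define TOTVERT 6

/* Toy adjacency: a strip of 6 vertices 0-1-2-3-4-5. */
static const int neighbors[TOTVERT][2] = {
    {1, -1}, {0, 2}, {1, 3}, {2, 4}, {3, 5}, {4, -1}};

static void flood_fill_iterations(int active, int *iter_of_vert)
{
  int queue[TOTVERT], head = 0, tail = 0;
  for (int i = 0; i < TOTVERT; i++) iter_of_vert[i] = -1;
  iter_of_vert[active] = 0;
  queue[tail++] = active;
  while (head < tail) {
    int v = queue[head++];
    for (int n = 0; n < 2; n++) {
      int nv = neighbors[v][n];
      if (nv != -1 && iter_of_vert[nv] == -1) {
        iter_of_vert[nv] = iter_of_vert[v] + 1;
        queue[tail++] = nv;
      }
    }
  }
}

int main(void)
{
  int iter_of_vert[TOTVERT];
  flood_fill_iterations(0, iter_of_vert); /* done once when the operator is invoked */

  int target_iterations = 3; /* changes interactively while dragging */
  for (int v = 0; v < TOTVERT; v++) {
    float mask = (iter_of_vert[v] <= target_iterations) ? 1.0f : 0.0f;
    printf("vertex %d: iteration %d, mask %.1f\n", v, iter_of_vert[v], mask);
  }
  return 0;
}
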
ad5ac8c833 Mask filter: Dirty and contrast mask filters
The functionality should be the same as the dirty vertex colors operator.
This filter can be used to generate a cavity mask for sculpting and
painting. I also added a contrast filter to control the cavity mask.
The contrast filter can be implemented with a real-time preview in a
future version instead of incrementing the contrast in fixed steps.
2019-06-07 16:27:59 +02:00
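
A fixed-step contrast filter on mask values of the kind described above can be expressed as a gain around the 0.5 midpoint followed by a clamp to [0, 1]. The sketch below is illustrative only; the step size and exact formula are assumptions, not the branch code.

/* Sketch of a contrast step on mask values: push values away from the 0.5
 * midpoint and clamp to [0, 1]. Step size and formula are assumptions. */
#include <stdio.h>

static float mask_contrast_step(float mask, float contrast)
{
  /* contrast > 0 increases contrast, contrast < 0 decreases it. */
  float gain = 1.0f + contrast;
  float delta = contrast / 2.0f;
  float m = gain * mask - delta; /* 0.5 stays fixed, the rest spreads out */
  if (m < 0.0f) m = 0.0f;
  if (m > 1.0f) m = 1.0f;
  return m;
}

int main(void)
{
  const float samples[] = {0.0f, 0.25f, 0.5f, 0.75f, 1.0f};
  for (int i = 0; i < 5; i++) {
    printf("%.2f -> %.2f\n", samples[i], mask_contrast_step(samples[i], 0.25f));
  }
  return 0;
}
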
097157c881 Fix EEVEE/LookDev viewport updates with filter tools 2019-06-05 16:09:49 +02:00
5fde4ab1c2 Fix build after last commit
I forgot to add a couple of files
2019-06-02 23:01:53 +02:00
e36ae3c1e4 Merge branch 'master' into sculpt-mode-features 2019-05-31 23:10:45 +02:00
d79427986b Sculpt vertex colors: Color filter tool
This tool allows the manipulation of all the colors stored in the
vertices at once. Behavior and implementation are similar to the mesh
filter tool. Sculpt vertex colors are not connected to the renderer, so
the viewport does not update when EEVEE/Cycles is enabled.
I also added some separators to the toolbar to separate brushes from
filters.
2019-05-31 22:46:29 +02:00
c6a3009b15 Merge branch 'master' into sculpt-mode-features 2019-05-31 16:02:26 +02:00
711d1f7591 Merge branch 'master' into sculpt-mode-features 2019-05-21 00:59:16 +02:00
42c0bd2e90 Mask filter: add mask filter tool with iterations
Now it is possible to test the operator without editing the keymap.
By changing the iterations of the filter it should be possible to avoid
adding an unnecessary number of undo steps when working with high poly
meshes.
2019-05-21 00:29:46 +02:00
839adc7387 Fix Windows build 2019-05-20 18:49:16 +02:00
08c0e2a35d Sculpt mode transform tool: Initial implementation
Work in progress...
- Scaling and orientations are not implemented yet.
- The "move pivot only" option is ignored when the tool is used from the
gizmo.
- Only meshes, no multires or dyntopo.

This commit also includes the operator to set the pivot point position
automatically as well as a refactor of the mesh filter cache to be shared
with the transform tool.
2019-05-19 18:10:41 +02:00
6e558c84e4 Sculpt: Mask by color operator
Similar to mask by normal, this operator generates a mask based on the
color of the active vertex.
2019-05-15 16:28:55 +02:00
c8ce96b1a2 Mesh Filter: Multithreaded initialization and masked node filtering
This should improve performance a lot, making the tool usable on meshes with
several million vertices. If the mesh is partially masked, only the unmasked
nodes are going to be processed, so it is possible to isolate parts of a
high poly mesh using the mask and work with them in real time.

I also changed the random displacement initialization to avoid
allocating a huge amount of memory (it wasn't really necessary).
2019-05-15 14:46:28 +02:00
485ea73b2b Mesh Filter: Init random displacement only once
The random mesh filter should be much faster now, and it should not do
weird things when adjusting the strength.
2019-05-14 23:42:54 +02:00
303cee750e Mesh Filter: Initial work for relax mesh filter
The mesh filter tool now manages its own orco cache, so there is no need
for a global orco mode using the pbvh proxies. The tool should be much
faster than before.
The code for the relax mesh filter is there, but it is disabled in the
interface until the algorithm is finished.
2019-05-14 19:25:35 +02:00
01b0d4bf31 Cleanup: comments formatting 2019-05-14 00:38:51 +02:00
66f6bd0c24 Relax Brush: Initial implementation
Work in progress. This still has a lot of bugs and I'm not sure if this
approach is the best for this kind of smoothing. The current
implementation can't handle poles with 5 vertices or more, so it doesn't
work properly with the result of the voxel remesher. It does not support
multires.
Also, I'm using gsets to store the smoothing groups. I'm sure that there
is a better way to do it. If for some reason someone enters sculpt mode
and touches a pole with 30 vertices, it should not crash, but it adds
unnecessary overhead when working with properly constructed quad meshes.
2019-05-14 00:35:31 +02:00
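
The core of a relax/smooth step of this kind is pulling each vertex toward the average of its topological neighbours by a strength factor. A minimal sketch on a toy quad follows; the real brush also has to preserve volume and handle poles, which is exactly the part described as problematic above.

/* Minimal sketch of a uniform relax step: each vertex moves toward the
 * average of its neighbours. Toy data; not the branch implementation. */
#include <stdio.h>

#define TOTVERT 4
#define MAX_NEIGHBORS 3

/* Toy quad: vertices 0-1-2-3 in a loop, -1 terminates the neighbour list. */
static const int neighbors[TOTVERT][MAX_NEIGHBORS] = {
    {1, 3, -1}, {0, 2, -1}, {1, 3, -1}, {2, 0, -1}};

static void relax_step(float co[TOTVERT][3], float strength)
{
  float avg[TOTVERT][3];
  for (int v = 0; v < TOTVERT; v++) {
    float sum[3] = {0.0f, 0.0f, 0.0f};
    int count = 0;
    for (int n = 0; n < MAX_NEIGHBORS && neighbors[v][n] != -1; n++) {
      const float *nco = co[neighbors[v][n]];
      sum[0] += nco[0]; sum[1] += nco[1]; sum[2] += nco[2];
      count++;
    }
    for (int i = 0; i < 3; i++) {
      avg[v][i] = (count > 0) ? sum[i] / (float)count : co[v][i];
    }
  }
  for (int v = 0; v < TOTVERT; v++) {
    for (int i = 0; i < 3; i++) {
      co[v][i] += strength * (avg[v][i] - co[v][i]);
    }
  }
}

int main(void)
{
  float co[TOTVERT][3] = {
      {0.0f, 0.0f, 0.0f}, {1.0f, 0.0f, 0.5f}, {1.0f, 1.0f, 0.0f}, {0.0f, 1.0f, -0.5f}};
  relax_step(co, 0.5f);
  for (int v = 0; v < TOTVERT; v++) {
    printf("v%d: %.2f %.2f %.2f\n", v, co[v][0], co[v][1], co[v][2]);
  }
  return 0;
}
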
5be901f344 Merge branch 'master' into sculpt-mode-features 2019-05-06 18:19:53 +02:00
b66ecaf40c Fix crash on mesh filters 2019-04-29 03:48:34 +02:00
2b32296e17 Fix crash on remesh operator with multi-user mesh data.
I also added some error messages.
2019-04-28 18:53:05 +02:00
e27ab0790f Fix crash when sculpting with EEVEE/LookDev enabled 2019-04-24 17:19:52 +02:00
c47ca6adab Merge branch 'master' into sculpt-mode-features 2019-04-23 20:24:54 +02:00
ab0be1b720 Merge commit 'e12c08e8d170b7ca40f204a5b0423c23a9fbc2c1' into sculpt-mode-features 2019-04-23 17:53:45 +02:00
eab72c5242 Merge commit 'e12c08e8d170b7ca40f204a5b0423c23a9fbc2c1^1' into sculpt-mode-features 2019-04-23 17:50:58 +02:00
1a1152cb1b Brush Stroke: Use world spacing option
Quick hack to compute the stroke step using the world space location of
the brush over the mesh instead of the screen space distance, so the
user can avoid a lot of artifacts when sculpting across curved surfaces.
This would need to be implemented for texture/vertex/weight paint in the
future to keep the stroke system consistent.
It still has some problems with brushes that use original coordinates.
2019-04-22 19:37:49 +02:00
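
World spacing of this kind can be sketched as accumulating the 3D distance travelled by the brush over the surface and emitting a dab every spacing * radius world units, instead of measuring the distance in screen pixels. The sample data and names below are illustrative only.

/* Sketch of world-space stroke spacing. Toy data, not the branch code. */
#include <math.h>
#include <stdio.h>

static float len_v3v3(const float a[3], const float b[3])
{
  float d[3] = {a[0] - b[0], a[1] - b[1], a[2] - b[2]};
  return sqrtf(d[0] * d[0] + d[1] * d[1] + d[2] * d[2]);
}

int main(void)
{
  /* Hit locations of consecutive mouse samples on the mesh surface. */
  const float samples[][3] = {
      {0.00f, 0.0f, 0.0f}, {0.04f, 0.0f, 0.0f}, {0.09f, 0.01f, 0.0f},
      {0.15f, 0.02f, 0.0f}, {0.22f, 0.02f, 0.01f}};
  const int totsample = 5;

  const float radius = 0.1f;  /* brush radius in world units */
  const float spacing = 0.5f; /* 50% of the radius between dabs */
  const float step = radius * spacing;

  float accum = 0.0f;
  for (int i = 1; i < totsample; i++) {
    accum += len_v3v3(samples[i], samples[i - 1]);
    while (accum >= step) {
      printf("dab near sample %d\n", i);
      accum -= step;
    }
  }
  return 0;
}
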
bad96e5c7c Sculpt mask: Mask by normal operator
This is an experimental operator that can be used to generate a mask for
faces in hard-surface sculpts. It does not have a UI, and without undo
support the behaviour is unpredictable, but this code can be used to add
another automasking mode to the automasking system.
2019-04-22 01:09:47 +02:00
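
A mask-by-normal pass of this kind can be sketched as masking every vertex whose normal lies within a given angle of the active vertex's normal. The threshold and data below are assumptions for illustration; the branch code may use a different criterion.

/* Sketch of a mask-by-normal pass. Toy data and threshold. */
#include <math.h>
#include <stdio.h>

static float dot_v3v3(const float a[3], const float b[3])
{
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

int main(void)
{
  /* Unit normals of a few vertices; index 0 is the active vertex. */
  const float normals[][3] = {
      {0.0f, 0.0f, 1.0f},
      {0.0f, 0.1961f, 0.9806f},
      {0.0f, 0.7071f, 0.7071f},
      {1.0f, 0.0f, 0.0f}};
  const int totvert = 4;

  const float angle_limit = 30.0f * (3.14159265f / 180.0f);
  const float cos_limit = cosf(angle_limit);

  for (int v = 0; v < totvert; v++) {
    float mask = (dot_v3v3(normals[v], normals[0]) >= cos_limit) ? 1.0f : 0.0f;
    printf("vertex %d: mask %.1f\n", v, mask);
  }
  return 0;
}
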
bc7068c4b9 Add option to control the opacity of the sculpt mask 2019-04-20 00:09:10 +02:00
2fa9e300df Further fix attempts for crashes 2019-04-17 22:25:07 +02:00
997980096f Attempt to take hard edges into account in quad remeshing
(using edges marked sharp)
2019-04-17 21:20:19 +02:00
0d3501725c Fix build with QEX disabled 2019-04-16 20:35:40 +02:00
538669942a Merge branch 'sculpt-mode-features' of git.blender.org:blender into sculpt-mode-features 2019-04-16 20:18:01 +02:00
d6acac5af8 Automasking: Open edges automasking and refactor
This new version should allow us to stack multiple automasking
operations in one brush. The UI to support that still needs to be
done.

This is still a binary mask without falloff, but now the code was
written with falloff in mind.
2019-04-16 20:16:51 +02:00
356e009914 Some minor fixes for the libQex dependency setup 2019-04-16 20:15:37 +02:00
31ebf10f1b Initial implementation of libQex-based quad remeshing
Note: it still crashes often and has some memory alloc/free issues.
2019-04-16 19:58:18 +02:00
a847fa4523 Added level set rebuild after grid resampling
In general, the current grid resampling approach still seems wrong and needs more thinking.
2019-04-13 12:43:39 +02:00
3d288c58d2 First attempt at allowing different voxel sizes on operands
The grids should not have offsets any more; they are now being transformed and resampled.
The result does not look great yet, and it is very slow, too.
2019-04-13 01:12:33 +02:00
73da0bdf64 Voxel remesher: Reproject mask option 2019-04-12 16:49:51 +02:00
2a08dae14e Added a new live remesh option.
If enabled, the modifier will be continuously invoked, as usual. If disabled,
it will return the last automatically cached mesh, so basically the remesh
is "frozen", but still nondestructive / not applied yet.
Note: this doesn't work yet in sculpt mode. There you may still need to apply
the modifier to update the base mesh. Or you can disable the modifier in
the stack to work on the base mesh, too. A third alternative is having
live remesh enabled with the modifier enabled, but this will cost some
sculpting performance and works at the moment only in material and rendered
mode.
2019-04-12 10:32:05 +02:00
51582bf565 Sculpt mode: Grow and shrink mask filters 2019-04-12 00:35:38 +02:00
4cd1941bb7 Mesh Filter tool: Initial implementation
This tool applies a deformation to all vertices in the sculpt, taking the
mask into account. It is useful for creating surface detail or hard
surface sculpts. It is also the base for the implementation of the
transform tool.

Notes:
- The smooth filter needs multiple iterations. Right now it only applies
one iteration per tool action, so it is unusable in most cases. I don't
know if adding this is possible with the current smooth code while
supporting real-time preview.
- I'm not sure if it is properly integrated with the tool system and the
keymap (probably not).
- Only works with mesh, no dyntopo or multires.
- It still needs to ignore nodes with all vertices fully masked. Adding
this basic optimization will increase performance a lot when working
with high-resolution meshes.
- Previewing the deformation in real time is not the best option for
performance; I could add an operator that applies the filter without
preview.
2019-04-11 15:18:44 +02:00
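
The masked-node optimization mentioned in the notes above (skip nodes whose vertices are all fully masked, and scale the displacement of the remaining vertices by 1 - mask) can be sketched as follows. The node layout and the inflate-style filter are illustrative only.

/* Sketch of mask-aware filtering: skip fully masked nodes, scale the rest. */
#include <stdio.h>

#define VERTS_PER_NODE 3

typedef struct Node {
  float mask[VERTS_PER_NODE]; /* 1.0 = fully masked, 0.0 = unmasked */
  float z[VERTS_PER_NODE];    /* the coordinate the filter displaces */
} Node;

static int node_fully_masked(const Node *node)
{
  for (int i = 0; i < VERTS_PER_NODE; i++) {
    if (node->mask[i] < 1.0f) {
      return 0;
    }
  }
  return 1;
}

static void inflate_filter(Node *nodes, int totnode, float strength)
{
  for (int n = 0; n < totnode; n++) {
    if (node_fully_masked(&nodes[n])) {
      continue; /* nothing to do, skip the whole node */
    }
    for (int i = 0; i < VERTS_PER_NODE; i++) {
      nodes[n].z[i] += strength * (1.0f - nodes[n].mask[i]);
    }
  }
}

int main(void)
{
  Node nodes[2] = {
      {{1.0f, 1.0f, 1.0f}, {0.0f, 0.0f, 0.0f}}, /* fully masked: skipped */
      {{0.0f, 0.5f, 1.0f}, {0.0f, 0.0f, 0.0f}},
  };
  inflate_filter(nodes, 2, 1.0f);
  for (int n = 0; n < 2; n++) {
    printf("node %d: %.2f %.2f %.2f\n", n, nodes[n].z[0], nodes[n].z[1], nodes[n].z[2]);
  }
  return 0;
}
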
9f371c5ca3 Fix crash with OpenVDB filters in case the OpenVDB grid is invalid
Invalid can mean the grid is empty or the grid class is not "LevelSet";
this usually happens when the volume is empty.
2019-04-11 15:01:00 +02:00
d050badf12 Fix naming 2019-04-10 01:25:00 +02:00
e910ac7c3d Disabling an operand now skips unnecessary calculations
Useful when transforming the disabled operand, for example.
2019-04-10 00:42:48 +02:00
f23ef4db63 Odd transform jumps should be fixed now
Instead of restoring the mesh transform on the original, a copy is made.
2019-04-09 18:39:24 +02:00
8442fdb8fa Fix attempt for a bug when moving a shared CSG operand (still odd transforms) 2019-04-09 18:24:17 +02:00
92c9111cae Odd crashes on deleting objects and on load seem to be solved
It turned out I needed to copy the CSG operand list as well, so the walk
function has it available when it needs it.
2019-04-09 17:57:50 +02:00
89ff5976fb Some improvements to the CSG UI 2019-04-09 10:44:29 +02:00
a171332887 Sculpt vertex colors: Color fill operator 2019-04-09 01:18:08 +02:00
f78f60a84e Fix attempt for crashes when loading blends with volume CSG objects
Note: this isn't solved yet; there is some weird stuff going on with pointers not being in sync, etc.
It must be somewhere in the object linking system...
2019-04-09 00:46:47 +02:00
c3c265a101 Sculpt mode: Mask filter operator
It includes blur and sharpen mask filters. It is not enabled for multires
yet.
2019-04-09 00:14:41 +02:00
d594fd6dc5 Removed some unnecessary disabled code 2019-04-08 22:08:34 +02:00
ff98475fcc Yay, multi-object volume CSG is finally operational :) 2019-04-08 21:54:42 +02:00
4b721f42c8 Use the refactored functions for the voxel remesh modifier, too 2019-04-08 10:41:53 +02:00
deaa664cc0 OpenVDB: Replace dilate and erode with offset on level set
Dilate and erode filters are now working
2019-04-08 02:25:11 +02:00
7f9b00bd46 Refactor: Move remesher functions to blenderkernel 2019-04-08 01:22:39 +02:00
d2e0afdd5e Disabled a few not-yet-working options for the remesh modifier (dilate/erode) 2019-04-07 22:03:26 +02:00
4c0c29a29e Enable topology automasking for the snake hook brush 2019-04-07 21:25:22 +02:00
18a5d24611 For now, CSG options are disabled in Python, since they are not functional yet 2019-04-07 20:39:43 +02:00
ba5577eea9 First attempt at CSG operations via voxels, but not functional yet 2019-04-07 20:38:09 +02:00
42876e5f2d OpenVDB: Add Filters and CSG operations to the Level Set C API
I also refactored the voxel remesher code to use the new API.
2019-04-07 04:08:45 +02:00
ade63efcdf Initial implementation of the OpenVDB remesher inside the remesh modifier
This officially merges my patch into the sculpt-mode-features branch :)
2019-04-06 19:17:23 +02:00
6e81e5c709 Sculpt vertex colors: Initial implementation
This adds a new CustomDataLayer to store per vertex color data in a way
that can be used from sculpt mode. This vertex paint mode uses the same
brush, pbvh, undo and rendering code as sculpt mode, so it has much
better performance.  It also enables other features like sculpting and
painting at the same time.

I also added some extra features that are needed to start testing this
paint mode, such as:
- New paint brush optimized for painting. It doesn't modify any vertex
position data.
- Sculpt mode sample color operator that uses the active vertex from the
cursor
- Remesh reprojection support: sculpt vertex colors are reprojected into
the new mesh when using the remesh operator if the mesh has that option
enabled.
- Store and load sculpt vertex colors into the regular vertex color
layers to keep compatibility.
- Use the single color display mode to disable sculpt vertex colors in
sculpt mode.

A lot of decisions need to be made to integrate this into Blender, so I
decided to keep the implementation as unintrusive as possible to start
working on the tools. This has the following limitations:

- Sculpt and paint at the same time is only enabled in the draw brush.
You need to set the Color Mode to Mix in the brush options.
- Dyntopo and multires modes and viewport display are broken
- Only mix as a color blend mode
- No brush texture support
- Sculpt and color share the same strength
- Limited to one layer per object
- No rendering support. You will need to copy the sculpt vertex colors
to the old color layers first.

All of these issues should be addressed once the main workflow, final UI
and behavior are decided.
2019-04-05 17:56:58 +02:00
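
The Mix color blend mentioned above amounts to a linear interpolation between the stored vertex colour and the brush colour, with the factor built from brush strength, falloff and (1 - mask). A minimal sketch with illustrative values:

/* Sketch of a Mix blend with sculpt-style masking. Values are illustrative. */
#include <stdio.h>

static void color_mix(float dst[3], const float src[3], float fac)
{
  for (int i = 0; i < 3; i++) {
    dst[i] = dst[i] * (1.0f - fac) + src[i] * fac;
  }
}

int main(void)
{
  float vertex_color[3] = {0.8f, 0.8f, 0.8f}; /* stored per-vertex colour */
  const float brush_color[3] = {1.0f, 0.1f, 0.1f};

  const float strength = 0.5f; /* brush strength */
  const float falloff = 0.8f;  /* brush curve value at this vertex */
  const float mask = 0.25f;    /* sculpt mask at this vertex */

  color_mix(vertex_color, brush_color, strength * falloff * (1.0f - mask));
  printf("%.3f %.3f %.3f\n", vertex_color[0], vertex_color[1], vertex_color[2]);
  return 0;
}
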
159bbb7755 Enable OpenVDB by default
The voxel remesher needs OpenVDB to work. Maybe I should add an option
to disable building the voxel remesher in the future.
Patch by Jun Mizutani (@jmztn).
2019-04-03 01:28:07 +02:00
b4902ccc24 Fix crash on vertex and weight paint
I think the crash was related to the new viewport navigation. I'm not
sure if this is the correct way to fix this, but now vertex and weight
paint are working again.
2019-04-03 01:21:45 +02:00
097997c015 Fix pad error after merge 2019-04-03 01:15:22 +02:00
8b54c524be Merge branch 'master' into sculpt-mode-features 2019-04-03 01:03:31 +02:00
b0169056df Ignore normal radius on vertex and weight paint 2019-04-03 00:54:29 +02:00
d14017cdcb View Navigation: 2D viewport zooming
This implements the same concept as 2D viewport panning, but for zooming.
It prevents unexpected zoom speeds when working with a pen tablet.
Zoom speed is constant and it is based on the initial distance to the
working area. Speed parameters are hardcoded for now, but they can be
exposed as options or replaced by a variable zoom speed.
I also reduced the rotation speed when 2D viewport panning is enabled
for consistency.
This navigation mode is enabled by activating 2D viewport panning in the
navigation preferences. That option should be renamed in the future.
Enabling and disabling perspective is broken most of the time, but it
also happens in master. I'm not sure if these changes are making the
problem worse.
2019-03-30 21:27:56 +01:00
1df1337dcb Merge branch 'master' into sculpt-mode-features 2019-03-24 04:02:07 +01:00
7c194b367e Automasking System: Topology automasking initial implementation
This commit adds a general automasking system and the topology
automasking option (not topology falloff, that can be added later). It
should be faster and have fewer memory errors than the last version I
made. It is designed to work with every brush and every stroke mode, but
for now, I only added support for draw and grab to avoid changing every
brush until the system is more polished.

The main purpose of adding this in such an early stage of development is
to check if the cursor is working correctly. The purpose of the vertex
preview in the cursor was to show the active vertex for automasking, so
now I can test if the whole system is behaving as it should.

Limitations:
- Only meshes for now (no dyntopo or multires). I still need to think
about what to do with dyntopo. I don't know if brushes that generate new
topology are going to work with this.
- It only stores a binary mask (a vertex can be active or not). This
could be changed in the future to support topological falloff or other
more complex types of automasking.
- As it is now, brushes that use stroke space update the active vertex
dynamically, setting it to the closest vertex to the brush location at each
sample. This often causes masking errors with small brushes. I'm
considering changing it to always check topology using the initial
active vertex, but it is going to be much slower.
2019-03-22 16:16:51 +01:00
a92fb8a045 Add proportional editing presets to sculpt brushes
When sculpting, I always found that the behavior of the current curve
falloff system is not correct, so I added the same curves that
proportional editing uses to the brush falloff calculation. Now, curves
are easier to set up and they produce a much better result. You can
easily compare it against the old method by changing the curve preset
option from custom to smooth in the grab brush.

To make the new dam brush behave correctly, you need to select the
sharper curve preset. You can also use it to insert shapes using the
stroke anchored option and changing the curve preset.
2019-03-20 23:17:19 +01:00
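
The proportional-editing curve presets referenced above are simple closed-form falloff functions. Below is a sketch of the usual smooth/sphere/root/sharp/linear shapes, where x runs from 1 at the brush center to 0 at the radius; the exact formulas used in the branch may differ slightly.

/* Sketch of falloff presets in the spirit of proportional editing. */
#include <math.h>
#include <stdio.h>

typedef enum { FALLOFF_SMOOTH, FALLOFF_SPHERE, FALLOFF_ROOT, FALLOFF_SHARP, FALLOFF_LINEAR } FalloffType;

static float falloff_factor(FalloffType type, float x)
{
  switch (type) {
    case FALLOFF_SMOOTH: return 3.0f * x * x - 2.0f * x * x * x;
    case FALLOFF_SPHERE: return sqrtf(2.0f * x - x * x);
    case FALLOFF_ROOT:   return sqrtf(x);
    case FALLOFF_SHARP:  return x * x;
    case FALLOFF_LINEAR: return x;
  }
  return x;
}

int main(void)
{
  const char *names[] = {"smooth", "sphere", "root", "sharp", "linear"};
  for (int t = 0; t < 5; t++) {
    printf("%-7s", names[t]);
    for (float d = 0.0f; d <= 1.001f; d += 0.25f) {
      float x = 1.0f - d; /* d = normalized distance from the brush center */
      printf(" %.2f", falloff_factor((FalloffType)t, x));
    }
    printf("\n");
  }
  return 0;
}
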
55d482bf40 Mask tools: Fix crash with dyntopo and lasso mask
Ignore the front facing option with dyntopo. I will properly fix this
later.
2019-03-20 20:35:28 +01:00
726773588c Voxel Remesher: Add remesher parameters to mesh datablock
I tested several UI integrations:
- Running the remesher from an operator and adjusting the parameters with
the redo panel freezes the interface with high-res meshes. It looks
cool, but it is not that useful.
- Integrating it into a tool does not store per-object remesh
configurations. It also forces the user to switch the tool when he/she is
sculpting just to recalculate the topology.

Storing the remesher options per mesh is the best option I found. Now,
the remesher is found in the properties panel under the mesh tab. The
user can assign it to the quick favorites menu or add a keyboard
shortcut.
Remesher parameters can be adjusted there. They are stored per mesh.
This way, remesher parameters of each mesh are preserved when sculpting
different objects at different levels of details. In the future, the user
will be able to choose different remeshing algorithms (triangulation,
quadriflow...) from there and assign them to the mesh. All of them will
be executed by the same remesh operator and some of them will share some
options (like smooth normals).
2019-03-20 03:34:07 +01:00
af8fe4e94e Voxel remesher: Add support for undo from sculpt mode
Only undo for now, no redo.
This will need a lot of work and refactoring in the future (it has a lot
of bugs), but now it is safer to test new features without losing work.
2019-03-19 23:07:22 +01:00
75b1f625dd Mask tools: Option to mask only front faces with lasso mask
It still needs to be added to the keymap or as a tool setting.
2019-03-19 02:13:59 +01:00
00f2700b75 Merge branch 'master' into sculpt-mode-features 2019-03-18 17:30:19 +01:00
378baee498 Dam brush: Initial implementation
The purpose of this brush is to create sharp edges without creasing or
pinching the geometry. It is useful when sculpting cloth wrinkles, hard
surface objects or small cavities.
To avoid artifacts, set the stroke spacing to a value lower than 10 and
enable symmetry feather. This affects performance, so I am not adding it
to the defaults for now.
It does not create new geometry in dyntopo, so you will need to create
the geometry with another tool or switch to the voxel remesher workflow.
I'm not quite happy with the deformation as it is now (maybe it is too
smooth). I also need to try the brush with different pen tablets to
properly calibrate the pressure.
2019-03-18 04:36:35 +01:00
0f1c86542e Brush cursor: Fix smooth stroke line position
Reset the viewport to its original state after drawing the vertex
preview.
2019-03-18 02:31:04 +01:00
45dbf1835f Brush cursor: Fix crash with 2D falloff
Disable the normal preview when 2D falloff is enabled.
2019-03-16 19:27:10 +01:00
b88ba40183 Brush cursor: Fix bug in tiling preview 2019-03-16 03:05:03 +01:00
dc1a17501d Brush cursor: Radial symmetry and tiling preview support 2019-03-16 02:09:04 +01:00
2bf0495d5c Voxel remesher: add option to smooth normals when remeshing 2019-03-15 00:35:31 +01:00
e2f23d91b9 OpenVDB Voxel remesher: Initial implementation
This introduces a new workflow for sculpting. The voxel remesher works
with Dyntopo disabled. The user needs to run it manually once he/she
considers that the topology is too stretched to continue sculpting
normally. OpenVDB evaluates the mesh volume as a level set with a given
voxel size, and it always outputs an all-quad manifold mesh. It
automatically solves self-intersections and geometry errors produced by
booleans, which Dyntopo can't solve right now. Once the new mesh is
calculated, the user can continue sculpting without the overhead of
having Dyntopo enabled.

It still needs a proper UI and it has some issues. Undo is not working
on a remeshed object.
2019-03-14 21:54:27 +01:00
b342c293a9 Grab Brush: Use smooth curve deformation
For now, this ignores the brush curve. The brush curve doesn't make sense
in some brushes and in other ones it should be optional (like in the
grab brush). This needs design to integrate with the UI.
2019-03-13 18:09:37 +01:00
72c4264277 Brush Cursor: show minimal mouse cursor by default
The brush cursor is disabled when manipulating the viewport, so you
would not have any mouse feedback if the mouse cursor is completely
disabled. This also displays the cursor position without relying on
internal brush updates when the stroke is active.
2019-03-13 17:34:04 +01:00
91d16ceb56 View Navigation: 2D viewport panning
This option helps to keep the working area under control when working
with brushes, especially if this feature is used with rotate around
selection enabled. Now, panning speed does not depend on zoom level and
it behaves more similarly to a 2D environment.

The option certainly needs a better name and description, and it could
be a good idea to hide and disable it if rotate around selection is
disabled.

More info: D3931
2019-03-13 17:26:54 +01:00
9336694916 Add defaults and versioning to normal radius
Patch by @jmztn
2019-03-13 16:32:25 +01:00
d55ea7bf81 New sculpt brush cursor with normal radius
From D3594
2019-03-13 16:18:30 +01:00
4736 changed files with 489388 additions and 827975 deletions

.arcconfig

@@ -1,7 +1,6 @@
{
"project_id" : "Blender",
"conduit_uri" : "https://developer.blender.org/",
"phabricator.uri" : "https://developer.blender.org/",
"git.default-relative-commit" : "origin/master",
"arc.land.update.default" : "rebase",
"arc.land.onto.default" : "master"

.clang-format

@@ -221,23 +221,10 @@ ForEachMacros:
- ITER_BEGIN
- ITER_PIXELS
- ITER_SLOTS
- ITER_SLOTS_BEGIN
- LOOP_EDITED_POINTS
- LOOP_KEYS
- LOOP_POINTS
- LOOP_SELECTED_KEYS
- LOOP_SELECTED_POINTS
- LOOP_TAGGED_KEYS
- LOOP_TAGGED_POINTS
- LOOP_UNSELECTED_POINTS
- LOOP_VISIBLE_KEYS
- LOOP_VISIBLE_POINTS
- LISTBASE_CIRCULAR_BACKWARD_BEGIN
- LISTBASE_CIRCULAR_FORWARD_BEGIN
- LISTBASE_FOREACH
- LISTBASE_FOREACH_BACKWARD
- LISTBASE_FOREACH_MUTABLE
- LISTBASE_FOREACH_BACKWARD_MUTABLE
- MAN2D_ITER_AXES_BEGIN
- MAN_ITER_AXES_BEGIN
- NODE_INSTANCE_HASH_ITER

.github/stale.yml

@@ -1,22 +0,0 @@
# Configuration for probot-stale - https://github.com/probot/stale
# This file is used on Blender's GitHub mirror to automatically close any pull request
# and invite contributors to join the official development platform on blender.org
# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 1
# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
daysUntilClose: 1
# Label to use when marking as stale
staleLabel: stale
# Comment to post when closing a stale Issue or Pull Request.
closeComment: >
This issue has been automatically closed, because this repository is only
used as a mirror of git.blender.org. Blender development happens on
developer.blender.org.
To get started contributing code, please read:
https://wiki.blender.org/wiki/Process/Contributing_Code

.gitignore

@@ -40,9 +40,3 @@ Desktop.ini
# in-source lib downloads
/build_files/build_environment/downloads
# in-source buildbot signing configuration
/build_files/buildbot/codesign/config_server.py
# smoke simulation noise tile (generated)
waveletNoiseTile.bin

CMakeLists.txt

@@ -16,7 +16,6 @@
#
# The Original Code is Copyright (C) 2006, Blender Foundation
# All rights reserved.
#
# ***** END GPL LICENSE BLOCK *****
#-----------------------------------------------------------------------------
@@ -97,11 +96,6 @@ cmake_policy(SET CMP0010 NEW)
# Input directories must have CMakeLists.txt
cmake_policy(SET CMP0014 NEW)
# Silence draco warning on macOS, new policy works fine.
if(POLICY CMP0068)
cmake_policy(SET CMP0068 NEW)
endif()
#-----------------------------------------------------------------------------
# Load some macros.
include(build_files/cmake/macros.cmake)
@@ -135,6 +129,68 @@ endif()
get_blender_version()
#-----------------------------------------------------------------------------
# Platform Specific Defaults
# list of var-names
set(_init_vars)
# initialize to ON
macro(option_defaults_init)
foreach(_var ${ARGV})
set(${_var} ON)
list(APPEND _init_vars "${_var}")
endforeach()
unset(_var)
endmacro()
# remove from namespace
macro(option_defaults_clear)
foreach(_var ${_init_vars})
unset(${_var})
endforeach()
unset(_var)
unset(_init_vars)
endmacro()
# values to initialize WITH_****
option_defaults_init(
_init_BUILDINFO
_init_CODEC_FFMPEG
_init_CYCLES_OSL
_init_IMAGE_OPENEXR
_init_INPUT_NDOF
_init_JACK
_init_OPENCOLLADA
_init_OPENCOLORIO
_init_SDL
_init_FFTW3
_init_OPENSUBDIV
)
# customize...
if(UNIX AND NOT APPLE)
# some of these libraries are problematic on Linux
# disable less important dependencies by default
set(_init_CODEC_FFMPEG OFF)
set(_init_CYCLES_OSL OFF)
set(_init_IMAGE_OPENEXR OFF)
set(_init_JACK OFF)
set(_init_OPENCOLLADA OFF)
set(_init_OPENCOLORIO OFF)
set(_init_SDL OFF)
set(_init_FFTW3 OFF)
set(_init_OPENSUBDIV OFF)
set(_init_OPENVDB OFF)
set(_init_OPENIMAGEDENOISE OFF)
elseif(WIN32)
set(_init_JACK OFF)
elseif(APPLE)
set(_init_JACK OFF)
endif()
#-----------------------------------------------------------------------------
# Options
@@ -161,7 +217,7 @@ if(APPLE)
option(WITH_PYTHON_FRAMEWORK "Enable building using the Python available in the framework (OSX only)" OFF)
endif()
option(WITH_BUILDINFO "Include extra build details (only disable for development & faster builds)" ON)
option(WITH_BUILDINFO "Include extra build details (only disable for development & faster builds)" ${_init_BUILDINFO})
if(${CMAKE_VERSION} VERSION_LESS 2.8.8)
# add_library OBJECT arg unsupported
set(WITH_BUILDINFO OFF)
@@ -175,29 +231,20 @@ mark_as_advanced(BUILDINFO_OVERRIDE_TIME)
option(WITH_IK_ITASC "Enable ITASC IK solver (only disable for development & for incompatible C++ compilers)" ON)
option(WITH_IK_SOLVER "Enable Legacy IK solver (only disable for development)" ON)
option(WITH_FFTW3 "Enable FFTW3 support (Used for smoke, ocean sim, and audio effects)" ON)
option(WITH_FFTW3 "Enable FFTW3 support (Used for smoke, ocean sim, and audio effects)" ${_init_FFTW3})
option(WITH_BULLET "Enable Bullet (Physics Engine)" ON)
option(WITH_SYSTEM_BULLET "Use the systems bullet library (currently unsupported due to missing features in upstream!)" )
mark_as_advanced(WITH_SYSTEM_BULLET)
option(WITH_OPENCOLORIO "Enable OpenColorIO color management" ON)
if(APPLE)
# There's no OpenXR runtime in sight for macOS, neither is code well
# tested there -> disable it by default.
option(WITH_XR_OPENXR "Enable VR features through the OpenXR specification" OFF)
mark_as_advanced(WITH_XR_OPENXR)
else()
# Disabled until there's more than just the build system stuff. Should be enabled soon.
option(WITH_XR_OPENXR "Enable VR features through the OpenXR specification" OFF)
endif()
option(WITH_OPENCOLORIO "Enable OpenColorIO color management" ${_init_OPENCOLORIO})
# Compositor
option(WITH_COMPOSITOR "Enable the tile based nodal compositor" ON)
option(WITH_OPENIMAGEDENOISE "Enable the OpenImageDenoise compositing node" ON)
option(WITH_OPENIMAGEDENOISE "Enable the OpenImageDenoise compositing node" ${_init_OPENIMAGEDENOISE})
option(WITH_OPENSUBDIV "Enable OpenSubdiv for surface subdivision" ON)
option(WITH_OPENSUBDIV "Enable OpenSubdiv for surface subdivision" ${_init_OPENSUBDIV})
option(WITH_OPENVDB "Enable features relying on OpenVDB" ON)
option(WITH_OPENVDB_BLOSC "Enable blosc compression for OpenVDB, only enable if OpenVDB was built with blosc support" ON)
option(WITH_OPENVDB "Enable features relying on OpenVDB" ${_init_OPENVDB})
option(WITH_OPENVDB_BLOSC "Enable blosc compression for OpenVDB, only enable if OpenVDB was built with blosc support" ${_init_OPENVDB})
option(WITH_OPENVDB_3_ABI_COMPATIBLE "Assume OpenVDB library has been compiled with version 3 ABI compatibility" OFF)
mark_as_advanced(WITH_OPENVDB_3_ABI_COMPATIBLE)
@@ -216,8 +263,6 @@ endif()
option(WITH_HEADLESS "Build without graphical support (renderfarm, server mode only)" OFF)
mark_as_advanced(WITH_HEADLESS)
option(WITH_QUADRIFLOW "Build with quadriflow remesher support" ON)
option(WITH_AUDASPACE "Build with blenders audio library (only disable if you know what you're doing!)" ON)
option(WITH_SYSTEM_AUDASPACE "Build with external audaspace library installed on the system (only enable if you know what you're doing!)" OFF)
mark_as_advanced(WITH_AUDASPACE)
@@ -256,13 +301,16 @@ endif()
# Modifiers
option(WITH_MOD_FLUID "Enable Mantaflow Fluid Simulation Framework" ON)
option(WITH_MOD_FLUID "Enable Elbeem Modifier (Fluid Simulation)" ON)
option(WITH_MOD_SMOKE "Enable Smoke Modifier (Smoke Simulation)" ON)
option(WITH_MOD_REMESH "Enable Remesh Modifier" ON)
option(WITH_MOD_OCEANSIM "Enable Ocean Modifier" ON)
# option(WITH_MOD_CLOTH_ELTOPO "Enable Experimental cloth solver" OFF) # this is now only available in a branch
# mark_as_advanced(WITH_MOD_CLOTH_ELTOPO)
option(WITH_MOD_OCEANSIM "Enable Ocean Modifier" OFF)
# Image format support
option(WITH_OPENIMAGEIO "Enable OpenImageIO Support (http://www.openimageio.org)" ON)
option(WITH_IMAGE_OPENEXR "Enable OpenEXR Support (http://www.openexr.com)" ON)
option(WITH_IMAGE_OPENEXR "Enable OpenEXR Support (http://www.openexr.com)" ${_init_IMAGE_OPENEXR})
option(WITH_IMAGE_OPENJPEG "Enable OpenJpeg Support (http://www.openjpeg.org)" ON)
option(WITH_IMAGE_TIFF "Enable LibTIFF Support" ON)
option(WITH_IMAGE_DDS "Enable DDS Image Support" ON)
@@ -271,28 +319,23 @@ option(WITH_IMAGE_HDR "Enable HDR Image Support" ON)
# Audio/Video format support
option(WITH_CODEC_AVI "Enable Blenders own AVI file support (raw/jpeg)" ON)
option(WITH_CODEC_FFMPEG "Enable FFMPeg Support (http://ffmpeg.org)" ON)
option(WITH_CODEC_SNDFILE "Enable libsndfile Support (http://www.mega-nerd.com/libsndfile)" ON)
option(WITH_CODEC_FFMPEG "Enable FFMPeg Support (http://ffmpeg.org)" ${_init_CODEC_FFMPEG})
option(WITH_CODEC_SNDFILE "Enable libsndfile Support (http://www.mega-nerd.com/libsndfile)" OFF)
# Alembic support
option(WITH_ALEMBIC "Enable Alembic Support" ON)
option(WITH_ALEMBIC "Enable Alembic Support" OFF)
option(WITH_ALEMBIC_HDF5 "Enable Legacy Alembic Support (not officially supported)" OFF)
# Universal Scene Description support
option(WITH_USD "Enable Universal Scene Description (USD) Support" OFF)
# 3D format support
# Disable opencollada when we don't have precompiled libs
option(WITH_OPENCOLLADA "Enable OpenCollada Support (http://www.opencollada.org)" ON)
option(WITH_OPENCOLLADA "Enable OpenCollada Support (http://www.opencollada.org)" ${_init_OPENCOLLADA})
# Sound output
option(WITH_SDL "Enable SDL for sound and joystick support" ON)
option(WITH_SDL "Enable SDL for sound and joystick support" ${_init_SDL})
option(WITH_OPENAL "Enable OpenAL Support (http://www.openal.org)" ON)
if(NOT WIN32)
option(WITH_JACK "Enable JACK Support (http://www.jackaudio.org)" ON)
if(UNIX AND NOT APPLE)
option(WITH_JACK_DYNLOAD "Enable runtime dynamic JACK libraries loading" OFF)
endif()
option(WITH_JACK "Enable JACK Support (http://www.jackaudio.org)" ${_init_JACK})
if(UNIX AND NOT APPLE)
option(WITH_JACK_DYNLOAD "Enable runtime dynamic JACK libraries loading" OFF)
endif()
if(UNIX AND NOT APPLE)
option(WITH_SDL_DYNLOAD "Enable runtime dynamic SDL libraries loading" OFF)
@@ -308,7 +351,7 @@ option(WITH_DRACO "Enable Draco mesh compression Python module (used for
# Camera/motion tracking
option(WITH_LIBMV "Enable Libmv structure from motion library" ON)
option(WITH_LIBMV_SCHUR_SPECIALIZATIONS "Enable fixed-size schur specializations." ON)
option(WITH_LIBMV_SCHUR_SPECIALIZATIONS "Enable fixed-size schur specializations." OFF)
mark_as_advanced(WITH_LIBMV_SCHUR_SPECIALIZATIONS)
# Logging/unbit test libraries.
@@ -324,7 +367,8 @@ option(WITH_FREESTYLE "Enable Freestyle (advanced edges rendering)" ON)
if(WIN32)
option(WITH_INPUT_IME "Enable Input Method Editor (IME) for complex Asian character input" ON)
endif()
option(WITH_INPUT_NDOF "Enable NDOF input devices (SpaceNavigator and friends)" ON)
option(WITH_INPUT_NDOF "Enable NDOF input devices (SpaceNavigator and friends)" ${_init_INPUT_NDOF})
option(WITH_RAYOPTIMIZATION "Enable use of SIMD (SSE) optimizations for the raytracer" ON)
if(UNIX AND NOT APPLE)
option(WITH_INSTALL_PORTABLE "Install redistributeable runtime, otherwise install into CMAKE_INSTALL_PREFIX" ON)
option(WITH_STATIC_LIBS "Try to link with static libraries, as much as possible, to make blender more portable across distributions" OFF)
@@ -356,11 +400,14 @@ endif()
option(WITH_CPU_SSE "Enable SIMD instruction if they're detected on the host machine" ON)
mark_as_advanced(WITH_CPU_SSE)
option(WITH_QEX "Enable libQEx support" ON)
# Cycles
option(WITH_CYCLES "Enable Cycles Render Engine" ON)
option(WITH_CYCLES_STANDALONE "Build Cycles standalone application" OFF)
option(WITH_CYCLES_STANDALONE_GUI "Build Cycles standalone with GUI" OFF)
option(WITH_CYCLES_OSL "Build Cycles with OSL support" ON)
option(WITH_CYCLES_OSL "Build Cycles with OSL support" ${_init_CYCLES_OSL})
option(WITH_CYCLES_EMBREE "Build Cycles with Embree support" OFF)
option(WITH_CYCLES_CUDA_BINARIES "Build Cycles CUDA binaries" OFF)
option(WITH_CYCLES_CUBIN_COMPILER "Build cubins with nvrtc based compiler instead of nvcc" OFF)
@@ -380,7 +427,6 @@ mark_as_advanced(WITH_CYCLES_DEBUG)
mark_as_advanced(WITH_CYCLES_NATIVE_ONLY)
option(WITH_CYCLES_DEVICE_CUDA "Enable Cycles CUDA compute support" ON)
option(WITH_CYCLES_DEVICE_OPTIX "Enable Cycles OptiX support" OFF)
option(WITH_CYCLES_DEVICE_OPENCL "Enable Cycles OpenCL compute support" ON)
option(WITH_CYCLES_NETWORK "Enable Cycles compute over network support (EXPERIMENTAL and unfinished)" OFF)
mark_as_advanced(WITH_CYCLES_DEVICE_CUDA)
@@ -414,19 +460,14 @@ mark_as_advanced(WITH_CXX_GUARDEDALLOC)
option(WITH_ASSERT_ABORT "Call abort() when raising an assertion through BLI_assert()" ON)
mark_as_advanced(WITH_ASSERT_ABORT)
option(WITH_BOOST "Enable features depending on boost" ON)
option(WITH_TBB "Enable features depending on TBB (OpenVDB, OpenImageDenoise, sculpt multithreading)" ON)
# TBB malloc is only supported on for windows currently
if(WIN32)
option(WITH_TBB_MALLOC_PROXY "Enable the TBB malloc replacement" ON)
endif()
option(WITH_BOOST "Enable features depending on boost" ON)
# Unit testsing
option(WITH_GTESTS "Enable GTest unit testing" OFF)
option(WITH_OPENGL_RENDER_TESTS "Enable OpenGL render related unit testing (Experimental)" OFF)
option(WITH_OPENGL_DRAW_TESTS "Enable OpenGL UI drawing related unit testing (Experimental)" OFF)
# Documentation
if(UNIX AND NOT APPLE)
option(WITH_DOC_MANPAGE "Create a manual page (Unix manpage)" OFF)
@@ -501,8 +542,7 @@ if(CMAKE_COMPILER_IS_GNUCC OR CMAKE_C_COMPILER_ID MATCHES "Clang")
if(NOT MSVC)
find_library(COMPILER_ASAN_LIBRARY asan ${CMAKE_C_IMPLICIT_LINK_DIRECTORIES})
else()
find_library(
COMPILER_ASAN_LIBRARY NAMES clang_rt.asan-x86_64
find_library( COMPILER_ASAN_LIBRARY NAMES clang_rt.asan-x86_64
PATHS
[HKEY_LOCAL_MACHINE\\SOFTWARE\\Wow6432Node\\LLVM\\LLVM;]/lib/clang/7.0.0/lib/windows
[HKEY_LOCAL_MACHINE\\SOFTWARE\\Wow6432Node\\LLVM\\LLVM;]/lib/clang/6.0.0/lib/windows
@@ -512,11 +552,24 @@ if(CMAKE_COMPILER_IS_GNUCC OR CMAKE_C_COMPILER_ID MATCHES "Clang")
endif()
endif()
# Dependency graph
option(WITH_LEGACY_DEPSGRAPH "Build Blender with legacy dependency graph" ON)
mark_as_advanced(WITH_LEGACY_DEPSGRAPH)
if(WIN32)
# Use hardcoded paths or find_package to find externals
option(WITH_WINDOWS_FIND_MODULES "Use find_package to locate libraries" OFF)
mark_as_advanced(WITH_WINDOWS_FIND_MODULES)
option(WITH_WINDOWS_CODESIGN "Use signtool to sign the final binary." OFF)
mark_as_advanced(WITH_WINDOWS_CODESIGN)
set(WINDOWS_CODESIGN_PFX CACHE FILEPATH "Path to pfx file to use for codesigning.")
mark_as_advanced(WINDOWS_CODESIGN_PFX)
set(WINDOWS_CODESIGN_PFX_PASSWORD CACHE STRING "password for pfx file used for codesigning.")
mark_as_advanced(WINDOWS_CODESIGN_PFX_PASSWORD)
option(WINDOWS_USE_VISUAL_STUDIO_PROJECT_FOLDERS "Organize the visual studio projects according to source folder structure." ON)
mark_as_advanced(WINDOWS_USE_VISUAL_STUDIO_PROJECT_FOLDERS)
@@ -525,10 +578,6 @@ if(WIN32)
option(WINDOWS_PYTHON_DEBUG "Include the files needed for debugging python scripts with visual studio 2017+." OFF)
mark_as_advanced(WINDOWS_PYTHON_DEBUG)
option(WITH_WINDOWS_BUNDLE_CRT "Bundle the C runtime for install free distribution." ON)
mark_as_advanced(WITH_WINDOWS_BUNDLE_CRT)
endif()
# The following only works with the Ninja generator in CMake >= 3.0.
@@ -539,14 +588,8 @@ if("${CMAKE_GENERATOR}" MATCHES "Ninja")
mark_as_advanced(WITH_NINJA_POOL_JOBS)
endif()
if(UNIX AND NOT APPLE)
option(WITH_CXX11_ABI "Use native C++11 ABI of compiler" ON)
mark_as_advanced(WITH_CXX11_ABI)
endif()
# Installation process.
option(POSTINSTALL_SCRIPT "Run given CMake script after installation process" OFF)
mark_as_advanced(POSTINSTALL_SCRIPT)
# avoid using again
option_defaults_clear()
# end option(...)
@@ -623,28 +666,28 @@ endif()
# enable boost for cycles, audaspace or i18n
# otherwise if the user disabled
if(NOT WITH_BOOST)
# Explicitly disabled. so disable all deps.
macro(set_and_warn
_setting _val)
if(${${_setting}})
message(STATUS "'WITH_BOOST' is disabled: forcing 'set(${_setting} ${_val})'")
endif()
set(${_setting} ${_val})
endmacro()
set_and_warn_dependency(WITH_BOOST WITH_CYCLES OFF)
set_and_warn_dependency(WITH_BOOST WITH_INTERNATIONAL OFF)
set_and_warn_dependency(WITH_BOOST WITH_OPENVDB OFF)
set_and_warn_dependency(WITH_BOOST WITH_OPENCOLORIO OFF)
set_and_warn_dependency(WITH_BOOST WITH_QUADRIFLOW OFF)
set_and_warn_dependency(WITH_BOOST WITH_USD OFF)
if(WITH_BOOST AND NOT (WITH_CYCLES OR WITH_OPENIMAGEIO OR WITH_INTERNATIONAL OR
WITH_OPENVDB OR WITH_OPENCOLORIO OR WITH_USD))
message(STATUS "No dependencies need 'WITH_BOOST' forcing WITH_BOOST=OFF")
set_and_warn(WITH_CYCLES OFF)
set_and_warn(WITH_INTERNATIONAL OFF)
set_and_warn(WITH_OPENVDB OFF)
set_and_warn(WITH_OPENCOLORIO OFF)
elseif(WITH_CYCLES OR WITH_OPENIMAGEIO OR WITH_INTERNATIONAL OR
WITH_OPENVDB OR WITH_OPENCOLORIO)
# Keep enabled
else()
# Disable boost if not needed.
set(WITH_BOOST OFF)
endif()
set_and_warn_dependency(WITH_TBB WITH_USD OFF)
set_and_warn_dependency(WITH_TBB WITH_OPENIMAGEDENOISE OFF)
set_and_warn_dependency(WITH_TBB WITH_OPENVDB OFF)
set_and_warn_dependency(WITH_TBB WITH_MOD_FLUID OFF)
# OpenVDB uses 'half' type from OpenEXR & fails to link without OpenEXR enabled.
set_and_warn_dependency(WITH_IMAGE_OPENEXR WITH_OPENVDB OFF)
# auto enable openimageio for cycles
if(WITH_CYCLES)
set(WITH_OPENIMAGEIO ON)
@@ -716,12 +759,14 @@ if(NOT WITH_CUDA_DYNLOAD)
endif()
#-----------------------------------------------------------------------------
# Check check if submodules are cloned
# Check for valid directories
# ... a partial checkout may cause this.
#
# note: we need to check for a known subdir in both cases.
# since uninitialized git submodules will give blank dirs
if(WITH_INTERNATIONAL)
file(GLOB RESULT "${CMAKE_SOURCE_DIR}/release/datafiles/locale")
list(LENGTH RESULT DIR_LEN)
if(DIR_LEN EQUAL 0)
if(NOT EXISTS "${CMAKE_SOURCE_DIR}/release/datafiles/locale/languages")
message(WARNING
"Translation path '${CMAKE_SOURCE_DIR}/release/datafiles/locale' is missing, "
"This is a 'git submodule', which are known not to work with bridges to other version "
@@ -743,9 +788,7 @@ if(WITH_PYTHON)
message(FATAL_ERROR "At least Python 3.7 is required to build")
endif()
file(GLOB RESULT "${CMAKE_SOURCE_DIR}/release/scripts/addons")
list(LENGTH RESULT DIR_LEN)
if(DIR_LEN EQUAL 0)
if(NOT EXISTS "${CMAKE_SOURCE_DIR}/release/scripts/addons/modules")
message(WARNING
"Addons path '${CMAKE_SOURCE_DIR}/release/scripts/addons' is missing, "
"This is a 'git submodule', which are known not to work with bridges to other version "
@@ -794,14 +837,67 @@ if(NOT CMAKE_BUILD_TYPE MATCHES "Release")
if(MSVC)
set(COMPILER_ASAN_LINKER_FLAGS "/FUNCTIONPADMIN:6")
endif()
if(COMPILER_ASAN_LIBRARY)
set(PLATFORM_LINKLIBS "${PLATFORM_LINKLIBS};${COMPILER_ASAN_LIBRARY}")
set(PLATFORM_LINKFLAGS "${COMPILER_ASAN_LIBRARY} ${COMPILER_ASAN_LINKER_FLAGS}")
set(PLATFORM_LINKFLAGS_DEBUG "${COMPILER_ASAN_LIBRARY} ${COMPILER_ASAN_LINKER_FLAGS}")
endif()
set(PLATFORM_LINKLIBS "${PLATFORM_LINKLIBS};${COMPILER_ASAN_LIBRARY}")
set(PLATFORM_LINKFLAGS "${COMPILER_ASAN_LIBRARY} ${COMPILER_ASAN_LINKER_FLAGS}")
set(PLATFORM_LINKFLAGS_DEBUG "${COMPILER_ASAN_LIBRARY} ${COMPILER_ASAN_LINKER_FLAGS}")
endif()
endif()
#-----------------------------------------------------------------------------
#Platform specifics
if(WITH_X11)
find_package(X11 REQUIRED)
find_path(X11_XF86keysym_INCLUDE_PATH X11/XF86keysym.h ${X11_INC_SEARCH_PATH})
mark_as_advanced(X11_XF86keysym_INCLUDE_PATH)
list(APPEND PLATFORM_LINKLIBS ${X11_X11_LIB})
if(WITH_X11_XINPUT)
if(X11_Xinput_LIB)
list(APPEND PLATFORM_LINKLIBS ${X11_Xinput_LIB})
else()
message(FATAL_ERROR "LibXi not found. Disable WITH_X11_XINPUT if you
want to build without tablet support")
endif()
endif()
if(WITH_X11_XF86VMODE)
# XXX, why doesn't cmake make this available?
find_library(X11_Xxf86vmode_LIB Xxf86vm ${X11_LIB_SEARCH_PATH})
mark_as_advanced(X11_Xxf86vmode_LIB)
if(X11_Xxf86vmode_LIB)
list(APPEND PLATFORM_LINKLIBS ${X11_Xxf86vmode_LIB})
else()
message(FATAL_ERROR "libXxf86vm not found. Disable WITH_X11_XF86VMODE if you
want to build without")
endif()
endif()
if(WITH_X11_XFIXES)
if(X11_Xfixes_LIB)
list(APPEND PLATFORM_LINKLIBS ${X11_Xfixes_LIB})
else()
message(FATAL_ERROR "libXfixes not found. Disable WITH_X11_XFIXES if you
want to build without")
endif()
endif()
if(WITH_X11_ALPHA)
find_library(X11_Xrender_LIB Xrender ${X11_LIB_SEARCH_PATH})
mark_as_advanced(X11_Xrender_LIB)
if(X11_Xrender_LIB)
list(APPEND PLATFORM_LINKLIBS ${X11_Xrender_LIB})
else()
message(FATAL_ERROR "libXrender not found. Disable WITH_X11_ALPHA if you
want to build without")
endif()
endif()
endif()
# ----------------------------------------------------------------------------
# Main Platform Checks
#
@@ -911,28 +1007,6 @@ if(NOT WITH_SYSTEM_EIGEN3)
set(EIGEN3_INCLUDE_DIRS ${CMAKE_SOURCE_DIR}/extern/Eigen3)
endif()
if(WITH_OPENVDB)
list(APPEND OPENVDB_DEFINITIONS -DWITH_OPENVDB)
if(WITH_OPENVDB_3_ABI_COMPATIBLE)
list(APPEND OPENVDB_DEFINITIONS -DOPENVDB_3_ABI_COMPATIBLE)
endif()
list(APPEND OPENVDB_INCLUDE_DIRS
${BOOST_INCLUDE_DIR}
${TBB_INCLUDE_DIRS}
${OPENEXR_INCLUDE_DIRS})
list(APPEND OPENVDB_LIBRARIES ${OPENEXR_LIBRARIES} ${ZLIB_LIBRARIES})
if(WITH_OPENVDB_BLOSC)
list(APPEND OPENVDB_DEFINITIONS -DWITH_OPENVDB_BLOSC)
list(APPEND OPENVDB_LIBRARIES ${BLOSC_LIBRARIES} ${ZLIB_LIBRARIES})
endif()
list(APPEND OPENVDB_LIBRARIES ${BOOST_LIBRARIES} ${TBB_LIBRARIES})
endif()
#-----------------------------------------------------------------------------
# Configure OpenGL.
@@ -956,7 +1030,7 @@ if(WITH_GL_PROFILE_ES20)
)
endif()
list(APPEND BLENDER_GL_LIBRARIES "${OPENGLES_LIBRARY}")
list(APPEND BLENDER_GL_LIBRARIES OPENGLES_LIBRARY)
else()
set(OPENGLES_LIBRARY "" CACHE FILEPATH "OpenGL ES 2.0 library file")
@@ -1016,10 +1090,7 @@ else()
endif()
if(WITH_GL_EGL)
find_package(OpenGL REQUIRED EGL)
list(APPEND BLENDER_GL_LIBRARIES OpenGL::EGL)
list(APPEND GL_DEFINITIONS -DWITH_GL_EGL -DGLEW_EGL -DGLEW_INC_EGL)
list(APPEND GL_DEFINITIONS -DWITH_GL_EGL)
if(WITH_SYSTEM_GLES)
if(NOT OPENGLES_EGL_LIBRARY)
@@ -1029,7 +1100,7 @@ if(WITH_GL_EGL)
)
endif()
list(APPEND BLENDER_GL_LIBRARIES ${OPENGLES_EGL_LIBRARY})
list(APPEND BLENDER_GL_LIBRARIES OPENGLES_EGL_LIBRARY)
else()
set(OPENGLES_EGL_LIBRARY "" CACHE FILEPATH "EGL library file")
@@ -1069,6 +1140,10 @@ else()
list(APPEND GL_DEFINITIONS -DWITH_GL_PROFILE_CORE)
endif()
if(WITH_GL_EGL)
list(APPEND GL_DEFINITIONS -DWITH_EGL)
endif()
#-----------------------------------------------------------------------------
# Configure OpenMP.
if(WITH_OPENMP)
@@ -1111,7 +1186,6 @@ if(WITH_SYSTEM_GLEW)
message(FATAL_ERROR "GLEW is required to build Blender. Install it or disable WITH_SYSTEM_GLEW.")
endif()
set(GLEW_INCLUDE_PATH "${GLEW_INCLUDE_DIR}")
set(BLENDER_GLEW_LIBRARIES ${GLEW_LIBRARY})
else()
if(WITH_GLEW_ES)
@@ -1140,6 +1214,10 @@ else()
list(APPEND GL_DEFINITIONS -DGL_ES_VERSION_1_0=0 -DGL_ES_VERSION_CL_1_1=0 -DGL_ES_VERSION_CM_1_1=0)
endif()
if(WITH_GL_EGL)
list(APPEND GL_DEFINITIONS -DGLEW_INC_EGL)
endif()
set(BLENDER_GLEW_LIBRARIES extern_glew_es bf_intern_glew_mx)
else()
@@ -1320,16 +1398,13 @@ if(CMAKE_COMPILER_IS_GNUCC)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_LOGICAL_OP -Wlogical-op)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_UNDEF -Wundef)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_INIT_SELF -Winit-self) # needs -Wuninitialized
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_NO_NULL -Wnonnull) # C only
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_MISSING_INCLUDE_DIRS -Wmissing-include-dirs)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_NO_DIV_BY_ZERO -Wno-div-by-zero)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_TYPE_LIMITS -Wtype-limits)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_FORMAT_SIGN -Wformat-signedness)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_RESTRICT -Wrestrict)
# C-only.
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_NO_NULL -Wnonnull)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_ABSOLUTE_VALUE -Wabsolute-value)
# gcc 4.2 gives annoying warnings on every file with this
if(NOT "${CMAKE_C_COMPILER_VERSION}" VERSION_LESS "4.3")
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_UNINITIALIZED -Wuninitialized)
@@ -1362,6 +1437,7 @@ if(CMAKE_COMPILER_IS_GNUCC)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_WARN_NO_DIV_BY_ZERO -Wno-div-by-zero)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_WARN_TYPE_LIMITS -Wtype-limits)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_WARN_ERROR_RETURN_TYPE -Werror=return-type)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_WARN_ERROR_IMPLICIT_FUNCTION_DECLARATION -Werror=implicit-function-declaration)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_WARN_NO_CHAR_SUBSCRIPTS -Wno-char-subscripts)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_WARN_NO_UNKNOWN_PRAGMAS -Wno-unknown-pragmas)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_WARNINGS CXX_WARN_POINTER_ARITH -Wpointer-arith)
@@ -1436,7 +1512,6 @@ elseif(CMAKE_C_COMPILER_ID MATCHES "Clang")
# flags to undo strict flags
ADD_CHECK_C_COMPILER_FLAG(C_REMOVE_STRICT_FLAGS C_WARN_NO_UNUSED_PARAMETER -Wno-unused-parameter)
ADD_CHECK_C_COMPILER_FLAG(C_REMOVE_STRICT_FLAGS C_WARN_NO_UNUSED_VARIABLE -Wno-unused-variable)
ADD_CHECK_C_COMPILER_FLAG(C_REMOVE_STRICT_FLAGS C_WARN_NO_UNUSED_MACROS -Wno-unused-macros)
ADD_CHECK_C_COMPILER_FLAG(C_REMOVE_STRICT_FLAGS C_WARN_NO_MISSING_VARIABLE_DECLARATIONS -Wno-missing-variable-declarations)
@@ -1455,8 +1530,6 @@ elseif(CMAKE_C_COMPILER_ID MATCHES "Clang")
ADD_CHECK_CXX_COMPILER_FLAG(CXX_REMOVE_STRICT_FLAGS CXX_WARN_NO_REORDER -Wno-reorder)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_REMOVE_STRICT_FLAGS CXX_WARN_NO_COMMENT -Wno-comment)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_REMOVE_STRICT_FLAGS CXX_WARN_NO_UNUSED_TYPEDEFS -Wno-unused-local-typedefs)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_REMOVE_STRICT_FLAGS CXX_WARN_NO_UNDEFINED_VAR_TEMPLATE -Wno-undefined-var-template)
ADD_CHECK_CXX_COMPILER_FLAG(CXX_REMOVE_STRICT_FLAGS CXX_WARN_NO_INSTANTIATION_AFTER_SPECIALIZATION -Wno-instantiation-after-specialization)
elseif(CMAKE_C_COMPILER_ID MATCHES "Intel")
@@ -1501,7 +1574,7 @@ elseif(CMAKE_C_COMPILER_ID MATCHES "MSVC")
if(MSVC_VERSION GREATER_EQUAL 1911)
# see https://docs.microsoft.com/en-us/cpp/error-messages/compiler-warnings/c5038?view=vs-2017
set(_WARNINGS "${_WARNINGS} /w35038") # order of initialization in c++ constructors
set(_WARNINGS "${_WARNINGS} /w35038") #order of initialisation in c++ constructors
endif()
string(REPLACE ";" " " _WARNINGS "${_WARNINGS}")
@@ -1541,19 +1614,15 @@ if(WITH_PYTHON)
endif()
endif()
if(MSVC)
# MSVC needs to be tested first, since clang on windows will
# match the compiler test below but clang-cl does not accept -std=c++11
# since it is on by default and cannot be turned off.
#
# Nothing special is needed, C++11 features are available by default.
elseif(
if(
CMAKE_COMPILER_IS_GNUCC OR
CMAKE_C_COMPILER_ID MATCHES "Clang" OR
CMAKE_C_COMPILER_ID MATCHES "Intel"
)
# TODO(sergey): Do we want c++11 or gnu-c++11 here?
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")
elseif(MSVC)
# Nothing special is needed, C++11 features are available by default.
else()
message(FATAL_ERROR "Unknown compiler ${CMAKE_C_COMPILER_ID}, can't enable C++11 build")
endif()
@@ -1569,12 +1638,6 @@ if(
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -std=gnu11")
endif()
if(UNIX AND NOT APPLE)
if(NOT WITH_CXX11_ABI)
set(PLATFORM_CFLAGS "${PLATFORM_CFLAGS} -D_GLIBCXX_USE_CXX11_ABI=0")
endif()
endif()
# Include warnings first, so its possible to disable them with user defined flags
# eg: -Wno-uninitialized
set(CMAKE_C_FLAGS "${C_WARNINGS} ${CMAKE_C_FLAGS} ${PLATFORM_CFLAGS}")
@@ -1707,16 +1770,14 @@ if(FIRST_RUN)
info_cfg_option(WITH_CYCLES)
info_cfg_option(WITH_FREESTYLE)
info_cfg_option(WITH_OPENCOLORIO)
info_cfg_option(WITH_XR_OPENXR)
info_cfg_option(WITH_OPENIMAGEDENOISE)
info_cfg_option(WITH_OPENVDB)
info_cfg_option(WITH_ALEMBIC)
info_cfg_option(WITH_QUADRIFLOW)
info_cfg_option(WITH_USD)
info_cfg_text("Compiler Options:")
info_cfg_option(WITH_BUILDINFO)
info_cfg_option(WITH_OPENMP)
info_cfg_option(WITH_RAYOPTIMIZATION)
info_cfg_text("System Options:")
info_cfg_option(WITH_INSTALL_PORTABLE)

GNUmakefile

@@ -62,7 +62,8 @@ Testing Targets
Not associated with building Blender.
* test:
Run automated tests with ctest.
Run ctest, currently tests import/export,
operator execution and that python modules load
* test_cmake:
Runs our own cmake file checker
which detects errors in the cmake file list definitions
@@ -118,7 +119,7 @@ Utilities
Example
make icons_geom BLENDER_BIN=/path/to/blender
* source_archive:
* tgz:
Create a compressed archive of the source code.
* update:
@@ -163,8 +164,9 @@ CPU:=$(shell uname -m)
BLENDER_DIR:=$(shell pwd -P)
BUILD_TYPE:=Release
# CMake arguments, assigned to local variable to make it mutable.
CMAKE_CONFIG_ARGS := $(BUILD_CMAKE_ARGS)
ifndef BUILD_CMAKE_ARGS
BUILD_CMAKE_ARGS:=
endif
ifndef BUILD_DIR
BUILD_DIR:=$(shell dirname "$(BLENDER_DIR)")/build_$(OS_NCASE)
@@ -191,16 +193,6 @@ ifndef PYTHON
PYTHON:=python3
endif
# For macOS python3 is not installed by default, so fallback to python binary
# in libraries, or python 2 for running make update to get it.
ifeq ($(OS_NCASE),darwin)
ifeq (, $(shell command -v $(PYTHON)))
PYTHON:=../lib/darwin/python/bin/python3.7m
ifeq (, $(shell command -v $(PYTHON)))
PYTHON:=python
endif
endif
endif
# -----------------------------------------------------------------------------
# additional targets for the build configuration
@@ -212,48 +204,41 @@ ifneq "$(findstring debug, $(MAKECMDGOALS))" ""
endif
ifneq "$(findstring full, $(MAKECMDGOALS))" ""
BUILD_DIR:=$(BUILD_DIR)_full
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/blender_full.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/blender_full.cmake"
endif
ifneq "$(findstring lite, $(MAKECMDGOALS))" ""
BUILD_DIR:=$(BUILD_DIR)_lite
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/blender_lite.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/blender_lite.cmake"
endif
ifneq "$(findstring cycles, $(MAKECMDGOALS))" ""
BUILD_DIR:=$(BUILD_DIR)_cycles
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/cycles_standalone.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/cycles_standalone.cmake"
endif
ifneq "$(findstring headless, $(MAKECMDGOALS))" ""
BUILD_DIR:=$(BUILD_DIR)_headless
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/blender_headless.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/blender_headless.cmake"
endif
ifneq "$(findstring bpy, $(MAKECMDGOALS))" ""
BUILD_DIR:=$(BUILD_DIR)_bpy
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/bpy_module.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/bpy_module.cmake"
endif
ifneq "$(findstring developer, $(MAKECMDGOALS))" ""
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/blender_developer.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/blender_developer.cmake"
endif
# -----------------------------------------------------------------------------
# build tool
ifneq "$(findstring ninja, $(MAKECMDGOALS))" ""
CMAKE_CONFIG_ARGS:=$(CMAKE_CONFIG_ARGS) -G Ninja
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -G Ninja
BUILD_COMMAND:=ninja
DEPS_BUILD_COMMAND:=ninja
else
ifneq ("$(wildcard $(BUILD_DIR)/build.ninja)","")
BUILD_COMMAND:=ninja
else
BUILD_COMMAND:=make -s
endif
ifneq ("$(wildcard $(DEPS_BUILD_DIR)/build.ninja)","")
DEPS_BUILD_COMMAND:=ninja
else
DEPS_BUILD_COMMAND:=make -s
endif
endif
# -----------------------------------------------------------------------------
@@ -275,10 +260,7 @@ ifndef NPROCS
ifeq ($(OS), Linux)
NPROCS:=$(shell nproc)
endif
ifeq ($(OS), NetBSD)
NPROCS:=$(shell getconf NPROCESSORS_ONLN)
endif
ifneq (,$(filter $(OS),Darwin FreeBSD))
ifneq (,$(filter $(OS),Darwin FreeBSD NetBSD))
NPROCS:=$(shell sysctl -n hw.ncpu)
endif
endif
@@ -287,7 +269,7 @@ endif
# -----------------------------------------------------------------------------
# Macro for configuring cmake
CMAKE_CONFIG = cmake $(CMAKE_CONFIG_ARGS) \
CMAKE_CONFIG = cmake $(BUILD_CMAKE_ARGS) \
-H"$(BLENDER_DIR)" \
-B"$(BUILD_DIR)" \
-DCMAKE_BUILD_TYPE_INIT:STRING=$(BUILD_TYPE)
@@ -351,7 +333,7 @@ deps: .FORCE
@echo
@echo Building dependencies ...
$(DEPS_BUILD_COMMAND) -C "$(DEPS_BUILD_DIR)" -j $(NPROCS) $(DEPS_TARGET)
$(BUILD_COMMAND) -C "$(DEPS_BUILD_DIR)" -j $(NPROCS) $(DEPS_TARGET)
@echo
@echo Dependencies successfully built and installed to $(DEPS_INSTALL_DIR).
@echo
@@ -386,7 +368,7 @@ package_archive: .FORCE
# Tests
#
test: .FORCE
$(PYTHON) ./build_files/utils/make_test.py "$(BUILD_DIR)"
cd $(BUILD_DIR) ; ctest . --output-on-failure
# run pep8 check on scripts we distribute.
test_pep8: .FORCE
@@ -527,8 +509,8 @@ check_descriptions: .FORCE
# Utilities
#
source_archive: .FORCE
./build_files/utils/make_source_archive.sh
tgz: .FORCE
./build_files/utils/build_tgz.sh
INKSCAPE_BIN?="inkscape"
icons: .FORCE
@@ -542,11 +524,21 @@ icons_geom: .FORCE
"$(BLENDER_DIR)/release/datafiles/blender_icons_geom_update.py"
update: .FORCE
$(PYTHON) ./build_files/utils/make_update.py
if [ "$(OS_NCASE)" = "darwin" ] && [ ! -d "../lib/$(OS_NCASE)" ]; then \
svn checkout https://svn.blender.org/svnroot/bf-blender/trunk/lib/$(OS_NCASE) ../lib/$(OS_NCASE) ; \
fi
if [ -d "../lib" ]; then \
svn cleanup ../lib/* ; \
svn update ../lib/* ; \
fi
git pull --rebase
git submodule update --init --recursive
git submodule foreach git checkout master
git submodule foreach git pull --rebase origin master
format: .FORCE
PATH="../lib/${OS_NCASE}_${CPU}/llvm/bin/:../lib/${OS_NCASE}_centos7_${CPU}/llvm/bin/:../lib/${OS_NCASE}/llvm/bin/:$(PATH)" \
$(PYTHON) source/tools/utils_maintenance/clang_format_paths.py $(PATHS)
PATH="../lib/${OS_NCASE}/llvm/bin/:$(PATH)" \
python3 source/tools/utils_maintenance/clang_format_paths.py $(PATHS)
# -----------------------------------------------------------------------------
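The -C"$(BLENDER_DIR)/build_files/cmake/config/..." arguments wired up above work by pre-loading a CMake initial-cache script before the source tree is configured, so each make goal (full, lite, headless, bpy, ...) simply selects a different set of cached options. A minimal sketch of what such a config script looks like; the option names are taken from this diff for illustration only, and the real files contain many more entries.

# Example initial-cache script, loaded with: cmake -C config.cmake <source dir>
# Each entry pre-seeds an option before the project's own defaults apply.
set(WITH_CYCLES  OFF CACHE BOOL "" FORCE)
set(WITH_OPENVDB OFF CACHE BOOL "" FORCE)
set(WITH_ALEMBIC OFF CACHE BOOL "" FORCE)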

View File

@@ -43,7 +43,6 @@ project("BlenderDependencies")
cmake_minimum_required(VERSION 3.5)
include(ExternalProject)
include(cmake/check_software.cmake)
include(cmake/options.cmake)
include(cmake/versions.cmake)
@@ -58,12 +57,14 @@ else()
endif()
include(cmake/zlib.cmake)
include(cmake/blendthumb.cmake)
include(cmake/openal.cmake)
include(cmake/png.cmake)
include(cmake/jpeg.cmake)
include(cmake/boost.cmake)
include(cmake/blosc.cmake)
include(cmake/pthreads.cmake)
include(cmake/ilmbase.cmake)
include(cmake/openexr.cmake)
include(cmake/freetype.cmake)
include(cmake/freeglut.cmake)
@@ -91,20 +92,21 @@ include(cmake/python.cmake)
include(cmake/python_site_packages.cmake)
include(cmake/package_python.cmake)
include(cmake/numpy.cmake)
include(cmake/usd.cmake)
if(UNIX)
if(UNIX AND NOT APPLE)
# Rely on PugiXML compiled with OpenImageIO
else()
include(cmake/pugixml.cmake)
endif()
include(cmake/openimagedenoise.cmake)
include(cmake/embree.cmake)
include(cmake/xr_openxr.cmake)
if(WITH_WEBP)
include(cmake/webp.cmake)
endif()
if(WITH_EMBREE)
include(cmake/embree.cmake)
endif()
if(WIN32)
# HMD branch deps
include(cmake/hidapi.cmake)
@@ -112,7 +114,7 @@ if(WIN32)
include(cmake/tinyxml.cmake)
include(cmake/yamlcpp.cmake)
# LCMS is an OCIO dep, but only if you build the apps, leaving it here for convenience
# include(cmake/lcms.cmake)
#include(cmake/lcms.cmake)
endif()
@@ -126,7 +128,6 @@ if(NOT WIN32 OR ENABLE_MINGW64)
include(cmake/ogg.cmake)
include(cmake/vorbis.cmake)
include(cmake/theora.cmake)
include(cmake/opus.cmake)
include(cmake/vpx.cmake)
include(cmake/x264.cmake)
include(cmake/xvidcore.cmake)
@@ -156,9 +157,9 @@ if(UNIX)
include(cmake/sqlite.cmake)
endif()
if(UNIX AND NOT APPLE)
include(cmake/libglu.cmake)
include(cmake/mesa.cmake)
endif()
include(cmake/openmesh.cmake)
include(cmake/lapack.cmake)
include(cmake/igl.cmake)
include(cmake/qex.cmake)
include(cmake/harvest.cmake)

View File

@@ -26,6 +26,12 @@ if(ALEMBIC_HDF5)
endif()
endif()
if(WIN32)
set(ALEMBIC_ILMBASE ${LIBDIR}/openexr)
else()
set(ALEMBIC_ILMBASE ${LIBDIR}/ilmbase)
endif()
set(ALEMBIC_EXTRA_ARGS
-DBUILDSTATIC=ON
-DLINKSTATIC=ON
@@ -34,17 +40,17 @@ set(ALEMBIC_EXTRA_ARGS
-DBoost_USE_MULTITHREADED=ON
-DUSE_STATIC_BOOST=On
-DBoost_USE_STATIC_LIBS=ON
-DBoost_USE_STATIC_RUNTIME=OFF
-DBoost_USE_STATIC_RUNTIME=ON
-DBoost_DEBUG=ON
-DBOOST_ROOT=${LIBDIR}/boost
-DBoost_NO_SYSTEM_PATHS=ON
-DILMBASE_ROOT=${LIBDIR}/openexr
-DALEMBIC_ILMBASE_INCLUDE_DIRECTORY=${LIBDIR}/openexr/include/OpenEXR
-DALEMBIC_ILMBASE_HALF_LIB=${LIBDIR}/openexr/lib/${LIBPREFIX}Half${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DALEMBIC_ILMBASE_IMATH_LIB=${LIBDIR}/openexr/lib/${LIBPREFIX}Imath${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DALEMBIC_ILMBASE_ILMTHREAD_LIB=${LIBDIR}/openexr/lib/${LIBPREFIX}IlmThread${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DALEMBIC_ILMBASE_IEX_LIB=${LIBDIR}/openexr/lib/${LIBPREFIX}Iex${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DALEMBIC_ILMBASE_IEXMATH_LIB=${LIBDIR}/openexr/lib/${LIBPREFIX}IexMath${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DILMBASE_ROOT=${ALEMBIC_ILMBASE}
-DALEMBIC_ILMBASE_INCLUDE_DIRECTORY=${ALEMBIC_ILMBASE}/include/OpenEXR
-DALEMBIC_ILMBASE_HALF_LIB=${ALEMBIC_ILMBASE}/lib/${LIBPREFIX}Half${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DALEMBIC_ILMBASE_IMATH_LIB=${ALEMBIC_ILMBASE}/lib/${LIBPREFIX}Imath${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DALEMBIC_ILMBASE_ILMTHREAD_LIB=${ALEMBIC_ILMBASE}/lib/${LIBPREFIX}IlmThread${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DALEMBIC_ILMBASE_IEX_LIB=${ALEMBIC_ILMBASE}/lib/${LIBPREFIX}Iex${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DALEMBIC_ILMBASE_IEXMATH_LIB=${ALEMBIC_ILMBASE}/lib/${LIBPREFIX}IexMath${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DUSE_PYILMBASE=0
-DUSE_PYALEMBIC=0
-DUSE_ARNOLD=0
@@ -94,5 +100,6 @@ add_dependencies(
external_alembic
external_boost
external_zlib
external_ilmbase
external_openexr
)

View File

@@ -0,0 +1,67 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
if(BUILD_MODE STREQUAL Release)
if(WIN32)
set(THUMB_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../../release/windows/blendthumb)
ExternalProject_Add(external_zlib_32
URL ${ZLIB_URI}
CMAKE_GENERATOR ${GENERATOR_32}
URL_HASH MD5=${ZLIB_HASH}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
PREFIX ${BUILD_DIR}/zlib32
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/zlib32 ${DEFAULT_CMAKE_FLAGS}
INSTALL_DIR ${LIBDIR}/zlib32
)
ExternalProject_Add(external_zlib_64
URL ${ZLIB_URI}
CMAKE_GENERATOR ${GENERATOR_64}
URL_HASH MD5=${ZLIB_HASH}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
PREFIX ${BUILD_DIR}/zlib64
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/zlib64 ${DEFAULT_CMAKE_FLAGS}
INSTALL_DIR ${LIBDIR}/zlib64
)
ExternalProject_Add(external_blendthumb_32
CMAKE_GENERATOR ${GENERATOR_32}
SOURCE_DIR ${THUMB_DIR}
PREFIX ${BUILD_DIR}/blendthumb32
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/blendThumb32 ${DEFAULT_CMAKE_FLAGS} -DZLIB_INCLUDE=${LIBDIR}/zlib32/include -DZLIB_LIBS=${LIBDIR}/zlib32/lib/zlibstatic.lib
INSTALL_DIR ${LIBDIR}/blendthumb32
)
add_dependencies(
external_blendthumb_32
external_zlib_32
)
ExternalProject_Add(external_blendthumb_64
CMAKE_GENERATOR ${GENERATOR_64}
SOURCE_DIR ${THUMB_DIR}
PREFIX ${BUILD_DIR}/blendthumb64
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/blendThumb64 ${DEFAULT_CMAKE_FLAGS} -DZLIB_INCLUDE=${LIBDIR}/zlib64/include -DZLIB_LIBS=${LIBDIR}/zlib64/lib/zlibstatic.lib
INSTALL_DIR ${LIBDIR}/blendthumb64
)
add_dependencies(
external_blendthumb_64
external_zlib_64
)
endif()
endif()

View File

@@ -29,11 +29,13 @@ set(BLOSC_EXTRA_ARGS
-DCMAKE_POSITION_INDEPENDENT_CODE=ON
)
# Prevent blosc from including its own local copy of zlib in the object file
# and causing linker errors with everybody else.
set(BLOSC_EXTRA_ARGS ${BLOSC_EXTRA_ARGS}
-DPREFER_EXTERNAL_ZLIB=ON
)
if(WIN32)
#prevent blosc from including it's own local copy of zlib in the object file
#and cause linker errors with everybody else
set(BLOSC_EXTRA_ARGS ${BLOSC_EXTRA_ARGS}
-DPREFER_EXTERNAL_ZLIB=ON
)
endif()
ExternalProject_Add(external_blosc
URL ${BLOSC_URI}

View File

@@ -29,22 +29,22 @@ if(WIN32)
set(PYTHON_OUTPUTDIR ${BUILD_DIR}/python/src/external_python/pcbuild/win32/)
set(BOOST_ADDRESS_MODEL 32)
endif()
set(BOOST_TOOLSET toolset=msvc-14.1)
set(BOOST_COMPILER_STRING -vc141)
if(MSVC14)
set(BOOST_TOOLSET toolset=msvc-14.0)
set(BOOST_COMPILER_STRING -vc140)
endif()
set(BOOST_CONFIGURE_COMMAND bootstrap.bat)
set(BOOST_BUILD_COMMAND bjam)
set(BOOST_BUILD_OPTIONS runtime-link=shared )
set(BOOST_BUILD_OPTIONS runtime-link=static )
set(BOOST_HARVEST_CMD ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/boost/lib/ ${HARVEST_TARGET}/boost/lib/ )
if(BUILD_MODE STREQUAL Release)
set(BOOST_HARVEST_CMD ${BOOST_HARVEST_CMD} && ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/boost/include/boost-${BOOST_VERSION_NODOTS_SHORT}/ ${HARVEST_TARGET}/boost/include/)
set(BOOST_HARVEST_CMD ${BOOST_HARVEST_CMD} && ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/boost/include/boost-1_68/ ${HARVEST_TARGET}/boost/include/)
endif()
elseif(APPLE)
set(BOOST_CONFIGURE_COMMAND ./bootstrap.sh)
set(BOOST_BUILD_COMMAND ./b2)
set(BOOST_BUILD_OPTIONS toolset=darwin cxxflags=${PLATFORM_CXXFLAGS} linkflags=${PLATFORM_LDFLAGS} visibility=global --disable-icu boost.locale.icu=off)
set(BOOST_BUILD_OPTIONS toolset=darwin cxxflags=${PLATFORM_CXXFLAGS} linkflags=${PLATFORM_LDFLAGS} --disable-icu boost.locale.icu=off)
set(BOOST_HARVEST_CMD echo .)
set(BOOST_PATCH_COMMAND echo .)
else()
@@ -72,9 +72,6 @@ set(BOOST_OPTIONS
--with-serialization
--with-program_options
--with-iostreams
-sNO_BZIP2=1
-sNO_LZMA=1
-sNO_ZSTD=1
${BOOST_TOOLSET}
)

View File

@@ -1,59 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
if(UNIX)
if(APPLE)
set(_libtoolize_name glibtoolize)
else()
set(_libtoolize_name libtoolize)
endif()
set(_required_software
autoconf
automake
${_libtoolize_name}
nasm
yasm
tclsh
)
foreach(_software ${_required_software})
find_program(_software_find NAMES ${_software})
if(NOT _software_find)
set(_software_missing "${_software_missing}${_software} ")
endif()
unset(_software_find CACHE)
endforeach()
if(_software_missing)
message(
"\n"
"Missing software for building Blender dependencies:\n"
" ${_software_missing}\n"
"\n"
"On Debian and Ubuntu:\n"
" apt install autoconf automake libtool yasm nasm tcl\n"
"\n"
"On macOS (with homebrew):\n"
" brew install cmake autoconf automake libtool yasm nasm\n"
"\n"
"Other platforms:\n"
" Install equivalent packages.\n")
message(FATAL_ERROR "Install missing software before continuing")
endif()
endif()

View File

@@ -19,8 +19,8 @@
set(CLANG_EXTRA_ARGS
-DCLANG_PATH_TO_LLVM_SOURCE=${BUILD_DIR}/ll/src/ll
-DCLANG_PATH_TO_LLVM_BUILD=${LIBDIR}/llvm
-DLLVM_USE_CRT_RELEASE=MD
-DLLVM_USE_CRT_DEBUG=MDd
-DLLVM_USE_CRT_RELEASE=MT
-DLLVM_USE_CRT_DEBUG=MTd
-DLLVM_CONFIG=${LIBDIR}/llvm/bin/llvm-config
)
@@ -46,7 +46,9 @@ if(MSVC)
set(CLANG_HARVEST_COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/clang/ ${HARVEST_TARGET}/llvm/)
else()
set(CLANG_HARVEST_COMMAND
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/clang/lib/ ${HARVEST_TARGET}/llvm/debug/lib/
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/clang/lib/ ${HARVEST_TARGET}/llvm/debug/lib/ &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/clang/bin/ ${HARVEST_TARGET}/llvm/debug/bin/ &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/clang/include/ ${HARVEST_TARGET}/llvm/debug/include/
)
endif()
ExternalProject_Add_Step(external_clang after_install

View File

@@ -16,10 +16,10 @@
#
# ***** END GPL LICENSE BLOCK *****
set(FFMPEG_CFLAGS "-I${mingw_LIBDIR}/lame/include -I${mingw_LIBDIR}/openjpeg/include/ -I${mingw_LIBDIR}/ogg/include -I${mingw_LIBDIR}/vorbis/include -I${mingw_LIBDIR}/theora/include -I${mingw_LIBDIR}/opus/include -I${mingw_LIBDIR}/vpx/include -I${mingw_LIBDIR}/x264/include -I${mingw_LIBDIR}/xvidcore/include -I${mingw_LIBDIR}/zlib/include")
set(FFMPEG_LDFLAGS "-L${mingw_LIBDIR}/lame/lib -L${mingw_LIBDIR}/openjpeg/lib -L${mingw_LIBDIR}/ogg/lib -L${mingw_LIBDIR}/vorbis/lib -L${mingw_LIBDIR}/theora/lib -L${mingw_LIBDIR}/opus/lib -L${mingw_LIBDIR}/vpx/lib -L${mingw_LIBDIR}/x264/lib -L${mingw_LIBDIR}/xvidcore/lib -L${mingw_LIBDIR}/zlib/lib")
set(FFMPEG_CFLAGS "-I${mingw_LIBDIR}/lame/include -I${mingw_LIBDIR}/openjpeg/include/ -I${mingw_LIBDIR}/ogg/include -I${mingw_LIBDIR}/vorbis/include -I${mingw_LIBDIR}/theora/include -I${mingw_LIBDIR}/vpx/include -I${mingw_LIBDIR}/x264/include -I${mingw_LIBDIR}/xvidcore/include -I${mingw_LIBDIR}/zlib/include")
set(FFMPEG_LDFLAGS "-L${mingw_LIBDIR}/lame/lib -L${mingw_LIBDIR}/openjpeg/lib -L${mingw_LIBDIR}/ogg/lib -L${mingw_LIBDIR}/vorbis/lib -L${mingw_LIBDIR}/theora/lib -L${mingw_LIBDIR}/vpx/lib -L${mingw_LIBDIR}/x264/lib -L${mingw_LIBDIR}/xvidcore/lib -L${mingw_LIBDIR}/zlib/lib")
set(FFMPEG_EXTRA_FLAGS --pkg-config-flags=--static --extra-cflags=${FFMPEG_CFLAGS} --extra-ldflags=${FFMPEG_LDFLAGS})
set(FFMPEG_ENV PKG_CONFIG_PATH=${mingw_LIBDIR}/openjpeg/lib/pkgconfig:${mingw_LIBDIR}/x264/lib/pkgconfig:${mingw_LIBDIR}/vorbis/lib/pkgconfig:${mingw_LIBDIR}/ogg/lib/pkgconfig:${mingw_LIBDIR}:${mingw_LIBDIR}/vpx/lib/pkgconfig:${mingw_LIBDIR}/theora/lib/pkgconfig:${mingw_LIBDIR}/openjpeg/lib/pkgconfig:${mingw_LIBDIR}/opus/lib/pkgconfig:)
set(FFMPEG_ENV PKG_CONFIG_PATH=${mingw_LIBDIR}/openjpeg/lib/pkgconfig:${mingw_LIBDIR}/x264/lib/pkgconfig:${mingw_LIBDIR}/vorbis/lib/pkgconfig:${mingw_LIBDIR}/ogg/lib/pkgconfig:${mingw_LIBDIR})
if(WIN32)
set(FFMPEG_ENV set ${FFMPEG_ENV} &&)
@@ -73,7 +73,6 @@ ExternalProject_Add(external_ffmpeg
--disable-libgsm
--disable-libspeex
--enable-libvpx
--enable-libopus
--prefix=${LIBDIR}/ffmpeg
--enable-libtheora
--enable-libvorbis
@@ -131,7 +130,6 @@ add_dependencies(
external_openjpeg
external_xvidcore
external_x264
external_opus
external_vpx
external_theora
external_vorbis

View File

@@ -32,6 +32,7 @@ ExternalProject_Add(external_freetype
URL_HASH MD5=${FREETYPE_HASH}
PREFIX ${BUILD_DIR}/freetype
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/freetype ${DEFAULT_CMAKE_FLAGS} ${FREETYPE_EXTRA_ARGS}
PATCH_COMMAND ${PATCH_CMD} -p 1 -d ${BUILD_DIR}/freetype/src/external_freetype < ${PATCH_DIR}/freetype.diff
INSTALL_DIR ${LIBDIR}/freetype
)

View File

@@ -51,8 +51,13 @@ if(BUILD_MODE STREQUAL Release)
# tiff
${CMAKE_COMMAND} -E copy ${LIBDIR}/tiff/lib/tiff.lib ${HARVEST_TARGET}/tiff/lib/libtiff.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/tiff/include/ ${HARVEST_TARGET}/tiff/include/ &&
# BlendThumb
${CMAKE_COMMAND} -E copy ${LIBDIR}/BlendThumb64/bin/blendthumb.dll ${HARVEST_TARGET}/ThumbHandler/lib/BlendThumb64.dll &&
${CMAKE_COMMAND} -E copy ${LIBDIR}/BlendThumb32/bin/blendthumb.dll ${HARVEST_TARGET}/ThumbHandler/lib/BlendThumb.dll &&
# hidapi
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/hidapi/ ${HARVEST_TARGET}/hidapi/
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/hidapi/ ${HARVEST_TARGET}/hidapi/ &&
# webp, straight up copy
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/webp ${HARVEST_TARGET}/webp &&
DEPENDS
)
endif()
@@ -62,8 +67,14 @@ if(BUILD_MODE STREQUAL Debug)
# OpenImageIO
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/openimageio/lib/OpenImageIO.lib ${HARVEST_TARGET}/openimageio/lib/OpenImageIO_d.lib &&
${CMAKE_COMMAND} -E copy ${LIBDIR}/openimageio/lib/OpenImageIO_Util.lib ${HARVEST_TARGET}/openimageio/lib/OpenImageIO_Util_d.lib &&
# python
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/python/ ${HARVEST_TARGET}/python/ &&
# hdf5
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/hdf5/lib ${HARVEST_TARGET}/hdf5/lib &&
# numpy
${CMAKE_COMMAND} -E copy ${LIBDIR}/python${PYTHON_SHORT_VERSION_NO_DOTS}_numpy_${NUMPY_SHORT_VERSION}d.tar.gz ${HARVEST_TARGET}/Release/python${PYTHON_SHORT_VERSION_NO_DOTS}_numpy_${NUMPY_SHORT_VERSION}d.tar.gz &&
# python
${CMAKE_COMMAND} -E copy ${LIBDIR}/python${PYTHON_SHORT_VERSION_NO_DOTS}_d.tar.gz ${HARVEST_TARGET}/Release/python${PYTHON_SHORT_VERSION_NO_DOTS}_d.tar.gz
DEPENDS Package_Python
)
endif()
@@ -110,6 +121,8 @@ harvest(freetype/include freetype/include "*.h")
harvest(freetype/lib/libfreetype2ST.a freetype/lib/libfreetype.a)
harvest(glew/include glew/include "*.h")
harvest(glew/lib glew/lib "*.a")
harvest(ilmbase openexr "*")
harvest(ilmbase/include openexr/include "*.h")
harvest(jemalloc/include jemalloc/include "*.h")
harvest(jemalloc/lib jemalloc/lib "*.a")
harvest(jpg/include jpeg/include "*.h")
@@ -161,8 +174,6 @@ harvest(opensubdiv/include opensubdiv/include "*.h")
harvest(opensubdiv/lib opensubdiv/lib "*.a")
harvest(openvdb/include/openvdb openvdb/include/openvdb "*.h")
harvest(openvdb/lib openvdb/lib "*.a")
harvest(xr_openxr_sdk/include/openxr xr_openxr_sdk/include/openxr "*.h")
harvest(xr_openxr_sdk/lib xr_openxr_sdk/src/loader "*.a")
harvest(osl/bin osl/bin "oslc")
harvest(osl/include osl/include "*.h")
harvest(osl/lib osl/lib "*.a")
@@ -184,20 +195,20 @@ harvest(theora/lib ffmpeg/lib "*.a")
harvest(tiff/include tiff/include "*.h")
harvest(tiff/lib tiff/lib "*.a")
harvest(vorbis/lib ffmpeg/lib "*.a")
harvest(opus/lib ffmpeg/lib "*.a")
harvest(vpx/lib ffmpeg/lib "*.a")
harvest(webp/lib ffmpeg/lib "*.a")
harvest(x264/lib ffmpeg/lib "*.a")
harvest(xvidcore/lib ffmpeg/lib "*.a")
harvest(embree/include embree/include "*.h")
harvest(embree/lib embree/lib "*.a")
harvest(usd/include usd/include "*.h")
harvest(usd/lib/usd usd/lib/usd "*")
harvest(usd/plugin usd/plugin "*")
if(UNIX AND NOT APPLE)
harvest(libglu/lib mesa/lib "*.so*")
harvest(mesa/lib mesa/lib "*.so*")
endif()
harvest(openmesh/include openmesh/include "*")
harvest(openmesh/lib openmesh/lib "*.a")
harvest(lapack/lib lapack/lib "*.a")
harvest(lapack/include lapack/include "*.h")
harvest(igl/include igl/include "*")
harvest(igl/lib igl/lib "*.a")
harvest(qex/include qex/include "*.h")
harvest(qex/lib qex/lib "*.a")
endif()
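The harvest(<from> <to> [pattern]) calls above collect headers and static libraries from each dependency's install prefix (LIBDIR) into the common HARVEST_TARGET tree that the main Blender build consumes. The macro itself is defined elsewhere in build_files/build_environment and is not part of this diff; the following is only a hypothetical sketch of the idea, assuming the LIBDIR and HARVEST_TARGET variables used throughout these files.

# Hypothetical sketch - the real harvest() implementation differs in detail.
function(harvest from to)
  if(ARGC GREATER 2)
    # Directory mode: copy everything under LIBDIR/from matching the glob.
    file(GLOB_RECURSE _files RELATIVE ${LIBDIR}/${from} ${LIBDIR}/${from}/${ARGV2})
    foreach(_f ${_files})
      configure_file(${LIBDIR}/${from}/${_f} ${HARVEST_TARGET}/${to}/${_f} COPYONLY)
    endforeach()
  else()
    # Single-file mode: copy (and possibly rename) one file.
    configure_file(${LIBDIR}/${from} ${HARVEST_TARGET}/${to} COPYONLY)
  endif()
endfunction()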

View File

@@ -0,0 +1,70 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
set(IGL_EXTRA_ARGS
-DLIBIGL_BUILD_PYTHON=OFF
-DLIBIGL_BUILD_TESTS=OFF
-DLIBIGL_BUILD_TUTORIALS=OFF
-DLIBIGL_USE_STATIC_LIBRARY=ON
-DLIBIGL_WITHOUT_COPYLEFT=OFF
-DLIBIGL_WITH_CGAL=OFF
-DLIBIGL_WITH_COMISO=ON
-DLIBIGL_WITH_CORK=OFF
-DLIBIGL_WITH_EMBREE=OFF
-DLIBIGL_WITH_MATLAB=OFF
-DLIBIGL_WITH_MOSEK=OFF
-DLIBIGL_WITH_OPENGL=OFF
-DLIBIGL_WITH_OPENGL_GLFW=OFF
-DLIBIGL_WITH_OPENGL_GLFW_IMGUI=OFF
-DLIBIGL_WITH_PNG=OFF
-DLIBIGL_WITH_PYTHON=OFF
-DLIBIGL_WITH_TETGEN=OFF
-DLIBIGL_WITH_TRIANGLE=OFF
-DLIBIGL_WITH_XML=OFF
-DBLAS_LIBRARIES=${LIBDIR}/lapack/lib/libblas.a
)
ExternalProject_Add(external_igl
URL ${IGL_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH SHA256=${IGL_HASH}
PREFIX ${BUILD_DIR}/igl
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/igl -Wno-dev ${DEFAULT_CMAKE_FLAGS} ${IGL_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/igl
)
ExternalProject_Add_Step(external_igl after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${BUILD_DIR}/igl/src/external_igl/include/igl/copyleft/comiso
${LIBDIR}/igl/include/igl/copyleft/comiso
COMMAND ${CMAKE_COMMAND} -E copy ${BUILD_DIR}/igl/src/external_igl-build/libigl_comiso.a
${LIBDIR}/igl/lib/libigl_comiso.a
COMMAND ${CMAKE_COMMAND} -E copy ${BUILD_DIR}/igl/src/external_igl-build/libCoMISo.a
${LIBDIR}/igl/lib/libCoMISo.a
COMMAND ${CMAKE_COMMAND} -E copy_directory ${BUILD_DIR}/igl/src/external_igl/external/eigen/Eigen
${LIBDIR}/igl/include/eigen/Eigen
DEPENDEES install
)
add_dependencies(
external_igl
external_lapack
)

View File

@@ -0,0 +1,58 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
if(WIN32)
set(ILMBASE_CMAKE_CXX_STANDARD_LIBRARIES "kernel32${LIBEXT} user32${LIBEXT} gdi32${LIBEXT} winspool${LIBEXT} shell32${LIBEXT} ole32${LIBEXT} oleaut32${LIBEXT} uuid${LIBEXT} comdlg32${LIBEXT} advapi32${LIBEXT} psapi${LIBEXT}")
set(ILMBASE_EXTRA_ARGS
-DBUILD_SHARED_LIBS=OFF
-DCMAKE_CXX_STANDARD_LIBRARIES=${ILMBASE_CMAKE_CXX_STANDARD_LIBRARIES}
)
ExternalProject_Add(external_ilmbase
URL ${ILMBASE_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${ILMBASE_HASH}
PREFIX ${BUILD_DIR}/ilmbase
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/ilmbase ${DEFAULT_CMAKE_FLAGS} ${ILMBASE_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/openexr
)
if(BUILD_MODE STREQUAL Release)
ExternalProject_Add_Step(external_ilmbase after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/ilmbase ${HARVEST_TARGET}/openexr
DEPENDEES install
)
endif()
else()
set(ILMBASE_EXTRA_ARGS
--enable-static
--disable-shared
--enable-cxxstd=11
)
ExternalProject_Add(external_ilmbase
URL ${ILMBASE_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${ILMBASE_HASH}
PREFIX ${BUILD_DIR}/ilmbase
CONFIGURE_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/ilmbase/src/external_ilmbase/ && ${CONFIGURE_COMMAND} --prefix=${LIBDIR}/ilmbase ${ILMBASE_EXTRA_ARGS}
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/ilmbase/src/external_ilmbase/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/ilmbase/src/external_ilmbase/ && make install
INSTALL_DIR ${LIBDIR}/openexr
)
endif()

View File

@@ -14,24 +14,16 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# The Original Code is Copyright (C) 2015, Blender Foundation
# All rights reserved.
# ***** END GPL LICENSE BLOCK *****
set(INC
.
set(LAPACK_EXTRA_ARGS
)
set(INC_SYS
ExternalProject_Add(external_lapack
URL ${LAPACK_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH SHA256=${LAPACK_HASH}
PREFIX ${BUILD_DIR}/lapack
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/lapack -Wno-dev ${DEFAULT_CMAKE_FLAGS} ${LAPACK_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/lapack
)
set(SRC
libc_compat.c
)
set(LIB
)
add_c_flag(-ffast-math)
blender_add_lib(bf_intern_libc_compat "${SRC}" "${INC}" "${INC_SYS}" "${LIB}")

View File

@@ -24,7 +24,7 @@ ExternalProject_Add(external_lcms
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${LCMS_HASH}
PREFIX ${BUILD_DIR}/lcms
# Patch taken from ocio.
#patch taken from ocio
PATCH_COMMAND ${CMAKE_COMMAND} -E copy ${PATCH_DIR}/cmakelists_lcms.txt ${BUILD_DIR}/lcms/src/external_lcms/CMakeLists.txt
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/lcms ${DEFAULT_CMAKE_FLAGS} ${LCMS_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/lcms

View File

@@ -1,40 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
set(LIBGLU_CFLAGS "-static-libgcc")
set(LIBGLU_CXXFLAGS "-static-libgcc -static-libstdc++ -Bstatic -lstdc++ -Bdynamic -l:libstdc++.a")
set(LIBGLU_LDFLAGS "-pthread -static-libgcc -static-libstdc++ -Bstatic -lstdc++ -Bdynamic -l:libstdc++.a")
set(LIBGLU_EXTRA_FLAGS
CFLAGS=${LIBGLU_CFLAGS}
CXXFLAGS=${LIBGLU_CXXFLAGS}
LDFLAGS=${LIBGLU_LDFLAGS}
)
ExternalProject_Add(external_libglu
URL ${LIBGLU_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${LIBGLU_HASH}
PREFIX ${BUILD_DIR}/libglu
CONFIGURE_COMMAND ${CONFIGURE_ENV} &&
cd ${BUILD_DIR}/libglu/src/external_libglu/ &&
${CONFIGURE_COMMAND_NO_TARGET} --prefix=${LIBDIR}/libglu ${LIBGLU_EXTRA_FLAGS}
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/libglu/src/external_libglu/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/libglu/src/external_libglu/ && make install
INSTALL_DIR ${LIBDIR}/libglu
)

View File

@@ -17,14 +17,12 @@
# ***** END GPL LICENSE BLOCK *****
set(LLVM_EXTRA_ARGS
-DLLVM_USE_CRT_RELEASE=MD
-DLLVM_USE_CRT_DEBUG=MDd
-DLLVM_USE_CRT_RELEASE=MT
-DLLVM_USE_CRT_DEBUG=MTd
-DLLVM_INCLUDE_TESTS=OFF
-DLLVM_TARGETS_TO_BUILD=X86
-DLLVM_INCLUDE_EXAMPLES=OFF
-DLLVM_ENABLE_TERMINFO=OFF
-DLLVM_BUILD_LLVM_C_DYLIB=OFF
-DLLVM_ENABLE_UNWIND_TABLES=OFF
)
if(WIN32)
@@ -40,7 +38,6 @@ ExternalProject_Add(ll
URL_HASH MD5=${LLVM_HASH}
CMAKE_GENERATOR ${LLVM_GENERATOR}
PREFIX ${BUILD_DIR}/ll
PATCH_COMMAND ${PATCH_CMD} -p 1 -d ${BUILD_DIR}/ll/src/ll < ${PATCH_DIR}/llvm.diff
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/llvm ${DEFAULT_CMAKE_FLAGS} ${LLVM_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/llvm
)

View File

@@ -1,54 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
set(MESA_CFLAGS "-static-libgcc")
set(MESA_CXXFLAGS "-static-libgcc -static-libstdc++ -Bstatic -lstdc++ -Bdynamic -l:libstdc++.a")
set(MESA_LDFLAGS "-L${LIBDIR}/zlib/lib -pthread -static-libgcc -static-libstdc++ -Bstatic -lstdc++ -Bdynamic -l:libstdc++.a -l:libz_pic.a")
set(MESA_EXTRA_FLAGS
CFLAGS=${MESA_CFLAGS}
CXXFLAGS=${MESA_CXXFLAGS}
LDFLAGS=${MESA_LDFLAGS}
--enable-glx=gallium-xlib
--with-gallium-drivers=swrast
--disable-dri
--disable-gbm
--disable-egl
--disable-gles1
--disable-gles2
--disable-llvm-shared-libs
--with-llvm-prefix=${LIBDIR}/llvm
)
ExternalProject_Add(external_mesa
URL ${MESA_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${MESA_HASH}
PREFIX ${BUILD_DIR}/mesa
CONFIGURE_COMMAND ${CONFIGURE_ENV} &&
cd ${BUILD_DIR}/mesa/src/external_mesa/ &&
${CONFIGURE_COMMAND_NO_TARGET} --prefix=${LIBDIR}/mesa ${MESA_EXTRA_FLAGS}
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/mesa/src/external_mesa/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/mesa/src/external_mesa/ && make install
INSTALL_DIR ${LIBDIR}/mesa
)
add_dependencies(
external_mesa
ll
)

View File

@@ -50,7 +50,7 @@ if(WIN32)
-DUSE_EXTERNAL_LCMS=ON
-DINC_1=${LIBDIR}/tinyxml/include
-DINC_2=${LIBDIR}/yamlcpp/include
# Lie because ocio cmake is demanding boost even though it is not needed.
#lie because ocio cmake is demanding boost even though it is not needed
-DYAML_CPP_VERSION=0.5.0
)
else()
@@ -95,7 +95,7 @@ if(WIN32)
ExternalProject_Add_Step(external_opencolorio after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/opencolorio/include ${HARVEST_TARGET}/opencolorio/include
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/opencolorio/lib/static ${HARVEST_TARGET}/opencolorio/lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/yamlcpp/lib/libyaml-cppmd.lib ${HARVEST_TARGET}/opencolorio/lib/libyaml-cpp.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/yamlcpp/lib/libyaml-cppmt.lib ${HARVEST_TARGET}/opencolorio/lib/libyaml-cpp.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tinyxml/lib/tinyxml.lib ${HARVEST_TARGET}/opencolorio/lib/tinyxml.lib
DEPENDEES install
)
@@ -103,7 +103,7 @@ if(WIN32)
if(BUILD_MODE STREQUAL Debug)
ExternalProject_Add_Step(external_opencolorio after_install
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencolorio/lib/static/Opencolorio.lib ${HARVEST_TARGET}/opencolorio/lib/OpencolorIO_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/yamlcpp/lib/libyaml-cppmdd.lib ${HARVEST_TARGET}/opencolorio/lib/libyaml-cpp_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/yamlcpp/lib/libyaml-cppmtd.lib ${HARVEST_TARGET}/opencolorio/lib/libyaml-cpp_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tinyxml/lib/tinyxml.lib ${HARVEST_TARGET}/opencolorio/lib/tinyxml_d.lib
DEPENDEES install
)

View File

@@ -20,45 +20,59 @@ if(WIN32)
set(OPENEXR_CMAKE_CXX_STANDARD_LIBRARIES "kernel32${LIBEXT} user32${LIBEXT} gdi32${LIBEXT} winspool${LIBEXT} shell32${LIBEXT} ole32${LIBEXT} oleaut32${LIBEXT} uuid${LIBEXT} comdlg32${LIBEXT} advapi32${LIBEXT} psapi${LIBEXT}")
set(OPENEXR_EXTRA_ARGS
-DCMAKE_CXX_STANDARD_LIBRARIES=${OPENEXR_CMAKE_CXX_STANDARD_LIBRARIES}
-DZLIB_LIBRARY=${LIBDIR}/zlib/lib/${ZLIB_LIBRARY}
-DZLIB_INCLUDE_DIR=${LIBDIR}/zlib/include/
-DILMBASE_PACKAGE_PREFIX=${LIBDIR}/ilmbase
-DOPENEXR_BUILD_ILMBASE=On
-DOPENEXR_BUILD_OPENEXR=On
-DOPENEXR_BUILD_PYTHON_LIBS=Off
-DOPENEXR_BUILD_STATIC=On
-DOPENEXR_BUILD_SHARED=Off
-DOPENEXR_BUILD_TESTS=Off
-DOPENEXR_BUILD_VIEWERS=Off
-DOPENEXR_BUILD_UTILS=Off
-DOPENEXR_NAMESPACE_VERSIONING=Off
)
else()
set(OPENEXR_EXTRA_ARGS
ExternalProject_Add(external_openexr
URL ${OPENEXR_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${OPENEXR_HASH}
PREFIX ${BUILD_DIR}/openexr
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/openexr ${DEFAULT_CMAKE_FLAGS} ${OPENEXR_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/openexr
)
endif()
set(OPENEXR_EXTRA_ARGS
${OPENEXR_EXTRA_ARGS}
-DZLIB_LIBRARY=${LIBDIR}/zlib/lib/${ZLIB_LIBRARY}
-DZLIB_INCLUDE_DIR=${LIBDIR}/zlib/include/
-DBUILD_TESTING=OFF
-DOPENEXR_BUILD_BOTH_STATIC_SHARED=OFF
-DILMBASE_BUILD_BOTH_STATIC_SHARED=OFF
-DBUILD_SHARED_LIBS=OFF
-DOPENEXR_BUILD_UTILS=OFF
-DPYILMBASE_ENABLE=OFF
-DOPENEXR_VIEWERS_ENABLE=OFF
-DILMBASE_LIB_SUFFIX=${OPENEXR_VERSION_BUILD_POSTFIX}
-DOPENEXR_LIB_SUFFIX=${OPENEXR_VERSION_BUILD_POSTFIX}
)
ExternalProject_Add(external_openexr
URL ${OPENEXR_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${OPENEXR_HASH}
PREFIX ${BUILD_DIR}/openexr
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/openexr ${DEFAULT_CMAKE_FLAGS} ${OPENEXR_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/openexr
)
if(WIN32)
ExternalProject_Add_Step(external_openexr after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/openexr/lib ${HARVEST_TARGET}/openexr/lib
#libs have moved between versions, just duplicate it for now.
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/openexr/lib ${LIBDIR}/ilmbase/lib
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/openexr/include ${HARVEST_TARGET}/openexr/include
DEPENDEES install
)
else()
set(OPENEXR_PKG_CONFIG_PATH ${LIBDIR}/zlib/share/pkgconfig)
set(OPENEXR_EXTRA_ARGS
--enable-static
--disable-shared
--enable-cxxstd=11
--with-ilmbase-prefix=${LIBDIR}/ilmbase
)
ExternalProject_Add(external_openexr
URL ${OPENEXR_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${OPENEXR_HASH}
PREFIX ${BUILD_DIR}/openexr
CONFIGURE_COMMAND ${CONFIGURE_ENV} && export PKG_CONFIG_PATH=${OPENEXR_PKG_CONFIG_PATH} && cd ${BUILD_DIR}/openexr/src/external_openexr/ && ${CONFIGURE_COMMAND} --prefix=${LIBDIR}/openexr ${OPENEXR_EXTRA_ARGS}
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/openexr/src/external_openexr/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/openexr/src/external_openexr/ && make install
INSTALL_DIR ${LIBDIR}/openexr
)
endif()
add_dependencies(
external_openexr
external_zlib
external_ilmbase
)

View File

@@ -69,7 +69,7 @@ set(OPENIMAGEIO_EXTRA_ARGS
-DBoost_COMPILER:STRING=${BOOST_COMPILER_STRING}
-DBoost_USE_MULTITHREADED=ON
-DBoost_USE_STATIC_LIBS=ON
-DBoost_USE_STATIC_RUNTIME=OFF
-DBoost_USE_STATIC_RUNTIME=ON
-DBOOST_ROOT=${LIBDIR}/boost
-DBOOST_LIBRARYDIR=${LIBDIR}/boost/lib/
-DBoost_NO_SYSTEM_PATHS=ON
@@ -106,13 +106,13 @@ set(OPENIMAGEIO_EXTRA_ARGS
-DOCIO_PATH=${LIBDIR}/opencolorio/
-DOpenEXR_USE_STATIC_LIBS=On
-DOPENEXR_HOME=${LIBDIR}/openexr/
-DILMBASE_INCLUDE_PATH=${LIBDIR}/openexr/
-DILMBASE_PACKAGE_PREFIX=${LIBDIR}/openexr/
-DILMBASE_INCLUDE_DIR=${LIBDIR}/openexr/include/
-DOPENEXR_HALF_LIBRARY=${LIBDIR}/openexr/lib/${LIBPREFIX}Half${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_IMATH_LIBRARY=${LIBDIR}/openexr/lib/${LIBPREFIX}Imath${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_ILMTHREAD_LIBRARY=${LIBDIR}/openexr/lib/${LIBPREFIX}IlmThread${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_IEX_LIBRARY=${LIBDIR}/openexr/lib/${LIBPREFIX}Iex${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DILMBASE_INCLUDE_PATH=${LIBDIR}/ilmbase/
-DILMBASE_PACKAGE_PREFIX=${LIBDIR}/ilmbase/
-DILMBASE_INCLUDE_DIR=${LIBDIR}/ilmbase/include/
-DOPENEXR_HALF_LIBRARY=${LIBDIR}/ilmbase/lib/${LIBPREFIX}Half${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_IMATH_LIBRARY=${LIBDIR}/ilmbase/lib/${LIBPREFIX}Imath${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_ILMTHREAD_LIBRARY=${LIBDIR}/ilmbase/lib/${LIBPREFIX}IlmThread${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_IEX_LIBRARY=${LIBDIR}/ilmbase/lib/${LIBPREFIX}Iex${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_INCLUDE_DIR=${LIBDIR}/openexr/include/
-DOPENEXR_ILMIMF_LIBRARY=${LIBDIR}/openexr/lib/${LIBPREFIX}IlmImf${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DSTOP_ON_WARNING=OFF
@@ -135,6 +135,7 @@ ExternalProject_Add(external_openimageio
add_dependencies(
external_openimageio
external_png external_zlib
external_ilmbase
external_openexr
external_jpeg
external_boost

View File

@@ -38,7 +38,7 @@ ExternalProject_Add(external_openjpeg
INSTALL_DIR ${LIBDIR}/openjpeg
)
# On windows ffmpeg wants a mingw build, while oiio needs a msvc build.
#on windows ffmpeg wants a mingw build, while oiio needs a msvc build
if(MSVC)
set(OPENJPEG_EXTRA_ARGS ${DEFAULT_CMAKE_FLAGS})
ExternalProject_Add(external_openjpeg_msvc

View File

@@ -0,0 +1,45 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
set(OPENMESH_EXTRA_ARGS
-DBUILD_APPS=OFF
-DDISABLE_QMAKE_BUILD=ON
)
ExternalProject_Add(external_openmesh
URL ${OPENMESH_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH SHA256=${OPENMESH_HASH}
PREFIX ${BUILD_DIR}/openmesh
PATCH_COMMAND ${PATCH_CMD} -p 0 -d ${BUILD_DIR}/openmesh/src/external_openmesh < ${PATCH_DIR}/openmesh.diff
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/openmesh -Wno-dev ${DEFAULT_CMAKE_FLAGS} ${OPENMESH_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/openmesh
)
if (BUILD_MODE STREQUAL Debug)
ExternalProject_Add_Step(external_openmesh after_install
COMMAND ${CMAKE_COMMAND} -E rename ${LIBDIR}/openmesh/lib/OpenMesh/libOpenMeshCored.a
${LIBDIR}/openmesh/lib/OpenMesh/libOpenMeshCore.a
COMMAND ${CMAKE_COMMAND} -E rename ${LIBDIR}/openmesh/lib/OpenMesh/libOpenMeshToolsd.a
${LIBDIR}/openmesh/lib/OpenMesh/libOpenMeshTools.a
DEPENDEES install
)
endif()

View File

@@ -41,6 +41,7 @@ if(WIN32)
-DCLEW_LIBRARY=${LIBDIR}/clew/lib/clew${LIBEXT}
-DCUEW_INCLUDE_DIR=${LIBDIR}/cuew/include
-DCUEW_LIBRARY=${LIBDIR}/cuew/lib/cuew${LIBEXT}
-DCMAKE_EXE_LINKER_FLAGS_RELEASE=libcmt.lib
)
if("${CMAKE_SIZEOF_VOID_P}" EQUAL "8")
set(OPENSUBDIV_EXTRA_ARGS

View File

@@ -24,25 +24,29 @@ set(OPENVDB_EXTRA_ARGS
-DBoost_COMPILER:STRING=${BOOST_COMPILER_STRING}
-DBoost_USE_MULTITHREADED=ON
-DBoost_USE_STATIC_LIBS=ON
-DBoost_USE_STATIC_RUNTIME=OFF
-DBoost_USE_STATIC_RUNTIME=ON
-DBOOST_ROOT=${LIBDIR}/boost
-DBoost_NO_SYSTEM_PATHS=ON
-DZLIB_LIBRARY=${LIBDIR}/zlib/lib/${ZLIB_LIBRARY}
-DZLIB_INCLUDE_DIR=${LIBDIR}/zlib/include/
-DBlosc_INCLUDE_DIR=${LIBDIR}/blosc/include/
-DBlosc_LIBRARY=${LIBDIR}/blosc/lib/libblosc${BLOSC_POST}${LIBEXT}
-DBLOSC_INCLUDE_DIR=${LIBDIR}/blosc/include/
-DBLOSC_blosc_LIBRARY=${LIBDIR}/blosc/lib/libblosc${BLOSC_POST}${LIBEXT}
-DOPENVDB_ENABLE_3_ABI_COMPATIBLE=OFF
-DOPENVDB_BUILD_UNITTESTS=Off
-DOPENVDB_BUILD_PYTHON_MODULE=Off
-DBlosc_ROOT=${LIBDIR}/blosc/
-DGLEW_LOCATION=${LIBDIR}/glew/
-DBLOSC_LOCATION=${LIBDIR}/blosc/
-DTBB_LOCATION=${LIBDIR}/tbb/
-DTBB_ROOT=${LIBDIR}/tbb/
-DOpenEXR_ROOT=${LIBDIR}/openexr
-DIlmBase_ROOT=${LIBDIR}/openexr
-DOPENEXR_LIBRARYDIR=${LIBDIR}/openexr/lib
# All libs live in openexr, even the ilmbase ones
-DILMBASE_LIBRARYDIR=${LIBDIR}/openexr/lib
-DOPENVDB_CORE_SHARED=Off
-DOPENVDB_BUILD_BINARIES=Off
-DOPENEXR_LOCATION=${LIBDIR}/openexr
-DILMBASE_LOCATION=${LIBDIR}/ilmbase
-DIlmbase_HALF_LIBRARY=${LIBDIR}/ilmbase/lib/${LIBPREFIX}Half${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DIlmbase_IEX_LIBRARY=${LIBDIR}/ilmbase/lib/${LIBPREFIX}Iex${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DIlmbase_ILMTHREAD_LIBRARY=${LIBDIR}/ilmbase/lib/${LIBPREFIX}IlmThread${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DOpenexr_ILMIMF_LIBRARY=${LIBDIR}/openexr/lib/${LIBPREFIX}IlmImf${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DTBB_LIBRARYDIR=${LIBDIR}/tbb/lib
-DTbb_TBB_LIBRARY=${LIBDIR}/tbb/lib/${LIBPREFIX}tbb_static${LIBEXT}
-DTBB_LIBRARY_PATH=${LIBDIR}/tbb/lib
)
if(WIN32)
@@ -50,17 +54,15 @@ if(WIN32)
# being in the correct namespace
# needs to link pthreads due to it being a blosc dependency
set(OPENVDB_EXTRA_ARGS ${OPENVDB_EXTRA_ARGS}
-DCMAKE_CXX_STANDARD_LIBRARIES="${LIBDIR}/pthreads/lib/pthreadVC3.lib"
-DUSE_EXR=On
)
else()
# OpenVDB can't find the _static libraries automatically.
set(OPENVDB_EXTRA_ARGS ${OPENVDB_EXTRA_ARGS}
-DTbb_LIBRARIES=${LIBDIR}/tbb/lib/${LIBPREFIX}tbb_static${LIBEXT}
-DTbb_tbb_LIBRARY=${LIBDIR}/tbb/lib/${LIBPREFIX}tbb_static${LIBEXT}
-DTbb_tbbmalloc_LIBRARY=${LIBDIR}/tbb/lib/${LIBPREFIX}tbbmalloc_static${LIBEXT}
-DTbb_tbbmalloc_proxy_LIBRARY=${LIBDIR}/tbb/lib/${LIBPREFIX}tbbmalloc_proxy_static${LIBEXT}
-DOPENEXR_NAMESPACE_VERSIONING=OFF
-DEXTRA_LIBS:FILEPATH=${LIBDIR}/pthreads/lib/pthreadVC3.lib
)
if("${CMAKE_SIZEOF_VOID_P}" EQUAL "4")
set(OPENVDB_EXTRA_ARGS ${OPENVDB_EXTRA_ARGS}
-DCMAKE_SHARED_LINKER_FLAGS="/safeseh:no"
-DCMAKE_EXE_LINKER_FLAGS="/safeseh:no"
)
endif()
endif()
ExternalProject_Add(openvdb
@@ -77,6 +79,7 @@ add_dependencies(
openvdb
external_tbb
external_boost
external_ilmbase
external_openexr
external_zlib
external_blosc
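Regarding the explicit Tbb_*_LIBRARY / TBB_LIBRARY_PATH entries above: as the in-line comment says, OpenVDB's find modules do not look for the _static-suffixed library names this dependency build produces, so the paths are spelled out by hand. A minimal sketch of the same kind of hint expressed with find_library; the result variable name is illustrative, not the one OpenVDB's FindTBB actually reads.

# Illustrative only: prefer the static TBB produced by this dependency build.
find_library(TBB_STATIC_LIBRARY
  NAMES tbb_static tbb
  PATHS ${LIBDIR}/tbb/lib
  NO_DEFAULT_PATH
)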

View File

@@ -20,6 +20,7 @@ if(WIN32)
option(ENABLE_MINGW64 "Enable building of ffmpeg/iconv/libsndfile/lapack/fftw3 by installing mingw64" ON)
endif()
option(WITH_WEBP "Enable building of oiio with webp support" OFF)
option(WITH_EMBREE "Enable building of Embree" OFF)
set(MAKE_THREADS 1 CACHE STRING "Number of threads to run make with")
if(NOT BUILD_MODE)
@@ -61,22 +62,22 @@ if(WIN32)
endif()
set(COMMON_MSVC_FLAGS "${COMMON_MSVC_FLAGS} /bigobj")
if(WITH_OPTIMIZED_DEBUG)
set(BLENDER_CMAKE_C_FLAGS_DEBUG "/MDd ${COMMON_MSVC_FLAGS} /O2 /Ob2 /DNDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_DEBUG "/MTd ${COMMON_MSVC_FLAGS} /O2 /Ob2 /DNDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
else()
set(BLENDER_CMAKE_C_FLAGS_DEBUG "/MDd ${COMMON_MSVC_FLAGS} /Zi /Ob0 /Od /RTC1 /D_DEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_DEBUG "/MTd ${COMMON_MSVC_FLAGS} /Zi /Ob0 /Od /RTC1 /D_DEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
endif()
set(BLENDER_CMAKE_C_FLAGS_MINSIZEREL "/MD ${COMMON_MSVC_FLAGS} /O1 /Ob1 /D NDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_RELEASE "/MD ${COMMON_MSVC_FLAGS} /O2 /Ob2 /DNDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_RELWITHDEBINFO "/MD ${COMMON_MSVC_FLAGS} /Zi /O2 /Ob1 /D NDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_MINSIZEREL "/MT ${COMMON_MSVC_FLAGS} /O1 /Ob1 /D NDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_RELEASE "/MT ${COMMON_MSVC_FLAGS} /O2 /Ob2 /DNDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_RELWITHDEBINFO "/MT ${COMMON_MSVC_FLAGS} /Zi /O2 /Ob1 /D NDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
if(WITH_OPTIMIZED_DEBUG)
set(BLENDER_CMAKE_CXX_FLAGS_DEBUG "/MDd ${COMMON_MSVC_FLAGS} /O2 /Ob2 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_DEBUG "/MTd ${COMMON_MSVC_FLAGS} /O2 /Ob2 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
else()
set(BLENDER_CMAKE_CXX_FLAGS_DEBUG "/D_DEBUG /D PLATFORM_WINDOWS /MTd ${COMMON_MSVC_FLAGS} /Zi /Ob0 /Od /RTC1 /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
endif()
set(BLENDER_CMAKE_CXX_FLAGS_MINSIZEREL "/MD /${COMMON_MSVC_FLAGS} /O1 /Ob1 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_RELEASE "/MD ${COMMON_MSVC_FLAGS} /O2 /Ob2 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_RELWITHDEBINFO "/MD ${COMMON_MSVC_FLAGS} /Zi /O2 /Ob1 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_MINSIZEREL "/MT /${COMMON_MSVC_FLAGS} /O1 /Ob1 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_RELEASE "/MT ${COMMON_MSVC_FLAGS} /O2 /Ob2 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_RELWITHDEBINFO "/MT ${COMMON_MSVC_FLAGS} /Zi /O2 /Ob1 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(PLATFORM_FLAGS)
set(PLATFORM_CXX_FLAGS)
@@ -96,8 +97,8 @@ if(WIN32)
set(CONFIGURE_ENV
cd ${MINGW_PATH} &&
call ${PERL_SHELL} &&
call ${MINGW_SHELL} &&
call ${PERL_SHELL} &&
set path &&
set CFLAGS=-g &&
set LDFLAGS=-Wl,--as-needed -static-libgcc
@@ -189,7 +190,7 @@ set(DEFAULT_CMAKE_FLAGS
)
if(WIN32)
# We need both flavors to build the thumbnail dlls
#we need both flavors to build the thumbnail dlls
if(MSVC12)
set(GENERATOR_32 "Visual Studio 12 2013")
set(GENERATOR_64 "Visual Studio 12 2013 Win64")
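Several hunks above (Boost's runtime-link, the LLVM/Clang CRT settings, and the /MD vs /MT flag changes here) switch the dependency builds between the dynamic and static MSVC runtimes; whichever side is used, every library and Boost's runtime-link setting have to agree or the final link fails. As an aside, newer CMake (3.15+, policy CMP0091) can express the same choice with a single variable; this is not used by the build files in this diff and is shown only to illustrate what the /MT vs /MD switch means.

# Needs CMake >= 3.15; selects /MT for release-style configs and /MTd for Debug.
cmake_policy(SET CMP0091 NEW)
set(CMAKE_MSVC_RUNTIME_LIBRARY "MultiThreaded$<$<CONFIG:Debug>:Debug>")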

View File

@@ -1,35 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
ExternalProject_Add(external_opus
URL ${OPUS_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH SHA256=${OPUS_HASH}
PREFIX ${BUILD_DIR}/opus
CONFIGURE_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/opus/src/external_opus/ && ${CONFIGURE_COMMAND} --prefix=${LIBDIR}/opus
--disable-shared
--enable-static
--with-pic
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/opus/src/external_opus/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/opus/src/external_opus/ && make install
INSTALL_DIR ${LIBDIR}/opus
)
if(MSVC)
set_target_properties(external_opus PROPERTIES FOLDER Mingw)
endif()

View File

@@ -18,7 +18,7 @@
if(WIN32)
set(OSL_CMAKE_CXX_STANDARD_LIBRARIES "kernel32${LIBEXT} user32${LIBEXT} gdi32${LIBEXT} winspool${LIBEXT} shell32${LIBEXT} ole32${LIBEXT} oleaut32${LIBEXT} uuid${LIBEXT} comdlg32${LIBEXT} advapi32${LIBEXT} psapi${LIBEXT}")
set(OSL_FLEX_BISON -DFLEX_EXECUTABLE=${LIBDIR}/flexbison/win_flex.exe -DBISON_EXECUTABLE=${LIBDIR}/flexbison/win_bison.exe)
set(OSL_FLEX_BISON -DFLEX_EXECUTABLE=${LIBDIR}/flexbison/win_flex.exe -DFLEX_EXTRA_OPTIONS="--wincompat" -DBISON_EXECUTABLE=${LIBDIR}/flexbison/win_bison.exe)
set(OSL_OPENIMAGEIO_LIBRARY "${LIBDIR}/openimageio/lib/${LIBPREFIX}OpenImageIO${LIBEXT};${LIBDIR}/openimageio/lib/${LIBPREFIX}OpenImageIO_Util${LIBEXT};${LIBDIR}/png/lib/libpng16${LIBEXT};${LIBDIR}/jpg/lib/${LIBPREFIX}jpeg${LIBEXT};${LIBDIR}/tiff/lib/${LIBPREFIX}tiff${LIBEXT};${LIBDIR}/openexr/lib/${LIBPREFIX}IlmImf${OPENEXR_VERSION_POSTFIX}${LIBEXT}")
if("${CMAKE_SIZEOF_VOID_P}" EQUAL "4")
set(OSL_SIMD_FLAGS -DOIIO_NOSIMD=1 -DOIIO_SIMD=0)
@@ -33,14 +33,14 @@ else()
SET(OSL_PLATFORM_FLAGS)
endif()
set(OSL_ILMBASE_CUSTOM_LIBRARIES "${LIBDIR}/openexr/lib/Imath${OPENEXR_VERSION_POSTFIX}.lib^^${LIBDIR}/openexr/lib/Half{OPENEXR_VERSION_POSTFIX}.lib^^${LIBDIR}/openexr/lib/IlmThread${OPENEXR_VERSION_POSTFIX}.lib^^${LIBDIR}/openexr/lib/Iex${OPENEXR_VERSION_POSTFIX}.lib")
set(OSL_ILMBASE_CUSTOM_LIBRARIES "${LIBDIR}/ilmbase/lib/Imath${ILMBASE_VERSION_POSTFIX}.lib^^${LIBDIR}/ilmbase/lib/Half{ILMBASE_VERSION_POSTFIX}.lib^^${LIBDIR}/ilmbase/lib/IlmThread${ILMBASE_VERSION_POSTFIX}.lib^^${LIBDIR}/ilmbase/lib/Iex${ILMBASE_VERSION_POSTFIX}.lib")
set(OSL_LLVM_LIBRARY "${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMAnalysis${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMAsmParser${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMAsmPrinter${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMBitReader${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMBitWriter${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMCodeGen${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMCore${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMDebugInfo${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMExecutionEngine${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMInstCombine${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMInstrumentation${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMInterpreter${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMJIT${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMLinker${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMMC${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMMCDisassembler${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMMCJIT${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMMCParser${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMObject${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMRuntimeDyld${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMScalarOpts${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMSelectionDAG${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMSupport${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMTableGen${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMTarget${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMTransformUtils${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMVectorize${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMX86AsmParser${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMX86AsmPrinter${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMX86CodeGen${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMX86Desc${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMX86Disassembler${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMX86Info${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMX86Utils${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMipa${LIBEXT};${LIBDIR}/llvm/lib/${LIBPREFIX}LLVMipo${LIBEXT}")
set(OSL_EXTRA_ARGS
-DBoost_COMPILER:STRING=${BOOST_COMPILER_STRING}
-DBoost_USE_MULTITHREADED=ON
-DBoost_USE_STATIC_LIBS=ON
-DBoost_USE_STATIC_RUNTIME=OFF
-DBoost_USE_STATIC_RUNTIME=ON
-DBOOST_ROOT=${LIBDIR}/boost
-DBOOST_LIBRARYDIR=${LIBDIR}/boost/lib/
-DBoost_NO_SYSTEM_PATHS=ON
@@ -50,21 +50,20 @@ set(OSL_EXTRA_ARGS
-DLLVM_VERSION=3.4
-DLLVM_LIBRARY=${OSL_LLVM_LIBRARY}
-DOPENEXR_HOME=${LIBDIR}/openexr/
-DILMBASE_HOME=${LIBDIR}/openexr/
-DILMBASE_INCLUDE_DIR=${LIBDIR}/openexr/include/
-DOPENEXR_HALF_LIBRARY=${LIBDIR}/openexr/lib/${LIBPREFIX}Half${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_IMATH_LIBRARY=${LIBDIR}/openexr/lib/${LIBPREFIX}Imath${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_ILMTHREAD_LIBRARY=${LIBDIR}/openexr/lib/${LIBPREFIX}IlmThread${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_IEX_LIBRARY=${LIBDIR}/openexr/lib/${LIBPREFIX}Iex${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DILMBASE_HOME=${LIBDIR}/ilmbase/
-DILMBASE_INCLUDE_DIR=${LIBDIR}/ilmbase/include/
-DOPENEXR_HALF_LIBRARY=${LIBDIR}/ilmbase/lib/${LIBPREFIX}Half${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_IMATH_LIBRARY=${LIBDIR}/ilmbase/lib/${LIBPREFIX}Imath${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_ILMTHREAD_LIBRARY=${LIBDIR}/ilmbase/lib/${LIBPREFIX}IlmThread${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_IEX_LIBRARY=${LIBDIR}/ilmbase/lib/${LIBPREFIX}Iex${ILMBASE_VERSION_POSTFIX}${LIBEXT}
-DOPENEXR_INCLUDE_DIR=${LIBDIR}/openexr/include/
-DOPENEXR_ILMIMF_LIBRARY=${LIBDIR}/openexr/lib/${LIBPREFIX}IlmImf${OPENEXR_VERSION_POSTFIX}${LIBEXT}
-DOSL_BUILD_TESTS=OFF
-DOSL_BUILD_MATERIALX=OFF
-DZLIB_LIBRARY=${LIBDIR}/zlib/lib/${ZLIB_LIBRARY}
-DZLIB_INCLUDE_DIR=${LIBDIR}/zlib/include/
-DOPENIMAGEIOHOME=${LIBDIR}/openimageio/
-DOPENIMAGEIO_INCLUDE_DIR=${LIBDIR}/openimageio/include
-DOPENIMAGEIO_LIBRARY=${OSL_OPENIMAGEIO_LIBRARY}
-DOPENIMAGEIO_INCLUDES=${LIBDIR}/openimageio/include
${OSL_FLEX_BISON}
-DCMAKE_CXX_STANDARD_LIBRARIES=${OSL_CMAKE_CXX_STANDARD_LIBRARIES}
-DBUILDSTATIC=ON
@@ -73,7 +72,6 @@ set(OSL_EXTRA_ARGS
-DSTOP_ON_WARNING=OFF
-DUSE_LLVM_BITCODE=OFF
-DUSE_PARTIO=OFF
-DUSE_QT=OFF
${OSL_SIMD_FLAGS}
-DPARTIO_LIBRARIES=
)
@@ -89,6 +87,7 @@ elseif(APPLE)
set(OSL_EXTRA_ARGS
${OSL_EXTRA_ARGS}
-DHIDE_SYMBOLS=OFF
-DPUGIXML_HOME=${LIBDIR}/pugixml
)
endif()
@@ -108,13 +107,14 @@ add_dependencies(
external_boost
ll
external_clang
external_ilmbase
external_openexr
external_zlib
external_flexbison
external_openimageio
)
if(UNIX)
if(UNIX AND NOT APPLE)
# Rely on PugiXML compiled with OpenImageIO
else()
add_dependencies(

View File

@@ -29,7 +29,6 @@ if(MSVC)
COMMAND ${CMAKE_COMMAND} -E copy ${PYSRC}/libs/python${PYTHON_SHORT_VERSION_NO_DOTS}.lib ${PYTARGET}/libs/python${PYTHON_SHORT_VERSION_NO_DOTS}.lib
COMMAND ${CMAKE_COMMAND} -E copy ${PYSRC}/python.exe ${PYTARGET}/bin/python.exe
COMMAND ${CMAKE_COMMAND} -E copy ${PYSRC}/python${PYTHON_SHORT_VERSION_NO_DOTS}.dll ${PYTARGET}/bin/python${PYTHON_SHORT_VERSION_NO_DOTS}.dll
COMMAND ${CMAKE_COMMAND} -E copy ${PYSRC}/python3${PYTHON_POSTFIX}.dll ${PYTARGET}/bin/python3${PYTHON_POSTFIX}.dll
COMMAND ${CMAKE_COMMAND} -E copy ${PYSRC}/python${PYTHON_SHORT_VERSION_NO_DOTS}.pdb ${PYTARGET}/libs/python${PYTHON_SHORT_VERSION_NO_DOTS}.pdb
COMMAND ${CMAKE_COMMAND} -E copy_directory ${PYSRC}/include/ ${PYTARGET}/include/
COMMAND ${CMAKE_COMMAND} -E copy_directory ${PYSRC}/lib/ ${PYTARGET}/lib/
@@ -48,7 +47,6 @@ if(MSVC)
COMMAND ${CMAKE_COMMAND} -E copy ${PYSRC}/libs/python${PYTHON_SHORT_VERSION_NO_DOTS}${PYTHON_POSTFIX}.lib ${PYTARGET}/libs/python${PYTHON_SHORT_VERSION_NO_DOTS}${PYTHON_POSTFIX}.lib
COMMAND ${CMAKE_COMMAND} -E copy ${PYSRC}/python${PYTHON_POSTFIX}.exe ${PYTARGET}/bin/python${PYTHON_POSTFIX}.exe
COMMAND ${CMAKE_COMMAND} -E copy ${PYSRC}/python${PYTHON_SHORT_VERSION_NO_DOTS}${PYTHON_POSTFIX}.dll ${PYTARGET}/bin/python${PYTHON_SHORT_VERSION_NO_DOTS}${PYTHON_POSTFIX}.dll
COMMAND ${CMAKE_COMMAND} -E copy ${PYSRC}/python3${PYTHON_POSTFIX}.dll ${PYTARGET}/bin/python3${PYTHON_POSTFIX}.dll
COMMAND ${CMAKE_COMMAND} -E copy ${PYSRC}/python${PYTHON_SHORT_VERSION_NO_DOTS}${PYTHON_POSTFIX}.pdb ${PYTARGET}/libs/python${PYTHON_SHORT_VERSION_NO_DOTS}${PYTHON_POSTFIX}.pdb
COMMAND ${CMAKE_COMMAND} -E copy_directory ${PYSRC}/include/ ${PYTARGET}/include/
COMMAND ${CMAKE_COMMAND} -E copy_directory ${PYSRC}/lib/ ${PYTARGET}/lib/

View File

@@ -43,7 +43,7 @@ if(WIN32)
PREFIX ${BUILD_DIR}/python
CONFIGURE_COMMAND ""
BUILD_COMMAND cd ${BUILD_DIR}/python/src/external_python/pcbuild/ && set IncludeTkinter=false && call build.bat -e -p ${PYTHON_ARCH} -c ${BUILD_MODE}
INSTALL_COMMAND ${PYTHON_BINARY_INTERNAL} ${PYTHON_SRC}/PC/layout/main.py -b ${PYTHON_SRC}/PCbuild/amd64 -s ${PYTHON_SRC} -t ${PYTHON_SRC}/tmp/ --include-underpth --include-stable --include-pip --include-dev --include-launchers --include-venv --include-symbols ${PYTHON_EXTRA_INSTLAL_FLAGS} --copy ${LIBDIR}/python
INSTALL_COMMAND ${PYTHON_BINARY_INTERNAL} ${PYTHON_SRC}/PC/layout/main.py -b ${PYTHON_SRC}/PCbuild/amd64 -s ${PYTHON_SRC} -t ${PYTHON_SRC}/tmp/ --include-underpth --include-stable --include-pip --include-dev --include-launchers --include-symbols ${PYTHON_EXTRA_INSTLAL_FLAGS} --copy ${LIBDIR}/python
)
else()

View File

@@ -0,0 +1,52 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
set(QEX_EXTRA_ARGS
-DOPENMESH_CORE_LIBRARY=${LIBDIR}/openmesh/lib/OpenMesh/libOpenMeshCore.a
-DOPENMESH_INCLUDE_DIR=${LIBDIR}/openmesh/include
-DOPENMESH_LIBRARY_DIR=${LIBDIR}/openmesh/lib/OpenMesh
-DOPENMESH_TOOLS_LIBRARY=${LIBDIR}/openmesh/lib/OpenMesh/libOpenMeshTools.a
)
ExternalProject_Add(external_qex
URL ${QEX_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH SHA256=${QEX_HASH}
PREFIX ${BUILD_DIR}/qex
PATCH_COMMAND ${PATCH_CMD} -p 0 -d ${BUILD_DIR}/qex/src/external_qex < ${PATCH_DIR}/qex.diff
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/qex -Wno-dev ${DEFAULT_CMAKE_FLAGS} ${QEX_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/qex
)
#grr, INSTALL(TARGETS has no exclude pattern, so removing unnecessary empty dirs here
ExternalProject_Add_Step(external_qex after_install
COMMAND ${CMAKE_COMMAND} -E remove_directory ${LIBDIR}/qex/lib/src
COMMAND ${CMAKE_COMMAND} -E remove_directory ${LIBDIR}/qex/lib/interfaces
COMMAND ${CMAKE_COMMAND} -E remove_directory ${LIBDIR}/qex/lib/demo
COMMAND ${CMAKE_COMMAND} -E remove_directory ${LIBDIR}/qex/include/src
COMMAND ${CMAKE_COMMAND} -E remove_directory ${LIBDIR}/qex/include/interfaces
COMMAND ${CMAKE_COMMAND} -E remove_directory ${LIBDIR}/qex/include/demo
DEPENDEES install
)
add_dependencies(
external_qex
external_igl
external_openmesh
)

View File
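Note: qex.cmake above installs QEx as static libraries plus a single public header (the install rules come from qex.diff further down). A hypothetical consumer-side sketch, not part of this branch, of how a target could pick up that install layout; the target name and find calls are illustrative only.

```cmake
# Hypothetical sketch: locating the QEx install produced above.
find_path(QEX_INCLUDE_DIR qex.h
  PATHS ${LIBDIR}/qex/include NO_DEFAULT_PATH)
find_library(QEX_LIBRARY NAMES QEx QExStatic
  PATHS ${LIBDIR}/qex/lib NO_DEFAULT_PATH)

if(QEX_INCLUDE_DIR AND QEX_LIBRARY)
  # my_remesh_tool is a placeholder target, not something defined by the branch.
  target_include_directories(my_remesh_tool PRIVATE ${QEX_INCLUDE_DIR})
  target_link_libraries(my_remesh_tool PRIVATE ${QEX_LIBRARY})
endif()
```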

@@ -21,7 +21,7 @@ set(SNDFILE_ENV PKG_CONFIG_PATH=${mingw_LIBDIR}/ogg/lib/pkgconfig:${mingw_LIBDIR
if(WIN32)
set(SNDFILE_ENV set ${SNDFILE_ENV} &&)
# Shared for windows because static libs will drag in a libgcc dependency.
#shared for windows because static libs will drag in a libgcc dependency.
set(SNDFILE_OPTIONS --disable-static --enable-shared )
else()
set(SNDFILE_OPTIONS --enable-static --disable-shared )

View File

@@ -19,10 +19,8 @@
set(SQLITE_CONFIGURE_ENV echo .)
set(SQLITE_CONFIGURATION_ARGS)
if(UNIX)
if(NOT APPLE)
set(SQLITE_LDFLAGS -Wl,--as-needed)
endif()
if(UNIX AND NOT APPLE)
set(SQLITE_LDFLAGS -Wl,--as-needed)
set(SQLITE_CFLAGS
"-DSQLITE_SECURE_DELETE -DSQLITE_ENABLE_COLUMN_METADATA \
-DSQLITE_ENABLE_FTS3 -DSQLITE_ENABLE_FTS3_PARENTHESIS \

View File

@@ -15,21 +15,13 @@
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
if(WIN32)
set(TBB_EXTRA_ARGS
-DTBB_BUILD_SHARED=On
-DTBB_BUILD_TBBMALLOC=On
-DTBB_BUILD_TBBMALLOC_PROXY=On
-DTBB_BUILD_STATIC=On
set(TBB_EXTRA_ARGS
-DTBB_BUILD_SHARED=Off
-DTBB_BUILD_TBBMALLOC=On
-DTBB_BUILD_TBBMALLOC_PROXY=Off
-DTBB_BUILD_STATIC=On
)
else()
set(TBB_EXTRA_ARGS
-DTBB_BUILD_SHARED=Off
-DTBB_BUILD_TBBMALLOC=On
-DTBB_BUILD_TBBMALLOC_PROXY=Off
-DTBB_BUILD_STATIC=On
)
endif()
# CMake script for TBB from https://github.com/wjakob/tbb/blob/master/CMakeLists.txt
ExternalProject_Add(external_tbb
@@ -47,10 +39,6 @@ if(WIN32)
if(BUILD_MODE STREQUAL Release)
ExternalProject_Add_Step(external_tbb after_install
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb_static.lib ${HARVEST_TARGET}/tbb/lib/tbb.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.dll ${HARVEST_TARGET}/tbb/lib/tbbmalloc.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc_proxy.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.dll ${HARVEST_TARGET}/tbb/lib/tbbmalloc_proxy.dll
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/tbb/include/ ${HARVEST_TARGET}/tbb/include/
DEPENDEES install
)
@@ -58,9 +46,6 @@ if(WIN32)
if(BUILD_MODE STREQUAL Debug)
ExternalProject_Add_Step(external_tbb after_install
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb_static.lib ${HARVEST_TARGET}/tbb/lib/tbb_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc_proxy_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.dll ${HARVEST_TARGET}/tbb/lib/debug/tbbmalloc.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.dll ${HARVEST_TARGET}/tbb/lib/debug/tbbmalloc_proxy.dll
DEPENDEES install
)
endif()

View File
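Note: the TBB hunks above drop the Windows-only shared configuration, so every platform now builds `tbb_static`. Dependent external projects in these scripts then point at the static archive explicitly; the usd.cmake entry further down does exactly this, and the pattern looks like the sketch below (purely illustrative, mirroring those `-DTBB_*` arguments).

```cmake
# Sketch of handing the static TBB build to a dependent external project.
set(EXAMPLE_TBB_ARGS
  -DTBB_INCLUDE_DIRS=${LIBDIR}/tbb/include
  -DTBB_LIBRARIES=${LIBDIR}/tbb/lib/${LIBPREFIX}tbb_static${LIBEXT}
  -DTbb_TBB_LIBRARY=${LIBDIR}/tbb/lib/${LIBPREFIX}tbb_static${LIBEXT}
)
```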

@@ -16,7 +16,7 @@
#
# ***** END GPL LICENSE BLOCK *****
if(UNIX)
if (UNIX)
set(THEORA_CONFIGURE_ENV ${CONFIGURE_ENV} && export HAVE_PDFLATEX=no)
else()
set(THEORA_CONFIGURE_ENV ${CONFIGURE_ENV})

View File

@@ -24,7 +24,7 @@ ExternalProject_Add(external_tinyxml
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${TINYXML_HASH}
PREFIX ${BUILD_DIR}/tinyxml
# patch taken from ocio
#patch taken from ocio
PATCH_COMMAND ${PATCH_CMD} -p 1 -N -d ${BUILD_DIR}/tinyxml/src/external_tinyxml < ${PATCH_DIR}/tinyxml.diff
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/tinyxml ${DEFAULT_CMAKE_FLAGS} ${TINYXML_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/tinyxml

View File

@@ -1,101 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
set(USD_EXTRA_ARGS
-DBoost_COMPILER:STRING=${BOOST_COMPILER_STRING}
-DBoost_USE_MULTITHREADED=ON
-DBoost_USE_STATIC_LIBS=ON
-DBoost_USE_STATIC_RUNTIME=OFF
-DBOOST_ROOT=${LIBDIR}/boost
-DTBB_INCLUDE_DIRS=${LIBDIR}/tbb/include
-DTBB_LIBRARIES=${LIBDIR}/tbb/lib/${LIBPREFIX}tbb_static${LIBEXT}
-DTbb_TBB_LIBRARY=${LIBDIR}/tbb/lib/${LIBPREFIX}tbb_static${LIBEXT}
# This is a preventative measure that avoids possible conflicts when add-ons
# try to load another USD library into the same process space.
-DPXR_SET_INTERNAL_NAMESPACE=usdBlender
-DPXR_ENABLE_PYTHON_SUPPORT=OFF
-DPXR_BUILD_IMAGING=OFF
-DPXR_BUILD_TESTS=OFF
-DBUILD_SHARED_LIBS=OFF
-DPYTHON_EXECUTABLE=${PYTHON_BINARY}
-DPXR_BUILD_MONOLITHIC=ON
# The PXR_BUILD_USD_TOOLS argument is patched-in by usd.diff. An upstream pull request
# can be found at https://github.com/PixarAnimationStudios/USD/pull/1048.
-DPXR_BUILD_USD_TOOLS=OFF
-DCMAKE_DEBUG_POSTFIX=_d
# USD is hellbound on making a shared lib, unless you point this variable to a valid cmake file
# doesn't have to make sense, but as long as it points somewhere valid it will skip the shared lib.
-DPXR_MONOLITHIC_IMPORT=${BUILD_DIR}/usd/src/external_usd/cmake/defaults/Version.cmake
)
ExternalProject_Add(external_usd
URL ${USD_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${USD_HASH}
PREFIX ${BUILD_DIR}/usd
PATCH_COMMAND ${PATCH_CMD} -p 1 -d ${BUILD_DIR}/usd/src/external_usd < ${PATCH_DIR}/usd.diff
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/usd -Wno-dev ${DEFAULT_CMAKE_FLAGS} ${USD_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/usd
)
add_dependencies(
external_usd
external_tbb
external_boost
)
if(WIN32)
# USD currently demands python be available at build time
# and then proceeds not to use it, but still checks that the
# version of the interpreter it is not going to use is at least 2.7
# so we need this dep currently since there is no system python
# on windows.
add_dependencies(
external_usd
external_python
)
if(BUILD_MODE STREQUAL Release)
ExternalProject_Add_Step(external_usd after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/usd/ ${HARVEST_TARGET}/usd
COMMAND ${CMAKE_COMMAND} -E copy ${BUILD_DIR}/usd/src/external_usd-build/pxr/Release/libusd_m.lib ${HARVEST_TARGET}/usd/lib/libusd_m.lib
DEPENDEES install
)
endif()
if(BUILD_MODE STREQUAL Debug)
ExternalProject_Add_Step(external_usd after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/usd/lib ${HARVEST_TARGET}/usd/lib
COMMAND ${CMAKE_COMMAND} -E copy ${BUILD_DIR}/usd/src/external_usd-build/pxr/Debug/libusd_m_d.lib ${HARVEST_TARGET}/usd/lib/libusd_m_d.lib
DEPENDEES install
)
endif()
else()
# USD has two build options. The default build creates lots of small libraries,
# whereas the 'monolithic' build produces only a single library. The latter
# makes linking simpler, so that's what we use in Blender. However, running
# 'make install' in the USD sources doesn't install the static library in that
# case (only the shared library). As a result, we need to grab the `libusd_m.a`
# file from the build directory instead of from the install directory.
ExternalProject_Add_Step(external_usd after_install
COMMAND ${CMAKE_COMMAND} -E copy ${BUILD_DIR}/usd/src/external_usd-build/pxr/libusd_m.a ${HARVEST_TARGET}/usd/lib/libusd_m.a
DEPENDEES install
)
endif()

View File
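Note: usd.cmake above builds USD monolithically and harvests `libusd_m` straight from the build tree, with a `_d` suffix on Windows debug builds. A purely illustrative sketch of how a consumer could later look up the harvested archive; none of these find calls exist in the branch.

```cmake
# Illustrative only: finding the harvested monolithic USD library.
find_library(USD_MONOLITHIC_LIBRARY
  NAMES usd_m libusd_m usd_m_d libusd_m_d
  PATHS ${HARVEST_TARGET}/usd/lib ${LIBDIR}/usd/lib
  NO_DEFAULT_PATH
)
if(NOT USD_MONOLITHIC_LIBRARY)
  message(WARNING "Monolithic USD library not found; check the after_install copy steps")
endif()
```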

@@ -32,43 +32,47 @@ set(JPEG_VERSION 1.5.3)
set(JPEG_URI https://github.com/libjpeg-turbo/libjpeg-turbo/archive/${JPEG_VERSION}.tar.gz)
set(JPEG_HASH 5b7549d440b86c98a517355c102d155e)
set(BOOST_VERSION 1.70.0)
set(BOOST_VERSION_NODOTS 1_70_0)
set(BOOST_VERSION_NODOTS_SHORT 1_70)
set(BOOST_VERSION 1.68.0)
set(BOOST_VERSION_NODOTS 1_68_0)
set(BOOST_URI https://dl.bintray.com/boostorg/release/${BOOST_VERSION}/source/boost_${BOOST_VERSION_NODOTS}.tar.gz)
set(BOOST_HASH fea771fe8176828fabf9c09242ee8c26)
set(BOOST_HASH 5d8b4503582fffa9eefdb9045359c239)
# Using old version as recommended by OpenVDB build documentation.
set(BLOSC_VERSION 1.5.0)
set(BLOSC_VERSION 1.14.4)
set(BLOSC_URI https://github.com/Blosc/c-blosc/archive/v${BLOSC_VERSION}.tar.gz)
set(BLOSC_HASH 6e4a49c8c06f05aa543f3312cfce3d55)
set(BLOSC_HASH e80dfc71e4cba03b8d01ed0876547ffe)
set(PTHREADS_VERSION 3.0.0)
set(PTHREADS_URI http://sourceforge.mirrorservice.org/p/pt/pthreads4w/pthreads4w-code-v${PTHREADS_VERSION}.zip)
set(PTHREADS_HASH f3bf81bb395840b3446197bcf4ecd653)
set(OPENEXR_VERSION 2.4.0)
set(OPENEXR_URI https://github.com/AcademySoftwareFoundation/openexr/archive/v${OPENEXR_VERSION}.tar.gz)
set(OPENEXR_HASH 9e4d69cf2a12c6fb19b98af7c5e0eaee)
set(ILMBASE_VERSION 2.3.0)
if(WIN32)
# Openexr started appending _d on its own so now
# we need to tell the build the postfix is _s while
# telling all other deps the postfix is _s_d
if(BUILD_MODE STREQUAL Release)
set(ILMBASE_VERSION_POSTFIX _s)
set(OPENEXR_VERSION_POSTFIX _s)
set(OPENEXR_VERSION_BUILD_POSTFIX _s)
else()
set(ILMBASE_VERSION_POSTFIX _s_d)
set(OPENEXR_VERSION_POSTFIX _s_d)
set(OPENEXR_VERSION_BUILD_POSTFIX _s)
endif()
else()
set(OPENEXR_VERSION_BUILD_POSTFIX)
set(ILMBASE_VERSION_POSTFIX)
endif()
set(ILMBASE_URI https://github.com/openexr/openexr/releases/download/v${ILMBASE_VERSION}/ilmbase-${ILMBASE_VERSION}.tar.gz)
set(ILMBASE_HASH 354bf86de3b930ab87ac63619d60c860)
set(OPENEXR_VERSION 2.3.0)
if(WIN32) #release 2.3.0 tarball has broken cmake support
set(OPENEXR_URI https://github.com/openexr/openexr/archive/0ac2ea34c8f3134148a5df4052e40f155b76f6fb.tar.gz)
set(OPENEXR_HASH ed159435d508240712fbaaa21d94bafb)
else()
set(OPENEXR_VERSION_POSTFIX)
set(OPENEXR_URI https://github.com/openexr/openexr/releases/download/v${OPENEXR_VERSION}/openexr-${OPENEXR_VERSION}.tar.gz)
set(OPENEXR_HASH a157e8a46596bc185f2472a5a4682174)
endif()
set(FREETYPE_VERSION 2.10.1)
set(FREETYPE_VERSION 2.9.1)
set(FREETYPE_URI http://prdownloads.sourceforge.net/freetype/freetype-${FREETYPE_VERSION}.tar.gz)
set(FREETYPE_HASH c50a3c9e5e62bdc938a6e1598a782947)
set(FREETYPE_HASH 3adb0e35d3c100c456357345ccfa8056)
set(GLEW_VERSION 1.13.0)
set(GLEW_URI http://prdownloads.sourceforge.net/glew/glew/${GLEW_VERSION}/glew-${GLEW_VERSION}.tgz)
@@ -82,21 +86,21 @@ set(HDF5_VERSION 1.8.17)
set(HDF5_URI https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-1.8/hdf5-${HDF5_VERSION}/src/hdf5-${HDF5_VERSION}.tar.gz)
set(HDF5_HASH 7d572f8f3b798a628b8245af0391a0ca)
set(ALEMBIC_VERSION 1.7.12)
set(ALEMBIC_VERSION 1.7.8)
set(ALEMBIC_URI https://github.com/alembic/alembic/archive/${ALEMBIC_VERSION}.tar.gz)
set(ALEMBIC_MD5 e2b3777f23c5c09481a008cc6f0f8a40)
set(ALEMBIC_MD5 d095c2feb5e183b824904db7b63c1d30)
# hash is for 3.1.2
## hash is for 3.1.2
set(GLFW_GIT_UID 30306e54705c3adae9fe082c816a3be71963485c)
set(GLFW_URI https://github.com/glfw/glfw/archive/${GLFW_GIT_UID}.zip)
set(GLFW_HASH 20cacb1613da7eeb092f3ac4f6b2b3d0)
# latest uid in git as of 2016-04-01
#latest uid in git as of 2016-04-01
set(CLEW_GIT_UID 277db43f6cafe8b27c6f1055f69dc67da4aeb299)
set(CLEW_URI https://github.com/OpenCLWrangler/clew/archive/${CLEW_GIT_UID}.zip)
set(CLEW_HASH 2c699d10ed78362e71f56fae2a4c5f98)
# latest uid in git as of 2016-04-01
#latest uid in git as of 2016-04-01
set(CUEW_GIT_UID 1744972026de9cf27c8a7dc39cf39cd83d5f922f)
set(CUEW_URI https://github.com/CudaWrangler/cuew/archive/${CUEW_GIT_UID}.zip)
set(CUEW_HASH 86760d62978ebfd96cd93f5aa1abaf4a)
@@ -117,15 +121,15 @@ set(OPENCOLORIO_VERSION 1.1.0)
set(OPENCOLORIO_URI https://github.com/imageworks/OpenColorIO/archive/v${OPENCOLORIO_VERSION}.tar.gz)
set(OPENCOLORIO_HASH 802d8f5b1d1fe316ec5f76511aa611b8)
set(LLVM_VERSION 9.0.1)
set(LLVM_URI https://github.com/llvm/llvm-project/releases/download/llvmorg-${LLVM_VERSION}/llvm-${LLVM_VERSION}.src.tar.xz)
set(LLVM_HASH 31eb9ce73dd2a0f8dcab8319fb03f8fc)
set(LLVM_VERSION 6.0.1)
set(LLVM_URI http://releases.llvm.org/${LLVM_VERSION}/llvm-${LLVM_VERSION}.src.tar.xz)
set(LLVM_HASH c88c98709300ce2c285391f387fecce0)
set(CLANG_URI https://github.com/llvm/llvm-project/releases/download/llvmorg-${LLVM_VERSION}/clang-${LLVM_VERSION}.src.tar.xz)
set(CLANG_HASH 13468e4a44940efef1b75e8641752f90)
set(CLANG_URI http://releases.llvm.org/${LLVM_VERSION}/cfe-${LLVM_VERSION}.src.tar.xz)
set(CLANG_HASH 4e419bd4e3b55aa06d872320f754bd85)
set(OPENMP_URI https://github.com/llvm/llvm-project/releases/download/llvmorg-${LLVM_VERSION}/openmp-${LLVM_VERSION}.src.tar.xz)
set(OPENMP_HASH 6eade16057edbdecb3c4eef9daa2bfcf)
set(OPENMP_URI http://releases.llvm.org/${LLVM_VERSION}/openmp-${LLVM_VERSION}.src.tar.xz)
set(OPENMP_HASH 4826402ae3633c36c51ba4d0e5527d30)
set(OPENIMAGEIO_VERSION 1.8.13)
set(OPENIMAGEIO_URI https://github.com/OpenImageIO/oiio/archive/Release-${OPENIMAGEIO_VERSION}.tar.gz)
@@ -135,9 +139,9 @@ set(TIFF_VERSION 4.0.9)
set(TIFF_URI http://download.osgeo.org/libtiff/tiff-${TIFF_VERSION}.tar.gz)
set(TIFF_HASH 54bad211279cc93eb4fca31ba9bfdc79)
set(OSL_VERSION 1.10.9)
set(OSL_VERSION 1.9.9)
set(OSL_URI https://github.com/imageworks/OpenShadingLanguage/archive/Release-${OSL_VERSION}.tar.gz)
set(OSL_HASH a94f1e8506f7e8f5e993653de5c5fa00)
set(OSL_HASH 44ad511e424965a10fce051a053b0605)
set(PYTHON_VERSION 3.7.4)
set(PYTHON_SHORT_VERSION 3.7)
@@ -145,13 +149,13 @@ set(PYTHON_SHORT_VERSION_NO_DOTS 37)
set(PYTHON_URI https://www.python.org/ftp/python/${PYTHON_VERSION}/Python-${PYTHON_VERSION}.tar.xz)
set(PYTHON_HASH d33e4aae66097051c2eca45ee3604803)
set(TBB_VERSION 2019_U9)
set(TBB_VERSION 2018_U5)
set(TBB_URI https://github.com/01org/tbb/archive/${TBB_VERSION}.tar.gz)
set(TBB_HASH 584edbec127c508f2cd5b6e79ad200fc)
set(TBB_HASH ff3ae09f8c23892fbc3008c39f78288f)
set(OPENVDB_VERSION 7.0.0)
set(OPENVDB_VERSION 5.1.0)
set(OPENVDB_URI https://github.com/dreamworksanimation/openvdb/archive/v${OPENVDB_VERSION}.tar.gz)
set(OPENVDB_HASH fd6c4f168282f7e0e494d290cd531fa8)
set(OPENVDB_HASH 5310101f874dcfd2165f9cee68c22624)
set(IDNA_VERSION 2.8)
set(CHARDET_VERSION 3.0.4)
@@ -188,10 +192,6 @@ set(VPX_VERSION 1.7.0)
set(VPX_URI https://github.com/webmproject/libvpx/archive/v${VPX_VERSION}/libvpx-v${VPX_VERSION}.tar.gz)
set(VPX_HASH 1fec931eb5c94279ad219a5b6e0202358e94a93a90cfb1603578c326abfc1238)
set(OPUS_VERSION 1.3.1)
set(OPUS_URI https://archive.mozilla.org/pub/opus/opus-${OPUS_VERSION}.tar.gz)
set(OPUS_HASH 65b58e1e25b2a114157014736a3d9dfeaad8d41be1c8179866f144a2fb44ff9d)
set(X264_URI http://download.videolan.org/pub/videolan/x264/snapshots/x264-snapshot-20180811-2245-stable.tar.bz2)
set(X264_HASH ae8a868a0e236a348b35d79f3ee80294b169d1195408b689f9851383661ed7aa)
@@ -199,7 +199,7 @@ set(XVIDCORE_VERSION 1.3.5)
set(XVIDCORE_URI http://downloads.xvid.org/downloads/xvidcore-${XVIDCORE_VERSION}.tar.gz)
set(XVIDCORE_HASH 165ba6a2a447a8375f7b06db5a3c91810181f2898166e7c8137401d7fc894cf0)
# This has to be in sync with the version in blenders /extern folder.
#this has to be in sync with the version in blenders /extern folder
set(OPENJPEG_VERSION 2.3.0)
set(OPENJPEG_SHORT_VERSION 2.3)
# Use slightly newer commit after release which includes a cmake fix
@@ -230,9 +230,9 @@ set(SNDFILE_VERSION 1.0.28)
set(SNDFILE_URI http://www.mega-nerd.com/libsndfile/files/libsndfile-${SNDFILE_VERSION}.tar.gz)
set(SNDFILE_HASH 646b5f98ce89ac60cdb060fcd398247c)
# set(HIDAPI_VERSION 0.8.0-rc1)
# set(HIDAPI_URI https://github.com/signal11/hidapi/archive/hidapi-${HIDAPI_VERSION}.tar.gz)
# set(HIDAPI_HASH 069f9dd746edc37b6b6d0e3656f47199)
#set(HIDAPI_VERSION 0.8.0-rc1)
#set(HIDAPI_URI https://github.com/signal11/hidapi/archive/hidapi-${HIDAPI_VERSION}.tar.gz)
#set(HIDAPI_HASH 069f9dd746edc37b6b6d0e3656f47199)
set(HIDAPI_UID 89a6c75dc6f45ecabd4ddfbd2bf5ba6ad8ba38b5)
set(HIDAPI_URI https://github.com/TheOnlyJoey/hidapi/archive/${HIDAPI_UID}.zip)
@@ -259,9 +259,9 @@ set(TINYXML_VERSION_DOTS 2.6.2)
set(TINYXML_URI https://nchc.dl.sourceforge.net/project/tinyxml/tinyxml/${TINYXML_VERSION_DOTS}/tinyxml_${TINYXML_VERSION}.tar.gz)
set(TINYXML_HASH c1b864c96804a10526540c664ade67f0)
set(YAMLCPP_VERSION 0.6.3)
set(YAMLCPP_VERSION 0.6.2)
set(YAMLCPP_URI https://codeload.github.com/jbeder/yaml-cpp/tar.gz/yaml-cpp-${YAMLCPP_VERSION})
set(YAMLCPP_HASH b45bf1089a382e81f6b661062c10d0c2)
set(YAMLCPP_HASH 5b943e9af0060d0811148b037449ef82)
set(LCMS_VERSION 2.9)
set(LCMS_URI https://nchc.dl.sourceforge.net/project/lcms/lcms/${LCMS_VERSION}/lcms2-${LCMS_VERSION}.tar.gz)
@@ -299,26 +299,33 @@ set(SQLITE_VERSION 3.24.0)
set(SQLITE_URI https://www.sqlite.org/2018/sqlite-src-3240000.zip)
set(SQLITE_HASH fb558c49ee21a837713c4f1e7e413309aabdd9c7)
set(EMBREE_VERSION 3.8.0)
set(EMBREE_VERSION 3.2.4)
set(EMBREE_URI https://github.com/embree/embree/archive/v${EMBREE_VERSION}.zip)
set(EMBREE_HASH ac504d5426945fe25dec1267e0c39d52)
set(EMBREE_HASH 3d4a1147002ff43939d45140aa9d6fb8)
set(USD_VERSION 19.11)
set(USD_URI https://github.com/PixarAnimationStudios/USD/archive/v${USD_VERSION}.tar.gz)
set(USD_HASH 79ff176167b3fe85f4953abd6cc5e0cc)
<<<<<<< HEAD
set(IGL_VERSION 2.0.0)
set(IGL_URI https://github.com/libigl/libigl/archive/v${IGL_VERSION}.tar.gz)
set(IGL_HASH 42518e6b106c7209c73435fd260ed5d34edeb254852495b4c95dce2d95401328)
#should match qex, but what about comiso ?
set(OPENMESH_VERSION 8.0)
set(OPENMESH_URI http://www.openmesh.org/media/Releases/${OPENMESH_VERSION}/OpenMesh-${OPENMESH_VERSION}.tar.gz)
#set(OPENMESH_HASH 96c595c1683b1ad950e80464ae5fc1f3c8dc45c31c8a211122a23e16a076ab23) #3.0
set(OPENMESH_HASH 8974d44026cacaa37b171945b5c96a284bfd32c9df9d671d62931050d057ec82)
#latest uid in git as of 2016-10-25
set(QEX_GIT_UID 1d33e0d6700dd3d00df984aad776d9537fcf16af)
set(QEX_URI https://github.com/hcebke/libQEx/archive/${QEX_GIT_UID}.tar.gz)
set(QEX_HASH a29cf440c7b83c9e803d1f55247cd18e9c277b3845edce5d381c4c91daa90452)
#latest uid in release branch 3.8.0
set(LAPACK_GIT_UID ba3779a6813d84d329b73aac86afc4e041170609)
set(LAPACK_URI https://github.com/Reference-LAPACK/lapack-release/archive/${LAPACK_GIT_UID}.tar.gz)
set(LAPACK_HASH 4bd7a3014ee2b5c7cdcdc8960c9476b689f21796715aa03fd719c92c43faee31)
=======
set(OIDN_VERSION 1.0.0)
set(OIDN_URI https://github.com/OpenImageDenoise/oidn/releases/download/v${OIDN_VERSION}/oidn-${OIDN_VERSION}.src.zip)
set(OIDN_HASH 19fe67b0164e8f020ac8a4f520defe60)
set(LIBGLU_VERSION 9.0.1)
set(LIBGLU_URI ftp://ftp.freedesktop.org/pub/mesa/glu/glu-${LIBGLU_VERSION}.tar.xz)
set(LIBGLU_HASH 151aef599b8259efe9acd599c96ea2a3)
set(MESA_VERSION 18.3.1)
set(MESA_URI ftp://ftp.freedesktop.org/pub/mesa//mesa-${MESA_VERSION}.tar.xz)
set(MESA_HASH d60828056d77bfdbae0970f9b15fb1be)
set(XR_OPENXR_SDK_VERSION 1.0.6)
set(XR_OPENXR_SDK_URI https://github.com/KhronosGroup/OpenXR-SDK/archive/release-${XR_OPENXR_SDK_VERSION}.tar.gz)
set(XR_OPENXR_SDK_HASH 21daea7c3bfec365298d779a0e19caa1)
>>>>>>> master

View File
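Note: versions.cmake above only pins versions, URIs, and hashes, including the branch-side IGL/OpenMesh/QEx/LAPACK additions. Each pin is consumed by an `ExternalProject_Add` of the same shape as the qex entry earlier; a name-neutral sketch of that pattern follows (FOO is a placeholder, and the snippet relies on the surrounding harness for `DOWNLOAD_DIR`, `BUILD_DIR`, `LIBDIR`, and the ExternalProject include).

```cmake
# Name-neutral sketch of how a versions.cmake pin feeds an ExternalProject_Add.
set(FOO_VERSION 1.2.3)
set(FOO_URI https://example.org/foo-${FOO_VERSION}.tar.gz)
set(FOO_HASH 00000000000000000000000000000000)  # md5 of the tarball

ExternalProject_Add(external_foo
  URL ${FOO_URI}
  DOWNLOAD_DIR ${DOWNLOAD_DIR}
  URL_HASH MD5=${FOO_HASH}
  PREFIX ${BUILD_DIR}/foo
  CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/foo -Wno-dev ${DEFAULT_CMAKE_FLAGS}
  INSTALL_DIR ${LIBDIR}/foo
)
```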

@@ -49,8 +49,6 @@ ExternalProject_Add(external_vpx
--disable-avx2
--disable-unit-tests
--disable-examples
--enable-vp8
--enable-vp9
${VPX_EXTRA_FLAGS}
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/vpx/src/external_vpx/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/vpx/src/external_vpx/ && make install

View File

@@ -39,12 +39,3 @@ ExternalProject_Add(external_webp
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/webp -Wno-dev ${DEFAULT_CMAKE_FLAGS} ${WEBP_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/webp
)
if(WIN32)
if(BUILD_MODE STREQUAL Release)
ExternalProject_Add_Step(external_webp after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/webp ${HARVEST_TARGET}/webp
DEPENDEES install
)
endif()
endif()

View File

@@ -1,58 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
# Keep flags in sync with install_deps.sh ones in compile_XR_OpenXR_SDK()
set(XR_OPENXR_SDK_EXTRA_ARGS
-DBUILD_FORCE_GENERATION=OFF
-DBUILD_LOADER=ON
-DDYNAMIC_LOADER=OFF
)
if(UNIX AND NOT APPLE)
list(APPEND XR_OPENXR_SDK_EXTRA_ARGS
-DBUILD_WITH_WAYLAND_HEADERS=OFF
-DBUILD_WITH_XCB_HEADERS=OFF
-DBUILD_WITH_XLIB_HEADERS=ON
)
endif()
ExternalProject_Add(external_xr_openxr_sdk
URL ${XR_OPENXR_SDK_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${XR_OPENXR_SDK_HASH}
PREFIX ${BUILD_DIR}/xr_openxr_sdk
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/xr_openxr_sdk ${DEFAULT_CMAKE_FLAGS} ${XR_OPENXR_SDK_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/xr_openxr_sdk
)
if(WIN32)
if(BUILD_MODE STREQUAL Release)
ExternalProject_Add_Step(external_xr_openxr_sdk after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/xr_openxr_sdk/include/openxr ${HARVEST_TARGET}/xr_openxr_sdk/include/openxr
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/xr_openxr_sdk/lib ${HARVEST_TARGET}/xr_openxr_sdk/lib
DEPENDEES install
)
endif()
if(BUILD_MODE STREQUAL Debug)
ExternalProject_Add_Step(external_xr_openxr_sdk after_install
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/xr_openxr_sdk/lib/openxr_loader.lib ${HARVEST_TARGET}/xr_openxr_sdk/lib/openxr_loader_d.lib
DEPENDEES install
)
endif()
endif()

View File

@@ -21,7 +21,7 @@ set(YAMLCPP_EXTRA_ARGS
-DYAML_CPP_BUILD_TESTS=OFF
-DYAML_CPP_BUILD_TOOLS=OFF
-DYAML_CPP_BUILD_CONTRIB=OFF
-DYAML_MSVC_SHARED_RT=ON
-DMSVC_SHARED_RT=OFF
)
ExternalProject_Add(external_yamlcpp

View File

@@ -1,100 +0,0 @@
strict graph {
graph[autosize = false, size = "25.7,8.3!", resolution = 300, overlap = false, splines = false, outputorder=edgesfirst ];
node [style=filled fillcolor=white];
external_alembic -- external_boost;
external_alembic -- external_zlib;
external_alembic -- external_openexr;
external_blosc -- external_zlib;
external_blosc -- external_pthreads;
external_boost -- Make_Python_Environment;
external_clang -- ll;
external_ffmpeg -- external_zlib;
external_ffmpeg -- external_faad;
external_ffmpeg -- external_openjpeg;
external_ffmpeg -- external_xvidcore;
external_ffmpeg -- external_x264;
external_ffmpeg -- external_vpx;
external_ffmpeg -- external_theora;
external_ffmpeg -- external_vorbis;
external_ffmpeg -- external_ogg;
external_ffmpeg -- external_lame;
external_ffmpeg -- external_zlib_mingw;
external_numpy -- Make_Python_Environment;
external_opencollada -- external_xml2;
external_opencolorio -- external_boost;
external_opencolorio -- external_tinyxml;
external_opencolorio -- external_yamlcpp;
external_openexr -- external_zlib;
external_openimageio -- external_png;
external_openimageio -- external_zlib;
external_openimageio -- external_openexr;
external_openimageio -- external_openexr;
external_openimageio -- external_jpeg;
external_openimageio -- external_boost;
external_openimageio -- external_tiff;
external_openimageio -- external_opencolorio;
external_openimageio -- external_openjpeg;
external_openimageio -- external_webp;
external_openimageio -- external_opencolorio_extra;
external_openmp -- external_clang;
external_opensubdiv -- external_glew;
external_opensubdiv -- external_glfw;
external_opensubdiv -- external_clew;
external_opensubdiv -- external_cuew;
external_opensubdiv -- external_tbb;
openvdb -- external_tbb;
openvdb -- external_boost;
openvdb -- external_openexr;
openvdb -- external_openexr;
openvdb -- external_zlib;
openvdb -- external_blosc;
external_osl -- external_boost;
external_osl -- ll;
external_osl -- external_clang;
external_osl -- external_openexr;
external_osl -- external_openexr;
external_osl -- external_zlib;
external_osl -- external_flexbison;
external_osl -- external_openimageio;
external_osl -- external_pugixml;
external_png -- external_zlib;
external_python_site_packages -- Make_Python_Environment;
external_sndfile -- external_ogg;
external_sndfile -- external_vorbis;
external_sndfile -- external_flac;
external_theora -- external_vorbis;
external_theora -- external_ogg;
external_tiff -- external_zlib;
external_vorbis -- external_ogg;
blender-- external_ffmpeg;
blender-- external_alembic;
blender-- external_openjpeg;
blender-- external_opencolorio;
blender-- external_openexr;
blender-- external_opensubdiv;
blender-- openvdb;
blender-- external_osl;
blender-- external_boost;
blender-- external_jpeg;
blender-- external_png;
blender-- external_python;
blender-- external_sndfile;
blender-- external_iconv;
blender-- external_fftw3;
external_python-- external_python_site_packages;
external_python_site_packages-- requests;
external_python_site_packages-- idna;
external_python_site_packages-- chardet;
external_python_site_packages-- urllib3;
external_python_site_packages-- certifi;
external_python-- external_numpy;
external_usd-- external_boost;
external_usd-- external_tbb;
blender-- external_opencollada;
blender-- external_sdl;
blender-- external_freetype;
blender-- external_pthreads;
blender-- external_zlib;
blender-- external_openal;
blender-- external_usd;
}

File diff suppressed because it is too large.

View File

@@ -1,2 +0,0 @@
# Files contains mixed line endings, patch needs to preserve them to apply.
opencollada.diff binary

View File

@@ -10,84 +10,22 @@ diff -Naur src/blosc/CMakeLists.txt external_blosc/blosc/CMakeLists.txt
endif(NOT Threads_FOUND)
else(WIN32)
find_package(Threads REQUIRED)
diff -Naur src/CMakeLists.txt external_blosc/CMakeLists.txt
--- src/CMakeLists.txt 2016-02-03 10:26:28 -0700
+++ external_blosc/CMakeLists.txt 2017-03-03 09:03:31 -0700
@@ -17,8 +17,8 @@
# do not include support for the Snappy library
# DEACTIVATE_ZLIB: default OFF
# do not include support for the Zlib library
-# PREFER_EXTERNAL_COMPLIBS: default ON
-# when found, use the installed compression libs instead of included sources
+# PREFER_EXTERNAL_ZLIB: default ON
+# when found, use the installed zlib instead of included sources
# TEST_INCLUDE_BENCH_SINGLE_1: default ON
# add a test that runs the benchmark program passing "single" with 1 thread
# as first parameter
@@ -80,29 +80,23 @@
"Do not include support for the SNAPPY library." OFF)
option(DEACTIVATE_ZLIB
"Do not include support for the ZLIB library." OFF)
-option(PREFER_EXTERNAL_COMPLIBS
- "When found, use the installed compression libs instead of included sources." ON)
+option(PREFER_EXTERNAL_ZLIB
+ "When found, use the installed zlib instead of included sources." ON)
set(CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/cmake")
-if(NOT PREFER_EXTERNAL_COMPLIBS)
+if(NOT PREFER_EXTERNAL_ZLIB)
message(STATUS "Finding external libraries disabled. Using internal sources.")
-endif(NOT PREFER_EXTERNAL_COMPLIBS)
+endif(NOT PREFER_EXTERNAL_ZLIB)
if(NOT DEACTIVATE_LZ4)
- if(PREFER_EXTERNAL_COMPLIBS)
- find_package(LZ4)
- endif(PREFER_EXTERNAL_COMPLIBS)
# HAVE_LZ4 will be set to true because even if the library is
# not found, we will use the included sources for it
set(HAVE_LZ4 TRUE)
endif(NOT DEACTIVATE_LZ4)
if(NOT DEACTIVATE_SNAPPY)
- if(PREFER_EXTERNAL_COMPLIBS)
- find_package(Snappy)
- endif(PREFER_EXTERNAL_COMPLIBS)
# HAVE_SNAPPY will be set to true because even if the library is not found,
# we will use the included sources for it
set(HAVE_SNAPPY TRUE)
@@ -110,13 +104,13 @@
if(NOT DEACTIVATE_ZLIB)
# import the ZLIB_ROOT environment variable to help finding the zlib library
- if(PREFER_EXTERNAL_COMPLIBS)
+ if(PREFER_EXTERNAL_ZLIB)
set(ZLIB_ROOT $ENV{ZLIB_ROOT})
find_package( ZLIB )
if (NOT ZLIB_FOUND )
message(STATUS "No zlib found. Using internal sources.")
endif (NOT ZLIB_FOUND )
- endif(PREFER_EXTERNAL_COMPLIBS)
+ endif(PREFER_EXTERNAL_ZLIB)
# HAVE_ZLIB will be set to true because even if the library is not found,
# we will use the included sources for it
set(HAVE_ZLIB TRUE)
diff -Naur external_blosc.orig/blosc/blosc.c external_blosc/blosc/blosc.c
--- external_blosc.orig/blosc/blosc.c 2018-07-30 04:56:38 -0600
+++ external_blosc/blosc/blosc.c 2018-08-11 15:27:26 -0600
@@ -41,12 +41,7 @@
@@ -56,14 +56,7 @@
#include <inttypes.h>
#endif /* _WIN32 */
-/* Include the win32/pthread.h library for all the Windows builds. See #224. */
-#if defined(_WIN32)
- #include "win32/pthread.h"
- #include "win32/pthread.c"
-#else
- #include <pthread.h>
-#endif
-
+#include <pthread.h>
/* Some useful units */
#define KB 1024

View File
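Note: blosc.diff above strips the bundled win32 pthread shim and narrows the external-library switch to zlib only (`PREFER_EXTERNAL_ZLIB`). A hedged sketch of how the deps builder could steer the patched Blosc at the zlib built by these scripts; the argument list is an assumption, not the actual blosc.cmake.

```cmake
# Assumed arguments, for illustration: point the patched Blosc at the
# already-built zlib so PREFER_EXTERNAL_ZLIB resolves to it.
set(EXAMPLE_BLOSC_ARGS
  -DPREFER_EXTERNAL_ZLIB=ON
  -DZLIB_INCLUDE_DIR=${LIBDIR}/zlib/include
  -DZLIB_LIBRARY=${LIBDIR}/zlib/lib/${ZLIB_LIBRARY}
)
# Alternatively, export ZLIB_ROOT=${LIBDIR}/zlib before configuring;
# the patched CMakeLists reads it from the environment.
```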

@@ -29,6 +29,9 @@ ENDIF()
SET(_blosc_SEARCH_DIRS
${BLOSC_ROOT_DIR}
/usr/local
/sw # Fink
/opt/local # DarwinPorts
/opt/lib/blosc
)

View File

@@ -29,6 +29,9 @@ ENDIF()
SET(_cppunit_SEARCH_DIRS
${CPPUNIT_ROOT_DIR}
/usr/local
/sw # Fink
/opt/local # DarwinPorts
/opt/lib/cppunit
)

View File

@@ -26,8 +26,8 @@ include(FindPackageMessage)
include(SelectLibraryConfigurations)
if(ILMBASE_USE_STATIC_LIBS)
set(_ilmbase_ORIG_CMAKE_FIND_LIBRARY_SUFFIXES ${CMAKE_FIND_LIBRARY_SUFFIXES})
if( ILMBASE_USE_STATIC_LIBS )
set( _ilmbase_ORIG_CMAKE_FIND_LIBRARY_SUFFIXES ${CMAKE_FIND_LIBRARY_SUFFIXES})
if(WIN32)
set(CMAKE_FIND_LIBRARY_SUFFIXES .lib .a ${CMAKE_FIND_LIBRARY_SUFFIXES})
else()
@@ -124,6 +124,7 @@ set(IlmBase_generic_include_paths
/usr/include
/usr/include/${CMAKE_LIBRARY_ARCHITECTURE}
/usr/local/include
/sw/include
/opt/local/include
)
set(IlmBase_generic_library_paths
@@ -132,6 +133,7 @@ set(IlmBase_generic_library_paths
/usr/lib/${CMAKE_LIBRARY_ARCHITECTURE}
/usr/local/lib
/usr/local/lib/${CMAKE_LIBRARY_ARCHITECTURE}
/sw/lib
/opt/local/lib
)
@@ -190,11 +192,11 @@ if(ILMBASE_CUSTOM)
set(IlmBase_Libraries ${ILMBASE_CUSTOM_LIBRARIES})
separate_arguments(IlmBase_Libraries)
else()
# elseif(${ILMBASE_VERSION} VERSION_LESS "2.1")
#elseif(${ILMBASE_VERSION} VERSION_LESS "2.1")
set(IlmBase_Libraries Half Iex Imath IlmThread)
# else()
# string(REGEX REPLACE "([0-9]+)[.]([0-9]+).*" "\\1_\\2" _ilmbase_libs_ver ${ILMBASE_VERSION})
# set(IlmBase_Libraries Half Iex-${_ilmbase_libs_ver} Imath-${_ilmbase_libs_ver} IlmThread-${_ilmbase_libs_ver})
#else()
# string(REGEX REPLACE "([0-9]+)[.]([0-9]+).*" "\\1_\\2" _ilmbase_libs_ver ${ILMBASE_VERSION})
# set(IlmBase_Libraries Half Iex-${_ilmbase_libs_ver} Imath-${_ilmbase_libs_ver} IlmThread-${_ilmbase_libs_ver})
endif()
@@ -247,7 +249,7 @@ if(ILMBASE_FOUND)
endif()
# Restore the original find library ordering
if(ILMBASE_USE_STATIC_LIBS )
if( ILMBASE_USE_STATIC_LIBS )
set(CMAKE_FIND_LIBRARY_SUFFIXES ${_ilmbase_ORIG_CMAKE_FIND_LIBRARY_SUFFIXES})
endif()

View File

@@ -29,6 +29,9 @@ ENDIF()
SET(_logc4plus_SEARCH_DIRS
${LOGC4PLUS_ROOT_DIR}
/usr/local
/sw # Fink
/opt/local # DarwinPorts
/opt/lib/logc4plus
)

View File

@@ -119,6 +119,7 @@ set(OpenEXR_generic_include_paths
/usr/include
/usr/include/${CMAKE_LIBRARY_ARCHITECTURE}
/usr/local/include
/sw/include
/opt/local/include
)
set(OpenEXR_generic_library_paths
@@ -127,6 +128,7 @@ set(OpenEXR_generic_library_paths
/usr/lib/${CMAKE_LIBRARY_ARCHITECTURE}
/usr/local/lib
/usr/local/lib/${CMAKE_LIBRARY_ARCHITECTURE}
/sw/lib
/opt/local/lib
)
@@ -186,11 +188,11 @@ if(OPENEXR_CUSTOM)
endif()
set(OpenEXR_Library ${OPENEXR_CUSTOM_LIBRARY})
else()
# elseif(${OPENEXR_VERSION} VERSION_LESS "2.1")
#elseif(${OPENEXR_VERSION} VERSION_LESS "2.1")
set(OpenEXR_Library IlmImf)
# else()
# string(REGEX REPLACE "([0-9]+)[.]([0-9]+).*" "\\1_\\2" _openexr_libs_ver ${OPENEXR_VERSION})
# set(OpenEXR_Library IlmImf-${_openexr_libs_ver})
#else()
# string(REGEX REPLACE "([0-9]+)[.]([0-9]+).*" "\\1_\\2" _openexr_libs_ver ${OPENEXR_VERSION})
# set(OpenEXR_Library IlmImf-${_openexr_libs_ver})
endif()
# Locate the OpenEXR library
@@ -230,7 +232,7 @@ if(OPENEXR_FOUND)
endif()
# Restore the original find library ordering
if(OPENEXR_USE_STATIC_LIBS )
if( OPENEXR_USE_STATIC_LIBS )
set(CMAKE_FIND_LIBRARY_SUFFIXES ${_openexr_ORIG_CMAKE_FIND_LIBRARY_SUFFIXES})
endif()

View File

@@ -29,6 +29,9 @@ ENDIF()
SET(_tbb_SEARCH_DIRS
${TBB_ROOT_DIR}
/usr/local
/sw # Fink
/opt/local # DarwinPorts
/opt/lib/tbb
)

View File

@@ -0,0 +1,18 @@
diff -NaurBb b/CMakeLists.txt a/CMakeLists.txt
--- b/CMakeLists.txt 2018-05-01 12:45:46 -0600
+++ a/CMakeLists.txt 2018-08-08 13:03:22 -0600
@@ -229,9 +229,12 @@
endif ()
string(REPLACE "/undef " "#undef "
FTCONFIG_H "${FTCONFIG_H}")
- file(WRITE "${PROJECT_BINARY_DIR}/include/freetype/config/ftconfig.h"
- "${FTCONFIG_H}")
+else()
+ file(READ "${PROJECT_SOURCE_DIR}/include/freetype/config/ftconfig.h"
+ FTCONFIG_H)
endif ()
+file(WRITE "${PROJECT_BINARY_DIR}/include/freetype/config/ftconfig.h"
+ "${FTCONFIG_H}")
# Create the options file

View File

@@ -1,13 +0,0 @@
--- a/lib/Support/Unix/Path.inc 2020-02-17 09:24:26.000000000 +0100
+++ b/lib/Support/Unix/Path.inc 2020-02-17 09:26:25.000000000 +0100
@@ -1200,7 +1200,9 @@
/// implementation.
std::error_code copy_file(const Twine &From, const Twine &To) {
uint32_t Flag = COPYFILE_DATA;
-#if __has_builtin(__builtin_available) && defined(COPYFILE_CLONE)
+ // BLENDER: This optimization makes LLVM not build on older Xcode versions,
+ // just disable until everyone has new enough Xcode versions.
+#if 0
if (__builtin_available(macos 10.12, *)) {
bool IsSymlink;
if (std::error_code Error = is_symlink_file(From, IsSymlink))

View File

@@ -16,19 +16,6 @@ index 95abbe2..4f14f30 100644
message("WARNING: Native PCRE not found, taking PCRE from ./Externals")
add_definitions(-DPCRE_STATIC)
add_subdirectory(${EXTERNAL_LIBRARIES}/pcre)
diff --git a/DAEValidator/CMakeLists.txt b/DAEValidator/CMakeLists.txt
index 03ad540..f7d05cf 100644
--- a/DAEValidator/CMakeLists.txt
+++ b/DAEValidator/CMakeLists.txt
@@ -98,7 +98,7 @@ if (WIN32)
# C4710: 'function' : function not inlined
# C4711: function 'function' selected for inline expansion
# C4820: 'bytes' bytes padding added after construct 'member_name'
- set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /MP /Wall /WX /wd4505 /wd4514 /wd4592 /wd4710 /wd4711 /wd4820")
+ set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /MP /Wall /wd4505 /wd4514 /wd4592 /wd4710 /wd4711 /wd4820")
else ()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11 -Wall -Werror")
endif ()
diff --git a/DAEValidator/library/include/no_warning_begin b/DAEValidator/library/include/no_warning_begin
index 7a69c32..defb315 100644
--- a/DAEValidator/library/include/no_warning_begin
@@ -43,36 +30,6 @@ index 7a69c32..defb315 100644
# if defined(_MSC_VER) && defined(_DEBUG)
# pragma warning(disable:4548)
# endif
diff --git a/DAEValidator/library/src/ArgumentParser.cpp b/DAEValidator/library/src/ArgumentParser.cpp
index 897e4dc..98a69ff 100644
--- a/DAEValidator/library/src/ArgumentParser.cpp
+++ b/DAEValidator/library/src/ArgumentParser.cpp
@@ -6,10 +6,10 @@
using namespace std;
-#ifdef _MSC_VER
-#define NOEXCEPT _NOEXCEPT
-#else
+#ifndef _NOEXCEPT
#define NOEXCEPT noexcept
+#else
+#define NOEXCEPT _NOEXCEPT
#endif
namespace opencollada
diff --git a/Externals/LibXML/CMakeLists.txt b/Externals/LibXML/CMakeLists.txt
index 40081e7..e1d1bfa 100644
--- a/Externals/LibXML/CMakeLists.txt
+++ b/Externals/LibXML/CMakeLists.txt
@@ -9,6 +9,7 @@ add_definitions(
-DLIBXML_SCHEMAS_ENABLED
-DLIBXML_XPATH_ENABLED
-DLIBXML_TREE_ENABLED
+ -DLIBXML_STATIC
)
if(USE_STATIC_MSVC_RUNTIME)
diff --git a/GeneratedSaxParser/src/GeneratedSaxParserUtils.cpp b/GeneratedSaxParser/src/GeneratedSaxParserUtils.cpp
index 1f9a3ee..d151e9a 100644
--- a/GeneratedSaxParser/src/GeneratedSaxParserUtils.cpp

View File

@@ -117,15 +117,3 @@ diff '--ignore-matching-lines=:' -ur '--exclude=*.svn*' -u -r
if(OIDN_STATIC_LIB)
set(OIDN_LIB_TYPE STATIC)
else()
diff -Naur orig/core/api.cpp external_openimagedenoise/core/api.cpp
--- orig/core/api.cpp 2019-07-19 08:37:04 -0600
+++ external_openimagedenoise/core/api.cpp 2020-01-21 15:10:56 -0700
@@ -15,7 +15,7 @@
// ======================================================================== //
#ifdef _WIN32
-# define OIDN_API extern "C" __declspec(dllexport)
+# define OIDN_API extern "C"
#else
# define OIDN_API extern "C" __attribute__ ((visibility ("default")))
#endif

View File

@@ -0,0 +1,13 @@
--- idiff.cpp 2016-11-18 11:42:01 -0700
+++ idiff.cpp 2016-11-18 11:41:25 -0700
@@ -308,8 +308,10 @@
// printed with three digit exponent. We change this behaviour to fit
// Linux way
#ifdef _MSC_VER
+#if _MSC_VER < 1900
_set_output_format(_TWO_DIGIT_EXPONENT);
#endif
+#endif
std::streamsize precis = std::cout.precision();
std::cout << " " << cr.nwarn << " pixels ("
<< std::setprecision(3) << (100.0*cr.nwarn / npels)

View File

@@ -0,0 +1,12 @@
--- CMakeLists.txt 2019-04-13 21:03:18.119000842 +0200
+++ CMakeLists.txt 2019-04-13 20:59:01.521760285 +0200
@@ -92,7 +92,7 @@
set (OPENMESH_LIBRARY_DIR "${_OPENMESH_LIBRARY_DIR}" CACHE PATH "The directory where the OpenMesh libraries can be found.")
endif()
-add_subdirectory (Doc)
+#add_subdirectory (Doc)
# ========================================================================
# Bundle generation (Targets exist, now configure them)

View File

@@ -1,73 +1,102 @@
diff -Naur orig/cmake/FindIlmBase.cmake openvdb/cmake/FindIlmBase.cmake
--- orig/cmake/FindIlmBase.cmake 2019-12-06 13:11:33 -0700
+++ openvdb/cmake/FindIlmBase.cmake 2020-01-16 09:06:32 -0700
@@ -225,6 +225,12 @@
list(APPEND CMAKE_FIND_LIBRARY_SUFFIXES
"-${IlmBase_VERSION_MAJOR}_${IlmBase_VERSION_MINOR}.lib"
diff -Naur openvdb.orig/openvdb/CMakeLists.txt openvdb/openvdb/CMakeLists.txt
--- openvdb.orig/openvdb/CMakeLists.txt 2018-04-10 12:22:17 -0600
+++ openvdb/openvdb/CMakeLists.txt 2018-08-15 19:04:52 -0600
@@ -82,6 +82,9 @@
IF (WIN32 AND OPENVDB_DISABLE_BOOST_IMPLICIT_LINKING)
ADD_DEFINITIONS ( -DBOOST_ALL_NO_LIB )
ENDIF ()
+if(WIN32)
+ADD_DEFINITIONS ( -D__TBB_NO_IMPLICIT_LINKAGE )
+endif()
FIND_PACKAGE ( Blosc REQUIRED )
FIND_PACKAGE ( TBB REQUIRED )
@@ -195,6 +198,7 @@
${Ilmbase_HALF_LIBRARY}
${ZLIB_LIBRARY}
${BLOSC_blosc_LIBRARY}
+ ${EXTRA_LIBS}
)
+ list(APPEND CMAKE_FIND_LIBRARY_SUFFIXES
+ "_s.lib"
+ )
+ list(APPEND CMAKE_FIND_LIBRARY_SUFFIXES
+ "_s_d.lib"
+ )
else()
if(ILMBASE_USE_STATIC_LIBS)
list(APPEND CMAKE_FIND_LIBRARY_SUFFIXES
diff -Naur orig/cmake/FindOpenEXR.cmake openvdb/cmake/FindOpenEXR.cmake
--- orig/cmake/FindOpenEXR.cmake 2019-12-06 13:11:33 -0700
+++ openvdb/cmake/FindOpenEXR.cmake 2020-01-16 09:06:39 -0700
@@ -218,6 +218,12 @@
list(APPEND CMAKE_FIND_LIBRARY_SUFFIXES
"-${OpenEXR_VERSION_MAJOR}_${OpenEXR_VERSION_MINOR}.lib"
IF (WIN32)
@@ -225,13 +228,16 @@
${VDB_PRINT_SOURCE_FILES}
)
+ list(APPEND CMAKE_FIND_LIBRARY_SUFFIXES
+ "_s.lib"
+if(NOT WIN32)
+ set(EXTRA_LIBS m stdc++ dl)
+endif()
+
TARGET_LINK_LIBRARIES ( vdb_print
openvdb_shared
${CMAKE_THREAD_LIBS_INIT}
${BLOSC_blosc_LIBRARY}
- m
- stdc++
- )
+ ${EXTRA_LIBS}
+)
SET ( VDB_RENDER_SOURCE_FILES cmd/openvdb_render/main.cc )
SET_SOURCE_FILES_PROPERTIES ( ${VDB_RENDER_SOURCE_FILES}
@@ -249,8 +255,7 @@
${Openexr_ILMIMF_LIBRARY}
${Ilmbase_ILMTHREAD_LIBRARY}
${Ilmbase_IEX_LIBRARY}
- m
- stdc++
+ ${EXTRA_LIBS}
)
SET ( VDB_VIEW_SOURCE_FILES
@@ -270,7 +270,7 @@
PROPERTIES
COMPILE_FLAGS "-DOPENVDB_USE_BLOSC ${OPENVDB_USE_GLFW_FLAG} -DGL_GLEXT_PROTOTYPES=1"
)
-IF (NOT WIN32)
+IF (FALSE)
ADD_EXECUTABLE ( vdb_view
${VDB_VIEW_SOURCE_FILES}
)
@@ -283,9 +288,8 @@
${GLFW_LINK_LIBRARY}
${GLFW_DEPENDENT_LIBRARIES}
${GLEW_GLEW_LIBRARY}
- m
- stdc++
- )
+ ${EXTRA_LIBS}
+ )
+ list(APPEND CMAKE_FIND_LIBRARY_SUFFIXES
+ "_s_d.lib"
+ )
else()
if(OPENEXR_USE_STATIC_LIBS)
list(APPEND CMAKE_FIND_LIBRARY_SUFFIXES
diff -Naur orig/openvdb/CMakeLists.txt openvdb/openvdb/CMakeLists.txt
--- orig/openvdb/CMakeLists.txt 2019-12-06 13:11:33 -0700
+++ openvdb/openvdb/CMakeLists.txt 2020-01-16 08:56:25 -0700
@@ -193,11 +193,12 @@
if(OPENVDB_DISABLE_BOOST_IMPLICIT_LINKING)
add_definitions(-DBOOST_ALL_NO_LIB)
endif()
+ add_definitions(-D__TBB_NO_IMPLICIT_LINKAGE -DOPENVDB_OPENEXR_STATICLIB)
endif()
ENDIF ()
# @todo Should be target definitions
if(WIN32)
- add_definitions(-D_WIN32 -DNOMINMAX -DOPENVDB_DLL)
+ add_definitions(-D_WIN32 -DNOMINMAX -DOPENVDB_STATICLIB)
endif()
SET ( UNITTEST_SOURCE_FILES
@@ -392,8 +396,7 @@
TARGET_LINK_LIBRARIES ( vdb_test
${CPPUnit_cppunit_LIBRARY}
openvdb_shared
- m
- stdc++
+ ${EXTRA_LIBS}
)
##### Core library configuration
diff -Naur orig/openvdb/cmd/CMakeLists.txt openvdb/openvdb/cmd/CMakeLists.txt
--- orig/openvdb/cmd/CMakeLists.txt 2019-12-06 13:11:33 -0700
+++ openvdb/openvdb/cmd/CMakeLists.txt 2020-01-16 08:56:25 -0700
@@ -53,7 +53,7 @@
endif()
ADD_TEST ( vdb_unit_test vdb_test )
@@ -422,7 +422,7 @@
ENDIF ()
if(WIN32)
- add_definitions(-D_WIN32 -DNOMINMAX -DOPENVDB_DLL)
+ add_definitions(-D_WIN32 -DNOMINMAX -DOPENVDB_STATICLIB)
endif()
# rpath handling
diff -Naur orig/openvdb/unittest/CMakeLists.txt openvdb/openvdb/unittest/CMakeLists.txt
--- orig/openvdb/unittest/CMakeLists.txt 2019-12-06 13:11:33 -0700
+++ openvdb/openvdb/unittest/CMakeLists.txt 2020-01-16 08:56:25 -0700
@@ -49,7 +49,7 @@
endif()
if(WIN32)
- add_definitions(-D_WIN32 -DNOMINMAX -DOPENVDB_DLL)
+ add_definitions(-D_WIN32 -DNOMINMAX -DOPENVDB_STATICLIB)
endif()
##### VDB unit tests
# Installation
-IF ( NOT WIN32 )
+IF ( FALSE )
INSTALL ( TARGETS
vdb_view
DESTINATION
diff -Naur openvdb.orig/openvdb/math/Coord.h openvdb/openvdb/math/Coord.h
--- openvdb.orig/openvdb/math/Coord.h 2018-04-10 12:22:17 -0600
+++ openvdb/openvdb/math/Coord.h 2018-08-15 20:32:43 -0600
@@ -35,6 +35,7 @@
#include <array> // for std::array
#include <iostream>
#include <limits>
+#define NOMINMAX
#include <openvdb/Platform.h>
#include "Math.h"
#include "Vec3.h"

View File

@@ -1,3 +1,15 @@
diff -Naur OpenShadingLanguage-Release-1.9.9/src/cmake/flexbison.cmake external_osl/src/cmake/flexbison.cmake
--- OpenShadingLanguage-Release-1.9.9/src/cmake/flexbison.cmake 2018-05-01 16:39:02 -0600
+++ external_osl/src/cmake/flexbison.cmake 2018-08-23 15:42:27 -0600
@@ -77,7 +77,7 @@
DEPENDS ${${compiler_headers}}
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR} )
ADD_CUSTOM_COMMAND ( OUTPUT ${flexoutputcxx}
- COMMAND ${FLEX_EXECUTABLE} -o ${flexoutputcxx} "${CMAKE_CURRENT_SOURCE_DIR}/${flexsrc}"
+ COMMAND ${FLEX_EXECUTABLE} ${FLEX_EXTRA_OPTIONS} -o ${flexoutputcxx} "${CMAKE_CURRENT_SOURCE_DIR}/${flexsrc}"
MAIN_DEPENDENCY ${flexsrc}
DEPENDS ${${compiler_headers}}
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR} )
diff -Naur OpenShadingLanguage-Release-1.9.9/src/cmake/flexbison.cmake.rej external_osl/src/cmake/flexbison.cmake.rej
--- OpenShadingLanguage-Release-1.9.9/src/cmake/flexbison.cmake.rej 1969-12-31 17:00:00 -0700
+++ external_osl/src/cmake/flexbison.cmake.rej 2018-08-24 17:42:11 -0600
@@ -33,6 +45,18 @@ diff -Naur OpenShadingLanguage-Release-1.9.9/src/include/OSL/llvm_util.h externa
private:
class MemoryManager;
diff -Naur OpenShadingLanguage-Release-1.9.9/src/include/OSL/oslnoise.h external_osl/src/include/OSL/oslnoise.h
--- OpenShadingLanguage-Release-1.9.9/src/include/OSL/oslnoise.h 2018-05-01 16:39:02 -0600
+++ external_osl/src/include/OSL/oslnoise.h 2018-08-24 17:42:11 -0600
@@ -762,7 +762,7 @@
// packed into a float4. We assume T is float and VECTYPE is float4,
// but it also works if T is Dual2<float> and VECTYPE is Dual2<float4>.
template<typename T, typename VECTYPE>
-OIIO_FORCEINLINE T bilerp (VECTYPE abcd, T u, T v) {
+OIIO_FORCEINLINE T bilerp (VECTYPE& abcd, T u, T v) {
VECTYPE xx = OIIO::lerp (abcd, OIIO::simd::shuffle<1,1,3,3>(abcd), u);
return OIIO::simd::extract<0>(OIIO::lerp (xx,OIIO::simd::shuffle<2>(xx), v));
}
diff -Naur OpenShadingLanguage-Release-1.9.9/src/liboslexec/llvm_util.cpp external_osl/src/liboslexec/llvm_util.cpp
--- OpenShadingLanguage-Release-1.9.9/src/liboslexec/llvm_util.cpp 2018-05-01 16:39:02 -0600
+++ external_osl/src/liboslexec/llvm_util.cpp 2018-08-25 14:04:27 -0600

View File

@@ -0,0 +1,44 @@
--- CMakeLists.txt 2019-01-13 19:55:24.920548897 +0100
+++ CMakeLists.txt 2019-04-13 23:36:50.517127296 +0200
@@ -129,12 +129,20 @@
add_subdirectory(tests)
endif()
+set_target_properties(QEx PROPERTIES PUBLIC_HEADER "interfaces/c/qex.h")
+set_target_properties(QExStatic PROPERTIES PUBLIC_HEADER "interfaces/c/qex.h")
-find_package(Doxygen)
-if (DOXYGEN_FOUND)
- configure_file(${CMAKE_CURRENT_SOURCE_DIR}/Doxyfile.in ${CMAKE_CURRENT_BINARY_DIR}/doc/Doxyfile @ONLY)
- add_custom_target(qex_doc
- ${DOXYGEN_EXECUTABLE} ${CMAKE_CURRENT_BINARY_DIR}/doc/Doxyfile
- WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}/doc
- COMMENT "Generating Doxygen documentation" VERBATIM)
-endif()
+install(TARGETS QEx QExStatic
+ LIBRARY DESTINATION lib
+ ARCHIVE DESTINATION lib
+ PUBLIC_HEADER DESTINATION include
+)
+
+#find_package(Doxygen)
+#if (DOXYGEN_FOUND)
+# configure_file(${CMAKE_CURRENT_SOURCE_DIR}/Doxyfile.in ${CMAKE_CURRENT_BINARY_DIR}/doc/Doxyfile @ONLY)
+# add_custom_target(qex_doc
+# ${DOXYGEN_EXECUTABLE} ${CMAKE_CURRENT_BINARY_DIR}/doc/Doxyfile
+# WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}/doc
+# COMMENT "Generating Doxygen documentation" VERBATIM)
+#endif()
--- src/predicates.c 2019-04-14 22:27:37.975746000 +0200
+++ src/predicates.c 2019-04-14 22:27:50.059746000 +0200
@@ -1227,7 +1227,7 @@
/* */
/*****************************************************************************/
-int compress(elen, e, h) /* e and h may be the same. */
+int compress1(elen, e, h) /* e and h may be the same. */
int elen;
REAL *e;
REAL *h;

View File

@@ -1,139 +0,0 @@
diff -x .git -ur usd.orig/cmake/defaults/Options.cmake external_usd/cmake/defaults/Options.cmake
--- usd.orig/cmake/defaults/Options.cmake 2019-10-24 22:39:53.000000000 +0200
+++ external_usd/cmake/defaults/Options.cmake 2019-11-28 13:00:33.197957712 +0100
@@ -25,6 +25,7 @@
option(PXR_VALIDATE_GENERATED_CODE "Validate script generated code" OFF)
option(PXR_HEADLESS_TEST_MODE "Disallow GUI based tests, useful for running under headless CI systems." OFF)
option(PXR_BUILD_TESTS "Build tests" ON)
+option(PXR_BUILD_USD_TOOLS "Build commandline tools" ON)
option(PXR_BUILD_IMAGING "Build imaging components" ON)
option(PXR_BUILD_EMBREE_PLUGIN "Build embree imaging plugin" OFF)
option(PXR_BUILD_OPENIMAGEIO_PLUGIN "Build OpenImageIO plugin" OFF)
diff -x .git -ur usd.orig/cmake/defaults/Packages.cmake external_usd/cmake/defaults/Packages.cmake
--- usd.orig/cmake/defaults/Packages.cmake 2019-10-24 22:39:53.000000000 +0200
+++ external_usd/cmake/defaults/Packages.cmake 2019-11-28 13:00:33.185957483 +0100
@@ -64,7 +64,7 @@
endif()
# --TBB
-find_package(TBB REQUIRED COMPONENTS tbb)
+find_package(TBB)
add_definitions(${TBB_DEFINITIONS})
# --math
diff -x .git -ur usd.orig/pxr/base/lib/plug/initConfig.cpp external_usd/pxr/base/lib/plug/initConfig.cpp
--- usd.orig/pxr/base/lib/plug/initConfig.cpp 2019-10-24 22:39:53.000000000 +0200
+++ external_usd/pxr/base/lib/plug/initConfig.cpp 2019-12-11 11:00:37.643323127 +0100
@@ -69,8 +69,38 @@
ARCH_CONSTRUCTOR(Plug_InitConfig, 2, void)
{
+ /* The contents of this constructor have been moved to usd_initialise_plugin_path(...) */
+}
+
+}; // end of anonymous namespace
+
+/**
+ * The contents of this function used to be in the static constructor Plug_InitConfig.
+ * This static constructor made it impossible for Blender to pass a path to the USD
+ * library at runtime, as the constructor would run before Blender's main() function.
+ *
+ * This function is wrapped in a C function of the same name (defined below),
+ * so that it can be called from Blender's main() function.
+ *
+ * The datafiles_usd_path path is used to point to the USD plugin path when Blender
+ * has been installed. The fallback_usd_path path should point to the build-time
+ * location of the USD plugin files so that Blender can be run on a development machine
+ * without requiring an installation step.
+ */
+void
+usd_initialise_plugin_path(const char *datafiles_usd_path)
+{
std::vector<std::string> result;
+ // Add Blender-specific paths. They MUST end in a slash, or symlinks will not be treated as directory.
+ if (datafiles_usd_path != NULL && datafiles_usd_path[0] != '\0') {
+ std::string datafiles_usd_path_str(datafiles_usd_path);
+ if (datafiles_usd_path_str.back() != '/') {
+ datafiles_usd_path_str += "/";
+ }
+ result.push_back(datafiles_usd_path_str);
+ }
+
// Determine the absolute path to the Plug shared library.
// Any relative paths specified in the plugin search path will be
// anchored to this directory, to allow for relocatability.
@@ -94,9 +124,24 @@
_AppendPathList(&result, installLocation, sharedLibPath);
#endif // PXR_INSTALL_LOCATION
- Plug_SetPaths(result);
-}
+ if (!TfGetenv("PXR_PATH_DEBUG").empty()) {
+ printf("USD Plugin paths: (%zu in total):\n", result.size());
+ for(const std::string &path : result) {
+ printf(" %s\n", path.c_str());
+ }
+ }
+ Plug_SetPaths(result);
}
PXR_NAMESPACE_CLOSE_SCOPE
+
+/* Workaround to make it possible to pass a path at runtime to USD. */
+extern "C" {
+void
+usd_initialise_plugin_path(
+ const char *datafiles_usd_path)
+{
+ PXR_NS::usd_initialise_plugin_path(datafiles_usd_path);
+}
+}
diff -x .git -ur usd.orig/pxr/usd/CMakeLists.txt external_usd/pxr/usd/CMakeLists.txt
--- usd.orig/pxr/usd/CMakeLists.txt 2019-10-24 22:39:53.000000000 +0200
+++ external_usd/pxr/usd/CMakeLists.txt 2019-11-28 13:00:33.197957712 +0100
@@ -1,6 +1,5 @@
set(DIRS
lib
- bin
plugin
)
@@ -8,3 +7,8 @@
add_subdirectory(${d})
endforeach()
+if (PXR_BUILD_USD_TOOLS)
+ add_subdirectory(bin)
+else()
+ message(STATUS "Skipping commandline tools because PXR_BUILD_USD_TOOLS=OFF")
+endif()
diff -Naur external_usd_orig/pxr/base/lib/tf/preprocessorUtils.h external_usd/pxr/base/lib/tf/preprocessorUtils.h
--- external_usd_orig/pxr/base/lib/tf/preprocessorUtils.h 2019-10-24 14:39:53 -0600
+++ external_usd/pxr/base/lib/tf/preprocessorUtils.h 2020-01-14 09:30:18 -0700
@@ -189,7 +189,7 @@
/// Exapnds to 1 if the argument is a tuple, and 0 otherwise.
/// \ingroup group_tf_Preprocessor
/// \hideinitializer
-#if defined(ARCH_OS_WINDOWS)
+#if defined(ARCH_COMPILER_MSVC)
#define TF_PP_IS_TUPLE(sequence) \
BOOST_VMD_IS_TUPLE(sequence)
#else
diff -Naur external_usd_base/cmake/macros/Public.cmake external_usd/cmake/macros/Public.cmake
--- external_usd_base/cmake/macros/Public.cmake 2019-10-24 14:39:53 -0600
+++ external_usd/cmake/macros/Public.cmake 2020-01-11 13:33:29 -0700
@@ -996,6 +996,12 @@
foreach(lib ${PXR_OBJECT_LIBS})
string(TOUPPER ${lib} uppercaseName)
list(APPEND exports "${uppercaseName}_EXPORTS=1")
+ # When building for blender, we do NOT want to export all symbols on windows.
+ # This is a dirty hack, but USD makes it impossible to do the right thing
+ # with the default options exposed.
+ if (WIN32)
+ list(APPEND exports "PXR_STATIC=1")
+ endif()
endforeach()
foreach(lib ${PXR_OBJECT_LIBS})
set(objects "${objects};\$<TARGET_OBJECTS:${lib}>")

View File

@@ -126,7 +126,6 @@ if "%dobuild%" == "1" (
cmake --build . --target Harvest_Release_Results > Harvest_Release.txt
)
echo %DATE% %TIME% : Release Harvest done >> %StatusFile%
if "%NODEBUG%" == "1" goto exit
cd %BUILD_DIR%
mkdir %STAGING%\%BuildDir%%ARCH%D
cd %Staging%\%BuildDir%%ARCH%D

View File

@@ -1,70 +0,0 @@
Blender Buildbot
================
Code signing
------------
Code signing is done as part of INSTALL target, which makes it possible to sign
files which are aimed into a bundle and coming from a non-signed source (such as
libraries SVN).
This is achieved by specifying `slave_codesign.cmake` as a post-install script
run by CMake. This CMake script simply involves an utility script written in
Python which takes care of an actual signing.
### Configuration
Client configuration doesn't need anything special, other than variable
`SHARED_STORAGE_DIR` pointing to a location which is watched by a server.
This is done in `config_builder.py` file and is stored in Git (which makes it
possible to have almost zero-configuration buildbot machines).
Server configuration requires copying `config_server_template.py` under the
name of `config_server.py` and tweaking values, which are platform-specific.
#### Windows configuration
There are two things needed on Windows in order for code signing to work:
- `TIMESTAMP_AUTHORITY_URL` which is most likely set to http://timestamp.digicert.com
- `CERTIFICATE_FILEPATH` which is a full file path to a PKCS #12 key (.pfx).
## Tips
### Self-signed certificate on Windows
It is easiest to test the configuration using a self-signed certificate.
The certificate manipulation utilities are coming with Windows SDK.
Unfortunately, they are not added to PATH. Here is an example of how to make
sure they are easily available:
```
set PATH=C:\Program Files (x86)\Windows Kits\10\App Certification Kit;%PATH%
set PATH=C:\Program Files (x86)\Windows Kits\10\bin\10.0.18362.0\x64;%PATH%
```
Generate CA:
```
makecert -r -pe -n "CN=Blender Test CA" -ss CA -sr CurrentUser -a sha256 ^
-cy authority -sky signature -sv BlenderTestCA.pvk BlenderTestCA.cer
```
Import the generated CA:
```
certutil -user -addstore Root BlenderTestCA.cer
```
Create self-signed certificate and pack it into PKCS #12:
```
makecert -pe -n "CN=Blender Test SPC" -a sha256 -cy end ^
-sky signature ^
-ic BlenderTestCA.cer -iv BlenderTestCA.pvk ^
-sv BlenderTestSPC.pvk BlenderTestSPC.cer
pvk2pfx -pvk BlenderTestSPC.pvk -spc BlenderTestSPC.cer -pfx BlenderTestSPC.pfx
```

View File

@@ -1,125 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import argparse
import os
import re
import subprocess
import sys
def is_tool(name):
"""Check whether `name` is on PATH and marked as executable."""
# from whichcraft import which
from shutil import which
return which(name) is not None
class Builder:
def __init__(self, name, branch):
self.name = name
self.branch = branch
self.is_release_branch = re.match("^blender-v(.*)-release$", branch) is not None
# Buildbot runs from build/ directory
self.blender_dir = os.path.abspath(os.path.join('..', 'blender.git'))
self.build_dir = os.path.abspath(os.path.join('..', 'build', name))
self.install_dir = os.path.abspath(os.path.join('..', 'install', name))
self.upload_dir = os.path.abspath(os.path.join('..', 'install'))
# Detect platform
if name.startswith('mac'):
self.platform = 'mac'
self.command_prefix = []
elif name.startswith('linux'):
self.platform = 'linux'
if is_tool('scl'):
self.command_prefix = ['scl', 'enable', 'devtoolset-6', '--']
else:
self.command_prefix = []
elif name.startswith('win'):
self.platform = 'win'
self.command_prefix = []
else:
raise ValueError('Unknown platform for builder ' + name)
# Always 64 bit now
self.bits = 64
def create_builder_from_arguments():
parser = argparse.ArgumentParser()
parser.add_argument('builder_name')
parser.add_argument('branch', default='master', nargs='?')
args = parser.parse_args()
return Builder(args.builder_name, args.branch)
class VersionInfo:
def __init__(self, builder):
# Get version information
buildinfo_h = os.path.join(builder.build_dir, "source", "creator", "buildinfo.h")
blender_h = os.path.join(builder.blender_dir, "source", "blender", "blenkernel", "BKE_blender_version.h")
version_number = int(self._parse_header_file(blender_h, 'BLENDER_VERSION'))
self.version = "%d.%d" % (version_number // 100, version_number % 100)
self.version_char = self._parse_header_file(blender_h, 'BLENDER_VERSION_CHAR')
self.version_cycle = self._parse_header_file(blender_h, 'BLENDER_VERSION_CYCLE')
self.version_cycle_number = self._parse_header_file(blender_h, 'BLENDER_VERSION_CYCLE_NUMBER')
self.hash = self._parse_header_file(buildinfo_h, 'BUILD_HASH')[1:-1]
if self.version_cycle == "release":
# Final release
self.full_version = self.version + self.version_char
self.is_development_build = False
elif self.version_cycle == "rc":
# Release candidate
version_cycle = self.version_cycle + self.version_cycle_number
if len(self.version_char) == 0:
self.full_version = self.version + version_cycle
else:
self.full_version = self.version + self.version_char + '-' + version_cycle
self.is_development_build = False
else:
# Development build
self.full_version = self.version + '-' + self.hash
self.is_development_build = True
def _parse_header_file(self, filename, define):
regex = re.compile(r"^#\s*define\s+%s\s+(.*)" % define)
with open(filename, "r") as file:
for l in file:
match = regex.match(l)
if match:
return match.group(1)
return None
def call(cmd, env=None, exit_on_error=True):
print(' '.join(cmd))
# Flush to ensure correct order output on Windows.
sys.stdout.flush()
sys.stderr.flush()
retcode = subprocess.call(cmd, env=env)
if exit_on_error and retcode != 0:
sys.exit(retcode)
return retcode
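# Example usage (a hypothetical sketch; the actual buildbot scripts pass the
# builder name and branch on the command line):
#
#   builder = create_builder_from_arguments()
#   info = VersionInfo(builder)
#   print(info.full_version, info.is_development_build)
#   call(builder.command_prefix + ['cmake', '--version'])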


@@ -1,81 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
from dataclasses import dataclass
from pathlib import Path
from typing import List
@dataclass
class AbsoluteAndRelativeFileName:
"""
Helper class which keeps track of the absolute file path for direct access and
the corresponding relative path against a given base.
The relative part is used to construct a file name within an archive which
contains files which are to be signed or which have been signed already
(depending on whether the archive is addressed to the signing server or back
to the buildbot worker).
"""
# Base directory which is where relative_filepath is relative to.
base_dir: Path
# Full absolute path of the corresponding file.
absolute_filepath: Path
# Derived from full file path, contains part of the path which is relative
# to a desired base path.
relative_filepath: Path
def __init__(self, base_dir: Path, filepath: Path):
self.base_dir = base_dir
self.absolute_filepath = filepath.resolve()
self.relative_filepath = self.absolute_filepath.relative_to(
self.base_dir)
@classmethod
def from_path(cls, path: Path) -> 'AbsoluteAndRelativeFileName':
assert path.is_absolute()
assert path.is_file()
base_dir = path.parent
return AbsoluteAndRelativeFileName(base_dir, path)
@classmethod
def recursively_from_directory(cls, base_dir: Path) \
-> List['AbsoluteAndRelativeFileName']:
"""
Create a list of AbsoluteAndRelativeFileName for all the files in the
given directory.
NOTE: The result will point to resolved paths.
"""
assert base_dir.is_absolute()
assert base_dir.is_dir()
base_dir = base_dir.resolve()
result = []
for filename in base_dir.glob('**/*'):
if not filename.is_file():
continue
result.append(AbsoluteAndRelativeFileName(base_dir, filename))
return result
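# Example (hypothetical path, illustrating the absolute/relative split):
#
#   f = AbsoluteAndRelativeFileName.from_path(Path('/data/build/blender'))
#   # f.base_dir          == Path('/data/build')
#   # f.absolute_filepath == Path('/data/build/blender')
#   # f.relative_filepath == Path('blender')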


@@ -1,163 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import os
from pathlib import Path
import codesign.util as util
class ArchiveWithIndicator:
"""
The idea of this class is to wrap around logic which takes care of keeping
track of the name of an archive and the synchronization routines between the
buildbot worker and the signing server.
The synchronization is based on creating a special file once the archive
file is known to be ready for access.
"""
# Base directory where the archive is stored (basically, a basename() of
# the absolute archive file name).
#
# For example, 'X:\\TEMP\\'.
base_dir: Path
# Absolute file name of the archive.
#
# For example, 'X:\\TEMP\\FOO.ZIP'.
archive_filepath: Path
# Absolute name of a file which acts as an indication of the fact that the
# archive is ready and is available for access.
#
# This is how synchronization between buildbot worker and signing server is
# done:
# - First, the archive is created under archive_filepath name.
# - Second, the indication file is created under ready_indicator_filepath
# name.
# - Third, the counterpart of whoever created the indicator watches for
# the indication file to appear, and once it's there it accesses the
# archive.
ready_indicator_filepath: Path
def __init__(
self, base_dir: Path, archive_name: str, ready_indicator_name: str):
"""
Construct the object from given base directory and name of the archive
file:
ArchiveWithIndicator(Path('X:\\TEMP'), 'FOO.ZIP', 'INPUT_READY')
"""
self.base_dir = base_dir
self.archive_filepath = self.base_dir / archive_name
self.ready_indicator_filepath = self.base_dir / ready_indicator_name
def is_ready_unsafe(self) -> bool:
"""
Check whether the archive is ready for access.
No guarding against possible network failures is done here.
"""
if not self.ready_indicator_filepath.exists():
return False
# Sometimes on macOS indicator file appears prior to the actual archive
# despite the order of creation and os.sync() used in tag_ready().
# So consider archive not ready if there is an indicator without an
# actual archive.
if not self.archive_filepath.exists():
print('Found indicator without actual archive, waiting for archive '
f'({self.archive_filepath}) to appear.')
return False
# Read the archive size from the indicator.
#
# Assume that the file is either empty or fully written. This is checked
# by catching ValueError, since an empty string raises that exception when
# converted to int.
expected_archive_size_str = self.ready_indicator_filepath.read_text()
try:
expected_archive_size = int(expected_archive_size_str)
except ValueError:
print(f'Invalid archive size "{expected_archive_size_str}"')
return False
# Wait until the archive is fully stored.
actual_archive_size = self.archive_filepath.stat().st_size
if actual_archive_size != expected_archive_size:
print('Partial/invalid archive size (expected '
f'{expected_archive_size} got {actual_archive_size})')
return False
return True
def is_ready(self) -> bool:
"""
Check whether the archive is ready for access.
Will tolerate possible network failures: if there is a network failure
or if there is still no proper permission on the file, False is returned.
"""
# There are some intermittent problems happening at random which translate
# to "OSError : [WinError 59] An unexpected network error occurred".
# Some reports suggest it might be due to a lack of permissions on the file,
# which might be applicable in our case since it's possible that the file is
# initially created with non-accessible permissions and gets chmod-ed
# after initial creation.
try:
return self.is_ready_unsafe()
except OSError as e:
print(f'Exception checking archive: {e}')
return False
def tag_ready(self) -> None:
"""
Tag the archive as ready by creating the corresponding indication file.
NOTE: It is expected that the archive was never tagged as ready before
and that there are no subsequent tags of the same archive.
If it is violated, an assert will fail.
"""
assert not self.is_ready()
# Try our best to make sure everything is synced to the file system,
# to avoid any possibility of the stamp appearing on a network share prior
# to the actual file.
if util.get_current_platform() != util.Platform.WINDOWS:
os.sync()
archive_size = self.archive_filepath.stat().st_size
self.ready_indicator_filepath.write_text(str(archive_size))
def clean(self) -> None:
"""
Remove both archive and the ready indication file.
"""
util.ensure_file_does_not_exist_or_die(self.ready_indicator_filepath)
util.ensure_file_does_not_exist_or_die(self.archive_filepath)
def is_fully_absent(self) -> bool:
"""
Check whether both the archive and its ready indicator are absent.
Used as a sanity check during the code signing process by both the
buildbot worker and the signing server.
"""
return (not self.archive_filepath.exists() and
not self.ready_indicator_filepath.exists())
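# Sketch of the handshake described above (hypothetical paths, mirroring the
# constructor docstring):
#
#   archive = ArchiveWithIndicator(Path('X:\\TEMP'), 'FOO.ZIP', 'INPUT_READY')
#   # ... write X:\TEMP\FOO.ZIP ...
#   archive.tag_ready()        # stores the archive size in INPUT_READY
#   # The other side polls archive.is_ready() before accessing FOO.ZIP and
#   # calls archive.clean() once it is done with it.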


@@ -1,454 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Signing process overview.
#
# From buildbot worker side:
# - Files which need to be signed are collected either from a directory (to
# sign all signable files in there) or by the filename of a single file to sign.
# - Those files get packed into an archive and stored in a location which
# is watched by the signing server.
# - A marker READY file is created which indicates the archive is ready for
# access.
# - Wait for the server to provide an archive with signed files.
# This is done by watching for the READY file which corresponds to an archive
# coming from the signing server.
# - Unpack the signed files from the archive and replace the original ones.
#
# From code sign server:
# - Watch a special location for a READY file which indicates that there is an
# archive with files which are to be signed.
# - Unpack the archive to a temporary location.
# - Run codesign tool and make sure all the files are signed.
# - Pack the signed files and store them in a location which is watched by
# the buildbot worker.
# - Create a READY file which indicates that the archive with signed files is
# ready.
import abc
import logging
import shutil
import subprocess
import time
import tarfile
from pathlib import Path
from tempfile import TemporaryDirectory
from typing import Iterable, List
import codesign.util as util
from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName
from codesign.archive_with_indicator import ArchiveWithIndicator
logger = logging.getLogger(__name__)
logger_builder = logger.getChild('builder')
logger_server = logger.getChild('server')
def pack_files(files: Iterable[AbsoluteAndRelativeFileName],
archive_filepath: Path) -> None:
"""
Create a tar archive from the given files for the signing pipeline.
Used by the buildbot worker to create an archive of files which are to be
signed, and by the signing server to send signed files back to the worker.
"""
with tarfile.TarFile.open(archive_filepath, 'w') as tar_file_handle:
for file_info in files:
tar_file_handle.add(file_info.absolute_filepath,
arcname=file_info.relative_filepath)
def extract_files(archive_filepath: Path,
extraction_dir: Path) -> None:
"""
Extract all files from the given archive into the given directory.
"""
# TODO(sergey): Verify files in the archive have relative path.
with tarfile.TarFile.open(archive_filepath, mode='r') as tar_file_handle:
tar_file_handle.extractall(path=extraction_dir)
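# Example round trip of the two helpers above (hypothetical paths):
#
#   pack_files(files=files_to_sign, archive_filepath=Path('/tmp/unsigned.tar'))
#   extract_files(archive_filepath=Path('/tmp/unsigned.tar'),
#                 extraction_dir=Path('/tmp/unpacked'))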
class BaseCodeSigner(metaclass=abc.ABCMeta):
"""
Base class for a platform-specific signer of binaries.
Contains all the logic shared across platform-specific implementations, such
as synchronization and notification logic.
Platform-specific bits (such as the actual command for signing a binary) are
to be implemented in a subclass.
Provides utilities for code signing as a whole, including functionality needed
by the signing server and the buildbot worker.
The signer and builder may run on separate machines; the only requirement is
that they have access to a directory which is shared between them. For
security reasons this is to be done on a separate machine (or as a Shared
Folder in a VirtualBox configuration). This directory might be mounted under
different base paths, but its underlying storage is to be the same.
The code signer is short-lived on the buildbot worker side, and long-lived on
the code signing server side.
"""
# TODO(sergey): Find a neat way to have config annotated.
# config: Config
# Storage directory where builder puts files which are requested to be
# signed.
# Consider this an input of the code signing server.
unsigned_storage_dir: Path
# Information about archive which contains files which are to be signed.
#
# This archive is created by the buildbot worker and acts as an input for
# the code signing server.
unsigned_archive_info: ArchiveWithIndicator
# Storage where signed files are stored.
# Consider this an output of the code signer server.
signed_storage_dir: Path
# Information about archive which contains signed files.
#
# This archive is created by the code signing server.
signed_archive_info: ArchiveWithIndicator
# Platform the code is currently executing on.
platform: util.Platform
def __init__(self, config):
self.config = config
absolute_shared_storage_dir = config.SHARED_STORAGE_DIR.resolve()
# Unsigned (signing server input) configuration.
self.unsigned_storage_dir = absolute_shared_storage_dir / 'unsigned'
self.unsigned_archive_info = ArchiveWithIndicator(
self.unsigned_storage_dir, 'unsigned_files.tar', 'ready.stamp')
# Signed (signing server output) configuration.
self.signed_storage_dir = absolute_shared_storage_dir / 'signed'
self.signed_archive_info = ArchiveWithIndicator(
self.signed_storage_dir, 'signed_files.tar', 'ready.stamp')
self.platform = util.get_current_platform()
"""
General note on cleanup environment functions.
It is expected that there is only one instance of the code signer server
running for a given input/output directory, and that it serves a single
buildbot worker.
By its nature, a buildbot worker only produces one build at a time and
never performs concurrent builds.
This leads to a conclusion that when starting in a clean environment
there shouldn't be any archives remaining from a previous build.
However, it is possible to have various failure scenarios which might
leave the environment in a non-clean state:
- A network hiccup which makes the buildbot worker stop the current build
and re-start it after the connection to the server is re-established.
Note, this could also happen during buildbot server maintenance.
- The signing server might get restarted due to updates or other reasons.
Requiring manual interaction in such cases is not desirable, so here we
simply assume that the system is used the way it is intended to be and
restore the environment to a pristine clean state.
"""
def cleanup_environment_for_builder(self) -> None:
self.unsigned_archive_info.clean()
self.signed_archive_info.clean()
def cleanup_environment_for_signing_server(self) -> None:
# Don't clear the requested to-be-signed archive since we might be
# restarting signing machine while the buildbot is busy.
self.signed_archive_info.clean()
############################################################################
# Buildbot worker side helpers.
@abc.abstractmethod
def check_file_is_to_be_signed(
self, file: AbsoluteAndRelativeFileName) -> bool:
"""
Check whether the file is to be signed.
Used by both the single file signing pipeline and the recursive directory
signing pipeline.
This is where the code signer is to check whether a file is to be signed or
not. This check might be based on a simple extension test or on an actual
test of whether the file already has a digital signature.
"""
def collect_files_to_sign(self, path: Path) \
-> List[AbsoluteAndRelativeFileName]:
"""
Get all files which need to be signed from the given path.
NOTE: The path might either be a file or directory.
This function is run from the buildbot worker side.
"""
# If there is a single file provided trust the buildbot worker that it
# is eligible for signing.
if path.is_file():
file = AbsoluteAndRelativeFileName.from_path(path)
if not self.check_file_is_to_be_signed(file):
return []
return [file]
all_files = AbsoluteAndRelativeFileName.recursively_from_directory(
path)
files_to_be_signed = [file for file in all_files
if self.check_file_is_to_be_signed(file)]
return files_to_be_signed
def wait_for_signed_archive_or_die(self) -> None:
"""
Wait until the archive with signed files is available.
Will only wait for the configured time. If that time is exceeded and there
is still no response from the signing server, the application will exit
with a non-zero exit code.
"""
timeout_in_seconds = self.config.TIMEOUT_IN_SECONDS
time_start = time.monotonic()
while not self.signed_archive_info.is_ready():
time.sleep(1)
time_slept_in_seconds = time.monotonic() - time_start
if time_slept_in_seconds > timeout_in_seconds:
self.unsigned_archive_info.clean()
raise SystemExit("Signing server didn't finish signing in "
f"{timeout_in_seconds} seconds, dying :(")
def copy_signed_files_to_directory(
self, signed_dir: Path, destination_dir: Path) -> None:
"""
Copy all files from signed_dir to destination_dir.
This function will overwrite any existing file. Permissions are copied
from the source files, but other metadata, such as timestamps, are not.
"""
for signed_filepath in signed_dir.glob('**/*'):
if not signed_filepath.is_file():
continue
relative_filepath = signed_filepath.relative_to(signed_dir)
destination_filepath = destination_dir / relative_filepath
destination_filepath.parent.mkdir(parents=True, exist_ok=True)
shutil.copy(signed_filepath, destination_filepath)
def run_buildbot_path_sign_pipeline(self, path: Path) -> None:
"""
Run all steps needed to make given path signed.
Path points to an unsigned file or a directory which contains unsigned
files.
If the path points to a single file then this file will be signed.
This is used to sign a final bundle such as .msi on Windows or .dmg on
macOS.
NOTE: The code signer implementation might actually reject signing the
file, in which case the file will be left unsigned. This isn't to be
considered a failure situation; it just might happen when the buildbot
worker can not detect whether signing is really required in a specific
case or not.
If the path points to a directory then code signer will sign all
signable files from it (finding them recursively).
"""
self.cleanup_environment_for_builder()
# Make sure storage directory exists.
self.unsigned_storage_dir.mkdir(parents=True, exist_ok=True)
# Collect all files which need to be signed and pack them into a single
# archive which will be sent to the signing server.
logger_builder.info('Collecting files which are to be signed...')
files = self.collect_files_to_sign(path)
if not files:
logger_builder.info('No files to be signed, ignoring.')
return
logger_builder.info('Found %d files to sign.', len(files))
pack_files(files=files,
archive_filepath=self.unsigned_archive_info.archive_filepath)
self.unsigned_archive_info.tag_ready()
# Wait for the signing server to finish signing.
logger_builder.info('Waiting for the signing server to sign the files...')
self.wait_for_signed_archive_or_die()
# Extract signed files from archive and move files to final location.
with TemporaryDirectory(prefix='blender-buildbot-') as temp_dir_str:
unpacked_signed_files_dir = Path(temp_dir_str)
logger_builder.info('Extracting signed files from archive...')
extract_files(
archive_filepath=self.signed_archive_info.archive_filepath,
extraction_dir=unpacked_signed_files_dir)
destination_dir = path
if destination_dir.is_file():
destination_dir = destination_dir.parent
self.copy_signed_files_to_directory(
unpacked_signed_files_dir, destination_dir)
logger_builder.info('Removing archive with signed files...')
self.signed_archive_info.clean()
############################################################################
# Signing server side helpers.
def wait_for_sign_request(self) -> None:
"""
Wait for the buildbot to request signing of an archive.
"""
# TODO(sergey): Support graceful shutdown on Ctrl-C.
while not self.unsigned_archive_info.is_ready():
time.sleep(1)
@abc.abstractmethod
def sign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
"""
Sign all files in the given list.
NOTE: Signing should happen in-place.
"""
def run_signing_pipeline(self):
"""
Run the full signing pipeline starting from the point when the buildbot
worker has requested signing.
"""
# Make sure storage directory exists.
self.signed_storage_dir.mkdir(parents=True, exist_ok=True)
with TemporaryDirectory(prefix='blender-codesign-') as temp_dir_str:
temp_dir = Path(temp_dir_str)
logger_server.info('Extracting unsigned files from archive...')
extract_files(
archive_filepath=self.unsigned_archive_info.archive_filepath,
extraction_dir=temp_dir)
logger_server.info('Collecting all files which need signing...')
files = AbsoluteAndRelativeFileName.recursively_from_directory(
temp_dir)
logger_server.info('Signing all requested files...')
self.sign_all_files(files)
logger_server.info('Packing signed files...')
pack_files(files=files,
archive_filepath=self.signed_archive_info.archive_filepath)
self.signed_archive_info.tag_ready()
logger_server.info('Removing signing request...')
self.unsigned_archive_info.clean()
logger_server.info('Signing is complete.')
def run_signing_server(self):
logger_server.info('Starting new code signing server...')
self.cleanup_environment_for_signing_server()
logger_server.info('Code signing server is ready')
while True:
logger_server.info('Waiting for the signing request in %s...',
self.unsigned_storage_dir)
self.wait_for_sign_request()
logger_server.info(
'Got signing request, beginning signing procedure.')
self.run_signing_pipeline()
############################################################################
# Command executing.
#
# Abstracted to a degree that allows running commands for a foreign
# platform.
# The goal with this is to allow performing dry-run tests of the code signing
# server from other platforms (for example, to test that the macOS code signer
# does what it is supposed to after doing a refactor on Linux).
# TODO(sergey): What is the type annotation for the command?
def run_command_or_mock(self, command, platform: util.Platform) -> None:
"""
Run the given command if the current platform matches the given one.
If the platform is different then the command is only printed, allowing
to verify the logic of the code signing process.
"""
if platform != self.platform:
logger_server.info(
f'Will run command for {platform}: {command}')
return
logger_server.info(f'Running command: {command}')
subprocess.run(command)
# TODO(sergey): What is the type annotation for the command?
def check_output_or_mock(self, command,
platform: util.Platform,
allow_nonzero_exit_code=False) -> str:
"""
Run the given command if the current platform matches the given one.
If the platform is different then the command is only printed, allowing
to verify the logic of the code signing process.
If allow_nonzero_exit_code is true then the output will be returned even
if the application quits with a non-zero exit code. Otherwise a
subprocess.CalledProcessError exception will be raised in that case.
"""
if platform != self.platform:
logger_server.info(
f'Will run command for {platform}: {command}')
return ''
if allow_nonzero_exit_code:
process = subprocess.Popen(command,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
output = process.communicate()[0]
return output.decode()
logger_server.info(f'Running command: {command}')
return subprocess.check_output(
command, stderr=subprocess.STDOUT).decode()
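# Typical wiring (a sketch; see the platform-specific subclasses below):
# a subclass implements check_file_is_to_be_signed() and sign_all_files(),
# the buildbot worker side then calls run_buildbot_path_sign_pipeline(path),
# while the signing machine keeps run_signing_server() running in a loop.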


@@ -1,62 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Configuration of a code signer which is specific to the code running from
# buildbot's worker.
import sys
from pathlib import Path
import codesign.util as util
from codesign.config_common import *
platform = util.get_current_platform()
if platform == util.Platform.LINUX:
SHARED_STORAGE_DIR = Path('/data/codesign')
elif platform == util.Platform.WINDOWS:
SHARED_STORAGE_DIR = Path('Z:\\codesign')
elif platform == util.Platform.MACOS:
SHARED_STORAGE_DIR = Path('/Volumes/codesign_macos/codesign')
# https://docs.python.org/3/library/logging.config.html#configuration-dictionary-schema
LOGGING = {
'version': 1,
'formatters': {
'default': {'format': '%(asctime)-15s %(levelname)8s %(name)s %(message)s'}
},
'handlers': {
'console': {
'class': 'logging.StreamHandler',
'formatter': 'default',
'stream': 'ext://sys.stderr',
}
},
'loggers': {
'codesign': {'level': 'INFO'},
},
'root': {
'level': 'WARNING',
'handlers': [
'console',
],
}
}


@@ -1,36 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
from pathlib import Path
# Timeout in seconds for the signing process.
#
# This is how long the buildbot packing step will wait for the signing
# server to perform signing.
#
# NOTE: Notarization could take a long time, hence the rather high value
# here. Might consider using different timeout for different platforms.
TIMEOUT_IN_SECONDS = 45 * 60 * 60
# Directory which is shared across buildbot worker and signing server.
#
# This is where worker puts files requested for signing as well as where
# server puts signed files.
SHARED_STORAGE_DIR: Path


@@ -1,101 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Configuration of a code signer which is specific to the code signing server.
#
# NOTE: DO NOT put any sensitive information here, put it in an actual
# configuration on the signing machine.
from pathlib import Path
from codesign.config_common import *
CODESIGN_DIRECTORY = Path(__file__).absolute().parent
BLENDER_GIT_ROOT_DIRECTORY = CODESIGN_DIRECTORY.parent.parent.parent
################################################################################
# Common configuration.
# Directory where folders for codesign requests and signed result are stored.
# For example, /data/codesign
SHARED_STORAGE_DIR: Path
################################################################################
# macOS-specific configuration.
MACOS_ENTITLEMENTS_FILE = \
BLENDER_GIT_ROOT_DIRECTORY / 'release' / 'darwin' / 'entitlements.plist'
# Identity of the Developer ID Application certificate which is to be used for
# codesign tool.
# Use `security find-identity -v -p codesigning` to find the identity.
#
# NOTE: This identity is just an example from release/darwin/README.txt.
MACOS_CODESIGN_IDENTITY = 'AE825E26F12D08B692F360133210AF46F4CF7B97'
# User name (Apple ID) which will be used to request notarization.
MACOS_XCRUN_USERNAME = 'me@example.com'
# One-time application password which will be used to request notarization.
MACOS_XCRUN_PASSWORD = '@keychain:altool-password'
# Timeout in seconds within which the notarial office is supposed to reply.
MACOS_NOTARIZE_TIMEOUT_IN_SECONDS = 60 * 60
################################################################################
# Windows-specific configuration.
# URL to the timestamping authority.
WIN_TIMESTAMP_AUTHORITY_URL = 'http://timestamp.digicert.com'
# Full path to the certificate used for signing.
#
# The path and expected file format might vary depending on the platform.
#
# On Windows it is usually a PKCS #12 key (.pfx), so the path will look
# like Path('C:\\Secret\\Blender.pfx').
WIN_CERTIFICATE_FILEPATH: Path
################################################################################
# Logging configuration, common for all platforms.
# https://docs.python.org/3/library/logging.config.html#configuration-dictionary-schema
LOGGING = {
'version': 1,
'formatters': {
'default': {'format': '%(asctime)-15s %(levelname)8s %(name)s %(message)s'}
},
'handlers': {
'console': {
'class': 'logging.StreamHandler',
'formatter': 'default',
'stream': 'ext://sys.stderr',
}
},
'loggers': {
'codesign': {'level': 'INFO'},
},
'root': {
'level': 'WARNING',
'handlers': [
'console',
],
}
}


@@ -1,72 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# NOTE: This is a no-op signer (since there isn't really a procedure to sign
# Linux binaries yet). Used to debug and verify the code signing routines on
# a Linux environment.
import logging
from pathlib import Path
from typing import List
from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName
from codesign.base_code_signer import BaseCodeSigner
logger = logging.getLogger(__name__)
logger_server = logger.getChild('server')
class LinuxCodeSigner(BaseCodeSigner):
def is_active(self) -> bool:
"""
Check whether this signer is active.
If it is inactive, no files will be signed.
Used to be able to debug the code signing pipeline on Linux, where no code
signing happens in the actual buildbot and release environment.
"""
return False
def check_file_is_to_be_signed(
self, file: AbsoluteAndRelativeFileName) -> bool:
if file.relative_filepath == Path('blender'):
return True
if (file.relative_filepath.parts[-3:-1] == ('python', 'bin') and
file.relative_filepath.name.startswith('python')):
return True
if file.relative_filepath.suffix == '.so':
return True
return False
def collect_files_to_sign(self, path: Path) \
-> List[AbsoluteAndRelativeFileName]:
if not self.is_active():
return []
return super().collect_files_to_sign(path)
def sign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
num_files = len(files)
for file_index, file in enumerate(files):
logger.info('Server: Signed file [%d/%d] %s',
file_index + 1, num_files, file.relative_filepath)
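# NOTE: With is_active() returning False above, the builder side collects no
# files, and sign_all_files() only logs what it would have signed. To dry-run
# the pipeline on Linux, is_active() would have to be changed to return True.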


@@ -1,453 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import logging
import re
import stat
import subprocess
import time
from pathlib import Path
from typing import List
import codesign.util as util
from buildbot_utils import Builder
from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName
from codesign.base_code_signer import BaseCodeSigner
logger = logging.getLogger(__name__)
logger_server = logger.getChild('server')
# NOTE: Check is done as filename.endswith(), so keep the dot
EXTENSIONS_TO_BE_SIGNED = {'.dylib', '.so', '.dmg'}
# Prefixes of a file (not directory) name which are to be signed.
# Used to sign extra executable files in Contents/Resources.
NAME_PREFIXES_TO_BE_SIGNED = {'python'}
def is_file_from_bundle(file: AbsoluteAndRelativeFileName) -> bool:
"""
Check whether file is coming from an .app bundle
"""
parts = file.relative_filepath.parts
if not parts:
return False
if not parts[0].endswith('.app'):
return False
return True
def get_bundle_from_file(
file: AbsoluteAndRelativeFileName) -> AbsoluteAndRelativeFileName:
"""
Get AbsoluteAndRelativeFileName descriptor of bundle
"""
assert(is_file_from_bundle(file))
parts = file.relative_filepath.parts
bundle_name = parts[0]
base_dir = file.base_dir
bundle_filepath = file.base_dir / bundle_name
return AbsoluteAndRelativeFileName(base_dir, bundle_filepath)
def is_bundle_executable_file(file: AbsoluteAndRelativeFileName) -> bool:
"""
Check whether given file is an executable within an app bundle
"""
if not is_file_from_bundle(file):
return False
parts = file.relative_filepath.parts
num_parts = len(parts)
if num_parts < 3:
return False
if parts[1:3] != ('Contents', 'MacOS'):
return False
return True
def xcrun_field_value_from_output(field: str, output: str) -> str:
"""
Get value of a given field from xcrun output.
If the field is not found, an empty string is returned.
"""
field_prefix = field + ': '
for line in output.splitlines():
line = line.strip()
if line.startswith(field_prefix):
return line[len(field_prefix):]
return ''
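# For example (hypothetical output line):
#   xcrun_field_value_from_output('Status', '   Status: success') -> 'success'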
class MacOSCodeSigner(BaseCodeSigner):
def check_file_is_to_be_signed(
self, file: AbsoluteAndRelativeFileName) -> bool:
if file.relative_filepath.name.startswith('.'):
return False
if is_bundle_executable_file(file):
return True
base_name = file.relative_filepath.name
if any(base_name.startswith(prefix)
for prefix in NAME_PREFIXES_TO_BE_SIGNED):
return True
mode = file.absolute_filepath.lstat().st_mode
if mode & stat.S_IXUSR != 0:
file_output = subprocess.check_output(
("file", file.absolute_filepath)).decode()
if "64-bit executable" in file_output:
return True
return file.relative_filepath.suffix in EXTENSIONS_TO_BE_SIGNED
def collect_files_to_sign(self, path: Path) \
-> List[AbsoluteAndRelativeFileName]:
# Include all files when signing an app or dmg bundle: all the files are
# needed to do a valid signature of the bundle.
if path.name.endswith('.app'):
return AbsoluteAndRelativeFileName.recursively_from_directory(path)
if path.is_dir():
files = []
for child in path.iterdir():
if child.name.endswith('.app'):
current_files = AbsoluteAndRelativeFileName.recursively_from_directory(
child)
else:
current_files = super().collect_files_to_sign(child)
for current_file in current_files:
files.append(AbsoluteAndRelativeFileName(
path, current_file.absolute_filepath))
return files
return super().collect_files_to_sign(path)
############################################################################
# Codesign.
def codesign_remove_signature(
self, file: AbsoluteAndRelativeFileName) -> None:
"""
Make sure the given file does not have a codesign signature.
This is needed because codesigning is not possible for a file which
already has a signature.
"""
logger_server.info(
'Removing codesign signature from %s...', file.relative_filepath)
command = ['codesign', '--remove-signature', file.absolute_filepath]
self.run_command_or_mock(command, util.Platform.MACOS)
def codesign_file(
self, file: AbsoluteAndRelativeFileName) -> None:
"""
Sign the given file.
NOTE: The file must not have any signatures.
"""
logger_server.info(
'Codesigning %s...', file.relative_filepath)
entitlements_file = self.config.MACOS_ENTITLEMENTS_FILE
command = ['codesign',
'--timestamp',
'--options', 'runtime',
f'--entitlements={entitlements_file}',
'--sign', self.config.MACOS_CODESIGN_IDENTITY,
file.absolute_filepath]
self.run_command_or_mock(command, util.Platform.MACOS)
def codesign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> bool:
"""
Run codesign tool on all eligible files in the given list.
Will ignore all files which are not to be signed. For the rest, it will
remove a possible existing signature and add a new one.
"""
num_files = len(files)
have_ignored_files = False
signed_files = []
for file_index, file in enumerate(files):
# Ignore file if it is not to be signed.
# Allows manually constructing a ZIP of a bundle and getting it signed.
if not self.check_file_is_to_be_signed(file):
logger_server.info(
'Ignoring file [%d/%d] %s',
file_index + 1, num_files, file.relative_filepath)
have_ignored_files = True
continue
logger_server.info(
'Running codesigning routines for file [%d/%d] %s...',
file_index + 1, num_files, file.relative_filepath)
self.codesign_remove_signature(file)
self.codesign_file(file)
signed_files.append(file)
if have_ignored_files:
logger_server.info('Signed %d files:', len(signed_files))
num_signed_files = len(signed_files)
for file_index, signed_file in enumerate(signed_files):
logger_server.info(
'- [%d/%d] %s',
file_index + 1, num_signed_files,
signed_file.relative_filepath)
return True
def codesign_bundles(
self, files: List[AbsoluteAndRelativeFileName]) -> bool:
"""
Codesign all .app bundles in the given list of files.
The bundle is deduced from the paths of the files, and every bundle is only
signed once.
"""
signed_bundles = set()
extra_files = []
for file in files:
if not is_file_from_bundle(file):
continue
bundle = get_bundle_from_file(file)
bundle_name = bundle.relative_filepath
if bundle_name in signed_bundles:
continue
logger_server.info('Running codesign routines on bundle %s',
bundle_name)
# It is not possible to remove signature from DMG.
if bundle.relative_filepath.name.endswith('.app'):
self.codesign_remove_signature(bundle)
self.codesign_file(bundle)
signed_bundles.add(bundle_name)
# Codesigning a bundle adds an extra folder with information.
# It needs to be copied to the source.
code_signature_directory = \
bundle.absolute_filepath / 'Contents' / '_CodeSignature'
code_signature_files = \
AbsoluteAndRelativeFileName.recursively_from_directory(
code_signature_directory)
for code_signature_file in code_signature_files:
bundle_relative_file = AbsoluteAndRelativeFileName(
bundle.base_dir,
code_signature_directory /
code_signature_file.relative_filepath)
extra_files.append(bundle_relative_file)
files.extend(extra_files)
return True
############################################################################
# Notarization.
def notarize_get_bundle_id(self, file: AbsoluteAndRelativeFileName) -> str:
"""
Get bundle ID which will be used to notarize DMG
"""
name = file.relative_filepath.name
app_name = name.split('-', 2)[0].lower()
app_name_words = app_name.split()
if len(app_name_words) > 1:
app_name_id = ''.join(word.capitalize() for word in app_name_words)
else:
app_name_id = app_name_words[0]
# TODO(sergey): Consider using "alpha" for buildbot builds.
return f'org.blenderfoundation.{app_name_id}.release'
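# For example, a hypothetical 'blender-2.83.0-macos.dmg' would map to the
# bundle ID 'org.blenderfoundation.blender.release'.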
def notarize_request(self, file) -> str:
"""
Request notarization of the given file.
Returns the UUID of the notarization request. If an error occurred, None
is returned instead of the UUID.
"""
bundle_id = self.notarize_get_bundle_id(file)
logger_server.info('Bundle ID: %s', bundle_id)
logger_server.info('Submitting file to the notarial office.')
command = [
'xcrun', 'altool', '--notarize-app', '--verbose',
'-f', file.absolute_filepath,
'--primary-bundle-id', bundle_id,
'--username', self.config.MACOS_XCRUN_USERNAME,
'--password', self.config.MACOS_XCRUN_PASSWORD]
output = self.check_output_or_mock(
command, util.Platform.MACOS, allow_nonzero_exit_code=True)
for line in output.splitlines():
line = line.strip()
if line.startswith('RequestUUID = '):
request_uuid = line[14:]
return request_uuid
# Check whether the package has been already submitted.
if 'The software asset has already been uploaded.' in line:
request_uuid = re.sub(
r'.*The upload ID is ([A-Fa-f0-9\-]+).*', '\\1', line)
logger_server.warning(
f'The package has been already submitted under UUID {request_uuid}')
return request_uuid
logger_server.error(output)
logger_server.error('xcrun command did not report RequestUUID')
return None
def notarize_wait_result(self, request_uuid: str) -> bool:
"""
Wait until the notarial office has a reply.
"""
logger_server.info(
'Waiting for a result from the notarization office.')
command = ['xcrun', 'altool',
'--notarization-info', request_uuid,
'--username', self.config.MACOS_XCRUN_USERNAME,
'--password', self.config.MACOS_XCRUN_PASSWORD]
time_start = time.monotonic()
timeout_in_seconds = self.config.MACOS_NOTARIZE_TIMEOUT_IN_SECONDS
while True:
output = self.check_output_or_mock(
command, util.Platform.MACOS, allow_nonzero_exit_code=True)
# Parse status and message
status = xcrun_field_value_from_output('Status', output)
status_message = xcrun_field_value_from_output(
'Status Message', output)
# Review status.
if status:
if status == 'success':
logger_server.info(
'Package successfully notarized: %s', status_message)
return True
elif status == 'invalid':
logger_server.error(output)
logger_server.error(
'Package notarization has failed: %s', status_message)
return False
elif status == 'in progress':
pass
else:
logger_server.info(
'Unknown notarization status %s (%s)', status, status_message)
logger_server.info('Keep waiting for notarization office.')
time.sleep(30)
time_slept_in_seconds = time.monotonic() - time_start
if time_slept_in_seconds > timeout_in_seconds:
logger_server.error(
"Notarial office didn't reply in %f seconds.",
timeout_in_seconds)
return False
def notarize_staple(self, file: AbsoluteAndRelativeFileName) -> bool:
"""
Staple notarial label on the file
"""
logger_server.info('Stapling notarial stamp.')
command = ['xcrun', 'stapler', 'staple', '-v', file.absolute_filepath]
self.check_output_or_mock(command, util.Platform.MACOS)
return True
def notarize_dmg(self, file: AbsoluteAndRelativeFileName) -> bool:
"""
Run entire pipeline to get DMG notarized.
"""
logger_server.info('Begin notarization routines on %s',
file.relative_filepath)
# Submit file for notarization.
request_uuid = self.notarize_request(file)
if not request_uuid:
return False
logger_server.info('Received Request UUID: %s', request_uuid)
# Wait for the status from the notarization office.
if not self.notarize_wait_result(request_uuid):
return False
# Staple.
if not self.notarize_staple(file):
return False
return True
def notarize_all_dmg(
self, files: List[AbsoluteAndRelativeFileName]) -> bool:
"""
Notarize all DMG images from the input.
Images are supposed to be codesigned already.
"""
for file in files:
if not file.relative_filepath.name.endswith('.dmg'):
continue
if not self.check_file_is_to_be_signed(file):
continue
if not self.notarize_dmg(file):
return False
return True
############################################################################
# Entry point.
def sign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
# TODO(sergey): Handle errors somehow.
if not self.codesign_all_files(files):
return
if not self.codesign_bundles(files):
return
if not self.notarize_all_dmg(files):
return


@@ -1,52 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import logging.config
import sys
from pathlib import Path
from typing import Optional
import codesign.config_builder
import codesign.util as util
from codesign.base_code_signer import BaseCodeSigner
class SimpleCodeSigner:
code_signer: Optional[BaseCodeSigner]
def __init__(self):
platform = util.get_current_platform()
if platform == util.Platform.LINUX:
from codesign.linux_code_signer import LinuxCodeSigner
self.code_signer = LinuxCodeSigner(codesign.config_builder)
elif platform == util.Platform.MACOS:
from codesign.macos_code_signer import MacOSCodeSigner
self.code_signer = MacOSCodeSigner(codesign.config_builder)
elif platform == util.Platform.WINDOWS:
from codesign.windows_code_signer import WindowsCodeSigner
self.code_signer = WindowsCodeSigner(codesign.config_builder)
else:
self.code_signer = None
def sign_file_or_directory(self, path: Path) -> None:
logging.config.dictConfig(codesign.config_builder.LOGGING)
self.code_signer.run_buildbot_path_sign_pipeline(path)
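# Intended usage from the buildbot worker scripts (a sketch; the path below
# is hypothetical, any file or directory with signable content works):
#
#   SimpleCodeSigner().sign_file_or_directory(Path('../install/win64'))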


@@ -1,54 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import sys
from enum import Enum
from pathlib import Path
class Platform(Enum):
LINUX = 1
MACOS = 2
WINDOWS = 3
def get_current_platform() -> Platform:
if sys.platform == 'linux':
return Platform.LINUX
elif sys.platform == 'darwin':
return Platform.MACOS
elif sys.platform == 'win32':
return Platform.WINDOWS
raise Exception(f'Unknown platform {sys.platform}')
def ensure_file_does_not_exist_or_die(filepath: Path) -> None:
"""
If the file exists, unlink it.
If the file path exists and is not a file, SystemExit is raised.
If the file path does not exist, nothing happens.
"""
if not filepath.exists():
return
if not filepath.is_file():
# TODO(sergey): Provide information about what the filepath actually is.
raise SystemExit(f'{filepath} is expected to be a file, but is not')
filepath.unlink()


@@ -1,84 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import logging
from pathlib import Path
from typing import List
import codesign.util as util
from buildbot_utils import Builder
from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName
from codesign.base_code_signer import BaseCodeSigner
logger = logging.getLogger(__name__)
logger_server = logger.getChild('server')
# NOTE: Check is done as filename.endswith(), so keep the dot
EXTENSIONS_TO_BE_SIGNED = {'.exe', '.dll', '.pyd', '.msi'}
BLACKLIST_FILE_PREFIXES = (
'api-ms-', 'concrt', 'msvcp', 'ucrtbase', 'vcomp', 'vcruntime')
class WindowsCodeSigner(BaseCodeSigner):
def check_file_is_to_be_signed(
self, file: AbsoluteAndRelativeFileName) -> bool:
base_name = file.relative_filepath.name
if any(base_name.startswith(prefix)
for prefix in BLACKLIST_FILE_PREFIXES):
return False
return file.relative_filepath.suffix in EXTENSIONS_TO_BE_SIGNED
def get_sign_command_prefix(self) -> List[str]:
return [
'signtool', 'sign', '/v',
'/f', self.config.WIN_CERTIFICATE_FILEPATH,
'/tr', self.config.WIN_TIMESTAMP_AUTHORITY_URL]
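# With the example configuration from config_server.py this expands to
# roughly the following (a sketch; paths depend on the actual setup):
#   signtool sign /v /f C:\Secret\Blender.pfx /tr http://timestamp.digicert.com <file>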
def sign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
# NOTE: Sign files one by one to avoid possible command line length
# overflow (which could happen if we ever decide to sign every binary
# in the install folder, for example).
#
# TODO(sergey): Consider doing batched signing of a handful of files in
# one go (but only if this is actually known to be much faster).
num_files = len(files)
for file_index, file in enumerate(files):
# Ignore file if it is not to be signed.
# Allows manually constructing a ZIP of the package and getting it signed.
if not self.check_file_is_to_be_signed(file):
logger_server.info(
'Ignoring file [%d/%d] %s',
file_index + 1, num_files, file.relative_filepath)
continue
command = self.get_sign_command_prefix()
command.append(file.absolute_filepath)
logger_server.info(
'Running signtool command for file [%d/%d] %s...',
file_index + 1, num_files, file.relative_filepath)
# TODO(sergey): Check the status somehow. With a missing certificate
# the command still exits with a zero code.
self.run_command_or_mock(command, util.Platform.WINDOWS)
# TODO(sergey): Report number of signed and ignored files.


@@ -1,41 +0,0 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import logging.config
from pathlib import Path
from typing import List
from codesign.macos_code_signer import MacOSCodeSigner
import codesign.config_server
if __name__ == "__main__":
entitlements_file = codesign.config_server.MACOS_ENTITLEMENTS_FILE
if not entitlements_file.exists():
raise SystemExit(
f'Entitlements file {entitlements_file} does not exist.')
if not entitlements_file.is_file():
raise SystemExit(
f'Entitlements file {entitlements_file} is not a file.')
logging.config.dictConfig(codesign.config_server.LOGGING)
code_signer = MacOSCodeSigner(codesign.config_server)
code_signer.run_signing_server()


@@ -1,11 +0,0 @@
@echo off
rem This is an entry point of the codesign server for Windows.
rem It makes sure that signtool.exe is within the current PATH and can be
rem used by the Python script.
SETLOCAL
set PATH=C:\Program Files (x86)\Windows Kits\10\App Certification Kit;%PATH%
codesign_server_windows.py


@@ -1,54 +0,0 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Implementation of codesign server for Windows.
#
# NOTE: If signtool.exe is not in the PATH use codesign_server_windows.bat
import logging.config
import shutil
from pathlib import Path
from typing import List
import codesign.util as util
from codesign.windows_code_signer import WindowsCodeSigner
import codesign.config_server
if __name__ == "__main__":
logging.config.dictConfig(codesign.config_server.LOGGING)
logger = logging.getLogger(__name__)
logger_server = logger.getChild('server')
# TODO(sergey): Consider moving such sanity checks into
# CodeSigner.check_environment_or_die().
if not shutil.which('signtool.exe'):
if util.get_current_platform() == util.Platform.WINDOWS:
raise SystemExit("signtool.exe is not found in %PATH%")
logger_server.info(
'signtool.exe not found, '
'but will not be used on this foreign platform')
code_signer = WindowsCodeSigner(codesign.config_server)
code_signer.run_signing_server()


@@ -2,22 +2,33 @@
include("${CMAKE_CURRENT_LIST_DIR}/../../cmake/config/blender_release.cmake")
message(STATUS "Building in CentOS 7 64bit environment")
set(LIBDIR_NAME "linux_centos7_x86_64")
set(WITH_CXX11_ABI OFF CACHE BOOL "" FORCE)
# For libc-2.24 we are using a chroot which runs on a 64bit system.
# There we can not use a CPU bitness check since it is always 64bit, so
# instead we check for specific libraries.
#
# Other builders run in a bare virtual machine, and the libraries
# are installed to /opt/.
# We assume that only 64bit builders exist in such a configuration.
if(EXISTS "/lib/x86_64-linux-gnu/libc-2.24.so")
message(STATUS "Building in GLibc-2.24 environment")
set(LIBDIR_NAME "linux_x86_64")
elseif(EXISTS "/lib/i386-linux-gnu//libc-2.24.so")
message(STATUS "Building in GLibc-2.24 environment")
set(LIBDIR_NAME "linux_i686")
else()
message(STATUS "Building in generic 64bit environment")
set(LIBDIR_NAME "linux_x86_64")
endif()
# Default to only build Blender
set(WITH_BLENDER ON CACHE BOOL "" FORCE)
# ######## Linux-specific build options ########
# Options which are specific to Linux-only platforms
set(WITH_DOC_MANPAGE OFF CACHE BOOL "" FORCE)
# ######## Official release-specific build options ########
# Options which are specific to Linux release builds only
set(WITH_JACK_DYNLOAD ON CACHE BOOL "" FORCE)
set(WITH_SDL_DYNLOAD ON CACHE BOOL "" FORCE)
set(WITH_SYSTEM_GLEW OFF CACHE BOOL "" FORCE)
@@ -29,7 +40,7 @@ set(WITH_PYTHON_INSTALL_REQUESTS ON CACHE BOOL "" FORCE)
# ######## Release environment specific settings ########
set(LIBDIR "${CMAKE_CURRENT_LIST_DIR}/../../../../lib/${LIBDIR_NAME}" CACHE STRING "" FORCE)
set(LIBDIR "/opt/blender-deps/${LIBDIR_NAME}" CACHE BOOL "" FORCE)
# Platform specific configuration, to ensure static linking against everything.


@@ -1,542 +0,0 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
import argparse
import re
import shutil
import subprocess
import sys
import time
from pathlib import Path
from tempfile import TemporaryDirectory, NamedTemporaryFile
from typing import List
BUILDBOT_DIRECTORY = Path(__file__).absolute().parent
CODESIGN_SCRIPT = BUILDBOT_DIRECTORY / 'slave_codesign.py'
BLENDER_GIT_ROOT_DIRECTORY = BUILDBOT_DIRECTORY.parent.parent
DARWIN_DIRECTORY = BLENDER_GIT_ROOT_DIRECTORY / 'release' / 'darwin'
# Extra size which is added on top of the actual files size when estimating
# the size of the destination DMG.
EXTRA_DMG_SIZE_IN_BYTES = 800 * 1024 * 1024
################################################################################
# Common utilities
def get_directory_size(root_directory: Path) -> int:
"""
Get size of directory on disk
"""
total_size = 0
for file in root_directory.glob('**/*'):
total_size += file.lstat().st_size
return total_size
################################################################################
# DMG bundling specific logic
def create_argument_parser():
parser = argparse.ArgumentParser()
parser.add_argument(
'source_dir',
type=Path,
help='Source directory which points to either an existing .app bundle '
'or to a directory with .app bundles.')
parser.add_argument(
'--background-image',
type=Path,
help="Optional background picture which will be set on the DMG."
"If not provided default Blender's one is used.")
parser.add_argument(
'--volume-name',
type=str,
help='Optional name of a volume which will be used for DMG.')
parser.add_argument(
'--dmg',
type=Path,
help='Optional argument which points to a final DMG file name.')
parser.add_argument(
'--applescript',
type=Path,
help="Optional path to applescript to set up folder looks of DMG."
"If not provided default Blender's one is used.")
return parser
def collect_app_bundles(source_dir: Path) -> List[Path]:
"""
Collect all app bundles which are to be put into DMG
If the source directory points to FOO.app it will be the only app bundle
packed.
Otherwise all .app bundles from the given directory are placed into a single
DMG.
"""
if source_dir.name.endswith('.app'):
return [source_dir]
app_bundles = []
for filename in source_dir.glob('*'):
if not filename.is_dir():
continue
if not filename.name.endswith('.app'):
continue
app_bundles.append(filename)
return app_bundles
def collect_and_log_app_bundles(source_dir: Path) -> List[Path]:
app_bundles = collect_app_bundles(source_dir)
if not app_bundles:
print('No app bundles found for packing')
return
print(f'Found {len(app_bundles)} app bundle(s) to pack:')
for app_bundle in app_bundles:
print(f'- {app_bundle}')
return app_bundles
def estimate_dmg_size(app_bundles: List[Path]) -> int:
"""
Estimate size of DMG to hold requested app bundles
The size is based on actual size of all files in all bundles plus some
space to compensate for different size-on-disk plus some space to hold
codesign signatures.
It is better to err on the high side since the empty space is compressed, but
lack of space might cause silent failures later on.
"""
app_bundles_size = 0
for app_bundle in app_bundles:
app_bundles_size += get_directory_size(app_bundle)
return app_bundles_size + EXTRA_DMG_SIZE_IN_BYTES
def copy_app_bundles_to_directory(app_bundles: List[Path],
directory: Path) -> None:
"""
Copy all bundles to a given directory
This directory is what the DMG will be created from.
"""
for app_bundle in app_bundles:
print(f'Copying {app_bundle.name}...')
shutil.copytree(app_bundle, directory / app_bundle.name)
def get_main_app_bundle(app_bundles: List[Path]) -> Path:
"""
Get the main application bundle for the installation
"""
return app_bundles[0]
def create_dmg_image(app_bundles: List[Path],
dmg_filepath: Path,
volume_name: str) -> None:
"""
Create DMG disk image and put app bundles in it
No DMG configuration or codesigning is happening here.
"""
if dmg_filepath.exists():
print(f'Removing existing writable DMG {dmg_filepath}...')
dmg_filepath.unlink()
print('Preparing directory with app bundles for the DMG...')
with TemporaryDirectory(prefix='blender-dmg-content-') as content_dir_str:
# Copy all bundles to a clean directory.
content_dir = Path(content_dir_str)
copy_app_bundles_to_directory(app_bundles, content_dir)
# Estimate size of the DMG.
dmg_size = estimate_dmg_size(app_bundles)
print(f'Estimated DMG size: {dmg_size:,} bytes.')
# Create the DMG.
print(f'Creating writable DMG {dmg_filepath}')
command = ('hdiutil',
'create',
'-size', str(dmg_size),
'-fs', 'HFS+',
'-srcfolder', content_dir,
'-volname', volume_name,
'-format', 'UDRW',
dmg_filepath)
subprocess.run(command)
def get_writable_dmg_filepath(dmg_filepath: Path):
"""
Get file path for writable DMG image
"""
parent = dmg_filepath.parent
return parent / (dmg_filepath.stem + '-temp.dmg')
def mount_readwrite_dmg(dmg_filepath: Path) -> None:
"""
Mount writable DMG
Mounting point would be /Volumes/<volume name>
"""
print(f'Mounting read-write DMG {dmg_filepath}')
command = ('hdiutil',
'attach', '-readwrite',
'-noverify',
'-noautoopen',
dmg_filepath)
subprocess.run(command)
def get_mount_directory_for_volume_name(volume_name: str) -> Path:
"""
Get directory under which the volume will be mounted
"""
return Path('/Volumes') / volume_name
def eject_volume(volume_name: str) -> None:
"""
Eject given volume, if mounted
"""
mount_directory = get_mount_directory_for_volume_name(volume_name)
if not mount_directory.exists():
return
mount_directory_str = str(mount_directory)
print(f'Ejecting volume {volume_name}')
# Figure out which device to eject.
mount_output = subprocess.check_output(['mount']).decode()
device = ''
for line in mount_output.splitlines():
if f'on {mount_directory_str} (' not in line:
continue
tokens = line.split(' ', 3)
if len(tokens) < 3:
continue
if tokens[1] != 'on':
continue
if device:
raise Exception(
f'Multiple devices found for mounting point {mount_directory}')
device = tokens[0]
if not device:
raise Exception(
f'No device found for mounting point {mount_directory}')
print(f'{mount_directory} is mounted as device {device}, ejecting...')
subprocess.run(['diskutil', 'eject', device])
def copy_background_if_needed(background_image_filepath: Path,
mount_directory: Path) -> None:
"""
Copy background to the DMG
If the background image is not specified it will not be copied.
"""
if not background_image_filepath:
print('No background image provided.')
return
print(f'Copying background image {background_image_filepath}')
destination_dir = mount_directory / '.background'
destination_dir.mkdir(exist_ok=True)
destination_filepath = destination_dir / background_image_filepath.name
shutil.copy(background_image_filepath, destination_filepath)
def create_applications_link(mount_directory: Path) -> None:
"""
Create link to /Applications in the given location
"""
print('Creating link to /Applications')
command = ('ln', '-s', '/Applications', mount_directory / ' ')
subprocess.run(command)
def run_applescript(applescript: Path,
volume_name: str,
app_bundles: List[Path],
background_image_filepath: Path) -> None:
"""
Run given applescript to adjust look and feel of the DMG
"""
main_app_bundle = get_main_app_bundle(app_bundles)
with NamedTemporaryFile(
mode='w', suffix='.applescript') as temp_applescript:
print('Adjusting applescript for volume name...')
# Adjust script to the specific volume name.
with open(applescript, mode='r') as input:
for line in input.readlines():
stripped_line = line.strip()
if stripped_line.startswith('tell disk'):
line = re.sub('tell disk ".*"',
f'tell disk "{volume_name}"',
line)
elif stripped_line.startswith('set background picture'):
if not background_image_filepath:
continue
else:
background_image_short = \
'.background:' + background_image_filepath.name
line = re.sub('to file ".*"',
f'to file "{background_image_short}"',
line)
line = line.replace('blender.app', main_app_bundle.name)
temp_applescript.write(line)
temp_applescript.flush()
print('Running applescript...')
command = ('osascript', temp_applescript.name)
subprocess.run(command)
print('Waiting for applescript...')
# NOTE: This is copied from bundle.sh. The exact reason for the sleep
# remains a mystery.
time.sleep(5)
def codesign(subject: Path):
"""
Codesign file or directory
NOTE: For DMG it will also notarize.
"""
command = (CODESIGN_SCRIPT, subject)
subprocess.run(command)
def codesign_app_bundles_in_dmg(mount_directory: str) -> None:
"""
Code sign all binaries and bundles in the mounted directory
"""
print(f'Codesigning all app bundles in {mount_directory}')
codesign(mount_directory)
def codesign_and_notarize_dmg(dmg_filepath: Path) -> None:
"""
Run codesign and notarization pipeline on the DMG
"""
print(f'Codesigning and notarizing DMG {dmg_filepath}')
codesign(dmg_filepath)
def compress_dmg(writable_dmg_filepath: Path,
final_dmg_filepath: Path) -> None:
"""
Compress temporary read-write DMG
"""
command = ('hdiutil', 'convert',
writable_dmg_filepath,
'-format', 'UDZO',
'-o', final_dmg_filepath)
if final_dmg_filepath.exists():
print(f'Removing old compressed DMG {final_dmg_filepath}')
final_dmg_filepath.unlink()
print('Compressing disk image...')
subprocess.run(command)
def create_final_dmg(app_bundles: List[Path],
dmg_filepath: Path,
background_image_filepath: Path,
volume_name: str,
applescript: Path) -> None:
"""
Create DMG with all app bundles
Will take care of configuring the background, signing all binaries and app
bundles, and notarizing the DMG.
"""
print('Running all routines to create final DMG')
writable_dmg_filepath = get_writable_dmg_filepath(dmg_filepath)
mount_directory = get_mount_directory_for_volume_name(volume_name)
# Make sure volume is not mounted.
# If it is mounted it will prevent removing old DMG files and could make
# it so app bundles are copied to the wrong place.
eject_volume(volume_name)
create_dmg_image(app_bundles, writable_dmg_filepath, volume_name)
mount_readwrite_dmg(writable_dmg_filepath)
# Run codesign first, prior to copying anything else.
#
# This allows recursing into the content of the bundles without worrying
# about possible interference from the Applications symlink.
codesign_app_bundles_in_dmg(mount_directory)
copy_background_if_needed(background_image_filepath, mount_directory)
create_applications_link(mount_directory)
run_applescript(applescript, volume_name, app_bundles,
background_image_filepath)
print('Ejecting read-write DMG image...')
eject_volume(volume_name)
compress_dmg(writable_dmg_filepath, dmg_filepath)
writable_dmg_filepath.unlink()
codesign_and_notarize_dmg(dmg_filepath)
def ensure_dmg_extension(filepath: Path) -> Path:
"""
Make sure the given file has a .dmg extension
"""
if filepath.suffix != '.dmg':
return filepath.with_suffix(f'{filepath.suffix}.dmg')
return filepath
def get_dmg_filepath(requested_name: Path, app_bundles: List[Path]) -> Path:
"""
Get full file path for the final DMG image
Will use the provided one when possible, otherwise will deduce it from the
app bundles.
If the name is deduced, the DMG is stored in the current directory.
"""
if requested_name:
return ensure_dmg_extension(requested_name.absolute())
# TODO(sergey): This is not necessarily the main one.
main_bundle = app_bundles[0]
# Strip .app from the name
return Path(main_bundle.name[:-4] + '.dmg').absolute()
def get_background_image(requested_background_image: Path) -> Path:
"""
Get effective filepath for the background image
"""
if requested_background_image:
return requested_background_image.absolute()
return DARWIN_DIRECTORY / 'background.tif'
def get_applescript(requested_applescript: Path) -> Path:
"""
Get effective filepath for the applescript
"""
if requested_applescript:
return requested_applescript.absolute()
return DARWIN_DIRECTORY / 'blender.applescript'
def get_volume_name_from_dmg_filepath(dmg_filepath: Path) -> str:
"""
Deduce the volume name from the DMG path
Will use the first part of the DMG file name prior to the dash.
"""
tokens = dmg_filepath.stem.split('-')
words = tokens[0].split()
return ' '.join(word.capitalize() for word in words)
def get_volume_name(requested_volume_name: str,
dmg_filepath: Path) -> str:
"""
Get effective name for DMG volume
"""
if requested_volume_name:
return requested_volume_name
return get_volume_name_from_dmg_filepath(dmg_filepath)
def main():
parser = create_argument_parser()
args = parser.parse_args()
# Get normalized input parameters.
source_dir = args.source_dir.absolute()
background_image_filepath = get_background_image(args.background_image)
applescript = get_applescript(args.applescript)
app_bundles = collect_and_log_app_bundles(source_dir)
if not app_bundles:
return
dmg_filepath = get_dmg_filepath(args.dmg, app_bundles)
volume_name = get_volume_name(args.volume_name, dmg_filepath)
print(f'Will produce DMG "{dmg_filepath.name}" (without quotes)')
create_final_dmg(app_bundles,
dmg_filepath,
background_image_filepath,
volume_name,
applescript)
if __name__ == "__main__":
main()
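
As a small illustration of the name handling above (hypothetical values, not
part of the script): with no --dmg and no --volume-name given for a single
Blender.app bundle, get_dmg_filepath() strips the .app suffix and
get_volume_name_from_dmg_filepath() title-cases the first dash-separated
token of the stem:

# Hypothetical interactive session; the path and hash are placeholders.
>>> get_dmg_filepath(None, [Path('/tmp/Blender.app')]).name
'Blender.dmg'
>>> get_volume_name_from_dmg_filepath(Path('blender-2.81-abc123-macOS.dmg'))
'Blender'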


@@ -1,44 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# This is a script which is used as the POST-INSTALL step for CMake's regular
# INSTALL target.
# It is used by buildbot workers to sign every binary which is going into
# the final bundle.
# On Windows there is only python.exe for Python 3, no python3.exe.
#
# On other platforms it is possible to have both python2 and python3, and a
# symbolic link from python to either of them. So on those platforms use
# an explicit Python version.
if(WIN32)
set(PYTHON_EXECUTABLE python)
else()
set(PYTHON_EXECUTABLE python3)
endif()
execute_process(
COMMAND ${PYTHON_EXECUTABLE} "${CMAKE_CURRENT_LIST_DIR}/slave_codesign.py"
"${CMAKE_INSTALL_PREFIX}"
WORKING_DIRECTORY ${CMAKE_CURRENT_LIST_DIR}
RESULT_VARIABLE exit_code
)
if(NOT exit_code EQUAL "0")
message(FATAL_ERROR "Non-zero exit code of codesign tool")
endif()


@@ -1,74 +0,0 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# Helper script which takes care of signing the provided location.
#
# The location can either be a directory (in which case all eligible binaries
# will be signed) or a single file (in which case that single file is signed).
#
# This script takes care of all the complexity of communicating between the
# process which requests a file to be signed and the code signing server.
#
# NOTE: Signing happens in-place.
import argparse
import sys
from pathlib import Path
from codesign.simple_code_signer import SimpleCodeSigner
def create_argument_parser():
parser = argparse.ArgumentParser()
parser.add_argument('path_to_sign', type=Path)
return parser
def main():
parser = create_argument_parser()
args = parser.parse_args()
path_to_sign = args.path_to_sign.absolute()
if sys.platform == 'win32':
# When WIX packaging is used to generate the .msi on Windows, CPack will
# install two different projects to two different
# installation prefixes:
#
# - C:\b\build\_CPack_Packages\WIX\Blender
# - C:\b\build\_CPack_Packages\WIX\Unspecified
#
# The annoying part is that CMake's post-install script will only be run
# once, with the install prefix which corresponds to whichever project
# was installed last. But we want to sign binaries from all projects.
# So in order to do so we detect that we are running for the CPack
# project used for WIX and force the parent directory (which includes both
# projects) to be signed.
#
# Here we force both projects to be signed.
if path_to_sign.name == 'Unspecified' and 'WIX' in str(path_to_sign):
path_to_sign = path_to_sign.parent
code_signer = SimpleCodeSigner()
code_signer.sign_file_or_directory(path_to_sign)
if __name__ == "__main__":
main()
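
A small illustration of the WIX special case above, using PureWindowsPath so
the snippet runs on any platform (the script itself uses Path on a Windows
host); the paths are the ones named in the comment, not real build paths:

>>> from pathlib import PureWindowsPath
>>> p = PureWindowsPath(r'C:\b\build\_CPack_Packages\WIX\Unspecified')
>>> p.name == 'Unspecified' and 'WIX' in str(p)
True
>>> p.parent  # both WIX projects live under here and get signed together
PureWindowsPath('C:/b/build/_CPack_Packages/WIX')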


@@ -19,98 +19,148 @@
# <pep8 compliant>
import os
import subprocess
import sys
import shutil
import buildbot_utils
# get builder name
if len(sys.argv) < 2:
sys.stderr.write("Not enough arguments, expecting builder name\n")
sys.exit(1)
def get_cmake_options(builder):
post_install_script = os.path.join(
builder.blender_dir, 'build_files', 'buildbot', 'slave_codesign.cmake')
builder = sys.argv[1]
config_file = "build_files/cmake/config/blender_release.cmake"
options = ['-DCMAKE_BUILD_TYPE:STRING=Release',
'-DWITH_GTESTS=ON']
# we run from build/ directory
blender_dir = os.path.join('..', 'blender.git')
if builder.platform == 'mac':
options.append('-DCMAKE_OSX_ARCHITECTURES:STRING=x86_64')
options.append('-DCMAKE_OSX_DEPLOYMENT_TARGET=10.9')
elif builder.platform == 'win':
options.extend(['-G', 'Visual Studio 15 2017 Win64'])
options.extend(['-DPOSTINSTALL_SCRIPT:PATH=' + post_install_script])
elif builder.platform == 'linux':
config_file = "build_files/buildbot/config/blender_linux.cmake"
optix_sdk_dir = os.path.join(builder.blender_dir, '..', '..', 'NVIDIA-Optix-SDK')
options.append('-DOPTIX_ROOT_DIR:PATH=' + optix_sdk_dir)
def parse_header_file(filename, define):
import re
regex = re.compile("^#\s*define\s+%s\s+(.*)" % define)
with open(filename, "r") as file:
for l in file:
match = regex.match(l)
if match:
return match.group(1)
return None
options.append("-C" + os.path.join(builder.blender_dir, config_file))
options.append("-DCMAKE_INSTALL_PREFIX=%s" % (builder.install_dir))
if 'cmake' in builder:
# cmake
return options
# Some fine-tuning configuration
blender_dir = os.path.abspath(blender_dir)
build_dir = os.path.abspath(os.path.join('..', 'build', builder))
install_dir = os.path.abspath(os.path.join('..', 'install', builder))
targets = ['blender']
command_prefix = []
def update_git(builder):
# Do extra git fetch because not all platform/git/buildbot combinations
# update the origin remote, causing buildinfo to detect local changes.
os.chdir(builder.blender_dir)
bits = 64
print("Fetching remotes")
command = ['git', 'fetch', '--all']
buildbot_utils.call(builder.command_prefix + command)
# Config file to be used (relative to blender's sources root)
cmake_config_file = "build_files/cmake/config/blender_release.cmake"
def clean_directories(builder):
# Make sure no garbage remained from the previous run
if os.path.isdir(builder.install_dir):
shutil.rmtree(builder.install_dir)
# Set build options.
cmake_options = []
cmake_extra_options = ['-DCMAKE_BUILD_TYPE:STRING=Release',
'-DWITH_GTESTS=ON']
# Make sure build directory exists and enter it
os.makedirs(builder.build_dir, exist_ok=True)
if builder.startswith('mac'):
# Set up OSX architecture
if builder.endswith('x86_64_10_9_cmake'):
cmake_extra_options.append('-DCMAKE_OSX_ARCHITECTURES:STRING=x86_64')
cmake_extra_options.append('-DCMAKE_OSX_DEPLOYMENT_TARGET=10.9')
# Remove buildinfo files to force buildbot to re-generate them.
for buildinfo in ('buildinfo.h', 'buildinfo.h.txt', ):
full_path = os.path.join(builder.build_dir, 'source', 'creator', buildinfo)
if os.path.exists(full_path):
print("Removing {}" . format(buildinfo))
os.remove(full_path)
elif builder.startswith('win'):
if builder.startswith('win64'):
cmake_options.extend(['-G', 'Visual Studio 15 2017 Win64'])
elif builder.startswith('win32'):
bits = 32
cmake_options.extend(['-G', 'Visual Studio 15 2017'])
def cmake_configure(builder):
# CMake configuration
os.chdir(builder.build_dir)
elif builder.startswith('linux'):
cmake_config_file = "build_files/buildbot/config/blender_linux.cmake"
tokens = builder.split("_")
glibc = tokens[1]
if glibc == 'glibc224':
deb_name = "stretch"
if builder.endswith('x86_64_cmake'):
chroot_name = 'buildbot_' + deb_name + '_x86_64'
elif builder.endswith('i686_cmake'):
bits = 32
chroot_name = 'buildbot_' + deb_name + '_i686'
command_prefix = ['schroot', '-c', chroot_name, '--']
elif glibc == 'glibc217':
command_prefix = ['scl', 'enable', 'devtoolset-6', '--']
cmake_cache = os.path.join(builder.build_dir, 'CMakeCache.txt')
if os.path.exists(cmake_cache):
print("Removing CMake cache")
os.remove(cmake_cache)
cmake_options.append("-C" + os.path.join(blender_dir, cmake_config_file))
print("CMake configure:")
cmake_options = get_cmake_options(builder)
command = ['cmake', builder.blender_dir] + cmake_options
buildbot_utils.call(builder.command_prefix + command)
def cmake_build(builder):
# CMake build
os.chdir(builder.build_dir)
# NOTE: CPack will build an INSTALL target, which would mean that code
# signing will happen twice when using `make install` and CPack.
# The tricky bit here is that it is not possible to know whether the INSTALL
# target is used by CPack or by the buildbot itself. An extra wrinkle on top
# of this is that on Windows it is required to build the INSTALL target in
# order to have unit test binaries to run.
# So on the one hand we do an extra unneeded code sign on Windows, but on
# the positive side we don't add complexity and don't make the build process
# more fragile by trying to avoid this. The signing process is far faster than
# a clean buildbot build, especially with regression tests enabled.
if builder.platform == 'win':
command = ['cmake', '--build', '.', '--target', 'install', '--config', 'Release']
# Prepare CMake options needed to configure cuda binaries compilation, 64bit only.
if bits == 64:
cmake_options.append("-DWITH_CYCLES_CUDA_BINARIES=ON")
cmake_options.append("-DCUDA_64_BIT_DEVICE_CODE=ON")
else:
command = ['make', '-s', '-j16', 'install']
cmake_options.append("-DWITH_CYCLES_CUDA_BINARIES=OFF")
print("CMake build:")
buildbot_utils.call(builder.command_prefix + command)
cmake_options.append("-DCMAKE_INSTALL_PREFIX=%s" % (install_dir))
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
update_git(builder)
clean_directories(builder)
cmake_configure(builder)
cmake_build(builder)
cmake_options += cmake_extra_options
# Make sure no garbage remained from the previous run
if os.path.isdir(install_dir):
shutil.rmtree(install_dir)
for target in targets:
print("Building target %s" % (target))
# Construct build directory name based on the target
target_build_dir = build_dir
target_command_prefix = command_prefix[:]
if target != 'blender':
target_build_dir += '_' + target
target_name = 'install'
# Tweaking CMake options to respect the target
target_cmake_options = cmake_options[:]
# Do extra git fetch because not all platform/git/buildbot combinations
# update the origin remote, causing buildinfo to detect local changes.
os.chdir(blender_dir)
print("Fetching remotes")
command = ['git', 'fetch', '--all']
print(command)
retcode = subprocess.call(target_command_prefix + command)
if retcode != 0:
sys.exit(retcode)
# Make sure build directory exists and enter it
if not os.path.isdir(target_build_dir):
os.mkdir(target_build_dir)
os.chdir(target_build_dir)
# Configure the build
print("CMake options:")
print(target_cmake_options)
if os.path.exists('CMakeCache.txt'):
print("Removing CMake cache")
os.remove('CMakeCache.txt')
# Remove buildinfo files to force buildbot to re-generate them.
for buildinfo in ('buildinfo.h', 'buildinfo.h.txt', ):
full_path = os.path.join('source', 'creator', buildinfo)
if os.path.exists(full_path):
print("Removing {}" . format(buildinfo))
os.remove(full_path)
retcode = subprocess.call(target_command_prefix + ['cmake', blender_dir] + target_cmake_options)
if retcode != 0:
print('Configuration FAILED!')
sys.exit(retcode)
if 'win32' in builder or 'win64' in builder:
command = ['cmake', '--build', '.', '--target', target_name, '--config', 'Release']
else:
command = ['make', '-s', '-j2', target_name]
print("Executing command:")
print(command)
retcode = subprocess.call(target_command_prefix + command)
if retcode != 0:
sys.exit(retcode)
else:
print("Unknown building system")
sys.exit(1)
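
For reference, a hypothetical example of the configure command assembled by
get_cmake_options() and cmake_configure() above for a win64 builder; every
path below is a placeholder, not a value from the build farm:

['cmake', 'C:\\slaves\\win64\\blender.git',
 '-DCMAKE_BUILD_TYPE:STRING=Release',
 '-DWITH_GTESTS=ON',
 '-G', 'Visual Studio 15 2017 Win64',
 '-DPOSTINSTALL_SCRIPT:PATH=C:\\slaves\\win64\\blender.git\\build_files\\buildbot\\slave_codesign.cmake',
 '-CC:\\slaves\\win64\\blender.git\\build_files\\cmake\\config\\blender_release.cmake',
 '-DCMAKE_INSTALL_PREFIX=C:\\slaves\\win64\\install']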


@@ -23,48 +23,48 @@
# to the master in the next buildbot step.
import os
import subprocess
import sys
import zipfile
from pathlib import Path
# get builder name
if len(sys.argv) < 2:
sys.stderr.write("Not enough arguments, expecting builder name\n")
sys.exit(1)
import buildbot_utils
builder = sys.argv[1]
# Never write branch if it is master.
branch = sys.argv[2] if (len(sys.argv) >= 3 and sys.argv[2] != 'master') else ''
def get_package_name(builder, platform=None):
info = buildbot_utils.VersionInfo(builder)
blender_dir = os.path.join('..', 'blender.git')
build_dir = os.path.join('..', 'build', builder)
install_dir = os.path.join('..', 'install', builder)
buildbot_upload_zip = os.path.abspath(os.path.join(os.path.dirname(install_dir), "buildbot_upload.zip"))
package_name = 'blender-' + info.full_version
if platform:
package_name += '-' + platform
if not (builder.branch == 'master' or builder.is_release_branch):
if info.is_development_build:
package_name = builder.branch + "-" + package_name
return package_name
def sign_file_or_directory(path):
from codesign.simple_code_signer import SimpleCodeSigner
code_signer = SimpleCodeSigner()
code_signer.sign_file_or_directory(Path(path))
upload_filename = None # Name of the archive to be uploaded
# (this is the name of archive which will appear on the
# download page)
upload_filepath = None # Filepath to be uploaded to the server
# (this folder will be packed)
def create_buildbot_upload_zip(builder, package_files):
import zipfile
def parse_header_file(filename, define):
import re
regex = re.compile("^#\s*define\s+%s\s+(.*)" % define)
with open(filename, "r") as file:
for l in file:
match = regex.match(l)
if match:
return match.group(1)
return None
buildbot_upload_zip = os.path.join(builder.upload_dir, "buildbot_upload.zip")
if os.path.exists(buildbot_upload_zip):
os.remove(buildbot_upload_zip)
try:
z = zipfile.ZipFile(buildbot_upload_zip, "w", compression=zipfile.ZIP_STORED)
for filepath, filename in package_files:
print("Packaged", filename)
z.write(filepath, arcname=filename)
z.close()
except Exception as ex:
sys.stderr.write('Create buildbot_upload.zip failed: ' + str(ex) + '\n')
sys.exit(1)
# Make sure install directory always exists
if not os.path.exists(install_dir):
os.makedirs(install_dir)
def create_tar_xz(src, dest, package_name):
def create_tar_bz2(src, dest, package_name):
# One extra to remove leading os.sep when cleaning root for package_root
ln = len(src) + 1
flist = list()
@@ -75,123 +75,168 @@ def create_tar_xz(src, dest, package_name):
flist.extend([(os.path.join(root, file), os.path.join(package_root, file)) for file in files])
import tarfile
# Set UID/GID of archived files to 0, otherwise they'd be owned by whatever
# user compiled the package. If root then unpacks it to /usr/local/, you get
# a security issue.
def _fakeroot(tarinfo):
tarinfo.gid = 0
tarinfo.gname = "root"
tarinfo.uid = 0
tarinfo.uname = "root"
return tarinfo
package = tarfile.open(dest, 'w:xz', preset=9)
package = tarfile.open(dest, 'w:bz2')
for entry in flist:
package.add(entry[0], entry[1], recursive=False, filter=_fakeroot)
package.add(entry[0], entry[1], recursive=False)
package.close()
def cleanup_files(dirpath, extension):
for f in os.listdir(dirpath):
filepath = os.path.join(dirpath, f)
if os.path.isfile(filepath) and f.endswith(extension):
os.remove(filepath)
if builder.find('cmake') != -1:
# CMake
if 'win' in builder or 'mac' in builder:
os.chdir(build_dir)
files = [f for f in os.listdir('.') if os.path.isfile(f) and f.endswith('.zip')]
for f in files:
os.remove(f)
retcode = subprocess.call(['cpack', '-G', 'ZIP'])
result_file = [f for f in os.listdir('.') if os.path.isfile(f) and f.endswith('.zip')][0]
# TODO(sergey): Such magic usually happens in SCons' packaging but we don't have it
# in the CMake yet. Until then we do some magic here.
tokens = result_file.split('-')
blender_version = tokens[1].split('.')
blender_full_version = '.'.join(blender_version[0:2])
git_hash = tokens[2].split('.')[1]
platform = builder.split('_')[0]
if platform == 'mac':
# Special exception for OSX
platform = 'OSX-10.9-'
if builder.endswith('x86_64_10_9_cmake'):
platform += 'x86_64'
if builder.endswith('vc2015'):
platform += "-vc14"
builderified_name = 'blender-{}-{}-{}'.format(blender_full_version, git_hash, platform)
# NOTE: Blender 2.7 is already respected by blender_full_version.
if branch != '' and branch != 'blender2.7':
builderified_name = branch + "-" + builderified_name
os.rename(result_file, "{}.zip".format(builderified_name))
# create zip file
try:
if os.path.exists(buildbot_upload_zip):
os.remove(buildbot_upload_zip)
z = zipfile.ZipFile(buildbot_upload_zip, "w", compression=zipfile.ZIP_STORED)
z.write("{}.zip".format(builderified_name))
z.close()
sys.exit(retcode)
except Exception as ex:
sys.stderr.write('Create buildbot_upload.zip failed' + str(ex) + '\n')
sys.exit(1)
elif builder.startswith('linux_'):
blender = os.path.join(install_dir, 'blender')
buildinfo_h = os.path.join(build_dir, "source", "creator", "buildinfo.h")
blender_h = os.path.join(blender_dir, "source", "blender", "blenkernel", "BKE_blender_version.h")
# Get version information
blender_version = int(parse_header_file(blender_h, 'BLENDER_VERSION'))
blender_version = "%d.%d" % (blender_version // 100, blender_version % 100)
blender_hash = parse_header_file(buildinfo_h, 'BUILD_HASH')[1:-1]
blender_glibc = builder.split('_')[1]
command_prefix = []
bits = 64
blender_arch = 'x86_64'
if blender_glibc == 'glibc224':
if builder.endswith('x86_64_cmake'):
chroot_name = 'buildbot_stretch_x86_64'
elif builder.endswith('i686_cmake'):
chroot_name = 'buildbot_stretch_i686'
bits = 32
blender_arch = 'i686'
command_prefix = ['schroot', '-c', chroot_name, '--']
elif blender_glibc == 'glibc217':
command_prefix = ['scl', 'enable', 'devtoolset-6', '--']
# Strip all unused symbols from the binaries
print("Stripping binaries...")
subprocess.call(command_prefix + ['strip', '--strip-all', blender])
print("Stripping python...")
py_target = os.path.join(install_dir, blender_version)
subprocess.call(command_prefix + ['find', py_target, '-iname', '*.so', '-exec', 'strip', '-s', '{}', ';'])
# Copy all files which are too specific to be copied by
# the CMake rules themselves
print("Copying extra scripts and libs...")
extra = '/' + os.path.join('home', 'sources', 'release-builder', 'extra')
mesalibs = os.path.join(extra, 'mesalibs' + str(bits) + '.tar.bz2')
software_gl = os.path.join(blender_dir, 'release', 'bin', 'blender-softwaregl')
icons = os.path.join(blender_dir, 'release', 'freedesktop', 'icons')
os.system('tar -xpf %s -C %s' % (mesalibs, install_dir))
os.system('cp %s %s' % (software_gl, install_dir))
os.system('cp -r %s %s' % (icons, install_dir))
os.system('chmod 755 %s' % (os.path.join(install_dir, 'blender-softwaregl')))
# Construct archive name
package_name = 'blender-%s-%s-linux-%s-%s' % (blender_version,
blender_hash,
blender_glibc,
blender_arch)
# NOTE: Blender 2.7 is already respected by blender_full_version.
if branch != '' and branch != 'blender2.7':
package_name = branch + "-" + package_name
upload_filename = package_name + ".tar.bz2"
print("Creating .tar.bz2 archive")
upload_filepath = install_dir + '.tar.bz2'
create_tar_bz2(install_dir, upload_filepath, package_name)
else:
print("Unknown building system")
sys.exit(1)
def pack_mac(builder):
info = buildbot_utils.VersionInfo(builder)
if upload_filepath is None:
# clean release directory if it already exists
release_dir = 'release'
os.chdir(builder.build_dir)
cleanup_files(builder.build_dir, '.dmg')
if os.path.exists(release_dir):
for f in os.listdir(release_dir):
if os.path.isfile(os.path.join(release_dir, f)):
os.remove(os.path.join(release_dir, f))
package_name = get_package_name(builder, 'macOS')
package_filename = package_name + '.dmg'
package_filepath = os.path.join(builder.build_dir, package_filename)
# create release package
try:
subprocess.call(['make', 'package_archive'])
except Exception as ex:
sys.stderr.write('Make package release failed' + str(ex) + '\n')
sys.exit(1)
release_dir = os.path.join(builder.blender_dir, 'release', 'darwin')
buildbot_dir = os.path.join(builder.blender_dir, 'build_files', 'buildbot')
bundle_script = os.path.join(buildbot_dir, 'slave_bundle_dmg.py')
# find release directory, must exist this time
if not os.path.exists(release_dir):
sys.stderr.write("Failed to find release directory %r.\n" % release_dir)
sys.exit(1)
command = [bundle_script]
command += ['--dmg', package_filepath]
if info.is_development_build:
background_image = os.path.join(release_dir, 'buildbot', 'background.tif')
command += ['--background-image', background_image]
command += [builder.install_dir]
buildbot_utils.call(command)
# find release package
file = None
filepath = None
create_buildbot_upload_zip(builder, [(package_filepath, package_filename)])
for f in os.listdir(release_dir):
rf = os.path.join(release_dir, f)
if os.path.isfile(rf) and f.startswith('blender'):
file = f
filepath = rf
if not file:
sys.stderr.write("Failed to find release package.\n")
sys.exit(1)
def pack_win(builder):
info = buildbot_utils.VersionInfo(builder)
upload_filename = file
upload_filepath = filepath
os.chdir(builder.build_dir)
cleanup_files(builder.build_dir, '.zip')
# CPack will add the platform name
cpack_name = get_package_name(builder, None)
package_name = get_package_name(builder, 'windows' + str(builder.bits))
command = ['cmake', '-DCPACK_OVERRIDE_PACKAGENAME:STRING=' + cpack_name, '.']
buildbot_utils.call(builder.command_prefix + command)
command = ['cpack', '-G', 'ZIP']
buildbot_utils.call(builder.command_prefix + command)
package_filename = package_name + '.zip'
package_filepath = os.path.join(builder.build_dir, package_filename)
package_files = [(package_filepath, package_filename)]
if info.version_cycle == 'release':
# Installer only for final release builds, otherwise will get
# 'this product is already installed' messages.
command = ['cpack', '-G', 'WIX']
buildbot_utils.call(builder.command_prefix + command)
package_filename = package_name + '.msi'
package_filepath = os.path.join(builder.build_dir, package_filename)
sign_file_or_directory(package_filepath)
package_files += [(package_filepath, package_filename)]
create_buildbot_upload_zip(builder, package_files)
def pack_linux(builder):
blender_executable = os.path.join(builder.install_dir, 'blender')
info = buildbot_utils.VersionInfo(builder)
# Strip all unused symbols from the binaries
print("Stripping binaries...")
buildbot_utils.call(builder.command_prefix + ['strip', '--strip-all', blender_executable])
print("Stripping python...")
py_target = os.path.join(builder.install_dir, info.version)
buildbot_utils.call(builder.command_prefix + ['find', py_target, '-iname', '*.so', '-exec', 'strip', '-s', '{}', ';'])
# Construct package name
platform_name = 'linux64'
package_name = get_package_name(builder, platform_name)
package_filename = package_name + ".tar.xz"
print("Creating .tar.xz archive")
package_filepath = builder.install_dir + '.tar.xz'
create_tar_xz(builder.install_dir, package_filepath, package_name)
# Create buildbot_upload.zip
create_buildbot_upload_zip(builder, [(package_filepath, package_filename)])
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
# Make sure install directory always exists
os.makedirs(builder.install_dir, exist_ok=True)
if builder.platform == 'mac':
pack_mac(builder)
elif builder.platform == 'win':
pack_win(builder)
elif builder.platform == 'linux':
pack_linux(builder)
# create zip file
try:
upload_zip = os.path.join(buildbot_upload_zip)
if os.path.exists(upload_zip):
os.remove(upload_zip)
z = zipfile.ZipFile(upload_zip, "w", compression=zipfile.ZIP_STORED)
z.write(upload_filepath, arcname=upload_filename)
z.close()
except Exception as ex:
sys.stderr.write('Create buildbot_upload.zip failed' + str(ex) + '\n')
sys.exit(1)
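
A small worked example of the version handling above (hypothetical values):
parse_header_file() reads BLENDER_VERSION as a single integer from
BKE_blender_version.h, and the Linux packing path formats it as the
"major.minor" directory name:

>>> blender_version = 280   # e.g. int(parse_header_file(blender_h, 'BLENDER_VERSION'))
>>> "%d.%d" % (blender_version // 100, blender_version % 100)
'2.80'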


@@ -21,17 +21,23 @@
# Runs on the buildbot slave; rsync the zip directly to the buildbot server
# rather than using upload, which is much slower.
import buildbot_utils
import os
import sys
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
# get builder name
if len(sys.argv) < 2:
sys.stderr.write("Not enough arguments, expecting builder name\n")
sys.exit(1)
# rsync, this assumes ssh keys are setup so no password is needed
local_zip = "buildbot_upload.zip"
remote_folder = "builder.blender.org:/data/buildbot-master/uploaded/"
remote_zip = remote_folder + "buildbot_upload_" + builder.name + ".zip"
builder = sys.argv[1]
command = ["rsync", "-avz", local_zip, remote_zip]
buildbot_utils.call(command)
# rsync, this assumes ssh keys are setup so no password is needed
local_zip = "buildbot_upload.zip"
remote_folder = "builder.blender.org:/data/buildbot-master/uploaded/"
remote_zip = remote_folder + "buildbot_upload_" + builder + ".zip"
command = "rsync -avz %s %s" % (local_zip, remote_zip)
print(command)
ret = os.system(command)
sys.exit(ret)


@@ -18,22 +18,59 @@
# <pep8 compliant>
import buildbot_utils
import subprocess
import os
import sys
def get_ctest_arguments(builder):
args = ['--output-on-failure']
if builder.platform == 'win':
args += ['-C', 'Release']
return args
# get builder name
if len(sys.argv) < 2:
sys.stderr.write("Not enough arguments, expecting builder name\n")
sys.exit(1)
def test(builder):
os.chdir(builder.build_dir)
builder = sys.argv[1]
command = builder.command_prefix + ['ctest'] + get_ctest_arguments(builder)
buildbot_utils.call(command)
# we run from build/ directory
blender_dir = '../blender.git'
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
test(builder)
if "cmake" in builder:
print("Automated tests are still DISABLED!")
sys.exit(0)
build_dir = os.path.abspath(os.path.join('..', 'build', builder))
install_dir = os.path.abspath(os.path.join('..', 'install', builder))
# NOTE: For a quick test only, to see if the approach works.
# In the future this must be replaced with an actual blender version.
blender_version = '2.80'
blender_version_dir = os.path.join(install_dir, blender_version)
command_prefix = []
extra_ctest_args = []
if builder.startswith('win'):
extra_ctest_args += ['-C', 'Release']
elif builder.startswith('linux'):
tokens = builder.split("_")
glibc = tokens[1]
if glibc == 'glibc224':
deb_name = "stretch"
if builder.endswith('x86_64_cmake'):
chroot_name = 'buildbot_' + deb_name + '_x86_64'
elif builder.endswith('i686_cmake'):
chroot_name = 'buildbot_' + deb_name + '_i686'
command_prefix = ['schroot', '--preserve-environment', '-c', chroot_name, '--']
elif glibc == 'glibc217':
command_prefix = ['scl', 'enable', 'devtoolset-6', '--']
ctest_env = os.environ.copy()
ctest_env['BLENDER_SYSTEM_SCRIPTS'] = os.path.join(blender_version_dir, 'scripts')
ctest_env['BLENDER_SYSTEM_DATAFILES'] = os.path.join(blender_version_dir, 'datafiles')
os.chdir(build_dir)
retcode = subprocess.call(command_prefix + ['ctest', '--output-on-failure'] + extra_ctest_args,
env=ctest_env)
# Always exit with success until we know all the tests are passing
# on all builders.
sys.exit(0)
else:
print("Unknown building system")
sys.exit(1)
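
For reference, the command assembled by test() and get_ctest_arguments()
above for a Windows builder (assuming an empty command_prefix there) is:

['ctest', '--output-on-failure', '-C', 'Release']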


@@ -1,31 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import buildbot_utils
import os
import sys
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
os.chdir(builder.blender_dir)
# Run make update which handles all libraries and submodules.
make_update = os.path.join(builder.blender_dir, "build_files", "utils", "make_update.py")
buildbot_utils.call([sys.executable, make_update, '--no-blender', "--use-tests", "--use-centos-libraries"])


@@ -27,6 +27,9 @@ ENDIF()
SET(_alembic_SEARCH_DIRS
${ALEMBIC_ROOT_DIR}
/usr/local
/sw # Fink
/opt/local # DarwinPorts
/opt/lib/alembic
)


@@ -17,6 +17,9 @@ ENDIF()
SET(_audaspace_SEARCH_DIRS
${AUDASPACE_ROOT_DIR}
/usr/local
/sw # Fink
/opt/local # DarwinPorts
)
# Use pkg-config to get hints about paths


@@ -29,6 +29,9 @@ ENDIF()
SET(_blosc_SEARCH_DIRS
${BLOSC_ROOT_DIR}
/usr/local
/sw # Fink
/opt/local # DarwinPorts
/opt/lib/blosc
)


@@ -25,6 +25,9 @@ ENDIF()
SET(_eigen3_SEARCH_DIRS
${EIGEN3_ROOT_DIR}
/usr/local
/sw # Fink
/opt/local # DarwinPorts
)
FIND_PATH(EIGEN3_INCLUDE_DIR


@@ -29,6 +29,10 @@ ENDIF()
SET(_embree_SEARCH_DIRS
${EMBREE_ROOT_DIR}
/usr/local
/sw # Fink
/opt/local # DarwinPorts
/opt/embree
/opt/lib/embree
)


@@ -29,6 +29,9 @@ ENDIF()
SET(_fftw3_SEARCH_DIRS
${FFTW3_ROOT_DIR}
/usr/local
/sw # Fink
/opt/local # DarwinPorts
)
FIND_PATH(FFTW3_INCLUDE_DIR


@@ -28,6 +28,7 @@ ENDIF()
SET(_glew_SEARCH_DIRS
${GLEW_ROOT_DIR}
/usr/local
)
FIND_PATH(GLEW_INCLUDE_DIR

Some files were not shown because too many files have changed in this diff.