
Compare commits


24 Commits

Author SHA1 Message Date
b063115d4e EEVEE: Remove softness parameters from RNA & UI
Keep them in DNA for backward compatibility
2019-09-02 16:41:07 +02:00
312dbf8af1 EEVEE: Fix sunlight when sun angle is 180 degrees 2019-09-02 16:41:07 +02:00
eb8093cb14 EEVEE: Shadow: Add back shadow bias
This is needed in some corner cases (shadow acne caused by aliasing and depth
disparity). It is a simple bias added on top of the default bias. It should not
be needed most of the time, but we leave it at 1 by default.
2019-09-02 16:41:07 +02:00
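
Below is a minimal standalone C sketch of the idea in this commit. The real code lives in EEVEE's shadow shaders; the function and parameter names, the constants, and whether the exposed value offsets or scales the built-in bias are all illustrative assumptions.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical shadow depth test: `extra_bias` stands for the parameter this
 * commit adds back, applied on top of a built-in default bias. */
static bool shadow_lit(float stored_depth, float receiver_depth,
                       float default_bias, float extra_bias)
{
  float bias = default_bias + extra_bias;
  return (receiver_depth - bias) <= stored_depth;
}

int main(void)
{
  /* Receiver slightly behind the stored depth: classic acne territory. */
  printf("without extra bias: %d\n", shadow_lit(0.5000f, 0.5008f, 0.0005f, 0.0f));
  printf("with extra bias:    %d\n", shadow_lit(0.5000f, 0.5008f, 0.0005f, 0.0005f));
  return 0;
}
```
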
6adaa4202a EEVEE: Fix spotlight shadow optimization
Spotlights can be scaled non-uniformly, in one direction only. This fix makes
sure both directions are taken into account before skipping cubemap faces.
2019-09-02 16:41:07 +02:00
4dd3a9790f UI: Make cascaded shadowmap panel on top of contact shadow
This makes more sense, as the cascade parameters are more important.
2019-09-02 16:41:07 +02:00
c584116e25 EEVEE: Shadow: Add temporal sampling to shadowmaps
If soft shadows are enabled, we randomize the shadowmap sample position
to reduce aliasing artifacts (jagged shadow edges).
2019-09-02 16:41:07 +02:00
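
A rough C sketch of what per-frame jittering of the shadow lookup looks like. `rand01()` stands in for EEVEE's per-sample noise, and the UV-space radius is a made-up number; accumulating frames with different offsets is what turns the jitter into a smooth penumbra.

```c
#include <stdio.h>
#include <stdlib.h>

typedef struct { float x, y; } vec2;

/* Stand-in for EEVEE's per-frame random sequence. */
static float rand01(void) { return (float)rand() / (float)RAND_MAX; }

/* Offset the shadowmap sample position inside the light's radius (in UV
 * units). Each frame uses a different offset; temporal accumulation averages
 * the hard results into a soft, alias-free edge. */
static vec2 jitter_shadow_uv(vec2 uv, float light_radius_uv)
{
  vec2 jittered = {
    uv.x + (rand01() * 2.0f - 1.0f) * light_radius_uv,
    uv.y + (rand01() * 2.0f - 1.0f) * light_radius_uv,
  };
  return jittered;
}

int main(void)
{
  vec2 uv = { 0.5f, 0.5f };
  for (int frame = 0; frame < 4; frame++) {
    vec2 j = jitter_shadow_uv(uv, 0.01f);
    printf("frame %d -> sample at (%.4f, %.4f)\n", frame, j.x, j.y);
  }
  return 0;
}
```
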
3e10785c60 Cleanup: EEVEE: Split eevee_lights.c into smaller files
Also includes some const-correctness fixes and some header cleanup.
2019-09-02 16:41:07 +02:00
9468425524 Eevee: Shadow: Refactor / Cleanup of shadow update
- Replace EEVEE_lightbits by BLI_bitmap
- Replace EEVEE_BoundSphere by BoundSphere
- Support for dupli light shadows
- Detect unnecessary soft shadow updates (i.e. when only the view moves)
- Remove Object references
2019-09-02 16:41:07 +02:00
31e21d38e3 Eevee: Light: Refactor shadow tagging to allow dupli shadow casters
Dupli objects can now cast shadows.

This also replaces the custom lightbits with BLI_bitmap.
2019-09-02 16:41:07 +02:00
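
To illustrate why a generic bitmap helps here: a fixed-size "lightbits" field caps how many shadow casters can be tagged, while a heap-allocated bitmap (which is what Blender's BLI_bitmap provides) grows with the scene. The sketch below is plain C written for this page, not Blender's actual BLI_bitmap API.

```c
#include <limits.h>
#include <stdio.h>
#include <stdlib.h>

typedef unsigned int bitmap_t;
#define BITS_PER_WORD (sizeof(bitmap_t) * CHAR_BIT)

static bitmap_t *bitmap_new(size_t bits)
{
  return calloc((bits + BITS_PER_WORD - 1) / BITS_PER_WORD, sizeof(bitmap_t));
}
static void bitmap_enable(bitmap_t *bm, size_t i)
{
  bm[i / BITS_PER_WORD] |= 1u << (i % BITS_PER_WORD);
}
static int bitmap_test(const bitmap_t *bm, size_t i)
{
  return (bm[i / BITS_PER_WORD] >> (i % BITS_PER_WORD)) & 1u;
}

int main(void)
{
  size_t num_casters = 200; /* no hard-coded light limit */
  bitmap_t *update_tag = bitmap_new(num_casters);
  bitmap_enable(update_tag, 3);   /* tag caster 3 for a shadowmap update */
  bitmap_enable(update_tag, 150); /* works for dupli-generated casters too */
  printf("caster 3 tagged: %d, caster 10 tagged: %d\n",
         bitmap_test(update_tag, 3), bitmap_test(update_tag, 10));
  free(update_tag);
  return 0;
}
```
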
8457593672 Eevee: Shadows: Add texel border on shadow cube for better edge filtering
Unfortunately, this seems to be imprecise at lower cube resolutions, but the
result is still, most of the time, more pleasant than no border filtering.
2019-09-02 16:41:06 +02:00
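
A small C sketch of the remapping idea: the face UVs are squeezed inward by one texel so bilinear taps stay inside a border that duplicates the neighboring face. The names and the exact remap are assumptions; it also shows why precision suffers at low resolutions.

```c
#include <stdio.h>

/* Squeeze a cube-face UV from [0, 1] into [texel, 1 - texel] so that a
 * bilinear fetch never reads past the one-texel border copied from the
 * neighboring face. */
static float remap_uv_with_border(float uv, int face_size_px)
{
  float texel = 1.0f / (float)face_size_px;
  return uv * (1.0f - 2.0f * texel) + texel;
}

int main(void)
{
  /* The border eats a larger share of the range on small faces, which is
   * why the result is less precise at low cube resolutions. */
  printf("512 px face: 0.0 -> %.5f\n", remap_uv_with_border(0.0f, 512));
  printf(" 64 px face: 0.0 -> %.5f\n", remap_uv_with_border(0.0f, 64));
  return 0;
}
```
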
e13f5cc5ae Eevee: Shadow: Speedup: Only render shadow cube face needed
This reduces the number of faces to render to 5 for area lights and to 5 or 1
for spotlights.

Spotlights with a spot size of less than 90 degrees only need 1 face and are
the fastest.
2019-09-02 16:41:06 +02:00
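
The face counts from the message, written out as a small C helper. The 90-degree threshold for spotlights and the 5-face case for area lights come from the text above; the rest (point lights needing all 6 faces) is an assumption for illustration.

```c
#include <stdio.h>

enum light_type { LIGHT_POINT, LIGHT_AREA, LIGHT_SPOT };

/* Number of shadow-cube faces that actually need to be rendered. A spot cone
 * narrower than one 90-degree cube face fits entirely inside that face. */
static int shadow_cube_face_count(enum light_type type, float spot_angle_deg)
{
  switch (type) {
    case LIGHT_AREA:
      return 5;
    case LIGHT_SPOT:
      return (spot_angle_deg < 90.0f) ? 1 : 5;
    case LIGHT_POINT:
    default:
      return 6;
  }
}

int main(void)
{
  printf("point:        %d faces\n", shadow_cube_face_count(LIGHT_POINT, 0.0f));
  printf("area:         %d faces\n", shadow_cube_face_count(LIGHT_AREA, 0.0f));
  printf("spot 45 deg:  %d face\n",  shadow_cube_face_count(LIGHT_SPOT, 45.0f));
  printf("spot 120 deg: %d faces\n", shadow_cube_face_count(LIGHT_SPOT, 120.0f));
  return 0;
}
```
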
278d174e35 Eevee: Shadows: Make shadowmap follow light orientation
This is in preparation for an optimization.
2019-09-02 16:41:06 +02:00
d5555e5d5a DRW: Add line offset to DRW_STATE_SHADOW_OFFSET
This is needed for hairs.
2019-09-02 16:41:06 +02:00
585eaa5b0a Eevee: Shadows: Improve contact shadows
Contact shadows now follow the shadowmap direction. This means they match the
shadowmap blur, which is proportional to the light radius.

Moreover, this adds a more efficient bias for most contact shadows.
2019-09-02 16:41:06 +02:00
4ec63b5421 Eevee: Shadow: Speedup: Use only 2 samples for cascaded shadowmaps 2019-09-02 16:41:06 +02:00
a77301289d Eevee: Shadow: Remove receiver plane bias and use hardware filtering
In an attempt to simplify shadowing in EEVEE, we remove the bias and
filtering options.

Now the shadowmap always gets the minimum constant and slope bias, and we
only do bilinear shadow filtering. This means the shadow is as sharp and
exact as the shadowmap format allows (bit depth and size).

Only the lamp size can change the shadow softness.
2019-09-02 16:41:06 +02:00
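
For reference, the "minimum constant and slope bias" mentioned above is the classic depth-bias formula sketched below. The constants are made up, and the real renderer applies the slope part through rasterizer state rather than in a helper like this.

```c
#include <math.h>
#include <stdio.h>

/* Constant + slope-scaled shadow bias. The more grazing the light (small
 * N.L), the larger the depth error across one shadowmap texel, so the
 * larger the bias needs to be. */
static float shadow_bias(float n_dot_l, float constant_bias, float slope_bias)
{
  float cos_theta = fmaxf(n_dot_l, 1e-4f);
  float tan_theta = sqrtf(1.0f - cos_theta * cos_theta) / cos_theta;
  return constant_bias + slope_bias * tan_theta;
}

int main(void)
{
  printf("facing the light  (N.L = 1.0): bias %.5f\n", shadow_bias(1.0f, 5e-4f, 1e-3f));
  printf("grazing the light (N.L = 0.1): bias %.5f\n", shadow_bias(0.1f, 5e-4f, 1e-3f));
  return 0;
}
```
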
95a3c256dc Eevee: Shadow: Make sun clip distances automatic
This simplifies sun light setup and matches Cycles' behavior more closely.
2019-09-02 16:41:06 +02:00
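
One plausible way to derive such automatic clip distances, assuming the shadow casters are bounded by a sphere: project the sphere onto the light direction and fit near/far around it. This is a guess at the general idea, not EEVEE's exact computation.

```c
#include <stdio.h>

typedef struct { float x, y, z; } vec3;
typedef struct { vec3 center; float radius; } bound_sphere;

static float dot3(vec3 a, vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

/* Fit the orthographic near/far planes of a sun shadowmap around a bounding
 * sphere of the shadow casters, measured along the normalized light
 * direction. No user tweaking required. */
static void sun_clip_distances(bound_sphere bs, vec3 light_dir,
                               float *r_clip_near, float *r_clip_far)
{
  float center_dist = dot3(bs.center, light_dir);
  *r_clip_near = center_dist - bs.radius;
  *r_clip_far = center_dist + bs.radius;
}

int main(void)
{
  bound_sphere casters = { { 0.0f, 0.0f, 2.0f }, 10.0f };
  vec3 dir = { 0.0f, 0.0f, 1.0f }; /* sun shining along +Z */
  float clip_near, clip_far;
  sun_clip_distances(casters, dir, &clip_near, &clip_far);
  printf("near %.2f, far %.2f\n", clip_near, clip_far);
  return 0;
}
```
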
0edd48c904 Eevee: Remove ESM and VSM code 2019-09-02 16:41:06 +02:00
9e19ec0d18 Eevee: Shadows: Add Receiver Plane Depth Bias
This bias replaces the previous bias method. The bias is now scaled so that
the depth of the triangle is correct at the sample location. This is done by
computing the actual depth that would be recorded in the shadowmap at the
texel locations if the triangle were extrapolated.

This leads to less shadow acne and ensures the bias is always the minimum
amount that still gives correct shadowing.

However, this technique is not failure free: if the receiver is nearly
parallel to the light direction, the bias becomes nearly infinite and light
leaking occurs.

For this reason I decided to cap the bias with the bias parameter, to tweak
between shadow acne and light leaking.
2019-09-02 16:41:06 +02:00
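
A compact C sketch of the technique and of the cap described above: the receiver's depth gradient in shadowmap UV space predicts the depth the surface would have one texel away, and the resulting offset is clamped so a nearly light-parallel receiver (huge gradient) cannot produce an unbounded bias. All names and numbers are illustrative.

```c
#include <math.h>
#include <stdio.h>

typedef struct { float u, v; } vec2;

/* Receiver-plane depth bias: extrapolate the receiver's depth to a texel
 * offset using its depth gradient d(depth)/d(uv), then clamp the correction.
 * The cap trades shadow acne (too small) against light leaking (too large). */
static float biased_receiver_depth(float receiver_depth, vec2 depth_gradient,
                                   vec2 texel_offset, float max_bias)
{
  float bias = depth_gradient.u * texel_offset.u +
               depth_gradient.v * texel_offset.v;
  bias = fminf(fabsf(bias), max_bias);
  return receiver_depth - bias;
}

int main(void)
{
  vec2 gentle = { 0.2f, 0.1f }, grazing = { 500.0f, 0.0f };
  vec2 texel = { 1.0f / 1024.0f, 1.0f / 1024.0f };
  printf("gentle slope:  %.5f\n",
         biased_receiver_depth(0.5f, gentle, texel, 0.01f));
  printf("grazing slope: %.5f (bias capped)\n",
         biased_receiver_depth(0.5f, grazing, texel, 0.01f));
  return 0;
}
```
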
e951f5f0fe Eevee: Replace ESM and VSM by PCF shadow mapping
PCF shadowmaps are less prone to light leaking and are faster to
render.

This removes a substantial part of the shadowing code.
2019-09-02 16:41:06 +02:00
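
For contrast with ESM/VSM, percentage-closer filtering averages hard depth comparisons instead of prefiltering transformed depth, which is why it is far less prone to leaking. A toy CPU version over a 4x4 depth map:

```c
#include <stdio.h>

#define MAP_SIZE 4

/* PCF over a 3x3 neighborhood: each tap is a binary "is the receiver closer
 * than the stored depth" test, and the average is the soft shadow factor. */
static float pcf_3x3(const float depth_map[MAP_SIZE][MAP_SIZE],
                     int x, int y, float receiver_depth)
{
  float lit = 0.0f;
  int taps = 0;
  for (int dy = -1; dy <= 1; dy++) {
    for (int dx = -1; dx <= 1; dx++) {
      int sx = x + dx, sy = y + dy;
      if (sx < 0 || sy < 0 || sx >= MAP_SIZE || sy >= MAP_SIZE) {
        continue;
      }
      lit += (receiver_depth <= depth_map[sy][sx]) ? 1.0f : 0.0f;
      taps++;
    }
  }
  return lit / (float)taps;
}

int main(void)
{
  /* An occluder at depth 0.3 covers the two left columns of the map. */
  const float depth_map[MAP_SIZE][MAP_SIZE] = {
    { 0.3f, 0.3f, 1.0f, 1.0f },
    { 0.3f, 0.3f, 1.0f, 1.0f },
    { 0.3f, 0.3f, 1.0f, 1.0f },
    { 0.3f, 0.3f, 1.0f, 1.0f },
  };
  printf("on the shadow boundary: %.2f\n", pcf_3x3(depth_map, 1, 1, 0.6f));
  printf("fully lit:              %.2f\n", pcf_3x3(depth_map, 3, 1, 0.6f));
  return 0;
}
```
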
4df76629a5 DRW: Add shadow bias state
This state adds a shadowmap bias to avoid most self-shadowing.
2019-09-02 16:41:06 +02:00
786b0c6414 Eevee: SSS: Refactor translucency
This separates the translucency evaluation from the surface evaluation.

This has the benefit of reducing code complexity and removing the need for
the (non-test) shadowmap sampler in the shading pass.

One big change is that the BSDF intensity is multiplied into and stored with
the albedo color instead of the SSS irradiance, in order to apply it to both
the translucency and the SSS diffusion. This changes the look of mixed SSS
shaders, which is now closer to Cycles.

Performance cost is negligible.

# Conflicts:
#	source/blender/gpu/shaders/gpu_shader_material.glsl
2019-09-02 16:41:02 +02:00
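
A tiny C sketch of the data-flow change described above: the BSDF intensity is folded into the stored albedo, so both the diffusion result and the separately evaluated translucency pick it up. Buffer layouts and names are simplified assumptions.

```c
#include <stdio.h>

typedef struct { float r, g, b; } color;

static color scale(color a, float s) { color c = { a.r * s, a.g * s, a.b * s }; return c; }
static color mul(color a, color b) { color c = { a.r * b.r, a.g * b.g, a.b * b.b }; return c; }

int main(void)
{
  float bsdf_intensity = 0.5f; /* e.g. the SSS closure's weight in a mix */
  color base_albedo = { 0.8f, 0.6f, 0.5f };
  color sss_irradiance = { 1.0f, 1.0f, 1.0f };
  color translucency = { 0.2f, 0.2f, 0.2f };

  /* Store the intensity with the albedo instead of with the irradiance ... */
  color stored_albedo = scale(base_albedo, bsdf_intensity);
  /* ... so both outputs are attenuated consistently. */
  color diffusion_out = mul(sss_irradiance, stored_albedo);
  color translucency_out = mul(translucency, stored_albedo);

  printf("diffusion:    %.2f %.2f %.2f\n", diffusion_out.r, diffusion_out.g, diffusion_out.b);
  printf("translucency: %.2f %.2f %.2f\n", translucency_out.r, translucency_out.g, translucency_out.b);
  return 0;
}
```
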
9ee5e73a3d Eevee: SSS: Refactor to use less memory and always use separate albedo
This refactor reduces the memory overhead of SSS and enables us to always
use a separate albedo. Previously we used 128 bits/px for SSS data plus
32 bits/px for albedo. Now we use 112 bits/px for SSS data and the separate
albedo altogether.

This refactor is needed for PCF shadow maps.

# Conflicts:
#	source/blender/gpu/shaders/gpu_shader_material.glsl
2019-09-02 16:38:14 +02:00
77a0ef91ba GPUFramebuffer: Bump max color attachments to 6 2019-09-02 16:30:48 +02:00
2056 changed files with 37714 additions and 216573 deletions


@@ -221,17 +221,6 @@ ForEachMacros:
- ITER_BEGIN
- ITER_PIXELS
- ITER_SLOTS
- ITER_SLOTS_BEGIN
- LOOP_EDITED_POINTS
- LOOP_KEYS
- LOOP_POINTS
- LOOP_SELECTED_KEYS
- LOOP_SELECTED_POINTS
- LOOP_TAGGED_KEYS
- LOOP_TAGGED_POINTS
- LOOP_UNSELECTED_POINTS
- LOOP_VISIBLE_KEYS
- LOOP_VISIBLE_POINTS
- LISTBASE_CIRCULAR_BACKWARD_BEGIN
- LISTBASE_CIRCULAR_FORWARD_BEGIN
- LISTBASE_FOREACH

.github/stale.yml vendored

@@ -1,22 +0,0 @@
# Configuration for probot-stale - https://github.com/probot/stale
# This file is used on Blender's GitHub mirror to automatically close any pull request
# and invite contributors to join the official development platform on blender.org
# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 1
# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
daysUntilClose: 1
# Label to use when marking as stale
staleLabel: stale
# Comment to post when closing a stale Issue or Pull Request.
closeComment: >
This issue has been automatically closed, because this repository is only
used as a mirror of git.blender.org. Blender development happens on
developer.blender.org.
To get started contributing code, please read:
https://wiki.blender.org/wiki/Process/Contributing_Code

.gitignore vendored

@@ -40,6 +40,3 @@ Desktop.ini
# in-source lib downloads
/build_files/build_environment/downloads
# in-source buildbot signing configuration
/build_files/buildbot/codesign/config_server.py


@@ -16,7 +16,6 @@
#
# The Original Code is Copyright (C) 2006, Blender Foundation
# All rights reserved.
#
# ***** END GPL LICENSE BLOCK *****
#-----------------------------------------------------------------------------
@@ -170,14 +169,6 @@ option_defaults_init(
_init_OPENSUBDIV
)
# TBB malloc is only supported on for windows currently
if(WIN32)
set(_init_TBB_MALLOC_PROXY ON)
else()
set(_init_TBB_MALLOC_PROXY OFF)
endif()
# customize...
if(UNIX AND NOT APPLE)
# some of these libraries are problematic on Linux
@@ -272,8 +263,6 @@ endif()
option(WITH_HEADLESS "Build without graphical support (renderfarm, server mode only)" OFF)
mark_as_advanced(WITH_HEADLESS)
option(WITH_QUADRIFLOW "Build with quadriflow remesher support" ON)
option(WITH_AUDASPACE "Build with blenders audio library (only disable if you know what you're doing!)" ON)
option(WITH_SYSTEM_AUDASPACE "Build with external audaspace library installed on the system (only enable if you know what you're doing!)" OFF)
mark_as_advanced(WITH_AUDASPACE)
@@ -315,6 +304,8 @@ endif()
option(WITH_MOD_FLUID "Enable Elbeem Modifier (Fluid Simulation)" ON)
option(WITH_MOD_SMOKE "Enable Smoke Modifier (Smoke Simulation)" ON)
option(WITH_MOD_REMESH "Enable Remesh Modifier" ON)
# option(WITH_MOD_CLOTH_ELTOPO "Enable Experimental cloth solver" OFF) # this is now only available in a branch
# mark_as_advanced(WITH_MOD_CLOTH_ELTOPO)
option(WITH_MOD_OCEANSIM "Enable Ocean Modifier" OFF)
# Image format support
@@ -377,6 +368,7 @@ if(WIN32)
option(WITH_INPUT_IME "Enable Input Method Editor (IME) for complex Asian character input" ON)
endif()
option(WITH_INPUT_NDOF "Enable NDOF input devices (SpaceNavigator and friends)" ${_init_INPUT_NDOF})
option(WITH_RAYOPTIMIZATION "Enable use of SIMD (SSE) optimizations for the raytracer" ON)
if(UNIX AND NOT APPLE)
option(WITH_INSTALL_PORTABLE "Install redistributeable runtime, otherwise install into CMAKE_INSTALL_PREFIX" ON)
option(WITH_STATIC_LIBS "Try to link with static libraries, as much as possible, to make blender more portable across distributions" OFF)
@@ -432,7 +424,6 @@ mark_as_advanced(WITH_CYCLES_DEBUG)
mark_as_advanced(WITH_CYCLES_NATIVE_ONLY)
option(WITH_CYCLES_DEVICE_CUDA "Enable Cycles CUDA compute support" ON)
option(WITH_CYCLES_DEVICE_OPTIX "Enable Cycles OptiX support" OFF)
option(WITH_CYCLES_DEVICE_OPENCL "Enable Cycles OpenCL compute support" ON)
option(WITH_CYCLES_NETWORK "Enable Cycles compute over network support (EXPERIMENTAL and unfinished)" OFF)
mark_as_advanced(WITH_CYCLES_DEVICE_CUDA)
@@ -466,15 +457,14 @@ mark_as_advanced(WITH_CXX_GUARDEDALLOC)
option(WITH_ASSERT_ABORT "Call abort() when raising an assertion through BLI_assert()" ON)
mark_as_advanced(WITH_ASSERT_ABORT)
option(WITH_BOOST "Enable features depending on boost" ON)
option(WITH_TBB "Enable features depending on TBB (OpenVDB, OpenImageDenoise, sculpt multithreading)" ON)
option(WITH_TBB_MALLOC_PROXY "Enable the TBB malloc replacement" ${_init_TBB_MALLOC_PROXY})
option(WITH_BOOST "Enable features depending on boost" ON)
# Unit testsing
option(WITH_GTESTS "Enable GTest unit testing" OFF)
option(WITH_OPENGL_RENDER_TESTS "Enable OpenGL render related unit testing (Experimental)" OFF)
option(WITH_OPENGL_DRAW_TESTS "Enable OpenGL UI drawing related unit testing (Experimental)" OFF)
# Documentation
if(UNIX AND NOT APPLE)
option(WITH_DOC_MANPAGE "Create a manual page (Unix manpage)" OFF)
@@ -564,6 +554,15 @@ if(WIN32)
option(WITH_WINDOWS_FIND_MODULES "Use find_package to locate libraries" OFF)
mark_as_advanced(WITH_WINDOWS_FIND_MODULES)
option(WITH_WINDOWS_CODESIGN "Use signtool to sign the final binary." OFF)
mark_as_advanced(WITH_WINDOWS_CODESIGN)
set(WINDOWS_CODESIGN_PFX CACHE FILEPATH "Path to pfx file to use for codesigning.")
mark_as_advanced(WINDOWS_CODESIGN_PFX)
set(WINDOWS_CODESIGN_PFX_PASSWORD CACHE STRING "password for pfx file used for codesigning.")
mark_as_advanced(WINDOWS_CODESIGN_PFX_PASSWORD)
option(WINDOWS_USE_VISUAL_STUDIO_PROJECT_FOLDERS "Organize the visual studio projects according to source folder structure." ON)
mark_as_advanced(WINDOWS_USE_VISUAL_STUDIO_PROJECT_FOLDERS)
@@ -582,15 +581,6 @@ if("${CMAKE_GENERATOR}" MATCHES "Ninja")
mark_as_advanced(WITH_NINJA_POOL_JOBS)
endif()
if(UNIX AND NOT APPLE)
option(WITH_CXX11_ABI "Use native C++11 ABI of compiler" ON)
mark_as_advanced(WITH_CXX11_ABI)
endif()
# Installation process.
option(POSTINSTALL_SCRIPT "Run given CMake script after installation process" OFF)
mark_as_advanced(POSTINSTALL_SCRIPT)
# avoid using again
option_defaults_clear()
@@ -683,7 +673,6 @@ if(NOT WITH_BOOST)
set_and_warn(WITH_INTERNATIONAL OFF)
set_and_warn(WITH_OPENVDB OFF)
set_and_warn(WITH_OPENCOLORIO OFF)
set_and_warn(WITH_QUADRIFLOW OFF)
elseif(WITH_CYCLES OR WITH_OPENIMAGEIO OR WITH_INTERNATIONAL OR
WITH_OPENVDB OR WITH_OPENCOLORIO)
# Keep enabled
@@ -763,12 +752,14 @@ if(NOT WITH_CUDA_DYNLOAD)
endif()
#-----------------------------------------------------------------------------
# Check check if submodules are cloned
# Check for valid directories
# ... a partial checkout may cause this.
#
# note: we need to check for a known subdir in both cases.
# since uninitialized git submodules will give blank dirs
if(WITH_INTERNATIONAL)
file(GLOB RESULT "${CMAKE_SOURCE_DIR}/release/datafiles/locale")
list(LENGTH RESULT DIR_LEN)
if(DIR_LEN EQUAL 0)
if(NOT EXISTS "${CMAKE_SOURCE_DIR}/release/datafiles/locale/languages")
message(WARNING
"Translation path '${CMAKE_SOURCE_DIR}/release/datafiles/locale' is missing, "
"This is a 'git submodule', which are known not to work with bridges to other version "
@@ -790,9 +781,7 @@ if(WITH_PYTHON)
message(FATAL_ERROR "At least Python 3.7 is required to build")
endif()
file(GLOB RESULT "${CMAKE_SOURCE_DIR}/release/scripts/addons")
list(LENGTH RESULT DIR_LEN)
if(DIR_LEN EQUAL 0)
if(NOT EXISTS "${CMAKE_SOURCE_DIR}/release/scripts/addons/modules")
message(WARNING
"Addons path '${CMAKE_SOURCE_DIR}/release/scripts/addons' is missing, "
"This is a 'git submodule', which are known not to work with bridges to other version "
@@ -848,7 +837,7 @@ if(NOT CMAKE_BUILD_TYPE MATCHES "Release")
endif()
#-----------------------------------------------------------------------------
# Platform specifics
#Platform specifics
if(WITH_X11)
find_package(X11 REQUIRED)
@@ -1190,7 +1179,6 @@ if(WITH_SYSTEM_GLEW)
message(FATAL_ERROR "GLEW is required to build Blender. Install it or disable WITH_SYSTEM_GLEW.")
endif()
set(GLEW_INCLUDE_PATH "${GLEW_INCLUDE_DIR}")
set(BLENDER_GLEW_LIBRARIES ${GLEW_LIBRARY})
else()
if(WITH_GLEW_ES)
@@ -1403,16 +1391,13 @@ if(CMAKE_COMPILER_IS_GNUCC)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_LOGICAL_OP -Wlogical-op)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_UNDEF -Wundef)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_INIT_SELF -Winit-self) # needs -Wuninitialized
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_NO_NULL -Wnonnull) # C only
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_MISSING_INCLUDE_DIRS -Wmissing-include-dirs)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_NO_DIV_BY_ZERO -Wno-div-by-zero)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_TYPE_LIMITS -Wtype-limits)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_FORMAT_SIGN -Wformat-signedness)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_RESTRICT -Wrestrict)
# C-only.
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_NO_NULL -Wnonnull)
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_ABSOLUTE_VALUE -Wabsolute-value)
# gcc 4.2 gives annoying warnings on every file with this
if(NOT "${CMAKE_C_COMPILER_VERSION}" VERSION_LESS "4.3")
ADD_CHECK_C_COMPILER_FLAG(C_WARNINGS C_WARN_UNINITIALIZED -Wuninitialized)
@@ -1581,7 +1566,7 @@ elseif(CMAKE_C_COMPILER_ID MATCHES "MSVC")
if(MSVC_VERSION GREATER_EQUAL 1911)
# see https://docs.microsoft.com/en-us/cpp/error-messages/compiler-warnings/c5038?view=vs-2017
set(_WARNINGS "${_WARNINGS} /w35038") # order of initialization in c++ constructors
set(_WARNINGS "${_WARNINGS} /w35038") #order of initialisation in c++ constructors
endif()
string(REPLACE ";" " " _WARNINGS "${_WARNINGS}")
@@ -1621,19 +1606,15 @@ if(WITH_PYTHON)
endif()
endif()
if(MSVC)
# MSVC needs to be tested first, since clang on windows will
# match the compiler test below but clang-cl does not accept -std=c++11
# since it is on by default and cannot be turned off.
#
# Nothing special is needed, C++11 features are available by default.
elseif(
if(
CMAKE_COMPILER_IS_GNUCC OR
CMAKE_C_COMPILER_ID MATCHES "Clang" OR
CMAKE_C_COMPILER_ID MATCHES "Intel"
)
# TODO(sergey): Do we want c++11 or gnu-c++11 here?
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")
elseif(MSVC)
# Nothing special is needed, C++11 features are available by default.
else()
message(FATAL_ERROR "Unknown compiler ${CMAKE_C_COMPILER_ID}, can't enable C++11 build")
endif()
@@ -1649,12 +1630,6 @@ if(
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -std=gnu11")
endif()
if(UNIX AND NOT APPLE)
if(NOT WITH_CXX11_ABI)
set(PLATFORM_CFLAGS "${PLATFORM_CFLAGS} -D_GLIBCXX_USE_CXX11_ABI=0")
endif()
endif()
# Include warnings first, so its possible to disable them with user defined flags
# eg: -Wno-uninitialized
set(CMAKE_C_FLAGS "${C_WARNINGS} ${CMAKE_C_FLAGS} ${PLATFORM_CFLAGS}")
@@ -1790,11 +1765,11 @@ if(FIRST_RUN)
info_cfg_option(WITH_OPENIMAGEDENOISE)
info_cfg_option(WITH_OPENVDB)
info_cfg_option(WITH_ALEMBIC)
info_cfg_option(WITH_QUADRIFLOW)
info_cfg_text("Compiler Options:")
info_cfg_option(WITH_BUILDINFO)
info_cfg_option(WITH_OPENMP)
info_cfg_option(WITH_RAYOPTIMIZATION)
info_cfg_text("System Options:")
info_cfg_option(WITH_INSTALL_PORTABLE)


@@ -62,7 +62,8 @@ Testing Targets
Not associated with building Blender.
* test:
Run automated tests with ctest.
Run ctest, currently tests import/export,
operator execution and that python modules load
* test_cmake:
Runs our own cmake file checker
which detects errors in the cmake file list definitions
@@ -118,7 +119,7 @@ Utilities
Example
make icons_geom BLENDER_BIN=/path/to/blender
* source_archive:
* tgz:
Create a compressed archive of the source code.
* update:
@@ -163,8 +164,9 @@ CPU:=$(shell uname -m)
BLENDER_DIR:=$(shell pwd -P)
BUILD_TYPE:=Release
# CMake arguments, assigned to local variable to make it mutable.
CMAKE_CONFIG_ARGS := $(BUILD_CMAKE_ARGS)
ifndef BUILD_CMAKE_ARGS
BUILD_CMAKE_ARGS:=
endif
ifndef BUILD_DIR
BUILD_DIR:=$(shell dirname "$(BLENDER_DIR)")/build_$(OS_NCASE)
@@ -191,16 +193,6 @@ ifndef PYTHON
PYTHON:=python3
endif
# For macOS python3 is not installed by default, so fallback to python binary
# in libraries, or python 2 for running make update to get it.
ifeq ($(OS_NCASE),darwin)
ifeq (, $(shell command -v $(PYTHON)))
PYTHON:=../lib/darwin/python/bin/python3.7m
ifeq (, $(shell command -v $(PYTHON)))
PYTHON:=python
endif
endif
endif
# -----------------------------------------------------------------------------
# additional targets for the build configuration
@@ -212,34 +204,34 @@ ifneq "$(findstring debug, $(MAKECMDGOALS))" ""
endif
ifneq "$(findstring full, $(MAKECMDGOALS))" ""
BUILD_DIR:=$(BUILD_DIR)_full
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/blender_full.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/blender_full.cmake"
endif
ifneq "$(findstring lite, $(MAKECMDGOALS))" ""
BUILD_DIR:=$(BUILD_DIR)_lite
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/blender_lite.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/blender_lite.cmake"
endif
ifneq "$(findstring cycles, $(MAKECMDGOALS))" ""
BUILD_DIR:=$(BUILD_DIR)_cycles
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/cycles_standalone.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/cycles_standalone.cmake"
endif
ifneq "$(findstring headless, $(MAKECMDGOALS))" ""
BUILD_DIR:=$(BUILD_DIR)_headless
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/blender_headless.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/blender_headless.cmake"
endif
ifneq "$(findstring bpy, $(MAKECMDGOALS))" ""
BUILD_DIR:=$(BUILD_DIR)_bpy
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/bpy_module.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/bpy_module.cmake"
endif
ifneq "$(findstring developer, $(MAKECMDGOALS))" ""
CMAKE_CONFIG_ARGS:=-C"$(BLENDER_DIR)/build_files/cmake/config/blender_developer.cmake" $(CMAKE_CONFIG_ARGS)
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -C"$(BLENDER_DIR)/build_files/cmake/config/blender_developer.cmake"
endif
# -----------------------------------------------------------------------------
# build tool
ifneq "$(findstring ninja, $(MAKECMDGOALS))" ""
CMAKE_CONFIG_ARGS:=$(CMAKE_CONFIG_ARGS) -G Ninja
BUILD_CMAKE_ARGS:=$(BUILD_CMAKE_ARGS) -G Ninja
BUILD_COMMAND:=ninja
DEPS_BUILD_COMMAND:=ninja
else
@@ -275,10 +267,7 @@ ifndef NPROCS
ifeq ($(OS), Linux)
NPROCS:=$(shell nproc)
endif
ifeq ($(OS), NetBSD)
NPROCS:=$(shell getconf NPROCESSORS_ONLN)
endif
ifneq (,$(filter $(OS),Darwin FreeBSD))
ifneq (,$(filter $(OS),Darwin FreeBSD NetBSD))
NPROCS:=$(shell sysctl -n hw.ncpu)
endif
endif
@@ -287,7 +276,7 @@ endif
# -----------------------------------------------------------------------------
# Macro for configuring cmake
CMAKE_CONFIG = cmake $(CMAKE_CONFIG_ARGS) \
CMAKE_CONFIG = cmake $(BUILD_CMAKE_ARGS) \
-H"$(BLENDER_DIR)" \
-B"$(BUILD_DIR)" \
-DCMAKE_BUILD_TYPE_INIT:STRING=$(BUILD_TYPE)
@@ -386,7 +375,7 @@ package_archive: .FORCE
# Tests
#
test: .FORCE
$(PYTHON) ./build_files/utils/make_test.py "$(BUILD_DIR)"
python3 ./build_files/utils/make_test.py "$(BUILD_DIR)"
# run pep8 check check on scripts we distribute.
test_pep8: .FORCE
@@ -527,8 +516,8 @@ check_descriptions: .FORCE
# Utilities
#
source_archive: .FORCE
./build_files/utils/make_source_archive.sh
tgz: .FORCE
./build_files/utils/build_tgz.sh
INKSCAPE_BIN?="inkscape"
icons: .FORCE
@@ -542,11 +531,11 @@ icons_geom: .FORCE
"$(BLENDER_DIR)/release/datafiles/blender_icons_geom_update.py"
update: .FORCE
$(PYTHON) ./build_files/utils/make_update.py
python3 ./build_files/utils/make_update.py
format: .FORCE
PATH="../lib/${OS_NCASE}/llvm/bin/:$(PATH)" \
$(PYTHON) source/tools/utils_maintenance/clang_format_paths.py $(PATHS)
python3 source/tools/utils_maintenance/clang_format_paths.py $(PATHS)
# -----------------------------------------------------------------------------


@@ -114,7 +114,7 @@ if(WIN32)
include(cmake/tinyxml.cmake)
include(cmake/yamlcpp.cmake)
# LCMS is an OCIO dep, but only if you build the apps, leaving it here for convenience
# include(cmake/lcms.cmake)
#include(cmake/lcms.cmake)
endif()
@@ -128,7 +128,6 @@ if(NOT WIN32 OR ENABLE_MINGW64)
include(cmake/ogg.cmake)
include(cmake/vorbis.cmake)
include(cmake/theora.cmake)
include(cmake/opus.cmake)
include(cmake/vpx.cmake)
include(cmake/x264.cmake)
include(cmake/xvidcore.cmake)
@@ -158,9 +157,4 @@ if(UNIX)
include(cmake/sqlite.cmake)
endif()
if(UNIX AND NOT APPLE)
include(cmake/libglu.cmake)
include(cmake/mesa.cmake)
endif()
include(cmake/harvest.cmake)


@@ -40,7 +40,7 @@ set(ALEMBIC_EXTRA_ARGS
-DBoost_USE_MULTITHREADED=ON
-DUSE_STATIC_BOOST=On
-DBoost_USE_STATIC_LIBS=ON
-DBoost_USE_STATIC_RUNTIME=OFF
-DBoost_USE_STATIC_RUNTIME=ON
-DBoost_DEBUG=ON
-DBOOST_ROOT=${LIBDIR}/boost
-DBoost_NO_SYSTEM_PATHS=ON


@@ -30,8 +30,8 @@ set(BLOSC_EXTRA_ARGS
)
if(WIN32)
# Prevent blosc from including it's own local copy of zlib in the object file
# and cause linker errors with everybody else.
#prevent blosc from including it's own local copy of zlib in the object file
#and cause linker errors with everybody else
set(BLOSC_EXTRA_ARGS ${BLOSC_EXTRA_ARGS}
-DPREFER_EXTERNAL_ZLIB=ON
)


@@ -29,13 +29,13 @@ if(WIN32)
set(PYTHON_OUTPUTDIR ${BUILD_DIR}/python/src/external_python/pcbuild/win32/)
set(BOOST_ADDRESS_MODEL 32)
endif()
set(BOOST_TOOLSET toolset=msvc-14.1)
set(BOOST_COMPILER_STRING -vc141)
if(MSVC14)
set(BOOST_TOOLSET toolset=msvc-14.0)
set(BOOST_COMPILER_STRING -vc140)
endif()
set(BOOST_CONFIGURE_COMMAND bootstrap.bat)
set(BOOST_BUILD_COMMAND bjam)
set(BOOST_BUILD_OPTIONS runtime-link=shared )
set(BOOST_BUILD_OPTIONS runtime-link=static )
set(BOOST_HARVEST_CMD ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/boost/lib/ ${HARVEST_TARGET}/boost/lib/ )
if(BUILD_MODE STREQUAL Release)
set(BOOST_HARVEST_CMD ${BOOST_HARVEST_CMD} && ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/boost/include/boost-1_68/ ${HARVEST_TARGET}/boost/include/)


@@ -17,16 +17,10 @@
# ***** END GPL LICENSE BLOCK *****
if(UNIX)
if(APPLE)
set(_libtoolize_name glibtoolize)
else()
set(_libtoolize_name libtoolize)
endif()
set(_required_software
autoconf
automake
${_libtoolize_name}
libtoolize
nasm
yasm
tclsh


@@ -19,8 +19,8 @@
set(CLANG_EXTRA_ARGS
-DCLANG_PATH_TO_LLVM_SOURCE=${BUILD_DIR}/ll/src/ll
-DCLANG_PATH_TO_LLVM_BUILD=${LIBDIR}/llvm
-DLLVM_USE_CRT_RELEASE=MD
-DLLVM_USE_CRT_DEBUG=MDd
-DLLVM_USE_CRT_RELEASE=MT
-DLLVM_USE_CRT_DEBUG=MTd
-DLLVM_CONFIG=${LIBDIR}/llvm/bin/llvm-config
)


@@ -16,10 +16,10 @@
#
# ***** END GPL LICENSE BLOCK *****
set(FFMPEG_CFLAGS "-I${mingw_LIBDIR}/lame/include -I${mingw_LIBDIR}/openjpeg/include/ -I${mingw_LIBDIR}/ogg/include -I${mingw_LIBDIR}/vorbis/include -I${mingw_LIBDIR}/theora/include -I${mingw_LIBDIR}/opus/include -I${mingw_LIBDIR}/vpx/include -I${mingw_LIBDIR}/x264/include -I${mingw_LIBDIR}/xvidcore/include -I${mingw_LIBDIR}/zlib/include")
set(FFMPEG_LDFLAGS "-L${mingw_LIBDIR}/lame/lib -L${mingw_LIBDIR}/openjpeg/lib -L${mingw_LIBDIR}/ogg/lib -L${mingw_LIBDIR}/vorbis/lib -L${mingw_LIBDIR}/theora/lib -L${mingw_LIBDIR}/opus/lib -L${mingw_LIBDIR}/vpx/lib -L${mingw_LIBDIR}/x264/lib -L${mingw_LIBDIR}/xvidcore/lib -L${mingw_LIBDIR}/zlib/lib")
set(FFMPEG_CFLAGS "-I${mingw_LIBDIR}/lame/include -I${mingw_LIBDIR}/openjpeg/include/ -I${mingw_LIBDIR}/ogg/include -I${mingw_LIBDIR}/vorbis/include -I${mingw_LIBDIR}/theora/include -I${mingw_LIBDIR}/vpx/include -I${mingw_LIBDIR}/x264/include -I${mingw_LIBDIR}/xvidcore/include -I${mingw_LIBDIR}/zlib/include")
set(FFMPEG_LDFLAGS "-L${mingw_LIBDIR}/lame/lib -L${mingw_LIBDIR}/openjpeg/lib -L${mingw_LIBDIR}/ogg/lib -L${mingw_LIBDIR}/vorbis/lib -L${mingw_LIBDIR}/theora/lib -L${mingw_LIBDIR}/vpx/lib -L${mingw_LIBDIR}/x264/lib -L${mingw_LIBDIR}/xvidcore/lib -L${mingw_LIBDIR}/zlib/lib")
set(FFMPEG_EXTRA_FLAGS --pkg-config-flags=--static --extra-cflags=${FFMPEG_CFLAGS} --extra-ldflags=${FFMPEG_LDFLAGS})
set(FFMPEG_ENV PKG_CONFIG_PATH=${mingw_LIBDIR}/openjpeg/lib/pkgconfig:${mingw_LIBDIR}/x264/lib/pkgconfig:${mingw_LIBDIR}/vorbis/lib/pkgconfig:${mingw_LIBDIR}/ogg/lib/pkgconfig:${mingw_LIBDIR}:${mingw_LIBDIR}/vpx/lib/pkgconfig:${mingw_LIBDIR}/theora/lib/pkgconfig:${mingw_LIBDIR}/openjpeg/lib/pkgconfig:${mingw_LIBDIR}/opus/lib/pkgconfig:)
set(FFMPEG_ENV PKG_CONFIG_PATH=${mingw_LIBDIR}/openjpeg/lib/pkgconfig:${mingw_LIBDIR}/x264/lib/pkgconfig:${mingw_LIBDIR}/vorbis/lib/pkgconfig:${mingw_LIBDIR}/ogg/lib/pkgconfig:${mingw_LIBDIR})
if(WIN32)
set(FFMPEG_ENV set ${FFMPEG_ENV} &&)
@@ -73,7 +73,6 @@ ExternalProject_Add(external_ffmpeg
--disable-libgsm
--disable-libspeex
--enable-libvpx
--enable-libopus
--prefix=${LIBDIR}/ffmpeg
--enable-libtheora
--enable-libvorbis
@@ -131,7 +130,6 @@ add_dependencies(
external_openjpeg
external_xvidcore
external_x264
external_opus
external_vpx
external_theora
external_vorbis


@@ -52,7 +52,9 @@ if(BUILD_MODE STREQUAL Release)
${CMAKE_COMMAND} -E copy ${LIBDIR}/tiff/lib/tiff.lib ${HARVEST_TARGET}/tiff/lib/libtiff.lib &&
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/tiff/include/ ${HARVEST_TARGET}/tiff/include/ &&
# hidapi
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/hidapi/ ${HARVEST_TARGET}/hidapi/
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/hidapi/ ${HARVEST_TARGET}/hidapi/ &&
# webp, straight up copy
${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/webp ${HARVEST_TARGET}/webp &&
DEPENDS
)
endif()
@@ -190,7 +192,6 @@ harvest(theora/lib ffmpeg/lib "*.a")
harvest(tiff/include tiff/include "*.h")
harvest(tiff/lib tiff/lib "*.a")
harvest(vorbis/lib ffmpeg/lib "*.a")
harvest(opus/lib ffmpeg/lib "*.a")
harvest(vpx/lib ffmpeg/lib "*.a")
harvest(webp/lib ffmpeg/lib "*.a")
harvest(x264/lib ffmpeg/lib "*.a")
@@ -198,9 +199,4 @@ harvest(xvidcore/lib ffmpeg/lib "*.a")
harvest(embree/include embree/include "*.h")
harvest(embree/lib embree/lib "*.a")
if(UNIX AND NOT APPLE)
harvest(libglu/lib mesa/lib "*.so*")
harvest(mesa/lib mesa/lib "*.so*")
endif()
endif()


@@ -24,7 +24,7 @@ ExternalProject_Add(external_lcms
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${LCMS_HASH}
PREFIX ${BUILD_DIR}/lcms
# Patch taken from ocio.
#patch taken from ocio
PATCH_COMMAND ${CMAKE_COMMAND} -E copy ${PATCH_DIR}/cmakelists_lcms.txt ${BUILD_DIR}/lcms/src/external_lcms/CMakeLists.txt
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/lcms ${DEFAULT_CMAKE_FLAGS} ${LCMS_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/lcms


@@ -1,40 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
set(LIBGLU_CFLAGS "-static-libgcc")
set(LIBGLU_CXXFLAGS "-static-libgcc -static-libstdc++ -Bstatic -lstdc++ -Bdynamic -l:libstdc++.a")
set(LIBGLU_LDFLAGS "-pthread -static-libgcc -static-libstdc++ -Bstatic -lstdc++ -Bdynamic -l:libstdc++.a")
set(LIBGLU_EXTRA_FLAGS
CFLAGS=${LIBGLU_CFLAGS}
CXXFLAGS=${LIBGLU_CXXFLAGS}
LDFLAGS=${LIBGLU_LDFLAGS}
)
ExternalProject_Add(external_libglu
URL ${LIBGLU_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${LIBGLU_HASH}
PREFIX ${BUILD_DIR}/libglu
CONFIGURE_COMMAND ${CONFIGURE_ENV} &&
cd ${BUILD_DIR}/libglu/src/external_libglu/ &&
${CONFIGURE_COMMAND_NO_TARGET} --prefix=${LIBDIR}/libglu ${LIBGLU_EXTRA_FLAGS}
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/libglu/src/external_libglu/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/libglu/src/external_libglu/ && make install
INSTALL_DIR ${LIBDIR}/libglu
)


@@ -17,8 +17,8 @@
# ***** END GPL LICENSE BLOCK *****
set(LLVM_EXTRA_ARGS
-DLLVM_USE_CRT_RELEASE=MD
-DLLVM_USE_CRT_DEBUG=MDd
-DLLVM_USE_CRT_RELEASE=MT
-DLLVM_USE_CRT_DEBUG=MTd
-DLLVM_INCLUDE_TESTS=OFF
-DLLVM_TARGETS_TO_BUILD=X86
-DLLVM_INCLUDE_EXAMPLES=OFF


@@ -1,54 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
set(MESA_CFLAGS "-static-libgcc")
set(MESA_CXXFLAGS "-static-libgcc -static-libstdc++ -Bstatic -lstdc++ -Bdynamic -l:libstdc++.a")
set(MESA_LDFLAGS "-L${LIBDIR}/zlib/lib -pthread -static-libgcc -static-libstdc++ -Bstatic -lstdc++ -Bdynamic -l:libstdc++.a -l:libz_pic.a")
set(MESA_EXTRA_FLAGS
CFLAGS=${MESA_CFLAGS}
CXXFLAGS=${MESA_CXXFLAGS}
LDFLAGS=${MESA_LDFLAGS}
--enable-glx=gallium-xlib
--with-gallium-drivers=swrast
--disable-dri
--disable-gbm
--disable-egl
--disable-gles1
--disable-gles2
--disable-llvm-shared-libs
--with-llvm-prefix=${LIBDIR}/llvm
)
ExternalProject_Add(external_mesa
URL ${MESA_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${MESA_HASH}
PREFIX ${BUILD_DIR}/mesa
CONFIGURE_COMMAND ${CONFIGURE_ENV} &&
cd ${BUILD_DIR}/mesa/src/external_mesa/ &&
${CONFIGURE_COMMAND_NO_TARGET} --prefix=${LIBDIR}/mesa ${MESA_EXTRA_FLAGS}
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/mesa/src/external_mesa/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/mesa/src/external_mesa/ && make install
INSTALL_DIR ${LIBDIR}/mesa
)
add_dependencies(
external_mesa
ll
)


@@ -50,7 +50,7 @@ if(WIN32)
-DUSE_EXTERNAL_LCMS=ON
-DINC_1=${LIBDIR}/tinyxml/include
-DINC_2=${LIBDIR}/yamlcpp/include
# Lie because ocio cmake is demanding boost even though it is not needed.
#lie because ocio cmake is demanding boost even though it is not needed
-DYAML_CPP_VERSION=0.5.0
)
else()
@@ -95,7 +95,7 @@ if(WIN32)
ExternalProject_Add_Step(external_opencolorio after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/opencolorio/include ${HARVEST_TARGET}/opencolorio/include
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/opencolorio/lib/static ${HARVEST_TARGET}/opencolorio/lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/yamlcpp/lib/libyaml-cppmd.lib ${HARVEST_TARGET}/opencolorio/lib/libyaml-cpp.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/yamlcpp/lib/libyaml-cppmt.lib ${HARVEST_TARGET}/opencolorio/lib/libyaml-cpp.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tinyxml/lib/tinyxml.lib ${HARVEST_TARGET}/opencolorio/lib/tinyxml.lib
DEPENDEES install
)
@@ -103,7 +103,7 @@ if(WIN32)
if(BUILD_MODE STREQUAL Debug)
ExternalProject_Add_Step(external_opencolorio after_install
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/opencolorio/lib/static/Opencolorio.lib ${HARVEST_TARGET}/opencolorio/lib/OpencolorIO_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/yamlcpp/lib/libyaml-cppmdd.lib ${HARVEST_TARGET}/opencolorio/lib/libyaml-cpp_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/yamlcpp/lib/libyaml-cppmtd.lib ${HARVEST_TARGET}/opencolorio/lib/libyaml-cpp_d.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tinyxml/lib/tinyxml.lib ${HARVEST_TARGET}/opencolorio/lib/tinyxml_d.lib
DEPENDEES install
)


@@ -44,7 +44,7 @@ if(WIN32)
ExternalProject_Add_Step(external_openexr after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/openexr/lib ${HARVEST_TARGET}/openexr/lib
# Libs have moved between versions, just duplicate it for now.
#libs have moved between versions, just duplicate it for now.
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/openexr/lib ${LIBDIR}/ilmbase/lib
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/openexr/include ${HARVEST_TARGET}/openexr/include
DEPENDEES install


@@ -69,7 +69,7 @@ set(OPENIMAGEIO_EXTRA_ARGS
-DBoost_COMPILER:STRING=${BOOST_COMPILER_STRING}
-DBoost_USE_MULTITHREADED=ON
-DBoost_USE_STATIC_LIBS=ON
-DBoost_USE_STATIC_RUNTIME=OFF
-DBoost_USE_STATIC_RUNTIME=ON
-DBOOST_ROOT=${LIBDIR}/boost
-DBOOST_LIBRARYDIR=${LIBDIR}/boost/lib/
-DBoost_NO_SYSTEM_PATHS=ON


@@ -38,7 +38,7 @@ ExternalProject_Add(external_openjpeg
INSTALL_DIR ${LIBDIR}/openjpeg
)
# On windows ffmpeg wants a mingw build, while oiio needs a msvc build.
#on windows ffmpeg wants a mingw build, while oiio needs a msvc build
if(MSVC)
set(OPENJPEG_EXTRA_ARGS ${DEFAULT_CMAKE_FLAGS})
ExternalProject_Add(external_openjpeg_msvc


@@ -41,6 +41,7 @@ if(WIN32)
-DCLEW_LIBRARY=${LIBDIR}/clew/lib/clew${LIBEXT}
-DCUEW_INCLUDE_DIR=${LIBDIR}/cuew/include
-DCUEW_LIBRARY=${LIBDIR}/cuew/lib/cuew${LIBEXT}
-DCMAKE_EXE_LINKER_FLAGS_RELEASE=libcmt.lib
)
if("${CMAKE_SIZEOF_VOID_P}" EQUAL "8")
set(OPENSUBDIV_EXTRA_ARGS


@@ -24,7 +24,7 @@ set(OPENVDB_EXTRA_ARGS
-DBoost_COMPILER:STRING=${BOOST_COMPILER_STRING}
-DBoost_USE_MULTITHREADED=ON
-DBoost_USE_STATIC_LIBS=ON
-DBoost_USE_STATIC_RUNTIME=OFF
-DBoost_USE_STATIC_RUNTIME=ON
-DBOOST_ROOT=${LIBDIR}/boost
-DBoost_NO_SYSTEM_PATHS=ON
-DZLIB_LIBRARY=${LIBDIR}/zlib/lib/${ZLIB_LIBRARY}


@@ -62,22 +62,22 @@ if(WIN32)
endif()
set(COMMON_MSVC_FLAGS "${COMMON_MSVC_FLAGS} /bigobj")
if(WITH_OPTIMIZED_DEBUG)
set(BLENDER_CMAKE_C_FLAGS_DEBUG "/MDd ${COMMON_MSVC_FLAGS} /O2 /Ob2 /DNDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_DEBUG "/MTd ${COMMON_MSVC_FLAGS} /O2 /Ob2 /DNDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
else()
set(BLENDER_CMAKE_C_FLAGS_DEBUG "/MDd ${COMMON_MSVC_FLAGS} /Zi /Ob0 /Od /RTC1 /D_DEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_DEBUG "/MTd ${COMMON_MSVC_FLAGS} /Zi /Ob0 /Od /RTC1 /D_DEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
endif()
set(BLENDER_CMAKE_C_FLAGS_MINSIZEREL "/MD ${COMMON_MSVC_FLAGS} /O1 /Ob1 /D NDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_RELEASE "/MD ${COMMON_MSVC_FLAGS} /O2 /Ob2 /DNDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_RELWITHDEBINFO "/MD ${COMMON_MSVC_FLAGS} /Zi /O2 /Ob1 /D NDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_MINSIZEREL "/MT ${COMMON_MSVC_FLAGS} /O1 /Ob1 /D NDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_RELEASE "/MT ${COMMON_MSVC_FLAGS} /O2 /Ob2 /DNDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_C_FLAGS_RELWITHDEBINFO "/MT ${COMMON_MSVC_FLAGS} /Zi /O2 /Ob1 /D NDEBUG /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
if(WITH_OPTIMIZED_DEBUG)
set(BLENDER_CMAKE_CXX_FLAGS_DEBUG "/MDd ${COMMON_MSVC_FLAGS} /O2 /Ob2 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_DEBUG "/MTd ${COMMON_MSVC_FLAGS} /O2 /Ob2 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
else()
set(BLENDER_CMAKE_CXX_FLAGS_DEBUG "/D_DEBUG /D PLATFORM_WINDOWS /MTd ${COMMON_MSVC_FLAGS} /Zi /Ob0 /Od /RTC1 /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
endif()
set(BLENDER_CMAKE_CXX_FLAGS_MINSIZEREL "/MD /${COMMON_MSVC_FLAGS} /O1 /Ob1 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_RELEASE "/MD ${COMMON_MSVC_FLAGS} /O2 /Ob2 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_RELWITHDEBINFO "/MD ${COMMON_MSVC_FLAGS} /Zi /O2 /Ob1 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_MINSIZEREL "/MT /${COMMON_MSVC_FLAGS} /O1 /Ob1 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_RELEASE "/MT ${COMMON_MSVC_FLAGS} /O2 /Ob2 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(BLENDER_CMAKE_CXX_FLAGS_RELWITHDEBINFO "/MT ${COMMON_MSVC_FLAGS} /Zi /O2 /Ob1 /D NDEBUG /D PLATFORM_WINDOWS /DPSAPI_VERSION=1 /DOIIO_STATIC_BUILD /DTINYFORMAT_ALLOW_WCHAR_STRINGS")
set(PLATFORM_FLAGS)
set(PLATFORM_CXX_FLAGS)
@@ -97,8 +97,8 @@ if(WIN32)
set(CONFIGURE_ENV
cd ${MINGW_PATH} &&
call ${PERL_SHELL} &&
call ${MINGW_SHELL} &&
call ${PERL_SHELL} &&
set path &&
set CFLAGS=-g &&
set LDFLAGS=-Wl,--as-needed -static-libgcc
@@ -190,7 +190,7 @@ set(DEFAULT_CMAKE_FLAGS
)
if(WIN32)
# We need both flavors to build the thumbnail dlls
#we need both flavors to build the thumbnail dlls
if(MSVC12)
set(GENERATOR_32 "Visual Studio 12 2013")
set(GENERATOR_64 "Visual Studio 12 2013 Win64")


@@ -1,35 +0,0 @@
# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
ExternalProject_Add(external_opus
URL ${OPUS_URI}
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH SHA256=${OPUS_HASH}
PREFIX ${BUILD_DIR}/opus
CONFIGURE_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/opus/src/external_opus/ && ${CONFIGURE_COMMAND} --prefix=${LIBDIR}/opus
--disable-shared
--enable-static
--with-pic
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/opus/src/external_opus/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/opus/src/external_opus/ && make install
INSTALL_DIR ${LIBDIR}/opus
)
if(MSVC)
set_target_properties(external_opus PROPERTIES FOLDER Mingw)
endif()


@@ -40,7 +40,7 @@ set(OSL_EXTRA_ARGS
-DBoost_COMPILER:STRING=${BOOST_COMPILER_STRING}
-DBoost_USE_MULTITHREADED=ON
-DBoost_USE_STATIC_LIBS=ON
-DBoost_USE_STATIC_RUNTIME=OFF
-DBoost_USE_STATIC_RUNTIME=ON
-DBOOST_ROOT=${LIBDIR}/boost
-DBOOST_LIBRARYDIR=${LIBDIR}/boost/lib/
-DBoost_NO_SYSTEM_PATHS=ON


@@ -21,7 +21,7 @@ set(SNDFILE_ENV PKG_CONFIG_PATH=${mingw_LIBDIR}/ogg/lib/pkgconfig:${mingw_LIBDIR
if(WIN32)
set(SNDFILE_ENV set ${SNDFILE_ENV} &&)
# Shared for windows because static libs will drag in a libgcc dependency.
#shared for windows because static libs will drag in a libgcc dependency.
set(SNDFILE_OPTIONS --disable-static --enable-shared )
else()
set(SNDFILE_OPTIONS --enable-static --disable-shared )


@@ -15,21 +15,13 @@
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENSE BLOCK *****
if(WIN32)
set(TBB_EXTRA_ARGS
-DTBB_BUILD_SHARED=On
-DTBB_BUILD_TBBMALLOC=On
-DTBB_BUILD_TBBMALLOC_PROXY=On
-DTBB_BUILD_STATIC=On
set(TBB_EXTRA_ARGS
-DTBB_BUILD_SHARED=Off
-DTBB_BUILD_TBBMALLOC=On
-DTBB_BUILD_TBBMALLOC_PROXY=Off
-DTBB_BUILD_STATIC=On
)
else()
set(TBB_EXTRA_ARGS
-DTBB_BUILD_SHARED=Off
-DTBB_BUILD_TBBMALLOC=On
-DTBB_BUILD_TBBMALLOC_PROXY=Off
-DTBB_BUILD_STATIC=On
)
endif()
# CMake script for TBB from https://github.com/wjakob/tbb/blob/master/CMakeLists.txt
ExternalProject_Add(external_tbb
@@ -47,10 +39,6 @@ if(WIN32)
if(BUILD_MODE STREQUAL Release)
ExternalProject_Add_Step(external_tbb after_install
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb_static.lib ${HARVEST_TARGET}/tbb/lib/tbb.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.dll ${HARVEST_TARGET}/tbb/lib/tbbmalloc.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc_proxy.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.dll ${HARVEST_TARGET}/tbb/lib/tbbmalloc_proxy.dll
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/tbb/include/ ${HARVEST_TARGET}/tbb/include/
DEPENDEES install
)
@@ -58,9 +46,6 @@ if(WIN32)
if(BUILD_MODE STREQUAL Debug)
ExternalProject_Add_Step(external_tbb after_install
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbb_static.lib ${HARVEST_TARGET}/tbb/lib/tbb_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.lib ${HARVEST_TARGET}/tbb/lib/tbbmalloc_proxy_debug.lib
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc.dll ${HARVEST_TARGET}/tbb/lib/debug/tbbmalloc.dll
COMMAND ${CMAKE_COMMAND} -E copy ${LIBDIR}/tbb/lib/tbbmalloc_proxy.dll ${HARVEST_TARGET}/tbb/lib/debug/tbbmalloc_proxy.dll
DEPENDEES install
)
endif()


@@ -16,7 +16,7 @@
#
# ***** END GPL LICENSE BLOCK *****
if(UNIX)
if (UNIX)
set(THEORA_CONFIGURE_ENV ${CONFIGURE_ENV} && export HAVE_PDFLATEX=no)
else()
set(THEORA_CONFIGURE_ENV ${CONFIGURE_ENV})


@@ -24,7 +24,7 @@ ExternalProject_Add(external_tinyxml
DOWNLOAD_DIR ${DOWNLOAD_DIR}
URL_HASH MD5=${TINYXML_HASH}
PREFIX ${BUILD_DIR}/tinyxml
# patch taken from ocio
#patch taken from ocio
PATCH_COMMAND ${PATCH_CMD} -p 1 -N -d ${BUILD_DIR}/tinyxml/src/external_tinyxml < ${PATCH_DIR}/tinyxml.diff
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/tinyxml ${DEFAULT_CMAKE_FLAGS} ${TINYXML_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/tinyxml


@@ -61,7 +61,7 @@ set(ILMBASE_URI https://github.com/openexr/openexr/releases/download/v${ILMBASE_
set(ILMBASE_HASH 354bf86de3b930ab87ac63619d60c860)
set(OPENEXR_VERSION 2.3.0)
if(WIN32) # release 2.3.0 tarball has broken cmake support
if(WIN32) #release 2.3.0 tarball has broken cmake support
set(OPENEXR_URI https://github.com/openexr/openexr/archive/0ac2ea34c8f3134148a5df4052e40f155b76f6fb.tar.gz)
set(OPENEXR_HASH ed159435d508240712fbaaa21d94bafb)
else()
@@ -86,21 +86,21 @@ set(HDF5_VERSION 1.8.17)
set(HDF5_URI https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-1.8/hdf5-${HDF5_VERSION}/src/hdf5-${HDF5_VERSION}.tar.gz)
set(HDF5_HASH 7d572f8f3b798a628b8245af0391a0ca)
set(ALEMBIC_VERSION 1.7.12)
set(ALEMBIC_VERSION 1.7.8)
set(ALEMBIC_URI https://github.com/alembic/alembic/archive/${ALEMBIC_VERSION}.tar.gz)
set(ALEMBIC_MD5 e2b3777f23c5c09481a008cc6f0f8a40)
set(ALEMBIC_MD5 d095c2feb5e183b824904db7b63c1d30)
# hash is for 3.1.2
## hash is for 3.1.2
set(GLFW_GIT_UID 30306e54705c3adae9fe082c816a3be71963485c)
set(GLFW_URI https://github.com/glfw/glfw/archive/${GLFW_GIT_UID}.zip)
set(GLFW_HASH 20cacb1613da7eeb092f3ac4f6b2b3d0)
# latest uid in git as of 2016-04-01
#latest uid in git as of 2016-04-01
set(CLEW_GIT_UID 277db43f6cafe8b27c6f1055f69dc67da4aeb299)
set(CLEW_URI https://github.com/OpenCLWrangler/clew/archive/${CLEW_GIT_UID}.zip)
set(CLEW_HASH 2c699d10ed78362e71f56fae2a4c5f98)
# latest uid in git as of 2016-04-01
#latest uid in git as of 2016-04-01
set(CUEW_GIT_UID 1744972026de9cf27c8a7dc39cf39cd83d5f922f)
set(CUEW_URI https://github.com/CudaWrangler/cuew/archive/${CUEW_GIT_UID}.zip)
set(CUEW_HASH 86760d62978ebfd96cd93f5aa1abaf4a)
@@ -149,9 +149,9 @@ set(PYTHON_SHORT_VERSION_NO_DOTS 37)
set(PYTHON_URI https://www.python.org/ftp/python/${PYTHON_VERSION}/Python-${PYTHON_VERSION}.tar.xz)
set(PYTHON_HASH d33e4aae66097051c2eca45ee3604803)
set(TBB_VERSION 2019_U9)
set(TBB_VERSION 2018_U5)
set(TBB_URI https://github.com/01org/tbb/archive/${TBB_VERSION}.tar.gz)
set(TBB_HASH 584edbec127c508f2cd5b6e79ad200fc)
set(TBB_HASH ff3ae09f8c23892fbc3008c39f78288f)
set(OPENVDB_VERSION 5.1.0)
set(OPENVDB_URI https://github.com/dreamworksanimation/openvdb/archive/v${OPENVDB_VERSION}.tar.gz)
@@ -192,10 +192,6 @@ set(VPX_VERSION 1.7.0)
set(VPX_URI https://github.com/webmproject/libvpx/archive/v${VPX_VERSION}/libvpx-v${VPX_VERSION}.tar.gz)
set(VPX_HASH 1fec931eb5c94279ad219a5b6e0202358e94a93a90cfb1603578c326abfc1238)
set(OPUS_VERSION 1.3.1)
set(OPUS_URI https://archive.mozilla.org/pub/opus/opus-${OPUS_VERSION}.tar.gz)
set(OPUS_HASH 65b58e1e25b2a114157014736a3d9dfeaad8d41be1c8179866f144a2fb44ff9d)
set(X264_URI http://download.videolan.org/pub/videolan/x264/snapshots/x264-snapshot-20180811-2245-stable.tar.bz2)
set(X264_HASH ae8a868a0e236a348b35d79f3ee80294b169d1195408b689f9851383661ed7aa)
@@ -203,7 +199,7 @@ set(XVIDCORE_VERSION 1.3.5)
set(XVIDCORE_URI http://downloads.xvid.org/downloads/xvidcore-${XVIDCORE_VERSION}.tar.gz)
set(XVIDCORE_HASH 165ba6a2a447a8375f7b06db5a3c91810181f2898166e7c8137401d7fc894cf0)
# This has to be in sync with the version in blenders /extern folder.
#this has to be in sync with the version in blenders /extern folder
set(OPENJPEG_VERSION 2.3.0)
set(OPENJPEG_SHORT_VERSION 2.3)
# Use slightly newer commit after release which includes a cmake fix
@@ -234,9 +230,9 @@ set(SNDFILE_VERSION 1.0.28)
set(SNDFILE_URI http://www.mega-nerd.com/libsndfile/files/libsndfile-${SNDFILE_VERSION}.tar.gz)
set(SNDFILE_HASH 646b5f98ce89ac60cdb060fcd398247c)
# set(HIDAPI_VERSION 0.8.0-rc1)
# set(HIDAPI_URI https://github.com/signal11/hidapi/archive/hidapi-${HIDAPI_VERSION}.tar.gz)
# set(HIDAPI_HASH 069f9dd746edc37b6b6d0e3656f47199)
#set(HIDAPI_VERSION 0.8.0-rc1)
#set(HIDAPI_URI https://github.com/signal11/hidapi/archive/hidapi-${HIDAPI_VERSION}.tar.gz)
#set(HIDAPI_HASH 069f9dd746edc37b6b6d0e3656f47199)
set(HIDAPI_UID 89a6c75dc6f45ecabd4ddfbd2bf5ba6ad8ba38b5)
set(HIDAPI_URI https://github.com/TheOnlyJoey/hidapi/archive/${HIDAPI_UID}.zip)
@@ -263,9 +259,9 @@ set(TINYXML_VERSION_DOTS 2.6.2)
set(TINYXML_URI https://nchc.dl.sourceforge.net/project/tinyxml/tinyxml/${TINYXML_VERSION_DOTS}/tinyxml_${TINYXML_VERSION}.tar.gz)
set(TINYXML_HASH c1b864c96804a10526540c664ade67f0)
set(YAMLCPP_VERSION 0.6.3)
set(YAMLCPP_VERSION 0.6.2)
set(YAMLCPP_URI https://codeload.github.com/jbeder/yaml-cpp/tar.gz/yaml-cpp-${YAMLCPP_VERSION})
set(YAMLCPP_HASH b45bf1089a382e81f6b661062c10d0c2)
set(YAMLCPP_HASH 5b943e9af0060d0811148b037449ef82)
set(LCMS_VERSION 2.9)
set(LCMS_URI https://nchc.dl.sourceforge.net/project/lcms/lcms/${LCMS_VERSION}/lcms2-${LCMS_VERSION}.tar.gz)
@@ -310,11 +306,3 @@ set(EMBREE_HASH 3d4a1147002ff43939d45140aa9d6fb8)
set(OIDN_VERSION 1.0.0)
set(OIDN_URI https://github.com/OpenImageDenoise/oidn/releases/download/v${OIDN_VERSION}/oidn-${OIDN_VERSION}.src.zip)
set(OIDN_HASH 19fe67b0164e8f020ac8a4f520defe60)
set(LIBGLU_VERSION 9.0.1)
set(LIBGLU_URI ftp://ftp.freedesktop.org/pub/mesa/glu/glu-${LIBGLU_VERSION}.tar.xz)
set(LIBGLU_HASH 151aef599b8259efe9acd599c96ea2a3)
set(MESA_VERSION 18.3.1)
set(MESA_URI ftp://ftp.freedesktop.org/pub/mesa//mesa-${MESA_VERSION}.tar.xz)
set(MESA_HASH d60828056d77bfdbae0970f9b15fb1be)


@@ -49,8 +49,6 @@ ExternalProject_Add(external_vpx
--disable-avx2
--disable-unit-tests
--disable-examples
--enable-vp8
--enable-vp9
${VPX_EXTRA_FLAGS}
BUILD_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/vpx/src/external_vpx/ && make -j${MAKE_THREADS}
INSTALL_COMMAND ${CONFIGURE_ENV} && cd ${BUILD_DIR}/vpx/src/external_vpx/ && make install


@@ -39,12 +39,3 @@ ExternalProject_Add(external_webp
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${LIBDIR}/webp -Wno-dev ${DEFAULT_CMAKE_FLAGS} ${WEBP_EXTRA_ARGS}
INSTALL_DIR ${LIBDIR}/webp
)
if(WIN32)
if(BUILD_MODE STREQUAL Release)
ExternalProject_Add_Step(external_webp after_install
COMMAND ${CMAKE_COMMAND} -E copy_directory ${LIBDIR}/webp ${HARVEST_TARGET}/webp
DEPENDEES install
)
endif()
endif()


@@ -21,7 +21,7 @@ set(YAMLCPP_EXTRA_ARGS
-DYAML_CPP_BUILD_TESTS=OFF
-DYAML_CPP_BUILD_TOOLS=OFF
-DYAML_CPP_BUILD_CONTRIB=OFF
-DYAML_MSVC_SHARED_RT=ON
-DMSVC_SHARED_RT=OFF
)
ExternalProject_Add(external_yamlcpp


@@ -388,7 +388,7 @@ OPENVDB_FORCE_REBUILD=false
OPENVDB_SKIP=false
# Alembic needs to be compiled for now
ALEMBIC_VERSION="1.7.12"
ALEMBIC_VERSION="1.7.8"
ALEMBIC_VERSION_MIN=$ALEMBIC_VERSION
ALEMBIC_FORCE_BUILD=false
ALEMBIC_FORCE_REBUILD=false
@@ -431,9 +431,6 @@ X264_VERSION_MIN=0.118
VPX_USE=false
VPX_VERSION_MIN=0.9.7
VPX_DEV=""
OPUS_USE=false
OPUS_VERSION_MIN=1.1.1
OPUS_DEV=""
MP3LAME_USE=false
MP3LAME_DEV=""
OPENJPEG_USE=false
@@ -1117,7 +1114,7 @@ run_ldconfig() {
WARNING "--no-sudo enabled, impossible to run ldconfig for $1, you'll have to do it yourself..."
else
INFO "Running ldconfig for $1..."
$SUDO sh -c "/bin/echo -e \"$_lib_path\n$_lib64_path\" > $_ldconf_path"
$SUDO sh -c "echo -e \"$_lib_path\n$_lib64_path\" > $_ldconf_path"
$SUDO /sbin/ldconfig # XXX OpenSuse does not include sbin in command path with sudo!!!
fi
PRINT ""
@@ -2757,10 +2754,6 @@ compile_FFmpeg() {
extra="$extra --enable-libvpx"
fi
if [ "$OPUS_USE" = true ]; then
extra="$extra --enable-libopus"
fi
if [ "$MP3LAME_USE" = true ]; then
extra="$extra --enable-libmp3lame"
fi
@@ -2998,14 +2991,6 @@ install_DEB() {
install_packages_DEB $VPX_DEV
VPX_USE=true
fi
PRINT ""
OPUS_DEV="libopus-dev"
check_package_version_ge_DEB $OPUS_DEV $OPUS_VERSION_MIN
if [ $? -eq 0 ]; then
install_packages_DEB $OPUS_DEV
OPUS_USE=true
fi
fi
# Check cmake/glew versions and disable features for older distros.
@@ -3616,17 +3601,8 @@ install_RPM() {
install_packages_RPM $VPX_DEV
VPX_USE=true
fi
PRINT ""
install_packages_RPM libspnav-devel
PRINT ""
OPUS_DEV="libopus-devel"
check_package_version_ge_RPM $OPUS_DEV $OPUS_VERSION_MIN
if [ $? -eq 0 ]; then
install_packages_RPM $OPUS_DEV
OPUS_USE=true
fi
fi
PRINT ""
@@ -4101,14 +4077,6 @@ install_ARCH() {
install_packages_ARCH $VPX_DEV
VPX_USE=true
fi
PRINT ""
OPUS_DEV="opus"
check_package_version_ge_ARCH $OPUS_DEV $OPUS_VERSION_MIN
if [ $? -eq 0 ]; then
install_packages_ARCH $OPUS_DEV
OPUS_USE=true
fi
fi
@@ -4665,10 +4633,6 @@ print_info_ffmpeglink() {
_packages="$_packages $VPX_DEV"
fi
if [ "$OPUS_USE" = true ]; then
_packages="$_packages $OPUS_DEV"
fi
if [ "$MP3LAME_USE" = true ]; then
_packages="$_packages $MP3LAME_DEV"
fi


@@ -192,11 +192,11 @@ if(ILMBASE_CUSTOM)
set(IlmBase_Libraries ${ILMBASE_CUSTOM_LIBRARIES})
separate_arguments(IlmBase_Libraries)
else()
# elseif(${ILMBASE_VERSION} VERSION_LESS "2.1")
#elseif(${ILMBASE_VERSION} VERSION_LESS "2.1")
set(IlmBase_Libraries Half Iex Imath IlmThread)
# else()
# string(REGEX REPLACE "([0-9]+)[.]([0-9]+).*" "\\1_\\2" _ilmbase_libs_ver ${ILMBASE_VERSION})
# set(IlmBase_Libraries Half Iex-${_ilmbase_libs_ver} Imath-${_ilmbase_libs_ver} IlmThread-${_ilmbase_libs_ver})
#else()
# string(REGEX REPLACE "([0-9]+)[.]([0-9]+).*" "\\1_\\2" _ilmbase_libs_ver ${ILMBASE_VERSION})
# set(IlmBase_Libraries Half Iex-${_ilmbase_libs_ver} Imath-${_ilmbase_libs_ver} IlmThread-${_ilmbase_libs_ver})
endif()


@@ -188,11 +188,11 @@ if(OPENEXR_CUSTOM)
endif()
set(OpenEXR_Library ${OPENEXR_CUSTOM_LIBRARY})
else()
# elseif(${OPENEXR_VERSION} VERSION_LESS "2.1")
#elseif(${OPENEXR_VERSION} VERSION_LESS "2.1")
set(OpenEXR_Library IlmImf)
# else()
# string(REGEX REPLACE "([0-9]+)[.]([0-9]+).*" "\\1_\\2" _openexr_libs_ver ${OPENEXR_VERSION})
# set(OpenEXR_Library IlmImf-${_openexr_libs_ver})
#else()
# string(REGEX REPLACE "([0-9]+)[.]([0-9]+).*" "\\1_\\2" _openexr_libs_ver ${OPENEXR_VERSION})
# set(OpenEXR_Library IlmImf-${_openexr_libs_ver})
endif()
# Locate the OpenEXR library


@@ -43,34 +43,3 @@ index 1f9a3ee..d151e9a 100644
return isnan( value );
#else
return std::isnan(value);
diff --git a/DAEValidator/CMakeLists.txt b/DAEValidator/CMakeLists.txt
index 03ad540f..f7d05cfb 100644
--- a/DAEValidator/CMakeLists.txt
+++ b/DAEValidator/CMakeLists.txt
@@ -98,7 +98,7 @@ if (WIN32)
# C4710: 'function' : function not inlined
# C4711: function 'function' selected for inline expansion
# C4820: 'bytes' bytes padding added after construct 'member_name'
- set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /MP /Wall /WX /wd4505 /wd4514 /wd4592 /wd4710 /wd4711 /wd4820")
+ set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /MP /Wall /wd4505 /wd4514 /wd4592 /wd4710 /wd4711 /wd4820")
else ()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11 -Wall -Werror")
endif ()
diff --git a/DAEValidator/library/src/ArgumentParser.cpp b/DAEValidator/library/src/ArgumentParser.cpp
index 897e4dcf..98a69ff1 100644
--- a/DAEValidator/library/src/ArgumentParser.cpp
+++ b/DAEValidator/library/src/ArgumentParser.cpp
@@ -6,10 +6,10 @@
using namespace std;
-#ifdef _MSC_VER
-#define NOEXCEPT _NOEXCEPT
-#else
+#ifndef _NOEXCEPT
#define NOEXCEPT noexcept
+#else
+#define NOEXCEPT _NOEXCEPT
#endif
namespace opencollada


@@ -1,70 +0,0 @@
Blender Buildbot
================
Code signing
------------
Code signing is done as part of INSTALL target, which makes it possible to sign
files which are aimed into a bundle and coming from a non-signed source (such as
libraries SVN).
This is achieved by specifying `slave_codesign.cmake` as a post-install script
run by CMake. This CMake script simply involves an utility script written in
Python which takes care of an actual signing.
### Configuration
Client configuration doesn't need anything special, other than variable
`SHARED_STORAGE_DIR` pointing to a location which is watched by a server.
This is done in `config_builder.py` file and is stored in Git (which makes it
possible to have almost zero-configuration buildbot machines).
Server configuration requires copying `config_server_template.py` under the
name of `config_server.py` and tweaking values, which are platform-specific.
#### Windows configuration
There are two things which are needed on Windows in order to have code signing
to work:
- `TIMESTAMP_AUTHORITY_URL` which is most likely set http://timestamp.digicert.com
- `CERTIFICATE_FILEPATH` which is a full file path to a PKCS #12 key (.pfx).
## Tips
### Self-signed certificate on Windows
It is easiest to test the configuration using a self-signed certificate.
The certificate manipulation utilities come with the Windows SDK.
Unfortunately, they are not added to PATH. Here is an example of how to make
sure they are easily available:
```
set PATH=C:\Program Files (x86)\Windows Kits\10\App Certification Kit;%PATH%
set PATH=C:\Program Files (x86)\Windows Kits\10\bin\10.0.18362.0\x64;%PATH%
```
Generate CA:
```
makecert -r -pe -n "CN=Blender Test CA" -ss CA -sr CurrentUser -a sha256 ^
-cy authority -sky signature -sv BlenderTestCA.pvk BlenderTestCA.cer
```
Import the generated CA:
```
certutil -user -addstore Root BlenderTestCA.cer
```
Create self-signed certificate and pack it into PKCS #12:
```
makecert -pe -n "CN=Blender Test SPC" -a sha256 -cy end ^
-sky signature ^
-ic BlenderTestCA.cer -iv BlenderTestCA.pvk ^
-sv BlenderTestSPC.pvk BlenderTestSPC.cer
pvk2pfx -pvk BlenderTestSPC.pvk -spc BlenderTestSPC.cer -pfx BlenderTestSPC.pfx
```


@@ -1,114 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import argparse
import os
import re
import subprocess
import sys
class Builder:
def __init__(self, name, branch):
self.name = name
self.branch = branch
self.is_release_branch = re.match("^blender-v(.*)-release$", branch) is not None
# Buildbot runs from build/ directory
self.blender_dir = os.path.abspath(os.path.join('..', 'blender.git'))
self.build_dir = os.path.abspath(os.path.join('..', 'build', name))
self.install_dir = os.path.abspath(os.path.join('..', 'install', name))
self.upload_dir = os.path.abspath(os.path.join('..', 'install'))
# Detect platform
if name.startswith('mac'):
self.platform = 'mac'
self.command_prefix = []
elif name.startswith('linux'):
self.platform = 'linux'
self.command_prefix = ['scl', 'enable', 'devtoolset-6', '--']
elif name.startswith('win'):
self.platform = 'win'
self.command_prefix = []
else:
raise ValueError('Unknown platform for builder ' + name)
# Always 64 bit now
self.bits = 64
def create_builder_from_arguments():
parser = argparse.ArgumentParser()
parser.add_argument('builder_name')
parser.add_argument('branch', default='master', nargs='?')
args = parser.parse_args()
return Builder(args.builder_name, args.branch)
class VersionInfo:
def __init__(self, builder):
# Get version information
buildinfo_h = os.path.join(builder.build_dir, "source", "creator", "buildinfo.h")
blender_h = os.path.join(builder.blender_dir, "source", "blender", "blenkernel", "BKE_blender_version.h")
version_number = int(self._parse_header_file(blender_h, 'BLENDER_VERSION'))
self.version = "%d.%d" % (version_number // 100, version_number % 100)
self.version_char = self._parse_header_file(blender_h, 'BLENDER_VERSION_CHAR')
self.version_cycle = self._parse_header_file(blender_h, 'BLENDER_VERSION_CYCLE')
self.version_cycle_number = self._parse_header_file(blender_h, 'BLENDER_VERSION_CYCLE_NUMBER')
self.hash = self._parse_header_file(buildinfo_h, 'BUILD_HASH')[1:-1]
if self.version_cycle == "release":
# Final release
self.full_version = self.version + self.version_char
self.is_development_build = False
elif self.version_cycle == "rc":
# Release candidate
version_cycle = self.version_cycle + self.version_cycle_number
if len(self.version_char) == 0:
self.full_version = self.version + version_cycle
else:
self.full_version = self.version + self.version_char + '-' + version_cycle
self.is_development_build = False
else:
# Development build
self.full_version = self.version + '-' + self.hash
self.is_development_build = True
def _parse_header_file(self, filename, define):
import re
regex = re.compile("^#\s*define\s+%s\s+(.*)" % define)
with open(filename, "r") as file:
for l in file:
match = regex.match(l)
if match:
return match.group(1)
return None
def call(cmd, env=None, exit_on_error=True):
print(' '.join(cmd))
# Flush to ensure correct order output on Windows.
sys.stdout.flush()
sys.stderr.flush()
retcode = subprocess.call(cmd, env=env)
if exit_on_error and retcode != 0:
sys.exit(retcode)
return retcode


@@ -1,77 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
from dataclasses import dataclass
from pathlib import Path
from typing import List
@dataclass
class AbsoluteAndRelativeFileName:
"""
Helper class which keeps track of an absolute file path for direct access and
the corresponding path relative to a given base.
The relative part is used to construct a file name within an archive which
contains files which are to be signed or which have been signed already
(depending on whether the archive is addressed to the signing server or back
to the buildbot worker).
"""
# Base directory which is where relative_filepath is relative to.
base_dir: Path
# Full absolute path of the corresponding file.
absolute_filepath: Path
# Derived from full file path, contains part of the path which is relative
# to a desired base path.
relative_filepath: Path
def __init__(self, base_dir: Path, filepath: Path):
self.base_dir = base_dir
self.absolute_filepath = filepath.resolve()
self.relative_filepath = self.absolute_filepath.relative_to(
self.base_dir)
@classmethod
def from_path(cls, path: Path) -> 'AbsoluteAndRelativeFileName':
assert path.is_absolute()
assert path.is_file()
base_dir = path.parent
return AbsoluteAndRelativeFileName(base_dir, path)
@classmethod
def recursively_from_directory(cls, base_dir: Path) \
-> List['AbsoluteAndRelativeFileName']:
"""
Create list of AbsoluteAndRelativeFileName for all the files in the
given directory.
"""
assert base_dir.is_absolute()
assert base_dir.is_dir()
result = []
for filename in base_dir.glob('**/*'):
if not filename.is_file():
continue
result.append(AbsoluteAndRelativeFileName(base_dir, filename))
return result
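As an illustration (not part of the original module), a minimal usage sketch of
this helper; the paths below are hypothetical:
```
from pathlib import Path

from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName

# Wrap a single absolute file path; from_path() makes the parent directory the base.
single = AbsoluteAndRelativeFileName.from_path(Path('/data/codesign/unsigned/blender'))
print(single.base_dir, single.relative_filepath)

# Collect every regular file under a directory, keeping paths relative to it.
all_files = AbsoluteAndRelativeFileName.recursively_from_directory(
    Path('/data/codesign/unsigned'))
print(len(all_files), 'files found')
```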


@@ -1,101 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
from pathlib import Path
from codesign.util import ensure_file_does_not_exist_or_die
class ArchiveWithIndicator:
"""
The idea of this class is to wrap the logic which takes care of keeping
track of the name of an archive and the synchronization routines between the
buildbot worker and the signing server.
The synchronization is done by creating a special file after the
archive file is known to be ready for access.
"""
# Base directory where the archive is stored (basically, the dirname() of
# the absolute archive file name).
#
# For example, 'X:\\TEMP\\'.
base_dir: Path
# Absolute file name of the archive.
#
# For example, 'X:\\TEMP\\FOO.ZIP'.
archive_filepath: Path
# Absolute name of a file which acts as an indication of the fact that the
# archive is ready and is available for access.
#
# This is how synchronization between buildbot worker and signing server is
# done:
# - First, the archive is created under archive_filepath name.
# - Second, the indication file is created under ready_indicator_filepath
# name.
# - Third, the other side (the one which did not create the indicator) watches
# for the indication file to appear, and once it's there it accesses the
# archive.
ready_indicator_filepath: Path
def __init__(
self, base_dir: Path, archive_name: str, ready_indicator_name: str):
"""
Construct the object from given base directory and name of the archive
file:
ArchiveWithIndicator(Path('X:\\TEMP'), 'FOO.ZIP', 'INPUT_READY')
"""
self.base_dir = base_dir
self.archive_filepath = self.base_dir / archive_name
self.ready_indicator_filepath = self.base_dir / ready_indicator_name
def is_ready(self) -> bool:
"""Check whether the archive is ready for access."""
return self.ready_indicator_filepath.exists()
def tag_ready(self) -> None:
"""
Tag the archive as ready by creating the corresponding indication file.
NOTE: It is expected that the archive was never tagged as ready before
and that there are no subsequent tags of the same archive.
If this expectation is violated, an assert will fail.
"""
assert not self.is_ready()
self.ready_indicator_filepath.touch()
def clean(self) -> None:
"""
Remove both archive and the ready indication file.
"""
ensure_file_does_not_exist_or_die(self.ready_indicator_filepath)
ensure_file_does_not_exist_or_die(self.archive_filepath)
def is_fully_absent(self) -> bool:
"""
Check whether both archive and its ready indicator are absent.
Is used as a sanity check during the code signing process by both the
buildbot worker and the signing server.
"""
return (not self.archive_filepath.exists() and
not self.ready_indicator_filepath.exists())
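For illustration only, a sketch of the handshake described above with both
sides pointing at the same (hypothetical) shared directory:
```
import time
from pathlib import Path

from codesign.archive_with_indicator import ArchiveWithIndicator

shared_dir = Path('/data/codesign/unsigned')  # hypothetical shared location

# Producer side: write the archive first, then tag it as ready.
outgoing = ArchiveWithIndicator(shared_dir, 'unsigned_files.zip', 'ready.stamp')
# ... create outgoing.archive_filepath here ...
outgoing.tag_ready()

# Consumer side: poll for the indicator before touching the archive.
incoming = ArchiveWithIndicator(shared_dir, 'unsigned_files.zip', 'ready.stamp')
while not incoming.is_ready():
    time.sleep(1)
# ... read incoming.archive_filepath, then call incoming.clean() when done ...
```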


@@ -1,385 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Signing process overview.
#
# From buildbot worker side:
# - Files which need to be signed are collected either from a directory (to
# sign all signable files in there) or by the filename of a single file to sign.
# - Those files get packed into an archive and stored in a location
# which is watched by the signing server.
# - A marker READY file is created which indicates the archive is ready for
# access.
# - Wait for the server to provide an archive with signed files.
# This is done by watching for the READY file which corresponds to an archive
# coming from the signing server.
# - Unpack the signed files from the archive and replace the original ones.
#
# From the code sign server side:
# - Watch a special location for a READY file which indicates that there is an
# archive with files which are to be signed.
# - Unpack the archive to a temporary location.
# - Run codesign tool and make sure all the files are signed.
# - Pack the signed files and store them in a location which is watched by
# the buildbot worker.
# - Create a READY file which indicates that the archive with signed files is
# ready.
import abc
import logging
import shutil
import time
import zipfile
from pathlib import Path
from tempfile import TemporaryDirectory
from typing import Iterable, List
from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName
from codesign.archive_with_indicator import ArchiveWithIndicator
logger = logging.getLogger(__name__)
logger_builder = logger.getChild('builder')
logger_server = logger.getChild('server')
def pack_files(files: Iterable[AbsoluteAndRelativeFileName],
archive_filepath: Path) -> None:
"""
Create zip archive from given files for the signing pipeline.
Is used by buildbot worker to create an archive of files which are to be
signed, and by signing server to send signed files back to the worker.
"""
with zipfile.ZipFile(archive_filepath, 'w') as zip_file_handle:
for file_info in files:
zip_file_handle.write(file_info.absolute_filepath,
arcname=file_info.relative_filepath)
def extract_files(archive_filepath: Path,
extraction_dir: Path) -> None:
"""
Extract all files from the given archive into the given directory.
"""
# TODO(sergey): Verify files in the archive have relative path.
with zipfile.ZipFile(archive_filepath, mode='r') as zip_file_handle:
zip_file_handle.extractall(path=extraction_dir)
class BaseCodeSigner(metaclass=abc.ABCMeta):
"""
Base class for a platform-specific signer of binaries.
Contains all the logic shared across platform-specific implementations, such
as synchronization and notification logic.
Platform specific bits (such as actual command for signing the binary) are
to be implemented as a subclass.
Provides utilities for code signing as a whole, including functionality needed
by a signing server and a buildbot worker.
The signer and builder may run on separate machines; the only requirement is
that they have access to a directory which is shared between them. For
security reasons this should be a separate machine (or a Shared Folder
configured in VirtualBox). This directory might be
mounted under different base paths, but its underlying storage must be
the same.
The code signer is short-lived on a buildbot worker side, and is living
forever on a code signing server side.
"""
# TODO(sergey): Find a neat way to have config annotated.
# config: Config
# Storage directory where builder puts files which are requested to be
# signed.
# Consider this an input of the code signing server.
unsigned_storage_dir: Path
# Information about archive which contains files which are to be signed.
#
# This archive is created by the buildbot worker and acts as an input for
# the code signing server.
unsigned_archive_info: ArchiveWithIndicator
# Storage where signed files are stored.
# Consider this an output of the code signer server.
signed_storage_dir: Path
# Information about archive which contains signed files.
#
# This archive is created by the code signing server.
signed_archive_info: ArchiveWithIndicator
def __init__(self, config):
self.config = config
absolute_shared_storage_dir = config.SHARED_STORAGE_DIR.resolve()
# Unsigned (signing server input) configuration.
self.unsigned_storage_dir = absolute_shared_storage_dir / 'unsigned'
self.unsigned_archive_info = ArchiveWithIndicator(
self.unsigned_storage_dir, 'unsigned_files.zip', 'ready.stamp')
# Signed (signing server output) configuration.
self.signed_storage_dir = absolute_shared_storage_dir / 'signed'
self.signed_archive_info = ArchiveWithIndicator(
self.signed_storage_dir, 'signed_files.zip', 'ready.stamp')
"""
General note on cleanup environment functions.
It is expected that there is only one instance of the code signer server
running for a given input/output directory, and that it serves a single
buildbot worker.
By its nature, a buildbot worker only produces one build at a time and
never performs concurrent builds.
This leads to a conclusion that when starting in a clean environment
there shouldn't be any archives remaining from a previous build.
However, it is possible to have various failure scenarios which might
leave the environment in a non-clean state:
- A network hiccup which makes the buildbot worker stop the current build
and re-start it after the connection to the server is re-established.
Note, this could also happen during buildbot server maintenance.
- The signing server might get restarted due to updates or other reasons.
Requiring manual interaction in such cases is undesirable, so here we
simply assume that the system is used the way it is intended to be and
restore the environment to a pristine clean state.
"""
def cleanup_environment_for_builder(self) -> None:
self.unsigned_archive_info.clean()
self.signed_archive_info.clean()
def cleanup_environment_for_signing_server(self) -> None:
# Don't clear the requested to-be-signed archive since we might be
# restarting signing machine while the buildbot is busy.
self.signed_archive_info.clean()
############################################################################
# Buildbot worker side helpers.
@abc.abstractmethod
def check_file_is_to_be_signed(
self, file: AbsoluteAndRelativeFileName) -> bool:
"""
Check whether file is to be signed.
Is used by both single file signing pipeline and recursive directory
signing pipeline.
This is where the code signer checks whether the file is to be signed or
not. This check might be based on a simple extension test or on an actual
test of whether the file already has a digital signature.
"""
def collect_files_to_sign(self, path: Path) \
-> List[AbsoluteAndRelativeFileName]:
"""
Get all files which need to be signed from the given path.
NOTE: The path might either be a file or directory.
This function is run from the buildbot worker side.
"""
# If there is a single file provided trust the buildbot worker that it
# is eligible for signing.
if path.is_file():
file = AbsoluteAndRelativeFileName.from_path(path)
if not self.check_file_is_to_be_signed(file):
return []
return [file]
all_files = AbsoluteAndRelativeFileName.recursively_from_directory(
path)
files_to_be_signed = [file for file in all_files
if self.check_file_is_to_be_signed(file)]
return files_to_be_signed
def wait_for_signed_archive_or_die(self) -> None:
"""
Wait until the archive with signed files is available.
Will only wait for the configured time. If that time is exceeded and there
is still no response from the signing server, the application will exit
with a non-zero exit code.
"""
timeout_in_seconds = self.config.TIMEOUT_IN_SECONDS
time_start = time.monotonic()
while not self.signed_archive_info.is_ready():
time.sleep(1)
time_slept_in_seconds = time.monotonic() - time_start
if time_slept_in_seconds > timeout_in_seconds:
self.unsigned_archive_info.clean()
raise SystemExit("Signing server didn't finish signing in "
f"{timeout_in_seconds} seconds, dying :(")
def copy_signed_files_to_directory(
self, signed_dir: Path, destination_dir: Path) -> None:
"""
Copy all files from signed_dir to destination_dir.
This function will overwrite any existing file. Permissions are copied
from the source files, but other metadata, such as timestamps, are not.
"""
for signed_filepath in signed_dir.glob('**/*'):
if not signed_filepath.is_file():
continue
relative_filepath = signed_filepath.relative_to(signed_dir)
destination_filepath = destination_dir / relative_filepath
destination_filepath.parent.mkdir(parents=True, exist_ok=True)
shutil.copy(signed_filepath, destination_filepath)
def run_buildbot_path_sign_pipeline(self, path: Path) -> None:
"""
Run all steps needed to make given path signed.
Path points to an unsigned file or a directory which contains unsigned
files.
If the path points to a single file then this file will be signed.
This is used to sign a final bundle such as .msi on Windows or .dmg on
macOS.
NOTE: The code signer implementation might actually refuse to sign the
file, in which case the file will be left unsigned. This is not to be
considered a failure situation; it might happen when the buildbot
worker can not detect whether signing is really required in a specific
case or not.
If the path points to a directory then code signer will sign all
signable files from it (finding them recursively).
"""
self.cleanup_environment_for_builder()
# Make sure storage directory exists.
self.unsigned_storage_dir.mkdir(parents=True, exist_ok=True)
# Collect all files which need to be signed and pack them into a single
# archive which will be sent to the signing server.
logger_builder.info('Collecting files which are to be signed...')
files = self.collect_files_to_sign(path)
if not files:
logger_builder.info('No files to be signed, ignoring.')
return
logger_builder.info('Found %d files to sign.', len(files))
pack_files(files=files,
archive_filepath=self.unsigned_archive_info.archive_filepath)
self.unsigned_archive_info.tag_ready()
# Wait for the signing server to finish signing.
logger_builder.info('Waiting for the signing server to sign the files...')
self.wait_for_signed_archive_or_die()
# Extract signed files from archive and move files to final location.
with TemporaryDirectory(prefix='blender-buildbot-') as temp_dir_str:
unpacked_signed_files_dir = Path(temp_dir_str)
logger_builder.info('Extracting signed files from archive...')
extract_files(
archive_filepath=self.signed_archive_info.archive_filepath,
extraction_dir=unpacked_signed_files_dir)
destination_dir = path
if destination_dir.is_file():
destination_dir = destination_dir.parent
self.copy_signed_files_to_directory(
unpacked_signed_files_dir, destination_dir)
############################################################################
# Signing server side helpers.
def wait_for_sign_request(self) -> None:
"""
Wait for the buildbot to request signing of an archive.
"""
# TODO(sergey): Support graceful shutdown on Ctrl-C.
while not self.unsigned_archive_info.is_ready():
time.sleep(1)
@abc.abstractmethod
def sign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
"""
Sign all files in the given directory.
NOTE: Signing should happen in-place.
"""
def run_signing_pipeline(self):
"""
Run the full signing pipeline starting from the point when the buildbot
worker has requested signing.
"""
# Make sure storage directory exists.
self.signed_storage_dir.mkdir(parents=True, exist_ok=True)
with TemporaryDirectory(prefix='blender-codesign-') as temp_dir_str:
temp_dir = Path(temp_dir_str)
logger_server.info('Extracting unsigned files from archive...')
extract_files(
archive_filepath=self.unsigned_archive_info.archive_filepath,
extraction_dir=temp_dir)
logger_server.info('Collecting all files which need signing...')
files = AbsoluteAndRelativeFileName.recursively_from_directory(
temp_dir)
logger_server.info('Signing all requested files...')
self.sign_all_files(files)
logger_server.info('Packing signed files...')
pack_files(files=files,
archive_filepath=self.signed_archive_info.archive_filepath)
self.signed_archive_info.tag_ready()
logger_server.info('Removing signing request...')
self.unsigned_archive_info.clean()
logger_server.info('Signing is complete.')
def run_signing_server(self):
logger_server.info('Starting new code signing server...')
self.cleanup_environment_for_signing_server()
logger_server.info('Code signing server is ready')
while True:
logger_server.info('Waiting for the signing request in %s...',
self.unsigned_storage_dir)
self.wait_for_sign_request()
logger_server.info(
'Got signing request, beginning signing procedure.')
self.run_signing_pipeline()


@@ -1,57 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Configuration of a code signer which is specific to the code running from
# buildbot's worker.
import sys
from pathlib import Path
from codesign.config_common import *
if sys.platform == 'linux':
SHARED_STORAGE_DIR = Path('/data/codesign')
elif sys.platform == 'win32':
SHARED_STORAGE_DIR = Path('Z:\\codesign')
# https://docs.python.org/3/library/logging.config.html#configuration-dictionary-schema
LOGGING = {
'version': 1,
'formatters': {
'default': {'format': '%(asctime)-15s %(levelname)8s %(name)s %(message)s'}
},
'handlers': {
'console': {
'class': 'logging.StreamHandler',
'formatter': 'default',
'stream': 'ext://sys.stderr',
}
},
'loggers': {
'codesign': {'level': 'INFO'},
},
'root': {
'level': 'WARNING',
'handlers': [
'console',
],
}
}


@@ -1,33 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
from pathlib import Path
# Timeout in seconds for the signing process.
#
# This is how long the buildbot packing step will wait for the signing
# server to perform the signing.
TIMEOUT_IN_SECONDS = 240
# Directory which is shared across buildbot worker and signing server.
#
# This is where worker puts files requested for signing as well as where
# server puts signed files.
SHARED_STORAGE_DIR: Path


@@ -1,63 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Configuration of a code signer which is specific to the code signing server.
#
# NOTE: DO NOT put any sensitive information here, put it in an actual
# configuration on the signing machine.
from pathlib import Path
from codesign.config_common import *
# URL to the timestamping authority.
TIMESTAMP_AUTHORITY_URL = 'http://timestamp.digicert.com'
# Full path to the certificate used for signing.
#
# The path and expected file format might vary depending on a platform.
#
# On Windows it is usually a PKCS #12 key (.pfx), so the path will look
# like Path('C:\\Secret\\Blender.pfx').
CERTIFICATE_FILEPATH: Path
# https://docs.python.org/3/library/logging.config.html#configuration-dictionary-schema
LOGGING = {
'version': 1,
'formatters': {
'default': {'format': '%(asctime)-15s %(levelname)8s %(name)s %(message)s'}
},
'handlers': {
'console': {
'class': 'logging.StreamHandler',
'formatter': 'default',
'stream': 'ext://sys.stderr',
}
},
'loggers': {
'codesign': {'level': 'INFO'},
},
'root': {
'level': 'WARNING',
'handlers': [
'console',
],
}
}


@@ -1,72 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# NOTE: This is a no-op signer (since there isn't really a procedure to sign
# Linux binaries yet). Used to debug and verify the code signing routines on
# a Linux environment.
import logging
from pathlib import Path
from typing import List
from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName
from codesign.base_code_signer import BaseCodeSigner
logger = logging.getLogger(__name__)
logger_server = logger.getChild('server')
class LinuxCodeSigner(BaseCodeSigner):
def is_active(self) -> bool:
"""
Check whether this signer is active.
If it is inactive, no files will be signed.
Is used to be able to debug the code signing pipeline on Linux, where no
code signing happens in the actual buildbot and release
environment.
"""
return False
def check_file_is_to_be_signed(
self, file: AbsoluteAndRelativeFileName) -> bool:
if file.relative_filepath == Path('blender'):
return True
if (file.relative_filepath.parts[-3:-1] == ('python', 'bin') and
file.relative_filepath.name.startswith('python')):
return True
if file.relative_filepath.suffix == '.so':
return True
return False
def collect_files_to_sign(self, path: Path) \
-> List[AbsoluteAndRelativeFileName]:
if not self.is_active():
return []
return super().collect_files_to_sign(path)
def sign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
num_files = len(files)
for file_index, file in enumerate(files):
logger.info('Server: Signed file [%d/%d] %s',
file_index + 1, num_files, file.relative_filepath)


@@ -1,47 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import logging.config
import sys
from pathlib import Path
from typing import Optional
import codesign.config_builder
from codesign.base_code_signer import BaseCodeSigner
class SimpleCodeSigner:
code_signer: Optional[BaseCodeSigner]
def __init__(self):
if sys.platform == 'linux':
from codesign.linux_code_signer import LinuxCodeSigner
self.code_signer = LinuxCodeSigner(codesign.config_builder)
elif sys.platform == 'win32':
from codesign.windows_code_signer import WindowsCodeSigner
self.code_signer = WindowsCodeSigner(codesign.config_builder)
else:
self.code_signer = None
def sign_file_or_directory(self, path: Path) -> None:
logging.config.dictConfig(codesign.config_builder.LOGGING)
self.code_signer.run_buildbot_path_sign_pipeline(path)


@@ -1,35 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
from pathlib import Path
def ensure_file_does_not_exist_or_die(filepath: Path) -> None:
"""
If the file exists, unlink it.
If the file path exists and is not a file an assert will trigger.
If the file path does not exists nothing happens.
"""
if not filepath.exists():
return
if not filepath.is_file():
# TODO(sergey): Provide information about what the filepath actually is.
raise SystemExit(f'{filepath} is expected to be a file, but is not')
filepath.unlink()


@@ -1,75 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import logging
import subprocess
from pathlib import Path
from typing import List
from buildbot_utils import Builder
from codesign.absolute_and_relative_filename import AbsoluteAndRelativeFileName
from codesign.base_code_signer import BaseCodeSigner
logger = logging.getLogger(__name__)
logger_server = logger.getChild('server')
# NOTE: Check is done as filename.endswith(), so keep the dot
EXTENSIONS_TO_BE_SIGNED = {'.exe', '.dll', '.pyd', '.msi'}
BLACKLIST_FILE_PREFIXES = (
'api-ms-', 'concrt', 'msvcp', 'ucrtbase', 'vcomp', 'vcruntime')
class WindowsCodeSigner(BaseCodeSigner):
def check_file_is_to_be_signed(
self, file: AbsoluteAndRelativeFileName) -> bool:
base_name = file.relative_filepath.name
if any(base_name.startswith(prefix)
for prefix in BLACKLIST_FILE_PREFIXES):
return False
return file.relative_filepath.suffix in EXTENSIONS_TO_BE_SIGNED
def get_sign_command_prefix(self) -> List[str]:
return [
'signtool', 'sign', '/v',
'/f', self.config.CERTIFICATE_FILEPATH,
'/t', self.config.TIMESTAMP_AUTHORITY_URL]
def sign_all_files(self, files: List[AbsoluteAndRelativeFileName]) -> None:
# NOTE: Sign files one by one to avoid possible command line length
# overflow (which could happen if we ever decide to sign every binary
# in the install folder, for example).
#
# TODO(sergey): Consider doing batched signing of a handful of files in
# one go (but only if this is actually known to be much faster).
num_files = len(files)
for file_index, file in enumerate(files):
command = self.get_sign_command_prefix()
command.append(file.absolute_filepath)
logger_server.info(
'Running signtool command for file [%d/%d] %s...',
file_index + 1, num_files, file.relative_filepath)
# TODO(sergey): Check the status somehow. With a missing certificate
# the command still exits with a zero code.
subprocess.run(command)
# TODO(sergey): Report number of signed and ignored files.
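One possible way to address the status-checking TODO above (an assumption, not
something these scripts currently do) is to verify every file after signing;
`signtool verify /pa` exits with a non-zero code when the file does not carry a
valid Authenticode signature:
```
import subprocess
from pathlib import Path


def is_file_signed(filepath: Path) -> bool:
    # Hypothetical helper: returns True when signtool confirms a valid
    # signature using the default Authenticode verification policy (/pa).
    completed = subprocess.run(
        ['signtool', 'verify', '/pa', str(filepath)], capture_output=True)
    return completed.returncode == 0
```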


@@ -1,37 +0,0 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# NOTE: This is a no-op signer (since there isn't really a procedure to sign
# Linux binaries yet). Used to debug and verify the code signing routines on
# a Linux environment.
import logging.config
from pathlib import Path
from typing import List
from codesign.linux_code_signer import LinuxCodeSigner
import codesign.config_server
if __name__ == "__main__":
logging.config.dictConfig(codesign.config_server.LOGGING)
code_signer = LinuxCodeSigner(codesign.config_server)
code_signer.run_signing_server()


@@ -1,11 +0,0 @@
@echo off
rem This is an entry point of the codesign server for Windows.
rem It makes sure that signtool.exe is within the current PATH and can be
rem used by the Python script.
SETLOCAL
set PATH=C:\Program Files (x86)\Windows Kits\10\App Certification Kit;%PATH%
codesign_server_windows.py


@@ -1,44 +0,0 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
# Implementation of codesign server for Windows.
#
# NOTE: If signtool.exe is not in the PATH use codesign_server_windows.bat
import logging.config
import shutil
from pathlib import Path
from typing import List
from codesign.windows_code_signer import WindowsCodeSigner
import codesign.config_server
if __name__ == "__main__":
# TODO(sergey): Consider moving such sanity checks into
# CodeSigner.check_environment_or_die().
if not shutil.which('signtool.exe'):
raise SystemExit("signtool.exe is not found in %PATH%")
logging.config.dictConfig(codesign.config_server.LOGGING)
code_signer = WindowsCodeSigner(codesign.config_server)
code_signer.run_signing_server()


@@ -2,22 +2,33 @@
include("${CMAKE_CURRENT_LIST_DIR}/../../cmake/config/blender_release.cmake")
message(STATUS "Building in CentOS 7 64bit environment")
set(LIBDIR_NAME "linux_centos7_x86_64")
set(WITH_CXX11_ABI OFF CACHE BOOL "" FORCE)
# For libc-2.24 we are using a chroot which runs on a 64bit system.
# There we can not use a CPU bitness check since it is always 64bit. So instead
# we check for specific libraries.
#
# Other builders run in a bare virtual machine, and the libraries
# are installed to /opt/.
# We assume that only 64bit builders exist in such a configuration.
if(EXISTS "/lib/x86_64-linux-gnu/libc-2.24.so")
message(STATUS "Building in GLibc-2.24 environment")
set(LIBDIR_NAME "linux_x86_64")
elseif(EXISTS "/lib/i386-linux-gnu//libc-2.24.so")
message(STATUS "Building in GLibc-2.24 environment")
set(LIBDIR_NAME "linux_i686")
else()
message(STATUS "Building in generic 64bit environment")
set(LIBDIR_NAME "linux_x86_64")
endif()
# Default to only build Blender
set(WITH_BLENDER ON CACHE BOOL "" FORCE)
# ######## Linux-specific build options ########
# Options which are specific to Linux-only platforms
set(WITH_DOC_MANPAGE OFF CACHE BOOL "" FORCE)
# ######## Official release-specific build options ########
# Options which are specific to Linux release builds only
set(WITH_JACK_DYNLOAD ON CACHE BOOL "" FORCE)
set(WITH_SDL_DYNLOAD ON CACHE BOOL "" FORCE)
set(WITH_SYSTEM_GLEW OFF CACHE BOOL "" FORCE)
@@ -29,7 +40,7 @@ set(WITH_PYTHON_INSTALL_REQUESTS ON CACHE BOOL "" FORCE)
# ######## Release environment specific settings ########
set(LIBDIR "${CMAKE_CURRENT_LIST_DIR}/../../../../lib/${LIBDIR_NAME}" CACHE STRING "" FORCE)
set(LIBDIR "/opt/blender-deps/${LIBDIR_NAME}" CACHE BOOL "" FORCE)
# Platform specific configuration, to ensure static linking against everything.


@@ -1,44 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# This is a script which is used as a POST-INSTALL step for regular CMake's
# INSTALL target.
# It is used by buildbot workers to sign every binary which is going into
# the final bundle.
# On Windows, Python 3 only provides python.exe, no python3.exe.
#
# On other platforms it is possible to have python2 and python3, and a
# symbolic link from python to either of them. So on those platforms use
# an explicit Python version.
if(WIN32)
set(PYTHON_EXECUTABLE python)
else()
set(PYTHON_EXECUTABLE python3)
endif()
execute_process(
COMMAND ${PYTHON_EXECUTABLE} "${CMAKE_CURRENT_LIST_DIR}/slave_codesign.py"
"${CMAKE_INSTALL_PREFIX}"
WORKING_DIRECTORY ${CMAKE_CURRENT_LIST_DIR}
RESULT_VARIABLE exit_code
)
if(NOT exit_code EQUAL "0")
message(FATAL_ERROR "Non-zero exit code of codesign tool")
endif()


@@ -1,74 +0,0 @@
#!/usr/bin/env python3
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# Helper script which takes care of signing a provided location.
#
# The location can either be a directory (in which case all eligible binaries
# will be signed) or a single file (in which case that single file will be signed).
#
# This script takes care of all the complexity of communicating between the
# process which requests files to be signed and the code signing server.
#
# NOTE: Signing happens in-place.
import argparse
import sys
from pathlib import Path
from codesign.simple_code_signer import SimpleCodeSigner
def create_argument_parser():
parser = argparse.ArgumentParser()
parser.add_argument('path_to_sign', type=Path)
return parser
def main():
parser = create_argument_parser()
args = parser.parse_args()
path_to_sign = args.path_to_sign
if sys.platform == 'win32':
# When WIX packaging is used to generate .msi on Windows, CPack will
# install two different projects to different
# installation prefixes:
#
# - C:\b\build\_CPack_Packages\WIX\Blender
# - C:\b\build\_CPack_Packages\WIX\Unspecified
#
# The annoying part is: CMake's post-install script will only be run
# once, with the install prefix which corresponds to the project which
# was installed last. But we want to sign binaries from all projects.
# So we detect that we are running for the CPack
# project used for WIX and force the parent directory (which includes both
# projects) to be signed.
#
# Here we force both projects to be signed.
if path_to_sign.name == 'Unspecified' and 'WIX' in str(path_to_sign):
path_to_sign = path_to_sign.parent
code_signer = SimpleCodeSigner()
code_signer.sign_file_or_directory(path_to_sign)
if __name__ == "__main__":
main()


@@ -19,98 +19,148 @@
# <pep8 compliant>
import os
import subprocess
import sys
import shutil
import buildbot_utils
# get builder name
if len(sys.argv) < 2:
sys.stderr.write("Not enough arguments, expecting builder name\n")
sys.exit(1)
def get_cmake_options(builder):
post_install_script = os.path.join(
builder.blender_dir, 'build_files', 'buildbot', 'slave_codesign.cmake')
builder = sys.argv[1]
config_file = "build_files/cmake/config/blender_release.cmake"
options = ['-DCMAKE_BUILD_TYPE:STRING=Release',
'-DWITH_GTESTS=ON']
# we run from build/ directory
blender_dir = os.path.join('..', 'blender.git')
if builder.platform == 'mac':
options.append('-DCMAKE_OSX_ARCHITECTURES:STRING=x86_64')
options.append('-DCMAKE_OSX_DEPLOYMENT_TARGET=10.9')
elif builder.platform == 'win':
options.extend(['-G', 'Visual Studio 15 2017 Win64'])
options.extend(['-DPOSTINSTALL_SCRIPT:PATH=' + post_install_script])
elif builder.platform == 'linux':
config_file = "build_files/buildbot/config/blender_linux.cmake"
optix_sdk_dir = os.path.join(builder.blender_dir, '..', '..', 'NVIDIA-Optix-SDK')
options.append('-DOPTIX_ROOT_DIR:PATH=' + optix_sdk_dir)
def parse_header_file(filename, define):
import re
regex = re.compile("^#\s*define\s+%s\s+(.*)" % define)
with open(filename, "r") as file:
for l in file:
match = regex.match(l)
if match:
return match.group(1)
return None
options.append("-C" + os.path.join(builder.blender_dir, config_file))
options.append("-DCMAKE_INSTALL_PREFIX=%s" % (builder.install_dir))
if 'cmake' in builder:
# cmake
return options
# Some fine-tuning configuration
blender_dir = os.path.abspath(blender_dir)
build_dir = os.path.abspath(os.path.join('..', 'build', builder))
install_dir = os.path.abspath(os.path.join('..', 'install', builder))
targets = ['blender']
command_prefix = []
def update_git(builder):
# Do extra git fetch because not all platform/git/buildbot combinations
# update the origin remote, causing buildinfo to detect local changes.
os.chdir(builder.blender_dir)
bits = 64
print("Fetching remotes")
command = ['git', 'fetch', '--all']
buildbot_utils.call(builder.command_prefix + command)
# Config file to be used (relative to blender's sources root)
cmake_config_file = "build_files/cmake/config/blender_release.cmake"
def clean_directories(builder):
# Make sure no garbage remained from the previous run
if os.path.isdir(builder.install_dir):
shutil.rmtree(builder.install_dir)
# Set build options.
cmake_options = []
cmake_extra_options = ['-DCMAKE_BUILD_TYPE:STRING=Release',
'-DWITH_GTESTS=ON']
# Make sure build directory exists and enter it
os.makedirs(builder.build_dir, exist_ok=True)
if builder.startswith('mac'):
# Set up OSX architecture
if builder.endswith('x86_64_10_9_cmake'):
cmake_extra_options.append('-DCMAKE_OSX_ARCHITECTURES:STRING=x86_64')
cmake_extra_options.append('-DCMAKE_OSX_DEPLOYMENT_TARGET=10.9')
# Remove buildinfo files to force buildbot to re-generate them.
for buildinfo in ('buildinfo.h', 'buildinfo.h.txt', ):
full_path = os.path.join(builder.build_dir, 'source', 'creator', buildinfo)
if os.path.exists(full_path):
print("Removing {}" . format(buildinfo))
os.remove(full_path)
elif builder.startswith('win'):
if builder.startswith('win64'):
cmake_options.extend(['-G', 'Visual Studio 15 2017 Win64'])
elif builder.startswith('win32'):
bits = 32
cmake_options.extend(['-G', 'Visual Studio 15 2017'])
def cmake_configure(builder):
# CMake configuration
os.chdir(builder.build_dir)
elif builder.startswith('linux'):
cmake_config_file = "build_files/buildbot/config/blender_linux.cmake"
tokens = builder.split("_")
glibc = tokens[1]
if glibc == 'glibc224':
deb_name = "stretch"
if builder.endswith('x86_64_cmake'):
chroot_name = 'buildbot_' + deb_name + '_x86_64'
elif builder.endswith('i686_cmake'):
bits = 32
chroot_name = 'buildbot_' + deb_name + '_i686'
command_prefix = ['schroot', '-c', chroot_name, '--']
elif glibc == 'glibc217':
command_prefix = ['scl', 'enable', 'devtoolset-6', '--']
cmake_cache = os.path.join(builder.build_dir, 'CMakeCache.txt')
if os.path.exists(cmake_cache):
print("Removing CMake cache")
os.remove(cmake_cache)
cmake_options.append("-C" + os.path.join(blender_dir, cmake_config_file))
print("CMake configure:")
cmake_options = get_cmake_options(builder)
command = ['cmake', builder.blender_dir] + cmake_options
buildbot_utils.call(builder.command_prefix + command)
def cmake_build(builder):
# CMake build
os.chdir(builder.build_dir)
# NOTE: CPack will build an INSTALL target, which would mean that code
# signing will happen twice when using `make install` and CPack.
# The tricky bit here is that it is not possible to know whether the INSTALL
# target is used by CPack or by the buildbot itself. An extra complication is
# that on Windows it is required to build the INSTALL target in order
# to have unit test binaries to run.
# So on the one hand we do an extra unneeded code sign on Windows, but on
# the positive side we don't add complexity and don't make the build process more
# fragile by trying to avoid this. The signing process is way faster than just
# a clean buildbot build, especially with regression tests enabled.
if builder.platform == 'win':
command = ['cmake', '--build', '.', '--target', 'install', '--config', 'Release']
# Prepare CMake options needed to configure cuda binaries compilation, 64bit only.
if bits == 64:
cmake_options.append("-DWITH_CYCLES_CUDA_BINARIES=ON")
cmake_options.append("-DCUDA_64_BIT_DEVICE_CODE=ON")
else:
command = ['make', '-s', '-j2', 'install']
cmake_options.append("-DWITH_CYCLES_CUDA_BINARIES=OFF")
print("CMake build:")
buildbot_utils.call(builder.command_prefix + command)
cmake_options.append("-DCMAKE_INSTALL_PREFIX=%s" % (install_dir))
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
update_git(builder)
clean_directories(builder)
cmake_configure(builder)
cmake_build(builder)
cmake_options += cmake_extra_options
# Make sure no garbage remained from the previous run
if os.path.isdir(install_dir):
shutil.rmtree(install_dir)
for target in targets:
print("Building target %s" % (target))
# Construct build directory name based on the target
target_build_dir = build_dir
target_command_prefix = command_prefix[:]
if target != 'blender':
target_build_dir += '_' + target
target_name = 'install'
# Tweaking CMake options to respect the target
target_cmake_options = cmake_options[:]
# Do extra git fetch because not all platform/git/buildbot combinations
# update the origin remote, causing buildinfo to detect local changes.
os.chdir(blender_dir)
print("Fetching remotes")
command = ['git', 'fetch', '--all']
print(command)
retcode = subprocess.call(target_command_prefix + command)
if retcode != 0:
sys.exit(retcode)
# Make sure build directory exists and enter it
if not os.path.isdir(target_build_dir):
os.mkdir(target_build_dir)
os.chdir(target_build_dir)
# Configure the build
print("CMake options:")
print(target_cmake_options)
if os.path.exists('CMakeCache.txt'):
print("Removing CMake cache")
os.remove('CMakeCache.txt')
# Remove buildinfo files to force buildbot to re-generate them.
for buildinfo in ('buildinfo.h', 'buildinfo.h.txt', ):
full_path = os.path.join('source', 'creator', buildinfo)
if os.path.exists(full_path):
print("Removing {}" . format(buildinfo))
os.remove(full_path)
retcode = subprocess.call(target_command_prefix + ['cmake', blender_dir] + target_cmake_options)
if retcode != 0:
print('Configuration FAILED!')
sys.exit(retcode)
if 'win32' in builder or 'win64' in builder:
command = ['cmake', '--build', '.', '--target', target_name, '--config', 'Release']
else:
command = ['make', '-s', '-j2', target_name]
print("Executing command:")
print(command)
retcode = subprocess.call(target_command_prefix + command)
if retcode != 0:
sys.exit(retcode)
else:
print("Unknown building system")
sys.exit(1)


@@ -23,46 +23,46 @@
# to the master in the next buildbot step.
import os
import subprocess
import sys
import zipfile
from pathlib import Path
# get builder name
if len(sys.argv) < 2:
sys.stderr.write("Not enough arguments, expecting builder name\n")
sys.exit(1)
import buildbot_utils
builder = sys.argv[1]
# Never write branch if it is master.
branch = sys.argv[2] if (len(sys.argv) >= 3 and sys.argv[2] != 'master') else ''
def get_package_name(builder, platform=None):
info = buildbot_utils.VersionInfo(builder)
blender_dir = os.path.join('..', 'blender.git')
build_dir = os.path.join('..', 'build', builder)
install_dir = os.path.join('..', 'install', builder)
buildbot_upload_zip = os.path.abspath(os.path.join(os.path.dirname(install_dir), "buildbot_upload.zip"))
package_name = 'blender-' + info.full_version
if platform:
package_name += '-' + platform
if not (builder.branch == 'master' or builder.is_release_branch):
if info.is_development_build:
package_name = builder.branch + "-" + package_name
return package_name
def sign_file_or_directory(path):
from codesign.simple_code_signer import SimpleCodeSigner
code_signer = SimpleCodeSigner()
code_signer.sign_file_or_directory(Path(path))
upload_filename = None # Name of the archive to be uploaded
# (this is the name of archive which will appear on the
# download page)
upload_filepath = None # Filepath to be uploaded to the server
# (this folder will be packed)
def create_buildbot_upload_zip(builder, package_files):
import zipfile
def parse_header_file(filename, define):
import re
regex = re.compile("^#\s*define\s+%s\s+(.*)" % define)
with open(filename, "r") as file:
for l in file:
match = regex.match(l)
if match:
return match.group(1)
return None
buildbot_upload_zip = os.path.join(builder.upload_dir, "buildbot_upload.zip")
if os.path.exists(buildbot_upload_zip):
os.remove(buildbot_upload_zip)
try:
z = zipfile.ZipFile(buildbot_upload_zip, "w", compression=zipfile.ZIP_STORED)
for filepath, filename in package_files:
print("Packaged", filename)
z.write(filepath, arcname=filename)
z.close()
except Exception as ex:
sys.stderr.write('Create buildbot_upload.zip failed: ' + str(ex) + '\n')
sys.exit(1)
# Make sure install directory always exists
if not os.path.exists(install_dir):
os.makedirs(install_dir)
def create_tar_bz2(src, dest, package_name):
# One extra to remove leading os.sep when cleaning root for package_root
@@ -80,108 +80,163 @@ def create_tar_bz2(src, dest, package_name):
package.add(entry[0], entry[1], recursive=False)
package.close()
def cleanup_files(dirpath, extension):
for f in os.listdir(dirpath):
filepath = os.path.join(dirpath, f)
if os.path.isfile(filepath) and f.endswith(extension):
os.remove(filepath)
if builder.find('cmake') != -1:
# CMake
if 'win' in builder or 'mac' in builder:
os.chdir(build_dir)
files = [f for f in os.listdir('.') if os.path.isfile(f) and f.endswith('.zip')]
for f in files:
os.remove(f)
retcode = subprocess.call(['cpack', '-G', 'ZIP'])
result_file = [f for f in os.listdir('.') if os.path.isfile(f) and f.endswith('.zip')][0]
# TODO(sergey): Such magic usually happens in SCons' packaging but we don't have it
# in CMake yet. Until then we do some magic here.
tokens = result_file.split('-')
blender_version = tokens[1].split('.')
blender_full_version = '.'.join(blender_version[0:2])
git_hash = tokens[2].split('.')[1]
platform = builder.split('_')[0]
if platform == 'mac':
# Special exception for OSX
platform = 'OSX-10.9-'
if builder.endswith('x86_64_10_9_cmake'):
platform += 'x86_64'
if builder.endswith('vc2015'):
platform += "-vc14"
builderified_name = 'blender-{}-{}-{}'.format(blender_full_version, git_hash, platform)
# NOTE: Blender 2.7 is already respected by blender_full_version.
if branch != '' and branch != 'blender2.7':
builderified_name = branch + "-" + builderified_name
os.rename(result_file, "{}.zip".format(builderified_name))
# create zip file
try:
if os.path.exists(buildbot_upload_zip):
os.remove(buildbot_upload_zip)
z = zipfile.ZipFile(buildbot_upload_zip, "w", compression=zipfile.ZIP_STORED)
z.write("{}.zip".format(builderified_name))
z.close()
sys.exit(retcode)
except Exception as ex:
sys.stderr.write('Create buildbot_upload.zip failed: ' + str(ex) + '\n')
sys.exit(1)
elif builder.startswith('linux_'):
blender = os.path.join(install_dir, 'blender')
buildinfo_h = os.path.join(build_dir, "source", "creator", "buildinfo.h")
blender_h = os.path.join(blender_dir, "source", "blender", "blenkernel", "BKE_blender_version.h")
# Get version information
blender_version = int(parse_header_file(blender_h, 'BLENDER_VERSION'))
blender_version = "%d.%d" % (blender_version // 100, blender_version % 100)
blender_hash = parse_header_file(buildinfo_h, 'BUILD_HASH')[1:-1]
blender_glibc = builder.split('_')[1]
command_prefix = []
bits = 64
blender_arch = 'x86_64'
if blender_glibc == 'glibc224':
if builder.endswith('x86_64_cmake'):
chroot_name = 'buildbot_stretch_x86_64'
elif builder.endswith('i686_cmake'):
chroot_name = 'buildbot_stretch_i686'
bits = 32
blender_arch = 'i686'
command_prefix = ['schroot', '-c', chroot_name, '--']
elif blender_glibc == 'glibc217':
command_prefix = ['scl', 'enable', 'devtoolset-6', '--']
# Strip all unused symbols from the binaries
print("Stripping binaries...")
subprocess.call(command_prefix + ['strip', '--strip-all', blender])
print("Stripping python...")
py_target = os.path.join(install_dir, blender_version)
subprocess.call(command_prefix + ['find', py_target, '-iname', '*.so', '-exec', 'strip', '-s', '{}', ';'])
# Copy all specific files which are too specific to be copied by
# the CMake rules themselves
print("Copying extra scripts and libs...")
extra = '/' + os.path.join('home', 'sources', 'release-builder', 'extra')
mesalibs = os.path.join(extra, 'mesalibs' + str(bits) + '.tar.bz2')
software_gl = os.path.join(blender_dir, 'release', 'bin', 'blender-softwaregl')
icons = os.path.join(blender_dir, 'release', 'freedesktop', 'icons')
os.system('tar -xpf %s -C %s' % (mesalibs, install_dir))
os.system('cp %s %s' % (software_gl, install_dir))
os.system('cp -r %s %s' % (icons, install_dir))
os.system('chmod 755 %s' % (os.path.join(install_dir, 'blender-softwaregl')))
# Construct archive name
package_name = 'blender-%s-%s-linux-%s-%s' % (blender_version,
blender_hash,
blender_glibc,
blender_arch)
# NOTE: Blender 2.7 is already respected by blender_full_version.
if branch != '' and branch != 'blender2.7':
package_name = branch + "-" + package_name
upload_filename = package_name + ".tar.bz2"
print("Creating .tar.bz2 archive")
upload_filepath = install_dir + '.tar.bz2'
create_tar_bz2(install_dir, upload_filepath, package_name)
else:
print("Unknown building system")
sys.exit(1)
def pack_mac(builder):
info = buildbot_utils.VersionInfo(builder)
if upload_filepath is None:
# clean release directory if it already exists
release_dir = 'release'
os.chdir(builder.build_dir)
cleanup_files(builder.build_dir, '.dmg')
if os.path.exists(release_dir):
for f in os.listdir(release_dir):
if os.path.isfile(os.path.join(release_dir, f)):
os.remove(os.path.join(release_dir, f))
package_name = get_package_name(builder, 'macOS')
package_filename = package_name + '.dmg'
package_filepath = os.path.join(builder.build_dir, package_filename)
# create release package
try:
subprocess.call(['make', 'package_archive'])
except Exception as ex:
sys.stderr.write('Make package release failed: ' + str(ex) + '\n')
sys.exit(1)
release_dir = os.path.join(builder.blender_dir, 'release', 'darwin')
bundle_sh = os.path.join(release_dir, 'bundle.sh')
# find release directory, must exist this time
if not os.path.exists(release_dir):
sys.stderr.write("Failed to find release directory %r.\n" % release_dir)
sys.exit(1)
command = [bundle_sh]
command += ['--source', builder.install_dir]
command += ['--dmg', package_filepath]
if info.is_development_build:
background_image = os.path.join(release_dir, 'buildbot', 'background.tif')
command += ['--background-image', background_image]
buildbot_utils.call(command)
# find release package
file = None
filepath = None
create_buildbot_upload_zip(builder, [(package_filepath, package_filename)])
for f in os.listdir(release_dir):
rf = os.path.join(release_dir, f)
if os.path.isfile(rf) and f.startswith('blender'):
file = f
filepath = rf
if not file:
sys.stderr.write("Failed to find release package.\n")
sys.exit(1)
def pack_win(builder):
info = buildbot_utils.VersionInfo(builder)
upload_filename = file
upload_filepath = filepath
os.chdir(builder.build_dir)
cleanup_files(builder.build_dir, '.zip')
# CPack will add the platform name
cpack_name = get_package_name(builder, None)
package_name = get_package_name(builder, 'windows' + str(builder.bits))
command = ['cmake', '-DCPACK_OVERRIDE_PACKAGENAME:STRING=' + cpack_name, '.']
buildbot_utils.call(builder.command_prefix + command)
command = ['cpack', '-G', 'ZIP']
buildbot_utils.call(builder.command_prefix + command)
package_filename = package_name + '.zip'
package_filepath = os.path.join(builder.build_dir, package_filename)
package_files = [(package_filepath, package_filename)]
if info.version_cycle == 'release':
# Installer only for final release builds, otherwise will get
# 'this product is already installed' messages.
command = ['cpack', '-G', 'WIX']
buildbot_utils.call(builder.command_prefix + command)
package_filename = package_name + '.msi'
package_filepath = os.path.join(builder.build_dir, package_filename)
sign_file_or_directory(package_filepath)
package_files += [(package_filepath, package_filename)]
create_buildbot_upload_zip(builder, package_files)
def pack_linux(builder):
blender_executable = os.path.join(builder.install_dir, 'blender')
info = buildbot_utils.VersionInfo(builder)
blender_glibc = builder.name.split('_')[1]
blender_arch = 'x86_64'
# Strip all unused symbols from the binaries
print("Stripping binaries...")
buildbot_utils.call(builder.command_prefix + ['strip', '--strip-all', blender_executable])
print("Stripping python...")
py_target = os.path.join(builder.install_dir, info.version)
buildbot_utils.call(builder.command_prefix + ['find', py_target, '-iname', '*.so', '-exec', 'strip', '-s', '{}', ';'])
# Construct package name
platform_name = 'linux-' + blender_glibc + '-' + blender_arch
package_name = get_package_name(builder, platform_name)
package_filename = package_name + ".tar.bz2"
print("Creating .tar.bz2 archive")
package_filepath = builder.install_dir + '.tar.bz2'
create_tar_bz2(builder.install_dir, package_filepath, package_name)
# Create buildbot_upload.zip
create_buildbot_upload_zip(builder, [(package_filepath, package_filename)])
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
# Make sure install directory always exists
os.makedirs(builder.install_dir, exist_ok=True)
if builder.platform == 'mac':
pack_mac(builder)
elif builder.platform == 'win':
pack_win(builder)
elif builder.platform == 'linux':
pack_linux(builder)
# create zip file
try:
upload_zip = os.path.join(buildbot_upload_zip)
if os.path.exists(upload_zip):
os.remove(upload_zip)
z = zipfile.ZipFile(upload_zip, "w", compression=zipfile.ZIP_STORED)
z.write(upload_filepath, arcname=upload_filename)
z.close()
except Exception as ex:
sys.stderr.write('Create buildbot_upload.zip failed: ' + str(ex) + '\n')
sys.exit(1)

View File

@@ -21,17 +21,23 @@
# Runs on buildbot slave, rsync zip directly to buildbot server rather
# than using upload which is much slower
import buildbot_utils
import os
import sys
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
# get builder name
if len(sys.argv) < 2:
sys.stderr.write("Not enough arguments, expecting builder name\n")
sys.exit(1)
# rsync, this assumes ssh keys are set up so no password is needed
local_zip = "buildbot_upload.zip"
remote_folder = "builder.blender.org:/data/buildbot-master/uploaded/"
remote_zip = remote_folder + "buildbot_upload_" + builder.name + ".zip"
builder = sys.argv[1]
command = ["rsync", "-avz", local_zip, remote_zip]
buildbot_utils.call(command)
# rsync, this assumes ssh keys are set up so no password is needed
local_zip = "buildbot_upload.zip"
remote_folder = "builder.blender.org:/data/buildbot-master/uploaded/"
remote_zip = remote_folder + "buildbot_upload_" + builder + ".zip"
command = "rsync -avz %s %s" % (local_zip, remote_zip)
print(command)
ret = os.system(command)
sys.exit(ret)

View File

@@ -18,22 +18,59 @@
# <pep8 compliant>
import buildbot_utils
import subprocess
import os
import sys
def get_ctest_arguments(builder):
args = ['--output-on-failure']
if builder.platform == 'win':
args += ['-C', 'Release']
return args
# get builder name
if len(sys.argv) < 2:
sys.stderr.write("Not enough arguments, expecting builder name\n")
sys.exit(1)
def test(builder):
os.chdir(builder.build_dir)
builder = sys.argv[1]
command = builder.command_prefix + ['ctest'] + get_ctest_arguments(builder)
buildbot_utils.call(command)
# we run from build/ directory
blender_dir = '../blender.git'
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
test(builder)
if "cmake" in builder:
print("Automated tests are still DISABLED!")
sys.exit(0)
build_dir = os.path.abspath(os.path.join('..', 'build', builder))
install_dir = os.path.abspath(os.path.join('..', 'install', builder))
# NOTE: For a quick test only, to see if the approach works.
# In the future this must be replaced with an actual Blender version.
blender_version = '2.80'
blender_version_dir = os.path.join(install_dir, blender_version)
command_prefix = []
extra_ctest_args = []
if builder.startswith('win'):
extra_ctest_args += ['-C', 'Release']
elif builder.startswith('linux'):
tokens = builder.split("_")
glibc = tokens[1]
if glibc == 'glibc224':
deb_name = "stretch"
if builder.endswith('x86_64_cmake'):
chroot_name = 'buildbot_' + deb_name + '_x86_64'
elif builder.endswith('i686_cmake'):
chroot_name = 'buildbot_' + deb_name + '_i686'
command_prefix = ['schroot', '--preserve-environment', '-c', chroot_name, '--']
elif glibc == 'glibc217':
command_prefix = ['scl', 'enable', 'devtoolset-6', '--']
ctest_env = os.environ.copy()
ctest_env['BLENDER_SYSTEM_SCRIPTS'] = os.path.join(blender_version_dir, 'scripts')
ctest_env['BLENDER_SYSTEM_DATAFILES'] = os.path.join(blender_version_dir, 'datafiles')
os.chdir(build_dir)
retcode = subprocess.call(command_prefix + ['ctest', '--output-on-failure'] + extra_ctest_args,
env=ctest_env)
# Always exit with success until we know all the tests are passing
# on all builders.
sys.exit(0)
else:
print("Unknown building system")
sys.exit(1)

View File

@@ -18,14 +18,14 @@
# <pep8 compliant>
import buildbot_utils
import os
import sys
import runpy
if __name__ == "__main__":
builder = buildbot_utils.create_builder_from_arguments()
os.chdir(builder.blender_dir)
# We run from build/ directory.
blender_dir = os.path.join('..', 'blender.git')
blender_dir = os.path.abspath(blender_dir)
os.chdir(blender_dir)
# Run make update which handles all libraries and submodules.
make_update = os.path.join(builder.blender_dir, "build_files", "utils", "make_update.py")
buildbot_utils.call([sys.executable, make_update, '--no-blender', "--use-tests", "--use-centos-libraries"])
# Run make update which handles all libraries and submodules.
make_update = os.path.join(blender_dir, "build_files", "utils", "make_update.py")
runpy.run_path(make_update)

View File

@@ -38,14 +38,14 @@ SET(_icu_SEARCH_DIRS
)
# We don't need includes, only libs to link against...
# FIND_PATH(ICU_INCLUDE_DIR
# NAMES
# utf.h
# HINTS
# ${_icu_SEARCH_DIRS}
# PATH_SUFFIXES
# include/unicode
# )
#FIND_PATH(ICU_INCLUDE_DIR
# NAMES
# utf.h
# HINTS
# ${_icu_SEARCH_DIRS}
# PATH_SUFFIXES
# include/unicode
#)
FIND_LIBRARY(ICU_LIBRARY_DATA
NAMES

View File

@@ -1,57 +0,0 @@
# - Find OptiX library
# Find the native OptiX includes and library
# This module defines
# OPTIX_INCLUDE_DIRS, where to find optix.h, Set when
# OPTIX_INCLUDE_DIR is found.
# OPTIX_ROOT_DIR, The base directory to search for OptiX.
# This can also be an environment variable.
# OPTIX_FOUND, If false, do not try to use OptiX.
#=============================================================================
# Copyright 2019 Blender Foundation.
#
# Distributed under the OSI-approved BSD License (the "License");
# see accompanying file Copyright.txt for details.
#
# This software is distributed WITHOUT ANY WARRANTY; without even the
# implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the License for more information.
#=============================================================================
# If OPTIX_ROOT_DIR was defined in the environment, use it.
IF(NOT OPTIX_ROOT_DIR AND NOT $ENV{OPTIX_ROOT_DIR} STREQUAL "")
SET(OPTIX_ROOT_DIR $ENV{OPTIX_ROOT_DIR})
ENDIF()
SET(_optix_SEARCH_DIRS
${OPTIX_ROOT_DIR}
"$ENV{PROGRAMDATA}/NVIDIA Corporation/OptiX SDK 7.0.0"
/usr/local
/sw # Fink
/opt/local # DarwinPorts
)
FIND_PATH(OPTIX_INCLUDE_DIR
NAMES
optix.h
HINTS
${_optix_SEARCH_DIRS}
PATH_SUFFIXES
include
)
# handle the QUIETLY and REQUIRED arguments and set OPTIX_FOUND to TRUE if
# all listed variables are TRUE
INCLUDE(FindPackageHandleStandardArgs)
FIND_PACKAGE_HANDLE_STANDARD_ARGS(OptiX DEFAULT_MSG
OPTIX_INCLUDE_DIR)
IF(OPTIX_FOUND)
SET(OPTIX_INCLUDE_DIRS ${OPTIX_INCLUDE_DIR})
ENDIF(OPTIX_FOUND)
MARK_AS_ADVANCED(
OPTIX_INCLUDE_DIR
)
UNSET(_optix_SEARCH_DIRS)

View File

@@ -83,9 +83,9 @@ IF((NOT _IS_INC_DEF) OR (NOT _IS_INC_CONF_DEF) OR (NOT _IS_LIB_DEF) OR (NOT _IS_
)
FOREACH(_CURRENT_ABI_FLAGS ${_PYTHON_ABI_FLAGS_TEST})
# IF(CMAKE_BUILD_TYPE STREQUAL Debug)
# SET(_CURRENT_ABI_FLAGS "d${_CURRENT_ABI_FLAGS}")
# ENDIF()
#IF(CMAKE_BUILD_TYPE STREQUAL Debug)
# SET(_CURRENT_ABI_FLAGS "d${_CURRENT_ABI_FLAGS}")
#ENDIF()
STRING(REPLACE " " "" _CURRENT_ABI_FLAGS ${_CURRENT_ABI_FLAGS})
IF(NOT DEFINED PYTHON_INCLUDE_DIR)

View File

@@ -48,20 +48,12 @@ macro(BLENDER_SRC_GTEST_EX NAME SRC EXTRA_LIBS DO_ADD_TEST)
if(WITH_OPENMP_STATIC)
target_link_libraries(${TARGET_NAME} ${OpenMP_LIBRARIES})
endif()
get_property(GENERATOR_IS_MULTI_CONFIG GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
if(GENERATOR_IS_MULTI_CONFIG)
string(REPLACE "\${BUILD_TYPE}" "$<CONFIG>" TEST_INSTALL_DIR ${CMAKE_INSTALL_PREFIX})
else()
string(REPLACE "\${BUILD_TYPE}" "" TEST_INSTALL_DIR ${CMAKE_INSTALL_PREFIX})
endif()
set_target_properties(${TARGET_NAME} PROPERTIES
RUNTIME_OUTPUT_DIRECTORY "${TESTS_OUTPUT_DIR}"
RUNTIME_OUTPUT_DIRECTORY_RELEASE "${TESTS_OUTPUT_DIR}"
RUNTIME_OUTPUT_DIRECTORY_DEBUG "${TESTS_OUTPUT_DIR}")
if(${DO_ADD_TEST})
add_test(NAME ${TARGET_NAME} COMMAND ${TESTS_OUTPUT_DIR}/${TARGET_NAME} WORKING_DIRECTORY ${TEST_INSTALL_DIR})
add_test(NAME ${TARGET_NAME} COMMAND ${TESTS_OUTPUT_DIR}/${TARGET_NAME} WORKING_DIRECTORY $<TARGET_FILE_DIR:blender>)
# Don't fail tests on leaks since these often happen in external libraries
# that we can't fix.

View File

@@ -145,13 +145,22 @@ if(EXISTS ${SOURCE_DIR}/.git)
unset(_git_changed_files)
endif()
# BUILD_PLATFORM is taken from CMake
# BUILD_PLATFORM and BUILD_PLATFORM are taken from CMake
# but BUILD_DATE and BUILD_TIME are platform dependent
if(NOT BUILD_DATE)
STRING(TIMESTAMP BUILD_DATE "%Y-%m-%d" UTC)
endif()
if(NOT BUILD_TIME)
STRING(TIMESTAMP BUILD_TIME "%H:%M:%S" UTC)
if(UNIX)
if(NOT BUILD_DATE)
execute_process(COMMAND date "+%Y-%m-%d" OUTPUT_VARIABLE BUILD_DATE OUTPUT_STRIP_TRAILING_WHITESPACE)
endif()
if(NOT BUILD_TIME)
execute_process(COMMAND date "+%H:%M:%S" OUTPUT_VARIABLE BUILD_TIME OUTPUT_STRIP_TRAILING_WHITESPACE)
endif()
elseif(WIN32)
if(NOT BUILD_DATE)
execute_process(COMMAND cmd /c date /t OUTPUT_VARIABLE BUILD_DATE OUTPUT_STRIP_TRAILING_WHITESPACE)
endif()
if(NOT BUILD_TIME)
execute_process(COMMAND cmd /c time /t OUTPUT_VARIABLE BUILD_TIME OUTPUT_STRIP_TRAILING_WHITESPACE)
endif()
endif()
# Write a file with the BUILD_HASH define

View File

@@ -28,7 +28,6 @@ if not sys.version.startswith("3"):
from cmake_consistency_check_config import (
IGNORE_SOURCE,
IGNORE_SOURCE_MISSING,
IGNORE_CMAKE,
UTF8_CHECK,
SOURCE_DIR,
@@ -43,11 +42,6 @@ global_h = set()
global_c = set()
global_refs = {}
# Ignore cmake file, path pairs.
global_ignore_source_missing = {}
for k, v in IGNORE_SOURCE_MISSING:
global_ignore_source_missing.setdefault(k, []).append(v)
def replace_line(f, i, text, keep_indent=True):
file_handle = open(f, 'r')
@@ -143,13 +137,6 @@ def cmake_get_src(f):
cmake_base = dirname(f)
cmake_base_bin = os.path.join(BUILD_DIR, os.path.relpath(cmake_base, SOURCE_DIR))
# Find known missing sources list (if we have one).
f_rel = os.path.relpath(f, SOURCE_DIR)
f_rel_key = f_rel
if os.sep != "/":
f_rel_key = f_rel_key.replace(os.sep, "/")
local_ignore_source_missing = global_ignore_source_missing.get(f_rel_key, [])
while it is not None:
i += 1
try:
@@ -162,9 +149,6 @@ def cmake_get_src(f):
if not l.startswith("#"):
# Remove in-line comments.
l = l.split(" # ")[0].rstrip()
if ")" in l:
if l.strip() != ")":
raise Exception("strict formatting not kept '*)' %s:%d" % (f, i))
@@ -227,10 +211,7 @@ def cmake_get_src(f):
# replace_line(f, i - 1, new_path_rel)
else:
if l in local_ignore_source_missing:
local_ignore_source_missing.remove(l)
else:
raise Exception("non existent include %s:%d -> %s" % (f, i, new_file))
raise Exception("non existent include %s:%d -> %s" % (f, i, new_file))
# print(new_file)
@@ -360,12 +341,6 @@ def main():
if not ignore_used_source[index]:
print("unused ignore: %r" % ig)
# Check ignores aren't stale
print("\nCheck for unused 'IGNORE_SOURCE_MISSING' paths...")
for k, v in sorted(global_ignore_source_missing.items()):
for ig in v:
print("unused ignore: %r -> %r" % (ig, k))
# Check ignores aren't stale
print("\nCheck for unused 'IGNORE_CMAKE' paths...")
for index, ig in enumerate(IGNORE_CMAKE):

View File

@@ -17,6 +17,8 @@ IGNORE_SOURCE = (
"extern/bullet2/src/BulletDynamics/Character/btKinematicCharacterController.cpp",
"extern/bullet2/src/BulletDynamics/ConstraintSolver/btHinge2Constraint.cpp",
"extern/bullet2/src/BulletDynamics/ConstraintSolver/btUniversalConstraint.cpp",
"intern/audaspace/SRC/AUD_SRCResampleFactory.cpp",
"intern/audaspace/SRC/AUD_SRCResampleReader.cpp",
"doc/doxygen/doxygen.extern.h",
"doc/doxygen/doxygen.intern.h",
@@ -30,12 +32,8 @@ IGNORE_SOURCE = (
"extern/bullet2/src/BulletDynamics/Character/btKinematicCharacterController.h",
"extern/bullet2/src/BulletDynamics/ConstraintSolver/btHinge2Constraint.h",
"extern/bullet2/src/BulletDynamics/ConstraintSolver/btUniversalConstraint.h",
)
# Ignore cmake file, path pairs.
IGNORE_SOURCE_MISSING = (
# Use for cycles stand-alone.
("intern/cycles/util/CMakeLists.txt", "../../third_party/numaapi/include"),
"intern/audaspace/SRC/AUD_SRCResampleFactory.h",
"intern/audaspace/SRC/AUD_SRCResampleReader.h",
)
IGNORE_CMAKE = (

View File

@@ -46,9 +46,8 @@ set(WITH_OPENSUBDIV ON CACHE BOOL "" FORCE)
set(WITH_OPENVDB ON CACHE BOOL "" FORCE)
set(WITH_OPENVDB_BLOSC ON CACHE BOOL "" FORCE)
set(WITH_PYTHON_INSTALL ON CACHE BOOL "" FORCE)
set(WITH_QUADRIFLOW ON CACHE BOOL "" FORCE)
set(WITH_RAYOPTIMIZATION ON CACHE BOOL "" FORCE)
set(WITH_SDL ON CACHE BOOL "" FORCE)
set(WITH_TBB ON CACHE BOOL "" FORCE)
set(WITH_X11_XINPUT ON CACHE BOOL "" FORCE)
set(WITH_X11_XF86VMODE ON CACHE BOOL "" FORCE)

View File

@@ -17,7 +17,6 @@ set(WITH_CODEC_FFMPEG OFF CACHE BOOL "" FORCE)
set(WITH_CODEC_SNDFILE OFF CACHE BOOL "" FORCE)
set(WITH_CYCLES OFF CACHE BOOL "" FORCE)
set(WITH_CYCLES_OSL OFF CACHE BOOL "" FORCE)
set(WITH_CYCLES_DEVICE_OPTIX OFF CACHE BOOL "" FORCE)
set(WITH_DRACO OFF CACHE BOOL "" FORCE)
set(WITH_FFTW3 OFF CACHE BOOL "" FORCE)
set(WITH_LIBMV OFF CACHE BOOL "" FORCE)
@@ -51,8 +50,7 @@ set(WITH_OPENIMAGEIO OFF CACHE BOOL "" FORCE)
set(WITH_OPENMP OFF CACHE BOOL "" FORCE)
set(WITH_OPENSUBDIV OFF CACHE BOOL "" FORCE)
set(WITH_OPENVDB OFF CACHE BOOL "" FORCE)
set(WITH_QUADRIFLOW OFF CACHE BOOL "" FORCE)
set(WITH_RAYOPTIMIZATION OFF CACHE BOOL "" FORCE)
set(WITH_SDL OFF CACHE BOOL "" FORCE)
set(WITH_TBB OFF CACHE BOOL "" FORCE)
set(WITH_X11_XINPUT OFF CACHE BOOL "" FORCE)
set(WITH_X11_XF86VMODE OFF CACHE BOOL "" FORCE)

View File

@@ -47,9 +47,8 @@ set(WITH_OPENSUBDIV ON CACHE BOOL "" FORCE)
set(WITH_OPENVDB ON CACHE BOOL "" FORCE)
set(WITH_OPENVDB_BLOSC ON CACHE BOOL "" FORCE)
set(WITH_PYTHON_INSTALL ON CACHE BOOL "" FORCE)
set(WITH_QUADRIFLOW ON CACHE BOOL "" FORCE)
set(WITH_RAYOPTIMIZATION ON CACHE BOOL "" FORCE)
set(WITH_SDL ON CACHE BOOL "" FORCE)
set(WITH_TBB ON CACHE BOOL "" FORCE)
set(WITH_X11_XINPUT ON CACHE BOOL "" FORCE)
set(WITH_X11_XF86VMODE ON CACHE BOOL "" FORCE)
@@ -57,7 +56,6 @@ set(WITH_MEM_JEMALLOC ON CACHE BOOL "" FORCE)
set(WITH_CYCLES_CUDA_BINARIES ON CACHE BOOL "" FORCE)
set(WITH_CYCLES_CUBIN_COMPILER OFF CACHE BOOL "" FORCE)
set(CYCLES_CUDA_BINARIES_ARCH sm_30;sm_35;sm_37;sm_50;sm_52;sm_60;sm_61;sm_70;sm_75 CACHE STRING "" FORCE)
set(WITH_CYCLES_DEVICE_OPTIX ON CACHE BOOL "" FORCE)
# platform dependent options
if(UNIX AND NOT APPLE)

View File

@@ -31,18 +31,7 @@ set(WITH_BULLET OFF CACHE BOOL "" FORCE)
set(WITH_OPENVDB OFF CACHE BOOL "" FORCE)
set(WITH_ALEMBIC OFF CACHE BOOL "" FORCE)
# Depends on Python install, do this to quiet warning.
set(WITH_DRACO OFF CACHE BOOL "" FORCE)
# Note, if linking errors can be resolved, lines below can be removed.
# Until then, disable configurations known to fail.
if(UNIX AND NOT APPLE)
if(CMAKE_SYSTEM_NAME MATCHES "Linux")
# jemalloc causes linking error on import, disable.
set(WITH_MEM_JEMALLOC OFF CACHE BOOL "" FORCE)
endif()
elseif(APPLE)
# OpenMP causes linking error on build, disable.
if(CMAKE_SYSTEM_NAME MATCHES "Linux")
# jemalloc causes linking error on import, disable.
set(WITH_MEM_JEMALLOC OFF CACHE BOOL "" FORCE)
endif()

View File

@@ -147,9 +147,9 @@ function(blender_include_dirs
get_filename_component(_ABS_INC ${_INC} ABSOLUTE)
list(APPEND _ALL_INCS ${_ABS_INC})
# for checking for invalid includes, disable for regular use
# if(NOT EXISTS "${_ABS_INC}/")
# message(FATAL_ERROR "Include not found: ${_ABS_INC}/")
# endif()
##if(NOT EXISTS "${_ABS_INC}/")
## message(FATAL_ERROR "Include not found: ${_ABS_INC}/")
##endif()
endforeach()
include_directories(${_ALL_INCS})
endfunction()
@@ -162,9 +162,9 @@ function(blender_include_dirs_sys
foreach(_INC ${ARGV})
get_filename_component(_ABS_INC ${_INC} ABSOLUTE)
list(APPEND _ALL_INCS ${_ABS_INC})
# if(NOT EXISTS "${_ABS_INC}/")
# message(FATAL_ERROR "Include not found: ${_ABS_INC}/")
# endif()
##if(NOT EXISTS "${_ABS_INC}/")
## message(FATAL_ERROR "Include not found: ${_ABS_INC}/")
##endif()
endforeach()
include_directories(SYSTEM ${_ALL_INCS})
endfunction()
@@ -173,7 +173,7 @@ function(blender_source_group
sources
)
# if enabled, use the sources directories as filters.
#if enabled, use the sources directories as filters.
if(WINDOWS_USE_VISUAL_STUDIO_SOURCE_FOLDERS)
foreach(_SRC ${sources})
# remove ../'s
@@ -259,7 +259,7 @@ function(blender_add_lib__impl
# listed is helpful for IDE's (QtCreator/MSVC)
blender_source_group("${sources}")
# if enabled, set the FOLDER property for visual studio projects
#if enabled, set the FOLDER property for visual studio projects
if(WINDOWS_USE_VISUAL_STUDIO_PROJECT_FOLDERS)
get_filename_component(FolderDir ${CMAKE_CURRENT_SOURCE_DIR} DIRECTORY)
string(REPLACE ${CMAKE_SOURCE_DIR} "" FolderDir ${FolderDir})
@@ -375,7 +375,7 @@ function(SETUP_LIBDIRS)
endif()
if(WITH_OPENCOLLADA)
link_directories(${OPENCOLLADA_LIBPATH})
# # Never set
## Never set
# link_directories(${PCRE_LIBPATH})
# link_directories(${EXPAT_LIBPATH})
endif()
@@ -396,7 +396,6 @@ endfunction()
macro(setup_platform_linker_flags)
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} ${PLATFORM_LINKFLAGS}")
set(CMAKE_EXE_LINKER_FLAGS_RELEASE "${CMAKE_EXE_LINKER_FLAGS_RELEASE} ${PLATFORM_LINKFLAGS_RELEASE}")
set(CMAKE_EXE_LINKER_FLAGS_DEBUG "${CMAKE_EXE_LINKER_FLAGS_DEBUG} ${PLATFORM_LINKFLAGS_DEBUG}")
endmacro()
@@ -406,15 +405,12 @@ function(setup_liblinks
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} ${PLATFORM_LINKFLAGS}" PARENT_SCOPE)
set(CMAKE_EXE_LINKER_FLAGS_DEBUG "${CMAKE_EXE_LINKER_FLAGS_DEBUG} ${PLATFORM_LINKFLAGS_DEBUG}" PARENT_SCOPE)
set(CMAKE_EXE_LINKER_FLAGS_RELEASE "${CMAKE_EXE_LINKER_FLAGS_RELEASE} ${PLATFORM_LINKFLAGS_RELEASE}" PARENT_SCOPE)
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} ${PLATFORM_LINKFLAGS}" PARENT_SCOPE)
set(CMAKE_SHARED_LINKER_FLAGS_DEBUG "${CMAKE_SHARED_LINKER_FLAGS_DEBUG} ${PLATFORM_LINKFLAGS_DEBUG}" PARENT_SCOPE)
set(CMAKE_SHARED_LINKER_FLAGS_RELEASE "${CMAKE_SHARED_LINKER_FLAGS_RELEASE} ${PLATFORM_LINKFLAGS_RELEASE}" PARENT_SCOPE)
set(CMAKE_MODULE_LINKER_FLAGS "${CMAKE_MODULE_LINKER_FLAGS} ${PLATFORM_LINKFLAGS}" PARENT_SCOPE)
set(CMAKE_MODULE_LINKER_FLAGS_DEBUG "${CMAKE_MODULE_LINKER_FLAGS_DEBUG} ${PLATFORM_LINKFLAGS_DEBUG}" PARENT_SCOPE)
set(CMAKE_MODULE_LINKER_FLAGS_RELEASE "${CMAKE_MODULE_LINKER_FLAGS_RELEASE} ${PLATFORM_LINKFLAGS_RELEASE}" PARENT_SCOPE)
# jemalloc must be early in the list, to be before pthread (see T57998)
if(WITH_MEM_JEMALLOC)
@@ -464,16 +460,13 @@ function(setup_liblinks
target_link_libraries(${target} ${OSL_LIBRARIES})
endif()
if(WITH_OPENVDB)
target_link_libraries(${target} ${OPENVDB_LIBRARIES} ${BLOSC_LIBRARIES})
target_link_libraries(${target} ${OPENVDB_LIBRARIES} ${TBB_LIBRARIES} ${BLOSC_LIBRARIES})
endif()
if(WITH_OPENIMAGEIO)
target_link_libraries(${target} ${OPENIMAGEIO_LIBRARIES})
endif()
if(WITH_OPENIMAGEDENOISE)
target_link_libraries(${target} ${OPENIMAGEDENOISE_LIBRARIES})
endif()
if(WITH_TBB)
target_link_libraries(${target} ${TBB_LIBRARIES})
target_link_libraries(${target} ${OPENIMAGEDENOISE_LIBRARIES} ${TBB_LIBRARIES})
endif()
if(WITH_OPENCOLORIO)
target_link_libraries(${target} ${OPENCOLORIO_LIBRARIES})
@@ -534,6 +527,9 @@ function(setup_liblinks
)
endif()
endif()
if(WITH_MOD_CLOTH_ELTOPO)
target_link_libraries(${target} ${LAPACK_LIBRARIES})
endif()
if(WITH_LLVM)
target_link_libraries(${target} ${LLVM_LIBRARY})
endif()
@@ -570,11 +566,11 @@ function(setup_liblinks
${ZLIB_LIBRARIES}
)
# System libraries with no dependencies such as platform link libs or opengl should go last.
#system libraries with no dependencies such as platform link libs or opengl should go last
target_link_libraries(${target}
${BLENDER_GL_LIBRARIES})
# target_link_libraries(${target} ${PLATFORM_LINKLIBS} ${CMAKE_DL_LIBS})
#target_link_libraries(${target} ${PLATFORM_LINKLIBS} ${CMAKE_DL_LIBS})
target_link_libraries(${target} ${PLATFORM_LINKLIBS})
endfunction()
@@ -723,7 +719,7 @@ macro(remove_strict_flags)
endif()
if(MSVC)
remove_cc_flag(/w34189) # Restore warn C4189 (unused variable) back to w4
remove_cc_flag(/w34189) # Restore warn C4189 (unused variable) back to w4
endif()
endmacro()
@@ -1068,7 +1064,7 @@ function(data_to_c_simple_icons
add_custom_command(
OUTPUT ${_file_from} ${_file_to}
COMMAND ${CMAKE_COMMAND} -E make_directory ${_file_to_path}
# COMMAND python3 ${CMAKE_SOURCE_DIR}/source/blender/datatoc/datatoc_icon.py ${_path_from_abs} ${_file_from}
#COMMAND python3 ${CMAKE_SOURCE_DIR}/source/blender/datatoc/datatoc_icon.py ${_path_from_abs} ${_file_from}
COMMAND "$<TARGET_FILE:datatoc_icon>" ${_path_from_abs} ${_file_from}
COMMAND "$<TARGET_FILE:datatoc>" ${_file_from} ${_file_to}
DEPENDS
@@ -1213,9 +1209,7 @@ macro(openmp_delayload
)
if(MSVC)
if(WITH_OPENMP)
if(MSVC_CLANG)
set(OPENMP_DLL_NAME "libomp")
elseif(MSVC_VERSION EQUAL 1800)
if(MSVC_VERSION EQUAL 1800)
set(OPENMP_DLL_NAME "vcomp120")
else()
set(OPENMP_DLL_NAME "vcomp140")
@@ -1228,6 +1222,29 @@ macro(openmp_delayload
endif()
endmacro()
macro(WINDOWS_SIGN_TARGET target)
if(WITH_WINDOWS_CODESIGN)
if(!SIGNTOOL_EXE)
error("Codesigning is enabled, but signtool is not found")
else()
if(WINDOWS_CODESIGN_PFX_PASSWORD)
set(CODESIGNPASSWORD /p ${WINDOWS_CODESIGN_PFX_PASSWORD})
else()
if($ENV{PFXPASSWORD})
set(CODESIGNPASSWORD /p $ENV{PFXPASSWORD})
else()
message(FATAL_ERROR "WITH_WINDOWS_CODESIGN is on but WINDOWS_CODESIGN_PFX_PASSWORD not set, and environment variable PFXPASSWORD not found, unable to sign code.")
endif()
endif()
add_custom_command(TARGET ${target}
POST_BUILD
COMMAND ${SIGNTOOL_EXE} sign /f ${WINDOWS_CODESIGN_PFX} ${CODESIGNPASSWORD} $<TARGET_FILE:${target}>
VERBATIM
)
endif()
endif()
endmacro()
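As a hedged illustration of the signing step in the WINDOWS_SIGN_TARGET macro above, here is a rough Python transliteration showing the same password lookup order (explicit password first, then the PFXPASSWORD environment variable, otherwise abort) and the signtool invocation; the executable and file paths are hypothetical:
# Sketch only, not part of the build system.
import os
import subprocess
import sys

def sign_binary(signtool_exe, pfx_path, target, password=None):
    # Mirror the macro: explicit password wins, then $PFXPASSWORD, else fail hard.
    password = password or os.environ.get('PFXPASSWORD')
    if not password:
        sys.exit("Codesigning requested but no PFX password was provided")
    subprocess.check_call([signtool_exe, 'sign', '/f', pfx_path, '/p', password, target])

# sign_binary('signtool.exe', 'codesign.pfx', 'blender.exe')  # hypothetical paths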
macro(blender_precompile_headers target cpp header)
if(MSVC)
# get the name for the pch output file

View File

@@ -100,8 +100,8 @@ if(WITH_PYTHON)
set(PYTHON_INCLUDE_DIR "${_py_framework}/include/python${PYTHON_VERSION}m")
set(PYTHON_EXECUTABLE "${_py_framework}/bin/python${PYTHON_VERSION}m")
set(PYTHON_LIBPATH "${_py_framework}/lib/python${PYTHON_VERSION}/config-${PYTHON_VERSION}m")
# set(PYTHON_LIBRARY python${PYTHON_VERSION})
# set(PYTHON_LINKFLAGS "-u _PyMac_Error -framework Python") # won't build with this enabled
#set(PYTHON_LIBRARY python${PYTHON_VERSION})
#set(PYTHON_LINKFLAGS "-u _PyMac_Error -framework Python") # won't build with this enabled
unset(_py_framework)
endif()
@@ -157,7 +157,7 @@ if(WITH_CODEC_FFMPEG)
avcodec avdevice avformat avutil
mp3lame swscale x264 xvidcore
theora theoradec theoraenc
vorbis vorbisenc vorbisfile ogg opus
vorbis vorbisenc vorbisfile ogg
vpx swresample)
set(FFMPEG_LIBPATH ${FFMPEG}/lib)
endif()
@@ -218,12 +218,12 @@ if(WITH_OPENCOLLADA)
${OPENCOLLADA_LIBPATH}/libxml2.a
)
# PCRE is bundled with openCollada
# set(PCRE ${LIBDIR}/pcre)
# set(PCRE_LIBPATH ${PCRE}/lib)
#set(PCRE ${LIBDIR}/pcre)
#set(PCRE_LIBPATH ${PCRE}/lib)
set(PCRE_LIBRARIES pcre)
# libxml2 is used
# set(EXPAT ${LIBDIR}/expat)
# set(EXPAT_LIBPATH ${EXPAT}/lib)
#set(EXPAT ${LIBDIR}/expat)
#set(EXPAT_LIBPATH ${EXPAT}/lib)
set(EXPAT_LIB)
endif()
@@ -313,13 +313,16 @@ endif()
if(WITH_OPENVDB)
set(OPENVDB ${LIBDIR}/openvdb)
set(OPENVDB_INCLUDE_DIRS ${OPENVDB}/include)
set(OPENVDB_LIBRARIES openvdb blosc)
set(TBB_INCLUDE_DIRS ${LIBDIR}/tbb/include)
set(TBB_LIBRARIES ${LIBDIR}/tbb/lib/libtbb.a)
set(OPENVDB_LIBRARIES openvdb blosc ${TBB_LIBRARIES})
set(OPENVDB_LIBPATH ${LIBDIR}/openvdb/lib)
set(OPENVDB_DEFINITIONS)
endif()
if(WITH_LLVM)
set(LLVM_ROOT_DIR ${LIBDIR}/llvm)
set(LLVM_VERSION 3.4)
if(EXISTS "${LLVM_ROOT_DIR}/bin/llvm-config")
set(LLVM_CONFIG "${LLVM_ROOT_DIR}/bin/llvm-config")
else()
@@ -331,9 +334,6 @@ if(WITH_LLVM)
execute_process(COMMAND ${LLVM_CONFIG} --prefix
OUTPUT_VARIABLE LLVM_ROOT_DIR
OUTPUT_STRIP_TRAILING_WHITESPACE)
execute_process(COMMAND ${LLVM_CONFIG} --includedir
OUTPUT_VARIABLE LLVM_INCLUDE_DIRS
OUTPUT_STRIP_TRAILING_WHITESPACE)
execute_process(COMMAND ${LLVM_CONFIG} --libdir
OUTPUT_VARIABLE LLVM_LIBPATH
OUTPUT_STRIP_TRAILING_WHITESPACE)
@@ -384,25 +384,14 @@ endif()
if(WITH_OPENIMAGEDENOISE)
find_package(OpenImageDenoise)
find_package(TBB)
if(NOT OPENIMAGEDENOISE_FOUND)
set(WITH_OPENIMAGEDENOISE OFF)
message(STATUS "OpenImageDenoise not found")
endif()
endif()
if(WITH_TBB)
find_package(TBB)
endif()
if(NOT WITH_TBB OR NOT TBB_FOUND)
if(WITH_OPENIMAGEDENOISE)
message(STATUS "TBB not found, disabling OpenImageDenoise")
elseif(NOT TBB_FOUND)
set(WITH_OPENIMAGEDENOISE OFF)
endif()
if(WITH_OPENVDB)
message(STATUS "TBB not found, disabling OpenVDB")
set(WITH_OPENVDB OFF)
message(STATUS "TBB not found")
endif()
endif()

View File

@@ -22,30 +22,14 @@
# Detect precompiled library directory
if(NOT DEFINED LIBDIR)
# Path to a locally compiled libraries.
set(LIBDIR_NAME ${CMAKE_SYSTEM_NAME}_${CMAKE_SYSTEM_PROCESSOR})
string(TOLOWER ${LIBDIR_NAME} LIBDIR_NAME)
set(LIBDIR_NATIVE_ABI ${CMAKE_SOURCE_DIR}/../lib/${LIBDIR_NAME})
# Path to precompiled libraries with known CentOS 7 ABI.
set(LIBDIR_CENTOS7_ABI ${CMAKE_SOURCE_DIR}/../lib/linux_centos7_x86_64)
# Choose the best suitable libraries.
if(EXISTS ${LIBDIR_NATIVE_ABI})
set(LIBDIR ${LIBDIR_NATIVE_ABI})
elseif(EXISTS ${LIBDIR_CENTOS7_ABI})
set(LIBDIR ${LIBDIR_CENTOS7_ABI})
set(WITH_CXX11_ABI OFF)
endif()
# Avoid namespace pollution.
unset(LIBDIR_NATIVE_ABI)
unset(LIBDIR_CENTOS7_ABI)
set(LIBDIR ${CMAKE_SOURCE_DIR}/../lib/${LIBDIR_NAME})
else()
message(STATUS "Using pre-compiled LIBDIR: ${LIBDIR}")
endif()
if(EXISTS ${LIBDIR})
message(STATUS "Using pre-compiled LIBDIR: ${LIBDIR}")
file(GLOB LIB_SUBDIRS ${LIBDIR}/*)
# NOTE: Make sure "proper" compiled zlib comes first before the one
# which is a part of OpenCollada. They have different ABI, and we
@@ -260,8 +244,13 @@ endif()
if(WITH_OPENVDB)
find_package_wrapper(OpenVDB)
find_package_wrapper(TBB)
find_package_wrapper(Blosc)
if(NOT OPENVDB_FOUND)
if(NOT TBB_FOUND)
set(WITH_OPENVDB OFF)
set(WITH_OPENVDB_BLOSC OFF)
message(STATUS "TBB not found, disabling OpenVDB")
elseif(NOT OPENVDB_FOUND)
set(WITH_OPENVDB OFF)
set(WITH_OPENVDB_BLOSC OFF)
message(STATUS "OpenVDB not found, disabling it")
@@ -427,21 +416,6 @@ if(WITH_OPENSUBDIV)
endif()
endif()
if(WITH_TBB)
find_package_wrapper(TBB)
endif()
if(NOT WITH_TBB OR NOT TBB_FOUND)
if(WITH_OPENIMAGEDENOISE)
message(STATUS "TBB not found, disabling OpenImageDenoise")
set(WITH_OPENIMAGEDENOISE OFF)
endif()
if(WITH_OPENVDB)
message(STATUS "TBB not found, disabling OpenVDB")
set(WITH_OPENVDB OFF)
endif()
endif()
# OpenSuse needs lutil, ArchLinux not, for now keep, can avoid by using --as-needed
if(HAIKU)
list(APPEND PLATFORM_LINKLIBS -lnetwork)

View File

@@ -35,22 +35,6 @@ if(CMAKE_C_COMPILER_ID MATCHES "Clang")
else()
message("Unable to detect the Visual Studio redist directory, copying of the runtime dlls will not work, try running from the visual studio developer prompt.")
endif()
# 1) CMake has issues detecting OpenMP support in clang-cl, so we have to provide
# the right switches here.
# 2) While the /openmp switch *should* work, it currently doesn't as of clang 9.0.0
if(WITH_OPENMP)
set(OPENMP_CUSTOM ON)
set(OPENMP_FOUND ON)
set(OpenMP_C_FLAGS "/clang:-fopenmp")
set(OpenMP_CXX_FLAGS "/clang:-fopenmp")
GET_FILENAME_COMPONENT(LLVMROOT "[HKEY_LOCAL_MACHINE\\SOFTWARE\\WOW6432Node\\LLVM\\LLVM;]" ABSOLUTE CACHE)
set(CLANG_OPENMP_DLL "${LLVMROOT}/bin/libomp.dll")
set(CLANG_OPENMP_LIB "${LLVMROOT}/lib/libomp.lib")
if(NOT EXISTS "${CLANG_OPENMP_DLL}")
message(FATAL_ERROR "Clang OpenMP library (${CLANG_OPENMP_DLL}) not found.")
endif()
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} \"${CLANG_OPENMP_LIB}\"")
endif()
endif()
set_property(GLOBAL PROPERTY USE_FOLDERS ${WINDOWS_USE_VISUAL_STUDIO_PROJECT_FOLDERS})
@@ -112,7 +96,7 @@ set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} /SAFESEH:NO")
set(CMAKE_MODULE_LINKER_FLAGS "${CMAKE_MODULE_LINKER_FLAGS} /SAFESEH:NO")
list(APPEND PLATFORM_LINKLIBS
ws2_32 vfw32 winmm kernel32 user32 gdi32 comdlg32 Comctl32
ws2_32 vfw32 winmm kernel32 user32 gdi32 comdlg32
advapi32 shfolder shell32 ole32 oleaut32 uuid psapi Dbghelp
)
@@ -151,23 +135,22 @@ else()
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} /nologo /J /Gd /MP /bigobj")
endif()
set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} /MDd /ZI")
set(CMAKE_C_FLAGS_DEBUG "${CMAKE_C_FLAGS_DEBUG} /MDd /ZI")
set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} /MD")
set(CMAKE_C_FLAGS_RELEASE "${CMAKE_C_FLAGS_RELEASE} /MD")
set(CMAKE_CXX_FLAGS_MINSIZEREL "${CMAKE_CXX_FLAGS_MINSIZEREL} /MD")
set(CMAKE_C_FLAGS_MINSIZEREL "${CMAKE_C_FLAGS_MINSIZEREL} /MD")
set(CMAKE_CXX_FLAGS_RELWITHDEBINFO "${CMAKE_CXX_FLAGS_RELWITHDEBINFO} /MD")
set(CMAKE_C_FLAGS_RELWITHDEBINFO "${CMAKE_C_FLAGS_RELWITHDEBINFO} /MD")
set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} /MTd /ZI")
set(CMAKE_C_FLAGS_DEBUG "${CMAKE_C_FLAGS_DEBUG} /MTd /ZI")
set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} /MT")
set(CMAKE_C_FLAGS_RELEASE "${CMAKE_C_FLAGS_RELEASE} /MT")
set(CMAKE_CXX_FLAGS_MINSIZEREL "${CMAKE_CXX_FLAGS_MINSIZEREL} /MT")
set(CMAKE_C_FLAGS_MINSIZEREL "${CMAKE_C_FLAGS_MINSIZEREL} /MT")
set(CMAKE_CXX_FLAGS_RELWITHDEBINFO "${CMAKE_CXX_FLAGS_RELWITHDEBINFO} /MT")
set(CMAKE_C_FLAGS_RELWITHDEBINFO "${CMAKE_C_FLAGS_RELWITHDEBINFO} /MT")
# JMC is available on msvc 15.8 (1915) and up
#JMC is available on msvc 15.8 (1915) and up
if(MSVC_VERSION GREATER 1914 AND NOT MSVC_CLANG)
set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} /JMC")
endif()
set(PLATFORM_LINKFLAGS "${PLATFORM_LINKFLAGS} /SUBSYSTEM:CONSOLE /STACK:2097152 /INCREMENTAL:NO ")
set(PLATFORM_LINKFLAGS_RELEASE "/NODEFAULTLIB:libcmt.lib /NODEFAULTLIB:libcmtd.lib /NODEFAULTLIB:msvcrtd.lib")
set(PLATFORM_LINKFLAGS_DEBUG "${PLATFORM_LINKFLAGS_DEBUG} /IGNORE:4099 /NODEFAULTLIB:libcmt.lib /NODEFAULTLIB:msvcrt.lib /NODEFAULTLIB:libcmtd.lib")
set(PLATFORM_LINKFLAGS "${PLATFORM_LINKFLAGS} /NODEFAULTLIB:msvcrt.lib /NODEFAULTLIB:msvcmrt.lib /NODEFAULTLIB:msvcurt.lib /NODEFAULTLIB:msvcrtd.lib ")
# Ignore linker warnings that are meaningless for us.
set(PLATFORM_LINKFLAGS "${PLATFORM_LINKFLAGS} /ignore:4049 /ignore:4217 /ignore:4221")
@@ -179,6 +162,8 @@ else()
set(PLATFORM_LINKFLAGS "/MACHINE:IX86 /LARGEADDRESSAWARE ${PLATFORM_LINKFLAGS}")
endif()
set(PLATFORM_LINKFLAGS_DEBUG "${PLATFORM_LINKFLAGS_DEBUG} /IGNORE:4099 /NODEFAULTLIB:libcmt.lib /NODEFAULTLIB:libc.lib")
if(NOT DEFINED LIBDIR)
# Setup 64bit and 32bit windows systems
@@ -191,19 +176,22 @@ if(NOT DEFINED LIBDIR)
# Can be 1910..1912
if(MSVC_VERSION GREATER 1919)
message(STATUS "Visual Studio 2019 detected.")
set(LIBDIR ${CMAKE_SOURCE_DIR}/../lib/${LIBDIR_BASE}_vc15)
set(LIBDIR ${CMAKE_SOURCE_DIR}/../lib/${LIBDIR_BASE}_vc14)
elseif(MSVC_VERSION GREATER 1909)
message(STATUS "Visual Studio 2017 detected.")
set(LIBDIR ${CMAKE_SOURCE_DIR}/../lib/${LIBDIR_BASE}_vc15)
set(LIBDIR ${CMAKE_SOURCE_DIR}/../lib/${LIBDIR_BASE}_vc14)
elseif(MSVC_VERSION EQUAL 1900)
message(STATUS "Visual Studio 2015 detected.")
set(LIBDIR ${CMAKE_SOURCE_DIR}/../lib/${LIBDIR_BASE}_vc15)
set(LIBDIR ${CMAKE_SOURCE_DIR}/../lib/${LIBDIR_BASE}_vc14)
else()
message(STATUS "Visual Studio 2013 detected.")
set(LIBDIR ${CMAKE_SOURCE_DIR}/../lib/${LIBDIR_BASE}_vc12)
endif()
else()
message(STATUS "Using pre-compiled LIBDIR: ${LIBDIR}")
endif()
if(NOT EXISTS "${LIBDIR}/")
message(FATAL_ERROR "\n\nWindows requires pre-compiled libs at: '${LIBDIR}'. Please run `make update` in the blender source folder to obtain them.")
message(FATAL_ERROR "Windows requires pre-compiled libs at: '${LIBDIR}'")
endif()
# Mark libdir as system headers with a lower warn level, to resolve some warnings
@@ -393,8 +381,8 @@ if(WITH_BOOST)
set(BOOST_INCLUDE_DIR ${BOOST}/include)
set(BOOST_LIBPATH ${BOOST}/lib)
if(CMAKE_CL_64)
set(BOOST_POSTFIX "vc141-mt-x64-1_68.lib")
set(BOOST_DEBUG_POSTFIX "vc141-mt-gd-x64-1_68.lib")
set(BOOST_POSTFIX "vc140-mt-s-x64-1_68.lib")
set(BOOST_DEBUG_POSTFIX "vc140-mt-sgd-x64-1_68.lib")
endif()
set(BOOST_LIBRARIES
optimized ${BOOST_LIBPATH}/libboost_date_time-${BOOST_POSTFIX}
@@ -487,24 +475,25 @@ endif()
if(WITH_OPENVDB)
set(BLOSC_LIBRARIES optimized ${LIBDIR}/blosc/lib/libblosc.lib debug ${LIBDIR}/blosc/lib/libblosc_d.lib)
set(TBB_LIBRARIES optimized ${LIBDIR}/tbb/lib/tbb.lib debug ${LIBDIR}/tbb/lib/tbb_debug.lib)
set(TBB_INCLUDE_DIR ${LIBDIR}/tbb/include)
set(OPENVDB ${LIBDIR}/openVDB)
set(OPENVDB_LIBPATH ${OPENVDB}/lib)
set(OPENVDB_INCLUDE_DIRS ${OPENVDB}/include)
set(OPENVDB_LIBRARIES optimized ${OPENVDB_LIBPATH}/openvdb.lib debug ${OPENVDB_LIBPATH}/openvdb_d.lib ${BLOSC_LIBRARIES})
set(OPENVDB_INCLUDE_DIRS ${OPENVDB}/include ${TBB_INCLUDE_DIR})
set(OPENVDB_LIBRARIES optimized ${OPENVDB_LIBPATH}/openvdb.lib debug ${OPENVDB_LIBPATH}/openvdb_d.lib ${TBB_LIBRARIES} ${BLOSC_LIBRARIES})
set(OPENVDB_DEFINITIONS -DNOMINMAX)
endif()
if(WITH_OPENIMAGEDENOISE)
set(TBB_LIBRARIES optimized ${LIBDIR}/tbb/lib/tbb.lib debug ${LIBDIR}/tbb/lib/tbb_debug.lib)
set(TBB_INCLUDE_DIR ${LIBDIR}/tbb/include)
set(OPENIMAGEDENOISE ${LIBDIR}/OpenImageDenoise)
set(OPENIMAGEDENOISE_LIBPATH ${LIBDIR}/OpenImageDenoise/lib)
set(OPENIMAGEDENOISE_INCLUDE_DIRS ${OPENIMAGEDENOISE}/include)
set(OPENIMAGEDENOISE_INCLUDE_DIRS ${OPENIMAGEDENOISE}/include ${TBB_INCLUDE_DIR})
set(OPENIMAGEDENOISE_LIBRARIES
optimized ${OPENIMAGEDENOISE_LIBPATH}/OpenImageDenoise.lib
optimized ${OPENIMAGEDENOISE_LIBPATH}/common.lib
optimized ${OPENIMAGEDENOISE_LIBPATH}/mkldnn.lib
debug ${OPENIMAGEDENOISE_LIBPATH}/OpenImageDenoise_d.lib
debug ${OPENIMAGEDENOISE_LIBPATH}/common_d.lib
debug ${OPENIMAGEDENOISE_LIBPATH}/mkldnn_d.lib)
optimized ${OPENIMAGEDENOISE_LIBPATH}/OpenImageDenoise.lib ${OPENIMAGEDENOISE_LIBPATH}/common.lib ${OPENIMAGEDENOISE_LIBPATH}/mkldnn.lib
debug ${OPENIMAGEDENOISE_LIBPATH}/OpenImageDenoise_d.lib ${OPENIMAGEDENOISE_LIBPATH}/common_d.lib ${OPENIMAGEDENOISE_LIBPATH}/mkldnn_d.lib
${TBB_LIBRARIES})
set(OPENIMAGEDENOISE_DEFINITIONS)
endif()
@@ -517,6 +506,17 @@ if(WITH_ALEMBIC)
set(ALEMBIC_FOUND 1)
endif()
if(WITH_MOD_CLOTH_ELTOPO)
set(LAPACK ${LIBDIR}/lapack)
# set(LAPACK_INCLUDE_DIR ${LAPACK}/include)
set(LAPACK_LIBPATH ${LAPACK}/lib)
set(LAPACK_LIBRARIES
${LIBDIR}/lapack/lib/libf2c.lib
${LIBDIR}/lapack/lib/clapack_nowrap.lib
${LIBDIR}/lapack/lib/BLAS_nowrap.lib
)
endif()
if(WITH_IMAGE_OPENJPEG)
set(OPENJPEG ${LIBDIR}/openjpeg)
set(OPENJPEG_INCLUDE_DIRS ${OPENJPEG}/include/openjpeg-2.3)
@@ -558,27 +558,20 @@ if(WITH_SYSTEM_AUDASPACE)
set(AUDASPACE_PY_LIBRARIES ${LIBDIR}/audaspace/lib/audaspace-py.lib)
endif()
if(WITH_TBB)
set(TBB_LIBRARIES optimized ${LIBDIR}/tbb/lib/tbb.lib debug ${LIBDIR}/tbb/lib/tbb_debug.lib)
set(TBB_INCLUDE_DIR ${LIBDIR}/tbb/include)
set(TBB_INCLUDE_DIRS ${TBB_INCLUDE_DIR})
if(WITH_TBB_MALLOC_PROXY)
add_definitions(-DWITH_TBB_MALLOC)
endif()
else()
if(WITH_OPENIMAGEDENOISE)
message(STATUS "TBB disabled, also disabling OpenImageDenoise")
set(WITH_OPENIMAGEDENOISE OFF)
endif()
if(WITH_OPENVDB)
message(STATUS "TBB disabled, also disabling OpenVDB")
set(WITH_OPENVDB OFF)
endif()
endif()
# used in many places so include globally, like OpenGL
blender_include_dirs_sys("${PTHREADS_INCLUDE_DIRS}")
#find signtool
set(ProgramFilesX86_NAME "ProgramFiles(x86)") #env dislikes the ( )
find_program(SIGNTOOL_EXE signtool
HINTS
"$ENV{${ProgramFilesX86_NAME}}/Windows Kits/10/bin/x86/"
"$ENV{ProgramFiles}/Windows Kits/10/bin/x86/"
"$ENV{${ProgramFilesX86_NAME}}/Windows Kits/8.1/bin/x86/"
"$ENV{ProgramFiles}/Windows Kits/8.1/bin/x86/"
"$ENV{${ProgramFilesX86_NAME}}/Windows Kits/8.0/bin/x86/"
"$ENV{ProgramFiles}/Windows Kits/8.0/bin/x86/"
)
set(WINTAB_INC ${LIBDIR}/wintab/include)
if(WITH_OPENAL)
@@ -601,6 +594,10 @@ if(WITH_CODEC_SNDFILE)
set(LIBSNDFILE_LIBRARIES ${LIBSNDFILE_LIBPATH}/libsndfile-1.lib)
endif()
if(WITH_RAYOPTIMIZATION AND SUPPORT_SSE_BUILD)
add_definitions(-D__SSE__ -D__MMX__)
endif()
if(WITH_CYCLES_OSL)
set(CYCLES_OSL ${LIBDIR}/osl CACHE PATH "Path to OpenShadingLanguage installation")

View File

@@ -20,7 +20,7 @@ else
fi
MANIFEST="blender-$VERSION-manifest.txt"
TARBALL="blender-$VERSION.tar.xz"
TARBALL="blender-$VERSION.tar.gz"
cd "$blender_srcdir"
@@ -51,26 +51,19 @@ echo "OK"
# Create the tarball
#
# Without owner/group args, extracting the files as root will
# use ownership from the tar archive.
cd "$blender_srcdir"
echo -n "Creating archive: \"$BASE_DIR/$TARBALL\" ..."
tar \
--transform "s,^,blender-$VERSION/,g" \
--use-compress-program="xz -9" \
tar --transform "s,^,blender-$VERSION/,g" \
--use-compress-program="gzip --best" \
--create \
--file="$BASE_DIR/$TARBALL" \
--files-from="$BASE_DIR/$MANIFEST" \
--owner=0 \
--group=0
--files-from="$BASE_DIR/$MANIFEST"
echo "OK"
# Create checksum file
cd "$BASE_DIR"
echo -n "Creating checksum: \"$BASE_DIR/$TARBALL.md5sum\" ..."
echo -n "Creating checksum: \"$BASE_DIR/$TARBALL.md5sum\" ..."
md5sum "$TARBALL" > "$TARBALL.md5sum"
echo "OK"

View File

@@ -18,7 +18,6 @@ def parse_arguments():
parser.add_argument("--cmake-command", default="cmake")
parser.add_argument("--svn-command", default="svn")
parser.add_argument("--git-command", default="git")
parser.add_argument("--config", default="")
parser.add_argument("build_directory")
return parser.parse_args()
@@ -27,33 +26,23 @@ git_command = args.git_command
svn_command = args.svn_command
ctest_command = args.ctest_command
cmake_command = args.cmake_command
config = args.config
build_dir = args.build_directory
if make_utils.command_missing(ctest_command):
if shutil.which(ctest_command) is None:
sys.stderr.write("ctest not found, can't run tests\n")
sys.exit(1)
if make_utils.command_missing(git_command):
sys.stderr.write("git not found, can't run tests\n")
sys.exit(1)
# Test if we are building a specific release version.
branch = make_utils.git_branch(git_command)
release_version = make_utils.git_branch_release_version(branch)
release_version = make_utils.git_branch_release_version(git_command)
lib_tests_dirpath = os.path.join('..', 'lib', "tests")
if not os.path.exists(lib_tests_dirpath):
print("Tests files not found, downloading...")
if make_utils.command_missing(svn_command):
if shutil.which(svn_command) is None:
sys.stderr.write("svn not found, can't checkout test files\n")
sys.exit(1)
if make_utils.command_missing(cmake_command):
sys.stderr.write("cmake not found, can't checkout test files\n")
sys.exit(1)
svn_url = make_utils.svn_libraries_base_url(release_version) + "/tests"
call([svn_command, "checkout", svn_url, lib_tests_dirpath])
@@ -62,15 +51,5 @@ if not os.path.exists(lib_tests_dirpath):
call([cmake_command, "."])
# Run tests
tests_dir = os.path.join(build_dir, "tests")
os.makedirs(tests_dir, exist_ok=True)
os.chdir(build_dir)
command = [ctest_command, ".", "--output-on-failure"]
if len(config):
command += ["-C", config]
tests_log = "log_" + config + ".txt"
else:
tests_log = "log.txt"
command += ["-O", os.path.join(tests_dir, tests_log)]
call(command)
call([ctest_command, ".", "--output-on-failure"])

View File

@@ -12,33 +12,34 @@ import shutil
import sys
import make_utils
from make_utils import call, check_output
from make_utils import call
# Parse arguments
def parse_arguments():
parser = argparse.ArgumentParser()
parser.add_argument("--only-code", action="store_true")
parser.add_argument("--svn-command", default="svn")
parser.add_argument("--git-command", default="git")
return parser.parse_args()
args = parse_arguments()
only_code = args.only_code
git_command = args.git_command
svn_command = args.svn_command
svn_non_interactive = [args.svn_command, '--non-interactive']
def print_stage(text):
print("")
print(text)
print("")
# Parse arguments
def parse_arguments():
parser = argparse.ArgumentParser()
parser.add_argument("--no-libraries", action="store_true")
parser.add_argument("--no-blender", action="store_true")
parser.add_argument("--no-submodules", action="store_true")
parser.add_argument("--use-tests", action="store_true")
parser.add_argument("--svn-command", default="svn")
parser.add_argument("--git-command", default="git")
parser.add_argument("--use-centos-libraries", action="store_true")
return parser.parse_args()
def get_blender_git_root():
return check_output([args.git_command, "rev-parse", "--show-toplevel"])
# Test if we are building a specific release version.
release_version = make_utils.git_branch_release_version(git_command)
# Setup for precompiled libraries and tests from svn.
def svn_update(args, release_version):
svn_non_interactive = [args.svn_command, '--non-interactive']
lib_dirpath = os.path.join(get_blender_git_root(), '..', 'lib')
if not only_code:
lib_dirpath = os.path.join('..', 'lib')
svn_url = make_utils.svn_libraries_base_url(release_version)
# Checkout precompiled libraries
@@ -48,9 +49,7 @@ def svn_update(args, release_version):
# Windows checkout is usually handled by bat scripts, since the python3 used to run
# this script is bundled as part of the precompiled libraries. However it
# is used by the buildbot.
lib_platform = "win64_vc15"
elif args.use_centos_libraries:
lib_platform = "linux_centos7_x86_64"
lib_platform = "win64_vc14"
else:
# No precompiled libraries for Linux.
lib_platform = None
@@ -61,163 +60,48 @@ def svn_update(args, release_version):
if not os.path.exists(lib_platform_dirpath):
print_stage("Checking out Precompiled Libraries")
if make_utils.command_missing(args.svn_command):
if shutil.which(svn_command) is None:
sys.stderr.write("svn not found, can't checkout libraries\n")
sys.exit(1)
svn_url_platform = svn_url + lib_platform
call(svn_non_interactive + ["checkout", svn_url_platform, lib_platform_dirpath])
if args.use_tests:
lib_tests = "tests"
lib_tests_dirpath = os.path.join(lib_dirpath, lib_tests)
if not os.path.exists(lib_tests_dirpath):
print_stage("Checking out Tests")
if make_utils.command_missing(args.svn_command):
sys.stderr.write("svn not found, can't checkout tests\n")
sys.exit(1)
svn_url_tests = svn_url + lib_tests
call(svn_non_interactive + ["checkout", svn_url_tests, lib_tests_dirpath])
# Update precompiled libraries and tests
print_stage("Updating Precompiled Libraries and Tests")
if os.path.isdir(lib_dirpath):
for dirname in os.listdir(lib_dirpath):
dirpath = os.path.join(lib_dirpath, dirname)
if dirname == ".svn":
# Cleanup must be run from svn root directory if it exists.
if not make_utils.command_missing(args.svn_command):
call(svn_non_interactive + ["cleanup", lib_dirpath])
continue
dirpath = os.path.join(lib_dirpath, dirname)
svn_dirpath = os.path.join(dirpath, ".svn")
svn_root_dirpath = os.path.join(lib_dirpath, ".svn")
if os.path.isdir(dirpath) and \
(os.path.exists(svn_dirpath) or os.path.exists(svn_root_dirpath)):
if make_utils.command_missing(args.svn_command):
if shutil.which(svn_command) is None:
sys.stderr.write("svn not found, can't update libraries\n")
sys.exit(1)
# Cleanup to continue with interrupted downloads.
if os.path.exists(svn_dirpath):
call(svn_non_interactive + ["cleanup", dirpath])
# Switch to appropriate branch and update.
call(svn_non_interactive + ["switch", svn_url + dirname, dirpath], exit_on_error=False)
call(svn_non_interactive + ["cleanup", dirpath])
call(svn_non_interactive + ["switch", svn_url + dirname, dirpath])
call(svn_non_interactive + ["update", dirpath])
# Test if git repo can be updated.
def git_update_skip(args, check_remote_exists=True):
if make_utils.command_missing(args.git_command):
sys.stderr.write("git not found, can't update code\n")
sys.exit(1)
# Update blender repository and submodules.
print_stage("Updating Blender Git Repository and Submodules")
# Abort if a rebase is still in progress.
rebase_merge = check_output([args.git_command, 'rev-parse', '--git-path', 'rebase-merge'], exit_on_error=False)
rebase_apply = check_output([args.git_command, 'rev-parse', '--git-path', 'rebase-apply'], exit_on_error=False)
merge_head = check_output([args.git_command, 'rev-parse', '--git-path', 'MERGE_HEAD'], exit_on_error=False)
if os.path.exists(rebase_merge) or \
os.path.exists(rebase_apply) or \
os.path.exists(merge_head):
return "rebase or merge in progress, complete it first"
if shutil.which(git_command) is None:
sys.stderr.write("git not found, can't update code\n")
sys.exit(1)
# Abort if there are uncommitted changes.
changes = check_output([args.git_command, 'status', '--porcelain', '--untracked-files=no'])
if len(changes) != 0:
return "you have unstaged changes"
call([git_command, "pull", "--rebase"])
call([git_command, "submodule", "update", "--init", "--recursive"])
# Test if there is an upstream branch configured
if check_remote_exists:
branch = check_output([args.git_command, "rev-parse", "--abbrev-ref", "HEAD"])
remote = check_output([args.git_command, "config", "branch." + branch + ".remote"], exit_on_error=False)
if len(remote) == 0:
return "no remote branch to pull from"
return ""
# Update blender repository.
def blender_update(args):
print_stage("Updating Blender Git Repository")
call([args.git_command, "pull", "--rebase"])
# Update submodules.
def submodules_update(args, release_version, branch):
print_stage("Updating Submodules")
if make_utils.command_missing(args.git_command):
sys.stderr.write("git not found, can't update code\n")
sys.exit(1)
# Update submodules to latest master or appropriate release branch.
if not release_version:
branch = "master"
submodules = [
("release/scripts/addons", branch),
("release/scripts/addons_contrib", branch),
("release/datafiles/locale", branch),
("source/tools", branch),
]
# Initialize submodules only if needed.
for submodule_path, submodule_branch in submodules:
if not os.path.exists(os.path.join(submodule_path, ".git")):
call([args.git_command, "submodule", "update", "--init", "--recursive"])
break
# Checkout appropriate branch and pull changes.
skip_msg = ""
for submodule_path, submodule_branch in submodules:
cwd = os.getcwd()
try:
os.chdir(submodule_path)
msg = git_update_skip(args, check_remote_exists=False)
if msg:
skip_msg += submodule_path + " skipped: " + msg + "\n"
else:
if make_utils.git_branch(args.git_command) != submodule_branch:
call([args.git_command, "fetch", "origin"])
call([args.git_command, "checkout", submodule_branch])
call([args.git_command, "pull", "--rebase", "origin", submodule_branch])
finally:
os.chdir(cwd)
return skip_msg
if __name__ == "__main__":
args = parse_arguments()
blender_skip_msg = ""
submodules_skip_msg = ""
# Test if we are building a specific release version.
branch = make_utils.git_branch(args.git_command)
release_version = make_utils.git_branch_release_version(branch)
if not args.no_libraries:
svn_update(args, release_version)
if not args.no_blender:
blender_skip_msg = git_update_skip(args)
if blender_skip_msg:
blender_skip_msg = "Blender repository skipped: " + blender_skip_msg + "\n"
else:
blender_update(args)
if not args.no_submodules:
submodules_skip_msg = submodules_update(args, release_version, branch)
# Report any skipped repositories at the end, so they're not as easy to miss.
skip_msg = blender_skip_msg + submodules_skip_msg
if skip_msg:
print_stage(skip_msg.strip())
# For failed submodule update we throw an error, since not having correct
# submodules can make Blender throw errors.
# For Blender itself we don't and consider "make update" to be a command
# you can use while working on uncommitted code.
if submodules_skip_msg:
sys.exit(1)
if not release_version:
# Update submodules to latest master if not building a specific release.
# In that case submodules are set to a specific revision, which is checked
# out by running "git submodule update".
call([git_command, "submodule", "foreach", "git", "checkout", "master"])
call([git_command, "submodule", "foreach", "git", "pull", "--rebase", "origin", "master"])

View File

@@ -3,11 +3,10 @@
# Utility functions for make update and make tests.
import re
import shutil
import subprocess
import sys
def call(cmd, exit_on_error=True):
def call(cmd):
print(" ".join(cmd))
# Flush to ensure correct output order on Windows.
@@ -15,27 +14,10 @@ def call(cmd, exit_on_error=True):
sys.stderr.flush()
retcode = subprocess.call(cmd)
if exit_on_error and retcode != 0:
sys.exit(retcode)
return retcode
if retcode != 0:
sys.exit(retcode)
def check_output(cmd, exit_on_error=True):
# Flush to ensure correct output order on Windows.
sys.stdout.flush()
sys.stderr.flush()
try:
output = subprocess.check_output(cmd, stderr=subprocess.STDOUT, universal_newlines=True)
except subprocess.CalledProcessError as e:
if exit_on_error:
sys.stderr.write(" ".join(cmd))
sys.stderr.write(e.output + "\n")
sys.exit(e.returncode)
output = ""
return output.strip()
def git_branch(git_command):
def git_branch_release_version(git_command):
# Test if we are building a specific release version.
try:
branch = subprocess.check_output([git_command, "rev-parse", "--abbrev-ref", "HEAD"])
@@ -43,9 +25,7 @@ def git_branch(git_command):
sys.stderr.write("Failed to get Blender git branch\n")
sys.exit(1)
return branch.strip().decode('utf8')
def git_branch_release_version(branch):
branch = branch.strip().decode('utf8')
release_version = re.search("^blender-v(.*)-release$", branch)
if release_version:
release_version = release_version.group(1)
@@ -57,10 +37,3 @@ def svn_libraries_base_url(release_version):
else:
svn_branch = "trunk"
return "https://svn.blender.org/svnroot/bf-blender/" + svn_branch + "/lib/"
def command_missing(command):
# Support running with Python 2 for macOS
if sys.version_info >= (3, 0):
return shutil.which(command) is None
else:
return False

View File

@@ -1,5 +1,6 @@
if "%BUILD_VS_YEAR%"=="2017" set BUILD_VS_LIBDIRPOST=vc15
if "%BUILD_VS_YEAR%"=="2019" set BUILD_VS_LIBDIRPOST=vc15
if "%BUILD_VS_YEAR%"=="2015" set BUILD_VS_LIBDIRPOST=vc14
if "%BUILD_VS_YEAR%"=="2017" set BUILD_VS_LIBDIRPOST=vc14
if "%BUILD_VS_YEAR%"=="2019" set BUILD_VS_LIBDIRPOST=vc14
set BUILD_VS_SVNDIR=win64_%BUILD_VS_LIBDIRPOST%
set BUILD_VS_LIBDIR="%BLENDER_DIR%..\lib\%BUILD_VS_SVNDIR%"
@@ -9,6 +10,7 @@ if NOT "%verbose%" == "" (
)
if NOT EXIST %BUILD_VS_LIBDIR% (
rem libs not found, but svn is on the system
echo
if not "%SVN%"=="" (
echo.
echo The required external libraries in %BUILD_VS_LIBDIR% are missing
@@ -53,8 +55,5 @@ if NOT EXIST %BUILD_VS_LIBDIR% (
echo Error: Required libraries not found at "%BUILD_VS_LIBDIR%"
echo This is needed for building, aborting!
echo.
if "%SVN%"=="" (
echo This is most likely caused by svn.exe not being available.
)
exit /b 1
)

View File

@@ -30,15 +30,9 @@ set LLVM_DIR=
:DetectionComplete
set CC=%LLVM_DIR%\bin\clang-cl
set CXX=%LLVM_DIR%\bin\clang-cl
if "%BUILD_VS_YEAR%" == "2019" (
rem build and tested against 2019 16.2
set CFLAGS=-m64 -fmsc-version=1922
set CXXFLAGS=-m64 -fmsc-version=1922
) else (
rem build and tested against 2017 15.7
set CFLAGS=-m64 -fmsc-version=1914
set CXXFLAGS=-m64 -fmsc-version=1914
)
rem build and tested against 2017 15.7
set CFLAGS=-m64 -fmsc-version=1914
set CXXFLAGS=-m64 -fmsc-version=1914
if "%WITH_ASAN%"=="1" (
set BUILD_CMAKE_ARGS=%BUILD_CMAKE_ARGS% -DWITH_COMPILER_ASAN=On
)

View File

@@ -1,17 +1,15 @@
REM find all dependencies and set the corresponding environment variables.
for %%X in (svn.exe) do (set SVN=%%~$PATH:X)
for %%X in (cmake.exe) do (set CMAKE=%%~$PATH:X)
for %%X in (ctest.exe) do (set CTEST=%%~$PATH:X)
for %%X in (git.exe) do (set GIT=%%~$PATH:X)
set PYTHON=%BLENDER_DIR%\..\lib\win64_vc15\python\37\bin\python.exe
set PYTHON=%BLENDER_DIR%\..\lib\win64_vc14\python\37\bin\python.exe
if NOT "%verbose%" == "" (
echo svn : "%SVN%"
echo cmake : "%CMAKE%"
echo ctest : "%CTEST%"
echo git : "%GIT%"
echo python : "%PYTHON%"
)
if "%CMAKE%" == "" (
echo Cmake not found in path, required for building, exiting...
exit /b 1
)
)

View File

@@ -1,5 +1,5 @@
if EXIST %BLENDER_DIR%\..\lib\win64_vc15\llvm\bin\clang-format.exe (
set CF_PATH=..\lib\win64_vc15\llvm\bin
if EXIST %BLENDER_DIR%\..\lib\win64_vc14\llvm\bin\clang-format.exe (
set CF_PATH=..\lib\win64_vc14\llvm\bin
goto detect_done
)
@@ -10,7 +10,7 @@ exit /b 1
echo found clang-format in %CF_PATH%
if EXIST %PYTHON% (
set PYTHON=%BLENDER_DIR%\..\lib\win64_vc15\python\37\bin\python.exe
set PYTHON=%BLENDER_DIR%\..\lib\win64_vc14\python\37\bin\python.exe
goto detect_python_done
)

View File

@@ -13,7 +13,7 @@ if NOT "%1" == "" (
set BUILD_TYPE=Debug
REM Build Configurations
) else if "%1" == "builddir" (
set BUILD_DIR_OVERRRIDE=%BLENDER_DIR%..\%2
set BUILD_DIR_OVERRRIDE="%BLENDER_DIR%..\%2"
shift /1
) else if "%1" == "with_tests" (
set TESTS_CMAKE_ARGS=%TESTS_CMAKE_ARGS% -DWITH_GTESTS=On
@@ -85,16 +85,13 @@ if NOT "%1" == "" (
set BUILD_UPDATE_ARGS=
) else if "%1" == "code_update" (
SET BUILD_UPDATE=1
set BUILD_UPDATE_ARGS="--no-libraries"
set BUILD_UPDATE_ARGS="--only-code"
) else if "%1" == "ninja" (
SET BUILD_WITH_NINJA=1
) else if "%1" == "clean" (
set MUST_CLEAN=1
) else if "%1" == "verbose" (
set VERBOSE=1
) else if "%1" == "test" (
set TEST=1
set NOBUILD=1
) else if "%1" == "format" (
set FORMAT=1
set FORMAT_ARGS=%2 %3 %4 %5 %6 %7 %8 %9

View File

@@ -29,4 +29,3 @@ set ASAN_CMAKE_ARGS=
set WITH_PYDEBUG=
set PYDEBUG_CMAKE_ARGS=
set FORMAT=
set TEST=

View File

@@ -13,7 +13,6 @@ echo - update ^(Update both SVN and GIT^)
echo - code_update ^(Update only GIT^)
echo - nobuild ^(only generate project files^)
echo - showhash ^(Show git hashes of source tree^)
echo - test ^(Run automated tests with ctest^)
echo - format [path] ^(Format the source using clang-format, path is optional, requires python 3.x to be available^)
echo.
echo Configuration options

View File

@@ -1,13 +0,0 @@
if EXIST %PYTHON% (
goto detect_python_done
)
echo python not found in lib folder
exit /b 1
:detect_python_done
REM Use -B to avoid writing __pycache__ in lib directory and causing update conflicts.
%PYTHON% -B %BLENDER_DIR%\build_files\utils\make_test.py --git-command "%GIT%" --svn-command "%SVN%" --cmake-command="%CMAKE%" --ctest-command="%CTEST%" --config="%BUILD_TYPE%" %BUILD_DIR%
:EOF

View File

@@ -1,4 +1,4 @@
# Doxyfile 1.8.16
# Doxyfile 1.8.15
# This file describes the settings to be used by the documentation system
# doxygen (www.doxygen.org) for a project.
@@ -38,7 +38,7 @@ PROJECT_NAME = Blender
# could be handy for archiving the generated documentation or if some version
# control system is used.
PROJECT_NUMBER = "V2.82"
PROJECT_NUMBER = "V2.81"
# Using the PROJECT_BRIEF tag one can provide an optional one line description
# for a project that appears at the top of each page and should give viewer a
@@ -197,16 +197,6 @@ SHORT_NAMES = NO
JAVADOC_AUTOBRIEF = NO
# If the JAVADOC_BANNER tag is set to YES then doxygen will interpret a line
# such as
# /***************
# as being the beginning of a Javadoc-style comment "banner". If set to NO, the
# Javadoc-style will behave just like regular comments and it will not be
# interpreted by doxygen.
# The default value is: NO.
JAVADOC_BANNER = NO
# If the QT_AUTOBRIEF tag is set to YES then doxygen will interpret the first
# line (until the first dot) of a Qt-style comment as the brief description. If
# set to NO, the Qt-style will behave just like regular Qt-style comments (thus
@@ -339,7 +329,7 @@ MARKDOWN_SUPPORT = YES
# to that level are automatically included in the table of contents, even if
# they do not have an id attribute.
# Note: This feature currently applies only to Markdown headings.
# Minimum value: 0, maximum value: 99, default value: 5.
# Minimum value: 0, maximum value: 99, default value: 0.
# This tag requires that the tag MARKDOWN_SUPPORT is set to YES.
TOC_INCLUDE_HEADINGS = 0
@@ -475,12 +465,6 @@ EXTRACT_ALL = YES
EXTRACT_PRIVATE = NO
# If the EXTRACT_PRIV_VIRTUAL tag is set to YES, documented private virtual
# methods of a class will be included in the documentation.
# The default value is: NO.
EXTRACT_PRIV_VIRTUAL = NO
# If the EXTRACT_PACKAGE tag is set to YES, all members with package or internal
# scope will be included in the documentation.
# The default value is: NO.
@@ -559,7 +543,7 @@ INTERNAL_DOCS = YES
# names in lower-case letters. If set to YES, upper-case letters are also
# allowed. This is useful if you have classes or files whose names only differ
# in case and if your file system supports case sensitive file names. Windows
# (including Cygwin) and Mac users are advised to set this option to NO.
# and Mac users are advised to set this option to NO.
# The default value is: system dependent.
CASE_SENSE_NAMES = YES
@@ -1384,7 +1368,7 @@ QCH_FILE =
# The QHP_NAMESPACE tag specifies the namespace to use when generating Qt Help
# Project output. For more information please see Qt Help Project / Namespace
# (see: https://doc.qt.io/archives/qt-4.8/qthelpproject.html#namespace).
# (see: http://doc.qt.io/archives/qt-4.8/qthelpproject.html#namespace).
# The default value is: org.doxygen.Project.
# This tag requires that the tag GENERATE_QHP is set to YES.
@@ -1392,7 +1376,7 @@ QHP_NAMESPACE = org.doxygen.Project
# The QHP_VIRTUAL_FOLDER tag specifies the namespace to use when generating Qt
# Help Project output. For more information please see Qt Help Project / Virtual
# Folders (see: https://doc.qt.io/archives/qt-4.8/qthelpproject.html#virtual-
# Folders (see: http://doc.qt.io/archives/qt-4.8/qthelpproject.html#virtual-
# folders).
# The default value is: doc.
# This tag requires that the tag GENERATE_QHP is set to YES.
@@ -1401,7 +1385,7 @@ QHP_VIRTUAL_FOLDER = doc
# If the QHP_CUST_FILTER_NAME tag is set, it specifies the name of a custom
# filter to add. For more information please see Qt Help Project / Custom
# Filters (see: https://doc.qt.io/archives/qt-4.8/qthelpproject.html#custom-
# Filters (see: http://doc.qt.io/archives/qt-4.8/qthelpproject.html#custom-
# filters).
# This tag requires that the tag GENERATE_QHP is set to YES.
@@ -1409,7 +1393,7 @@ QHP_CUST_FILTER_NAME =
# The QHP_CUST_FILTER_ATTRS tag specifies the list of the attributes of the
# custom filter to add. For more information please see Qt Help Project / Custom
# Filters (see: https://doc.qt.io/archives/qt-4.8/qthelpproject.html#custom-
# Filters (see: http://doc.qt.io/archives/qt-4.8/qthelpproject.html#custom-
# filters).
# This tag requires that the tag GENERATE_QHP is set to YES.
@@ -1417,7 +1401,7 @@ QHP_CUST_FILTER_ATTRS =
# The QHP_SECT_FILTER_ATTRS tag specifies the list of the attributes this
# project's filter section matches. Qt Help Project / Filter Attributes (see:
# https://doc.qt.io/archives/qt-4.8/qthelpproject.html#filter-attributes).
# http://doc.qt.io/archives/qt-4.8/qthelpproject.html#filter-attributes).
# This tag requires that the tag GENERATE_QHP is set to YES.
QHP_SECT_FILTER_ATTRS =
@@ -1696,11 +1680,10 @@ LATEX_CMD_NAME = latex
MAKEINDEX_CMD_NAME = makeindex
# The LATEX_MAKEINDEX_CMD tag can be used to specify the command name to
# generate index for LaTeX. In case there is no backslash (\) as first character
# it will be automatically added in the LaTeX code.
# generate index for LaTeX.
# Note: This tag is used in the generated output file (.tex).
# See also: MAKEINDEX_CMD_NAME for the part in the Makefile / make.bat.
# The default value is: makeindex.
# The default value is: \makeindex.
# This tag requires that the tag GENERATE_LATEX is set to YES.
LATEX_MAKEINDEX_CMD = \makeindex
@@ -2193,6 +2176,12 @@ EXTERNAL_GROUPS = YES
EXTERNAL_PAGES = YES
# The PERL_PATH should be the absolute path and name of the perl script
# interpreter (i.e. the result of 'which perl').
# The default file (with absolute path) is: /usr/bin/perl.
PERL_PATH = /usr/bin/perl
#---------------------------------------------------------------------------
# Configuration options related to the dot tool
#---------------------------------------------------------------------------
@@ -2206,6 +2195,15 @@ EXTERNAL_PAGES = YES
CLASS_DIAGRAMS = NO
# You can define message sequence charts within doxygen comments using the \msc
# command. Doxygen will then run the mscgen tool (see:
# http://www.mcternan.me.uk/mscgen/)) to produce the chart and insert it in the
# documentation. The MSCGEN_PATH tag allows you to specify the directory where
# the mscgen tool resides. If left empty the tool is assumed to be found in the
# default search path.
MSCGEN_PATH =
# You can include diagrams made with dia in doxygen documentation. Doxygen will
# then run dia to produce the diagram and insert it in the documentation. The
# DIA_PATH tag allows you to specify the directory where the dia binary resides.

View File

@@ -30,11 +30,11 @@ and <output-filename> is where to write the generated man page.
# <pep8 compliant>
import os
import subprocess
import sys
import time
import datetime
def man_format(data):
@@ -52,18 +52,11 @@ outfilename = sys.argv[2]
cmd = [blender_bin, "--help"]
print(" executing:", " ".join(cmd))
blender_help = subprocess.run(
cmd, env={"ASAN_OPTIONS": "exitcode=0"}, check=True, stdout=subprocess.PIPE).stdout.decode(encoding="utf-8")
blender_version = subprocess.run(
[blender_bin, "--version"], env={"ASAN_OPTIONS": "exitcode=0"}, check=True, stdout=subprocess.PIPE).stdout.decode(encoding="utf-8").strip()
blender_version, blender_date = (blender_version.split("build") + [None, None])[0:2]
blender_version = blender_version.rstrip().partition(" ")[2] # remove 'Blender' prefix.
if blender_date is None:
# Happens when built without WITH_BUILD_INFO e.g.
date_string = time.strftime("%B %d, %Y", time.gmtime(int(os.environ.get('SOURCE_DATE_EPOCH', time.time()))))
else:
blender_date = blender_date.strip().partition(" ")[2] # remove 'date:' prefix
date_string = time.strftime("%B %d, %Y", time.strptime(blender_date, "%Y-%m-%d"))
blender_help = subprocess.check_output(cmd).decode(encoding="utf-8")
blender_version = subprocess.check_output([blender_bin, "--version"]).decode(encoding="utf-8").strip()
blender_version = blender_version.split("build")[0].rstrip()
blender_version = blender_version.partition(" ")[2] # remove 'Blender' prefix.
date_string = datetime.date.fromtimestamp(time.time()).strftime("%B %d, %Y")
outfile = open(outfilename, "w")
fw = outfile.write
@@ -87,7 +80,7 @@ is a full-featured 3D application. It supports the entirety of the 3D pipeline -
Use Blender to create 3D images and animations, films and commercials, content for games, architectural and industrial visualizations, and scientific visualizations.
https://www.blender.org''')
http://www.blender.org''')
fw('''
.SH OPTIONS''')
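One side of this diff derives the man page date from SOURCE_DATE_EPOCH (falling back to the current time) when Blender was built without WITH_BUILD_INFO. That fallback, isolated as a small sketch for clarity:

import os
import time

def build_date_string():
    # Prefer SOURCE_DATE_EPOCH so repeated builds emit an identical man page.
    epoch = int(os.environ.get('SOURCE_DATE_EPOCH', time.time()))
    return time.strftime("%B %d, %Y", time.gmtime(epoch))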

View File

@@ -4,9 +4,7 @@ Run a Function in x Seconds
"""
import bpy
def in_5_seconds():
print("Hello World")
bpy.app.timers.register(in_5_seconds, first_interval=5)
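If the callback needs to be cancelled before it fires, the timers API also provides is_registered and unregister; a minimal sketch, assuming in_5_seconds has not run yet:

import bpy

# Cancel the pending timer before its first_interval elapses.
if bpy.app.timers.is_registered(in_5_seconds):
    bpy.app.timers.unregister(in_5_seconds)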

View File

@@ -4,10 +4,8 @@ Run a Function every x Seconds
"""
import bpy
def every_2_seconds():
print("Hello World")
return 2.0
bpy.app.timers.register(every_2_seconds)

View File

@@ -6,7 +6,6 @@ import bpy
counter = 0
def run_10_times():
global counter
counter += 1
@@ -15,5 +14,4 @@ def run_10_times():
return None
return 0.1
bpy.app.timers.register(run_10_times)

View File

@@ -5,10 +5,8 @@ Assign parameters to functions
import bpy
import functools
def print_message(message):
print("Message:", message)
bpy.app.timers.register(functools.partial(print_message, "Hello"), first_interval=2.0)
bpy.app.timers.register(functools.partial(print_message, "World"), first_interval=3.0)
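A closure with a default argument would work just as well here; functools.partial is simply the more explicit choice. A hypothetical equivalent:

import bpy

# Same behaviour as registering print_message via partial: the callable
# returns None, so it runs once and is then unregistered.
bpy.app.timers.register(lambda msg="Hello": print("Message:", msg), first_interval=2.0)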

View File

@@ -35,7 +35,7 @@ class OBJECT_OT_evaluated_example(bpy.types.Operator):
# modifiers.
#
# For mesh objects the object.data will be a mesh with all modifiers applied.
# This means that access to vertices or faces after the modifier stack happens via fields of
# object_eval.object.
#
# For other types of objects the object_eval.data does not have modifiers applied on it,
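A minimal sketch of the pattern described in the comments above, assuming the active object is a mesh with modifiers:

import bpy

depsgraph = bpy.context.evaluated_depsgraph_get()
object_eval = bpy.context.object.evaluated_get(depsgraph)
mesh_eval = object_eval.data  # for mesh objects, modifiers are already applied
print("Evaluated vertex count:", len(mesh_eval.vertices))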

View File

@@ -29,7 +29,7 @@ class OBJECT_OT_simple_exporter(bpy.types.Operator):
# Happens for non-geometry objects.
continue
print(f"Exporting mesh with {len(mesh.vertices)} vertices "
f"at {object_instance.matrix_world}")
f"at {object_instance.matrix_world}")
object_instance.to_mesh_clear()
return {'FINISHED'}
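For context, a hypothetical self-contained variant of the loop this exporter fragment belongs to, restricted to mesh objects to keep the sketch simple:

import bpy

depsgraph = bpy.context.evaluated_depsgraph_get()
for object_instance in depsgraph.object_instances:
    obj = object_instance.object
    if obj.type != 'MESH':
        continue  # skip non-geometry objects
    mesh = obj.to_mesh()
    print(f"Exporting mesh with {len(mesh.vertices)} vertices "
          f"at {object_instance.matrix_world}")
    obj.to_mesh_clear()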

View File

@@ -101,7 +101,7 @@ class CustomRenderEngine(bpy.types.RenderEngine):
# Bind shader that converts from scene linear to display space,
bgl.glEnable(bgl.GL_BLEND)
bgl.glBlendFunc(bgl.GL_ONE, bgl.GL_ONE_MINUS_SRC_ALPHA)
bgl.glBlendFunc(bgl.GL_ONE, bgl.GL_ONE_MINUS_SRC_ALPHA);
self.bind_display_space_shader(scene)
if not self.draw_data or self.draw_data.dimensions != dimensions:
@@ -135,18 +135,18 @@ class CustomDrawData:
# Bind shader that converts from scene linear to display space,
# use the scene's color management settings.
shader_program = bgl.Buffer(bgl.GL_INT, 1)
bgl.glGetIntegerv(bgl.GL_CURRENT_PROGRAM, shader_program)
bgl.glGetIntegerv(bgl.GL_CURRENT_PROGRAM, shader_program);
# Generate vertex array
self.vertex_array = bgl.Buffer(bgl.GL_INT, 1)
bgl.glGenVertexArrays(1, self.vertex_array)
bgl.glBindVertexArray(self.vertex_array[0])
texturecoord_location = bgl.glGetAttribLocation(shader_program[0], "texCoord")
position_location = bgl.glGetAttribLocation(shader_program[0], "pos")
texturecoord_location = bgl.glGetAttribLocation(shader_program[0], "texCoord");
position_location = bgl.glGetAttribLocation(shader_program[0], "pos");
bgl.glEnableVertexAttribArray(texturecoord_location)
bgl.glEnableVertexAttribArray(position_location)
bgl.glEnableVertexAttribArray(texturecoord_location);
bgl.glEnableVertexAttribArray(position_location);
# Generate geometry buffers for drawing textured quad
position = [0.0, 0.0, width, 0.0, width, height, 0.0, height]
@@ -178,7 +178,7 @@ class CustomDrawData:
bgl.glActiveTexture(bgl.GL_TEXTURE0)
bgl.glBindTexture(bgl.GL_TEXTURE_2D, self.texture[0])
bgl.glBindVertexArray(self.vertex_array[0])
bgl.glDrawArrays(bgl.GL_TRIANGLE_FAN, 0, 4)
bgl.glDrawArrays(bgl.GL_TRIANGLE_FAN, 0, 4);
bgl.glBindVertexArray(0)
bgl.glBindTexture(bgl.GL_TEXTURE_2D, 0)
@@ -201,7 +201,6 @@ def get_panels():
return panels
def register():
# Register the RenderEngine
bpy.utils.register_class(CustomRenderEngine)
@@ -209,7 +208,6 @@ def register():
for panel in get_panels():
panel.COMPAT_ENGINES.add('CUSTOM')
def unregister():
bpy.utils.unregister_class(CustomRenderEngine)

View File

@@ -210,3 +210,4 @@ def unregister():
if __name__ == "__main__":
register()

View File

@@ -301,7 +301,7 @@ Advantages include:
This is marked advanced because to run Blender as a Python module requires a special build option.
For instructions on building see
`Building Blender as a Python module <https://wiki.blender.org/wiki/Building_Blender/Other/BlenderAsPyModule>`_
`Building Blender as a Python module <https://wiki.blender.org/index.php/User:Ideasman42/BlenderAsPyModule>`_
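A hypothetical sketch of what such a build enables, assuming bpy was compiled as described in the linked page and is importable from a regular Python interpreter:

import bpy

# Start from an empty factory scene, add a cube, and save a .blend file,
# all without launching the Blender executable.
bpy.ops.wm.read_factory_settings(use_empty=True)
bpy.ops.mesh.primitive_cube_add()
bpy.ops.wm.save_as_mainfile(filepath="/tmp/example.blend")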
Python Safety (Build Option)

View File

@@ -138,11 +138,10 @@ def main():
"BMO_OP_SLOT_SUBTYPE_MAP_EMPTY",
"BMO_OP_SLOT_SUBTYPE_MAP_INTERNAL",
"BMO_OP_SLOT_SUBTYPE_PTR_BMESH",
"BMO_OP_SLOT_SUBTYPE_PTR_SCENE",
"BMO_OP_SLOT_SUBTYPE_PTR_OBJECT",
"BMO_OP_SLOT_SUBTYPE_PTR_MESH",
"BMO_OP_SLOT_SUBTYPE_PTR_STRUCT",
"BMO_OP_SLOT_SUBTYPE_PTR_BMESH",
"BMO_OP_SLOT_SUBTYPE_INT_ENUM",
"BMO_OP_SLOT_SUBTYPE_INT_FLAG",
@@ -346,10 +345,6 @@ def main():
tp_str = ":class:`bpy.types.Object`"
elif tp_sub == BMO_OP_SLOT_SUBTYPE_PTR_MESH:
tp_str = ":class:`bpy.types.Mesh`"
elif tp_sub == BMO_OP_SLOT_SUBTYPE_PTR_STRUCT:
# XXX Used for CurveProfile only currently I think (bevel code),
# but I think the idea is that the pointer is for any type?
tp_str = ":class:`bpy.types.bpy_struct`"
else:
print("Can't find", vars_dict_reverse[tp_sub])
assert(0)

Some files were not shown because too many files have changed in this diff.