Compare commits

...

217 Commits

Author SHA1 Message Date
d940735453 Mark version 1.25 as released today 2022-02-25 15:45:55 +01:00
7d71067b3d Use BAT version 1.11 for UDIM support 2022-02-25 15:45:55 +01:00
b0b804410d Bumped version to 1.25 2022-02-25 15:45:55 +01:00
d55f2dcee1 Compatibility with Blender 3.1 / Python 3.10
Blender 3.1 will be shipped with Python 3.10, which made some backward-
incompatible changes in its asyncio module (the removal of the `loop`
parameter from various functions).

Depending on which Python version is used in Blender, the add-on now
passes (or not) the `loop` parameter, retaining backward compatibility.
2022-02-25 15:45:52 +01:00
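
For illustration, a minimal sketch of that version-dependent call (it mirrors the approach visible in the async_loop.py diff further down; the helper name is illustrative):

```python
import asyncio
import sys

def make_pillar_semaphore(loop: asyncio.AbstractEventLoop) -> asyncio.Semaphore:
    # Python 3.8 deprecated the 'loop' parameter and 3.10 removed it, so only
    # pass it on older interpreters (Blender releases before 3.1).
    kwargs = {"loop": loop} if sys.version_info < (3, 8) else {}
    return asyncio.Semaphore(3, **kwargs)
```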
2fbb5ac788 Bumped version to 1.24 2022-02-04 11:01:55 +01:00
b47b407589 Update CHANGELOG.md 2022-02-04 10:58:55 +01:00
a136366804 Upgrade to BAT 1.10
Upgrade BAT to fix a bug where blend files were doubly compressed.

This also changes the way the wheel files are loaded, as the old
alphabetical ordering wouldn't pick up BAT 1.10. It now uses the
modification time of the wheel files to find the latest one.
2022-02-04 10:57:45 +01:00
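
A minimal sketch of mtime-based wheel selection (the helper name and glob pattern are assumptions, not the add-on's actual code):

```python
from pathlib import Path

def newest_wheel(wheels_dir: Path, dist_name: str) -> Path:
    # Pick the most recently modified wheel for a distribution, instead of
    # relying on the alphabetical ordering of file names.
    wheels = sorted(wheels_dir.glob(f"{dist_name}-*.whl"),
                    key=lambda whl: whl.stat().st_mtime)
    if not wheels:
        raise FileNotFoundError(f"no wheel for {dist_name!r} in {wheels_dir}")
    return wheels[-1]
```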
6718e1646f Attract: prevent rare error in ATTRACT_OT_open_meta_blendfile 2021-11-19 15:46:19 +01:00
9d7f9a979e Bumped version to 1.23 2021-11-09 11:25:58 +01:00
326a793de0 Bump BAT 1.7 → 1.8
Bump BAT version to allow sending read-only files to Flamenco.
2021-11-09 11:25:44 +01:00
88ccb0f376 Bumped version to 1.22 2021-11-05 16:33:07 +01:00
5b8895278a Mark version 1.22 as released today 2021-11-05 16:33:00 +01:00
eb37d20039 Bump blender-asset-tracer 1.6 → 1.7
BAT v1.7 adds support for zstandard-compressed files, which are written
by Blender 3.0.
2021-11-05 16:32:32 +01:00
4f49e8ca0b Cleanup: remove some unused imports 2021-11-05 16:25:27 +01:00
c931700fec Cleanup: formatting with Black
No functional changes.
2021-07-29 19:34:11 +02:00
6285826bfc Fix Windows incompatibility when using Shaman URLs as job storage path
The Shaman URL check was done on the wrong string, which went unnoticed
on Linux because a URL is a valid file path there. On Windows this is
not the case, which caused problems. This is now fixed.
2021-07-29 19:33:36 +02:00
25150397c0 Bumped version to 1.21 2021-07-27 17:12:18 +02:00
c67b161e3d Bump blender-asset-tracer version 1.5.1 → 1.6
BAT 1.6 has better compatibility with Geometry Nodes.
2021-07-27 17:12:18 +02:00
f76dcb964e Bumped version to 1.20 2021-07-22 17:16:00 +02:00
2d868ec724 Disable Strict Pointer Mode in Blender Asset Tracer
Disable BAT's Strict Pointer Mode to work around issues with dangling
pointers in the Blender Animation Studio files. These seem to be caused
by not-perfectly-resynced library overrides. Ignoring those pointers
seems to cause fewer problems than crashing on them.
2021-07-22 16:44:53 +02:00
666ae0fa90 Bump blender-asset-tracer version 1.3.1 → 1.5.1
Bump BAT version to have it tested on the currently used Python version
(3.9) and to have the ability to disable Strict Pointer Mode.
2021-07-22 16:44:47 +02:00
49844e17b2 Bumped version to 1.19 2021-02-23 11:58:09 +01:00
06432a3534 Mark 1.19 as released in CHANGELOG.md 2021-02-23 11:58:03 +01:00
3a2e9bc672 Simplify @pyside_cache decorator
This fixes a compatibility issue with Python 3.9+, and at the same time
avoids a not-yet-quite-stable area of Blender's Python API.
2021-02-23 11:57:29 +01:00
ce331c7b22 Mark 1.18 as released 2021-02-16 11:58:05 +01:00
8b5dc65d84 Bumped version to 1.18 2021-02-16 11:58:05 +01:00
3bc7dcfa9e Update update_script.sh for new formatting with Black 2021-02-16 11:58:05 +01:00
d9fe24ece7 Cleanup: reformat setup.py with Black
No functional changes.
2021-02-16 11:58:02 +01:00
dd00bc9cb5 Don't save preferences when exiting with "Send & Quit" button
The "Save & Quit" button disables the exit confirmation box, and that
change shouldn't be auto-saved.
2021-02-16 11:48:43 +01:00
14778e5c08 Remove code to support Blender 2.79 and older 2021-02-16 11:48:43 +01:00
8b49c5505e Reformat with Black
No functional changes.
2021-02-16 11:48:43 +01:00
883f125722 Compatibility with Blender 2.93 / Python 3.9 → require Blender 2.80+
The code now requires Python 3.7 or newer, as a side effect of the changes
required for compatibility with Python 3.9 (as used in Blender 2.93). As a
result, the Blender Cloud Add-on now requires Blender 2.80 or newer.
2021-02-16 11:48:43 +01:00
2fbe7e1258 Slightly more compressed changelog
No functional changes.
2021-02-16 10:28:47 +01:00
405b823c81 Bumped version to 1.17 2021-02-04 12:04:46 +01:00
9e952035d3 Upgrade BAT 1.2.1 → 1.3.1
Upgrade BAT to version 1.3.1, which brings compatibility with Geometry
Nodes and fixes some issues on Windows.
2021-02-04 12:04:26 +01:00
d77acfb9c8 Reduce logging noise
- No longer list Attract's RNA classes; these haven't changed in a long
  time and listing them is not interesting.
- Reduced log level when updating internal state. The result of the update
  is already logged at INFO level.
2020-11-12 12:31:17 +01:00
70de9741df Bumped version to 1.16 2020-03-03 10:39:11 +01:00
cc37e73bc6 Fix T74211: Windows compatibility with Shaman URL handling 2020-03-03 10:38:53 +01:00
e32e75e3db Bumped version to 1.15 and marked as released in CHANGELOG 2019-12-12 10:42:08 +01:00
6fa5ab5481 Removed trailing period from property description
No functional changes.
2019-12-12 10:40:58 +01:00
379580de86 Don't create BAT pack when rendering file in job storage directory
When the to-be-rendered blend file is contained in the job storage
directory, it is now assumed that all files are already reachable by the
Flamenco Workers. This supports environments working directly on shared
storage.

This assumes that the paths are already correct for the Flamenco
Workers. No detection of missing files is done (as BAT doesn't run).
2019-10-25 13:34:34 +02:00
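
A sketch of how such a containment check could look (hypothetical helper, assuming pathlib paths):

```python
from pathlib import Path

def inside_job_storage(blendfile: Path, job_storage: Path) -> bool:
    # When the blend file already lives inside the job storage directory,
    # assume the Flamenco Workers can reach every asset and skip BAT packing.
    try:
        blendfile.resolve().relative_to(job_storage.resolve())
    except ValueError:
        return False
    return True
```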
db30b3df76 Bumped version to 1.14 2019-10-10 10:39:37 +02:00
5de99baaef Updated changelog 2019-10-10 10:39:28 +02:00
2184b39d27 Bump Blender Asset Tracer (BAT) version from 1.1.1 → 1.2.1 2019-10-10 10:29:53 +02:00
23b1f7de7d Convert property definitions from assignment to annotations on Blender 2.80+
The properties are still declared in the Python 3.5-compatible assignment
notation; a class decorator converts those to class annotations, as
preferred by Blender 2.80.
2019-10-10 10:29:36 +02:00
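
A rough sketch of such a decorator, assuming the pre-registration form of `bpy.props` definitions (a `(function, keywords)` tuple); this is illustrative, not the add-on's exact implementation:

```python
import bpy

def convert_properties(cls):
    # On Blender 2.80+, move property *assignments* into class *annotations*;
    # on 2.79 the assignment form is kept as-is.
    if bpy.app.version < (2, 80):
        return cls
    annotations = cls.__dict__.get("__annotations__", {})
    for name, value in list(vars(cls).items()):
        # Before registration, bpy.props.XxxProperty() is a (function, keywords) tuple.
        if isinstance(value, tuple) and len(value) == 2 and callable(value[0]):
            annotations[name] = value
            delattr(cls, name)
    cls.__annotations__ = annotations
    return cls
```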
28f68c6fbf update_version.sh: Use Python 3 in example command
This makes it possible to run the command outside of a Python 3 virtualenv.
2019-06-21 14:31:54 +02:00
b00cb233cc Bumped version to 1.13.5 2019-06-21 14:30:03 +02:00
2142e9e7fc Attract fix for Blender 2.80 panel change
Commit 1e7c3a159fd2ca42fd5688be067008ef0d2c03df removed the 'Info' panel
(which is good), so we have to attach the metadata subpanel somewhere else.
2019-06-21 14:29:49 +02:00
1dea802932 Attract doesn't have to be active to use ATTRACT_OT_open_meta_blendfile
It is pretty much independent of Attract.
2019-06-21 14:29:07 +02:00
077bd1abdb Prevent KeyError when Flamenco Manager settings are unknown 2019-06-12 11:47:16 +02:00
5a2c528681 Run Pip via {sys.executable} -m pip
This solves the same problem as c457767edf814f92e1da8cb9d08fa52404ea074c,
but in a way that's actually [recommended](https://pip.pypa.io/en/latest/user_guide/#using-pip-from-your-program).
2019-06-04 12:40:02 +02:00
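
For reference, the recommended invocation looks like this (a generic example, not the add-on's setup.py verbatim):

```python
import subprocess
import sys

def pip_install(*packages: str) -> None:
    # Run pip through the interpreter that is currently executing, so the
    # right Python version is used regardless of what 'pip' on $PATH means.
    subprocess.check_call([sys.executable, "-m", "pip", "install", *packages])
```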
53b12376d1 Revert "Use Python module to run Pip"
This reverts commit c457767edf814f92e1da8cb9d08fa52404ea074c. Modern pip
can no longer be used this way ('pip.main' does not exist).
2019-06-04 12:35:46 +02:00
8495868ea6 Bumped version to 1.13.4 2019-06-04 12:29:50 +02:00
cf810de41b Another Blender 2.8 compatibility fix 2019-06-04 12:29:37 +02:00
c457767edf Use Python module to run Pip
setup.py used system calls to run pip for package management. This call is
platform dependent: on Ubuntu distros it needs to be pip3, because there
pip points to the Python 2 version.

By calling the pip module directly from within the running Python process
we know for sure we are triggering the correct one.

Differential revision: https://developer.blender.org/D4952/

Reviewed by: sybren
2019-05-29 10:29:14 +02:00
985b3f6a7d Attract: draw strip metadata as its own panel
The panel is a subpanel in Blender 2.80, and a top-level panel in 2.79.
2019-05-24 14:12:36 +02:00
a45bf3cd5c Bumped version to 1.13.3 2019-05-21 10:19:49 +02:00
3789742cc8 Fixed little bug
Missed a function call in a69f4d3fd91958e2fdbc94e661bae10ba1d7f139.
2019-05-21 10:19:34 +02:00
58f374e175 Bumped version to 1.13.2 2019-05-17 11:26:40 +02:00
99e90e1008 Mark version 1.13 as released 2019-05-17 11:26:29 +02:00
dd83d3ee60 Blender 2.80 compatibility for Attract panel in sequence editor 2019-05-17 11:15:34 +02:00
e74e014c66 Quick fix for Blender 2.80 texture loading
The `Image.gl_load()` call was changed in Blender commit
7ad802cf3ae500bc72863b6dba0f28a488fce3d1; the two parameters we were using
were removed.

This commit fixes the exception and makes the texture browser usable again,
but doesn't properly fix everything. The textures are drawn in the wrong
colour space, which will be fixed in another commit once I know how.
2019-05-17 11:09:57 +02:00
01541f181e Bumped Pillar Python SDK 1.7.0 → 1.8.0 2019-05-14 11:05:51 +02:00
a69f4d3fd9 Flamenco: Moved some code around, no semantic changes 2019-05-10 12:29:39 +02:00
3ffea46f23 Bumped version to 1.13.1 2019-04-18 12:58:49 +02:00
94c5811e42 Typo 2019-04-18 12:58:34 +02:00
676ad1ed14 Removed unused import 2019-04-18 12:46:42 +02:00
79e6fa37f4 Bumped version to 1.13.0 2019-04-18 12:10:30 +02:00
e06fa3ea75 Flamenco: Support for Flamenco Manager settings version 2
When using Blender Cloud Add-on 1.12 or older, Flamenco Server will
automatically convert the Manager settings to version 1. As a result,
upgrading is recommended but not required to keep working with a newer
Flamenco Server.
2019-04-18 12:09:54 +02:00
fb6352dc7d Upgraded BAT to 1.1.1 for a compatibility fix with Blender 2.79 2019-04-18 12:06:43 +02:00
97ad8bf5ba Flamenco: sort path replacement vars by replacement, not by variable name
The longer paths need to be replaced first, not the longer variable names.
2019-04-18 11:07:36 +02:00
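
A small sketch of that sort order (hypothetical variable mapping; the point is sorting on the replacement value, longest first):

```python
def replacement_order(path_vars: dict) -> list:
    # Sort on the length of the *path* being replaced, longest first, so that
    # "/shared/projects/spring" is matched before "/shared/projects".
    # e.g. replacement_order({"proj": "/shared/projects", "spring": "/shared/projects/spring"})
    return sorted(path_vars.items(), key=lambda item: len(item[1]), reverse=True)
```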
b0f7719add Fix pyrna_enum_to_py: current value matches no enum warnings 2019-03-26 12:36:13 +01:00
dada275e32 Bumped version to 1.12.1 2019-03-26 11:32:10 +01:00
6bce1ccf90 Bumped BAT requirement to 1.1 2019-03-25 17:48:28 +01:00
bbe524c099 Updated CHANGELOG 2019-03-25 17:44:56 +01:00
462da038ec Fixed Blender 2.79 incompatibility 2019-03-20 13:58:56 +01:00
8d7799655e Bumped BAT to 1.1.dev2 2019-03-20 13:58:47 +01:00
cb0393868e Flamenco: get JWT token from Flamenco Server when sending files to Shaman 2019-03-13 15:09:24 +01:00
5a61a7a6c4 Use exponential backoff in uncached_session 2019-03-13 15:08:56 +01:00
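A sketch of what exponential backoff on a requests session can look like (using urllib3's Retry; the function name matches the commit, but the parameters are assumptions):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def uncached_session(total_retries: int = 5) -> requests.Session:
    # backoff_factor=0.2 waits roughly 0.2s, 0.4s, 0.8s, ... between retries.
    retries = Retry(total=total_retries, backoff_factor=0.2,
                    status_forcelist=(500, 502, 503, 504))
    session = requests.Session()
    session.mount("https://", HTTPAdapter(max_retries=retries))
    session.mount("http://", HTTPAdapter(max_retries=retries))
    return session
```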
60d1fbff50 Blender changed use_quit_dialog into use_save_prompt 2019-03-13 10:07:23 +01:00
352fe239f2 Flamenco: Use DNA enum value for format setting
See https://developer.blender.org/D4502 and https://developer.blender.org/rF032423271d0417aed3b6053adb8b6db2774b0d36
for more info.
2019-03-12 15:27:27 +01:00
09c1bf67b4 Bumped BAT to 1.1-dev1 2019-03-06 13:41:49 +01:00
23235afe71 Updated CHANGELOG 2019-03-06 13:32:38 +01:00
ff9624d4f3 Blender Video Chunks: also allow .mp4 and .mov as container format 2019-03-06 13:31:30 +01:00
48c60f73d7 Bundle with BAT 1.1-dev0 for Shaman support
See https://gitlab.com/blender-institute/shaman for more info.
2019-03-01 14:37:44 +01:00
12eaaa5bae Set min job priority to 1
Previously the minimum was 0, but the server only accepts 1 and up.
2019-03-01 14:36:41 +01:00
f7396350db Add support for Shaman servers
See https://gitlab.com/blender-institute/shaman for more info
2019-02-28 12:53:29 +01:00
cc97288018 Create job first, then send files
This requires Flamenco Server 2.2 or newer.
2019-02-28 12:52:51 +01:00
26105add9c Updated BAT to 0.99 2019-02-26 16:48:39 +01:00
ea81cc5769 Flamenco: Name render jobs just 'thefile' instead of 'Render thefile.flamenco.blend'
This makes the job list on Flamenco Server cleaner.
2019-02-13 15:18:33 +01:00
25b6053836 Allow project selection, even when the current project is ''. 2019-02-13 14:29:36 +01:00
65a05403dc Bumped BAT to 0.9 2019-02-12 12:33:31 +01:00
770b0121fa Flamenco: Different label for 'frame chunk' depending on render job type
The frame chunk size has a slightly different meaning when rendering
progressively (Flamenco Server can choose to chunk more frames together
when rendering a low number of samples).
2019-02-06 09:32:24 +01:00
2b155eac45 Flamenco: show a warning when the frame dimensions are not divisible by 2
Any 'Create Video' Flamenco task that's part of the job will pad the video
with black pixels to make the dimensions even, and this warning notifies
the artist about this.
2019-02-04 11:39:14 +01:00
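
The underlying check is simple; a sketch against the Blender Python API (the helper name is illustrative):

```python
import bpy

def has_odd_dimensions(scene: bpy.types.Scene) -> bool:
    # The warning applies when the effective output resolution is not
    # divisible by 2, which video encoders typically require.
    render = scene.render
    width = int(render.resolution_x * render.resolution_percentage / 100)
    height = int(render.resolution_y * render.resolution_percentage / 100)
    return width % 2 != 0 or height % 2 != 0
```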
d36959e91b Flamenco: Fixed tiny layout bug 2019-02-04 11:37:04 +01:00
9028c38c68 Fixed "You are not logged in" message 2019-02-01 17:20:01 +01:00
f04f07eaf1 Bumped version to 1.12.0 2019-01-31 14:43:08 +01:00
6c38a432bc Flamenco: Added a hidden "Submit & Quit" button.
This button can be enabled in the add-on preferences and will then be
available on the Flamenco Render panel. Pressing the button will
silently close Blender after the job has been submitted to Flamenco (for
example to click, walk away, and free up memory for when the same
machine is part of the render farm).
2019-01-31 14:42:50 +01:00
53fa3e628a Flamenco: disable Cycles denoiser when progressive rendering
The denoiser data cannot be (easily) merged, so for now we just disable
the denoiser.
2019-01-30 16:10:09 +01:00
924fb45cb2 Flamenco: disallow progressive rendering unless Cycles is used 2019-01-30 16:06:39 +01:00
b5619757bc Flamenco: disallow progressive rendering on Blender < 2.80
Rendering ranges of sample chunks only works reliably for us after
Blender commit 7744203b7fde35a074faf232dda3595b78c5f14c (Tue Jan 29
18:08:12 2019 +0100).
2019-01-30 16:06:39 +01:00
ae41745743 Flamenco: easy button for setting max sample count for progressive rendering 2019-01-30 16:06:39 +01:00
ffab83f921 Flamenco: no longer use the word 'chunks' in the UI
It's a confusing word; 'Frames per Task' is clearer.
2019-01-30 16:06:39 +01:00
8bef2f48a5 Flamenco: Move job-type-specific options to a box below job type selector
This should make the relation between the job type and its options clearer.
2019-01-30 16:04:43 +01:00
74b46ff0db Flamenco: Progressive Rendering max sample count instead of chunk count
Flamenco Server changed from expecting a fixed number of sample chunks to
a compile-time determined number of nonuniform chunks. The artist can now
influence the size of each render task by setting a maximum number of
samples per render task.
2019-01-30 16:04:43 +01:00
e1934b20d9 Flamenco: nicer error reporting when creating a job fails 2019-01-30 13:05:09 +01:00
0caf761863 Prevent error when running Blender in background mode
We shouldn't call any `gpu` functions in background mode. Since the texture
browser will never run when Blender is in background mode anyway, we can
simply assign `None` instead.
2019-01-04 16:25:50 +01:00
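
A minimal sketch of the guard (the helper doing the GPU work is hypothetical):

```python
import bpy

def load_preview_texture():
    if bpy.app.background:
        # No GUI in background mode, so the texture browser never draws;
        # skip all gpu/OpenGL work and simply store None instead.
        return None
    return create_gl_texture()  # hypothetical helper doing the gpu calls
```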
bc864737ae Bumped version to 1.11.1 2019-01-04 13:42:12 +01:00
f454a99c4b Bundled missing Texture Browser icons in setup.py 2019-01-04 13:42:04 +01:00
40732e0487 Updated changelog 2019-01-04 11:13:32 +01:00
b86bffbdbb Bumped version to 1.11.0 2019-01-04 11:12:36 +01:00
67f9d40fd3 Blender Sync: fixed missing icon in Blender 2.80
I like the 'DOTSDOWN' icon better, so I keep using it in Blender ≤ 2.79.
2019-01-04 11:09:20 +01:00
c4de4e9990 Fixed some MyPy warnings
This includes using `''` instead of `None` in some cases where an empty
string conveys 'nothing' equally well as `None`; in such cases keeping the
type the same rather than switching to another type is preferred.
2019-01-03 12:07:05 +01:00
6d2e6efa13 Update users of the material after replacing a HDRi
This causes a refresh and immediately shows the new texture in the viewport.
2019-01-03 11:33:19 +01:00
ff9ae0117d Fixed race condition referring to self when operator may have stopped running
The `file_loading` function is called deferred by asyncio, and can thus
be called when the operator has already stopped running. This is fixed by
not referring to `self` in that function, and by taking the logger from
the outer scope.
2019-01-03 11:32:40 +01:00
974d33e3a3 Texture Browser updated for Blender 2.8 drawing
The drawing code has been abstracted into a `draw.py` for Blender 2.8
and `draw_27.py` for earlier versions.
2019-01-03 10:41:42 +01:00
8de3a0bba2 Moved texture browser to its own module
This places it in the same kind of structure as Attract and Flamenco.
2019-01-02 16:47:33 +01:00
6f705b917f Removed local import 2019-01-02 16:47:11 +01:00
02b694f5d4 Bumped version to 1.10.0 and marked as released today 2019-01-02 16:19:23 +01:00
663ebae572 Bumped Blender-Asset-Tracer version to 0.8
This version has lots of Windows-specific fixes.
2019-01-02 16:19:05 +01:00
cb5a116dff Compatibility fix for Blender 2.8
bpy.context.user_preferences was renamed to bpy.context.preferences.
2018-12-28 12:31:33 +01:00
5821611d89 Compatibility fix with Blender 2.79 (Python 3.5) 2018-12-28 12:29:25 +01:00
8bd1faa575 Overwrite when deploying 2018-12-07 14:34:02 +01:00
8899bff5e4 Fixed Flamenco exclusion filter bug
There was a mistake in an older version of the property tooltip, showing
semicolon-separated instead of space-separated. We now just handle both.
2018-12-07 12:25:48 +01:00
4fd4ad7448 Added 'blender-video-chunks' job type
Requires that the file is configured for rendering to Matroska video
files.

Audio is only extracted when there is an audio codec configured. This is
a bit arbitrary, but it's at least a way to tell whether the artist
considers the audio in the current blend file to be of any relevance.
2018-12-07 11:28:09 +01:00
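
A sketch of that heuristic against the Blender Python API (illustrative, not the job type's exact code):

```python
import bpy

def wants_audio(scene: bpy.types.Scene) -> bool:
    # Only extract audio when an FFmpeg audio codec is actually configured;
    # 'NONE' means the artist did not set up any audio output.
    return (scene.render.image_settings.file_format == "FFMPEG"
            and scene.render.ffmpeg.audio_codec != "NONE")
```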
4f32b49ad3 Flamenco: Allow BAT-packing of only relative-path assets 2018-12-06 15:46:54 +01:00
1f13b4d249 Updated changelog 2018-12-05 13:01:03 +01:00
ef57dba5d3 Flamenco: Write more extensive information to jobinfo.json
This introduces version 2 of that file.

Version 1:
    - Only the job doc was saved, with 'missing_files' added inside it.

Version 2:
  - '_meta' key was added to indicate version.
  - 'job' is saved in a 'job' key; 'missing_files' is still a top-level key.
  - 'exclusion_filter', 'project_settings', and
    'flamenco_manager_settings' keys were added.
2018-12-05 12:57:39 +01:00
419249ee19 Flamenco: Compress all blend files
All blend files in the BAT pack are now compressed, and not just the one
we save from Blender. Requires BAT 0.5 or newer.
2018-11-27 16:40:05 +01:00
113eb8f7ab Flamenco: add fps, output_file_extension, and images_or_video job settings
These are all needed to use FFmpeg on the worker to render a video from
rendered image sequences.

- fps: float, the scene FPS
- images_or_video: either 'images' or 'video', depending on what's being
  output by Blender. We don't support using FFmpeg to join chunked videos
  yet.
- output_file_extension: string like '.png' or '.exr', only set when
  outputting images (since doing this for video requires a lookup table and
  isn't even being used at the moment).
2018-11-21 14:24:32 +01:00
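
For illustration, the three settings could be derived from the scene roughly like this (the setting names follow the commit message; the helper itself is hypothetical):

```python
import bpy

def extra_job_settings(scene: bpy.types.Scene) -> dict:
    render = scene.render
    is_video = render.image_settings.file_format in {"FFMPEG", "AVI_JPEG", "AVI_RAW"}
    settings = {
        "fps": render.fps / render.fps_base,  # the scene frame rate as a float
        "images_or_video": "video" if is_video else "images",
    }
    if not is_video:
        # Only set for image output, e.g. ".png" or ".exr".
        settings["output_file_extension"] = render.file_extension
    return settings
```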
85f911cb59 Generalised saving/loading of project+manager-specific settings + added one
Added the `flamenco_exclude_filter` setting to the set, and also made it
easier to add new settings.
2018-11-16 17:12:30 +01:00
564c2589b1 Added little script to automate deployment in Blender Animation Studio 2018-11-16 16:54:36 +01:00
80155ed4f4 Fixed storing & loading project+manager-specific settings
The problem was that too much storing was done in an on-change handler,
causing things to be overwritten. By splitting up some functionality
and properly marking the "we're now loading" bits of code, it's solved.
2018-11-16 16:52:07 +01:00
d8c5c4eecd Cross-platformified my setup.py 'local' hack 2018-11-16 12:20:09 +01:00
3972ce4543 Write wheel files to correct dir in the bdist archive
They were ending up in a `local` directory next to the `blender_cloud`
directory. Probably something to do with newer setuptools? Had the same
issue in the Blender ID add-on.
2018-11-15 17:47:33 +01:00
d75a055149 Updated CHANGELOG 2018-11-12 15:07:03 +01:00
649542daad Prevent crashing Blender when running in the background 2018-11-12 15:02:51 +01:00
1d99751d20 Bumped version to 1.9.4 2018-11-01 18:39:24 +01:00
69028e0cfd Fixed Python 3.6 / 2.79b incompatibilities introduced in 1.9.3 2018-11-01 18:39:06 +01:00
dc7ad296bf Added little reminder for myself 2018-11-01 18:30:18 +01:00
3f2479067c Fixed incompatibility with Python 3.6 (used in Blender 2.79b) 2018-11-01 18:30:10 +01:00
6fefe4ffd8 Bumped version to 1.9.3 2018-10-30 14:17:02 +01:00
62c1c966f6 Attract: draw using the GPU module
The drawing is rather primitive, but it works.
2018-10-30 14:14:33 +01:00
57aadc1817 Attract: added 'open project in browser' button
The button was added to the video sequence editor panel.
2018-10-30 14:14:33 +01:00
7204d4a24c Added bl_category for Attract panel 2018-10-30 14:14:33 +01:00
641b51496a Some drawing code simplification 2018-10-30 14:14:33 +01:00
0562d57513 Attract: fixed class naming and registration 2018-10-30 14:14:33 +01:00
ac19e48895 Changelog update 2018-10-30 10:56:09 +01:00
73d96e5c89 Bumped version to 1.9.2 2018-09-17 18:58:05 +02:00
4bfdac223a Include Python 3.7-compatible pillarsdk 2018-09-17 18:57:57 +02:00
5d6777c74b Bumped version to 1.9.1 2018-09-17 18:47:52 +02:00
f4322f1d1f Updated changelog 2018-09-17 18:47:16 +02:00
13a8595cc0 Don't set prefs.flamenco_manager.manager to a not-in-the-enum value 2018-09-17 18:23:41 +02:00
af413059b0 Bumped version to 1.9.0 2018-09-05 13:35:27 +02:00
4d26ad248e Bumped version to 1.9 for last 2.79-compatible release
The next release will be 2.0 and target Blender 2.80.
2018-09-05 13:31:09 +02:00
d019fd0cf0 Some debug logging code 2018-09-05 13:28:56 +02:00
fb9ffbbc23 Store available managers per project, and store chosen manager too 2018-09-04 17:38:21 +02:00
f6d797512a Moved use of global variable to a context manager 2018-09-04 17:37:07 +02:00
8367abeeb9 Fixed bad registration 2018-09-04 15:36:38 +02:00
2f5f82b1a8 Blender 2.80-compatible unregistration 2018-09-04 14:56:10 +02:00
a04137ec6a Bumped version to 1.9.999 2018-09-04 14:43:57 +02:00
87c90a7f72 Some more Blender 2.80 compatibility 2018-09-04 14:38:30 +02:00
4de8122920 More code simplification 2018-09-04 14:34:14 +02:00
21d2257be0 Simplified some code 2018-09-04 14:31:08 +02:00
bc4036573c Updated CHANGELOG 2018-09-04 14:30:55 +02:00
87cf1e12fa Prevent KeyError when accessing ps['flamenco_manager'] 2018-09-04 14:11:15 +02:00
b35d7bc5f3 Made the add-on more compatible with 2.80 and 2.79 2018-09-04 13:48:44 +02:00
973dafcc3a Store some flamenco job preferences on a per-manager basis
Managers often require distinct input and output paths, which can now
be saved to and loaded from the User Preferences, as well as in the
Flamenco panel.
2018-07-25 15:01:39 +02:00
62d16fff35 Display only Flamenco Managers linked to the current project 2018-07-24 15:44:12 +02:00
ed3de414ef Bumped version to 1.8.99999 2018-07-12 11:54:38 +02:00
b0a03c81f5 Flamenco: allow jobs to be created in 'paused' state. 2018-07-12 11:54:13 +02:00
99f0764986 More efficient removal of Flamenco-specific Scene properties 2018-07-12 11:54:01 +02:00
f9c2dda9fa Fixed problem with relative project paths 2018-07-12 11:53:30 +02:00
0a7dea568a Bundle BAT 0.4 2018-07-10 16:05:45 +02:00
40c31e8be2 Upgrade blender-asset-tracer 0.2-dev → 0.3 2018-07-03 15:12:36 +02:00
394395a7f5 Update bl_info to mark compatibility with Blender 2.80+
The add-on will still work on Blender 2.77a+; this change is required for
Blender 2.80 to load the add-on.
2018-07-03 12:32:50 +02:00
f1478bf3d9 Bumped version to 1.8.9999 2018-06-01 17:33:24 +02:00
2fce27f8cb Made the add-on not immediately crash on Blender 2.8 2018-06-01 17:22:49 +02:00
59e6491110 Bumped BAT required version to 0.2 2018-05-08 12:52:43 +02:00
afca7abe18 Bundle development version of BAT for now
This makes testing a bit easier.
2018-03-26 17:22:38 +02:00
4aae107396 Support colour strips as Attract shots 2018-03-22 16:25:35 +01:00
096a5f5803 Updated changelog 2018-03-22 16:22:44 +01:00
79dc5c91f7 Gracefully handle download errors in texture browser 2018-03-22 14:21:09 +01:00
0a99b9e22e Save jobinfo.json to output directory
Previously it would be saved in the same directory as the blend file, which
may be deeply nested in a directory structure. Now it's saved at the top
of the BAT pack.
2018-03-21 16:05:20 +01:00
0452fd845b Fix for some threading issue 2018-03-21 15:57:45 +01:00
c0a8602e17 Get the FLAMENCO_OT_copy_files operator up to par with the rest 2018-03-21 15:57:45 +01:00
10c51b3af5 For development, require latest version of BAT 2018-03-21 15:34:56 +01:00
0be5c16926 Avoid TypeError when project-specific Flamenco Manager cannot be found 2018-03-16 14:13:29 +01:00
4158c4eed5 Require BAT 0.1 2018-03-16 13:47:24 +01:00
de4a93de98 Bumped version to 1.8.999 to indicate '1.9-dev'
Maybe this will even become 2.0 eventually.
2018-03-16 13:46:48 +01:00
331e9e6ca0 Don't show stack trace when BAT Pack was aborted 2018-03-16 12:41:09 +01:00
1d81f4bc38 Some code unindentation 2018-03-16 12:40:05 +01:00
5f58f8b6f7 Set status done 'DONE' after done 2018-03-16 12:26:16 +01:00
164f65f30c Allow aborting a running BAT Pack operation 2018-03-16 12:15:53 +01:00
b82bc14fcf Flamenco: Reporting BAT Pack process on the UI 2018-03-15 18:01:04 +01:00
9e5dcd0b55 Replaced BAM with BAT
Blender Asset Tracer, or BAT, is a newly written replacement for BAM,
with a nicer API.
2018-03-15 12:36:05 +01:00
531ddad8f5 Formatting 2018-02-21 13:51:01 +01:00
7e0dd0384d Simplified wheel downloading 2018-02-21 13:50:55 +01:00
da0c8dd944 Marked 1.8 as released 2018-02-15 17:42:00 +01:00
8065ab88a4 Bumped version to 1.8.0 2018-01-03 14:09:01 +01:00
6baf43e53b Fix KeyError when browsing HDRIs 2018-01-02 17:24:03 +01:00
f1fa273370 Updated changelog 2018-01-02 16:48:47 +01:00
bf96638c88 Updated changelog 2018-01-02 16:44:33 +01:00
bc8a985228 Added button to open a Cloud project in webbrowser. 2018-01-02 16:44:29 +01:00
ba14c33b6d Store project-specific settings in the preferences.
This stores project-specific settings, such as filesystem paths, for each
project, and restores those settings when the project is selected again.
Does not touch settings that haven't been set for the newly selected
project.
2018-01-02 16:42:37 +01:00
0a7e7195a2 Changed default Flamenco path to tempfile.gettempdir()
The previous defaults were very Blender Institute specific.
2018-01-02 15:37:38 +01:00
ecab0f6163 Upgraded cryptography package version + its dependencies
This was required to get the package to build on Kubuntu 17.10.
2018-01-02 15:36:19 +01:00
3c91ccced6 Texture Browser: use DPI from user preferences 2018-01-02 15:10:27 +01:00
c9ed6c7d23 Texture browser: show which map types are available in GUI 2018-01-02 14:53:30 +01:00
5fa01daf9e Revisit previous path when re-opening texture browser. 2018-01-02 14:11:30 +01:00
77664fb6d7 Distinguish between 'renew' and 'join' in messages & URL to open. 2018-01-02 14:02:17 +01:00
45cffc5365 Bumped version to 1.7.99 (surrogate for 1.8-dev) 2018-01-02 14:02:17 +01:00
fb5433d473 Bumped version to 1.7.5 2017-10-06 12:39:38 +02:00
a17fe45712 Allow overriding the render output path on a per-scene basis. 2017-10-06 12:39:18 +02:00
1bfba64bdc Formatting 2017-10-06 12:38:17 +02:00
cdb4bf4f4f Renamed 'Job File Path' to 'Job Storage Path' so it's more explicit. 2017-10-06 12:37:44 +02:00
15254b8951 Sorting the project list in user prefs alphabetically 2017-10-06 12:35:25 +02:00
34 changed files with 5473 additions and 3291 deletions

View File

@@ -1,5 +1,171 @@
# Blender Cloud changelog
## Version 1.25 (2022-02-25)
- Compatibility with Blender 3.1 (Python 3.10).
- Bump blender-asset-tracer to version 1.11, for UDIM support.
## Version 1.24 (2022-02-04)
- Bump blender-asset-tracer version 1.8 → 1.10, for fixing a bug where files were doubly-compressed.
## Version 1.23 (2021-11-09)
- Bump blender-asset-tracer version 1.7 → 1.8, for compatibility with sending read-only blend files to Flamenco.
## Version 1.22 (2021-11-05)
- Fix Windows incompatibility when using Shaman URLs as job storage path.
- Bump blender-asset-tracer version 1.6 → 1.7, for compatibility with files compressed by Blender 3.0.
## Version 1.21 (2021-07-27)
- Bump blender-asset-tracer version 1.5.1 → 1.6, for better compatibility with Geometry Nodes.
## Version 1.20 (2021-07-22)
- Bump blender-asset-tracer version 1.3.1 → 1.5.1.
- Blender-asset-tracer "Strict Pointer Mode" disabled, to avoid issues with
not-entirely-synced library overrides.
## Version 1.19 (2021-02-23)
- Another Python 3.9+ compatibility fix.
## Version 1.18 (2021-02-16)
- Add compatibility with Python 3.9 (as used in Blender 2.93).
- Drop compatibility with Blender 2.79 and older. The last version of the
Blender Cloud add-on with 2.79 and older is version 1.17.
## Version 1.17 (2021-02-04)
- This is the last version compatible with Blender 2.77a - 2.79.
- Upgrade BAT to version 1.3.1, which brings compatibility with Geometry Nodes and
fixes some issues on Windows.
## Version 1.16 (2020-03-03)
- Fixed Windows compatibility issue with the handling of Shaman URLs.
## Version 1.15 (2019-12-12)
- Avoid creating BAT pack when the to-be-rendered file is already inside the job storage
directory. This assumes that the paths are already correct for the Flamenco Workers.
## Version 1.14 (2019-10-10)
- Upgraded BAT to 1.2 for missing smoke caches, compatibility with Blender 2.81, and some
Windows-specific fixes.
- Removed warnings on the terminal when running Blender 2.80+
## Version 1.13 (2019-04-18)
- Upgraded BAT to 1.1.1 for a compatibility fix with Blender 2.79
- Flamenco: Support for Flamenco Manager settings versioning + for settings version 2.
When using Blender Cloud Add-on 1.12 or older, Flamenco Server will automatically convert the
Manager settings to version 1.
- More Blender 2.80 compatibility fixes
## Version 1.12 (2019-03-25)
- Flamenco: Change how progressive render tasks are created. Instead of the artist setting a fixed
number of sample chunks, they can now set a maximum number of samples for each render task.
Initial render tasks are created with a low number of samples, and subsequent tasks have an
increasing number of samples, up to the set maximum. The total number of samples of the final
render is still equal to the number of samples configured in the blend file.
Requires Flamenco Server 2.2 or newer.
- Flamenco: Added a hidden "Submit & Quit" button. This button can be enabled in the add-on
preferences and will then be available on the Flamenco Render panel. Pressing the button will
silently close Blender after the job has been submitted to Flamenco (for example to click,
walk away, and free up memory for when the same machine is part of the render farm).
- Flamenco: Name render jobs just 'thefile' instead of 'Render thefile.flamenco.blend'.
This makes the job overview on Flamenco Server cleaner.
- Flamenco: support Shaman servers. See https://www.flamenco.io/docs/user_manual/shaman/
for more info.
- Flamenco: The 'blender-video-chunks' job type now also allows MP4 and MOV video containers.
## Version 1.11.1 (2019-01-04)
- Bundled missing Texture Browser icons.
## Version 1.11.0 (2019-01-04)
- Texture Browser now works on Blender 2.8.
- Blender Sync: Fixed compatibility issue with Blender 2.8.
## Version 1.10.0 (2019-01-02)
- Bundles Blender-Asset-Tracer 0.8.
- Fix crashing Blender when running in background mode (e.g. without GUI).
- Flamenco: Include extra job parameters to allow for encoding a video at the end of a render
job that produced an image sequence.
- Flamenco: Compress all blend files, and not just the one we save from Blender.
- Flamenco: Store more info in the `jobinfo.json` file. This is mostly useful for debugging issues
on the render farm, as now things like the exclusion filter and Manager settings are logged too.
- Flamenco: Allow BAT-packing of only those assets that are referred to by relative path (e.g.
a path starting with `//`). Assets with an absolute path are ignored, and assumed to be reachable
at the same path by the Workers.
- Flamenco: Added 'blender-video-chunks' job type, meant for rendering the edit of a film from the
VSE. This job type requires that the file is configured for rendering to Matroska video
files.
Audio is only extracted when there is an audio codec configured. This is a bit arbitrary, but it's
at least a way to tell whether the artist is considering that there is audio of any relevance in
the current blend file.
## Version 1.9.4 (2018-11-01)
- Fixed Python 3.6 and Blender 2.79b incompatibilities accidentally introduced in 1.9.3.
## Version 1.9.3 (2018-10-30)
- Fix drawing of Attract strips in the VSE on Blender 2.8.
## Version 1.9.2 (2018-09-17)
- No changes, just a different filename to force a refresh on our
hosting platform.
## Version 1.9.1 (2018-09-17)
- Fix issue with Python 3.7, which is used by current daily builds of Blender.
## Version 1.9 (2018-09-05)
- Last version to support Blender versions before 2.80!
- Replace BAM with BAT🦇.
- Don't crash the texture browser when an invalid texture is seen.
- Support colour strips as Attract shots.
- Flamenco: allow jobs to be created in 'paused' state.
- Flamenco: only show Flamenco Managers that are linked to the currently selected project.
## Version 1.8 (2018-01-03)
- Distinguish between 'please subscribe' (to get a new subscription) and 'please renew' (to renew an
existing subscription).
- When re-opening the Texture Browser it now opens in the same folder as where it was when closed.
- In the texture browser, draw the components of the texture (i.e. which map types are available),
such as 'bump, normal, specular'.
- Use Interface Scale setting from user preferences to draw the Texture Browser text.
- Store project-specific settings in the preferences, such as filesystem paths, for each project,
and restore those settings when the project is selected again. Does not touch settings that
haven't been set for the newly selected project. These settings are only saved when a setting
is updated, so to save your current settings you need to update a single setting; this saves
all settings for the project.
- Added button in the User Preferences to open a Cloud project in your webbrowser.
## Version 1.7.5 (2017-10-06)
- Sorting the project list alphabetically.
- Renamed 'Job File Path' to 'Job Storage Path' so it's more explicit.
- Allow overriding the render output path on a per-scene basis.
## Version 1.7.4 (2017-09-05)
- Fix [T52621](https://developer.blender.org/T52621): Fixed class name collision upon add-on
@@ -7,57 +173,48 @@
- Fix [T48852](https://developer.blender.org/T48852): Screenshot no longer shows "Communicating with
Blender Cloud".
## Version 1.7.3 (2017-08-08)
- Default to scene frame range when no frame range is given.
- Refuse to render on Flamenco before blend file is saved at least once.
- Fixed some Windows-specific issues.
## Version 1.7.2 (2017-06-22)
- Fixed compatibility with Blender 2.78c.
## Version 1.7.1 (2017-06-13)
- Fixed asyncio issues on Windows
## Version 1.7.0 (2017-06-09)
- Fixed reloading after upgrading from 1.4.4 (our last public release).
- Fixed bug handling a symlinked project path.
- Added support for Manager-defined path replacement variables.
## Version 1.6.4 (2017-04-21)
- Added file exclusion filter for Flamenco. A filter like `*.abc;*.mkv;*.mov` can be
used to prevent certain files from being copied to the job storage directory.
Requires a Blender that is bundled with BAM 1.1.7 or newer.
## Version 1.6.3 (2017-03-21)
- Fixed bug where local project path wasn't shown for projects only set up for Flamenco
(and not Attract).
- Added this CHANGELOG.md file, which will contain user-relevant changes.
## Version 1.6.2 (2017-03-17)
- Flamenco: when opening non-existing file path, open parent instead
- Fix T50954: Improve Blender Cloud add-on project selector
## Version 1.6.1 (2017-03-07)
- Show error in GUI when Blender Cloud is unreachable
- Fixed sample count when using branched path tracing
## Version 1.6.0 (2017-02-14)
- Default to frame chunk size of 1 (instead of 10).

View File

@@ -19,22 +19,22 @@
 # <pep8 compliant>
 bl_info = {
-    'name': 'Blender Cloud',
+    "name": "Blender Cloud",
     "author": "Sybren A. Stüvel, Francesco Siddi, Inês Almeida, Antony Riakiotakis",
-    'version': (1, 7, 4),
-    'blender': (2, 77, 0),
-    'location': 'Addon Preferences panel, and Ctrl+Shift+Alt+A anywhere for texture browser',
-    'description': 'Texture library browser and Blender Sync. Requires the Blender ID addon '
-                   'and Blender 2.77a or newer.',
-    'wiki_url': 'https://wiki.blender.org/index.php/Extensions:2.6/Py/'
-                'Scripts/System/BlenderCloud',
-    'category': 'System',
+    "version": (1, 25),
+    "blender": (2, 80, 0),
+    "location": "Addon Preferences panel, and Ctrl+Shift+Alt+A anywhere for texture browser",
+    "description": "Texture library browser and Blender Sync. Requires the Blender ID addon "
+    "and Blender 2.80 or newer.",
+    "wiki_url": "https://wiki.blender.org/index.php/Extensions:2.6/Py/"
+    "Scripts/System/BlenderCloud",
+    "category": "System",
 }
 import logging
 # Support reloading
-if 'pillar' in locals():
+if "pillar" in locals():
     import importlib
     wheels = importlib.reload(wheels)
@@ -60,11 +60,11 @@ def register():
     _monkey_patch_requests()
     # Support reloading
-    if '%s.blender' % __name__ in sys.modules:
+    if "%s.blender" % __name__ in sys.modules:
         import importlib
         def reload_mod(name):
-            modname = '%s.%s' % (__name__, name)
+            modname = "%s.%s" % (__name__, name)
             try:
                 old_module = sys.modules[modname]
             except KeyError:
@@ -76,20 +76,32 @@ def register():
             sys.modules[modname] = new_module
             return new_module
-        reload_mod('blendfile')
-        reload_mod('home_project')
-        reload_mod('utils')
+        reload_mod("blendfile")
+        reload_mod("home_project")
+        reload_mod("utils")
+        reload_mod("pillar")
-        async_loop = reload_mod('async_loop')
-        flamenco = reload_mod('flamenco')
-        attract = reload_mod('attract')
-        texture_browser = reload_mod('texture_browser')
-        settings_sync = reload_mod('settings_sync')
-        image_sharing = reload_mod('image_sharing')
-        blender = reload_mod('blender')
+        async_loop = reload_mod("async_loop")
+        flamenco = reload_mod("flamenco")
+        attract = reload_mod("attract")
+        texture_browser = reload_mod("texture_browser")
+        settings_sync = reload_mod("settings_sync")
+        image_sharing = reload_mod("image_sharing")
+        blender = reload_mod("blender")
+        project_specific = reload_mod("project_specific")
     else:
-        from . import (blender, texture_browser, async_loop, settings_sync, blendfile, home_project,
-                       image_sharing, attract, flamenco)
+        from . import (
+            blender,
+            texture_browser,
+            async_loop,
+            settings_sync,
+            blendfile,
+            home_project,
+            image_sharing,
+            attract,
+            flamenco,
+            project_specific,
+        )
     async_loop.setup_asyncio_executor()
     async_loop.register()
@@ -101,7 +113,7 @@ def register():
     image_sharing.register()
     blender.register()
-    blender.handle_project_update()
+    project_specific.handle_project_update()
 def _monkey_patch_requests():
@@ -115,15 +127,23 @@ def _monkey_patch_requests():
     if requests.__build__ >= 0x020601:
         return
-    log.info('Monkey-patching requests version %s', requests.__version__)
+    log.info("Monkey-patching requests version %s", requests.__version__)
     from requests.packages.urllib3.response import HTTPResponse
     HTTPResponse.chunked = False
     HTTPResponse.chunk_left = None
 def unregister():
-    from . import (blender, texture_browser, async_loop, settings_sync, image_sharing, attract,
-                   flamenco)
+    from . import (
+        blender,
+        texture_browser,
+        async_loop,
+        settings_sync,
+        image_sharing,
+        attract,
+        flamenco,
+    )
     image_sharing.unregister()
     attract.unregister()

View File

@@ -14,7 +14,7 @@ See <http://github.com/ActiveState/appdirs> for details and usage.
 # - XDG spec for Un*x: http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html
 __version_info__ = (1, 4, 0)
-__version__ = '.'.join(map(str, __version_info__))
+__version__ = ".".join(map(str, __version_info__))
 import sys
@@ -25,23 +25,23 @@ PY3 = sys.version_info[0] == 3
 if PY3:
     unicode = str
-if sys.platform.startswith('java'):
+if sys.platform.startswith("java"):
     import platform
     os_name = platform.java_ver()[3][0]
-    if os_name.startswith('Windows'): # "Windows XP", "Windows 7", etc.
-        system = 'win32'
-    elif os_name.startswith('Mac'): # "Mac OS X", etc.
-        system = 'darwin'
+    if os_name.startswith("Windows"):  # "Windows XP", "Windows 7", etc.
+        system = "win32"
+    elif os_name.startswith("Mac"):  # "Mac OS X", etc.
+        system = "darwin"
     else:  # "Linux", "SunOS", "FreeBSD", etc.
         # Setting this to "linux2" is not ideal, but only Windows or Mac
         # are actually checked for and the rest of the module expects
         # *sys.platform* style strings.
-        system = 'linux2'
+        system = "linux2"
 else:
     system = sys.platform
 def user_data_dir(appname=None, appauthor=None, version=None, roaming=False):
     r"""Return full path to the user-specific data dir for this application.
@@ -84,12 +84,12 @@ def user_data_dir(appname=None, appauthor=None, version=None, roaming=False):
                 path = os.path.join(path, appauthor, appname)
             else:
                 path = os.path.join(path, appname)
-    elif system == 'darwin':
-        path = os.path.expanduser('~/Library/Application Support/')
+    elif system == "darwin":
+        path = os.path.expanduser("~/Library/Application Support/")
         if appname:
             path = os.path.join(path, appname)
     else:
-        path = os.getenv('XDG_DATA_HOME', os.path.expanduser("~/.local/share"))
+        path = os.getenv("XDG_DATA_HOME", os.path.expanduser("~/.local/share"))
         if appname:
             path = os.path.join(path, appname)
     if appname and version:
@@ -137,16 +137,19 @@ def site_data_dir(appname=None, appauthor=None, version=None, multipath=False):
                 path = os.path.join(path, appauthor, appname)
             else:
                 path = os.path.join(path, appname)
-    elif system == 'darwin':
-        path = os.path.expanduser('/Library/Application Support')
+    elif system == "darwin":
+        path = os.path.expanduser("/Library/Application Support")
         if appname:
             path = os.path.join(path, appname)
     else:
         # XDG default for $XDG_DATA_DIRS
         # only first, if multipath is False
-        path = os.getenv('XDG_DATA_DIRS',
-                         os.pathsep.join(['/usr/local/share', '/usr/share']))
-        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
+        path = os.getenv(
+            "XDG_DATA_DIRS", os.pathsep.join(["/usr/local/share", "/usr/share"])
+        )
+        pathlist = [
+            os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)
+        ]
         if appname:
             if version:
                 appname = os.path.join(appname, version)
@@ -195,7 +198,7 @@ def user_config_dir(appname=None, appauthor=None, version=None, roaming=False):
     if system in ["win32", "darwin"]:
         path = user_data_dir(appname, appauthor, None, roaming)
     else:
-        path = os.getenv('XDG_CONFIG_HOME', os.path.expanduser("~/.config"))
+        path = os.getenv("XDG_CONFIG_HOME", os.path.expanduser("~/.config"))
         if appname:
             path = os.path.join(path, appname)
     if appname and version:
@@ -240,8 +243,10 @@ def site_config_dir(appname=None, appauthor=None, version=None, multipath=False)
     else:
         # XDG default for $XDG_CONFIG_DIRS
         # only first, if multipath is False
-        path = os.getenv('XDG_CONFIG_DIRS', '/etc/xdg')
-        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
+        path = os.getenv("XDG_CONFIG_DIRS", "/etc/xdg")
+        pathlist = [
+            os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)
+        ]
         if appname:
             if version:
                 appname = os.path.join(appname, version)
@@ -298,14 +303,14 @@ def user_cache_dir(appname=None, appauthor=None, version=None, opinion=True):
                path = os.path.join(path, appname)
            if opinion:
                path = os.path.join(path, "Cache")
-    elif system == 'darwin':
-        path = os.path.expanduser('~/Library/Caches')
+    elif system == "darwin":
+        path = os.path.expanduser("~/Library/Caches")
         if appname:
             path = os.path.join(path, appname)
     else:
-        path = os.getenv('XDG_CACHE_HOME', os.path.expanduser('~/.cache'))
+        path = os.getenv("XDG_CACHE_HOME", os.path.expanduser("~/.cache"))
         if appname:
-            path = os.path.join(path, appname.lower().replace(' ', '-'))
+            path = os.path.join(path, appname.lower().replace(" ", "-"))
     if appname and version:
         path = os.path.join(path, version)
     return path
@@ -344,9 +349,7 @@ def user_log_dir(appname=None, appauthor=None, version=None, opinion=True):
     This can be disabled with the `opinion=False` option.
     """
     if system == "darwin":
-        path = os.path.join(
-            os.path.expanduser('~/Library/Logs'),
-            appname)
+        path = os.path.join(os.path.expanduser("~/Library/Logs"), appname)
     elif system == "win32":
         path = user_data_dir(appname, appauthor, version)
         version = False
@@ -364,8 +367,10 @@
 class AppDirs(object):
     """Convenience wrapper for getting application dirs."""
-    def __init__(self, appname, appauthor=None, version=None, roaming=False,
-                 multipath=False):
+    def __init__(
+        self, appname, appauthor=None, version=None, roaming=False, multipath=False
+    ):
         self.appname = appname
         self.appauthor = appauthor
         self.version = version
@@ -374,36 +379,39 @@ class AppDirs(object):
     @property
     def user_data_dir(self):
-        return user_data_dir(self.appname, self.appauthor,
-                             version=self.version, roaming=self.roaming)
+        return user_data_dir(
+            self.appname, self.appauthor, version=self.version, roaming=self.roaming
+        )
     @property
     def site_data_dir(self):
-        return site_data_dir(self.appname, self.appauthor,
-                             version=self.version, multipath=self.multipath)
+        return site_data_dir(
+            self.appname, self.appauthor, version=self.version, multipath=self.multipath
+        )
     @property
     def user_config_dir(self):
-        return user_config_dir(self.appname, self.appauthor,
-                               version=self.version, roaming=self.roaming)
+        return user_config_dir(
+            self.appname, self.appauthor, version=self.version, roaming=self.roaming
+        )
     @property
     def site_config_dir(self):
-        return site_config_dir(self.appname, self.appauthor,
-                               version=self.version, multipath=self.multipath)
+        return site_config_dir(
+            self.appname, self.appauthor, version=self.version, multipath=self.multipath
+        )
     @property
     def user_cache_dir(self):
-        return user_cache_dir(self.appname, self.appauthor,
-                              version=self.version)
+        return user_cache_dir(self.appname, self.appauthor, version=self.version)
     @property
     def user_log_dir(self):
-        return user_log_dir(self.appname, self.appauthor,
-                            version=self.version)
+        return user_log_dir(self.appname, self.appauthor, version=self.version)
-#---- internal support stuff
+# ---- internal support stuff
 def _get_win_folder_from_registry(csidl_name):
     """This is a fallback technique at best. I'm not sure if using the
@@ -420,7 +428,7 @@ def _get_win_folder_from_registry(csidl_name):
     key = _winreg.OpenKey(
         _winreg.HKEY_CURRENT_USER,
-        r"Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders"
+        r"Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders",
     )
     dir, type = _winreg.QueryValueEx(key, shell_folder_name)
     return dir
@@ -428,6 +436,7 @@
 def _get_win_folder_with_pywin32(csidl_name):
     from win32com.shell import shellcon, shell
     dir = shell.SHGetFolderPath(0, getattr(shellcon, csidl_name), 0, 0)
     # Try to make this a unicode path because SHGetFolderPath does
     # not return unicode strings when there is unicode data in the
@@ -445,6 +454,7 @@
     if has_high_char:
         try:
             import win32api
             dir = win32api.GetShortPathName(dir)
         except ImportError:
             pass
@@ -479,15 +489,22 @@ def _get_win_folder_with_ctypes(csidl_name):
     return buf.value
 def _get_win_folder_with_jna(csidl_name):
     import array
     from com.sun import jna
     from com.sun.jna.platform import win32
     buf_size = win32.WinDef.MAX_PATH * 2
-    buf = array.zeros('c', buf_size)
+    buf = array.zeros("c", buf_size)
     shell = win32.Shell32.INSTANCE
-    shell.SHGetFolderPath(None, getattr(win32.ShlObj, csidl_name), None, win32.ShlObj.SHGFP_TYPE_CURRENT, buf)
+    shell.SHGetFolderPath(
+        None,
+        getattr(win32.ShlObj, csidl_name),
+        None,
+        win32.ShlObj.SHGFP_TYPE_CURRENT,
+        buf,
+    )
     dir = jna.Native.toString(buf.tostring()).rstrip("\0")
     # Downgrade to short path name if have highbit chars. See
@@ -498,38 +515,47 @@
             has_high_char = True
             break
     if has_high_char:
-        buf = array.zeros('c', buf_size)
+        buf = array.zeros("c", buf_size)
         kernel = win32.Kernel32.INSTANCE
         if kernal.GetShortPathName(dir, buf, buf_size):
             dir = jna.Native.toString(buf.tostring()).rstrip("\0")
     return dir
 if system == "win32":
     try:
         import win32com.shell
         _get_win_folder = _get_win_folder_with_pywin32
     except ImportError:
         try:
-            from ctypes import windll
+            from ctypes import windll  # type: ignore
             _get_win_folder = _get_win_folder_with_ctypes
         except ImportError:
             try:
                 import com.sun.jna
                 _get_win_folder = _get_win_folder_with_jna
             except ImportError:
                 _get_win_folder = _get_win_folder_from_registry
-#---- self test code
+# ---- self test code
 if __name__ == "__main__":
     appname = "MyApp"
     appauthor = "MyCompany"
-    props = ("user_data_dir", "site_data_dir",
-             "user_config_dir", "site_config_dir",
-             "user_cache_dir", "user_log_dir")
+    props = (
+        "user_data_dir",
+        "site_data_dir",
+        "user_config_dir",
+        "site_config_dir",
+        "user_cache_dir",
+        "user_log_dir",
+    )
     print("-- app dirs (with optional 'version')")
     dirs = AppDirs(appname, appauthor, version="1.0")

View File

@ -23,6 +23,7 @@ import traceback
import concurrent.futures import concurrent.futures
import logging import logging
import gc import gc
import typing
import bpy import bpy
@ -37,11 +38,13 @@ def setup_asyncio_executor():
import sys import sys
if sys.platform == 'win32': if sys.platform == "win32":
asyncio.get_event_loop().close() asyncio.get_event_loop().close()
# On Windows, the default event loop is SelectorEventLoop, which does # On Windows, the default event loop is SelectorEventLoop, which does
# not support subprocesses. ProactorEventLoop should be used instead. # not support subprocesses. ProactorEventLoop should be used instead.
# Source: https://docs.python.org/3/library/asyncio-subprocess.html # Source: https://docs.python.org/3/library/asyncio-subprocess.html
#
# NOTE: this is actually the default even loop in Python 3.9+.
loop = asyncio.ProactorEventLoop() loop = asyncio.ProactorEventLoop()
asyncio.set_event_loop(loop) asyncio.set_event_loop(loop)
else: else:
@ -52,8 +55,12 @@ def setup_asyncio_executor():
# loop.set_debug(True) # loop.set_debug(True)
from . import pillar from . import pillar
# Python 3.8 deprecated the 'loop' parameter, 3.10 removed it.
kwargs = {"loop": loop} if sys.version_info < (3, 8) else {}
# No more than this many Pillar calls should be made simultaneously # No more than this many Pillar calls should be made simultaneously
pillar.pillar_semaphore = asyncio.Semaphore(3, loop=loop) pillar.pillar_semaphore = asyncio.Semaphore(3, **kwargs)
def kick_async_loop(*args) -> bool: def kick_async_loop(*args) -> bool:
@ -69,17 +76,23 @@ def kick_async_loop(*args) -> bool:
stop_after_this_kick = False stop_after_this_kick = False
if loop.is_closed(): if loop.is_closed():
log.warning('loop closed, stopping immediately.') log.warning("loop closed, stopping immediately.")
return True return True
all_tasks = asyncio.Task.all_tasks() # Passing an explicit loop is required. Without it, the function uses
# asyncio.get_running_loop(), which raises a RuntimeError as the current
# loop isn't running.
all_tasks = asyncio.all_tasks(loop=loop)
if not len(all_tasks): if not len(all_tasks):
log.debug('no more scheduled tasks, stopping after this kick.') log.debug("no more scheduled tasks, stopping after this kick.")
stop_after_this_kick = True stop_after_this_kick = True
elif all(task.done() for task in all_tasks): elif all(task.done() for task in all_tasks):
log.debug('all %i tasks are done, fetching results and stopping after this kick.', log.debug(
len(all_tasks)) "all %i tasks are done, fetching results and stopping after this kick.",
len(all_tasks),
)
stop_after_this_kick = True stop_after_this_kick = True
# Clean up circular references between tasks. # Clean up circular references between tasks.
@@ -92,12 +105,12 @@ def kick_async_loop(*args) -> bool:
# noinspection PyBroadException # noinspection PyBroadException
try: try:
res = task.result() res = task.result()
log.debug(' task #%i: result=%r', task_idx, res) log.debug(" task #%i: result=%r", task_idx, res)
except asyncio.CancelledError: except asyncio.CancelledError:
# No problem, we want to stop anyway. # No problem, we want to stop anyway.
log.debug(' task #%i: cancelled', task_idx) log.debug(" task #%i: cancelled", task_idx)
except Exception: except Exception:
print('{}: resulted in exception'.format(task)) print("{}: resulted in exception".format(task))
traceback.print_exc() traceback.print_exc()
# for ref in gc.get_referrers(task): # for ref in gc.get_referrers(task):
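The switch to asyncio.all_tasks(loop=loop) matters because the kicked loop is not running when the check happens: the argument-less form resolves the loop via asyncio.get_running_loop() and raises RuntimeError outside a running loop. A standalone illustration of the difference (hypothetical snippet, not the add-on's code):

import asyncio

loop = asyncio.new_event_loop()
task = loop.create_task(asyncio.sleep(0))  # pending task on a loop that is not running

try:
    asyncio.all_tasks()  # resolves the loop via get_running_loop()
except RuntimeError as ex:
    print("without an explicit loop:", ex)  # "no running event loop"

print("with an explicit loop:", len(asyncio.all_tasks(loop=loop)))  # 1

loop.run_until_complete(task)  # let the task finish before closing the loop
loop.close()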
@@ -110,26 +123,26 @@ def kick_async_loop(*args) -> bool:
def ensure_async_loop(): def ensure_async_loop():
log.debug('Starting asyncio loop') log.debug("Starting asyncio loop")
result = bpy.ops.asyncio.loop() result = bpy.ops.asyncio.loop()
log.debug('Result of starting modal operator is %r', result) log.debug("Result of starting modal operator is %r", result)
def erase_async_loop(): def erase_async_loop():
global _loop_kicking_operator_running global _loop_kicking_operator_running
log.debug('Erasing async loop') log.debug("Erasing async loop")
loop = asyncio.get_event_loop() loop = asyncio.get_event_loop()
loop.stop() loop.stop()
class AsyncLoopModalOperator(bpy.types.Operator): class AsyncLoopModalOperator(bpy.types.Operator):
bl_idname = 'asyncio.loop' bl_idname = "asyncio.loop"
bl_label = 'Runs the asyncio main loop' bl_label = "Runs the asyncio main loop"
timer = None timer = None
log = logging.getLogger(__name__ + '.AsyncLoopModalOperator') log = logging.getLogger(__name__ + ".AsyncLoopModalOperator")
def __del__(self): def __del__(self):
global _loop_kicking_operator_running global _loop_kicking_operator_running
@@ -146,16 +159,16 @@ class AsyncLoopModalOperator(bpy.types.Operator):
global _loop_kicking_operator_running global _loop_kicking_operator_running
if _loop_kicking_operator_running: if _loop_kicking_operator_running:
self.log.debug('Another loop-kicking operator is already running.') self.log.debug("Another loop-kicking operator is already running.")
return {'PASS_THROUGH'} return {"PASS_THROUGH"}
context.window_manager.modal_handler_add(self) context.window_manager.modal_handler_add(self)
_loop_kicking_operator_running = True _loop_kicking_operator_running = True
wm = context.window_manager wm = context.window_manager
self.timer = wm.event_timer_add(0.00001, context.window) self.timer = wm.event_timer_add(0.00001, window=context.window)
return {'RUNNING_MODAL'} return {"RUNNING_MODAL"}
def modal(self, context, event): def modal(self, context, event):
global _loop_kicking_operator_running global _loop_kicking_operator_running
@@ -164,10 +177,10 @@ class AsyncLoopModalOperator(bpy.types.Operator):
# erase_async_loop(). This is a signal that we really should stop # erase_async_loop(). This is a signal that we really should stop
# running. # running.
if not _loop_kicking_operator_running: if not _loop_kicking_operator_running:
return {'FINISHED'} return {"FINISHED"}
if event.type != 'TIMER': if event.type != "TIMER":
return {'PASS_THROUGH'} return {"PASS_THROUGH"}
# self.log.debug('KICKING LOOP') # self.log.debug('KICKING LOOP')
stop_after_this_kick = kick_async_loop() stop_after_this_kick = kick_async_loop()
@@ -175,29 +188,33 @@ class AsyncLoopModalOperator(bpy.types.Operator):
context.window_manager.event_timer_remove(self.timer) context.window_manager.event_timer_remove(self.timer)
_loop_kicking_operator_running = False _loop_kicking_operator_running = False
self.log.debug('Stopped asyncio loop kicking') self.log.debug("Stopped asyncio loop kicking")
return {'FINISHED'} return {"FINISHED"}
return {'RUNNING_MODAL'} return {"RUNNING_MODAL"}
# noinspection PyAttributeOutsideInit # noinspection PyAttributeOutsideInit
class AsyncModalOperatorMixin: class AsyncModalOperatorMixin:
async_task = None # asyncio task for fetching thumbnails async_task = None # asyncio task for fetching thumbnails
signalling_future = None # asyncio future for signalling that we want to cancel everything. signalling_future = (
log = logging.getLogger('%s.AsyncModalOperatorMixin' % __name__) None # asyncio future for signalling that we want to cancel everything.
)
log = logging.getLogger("%s.AsyncModalOperatorMixin" % __name__)
_state = 'INITIALIZING' _state = "INITIALIZING"
stop_upon_exception = False stop_upon_exception = False
def invoke(self, context, event): def invoke(self, context, event):
context.window_manager.modal_handler_add(self) context.window_manager.modal_handler_add(self)
self.timer = context.window_manager.event_timer_add(1 / 15, context.window) self.timer = context.window_manager.event_timer_add(
1 / 15, window=context.window
)
self.log.info('Starting') self.log.info("Starting")
self._new_async_task(self.async_execute(context)) self._new_async_task(self.async_execute(context))
return {'RUNNING_MODAL'} return {"RUNNING_MODAL"}
async def async_execute(self, context): async def async_execute(self, context):
"""Entry point of the asynchronous operator. """Entry point of the asynchronous operator.
@@ -208,7 +225,7 @@ class AsyncModalOperatorMixin:
def quit(self): def quit(self):
"""Signals the state machine to stop this operator from running.""" """Signals the state machine to stop this operator from running."""
self._state = 'QUIT' self._state = "QUIT"
def execute(self, context): def execute(self, context):
return self.invoke(context, None) return self.invoke(context, None)
@@ -216,46 +233,50 @@ class AsyncModalOperatorMixin:
def modal(self, context, event): def modal(self, context, event):
task = self.async_task task = self.async_task
if self._state != 'EXCEPTION' and task and task.done() and not task.cancelled(): if self._state != "EXCEPTION" and task and task.done() and not task.cancelled():
ex = task.exception() ex = task.exception()
if ex is not None: if ex is not None:
self._state = 'EXCEPTION' self._state = "EXCEPTION"
self.log.error('Exception while running task: %s', ex) self.log.error("Exception while running task: %s", ex)
if self.stop_upon_exception: if self.stop_upon_exception:
self.quit() self.quit()
self._finish(context) self._finish(context)
return {'FINISHED'} return {"FINISHED"}
return {'RUNNING_MODAL'} return {"RUNNING_MODAL"}
if self._state == 'QUIT': if self._state == "QUIT":
self._finish(context) self._finish(context)
return {'FINISHED'} return {"FINISHED"}
return {'PASS_THROUGH'} return {"PASS_THROUGH"}
def _finish(self, context): def _finish(self, context):
self._stop_async_task() self._stop_async_task()
context.window_manager.event_timer_remove(self.timer) context.window_manager.event_timer_remove(self.timer)
def _new_async_task(self, async_task: asyncio.coroutine, future: asyncio.Future = None): def _new_async_task(
self, async_task: typing.Coroutine, future: asyncio.Future = None
):
"""Stops the currently running async task, and starts another one.""" """Stops the currently running async task, and starts another one."""
self.log.debug('Setting up a new task %r, so any existing task must be stopped', async_task) self.log.debug(
"Setting up a new task %r, so any existing task must be stopped", async_task
)
self._stop_async_task() self._stop_async_task()
# Download the previews asynchronously. # Download the previews asynchronously.
self.signalling_future = future or asyncio.Future() self.signalling_future = future or asyncio.Future()
self.async_task = asyncio.ensure_future(async_task) self.async_task = asyncio.ensure_future(async_task)
self.log.debug('Created new task %r', self.async_task) self.log.debug("Created new task %r", self.async_task)
# Start the async manager so everything happens. # Start the async manager so everything happens.
ensure_async_loop() ensure_async_loop()
def _stop_async_task(self): def _stop_async_task(self):
self.log.debug('Stopping async task') self.log.debug("Stopping async task")
if self.async_task is None: if self.async_task is None:
self.log.debug('No async task, trivially stopped') self.log.debug("No async task, trivially stopped")
return return
# Signal that we want to stop. # Signal that we want to stop.
@@ -271,14 +292,14 @@ class AsyncModalOperatorMixin:
try: try:
loop.run_until_complete(self.async_task) loop.run_until_complete(self.async_task)
except asyncio.CancelledError: except asyncio.CancelledError:
self.log.info('Asynchronous task was cancelled') self.log.info("Asynchronous task was cancelled")
return return
# noinspection PyBroadException # noinspection PyBroadException
try: try:
self.async_task.result() # This re-raises any exception of the task. self.async_task.result() # This re-raises any exception of the task.
except asyncio.CancelledError: except asyncio.CancelledError:
self.log.info('Asynchronous task was cancelled') self.log.info("Asynchronous task was cancelled")
except Exception: except Exception:
self.log.exception("Exception from asynchronous task") self.log.exception("Exception from asynchronous task")
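The cancellation logic above boils down to a cancel-then-drain pattern: request cancellation, run the loop until the task has actually processed it, and let any genuine error resurface when the finished task is awaited. A condensed standalone sketch of that pattern (hypothetical coroutine, not the operator code):

import asyncio

async def long_download():
    await asyncio.sleep(3600)  # stand-in for a long-running Pillar call

loop = asyncio.new_event_loop()
task = loop.create_task(long_download())

task.cancel()
try:
    # Running the loop lets the task process the cancellation; a real
    # exception raised by the coroutine would be re-raised here instead.
    loop.run_until_complete(task)
except asyncio.CancelledError:
    pass  # expected: we asked for the cancellation ourselves
finally:
    loop.close()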

File diff suppressed because it is too large


@@ -18,26 +18,86 @@
# <pep8 compliant> # <pep8 compliant>
import bpy
import logging import logging
import collections import typing
import bpy
import bgl
import gpu
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
strip_status_colour = { strip_status_colour = {
None: (0.7, 0.7, 0.7), None: (0.7, 0.7, 0.7),
'approved': (0.6392156862745098, 0.8784313725490196, 0.30196078431372547), "approved": (0.6392156862745098, 0.8784313725490196, 0.30196078431372547),
'final': (0.9058823529411765, 0.9607843137254902, 0.8274509803921568), "final": (0.9058823529411765, 0.9607843137254902, 0.8274509803921568),
'in_progress': (1.0, 0.7450980392156863, 0.0), "in_progress": (1.0, 0.7450980392156863, 0.0),
'on_hold': (0.796078431372549, 0.6196078431372549, 0.08235294117647059), "on_hold": (0.796078431372549, 0.6196078431372549, 0.08235294117647059),
'review': (0.8941176470588236, 0.9607843137254902, 0.9764705882352941), "review": (0.8941176470588236, 0.9607843137254902, 0.9764705882352941),
'todo': (1.0, 0.5019607843137255, 0.5019607843137255) "todo": (1.0, 0.5019607843137255, 0.5019607843137255),
} }
CONFLICT_COLOUR = (0.576, 0.118, 0.035) # RGB tuple CONFLICT_COLOUR = (0.576, 0.118, 0.035, 1.0) # RGBA tuple
gpu_vertex_shader = """
uniform mat4 ModelViewProjectionMatrix;
layout (location = 0) in vec2 pos;
layout (location = 1) in vec4 color;
out vec4 lineColor; // output to the fragment shader
void main()
{
gl_Position = ModelViewProjectionMatrix * vec4(pos.x, pos.y, 0.0, 1.0);
lineColor = color;
}
"""
gpu_fragment_shader = """
out vec4 fragColor;
in vec4 lineColor;
void main()
{
fragColor = lineColor;
}
"""
Float2 = typing.Tuple[float, float]
Float3 = typing.Tuple[float, float, float]
Float4 = typing.Tuple[float, float, float, float]
def get_strip_rectf(strip): class AttractLineDrawer:
def __init__(self):
self._format = gpu.types.GPUVertFormat()
self._pos_id = self._format.attr_add(
id="pos", comp_type="F32", len=2, fetch_mode="FLOAT"
)
self._color_id = self._format.attr_add(
id="color", comp_type="F32", len=4, fetch_mode="FLOAT"
)
self.shader = gpu.types.GPUShader(gpu_vertex_shader, gpu_fragment_shader)
def draw(self, coords: typing.List[Float2], colors: typing.List[Float4]):
if not coords:
return
bgl.glEnable(bgl.GL_BLEND)
bgl.glLineWidth(2.0)
vbo = gpu.types.GPUVertBuf(len=len(coords), format=self._format)
vbo.attr_fill(id=self._pos_id, data=coords)
vbo.attr_fill(id=self._color_id, data=colors)
batch = gpu.types.GPUBatch(type="LINES", buf=vbo)
batch.program_set(self.shader)
batch.draw()
def get_strip_rectf(strip) -> Float4:
# Get x and y in terms of the grid's frames and channels # Get x and y in terms of the grid's frames and channels
x1 = strip.frame_final_start x1 = strip.frame_final_start
x2 = strip.frame_final_end x2 = strip.frame_final_end
@@ -47,59 +107,60 @@ def get_strip_rectf(strip):
return x1, y1, x2, y2 return x1, y1, x2, y2
def draw_underline_in_strip(strip_coords, pixel_size_x, color): def underline_in_strip(
from bgl import glColor4f, glRectf, glEnable, glDisable, GL_BLEND strip_coords: Float4,
import bgl pixel_size_x: float,
color: Float4,
context = bpy.context out_coords: typing.List[Float2],
out_colors: typing.List[Float4],
):
# Strip coords # Strip coords
s_x1, s_y1, s_x2, s_y2 = strip_coords s_x1, s_y1, s_x2, s_y2 = strip_coords
# be careful not to draw over the current frame line # be careful not to draw over the current frame line
cf_x = context.scene.frame_current_final cf_x = bpy.context.scene.frame_current_final
bgl.glPushAttrib(bgl.GL_COLOR_BUFFER_BIT | bgl.GL_LINE_BIT) # TODO(Sybren): figure out how to pass one colour per line,
# instead of one colour per vertex.
out_coords.append((s_x1, s_y1))
out_colors.append(color)
glColor4f(*color)
glEnable(GL_BLEND)
bgl.glLineWidth(2)
bgl.glBegin(bgl.GL_LINES)
bgl.glVertex2f(s_x1, s_y1)
if s_x1 < cf_x < s_x2: if s_x1 < cf_x < s_x2:
# Bad luck, the line passes our strip # Bad luck, the line passes our strip, so draw two lines.
bgl.glVertex2f(cf_x - pixel_size_x, s_y1) out_coords.append((cf_x - pixel_size_x, s_y1))
bgl.glVertex2f(cf_x + pixel_size_x, s_y1) out_colors.append(color)
bgl.glVertex2f(s_x2, s_y1)
bgl.glEnd() out_coords.append((cf_x + pixel_size_x, s_y1))
bgl.glPopAttrib() out_colors.append(color)
out_coords.append((s_x2, s_y1))
out_colors.append(color)
def draw_strip_conflict(strip_coords, pixel_size_x): def strip_conflict(
strip_coords: Float4,
out_coords: typing.List[Float2],
out_colors: typing.List[Float4],
):
"""Draws conflicting states between strips.""" """Draws conflicting states between strips."""
import bgl
s_x1, s_y1, s_x2, s_y2 = strip_coords s_x1, s_y1, s_x2, s_y2 = strip_coords
bgl.glPushAttrib(bgl.GL_COLOR_BUFFER_BIT | bgl.GL_LINE_BIT)
# Always draw the full rectangle, the conflict should be resolved and thus stand out. # TODO(Sybren): draw a rectangle instead of a line.
bgl.glColor3f(*CONFLICT_COLOUR) out_coords.append((s_x1, s_y2))
bgl.glLineWidth(2) out_colors.append(CONFLICT_COLOUR)
bgl.glBegin(bgl.GL_LINE_LOOP) out_coords.append((s_x2, s_y1))
bgl.glVertex2f(s_x1, s_y1) out_colors.append(CONFLICT_COLOUR)
bgl.glVertex2f(s_x2, s_y1)
bgl.glVertex2f(s_x2, s_y2)
bgl.glVertex2f(s_x1, s_y2)
bgl.glEnd()
bgl.glPopAttrib() out_coords.append((s_x2, s_y2))
out_colors.append(CONFLICT_COLOUR)
out_coords.append((s_x1, s_y1))
out_colors.append(CONFLICT_COLOUR)
def draw_callback_px(): def draw_callback_px(line_drawer: AttractLineDrawer):
context = bpy.context context = bpy.context
if not context.scene.sequence_editor: if not context.scene.sequence_editor:
@@ -115,6 +176,10 @@ def draw_callback_px():
strips = shown_strips(context) strips = shown_strips(context)
coords = [] # type: typing.List[Float2]
colors = [] # type: typing.List[Float4]
# Collect all the lines (vertex coords + vertex colours) to draw.
for strip in strips: for strip in strips:
if not strip.atc_object_id: if not strip.atc_object_id:
continue continue
@@ -123,8 +188,12 @@ def draw_callback_px():
strip_coords = get_strip_rectf(strip) strip_coords = get_strip_rectf(strip)
# check if any of the coordinates are out of bounds # check if any of the coordinates are out of bounds
if strip_coords[0] > xwin2 or strip_coords[2] < xwin1 or strip_coords[1] > ywin2 or \ if (
strip_coords[3] < ywin1: strip_coords[0] > xwin2
or strip_coords[2] < xwin1
or strip_coords[1] > ywin2
or strip_coords[3] < ywin1
):
continue continue
# Draw # Draw
@@ -136,9 +205,11 @@ def draw_callback_px():
alpha = 1.0 if strip.atc_is_synced else 0.5 alpha = 1.0 if strip.atc_is_synced else 0.5
draw_underline_in_strip(strip_coords, pixel_size_x, color + (alpha,)) underline_in_strip(strip_coords, pixel_size_x, color + (alpha,), coords, colors)
if strip.atc_is_synced and strip.atc_object_id_conflict: if strip.atc_is_synced and strip.atc_object_id_conflict:
draw_strip_conflict(strip_coords, pixel_size_x) strip_conflict(strip_coords, coords, colors)
line_drawer.draw(coords, colors)
def tag_redraw_all_sequencer_editors(): def tag_redraw_all_sequencer_editors():
@@ -147,9 +218,9 @@ def tag_redraw_all_sequencer_editors():
# Py cant access notifiers # Py cant access notifiers
for window in context.window_manager.windows: for window in context.window_manager.windows:
for area in window.screen.areas: for area in window.screen.areas:
if area.type == 'SEQUENCE_EDITOR': if area.type == "SEQUENCE_EDITOR":
for region in area.regions: for region in area.regions:
if region.type == 'WINDOW': if region.type == "WINDOW":
region.tag_redraw() region.tag_redraw()
@@ -162,8 +233,16 @@ def callback_enable():
if cb_handle: if cb_handle:
return return
cb_handle[:] = bpy.types.SpaceSequenceEditor.draw_handler_add( # Doing GPU stuff in the background crashes Blender, so let's not.
draw_callback_px, (), 'WINDOW', 'POST_VIEW'), if bpy.app.background:
return
line_drawer = AttractLineDrawer()
cb_handle[:] = (
bpy.types.SpaceSequenceEditor.draw_handler_add(
draw_callback_px, (line_drawer,), "WINDOW", "POST_VIEW"
),
)
tag_redraw_all_sequencer_editors() tag_redraw_all_sequencer_editors()
@@ -173,7 +252,7 @@ def callback_disable():
return return
try: try:
bpy.types.SpaceSequenceEditor.draw_handler_remove(cb_handle[0], 'WINDOW') bpy.types.SpaceSequenceEditor.draw_handler_remove(cb_handle[0], "WINDOW")
except ValueError: except ValueError:
# Thrown when already removed. # Thrown when already removed.
pass pass
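The rewritten drawing code builds the vertex and colour buffers by hand with a custom GLSL pair. Under the same Blender 2.8x+ gpu API the per-vertex-colour lines could also lean on the built-in flat-colour shader via gpu_extras; the sketch below is an alternative, not the add-on's actual implementation, and assumes the built-in is named "2D_FLAT_COLOR" (2.8x/3.x) rather than the newer "FLAT_COLOR":

import gpu
from gpu_extras.batch import batch_for_shader

def draw_lines(coords, colors):
    # coords: list of (x, y) pairs; colors: one RGBA tuple per vertex.
    if not coords:
        return
    shader = gpu.shader.from_builtin("2D_FLAT_COLOR")
    batch = batch_for_shader(shader, "LINES", {"pos": coords, "color": colors})
    shader.bind()
    batch.draw(shader)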


@@ -23,64 +23,77 @@ Separated from __init__.py so that we can import & run from non-Blender environm
import functools import functools
import logging import logging
import os.path import os.path
import tempfile
import bpy import bpy
from bpy.types import AddonPreferences, Operator, WindowManager, Scene, PropertyGroup from bpy.types import AddonPreferences, Operator, WindowManager, Scene, PropertyGroup
from bpy.props import StringProperty, EnumProperty, PointerProperty, BoolProperty, IntProperty from bpy.props import (
StringProperty,
EnumProperty,
PointerProperty,
BoolProperty,
IntProperty,
)
import rna_prop_ui import rna_prop_ui
from . import pillar, async_loop, flamenco from . import pillar, async_loop, flamenco, project_specific
from .utils import pyside_cache, redraw from .utils import pyside_cache, redraw
PILLAR_WEB_SERVER_URL = os.environ.get('BCLOUD_SERVER', 'https://cloud.blender.org/') PILLAR_WEB_SERVER_URL = os.environ.get("BCLOUD_SERVER", "https://cloud.blender.org/")
PILLAR_SERVER_URL = '%sapi/' % PILLAR_WEB_SERVER_URL PILLAR_SERVER_URL = "%sapi/" % PILLAR_WEB_SERVER_URL
ADDON_NAME = 'blender_cloud' ADDON_NAME = "blender_cloud"
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
icons = None icons = None
@pyside_cache('version') @pyside_cache
def blender_syncable_versions(self, context): def blender_syncable_versions(self, context):
"""Returns the list of items used by SyncStatusProperties.version EnumProperty.""" """Returns the list of items used by SyncStatusProperties.version EnumProperty."""
bss = context.window_manager.blender_sync_status bss = context.window_manager.blender_sync_status
versions = bss.available_blender_versions versions = bss.available_blender_versions
if not versions: if not versions:
return [('', 'No settings stored in your Blender Cloud', '')] return [("", "No settings stored in your Blender Cloud", "")]
return [(v, v, '') for v in versions] return [(v, v, "") for v in versions]
class SyncStatusProperties(PropertyGroup): class SyncStatusProperties(PropertyGroup):
status = EnumProperty( status: EnumProperty(
items=[ items=[
('NONE', 'NONE', 'We have done nothing at all yet.'), ("NONE", "NONE", "We have done nothing at all yet."),
('IDLE', 'IDLE', 'User requested something, which is done, and we are now idle.'), (
('SYNCING', 'SYNCING', 'Synchronising with Blender Cloud.'), "IDLE",
"IDLE",
"User requested something, which is done, and we are now idle.",
),
("SYNCING", "SYNCING", "Synchronising with Blender Cloud."),
], ],
name='status', name="status",
description='Current status of Blender Sync', description="Current status of Blender Sync",
update=redraw) update=redraw,
)
version = EnumProperty( version: EnumProperty(
items=blender_syncable_versions, items=blender_syncable_versions,
name='Version of Blender from which to pull', name="Version of Blender from which to pull",
description='Version of Blender from which to pull') description="Version of Blender from which to pull",
)
message = StringProperty(name='message', update=redraw) message: StringProperty(name="message", update=redraw)
level = EnumProperty( level: EnumProperty(
items=[ items=[
('INFO', 'INFO', ''), ("INFO", "INFO", ""),
('WARNING', 'WARNING', ''), ("WARNING", "WARNING", ""),
('ERROR', 'ERROR', ''), ("ERROR", "ERROR", ""),
('SUBSCRIBE', 'SUBSCRIBE', ''), ("SUBSCRIBE", "SUBSCRIBE", ""),
], ],
name='level', name="level",
update=redraw) update=redraw,
)
def report(self, level: set, message: str): def report(self, level: set, message: str):
assert len(level) == 1, 'level should be a set of one string, not %r' % level assert len(level) == 1, "level should be a set of one string, not %r" % level
self.level = level.pop() self.level = level.pop()
self.message = message self.message = message
@@ -97,21 +110,21 @@ class SyncStatusProperties(PropertyGroup):
# because I don't know how to store a variable list of strings in a proper RNA property. # because I don't know how to store a variable list of strings in a proper RNA property.
@property @property
def available_blender_versions(self) -> list: def available_blender_versions(self) -> list:
return self.get('available_blender_versions', []) return self.get("available_blender_versions", [])
@available_blender_versions.setter @available_blender_versions.setter
def available_blender_versions(self, new_versions): def available_blender_versions(self, new_versions):
self['available_blender_versions'] = new_versions self["available_blender_versions"] = new_versions
@pyside_cache('project') @pyside_cache
def bcloud_available_projects(self, context): def bcloud_available_projects(self, context):
"""Returns the list of items used by BlenderCloudProjectGroup.project EnumProperty.""" """Returns the list of items used by BlenderCloudProjectGroup.project EnumProperty."""
projs = preferences().project.available_projects projs = preferences().project.available_projects
if not projs: if not projs:
return [('', 'No projects available in your Blender Cloud', '')] return [("", "No projects available in your Blender Cloud", "")]
return [(p['_id'], p['name'], '') for p in projs] return [(p["_id"], p["name"], "") for p in projs]
@functools.lru_cache(1) @functools.lru_cache(1)
@@ -121,145 +134,146 @@ def project_extensions(project_id) -> set:
At the moment of writing these are 'attract' and 'flamenco'. At the moment of writing these are 'attract' and 'flamenco'.
""" """
log.debug('Finding extensions for project %s', project_id) log.debug("Finding extensions for project %s", project_id)
# We can't use our @property, since the preferences may be loaded from a # We can't use our @property, since the preferences may be loaded from a
# preferences blend file, in which case it is not constructed from Python code. # preferences blend file, in which case it is not constructed from Python code.
available_projects = preferences().project.get('available_projects', []) available_projects = preferences().project.get("available_projects", [])
if not available_projects: if not available_projects:
log.debug('No projects available.') log.debug("No projects available.")
return set() return set()
proj = next((p for p in available_projects proj = next((p for p in available_projects if p["_id"] == project_id), None)
if p['_id'] == project_id), None)
if proj is None: if proj is None:
log.debug('Project %s not found in available projects.', project_id) log.debug("Project %s not found in available projects.", project_id)
return set() return set()
return set(proj.get('enabled_for', ())) return set(proj.get("enabled_for", ()))
def handle_project_update(_=None, _2=None):
"""Handles changing projects, which may cause extensions to be disabled/enabled.
Ignores arguments so that it can be used as property update callback.
"""
project_id = preferences().project.project
log.info('Updating internal state to reflect extensions enabled on current project %s.',
project_id)
project_extensions.cache_clear()
from blender_cloud import attract, flamenco
attract.deactivate()
flamenco.deactivate()
enabled_for = project_extensions(project_id)
log.info('Project extensions: %s', enabled_for)
if 'attract' in enabled_for:
attract.activate()
if 'flamenco' in enabled_for:
flamenco.activate()
class BlenderCloudProjectGroup(PropertyGroup): class BlenderCloudProjectGroup(PropertyGroup):
status = EnumProperty( status: EnumProperty(
items=[ items=[
('NONE', 'NONE', 'We have done nothing at all yet'), ("NONE", "NONE", "We have done nothing at all yet"),
('IDLE', 'IDLE', 'User requested something, which is done, and we are now idle'), (
('FETCHING', 'FETCHING', 'Fetching available projects from Blender Cloud'), "IDLE",
"IDLE",
"User requested something, which is done, and we are now idle",
),
("FETCHING", "FETCHING", "Fetching available projects from Blender Cloud"),
], ],
name='status', name="status",
update=redraw) update=redraw,
)
project = EnumProperty( project: EnumProperty(
items=bcloud_available_projects, items=bcloud_available_projects,
name='Cloud project', name="Cloud project",
description='Which Blender Cloud project to work with', description="Which Blender Cloud project to work with",
update=handle_project_update update=project_specific.handle_project_update,
) )
# List of projects is stored in 'available_projects' ID property, # List of projects is stored in 'available_projects' ID property,
# because I don't know how to store a variable list of strings in a proper RNA property. # because I don't know how to store a variable list of strings in a proper RNA property.
@property @property
def available_projects(self) -> list: def available_projects(self) -> list:
return self.get('available_projects', []) return self.get("available_projects", [])
@available_projects.setter @available_projects.setter
def available_projects(self, new_projects): def available_projects(self, new_projects):
self['available_projects'] = new_projects self["available_projects"] = new_projects
handle_project_update() project_specific.handle_project_update()
class BlenderCloudPreferences(AddonPreferences): class BlenderCloudPreferences(AddonPreferences):
bl_idname = ADDON_NAME bl_idname = ADDON_NAME
# The following two properties are read-only to limit the scope of the # The following property is read-only to limit the scope of the
# addon and allow for proper testing within this scope. # addon and allow for proper testing within this scope.
pillar_server = StringProperty( pillar_server: StringProperty(
name='Blender Cloud Server', name="Blender Cloud Server",
description='URL of the Blender Cloud backend server', description="URL of the Blender Cloud backend server",
default=PILLAR_SERVER_URL, default=PILLAR_SERVER_URL,
get=lambda self: PILLAR_SERVER_URL get=lambda self: PILLAR_SERVER_URL,
) )
local_texture_dir = StringProperty( local_texture_dir: StringProperty(
name='Default Blender Cloud Texture Storage Directory', name="Default Blender Cloud Texture Storage Directory",
subtype='DIR_PATH', subtype="DIR_PATH",
default='//textures') default="//textures",
)
open_browser_after_share = BoolProperty( open_browser_after_share: BoolProperty(
name='Open Browser after Sharing File', name="Open Browser after Sharing File",
description='When enabled, Blender will open a webbrowser', description="When enabled, Blender will open a webbrowser",
default=True default=True,
) )
# TODO: store project-dependent properties with the project, so that people # TODO: store project-dependent properties with the project, so that people
# can switch projects and the Attract and Flamenco properties switch with it. # can switch projects and the Attract and Flamenco properties switch with it.
project = PointerProperty(type=BlenderCloudProjectGroup) project: PointerProperty(type=BlenderCloudProjectGroup)
cloud_project_local_path = StringProperty( cloud_project_local_path: StringProperty(
name='Local Project Path', name="Local Project Path",
description='Local path of your Attract project, used to search for blend files; ' description="Local path of your Attract project, used to search for blend files; "
'usually best to set to an absolute path', "usually best to set to an absolute path",
subtype='DIR_PATH', subtype="DIR_PATH",
default='//../') default="//../",
update=project_specific.store,
)
flamenco_manager = PointerProperty(type=flamenco.FlamencoManagerGroup) flamenco_manager: PointerProperty(type=flamenco.FlamencoManagerGroup)
flamenco_exclude_filter = StringProperty( flamenco_exclude_filter: StringProperty(
name='File Exclude Filter', name="File Exclude Filter",
description='Filter like "*.abc;*.mkv" to prevent certain files to be packed ' description='Space-separated list of filename filters, like "*.abc *.mkv", to prevent '
'into the output directory', "matching files from being packed into the output directory",
default='') default="",
# TODO: before making Flamenco public, change the defaults to something less Institute-specific. update=project_specific.store,
# NOTE: The assumption is that the workers can also find the files in the same path. )
# This assumption is true for the Blender Institute. flamenco_job_file_path: StringProperty(
flamenco_job_file_path = StringProperty( name="Job Storage Path",
name='Job File Path', description="Path where to store job files, should be accessible for Workers too",
description='Path where to store job files, should be accesible for Workers too', subtype="DIR_PATH",
subtype='DIR_PATH', default=tempfile.gettempdir(),
default='/render/_flamenco/storage') update=project_specific.store,
)
# TODO: before making Flamenco public, change the defaults to something less Institute-specific. flamenco_job_output_path: StringProperty(
flamenco_job_output_path = StringProperty( name="Job Output Path",
name='Job Output Path', description="Path where to store output files, should be accessible for Workers",
description='Path where to store output files, should be accessible for Workers', subtype="DIR_PATH",
subtype='DIR_PATH', default=tempfile.gettempdir(),
default='/render/_flamenco/output') update=project_specific.store,
flamenco_job_output_strip_components = IntProperty( )
name='Job Output Path Strip Components', flamenco_job_output_strip_components: IntProperty(
description='The final output path comprises of the job output path, and the blend file ' name="Job Output Path Strip Components",
'path relative to the project with this many path components stripped off ' description="The final output path is composed of the job output path, and the blend file "
'the front', "path relative to the project with this many path components stripped off "
"the front",
min=0, min=0,
default=0, default=0,
soft_max=4, soft_max=4,
update=project_specific.store,
) )
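To make the Strip Components description concrete: the final output path is the job output path joined with the blend file's project-relative path, minus its first N components. The real computation lives in the add-on's flamenco module; the helper below is only an illustration of the idea:

from pathlib import PurePosixPath

def example_output_path(job_output: str, blend_rel: str, strip_components: int) -> PurePosixPath:
    # Drop the first `strip_components` parts of the project-relative path,
    # then append what remains to the job output path.
    kept = PurePosixPath(blend_rel).parts[strip_components:]
    return PurePosixPath(job_output).joinpath(*kept)

# With one component stripped, the leading 'scenes' directory is dropped:
print(example_output_path("/render/output", "scenes/shot_010/anim.blend", 1))
# /render/output/shot_010/anim.blend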
flamenco_open_browser_after_submit = BoolProperty( flamenco_relative_only: BoolProperty(
name='Open Browser after Submitting Job', name="Relative Paths Only",
description='When enabled, Blender will open a webbrowser', description="When enabled, only assets that are referred to with a relative path are "
default=True "packed, and assets referred to by an absolute path are excluded from the "
"BAT pack. When disabled, all assets are packed",
default=False,
update=project_specific.store,
)
flamenco_open_browser_after_submit: BoolProperty(
name="Open Browser after Submitting Job",
description="When enabled, Blender will open a webbrowser",
default=True,
)
flamenco_show_quit_after_submit_button: BoolProperty(
name='Show "Submit & Quit" button',
description='When enabled, next to the "Render on Flamenco" button there will be a button '
'"Submit & Quit" that silently quits Blender after submitting the render job '
"to Flamenco",
default=False,
) )
def draw(self, context): def draw(self, context):
@@ -277,24 +291,30 @@ class BlenderCloudPreferences(AddonPreferences):
blender_id_profile = blender_id.get_active_profile() blender_id_profile = blender_id.get_active_profile()
if blender_id is None: if blender_id is None:
msg_icon = 'ERROR' msg_icon = "ERROR"
text = 'This add-on requires Blender ID' text = "This add-on requires Blender ID"
help_text = 'Make sure that the Blender ID add-on is installed and activated' help_text = (
"Make sure that the Blender ID add-on is installed and activated"
)
elif not blender_id_profile: elif not blender_id_profile:
msg_icon = 'ERROR' msg_icon = "ERROR"
text = 'You are logged out.' text = "You are logged out."
help_text = 'To login, go to the Blender ID add-on preferences.' help_text = "To login, go to the Blender ID add-on preferences."
elif bpy.app.debug and pillar.SUBCLIENT_ID not in blender_id_profile.subclients: elif bpy.app.debug and pillar.SUBCLIENT_ID not in blender_id_profile.subclients:
msg_icon = 'QUESTION' msg_icon = "QUESTION"
text = 'No Blender Cloud credentials.' text = "No Blender Cloud credentials."
help_text = ('You are logged in on Blender ID, but your credentials have not ' help_text = (
'been synchronized with Blender Cloud yet. Press the Update ' "You are logged in on Blender ID, but your credentials have not "
'Credentials button.') "been synchronized with Blender Cloud yet. Press the Update "
"Credentials button."
)
else: else:
msg_icon = 'WORLD_DATA' msg_icon = "WORLD_DATA"
text = 'You are logged in as %s.' % blender_id_profile.username text = "You are logged in as %s." % blender_id_profile.username
help_text = ('To logout or change profile, ' help_text = (
'go to the Blender ID add-on preferences.') "To logout or change profile, "
"go to the Blender ID add-on preferences."
)
# Authentication stuff # Authentication stuff
auth_box = layout.box() auth_box = layout.box()
@@ -308,188 +328,205 @@ class BlenderCloudPreferences(AddonPreferences):
# Texture browser stuff # Texture browser stuff
texture_box = layout.box() texture_box = layout.box()
texture_box.enabled = msg_icon != 'ERROR' texture_box.enabled = msg_icon != "ERROR"
sub = texture_box.column() sub = texture_box.column()
sub.label(text='Local directory for downloaded textures', icon_value=icon('CLOUD')) sub.label(
sub.prop(self, "local_texture_dir", text='Default') text="Local directory for downloaded textures", icon_value=icon("CLOUD")
sub.prop(context.scene, "local_texture_dir", text='Current scene') )
sub.prop(self, "local_texture_dir", text="Default")
sub.prop(context.scene, "local_texture_dir", text="Current scene")
# Blender Sync stuff # Blender Sync stuff
bss = context.window_manager.blender_sync_status bss = context.window_manager.blender_sync_status
bsync_box = layout.box() bsync_box = layout.box()
bsync_box.enabled = msg_icon != 'ERROR' bsync_box.enabled = msg_icon != "ERROR"
row = bsync_box.row().split(percentage=0.33) row = bsync_box.row().split(factor=0.33)
row.label('Blender Sync with Blender Cloud', icon_value=icon('CLOUD')) row.label(text="Blender Sync with Blender Cloud", icon_value=icon("CLOUD"))
icon_for_level = { icon_for_level = {
'INFO': 'NONE', "INFO": "NONE",
'WARNING': 'INFO', "WARNING": "INFO",
'ERROR': 'ERROR', "ERROR": "ERROR",
'SUBSCRIBE': 'ERROR', "SUBSCRIBE": "ERROR",
} }
msg_icon = icon_for_level[bss.level] if bss.message else 'NONE' msg_icon = icon_for_level[bss.level] if bss.message else "NONE"
message_container = row.row() message_container = row.row()
message_container.label(bss.message, icon=msg_icon) message_container.label(text=bss.message, icon=msg_icon)
sub = bsync_box.column() sub = bsync_box.column()
if bss.level == 'SUBSCRIBE': if bss.level == "SUBSCRIBE":
self.draw_subscribe_button(sub) self.draw_subscribe_button(sub)
self.draw_sync_buttons(sub, bss) self.draw_sync_buttons(sub, bss)
# Image Share stuff # Image Share stuff
share_box = layout.box() share_box = layout.box()
share_box.label('Image Sharing on Blender Cloud', icon_value=icon('CLOUD')) share_box.label(text="Image Sharing on Blender Cloud", icon_value=icon("CLOUD"))
share_box.prop(self, 'open_browser_after_share') share_box.prop(self, "open_browser_after_share")
# Project selector # Project selector
project_box = layout.box() project_box = layout.box()
project_box.enabled = self.project.status in {'NONE', 'IDLE'} project_box.enabled = self.project.status in {"NONE", "IDLE"}
self.draw_project_selector(project_box, self.project) self.draw_project_selector(project_box, self.project)
extensions = project_extensions(self.project.project) extensions = project_extensions(self.project.project)
# Flamenco stuff # Flamenco stuff
if 'flamenco' in extensions: if "flamenco" in extensions:
flamenco_box = project_box.column() flamenco_box = project_box.column()
self.draw_flamenco_buttons(flamenco_box, self.flamenco_manager, context) self.draw_flamenco_buttons(flamenco_box, self.flamenco_manager, context)
def draw_subscribe_button(self, layout): def draw_subscribe_button(self, layout):
layout.operator('pillar.subscribe', icon='WORLD') layout.operator("pillar.subscribe", icon="WORLD")
def draw_sync_buttons(self, layout, bss): def draw_sync_buttons(self, layout, bss):
layout.enabled = bss.status in {'NONE', 'IDLE'} layout.enabled = bss.status in {"NONE", "IDLE"}
buttons = layout.column() buttons = layout.column()
row_buttons = buttons.row().split(percentage=0.5) row_buttons = buttons.row().split(factor=0.5)
row_push = row_buttons.row() row_push = row_buttons.row()
row_pull = row_buttons.row(align=True) row_pull = row_buttons.row(align=True)
row_push.operator('pillar.sync', row_push.operator(
text='Save %i.%i settings' % bpy.app.version[:2], "pillar.sync",
icon='TRIA_UP').action = 'PUSH' text="Save %i.%i settings" % bpy.app.version[:2],
icon="TRIA_UP",
).action = "PUSH"
versions = bss.available_blender_versions versions = bss.available_blender_versions
version = bss.version if bss.status in {"NONE", "IDLE"}:
if bss.status in {'NONE', 'IDLE'}: if not versions:
if not versions or not version: row_pull.operator(
row_pull.operator('pillar.sync', "pillar.sync", text="Find version to load", icon="TRIA_DOWN"
text='Find version to load', ).action = "REFRESH"
icon='TRIA_DOWN').action = 'REFRESH'
else: else:
props = row_pull.operator('pillar.sync', props = row_pull.operator(
text='Load %s settings' % version, "pillar.sync",
icon='TRIA_DOWN') text="Load %s settings" % bss.version,
props.action = 'PULL' icon="TRIA_DOWN",
props.blender_version = version )
row_pull.operator('pillar.sync', props.action = "PULL"
text='', props.blender_version = bss.version
icon='DOTSDOWN').action = 'SELECT' row_pull.operator(
"pillar.sync", text="", icon="DOWNARROW_HLT"
).action = "SELECT"
else: else:
row_pull.label('Cloud Sync is running.') row_pull.label(text="Cloud Sync is running.")
def draw_project_selector(self, project_box, bcp: BlenderCloudProjectGroup): def draw_project_selector(self, project_box, bcp: BlenderCloudProjectGroup):
project_row = project_box.row(align=True) project_row = project_box.row(align=True)
project_row.label('Project settings', icon_value=icon('CLOUD')) project_row.label(text="Project settings", icon_value=icon("CLOUD"))
row_buttons = project_row.row(align=True) row_buttons = project_row.row(align=True)
projects = bcp.available_projects projects = bcp.available_projects
project = bcp.project project = bcp.project
if bcp.status in {'NONE', 'IDLE'}: if bcp.status in {"NONE", "IDLE"}:
if not projects or not project: if not projects:
row_buttons.operator('pillar.projects', row_buttons.operator(
text='Find project to load', "pillar.projects", text="Find project to load", icon="FILE_REFRESH"
icon='FILE_REFRESH') )
else: else:
row_buttons.prop(bcp, 'project') row_buttons.prop(bcp, "project")
row_buttons.operator('pillar.projects', row_buttons.operator("pillar.projects", text="", icon="FILE_REFRESH")
text='', props = row_buttons.operator(
icon='FILE_REFRESH') "pillar.project_open_in_browser", text="", icon="WORLD"
)
props.project_id = project
else: else:
row_buttons.label('Fetching available projects.') row_buttons.label(text="Fetching available projects.")
enabled_for = project_extensions(project) enabled_for = project_extensions(project)
if not project: if not project:
return return
if not enabled_for: if not enabled_for:
project_box.label('This project is not set up for Attract or Flamenco') project_box.label(text="This project is not set up for Attract or Flamenco")
return return
project_box.label('This project is set up for: %s' % project_box.label(
', '.join(sorted(enabled_for))) text="This project is set up for: %s" % ", ".join(sorted(enabled_for))
)
# This is only needed when the project is set up for either Attract or Flamenco. # This is only needed when the project is set up for either Attract or Flamenco.
project_box.prop(self, 'cloud_project_local_path', project_box.prop(self, "cloud_project_local_path", text="Local Project Path")
text='Local Cloud Project Path')
def draw_flamenco_buttons(self, flamenco_box, bcp: flamenco.FlamencoManagerGroup, context):
from .flamenco import bam_interface
def draw_flamenco_buttons(
self, flamenco_box, bcp: flamenco.FlamencoManagerGroup, context
):
header_row = flamenco_box.row(align=True) header_row = flamenco_box.row(align=True)
header_row.label('Flamenco:', icon_value=icon('CLOUD')) header_row.label(text="Flamenco:", icon_value=icon("CLOUD"))
manager_split = flamenco_box.split(0.32, align=True) manager_split = flamenco_box.split(factor=0.32, align=True)
manager_split.label('Manager:') manager_split.label(text="Manager:")
manager_box = manager_split.row(align=True) manager_box = manager_split.row(align=True)
if bcp.status in {'NONE', 'IDLE'}: if bcp.status in {"NONE", "IDLE"}:
if not bcp.available_managers or not bcp.manager: if not bcp.available_managers:
manager_box.operator('flamenco.managers', manager_box.operator(
text='Find Flamenco Managers', "flamenco.managers",
icon='FILE_REFRESH') text="Find Flamenco Managers",
icon="FILE_REFRESH",
)
else: else:
manager_box.prop(bcp, 'manager', text='') manager_box.prop(bcp, "manager", text="")
manager_box.operator('flamenco.managers', manager_box.operator("flamenco.managers", text="", icon="FILE_REFRESH")
text='',
icon='FILE_REFRESH')
else: else:
manager_box.label('Fetching available managers.') manager_box.label(text="Fetching available managers.")
path_split = flamenco_box.split(0.32, align=True) path_split = flamenco_box.split(factor=0.32, align=True)
path_split.label(text='Job File Path:') path_split.label(text="Job File Path:")
path_box = path_split.row(align=True) path_box = path_split.row(align=True)
path_box.prop(self, 'flamenco_job_file_path', text='') path_box.prop(self, "flamenco_job_file_path", text="")
props = path_box.operator('flamenco.explore_file_path', text='', icon='DISK_DRIVE') props = path_box.operator(
"flamenco.explore_file_path", text="", icon="DISK_DRIVE"
)
props.path = self.flamenco_job_file_path props.path = self.flamenco_job_file_path
job_output_box = flamenco_box.column(align=True) job_output_box = flamenco_box.column(align=True)
path_split = job_output_box.split(0.32, align=True) path_split = job_output_box.split(factor=0.32, align=True)
path_split.label(text='Job Output Path:') path_split.label(text="Job Output Path:")
path_box = path_split.row(align=True) path_box = path_split.row(align=True)
path_box.prop(self, 'flamenco_job_output_path', text='') path_box.prop(self, "flamenco_job_output_path", text="")
props = path_box.operator('flamenco.explore_file_path', text='', icon='DISK_DRIVE') props = path_box.operator(
"flamenco.explore_file_path", text="", icon="DISK_DRIVE"
)
props.path = self.flamenco_job_output_path props.path = self.flamenco_job_output_path
job_output_box.prop(self, 'flamenco_exclude_filter') job_output_box.prop(self, "flamenco_exclude_filter")
prop_split = job_output_box.split(0.32, align=True) prop_split = job_output_box.split(factor=0.32, align=True)
prop_split.label('Strip Components:') prop_split.label(text="Strip Components:")
prop_split.prop(self, 'flamenco_job_output_strip_components', text='') prop_split.prop(self, "flamenco_job_output_strip_components", text="")
from .flamenco import render_output_path from .flamenco import render_output_path
path_box = job_output_box.row(align=True) path_box = job_output_box.row(align=True)
output_path = render_output_path(context) output_path = render_output_path(context)
if output_path: if output_path:
path_box.label(str(output_path)) path_box.label(text=str(output_path))
props = path_box.operator('flamenco.explore_file_path', text='', icon='DISK_DRIVE') props = path_box.operator(
"flamenco.explore_file_path", text="", icon="DISK_DRIVE"
)
props.path = str(output_path.parent) props.path = str(output_path.parent)
else: else:
path_box.label('Blend file is not in your project path, ' path_box.label(
'unable to give output path example.') text="Blend file is not in your project path, "
"unable to give output path example."
)
flamenco_box.prop(self, 'flamenco_open_browser_after_submit') flamenco_box.prop(self, "flamenco_relative_only")
flamenco_box.prop(self, "flamenco_open_browser_after_submit")
flamenco_box.prop(self, "flamenco_show_quit_after_submit_button")
class PillarCredentialsUpdate(pillar.PillarOperatorMixin, class PillarCredentialsUpdate(pillar.PillarOperatorMixin, Operator):
Operator):
"""Updates the Pillar URL and tests the new URL.""" """Updates the Pillar URL and tests the new URL."""
bl_idname = 'pillar.credentials_update'
bl_label = 'Update credentials'
bl_description = 'Resynchronises your Blender ID login with Blender Cloud'
log = logging.getLogger('bpy.ops.%s' % bl_idname) bl_idname = "pillar.credentials_update"
bl_label = "Update credentials"
bl_description = "Resynchronises your Blender ID login with Blender Cloud"
log = logging.getLogger("bpy.ops.%s" % bl_idname)
@classmethod @classmethod
def poll(cls, context): def poll(cls, context):
@@ -511,49 +548,86 @@ class PillarCredentialsUpdate(pillar.PillarOperatorMixin,
# Only allow activation when the user is actually logged in. # Only allow activation when the user is actually logged in.
if not self.is_logged_in(context): if not self.is_logged_in(context):
self.report({'ERROR'}, 'No active profile found') self.report({"ERROR"}, "No active profile found")
return {'CANCELLED'} return {"CANCELLED"}
try: try:
loop = asyncio.get_event_loop() loop = asyncio.get_event_loop()
loop.run_until_complete(self.check_credentials(context, set())) loop.run_until_complete(self.check_credentials(context, set()))
except blender_id.BlenderIdCommError as ex: except blender_id.BlenderIdCommError as ex:
log.exception('Error sending subclient-specific token to Blender ID') log.exception("Error sending subclient-specific token to Blender ID")
self.report({'ERROR'}, 'Failed to sync Blender ID to Blender Cloud') self.report({"ERROR"}, "Failed to sync Blender ID to Blender Cloud")
return {'CANCELLED'} return {"CANCELLED"}
except Exception as ex: except Exception as ex:
log.exception('Error in test call to Pillar') log.exception("Error in test call to Pillar")
self.report({'ERROR'}, 'Failed test connection to Blender Cloud') self.report({"ERROR"}, "Failed test connection to Blender Cloud")
return {'CANCELLED'} return {"CANCELLED"}
self.report({'INFO'}, 'Blender Cloud credentials & endpoint URL updated.') self.report({"INFO"}, "Blender Cloud credentials & endpoint URL updated.")
return {'FINISHED'} return {"FINISHED"}
class PILLAR_OT_subscribe(Operator): class PILLAR_OT_subscribe(Operator):
"""Opens a browser to subscribe the user to the Cloud.""" """Opens a browser to subscribe the user to the Cloud."""
bl_idname = 'pillar.subscribe'
bl_label = 'Subscribe to the Cloud' bl_idname = "pillar.subscribe"
bl_label = "Subscribe to the Cloud"
bl_description = "Opens a page in a web browser to subscribe to the Blender Cloud" bl_description = "Opens a page in a web browser to subscribe to the Blender Cloud"
def execute(self, context): def execute(self, context):
import webbrowser import webbrowser
webbrowser.open_new_tab('https://cloud.blender.org/join') webbrowser.open_new_tab("https://cloud.blender.org/join")
self.report({'INFO'}, 'We just started a browser for you.') self.report({"INFO"}, "We just started a browser for you.")
return {'FINISHED'} return {"FINISHED"}
class PILLAR_OT_projects(async_loop.AsyncModalOperatorMixin, class PILLAR_OT_project_open_in_browser(Operator):
bl_idname = "pillar.project_open_in_browser"
bl_label = "Open in Browser"
bl_description = "Opens a webbrowser to show the project"
project_id: StringProperty(name="Project ID")
def execute(self, context):
if not self.project_id:
return {"CANCELLED"}
import webbrowser
import urllib.parse
import pillarsdk
from .pillar import sync_call
project = sync_call(
pillarsdk.Project.find, self.project_id, {"projection": {"url": True}}
)
if log.isEnabledFor(logging.DEBUG):
import pprint
log.debug("found project: %s", pprint.pformat(project.to_dict()))
url = urllib.parse.urljoin(PILLAR_WEB_SERVER_URL, "p/" + project.url)
webbrowser.open_new_tab(url)
self.report({"INFO"}, "Opened a browser at %s" % url)
return {"FINISHED"}
class PILLAR_OT_projects(
async_loop.AsyncModalOperatorMixin,
pillar.AuthenticatedPillarOperatorMixin, pillar.AuthenticatedPillarOperatorMixin,
Operator): Operator,
):
"""Fetches the projects available to the user""" """Fetches the projects available to the user"""
bl_idname = 'pillar.projects'
bl_label = 'Fetch available projects' bl_idname = "pillar.projects"
bl_label = "Fetch available projects"
stop_upon_exception = True stop_upon_exception = True
_log = logging.getLogger('bpy.ops.%s' % bl_idname) _log = logging.getLogger("bpy.ops.%s" % bl_idname)
async def async_execute(self, context): async def async_execute(self, context):
if not await self.authenticate(context): if not await self.authenticate(context):
@@ -562,71 +636,84 @@ class PILLAR_OT_projects(async_loop.AsyncModalOperatorMixin,
import pillarsdk import pillarsdk
from .pillar import pillar_call from .pillar import pillar_call
self.log.info('Going to fetch projects for user %s', self.user_id) self.log.info("Going to fetch projects for user %s", self.user_id)
preferences().project.status = 'FETCHING' preferences().project.status = "FETCHING"
# Get all projects, except the home project. # Get all projects, except the home project.
projects_user = await pillar_call( projects_user = await pillar_call(
pillarsdk.Project.all, pillarsdk.Project.all,
{'where': {'user': self.user_id, {
'category': {'$ne': 'home'}}, "where": {"user": self.user_id, "category": {"$ne": "home"}},
'sort': '-_created', "sort": "-name",
'projection': {'_id': True, "projection": {"_id": True, "name": True, "extension_props": True},
'name': True, },
'extension_props': True}, )
})
projects_shared = await pillar_call( projects_shared = await pillar_call(
pillarsdk.Project.all, pillarsdk.Project.all,
{'where': {'user': {'$ne': self.user_id}, {
'permissions.groups.group': {'$in': self.db_user.groups}}, "where": {
'sort': '-_created', "user": {"$ne": self.user_id},
'projection': {'_id': True, "permissions.groups.group": {"$in": self.db_user.groups},
'name': True, },
'extension_props': True}, "sort": "-name",
}) "projection": {"_id": True, "name": True, "extension_props": True},
},
)
# We need to convert to regular dicts before storing in ID properties. # We need to convert to regular dicts before storing in ID properties.
# Also don't store more properties than we need. # Also don't store more properties than we need.
def reduce_properties(project_list): def reduce_properties(project_list):
for p in project_list: for p in project_list:
p = p.to_dict() p = p.to_dict()
extension_props = p.get('extension_props', {}) extension_props = p.get("extension_props", {})
enabled_for = list(extension_props.keys()) enabled_for = list(extension_props.keys())
self._log.debug('Project %r is enabled for %s', p['name'], enabled_for) self._log.debug("Project %r is enabled for %s", p["name"], enabled_for)
yield { yield {
'_id': p['_id'], "_id": p["_id"],
'name': p['name'], "name": p["name"],
'enabled_for': enabled_for, "enabled_for": enabled_for,
} }
projects = list(reduce_properties(projects_user['_items'])) + \ projects = list(reduce_properties(projects_user["_items"])) + list(
list(reduce_properties(projects_shared['_items'])) reduce_properties(projects_shared["_items"])
)
preferences().project.available_projects = projects def proj_sort_key(project):
return project.get("name")
preferences().project.available_projects = sorted(projects, key=proj_sort_key)
self.quit() self.quit()
def quit(self): def quit(self):
preferences().project.status = 'IDLE' preferences().project.status = "IDLE"
super().quit() super().quit()
class PILLAR_PT_image_custom_properties(rna_prop_ui.PropertyPanel, bpy.types.Panel): class PILLAR_PT_image_custom_properties(rna_prop_ui.PropertyPanel, bpy.types.Panel):
"""Shows custom properties in the image editor.""" """Shows custom properties in the image editor."""
bl_space_type = 'IMAGE_EDITOR' bl_space_type = "IMAGE_EDITOR"
bl_region_type = 'UI' bl_region_type = "UI"
bl_label = 'Custom Properties' bl_label = "Custom Properties"
_context_path = 'edit_image' _context_path = "edit_image"
_property_type = bpy.types.Image _property_type = bpy.types.Image
def ctx_preferences():
"""Returns bpy.context.preferences in a 2.79-compatible way."""
try:
return bpy.context.preferences
except AttributeError:
return bpy.context.user_preferences
def preferences() -> BlenderCloudPreferences: def preferences() -> BlenderCloudPreferences:
return bpy.context.user_preferences.addons[ADDON_NAME].preferences return ctx_preferences().addons[ADDON_NAME].preferences
def load_custom_icons(): def load_custom_icons():
@@ -637,9 +724,10 @@ def load_custom_icons():
return return
import bpy.utils.previews import bpy.utils.previews
icons = bpy.utils.previews.new() icons = bpy.utils.previews.new()
my_icons_dir = os.path.join(os.path.dirname(__file__), 'icons') my_icons_dir = os.path.join(os.path.dirname(__file__), "icons")
icons.load('CLOUD', os.path.join(my_icons_dir, 'icon-cloud.png'), 'IMAGE') icons.load("CLOUD", os.path.join(my_icons_dir, "icon-cloud.png"), "IMAGE")
def unload_custom_icons(): def unload_custom_icons():
@@ -669,13 +757,14 @@ def register():
bpy.utils.register_class(SyncStatusProperties) bpy.utils.register_class(SyncStatusProperties)
bpy.utils.register_class(PILLAR_OT_subscribe) bpy.utils.register_class(PILLAR_OT_subscribe)
bpy.utils.register_class(PILLAR_OT_projects) bpy.utils.register_class(PILLAR_OT_projects)
bpy.utils.register_class(PILLAR_OT_project_open_in_browser)
bpy.utils.register_class(PILLAR_PT_image_custom_properties) bpy.utils.register_class(PILLAR_PT_image_custom_properties)
addon_prefs = preferences() addon_prefs = preferences()
WindowManager.last_blender_cloud_location = StringProperty( WindowManager.last_blender_cloud_location = StringProperty(
name="Last Blender Cloud browser location", name="Last Blender Cloud browser location", default="/"
default="/") )
def default_if_empty(scene, context): def default_if_empty(scene, context):
"""The scene's local_texture_dir, if empty, reverts to the addon prefs.""" """The scene's local_texture_dir, if empty, reverts to the addon prefs."""
@@ -684,10 +773,11 @@ def register():
scene.local_texture_dir = addon_prefs.local_texture_dir scene.local_texture_dir = addon_prefs.local_texture_dir
Scene.local_texture_dir = StringProperty( Scene.local_texture_dir = StringProperty(
name='Blender Cloud texture storage directory for current scene', name="Blender Cloud texture storage directory for current scene",
subtype='DIR_PATH', subtype="DIR_PATH",
default=addon_prefs.local_texture_dir, default=addon_prefs.local_texture_dir,
update=default_if_empty) update=default_if_empty,
)
WindowManager.blender_sync_status = PointerProperty(type=SyncStatusProperties) WindowManager.blender_sync_status = PointerProperty(type=SyncStatusProperties)
@ -703,6 +793,7 @@ def unregister():
bpy.utils.unregister_class(SyncStatusProperties) bpy.utils.unregister_class(SyncStatusProperties)
bpy.utils.unregister_class(PILLAR_OT_subscribe) bpy.utils.unregister_class(PILLAR_OT_subscribe)
bpy.utils.unregister_class(PILLAR_OT_projects) bpy.utils.unregister_class(PILLAR_OT_projects)
bpy.utils.unregister_class(PILLAR_OT_project_open_in_browser)
bpy.utils.unregister_class(PILLAR_PT_image_custom_properties) bpy.utils.unregister_class(PILLAR_PT_image_custom_properties)
del WindowManager.last_blender_cloud_location del WindowManager.last_blender_cloud_location
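The ctx_preferences() helper above is what keeps this module working across the Blender 2.79/2.80 API split: 2.80+ exposes bpy.context.preferences, while 2.79 only has bpy.context.user_preferences. A minimal sketch of the same fallback pattern outside Blender; the stub classes below are made up for illustration and only stand in for bpy.context:

class _Context279:
    user_preferences = "prefs-2.79"

class _Context280:
    preferences = "prefs-2.80"

def get_prefs(context):
    try:
        return context.preferences  # Blender 2.80 and newer
    except AttributeError:
        return context.user_preferences  # Blender 2.79 fallback

print(get_prefs(_Context279()), get_prefs(_Context280()))  # prefs-2.79 prefs-2.80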


@@ -52,13 +52,13 @@ def open_blend(filename, access="rb"):
        bfile.is_compressed = False
        bfile.filepath_orig = filename
        return bfile
    elif magic[:2] == b"\x1f\x8b":
        log.debug("gzip blendfile detected")
        handle.close()
        log.debug("decompressing started")
        fs = gzip.open(filename, "rb")
        data = fs.read(FILE_BUFFER_SIZE)
        magic = data[: len(magic_test)]
        if magic == magic_test:
            handle = tempfile.TemporaryFile()
            while data:
@@ -90,6 +90,7 @@ class BlendFile:
    """
    Blend file.
    """

    __slots__ = (
        # file (result of open())
        "handle",
@@ -125,9 +126,10 @@ class BlendFile:
        self.code_index = {}
        block = BlendFileBlock(handle, self)
        while block.code != b"ENDB":
            if block.code == b"DNA1":
                (
                    self.structs,
                    self.sdna_index_from_id,
                ) = BlendFile.decode_structs(self.header, block, handle)
            else:
@@ -141,7 +143,9 @@ class BlendFile:
            self.blocks.append(block)

        # cache (could lazy init, incase we never use?)
        self.block_from_offset = {
            block.addr_old: block for block in self.blocks if block.code != b"ENDB"
        }

    def __enter__(self):
        return self
@@ -150,7 +154,7 @@ class BlendFile:
        self.close()

    def find_blocks_from_code(self, code):
        assert type(code) == bytes
        if code not in self.code_index:
            return []
        return self.code_index[code]
@@ -158,7 +162,7 @@ class BlendFile:
    def find_block_from_offset(self, offset):
        # same as looking looping over all blocks,
        # then checking ``block.addr_old == offset``
        assert type(offset) is int
        return self.block_from_offset.get(offset)

    def close(self):
@@ -185,12 +189,15 @@ class BlendFile:
    def ensure_subtype_smaller(self, sdna_index_curr, sdna_index_next):
        # never refine to a smaller type
        if self.structs[sdna_index_curr].size > self.structs[sdna_index_next].size:
            raise RuntimeError(
                "cant refine to smaller type (%s -> %s)"
                % (
                    self.structs[sdna_index_curr].dna_type_id.decode("ascii"),
                    self.structs[sdna_index_next].dna_type_id.decode("ascii"),
                )
            )

    @staticmethod
    def decode_structs(header, block, handle):
@@ -199,7 +206,7 @@ class BlendFile:
        """
        log.debug("building DNA catalog")
        shortstruct = DNA_IO.USHORT[header.endian_index]
        shortstruct2 = struct.Struct(header.endian_str + b"HH")
        intstruct = DNA_IO.UINT[header.endian_index]
        data = handle.read(block.size)
@@ -281,6 +288,7 @@ class BlendFileBlock:
    """
    Instance of a struct.
    """

    __slots__ = (
        # BlendFile
        "file",
@@ -294,18 +302,22 @@ class BlendFileBlock:
    )

    def __str__(self):
        return (
            "<%s.%s (%s), size=%d at %s>"
            %
            # fields=[%s]
            (
                self.__class__.__name__,
                self.dna_type.dna_type_id.decode("ascii"),
                self.code.decode(),
                self.size,
                # b", ".join(f.dna_name.name_only for f in self.dna_type.fields).decode('ascii'),
                hex(self.addr_old),
            )
        )

    def __init__(self, handle, bfile):
        OLDBLOCK = struct.Struct(b"4sI")

        self.file = bfile
        self.user_data = None
@@ -318,8 +330,8 @@ class BlendFileBlock:
        if len(data) > 15:
            blockheader = bfile.block_header_struct.unpack(data)
            self.code = blockheader[0].partition(b"\0")[0]
            if self.code != b"ENDB":
                self.size = blockheader[1]
                self.addr_old = blockheader[2]
                self.sdna_index = blockheader[3]
@@ -333,7 +345,7 @@ class BlendFileBlock:
                self.file_offset = 0
        else:
            blockheader = OLDBLOCK.unpack(data)
            self.code = blockheader[0].partition(b"\0")[0]
            self.code = DNA_IO.read_data0(blockheader[0])
            self.size = 0
            self.addr_old = 0
@@ -346,16 +358,18 @@ class BlendFileBlock:
        return self.file.structs[self.sdna_index]

    def refine_type_from_index(self, sdna_index_next):
        assert type(sdna_index_next) is int
        sdna_index_curr = self.sdna_index
        self.file.ensure_subtype_smaller(sdna_index_curr, sdna_index_next)
        self.sdna_index = sdna_index_next

    def refine_type(self, dna_type_id):
        assert type(dna_type_id) is bytes
        self.refine_type_from_index(self.file.sdna_index_from_id[dna_type_id])

    def get_file_offset(
        self,
        path,
        default=...,
        sdna_index_refine=None,
        base_index=0,
@@ -363,11 +377,11 @@ class BlendFileBlock:
        """
        Return (offset, length)
        """
        assert type(path) is bytes
        ofs = self.file_offset
        if base_index != 0:
            assert base_index < self.count
            ofs += (self.size // self.count) * base_index
        self.file.handle.seek(ofs, os.SEEK_SET)
@@ -377,21 +391,23 @@ class BlendFileBlock:
        self.file.ensure_subtype_smaller(self.sdna_index, sdna_index_refine)
        dna_struct = self.file.structs[sdna_index_refine]
        field = dna_struct.field_from_path(self.file.header, self.file.handle, path)
        return (self.file.handle.tell(), field.dna_name.array_size)

    def get(
        self,
        path,
        default=...,
        sdna_index_refine=None,
        use_nil=True,
        use_str=True,
        base_index=0,
    ):
        ofs = self.file_offset
        if base_index != 0:
            assert base_index < self.count
            ofs += (self.size // self.count) * base_index
        self.file.handle.seek(ofs, os.SEEK_SET)
@@ -402,36 +418,55 @@ class BlendFileBlock:
        dna_struct = self.file.structs[sdna_index_refine]
        return dna_struct.field_get(
            self.file.header,
            self.file.handle,
            path,
            default=default,
            use_nil=use_nil,
            use_str=use_str,
        )

    def get_recursive_iter(
        self,
        path,
        path_root=b"",
        default=...,
        sdna_index_refine=None,
        use_nil=True,
        use_str=True,
        base_index=0,
    ):
        if path_root:
            path_full = (path_root if type(path_root) is tuple else (path_root,)) + (
                path if type(path) is tuple else (path,)
            )
        else:
            path_full = path

        try:
            yield (
                path_full,
                self.get(
                    path_full, default, sdna_index_refine, use_nil, use_str, base_index
                ),
            )
        except NotImplementedError as ex:
            msg, dna_name, dna_type = ex.args
            struct_index = self.file.sdna_index_from_id.get(dna_type.dna_type_id, None)
            if struct_index is None:
                yield (path_full, "<%s>" % dna_type.dna_type_id.decode("ascii"))
            else:
                struct = self.file.structs[struct_index]
                for f in struct.fields:
                    yield from self.get_recursive_iter(
                        f.dna_name.name_only,
                        path_full,
                        default,
                        None,
                        use_nil,
                        use_str,
                        0,
                    )

    def items_recursive_iter(self):
        for k in self.keys():
@@ -445,9 +480,13 @@ class BlendFileBlock:
        # TODO This implementation is most likely far from optimal... and CRC32 is not renown as the best hashing
        # algo either. But for now does the job!
        import zlib

        def _is_pointer(self, k):
            return (
                self.file.structs[self.sdna_index]
                .field_from_path(self.file.header, self.file.handle, k)
                .dna_name.is_pointer
            )

        hsh = 1
        for k, v in self.items_recursive_iter():
@@ -455,7 +494,10 @@ class BlendFileBlock:
                hsh = zlib.adler32(str(v).encode(), hsh)
        return hsh

    def set(
        self,
        path,
        value,
        sdna_index_refine=None,
    ):
@@ -467,29 +509,34 @@ class BlendFileBlock:
        dna_struct = self.file.structs[sdna_index_refine]
        self.file.handle.seek(self.file_offset, os.SEEK_SET)
        self.file.is_modified = True
        return dna_struct.field_set(self.file.header, self.file.handle, path, value)

    # ---------------
    # Utility get/set
    #
    # avoid inline pointer casting
    def get_pointer(
        self,
        path,
        default=...,
        sdna_index_refine=None,
        base_index=0,
    ):
        if sdna_index_refine is None:
            sdna_index_refine = self.sdna_index
        result = self.get(
            path, default, sdna_index_refine=sdna_index_refine, base_index=base_index
        )

        # default
        if type(result) is not int:
            return result

        assert (
            self.file.structs[sdna_index_refine]
            .field_from_path(self.file.header, self.file.handle, path)
            .dna_name.is_pointer
        )
        if result != 0:
            # possible (but unlikely)
            # that this fails and returns None
@@ -517,7 +564,7 @@ class BlendFileBlock:
                yield self[k]
            except NotImplementedError as ex:
                msg, dna_name, dna_type = ex.args
                yield "<%s>" % dna_type.dna_type_id.decode("ascii")

    def items(self):
        for k in self.keys():
@@ -525,7 +572,7 @@ class BlendFileBlock:
                yield (k, self[k])
            except NotImplementedError as ex:
                msg, dna_name, dna_type = ex.args
                yield (k, "<%s>" % dna_type.dna_type_id.decode("ascii"))


# -----------------------------------------------------------------------------
@@ -542,6 +589,7 @@ class BlendFileHeader:
    BlendFileHeader allocates the first 12 bytes of a blend file
    it contains information about the hardware architecture
    """

    __slots__ = (
        # str
        "magic",
@@ -558,46 +606,51 @@ class BlendFileHeader:
    )

    def __init__(self, handle):
        FILEHEADER = struct.Struct(b"7s1s1s3s")

        log.debug("reading blend-file-header")
        values = FILEHEADER.unpack(handle.read(FILEHEADER.size))
        self.magic = values[0]
        pointer_size_id = values[1]
        if pointer_size_id == b"-":
            self.pointer_size = 8
        elif pointer_size_id == b"_":
            self.pointer_size = 4
        else:
            assert 0
        endian_id = values[2]
        if endian_id == b"v":
            self.is_little_endian = True
            self.endian_str = b"<"
            self.endian_index = 0
        elif endian_id == b"V":
            self.is_little_endian = False
            self.endian_index = 1
            self.endian_str = b">"
        else:
            assert 0
        version_id = values[3]
        self.version = int(version_id)

    def create_block_header_struct(self):
        return struct.Struct(
            b"".join(
                (
                    self.endian_str,
                    b"4sI",
                    b"I" if self.pointer_size == 4 else b"Q",
                    b"II",
                )
            )
        )
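The BlendFileHeader code above decodes the fixed 12-byte header: 7 bytes of magic (BLENDER), one byte for pointer size ('_' means 4-byte pointers, '-' means 8-byte), one byte for endianness ('v' little, 'V' big) and a 3-digit version. A standalone sketch of the same unpacking; the header bytes below are an example for a 64-bit little-endian Blender 2.80 file:

import struct

FILEHEADER = struct.Struct(b"7s1s1s3s")

header = b"BLENDER-v280"  # example header bytes
magic, ptr_id, endian_id, version = FILEHEADER.unpack(header)

assert magic == b"BLENDER"
pointer_size = 8 if ptr_id == b"-" else 4         # '-' means 8-byte pointers
endian_str = b"<" if endian_id == b"v" else b">"  # 'v' means little-endian
print(pointer_size, endian_str, int(version))     # 8 b'<' 280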
class DNAName:
    """
    DNAName is a C-type name stored in the DNA
    """

    __slots__ = (
        "name_full",
        "name_only",
@@ -614,40 +667,40 @@ class DNAName:
        self.array_size = self.calc_array_size()

    def __repr__(self):
        return "%s(%r)" % (type(self).__qualname__, self.name_full)

    def as_reference(self, parent):
        if parent is None:
            result = b""
        else:
            result = parent + b"."

        result = result + self.name_only
        return result

    def calc_name_only(self):
        result = self.name_full.strip(b"*()")
        index = result.find(b"[")
        if index != -1:
            result = result[:index]
        return result

    def calc_is_pointer(self):
        return b"*" in self.name_full

    def calc_is_method_pointer(self):
        return b"(*" in self.name_full

    def calc_array_size(self):
        result = 1
        temp = self.name_full
        index = temp.find(b"[")

        while index != -1:
            index_2 = temp.find(b"]")
            result *= int(temp[index + 1 : index_2])
            temp = temp[index_2 + 1 :]
            index = temp.find(b"[")

        return result
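The DNAName helpers above split a full C declaration name into its parts: calc_name_only() strips pointer and array decorations, calc_is_pointer() checks for '*', and calc_array_size() multiplies all [N] suffixes together. A quick illustration with a hypothetical DNA field name:

name_full = b"*vertexCos[3][4]"  # hypothetical DNA field name

name_only = name_full.strip(b"*()")
index = name_only.find(b"[")
if index != -1:
    name_only = name_only[:index]  # -> b"vertexCos"

is_pointer = b"*" in name_full     # -> True

array_size, temp = 1, name_full
index = temp.find(b"[")
while index != -1:
    index_2 = temp.find(b"]")
    array_size *= int(temp[index + 1 : index_2])
    temp = temp[index_2 + 1 :]
    index = temp.find(b"[")
print(name_only, is_pointer, array_size)  # b'vertexCos' True 12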
@@ -657,6 +710,7 @@ class DNAField:
    DNAField is a coupled DNAStruct and DNAName
    and cache offset for reuse
    """

    __slots__ = (
        # DNAName
        "dna_name",
@@ -680,6 +734,7 @@ class DNAStruct:
    """
    DNAStruct is a C-type structure stored in the DNA
    """

    __slots__ = (
        "dna_type_id",
        "size",
@@ -695,7 +750,7 @@ class DNAStruct:
        self.user_data = None

    def __repr__(self):
        return "%s(%r)" % (type(self).__qualname__, self.dna_type_id)

    def field_from_path(self, header, handle, path):
        """
@@ -709,7 +764,7 @@ class DNAStruct:
            if len(path) >= 2 and type(path[1]) is not bytes:
                name_tail = path[2:]
                index = path[1]
                assert type(index) is int
            else:
                name_tail = path[1:]
                index = 0
@@ -718,7 +773,7 @@ class DNAStruct:
            name_tail = None
            index = 0

        assert type(name) is bytes

        field = self.field_from_name.get(name)
@@ -729,47 +784,69 @@ class DNAStruct:
                index_offset = header.pointer_size * index
            else:
                index_offset = field.dna_type.size * index
            assert index_offset < field.dna_size
            handle.seek(index_offset, os.SEEK_CUR)

        if not name_tail:  # None or ()
            return field
        else:
            return field.dna_type.field_from_path(header, handle, name_tail)

    def field_get(
        self,
        header,
        handle,
        path,
        default=...,
        use_nil=True,
        use_str=True,
    ):
        field = self.field_from_path(header, handle, path)
        if field is None:
            if default is not ...:
                return default
            else:
                raise KeyError(
                    "%r not found in %r (%r)"
                    % (
                        path,
                        [f.dna_name.name_only for f in self.fields],
                        self.dna_type_id,
                    )
                )

        dna_type = field.dna_type
        dna_name = field.dna_name

        if dna_name.is_pointer:
            return DNA_IO.read_pointer(handle, header)
        elif dna_type.dna_type_id == b"int":
            if dna_name.array_size > 1:
                return [
                    DNA_IO.read_int(handle, header) for i in range(dna_name.array_size)
                ]
            return DNA_IO.read_int(handle, header)
        elif dna_type.dna_type_id == b"short":
            if dna_name.array_size > 1:
                return [
                    DNA_IO.read_short(handle, header)
                    for i in range(dna_name.array_size)
                ]
            return DNA_IO.read_short(handle, header)
        elif dna_type.dna_type_id == b"uint64_t":
            if dna_name.array_size > 1:
                return [
                    DNA_IO.read_ulong(handle, header)
                    for i in range(dna_name.array_size)
                ]
            return DNA_IO.read_ulong(handle, header)
        elif dna_type.dna_type_id == b"float":
            if dna_name.array_size > 1:
                return [
                    DNA_IO.read_float(handle, header)
                    for i in range(dna_name.array_size)
                ]
            return DNA_IO.read_float(handle, header)
        elif dna_type.dna_type_id == b"char":
            if use_str:
                if use_nil:
                    return DNA_IO.read_string0(handle, dna_name.array_size)
@@ -781,30 +858,39 @@ class DNAStruct:
            else:
                return DNA_IO.read_bytes(handle, dna_name.array_size)
        else:
            raise NotImplementedError(
                "%r exists but isn't pointer, can't resolve field %r"
                % (path, dna_name.name_only),
                dna_name,
                dna_type,
            )

    def field_set(self, header, handle, path, value):
        assert type(path) == bytes

        field = self.field_from_path(header, handle, path)
        if field is None:
            raise KeyError(
                "%r not found in %r"
                % (path, [f.dna_name.name_only for f in self.fields])
            )

        dna_type = field.dna_type
        dna_name = field.dna_name

        if dna_type.dna_type_id == b"char":
            if type(value) is str:
                return DNA_IO.write_string(handle, value, dna_name.array_size)
            else:
                return DNA_IO.write_bytes(handle, value, dna_name.array_size)
        elif dna_type.dna_type_id == b"int":
            DNA_IO.write_int(handle, header, value)
        else:
            raise NotImplementedError(
                "Setting %r is not yet supported for %r" % (dna_type, dna_name),
                dna_name,
                dna_type,
            )


class DNA_IO:
@@ -821,20 +907,20 @@ class DNA_IO:
    @staticmethod
    def write_string(handle, astring, fieldlen):
        assert isinstance(astring, str)
        if len(astring) >= fieldlen:
            stringw = astring[0:fieldlen]
        else:
            stringw = astring + "\0"
        handle.write(stringw.encode("utf-8"))

    @staticmethod
    def write_bytes(handle, astring, fieldlen):
        assert isinstance(astring, (bytes, bytearray))
        if len(astring) >= fieldlen:
            stringw = astring[0:fieldlen]
        else:
            stringw = astring + b"\0"
        handle.write(stringw)
@@ -850,44 +936,44 @@ class DNA_IO:
    @staticmethod
    def read_string(handle, length):
        return DNA_IO.read_bytes(handle, length).decode("utf-8")

    @staticmethod
    def read_string0(handle, length):
        return DNA_IO.read_bytes0(handle, length).decode("utf-8")

    @staticmethod
    def read_data0_offset(data, offset):
        add = data.find(b"\0", offset) - offset
        return data[offset : offset + add]

    @staticmethod
    def read_data0(data):
        add = data.find(b"\0")
        return data[:add]

    USHORT = struct.Struct(b"<H"), struct.Struct(b">H")

    @staticmethod
    def read_ushort(handle, fileheader):
        st = DNA_IO.USHORT[fileheader.endian_index]
        return st.unpack(handle.read(st.size))[0]

    SSHORT = struct.Struct(b"<h"), struct.Struct(b">h")

    @staticmethod
    def read_short(handle, fileheader):
        st = DNA_IO.SSHORT[fileheader.endian_index]
        return st.unpack(handle.read(st.size))[0]

    UINT = struct.Struct(b"<I"), struct.Struct(b">I")

    @staticmethod
    def read_uint(handle, fileheader):
        st = DNA_IO.UINT[fileheader.endian_index]
        return st.unpack(handle.read(st.size))[0]

    SINT = struct.Struct(b"<i"), struct.Struct(b">i")

    @staticmethod
    def read_int(handle, fileheader):
@@ -896,19 +982,22 @@ class DNA_IO:
    @staticmethod
    def write_int(handle, fileheader, value):
        assert isinstance(value, int), "value must be int, but is %r: %r" % (
            type(value),
            value,
        )
        st = DNA_IO.SINT[fileheader.endian_index]
        to_write = st.pack(value)
        handle.write(to_write)

    FLOAT = struct.Struct(b"<f"), struct.Struct(b">f")

    @staticmethod
    def read_float(handle, fileheader):
        st = DNA_IO.FLOAT[fileheader.endian_index]
        return st.unpack(handle.read(st.size))[0]

    ULONG = struct.Struct(b"<Q"), struct.Struct(b">Q")

    @staticmethod
    def read_ulong(handle, fileheader):


@@ -33,7 +33,9 @@ from cachecontrol.caches import FileCache
from . import appdirs

log = logging.getLogger(__name__)

_session = (
    None  # requests.Session object that's set up for caching by requests_session().
)


def cache_directory(*subdirs) -> str:
@@ -56,12 +58,12 @@ def cache_directory(*subdirs) -> str:
    if profile:
        username = profile.username
    else:
        username = "anonymous"

    # TODO: use bpy.utils.user_resource('CACHE', ...)
    # once https://developer.blender.org/T47684 is finished.
    user_cache_dir = appdirs.user_cache_dir(appname="Blender", appauthor=False)
    cache_dir = os.path.join(user_cache_dir, "blender_cloud", username, *subdirs)

    os.makedirs(cache_dir, mode=0o700, exist_ok=True)
@@ -76,10 +78,11 @@ def requests_session() -> requests.Session:
    if _session is not None:
        return _session

    cache_name = cache_directory("blender_cloud_http")
    log.info("Storing cache in %s" % cache_name)

    _session = cachecontrol.CacheControl(
        sess=requests.session(), cache=FileCache(cache_name)
    )
    return _session
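The requests_session() helper above wraps a normal requests session in CacheControl with an on-disk FileCache, so repeated Pillar HTTP requests can be answered from the per-user cache directory. A hedged standalone sketch of the same pattern; the cache directory and URL below are examples only:

import requests
import cachecontrol
from cachecontrol.caches import FileCache

# Example only: any writable directory works as the cache location.
session = cachecontrol.CacheControl(
    sess=requests.session(), cache=FileCache("/tmp/blender_cloud_http")
)

# Responses with cacheable headers are stored on disk and reused on later runs.
resp = session.get("https://cloud.blender.org/")
print(resp.status_code)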

File diff suppressed because it is too large


@ -1,185 +0,0 @@
"""BAM packing interface for Flamenco."""
import logging
from pathlib import Path
import typing
# Timeout of the BAM subprocess, in seconds.
SUBPROC_READLINE_TIMEOUT = 600
log = logging.getLogger(__name__)
class CommandExecutionError(Exception):
"""Raised when there was an error executing a BAM command."""
pass
def wheel_pythonpath_278() -> str:
"""Returns the value of a PYTHONPATH environment variable needed to run BAM from its wheel file.
Workaround for Blender 2.78c not having io_blend_utils.pythonpath()
"""
import os
from ..wheels import wheel_filename
# Find the wheel to run.
wheelpath = wheel_filename('blender_bam')
log.info('Using wheel %s to run BAM-Pack', wheelpath)
# Update the PYTHONPATH to include that wheel.
existing_pypath = os.environ.get('PYTHONPATH', '')
if existing_pypath:
return os.pathsep.join((existing_pypath, wheelpath))
return wheelpath
async def bam_copy(base_blendfile: Path, target_blendfile: Path,
exclusion_filter: str) -> typing.List[Path]:
"""Uses BAM to copy the given file and dependencies to the target blendfile.
Due to the way blendfile_pack.py is programmed/structured, we cannot import it
and call a function; it has to be run in a subprocess.
:raises: asyncio.CanceledError if the task was cancelled.
:raises: asyncio.TimeoutError if reading a line from the BAM process timed out.
:raises: CommandExecutionError if the subprocess failed or output invalid UTF-8.
:returns: a list of missing sources; hopefully empty.
"""
import asyncio
import os
import shlex
import subprocess
import bpy
import io_blend_utils
args = [
bpy.app.binary_path_python,
'-m', 'bam.pack',
'--input', str(base_blendfile),
'--output', str(target_blendfile),
'--mode', 'FILE',
]
if exclusion_filter:
args.extend(['--exclude', exclusion_filter])
cmd_to_log = ' '.join(shlex.quote(s) for s in args)
log.info('Executing %s', cmd_to_log)
# Workaround for Blender 2.78c not having io_blend_utils.pythonpath()
if hasattr(io_blend_utils, 'pythonpath'):
pythonpath = io_blend_utils.pythonpath()
else:
pythonpath = wheel_pythonpath_278()
env = {
'PYTHONPATH': pythonpath,
# Needed on Windows because http://bugs.python.org/issue8557
'PATH': os.environ['PATH'],
}
if 'SYSTEMROOT' in os.environ: # Windows http://bugs.python.org/issue20614
env['SYSTEMROOT'] = os.environ['SYSTEMROOT']
proc = await asyncio.create_subprocess_exec(
*args,
env=env,
stdin=subprocess.DEVNULL,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
)
missing_sources = []
try:
while not proc.stdout.at_eof():
line = await asyncio.wait_for(proc.stdout.readline(),
SUBPROC_READLINE_TIMEOUT)
if not line:
# EOF received, so let's bail.
break
try:
line = line.decode('utf8')
except UnicodeDecodeError as ex:
raise CommandExecutionError('Command produced non-UTF8 output, '
'aborting: %s' % ex)
line = line.rstrip()
if 'source missing:' in line:
path = parse_missing_source(line)
missing_sources.append(path)
log.warning('Source is missing: %s', path)
log.info(' %s', line)
finally:
if proc.returncode is None:
# Always wait for the process, to avoid zombies.
try:
proc.kill()
except ProcessLookupError:
# The process is already stopped, so killing is impossible. That's ok.
log.debug("The process was already stopped, aborting is impossible. That's ok.")
await proc.wait()
log.info('The process stopped with status code %i', proc.returncode)
if proc.returncode:
raise CommandExecutionError('Process stopped with status %i' % proc.returncode)
return missing_sources
def parse_missing_source(line: str) -> Path:
r"""Parses a "missing source" line into a pathlib.Path.
>>> parse_missing_source(r" source missing: b'D\xc3\xaffficult \xc3\x9cTF-8 filename'")
PosixPath('Dïfficult ÜTF-8 filename')
>>> parse_missing_source(r" source missing: b'D\xfffficult Win1252 f\xeflen\xe6me'")
PosixPath('D<EFBFBD>fficult Win1252 f<>len<65>me')
"""
_, missing_source = line.split(': ', 1)
missing_source_as_bytes = parse_byte_literal(missing_source.strip())
# The file could originate from any platform, so UTF-8 and the current platform's
# filesystem encodings are just guesses.
try:
missing_source = missing_source_as_bytes.decode('utf8')
except UnicodeDecodeError:
import sys
try:
missing_source = missing_source_as_bytes.decode(sys.getfilesystemencoding())
except UnicodeDecodeError:
missing_source = missing_source_as_bytes.decode('ascii', errors='replace')
path = Path(missing_source)
return path
def parse_byte_literal(bytes_literal: str) -> bytes:
r"""Parses a repr(bytes) output into a bytes object.
>>> parse_byte_literal(r"b'D\xc3\xaffficult \xc3\x9cTF-8 filename'")
b'D\xc3\xaffficult \xc3\x9cTF-8 filename'
>>> parse_byte_literal(r"b'D\xeffficult Win1252 f\xeflen\xe6me'")
b'D\xeffficult Win1252 f\xeflen\xe6me'
"""
# Some very basic assertions to make sure we have a proper bytes literal.
assert bytes_literal[0] == "b"
assert bytes_literal[1] in {'"', "'"}
assert bytes_literal[-1] == bytes_literal[1]
import ast
return ast.literal_eval(bytes_literal)
if __name__ == '__main__':
import doctest
doctest.testmod()


@ -0,0 +1,199 @@
"""BAT🦇 packing interface for Flamenco."""
import asyncio
import logging
import pathlib
import re
import threading
import typing
import urllib.parse
import bpy
from blender_asset_tracer import pack
from blender_asset_tracer.pack import progress, transfer, shaman
log = logging.getLogger(__name__)
_running_packer = None # type: pack.Packer
_packer_lock = threading.RLock()
# For using in other parts of the add-on, so only this file imports BAT.
Aborted = pack.Aborted
FileTransferError = transfer.FileTransferError
parse_shaman_endpoint = shaman.parse_endpoint
class BatProgress(progress.Callback):
"""Report progress of BAT Packing to the UI.
Uses asyncio.run_coroutine_threadsafe() to ensure the UI is only updated
from the main thread. This is required since we run the BAT Pack in a
background thread.
"""
def __init__(self) -> None:
super().__init__()
self.loop = asyncio.get_event_loop()
def _set_attr(self, attr: str, value):
async def do_it():
setattr(bpy.context.window_manager, attr, value)
asyncio.run_coroutine_threadsafe(do_it(), loop=self.loop)
def _txt(self, msg: str):
"""Set a text in a thread-safe way."""
self._set_attr("flamenco_status_txt", msg)
def _status(self, status: str):
"""Set the flamenco_status property in a thread-safe way."""
self._set_attr("flamenco_status", status)
def _progress(self, progress: int):
"""Set the flamenco_progress property in a thread-safe way."""
self._set_attr("flamenco_progress", progress)
def pack_start(self) -> None:
self._txt("Starting BAT Pack operation")
def pack_done(
self, output_blendfile: pathlib.Path, missing_files: typing.Set[pathlib.Path]
) -> None:
if missing_files:
self._txt("There were %d missing files" % len(missing_files))
else:
self._txt("Pack of %s done" % output_blendfile.name)
def pack_aborted(self, reason: str):
self._txt("Aborted: %s" % reason)
self._status("ABORTED")
def trace_blendfile(self, filename: pathlib.Path) -> None:
"""Called for every blendfile opened when tracing dependencies."""
self._txt("Inspecting %s" % filename.name)
def trace_asset(self, filename: pathlib.Path) -> None:
if filename.stem == ".blend":
return
self._txt("Found asset %s" % filename.name)
def rewrite_blendfile(self, orig_filename: pathlib.Path) -> None:
self._txt("Rewriting %s" % orig_filename.name)
def transfer_file(self, src: pathlib.Path, dst: pathlib.Path) -> None:
self._txt("Transferring %s" % src.name)
def transfer_file_skipped(self, src: pathlib.Path, dst: pathlib.Path) -> None:
self._txt("Skipped %s" % src.name)
def transfer_progress(self, total_bytes: int, transferred_bytes: int) -> None:
self._progress(round(100 * transferred_bytes / total_bytes))
def missing_file(self, filename: pathlib.Path) -> None:
# TODO(Sybren): report missing files in a nice way
pass
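The BatProgress docstring above describes the pattern that keeps UI updates on the main thread: the packing thread never touches bpy directly, it schedules a tiny coroutine on the add-on's event loop with asyncio.run_coroutine_threadsafe(). A minimal sketch of that pattern outside Blender; the plain dict below is only a stand-in for bpy.context.window_manager:

import asyncio
import threading

state = {}  # stand-in for bpy.context.window_manager properties

async def main():
    loop = asyncio.get_running_loop()

    def worker():
        # Runs in a background thread, like the BAT pack does.
        async def do_it():
            state["flamenco_status_txt"] = "Transferring"

        # Schedule the update on the loop that owns the "UI" and wait for it.
        asyncio.run_coroutine_threadsafe(do_it(), loop=loop).result()

    t = threading.Thread(target=worker)
    t.start()
    await asyncio.sleep(0.1)  # let the scheduled coroutine run on this loop
    t.join()
    print(state)  # {'flamenco_status_txt': 'Transferring'}

asyncio.run(main())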
class ShamanPacker(shaman.ShamanPacker):
"""Packer with support for getting an auth token from Flamenco Server."""
def __init__(
self,
bfile: pathlib.Path,
project: pathlib.Path,
target: str,
endpoint: str,
checkout_id: str,
*,
manager_id: str,
**kwargs
) -> None:
self.manager_id = manager_id
super().__init__(bfile, project, target, endpoint, checkout_id, **kwargs)
def _get_auth_token(self) -> str:
"""get a token from Flamenco Server"""
from ..blender import PILLAR_SERVER_URL
from ..pillar import blender_id_subclient, uncached_session, SUBCLIENT_ID
url = urllib.parse.urljoin(
PILLAR_SERVER_URL, "flamenco/jwt/generate-token/%s" % self.manager_id
)
auth_token = blender_id_subclient()["token"]
resp = uncached_session.get(url, auth=(auth_token, SUBCLIENT_ID))
resp.raise_for_status()
return resp.text
async def copy(
context,
base_blendfile: pathlib.Path,
project: pathlib.Path,
target: str,
exclusion_filter: str,
*,
relative_only: bool,
packer_class=pack.Packer,
**packer_args
) -> typing.Tuple[pathlib.Path, typing.Set[pathlib.Path]]:
"""Use BAT🦇 to copy the given file and dependencies to the target location.
:raises: FileTransferError if a file couldn't be transferred.
:returns: the path of the packed blend file, and a set of missing sources.
"""
global _running_packer
loop = asyncio.get_event_loop()
wm = bpy.context.window_manager
packer = packer_class(
base_blendfile,
project,
target,
compress=True,
relative_only=relative_only,
**packer_args
)
with packer:
with _packer_lock:
if exclusion_filter:
# There was a mistake in an older version of the property tooltip,
# showing semicolon-separated instead of space-separated. We now
# just handle both.
filter_parts = re.split("[ ;]+", exclusion_filter.strip(" ;"))
packer.exclude(*filter_parts)
packer.progress_cb = BatProgress()
_running_packer = packer
log.debug("awaiting strategise")
wm.flamenco_status = "INVESTIGATING"
await loop.run_in_executor(None, packer.strategise)
log.debug("awaiting execute")
wm.flamenco_status = "TRANSFERRING"
await loop.run_in_executor(None, packer.execute)
log.debug("done")
wm.flamenco_status = "DONE"
with _packer_lock:
_running_packer = None
return packer.output_path, packer.missing_files
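copy() above is the add-on's entry point into BAT packing: it builds a Packer (or ShamanPacker), applies the optional exclusion filter, then runs strategise() and execute() in an executor so Blender's UI stays responsive. A hedged sketch of driving BAT directly with only the calls used above; the paths and exclusion patterns are placeholders:

import pathlib
from blender_asset_tracer import pack

blendfile = pathlib.Path("/projects/spring/shot_010.blend")  # example path
project = pathlib.Path("/projects/spring")                    # example path
target = "/render/spring/shot_010"                            # example path

packer = pack.Packer(blendfile, project, target, compress=True, relative_only=False)
with packer:
    packer.exclude("*.abc", "*.vbo")  # same semantics as the add-on's exclusion filter
    packer.strategise()               # trace dependencies, decide what to transfer
    packer.execute()                  # actually copy the files
    print(packer.output_path, packer.missing_files)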
def abort() -> None:
"""Abort a running copy() call.
No-op when there is no running copy(). Can be called from any thread.
"""
with _packer_lock:
if _running_packer is None:
log.debug("No running packer, ignoring call to bat_abort()")
return
log.info("Aborting running packer")
_running_packer.abort()


@@ -1,16 +1,40 @@
import functools
import pathlib
import typing

from pillarsdk.resource import List, Find, Create


class Manager(List, Find):
    """Manager class wrapping the REST nodes endpoint"""

    path = "flamenco/managers"
    PurePlatformPath = pathlib.PurePath

    @functools.lru_cache(maxsize=1)
    def _path_replacements(self) -> list:
        """Defer to _path_replacements_vN() to get path replacement vars.

        Returns a list of tuples (variable name, variable value).
        """
        settings_version = self.settings_version or 1
        try:
            settings_func = getattr(self, "_path_replacements_v%d" % settings_version)
        except AttributeError:
            raise RuntimeError(
                "This manager has unsupported settings version %d; "
                "upgrade Blender Cloud add-on"
            )

        def longest_value_first(item):
            var_name, var_value = item
            return -len(var_value), var_value, var_name

        replacements = settings_func()
        replacements.sort(key=longest_value_first)
        return replacements
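The _path_replacements() method above dispatches on the Manager's settings_version (falling back to 1) and then sorts the resulting (name, value) pairs so the longest path value is tried first; that ordering matters because a more specific prefix such as /mnt/shared/projects has to win over /mnt/shared. A small illustration of the sort key, with example values:

replacements = [
    ("shared", "/mnt/shared"),
    ("projects", "/mnt/shared/projects"),
]

def longest_value_first(item):
    var_name, var_value = item
    return -len(var_value), var_value, var_name

replacements.sort(key=longest_value_first)
print(replacements)
# [('projects', '/mnt/shared/projects'), ('shared', '/mnt/shared')]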
    def _path_replacements_v1(self) -> typing.List[typing.Tuple[str, str]]:
        import platform

        if self.path_replacement is None:
@@ -18,13 +42,36 @@ class Manager(List, Find):
        items = self.path_replacement.to_dict().items()

        this_platform = platform.system().lower()
        return [
            (varname, platform_replacements[this_platform])
            for varname, platform_replacements in items
            if this_platform in platform_replacements
        ]

    def _path_replacements_v2(self) -> typing.List[typing.Tuple[str, str]]:
        import platform

        if not self.variables:
            return []

        this_platform = platform.system().lower()
        audiences = {"users", "all"}

        replacements = []
        for var_name, variable in self.variables.to_dict().items():
            # Path replacement requires bidirectional variables.
            if variable.get("direction") != "twoway":
                continue
            for var_value in variable.get("values", []):
                if var_value.get("audience") not in audiences:
                    continue
                if var_value.get("platform", "").lower() != this_platform:
                    continue
                replacements.append((var_name, var_value.get("value")))
        return replacements

    def replace_path(self, some_path: pathlib.PurePath) -> str:
        """Performs path variable replacement.
@@ -32,8 +79,11 @@ class Manager(List, Find):
        Tries to find platform-specific path prefixes, and replaces them with
        variables.
        """
        assert isinstance(some_path, pathlib.PurePath), (
            "some_path should be a PurePath, not %r" % some_path
        )

        for varname, path in self._path_replacements():
            replacement = self.PurePlatformPath(path)
            try:
                relpath = some_path.relative_to(replacement)
@@ -41,14 +91,26 @@ class Manager(List, Find):
                # Not relative to each other, so no replacement possible
                continue

            replacement_root = self.PurePlatformPath("{%s}" % varname)
            return (replacement_root / relpath).as_posix()

        return some_path.as_posix()
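replace_path() above turns an absolute path into a variable-based one: it finds the first replacement prefix the path is relative to, joins the remainder onto a "{varname}" root, and returns the result in POSIX notation. Roughly, with example variable names and paths:

import pathlib

# Example replacement pair as _path_replacements() might return it.
varname, prefix = "shared", "/mnt/shared/projects"
some_path = pathlib.PurePosixPath("/mnt/shared/projects/spring/shot_010.blend")

relpath = some_path.relative_to(pathlib.PurePosixPath(prefix))
result = (pathlib.PurePosixPath("{%s}" % varname) / relpath).as_posix()
print(result)  # {shared}/spring/shot_010.blend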
class Job(List, Find, Create):
    """Job class wrapping the REST nodes endpoint"""

    path = "flamenco/jobs"
    ensure_query_projections = {"project": 1}

    def patch(self, payload: dict, api=None):
        import pillarsdk.utils

        api = api or self.api

        url = pillarsdk.utils.join_url(self.path, str(self["_id"]))
        headers = pillarsdk.utils.merge_dict(
            self.http_headers(), {"Content-Type": "application/json"}
        )
        response = api.patch(url, payload, headers=headers)
        return response


@@ -23,28 +23,31 @@ from pillarsdk import exceptions as sdk_exceptions
from .pillar import pillar_call

log = logging.getLogger(__name__)

HOME_PROJECT_ENDPOINT = "/bcloud/home-project"


async def get_home_project(params=None) -> pillarsdk.Project:
    """Returns the home project."""

    log.debug("Getting home project")
    try:
        return await pillar_call(
            pillarsdk.Project.find_from_endpoint, HOME_PROJECT_ENDPOINT, params=params
        )
    except sdk_exceptions.ForbiddenAccess:
        log.warning(
            "Access to the home project was denied. "
            "Double-check that you are logged in with valid BlenderID credentials."
        )
        raise
    except sdk_exceptions.ResourceNotFound:
        log.warning("No home project available.")
        raise


async def get_home_project_id() -> str:
    """Returns just the ID of the home project."""
    home_proj = await get_home_project({"projection": {"_id": 1}})
    home_proj_id = home_proj["_id"]
    return home_proj_id
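get_home_project_id() above shows the typical Pillar call pattern: request the home project with a projection so only the needed fields come over the wire. A hedged sketch of calling it from other add-on code; the absolute package name blender_cloud and the surrounding event loop are assumptions, since inside the add-on these modules use relative imports and the add-on's own async loop:

# Sketch only: assumes the add-on package is importable as blender_cloud
# and that this coroutine runs on the add-on's asyncio loop.
from blender_cloud import home_project

async def print_home_project_url():
    # Projection limits the response to the fields we actually use.
    proj = await home_project.get_home_project({"projection": {"_id": 1, "url": 1}})
    print(proj["_id"], proj["url"])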


@@ -27,8 +27,8 @@ from pillarsdk import exceptions as sdk_exceptions
from .pillar import pillar_call
from . import async_loop, pillar, home_project, blender

REQUIRES_ROLES_FOR_IMAGE_SHARING = {"subscriber", "demo"}
IMAGE_SHARING_GROUP_NODE_NAME = "Image sharing"

log = logging.getLogger(__name__)
@@ -36,76 +36,84 @@ async def find_image_sharing_group_id(home_project_id, user_id):
    # Find the top-level image sharing group node.
    try:
        share_group, created = await pillar.find_or_create_node(
            where={
                "project": home_project_id,
                "node_type": "group",
                "parent": None,
                "name": IMAGE_SHARING_GROUP_NODE_NAME,
            },
            additional_create_props={
                "user": user_id,
                "properties": {},
            },
            projection={"_id": 1},
            may_create=True,
        )
    except pillar.PillarError:
        log.exception("Pillar error caught")
        raise pillar.PillarError("Unable to find image sharing folder on the Cloud")

    return share_group["_id"]


class PILLAR_OT_image_share(
    pillar.PillarOperatorMixin, async_loop.AsyncModalOperatorMixin, bpy.types.Operator
):
    bl_idname = "pillar.image_share"
    bl_label = "Share an image/screenshot via Blender Cloud"
    bl_description = "Uploads an image for sharing via Blender Cloud"

    log = logging.getLogger("bpy.ops.%s" % bl_idname)

    home_project_id = None
    home_project_url = "home"
    share_group_id = None  # top-level share group node ID
    user_id = None

    target: bpy.props.EnumProperty(
        items=[
            ("FILE", "File", "Share an image file"),
            ("DATABLOCK", "Datablock", "Share an image datablock"),
            ("SCREENSHOT", "Screenshot", "Share a screenshot"),
        ],
        name="target",
        default="SCREENSHOT",
    )

    name: bpy.props.StringProperty(
        name="name", description="File or datablock name to sync"
    )

    screenshot_show_multiview: bpy.props.BoolProperty(
        name="screenshot_show_multiview", description="Enable Multi-View", default=False
    )

    screenshot_use_multiview: bpy.props.BoolProperty(
        name="screenshot_use_multiview", description="Use Multi-View", default=False
    )

    screenshot_full: bpy.props.BoolProperty(
        name="screenshot_full",
        description="Full Screen, Capture the whole window (otherwise only capture the active area)",
        default=False,
    )

    def invoke(self, context, event):
        # Do a quick test on datablock dirtyness. If it's not packed and dirty,
        # the user should save it first.
        if self.target == "DATABLOCK":
            if not self.name:
                self.report({"ERROR"}, "No name given of the datablock to share.")
                return {"CANCELLED"}
            datablock = bpy.data.images[self.name]
            if (
                datablock.type == "IMAGE"
                and datablock.is_dirty
                and not datablock.packed_file
            ):
                self.report({"ERROR"}, "Datablock is dirty, save it first.")
                return {"CANCELLED"}

        return async_loop.AsyncModalOperatorMixin.invoke(self, context, event)
@@ -113,81 +121,87 @@ class PILLAR_OT_image_share(pillar.PillarOperatorMixin,
        """Entry point of the asynchronous operator."""

        # We don't want to influence what is included in the screen shot.
        if self.target == "SCREENSHOT":
            print("Blender Cloud add-on is communicating with Blender Cloud")
        else:
            self.report({"INFO"}, "Communicating with Blender Cloud")

        try:
            # Refresh credentials
            try:
                db_user = await self.check_credentials(
                    context, REQUIRES_ROLES_FOR_IMAGE_SHARING
                )
                self.user_id = db_user["_id"]
                self.log.debug("Found user ID: %s", self.user_id)
            except pillar.NotSubscribedToCloudError as ex:
                self._log_subscription_needed(can_renew=ex.can_renew)
                self._state = "QUIT"
                return
            except pillar.UserNotLoggedInError:
                self.log.exception("Error checking/refreshing credentials.")
                self.report({"ERROR"}, "Please log in on Blender ID first.")
                self._state = "QUIT"
                return

            # Find the home project.
            try:
                home_proj = await home_project.get_home_project(
                    {"projection": {"_id": 1, "url": 1}}
                )
            except sdk_exceptions.ForbiddenAccess:
                self.log.exception("Forbidden access to home project.")
                self.report({"ERROR"}, "Did not get access to home project.")
                self._state = "QUIT"
                return
            except sdk_exceptions.ResourceNotFound:
                self.report({"ERROR"}, "Home project not found.")
                self._state = "QUIT"
                return

            self.home_project_id = home_proj["_id"]
            self.home_project_url = home_proj["url"]

            try:
                gid = await find_image_sharing_group_id(
                    self.home_project_id, self.user_id
                )
                self.share_group_id = gid
                self.log.debug("Found group node ID: %s", self.share_group_id)
            except sdk_exceptions.ForbiddenAccess:
                self.log.exception("Unable to find Group ID")
                self.report({"ERROR"}, "Unable to find sync folder.")
                self._state = "QUIT"
                return

            await self.share_image(context)
        except Exception as ex:
            self.log.exception("Unexpected exception caught.")
            self.report({"ERROR"}, "Unexpected error %s: %s" % (type(ex), ex))
            self._state = "QUIT"

    async def share_image(self, context):
        """Sends files to the Pillar server."""

        if self.target == "FILE":
            self.report(
                {"INFO"}, "Uploading %s '%s'" % (self.target.lower(), self.name)
            )
            node = await self.upload_file(self.name)
        elif self.target == "SCREENSHOT":
            node = await self.upload_screenshot(context)
        else:
            self.report(
                {"INFO"}, "Uploading %s '%s'" % (self.target.lower(), self.name)
            )
            node = await self.upload_datablock(context)

        self.report({"INFO"}, "Upload complete, creating link to share.")
        share_info = await pillar_call(node.share)
        url = share_info.get("short_link")
        context.window_manager.clipboard = url
        self.report({"INFO"}, "The link has been copied to your clipboard: %s" % url)

        await self.maybe_open_browser(url)
@@ -197,19 +211,21 @@ class PILLAR_OT_image_share(pillar.PillarOperatorMixin,
        Returns the node.
        """

        self.log.info("Uploading file %s", filename)
        node = await pillar_call(
            pillarsdk.Node.create_asset_from_file,
            self.home_project_id,
            self.share_group_id,
            "image",
            filename,
            extra_where={"user": self.user_id},
            always_create_new_node=True,
            fileobj=fileobj,
            caching=False,
        )
        node_id = node["_id"]
        self.log.info("Created node %s", node_id)
        self.report({"INFO"}, "File succesfully uploaded to the cloud!")
        return node
@@ -220,7 +236,7 @@ class PILLAR_OT_image_share(pillar.PillarOperatorMixin,
        import webbrowser

        self.log.info("Opening browser at %s", url)
        webbrowser.open_new_tab(url)

    async def upload_datablock(self, context) -> pillarsdk.Node:
@@ -232,12 +248,13 @@ class PILLAR_OT_image_share(pillar.PillarOperatorMixin,
        self.log.info("Uploading datablock '%s'" % self.name)
        datablock = bpy.data.images[self.name]

        if datablock.type == "RENDER_RESULT":
            # Construct a sensible name for this render.
            filename = "%s-%s-render%s" % (
                os.path.splitext(os.path.basename(context.blend_data.filepath))[0],
                context.scene.name,
                context.scene.render.file_extension,
            )
            return await self.upload_via_tempdir(datablock, filename)

        if datablock.packed_file is not None:
@@ -266,7 +283,7 @@ class PILLAR_OT_image_share(pillar.PillarOperatorMixin,
        with tempfile.TemporaryDirectory() as tmpdir:
            filepath = os.path.join(tmpdir, filename_on_cloud)
            self.log.debug("Saving %s to %s", datablock, filepath)
datablock.save_render(filepath) datablock.save_render(filepath)
return await self.upload_file(filepath) return await self.upload_file(filepath)
@ -278,25 +295,27 @@ class PILLAR_OT_image_share(pillar.PillarOperatorMixin,
import io import io
filename = '%s.%s' % (datablock.name, datablock.file_format.lower()) filename = "%s.%s" % (datablock.name, datablock.file_format.lower())
fileobj = io.BytesIO(datablock.packed_file.data) fileobj = io.BytesIO(datablock.packed_file.data)
fileobj.seek(0) # ensure PillarSDK reads the file from the beginning. fileobj.seek(0) # ensure PillarSDK reads the file from the beginning.
self.log.info('Uploading packed file directly from memory to %r.', filename) self.log.info("Uploading packed file directly from memory to %r.", filename)
return await self.upload_file(filename, fileobj=fileobj) return await self.upload_file(filename, fileobj=fileobj)
async def upload_screenshot(self, context) -> pillarsdk.Node: async def upload_screenshot(self, context) -> pillarsdk.Node:
"""Takes a screenshot, saves it to a temp file, and uploads it.""" """Takes a screenshot, saves it to a temp file, and uploads it."""
self.name = datetime.datetime.now().strftime('Screenshot-%Y-%m-%d-%H%M%S.png') self.name = datetime.datetime.now().strftime("Screenshot-%Y-%m-%d-%H%M%S.png")
self.report({'INFO'}, "Uploading %s '%s'" % (self.target.lower(), self.name)) self.report({"INFO"}, "Uploading %s '%s'" % (self.target.lower(), self.name))
with tempfile.TemporaryDirectory() as tmpdir: with tempfile.TemporaryDirectory() as tmpdir:
filepath = os.path.join(tmpdir, self.name) filepath = os.path.join(tmpdir, self.name)
self.log.debug('Saving screenshot to %s', filepath) self.log.debug("Saving screenshot to %s", filepath)
bpy.ops.screen.screenshot(filepath=filepath, bpy.ops.screen.screenshot(
filepath=filepath,
show_multiview=self.screenshot_show_multiview, show_multiview=self.screenshot_show_multiview,
use_multiview=self.screenshot_use_multiview, use_multiview=self.screenshot_use_multiview,
full=self.screenshot_full) full=self.screenshot_full,
)
return await self.upload_file(filepath) return await self.upload_file(filepath)
@ -305,34 +324,47 @@ def image_editor_menu(self, context):
box = self.layout.row() box = self.layout.row()
if image and image.has_data: if image and image.has_data:
text = 'Share on Blender Cloud' text = "Share on Blender Cloud"
if image.type == 'IMAGE' and image.is_dirty and not image.packed_file: if image.type == "IMAGE" and image.is_dirty and not image.packed_file:
box.enabled = False box.enabled = False
text = 'Save image before sharing on Blender Cloud' text = "Save image before sharing on Blender Cloud"
props = box.operator(PILLAR_OT_image_share.bl_idname, text=text, props = box.operator(
icon_value=blender.icon('CLOUD')) PILLAR_OT_image_share.bl_idname, text=text, icon_value=blender.icon("CLOUD")
props.target = 'DATABLOCK' )
props.target = "DATABLOCK"
props.name = image.name props.name = image.name
def window_menu(self, context): def window_menu(self, context):
props = self.layout.operator(PILLAR_OT_image_share.bl_idname, props = self.layout.operator(
text='Share screenshot via Blender Cloud', PILLAR_OT_image_share.bl_idname,
icon_value=blender.icon('CLOUD')) text="Share screenshot via Blender Cloud",
props.target = 'SCREENSHOT' icon_value=blender.icon("CLOUD"),
)
props.target = "SCREENSHOT"
props.screenshot_full = True props.screenshot_full = True
def get_topbar_menu():
"""Return the topbar menu in a Blender 2.79 and 2.80 compatible way."""
try:
menu = bpy.types.TOPBAR_MT_window
except AttributeError:
# Blender < 2.80
menu = bpy.types.INFO_MT_window
return menu
def register(): def register():
bpy.utils.register_class(PILLAR_OT_image_share) bpy.utils.register_class(PILLAR_OT_image_share)
bpy.types.IMAGE_MT_image.append(image_editor_menu) bpy.types.IMAGE_MT_image.append(image_editor_menu)
bpy.types.INFO_MT_window.append(window_menu) get_topbar_menu().append(window_menu)
def unregister(): def unregister():
bpy.utils.unregister_class(PILLAR_OT_image_share) bpy.utils.unregister_class(PILLAR_OT_image_share)
bpy.types.IMAGE_MT_image.remove(image_editor_menu) bpy.types.IMAGE_MT_image.remove(image_editor_menu)
bpy.types.INFO_MT_window.remove(window_menu) get_topbar_menu().remove(window_menu)
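The get_topbar_menu() helper above is the 2.79/2.80 compatibility shim introduced by this diff: bpy.types.TOPBAR_MT_window only exists from Blender 2.80 on, while 2.79 exposes bpy.types.INFO_MT_window instead. A minimal standalone sketch of the same registration pattern, with a hypothetical draw function standing in for window_menu():

import bpy


def example_menu_entry(self, context):
    # Hypothetical menu draw function, not part of the add-on.
    self.layout.label(text="Example entry")


def topbar_menu():
    # TOPBAR_MT_window exists in 2.80+; fall back to INFO_MT_window on 2.79.
    return getattr(bpy.types, "TOPBAR_MT_window", None) or bpy.types.INFO_MT_window


def register():
    topbar_menu().append(example_menu_entry)


def unregister():
    topbar_menu().remove(example_menu_entry)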

File diff suppressed because it is too large

@@ -0,0 +1,180 @@
"""Handle saving and loading project-specific settings."""
import contextlib
import logging
import typing
# Names of BlenderCloudPreferences properties that are both project-specific
# and simple enough to store directly in a dict.
PROJECT_SPECIFIC_SIMPLE_PROPS = ("cloud_project_local_path",)
# Names of BlenderCloudPreferences properties that are project-specific and
# Flamenco Manager-specific, and simple enough to store in a dict.
FLAMENCO_PER_PROJECT_PER_MANAGER = (
"flamenco_exclude_filter",
"flamenco_job_file_path",
"flamenco_job_output_path",
"flamenco_job_output_strip_components",
"flamenco_relative_only",
)
log = logging.getLogger(__name__)
project_settings_loading = 0 # counter, if > 0 then we're loading stuff.
@contextlib.contextmanager
def mark_as_loading():
"""Sets project_settings_loading > 0 while the context is active.
A counter is used to allow for nested mark_as_loading() contexts.
"""
global project_settings_loading
project_settings_loading += 1
try:
yield
finally:
project_settings_loading -= 1
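mark_as_loading() uses a counter rather than a boolean so that nested contexts do not clear the "loading" state before the outermost one exits. A small self-contained sketch of the same idea (the names here are illustrative, not the add-on's):

import contextlib

loading = 0  # > 0 while settings are being loaded


@contextlib.contextmanager
def mark_as_loading():
    global loading
    loading += 1
    try:
        yield
    finally:
        loading -= 1


def store():
    if loading:
        return  # ignore update callbacks fired while settings are being loaded
    print("saving settings")


with mark_as_loading():
    with mark_as_loading():
        store()  # no-op, counter is 2
    store()      # still a no-op, counter is 1
store()          # counter is back to 0, so this call saves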
def update_preferences(
prefs,
names_to_update: typing.Iterable[str],
new_values: typing.Mapping[str, typing.Any],
):
for name in names_to_update:
if not hasattr(prefs, name):
log.debug("not setting %r, property cannot be found", name)
continue
if name in new_values:
log.debug("setting %r = %r", name, new_values[name])
setattr(prefs, name, new_values[name])
else:
# The property wasn't stored, so set the default value instead.
bl_type, args = getattr(prefs.bl_rna, name)
log.debug("finding default value for %r", name)
if "default" not in args:
log.debug("no default value for %r, not touching", name)
continue
log.debug("found default value for %r = %r", name, args["default"])
setattr(prefs, name, args["default"])
def handle_project_update(_=None, _2=None):
"""Handles changing projects, which may cause extensions to be disabled/enabled.
Ignores arguments so that it can be used as property update callback.
"""
from .blender import preferences, project_extensions
with mark_as_loading():
prefs = preferences()
project_id = prefs.project.project
log.debug(
"Updating internal state to reflect extensions enabled on current project %s.",
project_id,
)
project_extensions.cache_clear()
from blender_cloud import attract, flamenco
attract.deactivate()
flamenco.deactivate()
enabled_for = project_extensions(project_id)
log.info("Project extensions: %s", enabled_for)
if "attract" in enabled_for:
attract.activate()
if "flamenco" in enabled_for:
flamenco.activate()
# Load project-specific settings from the last time we visited this project.
ps = prefs.get("project_settings", {}).get(project_id, {})
if not ps:
log.debug(
"no project-specific settings are available, "
"only resetting available Flamenco Managers"
)
# The Flamenco Manager should really be chosen explicitly out of the available
# Managers.
prefs.flamenco_manager.available_managers = []
return
if log.isEnabledFor(logging.DEBUG):
from pprint import pformat
log.debug("loading project-specific settings:\n%s", pformat(ps.to_dict()))
# Restore simple properties.
update_preferences(prefs, PROJECT_SPECIFIC_SIMPLE_PROPS, ps)
# Restore Flamenco settings.
prefs.flamenco_manager.available_managers = ps.get(
"flamenco_available_managers", []
)
flamenco_manager_id = ps.get("flamenco_manager_id")
if flamenco_manager_id:
log.debug("setting flamenco manager to %s", flamenco_manager_id)
try:
# This will trigger a load of Project+Manager-specfic settings.
prefs.flamenco_manager.manager = flamenco_manager_id
except TypeError:
log.warning(
"manager %s for this project could not be found",
flamenco_manager_id,
)
elif prefs.flamenco_manager.available_managers:
prefs.flamenco_manager.manager = prefs.flamenco_manager.available_managers[
0
]["_id"]
def store(_=None, _2=None):
"""Remember project-specific settings as soon as one of them changes.
Ignores arguments so that it can be used as property update callback.
No-op when project_settings_loading=True, to prevent saving project-
specific settings while they are actually being loaded.
"""
from .blender import preferences
global project_settings_loading
if project_settings_loading:
return
prefs = preferences()
project_id = prefs.project.project
all_settings = prefs.get("project_settings", {})
ps = all_settings.get(project_id, {}) # either a dict or bpy.types.IDPropertyGroup
for name in PROJECT_SPECIFIC_SIMPLE_PROPS:
ps[name] = getattr(prefs, name)
# Store project-specific Flamenco settings
ps["flamenco_manager_id"] = prefs.flamenco_manager.manager
ps["flamenco_available_managers"] = prefs.flamenco_manager.available_managers
# Store per-project, per-manager settings for the current Manager.
pppm = ps.get("flamenco_managers_settings", {})
pppm[prefs.flamenco_manager.manager] = {
name: getattr(prefs, name) for name in FLAMENCO_PER_PROJECT_PER_MANAGER
}
ps[
"flamenco_managers_settings"
] = pppm # IDPropertyGroup has no setdefault() method.
# Store this project's settings in the preferences.
all_settings[project_id] = ps
prefs["project_settings"] = all_settings
if log.isEnabledFor(logging.DEBUG):
from pprint import pformat
if hasattr(all_settings, "to_dict"):
to_log = all_settings.to_dict()
else:
to_log = all_settings
log.debug("Saving project-specific settings:\n%s", pformat(to_log))

@@ -25,6 +25,7 @@ import functools
import logging
import pathlib
import tempfile
import typing
import shutil

import bpy

@@ -34,33 +35,35 @@ import asyncio
import pillarsdk
from pillarsdk import exceptions as sdk_exceptions
from .pillar import pillar_call
from . import async_loop, blender, pillar, cache, blendfile, home_project

SETTINGS_FILES_TO_UPLOAD = ["userpref.blend", "startup.blend"]

# These are RNA keys inside the userpref.blend file, and their
# Python properties names. These settings will not be synced.
LOCAL_SETTINGS_RNA = [
    (b"dpi", "system.dpi"),
    (b"virtual_pixel", "system.virtual_pixel_mode"),
    (b"compute_device_id", "system.compute_device"),
    (b"compute_device_type", "system.compute_device_type"),
    (b"fontdir", "filepaths.font_directory"),
    (b"textudir", "filepaths.texture_directory"),
    (b"renderdir", "filepaths.render_output_directory"),
    (b"pythondir", "filepaths.script_directory"),
    (b"sounddir", "filepaths.sound_directory"),
    (b"tempdir", "filepaths.temporary_directory"),
    (b"render_cachedir", "filepaths.render_cache_directory"),
    (b"i18ndir", "filepaths.i18n_branches_directory"),
    (b"image_editor", "filepaths.image_editor"),
    (b"anim_player", "filepaths.animation_player"),
]

REQUIRES_ROLES_FOR_SYNC = set()  # no roles needed.
SYNC_GROUP_NODE_NAME = "Blender Sync"
SYNC_GROUP_NODE_DESC = (
    "The [Blender Cloud Addon](https://cloud.blender.org/services"
    "#blender-addon) will synchronize your Blender settings here."
)

log = logging.getLogger(__name__)

@@ -73,7 +76,7 @@ def set_blender_sync_status(set_status: str):
            try:
                return func(*args, **kwargs)
            finally:
                bss.status = "IDLE"

        return wrapper

@@ -89,18 +92,16 @@ def async_set_blender_sync_status(set_status: str):
            try:
                return await func(*args, **kwargs)
            finally:
                bss.status = "IDLE"

        return wrapper

    return decorator


async def find_sync_group_id(
    home_project_id: str, user_id: str, blender_version: str, *, may_create=True
) -> typing.Tuple[str, str]:
    """Finds the group node in which to store sync assets.

    If the group node doesn't exist and may_create=True, it creates it.

@@ -110,43 +111,52 @@ async def find_sync_group_id(home_project_id: str,
    # created by Pillar while creating the home project.
    try:
        sync_group, created = await pillar.find_or_create_node(
            where={
                "project": home_project_id,
                "node_type": "group",
                "parent": None,
                "name": SYNC_GROUP_NODE_NAME,
                "user": user_id,
            },
            projection={"_id": 1},
            may_create=False,
        )
    except pillar.PillarError:
        raise pillar.PillarError("Unable to find sync folder on the Cloud")

    if not may_create and sync_group is None:
        log.info("Sync folder doesn't exist, and not creating it either.")
        return "", ""

    # Find/create the sub-group for the requested Blender version
    try:
        sub_sync_group, created = await pillar.find_or_create_node(
            where={
                "project": home_project_id,
                "node_type": "group",
                "parent": sync_group["_id"],
                "name": blender_version,
                "user": user_id,
            },
            additional_create_props={
                "description": "Sync folder for Blender %s" % blender_version,
                "properties": {"status": "published"},
            },
            projection={"_id": 1},
            may_create=may_create,
        )
    except pillar.PillarError:
        raise pillar.PillarError("Unable to create sync folder on the Cloud")

    if not may_create and sub_sync_group is None:
        log.info(
            "Sync folder for Blender version %s doesn't exist, "
            "and not creating it either.",
            blender_version,
        )
        return sync_group["_id"], ""

    return sync_group["_id"], sub_sync_group["_id"]


@functools.lru_cache()
@@ -157,82 +167,94 @@ async def available_blender_versions(home_project_id: str, user_id: str) -> list
    sync_group = await pillar_call(
        pillarsdk.Node.find_first,
        params={
            "where": {
                "project": home_project_id,
                "node_type": "group",
                "parent": None,
                "name": SYNC_GROUP_NODE_NAME,
                "user": user_id,
            },
            "projection": {"_id": 1},
        },
        caching=False,
    )

    if sync_group is None:
        bss.report({"ERROR"}, "No synced Blender settings in your Blender Cloud")
        log.debug(
            "-- unable to find sync group for home_project_id=%r and user_id=%r",
            home_project_id,
            user_id,
        )
        return []

    sync_nodes = await pillar_call(
        pillarsdk.Node.all,
        params={
            "where": {
                "project": home_project_id,
                "node_type": "group",
                "parent": sync_group["_id"],
                "user": user_id,
            },
            "projection": {"_id": 1, "name": 1},
            "sort": "-name",
        },
        caching=False,
    )

    if not sync_nodes or not sync_nodes._items:
        bss.report({"ERROR"}, "No synced Blender settings in your Blender Cloud.")
        return []

    versions = [node.name for node in sync_nodes._items]
    log.debug("Versions: %s", versions)

    return versions


# noinspection PyAttributeOutsideInit
class PILLAR_OT_sync(
    pillar.PillarOperatorMixin, async_loop.AsyncModalOperatorMixin, bpy.types.Operator
):
    bl_idname = "pillar.sync"
    bl_label = "Synchronise with Blender Cloud"
    bl_description = "Synchronises Blender settings with Blender Cloud"

    log = logging.getLogger("bpy.ops.%s" % bl_idname)
    home_project_id = ""
    sync_group_id = ""  # top-level sync group node ID
    sync_group_versioned_id = ""  # sync group node ID for the given Blender version.

    action: bpy.props.EnumProperty(
        items=[
            ("PUSH", "Push", "Push settings to the Blender Cloud"),
            ("PULL", "Pull", "Pull settings from the Blender Cloud"),
            ("REFRESH", "Refresh", "Refresh available versions"),
            ("SELECT", "Select", "Select version to sync"),
        ],
        name="action",
    )

    CURRENT_BLENDER_VERSION = "%i.%i" % bpy.app.version[:2]
    blender_version: bpy.props.StringProperty(
        name="blender_version",
        description="Blender version to sync for",
        default=CURRENT_BLENDER_VERSION,
    )

    def bss_report(self, level, message):
        bss = bpy.context.window_manager.blender_sync_status
        bss.report(level, message)

    def invoke(self, context, event):
        if self.action == "SELECT":
            # Synchronous action
            return self.action_select(context)

        if self.action in {"PUSH", "PULL"} and not self.blender_version:
            self.bss_report({"ERROR"}, "No Blender version to sync for was given.")
            return {"CANCELLED"}

        return async_loop.AsyncModalOperatorMixin.invoke(self, context, event)

@@ -242,133 +264,146 @@ class PILLAR_OT_sync(pillar.PillarOperatorMixin,
        This is a synchronous action, as it requires a dialog box.
        """

        self.log.info("Performing action SELECT")

        # Do a refresh before we can show the dropdown.
        fut = asyncio.ensure_future(
            self.async_execute(context, action_override="REFRESH")
        )
        loop = asyncio.get_event_loop()
        loop.run_until_complete(fut)

        self._state = "SELECTING"
        return context.window_manager.invoke_props_dialog(self)

    def draw(self, context):
        bss = bpy.context.window_manager.blender_sync_status
        self.layout.prop(bss, "version", text="Blender version")

    def execute(self, context):
        if self.action != "SELECT":
            log.debug("Ignoring execute() for action %r", self.action)
            return {"FINISHED"}

        log.debug("Performing execute() for action %r", self.action)
        # Perform the sync when the user closes the dialog box.
        bss = bpy.context.window_manager.blender_sync_status
        bpy.ops.pillar.sync(
            "INVOKE_DEFAULT", action="PULL", blender_version=bss.version
        )
        return {"FINISHED"}

    @async_set_blender_sync_status("SYNCING")
    async def async_execute(self, context, *, action_override=None):
        """Entry point of the asynchronous operator."""

        action = action_override or self.action
        self.bss_report({"INFO"}, "Communicating with Blender Cloud")
        self.log.info("Performing action %s", action)

        try:
            # Refresh credentials
            try:
                db_user = await self.check_credentials(context, REQUIRES_ROLES_FOR_SYNC)
                self.user_id = db_user["_id"]
                log.debug("Found user ID: %s", self.user_id)
            except pillar.NotSubscribedToCloudError as ex:
                self._log_subscription_needed(can_renew=ex.can_renew)
                self._state = "QUIT"
                return
            except pillar.UserNotLoggedInError:
                self.log.exception("Error checking/refreshing credentials.")
                self.bss_report({"ERROR"}, "Please log in on Blender ID first.")
                self._state = "QUIT"
                return

            # Find the home project.
            try:
                self.home_project_id = await home_project.get_home_project_id()
            except sdk_exceptions.ForbiddenAccess:
                self.log.exception("Forbidden access to home project.")
                self.bss_report({"ERROR"}, "Did not get access to home project.")
                self._state = "QUIT"
                return
            except sdk_exceptions.ResourceNotFound:
                self.bss_report({"ERROR"}, "Home project not found.")
                self._state = "QUIT"
                return

            # Only create the folder structure if we're pushing.
            may_create = self.action == "PUSH"
            try:
                gid, subgid = await find_sync_group_id(
                    self.home_project_id,
                    self.user_id,
                    self.blender_version,
                    may_create=may_create,
                )
                self.sync_group_id = gid
                self.sync_group_versioned_id = subgid
                self.log.debug("Found top-level group node ID: %s", self.sync_group_id)
                self.log.debug(
                    "Found group node ID for %s: %s",
                    self.blender_version,
                    self.sync_group_versioned_id,
                )
            except sdk_exceptions.ForbiddenAccess:
                self.log.exception("Unable to find Group ID")
                self.bss_report({"ERROR"}, "Unable to find sync folder.")
                self._state = "QUIT"
                return

            # Perform the requested action.
            action_method = {
                "PUSH": self.action_push,
                "PULL": self.action_pull,
                "REFRESH": self.action_refresh,
            }[action]
            await action_method(context)
        except Exception as ex:
            self.log.exception("Unexpected exception caught.")
            self.bss_report({"ERROR"}, "Unexpected error: %s" % ex)
            self._state = "QUIT"

    async def action_push(self, context):
        """Sends files to the Pillar server."""

        self.log.info("Saved user preferences to disk before pushing to cloud.")
        bpy.ops.wm.save_userpref()

        config_dir = pathlib.Path(bpy.utils.user_resource("CONFIG"))

        for fname in SETTINGS_FILES_TO_UPLOAD:
            path = config_dir / fname
            if not path.exists():
                self.log.debug("Skipping non-existing %s", path)
                continue

            if self.signalling_future.cancelled():
                self.bss_report({"WARNING"}, "Upload aborted.")
                return

            self.bss_report({"INFO"}, "Uploading %s" % fname)
            try:
                await pillar.attach_file_to_group(
                    path,
                    self.home_project_id,
                    self.sync_group_versioned_id,
                    self.user_id,
                )
            except sdk_exceptions.RequestEntityTooLarge as ex:
                self.log.error("File too big to upload: %s" % ex)
                self.log.error(
                    "To upload larger files, please subscribe to Blender Cloud."
                )
                self.bss_report(
                    {"SUBSCRIBE"},
                    "File %s too big to upload. "
                    "Subscribe for unlimited space." % fname,
                )
                self._state = "QUIT"
                return

        await self.action_refresh(context)

@@ -382,31 +417,37 @@ class PILLAR_OT_sync(pillar.PillarOperatorMixin,
        else:
            bss.version = max(bss.available_blender_versions)

        self.bss_report({"INFO"}, "Settings pushed to Blender Cloud.")

    async def action_pull(self, context):
        """Loads files from the Pillar server."""

        # If the sync group node doesn't exist, offer a list of groups that do.
        if not self.sync_group_id:
            self.bss_report(
                {"ERROR"}, "There are no synced Blender settings in your Blender Cloud."
            )
            return

        if not self.sync_group_versioned_id:
            self.bss_report(
                {"ERROR"},
                "Therre are no synced Blender settings for version %s"
                % self.blender_version,
            )
            return

        self.bss_report({"INFO"}, "Pulling settings from Blender Cloud")
        with tempfile.TemporaryDirectory(prefix="bcloud-sync") as tempdir:
            for fname in SETTINGS_FILES_TO_UPLOAD:
                await self.download_settings_file(fname, tempdir)

        self.bss_report(
            {"WARNING"}, "Settings pulled from Cloud, restart Blender to load them."
        )

    async def action_refresh(self, context):
        self.bss_report({"INFO"}, "Refreshing available Blender versions.")

        # Clear the LRU cache of available_blender_versions so that we can
        # obtain new versions (if someone synced from somewhere else, for example)

@@ -416,102 +457,123 @@ class PILLAR_OT_sync(pillar.PillarOperatorMixin,
        bss = bpy.context.window_manager.blender_sync_status
        bss.available_blender_versions = versions

        if not versions:
            # There are versions to sync, so we can remove the status message.
            # However, if there aren't any, the status message shows why, and
            # shouldn't be erased.
            return

        # Prevent warnings that the current value of the EnumProperty isn't valid.
        current_version = "%d.%d" % bpy.app.version[:2]
        if current_version in versions:
            bss.version = current_version
        else:
            bss.version = versions[0]

        self.bss_report({"INFO"}, "")

    async def download_settings_file(self, fname: str, temp_dir: str):
        config_dir = pathlib.Path(bpy.utils.user_resource("CONFIG"))
        meta_path = cache.cache_directory("home-project", "blender-sync")

        self.bss_report({"INFO"}, "Downloading %s from Cloud" % fname)

        # Get the asset node
        node_props = {
            "project": self.home_project_id,
            "node_type": "asset",
            "parent": self.sync_group_versioned_id,
            "name": fname,
        }
        node = await pillar_call(
            pillarsdk.Node.find_first,
            {"where": node_props, "projection": {"_id": 1, "properties.file": 1}},
            caching=False,
        )
        if node is None:
            self.bss_report({"INFO"}, "Unable to find %s on Blender Cloud" % fname)
            self.log.info("Unable to find node on Blender Cloud for %s", fname)
            return

        async def file_downloaded(
            file_path: str, file_desc: pillarsdk.File, map_type: str
        ):
            # Allow the caller to adjust the file before we move it into place.
            if fname.lower() == "userpref.blend":
                await self.update_userpref_blend(file_path)

            # Move the file next to the final location; as it may be on a
            # different filesystem than the temporary directory, this can
            # fail, and we don't want to destroy the existing file.
            local_temp = config_dir / (fname + "~")
            local_final = config_dir / fname

            # Make a backup copy of the file as it was before pulling.
            if local_final.exists():
                local_bak = config_dir / (fname + "-pre-bcloud-pull")
                self.move_file(local_final, local_bak)

            self.move_file(file_path, local_temp)
            self.move_file(local_temp, local_final)

        file_id = node.properties.file
        await pillar.download_file_by_uuid(
            file_id,
            temp_dir,
            str(meta_path),
            file_loaded_sync=file_downloaded,
            future=self.signalling_future,
        )

    def move_file(self, src, dst):
        self.log.info("Moving %s to %s", src, dst)
        shutil.move(str(src), str(dst))

    async def update_userpref_blend(self, file_path: str):
        self.log.info("Overriding machine-local settings in %s", file_path)

        # Remember some settings that should not be overwritten from the Cloud.
        prefs = blender.ctx_preferences()
        remembered = {}
        for rna_key, python_key in LOCAL_SETTINGS_RNA:
            assert (
                "." in python_key
            ), "Sorry, this code assumes there is a dot in the Python key"

            try:
                value = prefs.path_resolve(python_key)
            except ValueError:
                # Setting doesn't exist. This can happen, for example Cycles
                # settings on a build that doesn't have Cycles enabled.
                continue

            # Map enums from strings (in Python) to ints (in DNA).
            dot_index = python_key.rindex(".")
            parent_key, prop_key = python_key[:dot_index], python_key[dot_index + 1 :]
            parent = prefs.path_resolve(parent_key)
            prop = parent.bl_rna.properties[prop_key]
            if prop.type == "ENUM":
                log.debug(
                    "Rewriting %s from %r to %r",
                    python_key,
                    value,
                    prop.enum_items[value].value,
                )
                value = prop.enum_items[value].value
            else:
                log.debug("Keeping value of %s: %r", python_key, value)
            remembered[rna_key] = value
        log.debug("Overriding values: %s", remembered)

        # Rewrite the userprefs.blend file to override the options.
        with blendfile.open_blend(file_path, "rb+") as blend:
            prefs = next(block for block in blend.blocks if block.code == b"USER")

            for key, value in remembered.items():
                self.log.debug("prefs[%r] = %r" % (key, prefs[key]))
                self.log.debug(" -> setting prefs[%r] = %r" % (key, value))
                prefs[key] = value
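action_refresh() works because available_blender_versions() is wrapped in functools.lru_cache(): refreshing is a matter of clearing that cache before querying again. A small generic sketch of the pattern, with a hypothetical fetch function standing in for the Pillar query:

import functools


@functools.lru_cache()
def fetch_versions(project_id: str) -> tuple:
    # Stand-in for an expensive server query.
    print("querying server for", project_id)
    return ("2.79", "2.80")


fetch_versions("home")   # hits the "server"
fetch_versions("home")   # served from the cache
fetch_versions.cache_clear()
fetch_versions("home")   # queries again, picking up any new data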

File diff suppressed because it is too large

@@ -0,0 +1,910 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
import asyncio
import logging
import os
import threading
import typing
import bpy
import bgl
import pillarsdk
from .. import async_loop, pillar, cache, blender, utils
from . import (
menu_item as menu_item_mod,
) # so that we can have menu items called 'menu_item'
from . import draw, nodes
REQUIRED_ROLES_FOR_TEXTURE_BROWSER = {"subscriber", "demo"}
MOUSE_SCROLL_PIXELS_PER_TICK = 50
TARGET_ITEM_WIDTH = 400
TARGET_ITEM_HEIGHT = 128
ITEM_MARGIN_X = 5
ITEM_MARGIN_Y = 5
ITEM_PADDING_X = 5
log = logging.getLogger(__name__)
class BlenderCloudBrowser(
pillar.PillarOperatorMixin, async_loop.AsyncModalOperatorMixin, bpy.types.Operator
):
bl_idname = "pillar.browser"
bl_label = "Blender Cloud Texture Browser"
_draw_handle = None
current_path = pillar.CloudPath("/")
project_name = ""
# This contains a stack of Node objects that lead up to the currently browsed node.
path_stack = [] # type: typing.List[pillarsdk.Node]
# This contains a stack of MenuItem objects that lead up to the currently browsed node.
menu_item_stack = [] # type: typing.List[menu_item_mod.MenuItem]
timer = None
log = logging.getLogger("%s.BlenderCloudBrowser" % __name__)
_menu_item_lock = threading.Lock()
current_display_content = [] # type: typing.List[menu_item_mod.MenuItem]
loaded_images = set() # type: typing.Set[str]
thumbnails_cache = ""
maximized_area = False
mouse_x = 0
mouse_y = 0
scroll_offset = 0
scroll_offset_target = 0
scroll_offset_max = 0
scroll_offset_space_left = 0
def invoke(self, context, event):
# Refuse to start if the file hasn't been saved. It's okay if
# it's dirty, we just need to know where '//' points to.
if not os.path.exists(context.blend_data.filepath):
self.report(
{"ERROR"},
"Please save your Blend file before using " "the Blender Cloud addon.",
)
return {"CANCELLED"}
wm = context.window_manager
self.current_path = pillar.CloudPath(wm.last_blender_cloud_location)
self.path_stack = [] # list of nodes that make up the current path.
self.thumbnails_cache = cache.cache_directory("thumbnails")
self.mouse_x = event.mouse_x
self.mouse_y = event.mouse_y
# See if we have to maximize the current area
if not context.screen.show_fullscreen:
self.maximized_area = True
bpy.ops.screen.screen_full_area(use_hide_panels=True)
# Add the region OpenGL drawing callback
# draw in view space with 'POST_VIEW' and 'PRE_VIEW'
self._draw_handle = context.space_data.draw_handler_add(
self.draw_menu, (context,), "WINDOW", "POST_PIXEL"
)
self.current_display_content = []
self.loaded_images = set()
self._scroll_reset()
context.window.cursor_modal_set("DEFAULT")
return async_loop.AsyncModalOperatorMixin.invoke(self, context, event)
def modal(self, context, event):
result = async_loop.AsyncModalOperatorMixin.modal(self, context, event)
if not {"PASS_THROUGH", "RUNNING_MODAL"}.intersection(result):
return result
if event.type == "TAB" and event.value == "RELEASE":
self.log.info("Ensuring async loop is running")
async_loop.ensure_async_loop()
if event.type == "TIMER":
self._scroll_smooth()
context.area.tag_redraw()
return {"RUNNING_MODAL"}
if "MOUSE" in event.type:
context.area.tag_redraw()
self.mouse_x = event.mouse_x
self.mouse_y = event.mouse_y
left_mouse_release = event.type == "LEFTMOUSE" and event.value == "RELEASE"
if left_mouse_release and self._state in {"PLEASE_SUBSCRIBE", "PLEASE_RENEW"}:
self.open_browser_subscribe(renew=self._state == "PLEASE_RENEW")
self._finish(context)
return {"FINISHED"}
if self._state == "BROWSING":
selected = self.get_clicked()
if selected:
if selected.is_spinning:
context.window.cursor_set("WAIT")
else:
context.window.cursor_set("HAND")
else:
context.window.cursor_set("DEFAULT")
# Scrolling
if event.type == "WHEELUPMOUSE":
self._scroll_by(MOUSE_SCROLL_PIXELS_PER_TICK)
context.area.tag_redraw()
elif event.type == "WHEELDOWNMOUSE":
self._scroll_by(-MOUSE_SCROLL_PIXELS_PER_TICK)
context.area.tag_redraw()
elif event.type == "TRACKPADPAN":
self._scroll_by(event.mouse_prev_y - event.mouse_y, smooth=False)
context.area.tag_redraw()
if left_mouse_release:
if selected is None:
# No item clicked, ignore it.
return {"RUNNING_MODAL"}
if selected.is_spinning:
# This can happen when the thumbnail information isn't loaded yet.
return {"RUNNING_MODAL"}
if selected.is_folder:
self.descend_node(selected)
else:
self.handle_item_selection(context, selected)
if event.type in {"RIGHTMOUSE", "ESC"}:
self._finish(context)
return {"CANCELLED"}
return {"RUNNING_MODAL"}
async def async_execute(self, context):
self._state = "CHECKING_CREDENTIALS"
self.log.debug("Checking credentials")
try:
db_user = await self.check_credentials(
context, REQUIRED_ROLES_FOR_TEXTURE_BROWSER
)
except pillar.NotSubscribedToCloudError as ex:
self._log_subscription_needed(can_renew=ex.can_renew, level="INFO")
self._show_subscribe_screen(can_renew=ex.can_renew)
return None
if db_user is None:
raise pillar.UserNotLoggedInError()
await self.async_download_previews()
def _show_subscribe_screen(self, *, can_renew: bool):
"""Shows the "You need to subscribe" screen."""
if can_renew:
self._state = "PLEASE_RENEW"
else:
self._state = "PLEASE_SUBSCRIBE"
bpy.context.window.cursor_set("HAND")
def descend_node(self, menu_item: menu_item_mod.MenuItem):
"""Descends the node hierarchy by visiting this menu item's node.
Also keeps track of the current node, so that we know where the "up" button should go.
"""
node = menu_item.node
assert isinstance(node, pillarsdk.Node), "Wrong type %s" % node
if isinstance(node, nodes.UpNode):
# Going up.
self.log.debug("Going up to %r", self.current_path)
self.current_path = self.current_path.parent
if self.path_stack:
self.path_stack.pop()
if self.menu_item_stack:
self.menu_item_stack.pop()
if not self.path_stack:
self.project_name = ""
else:
# Going down, keep track of where we were
if isinstance(node, nodes.ProjectNode):
self.project_name = node["name"]
self.current_path /= node["_id"]
self.log.debug("Going down to %r", self.current_path)
self.path_stack.append(node)
self.menu_item_stack.append(menu_item)
self.browse_assets()
@property
def node(self):
if not self.path_stack:
return None
return self.path_stack[-1]
def _finish(self, context):
self.log.debug("Finishing the modal operator")
async_loop.AsyncModalOperatorMixin._finish(self, context)
self.clear_images()
context.space_data.draw_handler_remove(self._draw_handle, "WINDOW")
context.window.cursor_modal_restore()
if self.maximized_area:
bpy.ops.screen.screen_full_area(use_hide_panels=True)
context.area.tag_redraw()
self.log.debug("Modal operator finished")
def clear_images(self):
"""Removes all images we loaded from Blender's memory."""
for image in bpy.data.images:
if image.filepath_raw not in self.loaded_images:
continue
image.user_clear()
bpy.data.images.remove(image)
self.loaded_images.clear()
self.current_display_content.clear()
def add_menu_item(self, *args) -> menu_item_mod.MenuItem:
menu_item = menu_item_mod.MenuItem(*args)
# Just make this thread-safe to be on the safe side.
with self._menu_item_lock:
self.current_display_content.append(menu_item)
if menu_item.icon is not None:
self.loaded_images.add(menu_item.icon.filepath_raw)
self.sort_menu()
return menu_item
def update_menu_item(self, node, *args):
node_uuid = node["_id"]
# Just make this thread-safe to be on the safe side.
with self._menu_item_lock:
for menu_item in self.current_display_content:
if menu_item.represents(node):
menu_item.update(node, *args)
self.loaded_images.add(menu_item.icon.filepath_raw)
break
else:
raise ValueError("Unable to find MenuItem(node_uuid=%r)" % node_uuid)
self.sort_menu()
def sort_menu(self):
"""Sorts the self.current_display_content list."""
if not self.current_display_content:
return
with self._menu_item_lock:
self.current_display_content.sort(key=menu_item_mod.MenuItem.sort_key)
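add_menu_item() and update_menu_item() may be invoked from the thumbnail-download callbacks rather than from Blender's main thread, so every mutation of current_display_content above happens under _menu_item_lock ("to be on the safe side", as the comments put it). A minimal generic sketch of the same guard, independent of Blender:

import threading


class MenuContent:
    def __init__(self):
        self._lock = threading.Lock()
        self.items = []

    def add(self, item):
        # Mutations coming from worker threads are serialised by the lock.
        with self._lock:
            self.items.append(item)
            self.items.sort()


content = MenuContent()
threads = [threading.Thread(target=content.add, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(content.items)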
async def async_download_previews(self):
self._state = "BROWSING"
thumbnails_directory = self.thumbnails_cache
self.log.info("Asynchronously downloading previews to %r", thumbnails_directory)
self.log.info("Current BCloud path is %r", self.current_path)
self.clear_images()
self._scroll_reset()
project_uuid = self.current_path.project_uuid
node_uuid = self.current_path.node_uuid
if node_uuid:
# Query for sub-nodes of this node.
self.log.debug("Getting subnodes for parent node %r", node_uuid)
children = await pillar.get_nodes(
parent_node_uuid=node_uuid, node_type={"group_texture", "group_hdri"}
)
elif project_uuid:
# Query for top-level nodes.
self.log.debug("Getting subnodes for project node %r", project_uuid)
children = await pillar.get_nodes(
project_uuid=project_uuid,
parent_node_uuid="",
node_type={"group_texture", "group_hdri"},
)
else:
# Query for projects
self.log.debug(
"No node UUID and no project UUID, listing available projects"
)
children = await pillar.get_texture_projects()
for proj_dict in children:
self.add_menu_item(
nodes.ProjectNode(proj_dict), None, "FOLDER", proj_dict["name"]
)
return
# Make sure we can go up again.
self.add_menu_item(nodes.UpNode(), None, "FOLDER", ".. up ..")
# Download all child nodes
self.log.debug("Iterating over child nodes of %r", self.current_path)
for child in children:
# print(' - %(_id)s = %(name)s' % child)
if child["node_type"] not in menu_item_mod.MenuItem.SUPPORTED_NODE_TYPES:
self.log.debug("Skipping node of type %r", child["node_type"])
continue
self.add_menu_item(child, None, "FOLDER", child["name"])
# There are only sub-nodes at the project level, no texture nodes,
# so we won't have to bother looking for textures.
if not node_uuid:
return
directory = os.path.join(thumbnails_directory, project_uuid, node_uuid)
os.makedirs(directory, exist_ok=True)
self.log.debug("Fetching texture thumbnails for node %r", node_uuid)
def thumbnail_loading(node, texture_node):
self.add_menu_item(node, None, "SPINNER", texture_node["name"])
def thumbnail_loaded(node, file_desc, thumb_path):
self.log.debug("Node %s thumbnail loaded", node["_id"])
self.update_menu_item(node, file_desc, thumb_path)
await pillar.fetch_texture_thumbs(
node_uuid,
"s",
directory,
thumbnail_loading=thumbnail_loading,
thumbnail_loaded=thumbnail_loaded,
future=self.signalling_future,
)
def browse_assets(self):
self.log.debug("Browsing assets at %r", self.current_path)
bpy.context.window_manager.last_blender_cloud_location = str(self.current_path)
self._new_async_task(self.async_download_previews())
def draw_menu(self, context):
"""Draws the GUI with OpenGL."""
drawers = {
"INITIALIZING": self._draw_initializing,
"CHECKING_CREDENTIALS": self._draw_checking_credentials,
"BROWSING": self._draw_browser,
"DOWNLOADING_TEXTURE": self._draw_downloading,
"EXCEPTION": self._draw_exception,
"PLEASE_SUBSCRIBE": self._draw_subscribe,
"PLEASE_RENEW": self._draw_renew,
}
if self._state in drawers:
drawer = drawers[self._state]
drawer(context)
# For debugging: draw the state
draw.text(
(5, 5),
"%s %s" % (self._state, self.project_name),
rgba=(1.0, 1.0, 1.0, 1.0),
fsize=12,
)
@staticmethod
def _window_region(context):
window_regions = [
region for region in context.area.regions if region.type == "WINDOW"
]
return window_regions[0]
def _draw_browser(self, context):
"""OpenGL drawing code for the BROWSING state."""
from . import draw
if not self.current_display_content:
self._draw_text_on_colour(
context, "Communicating with Blender Cloud", (0.0, 0.0, 0.0, 0.6)
)
return
window_region = self._window_region(context)
content_width = window_region.width - ITEM_MARGIN_X * 2
content_height = window_region.height - ITEM_MARGIN_Y * 2
content_x = ITEM_MARGIN_X
content_y = context.area.height - ITEM_MARGIN_Y - TARGET_ITEM_HEIGHT
col_count = content_width // TARGET_ITEM_WIDTH
item_width = (content_width - (col_count * ITEM_PADDING_X)) / col_count
item_height = TARGET_ITEM_HEIGHT
block_width = item_width + ITEM_PADDING_X
block_height = item_height + ITEM_MARGIN_Y
bgl.glEnable(bgl.GL_BLEND)
draw.aabox(
(0, 0), (window_region.width, window_region.height), (0.0, 0.0, 0.0, 0.6)
)
bottom_y = float("inf")
# The -1 / +2 are for extra rows that are drawn only half at the top/bottom.
first_item_idx = max(
0, int(-self.scroll_offset // block_height - 1) * col_count
)
items_per_page = int(content_height // item_height + 2) * col_count
last_item_idx = first_item_idx + items_per_page
for item_idx, item in enumerate(self.current_display_content):
x = content_x + (item_idx % col_count) * block_width
y = content_y - (item_idx // col_count) * block_height - self.scroll_offset
item.update_placement(x, y, item_width, item_height)
if first_item_idx <= item_idx < last_item_idx:
# Only draw if the item is actually on screen.
item.draw(highlighted=item.hits(self.mouse_x, self.mouse_y))
bottom_y = min(y, bottom_y)
self.scroll_offset_space_left = window_region.height - bottom_y
self.scroll_offset_max = (
self.scroll_offset - self.scroll_offset_space_left + 0.25 * block_height
)
bgl.glDisable(bgl.GL_BLEND)
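_draw_browser() lays the menu items out in a simple grid: the column count follows from the region width and the target item width, and each item's position is derived from its index. The same placement arithmetic, extracted into a standalone form with made-up input values and the region size taken directly as parameters (a simplification of the code above):

TARGET_ITEM_WIDTH = 400
TARGET_ITEM_HEIGHT = 128
ITEM_MARGIN_X = 5
ITEM_MARGIN_Y = 5
ITEM_PADDING_X = 5


def item_placement(item_idx, region_width, region_height, scroll_offset=0):
    content_width = region_width - ITEM_MARGIN_X * 2
    content_x = ITEM_MARGIN_X
    content_y = region_height - ITEM_MARGIN_Y - TARGET_ITEM_HEIGHT

    col_count = content_width // TARGET_ITEM_WIDTH
    item_width = (content_width - (col_count * ITEM_PADDING_X)) / col_count
    block_width = item_width + ITEM_PADDING_X
    block_height = TARGET_ITEM_HEIGHT + ITEM_MARGIN_Y

    x = content_x + (item_idx % col_count) * block_width
    y = content_y - (item_idx // col_count) * block_height - scroll_offset
    return x, y, item_width, TARGET_ITEM_HEIGHT


print(item_placement(0, 1280, 720))  # first item, top-left
print(item_placement(3, 1280, 720))  # wraps to the second row with 3 columns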
def _draw_downloading(self, context):
"""OpenGL drawing code for the DOWNLOADING_TEXTURE state."""
self._draw_text_on_colour(
context, "Downloading texture from Blender Cloud", (0.0, 0.0, 0.2, 0.6)
)
def _draw_checking_credentials(self, context):
"""OpenGL drawing code for the CHECKING_CREDENTIALS state."""
self._draw_text_on_colour(
context, "Checking login credentials", (0.0, 0.0, 0.2, 0.6)
)
def _draw_initializing(self, context):
"""OpenGL drawing code for the INITIALIZING state."""
self._draw_text_on_colour(context, "Initializing", (0.0, 0.0, 0.2, 0.6))
def _draw_text_on_colour(self, context, text: str, bgcolour):
content_height, content_width = self._window_size(context)
bgl.glEnable(bgl.GL_BLEND)
draw.aabox((0, 0), (content_width, content_height), bgcolour)
draw.text(
(content_width * 0.5, content_height * 0.7), text, fsize=20, align="C"
)
bgl.glDisable(bgl.GL_BLEND)
def _window_size(self, context):
window_region = self._window_region(context)
content_width = window_region.width
content_height = window_region.height
return content_height, content_width
def _draw_exception(self, context):
"""OpenGL drawing code for the EXCEPTION state."""
import textwrap
content_height, content_width = self._window_size(context)
bgl.glEnable(bgl.GL_BLEND)
draw.aabox((0, 0), (content_width, content_height), (0.2, 0.0, 0.0, 0.6))
ex = self.async_task.exception()
if isinstance(ex, pillar.UserNotLoggedInError):
ex_msg = (
"You are not logged in on Blender ID. Please log in at User Preferences, "
"Add-ons, Blender ID Authentication."
)
else:
ex_msg = str(ex)
if not ex_msg:
ex_msg = str(type(ex))
text = "An error occurred:\n%s" % ex_msg
lines = textwrap.wrap(text, width=100)
draw.text((content_width * 0.1, content_height * 0.9), lines, fsize=16)
bgl.glDisable(bgl.GL_BLEND)
def _draw_subscribe(self, context):
self._draw_text_on_colour(
context, "Click to subscribe to the Blender Cloud", (0.0, 0.0, 0.2, 0.6)
)
def _draw_renew(self, context):
self._draw_text_on_colour(
context,
"Click to renew your Blender Cloud subscription",
(0.0, 0.0, 0.2, 0.6),
)
def get_clicked(self) -> typing.Optional[menu_item_mod.MenuItem]:
for item in self.current_display_content:
if item.hits(self.mouse_x, self.mouse_y):
return item
return None
def handle_item_selection(self, context, item: menu_item_mod.MenuItem):
"""Called when the user clicks on a menu item that doesn't represent a folder."""
from pillarsdk.utils import sanitize_filename
self.clear_images()
self._state = "DOWNLOADING_TEXTURE"
node_path_components = (
node["name"] for node in self.path_stack if node is not None
)
local_path_components = [
sanitize_filename(comp) for comp in node_path_components
]
top_texture_directory = bpy.path.abspath(context.scene.local_texture_dir)
local_path = os.path.join(top_texture_directory, *local_path_components)
meta_path = os.path.join(top_texture_directory, ".blender_cloud")
self.log.info("Downloading texture %r to %s", item.node_uuid, local_path)
self.log.debug("Metadata will be stored at %s", meta_path)
file_paths = []
select_dblock = None
node = item.node
def texture_downloading(file_path, *_):
self.log.info("Texture downloading to %s", file_path)
def texture_downloaded(file_path, file_desc, map_type):
nonlocal select_dblock
self.log.info("Texture downloaded to %r.", file_path)
if context.scene.local_texture_dir.startswith("//"):
file_path = bpy.path.relpath(file_path)
image_dblock = bpy.data.images.load(filepath=file_path)
image_dblock["bcloud_file_uuid"] = file_desc["_id"]
image_dblock["bcloud_node_uuid"] = node["_id"]
image_dblock["bcloud_node_type"] = node["node_type"]
image_dblock["bcloud_node"] = pillar.node_to_id(node)
if node["node_type"] == "hdri":
# All HDRi variations should use the same image datablock, hence one name.
image_dblock.name = node["name"]
else:
# All texture variations are loaded at once, and thus need the map type in the name.
image_dblock.name = "%s-%s" % (node["name"], map_type)
# Select the image in the image editor (if the context is right).
# Show the first image we download, preferring the colour map when it arrives.
if context.area.type == "IMAGE_EDITOR":
if select_dblock is None or file_desc.map_type == "color":
select_dblock = image_dblock
context.space_data.image = select_dblock
file_paths.append(file_path)
def texture_download_completed(_):
self.log.info(
"Texture download complete, inspect:\n%s", "\n".join(file_paths)
)
self._state = "QUIT"
# For HDRi nodes: only download the first file.
download_node = pillarsdk.Node.new(node)
if node["node_type"] == "hdri":
download_node.properties.files = [download_node.properties.files[0]]
signalling_future = asyncio.Future()
self._new_async_task(
pillar.download_texture(
download_node,
local_path,
metadata_directory=meta_path,
texture_loading=texture_downloading,
texture_loaded=texture_downloaded,
future=signalling_future,
)
)
self.async_task.add_done_callback(texture_download_completed)
def open_browser_subscribe(self, *, renew: bool):
import webbrowser
url = "renew" if renew else "join"
webbrowser.open_new_tab("https://cloud.blender.org/%s" % url)
self.report({"INFO"}, "We just started a browser for you.")
def _scroll_smooth(self):
diff = self.scroll_offset_target - self.scroll_offset
if diff == 0:
return
if abs(round(diff)) < 1:
self.scroll_offset = self.scroll_offset_target
return
self.scroll_offset += diff * 0.5
def _scroll_by(self, amount, *, smooth=True):
# Slow down scrolling up
if smooth and amount < 0 and -amount > self.scroll_offset_space_left / 4:
amount = -self.scroll_offset_space_left / 4
self.scroll_offset_target = min(
0, max(self.scroll_offset_max, self.scroll_offset_target + amount)
)
if not smooth:
self.scroll_offset = self.scroll_offset_target
def _scroll_reset(self):
self.scroll_offset_target = self.scroll_offset = 0
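The scroll helpers above ease the view towards scroll_offset_target by halving the remaining distance on each modal tick and snapping once less than a pixel remains. A minimal standalone sketch of that smoothing step (outside Blender; names are hypothetical):
def smooth_step(offset: float, target: float) -> float:
    """One tick of exponential smoothing, mirroring _scroll_smooth() above."""
    diff = target - offset
    if diff == 0:
        return offset
    if abs(round(diff)) < 1:
        return target  # Close enough: snap to the target.
    return offset + diff * 0.5
offset = 0.0
for _ in range(10):
    offset = smooth_step(offset, -100.0)
    print(offset)  # -50.0, -75.0, -87.5, ... then snaps to -100.0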
class PILLAR_OT_switch_hdri(
pillar.PillarOperatorMixin, async_loop.AsyncModalOperatorMixin, bpy.types.Operator
):
bl_idname = "pillar.switch_hdri"
bl_label = "Switch with another variation"
bl_description = (
"Downloads the selected variation of an HDRi, " "replacing the current image"
)
log = logging.getLogger("bpy.ops.%s" % bl_idname)
image_name: bpy.props.StringProperty(
name="image_name", description="Name of the image block to replace"
)
file_uuid: bpy.props.StringProperty(
name="file_uuid", description="File ID to download"
)
async def async_execute(self, context):
"""Entry point of the asynchronous operator."""
self.report({"INFO"}, "Communicating with Blender Cloud")
try:
try:
db_user = await self.check_credentials(
context, REQUIRED_ROLES_FOR_TEXTURE_BROWSER
)
user_id = db_user["_id"]
except pillar.NotSubscribedToCloudError as ex:
self._log_subscription_needed(can_renew=ex.can_renew)
self._state = "QUIT"
return
except pillar.UserNotLoggedInError:
self.log.exception("Error checking/refreshing credentials.")
self.report({"ERROR"}, "Please log in on Blender ID first.")
self._state = "QUIT"
return
if not user_id:
raise pillar.UserNotLoggedInError()
await self.download_and_replace(context)
except Exception as ex:
self.log.exception("Unexpected exception caught.")
self.report({"ERROR"}, "Unexpected error %s: %s" % (type(ex), ex))
self._state = "QUIT"
async def download_and_replace(self, context):
self._state = "DOWNLOADING_TEXTURE"
current_image = bpy.data.images[self.image_name]
node = current_image["bcloud_node"]
filename = "%s.taken_from_file" % pillar.sanitize_filename(node["name"])
local_path = os.path.dirname(bpy.path.abspath(current_image.filepath))
top_texture_directory = bpy.path.abspath(context.scene.local_texture_dir)
meta_path = os.path.join(top_texture_directory, ".blender_cloud")
file_uuid = self.file_uuid
resolution = next(
file_ref["resolution"]
for file_ref in node["properties"]["files"]
if file_ref["file"] == file_uuid
)
my_log = self.log
my_log.info("Downloading file %r-%s to %s", file_uuid, resolution, local_path)
my_log.debug("Metadata will be stored at %s", meta_path)
def file_loading(file_path, file_desc, map_type):
my_log.info(
"Texture downloading to %s (%s)",
file_path,
utils.sizeof_fmt(file_desc["length"]),
)
async def file_loaded(file_path, file_desc, map_type):
if context.scene.local_texture_dir.startswith("//"):
file_path = bpy.path.relpath(file_path)
my_log.info("Texture downloaded to %s", file_path)
current_image["bcloud_file_uuid"] = file_uuid
current_image.filepath = (
file_path # This automatically reloads the image from disk.
)
# This forces users of the image to update.
for datablocks in bpy.data.user_map({current_image}).values():
for datablock in datablocks:
datablock.update_tag()
await pillar.download_file_by_uuid(
file_uuid,
local_path,
meta_path,
filename=filename,
map_type=resolution,
file_loading=file_loading,
file_loaded_sync=file_loaded,
future=self.signalling_future,
)
self.report({"INFO"}, "Image download complete")
# Store keymaps here so they can be accessed (and removed) after registration.
addon_keymaps = []
def image_editor_menu(self, context):
self.layout.operator(
BlenderCloudBrowser.bl_idname,
text="Get image from Blender Cloud",
icon_value=blender.icon("CLOUD"),
)
def hdri_download_panel__image_editor(self, context):
_hdri_download_panel(self, context.edit_image)
def hdri_download_panel__node_editor(self, context):
if context.active_node.type not in {"TEX_ENVIRONMENT", "TEX_IMAGE"}:
return
_hdri_download_panel(self, context.active_node.image)
def _hdri_download_panel(self, current_image):
if not current_image:
return
if "bcloud_node_type" not in current_image:
return
if current_image["bcloud_node_type"] != "hdri":
return
try:
current_variation = current_image["bcloud_file_uuid"]
except KeyError:
log.warning(
"Image %r has a bcloud_node_type but no bcloud_file_uuid property.",
current_image.name,
)
return
row = self.layout.row(align=True).split(factor=0.3)
row.label(text="HDRi", icon_value=blender.icon("CLOUD"))
row.prop(current_image, "hdri_variation", text="")
if current_image.hdri_variation != current_variation:
props = row.operator(
PILLAR_OT_switch_hdri.bl_idname, text="Replace", icon="FILE_REFRESH"
)
props.image_name = current_image.name
props.file_uuid = current_image.hdri_variation
# Storage for variation labels, as the strings in EnumProperty items
# MUST be kept in Python memory.
variation_label_storage = {}
def hdri_variation_choices(self, context):
if context.area.type == "IMAGE_EDITOR":
image = context.edit_image
elif context.area.type == "NODE_EDITOR":
image = context.active_node.image
else:
return []
if "bcloud_node" not in image:
return []
choices = []
for file_doc in image["bcloud_node"]["properties"]["files"]:
label = file_doc["resolution"]
variation_label_storage[label] = label
choices.append((file_doc["file"], label, ""))
return choices
def register():
bpy.utils.register_class(BlenderCloudBrowser)
bpy.utils.register_class(PILLAR_OT_switch_hdri)
bpy.types.IMAGE_MT_image.prepend(image_editor_menu)
bpy.types.IMAGE_PT_image_properties.append(hdri_download_panel__image_editor)
bpy.types.NODE_PT_active_node_properties.append(hdri_download_panel__node_editor)
# HDRi resolution switcher/chooser.
# TODO: when an image is selected, switch this property to its current resolution.
bpy.types.Image.hdri_variation = bpy.props.EnumProperty(
name="HDRi variations",
items=hdri_variation_choices,
description="Select a variation with which to replace this image",
)
# handle the keymap
wm = bpy.context.window_manager
kc = wm.keyconfigs.addon
if not kc:
print("No addon key configuration space found, so no custom hotkeys added.")
return
km = kc.keymaps.new(name="Screen")
kmi = km.keymap_items.new(
"pillar.browser", "A", "PRESS", ctrl=True, shift=True, alt=True
)
addon_keymaps.append((km, kmi))
def unregister():
# handle the keymap
for km, kmi in addon_keymaps:
km.keymap_items.remove(kmi)
addon_keymaps.clear()
if hasattr(bpy.types.Image, "hdri_variation"):
del bpy.types.Image.hdri_variation
bpy.types.IMAGE_MT_image.remove(image_editor_menu)
bpy.types.IMAGE_PT_image_properties.remove(hdri_download_panel__image_editor)
bpy.types.NODE_PT_active_node_properties.remove(hdri_download_panel__node_editor)
bpy.utils.unregister_class(BlenderCloudBrowser)
bpy.utils.unregister_class(PILLAR_OT_switch_hdri)


@ -0,0 +1,119 @@
"""OpenGL drawing code for the texture browser.
Requires Blender 2.80 or newer.
"""
import typing
import bgl
import blf
import bpy
import gpu
from gpu_extras.batch import batch_for_shader
if bpy.app.background:
shader = None
texture_shader = None
else:
shader = gpu.shader.from_builtin("2D_UNIFORM_COLOR")
texture_shader = gpu.shader.from_builtin("2D_IMAGE")
Float2 = typing.Tuple[float, float]
Float3 = typing.Tuple[float, float, float]
Float4 = typing.Tuple[float, float, float, float]
def text(
pos2d: Float2,
display_text: typing.Union[str, typing.List[str]],
rgba: Float4 = (1.0, 1.0, 1.0, 1.0),
fsize=12,
align="L",
):
"""Draw text with the top-left corner at 'pos2d'."""
dpi = bpy.context.preferences.system.dpi
gap = 12
x_pos, y_pos = pos2d
font_id = 0
blf.size(font_id, fsize, dpi)
# Compute the height of one line.
mwidth, mheight = blf.dimensions(font_id, "Tp") # Use high and low letters.
mheight *= 1.5
# Split text into lines.
if isinstance(display_text, str):
mylines = display_text.split("\n")
else:
mylines = display_text
maxwidth = 0
maxheight = len(mylines) * mheight
for idx, line in enumerate(mylines):
text_width, text_height = blf.dimensions(font_id, line)
if align == "C":
newx = x_pos - text_width / 2
elif align == "R":
newx = x_pos - text_width - gap
else:
newx = x_pos
# Draw
blf.position(font_id, newx, y_pos - mheight * idx, 0)
blf.color(font_id, rgba[0], rgba[1], rgba[2], rgba[3])
blf.draw(font_id, " " + line)
# Keep track of the widest line drawn so far.
if maxwidth < text_width:
maxwidth = text_width
return maxwidth, maxheight
def aabox(v1: Float2, v2: Float2, rgba: Float4):
"""Draw an axis-aligned box."""
coords = [
(v1[0], v1[1]),
(v1[0], v2[1]),
(v2[0], v2[1]),
(v2[0], v1[1]),
]
shader.bind()
shader.uniform_float("color", rgba)
batch = batch_for_shader(shader, "TRI_FAN", {"pos": coords})
batch.draw(shader)
def aabox_with_texture(v1: Float2, v2: Float2):
"""Draw an axis-aligned box with a texture."""
coords = [
(v1[0], v1[1]),
(v1[0], v2[1]),
(v2[0], v2[1]),
(v2[0], v1[1]),
]
texture_shader.bind()
texture_shader.uniform_int("image", 0)
batch = batch_for_shader(
texture_shader,
"TRI_FAN",
{
"pos": coords,
"texCoord": ((0, 0), (0, 1), (1, 1), (1, 0)),
},
)
batch.draw(texture_shader)
def bind_texture(texture: bpy.types.Image):
"""Bind a Blender image to a GL texture slot."""
bgl.glActiveTexture(bgl.GL_TEXTURE0)
bgl.glBindTexture(bgl.GL_TEXTURE_2D, texture.bindcode)
def load_texture(texture: bpy.types.Image) -> int:
"""Load the texture, return OpenGL error code."""
return texture.gl_load()
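As a rough usage sketch (not part of the add-on): this is how the helpers above could be combined in a draw callback. some_image is a hypothetical bpy.types.Image; in background mode the shaders are None and none of this should run.
def example_draw_callback(some_image: bpy.types.Image):
    bgl.glEnable(bgl.GL_BLEND)
    aabox((0, 0), (200, 150), (0.0, 0.0, 0.0, 0.6))  # Translucent backdrop.
    text((10, 130), "Texture browser", fsize=14)
    err = load_texture(some_image)
    if not err:
        bind_texture(some_image)
        aabox_with_texture((10, 10), (138, 138))
        some_image.gl_free()
    bgl.glDisable(bgl.GL_BLEND)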

(Binary icon files not shown: one image added (11 KiB), two images updated (16 KiB and 2.4 KiB).)


@ -0,0 +1,209 @@
import logging
import os.path
import bpy
import bgl
import pillarsdk
from . import nodes
if bpy.app.version < (2, 80):
from . import draw_27 as draw
else:
from . import draw
library_icons_path = os.path.join(os.path.dirname(__file__), "icons")
ICON_WIDTH = 128
ICON_HEIGHT = 128
class MenuItem:
"""GUI menu item for the 3D View GUI."""
icon_margin_x = 4
icon_margin_y = 4
text_margin_x = 6
text_size = 12
text_size_small = 10
DEFAULT_ICONS = {
"FOLDER": os.path.join(library_icons_path, "folder.png"),
"SPINNER": os.path.join(library_icons_path, "spinner.png"),
"ERROR": os.path.join(library_icons_path, "error.png"),
}
FOLDER_NODE_TYPES = {
"group_texture",
"group_hdri",
nodes.UpNode.NODE_TYPE,
nodes.ProjectNode.NODE_TYPE,
}
SUPPORTED_NODE_TYPES = {"texture", "hdri"}.union(FOLDER_NODE_TYPES)
def __init__(self, node, file_desc, thumb_path: str, label_text):
self.log = logging.getLogger("%s.MenuItem" % __name__)
if node["node_type"] not in self.SUPPORTED_NODE_TYPES:
self.log.info("Invalid node type in node: %s", node)
raise TypeError(
"Node of type %r not supported; supported are %r."
% (node["node_type"], self.SUPPORTED_NODE_TYPES)
)
assert isinstance(node, pillarsdk.Node), "wrong type for node: %r" % type(node)
assert isinstance(node["_id"], str), 'wrong type for node["_id"]: %r' % type(
node["_id"]
)
self.node = node # pillarsdk.Node, contains 'node_type' key to indicate type
self.file_desc = file_desc # pillarsdk.File object, or None if a 'folder' node.
self.label_text = label_text
self.small_text = self._small_text_from_node()
self._thumb_path = ""
self.icon = None
self._is_folder = node["node_type"] in self.FOLDER_NODE_TYPES
self._is_spinning = False
# Determine the sorting order.
# By default, folders sort first and everything else goes at the end.
self._order = 0 if self._is_folder else 10000
if node and node.properties and node.properties.order is not None:
self._order = node.properties.order
self.thumb_path = thumb_path
# Updated when drawing the image
self.x = 0
self.y = 0
self.width = 0
self.height = 0
def _small_text_from_node(self) -> str:
"""Return the components of the texture (i.e. which map types are available)."""
if not self.node:
return ""
try:
node_files = self.node.properties.files
except AttributeError:
# Happens for nodes that don't have .properties.files.
return ""
if not node_files:
return ""
map_types = {f.map_type for f in node_files if f.map_type}
map_types.discard("color") # all textures have colour
if not map_types:
return ""
return ", ".join(sorted(map_types))
def sort_key(self):
"""Key for sorting lists of MenuItems."""
return self._order, self.label_text
@property
def thumb_path(self) -> str:
return self._thumb_path
@thumb_path.setter
def thumb_path(self, new_thumb_path: str):
self._is_spinning = new_thumb_path == "SPINNER"
self._thumb_path = self.DEFAULT_ICONS.get(new_thumb_path, new_thumb_path)
if self._thumb_path:
self.icon = bpy.data.images.load(filepath=self._thumb_path)
else:
self.icon = None
@property
def node_uuid(self) -> str:
return self.node["_id"]
def represents(self, node) -> bool:
"""Returns True iff this MenuItem represents the given node."""
node_uuid = node["_id"]
return self.node_uuid == node_uuid
def update(self, node, file_desc, thumb_path: str, label_text=None):
# We can get updated information about our Node, but a MenuItem should
# always represent one node, and it shouldn't be shared between nodes.
if self.node_uuid != node["_id"]:
raise ValueError(
"Don't change the node ID this MenuItem reflects, "
"just create a new one."
)
self.node = node
self.file_desc = file_desc # pillarsdk.File object, or None if a 'folder' node.
self.thumb_path = thumb_path
if label_text is not None:
self.label_text = label_text
if thumb_path == "ERROR":
self.small_text = "This open is broken"
else:
self.small_text = self._small_text_from_node()
@property
def is_folder(self) -> bool:
return self._is_folder
@property
def is_spinning(self) -> bool:
return self._is_spinning
def update_placement(self, x, y, width, height):
"""Use OpenGL to draw this one menu item."""
self.x = x
self.y = y
self.width = width
self.height = height
def draw(self, highlighted: bool):
bgl.glEnable(bgl.GL_BLEND)
if highlighted:
color = (0.555, 0.555, 0.555, 0.8)
else:
color = (0.447, 0.447, 0.447, 0.8)
draw.aabox((self.x, self.y), (self.x + self.width, self.y + self.height), color)
texture = self.icon
if texture:
err = draw.load_texture(texture)
assert not err, "OpenGL error: %i" % err
# ------ TEXTURE ---------#
if texture:
draw.bind_texture(texture)
bgl.glBlendFunc(bgl.GL_SRC_ALPHA, bgl.GL_ONE_MINUS_SRC_ALPHA)
draw.aabox_with_texture(
(self.x + self.icon_margin_x, self.y),
(self.x + self.icon_margin_x + ICON_WIDTH, self.y + ICON_HEIGHT),
)
bgl.glDisable(bgl.GL_BLEND)
if texture:
texture.gl_free()
# draw some text
text_x = self.x + self.icon_margin_x + ICON_WIDTH + self.text_margin_x
text_y = self.y + ICON_HEIGHT * 0.5 - 0.25 * self.text_size
draw.text((text_x, text_y), self.label_text, fsize=self.text_size)
draw.text(
(text_x, self.y + 0.5 * self.text_size_small),
self.small_text,
fsize=self.text_size_small,
rgba=(1.0, 1.0, 1.0, 0.5),
)
def hits(self, mouse_x: int, mouse_y: int) -> bool:
return (
self.x < mouse_x < self.x + self.width
and self.y < mouse_y < self.y + self.height
)
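To illustrate the ordering above: folders get order 0 and other nodes 10000 (unless the node specifies its own order), so sorting MenuItems by sort_key() keeps folders at the top and sorts alphabetically within each group. A minimal sketch, with plain tuples standing in for sort_key() results:
fake_keys = [
    (10000, "bricks"),     # texture node
    (0, "rock textures"),  # folder node
    (10000, "asphalt"),    # texture node
    (0, "hdri"),           # folder node
]
print(sorted(fake_keys))
# [(0, 'hdri'), (0, 'rock textures'), (10000, 'asphalt'), (10000, 'bricks')]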


@ -0,0 +1,28 @@
import pillarsdk
class SpecialFolderNode(pillarsdk.Node):
NODE_TYPE = "SPECIAL"
class UpNode(SpecialFolderNode):
NODE_TYPE = "UP"
def __init__(self):
super().__init__()
self["_id"] = "UP"
self["node_type"] = self.NODE_TYPE
class ProjectNode(SpecialFolderNode):
NODE_TYPE = "PROJECT"
def __init__(self, project):
super().__init__()
assert isinstance(
project, pillarsdk.Project
), "wrong type for project: %r" % type(project)
self.merge(project.to_dict())
self["node_type"] = self.NODE_TYPE


@ -16,24 +16,27 @@
# #
# ##### END GPL LICENSE BLOCK ##### # ##### END GPL LICENSE BLOCK #####
import json
import pathlib import pathlib
import typing
from typing import Any, Dict, Optional, Tuple
def sizeof_fmt(num: int, suffix='B') -> str: def sizeof_fmt(num: int, suffix="B") -> str:
"""Returns a human-readable size. """Returns a human-readable size.
Source: http://stackoverflow.com/a/1094933/875379 Source: http://stackoverflow.com/a/1094933/875379
""" """
for unit in ['', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi']: for unit in ["", "Ki", "Mi", "Gi", "Ti", "Pi", "Ei", "Zi"]:
if abs(num) < 1024: if abs(num) < 1024:
return '%.1f %s%s' % (num, unit, suffix) return "%.1f %s%s" % (num, unit, suffix)
num /= 1024 num //= 1024
return '%.1f Yi%s' % (num, suffix) return "%.1f Yi%s" % (num, suffix)
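A quick usage sketch of sizeof_fmt(); the inputs are exact powers of 1024, where the old float division and the reformatted integer division produce the same output:
print(sizeof_fmt(0))            # 0.0 B
print(sizeof_fmt(2048))         # 2.0 KiB
print(sizeof_fmt(5 * 1024**3))  # 5.0 GiB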
def find_in_path(path: pathlib.Path, filename: str) -> pathlib.Path: def find_in_path(path: pathlib.Path, filename: str) -> Optional[pathlib.Path]:
"""Performs a breadth-first search for the filename. """Performs a breadth-first search for the filename.
Returns the path that contains the file, or None if not found. Returns the path that contains the file, or None if not found.
@ -64,39 +67,43 @@ def find_in_path(path: pathlib.Path, filename: str) -> pathlib.Path:
return None return None
def pyside_cache(propname): # Mapping from (module name, function name) to the last value returned by that function.
_pyside_cache: Dict[Tuple[str, str], Any] = {}
def pyside_cache(wrapped):
"""Decorator, stores the result of the decorated callable in Python-managed memory. """Decorator, stores the result of the decorated callable in Python-managed memory.
This is to work around the warning at This is to work around the warning at
https://www.blender.org/api/blender_python_api_master/bpy.props.html#bpy.props.EnumProperty https://www.blender.org/api/blender_python_api_master/bpy.props.html#bpy.props.EnumProperty
""" """
if callable(propname):
raise TypeError('Usage: pyside_cache("property_name")')
def decorator(wrapped):
"""Stores the result of the callable in Python-managed memory.
This is to work around the warning at
https://www.blender.org/api/blender_python_api_master/bpy.props.html#bpy.props.EnumProperty
"""
import functools import functools
@functools.wraps(wrapped) @functools.wraps(wrapped)
# We can't use (*args, **kwargs), because EnumProperty explicitly checks # We can't use (*args, **kwargs), because EnumProperty explicitly checks
# for the number of fixed positional arguments. # for the number of fixed positional arguments.
def wrapper(self, context): def decorator(self, context):
result = None result = None
try: try:
result = wrapped(self, context) result = wrapped(self, context)
return result return result
finally: finally:
rna_type, rna_info = getattr(self.bl_rna, propname) _pyside_cache[wrapped.__module__, wrapped.__name__] = result
rna_info['_cached_result'] = result
return wrapper
return decorator return decorator
def redraw(self, context): def redraw(self, context):
if context.area is None:
return
context.area.tag_redraw() context.area.tag_redraw()
class JSONEncoder(json.JSONEncoder):
"""JSON encoder with support for some Blender types."""
def default(self, o):
if o.__class__.__name__ == "IDPropertyGroup" and hasattr(o, "to_dict"):
return o.to_dict()
return super().default(o)
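A hedged usage sketch of the reworked @pyside_cache decorator above; the callback and property names are hypothetical, and the import assumes the add-on's blender_cloud.utils module:
from blender_cloud.utils import pyside_cache  # Assumed import path.
@pyside_cache
def my_enum_items(self, context):
    # The returned strings must stay referenced from Python; the module-level
    # _pyside_cache above keeps the last result alive for Blender to read.
    return [("ID_A", "Label A", ""), ("ID_B", "Label B", "")]
# Typical wiring, e.g. inside register():
# bpy.types.Scene.my_enum = bpy.props.EnumProperty(items=my_enum_items, name="My enum")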


@ -37,30 +37,43 @@ def load_wheel(module_name, fname_prefix):
try: try:
module = __import__(module_name) module = __import__(module_name)
except ImportError as ex: except ImportError as ex:
log.debug('Unable to import %s directly, will try wheel: %s', log.debug("Unable to import %s directly, will try wheel: %s", module_name, ex)
module_name, ex)
else: else:
log.debug('Was able to load %s from %s, no need to load wheel %s', log.debug(
module_name, module.__file__, fname_prefix) "Was able to load %s from %s, no need to load wheel %s",
module_name,
module.__file__,
fname_prefix,
)
return return
sys.path.append(wheel_filename(fname_prefix)) sys.path.append(wheel_filename(fname_prefix))
module = __import__(module_name) module = __import__(module_name)
log.debug('Loaded %s from %s', module_name, module.__file__) log.debug("Loaded %s from %s", module_name, module.__file__)
def wheel_filename(fname_prefix: str) -> str: def wheel_filename(fname_prefix: str) -> str:
path_pattern = os.path.join(my_dir, '%s*.whl' % fname_prefix) path_pattern = os.path.join(my_dir, "%s*.whl" % fname_prefix)
wheels = glob.glob(path_pattern) wheels = glob.glob(path_pattern)
if not wheels: if not wheels:
raise RuntimeError('Unable to find wheel at %r' % path_pattern) raise RuntimeError("Unable to find wheel at %r" % path_pattern)
# If there are multiple wheels that match, load the latest one. # If there are multiple wheels that match, load the last-modified one.
wheels.sort() # Alphabetical sorting isn't going to cut it since BAT 1.10 was released.
def modtime(filename: str) -> int:
return os.stat(filename).st_mtime
wheels.sort(key=modtime)
return wheels[-1] return wheels[-1]
def load_wheels(): def load_wheels():
load_wheel('lockfile', 'lockfile') load_wheel("blender_asset_tracer", "blender_asset_tracer")
load_wheel('cachecontrol', 'CacheControl') load_wheel("lockfile", "lockfile")
load_wheel('pillarsdk', 'pillarsdk') load_wheel("cachecontrol", "CacheControl")
load_wheel("pillarsdk", "pillarsdk")
if __name__ == "__main__":
wheel = wheel_filename("blender_asset_tracer")
print(f"Wheel: {wheel}")

deploy-to-shared.sh (new executable file, 13 lines)

@ -0,0 +1,13 @@
#!/bin/bash -e
FULLNAME="$(python3 setup.py --fullname)"
echo "Press [ENTER] to deploy $FULLNAME to /shared"
read dummy
./clear_wheels.sh
python3 setup.py wheels bdist
DISTDIR=$(pwd)/dist
cd /shared/software/addons
rm -vf blender_cloud/wheels/*.whl # remove obsolete wheel files
unzip -o $DISTDIR/$FULLNAME.addon.zip


@ -1,16 +1,17 @@
# Primary requirements: # Primary requirements:
-e git+https://github.com/sybrenstuvel/cachecontrol.git@sybren-filecache-delete-crash-fix#egg=CacheControl -e git+https://github.com/sybrenstuvel/cachecontrol.git@sybren-filecache-delete-crash-fix#egg=CacheControl
lockfile==0.12.2 lockfile==0.12.2
pillarsdk==1.6.1 pillarsdk==1.8.0
wheel==0.29.0 wheel==0.29.0
blender-bam==1.1.7 blender-asset-tracer==1.11
# Secondary requirements: # Secondary requirements:
cffi==1.6.0 asn1crypto==0.24.0
cryptography==1.3.1 cffi==1.11.2
idna==2.1 cryptography==2.1.4
idna==2.6
pyasn1==0.1.9 pyasn1==0.1.9
pycparser==2.14 pycparser==2.18
pyOpenSSL==16.0.0 pyOpenSSL==17.5.0
requests==2.10.0 requests==2.10.0
six==1.10.0 six==1.11.0

setup.py (184 lines changed)

@ -18,7 +18,6 @@
# ##### END GPL LICENSE BLOCK ##### # ##### END GPL LICENSE BLOCK #####
import glob import glob
import os
import sys import sys
import shutil import shutil
import subprocess import subprocess
@ -29,13 +28,20 @@ import zipfile
from distutils import log from distutils import log
from distutils.core import Command from distutils.core import Command
from distutils.command.bdist import bdist from distutils.command.bdist import bdist
from distutils.command.install import install from distutils.command.install import install, INSTALL_SCHEMES
from distutils.command.install_egg_info import install_egg_info from distutils.command.install_egg_info import install_egg_info
from setuptools import setup, find_packages from setuptools import setup, find_packages
requirement_re = re.compile('[><=]+') requirement_re = re.compile("[><=]+")
sys.dont_write_bytecode = True sys.dont_write_bytecode = True
# Download wheels from pypi. The specific versions are taken from requirements.txt
wheels = [
"lockfile",
"pillarsdk",
"blender-asset-tracer",
]
def set_default_path(var, default): def set_default_path(var, default):
"""convert CLI-arguments (string) to Paths""" """convert CLI-arguments (string) to Paths"""
@ -51,35 +57,38 @@ class BuildWheels(Command):
description = "builds/downloads the dependencies as wheel files" description = "builds/downloads the dependencies as wheel files"
user_options = [ user_options = [
('wheels-path=', None, "wheel file installation path"), ("wheels-path=", None, "wheel file installation path"),
('deps-path=', None, "path in which dependencies are built"), ("deps-path=", None, "path in which dependencies are built"),
('cachecontrol-path=', None, "subdir of deps-path containing CacheControl"), ("cachecontrol-path=", None, "subdir of deps-path containing CacheControl"),
] ]
def initialize_options(self): def initialize_options(self):
self.wheels_path = None # path that will contain the installed wheels. self.wheels_path = None # path that will contain the installed wheels.
self.deps_path = None # path in which dependencies are built. self.deps_path = None # path in which dependencies are built.
self.cachecontrol_path = None # subdir of deps_path containing CacheControl self.cachecontrol_path = None # subdir of deps_path containing CacheControl
self.bat_path = None # subdir of deps_path containing Blender-Asset-Tracer
def finalize_options(self): def finalize_options(self):
self.my_path = pathlib.Path(__file__).resolve().parent self.my_path = pathlib.Path(__file__).resolve().parent
package_path = self.my_path / self.distribution.get_name() package_path = self.my_path / self.distribution.get_name()
self.wheels_path = set_default_path(self.wheels_path, package_path / 'wheels') self.wheels_path = set_default_path(self.wheels_path, package_path / "wheels")
self.deps_path = set_default_path(self.deps_path, self.my_path / 'build/deps') self.deps_path = set_default_path(self.deps_path, self.my_path / "build/deps")
self.cachecontrol_path = set_default_path(self.cachecontrol_path, self.cachecontrol_path = set_default_path(
self.deps_path / 'cachecontrol') self.cachecontrol_path, self.deps_path / "cachecontrol"
)
self.bat_path = self.deps_path / "bat"
def run(self): def run(self):
log.info('Storing wheels in %s', self.wheels_path) log.info("Storing wheels in %s", self.wheels_path)
# Parse the requirements.txt file # Parse the requirements.txt file
requirements = {} requirements = {}
with open(str(self.my_path / 'requirements.txt')) as reqfile: with open(str(self.my_path / "requirements.txt")) as reqfile:
for line in reqfile.readlines(): for line in reqfile.readlines():
line = line.strip() line = line.strip()
if not line or line.startswith('#'): if not line or line.startswith("#"):
# comments are lines that start with # only # comments are lines that start with # only
continue continue
@ -90,48 +99,46 @@ class BuildWheels(Command):
# log.info(' - %s = %s / %s', package, line, line_req[-1]) # log.info(' - %s = %s / %s', package, line, line_req[-1])
self.wheels_path.mkdir(parents=True, exist_ok=True) self.wheels_path.mkdir(parents=True, exist_ok=True)
for package in wheels:
# Download lockfile, as there is a suitable wheel on pypi. pattern = package.replace("-", "_") + "*.whl"
if not list(self.wheels_path.glob('lockfile*.whl')): if list(self.wheels_path.glob(pattern)):
log.info('Downloading lockfile wheel') continue
self.download_wheel(requirements['lockfile']) self.download_wheel(requirements[package])
# Download Pillar Python SDK from pypi.
if not list(self.wheels_path.glob('pillarsdk*.whl')):
log.info('Downloading Pillar Python SDK wheel')
self.download_wheel(requirements['pillarsdk'])
# Download BAM from pypi. This is required for compatibility with Blender 2.78.
if not list(self.wheels_path.glob('blender_bam*.whl')):
log.info('Downloading BAM wheel')
self.download_wheel(requirements['blender-bam'])
# Build CacheControl. # Build CacheControl.
if not list(self.wheels_path.glob('CacheControl*.whl')): if not list(self.wheels_path.glob("CacheControl*.whl")):
log.info('Building CacheControl in %s', self.cachecontrol_path) log.info("Building CacheControl in %s", self.cachecontrol_path)
# self.git_clone(self.cachecontrol_path, # self.git_clone(self.cachecontrol_path,
# 'https://github.com/ionrock/cachecontrol.git', # 'https://github.com/ionrock/cachecontrol.git',
# 'v%s' % requirements['CacheControl'][1]) # 'v%s' % requirements['CacheControl'][1])
# FIXME: we need my clone until pull request #125 has been merged & released # FIXME: we need my clone until pull request #125 has been merged & released
self.git_clone(self.cachecontrol_path, self.git_clone(
'https://github.com/sybrenstuvel/cachecontrol.git', self.cachecontrol_path,
'sybren-filecache-delete-crash-fix') "https://github.com/sybrenstuvel/cachecontrol.git",
"sybren-filecache-delete-crash-fix",
)
self.build_copy_wheel(self.cachecontrol_path) self.build_copy_wheel(self.cachecontrol_path)
# Ensure that the wheels are added to the data files. # Ensure that the wheels are added to the data files.
self.distribution.data_files.append( self.distribution.data_files.append(
('blender_cloud/wheels', (str(p) for p in self.wheels_path.glob('*.whl'))) ("blender_cloud/wheels", (str(p) for p in self.wheels_path.glob("*.whl")))
) )
def download_wheel(self, requirement): def download_wheel(self, requirement):
"""Downloads a wheel from PyPI and saves it in self.wheels_path.""" """Downloads a wheel from PyPI and saves it in self.wheels_path."""
subprocess.check_call([ subprocess.check_call(
'pip', 'download', [
'--no-deps', sys.executable,
'--dest', str(self.wheels_path), "-m",
requirement[0] "pip",
]) "download",
"--no-deps",
"--dest",
str(self.wheels_path),
requirement[0],
]
)
def git_clone(self, workdir: pathlib.Path, git_url: str, checkout: str = None): def git_clone(self, workdir: pathlib.Path, git_url: str, checkout: str = None):
if workdir.exists(): if workdir.exists():
@ -140,24 +147,25 @@ class BuildWheels(Command):
workdir.mkdir(parents=True) workdir.mkdir(parents=True)
subprocess.check_call(['git', 'clone', git_url, str(workdir)], subprocess.check_call(
cwd=str(workdir.parent)) ["git", "clone", git_url, str(workdir)], cwd=str(workdir.parent)
)
if checkout: if checkout:
subprocess.check_call(['git', 'checkout', checkout], subprocess.check_call(["git", "checkout", checkout], cwd=str(workdir))
cwd=str(workdir))
def build_copy_wheel(self, package_path: pathlib.Path): def build_copy_wheel(self, package_path: pathlib.Path):
# Make sure no wheels exist yet, so that we know which one to copy later. # Make sure no wheels exist yet, so that we know which one to copy later.
to_remove = list((package_path / 'dist').glob('*.whl')) to_remove = list((package_path / "dist").glob("*.whl"))
for fname in to_remove: for fname in to_remove:
fname.unlink() fname.unlink()
subprocess.check_call([sys.executable, 'setup.py', 'bdist_wheel'], subprocess.check_call(
cwd=str(package_path)) [sys.executable, "setup.py", "bdist_wheel"], cwd=str(package_path)
)
wheel = next((package_path / 'dist').glob('*.whl')) wheel = next((package_path / "dist").glob("*.whl"))
log.info('copying %s to %s', wheel, self.wheels_path) log.info("copying %s to %s", wheel, self.wheels_path)
shutil.copy(str(wheel), str(self.wheels_path)) shutil.copy(str(wheel), str(self.wheels_path))
@ -167,11 +175,19 @@ class BlenderAddonBdist(bdist):
def initialize_options(self): def initialize_options(self):
super().initialize_options() super().initialize_options()
self.formats = ['zip'] self.formats = ["zip"]
self.plat_name = 'addon' # use this instead of 'linux-x86_64' or similar. self.plat_name = "addon" # use this instead of 'linux-x86_64' or similar.
self.fix_local_prefix()
def fix_local_prefix(self):
"""Place data files in blender_cloud instead of local/blender_cloud."""
for key in INSTALL_SCHEMES:
if "data" not in INSTALL_SCHEMES[key]:
continue
INSTALL_SCHEMES[key]["data"] = "$base"
def run(self): def run(self):
self.run_command('wheels') self.run_command("wheels")
super().run() super().run()
@ -180,7 +196,7 @@ class BlenderAddonFdist(BlenderAddonBdist):
"""Ensures that 'python setup.py fdist' creates a plain folder structure.""" """Ensures that 'python setup.py fdist' creates a plain folder structure."""
user_options = [ user_options = [
('dest-path=', None, 'addon installation path'), ("dest-path=", None, "addon installation path"),
] ]
def initialize_options(self): def initialize_options(self):
@ -194,12 +210,12 @@ class BlenderAddonFdist(BlenderAddonBdist):
filepath = self.distribution.dist_files[0][2] filepath = self.distribution.dist_files[0][2]
# if dest_path is not specified use the filename as the dest_path (minus the .zip) # if dest_path is not specified use the filename as the dest_path (minus the .zip)
assert filepath.endswith('.zip') assert filepath.endswith(".zip")
target_folder = self.dest_path or filepath[:-4] target_folder = self.dest_path or filepath[:-4]
print('Unzipping the package on {}.'.format(target_folder)) print("Unzipping the package on {}.".format(target_folder))
with zipfile.ZipFile(filepath, 'r') as zip_ref: with zipfile.ZipFile(filepath, "r") as zip_ref:
zip_ref.extractall(target_folder) zip_ref.extractall(target_folder)
@ -209,8 +225,8 @@ class BlenderAddonInstall(install):
def initialize_options(self): def initialize_options(self):
super().initialize_options() super().initialize_options()
self.prefix = '' self.prefix = ""
self.install_lib = '' self.install_lib = ""
class AvoidEggInfo(install_egg_info): class AvoidEggInfo(install_egg_info):
@ -225,30 +241,38 @@ class AvoidEggInfo(install_egg_info):
setup( setup(
cmdclass={'bdist': BlenderAddonBdist, cmdclass={
'fdist': BlenderAddonFdist, "bdist": BlenderAddonBdist,
'install': BlenderAddonInstall, "fdist": BlenderAddonFdist,
'install_egg_info': AvoidEggInfo, "install": BlenderAddonInstall,
'wheels': BuildWheels}, "install_egg_info": AvoidEggInfo,
name='blender_cloud', "wheels": BuildWheels,
description='The Blender Cloud addon allows browsing the Blender Cloud from Blender.', },
version='1.7.4', name="blender_cloud",
author='Sybren A. Stüvel', description="The Blender Cloud addon allows browsing the Blender Cloud from Blender.",
author_email='sybren@stuvel.eu', version="1.25",
packages=find_packages('.'), author="Sybren A. Stüvel",
data_files=[('blender_cloud', ['README.md', 'README-flamenco.md', 'CHANGELOG.md']), author_email="sybren@stuvel.eu",
('blender_cloud/icons', glob.glob('blender_cloud/icons/*'))], packages=find_packages("."),
data_files=[
("blender_cloud", ["README.md", "README-flamenco.md", "CHANGELOG.md"]),
("blender_cloud/icons", glob.glob("blender_cloud/icons/*")),
(
"blender_cloud/texture_browser/icons",
glob.glob("blender_cloud/texture_browser/icons/*"),
),
],
scripts=[], scripts=[],
url='https://developer.blender.org/diffusion/BCA/', url="https://developer.blender.org/diffusion/BCA/",
license='GNU General Public License v2 or later (GPLv2+)', license="GNU General Public License v2 or later (GPLv2+)",
platforms='', platforms="",
classifiers=[ classifiers=[
'Intended Audience :: End Users/Desktop', "Intended Audience :: End Users/Desktop",
'Operating System :: OS Independent', "Operating System :: OS Independent",
'Environment :: Plugins', "Environment :: Plugins",
'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)', "License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)",
'Programming Language :: Python', "Programming Language :: Python",
'Programming Language :: Python :: 3.5', "Programming Language :: Python :: 3.5",
], ],
zip_safe=False, zip_safe=False,
) )


@ -15,71 +15,94 @@ from blender_cloud.flamenco import sdk
class PathReplacementTest(unittest.TestCase): class PathReplacementTest(unittest.TestCase):
def setUp(self): def setUp(self):
self.test_manager = sdk.Manager({ self.test_manager = sdk.Manager(
'_created': datetime.datetime(2017, 5, 31, 15, 12, 32, tzinfo=pillarsdk.utils.utc), {
'_etag': 'c39942ee4bcc4658adcc21e4bcdfb0ae', "_created": datetime.datetime(
'_id': '592edd609837732a2a272c62', 2017, 5, 31, 15, 12, 32, tzinfo=pillarsdk.utils.utc
'_updated': datetime.datetime(2017, 6, 8, 14, 51, 3, tzinfo=pillarsdk.utils.utc), ),
'description': 'Manager formerly known as "testman"', "_etag": "c39942ee4bcc4658adcc21e4bcdfb0ae",
'job_types': {'sleep': {'vars': {}}}, "_id": "592edd609837732a2a272c62",
'name': '<script>alert("this is a manager")</script>', "_updated": datetime.datetime(
'owner': '592edd609837732a2a272c63', 2017, 6, 8, 14, 51, 3, tzinfo=pillarsdk.utils.utc
'path_replacement': {'job_storage': {'darwin': '/Volume/shared', ),
'linux': '/shared', "description": 'Manager formerly known as "testman"',
'windows': 's:/'}, "job_types": {"sleep": {"vars": {}}},
'render': {'darwin': '/Volume/render/', "name": '<script>alert("this is a manager")</script>',
'linux': '/render/', "owner": "592edd609837732a2a272c63",
'windows': 'r:/'}, "path_replacement": {
'longrender': {'darwin': '/Volume/render/long', "job_storage": {
'linux': '/render/long', "darwin": "/Volume/shared",
'windows': 'r:/long'}, "linux": "/shared",
"windows": "s:/",
}, },
'projects': ['58cbdd5698377322d95eb55e'], "render": {
'service_account': '592edd609837732a2a272c60', "darwin": "/Volume/render/",
'stats': {'nr_of_workers': 3}, "linux": "/render/",
'url': 'http://192.168.3.101:8083/', "windows": "r:/",
'user_groups': ['58cbdd5698377322d95eb55f'], },
'variables': {'blender': {'darwin': '/opt/myblenderbuild/blender', "longrender": {
'linux': '/home/sybren/workspace/build_linux/bin/blender ' "darwin": "/Volume/render/long",
'--enable-new-depsgraph --factory-startup', "linux": "/render/long",
'windows': 'c:/temp/blender.exe'}}} "windows": "r:/long",
},
},
"projects": ["58cbdd5698377322d95eb55e"],
"service_account": "592edd609837732a2a272c60",
"stats": {"nr_of_workers": 3},
"url": "http://192.168.3.101:8083/",
"user_groups": ["58cbdd5698377322d95eb55f"],
"variables": {
"blender": {
"darwin": "/opt/myblenderbuild/blender",
"linux": "/home/sybren/workspace/build_linux/bin/blender "
"--enable-new-depsgraph --factory-startup",
"windows": "c:/temp/blender.exe",
}
},
}
) )
def test_linux(self): def test_linux(self):
# (expected result, input) # (expected result, input)
test_paths = [ test_paths = [
('/doesnotexistreally', '/doesnotexistreally'), ("/doesnotexistreally", "/doesnotexistreally"),
('{render}/agent327/scenes/A_01_03_B', '/render/agent327/scenes/A_01_03_B'), ("{render}/agent327/scenes/A_01_03_B", "/render/agent327/scenes/A_01_03_B"),
('{job_storage}/render/agent327/scenes', '/shared/render/agent327/scenes'), ("{job_storage}/render/agent327/scenes", "/shared/render/agent327/scenes"),
('{longrender}/agent327/scenes', '/render/long/agent327/scenes'), ("{longrender}/agent327/scenes", "/render/long/agent327/scenes"),
] ]
self._do_test(test_paths, 'linux', pathlib.PurePosixPath) self._do_test(test_paths, "linux", pathlib.PurePosixPath)
def test_windows(self): def test_windows(self):
# (expected result, input) # (expected result, input)
test_paths = [ test_paths = [
('c:/doesnotexistreally', 'c:/doesnotexistreally'), ("c:/doesnotexistreally", "c:/doesnotexistreally"),
('c:/some/path', r'c:\some\path'), ("c:/some/path", r"c:\some\path"),
('{render}/agent327/scenes/A_01_03_B', r'R:\agent327\scenes\A_01_03_B'), ("{render}/agent327/scenes/A_01_03_B", r"R:\agent327\scenes\A_01_03_B"),
('{render}/agent327/scenes/A_01_03_B', r'r:\agent327\scenes\A_01_03_B'), ("{render}/agent327/scenes/A_01_03_B", r"r:\agent327\scenes\A_01_03_B"),
('{render}/agent327/scenes/A_01_03_B', r'r:/agent327/scenes/A_01_03_B'), ("{render}/agent327/scenes/A_01_03_B", r"r:/agent327/scenes/A_01_03_B"),
('{job_storage}/render/agent327/scenes', 's:/render/agent327/scenes'), ("{job_storage}/render/agent327/scenes", "s:/render/agent327/scenes"),
('{longrender}/agent327/scenes', 'r:/long/agent327/scenes'), ("{longrender}/agent327/scenes", "r:/long/agent327/scenes"),
] ]
self._do_test(test_paths, 'windows', pathlib.PureWindowsPath) self._do_test(test_paths, "windows", pathlib.PureWindowsPath)
def test_darwin(self): def test_darwin(self):
# (expected result, input) # (expected result, input)
test_paths = [ test_paths = [
('/Volume/doesnotexistreally', '/Volume/doesnotexistreally'), ("/Volume/doesnotexistreally", "/Volume/doesnotexistreally"),
('{render}/agent327/scenes/A_01_03_B', r'/Volume/render/agent327/scenes/A_01_03_B'), (
('{job_storage}/render/agent327/scenes', '/Volume/shared/render/agent327/scenes'), "{render}/agent327/scenes/A_01_03_B",
('{longrender}/agent327/scenes', '/Volume/render/long/agent327/scenes'), r"/Volume/render/agent327/scenes/A_01_03_B",
),
(
"{job_storage}/render/agent327/scenes",
"/Volume/shared/render/agent327/scenes",
),
("{longrender}/agent327/scenes", "/Volume/render/long/agent327/scenes"),
] ]
self._do_test(test_paths, 'darwin', pathlib.PurePosixPath) self._do_test(test_paths, "darwin", pathlib.PurePosixPath)
def _do_test(self, test_paths, platform, pathclass): def _do_test(self, test_paths, platform, pathclass):
self.test_manager.PurePlatformPath = pathclass self.test_manager.PurePlatformPath = pathclass
@ -87,9 +110,11 @@ class PathReplacementTest(unittest.TestCase):
def mocked_system(): def mocked_system():
return platform return platform
with unittest.mock.patch('platform.system', mocked_system): with unittest.mock.patch("platform.system", mocked_system):
for expected_result, input_path in test_paths: for expected_result, input_path in test_paths:
as_path_instance = pathclass(input_path) as_path_instance = pathclass(input_path)
self.assertEqual(expected_result, self.assertEqual(
expected_result,
self.test_manager.replace_path(as_path_instance), self.test_manager.replace_path(as_path_instance),
'for input %r on platform %s' % (as_path_instance, platform)) "for input %r on platform %s" % (as_path_instance, platform),
)


@ -8,18 +8,18 @@ from blender_cloud import utils
class FindInPathTest(unittest.TestCase): class FindInPathTest(unittest.TestCase):
def test_nonexistant_path(self): def test_nonexistant_path(self):
path = pathlib.Path('/doesnotexistreally') path = pathlib.Path("/doesnotexistreally")
self.assertFalse(path.exists()) self.assertFalse(path.exists())
self.assertIsNone(utils.find_in_path(path, 'jemoeder.blend')) self.assertIsNone(utils.find_in_path(path, "jemoeder.blend"))
def test_really_breadth_first(self): def test_really_breadth_first(self):
"""A depth-first test might find dir_a1/dir_a2/dir_a3/find_me.txt first.""" """A depth-first test might find dir_a1/dir_a2/dir_a3/find_me.txt first."""
path = pathlib.Path(__file__).parent / 'test_really_breadth_first' path = pathlib.Path(__file__).parent / "test_really_breadth_first"
found = utils.find_in_path(path, 'find_me.txt') found = utils.find_in_path(path, "find_me.txt")
self.assertEqual(path / 'dir_b1' / 'dir_b2' / 'find_me.txt', found) self.assertEqual(path / "dir_b1" / "dir_b2" / "find_me.txt", found)
def test_nonexistant_file(self): def test_nonexistant_file(self):
path = pathlib.Path(__file__).parent / 'test_really_breadth_first' path = pathlib.Path(__file__).parent / "test_really_breadth_first"
found = utils.find_in_path(path, 'do_not_find_me.txt') found = utils.find_in_path(path, "do_not_find_me.txt")
self.assertEqual(None, found) self.assertEqual(None, found)


@ -9,11 +9,14 @@ fi
BL_INFO_VER=$(echo "$VERSION" | sed 's/\./, /g') BL_INFO_VER=$(echo "$VERSION" | sed 's/\./, /g')
sed "s/version='[^']*'/version='$VERSION'/" -i setup.py sed "s/version=\"[^\"]*\"/version=\"$VERSION\"/" -i setup.py
sed "s/'version': ([^)]*)/'version': ($BL_INFO_VER)/" -i blender_cloud/__init__.py sed "s/\"version\": ([^)]*)/\"version\": ($BL_INFO_VER)/" -i blender_cloud/__init__.py
git diff git diff
echo echo
echo "Don't forget to commit and tag:" echo "Don't forget to commit and tag:"
echo git commit -m \'Bumped version to $VERSION\' setup.py blender_cloud/__init__.py echo git commit -m \'Bumped version to $VERSION\' setup.py blender_cloud/__init__.py
echo git tag -a version-$VERSION -m \'Tagged version $VERSION\' echo git tag -a version-$VERSION -m \'Tagged version $VERSION\'
echo
echo "To build a distribution ZIP:"
echo python3 setup.py bdist