Compare commits

master..version-1.12.0

No commits in common. "master" and "version-1.12.0" have entirely different histories.

30 changed files with 2756 additions and 3448 deletions

View File

@@ -1,77 +1,5 @@
# Blender Cloud changelog
-## Version 1.25 (2022-02-25)
-- Compatibility with Blender 3.1 (Python 3.10).
-- Bump blender-asset-tracer to version 1.11, for UDIM support.
-## Version 1.24 (2022-02-04)
-- Bump blender-asset-tracer version 1.8 → 1.10, for fixing a bug where files were doubly-compressed.
-## Version 1.23 (2021-11-09)
-- Bump blender-asset-tracer version 1.7 → 1.8, for compatibility with sending read-only blend files to Flamenco.
-## Version 1.22 (2021-11-05)
-- Fix Windows incompatibility when using Shaman URLs as job storage path.
-- Bump blender-asset-tracer version 1.6 → 1.7, for compatibility with files compressed by Blender 3.0.
-## Version 1.21 (2021-07-27)
-- Bump blender-asset-tracer version 1.5.1 → 1.6, for better compatibility with Geometry Nodes.
-## Version 1.20 (2021-07-22)
-- Bump blender-asset-tracer version 1.3.1 -> 1.5.1.
-- Blender-asset-tracer "Strict Pointer Mode" disabled, to avoid issues with
-  not-entirely-synced library overrides.
-## Version 1.19 (2021-02-23)
-- Another Python 3.9+ compatibility fix.
-## Version 1.18 (2021-02-16)
-- Add compatibility with Python 3.9 (as used in Blender 2.93).
-- Drop compatibility with Blender 2.79 and older. The last version of the
-  Blender Cloud add-on with 2.79 and older is version 1.17.
-## Version 1.17 (2021-02-04)
-- This is the last version compatible with Blender 2.77a - 2.79.
-- Upgrade BAT to version 1.3.1, which brings compatibility with Geometry Nodes and
-  fixes some issues on Windows.
-## Version 1.16 (2020-03-03)
-- Fixed Windows compatibility issue with the handling of Shaman URLs.
-## Version 1.15 (2019-12-12)
-- Avoid creating BAT pack when the to-be-rendered file is already inside the job storage
-  directory. This assumes that the paths are already correct for the Flamenco Workers.
-## Version 1.14 (2019-10-10)
-- Upgraded BAT to 1.2 for missing smoke caches, compatibility with Blender 2.81, and some
-  Windows-specific fixes.
-- Removed warnings on the terminal when running Blender 2.80+
-## Version 1.13 (2019-04-18)
-- Upgraded BAT to 1.1.1 for a compatibility fix with Blender 2.79
-- Flamenco: Support for Flamenco Manager settings versioning + for settings version 2.
-  When using Blender Cloud Add-on 1.12 or older, Flamenco Server will automatically convert the
-  Manager settings to version 1.
-- More Blender 2.80 compatibility fixes
## Version 1.12 (2019-03-25)
- Flamenco: Change how progressive render tasks are created. Instead of the artist setting a fixed
@@ -90,15 +18,18 @@
  for more info.
- Flamenco: The 'blender-video-chunks' job type now also allows MP4 and MOV video containers.
## Version 1.11.1 (2019-01-04)
- Bundled missing Texture Browser icons.
## Version 1.11.0 (2019-01-04)
- Texture Browser now works on Blender 2.8.
- Blender Sync: Fixed compatibility issue with Blender 2.8.
## Version 1.10.0 (2019-01-02)
- Bundles Blender-Asset-Tracer 0.8.
@@ -119,23 +50,28 @@
  at least a way to tell whether the artist is considering that there is audio of any relevance in
  the current blend file.
## Version 1.9.4 (2018-11-01)
- Fixed Python 3.6 and Blender 2.79b incompatibilities accidentally introduced in 1.9.3.
## Version 1.9.3 (2018-10-30)
- Fix drawing of Attract strips in the VSE on Blender 2.8.
## Version 1.9.2 (2018-09-17)
- No changes, just a different filename to force a refresh on our
  hosting platform.
## Version 1.9.1 (2018-09-17)
- Fix issue with Python 3.7, which is used by current daily builds of Blender.
## Version 1.9 (2018-09-05)
- Last version to support Blender versions before 2.80!
@@ -145,6 +81,7 @@
- Flamenco: allow jobs to be created in 'paused' state.
- Flamenco: only show Flamenco Managers that are linked to the currently selected project.
## Version 1.8 (2018-01-03)
- Distinguish between 'please subscribe' (to get a new subscription) and 'please renew' (to renew an
@@ -160,12 +97,14 @@
  settings for the project.
- Added button in the User Preferences to open a Cloud project in your webbrowser.
## Version 1.7.5 (2017-10-06)
- Sorting the project list alphabetically.
- Renamed 'Job File Path' to 'Job Storage Path' so it's more explicit.
- Allow overriding the render output path on a per-scene basis.
## Version 1.7.4 (2017-09-05)
- Fix [T52621](https://developer.blender.org/T52621): Fixed class name collision upon add-on
@@ -173,48 +112,57 @@
- Fix [T48852](https://developer.blender.org/T48852): Screenshot no longer shows "Communicating with
  Blender Cloud".
## Version 1.7.3 (2017-08-08)
- Default to scene frame range when no frame range is given.
- Refuse to render on Flamenco before blend file is saved at least once.
- Fixed some Windows-specific issues.
## Version 1.7.2 (2017-06-22)
- Fixed compatibility with Blender 2.78c.
## Version 1.7.1 (2017-06-13)
- Fixed asyncio issues on Windows
## Version 1.7.0 (2017-06-09)
- Fixed reloading after upgrading from 1.4.4 (our last public release).
- Fixed bug handling a symlinked project path.
- Added support for Manager-defined path replacement variables.
## Version 1.6.4 (2017-04-21)
- Added file exclusion filter for Flamenco. A filter like `*.abc;*.mkv;*.mov` can be
  used to prevent certain files from being copied to the job storage directory.
  Requires a Blender that is bundled with BAM 1.1.7 or newer.
## Version 1.6.3 (2017-03-21)
- Fixed bug where local project path wasn't shown for projects only set up for Flamenco
  (and not Attract).
- Added this CHANGELOG.md file, which will contain user-relevant changes.
## Version 1.6.2 (2017-03-17)
- Flamenco: when opening non-existing file path, open parent instead
- Fix T50954: Improve Blender Cloud add-on project selector
## Version 1.6.1 (2017-03-07)
- Show error in GUI when Blender Cloud is unreachable
- Fixed sample count when using branched path tracing
## Version 1.6.0 (2017-02-14)
- Default to frame chunk size of 1 (instead of 10).
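The 1.6.4 entry above describes the Flamenco file-exclusion filter only in prose. Below is a minimal sketch of how such a semicolon-separated pattern list could be applied with `fnmatch`; the helper name is hypothetical, and per the entry the real filtering is handled by the bundled BAM.

```python
import fnmatch

def is_excluded(filename: str, exclude_filter: str) -> bool:
    """Hypothetical helper: True if `filename` matches any pattern in the filter."""
    patterns = [p.strip() for p in exclude_filter.split(";") if p.strip()]
    return any(fnmatch.fnmatch(filename, pattern) for pattern in patterns)

# The filter format from the 1.6.4 entry: skip Alembic caches and video files.
assert is_excluded("ocean_sim.abc", "*.abc;*.mkv;*.mov")
assert not is_excluded("shot_010.blend", "*.abc;*.mkv;*.mov")
```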

View File

@@ -19,22 +19,22 @@
# <pep8 compliant>
bl_info = {
-"name": "Blender Cloud",
+'name': 'Blender Cloud',
"author": "Sybren A. Stüvel, Francesco Siddi, Inês Almeida, Antony Riakiotakis",
-"version": (1, 25),
-"blender": (2, 80, 0),
-"location": "Addon Preferences panel, and Ctrl+Shift+Alt+A anywhere for texture browser",
-"description": "Texture library browser and Blender Sync. Requires the Blender ID addon "
-"and Blender 2.80 or newer.",
-"wiki_url": "https://wiki.blender.org/index.php/Extensions:2.6/Py/"
-"Scripts/System/BlenderCloud",
-"category": "System",
+'version': (1, 12, 0),
+'blender': (2, 80, 0),
+'location': 'Addon Preferences panel, and Ctrl+Shift+Alt+A anywhere for texture browser',
+'description': 'Texture library browser and Blender Sync. Requires the Blender ID addon '
+'and Blender 2.77a or newer.',
+'wiki_url': 'https://wiki.blender.org/index.php/Extensions:2.6/Py/'
+'Scripts/System/BlenderCloud',
+'category': 'System',
}
import logging
# Support reloading
-if "pillar" in locals():
+if 'pillar' in locals():
import importlib
wheels = importlib.reload(wheels)
@@ -60,11 +60,11 @@ def register():
_monkey_patch_requests()
# Support reloading
-if "%s.blender" % __name__ in sys.modules:
+if '%s.blender' % __name__ in sys.modules:
import importlib
def reload_mod(name):
-modname = "%s.%s" % (__name__, name)
+modname = '%s.%s' % (__name__, name)
try:
old_module = sys.modules[modname]
except KeyError:
@@ -76,32 +76,22 @@ def register():
sys.modules[modname] = new_module
return new_module
-reload_mod("blendfile")
-reload_mod("home_project")
-reload_mod("utils")
-reload_mod("pillar")
-async_loop = reload_mod("async_loop")
-flamenco = reload_mod("flamenco")
-attract = reload_mod("attract")
-texture_browser = reload_mod("texture_browser")
-settings_sync = reload_mod("settings_sync")
-image_sharing = reload_mod("image_sharing")
-blender = reload_mod("blender")
-project_specific = reload_mod("project_specific")
+reload_mod('blendfile')
+reload_mod('home_project')
+reload_mod('utils')
+reload_mod('pillar')
+async_loop = reload_mod('async_loop')
+flamenco = reload_mod('flamenco')
+attract = reload_mod('attract')
+texture_browser = reload_mod('texture_browser')
+settings_sync = reload_mod('settings_sync')
+image_sharing = reload_mod('image_sharing')
+blender = reload_mod('blender')
+project_specific = reload_mod('project_specific')
else:
-from . import (
-blender,
-texture_browser,
-async_loop,
-settings_sync,
-blendfile,
-home_project,
-image_sharing,
-attract,
-flamenco,
-project_specific,
-)
+from . import (blender, texture_browser, async_loop, settings_sync, blendfile, home_project,
+image_sharing, attract, flamenco, project_specific)
async_loop.setup_asyncio_executor()
async_loop.register()
@@ -127,23 +117,15 @@ def _monkey_patch_requests():
if requests.__build__ >= 0x020601:
return
-log.info("Monkey-patching requests version %s", requests.__version__)
+log.info('Monkey-patching requests version %s', requests.__version__)
from requests.packages.urllib3.response import HTTPResponse
HTTPResponse.chunked = False
HTTPResponse.chunk_left = None
def unregister():
-from . import (
-blender,
-texture_browser,
-async_loop,
-settings_sync,
-image_sharing,
-attract,
-flamenco,
-)
+from . import (blender, texture_browser, async_loop, settings_sync, image_sharing, attract,
+flamenco)
image_sharing.unregister()
attract.unregister()
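The register() hunks above revolve around Blender's script-reload support: after a reload, the package's submodules are still present in `sys.modules` and have to be refreshed explicitly. A condensed, self-contained sketch of the `reload_mod()` idiom shown above; the helper name and arguments are illustrative, not part of the add-on's API.

```python
import importlib
import sys

def reload_submodule(package_name: str, name: str):
    """Reload `package_name.name` if it was imported before, else import it fresh."""
    modname = "%s.%s" % (package_name, name)
    try:
        old_module = sys.modules[modname]
    except KeyError:
        # First load: a plain import is enough.
        return importlib.import_module(modname)
    return importlib.reload(old_module)
```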

View File

@@ -14,7 +14,7 @@ See <http://github.com/ActiveState/appdirs> for details and usage.
# - XDG spec for Un*x: http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html # - XDG spec for Un*x: http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html
__version_info__ = (1, 4, 0) __version_info__ = (1, 4, 0)
__version__ = ".".join(map(str, __version_info__)) __version__ = '.'.join(map(str, __version_info__))
import sys import sys
@@ -25,23 +25,23 @@ PY3 = sys.version_info[0] == 3
if PY3: if PY3:
unicode = str unicode = str
if sys.platform.startswith("java"): if sys.platform.startswith('java'):
import platform import platform
os_name = platform.java_ver()[3][0] os_name = platform.java_ver()[3][0]
if os_name.startswith("Windows"): # "Windows XP", "Windows 7", etc. if os_name.startswith('Windows'): # "Windows XP", "Windows 7", etc.
system = "win32" system = 'win32'
elif os_name.startswith("Mac"): # "Mac OS X", etc. elif os_name.startswith('Mac'): # "Mac OS X", etc.
system = "darwin" system = 'darwin'
else: # "Linux", "SunOS", "FreeBSD", etc. else: # "Linux", "SunOS", "FreeBSD", etc.
# Setting this to "linux2" is not ideal, but only Windows or Mac # Setting this to "linux2" is not ideal, but only Windows or Mac
# are actually checked for and the rest of the module expects # are actually checked for and the rest of the module expects
# *sys.platform* style strings. # *sys.platform* style strings.
system = "linux2" system = 'linux2'
else: else:
system = sys.platform system = sys.platform
def user_data_dir(appname=None, appauthor=None, version=None, roaming=False): def user_data_dir(appname=None, appauthor=None, version=None, roaming=False):
r"""Return full path to the user-specific data dir for this application. r"""Return full path to the user-specific data dir for this application.
@@ -84,12 +84,12 @@ def user_data_dir(appname=None, appauthor=None, version=None, roaming=False):
path = os.path.join(path, appauthor, appname) path = os.path.join(path, appauthor, appname)
else: else:
path = os.path.join(path, appname) path = os.path.join(path, appname)
elif system == "darwin": elif system == 'darwin':
path = os.path.expanduser("~/Library/Application Support/") path = os.path.expanduser('~/Library/Application Support/')
if appname: if appname:
path = os.path.join(path, appname) path = os.path.join(path, appname)
else: else:
path = os.getenv("XDG_DATA_HOME", os.path.expanduser("~/.local/share")) path = os.getenv('XDG_DATA_HOME', os.path.expanduser("~/.local/share"))
if appname: if appname:
path = os.path.join(path, appname) path = os.path.join(path, appname)
if appname and version: if appname and version:
@@ -137,19 +137,16 @@ def site_data_dir(appname=None, appauthor=None, version=None, multipath=False):
path = os.path.join(path, appauthor, appname) path = os.path.join(path, appauthor, appname)
else: else:
path = os.path.join(path, appname) path = os.path.join(path, appname)
elif system == "darwin": elif system == 'darwin':
path = os.path.expanduser("/Library/Application Support") path = os.path.expanduser('/Library/Application Support')
if appname: if appname:
path = os.path.join(path, appname) path = os.path.join(path, appname)
else: else:
# XDG default for $XDG_DATA_DIRS # XDG default for $XDG_DATA_DIRS
# only first, if multipath is False # only first, if multipath is False
path = os.getenv( path = os.getenv('XDG_DATA_DIRS',
"XDG_DATA_DIRS", os.pathsep.join(["/usr/local/share", "/usr/share"]) os.pathsep.join(['/usr/local/share', '/usr/share']))
) pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
pathlist = [
os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)
]
if appname: if appname:
if version: if version:
appname = os.path.join(appname, version) appname = os.path.join(appname, version)
@@ -198,7 +195,7 @@ def user_config_dir(appname=None, appauthor=None, version=None, roaming=False):
if system in ["win32", "darwin"]: if system in ["win32", "darwin"]:
path = user_data_dir(appname, appauthor, None, roaming) path = user_data_dir(appname, appauthor, None, roaming)
else: else:
path = os.getenv("XDG_CONFIG_HOME", os.path.expanduser("~/.config")) path = os.getenv('XDG_CONFIG_HOME', os.path.expanduser("~/.config"))
if appname: if appname:
path = os.path.join(path, appname) path = os.path.join(path, appname)
if appname and version: if appname and version:
@@ -243,10 +240,8 @@ def site_config_dir(appname=None, appauthor=None, version=None, multipath=False)
else: else:
# XDG default for $XDG_CONFIG_DIRS # XDG default for $XDG_CONFIG_DIRS
# only first, if multipath is False # only first, if multipath is False
path = os.getenv("XDG_CONFIG_DIRS", "/etc/xdg") path = os.getenv('XDG_CONFIG_DIRS', '/etc/xdg')
pathlist = [ pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)
]
if appname: if appname:
if version: if version:
appname = os.path.join(appname, version) appname = os.path.join(appname, version)
@@ -303,14 +298,14 @@ def user_cache_dir(appname=None, appauthor=None, version=None, opinion=True):
path = os.path.join(path, appname) path = os.path.join(path, appname)
if opinion: if opinion:
path = os.path.join(path, "Cache") path = os.path.join(path, "Cache")
elif system == "darwin": elif system == 'darwin':
path = os.path.expanduser("~/Library/Caches") path = os.path.expanduser('~/Library/Caches')
if appname: if appname:
path = os.path.join(path, appname) path = os.path.join(path, appname)
else: else:
path = os.getenv("XDG_CACHE_HOME", os.path.expanduser("~/.cache")) path = os.getenv('XDG_CACHE_HOME', os.path.expanduser('~/.cache'))
if appname: if appname:
path = os.path.join(path, appname.lower().replace(" ", "-")) path = os.path.join(path, appname.lower().replace(' ', '-'))
if appname and version: if appname and version:
path = os.path.join(path, version) path = os.path.join(path, version)
return path return path
@@ -349,7 +344,9 @@ def user_log_dir(appname=None, appauthor=None, version=None, opinion=True):
This can be disabled with the `opinion=False` option. This can be disabled with the `opinion=False` option.
""" """
if system == "darwin": if system == "darwin":
path = os.path.join(os.path.expanduser("~/Library/Logs"), appname) path = os.path.join(
os.path.expanduser('~/Library/Logs'),
appname)
elif system == "win32": elif system == "win32":
path = user_data_dir(appname, appauthor, version) path = user_data_dir(appname, appauthor, version)
version = False version = False
@@ -367,10 +364,8 @@ class AppDirs(object):
class AppDirs(object): class AppDirs(object):
"""Convenience wrapper for getting application dirs.""" """Convenience wrapper for getting application dirs."""
def __init__(self, appname, appauthor=None, version=None, roaming=False,
def __init__( multipath=False):
self, appname, appauthor=None, version=None, roaming=False, multipath=False
):
self.appname = appname self.appname = appname
self.appauthor = appauthor self.appauthor = appauthor
self.version = version self.version = version
@@ -379,39 +374,36 @@ class AppDirs(object):
@property @property
def user_data_dir(self): def user_data_dir(self):
return user_data_dir( return user_data_dir(self.appname, self.appauthor,
self.appname, self.appauthor, version=self.version, roaming=self.roaming version=self.version, roaming=self.roaming)
)
@property @property
def site_data_dir(self): def site_data_dir(self):
return site_data_dir( return site_data_dir(self.appname, self.appauthor,
self.appname, self.appauthor, version=self.version, multipath=self.multipath version=self.version, multipath=self.multipath)
)
@property @property
def user_config_dir(self): def user_config_dir(self):
return user_config_dir( return user_config_dir(self.appname, self.appauthor,
self.appname, self.appauthor, version=self.version, roaming=self.roaming version=self.version, roaming=self.roaming)
)
@property @property
def site_config_dir(self): def site_config_dir(self):
return site_config_dir( return site_config_dir(self.appname, self.appauthor,
self.appname, self.appauthor, version=self.version, multipath=self.multipath version=self.version, multipath=self.multipath)
)
@property @property
def user_cache_dir(self): def user_cache_dir(self):
return user_cache_dir(self.appname, self.appauthor, version=self.version) return user_cache_dir(self.appname, self.appauthor,
version=self.version)
@property @property
def user_log_dir(self): def user_log_dir(self):
return user_log_dir(self.appname, self.appauthor, version=self.version) return user_log_dir(self.appname, self.appauthor,
version=self.version)
# ---- internal support stuff #---- internal support stuff
def _get_win_folder_from_registry(csidl_name): def _get_win_folder_from_registry(csidl_name):
"""This is a fallback technique at best. I'm not sure if using the """This is a fallback technique at best. I'm not sure if using the
@@ -428,7 +420,7 @@ def _get_win_folder_from_registry(csidl_name):
key = _winreg.OpenKey( key = _winreg.OpenKey(
_winreg.HKEY_CURRENT_USER, _winreg.HKEY_CURRENT_USER,
r"Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders", r"Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders"
) )
dir, type = _winreg.QueryValueEx(key, shell_folder_name) dir, type = _winreg.QueryValueEx(key, shell_folder_name)
return dir return dir
@@ -436,7 +428,6 @@ def _get_win_folder_from_registry(csidl_name):
def _get_win_folder_with_pywin32(csidl_name): def _get_win_folder_with_pywin32(csidl_name):
from win32com.shell import shellcon, shell from win32com.shell import shellcon, shell
dir = shell.SHGetFolderPath(0, getattr(shellcon, csidl_name), 0, 0) dir = shell.SHGetFolderPath(0, getattr(shellcon, csidl_name), 0, 0)
# Try to make this a unicode path because SHGetFolderPath does # Try to make this a unicode path because SHGetFolderPath does
# not return unicode strings when there is unicode data in the # not return unicode strings when there is unicode data in the
@@ -454,7 +445,6 @@ def _get_win_folder_with_pywin32(csidl_name):
if has_high_char: if has_high_char:
try: try:
import win32api import win32api
dir = win32api.GetShortPathName(dir) dir = win32api.GetShortPathName(dir)
except ImportError: except ImportError:
pass pass
@@ -489,22 +479,15 @@ def _get_win_folder_with_ctypes(csidl_name):
return buf.value return buf.value
def _get_win_folder_with_jna(csidl_name): def _get_win_folder_with_jna(csidl_name):
import array import array
from com.sun import jna from com.sun import jna
from com.sun.jna.platform import win32 from com.sun.jna.platform import win32
buf_size = win32.WinDef.MAX_PATH * 2 buf_size = win32.WinDef.MAX_PATH * 2
buf = array.zeros("c", buf_size) buf = array.zeros('c', buf_size)
shell = win32.Shell32.INSTANCE shell = win32.Shell32.INSTANCE
shell.SHGetFolderPath( shell.SHGetFolderPath(None, getattr(win32.ShlObj, csidl_name), None, win32.ShlObj.SHGFP_TYPE_CURRENT, buf)
None,
getattr(win32.ShlObj, csidl_name),
None,
win32.ShlObj.SHGFP_TYPE_CURRENT,
buf,
)
dir = jna.Native.toString(buf.tostring()).rstrip("\0") dir = jna.Native.toString(buf.tostring()).rstrip("\0")
# Downgrade to short path name if have highbit chars. See # Downgrade to short path name if have highbit chars. See
@@ -515,47 +498,38 @@ def _get_win_folder_with_jna(csidl_name):
has_high_char = True has_high_char = True
break break
if has_high_char: if has_high_char:
buf = array.zeros("c", buf_size) buf = array.zeros('c', buf_size)
kernel = win32.Kernel32.INSTANCE kernel = win32.Kernel32.INSTANCE
if kernal.GetShortPathName(dir, buf, buf_size): if kernal.GetShortPathName(dir, buf, buf_size):
dir = jna.Native.toString(buf.tostring()).rstrip("\0") dir = jna.Native.toString(buf.tostring()).rstrip("\0")
return dir return dir
if system == "win32": if system == "win32":
try: try:
import win32com.shell import win32com.shell
_get_win_folder = _get_win_folder_with_pywin32 _get_win_folder = _get_win_folder_with_pywin32
except ImportError: except ImportError:
try: try:
from ctypes import windll # type: ignore from ctypes import windll # type: ignore
_get_win_folder = _get_win_folder_with_ctypes _get_win_folder = _get_win_folder_with_ctypes
except ImportError: except ImportError:
try: try:
import com.sun.jna import com.sun.jna
_get_win_folder = _get_win_folder_with_jna _get_win_folder = _get_win_folder_with_jna
except ImportError: except ImportError:
_get_win_folder = _get_win_folder_from_registry _get_win_folder = _get_win_folder_from_registry
# ---- self test code #---- self test code
if __name__ == "__main__": if __name__ == "__main__":
appname = "MyApp" appname = "MyApp"
appauthor = "MyCompany" appauthor = "MyCompany"
props = ( props = ("user_data_dir", "site_data_dir",
"user_data_dir", "user_config_dir", "site_config_dir",
"site_data_dir", "user_cache_dir", "user_log_dir")
"user_config_dir",
"site_config_dir",
"user_cache_dir",
"user_log_dir",
)
print("-- app dirs (with optional 'version')") print("-- app dirs (with optional 'version')")
dirs = AppDirs(appname, appauthor, version="1.0") dirs = AppDirs(appname, appauthor, version="1.0")
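For context, this is roughly how the vendored appdirs module diffed above is used. The import path and the `appauthor` value are assumptions for illustration, and the exact paths returned depend on the platform.

```python
from blender_cloud import appdirs  # assumed import path for the bundled module

# Per-user, per-application directories.
data_dir = appdirs.user_data_dir("blender_cloud", appauthor="Blender Foundation")
cache_dir = appdirs.user_cache_dir("blender_cloud", appauthor="Blender Foundation")

# The AppDirs wrapper exposes the same locations as properties.
dirs = appdirs.AppDirs("blender_cloud", appauthor="Blender Foundation", version="1.0")
print(dirs.user_data_dir, dirs.user_cache_dir, dirs.user_log_dir)
```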

View File

@@ -38,13 +38,11 @@ def setup_asyncio_executor():
import sys
-if sys.platform == "win32":
+if sys.platform == 'win32':
asyncio.get_event_loop().close()
# On Windows, the default event loop is SelectorEventLoop, which does
# not support subprocesses. ProactorEventLoop should be used instead.
# Source: https://docs.python.org/3/library/asyncio-subprocess.html
-#
-# NOTE: this is actually the default even loop in Python 3.9+.
loop = asyncio.ProactorEventLoop()
asyncio.set_event_loop(loop)
else:
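The hunk above explains why the add-on swaps in a ProactorEventLoop on Windows: the default SelectorEventLoop cannot run subprocesses there. A standalone sketch of that selection logic; the function name is invented for illustration.

```python
import asyncio
import sys

def new_blender_cloud_loop() -> asyncio.AbstractEventLoop:
    if sys.platform == "win32":
        # SelectorEventLoop cannot run subprocesses on Windows; ProactorEventLoop
        # can, and the removed NOTE above says it is the default on newer Pythons.
        return asyncio.ProactorEventLoop()
    return asyncio.new_event_loop()

asyncio.set_event_loop(new_blender_cloud_loop())
```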
@@ -55,12 +53,8 @@ def setup_asyncio_executor():
# loop.set_debug(True)
from . import pillar
-# Python 3.8 deprecated the 'loop' parameter, 3.10 removed it.
-kwargs = {"loop": loop} if sys.version_info < (3, 8) else {}
# No more than this many Pillar calls should be made simultaneously
-pillar.pillar_semaphore = asyncio.Semaphore(3, **kwargs)
+pillar.pillar_semaphore = asyncio.Semaphore(3, loop=loop)
def kick_async_loop(*args) -> bool:
@@ -76,23 +70,17 @@ def kick_async_loop(*args) -> bool:
stop_after_this_kick = False
if loop.is_closed():
-log.warning("loop closed, stopping immediately.")
+log.warning('loop closed, stopping immediately.')
return True
-# Passing an explicit loop is required. Without it, the function uses
-# asyncio.get_running_loop(), which raises a RuntimeError as the current
-# loop isn't running.
-all_tasks = asyncio.all_tasks(loop=loop)
+all_tasks = asyncio.Task.all_tasks()
if not len(all_tasks):
-log.debug("no more scheduled tasks, stopping after this kick.")
+log.debug('no more scheduled tasks, stopping after this kick.')
stop_after_this_kick = True
elif all(task.done() for task in all_tasks):
-log.debug(
-"all %i tasks are done, fetching results and stopping after this kick.",
-len(all_tasks),
-)
+log.debug('all %i tasks are done, fetching results and stopping after this kick.',
+len(all_tasks))
stop_after_this_kick = True
# Clean up circular references between tasks.
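The two hunks above mostly track asyncio API drift between the Python versions bundled with different Blender releases: the master branch guards the deprecated `loop` argument of `asyncio.Semaphore` and calls the module-level `asyncio.all_tasks()`, while version 1.12.0 still uses `asyncio.Semaphore(..., loop=...)` and `asyncio.Task.all_tasks()`. A hedged sketch of that compatibility pattern, with version boundaries taken from the removed comments; the function names are illustrative.

```python
import asyncio
import sys

def make_pillar_semaphore(loop: asyncio.AbstractEventLoop, value: int = 3) -> asyncio.Semaphore:
    # Only pass 'loop' on interpreters that still accept it (deprecated in 3.8,
    # removed in 3.10, per the removed comment above).
    kwargs = {"loop": loop} if sys.version_info < (3, 8) else {}
    return asyncio.Semaphore(value, **kwargs)

def pending_tasks(loop: asyncio.AbstractEventLoop) -> set:
    # asyncio.Task.all_tasks() is gone in newer Python releases; the module-level
    # asyncio.all_tasks(loop=loop) is the replacement the master branch uses.
    if hasattr(asyncio, "all_tasks"):
        return asyncio.all_tasks(loop=loop)
    return asyncio.Task.all_tasks(loop=loop)
```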
@@ -105,12 +93,12 @@ def kick_async_loop(*args) -> bool:
# noinspection PyBroadException # noinspection PyBroadException
try: try:
res = task.result() res = task.result()
log.debug(" task #%i: result=%r", task_idx, res) log.debug(' task #%i: result=%r', task_idx, res)
except asyncio.CancelledError: except asyncio.CancelledError:
# No problem, we want to stop anyway. # No problem, we want to stop anyway.
log.debug(" task #%i: cancelled", task_idx) log.debug(' task #%i: cancelled', task_idx)
except Exception: except Exception:
print("{}: resulted in exception".format(task)) print('{}: resulted in exception'.format(task))
traceback.print_exc() traceback.print_exc()
# for ref in gc.get_referrers(task): # for ref in gc.get_referrers(task):
@@ -123,26 +111,26 @@ def kick_async_loop(*args) -> bool:
def ensure_async_loop(): def ensure_async_loop():
log.debug("Starting asyncio loop") log.debug('Starting asyncio loop')
result = bpy.ops.asyncio.loop() result = bpy.ops.asyncio.loop()
log.debug("Result of starting modal operator is %r", result) log.debug('Result of starting modal operator is %r', result)
def erase_async_loop(): def erase_async_loop():
global _loop_kicking_operator_running global _loop_kicking_operator_running
log.debug("Erasing async loop") log.debug('Erasing async loop')
loop = asyncio.get_event_loop() loop = asyncio.get_event_loop()
loop.stop() loop.stop()
class AsyncLoopModalOperator(bpy.types.Operator): class AsyncLoopModalOperator(bpy.types.Operator):
bl_idname = "asyncio.loop" bl_idname = 'asyncio.loop'
bl_label = "Runs the asyncio main loop" bl_label = 'Runs the asyncio main loop'
timer = None timer = None
log = logging.getLogger(__name__ + ".AsyncLoopModalOperator") log = logging.getLogger(__name__ + '.AsyncLoopModalOperator')
def __del__(self): def __del__(self):
global _loop_kicking_operator_running global _loop_kicking_operator_running
@@ -159,8 +147,8 @@ class AsyncLoopModalOperator(bpy.types.Operator):
global _loop_kicking_operator_running global _loop_kicking_operator_running
if _loop_kicking_operator_running: if _loop_kicking_operator_running:
self.log.debug("Another loop-kicking operator is already running.") self.log.debug('Another loop-kicking operator is already running.')
return {"PASS_THROUGH"} return {'PASS_THROUGH'}
context.window_manager.modal_handler_add(self) context.window_manager.modal_handler_add(self)
_loop_kicking_operator_running = True _loop_kicking_operator_running = True
@@ -168,7 +156,7 @@ class AsyncLoopModalOperator(bpy.types.Operator):
wm = context.window_manager wm = context.window_manager
self.timer = wm.event_timer_add(0.00001, window=context.window) self.timer = wm.event_timer_add(0.00001, window=context.window)
return {"RUNNING_MODAL"} return {'RUNNING_MODAL'}
def modal(self, context, event): def modal(self, context, event):
global _loop_kicking_operator_running global _loop_kicking_operator_running
@@ -177,10 +165,10 @@ class AsyncLoopModalOperator(bpy.types.Operator):
# erase_async_loop(). This is a signal that we really should stop # erase_async_loop(). This is a signal that we really should stop
# running. # running.
if not _loop_kicking_operator_running: if not _loop_kicking_operator_running:
return {"FINISHED"} return {'FINISHED'}
if event.type != "TIMER": if event.type != 'TIMER':
return {"PASS_THROUGH"} return {'PASS_THROUGH'}
# self.log.debug('KICKING LOOP') # self.log.debug('KICKING LOOP')
stop_after_this_kick = kick_async_loop() stop_after_this_kick = kick_async_loop()
@@ -188,33 +176,29 @@ class AsyncLoopModalOperator(bpy.types.Operator):
context.window_manager.event_timer_remove(self.timer) context.window_manager.event_timer_remove(self.timer)
_loop_kicking_operator_running = False _loop_kicking_operator_running = False
self.log.debug("Stopped asyncio loop kicking") self.log.debug('Stopped asyncio loop kicking')
return {"FINISHED"} return {'FINISHED'}
return {"RUNNING_MODAL"} return {'RUNNING_MODAL'}
# noinspection PyAttributeOutsideInit # noinspection PyAttributeOutsideInit
class AsyncModalOperatorMixin: class AsyncModalOperatorMixin:
async_task = None # asyncio task for fetching thumbnails async_task = None # asyncio task for fetching thumbnails
signalling_future = ( signalling_future = None # asyncio future for signalling that we want to cancel everything.
None # asyncio future for signalling that we want to cancel everything. log = logging.getLogger('%s.AsyncModalOperatorMixin' % __name__)
)
log = logging.getLogger("%s.AsyncModalOperatorMixin" % __name__)
_state = "INITIALIZING" _state = 'INITIALIZING'
stop_upon_exception = False stop_upon_exception = False
def invoke(self, context, event): def invoke(self, context, event):
context.window_manager.modal_handler_add(self) context.window_manager.modal_handler_add(self)
self.timer = context.window_manager.event_timer_add( self.timer = context.window_manager.event_timer_add(1 / 15, window=context.window)
1 / 15, window=context.window
)
self.log.info("Starting") self.log.info('Starting')
self._new_async_task(self.async_execute(context)) self._new_async_task(self.async_execute(context))
return {"RUNNING_MODAL"} return {'RUNNING_MODAL'}
async def async_execute(self, context): async def async_execute(self, context):
"""Entry point of the asynchronous operator. """Entry point of the asynchronous operator.
@@ -225,7 +209,7 @@ class AsyncModalOperatorMixin:
def quit(self): def quit(self):
"""Signals the state machine to stop this operator from running.""" """Signals the state machine to stop this operator from running."""
self._state = "QUIT" self._state = 'QUIT'
def execute(self, context): def execute(self, context):
return self.invoke(context, None) return self.invoke(context, None)
@@ -233,50 +217,46 @@ class AsyncModalOperatorMixin:
def modal(self, context, event): def modal(self, context, event):
task = self.async_task task = self.async_task
if self._state != "EXCEPTION" and task and task.done() and not task.cancelled(): if self._state != 'EXCEPTION' and task and task.done() and not task.cancelled():
ex = task.exception() ex = task.exception()
if ex is not None: if ex is not None:
self._state = "EXCEPTION" self._state = 'EXCEPTION'
self.log.error("Exception while running task: %s", ex) self.log.error('Exception while running task: %s', ex)
if self.stop_upon_exception: if self.stop_upon_exception:
self.quit() self.quit()
self._finish(context) self._finish(context)
return {"FINISHED"} return {'FINISHED'}
return {"RUNNING_MODAL"} return {'RUNNING_MODAL'}
if self._state == "QUIT": if self._state == 'QUIT':
self._finish(context) self._finish(context)
return {"FINISHED"} return {'FINISHED'}
return {"PASS_THROUGH"} return {'PASS_THROUGH'}
def _finish(self, context): def _finish(self, context):
self._stop_async_task() self._stop_async_task()
context.window_manager.event_timer_remove(self.timer) context.window_manager.event_timer_remove(self.timer)
def _new_async_task( def _new_async_task(self, async_task: typing.Coroutine, future: asyncio.Future = None):
self, async_task: typing.Coroutine, future: asyncio.Future = None
):
"""Stops the currently running async task, and starts another one.""" """Stops the currently running async task, and starts another one."""
self.log.debug( self.log.debug('Setting up a new task %r, so any existing task must be stopped', async_task)
"Setting up a new task %r, so any existing task must be stopped", async_task
)
self._stop_async_task() self._stop_async_task()
# Download the previews asynchronously. # Download the previews asynchronously.
self.signalling_future = future or asyncio.Future() self.signalling_future = future or asyncio.Future()
self.async_task = asyncio.ensure_future(async_task) self.async_task = asyncio.ensure_future(async_task)
self.log.debug("Created new task %r", self.async_task) self.log.debug('Created new task %r', self.async_task)
# Start the async manager so everything happens. # Start the async manager so everything happens.
ensure_async_loop() ensure_async_loop()
def _stop_async_task(self): def _stop_async_task(self):
self.log.debug("Stopping async task") self.log.debug('Stopping async task')
if self.async_task is None: if self.async_task is None:
self.log.debug("No async task, trivially stopped") self.log.debug('No async task, trivially stopped')
return return
# Signal that we want to stop. # Signal that we want to stop.
@@ -292,14 +272,14 @@ class AsyncModalOperatorMixin:
try:
loop.run_until_complete(self.async_task)
except asyncio.CancelledError:
-self.log.info("Asynchronous task was cancelled")
+self.log.info('Asynchronous task was cancelled')
return
# noinspection PyBroadException
try:
self.async_task.result() # This re-raises any exception of the task.
except asyncio.CancelledError:
-self.log.info("Asynchronous task was cancelled")
+self.log.info('Asynchronous task was cancelled')
except Exception:
self.log.exception("Exception from asynchronous task")
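To make the mixin's control flow concrete, here is a hypothetical operator built on it; the class name, idname, and coroutine body are invented, and the import path assumes the module is reachable as `blender_cloud.async_loop`. Registering it with `bpy.utils.register_class()` and invoking it starts the coroutine on the shared loop; `quit()` tells the modal state machine to finish.

```python
import asyncio

import bpy

from blender_cloud import async_loop  # assumed import path


class EXAMPLE_OT_async_demo(async_loop.AsyncModalOperatorMixin, bpy.types.Operator):
    bl_idname = "example.async_demo"
    bl_label = "Async Demo"
    stop_upon_exception = True  # finish the operator if the coroutine raises

    async def async_execute(self, context):
        # Runs on the asyncio loop kicked by the modal timer.
        await asyncio.sleep(1.0)
        self.report({'INFO'}, "Async work finished")
        self.quit()
```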

File diff suppressed because it is too large

View File

@@ -29,17 +29,17 @@ log = logging.getLogger(__name__)
strip_status_colour = { strip_status_colour = {
None: (0.7, 0.7, 0.7), None: (0.7, 0.7, 0.7),
"approved": (0.6392156862745098, 0.8784313725490196, 0.30196078431372547), 'approved': (0.6392156862745098, 0.8784313725490196, 0.30196078431372547),
"final": (0.9058823529411765, 0.9607843137254902, 0.8274509803921568), 'final': (0.9058823529411765, 0.9607843137254902, 0.8274509803921568),
"in_progress": (1.0, 0.7450980392156863, 0.0), 'in_progress': (1.0, 0.7450980392156863, 0.0),
"on_hold": (0.796078431372549, 0.6196078431372549, 0.08235294117647059), 'on_hold': (0.796078431372549, 0.6196078431372549, 0.08235294117647059),
"review": (0.8941176470588236, 0.9607843137254902, 0.9764705882352941), 'review': (0.8941176470588236, 0.9607843137254902, 0.9764705882352941),
"todo": (1.0, 0.5019607843137255, 0.5019607843137255), 'todo': (1.0, 0.5019607843137255, 0.5019607843137255)
} }
CONFLICT_COLOUR = (0.576, 0.118, 0.035, 1.0) # RGBA tuple CONFLICT_COLOUR = (0.576, 0.118, 0.035, 1.0) # RGBA tuple
gpu_vertex_shader = """ gpu_vertex_shader = '''
uniform mat4 ModelViewProjectionMatrix; uniform mat4 ModelViewProjectionMatrix;
layout (location = 0) in vec2 pos; layout (location = 0) in vec2 pos;
@@ -52,9 +52,9 @@ void main()
gl_Position = ModelViewProjectionMatrix * vec4(pos.x, pos.y, 0.0, 1.0); gl_Position = ModelViewProjectionMatrix * vec4(pos.x, pos.y, 0.0, 1.0);
lineColor = color; lineColor = color;
} }
""" '''
gpu_fragment_shader = """ gpu_fragment_shader = '''
out vec4 fragColor; out vec4 fragColor;
in vec4 lineColor; in vec4 lineColor;
@@ -62,7 +62,7 @@ void main()
{ {
fragColor = lineColor; fragColor = lineColor;
} }
""" '''
Float2 = typing.Tuple[float, float] Float2 = typing.Tuple[float, float]
Float3 = typing.Tuple[float, float, float] Float3 = typing.Tuple[float, float, float]
@@ -70,18 +70,25 @@ Float4 = typing.Tuple[float, float, float, float]
class AttractLineDrawer: class AttractLineDrawer:
def __init__(self): def __init__(self):
self._format = gpu.types.GPUVertFormat() self._format = gpu.types.GPUVertFormat()
self._pos_id = self._format.attr_add( self._pos_id = self._format.attr_add(
id="pos", comp_type="F32", len=2, fetch_mode="FLOAT" id="pos",
) comp_type="F32",
len=2,
fetch_mode="FLOAT")
self._color_id = self._format.attr_add( self._color_id = self._format.attr_add(
id="color", comp_type="F32", len=4, fetch_mode="FLOAT" id="color",
) comp_type="F32",
len=4,
fetch_mode="FLOAT")
self.shader = gpu.types.GPUShader(gpu_vertex_shader, gpu_fragment_shader) self.shader = gpu.types.GPUShader(gpu_vertex_shader, gpu_fragment_shader)
def draw(self, coords: typing.List[Float2], colors: typing.List[Float4]): def draw(self,
coords: typing.List[Float2],
colors: typing.List[Float4]):
if not coords: if not coords:
return return
@@ -107,13 +114,11 @@ def get_strip_rectf(strip) -> Float4:
return x1, y1, x2, y2 return x1, y1, x2, y2
def underline_in_strip( def underline_in_strip(strip_coords: Float4,
strip_coords: Float4, pixel_size_x: float,
pixel_size_x: float, color: Float4,
color: Float4, out_coords: typing.List[Float2],
out_coords: typing.List[Float2], out_colors: typing.List[Float4]):
out_colors: typing.List[Float4],
):
# Strip coords # Strip coords
s_x1, s_y1, s_x2, s_y2 = strip_coords s_x1, s_y1, s_x2, s_y2 = strip_coords
@@ -137,11 +142,9 @@ def underline_in_strip(
out_colors.append(color) out_colors.append(color)
def strip_conflict( def strip_conflict(strip_coords: Float4,
strip_coords: Float4, out_coords: typing.List[Float2],
out_coords: typing.List[Float2], out_colors: typing.List[Float4]):
out_colors: typing.List[Float4],
):
"""Draws conflicting states between strips.""" """Draws conflicting states between strips."""
s_x1, s_y1, s_x2, s_y2 = strip_coords s_x1, s_y1, s_x2, s_y2 = strip_coords
@@ -188,12 +191,8 @@ def draw_callback_px(line_drawer: AttractLineDrawer):
strip_coords = get_strip_rectf(strip) strip_coords = get_strip_rectf(strip)
# check if any of the coordinates are out of bounds # check if any of the coordinates are out of bounds
if ( if strip_coords[0] > xwin2 or strip_coords[2] < xwin1 or strip_coords[1] > ywin2 or \
strip_coords[0] > xwin2 strip_coords[3] < ywin1:
or strip_coords[2] < xwin1
or strip_coords[1] > ywin2
or strip_coords[3] < ywin1
):
continue continue
# Draw # Draw
@@ -218,9 +217,9 @@ def tag_redraw_all_sequencer_editors():
# Py cant access notifiers # Py cant access notifiers
for window in context.window_manager.windows: for window in context.window_manager.windows:
for area in window.screen.areas: for area in window.screen.areas:
if area.type == "SEQUENCE_EDITOR": if area.type == 'SEQUENCE_EDITOR':
for region in area.regions: for region in area.regions:
if region.type == "WINDOW": if region.type == 'WINDOW':
region.tag_redraw() region.tag_redraw()
@@ -238,11 +237,8 @@ def callback_enable():
return return
line_drawer = AttractLineDrawer() line_drawer = AttractLineDrawer()
cb_handle[:] = ( cb_handle[:] = bpy.types.SpaceSequenceEditor.draw_handler_add(
bpy.types.SpaceSequenceEditor.draw_handler_add( draw_callback_px, (line_drawer,), 'WINDOW', 'POST_VIEW'),
draw_callback_px, (line_drawer,), "WINDOW", "POST_VIEW"
),
)
tag_redraw_all_sequencer_editors() tag_redraw_all_sequencer_editors()
@@ -252,7 +248,7 @@ def callback_disable():
return return
try: try:
bpy.types.SpaceSequenceEditor.draw_handler_remove(cb_handle[0], "WINDOW") bpy.types.SpaceSequenceEditor.draw_handler_remove(cb_handle[0], 'WINDOW')
except ValueError: except ValueError:
# Thrown when already removed. # Thrown when already removed.
pass pass
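callback_enable() and callback_disable() above keep the draw-handler handle in a module-level list, so it can be updated without `global` statements. A stripped-down sketch of that bookkeeping, with illustrative names:

```python
import bpy

_handle = []  # mutable list: updating its contents needs no `global` statement


def _draw_callback():
    pass  # actual drawing code goes here


def enable_callback():
    if _handle:
        return
    _handle[:] = (
        bpy.types.SpaceSequenceEditor.draw_handler_add(
            _draw_callback, (), 'WINDOW', 'POST_VIEW'),
    )


def disable_callback():
    if not _handle:
        return
    try:
        bpy.types.SpaceSequenceEditor.draw_handler_remove(_handle[0], 'WINDOW')
    except ValueError:
        pass  # already removed
    _handle.clear()
```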

View File

@@ -0,0 +1,182 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####
# <pep8 compliant>
import bpy
import logging
import collections
log = logging.getLogger(__name__)
strip_status_colour = {
None: (0.7, 0.7, 0.7),
'approved': (0.6392156862745098, 0.8784313725490196, 0.30196078431372547),
'final': (0.9058823529411765, 0.9607843137254902, 0.8274509803921568),
'in_progress': (1.0, 0.7450980392156863, 0.0),
'on_hold': (0.796078431372549, 0.6196078431372549, 0.08235294117647059),
'review': (0.8941176470588236, 0.9607843137254902, 0.9764705882352941),
'todo': (1.0, 0.5019607843137255, 0.5019607843137255)
}
CONFLICT_COLOUR = (0.576, 0.118, 0.035) # RGB tuple
def get_strip_rectf(strip):
# Get x and y in terms of the grid's frames and channels
x1 = strip.frame_final_start
x2 = strip.frame_final_end
y1 = strip.channel + 0.2
y2 = strip.channel - 0.2 + 1
return x1, y1, x2, y2
def draw_underline_in_strip(strip_coords, pixel_size_x, color):
from bgl import glColor4f, glRectf, glEnable, glDisable, GL_BLEND
import bgl
context = bpy.context
# Strip coords
s_x1, s_y1, s_x2, s_y2 = strip_coords
# be careful not to draw over the current frame line
cf_x = context.scene.frame_current_final
bgl.glPushAttrib(bgl.GL_COLOR_BUFFER_BIT | bgl.GL_LINE_BIT)
glColor4f(*color)
glEnable(GL_BLEND)
bgl.glLineWidth(2)
bgl.glBegin(bgl.GL_LINES)
bgl.glVertex2f(s_x1, s_y1)
if s_x1 < cf_x < s_x2:
# Bad luck, the line passes our strip
bgl.glVertex2f(cf_x - pixel_size_x, s_y1)
bgl.glVertex2f(cf_x + pixel_size_x, s_y1)
bgl.glVertex2f(s_x2, s_y1)
bgl.glEnd()
bgl.glPopAttrib()
def draw_strip_conflict(strip_coords, pixel_size_x):
"""Draws conflicting states between strips."""
import bgl
s_x1, s_y1, s_x2, s_y2 = strip_coords
bgl.glPushAttrib(bgl.GL_COLOR_BUFFER_BIT | bgl.GL_LINE_BIT)
# Always draw the full rectangle, the conflict should be resolved and thus stand out.
bgl.glColor3f(*CONFLICT_COLOUR)
bgl.glLineWidth(2)
bgl.glBegin(bgl.GL_LINE_LOOP)
bgl.glVertex2f(s_x1, s_y1)
bgl.glVertex2f(s_x2, s_y1)
bgl.glVertex2f(s_x2, s_y2)
bgl.glVertex2f(s_x1, s_y2)
bgl.glEnd()
bgl.glPopAttrib()
def draw_callback_px():
context = bpy.context
if not context.scene.sequence_editor:
return
from . import shown_strips
region = context.region
xwin1, ywin1 = region.view2d.region_to_view(0, 0)
xwin2, ywin2 = region.view2d.region_to_view(region.width, region.height)
one_pixel_further_x, one_pixel_further_y = region.view2d.region_to_view(1, 1)
pixel_size_x = one_pixel_further_x - xwin1
strips = shown_strips(context)
for strip in strips:
if not strip.atc_object_id:
continue
# Get corners (x1, y1), (x2, y2) of the strip rectangle in px region coords
strip_coords = get_strip_rectf(strip)
# check if any of the coordinates are out of bounds
if strip_coords[0] > xwin2 or strip_coords[2] < xwin1 or strip_coords[1] > ywin2 or \
strip_coords[3] < ywin1:
continue
# Draw
status = strip.atc_status
if status in strip_status_colour:
color = strip_status_colour[status]
else:
color = strip_status_colour[None]
alpha = 1.0 if strip.atc_is_synced else 0.5
draw_underline_in_strip(strip_coords, pixel_size_x, color + (alpha,))
if strip.atc_is_synced and strip.atc_object_id_conflict:
draw_strip_conflict(strip_coords, pixel_size_x)
def tag_redraw_all_sequencer_editors():
context = bpy.context
# Py cant access notifiers
for window in context.window_manager.windows:
for area in window.screen.areas:
if area.type == 'SEQUENCE_EDITOR':
for region in area.regions:
if region.type == 'WINDOW':
region.tag_redraw()
# This is a list so it can be changed instead of set
# if it is only changed, it does not have to be declared as a global everywhere
cb_handle = []
def callback_enable():
if cb_handle:
return
cb_handle[:] = bpy.types.SpaceSequenceEditor.draw_handler_add(
draw_callback_px, (), 'WINDOW', 'POST_VIEW'),
tag_redraw_all_sequencer_editors()
def callback_disable():
if not cb_handle:
return
try:
bpy.types.SpaceSequenceEditor.draw_handler_remove(cb_handle[0], 'WINDOW')
except ValueError:
# Thrown when already removed.
pass
cb_handle.clear()
tag_redraw_all_sequencer_editors()

View File

@@ -27,73 +27,77 @@ import tempfile
import bpy import bpy
from bpy.types import AddonPreferences, Operator, WindowManager, Scene, PropertyGroup from bpy.types import AddonPreferences, Operator, WindowManager, Scene, PropertyGroup
from bpy.props import ( from bpy.props import StringProperty, EnumProperty, PointerProperty, BoolProperty, IntProperty
StringProperty,
EnumProperty,
PointerProperty,
BoolProperty,
IntProperty,
)
import rna_prop_ui import rna_prop_ui
from . import pillar, async_loop, flamenco, project_specific from . import pillar, async_loop, flamenco, project_specific
from .utils import pyside_cache, redraw from .utils import pyside_cache, redraw
PILLAR_WEB_SERVER_URL = os.environ.get("BCLOUD_SERVER", "https://cloud.blender.org/") PILLAR_WEB_SERVER_URL = os.environ.get('BCLOUD_SERVER', 'https://cloud.blender.org/')
PILLAR_SERVER_URL = "%sapi/" % PILLAR_WEB_SERVER_URL PILLAR_SERVER_URL = '%sapi/' % PILLAR_WEB_SERVER_URL
ADDON_NAME = "blender_cloud" ADDON_NAME = 'blender_cloud'
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
icons = None icons = None
if bpy.app.version < (2, 80):
SYNC_SELECT_VERSION_ICON = 'DOTSDOWN'
else:
SYNC_SELECT_VERSION_ICON = 'DOWNARROW_HLT'
@pyside_cache
@functools.lru_cache()
def factor(factor: float) -> dict:
"""Construct keyword argument for UILayout.split().
On Blender 2.8 this returns {'factor': factor}, and on earlier Blenders it returns
{'percentage': factor}.
"""
if bpy.app.version < (2, 80, 0):
return {'percentage': factor}
return {'factor': factor}
@pyside_cache('version')
def blender_syncable_versions(self, context): def blender_syncable_versions(self, context):
"""Returns the list of items used by SyncStatusProperties.version EnumProperty.""" """Returns the list of items used by SyncStatusProperties.version EnumProperty."""
bss = context.window_manager.blender_sync_status bss = context.window_manager.blender_sync_status
versions = bss.available_blender_versions versions = bss.available_blender_versions
if not versions: if not versions:
return [("", "No settings stored in your Blender Cloud", "")] return [('', 'No settings stored in your Blender Cloud', '')]
return [(v, v, "") for v in versions] return [(v, v, '') for v in versions]
class SyncStatusProperties(PropertyGroup): class SyncStatusProperties(PropertyGroup):
status: EnumProperty( status = EnumProperty(
items=[ items=[
("NONE", "NONE", "We have done nothing at all yet."), ('NONE', 'NONE', 'We have done nothing at all yet.'),
( ('IDLE', 'IDLE', 'User requested something, which is done, and we are now idle.'),
"IDLE", ('SYNCING', 'SYNCING', 'Synchronising with Blender Cloud.'),
"IDLE",
"User requested something, which is done, and we are now idle.",
),
("SYNCING", "SYNCING", "Synchronising with Blender Cloud."),
], ],
name="status", name='status',
description="Current status of Blender Sync", description='Current status of Blender Sync',
update=redraw, update=redraw)
)
version: EnumProperty( version = EnumProperty(
items=blender_syncable_versions, items=blender_syncable_versions,
name="Version of Blender from which to pull", name='Version of Blender from which to pull',
description="Version of Blender from which to pull", description='Version of Blender from which to pull')
)
message: StringProperty(name="message", update=redraw) message = StringProperty(name='message', update=redraw)
level: EnumProperty( level = EnumProperty(
items=[ items=[
("INFO", "INFO", ""), ('INFO', 'INFO', ''),
("WARNING", "WARNING", ""), ('WARNING', 'WARNING', ''),
("ERROR", "ERROR", ""), ('ERROR', 'ERROR', ''),
("SUBSCRIBE", "SUBSCRIBE", ""), ('SUBSCRIBE', 'SUBSCRIBE', ''),
], ],
name="level", name='level',
update=redraw, update=redraw)
)
def report(self, level: set, message: str): def report(self, level: set, message: str):
assert len(level) == 1, "level should be a set of one string, not %r" % level assert len(level) == 1, 'level should be a set of one string, not %r' % level
self.level = level.pop() self.level = level.pop()
self.message = message self.message = message
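On the version-1.12.0 side, the hunk above introduces the `SYNC_SELECT_VERSION_ICON` constant and the `factor()` helper so the same UI code can call `UILayout.split()` on both Blender 2.79 (`percentage`) and 2.80 (`factor`). A hypothetical draw() snippet showing the intended use, assuming it lives in the same module as those helpers:

```python
def draw(self, context):
    layout = self.layout
    # 'percentage' on Blender 2.79, 'factor' on 2.80+, hidden behind factor().
    split = layout.split(**factor(0.3))
    split.label(text="Blender Sync")
    split.operator("wm.url_open", text="Open Blender Cloud",
                   icon=SYNC_SELECT_VERSION_ICON).url = "https://cloud.blender.org/"
```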
@@ -110,21 +114,21 @@ class SyncStatusProperties(PropertyGroup):
# because I don't know how to store a variable list of strings in a proper RNA property. # because I don't know how to store a variable list of strings in a proper RNA property.
@property @property
def available_blender_versions(self) -> list: def available_blender_versions(self) -> list:
return self.get("available_blender_versions", []) return self.get('available_blender_versions', [])
@available_blender_versions.setter @available_blender_versions.setter
def available_blender_versions(self, new_versions): def available_blender_versions(self, new_versions):
self["available_blender_versions"] = new_versions self['available_blender_versions'] = new_versions
@pyside_cache @pyside_cache('project')
def bcloud_available_projects(self, context): def bcloud_available_projects(self, context):
"""Returns the list of items used by BlenderCloudProjectGroup.project EnumProperty.""" """Returns the list of items used by BlenderCloudProjectGroup.project EnumProperty."""
projs = preferences().project.available_projects projs = preferences().project.available_projects
if not projs: if not projs:
return [("", "No projects available in your Blender Cloud", "")] return [('', 'No projects available in your Blender Cloud', '')]
return [(p["_id"], p["name"], "") for p in projs] return [(p['_id'], p['name'], '') for p in projs]
@functools.lru_cache(1) @functools.lru_cache(1)
@@ -134,54 +138,50 @@ def project_extensions(project_id) -> set:
At the moment of writing these are 'attract' and 'flamenco'. At the moment of writing these are 'attract' and 'flamenco'.
""" """
log.debug("Finding extensions for project %s", project_id) log.debug('Finding extensions for project %s', project_id)
# We can't use our @property, since the preferences may be loaded from a # We can't use our @property, since the preferences may be loaded from a
# preferences blend file, in which case it is not constructed from Python code. # preferences blend file, in which case it is not constructed from Python code.
available_projects = preferences().project.get("available_projects", []) available_projects = preferences().project.get('available_projects', [])
if not available_projects: if not available_projects:
log.debug("No projects available.") log.debug('No projects available.')
return set() return set()
proj = next((p for p in available_projects if p["_id"] == project_id), None) proj = next((p for p in available_projects
if p['_id'] == project_id), None)
if proj is None: if proj is None:
log.debug("Project %s not found in available projects.", project_id) log.debug('Project %s not found in available projects.', project_id)
return set() return set()
return set(proj.get("enabled_for", ())) return set(proj.get('enabled_for', ()))
class BlenderCloudProjectGroup(PropertyGroup): class BlenderCloudProjectGroup(PropertyGroup):
status: EnumProperty( status = EnumProperty(
items=[ items=[
("NONE", "NONE", "We have done nothing at all yet"), ('NONE', 'NONE', 'We have done nothing at all yet'),
( ('IDLE', 'IDLE', 'User requested something, which is done, and we are now idle'),
"IDLE", ('FETCHING', 'FETCHING', 'Fetching available projects from Blender Cloud'),
"IDLE",
"User requested something, which is done, and we are now idle",
),
("FETCHING", "FETCHING", "Fetching available projects from Blender Cloud"),
], ],
name="status", name='status',
update=redraw, update=redraw)
)
project: EnumProperty( project = EnumProperty(
items=bcloud_available_projects, items=bcloud_available_projects,
name="Cloud project", name='Cloud project',
description="Which Blender Cloud project to work with", description='Which Blender Cloud project to work with',
update=project_specific.handle_project_update, update=project_specific.handle_project_update
) )
# List of projects is stored in 'available_projects' ID property, # List of projects is stored in 'available_projects' ID property,
# because I don't know how to store a variable list of strings in a proper RNA property. # because I don't know how to store a variable list of strings in a proper RNA property.
@property @property
def available_projects(self) -> list: def available_projects(self) -> list:
return self.get("available_projects", []) return self.get('available_projects', [])
@available_projects.setter @available_projects.setter
def available_projects(self, new_projects): def available_projects(self, new_projects):
self["available_projects"] = new_projects self['available_projects'] = new_projects
project_specific.handle_project_update() project_specific.handle_project_update()
@ -190,89 +190,88 @@ class BlenderCloudPreferences(AddonPreferences):
# The following property is read-only to limit the scope of the # The following property is read-only to limit the scope of the
# addon and allow for proper testing within this scope. # addon and allow for proper testing within this scope.
pillar_server: StringProperty( pillar_server = StringProperty(
name="Blender Cloud Server", name='Blender Cloud Server',
description="URL of the Blender Cloud backend server", description='URL of the Blender Cloud backend server',
default=PILLAR_SERVER_URL, default=PILLAR_SERVER_URL,
get=lambda self: PILLAR_SERVER_URL, get=lambda self: PILLAR_SERVER_URL
) )
local_texture_dir: StringProperty( local_texture_dir = StringProperty(
name="Default Blender Cloud Texture Storage Directory", name='Default Blender Cloud Texture Storage Directory',
subtype="DIR_PATH", subtype='DIR_PATH',
default="//textures", default='//textures')
)
open_browser_after_share: BoolProperty( open_browser_after_share = BoolProperty(
name="Open Browser after Sharing File", name='Open Browser after Sharing File',
description="When enabled, Blender will open a webbrowser", description='When enabled, Blender will open a webbrowser',
default=True, default=True
) )
# TODO: store project-dependent properties with the project, so that people # TODO: store project-dependent properties with the project, so that people
# can switch projects and the Attract and Flamenco properties switch with it. # can switch projects and the Attract and Flamenco properties switch with it.
project: PointerProperty(type=BlenderCloudProjectGroup) project = PointerProperty(type=BlenderCloudProjectGroup)
cloud_project_local_path: StringProperty( cloud_project_local_path = StringProperty(
name="Local Project Path", name='Local Project Path',
description="Local path of your Attract project, used to search for blend files; " description='Local path of your Attract project, used to search for blend files; '
"usually best to set to an absolute path", 'usually best to set to an absolute path',
subtype="DIR_PATH", subtype='DIR_PATH',
default="//../", default='//../',
update=project_specific.store, update=project_specific.store,
) )
flamenco_manager: PointerProperty(type=flamenco.FlamencoManagerGroup) flamenco_manager = PointerProperty(type=flamenco.FlamencoManagerGroup)
flamenco_exclude_filter: StringProperty( flamenco_exclude_filter = StringProperty(
name="File Exclude Filter", name='File Exclude Filter',
description='Space-separated list of filename filters, like "*.abc *.mkv", to prevent ' description='Space-separated list of filename filters, like "*.abc *.mkv", to prevent '
"matching files from being packed into the output directory", 'matching files from being packed into the output directory',
default="", default='',
update=project_specific.store, update=project_specific.store,
) )
flamenco_job_file_path: StringProperty( flamenco_job_file_path = StringProperty(
name="Job Storage Path", name='Job Storage Path',
description="Path where to store job files, should be accesible for Workers too", description='Path where to store job files, should be accesible for Workers too',
subtype="DIR_PATH", subtype='DIR_PATH',
default=tempfile.gettempdir(), default=tempfile.gettempdir(),
update=project_specific.store, update=project_specific.store,
) )
flamenco_job_output_path: StringProperty( flamenco_job_output_path = StringProperty(
name="Job Output Path", name='Job Output Path',
description="Path where to store output files, should be accessible for Workers", description='Path where to store output files, should be accessible for Workers',
subtype="DIR_PATH", subtype='DIR_PATH',
default=tempfile.gettempdir(), default=tempfile.gettempdir(),
update=project_specific.store, update=project_specific.store,
) )
flamenco_job_output_strip_components: IntProperty( flamenco_job_output_strip_components = IntProperty(
name="Job Output Path Strip Components", name='Job Output Path Strip Components',
description="The final output path comprises of the job output path, and the blend file " description='The final output path comprises of the job output path, and the blend file '
"path relative to the project with this many path components stripped off " 'path relative to the project with this many path components stripped off '
"the front", 'the front',
min=0, min=0,
default=0, default=0,
soft_max=4, soft_max=4,
update=project_specific.store, update=project_specific.store,
) )
flamenco_relative_only: BoolProperty( flamenco_relative_only = BoolProperty(
name="Relative Paths Only", name='Relative Paths Only',
description="When enabled, only assets that are referred to with a relative path are " description='When enabled, only assets that are referred to with a relative path are '
"packed, and assets referred to by an absolute path are excluded from the " 'packed, and assets referred to by an absolute path are excluded from the '
"BAT pack. When disabled, all assets are packed", 'BAT pack. When disabled, all assets are packed.',
default=False, default=False,
update=project_specific.store, update=project_specific.store,
) )
flamenco_open_browser_after_submit: BoolProperty( flamenco_open_browser_after_submit = BoolProperty(
name="Open Browser after Submitting Job", name='Open Browser after Submitting Job',
description="When enabled, Blender will open a webbrowser", description='When enabled, Blender will open a webbrowser',
default=True, default=True,
) )
flamenco_show_quit_after_submit_button: BoolProperty( flamenco_show_quit_after_submit_button = BoolProperty(
name='Show "Submit & Quit" button', name='Show "Submit & Quit" button',
description='When enabled, next to the "Render on Flamenco" button there will be a button ' description='When enabled, next to the "Render on Flamenco" button there will be a button '
'"Submit & Quit" that silently quits Blender after submitting the render job ' '"Submit & Quit" that silently quits Blender after submitting the render job '
"to Flamenco", 'to Flamenco',
default=False, default=False,
) )
@ -291,30 +290,24 @@ class BlenderCloudPreferences(AddonPreferences):
blender_id_profile = blender_id.get_active_profile() blender_id_profile = blender_id.get_active_profile()
if blender_id is None: if blender_id is None:
msg_icon = "ERROR" msg_icon = 'ERROR'
text = "This add-on requires Blender ID" text = 'This add-on requires Blender ID'
help_text = ( help_text = 'Make sure that the Blender ID add-on is installed and activated'
"Make sure that the Blender ID add-on is installed and activated"
)
elif not blender_id_profile: elif not blender_id_profile:
msg_icon = "ERROR" msg_icon = 'ERROR'
text = "You are logged out." text = 'You are logged out.'
help_text = "To login, go to the Blender ID add-on preferences." help_text = 'To login, go to the Blender ID add-on preferences.'
elif bpy.app.debug and pillar.SUBCLIENT_ID not in blender_id_profile.subclients: elif bpy.app.debug and pillar.SUBCLIENT_ID not in blender_id_profile.subclients:
msg_icon = "QUESTION" msg_icon = 'QUESTION'
text = "No Blender Cloud credentials." text = 'No Blender Cloud credentials.'
help_text = ( help_text = ('You are logged in on Blender ID, but your credentials have not '
"You are logged in on Blender ID, but your credentials have not " 'been synchronized with Blender Cloud yet. Press the Update '
"been synchronized with Blender Cloud yet. Press the Update " 'Credentials button.')
"Credentials button."
)
else: else:
msg_icon = "WORLD_DATA" msg_icon = 'WORLD_DATA'
text = "You are logged in as %s." % blender_id_profile.username text = 'You are logged in as %s.' % blender_id_profile.username
help_text = ( help_text = ('To logout or change profile, '
"To logout or change profile, " 'go to the Blender ID add-on preferences.')
"go to the Blender ID add-on preferences."
)
# Authentication stuff # Authentication stuff
auth_box = layout.box() auth_box = layout.box()
@ -328,175 +321,166 @@ class BlenderCloudPreferences(AddonPreferences):
# Texture browser stuff # Texture browser stuff
texture_box = layout.box() texture_box = layout.box()
texture_box.enabled = msg_icon != "ERROR" texture_box.enabled = msg_icon != 'ERROR'
sub = texture_box.column() sub = texture_box.column()
sub.label( sub.label(text='Local directory for downloaded textures', icon_value=icon('CLOUD'))
text="Local directory for downloaded textures", icon_value=icon("CLOUD") sub.prop(self, "local_texture_dir", text='Default')
) sub.prop(context.scene, "local_texture_dir", text='Current scene')
sub.prop(self, "local_texture_dir", text="Default")
sub.prop(context.scene, "local_texture_dir", text="Current scene")
# Blender Sync stuff # Blender Sync stuff
bss = context.window_manager.blender_sync_status bss = context.window_manager.blender_sync_status
bsync_box = layout.box() bsync_box = layout.box()
bsync_box.enabled = msg_icon != "ERROR" bsync_box.enabled = msg_icon != 'ERROR'
row = bsync_box.row().split(factor=0.33) row = bsync_box.row().split(**factor(0.33))
row.label(text="Blender Sync with Blender Cloud", icon_value=icon("CLOUD")) row.label(text='Blender Sync with Blender Cloud', icon_value=icon('CLOUD'))
icon_for_level = { icon_for_level = {
"INFO": "NONE", 'INFO': 'NONE',
"WARNING": "INFO", 'WARNING': 'INFO',
"ERROR": "ERROR", 'ERROR': 'ERROR',
"SUBSCRIBE": "ERROR", 'SUBSCRIBE': 'ERROR',
} }
msg_icon = icon_for_level[bss.level] if bss.message else "NONE" msg_icon = icon_for_level[bss.level] if bss.message else 'NONE'
message_container = row.row() message_container = row.row()
message_container.label(text=bss.message, icon=msg_icon) message_container.label(text=bss.message, icon=msg_icon)
sub = bsync_box.column() sub = bsync_box.column()
if bss.level == "SUBSCRIBE": if bss.level == 'SUBSCRIBE':
self.draw_subscribe_button(sub) self.draw_subscribe_button(sub)
self.draw_sync_buttons(sub, bss) self.draw_sync_buttons(sub, bss)
# Image Share stuff # Image Share stuff
share_box = layout.box() share_box = layout.box()
share_box.label(text="Image Sharing on Blender Cloud", icon_value=icon("CLOUD")) share_box.label(text='Image Sharing on Blender Cloud', icon_value=icon('CLOUD'))
share_box.prop(self, "open_browser_after_share") share_box.prop(self, 'open_browser_after_share')
# Project selector # Project selector
project_box = layout.box() project_box = layout.box()
project_box.enabled = self.project.status in {"NONE", "IDLE"} project_box.enabled = self.project.status in {'NONE', 'IDLE'}
self.draw_project_selector(project_box, self.project) self.draw_project_selector(project_box, self.project)
extensions = project_extensions(self.project.project) extensions = project_extensions(self.project.project)
# Flamenco stuff # Flamenco stuff
if "flamenco" in extensions: if 'flamenco' in extensions:
flamenco_box = project_box.column() flamenco_box = project_box.column()
self.draw_flamenco_buttons(flamenco_box, self.flamenco_manager, context) self.draw_flamenco_buttons(flamenco_box, self.flamenco_manager, context)
def draw_subscribe_button(self, layout): def draw_subscribe_button(self, layout):
layout.operator("pillar.subscribe", icon="WORLD") layout.operator('pillar.subscribe', icon='WORLD')
def draw_sync_buttons(self, layout, bss): def draw_sync_buttons(self, layout, bss):
layout.enabled = bss.status in {"NONE", "IDLE"} layout.enabled = bss.status in {'NONE', 'IDLE'}
buttons = layout.column() buttons = layout.column()
row_buttons = buttons.row().split(factor=0.5) row_buttons = buttons.row().split(**factor(0.5))
row_push = row_buttons.row() row_push = row_buttons.row()
row_pull = row_buttons.row(align=True) row_pull = row_buttons.row(align=True)
row_push.operator( row_push.operator('pillar.sync',
"pillar.sync", text='Save %i.%i settings' % bpy.app.version[:2],
text="Save %i.%i settings" % bpy.app.version[:2], icon='TRIA_UP').action = 'PUSH'
icon="TRIA_UP",
).action = "PUSH"
versions = bss.available_blender_versions versions = bss.available_blender_versions
if bss.status in {"NONE", "IDLE"}: version = bss.version
if not versions: if bss.status in {'NONE', 'IDLE'}:
row_pull.operator( if not versions or not version:
"pillar.sync", text="Find version to load", icon="TRIA_DOWN" row_pull.operator('pillar.sync',
).action = "REFRESH" text='Find version to load',
icon='TRIA_DOWN').action = 'REFRESH'
else: else:
props = row_pull.operator( props = row_pull.operator('pillar.sync',
"pillar.sync", text='Load %s settings' % version,
text="Load %s settings" % bss.version, icon='TRIA_DOWN')
icon="TRIA_DOWN", props.action = 'PULL'
) props.blender_version = version
props.action = "PULL" row_pull.operator('pillar.sync',
props.blender_version = bss.version text='',
row_pull.operator( icon=SYNC_SELECT_VERSION_ICON).action = 'SELECT'
"pillar.sync", text="", icon="DOWNARROW_HLT"
).action = "SELECT"
else: else:
row_pull.label(text="Cloud Sync is running.") row_pull.label(text='Cloud Sync is running.')
def draw_project_selector(self, project_box, bcp: BlenderCloudProjectGroup): def draw_project_selector(self, project_box, bcp: BlenderCloudProjectGroup):
project_row = project_box.row(align=True) project_row = project_box.row(align=True)
project_row.label(text="Project settings", icon_value=icon("CLOUD")) project_row.label(text='Project settings', icon_value=icon('CLOUD'))
row_buttons = project_row.row(align=True) row_buttons = project_row.row(align=True)
projects = bcp.available_projects projects = bcp.available_projects
project = bcp.project project = bcp.project
if bcp.status in {"NONE", "IDLE"}: if bcp.status in {'NONE', 'IDLE'}:
if not projects: if not projects:
row_buttons.operator( row_buttons.operator('pillar.projects',
"pillar.projects", text="Find project to load", icon="FILE_REFRESH" text='Find project to load',
) icon='FILE_REFRESH')
else: else:
row_buttons.prop(bcp, "project") row_buttons.prop(bcp, 'project')
row_buttons.operator("pillar.projects", text="", icon="FILE_REFRESH") row_buttons.operator('pillar.projects',
props = row_buttons.operator( text='',
"pillar.project_open_in_browser", text="", icon="WORLD" icon='FILE_REFRESH')
) props = row_buttons.operator('pillar.project_open_in_browser',
text='',
icon='WORLD')
props.project_id = project props.project_id = project
else: else:
row_buttons.label(text="Fetching available projects.") row_buttons.label(text='Fetching available projects.')
enabled_for = project_extensions(project) enabled_for = project_extensions(project)
if not project: if not project:
return return
if not enabled_for: if not enabled_for:
project_box.label(text="This project is not set up for Attract or Flamenco") project_box.label(text='This project is not set up for Attract or Flamenco')
return return
project_box.label( project_box.label(text='This project is set up for: %s' %
text="This project is set up for: %s" % ", ".join(sorted(enabled_for)) ', '.join(sorted(enabled_for)))
)
# This is only needed when the project is set up for either Attract or Flamenco. # This is only needed when the project is set up for either Attract or Flamenco.
project_box.prop(self, "cloud_project_local_path", text="Local Project Path") project_box.prop(self, 'cloud_project_local_path',
text='Local Project Path')
def draw_flamenco_buttons( def draw_flamenco_buttons(self, flamenco_box, bcp: flamenco.FlamencoManagerGroup, context):
self, flamenco_box, bcp: flamenco.FlamencoManagerGroup, context
):
header_row = flamenco_box.row(align=True) header_row = flamenco_box.row(align=True)
header_row.label(text="Flamenco:", icon_value=icon("CLOUD")) header_row.label(text='Flamenco:', icon_value=icon('CLOUD'))
manager_split = flamenco_box.split(factor=0.32, align=True) manager_split = flamenco_box.split(**factor(0.32), align=True)
manager_split.label(text="Manager:") manager_split.label(text='Manager:')
manager_box = manager_split.row(align=True) manager_box = manager_split.row(align=True)
if bcp.status in {"NONE", "IDLE"}: if bcp.status in {'NONE', 'IDLE'}:
if not bcp.available_managers: if not bcp.available_managers or not bcp.manager:
manager_box.operator( manager_box.operator('flamenco.managers',
"flamenco.managers", text='Find Flamenco Managers',
text="Find Flamenco Managers", icon='FILE_REFRESH')
icon="FILE_REFRESH",
)
else: else:
manager_box.prop(bcp, "manager", text="") manager_box.prop(bcp, 'manager', text='')
manager_box.operator("flamenco.managers", text="", icon="FILE_REFRESH") manager_box.operator('flamenco.managers',
text='',
icon='FILE_REFRESH')
else: else:
manager_box.label(text="Fetching available managers.") manager_box.label(text='Fetching available managers.')
path_split = flamenco_box.split(factor=0.32, align=True) path_split = flamenco_box.split(**factor(0.32), align=True)
path_split.label(text="Job File Path:") path_split.label(text='Job File Path:')
path_box = path_split.row(align=True) path_box = path_split.row(align=True)
path_box.prop(self, "flamenco_job_file_path", text="") path_box.prop(self, 'flamenco_job_file_path', text='')
props = path_box.operator( props = path_box.operator('flamenco.explore_file_path', text='', icon='DISK_DRIVE')
"flamenco.explore_file_path", text="", icon="DISK_DRIVE"
)
props.path = self.flamenco_job_file_path props.path = self.flamenco_job_file_path
job_output_box = flamenco_box.column(align=True) job_output_box = flamenco_box.column(align=True)
path_split = job_output_box.split(factor=0.32, align=True) path_split = job_output_box.split(**factor(0.32), align=True)
path_split.label(text="Job Output Path:") path_split.label(text='Job Output Path:')
path_box = path_split.row(align=True) path_box = path_split.row(align=True)
path_box.prop(self, "flamenco_job_output_path", text="") path_box.prop(self, 'flamenco_job_output_path', text='')
props = path_box.operator( props = path_box.operator('flamenco.explore_file_path', text='', icon='DISK_DRIVE')
"flamenco.explore_file_path", text="", icon="DISK_DRIVE"
)
props.path = self.flamenco_job_output_path props.path = self.flamenco_job_output_path
job_output_box.prop(self, "flamenco_exclude_filter") job_output_box.prop(self, 'flamenco_exclude_filter')
prop_split = job_output_box.split(factor=0.32, align=True) prop_split = job_output_box.split(**factor(0.32), align=True)
prop_split.label(text="Strip Components:") prop_split.label(text='Strip Components:')
prop_split.prop(self, "flamenco_job_output_strip_components", text="") prop_split.prop(self, 'flamenco_job_output_strip_components', text='')
from .flamenco import render_output_path from .flamenco import render_output_path
@ -504,29 +488,25 @@ class BlenderCloudPreferences(AddonPreferences):
output_path = render_output_path(context) output_path = render_output_path(context)
if output_path: if output_path:
path_box.label(text=str(output_path)) path_box.label(text=str(output_path))
props = path_box.operator( props = path_box.operator('flamenco.explore_file_path', text='', icon='DISK_DRIVE')
"flamenco.explore_file_path", text="", icon="DISK_DRIVE"
)
props.path = str(output_path.parent) props.path = str(output_path.parent)
else: else:
path_box.label( path_box.label(text='Blend file is not in your project path, '
text="Blend file is not in your project path, " 'unable to give output path example.')
"unable to give output path example."
)
flamenco_box.prop(self, "flamenco_relative_only") flamenco_box.prop(self, 'flamenco_relative_only')
flamenco_box.prop(self, "flamenco_open_browser_after_submit") flamenco_box.prop(self, 'flamenco_open_browser_after_submit')
flamenco_box.prop(self, "flamenco_show_quit_after_submit_button") flamenco_box.prop(self, 'flamenco_show_quit_after_submit_button')
class PillarCredentialsUpdate(pillar.PillarOperatorMixin, Operator): class PillarCredentialsUpdate(pillar.PillarOperatorMixin,
Operator):
"""Updates the Pillar URL and tests the new URL.""" """Updates the Pillar URL and tests the new URL."""
bl_idname = 'pillar.credentials_update'
bl_label = 'Update credentials'
bl_description = 'Resynchronises your Blender ID login with Blender Cloud'
bl_idname = "pillar.credentials_update" log = logging.getLogger('bpy.ops.%s' % bl_idname)
bl_label = "Update credentials"
bl_description = "Resynchronises your Blender ID login with Blender Cloud"
log = logging.getLogger("bpy.ops.%s" % bl_idname)
@classmethod @classmethod
def poll(cls, context): def poll(cls, context):
@ -548,51 +528,50 @@ class PillarCredentialsUpdate(pillar.PillarOperatorMixin, Operator):
# Only allow activation when the user is actually logged in. # Only allow activation when the user is actually logged in.
if not self.is_logged_in(context): if not self.is_logged_in(context):
self.report({"ERROR"}, "No active profile found") self.report({'ERROR'}, 'No active profile found')
return {"CANCELLED"} return {'CANCELLED'}
try: try:
loop = asyncio.get_event_loop() loop = asyncio.get_event_loop()
loop.run_until_complete(self.check_credentials(context, set())) loop.run_until_complete(self.check_credentials(context, set()))
except blender_id.BlenderIdCommError as ex: except blender_id.BlenderIdCommError as ex:
log.exception("Error sending subclient-specific token to Blender ID") log.exception('Error sending subclient-specific token to Blender ID')
self.report({"ERROR"}, "Failed to sync Blender ID to Blender Cloud") self.report({'ERROR'}, 'Failed to sync Blender ID to Blender Cloud')
return {"CANCELLED"} return {'CANCELLED'}
except Exception as ex: except Exception as ex:
log.exception("Error in test call to Pillar") log.exception('Error in test call to Pillar')
self.report({"ERROR"}, "Failed test connection to Blender Cloud") self.report({'ERROR'}, 'Failed test connection to Blender Cloud')
return {"CANCELLED"} return {'CANCELLED'}
self.report({"INFO"}, "Blender Cloud credentials & endpoint URL updated.") self.report({'INFO'}, 'Blender Cloud credentials & endpoint URL updated.')
return {"FINISHED"} return {'FINISHED'}
class PILLAR_OT_subscribe(Operator): class PILLAR_OT_subscribe(Operator):
"""Opens a browser to subscribe the user to the Cloud.""" """Opens a browser to subscribe the user to the Cloud."""
bl_idname = 'pillar.subscribe'
bl_idname = "pillar.subscribe" bl_label = 'Subscribe to the Cloud'
bl_label = "Subscribe to the Cloud"
bl_description = "Opens a page in a web browser to subscribe to the Blender Cloud" bl_description = "Opens a page in a web browser to subscribe to the Blender Cloud"
def execute(self, context): def execute(self, context):
import webbrowser import webbrowser
webbrowser.open_new_tab("https://cloud.blender.org/join") webbrowser.open_new_tab('https://cloud.blender.org/join')
self.report({"INFO"}, "We just started a browser for you.") self.report({'INFO'}, 'We just started a browser for you.')
return {"FINISHED"} return {'FINISHED'}
class PILLAR_OT_project_open_in_browser(Operator): class PILLAR_OT_project_open_in_browser(Operator):
bl_idname = "pillar.project_open_in_browser" bl_idname = 'pillar.project_open_in_browser'
bl_label = "Open in Browser" bl_label = 'Open in Browser'
bl_description = "Opens a webbrowser to show the project" bl_description = 'Opens a webbrowser to show the project'
project_id: StringProperty(name="Project ID") project_id = StringProperty(name='Project ID')
def execute(self, context): def execute(self, context):
if not self.project_id: if not self.project_id:
return {"CANCELLED"} return {'CANCELLED'}
import webbrowser import webbrowser
import urllib.parse import urllib.parse
@ -600,34 +579,28 @@ class PILLAR_OT_project_open_in_browser(Operator):
import pillarsdk import pillarsdk
from .pillar import sync_call from .pillar import sync_call
project = sync_call( project = sync_call(pillarsdk.Project.find, self.project_id, {'projection': {'url': True}})
pillarsdk.Project.find, self.project_id, {"projection": {"url": True}}
)
if log.isEnabledFor(logging.DEBUG): if log.isEnabledFor(logging.DEBUG):
import pprint import pprint
log.debug('found project: %s', pprint.pformat(project.to_dict()))
log.debug("found project: %s", pprint.pformat(project.to_dict())) url = urllib.parse.urljoin(PILLAR_WEB_SERVER_URL, 'p/' + project.url)
url = urllib.parse.urljoin(PILLAR_WEB_SERVER_URL, "p/" + project.url)
webbrowser.open_new_tab(url) webbrowser.open_new_tab(url)
self.report({"INFO"}, "Opened a browser at %s" % url) self.report({'INFO'}, 'Opened a browser at %s' % url)
return {"FINISHED"} return {'FINISHED'}
class PILLAR_OT_projects( class PILLAR_OT_projects(async_loop.AsyncModalOperatorMixin,
async_loop.AsyncModalOperatorMixin, pillar.AuthenticatedPillarOperatorMixin,
pillar.AuthenticatedPillarOperatorMixin, Operator):
Operator,
):
"""Fetches the projects available to the user""" """Fetches the projects available to the user"""
bl_idname = 'pillar.projects'
bl_idname = "pillar.projects" bl_label = 'Fetch available projects'
bl_label = "Fetch available projects"
stop_upon_exception = True stop_upon_exception = True
_log = logging.getLogger("bpy.ops.%s" % bl_idname) _log = logging.getLogger('bpy.ops.%s' % bl_idname)
async def async_execute(self, context): async def async_execute(self, context):
if not await self.authenticate(context): if not await self.authenticate(context):
@ -636,71 +609,69 @@ class PILLAR_OT_projects(
import pillarsdk import pillarsdk
from .pillar import pillar_call from .pillar import pillar_call
self.log.info("Going to fetch projects for user %s", self.user_id) self.log.info('Going to fetch projects for user %s', self.user_id)
preferences().project.status = "FETCHING" preferences().project.status = 'FETCHING'
# Get all projects, except the home project. # Get all projects, except the home project.
projects_user = await pillar_call( projects_user = await pillar_call(
pillarsdk.Project.all, pillarsdk.Project.all,
{ {'where': {'user': self.user_id,
"where": {"user": self.user_id, "category": {"$ne": "home"}}, 'category': {'$ne': 'home'}},
"sort": "-name", 'sort': '-name',
"projection": {"_id": True, "name": True, "extension_props": True}, 'projection': {'_id': True,
}, 'name': True,
) 'extension_props': True},
})
projects_shared = await pillar_call( projects_shared = await pillar_call(
pillarsdk.Project.all, pillarsdk.Project.all,
{ {'where': {'user': {'$ne': self.user_id},
"where": { 'permissions.groups.group': {'$in': self.db_user.groups}},
"user": {"$ne": self.user_id}, 'sort': '-name',
"permissions.groups.group": {"$in": self.db_user.groups}, 'projection': {'_id': True,
}, 'name': True,
"sort": "-name", 'extension_props': True},
"projection": {"_id": True, "name": True, "extension_props": True}, })
},
)
# We need to convert to regular dicts before storing in ID properties. # We need to convert to regular dicts before storing in ID properties.
# Also don't store more properties than we need. # Also don't store more properties than we need.
def reduce_properties(project_list): def reduce_properties(project_list):
for p in project_list: for p in project_list:
p = p.to_dict() p = p.to_dict()
extension_props = p.get("extension_props", {}) extension_props = p.get('extension_props', {})
enabled_for = list(extension_props.keys()) enabled_for = list(extension_props.keys())
self._log.debug("Project %r is enabled for %s", p["name"], enabled_for) self._log.debug('Project %r is enabled for %s', p['name'], enabled_for)
yield { yield {
"_id": p["_id"], '_id': p['_id'],
"name": p["name"], 'name': p['name'],
"enabled_for": enabled_for, 'enabled_for': enabled_for,
} }
projects = list(reduce_properties(projects_user["_items"])) + list( projects = list(reduce_properties(projects_user['_items'])) + \
reduce_properties(projects_shared["_items"]) list(reduce_properties(projects_shared['_items']))
)
def proj_sort_key(project): def proj_sort_key(project):
return project.get("name") return project.get('name')
preferences().project.available_projects = sorted(projects, key=proj_sort_key) preferences().project.available_projects = sorted(projects, key=proj_sort_key)
self.quit() self.quit()
def quit(self): def quit(self):
preferences().project.status = "IDLE" preferences().project.status = 'IDLE'
super().quit() super().quit()
class PILLAR_PT_image_custom_properties(rna_prop_ui.PropertyPanel, bpy.types.Panel): class PILLAR_PT_image_custom_properties(rna_prop_ui.PropertyPanel, bpy.types.Panel):
"""Shows custom properties in the image editor.""" """Shows custom properties in the image editor."""
bl_space_type = "IMAGE_EDITOR" bl_space_type = 'IMAGE_EDITOR'
bl_region_type = "UI" bl_region_type = 'UI'
bl_label = "Custom Properties" bl_label = 'Custom Properties'
_context_path = "edit_image" _context_path = 'edit_image'
_property_type = bpy.types.Image _property_type = bpy.types.Image
@ -724,10 +695,9 @@ def load_custom_icons():
return return
import bpy.utils.previews import bpy.utils.previews
icons = bpy.utils.previews.new() icons = bpy.utils.previews.new()
my_icons_dir = os.path.join(os.path.dirname(__file__), "icons") my_icons_dir = os.path.join(os.path.dirname(__file__), 'icons')
icons.load("CLOUD", os.path.join(my_icons_dir, "icon-cloud.png"), "IMAGE") icons.load('CLOUD', os.path.join(my_icons_dir, 'icon-cloud.png'), 'IMAGE')
def unload_custom_icons(): def unload_custom_icons():
@ -763,8 +733,8 @@ def register():
addon_prefs = preferences() addon_prefs = preferences()
WindowManager.last_blender_cloud_location = StringProperty( WindowManager.last_blender_cloud_location = StringProperty(
name="Last Blender Cloud browser location", default="/" name="Last Blender Cloud browser location",
) default="/")
def default_if_empty(scene, context): def default_if_empty(scene, context):
"""The scene's local_texture_dir, if empty, reverts to the addon prefs.""" """The scene's local_texture_dir, if empty, reverts to the addon prefs."""
@ -773,11 +743,10 @@ def register():
scene.local_texture_dir = addon_prefs.local_texture_dir scene.local_texture_dir = addon_prefs.local_texture_dir
Scene.local_texture_dir = StringProperty( Scene.local_texture_dir = StringProperty(
name="Blender Cloud texture storage directory for current scene", name='Blender Cloud texture storage directory for current scene',
subtype="DIR_PATH", subtype='DIR_PATH',
default=addon_prefs.local_texture_dir, default=addon_prefs.local_texture_dir,
update=default_if_empty, update=default_if_empty)
)
WindowManager.blender_sync_status = PointerProperty(type=SyncStatusProperties) WindowManager.blender_sync_status = PointerProperty(type=SyncStatusProperties)
View File
@ -52,13 +52,13 @@ def open_blend(filename, access="rb"):
bfile.is_compressed = False bfile.is_compressed = False
bfile.filepath_orig = filename bfile.filepath_orig = filename
return bfile return bfile
elif magic[:2] == b"\x1f\x8b": elif magic[:2] == b'\x1f\x8b':
log.debug("gzip blendfile detected") log.debug("gzip blendfile detected")
handle.close() handle.close()
log.debug("decompressing started") log.debug("decompressing started")
fs = gzip.open(filename, "rb") fs = gzip.open(filename, "rb")
data = fs.read(FILE_BUFFER_SIZE) data = fs.read(FILE_BUFFER_SIZE)
magic = data[: len(magic_test)] magic = data[:len(magic_test)]
if magic == magic_test: if magic == magic_test:
handle = tempfile.TemporaryFile() handle = tempfile.TemporaryFile()
while data: while data:
@ -90,7 +90,6 @@ class BlendFile:
""" """
Blend file. Blend file.
""" """
__slots__ = ( __slots__ = (
# file (result of open()) # file (result of open())
"handle", "handle",
@ -115,7 +114,7 @@ class BlendFile:
"is_modified", "is_modified",
# bool (is file gzipped) # bool (is file gzipped)
"is_compressed", "is_compressed",
) )
def __init__(self, handle): def __init__(self, handle):
log.debug("initializing reading blend-file") log.debug("initializing reading blend-file")
@ -126,12 +125,11 @@ class BlendFile:
self.code_index = {} self.code_index = {}
block = BlendFileBlock(handle, self) block = BlendFileBlock(handle, self)
while block.code != b"ENDB": while block.code != b'ENDB':
if block.code == b"DNA1": if block.code == b'DNA1':
( (self.structs,
self.structs, self.sdna_index_from_id,
self.sdna_index_from_id, ) = BlendFile.decode_structs(self.header, block, handle)
) = BlendFile.decode_structs(self.header, block, handle)
else: else:
handle.seek(block.size, os.SEEK_CUR) handle.seek(block.size, os.SEEK_CUR)
@ -143,9 +141,7 @@ class BlendFile:
self.blocks.append(block) self.blocks.append(block)
# cache (could lazy init, in case we never use?) # cache (could lazy init, in case we never use?)
self.block_from_offset = { self.block_from_offset = {block.addr_old: block for block in self.blocks if block.code != b'ENDB'}
block.addr_old: block for block in self.blocks if block.code != b"ENDB"
}
def __enter__(self): def __enter__(self):
return self return self
@ -154,7 +150,7 @@ class BlendFile:
self.close() self.close()
def find_blocks_from_code(self, code): def find_blocks_from_code(self, code):
assert type(code) == bytes assert(type(code) == bytes)
if code not in self.code_index: if code not in self.code_index:
return [] return []
return self.code_index[code] return self.code_index[code]
@ -162,7 +158,7 @@ class BlendFile:
def find_block_from_offset(self, offset): def find_block_from_offset(self, offset):
# same as looping over all blocks, # same as looping over all blocks,
# then checking ``block.addr_old == offset`` # then checking ``block.addr_old == offset``
assert type(offset) is int assert(type(offset) is int)
return self.block_from_offset.get(offset) return self.block_from_offset.get(offset)
def close(self): def close(self):
@ -189,15 +185,12 @@ class BlendFile:
def ensure_subtype_smaller(self, sdna_index_curr, sdna_index_next): def ensure_subtype_smaller(self, sdna_index_curr, sdna_index_next):
# never refine to a smaller type # never refine to a smaller type
if self.structs[sdna_index_curr].size > self.structs[sdna_index_next].size: if (self.structs[sdna_index_curr].size >
self.structs[sdna_index_next].size):
raise RuntimeError( raise RuntimeError("cant refine to smaller type (%s -> %s)" %
"cant refine to smaller type (%s -> %s)" (self.structs[sdna_index_curr].dna_type_id.decode('ascii'),
% ( self.structs[sdna_index_next].dna_type_id.decode('ascii')))
self.structs[sdna_index_curr].dna_type_id.decode("ascii"),
self.structs[sdna_index_next].dna_type_id.decode("ascii"),
)
)
@staticmethod @staticmethod
def decode_structs(header, block, handle): def decode_structs(header, block, handle):
@ -206,7 +199,7 @@ class BlendFile:
""" """
log.debug("building DNA catalog") log.debug("building DNA catalog")
shortstruct = DNA_IO.USHORT[header.endian_index] shortstruct = DNA_IO.USHORT[header.endian_index]
shortstruct2 = struct.Struct(header.endian_str + b"HH") shortstruct2 = struct.Struct(header.endian_str + b'HH')
intstruct = DNA_IO.UINT[header.endian_index] intstruct = DNA_IO.UINT[header.endian_index]
data = handle.read(block.size) data = handle.read(block.size)
@ -288,7 +281,6 @@ class BlendFileBlock:
""" """
Instance of a struct. Instance of a struct.
""" """
__slots__ = ( __slots__ = (
# BlendFile # BlendFile
"file", "file",
@ -299,25 +291,21 @@ class BlendFileBlock:
"count", "count",
"file_offset", "file_offset",
"user_data", "user_data",
)
def __str__(self):
return (
"<%s.%s (%s), size=%d at %s>"
%
# fields=[%s]
(
self.__class__.__name__,
self.dna_type.dna_type_id.decode("ascii"),
self.code.decode(),
self.size,
# b", ".join(f.dna_name.name_only for f in self.dna_type.fields).decode('ascii'),
hex(self.addr_old),
)
) )
def __str__(self):
return ("<%s.%s (%s), size=%d at %s>" %
# fields=[%s]
(self.__class__.__name__,
self.dna_type.dna_type_id.decode('ascii'),
self.code.decode(),
self.size,
# b", ".join(f.dna_name.name_only for f in self.dna_type.fields).decode('ascii'),
hex(self.addr_old),
))
def __init__(self, handle, bfile): def __init__(self, handle, bfile):
OLDBLOCK = struct.Struct(b"4sI") OLDBLOCK = struct.Struct(b'4sI')
self.file = bfile self.file = bfile
self.user_data = None self.user_data = None
@ -330,8 +318,8 @@ class BlendFileBlock:
if len(data) > 15: if len(data) > 15:
blockheader = bfile.block_header_struct.unpack(data) blockheader = bfile.block_header_struct.unpack(data)
self.code = blockheader[0].partition(b"\0")[0] self.code = blockheader[0].partition(b'\0')[0]
if self.code != b"ENDB": if self.code != b'ENDB':
self.size = blockheader[1] self.size = blockheader[1]
self.addr_old = blockheader[2] self.addr_old = blockheader[2]
self.sdna_index = blockheader[3] self.sdna_index = blockheader[3]
@ -345,7 +333,7 @@ class BlendFileBlock:
self.file_offset = 0 self.file_offset = 0
else: else:
blockheader = OLDBLOCK.unpack(data) blockheader = OLDBLOCK.unpack(data)
self.code = blockheader[0].partition(b"\0")[0] self.code = blockheader[0].partition(b'\0')[0]
self.code = DNA_IO.read_data0(blockheader[0]) self.code = DNA_IO.read_data0(blockheader[0])
self.size = 0 self.size = 0
self.addr_old = 0 self.addr_old = 0
@ -358,30 +346,28 @@ class BlendFileBlock:
return self.file.structs[self.sdna_index] return self.file.structs[self.sdna_index]
def refine_type_from_index(self, sdna_index_next): def refine_type_from_index(self, sdna_index_next):
assert type(sdna_index_next) is int assert(type(sdna_index_next) is int)
sdna_index_curr = self.sdna_index sdna_index_curr = self.sdna_index
self.file.ensure_subtype_smaller(sdna_index_curr, sdna_index_next) self.file.ensure_subtype_smaller(sdna_index_curr, sdna_index_next)
self.sdna_index = sdna_index_next self.sdna_index = sdna_index_next
def refine_type(self, dna_type_id): def refine_type(self, dna_type_id):
assert type(dna_type_id) is bytes assert(type(dna_type_id) is bytes)
self.refine_type_from_index(self.file.sdna_index_from_id[dna_type_id]) self.refine_type_from_index(self.file.sdna_index_from_id[dna_type_id])
def get_file_offset( def get_file_offset(self, path,
self, default=...,
path, sdna_index_refine=None,
default=..., base_index=0,
sdna_index_refine=None, ):
base_index=0,
):
""" """
Return (offset, length) Return (offset, length)
""" """
assert type(path) is bytes assert(type(path) is bytes)
ofs = self.file_offset ofs = self.file_offset
if base_index != 0: if base_index != 0:
assert base_index < self.count assert(base_index < self.count)
ofs += (self.size // self.count) * base_index ofs += (self.size // self.count) * base_index
self.file.handle.seek(ofs, os.SEEK_SET) self.file.handle.seek(ofs, os.SEEK_SET)
@ -391,23 +377,21 @@ class BlendFileBlock:
self.file.ensure_subtype_smaller(self.sdna_index, sdna_index_refine) self.file.ensure_subtype_smaller(self.sdna_index, sdna_index_refine)
dna_struct = self.file.structs[sdna_index_refine] dna_struct = self.file.structs[sdna_index_refine]
field = dna_struct.field_from_path(self.file.header, self.file.handle, path) field = dna_struct.field_from_path(
self.file.header, self.file.handle, path)
return (self.file.handle.tell(), field.dna_name.array_size) return (self.file.handle.tell(), field.dna_name.array_size)
def get( def get(self, path,
self, default=...,
path, sdna_index_refine=None,
default=..., use_nil=True, use_str=True,
sdna_index_refine=None, base_index=0,
use_nil=True, ):
use_str=True,
base_index=0,
):
ofs = self.file_offset ofs = self.file_offset
if base_index != 0: if base_index != 0:
assert base_index < self.count assert(base_index < self.count)
ofs += (self.size // self.count) * base_index ofs += (self.size // self.count) * base_index
self.file.handle.seek(ofs, os.SEEK_SET) self.file.handle.seek(ofs, os.SEEK_SET)
@ -418,55 +402,36 @@ class BlendFileBlock:
dna_struct = self.file.structs[sdna_index_refine] dna_struct = self.file.structs[sdna_index_refine]
return dna_struct.field_get( return dna_struct.field_get(
self.file.header, self.file.header, self.file.handle, path,
self.file.handle, default=default,
path, use_nil=use_nil, use_str=use_str,
default=default, )
use_nil=use_nil,
use_str=use_str,
)
def get_recursive_iter( def get_recursive_iter(self, path, path_root=b"",
self, default=...,
path, sdna_index_refine=None,
path_root=b"", use_nil=True, use_str=True,
default=..., base_index=0,
sdna_index_refine=None, ):
use_nil=True,
use_str=True,
base_index=0,
):
if path_root: if path_root:
path_full = (path_root if type(path_root) is tuple else (path_root,)) + ( path_full = (
path if type(path) is tuple else (path,) (path_root if type(path_root) is tuple else (path_root, )) +
) (path if type(path) is tuple else (path, )))
else: else:
path_full = path path_full = path
try: try:
yield ( yield (path_full, self.get(path_full, default, sdna_index_refine, use_nil, use_str, base_index))
path_full,
self.get(
path_full, default, sdna_index_refine, use_nil, use_str, base_index
),
)
except NotImplementedError as ex: except NotImplementedError as ex:
msg, dna_name, dna_type = ex.args msg, dna_name, dna_type = ex.args
struct_index = self.file.sdna_index_from_id.get(dna_type.dna_type_id, None) struct_index = self.file.sdna_index_from_id.get(dna_type.dna_type_id, None)
if struct_index is None: if struct_index is None:
yield (path_full, "<%s>" % dna_type.dna_type_id.decode("ascii")) yield (path_full, "<%s>" % dna_type.dna_type_id.decode('ascii'))
else: else:
struct = self.file.structs[struct_index] struct = self.file.structs[struct_index]
for f in struct.fields: for f in struct.fields:
yield from self.get_recursive_iter( yield from self.get_recursive_iter(
f.dna_name.name_only, f.dna_name.name_only, path_full, default, None, use_nil, use_str, 0)
path_full,
default,
None,
use_nil,
use_str,
0,
)
def items_recursive_iter(self): def items_recursive_iter(self):
for k in self.keys(): for k in self.keys():
@ -480,13 +445,9 @@ class BlendFileBlock:
# TODO This implementation is most likely far from optimal... and CRC32 is not renowned as the best hashing # TODO This implementation is most likely far from optimal... and CRC32 is not renowned as the best hashing
# algo either. But for now does the job! # algo either. But for now does the job!
import zlib import zlib
def _is_pointer(self, k): def _is_pointer(self, k):
return ( return self.file.structs[self.sdna_index].field_from_path(
self.file.structs[self.sdna_index] self.file.header, self.file.handle, k).dna_name.is_pointer
.field_from_path(self.file.header, self.file.handle, k)
.dna_name.is_pointer
)
hsh = 1 hsh = 1
for k, v in self.items_recursive_iter(): for k, v in self.items_recursive_iter():
@ -494,12 +455,9 @@ class BlendFileBlock:
hsh = zlib.adler32(str(v).encode(), hsh) hsh = zlib.adler32(str(v).encode(), hsh)
return hsh return hsh
def set( def set(self, path, value,
self, sdna_index_refine=None,
path, ):
value,
sdna_index_refine=None,
):
if sdna_index_refine is None: if sdna_index_refine is None:
sdna_index_refine = self.sdna_index sdna_index_refine = self.sdna_index
@ -509,34 +467,29 @@ class BlendFileBlock:
dna_struct = self.file.structs[sdna_index_refine] dna_struct = self.file.structs[sdna_index_refine]
self.file.handle.seek(self.file_offset, os.SEEK_SET) self.file.handle.seek(self.file_offset, os.SEEK_SET)
self.file.is_modified = True self.file.is_modified = True
return dna_struct.field_set(self.file.header, self.file.handle, path, value) return dna_struct.field_set(
self.file.header, self.file.handle, path, value)
# --------------- # ---------------
# Utility get/set # Utility get/set
# #
# avoid inline pointer casting # avoid inline pointer casting
def get_pointer( def get_pointer(
self, self, path,
path, default=...,
default=..., sdna_index_refine=None,
sdna_index_refine=None, base_index=0,
base_index=0, ):
):
if sdna_index_refine is None: if sdna_index_refine is None:
sdna_index_refine = self.sdna_index sdna_index_refine = self.sdna_index
result = self.get( result = self.get(path, default, sdna_index_refine=sdna_index_refine, base_index=base_index)
path, default, sdna_index_refine=sdna_index_refine, base_index=base_index
)
# default # default
if type(result) is not int: if type(result) is not int:
return result return result
assert ( assert(self.file.structs[sdna_index_refine].field_from_path(
self.file.structs[sdna_index_refine] self.file.header, self.file.handle, path).dna_name.is_pointer)
.field_from_path(self.file.header, self.file.handle, path)
.dna_name.is_pointer
)
if result != 0: if result != 0:
# possible (but unlikely) # possible (but unlikely)
# that this fails and returns None # that this fails and returns None
@ -564,7 +517,7 @@ class BlendFileBlock:
yield self[k] yield self[k]
except NotImplementedError as ex: except NotImplementedError as ex:
msg, dna_name, dna_type = ex.args msg, dna_name, dna_type = ex.args
yield "<%s>" % dna_type.dna_type_id.decode("ascii") yield "<%s>" % dna_type.dna_type_id.decode('ascii')
def items(self): def items(self):
for k in self.keys(): for k in self.keys():
@ -572,7 +525,7 @@ class BlendFileBlock:
yield (k, self[k]) yield (k, self[k])
except NotImplementedError as ex: except NotImplementedError as ex:
msg, dna_name, dna_type = ex.args msg, dna_name, dna_type = ex.args
yield (k, "<%s>" % dna_type.dna_type_id.decode("ascii")) yield (k, "<%s>" % dna_type.dna_type_id.decode('ascii'))
# ----------------------------------------------------------------------------- # -----------------------------------------------------------------------------
@ -589,7 +542,6 @@ class BlendFileHeader:
BlendFileHeader allocates the first 12 bytes of a blend file BlendFileHeader allocates the first 12 bytes of a blend file
it contains information about the hardware architecture it contains information about the hardware architecture
""" """
__slots__ = ( __slots__ = (
# str # str
"magic", "magic",
@ -603,61 +555,56 @@ class BlendFileHeader:
"endian_str", "endian_str",
# int, used to index common types # int, used to index common types
"endian_index", "endian_index",
) )
def __init__(self, handle): def __init__(self, handle):
FILEHEADER = struct.Struct(b"7s1s1s3s") FILEHEADER = struct.Struct(b'7s1s1s3s')
log.debug("reading blend-file-header") log.debug("reading blend-file-header")
values = FILEHEADER.unpack(handle.read(FILEHEADER.size)) values = FILEHEADER.unpack(handle.read(FILEHEADER.size))
self.magic = values[0] self.magic = values[0]
pointer_size_id = values[1] pointer_size_id = values[1]
if pointer_size_id == b"-": if pointer_size_id == b'-':
self.pointer_size = 8 self.pointer_size = 8
elif pointer_size_id == b"_": elif pointer_size_id == b'_':
self.pointer_size = 4 self.pointer_size = 4
else: else:
assert 0 assert(0)
endian_id = values[2] endian_id = values[2]
if endian_id == b"v": if endian_id == b'v':
self.is_little_endian = True self.is_little_endian = True
self.endian_str = b"<" self.endian_str = b'<'
self.endian_index = 0 self.endian_index = 0
elif endian_id == b"V": elif endian_id == b'V':
self.is_little_endian = False self.is_little_endian = False
self.endian_index = 1 self.endian_index = 1
self.endian_str = b">" self.endian_str = b'>'
else: else:
assert 0 assert(0)
version_id = values[3] version_id = values[3]
self.version = int(version_id) self.version = int(version_id)
def create_block_header_struct(self): def create_block_header_struct(self):
return struct.Struct( return struct.Struct(b''.join((
b"".join( self.endian_str,
( b'4sI',
self.endian_str, b'I' if self.pointer_size == 4 else b'Q',
b"4sI", b'II',
b"I" if self.pointer_size == 4 else b"Q", )))
b"II",
)
)
)
class DNAName: class DNAName:
""" """
DNAName is a C-type name stored in the DNA DNAName is a C-type name stored in the DNA
""" """
__slots__ = ( __slots__ = (
"name_full", "name_full",
"name_only", "name_only",
"is_pointer", "is_pointer",
"is_method_pointer", "is_method_pointer",
"array_size", "array_size",
) )
def __init__(self, name_full): def __init__(self, name_full):
self.name_full = name_full self.name_full = name_full
@ -667,40 +614,40 @@ class DNAName:
self.array_size = self.calc_array_size() self.array_size = self.calc_array_size()
def __repr__(self): def __repr__(self):
return "%s(%r)" % (type(self).__qualname__, self.name_full) return '%s(%r)' % (type(self).__qualname__, self.name_full)
def as_reference(self, parent): def as_reference(self, parent):
if parent is None: if parent is None:
result = b"" result = b''
else: else:
result = parent + b"." result = parent + b'.'
result = result + self.name_only result = result + self.name_only
return result return result
def calc_name_only(self): def calc_name_only(self):
result = self.name_full.strip(b"*()") result = self.name_full.strip(b'*()')
index = result.find(b"[") index = result.find(b'[')
if index != -1: if index != -1:
result = result[:index] result = result[:index]
return result return result
def calc_is_pointer(self): def calc_is_pointer(self):
return b"*" in self.name_full return (b'*' in self.name_full)
def calc_is_method_pointer(self): def calc_is_method_pointer(self):
return b"(*" in self.name_full return (b'(*' in self.name_full)
def calc_array_size(self): def calc_array_size(self):
result = 1 result = 1
temp = self.name_full temp = self.name_full
index = temp.find(b"[") index = temp.find(b'[')
while index != -1: while index != -1:
index_2 = temp.find(b"]") index_2 = temp.find(b']')
result *= int(temp[index + 1 : index_2]) result *= int(temp[index + 1:index_2])
temp = temp[index_2 + 1 :] temp = temp[index_2 + 1:]
index = temp.find(b"[") index = temp.find(b'[')
return result return result
@ -710,7 +657,6 @@ class DNAField:
DNAField is a coupled DNAStruct and DNAName DNAField is a coupled DNAStruct and DNAName
and cache offset for reuse and cache offset for reuse
""" """
__slots__ = ( __slots__ = (
# DNAName # DNAName
"dna_name", "dna_name",
@ -721,7 +667,7 @@ class DNAField:
"dna_size", "dna_size",
# cached info (avoid looping over fields each time) # cached info (avoid looping over fields each time)
"dna_offset", "dna_offset",
) )
def __init__(self, dna_type, dna_name, dna_size, dna_offset): def __init__(self, dna_type, dna_name, dna_size, dna_offset):
self.dna_type = dna_type self.dna_type = dna_type
@ -734,14 +680,13 @@ class DNAStruct:
""" """
DNAStruct is a C-type structure stored in the DNA DNAStruct is a C-type structure stored in the DNA
""" """
__slots__ = ( __slots__ = (
"dna_type_id", "dna_type_id",
"size", "size",
"fields", "fields",
"field_from_name", "field_from_name",
"user_data", "user_data",
) )
def __init__(self, dna_type_id): def __init__(self, dna_type_id):
self.dna_type_id = dna_type_id self.dna_type_id = dna_type_id
@ -750,7 +695,7 @@ class DNAStruct:
self.user_data = None self.user_data = None
def __repr__(self): def __repr__(self):
return "%s(%r)" % (type(self).__qualname__, self.dna_type_id) return '%s(%r)' % (type(self).__qualname__, self.dna_type_id)
def field_from_path(self, header, handle, path): def field_from_path(self, header, handle, path):
""" """
@ -764,7 +709,7 @@ class DNAStruct:
if len(path) >= 2 and type(path[1]) is not bytes: if len(path) >= 2 and type(path[1]) is not bytes:
name_tail = path[2:] name_tail = path[2:]
index = path[1] index = path[1]
assert type(index) is int assert(type(index) is int)
else: else:
name_tail = path[1:] name_tail = path[1:]
index = 0 index = 0
@ -773,7 +718,7 @@ class DNAStruct:
name_tail = None name_tail = None
index = 0 index = 0
assert type(name) is bytes assert(type(name) is bytes)
field = self.field_from_name.get(name) field = self.field_from_name.get(name)
@ -784,69 +729,47 @@ class DNAStruct:
index_offset = header.pointer_size * index index_offset = header.pointer_size * index
else: else:
index_offset = field.dna_type.size * index index_offset = field.dna_type.size * index
assert index_offset < field.dna_size assert(index_offset < field.dna_size)
handle.seek(index_offset, os.SEEK_CUR) handle.seek(index_offset, os.SEEK_CUR)
if not name_tail: # None or () if not name_tail: # None or ()
return field return field
else: else:
return field.dna_type.field_from_path(header, handle, name_tail) return field.dna_type.field_from_path(header, handle, name_tail)
def field_get( def field_get(self, header, handle, path,
self, default=...,
header, use_nil=True, use_str=True,
handle, ):
path,
default=...,
use_nil=True,
use_str=True,
):
field = self.field_from_path(header, handle, path) field = self.field_from_path(header, handle, path)
if field is None: if field is None:
if default is not ...: if default is not ...:
return default return default
else: else:
raise KeyError( raise KeyError("%r not found in %r (%r)" %
"%r not found in %r (%r)" (path, [f.dna_name.name_only for f in self.fields], self.dna_type_id))
% (
path,
[f.dna_name.name_only for f in self.fields],
self.dna_type_id,
)
)
dna_type = field.dna_type dna_type = field.dna_type
dna_name = field.dna_name dna_name = field.dna_name
if dna_name.is_pointer: if dna_name.is_pointer:
return DNA_IO.read_pointer(handle, header) return DNA_IO.read_pointer(handle, header)
elif dna_type.dna_type_id == b"int": elif dna_type.dna_type_id == b'int':
if dna_name.array_size > 1: if dna_name.array_size > 1:
return [ return [DNA_IO.read_int(handle, header) for i in range(dna_name.array_size)]
DNA_IO.read_int(handle, header) for i in range(dna_name.array_size)
]
return DNA_IO.read_int(handle, header) return DNA_IO.read_int(handle, header)
elif dna_type.dna_type_id == b"short": elif dna_type.dna_type_id == b'short':
if dna_name.array_size > 1: if dna_name.array_size > 1:
return [ return [DNA_IO.read_short(handle, header) for i in range(dna_name.array_size)]
DNA_IO.read_short(handle, header)
for i in range(dna_name.array_size)
]
return DNA_IO.read_short(handle, header) return DNA_IO.read_short(handle, header)
elif dna_type.dna_type_id == b"uint64_t": elif dna_type.dna_type_id == b'uint64_t':
if dna_name.array_size > 1: if dna_name.array_size > 1:
return [ return [DNA_IO.read_ulong(handle, header) for i in range(dna_name.array_size)]
DNA_IO.read_ulong(handle, header)
for i in range(dna_name.array_size)
]
return DNA_IO.read_ulong(handle, header) return DNA_IO.read_ulong(handle, header)
elif dna_type.dna_type_id == b"float": elif dna_type.dna_type_id == b'float':
if dna_name.array_size > 1: if dna_name.array_size > 1:
return [ return [DNA_IO.read_float(handle, header) for i in range(dna_name.array_size)]
DNA_IO.read_float(handle, header)
for i in range(dna_name.array_size)
]
return DNA_IO.read_float(handle, header) return DNA_IO.read_float(handle, header)
elif dna_type.dna_type_id == b"char": elif dna_type.dna_type_id == b'char':
if use_str: if use_str:
if use_nil: if use_nil:
return DNA_IO.read_string0(handle, dna_name.array_size) return DNA_IO.read_string0(handle, dna_name.array_size)
@ -858,39 +781,30 @@ class DNAStruct:
else: else:
return DNA_IO.read_bytes(handle, dna_name.array_size) return DNA_IO.read_bytes(handle, dna_name.array_size)
else: else:
raise NotImplementedError( raise NotImplementedError("%r exists but isn't pointer, can't resolve field %r" %
"%r exists but isn't pointer, can't resolve field %r" (path, dna_name.name_only), dna_name, dna_type)
% (path, dna_name.name_only),
dna_name,
dna_type,
)
def field_set(self, header, handle, path, value): def field_set(self, header, handle, path, value):
assert type(path) == bytes assert(type(path) == bytes)
field = self.field_from_path(header, handle, path) field = self.field_from_path(header, handle, path)
if field is None: if field is None:
raise KeyError( raise KeyError("%r not found in %r" %
"%r not found in %r" (path, [f.dna_name.name_only for f in self.fields]))
% (path, [f.dna_name.name_only for f in self.fields])
)
dna_type = field.dna_type dna_type = field.dna_type
dna_name = field.dna_name dna_name = field.dna_name
if dna_type.dna_type_id == b"char": if dna_type.dna_type_id == b'char':
if type(value) is str: if type(value) is str:
return DNA_IO.write_string(handle, value, dna_name.array_size) return DNA_IO.write_string(handle, value, dna_name.array_size)
else: else:
return DNA_IO.write_bytes(handle, value, dna_name.array_size) return DNA_IO.write_bytes(handle, value, dna_name.array_size)
elif dna_type.dna_type_id == b"int": elif dna_type.dna_type_id == b'int':
DNA_IO.write_int(handle, header, value) DNA_IO.write_int(handle, header, value)
else: else:
raise NotImplementedError( raise NotImplementedError("Setting %r is not yet supported for %r" %
"Setting %r is not yet supported for %r" % (dna_type, dna_name), (dna_type, dna_name), dna_name, dna_type)
dna_name,
dna_type,
)
class DNA_IO: class DNA_IO:
@ -907,20 +821,20 @@ class DNA_IO:
@staticmethod @staticmethod
def write_string(handle, astring, fieldlen): def write_string(handle, astring, fieldlen):
assert isinstance(astring, str) assert(isinstance(astring, str))
if len(astring) >= fieldlen: if len(astring) >= fieldlen:
stringw = astring[0:fieldlen] stringw = astring[0:fieldlen]
else: else:
stringw = astring + "\0" stringw = astring + '\0'
handle.write(stringw.encode("utf-8")) handle.write(stringw.encode('utf-8'))
@staticmethod @staticmethod
def write_bytes(handle, astring, fieldlen): def write_bytes(handle, astring, fieldlen):
assert isinstance(astring, (bytes, bytearray)) assert(isinstance(astring, (bytes, bytearray)))
if len(astring) >= fieldlen: if len(astring) >= fieldlen:
stringw = astring[0:fieldlen] stringw = astring[0:fieldlen]
else: else:
stringw = astring + b"\0" stringw = astring + b'\0'
handle.write(stringw) handle.write(stringw)
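A note on the two writers above: values longer than the field are truncated to `fieldlen`, and shorter values get a single NUL appended rather than being padded to the full field length. A minimal, standalone illustration of that behaviour (the `_write_bytes` helper below simply mirrors the visible code, it is not part of the add-on):

```python
import io

def _write_bytes(handle, astring, fieldlen):
    # Mirrors DNA_IO.write_bytes as shown above: truncate or NUL-terminate.
    if len(astring) >= fieldlen:
        stringw = astring[0:fieldlen]
    else:
        stringw = astring + b"\0"
    handle.write(stringw)

buf = io.BytesIO()
_write_bytes(buf, b"scene_name", 5)
assert buf.getvalue() == b"scene"       # truncated to the field length

buf = io.BytesIO()
_write_bytes(buf, b"cam", 8)
assert buf.getvalue() == b"cam\0"       # NUL-terminated, not padded to 8 bytes
```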
@ -936,44 +850,44 @@ class DNA_IO:
@staticmethod @staticmethod
def read_string(handle, length): def read_string(handle, length):
return DNA_IO.read_bytes(handle, length).decode("utf-8") return DNA_IO.read_bytes(handle, length).decode('utf-8')
@staticmethod @staticmethod
def read_string0(handle, length): def read_string0(handle, length):
return DNA_IO.read_bytes0(handle, length).decode("utf-8") return DNA_IO.read_bytes0(handle, length).decode('utf-8')
@staticmethod @staticmethod
def read_data0_offset(data, offset): def read_data0_offset(data, offset):
add = data.find(b"\0", offset) - offset add = data.find(b'\0', offset) - offset
return data[offset : offset + add] return data[offset:offset + add]
@staticmethod @staticmethod
def read_data0(data): def read_data0(data):
add = data.find(b"\0") add = data.find(b'\0')
return data[:add] return data[:add]
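The two `read_data0*` helpers above extract NUL-terminated byte strings from an in-memory buffer. A quick standalone illustration of what they return:

```python
data = b"GLOB\0extra bytes"

# read_data0: everything up to the first NUL.
assert data[: data.find(b"\0")] == b"GLOB"

# read_data0_offset: the same, but starting at a given offset.
offset = 2
add = data.find(b"\0", offset) - offset
assert data[offset : offset + add] == b"OB"
```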
USHORT = struct.Struct(b"<H"), struct.Struct(b">H") USHORT = struct.Struct(b'<H'), struct.Struct(b'>H')
@staticmethod @staticmethod
def read_ushort(handle, fileheader): def read_ushort(handle, fileheader):
st = DNA_IO.USHORT[fileheader.endian_index] st = DNA_IO.USHORT[fileheader.endian_index]
return st.unpack(handle.read(st.size))[0] return st.unpack(handle.read(st.size))[0]
SSHORT = struct.Struct(b"<h"), struct.Struct(b">h") SSHORT = struct.Struct(b'<h'), struct.Struct(b'>h')
@staticmethod @staticmethod
def read_short(handle, fileheader): def read_short(handle, fileheader):
st = DNA_IO.SSHORT[fileheader.endian_index] st = DNA_IO.SSHORT[fileheader.endian_index]
return st.unpack(handle.read(st.size))[0] return st.unpack(handle.read(st.size))[0]
UINT = struct.Struct(b"<I"), struct.Struct(b">I") UINT = struct.Struct(b'<I'), struct.Struct(b'>I')
@staticmethod @staticmethod
def read_uint(handle, fileheader): def read_uint(handle, fileheader):
st = DNA_IO.UINT[fileheader.endian_index] st = DNA_IO.UINT[fileheader.endian_index]
return st.unpack(handle.read(st.size))[0] return st.unpack(handle.read(st.size))[0]
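All of the numeric readers in this class follow the same pattern: a pre-compiled (little-endian, big-endian) pair of `struct.Struct` objects, indexed by the file header's `endian_index` (assumed here to be 0 for little-endian and 1 for big-endian blend files). A minimal standalone sketch:

```python
import struct

UINT = struct.Struct(b"<I"), struct.Struct(b">I")

def read_uint_from(buf: bytes, endian_index: int) -> int:
    # Pick the Struct matching the blend file's endianness, then unpack.
    st = UINT[endian_index]
    return st.unpack(buf[: st.size])[0]

assert read_uint_from(b"\x01\x00\x00\x00", 0) == 1   # little-endian
assert read_uint_from(b"\x00\x00\x00\x01", 1) == 1   # big-endian
```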
SINT = struct.Struct(b"<i"), struct.Struct(b">i") SINT = struct.Struct(b'<i'), struct.Struct(b'>i')
@staticmethod @staticmethod
def read_int(handle, fileheader): def read_int(handle, fileheader):
@ -982,22 +896,19 @@ class DNA_IO:
@staticmethod @staticmethod
def write_int(handle, fileheader, value): def write_int(handle, fileheader, value):
assert isinstance(value, int), "value must be int, but is %r: %r" % (type(value), value)
st = DNA_IO.SINT[fileheader.endian_index] st = DNA_IO.SINT[fileheader.endian_index]
to_write = st.pack(value) to_write = st.pack(value)
handle.write(to_write) handle.write(to_write)
FLOAT = struct.Struct(b"<f"), struct.Struct(b">f") FLOAT = struct.Struct(b'<f'), struct.Struct(b'>f')
@staticmethod @staticmethod
def read_float(handle, fileheader): def read_float(handle, fileheader):
st = DNA_IO.FLOAT[fileheader.endian_index] st = DNA_IO.FLOAT[fileheader.endian_index]
return st.unpack(handle.read(st.size))[0] return st.unpack(handle.read(st.size))[0]
ULONG = struct.Struct(b"<Q"), struct.Struct(b">Q") ULONG = struct.Struct(b'<Q'), struct.Struct(b'>Q')
@staticmethod @staticmethod
def read_ulong(handle, fileheader): def read_ulong(handle, fileheader):


@ -33,9 +33,7 @@ from cachecontrol.caches import FileCache
from . import appdirs from . import appdirs
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
_session = None  # requests.Session object that's set up for caching by requests_session().
def cache_directory(*subdirs) -> str: def cache_directory(*subdirs) -> str:
@ -58,12 +56,12 @@ def cache_directory(*subdirs) -> str:
if profile: if profile:
username = profile.username username = profile.username
else: else:
username = "anonymous" username = 'anonymous'
# TODO: use bpy.utils.user_resource('CACHE', ...) # TODO: use bpy.utils.user_resource('CACHE', ...)
# once https://developer.blender.org/T47684 is finished. # once https://developer.blender.org/T47684 is finished.
user_cache_dir = appdirs.user_cache_dir(appname="Blender", appauthor=False) user_cache_dir = appdirs.user_cache_dir(appname='Blender', appauthor=False)
cache_dir = os.path.join(user_cache_dir, "blender_cloud", username, *subdirs) cache_dir = os.path.join(user_cache_dir, 'blender_cloud', username, *subdirs)
os.makedirs(cache_dir, mode=0o700, exist_ok=True) os.makedirs(cache_dir, mode=0o700, exist_ok=True)
@ -78,11 +76,10 @@ def requests_session() -> requests.Session:
if _session is not None: if _session is not None:
return _session return _session
cache_name = cache_directory("blender_cloud_http") cache_name = cache_directory('blender_cloud_http')
log.info("Storing cache in %s" % cache_name) log.info('Storing cache in %s' % cache_name)
_session = cachecontrol.CacheControl(sess=requests.session(), cache=FileCache(cache_name))
return _session return _session
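`requests_session()` above is a lazily-initialised, module-level singleton: the first call builds a disk-cached session, later calls return the same object. A standalone sketch of that pattern (the cache directory literal below is illustrative; the real code derives a per-user path via `cache_directory()`):

```python
import cachecontrol
import requests
from cachecontrol.caches import FileCache

_session = None  # populated on first use

def requests_session() -> requests.Session:
    global _session
    if _session is not None:
        return _session
    _session = cachecontrol.CacheControl(
        sess=requests.session(),
        cache=FileCache("/tmp/blender_cloud_http"),  # illustrative path only
    )
    return _session
```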

File diff suppressed because it is too large


@ -36,6 +36,7 @@ class BatProgress(progress.Callback):
self.loop = asyncio.get_event_loop() self.loop = asyncio.get_event_loop()
def _set_attr(self, attr: str, value): def _set_attr(self, attr: str, value):
async def do_it(): async def do_it():
setattr(bpy.context.window_manager, attr, value) setattr(bpy.context.window_manager, attr, value)
@ -43,48 +44,48 @@ class BatProgress(progress.Callback):
def _txt(self, msg: str): def _txt(self, msg: str):
"""Set a text in a thread-safe way.""" """Set a text in a thread-safe way."""
self._set_attr("flamenco_status_txt", msg) self._set_attr('flamenco_status_txt', msg)
def _status(self, status: str): def _status(self, status: str):
"""Set the flamenco_status property in a thread-safe way.""" """Set the flamenco_status property in a thread-safe way."""
self._set_attr("flamenco_status", status) self._set_attr('flamenco_status', status)
def _progress(self, progress: int): def _progress(self, progress: int):
"""Set the flamenco_progress property in a thread-safe way.""" """Set the flamenco_progress property in a thread-safe way."""
self._set_attr("flamenco_progress", progress) self._set_attr('flamenco_progress', progress)
def pack_start(self) -> None: def pack_start(self) -> None:
self._txt("Starting BAT Pack operation") self._txt('Starting BAT Pack operation')
def pack_done(self, output_blendfile: pathlib.Path, missing_files: typing.Set[pathlib.Path]) -> None:
if missing_files: if missing_files:
self._txt("There were %d missing files" % len(missing_files)) self._txt('There were %d missing files' % len(missing_files))
else: else:
self._txt("Pack of %s done" % output_blendfile.name) self._txt('Pack of %s done' % output_blendfile.name)
def pack_aborted(self, reason: str): def pack_aborted(self, reason: str):
self._txt("Aborted: %s" % reason) self._txt('Aborted: %s' % reason)
self._status("ABORTED") self._status('ABORTED')
def trace_blendfile(self, filename: pathlib.Path) -> None: def trace_blendfile(self, filename: pathlib.Path) -> None:
"""Called for every blendfile opened when tracing dependencies.""" """Called for every blendfile opened when tracing dependencies."""
self._txt("Inspecting %s" % filename.name) self._txt('Inspecting %s' % filename.name)
def trace_asset(self, filename: pathlib.Path) -> None: def trace_asset(self, filename: pathlib.Path) -> None:
if filename.stem == ".blend": if filename.stem == '.blend':
return return
self._txt("Found asset %s" % filename.name) self._txt('Found asset %s' % filename.name)
def rewrite_blendfile(self, orig_filename: pathlib.Path) -> None: def rewrite_blendfile(self, orig_filename: pathlib.Path) -> None:
self._txt("Rewriting %s" % orig_filename.name) self._txt('Rewriting %s' % orig_filename.name)
def transfer_file(self, src: pathlib.Path, dst: pathlib.Path) -> None: def transfer_file(self, src: pathlib.Path, dst: pathlib.Path) -> None:
self._txt("Transferring %s" % src.name) self._txt('Transferring %s' % src.name)
def transfer_file_skipped(self, src: pathlib.Path, dst: pathlib.Path) -> None: def transfer_file_skipped(self, src: pathlib.Path, dst: pathlib.Path) -> None:
self._txt("Skipped %s" % src.name) self._txt('Skipped %s' % src.name)
def transfer_progress(self, total_bytes: int, transferred_bytes: int) -> None: def transfer_progress(self, total_bytes: int, transferred_bytes: int) -> None:
self._progress(round(100 * transferred_bytes / total_bytes)) self._progress(round(100 * transferred_bytes / total_bytes))
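These callbacks are invoked from BAT's worker thread, so `_set_attr()` (shown at the top of this class) wraps the actual `setattr()` in a coroutine and hands it to the asyncio loop the add-on runs inside Blender; the scheduling call itself falls outside this hunk. A minimal sketch of that pattern, using `run_coroutine_threadsafe` as an assumed stand-in for the real scheduling:

```python
import asyncio

class ThreadSafeReporter:
    def __init__(self, loop: asyncio.AbstractEventLoop, target) -> None:
        self.loop = loop      # the loop driven by the add-on's async_loop module
        self.target = target  # e.g. bpy.context.window_manager

    def _set_attr(self, attr: str, value) -> None:
        async def do_it():
            setattr(self.target, attr, value)

        # Hand the update to the event loop's thread instead of touching
        # Blender data directly from the worker thread.
        asyncio.run_coroutine_threadsafe(do_it(), self.loop)
```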
@ -97,17 +98,15 @@ class BatProgress(progress.Callback):
class ShamanPacker(shaman.ShamanPacker): class ShamanPacker(shaman.ShamanPacker):
"""Packer with support for getting an auth token from Flamenco Server.""" """Packer with support for getting an auth token from Flamenco Server."""
def __init__(self, bfile: pathlib.Path, project: pathlib.Path, target: str, endpoint: str, checkout_id: str, *, manager_id: str, **kwargs) -> None:
self.manager_id = manager_id self.manager_id = manager_id
super().__init__(bfile, project, target, endpoint, checkout_id, **kwargs) super().__init__(bfile, project, target, endpoint, checkout_id, **kwargs)
@ -117,27 +116,25 @@ class ShamanPacker(shaman.ShamanPacker):
from ..blender import PILLAR_SERVER_URL from ..blender import PILLAR_SERVER_URL
from ..pillar import blender_id_subclient, uncached_session, SUBCLIENT_ID from ..pillar import blender_id_subclient, uncached_session, SUBCLIENT_ID
url = urllib.parse.urljoin(PILLAR_SERVER_URL, "flamenco/jwt/generate-token/%s" % self.manager_id)
auth_token = blender_id_subclient()["token"]
resp = uncached_session.get(url, auth=(auth_token, SUBCLIENT_ID)) resp = uncached_session.get(url, auth=(auth_token, SUBCLIENT_ID))
resp.raise_for_status() resp.raise_for_status()
return resp.text return resp.text
async def copy(context, base_blendfile: pathlib.Path, project: pathlib.Path, target: str, exclusion_filter: str, *,
               relative_only: bool, packer_class=pack.Packer, **packer_args) -> typing.Tuple[pathlib.Path, typing.Set[pathlib.Path]]:
"""Use BAT🦇 to copy the given file and dependencies to the target location. """Use BAT🦇 to copy the given file and dependencies to the target location.
:raises: FileTransferError if a file couldn't be transferred. :raises: FileTransferError if a file couldn't be transferred.
@ -148,36 +145,30 @@ async def copy(
loop = asyncio.get_event_loop() loop = asyncio.get_event_loop()
wm = bpy.context.window_manager wm = bpy.context.window_manager
packer = packer_class(base_blendfile, project, target, compress=True, relative_only=relative_only, **packer_args)
with packer: with packer:
with _packer_lock: with _packer_lock:
if exclusion_filter: if exclusion_filter:
# There was a mistake in an older version of the property tooltip, # There was a mistake in an older version of the property tooltip,
# showing semicolon-separated instead of space-separated. We now # showing semicolon-separated instead of space-separated. We now
# just handle both. # just handle both.
filter_parts = re.split("[ ;]+", exclusion_filter.strip(" ;")) filter_parts = re.split('[ ;]+', exclusion_filter.strip(' ;'))
packer.exclude(*filter_parts) packer.exclude(*filter_parts)
packer.progress_cb = BatProgress() packer.progress_cb = BatProgress()
_running_packer = packer _running_packer = packer
log.debug("awaiting strategise") log.debug('awaiting strategise')
wm.flamenco_status = "INVESTIGATING" wm.flamenco_status = 'INVESTIGATING'
await loop.run_in_executor(None, packer.strategise) await loop.run_in_executor(None, packer.strategise)
log.debug("awaiting execute") log.debug('awaiting execute')
wm.flamenco_status = "TRANSFERRING" wm.flamenco_status = 'TRANSFERRING'
await loop.run_in_executor(None, packer.execute) await loop.run_in_executor(None, packer.execute)
log.debug("done") log.debug('done')
wm.flamenco_status = "DONE" wm.flamenco_status = 'DONE'
with _packer_lock: with _packer_lock:
_running_packer = None _running_packer = None
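Regarding the exclusion-filter handling earlier in this function: both the documented space-separated form and the older semicolon-separated form (or any mix of the two) end up as the same argument list for `packer.exclude()`. A quick illustration:

```python
import re

exclusion_filter = "*.abc;*.vbo  *.zip"
filter_parts = re.split("[ ;]+", exclusion_filter.strip(" ;"))
assert filter_parts == ["*.abc", "*.vbo", "*.zip"]
```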
@ -193,7 +184,7 @@ def abort() -> None:
with _packer_lock: with _packer_lock:
if _running_packer is None: if _running_packer is None:
log.debug("No running packer, ignoring call to bat_abort()") log.debug('No running packer, ignoring call to bat_abort()')
return return
log.info("Aborting running packer") log.info('Aborting running packer')
_running_packer.abort() _running_packer.abort()


@ -1,40 +1,16 @@
import functools import functools
import pathlib import pathlib
import typing
from pillarsdk.resource import List, Find, Create from pillarsdk.resource import List, Find, Create
class Manager(List, Find): class Manager(List, Find):
"""Manager class wrapping the REST nodes endpoint""" """Manager class wrapping the REST nodes endpoint"""
path = "flamenco/managers"
PurePlatformPath = pathlib.PurePath PurePlatformPath = pathlib.PurePath
@functools.lru_cache(maxsize=1) @functools.lru_cache()
def _path_replacements(self) -> list: def _sorted_path_replacements(self) -> list:
"""Defer to _path_replacements_vN() to get path replacement vars.
Returns a list of tuples (variable name, variable value).
"""
settings_version = self.settings_version or 1
try:
settings_func = getattr(self, "_path_replacements_v%d" % settings_version)
except AttributeError:
raise RuntimeError(
"This manager has unsupported settings version %d; "
"upgrade Blender Cloud add-on"
)
def longest_value_first(item):
var_name, var_value = item
return -len(var_value), var_value, var_name
replacements = settings_func()
replacements.sort(key=longest_value_first)
return replacements
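The sort key above exists so that longer path values are tried before shorter ones; otherwise a variable mapped to `/render` could shadow one mapped to `/render/frames`. A minimal illustration with made-up variables:

```python
replacements = [("render", "/render"), ("frames", "/render/frames")]

def longest_value_first(item):
    var_name, var_value = item
    return -len(var_value), var_value, var_name

replacements.sort(key=longest_value_first)
assert replacements[0] == ("frames", "/render/frames")
```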
def _path_replacements_v1(self) -> typing.List[typing.Tuple[str, str]]:
import platform import platform
if self.path_replacement is None: if self.path_replacement is None:
@ -42,36 +18,13 @@ class Manager(List, Find):
items = self.path_replacement.to_dict().items()
this_platform = platform.system().lower()
return [(varname, platform_replacements[this_platform]) for varname, platform_replacements in items if this_platform in platform_replacements]
def _path_replacements_v2(self) -> typing.List[typing.Tuple[str, str]]:
import platform
if not self.variables:
return []
this_platform = platform.system().lower()
audiences = {"users", "all"}
replacements = []
for var_name, variable in self.variables.to_dict().items():
# Path replacement requires bidirectional variables.
if variable.get("direction") != "twoway":
continue
for var_value in variable.get("values", []):
if var_value.get("audience") not in audiences:
continue
if var_value.get("platform", "").lower() != this_platform:
continue
replacements.append((var_name, var_value.get("value")))
return replacements
def replace_path(self, some_path: pathlib.PurePath) -> str: def replace_path(self, some_path: pathlib.PurePath) -> str:
"""Performs path variable replacement. """Performs path variable replacement.
@ -79,11 +32,10 @@ class Manager(List, Find):
Tries to find platform-specific path prefixes, and replaces them with Tries to find platform-specific path prefixes, and replaces them with
variables. variables.
""" """
assert isinstance(some_path, pathlib.PurePath), "some_path should be a PurePath, not %r" % some_path
for varname, path in self._path_replacements(): for varname, path in self._sorted_path_replacements():
replacement = self.PurePlatformPath(path) replacement = self.PurePlatformPath(path)
try: try:
relpath = some_path.relative_to(replacement) relpath = some_path.relative_to(replacement)
@ -91,26 +43,26 @@ class Manager(List, Find):
# Not relative to each other, so no replacement possible # Not relative to each other, so no replacement possible
continue continue
replacement_root = self.PurePlatformPath("{%s}" % varname) replacement_root = self.PurePlatformPath('{%s}' % varname)
return (replacement_root / relpath).as_posix() return (replacement_root / relpath).as_posix()
return some_path.as_posix() return some_path.as_posix()
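To make the effect of `replace_path()` concrete, here is a standalone sketch with a single hard-coded replacement variable; the real list comes from the Manager's settings and may be platform-specific:

```python
import pathlib

replacements = [("shared", "/mnt/shared/flamenco")]  # (variable name, path prefix)

def replace_path(some_path: pathlib.PurePath) -> str:
    for varname, path in replacements:
        prefix = pathlib.PurePosixPath(path)
        try:
            relpath = some_path.relative_to(prefix)
        except ValueError:
            continue  # not below this prefix, try the next variable
        return (pathlib.PurePosixPath("{%s}" % varname) / relpath).as_posix()
    return some_path.as_posix()

blend = pathlib.PurePosixPath("/mnt/shared/flamenco/jobs/scene.blend")
assert replace_path(blend) == "{shared}/jobs/scene.blend"
```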
class Job(List, Find, Create): class Job(List, Find, Create):
"""Job class wrapping the REST nodes endpoint""" """Job class wrapping the REST nodes endpoint
"""
path = "flamenco/jobs" path = 'flamenco/jobs'
ensure_query_projections = {"project": 1} ensure_query_projections = {'project': 1}
def patch(self, payload: dict, api=None): def patch(self, payload: dict, api=None):
import pillarsdk.utils import pillarsdk.utils
import json
api = api or self.api api = api or self.api
url = pillarsdk.utils.join_url(self.path, str(self["_id"])) url = pillarsdk.utils.join_url(self.path, str(self['_id']))
headers = pillarsdk.utils.merge_dict(self.http_headers(), {"Content-Type": "application/json"})
response = api.patch(url, payload, headers=headers) response = api.patch(url, payload, headers=headers)
return response return response


@ -23,31 +23,28 @@ from pillarsdk import exceptions as sdk_exceptions
from .pillar import pillar_call from .pillar import pillar_call
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
HOME_PROJECT_ENDPOINT = "/bcloud/home-project" HOME_PROJECT_ENDPOINT = '/bcloud/home-project'
async def get_home_project(params=None) -> pillarsdk.Project: async def get_home_project(params=None) -> pillarsdk.Project:
"""Returns the home project.""" """Returns the home project."""
log.debug("Getting home project") log.debug('Getting home project')
try: try:
return await pillar_call(pillarsdk.Project.find_from_endpoint, HOME_PROJECT_ENDPOINT, params=params)
except sdk_exceptions.ForbiddenAccess: except sdk_exceptions.ForbiddenAccess:
log.warning("Access to the home project was denied. Double-check that you are logged in with valid BlenderID credentials.")
raise raise
except sdk_exceptions.ResourceNotFound: except sdk_exceptions.ResourceNotFound:
log.warning("No home project available.") log.warning('No home project available.')
raise raise
async def get_home_project_id() -> str: async def get_home_project_id() -> str:
"""Returns just the ID of the home project.""" """Returns just the ID of the home project."""
home_proj = await get_home_project({"projection": {"_id": 1}}) home_proj = await get_home_project({'projection': {'_id': 1}})
home_proj_id = home_proj["_id"] home_proj_id = home_proj['_id']
return home_proj_id return home_proj_id


@ -27,8 +27,8 @@ from pillarsdk import exceptions as sdk_exceptions
from .pillar import pillar_call from .pillar import pillar_call
from . import async_loop, pillar, home_project, blender from . import async_loop, pillar, home_project, blender
REQUIRES_ROLES_FOR_IMAGE_SHARING = {"subscriber", "demo"} REQUIRES_ROLES_FOR_IMAGE_SHARING = {'subscriber', 'demo'}
IMAGE_SHARING_GROUP_NODE_NAME = "Image sharing" IMAGE_SHARING_GROUP_NODE_NAME = 'Image sharing'
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
@ -36,84 +36,76 @@ async def find_image_sharing_group_id(home_project_id, user_id):
# Find the top-level image sharing group node. # Find the top-level image sharing group node.
try: try:
share_group, created = await pillar.find_or_create_node(
    where={"project": home_project_id, "node_type": "group", "parent": None, "name": IMAGE_SHARING_GROUP_NODE_NAME},
    additional_create_props={"user": user_id, "properties": {}},
    projection={"_id": 1},
    may_create=True,
)
except pillar.PillarError: except pillar.PillarError:
log.exception("Pillar error caught") log.exception('Pillar error caught')
raise pillar.PillarError("Unable to find image sharing folder on the Cloud") raise pillar.PillarError('Unable to find image sharing folder on the Cloud')
return share_group["_id"] return share_group['_id']
class PILLAR_OT_image_share(pillar.PillarOperatorMixin, async_loop.AsyncModalOperatorMixin, bpy.types.Operator):
bl_idname = "pillar.image_share" bl_idname = 'pillar.image_share'
bl_label = "Share an image/screenshot via Blender Cloud" bl_label = 'Share an image/screenshot via Blender Cloud'
bl_description = "Uploads an image for sharing via Blender Cloud" bl_description = 'Uploads an image for sharing via Blender Cloud'
log = logging.getLogger("bpy.ops.%s" % bl_idname) log = logging.getLogger('bpy.ops.%s' % bl_idname)
home_project_id = None home_project_id = None
home_project_url = "home" home_project_url = 'home'
share_group_id = None # top-level share group node ID share_group_id = None # top-level share group node ID
user_id = None user_id = None
target: bpy.props.EnumProperty(
    items=[
        ("FILE", "File", "Share an image file"),
        ("DATABLOCK", "Datablock", "Share an image datablock"),
        ("SCREENSHOT", "Screenshot", "Share a screenshot"),
    ],
    name="target",
    default="SCREENSHOT",
)
name: bpy.props.StringProperty(name="name", description="File or datablock name to sync")
screenshot_show_multiview: bpy.props.BoolProperty(name="screenshot_show_multiview", description="Enable Multi-View", default=False)
screenshot_use_multiview: bpy.props.BoolProperty(name="screenshot_use_multiview", description="Use Multi-View", default=False)
screenshot_full: bpy.props.BoolProperty(
    name="screenshot_full",
    description="Full Screen, Capture the whole window (otherwise only capture the active area)",
    default=False,
)
def invoke(self, context, event): def invoke(self, context, event):
# Do a quick test on datablock dirtiness. If it's not packed and dirty, # Do a quick test on datablock dirtiness. If it's not packed and dirty,
# the user should save it first. # the user should save it first.
if self.target == "DATABLOCK": if self.target == 'DATABLOCK':
if not self.name: if not self.name:
self.report({"ERROR"}, "No name given of the datablock to share.") self.report({'ERROR'}, 'No name given of the datablock to share.')
return {"CANCELLED"} return {'CANCELLED'}
datablock = bpy.data.images[self.name] datablock = bpy.data.images[self.name]
if datablock.type == "IMAGE" and datablock.is_dirty and not datablock.packed_file:
    self.report({"ERROR"}, "Datablock is dirty, save it first.")
    return {"CANCELLED"}
return async_loop.AsyncModalOperatorMixin.invoke(self, context, event) return async_loop.AsyncModalOperatorMixin.invoke(self, context, event)
@ -121,87 +113,80 @@ class PILLAR_OT_image_share(
"""Entry point of the asynchronous operator.""" """Entry point of the asynchronous operator."""
# We don't want to influence what is included in the screen shot. # We don't want to influence what is included in the screen shot.
if self.target == "SCREENSHOT": if self.target == 'SCREENSHOT':
print("Blender Cloud add-on is communicating with Blender Cloud") print('Blender Cloud add-on is communicating with Blender Cloud')
else: else:
self.report({"INFO"}, "Communicating with Blender Cloud") self.report({'INFO'}, 'Communicating with Blender Cloud')
try: try:
# Refresh credentials # Refresh credentials
try: try:
db_user = await self.check_credentials(context, REQUIRES_ROLES_FOR_IMAGE_SHARING)
self.user_id = db_user["_id"]
self.log.debug("Found user ID: %s", self.user_id)
except pillar.NotSubscribedToCloudError as ex: except pillar.NotSubscribedToCloudError as ex:
self._log_subscription_needed(can_renew=ex.can_renew) self._log_subscription_needed(can_renew=ex.can_renew)
self._state = "QUIT" self._state = 'QUIT'
return return
except pillar.UserNotLoggedInError: except pillar.UserNotLoggedInError:
self.log.exception("Error checking/refreshing credentials.") self.log.exception('Error checking/refreshing credentials.')
self.report({"ERROR"}, "Please log in on Blender ID first.") self.report({'ERROR'}, 'Please log in on Blender ID first.')
self._state = "QUIT" self._state = 'QUIT'
return return
# Find the home project. # Find the home project.
try: try:
home_proj = await home_project.get_home_project({"projection": {"_id": 1, "url": 1}})
except sdk_exceptions.ForbiddenAccess: except sdk_exceptions.ForbiddenAccess:
self.log.exception("Forbidden access to home project.") self.log.exception('Forbidden access to home project.')
self.report({"ERROR"}, "Did not get access to home project.") self.report({'ERROR'}, 'Did not get access to home project.')
self._state = "QUIT" self._state = 'QUIT'
return return
except sdk_exceptions.ResourceNotFound: except sdk_exceptions.ResourceNotFound:
self.report({"ERROR"}, "Home project not found.") self.report({'ERROR'}, 'Home project not found.')
self._state = "QUIT" self._state = 'QUIT'
return return
self.home_project_id = home_proj["_id"] self.home_project_id = home_proj['_id']
self.home_project_url = home_proj["url"] self.home_project_url = home_proj['url']
try: try:
gid = await find_image_sharing_group_id(self.home_project_id, self.user_id)
self.share_group_id = gid self.share_group_id = gid
self.log.debug("Found group node ID: %s", self.share_group_id) self.log.debug('Found group node ID: %s', self.share_group_id)
except sdk_exceptions.ForbiddenAccess: except sdk_exceptions.ForbiddenAccess:
self.log.exception("Unable to find Group ID") self.log.exception('Unable to find Group ID')
self.report({"ERROR"}, "Unable to find sync folder.") self.report({'ERROR'}, 'Unable to find sync folder.')
self._state = "QUIT" self._state = 'QUIT'
return return
await self.share_image(context) await self.share_image(context)
except Exception as ex: except Exception as ex:
self.log.exception("Unexpected exception caught.") self.log.exception('Unexpected exception caught.')
self.report({"ERROR"}, "Unexpected error %s: %s" % (type(ex), ex)) self.report({'ERROR'}, 'Unexpected error %s: %s' % (type(ex), ex))
self._state = "QUIT" self._state = 'QUIT'
async def share_image(self, context): async def share_image(self, context):
"""Sends files to the Pillar server.""" """Sends files to the Pillar server."""
if self.target == "FILE": if self.target == 'FILE':
self.report({"INFO"}, "Uploading %s '%s'" % (self.target.lower(), self.name))
node = await self.upload_file(self.name) node = await self.upload_file(self.name)
elif self.target == "SCREENSHOT": elif self.target == 'SCREENSHOT':
node = await self.upload_screenshot(context) node = await self.upload_screenshot(context)
else: else:
self.report({"INFO"}, "Uploading %s '%s'" % (self.target.lower(), self.name))
node = await self.upload_datablock(context) node = await self.upload_datablock(context)
self.report({"INFO"}, "Upload complete, creating link to share.") self.report({'INFO'}, 'Upload complete, creating link to share.')
share_info = await pillar_call(node.share) share_info = await pillar_call(node.share)
url = share_info.get("short_link") url = share_info.get('short_link')
context.window_manager.clipboard = url context.window_manager.clipboard = url
self.report({"INFO"}, "The link has been copied to your clipboard: %s" % url) self.report({'INFO'}, 'The link has been copied to your clipboard: %s' % url)
await self.maybe_open_browser(url) await self.maybe_open_browser(url)
@ -211,21 +196,19 @@ class PILLAR_OT_image_share(
Returns the node. Returns the node.
""" """
self.log.info("Uploading file %s", filename) self.log.info('Uploading file %s', filename)
node = await pillar_call(
    pillarsdk.Node.create_asset_from_file,
    self.home_project_id,
    self.share_group_id,
    "image",
    filename,
    extra_where={"user": self.user_id},
    always_create_new_node=True,
    fileobj=fileobj,
    caching=False,
)
node_id = node["_id"]
self.log.info("Created node %s", node_id)
self.report({"INFO"}, "File succesfully uploaded to the cloud!")
return node return node
@ -236,7 +219,7 @@ class PILLAR_OT_image_share(
import webbrowser import webbrowser
self.log.info("Opening browser at %s", url) self.log.info('Opening browser at %s', url)
webbrowser.open_new_tab(url) webbrowser.open_new_tab(url)
async def upload_datablock(self, context) -> pillarsdk.Node: async def upload_datablock(self, context) -> pillarsdk.Node:
@ -248,13 +231,12 @@ class PILLAR_OT_image_share(
self.log.info("Uploading datablock '%s'" % self.name) self.log.info("Uploading datablock '%s'" % self.name)
datablock = bpy.data.images[self.name] datablock = bpy.data.images[self.name]
if datablock.type == "RENDER_RESULT": if datablock.type == 'RENDER_RESULT':
# Construct a sensible name for this render. # Construct a sensible name for this render.
filename = "%s-%s-render%s" % ( filename = '%s-%s-render%s' % (
os.path.splitext(os.path.basename(context.blend_data.filepath))[0], os.path.splitext(os.path.basename(context.blend_data.filepath))[0],
context.scene.name, context.scene.name,
context.scene.render.file_extension, context.scene.render.file_extension)
)
return await self.upload_via_tempdir(datablock, filename) return await self.upload_via_tempdir(datablock, filename)
if datablock.packed_file is not None: if datablock.packed_file is not None:
@ -283,7 +265,7 @@ class PILLAR_OT_image_share(
with tempfile.TemporaryDirectory() as tmpdir: with tempfile.TemporaryDirectory() as tmpdir:
filepath = os.path.join(tmpdir, filename_on_cloud) filepath = os.path.join(tmpdir, filename_on_cloud)
self.log.debug("Saving %s to %s", datablock, filepath) self.log.debug('Saving %s to %s', datablock, filepath)
datablock.save_render(filepath) datablock.save_render(filepath)
return await self.upload_file(filepath) return await self.upload_file(filepath)
@ -295,27 +277,25 @@ class PILLAR_OT_image_share(
import io import io
filename = "%s.%s" % (datablock.name, datablock.file_format.lower()) filename = '%s.%s' % (datablock.name, datablock.file_format.lower())
fileobj = io.BytesIO(datablock.packed_file.data) fileobj = io.BytesIO(datablock.packed_file.data)
fileobj.seek(0) # ensure PillarSDK reads the file from the beginning. fileobj.seek(0) # ensure PillarSDK reads the file from the beginning.
self.log.info("Uploading packed file directly from memory to %r.", filename) self.log.info('Uploading packed file directly from memory to %r.', filename)
return await self.upload_file(filename, fileobj=fileobj) return await self.upload_file(filename, fileobj=fileobj)
async def upload_screenshot(self, context) -> pillarsdk.Node: async def upload_screenshot(self, context) -> pillarsdk.Node:
"""Takes a screenshot, saves it to a temp file, and uploads it.""" """Takes a screenshot, saves it to a temp file, and uploads it."""
self.name = datetime.datetime.now().strftime("Screenshot-%Y-%m-%d-%H%M%S.png") self.name = datetime.datetime.now().strftime('Screenshot-%Y-%m-%d-%H%M%S.png')
self.report({"INFO"}, "Uploading %s '%s'" % (self.target.lower(), self.name)) self.report({'INFO'}, "Uploading %s '%s'" % (self.target.lower(), self.name))
with tempfile.TemporaryDirectory() as tmpdir: with tempfile.TemporaryDirectory() as tmpdir:
filepath = os.path.join(tmpdir, self.name) filepath = os.path.join(tmpdir, self.name)
self.log.debug("Saving screenshot to %s", filepath) self.log.debug('Saving screenshot to %s', filepath)
bpy.ops.screen.screenshot(filepath=filepath, show_multiview=self.screenshot_show_multiview, use_multiview=self.screenshot_use_multiview, full=self.screenshot_full)
return await self.upload_file(filepath) return await self.upload_file(filepath)
@ -324,25 +304,22 @@ def image_editor_menu(self, context):
box = self.layout.row() box = self.layout.row()
if image and image.has_data: if image and image.has_data:
text = "Share on Blender Cloud" text = 'Share on Blender Cloud'
if image.type == "IMAGE" and image.is_dirty and not image.packed_file: if image.type == 'IMAGE' and image.is_dirty and not image.packed_file:
box.enabled = False box.enabled = False
text = "Save image before sharing on Blender Cloud" text = 'Save image before sharing on Blender Cloud'
props = box.operator(PILLAR_OT_image_share.bl_idname, text=text, icon_value=blender.icon("CLOUD"))
props.target = "DATABLOCK"
props.name = image.name props.name = image.name
def window_menu(self, context): def window_menu(self, context):
props = self.layout.operator(PILLAR_OT_image_share.bl_idname, text="Share screenshot via Blender Cloud", icon_value=blender.icon("CLOUD"))
props.target = "SCREENSHOT"
props.screenshot_full = True props.screenshot_full = True

File diff suppressed because it is too large


@ -6,16 +6,18 @@ import typing
# Names of BlenderCloudPreferences properties that are both project-specific # Names of BlenderCloudPreferences properties that are both project-specific
# and simple enough to store directly in a dict. # and simple enough to store directly in a dict.
PROJECT_SPECIFIC_SIMPLE_PROPS = ("cloud_project_local_path",)
# Names of BlenderCloudPreferences properties that are project-specific and # Names of BlenderCloudPreferences properties that are project-specific and
# Flamenco Manager-specific, and simple enough to store in a dict. # Flamenco Manager-specific, and simple enough to store in a dict.
FLAMENCO_PER_PROJECT_PER_MANAGER = ( FLAMENCO_PER_PROJECT_PER_MANAGER = (
"flamenco_exclude_filter", 'flamenco_exclude_filter',
"flamenco_job_file_path", 'flamenco_job_file_path',
"flamenco_job_output_path", 'flamenco_job_output_path',
"flamenco_job_output_strip_components", 'flamenco_job_output_strip_components',
"flamenco_relative_only", 'flamenco_relative_only',
) )
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
@ -36,28 +38,25 @@ def mark_as_loading():
project_settings_loading -= 1 project_settings_loading -= 1
def update_preferences(prefs, names_to_update: typing.Iterable[str], new_values: typing.Mapping[str, typing.Any]):
for name in names_to_update: for name in names_to_update:
if not hasattr(prefs, name): if not hasattr(prefs, name):
log.debug("not setting %r, property cannot be found", name) log.debug('not setting %r, property cannot be found', name)
continue continue
if name in new_values: if name in new_values:
log.debug("setting %r = %r", name, new_values[name]) log.debug('setting %r = %r', name, new_values[name])
setattr(prefs, name, new_values[name]) setattr(prefs, name, new_values[name])
else: else:
# The property wasn't stored, so set the default value instead. # The property wasn't stored, so set the default value instead.
bl_type, args = getattr(prefs.bl_rna, name) bl_type, args = getattr(prefs.bl_rna, name)
log.debug("finding default value for %r", name) log.debug('finding default value for %r', name)
if "default" not in args: if 'default' not in args:
log.debug("no default value for %r, not touching", name) log.debug('no default value for %r, not touching', name)
continue continue
log.debug("found default value for %r = %r", name, args["default"]) log.debug('found default value for %r = %r', name, args['default'])
setattr(prefs, name, args["default"]) setattr(prefs, name, args['default'])
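In short: stored values win, and a property that was not stored falls back to its declared default (taken from the RNA definition) when one exists. A Blender-free sketch of that decision, with made-up property names and defaults:

```python
DEFAULTS = {"flamenco_exclude_filter": "", "flamenco_relative_only": False}

def update_preferences(prefs: dict, names_to_update, new_values) -> None:
    for name in names_to_update:
        if name in new_values:
            prefs[name] = new_values[name]          # stored value wins
        elif name in DEFAULTS:
            prefs[name] = DEFAULTS[name]            # fall back to the default
        # otherwise: no default known, leave the current value untouched

prefs = {"flamenco_relative_only": True}
update_preferences(prefs, ["flamenco_relative_only"], {})
assert prefs["flamenco_relative_only"] is False
```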
def handle_project_update(_=None, _2=None): def handle_project_update(_=None, _2=None):
@ -71,32 +70,27 @@ def handle_project_update(_=None, _2=None):
with mark_as_loading(): with mark_as_loading():
prefs = preferences() prefs = preferences()
project_id = prefs.project.project project_id = prefs.project.project
log.debug("Updating internal state to reflect extensions enabled on current project %s.", project_id) log.info('Updating internal state to reflect extensions enabled on current project %s.', project_id)
project_extensions.cache_clear() project_extensions.cache_clear()
from blender_cloud import attract, flamenco from blender_cloud import attract, flamenco
attract.deactivate() attract.deactivate()
flamenco.deactivate() flamenco.deactivate()
enabled_for = project_extensions(project_id) enabled_for = project_extensions(project_id)
log.info("Project extensions: %s", enabled_for) log.info('Project extensions: %s', enabled_for)
if "attract" in enabled_for: if 'attract' in enabled_for:
attract.activate() attract.activate()
if "flamenco" in enabled_for: if 'flamenco' in enabled_for:
flamenco.activate() flamenco.activate()
# Load project-specific settings from the last time we visited this project. # Load project-specific settings from the last time we visited this project.
ps = prefs.get("project_settings", {}).get(project_id, {}) ps = prefs.get('project_settings', {}).get(project_id, {})
if not ps: if not ps:
log.debug("no project-specific settings are available, only resetting available Flamenco Managers")
# The Flamenco Manager should really be chosen explicitly out of the available # The Flamenco Manager should really be chosen explicitly out of the available
# Managers. # Managers.
prefs.flamenco_manager.available_managers = [] prefs.flamenco_manager.available_managers = []
@ -104,31 +98,23 @@ def handle_project_update(_=None, _2=None):
if log.isEnabledFor(logging.DEBUG): if log.isEnabledFor(logging.DEBUG):
from pprint import pformat from pprint import pformat
log.debug('loading project-specific settings:\n%s', pformat(ps.to_dict()))
log.debug("loading project-specific settings:\n%s", pformat(ps.to_dict()))
# Restore simple properties. # Restore simple properties.
update_preferences(prefs, PROJECT_SPECIFIC_SIMPLE_PROPS, ps) update_preferences(prefs, PROJECT_SPECIFIC_SIMPLE_PROPS, ps)
# Restore Flamenco settings. # Restore Flamenco settings.
prefs.flamenco_manager.available_managers = ps.get("flamenco_available_managers", [])
flamenco_manager_id = ps.get("flamenco_manager_id")
if flamenco_manager_id: if flamenco_manager_id:
log.debug("setting flamenco manager to %s", flamenco_manager_id) log.debug('setting flamenco manager to %s', flamenco_manager_id)
try: try:
# This will trigger a load of Project+Manager-specific settings. # This will trigger a load of Project+Manager-specific settings.
prefs.flamenco_manager.manager = flamenco_manager_id prefs.flamenco_manager.manager = flamenco_manager_id
except TypeError: except TypeError:
log.warning("manager %s for this project could not be found", flamenco_manager_id)
elif prefs.flamenco_manager.available_managers: elif prefs.flamenco_manager.available_managers:
prefs.flamenco_manager.manager = prefs.flamenco_manager.available_managers[0]["_id"] prefs.flamenco_manager.manager = prefs.flamenco_manager.available_managers[0]
def store(_=None, _2=None): def store(_=None, _2=None):
@ -147,34 +133,31 @@ def store(_=None, _2=None):
prefs = preferences() prefs = preferences()
project_id = prefs.project.project project_id = prefs.project.project
all_settings = prefs.get("project_settings", {}) all_settings = prefs.get('project_settings', {})
ps = all_settings.get(project_id, {}) # either a dict or bpy.types.IDPropertyGroup ps = all_settings.get(project_id, {}) # either a dict or bpy.types.IDPropertyGroup
for name in PROJECT_SPECIFIC_SIMPLE_PROPS: for name in PROJECT_SPECIFIC_SIMPLE_PROPS:
ps[name] = getattr(prefs, name) ps[name] = getattr(prefs, name)
# Store project-specific Flamenco settings # Store project-specific Flamenco settings
ps["flamenco_manager_id"] = prefs.flamenco_manager.manager ps['flamenco_manager_id'] = prefs.flamenco_manager.manager
ps["flamenco_available_managers"] = prefs.flamenco_manager.available_managers ps['flamenco_available_managers'] = prefs.flamenco_manager.available_managers
# Store per-project, per-manager settings for the current Manager. # Store per-project, per-manager settings for the current Manager.
pppm = ps.get("flamenco_managers_settings", {}) pppm = ps.get('flamenco_managers_settings', {})
pppm[prefs.flamenco_manager.manager] = { pppm[prefs.flamenco_manager.manager] = {
name: getattr(prefs, name) for name in FLAMENCO_PER_PROJECT_PER_MANAGER name: getattr(prefs, name) for name in FLAMENCO_PER_PROJECT_PER_MANAGER
} }
ps["flamenco_managers_settings"] = pppm  # IDPropertyGroup has no setdefault() method.
# Store this project's settings in the preferences. # Store this project's settings in the preferences.
all_settings[project_id] = ps all_settings[project_id] = ps
prefs["project_settings"] = all_settings prefs['project_settings'] = all_settings
if log.isEnabledFor(logging.DEBUG): if log.isEnabledFor(logging.DEBUG):
from pprint import pformat from pprint import pformat
if hasattr(all_settings, "to_dict"):
to_log = all_settings.to_dict() to_log = all_settings.to_dict()
else: else:
to_log = all_settings to_log = all_settings
log.debug("Saving project-specific settings:\n%s", pformat(to_log)) log.debug('Saving project-specific settings:\n%s', pformat(to_log))


@ -37,33 +37,31 @@ from pillarsdk import exceptions as sdk_exceptions
from .pillar import pillar_call from .pillar import pillar_call
from . import async_loop, blender, pillar, cache, blendfile, home_project from . import async_loop, blender, pillar, cache, blendfile, home_project
SETTINGS_FILES_TO_UPLOAD = ["userpref.blend", "startup.blend"] SETTINGS_FILES_TO_UPLOAD = ['userpref.blend', 'startup.blend']
# These are RNA keys inside the userpref.blend file, and their # These are RNA keys inside the userpref.blend file, and their
# Python properties names. These settings will not be synced. # Python properties names. These settings will not be synced.
LOCAL_SETTINGS_RNA = [ LOCAL_SETTINGS_RNA = [
(b"dpi", "system.dpi"), (b'dpi', 'system.dpi'),
(b"virtual_pixel", "system.virtual_pixel_mode"), (b'virtual_pixel', 'system.virtual_pixel_mode'),
(b"compute_device_id", "system.compute_device"), (b'compute_device_id', 'system.compute_device'),
(b"compute_device_type", "system.compute_device_type"), (b'compute_device_type', 'system.compute_device_type'),
(b"fontdir", "filepaths.font_directory"), (b'fontdir', 'filepaths.font_directory'),
(b"textudir", "filepaths.texture_directory"), (b'textudir', 'filepaths.texture_directory'),
(b"renderdir", "filepaths.render_output_directory"), (b'renderdir', 'filepaths.render_output_directory'),
(b"pythondir", "filepaths.script_directory"), (b'pythondir', 'filepaths.script_directory'),
(b"sounddir", "filepaths.sound_directory"), (b'sounddir', 'filepaths.sound_directory'),
(b"tempdir", "filepaths.temporary_directory"), (b'tempdir', 'filepaths.temporary_directory'),
(b"render_cachedir", "filepaths.render_cache_directory"), (b'render_cachedir', 'filepaths.render_cache_directory'),
(b"i18ndir", "filepaths.i18n_branches_directory"), (b'i18ndir', 'filepaths.i18n_branches_directory'),
(b"image_editor", "filepaths.image_editor"), (b'image_editor', 'filepaths.image_editor'),
(b"anim_player", "filepaths.animation_player"), (b'anim_player', 'filepaths.animation_player'),
] ]
REQUIRES_ROLES_FOR_SYNC = set() # no roles needed. REQUIRES_ROLES_FOR_SYNC = set() # no roles needed.
SYNC_GROUP_NODE_NAME = "Blender Sync" SYNC_GROUP_NODE_NAME = 'Blender Sync'
SYNC_GROUP_NODE_DESC = "The [Blender Cloud Addon](https://cloud.blender.org/services#blender-addon) will synchronize your Blender settings here."
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
@ -76,7 +74,7 @@ def set_blender_sync_status(set_status: str):
try: try:
return func(*args, **kwargs) return func(*args, **kwargs)
finally: finally:
bss.status = "IDLE" bss.status = 'IDLE'
return wrapper return wrapper
@ -92,16 +90,18 @@ def async_set_blender_sync_status(set_status: str):
try: try:
return await func(*args, **kwargs) return await func(*args, **kwargs)
finally: finally:
bss.status = "IDLE" bss.status = 'IDLE'
return wrapper return wrapper
return decorator return decorator
async def find_sync_group_id(home_project_id: str, user_id: str, blender_version: str, *, may_create=True) -> typing.Tuple[str, str]:
"""Finds the group node in which to store sync assets. """Finds the group node in which to store sync assets.
If the group node doesn't exist and may_create=True, it creates it. If the group node doesn't exist and may_create=True, it creates it.
@ -111,52 +111,43 @@ async def find_sync_group_id(
# created by Pillar while creating the home project. # created by Pillar while creating the home project.
try: try:
sync_group, created = await pillar.find_or_create_node(
    where={"project": home_project_id, "node_type": "group", "parent": None, "name": SYNC_GROUP_NODE_NAME, "user": user_id},
    projection={"_id": 1},
    may_create=False,
)
except pillar.PillarError: except pillar.PillarError:
raise pillar.PillarError("Unable to find sync folder on the Cloud") raise pillar.PillarError('Unable to find sync folder on the Cloud')
if not may_create and sync_group is None: if not may_create and sync_group is None:
log.info("Sync folder doesn't exist, and not creating it either.") log.info("Sync folder doesn't exist, and not creating it either.")
return "", "" return '', ''
# Find/create the sub-group for the requested Blender version # Find/create the sub-group for the requested Blender version
try: try:
sub_sync_group, created = await pillar.find_or_create_node(
    where={"project": home_project_id, "node_type": "group", "parent": sync_group["_id"], "name": blender_version, "user": user_id},
    additional_create_props={
        "description": "Sync folder for Blender %s" % blender_version,
        "properties": {"status": "published"},
    },
    projection={"_id": 1},
    may_create=may_create,
)
except pillar.PillarError: except pillar.PillarError:
raise pillar.PillarError("Unable to create sync folder on the Cloud") raise pillar.PillarError('Unable to create sync folder on the Cloud')
if not may_create and sub_sync_group is None: if not may_create and sub_sync_group is None:
log.info("Sync folder for Blender version %s doesn't exist, and not creating it either.", blender_version)
return sync_group["_id"], ""
return sync_group["_id"], sub_sync_group["_id"] return sync_group['_id'], sub_sync_group['_id']
@functools.lru_cache() @functools.lru_cache()
@ -167,94 +158,82 @@ async def available_blender_versions(home_project_id: str, user_id: str) -> list
sync_group = await pillar_call( sync_group = await pillar_call(
pillarsdk.Node.find_first, pillarsdk.Node.find_first,
params={ params={
"where": { 'where': {'project': home_project_id,
"project": home_project_id, 'node_type': 'group',
"node_type": "group", 'parent': None,
"parent": None, 'name': SYNC_GROUP_NODE_NAME,
"name": SYNC_GROUP_NODE_NAME, 'user': user_id},
"user": user_id, 'projection': {'_id': 1},
},
"projection": {"_id": 1},
}, },
caching=False, caching=False)
)
if sync_group is None: if sync_group is None:
bss.report({"ERROR"}, "No synced Blender settings in your Blender Cloud") bss.report({'ERROR'}, 'No synced Blender settings in your Blender Cloud')
log.debug("-- unable to find sync group for home_project_id=%r and user_id=%r", home_project_id, user_id)
return [] return []
sync_nodes = await pillar_call( sync_nodes = await pillar_call(
pillarsdk.Node.all, pillarsdk.Node.all,
params={ params={
"where": { 'where': {'project': home_project_id,
"project": home_project_id, 'node_type': 'group',
"node_type": "group", 'parent': sync_group['_id'],
"parent": sync_group["_id"], 'user': user_id},
"user": user_id, 'projection': {'_id': 1, 'name': 1},
}, 'sort': '-name',
"projection": {"_id": 1, "name": 1},
"sort": "-name",
}, },
caching=False, caching=False)
)
if not sync_nodes or not sync_nodes._items: if not sync_nodes or not sync_nodes._items:
bss.report({"ERROR"}, "No synced Blender settings in your Blender Cloud.") bss.report({'ERROR'}, 'No synced Blender settings in your Blender Cloud.')
return [] return []
versions = [node.name for node in sync_nodes._items] versions = [node.name for node in sync_nodes._items]
log.debug("Versions: %s", versions) log.debug('Versions: %s', versions)
return versions return versions
# noinspection PyAttributeOutsideInit # noinspection PyAttributeOutsideInit
class PILLAR_OT_sync(pillar.PillarOperatorMixin, async_loop.AsyncModalOperatorMixin, bpy.types.Operator):
bl_idname = "pillar.sync" bl_idname = 'pillar.sync'
bl_label = "Synchronise with Blender Cloud" bl_label = 'Synchronise with Blender Cloud'
bl_description = "Synchronises Blender settings with Blender Cloud" bl_description = 'Synchronises Blender settings with Blender Cloud'
log = logging.getLogger("bpy.ops.%s" % bl_idname) log = logging.getLogger('bpy.ops.%s' % bl_idname)
home_project_id = "" home_project_id = ''
sync_group_id = "" # top-level sync group node ID sync_group_id = '' # top-level sync group node ID
sync_group_versioned_id = "" # sync group node ID for the given Blender version. sync_group_versioned_id = '' # sync group node ID for the given Blender version.
action: bpy.props.EnumProperty( action = bpy.props.EnumProperty(
items=[ items=[
("PUSH", "Push", "Push settings to the Blender Cloud"), ('PUSH', 'Push', 'Push settings to the Blender Cloud'),
("PULL", "Pull", "Pull settings from the Blender Cloud"), ('PULL', 'Pull', 'Pull settings from the Blender Cloud'),
("REFRESH", "Refresh", "Refresh available versions"), ('REFRESH', 'Refresh', 'Refresh available versions'),
("SELECT", "Select", "Select version to sync"), ('SELECT', 'Select', 'Select version to sync'),
], ],
name="action", name='action')
)
CURRENT_BLENDER_VERSION = "%i.%i" % bpy.app.version[:2] CURRENT_BLENDER_VERSION = '%i.%i' % bpy.app.version[:2]
blender_version: bpy.props.StringProperty(name="blender_version", description="Blender version to sync for", default=CURRENT_BLENDER_VERSION) blender_version = bpy.props.StringProperty(name='blender_version', description='Blender version to sync for', default=CURRENT_BLENDER_VERSION)
def bss_report(self, level, message): def bss_report(self, level, message):
bss = bpy.context.window_manager.blender_sync_status bss = bpy.context.window_manager.blender_sync_status
bss.report(level, message) bss.report(level, message)
def invoke(self, context, event): def invoke(self, context, event):
if self.action == "SELECT": if self.action == 'SELECT':
# Synchronous action # Synchronous action
return self.action_select(context) return self.action_select(context)
if self.action in {"PUSH", "PULL"} and not self.blender_version: if self.action in {'PUSH', 'PULL'} and not self.blender_version:
self.bss_report({"ERROR"}, "No Blender version to sync for was given.") self.bss_report({'ERROR'}, 'No Blender version to sync for was given.')
return {"CANCELLED"} return {'CANCELLED'}
return async_loop.AsyncModalOperatorMixin.invoke(self, context, event) return async_loop.AsyncModalOperatorMixin.invoke(self, context, event)
@ -264,146 +243,132 @@ class PILLAR_OT_sync(
This is a synchronous action, as it requires a dialog box. This is a synchronous action, as it requires a dialog box.
""" """
self.log.info("Performing action SELECT") self.log.info('Performing action SELECT')
# Do a refresh before we can show the dropdown. # Do a refresh before we can show the dropdown.
fut = asyncio.ensure_future(self.async_execute(context, action_override="REFRESH"))
loop = asyncio.get_event_loop() loop = asyncio.get_event_loop()
loop.run_until_complete(fut) loop.run_until_complete(fut)
self._state = "SELECTING" self._state = 'SELECTING'
return context.window_manager.invoke_props_dialog(self) return context.window_manager.invoke_props_dialog(self)
def draw(self, context): def draw(self, context):
bss = bpy.context.window_manager.blender_sync_status bss = bpy.context.window_manager.blender_sync_status
self.layout.prop(bss, "version", text="Blender version") self.layout.prop(bss, 'version', text='Blender version')
def execute(self, context): def execute(self, context):
if self.action != "SELECT": if self.action != 'SELECT':
log.debug("Ignoring execute() for action %r", self.action) log.debug('Ignoring execute() for action %r', self.action)
return {"FINISHED"} return {'FINISHED'}
log.debug("Performing execute() for action %r", self.action) log.debug('Performing execute() for action %r', self.action)
# Perform the sync when the user closes the dialog box. # Perform the sync when the user closes the dialog box.
bss = bpy.context.window_manager.blender_sync_status bss = bpy.context.window_manager.blender_sync_status
bpy.ops.pillar.sync( bpy.ops.pillar.sync('INVOKE_DEFAULT',
"INVOKE_DEFAULT", action="PULL", blender_version=bss.version action='PULL',
) blender_version=bss.version)
return {"FINISHED"} return {'FINISHED'}
@async_set_blender_sync_status("SYNCING") @async_set_blender_sync_status('SYNCING')
async def async_execute(self, context, *, action_override=None): async def async_execute(self, context, *, action_override=None):
"""Entry point of the asynchronous operator.""" """Entry point of the asynchronous operator."""
action = action_override or self.action action = action_override or self.action
self.bss_report({"INFO"}, "Communicating with Blender Cloud") self.bss_report({'INFO'}, 'Communicating with Blender Cloud')
self.log.info("Performing action %s", action) self.log.info('Performing action %s', action)
try: try:
# Refresh credentials # Refresh credentials
try: try:
db_user = await self.check_credentials(context, REQUIRES_ROLES_FOR_SYNC) db_user = await self.check_credentials(context, REQUIRES_ROLES_FOR_SYNC)
self.user_id = db_user["_id"] self.user_id = db_user['_id']
log.debug("Found user ID: %s", self.user_id) log.debug('Found user ID: %s', self.user_id)
except pillar.NotSubscribedToCloudError as ex: except pillar.NotSubscribedToCloudError as ex:
self._log_subscription_needed(can_renew=ex.can_renew) self._log_subscription_needed(can_renew=ex.can_renew)
self._state = "QUIT" self._state = 'QUIT'
return return
except pillar.UserNotLoggedInError: except pillar.UserNotLoggedInError:
self.log.exception("Error checking/refreshing credentials.") self.log.exception('Error checking/refreshing credentials.')
self.bss_report({"ERROR"}, "Please log in on Blender ID first.") self.bss_report({'ERROR'}, 'Please log in on Blender ID first.')
self._state = "QUIT" self._state = 'QUIT'
return return
# Find the home project. # Find the home project.
try: try:
self.home_project_id = await home_project.get_home_project_id() self.home_project_id = await home_project.get_home_project_id()
except sdk_exceptions.ForbiddenAccess: except sdk_exceptions.ForbiddenAccess:
self.log.exception("Forbidden access to home project.") self.log.exception('Forbidden access to home project.')
self.bss_report({"ERROR"}, "Did not get access to home project.") self.bss_report({'ERROR'}, 'Did not get access to home project.')
self._state = "QUIT" self._state = 'QUIT'
return return
except sdk_exceptions.ResourceNotFound: except sdk_exceptions.ResourceNotFound:
self.bss_report({"ERROR"}, "Home project not found.") self.bss_report({'ERROR'}, 'Home project not found.')
self._state = "QUIT" self._state = 'QUIT'
return return
# Only create the folder structure if we're pushing. # Only create the folder structure if we're pushing.
may_create = self.action == "PUSH" may_create = self.action == 'PUSH'
try: try:
gid, subgid = await find_sync_group_id( gid, subgid = await find_sync_group_id(self.home_project_id,
self.home_project_id, self.user_id,
self.user_id, self.blender_version,
self.blender_version, may_create=may_create)
may_create=may_create,
)
self.sync_group_id = gid self.sync_group_id = gid
self.sync_group_versioned_id = subgid self.sync_group_versioned_id = subgid
self.log.debug("Found top-level group node ID: %s", self.sync_group_id) self.log.debug('Found top-level group node ID: %s', self.sync_group_id)
self.log.debug( self.log.debug('Found group node ID for %s: %s',
"Found group node ID for %s: %s", self.blender_version, self.sync_group_versioned_id)
self.blender_version,
self.sync_group_versioned_id,
)
except sdk_exceptions.ForbiddenAccess: except sdk_exceptions.ForbiddenAccess:
self.log.exception("Unable to find Group ID") self.log.exception('Unable to find Group ID')
self.bss_report({"ERROR"}, "Unable to find sync folder.") self.bss_report({'ERROR'}, 'Unable to find sync folder.')
self._state = "QUIT" self._state = 'QUIT'
return return
# Perform the requested action. # Perform the requested action.
action_method = { action_method = {
"PUSH": self.action_push, 'PUSH': self.action_push,
"PULL": self.action_pull, 'PULL': self.action_pull,
"REFRESH": self.action_refresh, 'REFRESH': self.action_refresh,
}[action] }[action]
await action_method(context) await action_method(context)
except Exception as ex: except Exception as ex:
self.log.exception("Unexpected exception caught.") self.log.exception('Unexpected exception caught.')
self.bss_report({"ERROR"}, "Unexpected error: %s" % ex) self.bss_report({'ERROR'}, 'Unexpected error: %s' % ex)
self._state = "QUIT" self._state = 'QUIT'
async def action_push(self, context): async def action_push(self, context):
"""Sends files to the Pillar server.""" """Sends files to the Pillar server."""
self.log.info("Saved user preferences to disk before pushing to cloud.") self.log.info('Saved user preferences to disk before pushing to cloud.')
bpy.ops.wm.save_userpref() bpy.ops.wm.save_userpref()
config_dir = pathlib.Path(bpy.utils.user_resource("CONFIG")) config_dir = pathlib.Path(bpy.utils.user_resource('CONFIG'))
for fname in SETTINGS_FILES_TO_UPLOAD: for fname in SETTINGS_FILES_TO_UPLOAD:
path = config_dir / fname path = config_dir / fname
if not path.exists(): if not path.exists():
self.log.debug("Skipping non-existing %s", path) self.log.debug('Skipping non-existing %s', path)
continue continue
if self.signalling_future.cancelled(): if self.signalling_future.cancelled():
self.bss_report({"WARNING"}, "Upload aborted.") self.bss_report({'WARNING'}, 'Upload aborted.')
return return
self.bss_report({"INFO"}, "Uploading %s" % fname) self.bss_report({'INFO'}, 'Uploading %s' % fname)
try: try:
await pillar.attach_file_to_group( await pillar.attach_file_to_group(path,
path, self.home_project_id,
self.home_project_id, self.sync_group_versioned_id,
self.sync_group_versioned_id, self.user_id)
self.user_id,
)
except sdk_exceptions.RequestEntityTooLarge as ex: except sdk_exceptions.RequestEntityTooLarge as ex:
self.log.error("File too big to upload: %s" % ex) self.log.error('File too big to upload: %s' % ex)
self.log.error( self.log.error('To upload larger files, please subscribe to Blender Cloud.')
"To upload larger files, please subscribe to Blender Cloud." self.bss_report({'SUBSCRIBE'}, 'File %s too big to upload. '
) 'Subscribe for unlimited space.' % fname)
self.bss_report( self._state = 'QUIT'
{"SUBSCRIBE"},
"File %s too big to upload. "
"Subscribe for unlimited space." % fname,
)
self._state = "QUIT"
return return
await self.action_refresh(context) await self.action_refresh(context)
@ -417,37 +382,31 @@ class PILLAR_OT_sync(
        else:
            bss.version = max(bss.available_blender_versions)

        self.bss_report({"INFO"}, "Settings pushed to Blender Cloud.")

    async def action_pull(self, context):
        """Loads files from the Pillar server."""
        # If the sync group node doesn't exist, offer a list of groups that do.
        if not self.sync_group_id:
            self.bss_report(
                {"ERROR"}, "There are no synced Blender settings in your Blender Cloud."
            )
            return

        if not self.sync_group_versioned_id:
            self.bss_report(
                {"ERROR"},
                "There are no synced Blender settings for version %s"
                % self.blender_version,
            )
            return

        self.bss_report({"INFO"}, "Pulling settings from Blender Cloud")
        with tempfile.TemporaryDirectory(prefix="bcloud-sync") as tempdir:
            for fname in SETTINGS_FILES_TO_UPLOAD:
                await self.download_settings_file(fname, tempdir)

        self.bss_report(
            {"WARNING"}, "Settings pulled from Cloud, restart Blender to load them."
        )

    async def action_refresh(self, context):
        self.bss_report({"INFO"}, "Refreshing available Blender versions.")

        # Clear the LRU cache of available_blender_versions so that we can
        # obtain new versions (if someone synced from somewhere else, for example)
@ -457,89 +416,71 @@ class PILLAR_OT_sync(
        bss = bpy.context.window_manager.blender_sync_status
        bss.available_blender_versions = versions

        if not versions:
            # If there aren't any versions, the status message shows why,
            # and shouldn't be erased.
            return

        # Prevent warnings that the current value of the EnumProperty isn't valid.
        current_version = "%d.%d" % bpy.app.version[:2]
        if current_version in versions:
            bss.version = current_version
        else:
            bss.version = versions[0]

        # There are versions to sync, so we can remove the status message.
        self.bss_report({"INFO"}, "")
    async def download_settings_file(self, fname: str, temp_dir: str):
        config_dir = pathlib.Path(bpy.utils.user_resource("CONFIG"))
        meta_path = cache.cache_directory("home-project", "blender-sync")

        self.bss_report({"INFO"}, "Downloading %s from Cloud" % fname)

        # Get the asset node
        node_props = {
            "project": self.home_project_id,
            "node_type": "asset",
            "parent": self.sync_group_versioned_id,
            "name": fname,
        }
        node = await pillar_call(
            pillarsdk.Node.find_first,
            {"where": node_props, "projection": {"_id": 1, "properties.file": 1}},
            caching=False,
        )
        if node is None:
            self.bss_report({"INFO"}, "Unable to find %s on Blender Cloud" % fname)
            self.log.info("Unable to find node on Blender Cloud for %s", fname)
            return

        async def file_downloaded(
            file_path: str, file_desc: pillarsdk.File, map_type: str
        ):
            # Allow the caller to adjust the file before we move it into place.
            if fname.lower() == "userpref.blend":
                await self.update_userpref_blend(file_path)

            # Move the file next to the final location; as it may be on a
            # different filesystem than the temporary directory, this can
            # fail, and we don't want to destroy the existing file.
            local_temp = config_dir / (fname + "~")
            local_final = config_dir / fname

            # Make a backup copy of the file as it was before pulling.
            if local_final.exists():
                local_bak = config_dir / (fname + "-pre-bcloud-pull")
                self.move_file(local_final, local_bak)

            self.move_file(file_path, local_temp)
            self.move_file(local_temp, local_final)

        file_id = node.properties.file
        await pillar.download_file_by_uuid(
            file_id,
            temp_dir,
            str(meta_path),
            file_loaded_sync=file_downloaded,
            future=self.signalling_future,
        )

    def move_file(self, src, dst):
        self.log.info("Moving %s to %s", src, dst)
        shutil.move(str(src), str(dst))

    async def update_userpref_blend(self, file_path: str):
        self.log.info("Overriding machine-local settings in %s", file_path)

        # Remember some settings that should not be overwritten from the Cloud.
        prefs = blender.ctx_preferences()
        remembered = {}
        for rna_key, python_key in LOCAL_SETTINGS_RNA:
            assert "." in python_key, "Sorry, this code assumes there is a dot in the Python key"

            try:
                value = prefs.path_resolve(python_key)
@ -549,31 +490,28 @@ class PILLAR_OT_sync(
                continue

            # Map enums from strings (in Python) to ints (in DNA).
            dot_index = python_key.rindex(".")
            parent_key, prop_key = python_key[:dot_index], python_key[dot_index + 1:]
            parent = prefs.path_resolve(parent_key)
            prop = parent.bl_rna.properties[prop_key]
            if prop.type == "ENUM":
                log.debug(
                    "Rewriting %s from %r to %r",
                    python_key,
                    value,
                    prop.enum_items[value].value,
                )
                value = prop.enum_items[value].value
            else:
                log.debug("Keeping value of %s: %r", python_key, value)
            remembered[rna_key] = value
        log.debug("Overriding values: %s", remembered)

        # Rewrite the userprefs.blend file to override the options.
        with blendfile.open_blend(file_path, "rb+") as blend:
            prefs = next(block for block in blend.blocks if block.code == b"USER")
            for key, value in remembered.items():
                self.log.debug("prefs[%r] = %r" % (key, prefs[key]))
                self.log.debug(" -> setting prefs[%r] = %r" % (key, value))
                prefs[key] = value
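    # Note: LOCAL_SETTINGS_RNA is defined earlier in this module and is not
    # shown in this diff. Judging from the loop above, it presumably holds
    # (DNA key, RNA path) pairs -- something like (b"dpi", "system.dpi") -- for
    # the machine-local preferences that must survive a settings pull.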

View File

@ -27,12 +27,15 @@ import bgl
import pillarsdk

from .. import async_loop, pillar, cache, blender, utils
from . import (
    menu_item as menu_item_mod,
)  # so that we can have menu items called 'menu_item'
from . import draw, nodes

REQUIRED_ROLES_FOR_TEXTURE_BROWSER = {"subscriber", "demo"}

MOUSE_SCROLL_PIXELS_PER_TICK = 50

TARGET_ITEM_WIDTH = 400
@ -44,16 +47,16 @@ ITEM_PADDING_X = 5
log = logging.getLogger(__name__)


class BlenderCloudBrowser(
    pillar.PillarOperatorMixin, async_loop.AsyncModalOperatorMixin, bpy.types.Operator
):
    bl_idname = "pillar.browser"
    bl_label = "Blender Cloud Texture Browser"

    _draw_handle = None

    current_path = pillar.CloudPath("/")
    project_name = ""

    # This contains a stack of Node objects that lead up to the currently browsed node.
    path_stack = []  # type: typing.List[pillarsdk.Node]
@ -62,12 +65,12 @@ class BlenderCloudBrowser(
    menu_item_stack = []  # type: typing.List[menu_item_mod.MenuItem]

    timer = None
    log = logging.getLogger("%s.BlenderCloudBrowser" % __name__)

    _menu_item_lock = threading.Lock()
    current_display_content = []  # type: typing.List[menu_item_mod.MenuItem]
    loaded_images = set()  # type: typing.Set[str]
    thumbnails_cache = ""
    maximized_area = False

    mouse_x = 0
@ -81,18 +84,16 @@ class BlenderCloudBrowser(
        # Refuse to start if the file hasn't been saved. It's okay if
        # it's dirty, we just need to know where '//' points to.
        if not os.path.exists(context.blend_data.filepath):
            self.report(
                {"ERROR"},
                "Please save your Blend file before using the Blender Cloud addon.",
            )
            return {"CANCELLED"}

        wm = context.window_manager

        self.current_path = pillar.CloudPath(wm.last_blender_cloud_location)
        self.path_stack = []  # list of nodes that make up the current path.

        self.thumbnails_cache = cache.cache_directory("thumbnails")
        self.mouse_x = event.mouse_x
        self.mouse_y = event.mouse_y
@ -104,93 +105,91 @@ class BlenderCloudBrowser(
        # Add the region OpenGL drawing callback
        # draw in view space with 'POST_VIEW' and 'PRE_VIEW'
        self._draw_handle = context.space_data.draw_handler_add(
            self.draw_menu, (context,), "WINDOW", "POST_PIXEL"
        )

        self.current_display_content = []
        self.loaded_images = set()
        self._scroll_reset()

        context.window.cursor_modal_set("DEFAULT")
        return async_loop.AsyncModalOperatorMixin.invoke(self, context, event)

    def modal(self, context, event):
        result = async_loop.AsyncModalOperatorMixin.modal(self, context, event)
        if not {"PASS_THROUGH", "RUNNING_MODAL"}.intersection(result):
            return result

        if event.type == "TAB" and event.value == "RELEASE":
            self.log.info("Ensuring async loop is running")
            async_loop.ensure_async_loop()

        if event.type == "TIMER":
            self._scroll_smooth()
            context.area.tag_redraw()
            return {"RUNNING_MODAL"}

        if "MOUSE" in event.type:
            context.area.tag_redraw()
            self.mouse_x = event.mouse_x
            self.mouse_y = event.mouse_y

        left_mouse_release = event.type == "LEFTMOUSE" and event.value == "RELEASE"

        if left_mouse_release and self._state in {"PLEASE_SUBSCRIBE", "PLEASE_RENEW"}:
            self.open_browser_subscribe(renew=self._state == "PLEASE_RENEW")
            self._finish(context)
            return {"FINISHED"}

        if self._state == "BROWSING":
            selected = self.get_clicked()

            if selected:
                if selected.is_spinning:
                    context.window.cursor_set("WAIT")
                else:
                    context.window.cursor_set("HAND")
            else:
                context.window.cursor_set("DEFAULT")

            # Scrolling
            if event.type == "WHEELUPMOUSE":
                self._scroll_by(MOUSE_SCROLL_PIXELS_PER_TICK)
                context.area.tag_redraw()
            elif event.type == "WHEELDOWNMOUSE":
                self._scroll_by(-MOUSE_SCROLL_PIXELS_PER_TICK)
                context.area.tag_redraw()
            elif event.type == "TRACKPADPAN":
                self._scroll_by(event.mouse_prev_y - event.mouse_y, smooth=False)
                context.area.tag_redraw()

            if left_mouse_release:
                if selected is None:
                    # No item clicked, ignore it.
                    return {"RUNNING_MODAL"}

                if selected.is_spinning:
                    # This can happen when the thumbnail information isn't loaded yet.
                    return {"RUNNING_MODAL"}

                if selected.is_folder:
                    self.descend_node(selected)
                else:
                    self.handle_item_selection(context, selected)

        if event.type in {"RIGHTMOUSE", "ESC"}:
            self._finish(context)
            return {"CANCELLED"}

        return {"RUNNING_MODAL"}

    async def async_execute(self, context):
        self._state = "CHECKING_CREDENTIALS"

        self.log.debug("Checking credentials")
        try:
            db_user = await self.check_credentials(
                context, REQUIRED_ROLES_FOR_TEXTURE_BROWSER
            )
        except pillar.NotSubscribedToCloudError as ex:
            self._log_subscription_needed(can_renew=ex.can_renew, level="INFO")
            self._show_subscribe_screen(can_renew=ex.can_renew)
            return None
@ -203,11 +202,11 @@ class BlenderCloudBrowser(
"""Shows the "You need to subscribe" screen.""" """Shows the "You need to subscribe" screen."""
if can_renew: if can_renew:
self._state = "PLEASE_RENEW" self._state = 'PLEASE_RENEW'
else: else:
self._state = "PLEASE_SUBSCRIBE" self._state = 'PLEASE_SUBSCRIBE'
bpy.context.window.cursor_set("HAND") bpy.context.window.cursor_set('HAND')
def descend_node(self, menu_item: menu_item_mod.MenuItem): def descend_node(self, menu_item: menu_item_mod.MenuItem):
"""Descends the node hierarchy by visiting this menu item's node. """Descends the node hierarchy by visiting this menu item's node.
@ -216,25 +215,25 @@ class BlenderCloudBrowser(
""" """
node = menu_item.node node = menu_item.node
assert isinstance(node, pillarsdk.Node), "Wrong type %s" % node assert isinstance(node, pillarsdk.Node), 'Wrong type %s' % node
if isinstance(node, nodes.UpNode): if isinstance(node, nodes.UpNode):
# Going up. # Going up.
self.log.debug("Going up to %r", self.current_path) self.log.debug('Going up to %r', self.current_path)
self.current_path = self.current_path.parent self.current_path = self.current_path.parent
if self.path_stack: if self.path_stack:
self.path_stack.pop() self.path_stack.pop()
if self.menu_item_stack: if self.menu_item_stack:
self.menu_item_stack.pop() self.menu_item_stack.pop()
if not self.path_stack: if not self.path_stack:
self.project_name = "" self.project_name = ''
else: else:
# Going down, keep track of where we were # Going down, keep track of where we were
if isinstance(node, nodes.ProjectNode): if isinstance(node, nodes.ProjectNode):
self.project_name = node["name"] self.project_name = node['name']
self.current_path /= node["_id"] self.current_path /= node['_id']
self.log.debug("Going down to %r", self.current_path) self.log.debug('Going down to %r', self.current_path)
self.path_stack.append(node) self.path_stack.append(node)
self.menu_item_stack.append(menu_item) self.menu_item_stack.append(menu_item)
@ -247,18 +246,18 @@ class BlenderCloudBrowser(
        return self.path_stack[-1]

    def _finish(self, context):
        self.log.debug("Finishing the modal operator")
        async_loop.AsyncModalOperatorMixin._finish(self, context)
        self.clear_images()

        context.space_data.draw_handler_remove(self._draw_handle, "WINDOW")
        context.window.cursor_modal_restore()

        if self.maximized_area:
            bpy.ops.screen.screen_full_area(use_hide_panels=True)

        context.area.tag_redraw()
        self.log.debug("Modal operator finished")

    def clear_images(self):
        """Removes all images we loaded from Blender's memory."""
@ -287,7 +286,7 @@ class BlenderCloudBrowser(
        return menu_item

    def update_menu_item(self, node, *args):
        node_uuid = node["_id"]

        # Just make this thread-safe to be on the safe side.
        with self._menu_item_lock:
@ -297,7 +296,7 @@ class BlenderCloudBrowser(
                self.loaded_images.add(menu_item.icon.filepath_raw)
                break
            else:
                raise ValueError("Unable to find MenuItem(node_uuid=%r)" % node_uuid)

        self.sort_menu()
@ -311,11 +310,11 @@ class BlenderCloudBrowser(
        self.current_display_content.sort(key=menu_item_mod.MenuItem.sort_key)

    async def async_download_previews(self):
        self._state = "BROWSING"

        thumbnails_directory = self.thumbnails_cache
        self.log.info("Asynchronously downloading previews to %r", thumbnails_directory)
        self.log.info("Current BCloud path is %r", self.current_path)
        self.clear_images()
        self._scroll_reset()
@ -324,41 +323,34 @@ class BlenderCloudBrowser(
        if node_uuid:
            # Query for sub-nodes of this node.
            self.log.debug("Getting subnodes for parent node %r", node_uuid)
            children = await pillar.get_nodes(
                parent_node_uuid=node_uuid, node_type={"group_texture", "group_hdri"}
            )
        elif project_uuid:
            # Query for top-level nodes.
            self.log.debug("Getting subnodes for project node %r", project_uuid)
            children = await pillar.get_nodes(
                project_uuid=project_uuid,
                parent_node_uuid="",
                node_type={"group_texture", "group_hdri"},
            )
        else:
            # Query for projects
            self.log.debug(
                "No node UUID and no project UUID, listing available projects"
            )
            children = await pillar.get_texture_projects()
            for proj_dict in children:
                self.add_menu_item(
                    nodes.ProjectNode(proj_dict), None, "FOLDER", proj_dict["name"]
                )
            return

        # Make sure we can go up again.
        self.add_menu_item(nodes.UpNode(), None, "FOLDER", ".. up ..")

        # Download all child nodes
        self.log.debug("Iterating over child nodes of %r", self.current_path)
        for child in children:
            # print('  - %(_id)s = %(name)s' % child)
            if child["node_type"] not in menu_item_mod.MenuItem.SUPPORTED_NODE_TYPES:
                self.log.debug("Skipping node of type %r", child["node_type"])
                continue
            self.add_menu_item(child, None, "FOLDER", child["name"])

        # There are only sub-nodes at the project level, no texture nodes,
        # so we won't have to bother looking for textures.
@ -368,26 +360,22 @@ class BlenderCloudBrowser(
        directory = os.path.join(thumbnails_directory, project_uuid, node_uuid)
        os.makedirs(directory, exist_ok=True)

        self.log.debug("Fetching texture thumbnails for node %r", node_uuid)

        def thumbnail_loading(node, texture_node):
            self.add_menu_item(node, None, "SPINNER", texture_node["name"])

        def thumbnail_loaded(node, file_desc, thumb_path):
            self.log.debug("Node %s thumbnail loaded", node["_id"])
            self.update_menu_item(node, file_desc, thumb_path)

        await pillar.fetch_texture_thumbs(
            node_uuid,
            "s",
            directory,
            thumbnail_loading=thumbnail_loading,
            thumbnail_loaded=thumbnail_loaded,
            future=self.signalling_future,
        )

    def browse_assets(self):
        self.log.debug("Browsing assets at %r", self.current_path)
        bpy.context.window_manager.last_blender_cloud_location = str(self.current_path)
        self._new_async_task(self.async_download_previews())
@ -395,13 +383,13 @@ class BlenderCloudBrowser(
"""Draws the GUI with OpenGL.""" """Draws the GUI with OpenGL."""
drawers = { drawers = {
"INITIALIZING": self._draw_initializing, 'INITIALIZING': self._draw_initializing,
"CHECKING_CREDENTIALS": self._draw_checking_credentials, 'CHECKING_CREDENTIALS': self._draw_checking_credentials,
"BROWSING": self._draw_browser, 'BROWSING': self._draw_browser,
"DOWNLOADING_TEXTURE": self._draw_downloading, 'DOWNLOADING_TEXTURE': self._draw_downloading,
"EXCEPTION": self._draw_exception, 'EXCEPTION': self._draw_exception,
"PLEASE_SUBSCRIBE": self._draw_subscribe, 'PLEASE_SUBSCRIBE': self._draw_subscribe,
"PLEASE_RENEW": self._draw_renew, 'PLEASE_RENEW': self._draw_renew,
} }
if self._state in drawers: if self._state in drawers:
@ -409,18 +397,15 @@ class BlenderCloudBrowser(
            drawer(context)

        # For debugging: draw the state
        draw.text(
            (5, 5),
            "%s %s" % (self._state, self.project_name),
            rgba=(1.0, 1.0, 1.0, 1.0),
            fsize=12,
        )

    @staticmethod
    def _window_region(context):
        window_regions = [
            region for region in context.area.regions if region.type == "WINDOW"
        ]
        return window_regions[0]

    def _draw_browser(self, context):
@ -428,9 +413,8 @@ class BlenderCloudBrowser(
        from . import draw

        if not self.current_display_content:
            self._draw_text_on_colour(
                context, "Communicating with Blender Cloud", (0.0, 0.0, 0.0, 0.6)
            )
            return

        window_region = self._window_region(context)
@ -449,16 +433,13 @@ class BlenderCloudBrowser(
        block_height = item_height + ITEM_MARGIN_Y

        bgl.glEnable(bgl.GL_BLEND)
        draw.aabox(
            (0, 0), (window_region.width, window_region.height), (0.0, 0.0, 0.0, 0.6)
        )

        bottom_y = float("inf")

        # The -1 / +2 are for extra rows that are drawn only half at the top/bottom.
        first_item_idx = max(
            0, int(-self.scroll_offset // block_height - 1) * col_count
        )
        items_per_page = int(content_height // item_height + 2) * col_count
        last_item_idx = first_item_idx + items_per_page
@ -474,30 +455,32 @@ class BlenderCloudBrowser(
            bottom_y = min(y, bottom_y)
        self.scroll_offset_space_left = window_region.height - bottom_y
        self.scroll_offset_max = (
            self.scroll_offset - self.scroll_offset_space_left + 0.25 * block_height
        )

        bgl.glDisable(bgl.GL_BLEND)

    def _draw_downloading(self, context):
        """OpenGL drawing code for the DOWNLOADING_TEXTURE state."""
        self._draw_text_on_colour(
            context, "Downloading texture from Blender Cloud", (0.0, 0.0, 0.2, 0.6)
        )

    def _draw_checking_credentials(self, context):
        """OpenGL drawing code for the CHECKING_CREDENTIALS state."""
        self._draw_text_on_colour(
            context, "Checking login credentials", (0.0, 0.0, 0.2, 0.6)
        )

    def _draw_initializing(self, context):
        """OpenGL drawing code for the INITIALIZING state."""
        self._draw_text_on_colour(context, "Initializing", (0.0, 0.0, 0.2, 0.6))

    def _draw_text_on_colour(self, context, text: str, bgcolour):
        content_height, content_width = self._window_size(context)
@ -505,9 +488,8 @@ class BlenderCloudBrowser(
        bgl.glEnable(bgl.GL_BLEND)
        draw.aabox((0, 0), (content_width, content_height), bgcolour)
        draw.text(
            (content_width * 0.5, content_height * 0.7), text, fsize=20, align="C"
        )
        bgl.glDisable(bgl.GL_BLEND)
@ -529,10 +511,8 @@ class BlenderCloudBrowser(
        ex = self.async_task.exception()
        if isinstance(ex, pillar.UserNotLoggedInError):
            ex_msg = (
                "You are not logged in on Blender ID. Please log in at User Preferences, "
                "Add-ons, Blender ID Authentication."
            )
        else:
            ex_msg = str(ex)
            if not ex_msg:
@ -545,16 +525,14 @@ class BlenderCloudBrowser(
        bgl.glDisable(bgl.GL_BLEND)

    def _draw_subscribe(self, context):
        self._draw_text_on_colour(
            context, "Click to subscribe to the Blender Cloud", (0.0, 0.0, 0.2, 0.6)
        )

    def _draw_renew(self, context):
        self._draw_text_on_colour(
            context,
            "Click to renew your Blender Cloud subscription",
            (0.0, 0.0, 0.2, 0.6),
        )

    def get_clicked(self) -> typing.Optional[menu_item_mod.MenuItem]:
@ -570,89 +548,78 @@ class BlenderCloudBrowser(
        from pillarsdk.utils import sanitize_filename

        self.clear_images()
        self._state = "DOWNLOADING_TEXTURE"

        node_path_components = (
            node["name"] for node in self.path_stack if node is not None
        )
        local_path_components = [
            sanitize_filename(comp) for comp in node_path_components
        ]

        top_texture_directory = bpy.path.abspath(context.scene.local_texture_dir)
        local_path = os.path.join(top_texture_directory, *local_path_components)
        meta_path = os.path.join(top_texture_directory, ".blender_cloud")

        self.log.info("Downloading texture %r to %s", item.node_uuid, local_path)
        self.log.debug("Metadata will be stored at %s", meta_path)

        file_paths = []
        select_dblock = None
        node = item.node

        def texture_downloading(file_path, *_):
            self.log.info("Texture downloading to %s", file_path)

        def texture_downloaded(file_path, file_desc, map_type):
            nonlocal select_dblock

            self.log.info("Texture downloaded to %r.", file_path)

            if context.scene.local_texture_dir.startswith("//"):
                file_path = bpy.path.relpath(file_path)

            image_dblock = bpy.data.images.load(filepath=file_path)
            image_dblock["bcloud_file_uuid"] = file_desc["_id"]
            image_dblock["bcloud_node_uuid"] = node["_id"]
            image_dblock["bcloud_node_type"] = node["node_type"]
            image_dblock["bcloud_node"] = pillar.node_to_id(node)

            if node["node_type"] == "hdri":
                # All HDRi variations should use the same image datablock, hence one name.
                image_dblock.name = node["name"]
            else:
                # All texture variations are loaded at once, and thus need the map type in the name.
                image_dblock.name = "%s-%s" % (node["name"], map_type)

            # Select the image in the image editor (if the context is right).
            # Just set the first image we download.
            if context.area.type == "IMAGE_EDITOR":
                if select_dblock is None or file_desc.map_type == "color":
                    select_dblock = image_dblock
                    context.space_data.image = select_dblock

            file_paths.append(file_path)

        def texture_download_completed(_):
            self.log.info(
                "Texture download complete, inspect:\n%s", "\n".join(file_paths)
            )
            self._state = "QUIT"

        # For HDRi nodes: only download the first file.
        download_node = pillarsdk.Node.new(node)
        if node["node_type"] == "hdri":
            download_node.properties.files = [download_node.properties.files[0]]

        signalling_future = asyncio.Future()
        self._new_async_task(
            pillar.download_texture(
                download_node,
                local_path,
                metadata_directory=meta_path,
                texture_loading=texture_downloading,
                texture_loaded=texture_downloaded,
                future=signalling_future,
            )
        )
        self.async_task.add_done_callback(texture_download_completed)

    def open_browser_subscribe(self, *, renew: bool):
        import webbrowser

        url = "renew" if renew else "join"
        webbrowser.open_new_tab("https://cloud.blender.org/%s" % url)

        self.report({"INFO"}, "We just started a browser for you.")

    def _scroll_smooth(self):
        diff = self.scroll_offset_target - self.scroll_offset
@ -670,9 +637,9 @@ class BlenderCloudBrowser(
        if smooth and amount < 0 and -amount > self.scroll_offset_space_left / 4:
            amount = -self.scroll_offset_space_left / 4

        # Clamp the target between the maximum (most negative) offset and 0,
        # so we never scroll past the first or last row.
        self.scroll_offset_target = min(
            0, max(self.scroll_offset_max, self.scroll_offset_target + amount)
        )

        if not smooth:
            self._scroll_offset = self.scroll_offset_target
@ -681,44 +648,39 @@ class BlenderCloudBrowser(
        self.scroll_offset_target = self.scroll_offset = 0


class PILLAR_OT_switch_hdri(
    pillar.PillarOperatorMixin, async_loop.AsyncModalOperatorMixin, bpy.types.Operator
):
    bl_idname = "pillar.switch_hdri"
    bl_label = "Switch with another variation"
    bl_description = (
        "Downloads the selected variation of an HDRi, replacing the current image"
    )

    log = logging.getLogger("bpy.ops.%s" % bl_idname)

    image_name: bpy.props.StringProperty(
        name="image_name", description="Name of the image block to replace"
    )

    file_uuid: bpy.props.StringProperty(
        name="file_uuid", description="File ID to download"
    )

    async def async_execute(self, context):
        """Entry point of the asynchronous operator."""
        self.report({"INFO"}, "Communicating with Blender Cloud")

        try:
            try:
                db_user = await self.check_credentials(
                    context, REQUIRED_ROLES_FOR_TEXTURE_BROWSER
                )
                user_id = db_user["_id"]
            except pillar.NotSubscribedToCloudError as ex:
                self._log_subscription_needed(can_renew=ex.can_renew)
                self._state = "QUIT"
                return
            except pillar.UserNotLoggedInError:
                self.log.exception("Error checking/refreshing credentials.")
                self.report({"ERROR"}, "Please log in on Blender ID first.")
                self._state = "QUIT"
                return

            if not user_id:
@ -726,67 +688,57 @@ class PILLAR_OT_switch_hdri(
            await self.download_and_replace(context)
        except Exception as ex:
            self.log.exception("Unexpected exception caught.")
            self.report({"ERROR"}, "Unexpected error %s: %s" % (type(ex), ex))

        self._state = "QUIT"

    async def download_and_replace(self, context):
        self._state = "DOWNLOADING_TEXTURE"

        current_image = bpy.data.images[self.image_name]
        node = current_image["bcloud_node"]
        filename = "%s.taken_from_file" % pillar.sanitize_filename(node["name"])

        local_path = os.path.dirname(bpy.path.abspath(current_image.filepath))
        top_texture_directory = bpy.path.abspath(context.scene.local_texture_dir)
        meta_path = os.path.join(top_texture_directory, ".blender_cloud")
        file_uuid = self.file_uuid

        resolution = next(
            file_ref["resolution"]
            for file_ref in node["properties"]["files"]
            if file_ref["file"] == file_uuid
        )

        my_log = self.log
        my_log.info("Downloading file %r-%s to %s", file_uuid, resolution, local_path)
        my_log.debug("Metadata will be stored at %s", meta_path)

        def file_loading(file_path, file_desc, map_type):
            my_log.info(
                "Texture downloading to %s (%s)",
                file_path,
                utils.sizeof_fmt(file_desc["length"]),
            )

        async def file_loaded(file_path, file_desc, map_type):
            if context.scene.local_texture_dir.startswith("//"):
                file_path = bpy.path.relpath(file_path)

            my_log.info("Texture downloaded to %s", file_path)
            current_image["bcloud_file_uuid"] = file_uuid
            # This automatically reloads the image from disk.
            current_image.filepath = file_path

            # This forces users of the image to update.
            for datablocks in bpy.data.user_map({current_image}).values():
                for datablock in datablocks:
                    datablock.update_tag()

        await pillar.download_file_by_uuid(
            file_uuid,
            local_path,
            meta_path,
            filename=filename,
            map_type=resolution,
            file_loading=file_loading,
            file_loaded_sync=file_loaded,
            future=self.signalling_future,
        )

        self.report({"INFO"}, "Image download complete")
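    # Rough usage sketch (values below are illustrative; the operator and its
    # properties are defined above): from a script, switching an HDRi variation
    # comes down to something like
    #   bpy.ops.pillar.switch_hdri("INVOKE_DEFAULT",
    #                              image_name="my_hdri", file_uuid="<file ID>")
    # which is what the HDRi panel further down wires up via its operator properties.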
# store keymaps here to access after registration
@ -794,11 +746,9 @@ addon_keymaps = []
def image_editor_menu(self, context):
    self.layout.operator(
        BlenderCloudBrowser.bl_idname,
        text="Get image from Blender Cloud",
        icon_value=blender.icon("CLOUD"),
    )


def hdri_download_panel__image_editor(self, context):
@ -806,7 +756,7 @@ def hdri_download_panel__image_editor(self, context):
def hdri_download_panel__node_editor(self, context):
    if context.active_node.type not in {"TEX_ENVIRONMENT", "TEX_IMAGE"}:
        return

    _hdri_download_panel(self, context.active_node.image)
@ -815,27 +765,25 @@ def hdri_download_panel__node_editor(self, context):
def _hdri_download_panel(self, current_image):
    if not current_image:
        return
    if "bcloud_node_type" not in current_image:
        return
    if current_image["bcloud_node_type"] != "hdri":
        return
    try:
        current_variation = current_image["bcloud_file_uuid"]
    except KeyError:
        log.warning(
            "Image %r has a bcloud_node_type but no bcloud_file_uuid property.",
            current_image.name,
        )
        return

    row = self.layout.row(align=True).split(factor=0.3)
    row.label(text="HDRi", icon_value=blender.icon("CLOUD"))
    row.prop(current_image, "hdri_variation", text="")

    if current_image.hdri_variation != current_variation:
        props = row.operator(
            PILLAR_OT_switch_hdri.bl_idname, text="Replace", icon="FILE_REFRESH"
        )
        props.image_name = current_image.name
        props.file_uuid = current_image.hdri_variation
@ -846,21 +794,21 @@ variation_label_storage = {}
def hdri_variation_choices(self, context):
    if context.area.type == "IMAGE_EDITOR":
        image = context.edit_image
    elif context.area.type == "NODE_EDITOR":
        image = context.active_node.image
    else:
        return []

    if "bcloud_node" not in image:
        return []

    choices = []
    for file_doc in image["bcloud_node"]["properties"]["files"]:
        label = file_doc["resolution"]
        variation_label_storage[label] = label
        choices.append((file_doc["file"], label, ""))

    return choices
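# Note on variation_label_storage (declared just above this hunk): it appears to
# exist to keep the label strings referenced on the Python side, since Blender
# does not keep its own copy of strings returned by dynamic EnumProperty item
# callbacks such as hdri_variation_choices().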
@ -875,22 +823,20 @@ def register():
# HDRi resolution switcher/chooser. # HDRi resolution switcher/chooser.
# TODO: when an image is selected, switch this property to its current resolution. # TODO: when an image is selected, switch this property to its current resolution.
bpy.types.Image.hdri_variation = bpy.props.EnumProperty( bpy.types.Image.hdri_variation = bpy.props.EnumProperty(
name="HDRi variations", name='HDRi variations',
items=hdri_variation_choices, items=hdri_variation_choices,
description="Select a variation with which to replace this image", description='Select a variation with which to replace this image'
) )
# handle the keymap # handle the keymap
wm = bpy.context.window_manager wm = bpy.context.window_manager
kc = wm.keyconfigs.addon kc = wm.keyconfigs.addon
if not kc: if not kc:
print("No addon key configuration space found, so no custom hotkeys added.") print('No addon key configuration space found, so no custom hotkeys added.')
return return
km = kc.keymaps.new(name="Screen") km = kc.keymaps.new(name='Screen')
-    kmi = km.keymap_items.new(
-        "pillar.browser", "A", "PRESS", ctrl=True, shift=True, alt=True
-    )
+    kmi = km.keymap_items.new('pillar.browser', 'A', 'PRESS', ctrl=True, shift=True, alt=True)
addon_keymaps.append((km, kmi)) addon_keymaps.append((km, kmi))
@ -900,7 +846,7 @@ def unregister():
km.keymap_items.remove(kmi) km.keymap_items.remove(kmi)
addon_keymaps.clear() addon_keymaps.clear()
if hasattr(bpy.types.Image, "hdri_variation"): if hasattr(bpy.types.Image, 'hdri_variation'):
del bpy.types.Image.hdri_variation del bpy.types.Image.hdri_variation
bpy.types.IMAGE_MT_image.remove(image_editor_menu) bpy.types.IMAGE_MT_image.remove(image_editor_menu)
View File
@ -15,21 +15,18 @@ if bpy.app.background:
shader = None shader = None
texture_shader = None texture_shader = None
else: else:
shader = gpu.shader.from_builtin("2D_UNIFORM_COLOR") shader = gpu.shader.from_builtin('2D_UNIFORM_COLOR')
texture_shader = gpu.shader.from_builtin("2D_IMAGE") texture_shader = gpu.shader.from_builtin('2D_IMAGE')
Float2 = typing.Tuple[float, float] Float2 = typing.Tuple[float, float]
Float3 = typing.Tuple[float, float, float] Float3 = typing.Tuple[float, float, float]
Float4 = typing.Tuple[float, float, float, float] Float4 = typing.Tuple[float, float, float, float]
-def text(
-    pos2d: Float2,
-    display_text: typing.Union[str, typing.List[str]],
-    rgba: Float4 = (1.0, 1.0, 1.0, 1.0),
-    fsize=12,
-    align="L",
-):
+def text(pos2d: Float2, display_text: typing.Union[str, typing.List[str]],
+         rgba: Float4 = (1.0, 1.0, 1.0, 1.0),
+         fsize=12,
+         align='L'):
"""Draw text with the top-left corner at 'pos2d'.""" """Draw text with the top-left corner at 'pos2d'."""
dpi = bpy.context.preferences.system.dpi dpi = bpy.context.preferences.system.dpi
@ -52,9 +49,9 @@ def text(
for idx, line in enumerate(mylines): for idx, line in enumerate(mylines):
text_width, text_height = blf.dimensions(font_id, line) text_width, text_height = blf.dimensions(font_id, line)
if align == "C": if align == 'C':
newx = x_pos - text_width / 2 newx = x_pos - text_width / 2
elif align == "R": elif align == 'R':
newx = x_pos - text_width - gap newx = x_pos - text_width - gap
else: else:
newx = x_pos newx = x_pos
@ -82,7 +79,7 @@ def aabox(v1: Float2, v2: Float2, rgba: Float4):
shader.bind() shader.bind()
shader.uniform_float("color", rgba) shader.uniform_float("color", rgba)
batch = batch_for_shader(shader, "TRI_FAN", {"pos": coords}) batch = batch_for_shader(shader, 'TRI_FAN', {"pos": coords})
batch.draw(shader) batch.draw(shader)
@ -97,14 +94,10 @@ def aabox_with_texture(v1: Float2, v2: Float2):
texture_shader.bind() texture_shader.bind()
texture_shader.uniform_int("image", 0) texture_shader.uniform_int("image", 0)
-    batch = batch_for_shader(
-        texture_shader,
-        "TRI_FAN",
-        {
-            "pos": coords,
-            "texCoord": ((0, 0), (0, 1), (1, 1), (1, 0)),
-        },
-    )
+    batch = batch_for_shader(texture_shader, 'TRI_FAN', {
+        "pos": coords,
+        "texCoord": ((0, 0), (0, 1), (1, 1), (1, 0)),
+    })
batch.draw(texture_shader) batch.draw(texture_shader)
@ -112,8 +105,3 @@ def bind_texture(texture: bpy.types.Image):
"""Bind a Blender image to a GL texture slot.""" """Bind a Blender image to a GL texture slot."""
bgl.glActiveTexture(bgl.GL_TEXTURE0) bgl.glActiveTexture(bgl.GL_TEXTURE0)
bgl.glBindTexture(bgl.GL_TEXTURE_2D, texture.bindcode) bgl.glBindTexture(bgl.GL_TEXTURE_2D, texture.bindcode)
def load_texture(texture: bpy.types.Image) -> int:
"""Load the texture, return OpenGL error code."""
return texture.gl_load()
View File
@ -0,0 +1,90 @@
"""OpenGL drawing code for the texture browser.
Requires Blender 2.79 or older.
"""
import typing
import bgl
import blf
import bpy
Float2 = typing.Tuple[float, float]
Float3 = typing.Tuple[float, float, float]
Float4 = typing.Tuple[float, float, float, float]
def text(pos2d: Float2, display_text: typing.Union[str, typing.List[str]],
rgba: Float4 = (1.0, 1.0, 1.0, 1.0),
fsize=12,
align='L'):
"""Draw text with the top-left corner at 'pos2d'."""
dpi = bpy.context.user_preferences.system.dpi
gap = 12
x_pos, y_pos = pos2d
font_id = 0
blf.size(font_id, fsize, dpi)
# Compute the height of one line.
mwidth, mheight = blf.dimensions(font_id, "Tp") # Use high and low letters.
mheight *= 1.5
# Split text into lines.
if isinstance(display_text, str):
mylines = display_text.split("\n")
else:
mylines = display_text
maxwidth = 0
maxheight = len(mylines) * mheight
for idx, line in enumerate(mylines):
text_width, text_height = blf.dimensions(font_id, line)
if align == 'C':
newx = x_pos - text_width / 2
elif align == 'R':
newx = x_pos - text_width - gap
else:
newx = x_pos
# Draw
blf.position(font_id, newx, y_pos - mheight * idx, 0)
bgl.glColor4f(*rgba)
blf.draw(font_id, " " + line)
# saves max width
if maxwidth < text_width:
maxwidth = text_width
return maxwidth, maxheight
def aabox(v1: Float2, v2: Float2, rgba: Float4):
"""Draw an axis-aligned box."""
bgl.glColor4f(*rgba)
bgl.glRectf(*v1, *v2)
def aabox_with_texture(v1: Float2, v2: Float2):
"""Draw an axis-aligned box with a texture."""
bgl.glColor4f(1.0, 1.0, 1.0, 1.0)
bgl.glEnable(bgl.GL_TEXTURE_2D)
bgl.glBegin(bgl.GL_QUADS)
bgl.glTexCoord2d(0, 0)
bgl.glVertex2d(v1[0], v1[1])
bgl.glTexCoord2d(0, 1)
bgl.glVertex2d(v1[0], v2[1])
bgl.glTexCoord2d(1, 1)
bgl.glVertex2d(v2[0], v2[1])
bgl.glTexCoord2d(1, 0)
bgl.glVertex2d(v2[0], v1[1])
bgl.glEnd()
bgl.glDisable(bgl.GL_TEXTURE_2D)
def bind_texture(texture: bpy.types.Image):
"""Bind a Blender image to a GL texture slot."""
bgl.glBindTexture(bgl.GL_TEXTURE_2D, texture.bindcode[0])
View File
@ -30,39 +30,31 @@ class MenuItem:
text_size_small = 10 text_size_small = 10
DEFAULT_ICONS = { DEFAULT_ICONS = {
"FOLDER": os.path.join(library_icons_path, "folder.png"), 'FOLDER': os.path.join(library_icons_path, 'folder.png'),
"SPINNER": os.path.join(library_icons_path, "spinner.png"), 'SPINNER': os.path.join(library_icons_path, 'spinner.png'),
"ERROR": os.path.join(library_icons_path, "error.png"), 'ERROR': os.path.join(library_icons_path, 'error.png'),
} }
-    FOLDER_NODE_TYPES = {
-        "group_texture",
-        "group_hdri",
-        nodes.UpNode.NODE_TYPE,
-        nodes.ProjectNode.NODE_TYPE,
-    }
-    SUPPORTED_NODE_TYPES = {"texture", "hdri"}.union(FOLDER_NODE_TYPES)
+    FOLDER_NODE_TYPES = {'group_texture', 'group_hdri',
+                         nodes.UpNode.NODE_TYPE, nodes.ProjectNode.NODE_TYPE}
+    SUPPORTED_NODE_TYPES = {'texture', 'hdri'}.union(FOLDER_NODE_TYPES)
def __init__(self, node, file_desc, thumb_path: str, label_text): def __init__(self, node, file_desc, thumb_path: str, label_text):
self.log = logging.getLogger("%s.MenuItem" % __name__) self.log = logging.getLogger('%s.MenuItem' % __name__)
if node["node_type"] not in self.SUPPORTED_NODE_TYPES: if node['node_type'] not in self.SUPPORTED_NODE_TYPES:
self.log.info("Invalid node type in node: %s", node) self.log.info('Invalid node type in node: %s', node)
-            raise TypeError(
-                "Node of type %r not supported; supported are %r."
-                % (node["node_type"], self.SUPPORTED_NODE_TYPES)
-            )
+            raise TypeError('Node of type %r not supported; supported are %r.' % (
+                node['node_type'], self.SUPPORTED_NODE_TYPES))

-        assert isinstance(node, pillarsdk.Node), "wrong type for node: %r" % type(node)
-        assert isinstance(node["_id"], str), 'wrong type for node["_id"]: %r' % type(
-            node["_id"]
-        )
+        assert isinstance(node, pillarsdk.Node), 'wrong type for node: %r' % type(node)
+        assert isinstance(node['_id'], str), 'wrong type for node["_id"]: %r' % type(node['_id'])
self.node = node # pillarsdk.Node, contains 'node_type' key to indicate type self.node = node # pillarsdk.Node, contains 'node_type' key to indicate type
self.file_desc = file_desc # pillarsdk.File object, or None if a 'folder' node. self.file_desc = file_desc # pillarsdk.File object, or None if a 'folder' node.
self.label_text = label_text self.label_text = label_text
self.small_text = self._small_text_from_node() self.small_text = self._small_text_from_node()
self._thumb_path = "" self._thumb_path = ''
self.icon = None self.icon = None
self._is_folder = node["node_type"] in self.FOLDER_NODE_TYPES self._is_folder = node['node_type'] in self.FOLDER_NODE_TYPES
self._is_spinning = False self._is_spinning = False
# Determine sorting order. # Determine sorting order.
@ -83,21 +75,21 @@ class MenuItem:
"""Return the components of the texture (i.e. which map types are available).""" """Return the components of the texture (i.e. which map types are available)."""
if not self.node: if not self.node:
return "" return ''
try: try:
node_files = self.node.properties.files node_files = self.node.properties.files
except AttributeError: except AttributeError:
# Happens for nodes that don't have .properties.files. # Happens for nodes that don't have .properties.files.
return "" return ''
if not node_files: if not node_files:
return "" return ''
map_types = {f.map_type for f in node_files if f.map_type} map_types = {f.map_type for f in node_files if f.map_type}
map_types.discard("color") # all textures have colour map_types.discard('color') # all textures have colour
if not map_types: if not map_types:
return "" return ''
return ", ".join(sorted(map_types)) return ', '.join(sorted(map_types))
def sort_key(self): def sort_key(self):
"""Key for sorting lists of MenuItems.""" """Key for sorting lists of MenuItems."""
@ -109,7 +101,7 @@ class MenuItem:
@thumb_path.setter @thumb_path.setter
def thumb_path(self, new_thumb_path: str): def thumb_path(self, new_thumb_path: str):
self._is_spinning = new_thumb_path == "SPINNER" self._is_spinning = new_thumb_path == 'SPINNER'
self._thumb_path = self.DEFAULT_ICONS.get(new_thumb_path, new_thumb_path) self._thumb_path = self.DEFAULT_ICONS.get(new_thumb_path, new_thumb_path)
if self._thumb_path: if self._thumb_path:
@ -119,22 +111,20 @@ class MenuItem:
@property @property
def node_uuid(self) -> str: def node_uuid(self) -> str:
return self.node["_id"] return self.node['_id']
def represents(self, node) -> bool: def represents(self, node) -> bool:
"""Returns True iff this MenuItem represents the given node.""" """Returns True iff this MenuItem represents the given node."""
node_uuid = node["_id"] node_uuid = node['_id']
return self.node_uuid == node_uuid return self.node_uuid == node_uuid
def update(self, node, file_desc, thumb_path: str, label_text=None): def update(self, node, file_desc, thumb_path: str, label_text=None):
# We can get updated information about our Node, but a MenuItem should # We can get updated information about our Node, but a MenuItem should
# always represent one node, and it shouldn't be shared between nodes. # always represent one node, and it shouldn't be shared between nodes.
if self.node_uuid != node["_id"]: if self.node_uuid != node['_id']:
-            raise ValueError(
-                "Don't change the node ID this MenuItem reflects, "
-                "just create a new one."
-            )
+            raise ValueError("Don't change the node ID this MenuItem reflects, "
+                             "just create a new one.")
self.node = node self.node = node
self.file_desc = file_desc # pillarsdk.File object, or None if a 'folder' node. self.file_desc = file_desc # pillarsdk.File object, or None if a 'folder' node.
self.thumb_path = thumb_path self.thumb_path = thumb_path
@ -142,8 +132,8 @@ class MenuItem:
if label_text is not None: if label_text is not None:
self.label_text = label_text self.label_text = label_text
if thumb_path == "ERROR": if thumb_path == 'ERROR':
self.small_text = "This open is broken" self.small_text = 'This open is broken'
else: else:
self.small_text = self._small_text_from_node() self.small_text = self._small_text_from_node()
@ -174,8 +164,8 @@ class MenuItem:
texture = self.icon texture = self.icon
if texture: if texture:
-            err = draw.load_texture(texture)
+            err = texture.gl_load(filter=bgl.GL_NEAREST, mag=bgl.GL_NEAREST)
assert not err, "OpenGL error: %i" % err assert not err, 'OpenGL error: %i' % err
# ------ TEXTURE ---------# # ------ TEXTURE ---------#
if texture: if texture:
@ -195,15 +185,8 @@ class MenuItem:
text_x = self.x + self.icon_margin_x + ICON_WIDTH + self.text_margin_x text_x = self.x + self.icon_margin_x + ICON_WIDTH + self.text_margin_x
text_y = self.y + ICON_HEIGHT * 0.5 - 0.25 * self.text_size text_y = self.y + ICON_HEIGHT * 0.5 - 0.25 * self.text_size
draw.text((text_x, text_y), self.label_text, fsize=self.text_size) draw.text((text_x, text_y), self.label_text, fsize=self.text_size)
-        draw.text(
-            (text_x, self.y + 0.5 * self.text_size_small),
-            self.small_text,
-            fsize=self.text_size_small,
-            rgba=(1.0, 1.0, 1.0, 0.5),
-        )
+        draw.text((text_x, self.y + 0.5 * self.text_size_small), self.small_text,
+                  fsize=self.text_size_small, rgba=(1.0, 1.0, 1.0, 0.5))
def hits(self, mouse_x: int, mouse_y: int) -> bool: def hits(self, mouse_x: int, mouse_y: int) -> bool:
-        return (
-            self.x < mouse_x < self.x + self.width
-            and self.y < mouse_y < self.y + self.height
-        )
+        return self.x < mouse_x < self.x + self.width and self.y < mouse_y < self.y + self.height
View File
@ -2,27 +2,25 @@ import pillarsdk
class SpecialFolderNode(pillarsdk.Node): class SpecialFolderNode(pillarsdk.Node):
NODE_TYPE = "SPECIAL" NODE_TYPE = 'SPECIAL'
class UpNode(SpecialFolderNode): class UpNode(SpecialFolderNode):
NODE_TYPE = "UP" NODE_TYPE = 'UP'
def __init__(self): def __init__(self):
super().__init__() super().__init__()
self["_id"] = "UP" self['_id'] = 'UP'
self["node_type"] = self.NODE_TYPE self['node_type'] = self.NODE_TYPE
class ProjectNode(SpecialFolderNode): class ProjectNode(SpecialFolderNode):
NODE_TYPE = "PROJECT" NODE_TYPE = 'PROJECT'
def __init__(self, project): def __init__(self, project):
super().__init__() super().__init__()
-        assert isinstance(
-            project, pillarsdk.Project
-        ), "wrong type for project: %r" % type(project)
+        assert isinstance(project, pillarsdk.Project), 'wrong type for project: %r' % type(project)
self.merge(project.to_dict()) self.merge(project.to_dict())
self["node_type"] = self.NODE_TYPE self['node_type'] = self.NODE_TYPE
View File
@ -19,24 +19,23 @@
import json import json
import pathlib import pathlib
import typing import typing
from typing import Any, Dict, Optional, Tuple
def sizeof_fmt(num: int, suffix="B") -> str: def sizeof_fmt(num: int, suffix='B') -> str:
"""Returns a human-readable size. """Returns a human-readable size.
Source: http://stackoverflow.com/a/1094933/875379 Source: http://stackoverflow.com/a/1094933/875379
""" """
for unit in ["", "Ki", "Mi", "Gi", "Ti", "Pi", "Ei", "Zi"]: for unit in ['', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi']:
if abs(num) < 1024: if abs(num) < 1024:
return "%.1f %s%s" % (num, unit, suffix) return '%.1f %s%s' % (num, unit, suffix)
num //= 1024 num //= 1024
return "%.1f Yi%s" % (num, suffix) return '%.1f Yi%s' % (num, suffix)
def find_in_path(path: pathlib.Path, filename: str) -> Optional[pathlib.Path]: def find_in_path(path: pathlib.Path, filename: str) -> typing.Optional[pathlib.Path]:
"""Performs a breadth-first search for the filename. """Performs a breadth-first search for the filename.
Returns the path that contains the file, or None if not found. Returns the path that contains the file, or None if not found.
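A quick usage sketch for the `sizeof_fmt()` helper shown in this hunk; the import path is assumed from the add-on's layout, and the values are examples:

```python
from blender_cloud.utils import sizeof_fmt  # assumed module path

print(sizeof_fmt(512))          # -> "512.0 B"
print(sizeof_fmt(4096))         # -> "4.0 KiB"
print(sizeof_fmt(3 * 1024**3))  # -> "3.0 GiB"
```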
@ -67,30 +66,37 @@ def find_in_path(path: pathlib.Path, filename: str) -> Optional[pathlib.Path]:
return None return None
-# Mapping from (module name, function name) to the last value returned by that function.
-_pyside_cache: Dict[Tuple[str, str], Any] = {}


-def pyside_cache(wrapped):
+def pyside_cache(propname):
    """Decorator, stores the result of the decorated callable in Python-managed memory.

    This is to work around the warning at
    https://www.blender.org/api/blender_python_api_master/bpy.props.html#bpy.props.EnumProperty
    """

+    if callable(propname):
+        raise TypeError('Usage: pyside_cache("property_name")')

-    import functools
-
-    @functools.wraps(wrapped)
-    # We can't use (*args, **kwargs), because EnumProperty explicitly checks
-    # for the number of fixed positional arguments.
-    def decorator(self, context):
-        result = None
-        try:
-            result = wrapped(self, context)
-            return result
-        finally:
-            _pyside_cache[wrapped.__module__, wrapped.__name__] = result
+    def decorator(wrapped):
+        """Stores the result of the callable in Python-managed memory.
+
+        This is to work around the warning at
+        https://www.blender.org/api/blender_python_api_master/bpy.props.html#bpy.props.EnumProperty
+        """
+
+        import functools
+
+        @functools.wraps(wrapped)
+        # We can't use (*args, **kwargs), because EnumProperty explicitly checks
+        # for the number of fixed positional arguments.
+        def wrapper(self, context):
+            result = None
+            try:
+                result = wrapped(self, context)
+                return result
+            finally:
+                rna_type, rna_info = getattr(self.bl_rna, propname)
+                rna_info['_cached_result'] = result
+        return wrapper
    return decorator
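Both variants of `pyside_cache` above keep a reference to the items returned from a dynamic EnumProperty callback, working around the warning linked in their docstrings. A rough sketch of how the version-1.12.0 form is applied, mirroring the `hdri_variation` property earlier in this diff; the property and callback names are invented, and the import path is assumed:

```python
import bpy
from blender_cloud.utils import pyside_cache  # assumed module path

@pyside_cache("my_variation")  # version-1.12.0 form: pass the property name
def my_variation_choices(self, context):
    # Items are (identifier, label, description) tuples.
    return [("a", "Variation A", ""), ("b", "Variation B", "")]

# The master form is a plain decorator instead:
#     @pyside_cache
#     def my_variation_choices(self, context): ...

bpy.types.Image.my_variation = bpy.props.EnumProperty(
    name="My variation",
    items=my_variation_choices,
    description="Example of a cached enum",
)
```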
@ -104,6 +110,6 @@ class JSONEncoder(json.JSONEncoder):
"""JSON encoder with support for some Blender types.""" """JSON encoder with support for some Blender types."""
def default(self, o): def default(self, o):
if o.__class__.__name__ == "IDPropertyGroup" and hasattr(o, "to_dict"): if o.__class__.__name__ == 'IDPropertyGroup' and hasattr(o, 'to_dict'):
return o.to_dict() return o.to_dict()
return super().default(o) return super().default(o)
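The encoder above exists so that Blender ID properties (IDPropertyGroup values stored on datablocks) can be serialised; plain `json.dumps()` refuses them. A usage sketch, with the import path assumed and example data:

```python
import json
from blender_cloud.utils import JSONEncoder  # assumed module path

settings = {"manager": "592edd609837732a2a272c62", "chunk_size": 32}  # example data only
print(json.dumps(settings, cls=JSONEncoder, indent=4))
```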
View File
@ -37,43 +37,31 @@ def load_wheel(module_name, fname_prefix):
try: try:
module = __import__(module_name) module = __import__(module_name)
except ImportError as ex: except ImportError as ex:
log.debug("Unable to import %s directly, will try wheel: %s", module_name, ex) log.debug('Unable to import %s directly, will try wheel: %s',
module_name, ex)
else: else:
log.debug( log.debug('Was able to load %s from %s, no need to load wheel %s',
"Was able to load %s from %s, no need to load wheel %s", module_name, module.__file__, fname_prefix)
module_name,
module.__file__,
fname_prefix,
)
return return
sys.path.append(wheel_filename(fname_prefix)) sys.path.append(wheel_filename(fname_prefix))
module = __import__(module_name) module = __import__(module_name)
log.debug("Loaded %s from %s", module_name, module.__file__) log.debug('Loaded %s from %s', module_name, module.__file__)
def wheel_filename(fname_prefix: str) -> str: def wheel_filename(fname_prefix: str) -> str:
path_pattern = os.path.join(my_dir, "%s*.whl" % fname_prefix) path_pattern = os.path.join(my_dir, '%s*.whl' % fname_prefix)
wheels = glob.glob(path_pattern) wheels = glob.glob(path_pattern)
if not wheels: if not wheels:
raise RuntimeError("Unable to find wheel at %r" % path_pattern) raise RuntimeError('Unable to find wheel at %r' % path_pattern)
-    # If there are multiple wheels that match, load the last-modified one.
-    # Alphabetical sorting isn't going to cut it since BAT 1.10 was released.
-    def modtime(filename: str) -> int:
-        return os.stat(filename).st_mtime
-
-    wheels.sort(key=modtime)
+    # If there are multiple wheels that match, load the latest one.
+    wheels.sort()
    return wheels[-1]
def load_wheels(): def load_wheels():
load_wheel("blender_asset_tracer", "blender_asset_tracer") load_wheel('blender_asset_tracer', 'blender_asset_tracer')
load_wheel("lockfile", "lockfile") load_wheel('lockfile', 'lockfile')
load_wheel("cachecontrol", "CacheControl") load_wheel('cachecontrol', 'CacheControl')
load_wheel("pillarsdk", "pillarsdk") load_wheel('pillarsdk', 'pillarsdk')
if __name__ == "__main__":
wheel = wheel_filename("blender_asset_tracer")
print(f"Wheel: {wheel}")
View File
@ -1,9 +1,9 @@
# Primary requirements: # Primary requirements:
-e git+https://github.com/sybrenstuvel/cachecontrol.git@sybren-filecache-delete-crash-fix#egg=CacheControl -e git+https://github.com/sybrenstuvel/cachecontrol.git@sybren-filecache-delete-crash-fix#egg=CacheControl
lockfile==0.12.2 lockfile==0.12.2
-pillarsdk==1.8.0
+pillarsdk==1.7.0
wheel==0.29.0
-blender-asset-tracer==1.11
+blender-asset-tracer==1.1
# Secondary requirements: # Secondary requirements:
asn1crypto==0.24.0 asn1crypto==0.24.0
setup.py
View File
@ -32,14 +32,12 @@ from distutils.command.install import install, INSTALL_SCHEMES
from distutils.command.install_egg_info import install_egg_info from distutils.command.install_egg_info import install_egg_info
from setuptools import setup, find_packages from setuptools import setup, find_packages
requirement_re = re.compile("[><=]+") requirement_re = re.compile('[><=]+')
sys.dont_write_bytecode = True sys.dont_write_bytecode = True
# Download wheels from pypi. The specific versions are taken from requirements.txt # Download wheels from pypi. The specific versions are taken from requirements.txt
wheels = [
-    "lockfile",
-    "pillarsdk",
-    "blender-asset-tracer",
+    'lockfile', 'pillarsdk', 'blender-asset-tracer',
]
@ -57,9 +55,9 @@ class BuildWheels(Command):
description = "builds/downloads the dependencies as wheel files" description = "builds/downloads the dependencies as wheel files"
user_options = [ user_options = [
("wheels-path=", None, "wheel file installation path"), ('wheels-path=', None, "wheel file installation path"),
("deps-path=", None, "path in which dependencies are built"), ('deps-path=', None, "path in which dependencies are built"),
("cachecontrol-path=", None, "subdir of deps-path containing CacheControl"), ('cachecontrol-path=', None, "subdir of deps-path containing CacheControl"),
] ]
def initialize_options(self): def initialize_options(self):
@ -72,23 +70,22 @@ class BuildWheels(Command):
self.my_path = pathlib.Path(__file__).resolve().parent self.my_path = pathlib.Path(__file__).resolve().parent
package_path = self.my_path / self.distribution.get_name() package_path = self.my_path / self.distribution.get_name()
self.wheels_path = set_default_path(self.wheels_path, package_path / "wheels") self.wheels_path = set_default_path(self.wheels_path, package_path / 'wheels')
self.deps_path = set_default_path(self.deps_path, self.my_path / "build/deps") self.deps_path = set_default_path(self.deps_path, self.my_path / 'build/deps')
-        self.cachecontrol_path = set_default_path(
-            self.cachecontrol_path, self.deps_path / "cachecontrol"
-        )
-        self.bat_path = self.deps_path / "bat"
+        self.cachecontrol_path = set_default_path(self.cachecontrol_path,
+                                                  self.deps_path / 'cachecontrol')
+        self.bat_path = self.deps_path / 'bat'
def run(self): def run(self):
log.info("Storing wheels in %s", self.wheels_path) log.info('Storing wheels in %s', self.wheels_path)
# Parse the requirements.txt file # Parse the requirements.txt file
requirements = {} requirements = {}
with open(str(self.my_path / "requirements.txt")) as reqfile: with open(str(self.my_path / 'requirements.txt')) as reqfile:
for line in reqfile.readlines(): for line in reqfile.readlines():
line = line.strip() line = line.strip()
if not line or line.startswith("#"): if not line or line.startswith('#'):
# comments are lines that start with # only # comments are lines that start with # only
continue continue
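The hunk above only shows the start of the requirements.txt parsing; here is a rough, self-contained sketch of what that loop appears to build, based on how `requirements[package]` and `requirement[0]` are used later in this file (the exact tuple layout is a guess):

```python
import re

requirement_re = re.compile("[><=]+")  # same pattern as the module-level regex above

requirements = {}
for line in ["# Primary requirements:", "lockfile==0.12.2", "pillarsdk==1.8.0", ""]:
    line = line.strip()
    if not line or line.startswith("#"):
        continue  # skip blanks and comments
    parts = requirement_re.split(line)
    requirements[parts[0]] = (line, parts[-1])  # package -> (full requirement, version)

print(requirements["lockfile"])  # -> ('lockfile==0.12.2', '0.12.2')
```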
@ -100,45 +97,37 @@ class BuildWheels(Command):
self.wheels_path.mkdir(parents=True, exist_ok=True) self.wheels_path.mkdir(parents=True, exist_ok=True)
for package in wheels: for package in wheels:
pattern = package.replace("-", "_") + "*.whl" pattern = package.replace('-', '_') + '*.whl'
if list(self.wheels_path.glob(pattern)): if list(self.wheels_path.glob(pattern)):
continue continue
self.download_wheel(requirements[package]) self.download_wheel(requirements[package])
# Build CacheControl. # Build CacheControl.
if not list(self.wheels_path.glob("CacheControl*.whl")): if not list(self.wheels_path.glob('CacheControl*.whl')):
log.info("Building CacheControl in %s", self.cachecontrol_path) log.info('Building CacheControl in %s', self.cachecontrol_path)
# self.git_clone(self.cachecontrol_path, # self.git_clone(self.cachecontrol_path,
# 'https://github.com/ionrock/cachecontrol.git', # 'https://github.com/ionrock/cachecontrol.git',
# 'v%s' % requirements['CacheControl'][1]) # 'v%s' % requirements['CacheControl'][1])
# FIXME: we need my clone until pull request #125 has been merged & released # FIXME: we need my clone until pull request #125 has been merged & released
-            self.git_clone(
-                self.cachecontrol_path,
-                "https://github.com/sybrenstuvel/cachecontrol.git",
-                "sybren-filecache-delete-crash-fix",
-            )
+            self.git_clone(self.cachecontrol_path,
+                           'https://github.com/sybrenstuvel/cachecontrol.git',
+                           'sybren-filecache-delete-crash-fix')
self.build_copy_wheel(self.cachecontrol_path) self.build_copy_wheel(self.cachecontrol_path)
# Ensure that the wheels are added to the data files. # Ensure that the wheels are added to the data files.
self.distribution.data_files.append( self.distribution.data_files.append(
("blender_cloud/wheels", (str(p) for p in self.wheels_path.glob("*.whl"))) ('blender_cloud/wheels', (str(p) for p in self.wheels_path.glob('*.whl')))
) )
def download_wheel(self, requirement): def download_wheel(self, requirement):
"""Downloads a wheel from PyPI and saves it in self.wheels_path.""" """Downloads a wheel from PyPI and saves it in self.wheels_path."""
-        subprocess.check_call(
-            [
-                sys.executable,
-                "-m",
-                "pip",
-                "download",
-                "--no-deps",
-                "--dest",
-                str(self.wheels_path),
-                requirement[0],
-            ]
-        )
+        subprocess.check_call([
+            'pip', 'download',
+            '--no-deps',
+            '--dest', str(self.wheels_path),
+            requirement[0]
+        ])
def git_clone(self, workdir: pathlib.Path, git_url: str, checkout: str = None): def git_clone(self, workdir: pathlib.Path, git_url: str, checkout: str = None):
if workdir.exists(): if workdir.exists():
@ -147,25 +136,24 @@ class BuildWheels(Command):
workdir.mkdir(parents=True) workdir.mkdir(parents=True)
-        subprocess.check_call(
-            ["git", "clone", git_url, str(workdir)], cwd=str(workdir.parent)
-        )
+        subprocess.check_call(['git', 'clone', git_url, str(workdir)],
+                              cwd=str(workdir.parent))

        if checkout:
-            subprocess.check_call(["git", "checkout", checkout], cwd=str(workdir))
+            subprocess.check_call(['git', 'checkout', checkout],
+                                  cwd=str(workdir))
def build_copy_wheel(self, package_path: pathlib.Path): def build_copy_wheel(self, package_path: pathlib.Path):
# Make sure no wheels exist yet, so that we know which one to copy later. # Make sure no wheels exist yet, so that we know which one to copy later.
to_remove = list((package_path / "dist").glob("*.whl")) to_remove = list((package_path / 'dist').glob('*.whl'))
for fname in to_remove: for fname in to_remove:
fname.unlink() fname.unlink()
-        subprocess.check_call(
-            [sys.executable, "setup.py", "bdist_wheel"], cwd=str(package_path)
-        )
+        subprocess.check_call([sys.executable, 'setup.py', 'bdist_wheel'],
+                              cwd=str(package_path))
wheel = next((package_path / "dist").glob("*.whl")) wheel = next((package_path / 'dist').glob('*.whl'))
log.info("copying %s to %s", wheel, self.wheels_path) log.info('copying %s to %s', wheel, self.wheels_path)
shutil.copy(str(wheel), str(self.wheels_path)) shutil.copy(str(wheel), str(self.wheels_path))
@ -175,19 +163,19 @@ class BlenderAddonBdist(bdist):
def initialize_options(self): def initialize_options(self):
super().initialize_options() super().initialize_options()
self.formats = ["zip"] self.formats = ['zip']
self.plat_name = "addon" # use this instead of 'linux-x86_64' or similar. self.plat_name = 'addon' # use this instead of 'linux-x86_64' or similar.
self.fix_local_prefix() self.fix_local_prefix()
def fix_local_prefix(self): def fix_local_prefix(self):
"""Place data files in blender_cloud instead of local/blender_cloud.""" """Place data files in blender_cloud instead of local/blender_cloud."""
for key in INSTALL_SCHEMES: for key in INSTALL_SCHEMES:
if "data" not in INSTALL_SCHEMES[key]: if 'data' not in INSTALL_SCHEMES[key]:
continue continue
INSTALL_SCHEMES[key]["data"] = "$base" INSTALL_SCHEMES[key]['data'] = '$base'
def run(self): def run(self):
self.run_command("wheels") self.run_command('wheels')
super().run() super().run()
@ -196,7 +184,7 @@ class BlenderAddonFdist(BlenderAddonBdist):
"""Ensures that 'python setup.py fdist' creates a plain folder structure.""" """Ensures that 'python setup.py fdist' creates a plain folder structure."""
user_options = [ user_options = [
("dest-path=", None, "addon installation path"), ('dest-path=', None, 'addon installation path'),
] ]
def initialize_options(self): def initialize_options(self):
@ -210,12 +198,12 @@ class BlenderAddonFdist(BlenderAddonBdist):
filepath = self.distribution.dist_files[0][2] filepath = self.distribution.dist_files[0][2]
# if dest_path is not specified use the filename as the dest_path (minus the .zip) # if dest_path is not specified use the filename as the dest_path (minus the .zip)
assert filepath.endswith(".zip") assert filepath.endswith('.zip')
target_folder = self.dest_path or filepath[:-4] target_folder = self.dest_path or filepath[:-4]
print("Unzipping the package on {}.".format(target_folder)) print('Unzipping the package on {}.'.format(target_folder))
with zipfile.ZipFile(filepath, "r") as zip_ref: with zipfile.ZipFile(filepath, 'r') as zip_ref:
zip_ref.extractall(target_folder) zip_ref.extractall(target_folder)
@ -225,8 +213,8 @@ class BlenderAddonInstall(install):
def initialize_options(self): def initialize_options(self):
super().initialize_options() super().initialize_options()
self.prefix = "" self.prefix = ''
self.install_lib = "" self.install_lib = ''
class AvoidEggInfo(install_egg_info): class AvoidEggInfo(install_egg_info):
@ -241,38 +229,34 @@ class AvoidEggInfo(install_egg_info):
setup(
-    cmdclass={
-        "bdist": BlenderAddonBdist,
-        "fdist": BlenderAddonFdist,
-        "install": BlenderAddonInstall,
-        "install_egg_info": AvoidEggInfo,
-        "wheels": BuildWheels,
-    },
-    name="blender_cloud",
-    description="The Blender Cloud addon allows browsing the Blender Cloud from Blender.",
-    version="1.25",
-    author="Sybren A. Stüvel",
-    author_email="sybren@stuvel.eu",
-    packages=find_packages("."),
+    cmdclass={'bdist': BlenderAddonBdist,
+              'fdist': BlenderAddonFdist,
+              'install': BlenderAddonInstall,
+              'install_egg_info': AvoidEggInfo,
+              'wheels': BuildWheels},
+    name='blender_cloud',
+    description='The Blender Cloud addon allows browsing the Blender Cloud from Blender.',
+    version='1.12.0',
+    author='Sybren A. Stüvel',
+    author_email='sybren@stuvel.eu',
+    packages=find_packages('.'),
    data_files=[
-        ("blender_cloud", ["README.md", "README-flamenco.md", "CHANGELOG.md"]),
-        ("blender_cloud/icons", glob.glob("blender_cloud/icons/*")),
-        (
-            "blender_cloud/texture_browser/icons",
-            glob.glob("blender_cloud/texture_browser/icons/*"),
-        ),
+        ('blender_cloud', ['README.md', 'README-flamenco.md', 'CHANGELOG.md']),
+        ('blender_cloud/icons', glob.glob('blender_cloud/icons/*')),
+        ('blender_cloud/texture_browser/icons',
+         glob.glob('blender_cloud/texture_browser/icons/*'))
    ],
scripts=[], scripts=[],
url="https://developer.blender.org/diffusion/BCA/", url='https://developer.blender.org/diffusion/BCA/',
license="GNU General Public License v2 or later (GPLv2+)", license='GNU General Public License v2 or later (GPLv2+)',
platforms="", platforms='',
classifiers=[ classifiers=[
"Intended Audience :: End Users/Desktop", 'Intended Audience :: End Users/Desktop',
"Operating System :: OS Independent", 'Operating System :: OS Independent',
"Environment :: Plugins", 'Environment :: Plugins',
"License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)", 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)',
"Programming Language :: Python", 'Programming Language :: Python',
"Programming Language :: Python :: 3.5", 'Programming Language :: Python :: 3.5',
], ],
zip_safe=False, zip_safe=False,
) )
View File
@ -15,94 +15,71 @@ from blender_cloud.flamenco import sdk
class PathReplacementTest(unittest.TestCase): class PathReplacementTest(unittest.TestCase):
def setUp(self): def setUp(self):
-        self.test_manager = sdk.Manager(
-            {
-                "_created": datetime.datetime(
-                    2017, 5, 31, 15, 12, 32, tzinfo=pillarsdk.utils.utc
-                ),
-                "_etag": "c39942ee4bcc4658adcc21e4bcdfb0ae",
-                "_id": "592edd609837732a2a272c62",
-                "_updated": datetime.datetime(
-                    2017, 6, 8, 14, 51, 3, tzinfo=pillarsdk.utils.utc
-                ),
-                "description": 'Manager formerly known as "testman"',
-                "job_types": {"sleep": {"vars": {}}},
-                "name": '<script>alert("this is a manager")</script>',
-                "owner": "592edd609837732a2a272c63",
-                "path_replacement": {
-                    "job_storage": {
-                        "darwin": "/Volume/shared",
-                        "linux": "/shared",
-                        "windows": "s:/",
-                    },
-                    "render": {
-                        "darwin": "/Volume/render/",
-                        "linux": "/render/",
-                        "windows": "r:/",
-                    },
-                    "longrender": {
-                        "darwin": "/Volume/render/long",
-                        "linux": "/render/long",
-                        "windows": "r:/long",
-                    },
-                },
-                "projects": ["58cbdd5698377322d95eb55e"],
-                "service_account": "592edd609837732a2a272c60",
-                "stats": {"nr_of_workers": 3},
-                "url": "http://192.168.3.101:8083/",
-                "user_groups": ["58cbdd5698377322d95eb55f"],
-                "variables": {
-                    "blender": {
-                        "darwin": "/opt/myblenderbuild/blender",
-                        "linux": "/home/sybren/workspace/build_linux/bin/blender "
-                        "--enable-new-depsgraph --factory-startup",
-                        "windows": "c:/temp/blender.exe",
-                    }
-                },
-            }
+        self.test_manager = sdk.Manager({
+            '_created': datetime.datetime(2017, 5, 31, 15, 12, 32, tzinfo=pillarsdk.utils.utc),
+            '_etag': 'c39942ee4bcc4658adcc21e4bcdfb0ae',
+            '_id': '592edd609837732a2a272c62',
+            '_updated': datetime.datetime(2017, 6, 8, 14, 51, 3, tzinfo=pillarsdk.utils.utc),
+            'description': 'Manager formerly known as "testman"',
+            'job_types': {'sleep': {'vars': {}}},
+            'name': '<script>alert("this is a manager")</script>',
+            'owner': '592edd609837732a2a272c63',
+            'path_replacement': {'job_storage': {'darwin': '/Volume/shared',
+                                                 'linux': '/shared',
+                                                 'windows': 's:/'},
+                                 'render': {'darwin': '/Volume/render/',
+                                            'linux': '/render/',
+                                            'windows': 'r:/'},
+                                 'longrender': {'darwin': '/Volume/render/long',
+                                                'linux': '/render/long',
+                                                'windows': 'r:/long'},
+                                 },
+            'projects': ['58cbdd5698377322d95eb55e'],
+            'service_account': '592edd609837732a2a272c60',
+            'stats': {'nr_of_workers': 3},
+            'url': 'http://192.168.3.101:8083/',
+            'user_groups': ['58cbdd5698377322d95eb55f'],
+            'variables': {'blender': {'darwin': '/opt/myblenderbuild/blender',
+                                      'linux': '/home/sybren/workspace/build_linux/bin/blender '
+                                               '--enable-new-depsgraph --factory-startup',
+                                      'windows': 'c:/temp/blender.exe'}}}
        )
def test_linux(self): def test_linux(self):
# (expected result, input) # (expected result, input)
test_paths = [ test_paths = [
("/doesnotexistreally", "/doesnotexistreally"), ('/doesnotexistreally', '/doesnotexistreally'),
("{render}/agent327/scenes/A_01_03_B", "/render/agent327/scenes/A_01_03_B"), ('{render}/agent327/scenes/A_01_03_B', '/render/agent327/scenes/A_01_03_B'),
("{job_storage}/render/agent327/scenes", "/shared/render/agent327/scenes"), ('{job_storage}/render/agent327/scenes', '/shared/render/agent327/scenes'),
("{longrender}/agent327/scenes", "/render/long/agent327/scenes"), ('{longrender}/agent327/scenes', '/render/long/agent327/scenes'),
] ]
self._do_test(test_paths, "linux", pathlib.PurePosixPath) self._do_test(test_paths, 'linux', pathlib.PurePosixPath)
def test_windows(self): def test_windows(self):
# (expected result, input) # (expected result, input)
test_paths = [ test_paths = [
("c:/doesnotexistreally", "c:/doesnotexistreally"), ('c:/doesnotexistreally', 'c:/doesnotexistreally'),
("c:/some/path", r"c:\some\path"), ('c:/some/path', r'c:\some\path'),
("{render}/agent327/scenes/A_01_03_B", r"R:\agent327\scenes\A_01_03_B"), ('{render}/agent327/scenes/A_01_03_B', r'R:\agent327\scenes\A_01_03_B'),
("{render}/agent327/scenes/A_01_03_B", r"r:\agent327\scenes\A_01_03_B"), ('{render}/agent327/scenes/A_01_03_B', r'r:\agent327\scenes\A_01_03_B'),
("{render}/agent327/scenes/A_01_03_B", r"r:/agent327/scenes/A_01_03_B"), ('{render}/agent327/scenes/A_01_03_B', r'r:/agent327/scenes/A_01_03_B'),
("{job_storage}/render/agent327/scenes", "s:/render/agent327/scenes"), ('{job_storage}/render/agent327/scenes', 's:/render/agent327/scenes'),
("{longrender}/agent327/scenes", "r:/long/agent327/scenes"), ('{longrender}/agent327/scenes', 'r:/long/agent327/scenes'),
] ]
self._do_test(test_paths, "windows", pathlib.PureWindowsPath) self._do_test(test_paths, 'windows', pathlib.PureWindowsPath)
def test_darwin(self): def test_darwin(self):
# (expected result, input) # (expected result, input)
        test_paths = [
-            ("/Volume/doesnotexistreally", "/Volume/doesnotexistreally"),
-            (
-                "{render}/agent327/scenes/A_01_03_B",
-                r"/Volume/render/agent327/scenes/A_01_03_B",
-            ),
-            (
-                "{job_storage}/render/agent327/scenes",
-                "/Volume/shared/render/agent327/scenes",
-            ),
-            ("{longrender}/agent327/scenes", "/Volume/render/long/agent327/scenes"),
+            ('/Volume/doesnotexistreally', '/Volume/doesnotexistreally'),
+            ('{render}/agent327/scenes/A_01_03_B', r'/Volume/render/agent327/scenes/A_01_03_B'),
+            ('{job_storage}/render/agent327/scenes', '/Volume/shared/render/agent327/scenes'),
+            ('{longrender}/agent327/scenes', '/Volume/render/long/agent327/scenes'),
        ]
self._do_test(test_paths, "darwin", pathlib.PurePosixPath) self._do_test(test_paths, 'darwin', pathlib.PurePosixPath)
def _do_test(self, test_paths, platform, pathclass): def _do_test(self, test_paths, platform, pathclass):
self.test_manager.PurePlatformPath = pathclass self.test_manager.PurePlatformPath = pathclass
@ -110,11 +87,9 @@ class PathReplacementTest(unittest.TestCase):
def mocked_system(): def mocked_system():
return platform return platform
with unittest.mock.patch("platform.system", mocked_system): with unittest.mock.patch('platform.system', mocked_system):
for expected_result, input_path in test_paths: for expected_result, input_path in test_paths:
as_path_instance = pathclass(input_path) as_path_instance = pathclass(input_path)
-                self.assertEqual(
-                    expected_result,
-                    self.test_manager.replace_path(as_path_instance),
-                    "for input %r on platform %s" % (as_path_instance, platform),
-                )
+                self.assertEqual(expected_result,
+                                 self.test_manager.replace_path(as_path_instance),
+                                 'for input %r on platform %s' % (as_path_instance, platform))
View File
@ -8,18 +8,18 @@ from blender_cloud import utils
class FindInPathTest(unittest.TestCase): class FindInPathTest(unittest.TestCase):
def test_nonexistant_path(self): def test_nonexistant_path(self):
path = pathlib.Path("/doesnotexistreally") path = pathlib.Path('/doesnotexistreally')
self.assertFalse(path.exists()) self.assertFalse(path.exists())
self.assertIsNone(utils.find_in_path(path, "jemoeder.blend")) self.assertIsNone(utils.find_in_path(path, 'jemoeder.blend'))
def test_really_breadth_first(self): def test_really_breadth_first(self):
"""A depth-first test might find dir_a1/dir_a2/dir_a3/find_me.txt first.""" """A depth-first test might find dir_a1/dir_a2/dir_a3/find_me.txt first."""
path = pathlib.Path(__file__).parent / "test_really_breadth_first" path = pathlib.Path(__file__).parent / 'test_really_breadth_first'
found = utils.find_in_path(path, "find_me.txt") found = utils.find_in_path(path, 'find_me.txt')
self.assertEqual(path / "dir_b1" / "dir_b2" / "find_me.txt", found) self.assertEqual(path / 'dir_b1' / 'dir_b2' / 'find_me.txt', found)
def test_nonexistant_file(self): def test_nonexistant_file(self):
path = pathlib.Path(__file__).parent / "test_really_breadth_first" path = pathlib.Path(__file__).parent / 'test_really_breadth_first'
found = utils.find_in_path(path, "do_not_find_me.txt") found = utils.find_in_path(path, 'do_not_find_me.txt')
self.assertEqual(None, found) self.assertEqual(None, found)
View File
@ -9,8 +9,8 @@ fi
BL_INFO_VER=$(echo "$VERSION" | sed 's/\./, /g') BL_INFO_VER=$(echo "$VERSION" | sed 's/\./, /g')
sed "s/version=\"[^\"]*\"/version=\"$VERSION\"/" -i setup.py sed "s/version='[^']*'/version='$VERSION'/" -i setup.py
sed "s/\"version\": ([^)]*)/\"version\": ($BL_INFO_VER)/" -i blender_cloud/__init__.py sed "s/'version': ([^)]*)/'version': ($BL_INFO_VER)/" -i blender_cloud/__init__.py
git diff git diff
echo echo
@ -19,4 +19,4 @@ echo git commit -m \'Bumped version to $VERSION\' setup.py blender_cloud/__init_
echo git tag -a version-$VERSION -m \'Tagged version $VERSION\' echo git tag -a version-$VERSION -m \'Tagged version $VERSION\'
echo echo
echo "To build a distribution ZIP:" echo "To build a distribution ZIP:"
-echo python3 setup.py bdist
+echo python setup.py bdist