Move package manager to blender branch

Moved the package manager out of an addon. It now lives here:
https://developer.blender.org/diffusion/B/browse/soc-2017-package_manager/
This repository still contains the repo generation script; the readme has been
updated to reflect this.
README.md
@@ -1,36 +1,9 @@
-# BlenderPackage, the Blender Package Manager (wip)
+# Repository generator for the blender package manager
 
-This is work-in-progress documentation for the work-in-progress package manager
-(the name is also a work-in-progress). As such, everything here is subject to
-change.
+For the package manager itself, see the `soc-2017-package_manager` branch here:
+
+https://developer.blender.org/diffusion/B/browse/soc-2017-package_manager/
 
-# Installation and Testing
-
-1. Clone blender and checkout the [`soc-2017-package_manager` branch](https://developer.blender.org/diffusion/B/browse/soc-2017-package_manager/).
-
-       git clone git://git.blender.org/blender.git
-       git checkout soc-2017-package_manager
-
-2. [Compile](https://wiki.blender.org/index.php/Dev:Doc/Building_Blender).
-   You may want to build without addons (or delete them from `<build dir>/bin/<version>/scripts/addons*` afterwards).
-   This is because system addons typically can't be altered by the user (permissions), so the package manager won't be able to uninstall/update them.
-   Plus, the test repo only contains official addons, so if left as is, all addons will be installed already.
-3. Clone the [package manager addon repository](https://developer.blender.org/diffusion/BPMA/repository):
-
-       git clone git://git.blender.org/blender-package-manager-addon.git
-
-4. Install the addon. Copy (or symlink) the `package_manager` directory
-   contained *within* the cloned repository into
-   `/path/to/blender/build/bin/2.78/scripts/addons/`
-5. Start blender and enable the addon (it's in the "Testing" support level).
-6. Add a repository in *User Preferences > Packages > Repositories* by clicking the "plus" icon. You can use a local repository (see below), or the one I set up for testing: `http://blendermonkey.com/bpkg`.
-   Currently only one repository is allowed, but this will change.
-
-### Repository creation
-
-A local repository can be generated with the `generate_repository` script found
-in the addon repository. Example usage:
+This repository contains a script (`generate_repository`) for generating
+repositories of blender packages. Example usage:
 
     ./generate_repository /path/to/packages --baseurl 'http://localhost/'
@@ -38,82 +11,21 @@
 This will produce a `repo.json` file in the current directory, which can then
 be copied to the server. The baseurl is prepended to the filename of each
 package to form the package's url (so for example, `http://localhost/node_wrangler.py`).
 
+For an explanation of the other options see `generate_repository --help`:
+
+    usage: generate_repository [-h] [-v] [-u BASEURL] [-n NAME] [-o OUTPUT] path
+
+    Generate a blender package repository from a directory of addons
+
+    positional arguments:
+      path                  Path to addon directory
+
+    optional arguments:
+      -h, --help            show this help message and exit
+      -v, --verbose         Increase verbosity (can be used multiple times)
+      -u BASEURL, --baseurl BASEURL
+                            Component of URL leading up to the package filename.
+      -n NAME, --name NAME  Name of repo (defaults to basename of 'path')
+      -o OUTPUT, --output OUTPUT
+                            Directory in which to write repo.json file
-# Known limitations
-
-Things which are known to be bad, but are planned to get better:
-
-* No progress display
-* Asynchronous operators can be run multiple times at once
-* Not more than one repository can be used at once
-* Only the latest version of a package can be installed and uninstalled
-
-# Notes
-
-My intention is to eventually make uninstalls undo-able until blender is
-restarted by moving the uninstalled files to a cache directory which is flushed
-on startup and/or exit.
-
-Packages are identified by their name. This could of course cause issues if two
-different packages end up with the same name. As it seems 2.8 will break many
-addons anyway, perhaps we can add a required metadata field that allows for
-more reliable unique identification?
-
-# Terminology
-
-## Package
-
-A _package_ consists of a single file, or a zip archive containing files to be installed.
-
-Note:
-I think it would be good to always store `bl_info` metadata with the package,
-but how best to do this while being compatible with existing addons and future
-non-addons remains an open question (perhaps we can always include an
-`__init__.py` even in non-addon packages?)
-
-## Repository
-
-A _repository_ consists of a directory containing a "repo.json" file. This
-repo.json file contains metadata describing each package (`bl_info`) and where
-it may be retrieved from.
-
-A repo.json file may currently be generated from a directory of addons by
-running `blenderpack.py <path/to/addon/dir>`.
-
-## Client
-
-Clients "use" a repository by including the packages listed in its repo.json.
-Clients can be configured to use multiple repositories at once.
-
-In addition, the client maintains its own "local repository", which is a
-repo.json containing installed packages.
-
-Clients can take the following actions:
-
-### Install
-
-_Installing_ means downloading a single package, adding it to the local
-repository, and extracting/copying the package's file(s) to their
-destination(s).
-
-### Uninstall
-
-_Uninstalling_ means deleting a single package's files, then removing it from
-the local repository.
-
-Note:
-If some packages store user-created data (e.g. preferences), we may want to
-preserve that somehow.
-
-### Upgrade
-
-_Upgrading_ means looking for and installing newer addons with the same names
-as installed addons.
-
-### Refresh
-
-_Refreshing_ means checking for modifications to the `repo.json`s of the enabled
-repositories, and new packages which may have appeared on disk.
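The commit does not show the repo.json schema itself. As a rough sketch of the fields the README describes (per-package `bl_info` metadata plus a URL formed by prepending the baseurl to the package filename), a generated file might look like the structure below; the field names here are guesses for illustration, not the actual format:

```python
import json

# Hypothetical repo.json contents; the real schema is not shown in this
# commit. Each package entry carries its bl_info metadata and the URL
# formed by prepending the baseurl to the package filename.
repo = {
    "name": "packages",
    "packages": [
        {
            "bl_info": {"name": "Node Wrangler", "category": "Node"},
            "url": "http://localhost/node_wrangler.py",
        },
    ],
}

print(json.dumps(repo, indent=4))
```

A client would fetch such a file, merge the listed packages into its known-package list, and download individual files from their `url` fields.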
File diff suppressed because it is too large
@@ -1,552 +0,0 @@
-#!/usr/bin/env python
-# -*- coding: utf-8 -*-
-# Copyright (c) 2005-2010 ActiveState Software Inc.
-# Copyright (c) 2013 Eddy Petrișor
-
-"""Utilities for determining application-specific dirs.
-
-See <http://github.com/ActiveState/appdirs> for details and usage.
-"""
-# Dev Notes:
-# - MSDN on where to store app data files:
-#   http://support.microsoft.com/default.aspx?scid=kb;en-us;310294#XSLTH3194121123120121120120
-# - Mac OS X: http://developer.apple.com/documentation/MacOSX/Conceptual/BPFileSystem/index.html
-# - XDG spec for Un*x: http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html
-
-__version_info__ = (1, 4, 0)
-__version__ = '.'.join(map(str, __version_info__))
-
-
-import sys
-import os
-
-PY3 = sys.version_info[0] == 3
-
-if PY3:
-    unicode = str
-
-if sys.platform.startswith('java'):
-    import platform
-    os_name = platform.java_ver()[3][0]
-    if os_name.startswith('Windows'):  # "Windows XP", "Windows 7", etc.
-        system = 'win32'
-    elif os_name.startswith('Mac'):  # "Mac OS X", etc.
-        system = 'darwin'
-    else:  # "Linux", "SunOS", "FreeBSD", etc.
-        # Setting this to "linux2" is not ideal, but only Windows or Mac
-        # are actually checked for and the rest of the module expects
-        # *sys.platform* style strings.
-        system = 'linux2'
-else:
-    system = sys.platform
-
-
-def user_data_dir(appname=None, appauthor=None, version=None, roaming=False):
-    r"""Return full path to the user-specific data dir for this application.
-
-        "appname" is the name of application.
-            If None, just the system directory is returned.
-        "appauthor" (only used on Windows) is the name of the
-            appauthor or distributing body for this application. Typically
-            it is the owning company name. This falls back to appname. You may
-            pass False to disable it.
-        "version" is an optional version path element to append to the
-            path. You might want to use this if you want multiple versions
-            of your app to be able to run independently. If used, this
-            would typically be "<major>.<minor>".
-            Only applied when appname is present.
-        "roaming" (boolean, default False) can be set True to use the Windows
-            roaming appdata directory. That means that for users on a Windows
-            network setup for roaming profiles, this user data will be
-            sync'd on login. See
-            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
-            for a discussion of issues.
-
-    Typical user data directories are:
-        Mac OS X:               ~/Library/Application Support/<AppName>
-        Unix:                   ~/.local/share/<AppName>    # or in $XDG_DATA_HOME, if defined
-        Win XP (not roaming):   C:\Documents and Settings\<username>\Application Data\<AppAuthor>\<AppName>
-        Win XP (roaming):       C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>
-        Win 7  (not roaming):   C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>
-        Win 7  (roaming):       C:\Users\<username>\AppData\Roaming\<AppAuthor>\<AppName>
-
-    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
-    That means, by default "~/.local/share/<AppName>".
-    """
-    if system == "win32":
-        if appauthor is None:
-            appauthor = appname
-        const = roaming and "CSIDL_APPDATA" or "CSIDL_LOCAL_APPDATA"
-        path = os.path.normpath(_get_win_folder(const))
-        if appname:
-            if appauthor is not False:
-                path = os.path.join(path, appauthor, appname)
-            else:
-                path = os.path.join(path, appname)
-    elif system == 'darwin':
-        path = os.path.expanduser('~/Library/Application Support/')
-        if appname:
-            path = os.path.join(path, appname)
-    else:
-        path = os.getenv('XDG_DATA_HOME', os.path.expanduser("~/.local/share"))
-        if appname:
-            path = os.path.join(path, appname)
-    if appname and version:
-        path = os.path.join(path, version)
-    return path
-
-
-def site_data_dir(appname=None, appauthor=None, version=None, multipath=False):
-    """Return full path to the user-shared data dir for this application.
-
-        "appname" is the name of application.
-            If None, just the system directory is returned.
-        "appauthor" (only used on Windows) is the name of the
-            appauthor or distributing body for this application. Typically
-            it is the owning company name. This falls back to appname. You may
-            pass False to disable it.
-        "version" is an optional version path element to append to the
-            path. You might want to use this if you want multiple versions
-            of your app to be able to run independently. If used, this
-            would typically be "<major>.<minor>".
-            Only applied when appname is present.
-        "multipath" is an optional parameter only applicable to *nix
-            which indicates that the entire list of data dirs should be
-            returned. By default, the first item from XDG_DATA_DIRS is
-            returned, or '/usr/local/share/<AppName>',
-            if XDG_DATA_DIRS is not set
-
-    Typical user data directories are:
-        Mac OS X:   /Library/Application Support/<AppName>
-        Unix:       /usr/local/share/<AppName> or /usr/share/<AppName>
-        Win XP:     C:\Documents and Settings\All Users\Application Data\<AppAuthor>\<AppName>
-        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)
-        Win 7:      C:\ProgramData\<AppAuthor>\<AppName>   # Hidden, but writeable on Win 7.
-
-    For Unix, this is using the $XDG_DATA_DIRS[0] default.
-
-    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
-    """
-    if system == "win32":
-        if appauthor is None:
-            appauthor = appname
-        path = os.path.normpath(_get_win_folder("CSIDL_COMMON_APPDATA"))
-        if appname:
-            if appauthor is not False:
-                path = os.path.join(path, appauthor, appname)
-            else:
-                path = os.path.join(path, appname)
-    elif system == 'darwin':
-        path = os.path.expanduser('/Library/Application Support')
-        if appname:
-            path = os.path.join(path, appname)
-    else:
-        # XDG default for $XDG_DATA_DIRS
-        # only first, if multipath is False
-        path = os.getenv('XDG_DATA_DIRS',
-                         os.pathsep.join(['/usr/local/share', '/usr/share']))
-        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
-        if appname:
-            if version:
-                appname = os.path.join(appname, version)
-            pathlist = [os.sep.join([x, appname]) for x in pathlist]
-
-        if multipath:
-            path = os.pathsep.join(pathlist)
-        else:
-            path = pathlist[0]
-        return path
-
-    if appname and version:
-        path = os.path.join(path, version)
-    return path
-
-
-def user_config_dir(appname=None, appauthor=None, version=None, roaming=False):
-    r"""Return full path to the user-specific config dir for this application.
-
-        "appname" is the name of application.
-            If None, just the system directory is returned.
-        "appauthor" (only used on Windows) is the name of the
-            appauthor or distributing body for this application. Typically
-            it is the owning company name. This falls back to appname. You may
-            pass False to disable it.
-        "version" is an optional version path element to append to the
-            path. You might want to use this if you want multiple versions
-            of your app to be able to run independently. If used, this
-            would typically be "<major>.<minor>".
-            Only applied when appname is present.
-        "roaming" (boolean, default False) can be set True to use the Windows
-            roaming appdata directory. That means that for users on a Windows
-            network setup for roaming profiles, this user data will be
-            sync'd on login. See
-            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
-            for a discussion of issues.
-
-    Typical user data directories are:
-        Mac OS X:   same as user_data_dir
-        Unix:       ~/.config/<AppName>     # or in $XDG_CONFIG_HOME, if defined
-        Win *:      same as user_data_dir
-
-    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
-    That means, by deafult "~/.config/<AppName>".
-    """
-    if system in ["win32", "darwin"]:
-        path = user_data_dir(appname, appauthor, None, roaming)
-    else:
-        path = os.getenv('XDG_CONFIG_HOME', os.path.expanduser("~/.config"))
-        if appname:
-            path = os.path.join(path, appname)
-    if appname and version:
-        path = os.path.join(path, version)
-    return path
-
-
-def site_config_dir(appname=None, appauthor=None, version=None, multipath=False):
-    """Return full path to the user-shared data dir for this application.
-
-        "appname" is the name of application.
-            If None, just the system directory is returned.
-        "appauthor" (only used on Windows) is the name of the
-            appauthor or distributing body for this application. Typically
-            it is the owning company name. This falls back to appname. You may
-            pass False to disable it.
-        "version" is an optional version path element to append to the
-            path. You might want to use this if you want multiple versions
-            of your app to be able to run independently. If used, this
-            would typically be "<major>.<minor>".
-            Only applied when appname is present.
-        "multipath" is an optional parameter only applicable to *nix
-            which indicates that the entire list of config dirs should be
-            returned. By default, the first item from XDG_CONFIG_DIRS is
-            returned, or '/etc/xdg/<AppName>', if XDG_CONFIG_DIRS is not set
-
-    Typical user data directories are:
-        Mac OS X:   same as site_data_dir
-        Unix:       /etc/xdg/<AppName> or $XDG_CONFIG_DIRS[i]/<AppName> for each value in
-                    $XDG_CONFIG_DIRS
-        Win *:      same as site_data_dir
-        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)
-
-    For Unix, this is using the $XDG_CONFIG_DIRS[0] default, if multipath=False
-
-    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
-    """
-    if system in ["win32", "darwin"]:
-        path = site_data_dir(appname, appauthor)
-        if appname and version:
-            path = os.path.join(path, version)
-    else:
-        # XDG default for $XDG_CONFIG_DIRS
-        # only first, if multipath is False
-        path = os.getenv('XDG_CONFIG_DIRS', '/etc/xdg')
-        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
-        if appname:
-            if version:
-                appname = os.path.join(appname, version)
-            pathlist = [os.sep.join([x, appname]) for x in pathlist]
-
-        if multipath:
-            path = os.pathsep.join(pathlist)
-        else:
-            path = pathlist[0]
-    return path
-
-
-def user_cache_dir(appname=None, appauthor=None, version=None, opinion=True):
-    r"""Return full path to the user-specific cache dir for this application.
-
-        "appname" is the name of application.
-            If None, just the system directory is returned.
-        "appauthor" (only used on Windows) is the name of the
-            appauthor or distributing body for this application. Typically
-            it is the owning company name. This falls back to appname. You may
-            pass False to disable it.
-        "version" is an optional version path element to append to the
-            path. You might want to use this if you want multiple versions
-            of your app to be able to run independently. If used, this
-            would typically be "<major>.<minor>".
-            Only applied when appname is present.
-        "opinion" (boolean) can be False to disable the appending of
-            "Cache" to the base app data dir for Windows. See
-            discussion below.
-
-    Typical user cache directories are:
-        Mac OS X:   ~/Library/Caches/<AppName>
-        Unix:       ~/.cache/<AppName> (XDG default)
-        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Cache
-        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Cache
-
-    On Windows the only suggestion in the MSDN docs is that local settings go in
-    the `CSIDL_LOCAL_APPDATA` directory. This is identical to the non-roaming
-    app data dir (the default returned by `user_data_dir` above). Apps typically
-    put cache data somewhere *under* the given dir here. Some examples:
-        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
-        ...\Acme\SuperApp\Cache\1.0
-    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
-    This can be disabled with the `opinion=False` option.
-    """
-    if system == "win32":
-        if appauthor is None:
-            appauthor = appname
-        path = os.path.normpath(_get_win_folder("CSIDL_LOCAL_APPDATA"))
-        if appname:
-            if appauthor is not False:
-                path = os.path.join(path, appauthor, appname)
-            else:
-                path = os.path.join(path, appname)
-            if opinion:
-                path = os.path.join(path, "Cache")
-    elif system == 'darwin':
-        path = os.path.expanduser('~/Library/Caches')
-        if appname:
-            path = os.path.join(path, appname)
-    else:
-        path = os.getenv('XDG_CACHE_HOME', os.path.expanduser('~/.cache'))
-        if appname:
-            path = os.path.join(path, appname.lower().replace(' ', '-'))
-    if appname and version:
-        path = os.path.join(path, version)
-    return path
-
-
-def user_log_dir(appname=None, appauthor=None, version=None, opinion=True):
-    r"""Return full path to the user-specific log dir for this application.
-
-        "appname" is the name of application.
-            If None, just the system directory is returned.
-        "appauthor" (only used on Windows) is the name of the
-            appauthor or distributing body for this application. Typically
-            it is the owning company name. This falls back to appname. You may
-            pass False to disable it.
-        "version" is an optional version path element to append to the
-            path. You might want to use this if you want multiple versions
-            of your app to be able to run independently. If used, this
-            would typically be "<major>.<minor>".
-            Only applied when appname is present.
-        "opinion" (boolean) can be False to disable the appending of
-            "Logs" to the base app data dir for Windows, and "log" to the
-            base cache dir for Unix. See discussion below.
-
-    Typical user cache directories are:
-        Mac OS X:   ~/Library/Logs/<AppName>
-        Unix:       ~/.cache/<AppName>/log  # or under $XDG_CACHE_HOME if defined
-        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Logs
-        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Logs
-
-    On Windows the only suggestion in the MSDN docs is that local settings
-    go in the `CSIDL_LOCAL_APPDATA` directory. (Note: I'm interested in
-    examples of what some windows apps use for a logs dir.)
-
-    OPINION: This function appends "Logs" to the `CSIDL_LOCAL_APPDATA`
-    value for Windows and appends "log" to the user cache dir for Unix.
-    This can be disabled with the `opinion=False` option.
-    """
-    if system == "darwin":
-        path = os.path.join(
-            os.path.expanduser('~/Library/Logs'),
-            appname)
-    elif system == "win32":
-        path = user_data_dir(appname, appauthor, version)
-        version = False
-        if opinion:
-            path = os.path.join(path, "Logs")
-    else:
-        path = user_cache_dir(appname, appauthor, version)
-        version = False
-        if opinion:
-            path = os.path.join(path, "log")
-    if appname and version:
-        path = os.path.join(path, version)
-    return path
-
-
-class AppDirs(object):
-    """Convenience wrapper for getting application dirs."""
-    def __init__(self, appname, appauthor=None, version=None, roaming=False,
-                 multipath=False):
-        self.appname = appname
-        self.appauthor = appauthor
-        self.version = version
-        self.roaming = roaming
-        self.multipath = multipath
-
-    @property
-    def user_data_dir(self):
-        return user_data_dir(self.appname, self.appauthor,
-                             version=self.version, roaming=self.roaming)
-
-    @property
-    def site_data_dir(self):
-        return site_data_dir(self.appname, self.appauthor,
-                             version=self.version, multipath=self.multipath)
-
-    @property
-    def user_config_dir(self):
-        return user_config_dir(self.appname, self.appauthor,
-                               version=self.version, roaming=self.roaming)
-
-    @property
-    def site_config_dir(self):
-        return site_config_dir(self.appname, self.appauthor,
-                               version=self.version, multipath=self.multipath)
-
-    @property
-    def user_cache_dir(self):
-        return user_cache_dir(self.appname, self.appauthor,
-                              version=self.version)
-
-    @property
-    def user_log_dir(self):
-        return user_log_dir(self.appname, self.appauthor,
-                            version=self.version)
-
-
-#---- internal support stuff
-
-def _get_win_folder_from_registry(csidl_name):
-    """This is a fallback technique at best. I'm not sure if using the
-    registry for this guarantees us the correct answer for all CSIDL_*
-    names.
-    """
-    import _winreg
-
-    shell_folder_name = {
-        "CSIDL_APPDATA": "AppData",
-        "CSIDL_COMMON_APPDATA": "Common AppData",
-        "CSIDL_LOCAL_APPDATA": "Local AppData",
-    }[csidl_name]
-
-    key = _winreg.OpenKey(
-        _winreg.HKEY_CURRENT_USER,
-        r"Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders"
-    )
-    dir, type = _winreg.QueryValueEx(key, shell_folder_name)
-    return dir
-
-
-def _get_win_folder_with_pywin32(csidl_name):
-    from win32com.shell import shellcon, shell
-    dir = shell.SHGetFolderPath(0, getattr(shellcon, csidl_name), 0, 0)
-    # Try to make this a unicode path because SHGetFolderPath does
-    # not return unicode strings when there is unicode data in the
-    # path.
-    try:
-        dir = unicode(dir)
-
-        # Downgrade to short path name if have highbit chars. See
-        # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
-        has_high_char = False
-        for c in dir:
-            if ord(c) > 255:
-                has_high_char = True
-                break
-        if has_high_char:
-            try:
-                import win32api
-                dir = win32api.GetShortPathName(dir)
-            except ImportError:
-                pass
-    except UnicodeError:
-        pass
-    return dir
-
-
-def _get_win_folder_with_ctypes(csidl_name):
-    import ctypes
-
-    csidl_const = {
-        "CSIDL_APPDATA": 26,
-        "CSIDL_COMMON_APPDATA": 35,
-        "CSIDL_LOCAL_APPDATA": 28,
-    }[csidl_name]
-
-    buf = ctypes.create_unicode_buffer(1024)
-    ctypes.windll.shell32.SHGetFolderPathW(None, csidl_const, None, 0, buf)
-
-    # Downgrade to short path name if have highbit chars. See
-    # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
-    has_high_char = False
-    for c in buf:
-        if ord(c) > 255:
-            has_high_char = True
-            break
-    if has_high_char:
-        buf2 = ctypes.create_unicode_buffer(1024)
-        if ctypes.windll.kernel32.GetShortPathNameW(buf.value, buf2, 1024):
-            buf = buf2
-
-    return buf.value
-
-def _get_win_folder_with_jna(csidl_name):
-    import array
-    from com.sun import jna
-    from com.sun.jna.platform import win32
-
-    buf_size = win32.WinDef.MAX_PATH * 2
-    buf = array.zeros('c', buf_size)
-    shell = win32.Shell32.INSTANCE
-    shell.SHGetFolderPath(None, getattr(win32.ShlObj, csidl_name), None, win32.ShlObj.SHGFP_TYPE_CURRENT, buf)
-    dir = jna.Native.toString(buf.tostring()).rstrip("\0")
-
-    # Downgrade to short path name if have highbit chars. See
-    # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
-    has_high_char = False
-    for c in dir:
-        if ord(c) > 255:
-            has_high_char = True
-            break
-    if has_high_char:
-        buf = array.zeros('c', buf_size)
-        kernel = win32.Kernel32.INSTANCE
-        if kernal.GetShortPathName(dir, buf, buf_size):
-            dir = jna.Native.toString(buf.tostring()).rstrip("\0")
-
-    return dir
-
-if system == "win32":
-    try:
-        import win32com.shell
-        _get_win_folder = _get_win_folder_with_pywin32
-    except ImportError:
-        try:
-            from ctypes import windll
-            _get_win_folder = _get_win_folder_with_ctypes
-        except ImportError:
-            try:
-                import com.sun.jna
-                _get_win_folder = _get_win_folder_with_jna
-            except ImportError:
-                _get_win_folder = _get_win_folder_from_registry
-
-
-#---- self test code
-
-if __name__ == "__main__":
-    appname = "MyApp"
-    appauthor = "MyCompany"
-
-    props = ("user_data_dir", "site_data_dir",
-             "user_config_dir", "site_config_dir",
-             "user_cache_dir", "user_log_dir")
-
-    print("-- app dirs (with optional 'version')")
-    dirs = AppDirs(appname, appauthor, version="1.0")
-    for prop in props:
-        print("%s: %s" % (prop, getattr(dirs, prop)))
-
-    print("\n-- app dirs (without optional 'version')")
-    dirs = AppDirs(appname, appauthor)
-    for prop in props:
-        print("%s: %s" % (prop, getattr(dirs, prop)))
-
-    print("\n-- app dirs (without optional 'appauthor')")
-    dirs = AppDirs(appname)
-    for prop in props:
-        print("%s: %s" % (prop, getattr(dirs, prop)))
-
-    print("\n-- app dirs (with disabled 'appauthor')")
-    dirs = AppDirs(appname, appauthor=False)
-    for prop in props:
-        print("%s: %s" % (prop, getattr(dirs, prop)))
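The README notes above float moving uninstalled files to a cache directory that is flushed on startup; the vendored `appdirs` module would supply the per-user location for such a cache. A cut-down, Unix-only sketch of the same lookup (the app name is made up for illustration, and only the XDG branch of `appdirs.user_cache_dir` is reproduced):

```python
import os

def user_cache_dir(appname: str) -> str:
    # Mirrors the Unix branch of appdirs.user_cache_dir: honour
    # $XDG_CACHE_HOME, fall back to ~/.cache, and normalize the app
    # name to lowercase-with-dashes.
    base = os.getenv('XDG_CACHE_HOME', os.path.expanduser('~/.cache'))
    return os.path.join(base, appname.lower().replace(' ', '-'))

print(user_cache_dir('Blender Package Manager'))
```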
@@ -1,19 +0,0 @@
__all__ = (
    "exceptions",
    "types",
)

from . import exceptions
from .types import (
    Package,
    Repository,
)
from pathlib import Path

def load_repositories(repo_storage_path: Path) -> list:
    """Load all repositories stored as json files in `repo_storage_path`."""
    repositories = []
    for repofile in repo_storage_path.glob('*.json'):
        try:
            repo = Repository.from_file(repofile)
        except exceptions.BadRepositoryException:
            # Skip files which can't be parsed as a repository
            continue
        repositories.append(repo)
    return repositories
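The loading loop above tolerates unreadable files so one broken repository doesn't block the rest. A standalone sketch of the same pattern, using plain `json.load` in place of `Repository.from_file` (an assumption for illustration) so it runs without the addon:

```python
import json
import tempfile
from pathlib import Path

def load_repo_dicts(repo_storage_path: Path) -> list:
    """Collect every parseable *.json file in the storage directory."""
    repos = []
    for repofile in sorted(repo_storage_path.glob('*.json')):
        try:
            with repofile.open(encoding='utf-8') as f:
                repos.append(json.load(f))
        except json.JSONDecodeError:
            # Skip files which aren't valid repositories
            continue
    return repos

with tempfile.TemporaryDirectory() as tmp:
    storage = Path(tmp)
    (storage / 'main.json').write_text('{"name": "main", "url": "http://example.com", "packages": []}')
    (storage / 'broken.json').write_text('not json at all')
    loaded = load_repo_dicts(storage)
    print([r['name'] for r in loaded])  # → ['main']
```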
@@ -1,14 +0,0 @@
class BpkgException(Exception):
    """Superclass for all package manager exceptions"""

class InstallException(BpkgException):
    """Raised when there is an error during installation"""

class DownloadException(BpkgException):
    """Raised when there is an error downloading something"""

class BadRepositoryException(BpkgException):
    """Raised when there is an error while reading or manipulating a repository"""

class PackageException(BpkgException):
    """Raised when there is an error while manipulating a package"""

class BadAddon(BpkgException):
    """Raised when a module does not appear to be a valid addon (referenced by Package.from_module)"""
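Rooting every exception in a single `BpkgException` superclass lets callers handle any package-manager failure with one `except` clause while still distinguishing subtypes when needed. A minimal self-contained sketch of that hierarchy:

```python
class BpkgException(Exception):
    """Superclass for all package manager exceptions"""

class DownloadException(BpkgException):
    """Raised when there is an error downloading something"""

class InstallException(BpkgException):
    """Raised when there is an error during installation"""

def try_install(fail_with):
    # One handler covers every subclass of the root exception.
    try:
        raise fail_with("simulated failure")
    except BpkgException as err:
        return type(err).__name__

print(try_install(DownloadException))  # → DownloadException
print(try_install(InstallException))   # → InstallException
```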
@@ -1,501 +0,0 @@
import logging
import json
from pathlib import Path
from . import exceptions
from . import utils

class Package:
    """
    Stores package methods and metadata
    """

    log = logging.getLogger(__name__ + ".Package")

    def __init__(self, package_dict: dict = None):
        self.bl_info = {}
        self.url = ""
        self.files = []

        self.repositories = set()
        self.installed_location = None
        self.module_name = None

        self.installed = False
        self.is_user = False
        self.enabled = False

        self.set_from_dict(package_dict)

    def test_is_user(self) -> bool:
        """Return true if package's install location is in user or preferences scripts path"""
        import bpy
        user_script_path = bpy.utils.script_path_user()
        prefs_script_path = bpy.utils.script_path_pref()

        if user_script_path is not None:
            in_user = Path(user_script_path) in Path(self.installed_location).parents
        else:
            in_user = False

        if prefs_script_path is not None:
            in_prefs = Path(prefs_script_path) in Path(self.installed_location).parents
        else:
            in_prefs = False

        return in_user or in_prefs

    def test_enabled(self) -> bool:
        """Return true if package is enabled"""
        import bpy
        if self.module_name is not None:
            return (self.module_name in bpy.context.user_preferences.addons)
        else:
            return False

    def test_installed(self) -> bool:
        """Return true if package is installed"""
        import addon_utils
        return len([Package.from_module(mod) for mod in addon_utils.modules(refresh=False) if
                    addon_utils.module_bl_info(mod)['name'] == self.name and
                    addon_utils.module_bl_info(mod)['version'] == self.version]) > 0

    def set_installed_metadata(self, installed_pkg):
        """Sets metadata specific to installed packages from the Package given as `installed_pkg`"""
        self.installed = installed_pkg.test_installed()
        self.enabled = installed_pkg.test_enabled()
        self.is_user = installed_pkg.test_is_user()
        self.module_name = installed_pkg.module_name
        self.installed_location = installed_pkg.installed_location

    def to_dict(self) -> dict:
        """
        Return a dict representation of the package
        """
        return {
            'bl_info': self.bl_info,
            'url': self.url,
            'files': self.files,
        }

    def set_from_dict(self, package_dict: dict):
        """
        Get attributes from a dict such as produced by `to_dict`
        """
        if package_dict is None:
            package_dict = {}

        for attr in ('files', 'url', 'bl_info'):
            if package_dict.get(attr) is not None:
                setattr(self, attr, package_dict[attr])
    # bl_info convenience getters {{{
    # required fields
    @property
    def name(self) -> str:
        """Get name from bl_info"""
        return self.bl_info.get('name')

    @property
    def version(self) -> tuple:
        """Get version from bl_info"""
        return tuple(self.bl_info.get('version'))

    @property
    def blender(self) -> tuple:
        """Get blender from bl_info"""
        return self.bl_info.get('blender')

    # optional fields
    @property
    def description(self) -> str:
        """Get description from bl_info"""
        return self.bl_info.get('description')

    @property
    def author(self) -> str:
        """Get author from bl_info"""
        return self.bl_info.get('author')

    @property
    def category(self) -> str:
        """Get category from bl_info"""
        return self.bl_info.get('category')

    @property
    def location(self) -> str:
        """Get location from bl_info"""
        return self.bl_info.get('location')

    @property
    def support(self) -> str:
        """Get support from bl_info"""
        return self.bl_info.get('support')

    @property
    def warning(self) -> str:
        """Get warning from bl_info"""
        return self.bl_info.get('warning')

    @property
    def wiki_url(self) -> str:
        """Get wiki_url from bl_info"""
        return self.bl_info.get('wiki_url')

    @property
    def tracker_url(self) -> str:
        """Get tracker_url from bl_info"""
        return self.bl_info.get('tracker_url')
    # }}}
    # @classmethod
    # def from_dict(cls, package_dict: dict):
    #     """
    #     Return a Package with values from dict
    #     """
    #     pkg = cls()
    #     pkg.set_from_dict(package_dict)
    #     return pkg

    @classmethod
    def from_blinfo(cls, blinfo: dict):
        """
        Return a Package with bl_info filled in
        """
        return cls({'bl_info': blinfo})

    @classmethod
    def from_module(cls, module):
        """
        Return a Package object from an addon module
        """
        from pathlib import Path
        filepath = Path(module.__file__)
        if filepath.name == '__init__.py':
            filepath = filepath.parent

        pkg = cls()
        pkg.files = [filepath.name]
        pkg.installed_location = str(filepath)
        pkg.module_name = module.__name__

        try:
            pkg.bl_info = module.bl_info
        except AttributeError as err:
            raise exceptions.BadAddon("Module does not appear to be an addon; no bl_info attribute") from err
        return pkg

    def download(self, dest: Path, progress_callback=None) -> Path:
        """Downloads package to `dest`"""

        if not self.url:
            raise ValueError("Cannot download package without a URL")

        return utils.download(self.url, dest, progress_callback)

    def install(self, dest_dir: Path, cache_dir: Path, progress_callback=None):
        """Downloads package to `cache_dir`, then extracts/moves package to `dest_dir`"""

        log = logging.getLogger('%s.install' % __name__)

        downloaded = self.download(cache_dir, progress_callback)

        if not downloaded:
            log.debug('Download returned None, not going to install anything.')
            return

        utils.install(downloaded, dest_dir)
        # utils.rm(downloaded)

    def __eq__(self, other):
        return self.name == other.name and self.version == other.version

    def __lt__(self, other):
        return self.version < other.version

    def __hash__(self):
        return hash((self.name, self.version))

    def __repr__(self) -> str:
        return "Package('name': {}, 'version': {})".format(self.name, self.version)
class ConsolidatedPackage:
    """
    Stores a grouping of different versions of the same package
    """

    log = logging.getLogger(__name__ + ".ConsolidatedPackage")

    def __init__(self, pkg=None):
        self.versions = []
        self.updateable = False

        if pkg is not None:
            self.add_version(pkg)

    @property
    def installed(self) -> bool:
        """Return true if any version of this package is installed"""
        for pkg in self.versions:
            if pkg.installed:
                return True
        return False

    @property
    def name(self) -> str:
        """
        Return name of this package. All package versions in a
        ConsolidatedPackage have the same name by definition.

        Returns None if there are no versions.
        """
        try:
            return self.versions[0].name
        except IndexError:
            return None

    def get_latest_installed_version(self) -> Package:
        """
        Return the installed package with the highest version number.
        If no packages are installed, return None.
        """
        # self.versions is always sorted newest -> oldest, so we can just grab the first we find
        for pkg in self.versions:
            if pkg.installed:
                return pkg
        return None

    def get_latest_version(self) -> Package:
        """Return the package with the highest version number, or None if there are no versions"""
        try:
            return self.versions[0]  # this is always sorted with the highest on top
        except IndexError:
            return None

    def get_display_version(self) -> Package:
        """
        Return the installed package with the highest version number.
        If no version is installed, return the highest uninstalled version.
        """
        pkg = self.get_latest_installed_version()
        if pkg is None:
            pkg = self.get_latest_version()
        return pkg

    def add_version(self, newpkg: Package):
        """Adds a package to the collection of versions"""

        if self.name and newpkg.name != self.name:
            raise exceptions.PackageException("Name mismatch, refusing to add %s to %s" % (newpkg, self))

        for pkg in self:
            if pkg == newpkg:
                # `set.union` returns a new set, so update in place instead
                pkg.repositories |= newpkg.repositories
                if newpkg.installed:
                    pkg.set_installed_metadata(newpkg)
                return

        self.versions.append(newpkg)
        self.versions.sort(key=lambda v: v.version, reverse=True)

    def __iter__(self):
        return (pkg for pkg in self.versions)

    def __repr__(self):
        return "ConsolidatedPackage<name={}>".format(self.name)
class Repository:
    """
    Stores repository metadata (including packages)
    """

    log = logging.getLogger(__name__ + ".Repository")

    def __init__(self, url=None):
        if url is None:
            url = ""
        self.set_from_dict({'url': url})

    # def cleanse_packagelist(self):
    #     """Remove empty packages (no bl_info), packages with no name"""

    def refresh(self, storage_path: Path, progress_callback=None):
        """
        Requests repo.json from the URL, embedding etag/last-modified headers
        """
        import requests

        if progress_callback is None:
            progress_callback = lambda x: None

        progress_callback(0.0)

        if not self.url:
            raise ValueError("Cannot refresh repository without a URL")

        url = utils.add_repojson_to_url(self.url)

        self.log.debug("Refreshing repository from %s", self.url)

        req_headers = {}
        # Do things this way to avoid adding empty objects/None to the req_headers dict
        try:
            req_headers['If-None-Match'] = self._headers['etag']
        except KeyError:
            pass
        try:
            req_headers['If-Modified-Since'] = self._headers['last-modified']
        except KeyError:
            pass

        try:
            resp = requests.get(url, headers=req_headers, timeout=60)
        except requests.exceptions.InvalidSchema as err:
            raise exceptions.DownloadException("Invalid schema. Did you mean to use http://?") from err
        except requests.exceptions.ConnectionError as err:
            raise exceptions.DownloadException("Failed to connect. Are you sure '%s' is the correct URL?" % url) from err
        except requests.exceptions.RequestException as err:
            raise exceptions.DownloadException(err) from err

        try:
            resp.raise_for_status()
        except requests.HTTPError as err:
            self.log.error('Error downloading %s: %s', url, err)
            raise exceptions.DownloadException(resp.status_code, resp.reason) from err

        if resp.status_code == requests.codes.not_modified:
            self.log.debug("Packagelist not modified")
            return

        resp_headers = {}
        try:
            resp_headers['etag'] = resp.headers['etag']
        except KeyError:
            pass
        try:
            resp_headers['last-modified'] = resp.headers['last-modified']
        except KeyError:
            pass

        self.log.debug("Found headers: %s", resp_headers)

        progress_callback(0.7)

        try:
            repodict = resp.json()
        except json.decoder.JSONDecodeError:
            self.log.exception("Failed to parse downloaded repository")
            raise exceptions.BadRepositoryException(
                "Could not parse repository downloaded from '%s'. Are you sure this is the correct URL?" % url
            )
        repodict['_headers'] = resp_headers
        repodict['url'] = self.url

        self.set_from_dict(repodict)
        self.to_file(storage_path / utils.format_filename(self.name, ".json"))

        progress_callback(1.0)
    def to_dict(self, sort=False, ids=False) -> dict:
        """
        Return a dict representation of the repository
        """
        packages = [p.to_dict() for p in self.packages]

        if sort:
            packages.sort(key=lambda p: p['bl_info']['name'].lower())

        if ids:
            for pkg in packages:
                # hash may be too big for a C int
                pkg['id'] = str(hash(pkg['url'] + pkg['bl_info']['name'] + self.name + self.url))

        return {
            'name': self.name,
            'packages': packages,
            'url': self.url,
            '_headers': self._headers,
        }

    def set_from_dict(self, repodict: dict):
        """
        Get repository attributes from a dict such as produced by `to_dict`
        """
        # Be certain to initialize everything; a downloaded packagelist might contain null values
        name = repodict.get('name', "")
        url = repodict.get('url', "")
        packages = repodict.get('packages', [])
        headers = repodict.get('_headers', {})

        self.name = name
        self.url = url
        self.packages = [Package(pkg) for pkg in packages]
        self._headers = headers

    @classmethod
    def from_dict(cls, repodict: dict):
        """
        Like `set_from_dict`, but returns a new instance
        """
        repo = cls()
        repo.set_from_dict(repodict)
        return repo

    def to_file(self, path: Path):
        """
        Dump repository to a json file at `path`.
        """
        if self.packages is None:
            self.log.warning("Writing an empty repository")

        self.log.debug("URL is %s", self.url)

        with path.open('w', encoding='utf-8') as repo_file:
            json.dump(self.to_dict(), repo_file, indent=4, sort_keys=True)
            self.log.debug("Repository written to %s", path)

    @classmethod
    def from_file(cls, path: Path):
        """
        Read repository from a json file at `path`.
        """
        with path.open('r', encoding='utf-8') as repo_file:
            try:
                repo = cls.from_dict(json.load(repo_file))
            except json.JSONDecodeError as err:
                raise exceptions.BadRepositoryException(err) from err
        if repo.url is None or len(repo.url) == 0:
            raise exceptions.BadRepositoryException("Repository missing URL")

        cls.log.debug("Repository read from %s", path)
        return repo

    def __repr__(self):
        return "Repository({}, {})".format(self.name, self.url)
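`ConsolidatedPackage` keeps `versions` sorted newest-first so the `get_latest_*` lookups can simply return the first match. A minimal standalone sketch of that ordering, with version tuples standing in for `Package` objects:

```python
# Version tuples compare lexicographically, so a descending sort puts the
# newest release first (this mirrors ConsolidatedPackage.add_version).
versions = [(1, 0, 2), (0, 9, 5), (1, 1, 0)]
versions.sort(reverse=True)

latest = versions[0]
print(latest)  # → (1, 1, 0)
```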
@@ -1,218 +0,0 @@
from pathlib import Path
from . import exceptions
import shutil
import logging


def format_filename(s: str, ext=None) -> str:
    """Take a string and turn it into a reasonable filename"""
    import string
    if ext is None:
        ext = ""
    valid_chars = "-_.() %s%s" % (string.ascii_letters, string.digits)
    filename = ''.join(char for char in s if char in valid_chars)
    filename = filename.replace(' ', '_')
    filename = filename.lower()  # str.lower() returns a new string, so assign the result
    filename += ext
    return filename
def download(url: str, destination: Path, progress_callback=None) -> Path:
    """
    Downloads the file at the given url, and if progress_callback is specified,
    repeatedly calls progress_callback with an argument between 0 and 1, or infinity.
    Raises DownloadException if an error occurs with the download.

    :returns: path to the downloaded file, or None if not modified
    """

    import requests
    log = logging.getLogger('%s.download' % __name__)

    if progress_callback is None:
        # assign to do-nothing function
        progress_callback = lambda x: None

    progress_callback(0)

    # Derive filename from url if `destination` is an existing directory,
    # otherwise use `destination` directly.
    if destination.is_dir():
        # TODO: get filename from Content-Disposition header, if available.
        from urllib.parse import urlsplit
        parsed_url = urlsplit(url)
        local_filename = Path(parsed_url.path).name or 'download.tmp'
        local_fpath = destination / local_filename
    else:
        local_fpath = destination

    log.info('Downloading %s -> %s', url, local_fpath)

    try:
        resp = requests.get(url, stream=True, verify=True)
    except requests.exceptions.RequestException as err:
        raise exceptions.DownloadException(err) from err

    try:
        resp.raise_for_status()
    except requests.HTTPError as err:
        raise exceptions.DownloadException(resp.status_code, str(err)) from err

    if resp.status_code == requests.codes.not_modified:
        log.info("Server responded 'Not Modified', not downloading")
        return None

    try:
        # Use float so that we can also use infinity
        content_length = float(resp.headers['content-length'])
    except KeyError:
        log.warning('Server did not send content length, cannot report progress.')
        content_length = float('inf')

    # TODO: check if there's enough disk space.

    downloaded_length = 0
    with local_fpath.open('wb') as outfile:
        for chunk in resp.iter_content(chunk_size=1024 ** 2):
            if not chunk:  # filter out keep-alive new chunks
                continue

            outfile.write(chunk)
            downloaded_length += len(chunk)
            progress_callback(downloaded_length / content_length)

    return local_fpath
def rm(path: Path):
    """Delete whatever is specified by `path`"""
    if path.is_dir():
        shutil.rmtree(str(path))
    else:
        path.unlink()

class InplaceBackup:
    """Utility class for moving a file out of the way by appending a '~'"""

    log = logging.getLogger('%s.inplace-backup' % __name__)

    def __init__(self, path: Path):
        self.path = path
        self.backup()

    def backup(self):
        """Move 'path' to 'path~'"""
        if not self.path.exists():
            raise FileNotFoundError("Can't backup path which doesn't exist")

        self.backup_path = Path(str(self.path) + '~')
        if self.backup_path.exists():
            self.log.warning("Overwriting existing backup '{}'".format(self.backup_path))
            rm(self.backup_path)

        shutil.move(str(self.path), str(self.backup_path))

    def restore(self):
        """Move 'path~' to 'path'"""
        try:
            getattr(self, 'backup_path')
        except AttributeError as err:
            raise RuntimeError("Can't restore file before backing it up") from err

        if not self.backup_path.exists():
            raise FileNotFoundError("Can't restore backup which doesn't exist")

        if self.path.exists():
            self.log.warning("Overwriting '{0}' with backup file".format(self.path))
            rm(self.path)

        shutil.move(str(self.backup_path), str(self.path))

    def remove(self):
        """Remove 'path~'"""
        rm(self.backup_path)
def install(src_file: Path, dest_dir: Path):
    """Extracts/moves package at `src_file` to `dest_dir`"""

    import zipfile

    log = logging.getLogger('%s.install' % __name__)
    log.debug("Starting installation")

    if not src_file.is_file():
        raise exceptions.InstallException("Package isn't a file")

    if not dest_dir.is_dir():
        raise exceptions.InstallException("Destination is not a directory")

    # TODO: check to make sure addon/package isn't already installed elsewhere

    # The following is adapted from `addon_install` in bl_operators/wm.py

    # check to see if the file is in compressed format (.zip)
    if zipfile.is_zipfile(str(src_file)):
        log.debug("Package is zipfile")
        try:
            file_to_extract = zipfile.ZipFile(str(src_file), 'r')
        except Exception as err:
            raise exceptions.InstallException("Failed to read zip file: %s" % err) from err

        def root_files(filelist: list) -> list:
            """Some string parsing to get a list of the root contents of a zip from its namelist"""
            rootlist = []
            for f in filelist:
                # Get all names which have no path separators (root level files)
                # or have a single path separator at the end (root level directories).
                if len(f.rstrip('/').split('/')) == 1:
                    rootlist.append(f)
            return rootlist

        conflicts = [dest_dir / f for f in root_files(file_to_extract.namelist()) if (dest_dir / f).exists()]
        backups = []
        for conflict in conflicts:
            log.debug("Creating backup of conflict %s", conflict)
            backups.append(InplaceBackup(conflict))

        try:
            file_to_extract.extractall(str(dest_dir))
        except Exception as err:
            for backup in backups:
                backup.restore()
            raise exceptions.InstallException("Failed to extract zip file to '%s': %s" % (dest_dir, err)) from err

        for backup in backups:
            backup.remove()

    else:
        log.debug("Package is pyfile")
        dest_file = (dest_dir / src_file.name)

        backup = None
        if dest_file.exists():
            backup = InplaceBackup(dest_file)

        try:
            shutil.copyfile(str(src_file), str(dest_file))
        except Exception as err:
            # Only restore if there was something to back up
            if backup is not None:
                backup.restore()
            raise exceptions.InstallException("Failed to copy file to '%s': %s" % (dest_dir, err)) from err

    log.debug("Installation succeeded")


def add_repojson_to_url(url: str) -> str:
    """Add `repo.json` to the path component of a url"""
    from urllib.parse import urlsplit, urlunsplit
    parsed_url = urlsplit(url)
    new_path = parsed_url.path + "/repo.json"
    return urlunsplit((parsed_url.scheme, parsed_url.netloc, new_path, parsed_url.query, parsed_url.fragment))
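The two pure helpers above are easy to exercise in isolation. A standalone sketch of their logic for illustration (here `ext` defaults to an empty string, and the lowercasing step is assumed to be intended):

```python
import string
from urllib.parse import urlsplit, urlunsplit

def format_filename(s: str, ext: str = "") -> str:
    # Drop unsafe characters, replace spaces, lowercase, append extension.
    valid_chars = "-_.() %s%s" % (string.ascii_letters, string.digits)
    filename = ''.join(char for char in s if char in valid_chars)
    filename = filename.replace(' ', '_').lower()
    return filename + ext

def add_repojson_to_url(url: str) -> str:
    # Append /repo.json to the path component, preserving query and fragment.
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path + "/repo.json",
                       parts.query, parts.fragment))

print(format_filename("My Repo: Official!", ".json"))  # → my_repo_official.json
print(add_repojson_to_url("http://example.com/pkgs"))  # → http://example.com/pkgs/repo.json
```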
@@ -1,49 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####


import os
import logging
import pathlib

from . import appdirs

log = logging.getLogger(__name__)


def cache_directory(*subdirs) -> pathlib.Path:
    """Returns an OS-specific cache location, and ensures it exists.

    Should be replaced with a call to bpy.utils.user_resource('CACHE', ...)
    once https://developer.blender.org/T47684 is finished.

    :param subdirs: extra subdirectories inside the cache directory.

    >>> cache_directory()
    '.../blender_package_manager'
    >>> cache_directory('sub1', 'sub2')
    '.../blender_package_manager/sub1/sub2'
    """

    # TODO: use bpy.utils.user_resource('CACHE', ...)
    # once https://developer.blender.org/T47684 is finished.
    user_cache_dir = appdirs.user_cache_dir(appname='Blender', appauthor=False)
    cache_dir = pathlib.Path(user_cache_dir) / 'blender_package_manager' / pathlib.Path(*subdirs)
    cache_dir.mkdir(mode=0o700, parents=True, exist_ok=True)

    return cache_dir
@@ -1,73 +0,0 @@
-from .bpkg.types import Repository
-
-
-class Message:
-    """Superclass for all messages sent over pipes."""
-
-
-# Blender messages
-
-class BlenderMessage(Message):
-    """Superclass for all messages sent from Blender to the subprocess."""
-
-
-class Abort(BlenderMessage):
-    """Sent when the user requests abortion of a task."""
-
-
-# Subproc messages
-
-class SubprocMessage(Message):
-    """Superclass for all messages sent from the subprocess to Blender."""
-
-
-class Progress(SubprocMessage):
-    """Sent from subprocess to Blender to report progress.
-
-    :ivar progress: the progress percentage, from 0-1.
-    """
-
-    def __init__(self, progress: float):
-        self.progress = progress
-
-
-class Success(SubprocMessage):
-    """Sent when an operation finished successfully."""
-
-
-class RepositoryResult(SubprocMessage):
-    """Sent when an operation returns a repository to be used on the parent process."""
-
-    def __init__(self, repository_name: str):
-        self.repository_name = repository_name
-
-
-class Aborted(SubprocMessage):
-    """Sent as response to an Abort message."""
-
-
-# subproc warnings
-
-class SubprocWarning(SubprocMessage):
-    """Superclass for all non-fatal warning messages sent from the subprocess."""
-
-    def __init__(self, message: str):
-        self.message = message
-
-
-# subproc errors
-
-class SubprocError(SubprocMessage):
-    """Superclass for all fatal error messages sent from the subprocess."""
-
-    def __init__(self, message: str):
-        self.message = message
-
-
-class InstallError(SubprocError):
-    """Sent when there was an error installing something."""
-
-
-class UninstallError(SubprocError):
-    """Sent when there was an error uninstalling something."""
-
-
-class BadRepositoryError(SubprocError):
-    """Sent when a repository can't be used for some reason."""
-
-
-class DownloadError(SubprocError):
-    """Sent when there was an error downloading something."""
-
-    def __init__(self, message: str, status_code: int = None):
-        self.status_code = status_code
-        self.message = message
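Message objects like these are typically pickled across a `multiprocessing.Pipe`: the subprocess calls `send()`, Blender's side polls `recv()`. A minimal standalone sketch (the `Message`/`Progress` classes are redefined here so the snippet runs on its own):

```python
import multiprocessing


class Message:
    """Superclass for all messages sent over pipes."""


class Progress(Message):
    """Progress report; `progress` ranges from 0.0 to 1.0."""
    def __init__(self, progress: float):
        self.progress = progress


# One end would live in the worker subprocess, the other in Blender;
# Connection.send() pickles the message object, recv() unpickles it.
blender_end, subproc_end = multiprocessing.Pipe()
subproc_end.send(Progress(0.5))
received = blender_end.recv()
print(type(received).__name__, received.progress)  # Progress 0.5
```

Because whole objects travel over the pipe, the receiving side can dispatch on `isinstance(received, Progress)` rather than parsing strings.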
@@ -1,80 +0,0 @@
-"""
-All the stuff that needs to run in a subprocess.
-"""
-
-from pathlib import Path
-from . import bpkg
-from . import messages
-from .bpkg import exceptions as bpkg_exs
-from .bpkg.types import (Package, Repository)
-import logging
-
-
-def download_and_install_package(pipe_to_blender, package: Package, install_path: Path):
-    """Downloads and installs the given package."""
-
-    log = logging.getLogger(__name__ + '.download_and_install')
-
-    from . import cache
-    cache_dir = cache.cache_directory('downloads')
-
-    try:
-        package.install(install_path, cache_dir)
-    except bpkg_exs.DownloadException as err:
-        pipe_to_blender.send(messages.DownloadError(err))
-        log.exception(err)
-        return
-    except bpkg_exs.InstallException as err:
-        pipe_to_blender.send(messages.InstallError(err))
-        log.exception(err)
-        return
-
-    pipe_to_blender.send(messages.Success())
-
-
-def uninstall_package(pipe_to_blender, package: Package, install_path: Path):
-    """Deletes the given package's files from the install directory."""
-    # TODO: move package to cache and present an "undo" button to the user,
-    # to give nicer UX on misclicks
-
-    for pkgfile in [install_path / Path(p) for p in package.files]:
-        if not pkgfile.exists():
-            pipe_to_blender.send(messages.UninstallError(
-                "Could not find file owned by package: '%s'. Refusing to uninstall." % pkgfile))
-            return
-
-    for pkgfile in [install_path / Path(p) for p in package.files]:
-        bpkg.utils.rm(pkgfile)
-
-    pipe_to_blender.send(messages.Success())
-
-
-def refresh_repositories(pipe_to_blender, repo_storage_path: Path, repository_urls: list, progress_callback=None):
-    """Downloads and stores the given repositories."""
-
-    log = logging.getLogger(__name__ + '.refresh_repositories')
-
-    if progress_callback is None:
-        progress_callback = lambda x: None
-    progress_callback(0.0)
-
-    repos = bpkg.load_repositories(repo_storage_path)
-
-    def prog(progress: float):
-        progress_callback(progress / len(repos))
-
-    known_repo_urls = [repo.url for repo in repos]
-    for repo_url in repository_urls:
-        if repo_url not in known_repo_urls:
-            repos.append(Repository(repo_url))
-
-    for repo in repos:
-        log.debug("repo name: %s, url: %s", repo.name, repo.url)
-    for repo in repos:
-        try:
-            repo.refresh(repo_storage_path, progress_callback=prog)
-        except bpkg_exs.DownloadException as err:
-            pipe_to_blender.send(messages.DownloadError(err))
-            log.exception("Download error")
-        except bpkg_exs.BadRepositoryException as err:
-            pipe_to_blender.send(messages.BadRepositoryError(err))
-            log.exception("Bad repository")
-
-    progress_callback(1.0)
-    pipe_to_blender.send(messages.Success())
@@ -1,29 +0,0 @@
-import bpy
-from . import bpkg
-from pathlib import Path
-import logging
-
-from collections import OrderedDict
-
-
-def fmt_version(version_number: tuple) -> str:
-    """Take a version number as a tuple and format it as a string."""
-    vstr = str(version_number[0])
-    for component in version_number[1:]:
-        vstr += "." + str(component)
-    return vstr
-
-
-def sanitize_repository_url(url: str) -> str:
-    """Sanitize a repository url."""
-    from urllib.parse import urlsplit, urlunsplit
-    parsed_url = urlsplit(url)
-    # new_path = parsed_url.path.rstrip("repo.json")
-    new_path = parsed_url.path
-    return urlunsplit((parsed_url.scheme, parsed_url.netloc, new_path, parsed_url.query, parsed_url.fragment))
-
-
-def add_repojson_to_url(url: str) -> str:
-    """Add `repo.json` to the path component of a url"""
-    from urllib.parse import urlsplit, urlunsplit
-    parsed_url = urlsplit(url)
-    new_path = str(Path(parsed_url.path) / "repo.json")
-    return urlunsplit((parsed_url.scheme, parsed_url.netloc, new_path, parsed_url.query, parsed_url.fragment))
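For illustration, the `add_repojson_to_url` helper above amounts to a `urlsplit`/`urlunsplit` round-trip that rewrites only the path component. A self-contained sketch of the same transformation (the example URL is made up):

```python
from pathlib import Path
from urllib.parse import urlsplit, urlunsplit


def add_repojson_to_url(url: str) -> str:
    """Append `repo.json` to the path component of a URL."""
    parts = urlsplit(url)
    # PurePath joining supplies the slash between the path and the filename.
    new_path = str(Path(parts.path) / "repo.json")
    return urlunsplit((parts.scheme, parts.netloc, new_path, parts.query, parts.fragment))


result = add_repojson_to_url("https://example.org/addons")
print(result)  # https://example.org/addons/repo.json
```

Note that `Path` uses the platform's separator, so `posixpath.join` would be a safer choice for URL paths on Windows; the sketch above keeps the `Path`-based approach of the original helper.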
setup.py
@@ -1,114 +0,0 @@
-#!/usr/bin/env python3
-
-# ##### BEGIN GPL LICENSE BLOCK #####
-#
-# This program is free software; you can redistribute it and/or
-# modify it under the terms of the GNU General Public License
-# as published by the Free Software Foundation; either version 2
-# of the License, or (at your option) any later version.
-#
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program; if not, write to the Free Software Foundation,
-# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
-#
-# ##### END GPL LICENSE BLOCK #####
-
-# setup.py adapted from blender_cloud addon
-
-import glob
-import sys
-import zipfile
-
-from distutils import log
-from distutils.core import Command
-from distutils.command.bdist import bdist
-from distutils.command.install import install
-from distutils.command.install_egg_info import install_egg_info
-from setuptools import setup, find_packages
-
-# sys.dont_write_bytecode = True
-
-
-# noinspection PyAttributeOutsideInit
-class BlenderAddonBdist(bdist):
-    """Ensures that 'python setup.py bdist' creates a zip file."""
-
-    def initialize_options(self):
-        super().initialize_options()
-        self.formats = ['zip']
-        self.plat_name = 'addon'  # use this instead of 'linux-x86_64' or similar.
-
-    def run(self):
-        self.run_command('wheels')
-        super().run()
-
-
-# noinspection PyAttributeOutsideInit
-class BlenderAddonFdist(BlenderAddonBdist):
-    """Ensures that 'python setup.py fdist' creates a plain folder structure."""
-
-    user_options = [
-        ('dest-path=', None, 'addon installation path'),
-    ]
-
-    def initialize_options(self):
-        super().initialize_options()
-        self.dest_path = None  # path that will contain the addon
-
-    def run(self):
-        super().run()
-
-        # dist_files is a list of tuples ('bdist', 'any', 'filepath')
-        filepath = self.distribution.dist_files[0][2]
-
-        # if dest_path is not specified use the filename as the dest_path (minus the .zip)
-        assert filepath.endswith('.zip')
-        target_folder = self.dest_path or filepath[:-4]
-
-        print('Unzipping the package on {}.'.format(target_folder))
-
-        with zipfile.ZipFile(filepath, 'r') as zip_ref:
-            zip_ref.extractall(target_folder)
-
-
-# noinspection PyAttributeOutsideInit
-class BlenderAddonInstall(install):
-    """Ensures the module is placed at the root of the zip file."""
-
-    def initialize_options(self):
-        super().initialize_options()
-        self.prefix = ''
-        self.install_lib = ''
-
-
-class AvoidEggInfo(install_egg_info):
-    """Makes sure the egg-info directory is NOT created.
-
-    If we skip this, the user's addon directory will be polluted by egg-info
-    directories, which Blender doesn't use anyway.
-    """
-
-    def run(self):
-        pass
-
-
-setup(
-    cmdclass={'bdist': BlenderAddonBdist,
-              'fdist': BlenderAddonFdist,
-              'install': BlenderAddonInstall,
-              'install_egg_info': AvoidEggInfo},
-    name='bpkg',
-    description='Integrated package manager for Blender',
-    version='0.0.1',
-    author='Ellwood Zwovic',
-    author_email='gandalf3@blendermonkey.com',
-    packages=['bpkg'],
-    scripts=['generate_repository'],
-    url='https://developer.blender.org/diffusion/BPMA/',
-    platforms='',
-    zip_safe=False,
-)
@@ -5,7 +5,12 @@ from pathlib import Path
 import logging
 import ast
 import json
-import bpkg-repogen
+import types
+import importlib.machinery
+
+loader = importlib.machinery.SourceFileLoader('generate_repository', 'generate_repository')
+generate_repository = types.ModuleType(loader.name)
+loader.exec_module(generate_repository)
 
 logging.basicConfig(level=logging.ERROR,
                     format='%(levelname)8s: %(message)s')
@@ -18,19 +23,11 @@ class TestRepoGeneration(unittest.TestCase):
     def test_extract_blinfo_from_nonexistent(self):
         test_file = 'file_that_doesnt_exist'
         with self.assertRaises(FileNotFoundError):
-            bpkg-repogen.extract_blinfo(self.addon_path / test_file)
+            generate_repository.extract_blinfo(self.addon_path / test_file)
 
-    def test_package_quantity(self):
-        repo = bpkg-repogen.bpkg-repogen(self.addon_path, "name of the repo")
-        acceptible_addons = [
-            f for f in self.addon_path.iterdir()
-            if not f.match('*nonaddon*')
-        ]
-        self.assertEqual(len(repo.packages), len(acceptible_addons))
-
-    def test_bpkg-repogen_from_nonexistent(self):
+    def test_generate_repository_from_nonexistent(self):
         with self.assertRaises(FileNotFoundError):
-            bpkg-repogen.bpkg-repogen(Path('in_a_galaxy_far_far_away'), "somename")
+            generate_repository.make_repo(Path('in_a_galaxy_far_far_away'), "somename", "someurl")
 
 # addons which should contain bl_infos
 yes_blinfo = [
@@ -45,7 +42,7 @@ no_blinfo = [
 
 def generate_good_blinfo_test(test_file: Path):
     def test(self):
-        reality = bpkg-repogen.extract_blinfo(test_file)
+        reality = generate_repository.extract_blinfo(test_file)
         with (self.helper_path / 'expected_blinfo').open("r") as f:
             expectation = ast.literal_eval(f.read())
         self.assertEqual(expectation, reality)
@@ -53,8 +50,8 @@ def generate_good_blinfo_test(test_file: Path):
 
 def generate_bad_blinfo_test(test_file: Path):
     def test(self):
-        with self.assertRaises(bpkg-repogen.BadAddon):
-            bpkg-repogen.extract_blinfo(test_file)
+        with self.assertRaises(generate_repository.BadAddon):
+            generate_repository.extract_blinfo(test_file)
     return test
 
 # Add test method retur
@@ -1,72 +0,0 @@
-import requests
-import unittest
-from unittest import mock
-# from blenderpack import Repositories, fetch_repo
-from datetime import datetime
-import json
-
-# based on https://stackoverflow.com/a/28507806/2730823
-
-# This method will be used by the mock to replace requests.get
-def mocked_requests_get(*args, **kwargs):
-    cidict = requests.structures.CaseInsensitiveDict
-    req_headers = cidict(kwargs.get('headers'))
-    t_fmt = '%a, %d %b %Y %X %Z'
-
-    class MockResponse:
-        def __init__(self, headers: cidict, status_code: int):
-            self.headers = headers
-            self.status_code = status_code
-
-        def json(self):
-            return json.dumps({'url': 'http://someurl.tld/repo.json'})
-
-    if args[0] == 'http://someurl.tld/repo.json':
-        resp_headers = cidict({
-            "ETag": '"2a0094b-b74-55326ced274f3"',
-            "Last-Modified": 'Sun, 13 Mar 2011 13:38:53 GMT',
-        })
-
-        if req_headers == {}:
-            resp_code = 200
-        else:
-            req_headers = cidict(req_headers)
-            resp_code = 304 if req_headers.get('if-none-match', '') == resp_headers['etag'] \
-                or datetime.strptime(req_headers.get('if-modified-since', ''), t_fmt) < \
-                datetime.strptime(resp_headers['last-modified'], t_fmt) \
-                else 200
-        return MockResponse(resp_headers, resp_code)
-
-    return MockResponse(None, 404)
-
-
-class MockRepositories:
-    storage = {}
-
-    def load(self, *args, **kwargs):
-        if args[0] not in self.storage:
-            self.storage[args[0]] = {'url': args[0]}
-
-        return self.storage[args[0]]
-
-    def write(self, *args, **kwargs):
-        self.storage[args[0]['url']] = args[0]
-
-
-class fetch_url_twice(unittest.TestCase):
-
-    @mock.patch('requests.get', side_effect=mocked_requests_get)
-    def test_fetch(self, mock_get):
-        self.fail('unfinished test')
-        repos = MockRepositories()
-        fetch_repo('http://someurl.tld/repo.json', repos)
-        mock_get.assert_called_with('http://someurl.tld/repo.json', headers={})
-
-        fetch_repo('http://someurl.tld/repo.json', repos)
-        mock_get.assert_called_with('http://someurl.tld/repo.json', headers={
-            'If-None-Match': '"2a0094b-b74-55326ced274f3"',
-            'If-Modified-Since': 'Sun, 13 Mar 2011 13:38:53 GMT'
-        })
-
-
-if __name__ == '__main__':
-    unittest.main()
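The mock above exercises HTTP conditional requests: a client replays the `ETag` and `Last-Modified` validators from a cached response as `If-None-Match`/`If-Modified-Since` on the next fetch, and a 304 response means the cached copy is still fresh. A standalone sketch of the client-side header construction (no network; the header values are the same fixtures the mock uses):

```python
def conditional_headers(cached_headers: dict) -> dict:
    """Build conditional-GET request headers from a cached response's validators."""
    headers = {}
    if 'ETag' in cached_headers:
        headers['If-None-Match'] = cached_headers['ETag']
    if 'Last-Modified' in cached_headers:
        headers['If-Modified-Since'] = cached_headers['Last-Modified']
    return headers


# Headers remembered from the first (200) response:
first_response = {
    'ETag': '"2a0094b-b74-55326ced274f3"',
    'Last-Modified': 'Sun, 13 Mar 2011 13:38:53 GMT',
}
# Headers to send when revalidating the cache:
revalidation = conditional_headers(first_response)
print(revalidation['If-None-Match'])  # "2a0094b-b74-55326ced274f3"
```

This is exactly the pair of `headers=` dicts the test's two `assert_called_with` calls expect: empty on the first fetch, validators on the second.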
@@ -1,65 +0,0 @@
-#!/usr/bin/env python3
-
-import unittest
-from pathlib import Path
-import logging
-import json
-import bpackage as BP
-
-logging.basicConfig(level=logging.DEBUG,
-                    format='%(levelname)8s: %(message)s')
-
-
-class TestRepoInstantiation(unittest.TestCase):
-    """
-    Tests of the creation of a Repository object
-    """
-
-    # helper_path = Path('tests/test_helpers')
-    # repos = blenderpack.Repositories(helper_path / 'repo.json')
-
-    # def test_load(self):
-    #     repo = self.repos.load('http://someurl.tld/repo.json')
-
-    repo_dict = {
-        'name': 'The Best Repo Ever',
-        'url': 'http://someurl.tld/repo.json',
-        'packages': [
-            {'name': 'pkg1'},
-            {'name': 'pkg2'},
-        ],
-    }
-
-    def test_create_from_dict(self):
-        """
-        Instantiate a repository with a dict and check
-        that all the items are carried over
-        """
-        repodict = self.repo_dict
-        repo = BP.Repository(repodict)
-        for key, val in repodict.items():
-            self.assertEqual(getattr(repo, key), val)
-
-    def test_create_from_none(self):
-        """
-        Instantiate a repository from None and check that
-        the new repository's properties are set to None
-        """
-        repodict = self.repo_dict
-        repo = BP.Repository(None)
-        for key, val in repodict.items():
-            self.assertEqual(getattr(repo, key), None)
-
-    def test_create_from_incomplete(self):
-        """
-        Instantiate a repository from a partial dict
-        and check that all properties are set, either to None or to the
-        value from the dict
-        """
-        repodict = {
-            'name': 'The Best Repo Ever',
-        }
-        repo = BP.Repository(repodict)
-        for key, val in repodict.items():
-            self.assertEqual(getattr(repo, key), val)
-        self.assertIs(repo.url, None)