391 Commits

Author SHA1 Message Date
4570b4637b Move attachment parsing on the node level 2017-02-27 16:23:21 +01:00
e381ca774e On Page load use replaceState instead of pushState
Fixes T50797 by replacing the id-based URL with a custom URL for the page in the browser's history.
2017-02-27 13:08:56 +01:00
6765276519 Introducing attachment fixes for blog posts and assets.
Requires migration of the attachments schema using
python manage.py maintenance upgrade_attachment_schema --all
2017-02-21 18:08:42 +01:00
eca4ade9d8 Linking to Blender Cloud add-on (and no longer to bundle)
Added a note that states the add-on requires Blender 2.78+. Even though
this isn't strictly true (it also supports 2.77a if you manually install
the Blender ID add-on), it simplifies things greatly.

Fixes T49721
2017-02-21 11:14:46 +01:00
2e00e81b30 Raise z-index of col_right by 1 2017-02-14 16:03:37 +01:00
0a86ad357f Analytics for videojs 2017-02-08 16:27:52 +01:00
02f736dcc4 Hide missing summaries on projects homepage 2017-02-08 15:27:20 +01:00
d8eae2c44b Fix OG crash on projects without picture_header 2017-02-08 15:26:56 +01:00
c98cd82b3f OpenGraph: Check if we have a description/post content 2017-02-08 14:48:55 +01:00
69b3e06b1c Use project picture as fallback if og_picture/node is undefined 2017-02-07 18:03:35 +01:00
7b9fef2fc8 Update caches 2017-02-06 14:44:05 +01:00
528887b1a6 Unify Twitter cards and Open Graph data 2017-02-06 14:37:53 +01:00
10df0af355 Fix search list not scrolling 2017-02-06 14:35:51 +01:00
ae38bec218 Fix project header videos 2017-02-06 12:07:05 +01:00
3ef0bf6761 Typo 2017-02-02 18:08:21 +01:00
1e56ca5227 Only load videojs when there are sources, and minor style tweaks 2017-02-02 18:05:30 +01:00
b8ad0cd18f Update cache version 2017-02-02 17:40:32 +01:00
e049ab0a08 Fire videojs via js 2017-02-02 17:40:04 +01:00
089b0f1535 Own copy of videojs 5.8.8 2017-02-02 16:57:31 +01:00
bf0ebce81a Videojs for project video headers 2017-02-02 16:57:18 +01:00
eb02fa5eec Replace Flowplayer with the open source Video.js library 2017-02-02 16:06:41 +01:00
bc6f526b72 Don't use ?format=amp after url_for()
url_for() is smart enough to add variables to the query string if there is
no route parameter for them.
2017-01-24 16:35:02 +01:00
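For context, Flask's url_for() turns keyword arguments that don't match a route parameter into query-string values, which is why appending '?format=amp' by hand is unnecessary. A minimal sketch (the 'view_node' route here is illustrative, not the actual Pillar route):

    from flask import Flask, url_for

    app = Flask(__name__)

    @app.route('/nodes/<node_id>/view')
    def view_node(node_id):
        return 'node %s' % node_id

    with app.test_request_context():
        # 'format' is not a route parameter, so it ends up in the query string.
        print(url_for('view_node', node_id='abc123', format='amp'))
        # -> /nodes/abc123/view?format=amp
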
0e07cb2b1d Link to AMP view if we're in a node 2017-01-24 16:01:05 +01:00
2b528f0fff Added pillar.api.utils.bsonify(some_dict)
It was used in an experiment in Flamenco as an alternative to JSON; it
might still be used in the future if BSON turns out to be significantly
faster to generate.
2017-01-24 09:19:24 +01:00
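A minimal sketch of what bsonify() might look like, by analogy with Flask's jsonify(); the use of PyMongo's bson module and the mimetype are assumptions:

    import bson
    from flask import Response

    def bsonify(some_dict, status=200):
        # Serialize the dict to BSON bytes and wrap it in a Flask response.
        data = bson.BSON.encode(some_dict)
        return Response(data, status=status, mimetype='application/bson')
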
9b90070191 AMP: break too long words 2017-01-23 16:15:53 +01:00
68fcae64ae AMP: Use srcset to load different headers depending on screen size 2017-01-23 15:56:41 +01:00
e3fc5d1b9b Initial support for AMP (Accelerated Mobile Pages)
https://www.ampproject.org/

Basic implementation. Still needs the node description to be parsed,
as <img> tags need to be <amp-img> with special tags.
2017-01-23 15:47:14 +01:00
85988bb8c9 Fix for some project names breaking javascript 2017-01-20 17:35:08 +01:00
85dba5e9e9 Blog: Re-order hideOverlay to be re-used 2017-01-20 13:13:11 +01:00
350577033c Blog: Expand images when clicking on them (and the link is an image)
Duplicated in both the index and post view to get this out for today's Cycles post; wrote a note to fix this.
2017-01-20 12:38:50 +01:00
eb5fb4eb09 Fix undefined projectTree 2017-01-20 12:10:23 +01:00
181cbc07d6 Blog: Center images on posts 2017-01-20 12:05:28 +01:00
784c1ed0bb CSS: top border for active status on table rows 2017-01-19 16:57:41 +01:00
604d6c1a07 Added pillar.web.utils.last_page_index()
This returns the last page number (base-1) of a paged Eve result.
2017-01-19 15:13:01 +01:00
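A sketch of how such a helper can be derived from Eve's standard _meta pagination keys ('total' and 'max_results'); the implementation itself is an assumption:

    def last_page_index(meta):
        """Return the last page number (base-1) of a paged Eve result.

        meta is the '_meta' dict of an Eve response, with 'total' (number of
        documents) and 'max_results' (page size).
        """
        total = meta['total']
        per_page = meta['max_results']
        if total == 0:
            return 1
        # Integer ceiling division: e.g. 25 items at 10 per page -> page 3.
        return (total + per_page - 1) // per_page
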
129ec94608 Renamed flamenco.jobs to flamenco_jobs 2016-12-14 14:48:37 +01:00
01cc52bba9 Allow user updates in create_service_account() calls. 2016-12-14 14:41:06 +01:00
8115bc2ad5 Collections are now named flamenco_xxx instead of flamenco.xxx
The dot notation disallowed Eve hooks, as the collection names weren't
valid Python identifiers.
2016-12-14 14:40:38 +01:00
a100d73a8b Collections in extension eve_settings should now start with the extension name.
Instead of Pillar automagically prepending 'attract.' or 'flamenco.' to the
names, this should now be done explicitly in the extension's Eve settings.
This allows for more explicit configuration, and ensures foreign key
definitions are unambiguous.
2016-12-14 11:26:28 +01:00
11197e669c Remove /about endpoint 2016-12-02 18:02:29 +01:00
7a6e1d3386 refresh css 2016-12-02 17:54:12 +01:00
6bb491aadc Support for page URLs
Now we can access pages at the following URL:
/p/<project_url>/<page-url>. Internally we use the existing view_node,
but if we detect that the node_id is not an ObjectId we treat it as a
page URL and resolve the node and project using render_node_page().
2016-12-02 16:57:51 +01:00
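A rough sketch of the dispatch described above; ObjectId.is_valid() decides whether the URL component is a node ID or a page URL (the helper name and return shape are assumptions):

    from bson import ObjectId

    def resolve_view_node(project_url, node_id):
        """Decide whether /p/<project_url>/<node_id> refers to a node or a page."""
        if ObjectId.is_valid(node_id):
            # A regular node: handled by the existing view_node path.
            return 'node', project_url, ObjectId(node_id)
        # Not an ObjectId, so treat it as a page URL; render_node_page()
        # then looks up the node and project.
        return 'page', project_url, node_id
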
bc456f9387 Fix typo 2016-12-02 16:25:47 +01:00
1beb3ca488 Better join page for the agent project 2016-12-02 16:18:17 +01:00
0190cf944a Show free assets 2016-12-02 15:39:44 +01:00
5f590a2063 Search points to the Join page for non-subscribers 2016-12-02 14:46:22 +01:00
c284156723 Project thumbnail link to project root, not about 2016-12-02 12:43:15 +01:00
7219c5ca72 Disable Learn More on projects for now 2016-12-02 12:42:58 +01:00
86b5c1b242 Fix scrolling on sidebar for posts 2016-12-01 16:42:17 +01:00
ffdffdeb96 Bigger thumbnail for posts 2016-12-01 16:39:20 +01:00
455bfdfc49 Update CSS 2016-12-01 16:31:03 +01:00
2ad3c8a7ed Show Browse Project on top of the list 2016-12-01 16:30:27 +01:00
08f3467406 Fix width on containers 2016-12-01 16:30:17 +01:00
2bae7c2fef Thumbnail on list of blogs on sidebar 2016-12-01 16:21:02 +01:00
b6b517688e Display blog list and posts within the project
TODO: Edit within the project as well
2016-12-01 15:57:59 +01:00
f2942a20fe Refactor manage commands using subcommands
This way we clean up the output of manage.py and sort the commands in
three main categories:
- setup: Setup utilities, like setup_db() or create_blog()
- maintenance: Maintenance scripts, to update user groups
- operations: Backend operations, like moving nodes across projects
2016-12-01 00:33:24 +01:00
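A minimal sketch of grouping manage.py commands into Flask-Script sub-managers, following the three categories above (the usage strings and the example command are illustrative):

    from flask_script import Manager

    manager = Manager()  # in practice this wraps the Flask app

    manager_setup = Manager(usage="Setup utilities, like setup_db() or create_blog()")
    manager_maintenance = Manager(usage="Maintenance scripts, to update user groups")
    manager_operations = Manager(usage="Backend operations, like moving nodes across projects")

    @manager_maintenance.command
    def upgrade_attachment_schema(all=False):
        """Upgrades the attachment schema on existing nodes."""

    manager.add_command("setup", manager_setup)
    manager.add_command("maintenance", manager_maintenance)
    manager.add_command("operations", manager_operations)
    # Invoked as: python manage.py maintenance upgrade_attachment_schema --all
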
d9b56f485b Extend CHECK_PERMISSIONS_IMPLEMENTED_FOR
We support flamenco.jobs. This is a temporary workaround until we
implement permission checking in a way that can be extended by extensions.
2016-11-30 23:50:21 +01:00
f06b3c94eb join_agent page for the agent project 2016-11-30 23:32:46 +01:00
742a16fb9f Better 403 error message 2016-11-30 22:11:27 +01:00
e72f02711d Temporary tweak to join mechanism
TODO: move this to the external app (blender-cloud).
2016-11-30 15:57:11 +01:00
48ebdf11b3 Update project-main 2016-11-29 18:49:49 +01:00
e43f99593a Vertical spacing on hdri thumbnails 2016-11-29 18:43:32 +01:00
476e7be826 Update CSS 2016-11-29 18:22:43 +01:00
8654503f5a Show free ribbon on project view 2016-11-29 18:17:35 +01:00
98295305fd Only show lock icon when we don't have a valid role 2016-11-29 18:00:54 +01:00
e43b0cbccf Responsive layout for HDRI listing 2016-11-29 16:58:11 +01:00
462ef953bc Update CSS 2016-11-29 16:12:53 +01:00
29629f6647 Update CSS 2016-11-29 16:06:04 +01:00
e3fc265408 Bigger thumbnail for HDRIs 2016-11-29 16:02:56 +01:00
a67774c6e8 textures and hdris can also have the public icon 2016-11-29 16:01:51 +01:00
dea6dd5242 Show Public status on textures 2016-11-29 15:58:21 +01:00
a79ca80f28 Limit free icon on jstree for asset/texture items 2016-11-29 15:51:18 +01:00
7fb94a86e8 Display a nice icon on jstree if item is free 2016-11-29 15:35:12 +01:00
9783711818 Add New File button: avoid selection of text and highlight when active 2016-11-29 14:50:59 +01:00
bf5b457141 Node description for HDRI/Textures folders 2016-11-29 14:44:41 +01:00
3fbee33369 Open jstree folders on load, and set parent as selected as well
So when we open a node inside a folder, it highlights itself and parent folder
2016-11-29 14:39:47 +01:00
2c71168677 in some cases hdr files can be read as None 2016-11-29 13:03:57 +01:00
51d7eed164 Fix alignment of text on status-bar 2016-11-29 13:03:42 +01:00
64ce091f11 Fix sidebar height not taking navbar height into account 2016-11-29 12:25:46 +01:00
4a5d553bc8 No blog on activity stream 2016-11-25 13:32:17 +01:00
f75c43055f Blog on frontpage 2016-11-25 13:32:05 +01:00
f2d9df8b61 Add note about status parsing during the node tree creation 2016-11-25 12:56:41 +01:00
c73ad07e83 Remove whitespaces 2016-11-25 12:45:29 +01:00
a93d9be632 Remove whitespace 2016-11-25 12:43:59 +01:00
89689db96e Move tooltips/popovers code to layout 2016-11-24 19:43:11 +01:00
01e79f8565 Show icons on project homepage list 2016-11-24 19:42:12 +01:00
5866cc54aa Style group_texture 2016-11-24 19:16:34 +01:00
e8b03de444 Update css 2016-11-24 19:04:17 +01:00
1e1d9e57e7 Show description/content of posts/assets 2016-11-24 19:03:43 +01:00
5617f89c99 Style posts and assets on project homepage 2016-11-24 18:47:15 +01:00
b30aba2463 Fix clicking on posts 2016-11-24 18:46:41 +01:00
c8ae748bd6 Move colors for node types to config 2016-11-24 18:46:26 +01:00
3e6a9909da Update CSS 2016-11-24 18:21:18 +01:00
d35f2aa8c9 style tweaks to homepage activity stream 2016-11-24 18:17:40 +01:00
32ac0a64fb navbar is now opaque 2016-11-24 18:17:23 +01:00
3125ff75ca Style tweaks to sidebar 2016-11-24 18:17:13 +01:00
62b518c81e Show updated time on page template 2016-11-24 18:16:25 +01:00
8865ae02e4 Merge nodes_blog and nodes_featured 2016-11-24 18:16:15 +01:00
44c4182a86 Remove blog from sidebar and use folder icon 2016-11-24 18:15:45 +01:00
f59086c025 Style blog and page items on the tree 2016-11-24 18:15:20 +01:00
081a7f96ca No transparent navbar anymore 2016-11-24 18:14:46 +01:00
b1a0e1e3b6 Show blog on the tree 2016-11-24 18:14:25 +01:00
6910d3da49 We always include the picture now 2016-11-24 18:14:07 +01:00
b9c3d6b0fb Merge featured assets and blog posts into one activity stream 2016-11-24 18:13:46 +01:00
f99869f57e 10 featured/latest items 2016-11-24 18:12:38 +01:00
85bfbdb5e3 Display 10 comments on frontpage 2016-11-24 18:12:16 +01:00
ee20926233 List style for homepage activities 2016-11-24 16:31:36 +01:00
f732f1e08b Expand > Toggle 2016-11-24 16:31:36 +01:00
f899fb48ce Lighter background for navtree 2016-11-24 16:31:36 +01:00
4f071260f7 Fix tooltips not visible 2016-11-24 16:31:36 +01:00
6ed772278c Tooltips on the right and better text for them 2016-11-24 16:31:36 +01:00
Dalai Felinto
b04ed3f5b6 Fix problem with pip install failing
Repeated elements here make it fail (at least in WSL, Windows Subsystem for Linux)
2016-11-21 23:03:52 +01:00
738c3e82d7 Remove box for containers on posts 2016-11-21 12:37:03 +01:00
9e952b0436 Fix on scrollbars 2016-11-21 12:29:09 +01:00
6ef2c5ca0d Refresh CSS cache 2016-11-17 15:24:00 +01:00
c025aa3aac Move table classes up a level so they can have effect without being nested 2016-11-17 14:55:04 +01:00
a41bda6859 Minor tweaks to tree/nav tree 2016-11-16 17:58:52 +01:00
9210285089 Make status-bar one line 2016-11-16 17:57:38 +01:00
f1661f7efb Use native scrollbars 2016-11-16 17:48:35 +01:00
8959fac415 Tooltips/popovers without delay 2016-11-11 20:04:08 +01:00
9b469cee7d Style tweaks to jstree 2016-11-11 20:03:45 +01:00
bbb3f5c7c0 Don't display extra content on /about 2016-11-11 18:16:16 +01:00
3139ba5368 Style tweak to nav header 2016-11-11 18:16:02 +01:00
df810c0c4e Fix icon 2016-11-11 18:04:53 +01:00
29b4ebd09a Link to project blog 2016-11-11 17:55:23 +01:00
76a5d9c9e1 Blog and Latest assets are shown bigger now 2016-11-11 17:48:38 +01:00
fe848525b1 Small refactor of jstree style
Still needs some work but it's a bit cleaner
2016-11-11 17:11:35 +01:00
24ede3f4ee Include node_type on jstree list item 2016-11-11 17:11:35 +01:00
756e3d2d89 Template for pages 2016-11-11 17:11:35 +01:00
684afb8cd5 Style .container.box 2016-11-11 17:11:35 +01:00
52a1602a7c Allow overriding whether the user can comment from URL.
Not really secure (user can still post comments via API and by changing the
URL and re-requesting the embedded comment form), but at least normal users
are blocked from commenting this way.
2016-11-11 16:01:56 +01:00
ce6020702e Don't check for hardcoded caminandes-3 url
We now have the header_video_file feature for it
2016-11-11 15:37:46 +01:00
76f2367e66 Added extra role to UserAdminTest. 2016-11-11 15:23:25 +01:00
5f0092cfa1 Fixed bug in /u/ where home project group membership was lost after edit.
Rather than understanding the code, I rewrote the editing and added a
unit test for it.
2016-11-11 15:06:29 +01:00
4b84e6506b CLI command to check home project group membership 2016-11-11 15:05:43 +01:00
a13937e500 Log error when unable to update home project 2016-11-11 12:44:47 +01:00
b9e27a4cbf Quote activity verb in log 2016-11-11 08:40:49 +01:00
3b694a91af Fix alignment of header 2016-11-10 11:26:26 +01:00
f651ece343 Set color for navigation on sidebar 2016-11-10 11:21:11 +01:00
595a690473 Removed activity 'extra fields', as it wasn't used and half-built. 2016-11-10 09:50:10 +01:00
1702b40812 hover color of active list items 2016-11-10 00:54:06 +01:00
9612e99806 Unify active state of list and table items 2016-11-10 00:52:36 +01:00
c17993418c Fix animated stripes background not aligned on tables 2016-11-10 00:10:29 +01:00
60e43c368d Active statuses for tables and list items 2016-11-09 23:37:17 +01:00
2f3e5a513b Unify inputs with other apps 2016-11-09 23:14:15 +01:00
54fccfc3ad Status colors
From Attract, but will also be used in Flamenco and others in the future
2016-11-09 23:01:32 +01:00
b6b62babd2 Some fixes and utils from Attract 2016-11-09 22:42:53 +01:00
ad3f2c0119 Introducing apps_base.sass, contains basic layout/generic classes 2016-11-09 22:36:55 +01:00
dc70705b1e Don't show "Join the conversation" to demo users
And minor style tweaks
2016-11-09 18:15:34 +01:00
ab375b2126 Moved node_setattr() from Attract to Pillar 2016-11-09 12:50:30 +01:00
fcecc75c3d Update CSS cache version 2016-11-08 18:29:46 +01:00
15be184816 Align edit header to the right 2016-11-08 18:25:40 +01:00
45328b629b Escape html when building jstree 2016-11-08 18:25:23 +01:00
cce45b96e1 Fix special characters on document title 2016-11-08 18:08:30 +01:00
edad85ee34 Display private/public label on projects 'shared with me' 2016-11-08 17:56:56 +01:00
116ed9f08a More padding on error message 2016-11-08 16:03:39 +01:00
7391f40cba Users list: copy to clipboard for IDs
Feature request by Francesco
2016-11-08 15:59:14 +01:00
e54bfa4520 Clipboard.js, brought over from Attract, we'll use it here as well 2016-11-08 15:49:47 +01:00
d272896787 Use a lock icon (instead of download icon) when there's no permission to download 2016-11-08 15:05:44 +01:00
724fe6ceeb 'Join the conversation' wasn't accurate for subscribers without POST permission 2016-11-08 14:19:23 +01:00
865259d40e pretty_date('some string') now tries to parse the string as datetime.
dateutil.parser.parse('some string') is used for this.
2016-11-08 13:38:36 +01:00
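A sketch of the string-handling branch; only the use of dateutil.parser.parse() is from the commit message, the surrounding pretty_date() logic is omitted:

    import dateutil.parser

    def to_datetime(value):
        """Accept either a datetime or a string; strings are parsed with dateutil."""
        if isinstance(value, str):
            # e.g. '2016-11-08 13:38:36+01:00' -> timezone-aware datetime
            return dateutil.parser.parse(value)
        return value
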
65b554986c pretty_date(None) now returns None 2016-11-08 12:56:19 +01:00
fb6e326a14 Also support future dates and times in pretty_date 2016-11-08 12:24:55 +01:00
920a1de263 No need to format known number 2016-11-08 12:24:55 +01:00
0da4e3bafc Public/Private label for list of own projects 2016-11-08 12:00:15 +01:00
89be4efe6f If the day is in the future, just print the time (instead of an empty string) 2016-11-07 17:10:41 +01:00
ba591da2fc Store js libraries locally 2016-11-07 12:20:23 +01:00
4c6a51c501 Fixed some package version conflicts between Pillar and the SDK. 2016-11-07 10:56:31 +01:00
76174046ad Use our own perfect scrollbar, not cdn 2016-11-04 16:11:04 +01:00
7b79270481 Auto-open dropdown menus only on nav bars 2016-11-04 11:22:22 +01:00
a1dca29382 Quick fix for layout of attachments file upload 2016-11-04 11:05:19 +01:00
c1427cf6a2 avoid horizontal scroll on notifications 2016-11-03 18:27:50 +01:00
a89ada7c2f Ported yesno Django filter to Jinja2 2016-11-03 18:26:11 +01:00
84a86a690e Gracefully handle replies on comments on deleted nodes. 2016-11-03 17:45:25 +01:00
0a0db88701 Style disabled buttons 2016-11-03 15:35:00 +01:00
27bad1be8a Fix markdown on comments 2016-11-03 15:34:50 +01:00
e98b158886 Disabled auto-slug feature.
It broke file uploads. Thanks @venomgfx for joining in solving this.
2016-11-03 14:04:40 +01:00
324d500edb Tweaks to style of file attachments 2016-11-02 19:42:44 +01:00
ef326a2193 Fix width of project header when page is not fully loaded 2016-11-02 19:05:43 +01:00
5ade876784 Labels for fields 2016-11-02 18:55:26 +01:00
738c20b36b Undertitle field labels 2016-11-02 18:51:51 +01:00
3c6642d879 Undertitle labels for checkboxes
Avoids ugly 'is_tileable' label on textures
2016-11-02 18:50:20 +01:00
e43405a349 Fix for empty File field not showing when there are no files
Committing on behalf of Dr. Sybren
2016-11-02 18:43:41 +01:00
f394907dd2 CLI replace_pillar_node_type_schemas: abort when unable to save 2016-11-02 18:20:44 +01:00
e117432f3d CLI replace_pillar_node_type_schemas: allow setting license types on public project nodes. 2016-11-02 18:15:23 +01:00
295c821b9d Simplified code 2016-11-02 17:55:37 +01:00
865f777152 CLI replace_pillar_node_type_schemas: using PILLAR_NAMED_NODE_TYPES 2016-11-02 17:21:50 +01:00
36e7cc56ef Removed colon for easy copy & paste of IDs 2016-11-02 17:21:50 +01:00
aa3340ddbe CLI upgrade_attachment_schema: stop when a node cannot be saved. 2016-11-02 17:21:50 +01:00
4280e0175b CLI upgrade_attachment_schema: only upgrade non-deleted nodes 2016-11-02 17:21:50 +01:00
cc562a9fb1 Fix attachment rendering for nodes without description. 2016-11-02 17:21:50 +01:00
4ec3268a23 Reloading comment list via event 'pillar:comment-posted' on body element. 2016-11-02 17:21:50 +01:00
80601f75ed Remove deprecated +button-rounded-filled mixin
We now use just 'button', as roundness and filled are configurable
2016-11-02 16:36:47 +01:00
9ac2f38042 Warn if there's no slug to append 2016-11-02 16:21:10 +01:00
4bd334e403 Add button to 'Add Attachment to Description' 2016-11-02 16:16:20 +01:00
ae859d3ea7 Minor style tweaks to file form widgets 2016-11-02 16:16:20 +01:00
e69393e95e WIP: endpoint for posting new comments without comment list.
We need to determine what happens when such a comment is successfully
posted, as we can't just reload the comment list. In other words, this is
dependent on where we are embedded, and cannot be handled just locally.
2016-11-02 15:40:26 +01:00
2cc21583d9 On-create activities are only created for Pillar nodes.
This allows Attract to use custom on-create activities.
2016-11-02 15:39:16 +01:00
0ac0f482ac Merge branch 'production' 2016-11-02 14:52:37 +01:00
f30cdd5246 Minor style tweaks to attachments form 2016-11-02 14:51:10 +01:00
48157254c1 Fixed snag. 2016-11-02 14:43:19 +01:00
3fc08bcafd Set the slug based on the file name 2016-11-02 14:07:02 +01:00
ff94cc57a3 Only show image size if it's an image
Otherwise it'd be NonexNone
2016-11-02 12:51:49 +01:00
cf28e5a3f4 Unified "Add New File" and ".. Attachment" buttons. 2016-11-02 12:29:38 +01:00
6ea7386bd3 "Add new attachment" button works. 2016-11-02 12:28:45 +01:00
90c6fdc377 Handle empty attachments (no slug nor oid) and reject duplicate slugs 2016-11-02 12:28:45 +01:00
2a5b3dc53e Removed unused code. 2016-11-02 12:28:45 +01:00
dabc1a44b8 Set icon for error message 2016-11-02 11:42:49 +01:00
eb1561136b Fix typo in attachments code 2016-11-02 11:42:23 +01:00
d24677992e Datetimes in dynamic properties are now timezone-aware (but hardcoded). 2016-11-02 10:52:44 +01:00
e143b9cb72 Use undertitle filter when displaying node status 2016-11-01 19:36:04 +01:00
6faea83372 Fix rating on comments 2016-11-01 19:28:53 +01:00
d36dcad773 Fix rated status for comments (was missing space between classes) 2016-11-01 19:28:53 +01:00
a385a373b9 Typo in comments 2016-11-01 19:28:53 +01:00
8fa135d52e Add license types and notes to asset node_type 2016-11-01 19:05:14 +01:00
6f460ee127 Fix for non existing attachments 2016-11-01 18:05:26 +01:00
8cc2cfb189 Don't use hardcode url for homepage 2016-11-01 17:29:27 +01:00
c672bc07fe Only load comments on assets or posts
Was trying to load comments on groups, textures, etc.
2016-11-01 17:17:33 +01:00
656944f1ce Allow add_to_project() to take generator for node types 2016-11-01 16:47:55 +01:00
ab9d5c1793 CLI upgrade_attachment_schema: skip already upgraded nodes. 2016-11-01 16:47:55 +01:00
fe4d70c0d1 CLI upgrade_attachment_schema: also remove attachments form_schema
Previously they would have {'attachments': {'visible': False}}, but this
is no longer needed.
2016-11-01 16:47:55 +01:00
964e807721 Give admin explicit permissions, instead of blindly granting everything.
This ensures that the allowed_methods properties are properly set. Admin
users get the union of all permissions given to all groups and users.
2016-11-01 16:47:55 +01:00
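A sketch of computing that union of allowed methods over all group and user grants (the permissions layout mirrors the one used elsewhere in this log; the helper name is an assumption):

    def union_of_methods(permissions):
        """Union of all HTTP methods granted to any group or user.

        permissions is a dict like:
            {'groups': [{'group': <id>, 'methods': ['GET', 'PUT']}, ...],
             'users':  [{'user': <id>, 'methods': ['GET']}, ...]}
        """
        allowed = set()
        for grant in permissions.get('groups', []) + permissions.get('users', []):
            allowed.update(grant.get('methods', []))
        return sorted(allowed)
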
3cf71a365f Forms for attachments work, VERY HACKISH Hardcodedness™ 2016-11-01 16:47:55 +01:00
5bd2c101fe Restore DB from 'cloud' subdir 2016-11-01 16:47:55 +01:00
aef7754537 Attachment rendering for posts & node descriptions. 2016-11-01 16:47:55 +01:00
d50d206e77 Gracefully handle non-existing files when renaming asset nodes. 2016-11-01 16:47:55 +01:00
28223159e7 Allow admin users to do everything.
This makes things more consistent (previously admins could create projects,
but not nodes in those projects).
2016-11-01 16:47:55 +01:00
a38e053c1a Added CLI command to create blogs. 2016-11-01 16:47:55 +01:00
62ac12deff Some more simplification 2016-11-01 16:47:55 +01:00
64ece74404 Cleaned up some blog post viewing code 2016-11-01 16:47:55 +01:00
bffbbad323 Support Cerberus valueschema in ValidateCustomFields 2016-11-01 16:47:55 +01:00
8fb64c38d6 Removed API-side attachment parsing. 2016-11-01 16:47:55 +01:00
f72890cc59 Define standard set of node types 2016-11-01 16:47:55 +01:00
0929a80f2b New data structure for attachments. 2016-11-01 16:47:55 +01:00
ff7101c3fe Small improvements in ValidateCustomFields() 2016-11-01 16:47:55 +01:00
590d075735 New schema for attachments, using propertyschema/valueschema. 2016-11-01 16:47:55 +01:00
fa3406b7d0 only_for_node_type_decorator() now supports checking multiple node types 2016-11-01 16:47:32 +01:00
5805f4eb2a Comments is now part of the base style 2016-11-01 15:53:40 +01:00
53cbe78ec1 Use #comments-embed for embedding comments. Avoid duplicate ID 2016-11-01 15:53:40 +01:00
f4b5e49c26 Return service account info from create_service_account() 2016-11-01 14:00:00 +01:00
499af03473 Gracefully handle 404 in get_user_info() 2016-11-01 14:00:00 +01:00
51c2c1d568 Make it possible for Pillar extensions to add service accounts. 2016-11-01 14:00:00 +01:00
144c5b8894 Use statusBarSet() js function from Pillar 2016-11-01 12:30:53 +01:00
c9d7da3a42 Attract and Flamenco icons 2016-10-21 20:41:41 +02:00
b59fcb5cba Prevent {{ url_for_node(...) }} crashing the planet when node doesn't exist.
Now None is returned as URL, and a warning is logged, rather than crashing
with a 500. A situation like this occurs when an activity refers to a
node that no longer exists.
2016-10-21 16:00:03 +02:00
7be8e9b967 Show a nicer 404 error when something was deleted (instead of just "not there") 2016-10-21 15:27:17 +02:00
041722f71a Allow custom messages in the 404_embed.jade template 2016-10-21 14:38:57 +02:00
457a63ddcb Notifications: Fix alignment of mark as read button 2016-10-21 11:43:40 +02:00
5677ae8532 Prevent errors when notification is linked to non-existing node 2016-10-20 17:43:51 +02:00
8d99f8fc2e No more on-focus resizing; the "POST COMMENT" button moves away when you click it 2016-10-20 17:30:39 +02:00
09a21510a2 Comments: fixed issue cancelling reply & then posting top-level comment
This would still post as a reply, rather than as a top-level comment.
2016-10-20 17:29:45 +02:00
73641ecc8a Allow more tags in comments, including iframe (for video embedding) 2016-10-20 17:14:20 +02:00
b1da6de46e Comment textarea min height set when editing + only transition border-color 2016-10-20 17:04:02 +02:00
fceac01505 Set a nice minimum height when editing a comment 2016-10-20 17:02:07 +02:00
8b64f9140b Allow resizing of comment textarea 2016-10-20 17:01:58 +02:00
e1678537c0 Editing comments via PATCH on pillar-web, and some other comment fixes 2016-10-20 16:47:04 +02:00
d8686e5a14 Fixed comment rating 2016-10-20 16:34:33 +02:00
e71e6a7b32 API for editing comments via PATCH 2016-10-20 16:22:11 +02:00
8352fafd21 Replaced markdown with commonmark module 2016-10-20 13:05:43 +02:00
db2680be81 Removed unused import 2016-10-20 13:05:43 +02:00
c456696600 Added TODO 2016-10-20 13:05:43 +02:00
ad1816c617 log.warning → .info 2016-10-20 13:05:43 +02:00
8d3c4745aa Remove unnecessary form_schema fields. 2016-10-20 13:05:43 +02:00
3afeeaccd0 Removed permission keys from node type definitions.
This prevents replace_pillar_node_type_schemas() from overwriting existing
permissions.
2016-10-20 13:05:43 +02:00
7f4ad85781 Count comments and replies, not just top-level comments 2016-10-19 17:16:27 +02:00
ea2be0f13d Major revision of comment system.
- Comments are stored in HTML as well as Markdown, so that conversion
  only happens when saving (rather than when viewing).
- Added 'markdown' Jinja filter for easy development. This is quite
  a heavy filter, so it shouldn't be used (much) in production.
- Added CLI command to update schemas on existing node types.
2016-10-19 16:57:17 +02:00
eea934a86a Added username to public user fields 2016-10-19 16:57:17 +02:00
f2f66d7a6c Moved subquery.py from Attract to Pillar, as it's useful for comments too.
It's an attempt to speed up common queries which would ordinarily be
embedded by Eve. We want to move away from embedding due to security
issues (allowing the embedding of users leaks privacy-sensitive info).
2016-10-18 15:34:39 +02:00
aca54d76e0 Moved find_url_for_node() to its own module and made more pluggable.
Extensions can now register custom node URL finders using the
@pillar.web.nodes.finders.register_node_finder(node_type_name) decorator.
2016-10-18 12:03:06 +02:00
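A hypothetical usage sketch of the decorator named above; the finder body and the 'page' node type are illustrative assumptions:

    from pillar.web.nodes import finders

    @finders.register_node_finder('page')
    def find_url_for_page(node):
        # Return the URL for this node type; the URL scheme is an assumption.
        return '/p/%s/%s' % (node['project_url'], node['properties']['url'])
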
646ab58395 Style sidebar icons 2016-10-18 11:34:53 +02:00
d99ddca410 Split base styles into base.css
That way we can load this css in other projects to bring the basic stuff
such as normalize, navbar, notifications, custom scrollbars, and so on.
2016-10-17 16:17:23 +02:00
87f3093503 Delete attract main.sass, attract has its own 2016-10-17 15:40:14 +02:00
ae723b1655 update css 2016-10-14 15:57:11 +02:00
0a606ae15c Fix Free tag overflow 2016-10-14 15:19:40 +02:00
6af3dfdb51 Use local bootstrap 3.3.7 2016-10-13 16:02:38 +02:00
eca3f47eb8 Style form-upload-progress-bar when uploading
Had the same green hue for completed/uploading, which made it confusing.
2016-10-13 14:25:18 +02:00
8043caf187 Font Pillar: Question mark icon 2016-10-13 14:25:18 +02:00
aa953f76a1 Cache FlaskInternalApi object on request keyed by authentication token. 2016-10-13 10:01:29 +02:00
10ecb2158e Log error when URLer service is used but not configured. 2016-10-13 10:01:11 +02:00
96c9e12f7f doc_diff() optionally no longer reports differences between falsey values.
If falsey_is_equal=True, all Falsey values compare as equal, i.e. this
function won't report differences between DoesNotExist, False, '', and 0.
2016-10-12 17:09:48 +02:00
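A simplified sketch of the falsey-equal behaviour for flat dicts (the real doc_diff() also handles nested documents; this flat version is an assumption):

    def doc_diff(doc_a, doc_b, falsey_is_equal=True):
        """Yield (key, value_a, value_b) for fields that differ between two flat dicts."""
        for key in set(doc_a) | set(doc_b):
            val_a = doc_a.get(key)
            val_b = doc_b.get(key)
            if val_a == val_b:
                continue
            if falsey_is_equal and not val_a and not val_b:
                # Treat missing, None, False, '' and 0 as equal to each other.
                continue
            yield key, val_a, val_b
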
7c310e12ef Added util function to compute the difference between two dicts. 2016-10-12 16:01:30 +02:00
26aa155b9e Cache Pillar API Object on request object. 2016-10-12 14:29:47 +02:00
0146b568c0 Allow extra fields in activities. 2016-10-12 14:29:28 +02:00
ade62033ba Added only_for_node_type_decorator(node_type_name) decorator factory func
This allows you to create a decorator for Eve hooks. The factory returns
a decorator that checks the node type of the hook's first argument.

If the node is not of the required node type the hook returns None,
otherwise it calls the wrapped function.
2016-10-12 13:41:16 +02:00
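A sketch of such a decorator factory (the exact implementation is an assumption; the hook signature shown is the common single-document case):

    import functools

    def only_for_node_type_decorator(node_type_name):
        """Returns a decorator that only runs the wrapped Eve hook for one node type."""
        def decorator(wrapped):
            @functools.wraps(wrapped)
            def wrapper(node, *args, **kwargs):
                if node.get('node_type') != node_type_name:
                    return None  # skip hooks for other node types
                return wrapped(node, *args, **kwargs)
            return wrapper
        return decorator

    # Usage:
    only_for_comment = only_for_node_type_decorator('comment')

    @only_for_comment
    def on_comment_saved(node):
        print('comment saved:', node['_id'])
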
8aab88bdc2 Activities now have explicit project ID
This allows for directly querying activity on a certain project.
Used in Attract for task/shot activity streams.
2016-10-12 13:40:27 +02:00
f4b34f1d02 Error handler: set 'code' and 'description' defaults separately. 2016-10-12 10:22:25 +02:00
4eb8319697 Better logging of OAuth issues, in the hope to figure out what's going on. 2016-10-11 17:09:02 +02:00
5dd49fa5dd Pillar Extensions can now add links to the sidebar. 2016-10-11 16:33:44 +02:00
6429c3df21 Modernised flask.ext.login imports → flask_login 2016-10-11 15:23:40 +02:00
3561cb61c6 Fix favicon 2016-10-10 17:29:13 +02:00
a52c263733 Homepage: Fix long comments 2016-10-10 16:39:36 +02:00
c9d4a06486 Swap Blender Sync with Agent 327 project announcement 2016-10-07 16:42:42 +02:00
8a35fe3a16 Swap blog stream with random featured assets 2016-10-07 15:12:27 +02:00
620107fdc0 If there's no content_type, display node_type
Like in the case of textures, which have a node_type but no content_type
2016-10-07 15:06:29 +02:00
14a8be6329 Fix 'Latest Assets' list not being updated
Was simply missing project_id
2016-10-07 15:05:57 +02:00
77b17e31e0 Homepage: Minor style tweaks to make feed a bit more compact 2016-10-07 14:52:39 +02:00
2028891e7a No need to cache Sass, it's so fast anyway 2016-10-07 14:51:46 +02:00
abe0c28a99 Flowplayer: Fix fullscreen icon 2016-10-06 11:35:10 +02:00
c71186f318 Allow project membership to be managed by people with the admin role.
This was already mentioned as possible in the frontend, but not implemented
in the backend.
2016-10-05 14:36:07 +02:00
4e0db78ff1 Made the use of the term "Team member" consistent on the proj sharing page.
Also clarified that project owners *and* team members can edit the project,
and that team members can also delete assets.
2016-10-04 12:51:23 +02:00
d1610da5f9 JStree: HREF attribute link to actual node instead of #
This allows things like middle click on an item to load in a separate tab, yay!

Idea and help by Dr. Sybren
2016-10-04 12:38:08 +02:00
73ec464292 py.test: run with -x (stop at first error) and --ff (failed test first) 2016-10-04 11:58:46 +02:00
0de8772c98 Removed __all__, as we didn't keep it up to date anyway. 2016-10-04 11:58:46 +02:00
91b116aa74 Slightly smarter ./gulp script (taken from Attract) 2016-10-04 11:58:46 +02:00
6537332b26 Don't use # as link on group nodes listing, use the actual link 2016-09-30 18:07:36 +02:00
001d310d76 Fix double pushState when browsing group nodes
Was calling displayNode() twice
2016-09-30 18:07:36 +02:00
e2921c8da8 nodes_latest was missing the content_type 2016-09-30 18:07:36 +02:00
d1d48553e5 Fix link to blog items not working 2016-09-30 18:07:36 +02:00
dd58d4ad04 Created AbstractPillarTest.create_project_admin() function. 2016-09-30 12:54:21 +02:00
b429933737 Added 'required_after_creation' Eve schema validator. 2016-09-30 12:54:21 +02:00
2cc22f4f90 Fix scrolling on mobile 2016-09-30 11:28:21 +02:00
e2236864e7 Filter out '^attract_.*' node types from jstree
While we're at it, also filter out comment & post from the query, rather
than later in Python code.
2016-09-29 17:34:24 +02:00
74d86487a9 Added self-building gulp command 2016-09-29 10:01:31 +02:00
d7fe196af0 Some dependency cleanups. 2016-09-29 10:01:15 +02:00
dcef372e4f Gracefully handle project without node types.
This can happen when a projection excludes node types.
2016-09-29 09:55:49 +02:00
7931428312 Clipboard icons on pillar-font 2016-09-27 17:01:07 +02:00
407aefb9ad Added CLI command for moving top-level nodes between projects.
Also introduces a slightly nicer way to get the database interface, and
an object-oriented way to allow dependency injection.
2016-09-27 12:57:57 +02:00
c64fbf61ba Removed project node type 2016-09-27 12:57:57 +02:00
063023c69a PEP8 2016-09-27 12:57:57 +02:00
2c7d2e7dfd Move font-pillar into its own css file
So we can easily link it from attract/flamenco/etc
2016-09-23 17:29:35 +02:00
7968c6ca37 Added node_type_utils to assign permissions to certain node types.
This separates "mechanism" from "policy".
2016-09-23 17:13:26 +02:00
91e3ec659f Added ProjectUtils.projectUrl() 2016-09-23 10:12:57 +02:00
e0f92b6185 Don't log entire exception when forwarding a 412 Precondition Failed. 2016-09-23 09:40:05 +02:00
0bf07b4ba4 ProjectUtils: add context
Currently used in Attract for the shots/tasks list
2016-09-22 18:59:55 +02:00
dfe398458b Tutti: Check if algoliaIndex is defined 2016-09-22 18:59:55 +02:00
30215bf87c Tutti: Check if tooltip/popover exist 2016-09-22 18:59:55 +02:00
0f23ee7a08 Added handler for 412 Precondition Failed from SDK. 2016-09-22 18:09:43 +02:00
9514066893 Gulp: Don't livereload by default
When running gulp watch, we were livereloading by default, which meant we couldn't have multiple 'gulp watch' processes.
2016-09-22 18:07:05 +02:00
cd8707207b Made format_undertitle() Jinja filter None-safe 2016-09-22 10:33:51 +02:00
7f9f89853d Properly handle embed/non-embed error renders for some SDK exceptions. 2016-09-22 09:25:59 +02:00
78824c9c2a Allow extensions to define custom project properties 2016-09-20 15:59:39 +02:00
40896fc70b Better logging when bad extension class is given.
This was necessary to debug an issue with different unit tests influencing
each other in Attract.
2016-09-20 15:59:39 +02:00
7598ad0b57 Gulp: Avoid re-building unchanged files by caching the results 2016-09-20 15:17:19 +02:00
4b11aab429 Update cloud headline 2016-09-19 16:53:11 +02:00
ad91e37d14 Art of Blender is selling out! 2016-09-19 12:34:03 +02:00
df8afb8b14 Append license notes to Algolia index
So nodes without a description, or uploaded by other users (like
textures), can keep clean names while still being easy to search by
their copyright notes.

Reviewers: sybren, fsiddi

Reviewed By: sybren, fsiddi

Differential Revision: https://developer.blender.org/D2225
2016-09-14 09:39:19 +02:00
55b2911665 Added .arcconfig for phabricator integration 2016-09-14 09:39:19 +02:00
1680475d92 Expose License notes on Textures, if any 2016-09-12 18:57:57 +02:00
d116439b57 correct text when there are no hdris 2016-09-12 18:11:25 +02:00
56c669874d Agent in the frontpage 2016-09-12 18:01:11 +02:00
76b0f5fc46 Moved login-code into a separate function.
This makes it easier to log in users by their token from unittests.
2016-09-08 12:03:51 +02:00
68666f0650 Updated unittest code so that we can create 100% valid projects.
This means also creating a user and groups so that the references are
valid.
2016-09-08 12:03:17 +02:00
4313284dab Added 'hide_none' Jinja filter, which replaces None with an empty string 2016-09-07 17:01:56 +02:00
9e6b998c50 Refactored static file handling so that extensions can provide static files 2016-09-07 16:36:25 +02:00
b2e8711ac4 Moved Jinja2 stuff to its own module, and added |undertitle filter. 2016-09-07 16:03:40 +02:00
f03566a10f Added template for embedded error 500 2016-09-07 14:57:05 +02:00
2730a7a2b2 Added error handlers for some PillarSDK exceptions. 2016-09-07 12:23:48 +02:00
f21b708085 Made it easier for extensions to register multiple blueprints at different URLs
The blueprint's own url_prefix='/xxx' setting is now taken into account.
2016-09-07 11:40:24 +02:00
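A sketch of how an extension's blueprints could be mounted while honouring each blueprint's own url_prefix (the combination logic and names are assumptions):

    from flask import Flask, Blueprint

    def register_extension_blueprints(app, blueprints, extension_prefix):
        """Mount blueprints under the extension's prefix, keeping their own url_prefix."""
        for bp in blueprints:
            own_prefix = bp.url_prefix or ''
            app.register_blueprint(bp, url_prefix=extension_prefix + own_prefix)

    app = Flask(__name__)
    api_bp = Blueprint('attract_api', __name__, url_prefix='/api')
    register_extension_blueprints(app, [api_bp], '/attract')
    # attract_api routes end up under /attract/api/...
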
8a6cd96198 Added pi-users icon + documented regeneration of pillar-font. 2016-09-07 11:14:36 +02:00
4ae36a0dc3 Allow custom template dirs for extensions 2016-09-06 18:39:35 +02:00
eac49ab810 Use BLENDER_ID_ENDPOINT to get roles from BlenderID
Also refactored some code.
2016-09-06 17:27:14 +02:00
49c08cba10 Custom error handlers: also properly handle non-Werkzeug exceptions. 2016-09-06 17:10:50 +02:00
cf30bb5d62 Use BlenderID-side roles to grant demo role. 2016-09-06 16:42:48 +02:00
ab5a4a6b6c Custom error pages.
These make a distinction between API requests on /api/ (which will return
a JSON response) and other requests (which will return HTML).

Fixes T49212
2016-09-06 14:22:52 +02:00
e04b2ef7ea Fix background color for nav container 2016-09-06 12:41:52 +02:00
52ca2adc19 User admin: actually show the search hit container. 2016-09-06 12:16:25 +02:00
29a0bed39b Fix background color of node-container on /about 2016-09-06 12:11:47 +02:00
634ad86fa1 Fix search on blog and tweaks to navbar 2016-09-06 12:04:40 +02:00
574178cffc Prevent accessing /nodes/undefined/view from search pages.
`firstHit.attr('data-hit-id')` can be undefined; in that case we just
ignore the situation.

Furthermore, I've removed the call to clearTimeout(), as it is only
called after the timeout has been hit, and thus is a no-op.
2016-09-06 11:56:54 +02:00
305d9b44ec re-indented algolia_search.js so that it uses 4-space indents. 2016-09-06 11:52:26 +02:00
3bb55fd3db User admin: properly handle AJAX errors.
Added specific handling for clicking on non-existing users. The styling
might need some tweaking (it's pretty ugly), but then again, it's just
for us admins.
2016-09-06 11:27:49 +02:00
486686f1f9 File upload: Removed JS-side file size check.
Instead, the size of the entire HTTP request body is checked against the
maximum file size. This allows for slightly smaller files (in the order
of 200-300 bytes), which shouldn't be noticeable given our 32 MiB limit
for non-subscribers. This check is performed before accessing
request.files[], and thus before the file even starts uploading.

This also allows unlimited file uploads to subscribers and demo users.
This was already possible using the API, so now the web interface is
consistent. Limits can be set using config[_local].py.

This closes T49264: Allow large uploads for admins
2016-09-06 10:33:28 +02:00
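A rough sketch of checking the whole request body size before touching request.files[] (the limit value and function name are assumptions; the real check depends on the user's roles and config[_local].py):

    from flask import request, abort

    MAX_UPLOAD_BYTES = 32 * 2 ** 20  # e.g. 32 MiB for non-subscribers

    def assert_file_size_allowed():
        """Abort with 413 if the entire request body exceeds the allowed size.

        request.content_length is inspected before request.files[] is accessed,
        so an oversized upload is rejected before the file is parsed.
        """
        length = request.content_length
        if length is not None and length > MAX_UPLOAD_BYTES:
            abort(413)  # Request Entity Too Large
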
52cc61b143 Use Roboto font for headings as well 2016-09-05 19:40:46 +02:00
e4763d809b Project view: Fix transparent background of tree/sidebar 2016-09-05 18:55:49 +02:00
4cf7fde5bf Welcome Colin and Beau! 2016-09-05 16:00:45 +02:00
e58f29a9d0 Fix missing pictures on latest blog posts and node updates 2016-09-05 16:00:45 +02:00
fa050da8e2 Display Blog on the sidebar, if available 2016-09-05 16:00:45 +02:00
3d9b9e40d4 Added PillarExtension.setup_app(app)
It's called on each extension after all extensions have been processed,
and after all built-in Pillar modules have had their setup_app() called.
Call order is random.
2016-08-31 16:03:45 +02:00
4cf779e040 Keep reference to loaded extension, and refuse to load twice.
The Pillar extensions are now stored, by their name, in a dictionary.
2016-08-31 16:02:55 +02:00
a0cc76259e Renamed TestPillarServer to PillarTestServer
TestXXX classes are seen as unit tests by py.test, so anything that's not
a unit test should not be called TestXXX.
2016-08-31 11:29:16 +02:00
54bc0e87ce Updated test requirements 2016-08-31 11:28:38 +02:00
cb5128907c Removed old-src folder, use the last-before-fusion tag instead.
The 'last-before-fusion' tag points to the last revision before the
fusion with Pillar-Web. Any old source can be looked up there.
2016-08-31 11:10:44 +02:00
34921ece76 Added quotes around node type name 2016-08-30 16:00:16 +02:00
5ebec42e6d Removed unused, commented-out code 2016-08-30 15:58:58 +02:00
4529d0597b Gracefully handle nodes of a type for which we don't have a template.
Before, it would simply return a 500 Internal Server Error.
2016-08-30 15:52:55 +02:00
3f9d519753 Added Dummy deploy script for people with a 'git pp' alias
For people with a 'git pp' alias to push to production. These are the
aliases I use to push & deploy changes to production:

    prod = "!git checkout production && git fetch origin production && gitk --all"
    ff = "merge --ff-only"
    pp = "!git push && if [ -e deploy.sh ]; then ./deploy.sh; fi && git checkout master"

Those are handy to make branch switches easy, and to ensure that you don't
accidentally continue work on the production branch after deploying.
2016-08-30 14:37:36 +02:00
3039aef7d3 Removed Attract node types.
Those are moved into the new Blender Cloud server's Attract module.
2016-08-30 14:24:14 +02:00
cb84e6f0b7 Allow CLI commands to set the current user to a non-existing admin user. 2016-08-30 14:24:14 +02:00
88b5537df4 Avoid crash when there is no current user 2016-08-30 14:24:14 +02:00
88dd574797 No longer using flask.ext.XXX, more imports have to change too. 2016-08-30 14:24:14 +02:00
8d6df947c8 Use our own jQuery 2016-08-30 14:10:04 +02:00
b9b993fe4a Extension system: allow empty Eve settings.
Extensions are now able to return an empty dict from their eve_settings()
method.
2016-08-30 13:55:43 +02:00
2c62bd4016 When replying, use @username only 2016-08-30 13:54:59 +02:00
06ed6af2a9 Use Blender Cloud add-on version from config 2016-08-30 12:17:59 +02:00
32c130ed93 Fall back to application/octet-stream when there is no content-type header 2016-08-26 17:57:52 +02:00
634b233685 mass_copy_between_backends: Also catch unexpected exceptions, and simply move on to the next file. 2016-08-26 17:50:40 +02:00
eb7b875122 Copying files to other backend now works 2016-08-26 15:52:02 +02:00
c4a3601939 Broke file_storage.py up into file_storage/{__init__,moving}.py 2016-08-26 15:36:34 +02:00
225f9ae054 WIP for changing file backends 2016-08-26 15:36:34 +02:00
163db3f2b8 Let generated links for 'unittest' backend actually be a valid link. 2016-08-26 15:35:18 +02:00
dd6fc8bde4 generate_link: warn when GCS blob can't be found. 2016-08-26 15:34:58 +02:00
ff692d287c Added 'check_cdnsun' management command.
This command performs a HEAD on each file stored at CDNSun, including its
variations. Logs missing variations and missing main files (but only when
there are no variations).
2016-08-26 14:16:05 +02:00
186 changed files with 10516 additions and 6491 deletions

.arcconfig (new file, +6 lines)

@@ -0,0 +1,6 @@
{
"project_id" : "Pillar Server",
"conduit_uri" : "https://developer.blender.org/",
"git.default-relative-commit" : "origin/master",
"arc.land.update.default" : "rebase"
}

deploy.sh (new executable file, +8 lines)

@@ -0,0 +1,8 @@
#!/bin/bash
echo
echo "==========================================================================="
echo "Dummy deploy script for people with a 'git pp' alias to push to production."
echo "Run deploy script on your server project."
echo "When done, press [ENTER] to stop this script."
read dummy

gulp (new executable file, +19 lines)

@@ -0,0 +1,19 @@
#!/bin/bash -ex
GULP=./node_modules/.bin/gulp
function install() {
npm install
touch $GULP # installer doesn't always touch this after a build, so we do.
}
# Rebuild Gulp if missing or outdated.
[ -e $GULP ] || install
[ gulpfile.js -nt $GULP ] && install
if [ "$1" == "watch" ]; then
# Treat "gulp watch" as "gulp && gulp watch"
$GULP
fi
exec $GULP "$@"

gulpfile.js (modified)

@@ -11,6 +11,7 @@ var rename = require('gulp-rename');
var sass = require('gulp-sass');
var sourcemaps = require('gulp-sourcemaps');
var uglify = require('gulp-uglify');
var cache = require('gulp-cached');
var enabled = {
uglify: argv.production,
@@ -20,6 +21,7 @@ var enabled = {
liveReload: !argv.production
};
/* CSS */
gulp.task('styles', function() {
gulp.src('src/styles/**/*.sass')
@@ -39,6 +41,7 @@ gulp.task('styles', function() {
gulp.task('templates', function() {
gulp.src('src/templates/**/*.jade')
.pipe(gulpif(enabled.failCheck, plumber()))
.pipe(cache('templating'))
.pipe(jade({
pretty: enabled.prettyPug
}))
@@ -51,6 +54,7 @@ gulp.task('templates', function() {
gulp.task('scripts', function() {
gulp.src('src/scripts/*.js')
.pipe(gulpif(enabled.failCheck, plumber()))
.pipe(cache('scripting'))
.pipe(gulpif(enabled.maps, sourcemaps.init()))
.pipe(gulpif(enabled.uglify, uglify()))
.pipe(rename({suffix: '.min'}))
@@ -90,7 +94,10 @@ gulp.task('scripts_concat_markdown', function() {
// While developing, run 'gulp watch'
gulp.task('watch',function() {
// Only listen for live reloads if ran with --livereload
if (argv.livereload){
livereload.listen();
}
gulp.watch('src/styles/**/*.sass',['styles']);
gulp.watch('src/templates/**/*.jade',['templates']);

(deleted file, 783 lines removed)

@@ -1,783 +0,0 @@
#!/usr/bin/env python
from __future__ import division
from __future__ import print_function
import copy
import logging
import os
from bson.objectid import ObjectId
from eve.methods.post import post_internal
from eve.methods.put import put_internal
from flask.ext.script import Manager
# Use a sensible default when running manage.py commands.
if not os.environ.get('EVE_SETTINGS'):
settings_path = os.path.join(os.path.dirname(os.path.abspath(__file__)),
'pillar', 'eve_settings.py')
os.environ['EVE_SETTINGS'] = settings_path
# from pillar import app
from pillar.api.node_types.asset import node_type_asset
from pillar.api.node_types import node_type_blog
from pillar.api.node_types.comment import node_type_comment
from pillar.api.node_types.group import node_type_group
from pillar.api.node_types.post import node_type_post
from pillar.api.node_types import node_type_storage
from pillar.api.node_types.texture import node_type_texture
manager = Manager()
log = logging.getLogger('manage')
log.setLevel(logging.INFO)
MONGO_HOST = os.environ.get('MONGO_HOST', 'localhost')
@manager.command
def runserver(**options):
# Automatic creation of STORAGE_DIR path if it's missing
if not os.path.exists(app.config['STORAGE_DIR']):
os.makedirs(app.config['STORAGE_DIR'])
app.run(host=app.config['HOST'],
port=app.config['PORT'],
debug=app.config['DEBUG'],
**options)
@manager.command
def runserver_memlimit(limit_kb=1000000):
import resource
limit_b = int(limit_kb) * 1024
for rsrc in (resource.RLIMIT_AS, resource.RLIMIT_DATA, resource.RLIMIT_RSS):
resource.setrlimit(rsrc, (limit_b, limit_b))
runserver()
@manager.command
def runserver_profile(pfile='profile.stats'):
import cProfile
cProfile.run('runserver(use_reloader=False)', pfile)
def each_project_node_type(node_type_name=None):
"""Generator, yields (project, node_type) tuples for all projects and node types.
When a node type name is given, only yields those node types.
"""
projects_coll = app.data.driver.db['projects']
for project in projects_coll.find():
for node_type in project['node_types']:
if node_type_name is None or node_type['name'] == node_type_name:
yield project, node_type
def post_item(entry, data):
return post_internal(entry, data)
def put_item(collection, item):
item_id = item['_id']
internal_fields = ['_id', '_etag', '_updated', '_created']
for field in internal_fields:
item.pop(field, None)
# print item
# print type(item_id)
p = put_internal(collection, item, **{'_id': item_id})
if p[0]['_status'] == 'ERR':
print(p)
print(item)
@manager.command
def setup_db(admin_email):
"""Setup the database
- Create admin, subscriber and demo Group collection
- Create admin user (must use valid blender-id credentials)
- Create one project
"""
# Create default groups
groups_list = []
for group in ['admin', 'subscriber', 'demo']:
g = {'name': group}
g = post_internal('groups', g)
groups_list.append(g[0]['_id'])
print("Creating group {0}".format(group))
# Create admin user
user = {'username': admin_email,
'groups': groups_list,
'roles': ['admin', 'subscriber', 'demo'],
'settings': {'email_communications': 1},
'auth': [],
'full_name': admin_email,
'email': admin_email}
result, _, _, status = post_internal('users', user)
if status != 201:
raise SystemExit('Error creating user {}: {}'.format(admin_email, result))
user.update(result)
print("Created user {0}".format(user['_id']))
# Create a default project by faking a POST request.
with app.test_request_context(data={'project_name': u'Default Project'}):
from flask import g
from pillar.api import projects
g.current_user = {'user_id': user['_id'],
'groups': user['groups'],
'roles': set(user['roles'])}
projects.create_project(overrides={'url': 'default-project',
'is_private': False})
def _default_permissions():
"""Returns a dict of default permissions.
Usable for projects, node types, and others.
:rtype: dict
"""
from pillar.api.projects import DEFAULT_ADMIN_GROUP_PERMISSIONS
groups_collection = app.data.driver.db['groups']
admin_group = groups_collection.find_one({'name': 'admin'})
default_permissions = {
'world': ['GET'],
'users': [],
'groups': [
{'group': admin_group['_id'],
'methods': DEFAULT_ADMIN_GROUP_PERMISSIONS[:]},
]
}
return default_permissions
@manager.command
def setup_for_attract(project_uuid, replace=False):
"""Adds Attract node types to the project.
:param project_uuid: the UUID of the project to update
:type project_uuid: str
:param replace: whether to replace existing Attract node types (True),
or to keep existing node types (False, the default).
:type replace: bool
"""
from pillar.api.node_types import node_type_act
from pillar.api.node_types.scene import node_type_scene
from pillar.api.node_types import node_type_shot
# Copy permissions from the project, then give everyone with PUT
# access also DELETE access.
project = _get_project(project_uuid)
permissions = copy.deepcopy(project['permissions'])
for perms in permissions.values():
for perm in perms:
methods = set(perm['methods'])
if 'PUT' not in perm['methods']:
continue
methods.add('DELETE')
perm['methods'] = list(methods)
node_type_act['permissions'] = permissions
node_type_scene['permissions'] = permissions
node_type_shot['permissions'] = permissions
# Add the missing node types.
for node_type in (node_type_act, node_type_scene, node_type_shot):
found = [nt for nt in project['node_types']
if nt['name'] == node_type['name']]
if found:
assert len(found) == 1, 'node type name should be unique (found %ix)' % len(found)
# TODO: validate that the node type contains all the properties Attract needs.
if replace:
log.info('Replacing existing node type %s', node_type['name'])
project['node_types'].remove(found[0])
else:
continue
project['node_types'].append(node_type)
_update_project(project_uuid, project)
log.info('Project %s was updated for Attract.', project_uuid)
def _get_project(project_uuid):
"""Find a project in the database, or SystemExit()s.
:param project_uuid: UUID of the project
:type: str
:return: the project
:rtype: dict
"""
projects_collection = app.data.driver.db['projects']
project_id = ObjectId(project_uuid)
# Find the project in the database.
project = projects_collection.find_one(project_id)
if not project:
log.error('Project %s does not exist.', project_uuid)
raise SystemExit()
return project
def _update_project(project_uuid, project):
"""Updates a project in the database, or SystemExit()s.
:param project_uuid: UUID of the project
:type: str
:param project: the project data, should be the entire project document
:type: dict
:return: the project
:rtype: dict
"""
from pillar.api.utils import remove_private_keys
project_id = ObjectId(project_uuid)
project = remove_private_keys(project)
result, _, _, _ = put_internal('projects', project, _id=project_id)
if result['_status'] != 'OK':
log.error("Can't update project %s, issues: %s", project_uuid, result['_issues'])
raise SystemExit()
@manager.command
def refresh_project_permissions():
"""Replaces the admin group permissions of each project with the defaults."""
from pillar.api.projects import DEFAULT_ADMIN_GROUP_PERMISSIONS
proj_coll = app.data.driver.db['projects']
result = proj_coll.update_many({}, {'$set': {
'permissions.groups.0.methods': DEFAULT_ADMIN_GROUP_PERMISSIONS
}})
print('Matched %i documents' % result.matched_count)
print('Updated %i documents' % result.modified_count)
@manager.command
def refresh_home_project_permissions():
"""Replaces the home project comment node type permissions with proper ones."""
proj_coll = app.data.driver.db['projects']
from pillar.api.blender_cloud import home_project
from pillar.api import service
service.fetch_role_to_group_id_map()
fake_node_type = home_project.assign_permissions(node_type_comment,
subscriber_methods=[u'GET', u'POST'],
world_methods=[u'GET'])
perms = fake_node_type['permissions']
result = proj_coll.update_many(
{'category': 'home', 'node_types.name': 'comment'},
{'$set': {'node_types.$.permissions': perms}})
print('Matched %i documents' % result.matched_count)
print('Updated %i documents' % result.modified_count)
@manager.command
def clear_db():
"""Wipes the database
"""
from pymongo import MongoClient
client = MongoClient(MONGO_HOST, 27017)
db = client.eve
db.drop_collection('nodes')
db.drop_collection('node_types')
db.drop_collection('tokens')
db.drop_collection('users')
@manager.command
def add_parent_to_nodes():
"""Find the parent of any node in the nodes collection"""
import codecs
import sys
UTF8Writer = codecs.getwriter('utf8')
sys.stdout = UTF8Writer(sys.stdout)
nodes_collection = app.data.driver.db['nodes']
def find_parent_project(node):
if node and 'parent' in node:
parent = nodes_collection.find_one({'_id': node['parent']})
return find_parent_project(parent)
if node:
return node
else:
return None
nodes = nodes_collection.find()
nodes_index = 0
nodes_orphan = 0
for node in nodes:
nodes_index += 1
if node['node_type'] == ObjectId("55a615cfea893bd7d0489f2d"):
print(u"Skipping project node - {0}".format(node['name']))
else:
project = find_parent_project(node)
if project:
nodes_collection.update({'_id': node['_id']},
{"$set": {'project': project['_id']}})
print(u"{0} {1}".format(node['_id'], node['name']))
else:
nodes_orphan += 1
nodes_collection.remove({'_id': node['_id']})
print("Removed {0} {1}".format(node['_id'], node['name']))
print("Edited {0} nodes".format(nodes_index))
print("Orphan {0} nodes".format(nodes_orphan))
@manager.command
def make_project_public(project_id):
"""Convert every node of a project from pending to public"""
DRY_RUN = False
nodes_collection = app.data.driver.db['nodes']
for n in nodes_collection.find({'project': ObjectId(project_id)}):
n['properties']['status'] = 'published'
print(u"Publishing {0} {1}".format(n['_id'], n['name'].encode('ascii', 'ignore')))
if not DRY_RUN:
put_item('nodes', n)
@manager.command
def set_attachment_names():
"""Loop through all existing nodes and assign proper ContentDisposition
metadata to referenced files that are using GCS.
"""
from pillar.api.utils.gcs import update_file_name
nodes_collection = app.data.driver.db['nodes']
for n in nodes_collection.find():
print("Updating node {0}".format(n['_id']))
update_file_name(n)
@manager.command
def files_verify_project():
"""Verify for missing or conflicting node/file ids"""
nodes_collection = app.data.driver.db['nodes']
files_collection = app.data.driver.db['files']
issues = dict(missing=[], conflicting=[], processing=[])
def _parse_file(item, file_id):
f = files_collection.find_one({'_id': file_id})
if f:
if 'project' in item and 'project' in f:
if item['project'] != f['project']:
issues['conflicting'].append(item['_id'])
if 'status' in item['properties'] \
and item['properties']['status'] == 'processing':
issues['processing'].append(item['_id'])
else:
issues['missing'].append(
"{0} missing {1}".format(item['_id'], file_id))
for item in nodes_collection.find():
print("Verifying node {0}".format(item['_id']))
if 'file' in item['properties']:
_parse_file(item, item['properties']['file'])
elif 'files' in item['properties']:
for f in item['properties']['files']:
_parse_file(item, f['file'])
print("===")
print("Issues detected:")
for k, v in issues.iteritems():
print("{0}:".format(k))
for i in v:
print(i)
print("===")
def replace_node_type(project, node_type_name, new_node_type):
"""Update or create the specified node type. We rely on the fact that
node_types have a unique name in a project.
"""
old_node_type = next(
(item for item in project['node_types'] if item.get('name') \
and item['name'] == node_type_name), None)
if old_node_type:
for i, v in enumerate(project['node_types']):
if v['name'] == node_type_name:
project['node_types'][i] = new_node_type
else:
project['node_types'].append(new_node_type)
@manager.command
def project_upgrade_node_types(project_id):
projects_collection = app.data.driver.db['projects']
project = projects_collection.find_one({'_id': ObjectId(project_id)})
replace_node_type(project, 'group', node_type_group)
replace_node_type(project, 'asset', node_type_asset)
replace_node_type(project, 'storage', node_type_storage)
replace_node_type(project, 'comment', node_type_comment)
replace_node_type(project, 'blog', node_type_blog)
replace_node_type(project, 'post', node_type_post)
replace_node_type(project, 'texture', node_type_texture)
put_item('projects', project)
@manager.command
def test_put_item(node_id):
import pprint
nodes_collection = app.data.driver.db['nodes']
node = nodes_collection.find_one(ObjectId(node_id))
pprint.pprint(node)
put_item('nodes', node)
@manager.command
def test_post_internal(node_id):
import pprint
nodes_collection = app.data.driver.db['nodes']
node = nodes_collection.find_one(ObjectId(node_id))
internal_fields = ['_id', '_etag', '_updated', '_created']
for field in internal_fields:
node.pop(field, None)
pprint.pprint(node)
print(post_internal('nodes', node))
@manager.command
def algolia_push_users():
"""Loop through all users and push them to Algolia"""
from pillar.api.utils.algolia import algolia_index_user_save
users_collection = app.data.driver.db['users']
for user in users_collection.find():
print("Pushing {0}".format(user['username']))
algolia_index_user_save(user)
@manager.command
def algolia_push_nodes():
"""Loop through all nodes and push them to Algolia"""
from pillar.api.utils.algolia import algolia_index_node_save
nodes_collection = app.data.driver.db['nodes']
for node in nodes_collection.find():
print(u"Pushing {0}: {1}".format(node['_id'], node['name'].encode(
'ascii', 'ignore')))
algolia_index_node_save(node)
@manager.command
def files_make_public_t():
"""Loop through all files and if they are images on GCS, make the size t
public
"""
from gcloud.exceptions import InternalServerError
from pillar.api.utils.gcs import GoogleCloudStorageBucket
files_collection = app.data.driver.db['files']
for f in files_collection.find({'backend': 'gcs'}):
if 'variations' not in f:
continue
variation_t = next((item for item in f['variations']
if item['size'] == 't'), None)
if not variation_t:
continue
try:
storage = GoogleCloudStorageBucket(str(f['project']))
blob = storage.Get(variation_t['file_path'], to_dict=False)
if not blob:
print('Unable to find blob for project %s file %s' % (f['project'], f['_id']))
continue
print('Making blob public: {0}'.format(blob.path))
blob.make_public()
except InternalServerError as ex:
print('Internal Server Error: ', ex)
@manager.command
def subscribe_node_owners():
"""Automatically subscribe node owners to notifications for items created
in the past.
"""
from pillar.api.nodes import after_inserting_nodes
nodes_collection = app.data.driver.db['nodes']
for n in nodes_collection.find():
if 'parent' in n:
after_inserting_nodes([n])
@manager.command
def refresh_project_links(project, chunk_size=50, quiet=False):
"""Regenerates almost-expired file links for a certain project."""
if quiet:
import logging
from pillar import log
logging.getLogger().setLevel(logging.WARNING)
log.setLevel(logging.WARNING)
chunk_size = int(chunk_size) # CLI parameters are passed as strings
from pillar.api import file_storage
file_storage.refresh_links_for_project(project, chunk_size, 2 * 3600)
@manager.command
def register_local_user(email, password):
from pillar.api.local_auth import create_local_user
create_local_user(email, password)
@manager.command
def add_group_to_projects(group_name):
"""Prototype to add a specific group, in read-only mode, to all node_types
for all projects.
"""
methods = ['GET']
groups_collection = app.data.driver.db['groups']
projects_collections = app.data.driver.db['projects']
group = groups_collection.find_one({'name': group_name})
for project in projects_collections.find():
print("Processing: {}".format(project['name']))
for node_type in project['node_types']:
node_type_name = node_type['name']
base_node_types = ['group', 'asset', 'blog', 'post', 'page',
'comment', 'group_texture', 'storage', 'texture']
if node_type_name in base_node_types:
print("Processing: {0}".format(node_type_name))
# Check if group already exists in the permissions
g = next((g for g in node_type['permissions']['groups']
if g['group'] == group['_id']), None)
# If not, we add it
if g is None:
print("Adding permissions")
permissions = {
'group': group['_id'],
'methods': methods}
node_type['permissions']['groups'].append(permissions)
projects_collections.update(
{'_id': project['_id']}, project)
@manager.command
def add_license_props():
"""Add license fields to all node types asset for every project."""
projects_collections = app.data.driver.db['projects']
for project in projects_collections.find():
print("Processing {}".format(project['_id']))
for node_type in project['node_types']:
if node_type['name'] == 'asset':
node_type['dyn_schema']['license_notes'] = {'type': 'string'}
node_type['dyn_schema']['license_type'] = {
'type': 'string',
'allowed': [
'cc-by',
'cc-0',
'cc-by-sa',
'cc-by-nd',
'cc-by-nc',
'copyright'
],
'default': 'cc-by'
}
node_type['form_schema']['license_notes'] = {}
node_type['form_schema']['license_type'] = {}
projects_collections.update(
{'_id': project['_id']}, project)
@manager.command
def refresh_file_sizes():
"""Computes & stores the 'length_aggregate_in_bytes' fields of all files."""
from pillar.api import file_storage
matched = 0
unmatched = 0
total_size = 0
files_collection = app.data.driver.db['files']
for file_doc in files_collection.find():
file_storage.compute_aggregate_length(file_doc)
length = file_doc['length_aggregate_in_bytes']
total_size += length
result = files_collection.update_one({'_id': file_doc['_id']},
{'$set': {'length_aggregate_in_bytes': length}})
if result.matched_count != 1:
log.warning('Unable to update document %s', file_doc['_id'])
unmatched += 1
else:
matched += 1
log.info('Updated %i file documents.', matched)
if unmatched:
log.warning('Unable to update %i documents.', unmatched)
log.info('%i bytes (%.3f GiB) storage used in total.',
total_size, total_size / 1024 ** 3)
@manager.command
def project_stats():
import csv
import sys
from collections import defaultdict
from functools import partial
from pillar.api import projects
proj_coll = app.data.driver.db['projects']
nodes = app.data.driver.db['nodes']
aggr = defaultdict(partial(defaultdict, int))
csvout = csv.writer(sys.stdout)
csvout.writerow(['project ID', 'owner', 'private', 'file size',
'nr of nodes', 'nr of top-level nodes', ])
for proj in proj_coll.find(projection={'user': 1,
'name': 1,
'is_private': 1,
'_id': 1}):
project_id = proj['_id']
is_private = proj.get('is_private', False)
row = [str(project_id),
unicode(proj['user']).encode('utf-8'),
is_private]
file_size = projects.project_total_file_size(project_id)
row.append(file_size)
node_count_result = nodes.aggregate([
{'$match': {'project': project_id}},
{'$project': {'parent': 1,
'is_top': {'$cond': [{'$gt': ['$parent', None]}, 0, 1]},
}},
{'$group': {
'_id': None,
'all': {'$sum': 1},
'top': {'$sum': '$is_top'},
}}
])
try:
node_counts = next(node_count_result)
nodes_all = node_counts['all']
nodes_top = node_counts['top']
except StopIteration:
# No result from the nodes means nodeless project.
nodes_all = 0
nodes_top = 0
row.append(nodes_all)
row.append(nodes_top)
for collection in aggr[None], aggr[is_private]:
collection['project_count'] += 1
collection['file_size'] += file_size
collection['node_count'] += nodes_all
collection['top_nodes'] += nodes_top
csvout.writerow(row)
csvout.writerow([
'public', '', '%i projects' % aggr[False]['project_count'],
aggr[False]['file_size'], aggr[False]['node_count'], aggr[False]['top_nodes'],
])
csvout.writerow([
'private', '', '%i projects' % aggr[True]['project_count'],
aggr[True]['file_size'], aggr[True]['node_count'], aggr[True]['top_nodes'],
])
csvout.writerow([
'total', '', '%i projects' % aggr[None]['project_count'],
aggr[None]['file_size'], aggr[None]['node_count'], aggr[None]['top_nodes'],
])
@manager.command
def add_node_types():
"""Add texture and group_texture node types to all projects"""
from pillar.api.node_types.texture import node_type_texture
from pillar.api.node_types.group_texture import node_type_group_texture
from pillar.api.utils import project_get_node_type
projects_collections = app.data.driver.db['projects']
for project in projects_collections.find():
print("Processing {}".format(project['_id']))
if not project_get_node_type(project, 'group_texture'):
project['node_types'].append(node_type_group_texture)
print("Added node type: {}".format(node_type_group_texture['name']))
if not project_get_node_type(project, 'texture'):
project['node_types'].append(node_type_texture)
print("Added node type: {}".format(node_type_texture['name']))
projects_collections.update(
{'_id': project['_id']}, project)
@manager.command
def update_texture_node_type():
"""Update allowed values for textures node_types"""
projects_collections = app.data.driver.db['projects']
for project in projects_collections.find():
print("Processing {}".format(project['_id']))
for node_type in project['node_types']:
if node_type['name'] == 'texture':
allowed = [
'color',
'specular',
'bump',
'normal',
'translucency',
'emission',
'alpha'
]
node_type['dyn_schema']['files']['schema']['schema']['map_type'][
'allowed'] = allowed
projects_collections.update(
{'_id': project['_id']}, project)
@manager.command
def update_texture_nodes_maps():
"""Update abbreviated texture map types to the extended version"""
nodes_collection = app.data.driver.db['nodes']
remap = {
'col': 'color',
'spec': 'specular',
'nor': 'normal'}
for node in nodes_collection.find({'node_type': 'texture'}):
for v in node['properties']['files']:
try:
updated_map_types = remap[v['map_type']]
print("Updating {} to {}".format(v['map_type'], updated_map_types))
v['map_type'] = updated_map_types
except KeyError:
print("Skipping {}".format(v['map_type']))
nodes_collection.update({'_id': node['_id']}, node)
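# Hedged illustration: a new maintenance command would follow the same
# Flask-Script @manager.command pattern as the functions above. The command
# name and its argument below are hypothetical.
@manager.command
def count_nodes(node_type):
    """Counts nodes of the given node type (illustrative sketch only)."""
    nodes_collection = app.data.driver.db['nodes']
    count = nodes_collection.find({'node_type': node_type}).count()
    print('Found {0} nodes of type {1}'.format(count, node_type))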
if __name__ == '__main__':
manager.run()

View File

@@ -8,17 +8,18 @@
"license": "GPL",
"devDependencies": {
"gulp": "~3.9.1",
"gulp-sass": "~2.3.1",
"gulp-autoprefixer": "~2.3.1",
"gulp-cached": "~1.1.0",
"gulp-chmod": "~1.3.0",
"gulp-concat": "~2.6.0",
"gulp-if": "^2.0.1",
"gulp-jade": "~1.1.0",
"gulp-sourcemaps": "~1.6.0",
"gulp-plumber": "~1.1.0",
"gulp-livereload": "~3.8.1",
"gulp-concat": "~2.6.0",
"gulp-uglify": "~1.5.3",
"gulp-plumber": "~1.1.0",
"gulp-rename": "~1.2.2",
"gulp-chmod": "~1.3.0",
"gulp-sass": "~2.3.1",
"gulp-sourcemaps": "~1.6.0",
"gulp-uglify": "~1.5.3",
"minimist": "^1.2.0"
}
}

View File

@@ -1,21 +1,24 @@
"""Pillar server."""
import collections
import copy
import json
import logging
import logging.config
import subprocess
import tempfile
import jinja2
import os
import os.path
import jinja2
from eve import Eve
import flask
from flask import render_template, request
from flask.templating import TemplateNotFound
from pillar.api import custom_field_validation
from pillar.api.utils import authentication
from pillar.api.utils import gravatar
from pillar.web.utils import pretty_date
from pillar.web.nodes.routes import url_for_node
import pillar.web.jinja
from . import api
from . import web
@@ -35,6 +38,10 @@ class PillarServer(Eve):
kwargs.setdefault('validator', custom_field_validation.ValidateCustomFields)
super(PillarServer, self).__init__(settings=empty_settings, **kwargs)
# mapping from extension name to extension object.
self.pillar_extensions = collections.OrderedDict()
self.pillar_extensions_template_paths = [] # list of paths
self.app_root = os.path.abspath(app_root)
self._load_flask_config()
self._config_logging()
@@ -178,8 +185,19 @@ class PillarServer(Eve):
def load_extension(self, pillar_extension, url_prefix):
from .extension import PillarExtension
self.log.info('Initialising extension %r', pillar_extension)
assert isinstance(pillar_extension, PillarExtension)
if not isinstance(pillar_extension, PillarExtension):
if self.config.get('DEBUG'):
for cls in type(pillar_extension).mro():
self.log.error('class %42r (%i) is %42r (%i): %s',
cls, id(cls), PillarExtension, id(PillarExtension),
cls is PillarExtension)
raise AssertionError('Extension has wrong type %r' % type(pillar_extension))
self.log.info('Loading extension %s', pillar_extension.name)
# Remember this extension, and disallow duplicates.
if pillar_extension.name in self.pillar_extensions:
raise ValueError('Extension with name %s already loaded' % pillar_extension.name)
self.pillar_extensions[pillar_extension.name] = pillar_extension
# Load extension Flask configuration
for key, value in pillar_extension.flask_config():
@@ -187,25 +205,51 @@ class PillarServer(Eve):
# Load extension blueprint(s)
for blueprint in pillar_extension.blueprints():
self.register_blueprint(blueprint, url_prefix=url_prefix)
if blueprint.url_prefix:
blueprint_prefix = url_prefix + blueprint.url_prefix
else:
blueprint_prefix = url_prefix
self.register_blueprint(blueprint, url_prefix=blueprint_prefix)
# Load template paths
tpath = pillar_extension.template_path
if tpath:
self.log.info('Extension %s: adding template path %s',
pillar_extension.name, tpath)
if not os.path.exists(tpath):
raise ValueError('Template path %s for extension %s does not exist.'
                 % (tpath, pillar_extension.name))
self.pillar_extensions_template_paths.append(tpath)
# Load extension Eve settings
eve_settings = pillar_extension.eve_settings()
if 'DOMAIN' in eve_settings:
pillar_ext_prefix = pillar_extension.name + '_'
pillar_url_prefix = pillar_extension.name + '/'
for key, collection in eve_settings['DOMAIN'].items():
source = '%s.%s' % (pillar_extension.name, key)
url = '%s/%s' % (pillar_extension.name, key)
assert key.startswith(pillar_ext_prefix), \
'Eve collection names of %s MUST start with %r' % \
(pillar_extension.name, pillar_ext_prefix)
url = key.replace(pillar_ext_prefix, pillar_url_prefix)
collection.setdefault('datasource', {}).setdefault('source', source)
collection.setdefault('datasource', {}).setdefault('source', key)
collection.setdefault('url', url)
self.config['DOMAIN'].update(eve_settings['DOMAIN'])
def _config_jinja_env(self):
# Start with the extensions...
paths_list = [
jinja2.FileSystemLoader(path)
for path in reversed(self.pillar_extensions_template_paths)
]
# ...then load Pillar paths.
pillar_dir = os.path.dirname(os.path.realpath(__file__))
parent_theme_path = os.path.join(pillar_dir, 'web', 'templates')
current_path = os.path.join(self.app_root, 'templates')
paths_list = [
paths_list += [
jinja2.FileSystemLoader(current_path),
jinja2.FileSystemLoader(parent_theme_path),
self.jinja_loader
@@ -216,34 +260,30 @@ class PillarServer(Eve):
custom_jinja_loader = jinja2.ChoiceLoader(paths_list)
self.jinja_loader = custom_jinja_loader
def format_pretty_date(d):
return pretty_date(d)
def format_pretty_date_time(d):
return pretty_date(d, detail=True)
self.jinja_env.filters['pretty_date'] = format_pretty_date
self.jinja_env.filters['pretty_date_time'] = format_pretty_date_time
self.jinja_env.globals['url_for_node'] = url_for_node
pillar.web.jinja.setup_jinja_env(self.jinja_env)
def _config_static_dirs(self):
pillar_dir = os.path.dirname(os.path.realpath(__file__))
# Setup static folder for the instanced app
self.static_folder = os.path.join(self.app_root, 'static')
# Setup static folder for Pillar
self.pillar_static_folder = os.path.join(pillar_dir, 'web', 'static')
pillar_dir = os.path.dirname(os.path.realpath(__file__))
pillar_static_folder = os.path.join(pillar_dir, 'web', 'static')
self.register_static_file_endpoint('/static/pillar', 'static_pillar', pillar_static_folder)
from flask.views import MethodView
from flask import send_from_directory
from flask import current_app
# Setup static folders for extensions
for name, ext in self.pillar_extensions.items():
if not ext.static_path:
continue
self.register_static_file_endpoint('/static/%s' % name,
'static_%s' % name,
ext.static_path)
class PillarStaticFile(MethodView):
def get(self, filename):
return send_from_directory(current_app.pillar_static_folder,
filename)
def register_static_file_endpoint(self, url_prefix, endpoint_name, static_folder):
from pillar.web.static import PillarStaticFile
self.add_url_rule('/static/pillar/<path:filename>',
view_func=PillarStaticFile.as_view('static_pillar'))
view_func = PillarStaticFile.as_view(endpoint_name, static_folder=static_folder)
self.add_url_rule('%s/<path:filename>' % url_prefix, view_func=view_func)
def process_extensions(self):
# Re-initialise Eve after we allowed Pillar submodules to be loaded.
@@ -268,6 +308,132 @@ class PillarServer(Eve):
self.finish_startup()
def register_error_handlers(self):
super(PillarServer, self).register_error_handlers()
# Register error handlers per code.
for code in (403, 404, 412, 500):
self.register_error_handler(code, self.pillar_error_handler)
# Register error handlers per exception.
from pillarsdk import exceptions as sdk_exceptions
sdk_handlers = [
(sdk_exceptions.UnauthorizedAccess, self.handle_sdk_unauth),
(sdk_exceptions.ForbiddenAccess, self.handle_sdk_forbidden),
(sdk_exceptions.ResourceNotFound, self.handle_sdk_resource_not_found),
(sdk_exceptions.ResourceInvalid, self.handle_sdk_resource_invalid),
(sdk_exceptions.MethodNotAllowed, self.handle_sdk_method_not_allowed),
(sdk_exceptions.PreconditionFailed, self.handle_sdk_precondition_failed),
]
for (eclass, handler) in sdk_handlers:
self.register_error_handler(eclass, handler)
def handle_sdk_unauth(self, error):
"""Global exception handling for pillarsdk UnauthorizedAccess
Currently the api is fully locked down so we need to constantly
check for user authorization.
"""
return flask.redirect(flask.url_for('users.login'))
def handle_sdk_forbidden(self, error):
self.log.info('Forwarding ForbiddenAccess exception to client: %s', error, exc_info=True)
error.code = 403
return self.pillar_error_handler(error)
def handle_sdk_resource_not_found(self, error):
self.log.info('Forwarding ResourceNotFound exception to client: %s', error, exc_info=True)
content = getattr(error, 'content', None)
if content:
try:
error_content = json.loads(content)
except ValueError:
error_content = None
if error_content and error_content.get('_deleted', False):
# This document used to exist, but doesn't any more. Let the user know.
doc_name = error_content.get('name')
node_type = error_content.get('node_type')
if node_type:
node_type = node_type.replace('_', ' ').title()
if doc_name:
description = u'%s "%s" was deleted.' % (node_type, doc_name)
else:
description = u'This %s was deleted.' % (node_type, )
else:
if doc_name:
description = u'"%s" was deleted.' % doc_name
else:
description = None
error.description = description
error.code = 404
return self.pillar_error_handler(error)
def handle_sdk_precondition_failed(self, error):
self.log.info('Forwarding PreconditionFailed exception to client: %s', error)
error.code = 412
return self.pillar_error_handler(error)
def handle_sdk_resource_invalid(self, error):
self.log.info('Forwarding ResourceInvalid exception to client: %s', error, exc_info=True)
# Raising a Werkzeug 422 exception doesn't work, as Flask turns it into a 500.
return 'The submitted data could not be validated.', 422
def handle_sdk_method_not_allowed(self, error):
"""Forwards 405 Method Not Allowed to the client.
This is actually not fair, as a 405 between Pillar and Pillar-Web
doesn't imply that the request the client did on Pillar-Web is not
allowed. However, it does allow us to debug this if it happens, by
watching for 405s in the browser.
"""
from flask import request
self.log.info('Forwarding MethodNotAllowed exception to client: %s', error, exc_info=True)
self.log.info('HTTP Referer is %r', request.referrer)
# Raising a Werkzeug 405 exception doesn't work, as Flask turns it into a 500.
return 'The requested HTTP method is not allowed on this URL.', 405
def pillar_error_handler(self, error_ob):
# 'error_ob' can be any exception. If it's not a Werkzeug exception,
# handle it as a 500.
if not hasattr(error_ob, 'code'):
error_ob.code = 500
if not hasattr(error_ob, 'description'):
error_ob.description = str(error_ob)
if request.full_path.startswith('/%s/' % self.config['URL_PREFIX']):
from pillar.api.utils import jsonify
# This is an API request, so respond in JSON.
return jsonify({
'_status': 'ERR',
'_code': error_ob.code,
'_message': error_ob.description,
}, status=error_ob.code)
# See whether we should return an embedded page or a regular one.
if request.is_xhr:
fname = 'errors/%i_embed.html' % error_ob.code
else:
fname = 'errors/%i.html' % error_ob.code
# Also handle the case where we didn't create a template for this error.
try:
return render_template(fname, description=error_ob.description), error_ob.code
except TemplateNotFound:
self.log.warning('Error template %s for code %i not found',
fname, error_ob.code)
return render_template('errors/500.html'), error_ob.code
def finish_startup(self):
self.log.info('Using MongoDB database %r', self.config['MONGO_DBNAME'])
@@ -275,6 +441,10 @@ class PillarServer(Eve):
web.setup_app(self)
authentication.setup_app(self)
for ext in self.pillar_extensions.itervalues():
self.log.info('Setting up extension %s', ext.name)
ext.setup_app(self)
self._config_jinja_env()
self._config_static_dirs()
@@ -372,3 +542,23 @@ class PillarServer(Eve):
links.sort(key=lambda t: len(t[0]) + 100 * ('/api/' in t[0]))
pprint(links)
def db(self):
"""Returns the MongoDB database.
:rtype: flask_pymongo.PyMongo
"""
return self.data.driver.db
def extension_sidebar_links(self, project):
"""Returns the sidebar links for the given projects.
:returns: HTML as a string for the sidebar.
"""
if not project:
return ''
return jinja2.Markup(''.join(ext.sidebar_links(project)
for ext in self.pillar_extensions.values()))

View File

@@ -1,6 +1,10 @@
import logging
from flask import g, request, current_app
from pillar.api.utils import gravatar
log = logging.getLogger(__name__)
def notification_parse(notification):
activities_collection = current_app.data.driver.db['activities']
@@ -13,6 +17,11 @@ def notification_parse(notification):
if activity is None or activity['object_type'] != 'node':
return
node = nodes_collection.find_one({'_id': activity['object']})
if not node:
# This can happen when a notification is generated and then the
# node is deleted.
return
# Initial support only for node_type comments
if node['node_type'] != 'comment':
return
@@ -131,27 +140,71 @@ def activity_object_add(actor_user_id, verb, object_type, object_id,
subscriptions = notification_get_subscriptions(
context_object_type, context_object_id, actor_user_id)
if subscriptions.count() > 0:
activity = dict(
actor_user=actor_user_id,
verb=verb,
object_type=object_type,
object=object_id,
context_object_type=context_object_type,
context_object=context_object_id
)
if subscriptions.count() == 0:
return
activity = current_app.post_internal('activities', activity)
if activity[3] != 201:
info, status = register_activity(actor_user_id, verb, object_type, object_id,
context_object_type, context_object_id)
if status != 201:
# If creation failed for any reason, do not create any notification
return
for subscription in subscriptions:
notification = dict(
user=subscription['user'],
activity=activity[0]['_id'])
activity=info['_id'])
current_app.post_internal('notifications', notification)
def register_activity(actor_user_id, verb, object_type, object_id,
context_object_type, context_object_id,
project_id=None,
node_type=None):
"""Registers an activity.
This works using the following pattern:
ACTOR -> VERB -> OBJECT -> CONTEXT
:param actor_user_id: id of the user who is changing the object
:param verb: the action on the object ('commented', 'replied')
:param object_type: hardcoded name, see database schema
:param object_id: object id, to be traced with object_type
:param context_object_type: the type of the context object, like 'project' or 'node',
see database schema
:param context_object_id:
:param project_id: optional project ID to make the activity easily queryable
per project.
:param node_type: optional, node type of the node receiving the activity.
:returns: tuple (info, status_code), where a successful operation should have
status_code=201. If it is not 201, a warning is logged.
"""
activity = {
'actor_user': actor_user_id,
'verb': verb,
'object_type': object_type,
'object': object_id,
'context_object_type': context_object_type,
'context_object': context_object_id}
if project_id:
activity['project'] = project_id
if node_type:
activity['node_type'] = node_type
info, _, _, status_code = current_app.post_internal('activities', activity)
if status_code != 201:
log.error('register_activity: code %i creating activity %s: %s',
status_code, activity, info)
else:
log.info('register_activity: user %s "%s" on %s %s, context %s %s',
actor_user_id, verb, object_type, object_id,
context_object_type, context_object_id)
return info, status_code
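# Hedged usage sketch for register_activity() above; the function below is
# illustrative only, and the ids come from whatever calling context applies.
def _example_register_activity_usage(actor_user_id, comment_id, parent_node_id,
                                     project_id):
    info, status = register_activity(
        actor_user_id, u'commented', u'node', comment_id,
        u'node', parent_node_id,
        project_id=project_id, node_type=u'comment')
    if status != 201:
        return None  # register_activity() already logged the failure
    return info['_id']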
def before_returning_item_notifications(response):
if request.args.get('parse'):
notification_parse(response)

View File

@@ -1,7 +1,7 @@
import logging
from bson import ObjectId
from datetime import datetime
from bson import ObjectId, tz_util
from datetime import datetime, tzinfo
from eve.io.mongo import Validator
from flask import current_app
@@ -9,31 +9,43 @@ log = logging.getLogger(__name__)
class ValidateCustomFields(Validator):
# TODO: split this into a convert_property(property, schema) and call that from this function.
def convert_properties(self, properties, node_schema):
"""Converts datetime strings and ObjectId strings to actual Python objects."""
date_format = current_app.config['RFC1123_DATE_FORMAT']
for prop in node_schema:
if not prop in properties:
if prop not in properties:
continue
schema_prop = node_schema[prop]
prop_type = schema_prop['type']
if prop_type == 'dict':
properties[prop] = self.convert_properties(
properties[prop], schema_prop['schema'])
if prop_type == 'list':
try:
dict_valueschema = schema_prop['schema']
properties[prop] = self.convert_properties(properties[prop], dict_valueschema)
except KeyError:
dict_valueschema = schema_prop['valueschema']
self.convert_dict_values(properties[prop], dict_valueschema)
elif prop_type == 'list':
if properties[prop] in ['', '[]']:
properties[prop] = []
if 'schema' in schema_prop:
for k, val in enumerate(properties[prop]):
if not 'schema' in schema_prop:
continue
item_schema = {'item': schema_prop['schema']}
item_prop = {'item': properties[prop][k]}
properties[prop][k] = self.convert_properties(
item_prop, item_schema)['item']
# Convert datetime string to RFC1123 datetime
elif prop_type == 'datetime':
prop_val = properties[prop]
properties[prop] = datetime.strptime(prop_val, date_format)
prop_naieve = datetime.strptime(prop_val, date_format)
prop_aware = prop_naieve.replace(tzinfo=tz_util.utc)
properties[prop] = prop_aware
elif prop_type == 'objectid':
prop_val = properties[prop]
if prop_val:
@@ -43,6 +55,20 @@ class ValidateCustomFields(Validator):
return properties
def convert_dict_values(self, dict_property, dict_valueschema):
"""Calls convert_properties() for the values in the dict.
Only validates the dict values, not the keys. Modifies the given dict in-place.
"""
assert dict_valueschema[u'type'] == u'dict'
assert isinstance(dict_property, dict)
for key, val in dict_property.items():
item_schema = {u'item': dict_valueschema}
item_prop = {u'item': val}
dict_property[key] = self.convert_properties(item_prop, item_schema)[u'item']
def _validate_valid_properties(self, valid_properties, field, value):
from pillar.api.utils import project_get_node_type
@@ -72,7 +98,7 @@ class ValidateCustomFields(Validator):
except Exception as e:
log.warning("Error converting form properties", exc_info=True)
v = Validator(node_type['dyn_schema'])
v = self.__class__(schema=node_type['dyn_schema'])
val = v.validate(value)
if val:
@@ -80,3 +106,22 @@ class ValidateCustomFields(Validator):
log.warning('Error validating properties for node %s: %s', self.document, v.errors)
self._error(field, "Error validating properties")
def _validate_required_after_creation(self, required_after_creation, field, value):
"""Makes a value required after creation only.
Combine "required_after_creation=True" with "required=False" to allow
pre-insert hooks to set default values.
"""
if not required_after_creation:
# Setting required_after_creation=False is the same as not mentioning this
# validator at all.
return
if self._id is None:
# This is a creation call, in which case this validator shouldn't run.
return
if not value:
self._error(field, "Value is required once the document was created")

View File

@@ -121,6 +121,7 @@ users_schema = {
},
'service': {
'type': 'dict',
'allow_unknown': True,
'schema': {
'badger': {
'type': 'list',
@@ -623,7 +624,16 @@ projects_schema = {
'permissions': {
'type': 'dict',
'schema': permissions_embedded_schema
}
},
# Properties defined by extensions. Extensions should use their name
# (see the PillarExtension.name property) as the key, and are free to
# use whatever they want as value (but we suggest a dict for future
# extendability).
'extension_props': {
'type': 'dict',
'required': False,
},
}
activities_subscriptions_schema = {
@@ -667,6 +677,19 @@ activities_schema = {
'type': 'objectid',
'required': True
},
'project': {
'type': 'objectid',
'data_relation': {
'resource': 'projects',
'field': '_id',
},
'required': False,
},
# If the object type is 'node', the node type can be stored here.
'node_type': {
'type': 'string',
'required': False,
}
}
notifications_schema = {

View File

@@ -4,11 +4,11 @@ import mimetypes
import tempfile
import uuid
from hashlib import md5
import os
import requests
import bson.tz_util
import datetime
import eve.utils
import os
import pymongo
import werkzeug.exceptions as wz_exceptions
from bson import ObjectId
@@ -307,6 +307,8 @@ def generate_link(backend, file_path, project_id=None, is_public=False):
storage = GoogleCloudStorageBucket(project_id)
blob = storage.Get(file_path)
if blob is None:
log.warning('generate_link(%r, %r): unable to find blob for file path,'
' returning empty link.', backend, file_path)
return ''
if is_public:
@@ -319,8 +321,10 @@ def generate_link(backend, file_path, project_id=None, is_public=False):
if backend == 'cdnsun':
return hash_file_path(file_path, None)
if backend == 'unittest':
return md5(file_path).hexdigest()
return 'https://unit.test/%s' % md5(file_path).hexdigest()
log.warning('generate_link(): Unknown backend %r, returning empty string as new link.',
backend)
return ''
@@ -379,10 +383,10 @@ def ensure_valid_link(response):
else:
log_link.debug('No expiry date for link; generating new link')
_generate_all_links(response, now)
generate_all_links(response, now)
def _generate_all_links(response, now):
def generate_all_links(response, now):
"""Generate a new link for the file and all its variations.
:param response: the file document that should be updated.
@@ -441,7 +445,7 @@ def on_pre_get_files(_, lookup):
cursor = current_app.data.find('files', parsed_req, lookup_expired)
for file_doc in cursor:
# log.debug('Updating expired links for file %r.', file_doc['_id'])
_generate_all_links(file_doc, now)
generate_all_links(file_doc, now)
def refresh_links_for_project(project_uuid, chunk_size, expiry_seconds):
@@ -469,7 +473,7 @@ def refresh_links_for_project(project_uuid, chunk_size, expiry_seconds):
for file_doc in to_refresh:
log.debug('Refreshing links for file %s', file_doc['_id'])
_generate_all_links(file_doc, now)
generate_all_links(file_doc, now)
log.info('Refreshed %i links', min(chunk_size, to_refresh.count()))
@@ -524,7 +528,7 @@ def refresh_links_for_backend(backend_name, chunk_size, expiry_seconds):
log.debug('Refreshing links for file %s', file_id)
try:
_generate_all_links(file_doc, now)
generate_all_links(file_doc, now)
except gcloud.exceptions.Forbidden:
log.warning('Skipping file %s, GCS forbids us access to '
'project %s bucket.', file_id, project_id)
@@ -623,7 +627,7 @@ def assert_file_size_allowed(file_size):
@file_storage.route('/stream/<string:project_id>', methods=['POST', 'OPTIONS'])
@require_login()
def stream_to_gcs(project_id):
def stream_to_storage(project_id):
project_oid = utils.str2id(project_id)
projects = current_app.data.driver.db['projects']
@@ -635,6 +639,14 @@ def stream_to_gcs(project_id):
log.info('Streaming file to bucket for project=%s user_id=%s', project_id,
authentication.current_user_id())
log.info('request.headers[Origin] = %r', request.headers.get('Origin'))
log.info('request.content_length = %r', request.content_length)
# Try a check for the content length before we access request.files[]. This allows us
# to abort the upload early. The entire body content length is always a bit larger than
# the actual file size, so if we accept here, we're sure it'll be accepted in subsequent
# checks as well.
if request.content_length:
assert_file_size_allowed(request.content_length)
uploaded_file = request.files['file']
@@ -663,7 +675,8 @@ def stream_to_gcs(project_id):
# Figure out the file size, as we need to pass this in explicitly to GCloud.
# Otherwise it always uses os.fstat(file_obj.fileno()).st_size, which isn't
# supported by a BytesIO object (even though it does have a fileno attribute).
# supported by a BytesIO object (even though it does have a fileno
# attribute).
if isinstance(stream_for_gcs, io.BytesIO):
file_size = len(stream_for_gcs.getvalue())
else:
@@ -673,41 +686,22 @@ def stream_to_gcs(project_id):
assert_file_size_allowed(file_size)
# Create file document in MongoDB.
file_id, internal_fname, status = create_file_doc_for_upload(project_oid, uploaded_file)
file_id, internal_fname, status = create_file_doc_for_upload(project_oid,
uploaded_file)
if current_app.config['TESTING']:
log.warning('NOT streaming to GCS because TESTING=%r', current_app.config['TESTING'])
log.warning('NOT streaming to GCS because TESTING=%r',
current_app.config['TESTING'])
# Fake a Blob object.
gcs = None
blob = type('Blob', (), {'size': file_size})
else:
# Upload the file to GCS.
from gcloud.streaming import transfer
blob, gcs = stream_to_gcs(file_id, file_size, internal_fname,
project_id, stream_for_gcs,
uploaded_file.mimetype)
log.debug('Streaming file to GCS bucket; id=%s, fname=%s, size=%i',
file_id, internal_fname, file_size)
# Files larger than this many bytes will be streamed directly from disk, smaller
# ones will be read into memory and then uploaded.
transfer.RESUMABLE_UPLOAD_THRESHOLD = 102400
try:
gcs = GoogleCloudStorageBucket(project_id)
blob = gcs.bucket.blob('_/' + internal_fname, chunk_size=256 * 1024 * 2)
blob.upload_from_file(stream_for_gcs, size=file_size,
content_type=uploaded_file.mimetype)
except Exception:
log.exception('Error uploading file to Google Cloud Storage (GCS),'
' aborting handling of uploaded file (id=%s).', file_id)
update_file_doc(file_id, status='failed')
raise wz_exceptions.InternalServerError('Unable to stream file to Google Cloud Storage')
if stream_for_gcs.closed:
log.error('Eek, GCS closed its stream, Andy is not going to like this.')
# Reload the blob to get the file size according to Google.
blob.reload()
log.debug('Marking uploaded file id=%s, fname=%s, size=%i as "queued_for_processing"',
log.debug('Marking uploaded file id=%s, fname=%s, '
'size=%i as "queued_for_processing"',
file_id, internal_fname, blob.size)
update_file_doc(file_id,
status='queued_for_processing',
@@ -715,7 +709,8 @@ def stream_to_gcs(project_id):
length=blob.size,
content_type=uploaded_file.mimetype)
log.debug('Processing uploaded file id=%s, fname=%s, size=%i', file_id, internal_fname, blob.size)
log.debug('Processing uploaded file id=%s, fname=%s, size=%i', file_id,
internal_fname, blob.size)
process_file(gcs, file_id, local_file)
# Local processing is done, we can close the local file so it is removed.
@@ -725,7 +720,8 @@ def stream_to_gcs(project_id):
log.debug('Handled uploaded file id=%s, fname=%s, size=%i, status=%i',
file_id, internal_fname, blob.size, status)
# Status is 200 if the file already existed, and 201 if it was newly created.
# Status is 200 if the file already existed, and 201 if it was newly
# created.
# TODO: add a link to a thumbnail in the response.
resp = jsonify(status='ok', file_id=str(file_id))
resp.status_code = status
@@ -733,6 +729,32 @@ def stream_to_gcs(project_id):
return resp
def stream_to_gcs(file_id, file_size, internal_fname, project_id,
stream_for_gcs, content_type):
# Upload the file to GCS.
from gcloud.streaming import transfer
log.debug('Streaming file to GCS bucket; id=%s, fname=%s, size=%i',
file_id, internal_fname, file_size)
# Files larger than this many bytes will be streamed directly from disk,
# smaller ones will be read into memory and then uploaded.
transfer.RESUMABLE_UPLOAD_THRESHOLD = 102400
try:
gcs = GoogleCloudStorageBucket(project_id)
blob = gcs.bucket.blob('_/' + internal_fname, chunk_size=256 * 1024 * 2)
blob.upload_from_file(stream_for_gcs, size=file_size,
content_type=content_type)
except Exception:
log.exception('Error uploading file to Google Cloud Storage (GCS),'
' aborting handling of uploaded file (id=%s).', file_id)
update_file_doc(file_id, status='failed')
raise wz_exceptions.InternalServerError(
'Unable to stream file to Google Cloud Storage')
# Reload the blob to get the file size according to Google.
blob.reload()
return blob, gcs
def add_access_control_headers(resp):
"""Allows cross-site requests from the configured domain."""

View File

@@ -0,0 +1,191 @@
"""Code for moving files between backends."""
import datetime
import logging
import os
import tempfile
from bson import ObjectId
import bson.tz_util
from flask import current_app
import requests
import requests.exceptions
from . import stream_to_gcs, generate_all_links, ensure_valid_link
import pillar.api.utils.gcs
__all__ = ['PrerequisiteNotMetError', 'change_file_storage_backend']
log = logging.getLogger(__name__)
class PrerequisiteNotMetError(RuntimeError):
"""Raised when a file cannot be moved due to unmet prerequisites."""
def change_file_storage_backend(file_id, dest_backend):
"""Given a file document, move it to the specified backend (if not already
there) and update the document to reflect that.
Files on the original backend are not deleted automatically.
"""
dest_backend = unicode(dest_backend)
file_id = ObjectId(file_id)
# Fetch file document
files_collection = current_app.data.driver.db['files']
f = files_collection.find_one(file_id)
if f is None:
raise ValueError('File with _id: {} not found'.format(file_id))
# Check that new backend differs from current one
if dest_backend == f['backend']:
raise PrerequisiteNotMetError('Destination backend ({}) matches the current backend, we '
'are not moving the file'.format(dest_backend))
# TODO Check that new backend is allowed (make conf var)
# Check that the file has a project; without project, we don't know
# which bucket to store the file into.
try:
project_id = f['project']
except KeyError:
raise PrerequisiteNotMetError('File document does not have a project')
# Ensure that all links are up to date before we even attempt a download.
ensure_valid_link(f)
# Upload file and variations to the new backend
variations = f.get('variations', ())
try:
copy_file_to_backend(file_id, project_id, f, f['backend'], dest_backend)
except requests.exceptions.HTTPError as ex:
# allow the main file to be removed from storage.
if ex.response.status_code not in {404, 410}:
raise
if not variations:
raise PrerequisiteNotMetError('Main file ({link}) does not exist on server, '
'and no variations exist either'.format(**f))
log.warning('Main file %s does not exist; skipping main and visiting variations', f['link'])
for var in variations:
copy_file_to_backend(file_id, project_id, var, f['backend'], dest_backend)
# Generate new links for the file & all variations. This also saves
# the new backend we set here.
f['backend'] = dest_backend
now = datetime.datetime.now(tz=bson.tz_util.utc)
generate_all_links(f, now)
def copy_file_to_backend(file_id, project_id, file_or_var, src_backend, dest_backend):
# Filenames on GCS do not contain paths, by our convention
internal_fname = os.path.basename(file_or_var['file_path'])
file_or_var['file_path'] = internal_fname
# If the file is not local already, fetch it
if src_backend == 'pillar':
local_finfo = fetch_file_from_local(file_or_var)
else:
local_finfo = fetch_file_from_link(file_or_var['link'])
# Upload to GCS
if dest_backend != 'gcs':
raise ValueError('Only dest_backend="gcs" is supported now.')
if current_app.config['TESTING']:
log.warning('Skipping actual upload to GCS due to TESTING')
else:
# TODO check for name collisions
stream_to_gcs(file_id, local_finfo['file_size'],
internal_fname=internal_fname,
project_id=str(project_id),
stream_for_gcs=local_finfo['local_file'],
content_type=local_finfo['content_type'])
# No longer needed, so it can be closed & dispersed of.
local_finfo['local_file'].close()
def fetch_file_from_link(link):
"""Utility to download a file from a remote location and return it with
additional info (for upload to a different storage backend).
"""
log.info('Downloading %s', link)
r = requests.get(link, stream=True)
r.raise_for_status()
local_file = tempfile.NamedTemporaryFile(dir=current_app.config['STORAGE_DIR'])
log.info('Downloading to %s', local_file.name)
for chunk in r.iter_content(chunk_size=1024):
if chunk:
local_file.write(chunk)
local_file.seek(0)
file_dict = {
'file_size': os.fstat(local_file.fileno()).st_size,
'content_type': r.headers.get('content-type', 'application/octet-stream'),
'local_file': local_file
}
return file_dict
def fetch_file_from_local(file_doc):
"""Mimicks fetch_file_from_link(), but just returns the local file.
:param file_doc: dict with 'link' key pointing to a path in STORAGE_DIR, and
'content_type' key.
:type file_doc: dict
:rtype: dict
"""
local_file = open(os.path.join(current_app.config['STORAGE_DIR'], file_doc['file_path']), 'rb')
local_finfo = {
'file_size': os.fstat(local_file.fileno()).st_size,
'content_type': file_doc['content_type'],
'local_file': local_file
}
return local_finfo
def gcs_move_to_bucket(file_id, dest_project_id, skip_gcs=False):
"""Moves a file from its own bucket to the new project_id bucket."""
files_coll = current_app.db()['files']
f = files_coll.find_one(file_id)
if f is None:
raise ValueError('File with _id: {} not found'.format(file_id))
# Check that new backend differs from current one
if f['backend'] != 'gcs':
raise ValueError('Only Google Cloud Storage is supported for now.')
# Move file and variations to the new bucket.
if skip_gcs:
log.warning('NOT ACTUALLY MOVING file %s on GCS, just updating MongoDB', file_id)
else:
src_project = f['project']
pillar.api.utils.gcs.copy_to_bucket(f['file_path'], src_project, dest_project_id)
for var in f.get('variations', []):
pillar.api.utils.gcs.copy_to_bucket(var['file_path'], src_project, dest_project_id)
# Update the file document after moving was successful.
log.info('Switching file %s to project %s', file_id, dest_project_id)
update_result = files_coll.update_one({'_id': file_id},
{'$set': {'project': dest_project_id}})
if update_result.matched_count != 1:
raise RuntimeError(
'Unable to update file %s in MongoDB: matched_count=%i; modified_count=%i' % (
file_id, update_result.matched_count, update_result.modified_count))
log.info('Switching file %s: matched_count=%i; modified_count=%i',
file_id, update_result.matched_count, update_result.modified_count)
# Regenerate the links for this file
f['project'] = dest_project_id
generate_all_links(f, now=datetime.datetime.now(tz=bson.tz_util.utc))
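# Hedged usage sketch for change_file_storage_backend() above; the helper and
# its file_id argument are illustrative only.
def _example_move_file_to_gcs(file_id):
    try:
        change_file_storage_backend(file_id, u'gcs')
    except PrerequisiteNotMetError as ex:
        log.warning('Not moving file %s: %s', file_id, ex)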

View File

@@ -102,7 +102,7 @@ def latest_comments():
'properties.content': 1, 'node_type': 1,
'properties.status': 1,
'properties.is_reply': 1},
has_public_project, 6)
has_public_project, 10)
# Embed the comments' parents.
nodes = current_app.data.driver.db['nodes']

View File

@@ -6,3 +6,55 @@ _file_embedded_schema = {
'embeddable': True
}
}
ATTACHMENT_SLUG_REGEX = '[a-zA-Z0-9_ ]+'
_attachments_embedded_schema = {
'type': 'dict',
# TODO: will be renamed to 'keyschema' in Cerberus 1.0
'propertyschema': {
'type': 'string',
'regex': '^%s$' % ATTACHMENT_SLUG_REGEX,
},
'valueschema': {
'type': 'dict',
'schema': {
'oid': {
'type': 'objectid',
'required': True,
},
'link': {
'type': 'string',
'allowed': ['self', 'none', 'custom'],
'default': 'self',
},
'link_custom': {
'type': 'string',
},
'collection': {
'type': 'string',
'allowed': ['files'],
'default': 'files',
},
},
},
}
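# Hedged example of a node's 'attachments' value under the schema above; the
# slug and ObjectId are illustrative only. Note that ATTACHMENT_SLUG_REGEX
# allows letters, digits, underscores and spaces, but not hyphens.
from bson import ObjectId as _ObjectId  # imported here only for the example
_example_attachments = {
    'header_image': {
        'oid': _ObjectId('58b2f1017d2dca5c6b4474f1'),
        'link': 'self',
        'collection': 'files',
    },
}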
# Import after defining the common embedded schemas, to prevent dependency cycles.
from pillar.api.node_types.asset import node_type_asset
from pillar.api.node_types.blog import node_type_blog
from pillar.api.node_types.comment import node_type_comment
from pillar.api.node_types.group import node_type_group
from pillar.api.node_types.group_hdri import node_type_group_hdri
from pillar.api.node_types.group_texture import node_type_group_texture
from pillar.api.node_types.hdri import node_type_hdri
from pillar.api.node_types.page import node_type_page
from pillar.api.node_types.post import node_type_post
from pillar.api.node_types.storage import node_type_storage
from pillar.api.node_types.text import node_type_text
from pillar.api.node_types.texture import node_type_texture
PILLAR_NODE_TYPES = (node_type_asset, node_type_blog, node_type_comment, node_type_group,
node_type_group_hdri, node_type_group_texture, node_type_hdri, node_type_page,
node_type_post, node_type_storage, node_type_text, node_type_texture)
PILLAR_NAMED_NODE_TYPES = {nt['name']: nt for nt in PILLAR_NODE_TYPES}
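# Hedged usage sketch for the mapping above; any name from PILLAR_NODE_TYPES
# works as a key.
_example_comment_node_type = PILLAR_NAMED_NODE_TYPES['comment']
assert _example_comment_node_type is node_type_comment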

View File

@@ -1,5 +0,0 @@
node_type_act = {
'name': 'act',
'description': 'Act node type',
'parent': []
}

View File

@@ -1,4 +1,4 @@
from pillar.api.node_types import _file_embedded_schema
from pillar.api.node_types import _file_embedded_schema, _attachments_embedded_schema
node_type_asset = {
'name': 'asset',
@@ -27,26 +27,7 @@ node_type_asset = {
# We point to the original file (and use it to extract any relevant
# variation useful for our scope).
'file': _file_embedded_schema,
'attachments': {
'type': 'list',
'schema': {
'type': 'dict',
'schema': {
'field': {'type': 'string'},
'files': {
'type': 'list',
'schema': {
'type': 'dict',
'schema': {
'file': _file_embedded_schema,
'slug': {'type': 'string', 'minlength': 1},
'size': {'type': 'string'}
}
}
}
}
}
},
'attachments': _attachments_embedded_schema,
# Tags for search
'tags': {
'type': 'list',
@@ -58,17 +39,29 @@ node_type_asset = {
# this schema: "Root > Nested Category > One More Nested Category"
'categories': {
'type': 'string'
}
},
'license_type': {
'default': 'cc-by',
'type': 'string',
'allowed': [
'cc-by',
'cc-0',
'cc-by-sa',
'cc-by-nd',
'cc-by-nc',
'copyright'
]
},
'license_notes': {
'type': 'string'
},
},
'form_schema': {
'status': {},
'content_type': {'visible': False},
'file': {},
'attachments': {'visible': False},
'order': {'visible': False},
'tags': {'visible': False},
'categories': {'visible': False}
'categories': {'visible': False},
'license_type': {'visible': False},
'license_notes': {'visible': False},
},
'permissions': {
}
}

View File

@@ -18,12 +18,4 @@ node_type_blog = {
'template': {},
},
'parent': ['project',],
'permissions': {
# 'groups': [{
# 'group': app.config['ADMIN_USER_GROUP'],
# 'methods': ['GET', 'PUT', 'POST']
# }],
# 'users': [],
# 'world': ['GET']
}
}

View File

@@ -6,6 +6,11 @@ node_type_comment = {
'content': {
'type': 'string',
'minlength': 5,
'required': True,
},
# The converted-to-HTML content.
'content_html': {
'type': 'string',
},
'status': {
'type': 'string',
@@ -50,16 +55,6 @@ node_type_comment = {
'confidence': {'type': 'float'},
'is_reply': {'type': 'boolean'}
},
'form_schema': {
'content': {},
'status': {},
'rating_positive': {},
'rating_negative': {},
'ratings': {},
'confidence': {},
'is_reply': {}
},
'form_schema': {},
'parent': ['asset', 'comment'],
'permissions': {
}
}

View File

@@ -1,6 +1,6 @@
node_type_group = {
'name': 'group',
'description': 'Generic group node type edited',
'description': 'Folder node type',
'parent': ['group', 'project'],
'dyn_schema': {
# Used for sorting within the context of a group
@@ -24,10 +24,7 @@ node_type_group = {
},
'form_schema': {
'url': {'visible': False},
'status': {},
'notes': {'visible': False},
'order': {'visible': False}
},
'permissions': {
}
}

View File

@@ -15,8 +15,5 @@ node_type_group_hdri = {
],
}
},
'form_schema': {
'status': {},
'order': {}
}
'form_schema': {},
}

View File

@@ -15,8 +15,5 @@ node_type_group_texture = {
],
}
},
'form_schema': {
'status': {},
'order': {}
}
'form_schema': {},
}

View File

@@ -62,5 +62,5 @@ node_type_hdri = {
'content_type': {'visible': False},
'tags': {'visible': False},
'categories': {'visible': False},
}
},
}

View File

@@ -1,4 +1,4 @@
from pillar.api.node_types import _file_embedded_schema
from pillar.api.node_types import _attachments_embedded_schema
node_type_page = {
'name': 'page',
@@ -22,33 +22,10 @@ node_type_page = {
'url': {
'type': 'string'
},
'attachments': {
'type': 'list',
'schema': {
'type': 'dict',
'schema': {
'field': {'type': 'string'},
'files': {
'type': 'list',
'schema': {
'type': 'dict',
'schema': {
'file': _file_embedded_schema,
'slug': {'type': 'string', 'minlength': 1},
'size': {'type': 'string'}
}
}
}
}
}
}
'attachments': _attachments_embedded_schema,
},
'form_schema': {
'content': {},
'status': {},
'url': {},
'attachments': {'visible': False},
},
'parent': ['project', ],
'permissions': {}
}

View File

@@ -1,4 +1,4 @@
from pillar.api.node_types import _file_embedded_schema
from pillar.api.node_types import _attachments_embedded_schema
node_type_post = {
'name': 'post',
@@ -26,34 +26,10 @@ node_type_post = {
'url': {
'type': 'string'
},
'attachments': {
'type': 'list',
'schema': {
'type': 'dict',
'schema': {
'field': {'type': 'string'},
'files': {
'type': 'list',
'schema': {
'type': 'dict',
'schema': {
'file': _file_embedded_schema,
'slug': {'type': 'string', 'minlength': 1},
'size': {'type': 'string'}
}
}
}
}
}
}
'attachments': _attachments_embedded_schema,
},
'form_schema': {
'content': {},
'status': {},
'category': {},
'url': {},
'attachments': {'visible': False},
},
'parent': ['blog', ],
'permissions': {}
}

View File

@@ -1,124 +0,0 @@
from pillar.api.node_types import _file_embedded_schema
node_type_project = {
'name': 'project',
'parent': {},
'description': 'The official project type',
'dyn_schema': {
'category': {
'type': 'string',
'allowed': [
'training',
'film',
'assets',
'software',
'game'
],
'required': True,
},
'is_private': {
'type': 'boolean'
},
'url': {
'type': 'string'
},
'organization': {
'type': 'objectid',
'nullable': True,
'data_relation': {
'resource': 'organizations',
'field': '_id',
'embeddable': True
},
},
'owners': {
'type': 'dict',
'schema': {
'users': {
'type': 'list',
'schema': {
'type': 'objectid',
}
},
'groups': {
'type': 'list',
'schema': {
'type': 'objectid',
'data_relation': {
'resource': 'groups',
'field': '_id',
'embeddable': True
}
}
}
}
},
'status': {
'type': 'string',
'allowed': [
'published',
'pending',
],
},
# Logo
'picture_square': _file_embedded_schema,
# Header
'picture_header': _file_embedded_schema,
# Short summary for the project
'summary': {
'type': 'string',
'maxlength': 128
},
# Latest nodes being edited
'nodes_latest': {
'type': 'list',
'schema': {
'type': 'objectid',
}
},
# Featured nodes, manually added
'nodes_featured': {
'type': 'list',
'schema': {
'type': 'objectid',
}
},
# Latest blog posts, manually added
'nodes_blog': {
'type': 'list',
'schema': {
'type': 'objectid',
}
}
},
'form_schema': {
'is_private': {},
# TODO add group parsing
'category': {},
'url': {},
'organization': {},
'picture_square': {},
'picture_header': {},
'summary': {},
'owners': {
'schema': {
'users': {},
'groups': {
'items': [('Group', 'name')],
},
}
},
'status': {},
'nodes_featured': {},
'nodes_latest': {},
'nodes_blog': {}
},
'permissions': {
# 'groups': [{
# 'group': app.config['ADMIN_USER_GROUP'],
# 'methods': ['GET', 'PUT', 'POST']
# }],
# 'users': [],
# 'world': ['GET']
}
}

View File

@@ -1,5 +0,0 @@
node_type_scene = {
'name': 'scene',
'description': 'Scene node type',
'parent': ['act'],
}

View File

@@ -1,45 +0,0 @@
node_type_shot = {
'name': 'shot',
'description': 'Shot Node Type, for shots',
'dyn_schema': {
'url': {
'type': 'string',
},
'cut_in': {
'type': 'integer'
},
'cut_out': {
'type': 'integer'
},
'status': {
'type': 'string',
'allowed': [
'on_hold',
'todo',
'in_progress',
'review',
'final'
],
},
'notes': {
'type': 'string',
'maxlength': 256,
},
'shot_group': {
'type': 'string',
#'data_relation': {
# 'resource': 'nodes',
# 'field': '_id',
#},
},
},
'form_schema': {
'url': {},
'cut_in': {},
'cut_out': {},
'status': {},
'notes': {},
'shot_group': {}
},
'parent': ['scene']
}

View File

@@ -21,17 +21,6 @@ node_type_storage = {
'type': 'string',
},
},
'form_schema': {
'subdir': {},
'project': {},
'backend': {}
},
'form_schema': {},
'parent': ['group', 'project'],
'permissions': {
# 'groups': [{
# 'group': app.config['ADMIN_USER_GROUP'],
# 'methods': ['GET', 'PUT', 'POST']
# }],
# 'users': [],
}
}

View File

@@ -1,107 +0,0 @@
node_type_task = {
'name': 'task',
'description': 'Task Node Type, for tasks',
'dyn_schema': {
'status': {
'type': 'string',
'allowed': [
'todo',
'in_progress',
'on_hold',
'approved',
'cbb',
'final',
'review'
],
'required': True,
},
'filepath': {
'type': 'string',
},
'revision': {
'type': 'integer',
},
'owners': {
'type': 'dict',
'schema': {
'users': {
'type': 'list',
'schema': {
'type': 'objectid',
}
},
'groups': {
'type': 'list',
'schema': {
'type': 'objectid',
}
}
}
},
'time': {
'type': 'dict',
'schema': {
'start': {
'type': 'datetime'
},
'duration': {
'type': 'integer'
},
'chunks': {
'type': 'list',
'schema': {
'type': 'dict',
'schema': {
'start': {
'type': 'datetime',
},
'duration': {
'type': 'integer',
}
}
}
},
}
},
'is_conflicting' : {
'type': 'boolean'
},
'is_processing' : {
'type': 'boolean'
},
'is_open' : {
'type': 'boolean'
}
},
'form_schema': {
'status': {},
'filepath': {},
'revision': {},
'owners': {
'schema': {
'users':{
'items': [('User', 'first_name')],
},
'groups': {}
}
},
'time': {
'schema': {
'start': {},
'duration': {},
'chunks': {
'visible': False,
'schema': {
'start': {},
'duration': {}
}
}
}
},
'is_conflicting': {},
'is_open': {},
'is_processing': {},
},
'parent': ['shot']
}

View File

@@ -24,5 +24,5 @@ node_type_text = {
},
'form_schema': {
'shared_slug': {'visible': False},
}
},
}

View File

@@ -58,15 +58,8 @@ node_type_texture = {
}
},
'form_schema': {
'status': {},
'content_type': {'visible': False},
'files': {},
'is_tileable': {},
'is_landscape': {},
'resolution': {},
'aspect_ratio': {},
'order': {},
'tags': {'visible': False},
'categories': {'visible': False},
}
},
}

View File

@@ -1,4 +1,5 @@
import base64
import functools
import logging
import urlparse
@@ -7,7 +8,9 @@ import rsa.randnum
import werkzeug.exceptions as wz_exceptions
from bson import ObjectId
from flask import current_app, g, Blueprint, request
from pillar.api import file_storage
import pillar.markdown
from pillar.api.node_types import PILLAR_NAMED_NODE_TYPES
from pillar.api.activities import activity_subscribe, activity_object_add
from pillar.api.utils.algolia import algolia_index_node_delete
from pillar.api.utils.algolia import algolia_index_node_save
@@ -20,6 +23,40 @@ blueprint = Blueprint('nodes_api', __name__)
ROLES_FOR_SHARING = {u'subscriber', u'demo'}
def only_for_node_type_decorator(*required_node_type_names):
"""Returns a decorator that checks its first argument's node type.
If the node type is not of the required node type, returns None,
otherwise calls the wrapped function.
>>> deco = only_for_node_type_decorator('comment')
>>> @deco
... def handle_comment(node): pass
>>> deco = only_for_node_type_decorator('comment', 'post')
>>> @deco
... def handle_comment_or_post(node): pass
"""
# Convert to a set for efficient 'x in required_node_type_names' queries.
required_node_type_names = set(required_node_type_names)
def only_for_node_type(wrapped):
@functools.wraps(wrapped)
def wrapper(node, *args, **kwargs):
if node.get('node_type') not in required_node_type_names:
return
return wrapped(node, *args, **kwargs)
return wrapper
only_for_node_type.__doc__ = "Decorator, immediately returns when " \
"the first argument is not of type %s." % required_node_type_names
return only_for_node_type
@blueprint.route('/<node_id>/share', methods=['GET', 'POST'])
@require_login(require_roles=ROLES_FOR_SHARING)
def share_node(node_id):
@@ -34,6 +71,8 @@ def share_node(node_id):
'node_type': 1,
'short_code': 1
})
if not node:
raise wz_exceptions.NotFound('Node %s does not exist.' % node_id)
check_permissions('nodes', node, request.method)
@@ -133,62 +172,6 @@ def short_link_info(short_code):
}
def item_parse_attachments(response):
"""Before returning a response, check if the 'attachments' property is
defined. If yes, load the file (for the moment only images) in the required
variation, get the link and build a Markdown representation. Search in the
'field' specified in the attachment and replace the 'slug' tag with the
generated link.
"""
attachments = response.get('properties', {}).get('attachments', None)
if not attachments:
return
files_collection = current_app.data.driver.db['files']
for attachment in attachments:
# Make a list from the property path
field_name_path = attachment['field'].split('.')
# This currently allows access only to properties inside of
# the 'properties' property
if len(field_name_path) > 1:
field_content = response[field_name_path[0]][field_name_path[1]]
# This is for the "normal" first level property
else:
field_content = response[field_name_path[0]]
for af in attachment['files']:
slug = af['slug']
slug_tag = "[{0}]".format(slug)
f = files_collection.find_one({'_id': ObjectId(af['file'])})
if f is None:
af['file'] = None
continue
size = f['size'] if 'size' in f else 'l'
# Get the correct variation from the file
file_storage.ensure_valid_link(f)
thumbnail = next((item for item in f['variations'] if
item['size'] == size), None)
# Build Markdown img string
l = '![{0}]({1} "{2}")'.format(slug, thumbnail['link'], f['name'])
# Parse the content of the file and replace the attachment
# tag with the actual image link
field_content = field_content.replace(slug_tag, l)
# Apply the parsed value back to the property. See above for
# clarifications on how this is done.
if len(field_name_path) > 1:
response[field_name_path[0]][field_name_path[1]] = field_content
else:
response[field_name_path[0]] = field_content
def resource_parse_attachments(response):
for item in response['_items']:
item_parse_attachments(item)
def before_replacing_node(item, original):
check_permissions('nodes', original, 'PUT')
update_file_name(item)
@@ -274,9 +257,13 @@ def after_inserting_nodes(items):
else:
activity_subscribe(item['user'], 'node', item['_id'])
verb = 'commented'
else:
elif item['node_type'] in PILLAR_NAMED_NODE_TYPES:
verb = 'posted'
activity_subscribe(item['user'], 'node', item['_id'])
else:
# Don't automatically create activities for non-Pillar node types,
# as we don't know what would be a suitable verb (among other things).
continue
activity_object_add(
item['user'],
@@ -391,18 +378,39 @@ def after_deleting_node(item):
item.get('_id'), ex)
def setup_app(app, url_prefix):
only_for_comments = only_for_node_type_decorator('comment')
@only_for_comments
def convert_markdown(node, original=None):
"""Converts comments from Markdown to HTML.
Always does this on save, even when the original Markdown hasn't changed,
because our Markdown -> HTML conversion rules might have.
"""
try:
content = node['properties']['content']
except KeyError:
node['properties']['content_html'] = ''
else:
node['properties']['content_html'] = pillar.markdown.markdown(content)
def nodes_convert_markdown(nodes):
for node in nodes:
convert_markdown(node)
def setup_app(app, url_prefix):
from . import patch
patch.setup_app(app, url_prefix=url_prefix)
app.on_fetched_item_nodes += before_returning_node
app.on_fetched_resource_nodes += before_returning_nodes
app.on_fetched_item_nodes += item_parse_attachments
app.on_fetched_resource_nodes += resource_parse_attachments
app.on_replace_nodes += before_replacing_node
app.on_replace_nodes += convert_markdown
app.on_replace_nodes += deduct_content_type
app.on_replace_nodes += node_set_default_picture
app.on_replaced_nodes += after_replacing_node
@@ -410,8 +418,11 @@ def setup_app(app, url_prefix):
app.on_insert_nodes += before_inserting_nodes
app.on_insert_nodes += nodes_deduct_content_type
app.on_insert_nodes += nodes_set_default_picture
app.on_insert_nodes += nodes_convert_markdown
app.on_inserted_nodes += after_inserting_nodes
app.on_update_nodes += convert_markdown
app.on_deleted_item_nodes += after_deleting_node
app.register_api_blueprint(blueprint, url_prefix=url_prefix)


@@ -1,15 +1,19 @@
"""PATCH support for comment nodes."""
import logging
import werkzeug.exceptions as wz_exceptions
from eve.methods.patch import patch_internal
from flask import current_app
import werkzeug.exceptions as wz_exceptions
from pillar.api.utils import authorization, authentication, jsonify
from . import register_patch_handler
log = logging.getLogger(__name__)
ROLES_FOR_COMMENT_VOTING = {u'subscriber', u'demo'}
VALID_COMMENT_OPERATIONS = {u'upvote', u'downvote', u'revoke'}
COMMENT_VOTING_OPS = {u'upvote', u'downvote', u'revoke'}
VALID_COMMENT_OPERATIONS = COMMENT_VOTING_OPS.union({u'edit'})
@register_patch_handler(u'comment')
@@ -17,7 +21,23 @@ def patch_comment(node_id, patch):
assert_is_valid_patch(node_id, patch)
user_id = authentication.current_user_id()
# Find the node
if patch[u'op'] in COMMENT_VOTING_OPS:
result, node = vote_comment(user_id, node_id, patch)
else:
assert patch[u'op'] == u'edit', 'Invalid patch operation %s' % patch[u'op']
result, node = edit_comment(user_id, node_id, patch)
return jsonify({'_status': 'OK',
'result': result,
'properties': node['properties']
})
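# Illustrative examples of the PATCH payloads handled above (values are made up):
# voting ops carry only the operation, while 'edit' also carries the new Markdown.
example_vote_patch = {u'op': u'upvote'}
example_edit_patch = {u'op': u'edit', u'content': u'*Updated* comment text.'}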
def vote_comment(user_id, node_id, patch):
"""Performs a voting operation."""
# Find the node. Includes a query on the properties.ratings array so
# that we only get the current user's rating.
nodes_coll = current_app.data.driver.db['nodes']
node_query = {'_id': node_id,
'$or': [{'properties.ratings.$.user': {'$exists': False}},
@@ -25,7 +45,7 @@ def patch_comment(node_id, patch):
node = nodes_coll.find_one(node_query,
projection={'properties': 1})
if node is None:
log.warning('How can the node not be found?')
log.warning('User %s wanted to patch non-existing node %s' % (user_id, node_id))
raise wz_exceptions.NotFound('Node %s not found' % node_id)
props = node['properties']
@@ -82,6 +102,7 @@ def patch_comment(node_id, patch):
action = actions[patch['op']]
mongo_update = action()
nodes_coll = current_app.data.driver.db['nodes']
if mongo_update:
log.info('Running %s', mongo_update)
if rating:
@@ -97,10 +118,50 @@ def patch_comment(node_id, patch):
projection={'properties.rating_positive': 1,
'properties.rating_negative': 1})
return jsonify({'_status': 'OK',
'result': result,
'properties': node['properties']
})
return result, node
def edit_comment(user_id, node_id, patch):
"""Edits a single comment.
Doesn't do permission checking; users are allowed to edit their own
comment, and this is not something you want to revoke anyway. Admins
can edit all comments.
"""
# Find the node. We need to fetch some more info than we use here, so that
# we can pass this stuff to Eve's patch_internal; that way the validation &
# authorisation system has enough info to work.
nodes_coll = current_app.data.driver.db['nodes']
projection = {'user': 1,
'project': 1,
'node_type': 1}
node = nodes_coll.find_one(node_id, projection=projection)
if node is None:
log.warning('User %s wanted to patch non-existing node %s' % (user_id, node_id))
raise wz_exceptions.NotFound('Node %s not found' % node_id)
if node['user'] != user_id and not authorization.user_has_role(u'admin'):
raise wz_exceptions.Forbidden('You can only edit your own comments.')
# Use Eve to PATCH this node, as that also updates the etag.
r, _, _, status = patch_internal('nodes',
{'properties.content': patch['content'],
'project': node['project'],
'user': node['user'],
'node_type': node['node_type']},
concurrency_check=False,
_id=node_id)
if status != 200:
log.error('Error %i editing comment %s for user %s: %s',
status, node_id, user_id, r)
raise wz_exceptions.InternalServerError('Internal error %i from Eve' % status)
else:
log.info('User %s edited comment %s', user_id, node_id)
# Fetch the new content, so the client can show these without querying again.
node = nodes_coll.find_one(node_id, projection={'properties.content_html': 1})
return status, node
def assert_is_valid_patch(node_id, patch):
@@ -112,8 +173,12 @@ def assert_is_valid_patch(node_id, patch):
raise wz_exceptions.BadRequest("PATCH should have a key 'op' indicating the operation.")
if op not in VALID_COMMENT_OPERATIONS:
raise wz_exceptions.BadRequest('Operation should be one of %s',
', '.join(VALID_COMMENT_OPERATIONS))
raise wz_exceptions.BadRequest(u'Operation should be one of %s' %
                               u', '.join(VALID_COMMENT_OPERATIONS))
if op not in COMMENT_VOTING_OPS:
# We can't check here, we need the node owner for that.
return
# See whether the user is allowed to patch
if authorization.user_matches_roles(ROLES_FOR_COMMENT_VOTING):

pillar/api/nodes/moving.py Normal file

@@ -0,0 +1,110 @@
"""Code for moving around nodes."""
import attr
import flask_pymongo.wrappers
from bson import ObjectId
from pillar import attrs_extra
import pillar.api.file_storage.moving
@attr.s
class NodeMover(object):
db = attr.ib(validator=attr.validators.instance_of(flask_pymongo.wrappers.Database))
skip_gcs = attr.ib(default=False, validator=attr.validators.instance_of(bool))
_log = attrs_extra.log('%s.NodeMover' % __name__)
def change_project(self, node, dest_proj):
"""Moves a node and children to a new project."""
assert isinstance(node, dict)
assert isinstance(dest_proj, dict)
for move_node in self._children(node):
self._change_project(move_node, dest_proj)
def _change_project(self, node, dest_proj):
"""Changes the project of a single node, non-recursively."""
node_id = node['_id']
proj_id = dest_proj['_id']
self._log.info('Moving node %s to project %s', node_id, proj_id)
# Find all files in the node.
moved_files = set()
self._move_files(moved_files, dest_proj, self._files(node.get('picture', None)))
self._move_files(moved_files, dest_proj, self._files(node['properties'], 'file'))
self._move_files(moved_files, dest_proj, self._files(node['properties'], 'files', 'file'))
self._move_files(moved_files, dest_proj,
self._files(node['properties'], 'attachments', 'files', 'file'))
# Switch the node's project after its files have been moved.
self._log.info('Switching node %s to project %s', node_id, proj_id)
nodes_coll = self.db['nodes']
update_result = nodes_coll.update_one({'_id': node_id},
{'$set': {'project': proj_id}})
if update_result.matched_count != 1:
raise RuntimeError(
'Unable to update node %s in MongoDB: matched_count=%i; modified_count=%i' % (
node_id, update_result.matched_count, update_result.modified_count))
def _move_files(self, moved_files, dest_proj, file_generator):
"""Tries to find all files from the given properties."""
for file_id in file_generator:
if file_id in moved_files:
continue
moved_files.add(file_id)
self.move_file(dest_proj, file_id)
def move_file(self, dest_proj, file_id):
"""Moves a single file to another project"""
self._log.info('Moving file %s to project %s', file_id, dest_proj['_id'])
pillar.api.file_storage.moving.gcs_move_to_bucket(file_id, dest_proj['_id'],
skip_gcs=self.skip_gcs)
def _files(self, file_ref, *properties):
"""Yields file ObjectIDs."""
# Degenerate cases.
if not file_ref:
return
# Single ObjectID
if isinstance(file_ref, ObjectId):
assert not properties
yield file_ref
return
# List of ObjectIDs
if isinstance(file_ref, list):
for item in file_ref:
for subitem in self._files(item, *properties):
yield subitem
return
# Dict, use properties[0] as key
if isinstance(file_ref, dict):
try:
subref = file_ref[properties[0]]
except KeyError:
# Silently skip non-existing keys.
return
for subitem in self._files(subref, *properties[1:]):
yield subitem
return
raise TypeError('File ref is of type %s, not implemented' % type(file_ref))
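# Illustrative walk (ObjectId made up): given node properties shaped like
#   {u'attachments': [{u'files': [{u'slug': u'img', u'file': ObjectId('5672e2c1c379cf0007b31995')}]}]}
# self._files(properties, 'attachments', 'files', 'file') follows the dict keys,
# iterates the lists, and yields ObjectId('5672e2c1c379cf0007b31995').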
def _children(self, node):
"""Generator, recursively yields the node and its children."""
yield node
nodes_coll = self.db['nodes']
for child in nodes_coll.find({'parent': node['_id']}):
# "yield from self.children(child)" was introduced in Python 3.3
for grandchild in self._children(child):
yield grandchild


@@ -57,22 +57,12 @@ def before_inserting_override_is_private_field(projects):
def before_edit_check_permissions(document, original):
# Allow admin users to do whatever they want.
# TODO: possibly move this into the check_permissions function.
if user_has_role(u'admin'):
return
check_permissions('projects', original, request.method)
def before_delete_project(document):
"""Checks permissions before we allow deletion"""
# Allow admin users to do whatever they want.
# TODO: possibly move this into the check_permissions function.
if user_has_role(u'admin'):
return
check_permissions('projects', document, request.method)
@@ -195,7 +185,7 @@ def after_inserting_project(project, db_user):
result = projects_collection.update_one({'_id': project_id},
{'$set': remove_private_keys(project)})
if result.matched_count != 1:
log.warning('Unable to update project %s: %s', project_id, result.raw_result)
log.error('Unable to update project %s: %s', project_id, result.raw_result)
abort_with_error(500)


@@ -65,6 +65,7 @@ def project_manage_users():
project = projects_collection.find_one({'_id': project_id})
# Check if the current_user is owner of the project, or removing themselves.
if not authorization.user_has_role(u'admin'):
remove_self = target_user_id == current_user_id and action == 'remove'
if project['user'] != current_user_id and not remove_self:
utils.abort_with_error(403)


@@ -90,3 +90,10 @@ def create_new_project(project_name, user_id, overrides):
log.info('Created project %s for user %s', project['_id'], user_id)
return project
def get_node_type(project, node_type_name):
"""Returns the named node type, or None if it doesn't exist."""
return next((nt for nt in project['node_types']
if nt['name'] == node_type_name), None)


@@ -162,7 +162,7 @@ def manage_user_group_membership(db_user, role, action):
return user_groups
def create_service_account(email, roles, service):
def create_service_account(email, roles, service, update_existing=None):
"""Creates a service account with the given roles + the role 'service'.
:param email: email address associated with the account
@@ -170,9 +170,39 @@ def create_service_account(email, roles, service):
:param roles: iterable of role names
:param service: dict of the 'service' key in the user.
:type service: dict
:param update_existing: callback function that receives the existing user's 'service'
dict to update in place, in case the email address is already in use by someone.
If not given or None, updating existing users is disallowed, and a ValueError
exception is thrown instead.
:return: tuple (user doc, token doc)
"""
from pillar.api.utils import remove_private_keys
# Find existing
users_coll = current_app.db()['users']
user = users_coll.find_one({'email': email})
if user:
# Check whether updating is allowed at all.
if update_existing is None:
raise ValueError('User %s already exists' % email)
# Compute the new roles, and assign.
roles = list(set(roles).union({u'service'}).union(user['roles']))
user['roles'] = list(roles)
# Let the caller perform any required updates.
log.info('Updating existing user %s to become service account for %s',
email, roles)
update_existing(user['service'])
# Try to store the updated user.
result, _, _, status = current_app.put_internal('users',
remove_private_keys(user),
_id=user['_id'])
expected_status = 200
else:
# Create a user with the correct roles.
roles = list(set(roles).union({u'service'}))
user = {'username': email,
@@ -184,7 +214,9 @@ def create_service_account(email, roles, service):
'email': email,
'service': service}
result, _, _, status = current_app.post_internal('users', user)
if status != 201:
expected_status = 201
if status != expected_status:
raise SystemExit('Error creating user {}: {}'.format(email, result))
user.update(result)
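# Illustrative callback (name and badge value are made up): update_existing
# receives the existing user's 'service' dict and may modify it in place
# before the user document is saved.
def _merge_badger_badges(service_dict):
    service_dict.setdefault('badger', []).append(u'demo')
# create_service_account(u'badger@example.com', [u'badger'], {'badger': [u'demo']},
#                        update_existing=_merge_badger_badges)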


@@ -102,7 +102,7 @@ def after_fetching_user(user):
return
# Remove all fields except public ones.
public_fields = {'full_name', 'email'}
public_fields = {'full_name', 'username', 'email'}
for field in list(user.keys()):
if field not in public_fields:
del user[field]


@@ -13,10 +13,30 @@ from flask import current_app
from werkzeug import exceptions as wz_exceptions
import pymongo.results
__all__ = ('remove_private_keys', 'PillarJSONEncoder')
log = logging.getLogger(__name__)
def node_setattr(node, key, value):
"""Sets a node property by dotted key.
Modifies the node in-place. Deletes None values.
:type node: dict
:type key: str
:param value: the value to set, or None to delete the key.
"""
set_on = node
while key and '.' in key:
head, key = key.split('.', 1)
set_on = set_on[head]
if value is None:
set_on.pop(key, None)
else:
set_on[key] = value
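# Illustrative example of node_setattr(); the document shape is made up.
_example_node = {'properties': {'status': u'pending', 'order': 1}}
node_setattr(_example_node, 'properties.status', u'published')  # nested set
node_setattr(_example_node, 'name', u'Suzanne')                 # top-level set
node_setattr(_example_node, 'properties.order', None)           # None deletes the key
assert _example_node == {'properties': {'status': u'published'}, 'name': u'Suzanne'}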
def remove_private_keys(document):
"""Removes any key that starts with an underscore, returns result as new
dictionary.
@@ -65,6 +85,18 @@ def jsonify(mongo_doc, status=200, headers=None):
headers=headers)
def bsonify(mongo_doc, status=200, headers=None):
"""BSonifies a Mongo document into a Flask response object."""
import bson
data = bson.BSON.encode(mongo_doc)
return current_app.response_class(data,
mimetype='application/bson',
status=status,
headers=headers)
def skip_when_testing(func):
"""Decorator, skips the decorated function when app.config['TESTING']"""
@@ -114,3 +146,50 @@ def gravatar(email, size=64):
return "https://www.gravatar.com/avatar/" + \
hashlib.md5(str(email)).hexdigest() + \
"?" + urllib.urlencode(parameters)
class MetaFalsey(type):
def __nonzero__(cls):
return False
__bool__ = __nonzero__ # for Python 3
class DoesNotExist(object):
"""Returned as value by doc_diff if a value does not exist."""
__metaclass__ = MetaFalsey
def doc_diff(doc1, doc2, falsey_is_equal=True):
"""Generator, yields differences between documents.
Yields changes as (key, value in doc1, value in doc2) tuples, where
the value can also be the DoesNotExist class. Does not report changed
private keys (i.e. starting with underscores).
Sub-documents (i.e. dicts) are recursed, and dot notation is used
for the keys if changes are found.
If falsey_is_equal=True, all Falsey values compare as equal, i.e. this
function won't report differences between DoesNotExist, False, '', and 0.
"""
for key in set(doc1.keys()).union(set(doc2.keys())):
if isinstance(key, basestring) and key[0] == u'_':
continue
val1 = doc1.get(key, DoesNotExist)
val2 = doc2.get(key, DoesNotExist)
# Only recurse if both values are dicts
if isinstance(val1, dict) and isinstance(val2, dict):
for subkey, subval1, subval2 in doc_diff(val1, val2):
yield '%s.%s' % (key, subkey), subval1, subval2
continue
if val1 == val2:
continue
if falsey_is_equal and bool(val1) == bool(val2) == False:
continue
yield key, val1, val2
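# Illustrative example of doc_diff(); the documents are made up. '_etag' is
# skipped as a private key, 'name' is unchanged, and the key missing from doc1
# is reported as DoesNotExist. Tuples are yielded in no particular order.
_doc1 = {'_etag': u'abc', 'name': u'Suzanne', 'properties': {'status': u'published'}}
_doc2 = {'_etag': u'def', 'name': u'Suzanne', 'properties': {'status': u'pending', 'order': 3}}
_changes = set(doc_diff(_doc1, _doc2))
assert _changes == {('properties.status', u'published', u'pending'),
                    ('properties.order', DoesNotExist, 3)}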


@@ -81,12 +81,15 @@ def algolia_index_node_save(node):
if 'permissions' in node and 'world' in node['permissions']:
if 'GET' in node['permissions']['world']:
node_ob['is_free'] = True
# Append the media key if the node is of node_type 'asset'
if node['node_type'] == 'asset':
node_ob['media'] = node['properties']['content_type']
# Add tags
if 'tags' in node['properties']:
node_ob['tags'] = node['properties']['tags']
# Add extra properties
for prop in ('tags', 'license_notes'):
if prop in node['properties']:
node_ob[prop] = node['properties'][prop]
current_app.algolia_index_nodes.save_object(node_ob)


@@ -15,6 +15,22 @@ from flask import current_app
log = logging.getLogger(__name__)
CLI_USER = {
'user_id': 'CLI',
'groups': [],
'roles': {'admin'},
}
def force_cli_user():
"""Sets g.current_user to the CLI_USER object.
This is used as a marker to avoid authorization checks and just allow everything.
"""
log.warning('Logging in as CLI_USER, circumventing authentication.')
g.current_user = CLI_USER
def validate_token():
"""Validate the token provided in the request and populate the current_user


@@ -7,7 +7,7 @@ from flask import abort
from flask import current_app
from werkzeug.exceptions import Forbidden
CHECK_PERMISSIONS_IMPLEMENTED_FOR = {'projects', 'nodes'}
CHECK_PERMISSIONS_IMPLEMENTED_FOR = {'projects', 'nodes', 'flamenco_jobs'}
log = logging.getLogger(__name__)
@@ -62,15 +62,18 @@ def compute_allowed_methods(collection_name, resource, check_node_type=None):
# Accumulate allowed methods from the user, group and world level.
allowed_methods = set()
current_user = g.current_user
current_user = getattr(g, 'current_user', None)
if current_user:
user_is_admin = is_admin(current_user)
# If the user is authenticated, proceed to compare the group permissions
for permission in computed_permissions.get('groups', ()):
if permission['group'] in current_user['groups']:
if user_is_admin or permission['group'] in current_user['groups']:
allowed_methods.update(permission['methods'])
for permission in computed_permissions.get('users', ()):
if current_user['user_id'] == permission['user']:
if user_is_admin or current_user['user_id'] == permission['user']:
allowed_methods.update(permission['methods'])
# Check if the node is public or private. This must be set for non logged
@@ -132,6 +135,14 @@ def compute_aggr_permissions(collection_name, resource, check_node_type=None):
if check_node_type is None:
return project['permissions']
node_type_name = check_node_type
elif 'node_type' not in resource:
# Neither a project, nor a node, therefore is another collection
projects_collection = current_app.data.driver.db['projects']
project = projects_collection.find_one(
ObjectId(resource['project']),
{'permissions': 1})
return project['permissions']
else:
# Not a project, so it's a node.
assert 'project' in resource
@@ -155,7 +166,7 @@ def compute_aggr_permissions(collection_name, resource, check_node_type=None):
project_permissions = project['permissions']
# Find the node type from the project.
node_type = next((node_type for node_type in project['node_types']
node_type = next((node_type for node_type in project.get('node_types', ())
if node_type['name'] == node_type_name), None)
if node_type is None: # This node type is not known, so doesn't give permissions.
node_type_permissions = {}


@@ -169,6 +169,15 @@ class GoogleCloudStorageBucket(object):
blob.content_disposition = u'attachment; filename="{0}"'.format(name)
blob.patch()
def copy_blob(self, blob, to_bucket):
"""Copies the given blob from this bucket to the other bucket.
Returns the new blob.
"""
assert isinstance(to_bucket, GoogleCloudStorageBucket)
return self.bucket.copy_blob(blob, to_bucket.bucket)
def update_file_name(node):
"""Assign to the CGS blob the same name of the asset node. This way when
@@ -197,6 +206,11 @@ def update_file_name(node):
storage = GoogleCloudStorageBucket(str(node['project']))
blob = storage.Get(file_doc['file_path'], to_dict=False)
if blob is None:
log.warning('Unable to find blob for file %s in project %s',
file_doc['file_path'], file_doc['project'])
return
# Pick file extension from original filename
_, ext = os.path.splitext(file_doc['filename'])
name = _format_name(node['name'], ext, map_type=map_type)
@@ -222,3 +236,16 @@ def update_file_name(node):
if 'files' in node['properties']:
for file_props in node['properties']['files']:
_update_name(file_props['file'], file_props)
def copy_to_bucket(file_path, src_project_id, dest_project_id):
"""Copies a file from one bucket to the other."""
log.info('Copying %s from project bucket %s to %s',
file_path, src_project_id, dest_project_id)
src_storage = GoogleCloudStorageBucket(str(src_project_id))
dest_storage = GoogleCloudStorageBucket(str(dest_project_id))
blob = src_storage.Get(file_path, to_dict=False)
src_storage.copy_blob(blob, dest_storage)


@@ -0,0 +1,84 @@
import copy
import logging
import types
log = logging.getLogger(__name__)
def assign_permissions(project, node_types, permission_callback):
"""Generator, yields the node types with certain permissions set.
The permission_callback is called for each node type, and each user
and group permission in the project, and should return the appropriate
extra permissions for that node type.
Yields copies of the given node types with new permissions.
permission_callback(node_type, ugw, ident, proj_methods) is called, where
- 'node_type' is the node type dict
- 'ugw' is either 'user', 'group', or 'world',
- 'ident' is the group or user ID, or None when ugw is 'world',
- 'proj_methods' is the list of already-allowed project methods.
"""
proj_perms = project['permissions']
for nt in node_types:
permissions = {}
for key in ('users', 'groups'):
perms = proj_perms[key]
singular = key.rstrip('s')
for perm in perms:
assert isinstance(perm, dict), 'perm should be dict, but is %r' % perm
ident = perm[singular] # group or user ID.
methods_to_allow = permission_callback(nt, singular, ident, perm['methods'])
if not methods_to_allow:
continue
permissions.setdefault(key, []).append(
{singular: ident,
'methods': methods_to_allow}
)
# World permissions are simpler.
world_methods_to_allow = permission_callback(nt, 'world', None,
permissions.get('world', []))
if world_methods_to_allow:
permissions.setdefault('world', []).extend(world_methods_to_allow)
node_type = copy.deepcopy(nt)
if permissions:
node_type['permissions'] = permissions
yield node_type
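# Illustrative permission_callback (name and policy are made up): grant GET on
# the node type to every group that already has PUT access on the project;
# leave user and world permissions untouched.
def _get_for_put_groups(node_type, ugw, ident, proj_methods):
    if ugw == 'group' and u'PUT' in proj_methods:
        return [u'GET']
    return []
# new_node_types = list(assign_permissions(project, node_types, _get_for_put_groups))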
def add_to_project(project, node_types, replace_existing):
"""Adds the given node types to the project.
Overwrites any existing by the same name when replace_existing=True.
"""
assert isinstance(project, dict)
assert isinstance(node_types, (list, set, frozenset, tuple, types.GeneratorType)), \
'node_types is of wrong type %s' % type(node_types)
project_id = project['_id']
for node_type in node_types:
found = [nt for nt in project['node_types']
if nt['name'] == node_type['name']]
if found:
assert len(found) == 1, 'node type name should be unique (found %ix)' % len(found)
# TODO: validate that the node type contains all the properties Attract needs.
if replace_existing:
log.info('Replacing existing node type %s on project %s',
node_type['name'], project_id)
project['node_types'].remove(found[0])
else:
continue
project['node_types'].append(node_type)

pillar/attrs_extra.py Normal file

@@ -0,0 +1,17 @@
"""Extra functionality for attrs."""
import logging
import attr
def log(name):
"""Returns a logger attr.ib
:param name: name to pass to logging.getLogger()
:rtype: attr.ib
"""
return attr.ib(default=logging.getLogger(name),
repr=False,
hash=False,
cmp=False)


@@ -33,6 +33,11 @@ class UserClass(flask_login.UserMixin):
class AnonymousUser(flask_login.AnonymousUserMixin):
@property
def objectid(self):
"""Anonymous user has no settable objectid."""
return None
def has_role(self, *roles):
return False
@@ -73,6 +78,13 @@ def config_login_manager(app):
return login_manager
def login_user(oauth_token):
"""Log in the user identified by the given token."""
user = UserClass(oauth_token)
flask_login.login_user(user)
def get_blender_id_oauth_token():
"""Returns a tuple (token, ''), for use with flask_oauthlib."""
return session.get('blender_id_oauth_token')


@@ -5,17 +5,28 @@ Run commands with 'flask <command>'
from __future__ import print_function, division
import copy
import logging
from bson.objectid import ObjectId, InvalidId
from eve.methods.put import put_internal
from eve.methods.post import post_internal
from flask import current_app
from flask.ext.script import Manager
from flask_script import Manager
log = logging.getLogger(__name__)
manager = Manager(current_app)
manager_maintenance = Manager(
current_app, usage="Maintenance scripts, to update user groups")
manager_setup = Manager(
current_app, usage="Setup utilities, like setup_db() or create_blog()")
manager_operations = Manager(
current_app, usage="Backend operations, like moving nodes across projects")
@manager.command
@manager_setup.command
def setup_db(admin_email):
"""Setup the database
- Create admin, subscriber and demo Group collection
@@ -58,7 +69,7 @@ def setup_db(admin_email):
'is_private': False})
@manager.command
@manager_maintenance.command
def find_duplicate_users():
"""Finds users that have the same BlenderID user_id."""
@@ -94,7 +105,7 @@ def find_duplicate_users():
))
@manager.command
@manager_maintenance.command
def sync_role_groups(do_revoke_groups):
"""For each user, synchronizes roles and group membership.
@@ -186,7 +197,7 @@ def sync_role_groups(do_revoke_groups):
print('%i bad and %i ok users seen.' % (bad_users, ok_users))
@manager.command
@manager_maintenance.command
def sync_project_groups(user_email, fix):
"""Gives the user access to their self-created projects."""
@@ -250,7 +261,46 @@ def sync_project_groups(user_email, fix):
log.info('Updated %i user.', result.modified_count)
@manager.command
@manager_maintenance.command
def check_home_project_groups():
"""Checks all users' group membership of their home project admin group."""
users_coll = current_app.data.driver.db['users']
proj_coll = current_app.data.driver.db['projects']
good = bad = 0
for proj in proj_coll.find({'category': 'home'}):
try:
admin_group_perms = proj['permissions']['groups'][0]
except IndexError:
log.error('Project %s has no admin group', proj['_id'])
return 255
except KeyError:
log.error('Project %s has no group permissions at all', proj['_id'])
return 255
user = users_coll.find_one({'_id': proj['user']},
projection={'groups': 1})
if user is None:
log.error('Project %s has non-existing owner %s', proj['_id'], proj['user'])
return 255
user_groups = set(user['groups'])
admin_group_id = admin_group_perms['group']
if admin_group_id in user_groups:
# All is fine!
good += 1
continue
log.warning('User %s has no admin rights to home project %s -- needs group %s',
proj['user'], proj['_id'], admin_group_id)
bad += 1
log.info('%i projects OK, %i projects in error', good, bad)
return bad
@manager_setup.command
def badger(action, user_email, role):
from pillar.api import service
@@ -265,24 +315,26 @@ def badger(action, user_email, role):
log.info('Status : %i', status)
def _create_service_account(email, service_roles, service_definition):
def create_service_account(email, service_roles, service_definition, update_existing=None):
from pillar.api import service
from pillar.api.utils import dumps
account, token = service.create_service_account(
email,
service_roles,
service_definition
service_definition,
update_existing=update_existing
)
print('Account created:')
print('Service account information:')
print(dumps(account, indent=4, sort_keys=True))
print()
print('Access token: %s' % token['token'])
print(' expires on: %s' % token['expire_time'])
return account, token
@manager.command
@manager_setup.command
def create_badger_account(email, badges):
"""
Creates a new service account that can give badges (i.e. roles).
@@ -292,27 +344,27 @@ def create_badger_account(email, badges):
this account can assign and revoke.
"""
_create_service_account(email, [u'badger'], {'badger': badges.strip().split()})
create_service_account(email, [u'badger'], {'badger': badges.strip().split()})
@manager.command
@manager_setup.command
def create_urler_account(email):
"""Creates a new service account that can fetch all project URLs."""
_create_service_account(email, [u'urler'], {})
create_service_account(email, [u'urler'], {})
@manager.command
@manager_setup.command
def create_local_user_account(email, password):
from pillar.api.local_auth import create_local_user
create_local_user(email, password)
@manager.command
@manager.option('-c', '--chunk', dest='chunk_size', default=50,
@manager_maintenance.command
@manager_maintenance.option('-c', '--chunk', dest='chunk_size', default=50,
help='Number of links to update, use 0 to update all.')
@manager.option('-q', '--quiet', dest='quiet', action='store_true', default=False)
@manager.option('-w', '--window', dest='window', default=12,
@manager_maintenance.option('-q', '--quiet', dest='quiet', action='store_true', default=False)
@manager_maintenance.option('-w', '--window', dest='window', default=12,
help='Refresh links that expire in this many hours.')
def refresh_backend_links(backend_name, chunk_size=50, quiet=False, window=12):
"""Refreshes all file links that are using a certain storage backend.
@@ -332,7 +384,7 @@ def refresh_backend_links(backend_name, chunk_size=50, quiet=False, window=12):
file_storage.refresh_links_for_backend(backend_name, chunk_size, window * 3600)
@manager.command
@manager_maintenance.command
def expire_all_project_links(project_uuid):
"""Expires all file links for a certain project without refreshing.
@@ -353,3 +405,391 @@ def expire_all_project_links(project_uuid):
)
print('Expired %i links' % result.matched_count)
@manager_operations.command
def file_change_backend(file_id, dest_backend='gcs'):
"""Given a file document, move it to the specified backend (if not already
there) and update the document to reflect that.
Files on the original backend are not deleted automatically.
"""
from pillar.api.file_storage.moving import change_file_storage_backend
change_file_storage_backend(file_id, dest_backend)
@manager_operations.command
def mass_copy_between_backends(src_backend='cdnsun', dest_backend='gcs'):
"""Copies all files from one backend to the other, updating them in Mongo.
Files on the original backend are not deleted.
"""
import requests.exceptions
from pillar.api.file_storage import moving
logging.getLogger('pillar').setLevel(logging.INFO)
log.info('Mass-moving all files from backend %r to %r',
src_backend, dest_backend)
files_coll = current_app.data.driver.db['files']
fdocs = files_coll.find({'backend': src_backend},
projection={'_id': True})
copied_ok = 0
copy_errs = 0
try:
for fdoc in fdocs:
try:
moving.change_file_storage_backend(fdoc['_id'], dest_backend)
except moving.PrerequisiteNotMetError as ex:
log.error('Error copying %s: %s', fdoc['_id'], ex)
copy_errs += 1
except requests.exceptions.HTTPError as ex:
log.error('Error copying %s (%s): %s',
fdoc['_id'], ex.response.url, ex)
copy_errs += 1
except Exception:
log.exception('Unexpected exception handling file %s', fdoc['_id'])
copy_errs += 1
else:
copied_ok += 1
except KeyboardInterrupt:
log.error('Stopping due to keyboard interrupt')
log.info('%i files copied ok', copied_ok)
log.info('%i files were not copied', copy_errs)
@manager_operations.command
@manager_operations.option('-p', '--project', dest='dest_proj_url',
help='Destination project URL')
@manager_operations.option('-f', '--force', dest='force', action='store_true', default=False,
help='Move even when already at the given project.')
@manager_operations.option('-s', '--skip-gcs', dest='skip_gcs', action='store_true', default=False,
help='Skip file handling on GCS, just update the database.')
def move_group_node_project(node_uuid, dest_proj_url, force=False, skip_gcs=False):
"""Copies all files from one project to the other, then moves the nodes.
The node and all its children are moved recursively.
"""
from pillar.api.nodes import moving
from pillar.api.utils import str2id
logging.getLogger('pillar').setLevel(logging.INFO)
db = current_app.db()
nodes_coll = db['nodes']
projs_coll = db['projects']
# Parse CLI args and get the node, source and destination projects.
node_uuid = str2id(node_uuid)
node = nodes_coll.find_one({'_id': node_uuid})
if node is None:
log.error("Node %s can't be found!", node_uuid)
return 1
if node.get('parent', None):
log.error('Node cannot have a parent, it must be top-level.')
return 4
src_proj = projs_coll.find_one({'_id': node['project']})
dest_proj = projs_coll.find_one({'url': dest_proj_url})
if src_proj is None:
log.warning("Node's source project %s doesn't exist!", node['project'])
if dest_proj is None:
log.error("Destination project url='%s' doesn't exist.", dest_proj_url)
return 2
if src_proj['_id'] == dest_proj['_id']:
if force:
log.warning("Node is already at project url='%s'!", dest_proj_url)
else:
log.error("Node is already at project url='%s'!", dest_proj_url)
return 3
log.info("Mass-moving %s (%s) and children from project '%s' (%s) to '%s' (%s)",
node_uuid, node['name'], src_proj['url'], src_proj['_id'], dest_proj['url'],
dest_proj['_id'])
mover = moving.NodeMover(db=db, skip_gcs=skip_gcs)
mover.change_project(node, dest_proj)
log.info('Done moving.')
@manager_maintenance.command
@manager_maintenance.option('-p', '--project', dest='proj_url', nargs='?',
help='Project URL')
@manager_maintenance.option('-a', '--all', dest='all_projects', action='store_true', default=False,
help='Replace on all projects.')
def replace_pillar_node_type_schemas(proj_url=None, all_projects=False):
"""Replaces the project's node type schemas with the standard Pillar ones.
Non-standard node types are left alone.
"""
if bool(proj_url) == all_projects:
log.error('Use either --project or --all.')
return 1
from pillar.api.utils.authentication import force_cli_user
force_cli_user()
from pillar.api.node_types import PILLAR_NAMED_NODE_TYPES
from pillar.api.utils import remove_private_keys
projects_collection = current_app.db()['projects']
def handle_project(project):
log.info('Handling project %s', project['url'])
is_public_proj = not project.get('is_private', True)
for proj_nt in project['node_types']:
nt_name = proj_nt['name']
try:
pillar_nt = PILLAR_NAMED_NODE_TYPES[nt_name]
except KeyError:
log.info(' - skipping non-standard node type "%s"', nt_name)
continue
log.info(' - replacing schema on node type "%s"', nt_name)
# This leaves node type keys intact that aren't in Pillar's node_type_xxx definitions,
# such as permissions.
proj_nt.update(copy.deepcopy(pillar_nt))
# On our own public projects we want to be able to set license stuff.
if is_public_proj:
proj_nt['form_schema'].pop('license_type', None)
proj_nt['form_schema'].pop('license_notes', None)
# Use Eve to PUT, so we have schema checking.
db_proj = remove_private_keys(project)
r, _, _, status = put_internal('projects', db_proj, _id=project['_id'])
if status != 200:
log.error('Error %i storing altered project %s %s', status, project['_id'], r)
raise SystemExit('Error storing project, see log.')
log.info('Project saved successfully.')
if all_projects:
for project in projects_collection.find():
handle_project(project)
return
project = projects_collection.find_one({'url': proj_url})
if not project:
log.error('Project url=%s not found', proj_url)
return 3
handle_project(project)
@manager_maintenance.command
def remarkdown_comments():
"""Retranslates all Markdown to HTML for all comment nodes.
"""
from pillar.api.nodes import convert_markdown
nodes_collection = current_app.db()['nodes']
comments = nodes_collection.find({'node_type': 'comment'},
projection={'properties.content': 1,
'node_type': 1})
updated = identical = skipped = errors = 0
for node in comments:
convert_markdown(node)
node_id = node['_id']
try:
content_html = node['properties']['content_html']
except KeyError:
log.warning('Node %s has no content_html', node_id)
skipped += 1
continue
result = nodes_collection.update_one(
{'_id': node_id},
{'$set': {'properties.content_html': content_html}}
)
if result.matched_count != 1:
log.error('Unable to update node %s', node_id)
errors += 1
continue
if result.modified_count:
updated += 1
else:
identical += 1
log.info('updated : %i', updated)
log.info('identical: %i', identical)
log.info('skipped : %i', skipped)
log.info('errors : %i', errors)
@manager_maintenance.command
@manager_maintenance.option('-p', '--project', dest='proj_url', nargs='?',
help='Project URL')
@manager_maintenance.option('-a', '--all', dest='all_projects', action='store_true', default=False,
help='Replace on all projects.')
def upgrade_attachment_schema(proj_url=None, all_projects=False):
"""Replaces the project's attachments with the new schema.
Updates both the schema definition and the nodes with attachments (asset, page, post).
"""
if bool(proj_url) == all_projects:
log.error('Use either --project or --all.')
return 1
from pillar.api.utils.authentication import force_cli_user
force_cli_user()
from pillar.api.node_types.asset import node_type_asset
from pillar.api.node_types.page import node_type_page
from pillar.api.node_types.post import node_type_post
from pillar.api.node_types import _attachments_embedded_schema
from pillar.api.utils import remove_private_keys
# Node types that support attachments
node_types = (node_type_asset, node_type_page, node_type_post)
nts_by_name = {nt['name']: nt for nt in node_types}
db = current_app.db()
projects_coll = db['projects']
nodes_coll = db['nodes']
def handle_project(project):
log.info('Handling project %s', project['url'])
replace_schemas(project)
replace_attachments(project)
def replace_schemas(project):
for proj_nt in project['node_types']:
nt_name = proj_nt['name']
if nt_name not in nts_by_name:
continue
log.info(' - replacing attachment schema on node type "%s"', nt_name)
pillar_nt = nts_by_name[nt_name]
proj_nt['dyn_schema']['attachments'] = copy.deepcopy(_attachments_embedded_schema)
# Make the 'attachments' form schema match the official Pillar one.
try:
pillar_form_schema = pillar_nt['form_schema']['attachments']
except KeyError:
proj_nt['form_schema'].pop('attachments', None)
else:
proj_nt['form_schema']['attachments'] = pillar_form_schema
# Use Eve to PUT, so we have schema checking.
db_proj = remove_private_keys(project)
r, _, _, status = put_internal('projects', db_proj, _id=project['_id'])
if status != 200:
log.error('Error %i storing altered project %s %s', status, project['_id'], r)
raise SystemExit('Error storing project, see log.')
log.info('Project saved successfully.')
def replace_attachments(project):
log.info('Upgrading nodes for project %s', project['url'])
nodes = nodes_coll.find({
'_deleted': False,
'project': project['_id'],
'node_type': {'$in': list(nts_by_name)},
'properties.attachments': {'$exists': True},
})
for node in nodes:
attachments = node[u'properties'][u'attachments']
if isinstance(attachments, dict):
# This node has already been upgraded.
continue
log.info(' - Updating schema on node %s (%s)', node['_id'], node.get('name'))
new_atts = {}
for field_info in attachments:
for attachment in field_info.get('files', []):
new_atts[attachment[u'slug']] = {u'oid': attachment[u'file']}
node[u'properties'][u'attachments'] = new_atts
# Use Eve to PUT, so we have schema checking.
db_node = remove_private_keys(node)
r, _, _, status = put_internal('nodes', db_node, _id=node['_id'])
if status != 200:
log.error('Error %i storing altered node %s %s', status, node['_id'], r)
raise SystemExit('Error storing node; see log.')
if all_projects:
for proj in projects_coll.find():
handle_project(proj)
return
proj = projects_coll.find_one({'url': proj_url})
if not proj:
log.error('Project url=%s not found', proj_url)
return 3
handle_project(proj)
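# Illustrative before/after of the attachment migration above (ObjectId is made up):
_old_attachments = [{u'field': u'properties.content',
                     u'files': [{u'slug': u'img',
                                 u'file': ObjectId('5672e2c1c379cf0007b31995'),
                                 u'size': u'l'}]}]
_new_attachments = {}
for _field_info in _old_attachments:
    for _att in _field_info.get('files', []):
        _new_attachments[_att[u'slug']] = {u'oid': _att[u'file']}
assert _new_attachments == {u'img': {u'oid': ObjectId('5672e2c1c379cf0007b31995')}}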
@manager_setup.command
def create_blog(proj_url):
"""Adds a blog to the project."""
from pillar.api.utils.authentication import force_cli_user
from pillar.api.utils import node_type_utils
from pillar.api.node_types.blog import node_type_blog
from pillar.api.node_types.post import node_type_post
from pillar.api.utils import remove_private_keys
force_cli_user()
db = current_app.db()
# Add the blog & post node types to the project.
projects_coll = db['projects']
proj = projects_coll.find_one({'url': proj_url})
if not proj:
log.error('Project url=%s not found', proj_url)
return 3
node_type_utils.add_to_project(proj,
(node_type_blog, node_type_post),
replace_existing=False)
proj_id = proj['_id']
r, _, _, status = put_internal('projects', remove_private_keys(proj), _id=proj_id)
if status != 200:
log.error('Error %i storing altered project %s %s', status, proj_id, r)
return 4
log.info('Project saved successfully.')
# Create a blog node.
nodes_coll = db['nodes']
blog = nodes_coll.find_one({'node_type': 'blog', 'project': proj_id})
if not blog:
blog = {
u'node_type': node_type_blog['name'],
u'name': u'Blog',
u'description': u'',
u'properties': {},
u'project': proj_id,
}
r, _, _, status = post_internal('nodes', blog)
if status != 201:
log.error('Error %i storing blog node: %s', status, r)
return 4
log.info('Blog node saved successfully: %s', r)
else:
log.info('Blog node already exists: %s', blog)
return 0
manager.add_command("maintenance", manager_maintenance)
manager.add_command("setup", manager_setup)
manager.add_command("operations", manager_operations)


@@ -62,3 +62,35 @@ class PillarExtension(object):
:rtype: dict
"""
@property
def template_path(self):
"""Returns the path where templates for this extension are stored.
Note that this path is not connected to any blueprint, so it is up to
the extension to provide extension-unique subdirectories.
"""
return None
@property
def static_path(self):
"""Returns the path where static files are stored.
Registers an endpoint named 'static_<extension name>', to use like:
`url_for('static_attract', filename='js/somefile.js')`
May return None, in which case the extension will not be able to serve
static files.
"""
return None
def setup_app(self, app):
"""Called during app startup, after all extensions have loaded."""
def sidebar_links(self, project):
"""Returns the sidebar link(s) for the given projects.
:returns: HTML as a string for the sidebar.
"""
return ''
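# Illustrative subclass (class name and paths are made up) showing how an
# extension can override these hooks:
#
# class MyExtension(PillarExtension):
#     @property
#     def template_path(self):
#         return '/opt/my_extension/templates'
#
#     def sidebar_links(self, project):
#         return u'<li><a href="/my-extension">My Extension</a></li>'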

pillar/markdown.py Normal file

@@ -0,0 +1,49 @@
"""Bleached Markdown functionality.
This is for user-generated stuff, like comments.
"""
from __future__ import absolute_import
import bleach
import CommonMark
ALLOWED_TAGS = [
'a',
'abbr',
'acronym',
'b', 'strong',
'i', 'em',
'del', 'kbd',
'dl', 'dt', 'dd',
'blockquote',
'code',
'li', 'ol', 'ul',
'h1', 'h2', 'h3', 'h4', 'h5', 'h6',
'p', 'br', 'hr',
'sup', 'sub', 'strike',
'img',
'iframe',
]
ALLOWED_ATTRIBUTES = {
'a': ['href', 'title', 'target'],
'abbr': ['title'],
'acronym': ['title'],
'img': ['src', 'alt', 'width', 'height', 'title'],
'iframe': ['src', 'width', 'height', 'frameborder', 'allowfullscreen'],
'*': ['style'],
}
ALLOWED_STYLES = [
'color', 'font-weight', 'background-color',
]
def markdown(s):
tainted_html = CommonMark.commonmark(s)
safe_html = bleach.clean(tainted_html,
tags=ALLOWED_TAGS,
attributes=ALLOWED_ATTRIBUTES,
styles=ALLOWED_STYLES)
return safe_html
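# Illustrative example (output shown approximately): CommonMark renders the
# Markdown, then bleach escapes the disallowed <script> tag.
# markdown(u'**hello** <script>alert(1)</script>')
# -> u'<p><strong>hello</strong> &lt;script&gt;alert(1)&lt;/script&gt;</p>\n'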


@@ -1,5 +1,9 @@
# -*- encoding: utf-8 -*-
from __future__ import print_function
from __future__ import absolute_import
import base64
import copy
import json
@@ -26,8 +30,8 @@ import pymongo.collection
from flask.testing import FlaskClient
import responses
from pillar.tests.common_test_data import EXAMPLE_PROJECT, EXAMPLE_FILE
import pillar
from . import common_test_data as ctd
# from six:
PY3 = sys.version_info[0] == 3
@@ -44,17 +48,16 @@ TEST_EMAIL_USER = 'koro'
TEST_EMAIL_ADDRESS = '%s@testing.blender.org' % TEST_EMAIL_USER
TEST_FULL_NAME = u'врач Сергей'
TEST_SUBCLIENT_TOKEN = 'my-subclient-token-for-pillar'
BLENDER_ID_TEST_USERID = 1896
BLENDER_ID_USER_RESPONSE = {'status': 'success',
'user': {'email': TEST_EMAIL_ADDRESS,
'full_name': TEST_FULL_NAME,
'id': BLENDER_ID_TEST_USERID},
'id': ctd.BLENDER_ID_TEST_USERID},
'token_expires': 'Mon, 1 Jan 2018 01:02:03 GMT'}
class TestPillarServer(pillar.PillarServer):
class PillarTestServer(pillar.PillarServer):
def _load_flask_config(self):
super(TestPillarServer, self)._load_flask_config()
super(PillarTestServer, self)._load_flask_config()
pillar_config_file = os.path.join(MY_PATH, 'config_testing.py')
self.config.from_pyfile(pillar_config_file)
@@ -70,7 +73,7 @@ class TestPillarServer(pillar.PillarServer):
class AbstractPillarTest(TestMinimal):
pillar_server_class = TestPillarServer
pillar_server_class = PillarTestServer
def setUp(self, **kwargs):
eve_settings_file = os.path.join(MY_PATH, 'eve_test_settings.py')
@@ -93,19 +96,29 @@ class AbstractPillarTest(TestMinimal):
# Not only delete self.app (like the superclass does),
# but also un-import the application.
del sys.modules['pillar']
remove = [modname for modname in sys.modules
if modname.startswith('pillar.')]
self.unload_modules('pillar')
def unload_modules(self, module_name):
"""Uploads the named module, and all submodules."""
del sys.modules[module_name]
remove = {modname for modname in sys.modules
if modname.startswith('%s.' % module_name)}
for modname in remove:
del sys.modules[modname]
def ensure_file_exists(self, file_overrides=None):
if file_overrides and file_overrides.get('project'):
self.ensure_project_exists({'_id': file_overrides['project']})
else:
self.ensure_project_exists()
with self.app.test_request_context():
files_collection = self.app.data.driver.db['files']
assert isinstance(files_collection, pymongo.collection.Collection)
file = copy.deepcopy(EXAMPLE_FILE)
file = copy.deepcopy(ctd.EXAMPLE_FILE)
if file_overrides is not None:
file.update(file_overrides)
if '_id' in file and file['_id'] is None:
@@ -120,13 +133,24 @@ class AbstractPillarTest(TestMinimal):
return file_id, from_db
def ensure_project_exists(self, project_overrides=None):
self.ensure_group_exists(ctd.EXAMPLE_ADMIN_GROUP_ID, 'project admin')
self.ensure_group_exists(ctd.EXAMPLE_PROJECT_READONLY_GROUP_ID, 'r/o group')
self.ensure_group_exists(ctd.EXAMPLE_PROJECT_READONLY_GROUP2_ID, 'r/o group 2')
self.ensure_user_exists(ctd.EXAMPLE_PROJECT_OWNER_ID,
'proj-owner',
[ctd.EXAMPLE_ADMIN_GROUP_ID])
with self.app.test_request_context():
projects_collection = self.app.data.driver.db['projects']
assert isinstance(projects_collection, pymongo.collection.Collection)
project = copy.deepcopy(EXAMPLE_PROJECT)
project = copy.deepcopy(ctd.EXAMPLE_PROJECT)
if project_overrides is not None:
project.update(project_overrides)
for key, value in project_overrides.items():
if value is None:
project.pop(key, None)
else:
project[key] = value
found = projects_collection.find_one(project['_id'])
if found is None:
@@ -135,6 +159,37 @@ class AbstractPillarTest(TestMinimal):
return found['_id'], found
def ensure_user_exists(self, user_id, name, group_ids=()):
user = copy.deepcopy(ctd.EXAMPLE_USER)
user['groups'] = list(group_ids)
user['full_name'] = name
user['_id'] = ObjectId(user_id)
with self.app.test_request_context():
users_coll = self.app.data.driver.db['users']
assert isinstance(users_coll, pymongo.collection.Collection)
found = users_coll.find_one(user_id)
if found:
return
result = users_coll.insert_one(user)
assert result.inserted_id
def ensure_group_exists(self, group_id, name):
group_id = ObjectId(group_id)
with self.app.test_request_context():
groups_coll = self.app.data.driver.db['groups']
assert isinstance(groups_coll, pymongo.collection.Collection)
found = groups_coll.find_one(group_id)
if found:
return
result = groups_coll.insert_one({'_id': group_id, 'name': name})
assert result.inserted_id
def create_user(self, user_id='cafef00dc379cf10c4aaceaf', roles=('subscriber',),
groups=None):
from pillar.api.utils.authentication import make_unique_username
@@ -152,7 +207,7 @@ class AbstractPillarTest(TestMinimal):
'roles': list(roles),
'settings': {'email_communications': 1},
'auth': [{'token': '',
'user_id': unicode(BLENDER_ID_TEST_USERID),
'user_id': unicode(ctd.BLENDER_ID_TEST_USERID),
'provider': 'blender-id'}],
'full_name': u'คนรักของผัดไทย',
'email': TEST_EMAIL_ADDRESS
@@ -178,12 +233,32 @@ class AbstractPillarTest(TestMinimal):
:rtype: tuple
"""
project_id, proj = self.ensure_project_exists()
admin_group_id = proj['permissions']['groups'][0]['group']
user_id = self.create_user(user_id=user_id, roles=roles, groups=[admin_group_id])
user_id = self.create_project_admin(proj, user_id, roles)
return project_id, user_id
def create_project_admin(self, proj, user_id='cafef00dc379cf10c4aaceaf', roles=('subscriber',)):
"""Creates a user that's member of the project's admin group.
:param proj: project document, or at least a dict with permissions in it.
:type proj: dict
:returns: user_id
:rtype: ObjectId
"""
admin_group_id = proj['permissions']['groups'][0]['group']
user_id = self.create_user(user_id=user_id, roles=roles, groups=[admin_group_id])
return user_id
def create_node(self, node_doc):
"""Creates a node, returning its ObjectId. """
with self.app.test_request_context():
nodes_coll = self.app.data.driver.db['nodes']
result = nodes_coll.insert_one(node_doc)
return result.inserted_id
def badger(self, user_email, roles, action, srv_token=None):
"""Creates a service account, and uses it to grant or revoke a role to the user.
@@ -254,6 +329,11 @@ class AbstractPillarTest(TestMinimal):
return group_ids
def fetch_project_from_db(self, project_id=ctd.EXAMPLE_PROJECT_ID):
with self.app.app_context():
proj_coll = self.app.db()['projects']
return proj_coll.find_one(project_id)
@staticmethod
def join_url_params(params):
"""Constructs a query string from a dictionary and appends it to a url.
@@ -335,3 +415,16 @@ class AbstractPillarTest(TestMinimal):
def patch(self, *args, **kwargs):
return self.client_request('PATCH', *args, **kwargs)
def mongo_to_sdk(data):
"""Transforms a MongoDB dict to a dict suitable to give to the PillarSDK.
Not efficient, as it converts to JSON and back again. Only use in unittests.
"""
import pillar.api.utils
import json
as_json = pillar.api.utils.dumps(data)
return json.loads(as_json)


@@ -2,9 +2,14 @@ import datetime
from bson import tz_util, ObjectId
from pillar.api.node_types import PILLAR_NAMED_NODE_TYPES
EXAMPLE_ADMIN_GROUP_ID = ObjectId('5596e975ea893b269af85c0e')
EXAMPLE_PROJECT_READONLY_GROUP_ID = ObjectId('5596e975ea893b269af85c0f')
EXAMPLE_PROJECT_READONLY_GROUP2_ID = ObjectId('564733b56dcaf85da2faee8a')
EXAMPLE_PROJECT_ID = ObjectId('5672beecc0261b2005ed1a33')
EXAMPLE_PROJECT_OWNER_ID = ObjectId('552b066b41acdf5dec4436f2')
EXAMPLE_FILE = {u'_id': ObjectId('5672e2c1c379cf0007b31995'),
u'_updated': datetime.datetime(2016, 3, 25, 10, 28, 24, tzinfo=tz_util.utc),
@@ -43,197 +48,30 @@ EXAMPLE_PROJECT = {
u'_id': EXAMPLE_PROJECT_ID,
u'_updated': datetime.datetime(2016, 1, 7, 18, 59, 4, tzinfo=tz_util.utc),
u'category': u'assets',
u'description': u'Welcome to this curated collection of Blender Institute textures and image resources. This collection is an on-going project, as with each project we create a number of textures based on our own resources (photographs, scans, etc.) or made completely from scratch. At the moment you can find all the textures from the past Open Projects that were deemed re-usable. \r\n\r\nPeople who have contributed to these textures:\r\n\r\nAndrea Weikert, Andy Goralczyk, Basse Salmela, Ben Dansie, Campbell Barton, Enrico Valenza, Ian Hubert, Kjartan Tysdal, Manu J\xe4rvinen, Massimiliana Pulieso, Matt Ebb, Pablo Vazquez, Rob Tuytel, Roland Hess, Sarah Feldlaufer, S\xf6nke M\xe4ter',
u'description': u'Welcome to this curated collection of Blender Institute textures and image '
u'resources. This collection is an on-going project, as with each project we '
u'create a number of textures based on our own resources (photographs, scans, '
u'etc.) or made completely from scratch. At the moment you can find all the '
u'textures from the past Open Projects that were deemed re-usable. \r\n\r\n'
u'People who have contributed to these textures:\r\n\r\nAndrea Weikert, Andy '
u'Goralczyk, Basse Salmela, Ben Dansie, Campbell Barton, Enrico Valenza, Ian '
u'Hubert, Kjartan Tysdal, Manu J\xe4rvinen, Massimiliana Pulieso, Matt Ebb, '
u'Pablo Vazquez, Rob Tuytel, Roland Hess, Sarah Feldlaufer, S\xf6nke M\xe4ter',
u'is_private': False,
u'name': u'Textures',
u'node_types': [{u'description': u'Group for texture node type',
u'dyn_schema': {u'order': {u'type': u'integer'},
u'status': {u'allowed': [u'published', u'pending'],
u'type': u'string'},
u'url': {u'type': u'string'}},
u'form_schema': {u'order': {}, u'status': {}, u'url': {}},
u'name': u'group_texture',
u'parent': [u'group_texture', u'project'],
u'permissions': {}},
{u'description': u'Generic group node type edited',
u'dyn_schema': {u'notes': {u'maxlength': 256, u'type': u'string'},
u'order': {u'type': u'integer'},
u'status': {u'allowed': [u'published', u'pending'],
u'type': u'string'},
u'url': {u'type': u'string'}},
u'form_schema': {u'notes': {}, u'order': {}, u'status': {}, u'url': {}},
u'name': u'group',
u'parent': [u'group', u'project'],
u'permissions': {}},
{u'description': u'Basic Asset Type',
u'dyn_schema': {
u'attachments': {u'schema': {u'schema': {u'field': {u'type': u'string'},
u'files': {u'schema': {
u'schema': {u'file': {
u'data_relation': {
u'embeddable': True,
u'field': u'_id',
u'resource': u'files'},
u'type': u'objectid'},
u'size': {
u'type': u'string'},
u'slug': {
u'minlength': 1,
u'type': u'string'}},
u'type': u'dict'},
u'type': u'list'}},
u'type': u'dict'},
u'type': u'list'},
u'categories': {u'type': u'string'},
u'content_type': {u'type': u'string'},
u'file': {u'data_relation': {u'embeddable': True,
u'field': u'_id',
u'resource': u'files'},
u'type': u'objectid'},
u'order': {u'type': u'integer'},
u'status': {u'allowed': [u'published',
u'pending',
u'processing'],
u'type': u'string'},
u'tags': {u'schema': {u'type': u'string'}, u'type': u'list'}},
u'form_schema': {u'attachments': {u'visible': False},
u'categories': {},
u'content_type': {u'visible': False},
u'file': {u'visible': False},
u'order': {},
u'status': {},
u'tags': {}},
u'name': u'asset',
u'parent': [u'group'],
u'permissions': {}},
{u'description': u'Entrypoint to a remote or local storage solution',
u'dyn_schema': {u'backend': {u'type': u'string'},
u'subdir': {u'type': u'string'}},
u'form_schema': {u'backend': {}, u'subdir': {}},
u'name': u'storage',
u'parent': [u'group', u'project'],
u'permissions': {u'groups': [{u'group': EXAMPLE_ADMIN_GROUP_ID,
u'methods': [u'GET', u'PUT', u'POST']},
{u'group': ObjectId('5596e975ea893b269af85c0f'),
u'methods': [u'GET']},
{u'group': ObjectId('564733b56dcaf85da2faee8a'),
u'methods': [u'GET']}],
u'users': [],
u'world': []}},
{u'description': u'Comments for asset nodes, pages, etc.',
u'dyn_schema': {u'confidence': {u'type': u'float'},
u'content': {u'minlength': 5, u'type': u'string'},
u'is_reply': {u'type': u'boolean'},
u'rating_negative': {u'type': u'integer'},
u'rating_positive': {u'type': u'integer'},
u'ratings': {u'schema': {
u'schema': {u'is_positive': {u'type': u'boolean'},
u'user': {u'type': u'objectid'},
u'weight': {u'type': u'integer'}},
u'type': u'dict'},
u'type': u'list'},
u'status': {u'allowed': [u'published', u'flagged', u'edited'],
u'type': u'string'}},
u'form_schema': {u'confidence': {},
u'content': {},
u'is_reply': {},
u'rating_negative': {},
u'rating_positive': {},
u'ratings': {},
u'status': {}},
u'name': u'comment',
u'parent': [u'asset', u'comment'],
u'permissions': {}},
{u'description': u'Container for node_type post.',
u'dyn_schema': {u'categories': {u'schema': {u'type': u'string'},
u'type': u'list'},
u'template': {u'type': u'string'}},
u'form_schema': {u'categories': {}, u'template': {}},
u'name': u'blog',
u'parent': [u'project'],
u'permissions': {}},
{u'description': u'A blog post, for any project',
u'dyn_schema': {
u'attachments': {u'schema': {u'schema': {u'field': {u'type': u'string'},
u'files': {u'schema': {
u'schema': {u'file': {
u'data_relation': {
u'embeddable': True,
u'field': u'_id',
u'resource': u'files'},
u'type': u'objectid'},
u'size': {
u'type': u'string'},
u'slug': {
u'minlength': 1,
u'type': u'string'}},
u'type': u'dict'},
u'type': u'list'}},
u'type': u'dict'},
u'type': u'list'},
u'category': {u'type': u'string'},
u'content': {u'maxlength': 90000,
u'minlength': 5,
u'required': True,
u'type': u'string'},
u'status': {u'allowed': [u'published', u'pending'],
u'default': u'pending',
u'type': u'string'},
u'url': {u'type': u'string'}},
u'form_schema': {u'attachments': {u'visible': False},
u'category': {},
u'content': {},
u'status': {},
u'url': {}},
u'name': u'post',
u'parent': [u'blog'],
u'permissions': {}},
{u'description': u'Image Texture',
u'dyn_schema': {u'aspect_ratio': {u'type': u'float'},
u'categories': {u'type': u'string'},
u'files': {u'schema': {u'schema': {
u'file': {u'data_relation': {u'embeddable': True,
u'field': u'_id',
u'resource': u'files'},
u'type': u'objectid'},
u'is_tileable': {u'type': u'boolean'},
u'map_type': {u'allowed': [u'color',
u'specular',
u'bump',
u'normal',
u'translucency',
u'emission',
u'alpha'],
u'type': u'string'}},
u'type': u'dict'},
u'type': u'list'},
u'is_landscape': {u'type': u'boolean'},
u'is_tileable': {u'type': u'boolean'},
u'order': {u'type': u'integer'},
u'resolution': {u'type': u'string'},
u'status': {u'allowed': [u'published',
u'pending',
u'processing'],
u'type': u'string'},
u'tags': {u'schema': {u'type': u'string'}, u'type': u'list'}},
u'form_schema': {u'aspect_ratio': {},
u'categories': {},
u'content_type': {u'visible': False},
u'files': {u'visible': False},
u'is_landscape': {},
u'is_tileable': {},
u'order': {},
u'resolution': {},
u'status': {},
u'tags': {}},
u'name': u'texture',
u'parent': [u'group'],
u'permissions': {}}],
u'name': u'Unittest project',
u'node_types': [
PILLAR_NAMED_NODE_TYPES['group_texture'],
PILLAR_NAMED_NODE_TYPES['group'],
PILLAR_NAMED_NODE_TYPES['asset'],
PILLAR_NAMED_NODE_TYPES['storage'],
PILLAR_NAMED_NODE_TYPES['comment'],
PILLAR_NAMED_NODE_TYPES['blog'],
PILLAR_NAMED_NODE_TYPES['post'],
PILLAR_NAMED_NODE_TYPES['texture'],
],
u'nodes_blog': [],
u'nodes_featured': [],
u'nodes_latest': [],
u'organization': ObjectId('55a99fb43004867fb9934f01'),
u'owners': {u'groups': [], u'users': []},
u'permissions': {u'groups': [{u'group': EXAMPLE_ADMIN_GROUP_ID,
u'methods': [u'GET', u'POST', u'PUT', u'DELETE']}],
u'users': [],
@@ -243,7 +81,7 @@ EXAMPLE_PROJECT = {
u'status': u'published',
u'summary': u'Texture collection from all Blender Institute open projects.',
u'url': u'textures',
u'user': ObjectId('552b066b41acdf5dec4436f2')}
u'user': EXAMPLE_PROJECT_OWNER_ID}
EXAMPLE_NODE = {
u'_id': ObjectId('572761099837730efe8e120d'),
@@ -262,3 +100,19 @@ EXAMPLE_NODE = {
u'_created': datetime.datetime(2016, 5, 2, 14, 19, 37, 0, tzinfo=tz_util.utc),
u'_etag': u'6b8589b42c880e3626f43f3e82a5c5b946742687'
}
BLENDER_ID_TEST_USERID = 1533
EXAMPLE_USER = {'_id': EXAMPLE_PROJECT_OWNER_ID,
'username': 'sybren+unittests@blender.studio',
'groups': [],
'auth': [{
'provider': 'blender-id',
'token': '',
'user_id': str(BLENDER_ID_TEST_USERID),
}],
'full_name': 'sybren+unittest@blender.studio',
'settings': {'email_communications': 1},
'_updated': datetime.datetime(2016, 8, 5, 18, 19, 29),
'_etag': '25a6a90781bf27333218fbbf33b3e8d53e37b1cb',
'_created': datetime.datetime(2016, 8, 5, 18, 19, 29),
'email': 'sybren+unittests@blender.studio'}

View File

@@ -1,8 +1,9 @@
def setup_app(app):
from . import main, users, projects, nodes, notifications, redirects
from . import main, users, projects, nodes, notifications, redirects, subquery
main.setup_app(app, url_prefix=None)
users.setup_app(app, url_prefix=None)
redirects.setup_app(app, url_prefix='/r')
projects.setup_app(app, url_prefix='/p')
nodes.setup_app(app, url_prefix='/nodes')
notifications.setup_app(app, url_prefix='/notifications')
subquery.setup_app(app)

pillar/web/jinja.py Normal file
View File

@@ -0,0 +1,152 @@
"""Our custom Jinja filters and other template stuff."""
from __future__ import absolute_import
import logging
import flask
import jinja2.filters
import jinja2.utils
import werkzeug.exceptions as wz_exceptions
import pillar.api.utils
from pillar.web.utils import pretty_date
from pillar.web.nodes.routes import url_for_node
import pillar.markdown
log = logging.getLogger(__name__)
def format_pretty_date(d):
return pretty_date(d)
def format_pretty_date_time(d):
return pretty_date(d, detail=True)
def format_undertitle(s):
"""Underscore-replacing title filter.
Replaces underscores with spaces, and then applies Jinja2's own title filter.
"""
# Just keep empty strings and Nones as they are.
if not s:
return s
return jinja2.filters.do_title(s.replace('_', ' '))
def do_hide_none(s):
"""Returns the input, or an empty string if the input is None."""
if s is None:
return ''
return s
# Source: Django, django/template/defaultfilters.py
def do_pluralize(value, arg='s'):
"""
Returns a plural suffix if the value is not 1. By default, 's' is used as
the suffix:
* If value is 0, vote{{ value|pluralize }} displays "0 votes".
* If value is 1, vote{{ value|pluralize }} displays "1 vote".
* If value is 2, vote{{ value|pluralize }} displays "2 votes".
If an argument is provided, that string is used instead:
* If value is 0, class{{ value|pluralize:"es" }} displays "0 classes".
* If value is 1, class{{ value|pluralize:"es" }} displays "1 class".
* If value is 2, class{{ value|pluralize:"es" }} displays "2 classes".
If the provided argument contains a comma, the text before the comma is
used for the singular case and the text after the comma is used for the
plural case:
* If value is 0, cand{{ value|pluralize:"y,ies" }} displays "0 candies".
* If value is 1, cand{{ value|pluralize:"y,ies" }} displays "1 candy".
* If value is 2, cand{{ value|pluralize:"y,ies" }} displays "2 candies".
"""
if ',' not in arg:
arg = ',' + arg
bits = arg.split(',')
if len(bits) > 2:
return ''
singular_suffix, plural_suffix = bits[:2]
try:
if float(value) != 1:
return plural_suffix
except ValueError: # Invalid string that's not a number.
pass
except TypeError: # Value isn't a string or a number; maybe it's a list?
try:
if len(value) != 1:
return plural_suffix
except TypeError: # len() of unsized object.
pass
return singular_suffix
def do_markdown(s):
# FIXME: get rid of this filter altogether and cache HTML of comments.
safe_html = pillar.markdown.markdown(s)
return jinja2.utils.Markup(safe_html)
def do_url_for_node(node_id=None, node=None):
try:
return url_for_node(node_id=node_id, node=node)
except wz_exceptions.NotFound:
log.info('%s: do_url_for_node(node_id=%r, ...) called for non-existing node.',
flask.request.url, node_id)
return None
# Source: Django 1.9 defaultfilters.py
def do_yesno(value, arg=None):
"""
Given a string mapping values for true, false and (optionally) None,
returns one of those strings according to the value:
========== ====================== ==================================
Value Argument Outputs
========== ====================== ==================================
``True`` ``"yeah,no,maybe"`` ``yeah``
``False`` ``"yeah,no,maybe"`` ``no``
``None`` ``"yeah,no,maybe"`` ``maybe``
``None`` ``"yeah,no"`` ``"no"`` (converts None to False
if no mapping for None is given).
========== ====================== ==================================
"""
if arg is None:
arg = 'yes,no,maybe'
bits = arg.split(',')
if len(bits) < 2:
return value # Invalid arg.
try:
yes, no, maybe = bits
except ValueError:
# Unpack list of wrong size (no "maybe" value provided).
yes, no, maybe = bits[0], bits[1], bits[1]
if value is None:
return maybe
if value:
return yes
return no
def setup_jinja_env(jinja_env):
jinja_env.filters['pretty_date'] = format_pretty_date
jinja_env.filters['pretty_date_time'] = format_pretty_date_time
jinja_env.filters['undertitle'] = format_undertitle
jinja_env.filters['hide_none'] = do_hide_none
jinja_env.filters['pluralize'] = do_pluralize
jinja_env.filters['gravatar'] = pillar.api.utils.gravatar
jinja_env.filters['markdown'] = do_markdown
jinja_env.filters['yesno'] = do_yesno
jinja_env.globals['url_for_node'] = do_url_for_node
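For orientation, a minimal standalone sketch of what these filters do once registered (everything below is made up for the example; in the real app setup_jinja_env() is called on Flask's jinja_env, and importing pillar.web.jinja requires the full Pillar environment):
import jinja2
from pillar.web import jinja as pillar_jinja  # the module added above
env = jinja2.Environment()
# Only the pure-Python filters are shown; gravatar/markdown/url_for_node need an
# application context and are left out of this sketch.
env.filters['undertitle'] = pillar_jinja.format_undertitle
env.filters['pluralize'] = pillar_jinja.do_pluralize
env.filters['yesno'] = pillar_jinja.do_yesno
print(env.from_string(u'{{ "group_texture"|undertitle }}').render())             # Group Texture
print(env.from_string(u'{{ n }} vote{{ n|pluralize }}').render(n=2))             # 2 votes
print(env.from_string(u'{{ flag|yesno("on,off,unknown") }}').render(flag=None))  # unknown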

View File

@@ -10,7 +10,7 @@ from flask import current_app
from flask import render_template
from flask import redirect
from flask import request
from flask.ext.login import current_user
from flask_login import current_user
from werkzeug.contrib.atom import AtomFeed
from pillar.web.utils import system_util
@@ -64,17 +64,25 @@ def homepage():
random_featured = get_random_featured_nodes()
# Parse results for replies
for comment in latest_comments._items:
to_remove = []
for idx, comment in enumerate(latest_comments._items):
if comment.properties.is_reply:
try:
comment.attached_to = Node.find(comment.parent.parent,
{'projection': {
'_id': 1,
'name': 1,
}},
api=api)
except ResourceNotFound:
# Remove this comment
to_remove.append(idx)
else:
comment.attached_to = comment.parent
for idx in reversed(to_remove):
del latest_comments._items[idx]
main_project = Project.find(current_app.config['MAIN_PROJECT_ID'], api=api)
main_project.picture_header = get_file(main_project.picture_header, api=api)
@@ -82,8 +90,7 @@ def homepage():
def sort_key(item):
return item._created
activities = itertools.chain(latest_posts._items,
latest_assets._items,
activities = itertools.chain(latest_assets._items,
latest_comments._items)
activity_stream = sorted(activities, key=sort_key, reverse=True)
@@ -128,12 +135,7 @@ def services():
def main_blog(url=None):
"""Blog with project news"""
project_id = current_app.config['MAIN_PROJECT_ID']
@current_app.cache.memoize(timeout=3600, unless=current_user_is_authenticated)
def cache_post_view(url):
return posts_view(project_id, url)
return cache_post_view(url)
return posts_view(project_id, url=url)
@blueprint.route('/blog/create')
@@ -146,19 +148,7 @@ def main_posts_create():
@blueprint.route('/p/<project_url>/blog/<url>')
def project_blog(project_url, url=None):
"""View project blog"""
@current_app.cache.memoize(timeout=3600,
unless=current_user_is_authenticated)
def cache_post_view(project_url, url):
api = system_util.pillar_api()
try:
project = Project.find_one({
'where': '{"url" : "%s"}' % (project_url)}, api=api)
return posts_view(project._id, url=url)
except ResourceNotFound:
return abort(404)
return cache_post_view(project_url, url)
return posts_view(project_url=project_url, url=url)
def get_projects(category):
@@ -280,6 +270,12 @@ def error_403():
return render_template('errors/403_embed.html')
@blueprint.route('/join-agent')
def join_agent():
"""Custom page to support Agent 327 barbershop campaign"""
return render_template('join_agent.html')
# Shameful redirects
@blueprint.route('/p/blender-cloud/')
def redirect_cloud_blog():

View File

@@ -2,4 +2,7 @@ from .routes import blueprint
def setup_app(app, url_prefix=None):
from . import custom
custom.setup_app(app)
app.register_blueprint(blueprint, url_prefix=url_prefix)

View File

@@ -0,0 +1,161 @@
import logging
import re
from bson import ObjectId
import flask
import pillarsdk
import wtforms
from pillar.api.node_types import ATTACHMENT_SLUG_REGEX
from pillar.web.utils import system_util
from pillar.web.utils.forms import build_file_select_form, CustomFormField
shortcode_re = re.compile(r'@\[(%s)\]' % ATTACHMENT_SLUG_REGEX)
log = logging.getLogger(__name__)
def render_attachments(node, field_value):
"""Renders attachments referenced in the field value.
Returns the rendered field.
"""
# TODO: cache this based on the node's etag and attachment links expiry.
node_attachments = node.properties.attachments or {}
if isinstance(node_attachments, list):
log.warning('Old-style attachments property found on node %s; ignoring it. '
'This will result in attachments not being found.', node[u'_id'])
return field_value
if not node_attachments:
return field_value
def replace(match):
slug = match.group(1)
try:
att = node_attachments[slug]
except KeyError:
return u'[attachment "%s" not found]' % slug
return render_attachment(att)
return shortcode_re.sub(replace, field_value)
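As a rough standalone illustration of the shortcode substitution above (the slug pattern and the attachments dict are made up for the example; the real ATTACHMENT_SLUG_REGEX comes from pillar.api.node_types, and render_attachment() is stubbed out here):
import re
ATTACHMENT_SLUG_REGEX = r'[a-zA-Z0-9_\-]+'           # assumption, for this sketch only
shortcode_re = re.compile(r'@\[(%s)\]' % ATTACHMENT_SLUG_REGEX)
node_attachments = {u'download': {u'oid': u'...'}}    # shape of node.properties.attachments
def replace(match):
    slug = match.group(1)
    if slug not in node_attachments:
        return u'[attachment "%s" not found]' % slug
    return u'<rendered attachment %s>' % slug          # stand-in for render_attachment()
print(shortcode_re.sub(replace, u'Get it at @[download], not at @[nope].'))
# Get it at <rendered attachment download>, not at [attachment "nope" not found].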
def render_attachment(attachment):
"""Renders an attachment as HTML"""
oid = ObjectId(attachment[u'oid'])
collection = attachment.collection or u'files'
renderers = {
'files': render_attachment_file
}
try:
renderer = renderers[collection]
except KeyError:
log.error(u'Unable to render attachment from collection %s', collection)
return u'Unable to render attachment'
return renderer(attachment)
def render_attachment_file(attachment):
"""Renders a file attachment."""
api = system_util.pillar_api()
sdk_file = pillarsdk.File.find(attachment[u'oid'], api=api)
file_renderers = {
'image': render_attachment_file_image
}
mime_type_cat, _ = sdk_file.content_type.split('/', 1)
try:
renderer = file_renderers[mime_type_cat]
except KeyError:
return flask.render_template('nodes/attachments/file_generic.html', file=sdk_file)
return renderer(sdk_file, attachment)
def render_attachment_file_image(sdk_file, attachment):
"""Renders an image file."""
variations = {var.size: var for var in sdk_file.variations}
return flask.render_template('nodes/attachments/file_image.html',
file=sdk_file, vars=variations, attachment=attachment)
def attachment_form_group_create(schema_prop):
"""Creates a wtforms.FieldList for attachments."""
file_select_form_group = _attachment_build_single_field(schema_prop)
field = wtforms.FieldList(CustomFormField(file_select_form_group), min_entries=1)
return field
def _attachment_build_single_field(schema_prop):
# Ugly hard-coded schema.
fake_schema = {
'slug': schema_prop['propertyschema'],
'oid': schema_prop['valueschema']['schema']['oid'],
'link': schema_prop['valueschema']['schema']['link'],
'link_custom': schema_prop['valueschema']['schema']['link_custom'],
}
file_select_form_group = build_file_select_form(fake_schema)
return file_select_form_group
def attachment_form_group_set_data(db_prop_value, schema_prop, field_list):
"""Populates the attachment form group with data from MongoDB."""
assert isinstance(db_prop_value, dict)
# Extra entries are caused by min_entries=1 in the form creation.
while len(field_list):
field_list.pop_entry()
for slug, att_data in sorted(db_prop_value.iteritems()):
file_select_form_group = _attachment_build_single_field(schema_prop)
subform = file_select_form_group()
# Even uglier hard-coded field assignments.
subform.slug = slug
subform.oid = att_data['oid']
subform.link = 'self'
subform.link_custom = None
if 'link' in att_data:
subform.link = att_data['link']
if 'link_custom' in att_data:
subform.link_custom = att_data['link_custom']
field_list.append_entry(subform)
def attachment_form_parse_post_data(data):
"""Returns a dict that can be stored in the node.properties.attachments."""
attachments = {}
# 'allprops' contains all properties, including the slug (which should be a key).
for allprops in data:
oid = allprops['oid']
slug = allprops['slug']
link = allprops['link']
link_custom = allprops['link_custom']
if not allprops['slug'] and not oid:
continue
if slug in attachments:
raise ValueError('Slug "%s" is used more than once' % slug)
attachments[slug] = {'oid': oid}
attachments[slug]['link'] = link
if link == 'custom':
attachments[slug]['link_custom'] = link_custom
return attachments
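A quick sketch of the conversion above, fed with made-up form data (the OIDs and slugs are invented; the empty trailing row mimics the extra entry produced by min_entries=1):
form_data = [
    {'slug': u'header',  'oid': u'572761099837730efe8e120d', 'link': 'self',   'link_custom': None},
    {'slug': u'diagram', 'oid': u'572761099837730efe8e120e', 'link': 'custom',
     'link_custom': u'https://example.com/'},
    {'slug': u'', 'oid': u'', 'link': 'self', 'link_custom': None},  # empty row, skipped
]
print(attachment_form_parse_post_data(form_data))
# {u'header':  {'oid': u'572761099837730efe8e120d', 'link': 'self'},
#  u'diagram': {'oid': u'572761099837730efe8e120e', 'link': 'custom',
#               'link_custom': u'https://example.com/'}}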

View File

@@ -1,2 +1,8 @@
def append_custom_node_endpoints():
pass
def setup_app(app):
from . import posts
posts.setup_app(app)

View File

@@ -1,16 +1,19 @@
import logging
import warnings
from flask import current_app
from flask import request
from flask import jsonify
from flask import render_template
from flask.ext.login import login_required
from flask.ext.login import current_user
from flask_login import login_required, current_user
from pillarsdk import Node
from pillarsdk import Project
import werkzeug.exceptions as wz_exceptions
from pillar.web import subquery
from pillar.web.nodes.routes import blueprint
from pillar.web.utils import gravatar
from pillar.web.utils import pretty_date
from pillar.web.utils import pretty_date, datetime_now
from pillar.web.utils import system_util
log = logging.getLogger(__name__)
@@ -21,10 +24,22 @@ log = logging.getLogger(__name__)
def comments_create():
content = request.form['content']
parent_id = request.form.get('parent_id')
if not parent_id:
log.warning('User %s tried to create comment without parent_id', current_user.objectid)
raise wz_exceptions.UnprocessableEntity()
api = system_util.pillar_api()
parent_node = Node.find(parent_id, api=api)
if not parent_node:
log.warning('Unable to create comment for user %s, parent node %r not found',
current_user.objectid, parent_id)
raise wz_exceptions.UnprocessableEntity()
node_asset_props = dict(
log.info('Creating comment for user %s on parent node %r',
current_user.objectid, parent_id)
comment_props = dict(
project=parent_node.project,
name='Comment',
user=current_user.objectid,
@@ -37,45 +52,36 @@ def comments_create():
rating_negative=0))
if parent_id:
node_asset_props['parent'] = parent_id
comment_props['parent'] = parent_id
# Get the parent node and check if it's a comment, in which case we flag
# the current comment as a reply.
parent_node = Node.find(parent_id, api=api)
if parent_node.node_type == 'comment':
node_asset_props['properties']['is_reply'] = True
comment_props['properties']['is_reply'] = True
node_asset = Node(node_asset_props)
node_asset.create(api=api)
comment = Node(comment_props)
comment.create(api=api)
return jsonify(
asset_id=node_asset._id,
content=node_asset.properties.content)
return jsonify({'node_id': comment._id}), 201
@blueprint.route('/comments/<string(length=24):comment_id>', methods=['POST'])
@login_required
def comment_edit(comment_id):
"""Allows a user to edit their comment (or any they have PUT access to)."""
"""Allows a user to edit their comment."""
api = system_util.pillar_api()
# Fetch the old comment.
comment_node = Node.find(comment_id, api=api)
if comment_node.node_type != 'comment':
log.info('POST to %s node %s done as if it were a comment edit; rejected.',
comment_node.node_type, comment_id)
raise wz_exceptions.BadRequest('Node ID is not a comment.')
comment = Node({'_id': comment_id})
result = comment.patch({'op': 'edit', 'content': request.form['content']}, api=api)
assert result['_status'] == 'OK'
# Update the node.
comment_node.properties.content = request.form['content']
update_ok = comment_node.update(api=api)
if not update_ok:
log.warning('Unable to update comment node %s: %s',
comment_id, comment_node.error)
raise wz_exceptions.InternalServerError('Unable to update comment node, unknown why.')
return '', 204
return jsonify({
'status': 'success',
'data': {
'content_html': result.properties.content_html,
}})
def format_comment(comment, is_reply=False, is_team=False, replies=None):
@@ -105,7 +111,7 @@ def format_comment(comment, is_reply=False, is_team=False, replies=None):
return dict(_id=comment._id,
gravatar=gravatar(comment.user.email, size=32),
time_published=pretty_date(comment._created, detail=True),
time_published=pretty_date(comment._created or datetime_now(), detail=True),
rating=comment.properties.rating_positive - comment.properties.rating_negative,
author=comment.user.full_name,
author_username=comment.user.username,
@@ -120,6 +126,8 @@ def format_comment(comment, is_reply=False, is_team=False, replies=None):
@blueprint.route("/comments/")
def comments_index():
warnings.warn('comments_index() is deprecated in favour of comments_for_node()')
parent_id = request.args.get('parent_id')
# Get data only if we format it
api = system_util.pillar_api()
@@ -153,6 +161,76 @@ def comments_index():
return return_content
@blueprint.route('/<string(length=24):node_id>/comments')
def comments_for_node(node_id):
"""Shows the comments attached to the given node."""
api = system_util.pillar_api()
node = Node.find(node_id, api=api)
project = Project({'_id': node.project})
can_post_comments = project.node_type_has_method('comment', 'POST', api=api)
can_comment_override = request.args.get('can_comment', 'True') == 'True'
can_post_comments = can_post_comments and can_comment_override
# Query for all children, i.e. comments on the node.
comments = Node.all({
'where': {'node_type': 'comment', 'parent': node_id},
}, api=api)
def enrich(some_comment):
some_comment['_user'] = subquery.get_user_info(some_comment['user'])
some_comment['_is_own'] = some_comment['user'] == current_user.objectid
some_comment['_current_user_rating'] = None # tri-state boolean
some_comment['_rating'] = some_comment.properties.rating_positive - some_comment.properties.rating_negative
if current_user.is_authenticated:
for rating in some_comment.properties.ratings or ():
if rating.user != current_user.objectid:
continue
some_comment['_current_user_rating'] = rating.is_positive
for comment in comments['_items']:
# Query for all grandchildren, i.e. replies to comments on the node.
comment['_replies'] = Node.all({
'where': {'node_type': 'comment', 'parent': comment['_id']},
}, api=api)
enrich(comment)
for reply in comment['_replies']['_items']:
enrich(reply)
nr_of_comments = sum(1 + comment['_replies']['_meta']['total']
for comment in comments['_items'])
return render_template('nodes/custom/comment/list_embed.html',
node_id=node_id,
comments=comments,
nr_of_comments=nr_of_comments,
show_comments=True,
can_post_comments=can_post_comments)
@blueprint.route('/<string(length=24):node_id>/commentform')
def commentform_for_node(node_id):
"""Shows only the comment for for comments attached to the given node.
i.e. does not show the comments themselves, just the form to post a new comment.
"""
api = system_util.pillar_api()
node = Node.find(node_id, api=api)
project = Project({'_id': node.project})
can_post_comments = project.node_type_has_method('comment', 'POST', api=api)
return render_template('nodes/custom/comment/list_embed.html',
node_id=node_id,
show_comments=False,
can_post_comments=can_post_comments)
@blueprint.route("/comments/<comment_id>/rate/<operation>", methods=['POST'])
@login_required
def comments_rate(comment_id, operation):
@@ -170,13 +248,8 @@ def comments_rate(comment_id, operation):
api = system_util.pillar_api()
comment = Node.find(comment_id, {'projection': {'_id': 1}}, api=api)
if not comment:
log.info('Node %i not found; how could someone click on the upvote/downvote button?',
comment_id)
raise wz_exceptions.NotFound()
# PATCH the node and return the result.
comment = Node({'_id': comment_id})
result = comment.patch({'op': operation}, api=api)
assert result['_status'] == 'OK'

View File

@@ -1,7 +1,6 @@
from flask import request
from flask import jsonify
from flask.ext.login import login_required
from flask.ext.login import current_user
from flask_login import login_required, current_user
from pillarsdk import Node
from pillar.web.utils import system_util
from ..routes import blueprint

View File

@@ -2,62 +2,45 @@ from pillarsdk import Node
from pillarsdk import Project
from pillarsdk.exceptions import ResourceNotFound
from flask import abort
from flask import current_app
from flask import render_template
from flask import redirect
from flask.ext.login import login_required
from flask.ext.login import current_user
from flask_login import login_required, current_user
from pillar.web.utils import system_util
from pillar.web.utils import attach_project_pictures
from pillar.web.utils import get_file
from pillar.web.utils import current_user_is_authenticated
from pillar.web.nodes.routes import blueprint
from pillar.web.nodes.routes import url_for_node
from pillar.web.nodes.forms import get_node_form
from pillar.web.nodes.forms import process_node_form
import pillar.web.nodes.attachments
from pillar.web.projects.routes import project_update_nodes_list
def posts_view(project_id, url=None):
# Cached, see setup_app() below.
def posts_view(project_id=None, project_url=None, url=None):
"""View individual blogpost"""
if bool(project_id) == bool(project_url):
raise ValueError('posts_view(): pass either project_id or project_url')
api = system_util.pillar_api()
# Fetch the project (for background images and link generation)
if project_id:
project = Project.find(project_id, api=api)
else:
project = Project.find_one({'where': {'url': project_url}}, api=api)
project_id = project['_id']
attach_project_pictures(project, api)
try:
blog = Node.find_one({
'where': {'node_type': 'blog', 'project': project_id},
}, api=api)
except ResourceNotFound:
abort(404)
if url:
try:
post = Node.find_one({
'where': '{"parent": "%s", "properties.url": "%s"}' % (blog._id, url),
'embedded': '{"node_type": 1, "user": 1}',
}, api=api)
if post.picture:
post.picture = get_file(post.picture, api=api)
except ResourceNotFound:
return abort(404)
# If post is not published, check that the user is also the author of
# the post. If not, return 404.
if post.properties.status != "published":
if current_user.is_authenticated:
if not post.has_method('PUT'):
abort(403)
else:
abort(403)
return render_template(
'nodes/custom/post/view.html',
blog=blog,
node=post,
project=project,
title='blog',
api=api)
else:
node_type_post = project.get_node_type('post')
status_query = "" if blog.has_method('PUT') else ', "properties.status": "published"'
posts = Node.all({
'where': '{"parent": "%s" %s}' % (blog._id, status_query),
@@ -68,8 +51,42 @@ def posts_view(project_id, url=None):
for post in posts._items:
post.picture = get_file(post.picture, api=api)
post['properties']['content'] = pillar.web.nodes.attachments.render_attachments(
post, post['properties']['content'])
# Use the *_main_project.html template for the main blog
main_project_template = '_main_project' if project_id == current_app.config['MAIN_PROJECT_ID'] else ''
if url:
post = Node.find_one({
'where': {'parent': blog._id, 'properties.url': url},
'embedded': {'node_type': 1, 'user': 1},
}, api=api)
if post.picture:
post.picture = get_file(post.picture, api=api)
# If post is not published, check that the user is also the author of
# the post. If not, return 403.
if post.properties.status != "published":
if not (current_user.is_authenticated and post.has_method('PUT')):
abort(403)
post['properties']['content'] = pillar.web.nodes.attachments.render_attachments(
post, post['properties']['content'])
return render_template(
'nodes/custom/blog/index.html',
'nodes/custom/post/view{0}.html'.format(main_project_template),
blog=blog,
node=post,
posts=posts._items,
project=project,
title='blog',
api=api)
else:
node_type_post = project.get_node_type('post')
template_path = 'nodes/custom/blog/index.html'
return render_template(
'nodes/custom/blog/index{0}.html'.format(main_project_template),
node_type_post=node_type_post,
posts=posts._items,
project=project,
@@ -123,46 +140,8 @@ def posts_create(project_id):
api=api)
@blueprint.route("/posts/<post_id>/edit", methods=['GET', 'POST'])
@login_required
def posts_edit(post_id):
api = system_util.pillar_api()
def setup_app(app):
global posts_view
try:
post = Node.find(post_id, {
'embedded': '{"user": 1}'}, api=api)
except ResourceNotFound:
return abort(404)
# Check if user is allowed to edit the post
if not post.has_method('PUT'):
return abort(403)
project = Project.find(post.project, api=api)
attach_project_pictures(project, api)
node_type = project.get_node_type(post.node_type)
form = get_node_form(node_type)
if form.validate_on_submit():
if process_node_form(form, node_id=post_id, node_type=node_type,
user=current_user.objectid):
# If the post is published, add it to the list
if form.status.data == 'published':
project_update_nodes_list(post, project_id=project._id, list_name='blog')
return redirect(url_for_node(node=post))
form.parent.data = post.parent
form.name.data = post.name
form.content.data = post.properties.content
form.status.data = post.properties.status
form.url.data = post.properties.url
if post.picture:
form.picture.data = post.picture
# Embed picture file
post.picture = get_file(post.picture, api=api)
if post.properties.picture_square:
form.picture_square.data = post.properties.picture_square
return render_template('nodes/custom/post/edit.html',
node_type=node_type,
post=post,
form=form,
project=project,
api=api)
memoize = app.cache.memoize(timeout=3600, unless=current_user_is_authenticated)
posts_view = memoize(posts_view)
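The global rebinding above swaps the module-level posts_view for its memoized wrapper, so callers in other modules (main_blog, project_blog) transparently hit the cache, while authenticated users bypass it via unless=current_user_is_authenticated. A minimal standalone sketch of the same pattern, with made-up names and Flask-Cache assumed:
from flask_cache import Cache   # assumption: Flask-Cache backs app.cache
cache = Cache()
def expensive_view(url=None):
    pass  # stand-in for the real, uncached implementation
def setup_app(app):
    global expensive_view
    cache.init_app(app)
    # Rebinding the module-level name routes every later call through the cache.
    expensive_view = cache.memoize(timeout=3600)(expensive_view)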

pillar/web/nodes/finders.py Normal file
View File

@@ -0,0 +1,125 @@
"""Node-URL-finding microframework."""
import logging
from flask import current_app, url_for
import pillarsdk
from pillarsdk import Node
from pillarsdk.exceptions import ResourceNotFound
from pillar.web.utils import caching
from pillar.web import system_util
log = logging.getLogger(__name__)
node_url_finders = {} # mapping from node type to callable.
def register_node_finder(node_type):
"""Decorator, registers the decorated function as node finder for the given node type."""
def wrapper(func):
if node_type in node_url_finders:
raise ValueError('Node type %r already handled by %r' %
(node_type, node_url_finders[node_type]))
log.debug('Registering %s node finder for node type %r',
func, node_type)
node_url_finders[node_type] = func
return func
return wrapper
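For illustration, registering a finder for a new node type would look like the snippet below (the 'page' node type and the route used here are assumptions for the example, not part of this commit):
@register_node_finder('page')
def find_for_page(project, node):
    """Hypothetical finder: pages would be served under their project's URL."""
    the_project = project_url(project['_id'], project=project)
    return url_for('projects.view_node',
                   project_url=the_project.url,
                   node_id=node.properties.url)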
@register_node_finder('comment')
def find_for_comment(project, node):
"""Returns the URL for a comment."""
api = system_util.pillar_api()
parent = node
while parent.node_type == 'comment':
if isinstance(parent.parent, pillarsdk.Resource):
parent = parent.parent
continue
try:
parent = Node.find(parent.parent, api=api)
except ResourceNotFound:
log.warning(
'url_for_node(node_id=%r): Unable to find parent node %r',
node['_id'], parent.parent)
raise ValueError('Unable to find parent node %r' % parent.parent)
# Find the redirection URL for the parent node.
parent_url = find_url_for_node(parent)
if '#' in parent_url:
# We can't attach yet another fragment, so just don't link to
# the comment for now.
return parent_url
return parent_url + '#{}'.format(node['_id'])
@register_node_finder('post')
def find_for_post(project, node):
"""Returns the URL for a blog post."""
project_id = project['_id']
if str(project_id) == current_app.config['MAIN_PROJECT_ID']:
return url_for('main.main_blog',
url=node.properties.url)
the_project = project_url(project_id, project=project)
return url_for('main.project_blog',
project_url=the_project.url,
url=node.properties.url)
def find_for_other(project, node):
"""Fallback: Assets, textures, and other node types.
Hard-coded fallback, so doesn't need @register_node_finder() decoration.
"""
the_project = project_url(project['_id'], project=project)
return url_for('projects.view_node',
project_url=the_project.url,
node_id=node['_id'])
@caching.cache_for_request()
def project_url(project_id, project):
"""Returns the project, raising a ValueError if it can't be found.
Uses the "urler" service endpoint.
"""
if project is not None:
return project
if not current_app.config['URLER_SERVICE_AUTH_TOKEN']:
log.error('No URLER_SERVICE_AUTH_TOKEN token, unable to use URLer service.')
return None
urler_api = system_util.pillar_api(
token=current_app.config['URLER_SERVICE_AUTH_TOKEN'])
return pillarsdk.Project.find_from_endpoint(
'/service/urler/%s' % project_id, api=urler_api)
# Cache the actual URL based on the node ID, for the duration of the request.
@caching.cache_for_request()
def find_url_for_node(node):
# Find the node's project, or its ID, depending on whether a project
# was embedded. This is needed by some finder functions.
if isinstance(node.project, pillarsdk.Resource):
# Embedded project
project = node.project
else:
project = project_url(node.project, None)
# Determine which function to use to find the correct URL.
finder = node_url_finders.get(node.node_type, find_for_other)
return finder(project, node)

View File

@@ -19,14 +19,30 @@ from wtforms import FieldList
from wtforms.validators import DataRequired
from pillar.web.utils import system_util
from pillar.web.utils.forms import FileSelectField
from pillar.web.utils.forms import ProceduralFileSelectForm
from pillar.web.utils.forms import CustomFormField
from pillar.web.utils.forms import build_file_select_form
from . import attachments
log = logging.getLogger(__name__)
def add_form_properties(form_class, node_schema, form_schema, prefix=''):
def iter_node_properties(node_type):
"""Generator, iterates over all node properties with form schema."""
node_schema = node_type['dyn_schema'].to_dict()
form_schema = node_type['form_schema'].to_dict()
for prop_name, prop_schema in node_schema.iteritems():
prop_fschema = form_schema.get(prop_name, {})
if not prop_fschema.get('visible', True):
continue
yield prop_name, prop_schema, prop_fschema
def add_form_properties(form_class, node_type):
"""Add fields to a form based on the node and form schema provided.
:type node_schema: dict
:param node_schema: the validation schema used by Cerberus
@@ -37,33 +53,16 @@ def add_form_properties(form_class, node_schema, form_schema, prefix=''):
show and hide)
"""
for prop, schema_prop in node_schema.iteritems():
form_prop = form_schema.get(prop, {})
if prop == 'items':
continue
if not form_prop.get('visible', True):
continue
prop_name = "{0}{1}".format(prefix, prop)
for prop_name, schema_prop, form_prop in iter_node_properties(node_type):
# Recursive call if detects a dict
field_type = schema_prop['type']
if field_type == 'dict':
# This works if the dictionary schema is hardcoded.
# If we define it using propertyschema and valueschema, this
# validation pattern does not work and crashes.
add_form_properties(form_class, schema_prop['schema'],
form_prop['schema'], "{0}__".format(prop_name))
continue
if field_type == 'list':
if prop == 'attachments':
# class AttachmentForm(Form):
# pass
# AttachmentForm.file = FileSelectField('file')
# AttachmentForm.size = StringField()
# AttachmentForm.slug = StringField()
field = FieldList(CustomFormField(ProceduralFileSelectForm))
elif prop == 'files':
if field_type == 'dict':
assert prop_name == 'attachments'
field = attachments.attachment_form_group_create(schema_prop)
elif field_type == 'list':
if prop_name == 'files':
schema = schema_prop['schema']['schema']
file_select_form = build_file_select_form(schema)
field = FieldList(CustomFormField(file_select_form),
@@ -112,8 +111,6 @@ def get_node_form(node_type):
class ProceduralForm(Form):
pass
node_schema = node_type['dyn_schema'].to_dict()
form_prop = node_type['form_schema'].to_dict()
parent_prop = node_type['parent']
ProceduralForm.name = StringField('Name', validators=[DataRequired()])
@@ -126,7 +123,7 @@ def get_node_form(node_type):
ProceduralForm.picture = FileSelectField('Picture', file_format='image')
ProceduralForm.node_type = HiddenField(default=node_type['name'])
add_form_properties(ProceduralForm, node_schema, form_prop)
add_form_properties(ProceduralForm, node_type)
return ProceduralForm()
@@ -166,41 +163,25 @@ def process_node_form(form, node_id=None, node_type=None, user=None):
if form.parent.data != "":
node.parent = form.parent.data
def update_data(node_schema, form_schema, prefix=""):
for pr in node_schema:
schema_prop = node_schema[pr]
form_prop = form_schema.get(pr, {})
if pr == 'items':
continue
if 'visible' in form_prop and not form_prop['visible']:
continue
prop_name = "{0}{1}".format(prefix, pr)
if schema_prop['type'] == 'dict':
update_data(
schema_prop['schema'],
form_prop['schema'],
"{0}__".format(prop_name))
continue
for prop_name, schema_prop, form_prop in iter_node_properties(node_type):
data = form[prop_name].data
if schema_prop['type'] == 'dict':
if data == 'None':
continue
data = attachments.attachment_form_parse_post_data(data)
elif schema_prop['type'] == 'integer':
if data == '':
data = 0
else:
data = int(form[prop_name].data)
elif schema_prop['type'] == 'datetime':
data = datetime.strftime(data,
app.config['RFC1123_DATE_FORMAT'])
data = datetime.strftime(data, current_app.config['RFC1123_DATE_FORMAT'])
elif schema_prop['type'] == 'list':
if pr == 'attachments':
# data = json.loads(data)
data = [dict(field='description', files=data)]
elif pr == 'files':
if prop_name == 'files':
# Only keep those items that actually refer to a file.
data = [file_item for file_item in data
if file_item.get('file')]
else:
log.warning('Ignoring property %s of type %s',
prop_name, schema_prop['type'])
# elif pr == 'tags':
# data = [tag.strip() for tag in data.split(',')]
elif schema_prop['type'] == 'objectid':
@@ -209,16 +190,17 @@ def process_node_form(form, node_id=None, node_type=None, user=None):
# SDK before node.update()
data = None
else:
if pr in form:
if prop_name in form:
data = form[prop_name].data
path = prop_name.split('__')
assert len(path) == 1
if len(path) > 1:
recursive_prop = recursive(
path, node.properties.to_dict(), data)
node.properties = recursive_prop
else:
node.properties[prop_name] = data
update_data(node_schema, form_schema)
ok = node.update(api=api)
if not ok:
log.warning('Unable to update node: %s', node.error)

View File

@@ -19,7 +19,7 @@ from flask import abort
from flask_login import current_user
from werkzeug.exceptions import NotFound
from wtforms import SelectMultipleField
from flask.ext.login import login_required
from flask_login import login_required
from jinja2.exceptions import TemplateNotFound
from pillar.web.utils import caching
@@ -28,12 +28,14 @@ from pillar.web.nodes.forms import process_node_form
from pillar.web.nodes.custom.storage import StorageNode
from pillar.web.projects.routes import project_update_nodes_list
from pillar.web.utils import get_file
from pillar.web.utils import attach_project_pictures
from pillar.web.utils.jstree import jstree_build_children
from pillar.web.utils.jstree import jstree_build_from_node
from pillar.web.utils.forms import ProceduralFileSelectForm
from pillar.web.utils.forms import build_file_select_form
from pillar.web import system_util
from . import finders, attachments
blueprint = Blueprint('nodes', __name__)
log = logging.getLogger(__name__)
@@ -118,8 +120,9 @@ def view(node_id):
node_type_name = node.node_type
if node_type_name == 'post':
# Posts shouldn't be shown at this route, redirect to the correct one.
if node_type_name == 'post' and not request.args.get('embed'):
# Posts shouldn't be shown at this route (unless viewed embedded, typically
# after an edit). Redirect to the correct one.
return redirect(url_for_node(node=node))
# Set the default name of the template path based on the node name
@@ -190,6 +193,9 @@ def view(node_id):
for child in children:
child.picture = get_file(child.picture, api=api)
if 'description' in node:
node['description'] = attachments.render_attachments(node, node['description'])
if request.args.get('format') == 'json':
node = node.to_dict()
node['url_edit'] = url_for('nodes.edit', node_id=node['_id'])
@@ -204,14 +210,12 @@ def view(node_id):
template_action = 'view_theatre'
template_path = '{0}/{1}_embed.html'.format(template_path, template_action)
# template_path_full = os.path.join(current_app.config['TEMPLATES_PATH'], template_path)
#
# # Check if template exists on the filesystem
# if not os.path.exists(template_path_full):
# log.warning('Template %s does not exist for node type %s',
# template_path, node_type_name)
# raise NotFound("Missing template '{0}'".format(template_path))
# Full override for AMP view
if request.args.get('format') == 'amp':
template_path = 'nodes/view_amp.html'
try:
return render_template(template_path,
node_id=node._id,
node=node,
@@ -219,6 +223,15 @@ def view(node_id):
children=children,
config=current_app.config,
api=api)
except TemplateNotFound:
log.error('Template %s does not exist for node type %s', template_path, node_type_name)
return render_template('nodes/error_type_not_found.html',
node_id=node._id,
node=node,
parent=node.parent,
children=children,
config=current_app.config,
api=api)
def _view_handler_asset(node, template_path, template_action, link_allowed):
@@ -246,7 +259,7 @@ def _view_handler_asset(node, template_path, template_action, link_allowed):
# TODO: move this to Pillar
if f.backend == 'cdnsun':
f.link = "{0}&name={1}.{2}".format(f.link, node.name, f.format)
node.video_sources = json.dumps(sources)
node.video_sources = sources
node.file_variations = node_file.variations
else:
node.video_sources = None
@@ -307,27 +320,18 @@ def edit(node_id):
"""Generic node editing form
"""
def set_properties(dyn_schema, form_schema, node_properties, form,
prefix="",
set_data=True):
def set_properties(dyn_schema, form_schema, node_properties, form, set_data,
prefix=""):
"""Initialize custom properties for the form. We run this function once
before validating the form, with set_data=False, so that we can fill
any multiselect field that was originally empty with the current
choices.
"""
for prop in dyn_schema:
schema_prop = dyn_schema[prop]
form_prop = form_schema.get(prop, {})
prop_name = "{0}{1}".format(prefix, prop)
if schema_prop['type'] == 'dict':
set_properties(
schema_prop['schema'],
form_prop['schema'],
node_properties[prop_name],
form,
"{0}__".format(prop_name))
continue
log.debug('set_properties(..., prefix=%r, set_data=%r) called', prefix, set_data)
for prop, schema_prop in dyn_schema.iteritems():
prop_name = "{0}{1}".format(prefix, prop)
if prop_name not in form:
continue
@@ -350,29 +354,27 @@ def edit(node_id):
if not form[prop_name].choices:
form[prop_name].choices = [(d, d) for d in db_prop_value]
# Choices should be a tuple with value and name
if not set_data:
continue
# Assign data to the field
if set_data:
if prop_name == 'attachments':
for attachment_collection in db_prop_value:
for a in attachment_collection['files']:
attachment_form = ProceduralFileSelectForm()
attachment_form.file = a['file']
attachment_form.slug = a['slug']
attachment_form.size = 'm'
form[prop_name].append_entry(attachment_form)
# If attachments is an empty list, do not append data
if not db_prop_value:
continue
attachments.attachment_form_group_set_data(db_prop_value, schema_prop,
form[prop_name])
elif prop_name == 'files':
schema = schema_prop['schema']['schema']
subschema = schema_prop['schema']['schema']
# Extra entries are caused by min_entries=1 in the form
# creation.
field_list = form[prop_name]
if len(db_prop_value) > 0:
if len(db_prop_value):
while len(field_list):
field_list.pop_entry()
for file_data in db_prop_value:
file_form_class = build_file_select_form(schema)
file_form_class = build_file_select_form(subschema)
subform = file_form_class()
for key, value in file_data.iteritems():
setattr(subform, key, value)
@@ -382,18 +384,6 @@ def edit(node_id):
# form[prop_name].data = ', '.join(data)
else:
form[prop_name].data = db_prop_value
else:
# Default population of multiple file form list (only if
# we are getting the form)
if request.method == 'POST':
continue
if prop_name == 'attachments':
if not db_prop_value:
attachment_form = ProceduralFileSelectForm()
attachment_form.file = 'file'
attachment_form.slug = ''
attachment_form.size = ''
form[prop_name].append_entry(attachment_form)
api = system_util.pillar_api()
node = Node.find(node_id, api=api)
@@ -404,7 +394,6 @@ def edit(node_id):
dyn_schema = node_type['dyn_schema'].to_dict()
form_schema = node_type['form_schema'].to_dict()
error = ""
node_properties = node.properties.to_dict()
ensure_lists_exist_as_empty(node.to_dict(), node_type)
@@ -415,11 +404,9 @@ def edit(node_id):
if process_node_form(form, node_id=node_id, node_type=node_type, user=user_id):
# Handle the specific case of a blog post
if node_type.name == 'post':
project_update_nodes_list(node, list_name='blog')
project_update_nodes_list(node, project_id=project._id, list_name='blog')
else:
project_update_nodes_list(node)
# Emergency hardcore cache flush
# cache.clear()
project_update_nodes_list(node, project_id=project._id)
return redirect(url_for('nodes.view', node_id=node_id, embed=1,
_external=True,
_scheme=current_app.config['SCHEME']))
@@ -429,7 +416,6 @@ def edit(node_id):
else:
if form.errors:
log.debug('Form errors: %s', form.errors)
# Populate Form
form.name.data = node.name
form.description.data = node.description
@@ -437,8 +423,7 @@ def edit(node_id):
form.picture.data = node.picture
if node.parent:
form.parent.data = node.parent
set_properties(dyn_schema, form_schema, node_properties, form)
set_properties(dyn_schema, form_schema, node_properties, form, set_data=True)
# Get previews
node.picture = get_file(node.picture, api=api) if node.picture else None
@@ -453,13 +438,13 @@ def edit(node_id):
embed_string = ''
# Check if we want to embed the content via an AJAX call
if request.args.get('embed'):
if request.args.get('embed') == '1':
# Define the prefix for the embedded template
embed_string = '_embed'
else:
attach_project_pictures(project, api)
template = '{0}/edit{1}.html'.format(node_type['name'], embed_string)
# We should more simply check if the template file actually exists on
# the filesystem level
try:
@@ -473,6 +458,7 @@ def edit(node_id):
api=api)
except TemplateNotFound:
template = 'nodes/edit{1}.html'.format(node_type['name'], embed_string)
is_embedded_edit = True if embed_string else False
return render_template(
template,
node=node,
@@ -480,7 +466,10 @@ def edit(node_id):
form=form,
errors=form.errors,
error=error,
api=api)
api=api,
project=project,
is_embedded_edit=is_embedded_edit,
)
def ensure_lists_exist_as_empty(node_doc, node_type):
@@ -578,8 +567,9 @@ def url_for_node(node_id=None, node=None):
api = system_util.pillar_api()
# Find node by its ID, or the ID by the node, depending on what was passed
# as parameters.
if node_id is None and node is None:
raise ValueError('Either node or node_id must be given')
if node is None:
try:
node = Node.find(node_id, api=api)
@@ -587,98 +577,9 @@ def url_for_node(node_id=None, node=None):
log.warning(
'url_for_node(node_id=%r, node=None): Unable to find node.',
node_id)
raise ValueError('Unable to find node %r' % node_id)
elif node_id is None:
node_id = node['_id']
else:
raise ValueError('Either node or node_id must be given')
raise NotFound('Unable to find node %r' % node_id)
return _find_url_for_node(node_id, node=node)
@caching.cache_for_request()
def project_url(project_id, project):
"""Returns the project, raising a ValueError if it can't be found.
Uses the "urler" service endpoint.
"""
if project is not None:
return project
urler_api = system_util.pillar_api(
token=current_app.config['URLER_SERVICE_AUTH_TOKEN'])
return Project.find_from_endpoint(
'/service/urler/%s' % project_id, api=urler_api)
# Cache the actual URL based on the node ID, for the duration of the request.
@caching.cache_for_request()
def _find_url_for_node(node_id, node):
api = system_util.pillar_api()
# Find the node's project, or its ID, depending on whether a project
# was embedded. This is needed in two of the three finder functions.
project_id = node.project
if isinstance(project_id, pillarsdk.Resource):
# Embedded project
project = project_id
project_id = project['_id']
else:
project = None
def find_for_comment():
"""Returns the URL for a comment."""
parent = node
while parent.node_type == 'comment':
if isinstance(parent.parent, pillarsdk.Resource):
parent = parent.parent
continue
try:
parent = Node.find(parent.parent, api=api)
except ResourceNotFound:
log.warning(
'url_for_node(node_id=%r): Unable to find parent node %r',
node_id, parent.parent)
raise ValueError('Unable to find parent node %r' % parent.parent)
# Find the redirection URL for the parent node.
parent_url = url_for_node(node=parent)
if '#' in parent_url:
# We can't attach yet another fragment, so just don't link to
# the comment for now.
return parent_url
return parent_url + '#{}'.format(node_id)
def find_for_post():
"""Returns the URL for a blog post."""
if str(project_id) == current_app.config['MAIN_PROJECT_ID']:
return url_for('main.main_blog',
url=node.properties.url)
the_project = project_url(project_id, project=project)
return url_for('main.project_blog',
project_url=the_project.url,
url=node.properties.url)
# Fallback: Assets, textures, and other node types.
def find_for_other():
the_project = project_url(project_id, project=project)
return url_for('projects.view_node',
project_url=the_project.url,
node_id=node_id)
# Determine which function to use to find the correct URL.
url_finders = {
'comment': find_for_comment,
'post': find_for_post,
}
finder = url_finders.get(node.node_type, find_for_other)
return finder()
return finders.find_url_for_node(node)
# Import of custom modules (using the same nodes decorator)

View File

@@ -4,8 +4,7 @@ from flask import Blueprint
from flask import request
from flask import url_for
from flask import abort
from flask.ext.login import login_required
from flask.ext.login import current_user
from flask_login import login_required, current_user
from pillarsdk.activities import Notification
from pillarsdk.activities import ActivitySubscription
from pillar.web.utils import system_util

View File

@@ -1,5 +1,6 @@
import json
import logging
import itertools
from pillarsdk import Node
from pillarsdk import Project
@@ -13,8 +14,7 @@ from flask import session
from flask import abort
from flask import redirect
from flask import url_for
from flask.ext.login import login_required
from flask.ext.login import current_user
from flask_login import login_required, current_user
import werkzeug.exceptions as wz_exceptions
from pillar.web import system_util
@@ -251,17 +251,22 @@ def render_project(project, api, extra_context=None, template_name=None):
project.picture_square = utils.get_file(project.picture_square, api=api)
project.picture_header = utils.get_file(project.picture_header, api=api)
def load_latest(list_of_ids, get_picture=False):
def load_latest(list_of_ids, node_type=None):
"""Loads a list of IDs in reversed order."""
if not list_of_ids:
return []
# Construct query parameters outside the loop.
projection = {'name': 1, 'user': 1, 'node_type': 1, 'project': 1, 'properties.url': 1}
projection = {'name': 1, 'user': 1, 'node_type': 1, 'project': 1,
'properties.url': 1, 'properties.content_type': 1,
'picture': 1}
params = {'projection': projection, 'embedded': {'user': 1}}
if get_picture:
projection['picture'] = 1
if node_type == 'post':
projection['properties.content'] = 1
elif node_type == 'asset':
projection['description'] = 1
list_latest = []
for node_id in reversed(list_of_ids or ()):
@@ -278,9 +283,16 @@ def render_project(project, api, extra_context=None, template_name=None):
return list_latest
project.nodes_latest = load_latest(project.nodes_latest)
project.nodes_featured = load_latest(project.nodes_featured, get_picture=True)
project.nodes_blog = load_latest(project.nodes_blog)
project.nodes_featured = load_latest(project.nodes_featured, node_type='asset')
project.nodes_blog = load_latest(project.nodes_blog, node_type='post')
# Merge featured assets and blog posts into one activity stream
def sort_key(item):
return item._created
activities = itertools.chain(project.nodes_featured,
project.nodes_blog)
activity_stream = sorted(activities, key=sort_key, reverse=True)
if extra_context is None:
extra_context = {}
@@ -301,6 +313,8 @@ def render_project(project, api, extra_context=None, template_name=None):
embed_string = ''
template_name = "projects/view{0}.html".format(embed_string)
extension_sidebar_links = current_app.extension_sidebar_links(project)
return render_template(template_name,
api=api,
project=project,
@@ -308,13 +322,34 @@ def render_project(project, api, extra_context=None, template_name=None):
show_node=False,
show_project=True,
og_picture=project.picture_header,
activity_stream=activity_stream,
extension_sidebar_links=extension_sidebar_links,
**extra_context)
def render_node_page(project_url, page_url, api):
"""Custom behaviour for pages, which are nodes, but accessible on a custom
route base.
"""
# TODO: ensure this is not called for the home project, as it would
# generate conflicting websites
project = find_project_or_404(project_url, api=api)
try:
page = Node.find_one({
'where': {
'project': project['_id'],
'node_type': 'page',
'properties.url': page_url}}, api=api)
except ResourceNotFound:
raise wz_exceptions.NotFound('No such node')
return project, page
@blueprint.route('/<project_url>/<node_id>')
def view_node(project_url, node_id):
"""Entry point to view a node in the context of a project"""
# Some browsers mangle URLs and URL-encode /p/{p-url}/#node-id
if node_id.startswith('#'):
return redirect(url_for('projects.view_node',
@@ -322,12 +357,14 @@ def view_node(project_url, node_id):
node_id=node_id[1:]),
code=301) # permanent redirect
if not utils.is_valid_id(node_id):
raise wz_exceptions.NotFound('No such node')
api = system_util.pillar_api()
theatre_mode = 't' in request.args
api = system_util.pillar_api()
# First we check if it's a simple string, in which case we are looking for
# a static page. Maybe we could use bson.objectid.ObjectId.is_valid(node_id)
if not utils.is_valid_id(node_id):
# raise wz_exceptions.NotFound('No such node')
project, node = render_node_page(project_url, node_id, api)
else:
# Fetch the node before the project. If this user has access to the
# node, we should be able to get the project URL too.
try:
@@ -355,13 +392,16 @@ def view_node(project_url, node_id):
# Append _theatre to load the proper template
theatre = '_theatre' if theatre_mode else ''
extension_sidebar_links = current_app.extension_sidebar_links(project)
return render_template('projects/view{}.html'.format(theatre),
api=api,
project=project,
node=node,
show_node=True,
show_project=False,
og_picture=og_picture)
og_picture=og_picture,
extension_sidebar_links=extension_sidebar_links)
def find_project_or_404(project_url, embedded=None, api=None):
@@ -392,32 +432,6 @@ def search(project_url):
og_picture=project.picture_header)
@blueprint.route('/<project_url>/about')
def about(project_url):
"""About page of a project"""
# TODO: Duplicated code from view function, we could re-use view instead
api = system_util.pillar_api()
project = find_project_or_404(project_url,
embedded={'header_node': 1},
api=api)
# Load the header video file, if there is any.
header_video_file = None
header_video_node = None
if project.header_node and project.header_node.node_type == 'asset' and \
project.header_node.properties.content_type == 'video':
header_video_node = project.header_node
header_video_file = utils.get_file(project.header_node.properties.file)
header_video_node.picture = utils.get_file(header_video_node.picture)
return render_project(project, api,
extra_context={'title': 'about',
'header_video_file': header_video_file,
'header_video_node': header_video_node})
@blueprint.route('/<project_url>/edit', methods=['GET', 'POST'])
@login_required
def edit(project_url):
@@ -722,7 +736,7 @@ def project_update_nodes_list(node, project_id=None, list_name='latest'):
node_list_name = 'nodes_' + list_name
project[node_list_name] = []
nodes_list = project[node_list_name]
elif len(nodes_list) > 5:
elif len(nodes_list) > 15:
nodes_list.pop(0)
if node._id in nodes_list:

pillar/web/static.py Normal file
View File

@@ -0,0 +1,12 @@
"""Static file handling"""
import flask
import flask.views
class PillarStaticFile(flask.views.MethodView):
def __init__(self, static_folder):
self.static_folder = static_folder
def get(self, filename):
return flask.send_from_directory(self.static_folder, filename)
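A minimal sketch of how such a MethodView could be wired up (the URL rule, endpoint name and folder are assumptions made for the example; the actual registration is not shown in this hunk):
def setup_app(app, static_folder='/var/www/pillar-static'):
    # Serve GET /static/pillar/<filename> straight from static_folder.
    view = PillarStaticFile.as_view('static_pillar', static_folder=static_folder)
    app.add_url_rule('/static/pillar/<path:filename>', view_func=view)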

View File

@@ -932,6 +932,58 @@
"code": 61930,
"src": "fontawesome"
},
{
"uid": "31972e4e9d080eaa796290349ae6c1fd",
"css": "users",
"code": 59502,
"src": "fontawesome"
},
{
"uid": "c8585e1e5b0467f28b70bce765d5840c",
"css": "clipboard-copy",
"code": 61637,
"src": "fontawesome"
},
{
"uid": "b429436ec5a518c78479d44ef18dbd60",
"css": "clipboard-paste",
"code": 61674,
"src": "fontawesome"
},
{
"uid": "9c9f0a69d4abbeb5ff9d779df7679356",
"css": "question",
"code": 62108,
"src": "fontawesome"
},
{
"uid": "1caab45c74b115247eac24cd8abeca7c",
"css": "attract",
"code": 59407,
"src": "custom_icons",
"selected": true,
"svg": {
"path": "M782.9 955.3C780.2 953.2 780.1 951.5 779 900.2 776.9 808.2 774.9 705.6 773.8 640 772.8 580.4 772.6 576.1 769.9 566.8 758.2 526.8 724.6 493.3 647.3 444.8 553.9 386.1 362.5 288.7 243.9 239.5 228.8 233.2 225.5 232.2 224.1 233.7 222.9 235.1 222 260.2 220.3 339.4 219.1 396.5 217.5 445.9 216.8 449.1 210.6 476.3 174.3 527.5 136.4 562.6 106.5 590.2 79.8 608.4 68.2 609L63.3 609.3 62.6 602.1C61.6 593 71.1 300 73.1 277.3 75.2 254.7 78.9 233 82.4 223.6 85.8 214.5 100.2 192.9 124.2 160.8 145.5 132.3 165.8 111.5 200.6 82.6 240.6 49.3 245.4 46.7 266.4 46.6 282.9 46.6 300 50.3 330 60.3 434 95.2 684.3 220.7 805.7 298.8 875.2 343.5 908.4 376.4 920 412.3 923.1 421.8 927.4 446.8 929.3 466.2 930.7 481.4 939.8 749.6 940 783.8 940.1 804 939.9 805.9 936.8 813.4 922.3 848.8 874.6 902.4 828.6 935 813.2 946 800.4 953.2 792.4 955.5 786.2 957.3 785.4 957.3 782.9 955.3ZM495.5 649C474.5 646.4 455.8 635.1 444.9 618.2 437.3 606.6 433.4 593.1 433.4 577.9 433.2 556.9 439.2 542.9 454.7 527.6 470.4 512 486.3 505.9 507.7 507 541.9 508.9 567.4 530.6 574.4 563.7 578 580.7 575.9 595.9 567.6 611.6 553.8 637.9 525.4 652.7 495.5 649Z",
"width": 1000
},
"search": [
"logo_attract"
]
},
{
"uid": "ca37a039341d6828460976d12b89541b",
"css": "flamenco",
"code": 59503,
"src": "custom_icons",
"selected": true,
"svg": {
"path": "M549.1 804.5C531.8 801.6 513.3 791.5 502.4 779.2 486.7 761.3 479.8 732.8 485.4 709.1 492 681 516 657.1 544 650.6 554.9 648.1 577.5 649.5 588.5 653.4 612.5 661.9 630.2 681.2 637.3 706.5 640.5 717.9 639.8 742.3 635.9 753 628.9 772.3 614.3 788.5 595.8 797.4 579.7 805.2 565.9 807.3 549.1 804.5ZM71.2 757.2C70.4 753.7 71.6 708.6 76.5 568.5 77.3 546.2 79 498.5 80.2 462.5 83.2 375.6 87.5 341.9 97.9 323.8 101.9 316.8 127.2 280.8 143.5 259.1 165.1 230.1 183.3 211.9 226.4 175.9 263.4 145.1 269.7 141 284.9 137.8 324.8 129.6 429.7 169.6 622.7 266.5 729 319.9 811.1 364.6 871.7 402.2 910 425.9 921.7 436 929.6 452.1 933.1 459.4 933.4 461 932.1 465.8 931.3 468.8 929.8 472.6 928.8 474.3 924.8 481.2 869.1 539 841.2 565.2 806.9 597.4 799.2 602.6 786.3 602.7 766.7 602.7 752.1 596.3 698.9 564.7 637.8 528.3 573.4 493.4 489 451 368.5 390.3 248.9 336.8 247.8 343 247.8 343.2 246.5 396.7 245 461.8 242 593 242.7 585.1 231.4 607.4 216.6 636.2 192.2 668.6 162.9 697.8 127.3 733.5 89 761 74.9 761 73.2 761 71.8 759.6 71.2 757.2Z",
"width": 1000
},
"search": [
"logo_flamenco"
]
},
{
"uid": "03e6e1bfe72275c6eaa0d0898fde6c1d",
"css": "chatbubble-working",
@@ -1005,7 +1057,7 @@
{
"uid": "c8388cae1ba05fec948ec5af83771377",
"css": "people-outline",
"code": 59399,
"code": 59407,
"src": "custom_icons",
"selected": false,
"svg": {

File diffs suppressed for 3 files because one or more lines are too long

View File

@@ -0,0 +1,755 @@
/*!
* Copyright 2011 Twitter, Inc.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
var Hogan = {};
(function (Hogan, useArrayBuffer) {
Hogan.Template = function (codeObj, text, compiler, options) {
codeObj = codeObj || {};
this.r = codeObj.code || this.r;
this.c = compiler;
this.options = options || {};
this.text = text || '';
this.partials = codeObj.partials || {};
this.subs = codeObj.subs || {};
this.ib();
}
Hogan.Template.prototype = {
// render: replaced by generated code.
r: function (context, partials, indent) { return ''; },
// variable escaping
v: hoganEscape,
// triple stache
t: coerceToString,
render: function render(context, partials, indent) {
return this.ri([context], partials || {}, indent);
},
// render internal -- a hook for overrides that catches partials too
ri: function (context, partials, indent) {
return this.r(context, partials, indent);
},
// ensurePartial
ep: function(symbol, partials) {
var partial = this.partials[symbol];
// check to see that if we've instantiated this partial before
var template = partials[partial.name];
if (partial.instance && partial.base == template) {
return partial.instance;
}
if (typeof template == 'string') {
if (!this.c) {
throw new Error("No compiler available.");
}
template = this.c.compile(template, this.options);
}
if (!template) {
return null;
}
// We use this to check whether the partials dictionary has changed
this.partials[symbol].base = template;
if (partial.subs) {
// Make sure we consider parent template now
if (this.activeSub === undefined) {
// Store parent template text in partials.stackText to perform substitutions in child templates correctly
partials.stackText = this.text;
}
template = createSpecializedPartial(template, partial.subs, partial.partials,
this.stackSubs, this.stackPartials, partials.stackText || this.text);
}
this.partials[symbol].instance = template;
return template;
},
// tries to find a partial in the current scope and render it
rp: function(symbol, context, partials, indent) {
var partial = this.ep(symbol, partials);
if (!partial) {
return '';
}
return partial.ri(context, partials, indent);
},
// render a section
rs: function(context, partials, section) {
var tail = context[context.length - 1];
if (!isArray(tail)) {
section(context, partials, this);
return;
}
for (var i = 0; i < tail.length; i++) {
context.push(tail[i]);
section(context, partials, this);
context.pop();
}
},
// maybe start a section
s: function(val, ctx, partials, inverted, start, end, tags) {
var pass;
if (isArray(val) && val.length === 0) {
return false;
}
if (typeof val == 'function') {
val = this.ms(val, ctx, partials, inverted, start, end, tags);
}
pass = !!val;
if (!inverted && pass && ctx) {
ctx.push((typeof val == 'object') ? val : ctx[ctx.length - 1]);
}
return pass;
},
// find values with dotted names
d: function(key, ctx, partials, returnFound) {
var found,
names = key.split('.'),
val = this.f(names[0], ctx, partials, returnFound),
doModelGet = this.options.modelGet,
cx = null;
if (key === '.' && isArray(ctx[ctx.length - 2])) {
val = ctx[ctx.length - 1];
} else {
for (var i = 1; i < names.length; i++) {
found = findInScope(names[i], val, doModelGet);
if (found != null) {
cx = val;
val = found;
} else {
val = '';
}
}
}
if (returnFound && !val) {
return false;
}
if (!returnFound && typeof val == 'function') {
ctx.push(cx);
val = this.mv(val, ctx, partials);
ctx.pop();
}
return val;
},
// find values with normal names
f: function(key, ctx, partials, returnFound) {
var val = false,
v = null,
found = false,
doModelGet = this.options.modelGet;
for (var i = ctx.length - 1; i >= 0; i--) {
v = ctx[i];
val = findInScope(key, v, doModelGet);
if (val != null) {
found = true;
break;
}
}
if (!found) {
return (returnFound) ? false : "";
}
if (!returnFound && typeof val == 'function') {
val = this.mv(val, ctx, partials);
}
return val;
},
// higher order templates
ls: function(func, cx, partials, text, tags) {
var oldTags = this.options.delimiters;
this.options.delimiters = tags;
this.b(this.ct(coerceToString(func.call(cx, text)), cx, partials));
this.options.delimiters = oldTags;
return false;
},
// compile text
ct: function(text, cx, partials) {
if (this.options.disableLambda) {
throw new Error('Lambda features disabled.');
}
return this.c.compile(text, this.options).render(cx, partials);
},
// template result buffering
b: (useArrayBuffer) ? function(s) { this.buf.push(s); } :
function(s) { this.buf += s; },
fl: (useArrayBuffer) ? function() { var r = this.buf.join(''); this.buf = []; return r; } :
function() { var r = this.buf; this.buf = ''; return r; },
// init the buffer
ib: function () {
this.buf = (useArrayBuffer) ? [] : '';
},
// method replace section
ms: function(func, ctx, partials, inverted, start, end, tags) {
var textSource,
cx = ctx[ctx.length - 1],
result = func.call(cx);
if (typeof result == 'function') {
if (inverted) {
return true;
} else {
textSource = (this.activeSub && this.subsText[this.activeSub]) ? this.subsText[this.activeSub] : this.text;
return this.ls(result, cx, partials, textSource.substring(start, end), tags);
}
}
return result;
},
// method replace variable
mv: function(func, ctx, partials) {
var cx = ctx[ctx.length - 1];
var result = func.call(cx);
if (typeof result == 'function') {
return this.ct(coerceToString(result.call(cx)), cx, partials);
}
return result;
},
sub: function(name, context, partials, indent) {
var f = this.subs[name];
if (f) {
this.activeSub = name;
f(context, partials, this, indent);
this.activeSub = false;
}
}
};
//Find a key in an object
function findInScope(key, scope, doModelGet) {
var val, checkVal;
if (scope && typeof scope == 'object') {
if (scope[key] != null) {
val = scope[key];
// try lookup with get for backbone or similar model data
} else if (doModelGet && scope.get && typeof scope.get == 'function') {
val = scope.get(key);
}
}
return val;
}
function createSpecializedPartial(instance, subs, partials, stackSubs, stackPartials, childText) {
function PartialTemplate() {};
PartialTemplate.prototype = instance;
function Substitutions() {};
Substitutions.prototype = instance.subs;
var key;
var partial = new PartialTemplate();
partial.subs = new Substitutions();
partial.subsText = {}; //hehe. substext.
partial.ib();
stackSubs = stackSubs || {};
partial.stackSubs = stackSubs;
for (key in subs) {
if (!stackSubs[key]) stackSubs[key] = subs[key];
partial.subsText[key] = childText;
}
for (key in stackSubs) {
partial.subs[key] = stackSubs[key];
}
stackPartials = stackPartials || {};
partial.stackPartials = stackPartials;
for (key in partials) {
if (!stackPartials[key]) stackPartials[key] = partials[key];
}
for (key in stackPartials) {
partial.partials[key] = stackPartials[key];
}
return partial;
}
var rAmp = /&/g,
rLt = /</g,
rGt = />/g,
rApos = /\'/g,
rQuot = /\"/g,
hChars = /[&<>\"\']/;
function coerceToString(val) {
return String((val === null || val === undefined) ? '' : val);
}
function hoganEscape(str) {
str = coerceToString(str);
return hChars.test(str) ?
str
.replace(rAmp, '&amp;')
.replace(rLt, '&lt;')
.replace(rGt, '&gt;')
.replace(rApos, '&#39;')
.replace(rQuot, '&quot;') :
str;
}
var isArray = Array.isArray || function(a) {
return Object.prototype.toString.call(a) === '[object Array]';
};
})(typeof exports !== 'undefined' ? exports : Hogan);
(function (Hogan) {
// Setup regex assignments
// remove whitespace according to Mustache spec
var rIsWhitespace = /\S/,
rQuot = /\"/g,
rNewline = /\n/g,
rCr = /\r/g,
rSlash = /\\/g;
Hogan.tags = {
'#': 1, '^': 2, '<': 3, '$': 4,
'/': 5, '!': 6, '>': 7, '=': 8, '_v': 9,
'{': 10, '&': 11, '_t': 12
};
Hogan.scan = function scan(text, delimiters) {
var len = text.length,
IN_TEXT = 0,
IN_TAG_TYPE = 1,
IN_TAG = 2,
state = IN_TEXT,
tagType = null,
tag = null,
buf = '',
tokens = [],
seenTag = false,
i = 0,
lineStart = 0,
otag = '{{',
ctag = '}}';
function addBuf() {
if (buf.length > 0) {
tokens.push({tag: '_t', text: new String(buf)});
buf = '';
}
}
function lineIsWhitespace() {
var isAllWhitespace = true;
for (var j = lineStart; j < tokens.length; j++) {
isAllWhitespace =
(Hogan.tags[tokens[j].tag] < Hogan.tags['_v']) ||
(tokens[j].tag == '_t' && tokens[j].text.match(rIsWhitespace) === null);
if (!isAllWhitespace) {
return false;
}
}
return isAllWhitespace;
}
function filterLine(haveSeenTag, noNewLine) {
addBuf();
if (haveSeenTag && lineIsWhitespace()) {
for (var j = lineStart, next; j < tokens.length; j++) {
if (tokens[j].text) {
if ((next = tokens[j+1]) && next.tag == '>') {
// set indent to token value
next.indent = tokens[j].text.toString()
}
tokens.splice(j, 1);
}
}
} else if (!noNewLine) {
tokens.push({tag:'\n'});
}
seenTag = false;
lineStart = tokens.length;
}
function changeDelimiters(text, index) {
var close = '=' + ctag,
closeIndex = text.indexOf(close, index),
delimiters = trim(
text.substring(text.indexOf('=', index) + 1, closeIndex)
).split(' ');
otag = delimiters[0];
ctag = delimiters[delimiters.length - 1];
return closeIndex + close.length - 1;
}
if (delimiters) {
delimiters = delimiters.split(' ');
otag = delimiters[0];
ctag = delimiters[1];
}
for (i = 0; i < len; i++) {
if (state == IN_TEXT) {
if (tagChange(otag, text, i)) {
--i;
addBuf();
state = IN_TAG_TYPE;
} else {
if (text.charAt(i) == '\n') {
filterLine(seenTag);
} else {
buf += text.charAt(i);
}
}
} else if (state == IN_TAG_TYPE) {
i += otag.length - 1;
tag = Hogan.tags[text.charAt(i + 1)];
tagType = tag ? text.charAt(i + 1) : '_v';
if (tagType == '=') {
i = changeDelimiters(text, i);
state = IN_TEXT;
} else {
if (tag) {
i++;
}
state = IN_TAG;
}
seenTag = i;
} else {
if (tagChange(ctag, text, i)) {
tokens.push({tag: tagType, n: trim(buf), otag: otag, ctag: ctag,
i: (tagType == '/') ? seenTag - otag.length : i + ctag.length});
buf = '';
i += ctag.length - 1;
state = IN_TEXT;
if (tagType == '{') {
if (ctag == '}}') {
i++;
} else {
cleanTripleStache(tokens[tokens.length - 1]);
}
}
} else {
buf += text.charAt(i);
}
}
}
filterLine(seenTag, true);
return tokens;
}
function cleanTripleStache(token) {
if (token.n.substr(token.n.length - 1) === '}') {
token.n = token.n.substring(0, token.n.length - 1);
}
}
function trim(s) {
if (s.trim) {
return s.trim();
}
return s.replace(/^\s*|\s*$/g, '');
}
function tagChange(tag, text, index) {
if (text.charAt(index) != tag.charAt(0)) {
return false;
}
for (var i = 1, l = tag.length; i < l; i++) {
if (text.charAt(index + i) != tag.charAt(i)) {
return false;
}
}
return true;
}
// the tags allowed inside super templates
var allowedInSuper = {'_t': true, '\n': true, '$': true, '/': true};
function buildTree(tokens, kind, stack, customTags) {
var instructions = [],
opener = null,
tail = null,
token = null;
tail = stack[stack.length - 1];
while (tokens.length > 0) {
token = tokens.shift();
if (tail && tail.tag == '<' && !(token.tag in allowedInSuper)) {
throw new Error('Illegal content in < super tag.');
}
if (Hogan.tags[token.tag] <= Hogan.tags['$'] || isOpener(token, customTags)) {
stack.push(token);
token.nodes = buildTree(tokens, token.tag, stack, customTags);
} else if (token.tag == '/') {
if (stack.length === 0) {
throw new Error('Closing tag without opener: /' + token.n);
}
opener = stack.pop();
if (token.n != opener.n && !isCloser(token.n, opener.n, customTags)) {
throw new Error('Nesting error: ' + opener.n + ' vs. ' + token.n);
}
opener.end = token.i;
return instructions;
} else if (token.tag == '\n') {
token.last = (tokens.length == 0) || (tokens[0].tag == '\n');
}
instructions.push(token);
}
if (stack.length > 0) {
throw new Error('missing closing tag: ' + stack.pop().n);
}
return instructions;
}
function isOpener(token, tags) {
for (var i = 0, l = tags.length; i < l; i++) {
if (tags[i].o == token.n) {
token.tag = '#';
return true;
}
}
}
function isCloser(close, open, tags) {
for (var i = 0, l = tags.length; i < l; i++) {
if (tags[i].c == close && tags[i].o == open) {
return true;
}
}
}
function stringifySubstitutions(obj) {
var items = [];
for (var key in obj) {
items.push('"' + esc(key) + '": function(c,p,t,i) {' + obj[key] + '}');
}
return "{ " + items.join(",") + " }";
}
function stringifyPartials(codeObj) {
var partials = [];
for (var key in codeObj.partials) {
partials.push('"' + esc(key) + '":{name:"' + esc(codeObj.partials[key].name) + '", ' + stringifyPartials(codeObj.partials[key]) + "}");
}
return "partials: {" + partials.join(",") + "}, subs: " + stringifySubstitutions(codeObj.subs);
}
Hogan.stringify = function(codeObj, text, options) {
return "{code: function (c,p,i) { " + Hogan.wrapMain(codeObj.code) + " }," + stringifyPartials(codeObj) + "}";
}
var serialNo = 0;
Hogan.generate = function(tree, text, options) {
serialNo = 0;
var context = { code: '', subs: {}, partials: {} };
Hogan.walk(tree, context);
if (options.asString) {
return this.stringify(context, text, options);
}
return this.makeTemplate(context, text, options);
}
Hogan.wrapMain = function(code) {
return 'var t=this;t.b(i=i||"");' + code + 'return t.fl();';
}
Hogan.template = Hogan.Template;
Hogan.makeTemplate = function(codeObj, text, options) {
var template = this.makePartials(codeObj);
template.code = new Function('c', 'p', 'i', this.wrapMain(codeObj.code));
return new this.template(template, text, this, options);
}
Hogan.makePartials = function(codeObj) {
var key, template = {subs: {}, partials: codeObj.partials, name: codeObj.name};
for (key in template.partials) {
template.partials[key] = this.makePartials(template.partials[key]);
}
for (key in codeObj.subs) {
template.subs[key] = new Function('c', 'p', 't', 'i', codeObj.subs[key]);
}
return template;
}
function esc(s) {
return s.replace(rSlash, '\\\\')
.replace(rQuot, '\\\"')
.replace(rNewline, '\\n')
.replace(rCr, '\\r');
}
function chooseMethod(s) {
return (~s.indexOf('.')) ? 'd' : 'f';
}
function createPartial(node, context) {
var prefix = "<" + (context.prefix || "");
var sym = prefix + node.n + serialNo++;
context.partials[sym] = {name: node.n, partials: {}};
context.code += 't.b(t.rp("' + esc(sym) + '",c,p,"' + (node.indent || '') + '"));';
return sym;
}
Hogan.codegen = {
'#': function(node, context) {
context.code += 'if(t.s(t.' + chooseMethod(node.n) + '("' + esc(node.n) + '",c,p,1),' +
'c,p,0,' + node.i + ',' + node.end + ',"' + node.otag + " " + node.ctag + '")){' +
't.rs(c,p,' + 'function(c,p,t){';
Hogan.walk(node.nodes, context);
context.code += '});c.pop();}';
},
'^': function(node, context) {
context.code += 'if(!t.s(t.' + chooseMethod(node.n) + '("' + esc(node.n) + '",c,p,1),c,p,1,0,0,"")){';
Hogan.walk(node.nodes, context);
context.code += '};';
},
'>': createPartial,
'<': function(node, context) {
var ctx = {partials: {}, code: '', subs: {}, inPartial: true};
Hogan.walk(node.nodes, ctx);
var template = context.partials[createPartial(node, context)];
template.subs = ctx.subs;
template.partials = ctx.partials;
},
'$': function(node, context) {
var ctx = {subs: {}, code: '', partials: context.partials, prefix: node.n};
Hogan.walk(node.nodes, ctx);
context.subs[node.n] = ctx.code;
if (!context.inPartial) {
context.code += 't.sub("' + esc(node.n) + '",c,p,i);';
}
},
'\n': function(node, context) {
context.code += write('"\\n"' + (node.last ? '' : ' + i'));
},
'_v': function(node, context) {
context.code += 't.b(t.v(t.' + chooseMethod(node.n) + '("' + esc(node.n) + '",c,p,0)));';
},
'_t': function(node, context) {
context.code += write('"' + esc(node.text) + '"');
},
'{': tripleStache,
'&': tripleStache
}
function tripleStache(node, context) {
context.code += 't.b(t.t(t.' + chooseMethod(node.n) + '("' + esc(node.n) + '",c,p,0)));';
}
function write(s) {
return 't.b(' + s + ');';
}
Hogan.walk = function(nodelist, context) {
var func;
for (var i = 0, l = nodelist.length; i < l; i++) {
func = Hogan.codegen[nodelist[i].tag];
func && func(nodelist[i], context);
}
return context;
}
Hogan.parse = function(tokens, text, options) {
options = options || {};
return buildTree(tokens, '', [], options.sectionTags || []);
}
Hogan.cache = {};
Hogan.cacheKey = function(text, options) {
return [text, !!options.asString, !!options.disableLambda, options.delimiters, !!options.modelGet].join('||');
}
Hogan.compile = function(text, options) {
options = options || {};
var key = Hogan.cacheKey(text, options);
var template = this.cache[key];
if (template) {
return template;
}
template = this.generate(this.parse(this.scan(text, options.delimiters), text, options), text, options);
return this.cache[key] = template;
}
})(typeof exports !== 'undefined' ? exports : Hogan);
if (typeof module !== 'undefined' && module.exports) {
module.exports = Hogan;
}

File diffs suppressed for 5 files because one or more lines are too long

View File

@@ -0,0 +1,8 @@
/*!
* JavaScript Cookie v2.0.3
* https://github.com/js-cookie/js-cookie
*
* Copyright 2006, 2015 Klaus Hartl & Fagner Brack
* Released under the MIT license
*/
(function(a){if(typeof define==="function"&&define.amd){define(a)}else{if(typeof exports==="object"){module.exports=a()}else{var c=window.Cookies;var b=window.Cookies=a(window.jQuery);b.noConflict=function(){window.Cookies=c;return b}}}}(function(){function b(){var f=0;var c={};for(;f<arguments.length;f++){var d=arguments[f];for(var e in d){c[e]=d[e]}}return c}function a(d){function c(o,n,k){var r;if(arguments.length>1){k=b({path:"/"},c.defaults,k);if(typeof k.expires==="number"){var h=new Date();h.setMilliseconds(h.getMilliseconds()+k.expires*86400000);k.expires=h}try{r=JSON.stringify(n);if(/^[\{\[]/.test(r)){n=r}}catch(m){}n=encodeURIComponent(String(n));n=n.replace(/%(23|24|26|2B|3A|3C|3E|3D|2F|3F|40|5B|5D|5E|60|7B|7D|7C)/g,decodeURIComponent);o=encodeURIComponent(String(o));o=o.replace(/%(23|24|26|2B|5E|60|7C)/g,decodeURIComponent);o=o.replace(/[\(\)]/g,escape);return(document.cookie=[o,"=",n,k.expires&&"; expires="+k.expires.toUTCString(),k.path&&"; path="+k.path,k.domain&&"; domain="+k.domain,k.secure?"; secure":""].join(""))}if(!o){r={}}var q=document.cookie?document.cookie.split("; "):[];var p=/(%[0-9A-Z]{2})+/g;var l=0;for(;l<q.length;l++){var j=q[l].split("=");var f=j[0].replace(p,decodeURIComponent);var g=j.slice(1).join("=");if(g.charAt(0)==='"'){g=g.slice(1,-1)}try{g=d&&d(g,f)||g.replace(p,decodeURIComponent);if(this.json){try{g=JSON.parse(g)}catch(m){}}if(o===f){r=g;break}if(!o){r[f]=g}}catch(m){}}return r}c.get=c.set=c;c.getJSON=function(){return c.apply({json:true},[].slice.call(arguments))};c.defaults={};c.remove=function(f,e){c(f,"",b(e,{expires:-1}))};c.withConverter=a;return c}return a()}));

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,6 @@
/*
* videojs-ga - v0.4.2 - 2015-02-06
* Copyright (c) 2015 Michael Bensoussan
* Licensed MIT
*/
(function(){var a=[].indexOf||function(a){for(var b=0,c=this.length;c>b;b++)if(b in this&&this[b]===a)return b;return-1};videojs.plugin("ga",function(b){var c,d,e,f,g,h,i,j,k,l,m,n,o,p,q,r,s,t,u,v,w;return null==b&&(b={}),c={},this.options()["data-setup"]&&(l=JSON.parse(this.options()["data-setup"]),l.ga&&(c=l.ga)),d=["loaded","percentsPlayed","start","end","seek","play","pause","resize","volumeChange","error","fullscreen"],i=b.eventsToTrack||c.eventsToTrack||d,o=b.percentsPlayedInterval||c.percentsPlayedInterval||10,g=b.eventCategory||c.eventCategory||"Video",h=b.eventLabel||c.eventLabel,b.debug=b.debug||!1,n=[],s=r=0,t=!1,k=function(){h||(h=this.currentSrc().split("/").slice(-1)[0].replace(/\.(\w{3,4})(\?.*)?$/i,"")),a.call(i,"loadedmetadata")>=0&&u("loadedmetadata",!0)},v=function(){var b,c,d,e,f;for(b=Math.round(this.currentTime()),c=Math.round(this.duration()),e=Math.round(b/c*100),d=f=0;99>=f;d=f+=o)e>=d&&a.call(n,d)<0&&(a.call(i,"start")>=0&&0===d&&e>0?u("start",!0):a.call(i,"percentsPlayed")>=0&&0!==e&&u("percent played",!0,d),e>0&&n.push(d));a.call(i,"seek")>=0&&(s=r,r=b,Math.abs(s-r)>1&&(t=!0,u("seek start",!1,s),u("seek end",!1,r)))},e=function(){u("end",!0)},p=function(){var a;a=Math.round(this.currentTime()),u("play",!0,a),t=!1},m=function(){var a,b;a=Math.round(this.currentTime()),b=Math.round(this.duration()),a===b||t||u("pause",!1,a)},w=function(){var a;a=this.muted()===!0?0:this.volume(),u("volume change",!1,a)},q=function(){u("resize - "+this.width()+"*"+this.height(),!0)},f=function(){var a;a=Math.round(this.currentTime()),u("error",!0,a)},j=function(){var a;a=Math.round(this.currentTime()),("function"==typeof this.isFullscreen?this.isFullscreen():void 0)||("function"==typeof this.isFullScreen?this.isFullScreen():void 0)?u("enter fullscreen",!1,a):u("exit fullscreen",!1,a)},u=function(a,c,d){window.ga?ga("send","event",{eventCategory:g,eventAction:a,eventLabel:h,eventValue:d,nonInteraction:c}):window._gaq?_gaq.push(["_trackEvent",g,a,h,d,c]):b.debug&&console.log("Google Analytics not detected")},this.ready(function(){return this.on("loadedmetadata",k),this.on("timeupdate",v),a.call(i,"end")>=0&&this.on("ended",e),a.call(i,"play")>=0&&this.on("play",p),a.call(i,"pause")>=0&&this.on("pause",m),a.call(i,"volumeChange")>=0&&this.on("volumechange",w),a.call(i,"resize")>=0&&this.on("resize",q),a.call(i,"error")>=0&&this.on("error",f),a.call(i,"fullscreen")>=0?this.on("fullscreenchange",j):void 0}),{sendbeacon:u}})}).call(this);

pillar/web/subquery.py (new file, 43 lines)

@@ -0,0 +1,43 @@
"""Sub-query stuff, for things we would otherwise let Eve embed (but don't want to).
Uses app.cache.memoize() to cache the results. However, since this decorator needs
to run in Flask Application context, it is manually applied in setup_app().
"""
import pillarsdk
import pillarsdk.exceptions
from pillar.web.system_util import pillar_api
def get_user_info(user_id):
"""Returns email, username and full name of the user.
Only returns the public fields, so the return value is the same
for authenticated & non-authenticated users, which is why we're
allowed to cache it globally.
Returns an empty dict when the user cannot be found.
"""
if user_id is None:
return {}
try:
user = pillarsdk.User.find(user_id, api=pillar_api())
except pillarsdk.exceptions.ResourceNotFound:
return {}
if not user:
return {}
# TODO: put those fields into a config var or module-level global.
return {'email': user.email,
'full_name': user.full_name,
'username': user.username}
def setup_app(app):
global get_user_info
decorator = app.cache.memoize(timeout=300, make_name='%s.get_user_info' % __name__)
get_user_info = decorator(get_user_info)
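Since setup_app() rebinds the module-level function, callers just import and call get_user_info(); repeated calls within the timeout hit the cache instead of the API. A hedged usage sketch (the user id is made up, and a configured app.cache is assumed):
```
# Illustrative only; assumes a Flask app with app.cache (Flask-Cache) set up.
from pillar.web import subquery

def demo(app):
    subquery.setup_app(app)  # replaces get_user_info with its memoized version
    with app.app_context():
        info_a = subquery.get_user_info('000000000000000000000000')  # talks to the API
        info_b = subquery.get_user_info('000000000000000000000000')  # served from the 300s cache
        assert info_a == info_b
```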

View File

@@ -4,8 +4,8 @@ Replacement of the old SystemUtility class.
import os
import logging
from flask import current_app, session
from flask.ext.login import current_user
from flask import current_app, session, request
from flask_login import current_user
from pillar.sdk import FlaskInternalApi
@@ -35,18 +35,29 @@ def pillar_server_endpoint_static():
def pillar_api(token=None):
# Cache API objects on the request per token.
api = getattr(request, 'pillar_api', {}).get(token)
if api is not None:
return api
# Check if current_user is initialized (in order to support manage.py
# scripts and non authenticated server requests).
use_token = token
if token is None and current_user and current_user.is_authenticated:
token = current_user.id
use_token = current_user.id
api = FlaskInternalApi(
endpoint=pillar_server_endpoint(),
username=None,
password=None,
token=token
token=use_token
)
if token is None:
if not hasattr(request, 'pillar_api'):
request.pillar_api = {}
request.pillar_api[token] = api
return api
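The effect of the new per-request cache, sketched with illustrative calls; only the no-token case is stored on the request:
```
# Hypothetical illustration, inside a Flask request context.
api1 = pillar_api()                # builds a FlaskInternalApi and stores it on flask.request
api2 = pillar_api()                # same request, no explicit token: the cached object is returned
assert api1 is api2
api3 = pillar_api(token='abc123')  # explicit token (made-up value): built fresh, not cached
```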

View File

@@ -1,4 +1,4 @@
from flask.ext.login import current_user
from flask_login import current_user
from flask_wtf import Form
from pillar.web import system_util
from pillarsdk.users import User

View File

@@ -1,12 +1,17 @@
import json
import logging
import httplib2 # used by the oauth2 package
import requests
import urlparse
from flask import (abort, Blueprint, current_app, flash, redirect,
render_template, request, session, url_for)
from flask_login import login_required, login_user, logout_user, current_user
from flask_login import login_required, logout_user, current_user
from flask_oauthlib.client import OAuthException
from pillar.auth import UserClass, subscriptions
from werkzeug import exceptions as wz_exceptions
import pillar.auth
from pillar.auth import subscriptions
from pillar.web import system_util
from .forms import UserProfileForm
from .forms import UserSettingsEmailsForm
@@ -42,20 +47,25 @@ def login():
@blueprint.route('/oauth/blender-id/authorized')
def blender_id_authorized():
check_oauth_provider(current_app.oauth_blender_id)
try:
oauth_resp = current_app.oauth_blender_id.authorized_response()
except OAuthException as ex:
log.warning('Error parsing BlenderID OAuth response. data=%s; message=%s',
ex.data, ex.message)
raise wz_exceptions.Forbidden('Access denied, sorry!')
if oauth_resp is None:
return 'Access denied: reason=%s error=%s' % (
request.args['error_reason'],
request.args['error_description']
)
msg = 'Access denied: reason=%s error=%s' % (
request.args.get('error_reason'), request.args.get('error_description'))
log.warning('Access denied to user because oauth_resp=None: %s', msg)
return wz_exceptions.Forbidden(msg)
if isinstance(oauth_resp, OAuthException):
return 'Access denied: %s' % oauth_resp.message
session['blender_id_oauth_token'] = (oauth_resp['access_token'], '')
user = UserClass(oauth_resp['access_token'])
login_user(user)
current_app.login_manager.reload_user() # This ensures that flask_login.current_user is set.
pillar.auth.login_user(oauth_resp['access_token'])
if current_user is not None:
# Check with the store for user roles. If the user has an active
@@ -87,8 +97,7 @@ def login_local():
return abort(r.status_code)
res = r.json()
# If correct, receive token and log in the user
user = UserClass(res['token'])
login_user(user)
pillar.auth.login_user(res['token'])
return redirect(url_for('main.homepage'))
return render_template('users/login.html', form=form)
@@ -192,37 +201,16 @@ def users_edit(user_id):
if not current_user.has_role('admin'):
return abort(403)
api = system_util.pillar_api()
try:
user = User.find(user_id, api=api)
except sdk_exceptions.ResourceNotFound:
log.warning('Non-existing user %r requested.', user_id)
raise wz_exceptions.NotFound('Non-existing user %r requested.' % user_id)
form = UserEditForm()
if form.validate_on_submit():
def get_groups(roles):
"""Return a set of role ids matching the group names provided"""
groups_set = set()
for system_role in roles:
group = Group.find_one({'where': "name=='%s'" % system_role}, api=api)
groups_set.add(group._id)
return groups_set
# Remove any of the default roles
system_roles = set([role[0] for role in form.roles.choices])
system_groups = get_groups(system_roles)
# Current user roles
user_roles_list = user.roles if user.roles else []
user_roles = set(user_roles_list)
user_groups = get_groups(user_roles_list)
# Remove all form roles from current roles
user_roles = list(user_roles.difference(system_roles))
user_groups = list(user_groups.difference(system_groups))
# Get the assigned roles
system_roles_assigned = form.roles.data
system_groups_assigned = get_groups(system_roles_assigned)
# Reassign roles based on form.roles.data by adding them to existing roles
user_roles += system_roles_assigned
user_groups += list(get_groups(user_roles))
# Fetch the group for the assigned system roles
user.roles = user_roles
user.groups = user_groups
user.update(api=api)
_users_edit(form, user, api)
else:
form.roles.data = user.roles
return render_template('users/edit_embed.html',
@@ -230,6 +218,29 @@ def users_edit(user_id):
form=form)
def _users_edit(form, user, api):
"""Performs the actual user editing."""
from pillar.api.service import role_to_group_id, ROLES_WITH_GROUPS
current_user_roles = set(user.roles or [])
current_user_groups = set(user.groups or [])
roles_in_form = set(form.roles.data)
granted_roles = roles_in_form - current_user_roles
revoked_roles = ROLES_WITH_GROUPS - roles_in_form
# role_to_group_id contains ObjectIDs, but the SDK works with strings.
granted_groups = {str(role_to_group_id[role]) for role in granted_roles}
revoked_groups = {str(role_to_group_id[role]) for role in revoked_roles}
user.roles = list((current_user_roles - revoked_roles).union(granted_roles))
user.groups = list((current_user_groups - revoked_groups).union(granted_groups))
user.update(api=api)
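The set arithmetic grants the roles ticked in the form and revokes only roles that are backed by a group and were unticked; a worked illustration with made-up values:
```
# Illustrative values only; role names and ROLES_WITH_GROUPS are made up.
current_user_roles = {'subscriber', 'admin'}
roles_in_form = {'demo'}
ROLES_WITH_GROUPS = {'subscriber', 'demo'}

granted_roles = roles_in_form - current_user_roles                 # {'demo'}
revoked_roles = ROLES_WITH_GROUPS - roles_in_form                  # {'subscriber'}
new_roles = (current_user_roles - revoked_roles) | granted_roles   # {'admin', 'demo'}
```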
@blueprint.route('/u')
@login_required
def users_index():
@@ -239,16 +250,22 @@ def users_index():
def user_roles_update(user_id):
"""Update the user's roles based on the store subscription status and BlenderID roles."""
api = system_util.pillar_api()
group_subscriber = Group.find_one({'where': {'name': 'subscriber'}}, api=api)
group_demo = Group.find_one({'where': {'name': 'demo'}}, api=api)
# Fetch the user once outside the loop, because we only need to get the
# subscription status once.
user = User.me(api=api)
store_user = subscriptions.fetch_user(user.email)
if store_user is None:
return
# Fetch user info from different sources.
store_user = subscriptions.fetch_user(user.email) or {}
bid_user = fetch_blenderid_user()
grant_subscriber = store_user.get('cloud_access', 0) == 1
grant_demo = bid_user.get('roles', {}).get('cloud_demo', False)
max_retry = 5
for retry_count in range(max_retry):
@@ -256,14 +273,18 @@ def user_roles_update(user_id):
roles = set(user.roles or [])
groups = set(user.groups or [])
if store_user['cloud_access'] == 1:
if grant_subscriber:
roles.add(u'subscriber')
groups.add(group_subscriber._id)
elif u'admin' not in roles:
# Don't take away roles from admins.
roles.discard(u'subscriber')
groups.discard(group_subscriber._id)
if grant_demo:
roles.add(u'demo')
groups.add(group_demo._id)
# Only send an API request when the user has actually changed
if set(user.roles or []) == roles and set(user.groups or []) == groups:
break
@@ -285,3 +306,32 @@ def user_roles_update(user_id):
else:
log.warning('Tried %i times to update user %s, and failed each time. Giving up.',
max_retry, user_id)
def fetch_blenderid_user():
"""Returns the user info from BlenderID.
Returns an empty dict if communication fails.
:rtype: dict
"""
bid_url = urlparse.urljoin(current_app.config['BLENDER_ID_ENDPOINT'], 'api/user')
log.debug('Fetching user info from %s', bid_url)
try:
bid_resp = current_app.oauth_blender_id.get(bid_url)
except httplib2.HttpLib2Error:
log.exception('Error getting %s from BlenderID', bid_url)
return {}
if bid_resp.status != 200:
log.warning('Error %i from BlenderID %s: %s', bid_resp.status, bid_url, bid_resp.data)
return {}
if not bid_resp.data:
log.warning('Empty data returned from BlenderID %s', bid_url)
return {}
log.debug('BlenderID returned %s', bid_resp.data)
return bid_resp.data

View File

@@ -1,12 +1,14 @@
import datetime
import hashlib
import urllib
import logging
import traceback
import sys
import dateutil.parser
from flask import current_app
from flask import request
from flask.ext.login import current_user
from flask_login import current_user
from pillarsdk import File
from pillarsdk import Project
from pillarsdk.exceptions import ResourceNotFound
@@ -53,30 +55,67 @@ def gravatar(email, size=64):
"?" + urllib.urlencode(parameters)
def pretty_date(time=None, detail=False, now=None):
def datetime_now():
"""Returns a datetime.datetime that represents 'now' in UTC."""
return datetime.datetime.now(tz=pillarsdk.utils.utc)
def pretty_date(time, detail=False, now=None):
"""Get a datetime object or a int() Epoch timestamp and return a
pretty string like 'an hour ago', 'Yesterday', '3 months ago',
'just now', etc
"""
from datetime import datetime
if time is None:
return None
# Normalize the 'time' parameter so it's always a datetime.
if type(time) is int:
time = datetime.fromtimestamp(time, tz=pillarsdk.utils.utc)
elif time is None:
time = now
time = datetime.datetime.fromtimestamp(time, tz=pillarsdk.utils.utc)
elif isinstance(time, basestring):
time = dateutil.parser.parse(time)
now = now or datetime.now(tz=time.tzinfo)
diff = now - time
now = now or datetime.datetime.now(tz=time.tzinfo)
diff = now - time # TODO: flip the sign, so that future = positive and past = negative.
second_diff = diff.seconds
second_diff = diff.seconds # Always positive, so -1 second = -1 day + 23h59m59s
day_diff = diff.days
if day_diff < 0:
return ''
if day_diff < 0 and time.year != now.year:
# "16 Jul 2018"
pretty = time.strftime("%d %b %Y")
if day_diff == 0:
elif day_diff < -21 and time.year == now.year:
# "16 Jul"
pretty = time.strftime("%d %b")
elif day_diff < -7:
week_count = -day_diff // 7
if week_count == 1:
pretty = "in 1 week"
else:
pretty = "in %s weeks" % week_count
elif day_diff < -1:
# "next Tuesday"
pretty = 'next %s' % time.strftime("%A")
elif day_diff == -1:
# Compute the actual number of seconds in the future, positively.
seconds = 24 * 3600 - second_diff
if seconds < 10:
return 'just now'
if seconds < 60:
return 'in %ss' % seconds
if seconds < 120:
return 'in a minute'
if seconds < 3600:
return 'in %im' % (seconds // 60)
if seconds < 7200:
return 'in an hour'
if seconds < 86400:
return 'in %ih' % (seconds // 3600)
elif day_diff == 0:
if second_diff < 10:
return "just now"
if second_diff < 60:
@@ -84,23 +123,23 @@ def pretty_date(time=None, detail=False, now=None):
if second_diff < 120:
return "a minute ago"
if second_diff < 3600:
return str(second_diff / 60 ) + "m ago"
return str(second_diff // 60) + "m ago"
if second_diff < 7200:
return "an hour ago"
if second_diff < 86400:
return str(second_diff / 3600) + "h ago"
return str(second_diff // 3600) + "h ago"
if day_diff == 1:
elif day_diff == 1:
pretty = "yesterday"
elif day_diff <= 7:
# "Tuesday"
pretty = time.strftime("%A")
# "last Tuesday"
pretty = 'last %s' % time.strftime("%A")
elif day_diff <= 22:
week_count = day_diff/7
week_count = day_diff // 7
if week_count == 1:
pretty = "%s week ago" % week_count
pretty = "1 week ago"
else:
pretty = "%s weeks ago" % week_count
@@ -166,3 +205,24 @@ def is_valid_id(some_id):
return True
return False
def last_page_index(meta_info):
"""Eve pagination; returns the index of the last page.
:param meta_info: Eve's '_meta' response.
:returns: Eve page number (base-1) of the last page.
:rtype: int
"""
total = meta_info['total']
if total == 0:
return 1
per_page = meta_info['max_results']
pages = total // per_page
if total % per_page == 0:
return pages
return pages + 1
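A small worked example against a hypothetical Eve '_meta' block:
```
# Hypothetical '_meta' dicts; only 'total' and 'max_results' are read here.
assert last_page_index({'total': 0,  'max_results': 25}) == 1   # an empty result set still has page 1
assert last_page_index({'total': 25, 'max_results': 25}) == 1   # exact multiple of the page size
assert last_page_index({'total': 26, 'max_results': 25}) == 2   # one extra item adds a page
```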

View File

@@ -4,7 +4,7 @@ from markupsafe import Markup
from pillarsdk import File
from flask import current_app
from flask.ext.login import current_user
from flask_login import current_user
from wtforms import Form
from wtforms import StringField
from wtforms import SelectField
@@ -42,13 +42,13 @@ class CustomFileSelectWidget(HiddenInput):
except ResourceNotFound:
pass
else:
button.append(u'<div class="form-upload-file-meta-container">')
filename = Markup.escape(file_item.filename)
if file_item.content_type.split('/')[0] == 'image':
# If a file of type image is available, display the preview
button.append(u'<img class="preview-thumbnail" src="{0}" />'.format(
file_item.thumbnail('s', api=api)))
else:
button.append(u'<p>{}</p>'.format(filename))
button.append(u'<ul class="form-upload-file-meta">')
# File name
@@ -57,8 +57,16 @@ class CustomFileSelectWidget(HiddenInput):
button.append(u'<li class="size">({0} MB)</li>'.format(
round((file_item.length / 1024) * 0.001, 2)))
# Image resolution (if image)
if file_item.content_type.split('/')[0] == 'image':
button.append(u'<li class="dimensions">{0}x{1}</li>'.format(
file_item.width, file_item.height))
button.append(u'</ul>')
button.append(u'<ul class="form-upload-file-actions">')
# Download button for original file
button.append(u'<li class="original">'
u'<a href="{}" class="file_original"> '
u'<i class="pi-download"></i>Original</a></li>'
.format(file_item.link))
# Delete button
button.append(u'<li class="delete">'
u'<a href="#" class="file_delete" '
@@ -66,12 +74,8 @@ class CustomFileSelectWidget(HiddenInput):
u'data-file_id="{file_id}"> '
u'<i class="pi-trash"></i> Delete</a></li>'.format(
field_name=field.name, file_id=field.data))
# Download button for original file
button.append(u'<li class="original">'
u'<a href="{}" class="file_original"> '
u'<i class="pi-download"></i>Original</a></li>'
.format(file_item.link))
button.append(u'</ul>')
button.append(u'</div>')
upload_url = u'%sstorage/stream/{project_id}' % current_app.config[
'PILLAR_SERVER_ENDPOINT']
@@ -79,6 +83,7 @@ class CustomFileSelectWidget(HiddenInput):
button.append(u'<input class="fileupload" type="file" name="file" '
u'data-url="{url}" '
u'data-field-name="{name}" '
u'data-field-slug="{slug}" '
u'data-token="{token}" '
u'data-file-format="{file_format}">'
u'<div class="form-upload-progress"> '
@@ -88,6 +93,7 @@ class CustomFileSelectWidget(HiddenInput):
u'</div> '
u'</div>'.format(url=upload_url,
name=field.name,
slug=field.name.replace('oid', 'slug'),
token=Markup.escape(current_user.id),
file_format=Markup.escape(file_format_regex)))
@@ -102,12 +108,6 @@ class FileSelectField(StringField):
self.widget = CustomFileSelectWidget(file_format=file_format)
class ProceduralFileSelectForm(Form):
file = FileSelectField('file')
size = StringField()
slug = StringField()
def build_file_select_form(schema):
class FileSelectForm(Form):
pass
@@ -158,8 +158,8 @@ class CustomFormFieldWidget(object):
class CustomFormField(FormField):
def __init__(self, name, **kwargs):
super(CustomFormField, self).__init__(name, **kwargs)
def __init__(self, form_class, **kwargs):
super(CustomFormField, self).__init__(form_class, **kwargs)
self.widget = CustomFormFieldWidget()

View File

@@ -1,7 +1,9 @@
from flask import Markup
from pillarsdk import Node
from pillarsdk.exceptions import ForbiddenAccess
from pillarsdk.exceptions import ResourceNotFound
from flask.ext.login import current_user
from flask_login import current_user
from pillar.web import system_util
@@ -10,19 +12,28 @@ GROUP_NODES = {'group', 'storage', 'group_texture', 'group_hdri'}
def jstree_parse_node(node, children=None):
"""Generate JStree node from node object"""
from pillar.web.nodes.routes import url_for_node
node_type = node.node_type
# Define better the node type
if node_type == 'asset':
node_type = node.properties.content_type
parsed_node = dict(
id="n_{0}".format(node._id),
text=node.name,
a_attr={"href": url_for_node(node=node)},
li_attr={"data-node-type": node.node_type},
text=Markup.escape(node.name),
type=node_type,
children=False)
# Append children property only if it is a directory type
if node_type in GROUP_NODES:
parsed_node['children'] = True
if node.permissions and node.permissions.world:
parsed_node['li_attr']['is_free'] = True
return parsed_node
@@ -34,24 +45,29 @@ def jstree_get_children(node_id, project_id=None):
'name': 1, 'parent': 1, 'node_type': 1, 'properties.order': 1,
'properties.status': 1, 'properties.content_type': 1, 'user': 1,
'project': 1},
'sort': [('properties.order', 1), ('_created', 1)]}
'sort': [('properties.order', 1), ('_created', 1)],
'where': {
'$and': [
{'node_type': {'$regex': '^(?!attract_)'}},
{'node_type': {'$not': {'$in': ['comment', 'post']}}},
],
}
}
if node_id:
if node_id.startswith('n_'):
node_id = node_id.split('_')[1]
lookup['where'] = {'parent': node_id}
lookup['where']['parent'] = node_id
elif project_id:
lookup['where'] = {'project': project_id, 'parent': {'$exists': False}}
lookup['where']['project'] = project_id
lookup['where']['parent'] = {'$exists': False}
try:
children = Node.all(lookup, api=api)
for child in children['_items']:
# Skip nodes of type comment
if child.node_type not in ['comment', 'post']:
if child.properties.status == 'published':
children_list.append(jstree_parse_node(child))
elif child.node_type == 'blog':
children_list.append(jstree_parse_node(child))
elif current_user.is_authenticated and child.user == current_user.objectid:
# TODO: allow nodes that don't have a status property to be visible
# in the node tree (for example blog)
is_pub = child.properties.status == 'published'
if is_pub or (current_user.is_authenticated and child.user == current_user.objectid):
children_list.append(jstree_parse_node(child))
except ForbiddenAccess:
pass
@@ -61,7 +77,7 @@ def jstree_get_children(node_id, project_id=None):
def jstree_build_children(node):
return dict(
id="n_{0}".format(node._id),
text=node.name,
text=Markup.escape(node.name),
type=node.node_type,
children=jstree_get_children(node._id)
)
@@ -76,7 +92,7 @@ def jstree_build_from_node(node):
api = system_util.pillar_api()
# Parse the node and mark it as selected
child_node = jstree_parse_node(node)
child_node['state'] = dict(selected=True)
child_node['state'] = dict(selected=True, opened=True)
# Splice the specified child node between the other project children.
def select_node(x):
@@ -112,7 +128,7 @@ def jstree_build_from_node(node):
# Overwrite children_node with the current parent
child_node = parent_parent
# Set the node to open so that jstree actually displays the nodes
child_node['state'] = dict(opened=True)
child_node['state'] = dict(selected=True, opened=True)
# Push in the computed children into the parent
child_node['children'] = parent_children
# If we have a parent

View File

@@ -4,3 +4,34 @@ Pillar
This is the latest iteration on the Attract project. We are building a unified
framework called Pillar. Pillar will combine Blender Cloud and Attract. You
can see Pillar in action on the [Blender Cloud](https://cloud.blender.org).
## Custom fonts
The icons on the website are drawn using a custom font, stored in
[pillar/web/static/font](pillar/web/static/font).
This font is generated via [Fontello](http://fontello.com/) by uploading
[pillar/web/static/font/config.json](pillar/web/static/font/config.json).
Note that we only use the WOFF and WOFF2 formats, and discard the others
supplied by Fontello.
After replacing the font files & `config.json`, edit the Fontello-supplied
`font.css` to remove all font formats except `woff` and `woff2`. Then upload
it to [css2sass](http://css2sass.herokuapp.com/) to convert it to SASS, and
place it in [src/styles/_font-pillar.sass](src/styles/_font-pillar.sass).
Don't forget to Gulp!
## Installation
Make sure your /data directory exists and is writable by the current user.
Alternatively, provide a `pillar/config_local.py` that changes the relevant
settings.
```
git clone git@git.blender.org:pillar-python-sdk.git ../pillar-python-sdk
pip install -e ../pillar-python-sdk
pip install -U -r requirements.txt
pip install -e .
```

View File

@@ -1,14 +1,21 @@
# Primary requirements
# pillarsdk
attrs==16.2.0
algoliasearch==1.8.0
bcrypt==2.0.0
blinker==1.4
bugsnag==2.3.1
bleach==1.4.3
Cerberus==0.9.2
commonmark==0.7.2
Eve==0.6.3
Events==0.2.1
Flask==0.10.1
Flask-Cache==0.13.1
Flask-Script==2.0.5
flup==1.0.2
Flask-Login==0.3.2
Flask-OAuthlib==0.9.3
Flask-WTF==0.12
gcloud==0.12.0
google-apitools==0.4.11
httplib2==0.9.2
@@ -19,20 +26,22 @@ Pillow==2.8.1
pycparser==2.14
pycrypto==2.6.1
pyOpenSSL==0.15.1
python-dateutil==2.5.3
requests==2.9.1
rsa==3.3
rsa==3.4.2
simplejson==3.8.2
WebOb==1.5.0
wheel==0.24.0
wheel==0.29.0
zencoder==0.6.5
# Development requirements
pytest==2.9.1
pytest==3.0.1
responses==0.5.1
pytest-cov==2.2.1
pytest-cov==2.3.1
mock==2.0.0
# Secondary requirements
flup==1.0.2
Flask-PyMongo==0.4.1
Jinja2==2.8
Werkzeug==0.11.10
@@ -42,10 +51,11 @@ cookies==2.2.1
cryptography==1.3.1
enum34==1.1.3
funcsigs==1.0.1
future==0.15.2
html5lib==0.9999999
googleapis-common-protos==1.1.0
ipaddress==1.0.16
itsdangerous==0.24
mock==2.0.0
oauth2client==2.0.2
pbr==1.9.1
protobuf==3.0.0b2.post2
@@ -54,6 +64,7 @@ py==1.4.31
pyasn1==0.1.9
pyasn1-modules==0.0.8
pymongo==3.2.2
requests_oauthlib==0.7.0
six==1.10.0
wsgiref==0.1.2
coverage==4.0.3

View File

@@ -9,4 +9,4 @@ echo "THIS WILL DROP EXISTING CONNECTIONS"
echo "Press [ENTER] to continue, [CTRL]+[C] to abort."
read dummy
mongorestore -h localhost:27017 -d eve --drop --maintainInsertionOrder --stopOnError "$1/eve"
mongorestore -h localhost:27017 -d eve --drop --maintainInsertionOrder --stopOnError "$1/cloud"

View File

@@ -1,5 +1,5 @@
[pytest]
addopts = -v --cov pillar --cov-report term-missing --ignore node_modules
[tool:pytest]
addopts = -v --cov pillar --cov-report term-missing --ignore node_modules -x --ff
[pep8]
max-line-length = 100

View File

@@ -11,8 +11,14 @@ setuptools.setup(
install_requires=[
'Flask>0.10,<0.11', # Flask 0.11 is incompatible with Eve 0.6.4
'Eve>=0.6.3',
'Flask-Cache>=0.13.1',
'Flask-Script>=2.0.5',
'Flask-Login>=0.3.2',
'Flask-OAuthlib>=0.9.3',
'Flask-WTF>=0.12',
'algoliasearch>=1.8.0,<1.9.0', # 1.9 Gives an issue importing some exception class.
'attrs>=16.2.0',
'bugsnag>=2.3.1,<3.0', # latest version on PyPi is beta of 3.0
'gcloud>=0.12.0',
'google-apitools>=0.4.11',
'MarkupSafe>=0.23',
@@ -22,6 +28,7 @@ setuptools.setup(
'zencoder>=0.6.5',
'bcrypt>=2.0.0',
'blinker>=1.4',
'pillarsdk',
],
tests_require=[
'pytest>=2.9.1',

View File

@@ -28,15 +28,21 @@ $(document).ready(function() {
var params = {
hitsPerPage: HITS_PER_PAGE,
maxValuesPerFacet: MAX_VALUES_PER_FACET,
facets: $.map(FACET_CONFIG, function(facet) { return !facet.disjunctive ? facet.name : null; }),
disjunctiveFacets: $.map(FACET_CONFIG, function(facet) { return facet.disjunctive ? facet.name : null; })
facets: $.map(FACET_CONFIG, function(facet) {
return !facet.disjunctive ? facet.name : null;
}),
disjunctiveFacets: $.map(FACET_CONFIG, function(facet) {
return facet.disjunctive ? facet.name : null;
})
};
// Setup the search helper
var helper = algoliasearchHelper(algolia, INDEX_NAME, params);
// Check if we passed hidden facets in the FACET_CONFIG
var result = $.grep(FACET_CONFIG, function(e){ return e.hidden && e.hidden == true; });
var result = $.grep(FACET_CONFIG, function(e) {
return e.hidden && e.hidden == true;
});
for (var i = 0; i < result.length; i++) {
var f = result[i];
helper.addFacetRefinement(f.name, f.value);
@@ -76,17 +82,24 @@ $(document).ready(function() {
firstHit.addClass('active');
firstHit.find('#search-loading').addClass('active');
var getNode = setTimeout(function(){
$.get('/nodes/' + firstHit.attr('data-hit-id') + '/view', function(dataHtml){
$('#search-hit-container').html(dataHtml);
})
.done(function(){
function done() {
$('.search-loading').removeClass('active');
$('#search-error').hide();
$('#search-hit-container').show();
}
clearTimeout(getNode);
window.setTimeout(function() {
// Ignore getting that first result when there is none.
var hit_id = firstHit.attr('data-hit-id');
if (hit_id === undefined) {
done();
return;
}
$.get('/nodes/' + hit_id + '/view', function(dataHtml) {
$('#search-hit-container').html(dataHtml);
})
.done(done)
.fail(function(data) {
$('.search-loading').removeClass('active');
$('#search-hit-container').hide();
@@ -174,11 +187,21 @@ $(document).ready(function() {
var values = [];
for (var v in facetResult.data) {
var label = '';
if (v === 'true') { label = 'Yes'; }
else if (v === 'false') { label = 'No'; }
if (v === 'true') {
label = 'Yes';
} else if (v === 'false') {
label = 'No';
}
// Remove any underscore from the value
else { label = v.replace(/_/g," "); }
values.push({ label: label, value: v, count: facetResult.data[v], refined: helper.isRefined(facetParams.name, v) });
else {
label = v.replace(/_/g, " ");
}
values.push({
label: label,
value: v,
count: facetResult.data[v],
refined: helper.isRefined(facetParams.name, v)
});
}
var sortFunction = facetParams.sortFunction || sortByCountDesc;
if (facetParams.topListIfRefined) sortFunction = sortByRefined(sortFunction);
@@ -214,7 +237,10 @@ $(document).ready(function() {
// Process pagination
var pages = [];
if (content.page > maxPages) {
pages.push({ current: false, number: 1 });
pages.push({
current: false,
number: 1
});
// They don't really add much...
// pages.push({ current: false, number: '...', disabled: true });
}
@@ -222,12 +248,18 @@ $(document).ready(function() {
if (p < 0 || p >= content.nbPages) {
continue;
}
pages.push({ current: content.page === p, number: (p + 1) });
pages.push({
current: content.page === p,
number: (p + 1)
});
}
if (content.page + maxPages < content.nbPages) {
// They don't really add much...
// pages.push({ current: false, number: '...', disabled: true });
pages.push({ current: false, number: content.nbPages });
pages.push({
current: false,
number: content.nbPages
});
}
var pagination = {
pages: pages,
@@ -264,7 +296,9 @@ $(document).ready(function() {
});
$(document).on('click', '.gotoPage', function() {
helper.setCurrentPage(+$(this).data('page') - 1).search();
$("html, body").animate({scrollTop:0}, '500', 'swing');
$("html, body").animate({
scrollTop: 0
}, '500', 'swing');
return false;
});
$(document).on('click', '.sortBy', function() {
@@ -293,12 +327,12 @@ $(document).ready(function() {
if (isEmpty) {
$('#input-loop').addClass('glyphicon-loop');
$('#input-loop').removeClass('glyphicon-remove');
}
else {
} else {
$('#input-loop').removeClass('glyphicon-loop');
$('#input-loop').addClass('glyphicon-remove');
}
}
function numberWithDelimiter(number, delimiter) {
number = number + '';
delimiter = delimiter || ',';
@@ -306,7 +340,9 @@ $(document).ready(function() {
split[0] = split[0].replace(/(\d)(?=(\d\d\d)+(?!\d))/g, '$1' + delimiter);
return split.join('.');
}
var sortByCountDesc = function sortByCountDesc (a, b) { return b.count - a.count; };
var sortByCountDesc = function sortByCountDesc(a, b) {
return b.count - a.count;
};
var sortByName = function sortByName(a, b) {
return a.value.localeCompare(b.value);
};
@@ -319,11 +355,16 @@ $(document).ready(function() {
return sortFunction(a, b);
};
};
function initWithUrlParams() {
var sPageURL = location.hash;
if (!sPageURL || sPageURL.length === 0) { return true; }
if (!sPageURL || sPageURL.length === 0) {
return true;
}
var sURLVariables = sPageURL.split('&');
if (!sURLVariables || sURLVariables.length === 0) { return true; }
if (!sURLVariables || sURLVariables.length === 0) {
return true;
}
var query = decodeURIComponent(sURLVariables[0].split('=')[1]);
$inputField.val(query);
helper.setQuery(query);
@@ -338,6 +379,7 @@ $(document).ready(function() {
helper.setCurrentPage(page);
}
function setURLParams(state) {
var urlParams = '#';
var currentQuery = state.query;
@@ -356,4 +398,3 @@ $(document).ready(function() {
}
});

View File

@@ -70,10 +70,6 @@ function setup_file_uploader(index, upload_element) {
if (data.originalFiles[0]['type'].length && !acceptFileTypes.test(data.originalFiles[0]['type'])) {
uploadErrors.push('Not an accepted file type');
}
// Limit upload size to 1GB
if (data.originalFiles[0]['size'] && data.originalFiles[0]['size'] > 1262485504) {
uploadErrors.push('Filesize is too big');
}
if (uploadErrors.length > 0) {
$(this).parent().parent().addClass('error');
$(this).after(uploadErrors.join("\n"));
@@ -106,9 +102,14 @@ function setup_file_uploader(index, upload_element) {
}
$file_id_field.val(pillar_file_id);
var filename = data.files[0].name;
// Set the slug based on the name, strip special characters
// TODO: fix this so that it doesn't set the wrong field.
// $('#' + $(this).attr('data-field-slug')).val(filename.replace(/[^0-9a-zA-Z]+/g, ""));
// Ugly workaround: If the asset has the default name, name it as the file
if ($('.form-group.name .form-control').val() == 'New asset') {
var filename = data.files[0].name;
$('.form-group.name .form-control').val(filename);
$('.node-edit-title').html(filename);
}
@@ -118,11 +119,21 @@ function setup_file_uploader(index, upload_element) {
$('body').trigger('file-upload:finished');
},
fail: function (jqXHR, textStatus, errorThrown) {
fail: function (jqXHR, fileupload) {
if (console) {
console.log(textStatus, 'Upload error: ' + errorThrown);
console.log('Upload error:');
console.log('jqXHR', jqXHR);
console.log('fileupload', fileupload);
}
statusBarSet(textStatus, 'Upload error: ' + errorThrown, 'pi-attention', 8000);
var uploadErrors = [];
for (var key in fileupload.messages) {
uploadErrors.push(fileupload.messages[key]);
}
statusBarSet('error',
'Upload error: ' + uploadErrors.join("; "),
'pi-attention', 16000);
set_progress_bar(100, 'progress-error');
@@ -145,6 +156,21 @@ $(function () {
})
.on('file-upload:activated', on_file_upload_activated)
.on('file-upload:finished', on_file_upload_finished)
.on('click', '.js-append-attachment', function(e) {
e.preventDefault();
// Append widget @[slug-name] to the post's description
// Note: Heavily connected to HTML in _node_edit_form.jade
var slug = $(this).parent().find("input[id*='slug']").val();
var widget = '@[' + slug + ']\n';
if (slug) {
document.getElementById('description').value += widget;
statusBarSet('success', 'Attachment appended to description', 'pi-check');
} else {
statusBarSet('error', 'Slug is empty, upload something first', 'pi-warning');
}
})
;
function inject_project_id_into_url(index, element) {

View File

@@ -1,32 +1,25 @@
$(function () {
$('[data-toggle="tooltip"]').tooltip({'delay' : {'show': 1250, 'hide': 250}});
$('[data-toggle="popover"]').popover();
})
function NavbarTransparent() {
var startingpoint = 50;
$(window).on("load scroll", function () {
if ($(this).scrollTop() > startingpoint) {
$('.navbar-overlay, .navbar-transparent').addClass('is-active');
if(document.getElementById("project_context-header") !== null) {
$('#project_context-header').addClass('is-offset');
}
} else {
$('.navbar-overlay, .navbar-transparent').removeClass('is-active');
if(document.getElementById("project_context-header") !== null) {
$('#project_context-header').removeClass('is-offset');
}
};
});
};
NavbarTransparent();
/* Status Bar */
function statusBarClear(delay_class, delay_html){
var statusBar = $("#status-bar");
if (!delay_class) { delay_class = 0 };
if (!delay_html) { delay_html = 250 };
if (delay_class == 0) {
statusBar.removeAttr('class');
return
} else {
setTimeout(function(){
statusBar.removeAttr('class');
setTimeout(function() {
statusBar.html('');
}, delay_html);
}, delay_class);
}
}
function statusBarSet(classes, html, icon_name, time){
/* Utility to notify the user by temporarily flashing text on the project header
Usage:
@@ -57,14 +50,16 @@ function statusBarSet(classes, html, icon_name, time){
icon = '<i class="' + icon_name + '"></i>';
};
statusBarClear(0,0);
var text = icon + html;
$("#project-statusbar").addClass('active ' + classes);
$("#project-statusbar").html(text);
var statusBar = $("#status-bar");
statusBar
.addClass('active ' + classes)
.html(text);
/* Back to normal */
setTimeout(function(){
$("#project-statusbar").removeAttr('class');
$("#project-statusbar").html();
}, time);
statusBarClear(time, 250);
};

View File

@@ -2,20 +2,12 @@ function projectNavCollapse() {
$("#project-side-container").addClass('collapsed');
$("ul.breadcrumb.context").addClass('active');
if (typeof Ps !== 'undefined'){
Ps.destroy(document.getElementById('project_tree'));
};
};
function projectNavExpand() {
$("#project-side-container").removeClass('collapsed');
$("ul.breadcrumb.context").removeAttr('class');
if (typeof Ps !== 'undefined'){
Ps.initialize(document.getElementById('project_tree'), {suppressScrollX: true});
}
};
function projectNavCheck(){
@@ -130,7 +122,7 @@ function hopToTop(limit){
document.getElementById("hop").onclick = function(e){ window.scrollTo(0, 0);}
$(window).scroll(function() {
$(window).on("scroll", function () {
if ($(window).scrollTop() >= limit) {$("#hop").addClass("active")} else {$("#hop").removeAttr("class")}
});
}
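A hypothetical call to the function above; the 500px threshold is only an example value:
// Show the #hop 'back to top' button once the page has been scrolled 500px down.
hopToTop(500);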
@@ -157,7 +149,7 @@ function containerResizeY(window_height){
var container_offset = $('#project-container').offset();
var container_height = window_height - container_offset.top;
var container_height_wheader = window_height - container_offset.top - $('#project_nav-header').height();
var window_height_minus_nav = window_height - $('#project_nav-header').height();
var window_height_minus_nav = window_height - $('#project_nav-header').height() - 50; // 50px is the height of the global top navbar
$('#project_context-header').width($('#project_context-container').width());
@@ -168,7 +160,7 @@ function containerResizeY(window_height){
);
if (container_height > parseInt($('#project-container').css("min-height"))) {
if (projectTree){
if (typeof projectTree !== "undefined"){
$(projectTree).css(
{'max-height': container_height_wheader + 'px',
'height': container_height_wheader + 'px'}
@@ -177,7 +169,4 @@ function containerResizeY(window_height){
}
};
if (projectTree){ Ps.update(projectTree) }
};
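A sketch of how containerResizeY() is typically driven (assumed wiring, not shown in this diff):
// Recompute the project container heights on page load and whenever the window is resized.
$(window).on('load resize', function () {
    containerResizeY($(window).height());
});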

View File

@@ -4,25 +4,31 @@ $(document).on('click','body .comment-action-reply',function(e){
e.preventDefault();
// container of the comment we are replying to
var parentDiv = $(this).parent().parent();
var parentDiv = $(this).closest('.comment-container');
// container of the first-level comment in the thread
var parentDivFirst = $(this).parent().parent().prevAll('.is-first:first');
var parentDivFirst = parentDiv.prevAll('.is-first:first');
// Get the id of the comment
if (parentDiv.hasClass('is-reply')) {
parentNodeId = parentDivFirst.data('node_id');
parentNodeId = parentDivFirst.data('node-id');
} else {
parentNodeId = parentDiv.data('node_id');
parentNodeId = parentDiv.data('node-id');
}
if (!parentNodeId) {
if (console) console.log('No parent ID found on ', parentDiv.toArray(), parentDivFirst.toArray());
return;
}
// Get the textarea and set its parent_id data
var commentField = document.getElementById('comment_field');
commentField.setAttribute('data-parent_id', parentNodeId);
commentField.dataset.originalParentId = commentField.dataset.parentId;
commentField.dataset.parentId = parentNodeId;
// Start the comment field with @authorname:
var replyAuthor = $(this).parent().parent().find('.comment-author:first').html();
$(commentField).val("**@" + replyAuthor + ":** ");
var replyAuthor = parentDiv.find('.comment-author:first span').html();
$(commentField).val("**@" + replyAuthor.slice(1, -1) + ":** ");
// Add class for styling
$('.comment-container').removeClass('is-replying');
@@ -52,6 +58,9 @@ $(document).on('click','body .comment-action-cancel',function(e){
$('.comment-reply-container').detach().prependTo('#comments-list');
var commentField = document.getElementById('comment_field');
commentField.dataset.parentId = commentField.dataset.originalParentId;
delete commentField.dataset.originalParentId;
$(commentField).val('');
// Convert Markdown
var convert = new Markdown.getSanitizingConverter().makeHtml;
@@ -71,11 +80,16 @@ $(document).on('click','body .comment-action-rating',function(e){
e.preventDefault();
var $this = $(this);
var nodeId = $this.parent().parent().parent().data('node_id');
var nodeId = $this.closest('.comment-container').data('node-id');
var is_positive = !$this.hasClass('down');
var parentDiv = $this.parent();
var rated_positive = parentDiv.hasClass('positive');
if (typeof nodeId === 'undefined') {
if (console) console.log('Undefined node ID');
return;
}
var op;
if (parentDiv.hasClass('rated') && is_positive == rated_positive) {
op = 'revoke';
@@ -107,3 +121,185 @@ $(document).on('click','body .comment-action-rating',function(e){
$this.siblings('.comment-rating-value').text(rating);
});
});
/**
* Fetches a comment, returns a promise object.
*/
function loadComment(comment_id, projection)
{
if (typeof comment_id === 'undefined') {
if (console) console.log('Error: loadComment() called with undefined comment_id; projection:', projection);
return $.Deferred().reject();
}
// Add required projections for the permission system to work.
projection.node_type = 1;
projection.project = 1;
var url = '/api/nodes/' + comment_id;
return $.get({
url: url,
data: {projection: projection},
cache: false, // make sure the user always gets the latest version of the comment to edit.
});
}
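For reference, a minimal usage sketch of loadComment(); the projection mirrors the call made in commentEditCancel() further below, and the comment ID is a made-up example:
var example_comment_id = '000000000000000000000000'; // hypothetical comment node ID
loadComment(example_comment_id, {'properties.content': 1})
    .done(function (data) {
        // data is the comment node, including the projected fields plus
        // node_type and project (added above for the permission system).
        console.log('Loaded comment:', data);
    })
    .fail(function (xhr) {
        if (console) console.log('Could not load comment:', xhr);
    });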
function loadComments(commentsUrl)
{
var commentsContainer = $('#comments-embed');
return $.get(commentsUrl)
.done(function(dataHtml) {
// Update the DOM by injecting the generated HTML into the page
commentsContainer.html(dataHtml);
})
.fail(function(xhr) {
statusBarSet('error', "Couldn't load comments. Error: " + xhr.responseText, 'pi-attention', 5000);
commentsContainer.html('<a id="comments-reload"><i class="pi-refresh"></i> Reload comments</a>');
});
}
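A hypothetical call; the actual comments URL is provided by the page template, so the one below is only illustrative:
loadComments('/nodes/' + ProjectUtils.nodeId() + '/comments')
    .done(function () {
        // The rendered comments HTML is now inside #comments-embed.
    });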
/**
* Shows an error in the "Post Comment" button.
*/
function show_comment_button_error(msg) {
var $button = $('.comment-action-submit');
var $textarea = $('#comment_field');
$button.addClass('button-field-error');
$textarea.addClass('field-error');
$button.html(msg);
setTimeout(function(){
$button.html('Post Comment');
$button.removeClass('button-field-error');
$textarea.removeClass('field-error');
}, 2500);
}
/**
* Shows an error in the "edit comment" button.
*/
function show_comment_edit_button_error($button, msg) {
var $textarea = $('#comment_field');
$button.addClass('error');
$textarea.addClass('field-error');
$button.html(msg);
setTimeout(function(){
$button.html('<i class="pi-check"></i> save changes');
$button.removeClass('button-field-error');
$textarea.removeClass('field-error');
}, 2500);
}
/**
* Switches the comment to either 'edit' or 'view' mode.
*/
function comment_mode(clicked_item, mode)
{
var $container = $(clicked_item).closest('.comment-container');
var comment_id = $container.data('node-id');
var $edit_buttons = $container.find('.comment-action-edit');
if (mode == 'edit') {
$edit_buttons.find('.edit_mode').hide();
$edit_buttons.find('.edit_cancel').show();
$edit_buttons.find('.edit_save').show();
} else {
$edit_buttons.find('.edit_mode').show();
$edit_buttons.find('.edit_cancel').hide();
$edit_buttons.find('.edit_save').hide();
$container.find('.comment-content').removeClass('editing');
$container.find('.comment-content-preview').html('').hide();
}
}
/**
* Return the UI to normal when cancelling or saving.
*
* clicked_item: save/cancel button.
*
* Returns a promise for the comment reload when reload_comment is true.
*/
function commentEditCancel(clicked_item, reload_comment) {
comment_mode(clicked_item, 'view');
var comment_container = $(clicked_item).closest('.comment-container');
var comment_id = comment_container.data('node-id');
if (!reload_comment) return;
return loadComment(comment_id, {'properties.content': 1})
.done(function(data) {
var comment_html = data['properties']['content_html'];
comment_container.find('.comment-content').html(comment_html);
})
.fail(function(xhr) {
if (console) console.log('Error fetching comment: ', xhr);
statusBarSet('error', 'Error canceling.', 'pi-warning');
});
}
function save_comment(is_new_comment, $commentContainer)
{
var promise = $.Deferred();
var commentField;
var commentId;
var parent_id;
// Get data from HTML, and validate it.
if (is_new_comment)
commentField = $('#comment_field');
else {
commentField = $commentContainer.find('textarea');
commentId = $commentContainer.data('node-id');
}
if (!commentField.length)
return promise.reject("Unable to find comment field.");
if (is_new_comment) {
parent_id = commentField.data('parent-id');
if (!parent_id) {
if (console) console.log("No parent ID found in comment field data.");
return promise.reject("No parent ID!");
}
}
// Validate the comment itself.
var comment = commentField.val();
if (comment.length < 5) {
if (comment.length == 0) promise.reject("Say something...");
else promise.reject("Minimum 5 characters.");
return promise;
}
// Notify callers of the fact that client-side validation has passed.
promise.notify();
// Actually post the comment.
if (is_new_comment) {
$.post('/nodes/comments/create',
{'content': comment, 'parent_id': parent_id})
.fail(promise.reject)
.done(function(data) { promise.resolve(data.node_id, comment); });
} else {
$.post('/nodes/comments/' + commentId,
{'content': comment})
.fail(promise.reject)
.done(function(resp) {
promise.resolve(commentId, resp.data.content_html);
});
}
return promise;
}
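A sketch of how a submit handler might consume the promise returned by save_comment() (assumed wiring; the selector appears elsewhere in this file, and the messages are illustrative):
$(document).on('click', '.comment-action-submit', function (e) {
    e.preventDefault();
    save_comment(true)
        .progress(function () {
            // Client-side validation passed; e.g. disable the button while posting.
        })
        .fail(function (err) {
            // err is either a validation string or the jqXHR from $.post.
            show_comment_button_error(typeof err === 'string' ? err : 'Error posting comment.');
        })
        .done(function (node_id, comment) {
            // e.g. reload the comment list so the new comment shows up.
        });
});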

View File

@@ -3,9 +3,11 @@ ProjectUtils = {
nodeId: function() { return document.body.dataset.nodeId; },
parentNodeId: function() { return document.body.dataset.parentNodeId; },
projectId: function() { return document.body.dataset.projectId; },
projectUrl: function() { return document.body.dataset.projectUrl; },
isProject: function() { return document.body.dataset.isProject === 'true'; },
nodeType: function() { return document.body.dataset.nodeType; },
isModified: function() { return document.body.dataset.isModified === 'true'; },
context: function() { return document.body.dataset.context; },
setProjectAttributes: function(props) {
for (var key in props) {
if (!props.hasOwnProperty(key)) continue;

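Illustrative reads of the ProjectUtils accessors shown above (assumes the data attributes are set on <body> by the page template):
if (ProjectUtils.isProject()) {
    console.log('Project', ProjectUtils.projectId(), 'at', ProjectUtils.projectUrl());
    console.log('Current node:', ProjectUtils.nodeId(), 'of type', ProjectUtils.nodeType());
    console.log('Context:', ProjectUtils.context());
}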