Support for uploading larger file sizes #130

Closed
opened 2024-05-16 09:46:40 +02:00 by Oleg-Komarov · 29 comments
Owner

We are currently limited by CloudFlare's limits for file uploads.

One solution is to implement chunked file uploads, e.g. see https://github.com/juliomalegria/django-chunked-upload/ (relies on a client-side implementation that is not actively maintained).

Just tried to upload my addon and ran into this. The actual addon code is tiny (only like 20KB), but I've bundled all the dependencies (`numpy, pillow, skia`) for multiple platforms as recommended by [the docs](https://docs.blender.org/manual/en/4.2/extensions/python_wheels.html). They add about **120MB** to the archive.

@Bobbe why are you bundling numpy? Blender already bundles numpy (run `import numpy` in the Python Console editor)

> why are you bundling numpy? Blender already bundles numpy (run `import numpy` in the Python Console editor)

Well it's supposed to, yes. I've encountered Blender installations in the past where numpy was in fact not available (iirc it was Arch Linux), so I figured it wouldn't hurt to include it. That being said, I think in that particular case it was just using the system Python, and you are kinda doomed at that point anyways ...

Regardless, there are lots of reasons for addons to be larger than 10MB or whatever the limit is rn. (In my specific case, removing numpy is not going to cut it.)

Pablo Vazquez added the Type: Enhancement label 2024-05-27 12:49:36 +02:00

By the way, I started looking at chunked upload support, and a lot can be done by just using front-end JS to split the file.

I got 80% of the way there, just failed to hook the backend up nicely with the regular CreateView routines.

My initial attempt is here: https://projects.blender.org/dfelinto/extensions-website/src/branch/chunked-upload

Anyways, what I wanted to add is that it is probably “trivial” for the real backend developers to implement this. And I was happy to know that it is clearly doable, yet I won’t pursue this further.

I would also highly encourage this to be on the agenda. Though it is not a pressing matter compared to most of the other topics.
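
For illustration, the splitting step is roughly this (a minimal Python sketch of the idea rather than the branch's actual front-end JS; the endpoint, field names and chunk size are assumptions, not the branch's API):

```python
# Minimal sketch of client-side chunking against a hypothetical endpoint;
# the real client would be front-end JS talking to the Django backend.
import hashlib
import requests  # third-party: pip install requests

CHUNK_SIZE = 10 * 1024 * 1024  # stay under the per-request proxy limit

def upload_in_chunks(path, url="https://example.org/api/chunked-upload/"):
    md5 = hashlib.md5()
    offset = 0
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            md5.update(chunk)
            # each request carries its offset so the server can reassemble in order
            resp = requests.post(
                url,
                files={"chunk": chunk},
                data={"filename": path, "offset": offset},
            )
            resp.raise_for_status()
            offset += len(chunk)
    # final request asks the server to assemble the chunks and verify the checksum
    requests.post(url + "complete/", data={"filename": path, "md5": md5.hexdigest()}).raise_for_status()
```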

Author
Owner

I've just tested an upload and noticed that we had overlooked one more check in the code that was limiting uploads to 10MB 🤦
This is now updated, and the limit is 200MB; I've managed to upload a 190MB test file as a draft add-on.

I think that we were lucky to get feedback from ~Bobbe (thanks!) about the actual reasons for blowing up the upload size:
there is an issue with shipping large dependencies, made worse by supporting multiple platforms in a single zip

I wonder if there are other use cases for uploads larger than 200MB that are structurally different, i.e. they need to ship something other than large wheels. Is it possible to have some concrete examples that illustrate the need for this?

@dfelinto: re https://projects.blender.org/dfelinto/extensions-website/src/branch/chunked-upload

I see value in keeping the system as simple as it can be for as long as we can, and adding chunked uploads is a complication that requires more elaborate code than what is presented in the branch (it handles only the happy path):

  1. One illustration is the list of error responses (it maps onto the variety of failure scenarios that need to be accounted for) in https://github.com/juliomalegria/django-chunked-upload/?tab=readme-ov-file#possible-error-responses
  2. The code in the branch doesn't do any server-side bookkeeping for the uploaded chunks, making it hard to debug when something goes wrong:
  • leftover chunks in the upload dir cannot be attributed to the user who was uploading
  • we need to know when to clean up leftovers from uploads that didn't finish (not by scanning the file system, but by getting a list from the db)
  • we need to keep track of file checksums to ensure upload integrity
  • etc

What I want to say is that it is in no way trivial, and we may want to focus on other things unless there is very strong demand to go above the 200MB limit.
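
To make the bookkeeping point concrete, this is roughly the kind of record I mean (a minimal Django model sketch; field names are illustrative, django-chunked-upload ships something along these lines):

```python
from django.conf import settings
from django.db import models


class ChunkedUpload(models.Model):
    """Tracks an in-progress chunked upload so leftovers can be attributed and cleaned up."""

    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    upload_id = models.UUIDField(unique=True)
    file = models.FileField(upload_to="chunked_uploads/%Y/%m/")
    offset = models.BigIntegerField(default=0)          # bytes received so far
    expected_size = models.BigIntegerField(null=True)
    md5 = models.CharField(max_length=32, blank=True)   # verified when the upload completes
    status = models.CharField(
        max_length=16,
        choices=[("uploading", "Uploading"), ("complete", "Complete"), ("failed", "Failed")],
        default="uploading",
    )
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)     # lets a periodic job expire stale uploads
```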

I can give feedback around my use case which is pushing over 200 MB.

I need to include multiple Python packages for parsing of specialised scientific data formats. In my testing, if I build the `.zip` for an individual platform then it is ~160 MB. If I build it with wheels for every platform in one file then it is ~280 MB.

I am currently using my own [custom build script](https://github.com/BradyAJohnston/MolecularNodes/blob/extensions-platform/build.py) which uses pip to download the required packages and install them into the `molecularnodes/wheels/` directory. As I am using multiple Python packages, each of which also has some dependencies, all of these are installed. I run this build script with Blender 4.2, but it still downloads `numpy` and others that Blender might already have. As Dalai pointed out, there isn't a point in bundling `numpy` if it is already included in Blender, but at the moment I am trying not to have any manual curation of the `.whl` bundling and instead rely on `pip`, to reduce workload and the potential for mistakes. This is why I (and I assume others) am bundling packages like `numpy` in the upload.
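
(For context, the build step is essentially a `pip download` of pre-built wheels into the wheels directory, once per platform; a rough sketch below with example platform tags, not my exact script.)

```python
import subprocess
import sys

# example platform tags only; the real script targets whatever platforms Blender 4.2 supports
PLATFORMS = {
    "windows-x64": "win_amd64",
    "macos-arm64": "macosx_12_0_arm64",
    "linux-x64": "manylinux2014_x86_64",
}

def download_wheels(platform_tag: str, dest: str = "molecularnodes/wheels") -> None:
    # pip resolves every package in requirements.txt plus all of their dependencies,
    # which is why numpy and friends end up in the bundle too
    subprocess.run(
        [
            sys.executable, "-m", "pip", "download",
            "--only-binary=:all:",        # wheels only, never sdists
            "--dest", dest,
            "--platform", platform_tag,
            "--python-version", "3.11",   # must match Blender's bundled Python
            "-r", "requirements.txt",
        ],
        check=True,
    )

if __name__ == "__main__":
    download_wheels(PLATFORMS["windows-x64"])
```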

Currently the upload limit is a limiting factor for me, and I am sure for others as well, but I assume it is mostly due to `.whl` bundling.

If this is to be a hardline stance from Blender then I am happy to set something up for users to install from hosted builds on GitHub, but I would like to try and find a solution that enables it to be on the extensions platform.

Potential solutions from where I sit:

  • Uploading of a separate `.zip` for each platform. As the building is automated, this isn't too much extra effort on my end to do separate uploads. Whether they are grouped on the one extensions page or counted as separate extensions, either could work (the latter I could do right now without any changes).
  • Part of the `blender_manifest.toml` file could instead include a list of Python packages & versions, which is then downloaded and bundled on Blender's side. This would remove the need for large file uploads (the add-on without `.whl` files is ~5MB) and increase security, as Blender is sourcing `.whl`s from PyPI rather than users uploading potentially compromised binary files.

Author
Owner

@bunchofbradys thank you for explaining your use case!

I have immediate reactions to two points you raised, but please don't read this as a definite resolution or an official stance:

> Uploading of a separate `.zip` for each platform. As the building is automated, this isn't too much extra effort on my end to do separate uploads. Whether they are grouped on the one extensions page or counted as separate extensions, either could work (the latter I could do right now without any changes).

I expect this to work reasonably well if you upload separate extension versions per platform, by specifying a platform suffix in the version name and listing one platform per manifest/upload, e.g.

```toml
...
version = "0.0.1-windows-x64"
...
platforms = ["windows-x64"]
...
```

All your versions will be grouped under the same extension on the website, sharing ratings and download counts, but Blender will know which version to install due to the platform filter.

> This is why I (and I assume others) am bundling packages like `numpy` in the upload.

I wonder which version of numpy is effectively imported in this case: if Blender comes with its own version, will your dependencies get the version they expect, or will the Blender-bundled version be first on the import path? Especially for transitive/2nd-order dependencies.
@ideasman42 could you comment on how this works or point to a doc that explains it? I've glanced through https://docs.blender.org/api/4.2/index.html but didn't find an answer
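
(For what it's worth, a quick way to check which copy actually wins, run from Blender's Python Console:)

```python
import sys
import numpy

print(numpy.__version__)
print(numpy.__file__)   # Blender's bundled copy vs. the extension's wheel directory
print(sys.path[:5])     # whichever directory comes first on the path takes precedence
```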

I agree with your last point that we could rely on some form of python dependency management instead of introducing our own, e.g. use https://peps.python.org/pep-0508/ and python packaging tools
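
For example, a single PEP 508 string already encodes the package name, version constraints and environment markers, and the `packaging` library that pip uses can parse it (a sketch of the format, not a proposal for the manifest schema):

```python
from packaging.requirements import Requirement

req = Requirement('numpy>=1.24,<2 ; python_version >= "3.11"')
print(req.name)       # numpy
print(req.specifier)  # >=1.24,<2
print(req.marker)     # python_version >= "3.11", evaluated per platform/interpreter
```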

> All your versions will be grouped under the same extension on the website, sharing ratings and download counts, but Blender will know which version to install due to the platform filter.

@Oleg-Komarov do you mean that I should be able to do this currently?

Author
Owner

@bunchofbradys yes, if you

  • use the same extension id (UPD: I meant the `id` field in the manifest)
  • upload multiple versions with different version strings: we accept `-` and `+` suffixes, according to items 9 and 10 of the https://semver.org/ spec (although those suffixes are meant for a different kind of metadata)
  • specify the correct `platforms` value for each of those versions

This would implement what you described as the first solution in your message, and it is expected to work; if it does not, it is a bug.

But I can't tell at this moment if we would want this to be the recommended approach for all multi-platform extensions.

Thanks @Oleg-Komarov for the details. I'll try it out as an approach - mostly so I can get user feedback on whether the add-on is working well with the 4.2 beta. I can change the approach if something else is decided on.

> upload multiple versions with different version strings: we accept `-` and `+` suffixes, according to items 9 and 10 of the https://semver.org/ spec (although those suffixes are meant for a different kind of metadata)

@Oleg-Komarov you mean to use something like this for the version? I believe the site itself is not parsing away what is after the `+`... maybe we should? (see the sketch after the list below)

  • 1.8.0+windows-x64
  • 1.8.0+macos-arm64
  • 1.8.0+linux-x64
  • 1.8.0+macos-x64
  • 1.8.0+windows-arm64
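
If we did parse it away, it would only mean ignoring everything after the `+` when comparing versions, something like this (a sketch, not what the server currently does):

```python
def strip_build_metadata(version: str) -> str:
    """'1.8.0+windows-x64' -> '1.8.0'; semver item 10 says build metadata is ignored for precedence."""
    return version.split("+", 1)[0]

assert strip_build_metadata("1.8.0+macos-arm64") == "1.8.0"
```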

EDIT: partly resolved, see later comment.

~~I have just tried this approach and I'm getting this error:~~

> The extension in the manifest ("molecularnodes") is already being used by another extension.

~~I uploaded a mac version and just tried uploading a windows version. Relevant parts from the `.toml` are below.~~

MacOS

id = "molecularnodes"
version = "4.2.0-macos-arm64"
platforms = [
	"macos-arm64",
]

Windows

id = "molecularnodes"
version = "4.2.0-windows-x64"
platforms = [
	"windows-x64",
]

Edit: partly resolved, see later comment.

~~I just tried a couple of different uploads. If I upload exactly the same file then I get the "You have already uploaded this version" message, but with any other version number (even incrementing) or with changed platform suffixes I get the error about the id being used by another extension.~~

Slight apologies: I was trying to upload the version on the 'upload' page and not via 'upload new version' on the add-on-specific page.

I have, however, managed to upload a new version for Windows - but this just takes over from the previously uploaded version, even though their platforms are different. (https://extensions.blender.org/approval-queue/molecularnodes/)

Author
Owner

Thanks for trying it out! Sorry, I should have been more precise regarding the upload procedure.

What do you mean by "taking over" - the info displayed in the side bar? If yes, then it is expected from the organization of the website, and we may need to find a better way (cc @dfelinto).

Current limitations:

  • everywhere except "Version History" only the latest version (by upload timestamp) is displayed; this hides the info about additional versions
  • the Version History tab is not available to moderators until the extension becomes listed; this seems like a straightforward fix: if we decide that moderators should be able to see unlisted extensions at all times, then in https://projects.blender.org/infrastructure/extensions-website/src/commit/f9fc2e3a5de747a8935a2d10f609190524a7f8dc/extensions/views/mixins.py#L23 (`if self.request.user.is_staff:`) we replace `is_staff` with `is_moderator` (see the sketch after this list)
  • we don't make new version uploads visible in the approval queue until the extension is listed; we could change this and start adding comments for every non-draft version
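
A hypothetical sketch of that second bullet (names are illustrative; the actual mixin in `extensions/views/mixins.py` differs, this only shows the `is_staff` -> `is_moderator` swap):

```python
class ListedOrModeratorRequiredMixin:
    """Let moderators see unlisted extensions; everyone else only sees listed ones."""

    def has_permission(self) -> bool:
        extension = self.get_object()
        if extension.is_listed:  # assumed attribute
            return True
        user = self.request.user
        # before: return user.is_staff
        return getattr(user, "is_moderator", False)  # assumed moderator flag/property
```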

By 'taking over', yes, I just mean that the most recently uploaded 'version' is what is currently displayed when looking at it in the review queue, even though the previous version was the same but just for a different platform.

Currently there are two versions uploaded. Will both of these need to be approved? If one is approved, will the other become available? I didn't upload the other macOS and Linux versions because I was waiting to hear back on how I should proceed. Should I upload them now as well?

> Will both of these need to be approved?

No, if one gets approved, they all get approved (as well as subsequent versions of the extension).

> If one is approved - will the other become available?

Yes.

> Should I upload them now as well?

May as well.

Author
Owner

> No, if one gets approved, they all get approved (as well as subsequent versions of the extension).

That's not exactly how it works now: initial approve only approves latest_version (and all images). Then, only after the extension is approved, the subsequent uploads are auto-approved.

I don't think that auto-approving all existing versions on the first approval will do the right thing:
it would break the scenario where a user has to iterate on the code due to a moderator's feedback before the first approval and uploads fixes as new versions, without cleaning up previous uploads.

> initial approve only approves latest_version (and all images). Then, only after the extension is approved, the subsequent uploads are auto-approved.

Okay, I'll wait for the feedback / approval on the Windows version - then I will upload versions for the other platforms.

For the record, I approved Molecular Nodes. There seems to be a bug, since the server is not fetching the platform from the manifest, but it is separate from this task: #184

Also, created a task on Blender to flag some included modules (e.g., numpy) so their wheel is never loaded: blender/blender#123189

Thanks Dalai - appreciate it! I'll hold off on uploading any more versions till the platform parsing is sorted out - but it seems to be working well.

Something related to larger add-ons is that clicking 'Install' for an extension that (in my case) is ~120MB leads to Blender just freezing for 40 s while the operator runs. My connection speed is moderate, so I imagine this being even more problematic for people with slower connections. Would it be possible for this to be a modal operator that runs in the background with a progress bar?
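
(Something like the usual modal-operator-plus-timer pattern could cover it; a rough sketch below, with illustrative names rather than the actual extensions install operator:)

```python
import tempfile
import threading
import urllib.request

import bpy


class EXAMPLE_OT_install_in_background(bpy.types.Operator):
    """Download on a background thread and poll it from a modal timer, so the UI stays responsive."""
    bl_idname = "example.install_in_background"
    bl_label = "Install Extension (Background)"

    url: bpy.props.StringProperty()

    def invoke(self, context, event):
        self._done = False
        self._dest = tempfile.NamedTemporaryFile(suffix=".zip", delete=False).name
        threading.Thread(target=self._download, daemon=True).start()
        wm = context.window_manager
        self._timer = wm.event_timer_add(0.2, window=context.window)
        wm.modal_handler_add(self)
        return {'RUNNING_MODAL'}

    def modal(self, context, event):
        if event.type == 'TIMER':
            if context.area:
                context.area.header_text_set("Downloading extension...")
            if self._done:
                if context.area:
                    context.area.header_text_set(None)
                context.window_manager.event_timer_remove(self._timer)
                # the actual install of the downloaded self._dest would happen here
                return {'FINISHED'}
        return {'PASS_THROUGH'}

    def _download(self):
        urllib.request.urlretrieve(self.url, self._dest)
        self._done = True

# bpy.utils.register_class(EXAMPLE_OT_install_in_background)
```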

Author
Owner

@bunchofbradys FYI, we decided on the way forward for handling multi-OS builds: #74 (comment) and more details in https://devtalk.blender.org/t/2024-06-14-extensions-platform/35136

Implementing this will take a bit of time. In my opinion, if you don't want to wait, you could keep using separate versions to finish uploading your current release.
Also, once we have the new model in place, we would appreciate your feedback: either trying it out with your next releases, or potentially re-uploading your current release as well.

Thanks for the update! I'll probably wait until it is implemented to upload more versions / platforms (have been able to do my testing as it is).

Will definitely do testing and give feedback when it's ready to go.

I have just had a go at uploading two separate platforms with the same version, but the most recent upload is still the only one that is displayed. I uploaded Windows and macOS ARM, but macOS is the only 'supported' platform displayed.

Update: On the website only the macOS version is displayed as the current version in the sidebar, but the newer Windows version does show up on my Windows machine via the Extensions tab inside of Blender when attempting to install from the website.
![image](/attachments/e01c5ef2-b1ff-48ec-894b-12d4c4cf127f)

@bunchofbradys it is better if you wait until #194 is done. What we have so far is mainly the "infrastructure" for the real multi-os implementation. Its development will resume next week.

Okay no worries, I'll wait. Thanks for the info!

Author
Owner

Do we know if we still have users who can't upload their extensions with the current 200MB limit?

The largest file on the platform now is molecularnodes-4.2.2-windows_x64.zip at 137MB, which hopefully still leaves some headroom for this particular extension going forward.

I would prefer to close this as won't fix, unless we still have authors blocked by the limitation.

No Milestone
No project
No Assignees
4 Participants

No due date set.

Dependencies

No dependencies set.

Reference: infrastructure/extensions-website#130