
Comments (11)

adamhenson commented on June 12, 2024

Hi @iambumblehead - although I have a vague idea, please clearly provide the exact issue and ideally a solution. Any issue you open on GitHub should at the bare minimum provide a description.

adamhenson commented on June 12, 2024

You’re gonna make me create an issue template… aren’t you 😄

iambumblehead commented on June 12, 2024

@adamhenson The feature being requested:

  • for each file in the LOCAL_DIRECTORY being synchronised,
    • if it can be gzipped
      • gzip it,
      • put it in the bucket with the correct Content-Encoding: gzip header

An example of how this is done: https://gist.github.com/veselosky/9427faa38cee75cd8e27. A PR adding gzip support to a similar package: https://github.com/multiplegeorges/vue-cli-plugin-s3-deploy/pull/51/files

If gzip support is not added, consider adding a README notice that resources aren't optimized for production deployments.
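
For reference, a minimal sketch of that approach (not part of this package), using Node's built-in zlib and the AWS SDK v3; the bucket, key, and content type below are placeholders:

// compress-and-upload.ts - sketch: gzip a file in memory and upload it to S3
// with Content-Encoding: gzip so browsers/CDNs transparently decompress it.
import { readFileSync } from 'node:fs';
import { gzipSync } from 'node:zlib';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({});

async function uploadGzipped(localPath: string, bucket: string, key: string) {
  // Compress the file contents at maximum compression.
  const gzipped = gzipSync(readFileSync(localPath), { level: 9 });

  await s3.send(new PutObjectCommand({
    Bucket: bucket,
    Key: key,
    Body: gzipped,
    ContentEncoding: 'gzip',               // tells clients the body is gzipped
    ContentType: 'application/javascript', // placeholder; derive per file in practice
  }));
}

uploadGzipped('./public/app.js', 'my-bucket', 'app.js').catch(console.error);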

adamhenson commented on June 12, 2024

@iambumblehead - thanks for raising this. It's a good point, but I think we need to solidify a proposed solution before moving forward.

There are several ways to implement compression. It should not be assumed this project compresses in gzip format out of the box, nor do I want it to. This project provides one function: uploading a local directory to S3. Compression should be done separately, beforehand, by the end user in any format they choose (I actually prefer Brotli with a gzip fallback).

I also have a script I wrote for anyone interested here... it compresses for both gzip and Brotli. I don't want to add it to the project because it prescribes too much opinion and adds more complexity to a project that is meant to be simple.
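
(For anyone who wants a starting point, here is a minimal sketch of such a pre-compression step, using Node's built-in zlib; this is only an illustration, not the script referenced above:)

// precompress.ts - sketch: write .gz and .br siblings next to each file passed
// on the command line, e.g. `node precompress.js public/app.js public/app.css`.
import { readFileSync, writeFileSync } from 'node:fs';
import { gzipSync, brotliCompressSync, constants } from 'node:zlib';

for (const file of process.argv.slice(2)) {
  const source = readFileSync(file);

  // gzip at maximum compression
  writeFileSync(`${file}.gz`, gzipSync(source, { level: 9 }));

  // Brotli at maximum quality
  writeFileSync(`${file}.br`, brotliCompressSync(source, {
    params: { [constants.BROTLI_PARAM_QUALITY]: 11 },
  }));
}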

With that said - I do partially agree with the intent of this ticket. We don't want to prevent someone from following best practice and compressing files before uploading to S3, as you've pointed out. And it seems that in this project's current state, compression is not supported because the user doesn't have control over the AWS S3 header settings.

My questions are as follows:

  • What is the minimum we can do to provide support for compression? To be clear, we want to support compression... not do compression.
    • What flags or options can we provide to support this? Maybe a Content-Encoding setting?
      • Would this setting be applied to all files, and if not, how do we distinguish files?
    • Is there a better way to give control over all header options (or are there other headers we should somehow expose)?
  • What would the API inputs look like (command flags, env variables, etc.)?

iambumblehead commented on June 12, 2024

@adamhenson if this package detected gzip'd files and uploaded them with the correct Content-Encoding headers, that might work for my purpose. One could run commands like these to gzip files before syncing:

 # -9 max compression, -r recurse into directories, -k keep the originals, -f overwrite existing .gz files
 (shopt -s globstar; gzip -9 -rkf ./public/**/*js)
 (shopt -s globstar; gzip -9 -rkf ./public/**/*json)
 (shopt -s globstar; gzip -9 -rkf ./public/**/*png)

adamhenson commented on June 12, 2024

Thanks @iambumblehead. I think we're getting nearer to an agreed-upon solution, but my response to your comment is that there isn't necessarily a "correct" Content-Encoding value (especially considering our code wouldn't know which files to apply it to). It would need to be based on the compression strategy at hand, which could vary by codebase. For example, think about the common case in which the compression strategy encompasses multiple types (in the case below, Brotli with a gzip fallback):

  • ./my-file.js
  • ./my-file.js.br
  • ./my-file.js.gz

In this case, I believe the user would want to set the Accept-Encoding header to provide content negotiation.

Accept-Encoding: gzip, br

Even if I'm off in the above, the fact is that we don't want to make assumptions or prescribe a compression strategy. Also, how would the code in this project determine which files should have header values set?

With all this said - I'm proposing we accept a flag to identify a "config" file in which a glob file pattern could be specified with any corresponding header values, like so:

s3-directory-sync --config ./s3-directory-sync.config.json ...

./s3-directory-sync.config.json

{
  "*.{js,css}": {
    "Accept-Encoding": "gzip, br"
  }
}

Or in your case, something like this...

./s3-directory-sync.config.json

{
  "*.{js}": {
    "Content-Encoding": "gzip"
  }
}
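
To make that concrete, here is a rough sketch of how such a config could be consumed (assuming minimatch-style glob matching; the mapping and function names are illustrative, not an existing API):

// apply-config-headers.ts - sketch: map config glob patterns to per-file
// S3 PutObject parameters. The config shape follows the JSON examples above.
import { minimatch } from 'minimatch';

type HeaderConfig = Record<string, Record<string, string>>;

// Response-style headers map onto PutObject parameters. Accept-Encoding is a
// request header, so it is not mapped here; pre-compressed objects typically
// carry Content-Encoding instead.
const HEADER_TO_PARAM: Record<string, string> = {
  'Content-Encoding': 'ContentEncoding',
  'Content-Type': 'ContentType',
  'Cache-Control': 'CacheControl',
};

function paramsForFile(config: HeaderConfig, relativePath: string) {
  const params: Record<string, string> = {};
  for (const [pattern, headers] of Object.entries(config)) {
    if (!minimatch(relativePath, pattern, { matchBase: true })) continue;
    for (const [header, value] of Object.entries(headers)) {
      const param = HEADER_TO_PARAM[header];
      if (param) params[param] = value;
    }
  }
  return params; // merged into the PutObjectCommand input for this file
}

// With the config above, paramsForFile(config, 'app.js') -> { ContentEncoding: 'gzip' }.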

iambumblehead commented on June 12, 2024

@adamhenson thank you for a thoughtful and detailed reply. As a user, I like using this package with no configuration file or script (it's easier), and I'd be glad if this package provided good defaults that I could override later. Please excuse me if the rest of this reply misunderstands compression-related headers.

I would like it if this package could generate default headers itself with few options from me. If gzip and br are specified (and thus enabled) in the configuration and both gzip and br file variants are found, apply "Accept-Encoding: gzip, br". If gzip and br are specified and only a gzip file variant is found, apply "Accept-Encoding: gzip".

S3_DIRECTORY_SYNC_CACHE_CONTROL="max-age=3600"
S3_DIRECTORY_SYNC_ACCEPT_ENCODINGS=gzip,br

The above configuration may seem limited in expression, but I believe future options could be added (if needed) to give more control. For example:

S3_DIRECTORY_SYNC_CACHE_CONTROL="max-age=3600"
S3_DIRECTORY_SYNC_ACCEPT_ENCODINGS="gzip,br"
S3_DIRECTORY_SYNC_GZIP_FILE_PATTERN="**/*.{js,css,json,ico,map,xml,txt,svg,eot,ttf,woff,woff2}"
S3_DIRECTORY_SYNC_BR_FILE_PATTERN="**/*.{fbx,obj}"
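
If it helps, here is a rough sketch of the variant-detection part of this proposal (S3_DIRECTORY_SYNC_ACCEPT_ENCODINGS is the variable proposed above, not an existing option):

// detect-variants.ts - sketch of the proposed behaviour: for each source file,
// check which pre-compressed siblings exist among the enabled encodings.
import { existsSync } from 'node:fs';

// Proposed (not existing) env variable, e.g. "gzip,br".
const enabled = (process.env.S3_DIRECTORY_SYNC_ACCEPT_ENCODINGS ?? '')
  .split(',')
  .map((s) => s.trim())
  .filter(Boolean);

const EXTENSIONS: Record<string, string> = { gzip: '.gz', br: '.br' };

// variantsFor('./public/app.js') -> ['gzip', 'br'] when app.js.gz and app.js.br exist,
// ['gzip'] when only app.js.gz exists, and [] when neither does.
function variantsFor(file: string): string[] {
  return enabled.filter((enc) => enc in EXTENSIONS && existsSync(file + EXTENSIONS[enc]));
}

// The detected variants would then drive whatever headers the sync applies.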

[...] we don't want to make assumptions or prescribe a compression strategy

I believe my suggested configuration strikes a good balance between "sane defaults" and "user controls their own compression strategy". Feel free to disagree and I look forward to reading your reply.

adamhenson commented on June 12, 2024

@iambumblehead - I want to avoid supporting an over-opinionated project. You described one example, but there are many other variations a user could prefer. Also, I'm not confident enough in my understanding of compression to add all the correct headers... that's why I think we should let the user configure any headers they'd like.

If I do move forward with the config file solution, I think it would be simpler than you might fear, and I'll create a sample project.

iambumblehead commented on June 12, 2024

@adamhenson I could test any branch with buckets I am using and report results to you.

You described one example, but there are many other variations that could be preferred by the user.

It might be easier to first support one variation that is straightforward to implement, then later add support for more variations as a separate step.

adamhenson commented on June 12, 2024

Again, there are far more variations than I could consider with this project... more failure points, which equates to more responsibility than I am willing to take on. Compression is not a feature this project will provide. I would, however, like it to support compression if the user has established their own strategy... and any other custom options they would like to pass on a per-file basis, hence the open-ended support a config file would provide.

adamhenson commented on June 12, 2024

FWIW - I've decided to drop support for this project in light of issues like this and also better, similar projects like s3-sync-client.

See the deprecation note for details.
