bradleyg / django-s3direct
Directly upload files to S3 compatible services with Django.
License: MIT License
Hello,
I've just installed the plugin and it seems to be working great from the admin. However, I would like to know if there is a way to make it work from django-rest-framework too. It seems to accept a string where the S3DirectField is defined on the model; I have also tried passing it a base64 image, with no success.
Best regards!
I need to do file uploads via an API. Is there a way I can just send the data (a .png or .mov, say) to an endpoint, and also pass up the type of file (png, mov, etc.)? How do I save the data to the model programmatically and not through this magical form?
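As a side note on sending the file type alongside the raw data: if the endpoint only receives a filename, the MIME type can be derived server-side with the standard library (a sketch; whether s3direct accepts uploads this way is a separate question):

```python
import mimetypes

def guess_content_type(filename):
    """Best-effort MIME type for an uploaded filename."""
    ctype, _ = mimetypes.guess_type(filename)
    return ctype or 'application/octet-stream'
```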
Using buckets that contain dots fails with an error dialogue saying "Oops, file upload failed, please try again".
The issue can be fixed by using a region specific endpoint in this style: https://<s3-region>.amazonaws.com/<bucketname>/
for uploads, however currently the endpoint is hard-coded in s3direct/views.py line 42.
To solve this problem, the endpoint would have to be read from the settings, possibly with the current one as a default value. I considered just using the S3_URL that is used in several boto/storage examples, but since bucket names with dots will most likely occur in use cases where a custom domain is used to mask S3, this value would not be the required endpoint but the custom URL. So it would be best to introduce a new custom setting, e.g. S3DIRECT_UPLOAD_ENDPOINT.
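A sketch of that settings lookup (S3DIRECT_UPLOAD_ENDPOINT is the proposed, hypothetical name; a plain dict stands in for Django settings here so the idea is self-contained):

```python
DEFAULT_UPLOAD_ENDPOINT = 'https://s3.amazonaws.com'  # the currently hard-coded value

def get_upload_endpoint(settings_dict):
    """Read the upload endpoint from settings, defaulting to today's value."""
    return settings_dict.get('S3DIRECT_UPLOAD_ENDPOINT', DEFAULT_UPLOAD_ENDPOINT)
```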
Using Python 2.7, Django version 1.5 I see the following deprecation warning:
s3direct\views.py:1: DeprecationWarning: the sha module is deprecated; use the hashlib module instead
I was getting a 301 (permanently moved) redirect. Perhaps I just have my bucket configured improperly, but I changed the example form endpoint to be formatted like {{bucket}}.s3.amazonaws.com and it worked like a charm.
Since there are a lot of options now, would it be easier to make it into a dict, instead of a tuple?
Hello,
I would like to ask if there is an option to limit the upload size, for example through the settings. In my declaration below,
'imgs': ('uploads/imgs', lambda u: True, ['image/jpeg', 'image/png'],),
if it could be,
'imgs': ('uploads/imgs', lambda u: True, ['image/jpeg', 'image/png'],'10MB'),
or
'imgs': ('uploads/imgs', lambda u: True, ['image/jpeg', 'image/png'], lambda file: file._size > 10*1024*1024),
Is there a default field or parameter to monitor in order to throw an exception if someone goes above my preferred threshold?
I am looking for something similar to the discussions here and here.
Thank you very much in advance!
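For what it's worth, S3's POST upload policy itself supports a content-length-range condition, so the bucket can enforce the limit rather than Django; a sketch of appending it to the policy conditions (the function name is mine):

```python
def add_size_limit(conditions, min_bytes, max_bytes):
    """Append S3's content-length-range condition so S3 rejects any upload
    whose size falls outside [min_bytes, max_bytes]."""
    conditions.append(['content-length-range', min_bytes, max_bytes])
    return conditions
```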
Fairly obvious request, but thought I'd record it here :)
If this isn't being worked on then I'll give it a bash if I get time - is anyone doing this already?
I'm trying to upload 2 files at the same time. I changed getUploadURL inside scripts.js by adding a for loop:
for (var i = 0; i <= el.querySelector('.file-input').files.length - 1; i++) {
    file_set.push(el.querySelector('.file-input').files[i]);
}
and added form.append('type', file[0].type, file[1].type) and form.append('name', (file[0].name, file[1].name)).
I get Invalid file type () when trying to upload. What else should be changed so that 2 files are uploaded at the same time?
Hey! I installed your library and it works fine.
I have this in models.py:
class Media(BaseModel):
    user = models.ForeignKey(User, on_delete=models.CASCADE, related_name='users')
    resource_url = S3DirectField(dest='s3', blank=True, null=True)
And the following lines in settings.py:
def content_file_name(filename):
    ext = filename.split('.')[-1]
    name = int(round(time.time() * 1000))
    filename = "%s.%s" % (name, ext)
    return 'uploads/images/' + filename
S3DIRECT_DESTINATIONS = {
    's3': {
        'key': content_file_name,
        'allowed': ['image/jpeg', 'image/png', 'video/mp4'],  # Default allow all mime types
        'bucket': 'k-media-s3-bucket',
        'acl': 'public-read',  # Defaults to 'public-read'
        'cache_control': 'max-age=2592000',  # Default no cache-control
        'content_disposition': 'attachment',  # Default no content disposition
        'content_length_range': (5000, 20000000),  # Default allow any size
    }
}
I want to know if I can access the user attribute in the content_file_name function, because I want to use the username to create the folder.
Best!
While developing locally, I keep running into a pop-up saying "Sorry, failed to upload file."
The console shows the issue is:
XMLHttpRequest cannot load https://s3-us-west-2.amazonaws.com/BUCKETNAME.
Response to preflight request doesn't pass access control check:
No 'Access-Control-Allow-Origin' header is present on the requested resource.
Origin 'http://localhost:8000' is therefore not allowed access.
Any advice on how to proceed? I've updated the CORS policy in S3 to be
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<CORSRule>
<AllowedOrigin>*</AllowedOrigin>
<AllowedMethod>PUT</AllowedMethod>
<AllowedMethod>POST</AllowedMethod>
<AllowedMethod>DELETE</AllowedMethod>
<AllowedHeader>*</AllowedHeader>
</CORSRule>
</CORSConfiguration>
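For reference, the same rules can be expressed for boto3's put_bucket_cors (assuming boto3 is available; a GET method is added too, which read-back checks sometimes need). The builder below only constructs the configuration dict; the commented-out call would apply it:

```python
def cors_config(origins=('*',)):
    """CORS rules equivalent to the XML above, plus GET."""
    return {
        'CORSRules': [{
            'AllowedOrigins': list(origins),
            'AllowedMethods': ['GET', 'PUT', 'POST', 'DELETE'],
            'AllowedHeaders': ['*'],
        }]
    }

# import boto3
# boto3.client('s3').put_bucket_cors(Bucket='BUCKETNAME',
#                                    CORSConfiguration=cors_config())
```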
Hey Bradley,
I've implemented the S3DirectWidget for a ModelForm. I'm struggling with figuring out how to know when the file is uploaded so that I can enable and disable a save button. Is there a specific method for using s3direct with a ModelForm?
Hello there,
Is it possible to provide a simple backend app example of how to process a received file and send it out to S3? I'm using AngularJS for sending files to the server, so the jQuery approach doesn't fit for me.
Thank you in advance.
regards,
jab3z
For example, the most recent one (v 0.4.1 = 5039cab).
This is pretty straightforward to do, but I do not think it can be submitted in a PR, so it could only be pushed by the repo admins.
Hi!
I got this error -> CSRF verification failed. Request aborted.
On this view -> /s3direct/get_upload_params/
Thanks
Hi, I'd like to contribute some changes regarding permissions (currently staff only) and usage/use-case documentation.
I think it's better to simply use the widget with your field of choice. I'd even like to add multiple widgets that have different compression value types, so that they may return a Django File instead of a URL.
If you're cool with it I'd sit down and implement it.
How would you go about adding the progress bar to a form view? The progress bar in the admin is nice, but it would also be nice to see it in the views.
I'm getting a permanent redirect response from amazon when I upload to a bucket in us-west-1
<Error>
  <Code>PermanentRedirect</Code>
  <Message>The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.</Message>
  <Bucket>tetra-memories-development</Bucket>
  <Endpoint>s3.amazonaws.com</Endpoint>
  <RequestId>D4FC960AEE757AF2</RequestId>
  <HostId>+g+KnR0ME3pAcTCYNqeE5nvpAVZwZqItp/BGR9s5RpKhAspm3icawHcU5eKOod2FSzgeSPUBAkE=</HostId>
</Error>
I noticed that the utils.py file concatenates the region to the URL for every region except us-east-1. It looks like Amazon has changed the URLs of other regions too. Would anyone be willing to help me troubleshoot this?
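The special-casing described here can be sketched as follows (the hyphenated 's3-<region>' pattern is the one the issue refers to; AWS has since also introduced dotted 's3.<region>.amazonaws.com' hostnames):

```python
def region_endpoint(region):
    """Build the S3 endpoint hostname: us-east-1 keeps the bare hostname,
    every other region gets a region-specific one."""
    if region == 'us-east-1':
        return 's3.amazonaws.com'
    return 's3-%s.amazonaws.com' % region
```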
I am trying to use it as part of a formset but it doesn't seem to work. Is there a special way to make it work with formset?
Ideally as a cancel link much like the current remove link that appears on completion.
If you try to upload a file that has a name with a space in it, the space is converted to a "+" sign in the Django admin page, but it is left as a space on S3. This causes an error when trying to save the admin entry, because the widget attempts to search for the file by its S3 key when saving and cannot find the S3 key. I've found how to fix this.
In scripts.js, in the update function, replace
url.value = parseURL(xml)
with
var parsedXML = parseURL(xml);
var path = parsedXML.substr(0, parsedXML.lastIndexOf('/') + 1);
url.value = path + name;
and add a parameter called "name" to the update function. Then, at the end of the upload function in the same file, pass file.name as the third argument to the update function call. Huzzah!
Are there plans in the future for an ability to view files that have already been added using s3direct?
Would be great to be able to re-use files that have been uploaded in different places rather than uploading over the top.
Alternatively, is there another package that works with s3direct and/or s3boto that can be used to achieve this functionality?
Thanks.
I am trying to get my head around this package, but I seem to have run into trouble running the cat example. So, I am on Django 1.7; I migrate the models and runserver (usual stuff). Then:
localhost/form: shows me the form and I upload a *.pdf, and nothing happens in the browser. The filename is displayed next to the choose file button and that's it.
Looking at the Network tab in Chrome, I see the following:
http://127.0.0.1:8000/s3direct/get_upload_params/ - POST - 403 - text/html - scripts.js:32
and line 32 shows:
req.send(data)
I suspect this has something to do with the localhost dev I am on?
Also, the POST seems to fail silently... Is that a good thing?
Thanks.
Our Django Admin is restricted to https access, but when the S3Direct widget tries to get information, it uses http, resulting in the following:
Mixed Content: The page at 'https://[redact]/admin/[redact]/add/' was loaded over HTTPS, but requested an insecure XMLHttpRequest endpoint 'http://[redact]/s3direct/get_upload_params/'. This request has been blocked; the content must be served over HTTPS.
Hi there,
I've just implemented s3direct and it works wonderfully, but only when I'm logged in as a superuser.
When I'm logged in as a regular user or staff it won't allow the user to upload, returning an error. If the same user is then turned into a superuser, he/she is able to upload.
Do I have to add or change the bucket policy to allow all users to upload a file? I double-checked and am sure I followed all the steps in the readme.
Thank you!
Does the S3DirectField support deserializing? I am trying to find a way to import already-uploaded images into a database. I have used deserialization to include these files with their URLs to S3, but if I want to change the model field from URL to S3Direct I need to associate the image URLs with the files. Any thoughts? Thanks!
Hi Bradley,
I have a couple of questions regarding the future of this package. You didn't merge my last pull request, for reasons I understand. It's good to see that you merged some of my ideas. (Though mentioning contributions is always nice ;)
I need to know, if you ever plan to include a file widget in the future. I really prefer it that way, as it has some perks:
I will push development of a production ready version of a file widget over the next week. I just want to know if we can find a way to maintain only one package.
Let me know how you feel about this.
Cheers,
joe
No release since July.
Should it work?
I have an admin form with some fields and
image = S3DirectField()
but I don't get any placeholder or rendering, just this text: <s3direct.fields.S3DirectField>
I noticed that two files get uploaded when I pass a function to the destination_key instead of a path string. For example, the following configuration will upload two files:
def create_filename(filename):
    import os
    import uuid
    ext = filename.split('.')[-1]
    filename = '%s.%s' % (uuid.uuid4().hex, ext)
    return os.path.join('images', filename)

S3DIRECT_DESTINATIONS = {
    # Allow anybody to upload jpegs and pngs.
    'imgs': (create_filename, lambda u: True, ['image/jpeg', 'image/png'],),
}
The only way I have figured out how to get only one file to upload is to pass a path string instead of a function. Any help?
Referenced function found by an example @bradleyg posted here: https://github.com/bradleyg/django-s3direct/blob/master/example/example/settings.py#L75
Works fine under Chrome (Mac/Win), Firefox (Mac/Win), and Safari (Mac).
In IE I get the following error:
SCRIPT5007: Unable to get property 'error' of undefined or null reference
File: scripts.js, Line: 136, Column: 21
Digging deeper, I found that in IE the POST /s3direct/get_upload_params/ fails CSRF.
I’m using the field in Django admin.
So, I seem to have got the package working (btw: beautiful!), but I am unsure about the following issues I am facing at the moment:
[1] New name for each uploaded file - In my settings I have S3DIRECT_UNIQUE_RENAME = True, yet when I upload files it seems to overwrite them in my bucket rather than rename them (or is this not the intention of this setting?). I am not entirely sure what changes when S3DIRECT_UNIQUE_RENAME changes.
[2] When I run the example, I do not see anything written to my DB (Django side) - Does this workflow need to be implemented for the example? If yes, where should I be looking? (Sorry, I am still figuring out the workflow of this package - everything seems to be done on the JS side.) Normally, I'd have a form_valid method which returns an HttpResponseRedirect to a get_success_url.
Thanks in advance & please feel free to close the issue if these are non-issues.
Hi,
The code currently marks an empty path in S3DIRECT_DESTINATIONS as an error condition, and also does not handle the case of specifying '/' very well. I think that both of these are valid cases where the user is specifying that the file be uploaded to the root folder of the bucket. The only way to do this seems to be to provide a function which just returns '$(unknown)'.
I suggest changing views.py to something like:
if hasattr(key, '__call__'):
    key = key(filename)
elif key and key != '/':
    key = '%s/$(unknown)' % key
else:
    # Key is empty or '/' - assume the request is to put the file in the root of the bucket
    key = '$(unknown)'
I'm running into an issue uploading files with a # sign. The upload to S3 works (and thus doesn't show an error in the CMS); however, the S3 reference that gets saved to my model doesn't work because it saves the unencoded version to the database.
For example, s3direct takes a key like "directory/#35 product" and returns "directory/#35+product", when it should return "directory/%2335+product".
What's the best practice for having the field fall back to a plain FileField for local development?
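One common pattern (a sketch only, not something the library provides): branch on a settings flag in models.py so local development uses Django's plain FileField. The flag name USE_S3 is my own; the helper below just illustrates the decision.

```python
# In a real models.py the branch would instantiate the chosen class, e.g.:
#     if getattr(settings, 'USE_S3', not settings.DEBUG):
#         image = S3DirectField(dest='imgs')
#     else:
#         image = models.FileField(upload_to='uploads/imgs')
def upload_field_path(debug):
    """Return the dotted path of the field class to use in each environment."""
    return 'django.db.models.FileField' if debug else 's3direct.fields.S3DirectField'
```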
Hi Bradley,
I encountered a 30 second timeout issue on Heroku when uploading large images on my Django site and stumbled upon your plugin which allowed direct uploads to S3 in the background.
I am trying to adapt it to my front end but am encountering some issues:
POST http://rigs-staging.herokuapp.com/listings/add/ 403 (FORBIDDEN) jquery-1.10.1.js:8724
GET http://rigs-staging.herokuapp.com/accounts/signin/?next=/s3direct/get_upload_params/s3direct/photos 404 (NOT FOUND)
The only modifications I have made to your plugin have been in widgets.py (so that it can work within the constraints of my interface):
HTML = """
<div class="s3direct" data-url="{policy_url}">
<a class="link" target="_blank" href="{file_url}" style="display: none;">{file_name}</a>
<span class="btn btn-file">
<span class="fileupload-new">Select image</span>
<span class="fileupload-exists">Change</span>
<input type="hidden" value="{file_url}" id="{element_id}" name="{name}" />
<input type="file" class="fileinput" />
</span>
<a href="#remove" class="btn fileupload-exists" data-dismiss="fileupload">Remove</a>
<div class="progress progress-striped active" style="margin-top:10px">
<div class="bar"></div>
</div>
</div>
"""
The affected page can be found here: http://rigs-staging.herokuapp.com/
I would really appreciate your help on this issue as I cannot find any feasible alternative to deal with Heroku time out. Thank you for your time and consideration in advance.
Would you mind adding the MIME type to the Invalid file type message? It would be very useful to know at that point.
In views.py, include content_type in the message:
data = json.dumps({'error': 'Invalid file type (%s).' % content_type})
Is it possible to save/extract the original filename when S3DIRECT_UNIQUE_RENAME = True? I'd like to view the original filename as a readonly field in the Django admin.
Is this library not meant to be used outside of the admin? I'm confused by the readme.
Is it possible to validate the file to be uploaded in a manner similar to this snippet for FileField: https://djangosnippets.org/snippets/3039/?
Uploading files inside a (stacked or tabular) inline form on the django admin interface doesn't work correctly: no file is actually uploaded. After selecting a file to upload and hitting save, an error is shown telling you that the file field is required. When this error is displayed and you retry the upload, uploading a file works correctly.
Checking with the Chrome devtools shows that nothing is actually getting uploaded when you select a file to upload, so this appears to be an issue with the form.
Tested with Django 1.6 with the default admin interface, django-admin-tools, and django-suit. Verified that file uploads in regular (non-stacked) forms work correctly.
Configuration in which it doesn't work (from the django tutorial polls app): https://gist.github.com/jorisvddonk/9fad25e43f30856e29eb
expires_in = datetime.utcnow() + timedelta(seconds=60*5)
Value of "5 minutes" fails when uploading large files.
Maybe it would be a good idea to make this value configurable via settings?
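A sketch of the suggested change (the setting name S3DIRECT_EXPIRY_SECONDS is hypothetical, and a plain dict stands in for Django settings here):

```python
from datetime import datetime, timedelta

DEFAULT_EXPIRY_SECONDS = 60 * 5  # the current hard-coded five minutes

def policy_expiry(settings_dict, now=None):
    """Compute the upload policy's expiry time, honouring an optional
    override so large/slow uploads can be given a longer window."""
    seconds = settings_dict.get('S3DIRECT_EXPIRY_SECONDS', DEFAULT_EXPIRY_SECONDS)
    return (now or datetime.utcnow()) + timedelta(seconds=seconds)
```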
Hey there, thanks for making this!
I am a little confused about usage: is there anything special you have to do to use this field as a regular file field? I replaced a FileField with this S3 direct field, but it gives back a URL when I do field.file, and that fails when I try to do normal file-type interactions. Do I have to download the file manually to interact with it?
Hi.
First of all, thank you for your work :) It seems to be a useful tool. I did everything as in the tutorial, but I'm getting this error:
XHR finished loading: POST "http://95.215.45.160/s3direct/get_upload_params/". jquery-1.10.2.min.js:6
OPTIONS https://soap-couch/ net::ERR_NAME_NOT_RESOLVED
Django 1.6. Perhaps it's because jQuery is loaded twice, by Django and by s3direct?
I believe I've found a potential security exploit with this library. Please correct me if I'm wrong.
This problem affects you under the following conditions...
Explanation:
The upload location is sent to the client in an HTML attribute, "file-upload-to", with a value. The value of that attribute is then sent back to the server and the S3 upload process begins. The server trusts the client to provide a valid upload location here; however, it is possible that a malicious user could change this value.
I just did this on my test server, using Chrome's developer tools to right-click -> edit the "file-upload-to" value. I was able to change the upload location. I had S3DIRECT_UNIQUE_RENAME = False, so I was able to set the file name from the client as well. Ultimately I was able to override files in my bucket, such as static/mysite/js/script.js, with malicious code.
It seems like you would want to avoid sending the upload location to the client at all? Or at least keep a list of valid upload locations on the server to cross-check, so that server-only folders can't have files leaked into them?
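A sketch of the cross-check suggested above: send only a destination *key* to the client, and resolve it server-side against the S3DIRECT_DESTINATIONS mapping so a client can never choose an arbitrary upload path (a plain dict stands in for the Django setting; names are illustrative):

```python
def resolve_destination(dest_key, destinations):
    """Reject any destination key that is not declared server-side."""
    if dest_key not in destinations:
        raise ValueError('unknown upload destination: %r' % dest_key)
    return destinations[dest_key]
```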
Error:
InvalidRequest
The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.
Look at those links:
http://stackoverflow.com/questions/28514981/uploading-a-file-to-s3-frankfurt-gets-an-error-authorization-aws4-hmac-sha256
http://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-post-example.html
http://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-authentication-HTTPPOST.html
eu-central-1 works only with AWS Signature Version 4
I've made a quick fix for that:
now_date = datetime.utcnow().strftime('%Y%m%dT%H%M%S000Z')
raw_date = datetime.utcnow().strftime('%Y%m%d')
policy_dict = {
    "expiration": expires,
    "conditions": [
        {"bucket": bucket},
        ["starts-with", "$key", ''],
        {"success_action_status": '201'},
        {"acl": acl},
        {"x-amz-credential": '%s/%s/%s/s3/aws4_request' % (access_key, raw_date, region)},
        {"x-amz-algorithm": "AWS4-HMAC-SHA256"},
        {"x-amz-date": now_date},
        {"content-type": content_type},
    ]
}
if cache_control:
    policy_dict['conditions'].append({'Cache-Control': cache_control})
if content_disposition:
    policy_dict['conditions'].append({'Content-Disposition': content_disposition})
policy_object = json.dumps(policy_dict)
policy = b64encode(policy_object.replace('\n', '').replace('\r', '').encode())
date_key = hmac.new('AWS4' + secret_access_key, msg=raw_date, digestmod=hashlib.sha256).digest()
date_region_key = hmac.new(date_key, msg=region, digestmod=hashlib.sha256).digest()
date_region_service_key = hmac.new(date_region_key, msg='s3', digestmod=hashlib.sha256).digest()
signing_key = hmac.new(date_region_service_key, msg='aws4_request', digestmod=hashlib.sha256).digest()
signature = hmac.new(signing_key, msg=policy, digestmod=hashlib.sha256).hexdigest()
structure = getattr(settings, 'S3DIRECT_URL_STRUCTURE', 'https://{1}.{0}')
bucket_url = structure.format(endpoint, bucket)
return_dict = {
    "policy": policy,
    "success_action_status": 201,
    "x-amz-credential": "%s/%s/%s/s3/aws4_request" % (access_key, raw_date, region),
    "x-amz-date": now_date,
    "x-amz-signature": signature,
    "x-amz-algorithm": "AWS4-HMAC-SHA256",
    "key": key,
    "form_action": bucket_url,
    "acl": acl,
    "content-type": content_type
}
Note that every region can use the v4 signature.
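For Python 3, the chained HMAC derivation in the fix above can be written as a small helper (a sketch; the function name is mine, and everything must be bytes under Python 3):

```python
import hashlib
import hmac

def sigv4_signing_key(secret_key, date_yyyymmdd, region, service='s3'):
    """Derive the AWS Signature Version 4 signing key - the same chain of
    hmac.new calls as above, made bytes-safe for Python 3."""
    def sign(key, msg):
        return hmac.new(key, msg.encode(), hashlib.sha256).digest()
    k_date = sign(('AWS4' + secret_key).encode(), date_yyyymmdd)
    k_region = sign(k_date, region)
    k_service = sign(k_region, service)
    return sign(k_service, 'aws4_request')  # 32-byte signing key
```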
I have implemented your code in a local virtual environment and everything works fine. But when I upload the code on AWS Elastic Beanstalk it doesn't work anymore.
In particular I see the Remove link before the Browse button, and once I select a file it doesn't launch the upload.
For some reason, when I add the S3DirectField to my model, none of the fields show up in the admin interface when I go to create or edit that model. What am I doing wrong? I didn't do anything besides adding the field to the model.