
Comments (34)

yadayada commented on August 19, 2024

It seems the files get stored anyway; what happens is an internal timeout while Amazon processes the uploaded file, so the node metadata that would normally be sent back after the upload finishes never arrives.

I've uploaded a 17.4GB file, got a 504 error, then synced, compared the local and remote hash and they matched.
So there appears to be no real limitation on the upload file size.
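If you want to verify an upload the same way, here is a minimal shell sketch. It assumes the metadata subcommand prints the node JSON with the MD5 under contentProperties.md5, and that jq is installed; the file name is just an example:

LOCAL_MD5=$(md5sum /root/12.9GB.file | cut -d' ' -f1)
REMOTE_MD5=$(./acd_cli.py metadata /12.9GB.file | jq -r '.contentProperties.md5')
# The two sums should match if the file was stored intact.
[ "$LOCAL_MD5" = "$REMOTE_MD5" ] && echo "hashes match" || echo "hash mismatch"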

Have you tried immediately uploading a large file twice? E.g.
./acd_cli.py upload /root/12.9GB.file /; ./acd_cli.py upload /root/12.9GB.file /
This should (hopefully) skip on the second upload attempt.

smehernosh commented on August 19, 2024

I get this error while uploading large files:

 Uploading "file" failed. Code: 408, msg: {"message": "[acd_cli] no body received."}

I tried downloading from the web UI but couldn't download the file; downloading with acd_cli gave me:

Downloading [node id] failed.
Downloading "file" failed. Code: 500, msg: {"message":"Internal failure"}

Hash mismatch between local and remote file for "file"    

I listed the file's metadata and got the templink; when I tried opening the templink, I got the same error:

{"message":"Internal failure"}

yadayada commented on August 19, 2024

I believe Amazon is currently working on these issues.

yadayada commented on August 19, 2024

Update: There seems to be a way to access large files, because NetDrive can do it.

smehernosh commented on August 19, 2024

Nope, I'm facing problems with NetDrive as well when downloading large files.

http://support.netdrive.net/forums/251857-bug-or-error/suggestions/7599042-2-5-beta-7-i-o-device-error-on-almost-file-copy-co

I guess we'll have to wait for Amazon to fix this.

yadayada commented on August 19, 2024

I was able to stream a >10GiB file using NetDrive, so it should theoretically be possible to download them as well.

smehernosh commented on August 19, 2024

I think the limit is 14 GB, as mentioned here:

https://forums.developer.amazon.com/forums/thread.jspa?threadID=5034&tstart=0

Issam2204 commented on August 19, 2024

Same for me:

issam@localhost:~/Files$ du -ch output.dat
15G     output.dat
15G     total
issam@localhost:~/Files$ acdu upload output.dat /
Current file: output.dat
[#################################] 100.0% of 14.6GiB, 19.5MB/s
15-05-05 11:35:42.632 [acd_cli] [WARNING] - Timeout while uploading "%s".
issam@localhost:~/Files$ acdu upload output.dat /
Current file: output.dat
[ ] 0.0% of 14.6GiB, 169.2KB/s
15-05-05 11:41:32.862 [acd_cli] [ERROR] - Uploading "output.dat" failed. Name collision with non-cached file. If you want to overwrite, please sync and try again.
issam@localhost:~/Files$

As you can see, I tried to re-upload the file and it indeed gives me an error (which is good). If I look at the webapp, the file is there.

chrisidefix commented on August 19, 2024

To add my experience: so far the largest file I managed to upload was 40.8 GB.
I tried 100 GB and 142 GB files; both failed with timeout messages. For other files with this error, the file was usually uploaded anyway, but for these admittedly very large files it didn't work, and I am not too keen on trying this over and over.

I guess I could split them, which should also make the download easier again.

yadayada commented on August 19, 2024

It was briefly possible to download files >10GiB using ranged requests. I did implement this and it worked. But apparently, Amazon deliberately put an end to this. NetDrive also cannot stream such files anymore.
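For context, a ranged request is just an HTTP GET with a Range header against the file's templink; a minimal sketch of fetching one chunk (the templink variable and the chunk size are illustrative):

# Fetch the first 100 MiB of the file; later chunks shift the byte window.
curl -H "Range: bytes=0-104857599" -o chunk.000 "$TEMPLINK"

Whether the server still honors the Range header on such nodes is exactly what appears to have been switched off.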

I think it is time to look for another web storage.

procmail commented on August 19, 2024

So this isn't a server-side bug, but perhaps a policy restriction?

chrisidefix commented on August 19, 2024

I contacted them through phone support and was directed to the "technical team", as the first-line support person did not know of any such restrictions. I am waiting to hear back from them, but so far there is no "file size limitation" policy to be found on the website. The funny part is that I now cannot download files I was previously able to upload. Hopefully this is a temporary problem, as the error message suggests: {"message":"Currently unsupported file size."}

Issam2204 commented on August 19, 2024

Thanks chrisidefix!

Please let us know ;)

Edit: Please don't spam the issue tracker.

chrisidefix commented on August 19, 2024

Official Amazon Customer Service response:

We've worked with developers from the Cloud Drive Web team and the Cloud Drive Service team to figure out what's going on. Presently there is an issue with downloading large file sizes. We are actively pursuing a resolution to this issue. In the meantime, you should try to keep your file sizes less than 10GB. Files that you have uploaded that are larger than the current limitation will continue to be kept safely in storage, but are not able to be downloaded until we are able to resolve the underlying issue.

As soon as we have an idea on a time frame for this problem to be fixed we'll be sure to let you know.

hazcod commented on August 19, 2024

Any updates on this matter?

One idea is to use a chunkfs on top of ACD: http://chunkfs.florz.de/

yadayada commented on August 19, 2024

The current limit is somewhat above 9GiB, I believe. It is unknown whether this limit is final or not; hopefully, we'll know more next month.

https://forums.developer.amazon.com/forums/thread.jspa?threadID=5240&tstart=0

Currently, we are actively working on improving our upload/download limits, and we should have a fix in place by early next month. Regardless, we are currently verifying what our new limits will be, so please hang tight.

Regards,

Jamie

karloluiten commented on August 19, 2024

As a workaround, I split all my uploads into 9 GB parts; that works well.
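In case it helps anyone, a minimal sketch of that workaround (file names are illustrative and GNU split is assumed):

# Split into 9 GB numbered parts and upload each one.
split -b 9G -d large.bin large.bin.part-
for part in large.bin.part-*; do
    ./acd_cli.py upload "$part" /
done
# After downloading the parts again, reassemble with:
# cat large.bin.part-* > large.bin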

procmail commented on August 19, 2024

I have 33 GB, 21 GB and 19 GB zip files that were uploaded by the official client.


hazcod commented on August 19, 2024

The official client is not limited by the API file size limit.

yadayada commented on August 19, 2024

@procmail
It's possible (and pretty likely) you won't be able to download your large files later. Even if the upload file size limit is higher, it is not advisable to upload large files (as stated by Amazon's customer service) until there is an official statement about size limitations.

@hazcod
The last time I checked (in May), it was not possible to download files larger than 10GiB using the official client; but at that time it was possible to access them using NetDrive or download them using acd_cli. The necessary chunked download feature was consequently removed in early May, reinstated a week later and then removed again in June.

So, based on my prior test, I assume the official client is subject to the same limitations. Did you verify that it isn't?

Update: Tested today with the official client, version 2.4.2 (cd447cab), and a 9500 MiB file:
[screenshot: acd_dl_fail]

hazcod commented on August 19, 2024

@yadayada I was always told this, but I'm afraid I have not tested it myself yet.

procmail commented on August 19, 2024

It wasn't possible for me to upload big files the last time I tested the official client, which was about a month ago, I think.

I will test downloading a big file when I have time, and report back.


Issam2204 commented on August 19, 2024

I've tested uploading a 1 TB file, and the upload went smoothly until 100%. It took almost 24 hours, and I was fairly impressed to see no issues during the upload. However, as expected, upon reaching 100% it failed with: Code: 400, msg: {"message":"Cannot complete upload"}.


Regarding downloading: everything near or above 10 GB is not downloadable.

smehernosh commented on August 19, 2024

They've increased the file size limit. I was able to download a 32 GB archive and extract it successfully.

Guess you can close this now.

yadayada commented on August 19, 2024

I've noticed the change, but there still is no official comment on the matter. I was also able to download a large file using the web interface. Surely there must still be some size limit, though; it would be nice if someone could ascertain the new one.

chrisidefix commented on August 19, 2024

I did receive a customer support mail 5 days ago, but they didn't specify any particular limit either; only that I should now be able to download large files. Guess it's time for a large-file test run ⭕

Issam2204 commented on August 19, 2024

Yesterday I tried to upload 15 GB, 50 GB and 100 GB files. All failed with an internal failure (error code 500). I tried this from two different dedicated servers located in different datacenters. The failures happen at the end of the upload.

Has anyone been able to upload large files?

smehernosh commented on August 19, 2024

I uploaded a 32 GB archive and did get the same error, but I was able to download the file and extract it successfully.

chrisidefix commented on August 19, 2024

Yes, I was able to upload archives of 42 GB. I also see the same error, but the files still show up in my Cloud Drive. However, when I tried much larger files, it didn't work. I might try again, though, in case Amazon has improved anything on their end.

ShapeShifter499 commented on August 19, 2024

Hmm, weird issue. So is it advisable to chunk the files before uploading, maybe with chunkfs as suggested above? I'm about to do a mass upload and I'm pretty sure I have some files larger than 10 GB.

chrisidefix commented on August 19, 2024

@Issam2204 I remember you telling me to use "split, gnupg and acd_cli" and "call them with a script"; maybe that will solve your problems 👎

@ShapeShifter499 I suggest we take another look at issue #32, where I proposed splitting very large files for upload ...
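For reference, a rough sketch of that split + gnupg + acd_cli pipeline (the part size, paths and passphrase handling are illustrative, not a tested recipe):

# Split, symmetrically encrypt each part, then upload the .gpg files.
split -b 9G -d big.archive big.archive.part-
for part in big.archive.part-*; do
    gpg --batch --symmetric --passphrase-file ~/.acd_passphrase "$part"
    ./acd_cli.py upload "$part.gpg" /
done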

yadayada commented on August 19, 2024

Soooo... the actual limit is 50G(i)B.
https://forums.developer.amazon.com/forums/thread.jspa?threadID=4610

Sunako commented on August 19, 2024

Well, at least the limit is sorta defined now. 50G seems reasonable enough IMO.

yadayada commented on August 19, 2024

Since the limit is now known and the timeout errors are documented in the README.rst, I will close this issue.
