
gsutil's Introduction

gsutil

gsutil is a Python application that lets you access Google Cloud Storage from the command line. You can use gsutil to do a wide range of bucket and object management tasks, including:

  • Creating and deleting buckets.
  • Uploading, downloading, and deleting objects.
  • Listing buckets and objects.
  • Moving, copying, and renaming objects.
  • Editing object and bucket ACLs.

Installation

For installation instructions, please see:

https://cloud.google.com/storage/docs/gsutil_install

Testing / Development

The gsutil source code is available at https://github.com/GoogleCloudPlatform/gsutil

See https://cloud.google.com/storage/docs/gsutil/addlhelp/ContributingCodetogsutil for best practices on how to contribute code changes to the gsutil repository.

Help and Support

Run the "gsutil help" command for a list of the built-in gsutil help topics.

You can also browse the help pages online at:

https://cloud.google.com/storage/docs/gsutil

For community support, visit:

https://cloud.google.com/storage/docs/resources-support#community

gsutil's People

Contributors

brandonsalmon22, carlosmerec, catleeball, cbonnie, chestercun, craigcitro, dilipped, ejeselsohn, fishjord, houglum, jterrace, karenarialin, kosherbacon, marcgel, mco-gh, mengyazhu96, mfschwartz, michelrozel, minkezhang, mkalmanson, nickgoog, ptai7, reinhillmann, rrauber, scruffyprodigy, starsandskies, tedromer, thobrla, thomasmaclean, zwilt


gsutil's Issues

Options to choose between Subdomain and Ordinary calling format. -h option bug with Content-Length header

Original author: [email protected] (September 19, 2010 17:40:49)

What steps will reproduce the problem?

Problem 1: Subdomain & ordinary storage calling:

  1. Run against a storage service without DNS subdomain resolution
  2. Try to create a new bucket

Problem 2: -h option

  1. Try running gsutil -h "Content-Length: 1000" cp file gs://bucket/file

gsutil will substitute the supplied 'Content-Length' with the size of 'file', which can actually be zero.

The -h override is needed when 'file' is really a stream whose size we already know. In that case gsutil can't determine the size correctly on its own, and we should be able to specify it by hand.

What version of the product are you using? On what operating system?
gsutil version 2010-09-14 18:56:45-07:00, config file version 2010-09-11 12:28:22-07:00

Please provide any additional information below.
We currently use the attached patch on the current gsutil version; I think it would be helpful upstream. The bug with the -h option is already fixed in this patch.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=37

cross-bucket data copying / moving doesn't preserve metadata

Original author: [email protected] (September 25, 2010 00:31:12)

If you do a command like this:

gsutil cp gs://mybucket/obj1 s3://mys3bucket/obj2

the bytes of the object get copied through an intermediate temp file, but any metadata set on the object (Content-Type, Content-Encoding, user-specified x-goog-meta tags, etc.) does not get copied over. This happens because the gsutil cross-provider copy code only copies the bytes. To fix this, that code would need to first get the metadata from the old object, then download to the temp file, then upload that temp file to an object with the given metadata set.

This problem also applies to the gsutil 'mv' command (since it is really just a copy followed by a delete).
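
A minimal sketch of that flow using boto's storage_uri interface (URIs and the exact set of headers to forward are illustrative; this is not the actual gsutil code):

import tempfile
import boto

def copy_with_metadata(src_uri_str, dst_uri_str):
    # Read the source object's metadata before moving any bytes.
    src_key = boto.storage_uri(src_uri_str).get_key()
    headers = {'Content-Type': src_key.content_type}   # also Content-Encoding, etc.
    user_meta = dict(src_key.metadata)                  # user-specified x-*-meta tags

    # Stream the bytes through a temp file, then re-apply the metadata
    # when uploading to the destination provider.
    with tempfile.NamedTemporaryFile() as tmp:
        src_key.get_contents_to_file(tmp)
        tmp.flush()
        tmp.seek(0)
        dst_key = boto.storage_uri(dst_uri_str).new_key()
        for name, value in user_meta.items():
            dst_key.set_metadata(name, value)
        dst_key.set_contents_from_file(tmp, headers=headers)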

Original issue: http://code.google.com/p/gsutil/issues/detail?id=38

Expires Header

Original author: [email protected] (July 10, 2010 08:15:50)

I can't seem to set the Expires header.

~$ gsutil -h "Expires: Fri, 06 Jul 2012 00:00:00 GMT" cp styles.css gs://mybucket
~$ gsutil setacl public-read gs://mybucket/styles.css
~$ curl --head http://commondatastorage.googleapis.com/mybucket/styles.css
HTTP/1.1 200 OK
Expires: Sat, 10 Jul 2010 09:12:24 GMT
Date: Sat, 10 Jul 2010 08:12:24 GMT
Cache-Control: public, max-age=3600
ETag: "9523e4644a3b43461cef552988799c44"
Content-Type: text/css
Last-Modified: Sat, 10 Jul 2010 08:11:59 GMT
Content-Length: 8641
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
Server: GSE

Original issue: http://code.google.com/p/gsutil/issues/detail?id=21

allow gsutil cp/mv to specify destination URIs with subdirectory-like paths

Original author: [email protected] (October 15, 2010 16:03:06)

This enhancement was suggested by a question posted on [email protected], included below. This would be useful, for example allowing users to rename "subdirectories" on the cloud server.


from Kevin Postal <[email protected]>
reply-to [email protected]
to gsutil-discuss <[email protected]>
date Thu, Oct 14, 2010 at 5:18 PM
subject Uploading files and directories to folders
mailing list <gsutil-discuss.googlegroups.com>

I was wondering if there was a way to upload full folders to folder buckets?

For instance I know I can do:

gsutil cp file/file.txt gs://my_bucket/file/file.txt
gsutil cp file/file.txt gs://my_bucket/file/

Why can't I do this:

gsutil cp file/* gs://my_bucket/file/*

or

gsutil cp file/ gs://my_bucket/file/

Original issue: http://code.google.com/p/gsutil/issues/detail?id=45

make gsutil cp command work like UNIX cp command

Original author: [email protected] (September 07, 2010 19:29:34)

Copying a file using "gsutil cp filedir/filename gs://bucketname" creates the entire path in the bucket.

A number of customers have asked to make this command behave like UNIX cp, namely:

gsutil cp dir1/dir2/dir3/filename.txt gs://bucketName
--> copy just the file to the bucket

gsutil cp -r dir1/dir2 gs://bucketName
--> copy dir2 and anything underneath to bucketName

Original issue: http://code.google.com/p/gsutil/issues/detail?id=32

gsutil does not allow having .boto config file anyplace other than $HOME/.boto

Original author: [email protected] (October 14, 2010 22:09:09)

What steps will reproduce the problem?
$ unset HOME
$ ls -l .boto # it exists
$ gsutil ls
You have no boto config file. This script will create one at
./.boto
containing your credentials, based on your responses to the following questions.

What is the expected output? What do you see instead?
It should find the ./.boto file that is there. The above output occurs because gsutil looks for HOME in os.environ only.
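
A hedged sketch of a more forgiving lookup (the fallback order here is an assumption for illustration, not gsutil's actual behavior; BOTO_CONFIG is boto's standard override variable):

import os

def find_boto_config():
    # Check an explicit override, then the home directory (expanduser
    # falls back to the pwd database on Unix when HOME is unset), then
    # finally the current working directory.
    candidates = [
        os.environ.get('BOTO_CONFIG'),
        os.path.join(os.path.expanduser('~'), '.boto'),
        os.path.join(os.getcwd(), '.boto'),
    ]
    for path in candidates:
        if path and os.path.isfile(path):
            return path
    return None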

What version of the product are you using? On what operating system?

2010-09-20 13:11:11-07:00

linux (ubuntu)

Please provide any additional information below.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=44

'ls -l' doesn't work with unicode object name

Original author: [email protected] (December 15, 2010 08:08:05)

What steps will reproduce the problem?

  1. Upload an object with a unicode name, like "你.pdf", to somebucket
  2. List the bucket: gsutil ls -l 'gs://somebucket/*'

What is the expected output? What do you see instead?
There should be something like:
'224 2010-12-14T18:11:06 gs://somebucket/你.pdf'.
Instead there is an error:
'Failure: 'ascii' codec can't decode byte 0xef in position 44: ordinal not in range(128).'

What version of the product are you using? On what operating system?
I'm using latest svn gsutil with Python 2.6.6 on Ubuntu 10.10

Please provide any additional information below.
I found that in gslib/command.py the timestamp is a unicode string while UriStrFor(iterated_uri, obj) returns an ASCII byte string, so I propose a patch that converts the timestamp to ASCII. Please check it out, thanks.
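
For context, a minimal Python 2 reproduction of the underlying problem and the kind of normalization the patch proposes (variable names are illustrative, not the actual gslib code):

# Python 2: mixing a unicode string with a UTF-8 byte string forces an
# implicit ASCII decode, which is what produces the reported failure.
timestamp = u'2010-12-14T18:11:06'               # unicode
uri_str = 'gs://somebucket/\xe4\xbd\xa0.pdf'     # UTF-8-encoded byte string

# '%s %s' % (timestamp, uri_str)                 # raises UnicodeDecodeError

# The patch's approach: make both operands plain byte strings.
line = '%s %s' % (timestamp.encode('ascii'), uri_str)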

Original issue: http://code.google.com/p/gsutil/issues/detail?id=50

Any Plans to Undo the Fork?

Original author: [email protected] (May 24, 2010 02:08:10)

As a committer to boto, I am pretty excited to see that you found it useful enough to leverage in building your own utility for Google Storage.

I am curious, however, whether there are plans to merge the forked code back. I am sure it would be a great addition to the project (I haven't done a diff yet to see what the actual changes were).

Thanks!

Patrick

Original issue: http://code.google.com/p/gsutil/issues/detail?id=2

'gsutil cat' fails with no useful error message

Original author: [email protected] (June 21, 2010 16:00:17)

What steps will reproduce the problem?

  1. $ gsutil cat gs://pub/rose.txt

What is the expected output?
Part of a soliloquy from "Romeo and Juliet"

What do you see instead?
/Users/james/dev/gsutil/gsutil:1154: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
if e.message.find('aws_secret_access_key') != -1:
/Users/james/dev/gsutil/gsutil:1158: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
OutputAndExit(e.message)
'NoneType' object has no attribute 'get_contents_as_string'

What version of the product are you using?
gsutil VERSION file contents:
2010-06-18 17:01:04-07:00

Python 2.6.1 (r261:67515, Feb 11 2010, 00:51:29)
[GCC 4.2.1 (Apple Inc. build 5646)] on darwin

On what operating system?
Mac OS X 10.6.4

Please provide any additional information below.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=11

better reporting of xml parsing errors

Original author: [email protected] (June 18, 2010 18:04:52)

What steps will reproduce the problem?
  1. Run gsutil -d setacl acl.xml gs://chanezon/scan0609.jpg with an acl.xml file that is malformed. This generates a cryptic message saying: Failure: <unknown>:10:16: not well-formed (invalid token).

What is the expected output? What do you see instead?
The error should specify that the input file was not well formed (otherwise, how do I know it wasn't the response?). Maybe also provide the stack trace from the XML parser, to help spot where the issue is.
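
A hedged sketch of how the local file could be validated up front so the message names the offending file (the helper name is illustrative, not gsutil's actual code):

import sys
from xml.dom.minidom import parse
from xml.parsers.expat import ExpatError

def load_acl_file(path):
    # Parse the local ACL file before sending it, so a malformed file
    # produces a message that names the file rather than a bare
    # "<unknown>: not well-formed" parser error.
    try:
        return parse(path).toxml()
    except ExpatError as e:
        sys.exit('ACL file %s is not well-formed XML: %s' % (path, e))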

What version of the product are you using? On what operating system?
Heh, gsutil has no versioning yet :-)
Implement my other -v issue and I will tell you.
On Mac OS X.

Please provide any additional information below.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=9

Can't see usage information without a developer key

Original author: [email protected] (May 21, 2010 01:27:15)

All I want to do is check flags for using gsutil.

spindle:gsutil polleyj$ ./gsutil help
You have no boto config file. This script will create one at
/Users/polleyj/.boto
containing your credentials, based on your responses to the following questions.

spindle:gsutil polleyj$ ./gsutil
You have no boto config file. This script will create one at
/Users/polleyj/.boto
containing your credentials, based on your responses to the following questions.

spindle6:gsutil polleyj$ ./gsutil -h
You have no boto config file. This script will create one at
/Users/polleyj/.boto
containing your credentials, based on your responses to the following questions.

I should not have to provide any credentials just to get information about the flags the script accepts.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=1

Unexpected command line args result in an error and stack dump

Original author: [email protected] (May 25, 2010 21:08:56)

If a user enters an unrecognized flag, the result is an uncaught exception and a stack trace. I tried gsutil --help and the following output was dumped to the screen:

gsutil --help
Traceback (most recent call last):
File "/home/mmeade/gsutil/gsutil", line 973, in <module>
main()
File "/home/mmeade/gsutil/gsutil", line 919, in main
command = args[0]
IndexError: list index out of range

Unrecognized command line arguments should be caught, and the help text should be displayed to tell users which arguments are valid.
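
A minimal sketch of the kind of guard the entry point could use (the usage text and structure are illustrative, not the actual gsutil main()):

import sys

USAGE = 'Usage: gsutil [command] [args...]  (run "gsutil help" for details)'

def main():
    args = sys.argv[1:]
    if not args:
        # Bare "gsutil" (or args consumed entirely by option parsing):
        # print usage instead of hitting IndexError on args[0].
        sys.exit(USAGE)
    command = args[0]
    if command not in ('cp', 'ls', 'mv', 'rm', 'help'):   # illustrative subset
        # Unknown commands (including things like "--help") get the same
        # usage text rather than a stack trace.
        sys.exit(USAGE)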

Original issue: http://code.google.com/p/gsutil/issues/detail?id=3

some gsutil tests flaky

Original author: [email protected] (September 30, 2010 02:18:10)

Some tests fail intermittently (false positive failures) because they don't correctly check for object creation: they verify via bucket listings, but bucket listing results only have eventual consistency guarantees, so a listing performed immediately after creating an object is not guaranteed to include that object. Instead, the affected tests should use HEAD operations on the expected objects to test for their presence.
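
A hedged sketch of such a check with boto, where bucket is a boto Bucket object obtained from the test's connection (the retry parameters and helper name are illustrative):

import time

def wait_for_object(bucket, object_name, attempts=5, delay=1.0):
    # bucket.get_key() issues a HEAD request for the object itself,
    # rather than scanning an eventually consistent bucket listing.
    for _ in range(attempts):
        if bucket.get_key(object_name) is not None:
            return True
        time.sleep(delay)
    return False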

Original issue: http://code.google.com/p/gsutil/issues/detail?id=39

gsutil ls -l is slow when listing bucket contents

Original author: [email protected] (August 11, 2010 22:40:36)

% gsutil ls -l gs://somebucket/*

is slow because it retrieves object metadata for each object being listed. In many cases users don't need to see the ACL, and all info other than the ACL can be retrieved from the bucket metadata without touching the objects.

Solution: make gsutil ls -l list everything other than the ACL, and add a new -L option to list that plus the ACL info.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=26

copying across providers using -t option doesn't work right

Original author: [email protected] (June 25, 2010 17:51:19)

What steps will reproduce the problem?

  1. touch abc.txt
  2. gsutil cp abc.txt gs://bucket/abc.txt
  3. gsutil -d cp -t gs://bucket/abc.txt s3://bucket/abc.txt

What is the expected output? What do you see instead?

Should see a Content-Type header of text/plain on the HTTP PUT.
Instead, the Content-Type header is application/octet-stream.

Note that this only happens for cross-provider copies.
The correct behavior is observed for these two cases:

gsutil -d cp -t abc.txt gs://bucket/abc.txt
gsutil -d cp -t gs://bucket/abc.txt gs://bucket/abc.txt

Original issue: http://code.google.com/p/gsutil/issues/detail?id=14

cp and mv still not UNIX-like in some ways

Original author: [email protected] (September 11, 2010 08:03:10)

vomjom@monad:~$ gsutil ls -l gs://vomjom/bar/baz
0 Sat, 11 Sep 2010 08 gs://vomjom/bar/baz
TOTAL: 1 objects, 0 bytes (0.0 B)
vomjom@monad:~$ gsutil mv gs://vomjom/bar/baz gs://vomjom/
CommandException: Overlapping source and dest URIs not allowed.
vomjom@monad:~$ gsutil cp gs://vomjom/bar/baz gs://vomjom/
CommandException: Overlapping source and dest URIs not allowed.

Expected behavior:
mv and cp to gs://vomjom/baz

Thanks.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=34

downloading files with content-encoding:gzip should decompress to local files if not named w/ .gz extension

Original author: [email protected] (October 04, 2010 00:27:31)

When you do a command like:
% gsutil cp -z 'txt' abc.txt gs://mybucket
it will create an object gs://mybucket/abc.txt, with its content compressed. If you subsequently do:
% gsutil cp gs://mybucket/abc.txt .
it will copy the compressed data back down to the local file abc.txt. Instead it should uncompress the data, since the object has content-encoding:gzip and the local file has the extension .txt (not .gz, .gzip, etc.).
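
A minimal sketch of the decompress step that would run after the download, assuming the caller has already checked that the object carries Content-Encoding: gzip and the local name does not end in .gz/.gzip (the helper name and temp-file handling are illustrative):

import gzip
import os
import shutil

def decompress_in_place(path):
    # Replace the gzip-compressed download with its decompressed contents,
    # writing to a temp name first so a failure doesn't clobber the file.
    tmp_path = path + '.tmp'
    with gzip.open(path, 'rb') as compressed, open(tmp_path, 'wb') as plain:
        shutil.copyfileobj(compressed, plain)
    os.rename(tmp_path, path)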

Original issue: http://code.google.com/p/gsutil/issues/detail?id=42

gsutil ls -l problem

Original author: [email protected] (June 30, 2010 18:53:19)

What steps will reproduce the problem?

  1. gsutil foo gs://bucket
  2. gsutil ls -l gs://bucket

Similar to issue 15, but simpler (not concerned about metadata yet).

What is the expected output? What do you see instead?
Expect to see something like UNIX ls -l output for "foo" (modification date, size, etc.).
Instead I see:
gs://bucket/:
ACL:
Failure: 0.

What version of the product are you using? On what operating system?
Downloaded and installed on Fedora 8, 2010-06-30.
There is no gsutil --version, and the usage output does not show a version.

Please provide any additional information below.
work email: [email protected]

Original issue: http://code.google.com/p/gsutil/issues/detail?id=18

gsutil ls gets confused by objects with "$" in their name

Original author: [email protected] (July 15, 2010 23:11:39)

What steps will reproduce the problem?

  1. using GS Manager create a folder within a folder - e.g., I created a folder 'def' within folder 'abc' in my bucket "mfsbucket". At this point GS Manager has the following object in my bucket: gs://mfsbucket/abc/def_$folder$
  2. If you try to do gsutil ls on this object, you get an error:
    % gsutil ls gs://mfsbucket/abc/def_$folder$
    "gs://mfsbucket/abc/def_$" matches no objects.

What is the expected output? What do you see instead?

It should show the object.

Please use labels and text to provide additional information.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=22

gsutil cp doesn't allow one to specify an object's MIME type

Original author: [email protected] (June 02, 2010 05:24:07)

What steps will reproduce the problem?

  1. echo foo > file ; gsutil cp file gs://mybucket/file
  2. gsutil ls gs://mybucket/file

What is the expected output? What do you see instead?

The MIME type is unconditionally application/octet-stream, regardless of file type or user intention to the contrary. It's a perfectly sane default, but since one can't alter the MIME type of an object after creation, it makes gsutil less than useful for pushing files intended for browser consumption.

What version of the product are you using? On what operating system?

2010-05-25 13:24:19-07:00, MacOS X

Please provide any additional information below.

A patch is attached that adds a -t sub-option to the gsutil cp command, permitting the MIME type to be set.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=7

gsutil doesn't complain if a file transfer is truncated

Original author: [email protected] (May 26, 2010 13:46:06)

For example, I did a gsutil copy of a large file, hit ^C part way through, and was left with a partial file.

The user probably would expect that behavior in the ^C case, but the case I'm more concerned about is a transfer timing out part way through. Some possibilities:

  1. leave as is
  2. warn the user when it happens about the partial files
  3. upload/download to a temp name and move it to the final name once the transfer completes (see the sketch below)
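
A minimal sketch of option 3 for the download direction (the fetch callable is a placeholder for whatever writes the object's bytes, not gsutil's actual code):

import os

def download_atomically(fetch, final_path):
    # Write to a temporary name in the same directory, then rename only
    # after the transfer completes, so a timeout or ^C never leaves a
    # partial file under the final name.
    tmp_path = final_path + '.gstmp'
    with open(tmp_path, 'wb') as f:
        fetch(f)                       # placeholder: streams the bytes into f
    os.rename(tmp_path, final_path)    # atomic within the same filesystem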

Original issue: http://code.google.com/p/gsutil/issues/detail?id=5

make gsutil work on Python 3.x

Original author: [email protected] (August 19, 2010 22:33:59)

Currently, if you try to run gsutil on Python v3.x you will get syntax errors at print statements, because Python 3.x made print into a function, so the old syntax:
print 'abcd'
doesn't work, and instead you need:
print('abcd')
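
One common way to keep a single code base valid under both interpreters (assuming Python 2.6 or later) is to opt in to the function form:

from __future__ import print_function   # must precede other statements in the module

print('abcd')    # valid on Python 2.6+ and on 3.x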

There are likely other problems beyond just this to make gsutil work on v3.x. Moreover, at present the boto library only works on Python v2.x.

At some point, when boto moves to support Python 3.x, gsutil should as well.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=29

gsutil wildcarding gets confused by objects whose names contain shell meta chars

Original author: [email protected] (August 04, 2010 21:09:05)

What steps will reproduce the problem?

  1. gsutil mb gs://somebucket
  2. gsutil cp abc.txt gs://somebucket/abc.txt^M
    (use control-V control-M for the final character above)
  3. gsutil rm gs://somebucket/*

What is the expected output? What do you see instead?
Expect gsutil to delete the object created above. Instead it gets a 404 "Not found" error.

Please use labels and text to provide additional information.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=24

The move command does not copy ACL

Original author: [email protected] (December 12, 2010 13:26:52)

Supposing you have a bucket and a file named /mybucket/file.tar.gz with public read ACL. When you perform a "gsutil mv gs://mybucket/file.tar.gz gs://mybucket/myfile.tar.gz" the gsutil application outputs:

Copying gs://mybucket/file.tar.gz...
Removing gs://mybucket/file.tar.gz...

The file is renamed. However, the ACLs from the initial file are not transferred: in the case above, the new mybucket/myfile.tar.gz is not publicly readable. I was expecting the file to be renamed, but also the ACLs to be transferred.

The version I am using is 12-03-2010a.

Original issue: http://code.google.com/p/gsutil/issues/detail?id=49
