s3mock's Issues

A way to clean bucket between tests?

I'm trying to make my tests with S3Mock independent, and to achieve that I clean, delete, and recreate the bucket between tests. This is failing because of exceptions about being unable to delete either a file or the bucket directory itself.

Here's my setup, roughly:

@Before
public void setUp() throws Exception {
    s3Api = S3Mock.create(8001, System.getProperty("java.io.tmpdir") + "s3");
    s3Api.start();

    s3Client = new AmazonS3Client(new AnonymousAWSCredentials());
    s3Client.setEndpoint("http://127.0.0.1:8001");
    s3Client.createBucket(BUCKET_NAME);
}

@After
public void tearDown() throws Exception {
    for (S3ObjectSummary summary : s3Client.listObjects(BUCKET_NAME).getObjectSummaries()) {
        if (!summary.getKey().startsWith(".")) {
            s3Client.deleteObject(BUCKET_NAME, summary.getKey());
        }
    }
    s3Client.deleteBucket(BUCKET_NAME);
}

The if (!summary.getKey().startsWith(".")) check is there because of #16. Without it, I'm getting:

10:35:58.759 [sqsmock-akka.actor.default-dispatcher-10] DEBUG io.findify.s3mock.provider.FileProvider - deleting bucket s://attachments
10:35:58.763 [sqsmock-akka.actor.default-dispatcher-10] ERROR io.findify.s3mock.route.DeleteBucket - DELETE bucket attachments failed: C:\Users\palucm01\AppData\Local\Temp\s3\attachments\.testKey
java.nio.file.AccessDeniedException: C:\Users\palucm01\AppData\Local\Temp\s3\attachments\.testKey
	at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:83)
	at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
	at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:102)
	at sun.nio.fs.WindowsFileSystemProvider.implDelete(WindowsFileSystemProvider.java:269)
	at sun.nio.fs.AbstractFileSystemProvider.delete(AbstractFileSystemProvider.java:103)
	at java.nio.file.Files.delete(Files.java:1126)
	at better.files.File.delete(File.scala:602)
        ...

while with the if in place I'm getting:

10:37:09.110 [sqsmock-akka.actor.default-dispatcher-5] DEBUG io.findify.s3mock.provider.FileProvider - deleting bucket s://attachments
10:37:09.116 [sqsmock-akka.actor.default-dispatcher-5] ERROR io.findify.s3mock.route.DeleteBucket - DELETE bucket attachments failed: C:\Users\palucm01\AppData\Local\Temp\s3\attachments
java.nio.file.DirectoryNotEmptyException: C:\Users\palucm01\AppData\Local\Temp\s3\attachments
	at sun.nio.fs.WindowsFileSystemProvider.implDelete(WindowsFileSystemProvider.java:266)
	at sun.nio.fs.AbstractFileSystemProvider.delete(AbstractFileSystemProvider.java:103)
	at java.nio.file.Files.delete(Files.java:1126)
	at better.files.File.delete(File.scala:602)
	at io.findify.s3mock.provider.FileProvider.deleteBucket(FileProvider.scala:130)
        ...

and either way, I can't delete the directory.
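
One workaround sketch, not an official API: since the mock is backed by a plain directory, stop it and wipe that directory with java.nio directly in tearDown, then recreate everything in setUp. The stop() call here is an assumption, modeled on the api.stop() used in the builder example further down this page.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Comparator;

@After
public void tearDown() throws IOException {
    s3Api.stop(); // assumed available, as in the builder example below
    Path root = Paths.get(System.getProperty("java.io.tmpdir"), "s3");
    if (Files.exists(root)) {
        // Delete children before parents so each directory is empty when removed.
        Files.walk(root)
             .sorted(Comparator.reverseOrder())
             .forEach(p -> p.toFile().delete());
    }
}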

Could you please add support for finding objects by prefix?

I have added a short test to io.findify.s3mock.ListBucketTest:

  it should "obey delimiters && prefixes when prefix equals to files name" in {
    s3.createBucket("list5")
    s3.putObject("list5", "dev/someEvent/2017/03/13/00/_SUCCESS", "xxx")
    val req2 = new ListObjectsRequest()
    req2.setBucketName("list5")
    req2.setDelimiter("/")
    req2.setPrefix("dev/someEvent/2017/03/13/00/_SUCCESS")
    val list2  = s3.listObjects(req2)
    list2.getObjectSummaries.size shouldEqual 1
    list2.getObjectSummaries.head.getKey shouldEqual "dev/someEvent/2017/03/13/00/_SUCCESS"
  }

which fails with:

[info] - should obey delimiters && prefixes when prefix equals to files name *** FAILED ***
[info]   0 did not equal 1 (ListBucketTest.scala:143)

How to get rid of akka warnings

Hi, I'm using this library for testing and I'm seeing lots of warnings such as:

[WARN] [06/12/2017 18:39:23.885] [s3mock-akka.actor.default-dispatcher-4] [akka.actor.ActorSystemImpl(s3mock)] Explicitly set HTTP header 'Content-Type: application/octet-stream' is ignored, illegal RawHeader

Is there an easy and safe way to get rid of these?
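
One option, assuming Akka logging is routed through slf4j/logback (the logs elsewhere on this page show a Slf4jLogger being started): raise the threshold of the emitting logger programmatically before starting the mock. The logger name is inferred from the warning output above, so treat it as an assumption.

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import org.slf4j.LoggerFactory;

// Run once before starting the mock; suppresses WARN-level "illegal RawHeader" messages.
// "akka.actor.ActorSystemImpl" is read off the warning output above.
Logger akkaLogger = (Logger) LoggerFactory.getLogger("akka.actor.ActorSystemImpl");
akkaLogger.setLevel(Level.ERROR);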

alpakka-s3 is not compatible with s3mock

As alpakka's client has its own implementation of the S3 client protocol, there's no way to set up the endpoint address in a way that is compatible with s3mock. Alpakka also always enforces HTTPS, which is not yet supported. This is not actually an s3mock bug, but a note to myself (Roman Grebennikov) to make a PR to alpakka to support non-HTTPS custom endpoints.

Could you please add the possibility, or describe a workaround, for DNS endpoints?

From the readme file:

// use IP for endpoint address as AWS S3 SDK uses DNS-based bucket access scheme
// resulting in attempts to connect to addresses like "bucketname.localhost"
// which requires specific DNS setup

I have not tried to set up DNS myself, but I understand that it may not be trivial. Do you know any workarounds to make "bucketname.localhost" work with your mock? If so, could you please publish them in the readme file? For now, separate s3 client instances are needed for integration tests and production code. :(
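
One workaround sketch: the newer AWS SDK client builder can avoid DNS-style bucket addressing altogether via path-style access (the same pattern used in several issues below), so "bucketname.localhost" never needs to resolve:

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.AnonymousAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

AmazonS3 client = AmazonS3ClientBuilder.standard()
        // Forces "host/bucket/key" URLs instead of "bucket.host/key".
        .withPathStyleAccessEnabled(true)
        .withEndpointConfiguration(new EndpointConfiguration("http://localhost:8001", "us-east-1"))
        .withCredentials(new AWSStaticCredentialsProvider(new AnonymousAWSCredentials()))
        .build();

With path-style access the client requests http://localhost:8001/bucketname/key, so the same client configuration can be reused against the mock and, with a different endpoint, against real S3.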

maven dependencies correct?

Hi - I get the following when trying to start the API using Java:

    S3Mock api = S3Mock.create(8001, "/tmp/s3");
    api.start();

Looks like "RoutingSettings" missing from akka-http-core-experimental_2.11?

Rob

[sqsmock-akka.actor.default-dispatcher-4] INFO akka.event.slf4j.Slf4jLogger - Slf4jLogger started
Exception in thread "main" java.lang.NoClassDefFoundError: akka/http/scaladsl/settings/RoutingSettings$
at io.findify.s3mock.S3Mock.start(S3Mock.scala:137)
at com.elsevier.agrm.sc.articleUpdate.sqs.MockS3Server.main(MockS3Server.java:19)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: akka.http.scaladsl.settings.RoutingSettings$
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more

separate storage for file metadata

Currently we're using dot-files to save key-specific metadata, but this may break things when a user decides to store their own dot-files inside s3mock. I'm currently considering multiple ways of doing this:

  • an SQLite database (could also be used for ownership & permissions)
  • the same dot-files, but in another directory so they don't clash with the working one

Unable to write put objects to temp folder in java

This seems like an issue: I recently pulled down s3mock and followed the instructions, but I'm unable to PUT any objects into my temp folder. It always seems to write 0 bytes, which eventually fails the MD5 check of the put method. Below is what I'm seeing in the logs.

07:48:43.445 [sqsmock-akka.actor.default-dispatcher-3] DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://my-bucket/contents to /tmp/s3/mybucket/contents, bytes = 0

Is this expected, or am I missing something in the configuration? I tried using anonymous and generic basic credentials as well.

Wrong content type on GET Service

When doing a GET on "/", the returned ListAllMyBucketsResult is sent as text/plain instead of application/xml, which is a fatal error with jets3t:
org.jets3t.service.S3ServiceException: Expected XML document response from S3 but received content type text/plain; charset=UTF-8

ObjectListing#getCommonPrefixes() returns an empty list

The expected value of listing.getCommonPrefixes() is a List<String> of 2 common prefixes, dev/someEvent/2016 and dev/someEvent/2017, but it returns an empty list when running on s3mock. The same code works fine against an actual S3 instance.

s3Mock = S3Mock.create(8001, "/tmp/s3");
s3Mock.start();

amazonS3Client = new AmazonS3Client();

amazonS3Client.setEndpoint("http://127.0.0.1:8001");
amazonS3Client.createBucket("test-bucket");
amazonS3Client.putObject("test-bucket", "dev/someEvent/2017/03/13/00/_SUCCESS", FILE);
amazonS3Client.putObject("test-bucket", "dev/someEvent/2017/03/13/01/_SUCCESS", FILE);
amazonS3Client.putObject("test-bucket", "dev/someEvent/2016/12/31/23/_SUCCESS", FILE);
amazonS3Client.putObject("test-bucket", "_SUCCESS", FILE);

ListObjectsRequest listObjectsRequest = new ListObjectsRequest("test-bucket", "dev/", null, "/", null);
ObjectListing listing = amazonS3Client.listObjects(listObjectsRequest);
System.out.println("getCommonPrefixes: " + listing.getCommonPrefixes()); //null
System.out.println("getObjectSummaries: " + listing.getObjectSummaries()); //shows 3 objectSummaries

using s3mock from awscli

After running s3mock and then trying to create a bucket with the AWS CLI, I get this error:

A client error (405) occurred when calling the CreateBucket operation: Method Not Allowed

cmd:
AWS_ACCESS_KEY_ID=4324324234 AWS_SECRET_ACCESS_KEY=3412342 aws s3api create-bucket --bucket demo --endpoint-url http://0.0.0.0:80/

When creating a bucket with the Java AWS SDK, it works.

Is there something I am missing?

Problem with maven-enforcer-plugin in 0.1.4 version

Hello, I'm trying to use version 0.1.4 of your library, but I have a problem with the maven-enforcer-plugin.

When I try to install the project, I get:

Dependency convergence error for com.typesafe:config:1.2.1, paths to dependency are:

+-io.findify:s3mock_2.11:0.1.4
+-com.typesafe.akka:akka-stream_2.11:2.4.11
+-com.typesafe.akka:akka-actor_2.11:2.3.4
+-com.typesafe:config:1.2.1
and
+-io.findify:s3mock_2.11:0.1.4
+-com.typesafe.akka:akka-stream_2.11:2.4.11
+-com.typesafe:ssl-config-akka_2.11:0.2.1
+-com.typesafe:ssl-config-core_2.11:0.2.1
+-com.typesafe:config:1.2.0

[WARNING] Dependency convergence error for org.scala-lang:scala-library:2.11.8, paths to dependency are:
+-io.findify:s3mock_2.11:0.1.4
+-org.scala-lang:scala-library:2.11.8
and
+-io.findify:s3mock_2.11:0.1.4
+-com.typesafe.akka:akka-stream_2.11:2.4.11
+-org.scala-lang:scala-library:2.11.8
and
+-io.findify:s3mock_2.11:0.1.4
+-com.typesafe.akka:akka-stream_2.11:2.4.11
+-com.typesafe.akka:akka-actor_2.11:2.3.4
+-org.scala-lang:scala-library:2.11.1

As you can see, com.typesafe:config has two different versions, 1.2.0 and 1.2.1.

Could you fix this problem with the dependencies?

Thank you so much.

Max Keys not respected when calling list objects (V2)

The following code will return all the objects in a given bucket, regardless of the value passed to .withMaxKeys:

ListObjectsV2Request objectsRequest = new ListObjectsV2Request()
        .withBucketName(bucketName)
        .withPrefix(topic)
        .withMaxKeys(objectsToRetrieve);

ListObjectsV2Result objectListing = m_s3Client.listObjectsV2(objectsRequest);
objectSummaries = objectListing.getObjectSummaries();
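
For reference, the contract the snippet relies on can be pinned with a one-line JUnit assertion using the variables above: on real S3 a single page never exceeds maxKeys, while s3mock currently returns everything.

import static org.junit.Assert.assertTrue;

// Real S3 caps each page at maxKeys; per this report, the check fails against s3mock.
assertTrue(objectListing.getObjectSummaries().size() <= objectsToRetrieve);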

Release to Maven Central in addition to bintray?

Are you planning to publish to Maven Central too? Maven discourages using third party repositories on poms that are going to be pushed into Central, so it would be nice to be able to use s3mock without having to configure an external repository in my pom.

Alpakka multi part upload does not work with s3mock

Attempting a multipart upload with alpakka results in "Unsupported Content-Type, supported: application/octet-stream". This is likely caused by s3mock returning text/plain as the content type for the InitiateMultipartUpload response.

Test to reproduce issue:

feature("Alpakka") {
    scenario("Fails on multi part upload with s3mock") {

      implicit val inMemorySystem = ActorSystem.create("some-system", configFor("localhost", 8001))
      implicit val inMemoryMat = ActorMaterializer()(inMemorySystem)

      val api = new S3Mock(8001, new FileProvider("/tmp/s3"))(inMemorySystem)
      api.start

      val endpoint = new EndpointConfiguration(s"http://localhost:8001", "eu-west-1")
      val client = AmazonS3ClientBuilder.standard()
        .withPathStyleAccessEnabled(true)
        .withCredentials(new AWSStaticCredentialsProvider(new AnonymousAWSCredentials()))
        .withEndpointConfiguration(endpoint)
        .build()

      client.createBucket("test-bucket")

      val inMemoryBasedAlpakkaClient = S3Client(BasicCredentials("foo", "bar"), "us-east-1")(inMemorySystem, inMemoryMat)

      val eventualResult: Future[MultipartUploadResult] = Source.single(ByteString("test-content"))
        .runWith(inMemoryBasedAlpakkaClient.multipartUpload("test-bucket", "test-key"))

      Await.result(eventualResult, 10 seconds)
    }
  }


  def configFor(host: String, port: Int): Config = {
    ConfigFactory.parseMap(Map(
      "akka.stream.alpakka.s3.proxy.host" -> host,
      "akka.stream.alpakka.s3.proxy.port" -> port,
      "akka.stream.alpakka.s3.proxy.secure" -> false,
      "akka.stream.alpakka.s3.path-style-access" -> true
    ).asJava)

  }

Output:

[2017-08-04 11:43:57,926] activity-fun-test INFO  [some-system-akka.actor.default-dispatcher-16] i.f.s.route.PutObjectMultipartStart - multipart upload start to test-bucket/test-key
[2017-08-04 11:43:57,932] activity-fun-test DEBUG [some-system-akka.actor.default-dispatcher-16] i.f.s3mock.provider.FileProvider - starting multipart upload for s3://test-bucket/test-key

Unsupported Content-Type, supported: application/octet-stream
akka.http.scaladsl.unmarshalling.Unmarshaller$UnsupportedContentTypeException: Unsupported Content-Type, supported: application/octet-stream
	at akka.http.scaladsl.unmarshalling.Unmarshaller$UnsupportedContentTypeException$.apply(Unmarshaller.scala:158)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$EnhancedFromEntityUnmarshaller$$anonfun$forContentTypes$extension$1$$anonfun$apply$24$$anonfun$apply$25.apply(Unmarshaller.scala:114)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$EnhancedFromEntityUnmarshaller$$anonfun$forContentTypes$extension$1$$anonfun$apply$24$$anonfun$apply$25.apply(Unmarshaller.scala:111)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$$anon$1.apply(Unmarshaller.scala:58)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$EnhancedUnmarshaller$$anonfun$mapWithInput$extension$1$$anonfun$apply$18$$anonfun$apply$19.apply(Unmarshaller.scala:91)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$EnhancedUnmarshaller$$anonfun$mapWithInput$extension$1$$anonfun$apply$18$$anonfun$apply$19.apply(Unmarshaller.scala:91)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$$anon$1.apply(Unmarshaller.scala:58)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$$anonfun$transform$1$$anonfun$apply$2$$anonfun$apply$3.apply(Unmarshaller.scala:23)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$$anonfun$transform$1$$anonfun$apply$2$$anonfun$apply$3.apply(Unmarshaller.scala:23)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$$anon$1.apply(Unmarshaller.scala:58)
	at akka.http.scaladsl.unmarshalling.Unmarshal.to(Unmarshal.scala:25)
	at akka.stream.alpakka.s3.impl.S3Stream$$anonfun$akka$stream$alpakka$s3$impl$S3Stream$$initiateMultipartUpload$1.apply(S3Stream.scala:130)
	at akka.stream.alpakka.s3.impl.S3Stream$$anonfun$akka$stream$alpakka$s3$impl$S3Stream$$initiateMultipartUpload$1.apply(S3Stream.scala:128)
	at scala.concurrent.Future$$anonfun$flatMap$1.apply(Future.scala:251)
	at scala.concurrent.Future$$anonfun$flatMap$1.apply(Future.scala:249)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:90)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:38)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:43)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

copyObject with new metadata is not working

copyObject with new metadata is still not working

s3.createBucket("test-bucket");

ObjectMetadata meta = new ObjectMetadata();
meta.addUserMetadata("key1", "value1");
meta.addUserMetadata("key2", "value2");
PutObjectRequest putRequest = new PutObjectRequest("test-bucket", "test.txt", new ByteArrayInputStream("test".getBytes(StandardCharsets.UTF_8)), meta);
s3.putObject(putRequest);

ObjectMetadata newMeta = new ObjectMetadata();
newMeta.addUserMetadata("new-key1", "new-value1");
newMeta.addUserMetadata("new-key2", "new-value2");
CopyObjectRequest copyRequest = new CopyObjectRequest("test-bucket", "test.txt", "test-bucket", "test2.txt").withNewObjectMetadata(newMeta);
s3.copyObject(copyRequest);

S3Object object = s3.getObject("test-bucket", "test2.txt");
assertThat(object.getObjectMetadata().getUserMetadata().size(), is(2));
assertThat(object.getObjectMetadata().getUserMetadata().get("new-key1"), is("new-value1"));
assertThat(object.getObjectMetadata().getUserMetadata().get("new-key1"), is("new-value2"));

Payload > 8MB not supported

We have special treatment for larger files in our application, which externalizes files exceeding a certain threshold to S3 rather than to our database. In order to test the upload path to S3 we are currently trying s3mock; however, we get an exception thrown by Akka saying that the actual entity size exceeded the content length limit:

[ERROR] [02/22/2017 14:16:52.829] [sqsmock-akka.actor.default-dispatcher-6] [akka://sqsmock/user/StreamSupervisor-0/flow-4-0-unknown-operation] Error during preStart in [akka.http.scaladsl.model.HttpEntity$Limitable@6b9826b2]
EntityStreamSizeException: actual entity size (Some(10039231)) exceeded content length limit (8388608 bytes)! You can configure this by setting `akka.http.[server|client].parsing.max-content-length` or calling `HttpEntity.withSizeLimit` before materializing the dataBytes stream.
	at akka.http.scaladsl.model.HttpEntity$Limitable$$anon$1.preStart(HttpEntity.scala:607)
	at akka.stream.impl.fusing.GraphInterpreter.init(GraphInterpreter.scala:520)
	at akka.stream.impl.fusing.GraphInterpreterShell.init(ActorGraphInterpreter.scala:380)
	at akka.stream.impl.fusing.ActorGraphInterpreter.tryInit(ActorGraphInterpreter.scala:538)
	at akka.stream.impl.fusing.ActorGraphInterpreter.preStart(ActorGraphInterpreter.scala:586)
	at akka.actor.Actor.aroundPreStart(Actor.scala:504)
	at akka.actor.Actor.aroundPreStart$(Actor.scala:504)
	at akka.stream.impl.fusing.ActorGraphInterpreter.aroundPreStart(ActorGraphInterpreter.scala:529)
	at akka.actor.ActorCell.create(ActorCell.scala:590)
	at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:461)
	at akka.actor.ActorCell.systemInvoke(ActorCell.scala:483)
	at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:282)
	at akka.dispatch.Mailbox.run(Mailbox.scala:223)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:234)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

I've tried to set akka.http.server.parsing.max-content-length (client as well) as a system property, though without any success. Is there a possibility to specify this threshold via the S3Mock interface somehow? I haven't found anything in either the Http.ServerBinding object returned from s3Mock.start() or the Provider returned via s3Mock.p().
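
A sketch of a possible workaround, untested: build your own ActorSystem with a raised limit and hand it to the S3Mock constructor, mirroring the Scala usage new S3Mock(port, provider)(system) from the Alpakka issue above. Whether the constructor is callable this way from Java is an assumption.

import akka.actor.ActorSystem;
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;
import io.findify.s3mock.S3Mock;
import io.findify.s3mock.provider.FileProvider;

// Raise the server-side parsing limit before the mock's HTTP server is bound.
Config config = ConfigFactory
        .parseString("akka.http.server.parsing.max-content-length = 64m")
        .withFallback(ConfigFactory.load());
ActorSystem system = ActorSystem.create("s3mock", config);
S3Mock api = new S3Mock(8001, new FileProvider("/tmp/s3"), system); // assumed signature
api.start();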

PS: you should also update the README.md, bumping the Maven version to 0.1.6, as 0.1.5 is not available and the Scala version already refers to 0.1.6.

deleteObjects is not supported

Everything works OK with AmazonS3Client.deleteObject(String bucketName, String key), but it fails when calling AmazonS3Client.deleteObjects(DeleteObjectsRequest deleteObjectRequest) for bulk deletion.

In-memory driver

Currently we require a user-specified folder for storing data. It's helpful when you need to mock an s3 bucket with some external data stored there. But in some cases, when external data is not needed, it would be nice to have a way to throw away the whole bucket contents after the test completes. It should also fix #17.
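
For reference, the builder-based in-memory backend used later on this page (see the PowerMock issue) looks like this from Java:

import io.findify.s3mock.S3Mock;

S3Mock api = new S3Mock.Builder().withPort(8001).withInMemoryBackend().build();
api.start();
// ... exercise the mock at http://127.0.0.1:8001 ...
api.stop(); // the whole bucket contents vanish with the mock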

Unable to run sample on Windows

Running the sample code on Windows fails with an error:

[ERROR] [10/24/2016 19:38:55.332] [sqsmock-akka.actor.default-dispatcher-7] [akka.actor.ActorSystemImpl(sqsmock)] Error during processing of request HttpRequest(HttpMethod(PUT),http://127.0.0.1:8001/testbucket/file/name,List(Host: 127.0.0.1:8001, user-agent: aws-sdk-java/1.11.41 Windows_7/6.1 Java_HotSpot(TM)_64-Bit_Server_VM/25.74-b02/1.8.0_74, amz-sdk-invocation-id: 4d329e5c-1ebc-f226-f93a-5182d9d93f9d, amz-sdk-retry: 0/0/500, Connection: Keep-Alive, Expect: 100-continue, Timeout-Access: ),HttpEntity.Default(text/plain; charset=UTF-8,8 bytes total),HttpProtocol(HTTP/1.1))
java.nio.file.InvalidPathException: Illegal char <:> at index 2: /C:\tmp\s3
at sun.nio.fs.WindowsPathParser.normalize(WindowsPathParser.java:182)
at sun.nio.fs.WindowsPathParser.parse(WindowsPathParser.java:153)
at sun.nio.fs.WindowsPathParser.parse(WindowsPathParser.java:77)
at sun.nio.fs.WindowsPath.parse(WindowsPath.java:94)
at sun.nio.fs.WindowsFileSystem.getPath(WindowsFileSystem.java:255)
at java.nio.file.Paths.get(Paths.java:84)
at better.files.File$.apply(File.scala:801)
at io.findify.s3mock.provider.FileProvider.create$1(FileProvider.scala:108)
at io.findify.s3mock.provider.FileProvider.createDir(FileProvider.scala:112)
at io.findify.s3mock.provider.FileProvider.putObject(FileProvider.scala:54)
at io.findify.s3mock.route.PutObject$$anonfun$completePlain$1$$anonfun$apply$4$$anonfun$4.apply(PutObject.scala:49)
at io.findify.s3mock.route.PutObject$$anonfun$completePlain$1$$anonfun$apply$4$$anonfun$4.apply(PutObject.scala:48)
at akka.stream.impl.fusing.Map$$anon$5.onPush(Ops.scala:42)
at akka.stream.impl.fusing.GraphInterpreter.processPush(GraphInterpreter.scala:755)
at akka.stream.impl.fusing.GraphInterpreter.processEvent(GraphInterpreter.scala:744)
at akka.stream.impl.fusing.GraphInterpreter.execute(GraphInterpreter.scala:624)
at akka.stream.impl.fusing.GraphInterpreterShell.runBatch(ActorGraphInterpreter.scala:470)
at akka.stream.impl.fusing.GraphInterpreterShell.receive(ActorGraphInterpreter.scala:422)
at akka.stream.impl.fusing.ActorGraphInterpreter.akka$stream$impl$fusing$ActorGraphInterpreter$$processEvent(ActorGraphInterpreter.scala:602)
at akka.stream.impl.fusing.ActorGraphInterpreter$$anonfun$receive$1.applyOrElse(ActorGraphInterpreter.scala:617)
at akka.actor.Actor$class.aroundReceive(Actor.scala:484)
at akka.stream.impl.fusing.ActorGraphInterpreter.aroundReceive(ActorGraphInterpreter.scala:528)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:526)
at akka.actor.ActorCell.invoke(ActorCell.scala:495)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:257)
at akka.dispatch.Mailbox.run(Mailbox.scala:224)
at akka.dispatch.Mailbox.exec(Mailbox.scala:234)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Test Class

import com.amazonaws.auth.AnonymousAWSCredentials;
import com.amazonaws.services.s3.AmazonS3Client;
import io.findify.s3mock.S3Mock;
import org.junit.Test;

public class FakeS3 {

    @Test
    public void testFakeS3(){
        S3Mock api = S3Mock.create(8001, "C:\\tmp\\s3");
        api.start();

        AmazonS3Client client = new AmazonS3Client(new AnonymousAWSCredentials());
        // use IP endpoint to override DNS-based bucket addressing
        client.setEndpoint("http://127.0.0.1:8001");
        client.createBucket("testbucket");
        client.putObject("testbucket", "file/name", "contents");
    }
}

Unable to read files

Trying to read a file that I've just written to s3mock gives me an empty InputStream:

client.getObject("bucket", "file").getObjectContent() // <- this is empty InputStream
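
A self-contained repro sketch of the round trip (bucket and key names are illustrative; assumes a mock already running on port 8001):

import com.amazonaws.auth.AnonymousAWSCredentials;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.util.IOUtils;

AmazonS3Client client = new AmazonS3Client(new AnonymousAWSCredentials());
client.setEndpoint("http://127.0.0.1:8001");
client.createBucket("bucket");
client.putObject("bucket", "file", "contents");
// Expect "contents"; this issue reports an empty stream instead.
String body = IOUtils.toString(client.getObject("bucket", "file").getObjectContent());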

Using S3Mock under PowerMock causes issues

I tried the tutorial example JavaBuilderExample as a JUnit 4 test:

//@RunWith(PowerMockRunner.class)
public class JavaBuilderExample {
    @Test
    public void testS3() {
    // public static void main(String[] args) {
        S3Mock api = new S3Mock.Builder().withPort(8001).withInMemoryBackend().build();
        api.start();
        AmazonS3 client = AmazonS3ClientBuilder
                .standard()
                .withPathStyleAccessEnabled(true)
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration("http://localhost:8001", "us-east-1"))
                .withCredentials(new AWSStaticCredentialsProvider(new AnonymousAWSCredentials()))
                .build();
        client.createBucket("testbucket");
        client.putObject("testbucket", "file/name", "contents");
        api.stop();
    }
}

alone, it works.

But I got errors when using it under PowerMock, by uncommenting the //@RunWith(PowerMockRunner.class) line so the class reads:

to the following:

@RunWith(PowerMockRunner.class)
public class JavaBuilderExample {
    @Test
    public void testS3() {
    ...

The error message is:

java.security.NoSuchAlgorithmException: class configured for KeyManagerFactory: sun.security.ssl.KeyManagerFactoryImpl$SunX509 not a KeyManagerFactory

at sun.security.jca.GetInstance.checkSuperClass(GetInstance.java:260)
at sun.security.jca.GetInstance.getInstance(GetInstance.java:237)
at sun.security.jca.GetInstance.getInstance(GetInstance.java:164)
at javax.net.ssl.KeyManagerFactory.getInstance(KeyManagerFactory.java:137)
at com.typesafe.sslconfig.ssl.DefaultKeyManagerFactoryWrapper.<init>(SSLContextBuilder.scala:75)

I searched on Google, and tried to use the annotations:

@RunWith(PowerMockRunner.class)
@PowerMockIgnore("javax.net.ssl.*")
public class JavaBuilderExample {
    @Test
    public void testS3() {

    ...

and got a different error message:
java.lang.LinkageError: loader constraint violation: loader (instance of org/powermock/core/classloader/MockClassLoader) previously initiated loading for a different type with name "javax/management/MBeanServer"

at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at org.powermock.core.classloader.MockClassLoader.loadUnmockedClass(MockClassLoader.java:238)
at org.powermock.core.classloader.MockClassLoader.loadModifiedClass(MockClassLoader.java:182)
at org.powermock.core.classloader.DeferSupportingClassLoader.loadClass(DeferSupportingClassLoader.java:70)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at com.amazonaws.jmx.MBeans.registerMBean(MBeans.java:52)
at com.amazonaws.jmx.SdkMBeanRegistrySupport.registerMetricAdminMBean(SdkMBeanRegistrySupport.java:27)
at com.amazonaws.metrics.AwsSdkMetrics.registerMetricAdminMBean(AwsSdkMetrics.java:392)

Any suggestions to overcome this? I need to use PowerMock together with S3Mock for unit testing my application.

Thank you.
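
A commonly suggested workaround for PowerMock class-loader clashes, untested against s3mock: widen the ignore list so the JMX and SSL classes stay on the system class loader.

import org.junit.runner.RunWith;
import org.powermock.core.classloader.annotations.PowerMockIgnore;
import org.powermock.modules.junit4.PowerMockRunner;

@RunWith(PowerMockRunner.class)
// Keep JMX/SSL classes out of PowerMock's MockClassLoader to avoid the
// MBeanServer LinkageError and the KeyManagerFactory error above.
@PowerMockIgnore({"javax.management.*", "javax.net.ssl.*", "javax.security.*"})
public class JavaBuilderExample {
    // ... test body as above ...
}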

Support Alpakka listBucket

Attempting a listBucket with alpakka results in "Unsupported Content-Type, supported: application/xml; charset=UTF-8". This is likely caused by s3mock returning text/plain as the content type for the ListBucket response.

Using the latest master (containing the upload fix) lets me put the objects, but it does not let me do a listBucket afterwards.

It tries to Unmarshal:

HttpEntity.Strict(text/plain; charset=UTF-8,<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
      <Name>testBucket</Name>
      None
      None
      
      <KeyCount>3</KeyCount>
      <MaxKeys>1000</MaxKeys>
      <IsTruncated>false</IsTruncated>
        <Contents>
          <Key>2b2ce68f-676a-4cab-9627-51d9e8408d46.xml</Key>
          <LastModified>2017-09-14T12:06:22Z</LastModified>
          <ETag>0</ETag>
          <Size>477332</Size>
          <StorageClass>STANDARD</StorageClass>
        </Contents>
    </ListBucketResult>)

So it seems to be a similar issue to #61.

The output:

Unsupported Content-Type, supported: application/xml; charset=UTF-8
akka.http.scaladsl.unmarshalling.Unmarshaller$UnsupportedContentTypeException: Unsupported Content-Type, supported: application/xml; charset=UTF-8
	at akka.http.scaladsl.unmarshalling.Unmarshaller$UnsupportedContentTypeException$.apply(Unmarshaller.scala:158)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$EnhancedFromEntityUnmarshaller$.$anonfun$forContentTypes$3(Unmarshaller.scala:114)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$$anon$1.apply(Unmarshaller.scala:58)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$EnhancedUnmarshaller$.$anonfun$mapWithInput$3(Unmarshaller.scala:91)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$$anon$1.apply(Unmarshaller.scala:58)
	at akka.http.scaladsl.unmarshalling.Unmarshaller.$anonfun$transform$3(Unmarshaller.scala:23)
	at akka.http.scaladsl.unmarshalling.Unmarshaller$$anon$1.apply(Unmarshaller.scala:58)
	at akka.http.scaladsl.unmarshalling.Unmarshal.to(Unmarshal.scala:25)
	at akka.stream.alpakka.s3.impl.S3Stream.$anonfun$signAndGetAs$2(S3Stream.scala:248)
	at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:302)
	at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:37)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
	at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:91)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

ListObjects with prefix does not work on windows

ListObjects assumes that the path separator is "/", so when running on Windows the prefix search will always fail.

Similarly, keys returned from the method when called without a prefix are separated with '\' rather than '/'.

What's the best way to write JUnit asserts?

Say that I use the code below to create a bucket, a folder, and a file:

EndpointConfiguration endpoint = new EndpointConfiguration("http://localhost:8001", "us-east-1");

AmazonS3Client client = (AmazonS3Client) AmazonS3ClientBuilder
        .standard()
        .withEndpointConfiguration(endpoint)
        .withPathStyleAccessEnabled(true)
        .withCredentials(new AWSStaticCredentialsProvider(credentials))
        .build();

client.createBucket("bipindelete");
client.putObject("bipindelete", "foldername/filename.txt", "this is just the sample text for the file content");

Should I be writing a corresponding S3 GET, or simply assert on
"http://localhost:8001/bipindelete/foldername/filename.txt"?

My gut feeling is that asserting on "http://localhost:8001/bipindelete/foldername/filename.txt" using some HTTP client library would be best. Any comments?
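
One hedged alternative: instead of asserting on the raw URL, read the object back through the same SDK client (which already handles path-style addressing) and assert on its content.

import static org.junit.Assert.assertEquals;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.util.IOUtils;

// Round-trip assertion against the same mock and client configured above.
S3Object object = client.getObject("bipindelete", "foldername/filename.txt");
assertEquals("this is just the sample text for the file content",
        IOUtils.toString(object.getObjectContent()));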

S3Mock doesn't distinguish paths with and without a trailing slash

Having stored an object under the key some/path/example, it should:

  • respond with 404 on getObject("some/path") or getObject("some/path/"), but it responds with 500
  • respond with 204 on deleteObject("some/path") and do nothing, but it responds with 204 and deletes the keys prefixed with some/path
  • also, it's currently not possible to store two objects, i.e. one with key key and a second with key key/ or key/example

... and maybe some other cases need to be reconsidered as well.
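
For comparison, a sketch of the behavior real S3 exhibits (bucket and keys here are illustrative):

client.putObject("bucket", "key", "a");
client.putObject("bucket", "key/", "b");        // a distinct object on real S3
client.putObject("bucket", "key/example", "c"); // yet another distinct object
client.getObject("bucket", "key");              // returns "a" only
client.getObject("bucket", "some/path");        // throws a 404 AmazonS3Exception, not 500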

OutOfMemoryError when running tests on Windows

Hi,
I'm getting an OutOfMemoryError when trying to run all project tests via sbt test on Windows. Here is the command output:

E:\MySources\s3mock>sbt test
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Loading project definition from E:\MySources\s3mock\project
[info] Set current project to s3mock (in build file:/E:/MySources/s3mock/)
15:14:11.130 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket list
15:14:11.136 DEBUG i.f.s3mock.provider.FileProvider - creating bucket list
15:14:11.258 INFO  io.findify.s3mock.route.PutObject - put object list/foo1 (signed)
15:14:11.288 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,082f4cddaf66a88b241e557ed6d8963e4b4aa210f4f6c4642b23d948bfe8acb1)
15:14:11.288 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:11.289 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,1338becf1d87b2ce6b02791edc580ed4f35e42efa14a3c5885a3c6dd62c1becb)
15:14:11.289 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:11.298 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://list/foo1 to C:\Users\DMYTRO~1\AppData\Local\Temp\7362226302668834523/list/foo1, bytes = 3
15:14:11.520 INFO  io.findify.s3mock.route.PutObject - put object list/foo2 (signed)
15:14:11.525 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,ed933e476fc99ad6cc20345bd2386131d31efd4406dc79b09513d58b8d9a5cad)
15:14:11.525 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:11.526 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,2b35640b6ebeaa1b3773d577955e7f4430fc401e7f0a0c3d481fd8b667e89739)
15:14:11.527 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:11.532 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://list/foo2 to C:\Users\DMYTRO~1\AppData\Local\Temp\7362226302668834523/list/foo2, bytes = 3
15:14:11.647 INFO  io.findify.s3mock.route.ListBucket - listing bucket list with prefix=None, delimiter=None
15:14:11.676 DEBUG i.f.s3mock.provider.FileProvider - listing bucket contents: List(foo1, foo2)
[info] ListBucketEmptyWorkdirTest:
[info] s3mock
[info] - should list bucket with empty prefix
15:14:11.868 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket awscli-lm
15:14:11.874 DEBUG i.f.s3mock.provider.FileProvider - creating bucket awscli-lm
15:14:11.881 INFO  io.findify.s3mock.route.PutObject - put object awscli-lm/foo (signed)
15:14:11.885 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,fa03bc6a2d18a9dd08bfd09e7f7118138e915f1423712a56a0c0263151453914)
15:14:11.885 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:11.886 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,21e6e593c93b574c2f87b95f1b7455ac73c1cfb2c93e71e59cf76a525d25cd01)
15:14:11.886 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:11.891 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://awscli-lm/foo to C:\Users\DMYTRO~1\AppData\Local\Temp\8430220048473319760/awscli-lm/foo, bytes = 3
15:14:12.095 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=awscli-lm, path=foo
15:14:12.096 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://awscli-lm/foo
[WARN] [03/21/2017 15:14:12.148] [sqsmock-akka.actor.default-dispatcher-5] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Connection: Keep-Alive' is ignored, illegal RawHeader
[WARN] [03/21/2017 15:14:12.148] [sqsmock-akka.actor.default-dispatcher-5] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: text/plain; charset=UTF-8' is ignored, illegal RawHeader
15:14:12.165 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket awscli-head
15:14:12.167 DEBUG i.f.s3mock.provider.FileProvider - creating bucket awscli-head
15:14:12.172 INFO  io.findify.s3mock.route.PutObject - put object awscli-head/foo2 (signed)
15:14:12.175 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,261dac9b6231fd3a0f542d5545e909e0b52a369084ec741148ce6dd8e7cd7cc6)
15:14:12.175 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:12.176 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,84b1d4ffae11cf4dc2de55e9d83884b03d730b9d484e055b385d798d4f11bf9c)
15:14:12.176 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:12.181 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://awscli-head/foo2 to C:\Users\DMYTRO~1\AppData\Local\Temp\8430220048473319760/awscli-head/foo2, bytes = 3
15:14:12.228 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=awscli-head, path=foo2
15:14:12.228 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://awscli-head/foo2
[WARN] [03/21/2017 15:14:12.264] [sqsmock-akka.actor.default-dispatcher-8] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Connection: Keep-Alive' is ignored, illegal RawHeader
[WARN] [03/21/2017 15:14:12.265] [sqsmock-akka.actor.default-dispatcher-8] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: text/plain; charset=UTF-8' is ignored, illegal RawHeader
15:14:12.280 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket awscli-head2
15:14:12.281 DEBUG i.f.s3mock.provider.FileProvider - creating bucket awscli-head2
15:14:12.287 INFO  io.findify.s3mock.route.PutObject - put object awscli-head2/foo (signed)
15:14:12.291 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,70d1f42ad1c1ccabc6509b65df49be4dfddd7d18e6dd5e43ae9ac88af32be86f)
15:14:12.292 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:12.294 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,0655c357b8a5b8261700396e4f426008558014bf2c9724cea00ec01fea4c8195)
15:14:12.294 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:12.300 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://awscli-head2/foo to C:\Users\DMYTRO~1\AppData\Local\Temp\8430220048473319760/awscli-head2/foo, bytes = 3
15:14:12.354 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=awscli-head2, path=foo
15:14:12.355 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://awscli-head2/foo
[WARN] [03/21/2017 15:14:12.394] [sqsmock-akka.actor.default-dispatcher-10] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Connection: Keep-Alive' is ignored, illegal RawHeader
[WARN] [03/21/2017 15:14:12.394] [sqsmock-akka.actor.default-dispatcher-10] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: text/plain; charset=UTF-8' is ignored, illegal RawHeader
15:14:12.406 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket awscli
15:14:12.407 DEBUG i.f.s3mock.provider.FileProvider - creating bucket awscli
15:14:12.414 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=awscli, path=doesnotexist
15:14:12.414 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://awscli/doesnotexist
15:14:12.428 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=awscli-404, path=doesnotexist
15:14:12.428 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://awscli-404/doesnotexist
[info] GetObjectTest:
[info] awscli cp
[info] - should receive LastModified header
[info] - should deal with HEAD requests
[info] - should deal with metadata requests
[info] - should respond with status 404 if key does not exist
[info] - should respond with status 404 if bucket does not exist
15:14:12.559 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket awscli
15:14:12.562 DEBUG i.f.s3mock.provider.FileProvider - creating bucket awscli
15:14:12.576 DEBUG io.findify.s3mock.route.ListBuckets - listing all buckets
15:14:12.580 DEBUG i.f.s3mock.provider.FileProvider - listing buckets: List(awscli)
[info] PutBucketTest:
[info] awscli mb
[info] - should create bucket
15:14:12.624 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.634 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.634 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.634 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.635 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.635 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.635 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.635 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.637 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,1234567890123456789012345678901234567890123456789012345678901234)
15:14:12.637 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:12.638 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.638 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.638 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.638 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.638 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.638 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.638 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.638 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.638 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,1234567890123456789012345678901234567890123456789012345678901234)
15:14:12.638 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:12.639 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.647 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.647 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.648 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.648 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.648 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.648 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.649 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.649 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.649 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(5,84,1234567890123456789012345678901234567890123456789012345678901234)
15:14:12.649 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 5, bufferSize = 90
15:14:12.650 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(5,84,1234567890123456789012345678901234567890123456789012345678901234)
15:14:12.650 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=5
15:14:12.650 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.650 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.650 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.650 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.650 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.650 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.651 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.651 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,1234567890123456789012345678901234567890123456789012345678901234)
15:14:12.651 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:12.651 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
[info] S3ChunkedProtocolTest:
[info] s3 chunk protocol
[info] - should work with simple ins
[info] - should not drop \r\n chars
[info] MapMetadataStoreTest:
[info] map metadata store
[info] - should save md to a fresh store
15:14:12.745 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,1234567890123456789012345678901234567890123456789012345678901234)
15:14:12.747 DEBUG io.findify.s3mock.ChunkBuffer - cannot read header
15:14:12.748 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,1234567890123456789012345678901234567890123456789012345678901234)
15:14:12.749 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:12.750 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,1234567890123456789012345678901234567890123456789012345678901234)
15:14:12.750 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 3, bufferSize = 86
[info] ChunkBufferTest:
[info] chunk buffer
[info] - should detect header
[info] - should fail on non-complete header
[info] - should pull complete chunks
[info] - should ignore incomplete chunks
15:14:12.828 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket bucket-1
15:14:12.832 DEBUG i.f.s3mock.provider.FileProvider - creating bucket bucket-1
15:14:12.837 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket bucket-2
15:14:12.839 DEBUG i.f.s3mock.provider.FileProvider - creating bucket bucket-2
15:14:12.845 INFO  io.findify.s3mock.route.PutObject - put object bucket-1/test.txt (signed)
15:14:12.847 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(8,84,094ff6dce68d5a37a1266c08a82beec0fd05df5a6ff98b8ad3c145a17d8a6500)
15:14:12.847 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=8
15:14:12.848 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,bdf59529ad740f85b64ad0247354c522b512372ea7dd250b2405de4abbf42236)
15:14:12.848 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:12.854 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://bucket-1/test.txt to C:\Users\DMYTRO~1\AppData\Local\Temp\929394534961380229/bucket-1/test.txt, bytes = 8
15:14:12.936 DEBUG i.f.s3mock.provider.FileProvider - Copied s3://bucket-1/test.txt to s3://bucket-2/folder/test.txt
15:14:13.029 INFO  io.findify.s3mock.route.CopyObject - copied object bucket-1/test.txt
15:14:13.036 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=bucket-2, path=folder/test.txt
15:14:13.036 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://bucket-2/folder/test.txt
[WARN] [03/21/2017 15:14:13.084] [sqsmock-akka.actor.default-dispatcher-9] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Connection: Keep-Alive' is ignored, illegal RawHeader
[WARN] [03/21/2017 15:14:13.084] [sqsmock-akka.actor.default-dispatcher-9] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: text/plain; charset=UTF-8' is ignored, illegal RawHeader
15:14:13.097 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket bucket-3
15:14:13.099 DEBUG i.f.s3mock.provider.FileProvider - creating bucket bucket-3
Mar 21, 2017 3:14:13 PM com.amazonaws.services.s3.AmazonS3Client putObject
WARNING: No content length specified for stream data.  Stream contents will be buffered in memory and could result in out of memory errors.
15:14:13.108 INFO  io.findify.s3mock.route.PutObject - put object bucket-3/test.txt (signed)
15:14:13.111 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,28993a99d22b5aa966f53c4ab98ac5277b094aa8dc8f41fc6d0f122c5cedb185)
15:14:13.112 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:13.112 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,eb1336699d225ba1110c3f90587d57ae5931acae6b6f70af45a10111d667d0d2)
15:14:13.112 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:13.117 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://bucket-3/test.txt to C:\Users\DMYTRO~1\AppData\Local\Temp\929394534961380229/bucket-3/test.txt, bytes = 3
15:14:13.167 DEBUG i.f.s3mock.provider.FileProvider - Copied s3://bucket-3/test.txt to s3://bucket-3/test2.txt
15:14:13.226 INFO  io.findify.s3mock.route.CopyObject - copied object bucket-3/test.txt
15:14:13.231 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=bucket-3, path=test2.txt
15:14:13.232 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://bucket-3/test2.txt
[WARN] [03/21/2017 15:14:13.269] [sqsmock-akka.actor.default-dispatcher-9] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Connection: Keep-Alive' is ignored, illegal RawHeader
15:14:13.284 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket test-bucket
[WARN] [03/21/2017 15:14:13.270] [sqsmock-akka.actor.default-dispatcher-9] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: application/octet-stream' is ignored, illegal RawHeader
15:14:13.290 DEBUG i.f.s3mock.provider.FileProvider - creating bucket test-bucket
Mar 21, 2017 3:14:13 PM com.amazonaws.services.s3.AmazonS3Client putObject
WARNING: No content length specified for stream data.  Stream contents will be buffered in memory and could result in out of memory errors.
15:14:13.298 INFO  io.findify.s3mock.route.PutObject - put object test-bucket/test.txt (signed)
15:14:13.304 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(4,84,68afd49c28d0131b6ff09f83ffb9019ecaafc6955ad8f479fdb1ddd2184317a4)
15:14:13.305 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=4
15:14:13.306 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,573a274f9d001ba1f04c3f32b5f5f0d7fcaebb1b17f79d7e6628b9304a575109)
15:14:13.306 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:13.313 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://test-bucket/test.txt to C:\Users\DMYTRO~1\AppData\Local\Temp\929394534961380229/test-bucket/test.txt, bytes = 4
15:14:13.375 DEBUG i.f.s3mock.provider.FileProvider - Copied s3://test-bucket/test.txt to s3://test-bucket/test2.txt
15:14:13.409 INFO  io.findify.s3mock.route.CopyObject - copied object test-bucket/test.txt
15:14:13.414 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=test-bucket, path=test2.txt
15:14:13.414 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://test-bucket/test2.txt
[WARN] [03/21/2017 15:14:13.450] [sqsmock-akka.actor.default-dispatcher-7] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: application/octet-stream' is ignored, illegal RawHeader
[info] CopyObjectTest:
[info] object
[info] - should be copied even if destdir does not exist
[info] - should be copied with metadata
[info] - should be copied with new metadata
15:14:13.562 DEBUG io.findify.s3mock.route.ListBuckets - listing all buckets
15:14:13.563 DEBUG i.f.s3mock.provider.FileProvider - listing buckets: List()
[info] ListBucketsTest:
[info] s3 mock
[info] - should list empty buckets
15:14:13.623 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket getput
15:14:13.627 DEBUG i.f.s3mock.provider.FileProvider - creating bucket getput
15:14:13.631 DEBUG io.findify.s3mock.route.ListBuckets - listing all buckets
15:14:13.632 DEBUG i.f.s3mock.provider.FileProvider - listing buckets: List(getput)
15:14:13.638 INFO  io.findify.s3mock.route.PutObject - put object getput/foo (signed)
15:14:13.643 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,b75a764829af6f63e6d321a0865f7eea50a67cb1959daac20f1fb51ea2d64c62)
15:14:13.644 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:13.644 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,ebbb49cd313c8fcd15b6ad83cbca64ac4d57c6f64c9ad6f0bf34f6194cbcef3a)
15:14:13.645 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:13.651 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://getput/foo to C:\Users\DMYTRO~1\AppData\Local\Temp\886176408114888458/getput/foo, bytes = 3
15:14:13.707 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=getput, path=foo
15:14:13.707 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://getput/foo
[WARN] [03/21/2017 15:14:13.740] [sqsmock-akka.actor.default-dispatcher-3] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Connection: Keep-Alive' is ignored, illegal RawHeader
[WARN] [03/21/2017 15:14:13.740] [sqsmock-akka.actor.default-dispatcher-3] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: text/plain; charset=UTF-8' is ignored, illegal RawHeader
15:14:13.762 DEBUG io.findify.s3mock.route.ListBuckets - listing all buckets
15:14:13.766 DEBUG i.f.s3mock.provider.FileProvider - listing buckets: List(getput, getput.metadata)
15:14:13.799 INFO  io.findify.s3mock.route.PutObject - put object getput/foo2 (unsigned)
15:14:13.808 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://getput/foo2 to C:\Users\DMYTRO~1\AppData\Local\Temp\886176408114888458/getput/foo2, bytes = 3
15:14:13.849 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=getput, path=foo2
15:14:13.850 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://getput/foo2
[WARN] [03/21/2017 15:14:13.886] [sqsmock-akka.actor.default-dispatcher-3] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: text/plain; charset=UTF-8' is ignored, illegal RawHeader
15:14:13.891 INFO  io.findify.s3mock.route.PutObject - put object getput/foo1/foo2/foo3 (signed)
15:14:13.896 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(3,84,fb30b1acdb8319388f787344df6636197c5f635ebd07e2f3d1d617f9d80392b1)
15:14:13.897 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=3
15:14:13.898 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,4f50d81ed921c6a2e4b861a41fd5bfaf3eea5d6788d9d9ab9df2d80aefe866d6)
15:14:13.898 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:13.907 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://getput/foo1/foo2/foo3 to C:\Users\DMYTRO~1\AppData\Local\Temp\886176408114888458/getput/foo1/foo2/foo3, bytes = 3
15:14:13.942 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=getput, path=foo1/foo2/foo3
15:14:13.942 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://getput/foo1/foo2/foo3
[WARN] [03/21/2017 15:14:13.975] [sqsmock-akka.actor.default-dispatcher-6] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Connection: Keep-Alive' is ignored, illegal RawHeader
[WARN] [03/21/2017 15:14:13.976] [sqsmock-akka.actor.default-dispatcher-6] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: text/plain; charset=UTF-8' is ignored, illegal RawHeader
15:14:13.985 INFO  io.findify.s3mock.route.PutObject - put object getput/foorn (signed)
15:14:13.987 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(8,84,a5c670cb49b0b8c75f4d9d1c2fd26823753292e94d78a9476fec46f828abeae7)
15:14:13.988 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=8
15:14:13.989 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,4a5e4aed0df238544ca3576405346a6071f71f236a606048c12dd67a4b7c0ab7)
15:14:13.989 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:13.993 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://getput/foorn to C:\Users\DMYTRO~1\AppData\Local\Temp\886176408114888458/getput/foorn, bytes = 8
15:14:14.036 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=getput, path=foorn
15:14:14.037 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://getput/foorn
[WARN] [03/21/2017 15:14:14.073] [sqsmock-akka.actor.default-dispatcher-6] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Connection: Keep-Alive' is ignored, illegal RawHeader
[WARN] [03/21/2017 15:14:14.073] [sqsmock-akka.actor.default-dispatcher-6] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: text/plain; charset=UTF-8' is ignored, illegal RawHeader
Mar 21, 2017 3:14:14 PM com.amazonaws.services.s3.AmazonS3Client putObject
WARNING: No content length specified for stream data.  Stream contents will be buffered in memory and could result in out of memory errors.
15:14:14.264 INFO  io.findify.s3mock.route.PutObject - put object getput/foolarge (signed)
15:14:14.270 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9e005c63cadf7852a19feb904020be4a0a23de3bc65f9104024bb6c5dec82b8c)
15:14:14.270 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 8192
15:14:14.270 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9e005c63cadf7852a19feb904020be4a0a23de3bc65f9104024bb6c5dec82b8c)
15:14:14.270 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.271 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9e005c63cadf7852a19feb904020be4a0a23de3bc65f9104024bb6c5dec82b8c)
15:14:14.272 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.272 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9ce94c2e099869520313dce354ca0a705ad70dd83e405f001a16857b6df6f6ff)
15:14:14.273 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 45056
15:14:14.273 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9ce94c2e099869520313dce354ca0a705ad70dd83e405f001a16857b6df6f6ff)
15:14:14.273 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 126976
15:14:14.274 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9ce94c2e099869520313dce354ca0a705ad70dd83e405f001a16857b6df6f6ff)
15:14:14.275 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.275 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,a5d9bc2af2be88e8742d2f9283e2310e1ce8dfa3349b210530fbe427cd9bf909)
15:14:14.276 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 128720
15:14:14.278 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,a5d9bc2af2be88e8742d2f9283e2310e1ce8dfa3349b210530fbe427cd9bf909)
15:14:14.278 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.278 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,a5d9bc2af2be88e8742d2f9283e2310e1ce8dfa3349b210530fbe427cd9bf909)
15:14:14.279 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.279 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,4b00318579ebaef5ff098adc0d33a7725a5e07dfd5553fff93531bb82df0c837)
15:14:14.279 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.280 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,ca1ace04d271543f67c553d05ff8664ebc91a48be39b77693323a10b9a87e7b7)
15:14:14.280 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 89220
15:14:14.280 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,ca1ace04d271543f67c553d05ff8664ebc91a48be39b77693323a10b9a87e7b7)
15:14:14.280 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.281 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,ca1ace04d271543f67c553d05ff8664ebc91a48be39b77693323a10b9a87e7b7)
15:14:14.281 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.282 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c3acad01a633a018d5750039f7fdb9769046f3cff87b29c3bd1eb4700b6df0ba)
15:14:14.282 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 36864
15:14:14.283 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c3acad01a633a018d5750039f7fdb9769046f3cff87b29c3bd1eb4700b6df0ba)
15:14:14.283 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 101508
15:14:14.283 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c3acad01a633a018d5750039f7fdb9769046f3cff87b29c3bd1eb4700b6df0ba)
15:14:14.283 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 126976
15:14:14.284 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c3acad01a633a018d5750039f7fdb9769046f3cff87b29c3bd1eb4700b6df0ba)
15:14:14.285 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.285 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,37c577c97862148f6597d8eaba075300c4fb982e0f064f52f3bf2b09c571ddd6)
15:14:14.285 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 60264
15:14:14.286 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,37c577c97862148f6597d8eaba075300c4fb982e0f064f52f3bf2b09c571ddd6)
15:14:14.286 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.287 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,37c577c97862148f6597d8eaba075300c4fb982e0f064f52f3bf2b09c571ddd6)
15:14:14.287 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.288 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,0644ed79a6e26e0ba3c964cb71bca2913cc269cf8019902bc8a214a1f8f56d1a)
15:14:14.288 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 50896
15:14:14.289 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,0644ed79a6e26e0ba3c964cb71bca2913cc269cf8019902bc8a214a1f8f56d1a)
15:14:14.290 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 126976
15:14:14.290 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,0644ed79a6e26e0ba3c964cb71bca2913cc269cf8019902bc8a214a1f8f56d1a)
15:14:14.290 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.291 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,56bbda91e0811e9d2401b246bc1fb36918825027a7992a669fdbbb5f122b2c88)
15:14:14.291 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 73728
15:14:14.292 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,56bbda91e0811e9d2401b246bc1fb36918825027a7992a669fdbbb5f122b2c88)
15:14:14.292 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.293 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,56bbda91e0811e9d2401b246bc1fb36918825027a7992a669fdbbb5f122b2c88)
15:14:14.294 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.295 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,1493ee1d4c38f8f83d0d849f8cd7e1d5c22bb577c59543c4b025ad36d453b9c0)
15:14:14.295 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 53248
15:14:14.295 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,1493ee1d4c38f8f83d0d849f8cd7e1d5c22bb577c59543c4b025ad36d453b9c0)
15:14:14.295 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 126976
15:14:14.297 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,1493ee1d4c38f8f83d0d849f8cd7e1d5c22bb577c59543c4b025ad36d453b9c0)
15:14:14.298 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.298 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,2d335829f101714de843acf9fbe6af33506754bacf63f74b27d085a6e701b2a3)
15:14:14.298 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 40960
15:14:14.299 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,2d335829f101714de843acf9fbe6af33506754bacf63f74b27d085a6e701b2a3)
15:14:14.299 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.301 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,2d335829f101714de843acf9fbe6af33506754bacf63f74b27d085a6e701b2a3)
15:14:14.302 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.302 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,94ece7d2eb17dcbe94a19ba43e922f7a13d1498e1d8cbbaede2fae0366f699b1)
15:14:14.302 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 102400
15:14:14.303 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,94ece7d2eb17dcbe94a19ba43e922f7a13d1498e1d8cbbaede2fae0366f699b1)
15:14:14.303 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 126976
15:14:14.304 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,94ece7d2eb17dcbe94a19ba43e922f7a13d1498e1d8cbbaede2fae0366f699b1)
15:14:14.304 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.305 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,6343f2f404177bf9e46ad60f458842a8b844eda8765188d9aca4e775537da546)
15:14:14.305 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 60264
15:14:14.305 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,6343f2f404177bf9e46ad60f458842a8b844eda8765188d9aca4e775537da546)
15:14:14.305 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.307 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,6343f2f404177bf9e46ad60f458842a8b844eda8765188d9aca4e775537da546)
15:14:14.307 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.307 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,4f444aac666b298595ab39154115eba2f4a7682a642db77fa2fcbfaefb3d021f)
15:14:14.308 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 47976
15:14:14.308 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,4f444aac666b298595ab39154115eba2f4a7682a642db77fa2fcbfaefb3d021f)
15:14:14.308 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 126976
15:14:14.309 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,4f444aac666b298595ab39154115eba2f4a7682a642db77fa2fcbfaefb3d021f)
15:14:14.310 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.311 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,1b48e198631ebb44bcdb89f2a7522f74aa19b45a62031a865052a38468a7e9b0)
15:14:14.311 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 75188
15:14:14.312 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,1b48e198631ebb44bcdb89f2a7522f74aa19b45a62031a865052a38468a7e9b0)
15:14:14.312 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.313 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,1b48e198631ebb44bcdb89f2a7522f74aa19b45a62031a865052a38468a7e9b0)
15:14:14.313 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.313 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,4210e2794a33bc901d33eb90dfe7db86d5a300ab41b2c860eade743d204ce9c2)
15:14:14.313 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 94208
15:14:14.314 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,4210e2794a33bc901d33eb90dfe7db86d5a300ab41b2c860eade743d204ce9c2)
15:14:14.314 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 126976
15:14:14.315 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,4210e2794a33bc901d33eb90dfe7db86d5a300ab41b2c860eade743d204ce9c2)
15:14:14.316 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.316 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,02c5804ab36cc2d6dcafe1b55c1630a6987ced9df2dcd85c8f993f700f84a532)
15:14:14.317 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 49152
15:14:14.317 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,02c5804ab36cc2d6dcafe1b55c1630a6987ced9df2dcd85c8f993f700f84a532)
15:14:14.317 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.318 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,02c5804ab36cc2d6dcafe1b55c1630a6987ced9df2dcd85c8f993f700f84a532)
15:14:14.319 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.320 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,eedfc56c15fa17dd741e55cf1cf0d9508a3e5c2491db314101a1acb1a894811a)
15:14:14.320 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 103860
15:14:14.321 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,eedfc56c15fa17dd741e55cf1cf0d9508a3e5c2491db314101a1acb1a894811a)
15:14:14.321 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 126976
15:14:14.322 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,eedfc56c15fa17dd741e55cf1cf0d9508a3e5c2491db314101a1acb1a894811a)
15:14:14.322 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.322 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,93237ca31cf9a295979ceb428ff2be949758b8cb4b41efea3ff59c26f9961047)
15:14:14.323 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 40960
15:14:14.323 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,93237ca31cf9a295979ceb428ff2be949758b8cb4b41efea3ff59c26f9961047)
15:14:14.323 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.324 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,93237ca31cf9a295979ceb428ff2be949758b8cb4b41efea3ff59c26f9961047)
15:14:14.325 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.325 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,10555290c49ae4c08929a567f46f9cda6e09fa0681e91b3c3c0e2edeb4c1961e)
15:14:14.325 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 53248
15:14:14.325 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,10555290c49ae4c08929a567f46f9cda6e09fa0681e91b3c3c0e2edeb4c1961e)
15:14:14.325 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 126976
15:14:14.328 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,10555290c49ae4c08929a567f46f9cda6e09fa0681e91b3c3c0e2edeb4c1961e)
15:14:14.329 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.329 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9264b8bb2a9efe5a63230189790c875d0cf39b14fe934ae9c5ad52704a48b442)
15:14:14.330 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 8192
15:14:14.332 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9264b8bb2a9efe5a63230189790c875d0cf39b14fe934ae9c5ad52704a48b442)
15:14:14.332 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 90112
15:14:14.333 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9264b8bb2a9efe5a63230189790c875d0cf39b14fe934ae9c5ad52704a48b442)
15:14:14.334 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 98304
15:14:14.335 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9264b8bb2a9efe5a63230189790c875d0cf39b14fe934ae9c5ad52704a48b442)
15:14:14.336 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 106496
15:14:14.337 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9264b8bb2a9efe5a63230189790c875d0cf39b14fe934ae9c5ad52704a48b442)
15:14:14.337 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 114688
15:14:14.338 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9264b8bb2a9efe5a63230189790c875d0cf39b14fe934ae9c5ad52704a48b442)
15:14:14.338 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 122880
15:14:14.338 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9264b8bb2a9efe5a63230189790c875d0cf39b14fe934ae9c5ad52704a48b442)
15:14:14.338 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.341 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,9264b8bb2a9efe5a63230189790c875d0cf39b14fe934ae9c5ad52704a48b442)
15:14:14.341 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.342 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c1782b8890d68fca84730e367572bc3bfa30ef281a2736e0dc62ef1e2f0988c6)
15:14:14.342 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 20480
15:14:14.342 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c1782b8890d68fca84730e367572bc3bfa30ef281a2736e0dc62ef1e2f0988c6)
15:14:14.342 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 28672
15:14:14.344 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c1782b8890d68fca84730e367572bc3bfa30ef281a2736e0dc62ef1e2f0988c6)
15:14:14.344 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 36864
15:14:14.345 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c1782b8890d68fca84730e367572bc3bfa30ef281a2736e0dc62ef1e2f0988c6)
15:14:14.346 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 53248
15:14:14.346 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c1782b8890d68fca84730e367572bc3bfa30ef281a2736e0dc62ef1e2f0988c6)
15:14:14.346 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 61440
15:14:14.347 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c1782b8890d68fca84730e367572bc3bfa30ef281a2736e0dc62ef1e2f0988c6)
15:14:14.347 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 110592
15:14:14.348 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c1782b8890d68fca84730e367572bc3bfa30ef281a2736e0dc62ef1e2f0988c6)
15:14:14.348 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 118784
15:14:14.348 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c1782b8890d68fca84730e367572bc3bfa30ef281a2736e0dc62ef1e2f0988c6)
15:14:14.348 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 126976
15:14:14.350 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,c1782b8890d68fca84730e367572bc3bfa30ef281a2736e0dc62ef1e2f0988c6)
15:14:14.351 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.351 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,d86013687dcd7fe2481499bb543005c6ae4fc0573e420b7987ef8046d0537e46)
15:14:14.351 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 32768
15:14:14.352 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,d86013687dcd7fe2481499bb543005c6ae4fc0573e420b7987ef8046d0537e46)
15:14:14.352 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 49152
15:14:14.352 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,d86013687dcd7fe2481499bb543005c6ae4fc0573e420b7987ef8046d0537e46)
15:14:14.353 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 57344
15:14:14.353 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,d86013687dcd7fe2481499bb543005c6ae4fc0573e420b7987ef8046d0537e46)
15:14:14.353 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 65536
15:14:14.354 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,d86013687dcd7fe2481499bb543005c6ae4fc0573e420b7987ef8046d0537e46)
15:14:14.354 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 81920
15:14:14.355 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,d86013687dcd7fe2481499bb543005c6ae4fc0573e420b7987ef8046d0537e46)
15:14:14.355 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 98304
15:14:14.356 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,d86013687dcd7fe2481499bb543005c6ae4fc0573e420b7987ef8046d0537e46)
15:14:14.357 DEBUG io.findify.s3mock.ChunkBuffer - not enough data to pull chunk: chunkSize = 131072, bufferSize = 131072
15:14:14.357 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(131072,88,d86013687dcd7fe2481499bb543005c6ae4fc0573e420b7987ef8046d0537e46)
15:14:14.358 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=131072
15:14:14.381 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(17038,87,5bc0f5e6a6af3a02bee4a9fed05495b2912555efad743d9d003785c119dd2c1c)
15:14:14.381 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=17038
15:14:14.382 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,c45cbc76ca5e1164d8013222555e8ad7ca31874ba574ac1774488d1d5412d79c)
15:14:14.382 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:14.391 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://getput/foolarge to C:\Users\DMYTRO~1\AppData\Local\Temp\886176408114888458/getput/foolarge, bytes = 3031694
15:14:14.446 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=getput, path=foolarge
15:14:14.446 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://getput/foolarge
[WARN] [03/21/2017 15:14:14.490] [sqsmock-akka.actor.default-dispatcher-5] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Connection: Keep-Alive' is ignored, illegal RawHeader
[WARN] [03/21/2017 15:14:14.490] [sqsmock-akka.actor.default-dispatcher-5] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: application/octet-stream' is ignored, illegal RawHeader
15:14:14.636 INFO  io.findify.s3mock.route.CreateBucket - PUT bucket tbucket
15:14:14.638 DEBUG i.f.s3mock.provider.FileProvider - creating bucket tbucket
Mar 21, 2017 3:14:14 PM com.amazonaws.services.s3.AmazonS3Client putObject
WARNING: No content length specified for stream data.  Stream contents will be buffered in memory and could result in out of memory errors.
15:14:14.645 INFO  io.findify.s3mock.route.PutObject - put object tbucket/taggedobj (signed)
15:14:14.648 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(7,84,0bdc938293baaea0f9ded25ba19eb55774f681028ae97e16a58fc06606a36b68)
15:14:14.648 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=7
15:14:14.649 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,885f3488bdd85798fc319871b2896805ab59183e6b91d64f40e60f8fb2f1e3df)
15:14:14.649 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:14.656 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://tbucket/taggedobj to C:\Users\DMYTRO~1\AppData\Local\Temp\886176408114888458/tbucket/taggedobj, bytes = 7
15:14:14.713 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=tbucket, path=taggedobj
15:14:14.716 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://tbucket/taggedobj
[WARN] [03/21/2017 15:14:14.764] [sqsmock-akka.actor.default-dispatcher-3] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Connection: Keep-Alive' is ignored, illegal RawHeader
[WARN] [03/21/2017 15:14:14.764] [sqsmock-akka.actor.default-dispatcher-3] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: application/xml; charset=utf-8' is ignored, illegal RawHeader
15:14:14.772 INFO  io.findify.s3mock.route.PutObject - put object tbucket/taggedobj (signed)
15:14:14.775 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(12,84,72a173f67576049fb9ac5aac00c1bef982d4b4a83ffeb3c553b91cde91c84aff)
15:14:14.775 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=12
15:14:14.776 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,cecde26f26b65e930b447844d7aceea396287d4b4071cca70597c5fdb63c2eab)
15:14:14.776 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
15:14:14.778 DEBUG i.f.s3mock.provider.FileProvider - writing file for s3://tbucket/taggedobj to C:\Users\DMYTRO~1\AppData\Local\Temp\886176408114888458/tbucket/taggedobj, bytes = 12
15:14:14.816 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=tbucket, path=taggedobj
15:14:14.816 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://tbucket/taggedobj
[WARN] [03/21/2017 15:14:14.854] [sqsmock-akka.actor.default-dispatcher-6] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Connection: Keep-Alive' is ignored, illegal RawHeader
[WARN] [03/21/2017 15:14:14.854] [sqsmock-akka.actor.default-dispatcher-6] [akka.actor.ActorSystemImpl(sqsmock)] Explicitly set HTTP header 'Content-Type: application/xml; charset=utf-8' is ignored, illegal RawHeader
15:14:14.861 DEBUG io.findify.s3mock.route.GetObject - get object: bucket=aws-404, path=foo
15:14:14.862 DEBUG i.f.s3mock.provider.FileProvider - reading object for s://aws-404/foo
15:14:14.868 INFO  io.findify.s3mock.route.PutObject - put object aws-404/foo (signed)
15:14:14.871 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(7,84,32fd350020a2c8f3d91025ee317627abc3598ac488588cfc6d766082d731126a)
15:14:14.871 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=7
15:14:14.872 DEBUG io.findify.s3mock.ChunkBuffer - read header: Header(0,84,8aae64833ecf462b9edd5bf12b0883c9c8c9997b637efc9d68b09651941b32ca)
15:14:14.872 DEBUG io.findify.s3mock.ChunkBuffer - pulled chunk, size=0
[ERROR] [SECURITY][03/21/2017 15:16:07.104] [sqsmock-akka.actor.default-dispatcher-9] [akka.actor.ActorSystemImpl(sqsmock)] Uncaught error from thread [sqsmock-akka.actor.default-dispatcher-9] shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[sqsmock]
Uncaught error from thread [sqsmock-akka.actor.default-dispatcher-9] shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled
java.lang.OutOfMemoryError: GC overhead limit exceeded
        at akka.dispatch.Envelope$.apply(AbstractDispatcher.scala:27)
        at akka.actor.Cell$class.sendMessage(ActorCell.scala:295)
        at akka.actor.ActorCell.sendMessage(ActorCell.scala:374)
        at akka.actor.RepointableActorRef.$bang(RepointableActorRef.scala:171)
        at akka.stream.impl.fusing.GraphInterpreterShell$$anonfun$interpreter$1.apply(ActorGraphInterpreter.scala:330)
        at akka.stream.impl.fusing.GraphInterpreterShell$$anonfun$interpreter$1.apply(ActorGraphInterpreter.scala:326)
        at akka.stream.stage.GraphStageLogic$$anon$2.invoke(GraphStage.scala:885)
        at akka.stream.stage.TimerGraphStageLogic$$anon$3.run(GraphStage.scala:1187)
        at akka.actor.LightArrayRevolverScheduler$$anon$2$$anon$1.run(LightArrayRevolverScheduler.scala:102)
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
GetPutObjectTest:
[info] s3 mock
[info] - should put object
[info] - should be able to post data
[info] - should put objects in subdirs
[info] - should not drop \r\n symbols
[info] - should put & get large binary blobs
[info] tagging
[info] - should store tags and spit them back on get tagging requests
[info] - should be OK with retrieving tags for un-tagged objects
[info] get
[info] - should produce NoSuchBucket if bucket does not exist
[info] put
[info] - should produce NoSuchBucket if bucket does not exist
[info] io.findify.s3mock.GetPutObjectTest *** ABORTED ***
[info]   java.lang.OutOfMemoryError: GC overhead limit exceeded
[info]   at scala.collection.mutable.ListBuffer.$plus$eq(ListBuffer.scala:174)
[info]   at scala.collection.mutable.ListBuffer.$plus$eq(ListBuffer.scala:45)
[info]   at scala.collection.generic.GenTraversableFactory.fill(GenTraversableFactory.scala:90)
[info]   at scala.util.Random.nextString(Random.scala:89)
[info]   at io.findify.s3mock.GetPutObjectTest$$anonfun$12.apply(GetPutObjectTest.scala:91)
[info]   at io.findify.s3mock.GetPutObjectTest$$anonfun$12.apply(GetPutObjectTest.scala:90)
[info]   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
[info]   ...
java.lang.OutOfMemoryError: GC overhead limit exceeded

On Linux I couldn't reproduce this.

Also, I needed to make several changes to fix test failures on Windows caused by a couple of other issues. I think I will commit them later.

Thanks

Bind to a random port and get its number

Currently it seems possible to bind to a random port by setting the port to 0; however, there is no way to get that port number back in order to connect to it.

This would be handy when running in CI systems where one cannot rely on a fixed port being free.
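As a stopgap, one can probe for a free port up front and pass it to the builder. This is a sketch of that workaround, not a feature of s3mock itself; the class name is hypothetical, and the probe is racy in principle (another process could grab the port between the probe and the bind), though usually fine for tests:

import java.io.IOException;
import java.net.ServerSocket;

import io.findify.s3mock.S3Mock;

public class RandomPortWorkaround {
    public static void main(String[] args) throws IOException {
        // Ask the OS for a currently free port, then release it immediately.
        int freePort;
        try (ServerSocket probe = new ServerSocket(0)) {
            freePort = probe.getLocalPort();
        }
        // Hand the probed port to s3mock; a small race window remains between
        // closing the probe socket and s3mock binding the port.
        S3Mock api = new S3Mock.Builder().withPort(freePort).withInMemoryBackend().build();
        api.start();
        System.out.println("s3mock listening on http://127.0.0.1:" + freePort);
    }
}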

listObjects(String) incorrectly returns dot-prefixed keys

With S3Mock running on Java 8, when I first do:

s3Client.putObject("myBucket", "testKey", ...);

and then run s3Client.listObjects("myBucket"), the returned XML result incorrectly contains both the object with key testKey and an additional object with the dot-prefixed key .testKey:

<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Name>myBucket</Name>
  <Prefix></Prefix>
  <KeyCount>2</KeyCount>
  <MaxKeys>1000</MaxKeys>
  <IsTruncated>false</IsTruncated>
    <Contents>
      <Key>.testKey</Key>
      <LastModified>2016-12-05T09:26:39Z</LastModified>
      <ETag>0</ETag>
      <Size>832</Size>
      <StorageClass>STANDARD</StorageClass>
    </Contents>
    <Contents>
      <Key>testKey</Key>
      <LastModified>2016-12-05T09:26:39Z</LastModified>
      <ETag>0</ETag>
      <Size>5209</Size>
      <StorageClass>STANDARD</StorageClass>
    </Contents>
</ListBucketResult>

The Java process does not terminate after executing S3Mock.stop() from main method

I'm trying to use S3Mock in a standalone Java program with the following main method.

public static void main(String[] args) {
    S3Mock api = new S3Mock.Builder().withPort(8001).withInMemoryBackend().build();
    api.start();
    api.stop();
}

The expectation here is that the Java program terminates once the main method completes.

The process hangs even after executing api.stop();

Is there a way to solve this issue?
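If the goal is simply a clean exit after stop(), one possible workaround is to force the JVM down explicitly. This assumes the hang is caused by leftover non-daemon threads of the underlying server, which is a guess rather than documented behaviour:

public static void main(String[] args) {
    S3Mock api = new S3Mock.Builder().withPort(8001).withInMemoryBackend().build();
    api.start();
    api.stop();
    // Workaround (assumption): non-daemon threads appear to keep the JVM alive,
    // so terminate it explicitly once the mock has been stopped.
    System.exit(0);
}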

Typesafe config fails with s3mock

Typesafe config's load() method fails if I start s3mock. I'm not sure what the link between the two is; maybe classloader issues?

This is how I start s3mock
new S3Mock.Builder().withPort(8001).withInMemoryBackend().build().start

Other tests that follow load configuration using ConfigFactory.load(); load() fails only if I have started s3mock.
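A hypothetical minimal reproduction, assuming typesafe-config is on the classpath; the failure mode is taken from the report above, not verified here:

import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

import io.findify.s3mock.S3Mock;

public class ConfigRepro {
    public static void main(String[] args) {
        // Per the report, load() works when the next line is commented out.
        new S3Mock.Builder().withPort(8001).withInMemoryBackend().build().start();
        Config config = ConfigFactory.load(); // reported to fail once s3mock is running
        System.out.println(config.root().render());
    }
}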

Emulate s3 behaviour when listing by prefix

Hi,

I am testing a piece of code that depends on the S3 behaviour where listing by prefix returns all objects matching that prefix (not just up to the first /). As far as I can tell, s3mock doesn't do that. I would like to add it, if that is OK with you.
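For illustration, a sketch of the expected semantics, assuming an AmazonS3 client s3Client pointed at the mock as in the other examples; bucket and key names are hypothetical:

s3Client.putObject("bucket", "logs/2017/01/a.txt", "x");
s3Client.putObject("bucket", "logs/2017/02/b.txt", "y");

// With no delimiter set, real S3 returns every key under the prefix,
// including keys nested below further "/" separators.
ObjectListing listing = s3Client.listObjects("bucket", "logs/");
for (S3ObjectSummary summary : listing.getObjectSummaries()) {
    System.out.println(summary.getKey()); // logs/2017/01/a.txt, logs/2017/02/b.txt
}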

Could you please add instructions for building this project with tests?

I ran sbt clean test and all tests failed with a java.lang.NullPointerException:

[info] io.findify.s3mock.awscli.GetObjectTest *** ABORTED ***
[info]   java.lang.NullPointerException:
[info]   at com.typesafe.config.ConfigException.<init>(ConfigException.java:23)
[info]   at com.typesafe.config.ConfigException$BadPath.<init>(ConfigException.java:198)
[info]   at com.typesafe.config.ConfigException$BadPath.<init>(ConfigException.java:204)
[info]   at com.typesafe.config.impl.PathParser.parsePathExpression(PathParser.java:170)
[info]   at com.typesafe.config.impl.PathParser.parsePathNodeExpression(PathParser.java:85)
[info]   at com.typesafe.config.impl.PathParser.parsePathNodeExpression(PathParser.java:79)
[info]   at com.typesafe.config.impl.ConfigDocumentParser$ParseContext.parseKey(ConfigDocumentParser.java:283)
[info]   at com.typesafe.config.impl.ConfigDocumentParser$ParseContext.parseObject(ConfigDocumentParser.java:397)
[info]   at com.typesafe.config.impl.ConfigDocumentParser$ParseContext.parse(ConfigDocumentParser.java:595)
[info]   at com.typesafe.config.impl.ConfigDocumentParser.parse(ConfigDocumentParser.java:14)
[info]   ...

Metadata is not copied when calling copyObject

According to the javadoc of copyObject for AmazonS3:

By default, all object metadata for the source object except server-side-encryption storage-class and website-redirect-location are copied to the new destination object, unless new object metadata in the specified {@link CopyObjectRequest} is provided.

The current implementation does not copy the metadata of the copied object.
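A minimal repro sketch, assuming an AmazonS3 client s3Client as in the other examples; bucket names and the metadata key are hypothetical:

byte[] body = "contents".getBytes(java.nio.charset.StandardCharsets.UTF_8);
ObjectMetadata meta = new ObjectMetadata();
meta.addUserMetadata("owner", "tests");
meta.setContentLength(body.length);
s3Client.putObject(new PutObjectRequest(
        "bucket-1", "test.txt", new java.io.ByteArrayInputStream(body), meta));

s3Client.copyObject("bucket-1", "test.txt", "bucket-2", "test.txt");

// Per the javadoc quoted above, the user metadata should carry over to the copy;
// with the current implementation the returned map is empty.
ObjectMetadata copied = s3Client.getObjectMetadata("bucket-2", "test.txt");
System.out.println(copied.getUserMetadata());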

Implement copyObject

Sorry, can't find wish list.

For testing purposes, the copyObject method needs to be implemented.
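
For reference, the call that fails is a plain copy (sketch; bucket and key names hypothetical, s3Client as in the other examples):

s3Client.copyObject("source-bucket", "test.txt", "dest-bucket", "test.txt");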

For now, I receive this error:

com.amazonaws.SdkClientException: Failed to parse XML document with handler class com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser$CopyObjectResultHandler

	at com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser.parseXmlInputStream(XmlResponsesSaxParser.java:128)
	at com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser.parseCopyObjectResponse(XmlResponsesSaxParser.java:427)
	at com.amazonaws.services.s3.model.transform.Unmarshallers$CopyObjectUnmarshaller.unmarshall(Unmarshallers.java:221)
	at com.amazonaws.services.s3.model.transform.Unmarshallers$CopyObjectUnmarshaller.unmarshall(Unmarshallers.java:217)
	at com.amazonaws.services.s3.internal.S3XmlResponseHandler.handle(S3XmlResponseHandler.java:62)
	at com.amazonaws.services.s3.internal.ResponseHeaderHandlerChain.handle(ResponseHeaderHandlerChain.java:44)
	at com.amazonaws.services.s3.internal.ResponseHeaderHandlerChain.handle(ResponseHeaderHandlerChain.java:30)
	at com.amazonaws.http.response.AwsResponseHandlerAdapter.handle(AwsResponseHandlerAdapter.java:70)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleResponse(AmazonHttpClient.java:1442)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1149)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:962)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:675)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:649)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:632)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$300(AmazonHttpClient.java:600)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:582)
	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:446)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4031)
	at com.amazonaws.services.s3.AmazonS3Client.copyObject(AmazonS3Client.java:1716)
	at com.amazonaws.services.s3.AmazonS3Client.copyObject(AmazonS3Client.java:1673)
	at co.list3d.viarlife.service.S3ServiceTest.testCopyObjectSuccess(S3ServiceTest.java:187)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:117)
	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:42)
	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:262)
	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:84)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: org.xml.sax.SAXParseException; lineNumber: 1; columnNumber: 1; Premature end of file.
	at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:203)
	at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.fatalError(ErrorHandlerWrapper.java:177)
	at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:400)
	at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:327)
	at com.sun.org.apache.xerces.internal.impl.XMLScanner.reportFatalError(XMLScanner.java:1472)
	at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:1014)
	at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:602)
	at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:112)
	at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:505)
	at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:841)
	at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:770)
	at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)
	at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
	at com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser.parseXmlInputStream(XmlResponsesSaxParser.java:114)
	... 52 more

Scala 2.12 support

If someone is interested, this is on my to-do list, but there is a single dependency (better-files) which has not yet been released for 2.12.

getETag from getObjectMetadata returns null

I tried this basic code in a test.

client.createBucket("testbucket");
client.putObject("testbucket", "file/name", "contents");
ObjectMetadata data = client.getObjectMetadata("testbucket", "file/name");
assertNotNull(data.getETag());

The last line fails. It should return the MD5 checksum of the file's contents.
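For reference, real S3 reports the hex-encoded MD5 of the content as the ETag for simple (non-multipart) PUTs; a sketch of the expected assertion, assuming commons-codec is on the classpath:

// Hex MD5 of the uploaded content is what real S3 returns as the ETag
// for non-multipart uploads.
String expectedETag = org.apache.commons.codec.digest.DigestUtils.md5Hex("contents");
assertEquals(expectedETag, data.getETag()); // currently fails: getETag() returns null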

Java example failure, AWSCredentials related

I tried your Java example. The target file '/tmp/s3/testbucket/file/name' was created, but it has no content:

[INFO] [08/19/2016 03:56:28.349] [sqsmock-akka.actor.default-dispatcher-3][akka.actor.ActorSystemImpl(sqsmock)] request: HttpRequest(HttpMethod(PUT),http://127.0.0.1:8001/testbucket/,List(Host: 127.0.0.1:8001, user-agent: aws-sdk-java/1.11.15 Mac_OS_X/10.11.6 Java_HotSpot(TM)_64-Bit_Server_VM/25.45-b02/1.8.0_45, amz-sdk-invocation-id: ac7dd0da-5eed-2f2c-94dc-b17ab7cccb20, amz-sdk-retry: 0/0/500, Connection: Keep-Alive, Timeout-Access: <function1>),HttpEntity.Strict(application/octet-stream,ByteString()),HttpProtocol(HTTP/1.1))
 2016-08-19 03:56:28.454 [sqsmock-akka.actor.default-dispatcher-3] TEST DEBUG i.f.s.provider.FileProvider - crating bucket testbucket
[INFO] [08/19/2016 03:56:28.531] [sqsmock-akka.actor.default-dispatcher- [akka.actor.ActorSystemImpl(sqsmock)] request: HttpRequest(HttpMethod(PUT),http://127.0.0.1:8001/testbucket/file/name,List(Host: 127.0.0.1:8001, user-agent: aws-sdk-java/1.11.15 Mac_OS_X/10.11.6 Java_HotSpot(TM)_64-Bit_Server_VM/25.45-b02/1.8.0_45, amz-sdk-invocation-id: 1e6f40c7-3d9c-bed2-ec74-d0b75c41b8ae, amz-sdk-retry: 0/0/500, Connection: Keep-Alive, Expect: 100-continue, Timeout-Access: <function1>),HttpEntity.Default(text/plain; charset=UTF-8,8 bytes total),HttpProtocol(HTTP/1.1))
2016-08-19 03:56:28.571 [sqsmock-akka.actor.default-dispatcher-3] TEST DEBUG i.f.s.provider.FileProvider - writing file for s3://testbucket/file/name to /tmp/s3/testbucket/file/name, bytes = 0

bytes = 0 according to the log.

Cannot find akka.stream.Graph.module method

Using 0.2.3 in a Scala 2.11.11 Play application, I'm getting this error out of the box:

java.lang.NoSuchMethodError: akka.stream.Graph.module()Lakka/stream/impl/StreamLayout$Module;

	at akka.stream.Fusing$FusedGraph$.unapply(Fusing.scala:56)
	at akka.stream.impl.fusing.Fusing$.aggressive(Fusing.scala:35)
	at akka.stream.Fusing$.aggressive(Fusing.scala:34)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at akka.http.impl.util.StreamUtils$$anon$13.aggressive(StreamUtils.scala:259)
	at akka.http.impl.util.StreamUtils$.fuseAggressive(StreamUtils.scala:272)
	at akka.http.scaladsl.HttpExt.fuseServerFlow(Http.scala:87)
	at akka.http.scaladsl.HttpExt.bindAndHandle(Http.scala:181)
	at io.findify.s3mock.S3Mock.start(S3Mock.scala:62)

copyObject throws exception when destinationKey contains folder that does not exist

Code:

s3Client.createBucket("bucket-1");
s3Client.createBucket("bucket-2");

PutObjectResult putObjectResult = s3Client.putObject("bucket-1", "test.txt", "contents");
CopyObjectResult res = s3Client.copyObject("bucket-1", "test.txt", "bucket-2", "folder/test.txt");

Exception:
cannot copy object /bucket-1/test.txt: java.nio.file.NoSuchFileException: /bucket-2/folder/test.txt

Reason:
better.files.File.copyTo, which is used for copying objects, does not create destination folders automatically. Calling better.files.File.createIfNotExists(createParents = true) on destFile may solve it.

ObjectListing#getCommonPrefixes order is not alphabetical

Using the AWS SDK for Java, ObjectListing#getCommonPrefixes returns common prefixes in non-alphabetical order:

"S3DriverSpec/30/"
"S3DriverSpec/20/"
"S3DriverSpec/60/"
"S3DriverSpec/50/"
"S3DriverSpec/10/"
"S3DriverSpec/40/"

In contrast, real S3 returns them in lexicographic order.
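For context, a sketch of a delimiter listing that produces such common prefixes, assuming an AmazonS3 client s3Client as in the other examples; bucket and prefix names are hypothetical:

ObjectListing listing = s3Client.listObjects(new ListObjectsRequest()
        .withBucketName("bucket")
        .withPrefix("S3DriverSpec/")
        .withDelimiter("/"));
// Real S3 returns these prefixes sorted in UTF-8 binary order;
// s3mock currently returns them unsorted.
for (String prefix : listing.getCommonPrefixes()) {
    System.out.println(prefix);
}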
