dwhjames / aws-wrap
Asynchronous Scala Clients for Amazon Web Services
Home Page: https://dwhjames.github.io/aws-wrap/
License: Apache License 2.0
Amazon apparently requires that you not pass a region when you're using the default availability zone. The code always includes the region, and therefore we get an error when we try to create a bucket in the default zone.
Can you please make the dynamoDBClient val in ConcurrentBatchWriter.scala public, as in SingleThreadedBatchWriter.scala?
I'd like to be able to configure the DynamoDB client specifically for my use case.
Thanks!
[info] FutureTransferSpec:
[info] FutureTransfer
[info] - should upload a file *** FAILED ***
[info] com.amazonaws.AmazonClientException: Unable to verify integrity of data upload. Client calculated content hash (contentMD5: UBqk24T0DmpFPfPUVDOKPA== in base 64) didn't match hash (etag: 74c7b8e2e8c62362011b1a73c583ecb3 in hex) calculated by Amazon S3. You may need to delete the data stored in Amazon S3. (metadata.contentMD5: UBqk24T0DmpFPfPUVDOKPA==, md5DigestStream: null, bucketName: my-s3-bucket-98bfdf06-9475-4d1a-a235-e0eddd5859f9, key: test)
[info] at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1597)
[info] at com.amazonaws.services.s3.transfer.internal.UploadCallable.uploadInOneChunk(UploadCallable.java:131)
[info] at com.amazonaws.services.s3.transfer.internal.UploadCallable.call(UploadCallable.java:123)
[info] at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:139)
[info] at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:47)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:262)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[info] at java.lang.Thread.run(Thread.java:745)
I'd love to send a pull request for the docs, but I don't see them in this repo and I'm not sure how to find the source for the GitHub Pages site.
http://pellucidanalytics.github.io/aws-wrap/doc/dynamodb.html
Add:
import com.pellucid.wrap.dynamodb._
Replace:
override def fromAttributeMap(
    item: mutable.Map[String, AttributeValue]) =
with:
override def fromAttributeMap(
    item: collection.mutable.Map[String, AttributeValue]) =
In the process of trying to do a TransferManager download, I found a bug in the FutureTransfer.listenFor code. In the simple example of
FutureTransfer.listenFor {
  transferManager.download("bucket", "location", new File("foobar"))
}
where "foobar" is a file with permissions of 400, instead of throwing a FileNotFoundException for the bad permissions, it just hangs indefinitely. No amount of mapping, or making use of waitForCompletion(), seemed to help, leading to the conclusion that the Future never completes. I looked a little into the source code and found that in the listenFor function there are cases where the progress event will never fire. In this case, since the progress hadn't changed (the File failed before the transfer started) and the transfer wasn't done, it just hung.
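Until the listener issue is fixed upstream, one defensive workaround is to race the transfer's Future against a timeout, so a progress event that never fires cannot hang the caller forever. This is a sketch of my own, not part of aws-wrap; the timeout value is up to the caller.

```scala
import java.util.concurrent.{Executors, TimeUnit, TimeoutException}
import scala.concurrent.{ExecutionContext, Future, Promise}
import scala.concurrent.duration._

// Race the given Future against a scheduled timeout; whichever completes
// first wins the Promise. The scheduler is shut down once the Future
// completes, so no thread leaks on the happy path.
def withTimeout[A](f: Future[A], timeout: FiniteDuration)
                  (implicit ec: ExecutionContext): Future[A] = {
  val scheduler = Executors.newSingleThreadScheduledExecutor()
  val p = Promise[A]()
  scheduler.schedule(new Runnable {
    def run(): Unit =
      p.tryFailure(new TimeoutException(s"transfer did not complete within $timeout"))
  }, timeout.toMillis, TimeUnit.MILLISECONDS)
  f.onComplete { result => p.tryComplete(result); scheduler.shutdown() }
  p.future
}
```

Usage would look like `withTimeout(FutureTransfer.listenFor { transferManager.download(...) }, 5.minutes)`, turning an indefinite hang into a failed Future.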
I'm trying to incorporate the new CodeCommit API into the project, but when I upgraded to the latest version of the AWS SDK I hit this issue (see compile errors at the bottom of this post).
I traced back to version 1.10.4.1, which is the latest version that still compiles OK. I had a look at the diff between these two versions but can't figure out what the breaking change is:
aws/aws-sdk-java@1.10.4.1...1.10.5
The only way I know to fix this is to avoid 'wrapAsyncMethod' on these methods and hand-code the conversion of the Java Future to a Scala one (not ideal).
This will need to be addressed so that we can move forward with the latest AWS developments and services.
Thanks.
[error] /home/tim/v4/aws-wrap/src/main/scala/dynamodb/dynamodb.scala:45: ambiguous reference to overloaded definition,
[error] both method batchGetItemAsync in class AmazonDynamoDBAsyncClient of type (x$1: java.util.Map[String,com.amazonaws.services.dynamodbv2.model.KeysAndAttributes], x$2: com.amazonaws.handlers.AsyncHandler[com.amazonaws.services.dynamodbv2.model.BatchGetItemRequest,com.amazonaws.services.dynamodbv2.model.BatchGetItemResult])java.util.concurrent.Future[com.amazonaws.services.dynamodbv2.model.BatchGetItemResult]
[error] and method batchGetItemAsync in class AmazonDynamoDBAsyncClient of type (x$1: com.amazonaws.services.dynamodbv2.model.BatchGetItemRequest, x$2: com.amazonaws.handlers.AsyncHandler[com.amazonaws.services.dynamodbv2.model.BatchGetItemRequest,com.amazonaws.services.dynamodbv2.model.BatchGetItemResult])java.util.concurrent.Future[com.amazonaws.services.dynamodbv2.model.BatchGetItemResult]
[error] match expected type (?, com.amazonaws.handlers.AsyncHandler[?,?]) => java.util.concurrent.Future[?]
[error] wrapAsyncMethod(client.batchGetItemAsync, batchGetItemRequest)
[error] ^
[error] /home/tim/v4/aws-wrap/src/main/scala/dynamodb/dynamodb.scala:64: ambiguous reference to overloaded definition,
[error] both method batchWriteItemAsync in class AmazonDynamoDBAsyncClient of type (x$1: java.util.Map[String,java.util.List[com.amazonaws.services.dynamodbv2.model.WriteRequest]], x$2: com.amazonaws.handlers.AsyncHandler[com.amazonaws.services.dynamodbv2.model.BatchWriteItemRequest,com.amazonaws.services.dynamodbv2.model.BatchWriteItemResult])java.util.concurrent.Future[com.amazonaws.services.dynamodbv2.model.BatchWriteItemResult]
[error] and method batchWriteItemAsync in class AmazonDynamoDBAsyncClient of type (x$1: com.amazonaws.services.dynamodbv2.model.BatchWriteItemRequest, x$2: com.amazonaws.handlers.AsyncHandler[com.amazonaws.services.dynamodbv2.model.BatchWriteItemRequest,com.amazonaws.services.dynamodbv2.model.BatchWriteItemResult])java.util.concurrent.Future[com.amazonaws.services.dynamodbv2.model.BatchWriteItemResult]
[error] match expected type (?, com.amazonaws.handlers.AsyncHandler[?,?]) => java.util.concurrent.Future[?]
[error] wrapAsyncMethod(client.batchWriteItemAsync, batchWriteItemRequest)
[error] ^
[error] /home/tim/v4/aws-wrap/src/main/scala/dynamodb/dynamodb.scala:127: ambiguous reference to overloaded definition,
[error] both method deleteTableAsync in class AmazonDynamoDBAsyncClient of type (x$1: String, x$2: com.amazonaws.handlers.AsyncHandler[com.amazonaws.services.dynamodbv2.model.DeleteTableRequest,com.amazonaws.services.dynamodbv2.model.DeleteTableResult])java.util.concurrent.Future[com.amazonaws.services.dynamodbv2.model.DeleteTableResult]
[error] and method deleteTableAsync in class AmazonDynamoDBAsyncClient of type (x$1: com.amazonaws.services.dynamodbv2.model.DeleteTableRequest, x$2: com.amazonaws.handlers.AsyncHandler[com.amazonaws.services.dynamodbv2.model.DeleteTableRequest,com.amazonaws.services.dynamodbv2.model.DeleteTableResult])java.util.concurrent.Future[com.amazonaws.services.dynamodbv2.model.DeleteTableResult]
[error] match expected type (?, com.amazonaws.handlers.AsyncHandler[?,?]) => java.util.concurrent.Future[?]
[error] wrapAsyncMethod(client.deleteTableAsync, deleteTableRequest)
[error] ^
[error] /home/tim/v4/aws-wrap/src/main/scala/dynamodb/dynamodb.scala:143: ambiguous reference to overloaded definition,
[error] both method describeTableAsync in class AmazonDynamoDBAsyncClient of type (x$1: String, x$2: com.amazonaws.handlers.AsyncHandler[com.amazonaws.services.dynamodbv2.model.DescribeTableRequest,com.amazonaws.services.dynamodbv2.model.DescribeTableResult])java.util.concurrent.Future[com.amazonaws.services.dynamodbv2.model.DescribeTableResult]
[error] and method describeTableAsync in class AmazonDynamoDBAsyncClient of type (x$1: com.amazonaws.services.dynamodbv2.model.DescribeTableRequest, x$2: com.amazonaws.handlers.AsyncHandler[com.amazonaws.services.dynamodbv2.model.DescribeTableRequest,com.amazonaws.services.dynamodbv2.model.DescribeTableResult])java.util.concurrent.Future[com.amazonaws.services.dynamodbv2.model.DescribeTableResult]
[error] match expected type (?, com.amazonaws.handlers.AsyncHandler[?,?]) => java.util.concurrent.Future[?]
[error] wrapAsyncMethod(client.describeTableAsync, describeTableRequest)
[error] ^
[error] /home/tim/v4/aws-wrap/src/main/scala/dynamodb/dynamodb.scala:191: ambiguous reference to overloaded definition,
[error] both method listTablesAsync in class AmazonDynamoDBAsyncClient of type (x$1: Integer, x$2: com.amazonaws.handlers.AsyncHandler[com.amazonaws.services.dynamodbv2.model.ListTablesRequest,com.amazonaws.services.dynamodbv2.model.ListTablesResult])java.util.concurrent.Future[com.amazonaws.services.dynamodbv2.model.ListTablesResult]
[error] and method listTablesAsync in class AmazonDynamoDBAsyncClient of type (x$1: String, x$2: com.amazonaws.handlers.AsyncHandler[com.amazonaws.services.dynamodbv2.model.ListTablesRequest,com.amazonaws.services.dynamodbv2.model.ListTablesResult])java.util.concurrent.Future[com.amazonaws.services.dynamodbv2.model.ListTablesResult]
[error] match expected type (?, com.amazonaws.handlers.AsyncHandler[?,?]) => java.util.concurrent.Future[?]
[error] wrapAsyncMethod(client.listTablesAsync, listTablesRequest)
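For what it's worth, the hand-coded conversion mentioned above can be sketched as follows. This is my own adapter, not aws-wrap's actual wrapAsyncMethod; the ambiguity itself can be resolved by selecting the request-taking overload with an explicitly typed lambda:

```scala
import java.util.concurrent.{Future => JFuture}
import scala.concurrent.{Future, Promise}
import com.amazonaws.AmazonWebServiceRequest
import com.amazonaws.handlers.AsyncHandler
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBAsyncClient
import com.amazonaws.services.dynamodbv2.model.{BatchGetItemRequest, BatchGetItemResult}

// Hypothetical adapter: complete a Scala Promise from the SDK's
// AsyncHandler callbacks, discarding the unused Java Future.
def asScalaFuture[Req <: AmazonWebServiceRequest, Res](
    call: (Req, AsyncHandler[Req, Res]) => JFuture[Res],
    request: Req
): Future[Res] = {
  val p = Promise[Res]()
  call(request, new AsyncHandler[Req, Res] {
    override def onError(e: Exception): Unit = p.failure(e)
    override def onSuccess(req: Req, result: Res): Unit = p.success(result)
  })
  p.future
}

// The typed lambda picks the BatchGetItemRequest overload, sidestepping
// the ambiguity introduced by the new Map/String overloads in 1.10.5.
def batchGet(client: AmazonDynamoDBAsyncClient,
             req: BatchGetItemRequest): Future[BatchGetItemResult] =
  asScalaFuture(
    client.batchGetItemAsync(
      _: BatchGetItemRequest,
      _: AsyncHandler[BatchGetItemRequest, BatchGetItemResult]),
    req)
```

The same typed-lambda trick should work for the other four ambiguous call sites in the errors below.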
I have a time series table UserActivity
case class UserActivity(
id: Long,
activity: String,
occurred: Long = System.currentTimeMillis()
)
object UserActivity {
val tableName = "UserActivityV2"
object Attributes {
val id = "Id"
val activity = "Activity"
val occurred = "Occurred"
}
val tableRequest =
new CreateTableRequest()
.withTableName(UserActivity.tableName)
.withProvisionedThroughput(
Schema.provisionedThroughput(10L, 5L))
.withAttributeDefinitions(
Schema.numberAttribute(Attributes.id),
Schema.numberAttribute(Attributes.occurred),
Schema.stringAttribute(Attributes.activity)
)
.withKeySchema(
Schema.hashKey(Attributes.id),
Schema.rangeKey(Attributes.occurred))
.withGlobalSecondaryIndexes(
new GlobalSecondaryIndex()
.withIndexName("ActivitiesOccurred")
.withKeySchema(
Schema.hashKey(Attributes.activity),
Schema.rangeKey(Attributes.occurred))
.withProjection(new Projection().withProjectionType(ProjectionType.KEYS_ONLY))
.withProvisionedThroughput(
Schema.provisionedThroughput(10L, 5L)))
}
Using the AWS UI, I can query the index. Although I didn't find an example for running a query, I came up with:
mapper.countQuery[UserActivity](
mapper.CountQueryMagnet.countQuerySecondaryIndex(
"ActivitiesOccurred",
"sometext",
"Occurred",
QueryCondition.greaterThan(1421023065979l)
))
However this blows up with:
com.amazonaws.AmazonServiceException: Query condition missed key schema element Activity (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: KD4364FBV747MPFBQALD9ONS17VV4KQNSO5AEMVJF66Q9ASUAAJG)
at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:1077)
at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:725)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:460)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:295)
at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:3106)
at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.query(AmazonDynamoDBClient.java:1118)
at com.amazonaws.services.dynamodbv2.AmazonDynamoDBAsyncClient$18.call(AmazonDynamoDBAsyncClient.java:1557)
at com.amazonaws.services.dynamodbv2.AmazonDynamoDBAsyncClient$18.call(AmazonDynamoDBAsyncClient.java:1553)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Is this due to a feature that needs to be implemented upstream? See: aws-amplify/aws-sdk-ios#75
My build.sbt has:
"com.amazonaws" % "aws-java-sdk" % "1.9.14"
That's the latest, at least as of last week.
UPDATE: Here is the serializer
implicit object userActivitySerializer
extends DynamoDBSerializer[UserActivity] {
override val tableName = UserActivity.tableName
override val hashAttributeName = Attributes.id
override val rangeAttributeName = Some[String](Attributes.occurred)
override def primaryKeyOf(userActivity: UserActivity) =
Map(Attributes.id -> userActivity.id,
Attributes.occurred -> userActivity.occurred
)
override def toAttributeMap(userActivity: UserActivity) =
  Map(
    Attributes.id -> userActivity.id,
    Attributes.activity -> userActivity.activity,
    Attributes.occurred -> userActivity.occurred
  )
override def fromAttributeMap(
  item: collection.mutable.Map[String, AttributeValue]) =
  UserActivity(
    id = item(Attributes.id),
    activity = item(Attributes.activity),
    occurred = item(Attributes.occurred)
  )
}
Hmm, looks like there may be a way to identify keys there. That could be my problem.
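For comparison, here is how the same query looks as a raw QueryRequest against the index, which makes the error message concrete: the hash key of the *index* ("Activity") must appear as an EQ condition, with the index's range key ("Occurred") as an optional extra condition. Table, index, and attribute names are taken from the question above; this bypasses the mapper entirely and is only a sketch.

```scala
import com.amazonaws.services.dynamodbv2.model._
import scala.collection.JavaConverters._

// A GSI query must supply an equality condition on the index's hash key;
// omitting it produces "Query condition missed key schema element Activity".
val request = new QueryRequest()
  .withTableName("UserActivityV2")
  .withIndexName("ActivitiesOccurred")
  .withKeyConditions(Map(
    "Activity" -> new Condition()
      .withComparisonOperator(ComparisonOperator.EQ)
      .withAttributeValueList(new AttributeValue().withS("sometext")),
    "Occurred" -> new Condition()
      .withComparisonOperator(ComparisonOperator.GT)
      .withAttributeValueList(new AttributeValue().withN("1421023065979"))
  ).asJava)
```

The countQuery call above fails because "sometext" ends up as a condition on the wrong attribute; whatever API is used, the Activity EQ condition has to be present.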
Hello,
We are using aws-wrap as a dependency in commons-aws. Do you intend to upgrade the AWS dependencies?
If not, can we reuse the code wrapping CloudWatch, S3 and SQS?
Have a nice day.
During certain times of the day (9:00 pm would work, and you can set the time on your Mac to reproduce), our AWS requests come back as having invalid signatures. There must be a timing issue.
Mfglabs/commons-aws is jumping through hoops because of a few missing methods in the S3 client. It'd be great to be able to clean things up.
The 3 methods in particular are:
Long-term, it might make sense to make this library easier to monkey patch.
It would be great if a jar built with the Scala 2.11 compiler were available from Bintray.
When I try to get the contents of a bucket I get exceptions in the parser for containerParser. (aws.s3.S3Parsers line 101)
def publish( topicArn: String, message: String ): Future[PublishResult] = publish(new PublishRequest(topicArn, message))
The method above says it returns a Future[PublishResult]
However, this calls :
def publish( publishRequest: PublishRequest ): Future[PublishResult] = wrapAsyncMethod(client.publishAsync, publishRequest)
which returns wrapAsyncMethod(client.publishAsync, publishRequest)
which actually returns a Future[Nothing]
This means you cannot tell whether you have successfully published to SNS.
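If wrapAsyncMethod is parameterized over the request and result types (an assumption on my part, based on how it is called), then pinning the overload with an explicitly typed lambda should keep the result type from being inferred as Nothing. A sketch of the fixed method inside the SNS wrapper class, where `client` and `wrapAsyncMethod` are the wrapper's existing members:

```scala
import scala.concurrent.Future
import com.amazonaws.handlers.AsyncHandler
import com.amazonaws.services.sns.model.{PublishRequest, PublishResult}

// The typed lambda fixes both type parameters of the async call, so the
// wrapped Future is inferred as Future[PublishResult], not Future[Nothing].
def publish(publishRequest: PublishRequest): Future[PublishResult] =
  wrapAsyncMethod(
    client.publishAsync(
      _: PublishRequest,
      _: AsyncHandler[PublishRequest, PublishResult]),
    publishRequest)
```

With the result type restored, callers can map over the Future (or inspect the failure) to confirm the publish succeeded.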
https://mvnrepository.com/artifact/com.github.dwhjames/aws-wrap shows several versions past 0.8.x that aren't represented here. Are the artifacts there from this codebase, and if so can the related code be posted here?
When trying to parse a Boolean value from DynamoDB into a Scala Boolean using the implicit conversions from the package object, this exception is thrown:
Caused by: java.lang.IllegalArgumentException: For input string: "null"
at scala.collection.immutable.StringLike$class.parseBoolean(StringLike.scala:293) ~[scala-library-2.11.7.jar:na]
at scala.collection.immutable.StringLike$class.toBoolean(StringLike.scala:260) ~[scala-library-2.11.7.jar:na]
at scala.collection.immutable.StringOps.toBoolean(StringOps.scala:30) ~[scala-library-2.11.7.jar:na]
at com.github.dwhjames.awswrap.dynamodb.package$$anonfun$39$$anonfun$apply$7.apply$mcZ$sp(package.scala:266) ~[aws-wrap_2.11-0.8.0.jar:0.8.0]
at com.github.dwhjames.awswrap.dynamodb.package$$anonfun$39$$anonfun$apply$7.apply(package.scala:266) ~[aws-wrap_2.11-0.8.0.jar:0.8.0]
at com.github.dwhjames.awswrap.dynamodb.package$$anonfun$39$$anonfun$apply$7.apply(package.scala:266) ~[aws-wrap_2.11-0.8.0.jar:0.8.0]
at com.github.dwhjames.awswrap.dynamodb.package$.com$github$dwhjames$awswrap$dynamodb$package$$catchAndRethrowConversion(package.scala:221) ~[aws-wrap_2.11-0.8.0.jar:0.8.0]
at com.github.dwhjames.awswrap.dynamodb.package$$anonfun$39.apply(package.scala:266) ~[aws-wrap_2.11-0.8.0.jar:0.8.0]
at com.github.dwhjames.awswrap.dynamodb.package$$anonfun$39.apply(package.scala:266) ~[aws-wrap_2.11-0.8.0.jar:0.8.0]
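The stack trace suggests the conversion calls toBoolean on the attribute's string field, which is null when the value was stored as a native BOOL (or as a number). A defensive conversion of my own, not the library's code, might look like this:

```scala
import com.amazonaws.services.dynamodbv2.model.AttributeValue

// Prefer the native BOOL field, then fall back to numeric and string
// encodings, instead of calling toBoolean on a possibly-null getS.
def attributeToBoolean(av: AttributeValue): Boolean =
  Option(av.getBOOL).map(_.booleanValue)
    .orElse(Option(av.getN).map(_ != "0"))
    .orElse(Option(av.getS).map(_.toBoolean))
    .getOrElse(throw new IllegalArgumentException(s"not a boolean: $av"))
```

Something along these lines in the package object's implicit conversion would avoid the "For input string: \"null\"" failure.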
I'm trying to wrap my head around how this library is intended to be used and keep running into features that appear to either be incomplete or that simply don't seem to do what I'd expect. But there's a fair chance I'm simply missing something obvious. Examples of usage would be most welcome, in particular for the DynamoDB wrappers.
Hi, is there a reason not to provide a wrapper for AmazonS3Client's getObject method? http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/AmazonS3Client.html#getObject(java.lang.String,%20java.lang.String)
Would it be acceptable to create a pull request with the getObject method implemented?
I prefer to use the getObject method over the download method from TransferManager; creating a File object doesn't seem clean for my purposes.
Regards.
Why are the AWS dependencies defined as provided in https://github.com/dwhjames/aws-wrap/blob/master/build.sbt?
This way, we (the library clients) must add the aws-wrap library as a dependency and also its AWS dependencies (AmazonSNSAsyncClient, for instance).
I'm quite sure there's a reason for this that I don't easily see. Thanks!
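Concretely, a client project ends up declaring both artifacts itself. A sketch of such a build.sbt fragment (artifact names follow the AWS SDK's modular layout; the versions are illustrative):

```scala
// build.sbt of a client project: because aws-wrap marks the AWS SDK as
// "provided", the client must declare the SDK modules it actually uses.
libraryDependencies ++= Seq(
  "com.github.dwhjames" %% "aws-wrap"         % "0.8.0",
  "com.amazonaws"       %  "aws-java-sdk-sns" % "1.10.77"
)
```

The usual rationale for "provided" is to let clients pick their own SDK version instead of having aws-wrap pin one, but it does push this declaration onto every user.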
Like many devs, I have my Eclipse (or sbt) systematically pull the sources of my dependencies.
I wanted to try out this lib; the sources got pulled from Bintray alright, but trying to look at them in Eclipse fails.
From the looks of it, the problem lies in the fact that the package directories "com/github/dwhjames" are missing from the source jar. This corresponds with your source structure on GitHub, but I guess Eclipse can't cope with that level of creativity (while it's fully supported in Scala). You could argue it's a Scala IDE bug as well.
Please upgrade to AWS SDK 1.11.x
Right now you can get all kinds of metadata about S3Objects in a bucket, but there is no way to get the actual content of an S3Object (for example, the actual PNG image of a rendered chart stored in the bucket).
It would be great if, in addition to your personal Bintray repository, you also published to JCenter. It's very easy to enable this through the Bintray settings. This would make it much simpler for organizations that maintain local JCenter mirrors to have your packages available and readily accessible without having to set up additional mirrors and it's already predefined in sbt.
I'm trying to separate object model case classes and their DynamoDb serialization-mapping code by having them in different packages (to provide different persistence mechanism in future). For instance, I'd like to have my case class ApiAccessToken
in package model
and corresponding mapping code in object DdbApiAccessToken
in package model.dynamodb
.
I defined all attributes along with an implicit object for toAttributeMap/fromAttributeMap methods in DdbApiAccessToken. It compiles fine, but when I try to use mapper.loadByKey, the compiler complains that it can't find a magnet:
Error:(49, 54) type mismatch;
found : String
required: mapper.LoadByKeyMagnet[model.ApiAccessToken]
data <- mapper.loadByKey[ApiAccessToken](dummy.token)
^
Is it possible? Am I missing something completely?
Thanks,
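In magnet-pattern APIs like this one, the String-to-magnet conversion is usually materialized from an implicit serializer, so the likely fix is that the implicit DynamoDBSerializer[ApiAccessToken] must be visible at the call site. A sketch, with names mirroring the question (DdbApiAccessToken, mapper, and dummy are the question's own identifiers):

```scala
import model.ApiAccessToken
// Bring the implicit DynamoDBSerializer[ApiAccessToken] defined inside
// DdbApiAccessToken into scope; without it, the compiler cannot build a
// LoadByKeyMagnet[ApiAccessToken] from the raw String key.
import model.dynamodb.DdbApiAccessToken._

for {
  data <- mapper.loadByKey[ApiAccessToken](dummy.token)
} yield data
```

If the serializer is declared as an implicit member of DdbApiAccessToken rather than in the companion object of ApiAccessToken, it is not found automatically and must be imported like this.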
Hey Daniel, hope you are well
When requesting a non-existent file ...
client.getObject(bucketname, "wtf.txt")
[error] Exception in thread "main" com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: [removed]), S3 Extended Request ID: [removed]
[error] at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:1239)
[error] at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:823)
[error] at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:506)
[error] at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:318)
[error] at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3595)
[error] at com.amazonaws.services.s3.AmazonS3Client.getObject(AmazonS3Client.java:1116)
[error] at com.github.dwhjames.awswrap.s3.AmazonS3ScalaClient$$anonfun$getObject$1.apply(s3.scala:382)
[error] at com.github.dwhjames.awswrap.s3.AmazonS3ScalaClient$$anonfun$getObject$1.apply(s3.scala:382)
[error] at com.github.dwhjames.awswrap.s3.AmazonS3ScalaClient$$anon$1$$anonfun$run$1.apply(s3.scala:207)
[error] at scala.util.Try$.apply(Try.scala:192)
[error] at com.github.dwhjames.awswrap.s3.AmazonS3ScalaClient$$anon$1.run(s3.scala:206)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] at java.lang.Thread.run(Thread.java:745)
It is annoying, as the main thread is stuck waiting for this thread, as if the exception is not caught at all.
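Since the call runs on the client's executor, the exception is delivered through the returned Future rather than thrown on the calling thread, so it has to be handled there. A sketch, reusing the question's `client` and `bucketname`:

```scala
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import com.amazonaws.services.s3.model.{AmazonS3Exception, S3Object}

// Handle the failure on the Future itself: a missing or forbidden object
// becomes None instead of an uncaught exception on a worker thread.
val maybeObject: Future[Option[S3Object]] =
  client.getObject(bucketname, "wtf.txt")
    .map(obj => Option(obj))
    .recover {
      case e: AmazonS3Exception if e.getStatusCode == 403 || e.getStatusCode == 404 =>
        None
    }
```

Note that S3 returns 403 instead of 404 when the caller lacks ListBucket permission, which is why the "Access Denied" above can actually mean "no such key".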