Please consider supporting immutable classes in the mapper. `DynamoDbMapper` requires all mapped classes to be mutable. Immutable domain objects are increasingly the norm in Java, and using `DynamoDbMapper` means creating mutable duplicates of all domain objects or giving up on the benefits of immutability.
from aws-sdk-java-v2.
Compatibility Between `DynamoDBMapper`, `DynamoDB`, and `AmazonDynamoDB`
So this is pretty irritating. Within the Java DynamoDB SDK, there are essentially three different ways of working with Dynamo; all of them have their specific uses for specific situations, yet it's really difficult to convert between the different sets of classes that each one uses:

- `DynamoDBMapper` uses annotated domain classes.
- `AmazonDynamoDB` uses various forms of `Map<String, AttributeValue>`.
- `DynamoDB` uses `Item` and so forth.
Each of these requires irritating bespoke code to convert. Sure, `DynamoDBMapper` has two methods (`marshallIntoObject` and `marshallIntoObjects`), but there is no corresponding `unmarshallIntoMap` or anything like that. It should be simple to convert between domain classes and `Map<String, AttributeValue>`. It should be simple to convert between domain classes and `Item`. It should be simple to convert between any two of these.
Primarily, since `DynamoDBMapper` just doesn't handle update expressions at all, there is no elegant method of interop between `DynamoDBMapper` and the low-level API when you need to use them.
Unexpected and Confusing Behavior of `DynamoDBMapper` Type Conversion

This one is long.

At the very least, the documentation could use clarification here. But the best-case scenario is that the various methods of type conversion work together harmoniously, so that the representation of data in Dynamo when using `DynamoDBMapper` can best match the user's intent and use case. Right now things are kind of close, which for me is more irritating than if they were nowhere near at all.

I have an entire project full of nothing but code examples and JUnit tests to document and remind myself of the oddities of, and tips and tricks for, this system. I am more than happy to share them at any time.

Anyway, let's start from the top.
At Least Three Methods of Type Conversion (and one kind of)?

I count three methods of explicit type conversion in the `DynamoDBMapper` system, and one outlier:

- `ConversionSchema`
- `DynamoDBTypeConverterFactory`
- The `DynamoDBTyped` and `DynamoDBTypeConverted` family of annotations
- `AttributeTransformer` (Yes, the use case for this is somewhat tangential, but there are some improvements to be made here too.)
Throughout this section, I'll use the `java.time.Instant` class as an example for two reasons:

1. In the 1.11.x SDK, it has no native support.
2. Crucially, it can have a number of valid representations depending on the context:
   - A `String`/`S`, for UTC timestamps.
   - A DynamoDB `M`, for separating out different aspects of a point on the timeline.
   - A `Long`/`Integer`/`N`, for epoch millis.
   - A `Long`/`Integer`/`N`, for epoch seconds (hello, DynamoDB TTL!).
Each of these representations has a valid use case. Each of these could even conceivably be used within the same project, or even the same `DynamoDBMapper` domain class.
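For concreteness, here is how one point on the timeline maps to the scalar representations above, using only the JDK:

```java
import java.time.Instant;

public class InstantRepresentations {
    public static void main(String[] args) {
        Instant instant = Instant.parse("1979-10-12T00:00:00Z");

        // 1. String/S: an ISO-8601 UTC timestamp
        String iso = instant.toString();             // "1979-10-12T00:00:00Z"

        // 2. N as epoch milliseconds
        long epochMilli = instant.toEpochMilli();    // 308534400000

        // 3. N as epoch seconds (the representation DynamoDB TTL expects)
        long epochSecond = instant.getEpochSecond(); // 308534400

        System.out.println(iso + " " + epochMilli + " " + epochSecond);
    }
}
```

The `M` representation would additionally split the instant into named components, which is omitted here.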
`ConversionSchema`

From what I've gathered looking through the code, the `ConversionSchema` is intended to define the base "primitive", direct type mappings between Java types and DynamoDB types by doing a class-to-`AttributeValue` mapping. For instance, `byte`, `integer`, and family map to `AttributeValue.withN(thingie.toString())` and so forth.
So the neat thing is that you can add new primitive types to your mapper using a `DynamoDBMapperConfig`. Let's do it and default to a UTC timestamp!
```java
public class InstantArgumentPal implements ArgumentMarshaller, ArgumentUnmarshaller {
    @Override
    public AttributeValue marshall(Object obj) {
        return new AttributeValue(obj.toString());
    }

    @Override
    public void typeCheck(AttributeValue value, Method setter) {
        // Why? When?
    }

    @Override
    public Object unmarshall(AttributeValue value) throws ParseException {
        return Instant.parse(value.getS());
    }
}
```
I cannot for the life of me figure out when `typeCheck` is useful. Using it, you cannot tell the mapper "this Unmarshaller is not suitable for this `AttributeValue` for this Method" except by throwing an exception, which just kills your entire operation. Can this return a boolean?
Regardless, let's use our Pal!
```java
// Omit getters, setters, equals, hashCode, etc.
@DynamoDBTable(tableName = "instant_conversion_example")
public class InstantConversionExample {
    @DynamoDBHashKey(attributeName = "hash_key")
    private Instant hashKey;

    @DynamoDBAttribute(attributeName = "whatever")
    private String whatever;
}
```
Later, let's set up a `DynamoDBMapperConfig.Builder`.
```java
InstantArgumentPal pal = new InstantArgumentPal();
DynamoDBMapperConfig config = DynamoDBMapperConfig.builder()
        .withConversionSchema(ConversionSchemas.v2Builder("v2WithInstant")
                .addFirstType(Instant.class, pal, pal)
                .build())
        .build();
```
Let's give it a whirl!
```java
@Test
public void go_go_gadget_instant() throws Exception {
    InstantConversionExample example = new InstantConversionExample()
            .setHashKey(Instant.now())
            .setWhatever("whatevs");
    mapper.save(example);

    InstantConversionExample example1 = mapper.load(InstantConversionExample.class, example.getHashKey());
    assertThat(example1).isEqualTo(example);

    // Caused by: com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException:
    //   InstantConversionExample[hash_key]; only scalar (B, N, or S) type allowed for key
    //
    // (┛◉Д◉)┛彡┻━┻
}
```
sigh
```java
@DynamoDBTable(tableName = "instant_conversion_example")
public class InstantConversionExample {
    @DynamoDBHashKey(attributeName = "hash_key")
    private String hashKey;

    @DynamoDBAttribute(attributeName = "whatever")
    private Instant whatever;

    public String getHashKey() {
        return hashKey;
    }
}
```
```java
@Test
public void go_go_gadget_instant() throws Exception {
    InstantConversionExample example = new InstantConversionExample()
            .setHashKey("hash")
            .setWhatever(Instant.now());
    mapper.save(example);

    InstantConversionExample example1 = mapper.load(InstantConversionExample.class, "hash");
    assertThat(example1).isEqualTo(example);
    // IT WORKS NOW!
}
```
OK cool, so far so good. Basically neat, but:

- Why aren't `ArgumentMarshaller` and `ArgumentUnmarshaller` generic?
- What is `typeCheck` good for?
- The `addFirst` is basically `addOnly`, from what I've been able to tell.
- Technically, I have defined `Instant` as a scalar in my first example, yet it does not work for hash keys.

The last point tells me that my own additions to the `ConversionSchema` are not first-class citizens. They should be.
`DynamoDBTypeConverterFactory`

Let's play around with the `DynamoDBTypeConverterFactory`!
```java
public class InstantStringConverter extends DynamoDBTypeConverter.AbstractConverter<String, Instant> {
    @Override
    public String convert(Instant instant) {
        return instant.toString();
    }

    @Override
    public Instant unconvert(String string) {
        return Instant.parse(string);
    }
}
```
OK, so this seems a little cleaner. It's generic! Let's skip the set-up and tests this time and go directly to the spoilers:

- Still cannot use `Instant` as a hash key.
- Basically works exactly the same as the `ConversionSchema` thingie, but it's generic.
But what's the big problem here?
`DynamoDBTypeConverterFactory` Is Only Meaningful for `String`

What happens if I make, say, `DynamoDBTypeConverter.AbstractConverter<Long, Instant>`? Nothing. No conversion happens.

This is especially important if I register, say, two converters:
```java
.withTypeConverterFactory(DynamoDBTypeConverterFactory.standard()
        .override()
        .with(Long.class, Instant.class, new InstantLongConverter())
        .with(String.class, Instant.class, new InstantStringConverter())
        .build());
```
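`InstantLongConverter` isn't spelled out above, but the conversion it would perform is just an epoch-milli round trip. A standalone sketch (the `DynamoDBTypeConverter` supertype is omitted here so it runs without the SDK):

```java
import java.time.Instant;

public class InstantLongRoundTrip {
    // What an InstantLongConverter's convert/unconvert pair would do.
    static Long convert(Instant instant) {
        return instant.toEpochMilli();
    }

    static Instant unconvert(Long epochMilli) {
        return Instant.ofEpochMilli(epochMilli);
    }

    public static void main(String[] args) {
        Instant original = Instant.parse("1979-10-12T00:00:00Z");
        Instant roundTripped = unconvert(convert(original));
        System.out.println(original.equals(roundTripped)); // true
    }
}
```

Note that epoch millis drop sub-millisecond precision, so the round trip is only lossless for instants with whole-millisecond values.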
And annotate the `Instant` a certain way:

```java
@DynamoDBTyped(DynamoDBMapperFieldModel.DynamoDBAttributeType.N)
private Instant instant;
```
Since I've registered a type converter for `Instant` for a Java type that directly corresponds to a DynamoDB `N` type, it stands to reason that the type converter factory would look for a converter for the domain class type that converts to a Java numeric type, right?
No. What happens is this:

- The `TypeConverterFactory` explicitly searches for a `String` converter for the domain class type.
- The `TypeConverterFactory` attempts to coerce the output of the `String` converter to a number type.
That is to say, for `@DynamoDBTyped(DynamoDBMapperFieldModel.DynamoDBAttributeType.N)` to work, there must be a `String` converter registered for the type, and the output of that converter must be a number formatted as a `String`. This is extremely counter-intuitive. It severely limits the utility of `DynamoDBTypeConverterFactory`.
But it gets worse. It turns out that `ConversionSchema` and `DynamoDBTypeConverterFactory` do not play well together at all: if a `ConversionSchema` marshaller/unmarshaller pair is registered for a type, the `DynamoDBTypeConverterFactory` converters for that type are completely ignored, as are any `@DynamoDBTyped` or `@DynamoDBTypeConverted` annotations on domain class members for that type.
Also, you cannot register a converter for a non-built-in scalar type (map, etc).
My final comment about this is that it appears that the scalar types for this system, along with their marshallers and unmarshallers, are hardcoded into an enum. This makes it impossible to extend the DynamoDB type system in a nuanced, elegant way.
The `DynamoDBTyped` and `DynamoDBTypeConverted` Family

These work OK; they are just really verbose, and honestly I wish that I could just have more nuanced usage of `DynamoDBTypeConverterFactory`.
`AttributeTransformer`

Only comment here is that you should be able to register these on a per-class basis, similar to the `ConversionSchema` and `DynamoDBTypeConverterFactory` marshallers and unmarshallers.
Summing Up

- Documentation around the available type conversion options is limited and confusing. There is little or no guidance around which one to use in which situation.
- The interoperation between the type conversion options is confusing and counter-intuitive.
- The behavior of `DynamoDBTypeConverterFactory` is confusing, counter-intuitive, and limiting.
- Because of the above, `DynamoDBTyped` is of seriously limited utility for non-built-in scalar types.
- `DynamoDBTypeConverted` with explicit converters works, but the verbosity is irritating.
- There should be a way to define your own primitives/scalar types and map them to DynamoDB types.
`AttributeValue` Is Irritating

```java
new AttributeValue().withN(someNumberType.toString()) // WHYYYYYYY
```

Please, please, please add static methods that do null checks.

```java
AttributeValue.S(String s)
AttributeValue.N(Number n)
AttributeValue.N(String s)
AttributeValue.mk(Object o) // Introspect the argument in a documented way and "do the right thing"
```
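A sketch of what such null-checking factories could look like, using a minimal stand-in class (`Av`) since this is a proposed API, not an existing one:

```java
// "Av" is a tiny stand-in for AttributeValue so this runs standalone;
// the point is the shape of the requested null-safe static factories.
final class Av {
    final String s; // string value, if any
    final String n; // numeric value (DynamoDB transmits numbers as strings)

    private Av(String s, String n) {
        this.s = s;
        this.n = n;
    }

    static Av S(String s) {
        if (s == null) throw new NullPointerException("s must not be null");
        return new Av(s, null);
    }

    static Av N(Number n) {
        if (n == null) throw new NullPointerException("n must not be null");
        return new Av(null, n.toString());
    }
}

public class AvDemo {
    public static void main(String[] args) {
        System.out.println(Av.N(42).n);      // "42"
        System.out.println(Av.S("hello").s); // "hello"
    }
}
```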
from aws-sdk-java-v2.
Please remove `PaginatedList`, and replace uses of it with `Stream`. `java.util.List` implementations are expected to have fast `size()` methods, but as far as I can tell there is no way to implement that for a DynamoDB scan or query. Currently `PaginatedList` will either load the entire scan result into memory on a `size()` call, or simply fail if configured as `ITERATION_ONLY`. This is potentially surprising behaviour which could be made much more intuitive and explicit by returning a `Stream` and requiring users to `collect(toList())` if they want a `List`.
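For comparison, a lazily-paginated `Stream` can be built from a page-fetching function without ever exposing `size()`; a JDK-only sketch, where `fetchPage` stands in for one scan/query page call:

```java
import java.util.List;
import java.util.function.IntFunction;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import java.util.stream.Stream;

public class LazyPages {
    // Build a lazy Stream of items from a page-fetching function.
    // fetchPage(pageNumber) returns one page, or an empty list when exhausted.
    static <T> Stream<T> streamPages(IntFunction<List<T>> fetchPage) {
        return IntStream.iterate(0, i -> i + 1)
                .mapToObj(fetchPage::apply)
                .takeWhile(page -> !page.isEmpty())
                .flatMap(List::stream);
    }

    public static void main(String[] args) {
        // Stand-in pages: [0,1,2], [3,4,5], [6], then nothing.
        List<List<Integer>> pages = List.of(List.of(0, 1, 2), List.of(3, 4, 5), List.of(6));
        IntFunction<List<Integer>> fetch = i -> i < pages.size() ? pages.get(i) : List.<Integer>of();

        List<Integer> all = streamPages(fetch).collect(Collectors.toList());
        System.out.println(all); // [0, 1, 2, 3, 4, 5, 6]
    }
}
```

Pages are only fetched as the stream is consumed, so callers who want a `List` opt in explicitly via `collect`.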
from aws-sdk-java-v2.
Any updates on when the newer DynamoDBMapper will be available for AWS Java SDK v2? Thanks.
from aws-sdk-java-v2.
Hi all,
The wait is finally over and we'd like to announce the launch of the DynamoDB Enhanced Client for the AWS SDK for Java 2.0.
https://github.com/aws/aws-sdk-java-v2/tree/master/services-custom/dynamodb-enhanced
This client enhances every DynamoDb operation by providing direct mapping to and from your data classes without changing the nature of those interactions for a low-friction and intuitive development experience.
If you'd like to learn about some of the other advantages this client offers, feel free to take a look at our launch blog:
https://aws.amazon.com/blogs/developer/introducing-enhanced-dynamodb-client-in-the-aws-sdk-for-java-v2/
Now that we've launched, the work doesn't stop here. We've heard lots of great ideas that we haven't had time to implement. We will continue to build and improve this library as well as look at other AWS services that could benefit from enhanced clients of their own.
As always we welcome your support, we hope the good ideas and contributions will keep coming!
from aws-sdk-java-v2.
There could also be an Option 3: Do option 1, and create a spring-data plugin for customers that want something more database-agnostic than option 1 provides.
from aws-sdk-java-v2.
Could we please have the newly announced transactional writes via the high level mapper interface? ❤️
from aws-sdk-java-v2.
We've settled on option 1 above for now. Something like option 2 / spring-data / jnosql would be great, but it sounds like what people really want for now is just an easier to use DynamoDB client.
Update 1
We've started development on what we're calling the DynamoDB enhanced client. The DynamoDB enhanced client replaces the generated DynamoDB client with one that is easier for a Java customer to use. It does this by supporting conversions between Java objects and DynamoDB items, as well as converting between Java built-in types (e.g. java.time.Instant) and DynamoDB attribute value types.
We'll be starting development with the non-POJO-based APIs from the 1.11.x document client, and will be moving onto the POJO-based APIs from the 1.11.x mapper client once we've got a nice base to build on.
To that end, we've just released an MVP preview version of the enhanced client that only supports the non-POJO APIs from the 1.11.x document client. It also currently supports only `putItem` and `getItem`. It doesn't do object mapping (yet), and doesn't do other operations like query or scan (yet). We'll be building on this MVP over time and updating this issue when we add new features.
Please feel free to add feedback as we do so, so that we can fix things that you don't like as soon as possible.
Links
Get started with the DynamoDB enhanced client preview.
Code Examples
Example 1: Adding a single item to a DynamoDB table (synchronously)
```java
// Create a client to use for this example, and then close it.
// Usually, one client would be used throughout an application.
try (DynamoDbEnhancedClient client = DynamoDbEnhancedClient.create()) {
    Table books = client.table("books");
    books.putItem(i -> i.putAttribute("isbn", "0-330-25864-8")
                        .putAttribute("title", "The Hitchhiker's Guide to the Galaxy")
                        .putAttribute("publicationDate",
                                      p -> p.putAttribute("UK", Instant.parse("1979-10-12T00:00:00Z"))
                                            .putAttribute("US", Instant.parse("1980-01-01T00:00:00Z")))
                        .putAttribute("authors", Collections.singletonList("Douglas Adams")));
}
```
Example 2: Retrieving a single item from a DynamoDB table (synchronously)
```java
// Create a client to use for this example, and then close it.
// Usually, one client would be used throughout an application.
try (DynamoDbEnhancedClient client = DynamoDbEnhancedClient.create()) {
    Table books = client.table("books");
    ResponseItem book = books.getItem(k -> k.putAttribute("isbn", "0-330-25864-8"));

    String isbn = book.attribute("isbn").asString();
    String title = book.attribute("title").asString();
    Map<String, Instant> publicationDate = book.attribute("publicationDate")
                                               .asMap(String.class, Instant.class);
    List<String> authors = book.attribute("authors").asList(String.class);
}
```
Example 3: Adding a single item to a DynamoDB table (asynchronously)
```java
// Create a client to use for this example, and then close it.
// Usually, one client would be used throughout an application.
try (DynamoDbEnhancedAsyncClient client = DynamoDbEnhancedAsyncClient.create()) {
    AsyncTable booksTable = client.table("books");

    // Write a book to the "books" table.
    CompletableFuture<Void> serviceCallCompleteFuture =
            booksTable.putItem(item -> item.putAttribute("isbn", "0-330-25864-8")
                                           .putAttribute("title", "The Hitchhiker's Guide to the Galaxy"));

    // Log when the book is done being written.
    CompletableFuture<Void> resultLoggedFuture = serviceCallCompleteFuture.thenAccept(ignored -> {
        System.out.println("Book was successfully written!");
    });

    // Block this thread until after we log that the book was written.
    resultLoggedFuture.join();
}
```
Feature Summary
This MVP includes a few features.
Synchronous AWS Calls

- Creating a `DynamoDbEnhancedClient`.
- Adding a single item to a DynamoDB table with the `putItem` API.
- Retrieving a single item from a DynamoDB table with the `getItem` API.

Asynchronous AWS Calls

- Creating a `DynamoDbEnhancedAsyncClient`.
- Adding a single item to a DynamoDB table with the `putItem` API.
- Retrieving a single item from a DynamoDB table with the `getItem` API.
Java Type Conversion

- Automatic type conversion between built-in Java types and DynamoDB types. The only ones supported for this MVP: `String`, `Integer`, `List`, `Map`, `Instant`.
- The ability to specify custom type converters for built-in Java types, or your own types.
from aws-sdk-java-v2.
While collecting lots of good ideas for a ground up redesign of a new API and implementation, what about just porting the current mapper as-is to the v2 SDK? It's become more awkward over time for us having the mapper as the only AWS library still requiring the v1 SDK.
from aws-sdk-java-v2.
Issue reported external to github: `@DynamoDBGeneratedUuid` does not work for nested list objects.

```java
@DynamoDBTable(tableName = "XYZ")
class A {
    @DynamoDBAttribute
    List<B> listOfB;
}

@DynamoDBDocument
class B {
    @DynamoDBGeneratedUuid(DynamoDBAutoGenerateStrategy.CREATE)
    UUID id;
}
```
from aws-sdk-java-v2.
Whats happening with the support for Immutable data types?
We couldn't get this in for the initial launch. This will be an incremental feature delivered some time down the road. We want this feature too, we just don't want to delay basic high level support any longer than we already have.
from aws-sdk-java-v2.
Could we have an idea of when we will have support for immutable objects?
It is really frustrating that you don't consider this a priority. Meanwhile we are stuck with mutable objects. Really makes me want to switch to another db :(
from aws-sdk-java-v2.
I know the mapper is a higher level interface and currently just returns the data object to keep things simple. Would it complicate the usage too much to instead return an object with the data object and metadata properties, or do we have to use the lower level API if we need those? Useful ones that I have run into would be:
- consumed capacity when doing a query
- modified attributes when doing a save
Edit: I was just thinking about it, and `save()` returns `void`. It could return the old object instead, possibly behind a flag in the `DynamoDBMapperConfig` object.
from aws-sdk-java-v2.
Just wanted to bring Joda-Convert to your attention (disclaimer, I'm the author). It provides a convenient way to convert Object to String and vice versa. It is pre-populated with the JDK types and handles other types either by plugin or by annotating methods. Joda-Time, ThreeTen-Extra and Joda-Money all use the annotations for example. We at OpenGamma currently use it with the v1 mapper:
```java
public class JodaConvertDynamoConverter<T> implements DynamoDBTypeConverter<String, T> {
    private final Class<T> type;

    public JodaConvertDynamoConverter(Class<T> type) {
        this.type = type;
    }

    public String convert(T typedString) {
        return JodaBeanUtils.stringConverter().convertToString(typedString);
    }

    public T unconvert(String string) {
        return JodaBeanUtils.stringConverter().convertFromString(type, string);
    }
}
```
Joda-Convert was built exactly for use cases like you have in v2 - converting objects to and from a string form. It would be great to see the v2 client use it directly as it might well solve a lot of your conversion issues. At the least I'd want it still to be possible to use it as a plugin like we currently do.
from aws-sdk-java-v2.
Whats happening with the support for Immutable data types?
from aws-sdk-java-v2.
Just started looking into the `DynamoDbEnhancedAsyncClient` and the `TableSchema` implementation and mapper. It's working great for simple use cases and is actually very simple to use :)
Nevertheless, I am having difficulty handling more complex mapping requirements. Specifically:
The current Enhanced Client and Mapper implementation does not seem to support mapping a member of type `Map<String, SomeOtherDocument>`. I looked at the implementation of the `BeanTableSchema` and it seems to only support sub-documents if they are not contained in a Collection/Map like a `List<?>` or `Map<?,?>`.
Here's a small example:
```java
@DynamoDbBean
@Data
public class ClassA {
    private String id;
    private Map<String, ClassB> additional;
}

@DynamoDbBean
@Data
public class ClassB {
    private String test1;
    private String test2;
}
```
Later, if I try to create a table schema to use the enhanced client:

```java
private TableSchema<ClassA> classASchema = TableSchema.fromBean(ClassA.class);
```

I will get an exception, since there's no registered converter that supports converting `ClassB`:

```
Converter not found for EnhancedType(java.util.Map<java.lang.String, com.a.b.c.ClassB>)
```
Is there an undocumented feature that I am missing here? I looked over the implementation and could not find anything obvious that would support this use-case.
I see this comment that's related to this. Looks like it's not currently supported, but the idea is to support it in the future? If this is the case, is there a work-around in the meantime that is recommended?
from aws-sdk-java-v2.
We're looking to make the automatic depagination provided by `PaginatedList` work for all services' paginated APIs with this story: #26.

+1 to avoiding accidentally blowing up your memory and holding up a thread with something as simple as `size()`. Using `Stream` is a solid idea. We'll need to think about whether other pagination strategies for different APIs can have an efficient `size()` method. Even if some can, we shouldn't expose it in as innocuous a form for services like Dynamo that make it expensive.
from aws-sdk-java-v2.
Issue reported external to github: It would be nice to be able to annotate either the get method OR the type for type-specific annotations like @DynamoDBTypeConverted. Currently, the method has to be annotated.
from aws-sdk-java-v2.
Issue reported external to github: `DynamoDBMapper` support for multiple conditions on the same attribute for save/delete.

I should be able to do this without having to throw away `DynamoDBMapper` and use the API directly...

`expectedAttributes` being a Map (in `DynamoDBSaveExpression.java` and `DynamoDBDeleteExpression.java`) is a problem.

It's a pretty common use case: "Upsert a record into DynamoDB as long as versionID < {myNewVersionId}", to ensure that older records are not overridden.

This fails if the record doesn't yet exist, so the intuitive solution is to change the conditional expression to a list of two: `versionId.exists(false)` OR `versionId.LE(myNewVersionID)`.

This is simply not possible right now. I realize this is possibly because the conditions map to a legacy API (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LegacyConditionalParameters.Expected.html), but unfortunately the new and more powerful condition expressions (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Expressions.ConditionExpressions.html) are not supported through `DynamoDBMapper`.

So I suppose ultimately that's the request: can `DynamoDBMapper` support multiple conditions on the same attribute, either via:

- Generating condition expressions from Expected attributes (that are in a List, not a Map), or
- Accepting condition expressions through the `DynamoDBMapper` interface.
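For reference, the guard that can't be expressed through the mapper today is a one-liner in the newer condition-expression syntax (attribute and placeholder names are illustrative):

```java
public class UpsertCondition {
    // The "upsert unless a newer version exists" guard; ":newVersionId" would
    // be supplied separately via ExpressionAttributeValues.
    static final String CONDITION =
            "attribute_not_exists(versionId) OR versionId <= :newVersionId";

    public static void main(String[] args) {
        System.out.println(CONDITION);
    }
}
```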
References of others with same problem:
from aws-sdk-java-v2.
Hey all, we haven't forgotten about this feature request.
We’ve got some new ideas for what we want to do in the DynamoDB Mapper with 2.x, but we want to get your feedback early in development so that we can deliver what you really want.
We have two options we're looking to explore:
- Option 1: A DynamoDB-specific client that combines the functionality of 1.11.x's Documents APIs and DynamoDB Mapper APIs in a straight-forward manner.
- Option 2: A generic document database client that creates an abstraction over all document databases, like DynamoDB and MongoDB. This would simplify using multiple document databases in the same application, and make it easier to migrate between the two. Unfortunately as a result, it also wouldn't be a direct DynamoDB experience.
Some ways you can help us:
Option 1: We've created a prototype for what the “option 1” APIs for the new DynamoDB Mapper/Documents API replacement in 2.x could look like. Feel free to comment on this prototype's pull request to give us your feedback. It's only APIs so it's super easy to change.
Option 2: We haven't created a prototype for what the "option 2" APIs might look like (we're working on it!), but feel free to comment on this issue to let us know if this is something you would use or prefer over "option 1".
Our Goals: We've frequently asked ourselves what we're trying to achieve with the DynamoDB Mapper in 2.x and what's important to us as an SDK. We've attempted to capture what we think is important to you, our customers, in a "tenets and goals" pull request. Feel free to review this pull request and make sure we're not forgetting something that you really care about.
from aws-sdk-java-v2.
I believe Option 2 does not make much sense. It would make sense to me to go for Option 1 and start a bounty program to implement a module to popular data access abstraction libraries such as spring-data mentioned above or GORM.
Disclaimer: I'm an author of declarative services for DynamoDB for Micronaut.
from aws-sdk-java-v2.
@danieladams456 @musketyr Thanks for the feedback!
@musketyr Also, thanks for your contribution to the Java+AWS ecosystem.
from aws-sdk-java-v2.
When will this feature (DynamoDBMapper) be available in SDK version 2 ?
from aws-sdk-java-v2.
Feature request from v1: DynamoDB Mapper with Object properties - aws/aws-sdk-java#2045
from aws-sdk-java-v2.
Support complex types like:

- `Map<Enum, Enum>`
- `List<Enum>`
- `List<List<Enum>>`

Currently `DynamoDBTypeConvertedJson` cannot handle those, since it uses `Class` instead of `TypeReference`, i.e.:

```java
Class<? extends Object> targetType() default void.class;
```

Jackson cannot convert these kinds of data structures properly when we use `Class`.
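The difference matters because a plain `Class` token erases generics, while the `TypeReference` trick captures them via an anonymous subclass. A JDK-only sketch of that trick (Jackson's `TypeReference` works the same way):

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.List;

public class TypeTokenDemo {
    // An anonymous subclass records its generic supertype in the class file,
    // so the full parameterized type survives erasure.
    static abstract class TypeToken<T> {
        final Type type = ((ParameterizedType) getClass().getGenericSuperclass())
                .getActualTypeArguments()[0];
    }

    public static void main(String[] args) {
        Type captured = new TypeToken<List<List<String>>>() {}.type;
        System.out.println(captured.getTypeName());

        // A plain Class can only ever say "List", never "List<List<String>>".
        System.out.println(List.class.getTypeName()); // java.util.List
    }
}
```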
from aws-sdk-java-v2.
@cah-calixtomelean You could either store the object as a JSON string attribute, or as a Map. If you choose the latter (which your question implies), then you never need to convert your object to JSON; you might as well convert it directly into a `Map<String, AttributeValue>` and cut out the middle-man.

If you wish to store it as a JSON string, then use your favorite JSON marshaller library, turn your object into a string, and store it as a string in your DDB record.

If you wish to store it as a DynamoDB Map, this is possible using the 'dynamodb-enhanced' public preview client [which is part of v2] if, and only if, you are able and willing to define a static schema for the document. There is currently no utility library you can use in v2 that does this by inference.
Here is the attribute type you would use in your StaticTableSchema:
Here is an example of it being used in a test:
from aws-sdk-java-v2.
The current Enhanced Client and Mapper implementation does not seem to support mapping a member of type `Map<String, SomeOtherDocument>`. I looked at the implementation of the `BeanTableSchema` and it seems to only support sub-documents if they are not contained in a Collection/Map like a `List<?>` or `Map<?,?>`.

...

I see this comment that's related to this. Looks like it's not currently supported, but the idea is to support it in the future? If this is the case, is there a work-around in the meantime that is recommended?
@juanqui Looks like a bug to me. We have a test that asserts this functionality works correctly outside a collection:
However, we appear to be missing a test for your use-case. I'll repro and hopefully fix it. Stay tuned, and thanks for the catch.
from aws-sdk-java-v2.
One of the things I'd call out here is that I think there's a problem with the `com.amazonaws.services.dynamodbv2.datamodeling.StandardBeanProperties` method of reflecting over an object. Say I have:
```java
@DynamoDBTable(tableName = "Polyps")
public final class Polyp {
    private Set<String> endpoints = new ConcurrentSkipListSet<>();

    public Set<String> getEndpoints() {
        return endpoints;
    }

    public void setEndpoints(final Collection<String> endpoints) {
        if (endpoints != null && !endpoints.isEmpty()) {
            this.endpoints = new ConcurrentSkipListSet<>(endpoints);
        }
    }
}
```
In theory, even if DynamoDB only respects Sets, I should be able to handle the case where a setter accepts a wider Collection type.

It looks like this is because of lines 140-146, the `setterOf` method, which does a "clobber get with set and look for the exact type" algorithm. Consequently, this method returns null, which later becomes a problem at line 111 of `DynamoDBMapperFieldModel`.

I think it should be checking whether the return type has parent types, and if so, trying to run up the tree.
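A sketch of the suggested lookup: accept a setter whose single parameter is assignable from the getter's return type instead of demanding an exact match (method names here are illustrative, not the SDK's internals):

```java
import java.lang.reflect.Method;
import java.util.Collection;
import java.util.Set;
import java.util.concurrent.ConcurrentSkipListSet;

public class WideningSetterLookup {
    // Find "setX" whose one parameter can accept the value "getX" returns,
    // allowing a wider type (e.g. Collection for a Set-returning getter).
    static Method setterFor(Class<?> bean, Method getter) {
        String name = "set" + getter.getName().substring(3);
        for (Method m : bean.getMethods()) {
            if (m.getName().equals(name)
                    && m.getParameterCount() == 1
                    && m.getParameterTypes()[0].isAssignableFrom(getter.getReturnType())) {
                return m;
            }
        }
        return null;
    }

    public static final class Polyp {
        private Set<String> endpoints = new ConcurrentSkipListSet<>();
        public Set<String> getEndpoints() { return endpoints; }
        public void setEndpoints(Collection<String> endpoints) {
            this.endpoints = new ConcurrentSkipListSet<>(endpoints);
        }
    }

    public static void main(String[] args) throws Exception {
        Method getter = Polyp.class.getMethod("getEndpoints");
        Method setter = setterFor(Polyp.class, getter);
        System.out.println(setter.getName()); // setEndpoints, despite the wider parameter
    }
}
```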
from aws-sdk-java-v2.
With the current DynamoDBMapper implementation, when performing scans, there's no easy way to make the scan operation respect the DynamoDB table's read throughput.
Solutions combining https://aws.amazon.com/blogs/developer/rate-limited-scans-in-amazon-dynamodb/ and https://aws.amazon.com/blogs/developer/understanding-auto-paginated-scan-with-dynamodbmapper/ have to be used and are not that straightforward.
Please provide a scan operation that is able to respect a table's read throughput (whether by explicitly specifying a percentage of throughput to use or an absolute value of read units to use).
from aws-sdk-java-v2.
DynamoDBMapper feature requests from v1:
aws/aws-sdk-java#214
aws/aws-sdk-java#527
aws/aws-sdk-java#534
aws/aws-sdk-java#547
aws/aws-sdk-java#674
aws/aws-sdk-java#832
aws/aws-sdk-java#953
aws/aws-sdk-java#1170
aws/aws-sdk-java#1201
aws/aws-sdk-java#1235
aws/aws-sdk-java#1253
from aws-sdk-java-v2.
Issue reported external to github: We should support automatic conversion of non-string key values in maps.
from aws-sdk-java-v2.
Issue reported external to github: DynamoDBMapper ignores inherited public getters from non-public classes.

If we have two classes, A and B extends A, where class A declares a property p with a public getter getP() (annotated with @DynamoDBAttribute), then when a call is made to dynamoDBMapper.save() with an instance of class B: if class A is public, property p is persisted, but if class A is package-private, property p is ignored.
Sample code that reproduces the issue:

File `A.java`:

```java
package com.amazon.ab;

[public] class A {
    private int p = 5;

    @DynamoDBAttribute
    public int getP() {
        return p;
    }
}
```

File `B.java`:

```java
package com.amazon.ab; // Same as A

public class B extends A {
}
```

Test code, file `C.java`:

```java
package com.amazon.xyz; // Different than A & B
...
B b = new B();
dynamoDBMapper.save(b);
...
```
Analysis:
The Java compiler and the JVM work differently here. When a class inherits from another, it inherits all of that class's public methods, and they remain public and accessible to everyone using the inheriting class. The only thing the Java compiler enforces is that the inheriting class can access the inherited class (in this case, they are in the same package); other classes in other packages can call public methods regardless of whether they were declared directly or inherited. The JVM, however, does not allow calls to code that lives inside a package-private class from classes outside that package, even when the call comes through a public subclass in the same package. To work around this, the Java compiler generates a synthetic bridge method in the inheriting class that simply calls the inherited method (using the invokespecial instruction), so that classes in other packages never directly access anything in a non-public class.
In simpler terms: when a public class inherits from a package-private class, the Java compiler generates synthetic bridge methods for the inherited public methods; it does not when the inherited class is public.
Based on that, the root cause of this issue appears to be in StandardBeanProperties.java: the canMap() method, which is called for every method in a bean class, filters out bridge and synthetic methods without further checks.
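Bridge methods can be observed directly with reflection. A minimal self-contained sketch (using a generic interface, which also makes javac emit a bridge method) showing why a filter that drops every isBridge() method can hide a perfectly callable public method:

```java
import java.lang.reflect.Method;
import java.util.Arrays;

public class BridgeMethodDemo {

    // Implementing a generic interface makes javac emit a synthetic bridge
    // method compareTo(Object) alongside the declared compareTo(Box).
    public static class Box implements Comparable<Box> {
        public final int value;
        public Box(int value) { this.value = value; }
        public int compareTo(Box other) { return Integer.compare(value, other.value); }
    }

    // Count bridge methods named compareTo -- a mapper that filters on
    // isBridge() without further checks would skip these entirely.
    public static long bridgeCompareToCount() {
        return Arrays.stream(Box.class.getDeclaredMethods())
            .filter(m -> m.getName().equals("compareTo") && m.isBridge())
            .count();
    }

    public static void main(String[] args) throws Exception {
        System.out.println("bridge compareTo methods: " + bridgeCompareToCount());
        // The bridge is still callable and delegates to the real method.
        Method bridge = Box.class.getMethod("compareTo", Object.class);
        System.out.println(bridge.invoke(new Box(1), new Box(2)));
    }
}
```

The visibility bridge described in the analysis above is generated for the same reason: callers outside the package must never invoke code declared in a non-public class directly.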
from aws-sdk-java-v2.
I see that this issue is still open. Do we have any support for this as of now?
from aws-sdk-java-v2.
@bhaskarbagchi not yet. We plan on tackling the high level libraries shortly after we GA. We will update this issue as we have more information.
from aws-sdk-java-v2.
Issue reported external to github:
It is not possible to use Java stream() on PaginatedList and honor the PaginationLoadingStrategy.LAZY_LOADING strategy. When you call stream() on a PaginatedList, the JDK creates the stream from a spliterator, which calls size() on the Collection. Since PaginatedList.size() loads all results into memory, the lazy-loading behavior is no longer honored. So customers have to sacrifice either the lazy-loading capability or Java 8 functional-programming capabilities.
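One workaround, assuming PaginatedList exposes a lazy iterator(), is to build the stream from an unknown-size spliterator, which never calls size(). A self-contained sketch, with a counting iterable standing in for PaginatedList:

```java
import java.util.Iterator;
import java.util.List;
import java.util.Spliterator;
import java.util.Spliterators;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;

public class LazyStreamSketch {

    // Wrap any Iterable (e.g. a PaginatedList) without ever calling size().
    public static <T> Stream<T> lazyStream(Iterable<T> iterable) {
        Spliterator<T> spliterator =
            Spliterators.spliteratorUnknownSize(iterable.iterator(), Spliterator.ORDERED);
        return StreamSupport.stream(spliterator, false);
    }

    public static void main(String[] args) {
        AtomicInteger pulled = new AtomicInteger();
        // Stand-in for a lazily loading PaginatedList: counts how many
        // elements were actually fetched.
        Iterable<Integer> lazy = () -> new Iterator<Integer>() {
            int next = 0;
            public boolean hasNext() { return next < 1000; }
            public Integer next() { pulled.incrementAndGet(); return next++; }
        };
        // A sequential limit(3) pulls only three elements from the iterator.
        List<Integer> firstThree = lazyStream(lazy).limit(3).collect(Collectors.toList());
        System.out.println(firstThree + " elements pulled: " + pulled.get());
    }
}
```

This sidesteps the size() call, at the cost of losing the sized-spliterator optimizations.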
from aws-sdk-java-v2.
Related: #703
from aws-sdk-java-v2.
Not sure if anyone has mentioned it, but I'd like to see improved filter support for DynamoDBMapper in V2. My understanding is that there are two mechanisms for filtering right now:
and
The former is easier to program against but has limitations in its expressiveness. The latter solves this at the expense of having to generate the expressions. ExpressionSpecBuilder exists for the low-level client and can nearly be used directly with DynamoDBScanExpression, but there is one major difference: DynamoDBScanExpression's withExpressionAttributeValues method takes Map<String, AttributeValue>, whereas the *ExpressionSpec classes return Map<String, Object>.
The Objects referred to there are the raw values, not AttributeValues. While that seems pretty close, I have not come across any exposed way to marshall from Object to AttributeValue. In this particular case, ConversionSchema contains all the necessary logic to handle this marshalling, but it can't be leveraged due to the tight method access:
public static AttributeValue toAttributeValue(final Object o) {
    val marshallerSet = new AbstractMarshallerSet(V2MarshallerSet.marshallers(), V2MarshallerSet.setMarshallers());
    val marshaller = marshallerSet.getMemberMarshaller(o.getClass());
    return marshaller.marshall(o);
}
Even if such a toAttributeValue method did exist, it would still feel rather clunky to have to map over each object in the ScanExpressionSpec's value map and marshall them myself. It'd be great to see withFilterExpression/withScanFilter replaced with something more along the lines of withScanExpressionSpec for easy interop between DynamoDBMapper and DynamoDB.
from aws-sdk-java-v2.
I think option 3 might get adoption from a broader audience than option 2. It could be used as a stepping stone to move to DynamoDB.
from aws-sdk-java-v2.
We've published the prototype for option 2: A generic document database client that creates an abstraction over all document databases, like DynamoDB and MongoDB. This would simplify using multiple document databases in the same application, and make it easier to migrate between the two. Unfortunately as a result, it also wouldn't be a direct DynamoDB experience: #1116
Let us know what you like or don’t like about it! Even if we go with another design, your feedback on this option will be integrated into the final solution.
from aws-sdk-java-v2.
Maybe you could implement/support the JNoSQL spec:
http://www.jnosql.org/
https://github.com/eclipse/jnosql-diana-driver
from aws-sdk-java-v2.
Related: #1240
from aws-sdk-java-v2.
@jodastephen Awesome, we'll look into it for supporting the Java built-in type converters. Unfortunately we need a way to convert Object into Map<String, AttributeValue>, so we can't use it for the entire mapper, but it'll help for Object-to-String conversion.
from aws-sdk-java-v2.
@jodastephen We're opting to strip down the built-in conversions of joda-convert and integrate them into the SDK, instead of using joda-convert directly. We don't need all of the power of joda-convert, but it is saving us a lot of effort in writing the converters themselves. Thanks! Apache 2.0 license reference will be preserved, of course.
from aws-sdk-java-v2.
@VijaySidhu We cannot provide any public estimate of when this will be GA.
from aws-sdk-java-v2.
Is there any other way to override the DynamoDB table name in SDK 2? In the DynamoDB mapper we had the method below to override the table name:
.withTableNameOverride(..)
from aws-sdk-java-v2.
@VijaySidhu Can you create a new github issue with your specific question and what you're trying to achieve?
from aws-sdk-java-v2.
Hey, I have another feature request with this.
Ability to differentiate null vs undefined values.
This would work by wrapping properties with Optional in the POJO.
- If the property is present in the entry but stored explicitly as null, the property would be stored as Optional.empty() (by invoking Optional.ofNullable() during deserialization).
- If the property is not present in the entry (undefined), the property itself would be set to null.
This would only apply to properties with objects wrapped in Optional. As an example, the logic would apply to a property called Optional<String> myPossiblyUndefinedString but not to String myAlwaysDefinedString.
In the V1 mapper, the only way to differentiate null vs undefined is by reading and parsing the raw map.
The Jackson serializer supports this type of operation via their jackson-datatype-jdk8 module.
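The proposed semantics can be sketched in a few lines. This is a hypothetical illustration, using a plain Map<String, Object> as a stand-in for the raw attribute map; the method name is made up:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

public class NullVsUndefinedSketch {

    // Hypothetical deserialization rule for an Optional-wrapped property:
    //  - attribute present with a null value  -> Optional.empty()
    //  - attribute present with a real value  -> Optional.of(value)
    //  - attribute absent (undefined)         -> null (the field itself)
    public static Optional<String> readOptionalString(Map<String, Object> item, String name) {
        if (!item.containsKey(name)) {
            return null; // undefined: the property itself is null
        }
        return Optional.ofNullable((String) item.get(name));
    }

    public static void main(String[] args) {
        Map<String, Object> item = new HashMap<>();
        item.put("explicitNull", null);
        item.put("present", "hello");

        System.out.println(readOptionalString(item, "present"));      // Optional[hello]
        System.out.println(readOptionalString(item, "explicitNull")); // Optional.empty
        System.out.println(readOptionalString(item, "missing"));      // null
    }
}
```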
from aws-sdk-java-v2.
I would like to be able to force a list of columns. The V1 mapper tries to deal with this via reflection, allocating many strings for every object. With V1 I could force it with a projection, but DynamoDB allows that operation only when rows are not filtered. In my case, 20-30% of GC was spent dealing with that...
from aws-sdk-java-v2.
Feature request from V1: Support @DynamoDBTypeConverted on class level - aws/aws-sdk-java#1552
from aws-sdk-java-v2.
I took #1387 for a test drive this morning. There are definitely some notable improvements.
One use case I have is using Map types to store arbitrary nested data. It's not completely clear to me whether this is already handled and I missed it, or whether there is work yet to be done. I've provided a screenshot from the console showing an example item. For fields such as acls or graph, there is no schema known up front. Each one will differ in structure and data from record to record.
If I specify an attribute using Attributes.documentMap(...), then it expects me to provide a schema for the nested document.
It would be great if the mapper could store maps as is -- deeply converting each key-value pair to the matching DynamoDB primitives -- and not expect a rigid schema up front.
from aws-sdk-java-v2.
It would be great if the mapper could store maps as is -- deeply converting each key-value pair to the matching DynamoDB primitives -- and not expect a rigid schema up front.
There is a 'map' attribute type, however it expects a uniform value type to be declared so it knows how to convert the value objects. This would not work in your example where the values are a mix of nested maps or string values.
If a function could be provided that converted a Map<String, Object> to AttributeValue and back again using the necessary introspection then it would be trivial to create an AttributeType around that function, say AttributeType.nestedMap or something. The challenge for us there is to do so without opening the flood-gates to having to write introspective converters for every possible value type (as we've tried to keep things purely declarative up to this point).
We are working on a purely introspective 'BeanTableSchema' that would likely be a much better fit for this use-case. I'm going to definitely factor this feedback into that feature.
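The "deep conversion" being asked for is essentially a recursive walk over the map. A minimal sketch of that walk, using tagged strings in place of the SDK's AttributeValue so it stays self-contained (a real implementation would build AttributeValue instances instead):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class DeepConvertSketch {

    // Convert an arbitrarily nested structure of Maps, Lists, Strings,
    // Numbers and Booleans into a DynamoDB-style type tag (S, N, BOOL, M, L).
    public static String toAttributeValue(Object o) {
        if (o == null) {
            return "NULL";
        } else if (o instanceof String) {
            return "S(" + o + ")";
        } else if (o instanceof Number) {
            return "N(" + o + ")";
        } else if (o instanceof Boolean) {
            return "BOOL(" + o + ")";
        } else if (o instanceof Map) {
            // M: recurse into each value; no up-front schema required
            return "M{" + ((Map<?, ?>) o).entrySet().stream()
                .map(e -> e.getKey() + "=" + toAttributeValue(e.getValue()))
                .collect(Collectors.joining(", ")) + "}";
        } else if (o instanceof List) {
            // L: recurse into each element, which may itself be a map or list
            return "L[" + ((List<?>) o).stream()
                .map(DeepConvertSketch::toAttributeValue)
                .collect(Collectors.joining(", ")) + "]";
        }
        throw new IllegalArgumentException("Unsupported type: " + o.getClass());
    }

    public static void main(String[] args) {
        Map<String, Object> acls = new LinkedHashMap<>();
        acls.put("owner", "alice");
        acls.put("readers", List.of("bob", "carol"));
        acls.put("public", false);
        System.out.println(toAttributeValue(acls));
    }
}
```

The flood-gates concern above is visible in the final else: every additional supported value type grows this introspective dispatch.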
from aws-sdk-java-v2.
Until @DynamoDBDocument is available, can someone point me to coding best practices for storing JSON docs? I can't even find JSON utility classes in v2 -- for example, how do I convert a JSON string to a Map<String, AttributeValue>? Surely someone has a utility class for this in the project?
from aws-sdk-java-v2.
Feature request from V1:
- Support passing null to a DynamoDBTypeConverter - aws/aws-sdk-java#1408
- Support returning updated item after save operation of DynamoDBMapper - aws/aws-sdk-java#1654
from aws-sdk-java-v2.
However, we appear to be missing a test for your use-case. I'll repro and hopefully fix it. Stay tuned, and thanks for the catch.
Repro confirmed. Opened #1748
from aws-sdk-java-v2.
Issue reported external to github: DynamoDBMapper support for multiple conditions on same attribute for Save/Delete
I should be able to do this without having to throw away DynamoDBMapper and use the API directly...
expectedAttributes being a Map (in DynamoDBSaveExpression.java and DynamoDBDeleteExpression.java) is a problem: a map keyed by attribute name can hold only one condition per attribute.
It's a pretty common use case: "Upsert a record into DynamoDB as long as versionID < {myNewVersionId}", to ensure older records are not overwritten.
This fails if the record doesn't yet exist, so the intuitive solution is to change the conditional expression to a disjunction of two: versionId.exists(false) OR versionId.LE(myNewVersionID);
Any updates on this? This is indeed a very common use-case and causes a lot of pain!
from aws-sdk-java-v2.
This fails if the record doesn't yet exist, so the intuitive solution is to change the conditional expression to a list of two - versionId.exists(false) OR versionId.LE(myNewVersionID);
Any updates on this? This is indeed a very common use-case and causes a lot of pain!
When I try to parse this issue in the context of the new DynamoDB Enhanced Client we are currently previewing in v2, I'm not sure it applies. The design of the enhanced client, especially around how conditional statements are handled, is somewhat different from the v1 mapper, but I don't understand the nuances of the v1 mapper, and the issue as written here, clearly enough to state with absolute certainty that the issue is fixed.
What I'd ask you to do is take a look at the new enhanced client in v2 and see if this issue is addressed; if not, I will take another look. The new enhanced client does support the 'newer' conditional expression syntax provided by the DynamoDB API.
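For reference, under the newer expression syntax this guard collapses into a single condition expression. A sketch of the expression string and its value map (attribute and placeholder names are illustrative, and a real call would pass Map<String, AttributeValue> values rather than raw objects):

```java
import java.util.Map;

public class VersionGuardSketch {

    // Condition: succeed if the item does not exist yet, OR if the stored
    // versionId is not newer than the one we are writing.
    public static String conditionExpression() {
        return "attribute_not_exists(versionId) OR versionId <= :newVersion";
    }

    // Placeholder values for the expression; raw Object used for brevity.
    public static Map<String, Object> expressionValues(long newVersionId) {
        return Map.of(":newVersion", newVersionId);
    }

    public static void main(String[] args) {
        System.out.println(conditionExpression());
        System.out.println(expressionValues(42L));
    }
}
```

Because both clauses live in one expression string, the one-condition-per-attribute limitation of the old expectedAttributes map does not arise.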
from aws-sdk-java-v2.
@mdamak I'm sorry this is such a big pain point for you. We have a lot of other pain points we're trying to address at this time, and our customer feedback shows those (like the lack of TransferManager or Metrics in V2) are a much higher priority for us to address. Once those are addressed, we definitely want to come back and add immutable support to the enhanced client.
If you'd be willing to invest the time into implementing immutable support, we'd be willing to help you with the design and code reviews.
from aws-sdk-java-v2.
Is it possible to implement the @DynamoDBAutoGeneratedTimestamp annotation for the DynamoDB Enhanced Client? https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/dynamodbv2/datamodeling/DynamoDBAutoGeneratedTimestamp.html
This is something I find very useful since all of my tables have "created" and "updated" timestamp attributes.
Is there a way to do this with the EnhancedClient short of manually setting these fields on every write/update?
from aws-sdk-java-v2.
Is it possible to implement the @DynamoDBAutoGeneratedTimestamp annotation for the DynamoDB Enhanced Client?
The best way to do this in my opinion is to write it as an extension (see https://github.com/aws/aws-sdk-java-v2/blob/master/services-custom/dynamodb-enhanced/src/main/java/software/amazon/awssdk/enhanced/dynamodb/DynamoDbEnhancedClientExtension.java). This gives you the hooks you need to be able to do exactly what this used to do in the v1 mapper, and you can even design your own annotations for it.
For an example of how to write an extension like this, see the versioned record extension which is bundled by default : https://github.com/aws/aws-sdk-java-v2/blob/master/services-custom/dynamodb-enhanced/src/main/java/software/amazon/awssdk/enhanced/dynamodb/extensions/VersionedRecordExtension.java
And to see how this extension uses custom annotations, take a look at : https://github.com/aws/aws-sdk-java-v2/blob/master/services-custom/dynamodb-enhanced/src/main/java/software/amazon/awssdk/enhanced/dynamodb/extensions/annotations/DynamoDbVersionAttribute.java
Having said all that, if this all sounds like too much work, this is a feature we will likely get around to doing ourselves, assuming nobody submits a PR for it first. Thanks for +1'ing it.
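Conceptually, such an extension is a before-write hook that stamps attributes onto the record. A simplified, self-contained sketch of that idea (the method and map here are stand-ins for the real DynamoDbEnhancedClientExtension beforeWrite API, not the SDK's actual signatures):

```java
import java.time.Instant;
import java.util.HashMap;
import java.util.Map;

public class AutoTimestampSketch {

    // Simplified stand-in for an enhanced-client beforeWrite hook:
    // stamp "created" on the first write only, refresh "updated" every write.
    public static Map<String, Object> beforeWrite(Map<String, Object> item, Instant now) {
        Map<String, Object> stamped = new HashMap<>(item);
        stamped.putIfAbsent("created", now.toString());
        stamped.put("updated", now.toString());
        return stamped;
    }

    public static void main(String[] args) {
        Instant t0 = Instant.parse("2020-01-01T00:00:00Z");
        Instant t1 = Instant.parse("2020-06-01T00:00:00Z");

        Map<String, Object> first = beforeWrite(Map.of("id", "a1"), t0);
        Map<String, Object> second = beforeWrite(first, t1);

        System.out.println(second.get("created")); // unchanged from first write
        System.out.println(second.get("updated")); // refreshed on every write
    }
}
```

A real extension would apply this transformation inside the hook the enhanced client invokes before each put/update, driven by a custom annotation on the bean property.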
from aws-sdk-java-v2.
@bmaizels I'll give it a shot. If I have enough success I'll try to open a PR
Thanks for the quick response!
from aws-sdk-java-v2.
Immutables fans can now jump onto #1801 . We're going to start peeling off issues here with the goal of closing this issue once everyone's feedback and desires are accounted for in other places.
from aws-sdk-java-v2.
We will be splitting this issue into the remaining open feature requests for the DynamoDbEnhancedClient. See the mentions above and below to follow-up on what issues you care about the most and +1 them!
from aws-sdk-java-v2.
We think we've peeled off the remaining dynamodb-enhanced features here: dynamodb-enhanced
We might have missed some, because there were many feature requests that were very v1-specific, but I believe we've gotten them all already.
Please feel free to +1 the issues you want to put your vote behind us implementing (or ask on the issue if you want to take a stab at it yourself!).
We'll be resolving this issue, now.
from aws-sdk-java-v2.
Is it possible to implement the @DynamoDBAutoGeneratedTimestamp annotation for the DynamoDB Enhanced Client?
I created an implementation of this extension via an annotation that acts the same way @DynamoDBAutoGeneratedTimestamp works in v1.11. Feel free to modify it and integrate it into the SDK, or anyone can use it at their own discretion.
https://github.com/gakinson/dyanamodb-enhanced-DynamoDbAutoGeneratedTimestamp-annotation
Thanks,
Geoffrey K
from aws-sdk-java-v2.
Hello, I'm migrating a Java Lambda function from DynamoDB SDK 1 to DynamoDB SDK 2.
I have a table TABLE like this:
id as key
att1 as attribute
att2 as attribute
With the SDK 2 version, I use a model class with annotations to map this table, MODELTABLE:
@DynamoDbBean
public class MODELTABLE {
    String id;
    String att1;
    String att2;

    @DynamoDbPartitionKey
    @DynamoDbAttribute(value = "id")
    ...get / set for id

    @DynamoDbAttribute(value = "att1")
    ...get / set for att1

    @DynamoDbAttribute(value = "att2")
    ...get / set for att2
}
I want to get rows in the table with att2 = value. How can I do this?
I tried to use the method below, but QueryConditional is mandatory, and I only want to filter by the att2 value.
public <T> Iterator<T> getDataByAttributeValues(DynamoDbTable<T> mapper, Map<String, AttributeValue> eav, String filterExpression) {
    QueryConditional queryConditional = QueryConditional
        .keyEqualTo(Key.builder().partitionValue("KEY_VALUE").build());
    Expression scanExpression = Expression.builder()
        .expression(filterExpression)
        .expressionValues(eav)
        .build();
    QueryEnhancedRequest queryRequest = QueryEnhancedRequest.builder()
        .queryConditional(queryConditional)
        .filterExpression(scanExpression)
        .build();
    return mapper.query(queryRequest).items().iterator();
}
Thanks in advance.
from aws-sdk-java-v2.
Forget the request. I have found out how to do it. ScanEnhancedRequest is the solution:
public <T> Iterator<T> getDataByAttributeValues(DynamoDbTable<T> mapper, Map<String, AttributeValue> eav, String filterExpression) {
    Expression scanExpression = Expression.builder()
        .expression(filterExpression)
        .expressionValues(eav)
        .build();
    ScanEnhancedRequest scanRequest = ScanEnhancedRequest.builder()
        .filterExpression(scanExpression)
        .build();
    return mapper.scan(scanRequest).items().iterator();
}
from aws-sdk-java-v2.
Hi Team,
I have two tables: one is Customer and the other is Users.
The Customer table has the primary key customerId, and the Users table has the primary key userId.
I want to update the data in both tables using the batchSave method.
List objectsToWrite = Arrays.asList(cust, users);
I tried batchSave(objectsToWrite), but I am getting the exception below:
Servlet.service() for servlet [dispatcherServlet] in context with path [/v1/biz-retention-nosql-worker] threw exception [Request processing failed; nested exception is com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: class com.amazonaws.services.dynamodbv2.datamodeling.PaginatedScanList not annotated with @DynamoDBTable] with root cause
com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: class com.amazonaws.services.dynamodbv2.datamodeling.PaginatedScanList not annotated with @DynamoDBTable
Both tables are annotated with @DynamoDBTable.
from aws-sdk-java-v2.