
sdk-java's Introduction

Java SDK for CloudEvents API


The Java SDK for CloudEvents is a collection of Java packages to adopt CloudEvents in your Java application.

Using the Java SDK you can:

  • Access, create, and manipulate CloudEvents inside your application.
  • Serialize and deserialize CloudEvents using a CloudEvents Event Format, such as JSON.
  • Read and write CloudEvents to and from HTTP, Kafka, and AMQP, using the CloudEvents Protocol Binding implementations we provide for a wide range of well-known Java frameworks/libraries.

To check out the complete documentation and how to get started, look at the dedicated website https://cloudevents.github.io/sdk-java/.

Status

This SDK is considered work in progress. The community is working hard to bring you a new major version of the SDK with major enhancements both to APIs and to implementation.

If you want to know more about v1 of this SDK, check out the v1 readme

Stay tuned!

Supported features of the specification:

                          v0.3   v1.0
CloudEvents Core           ✔️     ✔️
AMQP Protocol Binding
- Proton                   ✔️     ✔️
AVRO Event Format
HTTP Protocol Binding      ✔️     ✔️
- Vert.x                   ✔️     ✔️
- Jakarta Restful WS       ✔️     ✔️
- Basic                    ✔️     ✔️
- Spring                   ✔️     ✔️
- http4k†                  ✔️     ✔️
JSON Event Format          ✔️     ✔️
- Jackson                  ✔️     ✔️
Protobuf Event Format      ✔️     ✔️
- Proto                    ✔️     ✔️
Kafka Protocol Binding     ✔️     ✔️
MQTT Protocol Binding
NATS Protocol Binding
Web hook

† Source/artifacts hosted externally

Documentation

Documentation is available at https://cloudevents.github.io/sdk-java/.

Javadocs are available on javadoc.io.

You can check out the examples in the examples directory.

Used By

  • Occurrent
  • Knative Eventing
  • http4k

Community

Each SDK may have its own unique processes, tooling, and guidelines; common governance-related material can be found in the CloudEvents community directory. In particular, there you will find information concerning how SDK projects are managed, guidelines for PR reviews and approval, and our Code of Conduct.

If there is a security concern with one of the CloudEvents specifications, or with one of the project's SDKs, please send an email to [email protected].



sdk-java's Issues

Should CDI & Vert.x support be removed from core SDK

As a core SDK, would it make more sense for any concrete implementations to live in their own GitHub projects? That way the CloudEvents implementations could reside under their parent's GitHub org, such as vertx-3/vertx-cloudevents, and attract a wider audience. It would also remove maintenance burden from the core project.

Extended attributes on json

Dear team,
How can I handle custom attributes using the JSON Java SDK?
I need to work with something like this:

{
  "specversion" : "1.0",
  "type" : "com.example.someevent",
  "source" : "/mycontext",
  "id" : "A234-1234-1234",
  "time" : "2018-04-05T17:31:00Z",
  "Custom Attribute1" : "bar",
  "Custom Attribute2" : "foo",
  "datacontenttype" : "text/xml",
  "data" : "<much wow=\"xml\"/>"
}

and manage these custom attributes on the CloudEvent object.
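The CloudEvents specification treats attributes beyond the defined context attributes as extension attributes, and it expects attribute names to consist of lowercase letters and digits, so a name like "Custom Attribute1" would first need to be normalized (e.g. to customattribute1). As a stdlib-only sketch of the idea, not SDK API (the class and method names below are hypothetical), custom attributes in a parsed JSON event can be collected by excluding the spec-defined ones:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Hypothetical helper: splits a parsed CloudEvent JSON object into
// spec-defined context attributes and custom (extension) attributes.
// Extension attributes are simply whatever keys are not in the spec.
public class ExtensionSplitter {
    private static final Set<String> SPEC_ATTRIBUTES = Set.of(
        "specversion", "type", "source", "id", "time",
        "datacontenttype", "dataschema", "subject", "data");

    public static Map<String, Object> extensionsOf(Map<String, Object> jsonEvent) {
        Map<String, Object> extensions = new HashMap<>();
        for (Map.Entry<String, Object> e : jsonEvent.entrySet()) {
            if (!SPEC_ATTRIBUTES.contains(e.getKey())) {
                extensions.put(e.getKey(), e.getValue());
            }
        }
        return extensions;
    }
}
```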

Content Type being filtered out at the consumer

Hey,

I have been trying to use CloudEvents along with Kafka. For the message format, I am using Avro.
I am using the Java SDK version 1.3.0 with the CloudEvent producers and consumers.

My observations:

1. In the HeaderMapper class, https://github.com/cloudevents/sdk-java/blob/361a34cc639ddaa75b2a5080f117fc282be7625b/kafka/src/main/java/io/cloudevents/v1/kafka/HeaderMapper.java#L53, you filter out datacontenttype from being mapped to a header prefixed with ce. Later, at https://github.com/cloudevents/sdk-java/blob/361a34cc639ddaa75b2a5080f117fc282be7625b/kafka/src/main/java/io/cloudevents/v1/kafka/HeaderMapper.java#L76, you add it to the headers under content-type. So after marshalling, the header key is content-type.
2. When a consumer reads and unmarshals this header (I am using the binary unmarshaller), the content-type header is filtered out at https://github.com/cloudevents/sdk-java/blob/361a34cc639ddaa75b2a5080f117fc282be7625b/kafka/src/main/java/io/cloudevents/v1/kafka/AttributeMapper.java#L60.
3. As a result, when the AttributesImpl object is created through https://github.com/cloudevents/sdk-java/blob/242b58a5309d319d24a907077e553c19d15a54fe/api/src/main/java/io/cloudevents/v1/AttributesImpl.java#L182, datacontenttype is not a field within the attributes map.
4. Then, when the data is being deserialized, the code calls getMediaType() on the AttributesImpl object at https://github.com/cloudevents/sdk-java/blob/361a34cc639ddaa75b2a5080f117fc282be7625b/api/src/main/java/io/cloudevents/format/BinaryUnmarshaller.java#L222 and gets null, so the data is never deserialized.

Am I using the SDK the wrong way? It would be great to get an example for the Avro Kafka piece with this SDK.

CloudEvents prefix for extension attributes in Kafka headers

The Java SDK does not currently prefix extension attributes with "ce_" in the headers.

Section 3.2.3.1 of the specification for Kafka Protocol Binding states that "CloudEvent attributes are prefixed with ce_ for use in the message-headers section." While the specification does not explicitly state either way whether this applies also to extension attributes, I can't see a good reason why they would be treated differently.

Is the different treatment of extension attributes in the SDK intentional, or do you agree that they should also be prefixed with "ce_"?
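For illustration, here is a stdlib-only sketch of the mapping the quoted spec section describes (the helper below is hypothetical, not SDK API): every attribute, extensions included, gets the ce_ prefix, while datacontenttype maps to the plain content-type header in binary content mode:

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

// Hypothetical helper: maps CloudEvent context and extension attributes
// to Kafka-style headers, prefixing every attribute name with "ce_" as
// section 3.2.3.1 of the Kafka Protocol Binding requires; datacontenttype
// is the exception and maps to the plain "content-type" header.
public class CeHeaderPrefixer {
    public static Map<String, byte[]> toKafkaHeaders(Map<String, String> attributes) {
        Map<String, byte[]> headers = new HashMap<>();
        for (Map.Entry<String, String> e : attributes.entrySet()) {
            if (e.getKey().equals("datacontenttype")) {
                headers.put("content-type", e.getValue().getBytes(StandardCharsets.UTF_8));
            } else {
                headers.put("ce_" + e.getKey(), e.getValue().getBytes(StandardCharsets.UTF_8));
            }
        }
        return headers;
    }
}
```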

Modularization: kafka module

This module implements the Kafka Protocol Binding

This module already exists, but it needs refactoring.

  • kafka

This module depends on the api module.

When someone wants the Kafka binding to create or read events, they add just this module, avoiding potential conflicts with pre-existing dependencies in their projects.

Backwards compatibility future plans

Thank you for this great library. We have integrated CloudEvents in our system already and would like to start using this library too.

There were some breaking changes between v0.1 and v0.2 in the CloudEvents specs and I see that at the moment cloudevents/sdk-java is backwards compatible. (example)

But since these are versions before the major version v1.0, could you please tell me if you intend to always keep backwards compatibility in the future for any new versions of CloudEvents specs (for any future breaking changes)?

It is important for us to be able to gradually update our events with every new CloudEvent version. This results in a Kafka topic with multiple versions of CloudEvents. To avoid creating custom adapters, it would be great to only rely on the cloudevents/sdk-java library no matter which version of CloudEvents we use.

Thanks,
Cristian

Modularization: http-vertx module

This module implements the HTTP Protocol Binding with Vert.x.

We already have this module, but it needs refactoring.

  • http-vertx

This module depends on the api module.

When someone wants the HTTP binding to create or read events with Vert.x, they add just this module, avoiding potential conflicts with pre-existing dependencies in their projects.

Please Support JavaTimeModule

It would be great if this SDK used a Jackson object mapper configured like the following. I don't believe you would then need any special ZonedDateTime code; this should handle it and allow for greater flexibility.

        final var objectMapper = JsonMapper.builder()
            .addModule(new ParameterNamesModule(JsonCreator.Mode.DEFAULT))
            .addModule(new Jdk8Module())
            .addModule(new JavaTimeModule())
            .addModule(new GuavaModule())
            .addModule(new HppcModule())
            .addModule(new PCollectionsModule())
            .addModule(new EclipseCollectionsModule())
            .serializationInclusion(JsonInclude.Include.NON_NULL)
            .disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS)
            .disable(SerializationFeature.WRITE_DURATIONS_AS_TIMESTAMPS)
            .disable(SerializationFeature.WRITE_DATE_KEYS_AS_TIMESTAMPS)
            .disable(DeserializationFeature.ADJUST_DATES_TO_CONTEXT_TIME_ZONE)
            .build();

Here are the references from my build.gradle.kts, if that helps:

dependencies {
    implementation("io.vertx:vertx-core:3.9.0")
    implementation("software.amazon.awssdk:s3:2.11.6")
    implementation("com.google.guava:guava:28.2-jre")
    implementation("com.carrotsearch:hppc:0.8.1")
    implementation("org.pcollections:pcollections:3.1.3")
    implementation("org.eclipse.collections:eclipse-collections:10.2.0")
    implementation("io.cloudevents:cloudevents-api:1.3.0")
    implementation("org.slf4j:slf4j-log4j12:1.7.30")
    implementation("org.slf4j:slf4j-api:1.7.30")
    implementation("com.fasterxml.jackson.core:jackson-core:2.10.3")
    implementation("com.fasterxml.jackson.core:jackson-annotations:2.10.3")
    implementation("com.fasterxml.jackson.core:jackson-databind:2.10.3")
    implementation("com.fasterxml.jackson.module:jackson-module-parameter-names:2.10.3")
    implementation("com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.10.3")
    implementation("com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.10.3")
    implementation("com.fasterxml.jackson.datatype:jackson-datatype-guava:2.10.3")
    implementation("com.fasterxml.jackson.datatype:jackson-datatype-hppc:2.10.3")
    implementation("com.fasterxml.jackson.datatype:jackson-datatype-pcollections:2.10.3")
    implementation("com.fasterxml.jackson.datatype:jackson-datatype-eclipse-collections:2.10.3")
    implementation("javax.validation:validation-api:2.0.1.Final")

    testImplementation("org.junit.jupiter:junit-jupiter-api:5.6.1")

    testRuntimeOnly("org.junit.jupiter:junit-jupiter-engine:5.6.1")
}

Here is a test I wrote:

    @Test
    void zonedDateTimeDoesNotHaveTimezone()
        throws JsonProcessingException {
        final var data = "{\"zonedDateTime\":\"2020-04-09T01:14:10.698998-04:00\"}";
        final var zonedDateTimeObject = objectMapper.readValue(data, ZonedDateTimeObject.class);
        final var zonedDateTimeString = objectMapper.writeValueAsString(zonedDateTimeObject);
        assertEquals(data, zonedDateTimeString);
    }
import java.time.ZonedDateTime;

public class ZonedDateTimeObject {
    public ZonedDateTime zonedDateTime;
}

Kafka Producer Factory

I'm trying to decide whether a Kafka producer factory would be helpful for the SDK. For now, I'll implement CloudEventsKafkaProducerFactory for my project, but I can imagine that this could be useful for others.

Fix repository description

The repository description on GitHub and the stub README currently say: "Javascript SDK for CloudEvents"

Given the name of the repository (sdk-java) and the first PR (#1), I guess that it should be "Java SDK for CloudEvents".

toString() equals() and hashCode() methods

Hello,

toString(), equals() and hashCode() are missing from the CloudEvent implementations. Is there any reason for not including these?

I am mostly interested in toString() for logging purposes.

Should I open a PR and add these methods to all CloudEvent implementations?
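For reference, a sketch of what such methods could look like on a small immutable event-like class (this class is hypothetical, not the SDK's CloudEventImpl): all context attributes participate in equals()/hashCode(), and toString() produces a loggable summary.

```java
import java.net.URI;
import java.util.Objects;

// Hypothetical value class showing the shape such methods could take on a
// CloudEvent implementation: equals()/hashCode() use every attribute, and
// toString() gives a compact summary suitable for logging.
public final class SimpleEvent {
    private final String id;
    private final String type;
    private final URI source;

    public SimpleEvent(String id, String type, URI source) {
        this.id = id;
        this.type = type;
        this.source = source;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof SimpleEvent)) return false;
        SimpleEvent that = (SimpleEvent) o;
        return id.equals(that.id) && type.equals(that.type) && source.equals(that.source);
    }

    @Override
    public int hashCode() {
        return Objects.hash(id, type, source);
    }

    @Override
    public String toString() {
        return "SimpleEvent{id=" + id + ", type=" + type + ", source=" + source + "}";
    }
}
```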

Kafka binary content type not set

According to the Binary content mode spec:

For the binary mode, the header content-type property MUST be mapped directly to the CloudEvents datacontenttype attribute.

However, it appears that this header is not being set properly.

I'm not really sure what should be responsible for setting this header. The HeaderMapper seems like a good candidate, but it is not aware of whether the content mode is structured or not. It seems the only things aware of the content mode are the Marshallers, and that is also where the content type is set for the structured mode.

For example, something along the lines of:

public static <T> EventStep<AttributesImpl, T, byte[], byte[]> binary() {
    return BinaryMarshaller.<AttributesImpl, T, byte[], byte[]>builder()
        .map(AttributesImpl::marshal)
        .map(Accessor::extensionsOf)
        .map(ExtensionFormat::marshal)
        .map(HeaderMapper::map)
        .map(Json::binaryMarshal)
        .builder((payload, headers) -> {
            headers.putIfAbsent("content-type", headers.get("ce_datacontenttype"));
            return new Wire<>(payload, headers);
        });
}

It's also the case that only the ce_datacontenttype field is considered when unmarshalling, when the content-type field should probably be used.

Modularization: json-jackson

This module implements the JSON Event Format with Jackson.

We already have some good work on this, but it needs refactoring.

  • json-jackson

This module depends on the api module.

When someone wants the JSON format to create or read events, they add just this module, avoiding potential conflicts with pre-existing dependencies in their projects.

Modularization: api module

This module carries just the API definitions, free of any kind of external dependency.

  • api

When someone needs the API in their project, they add just this module and avoid any kind of conflict with pre-existing dependencies.

java.lang.IllegalStateException: Failed to decode when unmarshalling event

I have a unit test incorporating Spring Boot and CloudEvents:

    @Test
    public void testEventEndpoint() throws Exception {
        String endpoint = "http://localhost:" + port + "/";
        Map<String, String> testValue = new HashMap<>();
        testValue.put("test", "value");
        String body = Json.encode(testValue);
        
        /*Create a tracing extension*/
        final DistributedTracingExtension dt = new DistributedTracingExtension();
        dt.setTraceparent("0");
        dt.setTracestate("congo=4");
        /*Format it as extension format*/
        final ExtensionFormat tracing = new DistributedTracingExtension.Format(dt);
        /* Build a CloudEvent instance */
        CloudEventImpl<String> ce =
            CloudEventBuilder.<String>builder()
                .withType("knative.eventing.test")
                .withSource(URI.create("https://github.com/cloudevents/spec/pull"))
                .withId("A234-1234-1234")                   
                .withTime(ZonedDateTime.now())
                .withContenttype(MediaType.APPLICATION_JSON_VALUE)
                .withData(body)
                .withExtension(tracing)
                .build();

        /* Marshal the event as a Wire instance and grab the headers and body*/
        Wire<String, String, String> wire = Marshallers.<String>binary()
                .withEvent(() -> ce)
                .marshal();
        Map<String, String> headerVals = wire.getHeaders();
        HttpHeaders headers = new HttpHeaders();
        for (Entry<String, String> entry : headerVals.entrySet()) {
            String key = entry.getKey();
            String value = entry.getValue();
            List<String> vals;
            if (headers.containsKey(key)) {
                vals = headers.get(key);
            } else {
                vals = new ArrayList<>();
                headers.put(key, vals);
            }
            vals.add(value);
        }

        System.out.println("Cloud Event headers: " + headers);
        System.out.println("Cloud Event body: " + wire.getPayload().get());
        HttpEntity<String> request = new HttpEntity<>(wire.getPayload().get(), headers);
        ResponseEntity<Void> result = this.server.postForEntity(new URI(endpoint), request, Void.class);
        assertEquals("Unexpected response code", HttpStatus.ACCEPTED, result.getStatusCode());
        
        endpoint = "http://localhost:" + port + "/v1/events";
        String response = server.getForObject(endpoint, String.class);
        System.out.println("Events server response: " + response);
        assertFalse("Error response from server", response.startsWith("ERROR"));
        assertFalse("Warning response from server", response.startsWith("WARNING"));
        assertTrue("Unexpected events response from server: " + response, response.startsWith("NUMBER OF EVENTS"));
    }

The test is failing with this exception. It appears to be unable to handle the JSON string I'm passing in as the body. Am I doing something wrong in the unit test?

Cloud Event headers: [ce-source:"https://github.com/cloudevents/spec/pull", tracestate:"congo=4", ce-specversion:"0.2", ce-type:"knative.eventing.test", ce-id:"A234-1234-1234", traceparent:"0", ce-time:"2019-10-09T14:51:18.529-05:00", Content-Type:"application/json"]
Cloud Event body: "{\"test\":\"value\"}"
Receved request headers: {accept=application/json, application/*+json, ce-source=https://github.com/cloudevents/spec/pull, tracestate=congo=4, ce-specversion=0.2, ce-type=knative.eventing.test, ce-id=A234-1234-1234, traceparent=0, ce-time=2019-10-09T14:51:18.529-05:00, uber-trace-id=3cfe6a1aff615223%3A3cfe6a1aff615223%3A0%3A1, content-type=application/json, content-length=22, host=localhost:64232, connection=Keep-Alive, accept-encoding=gzip, user-agent=okhttp/3.9.0}
Received request body: "{\"test\":\"value\"}"
2019-10-09 14:51:18.719  INFO 20641 --- [o-auto-1-exec-2] i.j.internal.reporters.LoggingReporter   : Span reported: 3cfe6a1aff615223:da07a85ce3d80d2c:3cfe6a1aff615223:1 - event
2019-10-09 14:51:18.729 ERROR 20641 --- [o-auto-1-exec-2] o.a.c.c.C.[.[.[/].[dispatcherServlet]    : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is java.lang.IllegalStateException: Failed to decode: Cannot construct instance of `java.util.LinkedHashMap` (although at least one Creator exists): no String-argument constructor/factory method to deserialize from String value ('{"test":"value"}')
 at [Source: (String)""{\"test\":\"value\"}""; line: 1, column: 1]] with root cause

java.lang.IllegalStateException: Failed to decode: Cannot construct instance of `java.util.LinkedHashMap` (although at least one Creator exists): no String-argument constructor/factory method to deserialize from String value ('{"test":"value"}')
 at [Source: (String)""{\"test\":\"value\"}""; line: 1, column: 1]
	at io.cloudevents.json.Json.decodeValue(Json.java:104) ~[cloudevents-api-0.3.1.jar:na]
	at io.cloudevents.json.Json$1.unmarshal(Json.java:220) ~[cloudevents-api-0.3.1.jar:na]
	at io.cloudevents.json.Json$1.unmarshal(Json.java:217) ~[cloudevents-api-0.3.1.jar:na]
	at io.cloudevents.format.BinaryUnmarshaller$Builder.lambda$unmarshal$2(BinaryUnmarshaller.java:227) ~[cloudevents-api-0.3.1.jar:na]
	at java.util.Optional.map(Optional.java:215) ~[na:1.8.0_181]
	at io.cloudevents.format.BinaryUnmarshaller$Builder.unmarshal(BinaryUnmarshaller.java:226) ~[cloudevents-api-0.3.1.jar:na]
	at application.rest.v1.Events.event(Events.java:77) ~[classes/:na]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_181]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_181]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_181]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_181]

AMQP 1.0 transport

We should start evaluating adding AMQP 1.0 support, e.g. using Apache Qpid's proton-j library.

NPE unmarshalling an event

I'm using Spring for REST and trying to unmarshall an event coming in with this code snippet:

@PostMapping(value = "/", consumes = MediaType.APPLICATION_JSON_VALUE)
    public ResponseEntity<Void> event(@RequestHeader Map<String, Object> headers,
            @RequestBody String body) {
        System.out.println("Receved request headers: " + headers);
        System.out.println("Received request body: " + body);
        if (MediaType.APPLICATION_JSON_VALUE.equalsIgnoreCase(headers.get("content-type").toString())) {
            @SuppressWarnings("rawtypes")
            CloudEvent<AttributesImpl, Map> cloudEvent = Unmarshallers.binary(Map.class)
                .withHeaders(() -> headers)
                .withPayload(() -> body)
                .unmarshal();
            System.out.println("Received CloudEvent: " + cloudEvent);
            //...
        }
        return ResponseEntity.accepted().build();
    }

I get these headers and body:

Receved request headers: {host=****, user-agent=Go-http-client/1.1, content-length=40, accept-encoding=gzip, ce-id=81063005-d897-4162-8a79-5224bdde4d81, ce-source=/apis/v1/namespaces/default/cronjobsources/test-cronjob-source, ce-specversion=0.3, ce-time=2019-10-08T12:56:00.000398275Z, ce-type=dev.knative.cronjob.event, content-type=application/json, forwarded=for=172.30.252.225;proto=http, for=127.0.0.1, k-proxy-request=activator, x-b3-parentspanid=08fdcef844c5b6e6, x-b3-sampled=0, x-b3-spanid=7d2298100968d43e, x-b3-traceid=ff56838a09c39d245ad334a89c91a0c8, x-envoy-decorator-operation=javaspringknativecloudant-vwv5h-nlf5f.default.svc.cluster.local:80/*, x-forwarded-for=172.30.252.225, 127.0.0.1, 172.30.233.147, x-forwarded-proto=http, x-istio-attributes=ClsKF2Rlc3RpbmF0aW9uLnNlcnZpY2UudWlkEkASPmlzdGlvOi8vZGVmYXVsdC9zZXJ2aWNlcy9qYXZhc3ByaW5na25hdGl2ZWNsb3VkYW50LXZ3djVoLW5sZjVmCl0KGGRlc3RpbmF0aW9uLnNlcnZpY2UuaG9zdBJBEj9qYXZhc3ByaW5na25hdGl2ZWNsb3VkYW50LXZ3djVoLW5sZjVmLmRlZmF1bHQuc3ZjLmNsdXN0ZXIubG9jYWwKQwoYZGVzdGluYXRpb24uc2VydmljZS5uYW1lEicSJWphdmFzcHJpbmdrbmF0aXZlY2xvdWRhbnQtdnd2NWgtbmxmNWYKKgodZGVzdGluYXRpb24uc2VydmljZS5uYW1lc3BhY2USCRIHZGVmYXVsdApHCgpzb3VyY2UudWlkEjkSN2t1YmVybmV0ZXM6Ly9hY3RpdmF0b3ItNzY1NDc1OTU0Ny00ZDd2bS5rbmF0aXZlLXNlcnZpbmc=, x-request-id=6041ed42-e2e0-46b5-b6eb-c5ff88120464}
Received request body: {"message":"Hello world! This is CRON!"}

And I get this NPE. Am I doing something wrong in my usage of the SDK?

java.lang.NullPointerException: null
	at sun.reflect.annotation.TypeAnnotationParser.mapTypeAnnotations(Unknown Source) ~[na:1.8.0]
	at sun.reflect.annotation.AnnotatedTypeFactory$AnnotatedTypeBaseImpl.<init>(Unknown Source) ~[na:1.8.0]
	at sun.reflect.annotation.AnnotatedTypeFactory.buildAnnotatedType(Unknown Source) ~[na:1.8.0]
	at sun.reflect.annotation.TypeAnnotationParser.buildAnnotatedType(Unknown Source) ~[na:1.8.0]
	at java.lang.reflect.Field.getAnnotatedType(Unknown Source) ~[na:1.8.0]
	at org.hibernate.validator.internal.metadata.provider.TypeAnnotationAwareMetaDataProvider.findTypeAnnotationConstraintsForMember(TypeAnnotationAwareMetaDataProvider.java:65) ~[hibernate-validator-5.3.6.Final.jar!/:5.3.6.Final]
	at org.hibernate.validator.internal.metadata.provider.AnnotationMetaDataProvider.findPropertyMetaData(AnnotationMetaDataProvider.java:244) ~[hibernate-validator-5.3.6.Final.jar!/:5.3.6.Final]
	at org.hibernate.validator.internal.metadata.provider.AnnotationMetaDataProvider.getFieldMetaData(AnnotationMetaDataProvider.java:227) ~[hibernate-validator-5.3.6.Final.jar!/:5.3.6.Final]
	at org.hibernate.validator.internal.metadata.provider.AnnotationMetaDataProvider.retrieveBeanConfiguration(AnnotationMetaDataProvider.java:137) ~[hibernate-validator-5.3.6.Final.jar!/:5.3.6.Final]
	at org.hibernate.validator.internal.metadata.provider.AnnotationMetaDataProvider.getBeanConfiguration(AnnotationMetaDataProvider.java:125) ~[hibernate-validator-5.3.6.Final.jar!/:5.3.6.Final]
	at org.hibernate.validator.internal.metadata.provider.AnnotationMetaDataProvider.getBeanConfigurationForHierarchy(AnnotationMetaDataProvider.java:108) ~[hibernate-validator-5.3.6.Final.jar!/:5.3.6.Final]
	at org.hibernate.validator.internal.metadata.BeanMetaDataManager.createBeanMetaData(BeanMetaDataManager.java:203) ~[hibernate-validator-5.3.6.Final.jar!/:5.3.6.Final]
	at org.hibernate.validator.internal.metadata.BeanMetaDataManager.getOrCreateBeanMetaData(BeanMetaDataManager.java:231) ~[hibernate-validator-5.3.6.Final.jar!/:5.3.6.Final]
	at org.hibernate.validator.internal.metadata.BeanMetaDataManager.isConstrained(BeanMetaDataManager.java:174) ~[hibernate-validator-5.3.6.Final.jar!/:5.3.6.Final]
	at org.hibernate.validator.internal.engine.ValidatorImpl.validate(ValidatorImpl.java:195) ~[hibernate-validator-5.3.6.Final.jar!/:5.3.6.Final]
	at io.cloudevents.v02.CloudEventBuilder.build(CloudEventBuilder.java:183) ~[cloudevents-api-0.3.1.jar!/:na]
	at io.cloudevents.v02.CloudEventBuilder.of(CloudEventBuilder.java:158) ~[cloudevents-api-0.3.1.jar!/:na]
	at io.cloudevents.v02.CloudEventBuilder.build(CloudEventBuilder.java:164) ~[cloudevents-api-0.3.1.jar!/:na]
	at io.cloudevents.v02.http.Unmarshallers$$Lambda$22.00000000182159E0.build(Unknown Source) ~[na:na]
	at io.cloudevents.format.BinaryUnmarshaller$Builder.unmarshal(BinaryUnmarshaller.java:241) ~[cloudevents-api-0.3.1.jar!/:na]
	at application.rest.v1.Events.event(Events.java:75) ~[classes!/:1.0-SNAPSHOT]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0]

When adding an extension to a builder it is not serialized to the message

@JsonIgnoreProperties(value = { "eventTypeVersion", "extensions" }) // was removed from 0.1

When building a CloudEvent like the one below:

    final DistributedTracingExtension dte = new DistributedTracingExtension();
    dte.setTraceparent("00-0af7651916cd43dd8448eb211c80319c-b7ad6b7169203331-01");
    dte.setTracestate("congo=BleGNlZWRzIHRohbCBwbGVhc3VyZS4");

    return new CloudEventBuilder<T>()
        .contentType(ContentType.APPLICATION_JSON.getMimeType())
        .id("123")
        .time(ZonedDateTime.ofInstant(Instant.now(), ZoneOffset.UTC))
        .type("evType")
        .source("a.b.c")
        .data(entity)
        .extension(dte)
        .build();

The extension is not being serialized when creating the JSON event.

JMS Adaptor

Is there an adaptor/wrapper for JMS client, thus allowing easy plug and play with existing MoM solutions that support topics, without the need to make an adaptor for each specific one?

ZonedDateTime field limits interoperability

Currently the time field is a ZonedDateTime. This requires a custom Jackson deserializer, ZonedDateTimeDeserializer. Sending a CloudEvent via HTTP then causes problems on the receiving end. IMHO it would be better to represent time with a more portable type, e.g. long.
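Since the CloudEvents spec defines time as an RFC 3339 timestamp, another portable option is to keep the wire value as a plain String and convert at the edges with the JDK's own formatter, avoiding any custom Jackson (de)serializer. A minimal sketch, with a hypothetical helper class (not SDK API):

```java
import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;

// Sketch: the "time" attribute stays an RFC 3339 string on the wire and is
// parsed into a java.time type only where the application needs it.
// ISO_OFFSET_DATE_TIME covers the RFC 3339 date-time shapes the spec uses.
public class Rfc3339Time {
    private static final DateTimeFormatter FMT = DateTimeFormatter.ISO_OFFSET_DATE_TIME;

    public static String format(OffsetDateTime t) {
        return FMT.format(t);
    }

    public static OffsetDateTime parse(String s) {
        return OffsetDateTime.parse(s, FMT);
    }
}
```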

Kafka binary content type not read

#67 fixed the problem where the Kafka content-type header is not written. However, it appears io.cloudevents.v1.kafka.AttributeMapper does not attempt to read the content type from the content-type header.

Support Java 1.7

There's a lot of Java 8 usage in sdk-java, like ZonedDateTime. Is there an adapted version for Java 7?

Serialization Problems

Hi there,

I use CloudEvents SDK version 1.0.0 and Jackson 2.9.8.

When I run the following test, the "data" node appears twice in the resulting JSON (see the last println line):

import io.cloudevents.CloudEvent;
import io.cloudevents.extensions.ExtensionFormat;
import io.cloudevents.format.StructuredMarshaller;
import io.cloudevents.format.Wire;
import io.cloudevents.format.builder.EventStep;
import io.cloudevents.json.Json;
import io.cloudevents.v1.Accessor;
import io.cloudevents.v1.AttributesImpl;
import io.cloudevents.v1.CloudEventBuilder;
import io.cloudevents.v1.kafka.HeaderMapper;
import org.junit.Test;

import java.net.URI;
import java.net.URISyntaxException;
import java.time.ZonedDateTime;
import java.util.UUID;

public class ATest {

    public static class Amount {
        private int a;

        public Amount() {}

        public Amount(int a) {
            this.a = a;
        }

        public int getA() {
            return a;
        }

        public void setA(int a) {
            this.a = a;
        }
    }

    @Test
    public void test1() throws URISyntaxException {
        Amount payload = new Amount(1);
        CloudEvent<AttributesImpl, Amount> event =
                CloudEventBuilder.<Amount>builder()
                        .withId(UUID.randomUUID().toString())
                        .withTime(ZonedDateTime.now())
                        .withDataContentType("application/cloudevent+json")
                        .withDataschema(new URI("structured_schema"))
                        .withSource(new URI("structured_source"))
                        .withSubject("structured_subject")
                        .withData(payload)
                        .withType("structured_type")
                        .build();
        EventStep<AttributesImpl, Amount, byte[], byte[]> builder =
                StructuredMarshaller.<AttributesImpl, Amount, byte[], byte[]>builder()
                        .mime("Content-Type", "application/cloudevents+json".getBytes())
                        .map((e) -> Json.binaryEncode(e))
                        .map(Accessor::extensionsOf)
                        .map(ExtensionFormat::marshal)
                        .map(HeaderMapper::map);
        Wire<byte[], String, byte[]> wire = builder.withEvent(() -> event).marshal();
        System.err.println(new String(wire.getPayload().get()));
    }
}

So the output is something like:

{
  "data": {
    "a": 1
  },
  "id": "88f599ed-e684-4746-a545-5abdd01ac3a5",
  "source": "structured_source",
  "specversion": "1.0",
  "type": "structured_type",
  "datacontenttype": "application/cloudevent+json",
  "dataschema": "structured_schema",
  "subject": "structured_subject",
  "time": "2019-11-20T14:03:53.613+01:00",
  "data": {
    "a": 1
  }
}

Could you please tell me what I am doing wrong?

TIA,

Johannes Hampel

Maven Central Publish

Maven projects typically release their bits (JAR files) to Maven Central.

Proposal is to use the io.cloudevents package name.

We need to file a JIRA issue at the Sonatype OSSRH JIRA instance:
https://issues.sonatype.org/browse/OSSRH

There we identify a few settings for permissions.

When filing the JIRA issue, I will get asked if I own the domain, which I do not 😄

I think we need some sort of statement that this groupId (io.cloudevents) and the matching domain (cloudevents.io) are owned by the CNCF, and that we are requesting "release permission" on behalf of the CNCF.

Support Kafka Interceptors for general Producer Consumer support

Kafka Interceptors have full access to Header, K, and V, and would allow the Marshaller to proxy the event and map to/from the wire format. This approach would mean CE could be plugged into existing Kafka applications and applied across all supported Kafka client language bindings for CE SDKs.

Modularization: avro module

This module implements the Avro Event Format.

  • avro

This module depends on the api module.

When someone wants the Avro format to create or read events, they add just this module, avoiding potential conflicts with pre-existing dependencies in their projects.

Manage custom attributes

Dear team,
How can I handle custom attributes using the JSON Java SDK?
I need to work with something like this:

{
  "specversion" : "1.0",
  "type" : "com.example.someevent",
  "source" : "/mycontext",
  "id" : "A234-1234-1234",
  "time" : "2018-04-05T17:31:00Z",
  "Custom Attribute1" : "bar",
  "Custom Attribute2" : "foo",
  "datacontenttype" : "text/xml",
  "data" : ""
}

and manage these custom attributes on the CloudEvent object, like:
CloudEventImpl cloudEvent = CloudEventBuilder.builder()
    .withType("Cloudevent")
    .withId("298262ed-eb85-4de7-95b3-efcd42d823d4")
    .withSource(URI.create("/EventManagementTest"))
    .withData("HELLO".getBytes())
    .withTime(dateNow)
    .withSubject("Cloud Event")
    .withDataContentType(URI.create("http://my.br").toString())
    .withDataschema(URI.create("application/json"))
    .withExtension(tracing)
    .withCustomAttribute(CUSTOM ATTRIBUTE)
    .build();

Snapshots repositories not in readme

Currently the README.md points to version 0.3.0, which is not available in Maven Central.
I am assuming there is a snapshot or nightly-build repository where 0.3.0 can be found;
it would be great to add that to the README.

In the meantime, can someone point me to that repository?
