cloudevents / sdk-javascript
JavaScript/TypeScript SDK for CloudEvents
Home Page: https://cloudevents.github.io/sdk-javascript/
License: Apache License 2.0
Based on some discussion in this pull request, there are some improvements that could be made to the Cloudevent.format() function.

The formatting logic lives in the spec instance and not in Cloudevent itself. The Cloudevent object conflates the HTTP Content-Type header with the CloudEvent specification's datacontenttype attribute. In fact, there is no datacontenttype attribute on the Cloudevent object at all.

The API for a CloudEvent exposed in this module should map to the specification and not to HTTP.
The standard-version package automates releases and nicely handles changelog updates, git repository tags, and semver-based version bumps for each release. I recommend using it to simplify these things.
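A minimal setup, assuming the conventional npm script wiring for standard-version (the script name here is arbitrary):

```json
{
  "scripts": {
    "release": "standard-version"
  }
}
```

Running `npm run release` would then bump the version, update the changelog, and create the release tag in one step.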
The JavaScript SDK uses a Java-like builder pattern for creating events.
This is not ideal: it is not very JavaScript-like, and it is unnecessarily wordy. Since we are in JavaScript, we can use plain object literals for creating the CloudEvent payload.
Here is a sample taken from the README:
let myevent = event()
.source('/source')
.type('type')
.dataContentType('text/plain')
.dataschema('http://d.schema.com/my.json')
.subject('cha.json')
.data('my-data')
.addExtension("my-ext", "0x600");
A more direct alternative using an object literal:

import {CloudEvent} from 'cloudevents/v1';
let myevent = new CloudEvent({
source: '/source',
type: 'type',
dataContentType: 'text/plain',
dataschema: 'http://d.schema.com/my.json',
subject: 'cha.json',
data: 'my-data',
addExtension: ["my-ext", "0x600"]
});
We need to use mTLS authentication against an https endpoint, with a certificate and key, when submitting cloud events.
We got the following error:
"The "options.agent" property must be one of type Agent-like Object, undefined, or false."
when using the following code:
var options = {
  method: "POST",
  url: requestUrl,
  httpsAgent: new https.Agent({
    cert: fs.readFileSync(cf_cert),
    key: fs.readFileSync(cf_key)
  })
};

var binding = new cloudeventsSDK.StructuredHTTPEmitter(options);
binding.emit(ce)
  .then(response => { callback(null); })
  .catch(err => { callback(err); });
Lines 6 and 23 of emitter_structured.js (and possibly other emitters as well) cause the issue: the code assumes the configuration is a plain JSON object, but in the case of httpsAgent the underlying module requires the httpsAgent attribute to be an 'Agent-like' object.
$ npm install
npm ERR! code 1
npm ERR! Command failed: git checkout v1
npm ERR! error: pathspec 'v1' did not match any file(s) known to git
npm ERR!
I'm going to send a PR to fix this and align it with the approach used by the TypeScript example.
Providing a subject key in a structured payload falls into the extension flow, and it is then rejected as being a reserved attribute name.
I did not open a PR yet, I wanted to ensure this was not intended behavior. Failing test shown below.
227 passing (226ms)
1 failing
1) HTTP Transport Binding Unmarshaller for CloudEvents v0.3
Structured
Should accept event that follow the spec 0.3:
Reserved attribute name: 'subject'
Currently CI is configured to build all branches. We should consider modifying the configuration so that only master (and perhaps develop), plus version branches, are built. In general, I think feature/fix branches should live on the contributor's fork of the repository, both to avoid this problem and to prevent branch proliferation on origin. But that's not always the case, and when a feature/fix branch on origin fails CI, the badge on the README.md shows as failing. This makes sense if version branches are failing, but not if it's just ongoing work on a feature/fix branch.

So... I guess I have two recommendations:

- Build only master (and maybe develop) and version branches.
- Keep feature/fix branches on contributor forks rather than on origin.
Travis looks like it is flaky. I have added some PRs that don't affect the tests, yet Travis seems to fail.
Example:
#74
Ideally a PR like this shouldn't fail.
Hi,
just requiring the sdk with a simple
const Cloudevent = require('cloudevents-sdk');
causes the following error to be shown in the console:
$ref: keywords ignored in schema at path "#"
Any suggestion or comment?
The .travis.yml
file is configured so that the code coverage results are only published when the commit is not a pull request. However, as written, tests are not run at all for pull requests.
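One possible arrangement, sketched here rather than taken from the repository's actual configuration, runs the tests unconditionally and gates only the coverage publishing on Travis's built-in TRAVIS_PULL_REQUEST variable:

```yaml
# Sketch of a .travis.yml fragment: always run tests,
# publish coverage only for non-PR builds.
script:
  - npm test
after_success:
  - if [ "$TRAVIS_PULL_REQUEST" = "false" ]; then npm run coverage; fi
```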
This SDK uses a library called AJV for schema validation. However, it makes creating and using CloudEvent objects very complicated through a set of setters and getters.

It would be a lot simpler to be able to create CloudEvent objects directly from a JSON object. Further field validation could then be done externally through a separate function, rather than via an ajv.compile(schema) call and an external spec instance.
It's hard to contribute to this library when the dev branch is 50 commits behind master.

Please merge these commits to master so that we can contribute. There's not a strong reason for there to be separate branches, afaik.
With the following binary message:
{ accept: 'application/json, text/plain, */*',
'content-type': 'application/json; charset=utf-8',
'ce-type': 'com.github.pull.create',
'ce-specversion': '0.2',
'ce-source': 'urn:event:from:myapi/resourse/123',
'ce-id': '74500b1a-4f87-43d1-afa9-7fe1c6251da8',
'user-agent': 'axios/0.18.0',
'content-length': '15',
host: 'localhost:3000',
connection: 'close' }
{"dummy":"wow"}
We got:
{ message: 'invalid content type',
errors: [ 'application/json; charset=utf-8' ] }
This SDK has a set of common utility methods in a file called fun.js
:
https://github.com/cloudevents/sdk-javascript/blob/master/lib/utils/fun.js
These methods are common amongst other libraries and shouldn't be re-invented.
Rather, use a library such as is
:
https://www.npmjs.com/package/is
In the v0.3 spec for JSON format section about Special Handling of the "data" Attribute, it says:
First, if an implementation determines that the type of the data attribute is Binary or String, it MUST inspect the datacontenttype attribute to determine whether it is indicated that the data value contains JSON data.
If the datacontenttype value is either "application/json" or any media type with a structured +json suffix, the implementation MUST translate the data attribute value into a JSON value, and set the data attribute of the envelope JSON object to this JSON value.
If I'm reading the spec correctly, it looks like the SDK should serialize stringified JSON in the data attribute. In other words, I would expect this to work:
const v03 = require("cloudevents-sdk/v03");
const data = { "so": "scare" };
const ce = v03.event
.id('4e4cfbd6-d542-4235-9e80-99d2df62abee')
.source('com.example.whatever')
.type('something')
.dataContentType("application/json")
.data(JSON.stringify(data)) // <-- stringified json data goes in
.format();
const ceData = ce.getData();
console.log(ceData.so); // <-- this breaks, getData returned the stringified JSON, not an object.
I wrote a sample test (joshwlewis@19cc9f3), if it's more useful than my untested example. It fails with:
CloudEvents Spec v0.3
The Constraints check
'data'
1) should convert data with stringified json to a json object
0 passing (8ms)
1 failing
1) CloudEvents Spec v0.3
The Constraints check
'data'
should convert data with stringified json to a json object:
AssertionError: expected '{"much":"wow"}' to equal { much: 'wow' }
at Context.it (test/spec_0_3_tests.js:209:41)
I am opening this issue as a question, but also to offer my help if you need it. I would like to see a receive section in the README, as there is in the golang sdk: https://github.com/cloudevents/sdk-go#working-with-cloudevents

If you think it is a good idea, I can look into this.
Hello, I'd like to become a collaborator for this repo, as I'm working heavily on events at Google.
I was requested to create an issue, so here it is!
Write down guidelines showing how to contribute to the CloudEvents JavaScript SDK.
Implement an immutable event builder that builds event instances which cannot be changed after they are built.
Depends on #28
Hello,
I wanted to ask whether support for Batched Content Mode is in the works.
Because there are no linting rules outlined in the repository, and no task in package.json to lint the code, contributors can't know what the style guidelines/expectations are, and might therefore submit a pull request that will fail the linting checks on Codacy.
JavaScript is ubiquitous in the browser and the CloudEvents
module should be easy to consume in that environment. A good example is the sockeye cloud event viewer. This little chunk of code would be much improved if the CloudEvent
module could be used.
The repo has a folder of the CE spec in JSON format in ext
.
https://github.com/cloudevents/sdk-javascript/tree/master/ext
While this may be useful, it's not necessary for a CloudEvent SDK. Let's refer back to the spec rather than keeping the files.
The --signoff flag for commits is not often used, and it is primarily a declaration that the committer has actually authored the code or has the rights to submit code she has not written. Instead, I think it's better to use cryptographically signed commits, which can also be enforced by GitHub (see protected branches in settings). GPG-signed commits give much more certainty that a commit is legitimate and actually coming from the person it appears to come from, whereas --signoff is just a line of text at the bottom of a commit message.
There are 2 PRs that are already merged.
PR #10 Changed JSON schema validator's options to remove warning: "$ref: key…
PR #8 Added missing 'source' getter on Spec02
Can these 2 fixes be promoted to the npm registry?

Also, trunk-based development on the master branch seems to be more productive and widely adopted these days; why not just have 1 branch, master?
I have a Knative service which receives a CloudEvent from a Knative source. After processing this event I want to emit another CloudEvent to my Knative Eventing broker. Unfortunately, I'm having a hard time achieving this:
const binding = new CloudEvent.bindings["http-binary0.2"]({
method: "POST",
url: "http://my-broker.default.svc.cluster.local/"
});
This configuration leads to communication timeouts. Any help on this is highly appreciated 🙂
Currently, when receiving CloudEvents, you have to know what spec version the event is being sent as. But that shouldn't be required. As an event receiver, I should be able to accept events from any spec version supported by the module, without prior knowledge of the sender's spec version.
Currently there are only a handful of folks in the @cloudevents/sdk-maintainers group who are actively participating in this repository. But as other contributors come on board, it might be good to codify how pull requests are landed.
For example, can a contributor with committer rights submit, approve and land their own PR all within a single day? A single hour? At the risk of overkill, here's a strawman proposal.
Thoughts?
Spec v1.0 is coming, now should be the time to start the implementation.
Currently we are on uuid 3.x. The current release line is 7.x. That's a big jump, but not as big as you might think. Ref: https://www.npmjs.com/package/uuid#deep-requires-now-deprecated
Currently the repository is tagged for each prerelease, mirroring the spec versions as they progressed. However, there are no release branches. This was probably fine prior to 1.0. But now that 1.0 is out, a v1.x branch should be created and maintained so that fixes can be backported to previous versions as we continue to move forward with changes in the repository.

I suggest the following:

- Create a v1.x branch off of the v1.0.0 tag.
- Create a v1.x-backport label which can be used to label any PRs that have landed since the 1.0.0 release, indicating they should be considered for backporting to the v1.x branch.
- Backport those PRs to the v1.x branch, tagging the repo on the v1.x branch with v1.0.1, for example, for the next patch release.
- For minor releases, tag the repo on the v1.x branch with v1.1.0, for example, and create a v1.1.x branch for minor release patch updates.
- Backport fixes to the v1.1.x branch, tagging that branch with v1.1.1, for example, for the next patch release in this major/minor combination.

This is related to the question raised in #72, where the contributor's guide talks about using the develop branch. That hasn't been done since the 1.0.0 release. We need to decide if we want to move forward with this process. If so, then the develop branch needs to be updated with all recent commits and set as the primary branch in the repository settings. This setting ensures that, by default, pull requests from contributors are created to land on the develop branch.
My personal feeling is that the develop branch is mostly redundant with master, and we don't really need it. If contributors can manage rebasing their pull requests with updates from master regularly, this shouldn't be a problem, though I think it does add some pressure to land PRs quickly. What are others' thoughts on this?
Today we have just one cloudevent.js that provides the interface to the Spec classes. This is ok with vanilla JS: because of the flexibility of the language, we can inject new methods as we need them.

Now it's time to stabilize things. We are reaching v1.0, and some attribute names are becoming stable, especially the required ones: id, source, specversion and type.
We need:
For example, STRUCTURED_ATTRS_1 is used only inside received_structured_1.js, but it pulls in all the other constants via require("./constants.js").

Is it worth separating the constants into different files? Is there a possibility that the constants file will grow even more?
In the README, the example showing how to emit events does not work because the keyword new is missing.
The TypeScript declaration package is missing from https://www.npmjs.com/~types .

More information on this: https://www.typescriptlang.org/docs/handbook/declaration-files/publishing.html

This is very important to ensure proper diffusion of the CloudEvents standard.
The HTTP binding for binary formatted data currently does not follow the cloudevents/spec:
Additionally, the config object should be copied because creating multiple bindings with the same config will override the content-type.
Fixed in issue #14.
I have a question regarding error handling of events. The standard dictates that you always return the event parameters (e.g. headers) in case of success; what about in case of an error? In your example you do not return them, but just respond with status 415.

And how should a batch of events be handled if only one fails, or n fail?
The TypeScript definitions at v1/index.d.ts
are not being used and cannot be used as-is (at least in a normal way). They are not exported in package.json
.
They're likely out of date and not going to be aligned with the source code as things change.
I suggest removing them and auto-generating them at a later point in time.
Implement support for the attributes of spec v0.3.
In the SDK requirements documentation, it says "Each SDK supports the latest(N), and N-1, major releases of the CloudEvent spec", with a special case being 1.0 as latest with 0.3 recommended support.
Ultimately, dropping support for 0.2 should make maintenance simpler. From npm download numbers, I guess it's hard to tease out how many are still on 0.2, but this seems reasonable.
I also wonder what the versioning strategy is. Is the npm module versioned independently from the spec version? It seems like it must be, at least for minor and patch levels.
As I understand it, when uploading a file over HTTP, the data of the event should contain a stream. So a NodeJS example would look like this?
// http1
http.createServer((req, res) => {
const event = { ...read event metadata from headers... };
event.data = req; // attach data stream
...add logic...
res.end({ ...respond with CE metadata... });
});
// http2
const server = http2.createServer();
server.on('stream', (stream, headers, flags) => {
const event = { ...read event metadata from headers... };
event.data = stream; // attach data stream
...add logic...
stream.end({ ...respond with CE metadata... });
});
Is this correct?
Fix the pipeline coverage command to run only for the master and develop branches.
Is it time to deprecate the implementation of spec v0.1?
The cloudevents/conformance repository now has HTTP transport cucumber specifications. This SDK should integrate these into the tests.
There are a number of vulnerabilities identified when running a clean npm install
. There are also a number of dependencies specified in package.json that are unused. And finally, a couple of the dependencies are no longer maintained.
See: #49
The project board for this repository has not been updated in over 8 months. Is it still being used? If so, it should be updated to reflect the current state of the project. If not, it should be disabled in GitHub.
Ref: https://github.com/cloudevents/sdk-javascript/projects/1
Hi, I was using the example from https://github.com/cloudevents/sdk-javascript/blob/master/examples/express-ex and got this error.
{ message: 'content type not allowed',
errors: [ 'application/octet-stream' ] }
The contributor's guide for this repository is out of date and should be updated, or the recommendations there should be enforced. For example, the develop
branch is out of date, and we're not following the gitflow recommendation of pull requests submitted to develop
. Personally, I don't mind master
being the tip as long as fixes can be backported to version branches.
If using a develop
branch is important for this project, then I think the develop
branch should be set as the default branch in GitHub and GH Actions should be used to enforce these processes: https://help.github.com/en/github/administering-a-repository/defining-the-mergeability-of-pull-requests
There should also be an update to the section on managing the changelog when this lands: #68