
ccf_draft's Issues

Add more types and reassign CBOR tag numbers in CDDL

The following improvements were identified while reading the Cadence Core Contracts with the CCF specs and codec in mind.

More types need to be added in CDDL:

  • struct interface type
  • resource interface type
  • contract interface type
  • reference type
  • restricted type

Reassign CBOR tag numbers so that some tag numbers remain reserved in each (sub)group (a sketch of one possible layout follows below).

Add interface types as options to ccf-composite-type-message. This is more extensible than using simple type at the cost of encoding a little more data.

Add reference and restricted types as options to inline-type.

Add support for function value.

Refactor CDDL to separate type objects from type value objects for readability and cleaner implementation.
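
To illustrate the reservation idea only, here is a minimal sketch in Go; the constant names and tag numbers below are hypothetical, not the actual CCF assignments.

```go
// Hypothetical CBOR tag-number layout (illustrative names and numbers only,
// not the actual CCF assignments). The point is that each (sub)group leaves
// a gap of unassigned numbers so new types can be added later without
// renumbering existing ones.
package ccf

const (
	// Composite-type (sub)group.
	cborTagStructType   = 160
	cborTagResourceType = 161
	cborTagEventType    = 162
	cborTagContractType = 163
	cborTagEnumType     = 164
	// New interface types proposed above.
	cborTagStructInterfaceType   = 165
	cborTagResourceInterfaceType = 166
	cborTagContractInterfaceType = 167
	// 168-175 reserved for future composite-related types.

	// Inline-type (sub)group.
	cborTagReferenceType  = 176
	cborTagRestrictedType = 177
	// 178-183 reserved for future inline types.
)
```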

Remove unnecessary cadence-type-id

Problem

In JSON-CDC, cadence-type-id was sometimes encoded even though it was just a "stringification" of other encoded data and didn't need to be encoded.

In CCF, we don't need to keep this inefficiency for the sake of compatibility.

Thanks @turbolent for spotting this! 👍

Proposed Solution

Remove the inefficiency by removing unnecessary cadence-type-id from

  • restricted-type-value
  • restricted-type
  • function-value

Update Security Considerations

Mention that decoding limits can be stricter for untrusted inputs and more relaxed for trusted inputs. For example, CBOR limits such as MaxArrayElements, MaxMapPairs, and MaxNestedLevels can be set differently for decoders processing trusted and untrusted inputs (see the sketch after the list below). CCF-based protocols can also specify different limits to balance tradeoffs.

The main tradeoffs for decoder limits:

  • Limits that are too high allow memory exhaustion attacks and the like to succeed.
  • Limits that are too low can make it impossible to decode a non-malicious message that exceeds them.
    NOTE: Encoders usually don't enforce limits because it's much simpler and more efficient for apps to enforce them.
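
For example, assuming the decoder is built on the github.com/fxamacker/cbor/v2 library (whose DecOptions exposes MaxArrayElements, MaxMapPairs, and MaxNestedLevels), a minimal sketch of trust-dependent limits could look like this; the numbers are illustrative, not normative.

```go
// Sketch: configure different CBOR decoding limits for trusted and untrusted
// inputs. Assumes github.com/fxamacker/cbor/v2; limit values are illustrative.
package main

import (
	"fmt"

	"github.com/fxamacker/cbor/v2"
)

// newDecMode returns a CBOR decoding mode with limits chosen by trust level.
func newDecMode(trusted bool) (cbor.DecMode, error) {
	opts := cbor.DecOptions{
		// Stricter limits for untrusted inputs.
		MaxNestedLevels:  16,
		MaxArrayElements: 20_000_000, // matches the gRPC sizing example below
		MaxMapPairs:      20_000_000,
	}
	if trusted {
		// More relaxed limits for trusted inputs.
		opts.MaxNestedLevels = 32
		opts.MaxArrayElements = 100_000_000
		opts.MaxMapPairs = 100_000_000
	}
	return opts.DecMode()
}

func main() {
	dm, err := newDecMode(false)
	if err != nil {
		panic(err)
	}
	var v interface{}
	// Decode a trivial well-formed CBOR message (the array [1, 2, 3]).
	if err := dm.Unmarshal([]byte{0x83, 0x01, 0x02, 0x03}, &v); err != nil {
		panic(err)
	}
	fmt.Println(v) // [1 2 3]
}
```

A CCF-based protocol would replace these placeholder numbers with concrete limits chosen for its own message sizes and trust model.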

Example Limit for Max Array Elements

A gRPC limit of 20 MB can support (at most) a 20,000,000-element array (for an unrealistic message with zero overhead and 1-byte elements).

In practice, it would take many thousands of non-malicious CCF messages (like average-sized events) to reach a 20 MB gRPC limit, so it doesn't make sense to allow more than 20,000,000 elements for each array within a single CCF message.
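
The arithmetic behind that bound, as a quick sketch in Go (the 20 MB limit and 1-byte elements are the assumptions stated above):

```go
package main

import "fmt"

func main() {
	const maxMessageBytes = 20_000_000 // assumed 20 MB gRPC message size limit
	const minElementBytes = 1          // unrealistic zero-overhead, 1-byte elements
	// Upper bound on array elements that could fit in one message: 20,000,000.
	fmt.Println(maxMessageBytes / minElementBytes)
}
```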

This update to the CCF specs can be done after opening a PR to add the CCF codec to onflow/cadence and before CCF Specs RC2.

Update outdated benchmarks in README

The benchmarks are from the initial proof-of-concept CCF codec. We should update the benchmarks and comparisons, with a reminder that we are not comparing apples to apples.

Prior formats (CBF and JSON-Cadence Data Interchange) didn't specify requirements for validity, sorting, etc.

Sorting data, deterministic encoding, etc. are not free.
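
For the updated comparison, a minimal benchmark sketch could look like the following; the package paths and Encode functions are assumptions based on onflow/cadence, and the trivial cadence.String value is only a placeholder for a realistic event fixture.

```go
// Benchmark sketch comparing CCF and JSON-CDC encoding of the same Cadence
// value. Package paths and Encode signatures are assumptions; adjust to
// wherever the codecs actually live, and use a realistic event fixture
// instead of the placeholder value below.
package ccf_test

import (
	"testing"

	"github.com/onflow/cadence"
	"github.com/onflow/cadence/encoding/ccf"
	jsoncdc "github.com/onflow/cadence/encoding/json"
)

func BenchmarkEncodeCCF(b *testing.B) {
	v := cadence.String("FeesDeducted") // placeholder for an average-sized event
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		if _, err := ccf.Encode(v); err != nil {
			b.Fatal(err)
		}
	}
}

func BenchmarkEncodeJSONCDC(b *testing.B) {
	v := cadence.String("FeesDeducted")
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		if _, err := jsoncdc.Encode(v); err != nil {
			b.Fatal(err)
		}
	}
}
```

Any published numbers should carry the apples-to-apples reminder above, since the CCF path also pays for sorting and deterministic encoding.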

Update specs for composite-type-value.initializers

Some changes to specs are required:

  • Update composite-type-value.initializers from "one or many" to "zero or one" since only one initializer is supported and sorting is hard for multiple initializers.

  • Remove deterministic sorting requirement for composite-type-value.initializers since only one initializer is supported and initializer parameters have natural sorting and shouldn't be changed.

Thanks @turbolent for the great discussion and for suggesting this today!

Add section about canonical encoding

For example, make it clear that CCF-based protocols and data formats must use a deterministic sequence when encoding more than one Cadence composite type.
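
As an illustration only (the typeDef struct, the sortTypeDefs helper, and the sort key below are assumptions, not the normative CCF ordering), composite type definitions could be ordered bytewise by Cadence type ID before encoding:

```go
package main

import (
	"fmt"
	"sort"
)

// typeDef is a stand-in for a Cadence composite type definition to be
// encoded; ID would hold the Cadence type ID.
type typeDef struct {
	ID string
}

// sortTypeDefs is a hypothetical helper: it orders composite type
// definitions deterministically (here, bytewise by type ID) so every
// encoder emits them in the same sequence.
func sortTypeDefs(defs []typeDef) {
	sort.Slice(defs, func(i, j int) bool {
		return defs[i].ID < defs[j].ID
	})
}

func main() {
	// Placeholder type IDs, not real contract types.
	defs := []typeDef{{ID: "S.test.Foo"}, {ID: "S.test.Bar"}}
	sortTypeDefs(defs)
	fmt.Println(defs[0].ID, defs[1].ID) // S.test.Bar S.test.Foo
}
```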

README has outdated status and introduction can be improved

The current status and timeline are outdated. Also, the introduction can be improved by copying this Introduction section from the CCF specification:

Introduction

Cadence external values (e.g. transaction arguments, events, etc.) have been encoded using JSON-Cadence Data Interchange format, which is human-readable, verbose, and doesn't define deterministic encoding.

CCF is a binary data format that allows more compact, efficient, and deterministic encoding of Cadence external values. Consequently, the CCF codec in Cadence is faster, uses less memory, encodes deterministically, and produces smaller messages than the JSON-CDC codec.

A real FeesDeducted event can encode to:

  • 298 bytes in JSON-CDC (minified).
  • 118 bytes in CCF (fully self-describing mode).
  • ~20 bytes in CCF (partially self-describing mode) with 12 bytes for data and ~8 bytes for type id (counter, hash, etc.)

Unlike prior formats, CCF defines all requirements for deterministic encoding (sort orders, smallest encoded forms, and Cadence-specific requirements) to allow CCF codecs implemented in different programming languages to deterministically produce identical messages.

For security, CCF was designed to allow efficient detection and rejection of malformed messages without creating Cadence objects. This allows more costly checks (e.g. validity) to be performed only on well-formed messages.

CCF leverages vendor-neutral Internet Standards such as CBOR (RFC 8949), which is designed to be relevant for decades.
