
trustdidweb's Introduction

Trust DID Web - A DID Method

The spec repository for did:tdw -- Trust DID Web DID method.

Read the spec: https://bcgov.github.io/trustdidweb/

Implementations available:

Abstract

The did:tdw (Trust DID Web) method is an enhancement to the did:web protocol, providing a complementary web-based DID method that addresses limitations of did:web. Its features include the following.

  • Ongoing publishing of all DID Document (DIDDoc) versions for a DID instead of, or alongside, a did:web DID/DIDDoc.
  • Uses the same DID-to-HTTPS transformation as did:web.
  • Provides resolvers the full history of the DID using a verifiable chain of updates to the DIDDoc from genesis to deactivation.
  • A self-certifying identifier (SCID) for the DID that is globally unique and derived from the initial DIDDoc, enabling DID portability, such as moving the DID's web location (and so the DID string itself) while retaining the DID's history.
  • DIDDoc updates include a proof signed by the DID Controller(s) authorized to update the DID.
  • An optional mechanism for publishing "pre-rotation" keys to prevent loss of control of the DID in cases where an active private key is compromised.
  • DID URL path handling that defaults (but can be overridden) to automatically resolving <did>/path/to/file using a DID-to-HTTPS translation comparable to the one used for the DIDDoc.
  • A DID URL path <did>/whois that defaults to automatically returning (if published by the DID controller) a Verifiable Presentation containing Verifiable Credentials with the DID as the credentialSubject, signed by the DID.

Combined, the additional features enable greater trust and security without compromising the simplicity of did:web. The incorporation of the DID Core compatible "/whois" path, drawing inspiration from the traditional WHOIS protocol, offers an easy-to-use, decentralized trust registry. did:tdw aims to establish a more trusted and secure web environment by providing robust verification processes and enabling transparency and authenticity in the management of decentralized digital identities.
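
As a rough illustration of the shared DID-to-HTTPS transformation, the Python sketch below applies the did:web-style rules to a did:tdw identifier. It is a minimal sketch only: the log file name (did.jsonl) and the handling of the SCID within the identifier are assumptions, not normative behaviour from the spec.

from urllib.parse import unquote

def tdw_to_https(did: str, filename: str = "did.jsonl") -> str:
    # Apply did:web-style DID-to-HTTPS rules; the log file name and the
    # treatment of the SCID within the identifier are assumptions here.
    parts = did.split(":")[2:]                   # drop the "did" and "tdw" segments
    domain, path = unquote(parts[0]), parts[1:]  # %3A decodes to a port separator
    if not path:
        return f"https://{domain}/.well-known/{filename}"
    return f"https://{domain}/" + "/".join(path) + f"/{filename}"

# tdw_to_https("did:tdw:example.com:dids:12345")
#   -> "https://example.com/dids/12345/did.jsonl"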

Contributing to the Specification

Pull requests (PRs) to this repository may be accepted. Each commit of a PR must have a DCO (Developer Certificate of Origin - https://github.com/apps/dco) sign-off. This can be done from the command line by adding the -s (lower case) option on the git commit command (e.g., git commit -s -m "Comment about the commit").

Rendering and reviewing the spec locally for testing requires that Node.js and npm be installed. Follow these steps:

  • Fork and locally clone the repository.
  • Install node and npm.
  • Run npm install from the root of your local repository.
  • Edit the spec documents (in the /spec folder).
  • Run npm run render
    • Use npm run edit to interactively edit, render and review the spec.
  • Review the resulting index.html file in a browser.

The specification is currently in Spec-Up format. See the Spec-Up Documentation for a list of Spec-Up features and functionality.

trustdidweb's People

Contributors

andrewwhitehead, brianorwhatever, dependabot[bot], rajpalc7, swcurran, wadebarnes


trustdidweb's Issues

How to authenticate a DID rotation

The method must only allow rotations of DID documents by the controller.

Controller is a bit of a loaded term, but my assumption is that this is the controller property of the DID doc. If so, the first document will be lacking this property, won't it? What do we expect a controller property to look like? It could be a set of keys, but we would have to specify that further (see #2).

Extract SCID section to a standalone spec?

The SCID Generation and Validation section would make an excellent standalone mini-spec (which the did:tdw spec can just refer to).
The technique is general, and useful outside of the DID space. For example, I'd love to use it as a way to generate stable identifiers for ActivityPub / ActivityStreams2 objects.

Is that something this group would consider doing?

Proposed processes for creating and updating a DIDDoc

Here is a stab at the processes for creating an initial DIDDoc with all the features, and for updating a DIDDoc.

  • Create the DIDDoc with the desired content, leaving the placeholder {{SCID}} wherever the SCID will appear. Include in the DIDDoc:
    • The DID identifier as the id, which must include an instance of {{SCID}}
    • A top-level controller that is self-referencing (the same DID as the id) or references an external controller.
      • If omitted, the controller is assumed to be the DID itself.
      • If the entry references multiple controllers, any of the DIDs is authorized to update the DIDDoc.
    • Optional: service entries to be used to find referenced paths used by the DID (e.g., AnonCreds objects)
    • Optional: service entry with an id of whois that is of type LinkedVP
    • Any desired alsoKnownAs, with/without SCID
    • Any other content.
  • Call a routine to generate the SCID
    • Process: scid = sha256(JCS(DIDDoc)) -- see #6 and the sketch following this list
  • Update the DIDDoc with the SCID replacing the placeholder -- the result is the versionID=0 DIDDoc
  • Call a routine to generate the first log with the content, any necessary parameters (such as the version of the Spec being used) -- excluding the proof.
    • See #20
    • The prevEntryHash is null for the first entry. Subsequent entries use the entryHash from the previous log entry.
  • Have the entryHash signed as needed to create the proof, and add the proof item to the log entry.
    • The authorized keys for signing the proof are found by resolving the controller(s) listed in the initial DIDDoc. Subsequent entries use keys found by resolving the controller(s) listed in the previous version of the DIDDoc.
  • Publish the log at the prescribed location according to the DID.
  • Optional: Transform the initial (current) DIDDoc to a did:web DIDDoc by a text change of did:tdw to did:web, add an alsoKnownAs entry for this DID, and publish the resulting did.json file at the prescribed location.
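
The following is a minimal Python sketch of the SCID generation and placeholder replacement steps above. The canonicalize helper is a json.dumps approximation of JCS (RFC 8785), and the hex digest encoding is an assumption; issue #6 only specifies scid = sha256(JCS(DIDDoc)).

import hashlib, json

def canonicalize(obj) -> bytes:
    # Approximation of JCS (RFC 8785); adequate for simple ASCII-only DIDDocs.
    return json.dumps(obj, sort_keys=True, separators=(",", ":")).encode()

def generate_scid(diddoc_with_placeholders: dict) -> str:
    # scid = sha256(JCS(DIDDoc)) per issue #6; the hex encoding of the digest
    # is an assumption, since the spec has not pinned one down.
    return hashlib.sha256(canonicalize(diddoc_with_placeholders)).hexdigest()

def apply_scid(diddoc_with_placeholders: dict) -> dict:
    # Replace every {{SCID}} occurrence to produce the versionID=0 DIDDoc.
    scid = generate_scid(diddoc_with_placeholders)
    text = json.dumps(diddoc_with_placeholders)
    return json.loads(text.replace("{{SCID}}", scid))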

Authorized signing keys to update the DIDDoc:

  • Since the controller references DIDs and not specific keys in those DIDs, the method assumes the controller key for each controller DID is:
    • The authentication key type (if found), else the verificationMethod key type.
    • Nice to have: Support added for the use of the verifiableCondition as a reference from the authentication type or as an alternative to the verificationMethod in the DID.
  • It is not clear to us that authentication is the right key type to use to demonstrate authorization to update a DIDDoc. For now, it is good enough based on the discussions we have had in the community, but we probably need to dig deeper into this.
  • The controllers MUST be either:
    • The DID itself (key references are in the DIDDoc)
    • Other resolvable did:tdw or did:web DIDs
    • Copies in the DIDDoc of the relevant verificationMethods for an external controller DID
  • Otherwise, the DID resolution process returns an error 404 - NOT RESOLVABLE

Generate the log entry for the DIDDoc update - see #20

Update the DIDDoc:

  • Update the DIDDoc as necessary. This could include updating the authentication keys if the DIDDoc contains such keys.
    • The id of the DIDDoc MAY be changed -- see moving the DIDDoc.
    • Anything else can be updated, although the items listed in the section above on creating the DIDDoc should be included.
  • Call the routine (described above) to generate the next history log entry.
  • Publish the updated log, accessible as required by the DID
  • Optional: Generate and publish a did:web DIDDoc as described above.

Moving a DIDDoc:

  • Permitted by changing the DID (effectively, creating a new DID, but with the previous DID's history)
  • Ideally a redirect is provided to the new DID, and an alsoKnownAs entry is added for the old DID.
    • If the old DID is resolved, and there is a web redirect to the new location, the resulting DIDDoc has the id of the old DID, and an alsoKnownAs for the new DID (or an equivalentId?).
  • See #19

Implementor Guidance: Publishing AnonCreds objects using did:tdw

This will not be part of the spec, or the implementors guide, but we will want a "did:tdw AnonCreds Method", and this issue is to get some ideas down on how to do that.

For most of the objects, the publishing mechanism will be:

  • Use a method similar to the DID URLs in did:indy, as defined in the did:indy specification -- "similar" except for the reference to the Indy Transaction ID of the Schema in dependent objects. This means the paths are as follows (a small path-building sketch follows this list):
    • Schema: <did>/anoncreds/v0/SCHEMA/<schema_id>/<schema_ver>.json
    • CredDef: <did>/anoncreds/v0/CREDDEF/<schema_id>/<schema_ver>/<cred_def_id>.json
    • RevRegDef: <did>/anoncreds/v0/REV_REG_DEF/<schema_id>/<schema_ver>/<cred_def_id>/<rev_reg_id>.json
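
A trivial Python sketch of building the proposed DID URL paths from the templates above; the helper names and parameters are hypothetical and exist only to show how the placeholders slot together.

def schema_path(did: str, schema_id: str, schema_ver: str) -> str:
    # <did>/anoncreds/v0/SCHEMA/<schema_id>/<schema_ver>.json
    return f"{did}/anoncreds/v0/SCHEMA/{schema_id}/{schema_ver}.json"

def cred_def_path(did: str, schema_id: str, schema_ver: str, cred_def_id: str) -> str:
    # <did>/anoncreds/v0/CREDDEF/<schema_id>/<schema_ver>/<cred_def_id>.json
    return f"{did}/anoncreds/v0/CREDDEF/{schema_id}/{schema_ver}/{cred_def_id}.json"

def rev_reg_def_path(did: str, schema_id: str, schema_ver: str,
                     cred_def_id: str, rev_reg_id: str) -> str:
    # <did>/anoncreds/v0/REV_REG_DEF/<schema_id>/<schema_ver>/<cred_def_id>/<rev_reg_id>.json
    return f"{did}/anoncreds/v0/REV_REG_DEF/{schema_id}/{schema_ver}/{cred_def_id}/{rev_reg_id}.json"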

We propose that each object be a W3C VC signed by the DID, with the credentialSubject being the AnonCreds object.

That is all pretty straightforward. In theory, AnonCreds objects on Indy can be updated, although in practice that is not done, as it is unnecessarily complex -- no one has ever (AFAIK) implemented it. For the did:tdw AnonCreds Method, an update could overwrite the JSON file with the new version, or perhaps on publishing we could save two files, one named with _<version>.json (or _<time>.json) and one with just .json, so that the history remains available.

Not listed above is the RevRegEntry object, which is a bit more interesting. In that case, it is updated each time the Issuer revokes a batch (1 or more) of credentials in the RevReg. That complexity must be handled. As defined in the AnonCreds v1 Specification, a RevRegEntry contains:

  • An ID
  • A timestamp (UNIX Time - epoch)
  • An accumulator used by the holder to create a Non-Revocation Proof (NRP) and by the verifier to verify the NRP
  • Either the full state of the credentials (a bit per credential - revoked or not) or the delta of states since the last change.
    • For the did:tdw AnonCreds Method, we will use the "full state" method.
    • Might be worth looking at the Cheqd AnonCreds Method for how that was done, and to be consistent.
    • Presumably, a compressed bit string such as is used in StatusList2021.
    • Note that the size of an AnonCreds RevReg is known at creation time, and cannot practically be larger than about 3k credentials -- perhaps 5k at the absolute outside -- which is a worst case of about 640 bytes of data (plus, presumably, base64 encoding, for about 1 KB); see the size sketch after this list.
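
A small Python sketch backing up the size estimate above: one bit per credential, packed into bytes, then base64-encoded. Compression of the bit string, as done in StatusList2021, is not applied here.

import base64, math

def revreg_state_sizes(num_credentials: int) -> dict:
    # One bit per credential (revoked or not), packed into bytes.
    raw_bytes = math.ceil(num_credentials / 8)
    # base64 expands by roughly 4/3 (plus padding); no compression applied.
    b64_bytes = len(base64.b64encode(bytes(raw_bytes)))
    return {"raw_bytes": raw_bytes, "base64_bytes": b64_bytes}

# revreg_state_sizes(5000) -> {'raw_bytes': 625, 'base64_bytes': 836}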

The tricky part of RevRegEntry is that we MUST support multiple versions, and AnonCreds recommends querying by publication time range. A verifier will give the holder a from and to time range (although typically the two values are the same -- a point in time -- and typically that time is now()), and the holder must find a RevRegEntry that is in that time range.

Potential solutions:

  • Replicate the "history" functionality in did:tdw for RevRegEntries.
    • Probably a bad idea because of the need for the full state of the RevRegEntries every time. Even though each compressed bit string might be pretty small, having all of them in one file would get unmanageable.
  • Publish a file associated with the RevRegDef that lists all of the RevRegEntries DID URLs paths:
    • Publish each RevRegEntry as: <did>/anoncreds/v0/REV_REG_DEF/<schema_id>/<schema_ver>/<cred_def_id>/<rev_reg_id>_<epoch_timestamp>.json
      • Nice to have: Publish the current RevRegEntry as: <did>/anoncreds/v0/REV_REG_DEF/<schema_id>/<schema_ver>/<cred_def_id>/<rev_reg_id>_latest.json
    • After (or in parallel with) publishing a new entry, publish an updated list that adds the new RevRegEntry as the latest (see the sketch after this list). The list could be in the RevRegDef itself, or could be in another file, e.g. <did>/anoncreds/v0/REV_REG_DEF/<schema_id>/<schema_ver>/<cred_def_id>/<rev_reg_id>_entries.json
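
A hedged Python sketch of the "publish an updated list" option above. The timestamped entry file name and the _entries.json index are the illustrative names from the list; the function itself is hypothetical.

import json, time
from pathlib import Path

def publish_rev_reg_entry(base_dir: Path, rev_reg_id: str, entry: dict) -> None:
    # Write the new RevRegEntry under a timestamped name, then append it to a
    # <rev_reg_id>_entries.json index so resolvers can find all entries.
    timestamp = int(time.time())
    entry_file = base_dir / f"{rev_reg_id}_{timestamp}.json"
    entry_file.write_text(json.dumps(entry))

    index_file = base_dir / f"{rev_reg_id}_entries.json"
    index = json.loads(index_file.read_text()) if index_file.exists() else []
    index.append({"timestamp": timestamp, "file": entry_file.name})
    index_file.write_text(json.dumps(index, indent=2))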

Proper content-type for `.jsonl`

Continuing discussions from #41 I think we need to specify something.

When I load it in a browser, it says it's application/octet-stream.

There doesn't appear to be a consensus in wardi/jsonlines#19

The proposal there is application/jsonl which I think looks like a typo..

Another proposal in there (which Amazon uses) is application/jsonlines, which seems pretty good to me other than not being registered with IANA.

Document Format

Proposals:

Proposal 1: something very DID docy

Benefits:

  • Easy parsing at resolution time

Drawbacks:

  • Large file size
{
  "@context": [
    "https://www.w3.org/ns/did/v1",
    "https://w3id.org/security/multikey/v1"
  ],
  "authentication": ["#z6Mko4t2y5Pbx96hbQP5p8JkJXA3z16nAq7VpUyZGcjhd9ra"],
  "capabilityInvocation": ["#z6MkqHVcPwdxaNnwAYEi1ZrqE1z3EonVdZ6hmGFXz9wSHqUB"],
  "capabilityDelegation": ["#z6Mkhn3U7DCskv9SzLW8Fwa3ssYhwjzZMTPj443RqMmaBKYb"],
  "assertionMethod": ["#z6MkmUNA3PrAr28xhofDKMN6So2tjsC2HoohsJzMwejXuxAm"],
  "verificationMethod": [{
      "id": "#z6Mko4t2y5Pbx96hbQP5p8JkJXA3z16nAq7VpUyZGcjhd9ra",
      "type": "Multikey",
      "publicKeyMultibase": "z6Mko4t2y5Pbx96hbQP5p8JkJXA3z16nAq7VpUyZGcjhd9ra"
    }, {
      "id": "#z6MkqHVcPwdxaNnwAYEi1ZrqE1z3EonVdZ6hmGFXz9wSHqUB",
      "type": "Multikey",
      "publicKeyMultibase": "z6MkqHVcPwdxaNnwAYEi1ZrqE1z3EonVdZ6hmGFXz9wSHqUB"
    }, {
      "id": "#z6Mkhn3U7DCskv9SzLW8Fwa3ssYhwjzZMTPj443RqMmaBKYb",
      "type": "Multikey",
      "publicKeyMultibase": "z6Mkhn3U7DCskv9SzLW8Fwa3ssYhwjzZMTPj443RqMmaBKYb"
    }, {
      "id": "#z6MkmUNA3PrAr28xhofDKMN6So2tjsC2HoohsJzMwejXuxAm",
      "type": "Multikey",
      "publicKeyMultibase": "z6MkmUNA3PrAr28xhofDKMN6So2tjsC2HoohsJzMwejXuxAm"
    }],
  "service": [{
    "id": "#didcomm-1",
    "type": "DIDCommMessaging",
    "serviceEndpoint": {
        "uri": "https://example.com/path",
        "accept": [
            "didcomm/v2",
            "didcomm/aip2;env=rfc587"
        ],
        "routingKeys": ["did:example:somemediator#somekey"]
    }
  }, {
    "id": "#linked-vp",
    "type": "LinkedVerifiablePresentation",
    "serviceEndpoint": ["https://bar.example.com/verifiable-presentation.jsonld"]
  }]
}

Proposal 2: Something like selfhash

Benefits:

  • Uses well-defined placeholder slots
  • Others are using it, so presumably it works
  • Comes predesigned with a microledger

Drawbacks:

  • Not sure on flexibility

{"data_byte_v":[1,2,4,8,16,32,64,128],"name":"grippoponkey","previous":"Eyl1ZXOe_nOufcpisvlW1ZUeXKnY_kIKOOljEhMAd5zw","self_hash":"Ew4c7NoEyI842yfIxBakoWgCYI6v4nAQp1Z1OtHzOiFQ","stuff_count":43}

Proposal 3: Custom encoded like did:peer

Benefits:

  • Compact

Drawbacks:

  • Need to define microledger approach

Vz6Mkj3PUd1WjvaDhNZhhhXQdz5UnZXmS7ehtx8bsPpD47kKc.Ez6LSg8zQom395jKLrGiBNruB9MM6V8PWuf2FpEy4uRFiqQBR.SeyJ0IjoiZG0iLCJzIjp7InVyaSI6Imh0dHA6Ly9leGFtcGxlLmNvbS9kaWRjb21tIiwiYSI6WyJkaWRjb21tL3YyIl0sInIiOlsiZGlkOmV4YW1wbGU6MTIzNDU2Nzg5YWJjZGVmZ2hpI2tleS0xIl19fQ.SeyJ0IjoiZG0iLCJzIjp7InVyaSI6Imh0dHA6Ly9leGFtcGxlLmNvbS9hbm90aGVyIiwiYSI6WyJkaWRjb21tL3YyIl0sInIiOlsiZGlkOmV4YW1wbGU6MTIzNDU2Nzg5YWJjZGVmZ2hpI2tleS0yIl19fQ

Any other proposals?

How to migrate servers

Ideally there is a method to migrate servers so you are not tied to a specific server and can easily migrate. I think this can be accomplished by updating the id in the document. I think this makes the id property redundant here @andrewwhitehead. But just now, as I'm typing this, I realize this opens up forks.

I drew up a picture to explain what I'm talking about

[image: migration diagram]

At the end of the day I think we need a way to freely migrate. How do we accomplish this?

How do we support the “whois” Verifiable Presentation functionality?

For other web-based DID methods, we proposed that the DID URL <did>/whois would resolve to a VP containing VCs for which the DID is the credentialSubject.id, and the VP proof is from the DID.

How do we do that for this DID method?

Obviously, one option is to do exactly what is proposed above.

Another possibility is “linked-vp”. Need to add a reference for that.

How do we include support for “signed files” verifiably created by the DID?

A crucial component of this DID method is the ability for the DID to publish documents such that their publication by the DID is verifiable. The method proposed for another web-based DID method was to use a DID URL with a path, and to locate the file at that path appended to the web-based location of the DID. For example:

  • DID URL: did:webnext:example.com:dids:12345/anoncreds/schema/1.0/legal_entity.json
  • is found at https://example.com/dids/12345/anoncreds/schema/1.0/legal_entity.json, and
  • https://example.com/dids/12345/anoncreds/schema/1.0/legal_entity.json.jws is a JWS created using a key from the DIDDoc.

Alternatively, it has been proposed that we use a service in the DIDDoc as the location of the documents. That makes the location of the files (and signatures) independent of the location of the DID, and so could be used for any DID method — not just web-based DIDs.

Note that a requirement is to support the feature in this DID method. A bonus is to support the feature in a generally useful way.

How to deactivate

Following up from the discussion in #21 we need to specify a "deactivation" mechanism. There have been a few thrown around.

  1. Set the controller to an unusable DID (ie did:tdw:1).
  2. Remove all authentication keys in the document.
  3. Use a deactivated parameter.

1 and 2 would be permanent (other than deleting the log entry)

3 could be reversible

How to generate the SCID?

  • The SCID is the hash of specific element(s) of the initial DIDDoc (such as public keys) so it is verifiable when one has the initial DIDDoc.
    • Looking at the example DIDDocs on the universal resolver, I realize that the DIDDocs do not have a consistent/concise way of representing keys. E.g. a JWK is quite a different representation from a multibase key.
    • We could state that it is the JSON items that contain public keys, but that starts to get ugly/hacky IMHO.
  • Generate a selfhash over the entire initial DIDDoc, with placeholders for wherever the SCID is to be placed.
    • Construct the entire DIDDoc with placeholders wherever they are needed, and pass it into a method to generate the SCID and fill in the placeholders.
    • To verify the SCID, a verification method is given the SCID and the initial DIDDoc, does a text replacement of the SCID with the placeholder, removes any items not included in the hashing, and verifies that the hash of the canonicalized result is the SCID (see the sketch after this list).
      • Challenges are defining the placeholder, the hash algorithm, the canonicalization, and the items to leave out of the hashing. Presumably, self-hash defines most of that.
  • Other alternatives?
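
A minimal Python sketch of the verification step described above, assuming the placeholder token {{SCID}}, a json.dumps approximation of JCS, sha256, and a hex-encoded SCID; removal of items excluded from the hashing is omitted.

import hashlib, json

PLACEHOLDER = "{{SCID}}"  # illustrative placeholder token

def verify_scid(scid: str, initial_diddoc: dict) -> bool:
    # Text-replace the SCID back to the placeholder and re-canonicalize.
    # json.dumps with sorted keys only approximates JCS (RFC 8785).
    text = json.dumps(initial_diddoc).replace(scid, PLACEHOLDER)
    restored = json.dumps(json.loads(text), sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(restored.encode()).hexdigest() == scid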

Log Format

The following was pulled in from here

Proposed DID history format

The history consists of a sequence of JSON-formatted arrays, one per line. Each line has the format:

[version hash, version ID, timestamp, params, data, proofs]

Within each line, the properties are as follows (a small parsing sketch follows these descriptions):

version hash

  • The hash corresponding to the current line. See version hash calculation.

version ID

  • An incrementing integer value corresponding to the line index. The first line is index 1.

timestamp

  • A timestamp associated with the current update. This must be formatted according to ISO8601 with the timezone offset included.

params

  • A JSON object containing the parameters updated by the current line. See parameters.

data

  • A JSON object containing the updates to the document state. See data updates.

proofs

  • An array of proofs for the current update, used to establish key ownership as well as continuity from the previous state. See proofs.
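
A small Python sketch of reading the history as JSON Lines and unpacking the six positional fields just described; the field names in the yielded dictionary are for readability only.

import json

def read_history(path: str):
    # Each line of the history is a JSON array of six positional fields.
    with open(path) as log:
        for line in log:
            version_hash, version_id, timestamp, params, data, proofs = json.loads(line)
            yield {
                "versionHash": version_hash,
                "versionId": version_id,
                "timestamp": timestamp,
                "params": params,
                "data": data,
                "proofs": proofs,
            }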

Parameters

The following parameters are currently supported.

method

  • An identifier for the document versioning method. This will establish the calculation method for the self-certifying document identifier as well as the supported proof formats.

scid

  • The self-certifying identifier derived from the initial document. This will also be included in the method specific identifier (the DID).

hash

  • The hash function that is used.

The first line of the history must establish these parameters.

Example (parameters only):

{"method": "did:webnext:1", "scid": "abc123abc123abc123abc123", "hash": "sha256"}

Data updates

Each data update contains one of:

value

  • Establishing the full contents of the document, or

patch

  • Containing a JSON patch from the previous content of the document.

Examples (data only):

{"value": {"@context": ["https://www.w3.org/ns/did/v1", "https://w3id.org/security/data-integrity/v2", "https://w3id.org/security/multikey/v1"], "id": "did:webnext:example.com:menetsyqymts6yhpmtfebtym", "authentication": ["did:webnext:example.com:menetsyqymts6yhpmtfebtym#kbaEzdWJd6UFDcifYo158dh2mXysTB7INj51JcgpCqg"], "verificationMethod": [{"id": "did:webnext:example.com:menetsyqymts6yhpmtfebtym#kbaEzdWJd6UFDcifYo158dh2mXysTB7INj51JcgpCqg", "type": "Multikey", "controller": "did:webnext:example.com:menetsyqymts6yhpmtfebtym", "publicKeyMultibase": "z6Mkn4z9ze2ESZhWL2HKGdQcGFdT8dH5Ht9zDnBoiPJfANyz"}]}}


{"patch": [{"op": "add", "path": "/alsoKnownAs", "value": ["did:web:example.com"]}]}

The first line of the history must contain a value property.
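
A Python sketch of applying one data update to the running document state, assuming the third-party jsonpatch package (RFC 6902) for the patch case; the helper itself is hypothetical.

from typing import Optional

import jsonpatch  # third-party package implementing RFC 6902 JSON Patch

def apply_update(current_doc: Optional[dict], data: dict) -> dict:
    # A "value" entry replaces the full document state; a "patch" entry is
    # applied to the previous state.
    if "value" in data:
        return data["value"]
    if current_doc is None:
        raise ValueError("the first history entry must contain a 'value' property")
    return jsonpatch.apply_patch(current_doc, data["patch"])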

Version hash calculation

  • Take the version hash of the previous line as last hash. For the first line, use the empty string ("").
  • Calculate the raw digest as hash(JCS([last hash, version ID, timestamp, params, data]))
  • Output version hash = base32lower(digest)

Example:

hash input: JCS([
        "zQmcPRLmBy5KwV74epZ9fvoEngjbukRphUdaZS1nxqKfM16",
        2,
        "2024-03-19T23:39:37Z",
        {},
        {"patch": [{"op": "add", "path": "/controller", "value": ["did:alt:controller"]}]},
    ])
output: zQmSMxhxGp1CQ6SL5vi1rSCRXXfR9RMkzPTLLVF26wEAztE
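
A minimal Python sketch of the calculation steps above, assuming sha256 and a json.dumps approximation of JCS. The worked example encodes the digest differently (a multibase-style string), so the final encoding step here is illustrative only.

import base64, hashlib, json

def canonicalize(obj) -> bytes:
    # json.dumps approximation of JCS (RFC 8785)
    return json.dumps(obj, sort_keys=True, separators=(",", ":")).encode()

def version_hash(last_hash: str, version_id: int, timestamp: str,
                 params: dict, data: dict) -> str:
    # raw digest = hash(JCS([last hash, version ID, timestamp, params, data]))
    digest = hashlib.sha256(
        canonicalize([last_hash, version_id, timestamp, params, data])
    ).digest()
    # version hash = base32lower(digest); padding stripped for readability
    return base64.b32encode(digest).decode().rstrip("=").lower()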

Proofs

Data integrity proofs (cryptosuite TBD) are created with an authentication purpose, including the version hash as a challenge and using the new state of the document. The proofs are extracted from the signed document and included in the log entry. These are injected back into the document as proof when verifying the log entry.

Example:

[{"type":"DataIntegrityProof","created":"2024-03-28T08:18:57Z","verificationMethod":"did:tdw:jNQ4MKr87HBoox9iXKjRKt3S#FFhXVfsx","cryptosuite":"eddsa-2022","proofPurpose":"authentication","challenge":"z3dY6uzgy4CK3tFC29TSjES6MrhhyvnbE4KfccvbzU8PQ","proofValue":"zBK6pMkFBnzXH1qYPn2iEsi2jo3y7pBcNuqrGH99GKp2mvCH9VvRK1EPZPiXvzzQoxwEhveBBPs1Jk6spUXVsfme"}]

Full example

(With invalid hashes)

["zQmNQFTHCUc1S3iiNoZ5TzPevXQqwfT1xuiuYsC4hrc97Kz", 1, "2024-03-25T18:51:50Z", {"method": "did:webnext:1"}, {"value": {"@context": ["https://www.w3.org/ns/did/v1", "https://w3id.org/security/data-integrity/v2", "https://w3id.org/security/multikey/v1"], "id": "did:webnext:example.com:menetsyqymts6yhpmtfebtym", "authentication": ["did:webnext:example.com:menetsyqymts6yhpmtfebtym#kbaEzdWJd6UFDcifYo158dh2mXysTB7INj51JcgpCqg"], "verificationMethod": [{"id": "did:webnext:example.com:menetsyqymts6yhpmtfebtym#kbaEzdWJd6UFDcifYo158dh2mXysTB7INj51JcgpCqg", "type": "Multikey", "controller": "did:webnext:example.com:menetsyqymts6yhpmtfebtym", "publicKeyMultibase": "z6Mkn4z9ze2ESZhWL2HKGdQcGFdT8dH5Ht9zDnBoiPJfANyz"}]}, [{"type": "DataIntegrityProof", "cryptosuite": "eddsa-jcs-2022", "verificationMethod": "did:webnext:example.com:menetsyqymts6yhpmtfebtym#did:webnext:example.com:menetsyqymts6yhpmtfebtym#kbaEzdWJd6UFDcifYo158dh2mXysTB7INj51JcgpCqg", "created": "2024-03-25T18:51:50Z", "challenge": "zQmNWMPLDMx3QyYtiNarjELQ9RCvHm1hAMHRrrrqNQ7kbvt", "proofPurpose": "authentication", "proofValue": "z4Por1indyg8WgytdzKkHD1FqWUfzPHrKsWvr5zn4r3bn6grLMP54ahfqQ9hPdYuTjSWeqjsTRF8M9ujnK2BzqcCG"}]]
["zQmbfhXs7E8FMJNNN2Gc98hDsUatAjZ53K7gdwWEKx5ZJWs", 2, "2024-03-25T18:51:50Z", {}, {"patch": [{"op": "add", "path": "/controller", "value": ["did:webnext:example.com:menetsyqymts6yhpmtfebtym", "did:example:controller"]}, {"op": "add", "path": "/verificationMethod/1", "value": {"id": "did:example:controller#tWMK1fLzevQaBJc_bl4vJ6waLI32bQwTIcVv9BXTiGA", "type": "Multikey", "controller": "did:example:controller", "publicKeyMultibase": "z6Mks2wuUH1sgBYUxnd48LV9TxS5fACun4ZdCAGy8ujRLzJx"}}, {"op": "add", "path": "/authentication/1", "value": "did:example:controller#tWMK1fLzevQaBJc_bl4vJ6waLI32bQwTIcVv9BXTiGA"}]}, [{"type": "DataIntegrityProof", "cryptosuite": "eddsa-jcs-2022", "verificationMethod": "did:webnext:example.com:menetsyqymts6yhpmtfebtym#did:example:controller#tWMK1fLzevQaBJc_bl4vJ6waLI32bQwTIcVv9BXTiGA", "created": "2024-03-25T18:51:50Z", "challenge": "zQmbfhXs7E8FMJNNN2Gc98hDsUatAjZ53K7gdwWEKx5ZJWs", "proofPurpose": "authentication", "proofValue": "zfr2rcCyQdvsy6ixvVkjFjB14Z3ncb2GqvPEGd49zG9WgJtM96HBoQVaTbs1MwtCcWq3szY28cw63SQ8nUxPDfyY"}]]
["zQmbbT3zAxR6vWibmzN9PxhM5e6kDRDTUsbhnwnP9tiFLbU", 3, "2024-03-25T18:51:50Z", {}, {"patch": [{"op": "add", "path": "/alsoKnownAs", "value": ["did:web:example.com"]}]}, [{"type": "DataIntegrityProof", "cryptosuite": "eddsa-jcs-2022", "verificationMethod": "did:webnext:example.com:menetsyqymts6yhpmtfebtym#did:webnext:example.com:menetsyqymts6yhpmtfebtym#kbaEzdWJd6UFDcifYo158dh2mXysTB7INj51JcgpCqg", "created": "2024-03-25T18:51:50Z", "challenge": "zQmbbT3zAxR6vWibmzN9PxhM5e6kDRDTUsbhnwnP9tiFLbU", "proofPurpose": "authentication", "proofValue": "z2j9sGp6aTxFZyMX7r8xYtbNJEw5gBGktdEj7M58G6FiC4FtugVUVZHWRdr1NZHhr7DsP7uVFXuf9PRbo3fGreouG"}]]

List of resolver verification errors and error codes

I thought it would be useful to add a ticket where we can track a list of error conditions in the resolver that are possible and what error codes (or other technique) we can use to surface the error to the caller.

  • Log not found.
  • etc.

Relative path resolution

The DID spec does not outline a process for resolving a generic DID URL, i.e., a DID with a path attached (did:method/relative/path), leaving this up to specific DID methods. It does specify service and relativeRef DID parameters that can be used to resolve paths via service endpoints. An example from the did:web AnonCreds method:

did:method:domain?service=anoncreds&relativeRef=/v1/schema/5762v4VZxFMLB5n9X4Upr3gXaCKTa8PztDDCnroauSsR

Now we could define a simple alias as part of this DID method, and automatically map any DID URL of the form {DID}/service/{SERVICE}{PATH}{?PARAMS} (for example). In this case, the above resource might be represented as:

did:method:domain/service/anoncreds/v1/schema/5762v4VZxFMLB5n9X4Upr3gXaCKTa8PztDDCnroauSsR

On resolution it would simply be transformed into the original format before proceeding. Parameters such as hl (hashlink) should be preserved when this happens, for processing during the resolution process, but not added to the resulting URL.
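
A Python sketch of the alias rewrite just described; DID parameters such as hl are not handled here, and the helper name is hypothetical.

def expand_service_alias(did_url: str) -> str:
    # Rewrite {DID}/service/{SERVICE}{PATH} into the service/relativeRef
    # DID-parameter form shown above.
    did, _, rest = did_url.partition("/service/")
    if not rest:
        return did_url  # no alias present; leave the DID URL untouched
    service, _, path = rest.partition("/")
    return f"{did}?service={service}&relativeRef=/{path}"

# expand_service_alias("did:method:domain/service/anoncreds/v1/schema/5762v4VZxFMLB5n9X4Upr3gXaCKTa8PztDDCnroauSsR")
#   -> "did:method:domain?service=anoncreds&relativeRef=/v1/schema/5762v4VZxFMLB5n9X4Upr3gXaCKTa8PztDDCnroauSsR"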

For more flexibility we could enable path prefixes to be mapped by one or more methods. For example, a service entry of a recognized type:

{
    "id":"#index",
    "type": "PathResolution", 
    "serviceEndpoint": {
        "anoncreds/v1": "https://external-service/me/anoncreds/v1",
        "didcomm/v1": "did:myagent?service=didcomm",
        "*": "/" 
    }
}

or an external file defining the mappings:

{
    "id":"#index",
    "type": "PathResolution", 
    "serviceEndpoint": "/index.jsonld"
}

How to do prerotation

Which keys are prerotated? Is it just the controller keys or is this a requirement of every verification method?

And how do we define prerotation?

Add a proof item to each DIDDoc version

  • A proof item is included in each DIDDoc that contains a DID Controller signature over that entire DIDDoc (minus the proof item), plus the proof item from the previous DIDDoc.
    • The signature must be from one of the controllers of the previous version of the DIDDoc.
      • In the initial document, there is no signature from the previous DIDDoc.
    • How is the DIDDoc canonicalized when creating the signature?
    • Is the signature from the previous DIDDoc included in the DIDDoc?
  • Include metadata about the DIDDoc in the DIDDoc
    • versionId (required). Must be 0 for the initial DIDDoc and incremented per DIDDoc update.
    • created — a locally defined (by the DID controller) time when the initial DIDDoc was created.
    • updated — a locally defined (by the DID controller) time when this DIDDoc was created.

Fundamental to this is #7 -- how do we ensure signatures across the JSON data are verifiable?

Method Name

If a goal of the method is to be agnostic to backing storage, I don't think "web" should be in the name. I also think webnext is too long; I prefer 3-4 characters max.

Unfortunately, I see that did:next is taken

Include metadata about the DID in the DIDDoc

Include in the DIDDoc the following information:

  • versionId (required). Must be 0 for the initial DIDDoc and incremented per DIDDoc update.
  • created — a locally defined (by the DID controller) time when the initial DIDDoc was created.
  • updated — a locally defined (by the DID controller) time when this DIDDoc was created.
  • previousProof — the proof item from the previous version of the DIDDoc.
    • An empty array [] for the initial DIDDoc.

Additional resolver metadata

It may be useful, especially for implementing the DNS extensions, for clients to receive information on the associated document proofs. The DID resolution metadata could include this information:

  • proof(s) for the resolved version
  • associated raw (multibase?) public keys for each referenced verification method
  • possibly the hash of the associated log line
