Penumbra

Encrypt/decrypt anything in the browser using streams on background threads.

Quickly and efficiently decrypt remote resources in the browser. Display the files in the DOM, or download them with conflux.

Compatibility

              .decrypt   .encrypt   .saveZip
Chrome        ✅          ✅          ✅
Edge >18      ✅          ✅          ✅
Safari ≥14.1  ✅          ✅          ✅
Firefox ≥102  ✅          ✅          ✅
Safari <14.1  🐢          🐢          🟡
Firefox <102  🐢          🐢          🟡
Edge 18       🐢          🐢          🟡

✅ = Full support with workers

🐢 = Uses main thread (lacks native WritableStream support)

🟡 = 32 MiB limit

❌ = No support

Usage

Importing Penumbra

With Yarn/NPM

yarn add @transcend-io/penumbra

# or
npm install --save @transcend-io/penumbra

import { penumbra } from '@transcend-io/penumbra';

penumbra.get(...files).then(penumbra.save);

Vanilla JS

<script src="lib/main.penumbra.js"></script>
<script>
  penumbra
    .get(...files)
    .then(penumbra.getTextOrURI)
    .then(displayInDOM);
</script>

Check out this guide for asynchronous loading.

RemoteResource

penumbra.get() uses RemoteResource descriptors to specify where to request resources and their various decryption parameters.

/**
 * A file to download from a remote resource, that is optionally encrypted
 */
type RemoteResource = {
  /** The URL to fetch the encrypted or unencrypted file from */
  url: string;
  /** The mimetype of the resulting file */
  mimetype?: string;
  /** The name of the underlying file without the extension */
  filePrefix?: string;
  /** If the file is encrypted, these are the required params */
  decryptionOptions?: PenumbraDecryptionInfo;
  /** Relative file path (needed for zipping) */
  path?: string;
  /** Fetch options */
  requestInit?: RequestInit;
  /** Last modified date */
  lastModified?: Date;
  /** Expected file size */
  size?: number;
};

.get

Fetch and decrypt remote files.

penumbra.get(...resources: RemoteResource[]): Promise<PenumbraFile[]>
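
A minimal sketch (the URL and the base64 values are placeholders; see the examples later in this README for working values):

const [file] = await penumbra.get({
  url: 'https://example.com/file.txt.enc', // placeholder URL
  mimetype: 'text/plain',
  filePrefix: 'file',
  decryptionOptions: {
    key: '...', // base64-encoded key
    iv: '...', // base64-encoded IV
    authTag: '...', // base64-encoded auth tag
  },
});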

.encrypt

Encrypt files.

penumbra.encrypt(options: PenumbraEncryptionOptions, ...files: PenumbraFile[]): Promise<PenumbraEncryptedFile[]>

/**
 * penumbra.encrypt() encryption options config (buffers or base64-encoded strings)
 */
type PenumbraEncryptionOptions = {
  /** Encryption key */
  key: string | Buffer;
};

.encrypt() examples:

Encrypt a stream of zero bytes:

const size = 4096 * 128;
addEventListener('penumbra-progress', (e) => console.log(e.type, e.detail));
addEventListener('penumbra-complete', (e) => console.log(e.type, e.detail));
const file = penumbra.encrypt(null, {
  stream: new Response(new Uint8Array(size)).body,
  size,
});
const data = [];
file.then(async ([encrypted]) => {
  console.log('encryption complete');
  data.push(new Uint8Array(await new Response(encrypted.stream).arrayBuffer()));
});

Encrypt and decrypt text:

const te = new self.TextEncoder();
const td = new self.TextDecoder();
const input = '[test string]';
const buffer = te.encode(input);
const { byteLength: size } = buffer;
const stream = new Response(buffer).body;
const options = null;
const file = {
  stream,
  size,
};
const [encrypted] = await penumbra.encrypt(options, file);
const decryptionInfo = await penumbra.getDecryptionInfo(encrypted);
const [decrypted] = await penumbra.decrypt(decryptionInfo, encrypted);
const decryptedData = await new Response(decrypted.stream).arrayBuffer();
const decryptedText = td.decode(decryptedData);
console.log('decrypted text:', decryptedText);

.getDecryptionInfo

Get decryption info for a file, including the iv, authTag, and key. This may only be called on files that have finished being encrypted.

penumbra.getDecryptionInfo(file: PenumbraFile): Promise<PenumbraDecryptionInfo>

.decrypt

Decrypt files.

penumbra.decrypt(options: PenumbraDecryptionInfo, ...files: PenumbraEncryptedFile[]): Promise<PenumbraFile[]>

.decrypt() example:

const te = new TextEncoder();
const td = new TextDecoder();
const data = te.encode('test');
const { byteLength: size } = data;
const [encrypted] = await penumbra.encrypt(null, {
  stream: new Response(data).body,
  size,
});
const options = await penumbra.getDecryptionInfo(encrypted);
const [decrypted] = await penumbra.decrypt(options, encrypted);
const decryptedData = await new Response(decrypted.stream).arrayBuffer();
console.assert(td.decode(decryptedData) === 'test');

.save

Save files retrieved by Penumbra. Downloads a .zip if there are multiple files. Returns an AbortController that can be used to cancel an in-progress save stream.

penumbra.save(data: PenumbraFile[], fileName?: string): AbortController
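
For example, to cancel an in-progress save (a sketch; `resources` stands for whatever RemoteResource descriptors you are fetching):

const files = await penumbra.get(...resources);
const controller = penumbra.save(files, 'my-download');
// Later, e.g. when the user clicks "cancel":
controller.abort();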

.getBlob

Load files retrieved by Penumbra into memory as a Blob.

penumbra.getBlob(data: PenumbraFile[] | PenumbraFile | ReadableStream, type?: string): Promise<Blob>
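
A sketch of loading a single decrypted file into memory and displaying it via an object URL (`resource` is a RemoteResource as described above):

const [file] = await penumbra.get(resource);
const blob = await penumbra.getBlob(file);
const url = URL.createObjectURL(blob);
document.getElementById('my-img').src = url;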

.getTextOrURI

Get file text (if content is text) or URI (if content is not viewable).

penumbra.getTextOrURI(data: PenumbraFile[]): Promise<{ type: 'text'|'uri', data: string, mimetype: string }[]>

.saveZip

Save a zip containing files retrieved by Penumbra.

type ZipOptions = {
  /** Filename to save to (.zip is optional) */
  name?: string;
  /** Total size of archive in bytes (if known ahead of time, for 'store' compression level) */
  size?: number;
  /** PenumbraFile[] to add to zip archive */
  files?: PenumbraFile[];
  /** Abort controller for cancelling zip generation and saving */
  controller?: AbortController;
  /** Allow & auto-rename duplicate files sent to writer. Defaults to on */
  allowDuplicates: boolean;
  /** Zip archive compression level */
  compressionLevel?: number;
  /** Store a copy of the resultant zip file in-memory for inspection & testing */
  saveBuffer?: boolean;
  /**
   * Auto-registered `'progress'` event listener. This is equivalent to calling
   * `PenumbraZipWriter.addEventListener('progress', onProgress)`
   */
  onProgress?(event: CustomEvent<ZipProgressDetails>): void;
  /**
   * Auto-registered `'complete'` event listener. This is equivalent to calling
   * `PenumbraZipWriter.addEventListener('complete', onComplete)`
   */
  onComplete?(event: CustomEvent<{}>): void;
};

penumbra.saveZip(options?: ZipOptions): PenumbraZipWriter;

interface PenumbraZipWriter extends EventTarget {
  /**
   * Add decrypted PenumbraFiles to zip
   *
   * @param files - Decrypted PenumbraFile[] to add to zip
   * @returns Total observed size of write call in bytes
   */
  write(...files: PenumbraFile[]): Promise<number>;
  /**
   * Enqueue closing of the Penumbra zip writer (after pending writes finish)
   *
   * @returns Total observed zip size in bytes after close completes
   */
  close(): Promise<number>;
  /** Cancel Penumbra zip writer */
  abort(): void;
  /** Get buffered output (requires saveBuffer mode) */
  getBuffer(): Promise<ArrayBuffer>;
  /** Get all written & pending file paths */
  getFiles(): string[];
  /**
   * Get observed zip size after all pending writes are resolved
   */
  getSize(): Promise<number>;
}

type ZipProgressDetails = {
  /** Percentage completed. `null` indicates indetermination */
  percent: number | null;
  /** The number of bytes or items written so far */
  written: number;
  /** The total number of bytes or items to write. `null` indicates indetermination */
  size: number | null;
};

Example:

const files = [
  {
    url: 'https://s3-us-west-2.amazonaws.com/bencmbrook/tortoise.jpg.enc',
    name: 'tortoise.jpg',
    mimetype: 'image/jpeg',
    decryptionOptions: {
      key: 'vScyqmJKqGl73mJkuwm/zPBQk0wct9eQ5wPE8laGcWM=',
      iv: '6lNU+2vxJw6SFgse',
      authTag: 'ELry8dZ3djg8BRB+7TyXZA==',
    },
  },
];
const writer = penumbra.saveZip();
await writer.write(...(await penumbra.get(...files)));
await writer.close();
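
The onProgress and onComplete options can be used instead of calling addEventListener on the writer; a sketch using the same files array:

const writer = penumbra.saveZip({
  name: 'example.zip',
  onProgress: ({ detail: { percent, written, size } }) =>
    console.log(`zip progress: ${percent}% (${written}/${size})`),
  onComplete: () => console.log('zip saved'),
});
await writer.write(...(await penumbra.get(...files)));
await writer.close();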

.setWorkerLocation

Configure the location of Penumbra's worker threads.

penumbra.setWorkerLocation(location: WorkerLocationOptions | string): Promise<void>

Examples

Display encrypted text

await penumbra
  .get({
    url: 'https://s3-us-west-2.amazonaws.com/bencmbrook/NYT.txt.enc',
    mimetype: 'text/plain',
    filePrefix: 'NYT',
    decryptionOptions: {
      key: 'vScyqmJKqGl73mJkuwm/zPBQk0wct9eQ5wPE8laGcWM=',
      iv: '6lNU+2vxJw6SFgse',
      authTag: 'gadZhS1QozjEmfmHLblzbg==',
    },
  })
  .then((files) => penumbra.getTextOrURI(files))
  .then(([{ data }]) => {
    document.getElementById('my-paragraph').innerText = data;
  });

Display encrypted image

await penumbra
  .get({
    url: 'https://s3-us-west-2.amazonaws.com/bencmbrook/tortoise.jpg.enc',
    filePrefix: 'tortoise',
    mimetype: 'image/jpeg',
    decryptionOptions: {
      key: 'vScyqmJKqGl73mJkuwm/zPBQk0wct9eQ5wPE8laGcWM=',
      iv: '6lNU+2vxJw6SFgse',
      authTag: 'ELry8dZ3djg8BRB+7TyXZA==',
    },
  })
  .then((files) => penumbra.getTextOrURI(files))
  .then(([{ data }]) => {
    document.getElementById('my-img').src = data;
  });

Download an encrypted file

penumbra
  .get({
    url: 'https://s3-us-west-2.amazonaws.com/bencmbrook/africa.topo.json.enc',
    filePrefix: 'africa',
    mimetype: 'image/jpeg',
    decryptionOptions: {
      key: 'vScyqmJKqGl73mJkuwm/zPBQk0wct9eQ5wPE8laGcWM=',
      iv: '6lNU+2vxJw6SFgse',
      authTag: 'ELry8dZ3djg8BRB+7TyXZA==',
    },
  })
  .then((file) => penumbra.save(file));

// saves africa.jpg file to disk

Download many encrypted files

penumbra
  .get(
    {
      url: 'https://s3-us-west-2.amazonaws.com/bencmbrook/africa.topo.json.enc',
      filePrefix: 'africa',
      mimetype: 'image/jpeg',
      decryptionOptions: {
        key: 'vScyqmJKqGl73mJkuwm/zPBQk0wct9eQ5wPE8laGcWM=',
        iv: '6lNU+2vxJw6SFgse',
        authTag: 'ELry8dZ3djg8BRB+7TyXZA==',
      },
    },
    {
      url: 'https://s3-us-west-2.amazonaws.com/bencmbrook/NYT.txt.enc',
      mimetype: 'text/plain',
      filePrefix: 'NYT',
      decryptionOptions: {
        key: 'vScyqmJKqGl73mJkuwm/zPBQk0wct9eQ5wPE8laGcWM=',
        iv: '6lNU+2vxJw6SFgse',
        authTag: 'gadZhS1QozjEmfmHLblzbg==',
      },
    },
    {
      url: 'https://s3-us-west-2.amazonaws.com/bencmbrook/tortoise.jpg', // this is not encrypted
      filePrefix: 'tortoise',
      mimetype: 'image/jpeg',
    },
  )
  .then((files) => penumbra.save(files, 'example'));

// saves example.zip file to disk

Advanced

Prepare connections for file downloads in advance

// Resources to load
const resources = [
  {
    url: 'https://s3-us-west-2.amazonaws.com/bencmbrook/NYT.txt.enc',
    filePrefix: 'NYT',
    mimetype: 'text/plain',
    decryptionOptions: {
      key: 'vScyqmJKqGl73mJkuwm/zPBQk0wct9eQ5wPE8laGcWM=',
      iv: '6lNU+2vxJw6SFgse',
      authTag: 'gadZhS1QozjEmfmHLblzbg==',
    },
  },
  {
    url: 'https://s3-us-west-2.amazonaws.com/bencmbrook/tortoise.jpg.enc',
    filePrefix: 'tortoise',
    mimetype: 'image/jpeg',
    decryptionOptions: {
      key: 'vScyqmJKqGl73mJkuwm/zPBQk0wct9eQ5wPE8laGcWM=',
      iv: '6lNU+2vxJw6SFgse',
      authTag: 'ELry8dZ3djg8BRB+7TyXZA==',
    },
  },
];

// preconnect to the origins
penumbra.preconnect(...resources);

// or preload all of the URLS
penumbra.preload(...resources);

Encrypt/Decrypt Job Completion Event Emitter

You can listen to encrypt/decrypt job completion events through the penumbra-complete event.

window.addEventListener(
  'penumbra-complete',
  ({ detail: { id, decryptionInfo } }) => {
    console.log(
      `finished encryption job #${id}. decryption options:`,
      decryptionInfo,
    );
  },
);

Progress Event Emitter

You can listen to download and encrypt/decrypt job progress events through the penumbra-progress event.

window.addEventListener(
  'penumbra-progress',
  ({ detail: { percent, id, type } }) => {
    console.log(`${type} ${percent}% done for ${id}`);
    // example output: decrypt 33% done for https://example.com/encrypted-data
  },
);

Note: this feature requires the Content-Length response header to be exposed. This works by adding Access-Control-Expose-Headers: Content-Length to the response headers (read more here and here).

On Amazon S3, this means adding the following line to your bucket policy, inside the <CORSRule> block:

<ExposeHeader>Content-Length</ExposeHeader>
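
If your bucket is configured with the newer JSON CORS format instead of XML, the equivalent rule might look like this (the origin is a placeholder):

[
  {
    "AllowedOrigins": ["https://example.com"],
    "AllowedMethods": ["GET"],
    "AllowedHeaders": ["*"],
    "ExposeHeaders": ["Content-Length"]
  }
]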

Configure worker location

// Set only the base URL by passing a string
penumbra.setWorkerLocation('/penumbra-workers/');

// Set all worker URLs by passing a WorkerLocation object
penumbra.setWorkerLocation({
  base: '/penumbra-workers/',
  penumbra: 'worker.penumbra.js',
  StreamSaver: 'StreamSaver.js',
});

// Set a single worker's location
penumbra.setWorkerLocation({ penumbra: 'worker.penumbra.js' });

Waiting for the penumbra-ready event

<script src="lib/main.penumbra.js" async defer></script>
const onReady = async ({ detail: { penumbra } } = { detail: self }) => {
  await penumbra.get(...files).then(penumbra.save);
};

if (!self.penumbra) {
  self.addEventListener('penumbra-ready', onReady);
} else {
  onReady();
}

Querying Penumbra browser support

You can check if Penumbra is supported by the current browser by comparing penumbra.supported(): PenumbraSupportLevel with penumbra.supported.levels.

if (penumbra.supported() > penumbra.supported.levels.possible) {
  // penumbra is partially or fully supported
}

/** penumbra.supported.levels - Penumbra user agent support levels */
enum PenumbraSupportLevel {
  /** Old browser where Penumbra does not work at all */
  none = -0,
  /** Modern browser where Penumbra is not yet supported */
  possible = 0,
  /** Modern browser where file size limit is low */
  size_limited = 1,
  /** Modern browser with full support */
  full = 2,
}
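
A sketch of branching on the reported support level (the fallback behavior is up to your application):

const { levels } = penumbra.supported;
const level = penumbra.supported();
if (level >= levels.size_limited) {
  // Penumbra works; warn about the 32 MiB limit when level === levels.size_limited
} else {
  // Fall back to plain downloads or server-side decryption
}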

Webpack

Penumbra is compiled and bundled on npm. The recommended approach is to copy the Penumbra build files into your webpack build output. We do this with copy-webpack-plugin, e.g.:

const fs = require('fs');
const CopyPlugin = require('copy-webpack-plugin');
const path = require('path');
const PENUMBRA_DIRECTORY = path.join(
  __dirname,
  'node_modules',
  '@transcend-io/penumbra',
  'dist',
);

// `outputPath` is your webpack output directory (e.g. 'dist')
module.exports = {
  plugins: [
    new CopyPlugin({
      patterns: fs
        .readdirSync(PENUMBRA_DIRECTORY)
        .filter((fil) => fil.indexOf('.') > 0)
        .map((fil) => ({
          from: `${PENUMBRA_DIRECTORY}/${fil}`,
          to: `${outputPath}/${fil}`,
        })),
    }),
  ],
};

Contributing

# setup
yarn
yarn build

# run tests
yarn test:local

# run tests in the browser console
yarn test:interactive

License


Contributors

bencmbrook, dependabot[bot], dipack95, dmattia, eligrey, fossabot, michaelfarrell76, snyk-bot, wassup789


penumbra's Issues

Tests running on CI

Not the biggest deal in the world, but it would be nice to have the browser tests running on CI again.

Check authentication tag in penumbra

Right now we're skipping the authTag verification step, which is a tiny security flaw: because we don't verify the signature (see step 3 below), Transcend could tamper with the encrypted bytes passed through (which, worst case, just breaks the file).

All we need to do is:

  1. Set the authTag on the decipher object. This is done.
  2. Decrypt all the data. This is done.
  3. Then verify the signature with decipher.final(). This has to happen after decryption; a TransformStream flush() callback should make this possible (a rough sketch is included at the end of this issue).

This is good to do, because...

  • in the future we might add AAD like the User ID to federate files with coreIdentifiers.
  • people using this package for data that has more serious implications of tampering (i.e. where they're somehow vulnerable to a chosen ciphertext attack)
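
A rough sketch of step 3, assuming the Node-style decipher Penumbra already uses (this is illustrative, not the actual implementation):

// Sketch only: verify the GCM auth tag once all chunks have been decrypted,
// using the flush() callback of a TransformStream.
import { createDecipheriv } from 'crypto';

function createDecryptionStream(key, iv, authTag) {
  const decipher = createDecipheriv('aes-256-gcm', key, iv);
  decipher.setAuthTag(authTag);
  return new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(decipher.update(chunk));
    },
    flush() {
      // Throws if the ciphertext or auth tag was tampered with
      decipher.final();
    },
  });
}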

Diagrams for presentation

It would be nice to outline at a high level what the project does, and maybe the default cryptography operations.

Standardize a way to declare worker script locations

We should standardize a way to declare worker script locations. We will have three worker scripts: one for decryption (implementing penumbra methods), one for zipping (through conflux), and one for streamsaver.

We have a few possible options:

  1. Require that ./penumbra-decrypt.worker.js and ./penumbra-zip.worker.js are present wherever Penumbra is used. (we're going to do this by default if no configuration is set)
  2. Allow Penumbra consumers to configure worker script locations through Penumbra API (e.g. Penumbra.setWorkerLocation('/worker-scripts/'))
  3. Allow configuration through a script element (e.g. <script src="penumbra.js" data-worker-dir="/worker-scripts/"/> or maybe <script src="penumbra.js" data-workers='{"decrypt":"https://example.com/decryption.worker.js","zip":"/worker-scripts/zip.worker.js"}'/>)
  4. Hardcode decryption and zipping worker locations and host them ourselves.
  5. Allow configuration through <link rel="penumbra" data-config="..."> (placed before the Penumbra script element)

Uncaught (in promise) TypeError

I installed the npm package, and after calling any function from penumbra I get the error Uncaught (in promise) TypeError: _transcend_io_penumbra__WEBPACK_IMPORTED_MODULE_2___default.a.saveZip is not a function. I tried to copy the penumbra files from its build folder to my project's build folder as the developers wrote in the manual, but still no luck. I am sorry for this "issue"; it probably isn't a real bug, just my misunderstanding of the proper way to use the package, but how can I make it work?
My webpack.conf.js

const path = require("path");
const CopyPlugin = require("copy-webpack-plugin");
const fs = require("fs");
const PENUMBRA_DIRECTORY = path.join(__dirname, "node_modules", "@transcend-io/penumbra", "build");
module.exports = {
    plugins: [
        new CopyPlugin({
            patterns: fs
                .readdirSync(PENUMBRA_DIRECTORY)
                .filter((fil) => fil.indexOf(".") > 0)
                .map((fil) => ({
                    from: `${PENUMBRA_DIRECTORY}/${fil}`,
                    to: `${"dist"}/${fil}`,
                })),
        }),
    ],
    entry: ["babel-polyfill", "./app/app.jsx"],
    mode: "development",
    output: {
        filename: "./main.js",
    },
    devServer: {
        contentBase: path.join(__dirname, "dist"),
        compress: true,
        port: 4000,
        watchContentBase: true,
        progress: true,
    },
    module: {
        rules: [
            {
                test: /\.m?js$/,
                exclude: /(node_modules|bower_components)/,
                use: {
                    loader: "babel-loader",
                },
            },
            {
                test: /\.css$/,
                loader: "style-loader!css-loader",
            },
            {
                test: /\.(png|svg|jpg|gif)$/,
                use: ["file-loader"],
            },
            { test: /\.jsx?$/, exclude: /(node_modules|bower_components)/, loader: "babel-loader" },
            {
                test: /\.txt$/,
                use: "raw-loader",
            },
        ],
    },
};

package.json

{
  "name": "reactapp",
  "version": "1.0.0",
  "description": "React app",
  "main": "./app/app.js",
  "scripts": {
    "dev": "webpack-dev-server",
    "start": "webpack"
  },
  "author": "Me",
  "license": "ISC",
  "devDependencies": {
    "@babel/core": "^7.8.4",
    "@babel/preset-env": "^7.8.4",
    "@babel/preset-react": "^7.8.3",
    "babel-core": "^6.26.3",
    "babel-loader": "^8.0.6",
    "babel-polyfill": "^6.26.0",
    "babel-preset-es2015": "^6.24.1",
    "babel-preset-stage-0": "^6.24.1",
    "css-loader": "^3.4.2",
    "raw-loader": "^4.0.0",
    "style-loader": "^1.1.3",
    "webpack": "^4.46.0",
    "webpack-cli": "^3.3.11",
    "webpack-dev-server": "^3.11.2"
  },
  "dependencies": {
    "@babel/plugin-proposal-class-properties": "^7.13.0",
    "@transcend-io/penumbra": "^5.0.4",
    "bootstrap": "^4.4.1",
    "copy-webpack-plugin": "^6.2.1",
    "fs": "0.0.1-security",
    "react": "^16.12.0",
    "react-dom": "^16.12.0",
    "react-dropzone": "^10.2.1",
    "socket.io-client": "^3.1.2"
  },
  "babel": {
    "presets": [
      "@babel/env",
      "@babel/react"
    ],
    "plugins": [
      [
        "@babel/plugin-proposal-class-properties",
        {
          "loose": true
        }
      ]
    ]
  }
}

app.jsx

import React, { Component } from "react";
import ReactDOM from "react-dom";
import ItemsList from "./components/ItemsList.jsx";
const propsValues = {
    title: "Menu",
    items: ["Samsung Galaxy Note20", "Apple iPhone 12 Pro", "Google Pixel 5", "Huawei P40 Pro", "OnePlus 8 Pro", "Asus Zenfone 7 Pro"],
};
ReactDOM.render(<ItemsList data={propsValues} />, document.querySelector("#root"));

ItemsList.jsx

import React, { Component } from "react";
import SearchPlugin from "./SearchPlugin.jsx";
import penumbra from "@transcend-io/penumbra";
class ItemsList extends React.Component {
    constructor(props) {
        super(props);
        this.state = { items: this.props.data.items };

        this.filterList = this.filterList.bind(this);
    }
    async componentDidMount() {
        const cacheBuster = Math.random().toString(10).slice(2);
        const files = [
            {
                url: "https://s3-us-west-2.amazonaws.com/bencmbrook/tortoise.jpg.enc",
                name: "tortoise.jpg",
                mimetype: "image/jpeg",
                decryptionOptions: {
                    key: "vScyqmJKqGl73mJkuwm/zPBQk0wct9eQ5wPE8laGcWM=",
                    iv: "6lNU+2vxJw6SFgse",
                    authTag: "ELry8dZ3djg8BRB+7TyXZA==",
                },
            },
        ];
        const writer = penumbra.saveZip();
        await writer.write(...(await penumbra.get(...files)));
        await writer.close();
    }
    filterList(text) {
        var filteredList = this.props.data.items.filter(function (item) {
            return item.toLowerCase().search(text.toLowerCase()) !== -1;
        });
        this.setState({ items: filteredList });
    }

    render() {
        return (
            <div>
                <h2>{this.props.data.title}</h2>
                <SearchPlugin filter={this.filterList} />
                <ul>
                    {this.state.items.map(function (item) {
                        return <li key={item}>{item}</li>;
                    })}
                </ul>
            </div>
        );
    }
}

export default ItemsList;

Discussion: support for saving locally to IndexedDB

It would be cool if visitors to Privacy Centers with previous access requests can continue viewing their export, even after it's gone from our systems.

IndexedDB allows for Blob storage. Could be interesting.

(A) Is this a worthwhile feature?
(B) Does it belong here or in the Privacy Center code?

TypeError: Cannot read property 'bind' of undefined

I think there's something wrong with the build or the instructions (maybe a dependency?). I tried to use this, but no matter what I tried I got the above error. Even just doing import { penumbra } from '@transcend-io/penumbra'; in a webpack/React app, or adding the pre-built build/penumbra.js to an HTML page, results in the error.

Multi-thread jobs & streaming improvements

Penumbra should offer an option to use multiple job execution threads/workers for improved performance.

This will require a major refactor touching streaming, error handling, and a new job API model. Fixing this could unburden much of penumbra's tech debt.

  • Use an adaptive worker pool
  • Improve job API model

Configurable streamsaver endpoint

Penumbra should have an option for configuring the streamsaver HTML endpoint. This should improve security by allowing library users to self-host the endpoint instead of having to rely on our hardcoded endpoint.

BrowserStack testing is flaky with Chrome

About 1/5th of the time Chrome doesn't run any Penumbra tests and the CI needs to be re-triggered to pass.

I'm not sure if there is anything that we can do to improve this on our end.

Repository cleanup

Couple of things to fix:

  • Error indicators on README need to be fixed
  • Issues could be cleaned up a tiny bit

Add benchmarks

We should benchmark the time it takes to decrypt and download encrypted files. We still need to decide on a benchmarking framework.

Firefox crashes intermittently

Some unsolicited testing: I was waiting for a call and had a few min so checked out the latest on Transcend and saw this new repo, so I gave it a try.

Downloading the patreon video crashed intermittently in Firefox. I couldn't capture a crash, but here's the performance profile and a screenshot. Are web workers being used in the demo?

(Attachments: a screenshot and profile.json.zip)

Port Node crypto to WebCrypto

In order to reduce the worker script bundle size, we should look into porting our Node crypto usage to WebCrypto. This is a low priority task.
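
For reference, a minimal sketch of what AES-256-GCM decryption could look like with WebCrypto (not Penumbra's current code; note that WebCrypto expects the auth tag appended to the ciphertext):

async function webCryptoDecrypt(rawKey, iv, ciphertextWithTag) {
  // Import the raw AES key for decryption only
  const key = await crypto.subtle.importKey('raw', rawKey, 'AES-GCM', false, [
    'decrypt',
  ]);
  // Decrypts and verifies the auth tag in one call; rejects if verification fails
  return crypto.subtle.decrypt({ name: 'AES-GCM', iv }, key, ciphertextWithTag);
}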

Future of downloadEncryptedFile and getDecryptedContent

Right now Penumbra bundles StreamSaver.js and implements a download function and a display function.

Should Penumbra be just about returning a ReadableStream of deciphered data? After all, these ReadableStreams are what we'd want to use as input to conflux.

fetchAndDecipher is the bread and butter of Penumbra. The repo description is "Fetch and decrypt files in the browser" - that's this function.

downloadEncryptedFile and getDecryptedContent (and their helpers) are just use cases that extend fetchAndDecipher to download the file or display it in the DOM.

Perhaps we break out these specific use-cases into another file?

Add tests

We should add basic tests in penumbra and also consider integrating with heavier browser-based frameworks (like cypress) in the long-run.

Clean up resource hints

@eligrey

This should play nicely with React. The only issue I can think of is the memory leak, which can be solved by returning a cleanup function that can be called when you are done with your connections:

    const preconnect_cleanup = preconnect(origin);
    await makeRequestsToOrigin(...);
    preconnect_cleanup();

Originally posted by @eligrey in #20

@bencmbrook:

We might want to do that by default upon completion of a download. After a URL has been downloaded, call

cleanupHint(href)

It wouldn't account for fetching multiple times (would anyone want to?), but we could have an override option on the RemoteResource object, or just be opinionated about it (personally, idc)

Inline worker script for easier library usage

We should inline our worker script so that the entry bundle can instantiate penumbra workers without any external URL references.

We can do this by adding a build step to compile the worker script and expose this through process.env (or similar) for embedding by Webpack. This embedded minified worker script could be allocated to a Blob URI at runtime and used for worker instantiation.
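
A sketch of the runtime side, assuming the build step injects the compiled worker source as a string (the environment variable name is hypothetical):

// Hypothetical: the bundler replaces this with the minified worker source at build time
const workerSource = process.env.PENUMBRA_WORKER_SOURCE;
// Allocate the embedded source to a Blob URI and instantiate the worker from it
const workerUrl = URL.createObjectURL(
  new Blob([workerSource], { type: 'text/javascript' }),
);
const worker = new Worker(workerUrl);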

Pull #20 broke the example

  • This likely implies that Penumbra is broken for all of 2.x -- premature merge of #20?

  • Build dir now looks like this (see the attached screenshot), which won't work in browsers (it's probably fine for Transcend since we use Webpack to recompile it anyway)

getTextOrURI batch API

I'd actually prefer to receive an array of promises, rather than a promise of an array. This allows us to display files as they're ready. The user can always call Promise.all if they want them all at once.

Current return:
Promise<PenumbraTextOrURI[]>

Proposed:
Promise<PenumbraTextOrURI>[]

This is as simple as returning the array rather than the Promise.all here

penumbra/src/API.ts

Lines 179 to 181 in 8eaaab9

return Promise.all(
  (data as PenumbraFile[]).map(async (file) => getTextOrURI(file)),
) as Promise<PenumbraTextOrURI[]>;

Note the demo is able to do this because I worked around it by calling getTextOrURI many times with an array length of 1.

Save to IndexedDB ("prefetch")

Can we write decrypted streams to IndexedDB? This can be a nice way of implementing "prefetch". Files that aren't yet needed can be fetched, decrypted, put into IndexedDB, then pulled out when needed (e.g. when arriving at that page).
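
A rough sketch of what this could look like with today's API (the helper below is hypothetical and buffers the whole file via getBlob; true stream-to-IndexedDB writing would need chunked records):

// Hypothetical helper, not part of Penumbra
async function prefetchToIndexedDB(resource) {
  const [file] = await penumbra.get(resource);
  const blob = await penumbra.getBlob(file);
  // Open (or create) a simple object store keyed by URL
  const db = await new Promise((resolve, reject) => {
    const req = indexedDB.open('penumbra-cache', 1);
    req.onupgradeneeded = () => req.result.createObjectStore('files');
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
  // Store the decrypted Blob for later retrieval
  await new Promise((resolve, reject) => {
    const tx = db.transaction('files', 'readwrite');
    tx.objectStore('files').put(blob, resource.url);
    tx.oncomplete = resolve;
    tx.onerror = () => reject(tx.error);
  });
}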

`penumbra.saveZip()` Background Fetch integration for rich download UI

We could add an optional resources field onto ZipOptions consisting of uninitialized RemoteResource descriptors. A Service Worker could handle downloading these resources with Background Fetch which enables us to create rich download UIs and persistent downloads that continue when the tab is closed.

(Rich download UI example screenshot attached.)

Background Fetch example (from https://developers.google.com/web/updates/2018/12/background-fetch)

navigator.serviceWorker.ready.then(async (swReg) => {
  const bgFetch = await swReg.backgroundFetch.fetch('my-fetch', ['/ep-5.mp3', 'ep-5-artwork.jpg'], {
    title: 'Episode 5: Interesting things.',
    icons: [{
      sizes: '300x300',
      src: '/ep-5-icon.png',
      type: 'image/png',
    }],
    downloadTotal: 60 * 1024 * 1024,
  });
});

Limitations

This feature might not be compatible with use cases that involve an unknown number of asynchronously paginated resources.

Penumbra 3.0 API

The new API will incorporate file zipping and saving, since we're building everything to be streamable and performed entirely inside a Service Worker.

Maybe it's a chainable thing?

fetchManyAndDecrypt(resources).save()
fetchManyAndDecrypt(resources).getTextOrURI()
fetchAndDecrypt(resource).save()
fetchAndDecrypt(resource).getTextOrURI()

@giacaglia @eligrey

Offload decryption to Worker

Fetching, decrypting, zipping, and saving will start to get really slow when we start incorporating large files.

This is because (A) Safari doesn't support WritableStreams and (B) JSZip reads files into a Blob (i.e. fully into memory) rather than a stream.

I'm working on a streaming zip alternative based on this issue (we're building an open source package with Jimmy Warting). I'm also working on bundling a service worker option into Penumbra, but it should be done on privacy-page so the .zip bundling is done on the SW as well.

While these are good speedups, a simpler solution is to offload this processing to a Service Worker so it doesn't block the DOM thread.

The service worker should do everything until the final output - so it fetches, decrypts, zips, and then sends the final output to the DOM thread for use (often a stream of data - e.g. the final zip - or otherwise an output, like a string of decrypted text, or the src of an object URI for media).

This was done successfully with Penumbra here using Comlink:

  • The worker script (exposing the Penumbra API over Comlink):

    import { getDecryptedContent, downloadEncryptedFile } from '../dist/index';
    import * as Comlink from 'comlink';

    const myValue = 43;

    class PenumbraSW {
      getDecryptedContent(...args) {
        return getDecryptedContent(...args);
      }
      downloadEncryptedFile(...args) {
        return downloadEncryptedFile(...args);
      }
      logSomething() {
        console.log(`my value = ${myValue}`);
      }
    }

    Comlink.expose(PenumbraSW);

  • Wrapping the worker on the main thread:

    const Penumbra = USE_SERVICE_WORKER
      ? // Offload penumbra processing to service worker
        Comlink.wrap(
          new Worker('./decryption.worker.js', { type: 'module' })
        )

  • Using the wrapped instance:

    new Penumbra()
      .then((instance) =>
        instance.getDecryptedContent(url, file.key, file.iv, file.authTag, file.mime),
      )
      .then((url) => (image.src = url));

  • The worker-plugin webpack config:

    plugins: [
      new WorkerPlugin({
        globalObject: false,
      }),
    ],

Consider simple webpack plugin

Since penumbra requires some webpack config, it would be nice to bundle this as a small plugin so anyone can just import the config and have penumbra working.

Question: What's missing on Safari?

Just wondering, are there plans to support encrypt on Safari? What's missing about it?

Also, what's the role of authTag? Which algorithms are used for encryption/decryption? It would be great to have some docs about all that. Given the lack of encryption support on Safari, I was thinking about encrypting server side (e.g., with AES CTR + IV + Key derived from PBKDF2), but I can't seem to find a way to decrypt a file encrypted like that.
