Slow Reader

A web app that combines feeds from social networks and RSS and helps you read more meaningful, in-depth content.

Right now, it is just a prototype. Planned features:

  • Combine all feeds (social media, RSS) in a single app.
  • Track how useful each subscription is to you.
  • Split subscriptions into slow (something useful and deep) and fast-food (fun and small).
  • Spend more time on slow content, for example by blocking fast content in the evening.

Open in StackBlitz

Pre-alpha prototype: dev.slowreader.app

↬ How to contribute and join the team

For any questions: h+h lab Discord


License

Slow Reader is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General Public License (version 3 or any later version). For details, see the LICENSE.md file in this directory.

Principles

Local-First: Clients Do All the Work

Local-first means the client stores all data locally and does most of the work. Even if we decide to shut down the cloud, you can still use the Slow Reader client.

In our case, it means that:

  • The client stores all feeds and posts, so you can read posts offline.
  • The client checks feeds for new posts.
  • The cloud is only needed to sync data between clients.

The Local-first manifesto explains the philosophy behind this idea in more detail.

Zero-Knowledge Synchronization

Clients use end-to-end encryption during cloud sync. This means the cloud can’t know what feeds you read or what posts you like.

You can check the encryption source code.

We have a proxy to make HTTP requests to other servers from the website. This proxy is only for development, for bypassing government censorship, or for an initial test of the app. The web client should mostly use an upcoming web extension to bypass the CORS limit. Upcoming native clients will use direct HTTP requests, since they don’t have CORS limitations.
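
As an illustration only (the real proxy URL format may differ), a web client in development could route a feed request through the proxy roughly like this:

// Hypothetical sketch: fetch a feed through the dev proxy to avoid CORS.
// The proxy address and URL format here are assumptions, not the real API.
const PROXY = 'http://localhost:5284/'

async function fetchFeedThroughProxy(feedUrl: string): Promise<string> {
  let response = await fetch(PROXY + encodeURIComponent(feedUrl))
  if (!response.ok) throw new Error(`Proxy returned ${response.status}`)
  return response.text()
}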

Event-Sourcing and CRDT

The source of truth in the client is a list of changes (action log). An action is a JSON object like:

{
  "type": "feeds/changed",
  "id": "kc4VfXpvw3vZZBu_ugGlC",
  "fields": {
    "reading": "slow"
  }
}

To render the UI, the client reduces actions from the log into the state (objects of feeds, posts, etc.). We store the state cache in Nano Stores.
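
As an illustrative sketch (not the project’s actual code), reducing the feeds/changed action above into a Nano Stores map could look like this:

import { map } from 'nanostores'

// Simplified shapes for illustration; the real stores live in core/.
interface FeedsChangedAction {
  type: 'feeds/changed'
  id: string
  fields: Partial<{ reading: 'slow' | 'fast' }>
}
type FeedsState = Record<string, { reading?: 'slow' | 'fast' }>

export const $feeds = map<FeedsState>({})

// Reduce one action from the log into the cached state.
export function reduceFeedsChanged(action: FeedsChangedAction): void {
  let previous = $feeds.get()[action.id] ?? {}
  $feeds.setKey(action.id, { ...previous, ...action.fields })
}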

The log simplifies synchronization. We just need to track the last synchronized action and send all actions after that one.

We use Logux to work with the log and synchronization.
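
A minimal sketch of creating a Logux client; the server URL and credentials below are placeholders, and the real setup lives in the clients:

import { CrossTabClient } from '@logux/client'

// Placeholder values; the real server URL and auth are configured per client.
export const client = new CrossTabClient({
  server: 'wss://example.invalid/logux',
  subprotocol: '1.0.0',
  userId: 'anonymous'
})

client.start()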

Client Core: Reusing All Logic Between Different Platforms

For now, we have only a web client. But we want to have native clients for different platforms.

To make client porting easier, we separate core logic and UI. Core logic is the same for every client. The client just needs to bind this logic to the UI using native components.

We write logic in TypeScript as smart stores in the Nano Stores state manager. The client only needs to subscribe to stores and render the UI according to their state.

We try to move as much as possible into the stores: app routing, validations, and UI helpers. The client should be as thin as possible; the ideal client is just a UI renderer.
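
For example, a client only has to listen to a core store and render whatever it receives (a sketch with a stand-in store, not the core’s real routing store):

import { atom } from 'nanostores'

// Stand-in for a core store; the real core exposes its own routing store.
const $route = atom<{ page: string }>({ page: 'feeds' })

// The client is just a renderer: it listens to the store and updates the UI.
export const unbind = $route.subscribe(route => {
  document.title = `Slow Reader: ${route.page}`
})

// Call unbind() when the view is destroyed to stop listening.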

The core depends on the platform environment (for example, storage for settings). Before using any store, the client must call setEnvironment() to define how the core should interact with the platform.
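
A hedged sketch of that wiring; the environment shape and field names below are hypothetical, and the real setEnvironment() contract is defined in core/:

// Hypothetical environment shape for illustration; see core/ for the real one.
interface Environment {
  localSettings: Pick<Storage, 'getItem' | 'setItem' | 'removeItem'>
  openUrl: (url: string) => void
}

// Stand-in declaration for the setEnvironment() exported from core/.
declare function setEnvironment(env: Environment): void

// The web client wires browser APIs into the core before using any store.
setEnvironment({
  localSettings: window.localStorage,
  openUrl: url => window.open(url, '_blank')
})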

Project Structure

Slow Reader is a local-first app. Clients do most of the work, and the server just syncs data between users’ devices (with end-to-end encryption).

  • Clients.
    • core/: the client’s logic and i18n translations. Clients for specific platforms are just a UI around this core, which simplifies porting.
    • web/: the client that runs in the browser, both for desktop and mobile.
  • server/: a small server that syncs data between users’ devices.
  • proxy/: an HTTP proxy server to bypass censorship or to let users try the web client before installing the upcoming extension (which bypasses the CORS limit of web apps).
  • api/: types and constants shared between clients and server.
  • docs/: guides for developers.
  • scripts/: scripts to test the project and configure Google Cloud. Check each script’s description for further details.
  • loader-tests/: integration tests for each social network or news format.
  • .github/: scripts to test the project on CI.
  • .husky/: scripts called on the git commit command to avoid common errors.
  • .vscode/: VS Code settings to reduce code formatting errors for new contributors.

We are using a pnpm monorepo. Each project has its own dependencies, tools, and configs. Read the README.md in each project for that project’s files and architecture.

Tools

Global development tools:

  • asdf to synchronize Node.js and pnpm versions across the team and CI.
  • Prettier to use the same code style formatting.
  • TypeScript for strict type checking.
  • ESLint to check for common mistakes in JavaScript.
  • remark to find mistakes in .md files.

Each project has its own tools, too.

Scripts

  • pnpm test: run all tests.
  • pnpm start: run proxy and web client development server.
  • pnpm format: fix code style in all files.
  • pnpm clean: remove all temporary files.
  • pnpm check-opml: test loaders with user’s OPML RSS export.

We use a pnpm feature to run scripts in parallel: each project defines scripts like test:types and test:audit, and then we run them all across all projects by the test:* prefix.

Synchronization Protocol

We use the Logux WebSocket protocol to synchronize actions between clients and the server.

Clients keep a list of changes (action log) as the source of truth and then send new actions to the server. The server then sends all new actions to other clients.

The server doesn’t see the contents of those actions because clients encrypt them before sending and decrypt them upon receiving. The server sees only actions like:

// Add encrypted action to the server log
{
  "type": "0",
  "d": "encrypted data",
  "iv": "random vector used together with password in encryption"
}
// Remove action from the server log
{
  "type": "0/clean",
  "id": "action ID"
}
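
A rough sketch of how a client could produce the "0" action shape above with the Web Crypto API (illustration only; the real code is in the encryption module linked earlier):

// Illustration only: wrap a plain action into the encrypted "0" shape above.
function toBase64(bytes: Uint8Array): string {
  let binary = ''
  for (let byte of bytes) binary += String.fromCharCode(byte)
  return btoa(binary)
}

async function encryptAction(
  action: object,
  key: CryptoKey
): Promise<{ type: '0'; d: string; iv: string }> {
  let iv = crypto.getRandomValues(new Uint8Array(12))
  let plain = new TextEncoder().encode(JSON.stringify(action))
  let cipher = await crypto.subtle.encrypt({ name: 'AES-GCM', iv }, key, plain)
  return { type: '0', d: toBase64(new Uint8Array(cipher)), iv: toBase64(iv) }
}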

Client Storage

The clients store a list of changes (the action log). On start, the client reduces all necessary actions from the log into Logux SyncMap stores.
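
A minimal sketch, assuming the syncMapTemplate helper from @logux/client (the core defines its own real stores):

import { syncMapTemplate } from '@logux/client'

// Illustrative value shape; the real feed store is defined in core/.
export type FeedValue = {
  url: string
  reading: 'slow' | 'fast'
}

// Each feed becomes a SyncMap store, filled from the local action log
// and kept in sync with other devices through the server.
export const Feed = syncMapTemplate<FeedValue>('feeds')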

For simple things like local client settings, we use Nano Stores Persistent.

The web client uses IndexedDB to store the log and localStorage for the client’s settings.
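
For example, a simple local setting could be a persistent store backed by localStorage (a sketch; the real settings are defined in core/):

import { persistentAtom } from '@nanostores/persistent'

// Hypothetical client setting, stored in localStorage under "slowreader:theme".
export const $theme = persistentAtom('slowreader:theme', 'light')

// Reading and changing it works like any other Nano Stores atom.
$theme.set('dark')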

Test Strategy

If any mistake happens a few times, we should add an automatic tool to prevent it in the future. Possible strategies:

  1. Types.
  2. Scripts, custom ESLint or Stylelint plugins.
  3. Unit tests.
  4. Pull request checklist.

Any code-style rule should be implemented as a pre-commit hook or a linter rule.

Types should be as precise as possible and explain data relations:

- { type: string, error?: Error }
+ { type: 'ok' } | { type: 'error', error: Error }
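
The second form lets the compiler enforce the relation between the fields:

type Result = { type: 'ok' } | { type: 'error'; error: Error }

function describe(result: Result): string {
  if (result.type === 'error') {
    // Narrowed: error is guaranteed to exist in this branch.
    return result.error.message
  }
  return 'ok'
}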

We use unit tests for the client core. We mock network requests and the platform environment, but we emulate user interaction and test the composition of all stores.

For the platform clients, we mostly use visual tests. But they can be complex and test whole pages with mocked core stores.

Visual Language

We prefer the platform native look and feel where possible.

Where not possible, we use old-style 3D with rich visual feedback and a z-axis.

The slow mode should always use a yellow newspaper-like background (on color screens).

We are using Material Design Icons.

On desktops, we care not only about mouse UX but also about keyboard UX. Our keyboard UX rules (see the sketch after this list):

  • Create a path: what keys can the user press to do some action? Try to make that path as short as possible.
  • Make hotkeys and non-standard keys visible to the user.
  • Think about focus. If the user starts to interact with the keyboard, move the focus to the next control.
  • Esc should work in as many cases as possible.
  • Don’t use only Tab to navigate. Mix it with arrows and hotkeys for list items.
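
A hedged sketch of these rules applied to a list (not the project’s actual code): arrows move focus between items and Esc always leaves the list:

// Illustration only: keyboard navigation for a focusable list of items.
function bindListKeys(list: HTMLElement): void {
  list.addEventListener('keydown', event => {
    let items = Array.from(list.querySelectorAll<HTMLElement>('[role="option"]'))
    let index = items.indexOf(document.activeElement as HTMLElement)
    if (event.key === 'ArrowDown' && index < items.length - 1) {
      items[index + 1].focus()
    } else if (event.key === 'ArrowUp' && index > 0) {
      items[index - 1].focus()
    } else if (event.key === 'Escape') {
      // Esc should work in as many cases as possible: leave the list.
      (document.activeElement as HTMLElement | null)?.blur()
    }
  })
}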

Dependencies

How we choose dependencies:

  1. Always check alternatives from npm search; don’t just take the most popular one.
  2. By project activity, looking at the repository, issues, and PRs.
  3. By JS bundle size for web client dependencies.
  4. By node_modules size and number of subdependencies.

After adding a web client dependency, do not forget to call cd web && pnpm size to check the real size of the dependency in our JS bundle.

We put into dependencies only the packages we need for production deploy; all other dependencies go into devDependencies. During production deploy, we use pnpm install --prod to reduce the security risk of malicious code in some dependency.

In package.json we use exact version requirements like 1.0.0 instead of ^1.0.0, so we do not get unexpected updates of direct dependencies if the pnpm lock file breaks. The ./scripts/check-versions.ts script in the pre-commit hook checks that you do not forget this rule.

To update a specific dependency, use:

pnpm update DEPENDENCY

To update all dependencies:

pnpm update --interactive --latest -r --include-workspace-root
pnpm update -r --include-workspace-root

We try to update all dependencies at least once per week.

Guides
