
Pandanite

Pandanite is a minimalist implementation of a layer 1 cryptocurrency similar to Bitcoin. It is designed with utmost simplicity and user friendliness in mind and is written from the ground up in C++ — it isn't yet another re-packaging of existing open-source blockchain code (where's the fun in that?!).

Circulation

Pandanite is minted by miners who earn rewards. Mining payments occur using the twothirding algorithm, which yields a total final circulation of ~99.1M PDN:

  • 6647477.8490 PDN distributed as a carry-over from previous forks
  • 50 PDN per block at Heights 1 to 515736
  • 50*(2/3) PDN per block from blocks 515737 to 515736+666666
  • 50*(2/3)^2 PDN per block from blocks 515736+666667 to 515736+2*666666

etc.

Comparison with halving

Block reward changes occur more often and each change has a smaller impact than a halving, so the payout curve is smoother under twothirding than under halving.

Technical Implementation

Pandanite is written from the ground up in C++. We want the Pandanite source code to be simple, elegant, and easy to understand. Rather than adding duct-tape to an existing currency, we built Pandanite from scratch with lots of love. There are a few optimizations that we have made to help further our core objectives:

  • Switched the signature scheme from secp256k1 (used by ETH & BTC) to Ed25519, which gives roughly an 8x speedup during verification and public keys half the size.
  • Up to 25,000 transactions per block, with a 90-second block time

Getting Started

Windows: not currently supported as a build environment. You can run the dcrptd miner to mine Pandanite instead.

Mac OSX build pre-requirements

brew install leveldb
brew install cmake
pip3 install conan

Ubuntu 18.04 LTS install pre-requirements

sudo apt update
sudo apt-get -y install make cmake automake libtool python3-pip libleveldb-dev curl git
sudo update-alternatives --install /usr/bin/python python /usr/bin/python3.6 1
sudo pip3 install conan==1.59

Ubuntu 20.04 LTS install pre-requirements

sudo apt-get update
sudo apt-get -y install make cmake automake libtool python3-pip libleveldb-dev curl git
sudo update-alternatives --install /usr/bin/python python /usr/bin/python3.8 1
sudo pip3 install conan==1.59

Ubuntu 22.04 LTS install pre-requirements

sudo apt-get update
sudo apt-get -y install make cmake automake libtool python3-pip libleveldb-dev curl git
sudo update-alternatives --install /usr/bin/python python /usr/bin/python3.10 1
sudo pip3 install conan==1.59

Building

git clone https://github.com/pandanite-crypto/pandanite
cd pandanite
mkdir build
cd build
conan install .. --build=missing
cd ..
cmake .

*Ubuntu 18.04 requires a code change to build the server. In src/server/server.cpp, change:

Line 10
#include <filesystem>
to
#include <experimental/filesystem>

Lines 50, 52, & 58
std::filesystem::...  
to 
std::experimental::filesystem::...

To compile the miner run the following command:

make miner

You will also need the keygen app to create a wallet for your miner:

make keygen

To compile the node server:

make server

To compile a simple CLI tool that lets you send PDN:

make cli

For a separate, interactive GUI wallet see https://github.com/pandanite-crypto/pandanite-wallet

Usage

Start by generating keys.json.

./bin/keygen

Keep a copy of this file in a safe location -- it contains the public/private key pair for the wallet that the miner will mint coins to. If you lose this file you lose your coins. We recommend copying it to a dedicated thumb drive (one you don't re-use) the moment you generate it.

To start mining:

./bin/miner

To host a node:

./bin/server

Some useful server arguments:

-n (Custom Name, shows on peer list)
-p (Custom Port, default is 3000)
--testnet (Run in testnet mode, good for testing your mining setup)

The full list of arguments can be found here: https://github.com/pandanite-crypto/pandanite/blob/master/src/core/config.cpp

To send PDN to another address (run with --local flag if the server you want to use is listening on localhost):

./bin/cli

Docker

Pandanite is pre-built for amd64 and arm64 with GitHub Actions and distributed via the GitHub Container Registry.

Running with Docker


docker run -d --name pandanite -p 3000:3000 -v $(pwd)/pandanite-data:/pandanite/data ghcr.io/pandanite-crypto/pandanite:latest server
docker logs -f pandanite

You can follow the progress of the server sync at http://localhost:3000

Running with docker-compose is recommended to easily add options such as CPU usage limits and health checks:

version: '3.4'

services:
  pandanite:
    image: ghcr.io/pandanite-crypto/pandanite:latest
    command: server
    ports:
      - 3000:3000
    volumes:
      - ./pandanite-data:/pandanite/data
    restart: unless-stopped
    cpus: 8
    healthcheck:
      test: ["CMD", "curl", "-sf", "http://127.0.0.1:3000"]
      interval: 10s
      timeout: 3s
      retries: 3
      start_period: 10s

Building with docker

Clone this repository and then

docker build . -t pandanite
docker run [OPTIONS] pandanite server

Running CI build locally

docker run --privileged --rm tonistiigi/binfmt --install all
docker buildx create --use

GITHUB_REPOSITORY=NeedsSomeValue GITHUB_SHA=NeedsSomeValue docker buildx bake --progress=plain

pandanite's People

Contributors

barrystyle, bugtastic, crodnun, de-crypted, j-ttt, johanaugustsandels, matti, mbroemme, mr-pandabear, mrmikeo, orchardstreet


pandanite's Issues

BMB003: Implement programID related REST endpoints and modifications

Update the following endpoints to take an optional programID GET parameter:

  • block_headers
  • block_count
  • sync
  • total_work

Add an endpoint to submit a program, add_program, which takes JSON containing a hexdump of the WASM code for the program and a SHA256 hash serving as the programID

Add an endpoint called get_program which takes the SHA256 hash of a program and fetches its contents

Improve node bootup time by caching hashes or lastBlockHash per IP

Node bootup time is a huge pain point. We should either:

  1. Create an on-disk cache of pufferfish inputs and outputs
  2. Store, per IP, the lastBlockHash and the last downloaded block, so that we don't need to re-download the entire header chain. This has some nuance, because it may not handle edge cases where the node has forked the chain.

wlinknet upgrade: excessive sync retries on ./bin/server, excessive error notices, a false prompt to delete the data directory on sync crash, and near-guaranteed crashes during sync (not normal use). Syncing needs manual restarts and lots of patience (20+ hours) without a bootstrap.

This is a bug report for the wlinknet upgrade branch of the main repo. I tested using that branch, also known as the 'new node'.

Syncing has improved over the last few months: a bootstrap is no longer needed if you have patience and many hours to spare, which is very good. Longstanding node issues were also fixed by wlinknet, which is great. Random crashing (after sync) seems to be gone, meaning exchanges can probably take PDN out of maintenance (with a bootstrap).

However, even with the upgrade, syncing will exceed the maximum retries and need manual restarts. After it exceeds max retries following a crash, it prompts the user to delete the data directory. They should not delete it, as syncing will continue if restarted at this point. A band-aid solution would be to suppress the log errors, remove the message telling the user to delete the data directory, and restart the syncing process automatically.

A long-term solution, which wlinknet said he may use, involves building a map of the bad blocks so as to navigate around them. Erik also said a JSON file fix may resolve most or all of this issue.

tl;dr: syncing without a bootstrap works on the upgrade, but it is not reasonable to ask users or exchanges to do so given the time required, the excessive warnings, and the prompts to delete the data directory. In my opinion, more work (or at least a band-aid solution) is needed before releasing the new node. As mentioned earlier, wlinknet's upgrade solves many issues and is very good, but I think these issues are worth noting before release. Thanks.

Add /mine_status/<blockID> endpoint

This endpoint should return:

  • timestamp of the block
  • wallet address of the miner
  • total minting fee for the block
  • total tx fees for the block

MerkleTree construction is wrong

The MerkleTree generated is incorrect, and may produce trees such as the following

      x
      / \
     x   \
    / \   \
   x   x   x
  / \ / \ / \
  x x x x x x

segment fault?

ubuntu@vmi617837:~/pandanite$ ./bin/miner --local
░░░░░░░░▄██▄░░░░░░▄▄░░
░░░░░░░▐███▀░░░░░▄███▌
░░▄▀░░▄█▀▀░░░░░░░░▀██░
░█░░░██░░░░░░░░░░░░░░░
█▌░░▐██░░▄██▌░░▄▄▄░░░▄
██░░▐██▄░▀█▀░░░▀██░░▐▌
██▄░▐███▄▄░░▄▄▄░▀▀░▄██
▐███▄██████▄░▀░▄█████▌
▐████████████▀▀██████░
░▐████▀██████░░█████░░
░░░▀▀▀░░█████▌░████▀░░
░░░░░░░░░▀▀███░▀▀▀░░░░
[STATUS] 08-16-2023 15:31:50: Ignoring host http://129.146.9.111:3000
[STATUS] 08-16-2023 15:31:50: Running miner. Coins stored in : 0013481B9A47CBCEA1360B25739B4F0E479F4C6DB7DE0CDA0A
[STATUS] 08-16-2023 15:31:50: Starting miner on single thread
[STATUS] 08-16-2023 15:31:50: [ NEW ] block = 155402, difficulty = 23, transactions = 0 - http://localhost:3000
Segmentation fault

I am running the server locally, but it seems to keep looping over the blocks: it syncs for a while and then seems to start again from the first block.

[STATUS] 08-16-2023 15:33:26: Added block 166961
[STATUS] 08-16-2023 15:33:26: difficulty= 23
[STATUS] 08-16-2023 15:33:26: Added block 166962
[STATUS] 08-16-2023 15:33:26: difficulty= 23
[STATUS] 08-16-2023 15:33:26: Added block 166963
[STATUS] 08-16-2023 15:33:26: difficulty= 23
[STATUS] 08-16-2023 15:33:26: Added block 166964
[STATUS] 08-16-2023 15:33:26: difficulty= 23
[STATUS] 08-16-2023 15:33:26: Added block 166965
[STATUS] 08-16-2023 15:33:26: difficulty= 23
[STATUS] 08-16-2023 15:33:26: Added block 166966
[STATUS] 08-16-2023 15:33:26: difficulty= 23
[STATUS] 08-16-2023 15:33:26: Added block 166967
[STATUS] 08-16-2023 15:33:26: difficulty= 23
[STATUS] 08-16-2023 15:33:26: Added block 166968
[STATUS] 08-16-2023 15:33:26: difficulty= 23
[STATUS] 08-16-2023 15:33:26: Added block 166969
[STATUS] 08-16-2023 15:33:26: difficulty= 23
[STATUS] 08-16-2023 15:33:26: Added block 166970
[STATUS] 08-16-2023 15:33:26: difficulty= 23
[STATUS] 08-16-2023 15:33:26: Added block 166971
[STATUS] 08-16-2023 15:33:26: difficulty= 23

cli segfault

Running the cli and waiting a couple of seconds will result in a segfault.

Thread 25 "cli" received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0x7ffff6612700 (LWP 21066)]
0x00005555557e4ce7 in PufferfishCache::getHash (this=0x7fffd8001130, hash=...)
    at /root/bamboo/src/server/pufferfish_cache.cpp:22
22          leveldb::Status status = db->Get(leveldb::ReadOptions(), sha256ToSlice(hash), &value);

gdb backtrace:

#0  0x00005555557e4ce7 in PufferfishCache::getHash (this=0x7fffd8001130,
    hash=...) at /root/bamboo/src/server/pufferfish_cache.cpp:22
#1  0x0000555555796ae8 in PUFFERFISH (
    buffer=0x7ffff6611990 "\234\214\347\017\211\376\241\247b\227\\\372s\216>]\200\331", len=64, useCache=true) at /root/bamboo/src/core/crypto.cpp:31
#2  0x0000555555796e39 in SHA256 (
    buffer=0x7ffff6611990 "\234\214\347\017\211\376\241\247b\227\\\372s\216>]\200\331", len=64, usePufferFish=true, useCache=true)
    at /root/bamboo/src/core/crypto.cpp:53
#3  0x0000555555797e27 in concatHashes (a=..., b=..., usePufferFish=true,
    useCache=true) at /root/bamboo/src/core/crypto.cpp:224
#4  0x0000555555797fed in verifyHash (target=..., nonce=...,
    challengeSize=6 '\006', usePufferFish=true, useCache=true)
    at /root/bamboo/src/core/crypto.cpp:246
#5  0x0000555555774985 in Block::verifyNonce (this=0x7ffff6611c70)
    at /root/bamboo/src/core/block.cpp:155
#6  0x000055555579ab3d in HeaderChain::load (this=0x555555e7ea10)
    at /root/bamboo/src/core/header_chain.cpp:107
#7  0x000055555579a1d3 in chain_sync (chain=...)
    at /root/bamboo/src/core/header_chain.cpp:13
#8  0x000055555579c5d4 in std::__invoke_impl<void, void (*)(HeaderChain&), std::reference_wrapper<HeaderChain> > (
    __f=@0x555555e7dcc0: 0x55555579a1ac <chain_sync(HeaderChain&)>)

should the cli even be doing a header chain sync?

Where are my coins?

A few hours ago I had 82650 BMB on 00D368B6A8C88D240D43A92D05307A35588DE465B7C0DF94BB, but now I have 82400

Add exceptions to invalid addresses

Currently you can create junk addresses with the stringToAddress function; it should throw an exception on an invalid hex string (one that is not exactly 50 hex digits).

BMB004: Implement RhizomeExecutor and RhizomeChain classes

RhizomeExecutor will feed in the following into a WASM module:

  • Layer1Block
  • RhizomeBlock
  • Ledger
  • TransactionStore
  • Layer1Ledger

The WASM module will implement the following two methods:
ExecuteBlock
RevertBlock

Implement a base WASM module for something simple as a proof of concept (e.g. a basic token)

MemPool Updates

  • Impose a minimum transaction fee in the mempool and handle dust transactions. The minimum fee should be computed dynamically from the total transaction volume of the last N blocks.

  • Increase mempool volume beyond block capacity.

  • Prioritize transactions so that high-fee transactions reach miners first.

build fails on fresh ubuntu 22.04

usockets/0.8.1: Downloaded recipe revision 0
conanfile.txt: Installing package
Requirements
    openssl/3.0.0 from 'conancenter' - Downloaded
    usockets/0.8.1 from 'conancenter' - Downloaded
    uwebsockets/19.3.0 from 'conancenter' - Downloaded
    zlib/1.2.12 from 'conancenter' - Downloaded
Packages
    openssl/3.0.0:b88c434d8110e892fcfb7bbb4f76234f590822f1 - Download
    usockets/0.8.1:ee9429af7c481efe3fd412f23bec8d4d71ff3ce6 - Missing
    uwebsockets/19.3.0:5ab84d6acfe1f23c4fae0ab88f26e3a396351ac9 - Download
    zlib/1.2.12:dfbe50feef7f3c6223a476cd5aeadb687084a646 - Download

Installing (downloading, building) binaries...
ERROR: Missing binary: usockets/0.8.1:ee9429af7c481efe3fd412f23bec8d4d71ff3ce6

usockets/0.8.1: WARN: Can't find a 'usockets/0.8.1' package for the specified settings, options and dependencies:
- Settings: arch=x86_64, build_type=Release, compiler=gcc, compiler.version=11, os=Linux
- Options: eventloop=syscall, fPIC=True, with_libuv=deprecated, with_ssl=False
- Dependencies:
- Requirements:
- Package ID: ee9429af7c481efe3fd412f23bec8d4d71ff3ce6

ERROR: Missing prebuilt package for 'usockets/0.8.1'
Use 'conan search usockets/0.8.1 --table=table.html -r=remote' and open the table.html file to see available packages
Or try to build locally from sources with '--build=usockets'

More Info at 'https://docs.conan.io/en/latest/faq/troubleshooting.html#error-missing-prebuilt-package'

BMB002: Implement create program transaction

This special transaction type creates a new program on the blockchain for a given wallet. It will be flagged using -2 as the timestamp parameter and transmitted using the following schema:

struct TransactionProgramInfo {
    char signature[64];
    char signingKey[32];
    uint64_t timestamp;
    char programHash[32];
    TransactionAmount fee;
};

The executor will be updated so as to lock funds within the wallet corresponding to the signingKey. We will have to re-architect the to/from JSON methods of the transaction class and add significant new support for handling multiple transaction types.

TransactionStore needs to handle ED25519 signature malleability problem

TransactionStore currently calls transaction.getHash() to store previously executed transactions. This function includes the signature in the hash computation, which means that two otherwise identical transactions with distinct signatures would each be allowed to execute.

To fix this we should call transaction.getContentHash() to generate the key used to identify the transaction in TransactionStore

Further reading:
https://github.com/dalek-cryptography/ed25519-dalek#a-note-on-signature-malleability

Handle "429 Too Many Requests" response from ifconfig.co

Currently the node uses ifconfig.co to find its own IP. Unfortunately the IP is sometimes not returned -- we don't handle this failure case... we should ideally:

  1. Use a second or third IP discovery service
  2. Throw a handled exception when we can't determine our own IP.

wlinknet upgrade: CLI wallet (./bin/cli) segfaults randomly. Longstanding issue.

This is a bug report for the wlinknet upgrade branch of the main repo; it occurs on that branch. The issue also existed on the old node (currently the master branch) and has not been fixed in the upgrade yet.

The wlinknet upgrade solves the major issue with the CLI wallet involving remote sends, which is great :) This means people can send using ./bin/cli without running a local server. However, users are still likely to complain about the random segfaults.

No specific input triggers the segfault during ./bin/cli.
To replicate the issue, just run './bin/cli --local' a few times for extended periods of time.

The issue does not prevent sends if you enter your information fast enough.

I think this is worth fixing, as other users have complained about this in the past.

MINER

There are many INVALID_BLOCK_ID messages in the miner. Is this OK?

Server crashes on /block/0

Error message:

[ERROR] 01-14-2022 13:56:30: /block: Invalid block
Error: Returning from a request handler without responding or attaching an abort handler is forbidden!
libc++abi.dylib: terminating
zsh: abort      ./bin/server

The crash occurs because the request handler returns without responding or attaching an abort handler.

Nodes should store last peer list in LevelDB or JSON file

When nodes are rebooted it can take 10-15 minutes to re-discover peers, and during this window miners keep pushing the chain forward on peers the rebooted node has lost track of. Nodes should store their last-seen peers to speed up re-discovery on reboot and prevent chain forks.

git clone

The README lists the wrong git clone URL for the repository.

BMB001 : Extend transaction data type to support 32 byte program ID and 128 bytes of data storage

Several changes in this proposal to support the Rhizome Virtual Blockchain:

  1. The timestamp will now act as a FLAG -- if set to -1 the transaction will be discarded from the main mempool and instead forwarded only to nodes running a particular virtual blockchain.
  2. The TransactionInfo struct will now contain an additional 32-byte program ID (identifying the program the transaction data should be sent to) as well as a 128-byte data field which may be utilized to store arbitrary data. The full transaction data will be forwarded on to the virtual blockchain for processing.
  3. Transaction hashing and data storage will remain the same -- only network transmission of transactions will change. LevelDB will not store the additional programID and data fields.

The newly proposed structure is as follows:

struct TransactionInfo {
    char signature[64];
    char signingKey[32];
    uint64_t timestamp;
    PublicWalletAddress to;
    PublicWalletAddress from;
    TransactionAmount amount;
    TransactionAmount fee;
    bool isTransactionFee;
    char targetProgram[32];
    char data[128];
};

Modifications will need to be made to the serialization and deserialization functions below:

TransactionInfo transactionInfoFromBuffer(const char* buffer);
void transactionInfoToBuffer(TransactionInfo& t, char* buffer);

Furthermore, BlockStore will need to be updated so that it no longer uses these functions when serializing/deserializing transactions, discarding the targetProgram and data fields when writing to LevelDB.

Missing last hash check!

At block ~7400 I discovered inconsistencies in balances between chains. This investigation led to the discovery that chains had discrepancies due to a failure to check the last block hash when accepting blocks. This means that when a chain fork occurs miners may submit blocks from one fork and they will be accepted by the other fork... this is a catastrophic failure yielding divergent chains with inconsistent proof of work.

To fix this I am introducing a new patch that will start checking this last block hash starting at block 8000. The obvious flaw is that the first 8000 blocks could be modified as long as the final hash is equal to the hash of block 7999. To protect against this we are hard-coding the account balances at block 7999 into the ledger state. If a chain does not yield account balances equivalent to the balances at block 7999 it will be rejected.

The security of this construction is equivalent to starting again with a new genesis block.

Make error due to gcc older than 8

Unable to make the 0.7.3 beta with gcc/g++ 7.5. Fixed by manually updating gcc and g++ to version 8 or greater and changing the conan profile compiler version from 7 to 8. Installing build-essential installs only version 7.5, at least on Ubuntu 18.04.

arm build fails with conan

Pre-built dependencies appear to be missing. I'm not familiar with conan and don't know if these can be installed without it:

usockets/0.8.1: Downloaded recipe revision 0
ERROR: Missing binary: openssl/3.0.0:afdeb3df330291e8119ad54fa23cfb6384f756f7
ERROR: Missing binary: usockets/0.8.1:eec902d1897262f597e74a9b7d483fc44117c624
ERROR: Missing binary: zlib/1.2.12:115c8fa606e08a868a5ec8073c26c1a096c7263b
usockets/0.8.1: WARN: Can't find a 'usockets/0.8.1' package for the specified settings, options and dependencies:
- Settings: arch=armv8, build_type=Release, compiler=gcc, compiler.version=11, os=Linux
- Options: eventloop=syscall, fPIC=True, with_libuv=deprecated, with_ssl=False
- Dependencies:
- Requirements:
- Package ID: eec902d1897262f597e74a9b7d483fc44117c624
ERROR: Missing prebuilt package for 'openssl/3.0.0', 'usockets/0.8.1', 'zlib/1.2.12'
Use 'conan search usockets/0.8.1 --table=table.html -r=remote' and open the table.html file to see available packages
Or try to build locally from sources with '--build=openssl --build=usockets --build=zlib'
More Info at 'https://docs.conan.io/en/latest/faq/troubleshooting.html#error-missing-prebuilt-package'
conanfile.txt: Installing package
Requirements
    openssl/3.0.0 from 'conancenter' - Downloaded
    usockets/0.8.1 from 'conancenter' - Downloaded
    uwebsockets/19.3.0 from 'conancenter' - Downloaded
    zlib/1.2.12 from 'conancenter' - Downloaded
Packages
    openssl/3.0.0:afdeb3df330291e8119ad54fa23cfb6384f756f7 - Missing
    usockets/0.8.1:eec902d1897262f597e74a9b7d483fc44117c624 - Missing
    uwebsockets/19.3.0:5ab84d6acfe1f23c4fae0ab88f26e3a396351ac9 - Download
    zlib/1.2.12:115c8fa606e08a868a5ec8073c26c1a096c7263b - Missing
Installing (downloading, building) binaries...
------
executor failed running [/bin/sh -c conan install ..]: exit code: 1

json error on startup

latest ghcr.io/bamboo-crypto/bamboo:06b8c90d734407c3948efdd55fe1ef006397bf64

[STATUS] 08-29-2022 15:41:28: BlockStore does not exist
terminate called after throwing an instance of 'nlohmann::detail::parse_error'
  what():  [json.exception.parse_error.101] parse error at line 1, column 1: syntax error while parsing value - unexpected end of input; expected '[', '{', or a literal

https://github.com/johanaugustsandels/docker-bamboo
