
firestore-backfire's People

Contributors

benyap, dependabot[bot], renovate[bot]

Forkers

mkozjak

firestore-backfire's Issues

Dependency Dashboard

This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.

Rate-Limited

These updates are currently rate-limited. Click on a checkbox below to force their creation now.

  • chore(deps): update dependency resolve-tspaths to v0.8.18
  • chore(deps): update dependency rimraf to v5.0.5
  • fix(deps): update dependency regenerator-runtime to v0.14.1
  • chore(deps): update commitlint monorepo to v17.8.1 (@commitlint/cli, @commitlint/config-conventional)
  • chore(deps): update dependency @faker-js/faker to v8.4.1
  • chore(deps): update dependency @google-cloud/firestore to v6.8.0
  • chore(deps): update dependency @google-cloud/storage to v7.9.0
  • chore(deps): update dependency release-it to v16.3.0
  • fix(deps): update dependency commander to v11.1.0
  • chore(deps): update actions/cache action to v4
  • chore(deps): update actions/checkout action to v4
  • chore(deps): update actions/github-script action to v7
  • chore(deps): update actions/setup-node action to v4
  • chore(deps): update actions/stale action to v9
  • chore(deps): update commitlint monorepo to v19 (major) (@commitlint/cli, @commitlint/config-conventional)
  • chore(deps): update dependency @google-cloud/firestore to v7
  • chore(deps): update dependency @release-it/bumper to v6
  • chore(deps): update dependency @release-it/conventional-changelog to v8
  • chore(deps): update dependency firebase-tools to v13
  • chore(deps): update dependency husky to v9
  • chore(deps): update dependency release-it to v17
  • chore(deps): update dependency vitest to v1
  • chore(deps): update pnpm/action-setup action to v3
  • fix(deps): update dependency commander to v12
  • fix(deps): update dependency cosmiconfig to v9
  • Create all rate-limited PRs at once

Open

These updates have all been created already. Click a checkbox below to force a retry/rebase of any.

Detected dependencies

github-actions
.github/workflows/publish.yml
  • actions/checkout v3
  • actions/setup-node v3
  • pnpm/action-setup v2.4.0
  • actions/cache v3
  • actions/github-script v6
.github/workflows/stale.yml
  • actions/stale v8
npm
package.json
  • ansi-colors 4.1.3
  • commander 11.0.0
  • cosmiconfig 8.2.0
  • exceptional-errors 0.4.4
  • regenerator-runtime 0.14.0
  • @aws-sdk/client-s3 3.388.0
  • @aws-sdk/credential-provider-ini 3.388.0
  • @aws-sdk/types 3.387.0
  • @commitlint/cli 17.7.1
  • @commitlint/config-conventional 17.7.0
  • @faker-js/faker 8.0.2
  • @google-cloud/firestore 6.7.0
  • @google-cloud/storage 7.0.1
  • @release-it/bumper 5.1.0
  • @release-it/conventional-changelog 7.0.0
  • @swc/cli 0.1.62
  • @swc/core 1.3.76
  • @types/node 20.4.10
  • @vitest/coverage-c8 0.33.0
  • concurrently 8.2.0
  • firebase-tools 12.4.7
  • husky 8.0.3
  • prettier 3.0.1
  • release-it 16.1.5
  • resolve-tspaths 0.8.15
  • rimraf 5.0.1
  • typescript 5.1.6
  • vite 4.4.9
  • vitest 0.34.1
  • @aws-sdk/client-s3 >=3.0.0
  • @aws-sdk/credential-provider-ini >=3.0.0
  • @google-cloud/firestore >=6.0.0
  • @google-cloud/storage >=6.0.0
  • node >=12

  • Check this box to trigger a request for Renovate to run again on this repository

Where is the exported Firestore data located by default on Windows?

I installed the package globally on Windows 11

npm install -g firestore-backfire @google-cloud/firestore

I didn't notice that I had to use a full path, so I used a relative path instead:

backfire export . -p database -k credentials.json

and

backfire export data -p database -k credentials.json

Where is the exported data located by default, so that I can delete it?
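
For context, a relative path passed on the command line is normally resolved against the current working directory, so the export lands wherever the command was run from. A minimal illustration (the helper name is hypothetical, not part of firestore-backfire):

```javascript
import * as path from "path";

// A relative path such as "." or "data" resolves against process.cwd(),
// so `backfire export data ...` writes to <cwd>/data (<cwd>\data on Windows).
function resolveExportPath(relativePath) {
  return path.resolve(relativePath);
}

console.log(resolveExportPath("."));    // the directory the command was run from
console.log(resolveExportPath("data")); // a "data" entry inside that directory
```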

Import stops at a file

I tried importing data back into a test Firestore project to try out importing with this library. The import was taking more than an hour, so I stopped it. I tried again several times with the same result. How long should an 88 MB file take to import? I used this:

backfire import hm.ndjson --paths stories -K testCredentials.json -P test-hm-bq0gl2 --verbose true

When I look at the logs, it stops at a file. I stopped the import and restarted it. It didn't overwrite the documents that were already imported, and it continued importing new documents, but then it stopped again:

Screenshot 2023-05-22 at 10 32 40 AM

I deleted the collection from the project and tried again from scratch with the same results.

When I deleted the collection from Firestore, it deleted a little more than 9000 documents. I think the data I was trying to import has more than 10k documents.

Screenshot 2023-05-22 at 10 43 48 AM

Here are the Firestore rules (just in case):

rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /{document=**} {
      allow read, write;
    }
  }
}

Fails to work on GitHub Actions

Hi, I'm trying to automatically export a db and make it available publicly, so I tried to add this to GitHub Actions, but it failed with the error below.

npm ERR! could not determine executable to run

This is what I used; any reason why it shouldn't work?

name: Backup Firestore
on: [push]
jobs:
  backup:
    runs-on: ubuntu-latest
    steps:
      - name: Create secret files
        run: |
          echo ${{ secrets.GCP_SA_KEY }} | base64 -d > gcp.json
          echo ${{ secrets.FB_SA_KEY }} | base64 -d > fb.json

      - name: Backup firestore db in json format
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
        run: npx backfire export gs://nostrdb-backups --gcpProject nostrdirectory --gcpKeyFile gcp.json --paths twitter --project nostrdirectory --keyFile fb.json
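
For context, `npx backfire` fails with "could not determine executable to run" when the package cannot be resolved inside the job. A likely fix, sketched here using the install command quoted elsewhere on this page (the action versions and Node version are assumptions), is to check out the repository, set up Node, and install the CLI before the export step:

```yaml
steps:
  - uses: actions/checkout@v3
  - uses: actions/setup-node@v3
    with:
      node-version: 18 # assumed; any Node version supported by the CLI works
  # Install the CLI and its peer dependency so `npx backfire` can resolve it
  - run: npm install -g firestore-backfire @google-cloud/firestore
```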

Permission denied

  1. I installed the library globally using:
    npm install -g firestore-backfire @google-cloud/firestore

  2. I tried exporting a collection using:
    backfire export ./hm -P HistoryMaps -K credentials.json --paths stories

note:

  • credentials.json is a credentials file for the project that I downloaded a while back. Do I need to re-issue a new one?
  • projectName: HistoryMaps
  • collection: stories

I got a permission-denied error with the following message:

code: 7,
  details: 'Permission denied on resource project HistoryMaps.',
  metadata: Metadata {
    internalRepr: Map(3) {
      'google.rpc.errorinfo-bin' => [Array],
      'google.rpc.help-bin' => [Array],
      'grpc-status-details-bin' => [Array]
    },
    options: {}
  },
  statusDetails: [
    Help { links: [Array] },
    ErrorInfo {
      metadata: [Object],
      reason: 'CONSUMER_INVALID',
      domain: 'googleapis.com'
    }
  ],
  reason: 'CONSUMER_INVALID',
  domain: 'googleapis.com',
  errorInfoMetadata: {
    service: 'firestore.googleapis.com',
    consumer: 'projects/HistoryMaps'
  }
}

Is it possible to extend a DatasourceWriter or manually select one?

Hey! First of all thanks for this lib.

I'm trying to upload my backup to DigitalOcean Spaces, which claims to be compatible with AWS S3. I was wondering whether the S3 writer could perform the operations against a DigitalOcean address, but it seems the matcher defaults to the local writer. If it's not possible to simply reuse the S3 writer, is there a way to extend it?

Ideas to reduce export time

Problem statement

Exporting large collections and deeply nested documents can be quite slow, as all paths are explored. Though this is good in some cases, it can be wasteful (lots of Firestore reads) and time-consuming if you know you only want to export one or a few particular documents.

Enhancement

Add options to export exact document or collection paths. This would help restrict how deep or wide the program will "explore" to look for documents. It must be different from the patterns option, as that option filters the paths it is given rather than deciding which paths to explore more deeply.
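
To illustrate the distinction this enhancement draws (function names are hypothetical, not the library's actual API): an exact-path option can prune traversal before any reads happen, while a pattern only filters paths that have already been explored:

```javascript
// Hypothetical sketch: explore a collection only if it could contain (or sit
// under) one of the requested paths, instead of filtering after reading.
function shouldExplore(currentPath, requestedPaths) {
  return requestedPaths.some(
    (wanted) =>
      wanted === currentPath ||
      wanted.startsWith(currentPath + "/") || // a requested path lies deeper
      currentPath.startsWith(wanted + "/")    // we are inside a requested path
  );
}

console.log(shouldExplore("stories", ["stories/abc"]));          // true
console.log(shouldExplore("users", ["stories/abc"]));            // false: subtree never read
console.log(shouldExplore("stories/abc/meta", ["stories/abc"])); // true
```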

Error: 9 FAILED_PRECONDITION: The requested snapshot version is too old

I get the following error while exporting data from a Firestore collection:

backfire 1:58:02 PM error   Error: 9 FAILED_PRECONDITION: The requested snapshot version is too old.
    at callErrorFromStatus (/usr/local/lib/node_modules/@google-cloud/firestore/node_modules/@grpc/grpc-js/build/src/call.js:31:19)
    at Object.onReceiveStatus (/usr/local/lib/node_modules/@google-cloud/firestore/node_modules/@grpc/grpc-js/build/src/client.js:351:73)
    at Object.onReceiveStatus (/usr/local/lib/node_modules/@google-cloud/firestore/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:323:181)
    at /usr/local/lib/node_modules/@google-cloud/firestore/node_modules/@grpc/grpc-js/build/src/resolving-call.js:94:78
    at processTicksAndRejections (node:internal/process/task_queues:78:11)
for call at
    at ServiceClientImpl.makeServerStreamRequest (/usr/local/lib/node_modules/@google-cloud/firestore/node_modules/@grpc/grpc-js/build/src/client.js:334:34)
    at ServiceClientImpl.<anonymous> (/usr/local/lib/node_modules/@google-cloud/firestore/node_modules/@grpc/grpc-js/build/src/make-client.js:105:19)
    at /usr/local/lib/node_modules/@google-cloud/firestore/build/src/v1/firestore_client.js:225:29
    at /usr/local/lib/node_modules/@google-cloud/firestore/node_modules/google-gax/build/src/streamingCalls/streamingApiCaller.js:38:28
    at /usr/local/lib/node_modules/@google-cloud/firestore/node_modules/google-gax/build/src/normalCalls/timeout.js:44:16
    at Object.request (/usr/local/lib/node_modules/@google-cloud/firestore/node_modules/google-gax/build/src/streamingCalls/streaming.js:130:40)
    at makeRequest (/usr/local/lib/node_modules/@google-cloud/firestore/node_modules/retry-request/index.js:141:28)
    at retryRequest (/usr/local/lib/node_modules/@google-cloud/firestore/node_modules/retry-request/index.js:109:5)
    at StreamProxy.setStream (/usr/local/lib/node_modules/@google-cloud/firestore/node_modules/google-gax/build/src/streamingCalls/streaming.js:121:37)
    at StreamingApiCaller.call (/usr/local/lib/node_modules/@google-cloud/firestore/node_modules/google-gax/build/src/streamingCalls/streamingApiCaller.js:54:16) {
  code: 9,
  details: 'The requested snapshot version is too old.',
  metadata: Metadata { internalRepr: Map(0) {}, options: {} }

How can I solve this issue?

I have ~1,700,000 documents in the collection. The error occurs after exporting ~1,200,000 documents.

TypeError: _this.writer.open is not a function

Hi,

Thanks for the great library,

I get the error in the title while trying to implement it; here's our code:

const result = await exportFirestoreData(
  { project: firebaseProjectID, credentials },
  {
    path: "gs://...",
    gcpProject: "...",
    gcpCredentials: { type: "service_account", ... },
  }
);

Thanks,

Théo

Slow document downloads

Hi benyap,

Firstly, I would like to thank you for version 2.x; it is so much better. In 1.x I was having problems connecting to the Firebase Emulator (sometimes it connected, sometimes it did not), but now it works fine.

Now I am finding it slow to download the documents. When using the Firebase Emulator it downloads about 10 documents, and against production about 40 documents per 5 seconds. I do not know whether the slowness comes from downloading the documents or from saving them to JSON. Do you think there is a way to make it faster? I ask because my data is big.

Here is part of the logs:

export info Export data from demo-backup to backfile.ndjson
export verbose {
options: {
paths: [ 'xxxx/xxxx/xxxx/xxxx' ],
exploreInterval: 10,
exploreChunkSize: 5000,
downloadChunkSize: 5000,
downloadInterval: 10,
overwrite: true,
debug: true,
verbose: true
},
connection: { project: 'demo-backup', emulator: 'localhost:8180' }
}
export debug Overwriting existing data at backfile.ndjson
export debug Progress: 0 exported, 0 exploring, 1 to explore
export verbose Exploring for subcollections in 1 document
export verbose Exporting 1 document
export debug Progress: 0 exported, 1 exploring, 0 to explore
export verbose Found 3 subcollections in xxxx/xxxx/xxxx/xxxx
export verbose Exploring for documents in 3 collections
export verbose Found 1 document in xxxx/xxxx/xxxx/xxxx/apps
export verbose Exporting 1 document
export debug Progress: 2 exported, 2 exploring, 1 to explore
export verbose Found 10163 documents in xxxx/xxxx/xxxx/xxxx/xxxx
export verbose Exporting 5000 documents
export verbose Found 24 documents in xxxx/xxxx/xxxx/xxxx/xxxx
export verbose Exploring for subcollections in 5000 documents
export verbose Exporting 5000 documents
export debug Progress: 5002 exported, 5000 exploring, 5188 to explore
export verbose Found 2 subcollections in xxxx/xxxx/xxxx/xxxx/xxxx/xxxx
export verbose Exporting 187 documents
export debug Progress: 10189 exported, 4990 exploring, 5190 to explore
export debug Progress: 10189 exported, 4980 exploring, 5190 to explore
export debug Progress: 10189 exported, 4975 exploring, 5190 to explore
export debug Progress: 10189 exported, 4966 exploring, 5190 to explore
export debug Progress: 10189 exported, 4955 exploring, 5190 to explore
export debug Progress: 10189 exported, 4953 exploring, 5190 to explore
export debug Progress: 10189 exported, 4942 exploring, 5190 to explore
export debug Progress: 10189 exported, 4932 exploring, 5190 to explore
export debug Progress: 10189 exported, 4929 exploring, 5190 to explore
export debug Progress: 10189 exported, 4918 exploring, 5190 to explore
export debug Progress: 10189 exported, 4909 exploring, 5190 to explore
export debug Progress: 10189 exported, 4905 exploring, 5190 to explore
export debug Progress: 10189 exported, 4894 exploring, 5190 to explore
export debug Progress: 10189 exported, 4885 exploring, 5190 to explore
export debug Progress: 10189 exported, 4883 exploring, 5190 to explore
export debug Progress: 10189 exported, 4873 exploring, 5190 to explore
export debug Progress: 10189 exported, 4864 exploring, 5190 to explore
export debug Progress: 10189 exported, 4861 exploring, 5190 to explore
export debug Progress: 10189 exported, 4851 exploring, 5190 to explore
export debug Progress: 10189 exported, 4841 exploring, 5190 to explore
export debug Progress: 10189 exported, 4839 exploring, 5190 to explore
export debug Progress: 10189 exported, 4831 exploring, 5190 to explore
export debug Progress: 10189 exported, 4821 exploring, 5190 to explore
export debug Progress: 10189 exported, 4815 exploring, 5190 to explore
export debug Progress: 10189 exported, 4809 exploring, 5190 to explore
export debug Progress: 10189 exported, 4799 exploring, 5190 to explore
export debug Progress: 10189 exported, 4792 exploring, 5190 to explore
export debug Progress: 10189 exported, 4788 exploring, 5190 to explore
export debug Progress: 10189 exported, 4778 exploring, 5190 to explore
export debug Progress: 10189 exported, 4769 exploring, 5190 to explore
export debug Progress: 10189 exported, 4766 exploring, 5190 to explore
export debug Progress: 10189 exported, 4756 exploring, 5190 to explore
export debug Progress: 10189 exported, 4747 exploring, 5190 to explore
export debug Progress: 10189 exported, 4744 exploring, 5190 to explore

SyntaxError: Unexpected non-whitespace character after JSON at position 20441

I exported my Firestore collection (88 MB). Since I'm not familiar with the data structure of this export, I used JSON.parse() to inspect it.

import * as fs from "fs"
const filePath = JSON.parse(fs.readFileSync("../../data/hm.ndjson"))

const readStream = fs.createReadStream(filePath, { encoding: "utf8" })

readStream.on("data", (chunk) => {
  const lines = chunk.split("\n")
  lines.forEach((line) => {
    if (line.trim() !== "") {
      try {
        const jsonObject = JSON.parse(line)
        // Analyze the structure of `jsonObject` here
        console.log(jsonObject)
      } catch (error) {
        console.error("Error parsing line:", line)
      }
    }
  })
})

readStream.on("end", () => {
  console.log("Finished reading the NDJSON file.")
})

readStream.on("error", (error) => {
  console.error("Error reading the file:", error)
})

It gave this error:
SyntaxError: Unexpected non-whitespace character after JSON at position 20441
at JSON.parse ()
at file:///Users/nonoumasy/Downloads/node/transform/findJSONStructure.js:2:23
at ModuleJob.run (node:internal/modules/esm/module_job:194:25)

I looked into the issue, and it seems to have a problem with the _nanoseconds property. I'm not sure why 🤷‍♂️.

Also, VS Code was unable to prettify the NDJSON file because the file size was too big; any suggestions on how to view it?
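
Since NDJSON stores one JSON document per line, the file as a whole is not valid JSON, so calling JSON.parse() on the entire contents fails as soon as the second record begins. A minimal sketch of parsing it line by line instead (nothing is assumed about the fields inside each record):

```javascript
import * as fs from "fs";
import * as readline from "readline";

// Parse NDJSON text: one JSON document per line, blank lines ignored.
function parseNdjsonLines(text) {
  return text
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line));
}

// Streaming variant for large files: readline emits complete lines, so a
// record split across stream chunks is never half-parsed.
function readNdjsonFile(filePath, onRecord) {
  const rl = readline.createInterface({
    input: fs.createReadStream(filePath, { encoding: "utf8" }),
    crlfDelay: Infinity,
  });
  rl.on("line", (line) => {
    if (line.trim() !== "") onRecord(JSON.parse(line));
  });
  return new Promise((resolve) => rl.on("close", resolve));
}

console.log(parseNdjsonLines('{"a":1}\n{"b":2}\n').length); // 2
```

Note that splitting raw stream chunks on "\n" (as in the snippet above) can also cut a record in half when a chunk boundary falls mid-line, which the readline approach avoids.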

Document how to use dataSourceFactory.createReader() and dataSourceFactory.createWriter() - example code is incorrect

I'd like to use your API with local files, but your documentation does not explain how to do this.
It's unclear from your code what is required for DefaultOptions & T.

Note that my paths are computed, so TypeScript can't see the ID prefix, in case that's used.

  const writer = await dataSourceFactory.createWriter(exportFilePath, ???)

  const reader = await dataSourceFactory.createReader(importFilePath, ???)

Same issue for the other pre-registered readers and writers.

Import not working as expected

The following import results in an "exportAction is not a function" error.

import { exportAction } from "@benyap/backfire";

The current workaround is to import directly from the dist directory, like so:

import { exportAction } from "@benyap/backfire/dist";
