Easily work with Firestore batched writes that are bigger than Firestore currently allows.
Read an article explaining how this library works: *The solution to the Firestore batched write limit*
```sh
npm install @qualdesk/firestore-big-batch --save
```

or

```sh
yarn add @qualdesk/firestore-big-batch
```
```typescript
import * as admin from 'firebase-admin'
import { BigBatch } from '@qualdesk/firestore-big-batch'

const fs = admin.firestore()
const batch = new BigBatch({ firestore: fs }) // instead of fs.batch()

const ids = myListOfIdsThatMightGoOver499

ids.forEach((id) => {
  const ref = fs.collection('documents').doc(id)
  batch.set(ref, { published: true }, { merge: true })
})

await batch.commit()
```
If your BigBatch contains more than 499 operations, it is split into multiple Firestore batches behind the scenes. This works for simple use cases, but it does not give you all the benefits of a single Firestore batch: each underlying batch commits atomically on its own, but the writes as a whole are not atomic.
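To make the splitting concrete, here is a minimal sketch of one way it could work (an illustration, not the library's actual source): queued operations are chunked into groups of up to 500, each chunk is replayed onto its own firebase-admin `WriteBatch`, and the batches are committed with `Promise.all()`. The `QueuedOp` type and `commitInChunks` function are hypothetical names.

```typescript
import * as admin from 'firebase-admin'

const MAX_OPS_PER_BATCH = 500 // Firestore's per-batch write limit

// Hypothetical internal shape: each queued operation is a callback that
// replays itself onto a real WriteBatch
type QueuedOp = (batch: admin.firestore.WriteBatch) => void

async function commitInChunks(
  db: admin.firestore.Firestore,
  ops: QueuedOp[]
): Promise<void> {
  const batches: admin.firestore.WriteBatch[] = []

  // Split the queued operations into batches of at most 500 writes each
  for (let i = 0; i < ops.length; i += MAX_OPS_PER_BATCH) {
    const batch = db.batch()
    ops.slice(i, i + MAX_OPS_PER_BATCH).forEach((apply) => apply(batch))
    batches.push(batch)
  }

  // Commit the underlying batches in parallel; each one is atomic on its
  // own, but the group as a whole is not
  await Promise.all(batches.map((b) => b.commit()))
}
```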
- better error handling when batches fail (`Promise.all()` is not that great); see the sketch after this list
- see if we can support `runTransaction`
- write tests!
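As a rough illustration of the error-handling item above, here is a sketch using `Promise.allSettled()` to report which underlying batches failed rather than rejecting at the first error. The `commitAllWithReport` helper and its return shape are hypothetical, not part of the library's API, and `Promise.allSettled()` requires Node 12.9+.

```typescript
import * as admin from 'firebase-admin'

// Hypothetical helper (not part of the library's API): commit every
// underlying batch and report which ones failed, instead of rejecting on
// the first error the way Promise.all() does
async function commitAllWithReport(
  batches: admin.firestore.WriteBatch[]
): Promise<{ failedBatchIndexes: number[]; errors: unknown[] }> {
  const results = await Promise.allSettled(batches.map((b) => b.commit()))

  const failedBatchIndexes: number[] = []
  const errors: unknown[] = []
  results.forEach((result, index) => {
    if (result.status === 'rejected') {
      failedBatchIndexes.push(index) // this batch did not commit
      errors.push(result.reason)
    }
  })

  return { failedBatchIndexes, errors }
}
```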
PRs are welcome, or you can raise an issue.