Firebase Admin SDK Bulk Import Example

By Ron Royston

June 2018

This article demonstrates how to use the Firebase Admin SDK to import a very large dataset into Cloud Firestore from a CSV file.


The Firestore Usage and limits documentation indicates a couple of key constraints. First, batched writes are limited to 500 documents per batch, and sustained writes should be kept to about one batch per second. Second, memory: one of my recordsets weighed in at 1.5 million rows, and Node.js fails if your script runs with too little memory:

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
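To see why both limits matter together, here is a quick back-of-the-envelope calculation for the 1.5-million-row dataset mentioned above:

```javascript
// Sizing the import: how many batches, and how long at one commit per second.
const rows = 1500000;           // total rows in the dataset
const batchSize = 500;          // Firestore's per-batch write limit
const batches = Math.ceil(rows / batchSize);
const minutes = batches / 60;   // at one batch commit per second

console.log(batches + " batches, roughly " + minutes + " minutes");
```

At one batch per second, 3,000 batches take about 50 minutes, which is why the import script below paces its commits rather than firing them all at once.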

Setup and Installation

Follow the Add the Firebase Admin SDK to Your Server instructions provided by Firebase to get Node.js and the Firebase Admin SDK installed on your machine.

This article assumes a large .csv file as the native dataset format so the csv-parse module is used to read this data. Install this module now.

npm install csv-parse

Running Node with Additional Memory

Experiment with a sample size before running your script against the entire dataset. You may or may not need to tell Node to use additional memory, and even if you do, you may still have to split the dataset into parts if it is too large. The following command raises the memory available to the Node process to roughly 8 GB:

node --max_old_space_size=8000 app.js

Node Script

The script below streams the CSV file row by row, adding a document to the current batch for each row. A counter starts a new batch every 500 documents, and async/await is used to rate-limit the commits to one per second. Finally, write progress is reported with console logging.

var admin = require("firebase-admin");
var serviceAccount = require("./your-firebase-project-service-account-key.json");
var fs = require('fs');
var csvFile = "./my-huge-file.csv";
var parse = require('csv-parse');

admin.initializeApp({
	credential: admin.credential.cert(serviceAccount),
	databaseURL: ""
});

var firestore = admin.firestore();
var thisRef;
var counter = 0;
var commitCounter = 0;
var batches = [];
batches[commitCounter] = firestore.batch();

// Stream the file through the parser so the whole CSV is never held in memory at once.
fs.createReadStream(csvFile)
	.pipe(parse({delimiter: '|', relax_column_count: true, quote: ''}))
	.on('data', function (csvrow) {
		// Roll over to a new batch before exceeding the 500-document limit.
		if (counter > 498) {
			counter = 0;
			commitCounter = commitCounter + 1;
			batches[commitCounter] = firestore.batch();
		}
		var obj = {}; // fresh object per row so earlier batches are not mutated
		if (csvrow[1]) {
			obj.name = csvrow[1]; // "name" is a placeholder; the property name was lost from the source
			obj.series = csvrow[2];
			obj.sku = csvrow[3];
			obj.description = csvrow[4];
			obj.price = csvrow[6];
		}
		thisRef = firestore.collection("your-collection-name").doc();
		batches[commitCounter].set(thisRef, obj);
		counter = counter + 1;
	})
	.on('end', function () {
		writeToDb(batches);
	});

// Resolves after just over one second, pacing the batch commits.
function oneSecond() {
	return new Promise(resolve => {
		setTimeout(() => {
			resolve();
		}, 1010);
	});
}

async function writeToDb(arr) {
	console.log("beginning write");
	for (let i = 0; i < arr.length; i++) {
		await oneSecond();
		arr[i].commit().then(function () {
			console.log("wrote batch " + i);
		});
	}
}

One thing to keep in mind is the tighter quotas on non-billable (Spark plan) Firebase projects. If you experience exceeded-quota errors, this may be the cause. Finally, this script assumes a CSV-formatted source file; if you are starting from a spreadsheet or any other format, alter the script to suit your needs.
