Mike
Why are IndexedDB operations significantly slower in Chrome vs Firefox?

I was writing a simple key/value promise wrapper around IndexedDB, continuing a project I started a couple of years ago but stopped when LocalForage was released, since that does pretty much the same thing. But while running some benchmarks by Nolan Lawson, I noticed a problem: depending on the operation, Chrome is 2x to 7x slower than Firefox when working with IndexedDB. For a trivial insert (an objectStore put() operation), it's a bit more than 2x slower. Do more than that and it gets significantly worse.
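For context, the kind of wrapper I mean boils down to something like the following sketch (idbSet/idbGet and the kvDB/kv names are made up for this illustration; it isn't my actual project or LocalForage's API):

// Illustrative key/value wrapper: open the database, run one operation in a
// transaction, and resolve with the request's result.
function withStore(mode, callback) {
    return new Promise((resolve, reject) => {
        const open = indexedDB.open('kvDB', 1);
        open.onupgradeneeded = _ => open.result.createObjectStore('kv');
        open.onsuccess = _ => {
            const transaction = open.result.transaction('kv', mode);
            const request = callback(transaction.objectStore('kv'));
            transaction.oncomplete = _ => resolve(request.result);
            transaction.onerror = _ => reject(transaction.error);
        };
        open.onerror = _ => reject(open.error);
    });
}

const idbSet = (key, value) => withStore('readwrite', store => store.put(value, key));
const idbGet = key => withStore('readonly', store => store.get(key));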

Running my test code, Firefox 68 comes in at 170 ms for inserting 10k objects in a single .put()-per-object transaction and 2,800 ms for 10k separate single-.put() transactions. Running the same code in Chrome 76 gives 430 ms and 19,400 ms. Yes, that's roughly 7x slower on Chrome when doing a lot of transactions. You can see the same pattern in Nolan Lawson's database comparison, in the LocalForage and PouchDB (non-WebSQL) tests.

This kind of matters because libraries like LocalForage don't combine many operations into a single transaction, so multiple database put/set operations end up a lot slower on Chrome than on Firefox. But I'm not sure how often you'll be doing tons of inserts.
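For comparison, batching writes into one transaction looks something like the sketch below (batchedPut is a hypothetical helper using the same theStore store as the benchmark; it isn't part of LocalForage):

// Hypothetical helper: write many entries in a single readwrite transaction.
// entries is an array of [key, value] pairs.
function batchedPut(database, entries) {
    return new Promise((resolve, reject) => {
        const transaction = database.transaction(['theStore'], 'readwrite');
        const objStore = transaction.objectStore('theStore');

        for (const [key, value] of entries) {
            objStore.put(value, key); // out-of-line keys, matching the benchmark below
        }

        transaction.oncomplete = _ => resolve();
        transaction.onerror = _ => reject(transaction.error);
    });
}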

Below is some code I wrote that you can paste into your browser's dev tools for a simple benchmark. I tried to isolate the test to inserting 10,000 objects in one transaction vs inserting 10,000 objects in 10,000 transactions, though there are a few other small things going on (e.g. casting a number to a string, a for loop, a function call, an array access, and an array push()).

So... why? Is it actually just that Chrome is much slower with IndexedDB transactions, or is it something else? Is there ever a reason to do thousands of object inserts into a database all at once?

const testRuns = 10000;

runSingleTX(testRuns)
.then(_=>runManyTX(testRuns));

function runSingleTX(runs) {
    return new Promise(async resolve => {

        const database = await init();

        await clear(database);
        const data = generateData(runs);
        const startTime = Date.now(); // benchmark start

        const transaction = database.transaction(['theStore'], 'readwrite');
        const objStore = transaction.objectStore('theStore');

        for (let i = 0; i < runs; i++) {
            objStore.put(data[i], i+'');
        }

        transaction.oncomplete = async _ => {
            const endTime = Date.now();
            console.log(`${runs} objects inserted in a single transaction: ${endTime-startTime} ms`);

            await clear(database);
            resolve();
        };
    });
}

function runManyTX(runs) {
    return new Promise(async resolve => {
        const database = await init();

        await clear(database);
        const data = generateData(runs);
        const startTime = Date.now(); // benchmark start

        const promises = [];

        for (let i = 0; i < runs; i++) {
            promises.push(tx(database, i, data));
        }

        // awaiting these doesn't matter THAT much, since "readwrite" transactions
        // on the same store are serialized and run one after another anyway
        await Promise.all(promises);

        const endTime = Date.now();
        console.log(`${runs} objects inserted one per transaction: ${endTime-startTime} ms`);

        await clear(database);
        resolve();
    });

    // a transaction for a single .put() operation
    function tx(database, i, data) {
        return new Promise(resolve => {
            const transaction = database.transaction(['theStore'], 'readwrite');
            const objStore = transaction.objectStore('theStore');

            objStore.put(data[i], i+'');

            transaction.oncomplete = _ => resolve();
        });  
    }
}

// utility to generate random data outside of benchmarking
function generateData(size) {
    const data = [];
    for (let i = 0; i < size; i++) {
        data.push(Math.random());
    }
    return data;
}

// utility to clear the database of all entries
function clear(database) {
    return new Promise(resolve => {
        const transaction = database.transaction(['theStore'], 'readwrite');
        const objStore = transaction.objectStore('theStore');

        objStore.clear();

        transaction.oncomplete = _ => resolve();
    });
}

// open/create the database
function init() {
    return new Promise((resolve, reject) => {
        let request = indexedDB.open('theDB', 1);

        // create the db the first time
        request.onupgradeneeded = _ => {
            let transaction = request.result.createObjectStore('theStore').transaction;
            transaction.oncomplete = _ => {
                resolve(request.result);
            };
        }
        request.onsuccess = _ => {
            resolve(request.result);
        };
        request.onerror = _ => reject(request.error);
    });
}

Top comments (4)

Guo Yunhe

I want to add something about read/get transaction performance.

If you make a mini demo that only has IndexedDB initialization and access code, it is fast: only 1 ms for a get transaction. Cool, right?

However, if you use IndexedDB in a CPU-heavy React app, it performs much worse. A get transaction in Chrome can take 400~1200 ms, which is even slower than fetching it from our server...

In Firefox this is much faster, but it still slows down when your React app is doing heavy CPU work.

If you want to use IndexedDB as a fast and unlimited cache storage, you might be disappointed. It is persistent, but you cannot expect it to be fast.
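A minimal way to reproduce this kind of measurement, assuming the same theDB/theStore schema as the benchmark above (timedGet is just an illustrative helper; the numbers depend heavily on how busy the main thread is):

// Illustrative helper: time a single readonly get transaction.
function timedGet(database, key) {
    return new Promise((resolve, reject) => {
        const start = performance.now();
        const transaction = database.transaction(['theStore'], 'readonly');
        const request = transaction.objectStore('theStore').get(key);

        request.onsuccess = _ => {
            console.log(`get("${key}") took ${performance.now() - start} ms`);
            resolve(request.result);
        };
        request.onerror = _ => reject(request.error);
    });
}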

fotang

That is curious. I can confirm the figures for the single transaction (FF 68: 225 ms, Chrome 76: 444 ms). For the multiple transactions, FF took 2,184 ms, while Chrome took 390,194 ms! That is about 178x slower.

Mike

Wow! That's crazy!

What is going on?!

Mike

As a random follow-up, the same code runs in Edge 44.18362.267.0 in 1,020 ms and 7,100 ms on the same machine as the tests in the article.