Manon Simonin

Send records in batch to Algolia with free plan

Algolia's free plan is more than enough for basic needs, development testing, or even small websites, internal or not. Just keep in mind that you need to credit Algolia when using it in production.

You get 10,000 search requests every month and can store 1 million records. But there's a limit on the size of a single record that makes it difficult to add those records in the first place: the only real reason to move to the first paid plan is that this limit rises from 10 KB to 100 KB.
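To get a feel for that limit, you can measure a record's serialized size before sending it. This is only a rough sketch (the record shape and the helper name are made up for illustration); Algolia counts the size of the JSON-serialized record:

```javascript
// Free-plan limit on a single record, in KB.
const RECORD_LIMIT_KB = 10;

// Approximate a record's size the way Algolia counts it:
// the byte length of its JSON serialization.
function recordSizeKb(record) {
  return Buffer.byteLength(JSON.stringify(record), 'utf8') / 1024;
}

// A page with ~20,000 characters of content blows past the limit.
const record = { objectID: '/home', title: 'Home', content: 'x'.repeat(20000) };
console.log(recordSizeKb(record) > RECORD_LIMIT_KB); // true
```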

In my case, I wanted to use Algolia as a search engine for Wiki.js. The main problem I faced is that I already had hundreds of pages when I decided to do so, and Wiki.js only supports indexing your entire database at once. Obviously this was too large and resulted in an error.
Once the index is built, every change or addition is sent individually, respecting the size limit for the most part.

I just needed to build an initial index of my existing pages, so I wrote a script that sends the records in batches instead.

All you need is Node.js installed, for instance on macOS:

brew install nvm
echo 'source $(brew --prefix nvm)/nvm.sh' >> ~/.zshrc
mkdir ~/.nvm
nvm install node

Create a directory and add the following packages:
This is designed for a PostgreSQL database, but you could do the same with a MySQL database using mysql2 and a few tweaks.

npm install pg
npm install algoliasearch

Then create a file app.js, and don't forget to replace the Algolia credentials and the database connection settings with your own:

const { Client } = require('pg');
const algoliasearch = require('algoliasearch');

// Database connection settings -- replace with your own.
const dbConfig = {
  user: 'username',
  password: 'password',
  host: 'localhost',
  port: 5432,
  database: 'db-name',
};

// Algolia credentials -- replace with your own.
const algoliaConfig = {
  appId: 'appID',
  apiKey: 'apiKey',
  indexName: 'index-name',
};

const client = new Client(dbConfig);

client.connect()
  .then(async () => {
    // Fetch every page along with its content size in KB.
    const query = `
      SELECT title, path, content, description, LENGTH(content) / 1024 AS size_kb
      FROM pages
    `;

    const result = await client.query(query);

    const algoliaClient = algoliasearch(algoliaConfig.appId, algoliaConfig.apiKey);

    const index = algoliaClient.initIndex(algoliaConfig.indexName);

    const sendRecordToAlgolia = async (record) => {
      try {
        await index.saveObject(record);
        console.log(`Record sent to Algolia with objectID: ${record.objectID}`);
      } catch (error) {
        console.error('Error sending record to Algolia:', error);
      }
    };

    // Skip any page over the 10 KB free-plan record limit.
    for (const row of result.rows) {
      if (row.size_kb < 10) {
        const record = {
          objectID: row.path,
          path: row.path,
          title: row.title,
          content: row.content,
          description: row.description || '',
          locale: 'fr',
        };
        await sendRecordToAlgolia(record);
      }
    }
  })
  .catch((error) => {
    console.error('Error:', error);
  })
  .finally(() => {
    client.end();
  });
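Note that the script above still makes one API call per record with saveObject. The Algolia client also exposes saveObjects, which accepts an array and sends it as a single batch operation; a sketch of chunking the rows first (the 100-record chunk size is an arbitrary choice, not an Algolia requirement):

```javascript
// Split an array into fixed-size chunks so each batch stays small.
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Sketch: one batched request per chunk instead of one per record
// (assumes `index` and a `records` array built as in the script above).
// for (const batch of chunk(records, 100)) {
//   await index.saveObjects(batch);
// }
```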

Run the script:

node app.js

You should then be able to find those records on Algolia. Don't forget to check whether some pages were too large, with a SQL query for example. In my case I only had 2, so I just split them and edited them manually.
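Such a query might look like this (assuming the same `pages` table and columns used in the script above):

```sql
-- Pages whose content alone exceeds the 10 KB free-plan record limit
SELECT title, path, LENGTH(content) / 1024.0 AS size_kb
FROM pages
WHERE LENGTH(content) > 10 * 1024
ORDER BY size_kb DESC;
```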
