Cameron Archer for Tinybird

Originally published at tinybird.co

Build a real-time dashboard with Next.js, Tremor, and Tinybird

Can you imagine shipping a new user-facing dashboard only to have your users met with a visualization that takes several seconds or even minutes to load? No way, right? Your users would be frustrated by missed opportunities, lost efficiency, and decisions delayed by stale data and a poor user experience.

Sadly, this is the status quo for many who build dashboards into their products. If you don’t know how to build real-time data architectures, you’ll be stuck with inefficient, legacy business intelligence platforms that can’t keep pace with user-facing features.

The contrast between this scenario and today's real-time analytics landscape couldn't be starker, underlining just how vital it is to give your users immediate access to data analytics.

A "Hide the Pain, Harold" meme with the text "My boss looking at my real-time dashboard instead of a spreadsheet."
Release the pain, Harold.

In this post, we're going to build real-time dashboards from scratch within a user-facing application. And we're not doing it the old-fashioned way, oh no! We're using some of my personal favorite tech out there: Tinybird, Tremor, and Next.js.

This post will walk through all of the steps to building a real-time dashboard using just these 3 tools. You can follow along here without any prior knowledge or resources, or if you’d like to work with and augment an existing project, you can clone this GitHub repo (which is the culmination of this guide, and then some).

And by the way, you can do all of this for free. 🤑

Before we jump in, let’s talk about what we mean by a “real-time dashboard” and why most dashboards aren’t “real-time.” If you’re just here for the tutorial, you can skip ahead.

What is a real-time dashboard?

A real-time dashboard is an interactive real-time data visualization that displays continually updated metrics. It incorporates data that is just seconds old, refreshes almost instantaneously, and can support many concurrent viewers at once. Unlike traditional business intelligence dashboards that update on a periodic or batch basis, real-time dashboards pull in data as it is created, processed, or changed, providing an up-to-the-second snapshot of a system or process.

A real-time dashboard should serve fresh data quickly to many concurrent users.

The primary components of a real-time dashboard include:

  1. Data Sources: The real-time feeds of information from various systems, services, and devices. Examples include sensor readings, user activity on a website, sales transactions, or social media interactions.
  2. Data Processing Engine: The system that aggregates, filters, and transforms the raw data from various sources into a format that can be consumed by a frontend application.
  3. Visualization Layer: The frontend app that brings the data to life through graphs, charts, maps, and other visuals. In a real-time dashboard, these visual components update near-instantaneously, reflecting the most current state of the data sources.
  4. Interactive Controls: Components by which users can interact with real-time dashboards, such as by adjusting filters, drilling down into detailed views, or setting alerts for specific conditions. These features empower users to explore data in more depth and respond to changes more swiftly.

Why are most dashboards slow?

Let’s be honest, most dashboards aren’t of the “real-time” variety. But why?

A meme with a skeleton next to the text "Me, waiting for the data to be extracted, transformed, and loaded."

The problem is the underlying architecture and the manner in which the data is handled. Here are the main reasons that dashboards are slow:

  1. Batch ETL (Extract, Transform, Load) Processes: Many dashboards rely on batch ETL processes that collect, transform, and load data at specific intervals. These time-bound processes result in data that isn’t fresh. It doesn’t matter if a dashboard can refresh in 50 milliseconds if the data it’s showing is hours or days old.
  2. Complex Business Intelligence (BI) Tools: BI tools were designed for a small handful of users to run and visualize complex analytical queries over a database or data warehouse. While they are powerful for internal reporting and dashboarding, they tend to be slow. They’re not optimized for user-facing applications and often struggle with high query latencies and minimal user concurrency.
  3. Poorly Configured Data Stack: Most databases, data warehouses, and data processing layers aren’t optimized for real-time analytics. Things like inefficient indexing, row-based storage, improper data partitioning, and lack of in-memory processing can all cause bottlenecks in the data flow.
  4. Poorly Designed Queries: Inefficient or poorly constructed queries can significantly slow down data retrieval. Bad indexing, heavy joins, and full table scans all contribute to slow dashboards.
  5. Lack of Scalability: Real-time dashboards need to be able to scale with big data and with many concurrent users. A modern real-time dashboard must be built with scalability in mind to ensure that performance does not degrade as demand grows.

Most dashboards are slow because the underlying data pipelines are slow.

To build real-time dashboards, you need a real-time streaming data architecture. For more information on building such an architecture, read this post.

Tutorial: Building a real-time data analytics dashboard

Okay, so what are we building? Imagine you’re a DocuSign competitor. You’re building a SaaS to disrupt the document signature space, and as a part of that, you want to give your users a real-time data analytics dashboard so they can monitor how, when, where, and what is happening with their documents in real time.

Let’s build that dashboard.

To do so, we'll be using:

  • Tinybird for real-time data ingestion, real-time data processing, and real-time APIs.
  • Tremor components for the data visualization. It turns those numbers and statistics into something beautiful.
  • Next.js as a fully-featured React framework. It ensures everything looks slick and runs smoothly.

The Tech Stack

Here's the flow of what we’re building today:

The basic architecture used in this tutorial to build a real-time dashboard
In this tutorial, we'll use Tinybird to capture event streams, process them with SQL, and expose the transformations as real-time APIs. Then we'll use Tremor components in a Next.js app to build a beautiful, responsive, real-time dashboard.

  1. Events (like a document being sent, signed, or received) will be sent to the Tinybird Events API, an HTTP streaming endpoint that captures events and writes them to a columnar datastore optimized for real-time analytics.
  2. Tinybird is a real-time data platform that we can use to build real-time metrics with SQL and instantly publish them as APIs.
  3. Tremor will then poll the API endpoints we publish in Tinybird and render the real-time metrics as beautiful charts.
  4. Next.js is the React framework we’ll use to build and serve the dashboard app.

How to build a real-time dashboard from scratch

To build a real-time dashboard from scratch, you’ll follow these steps:

Step 0: Install Prerequisites

Before you get started, you’ll need to have the following prerequisites installed:

  • Node.js (version 18 or above)
  • Python (version 3.8 or above)

For information on installing those, check out their docs: Node.js and Python.

Initialize your Next.js project

Once you have those installed, create a new Next.js app. In this demo, I’ll be using plain JavaScript files (no TypeScript) and Tailwind CSS.

npx create-next-app signatures-dashboard --js --tailwind --eslint --src-dir

When prompted, select “Yes” for App Router and "No" for customizing the default import alias.

Next, create some folders in your Next.js project. We’ll use these for the Tinybird resources we create later.

cd signatures-dashboard
mkdir data-project data-project/utils

Create a Tinybird Account and Workspace

Tinybird is the real-time data platform that underpins our real-time dashboard. If you’re new to Tinybird, create a free account here. After you’ve created an account, you’ll be prompted to create a Workspace. Go ahead and do that. You can choose the region in which you’d like to host your Workspace, and I recommend you choose the one that’s geographically closest to you and your users. I’ve created mine in the EU region and named it signatures_dashboard.

A screenshot showing Workspace creation in Tinybird

Install the Tinybird CLI

The Tinybird CLI is a command-line tool that allows you to interact with Tinybird’s API. You will use it to create and manage the data project resources that underpin your real-time dashboard.

To install the Tinybird CLI, run the following commands:

python3 -m venv .e
. .e/bin/activate
pip install tinybird-cli
tb auth --interactive

Choose the region in which you created your Workspace. You’ll then be prompted for your Admin token. Go to https://ui.tinybird.co/tokens (or https://ui.us-east.tinybird.co/tokens for US-East regions) and copy the token with admin rights. Paste it into the CLI and press enter.

A screenshot showing auth token management in Tinybird

You’ll be authenticated to your workspace, and your auth details will be saved in a .tinyb file in the current working directory. Make sure that the .tinyb file is not publicly exposed by adding it to your .gitignore file:

echo ".tinyb" >> .gitignore

Step 1: Create a mock data stream

If you’re building a real-time data dashboard using existing data streams, then you won’t need to follow this step. But since I’m building a dashboard for a hypothetical document signature SaaS, I’ll need some mock data to work with!

To get that mock data, I used the JavaScript Faker library (the same library that powers Mockingbird, the free, open-source mock data stream generator by Tinybird).

We’re going to use the mockDataGenerator.js script in the linked repository to create some mock account and signatures data and send it to Tinybird.

But before we do, let’s peek into this code to get a feel for what it’s doing.

The mockDataGenerator.js script generates mock user accounts, with fields like account_id, organization, phone_number, and various certification statuses related to the account’s means of identification:

const generateAccountPayload = () => {
    const status = ["active", "inactive", "pending"];
    const id = faker.number.int({ min: 10000, max: 99999 });
    account_id_list.push(id);

    return {
        account_id: id,
        organization: faker.company.name(),
        status: status[faker.number.int({ min: 0, max: 2 })],
        role: faker.person.jobTitle(),
        certified_SMS: faker.datatype.boolean(),
        phone: faker.phone.number(),
        email: faker.internet.email(),
        person: faker.person.fullName(),
        certified_email: faker.datatype.boolean(),
        photo_id_certified: faker.datatype.boolean(),
        created_on: (faker.date.between({ from: '2020-01-01', to: '2023-12-31' })).toISOString().substring(0, 10),
        timestamp: Date.now(),
    }
}

In addition, the code generates mock data events about the document signature process, with variable status values such as in_queue, signing, expired, and error, amongst others:

const generateSignaturePayload = (account_id, status, signatureType, signature_id, since, until, created_on) => {
    return {
        signature_id,
        account_id,
        status,
        signatureType,
        since: since.toISOString().substring(0, 10),
        until: until.toISOString().substring(0, 10),
        created_on: created_on.toISOString().substring(0, 10),
        timestamp: Date.now(),
        uuid: faker.string.uuid(),
    }
}

Finally, the generator will create and send a final status for the signature using some weighted values:

const finalStatus = faker.helpers.weightedArrayElement([
        { weight: 7.5, value: 'completed' },
        { weight: 1, value: 'expired' },
        { weight: 0.5, value: 'canceled' },
        { weight: 0.5, value: 'declined' },
        { weight: 0.5, value: 'error' },
]) // 7.5/10 chance of being completed, 1/10 chance of being expired, 0.5/10 chance of being canceled, declined or error

Now download the mockDataGenerator.js file from the repository or copy the code into a new file called mockDataGenerator.js and place it into the data-project directory.

cd data-project
ls
# mockDataGenerator.js      utils

You may have noticed that this script utilizes a couple of helper functions to access your Tinybird token and send the data to Tinybird with an HTTP request using the Tinybird Events API. These helper functions are located in the tinybird.js file in the repo. Download that and add it to the data-project/utils directory.

Below is the code for the helper function that sends data to Tinybird via the Events API.

export async function send_data_to_tinybird(name, token, payload) {
    const events_url = "https://api.tinybird.co/v0/events?name=";

    return fetch(events_url + name, {
        method: "POST",
        body: JSON.stringify(payload),
        headers: {
            Authorization: `Bearer ${token}`,
        },
    })
    .then((res) => res.json())
    .catch((error) => console.log(error));
}
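For a sense of how this helper gets used, here’s a minimal sketch of sending one mock account event to the accounts Data Source. The exact wiring in mockDataGenerator.js may differ slightly; the token variable and import path here are assumptions based on the structure described above.

// Sketch: send a single mock account event to the "accounts" Data Source.
// Assumes the token is available as an environment variable and that
// generateAccountPayload() is defined as shown earlier in mockDataGenerator.js.
import { send_data_to_tinybird } from "./utils/tinybird.js";

const token = process.env.TINYBIRD_TOKEN; // assumption: your append-capable token
const payload = generateAccountPayload();

const result = await send_data_to_tinybird("accounts", token, payload);
console.log(result); // e.g. { successful_rows: 1, quarantined_rows: 0 }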

The Tinybird Events API is useful for two reasons:

  1. It allows for the flexible and efficient ingestion of data, representing various stages of signatures, directly into the Tinybird platform without needing complex streaming infrastructure.
  2. It allows you to stream events directly from your application instead of relying on batch ETLs or change data capture, which requires the events to first be logged in a transactional database and can add lag to the data pipeline.

You’ll also need to install the Faker library:

npm install @faker-js/faker

To run this file and start sending mock data to Tinybird, you’re going to create a custom script in your package.json file. So open up that file and add the following to the scripts:

"seed": "node data-project/mockDataGenerator.js"

Note that since our code is using ES modules, we’ll need to add "type": "module" to the package.json file to be able to run the script and access the modules. For more information on why, you can read this helpful post.

Once you’re done, your package.json should look something like this:

{
  "name": "signatures-dashboard",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "scripts": {
    "seed": "node data-project/mockDataGenerator.js",
    "dev": "next dev",
    "build": "next build",
    "start": "next start",
    "lint": "next lint"
  },
  "dependencies": {
    "@faker-js/faker": "^8.0.2",
    "autoprefixer": "10.4.15",
    "eslint": "8.48.0",
    "eslint-config-next": "13.4.19",
    "next": "13.4.19",
    "postcss": "8.4.28",
    "react": "18.2.0",
    "react-dom": "18.2.0",
    "tailwindcss": "3.3.3"
  }
}

Also, since we’re setting the type as module, we’ll need to treat our Next.js and PostCSS config scripts as CommonJS scripts:

mv next.config.js next.config.cjs
mv postcss.config.js postcss.config.cjs

To begin sending this mock data to Tinybird, run the following command from your local project directory (assuming you added the script to your package.json):

npm run seed

You should start seeing your mock data being sent to Tinybird.

A screenshot showing a CLI sending mock data streams to Tinybird
Sending mock data to Tinybird with our data seeding script.

Let this mock data generator run in the background so you can have some data to play with in the next step.

To verify that the data is flowing properly into Tinybird, inspect the Tinybird Data Sources. In the Tinybird UI, navigate to the signatures and accounts Data Sources to confirm that the data has been received. The latest records should be visible.

Likewise, you can use the Tinybird CLI to monitor rows being created in the data sources. For example:

tb sql "select count() from signatures"

This will return the current number of rows in the signatures Data Source. If your mock data generator is working (and still running in the background), you’ll see that number tick up.

Step 2: Build dashboard metrics with SQL in Tinybird

You now have events streaming into Tinybird, which will ensure your real-time dashboard has access to fresh data. The next step is to build real-time metrics using Tinybird Pipes.

In Tinybird, a Pipe is a set of chained, composable nodes of SQL that process, transform, and enrich data in your Data Sources.

Here's why you’ll love Pipes:

  • Performance: Pipes process new data in real time, allowing for rapid transformations on streaming data sets.
  • Flexibility: Pipes let you define custom data processing flows using filters, aggregations, and joins, enabling complex analytics and insights.
  • Scalability: Pipes can handle massive volumes of data, scaling with your needs.
  • Ease of Use: Pipes break up larger SQL queries into manageable nodes, which makes it easier to prototype, debug, and identify performance bottlenecks.
  • Maintainability: Pipes organize the data workflow intuitively, making it easier to understand and modify.

To create a new Pipe in the Tinybird UI, start from your Workspace dashboard (ui.tinybird.co). Click the Plus (+) icon in the left-side navigation bar next to the Pipes section.

A screenshot showing Pipe creation in Tinybird

Now for the fun part. Define your real-time dashboard metrics using chained nodes of SQL. Below, for example, is a node that filters and groups signature data by account_id for a specified date range, then orders the results by the total count.

SELECT
  account_id,
  count() AS total
FROM signatures
WHERE
    fromUnixTimestamp64Milli(timestamp)
    BETWEEN '2023-01-01'
    AND '2024-01-01'
GROUP BY account_id
HAVING total > 0
ORDER BY total DESC

Making your queries more dynamic

The SQL above uses static date range filters, but as I described earlier, real-time dashboards should be interactive. Instead of hardcoding the date range, we want our users to be able to select a dynamic range and have the results refresh in real time.

You can do this in Tinybird with its templating language, which lets you define dynamic, typed query parameters as well as add custom logic such as if/else statements and more.

Take a look at the updated SQL below using the Tinybird templating language. I’ve made two changes:

  1. Added an if defined statement. This clause tells the Pipe to execute the statements only if a certain parameter is passed. In this case, I’ve created logic such that if a boolean parameter called completed is passed, the Pipe calculates the number of completed signatures. Otherwise, it counts all signatures.
  2. Added date_from and date_to query parameters (Date type), which will dynamically change the filter based on the date values passed.

SELECT
  account_id,
  {% if defined(completed) %}
    countIf(status = 'completed') AS total
  {% else %}
    count() AS total
  {% end %}
FROM signatures
WHERE
    fromUnixTimestamp64Milli(timestamp)
    BETWEEN {{ Date(date_from, '2023-01-01', description="Initial date", required=True) }}
    AND {{ Date(date_to, '2024-01-01', description="End date", required=True) }}
GROUP BY account_id
HAVING total > 0
ORDER BY total DESC

Now, name this node retrieve_signatures.

A screenshot of an SQL Pipe in Tinybird

Below this node, create a new Node with the following SQL:

SELECT
  organization,
  sum(total) AS org_total
FROM retrieve_signatures
LEFT JOIN accounts ON accounts.account_id = retrieve_signatures.account_id
GROUP BY organization
ORDER BY org_total DESC
LIMIT {{Int8(limit, 10, description="The number of accounts to retrieve", required=False)}}

Name this node endpoint.

A screenshot showing a second node of SQL in a Tinybird Pipe

You now have a 2-node Pipe that gets the top <limit> organizations by signatures within a date range, either completed or total depending on whether you pass a completed query parameter.

At the top of the UI, name this Pipe ranking_of_top_organizations_creating_signatures.

A screenshot showing the final Tinybird Pipe

Step 3: Publish metrics as APIs using Tinybird

You’ll want to build an API (Application Programming Interface) for your dashboard to ensure seamless integration, accessibility, and interaction with other applications or services.

Here's why real-time APIs are so important for fast dashboards with many users:

  1. Integration & Interoperability: APIs allow your dashboard to be accessed programmatically by other applications. This enables a more comprehensive integration with different platforms, tools, or third-party services.
  2. Scalability: Through APIs, the dashboard can be quickly and easily scaled to serve multiple clients, including web, mobile, or IoT devices. This ensures that as your needs grow, your architecture can adapt without major redesigns.
  3. Real-Time Data Access: If your dashboard relies on real-time or frequently updated data, APIs are essential for providing up-to-the-minute access to the information, enhancing decision-making and user experience.

With Tinybird, it’s trivial to create low-latency, high-concurrency REST APIs from your Pipes. Simply open the Pipe that you want to publish and click the “Create API Endpoint” button in the top right corner of the screen. Then select the Node that you want to publish, in this case endpoint.

A screenshot showing how to create SQL-based API endpoints in Tinybird

With that, your API has been created! You’ll be greeted with an API page that contains a usage monitoring chart, parameter documentation, and sample usage. In addition, the API has been secured through an automatically generated read-only Auth Token.

Now let’s test your new API! Copy the HTTP endpoint from the sample usage and paste it directly into your browser to see the response.

A screenshot showing sample usage for real-time Tinybird APIs

Within the endpoint URL, you will notice the date_from and date_to parameters. These control the date range for the query, and they can be modified to filter the results accordingly. You’ll also see the limit parameter, which controls how many rows are returned.

Try altering the values for these parameters in the browser's address bar. As you change the dates or limit and refresh the page, you should see different data returned in response to your query. This behavior verifies that the dynamic filtering is working correctly, allowing the query to adapt to different user inputs or requirements.
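For example, a request for June 2023 limited to the top 5 organizations might look like this (with your own host and read token substituted in):

https://api.tinybird.co/v0/pipes/ranking_of_top_organizations_creating_signatures.json?token=YOUR_READ_TOKEN&date_from=2023-06-01&date_to=2023-06-30&limit=5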

If you request the data in a JSON format, you’ll also receive some metadata about the response, including statistics about the query latency:

"statistics": {
    "elapsed": 0.001110996,
    "rows_read": 4738,
    "bytes_read": 101594
}

In the above example, the API response took just over 1 millisecond, which is a recipe for fast dashboards! You can use this metadata to monitor your dashboard query performance and optimize as needed.

Pulling your Tinybird project into your local directory

If you want to pull the Tinybird resources into your local directory so that you can manage this project with git, you can do so as follows.

In your terminal, start by pulling the Tinybird data project and putting the resources into your data-project directory:

tb pull --auto
mv datasources pipes data-project/

You’ll see a confirmation that 3 resources (signatures.datasource, accounts.datasource, and ranking_of_top_organizations_creating_signatures.pipe) were written into two subfolders, datasources and pipes, which were created by using the --auto flag.

As you add additional resources in the Tinybird UI, simply use that same command to pull files from Tinybird. You can then add them to your git commits and push them to your remote repository.

If you create data project resources locally, you can push them to the Tinybird server with tb push. For more information on managing Tinybird data projects in the CLI, check out this CLI quick start guide.

Step 4: Create real-time dashboard components with Tremor and Next.js

Now that you have a low-latency API with real-time dashboard metrics, let’s create the visualization layer using Next.js and Tremor. These two tools give us a scalable and responsive interface that can effectively integrate with Tinybird's APIs to display data dynamically.

Here's how you can get started:

Add Tremor to your Next.js app

We’re going to use Tremor to create a simple bar chart that displays the signature count for each organization. Tremor gives you beautiful React chart components that you can deploy easily and customize as needed.

Start by installing Tremor with the CLI:

npx @tremor/cli@latest init

Select Next as your framework and allow Tremor to overwrite your existing tailwind.config.js.

Set up environment variables

Next, you need to add your Tinybird host and admin token as environment variables so you can run the project locally. Add the following to your .env.local file:

NEXT_PUBLIC_TINYBIRD_HOST=your_tinybird_host # (e.g. api.tinybird.co)
NEXT_PUBLIC_TINYBIRD_TOKEN=your_tinybird_token

Set up your index.js

Let’s create an index page to build and display our real-time dashboard.

cd src/pages
code index.js

Next.js may have made a default index.js, in which case start by clearing its contents.

Import UI libraries

To build your dashboard component, you will need to import various UI elements and functionalities from the libraries provided. Make sure you have the following libraries and components imported at the beginning of your file:

"use client";

import { Card, Text, Subtitle, Title, BarChart } from '@tremor/react';
import React, { useState, useEffect } from 'react';

Note that we’re using the "use client" directive to render the components on the client side. For more details on this, check out the Next.js docs.

Define constants and states

Inside your main component, define the constants and states required for this specific component. We’re going to set the state for the chart data and the latency of the query (so we can see how fast this dashboard is!):

// Get your Tinybird host and token from the .env file
const TINYBIRD_HOST = process.env.NEXT_PUBLIC_TINYBIRD_HOST; // The host URL for the Tinybird API
const TINYBIRD_TOKEN = process.env.NEXT_PUBLIC_TINYBIRD_TOKEN; // The access token for authentication with the Tinybird API

export default function Dashboard() {
    // React state hook for managing the ranking data
    // Initializes data with an array containing an object with default values for the top orgs creating signatures
    const [ranking_of_top_organizations_creating_signatures, setData] = useState([{
        "organization": "",
        "org_total": 0,
    }]);

    // Initializes latency with an integer 0  
    const [latency, setLatency] = useState(0);

Connect your dashboard to your Tinybird API

You’ll need to write a function to fetch data from Tinybird. Note that for the sake of brevity, we are hardcoding the dates and using the default limit in the Tinybird API. You could set up a Tremor datepicker and/or number input if you wanted to dynamically update the dashboard components from within the UI.

// Define hardcoded date range for the query
const dateFrom = new Date(2023, 0, 1); // Start date for the query (January 1st, 2023; JavaScript months are 0-indexed)
const dateTo = new Date(2023, 11, 31); // End date for the query (December 31st, 2023)

// Format for passing as a query parameter
const dateFromFormatted = dateFrom.toISOString().substring(0, 10);
const dateToFormatted = dateTo.toISOString().substring(0, 10);

// Constructing the URL for fetching data, including host, token, and date range
const topRankingOfOrganizationsCreatingSignaturesURL = `https://${TINYBIRD_HOST}/v0/pipes/ranking_of_top_organizations_creating_signatures.json?token=${TINYBIRD_TOKEN}&date_from=${dateFromFormatted}&date_to=${dateToFormatted}`;

// Function to fetch data from Tinybird URL
const fetchTinybirdUrl = async (fetchUrl, setData, setLatency) => {
        const data = await fetch(fetchUrl); // Performing an asynchronous HTTP fetch request
        const jsonData = await data.json(); // Parsing the response as JSON
        console.log(jsonData.data); // Logging the parsed data for debugging purposes
        console.log(jsonData.statistics.elapsed)
        setData(jsonData.data); // Setting the state with the fetched data
        setLatency(jsonData.statistics.elapsed) // Setting the state with the query latency from Tinybird
};
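The fetch above assumes a happy path. If you want the component to fail a bit more gracefully (for example, on a network error or an expired token), a slightly more defensive sketch might look like this:

// A more defensive variant of fetchTinybirdUrl (optional).
// Logs any HTTP or network error instead of letting the component crash.
const fetchTinybirdUrl = async (fetchUrl, setData, setLatency) => {
    try {
        const res = await fetch(fetchUrl);
        if (!res.ok) {
            throw new Error(`Tinybird API returned ${res.status}`);
        }
        const jsonData = await res.json();
        setData(jsonData.data); // Set the state with the fetched data
        setLatency(jsonData.statistics.elapsed); // Set the state with the query latency
    } catch (error) {
        console.error("Failed to fetch from Tinybird:", error);
    }
};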

Configure the Tinybird API Call

You need to define the specific URL for the Tinybird API call and make the fetch request using the fetchTinybirdUrl function inside the useEffect hook:

// useEffect hook to handle side-effects (in this case, fetching data) in a functional component
useEffect(() => {
  // Calling the fetchTinybirdUrl function with the URL and state setter function
  // The function fetches the data and updates the state
  fetchTinybirdUrl(topRankingOfOrganizationsCreatingSignaturesURL, setData, setLatency)
}, [topRankingOfOrganizationsCreatingSignaturesURL]); // The effect will rerun if the value of topRankingOfOrganizationsCreatingSignaturesURL changes
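Note that the hook above fetches the data once when the component mounts (and again only if the URL changes). To keep the chart continually refreshing, one option is to poll the endpoint on an interval, something like the sketch below (the 5-second period is an arbitrary choice):

// Variation: poll the Tinybird API every 5 seconds so the chart stays fresh
useEffect(() => {
  fetchTinybirdUrl(topRankingOfOrganizationsCreatingSignaturesURL, setData, setLatency); // initial fetch
  const intervalId = setInterval(() => {
    fetchTinybirdUrl(topRankingOfOrganizationsCreatingSignaturesURL, setData, setLatency);
  }, 5000); // refresh interval in milliseconds; tune it to your needs
  return () => clearInterval(intervalId); // clean up the interval when the component unmounts
}, [topRankingOfOrganizationsCreatingSignaturesURL]);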

Render the Component

Finally, include the rendering code to display the "Ranking of the top organizations creating signatures" in the component's return statement:

return (
    <Card>
            <Title>Top Organizations Creating Signatures</Title>
            <Subtitle>
                Ranked from highest to lowest
            </Subtitle>
            <BarChart
                className="mt-6"
                data={ranking_of_top_organizations_creating_signatures}
                index="organization"
                categories={["org_total"]}
                colors={["blue", "red"]}
                yAxisWidth={48}
                showXAxis={true}
            />
            <Text>Latency: {latency*1000} ms</Text>
        </Card>
);
}

To view your real-time dashboard component, run the following:

npm run dev

Navigate to http://localhost:3000/ in your browser. You should see something like this:

A screenshot showing the final rendered real-time dashboard component

And that’s it! You’ve created a real-time dashboard component using Tinybird, Tremor, and Next.js. You can see how quickly it renders by taking a peek at the latency number below the chart. In my case, Tinybird returned the data for my dashboard in a little over 40 milliseconds while aggregating about a million rows. Not too bad for a relatively unoptimized query!

Next Steps

This tutorial showed you how to build a single real-time dashboard component, but you probably want to add additional components and interactive elements.

If you need ideas, check out the GitHub repository for this project. It has some additional components including new visualizations (and Tinybird Pipes to support them) plus an interactive data range picker.

A real-time dashboard built with Tinybird, Tremor, and Next.js
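As a taste of what an interactive element could look like, here’s a rough sketch of wiring a date range picker into the component from Step 4. It assumes Tremor’s DateRangePicker component and its value/onValueChange props; check the Tremor docs for the exact API of the version you installed.

// Rough sketch: drive date_from/date_to from the UI instead of hardcoding them
import { DateRangePicker } from "@tremor/react";

// Inside the Dashboard component (useState is already imported in Step 4):
const [dateRange, setDateRange] = useState({
  from: new Date(2023, 0, 1),
  to: new Date(2023, 11, 31),
});

// Rebuild the query parameters whenever the selection changes;
// the existing useEffect re-fetches because the URL changes.
const dateFromFormatted = dateRange.from?.toISOString().substring(0, 10);
const dateToFormatted = dateRange.to?.toISOString().substring(0, 10);

// In the returned JSX, above the BarChart:
// <DateRangePicker value={dateRange} onValueChange={setDateRange} />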

You can also spend some time optimizing your data project for faster responses and minimal data processing using fine-tuned indexes, Materialized Views, and more. For tips on optimizing SQL queries or building Materialized Views, check out the Tinybird docs.

Wrapping up

If you are interested in building real-time dashboards or any other real-time visualizations, you need a data stack and frontend library that can keep pace. In this tutorial, you learned how to use modern tooling to build an end-to-end real-time data pipeline and dashboard. With Tinybird, Tremor, and Next.js, it’s possible to build a real-time dashboard from scratch in less than an hour.

The combination of Tinybird, Next.js, and Tremor provides a powerful solution for building real-time dashboards, but the real “speed layer” here is Tinybird. Here’s why Tinybird is perfect for building real-time data visualizations:

  • Real-Time Data Ingestion and Processing: Tinybird can handle large streams of data in real time. Unlike traditional batch ETL processes, it can ingest, process, and analyze millions of events per second on the fly. This means that your dashboard can reflect changes almost instantly, keeping the insights fresh and timely.
  • Highly Optimized Query Engine: Tinybird’s query engine is built to execute complex analytical queries in milliseconds. It can handle filtering, aggregating, or joining data without breaking a sweat, which means your dashboards won’t experience lagging refresh times.
  • Scalable Architecture: Tinybird is a scalable, serverless real-time data platform. It flexibly scales storage and compute resources based on demand. As your data volumes and user loads increase, Tinybird responds to ensure fast dashboards at scale.
  • Integration with Streaming Sources: Tinybird includes many first-class connectors for streaming data sources (like Apache Kafka, Google Pub/Sub, Amazon Kinesis, and more), so you can unify data from multiple sources directly into your visualization layer.
  • Real-time API Publication: Tinybird is designed specifically for user-facing applications. With Tinybird, you can instantly publish SQL-based metrics as APIs that you can integrate into your frontend.
  • Compatibility with Next.js and Tremor: Tinybird's architecture and API are designed to work seamlessly with modern frontend frameworks like Next.js and visualization tools like Tremor. This integration creates a smooth user experience from data ingestion to visualization.
  • Easy to Use: Even with all its robust capabilities, Tinybird remains accessible to developers. Its simplified SQL-based query language and well-documented APIs mean that building and maintaining a real-time dashboard does not require specialized skills or extensive training.

If you're dabbling in real-time data processing or looking to shift to event-driven architectures for your dashboards, Tinybird could be for you. It's free to start and designed to help you build real-time data pipelines fast. You can sign up here (no credit card required, no time limit).

Stuck somewhere along the way? Join the Tinybird Slack community for help. Want to dive deeper into Tinybird? The Tinybird docs are a great place to start.

FAQs

What are the technologies used for creating a real-time dashboard, and why?

The application uses Tinybird for real-time data ingestion and real-time analytics, Tremor for data visualization, and Next.js as a fully-featured React framework. These technologies are chosen for their efficiency in processing large streams of real-time data, visualizing it in a user-friendly way, and ensuring a smooth and visually appealing rendering.

What makes Tinybird essential for real-time data handling in the tech stack?

Tinybird provides real-time data ingestion and processing, optimized query execution, scalable architecture, compatibility with streaming sources, integration with modern frontend frameworks, and accessibility to developers. Its architecture is tailor-made for real-time analytics dashboards, making it an essential part of this process.

How can I set up the Tinybird CLI?

Setting up Tinybird CLI involves creating a virtual environment, activating it, installing Tinybird CLI using pip, authenticating with Tinybird, and securing the Tinybird config file. Detailed instructions are provided in the post.

What is the role of the mock data generator, and how does it work?

The mock data generator simulates the signature flow by generating random accounts, simulating signature statuses, determining final status, and sending payloads to Tinybird. It leverages the Events API to send HTTP events to Tinybird for efficient data ingestion.

Why are most data dashboards slow, and how does this application overcome that?

Traditional dashboards are slow due to issues like batch ETL processes, complex BI tools, poorly configured data stacks, inefficient queries, and a lack of scalability. By using Tinybird, Next.js, and Tremor, this application overcomes these issues with real-time processing, optimized queries, and scalable architecture.

What are the primary components of a real-time dashboard?

A real-time dashboard consists of Data Sources, a real-time data processing engine, a real-time visualization layer, and interactive controls. They collectively provide an up-to-the-second snapshot of critical metrics and KPIs, allowing users to interact with the data and gain insights instantly.

How can I verify that the data is flowing properly from the data seeder to Tinybird?

You can verify the data flow by inspecting the signatures or accounts Data Source in Tinybird to confirm that the data has been received and reviewing the Ingestion Metrics.

Can the structure described in the post be applied to other use cases besides signatures?

Yes, the structure and technologies can be adapted to almost any use case that requires real-time data handling, making it highly versatile.

What precautions should be taken with Tinybird authentication tokens?

The admin token should be kept secure and not shared or published. It's essential to hide secrets by adding the Tinybird config file to the .gitignore file, ensuring that it won't be committed to the repository.
