DEV Community

K for Fullstack Frontend


Decentralized Databases: ComposeDB

When I entered the Web3 space in 2021, my first question was: What is the database of choice?

The answers were a bit unsatisfying.

There were the omnipresent blockchain networks, which were expensive and slow, and there were decentralized storage networks, which at that time focused on object storage, with APIs too coarse to be convenient for structured data. And finally, there were p2p databases that stored all data on the client, which didn't seem optimal to me either.

Much has changed in the last few years, so I wanted to check out the current decentralized database products.

That's why I'm doing a series on decentralized databases, in which I build a backend for a blog with different database solutions, starting with ComposeDB.

What is ComposeDB?

ComposeDB is a graph database created by 3Box Labs, a company well known in the Web3 ecosystem for its work on decentralized identifiers (DIDs) and its main product, the Ceramic Network. Ceramic is a network of nodes that store and share composable data streams on top of libp2p, the network stack that also powers IPFS.

The goal of Ceramic is to enable decentralized storage independent from a specific blockchain network. This way, you can store your data in a decentralized manner and use it on DApps independently of what blockchain powers them.

Ceramic is the low-level infrastructure, and ComposeDB gives you a graph database on top of it. So, ComposeDB is part of every Ceramic node.

You define models with GraphQL schemas, so you can query the data from a node via GraphQL too.

When two or more nodes use the same model, they start synchronizing their data, which leads to decentralization.

To get data into the network, you must host your own node and add your schema. If others think your models are worth their resources, they can load your schema and start mirroring it.

There is no token-based incentivization layer. While on a blockchain network you have to pay fees for the data you want to store, here the node operators have to coordinate "manually" to figure out what to keep and who they allow to upload it.

So, let's build a decentralized blog with it!

Features

  • Authentication
  • Profile creation and update
  • Article creation and update
  • Commenting on articles
  • Reading profiles, articles, and comments

Architecture

The architecture of a ComposeDB-powered backend would look something like this:

ComposeDB Architecture

The orange Ceramic nodes share a model and synchronize their data; the red ones are Ceramic nodes with different models or none at all.

Prerequisites

Implementation

Let's start by creating a new Node.js project and installing all dependencies with these commands:

$ mkdir composedb && cd composedb
$ npm init -y
$ npm i @ceramicnetwork/cli @ceramicnetwork/http-client @composedb/cli @composedb/devtools @composedb/devtools-node dids key-did-provider-ed25519 key-did-resolver uint8arrays

Creating the Ceramic Config File

After this, we can create an NPM script in the package.json to start a Ceramic node.

"scripts": {
  "ceramic": "ceramic daemon --config=config/daemon.config.json"
},

This script starts the Ceramic node and, on the first run, creates a new config file. Stop the node with ctrl+c once the file exists; we will edit the config file in a later step and then restart the node with the updated configuration.

$ npm run ceramic

Creating the Admin Account

To be able to modify the database via its API, we need an admin account. Let's create a script for it; make sure to use the .mjs extension so Node.js treats the script as an ES module.

File scripts/setup-account.mjs:

import { mkdirSync, readFileSync, writeFileSync } from "fs"
import { execSync } from "child_process"

// Run a ComposeDB CLI command and return its trimmed output.
const composeDb = (...args) =>
  execSync("composedb " + args.join(" "), {
    encoding: "utf-8",
  }).trim()

// Generate a private key, derive the admin DID from it, and save both.
mkdirSync("account")
const adminKey = composeDb("did:generate-private-key")
writeFileSync("account/admin-key", adminKey)

const adminDid = composeDb("did:from-private-key", adminKey)
writeFileSync("account/admin-did", adminDid)

// Register the DID as an admin in the Ceramic daemon config.
const daemonConfig = JSON.parse(
  readFileSync("config/daemon.config.json", {
    encoding: "utf-8",
  })
)

daemonConfig["http-api"]["admin-dids"].push(adminDid)

writeFileSync(
  "config/daemon.config.json",
  JSON.stringify(daemonConfig)
)

This script generates a random private key and uses it to derive a DID. Both get saved into the account directory, and the DID is also added as an admin DID to the Ceramic config file we created before.
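The config edit at the end of the script is plain JSON manipulation. Here it is in isolation, with the config abridged to the single admin-dids field the script touches (the DID value is a placeholder):

```javascript
// Abridged daemon config: only the field our setup script modifies.
const configJson = '{"http-api":{"admin-dids":[]}}'

// Parse the config, append the admin DID, and serialize it back.
const addAdminDid = (json, did) => {
  const config = JSON.parse(json)
  config["http-api"]["admin-dids"].push(did)
  return JSON.stringify(config)
}

console.log(addAdminDid(configJson, "did:key:z6MkExample"))
// {"http-api":{"admin-dids":["did:key:z6MkExample"]}}
```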

To run the account creation, execute the following command:

$ node scripts/setup-account.mjs

Creating the Schema

The schema consists of multiple GraphQL files; they all get indexed by the Ceramic node and then linked to each other.

ComposeDB comes with DID-based accounts out of the box, so we don't have to implement them ourselves.
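These accounts are identified by key DIDs, which are just strings of the form did:key:&lt;multibase-encoded public key&gt;. As a rough illustration (not a real validation, which would decode the key material), a shape check could look like this:

```javascript
// Rough shape check for a key DID: the method-specific part is a
// base58btc multibase string, so it starts with "z". Illustrative only.
const isKeyDid = (s) => /^did:key:z[1-9A-HJ-NP-Za-km-z]+$/.test(s)

console.log(isKeyDid("did:key:z6MkhaXgBZDvotDkL5257faiztiGiC2QtKLGpbnnEGta2doK")) // true
console.log(isKeyDid("not-a-did")) // false
```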

Creating the Profile Model

Having more info about a user than just their DID would be nice, so let's start with a profile model.

File schema/profile.graphql:

type Profile 
  @createModel(accountRelation: SINGLE, description: "Author profile") {
  author: DID! @documentAccount
  name: String! @string(minLength: 1, maxLength: 50)
  bio: String @string(maxLength: 100000)
}
  • Profile uses the @createModel directive, which marks the type as a model for ComposeDB
  • accountRelation describes how many instances of this model a ComposeDB account can have; in this case, one
  • description helps people who find the model in the registry
  • author field adds the owning account to the Profile
  • name and bio fields are just text with a length definition

Creating the Article Model

What is a blog without articles? I know, I know, most of us never get to the actual writing because we are preoccupied with building blogs... just like right now, lol.

File schema/article.graphql:

type Profile @loadModel(id: "$PROFILE_ID") {
  id: ID!
}

type Article
  @createModel(
    accountRelation: LIST
    description: "Text written by an author"
  ) {
  author: DID! @documentAccount
  title: String! @string(maxLength: 50)
  date: DateTime!
  content: String! @string(maxLength: 100000)
  profileId: StreamID! @documentReference(model: "Profile")
}
  • @loadModel will load a model with a specific ID from the Ceramic node. We don't have that ID because we didn't index Profile yet, so we put in a placeholder that will be replaced by a script later
  • accountRelation is a LIST because an account can have more than one article, even if it has a low probability of happening.
  • author again holds the account that created the record
  • title and content for the actual content of the article
  • date to log when the article was created
  • profileId to embed the Profile from before; that way we can load it as part of the Article in one query

Creating the Comment Model

Now, what are articles without flame wars, right? So, let's add a Comment model to spice things up a bit!

File schema/comment.graphql:

type Article @loadModel(id: "$ARTICLE_ID") {
  id: ID!
}

type Comment
  @createModel(
    accountRelation: LIST
    description: "A comment on an article"
  ) {
  author: DID! @documentAccount
  content: String! @string(maxLength: 5000)
  articleId: StreamID! @documentReference(model: "Article")
}
  • @loadModel to get the indexed Article model so that we can link it to each comment
  • accountRelation is a LIST because an account can have more than one comment
  • author again holds the account that created the record
  • content for the actual content of the comment
  • articleId to embed the Article from before; that way, we can load it as part of the Comment in one query

Linking the Article and Comment Models

Now, we can show a Profile on an article and an Article on a comment, but it would be much more interesting to show multiple Comment instances on an Article.

For this, we have to create another schema.

File schema/article.comment.graphql:

type Comment @loadModel(id: "$COMMENT_ID") {
  id: ID!
}

type Article @loadModel(id: "$ARTICLE_ID") {
  comments: [Comment] @relationFrom(model: "Comment", property: "articleId")
}

We load the indexed Comment model from our Ceramic node and the indexed Article model, but this time we extend the Article model with a relation. The @relationFrom directive creates a field that, when requested, collects all Comment records whose articleId matches the ID of the corresponding Article.

Indexing the Schema

We need to get all these models loaded by our Ceramic node, so we create a script that replaces our ID placeholders before it loads a model. This way, we can link our models with the correct model IDs without manually adding them to the schema definitions.

Again, make sure you use the .mjs extension.

File scripts/setup-schema.mjs:

import { execSync } from "child_process"
import { readFileSync } from "fs"
import { CeramicClient } from "@ceramicnetwork/http-client"
import { DID } from "dids"
import { Ed25519Provider } from "key-did-provider-ed25519"
import { getResolver } from "key-did-resolver"
import { fromString } from "uint8arrays/from-string"
import { Composite } from "@composedb/devtools"
import {
  createComposite,
  writeEncodedComposite,
} from "@composedb/devtools-node"

const privateKey = fromString(
  readFileSync("account/admin-key", { encoding: "utf-8" }).trim(),
  "base16"
)

const did = new DID({
  resolver: getResolver(),
  provider: new Ed25519Provider(privateKey),
})
await did.authenticate()

const ceramic = new CeramicClient("http://127.0.0.1:7007")
ceramic.did = did

const profileComposite = await createComposite(
  ceramic,
  "schema/profile.graphql"
)

const articleSchema = readFileSync("schema/article.graphql", {
  encoding: "utf-8",
}).replace("$PROFILE_ID", profileComposite.modelIDs[0])

const articleComposite = await Composite.create({
  ceramic,
  schema: articleSchema,
})

const commentSchema = readFileSync("schema/comment.graphql", {
  encoding: "utf-8",
}).replace("$ARTICLE_ID", articleComposite.modelIDs[1])

const commentComposite = await Composite.create({
  ceramic,
  schema: commentSchema,
})

const articleCommentSchema = readFileSync("schema/article.comment.graphql", {
  encoding: "utf-8",
})
  .replace("$ARTICLE_ID", articleComposite.modelIDs[1])
  .replace("$COMMENT_ID", commentComposite.modelIDs[1])

const articleCommentComposite = await Composite.create({
  ceramic,
  schema: articleCommentSchema,
})

const composite = Composite.from([
  profileComposite,
  articleComposite,
  commentComposite,
  articleCommentComposite,
])

await writeEncodedComposite(composite, "composites/blog.json")
await composite.startIndexingOn(ceramic)

execSync(
  "composedb composite:compile composites/blog.json composites/blog.runtime.json --ceramic-url=http://0.0.0.0:7007"
)
execSync(
  "composedb composite:compile composites/blog.json composites/blog.runtime.js --ceramic-url=http://0.0.0.0:7007"
)

First, we load the private key to authenticate with the Ceramic node.

Then, we load each schema file, replace the ID placeholders with the actual model stream IDs, and create a composite from it.

After creating and linking all models, we merge them so our node can index them.

We then save the merged composite to disk at composites/blog.json and compile two runtime composites from it that go into the same directory. Our GraphQL server will use these runtime composites later.

To run the script, we have to start the Ceramic node again and then execute the script, each in a separate terminal.

Start the node with this command:

$ npm run ceramic

Open another terminal and run the script with the following command:

$ node scripts/setup-schema.mjs

After this, our models should be queryable.

Testing the Schema

We can test the schema using the GraphQL server that ships with the ComposeDB CLI.

Let's add an NPM script for this:

File package.json:

...
"scripts": {
  ...
  "graphql": "composedb graphql:server --graphiql composites/blog.runtime.json --ceramic-url=http://0.0.0.0:7007 --did-private-key=$(cat account/admin-key) --port=5005"
},
...

The GraphQL server will use the runtime composite we generated and connect to the Ceramic node.

If you open http://localhost:5005/graphql, you'll see the Yoga GraphiQL interface and can start issuing queries.
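GraphiQL is just a convenience on top of the HTTP endpoint; any client that can POST JSON can issue the same queries. A sketch of what a frontend call might look like (buildGraphQLRequest is a hypothetical helper, not part of ComposeDB):

```javascript
// Build fetch options for a standard GraphQL-over-HTTP POST request.
const buildGraphQLRequest = (query, variables = {}) => ({
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ query, variables }),
})

// Usage in a browser or Node.js 18+ (both provide a global fetch):
// const res = await fetch(
//   "http://localhost:5005/graphql",
//   buildGraphQLRequest("{ viewer { id isViewer } }")
// )
// const { data } = await res.json()
```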

Querying your Account

The most straightforward query is for the currently active account, the viewer.

Query:

{
  viewer {
    id
    isViewer
  }
}

Response:

{
  "data": {
    "viewer": {
      "id": "did:key:...",
      "isViewer": true
    }
  }
}

In the case of the GraphQL server, the viewer is the admin account we created. Obviously, isViewer is true when we query our current account.

Creating a Profile

If we use the auto-completion of GraphiQL, we see the models we indexed are already available.

GraphiQL auto-complete

But we didn't create records for our models.

In a graph database, records are called nodes and are connected by edges, but I will stick with "records" to prevent confusion with the Ceramic nodes.

So, let's start by creating a Profile for our admin. To do so, run this mutation:

Mutation:

mutation {
  createProfile(
    input: {
      content: {
        name: "Admin"
        bio: "The creator of this blog."
      }
    }
  ) {
    profile: document {
      name
      bio
    }
  }
}

Response:

{
  "data": {
    "createProfile": {
      "profile": {
        "name": "Admin",
        "bio": "The creator of this blog."
      }
    }
  }
}

Loading the Profile

Now, we can use the viewer to retrieve the profile we created.

Query:

{
  viewer {
    profile {
      name
      bio
    }
  }
}

Response:

{
  "data": {
    "viewer": {
      "profile": {
        "name": "Admin",
        "bio": "The creator of this blog."
      }
    }
  }
}

If we want to load the profile for another user, we can use ComposeDB's built-in Node type. Every document is a Node; we must use inline fragments to define which kind of nodes we want.

Query:

{
  profile: node(id: "...") {
    ... on Profile {
      name
      bio
    }
  }
}

Response:

{
  "data": {
    "profile": {
      "name": "Admin",
      "bio": "The creator of this blog."
    }
  }
}

Creating and Updating an Article

Now that we have a profile, we can start writing something!

Mutation:

mutation {
  createArticle(
    input: {
      content: {
        date: "2023-03-21T12:19:06.089Z"
        title: "First Article!"
        content: "This is some content."
        profileId: ""
      }
    }
  ) {
    article: document {
      date
      title
      content
    }
  }
}

Response:

{
  "data": {
    "createArticle": {
      "article": {
        "date": "2023-03-21T12:19:06.089Z",
        "title": "First Article!",
        "content": "This is some content."
      }
    }
  }
}

If we wanted to update the article's title, the mutation would look like this:

Mutation:

mutation {
  updateArticle(
    input: {
      id: "..."
      content: {
        title: "New title!"
      }
    }
  ) {
    article: document {
      title
    }
  }
}

Response:

{
  "data": {
    "updateArticle": {
      "article": {
        "title": "New title!"
      }
    }
  }
}

Loading Profile with Articles and Article with Profile

Now, in a blog UI, we want to show the profile of a user and all of their articles.

Depending on what ID we have at hand, we can query the profile by ID and traverse to the article list of the profile's owner:

Query:

{
  profile: node(id: "") {
    ... on Profile {
      name
      bio
      author {
        articleList(last: 10) {
          articles: edges {
            article: node {
              id
              date
              title
            }
          }
        }
      }
    }
  }
}

Response:

{
  "data": {
    "profile": {
      "name": "Admin",
      "bio": "The creator of this blog.",
      "author": {
        "articleList": {
          "articles": [
            {
              "article": {
                "id": "...",
                "date": "2023-03-21T12:19:06.089Z",
                "title": "New title!"
              }
            }
          ]
        }
      }
    }
  }
}

We can also start from the account and get both the profile and the article records.

Query:

{
  account: node(id: "did:key:...") {
    ... on CeramicAccount {
      profile {
        name
        bio
      }
      articleList(last: 10) {
        articles: edges {
          article: node {
            id
            date
            title
          }
        }
      }
    }
  }
}

Response:

{
  "data": {
    "account": {
      "profile": {
        "name": "Admin",
        "bio": "The creator of this blog."
      },
      "articleList": {
        "articles": [
          {
            "article": {
              "id": "...",
              "date": "2023-03-21T12:19:06.089Z",
              "title": "New title!"
            }
          }
        ]
      }
    }
  }
}

Creating and Loading Comments

Let's start with adding comments to an article.

Mutation:

mutation {
  createComment(
    input: {
      content: {
        content: "Hello!"
        articleId: "..."
      }
    }
  ) {
    comment: document {
      content
    }
  }
}

Response:

{
  "data": {
    "createComment": {
      "comment": {
        "content": "Hello!"
      }
    }
  }
}

Now that we have an article with a comment, we can query the article together with its author's name and all related comments, each with their author's name.

Query:

{
  article: node(id: "...") {
    ... on Article {
      author {
        profile {
          name
        }
      }
      date
      title
      content
      comments(last: 10) {
        edges {
          node {
            author {
              profile {
                name
              }
            }
            content
          }
        }
      }
    }
  }
}

Response:

{
  "data": {
    "article": {
      "author": {
        "profile": {
          "name": "Admin"
        }
      },
      "date": "2023-03-21T12:19:06.089Z",
      "title": "New title!",
      "content": "This is some content.",
      "comments": {
        "edges": [
          {
            "node": {
              "author": {
                "profile": {
                  "name": "Admin"
                }
              },
              "content": "Hello!"
            }
          }
        ]
      }
    }
  }
}

No Data Deletion

There is no reliable way to delete data on IPFS, so we'll be missing out on that feature for our blog. If network participants deem content valuable enough, they will pin it all over the place and keep it from getting deleted.

We can write update methods that blank out content and mark it as deleted for the UX. But remember that users might be surprised when they find their content elsewhere.
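Such a soft delete could be as simple as overwriting the user-visible fields before sending the update mutation. A client-side sketch (the deleted flag is hypothetical and would have to be added to the Article model first):

```javascript
// Soft delete: blank out the visible fields and flag the record.
// The stream's history on IPFS still contains the old content.
const softDeleteArticle = (article) => ({
  ...article,
  title: "[deleted]",
  content: "",
  deleted: true,
})

console.log(softDeleteArticle({ title: "First Article!", content: "Some content." }))
// { title: '[deleted]', content: '', deleted: true }
```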

Summary

Even though ComposeDB is still in beta, it looks like a promising database solution for your DApp.

It distributes data via the Ceramic Network, which, in turn, is built on top of IPFS. It doesn't have any token-based incentivization layer (yet?), but that is a plus for some people.

If someone cares about a DApp, they can host a Ceramic node and import the relevant models to mirror the data.

ComposeDB is a graph database, which can be confusing when you come from relational databases, but its GraphQL interface should still feel familiar to the average developer.

Additional Resources

Top comments (8)

quantotto

Thanks for the article. When developing dApps, there is a dilemma on whether to store everything on chain or have some data residing off chain. For bigger apps, where performance is important and data models become more complex, it is almost impossible to stay exclusively on-chain. Such a decentralized solution for DB might be a good answer. Just need to test the performance as IPFS is not extremely optimized.

K

Yes, on-chain storage pricing is quite steep.

That's why I set out to do some research on decentralised alternatives.

Ceramic

Really great article, @kayis! We tried to reach out to you via Twitter DM but it was giving us an error, we'd love to connect and potentially collaborate with you. Can you give us a method of contact or send us a DM on Twitter to connect? Lmk! Thank you :)

Christina
Community Manager

K

Wrote you on Twitter!

Jay

It's a really intriguing idea. What benefits would a DApp have using this over, say, GraphQL?

K

This DApp is using GraphQL.

If you want to know what benefits DApps have over regular apps, then here the answer is data portability.

You can run your own Ceramic node with the models and host the backend if you like. The data would be synchronized between your node and the "founding" node via a p2p network in the background. If the founding node goes down, the DApp still works.

Mends Albert

Thanks for such an insightful project.

Can you show me how the DID, node, or API can be used on the frontend for queries?

Asad ullah

Thank you, it was very helpful. I want to change SQLite to Postgres; how do I do that?