
On-Chain SVG Generation - Part 1

Recently, a lot of projects have asked us to deploy their art on-chain. While this is a cool, trendy way of keeping all of your NFT art on-chain, it comes with some limitations: the image (in bytes) that they want to store on-chain can only be so big without losing a ton of money to gas costs.

While this is not a new idea by any means (CryptoPunks is on-chain, and they were one of the first), I had trouble finding a good tutorial on how to do it.

Thus, we will be covering how to generate the art on-chain, and in Part 2 we will explain how to actually handle randomness and push the correct metadata for the OpenSea (OS) protocol.

First steps

You will need to deploy all of your assets to IPFS (bear with me, we won't just be referencing them in our metadata as the image). You will then need to build a JSON structure; here is an example I made for a "Mobsters" NFT:

        "Shoes": [
            {"Black": ""},
            {"Brown": ""},
            {"Gold": ""},
            {"Red": ""},
            {"White": ""},
            {"White & Blue": ""},

        "Pants": [
            {"Black": ""},
            {"Gold": ""},
            {"Jeans": ""},
            {"Pinstripe": ""},
            {"Red": ""},
            {"White": ""}
   // ETC
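Before uploading anything, it is worth sanity-checking this mapping so you don't pay gas for a trait that points at an empty IPFS URL. A minimal sketch of such a check is below; the inline object mirrors the shape of the JSON above (in practice you would read it from `scripts/traitMapping.json`, as the upload script later in this post does, and the `ipfs://...` values are placeholders):

```javascript
// Sanity-check a trait mapping: flag any trait whose IPFS URL is missing.
// The inline object is a stand-in for scripts/traitMapping.json.
const traitJson = {
  Mobster: {
    Shoes: [{ Black: "ipfs://..." }, { Brown: "" }],
    Pants: [{ Jeans: "ipfs://..." }]
  }
};

const missing = [];
for (const [category, traits] of Object.entries(traitJson.Mobster)) {
  for (const trait of traits) {
    // each entry is a single {name: url} pair
    const [name, url] = Object.entries(trait)[0];
    if (!url) missing.push(`${category}/${name}`);
  }
}
console.log(missing); // traits still missing an IPFS URL
```

Run this once before deploying; an empty array means every trait has an asset URL.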

Next, we are going to create a file called Traits.sol, which is where we are going to upload each of these assets as base64-encoded versions (some parts are cut for the sake of Part 1's simplicity).

It will look something like this:

// SPDX-License-Identifier: MIT LICENSE

pragma solidity ^0.8.0;
import "@openzeppelin/contracts/access/Ownable.sol";
import "@openzeppelin/contracts/utils/Strings.sol";
import "./interfaces/ITraits.sol";
import "./interfaces/IMnC.sol";

contract Traits is Ownable, ITraits {

  using Strings for uint256;

  // struct to store each trait's data for metadata and rendering
  struct Trait {
    string name;
    string png;
  }

  // mapping from trait type (index) to its name
  // (only the first name is shown here; the remaining seven are omitted)
  string[8] private _traitTypes = [
      "Mouth Piece",
      "", "", "", "", "", "", ""
  ];

  // storage of each trait's name and base64 PNG data
  mapping(uint8 => mapping(uint8 => Trait)) public traitData;

  // reference to the main NFT contract, which knows each token's traits
  IMnC public mncNFT;

  /**
   * administrative to upload the names and images associated with each trait
   * @param traitType the trait type to upload the traits for (see _traitTypes for a mapping)
   * @param traitIds the IDs of the traits being uploaded
   * @param traits the names and base64 encoded PNGs for each trait
   */
  function uploadTraits(uint8 traitType, uint8[] calldata traitIds, Trait[] calldata traits) external onlyOwner {
    require(traitIds.length == traits.length, "Mismatched inputs");
    for (uint i = 0; i < traits.length; i++) {
      traitData[traitType][traitIds[i]] = Trait(
        traits[i].name,
        traits[i].png
      );
    }
  }

  /** RENDER */

  /**
   * generates an <image> element using base64 encoded PNGs
   * @param trait the trait storing the PNG data
   * @return the <image> element
   */
  function drawTrait(Trait memory trait) public pure returns (string memory) {
    return string(abi.encodePacked(
      '<image x="4" y="4" width="64" height="64" image-rendering="pixelated" preserveAspectRatio="xMidYMid" xlink:href="data:image/png;base64,',
      trait.png,
      '"/>'
    ));
  }

  /**
   * generates an entire SVG by composing multiple <image> elements of PNGs
   * @param tokenId the ID of the token to generate an SVG for
   * @return a valid SVG of the Mobster
   */
  function drawSVG(uint256 tokenId) internal view returns (string memory) {
    IMnC.CopperMobster memory s = mncNFT.getTokenTraits(tokenId);

    // one <image> layer per trait type, bottom layer first
    // (the field names on CopperMobster are illustrative stand-ins)
    string memory svgString = string(abi.encodePacked(
      drawTrait(traitData[0][s.shoes]),
      drawTrait(traitData[1][s.pants])
      // ... one drawTrait(...) call per remaining trait layer
    ));

    return string(abi.encodePacked(
      '<svg id="NFT" width="100%" height="100%" version="1.1" viewBox="0 0 64 64" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">',
      svgString,
      "</svg>"
    ));
  }
}

So, essentially all we are doing here is storing all of the traits in a struct, which is kept in a mapping for later use via the uploadTraits method. This will be the first thing you will want to knock out when starting this project. drawTrait will be used to wrap the base64-encoded PNG in an image tag so it can be rendered. Finally, drawSVG will let us grab the randomized traits and compose them into a single SVG. (Randomness will be explained in the next tutorial; there are a lot of ways to do this, but to do it entirely on-chain we used A.J. Walker's Alias Algorithm, for anyone wanting to jump ahead.)
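For anyone jumping ahead, here is a minimal off-chain sketch of A.J. Walker's alias method, the weighted-sampling technique mentioned above. The function names are illustrative, not the contract's actual API; what makes the method attractive on-chain is that setup is O(n) and each draw is O(1) with just two lookups:

```javascript
// Build the alias table for a set of trait weights (e.g. rarity weights).
function buildAliasTable(weights) {
  const n = weights.length;
  const total = weights.reduce((a, b) => a + b, 0);
  // scale each probability so the average bucket height is exactly 1
  const prob = weights.map(w => (w * n) / total);
  const alias = new Array(n).fill(0);
  const small = [], large = [];
  prob.forEach((p, i) => (p < 1 ? small : large).push(i));
  // pair every under-full bucket with an over-full one
  while (small.length && large.length) {
    const s = small.pop(), l = large.pop();
    alias[s] = l;
    prob[l] = prob[l] + prob[s] - 1;
    (prob[l] < 1 ? small : large).push(l);
  }
  // leftovers are exactly full (up to floating-point error)
  while (large.length) prob[large.pop()] = 1;
  while (small.length) prob[small.pop()] = 1;
  return { prob, alias };
}

// Draw one index: pick a bucket uniformly, then keep it or take its alias.
function draw({ prob, alias }, rand = Math.random) {
  const i = Math.floor(rand() * prob.length);
  return rand() < prob[i] ? i : alias[i];
}
```

For example, `draw(buildAliasTable([80, 15, 5]))` returns index 0 roughly 80% of the time. On-chain, the two `rand()` calls become slices of your random seed.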

Script to upload one group of traits

Note, this can also be done in a loop to do everything at once, but for simplicity's sake, here is a script showing how to upload one group (we will be doing our Mobsters' shirts).

const { ethers } = require("hardhat");
require('dotenv').config({ path: ".env" });
const fs = require('fs');
const fetch = require("node-fetch");
const traitJson = JSON.parse(fs.readFileSync('scripts/traitMapping.json'));

async function main() {

  const [user1, user2] = await ethers.getSigners();
  const provider = ethers.getDefaultProvider();

  // Deploy our Traits.sol contract
  const Traits = await ethers.getContractFactory("Traits");
  const t = await Traits.deploy();
  await t.deployed();
  console.log("Traits.sol deployed to", t.address);
  // Attach to the contract
  const tContract = await Traits.attach(t.address);
  let traitsToUpload = [];
  for (let i = 0; i < traitJson["Mobster"]["Shirt"].length; i++) {
    // Get name of the specific attribute
    let key = Object.keys(traitJson["Mobster"]["Shirt"][i]).toString();
    // Get IPFS URL for the asset
    let value = Object.values(traitJson["Mobster"]["Shirt"][i]).toString();
    // Encode the PNG in base64 (this is where the magic happens)
    let imgB64 = await fetch(value).then(r => r.buffer()).then(buf => `data:image/png;base64,` + buf.toString('base64'));
    // Append the mapping of the name to the base64 image
    traitsToUpload.push({ name: key, png: imgB64 });
  }
  // The first arg is the position of the trait type in Traits.sol, the second array is for
  // randomness which will be explained in part 2 of the tutorial, and finally the third arg is
  // our actual mappings we just generated
  const tx = await tContract.uploadTraits(2, [3, 3, 8, 1, 10, 10, 10, 11, 3, 11, 8, 10], traitsToUpload);
  await tx.wait();
}

main()
  .then(() => process.exit(0))
  .catch((error) => {
    console.error(error);
    process.exit(1);
  });

After doing this for each attribute, all of our assets are ready to be combined upon mint!
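The encoding step is worth seeing in isolation: any PNG bytes become a data URI that the contract can embed directly in an `<image>` element. In this sketch a four-byte buffer (the start of the PNG magic number) stands in for a real fetched asset:

```javascript
// Base64-encode raw PNG bytes into a data URI, as the upload script does.
// The buffer here is just the first four bytes of the PNG file signature,
// standing in for a full image fetched from IPFS.
const pngBytes = Buffer.from([0x89, 0x50, 0x4e, 0x47]);
const dataUri = 'data:image/png;base64,' + pngBytes.toString('base64');
console.log(dataUri); // data:image/png;base64,iVBORw==
```

Note the familiar `iVBOR` prefix: every base64-encoded PNG starts with it, which is a handy way to eyeball that your uploads contain real image data.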

Script to make sure your assets combine correctly

If you're curious how your assets stack up, you can check off-chain that everything is assembling correctly. Once you have altered the script below, you can paste the SVG code it outputs into the bottom of this webpage to see a preview of what it will render as. Keep in mind these are SVGs, so you need to keep them small. SVG stands for Scalable Vector Graphics, which means that even if your source images are very small (64x64px is probably the largest I would go, for gas reasons), the SVG will scale cleanly as it is rendered on OpenSea or elsewhere.

This is a very blunt script, meant to explain in detail what exactly will happen on-chain when the assets are combined. (It's crucial you sync with your artist to make sure you are layering them in the correct order.)

const fetch = require("node-fetch");

async function main() {
    let shoesB64 = await fetch("").then(r => r.buffer()).then(buf => `data:image/png;base64,`+buf.toString('base64'));
    let pantsB64 = await fetch("").then(r => r.buffer()).then(buf => `data:image/png;base64,`+buf.toString('base64'));
    let shirtB64 = await fetch("").then(r => r.buffer()).then(buf => `data:image/png;base64,`+buf.toString('base64'));
    let eyesB64 = await fetch("").then(r => r.buffer()).then(buf => `data:image/png;base64,`+buf.toString('base64'));
    let headB64 = await fetch("").then(r => r.buffer()).then(buf => `data:image/png;base64,`+buf.toString('base64'));
    let hatB64 = await fetch("").then(r => r.buffer()).then(buf => `data:image/png;base64,`+buf.toString('base64'));
    let mouthB64 = await fetch("").then(r => r.buffer()).then(buf => `data:image/png;base64,`+buf.toString('base64'));
    let assessB64 = await fetch("").then(r => r.buffer()).then(buf => `data:image/png;base64,`+buf.toString('base64'));

    base = '<image x="4" y="4" width="64" height="64" image-rendering="pixelated" preserveAspectRatio="xMidYMid" xlink:href="'
    end = '"/>'

    shoes = base + shoesB64 + end
    pants = base + pantsB64 + end
    shirt = base + shirtB64 + end
    eyes = base + eyesB64 + end
    head = base + headB64 + end
    hat = base + hatB64 + end
    mouth = base + mouthB64 + end
    access = base + assessB64 + end

    starter = '<svg id="NFT" width="100%" height="100%" version="1.1" viewBox="0 0 64 64" xmlns="" xmlns:xlink="">'
    finisher = "</svg>"


  .then(() => process.exit(0))
  .catch((error) => {

This should output a string you can paste into the page linked above to see whether that specific NFT renders correctly.

I hope you found this tutorial helpful. If you have any comments or suggestions on how I could improve my process or how you did it more efficiently, I am all ears. Please feel free to leave something in the comments. Once the second part of the tutorial is complete, we will be open sourcing the entire repo.

If you found this tutorial helpful, please give me a follow on Twitter and follow my company, CrossChain Labs, for any NFT/DeFi help you may need with your project, or for more helpful blog posts.

Stay tuned for part 2!!
