I don’t usually do multipart posts, but this one grew a bit after I published the initial article. The setup I described there imports content from Notion into Eleventy and renders a site. But it was getting very slow very quickly.
Webhooks
I described how to use Zapier as a deployment method. Two weeks later I found out that their generic webhooks are free only during a grace period. I don’t make any money with my site, so I think it’s fair to stay on free services all around. I switched to Pipedream, which provides generic webhooks in its free plan indefinitely.
Caching
As I started to import all my content into my new blog setup, build times slowed to a crawl, easily exceeding 15 minutes. All those Notion requests were taking their toll: one request for the overview page containing all blog posts, one request for each blog post’s page, and one request for each post’s content. Long articles needed one or two additional requests, because their content is paginated (which I hadn’t taken into account before, either). Multiply that by 3 for each of my sections and you get a lot of time to play Memory.
Eleventy has a nice caching plugin called Eleventy-Fetch, which fetches a request and caches the result for a given duration. It takes a URL and fetches that. There’s just one problem: I was using the Notion Client, which only takes an ID. There are no URLs involved:
const { Client } = require("@notionhq/client");

const notion = new Client({ auth: process.env.NOTION_KEY });

const db = await notion.databases.query({
  database_id: process.env.NOTION_BLOG_ID,
  sorts: [
    {
      property: "Date",
      direction: "descending",
    },
  ],
});
There’s a feature request in Eleventy-Fetch to support functions like that, but as far as I can see, no one’s working on it at the moment. It’s nice to work with the client, but in order to cache those results, I need to switch to plain old URLs that I can feed to Eleventy-Fetch. Notion’s docs describe how to curl page content:
curl "https://api.notion.com/v1/blocks/16d8004e-5f6a-42a6-9811-51c22ddada12/children?page_size=100" \
  -H "Authorization: Bearer $NOTION_API_KEY" \
  -H "Notion-Version: 2022-06-28"
Translated into JavaScript and cached by Eleventy-Fetch:
const EleventyFetch = require("@11ty/eleventy-fetch");

const db = await EleventyFetch(
  `https://api.notion.com/v1/databases/${process.env.NOTION_BLOG_ID}/query`,
  {
    duration: "7d",
    type: "json",
    fetchOptions: {
      method: "POST",
      body: JSON.stringify({
        sorts: [
          {
            property: "Date",
            direction: "descending",
          },
        ],
      }),
      headers: {
        Authorization: `Bearer ${process.env.NOTION_KEY}`,
        "Notion-Version": "2022-06-28",
        "Content-Type": "application/json",
      },
    },
  }
);
The results should be identical, meaning that I can still move on with NotionToMarkdown from here, like I did before.
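Just to illustrate that the shape really is the same: the pages still live in the response’s results array. Here’s a minimal sketch of mapping them to post metadata — note that the Name title property is an assumption on my part; only Date actually appears in the query above:

// The cached response has the Notion Client's shape: pages live in `results`.
// Assumption: the database has a "Name" title property ("Date" comes from the sort above).
const posts = db.results.map((page) => ({
  id: page.id,
  title: page.properties.Name?.title[0]?.plain_text ?? "",
  date: page.properties.Date?.date?.start,
}));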
I’m doing the same thing again to actually fetch the content:
const url = `https://api.notion.com/v1/blocks/${id}/children?page_size=100`;

const response = await EleventyFetch(url, {
  duration: "7d",
  type: "json",
  fetchOptions: {
    headers: {
      Authorization: `Bearer ${process.env.NOTION_KEY}`,
      "Notion-Version": "2022-06-28",
      "Content-Type": "application/json",
    },
  },
});
Pagination
You see that page_size query parameter up there in the URL? That’s going to be bad news. I’m only fetching the first 100 blocks of the Notion document, and 100 is the largest page size available to me. The Notion Client manages all of that by itself, but when I do it manually, I need to loop over the pages myself.
I need that function in all of my data sources anyway, so I refactored it into a helper function:
const fetchNotionBlocks = async (id, blocks = [], cursor = null) => {
  let url = `https://api.notion.com/v1/blocks/${id}/children?page_size=100`;
  if (cursor) {
    url += `&start_cursor=${cursor}`;
  }
  const fetchOptions = {
    headers: {
      Authorization: `Bearer ${process.env.NOTION_KEY}`,
      "Notion-Version": "2022-06-28",
      "Content-Type": "application/json",
    },
  };
  const response = await EleventyFetch(url, {
    duration: "7d",
    type: "json",
    fetchOptions,
  });
  blocks.push(...response.results);
  if (response.has_more) {
    blocks = await fetchNotionBlocks(id, blocks, response.next_cursor);
  }
  return blocks;
};
This will loop through the pages by moving the cursor for each iteration until Notion tells me that I’m on the last one. Now I’ve got everything I had with the Notion Client, but cached.
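As a quick usage sketch (assuming the notion-to-md setup from the first part, with pageId as a placeholder), the collected blocks can go straight into NotionToMarkdown. The getContent helper I use further below could look roughly like this:

const { NotionToMarkdown } = require("notion-to-md");

// notion-to-md still needs a client instance to resolve nested blocks
const n2m = new NotionToMarkdown({ notionClient: notion });

// Hypothetical shape of a getContent helper: cached blocks in, markdown out
const getContent = async (blocks) => {
  const mdBlocks = await n2m.blocksToMarkdown(blocks);
  const md = n2m.toMarkdownString(mdBlocks);
  // notion-to-md v3 returns an object; older versions return a plain string
  return md.parent ?? md;
};

const blocks = await fetchNotionBlocks(pageId);
const content = await getContent(blocks);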
No Caching
Caching everything all the time is a problem, though. I’ve overshot the mark. When I change anything in an article, I won’t get the updates. Even entirely new articles won’t appear, because Notion’s overview is cached as well. I need to be smarter about it:
- For regular deployments, I want to have cached Notion responses for all articles but the latest one. I never want to have cached overview responses, so I can see new articles. I’ll use those deployments the most by far.
- Full non-cached deployments can already be triggered via Netlify. I’ll use them only for edge cases.
- There’s also a dev mode, which will cache all Notion requests. I’ll use that locally when I want fast build times and don’t care about up-to-date content.
- I also want a per-article deployment method. That one will skip the cache for the overview and for one given article. (A sketch of how these modes can be told apart follows right after this list.)
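Sketched out, the decision per article could look something like this. Two assumptions here: ELEVENTY_RUN_MODE is the variable Eleventy 2.x sets during builds, and INCOMING_HOOK_BODY is Netlify’s deploy-hook body, which I’ll get to in a moment:

// Sketch of the per-article cache decision. Assumptions:
// - ELEVENTY_RUN_MODE is "build" on Netlify and "serve"/"watch" locally (Eleventy 2.x)
// - INCOMING_HOOK_BODY carries a Notion page ID for per-article deploys (explained below)
const isDev = process.env.ELEVENTY_RUN_MODE !== "build";
const hookBody = process.env.INCOMING_HOOK_BODY;

const shouldSkipCache = (articleId, isLatest) => {
  if (isDev) return false; // dev mode: cache everything for fast local builds
  if (hookBody) return hookBody === articleId; // per-article deploy: refresh just this one
  return isLatest; // regular deploy: always refresh the latest article
};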
To get that, I need to switch from Eleventy-Fetch to plain fetch for requests that I don’t want to cache. I also need to know which article I want to un-cache.
So far, I have one central deploy button in my Notion space that triggers a Pipedream webhook (turning the GET into a POST), which in turn triggers a Netlify deploy hook. Netlify’s hook can read the POST body and expose it as an environment variable in the build script. That means I can add the Notion ID to the Pipedream hook as a query parameter, move it into the POST body, and finally read it in the build script to exclude this exact ID from being cached.
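For illustration, a Pipedream Node.js code step along these lines can do the forwarding. The build hook URL is a placeholder, and I’m assuming Pipedream’s HTTP trigger exposes query parameters on steps.trigger.event.query:

// Pipedream Node.js code step: forward the ?id= query parameter
// from the incoming GET request to Netlify's deploy hook as the POST body.
export default defineComponent({
  async run({ steps }) {
    const id = steps.trigger.event.query.id ?? "";
    // Placeholder: replace YOUR_HOOK_ID with the real Netlify build hook ID
    await fetch("https://api.netlify.com/build_hooks/YOUR_HOOK_ID", {
      method: "POST",
      headers: { "Content-Type": "text/plain" },
      body: id, // Netlify exposes this as INCOMING_HOOK_BODY during the build
    });
  },
});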
Now I can use the ID from Netlify’s environment variable INCOMING_HOOK_BODY as process.env.INCOMING_HOOK_BODY in my Eleventy data file:
const id = posts[i].id.replaceAll("-", "");
// don't cache the latest or the specified article
const skipCache =
  (!process.env.INCOMING_HOOK_BODY && i === 0) ||
  process.env.INCOMING_HOOK_BODY === id;
if (skipCache) {
  console.log("skipping cache for:", posts[i].title);
}
const blocks = await fetchNotionBlocks(posts[i].id, [], null, skipCache);
const post = await getContent(blocks);
…and push it through to the Notion API code:
const fetchNotionBlocks = async (
  id,
  blocks = [],
  cursor = null,
  skipCache = false
) => {
  // ...
  const response = skipCache
    ? await (await fetch(url, fetchOptions)).json()
    : await EleventyFetch(url, {
        duration: "7d",
        type: "json",
        fetchOptions,
      });
  // ...
};
Tada! Incremental Deploy Hooks with Eleventy and Netlify!
Visiting https://$PIPEDREAM_HOOK_ID.m.pipedream.net?id=$NOTION_PAGE_ID will now trigger a deployment that fetches fresh content for just that one article.