My first codestory. A day in my life, as a coder :)

Thursday, 2022-11-18

We're almost coming full circle. The OS is to be a framework that enables developers to build real-world applications and change the world. It's an idea that can potentially incubate many startups with real positive impact. The marketplace is full of boilerplates that have cost me just a couple of hours. Many models I made deserve their own startup.

I have to try to make an article, but it's not that easy. Maybe I should start by writing all my todos into a file, like this. If I can then augment the result with code snippets, we're already getting somewhere. No screenshots needed, just references.

Sooo this is it then! Let's start my day. I'm going to document the process. Every section should have some expandable items below it that show all the relevant pieces of code. I can make a function called markdownCodeToPostable that does this for me. It can generate a full-fledged markdown post that goes on multiple websites, and it can generate a spoiler, which just summarizes the day and maybe adds some good quotes. Both should later be edited further, manually, but both are potentially already good media posts.
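
Something like this, as a first sketch (the shape is just my guess at this point; nothing here exists yet):

// Hypothetical first sketch of markdownCodeToPostable.
type Postable = {
  /** full-fledged markdown that can go on multiple websites */
  markdown: string;
  /** a short summary of the day, maybe with some good quotes */
  spoiler: string;
};

const markdownCodeToPostable = (dayMarkdown: string): Postable => {
  // keep the narrative as the full post; code references would be
  // resolved into expandable snippet sections here
  const markdown = dayMarkdown;
  // naive spoiler for now: the first paragraph as a teaser
  const spoiler = dayMarkdown.split("\n\n")[0];
  return { markdown, spoiler };
};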

This is great. So let's start!


Bundle...

generateBundle gets me a bundle error! I need to check why tooltip isn't included and such. It's hard to debug... annoying!

Ahhhh... I found it.

Finding operations
k-dev ... new-template ... generate-sdk-operations ... markdown-reader-web ... markdown-reader-ui ... db-web ... function-server ... Finding inherited bundles
{ allInheritedSlugs: [ 'typerepo' ] }
Summarizing operations
Found 3 apps, 3 modules, 1 packages
Copying operations
Installing repo ^C

It seems that it's trying to build only the base, without all inherited bundles and their dependencies. Of course! This is not going to work if the summary is made from only the base, without any dependencies.

I think we should be able to leave all inherited slugs for what they are EXCEPT for the first one. The first one should be calculated, because that's the one you are rebuilding (in this case typerepo).

Let's make it so that it summarises the first inherited bundle.

10 minutes later: it seems the required stuff is not in generateBundle itself but in findAndCopyOperations.

I found it!

It seems that I needed to add the 4th line here: I forgot to actually attach the newly generated bundle configuration to the one scheduled for creation.

const allSummaries = slugsIncludingInherited
  .map((slug) => {
    // NB: include the one that we actually just made!
    if (slug === newBundleConfig.slug) return newBundleConfig;

    return bundleConfigs.find((x) => x.slug === slug);
  })
  .filter(notEmpty)
  .map(getBundleSummary);

Saved. Let's go over the rest of the file and find some other problems. When building, I encountered one newly introduced build error, which took me a minute to fix.

Let's build typerepo without errors!


Creating/updating bundled folder
{ realSkipPull: true }
Not pulling repo, creating new bundle from what we have
Finding operations for typerepo
k-dev ... new-template ... generate-sdk-operations ... markdown-reader-web ... markdown-reader-ui ... db-web ... function-server ... Finding inherited bundles
{ slugsIncludingInherited: [ 'typerepo' ] }
Summarizing operations
Found 3 apps, 118 modules, 1 packages
Copying operations


Wow, nice! This looks way better! 118 modules, that seems about right.

We still seem to have some problems though! Some packages are not included.


error Couldn't find package "react-with-native-modal@1.0.60" required by "db-web@0.1.0" on the "npm" registry.
Error: Couldn't find package "watch-all@0.0.1" required by "index-typescript@0.0.1" on the "npm" registry.


As you can see, it can't find all the code. In the end it crashes with Generating SDKs went wrong. This is most likely because I cleaned up some packages: I removed them, and they were never re-indexed after the cleanup. So let's fix that.
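
A minimal sketch of the kind of fix, assuming that re-running generateOperationsSdk (shown in full further down) refreshes the index so the removed packages drop out:

// Hedged sketch: re-index all operations so the sdk no longer
// references packages that were removed.
await generateOperationsSdk();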

generateBundle

generateBundle does the following

  • optionally, pulls new code from GitHub and removes everything that will be regenerated, but keeps some things that I don't maintain
  • uses bundleConfig to find all needed packages
  • wraps those packages in a bundled project folder structure
  • moves the created source to a specified path where a .git folder is located (along with the previous bundle version)
  • optionally, ships it to GitHub
async (
  /**
   * All bundle configs,
   */
  bundleConfigs: BundleConfig[],
  /**
   * The slug of the bundle config you want to generate the bundle for
   */
  slug: string,
  customisableBundleConfig?: CustomisableBundleConfig
): Promise<BundleConfig | undefined> => {
  const bundleConfig = bundleConfigs.find((x) => x.slug === slug);

  if (!bundleConfig) return;

  const finalBundleConfig = mergeBundleConfigs(
    bundleConfig,
    customisableBundleConfig
  );

  const {
    customisableBundleConfig: {
      debug,
      branchName,
      isOffline,
      skipPush,
      skipUpsert,
      gitUserEmail,
      gitUserName,
    },
    createBundleConfig: { informationStrategy },
  } = finalBundleConfig;

  let isReallyOffline = false;
  if (!isOffline) {
    isReallyOffline = !(await isOnline());

    if (isReallyOffline) {
      log("🌈 Couldn't find google, not pulling/pushing", { type: "important" });
      console.log("Possible reasons:", [
        "they blew up",
        "we're offline",
        "we're using a node version below v17",
      ]);
    }
  }

  // NB: if you are offline, skip is also needed
  const realSkipPush = isReallyOffline || isOffline || skipPush;

  console.log({ realSkipPush });
  const bundlePaths = getBundlePaths(bundleConfig);

  if (!bundlePaths) {
    log("Not all bundlepaths could be calculated", { type: "error" });
    return;
  }
  const { destinationFolderPath } = bundlePaths;

  // 1) update bundled folder so it's up to date, if not already
  log(`Creating/updating bundled folder`, { type: "important" });
  const isReady = await syncNicheFolder(finalBundleConfig, isReallyOffline);
  if (!isReady) return;

  // remove everything except .git folder and foldersFromRepo
  await removeAllExcept(destinationFolderPath, { ignore: [".git"] });

  // 2) make path `new-template/assets/templates/monorepo` to be copied into the destination
  const result = await newTemplate("monorepo", destinationFolderPath);

  if (result !== destinationFolderPath) {
    log("Something is weird, as the resulting path is not the destinationPath", {
      type: "error",
    });
    return;
  }

  // 3) find and copy all needed apps and packages into the monorepo
  const newBundleConfig = await findAndCopyOperations(
    bundleConfig,
    bundleConfigs,
    debug
  );

  // Try to install without sdk's
  yarnInstall(isOffline, destinationFolderPath);

  log(`Generating sdk operations`, { type: "important" });

  const isSuccessful = await generateSdkOperations(newBundleConfig, {
    yarnInstallAfter: true,
    yarnInstallBefore: true,
    manualProjectRoot: destinationFolderPath,
  });
  if (!isSuccessful) {
    log("Generating SDKs went wrong", { type: "error" });
    return;
  }

  // 5) DOCS AND README
  await copyDocsAndReadme(bundleConfig);

  // sync information
  await syncInformation(finalBundleConfig);

  if (!skipUpsert) {
    // Upsert
    // @ts-ignore
    await db.upsert("BundleConfig", newBundleConfig);
  }

  const {
    createBundleConfig: _,
    customisableBundleConfig: __,
    ...publicBundleConfig
  } = newBundleConfig;

  const packageJsonPath = path.join(
    bundlePaths.destinationFolderPath,
    "package.json"
  );

  await mapObjectJson(packageJsonPath, (packageJson) => {
    const repository: Operation["repository"] =
      bundleConfig.isGitRepoPublic && bundleConfig.gitRepoUrl
        ? { type: "git", url: bundleConfig.gitRepoUrl }
        : undefined;

    return {
      ...packageJson,
      description: bundleConfig.description,
      repository,
      homepage: bundleConfig.isGitRepoPublic
        ? bundleConfig.gitRepoUrl
        : undefined,
    };
  });

  // Write public bundle config to the bundle
  const publicBundleConfigPath = path.join(
    bundlePaths.destinationFolderPath,
    "public-bundle-config.json"
  );
  await writeJsonToFile(publicBundleConfigPath, publicBundleConfig);

  if (realSkipPush !== true) {
    const pushCommand = branchName
      ? `git push -u origin ${branchName}`
      : "git push";

    const setConfigEmailCommand = gitUserEmail
      ? `git config user.email "${gitUserEmail}"`
      : undefined;

    const setConfigNameCommand = gitUserName
      ? `git config user.name "${gitUserName}"`
      : undefined;

    const gitCommitCommand = `[[ \`git status --porcelain .\` ]] && git add . && git commit -m '${finalBundleConfig.customisableBundleConfig.description}'`;

    try {
      // NB: This doesn't work well with nested .git folders!
      execSync(gitCommitCommand, {
        cwd: destinationFolderPath,
        stdio: "inherit",
      });

      if (setConfigEmailCommand) {
        execSync(setConfigEmailCommand, {
          cwd: destinationFolderPath,
          stdio: "inherit",
        });
      }
      if (setConfigNameCommand) {
        execSync(setConfigNameCommand, {
          cwd: destinationFolderPath,
          stdio: "inherit",
        });
      }

      execSync(pushCommand, { cwd: destinationFolderPath, stdio: "inherit" });
    } catch {}
  }

  return newBundleConfig;
};
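
For reference, calling it looks roughly like this (a hedged sketch; the slug and options are illustrative):

// Regenerate the typerepo bundle from all known bundle configs.
const newBundleConfig = await generateBundle(bundleConfigs, "typerepo", {
  // e.g. while debugging, don't push the result
  skipPush: true,
});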

findAndCopyOperations

finds all operations and copies them into bundle location

async (
  /** NB: no finalBundleConfig because we are calculating a new original bundle config here */
  bundleConfig: BundleConfig,
  bundleConfigs: BundleConfig[],
  debug?: boolean
): Promise<BundleConfig> => {
  const { destinationFolderPath } = getBundlePaths(bundleConfig)!;

  const {
    createBundleConfig: { keepStructure, keepTodos },
  } = bundleConfig;

  log(`Finding operations for ${bundleConfig.name}`, { type: "important" });

  const dependencies: OperationPrivacy[] = await calculateBundleDependencies(
    bundleConfig.createBundleConfig,
    debug
  );

  const newBundleConfig: BundleConfig = {
    ...bundleConfig,
    createBundleConfig: { ...bundleConfig.createBundleConfig, dependencies },
  };

  log("Finding inherited bundles", { type: "important" });

  const slugsIncludingInherited = findInherited(
    newBundleConfig.slug,
    bundleConfigs
  );
  console.log({ slugsIncludingInherited });

  log("Summarizing operations", { type: "important" });

  const allSummaries = slugsIncludingInherited
    .map((slug) => {
      // NB: include the one that we actually just made!
      if (slug === newBundleConfig.slug) return newBundleConfig;

      return bundleConfigs.find((x) => x.slug === slug);
    })
    .filter(notEmpty)
    .map(getBundleSummary);

  const appNames = allSummaries
    .map((x) => x.appNames)
    .flat()
    .filter(onlyUnique2());
  const moduleNames = allSummaries
    .map((x) => x.moduleNames)
    .flat()
    .filter(onlyUnique2());
  const packageNames = allSummaries
    .map((x) => x.packageNames)
    .flat()
    .filter(onlyUnique2());

  console.log(
    `Found ${appNames.length} apps, ${moduleNames.length} modules, ${packageNames.length} packages`
  );

  const operationsFoldersObject = {
    apps: appNames,
    packages: packageNames,
    modules: moduleNames,
  };

  const operationFolders = Object.keys(
    operationsFoldersObject
  ) as (keyof typeof operationsFoldersObject)[];

  if (keepStructure && keepTodos) {
    log("Copying todos", { type: "important" });

    await copyTodosIntoBundle(destinationFolderPath);
  }

  log(
    `Copying operations ${
      bundleConfig.createBundleConfig.keepStructure ? "including readmes" : ""
    }`,
    { type: "important" }
  );

  const promises = operationFolders
    .map((folder) => {
      const operationNames = operationsFoldersObject[folder];

      const promises = operationNames.map((operationName) =>
        copyOperation({
          operationName,
          folder,
          bundleConfig,
          destinationFolderPath,
        })
      );

      return promises;
    })
    .flat();

  await Promise.all(promises);

  return newBundleConfig;
};


Fixing bugs in rebuild operation

Let's start with a singular rebuildOperation on rebuild-operation, and if that seems fine, let's run rebuildAllOperations.
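
In code, that's roughly (the path is illustrative):

// Rebuild a single operation first...
await rebuildOperation({
  operationBasePath: "/path/to/operations/rebuild-operation",
  force: true,
});

// ...and if that looks fine, rebuild everything:
await rebuildAllOperations();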

Seems fine! I love running rebuildAllOperations. It really shows how much can quickly happen in the background. It's indexing hundreds of functions there.

I can't wait to make watchAll work with TypeScript indexation though; that will be a huge usability improvement for typerepo.

But yeah... This rebuilding is going to take an hour, if not more. I should probably do something else in the meantime. Gotta keep a close eye on it though, in case anything goes wrong.

Let's scroll through my todolist.

I scrolled through a little, but I saw that the rebuilder had some packages unresolved. Let's run generateOperationsSdk to fix that. The great thing about rebuildAllOperations is that it just skips the ones that I already did before, so I can easily quit...

Anyway, @guilherme's package open-zip-rar seems to have issues building, but it's not relevant for any of my bundles, so it's no biggie. It failed, but the builder is continuing. Same for pdf-to-md. I probably don't have the required libs, but I'm not going to turn on my internet just for that.

Oh wow, I'm noticing quite a bug! It says Operation wasn't found /Users/king/King/operations/tools/purpose/codebase-introspection/database/db-web. Looking it up, the string there should've been an operation name, yet it's a PATH! That's not good. Let's fix it.

After some searching I found that it's in here: rebuildOperation --> indexTypescriptFile --> findAndUpsertTsInterfaces --> generateSchema.

I found it. I did this quickly and just didn't understand my own naming. I gotta be even more explicit!


    await db.update(
      "Operation",
      (item) => item.name === operationFolderName,
      (old) =>
        mergeNestedObject(old, {
          operation: {
            buildResultIndexed: { indexInteracesErrors: [problem] },
          },
        }),
      { operationName: operationFolderName }
    );

I set operationName to operationFolderName, but operationFolderName seems to contain a full path! Not a folder name.

Bad naming! Fixed by changing this:

const operationFolderName = getFolder(operationBasePath);

into this

const operationFolderName = getLastFolder(operationBasePath);

Tiny difference, changing getFolder into getLastFolder, but it has a huge impact! They return completely different things!
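
For illustration, a hedged sketch of the difference (this stand-in is inferred from the name alone, not typerepo's actual implementation):

import * as path from "path";

// Hypothetical stand-in: only the deepest folder name in the path.
const getLastFolder = (folderPath: string): string =>
  path.basename(folderPath);

const operationBasePath =
  "/Users/king/King/operations/tools/purpose/codebase-introspection/database/db-web";

getLastFolder(operationBasePath); // "db-web" (matches item.name in the db)
// A getFolder-style helper yields a full folder path instead, so the lookup
// (item) => item.name === operationFolderName never matches anything.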

I wish the linter could've picked that up, but TypeScript isn't that far yet. I wish we could have a semantics linter... Maybe soon!

rebuildOperation

This function rebuilds an operation and re-indexes (part of) its files.

async (config: {
  /** last date when the rebuild-operation operation was updated (or any of its dependencies) */
  updatedAt?: number;
  /** If given, uses this as project root for the rebuildOperation dependencies, instead of the calculatable one */
  typerepoManualProjectRoot?: string;

  /**
   * If given, uses this as project root for finding things from the database for the operation that needs to be rebuilt
   */
  operationManualProjectRoot?: string;

  /** Full path to operation folder or any file therein */
  operationBasePath: string;
  /** If not given, explores all files in the src folder of the operation. If given, must be an array of absolute file paths. It is not supported to index index files, as this creates duplicate and incorrect interfaces. */
  filePaths?: string[];
  /** used for stopping recursion */
  noUnresolvedRebuilding?: boolean;
  /** if true, will not skip if nothing changed */
  force?: boolean;
  /** show logs */
  debug?: boolean;
  /** normally, it exits if the operation that was rebuilt was itself or one of its dependencies. Handy for watchOperations in combination with nodemon. If we don't want this behavior, provide noExit  */
  noExit?: boolean;
  /** stack of recursion of module names */
  stack?: string[];
}): Promise<boolean> => {
  const {
    operationBasePath,
    force,
    debug,
    noExit,
    stack = [],
    updatedAt,
    noUnresolvedRebuilding,
    operationManualProjectRoot,
    typerepoManualProjectRoot,
  } = config;

  let { filePaths } = config;
  const operationName = getLastFolder(operationBasePath);
  const packageJson = await getPackageJson({
    operationFolderPath: operationBasePath,
  });
  if (
    isSdkOperation(operationBasePath) ||
    isGeneratedOperation(operationBasePath)
  ) {
    console.log(`not going to rebuild sdk operation: ${operationName}`);
    return false;
  }

  const stackPrefix = `${stack.map(() => `--`).join("")}${operationName}: `;
  log(
    `${stackPrefix}Rebuilding${
      stack.length > 0 ? `(coming from ${stack.join(", ")})` : ""
    }`,
    {
      type: "important",
    }
  );

  log(`${stackPrefix}Pre-index lint`, { type: "important" });
  const lintProblems = await preIndexLint({
    operationFolderPath: operationBasePath,
  });

  if (lintProblems.length > 0) {
    log(`${stackPrefix}Operation cannot be built, we've got problem(s)`, {
      type: "warning",
    });
    log(lintProblems.join("\n"), { type: "warning" });

    await db.update(
      "Operation",
      () => true,
      (old) => {
        setKeyAtLocation(
          "operation.buildResultIndexed.lintProblems",
          lintProblems,
          old
        );

        return old;
      },
      { operationName }
    );

    return false;
  }

  const skip = await shouldSkip({
    operationBasePath,
    debug,
    force,
    rebuildUpdatedAt: updatedAt,
    operationManualProjectRoot,
  });
  if (skip) {
    log(`Skipping ${operationName}`);
    return true;
  }

  const result = await cleanupTsDatabase(
    operationName,
    operationManualProjectRoot
  );

  log(
    result?.amountRemoved
      ? `Removed ${result?.amountRemoved} ts db instances`
      : "Nothing to clean up",
    { type: "success" }
  );

  executeCommandQuietUnlessFail({
    command: "yarn --offline",
    cwd: operationBasePath,
    description: `${stackPrefix}Installing`,
  });

  // 2a) finding imports/exports and writing them to index
  // TODO: we can also just check if build folder and index.js exist before looking at the import statements. These are easy to detect and when that happens we don't need to do the things below.
  log(`${stackPrefix}Getting imports/exports`, { type: "important" });

  await runChildProcess({
    operationFolderName: "get-imports-exports",
    scriptFileName: "findAndWriteImportsExports.cli",
    args: [operationBasePath, operationManualProjectRoot],
  });

  /// SDK SHIT

  if (operationName !== "sdk") {
    // get all newly generated imports through the db (NB: old index files have just been removed)
    const imports = await db.get("TsImport", {
      operationName: operationName,
      manualProjectRoot: operationManualProjectRoot,
    });
    // find the ones that are unresolved
    const unresolvedModules = imports
      .filter(
        (x) => x.isAbsolute && x.isModuleFromMonorepo && !x.isModuleResolved
      )
      .map((x) => x.module)
      .filter(onlyUnique);

    // console.log({
    //   imports: imports.length,
    //   unresolvedModules: unresolvedModules.length,
    // });

    // if there are any, we need to rebuildOperation for those modules and then rebuild ourselves again
    // NB: we can't do this if we already did this before
    if (unresolvedModules.length > 0) {
      if (noUnresolvedRebuilding) {
        log(
          `rebuilding the unresolved modules didn't work. Probably some indexation fails!`,
          { type: "error" }
        );

        await db.update(
          "Operation",
          () => true,
          (old) => {
            setKeyAtLocation(
              "operation.buildResultIndexed.dependencyBuildFailed",
              true,
              old
            );

            return old;
          },
          { operationName, manualProjectRoot: operationManualProjectRoot }
        );

        return false;
      }

      log(
        `${stackPrefix}We need to rebuild ${
          unresolvedModules.length
        } operations because they have conflicts (${unresolvedModules.join(
          ", "
        )})`,
        { type: "warning" }
      );
      const succeededArray = await oneByOne(
        unresolvedModules,
        async (unresolvedOperationName) => {
          const fullPath = await getOperationPath(unresolvedOperationName, {
            manualProjectRoot: operationManualProjectRoot,
          });
          if (!fullPath) {
            log(`${stackPrefix}${unresolvedOperationName} not found`, {
              type: "warning",
            });
            return false;
          }

          if (
            unresolvedOperationName === operationName ||
            stack.includes(unresolvedOperationName)
          ) {
            log(`${stackPrefix}cyclic dep`, { type: "warning" });
            return false;
          }

          console.log(`${stackPrefix}diving into new rebuildOperation `, {
            operationName,
            stack,
            unresolvedOperationName,
          });
          return rebuildOperation({
            operationManualProjectRoot,
            typerepoManualProjectRoot,
            operationBasePath,
            stack: stack.concat([unresolvedOperationName]),
            debug,
            // can't skip this one because it is a dependency
            // however, skipping is very well defined. we can skip if shouldSkip is true!
            // force: true,
            noExit,
          });
        }
      );

      // NB: we don't rebuild this operation again if one of the dependency builds failed...
      if (!isAllTrue(succeededArray)) {
        log(`${stackPrefix}something failed! returning`);

        await db.update(
          "Operation",
          () => true,
          (old) => {
            setKeyAtLocation(
              "operation.buildResultIndexed.dependenciesBuildsFailed",
              true,
              old
            );
            return old;
          },
          { operationName, manualProjectRoot: operationManualProjectRoot }
        );

        return false;
      }

      log(`${stackPrefix}rebuilding ourselves again`);
      // NB: all files on purpose...
      return rebuildOperation({
        operationManualProjectRoot,
        typerepoManualProjectRoot,
        operationBasePath,
        debug,
        force,
        noExit,
        noUnresolvedRebuilding: true,
        stack: stack.concat([operationName]),
      });
    } else {
      log(`${stackPrefix}all imports were resolved`);
    }
  }

  // 2b) compiling and writing build errors to index
  log(`${stackPrefix}writing build errors SKIPPED due to memory bug`, {
    type: "important",
  });

  // await runChildProcess({
  //   operationFolderName: "compile-typescript",
  //   scriptFileName: "writeBuildErrors.cli",
  //   args: [
  //     operationBasePath,
  //     operationManualProjectRoot,
  //     typerepoManualProjectRoot,
  //   ],
  // });
  // read errors...
  // TODO: if this returns errors, don't continue

  // 3. creating remaining operation index files
  if (!filePaths) {
    //files from src
    filePaths = (
      await Promise.all(
        await getPackageSourcePaths({
          operationBasePath,
          ignoreIndexFiles: true,
        })
      )
    ).filter(notEmpty);
  }

  if (filePaths.length === 0) {
    log(`${stackPrefix}No files found for operation ${operationName}`, {
      type: "error",
    });
  } else {
    log(`${stackPrefix}${filePaths.length} files to index:`, {
      type: "important",
    });
  }

  // first index the files that have changed
  await runChildProcess({
    operationFolderName: "index-typescript",
    scriptFileName: "cli",
    args: [...filePaths, operationManualProjectRoot || "null"],
  });
  // after that's done, generate cli's where possible. only if it's a node operation (or can I use node for js functions that also run on the frontend?) if not, I might be better off generating a [operation-name]-cli operation for js-only operations...

  // first step, just

  const indexPath = await generateSimpleIndex({
    operationName,
    manualProjectRoot: operationManualProjectRoot,
  });

  // // because we generated a new index, let's also reindex that file!
  // TODO: Figure out if this is REALLY NEEDED... I guess we can also infer which things are in the index, and we don't want to index things here except maybe the imports/exports!
  // if (indexPath) {
  //   await runChildProcess({
  //     operationFolderName: "index-typescript",
  //     scriptFileName: "cli",
  //     args: manualProjectRoot ? [indexPath, manualProjectRoot] : [indexPath],
  //   });

  //   log(`indexed index :)`, {
  //     type: "success",
  //   });
  // }

  const isBuildNeeded = isOperationBuildNeeded(operationBasePath);

  let buildSucceeded = true;

  // NB: no build, no tests (for now)
  if (isBuildNeeded) {
    const skipMinify = packageJson?.operation?.skipMinify;

    buildSucceeded = await yarnBuild(operationBasePath, {
      rmFirst: false,
      skipMinify,
    });

    // TESTING EVERYTHING, including all dependant packages

    const imports = await db.get("TsImport", {
      manualProjectRoot: operationManualProjectRoot,
    });
    // find the ones that are unresolved
    const dependantOperationNames = imports
      .filter((x) => x.isModuleFromMonorepo && x.module === operationName)
      .map((x) => x.operationName)
      .filter(onlyUnique)
      .filter(notEmpty);

    const testableOperations = [operationName, ...dependantOperationNames];
    const testPromises = testableOperations.map((operationName) =>
      // NB: we need this to be a child process because it requires the tests from the index, and that file changes, while normally a require will be cached. We can't easily invalidate the cache because it can come from many files.
      runChildProcess({
        operationFolderName: "k-test",
        scriptFileName: "runTestsForOperation.cli",
        args: [operationName, operationManualProjectRoot],
      })
    );

    await Promise.all(testPromises);
  }

  await generateJsonSchemas(operationManualProjectRoot, operationName);

  await db.update(
    "Operation",
    () => true,
    (old) => {
      setKeyAtLocation(
        "operation.buildResultIndexed.buildSucceeded",
        true,
        old
      );
      return old;
    },
    { operationName, manualProjectRoot: operationManualProjectRoot }
  );

  const operationSummary = await getOperationSummary({
    operationName,
    manualProjectRoot: operationManualProjectRoot,
  });

  if (operationSummary) {
    // make a readme of the new index
    await operationToMarkdown({ operationSummary, returnType: "save" });
  }

  await db.update(
    "Operation",
    () => true,
    (old) => {
      setKeyAtLocation(
        "operation.buildResultIndexed.indexImportExportError",
        "",
        old
      );
      setKeyAtLocation(
        "operation.buildResultIndexed.lintProblems",
        lintProblems,
        old
      );

      return old;
    },
    { operationName, manualProjectRoot: operationManualProjectRoot }
  );

  await recalculateOperationIndexJson(
    operationBasePath,
    operationManualProjectRoot
  );

  if (!noExit) {
    await exitIfProcessDependenciesChanged(
      operationName,
      operationManualProjectRoot
    );
  }

  return true;
};

rebuild-operation

I didn't write a good description for this yet. Please let me know if you want to know more

rebuildAllOperations

Rebuilds all operations that are needed to be rebuilt

async (
  /**
   * If true, you are indicating that the rebuilding process has changed and all operations should be rebuilt after this date.
   */
  isRebuildingProcessUpdated?: boolean,
  manualProjectRoot?: string
) => {
  const configPath = path.join(__dirname, "..", "config.json");

  if (isRebuildingProcessUpdated) {
    await writeJsonToFile(configPath, { updatedAt: Date.now() });
  }

  const config = await readJsonFile<{ updatedAt: number }>(configPath);

  forAllFolders({
    type: "operations",
    basePath: getPathsWithOperations({ manualProjectRoot }),
    callback: async (folderPath, index) => {
      log(`#${index}: Let's do ${folderPath}`, { type: "success" });

      await rebuildOperation({
        operationBasePath: folderPath,
        noExit: true,
        updatedAt: config?.updatedAt,
      });

      return;
    },
  });
};

watchAll

👁 👁 Finds all watchers within typerepo and ensures they all start watching their watch

async (config?: {
  /**
   * Overwrite the default ignored behavior
   */
  customIgnored?: string[];
}) => {
  const projectRoot = getProjectRoot();
  if (!projectRoot) return;

  // NB: fix to globally alter real fs in order to fix EMFile error that happens in TSMorph (see https://github.com/isaacs/node-graceful-fs)
  gracefulFs.gracefulify(realFs);

  console.log("Searching...");
  const tsFunctions = await db.get("TsFunction");

  const projectWatcherTsFunctions = tsFunctions.filter(
    (x) => x.explicitTypeName === "ProjectWatcher"
  );

  const projectWatchers = projectWatcherTsFunctions
    .map((x) => x.name)
    .map((name) => sdk[name as keyof typeof sdk] as ProjectWatcher | undefined)
    .filter(notEmpty);

  log(`${projectWatchers.length} watchers gonna watch ${projectRoot}`, {
    type: "important",
  });

  const startupWaitMs = 1000;
  setTimeout(() => {
    log(
      `There they are! \n\n${projectWatchers
        .map((projectWatcher) => {
          return `👁 👁 ${projectWatcher.name} ✅`;
        })
        .join("\n")}`,
      {
        type: "success",
      }
    );
  }, startupWaitMs);

  const startTimeAt = Date.now();

  watch(projectRoot, {
    ignoreInitial: true,
    ignored: config?.customIgnored || [
      "**/node_modules/**",
      "**/.next/**",
      "**/.expo/**",
      "**/build/**",
      "**/db/**",
      "**/.git/**",
      "**/.turbo/**",
      "**/generated/**",
    ],
    // alwaysStat: true, // not sure why I would need this, seems inefficient if I don't need it, I can simply run fs.stat
  }).on("all", (eventName, path, stats) => {
    if (Date.now() < startTimeAt + startupWaitMs) return;

    const relevantWatchers = projectWatchers.filter((watcher) =>
      watcher.filter(eventName, path)
    );

    oneByOne(relevantWatchers, async (projectWatcher) => {
      await projectWatcher(eventName, path);
    });
  });
};
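
Starting it is then a one-liner (the custom ignore list here is illustrative):

// Start every registered ProjectWatcher; optionally override what's ignored.
await watchAll({
  customIgnored: ["**/node_modules/**", "**/build/**", "**/.git/**"],
});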


generateOperationsSdk

sdk-operations indexes all operations and builds an object containing all operations.

async (config?: {
  manualProjectRoot?: string;
  skipYarnInstall?: boolean;
  dryrun?: boolean;
}) => {
  const skipYarnInstall = config?.skipYarnInstall;
  const dryrun = config?.dryrun;
  const manualProjectRoot = config?.manualProjectRoot;
  const projectRoot = manualProjectRoot || getProjectRoot();
  if (!projectRoot) return;

  const operationFolderPaths = await exploreOperationFolders({
    basePath: getPathsWithOperations({ manualProjectRoot }),
  });

  const operationNamePathRows = operationFolderPaths.map(
    (operationFolderPath) => {
      const operationName = getLastFolder(operationFolderPath);

      return `"${operationName}": "${makeRelative(
        operationFolderPath,
        projectRoot
      )}"`;
    }
  );
  const operationObjectString = `export const operations = { ${operationNamePathRows.join(
    ",\n"
  )} };`;

  await newOperationWithFiles(
    "sdk-operations",
    await getSdkDescription("sdk-operations"),
    { "src/sdk-operations.ts": operationObjectString },
    { overwriteIfExists: true, skipYarnInstall, manualProjectRoot, dryrun }
  );
};
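
The generated src/sdk-operations.ts then looks roughly like this (the names and paths here are illustrative):

export const operations = {
  "rebuild-operation": "operations/tools/rebuild-operation",
  "generate-sdk-operations": "operations/tools/generate-sdk-operations",
  "db-web": "operations/apps/db-web",
};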

pdf-to-md

I didn't write a good description for this yet. Please let me know if you want to know more

indexTypescriptFile

async (
  project: Project,
  file: CompleteOperationPathParse,
  projectRoot: string
) => {
  const problems: string[] = [];

  const { filePath, operationName, operationRelativeTypescriptFilePath } = file;
  if (!operationName) return;

  // console.log(`indexing file`, {
  //   operationName,
  //   filePath,
  //   operationRelativeTypescriptFilePath,
  // });
  // END VALIDATION

  const fileContent = await fs.readFile(filePath, "utf8");

  //select correct SourceFile from tsmorph project
  const sourceFile = project.getSourceFile(filePath);

  if (!sourceFile) {
    const problem = `couldn't load file ${filePath}`;
    problems.push(problem);

    await db.update(
      "Operation",
      (item) => item.name === operationName,
      (old) =>
        mergeNestedObject(old, {
          operation: { buildResultIndexed: { indexInteracesErrors: problems } },
        }),
      { operationName }
    );

    log(problem, { type: "error" });
    return;
  }

  const newTsInterfaces = await findAndUpsertTsInterfaces({
    filePath,
    sourceFile,
    operationName,
    projectRoot,
  });

  if (!newTsInterfaces) {
    log("Shouldn't happen but tsInterfaces is undefined here...");
    return;
  }

  const allTsInterfaces = await db.get("TsInterface");

  const allWithNewTsInterfaces = [
    ...newTsInterfaces,
    ...allTsInterfaces,
  ].filter(onlyUnique2<Creation<TsInterface>>((a, b) => a.name === b.name));
  // NB: interfaces are a prerequisite for function...

  // console.log({
  //   newTsInterfaces: newTsInterfaces.length,
  //   allTsInterfaces: allTsInterfaces.length,
  //   allWithNewTsInterfacesUnique: allWithNewTsInterfaces.length,
  // });
  // TODO:
  const tsLintWarnings: TsLintWarning[] = [];

  // TODO: get main comment from top of file or associated md
  const mainComment = undefined;
  const pathMetaData = await calculatePathMetaData(filePath);

  const { tsFunctions, tsVariables } = await getTsStatements(
    sourceFile,
    allWithNewTsInterfaces,
    operationRelativeTypescriptFilePath,
    fileContent
  );

  // gets all top level statements
  const topLevelComments: Creation<TsComment>[] = sourceFile
    .getStatementsWithComments()
    .map((x) => {
      const comments = getAllComments(
        x,
        fileContent,
        operationRelativeTypescriptFilePath
      );
      return comments;
    })
    .flat();

  const functionComments: Creation<TsComment>[] = tsFunctions
    .map((f) => f.commentsInside)
    .flat();
  const interfaceComments: Creation<TsComment>[] = newTsInterfaces
    .map((f) => f.commentsInside)
    .flat();
  const variableComments: Creation<TsComment>[] = tsVariables
    .map((f) => f.comments)
    .flat();

  // TODO: get all top level comments from the statements, but also get all comments already found in functions, variables, and interfaces, put together.
  const tsComments: Creation<TsComment>[] = [
    topLevelComments,
    functionComments,
    interfaceComments,
    variableComments,
  ].flat();

  // Inserting all results into the database...

  // @ts-ignore
  await db.remove(
    "TsFunction",
    (fn) =>
      fn.operationRelativeTypescriptFilePath ===
        operationRelativeTypescriptFilePath &&
      !tsFunctions.map((x) => x.name).includes(fn.name),
    { operationName, manualProjectRoot: projectRoot }
  );
  // @ts-ignore
  await db.upsert("TsFunction", tsFunctions, {
    operationName,
    manualProjectRoot: projectRoot,
  });

  await db.remove(
    "TsVariable",
    (v) =>
      v.operationRelativeTypescriptFilePath ===
        operationRelativeTypescriptFilePath &&
      !tsVariables.map((x) => x.name).includes(v.name),
    { operationName, manualProjectRoot: projectRoot }
  );
  await db.upsert("TsVariable", tsVariables, {
    operationName,
    removeUntouched: true,
    manualProjectRoot: projectRoot,
  });

  await db.remove(
    "TsComment",
    (c) =>
      c.operationRelativeTypescriptFilePath ===
      operationRelativeTypescriptFilePath,
    { operationName, manualProjectRoot: projectRoot }
  );
  await db.upsert("TsComment", tsComments, {
    operationName,
    removeUntouched: true,
    manualProjectRoot: projectRoot,
  });
};
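As an aside: onlyUnique2 (used above to dedupe interfaces by name) is a comparator-based uniqueness filter. A minimal sketch of how such a filter can be written — my guess at its shape, the real typerepo helper may differ:

// Sketch (assumed implementation): keep only the first occurrence of each
// equivalence class, as defined by the isEqual comparator.
const onlyUnique2 =
  <T>(isEqual: (a: T, b: T) => boolean) =>
  (item: T, index: number, array: T[]) =>
    array.findIndex((other) => isEqual(other, item)) === index;

// [1, 2, 2, 3].filter(onlyUnique2<number>((a, b) => a === b)) // -> [1, 2, 3]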

findAndUpsertTsInterfaces

async (config: {
  /**
   * If not provided, will load the project at the operation base path of the filepath, and get the source file at the filePath
   */
  sourceFile?: SourceFile;
  operationName: string;
  /**
   * path of the file to find TsInterfaces in
   */
  filePath: string;
  projectRoot?: string;
}): Promise<TsInterface[] | undefined> => {
  const { filePath, operationName, projectRoot } = config;
  let { sourceFile } = config;

  const operationBasePath = findOperationBasePath(filePath);
  if (!operationBasePath) return;

  if (!sourceFile) {
    const project = getTsMorphProject(operationBasePath);
    if (!project) return;
    sourceFile = project.getSourceFile(filePath);
  }

  if (!sourceFile) {
    console.log("Filepath not existing");
    return;
  }

  // NB: we need to get the named absolute import names because there may be type interfaces in there that we should add into our database!
  const namedAbsoluteImportNames = sourceFile
    .getImportDeclarations()
    .map((importDeclaration) => {
      const module = String(
        importDeclaration.getModuleSpecifier().getLiteralText()
      );
      if (isAbsoluteImport(module)) {
        const namedImports: string[] = importDeclaration
          .getNamedImports()
          .map((x) => x.getName());

        return namedImports;
      }
    })
    .filter(notEmpty)
    .flat();

  const morphInterfaceInfo: MorphInterfaceInfo[] = sourceFile
    .getInterfaces()
    .map((x) => ({
      hasGeneric: getHasGeneric(x),
      raw: x.getFullText(),
      name: x.getName(),
      isExported: x.isExported(),
      description: x
        .getLeadingCommentRanges()
        .map((x) => x.getText())
        .join("\n\n"),
      extensions: x.getExtends().map((x) => x.getText()),
    }));

  const morphTypeInfo: MorphInterfaceInfo[] = sourceFile
    .getTypeAliases()
    .map((x) => {
      const isExported = x.isExported();
      const name = x.getName();
      return {
        hasGeneric: getHasGeneric(x),
        raw: x.getFullText(),
        isExported,
        description: x
          .getLeadingCommentRanges()
          .map((x) => x.getText())
          .join("\n\n"),
        name,
        extensions: [],
      };
    });

  const morphTypesAndInterfacesInfo = morphTypeInfo.concat(morphInterfaceInfo);

  const tsInterfaces = await generateSchema(
    filePath,
    morphTypesAndInterfacesInfo,
    namedAbsoluteImportNames
  );

  const operationRelativeTypescriptFilePath = getOperationRelativePath(
    filePath,
    operationBasePath
  );
  // console.log({
  //   morphTypeNames: morphTypesAndInterfacesInfo.map((x) => x.name),
  //   namedAbsoluteImportNames,
  //   tsInterfacesLength: tsInterfaces.length,
  // });

  // @ts-ignore
  await db.remove(
    "TsInterface",
    (i) =>
      i.operationRelativeTypescriptFilePath ===
        operationRelativeTypescriptFilePath &&
      !tsInterfaces.map((x) => x.name).includes(i.name),
    { operationName, manualProjectRoot: projectRoot }
  );
  // @ts-ignore
  const result = await db.upsert("TsInterface", tsInterfaces, {
    operationName,
    removeUntouched: true,
    manualProjectRoot: projectRoot,
  });

  // log(`Done`, { type: "debug" }, result);
  return tsInterfaces;
};

generateSchema

If the existing schema is not stale, just require it.
Otherwise, generate it for the file.

NB: The createGenerator function also finds imported TsInterfaces, which leads to duplicate TsInterfaces. Since the interfaces are pushed to the slug filename, this is no problem though: there should not be any duplication!

async (
  filePath: string,
  morphInterfaceInfo: MorphInterfaceInfo[],
  namedAbsoluteImportNames: string[]
): Promise<TsInterface[]> => {
  // console.log({ filePath, namedAbsoluteImportNames });
  const problems: string[] = [];

  const operationBasePath = findOperationBasePath(filePath);
  if (!operationBasePath) {
    log("No operation base path");
    return [];
  }
  const operationRelativePath = makeRelative(filePath, operationBasePath);
  const operationFolderName = getLastFolder(operationBasePath);
  if (operationRelativePath === "src/index.ts") {
    // should not index index
    log("This should never happen, operationRelativePath is src/index");
    return [];
  }

  const tsConfigPath = path.join(operationBasePath, "tsconfig.json");
  const tsConfigExists = fs.existsSync(tsConfigPath);

  if (!tsConfigExists) {
    const problem = `no tsconfig found for ${filePath}, not generating schemas`;
    log(problem, {
      type: "error",
    });
    problems.push(problem);

    await db.update(
      "Operation",
      (item) => item.name === operationFolderName,
      (old) =>
        mergeNestedObject(old, {
          operation: { buildResultIndexed: { indexInteracesErrors: problems } },
        }),
      { operationName: operationFolderName }
    );

    return [];
  }

  // TODO: check the defaults and possibilities in the docs/readme
  const config: Config = {
    // skipTypeCheck: true,
    path: filePath,
    tsconfig: tsConfigPath,
    skipTypeCheck: true,
    type: "*", // Or  if you want to generate schema for that one type only
  };
  const { schema, error } = tryCreateSchema(config);

  if (!schema || !schema.definitions) {
    const problem = `No schema/definitions found for ${filePath}. Error: ${error}`;
    log(problem, { type: "warning" });

    await db.update(
      "Operation",
      (item) => item.name === operationFolderName,
      (old) =>
        mergeNestedObject(old, {
          operation: {
            buildResultIndexed: { indexInteracesErrors: [problem] },
          },
        }),
      { operationName: operationFolderName }
    );

    return [];
  }

  const interfacePromises = Object.keys(schema.definitions).map((typeName) => {
    const thisMorphInterfaceInfo = morphInterfaceInfo.find(
      (x) => x.name === typeName
    );

    const tsMorphFoundTypeAlso = !!thisMorphInterfaceInfo;
    const isImportedType = namedAbsoluteImportNames.includes(typeName);
    const isNamedParameters = typeName.startsWith("NamedParameters");
    if (tsMorphFoundTypeAlso || isImportedType || isNamedParameters) {
      return schemaToTsInterface(
        filePath,
        typeName,
        schema,
        thisMorphInterfaceInfo
      );
    }

    // console.log({ definitionNames: Object.keys(schema.definitions) });

    log(
      `Skipping type ${typeName}`,
      { type: "debug" },
      { tsMorphFoundTypeAlso, isImportedType, isNamedParameters }
    );

    // NB: only the interfaces declared in this file end up in the database! otherwise you'll get duplicates anyway.

    // NB: we are still allowing absolute imported types to end up in the database. They will not be exported from our index, but we still need them for some frontend-generation tasks.
    return;
  });

  const interfaces = (await Promise.all(interfacePromises)).filter(notEmpty);

  return interfaces;
};
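Side note: the Config above matches ts-json-schema-generator's API, so tryCreateSchema is presumably a thin error-capturing wrapper around its createGenerator. A sketch of what that wrapper could look like (the wrapper shape is my assumption):

import { Config, createGenerator, Schema } from "ts-json-schema-generator";

// Assumed shape of tryCreateSchema: wrap createGenerator so schema-generation
// failures come back as an error string instead of throwing.
const tryCreateSchema = (
  config: Config
): { schema?: Schema; error?: string } => {
  try {
    const schema = createGenerator(config).createSchema(config.type);
    return { schema };
  } catch (e) {
    return { error: e instanceof Error ? e.message : String(e) };
  }
};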

getFolder

if the path exists:

  • if the pathString is a folder, that is returned.
  • if the pathString is not a folder, returns the pathString without the file suffix

if the path doesn't exist: returns the pathString without the last chunk (this only works for file paths)

(pathString: string) => {
  const parsedPath = path.parse(pathString);
  const hasExtension = parsedPath.ext.length > 0;

  if (hasExtension) {
    // NB: assume it's a file, let's avoid folders with dots!
    const pathChunks = pathString.split("/");
    pathChunks.pop(); //remove the filename
    return pathChunks.join("/");
  } else {
    // NB: it's already a folder!
    return pathString;
  }
};
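A quick usage sketch (the paths are illustrative):

getFolder("/Users/king/project/src/file.ts"); // -> "/Users/king/project/src"
getFolder("/Users/king/project/src"); // -> "/Users/king/project/src"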

getLastFolder

returns the name of the deepest folder in the path (for a file path, that is the name of the folder containing the file)

input: /Users/king/Documents/some/folder/xyz
output: xyz

input: /Users/king/Documents/some/folder/xyz.txt
output: folder

(pathString: string) => {
  const lastFolder = getFolder(pathString).split("/").pop()!;
  // console.log({ pathString, lastFolder });
  return lastFolder;
};


Let's continue...

Ok, let's run rebuildAllOperations again, now without that bug. In the meantime, I'm going to see how I can make the database faster.

The main function is getDbModel. The main reason it's slow is this query, which runs whenever any model is accessed:

const indexes = (await db.get("TsInterface")).filter(
  (x) => x.name === interfaceName
);

It's looking for any interface, anywhere, because it doesn't know the location of an interface based on its name. Let's create a mapped index! I already did that for functions (sdk-function-paths), so it's going to be easy.

I copied generateFunctionPathsSdk and called it generateInterfacePathsSdk. Later I will make a more general-purpose function, and I can possibly also use simple-typescript for that. Ok, done.... Built generate-sdk-operations. Now I can run this one.

DONE.

Now I changed this code

// const indexes = (await db.get("TsInterface")).filter(
//   (x) => x.name === interfaceName
// );

// const index = indexes[0];

const indexPath = sdkInterfacePaths[interfaceName];
const index = await readProjectRelativeJsonFile<TsInterface>(indexPath);

As you can see, I don't need to fetch 1000+ interfaces anymore, just one!
Seeing the result, I don't think this was the full reason why it was slow. I have to optimise the speed of the whole function wrapper (api) because there are some problems there. It's way too slow.
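For reference, the generated sdk-interface-paths module is essentially one big lookup object from interface name to the project-relative path of its stored JSON; the entries below are made up for illustration:

// Generated mapping, so a single interface can be read without scanning the whole db.
export const sdkInterfacePaths: { [interfaceName: string]: string } = {
  TsInterface: "packages/some-operation/db/ts-interface.json",
  Operation: "db/operation.json",
};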

Anyway.... rebuilding all operations is done... Let's go back to bundling.

rebuildAllOperations

Rebuilds all operations that need to be rebuilt

async (
  /**
   * If true, you are indicating that the rebuilding process has changed and all operations should be rebuilt after this date.
   */
  isRebuildingProcessUpdated?: boolean,
  manualProjectRoot?: string
) => {
  const configPath = path.join(__dirname, "..", "config.json");
  if (isRebuildingProcessUpdated) {
    await writeJsonToFile(configPath, { updatedAt: Date.now() });
  }
  const config = await readJsonFile<{ updatedAt: number }>(configPath);

  forAllFolders({
    type: "operations",
    basePath: getPathsWithOperations({ manualProjectRoot }),
    callback: async (folderPath, index) => {
      log(`#${index}: Let's do ${folderPath}`, { type: "success" });
      await rebuildOperation({
        operationBasePath: folderPath,
        noExit: true,
        updatedAt: config?.updatedAt,
      });
      return;
    },
  });
};

getDbModel

gets all instances of a db data interface from the db in a typesafe way

async <
  KInterface extends Extract<keyof DbModels, string>,
  TDatasetConfig extends DatasetConfig
>(
  /**
   * the interfaceName you want to get
   */
  interfaceName: KInterface | null,
  /**
   * optionally, provide a configuration
   */
  datasetConfig?: TDatasetConfig,
  /**
   * This search should be done on the deepest JSON values of the whole thing. The purpose is not limiting the content to the user, but rather just a nice user experience where one can quickly search
   */
  search?: string
): Promise<{
  data: DbModels[KInterface][];
  hasMore: boolean;
  message?: string;
  datasetConfig?: TDatasetConfig;
}> => {
  if (!interfaceName) {
    return { data: [], hasMore: false, message: "No interfaceName posted" };
  }

  const data = await db.get(interfaceName);

  // NB: slice the data, if needed
  const slicedStartData = data.slice(datasetConfig?.startFromIndex);
  const slicedLimitData = datasetConfig?.maxRows
    ? slicedStartData.slice(0, datasetConfig.maxRows)
    : slicedStartData;
  const hasMore = slicedLimitData.length < slicedStartData.length;

  // NB: filter the sliced data, if needed
  const filteredData = datasetConfig?.filter?.length
    ? datasetConfig?.filter.reduce((filteredData, datasetFilter) => {
        const newFilteredData: DbModels[KInterface][] = filteredData.filter(
          (item) => {
            const key = datasetFilter.objectParameterKey as keyof typeof item;
            const value = item[key];

            if (datasetFilter.operator === "equal") {
              return String(value) === datasetFilter.value;
            }
            if (datasetFilter.operator === "notEqual") {
              return String(value) !== datasetFilter.value;
            }

            const lowercaseValue = String(value).toLowerCase();
            const lowercaseDatasetValue = String(
              datasetFilter.value
            ).toLowerCase();

            if (datasetFilter.operator === "endsWith") {
              return lowercaseValue.endsWith(lowercaseDatasetValue);
            }
            if (datasetFilter.operator === "startsWith") {
              return lowercaseValue.startsWith(lowercaseDatasetValue);
            }
            if (datasetFilter.operator === "includes") {
              return lowercaseValue.includes(lowercaseDatasetValue);
            }
            if (datasetFilter.operator === "includesLetters") {
              return hasAllLetters(lowercaseValue, lowercaseDatasetValue);
            }
            if (
              datasetFilter.operator === "greaterThan" &&
              datasetFilter.value !== null &&
              datasetFilter.value !== undefined
            ) {
              return Number(value) > Number(datasetFilter.value);
            }
            if (
              datasetFilter.operator === "lessThan" &&
              datasetFilter.value !== null &&
              datasetFilter.value !== undefined
            ) {
              return Number(value) < Number(datasetFilter.value);
            }
            if (
              datasetFilter.operator === "greaterThanOrEqual" &&
              datasetFilter.value !== null &&
              datasetFilter.value !== undefined
            ) {
              return Number(value) >= Number(datasetFilter.value);
            }
            if (
              datasetFilter.operator === "lessThanOrEqual" &&
              datasetFilter.value !== null &&
              datasetFilter.value !== undefined
            ) {
              return Number(value) <= Number(datasetFilter.value);
            }
            return false;
          }
        );
        return newFilteredData;
      }, slicedLimitData)
    : slicedLimitData;

  // NB: sort the filtered data, if needed
  const sortedData = datasetConfig?.sort
    ? datasetConfig.sort.reduce((sortedData, datasetSort) => {
        const newSortedData: DbModels[KInterface][] = sortedData.sort(
          (a, b) => {
            // @ts-ignore
            const valueA = a[datasetSort.objectParameterKey];
            // @ts-ignore
            const valueB = b[datasetSort.objectParameterKey];
            const directionMultiplier =
              datasetSort.sortDirection === "ascending" ? 1 : -1;
            return Number(valueA) < Number(valueB)
              ? -1 * directionMultiplier
              : directionMultiplier;
          }
        );
        return newSortedData;
      }, filteredData)
    : filteredData;

  const searchedData =
    search && search.length > 0
      ? sortedData.filter((item) => {
          const searchable = Object.values(item)
            .map((value) => JSON.stringify(value))
            .join(",")
            .toLowerCase();
          return searchable.includes(search.toLowerCase());
        })
      : sortedData;

  const subsetData = datasetConfig?.objectParameterKeys?.length
    ? searchedData.map(
        (item) =>
          getSubsetFromObject(
            item,
            datasetConfig.objectParameterKeys! as readonly (keyof DbModels[KInterface])[]
          ) as DatasetItem
      )
    : searchedData;

  const ignoredData = datasetConfig?.ignoreObjectParameterKeys?.length
    ? subsetData.map((item) => {
        return removeOptionalKeysFromObjectStrings(
          item as { [key: string]: any },
          datasetConfig.ignoreObjectParameterKeys!
        );
      })
    : subsetData;

  const finalData = ignoredData as DbModels[KInterface][];

  return {
    datasetConfig,
    data: finalData,
    hasMore,
  };
};

sdk-function-paths

I didn't write a good description for this yet. Please let me know if you want to know more

generateFunctionPathsSdk

async (config?: {
  manualProjectRoot?: string;
  skipYarnInstall?: boolean;
  dryrun?: boolean;
}) => {
  const skipYarnInstall = config?.skipYarnInstall;
  const dryrun = config?.dryrun;
  const manualProjectRoot = config?.manualProjectRoot;
  const projectRoot = manualProjectRoot || getProjectRoot();
  if (!projectRoot) return;

  const tsFunctions = await db.get("TsFunction", { manualProjectRoot });

  const functionPathsObject = mergeObjectsArray(
    tsFunctions
      .map((tsFunction) => {
        const projectRelativePath = tsFunction.projectRelativePath;
        const exists = fs.existsSync(
          path.join(projectRoot, projectRelativePath)
        );
        if (!exists) return;
        return { [tsFunction.name]: projectRelativePath };
      })
      .filter(notEmpty)
  );

  const operationObjectString = `export const sdkFunctionPaths = ${JSON.stringify(
    functionPathsObject,
    null,
    2
  )};`;

  await newOperationWithFiles(
    "sdk-function-paths",
    await getSdkDescription("sdk-function-paths"),
    { "src/sdk-function-paths.ts": operationObjectString },
    { overwriteIfExists: true, skipYarnInstall, manualProjectRoot, dryrun }
  );
};

generateInterfacePathsSdk

`sdk-interface-paths` indexes all interfaces and builds an object containing their paths.

async (config?: {
  manualProjectRoot?: string;
  skipYarnInstall?: boolean;
  dryrun?: boolean;
}) => {
  const skipYarnInstall = config?.skipYarnInstall;
  const dryrun = config?.dryrun;
  const manualProjectRoot = config?.manualProjectRoot;
  const projectRoot = manualProjectRoot || getProjectRoot();
  if (!projectRoot) return;

  const tsInterfaces = await db.get("TsInterface", {
    manualProjectRoot,
    operationName: "*",
  });

  const interfacePathsObject = mergeObjectsArray(
    tsInterfaces
      .map((tsInterface) => {
        const projectRelativePath = tsInterface.projectRelativePath;
        const exists = fs.existsSync(
          path.join(projectRoot, projectRelativePath)
        );
        if (!exists) return;
        return { [tsInterface.name]: projectRelativePath };
      })
      .filter(notEmpty)
  );

  const operationObjectString = `export const sdkInterfacePaths = ${JSON.stringify(
    interfacePathsObject,
    null,
    2
  )};`;

  await newOperationWithFiles(
    "sdk-interface-paths",
    await getSdkDescription("sdk-interface-paths"),
    { "src/sdk-interface-paths.ts": operationObjectString },
    { overwriteIfExists: true, skipYarnInstall, manualProjectRoot, dryrun }
  );
};

generate-sdk-operations

I didn't write a good description for this yet. Please let me know if you want to know more
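Circling back to getDbModel for a second: a usage sketch with a filter and a sort (the model name and keys are illustrative):

const result = await getDbModel("TsFunction", {
  maxRows: 20,
  filter: [
    { objectParameterKey: "name", operator: "startsWith", value: "get" },
  ],
  sort: [{ objectParameterKey: "createdAt", sortDirection: "descending" }],
});
console.log(result.data.length, result.hasMore);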


Bundling, 2nd attempt

Found 3 apps, 124 modules, 1 packages

BOOM.

✨  Done in 4.84s.
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-interface-paths ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-db ✅
{
  tsFunctions: 836,
  manualProjectRoot: '/Users/king/King/bundled/typerepo'
}
{ exportedFunctions: 714 }

Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-api ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-api-keys ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-ui ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-js ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-env-public ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-env-private ✅
Installing repo ✅

That's GREAT. It seems it all worked, and so fast! I almost don't believe it; let's have a look.

Unfortunately, it didn't work! Some indexation failed, specifically for db-web. I added some entries to Operation's .operation.indirectDependencies, and now it should be fine.

Found 3 apps, 128 modules, 1 packages

This is great. But on the next try, authentication didn't bundle well either. Let's add server-login as its sole .operation.indirectDependencies as well.
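For reference, my guess at what such a hint looks like inside the operation's package.json (shape assumed, other fields omitted):

// package.json of the authentication operation, as a TS object for illustration
const packageJson = {
  name: "authentication",
  operation: {
    indirectDependencies: ["server-login"],
  },
};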

Let's try again. And again.

Found 3 apps, 133 modules, 1 packages

Found 3 apps, 135 modules, 1 packages

Awesome! I mean, it's too bad that I can't automate this process, as I don't have a way to consistently run it and look for which packages are missing. I can improve the indexation though, so maybe, at some point, it will index everything and indirectDependencies won't be needed.

BOOM. The bundle works! I couldn't have a better morning than this. 🌞

BOOM

Now it's possible to release this bad boy! I haven't been able to for 2+ weeks. But since there are still some bugs in the db-ui, I'm not going to yet. I'm going to work on db-web, function-web, and potentially finish the assets once and for all. Let's see what I can do.

db-web

I didn't write a good description for this yet. Please let me know if you want to know more

Operation

Model for typerepo operations. Stored in package.json in every package (compatible with the regular npm package.json data structure). An Operation is an NPM package that applies the typerepo convention.

TODO: add a validation to package.json files for the whole project, to ensure I can apply the fs-orm convention

authentication

I didn't write a good description for this yet. Please let me know if you want to know more

server-login

I didn't write a good description for this yet. Please let me know if you want to know more

function-web

I didn't write a good description for this yet. Please let me know if you want to know more


Asset improvements

❌ Parse BackendAssets when returning stuff from fs-orm: add the URL. Or maybe SHOULD I store it for efficiency? NO, we don't need this, wtf! We have projectRelativePath available on every item! We can calculate the URL from there, on the frontend...

✅ Remove .apiPath from BackendAsset.

✅ I made a function to calculate the asset path from a db item on the frontend (as lazily as possible). I came up with ModelItemAssetView, which relies on AssetView. It's not very flexible yet, but at least it can present any asset. I guess it'd be nice to decouple the functional part as well, so people can make their own UIs for this.

✅ I came up with itemGetBackendAssetUrl. Great! 🌈

BackendAsset

Part of the asset that should be sent to the backend. The rest should be frontend-only

Some values are stored, some are not

ModelItemAssetView

(props: {
  item: T;
  backendAsset?: BackendAsset;
  hideDownloadLink?: boolean;
  className?: string;
}) => {
  const { backendAsset, item, hideDownloadLink, className } = props;

  return backendAsset ? (

  ) : null;
};

AssetView

(props: {
  asset: Asset;
  className?: string;
  projectRelativeReferencingFilePath: string;
  hideDownloadLink?: boolean;
}) => {
  const {
    asset,
    className,
    projectRelativeReferencingFilePath,
    hideDownloadLink,
  } = props;

  const isRemote = !asset.blobPath;

  /**
   * NB: `relativePath` is required
   */
  const getRemoteUrl = (isDownload: boolean) =>
    getReferencedAssetApiUrl(
      apiUrl,
      projectRelativeReferencingFilePath,
      asset.relativePath!,
      isDownload
    );
  /**
   * NB: `asset.temporaryDestination` is not a URL that can be used to retrieve the image as "src"
   */
  const src = isRemote
    ? asset.relativePath
      ? getRemoteUrl(false)
      : undefined
    : asset.blobPath;

  const downloadRemotely = isRemote && !!asset.relativePath;

  const downloadHref = downloadRemotely
    ? getRemoteUrl(true)
    : // NB: the other type is ugly when downloading. Also, it doesn't make much sense to want to download an asset after uploading
    asset.type !== "other"
    ? asset.blobPath
    : undefined;

  const sizeText =
    asset.sizeBytes !== undefined
      ? `(${readableSize(asset.sizeBytes)})`
      : undefined;

  const downloadText = " ⬇️ Download";

  const extension = getExtensionFromAsset(asset);

  const filename =
    isRemote && asset.relativePath
      ? asset.relativePath.split("/").pop()!
      : extension
      ? `untitled.${extension}`
      : undefined;

  const isTextFile = filename && isText(filename) === true ? true : false;

  const type =
    isRemote && asset.relativePath
      ? getTypeFromRelativePath(asset.relativePath)
      : asset.type;

  const [rawText, setRawText] = useState("");
  useEffect(() => {
    if (!!src && isTextFile) {
      fetch(src).then((result) => {
        result.text().then((text) => setRawText(text));
      });
    }
  }, [src, isTextFile]);

  return (
    <span>
      {downloadHref && !hideDownloadLink ? (
        <a href={downloadHref}>
          {downloadText} {sizeText}
        </a>
      ) : null}

      {type === "image" && src && <img src={src} />}
      {/* NB: the audio element was stripped from the original post; reconstructed to match the video line */}
      {type === "audio" && src && <audio controls src={src} />}
      {type === "video" && src && <video controls src={src} />}

      {type === "other" && isTextFile ? (
        // NB: the raw-text preview element (rendering rawText) was stripped from the original post
        <pre>{rawText}</pre>
      ) : null}

      {/* LATER: render PDF renderer here */}
      {type === "other" && extension === "pdf" ? null : null}

      {src === undefined ? <p>Asset src not found</p> : null}
    </span>
  );
};

itemGetBackendAssetUrl

Get remote url for a BackendAsset in an AugmentedAnyModelType database model item.

If you provide an array it'll take the first asset.

(config: {
  item: AugmentedAnyModelType;
  backendAsset?: BackendAsset | BackendAsset[];
  isDownload?: boolean;
}) => {
  const { backendAsset, item, isDownload } = config;

  const realBackendAsset = backendAsset ? takeFirst(backendAsset) : undefined;

  if (!realBackendAsset?.relativePath) {
    return undefined;
  }

  const url = getReferencedAssetApiUrl(
    apiUrl,
    item.projectRelativePath,
    realBackendAsset.relativePath,
    isDownload
  );
  return url;
};


Backend performance

Areas that we have to take a look at:

  • why is it slow, what takes the most time?
  • can these time consuming things be improved? at which level is the easiest?
  • will a better server (more cpu) decrease loading time enough?
  • can I do something smart with caching? I guess this will make consecutive runs much faster

The backend does a lot of things before executing the actual function. I just had a look at the FunctionExecution model, specifically the performance part (what came out of getNewPerformance).

First I solved a little bug (the ms amounts were negative)...

But I also noticed that almost nothing takes time, except for the part where it finds the device. The upsertDevice function seems to almost always take 600+ ms. This is obviously a huge problem because it hurts server performance by an insane amount. And we don't even have that many users yet! So let's have a look and see if we can find the bottleneck there.

Before I started, I wanted to decouple getNewPerformance into its own operation. It's a great tool to measure performance. Simple but useful. I also documented it way better.

Boom. We now have measure-performance, a CJS-only operation that stores performance items in memory. It's important to note that this may lead to memory overflow, but we can solve that later; I don't know how to clean this up nicely in a general-purpose way. I have to use it more first.
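Under the hood it keeps a simple in-memory timer map (you can see it used in the getNewPerformance implementation further down); cleanupTimer then just deletes the entry, which is why forgetting it leaks. A sketch of that store, with the cleanupTimer body being my assumption:

// Assumed module-level store: maps an executionId to the timestamp of its
// last measurement.
const timer: { [executionId: string]: number } = {};

// Assumed cleanupTimer: drop the entry so long-running processes don't
// accumulate one stale timestamp per execution.
const cleanupTimer = (executionId: string) => {
  delete timer[executionId];
};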

So then I added all the getNewPerformance calls I wanted into upsertDevice. To test it, I spun up the function-server and search-web. But searchGetStaticProps was crashing, so I fixed that first.

Now I run one page-load for search-web and this is what I see:

upsertDevice, already device
[
  undefined,
  { label: 'lookupIp', durationMs: 5 },
  { label: 'findAlreadyDevice', durationMs: 621 },
  { label: 'gatherIpInfo', durationMs: 0 },
  { label: 'alreadyDevice_makeUpdatedDevice', durationMs: 0 },
  { label: 'alreadyDevice_updateDevice', durationMs: 456 }
]
upsertDevice, already device [
  undefined,
  { label: 'lookupIp', durationMs: 0 },
  { label: 'findAlreadyDevice', durationMs: 620 },
  { label: 'gatherIpInfo', durationMs: 0 },
  { label: 'alreadyDevice_makeUpdatedDevice', durationMs: 0 },
  { label: 'alreadyDevice_updateDevice', durationMs: 457 }
]


Holy cow! The performance problem lies here:

const alreadyDevice = (await db.get("Device", { include: deviceInclude })).find(
  (x) => x.authToken === authToken
);

and here:

await db.update(
  "Device",
  (item) => item.authToken === authToken,
  () => updatedDevice
);

There's obviously something wrong with get, because update relies on get (and set).

This means that fs-orm is doing something very inefficiently, because it shouldn't take 620 ms to open a 7 kB JSON file...

I am going to have to measure and debug fs-orm and see what's happening!

FunctionExecution

Model for tests, examples, cache, and recent executions of any function

Requirement for tifo-stitching

Example:

const someFunction = (inputA: string, inputB: string): string => {
  return `${inputA} != ${inputB}`;
};

find this in the database after executing the function

const functionExecution1 = {
  ....
  functionName: "someFunction",
  inputParameters: ["hello", "world"],
  output: "hello != world",
  isTest: false,
  isExample: false,
  isResultFromCache: false,
  performance: [....],
}

/**
 * Model for tests, examples, cache, and recent executions of any function
 *
 * Requirement for **tifo-stitching**

Example: 

const someFunction = (inputA: string, inputB: string): string => {

  return `${inputA} != ${inputB}`
}


// find this in the database after executing the function

const functionExecution1 = {
  ....
  functionName: "someFunction",
  inputParameters: ["hello", "world"],
  output: "hello != world",
  isTest: false,
  isExample: false,
  isResultFromCache: false,
  performance: [....],
}

*/

export interface FunctionExecution extends DefaultModelType {
  functionName: string;
  tsFunctionId: Id;
  tsFunction?: TsFunction;
  inputParameters: any[] | undefined;
  output: any;
  isTest: boolean;
  isExample: boolean;
  /**
   * test description or example description or anything
   */
  description?: Markdown;
  isResultFromCache: boolean;
  /**
   * if true, the api of the function (input/output interface) has changed in between, so the re-execution would probably fail or return a different result
   */
  hasApiChanged?: boolean;
  performance: PerformanceItem[];
}

getNewPerformance

Function that lets you measure performance inside any function with ease.

Usage:

Firstly, make a performance array, and a unique execution id, and start the measurement, like so:

import { generateUniqueId, getNewPerformance, PerformanceItem, cleanupTimer } from "measure-performance";

at the start of your function

const executionId = generateUniqueId();
const performance: (PerformanceItem | undefined)[] = [];
getNewPerformance("start", executionId, true)

After that, push a new performance item at every step you want to measure. Provide your label describing what happened before this (the step you are measuring).

performance.push(getNewPerformance("your label", executionId));

At the end of your function, you can view your performance array by printing it on the console (or store it somewhere if you like)

Don't forget to run cleanupTimer, or you'll run into memory leaks!

cleanupTimer(executionId);
(
  label: string,
  uniqueId: string,
  isNew?: boolean
): PerformanceItem | undefined => {
  const timePrevious = timer[uniqueId];
  const timeNow = Date.now();
  timer[uniqueId] = timeNow;

  if (isNew) return;

  const durationMs = timeNow - timePrevious;

  return { label, durationMs };
};
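Putting the snippets above together, a complete (hypothetical) instrumented function could look like this; sleep is an illustrative stand-in for real work:

import {
  cleanupTimer,
  generateUniqueId,
  getNewPerformance,
  PerformanceItem,
} from "measure-performance";

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

const measuredExample = async () => {
  const executionId = generateUniqueId();
  const performance: (PerformanceItem | undefined)[] = [];
  getNewPerformance("start", executionId, true);

  await sleep(100); // stand-in for the step being measured
  performance.push(getNewPerformance("did the work", executionId));

  cleanupTimer(executionId); // avoid leaking the timer entry
  console.log(performance); // e.g. [ { label: 'did the work', durationMs: 101 } ]
};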

upsertDevice

Returns device with all attached (logged in) Persons, and currentPersonCalculated

Either finds the device and updates it according to the new request metadata, or creates a new device.

Should never return undefined if the database functions...

TODO: Use cookies (https://serverjs.io/documentation/reply/#cookie-) to login

Needed for having authToken with GET as well in a safe manner (e.g. for images)

async (serverContext: Context): Promise<Device | undefined> => {
  // in your function
  const executionId = generateUniqueId();
  const performance: (PerformanceItem | undefined)[] = [];

  performance.push(getNewPerformance("start", executionId, true));

  const authToken: string | undefined = serverContext.data?.authToken;
  const ip = serverContext.ip;

  if (!authToken || authToken.length < 24) {
    console.log("warn upsert device: no authToken");
    return;
  }

  // NB: range: [ ,  ], the rest is described in the type interface
  const ipLookup = (geoip.lookup(ip) || {}) as Partial<Lookup>;
  const {
    city,
    area: positionRadiusKm,
    ll,
    country,
    region,
    timezone,
  } = ipLookup;

  const position: Position | undefined =
    !!ll?.[0] && !!ll?.[1] ? { latitude: ll[0], longitude: ll[1] } : undefined;

  const userAgent: IResult = parseUserAgent(
    serverContext.req.get("User-Agent")
  );
  performance.push(getNewPerformance("lookupIp", executionId));

  const alreadyDevice = (
    await db.get("Device", { include: deviceInclude })
  ).find((x) => x.authToken === authToken);

  performance.push(getNewPerformance("findAlreadyDevice", executionId));

  const ipInfo: IPInfo = {
    ip,
    city,
    position,
    positionRadiusKm,
    country,
    region,
    timezone,
  };

  const origin = serverContext.req.get("Origin") as string;
  const referer = serverContext.req.get("Referrer") as string;

  // server.reply
  //   .cookie(
  //     "testje",
  //     authToken,

  //     {
  //       /**
  //        * NB: VERY IMPORTANT In order to receive the cookie at other port or domain
  //        */
  //       sameSite: "none",
  //       secure: true,
  //       /**
  //        * It turned out that Chrome won't set the cookie if the domain contains a port. Setting it for localhost (without port) is not a problem
  //        */
  //       domain: "localhost",
  //     }
  //   )

  performance.push(getNewPerformance("gatherIpInfo", executionId));

  // NB: either a new device creation or updating an existing device
  if (alreadyDevice) {
    // console.log("device already");
    const currentIpInfo: IPInfo = {
      ip: alreadyDevice.ip,
      city: alreadyDevice.city,
      position: alreadyDevice.position,
      positionRadiusKm: alreadyDevice.positionRadiusKm,
      country: alreadyDevice.country,
      region: alreadyDevice.region,
      timezone: alreadyDevice.timezone,
    };

    const previousIpsHasAlready =
      !currentIpInfo.ip ||
      alreadyDevice.previousIps.find((x) => x.ip === currentIpInfo.ip);
    const newPreviousIps = previousIpsHasAlready
      ? alreadyDevice.previousIps
      : alreadyDevice.previousIps.concat(currentIpInfo);

    const newIpStuff =
      alreadyDevice.ip === ip ? {} : { ...ipInfo, previousIps: newPreviousIps };

    const newOrigins = alreadyDevice.origins.includes(origin)
      ? alreadyDevice.origins
      : alreadyDevice.origins.concat(origin);

    const currentPersonCalculated = alreadyDevice.currentPersonId
      ? alreadyDevice.persons?.find(
          (x) => x.id === alreadyDevice.currentPersonId
        )
      : undefined;

    const updatedDevice: Device = {
      ...alreadyDevice,
      ...newIpStuff,
      currentPersonCalculated,
      origins: newOrigins,
      userAgent,
      userAgentString: userAgent.ua,
    };

    performance.push(
      getNewPerformance("alreadyDevice_makeUpdatedDevice", executionId)
    );

    await db.update(
      "Device",
      (item) => item.authToken === authToken,
      () => updatedDevice
    );

    performance.push(
      getNewPerformance("alreadyDevice_updateDevice", executionId)
    );

    savePageVisit(alreadyDevice.id, ipInfo, referer);
    // console.log("upsertDevice, already device", performance);

    return updatedDevice;
  }

  // console.log("new device");
  const newDevice: Creation<Device> = {
    authToken,
    authenticationMethods: [],
    ...ipInfo,
    lastOnlineAt: 0,
    lastSyncDatabaseAtObject: {},
    name: calculateDeviceName(ipInfo, userAgent),
    origins: [origin],
    previousIps: [],
    userAgent,
    userAgentString: userAgent.ua,
    hasPapi: false,
  };

  performance.push(getNewPerformance("calculateNewDevice", executionId));

  //  console.log({ newDevice });
  // Create new device
  //@ts-ignore
  const upsertResult = await db.upsert("Device", newDevice, {
    onlyInsert: true,
  });

  performance.push(getNewPerformance("upsertNewDevice", executionId));

  // console.log({ upsertResult });

  const fullNewDevice = (
    await db.get("Device", { include: deviceInclude })
  ).find((x) => x.authToken === authToken);

  performance.push(getNewPerformance("getFullNewDevice", executionId));

  if (fullNewDevice) {
    savePageVisit(fullNewDevice.id, ipInfo, referer);
  }

  const currentPersonCalculated = fullNewDevice?.currentPersonId
    ? fullNewDevice.persons?.find((x) => x.id === fullNewDevice.currentPersonId)
    : undefined;

  const finalNewDevice: Device | undefined = fullNewDevice
    ? { ...fullNewDevice, currentPersonCalculated }
    : undefined;

  performance.push(getNewPerformance("calculateMetadata", executionId));

  // console.log("upsertDevice", performance);
  return finalNewDevice;
};

measure-performance

I didn't write a good description for this yet. Please let me know if you want to know more


function-server

I didn't write a good description for this yet. Please let me know if you want to know more

search-web

I didn't write a good description for this yet. Please let me know if you want to know more

searchGetStaticProps

async (context) => {
  const query = takeFirst(context.params?.query) || null;

  const imagePaths = await fs.readdir(
    path.join(__dirname, "../../..", "public")
  );
  const searchResults = getAllSearchResults(query) || null;

  const timelineItems: {
    comment: string;
    filePath: string | undefined;
    line: number;
  }[] = []; // = await getTimelineItems();

  const props: QueryPageProps = {
    query,
    searchResults,
    imagePaths,
    timelineItems,
  };

  return {
    props,
  };
};

get

async () => {
  await upsert();

  const items = await testDb.get("MarkdownTestModel");
  console.dir({ items }, { depth: 99 });
};

set

Can set a markdown item into a subfolder in the db model folder

() => {
  const item: Creation<MarkdownTestModel> = {
    categoryStackCalculated: ["sub", "folder"],
    name: "hell-yeah",
    markdown: "some markdown....",
  };

  const x = testDb.set("MarkdownTestModel", [item]);

  return x;
};

fs-orm

ORM that works with JSON and FS

Optimising the performance of fs-orm

TODO-LIST:

  • Add getPerformanceItem into fs-orm's get
  • Log a lot there
  • Check if I can find the culprit by executing a test where I run db.get("Device")

I did that...
I created a test called testPerformance inside the database operation. Running it yields the following:

// Devices with include
get performance [
  { label: 'mergeConfigs', durationMs: 0 },
  { label: 'getDatabaseFiles', durationMs: 57 },
  { label: 'processInclude', durationMs: 0 },
  { label: 'dbContentPromises', durationMs: 4 },
  { label: 'dbContent', durationMs: 1 },
  { label: 'dbContentObject', durationMs: 0 }
]
get performance [
  { label: 'mergeConfigs', durationMs: 0 },
  { label: 'getDatabaseFiles', durationMs: 58 },
  { label: 'processInclude', durationMs: 0 },
  { label: 'dbContentPromises', durationMs: 2 },
  { label: 'dbContent', durationMs: 0 },
  { label: 'dbContentObject', durationMs: 0 }
]
get performance [
  { label: 'mergeConfigs', durationMs: 0 },
  { label: 'getDatabaseFiles', durationMs: 85 },
  { label: 'processInclude', durationMs: 123 },
  { label: 'dbContentPromises', durationMs: 2 },
  { label: 'dbContent', durationMs: 2 },
  { label: 'dbContentObject', durationMs: 0 }
]
// Devices without include
get performance [
  { label: 'mergeConfigs', durationMs: 0 },
  { label: 'getDatabaseFiles', durationMs: 57 },
  { label: 'processInclude', durationMs: 0 },
  { label: 'dbContentPromises', durationMs: 1 },
  { label: 'dbContent', durationMs: 0 },
  { label: 'dbContentObject', durationMs: 0 }
]
DONE

It seems very clear: I need to optimise processInclude and getDatabaseFiles. It seems that processInclude does two other get calls internally, which, in turn, are slow because of getDatabaseFiles. I think the root cause is that the convention currently does not let the user specify where the data is located, which can make it slow. Because we are absolutely sure where the Device data is located, there should be a convention that lets you specify more clearly where the data lives, in the database config when calling createDb. This can be indexed for. The data can live in three places: root (things like Device), operation (things like TsFunction), or anywhere (things like TodoFile).

This is probably possible to do more efficiently if I make it part of a bigger refactor that I've been thinking about: I want to make it possible to find things like todos and postables anywhere. I've written down this idea before, so I should really sit down for this one.
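To make that idea concrete, one possible shape for such a location hint in the createDb config (the key names here are assumptions, not the real fs-orm API):

// Hypothetical per-model location hint: with this, getDatabaseFiles could
// skip scanning every operation when a model is known to live in one place.
type DbLocation = "root" | "operation" | "anywhere";

const modelLocations: { [modelName: string]: DbLocation } = {
  Device: "root", // only in db/ at the project root
  TsFunction: "operation", // in db/ of each operation
  TodoFile: "anywhere",
};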

For now, I am too psyched about the database UI and function UI, so I will do this another time. The performance is still shitty, but at least now I know why! getDatabaseFiles is a monstrosity :/ I am going to spend some time on this soon....

testPerformance

async () => {
  const executionId = generateUniqueId();
  const performance: (PerformanceItem | undefined)[] = [];
  getNewPerformance("start", executionId, true);

  const deviceInclude: Include = {
    referenceKey: "personIds",
    include: { referenceKey: "groupSlugs" },
  };
  log("Devices with include", { type: "important" });

  const withInclude = await db.get("Device", { include: deviceInclude });

  log("Devices without include", { type: "important" });

  const withoutInclude = await db.get("Device");

  performance.push(
    getNewPerformance(
      "testPerformance (2x get, 1x with double include)",
      executionId
    )
  );
  cleanupTimer(executionId);
  log("DONE", { type: "success" });
  console.log(performance);
};

database

I didn't write a good description for this yet. Please let me know if you want to know more

getDatabaseFiles

This function gets the files where the data can be stored, by convention, based on the model and the config

Only returns the file paths that actually exist.

CONVENTION:

When searching for data, fs-orm will look in:

  • db/ in your project root
  • db/ in any operation

In these folders, fs-orm will search for files based on your storage method.
@see DbStorageMethod for more info

Returns not only the file paths, but also where they were found (operationName, projectRelativePath, operationRelativePath)
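As an illustration of that convention (paths and storage method are made up), for a Device model stored as a single JSON file the candidate locations would look like:

const candidateDeviceFiles = [
  "db/device.json", // db/ in the project root
  "packages/some-operation/db/device.json", // db/ in an operation
];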

async (
  modelName: string,
  mergedConfig: MergedQueryConfig
): Promise<DbFileLocation[]> => {
  const executionId = generateUniqueId();
  const performance: (PerformanceItem | undefined)[] = [];
  getNewPerformance("start", executionId, true);

  const projectRoot =
    mergedConfig?.manualProjectRoot || mergedConfig.projectRoot;
  if (!projectRoot) return [];

  const dbStorageMethod = mergedConfig.dbStorageMethod;

  performance.push(getNewPerformance("get projectRoot", executionId));

  const pattern = getLocationPattern(dbStorageMethod, modelName, mergedConfig);

  performance.push(getNewPerformance("get location pattern", executionId));

  const operationPath = await getMergedConfigOperationPath(
    mergedConfig,
    projectRoot
  );

  performance.push(
    getNewPerformance("get merged config operation path", executionId)
  );

  // Please note, it can return false as well, which should continue here
  if (operationPath === undefined) return [];

  const rootFolders = await getRootFolders({
    mergedConfig,
    operationPath,
    projectRoot,
    manualProjectRoot: projectRoot,
  });

  performance.push(getNewPerformance("getRootFolders", executionId));

  cleanupTimer(executionId);
  // console.log({ performance });

  /**
  based on configuration and convention, we will fill this array with the files to get data from

  NB: this should contain the actual files, not the patterns
   */
  let dbFiles: DbFileLocation[] = [];

  const isOperationFile =
    !!mergedConfig.operationName &&
    !!operationPath &&
    !!mergedConfig.operationRelativePath;

  if (isOperationFile && !!operationPath) {
    const exactAbsoluteOperationFilePath = path.join(
      operationPath,
      mergedConfig.operationRelativePath!
    );
    //make sure that extension matches `dbStorageMethod`, warn otherwise
    const customExt = mergedConfig.operationRelativePath
      ? path.parse(mergedConfig.operationRelativePath).ext
      : undefined;
    const isWrongExtension =
      customExt !== getDbStorageMethodExtension(dbStorageMethod);

    if (isWrongExtension) {
      log(
        `Incorrect extension found in operationRelativePath, found ${customExt}`,
        { type: "warning" }
      );
    }

    const projectRelativePath = exactAbsoluteOperationFilePath.substring(
      projectRoot.length
    );
    const operationRelativePath =
      mergedConfig.operationName === null
        ? undefined
        : exactAbsoluteOperationFilePath.substring(operationPath.length);

    dbFiles.push({
      modelName,
      absolutePath: exactAbsoluteOperationFilePath,
      operationName: mergedConfig.operationName!,
      projectRelativePath,
      operationRelativePath,
    });
  }

  if (!isOperationFile && mergedConfig.projectRelativePath) {
    const absolutePath = path.join(
      projectRoot,
      mergedConfig.projectRelativePath
    );
    const operationName = null;
    const projectRelativePath = mergedConfig.projectRelativePath;

    dbFiles.push({
      modelName,
      absolutePath,
      operationName,
      projectRelativePath,
    });
  }

  if (!mergedConfig.projectRelativePath && !isOperationFile && pattern) {
    // no exact path

    const conventionedPaths: DbFileLocation[] = (
      await Promise.all(
        rootFolders.map(async (rootFolder) => {
          const absolutePathPattern = path.join(rootFolder.basePath, pattern);
          const projectRelativePath = absolutePathPattern.substring(
            projectRoot.length
          );

          const operationRelativePath =
            rootFolder.operationName === null
              ? undefined
              : absolutePathPattern.substring(rootFolder.basePath.length);

          const parsedPath = path.parse(absolutePathPattern);

          if (parsedPath.name === "*") {
            const fileNames = getWildcardDbFileLocations({
              modelName,
              parsedPath,
              operationName: rootFolder.operationName,
              projectRoot,
              rootFolder,
            });

            return fileNames;
          } else {
            const dbFileLocation: DbFileLocation = {
              modelName,
              absolutePath: absolutePathPattern,
              operationName: rootFolder.operationName,
              projectRelativePath,
              operationRelativePath,
            };
            return [dbFileLocation];
          }
        })
      )
    ).flat();

    dbFiles = dbFiles.concat(conventionedPaths);
  }

  return dbFiles;
};

get

async () =&gt; {
  await upsert();

  const items = await testDb.get("MarkdownTestModel");
  console.dir({ items }, { depth: 99 });
};
Enter fullscreen mode Exit fullscreen mode

getDatabaseFiles

This function gets the files that the data can be stored, by convention, based on the model and the config

Only returns the file paths that actually exist.

CONVENTION:

When searching for data, fs-orm will look in:

  • db/ in your project root
  • db/ in any operation

In these folders, fs-orm will search for files based on your storage method.
@see DbStorageMethod for more info

Returns not only the file paths, but also where they were found (operationName, projectRelativePath, operationRelativePath)

async (
  modelName: string,
  mergedConfig: MergedQueryConfig
): Promise =&gt; {
  const executionId = generateUniqueId();
  const performance: (PerformanceItem | undefined)[] = [];
  getNewPerformance("start", executionId, true);

  const projectRoot =
    mergedConfig?.manualProjectRoot || mergedConfig.projectRoot;
  if (!projectRoot) return [];

  const dbStorageMethod = mergedConfig.dbStorageMethod;

  performance.push(getNewPerformance("get projectRoot", executionId));

  const pattern = getLocationPattern(dbStorageMethod, modelName, mergedConfig);

  performance.push(getNewPerformance("get location pattern", executionId));

  const operationPath = await getMergedConfigOperationPath(
    mergedConfig,
    projectRoot
  );

  performance.push(
    getNewPerformance("get merged config operation path", executionId)
  );

  // NB: this can also return false, in which case we should continue here
  if (operationPath === undefined) return [];

  const rootFolders = await getRootFolders({
    mergedConfig,
    operationPath,
    projectRoot,
    manualProjectRoot: projectRoot,
  });

  performance.push(getNewPerformance("getRootFolders", executionId));

  cleanupTimer(executionId);
  // console.log({ performance });

  /**
  based on configuration and convention, we will fill this array with the files to get data from

  NB: this should contain the actual files, not the patterns
   */
  let dbFiles: DbFileLocation[] = [];

  const isOperationFile =
    !!mergedConfig.operationName &&
    !!operationPath &&
    !!mergedConfig.operationRelativePath;

  if (isOperationFile && !!operationPath) {
    const exactAbsoluteOperationFilePath = path.join(
      operationPath,
      mergedConfig.operationRelativePath!
    );
    //make sure that extension matches `dbStorageMethod`, warn otherwise
    const customExt = mergedConfig.operationRelativePath
      ? path.parse(mergedConfig.operationRelativePath).ext
      : undefined;
    const isWrongExtension =
      customExt !== getDbStorageMethodExtension(dbStorageMethod);

    if (isWrongExtension) {
      log(
        `Incorrect extension found in operationRelativePath, found ${customExt}`,
        { type: "warning" }
      );
    }

    const projectRelativePath = exactAbsoluteOperationFilePath.substring(
      projectRoot.length
    );
    const operationRelativePath =
      mergedConfig.operationName === null
        ? undefined
        : exactAbsoluteOperationFilePath.substring(operationPath.length);

    dbFiles.push({
      modelName,
      absolutePath: exactAbsoluteOperationFilePath,
      operationName: mergedConfig.operationName!,
      projectRelativePath,
      operationRelativePath,
    });
  }

  if (!isOperationFile &amp;&amp; mergedConfig.projectRelativePath) {
    const absolutePath = path.join(
      projectRoot,
      mergedConfig.projectRelativePath
    );
    const operationName = null;
    const projectRelativePath = mergedConfig.projectRelativePath;

    dbFiles.push({
      modelName,
      absolutePath,
      operationName,
      projectRelativePath,
    });
  }

  if (!mergedConfig.projectRelativePath && !isOperationFile && pattern) {
    // no exact path

    const conventionedPaths: DbFileLocation[] = (
      await Promise.all(
        rootFolders.map(async (rootFolder) => {
          const absolutePathPattern = path.join(rootFolder.basePath, pattern);
          const projectRelativePath = absolutePathPattern.substring(
            projectRoot.length
          );

          const operationRelativePath =
            rootFolder.operationName === null
              ? undefined
              : absolutePathPattern.substring(rootFolder.basePath.length);

          const parsedPath = path.parse(absolutePathPattern);

          if (parsedPath.name === "*") {
            const fileNames = getWildcardDbFileLocations({
              modelName,
              parsedPath,
              operationName: rootFolder.operationName,
              projectRoot,
              rootFolder,
            });

            return fileNames;
          } else {
            const dbFileLocation: DbFileLocation = {
              modelName,
              absolutePath: absolutePathPattern,
              operationName: rootFolder.operationName,
              projectRelativePath,
              operationRelativePath,
            };
            return [dbFileLocation];
          }
        })
      )
    ).flat();

    dbFiles = dbFiles.concat(conventionedPaths);
  }

  return dbFiles;
};
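
To make the convention concrete, here is a minimal usage sketch; the model name and config values below are hypothetical, and MergedQueryConfig carries more fields in practice:

const locations = await getDatabaseFiles("MarkdownTestModel", {
  projectRoot: "/Users/king/King/typerepo", // hypothetical project root
  dbStorageMethod: "markdown", // hypothetical DbStorageMethod value
} as MergedQueryConfig);

// each DbFileLocation says where the data lives and how it was found
locations.forEach((loc) => {
  console.log(loc.absolutePath, loc.operationName, loc.projectRelativePath);
});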

Device

A Device that accesses any King OS API.

A device can be connected to a person. A person can have multiple Devices.

A Device does not necessarily have King OS installed itself; it can also be a visitor to someone else's King OS app.

createDb

Create your database by passing your models as a generic and some optional configuration

(dbConfig?: DbConfig): Db => {
  // need to get

  const getDbFileLocationPath = async (
    storedItem: Storing,
    operationName: string | null,
    modelName: Extract,
    config: CustomQueryConfig
  ) =&gt; {
    const mergedQueryConfig = mergeConfigs(modelName, dbConfig, config);

    const result = await getDbFileLocation(
      storedItem,
      operationName,
      mergedQueryConfig,
      modelName
    );

    return result?.absolutePath;
  };

  const getByFile = async <TModelName extends keyof TModels>(
    modelName: TModelName,
    config?: GetQueryConfig
  ) => {
    const executionId = generateUniqueId();
    const performance: (PerformanceItem | undefined)[] = [];

    getNewPerformance("start", executionId, true);

    const mergedQueryConfig = mergeConfigs(modelName, dbConfig, config);
    performance.push(getNewPerformance("mergeConfigs", executionId));

    const dbFiles = await getDatabaseFiles(modelName, mergedQueryConfig);
    performance.push(getNewPerformance("getDatabaseFiles", executionId));

    // console.log("getByFile", {
    //   modelName,
    //   mergedQueryConfig,
    //   dbFiles: dbFiles.length,
    // });

    /**
     * An object used for attaching all referenced data onto the model, recursively
     */
    let includeData: IncludeDataObject = {};

    /**
     * A recursive function that takes an Include and adds data to includeData, if it's not already there
     */
    const processInclude = async (includeConfig: Include) => {
      if (!includeConfig.referenceKey) return;

      const parameterInfo = getReferenceParameterInfo(
        includeConfig.referenceKey
      );

      if (!parameterInfo.isReferenceParameter || !parameterInfo.interfaceName)
        return;

      if (!includeData[parameterInfo.interfaceName]) {
        const includeThisData = await get(
          parameterInfo.interfaceName as Keys,
          { manualProjectRoot: mergedQueryConfig.manualProjectRoot }
        );
        includeData[parameterInfo.interfaceName] = includeThisData;
      }

      if (includeConfig.include) {
        // Recursively process all includes as well
        const includeArray = makeArray(includeConfig.include);
        await Promise.all(includeArray.map(processInclude));
      }
    };

    // NB: for auto we need to do it per file because we don't know exactly which keys exist on the model yet
    const isAuto: boolean =
      !!config?.include &&
      !Array.isArray(config.include) &&
      config.include.auto === true;

    // NB: Create an includeArray or fill the includeData object
    const includeArray: Include[] =
      isAuto || !config?.include ? [] : makeArray(config?.include);
    await Promise.all(includeArray.map(processInclude));

    performance.push(getNewPerformance("processInclude", executionId));

    const dbContentPromises = dbFiles.map(async (dbFileLocation) => {
      const items = await getAugmentedData(
        dbFileLocation,
        mergedQueryConfig.dbStorageMethod
      );
      if (!items) return;

      const filteredItems = config?.filter
        ? items.filter(config.filter)
        : items;

      let augmentedItems: TModels[TModelName][] = includeArray
        ? filteredItems.map(
            (item) =&gt;
              augmentItemWithReferencedDataRecursively(
                item,
                includeArray,
                includeData
              ) as TModels[TModelName]
          )
        : filteredItems;

      if (isAuto) {
        /**
        TODO: 

        Go over all keys in the first item of augmentedItems, and see if it contains reference keys.

        For every reference key, create an `Include`. call `includes.map(processInclude)`

        Call augmentItemRecursively with the includes.

        */
        log("auto is not supported yet", { type: "warning" });
        augmentedItems = augmentedItems;
      }

      return { [dbFileLocation.absolutePath]: augmentedItems };
    });

    performance.push(getNewPerformance("dbContentPromises", executionId));

    const dbContent = (await Promise.all(dbContentPromises)).filter(notEmpty);

    performance.push(getNewPerformance("dbContent", executionId));

    const dbContentObject = mergeObjectsArray(dbContent);

    performance.push(getNewPerformance("dbContentObject", executionId));

    // console.log("get performance", performance);

    cleanupTimer(executionId);

    // console.log({ dbContentObject });
    return dbContentObject;
  };

  const get: DbGet = async <TModelName extends keyof TModels>(
    modelName: TModelName,
    config?: GetQueryConfig
  ) => {
    const items = (
      Object.values(
        await getByFile(modelName, config)
      ) as TModels[TModelName][][]
    ).flat();
    return items;
  };

  /**
   * Clear all database files for the given model by removing them from disk
   */
  const clear = async <TModelName extends keyof TModels>(
    modelName: TModelName,
    config?: CustomQueryConfig
  ) => {
    const mergedConfig = mergeConfigs(modelName, dbConfig, config);
    const locations = await getDatabaseFiles(modelName, mergedConfig);

    await mapMany(
      locations,
      async (loc) =&gt; fs.existsSync(loc.absolutePath) &amp;&amp; fs.rm(loc.absolutePath),
      maxConcurrency
    );

    return {
      amountRemoved: locations.length,
      isSuccesful: true,
      message: `${locations.length} files removed`,
    };
  };

  const set: DbSet = async <TModelName extends keyof TModels>(
    modelName: TModelName,
    data: Creation[],
    config?: CustomQueryConfig
  ) => {
    const mergedConfig = mergeConfigs(modelName, dbConfig, config);
    const { dbStorageMethod } = mergedConfig;
    const itemsPerFile = await groupByFile(data, mergedConfig, modelName);
    const locations = await getDatabaseFiles(modelName, mergedConfig);

    await mapMany(
      locations,
      async (dbFileLocation) =&gt; {
        // First remove the file
        if (fs.existsSync(dbFileLocation.absolutePath)) {
          log(`Removing ${dbFileLocation.absolutePath}`, { type: "debug" });
          await fs.rm(dbFileLocation.absolutePath);
        }
      },
      maxConcurrency
    );

    // Then, if there are new items for that file location, also set that file to contain the new items

    const upsertResults = (
      await mapMany(Object.keys(itemsPerFile), async (fileKey) => {
        const value = itemsPerFile[fileKey];

        if (!value) return;

        const { dbFileLocation, items } = value;

        log(`set new values to there: ${items.length}`, {
          type: "debug",
        });
        // if the item-array is empty, upsert nothing.
        if (items.length === 0) return;

        const result = await upsertItems(
          dbStorageMethod,
          dbFileLocation,
          items
        );

        return result;
      })
    ).filter(notEmpty);

    const amountInserted = sum(upsertResults.map((x) => x.amountInserted || 0));

    return {
      isSuccesful: true,
      amountInserted,
    };
  };

  const upsert: DbUpsert = async <TModelName extends keyof TModels>(
    modelName: TModelName,
    data: Creation | Creation[],
    config?: UpsertQueryConfig
  ) => {
    const mergedConfig = mergeConfigs(modelName, dbConfig, config);
    const { dbStorageMethod } = mergedConfig;
    const creationItems = makeArray(data);

    //  splits the items into the needed files
    const dataPerStorageFile = await groupByFile(
      creationItems,
      mergedConfig,
      modelName
    );

    // console.log({ creationItems, dataPerStorageFile });

    //  upserts items for every file, grouped, efficiently.
    const result = await mapMany(
      Object.keys(dataPerStorageFile),
      async (absolutePath) =&gt; {
        const itemsObject = dataPerStorageFile[absolutePath];
        const { dbFileLocation, items } = itemsObject;
        if (config?.removeUntouched && fs.existsSync(absolutePath)) {
          await fs.rm(absolutePath);
        }

        // console.log(
        //   `upserting ${dbStorageMethod} ${modelName}`,
        //   dbFileLocation,
        //   items
        // );
        const result = await upsertItems(
          dbStorageMethod,
          dbFileLocation,
          items,
          config?.onlyInsert
        );

        return result;
      },
      maxConcurrency
    );

    return {
      isSuccesful: true,
      message: `Upserted into ${result.length} files`,
    };
  };

  const remove: DbRemove = async <TModelName extends keyof TModels>(
    modelName: TModelName,
    removeWhere: (content: TModels[TModelName]) => boolean,
    config?: CustomQueryConfig
  ) => {
    const mergedQueryConfig = mergeConfigs(modelName, dbConfig, config);
    const dbFiles = await getDatabaseFiles(modelName, mergedQueryConfig);

    const amountRemovedArray = await mapMany(
      dbFiles,
      async (dbFileLocation) =&gt; {
        const { amountRemoved } = await removeMultiple(
          mergedQueryConfig.dbStorageMethod,
          dbFileLocation,
          (content) =&gt; removeWhere(content as TModels[TModelName])
        );

        return amountRemoved || 0;
      },
      maxConcurrency
    );

    const amountRemoved = sum(amountRemovedArray);

    if (amountRemoved === 0) {
      return { isSuccesful: false, message: "Nothing removed", amountRemoved };
    }

    return {
      amountRemoved,
      isSuccesful: true,
      message: "Items removed",
    };
  };

  const update: DbUpdate = async <TModelName extends keyof TModels>(
    modelName: TModelName,
    updateWhere: undefined | ((content: TModels[TModelName]) => boolean),
    map: (oldValue: TModels[TModelName]) => TModels[TModelName],
    config?: CustomQueryConfig
  ) => {
    // `get` -> `update` -> `groupByFile(newItems)` -> `set` (overwrite those files, remove leftover files)

    const data = await get(modelName, config);

    let amountUpdated = 0;

    const newData: Creation[] = data.map((item) => {
      const needsUpdate = updateWhere ? updateWhere(item) : true;
      if (needsUpdate) {
        amountUpdated++;
      }
      return needsUpdate ? map(item) : item;
    });

    const { isSuccesful, message } = await set(modelName, newData, config);

    const result: DbQueryResult = {
      amountUpdated,
      message,
      isSuccesful,
    };

    return result;
  };

  return {
    get,
    getDbFileLocationPath,
    getByFile,
    clear,
    upsert,
    set,
    remove,
    update,
  };
};
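
To show how the pieces fit together, here is a minimal usage sketch. It assumes the TModels generic that the description above mentions (the generic parameters were stripped from the listing), and the model shape is invented:

// hypothetical model map; in typerepo these come from your TS interfaces
type MyModels = {
  MarkdownTestModel: { id: string; name: string };
};

const db = createDb<MyModels>();

// upsert groups the items per storage file and writes each file once
await db.upsert("MarkdownTestModel", { id: "1", name: "hello" });

// get flattens all matching db files into a single array
const items = await db.get("MarkdownTestModel");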

Device

A Device that accesses any King OS API.

A device can be connected to a person. A person can have multiple Devices.

A Device does not necessarily have King OS installed itself; it can also be a visitor to someone else's King OS app.

TsFunction

Interface for arrow functions and normal functions

TodoFile

Any markdown file in the todo folder should become this model

/**
 * Any markdown file in the todo folder should become this model
 */
export interface TodoFile extends MarkdownModelType {
  priority?: TodoPriority;

  // `TodoOffer` config. Must be flat because it's a `MarkdownModelType`

  /**
   * overwrites visibility for freelancer
   *
   * by default a todo is visible, unless specifically hiding it
   *
   * by default a todo with `isDraft: true` is hidden, unless specifically making it visible
   */
  isHiddenForFreelancer?: boolean;
  /**
   * make todo claimable by a freelancer
   */
  isClaimable?: boolean;

  /**
   * Price that, if offered by a freelancer, will be accepted.
   */
  doNowPrice?: Price;

  /**
   * admin can specify when this needs to be finished
   */
  deadlineAt?: number;

  /**
   * Source needed from these operations, can be made accessible after accepting the offer
   */
  neededOperation_packageJsonSlugs: Id[];
  todoOffersCalculated?: TodoOffer[];

  /**
   * special categories that augment todo-ui functionality:
   *
   * - `ideas` can have altered visibility, according to config
   * - `done` can have altered visibility, according to config
   * - `backlog` can have altered visibility, according to config
   *
   * Not sure if this should really be fixed, but a convention is always good.
   */
  categoryStackCalculated: CategoryStack;
}
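
As an illustration of the model above, a hypothetical TodoFile instance could look like this; the MarkdownModelType fields are omitted, and the TodoPriority, Price, and CategoryStack values are invented:

const exampleTodo: Partial<TodoFile> = {
  priority: "high" as TodoPriority, // hypothetical priority value
  isClaimable: true,
  doNowPrice: 100 as Price, // hypothetical price value
  deadlineAt: Date.now() + 7 * 24 * 60 * 60 * 1000, // one week from now
  neededOperation_packageJsonSlugs: [],
  categoryStackCalculated: ["backlog"] as CategoryStack, // hypothetical stack
};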



Typerepo usability

🌈 I am getting so close to a release now! But I'm not done yet. Before I do a big release like this, I want it to be more usable, so I came up with the following improvements.

✅ Ensured search-web starts up via pm2-util once you start the server (only if the port is not already in use; see the port-check sketch after this list). The server should print the URL in the console, or just open it directly, after startup.

✅ Added script to templates that starts up the server when you run yarn dev in root

✅ In search-web, the timeline is great, but far from done yet. Omitted it for now, so search-web becomes fast. ⚡️
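
For the port check, a minimal Node sketch of "only start if the port is not in use" could look like this; pm2-util's real implementation may differ:

import net from "net";

// resolves true if nothing is listening on the port yet
const isPortFree = (port: number) =>
  new Promise<boolean>((resolve) => {
    const server = net.createServer();
    server.once("error", () => resolve(false)); // e.g. EADDRINUSE
    server.once("listening", () => server.close(() => resolve(true)));
    server.listen(port);
  });

// usage: if (await isPortFree(3000)) { /* start search-web via pm2 */ }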

🌈 Now I think search-web can easily be included in the bundle. Great! The typerepo is also a lot easier to start up now. That's amazing. The only thing I need to finish for this release is making the whole database UI and function UI more functional. They're currently super buggy due to the many refactors I did over the past weeks. Besides, I have many ideas to improve them that haven't been implemented yet.

Let's goooooo 🛻🛻🛻🛻🛻

search-web

I didn't write a good description for this yet. Please let me know if you want to know more

pm2-util

I didn't write a good description for this yet. Please let me know if you want to know more

Making the Database UI work again

✅ Upsert didn't work somehow. There were some problems with the URL paths because of the refactor. The solution was simply to create a new function useModelQuery to correctly get the model name again (see the sketch below)...

✅ Ensured linking is fine again. This was also easy to fix, for the same reason as the previous item: the links still led to the old, pre-refactor locations.

✅ Showed assets successfully in the table, using ModelItemAssetView. We first need to detect whether something is a BackendAsset. If that's the case, we can use the ModelItemAssetView for it. The columns are created in the ModelComponent file... We can detect whether it's an asset input type (based on the property name) using getAssetInputType from name-conventions.

✅ In react-with-native-table I added the BackendAsset PresentationType and rendered the ModelAssetView there.

✅ I had to make some more changes, also in the backend, to make it possible again to expose referenced assets... After some coding, I got it done, and now the admin beautifully shows images, audio, or anything else, straight from the backend! 🌈

YAY! FINALLY IT WORKS. AMAZED!

❌ Add stylePreset to ensure assets show up small. I think it already looks fine for now, and we don't need to omit the download button either; in the db-admin that's super useful... :) I'm happy with the result.

I'm happy that I've made the DB admin functional again. It's also a lot faster because of sdk-interface-paths. Not bad! Next up is the function-ui, because that is also very nice to have for typerepo.

If we have more time after that, we can make the DB-UI even better, but let's see!
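
Since I mentioned useModelQuery above, here is a minimal sketch of what such a hook could look like, assuming a Next.js dynamic route like /[model]; the real implementation may differ:

import { useRouter } from "next/router";

// hypothetical sketch: read the model name back out of the dynamic route segment
export const useModelQuery = (): string | undefined => {
  const router = useRouter();
  const { model } = router.query;
  return typeof model === "string" ? model : undefined;
};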

BackendAsset

Part of the asset that should be sent to the backend. The rest should stay frontend-only.

Some values are stored, some are not

ModelItemAssetView

(props: {
  item: T;
  backendAsset?: BackendAsset;
  hideDownloadLink?: boolean;
  className?: string;
}) =&gt; {
  const { backendAsset, item, hideDownloadLink, className } = props;

  return backendAsset ? (

  ) : null;
};

ModelComponent

In the table headings, all xxxSlug, xxxId, etc. should be called xxx.

In the table values, all slugs and ids should show the name of the instance of the referred model.

It has to be possible to navigate to an id or slug using #[id] or #[slug] in the URL; just add div ids to all rows.
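
As a sketch of that heading rule (a hypothetical helper, not the actual implementation):

// "authorSlug" -> "author", "personId" -> "person", "tagIds" -> "tags"
const headerLabel = (key: string) => key.replace(/(Slug|Id)(s?)$/, "$2");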

(props: { modelName?: string; highlight: Highlight }) => {
  const { modelName, highlight } = props;
  const alert = useAlert();
  const router = useRouter();

  const views = modelViews.map((modelView) => ({
    value: modelView.view,
    label: `${modelView.emoji} ${modelView.view}`,
  }));

  const [SelectView, viewItem] = useSelect(views, views[0]);
  const view = viewItem!.value;

  const metadataQuery = queries.useGetDbModelMetadata(modelName as DbModelEnum);
  const { datasets, tsInterface } = destructureOptionalObject(
    metadataQuery.data?.result
  );

  const datasetItems = datasets?.map((dataset) => ({
    label: dataset.name,
    value: dataset.id,
    data: dataset,
  }));

  const datasetSelectItems: Item[] = [
    { value: "", label: "Select a dataset" },
    ...(datasetItems || []),
    { value: "new", label: "(+) New dataset" },
  ];

  const [SelectDataset] = useSelect(
    datasetSelectItems,
    undefined,
    (newValue) =&gt; {
      if (newValue?.value === "new") {
        // show a blank screen
        setDatasetConfig({ key: `config${Math.random()}` });
        return;
      }

      if (newValue?.value === "") {
        setDatasetConfig(null);
        return;
      }

      if (newValue?.data) {
        setDatasetConfig({ ...newValue.data, key: `config${Math.random()}` });
        return;
      }
    }
  );

  const [datasetConfig, setDatasetConfig] = useStore("db-crud.datasetConfig");

  const model = useInfiniteGetDbModel();
  const modelReferences = queries.useGetReferencableModelData(
    modelName as DbModelEnum
  );

  const isLoading = model.isLoading || model.isRefetching || model.isFetching;

  const allData = model?.data?.pages
    .map((x) =&gt; x.result?.data)
    .flat()
    .filter(notEmpty);

  // const count = sum(model.data?.pages.map((x) => x.result?.data.length || 0) || []);

  const indexDescription = tsInterface ? (

      <p>{tsInterface.name}</p>

      {renderMarkdownContent(tsInterface.description || "No description", {
        projectRelativeBaseFolderPath: getFolderJs(
          tsInterface.projectRelativePath
        ),
        projectRelativeMarkdownFilePath: tsInterface.projectRelativePath,
      })}

  ) : isLoading ? (

  ) : (
    "No index found"
  );

  const headerButtons = (

       router.push(`/upsert/${modelName}`)}
        title="New"
        emoji="➕"
      /&gt;

       model.refetch(),
          title: "Reload",
          emoji: isLoading ? undefined : "🔄",
          component: isLoading ? () =>  : undefined,
        }}
      /&gt;







  );

  const onEndReached = () => {
    const pages = model.data?.pages;

    const lastPage = pages ? pages[pages.length - 1] : undefined;

    const hasMore = lastPage?.result?.hasMore;

    if (hasMore &amp;&amp; !model.isFetchingNextPage) {
      model.fetchNextPage();
    }
  };

  const deleteItem = (item: AugmentedAnyModelType) => {
    alert?.("Are you sure?", "Do you want to delete this one?", [
      {
        text: "Yes",
        style: "destructive",
        onPress: () =&gt; {
          if (item?.id) {
            deleteDbModel(modelName as any, item.id).then((result) => {
              model.refetch();
              modelReferences.refetch();
            });
          }
        },
      },
      { text: "Cancel", style: "cancel" },
    ]);
  };

  const deleteAction: ItemAction = {
    action: deleteItem,
    emoji: "❌",
    name: "Delete",
  };

  const updateItem = (item: AugmentedAnyModelType) =>
    router.push(`/upsert/${modelName}?id=${item?.id}`);

  const updateAction: ItemAction = {
    name: "Update",
    emoji: "✏ī¸",
    action: updateItem,
  };

  const actions: ItemAction[] = [deleteAction, updateAction];

  const CrudView = {
    table: CrudTable,
    grid: CrudGrid,
    timeline: CrudTimeline,
    tree: CrudTree,
  }[view];

  const crudViewProps: CrudViewProps = {
    actions,
    data: allData,
    highlight,
    tsInterface,
    onEndReached,
  };

  return (


        {headerButtons}
        {indexDescription}

        {datasetConfig &amp;&amp; modelName ? (

        ) : null}


      {/* NB: here a table view should be rendered */}
      {Array.isArray(allData) && allData.length > 0 && CrudView ? (

      ) : null}

  );
};

getAssetInputType

Convention parameters for assets: [name], [name]s, xyz[Name], xyz[Name]s

Different ways to name assets: image, video, audio, file or just asset, which is everything together

(
  parameterName: string,
  valueType?: SimplifiedSchemaType
): AssetInputType | undefined => {
  const lastWord = lowerCaseArray(parameterName).pop();
  if (!lastWord) return;

  const isObjectOrArray =
    !valueType || valueType === "array" || valueType === "object";

  if (!isObjectOrArray) return;

  if (
    ["image", "video", "audio", "file", "asset"].includes(singularize(lastWord))
  ) {
    return {
      type: singularize(lastWord) as AssetInputType["type"],
      isMultiple: isPlural(lastWord),
    };
  }

  return;
};
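
A few examples of what this should return, with hypothetical parameter names:

getAssetInputType("coverImage", "object"); // { type: "image", isMultiple: false }
getAssetInputType("audios", "array"); // { type: "audio", isMultiple: true }
getAssetInputType("title", "string"); // undefined: a string is not an object/array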

name-conventions

I didn't write a good description for this yet. Please let me know if you want to know more

react-with-native-table

I didn't write a good description for this yet. Please let me know if you want to know more

BackendAsset

Part of the asset that should be sent to the backend. The rest should stay frontend-only.

Some values are stored, some are not

sdk-interface-paths

I didn't write a good description for this yet. Please let me know if you want to know more

typerepo

The new way to dev

Find more on GitHub


Function UI Improvements

ℹī¸ Function-UI is something I made in just 4 hours until now... It's just NOT finished, so there are many things to improve. I'm not going to show you every detail, but I'll make a list of things I have improved.

I'm going to improve the UX all over the place, and just flow on this :)

TODO:

  • ✅ Add loading indicator when loading a function
  • ✅ Add function page title
  • ✅ Add loading when executing a function
    • ✅ Added loading state to function-form
  • ✅ Add feedback after executing a function
    • ✅ created a new package cool-toast to standardise my toast messages.
    • ✅ added some really nice toast messages based on the expected response (which is super diverse)
    • ✅ added a JSON renderer with markdown to also show the raw JSON response.
  • ✅ Added a button to open a file in the VSCode editor (use api with vscodeOpen)
  • ✅ Added a default loader in big-button so you don't have to have external state for that...
  • ✅ Removed all tabs that aren't ready yet so it can be released.
  • ✅ Made it possible to insert the parameters of a previous execution into the form. Wow, this is awesome ⚡️
  • ✅ Show the JSON of executions nicely using MarkdownCodeblock

Wow! It's just past one pm now, and I've done a great deal already. I'm going to wrap up soon because my meeting starts at 3pm (end-of-week review)... All I want to do before that is make a functional typerepo bundle... Let's do it 💪

Here are some previews

You can see the code for any function in your project, and open it in VSCode with the click of a button...

You can execute the function from a form after making it...

You can see the recent executions in a simple overview, you can turn them into tests or examples with the click of a button...

Isn't this awesome? A good start for a great dev experience!


Release typerepo bundle

Time to bundle! First I needed to ensure every Operation's .operation.indirectDependencies are correct. Seems alright after adding them for function-web.

I added search-web and function-web into the bundle, and ran generateBundles typerepo

Found 5 apps, 144 modules, 1 packages

This seems great, but let's not cheer too early!

src/sdk-api.ts(129,10): error TS2305: Module '"function-server-endpoints"' has no exported member 'getNewPerformance'.

Right... Need to rebuild everything and create a new SDK.... Gonna take a while. Let's take a break!

During rebuilding I saw that a lot of operations I recently added to the repo weren't building right. To fix that, I needed to remove one and refactor a couple of others. Now it seems better, but it may very well still end up causing errors in the SDK, so let's see...

In the meantime, we still need to re-index ±30 operations, almost done!

💤

15 minutes later, all packages seem to be indexed nicely. Let's try generating an SDK for my own project first. If that works, we can generate the bundle.

Another 15 minutes later and boom! We have a new working SDK. Let's bundle...

Found 5 apps, 147 modules, 1 packages

It even found 3 more modules. Crazy.

✨  Done in 5.92s.
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-interface-paths ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-db ✅
{
  tsFunctions: 961,
  manualProjectRoot: '/Users/king/King/bundled/typerepo'
}
{ exportedFunctions: 822 }
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-api-keys ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-ui ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-api ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-js ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-env-private ✅
Compiling source /Users/king/King/bundled/typerepo/packages/generated/sdk-env-public ✅
Installing repo ✅
Enter fullscreen mode Exit fullscreen mode

YES! Let's try it out and see if it doesn't crash...
It does crash, but only at search-web! Because this is already a huge improvement, I decided to take a look and fix it. The problem was that authentication was still doing outdated things that I had already refactored today.

After fixing it, I built a new one... IT WORKS 🎉🎉🎉🎉🎉🎉

It's pushed now and available at typerepo.com

This new bundle should be much better than the previous one, with many bugs weeded out. It's also great to be able to start up any UI from search-web, and the function admin is finally there!

Very happy to release this new version of typerepo, hope you like it! You can find it at typerepo.com

Operation

Model for typerepo operations. Stored in package.json in every package (compatible with the regular npm package.json data structure). An Operation is an NPM package that applies the typerepo convention.

TODO: add a validation to package.json files for the whole project, to ensure I can apply the fs-orm convention
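
For reference, such an operation's package.json could look like this; all values here are invented, only the .operation.indirectDependencies key is taken from the release story above:

{
  "name": "function-web",
  "version": "0.1.0",
  "operation": {
    "indirectDependencies": ["search-web"]
  }
}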

function-web

I didn't write a good description for this yet. Please let me know if you want to know more

search-web

I didn't write a good description for this yet. Please let me know if you want to know more

authentication

I didn't write a good description for this yet. Please let me know if you want to know more

typerepo

The new way to dev

Find more on GitHub

