Peter Wan

Dev log: gimme_readme 0.1 release

On September 13th, 2024, I did something I had never done before: I published my program, gimme_readme, to the npm registry for the whole world to use!

The gimme_readme command-line tool takes a user's local source code files and uses them to create a README.md file that explains their code. It lets you select between different AI providers (e.g., Gemini's gemini-1.5-flash model and Groq's llama3-8b-8192 model) to analyze the provided code and generate that documentation.

(Demo: gimme_readme 0.1 release demo)

To learn more about gimme_readme, I invite you to check out my repository below, or to watch the 0.1 release demo, which is also linked in my repository.

GitHub: peterdanwan / gimme_readme

gimme_readme is a command-line tool powered by AI that generates a comprehensive README.md file for your project. It analyzes multiple source code files at once, providing concise explanations of each file's purpose, functionality, and key components, all in a single, easy-to-read document.

See our 0.1 Release Demo!


Table of Contents

  1. Developing gimme_readme
  2. Getting started with gimme_readme
  3. Features of gimme_readme
  4. Example usage
  5. Conclusion
  6. Links

Developing gimme_readme

"Stand on the shoulders of giants"

This quote echoed in my head as I was creating my command-line tool, since I know very well that without the work of many other companies and individuals, I would not have been able to release my own project.

To that end, let me delve into some of the technologies I used to create gimme_readme.

To start, I knew I wanted to work with JavaScript because of its simple syntax and its ability to run on Linux, Mac, and Windows. Cross-platform availability is something I value, so JavaScript was an easy choice from the start.

After choosing JavaScript as the language I'd write in, I thought about how I would publish my code. The first thought that came to mind was npm. npm, the Node package manager, is the largest open source registry in the world: people from around the world install code from npm and share their own code on it, and the whole process is very straightforward.

When I started my computer science journey in 2022, I was fascinated with how easy it was to just write:

npm i NPM_PACKAGE

and my code would magically work. I was even more impressed when I found out that well-maintained packages could be installed across different operating systems.

To show you how friendly the Node and npm ecosystem is, let me show you how easy it is to turn your JavaScript code into an executable that runs on every operating system.

You can make your script executable by adding a line similar to this to your package.json file:

{
  "bin": {
    // Makes an executable called "gr-ai" which simply calls my JS script
    "gr-ai": "./src/_gr.js"
  }
}

How neat is that? With just a few lines (minus my comment, which a real package.json can't actually contain), you are halfway to having an executable called gr-ai that calls ./src/_gr.js and runs on all operating systems.
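
The other half is the script itself: on macOS and Linux, the file referenced by bin needs a Node shebang as its first line so the operating system knows how to run it. Here's a minimal sketch of what the top of a file like ./src/_gr.js might look like (the console.log is just a placeholder, not the real logic):

#!/usr/bin/env node
// ./src/_gr.js: the script that the "gr-ai" executable points to.
// The shebang above tells Unix-like systems to run this file with Node.

console.log('Hello from gr-ai!'); // placeholder for illustration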

The final piece of the puzzle is either simulating how you would publish your code or publishing it for real.

To simulate publishing your code, run the following command in the root of your JavaScript project (i.e., where your package.json is):

npm link

This command simulates a global install of your program and gives you access to your own gr-ai command!

If you no longer want the command installed globally (whether you linked it with npm link or installed it for real from npm), you can run:

npm uninstall -g gimme_readme

Please note that you need to specify the name of your package when uninstalling, not the name of your executable.

I had to simulate publishing my code several times before actually publishing it to npm. For a really good guide on publishing your code to the npm registry, I suggest watching Web Dev Simplified's video on creating and publishing your first npm package.

With direction on how I'd publish my code, I was able to start thinking about all the different dependencies I would need to get my program to work.

The dependencies and packages I'm currently using for gimme_readme are listed below (with a rough sketch of how they fit together after the list):

  • @google/generative-ai & groq-sdk, which give me access to different LLMs that help explain the user's source code
  • commander, which made it easy to configure the different options of my command-line tool
  • chalk, which allows me to colourize my command-line text
  • dotenv, which helps me work with secret files that store sensitive information
  • ora, which provides a loading spinner for the command line

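To give you a feel for how these pieces fit together, here is a minimal sketch of a command-line entry point wired up with commander, chalk, ora, and dotenv. It is not gimme_readme's actual source, just an illustration of the pattern (the descriptions and spinner text are placeholders of my own):

#!/usr/bin/env node
import { Command } from 'commander';
import chalk from 'chalk';
import ora from 'ora';
import 'dotenv/config'; // load environment variables from a .env-style file

const program = new Command();

program
  .name('gr-ai')
  .description('Explain local source code files with an AI model')
  .option('-f, --files <paths...>', 'source files to explain')
  .option('-m, --model <name>', 'AI model to use')
  .option('-o, --output <file>', 'file to write the explanation to');

program.parse(process.argv);
const options = program.opts();

const spinner = ora('Contacting the AI model...').start();
// ...call the chosen AI provider with options.files, options.model, etc....
spinner.succeed(chalk.green('Done!'));
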
It was with these great APIs and libraries that I was able to produce a tool of my own. With that said, let me show you how you can get started with gimme_readme so you can make heads or tails of your local source code files.

Getting started with gimme_readme

To get started with gimme_readme, follow these steps:

1. Install the latest version of Node.js for your operating system

The download for Node.js can be found here: https://nodejs.org/en/download/package-manager.

Node.js will come with npm and allow you to install gimme_readme.

2. Run the following command to install gimme_readme globally

npm i -g gimme_readme

NOTE: macOS/Linux users may need to run sudo npm i -g gimme_readme

3. Generate a configuration file by running the following command in any folder you'd like

gr-ai -c

This command creates a .gimme_readme_config file in your home directory. Do not move this file from this location.

Follow the instructions in the file to create your own API keys and set your own default values.

(Screenshot: the generated .gimme_readme_config file)
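
If you're curious how a tool can pick up a file like this, here is a rough sketch using dotenv. The key=value format is an assumption on my part rather than a description of gimme_readme's internals, and GEMINI_API_KEY is just a hypothetical variable name:

import os from 'node:os';
import path from 'node:path';
import dotenv from 'dotenv';

// Load key=value pairs from the config file sitting in the home directory.
dotenv.config({ path: path.join(os.homedir(), '.gimme_readme_config') });

// Hypothetical key name for illustration; the real config file tells you which keys to set.
console.log(process.env.GEMINI_API_KEY ? 'Gemini key found' : 'No Gemini key set');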

Congratulations! You just installed gimme_readme on your system, and if you created your own API keys, you should be able to use gimme_readme on the command line!

With installation out of the way, let's delve into how you can use gimme_readme.

Features of gimme_readme

At a top level, gimme_readme supports the following features:

  1. The ability to display a help page.
  2. The ability to get gimme_readme's version number.
  3. The ability to create a .gimme_readme_config file or locate your existing one.
  4. The ability to send it source files and have an AI model explain them.
  5. The ability to choose where the AI model's explanation is output (i.e., to a file or to your terminal).
  6. The ability to specify the AI model that provides explanations for you.
  7. The ability to send your own custom AI prompt.
  8. The ability to set the temperature of your model (i.e., how deterministic you want your model's responses to be).
  9. The ability to see how many tokens were consumed when prompting an LLM (see the sketch after this list).

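As a rough illustration of the last two features, here is a sketch of how temperature and token usage appear when calling Gemini through @google/generative-ai. This is a generic example of that SDK, not gimme_readme's internals, and it assumes an ES module context with a GEMINI_API_KEY environment variable:

import { GoogleGenerativeAI } from '@google/generative-ai';

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY);

// Temperature controls how deterministic the responses are (lower = more deterministic).
const model = genAI.getGenerativeModel({
  model: 'gemini-1.5-flash',
  generationConfig: { temperature: 0.5 },
});

const result = await model.generateContent('Explain this code: ...');
console.log(result.response.text());

// usageMetadata reports how many tokens the prompt and the response consumed.
console.log(result.response.usageMetadata);
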
Let me show you demonstrations of some of these features.

Example usage

Display the help page

The most basic gimme_readme command is:

gr-ai

(Screenshot: gr-ai help output)

This shows us how to use gr-ai and its different options.

Display the version number

Providing the -v option to the gr-ai command returns the version number:

gr-ai -v

(Screenshot: gr-ai version output)

Create a .gimme_readme_config file or find the path to your existing one

gr-ai -c

Take several input files, choose your LLM of choice, and output the selected LLM's response to a file

#         file            file         model               output file
gr-ai -f .prettierignore .gitignore -m gemini-1.5-flash -o explain.md

(Screenshot: contacting the AI model)
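
Under the hood, a command like the one above boils down to reading the given files, sending their contents to the chosen model, and writing the answer out. Here is a simplified sketch of that flow; explainFiles and askModel are hypothetical names of my own, not the tool's actual functions:

import { readFile, writeFile } from 'node:fs/promises';

async function explainFiles(filePaths, askModel, outputFile) {
  // Read every input file and label it so the model knows which file is which.
  const sources = await Promise.all(
    filePaths.map(async (p) => `// File: ${p}\n${await readFile(p, 'utf8')}`)
  );

  const prompt = `Explain the following source files:\n\n${sources.join('\n\n')}`;
  const explanation = await askModel(prompt); // delegate to Gemini, Groq, etc.

  // Write to a file if one was given; otherwise print to the terminal.
  if (outputFile) {
    await writeFile(outputFile, explanation);
  } else {
    console.log(explanation);
  }
}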

Conclusion

If you made it this far, I'd like to thank you for giving this blog a read. Creating the 0.1 release of gimme_readme has been a great experience, and I’m excited to continue developing new features and improving the tool. If you're interested in trying it out or contributing, feel free to check out the GitHub repository.

Stay tuned for more updates in the coming weeks!

Links

  • GitHub repository: https://github.com/peterdanwan/gimme_readme
  • npm package: https://www.npmjs.com/package/gimme_readme
