Today marks the v0.1.0 release of Explainer.js, a CLI tool that processes source files and outputs their code blocks along with explanations generated by an LLM. Built with Commander.js and the Groq SDK in JavaScript!

I've been working on it for the past week and finally got it ready for the v0.1.0 release. It's an open source project I was tasked with building for my Topics in Open Source Development course. If you ever find yourself reading code and you don't care *how* it does what it does, just *what* it does (like me), you'll find this tool pretty useful. If you want to know how to use it, keep reading, or if you'd rather check it out yourself, visit the repo and read the detailed (and formal) README.md.
https://github.com/aamfahim/explainer.js
Some Prerequisites
- Have the latest stable (LTS) version of Node.js installed.
- Optional: a valid `.env` file with your API credentials fully typed out.
- Check out the Groq documentation if you plan on using a custom model or API, since Groq doesn't support everything. If you're not sure, stick to the defaults! All you need is a Groq API key, which is pretty easy to get.
How-to
If you want to check out how it works, you can try:
node index.js examples/bubble_sort.js
After `node index.js`, you can pass your own path(s), for example:
node index.js ../../../project/src/index.js
Or you can pass multiple files at once:
node index.js ../../project/src/index.js ../../project2/src/index.js
I'd suggest specifying an output file to dump the results into if your file is long or you're passing multiple files:
node index.js ../project/src/index.js ../../project2/src/index.js -o output.md
Here's the list of options available to you as of today.
Options

- `-a, --api-key <your-key>`: (Required) The API key used for processing. Can be set via the `.env` file (`API_KEY`) or passed as a command-line argument.
- `-b, --baseURL <url>`: The base URL for the API. Default: `https://api.groq.com/`.
- `-m, --model <model-name>`: The model name to use for processing. Default: `llama-3.1-70b-versatile`.
- `-o, --output <file>`: The output file path where the results will be saved.
- `-t, --temperature <number>`: The temperature for the model, ranging from 0 to 2. Default: `1`.
- `-u, --token-usage`: Display the number of tokens sent in the prompt and the number of tokens returned. Default: `false`.
- `-h, --help`: Display the help message.
- `-v, --version`: Display the current version of the tool.
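Putting a few of these options together, a typical invocation might look like this. The file path and API key are placeholders, so treat this as an illustrative sketch rather than a copy-paste command:

```shell
# Explain one file with an explicit API key, a lower temperature,
# token-usage reporting, and results written to explanations.md
node index.js src/index.js \
  -a gsk_your_groq_api_key_here \
  -t 0.5 \
  -u \
  -o explanations.md
```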
Closing Thoughts
Check out the repo and try the tool, and if you have any feedback or find any bugs, file an issue!