shrey vijayvargiya

AI + Humans Automating Our Web Development Tasks

Well, the story begins 4 years ago, before AI. Now we have AI, so automation has become a cakewalk.

Today's blog, and the story behind it, began 3 years ago.

I am simply telling you this to show that no work is ever a waste, so keep doing the good work. And how do you find the good work? It's simple: the work you love the most.

Never mind, welcome back, people; this article is just a small story.

Back then, I was writing automation scripts, and I was very sincere about the task of automating repetitive work.

My thinking was simple: doing the same work again and again brings no new learning at all, so instead of wasting time on it, we should automate it and shift our focus to learning new stuff, connecting the dots and working smartly.

The script's task was to create an entire full-stack repository, frontend and backend, with all the basic required packages installed and configured. On the internet, such repositories are called boilerplates or starter kits.

Meaning, you have a directory, or a so-called GitHub repository, with several packages installed. For example, if we look at a frontend repository in 2024, what is it made up of?

In my opinion, it can be made up of the following (a minimal install sketch follows the list):

  • Next.js as the framework
  • React, since JSX will be used
  • UI library for UI components
  • Animation Library
  • Tailwind CSS, styled-components, SCSS for styling
  • Redux, MobX, Xstate, and Zustand for state management
  • Firebase, Supabase, Appwrite, and other DBs as the database
  • PayPal and Stripe for payments
  • Analytics SDK package
  • SEO packages
  • Other small npm packages for daily development

The list can still go way longer if this boilerplate is meant for production-level applications, especially ones serving millions of users.
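
To make that concrete, here is a minimal sketch of how such a stack could be installed in one go, in the same execSync style used later in this article. The specific package choices (next, tailwindcss, framer-motion, zustand, firebase, @stripe/stripe-js, next-seo) are my assumptions for illustration, not a fixed recommendation.

// Minimal sketch: install one possible 2024 frontend stack in a single command.
// The package list below is an assumption drawn from the bullets above.
const { execSync } = require('child_process');

const frontendPackages = [
    'next', 'react', 'react-dom',       // framework + JSX
    'tailwindcss', 'styled-components', // styling options
    'framer-motion',                    // animation library
    'zustand',                          // state management
    'firebase',                         // database / backend-as-a-service
    '@stripe/stripe-js',                // payments
    'next-seo',                         // SEO helpers
];

// stdio: 'inherit' streams npm's output straight to the terminal
execSync(`npm install ${frontendPackages.join(' ')}`, { stdio: 'inherit' });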

Taking everything to production is altogether a new game; it is like a real football match versus a practice match.

In production, anything can happen, and the toughest situation is when you can't tell the state of the existing application. One can easily tell the status; the state is difficult, because we need to debug for that.

Never mind, the above boilerplate sounds good for small to decent-scale applications.

Imagine this: you are a developer working on multiple projects.

Will you create everything from scratch again for the next project?

Will you redo everything from scratch? I am not talking about the project-specific core requirements, but most projects can be built using the above repository.

We should reuse our work. To do that, we need to build the basic or general boilerplate once and keep reusing it in all our further projects.

Let's name this fundamental repository the master repository, or the parent repository. Every other project then has its own repository, which will be a child of the parent repository, or a branch of the master repository.
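
As a rough sketch of that parent/child idea, cloning the master repository into a new child project could look like this; the repository URL and project name below are placeholders, not a real repo.

const { execSync } = require('child_process');

// Placeholder values, only for illustration
const masterRepo = 'https://github.com/your-username/master-boilerplate.git';
const projectName = 'my-new-project';

// Clone the master (parent) repository into a fresh child directory,
// detach it from the parent remote, and install its dependencies
execSync(`git clone ${masterRepo} ${projectName}`, { stdio: 'inherit' });
execSync(`cd ${projectName} && git remote remove origin && npm install`, { stdio: 'inherit' });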

The master repository is, and will be, our guiding principle, and it should keep evolving to become the best repository.

How to measure which codebase or repository is best?

In my opinion, the one that serves millions of users at a low cost is the best.

So our goal will be to create the best master repositories for ourselves in the frontend, backend, and full-stack domains, so that we can serve millions of users at a penny's cost.

Remember, cost and performance also depend on the code, so we can't ignore that; we even have to improve the basic way of writing code in the master repository.

I think that is why it becomes important to release new full-stack, frontend, or backend master repositories of your own every year, so that you are able to create the best one a few years down the line.

Coming back to writing the automation script.

The script idea was simple: why should I recreate the same branch again and again?

I should automate the process of creating the same files and folders, or even cloning the repository from GitHub.

I wanted a faster and more customisable process.

And what can be better than custom scripts doing the work for us? So I wrote a basic JavaScript script to create the directory for me, with all the actual code.

Here is what the automation code looks like

I even wrote an article on it


const prompt = require('prompt');
const chalk = require('chalk');
const execSync = require('child_process').execSync;

prompt.start();

process.stdout.write('Welcome to boilerplate automation \n\n');
prompt.get(['projectname'], (err, result) => {
    if (err) {
        console.log(err);
        return;
    }
    process.stdout.write(`Entered project name is ${result.projectname} \n\n`);

    // Create the project directory
    execSync(`mkdir ${result.projectname}`);
    process.stdout.write(chalk.bold.green('Project directory created successfully \n\n'));

    // Move inside the project directory and create the basic folders
    process.stdout.write(chalk.bold.blue(`Moving inside the ${result.projectname} directory \n\n`));
    execSync(`cd ${result.projectname} && mkdir pages components modules public`);
    process.stdout.write(chalk.green('Project folders are created \n\n'));

    // Initialise the project (-y answers the npm init questions with defaults)
    process.stdout.write(chalk.bold.blue('Initialising the project \n\n'));
    execSync(`cd ${result.projectname} && npm init -y`);
    process.stdout.write(chalk.green('package.json file created successfully \n\n'));
});

The above script might look hard, but it's so simple that I am laughing at the time of writing.

Okay, let me explain

This code is a simple Node.js script that automates the process of setting up a new project directory for web development. It uses the built-in child_process module along with the prompt and chalk npm packages to interact with the user, create directories, and execute shell commands.

First, it prompts the user to enter a project name. Once the name is entered, it creates a new directory with that name. Then, it creates several subdirectories within the project directory for organizing code files. After that, it initializes a new Node.js project inside the created directory using npm init, which generates a package.json file to manage project dependencies and configurations.

Throughout the process, it provides feedback to the user in the terminal using different colours to indicate the progress and success of each step. Overall, this script simplifies the setup process for starting a new web development project by automating repetitive tasks and providing a structured starting point for development.

Once you run this script, it will generate the repository; look at the video below.

I've named it React Automation Command; back then I didn't know about AI or anything at all, I was very new to this stuff.

But in 2024, this entire work can be done and automated on a large scale using GPT.

We can even give the script to GPT and have it create a customised repository for the user.

GPT can generate customised scripts for repositories with specific requirements across all three domains: frontend, backend, and full-stack.
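
A rough sketch of that flow could look like the snippet below, assuming the openai npm package (v4-style client) with an OPENAI_API_KEY in the environment; the model name and file names are assumptions.

const fs = require('fs');
const OpenAI = require('openai');

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function generateBoilerplateScript(requirements) {
    // Hand GPT the base automation script and ask for a customised version
    const baseScript = fs.readFileSync('./automate.js', 'utf8'); // hypothetical filename for the script above

    const response = await client.chat.completions.create({
        model: 'gpt-4o', // assumed model name
        messages: [
            { role: 'system', content: 'You customise Node.js boilerplate-generation scripts.' },
            { role: 'user', content: `Here is my base script:\n${baseScript}\n\nAdapt it for: ${requirements}` },
        ],
    });

    return response.choices[0].message.content;
}

generateBoilerplateScript('a Next.js + Tailwind CSS + Zustand frontend')
    .then((script) => fs.writeFileSync('./custom-automate.js', script));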

The next wave of automation should reach migrations and version compatibility, and AI should work in those domains as well, because these are the hard domains where we need tremendous human mind power and still fail most of the time.

But eventually, AI will go there or everywhere it can go.

I've asked GPT to write the same script for the backend repository, using the following packages as the basic ones (a minimal sketch of the resulting entry file follows the list):

  • Express.js
  • MongoDB, SQL, and AWS configured as the database
  • Authentication configuration
  • SendGrid and Mailchimp installed for the emailing service
  • Puppeteer for the scraping service
  • Axios and Lodash for fetching and JS utility methods
  • Redis for caching
  • Kubernetes and Docker
  • Stripe and PayPal for payments
  • Firebase and Supabase for serverless
  • GitHub for version control
  • Hosting platform configuration
  • ENV variables being handled
  • HTML file compatibility

Installing most of these packages requires several steps, and those steps live in the corresponding docs; no script can do this better than AI, and Devin is doing the same thing under the hood.
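
As mentioned above, here is a minimal sketch of what the generated backend entry file could look like, assuming express, dotenv, and mongoose (for the MongoDB part) end up among the installed packages; the route, port, and environment variable names are assumptions.

require('dotenv').config(); // load ENV variables from a .env file
const express = require('express');
const mongoose = require('mongoose');

const app = express();
app.use(express.json());

// Simple health-check route to confirm the server is up
app.get('/health', (req, res) => res.json({ status: 'ok' }));

const port = process.env.PORT || 3000;

// Connect to MongoDB first, then start listening
mongoose
    .connect(process.env.MONGODB_URI)
    .then(() => app.listen(port, () => console.log(`Server running on port ${port}`)))
    .catch((error) => console.error('Database connection failed', error));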

Reading the docs, installing the packages, creating the files and folders, adding variables, getting API keys from the websites, and so on.

All of this was supposed to be done using web scraping and automation scripts, but AI is much faster at it.

I am sure there will be GPTs that run on your local computer and do all this stuff on your behalf.

A local GPT running in my terminal, asking me for a command or prompt, with me simply typing the instructions, will be able to create the entire repository in a few seconds.

A GPT in the terminal, running locally, will be able to do all of my work in one go without me even using my hands; I just have to enable the microphone, and once the GPT can listen, the instructions are easily conveyed through words, and I'll have my Jarvis.
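
Just to picture it, a bare-bones "GPT in the terminal" loop could be sketched like this, again assuming the openai npm package and an OPENAI_API_KEY; it only prints the suggested script rather than executing anything on its own.

const readline = require('readline');
const OpenAI = require('openai');

const client = new OpenAI();
const rl = readline.createInterface({ input: process.stdin, output: process.stdout });

function ask() {
    rl.question('What should I build? ', async (instruction) => {
        const response = await client.chat.completions.create({
            model: 'gpt-4o', // assumed model name
            messages: [
                { role: 'system', content: 'You generate Node.js scripts that scaffold repositories.' },
                { role: 'user', content: instruction },
            ],
        });
        // Print the suggested script instead of running it blindly
        console.log(response.choices[0].message.content);
        ask(); // keep prompting for the next instruction
    });
}

ask();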

Well, that thing is still far away, and I hope I never have it in my life; otherwise, what would I do?
