Arogya Kharel
Using shell script to automate API calls

Table of Contents

  1. Quick Summary
  2. Before Starting
  3. Objective
  4. Why Shell Scripts?
  5. What to Automate?
  6. Copy-Paste to a .sh File
  7. Going Further
  8. Conclusion

Quick Summary

Here are the steps we will be taking:

  1. Changing API requests to cURL commands.
  2. Pasting cURL commands to a .sh file.
  3. Profit.

Before Starting

I am assuming you have some familiarity with the following concepts:

  • API, API requests and responses
  • Postman or a similar API testing tool
  • cURL

It shouldn't be difficult to follow either way. Let's go.

Objective

Why automate API calls at all? Testing, of course, and benchmarking if you are into that. It's a good idea to check that your API server responds properly to continuous requests, and that calls which depend on each other run in the right order.

Most students like myself tend to make a few API calls here and there and call it a day. To be honest, most assignments and personal projects don't really need much testing since they are generally tiny in scope. But maybe you will need to find the average server response time for a million requests, or make some graphs on server reliability. That is where automation becomes extremely valuable.
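As a taste of what that looks like, here is a minimal sketch that times repeated requests and averages them. The URL and request count are placeholders (the endpoint is borrowed from the examples later in this post); cURL's -w '%{time_total}' does the actual measuring.

```shell
#!/bin/bash
# Sketch: measure the average response time over N requests.
# URL and N are placeholders -- point them at your own API.
URL='127.0.0.1:8080/fabric/lifecycle/approve'
N=10

# Average the numbers passed as arguments (awk handles the float math).
average() {
  echo "$@" | awk '{ sum = 0; for (i = 1; i <= NF; i++) sum += $i; print sum / NF }'
}

times=""
for i in $(seq 1 "$N"); do
  # -s silences the progress meter, -o /dev/null discards the body,
  # -w prints only the total time taken by this request.
  t=$(curl -s -o /dev/null -w '%{time_total}' --request GET "$URL")
  [ -n "$t" ] && times="$times $t"
done

if [ -n "$times" ]; then
  echo "Average over $N requests: $(average $times) seconds"
fi
```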

Why Shell Scripts?

Because they are everywhere. Bash, for example, is the glue holding Linux systems together: most icons you click, startup, configuration, and many other parts of a Linux system are handled through shell scripts.

There is also Python. It is definitely easier to work with and has extensive IDE and library support (and its own quirks, floating points and whatnot). However, there is a bit of overhead, and Python is generally slower than Bash. If you measure developer efficiency, Python is miles ahead (imo), but we use Bash here because the actual automation is very simple (it's just cURL).

What to Automate?

I have a few API request formats prepared. Here is one using the GET method:

[Screenshot: Query Approved CC request]

This one sends the server some data in JSON format. The server then uses a CLI command to complete the operation.

And here is one that sends data through a URL parameter instead of the body:

[Screenshot: Install CC request]

Here the :package_name at the end of the URL is the parameter. The server accepts the name of a tarball to install. The HTTP method is POST.

The next step is to transform these calls into cURL commands. Tools like Postman can generate the code snippets for you (which is what I did, though it is easy to work out the cURL format from its documentation). I will assume you have a few prepared as well. The ones from earlier become:

curl --location --request GET '127.0.0.1:8080/fabric/lifecycle/approve' \
--header 'Content-Type: application/json' \
--data-raw '{
  "cc_name": "basic",
  "channel_name": "mychannel"
}'


curl --location --request POST '127.0.0.1:8080/fabric/lifecycle/install/basic.tar.gz'

Copy-Paste to a .sh File

Now all that is left is to paste all your cURL commands into a file. Add #!/bin/bash as the very first line and that's it! Remember to put the commands in the correct order.

#!/bin/bash

curl --location --request GET '127.0.0.1:8080/fabric/lifecycle/approve' \
--header 'Content-Type: application/json' \
--data-raw '{
  "cc_name": "basic",
  "channel_name": "mychannel"
}'

curl --location --request POST '127.0.0.1:8080/fabric/lifecycle/install/basic.tar.gz'

You might also need to make the script executable with chmod +x script.sh.

Going Further

Copy-pasting, while simple, can leave you with a file full of URLs and a script with extremely limited functionality. Here are a few ways to go further.

1. Printing the response from your server.

Knowing the server's response to your requests is vital. One way to print the API response is to capture the cURL output in a variable with command substitution ($()) and then echo it, like this:

MYVAR=$(curl --location --request GET ......)
echo "$MYVAR"

This will print whatever response cURL got from the server.
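The response body alone won't tell you whether the request actually succeeded. A common pattern (sketched below with the approve endpoint from earlier; the output path and helper function are my own additions) is to save the body to a file with -o and print only the HTTP status code with -w '%{http_code}':

```shell
#!/bin/bash
# Sketch: separate the response body from the HTTP status code.
# -s silences progress output, -o saves the body to a file,
# -w '%{http_code}' prints only the status code to stdout.
STATUS=$(curl -s -o /tmp/approve_response.json -w '%{http_code}' \
  --location --request GET '127.0.0.1:8080/fabric/lifecycle/approve' \
  --header 'Content-Type: application/json' \
  --data-raw '{ "cc_name": "basic", "channel_name": "mychannel" }')

# Classify a status code so the script can react to failures.
status_ok() {
  case "$1" in
    2??) return 0 ;;  # any 2xx counts as success
    *)   return 1 ;;  # everything else (including 000 = no connection) fails
  esac
}

if status_ok "$STATUS"; then
  echo "Request succeeded, body saved to /tmp/approve_response.json"
else
  echo "Request failed with status $STATUS"
fi
```

If the server is unreachable, cURL reports the status as 000, so the failure branch also catches connection errors.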

2. Save the IP address as a variable.

Since your IP address is most likely going to be repeated, why not save it as a variable? This is also useful when you need to change the IP address (which happens a lot with services like AWS). Simply store the IP address in a variable and reference it later, like below:

IPADDR='127.0.0.1'
ADMIN1=$(curl --location --request POST "$IPADDR:8080/fabric/lifecycle/admin/Org1")
echo "$ADMIN1"

3. Functions!

What if you need to repeat certain requests? Naively copy-pasting cURL commands results in long, illegible code. Define functions and simply call them for better clarity!

# Define your functions
admin1(){
  ADMIN1=$(curl --location --request POST "$IPADDR:8080/fabric/lifecycle/admin/Org1")
  echo "$ADMIN1"
}

packageCC(){
  PACKAGE=$(curl --location --request POST "$IPADDR:8080/fabric/lifecycle/package" \
  --header 'Content-Type: application/json' \
  --data-raw '{
      "cc_source_name": "asset-transfer-basic",
      "label": "basic_1.0",
      "language": "go",
      "package_name": "basic.tar.gz"
  }')
  echo "$PACKAGE"
}

# Then call them
admin1
packageCC
Imagine multiple packageCC calls without the function: we have reduced 9 lines of code to 1!

Conclusion

You can find the source for the script and all the cURL commands here.

This was only the first part of automation. You now might want to loop this file multiple times and calculate some metrics or draw a graph. While certainly possible in Bash, it could be interesting (and definitely easier) to implement such tests in Python (hint hint).
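If you do stay in Bash, the looping part is short. Here is a minimal sketch, assuming your requests live in script.sh as above; the run count and output filename are placeholders. It records one wall-clock timing per run into a CSV you can graph later:

```shell
#!/bin/bash
# Sketch: run the whole request script repeatedly and record wall-clock
# timings to a CSV. RUNS and OUT are placeholders; script.sh is the
# request script from earlier in this post.
RUNS=100
OUT=timings.csv

echo "run,seconds" > "$OUT"
for i in $(seq 1 "$RUNS"); do
  start=$(date +%s.%N)
  ./script.sh > /dev/null 2>&1
  end=$(date +%s.%N)
  # bc handles the floating-point subtraction
  echo "$i,$(echo "$end - $start" | bc)" >> "$OUT"
done

echo "Wrote $RUNS timings to $OUT"
```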
