sure, postman is a great tool for poking at an api, but before postman ever existed there was curl.
curl is a command-line tool that allows you to do basically anything you want with a url. it runs on just about every operating system imaginable and has north of ten billion installs.
if you open up the man page for curl, it's terrifying; there is a dizzying array of options and features. it's easy to get discouraged.
however, curl is a useful and powerful tool, and learning the subset of features needed to test and develop your restful(ish) api is worthwhile. this article is designed to give us just enough curl skills to do precisely that.
why curl
if postman already exists, why bother with curl? there are a lot of good reasons:
curl is universal(ish)
if you use a linux-or-similar type operating system, there's an excellent chance that curl is already installed. and if, for some reason, it isn't, it's a fast apt or yum command away. curl can also be installed on just about every operating system imaginable. if you're using beOS or plan9, there's a curl for you. right click on an api call in the firefox console 'network' tab, and there's 'Copy as cURL'. curl is everywhere.
curl is copy-pastable
a lot of tutorials for things like apis want to show you lo-res screenshots of postman under their 'trying it out' section. it's... not great.
but curl is just text. if we give examples using curl, users can copy, modify and paste them to experiment and explore.
curl is composable
because curl is just text, we can also use it in things like shell scripts or even pipelines if we add a little sed or jq here and there. being able to incorporate http calls into our scripts can be useful!
curl is cli-friendly
curl is also useful for those times when you don't have a graphical environment, like when you're ssh'd into that server.
the flyover
we can do a lot of stuff with curl, but this article is going to focus on the tools we need for api development. specifically:
- basic GET, POST, PUT and DELETE commands
- sending headers
- receiving headers
- sending json bodies, both as strings and from files
- uploading files
- a little bit of composition with jq
the basic 'get'
let's start with the simplest curl statement: getting the content of a url. it looks like this:
curl https://api.example.ca/stuff
this command only has two parts: the curl command itself, and the url we are calling.
if you run this against the url of a website, you'll probably get a bunch of html and javascript and the like. if you hit an api endpoint with it, you'll probably get a pile of json. precisely what we would expect!
when you run curl like this, it assumes that you're calling HTTP GET. however, we can explicitly tell curl to use that method like so:
curl -X GET https://api.example.ca/stuff
the -X GET argument here is what tells curl that we want to use GET (instead of, say, POST). although it's not strictly necessary, it may be a good idea to explicitly state the HTTP method like this to improve readability.
a pro tip: silent output
when we run curl we notice that it prints out some status output; stuff that looks like this:
* Trying 127.0.0.1:80...
* Connected to api.example.ca (127.0.0.1) port 80 (#0)
if we want to, we can suppress that noise by passing -s (for 'silent mode') to our command like so:
curl -s -X GET https://api.example.ca/stuff
all the other methods
once we have GET down pat, using all the other http methods is just a matter of changing what we put after the -X. for instance, we can POST to our endpoint like so:
curl -s -X POST https://api.example.ca/stuff
this works nicely for POST, PUT, DELETE and the like.
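for example, deleting a (hypothetical) record with id 23 from our stuff endpoint would look something like this; the /stuff/23 path is just an assumption for illustration:
curl -s -X DELETE https://api.example.ca/stuff/23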
headers, sending
sending headers with your request is a fairly straightforward affair. here, we add two headers using the -H (for 'header') argument with the header in a quote-enclosed string:
curl -s -X GET \
-H 'Accept: application/json' \
-H 'Origin: http://web.example.ca' \
https://api.example.ca/stuff
we can send as many headers as we like this way, or none at all.
headers, receiving
a big part of debugging url calls is inspecting the headers the server sends back to us. we can do that with the -I switch.
curl -I -s -X GET \
-H 'Accept: application/json' \
-H 'Origin: http://web.example.ca' \
https://api.example.ca/stuff
the -I switch we've added here means we get the headers only. our output will look something like:
HTTP/1.1 200 OK
Server: nginx/1.18.0 (Ubuntu)
Content-Type: application/json
Transfer-Encoding: chunked
Connection: keep-alive
Cache-Control: no-cache, private
Date: Wed, 27 Apr 2022 15:03:42 GMT
X-RateLimit-Limit: 60
X-RateLimit-Remaining: 59
Access-Control-Allow-Origin: *
a pro tip: being verbose
if you want to see the headers sent as well as received and the returned body all at once, there's the 'verbose' argument -v.
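as a quick sketch, a verbose call against our example endpoint looks like this:
curl -v -X GET https://api.example.ca/stuff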
POSTing and PUTting json in the body
of course, if we're calling POST or PUT, we will probably want to send a json body of some sort. we can do this with the -d argument (for 'data'!) thusly:
curl -s -X POST \
-H 'Accept: application/json' \
-H 'Content-Type: application/json' \
https://api.example.ca/stuff \
-d '{"name": "gbhorwood", "os": "popos21"}'
we pass the -d argument a quote-enclosed string of json here.
there are a couple of important 'gotchas' to watch out for with this, though.
escaping double quotes
first, be careful to escape quotes as needed and to do it properly. for this we simply use the \ character:
-d '{"name": "gb \"horwood\"", "os": "popos21"}'
escaping single quotes
this is a bit more complex. in older versions of bash, escaping a single quote in a string enclosed by single quotes was challenging. more recently, though, it is possible by pre-pending a $ to the string, ie:
-d $'{"name": "gb \'horwood\'", "os": "popos21"}'
building json bodies with variables
constructing a json body with a variable requires a bit more work on our part. let's start with an example:
name="gbhorwood"
os="popos21"
printf -v data '{"name": "%s", "os": "%s"}' "$name" "$os"
curl -s -X POST \
-H 'Accept: application/json' \
-H 'Content-Type: application/json' \
https://api.example.ca/stuff \
-d "$data"
here we see that we've created two variables, name and os, that we would like to use in our json body.
on line 3, we use those variables to build a string of json using printf. we use the -v switch on printf to assign the json string we build with our variables to the variable data. then, in our curl call, we pass the data variable as the body. note that we need to refer to that variable as "$data"; the double quotes ensure the expanded json string is passed as a single argument.
this is certainly a bit more complex, but with some practice it becomes easier.
using a file as a json body
if we decide we want to keep our json body in a separate file, we can certainly do that.
first, let's create a file called /tmp/body.json
{"name": "gbhorwood", "os": "popos21"}
note that we don't need to have the enclosing single quotes here. this is just straight json!
we can then call curl and pass it the path to our file using the @ operator like so:
curl -s -X POST \
-H 'Accept: application/json' \
-H 'Content-Type: application/json' \
https://api.example.ca/stuff \
-d "@/tmp/body.json"
and the contents of the body.json file are used as the request body.
uploading a file
uploading files to api endpoints is a handy thing. people want their avatars, after all.
building on what we know so far about curl, writing a file upload command is fairly straightforward.
curl -s -X POST \
-F 'file=@/path/to/some/file.jpg' \
'http://api.example.ca/avatar'
the magic here is the -F argument that takes a path to the file we want to upload and assigns it to a form field name (file in this example). we note that the file path here is indicated with an @, in the same way it was when we used a file as our json body.
composing a bit with jq
one of the big advantages of curl is that, because it's just a command, it can be used in a composition like a shell script or pipeline.
to do this effectively, it's helpful to combine our curl calls with tools that select and format its output, like jq and sed.
let's take a look at an example where we make a curl call to a login endpoint and then extract the returned bearer token for use later.
token=`curl -s -X POST 'http://api.example.ca/login' \
-H 'Content-Type: application/json' \
-H 'Accept: application/json' \
-d '{"email": "gbhorwood@example.ca", "password": "<secretpassword>"}' | jq .data.token | sed -e 's/"//g'`
echo "our token is"
echo $token
there's a lot going on here, so let's go over it one step at a time.
first, we assign the output of our curl call to the variable token by enclosing our curl call in backtick quotes. this is a technique called command substitution, and it's a powerful way to take the result of any bash command, not just curl, and put it in a variable.
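for instance, we could capture the output of the date command in a variable the same way:
today=`date`
echo "$today"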
the curl command itself should be readable to us by now; it's a POST call to login that sends the username and password in a json body.
the next bit of interest is the jq command after the pipe.
jq stands for 'json query' and it is a tool that allows us to format and query json text. it is easily installable via the package manager of your choice and is an important utility to have if you work with json.
here we pass jq the arg .data.token. this allows us to query the value keyed by token in the object keyed by data. an example will help here, so let's look at the json our login endpoint returns.
{
"data": {
"name": "gbhorwood",
"token": "<mybearertoken>"
}
}
to get the value "<mybearertoken>", we use jq to query the top level, indicated by a ., then the object at data, then the value at token. the resulting command is jq .data.token.
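if jq is installed, we can try that query on its own by piping the json above into it:
echo '{"data": {"name": "gbhorwood", "token": "<mybearertoken>"}}' | jq .data.token
this prints "<mybearertoken>", double quotes included.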
of course, json encloses strings in double quotes and we would like to remove those; for that we use sed.
sed stands for 'stream editor' and, like the name implies, it allows us to programmatically edit a stream of text. it is an incredibly powerful tool, but for our purposes here we will be using it simply to find all instances of the " character and replace them with nothing. the syntax is sed -e 's/<what to find>/<what to replace with>/g'. since we want to remove double quotes, we do sed -e 's/"//g'.
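we can see the quote-stripping on its own by piping a quoted string into sed:
echo '"<mybearertoken>"' | sed -e 's/"//g'
which gives us <mybearertoken> with the double quotes removed.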
the result of all this is our bearer token assigned to a variable that we can use for anything we want, ie. curl calls to endpoints that require authorization using the header argument:
-H "Authorization: Bearer $token"
as we become more familiar with jq, we will be able to leverage it with curl to do all sorts of great things.
conclusion
curl is an incredibly powerful tool that has been under development for over twenty-five years, and what we have covered here represents only a small fraction of what it is capable of. having a basic level of comfort with the command allows us to better experiment with, document and communicate the functionality of our apis.