Easy command-line JSON posts

Posted on February 24, 2019 by Niels

When I need to test API calls, I'll write a tiny shell script that:

1. takes some command-line arguments
2. then performs an HTTP request with cURL

For example:

#!/bin/bash
session=$1;
name=$2;
result=$(curl -XPOST https://example.org/api/things \
    --data "{\"session\": \"$session\", \"name\": \"$name\"}");
echo "$result";

This is fine for simple things, but editing it is a pain and it gets messy fast. You'll forget to escape a quote, and end up with invalid JSON.

You could put your JSON into a separate file and then use cURL's [--data-binary @file.json feature](https://ec.haxx.se/http-post.html). But then you lose the ability to weave command-line variables through your JSON.
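For illustration, a minimal sketch of that approach (payload.json is a made-up filename):

# payload.json
{ "session": "$token", "name": "$name" }

$ curl -XPOST https://example.org/api/things --data-binary @payload.json

cURL sends the file verbatim, so $token and $name reach the server as literal strings rather than the values you passed on the command line.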

What I find easiest is to keep everything in one script by using a feature called "Here Documents".

#!/bin/bash
token=$1;
name=$2;

result=$(curl -XPOST https://example.org/api/things --data @<(cat <<EOF
{
    "session": "$token",
    "name": "$name"
}
EOF
))

echo "$result";

Using the <<EOF notation, you don't have to escape any quotes, and you can still use $variables.
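Assuming you saved the script above as post-thing.sh (the filename is just for illustration), a call could look like this:

$ ./post-thing.sh abc123 "My new thing"

The heredoc then expands to the JSON body that curl posts:

{
    "session": "abc123",
    "name": "My new thing"
}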

It works from the terminal too, which is handy when you need multi-line user input. Give it a try:

$ cat <<EOF

# hit Enter, then enter some operation, e.g. date

> $(date)

# hit Enter again, then do another thing, e.g. echo some text

> $(echo) Wow this is handy

# finish up with EOF and hit Enter one last time

> EOF

Sun Feb 24 17:34:32 CET 2019
Wow this is handy
$
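As a side note, quoting the delimiter (<<'EOF') turns that expansion off, which is why the unquoted form is the one you want for templating JSON:

$ cat <<'EOF'
> $(date)
> EOF
$(date)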

See this Stack Overflow thread for more details.

P.S. I know there are tools like Postman, but this is a lot more portable and integrates well into CI pipelines.
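For example, a CI job can call the same script directly (the variable names here are hypothetical; use whatever your CI system provides):

./post-thing.sh "$API_TOKEN" "build-$CI_JOB_ID"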