The autocannon load tester can also profile POST requests; we simply have to add a few flags.
Let's modify our server.js file so that it can handle POST requests at an endpoint we'll call /echo. We change server.js to the following:
const express = require('express')
const bodyParser = require('body-parser')

const app = express()

// Parse JSON and URL-encoded request bodies
app.use(bodyParser.json())
app.use(bodyParser.urlencoded({ extended: false }))

// Mirror the parsed request body back to the client
app.post('/echo', (req, res) => {
  res.send(req.body)
})

app.listen(3000)
We've removed our previous route, added request body parsing middleware, and created an /echo route that mirrors the request body back to the client.
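As before, the server needs to be restarted before we profile it. If we want to sanity-check the new route first, one option (purely an optional check, not part of the recipe) is to start the server in one terminal and send a request with curl from another; the body we send should come straight back:
$ node server.js
$ curl -H 'content-type: application/json' -d '{"hello": "world"}' http://localhost:3000/echo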
Now we can profile our /echo endpoint using the -m (method), -H (header), and -b (body) flags:
$ autocannon -c 100 -m POST -H 'content-type=application/json' -b '{ "hello": "world"}' http://localhost:3000/echo
Running 10s test @ http://localhost:3000/echo
100 connections with 1 pipelining factor

Stat          Avg        Stdev     Max
Latency (ms)  25.77      4.8       156
Req/Sec       3796.1     268.95    3991
Bytes/Sec     850.48 kB  58.22 kB  917.5 kB

420k requests in 10s, 9.35 MB read
Compared to the results from the main recipe, POST requests achieve roughly 65% of the performance of GET requests.
Loading the body from a file
If we wish to get our POST body from a file, autocannon supports this via the -i flag.
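For instance, assuming we save the same JSON payload to a file (the name body.json here is just an illustrative choice), the run might look like this:
$ echo '{ "hello": "world" }' > body.json
$ autocannon -c 100 -m POST -H 'content-type=application/json' -i body.json http://localhost:3000/echo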