You may not need Axios

Fetch API to the rescue!

Published: Nov 14th, 2018
Updated: Jan 7th, 2019

This is not an attack on Axios.
Rather, it’s advocacy for the fetch API which has become quite capable. 🦄

Photo credit: William Bout on Unsplash

Overview

I’ll be the first to say: I was an early hater of the fetch API. My first attempt turned into an entirely wasted weekend. I didn’t know I was using broken examples. #fail
The good news is that much of the documentation and its examples have since been improved, and I’ve included several code snippets for common use cases below.

I’ve collected up-to-date code examples for the patterns which “unlocked” fetch for me.

Check out the head-to-head Feature Comparison; then browse my curated Fetch Examples I’ve accumulated over the past year.

Feature Comparison

Features compared across fetch, axios, and request:

- Intercept request and response
- Transform request and response data
- Cancel requests
- Automatic transforms for JSON data
- Client-side support for protecting against XSRF
- Progress
- Streaming



When starting this article (late 2018), I assumed I’d end up with a table of mixed check boxes. Surely there were special use cases that justified axios, request, r2, superagent, got, etc. Well, as it turns out, I had overestimated the need for 3rd-party HTTP libraries.

Despite using fetch for 2 years (including for non-trivial tasks: file uploads and error/retry support), I still had misconceptions about fetch’s abilities and limits (specifically regarding progress updates and canceling requests).



Fetch Examples

Click the links below to go directly to the code snippet.

  1. GET: JSON from a URL
  2. Custom headers
  3. Error handling w/ HTTP status codes
  4. CORS example
  5. Posting JSON
  6. Posting an HTML <form>
  7. Form encoded data
  8. Uploading a File
  9. Uploading Multiple Files
  10. Timeouts
  11. Progress Percent - Download
  12. TODO: Recursive: Retry on Failure
  13. TODO: Recursive: Automated results paging

Is your Use Case not listed? Let me know ✉️


Get JSON from a URL

fetch('https://api.github.com/orgs/nodejs')
  .then(response => response.json())
  .then(data => {
    console.log(data) // Prints result from `response.json()`
  })
  .catch(error => console.error(error))

Custom headers

fetch('https://api.github.com/orgs/nodejs', {
  headers: new Headers({
    'User-agent': 'Mozilla/4.0 Custom User Agent'
  })
})
  .then(response => response.json())
  .then(data => {
    console.log(data)
  })
  .catch(error => console.error(error))

HTTP Error Handling

const isOk = response => response.ok
  ? response.json()
  : Promise.reject(new Error('Failed to load data from server'))

fetch('https://api.github.com/orgs/nodejs')
  .then(isOk) // <= Use `isOk` function here
  .then(data => {
    console.log(data) // Prints result from `response.json()`
  })
  .catch(error => console.error(error))

CORS example

CORS is primarily enforced by the server, so make sure your server-side configuration is correct.

The credentials option controls whether your cookies are automatically included.

fetch('https://api.github.com/orgs/nodejs', {
  credentials: 'include', // Useful for including session ID (and, IIRC, authorization headers)
})
  .then(response => response.json())
  .then(data => {
    console.log(data) // Prints result from `response.json()`
  })
  .catch(error => console.error(error))

Posting JSON

postRequest('http://example.com/api/v1/users', { user: 'Dan' })
  .then(data => console.log(data)) // Result from the `response.json()` call

function postRequest(url, data) {
  return fetch(url, {
    credentials: 'same-origin', // 'include', default: 'omit'
    method: 'POST', // 'GET', 'PUT', 'DELETE', etc.
    body: JSON.stringify(data), // Use correct payload (matching 'Content-Type')
    headers: { 'Content-Type': 'application/json' },
  })
    .then(response => response.json())
    .catch(error => console.error(error))
}

Posting an HTML <form>

postForm('http://example.com/api/v1/users', 'form#userEdit')
  .then(data => console.log(data))

function postForm(url, formSelector) {
  const formData = new FormData(document.querySelector(formSelector))
  return fetch(url, {
    method: 'POST', // 'GET', 'PUT', 'DELETE', etc.
    body: formData // a FormData body automatically sets the 'Content-Type'
  })
    .then(response => response.json())
    .catch(error => console.error(error))
}

Form encoded data

To post data with a Content-Type of application/x-www-form-urlencoded we will use URLSearchParams to encode the data like a query string.

For example, new URLSearchParams({a: 1, b: 2}) yields a=1&b=2.

postFormData('http://example.com/api/v1/users', { user: 'Mary' })
  .then(data => console.log(data))

function postFormData(url, data) {
  return fetch(url, {
    method: 'POST', // 'GET', 'PUT', 'DELETE', etc.
    body: new URLSearchParams(data),
    headers: new Headers({
      'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8'
    })
  })
    .then(response => response.json())
    .catch(error => console.error(error))
}
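To sanity-check the encoding claim above, URLSearchParams can be exercised directly:

```javascript
// URLSearchParams serializes an object's entries into form-encoded pairs
const params = new URLSearchParams({ a: 1, b: 2 })
console.log(params.toString()) // → 'a=1&b=2'
```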

Uploading a file

postFile('http://example.com/api/v1/users', 'input[type="file"].avatar')
  .then(data => console.log(data))

function postFile(url, fileSelector) {
  const formData = new FormData()
  const fileField = document.querySelector(fileSelector)
  formData.append('username', 'abc123')
  formData.append('avatar', fileField.files[0])
  return fetch(url, {
    method: 'POST', // 'GET', 'PUT', 'DELETE', etc.
    body: formData // Coordinate the body type with 'Content-Type'
  })
    .then(response => response.json())
    .catch(error => console.error(error))
}

Uploading multiple files

Set up a file upload element with the multiple attribute:

<input type='file' multiple class='files' name='files' />

Then use with something like:

postFile('http://example.com/api/v1/users', 'input[type="file"].files')
  .then(data => console.log(data))

function postFile(url, fileSelector) {
  const formData = new FormData()
  // A single <input multiple> exposes its selection via `.files`
  const fileField = document.querySelector(fileSelector)
  // Add all selected files to formData
  Array.prototype.forEach.call(fileField.files, f => formData.append('files', f))
  // Alternatively for PHPeeps, use `files[]` as the name to support arrays
  // Array.prototype.forEach.call(fileField.files, f => formData.append('files[]', f))
  return fetch(url, {
    method: 'POST', // 'GET', 'PUT', 'DELETE', etc.
    body: formData // Coordinate the body type with 'Content-Type'
  })
    .then(response => response.json())
    .catch(error => console.error(error))
}

Timeouts

Here’s a generic Promise timeout, using the “partial application” pattern. It works with any Promise interface. Don’t do too much work in the supplied promise chain: it keeps running after the timeout fires, and lingering failures have a way of creating long-term memory leaks.

function promiseTimeout(msec) {
  return promise => {
    const timeout = new Promise((yea, nah) =>
      setTimeout(() => nah(new Error('Timeout expired')), msec))
    return Promise.race([promise, timeout])
  }
}

// Before: fetch('https://api.github.com/orgs/nodejs')
promiseTimeout(5000)(fetch('https://api.github.com/orgs/nodejs'))
  .then(response => response.json())
  .then(data => {
    console.log(data) // Prints result from `response.json()`
  })
  .catch(error => console.error(error))

And a more complex example, featuring a tracking flag __timeout so you can intercept any costly work.

function promiseTimeout(msec) {
  return promise => {
    let isDone = false
    promise.then(() => isDone = true)
    const timeout = new Promise((yea, nah) =>
      setTimeout(() => {
        if (!isDone) {
          promise.__timeout = true
          nah(new Error('Timeout expired'))
        }
      }, msec))
    return Promise.race([promise, timeout])
  }
}

promiseTimeout(5000)(fetch('https://api.github.com/orgs/nodejs'))
  .then(response => response.json())
  .then(data => {
    console.log(data) // Prints result from `response.json()`
  })
  .catch(error => console.error(error))
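Racing a timeout doesn’t actually cancel the in-flight request. For true cancellation, fetch accepts an AbortController signal. Here’s a minimal sketch; the helper name fetchWithTimeout and its parameters are my own, not from the article:

```javascript
// Sketch: cancel a fetch that exceeds `msec` using AbortController.
// `fetchWithTimeout` is a hypothetical helper name.
function fetchWithTimeout(url, msec, options = {}) {
  const controller = new AbortController()
  const timer = setTimeout(() => controller.abort(), msec)
  return fetch(url, { ...options, signal: controller.signal })
    .finally(() => clearTimeout(timer)) // always clear the pending timer
}

// Usage: rejects with an AbortError if the request takes longer than 5s
// fetchWithTimeout('https://api.github.com/orgs/nodejs', 5000)
//   .then(response => response.json())
//   .then(data => console.log(data))
//   .catch(error => console.error(error))
```

Unlike promiseTimeout above, this tears down the underlying connection instead of merely racing the promise and letting the request run to completion.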

Download Progress Helper

This is included for completeness. You may still want a 3rd-party library here: the browser streaming interfaces have limited compatibility (as of late 2018), and upload progress is still buggy outside of Chrome.

The Progress Handler technique shown below avoids wrapping the fetch call in a closure. 👍

progressHelper has the following interface (source available below)

const progressHelper = require('./progressHelper.js')

const handler = ({ loaded, total }) => {
  console.log(`Downloaded ${loaded} of ${total}`)
}
// handler args: loaded = bytes received so far, total = total bytes (-1 if unknown)

const streamProcessor = progressHelper(handler)
// => streamProcessor is a function for use with the response _stream_

Let’s look at a usage example:

// The progressHelper could be inlined w/ .then() below...
const streamProcessor = progressHelper(console.log)

fetch('https://fetch-progress.anthum.com/20kbps/images/sunrise-progressive.jpg')
  .then(streamProcessor) // note: NO parentheses, because `.then` needs a function
  .then(response => response.blob())
  .then(blobData => {
    // ... set as base64 on an <img src="base64...">
  })

A reusable image downloader might look like getBlob():

const getBlob = url => fetch(url)
  .then(progressHelper(console.log)) // progressHelper used inside the .then()
  .then(response => response.blob())

By the way, a Blob is a Binary Large Object.
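One hedged way to put that blob on screen (in a browser) is an object URL; the helper name and element selector below are illustrative, not from the article:

```javascript
// Wrap a Blob in a short-lived blob: URL that an <img> can display.
function blobToObjectURL(blob) {
  return URL.createObjectURL(blob) // remember to URL.revokeObjectURL() when done
}

// Browser usage (hypothetical element):
// document.querySelector('img.preview').src = blobToObjectURL(blobData)
```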

It’s important to choose ONE of the 2 usage patterns below (they are functionally equivalent):

// OPTION #1: no temp streamProcessor var
// fetch(...)
.then(progressHelper(console.log))
// ⚠️ OR ⚠️
// OPTION #2: define a `streamProcessor` to hold our console logger
const streamProcessor = progressHelper(console.log)
// fetch(...)
.then(streamProcessor)

My preference is Option #1. However, your scope design may force you to use Option #2.

Finally, here’s the last part of this recipe, our progressHelper:

Source: Progress Helper

function progressHelper(onProgress) {
  return response => {
    if (!response.body) return response
    let loaded = 0
    const contentLength = response.headers.get('content-length')
    const total = !contentLength ? -1 : parseInt(contentLength, 10)
    return new Response(
      new ReadableStream({
        start(controller) {
          const reader = response.body.getReader()
          return read()

          function read() {
            return reader.read()
              .then(({ done, value }) => {
                if (done) return void controller.close()
                loaded += value.byteLength
                onProgress({ loaded, total })
                controller.enqueue(value)
                return read()
              })
              .catch(error => {
                console.error(error)
                controller.error(error)
              })
          }
        }
      })
    )
  }
}

credit: Special thanks to Anthum Chris and his fantastic Progress+Fetch PoC shown here
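The retry recipe is still marked TODO in the list above; here’s a minimal recursive sketch under my own assumptions (the names fetchRetry, retries, and delayMs are illustrative, and the doubling backoff is a common choice, not the article's):

```javascript
// Sketch: retry a failed fetch recursively with exponential backoff.
function fetchRetry(url, options = {}, retries = 3, delayMs = 500) {
  return fetch(url, options)
    .then(response => {
      if (response.ok) return response
      throw new Error(`HTTP ${response.status}`) // treat HTTP errors as retryable
    })
    .catch(error => {
      if (retries <= 0) throw error // out of attempts: surface the last error
      return new Promise(resolve => setTimeout(resolve, delayMs))
        .then(() => fetchRetry(url, options, retries - 1, delayMs * 2))
    })
}
```

A results-paging recipe could follow the same recursive shape, recursing while the response reports another page.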

Compatibility

“What about NodeJS and the poor IE people?!?”

Fear not, the fractional % of IE9-10 users can be polyfilled with the github/fetch package (maintained by some awesome team at GitHub). It’s possible to go as far back as IE8 - Your mileage may vary.

NodeJS can take advantage of the fetch API with the node-fetch package:

npm install node-fetch

After polyfill+node-fetch: 99.99% compatible

More coming soon.

Please Tweet at me if you have other Use Cases you’d like to see. ❤️

End of Dan's fetch API Examples

Unless otherwise noted, all content is copyright Dan Levy 2014-2019.