You may not need Axios

Fetch API to the rescue!

This is not an attack on Axios.
Rather, it’s advocacy for the fetch API, which has become quite capable. 🦄

Overview

This article is a collection of the “missing” fetch code snippets and common use cases I wish were easier to find.

Is your Use Case not listed? Let me know ✉️


Feature Comparison

Features compared across fetch, axios, and request:

- Intercept request and response
- Transform request and response data
- Cancel requests
- Automatic transforms for JSON data
- Client side support for protecting against XSRF
- Progress
- Streaming
- Redirects


When starting this article (late 2018, updated 2024) I assumed I’d end up with a table of mixed check boxes. Surely there were special Use Cases which justified axios, request, r2, superagent, got, etc.

Well, as it turns out, I overestimated the need for 3rd-party HTTP libraries.

Despite using fetch for several years (including for non-trivial tasks: file uploads & error/retry support) I still had misconceptions about fetch’s abilities and limits.

Well, let’s check out what fetch can do…

Fetch Recipes

Get JSON from a URL

fetch('https://api.github.com/orgs/nodejs')
  .then(response => response.json())
  .then(data => {
    console.log(data) // result from `response.json()` above
  })
  .catch(error => console.error(error))
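
If you prefer async/await, the same request reads like this (a minimal sketch, equivalent to the promise chain above):

async function getNodejsOrg() {
  try {
    const response = await fetch('https://api.github.com/orgs/nodejs')
    const data = await response.json()
    console.log(data) // same result as `response.json()` above
  } catch (error) {
    console.error(error)
  }
}

getNodejsOrg()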

Custom headers

fetch('https://api.github.com/orgs/nodejs', {
  headers: new Headers({
    'User-agent': 'Mozilla/4.0 Custom User Agent'
  })
})
  .then(response => response.json())
  .then(data => {
    console.log(data)
  })
  .catch(error => console.error(error))
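
Note: the Headers constructor is optional - a plain object literal works just as well (the Accept header below is only an illustration):

fetch('https://api.github.com/orgs/nodejs', {
  headers: { 'Accept': 'application/vnd.github.v3+json' }
})
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(error => console.error(error))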

HTTP Error Handling

const isOk = response => response.ok
  ? response.json()
  : Promise.reject(new Error('Failed to load data from server'))

fetch('https://api.github.com/orgs/nodejs')
  .then(isOk) // <= Use `isOk` function here
  .then(data => {
    console.log(data) // Prints result from `response.json()`
  })
  .catch(error => console.error(error))
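
Keep in mind that fetch only rejects on network failures - HTTP errors like 404 or 500 still resolve normally, which is why isOk checks response.ok. Here’s a variation of isOk that also surfaces the status code (an illustrative tweak, not part of the original snippet):

const isOk = response => response.ok
  ? response.json()
  : Promise.reject(new Error(`HTTP ${response.status}: ${response.statusText}`))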

CORS example

CORS is enforced by the browser based on headers the server sends - so make sure your server-side configuration is correct.

The credentials option controls whether cookies (and other credentials) are sent automatically.

fetch('https://api.github.com/orgs/nodejs', {
  credentials: 'include', // Send cookies (e.g. the session ID) and HTTP auth credentials, even cross-origin
})
  .then(response => response.json())
  .then(data => {
    console.log(data) // Prints result from `response.json()`
  })
  .catch(error => console.error(error))

Posting JSON

postRequest('http://example.com/api/v1/users', { user: 'Dan' })
  .then(data => console.log(data)) // Result from the `response.json()` call

function postRequest(url, data) {
  return fetch(url, {
    credentials: 'same-origin', // or 'include' / 'omit' ('same-origin' is the default)
    method: 'POST', // 'GET', 'PUT', 'DELETE', etc.
    body: JSON.stringify(data), // Use correct payload (matching 'Content-Type')
    headers: { 'Content-Type': 'application/json' },
  })
    .then(response => response.json())
    .catch(error => console.error(error))
}

Posting an HTML <form>

postForm('http://example.com/api/v1/users', 'form#userEdit')
  .then(data => console.log(data))

function postForm(url, formSelector) {
  const formData = new FormData(document.querySelector(formSelector))
  return fetch(url, {
    method: 'POST', // 'GET', 'PUT', 'DELETE', etc.
    body: formData // a FormData body automatically sets the 'Content-Type' header
  })
    .then(response => response.json())
    .catch(error => console.error(error))
}

Form encoded data

To post data with a Content-Type of application/x-www-form-urlencoded we will use URLSearchParams to encode the data like a query string.

For example, new URLSearchParams({a: 1, b: 2}) yields a=1&b=2.

postFormData('http://example.com/api/v1/users', { user: 'Mary' })
  .then(data => console.log(data))

function postFormData(url, data) {
  return fetch(url, {
    method: 'POST', // 'GET', 'PUT', 'DELETE', etc.
    body: new URLSearchParams(data),
    headers: new Headers({
      'Content-type': 'application/x-www-form-urlencoded; charset=UTF-8'
    })
  })
    .then(response => response.json())
    .catch(error => console.error(error))
}
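
Note: a URLSearchParams body automatically gets a Content-Type of application/x-www-form-urlencoded;charset=UTF-8, so the explicit Headers above is optional. A shorter sketch of the same request:

fetch('http://example.com/api/v1/users', {
  method: 'POST',
  body: new URLSearchParams({ user: 'Mary' }) // Content-Type is set automatically
})
  .then(response => response.json())
  .then(console.log)
  .catch(console.error)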

Uploading a file

postFile('http://example.com/api/v1/users', 'input[type="file"].avatar')
  .then(data => console.log(data))

function postFile(url, fileSelector) {
  const formData = new FormData()
  const fileField = document.querySelector(fileSelector)
  formData.append('username', 'abc123')
  formData.append('avatar', fileField.files[0])
  return fetch(url, {
    method: 'POST', // 'GET', 'PUT', 'DELETE', etc.
    body: formData // Coordinate the body type with 'Content-Type'
  })
    .then(response => response.json())
    .catch(error => console.error(error))
}

Uploading multiple files

Set up a file upload element with the multiple attribute:

<input type='file' multiple class='files' name='files' />

Then use it with something like:

postFile('http://example.com/api/v1/users', 'input[type="file"].files')
  .then(data => console.log(data))

function postFile(url, fileSelector) {
  const formData = new FormData()
  const fileField = document.querySelector(fileSelector)
  // Add every selected file to the formData
  Array.prototype.forEach.call(fileField.files, f => formData.append('files', f))
  // Alternatively for PHPeeps, use `files[]` for the name to support arrays
  // Array.prototype.forEach.call(fileField.files, f => formData.append('files[]', f))
  return fetch(url, {
    method: 'POST', // 'GET', 'PUT', 'DELETE', etc.
    body: formData // Coordinate the body type with 'Content-Type'
  })
    .then(response => response.json())
    .catch(error => console.error(error))
}

Timeouts

Here’s a generic Promise timeout, using the “Partial Application” pattern. It’ll work with any Promise interface. Don’t do too much work in the supplied promise chain; it will keep running even after the timeout fires - and any failures have a way of creating long-term memory leaks.

function promiseTimeout(msec) {
  return promise => {
    const timeout = new Promise((yea, nah) =>
      setTimeout(() => nah(new Error('Timeout expired')), msec))
    return Promise.race([promise, timeout])
  }
}

promiseTimeout(5000)(fetch('https://api.github.com/orgs/nodejs'))
  .then(response => response.json())
  .then(data => {
    console.log(data) // Prints result from `response.json()` above
  })
  .catch(error => console.error(error)) // Catches any timeout (or other failure)

// Alternative example:
fetchTimeout(5000, 'https://api.github.com/orgs/nodejs')
  .then(console.log)

// Alternative implementation:
function fetchTimeout(msec, ...args) {
  return raceTimeout(fetch(...args))

  function raceTimeout(promise) {
    const timeout = new Promise((yea, nah) =>
      setTimeout(() => nah(new Error('Timeout expired')), msec))
    return Promise.race([promise, timeout])
  }
}

And here’s a more complex example, featuring a __timeout tracking flag so you can detect an expired request and skip any costly follow-up work.

function promiseTimeout(msec) {
  return (promise) => {
    let isDone = false
    // Mark the promise as settled (fulfilled OR rejected) so the timeout can check it
    promise.then(() => { isDone = true }, () => { isDone = true })
    const timeout = new Promise((yea, nah) =>
      setTimeout(() => {
        if (!isDone) {
          promise.__timeout = true
          nah(new Error('Timeout expired'))
        }
      }, msec))
    return Promise.race([promise, timeout])
  }
}

promiseTimeout(5000)(fetch('https://api.github.com/orgs/nodejs'))
  .then(response => response.json())
  .then(data => {
    console.log(data) // Prints result from `response.json()` above
  })
  .catch(error => console.error(error))

Download Progress Helper

Upload Progress is currently a bit buggy outside of Chrome.

The Progress Handler technique shown below avoids wrapping the fetch call in a closure. 👍

progressHelper has the following interface (source available below)

const progressHelper = require('./progressHelper.js')

const handler = ({ loaded, total }) => {
  console.log(`Downloaded ${loaded} of ${total}`)
}
// handler args: `loaded` = bytes received so far, `total` = bytes expected from Content-Length (or -1 if unknown)

const streamProcessor = progressHelper(handler)
// => streamProcessor is a function for use with the response _stream_

Let’s look at a usage example:

// The progressHelper could be inline w/ .then() below...
const streamProcessor = progressHelper(console.log)

fetch('https://fetch-progress.anthum.com/20kbps/images/sunrise-progressive.jpg')
  .then(streamProcessor) // note: NO parentheses because `.then` needs to get a function
  .then(response => response.blob())
  .then(blobData => {
    // ... set as base64 on an <img src="base64...">
  })

A reusable image downloader might look like getBlob():

const getBlob = url => fetch(url)
  .then(progressHelper(console.log)) // progressHelper used inside the .then()
  .then(response => response.blob())

By the way, a Blob is a Binary Large Object.
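
If you want to show that Blob in an <img>, an object URL is usually simpler than base64 (a small sketch, assuming an <img class="preview"> exists on the page):

getBlob('https://fetch-progress.anthum.com/20kbps/images/sunrise-progressive.jpg')
  .then(blob => {
    const img = document.querySelector('img.preview') // hypothetical target element
    img.src = URL.createObjectURL(blob)
    // Call URL.revokeObjectURL(img.src) later to free the memory
  })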

It’s important to choose ONE of the 2 usage patterns below (they are functionally equivalent):

// OPTION #1: no temp streamProcessor var
fetch(...)
  .then(progressHelper(console.log))

// ⚠️ OR ⚠️

// OPTION #2: define a `streamProcessor` to hold our console logger
const streamProcessor = progressHelper(console.log)
fetch(...)
  .then(streamProcessor)

My preference is Option #1. However, your scope design may force you to use Option #2.

Finally, here’s the last part of this recipe, our progressHelper:

Source: Progress Helper

Adds an onProgress mechanism to track Download Progress when using the `fetch` API.

function progressHelper(onProgress) {
  return (response) => {
    if (!response.body) return response
    let loaded = 0
    const contentLength = response.headers.get('content-length')
    const total = !contentLength ? -1 : parseInt(contentLength, 10)
    return new Response(
      new ReadableStream({
        start(controller) {
          const reader = response.body.getReader()
          return read()

          function read() {
            return reader.read()
              .then(({ done, value }) => {
                if (done) return void controller.close()
                loaded += value.byteLength
                onProgress({ loaded, total })
                controller.enqueue(value)
                return read()
              })
              .catch(error => {
                console.error(error)
                controller.error(error)
              })
          }
        }
      })
    )
  }
}

credit: Special thanks to Anthum Chris and his fantastic Progress+Fetch PoC shown here

Recursive Retry Helper

/**
 * A **Smarter** retry wrapper with currying!
 */
function retryCurry(fn, retriesLeft = 5) {
  const retryFn = (...args) => fn(...args)
    .catch(err => retriesLeft > 0
      ? retryCurry(fn, retriesLeft - 1)(...args)
      : Promise.reject(err))
  return retryFn
}

const getJson = (url) => fetch(url)
  .then(response => response.json())

// Usage
const retryGetJson = retryCurry(getJson, 3)

// Now you can pass any arguments through to your function!
retryGetJson('https://api.github.com/orgs/elite-libs')
  .then(console.log)
  .catch(console.error)

/** Basic retry wrapper for Promises */
function retryPromise(fn, retriesLeft = 5) {
  return fn()
    .catch(err => retriesLeft > 0
      ? retryPromise(fn, retriesLeft - 1)
      : Promise.reject(err))
}

const getJson = (url) => fetch(url)
  .then(response => response.json())

// Usage
retryPromise(() => getJson('https://api.github.com/orgs/elite-libs'))
  .then(console.log)
  .catch(console.error)
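
A common variation (not in the original snippets) adds a delay between attempts, doubling it each time:

/** Sketch: retryPromise with exponential backoff between attempts */
function retryWithBackoff(fn, retriesLeft = 5, delayMs = 500) {
  return fn()
    .catch(err => retriesLeft > 0
      ? new Promise(resolve => setTimeout(resolve, delayMs))
        .then(() => retryWithBackoff(fn, retriesLeft - 1, delayMs * 2))
      : Promise.reject(err))
}

retryWithBackoff(() => getJson('https://api.github.com/orgs/elite-libs'))
  .then(console.log)
  .catch(console.error)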

Handling HTTP Redirects

const checkForRedirect = (response) => {
  // Check for temporary redirect (307), or permanent (308)
  if (response.status === 307 || response.status === 308) {
    const location = response.headers.get('location')
    if (!location) {
      return Promise.reject(new Error('Invalid HTTP Redirect! No Location header.'))
    }
    // You can change the behavior here to any custom logic:
    // e.g. open a "confirm" modal, log the redirect url, etc.
    return fetch(location)
      // Bonus: this will handle recursive redirects ✨
      .then(checkForRedirect)
  }
  return response
}

fetch('https://api.github.com/orgs/elite-libs')
  // Next line will handle redirects
  .then(checkForRedirect)
  .then(response => response.json())
  .then(console.log)
  .catch(console.error)
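
If you only need to know whether a redirect was followed (fetch follows most redirects automatically), the Response object already exposes redirected and url:

fetch('https://api.github.com/orgs/elite-libs')
  .then(response => {
    if (response.redirected) console.log('Request was redirected to:', response.url)
    return response.json()
  })
  .then(console.log)
  .catch(console.error)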

Canceling a fetch request

const httpWithTimeout = (url, timeout = 5000) => {
  const controller = new AbortController()
  // Abort the request after `timeout` milliseconds
  const timer = setTimeout(() => controller.abort(), timeout)
  return fetch(url, { signal: controller.signal })
    .then(response => {
      clearTimeout(timer) // not required, but frees the pending timer
      return response.text()
    })
    .then(text => {
      console.log(text)
    })
}
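
In newer runtimes (roughly 2022+ browsers and Node.js 17.3+), AbortSignal.timeout() can replace the manual controller + setTimeout dance:

fetch('https://api.github.com/orgs/nodejs', { signal: AbortSignal.timeout(5000) })
  .then(response => response.text())
  .then(console.log)
  .catch(error => console.error(error)) // a 'TimeoutError' DOMException if the timeout fires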

Compatibility

As of 2022, the fetch API is widely supported in all modern browsers, and natively in Node.js v18+.

If you must support IE, you can polyfill fetch with the github/fetch package (maintained by an awesome team at GitHub). It’s possible to go as far back as IE8 - your mileage may vary.

Earlier Node.js versions can take advantage of the fetch API with the node-fetch package:

npm install node-fetch
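
A common pattern (assuming CommonJS and node-fetch v2, since v3 is ESM-only) is to fall back to the package only when a global fetch is missing:

// Node.js 18+ ships a global fetch; otherwise use node-fetch as a drop-in
const fetch = globalThis.fetch || require('node-fetch')

fetch('https://api.github.com/orgs/nodejs')
  .then(response => response.json())
  .then(console.log)
  .catch(console.error)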

With the browser polyfill plus node-fetch, you’re effectively 99.99% compatible.

Please Tweet at me if you have other Use Cases you’d like to see. ❤️
