Arek Nawo
22 Jan 2019
8 min read

Async JavaScript basics revisited

JavaScript may seem like an easy language to start with. That might appear true at first glance, but in reality, JS is far from the easiest programming language to learn. Only a full dive in reveals its complexity. One of the most important and complex aspects to understand is asynchronous programming. While many programmers have used it in their code, a great number of them might not know how it really works. In this article, I’ll try to explain this concept in the simplest possible way, so that its fundamentals become clear once and for all. I’ll focus on what asynchronous exactly means and, later on, on ways of utilizing it in JS. Let’s get started. 🎉

Event loop 💡

First of all, let’s focus on what asynchronous even means. We could just say that it means that something is not synchronous (running in a predefined order), but I think we can do better than that. To understand it, we need some basics. JavaScript is a single-threaded language: your code runs on one thread only, thus it isn’t and can’t be concurrent (which is how async is often misinterpreted). JS code is executed linearly, fragment by fragment, with each program usually consisting of at least two fragments (calls, functions, operations, etc.). Each event that occurs in your code (an event here means a piece of code that will be run) is handled by the so-called event loop. Every time an iteration of this loop takes place (it’s called a tick), the next event is run. That’s how JS works in a nutshell. This concept can be better understood with this piece of code:

const eventLoop = [/* events */];
let event;
// Run forever, handling one event per tick.
while (true) {
    if (eventLoop.length > 0) {
        // Take the next event from the queue and execute it.
        event = eventLoop.shift();
        event();
    }
}

Very simple code illustrating how this never-ending event loop runs your JS code. To understand this even better, let’s compare the process to human thinking. You may argue that our brains allow us to do many things at a time, so-called multitasking. I don’t agree with this statement. I think that our brains are more like single-threaded super-computers, completing tasks one by one or breaking them into smaller parts. It’s like a hyper-fast asynchronous event loop. You first do one thing, then another. If something takes more time or effort, you break it into smaller chunks (something like a to-do list) and execute them in a particular order, while in the meantime, between these small tasks, you do something completely different. You might agree with me or not, but this comparison lets me explain the whole event loop in a much simpler way. It also shows why many programmers confuse async with concurrent programming, just like many people think they’re doing true multitasking. The truth is that we, just like JS, work linearly, finishing one part of a bigger task (or one whole smaller task) and moving on to the next, repeating this process over and over. It’s because of speed that this linear way of working can be mistaken for multitasking. The better multitaskers simply have their event loops running faster than others.

Async in-depth

I hope this clears up at least some aspects of this whole mess. Let’s finally talk about the asynchronous workflow then. As you probably know, AJAX is the most widely known application of async programming in JS. Consider the example below.

myExampleFunction1(); // 1
ajax(""); // 3
myExampleFunction2(); // 2

This example AJAX request might seem a bit useless without any callback - I’m leaving that for later. The numbers next to each line indicate the order in which the given calls will be completed. As you can see, the AJAX call finishes last even though it’s called second. Why’s that? Because of how async works. First, our AJAX function sends a request to the server - it’s like a single item on its to-do list. When that’s done, execution proceeds to the third line. As it’s a synchronous function, it needs to fully complete before the next tick. Finally, later on, we get the response from the server, and our callback (if defined) is pushed to the event loop and invoked.
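The same ordering can be reproduced without any AJAX at all - setTimeout also defers its callback to a later tick of the event loop, so this minimal sketch shows the 1, 3, 2 pattern:

```javascript
console.log("one");                        // 1 - runs immediately
setTimeout(() => console.log("three"), 0); // 3 - queued for a later tick
console.log("two");                        // 2 - runs before the timer fires
```

Even with a delay of 0, the timer callback can only run once the current synchronous code has finished, which is exactly why the AJAX callback above fires last.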


Callbacks are the standard way to deal with asynchronous workflow in JS. A callback allows us, as the name implies, to define code that will be executed when a given async task reaches a defined point. Let’s go back to the previous example and see how a callback can be applied to it.

ajax("", data => {
    // Handle the response data here.
    console.log(data);
});

After our AJAX function gets a response from the server, our program will execute the provided callback with the appropriate data. It all might seem fine at this point, but there’s a catch. Callback functions have a number of issues. One of them is that they’re not consistent with the sequential way our code is executed. A callback hands control over to the event loop, so much so that we can’t be sure of the order our code runs in, and thus what the outcome will be. That’s not really good. From this uncertainty, yet another problem emerged. It’s well-known by the name of callback hell 🔥. It can occur when a nested chain of asynchronous functions and callbacks is used. Think about this example.

ajax("", data1 => {
    ajax("", data2 => {
        ajax("", data3 => {
            // ...and so on, deeper and deeper
        });
    });
});

This may seem a bit unrealistic, but believe it or not, such use-cases and other nested async callbacks are pretty widespread. Here we’re clearly dealing with yet another problem: readability. Even with arrow functions, our code may be hard to read, especially once error handling and additional logic are added.
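Before promises, a common way to flatten this pyramid was to split each step into a named function. A minimal sketch, assuming the same hypothetical ajax(url, callback) signature as above (stubbed out here with setTimeout so the example runs):

```javascript
// Stub standing in for the ajax() calls above: it "responds"
// asynchronously with a dummy value on a later tick.
function ajax(url, callback) {
    setTimeout(() => callback("data from " + url), 0);
}

// Each step is a named function instead of a nested anonymous callback.
function step1(data1) {
    ajax("/second", step2);
}
function step2(data2) {
    ajax("/third", step3);
}
function step3(data3) {
    console.log(data3); // "data from /third"
}

ajax("/first", step1);
```

The nesting is gone, but the control flow is now scattered across the file - which is part of why promises were such a welcome improvement.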



So, callbacks aren’t perfect - we know that. That’s why ECMAScript 6 introduced a revolutionary way of dealing with async workflows - promises. Now, don’t think of promises as an ideal way of doing everything async and a solution to every callback problem. While promises are definitely way better than callbacks, they’re just wrappers around what we already know, providing a much, much better API. Promises also allow us to create code that doesn’t interfere with our natural, sequential control flow.

I think that’s enough about the pros of promises; let’s see what they’re all about. A promise, in plain English, is just that - a promise. 😂 We’re given a promise of a future value. Let’s look at the API.

const p = new Promise((resolve, reject) => {
    // Call resolve() to resolve the promise and specify its value
    // Call reject() to reject the promise
    resolve(10);
});
p.then(value => {
    // Handle the promise being resolved
    console.log(value); // 10
});
p.catch(err => {
    // Handle the promise being rejected
});

With the above code and comments, I hope it’s clear how to deal with promises. As for our AJAX function above:

const request = ajax("");
request.then(data => {});
request.catch(err => {});
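Since .then() itself returns a new promise, handlers can also be chained, with a single .catch() at the end covering a rejection in any step:

```javascript
Promise.resolve(5)
    .then(n => n * 2)          // 10
    .then(n => n + 1)          // 11
    .then(n => console.log(n)) // logs 11
    .catch(err => console.error(err)); // handles a rejection from any step above
```

This chaining is what restores the top-to-bottom, sequential reading order that nested callbacks destroy.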

I personally think that the promises API is really nice to write and look at, especially when combined with arrow functions. The API also provides us with the Promise.all() and Promise.race() methods: the first takes an array of promises and resolves when all of them have resolved, while the second takes an array of promises and settles as soon as the first of them settles. More about the API can be found, naturally, on MDN.
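A small sketch of both combinators, using one already-resolved promise and one timer-based promise:

```javascript
const a = Promise.resolve(1);
const b = new Promise(resolve => setTimeout(() => resolve(2), 10));

// all() waits for every promise and keeps the input order.
Promise.all([a, b]).then(values => console.log(values)); // [1, 2]

// race() settles as soon as the first promise settles - here a wins,
// since it's already resolved.
Promise.race([a, b]).then(first => console.log(first)); // 1
```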

Asynchronous future

We’ve gone over callbacks and promises - the current mainstream ways of doing async in JS. But there’s an even better solution, introduced in ES2017, called async/await. Using these two keywords, you can write asynchronous code just like you would write synchronous code. Here’s our AJAX function in yet another form:

async function requestData() {
    const data = await ajax("");
    return data;
}
// Note: an async function always returns a promise, not the data itself.
requestData().then(data => console.log(data));

How ingenious this is! Now, obviously it’s mostly yet more syntactic sugar over promises, and there are some differences, but the basic idea remains the same. Async/await builds on what are called generators. These are a pretty advanced topic, introduced alongside promises back in ES6. What are they exactly? In the simplest words, generators allow us to define functions that are executed in a non-linear way, not necessarily from top to bottom - rather like a list of tasks added to our event loop in an asynchronous manner.
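One more async/await benefit worth sketching before we get to generators: rejected promises surface as exceptions at the await, so ordinary try/catch replaces .catch() handlers. A minimal sketch, with a hypothetical fetchData() stub standing in for any promise-returning call such as ajax():

```javascript
// Stub returning an already-resolved promise, standing in for a real request.
function fetchData() {
    return Promise.resolve("payload");
}

async function main() {
    try {
        const data = await fetchData();
        console.log(data); // "payload"
    } catch (err) {
        // A rejected promise would land here, just like a thrown exception.
        console.error("request failed:", err);
    }
}

main();
```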

function* generatorExample() {
    console.log("Generator start");
    yield; // Generator pauses here
    console.log("Generator end");
}
const gen = generatorExample(); // nothing runs yet
gen.next(); // "Generator start"
gen.next(); // "Generator end"

So, as you can see, generators are defined just like normal functions, with a preceding * symbol. They’re driven with their .next() method, but there’s much more to them than I can cover here. You can always read the documentation to learn more. I’d add that these techniques are especially useful when you’re dealing with custom and asynchronous iterators, which are really advanced aspects of JavaScript, not suitable for this article. 🙃
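For illustration, here’s a generator driven manually with .next(), including passing a value back in at the paused yield - the same two-way channel that async/await machinery relies on:

```javascript
function* counter() {
    // The argument of the *second* next() call becomes the result of
    // this yield expression once the generator resumes.
    const x = yield 1;
    yield x + 1;
}

const it = counter();
console.log(it.next().value);   // 1 - runs until the first yield
console.log(it.next(41).value); // 42 - x is 41, so we yield 41 + 1
```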

It’s just the beginning

I hope this article helped you gain a basic understanding of asynchronous workflow in JS, provided a nice refresher, or just entertained you. If you want to know more about async, callbacks, promises, etc., I really recommend reading the book “You Don’t Know JS: Async & Performance”, which is a great source of knowledge about all things async. If this post helped you, consider sharing it with the buttons below and following me on Twitter or on my Facebook page. 🦄
