Synchronous vs Asynchronous
Synchronous execution usually refers to code executing in sequence: each statement runs only after the previous one finishes. Asynchronous execution refers to execution that doesn't run in the sequence it appears in the code. In the following example, the synchronous alerts fire in order. In the asynchronous version, while alert(2) appears second in the code, it executes last.
// Synchronous: 1, 2, 3
alert(1); alert(2); alert(3);

// Asynchronous: 1, 3, 2
alert(1);
setTimeout(() => alert(2), 0);
alert(3);
Blocking vs Non-blocking
Blocking refers to operations that halt further execution until they finish. Non-blocking refers to operations that don't. In the example below, localStorage.getItem is a blocking operation: execution stalls while it reads. fetch, on the other hand, is non-blocking: it does not stop alert(3) from executing.
// Blocking: 1, ...2
alert(1);
var value = localStorage.getItem('foo');
alert(2);

// Non-blocking: 1, 3, ...2
alert(1);
fetch('https://example.com').then(() => alert(2));
alert(3);
One advantage of non-blocking operations is that the CPU stays busy doing useful work, and you save the memory that idle threads would otherwise hold.
An example of blocking is how some web servers, like traditional Java or PHP ones, handle requests. If your code does something blocking, like reading from a database, it "stalls" at that line and waits for the operation to finish. During that period, your machine holds onto memory and processing time for a thread that isn't doing anything.
How other requests are served while that thread is stalled depends on your setup. Your server can spawn more threads to handle them or, with a load balancer in place, forward requests to the next available instance. Either way, that means more setup, more memory consumed, more processing.
In contrast, non-blocking servers, like those built on Node.js, use only one thread to service all requests. This might sound counter-intuitive, but the creators designed it around the idea that I/O, not computation, is the bottleneck. It also means an instance of Node makes the most of a single core; nothing stops you from spawning another instance per core if you want to get the most out of your hardware as well.
When requests arrive at the server, they are serviced one at a time. When the code being serviced needs to query the DB, for example, it sends off a request to the DB. However, instead of waiting for the response and stalling, it registers a callback and the code continues running. When the DB returns data, the callback is placed on a queue, where it waits pending execution. Whenever the engine is doing nothing (the call stack is empty), it picks up a callback from that queue and executes it.
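That queueing behavior can be observed directly: even a zero-delay setTimeout callback waits until the synchronous code has emptied the stack (the log entries below are stand-ins for servicing requests):

```javascript
// Event loop ordering: callbacks run only once the call stack is empty.
const order = [];

order.push('request handled');
setTimeout(() => order.push('db callback'), 0); // queued immediately, runs later

// Synchronous work keeps the stack busy; the callback stays queued.
for (let i = 0; i < 3; i++) {
  order.push('more sync work ' + i);
}
// 'db callback' is appended only after all the synchronous code above finishes.
```

Even with a delay of 0, the callback cannot jump the queue: it runs only after the script's synchronous work has completed.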