Review of the Advanced Node.js Course on LinkedIn Learning

This article is divided into sections so you can jump to the one that interests you most 😎

Disclaimer

Why I took the course

The course review

Conclusion

Disclaimer

This is a personal review of the course "Advanced Node.js" by Alex Banks on LinkedIn Learning, written after completing the course and earning the certificate. It does not represent the interests of LinkedIn or of Alex, nor is it meant to teach the actual course content, even though I will provide a few examples from my jots to make this review worth reading 😊.

Watch out for my subsequent articles breaking the concepts down one by one, so that even an absolute newbie to asynchronous programming can follow along.

That said, let's jump in 🚀

Why I took the course

It all started when I took the LinkedIn Node.js skill assessment to boost my profile but ended up falling outside the top 30% of those who had taken it, which means I did not do well. LinkedIn, out of generosity 😁, then recommended two of their premium courses to me, free for 24 hours; one of the two was Advanced Node.js by Alex Banks. Of course, I quickly jumped on it like anyone willing to learn would have.

As a side note, if you get an opportunity like this and think you cannot finish the content within 24 hours, a simple hack is to skip through the lessons one by one and download all the videos to your computer, grouping and numbering them appropriately. You can do this within 30 minutes to an hour, get the certificate, then watch the videos at your convenience.

A certificate is great, but you need the actual knowledge to validate it.

If you are on a Linux distribution, you can try out Xtreme Download Manager; it compresses videos ridiculously well while keeping good quality, for fast and economical downloads. How to install

The course review

Alex divided the course into three chapters:

  • Asynchronous Patterns
  • Advanced Streams
  • HTTP Streaming

I will go through them in detail and share what I learnt, with code snippets.

Ch. 1 Asynchronous Patterns

The chapter begins with callback patterns and promises, how to run promises in sequence and in parallel, and finally how to run promises concurrently.

A callback is a block of instructions wrapped in a function, to be called when an asynchronous operation has completed.

Since synchronous calls are what we are all familiar with by default, Alex started by explaining callbacks using synchronous calls, then gradually moved on to asynchronous ones.

Example

function hideString(str, done) {
  // defer the callback until the current phase of the event loop completes
  process.nextTick(() => {
    done(str.replace(/[a-zA-Z]/g, "x"));
  });
}

hideString("Hello world", (hidden) => {
  console.log(hidden); // "xxxxx xxxxx"
});
console.log("end"); // logs first, because the callback runs asynchronously

He then quickly pointed out the problem with nested callbacks, termed callback hell or the pyramid of doom. Here is an example of how complicated callbacks can get:

const fs = require("fs");
const beep = () => process.stdout.write("\x07");

const doStuffSequentially = () => {
  console.log("starting");
  setTimeout(() => {
    console.log("waiting");
    setTimeout(() => {
      console.log("waiting some more");
      fs.writeFile("file.txt", "Sample File...", (error) => {
        if (error) {
          console.error(error);
        } else {
          beep();
          console.log("file.txt created");
          setTimeout(() => {
            beep();
            fs.unlink("file.txt", (error) => {
              if (error) {
                console.error(error);
              } else {
                console.log("file.txt removed");
                console.log("sequential execution complete");
              }
            });
          }, 3000);
        }
      });
    }, 2000);
  }, 1000);
};

doStuffSequentially();

So, is there a way out of this doom 😩? Of course, there is always a way out... 😌 PROMISES!

A promise is an object that can be used to represent the eventual completion of an asynchronous operation.

Instead of passing callback functions, a promise gives us a nice way to handle what happens when an operation succeeds or fails, using a chain of then, catch and finally calls.

The example above can be rewritten as:

const beep = () => process.stdout.write("\x07");

const { promisify } = require("util");
const fs = require("fs");
const writeFile = promisify(fs.writeFile);
const unlink = promisify(fs.unlink);

const delay = (s) =>
  new Promise((resolves, rejects) => {
    if (s === 2) return rejects("error");
    setTimeout(resolves, s * 1000);
  });

const doStuffSequentially = () =>
  Promise.resolve()
    .then(() => "starting")
    .then(console.log)
    .then(() => delay(1))
    .then(() => "waiting")
    .then(console.log)
    .then(() => delay(3))
    .then(() => writeFile("file.txt", "Sample File..."))
    .then(() => beep())
    .then(() => "file.txt created")
    .then(console.log)
    .then(() => delay(4))
    .then(() => unlink("file.txt"))
    .then(() => beep())
    .then(() => "file.txt removed")
    .then(console.log)
    .then(() => console.log("sequential execution complete"))
    .catch(console.log);

doStuffSequentially();

Using promises makes our code more readable and clearly shows the sequential steps.

Within this last example cited by Alex, you should have noticed a utility called promisify; it converts methods written to take callbacks into methods that return promises, making our implementation more appealing. Here, it converts the writeFile method of the fs module into a promise-returning function.

Let's dig more into promisify.

The promisify() method, defined in the util module of the Node.js standard library, is basically used to convert a method that returns its result through a callback function into one that returns a promise object.

Using the same writeFile example, writeFile is originally implemented to work like this:

const { writeFile } = require("fs");

writeFile("file.txt", "Hello world", (err) => {
  if (err) console.log("error creating file");
  else console.log("file successfully created");
});

but can be converted to return a promise like this:

const { writeFile } = require("fs");
const { promisify } = require("util");
const writeFilePromise = promisify(writeFile);

writeFilePromise("sample.txt", "hello world")
  .then(() => console.log("file successfully created"))
  .catch(() => console.log("error creating file"));

Isn't that cooler and more readable?!

We have seen callbacks vs promises in action, but this can be made even better with the async...await approach; Alex explained how to use it and how it compares to promises and callbacks in detail.
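For illustration (a sketch of my own, not the course's exact code), the promisified writeFile example above could be written with async/await like this:

const { promisify } = require("util");
const { writeFile, unlink } = require("fs");
const writeFilePromise = promisify(writeFile);
const unlinkPromise = promisify(unlink);

const delay = (s) => new Promise((resolve) => setTimeout(resolve, s * 1000));

// Each await pauses this function (not the event loop) until the promise settles.
const doStuffSequentially = async () => {
  console.log("starting");
  await delay(1);
  await writeFilePromise("file.txt", "Sample File...");
  console.log("file.txt created");
  await delay(3);
  await unlinkPromise("file.txt");
  console.log("file.txt removed");
  console.log("sequential execution complete");
};

doStuffSequentially().catch(console.error);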

Then he went on to compare sequential and parallel execution of promises.

As the names imply, sequential execution runs an action, waits until it completes, then moves on to the next one; this is mostly achieved by using await so the response of the first action is in hand before the next starts. Parallel execution, on the other hand, kicks off several actions at once; this is mostly achieved by invoking the async actions next to each other without await, handling their results with then, or by using Promise.all() and Promise.race() for multiple executions in parallel.

Promise.all() waits for all actions to complete, while Promise.race() settles as soon as the first action completes.
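As a quick illustration (my own sketch, using a hypothetical delay helper rather than anything from the course):

const delay = (s) => new Promise((resolve) => setTimeout(resolve, s * 1000));

// Sequential: each step waits for the previous one (~6 seconds in total)
const runSequentially = async () => {
  await delay(1);
  await delay(2);
  await delay(3);
  console.log("sequential done");
};

// Parallel: all three start at once, Promise.all waits for every one (~3 seconds)
const runInParallel = async () => {
  await Promise.all([delay(1), delay(2), delay(3)]);
  console.log("parallel done");
};

// Promise.race settles as soon as the fastest one does (~1 second)
const runRace = async () => {
  await Promise.race([delay(1), delay(2), delay(3)]);
  console.log("fastest done");
};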

Here are 9 tips on when to use these options

  1. async ensures that the function returns a promise and wraps non-promise values in one, so you should expect a promise from a function declared with async; there is no need to wrap it again in new Promise().
  2. await is used to call an async function and wait for it to resolve or reject. await blocks the execution of the code within the async function in which it appears, but not the main event loop.
  3. If the output of function2 depends on the output of function1, use await on function1 so its response is available before executing function2.
  4. To run promises in parallel, create an array of promises and then use Promise.all(promisesArray).
  5. Every time you use await, remember that you are writing blocking code (within that function). Over time we tend to neglect this.
  6. Instead of creating huge async functions with many await asyncFunction() calls in them, it is better to create smaller async functions. This way you stay aware of how much blocking code you are writing.
  7. Another advantage of using smaller async functions is that you force yourself to think about which async functions can be run in parallel.
  8. If your code contains blocking code, it is better to make it an async function. By doing this you make sure that somebody else can use your function asynchronously.
  9. By making async functions out of blocking code, you enable the user who calls your function to decide on the level of asynchronicity they want.

Finally, Alex explains concurrency, which is necessary when you have tasks made up of lots of intensive async actions: running them all in parallel might take too many resources and overload the CPU, while running them sequentially might be way too slow.

So a promise queue is needed that runs the promises in batches, e.g. out of 10 promise actions, execute a maximum of 3 at a time; thus we have at most 3 promises running in parallel, and as they resolve the next ones start until all are complete. Hence, this can be termed a hybrid approach (sequential execution of limited parallel promises 🤯).
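Roughly, such a promise queue could look like this (a minimal sketch reconstructed from my jots; Alex's implementation differs in its details):

// Runs async tasks with at most `concurrency` of them in flight at once.
class PromiseQueue {
  constructor(tasks = [], concurrency = 3) {
    this.todo = tasks; // array of functions that return promises
    this.running = 0;
    this.concurrency = concurrency;
  }

  run() {
    return new Promise((resolve) => {
      const runNext = () => {
        // done when nothing is left to start and nothing is still running
        if (!this.todo.length && this.running === 0) return resolve();
        // start tasks until we hit the concurrency limit
        while (this.running < this.concurrency && this.todo.length) {
          const task = this.todo.shift();
          this.running++;
          task().then(() => {
            this.running--;
            runNext();
          });
        }
      };
      runNext();
    });
  }
}

// Usage: 10 delayed logs, at most 3 in flight at any moment.
const delay = (s) => new Promise((res) => setTimeout(res, s * 1000));
const tasks = Array.from({ length: 10 }, (_, i) => () =>
  delay(1).then(() => console.log(`task ${i + 1} done`))
);
new PromiseQueue(tasks, 3).run().then(() => console.log("all done"));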

You can find full examples for each of these async patterns here.

Ch. 2 Advanced Streams

He began with why streams are necessary, and did the topic justice by comparing buffering all the data to be returned to the client in memory at once with streaming it chunk by chunk, and showing how memory-intensive each approach is.

A quick example of the two is below

// BUFFER

// Load the whole content into a buffer/ into memory once and send it out to user

const http = require("http");
const media = "./testvid.mp4";
const fs = require("fs");

http
  .createServer((req, res) => {
    fs.readFile(media, (err, data) => {
      if (err) {
        console.log({ err });
        res.writeHead(500);
        return res.end();
      }
      res.writeHead(200, { "Content-Type": "video/mp4" });
      res.end(data);
    });
  })
  .listen(3000, () => {
    console.log("buffer - port 3000");
  });


// STREAM

// read the content chunk by chunk

const http = require("http");
const fs = require("fs");
const media = "./testvid.mp4";

http
  .createServer((req, res) => {
    res.writeHead(200, { "Content-Type": "video/mp4" });
    fs.createReadStream(media)
      .on("error", console.log)
      .pipe(res);
  })
  .listen(3000, () => console.log("stream - port 3000"));

Then he differentiated between the different types of streams.

There are 4 types of streams in Node.js:

  • Writable: streams to which we can write data, e.g. fs.createWriteStream(), process.stdout, or the HTTP response object on the server.
  • Readable: streams from which data can be read, e.g. fs.createReadStream(), process.stdin, or the HTTP request object on the server.
  • Duplex: streams that are both Readable and Writable, e.g. TCP sockets from the net module.
  • Transform: streams that can modify or transform the data as it is written and read, e.g. zlib.createGzip().

Readable Stream

He built a custom readable stream, streamFromArray (extending the standard Readable class from the stream module), that reads the content of a large array item by item. He then gave an additional example of an existing readable stream from the fs module.

const fs = require("fs");

const readStream = fs.createReadStream(
  "./link/to/a/video.mp4"
);

readStream.on("data", console.log);
readStream.on("end", () => console.log("Done!"));
readStream.on("error", console.log);

createReadStream is a method of the fs module that does exactly what the name implies: it creates a read stream. The stream then emits events as data is read, depending on what happens: data when a chunk is read without error, error if an error is encountered, end when all the data has been read, and so on.
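As for the custom readable stream mentioned earlier, here is roughly what streaming from an array looks like (my own reconstruction, not Alex's exact streamFromArray code):

const { Readable } = require("stream");

// A Readable stream that emits the items of an array one at a time.
class StreamFromArray extends Readable {
  constructor(array) {
    super({ objectMode: true }); // allow pushing arbitrary JS values, not just buffers/strings
    this.array = array;
    this.index = 0;
  }

  _read() {
    if (this.index < this.array.length) {
      this.push(this.array[this.index++]);
    } else {
      this.push(null); // signal the end of the stream
    }
  }
}

const peopleStream = new StreamFromArray(["Amy", "Bola", "Chen", "Dayo"]);
peopleStream.on("data", (chunk) => console.log("received:", chunk));
peopleStream.on("end", () => console.log("Done!"));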

Another example of a readable stream is process.stdin, which allows us to read user input.

Try it

process.stdin.on("data", (chunk) => {
  console.log("echo: ", chunk.toString());
});
console.log("start typing... then hit enter");

Finally, he differentiated between flowing mode and non-flowing (paused) mode: the former pushes data to you continuously as soon as it is available, while the latter pauses the stream after each chunk until the consumer asks for the next one (think of Unix cat versus less, even though they are not a perfect illustration).

This shows that a readable stream can be paused and resumed as necessary, e.g. paused when the write stream is overloaded and resumed once it has drained.

Writable Stream

Simply put, data read from a readable stream can be written to a writable stream, e.g. process.stdout, which writes to the console and is the underlying mechanism of console.log.

He gave a real example of writing fetched data to a file, e.g. fetching data from the cloud and writing it to a local file.

const { createReadStream, createWriteStream } = require("fs");

const readStream = createReadStream("./link/to/remote/file.mp4");
const writeStream = createWriteStream("./store/copy1.mp4");

readStream.on("data", (chunk) => {
  writeStream.write(chunk);
});
readStream.on("error", console.log);
readStream.on("end", () => {
  writeStream.end();
});
writeStream.on("close", () => {
  process.stdout.write("File copied\n");
});
writeStream.on("error", console.log);

Rounding this up, he discussed back-pressure and illustrated it with a real-world scenario, piping water from a supply into a storage tank. He explained how to handle back-pressure by pausing the read stream, letting the write stream drain, and then resuming the read stream, and noted that this long pause-drain-resume process can be avoided simply by using the pipe function.
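In code, that manual pause-drain-resume handling looks roughly like this (a sketch of the pattern under my own assumptions, reusing the hypothetical file paths from earlier):

const { createReadStream, createWriteStream } = require("fs");

const readStream = createReadStream("./link/to/remote/file.mp4");
const writeStream = createWriteStream("./store/copy1.mp4");

readStream.on("data", (chunk) => {
  // write() returns false when the writable's internal buffer is full
  const canContinue = writeStream.write(chunk);
  if (!canContinue) {
    readStream.pause(); // apply back-pressure: stop reading
    writeStream.once("drain", () => readStream.resume()); // resume once the buffer empties
  }
});
readStream.on("end", () => writeStream.end());
readStream.on("error", console.log);

With pipe, all of that bookkeeping collapses into: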

const { createReadStream, createWriteStream } = require("fs");

const readStream = createReadStream(
  "./link/to/remote/file.mp4"
);
const writeStream = createWriteStream("./store/copy1.mp4");

readStream
  .pipe(writeStream)
  .on("close", () => console.log("File Copied!"))
  .on("error", console.error);

Duplex and Transform Streams

He described these as related: they are both the middle section, sitting between a read stream and a write stream for mid-stream processing (think of them as Node.js middleware), e.g. measuring how much data is passing through, throttling the stream, or encrypting the data as it is read from the read stream and before it is written to the write stream.
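For instance, a simple Transform that sits between the read and write streams and reports how much data has passed through might look like this (my own sketch, not the course's code):

const { Transform } = require("stream");
const { createReadStream, createWriteStream } = require("fs");

// A pass-through Transform that counts the bytes flowing through it
const byteCounter = new Transform({
  transform(chunk, encoding, callback) {
    this.total = (this.total || 0) + chunk.length;
    process.stdout.write(`\r${this.total} bytes processed`);
    callback(null, chunk); // forward the chunk unchanged
  },
});

createReadStream("./link/to/remote/file.mp4")
  .pipe(byteCounter) // sits in the middle, observing the data
  .pipe(createWriteStream("./store/copy1.mp4"))
  .on("close", () => console.log("\nFile Copied!"));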

Ch. 3 HTTP Streaming

This is the last chapter, where he explained HTTP streaming: web servers uploading or downloading streams (text files, PDF files, audio files, video files, etc.).

Streams are everywhere, and the idea is that you want to stream everything, thereby saving a lot of memory.

I will not go into much detail here, as the first two chapters should already give you an idea of what this is about and how it would likely be implemented.

He delved deep into how to stream data (e.g. video) from server to client and from client to server, how to set headers appropriately, how to process video ranges, how to ensure your video file is compatible with all browsers, how to parse incoming streamed data using common libraries, and so on.
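To give a flavour of what that looks like, here is a rough sketch of serving a video with support for HTTP Range requests, so browsers can seek (my own reconstruction from my jots, not the course's exact code):

const http = require("http");
const fs = require("fs");
const media = "./testvid.mp4";

http
  .createServer((req, res) => {
    const { size } = fs.statSync(media);
    const { range } = req.headers;
    if (range) {
      // e.g. "bytes=32768-" -> start streaming at byte 32768
      const [start, end] = range
        .replace(/bytes=/, "")
        .split("-")
        .map((n) => parseInt(n, 10));
      const chunkEnd = Number.isNaN(end) ? size - 1 : end;
      res.writeHead(206, {
        "Content-Range": `bytes ${start}-${chunkEnd}/${size}`,
        "Accept-Ranges": "bytes",
        "Content-Length": chunkEnd - start + 1,
        "Content-Type": "video/mp4",
      });
      fs.createReadStream(media, { start, end: chunkEnd }).pipe(res);
    } else {
      res.writeHead(200, { "Content-Length": size, "Content-Type": "video/mp4" });
      fs.createReadStream(media).pipe(res);
    }
  })
  .listen(3000, () => console.log("range streaming - port 3000"));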

My full code jots for chapters 2 and 3 can be found here.

Conclusion

If you have read this far, kudos 👏.

I believe you have learnt, or been reminded of, a few useful things. I highly recommend this course for anyone looking to boost their understanding of advanced Node.js concepts.

You can head over to LinkedIn Learning to get started.

For non-premium members like myself: you can activate premium for 30 days free and enjoy tons of courses; also consider subscribing to premium if you can afford it, as there are lots of benefits attached.

Huge shout-out to LinkedIn for activating this course for me free for 24 hours 😄.

If you are curious about the Node.js skill assessment: I took it again after finishing the course and passed, i.e. scored above the 70th percentile or, better put, fell within the top 30% of the thousands that have taken the assessment.

Look out for my next series, breaking these concepts down one after the other, suitable even for absolute newbies.

I always love meeting new friends; say hi on Twitter @abdulloooh or on LinkedIn (Abdullah Oladipo), and thanks for reading 🤝
