Node.js

This section groups helpful information about Node.js

Stream error handler

I use the 'error' event to handle exceptions raised by streams in Node.

input_stream.on('error', function (error) {
  // Log the error and send the user to a generic error page.
  console.error(error);
  res.redirect("/error/500");
});

Express.js

JSON parser

Express now has a built-in parser (based on body-parser) for requests with a JSON payload, that is, those with Content-Type: application/json.

//Handle json post requests
app.use(express.json({"limit": "1Mb"}));
The parsed JSON is accessible via req.body. This middleware is only activated for requests with a JSON payload.

There are several configuration parameters, such as limit, which rejects payloads larger than the specified size.
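To illustrate, here is a rough sketch (not Express's actual implementation) of what the middleware does with the limit option; the function name parseJsonBody is made up:

```javascript
// Rough sketch of what express.json() does with `limit` (not Express's real code):
// reject payloads over the configured size, then parse the body.
const LIMIT = 1 * 1024 * 1024; // "1Mb"

function parseJsonBody(raw) {
  if (Buffer.byteLength(raw, 'utf8') > LIMIT) {
    // Express responds with 413 Payload Too Large in this case.
    throw new Error('request entity too large');
  }
  return JSON.parse(raw);
}
```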

https://expressjs.com/en/api.html

Parsing form data during a POST request in Node.js

You should use the express.urlencoded middleware to populate the body property of the req object.

Fileupload

express-fileupload is an Express library providing middleware to handle files uploaded via a POST form. The middleware adds a files attribute to the request object; an uploaded file is then accessible via req.files.<form_field_name>.


//Handle binary post requests
app.use(fileUpload({
    createParentPath: true
}));
<form method="post" action="/games/puzzle-from-text/from-file" 
enctype="multipart/form-data" target="_self">
    <input class="form-element" type="file" 
    accept="text/plain" name="text_file" id="input-filepath">
</form>
let input_stream = Stream.Readable.from(req.files.text_file.data.toString());

Template engines

Use app.engine to register a template engine.


app.engine('ntl', function (filepath, options, callback) {
    fs.readFile(filepath, function (err, content) {
        if (err) return callback(err);
        var rendered = content.toString();

        for (let el in options.vars) {
            rendered = rendered.replace(new RegExp(`#${el}#`, 'g'), options.vars[el]);
        }

        return callback(null, rendered);
    });
});
app.set('views', './assets/views')
app.set('view engine', 'ntl');

You need to register the location of the template files. We do that by setting an Express configuration parameter called views.

Parameter: views
Type: String or Array
Description: A directory or an array of directories for the application's views. If an array, the views are looked up in the order they occur in the array.
Default: process.cwd() + '/views'

You can also set the default engine extension to use when a file is requested without an extension. This is the view engine parameter.

A list of configuration parameters is available at: http://expressjs.com/en/api.html#app.settings.table

Streams in node.js

Readable: a stream you can pipe from, but not pipe into (you can receive data from it, but not send data to it). Data pushed into a readable stream is buffered until a consumer starts reading it.
Writable: a stream you can pipe into, but not pipe from (you can send data to it, but not receive data from it).
Duplex: a stream you can both pipe into and pipe from; basically a combination of a Readable and a Writable stream.
Transform: similar to a Duplex, but its output is a transform of its input.

A readable stream stores data pushed to it via the push method in an internal buffer whose size is limited by the constructor option highWaterMark. Once the buffer reaches its limit, push signals (by returning false) that no more data should be stored until the buffer is consumed.

Similar reasoning applies to writable streams, except that we use the write method to push data into the buffer.

Signaling to the stream that the data has ended

To signal the end of input data in a Readable stream, you should push the null object.

stream.push(null)

Example: pipe a readable stream into the stdin of a child process (binary)


  function puzzleFromString(req, res) {
    let input_stream = new Stream.Readable({ read() { } });
    input_stream.push(req.body.text);
    input_stream.push(null); //signal the end of the input.

    binServices.generatePuzzle({ "input_stream": input_stream })
      .then(jsonPuzzle => res.send(jsonPuzzle));
  }

  function puzzleFromFile(req, res) {
    binServices.generatePuzzle({ "input_stream": req })
      .then(jsonPuzzle => res.send(jsonPuzzle));
  }  

  function generatePuzzle({input_stream=null,brick_filepath='',num_letters=7,min_words=5,mode='random'}) {       
    const wordDetectiveApp = `${BIN_DIR}/word-detective`;

    let wd;
    if(input_stream!==null){
        wd = p_execFile(wordDetectiveApp, []);
        input_stream.pipe(wd.child.stdin);
    }else{
        wd = p_execFile(wordDetectiveApp, []);
    }

    return wd.then(result => new Promise(function(resolve) {
        resolve(result.stdout);
    }));
  }

Note that in the example above we didn't set up a read function. This is not good practice: you should push data to the stream inside the read function. Here is an example:


    let buffer = Buffer.from(req.body.text);
    let input_stream = new Stream.Readable({ 
      "read": function(){
        let read_so_far = 0;
        return function(size) {
          input_stream.push( buffer.toString('utf8',read_so_far,read_so_far+size) );
          read_so_far += size;

          // Push null to signal the end of the buffer
          if(read_so_far>=buffer.length) input_stream.push(null);
        }; 
      }() 
    });

I used a buffer instead of a string because the former is supposed to be more efficient for the substring operation: taking a substring of a string involves copying.

Ref 1: https://www.freecodecamp.org/news/node-js-streams-everything-you-need-to-know-c9141306be93/
Ref 2: https://nodejs.dev/learn/nodejs-streams
Ref 3: https://nodejs.org/api/stream.html#readablereadsize

Differences between ECMAScript modules and CommonJS

Node.js was originally built on CommonJS, but ECMAScript modules (ESM) are the standard now. There are some differences between these module systems, notably the way we import modules.

First, set package.json to use ECMAScript modules.

"type": "module"
Then, instead of using require, you must use the import statement.
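A minimal package.json with this setting might look as follows (the name and entry point are placeholders):

```json
{
  "name": "my-app",
  "main": "index.js",
  "type": "module"
}
```

With this in place, const fs = require('fs') becomes import fs from 'fs'.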


Host configuration and cookies

To use cookies, you need to set a hostname when you create your application with Express. However, setting a hostname did not work at first when the application was dockerized. How do you make the hostname work inside a container?

The hostname to set is '0.0.0.0', which binds the server to all network interfaces:

app.listen(port, '0.0.0.0', () => {
    console.log('Server running')
});

Dockerizing a node.js application

The problem was the combination of the cookie's secure attribute and a non-HTTPS server. I was setting the cookie as

document.cookie = cname + '=' + cvalue + ';' + expires + 'domain=localhost;path=/;SameSite=Lax;secure';

The last attribute, secure, says the cookie may only be sent over secure protocols such as HTTPS. By removing this attribute, I was able to save and read cookies normally.
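The corrected assignment, as a sketch (the cookie name, value, and expiry are hypothetical):

```javascript
// Building the same cookie string without the `secure` attribute,
// so it is also accepted over plain HTTP. All values are hypothetical.
const cname = 'theme';
const cvalue = 'dark';
const expires = 'expires=Fri, 31 Dec 2027 23:59:59 GMT;';
const cookie = cname + '=' + cvalue + ';' + expires + 'domain=localhost;path=/;SameSite=Lax';
// In the browser: document.cookie = cookie;
```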

Note: I guessed that the secure attribute applied only to cookies set via HTTP(S) requests... anyway, I should implement an HTTPS server for Express.

Script dirname


import * as path from 'path';
import { fileURLToPath } from 'url';

const __dirname = path.dirname(fileURLToPath(import.meta.url));
const PROJECT_ROOT = path.resolve(__dirname,"../../");

Reading command line arguments

Command line arguments passed to a node script are stored in process.argv.
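For example (the script name and flags in the comments are hypothetical):

```javascript
// process.argv[0] is the node binary and process.argv[1] is the script path;
// user-supplied arguments start at index 2.
const args = process.argv.slice(2);
console.log(args);
// $ node script.js --mode random  → args is [ '--mode', 'random' ]
```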

Node + Docker

While experimenting with Node+Docker I found that I could not specify the hostname.

Running outside container:


app.listen(port, hostname, () => {
    console.log(`Server running at http://${hostname}:${port}/`);
});

However, when running inside the container, I could not access the application URL when specifying a hostname. I tried 'localhost' and '192.168.1.22' without success. After removing the hostname parameter from app.listen, it worked at 'localhost:4958'.


app.listen(port, () => {
    console.log(`Server running at http://localhost:${port}/`);
});

Node Package Manager

You can use npm to install Node packages. Here are some useful ones to get started:

  • express: a minimal web-server framework with basic utility functions
  • node-fetch: use the fetch function in Node as you would window.fetch in the browser

Installation

# Using Ubuntu
curl -fsSL https://deb.nodesource.com/setup_16.x | sudo -E bash -
sudo apt-get install -y nodejs

# Using Debian, as root
curl -fsSL https://deb.nodesource.com/setup_16.x | bash -
apt-get install -y nodejs

What is Node.js

It is a JavaScript runtime. It comes with some basic low-level libraries, letting you run JavaScript outside of the browser.