Get Started with Docker | Dockerizing a nodeJS application

Yes! Docker is hot right now! Working on the command line is always fun and just feels amazing 😍 and if a noobie is watching you use the command line, it's a Godlike feeling (#proHacker) 😂 just kidding. Well, Docker will get this fantasy of yours fulfilled if you've got one. All jokes apart, let's get started right away.

Prerequisites

Make sure Docker is installed on your machine. Nothing else is required!!

Our sample nodeJS Application

I am using a simple nodeJS REST API template that I have created for this post. It follows an MVC (Model View Controller) design pattern. You can have a quick look at the code here.

The API has two basic routes: baseUrl/api/test, a GET request, and baseUrl/api/upload-image, a POST request that lets you upload images by providing the image in the request body.

The node application listens on port 3300 when running on your machine. So, go ahead and clone the repository and test it on your local machine using any REST client like POSTMAN or INSOMNIA.

Alternatively, you can also bring your own nodeJS app; the process is the same. So, now let's begin the Dockerization process.

Creating the Dockerfile

In the root directory of the nodeJS application create a Dockerfile. My Dockerfile looks like this,

FROM node:14

WORKDIR /app

COPY package.json .

RUN npm install

COPY . .

EXPOSE 3300

CMD ["node", "app.js"]

Now, let's analyze the Dockerfile for a better understanding.

Specifying the Base Image

The first line FROM node:14 tells Docker to pull the node base image from Docker Hub. We start from a node base image and build our application on top of it.

The node base image that we use here is maintained by the official nodeJS team. Here, I am pinning a specific version of node, i.e., 14, which is a stable LTS version of nodeJS at the time of writing this post. You can use any base image of your choice as per your requirements. More details are available on the official node image page on Docker Hub.

Setting up the Working Directory

As we are containerizing, i.e., Dockerizing, our nodeJS application, we need to set up a working directory inside the container. So, the second line in the Dockerfile, WORKDIR /app, sets the working directory to /app inside the container.

According to the documentation,

The WORKDIR instruction sets the working directory for any RUN, CMD, ENTRYPOINT, COPY and ADD instructions that follow it in the Dockerfile. If the WORKDIR doesn’t exist, it will be created even if it’s not used in any subsequent Dockerfile instruction.

Also, the Docker best practices recommend using it:

For clarity and reliability, you should always use absolute paths for your WORKDIR. Also, you should use WORKDIR instead of proliferating instructions like RUN cd … && do-something, which are hard to read, troubleshoot, and maintain.

I would also like to mention that the working directory can have any name of your choice; I have used /app here. Note that all the instructions after this one are executed relative to the Working Directory. With our WORKDIR specified, let's move on to installing all the packages required by our node application.

Installing the packages

To let Docker know that our app needs some external packages and dependencies to run, we need to tell Docker which packages to install and their versions. Locally, we have our package.json file in which all our required dependencies are listed.

So, COPY package.json . copies the package.json file from our local machine into the image when it is built.

Here, I would also like to explain the "." that follows package.json after a blank space. This is the destination: it tells Docker to copy package.json into the above-specified Working Directory, i.e., /app.

So, we have now provided Docker with the information about which packages to install. But Docker won't install them automatically. Therefore, we use the RUN npm install instruction to fetch all the packages.

Copying all the required files

Now, we need to copy all our app files and folders into the Working Directory. To do this we use the COPY . . instruction. It tells Docker to copy everything in the project root directory into our working directory.

The first "." after the COPY instruction refers to the root of our project directory on our local machine, and the second "." points to the Working Directory of our Dockerized application.

Using ".dockerignore". Don't copy everything!

We don't need to copy node_modules if it exists on our local machine, as it will be recreated when the RUN npm install instruction is executed (copying it could even break the build, since some packages contain platform-specific binaries). Also, we don't need to copy the Dockerfile itself.

So we create a .dockerignore file in our project root directory. Docker reads this .dockerignore file during the image build and excludes all the files and folders specified in it. It works just like .gitignore does in Git.

My .dockerignore looks like this,

node_modules
Dockerfile
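If your project has them, a few more entries are commonly excluded as well. The additions below beyond the original two are suggestions, not part of the template:

```
node_modules
npm-debug.log
Dockerfile
.dockerignore
.git
```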

Exposing the Container internal port to the External World

This step is very important! By default, the container that is created by Docker is a closed environment. But, we need to communicate with the container from outside.

To make this happen we need to expose the container internal port that our node application is listening on to the external world. In this case, our app is listening on port 3300.

So our next instruction is EXPOSE 3300. Note that EXPOSE mainly documents which port the container listens on; the port is actually published to the outside when we run the container with the -p flag, as we will do below.

Starting our app in the container

With all the above steps done, we can now tell Docker which command should start the app inside the container.

We use the CMD ["node", "app.js"] instruction to do this. Unlike RUN, CMD is not executed during the build; it specifies the command that runs when a container is started from the image. With all this done we can now move on to the image creation.

Creating our Docker Image

Here comes the part that you have been waiting for! Running commands in the terminal🤪. Let's proceed!

Open your Terminal/Command Prompt/PowerShell based on your Operating System, or any command line of your choice.

Make sure Docker Desktop is running on your machine.

Navigate to your Project Root Directory from your terminal.

Building the Image

To build the image run docker build . from your terminal. This will build your image, but leave it untagged, so you can refer to it only by the image ID printed at the end of the build output. (The amusing auto-generated names like strange-tharp are assigned to containers, not images.)

If you want to give your own image name then use the -t flag in your build command like this: docker build -t <your_image_name>:<image_tag> . (note the trailing ".", which is the build context).

With this, Docker will start building the image. If the image is being built for the first time, it will take some time because Docker has to pull the node base image and install your packages. Subsequent builds are almost instant, because Docker caches the layers that have not changed.

To check images you can run docker images from your terminal. This command will list all your images.

Building Container from the Image

Now that we have built the image successfully, it's time to spin up our nodeJS app container. To start a container run docker run <your_image_name>.

But wait, do you remember that we exposed the container internal port 3300? We cannot visit http://localhost:3300 directly, as the app is no longer running on our local machine but inside the container.

So we need to map any port on our local machine to the container exposed port 3300. Here, I will map port 3000 to the container port 3300. To do this we use the -p flag in our docker run ... command like this,

docker run -p 3000:3300 <your_image_name>:<image_tag>

This will map our local machine port 3000 to the container's exposed port 3300. Now the container will start with an automatically assigned name.

Go to http://localhost:3000 and you can see your app running. In the case of APIs, you can test your Dockerized app using any REST client.

You have done it! Congrats for coming this far. You can now Dockerize any nodeJS application.

Some more Docker commands

  • To list all your running containers, run docker ps.
  • To see all running as well as stopped containers, run docker ps -a.
  • To stop a running container docker stop <container_name>.
  • To assign your own name to the container use --name flag in your docker run command,
docker run -p 3000:3300 --name <container_name> <your_image_name>:<image_tag>
  • To remove all stopped containers, docker container prune.
  • To remove all dangling images, docker image prune.
  • To remove multiple images docker rmi <image_name> <image_name> ....
  • To run the container in detached mode and to remove the container automatically when stopped use -d and --rm flag,
docker run -p 3000:3300 -d --rm --name <container_name> <your_image_name>:<image_tag>

Data Persistence

You may have noticed that when the container is stopped and removed, the data it contains is gone as well. In our case, if we upload images to our Dockerized app, they are permanently deleted once the container is stopped and removed and a new container is created.

This is not a bug but the default behavior of Docker. So, if we need the data to persist even after the container is stopped and removed, we need a volume. Here we will use a bind mount, which maps a host folder into the container. (Strictly speaking, a named volume is one managed by Docker itself, e.g. -v images-data:/app/images; what follows uses a host path instead.)

A bind mount maps a folder inside the container onto a location on our local machine, so that if the container is destroyed and a new one is created, the same folder is mapped into the new container as well. Hence the data will persist.

We can define such a bind mount using the -v flag,

docker run -p 3000:3300 -d --rm --name <container_name> -v <absolute_path_to_the_folder_you_need_to_persist>:/app/<path_to_that_same_folder_in_container> <your_image_name>:<image_tag>

In my case, I used a bind mount with the absolute path like this,

docker run -p 3000:3300 -d --rm --name <container_name> -v /Users/shashankbiplav/desktop/dockerized-node/images:/app/images <your_image_name>:<image_tag>

This will add Data persistence to your app.

Conclusion

That was a whole lot of talking to get started with Docker and Dockerizing a nodeJS application. I hope you liked this post. Now go ahead and dockerize any nodeJS app or any other app of your choice. The principles are the same.

For a more complex setup refer to my other post where I use Docker to set up a development environment for Laravel. Happy Coding👨🏽‍💻 Devs!