How to Deploy Node.js Microservices to AWS Using Docker

Cloud services are quickly becoming the norm in different kinds of industries. The software development world is especially undergoing significant breakthroughs thanks to cloud technology. Not only has it helped boost productivity, but it has also enabled developers to build quality applications that would have traditionally taken ages.


While the world of software development is vast and uses a wide variety of tools, certain tools are standard almost everywhere. In some cases, these are flexible IDEs and robust frameworks. When it comes to JavaScript microservices development, for example, the Node.js runtime is an uncompromisable prerequisite. With developers adopting the cloud so rapidly, the need to set up Node.js on a cloud platform commonly arises.


This article discusses a particular example of setting up Node.js on the popular cloud platform AWS. It uses the container technology Docker to build a Node.js container and deploy it on AWS, and it also briefly discusses Node.js and its popularity.


A Bit About Node.js

Described as an “asynchronous event-driven JavaScript runtime” by its official documentation, Node.js is the most popular JavaScript runtime environment. It is designed for scalability and for handling network applications effectively, hence its importance for microservices. It is also a leading choice as a foundation for web frameworks and libraries. Node.js is event-driven at its core and enables the user to work with events directly.


Node.js is also well-known for its security features. It uses the OpenSSL library as one of its dependencies to provide the security standards expected across the web. Its other primary dependency is V8, the JavaScript engine maintained by Google for its Chrome browser. At the time of writing, the most recent major release is Node.js 16, from April 2021.


By dividing your whole application into microservices, the application becomes modular, with each module able to run independently. When it comes to building scalable microservices that interact with the network effectively, Node.js is one of the most dependable choices. The front-end then provides a clean interface over all these independently running microservices. Apart from Node.js, .NET is another popular platform for building microservices.

Bringing Node.js to AWS Through Docker

As Node.js is an industry standard for JavaScript runtime environments, development teams of all levels actively use it, including teams working on cloud platforms such as AWS and Azure. Thankfully, the leading container technology, Docker, simplifies the process significantly. Configuring a container that sets up the Node.js application and its dependencies, and then running it on the platform, should get you going.


Making a container around a Node.js application requires developing the application first. Afterward, users configure the Dockerfile that will eventually build the container image. Finally, the container is run on Docker and then set up on the AWS platform. Discussed below are the main steps for setting up Node.js on AWS using a configured Docker container.


The user will have to install Docker and have an AWS account as prerequisites.

Making A Dockerfile & Container For Your Node.js Application

The Dockerfile contains all the relevant information for setting up and running the Node.js application within its container. You can write it manually, or use a service that handles the technical work for you; one such service, Hydra, builds microservices out of Node.js applications. Written manually, the Dockerfile looks something like this:

FROM node:6.9.4-alpine
LABEL maintainer="Emad Bin Abid"
EXPOSE 5050
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
ADD . /usr/src/app
RUN npm install --production
CMD ["npm", "start"]

The first line specifies the Node.js base image, and the next lines specify the maintainer and the port that the microservice will listen on. The remaining lines create the relevant directory, install the dependencies, and run the application. Afterward, the application's config.json file needs updating, because its network configuration will change once inside the container. For example, look at the following fragment:

"redis": {
  "url": "redis",
  "port": 7080,
  "db": 15
}
The url field, which would usually point to localhost, is now set to the named DNS entry.
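Inside the application, that configuration would typically be loaded and turned into a connection address. A small sketch, where the require path is an assumption about the project layout:

```javascript
// Sketch of consuming the config fragment above; in the real service this
// object would come from: const config = require('./config/config.json');
const config = {
  redis: { url: 'redis', port: 7080, db: 15 }
};

// Build the address a Redis client would connect to. Inside the container,
// the "redis" hostname resolves through the DNS entry added at run time.
const redisAddress = `${config.redis.url}:${config.redis.port}`;
```

Keeping the hostname in config rather than hard-coding localhost is what lets the same code run unmodified both locally and inside the container.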


Once the Dockerfile is set up, we move on to building the image and running it in Docker to get our container. The following command contains the primary parameters for configuring our container.

$ docker run -d -p 5050:5050 \
   --add-host redis:<redis-host-ip> \
   --name test-service \
   emad/test-service:0.0.1

The -d and -p flags specify that the service runs in the background and publish the service port used in the Dockerfile above. We use the --add-host flag to create the DNS entry referenced in the config change above, supplying the address of the Redis host. We name the container through the --name flag, and the last argument identifies the image and its version tag, which should match the version in package.json.


We push the container image to Docker Hub through the command below; we will pull it from there on AWS.

$ docker push emad/test-service:0.0.1

Deploying The Container On AWS

Before moving on, we assume that you are already familiar with AWS, particularly its EC2 service. You will need to create EC2 instances and secure SSH access to them. Afterward, the security groups and ports need to be configured. You can set up an instance by selecting the ECS-Optimized AMI on AWS and opening the service port (5050 in our example) in its security group.

$ ssh -i <key-file> ec2-user@<instance-public-ip>

With the above command, we SSH into the instance so that we can install our container. It is essential to install the security updates as well. We log in to the Docker service through the following command, and afterward we pull our container from Docker Hub.

$ docker login
$ docker pull emad/test-service:0.0.1

We go to the /etc/rc.local file in the machine image and add the following configuration information.

docker rm -f test-service
docker run -d -p 5050:5050 \
   --restart always \
   --add-host redis:<redis-host-ip> \
   -v /usr/local/etc/configs/test-service:/usr/src/app/config \
   --name test-service \
   emad/test-service:0.0.1

The location of the test-service config file is specified with the -v flag; that host directory needs to be created and populated with the config file. If your service depends on something like Redis, as in this example, a separate Redis container needs to be set up on another instance. Its address is supplied through the --add-host flag, and you can pull a stock Redis container from Docker Hub.


After all these steps, we restart the EC2 instance with the following command, and the microservice should be up and running. You are now running Node.js on AWS.

$ sudo reboot

Setting Up Node.js Microservices On AWS Easily With Docker

Node.js is a powerful environment that enables most of the microservice applications we use in our daily lives. With enterprises of all kinds moving to cloud services to boost their productivity, there needs to be a quick and easy way to set up something as fundamental as Node.js there. Thankfully, robust container technologies like Docker make that process hassle-free and easy.

Emad Bin Abid

I'm a software engineer who has a bright vision and a strong interest in designing and engineering software solutions. I readily understand that in today's agile world the development process has to be rapid, reusable, and scalable; hence it is extremely important to develop solutions that are well-designed and embody a well-thought-of architecture as the baseline. Apart from designing and developing business solutions, I'm a content writer who loves to document technical learnings and experiences so that peers in the same industry can also benefit from them.
