Impact of AWS on Docker: A Case Study

Life is busy these days, and we all want to get our work done faster and more efficiently. Growth in the technological world keeps taking us to the next level. So here I will discuss one of those technologies that has had a great impact on our lives.

Let's imagine a real-life situation before we come to the main topic.

We all run many applications on our systems these days. To do so, we need the following:

→ Sufficient hardware, i.e. RAM, CPU, storage, etc.

→ The knowledge to install the applications.

→ The ability to resolve errors during the installation process.

→ Patience, because the installation process can sometimes take a long time.

Problems we may face while installing any application:

→ The first thing we think is "How do I install it?"

→ We get stuck on "What do I do next?"

→ If we hit an error, we search for a solution on Stack Overflow, Quora, etc.

→ Sometimes the problem gets resolved, sometimes not.

→ If we get stuck, we also have to troubleshoot.

→ Once it is installed, we must wait for the application to boot every time.

The conclusion: the application will probably get installed eventually, but in a busy life where every second matters, we end up spending much of our time just installing software.

Here comes a solution to this problem: Docker. So, we will basically be discussing a case study of Docker technology. The concepts related to this solution are as follows, and we will discuss each of them:

→ What is Cloud Computing?

→ What is AWS?

→ Services that AWS provides.

→ What is Docker?

→ How does Docker work?

→ Why use Docker?

→ Case study of Docker.

What is Cloud Computing?

Cloud Computing Visualization

Cloud Computing means storing, managing, and processing our data on remote servers accessed over a network, rather than on a local server or a personal computer. Here, we can think of our laptops and PCs as local servers.

Sometimes we do not have as many hardware and software resources as we need. At that point we have two options: either invest a large amount of capital in hardware and software to create our own personal server, or go to one of the Cloud Computing service providers. By hardware we mean RAM, CPUs, and hard disks; by software we mean the OS that drives this hardware. Together they form a local server.

There are many Cloud Computing service providers, such as:

→ Microsoft Azure

→ Amazon Web Services (AWS)

→ Google Cloud

→ Alibaba Cloud

→ IBM Cloud

→ Oracle

→ Salesforce

→ SAP

What is AWS?

AWS

AWS is a Cloud Computing company that was launched in 2006. There are many Cloud Computing service providers, but here we will discuss AWS only. The AWS Cloud spans 77 Availability Zones within 24 geographic regions around the world. In India, AWS has one region with 3 Availability Zones.

Region: the geographic area (typically a city) where AWS datacenters are located.

Availability Zones: within a region there may be one or more isolated datacenters; each of these is called an Availability Zone (AZ).

Region and Availability Zones

Blue circles show existing regions; orange circles show regions coming soon.

Users pay for resources on a "pay as you go" model: we pay only for as much as we use. Believe me, it is an excellent fit for startups. Why worry about purchasing resources up front? Just use AWS and pay as you grow. AWS has helped millions of startups succeed, and it will keep doing so. Almost any service you can think of, AWS probably provides. AWS manages petabytes of data for many companies.
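The "pay as you go" idea can be sketched in a few lines. The hourly rate below is made up for illustration; real AWS pricing varies by service, instance type, and region.

```python
# A minimal sketch of "pay as you go" billing, with a hypothetical rate.

def monthly_bill(hours_used: float, rate_per_hour: float) -> float:
    """Charge only for the hours actually consumed."""
    return round(hours_used * rate_per_hour, 2)

# A server you own costs its full price whether you use it or not;
# on-demand billing charges only for actual usage.
always_on = monthly_bill(24 * 30, 0.10)  # running the whole month
on_demand = monthly_bill(8 * 22, 0.10)   # ~8 hours/day, 22 working days

print(always_on)  # 72.0
print(on_demand)  # 17.6
```

The gap between those two numbers is exactly what makes the model attractive to a startup that does not yet run at full load around the clock.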

Services that AWS provides

Services

As of 2020, AWS comprises more than 175 products and services, including compute, storage, networking, databases, analytics, application services, deployment, management, mobile, developer tools, and tools for the Internet of Things.

Some of the featured service categories provided by AWS are Compute, Containers, Blockchain, Analytics, Database, IoT, Machine Learning, Storage, and Robotics. EC2 is the widely used compute service, in which we get RAM, CPU, an OS, and disk. If someone just wants to store data, they can go for a storage service, and so on. We simply choose resources according to our needs and pay only for those. Later, we can easily upgrade as our needs grow.

What is Docker?

Docker

Docker is an open-source project that automates the deployment of software applications inside sandboxes (called containers) by providing an additional layer of abstraction and automation of OS-level virtualization on Linux. Unlike virtual machines, containers do not have high overhead and hence enable more efficient usage of the underlying system and resources. Containers are isolated from one another and bundle their own software, libraries, and configuration files; they can communicate with each other through well-defined channels. Docker, the open-source project and eponymous company that kick-started today's container hype, was founded by Solomon Hykes in 2010; at the time, the company was called dotCloud.

Normally, if we want to run a Linux-based application but we have a Windows OS, we first install VirtualBox, then install a Linux OS inside that virtual machine, and only then install the application. Here, we must dedicate a fixed amount of resources to the virtual machine and its Linux OS. Suppose we give it 100 GB of disk and 2 GB of RAM, but the application uses only 10 GB of disk and 1 GB of RAM; the rest of the resources are wasted. It also sometimes happens that an application suddenly stops working on that Linux OS, for whatever reason. Then it is very difficult for the user to manage things, and the whole installation process has to be repeated.

Docker overcomes this problem. We just need to create a container for each individual application and configure it once. After that we can reuse it indefinitely, without worrying about resources, processing speed, or other issues.
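As a rough illustration, the "configure once" step usually means writing a Dockerfile. The sketch below is hypothetical: the base image, file names, and command stand in for whatever your application actually needs.

```dockerfile
# A minimal, hypothetical Dockerfile for a small Python application.
# The base image, file names, and command are illustrative only.
FROM python:3.11-slim

WORKDIR /app

# Install the application's dependencies once, at build time.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code into the image.
COPY app.py .

# The command the container runs when started.
CMD ["python", "app.py"]
```

Built once, the resulting image can be started again and again on any machine running Docker, with no repeated installation steps.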

How does Docker work?

Docker Architecture

Docker uses a client-server architecture. The Docker client talks to the Docker daemon, which does the heavy lifting of building, running, and distributing your Docker containers. Docker makes it easier and simpler to run containers using simple commands and work-saving automation. AWS services such as AWS Fargate, Amazon ECS, Amazon EKS, and AWS Batch make it easy to run and manage Docker containers at scale. Docker is installed on each server and provides simple commands you can use to build, start, or stop containers.
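That client-daemon workflow can be sketched with a few commands, assuming Docker is installed and the daemon is running; the image and container names here are hypothetical. (These commands talk to a live Docker daemon, so they are shown for illustration rather than as a self-contained script.)

```shell
# Build an image from the Dockerfile in the current directory;
# the client sends the build context to the daemon, which does the work.
docker build -t myapp .

# Start a container from that image, detached, with a name we choose.
docker run -d --name myapp-1 myapp

# List running containers; the client queries the daemon for this.
docker ps

# Stop and remove the container when we are done.
docker stop myapp-1
docker rm myapp-1
```

Every one of these commands is the client side of the architecture described above: the CLI itself does none of the container work, it only asks the daemon to do it.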

In a way, Docker is a bit like a virtual machine. But unlike a virtual machine, rather than creating a whole virtual operating system, Docker allows applications to use the same Linux kernel as the system that they’re running on and only requires applications be shipped with things not already running on the host computer. This gives a significant performance boost and reduces the size of the application.

Why use Docker?

Using Docker lets you ship code faster, standardize application operations, seamlessly move code, and save money by improving resource utilization. With Docker, you get a single object that can reliably run anywhere. Docker’s simple and straightforward syntax gives you full control. Wide adoption means there’s a robust ecosystem of tools and off-the-shelf applications that are ready to use with Docker.

→ SHIP MORE SOFTWARE FASTER

→ STANDARDIZE OPERATIONS

→ SEAMLESSLY MOVE

→ SAVE MONEY

Case study of Docker

Before AWS and after AWS

Docker's parent company was founded in 2010, and Docker itself came into existence in 2013. Until 2013, Docker had few customers. Docker is helping redefine the way developers build, ship, and run applications. At the beginning, the company focused on making the use of containers a standard practice among developers. After 2013, Docker kept gaining customers, but it was a slow process.

Eighteen months later, in 2014, Ben Golub, then CEO of Docker, spoke about its success. He said that developers are like authors: an author does not care how the book will be published, he just writes. It should be the same for developers: a developer should just create, without worrying about infrastructure, server accessibility, server crashes, dependencies, and so on. Then, he said, AWS became the solution to all these worries: because of AWS, the team could keep making Docker more efficient without being distracted by those problems, since AWS would take care of them. He was referring to the Amazon EC2 Container Service at that conference.

→ Before working with AWS, Docker had limited features; afterwards it started to support many languages, databases, and platforms: Linux, Mongo, SQL, etc.

→ Before, it was limited to web-based applications; now it is used in hospitals, banks, and government institutions.

→ Before, it had few downloads; now it has billions of downloads worldwide.

→ Before, processing speed was low; now it is very fast, thanks to AWS.

He said there are five steps that Docker will keep following in the future:

→ Create lightweight containers.

→ Make containers standard, interoperable, and easy to use.

→ Create an ecosystem.

→ Enable a multi-Docker app model.

→ Create a platform for managing it all.

Conclusion

We have discussed everything that demonstrates the importance of AWS. It also reflects the strengths of Cloud Computing. Millions of developers benefit from AWS, and AWS has made technical growth much easier.

THANK YOU….

Anupam Kumar Thakur