Why should I care about Docker?

Devon Campbell - Jan 21 '19 - Dev Community

What is Docker?

Docker is a new approach to virtualization. If you understand virtualization, feel free to skip the next section. If not, you’ll need a basic understanding of virtualization before I can help you understand Docker.

What is virtualization?

Let’s start with a metaphor: imagine you own a house. You have a friend who needs a place to stay. You have a few options if you want to help out your friend.

  1. Move your friend right into your bedroom with you. This could get a little tense.
  2. Build a new house for your friend on your property. This is an expensive solution.
  3. Invite your friend to stay in the spare bedroom. Now we’re getting somewhere…

The third option is pretty good. You’re able to help out your friend without building them a new house, while still keeping your lives mostly separate. You’ll share some common resources like the kitchen and the living room, but you can each go into your own bedrooms and close the door for some privacy.

Virtualization is like setting up your friend in your spare bedroom. Imagine you want to run a web server on your computer. You want to keep it separate from your own operating system and applications. To accomplish this, you can run a virtual machine containing the web server. It runs like a separate computer, but it uses your computer’s processor and RAM. When you start the virtual machine, its entire operating system shows up in a window inside your operating system.

What's different about Docker?

Docker is a different way to do virtualization. Where a typical virtual machine packages an entire guest operating system along with the application you’re running, Docker containers share the host’s kernel and as much else as they can between your virtualized apps. This makes them use fewer resources when they run and makes them easier to ship around to other developers or to your production environment.

If you're learning web development on your own, it's hard to know what you should learn next. Sign up for a free mentoring session over at Rad Devon, and we'll figure out your next steps to transition to your web development career!

Why should developers use Docker?

Docker gives web developers some cool superpowers.

Easy Sharing of Development Environments

If you and I are going to collaborate on a Node app, we’d want to make sure we both have Node installed and that it’s the same version so that our environments are consistent. We could skip this and hope for the best, but it could cause us problems that might be difficult to narrow down. Libraries and our own code will sometimes behave differently across different versions of Node.

The solution is to make sure we both have the same version of Node. If each of us already has other projects on our systems that require different versions of Node, we’ll probably want to install NVM, which allows us to switch Node versions easily. We can then add a .nvmrc file to the root of the project specifying the common version we want.
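
The .nvmrc file is as simple as it gets: a one-line text file holding nothing but the version number. A minimal sketch, assuming we settled on Node 11.6.0, so the file contains just:

11.6.0

Each developer then runs these commands once from the project directory:

nvm install   # installs the version listed in .nvmrc
nvm use       # switches the current shell to that version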

We only have to do this once, so our work is now done. To summarize, here’s what we had to do:

  1. Decide on a Node version.
  2. Install NVM.
  3. Install our chosen version of Node.
  4. Add a .nvmrc to the project directory, setting the correct Node version.
  5. Start the app.

It works, but it’s a lot. We have to do most of this again for anyone else we want to join us on this project. Even if we take all these steps, we still can’t guarantee the environment is the same for all developers. Things could break between developers running different operating systems or even different versions of the same operating system.

Docker irons out all of these problems by delivering the same development environment to every developer. Here’s what we would do with Docker instead:

  1. Install Docker.
  2. Write a Dockerfile.
  3. Run docker build -t <image-name> . from the project directory. The trailing dot tells Docker to use the current directory as the build context, and the image name can be whatever you choose.
  4. Run docker run -p 3000:3000 <image-name>, using the same image name as in step 3. The -p option maps a container port to a local port, so hitting port 3000 on your computer reaches port 3000 in the container.
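
For example, if we decide to call the image my-node-app (a hypothetical name), the commands from steps 3 and 4 would look like this:

# Build the image from the Dockerfile in the current directory
docker build -t my-node-app .

# Run it, mapping port 3000 in the container to port 3000 on the host
docker run -p 3000:3000 my-node-app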

This may not seem much simpler than the Node/NVM setup (and it really isn’t), but it does come with an advantage: you only need to install Docker once, regardless of your tech stack. Sure, you’ll only have to install Node once (unless you need multiple versions), but when you’re ready to work on an app built on a different stack, you’ll have to install all the software that stack requires. With Docker, you’ll just write a different Dockerfile (or Docker Compose file, depending on the complexity of your app).

The Dockerfile is very simple: it’s a text file named “Dockerfile” without an extension. Let’s look at a Dockerfile you might use for a simple Node app.

# This Docker image will be based on the Node 11.6 image
FROM node:11.6.0

# Do everything inside /app in the image
WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY package*.json ./
RUN npm install

# Copy the Node app from the host into the image at /app
COPY . .

# Expose port 3000 and start the app
EXPOSE 3000
CMD ["npm", "start"]

This Dockerfile is written for a Node app that listens on port 3000 and starts with the npm start command. Commit it to your project’s repository, and onboarding new developers becomes easy and 100% consistent: every developer gets the same environment every time.

Develop on the Same Environment as Production

Once you have your app up and running in a Docker development environment, you can actually ship that entire container directly to production. If you think it’s a pain to deal with inconsistencies between two developers, just wait until you write code that works on your machine only to have it not work in production. It’s extremely frustrating.

You have tons of options for deploying Docker containers to production. I like Heroku’s approach because it lets you simply push up your project with a Dockerfile and have them build and run it. Most other hosts require a few more steps, like pushing your Docker image to a registry first. The extra steps aren’t the end of the world, but they aren’t necessary.
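
If you go the Heroku route, the setup boils down to a heroku.yml file that tells Heroku to build your Dockerfile, plus switching the app to Heroku’s container stack. Treat this sketch as an outline to check against Heroku’s docs rather than a definitive recipe:

# heroku.yml at the project root
build:
  docker:
    web: Dockerfile

# Then, assuming the Heroku CLI and an existing Heroku app:
heroku stack:set container
git add heroku.yml
git commit -m "Deploy with Docker"
git push heroku master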

What about more complex apps?

Because of the philosophy of Docker (one process per container), most apps will require multiple containers. For example, a WordPress site should consist of a container for the web server running PHP and a container for the MySQL database. That means you need some way for your containers to talk to each other. This is called container orchestration.

If you can run all your containers on a single host, Docker Compose will probably meet your orchestration needs. It’s included when you install Docker, and it’s easy to learn. It lets you start multiple containers simultaneously and set up networking between them so they can talk to one another. This is the quickest and easiest way to orchestrate multiple containers.
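
Here’s a minimal sketch of a docker-compose.yml for the WordPress example above. The image tags, credentials, and port choices are illustrative assumptions, not a production-ready setup:

version: "3"
services:
  wordpress:
    image: wordpress
    ports:
      - "8080:80"               # browse to http://localhost:8080 on the host
    environment:
      WORDPRESS_DB_HOST: db     # "db" resolves to the database container below
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: example
      WORDPRESS_DB_NAME: wordpress
    depends_on:
      - db
  db:
    image: mysql:5.7
    environment:
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: example
      MYSQL_ROOT_PASSWORD: example

A single docker-compose up then starts both containers and puts them on a shared network where they can reach each other by service name.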

If you need to orchestrate containers spread across multiple hosts, Kubernetes is the prevailing solution. Many hosts that support Docker deployments also offer Kubernetes for orchestration.

Quick Wins from Understanding Docker

It may not seem important now, but file this knowledge away for the first time you bump into an issue caused by differences in environments. You won’t want it to happen again. By learning Docker, you’ll be able to ensure a consistent environment for your app, no matter where it’s running or who is running it. That means consistent results that you, your clients, and your employers can rely on.
