Intro to Simple Docker with Node & Koa


Docker notes

Overview

Docker is a useful tool for running applications in isolated containers. You can configure which ports and/or directories are exposed to the host system. Docker's best feature is the ability to run the same app (packed into a container) with the same environment (OS, libraries, runtimes, etc.) on different hosts. This neatly solves the problem of conflicting versions of Node, Ruby, etc., and their version-specific "features". Another advantage of dockerized applications is that they know nothing about the host OS and cannot read or change host files (except the shared files and directories you explicitly configure).

Docker can also run applications in stateless mode (with the --rm flag), i.e. each time an application is started, it is just like the first time it was started.
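For example, a throwaway run might look like this; the container and its writable layer are removed as soon as the process exits (the node:8 image tag is just an example):

    docker run --rm node:8 node -e "console.log('hello from a throwaway container')"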

A non-obvious advantage of Docker is its layered file system. Each layer stores only the changes relative to the layer below it. As a result, if you have two application containers based on the same image (for example, an Ubuntu-based image of ~800 MB), you need only (800 MB + app1 diffs + app2 diffs) on your hard drive. Further, Docker can be configured to start app containers on host boot and to restart terminated apps.
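The restart behaviour is controlled by a restart policy; as a sketch (using an nginx container purely as an illustration):

    docker run -d --restart unless-stopped nginx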

Tools: Docker-Compose

A very useful companion tool for Docker is docker-compose. It allows you to run several Docker containers with configured links between them (start order, shared ports, and shared directories, called "volumes" in Docker terms). You only need to write a docker-compose.yml file and then run (and build, if needed) your apps with docker-compose up. Calling docker-compose down will stop all the applications and remove the allocated resources (private networks, containers, etc.).

In addition, docker-compose allows settings to be shared between different YAML files: common settings can be stored in one file, and separate docker-compose.yml files can be used for production and development.
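As a sketch, compose merges multiple files passed with -f, so an environment-specific override can sit on top of a shared base file (the file names here are just an assumed convention):

    docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d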

All in all, docker-compose is a must-have for app development. It allows you to run your app and the required database engine without installing development tools or the database on the host.

Sample docker-compose.yml file:
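The original sample is not reproduced here; the sketch below shows what such a file might look like, assuming a Node app and a MongoDB service (service names, ports, and image tags are illustrative):

    version: '2'
    services:
      web:
        image: node:8
        working_dir: /src
        volumes:
          - .:/src
        ports:
          - "3000:3000"
        command: node index.js
        depends_on:
          - db
      db:
        image: mongo:3.4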

Tools: Docker-Machine

Another useful tool for Docker is docker-machine. It allows you to work with a remote Docker daemon (on a remote machine, a virtual machine, or a VPS) just as you would with the local Docker. For example, the command docker-machine create -d virtualbox default will create a virtual machine named default on VirtualBox (which must be installed) from boot2docker.iso. Then, if you run eval "$(docker-machine env default)", you can use docker and docker-compose just like you would with a local Docker.

docker-machine used to be the only way to use Docker on Windows and OS X. Now there are more "native" solutions for Windows 10 (Docker for Windows) and for OS X 10.10.3+ (Docker for Mac).

How To: Passing settings to contained applications

The easiest way to do that is by using environment variables. You can pass them via -e NAME=VALUE or --env-file with docker, or with the environment or env_file settings in docker-compose.yml when using docker-compose. If you need to use a config file, you can put it in a specific directory on the host OS and then pass that directory as a volume to the container (like -v /path/to/conf/on/host:/conf with docker).
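A minimal sketch of both options, assuming a hypothetical APP_MODE variable and an env file named app.env:

    docker run --rm -e APP_MODE=production node:8 node -e "console.log(process.env.APP_MODE)"
    docker run --rm --env-file ./app.env node:8 env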

Useful docker commands

Run NodeJS tests without nvm
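The original snippet is not shown; a sketch, assuming the project lives in the current directory and its tests run via npm test:

    docker run --rm -v "$PWD":/src -w /src node:8 npm test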

Build .Net Core project without installing .Net Framework and Visual Studio
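Again a sketch; the SDK image name and tag here are assumptions:

    docker run --rm -v "$PWD":/src -w /src microsoft/dotnet:2.0-sdk dotnet build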

Run a specific version of MongoDB server with a single command
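For example (the version tag and container name are illustrative):

    docker run -d --name mongo -p 27017:27017 mongo:3.4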

Run a specific version of PostgreSQL server with a single command
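For example (version tag, container name, and password are illustrative):

    docker run -d --name postgres -p 5432:5432 -e POSTGRES_PASSWORD=secret postgres:9.6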

Run a Bash command line inside a container
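Where <container_name_or_id> is a placeholder for your running container:

    docker exec -it <container_name_or_id> bash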

Stop a container (a packed application)
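    docker stop <container_name_or_id>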

Show the log of a container (a packed application)
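    docker logs <container_name_or_id>
    docker logs -f <container_name_or_id>   # follow the log output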

Show running containers (packed applications)
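    docker ps        # running containers only
    docker ps -a     # include stopped containers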

Save a container to an image (to publish it or to use it as a base for child containers)
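A sketch; myuser/myimage is an illustrative repository name:

    docker commit <container_name_or_id> myuser/myimage:latest
    docker push myuser/myimage:latest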

docker-compose usage
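The typical day-to-day commands, run from the directory containing docker-compose.yml:

    docker-compose up -d      # build (if needed) and start all services in the background
    docker-compose logs -f    # follow the logs of all services
    docker-compose down       # stop everything and remove containers and networks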

GUI tools for Docker

Kitematic

My thoughts about using Docker for development

I found a lot of solutions on the internet that recommend building a dedicated container image for the app each time. But I see disadvantages there:
– You need a ready Dockerfile to start the app
– You must have a ready-to-run application
– You don't have direct control over the application's execution (you have to stop the container to rebuild the app, or use tools like nodemon, and there is no direct access to the app's console)

I suggest not creating a Dockerfile at the beginning (it can be created later if needed).

Steps for new project

I will use Node.js here, but this approach can be applied to other languages and tools too.

1. Create an empty directory for the project and initialize version control there
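For example (the project name is arbitrary):

    mkdir koa-docker-app && cd koa-docker-app
    git init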

2. Create a docker-compose.yml which will run the required databases and other services
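The original file is not shown; a sketch that matches the following steps might look like this, with a long-running app container based on a plain node image (so nothing has to be built yet) and a MongoDB service (names, ports, and versions are assumptions):

    version: '2'
    services:
      app:
        image: node:8
        working_dir: /src
        volumes:
          - .:/src
        ports:
          - "3000:3000"
        tty: true
        command: bash
        depends_on:
          - mongo
      mongo:
        image: mongo:3.4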

3. Run docker-compose
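    docker-compose up -d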

4. Connect to the command line of the app container and go to the /src directory
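Assuming the service is named app as in the sketch above:

    docker-compose exec app bash
    cd /src    # inside the container's shell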

Now you have access to the source directory from the container's shell.

5. Create app files. Install dependencies.

Now you can create any files the app needs in the host OS using any text editor. From the container's shell you can install dependencies and run the app.
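From the container's shell, for example:

    npm init -y
    npm install koa --save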

Create a file index.js in the project directory with the following content:
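The original listing is not shown; a minimal Koa server along these lines would do:

    // index.js: a minimal Koa server (a sketch, the original listing is not shown)
    const Koa = require('koa');
    const app = new Koa();

    app.use(async ctx => {
      ctx.body = 'Hello from Koa inside Docker';
    });

    app.listen(process.env.PORT || 3000);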

Run the app
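From the container's shell:

    node index.js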

Add more files and modules as needed. From the container's shell you can run/stop the app, run tests, etc.

As a result, you can develop a Node.js (and not only Node.js) application on a computer without having Node.js or MongoDB installed on the host.

6. Create a Dockerfile for production.

Create a Dockerfile with the following content:
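The original contents are not shown; a sketch of a typical production Dockerfile for this app (base image tag and port are assumptions):

    # production image for the Koa app (a sketch)
    FROM node:8
    WORKDIR /app
    COPY package.json ./
    RUN npm install --production
    COPY . .
    EXPOSE 3000
    CMD ["node", "index.js"]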

And create a new docker-compose.production.yml for production mode.
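Again a sketch, assuming the Dockerfile above and a PORT environment variable provided by the host:

    version: '2'
    services:
      app:
        build: .
        ports:
          - "${PORT}:3000"
        restart: always
        depends_on:
          - mongo
      mongo:
        image: mongo:3.4
        restart: always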

Now, on the production server, set the environment variable PORT to the required value (Heroku does that itself) and run docker-compose -f docker-compose.production.yml up -d to run the packed app.

Andrey Belchikov
blogs@bandwidth.com

Andrey is a Software Developer at Bandwidth. He started his career as an embedded-device programmer in C/C++, then switched to web development in C#. Now he knows JavaScript, Ruby, and Golang, and has experience in frontend, backend, and mobile development. He is always finding something new to learn in IT, following his credo: "you don't live if you don't like to study". When not at work he can be found taking photos and playing with his child.
