A Deep Dive into Docker - Part 2
By Anand Akela

In Part One of this Docker primer I gave you an overview of Docker, how it came about, why it has grown so fast and where it is deployed. In this second part, I'll delve deeper into the technical aspects of Docker: the difference between Docker containers and virtual machines, the parts and elements that make up Docker, and the basics of getting started.

Docker vs. Virtual Machines
First, I will contrast Docker containers with virtual machines run under hypervisors such as VirtualBox or VMware. With a virtual machine, an entire guest operating system lives inside the environment, running on top of the host through a hypervisor layer. In effect, two operating systems are running at the same time.

In contrast, Docker containers share the host's kernel and virtualize the operating-system services each container sees, including its own file system. Although there is only a single operating system, containers are self-contained and cannot see the files or processes of another container.

Differences Between Virtual Machines and Docker

  • Each virtual machine has its own operating system, whereas all Docker containers share the host's kernel.

  • A virtual machine keeps running until it is shut down; a Docker container, on the other hand, stops as soon as the command it was started with completes.

  • Because of their CPU and memory overhead, a typical computer can only run a few virtual machines at a time. Docker containers are lightweight, and many can run side by side on an average laptop. Docker's resource efficiency is changing the way developers approach creating applications.

  • Virtual machines boot their own operating system, so they might take several minutes to start. Docker containers do not need to load an operating system and start in a fraction of a second.

  • Virtual machines have no effective diff mechanism and are not version controlled. With Docker you can run docker diff against a container to see what has changed in its file system, and Docker Hub lets you check images in and out, with both private and public repositories available.

  • A single virtual machine is launched from a set of VMDK and VMX files, while several Docker containers can be started from a single Docker image.

  • A virtual machine's host operating system does not have to match the guest operating system. Docker containers do not have their own independent operating system, so they must use the same kernel as the host (the Linux kernel).

  • Virtual machines do not use snapshots often - they are expensive and mostly reserved for backups. Docker containers use a layered imaging system in which each committed change becomes a new layer on top of the image, so snapshots are cheap and routine (a short example follows this list).
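
As a quick illustration of the diff and layering points above, here is a minimal sketch; the container name and the repository name are placeholders, not taken from the article:

    $ sudo docker run --name demo ubuntu touch /tmp/hello    # run a container that creates a file, then exits
    $ sudo docker diff demo                                   # list files added (A), changed (C) or deleted (D) relative to the image
    $ sudo docker commit demo myuser/demo:v1                  # capture the change as a new image layer
    $ sudo docker push myuser/demo:v1                         # check the image into a Docker Hub repository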

Similarities Between Virtual Machines and Docker

  • For both Docker containers and virtual machines, processes in one cannot see the processes in another.

  • Docker containers are running instances of a Docker image, just as virtual machines are running instances of their VMX and VMDK files.

  • Docker containers and virtual machines both have a root file system.

  • A single virtual machine has its own virtual network adapter and IP address; Docker containers can also have a virtual network adapter, IP address, and ports (see the example after this list).
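
To make the networking point concrete, here is a minimal sketch; the nginx image and the port numbers are illustrative assumptions:

    $ sudo docker run -d -p 8080:80 --name web nginx                          # map container port 80 to host port 8080
    $ sudo docker inspect --format '{{ .NetworkSettings.IPAddress }}' web     # show the container's virtual IP address
    $ curl http://localhost:8080                                              # the containerized server answers on the mapped host port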

Virtual machines let you access multiple platforms, so users across an organization will have similar workstations. IT professionals have plenty of flexibility in building out new workstations and servers in response to expanding demand, which provides significant savings over investing in costly dedicated hardware.

Docker is excellent for coordinating and replicating deployments. Instead of running one large instance with a robust, full-bodied operating system, applications are broken down into smaller pieces that communicate with each other.

Installing Docker
Docker gives you a fast and efficient way to port apps across machines and systems. Using Linux Containers (LXC), you can place apps in their own containers and operate them in a secure, self-contained environment. The important Docker parts are as follows:

  1. Docker daemon manages the containers.

  2. Docker CLI is used to communicate with and issue commands to the daemon.

  3. Docker image index is either a private or public repository for Docker images.

Here are the major Docker elements:

  1. Docker containers hold everything, including the application.

  2. Docker images are snapshots of containers or of a base operating system.

  3. Dockerfiles are scripts that build images automatically.

Applications using the Docker system employ these elements.
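
A quick way to see the daemon/CLI split in practice is to let the CLI report on both sides; these are standard Docker CLI calls:

    $ sudo docker version    # prints separate Client (CLI) and Server (daemon) sections
    $ sudo docker info       # the daemon reports the containers and images it is managing
    $ sudo docker ps -a      # the CLI asks the daemon to list all containers, running or stopped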

Linux Containers - LXC
Docker containers can be thought of as directories that can be archived or packed up and shared across a variety of platforms and machines. All of an application's dependencies and libraries live inside the container; the container itself relies on Linux Containers (LXC). Linux Containers let developers box up applications and their dependent resources in their own environment inside the container. The container takes advantage of Linux features such as profiles, cgroups, chroots and namespaces to manage the app and limit its resources.
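
Those cgroup-based limits are exposed directly as docker run flags: -m caps the container's memory through the memory cgroup, and --cpu-shares assigns a relative CPU weight through the cpu cgroup. A minimal sketch, with the limits and image chosen purely for illustration:

    $ sudo docker run --rm -m 256m --cpu-shares 512 ubuntu echo "constrained container"    # memory and CPU limited via cgroups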

Docker Containers
Among other things, Docker containers provide process isolation, application portability, resource management, and security from outside attacks. At the same time, a container cannot interfere with the processes of another container, cannot run a different operating system from the host's, and cannot abuse the resources of the host system.

This flexibility allows containers to be launched quickly and easily. Gradual, layered changes lead to a lightweight container, and the simple file system means it is not difficult or expensive to roll back.

Docker Images
Docker containers begin with an image, which is the platform upon which applications and additional layers are built. Images are much like disk images for a desktop machine, and they provide a solid base for everything that runs inside the container. Each image layer is read-only, so it does not depend on outside modifications and is highly resistant to tampering.

As developers create applications and tools and add them to the base image, each committed change creates a new image layer. A union file system stacks these layers and presents them as a single file system.
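
You can inspect this layering directly with docker history; a minimal sketch, assuming the ubuntu base image is already on the host (the image name ubuntu-plus-data is a placeholder):

    $ sudo docker history ubuntu                                 # each line is one read-only layer of the image
    $ sudo docker run --name addfile ubuntu touch /data.txt      # change the file system inside a container
    $ sudo docker commit addfile ubuntu-plus-data                # the committed change becomes a new top layer
    $ sudo docker history ubuntu-plus-data                       # the new layer sits above the unchanged base layers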

Dockerfiles
Docker images can be built automatically by reading a Dockerfile, a text document that contains all the commands needed to assemble the image. Instructions are executed in succession, and the build context is either a PATH on the local file system or the URL of a Git repository; subdirectories under the PATH are included in the context, just as a Git URL includes the repository's submodules.
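
Here is a minimal sketch of a Dockerfile and the corresponding build commands; the base image, file names, and repository URL are illustrative assumptions, not part of the original article:

    # Dockerfile - each instruction adds a layer to the image
    # Start from a base image
    FROM ubuntu:14.04
    # Install dependencies in their own layer
    RUN apt-get update && apt-get install -y python
    # Copy a file from the build context (the PATH) into the image
    COPY app.py /opt/app/app.py
    # Default command to run when a container starts from this image
    CMD ["python", "/opt/app/app.py"]

To build an image from it, pass either a local PATH or a Git URL as the context:

    $ sudo docker build -t myapp .                                         # build with the current directory as the context PATH
    $ sudo docker build -t myapp https://github.com/example/repo.git      # or build from a Git repository URL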

Getting Started
Here is a shortened example of how to get started using Docker on Ubuntu Linux - enter these Docker Engine CLI commands in a terminal window. If you prefer a package manager, you can install Docker with apt on Ubuntu or yum on RPM-based distributions.

  1. Log into Ubuntu as a user with sudo privileges.

  2. Make sure curl is installed:
    $ which curl

  3. If it is not, update the package index and then install it:
    $ sudo apt-get update
    $ sudo apt-get install curl

  4. Grab the latest Docker version:
    $ curl -fsSL

  5. You'll be prompted for your sudo password. When the script finishes, Docker and its dependencies will have been downloaded and installed.

  6. Check that Docker is installed correctly:
    $ docker run hello-world

You should see "Hello from Docker" on the screen, which indicates Docker seems to be working correctly. Consult the Docker installation guide to get more details and find installation instructions for Mac and Windows.
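
For reference, a typical run of Docker's convenience install script on Ubuntu looks roughly like the following; treat this as a sketch and check the official installation guide for the current command:

    $ sudo apt-get update
    $ sudo apt-get install -y curl
    $ curl -fsSL https://get.docker.com/ | sh    # fetch and run the install script (you will be asked for your sudo password)
    $ sudo docker run hello-world                # verify the installation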

Ubuntu Images
Docker is reasonably easy to work with once it is installed, since the Docker daemon should already be running. Get a list of all Docker commands by running sudo docker with no arguments.

You can search for a Docker image - an Ubuntu image, for example - with sudo docker search ubuntu. Keep in mind an image must be present on the host machine where the containers will reside; you can pull an image with sudo docker pull ubuntu and view all the images on the host with sudo docker images

Commit a container to capture its current state as an image - that way it is at the same point when you are ready to use it again: sudo docker commit [container ID] [image name]

To create a container, start with an image and indicate the command to run. You'll find complete instructions and commands in the official Linux installation guide.
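
Putting the pull, run, and commit steps together, a minimal session might look like this; the container ID and image name are placeholders:

    $ sudo docker pull ubuntu                                        # fetch the Ubuntu base image onto the host
    $ sudo docker run ubuntu /bin/echo "hello from a container"      # start a container from the image with a command
    $ sudo docker ps -a                                              # list all containers to find the ID of the one that just exited
    $ sudo docker commit 1a2b3c4d5e6f my-ubuntu                      # save that container's state as a reusable image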

Technical Differences
In this second part of my two-part series on Docker, I compared the technical differences between Docker and virtual machines, broke down the Docker components and reviewed the steps to get started on Linux. The process is straightforward - it just takes some practice implementing these steps to start launching containers with ease.

Begin with a small, controlled environment to ensure the Docker ecosystem will work properly for you; you'll probably find, as I did, that the application delivery process is easy and seamless. In the end, the containers themselves are not the real advantage: the real game-changer is the opportunity to deliver applications in a much more efficient and controlled way. I believe you will enjoy how Docker lets you migrate from dated monolithic architectures to fast, lightweight microservices faster than you thought possible.

Docker is changing app development at a rapid pace. It allows you to create and test apps quickly in any environment, provides access to big data analytics for the enterprise, helps knock down walls separating Dev and Ops, makes the app development process better and brings down the cost of infrastructure while improving efficiency.
