
An Introduction to Docker - Part 1
By Anand Akela

What is Docker?
In simple terms, the Docker platform is all about making it easier to create, deploy and run applications by using containers. Containers let developers package up an application with all of its dependencies, such as libraries and other required elements, and ship it all out as one unit. By keeping the app and its dependencies together in the container, developers can be sure the app will run on any Linux machine, no matter what customized settings that machine might have or how it might differ from the machine used for writing and testing the code. This makes it easier to work on the app throughout its life cycle.
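
To make that packaging idea concrete, here is a minimal, hypothetical sketch (the app.py script, the python:3-slim base image and the my-app name are stand-ins, not examples from the original post). A short Dockerfile describes the package:

    # Dockerfile: the base image supplies the OS userland and the Python runtime
    FROM python:3-slim
    # copy the application code into the image
    COPY app.py /app/app.py
    # the process to run when a container starts
    CMD ["python", "/app/app.py"]

Building and running it is then two commands:

    docker build -t my-app .   # build an image from the Dockerfile in this directory
    docker run --rm my-app     # run it; the container carries everything the app needs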

Docker is kind of like a virtual machine, but instead of creating a whole virtual operating system (OS), it lets applications share the same Linux kernel as the system they're running on. That way, an app only has to be shipped with the things that aren't already on the host computer instead of a whole guest OS. As a result, containerized apps are much smaller and perform significantly better than apps shipped inside full virtual machines, and Docker brings a number of additional benefits as well.
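
If you have Docker installed, the kernel sharing is easy to see for yourself: both commands below print the same kernel release, because the container has no kernel of its own (alpine here is just a small public image from Docker Hub).

    uname -r                         # kernel release of the host
    docker run --rm alpine uname -r  # the same kernel, reported from inside a container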

Docker is an open platform for distributed applications, aimed at both developers and system admins. It provides an integrated suite of capabilities for an infrastructure-agnostic Containers-as-a-Service (CaaS) model. With Docker, IT operations teams can secure, provision and manage both infrastructure resources and base application content, while developers build and deploy their applications in a self-service manner.

Key Benefits

  • Open Source: A key aspect of Docker is that it is completely open source. Anyone can contribute to the platform and adapt or extend it to meet their own needs if they require extra features that don't come with Docker right out of the box. All of this makes it an extremely convenient option for developers and system administrators.

  • Low Overhead: Because developers don't have to provide a truly virtualized environment all the way down to the hardware level, they keep overhead down by shipping only the libraries and OS components the application actually needs to run.

  • Agile: Docker was built with speed and simplicity in mind, and that's part of the reason it has become so popular. Developers can now very simply package any software and its dependencies into a container. They can use any language, version and tooling, because all of those elements are packaged together into a container that, in effect, standardizes them without sacrificing anything.

  • Portable: Docker also makes application containers completely portable in a totally new way. Developers can ship apps from development to testing and production without breaking the code, because differences in the environment have no effect on what is packaged inside the container. There's also no need to change the app for it to work in production, which is great for IT operations teams: they can move apps across data centers and avoid vendor lock-in.

  • Control: Docker provides ultimate control over the apps as they move along the life cycle because the environment is standardized. This makes it a lot easier to answer questions about security, manageability and scale during this process. IT teams can customize the level of control and flexibility needed to keep service levels, performance and regulatory compliance in line for particular projects.

How Was It Created and How Did It Come About?
Apps used to be developed in a very different fashion. Off-the-shelf software ran in countless private data centers, controlled by gigantic code bases that were updated about once a year. With the rise of the cloud, all of that changed. And now that companies worldwide depend on software to connect with their customers, that software is becoming more and more customized.

As software grew more complex, with an expanding matrix of services, dependencies and infrastructure, getting an app to its finished, deployable state became harder and harder. That's where Docker comes in.

In 2013, Docker was developed as a way to build, ship and run applications anywhere using containers. A software container is a standard unit of software that behaves the same regardless of what code and dependencies are packaged inside it. This helped developers and system administrators move software across infrastructures and environments without any modifications.

Docker was launched at a PyCon lightning talk, "The Future of Linux Containers," on March 13, 2013. The Docker mascot, Moby Dock, was created a few months later. In September, Docker and Red Hat announced a major alliance, introducing Fedora/RHEL compatibility. The company raised $15 million in Series B funding in January 2014. In July 2014 Docker acquired Orchard (maker of Fig), and in August 2014 Docker Engine 1.2 was launched. In September 2014 the company closed a $40 million Series C round, and by December 31, 2014, Docker had reached 100 million container downloads. In April 2015, it secured another $95 million in Series D funding and reached 300 million container downloads.

How Does It Work?
Docker delivers Containers as a Service (CaaS). To understand how it works, it's important to first look at what a Linux container is.

Linux Containers
In a normal virtualized environment, virtual machines run on top of a physical machine with the aid of a hypervisor (e.g., Xen, Hyper-V). Containers instead run in user space on top of the operating system's kernel. Each container has its own isolated user space, and many containers can run on a single host. Containers are isolated within a host using two Linux kernel features: namespaces and control groups (cgroups).

There are six kinds of namespaces in Linux (mnt, pid, net, ipc, uts and user), and they give a container its own network interfaces, IP address, mount points, process tree and so on. The resources a container uses are managed by control groups, which let you limit the amount of CPU and memory a container can consume.
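
As an illustrative sketch, the docker run flags below exercise exactly those two features (the flag values are arbitrary): the first command shows the container's own network interfaces from inside its network namespace, and the second uses control groups to cap CPU and memory.

    # the container sees only the interfaces in its own network namespace
    docker run --rm alpine ip addr

    # cgroups enforce the caps: at most half a CPU core and 256 MB of RAM
    docker run --rm --cpus="0.5" --memory="256m" alpine sleep 5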

Docker is a container engine that uses these Linux kernel features to create containers on top of an OS and automates app deployment into those containers. It provides a lightweight environment for running app code, creating a more efficient workflow for moving your app through its life cycle. Docker runs on a client-server architecture: the Docker daemon is responsible for all actions related to containers, and it receives commands from the Docker client through the CLI or a REST API.
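
To make the client-server split concrete: the docker command-line client is just one consumer of the daemon's REST API. Assuming the daemon is listening on its default Unix socket (/var/run/docker.sock), you can query the same API directly, for example with curl:

    # ask the daemon for its version over the REST API
    curl --unix-socket /var/run/docker.sock http://localhost/version

    # list running containers; this is the same data "docker ps" shows
    curl --unix-socket /var/run/docker.sock http://localhost/containers/json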

Containers are built from images, which can be configured with apps and used as templates for creating containers. Images are organized in layers, and every change to an image is added as a new layer on top of it. A Docker registry is where Docker images are stored; developers use a public or private registry to build and share images with their teams. Docker's hosted registry service is called Docker Hub, and it lets you upload and download images from a central location.
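
A short sketch of that image workflow (the myteam/myapp repository name is hypothetical, and pushing assumes you have logged in to a registry account):

    docker pull alpine                   # download an image from Docker Hub
    docker history alpine                # inspect the stack of layers it is built from
    docker tag alpine myteam/myapp:base  # re-tag it under your own repository name
    docker push myteam/myapp:base        # share it with your team through the registry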

Once you have your images, you can create a container, which adds a writable layer on top of the image. The image tells Docker what the container holds, what process to run when the container is launched and other configuration data. Once the container is running, you can manage it, interact with the app, and then stop and remove the container when you're done, all without having to alter the code.
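
That life cycle maps onto a handful of CLI commands. A minimal run-through, using the public nginx image and an arbitrary container name:

    docker run -d --name web nginx   # create and start a container from an image
    docker logs web                  # inspect the running app
    docker exec -it web sh           # open a shell inside the container to interact with it
    docker stop web                  # stop the container when you're done
    docker rm web                    # remove it; the underlying image is untouched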

Why Should a Developer Care?
Docker is perfect for helping developers with the development cycle. It lets you develop in local containers that hold your apps and services, which you can then plug into a continuous integration and deployment workflow. In short, it can make a developer's life much easier. It's especially helpful for the following reasons:

Easier Scaling
Docker makes it easy to keep workloads highly portable. The containers can run on a developer's local host, as well as on physical or virtual machines or in the cloud. It makes managing workloads much simpler, as you can use it to scale up or tear down apps and services easily and nearly in real time.
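
As a simple illustration of that scale-up-and-tear-down flow (a sketch with arbitrary names and ports, not the author's example):

    # scale up: three identical containers from one image, on adjacent host ports
    for i in 1 2 3; do
      docker run -d --name web$i -p 808$i:80 nginx
    done

    # tear down just as quickly
    docker rm -f web1 web2 web3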

Higher Density and More Workloads
Docker is a lightweight and cost-effective alternative to hypervisor-based virtual machines, which is great for high density environments. It's also useful for small and medium deployments, where you want to get more out of the resources you already have.

Key Vendors and Supporters Behind Docker
The Docker project relies on community support channels like forums, IRC and Stack Overflow. Docker has received contributions from many big organizations, including:

  • Project Atomic
  • Google
  • GitHub
  • FedoraCloud
  • AlphaGov
  • Tsuru
  • Globo.com

Docker is supported by many cloud vendors, including:

  • Microsoft
  • IBM
  • Rackspace
  • Google
  • Canonical
  • Red Hat
  • VMware
  • Cisco
  • Amazon

Stay tuned for our next installment, where we will dig even deeper into Docker and its capabilities. In the meantime, read this blog post to learn how AppDynamics provides complete visibility into Docker containers.
