
Network Automation: Leveraging Virtual Application Delivery Controllers

ADCaaS is beginning to gain momentum as enterprises move to virtual, cloud and software-defined data centers

With the growing number of applications in today's enterprises, network administrators face new challenges. IT professionals responsible for supporting applications in data center and cloud environments continue to see growth in the number of applications being deployed and upgraded, often alongside data center consolidation, while IT budgets remain flat or shrink year over year. In other words, the old adage still holds: do more with less.

The good news is that changes in application delivery, one of the most critical services in the data center, can help enterprises overcome roadblocks to application and business performance and address the application and budget challenges that so many now face. By adopting a virtual application delivery controller environment, or ADC-as-a-Service (ADCaaS), location-independent computing can be cost-effective and help deliver flawless application performance.

Doing More with Less
IT budgets are essentially flat. The Enterprise Strategy Group (ESG)[1] recently reported that a majority of surveyed enterprises expected flat or reduced IT budgets year over year. As such, obtaining spending approval for modern data center technologies to support an increase in applications can be a challenge, but ESG also found that demonstrating a project's strong ROI is the most effective way to get that approval.

Despite static budgets, enterprise organizations are looking increasingly like service providers as they transition to modern environments. With requirements for greater flexibility, agility and efficiency, IT delivers services to different business units in multi-tenant environments, leveraging cloud services when necessary. The ESG report indicated that the number of applications that IT organizations support will increase year over year in more than 84 percent of enterprise accounts, and that the average enterprise also expects to upgrade between 11 and 25 existing applications over the next year.

Given that a strong ROI is the most effective way to win spending approval for new projects, application-focused technologies that reduce CAPEX and OPEX, deliver a fast ROI and maintain the same or better SLAs for application services will put IT administrators in a better position to meet application demands within a flat budget.

The Need for ADC
The Application Delivery Controller (ADC) is one of the key technologies in application delivery infrastructure. ADCs help scale, secure and optimize applications and improve their availability.

Virtual ADCs allow customers to dynamically deploy an ADC per application through ADCaaS. With ADCaaS, users can quickly spin ADC instances up or down for scalable, secure and elastic delivery of enterprise, cloud and e-commerce applications. ADCaaS can also control and optimize end-user services by inspecting, transforming, prioritizing and distributing application traffic across environments, from physical and virtual data centers to public and hybrid clouds.
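To make the traffic-distribution role concrete, here is a minimal sketch of one ADC responsibility described above: spreading application traffic across a pool of backends while routing around unhealthy servers. The `VirtualADC` class and its backend addresses are illustrative, not any vendor's actual API; a production ADC would add health probes, persistence, TLS offload and traffic inspection on top of this core loop.

```python
from itertools import cycle

class VirtualADC:
    """Illustrative sketch: round-robin distribution of application
    traffic across a backend pool, skipping unhealthy servers."""

    def __init__(self, backends):
        self.backends = list(backends)      # e.g. ["10.0.0.1", "10.0.0.2"]
        self.healthy = set(self.backends)   # health checks would update this
        self._rr = cycle(self.backends)

    def mark_down(self, backend):
        # A real ADC would drive this from active/passive health checks.
        self.healthy.discard(backend)

    def route(self):
        """Return the next healthy backend, round-robin."""
        for _ in range(len(self.backends)):
            candidate = next(self._rr)
            if candidate in self.healthy:
                return candidate
        raise RuntimeError("no healthy backends available")

adc = VirtualADC(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
adc.mark_down("10.0.0.2")
picks = [adc.route() for _ in range(4)]
print(picks)  # alternates between the two healthy backends
```

Because the whole controller is software, "spinning up" another per-application instance is just constructing another object (or container) rather than racking an appliance.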

ADCaaS is beginning to gain momentum as enterprises move to virtual, cloud and software-defined data centers. ESG spoke with enterprise and service provider IT organizations and found that new technologies are required in the ADC space. A software-based, cloud-ready, highly automated ADCaaS approach can enable organizations to demonstrate the necessary ROI while significantly improving business processes.

Moving Toward ADCaaS
For enterprises ready to work toward a modern data center environment in the face of greater demands for applications, agility and flexibility, the following considerations may be helpful.

Provision Faster: The time to deploy an application is often dominated by the time required to provision its application delivery infrastructure. ADCaaS can typically cut that provisioning time by an order of magnitude, spinning up new services in roughly 30 to 60 seconds, compared with the two to six hours commonly reported for legacy ADCs.
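The provisioning gain comes from replacing a hardware workflow with a single declarative request to a control plane. The sketch below simulates that idea with an in-memory client; the `ADCaaSClient` name, its methods and fields are hypothetical placeholders, not a real vendor API, but the shape (one call per application, instant teardown) is what "per-application ADC" implies.

```python
import time
import uuid

class ADCaaSClient:
    """Hypothetical ADCaaS control plane: provisioning a per-application
    ADC instance is one declarative request, not a hardware project."""

    def __init__(self):
        self._instances = {}

    def provision(self, app_name, backends, vip_port=443):
        instance = {
            "id": str(uuid.uuid4()),
            "app": app_name,
            "backends": list(backends),
            "vip_port": vip_port,
            "state": "running",
            "provisioned_at": time.time(),
        }
        self._instances[instance["id"]] = instance
        return instance

    def decommission(self, instance_id):
        # Tearing down is equally cheap -- no appliance to reclaim.
        self._instances[instance_id]["state"] = "terminated"

client = ADCaaSClient()
adc = client.provision("checkout", ["10.0.1.5", "10.0.1.6"])
print(adc["app"], adc["state"])
```

The same request shape works in test/dev and production, which is why the transition between the two environments (next point) gets faster as well.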

Rapidly Move from Test and Development to Production: Many legacy ADC users find that the test and development environment does not match the production environment, which means a longer transition time from test and development to production. ADCaaS users typically can easily support test and development with all the capabilities of the production environment and deliver those services more quickly, which can translate to faster time to market.

Accelerate the Time to Value: Purchasing and deploying legacy ADCs can delay ROI by several months; ADCaaS customers, by contrast, can download the software, enter a license key and be operational in a matter of hours. ADCaaS can also be easier to use than legacy ADCs, which matters for consolidated data centers that are large but must remain agile: easy-to-understand interfaces mean that, in some cases, even less-skilled staff can configure and provision services.

Rapidly Scale the Environment: By leveraging a pay-as-you-go licensing model and the ability to deploy solutions in the cloud (IaaS), ADCaaS-enabled organizations can rapidly scale to meet demand while paying only for what they use. ADCaaS environments eliminate the time-consuming steps required in the physical domain, along with the need to purchase and wait for additional appliances when more capacity is needed. Scaling an environment back down also happens faster with a virtual appliance, which is especially useful in test and development environments with constantly changing requirements.
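The pay-as-you-go economics above can be sketched with a toy cost model: an elastic fleet resized hourly to match demand versus fixed capacity provisioned up front for the peak hour. The capacity figure and hourly rate are purely illustrative assumptions, not real pricing.

```python
# Illustrative numbers only -- not real ADC capacity or pricing.
INSTANCE_CAPACITY = 1000   # requests/sec one virtual ADC instance handles
HOURLY_RATE = 0.50         # cost per instance-hour

def instances_needed(load_rps):
    """Smallest fleet that covers the load (ceiling division, min 1)."""
    return max(1, -(-load_rps // INSTANCE_CAPACITY))

def elastic_cost(hourly_loads):
    """Pay-as-you-go: the fleet is resized every hour to match demand."""
    return sum(instances_needed(load) * HOURLY_RATE for load in hourly_loads)

def fixed_cost(hourly_loads):
    """Overprovisioned: peak-hour capacity runs around the clock."""
    peak = instances_needed(max(hourly_loads))
    return peak * HOURLY_RATE * len(hourly_loads)

# A bursty day: twenty quiet hours, then a four-hour spike.
loads = [200] * 20 + [4500] * 4
print(elastic_cost(loads), fixed_cost(loads))
```

With this (assumed) traffic shape, the elastic fleet costs a third of the fixed one; the burstier the demand, the larger the gap, which is the argument behind the next point on cloud bursting as well.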

Burst to the Cloud: As a virtual appliance or software-based solution, ADCaaS instances are easily deployed in IaaS environments, and many cloud providers offer solutions by the hour or month or have bring-your-own-license (BYOL) options in place. Bursting to the cloud can also significantly lower CAPEX and OPEX, because a pay-as-you-go model eliminates the need to overprovision physical devices and pay maintenance fees for devices that are not always in use.

Traditional ADCs cannot inherently scale efficiently or burst to the cloud, so new, agile, software-based technologies are needed to support the demands of modern data centers. Enterprises that achieve this vision can realize location-independent computing, turn distance and location into a competitive advantage, and give IT the flexibility to host applications and data in optimal locations while delivering flawless application performance and the best user experience.


1. Bob Laliberte, Sr. Analyst, "ROI Benefits from Automating Application Delivery Solutions," Enterprise Strategy Group (ESG), November 2013.

More Stories By Neil Abogado

Neil Abogado is currently a director of product marketing for Riverbed SteelApp application delivery. Before joining Riverbed, he worked at Cisco Systems as an outbound product manager for application delivery, Metro Ethernet, storage networking, and L2 VPN technologies. Throughout his career, he has focused on marketing application delivery, server virtualization, network routing, and switching.

Neil received a BS in Mathematics and Computer Science from the University of Illinois at Chicago and his MS in Telecommunications from DePaul University. He is a Routing & Switching CCIE and a VMware Certified Professional on vSphere 5.0.

Outside of the office, Neil enjoys digital photography, following Chicago sports teams, and spending time with his family.
