The Outside-In Battle for the Soul of the Cloud

The clouds that can best adapt to the demands of the workloads they are supporting will be best positioned for success

Whether they admit it or not, the emergence of public cloud providers has dramatically altered the playing field for hardware vendors of every type. Amazon Web Services (AWS) and its competitors opened Pandora's box by introducing the world to a completely programmatic, scalable, evolving, pay-as-you-go way to procure and utilize network, compute and storage resources on a global scale. They have disrupted many layers of the technology industry, from the applications being written to the way companies interact with the infrastructure that supports those applications.
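To make "programmatic and pay-as-you-go" concrete, here is a minimal sketch using Python and the AWS boto3 SDK. The AMI ID, region and instance type are placeholder values, not a recommendation:

```python
# Minimal sketch: procuring compute programmatically with the AWS SDK
# (boto3). AMI ID, region and instance type are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# One API call procures capacity; billing starts when the instance runs
# and stops when it is terminated -- the pay-as-you-go model.
response = ec2.run_instances(
    ImageId="ami-00000000",   # placeholder AMI
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")

# The same API tears the capacity back down again.
ec2.terminate_instances(InstanceIds=[instance_id])
```

No purchase order, no rack-and-stack, no change window: that is the procurement model the rest of the industry is now measured against.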

Nowhere is this disruption easier to see than in the virtualization ecosystem. For the better part of the last decade, hypervisor companies like VMware, Citrix, Microsoft and Red Hat worked hand-in-hand with hardware manufacturers like Cisco, NetApp, EMC, HP and Dell to define both the infrastructure foundation and the virtualized abstraction layer that underpinned the entire client/server era. These companies provided a direct link between the enterprise applications, the hypervisor and the hardware. They owned the traditional data center construct.

It's that construct, since rebranded as "private cloud," that is directly under attack by public cloud providers. I predict that this will be the battlefield for the heart and soul of enterprise IT for the next decade.

The response to the public cloud threat has been varied, and often reflects the ability of traditional companies to pivot and meet the challenge. Interestingly, erstwhile competitors Microsoft and VMware reacted similarly. This is because they were both uniquely positioned to create a software-defined solution to the problem.

For both companies, the response started with existing enterprise workloads. One of the biggest obstacles to adopting the AWS public cloud is that getting workloads, and especially data, into and out of an enterprise environment can be both technically challenging and expensive. Most workloads running on an enterprise virtualization platform today can't be easily ported into AWS, which increases the cost and risk of any migration. As companies with extensive, hard-won experience running mission-critical enterprise workloads, Microsoft and VMware came to much the same conclusion: build a public cloud on their existing platforms and allow customers and developers to leverage all the investment they've made in their own data centers as they selectively move workloads outside of them. Thus, Microsoft Azure and VMware vCHS were born. Both are clouds that customers can move workloads to without the need to rewrite or re-architect them; both can be licensed under existing agreements and managed by existing staff and tools.

Unfortunately, the traditional data center infrastructure is now the weak link in this new software-defined world. In each of the public clouds referenced, the focus has been on the abstraction layer and how it interacts with the end users. What's missing is how the abstraction layer and the applications and tools that sit on top of it interact with the infrastructure directly.

There have been attempts at hardware-based offloading, especially with regard to storage. VAAI is a good example of VMware trying to create a way to let enterprise storage arrays handle the tasks they are good at without requiring the direct involvement of the hypervisor. But even there it's a rudimentary exchange at best: the hypervisor asks "can you do this task instead of me?" and the array responds. If the answer is yes, the hypervisor waits for the task to complete; if the answer is no, the hypervisor does the task itself. The relationship isn't dynamic, and the array remains ignorant of the reason for, and the context behind, the task in the first place.
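That exchange can be sketched in a few lines of Python. To be clear, the Array and Hypervisor classes below are hypothetical stand-ins for illustration, not the actual VAAI primitives or their real signatures:

```python
# Illustrative sketch of the offload-or-fallback handshake described
# above. These classes are hypothetical, not the real VAAI interface.
from dataclasses import dataclass

@dataclass
class CloneTask:
    source_lun: str
    dest_lun: str
    size_gb: int

class Array:
    SUPPORTED = {"clone", "zero", "atomic_test_set"}

    def can_offload(self, op: str) -> bool:
        # The array only reports *whether* it supports the primitive;
        # it learns nothing about why the hypervisor wants it done.
        return op in self.SUPPORTED

    def execute(self, op: str, task: CloneTask) -> None:
        print(f"array: performing {op}: {task.source_lun} -> {task.dest_lun}")

class Hypervisor:
    def clone_vm_disk(self, array: Array, task: CloneTask) -> None:
        if array.can_offload("clone"):
            # Offload accepted: the hypervisor just waits for completion.
            array.execute("clone", task)
        else:
            # Offload declined: fall back to host-side data movement.
            self._copy_blocks_myself(task)

    def _copy_blocks_myself(self, task: CloneTask) -> None:
        print(f"hypervisor: copying {task.size_gb} GB block-by-block")

Hypervisor().clone_vm_disk(Array(), CloneTask("lun-a", "lun-b", 40))
```

Note how the array only ever answers "can you?" It never learns why the clone is happening, which is exactly the missing context described above.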

In summary, we have an outside force, AWS and the public cloud, acting as the primary catalyst driving change into the enterprise, yet very little of that change is happening below the cloud management or hypervisor layer. Why is that? Why is it important that the infrastructure layer become more of an asset to the rest of the stack? What would that look like? Let's dig in.

The question of why is actually pretty simple: it's really, really hard to take a legacy hardware architecture and retrofit it into something agile and programmatic. In some cases, adopting the new concept requires nothing more than a hardware refresh (like Cisco UCS and its take on XML-defined BIOS policies), but in many cases, especially around storage, it requires a complete reimagining of the platform. It's no coincidence that most of the innovation in this agile infrastructure space is being done by startups that have no legacy customers, technical debt or margins to deal with.

Why is it important? While the best hardware is boring hardware, it's still a critical part of providing a flexible, reliable and high-performance foundation for the applications that matter to enterprises. There are times when the best way to handle the demands of an application or, more important, of multiple applications at once is in hardware. This is true at the network layer, where the manipulation of packets benefits from proximity to processing resources; at the compute layer, where apps can benefit from specialized GPU resources that handle unique requirements; and most especially at the storage layer.

Storage services can have the most dramatic impact on workload performance, yet they are often implemented in such a way that they have no direct relationship with those workloads. Services like compression, deduplication and quality-of-service are usually "on or off" features when it comes to storage arrays. Best case, a storage administrator will create a volume or LUN and choose the features to enable, and then a virtualization admin will map that volume to a data store. Perhaps the virtualization team will create manual storage profiles that define the features offered by that data store, but placing and migrating VMs remains a manual process, and there is no way to map application policy consistently across the hypervisor and hardware layers. (Of course, it's not impossible to create programmatic, hypervisor-aware infrastructure, but it is pretty hard.)
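For contrast, here is a hedged sketch of what per-workload, API-driven storage policy could look like. The endpoint, payload and QoS fields are invented for illustration; no real vendor API is implied:

```python
# Hypothetical sketch of per-volume, API-driven storage policy -- the
# alternative to array-wide "on or off" features. The endpoint, payload
# and QoS fields are invented for illustration, not a real vendor API.
import requests

ARRAY_API = "https://array.example.com/api/v1"

def create_volume_with_policy(name: str, size_gb: int,
                              min_iops: int, max_iops: int) -> dict:
    """Create a volume whose QoS guarantees are set per workload,
    at creation time, rather than per array."""
    payload = {
        "name": name,
        "sizeGB": size_gb,
        "qos": {"minIOPS": min_iops, "maxIOPS": max_iops},
        "efficiency": {"compression": True, "deduplication": True},
    }
    resp = requests.post(f"{ARRAY_API}/volumes", json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()

# Each application gets the policy it needs, not whatever the LUN had.
create_volume_with_policy("sql-prod-logs", 500, min_iops=2000, max_iops=8000)
create_volume_with_policy("dev-scratch",   100, min_iops=100,  max_iops=500)
```

The point is not the specific fields but the granularity: policy travels with the workload's volume instead of being baked into the array.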

Enterprises have come to expect some fundamental features from the public cloud space: simple architecture, linear scaling, API availability and granular application of services. These features allow an infrastructure to respond to the increased requirements of a workload natively, without the overhead of a bolt-on orchestration engine. They provide the ability for the hypervisor to be both a northbound and southbound policy enforcer. They enable the Next-Generation Data Center, one in which the hardware, the hypervisor and the application all play an integrated, coordinated role in providing the performance and availability demanded by the enterprise.
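As a rough illustration of that northbound/southbound role, the sketch below pushes a single application policy down to a hypothesized hypervisor API and a hypothesized storage API; every class and method name here is an assumption made for the example:

```python
# Hedged sketch of "northbound and southbound" policy enforcement: one
# application policy, applied at each infrastructure layer. All class
# and method names are illustrative assumptions, not a real product API.
from dataclasses import dataclass

@dataclass
class AppPolicy:
    name: str
    vcpu_shares: int   # compute entitlement
    min_iops: int      # storage performance floor
    max_iops: int      # storage performance ceiling

class HypervisorAPI:
    def set_shares(self, vm: str, shares: int) -> None:
        print(f"hypervisor: {vm} gets {shares} CPU shares")

class StorageAPI:
    def set_qos(self, volume: str, min_iops: int, max_iops: int) -> None:
        print(f"array: {volume} guaranteed {min_iops}-{max_iops} IOPS")

def enforce(policy: AppPolicy, vm: str, volume: str,
            hv: HypervisorAPI, array: StorageAPI) -> None:
    # Northbound: the application states its intent once, as one policy.
    # Southbound: the same policy is enforced at every layer beneath it.
    hv.set_shares(vm, policy.vcpu_shares)
    array.set_qos(volume, policy.min_iops, policy.max_iops)

enforce(AppPolicy("billing-db", 4000, 3000, 10000),
        vm="billing-db-vm01", volume="billing-db-vol",
        hv=HypervisorAPI(), array=StorageAPI())
```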

No matter where your workloads run, the rise of public cloud has ushered in an era of computing defined by a seamless, programmatic experience. The old, monolithic infrastructure of yesterday's client/server wave is giving way to a more agile, more responsive, more service-rich and more scalable cloud-based model. The battle for the enterprise soul is beginning and, inside or outside the firewall, the clouds that can best adapt to the demands of the workloads they are supporting will be best positioned for success.

More Stories By Jeramiah Dooley

Jeramiah Dooley joined the SolidFire team as a Cloud Architect on the Technology Solutions team. Prior to SolidFire, he was most recently at VCE and, before that, at Peak 10. You can check out his Virtualization for Service Providers blog or follow him on Twitter at @jdooley_clt.
