Addressing the Concerns CIOs Have with the SDDC


A Q&A session with CIOs regarding the SDDC


Question 1. Vendors are racing to lead the movement towards a software-defined data centre. Where are we up to in this journey, and how far are we from seeing this trend widely adopted?

Considering most organizations have still not fully virtualized or moved towards a true private cloud model, the SDDC is still in its infancy in terms of mainstream adoption, and adopting it certainly won't be an overnight process. While typical early adopters are advancing quickly down the software-defined route, these are mostly organizations with large-scale, multi-site data centers that are already mature in their IT processes. Such large-scale organizations are not the norm, and while the SDDC is certainly on the minds of senior IT executives, establishing such a model requires overcoming several key challenges.

Typical environments are still characterized by numerous silos, complex and static configurations, and partially virtualized initiatives. Isolated component and operational silos need to be replaced with expertise that covers the whole infrastructure, so that organizations can focus on defining their business policies. In this instance the converged infrastructure model is ideal, as it enables the infrastructure to be managed, maintained and optimized as a single entity by a single team. Such environments also need to dramatically rearrange their IT processes to accommodate features such as orchestration, automation, metering and billing, as these all have a knock-on effect on service delivery, activation and assurance, as well as on change management and release management procedures. The SDDC necessitates a cultural change to IT as much as a technical one, and the cultural change historically takes longer. It could still be several years before we really see the SDDC adopted widely, but it's definitely being discussed and planned for the future.

Question 2. Looking at all the components of a data center, which one poses the most challenges to being virtualized and software-defined?

The majority of data center components have seen considerable technological advancement in the past few years. Yet in comparison to networking, compute and the hypervisor, storage arrays still haven't seen many drastic changes beyond features such as auto-tiering, thin provisioning, deduplication and the introduction of EFDs. Moreover, the software-defined focus is on applications and on dynamically meeting the changing requirements of an application or service offering. Beyond quality-of-service monitoring based on IOPS and back-end/front-end processor utilization, storage arrays still have considerable limitations in terms of application awareness.

Additionally, with automation being integral to a software-defined strategy that can dynamically shift resources based on application requirements, automation technologies within storage arrays remain very limited. While storage features such as dynamic tiering may be automated, they are still not based on real-time metrics and consequently are not responsive to real-time requirements.
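To make the distinction concrete, a genuinely real-time tiering policy would react to live metrics rather than to last night's averages. The following is a minimal, hypothetical sketch of such a control loop; the function names, thresholds and data shapes are all illustrative, not any vendor's API.

```python
def rebalance(volumes, read_iops, hot=5000, cold=500):
    """Return (volume, target_tier) moves based on live IOPS readings.

    volumes:   list of {"name": str, "tier": "flash" | "hdd"} dicts
    read_iops: callable returning the *current* IOPS for a volume name
    """
    moves = []
    for vol in volumes:
        iops = read_iops(vol["name"])  # live metric, not a historical schedule
        if iops > hot and vol["tier"] != "flash":
            moves.append((vol["name"], "flash"))   # promote a hot volume
        elif iops < cold and vol["tier"] != "hdd":
            moves.append((vol["name"], "hdd"))     # demote a cold volume
    return moves

# Example: a busy database volume stuck on HDD, an idle log volume on flash
volumes = [{"name": "db01", "tier": "hdd"}, {"name": "logs", "tier": "flash"}]
metrics = {"db01": 12000, "logs": 40}
print(rebalance(volumes, metrics.get))
# → [('db01', 'flash'), ('logs', 'hdd')]
```

The point of the interview's criticism is that today's arrays typically run the equivalent of this loop against scheduled, aggregated statistics, so the `read_iops` input lags the workload it is supposed to serve.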

Compounding this, storage itself has moved beyond the array and now comes in numerous forms, such as HDD, flash, PCM and NVRAM, each with its own characteristics, benefits and challenges. As yet, the challenge remains to have a software layer that can abstract all of these formats as a single resource pool. The objective should be that regardless of where these formats reside, whether within the server, the array cache or the back end of the array, data can still be shifted dynamically across platforms to meet application needs as well as provide resiliency and high availability.
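The abstraction being described can be sketched very simply: the pool tracks heterogeneous devices wherever they physically live, and the consumer sees only aggregate capacity. This is an illustrative toy, assuming invented device names and media types, not a description of any shipping product.

```python
class StoragePool:
    """Toy abstraction of heterogeneous media as one logical resource pool."""

    def __init__(self):
        self.devices = []

    def add(self, name, media, free_gb):
        # Media may live in the server, the array cache, or the array back end;
        # the pool deliberately does not care where.
        self.devices.append({"name": name, "media": media, "free_gb": free_gb})

    def capacity(self):
        """Total free capacity, regardless of where the media resides."""
        return sum(d["free_gb"] for d in self.devices)

pool = StoragePool()
pool.add("server-nvme", "flash", 800)   # flash inside the server
pool.add("array-cache", "nvram", 64)    # NVRAM in the array cache
pool.add("array-backend", "hdd", 8000)  # spinning disk behind the array
print(pool.capacity())  # → 8864
```

A real software-defined layer would of course also handle placement, migration and failure domains; the sketch only shows the unifying abstraction the interview says is still missing.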

Question 3. Why has there been confusion about how software-defined should be interpreted, and how has this affected the market?

Similar to when the cloud concept first emerged in the industry, the understanding of the software-defined model quickly became somewhat blurred as the marketing departments of traditional infrastructure vendors jumped on the bandwagon. While they were quick to attach the software-defined terminology to their offerings, there was little if anything different about their products or product strategy. This led to various misconceptions: that software-defined was just another term for cloud, that anything virtualized was software-defined, or, even more ludicrously, that software-defined meant the non-existence or removal of hardware.

To elaborate, all hardware components need software of some kind to function, but this does not make them software-defined. For example, storage arrays use various software technologies such as replication, snapshotting, auto-tiering and dynamic provisioning. Some storage vendors can even virtualize third-party arrays behind their own or via appliances, consequently abstracting the storage completely from the hardware so that an end user merely sees a resource pool. But this in itself does not make the array software-defined, and herein lies the confusion some end users face as they struggle to understand the latest trend being directed at them by their C-level execs.

Question 4. The idea of a software-defined data center (virtualizing and automating the entire infrastructure) wildly disrupts the make-up of a traditional IT team. How can CIOs handle the inevitable resistance some of their IT employees will put up?

First and foremost, you can't have a successful software-defined model if your team still has a hardware-defined mentality. Change is inevitable, and whether it's embraced or not it will happen. For experienced CIOs this is not the first time they've experienced such technological, and consequently cultural, change in IT. There was resistance to change from the mainframe team when open systems took off; there was no such thing as a virtualisation team when VMware was first introduced; and only now are we seeing converged infrastructure teams being established, despite the CI market having been around for more than three years. For traditional IT teams to accept this change, they need to recognize how it will inevitably benefit them.

Market research is unanimous in its conclusion that IT administrators are currently far too busy with maintenance tasks that amount to firefighting and "keeping the lights on" exercises. Figures generally point to around 77% of IT administrators' time being spent on mundane maintenance and routine tasks, with very little time spent on innovation, optimization and delivering value to the business. For these teams the software-defined model offers the opportunity to move away from such tasks and free up their time, enabling them to be proactive as opposed to reactive. With the benefits of orchestration and automation, IT admins can focus on the things they are trained and specialized in, such as performance optimization, understanding application requirements, and aligning their services and work to business value.

Question 5. To what extent does a software-defined model negate the need to deploy the public cloud? What effect will this have on the market?

The software-defined model shouldn't, and most likely won't, negate the public cloud; if anything it will make its use case even clearer. The SDDC is a natural evolution of cloud, and particularly the private cloud. The private cloud is all about the consumption and delivery of IT services, whether layered upon converged infrastructure or self-assembled infrastructure. Those that have already deployed a private cloud and also utilize the public cloud have done so with an understanding and assessment of their data, its security and, most typically, its criticality. The software-defined model introduces a greater level of intelligence via software, where application awareness and requirements linked to business service levels are met automatically and dynamically. Here the demand is dictated by the workload, and the software is the enabler that provisions the adequate resources for that requirement.

Consequently organizations will have a greater level of flexibility and agility than with previous private and even public cloud deployments, thus providing more clarity in the differentiation between the private and public cloud. Instead of needing to request permission from a cloud provider, the software-defined model will give organizations on-demand access to their data as well as let them independently dictate the level of security. While this may not completely negate the requirement for a public cloud, it will certainly diminish the immediate benefits and advantages associated with it.

Question 6. For CIOs looking for pure bottom-line incentives they can take to senior management, what is the true value of a software-defined infrastructure?

The true value of a software-defined model is that it empowers IT to be a true business enabler. Most business executives still see IT as an expensive overhead rather than a business enabler, typically because of IT's inability to respond quickly enough to the ever-changing service requirements, market trends and new project roll-outs that the business demands. Much of this is caused by the deeply entrenched organizational silos within IT, where typical infrastructure deployments can take months. While converged infrastructure solutions have gone some way to solving this challenge, the software-defined model builds on this by providing further speed and agility, to the extent that organizations can encapsulate their business requirements into business delivery processes. In this instance infrastructure management processes become inherently linked to business rules that incorporate compliance, performance metrics and business policies. In turn, via automation and orchestration, these business rules dynamically drive and provision the infrastructure resources of storage, networking and compute in real time, to the necessary workloads, as the business demands it.
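The idea of business rules driving provisioning can be illustrated with a small sketch: policies are expressed as data, and an orchestration layer translates them into infrastructure requests. Everything here is hypothetical, including the policy names, fields and thresholds; it only shows the pattern, not any particular orchestration product.

```python
# Business policies expressed as data: service levels, not device settings.
POLICIES = {
    "trading-app": {"max_latency_ms": 5,   "tier": "flash", "replicas": 3},
    "archive":     {"max_latency_ms": 500, "tier": "hdd",   "replicas": 1},
}

def provision(workload):
    """Translate a business policy into a concrete infrastructure request."""
    policy = POLICIES[workload]
    return {
        "workload": workload,
        "storage_tier": policy["tier"],      # storage follows the SLA
        "replicas": policy["replicas"],      # availability follows the SLA
    }

print(provision("trading-app"))
# → {'workload': 'trading-app', 'storage_tier': 'flash', 'replicas': 3}
```

The design point is that the business owns the `POLICIES` table, while automation owns the translation into storage, network and compute resources, which is exactly the linkage the answer above describes.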

Question 7. To what extent will a software-defined infrastructure change the way end-users should approach security in the data centre?

A software-defined model will change the way data center security is approached in several ways. Traditional physical data center security architecture is renowned for being inflexible and complex, due to its reliance on numerous segmented, dedicated appliances for requirements such as load balancing, gateways, firewalls and wire sniffers. Within a software-defined model, security can potentially be delivered not only as a flexible and agile service but also as a feature built into the architecture. Whether the approach embeds security within the servers, storage or network, a software-defined model can take advantage of dynamically distributing security policies and resources that are logically managed and scaled via a single pane of glass.

From a security perspective the SDDC provides immediate benefits. Imagine how much simpler it will become when automation can be used to restructure infrastructure components that have become vulnerable to security threats. Even the automated isolation of malware-infected network endpoints will drastically simplify typical security procedures, though it will consequently need to be planned for differently.
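The automated isolation scenario can be sketched as a simple loop: a detector flags endpoints, and the controller moves them onto a quarantine network. This is a hypothetical illustration; the function, the VLAN naming, and the detector interface are all invented, and a real SDN controller would push the equivalent change through its own API.

```python
def isolate_infected(endpoints, is_infected, quarantine_vlan="quarantine"):
    """Move endpoints flagged by a detector onto a quarantine network.

    endpoints:   list of {"name": str, "vlan": str} dicts
    is_infected: callable returning True for a compromised endpoint name
    """
    isolated = []
    for ep in endpoints:
        if is_infected(ep["name"]):
            ep["vlan"] = quarantine_vlan   # cut it off from production traffic
            isolated.append(ep["name"])
    return isolated

endpoints = [{"name": "host-a", "vlan": "prod"}, {"name": "host-b", "vlan": "prod"}]
alerts = {"host-b"}                        # e.g. fed by an IDS/IPS alert stream
print(isolate_infected(endpoints, alerts.__contains__))
# → ['host-b']
```

The "planned for differently" caveat in the answer applies directly here: an over-eager detector feeding this loop would automatically quarantine healthy hosts, so the policy around the automation matters as much as the automation itself.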

Part of that planning is acknowledging not just the benefits but also the new types of risk these capabilities inevitably introduce. For example, abstracting the security control plane from the security processing and forwarding planes means that any configuration errors or security issues can have far more complex consequences than in the traditional data centre. Furthermore, centralizing the architecture ultimately means a greater security threat should that central control be compromised. These are some of the security challenges organizations will face, and there are already movements in the software-defined security space to cater for them.

Question 8. Where do you see the software-defined market going over the next couple of years?

The concept of the SDDC is going to gain even more visibility and acceptance within the industry and the technological advances that have already come about with Software-Defined Networking will certainly galvanize this. Vendors that have adopted the software-defined tagline will have to mature their product offerings and roadmaps to fit such a model as growing industry awareness will empower organizations to distinguish between genuine features and marketing hyperbole.

For organizations that have already heavily virtualized and built private clouds, the SDDC is the next natural progression. For those that have adopted the converged infrastructure model the transition will be even easier, as they will already have put in place the IT processes and models needed to run their infrastructure as a fully automated, centrally managed and optimized baseline from which the SDDC can emanate. It is fair to say it won't be a surprise to see many of the organizations that embraced the converged infrastructure model also become the pioneers of a successful SDDC.


The above interview with Archie Hendryx is taken from the May 2014 issue of Information Age: http://www.information-age.com/sites/default/files/May%202014%20OPT.pdf

More Stories By Archie Hendryx

SAN, NAS, Back Up / Recovery & Virtualisation Specialist.
