The Future Is Now: Why Flash Storage Will Transform the Data Center

By using software-defined storage, data center architects can design a flexible, efficient and powerful framework

For an example of just how dramatically storage has changed over the past fifteen years, consider your music collection. At one point, you had a collection of cassettes that stored your songs on magnetic tape. As the years went on, your hairstyle changed and you bought a CD player that used a spinning disc to store more music at higher quality than tape could. Spinning disks flourished well into the MP3 player era, surviving even the initial introduction of flash storage because disks remained cheaper. Eventually, however, your newest smartphone or iPod shipped with flash storage instead, as manufacturers bowed to its better performance and its increasingly competitive price.

The same sea change is playing out at a much bigger scale. Instead of gigabytes, think petabytes.

The data center infrastructures designed by telcos, service providers and major enterprises to store massive quantities of data have to date relied predominantly on disk storage in their servers, sometimes blending in flash for performance-intensive tasks. The speed of flash has long tempted data center architects to deploy it more widely throughout the data center, but only recently has its price dropped enough to make broader use a viable option.

To understand why flash storage has suddenly become a practical choice for data center architects across industries, it is helpful to examine the differences between flash and disk storage.

The Next Big Thing, Again
As the example above shows, disk storage, when it was introduced, represented a leap forward in speed and efficiency over tape, the predominant storage method of the time. Even after flash came to market, disk remained the server storage of choice. Flash delivered substantially higher performance, but it was priced too high to pose a real threat to the prevalence of spinning disks, and flash drives offered smaller capacities, storing less data per unit than spinning disks at the same price.

However, recent improvements in flash have cut its price significantly, positioning it as a true data center hardware alternative, even as its benefits - higher throughput and lower latency - have grown dramatically at the same time. As an added plus, flash is highly energy efficient, needing only a fraction of the power that disk storage requires, sometimes as little as one-sixteenth. Flash drives still wear out faster than disks do, but the performance gains and price drops of recent years have made flash a realistic and highly attractive option for data center architecture and design.
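
To put that power ratio in perspective, here is a back-of-the-envelope calculation in Python. The drive wattage, fleet size and electricity price are purely illustrative assumptions; only the one-to-sixteen ratio comes from the paragraph above, and cooling savings are left out.

```python
# Rough illustration of the 1:16 power ratio cited above (hypothetical figures).
HDD_WATTS = 8.0                 # assumed average draw per spinning disk
SSD_WATTS = HDD_WATTS / 16      # applying the one-to-sixteen ratio
DRIVES = 1000                   # assumed fleet size
PRICE_PER_KWH = 0.12            # assumed electricity price in USD

def annual_cost(watts_per_drive: float) -> float:
    """Annual electricity cost for the whole fleet, excluding cooling."""
    kwh_per_year = watts_per_drive * DRIVES * 24 * 365 / 1000
    return kwh_per_year * PRICE_PER_KWH

print(f"Disk fleet:  ${annual_cost(HDD_WATTS):,.0f} per year")
print(f"Flash fleet: ${annual_cost(SSD_WATTS):,.0f} per year")
```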

Making the Switch
In fact, it's increasingly feasible that today's data center - still reliant on disk storage - could run on 100 percent flash storage tomorrow. Telcos, service providers, major enterprises and other companies whose profits are tied to the speed and availability they can offer their customers are beginning to view flash storage's blistering performance less as a "nice to have" and more as a core technology for maintaining a competitive edge.

While performance-hungry telcos and service providers are diving into flash straight away, vendors in other vertical markets have run the cost-benefit calculations and elected to hold back until the price of flash drops further. For example, a Dropbox-style file-hosting service for consumer cloud storage is motivated less by raw performance than by the availability of cheap storage at scale. Companies like these are making the usual storage tradeoff: finding a comfortable point between price and capacity. When the price of flash finally reaches parity with disk, however, the last barrier will fall for companies that want to remain competitive. When that milestone is reached, the market shift will be as significant as when disk replaced tape by beating it on the same measures: higher performance and better pricing.

Advancements in Software
One of the trends making this shift possible is that of software-defined storage. By adopting a software-defined approach to storage infrastructure, organizations have the flexibility to deploy flash storage throughout their data center architectures quickly and easily.

As background, software-defined storage moves functions and features from the hardware layer to the software layer. This removes the dependence on expensive hardware redundancies that exist only to mask failures in the hardware layer. Data center architects must always plan for hardware to fail, and flash in particular currently wears out faster than disk. In storage environments without RAID cards, a failed drive produces errors that reach the end user; to prevent this, architects build in expensive, redundant RAID cards to hide those errors. With the right software-defined strategy, the same failures can be absorbed in software and made invisible to the end user. And because software-defined storage is hardware-agnostic, it can run on commodity hardware in any configuration.
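
As a rough illustration of how a software layer can absorb a hardware failure that a RAID card would otherwise have to hide, here is a minimal Python sketch. It is not any vendor's implementation; the class names and the simple replication scheme are hypothetical, chosen only to show failures being handled above the hardware.

```python
import random

class StorageNode:
    """A hypothetical storage node; 'healthy' simulates hardware state."""
    def __init__(self, name):
        self.name = name
        self.healthy = True
        self.blocks = {}

    def write(self, key, data):
        if self.healthy:
            self.blocks[key] = data

    def read(self, key):
        if not self.healthy or key not in self.blocks:
            raise IOError(f"{self.name}: read failed")
        return self.blocks[key]

class SoftwareDefinedStore:
    """Replicates each block to several nodes so a hardware failure
    is absorbed in software instead of by a RAID card."""
    def __init__(self, nodes, replicas=3):
        self.nodes = nodes
        self.replicas = replicas

    def put(self, key, data):
        # Write the block to N replicas on different nodes.
        for node in random.sample(self.nodes, self.replicas):
            node.write(key, data)

    def get(self, key):
        # Try replicas in turn; a dead node is invisible to the caller.
        for node in self.nodes:
            try:
                return node.read(key)
            except IOError:
                continue
        raise IOError("all replicas unavailable")

nodes = [StorageNode(f"node{i}") for i in range(6)]
store = SoftwareDefinedStore(nodes)
store.put("block-42", b"user data")
nodes[0].healthy = False          # simulate a drive/node failure
print(store.get("block-42"))      # still returns b'user data'
```

Because the retry logic lives in software, the same code runs unchanged whether the nodes underneath hold disks, flash or a mix - which is what "hardware-agnostic" means in practice.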

There are a number of additional benefits that telco and service provider data center architects can achieve by combining software-defined storage with flash hardware. For instance, a software-defined approach lets an organization keep a single namespace spanning all of its storage nodes. It can also run applications directly on the storage nodes, turning them into combined "compustorage" nodes. As a result, the storage hardware doesn't need to be big or costly, yet it can still deliver very high performance. Organizations can start with a small number of cheap servers instead of building a large, expensive, traditional installation, and still scale linearly as needed - one common way of achieving this is sketched below.
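
One common way to present a single namespace across many nodes, while keeping scaling linear as cheap servers are added, is consistent hashing. The Python sketch below is a generic illustration of that technique, not a description of any particular product; the names are hypothetical.

```python
import bisect
import hashlib

def _hash(value: str) -> int:
    return int(hashlib.md5(value.encode()).hexdigest(), 16)

class SingleNamespace:
    """One flat namespace across many nodes via consistent hashing.
    Adding a node extends capacity without moving most existing keys."""
    def __init__(self, nodes, vnodes=64):
        self.vnodes = vnodes
        self.ring = []            # sorted list of (hash, node) points
        for node in nodes:
            self.add_node(node)

    def add_node(self, node):
        # Place several virtual points per node for an even spread.
        for i in range(self.vnodes):
            bisect.insort(self.ring, (_hash(f"{node}#{i}"), node))

    def locate(self, path):
        """Map any path in the namespace to the node that stores it."""
        h = _hash(path)
        idx = bisect.bisect(self.ring, (h, "")) % len(self.ring)
        return self.ring[idx][1]

ns = SingleNamespace(["node1", "node2", "node3"])
print(ns.locate("/videos/cat.mp4"))   # a path always maps to the same node
ns.add_node("node4")                  # scale out with one more cheap server
print(ns.locate("/videos/cat.mp4"))   # most keys stay where they were
```

Because every path maps deterministically to a node, clients see one flat namespace, and adding a node remaps only a small fraction of the keys, which is what allows capacity to grow roughly linearly.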

Flash Assets
Benefits of a software-defined approach to an all-flash data center include:

  • A huge performance improvement, since the faster flash technology can be used throughout the data center.
  • Lower running costs: SSDs consume less power and generate far less heat than spinning disks, so they also need less energy for cooling.
  • A smaller footprint: because SSDs are physically smaller than spinning disks, they require less rack space and real estate.
  • The ability to run more applications on the same hardware, thanks to the performance gains.

Conclusion
Even as many of us still listen to CDs in the car, the music industry is inevitably shifting to a new paradigm built on music files saved to flash storage. The same trend is repeating across industries, nowhere as dramatically as in the data center. Flash storage - with its extreme performance, efficient energy usage and increasingly competitive cost - will eventually become the industry status quo. By using software-defined storage, data center architects can design a flexible, efficient and powerful all-flash framework for the telcos, service providers and major enterprises that want the most powerful and energy-efficient data center possible.

More Stories By Stefan Bernbo

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years, he has designed and built numerous enterprise-scale data storage solutions intended to store huge data sets cost-effectively. From 2004 to 2010, Stefan worked in this field at Storegate, a wide-reaching Internet-based storage solution for the consumer and business markets with the highest possible availability and scalability requirements. Before that, he worked on system and software architecture on several projects with the Swedish giant Ericsson, a world-leading provider of telecommunications equipment and services to mobile and fixed-network operators.
