Easing Data Migration

The trick: don't move the data

Linux is emerging as the platform of choice for a growing number of enterprises across the globe. The cost, choice, and control advantages of using Open Source software for mission-critical applications have already enabled hundreds of organizations to control IT costs while expanding IT capabilities and productivity. Customers in telecommunications, financial services, and government have already aggressively deployed Linux in production workloads such as databases, SAP, messaging services, and custom applications.

While moving to a new operating system is not trivial, its complexity pales in comparison to the struggles of migrating actual data from one platform to another in production environments. The ability to migrate data between different operating systems can reduce IT costs, either as part of platform migrations or multi-platform workflows.

Many companies undertake elaborate migration projects that require the manual migration of data. However, manually migrating data between typically disparate and incompatible systems requires a substantial investment of time and labor. In fact, this complexity often overwhelms the benefits it promises. Moreover, such migrations carry a range of risks: data loss, data corruption, policy-compliance violations, and - worst of all - production downtime.

As a result, a growing number of organizations are turning to automated data migration tools to minimize such costs and risks in migrating production data workloads.

The Need to Migrate Data
Server and storage equipment replacements, relocation, consolidation, lease renewals, and balancing workloads all drive the need to migrate data on a regular basis. With larger disk sizes readily available, many organizations are looking to control costs by replacing a number of smaller drives with fewer but larger drives. Of course, fewer drives also means fewer spindles, which can negatively impact overall system performance.

Others simply have so much storage spread out among their worldwide data centers that storage migrations become a frequent process of removing old storage and adding new storage devices.

Other organizations discover they've outgrown their storage capabilities faster than anticipated and planned for, making their existing infrastructure unable to accommodate current and future data storage needs.

The Challenges
Migrating data to a Linux platform is easier said than done. According to a recent survey by Symantec, over 72% of respondents take more than two weeks to plan an implementation, and over 40% of migrations involve more than five people to complete. What's more, 61% exceed their planned downtime, 54% exceed their budget, and 83% exceed their staffing plan.

First, there are the operational issues to consider. Downtime must be scheduled, particularly in cases where the organization is making an application's data set accessible from another access point in the data center. And, with today's virtual environments, organizations have to be able to migrate from a physical to a virtual environment, and vice versa. Inadequate manual and semi-automated approaches make this even more difficult.

In all cases, coordination is key to a successful data migration. All administration groups involved must be aware of the organization's data migration schedule and process, and of their role in it. And re-establishing access to storage must be done with minimal disruption - which is very difficult when upgrading or adding another switch to a storage area network (SAN).

Beyond the operational challenges, organizations have to contend with storage-centric issues, the most daunting being file system issues. When moving data from a Unix to a Linux environment, for example, or simply adding new storage to a server and moving off an old storage device, it's necessary to resize the file system to use the new storage. A number of technologies facilitate this, enabling the virtualization of storage in such a way that the file system can interoperate better with the storage infrastructure.

Organizations must also deal with storage volumes that have incompatible formats, the challenge of preserving LUN and disk mappings across the migration, reclaiming storage, and ensuring sufficient capacity at the destination. And as with any conversion and migration, the integrity of the data is at risk.

Application-level issues also have to be considered when migrating data from one platform to another, such as from Unix to Linux. Application data formats may not be portable across platforms; some sort of conversion of the data file format has to occur before the same data can be read on a Linux box.

Finally, organizations must contend with TCP/IP network-centric issues such as ensuring sufficient bandwidth and addressing interoperability concerns. Physical connectivity issues such as re-cabling, and the performance implications of topological changes, must also be addressed.

Easing Cross-Platform Data Migration
By some analyst estimates, half or more of enterprises' structured data is stored in databases, making it very likely that this data will be migrated between unlike platforms at some point in its lifetime. But manual methods make the process unwieldy, time-consuming, and resource-intensive.

For example, moving a database from an Oracle instance running on a Sun Solaris server to another Oracle instance on a Linux server introduces a number of challenges. The storage volumes mounted on the existing system can't simply be unplugged and attached to the new server because the new Linux-based server can't interpret the information being sent.

There are a number of platform-specific factors that limit the ability to share volumes across servers, among them disk drive sector size and block size. As a result, new volumes have to be created on the Linux system and configured to receive data from the existing Solaris server. All application processing has to be halted while the data moves from one platform to the other, and the data on the volumes has to be physically moved to the new Linux server - either across the network or manually, using tape backup and restore procedures. The volumes will probably also have to be converted before they are mounted or restored on the new server. This is typically the case when data moves between platforms with dissimilar byte orders (endianness).
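To see why byte order forces a conversion step, here is a minimal Python sketch. The 32-bit field and its value are hypothetical illustrations, not the actual on-disk format of any volume manager: the point is that the bytes on disk are identical, but a big-endian SPARC host and a little-endian x86 host interpret them differently.

```python
import struct

# A hypothetical 32-bit metadata field (say, a block count) as a
# big-endian SPARC host would write it to disk.
raw = struct.pack(">I", 4096)  # bytes on disk: 00 00 10 00

# The same bytes read back on each platform:
as_sparc_reads_it = struct.unpack(">I", raw)[0]  # big-endian read
as_x86_reads_it = struct.unpack("<I", raw)[0]    # little-endian read

print(as_sparc_reads_it)  # 4096
print(as_x86_reads_it)    # 1048576 - the same bytes, misread
```

Without a conversion pass over such fields, the Linux host would see garbage values in every on-disk structure, which is why the volumes cannot simply be unplugged and reattached.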

To overcome these challenges, a growing number of organizations are turning to new technologies that don't move the data at all, but simply let it be accessed from another operating system host. The key to this technology is a new default disk format that serves as the basis for platform-independent virtual volume building blocks, often called portable data containers. Volumes formatted with the parameters of this disk format can be used with volume manager solutions regardless of the operating environment that initialized the disk, including differences in endianness. The resulting volume format removes platform-specific dependencies, including sector and block size, from the data movement equation. In short, why convert and migrate the data when you can just convert the metadata and remount the storage device?

With this new technology, migrating data from Unix to Linux is a simple process, taking minutes, not days. Administrators unmount the file system on Unix, run a conversion utility, deport the disks on Unix and import them on Linux, start the volumes, and mount the file system. According to laboratory tests, this process can be completed in a few minutes for a 500GB tablespace - whereas data conversion from tape backup would take five hours, and the same process over NFS would need four. In fact, the time such migrations take doesn't depend on the total size (or capacity) of the data, but on the number of files in the file system.
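The scaling claim above can be made concrete with a toy Python model: metadata-only conversion scales with the number of files, while copy-based migration scales with total bytes moved. The per-file cost and network bandwidth below are illustrative assumptions, not figures from the laboratory tests cited.

```python
def convert_metadata_seconds(n_files, per_file_ms=1.0):
    """Portable-container style migration: rewrite per-file metadata only.
    per_file_ms is an assumed illustrative cost, not a measured figure."""
    return n_files * per_file_ms / 1000.0

def copy_data_seconds(total_bytes, bandwidth_bytes_per_s=100e6):
    """Traditional migration: move every byte over the network or via tape.
    The 100 MB/s bandwidth is an assumed illustrative rate."""
    return total_bytes / bandwidth_bytes_per_s

# A 500GB tablespace held in 1,000 large database files:
tablespace_bytes = 500 * 1024**3

print(f"metadata conversion: {convert_metadata_seconds(1_000):.0f} s")
print(f"full data copy:      {copy_data_seconds(tablespace_bytes) / 3600:.1f} h")
```

Under these assumptions the conversion finishes in about a second while the copy takes roughly an hour and a half - and doubling the tablespace size doubles only the copy time, leaving the conversion time unchanged.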

The portable data-container building blocks simplify data migrations between heterogeneous server platforms. Application data storage can be used by any processing platform, which offers IT organizations greater leverage over existing heterogeneous computing resources in their environment.

Enhancing Business Performance
Moving data from one platform to another will never be trivial. In fact, it has historically been so hard that many organizations run their applications on sub-optimal and expensive legacy platforms just to avoid the complexities and downtime associated with data migration.

However, by leveraging new technologies that reduce the time and resources required to move data between unlike platforms - obviating the need and risk of traditional data migrations - volumes can easily be transported between unlike platforms. Physical disks can be grouped into logical volumes to improve disk utilization and eliminate storage-related downtime. Moreover, administrators have the flexibility to move data between storage arrays as needed, migrate data to new operating systems, and move files to the most appropriate storage device based on importance.

With these tools, organizations can reduce cost, risk, and downtime, while enhancing performance and maximizing the productivity of their heterogeneous IT environments.

More Stories By Andy Fenselau

Andy Fenselau has led product management across various parts of the Linux technology stack since 1998. He is currently the Linux Product Line Manager for Symantec's enterprise storage and server management solutions, spending most of his time with customers and partners to ensure Symantec's Linux solutions are meeting their needs. As a Linux evangelist, Andy has authored many articles and spoken at many events about the technical and business advantages of the evolving Linux solutions. He holds a BA from Harvard University and an MBA from Stanford University.

