The Best of Both Worlds

Disk and tape backups: an optimal solution

Speedy disk backups are gaining in popularity as networking demands increase, but traditional tape data protection won't disappear overnight. Maybe it's time to think about the best of both worlds.

With more and more pressure on Linux system administrators to protect data and make it readily accessible 24 hours a day, an increasing number of companies are looking for alternatives or supplemental solutions to traditional client-to-tape backup/restore.

Faced with budget constraints and a steadily escalating need for quick, if not instant, data recovery, some businesses have implemented disk-to-disk backup. Manufacturers of both Linux-based hardware and software are moving to meet the increased demand for methods that don't rely exclusively on tape.

Comparing disk backup to tape backup isn't really an exercise in what is best. More realistically, the comparison is based on what is most appropriate in terms of networking demands and structure.

Complete backup and restore is, of course, the best protection against data loss, and the two foremost methods available are tape and hard disk systems. Historically, tape has been the preferred method because tape cartridges were inherently cheaper than disk drives. A secondary reason was that tape cartridges were portable and could be stored away from the computer system; since the disk drive was internal to the PC, it could not be removed or stored easily.

However, technology evolves to meet demands. Steady improvements in system and drive components have pushed down the cost of disk drives, resulting in a competitive alternative to pure tape systems. Of equal significance is the fact that tape is comparatively slow and sequential, sometimes making it difficult to find files quickly. Disk drives, on the other hand, offer direct random access, and their read/write efficiencies save time (and time is money), translating into enhanced productivity and a potential drop in operating costs.

Industry professionals who tout disk-to-disk options point out advantages that are primarily related to instant access of mission-critical information. They claim:

  • Much shorter backup windows with little or no downtime
  • Rapid restores
  • Cost-effectiveness

Users and servers are increasingly scattered across wide-ranging time zones, requiring applications and data to be available 24x7x365.

For data to be unavailable during a lengthy backup window is no longer just inconvenient; it's unacceptable. In highly competitive situations, businesses of all sizes need up-to-the-minute data protection without sacrificing high availability.

Disk-to-disk can shorten or eliminate some backup window problems. Simply put, writing straight to disk is faster than to tape. Downtime is reduced or eliminated, thus decreasing the time when data is vulnerable to loss.

Similarly, restoring data from disk is quicker than from tape. Instant access to files is possible for everyone on the network without a time-consuming restore of huge tape archives.

It should be pointed out that some software users (and producers) advocate disk backup because their backup software cannot multiplex or multistream data to tape drives. Disk targets sidestep this limitation because they accept multiple concurrent streams.

The cost gap between the two methods is also narrowing. Disk-based storage hardware prices are at an all-time low, and when speed and capacity ratios are factored in, hard disks can offer an impressive return on investment. While tape has historically been substantially cheaper than equivalent disk storage, disk prices have fallen (and capacities grown) to the point where disk-based systems are a financially feasible alternative to tape, while delivering significantly better performance.

Tape backups are still easier to secure, either by archiving to a different site or by pulling the tape each night and taking a copy home in your backpack. High-volume disk arrays are simply not as flexible when formulating security guidelines.

In actuality, a combination of disk and tape backup is an optimal solution for many businesses, especially when business demands include regularly retrieving and restoring files quickly. Data is backed up to a hard drive, giving users immediate access to previously saved files; meanwhile, the disk array is backed up offline to archive the files as they change. Users don't lose any working time, and data isn't left vulnerable during long periods of downtime. The tape drives can be at a different location, or the tapes can be duplicated by tape management software in order to take a copy offsite.

Essentially, then, the question "Is disk or tape backup best?" isn't as valid as "What's right for my configuration and network demands?"

More Stories By Phil Roussel

Phil Roussel is chief executive officer and cofounder of Arkeia Corporation.


