
How to Avoid Desktop Disasters

Take basic precautions with data backups

It's 2:00 a.m., you're working on that critical presentation, and the power goes out. Since you moved your battery backup (UPS) to your significant other's computer, you just lost all your work.

We've all been there at one time or another. Then the real trouble starts: not only haven't you saved your work in an hour, but lo and behold your PC won't boot back into your operating system. As the cold sweat drips from your brow, you realize that in addition to losing your presentation, you've also lost your financial records, calendar, and more data than you could ever hope to replace. Before it happens to you again, you need a desktop backup strategy.

This scenario resonates with many, if not all, of us. It's not an individual problem; it's a computer-user problem common among suits and "propeller-heads" alike. That's why I'm focusing on ways to avoid desktop disasters this month.

In addition to focusing on backup disasters I'm going to delve a little more deeply into the command line to solve some of these problems. I don't want to intimidate anyone who's more comfortable in the GUI world, but there are many easy-to-use, powerful command-line utilities at your disposal in most Linux distributions. Also, I have recently been inspired by Doc Searls (http://doc.weblogs.com) and his movement for DIY (do-it-yourself) IT; check out his IT Garage (http://garage.docsearls.com/). This idea isn't fascinating to me because it's new; it's fascinating because I come from the do-it-yourself generation, the one that makes it possible for a DIY channel on cable and 101 home improvement shows to exist. What's interesting and noteworthy is that, with a little knowledge, you can save yourself a lot of money by doing some of these tasks yourself. So with a nod to my fellow do-it-yourselfers, let's explore the world of Linux desktop backup strategies.

Desktop Backup Strategies

There's an old saying: "An ounce of prevention is worth a pound of cure." The phrase could just as easily have come from a system administrator after a server crash had it not first been attributed to Ben Franklin. More often than not, we as computer users haven't taken even the most basic precautions with our personal computing environments. The bottom line is that there's no foolproof way to keep our PCs from crashing, so backups are critical. Please follow me as we take a walk down backup lane.

Data

The first step in developing a backup strategy for your data is to figure out where that data lives on your system. By default, almost all your user data is stored in your /home/$USER directory ($USER is simply the user you log in to Linux as). Once you confirm that this is where your information is, you can decide how you'll make backups. In my case, I have a DVD/RW drive on my laptop. I try to keep the bulk of my data in one directory, /windows/D/, which is actually a FAT32 partition I can access from either Linux or Windows. I simply launch my favorite CD-burning software, k3b (www.k3b.org), and create a data project to copy all my critical files to a DVD or CD. Its drag-and-drop interface makes copies easy, but that's only one of many ways to archive your data. You can also copy to a second hard drive or other storage device, or even back up your data over the network. However, the method is not nearly as important as remembering to do it, or at least scheduling the system to do it regularly (see the sidebar How to Automate Backups).
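For those comfortable at a shell prompt, even a single cp can serve as a rough backup. Here's a minimal sketch; the real source and destination (say, /home/$USER and a mounted backup drive) are up to you, and this version uses throwaway temporary directories so it's safe to run anywhere:

```shell
#!/bin/sh
# Stand-ins for real paths like /home/$USER and /mnt/backup.
SRC=$(mktemp -d)
DEST=$(mktemp -d)
echo "critical presentation" > "$SRC/slides.txt"

# -a (archive mode) preserves permissions, timestamps, and subdirectories.
cp -a "$SRC/." "$DEST/"

cat "$DEST/slides.txt"
```

Swap the mktemp lines for your actual directories and you have a one-command backup you can run by hand or on a schedule.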

Operating System Backups

Most of you who have read my previous columns know I'm a Linux LiveCD junkie (www.linuxworld.com/story/45259.htm); Linux LiveCDs can be very valuable when backing up your operating system. One method is to boot the handy-dandy Knoppix CD (www.knoppix.org) and then do a complete disk copy to an extra hard drive (see Figure 1). This is not the fastest way to do things, but it catches everything on my hard drive (including all my data) and makes it easy to verify the contents of the backup. In this scenario I'll use the example of a PC with two hard drives: the first holds your operating system, and the second is where we'll store the backup. Not everyone has a second hard drive, but in these days of cheap storage it's not cost prohibitive to add a second internal or external drive; for a 60GB drive there are many options well under $100. Since you don't have to invest in backup software with these techniques, you'll most likely have the extra cash.

How to Back Up Your Hard Drive Using Knoppix

You can create backups of your hard drive in a variety of ways; however, copying the entire contents of one local drive to another creates the most thorough backup. To start, boot your computer from your Knoppix CD. You should see both hard drives on your desktop, and if you open a shell you can sudo su to gain root (super-user) access. You can tell which drive is which by browsing their contents with the Konqueror file manager. Once you have identified the drives, unmount them before copying one to the other. One drive is likely to be /dev/hda and the other /dev/hdb, but research them thoroughly to be sure: dd overwrites the destination completely, so the destination must be at least as large as the source, and getting the two backwards destroys your original. To copy one drive to the other, simply type the following command:

dd if=/dev/hda of=/dev/hdb
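Because dd is so unforgiving, it can pay to rehearse the syntax on ordinary files before pointing it at a disk; the command behaves identically. A safe sketch using throwaway files:

```shell
#!/bin/sh
# dd copies raw bytes from if= (input file) to of= (output file).
# Practicing on plain files avoids wiping the wrong disk.
src=$(mktemp)
dst=$(mktemp)
echo "pretend this is a disk" > "$src"

dd if="$src" of="$dst" 2>/dev/null

# cmp exits 0 only when the two files are byte-identical.
cmp "$src" "$dst" && echo "identical"
```

The same cmp check works on real devices after a disk-to-disk copy, though on large drives it takes a while.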

Imaging Your Drive from Another Server

You may not have room for a hard-drive image on your local machine and may choose to keep it on a server so that you can migrate it to newer hardware or use it for other PCs in your enterprise. Once again you can use the same dd command in conjunction with a variety of remote access protocols to reimage your disk. While I have done this a few times, I defer to an expert on this topic, J.H. Moore, who has put together an excellent how-to at www.okmoore.com/imagedrive.html, mirrored at knoppix.net/docs/index.php/ImageYourHardDriveUsingKnoppix.
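One common pattern (the hostname and server path below are hypothetical) is to stream the dd output through ssh, compressing it on the way. The remote form is shown as a comment; the pipeline itself is rehearsed locally on a plain file so the sketch is safe to run:

```shell
#!/bin/sh
# Over the network it might look like this (hostname/path hypothetical):
#   dd if=/dev/hda | gzip | ssh user@backupserver 'cat > /backups/hda.img.gz'
# The same pipeline, rehearsed locally on an ordinary file:
src=$(mktemp)
img=$(mktemp)
echo "disk contents" > "$src"

dd if="$src" 2>/dev/null | gzip > "$img"

# Restoring is simply the pipeline in reverse.
gunzip -c "$img"
```

Compression usually shrinks a mostly-empty disk image dramatically, which matters when the image travels over a network.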

Give Your Hard Drive a KickStart

Inevitably you'll have to reinstall the operating system on your desktop Linux PC; in a larger environment you may have to do it for many machines. Part of a good disaster plan is fast recovery; that's why you may want to use Red Hat's KickStart to automate reinstalls, or even the initial installation of systems. The process is to build one PC with your preferred configuration (this is called the build machine), then use that template to "kick off" additional installs. This automates the installs and minimizes human intervention (and consequent mistakes). It's a good measure for fixing failed machines because you maintain one master machine and let it configure your additional PCs painlessly. The tool was designed for Red Hat Linux, but many people have hacked it to work with other distributions. A Google search for "Kickstart Linux" will yield a bounty of information on the subject. For more, check out the Kickstart mailing list (listman.redhat.com/archives/kickstart-list/); it's a good place to ask questions and see what other Linux users are doing with the tool.
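At its heart, a kickstart file is just a plain-text answer sheet for the installer. The fragment below is an illustrative sketch only, with placeholder values for partition sizes, language, and timezone; consult the Kickstart documentation for the exact directives your Red Hat release supports:

```
# ks.cfg -- illustrative fragment; all values are placeholders
install
cdrom
lang en_US
keyboard us
timezone America/New_York
bootloader --location=mbr
clearpart --all
part / --fstype ext3 --size 1024 --grow
part swap --size 512
%packages
@ Base
```

The build machine's installer can generate a file like this for you, which you then hand to subsequent installs.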

Other Tips and Tricks

Backups are obviously essential for recovering from a desktop disaster, but here are some tips that may keep you from having to use them, or let you fix your installation before resorting to a full restore.

Protecting Files: Making Data Read Only

Sometimes we have data that we store but never alter. In those cases it can be advantageous to make the data read-only so it doesn't get overwritten or deleted. You can do this with Linux permissions. Every file and device on the system is controlled by permissions, and before a file is executed, written, or read, the system checks that the action is allowed.

Users and Groups

Every process on a Linux system is executed by a user, from the super-user root down to users with restricted permissions who can't compromise other parts of the operating system; in my case, this user is mrhinkle. Each user belongs to one or more groups, which makes it easier to share resources while still enforcing permissions. Information about users and groups is kept in /etc/passwd and /etc/group. If you have data you rarely write, or perhaps just an archived copy of a presentation, you can change the file's permissions to read-only. That way it's harder, though not impossible, for the file to be overwritten or deleted. Do this with the chmod command, which sets permissions for a file. The syntax is typically chmod ### file, where the three digits set the permissions for the owner, the group, and all other users, respectively. For files I don't want overwritten, I occasionally set the mode to 444, where each 4 means read-only.

In the example below I created the file example.txt using the vi editor, then used chmod to set the file to read-only so I don't overwrite the data in it. Notice that I also get warned when I try to remove the file. This isn't a foolproof method; I could have answered "y" at the rm prompt and the file would have been deleted, but the warning should make me think twice before I edit or delete the file.

mrhinkle@linux:~> vi example.txt
mrhinkle@linux:~> chmod 444 example.txt
mrhinkle@linux:~> rm example.txt
rm: remove write-protected regular file `example.txt'? n
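To confirm the mode took effect, you can read it back with stat. (The -c %a syntax shown here is GNU coreutils; BSD stat uses different flags.) A quick sketch on a throwaway file:

```shell
#!/bin/sh
f=$(mktemp)
chmod 444 "$f"

# %a prints the permission bits in octal: 444 = r--r--r--.
stat -c %a "$f"
```

If the output isn't 444, the chmod didn't apply, perhaps because the file lives on a filesystem (such as FAT32) that doesn't honor Unix permissions.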

You may also want to be extra cautious and have the files owned by another user, with read-only attributes for everyone else. For example, if I normally work as mrhinkle, I could have the files owned by root; then the only way I could delete them would be to log in as root. You might even create an archive user and use the chown command to "change owner" to archive. Keep in mind that to change a file's owner you must have permission to do so, so it's most easily done as the root user. The syntax would be:

root@linux:~> chown archive example.txt

Keep in mind that I first created the user archive, then changed the ownership of my files.

Creating a Boot Floppy

Most distributions offer various utilities for creating a boot floppy in case a misapplied kernel update or other disaster strikes. If you want to find out how to make your own, you can try this method. It takes your kernel image and copies it to a floppy disk.

Step 1: Find the Kernel

Your kernel will usually be in /vmlinuz or /boot/vmlinuz (on my SuSE 9.1 installation it's /boot/vmlinuz). This is a soft link to the actual kernel image, vmlinuz-2.6.5-7.95-smp.

Step 2: Copy the Kernel to a Floppy

You can do this by copying the kernel image to your floppy; in most cases this will be /dev/fd0.

dd if=/vmlinuz of=/dev/fd0

Step 3: Point the kernel image on the floppy at your root filesystem.

rdev /dev/fd0 /dev/hda7

Your root filesystem may well be somewhere other than /dev/hda7. I found out where my Linux installation lives by switching to the root user and listing the partition table with the fdisk utility.
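Here are two quick ways to check; fdisk needs root, but df does not. (The fdisk line is shown as a comment since it requires root and a real disk.)

```shell
#!/bin/sh
# As root, this prints the whole partition table:
#   fdisk -l /dev/hda
# Without root, df reports which device holds the root filesystem:
df /
```

The first column of df's output is the device (e.g., /dev/hda7), and the "Mounted on" column confirms it is "/".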

SIDEBAR 1

Simple Backup Script

Most of us are used to our point-and-click environments, but sometimes it's easier to use the command line. If you're like me, though, you forget command syntax and make mistakes that render the commands troublesome. That's why I took a little time to find a simple script that creates a compressed archive of a directory, complete with the date. The only catch is that you have to write a small shell script. I wrote the following one and called it "arcive":

Step 1: In your favorite text editor type the following three lines:

#!/bin/sh
tar czvf $1.$(date +%Y%m%d).tgz $1
exit $?

Step 2: Save the file and make sure it's executable. I saved my file and called it arcive. (Since I have some other files called archive I decided to change the spelling so I wouldn't be confused. You could call it whatever you want.) I had to change the file to make it executable by using the chmod command:

chmod 755 arcive

Step 3: Now the easy part: I'm going to make a backup of my Firefox Web browser folder so I can preserve all my bookmarks and plug-ins as well as copy them to another test PC. The format for doing this is to simply enter:

./arcive directory_name

where directory_name is the name of the directory you want to archive. Since I archived a directory called firefox, I got the following result:

firefox.20040712.tgz

Notice that the name includes the date (July 12, 2004), so I can easily track when I made the backup. It's a simple script, but as a non-command-line guy I find it makes archiving directories very easy.

SIDEBAR 2

How to Automate Backups

There are as many ways to back up your data as there are types of data. Here's a quick way to back up your data so that it's mirrored in exactly the same structure as your home directory. Keep in mind, this is done from the command line. You'll be using two commands: rsync and crontab. They're probably installed on your system already; if not, search the Web for information on their installation and use.

rsync synchronizes files with another location. In this simple example we'll sync to another directory on the same system: I'll be syncing my data from /home/mrhinkle/data/ to a drive mounted at /mnt/backups/.

Cron is a daemon that executes scheduled commands. In this case, we're using it to schedule backups, but you could use it to schedule the upload of files to a Web site, to archive mail, or for a variety of other tasks.

To start this exercise, type the following at a shell prompt:

crontab -e

then add a line in the format below. Keep in mind that the anatomy of a cron file looks like this:

30 4 * * * rsync -a /home/mrhinkle/data/ /mnt/backups/
30 4 * * *

This part indicates when to run the command that follows. Remember that your PC needs to be turned on at that time for the job to run. (This should be evident, but sometimes we forget the basics when we enter new territory.) The first number is the minute; the second, the hour; the next three fields are the day of the month, the month of the year, and the day of the week (acceptable values 0-6, with 0 being Sunday). I used the wildcard "*" to match every day of the month, every month, and every day of the week.

rsync -a

This is the command: run rsync, which copies the data from one directory to another, transferring only what has changed since the last sync. In this example I keep all my data in one directory. Other things you may want to back up are bookmarks and browser settings. (I use Firefox, so all my browser preferences are saved in /home/mrhinkle/.mozilla/.)

/home/mrhinkle/data/

This is the data I want to synchronize. In this case the data is stored in /home/mrhinkle/data/.

/mnt/backups/

This is the location I want to synchronize to. In a desktop PC this might be a second hard drive or ideally a remote file server to ensure additional redundancy. I have many different configurations so this is just an example. Once again, if this is a file system that must be mounted, make sure it's mounted at the time the cron job runs.

Once you have done this, check that your first couple of backups ran as you expected. Once you verify that your system is backed up regularly, you can have peace of mind that you could restore the data should the need arise. It's also good form to check from time to time that nothing has gone amiss.
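A simple spot check is to diff the source tree against the backup; silence means they match. Here's a sketch with throwaway directories standing in for your data directory and backup mount (cp stands in for the nightly rsync job so the example runs anywhere):

```shell
#!/bin/sh
src=$(mktemp -d)
dst=$(mktemp -d)
echo "data" > "$src/file.txt"

# Stand-in for the scheduled rsync job.
cp -a "$src/." "$dst/"

# diff -r walks both trees and prints nothing when they match;
# rsync -an src/ dst/ (a dry run) is an equivalent check.
diff -r "$src" "$dst" && echo "backup matches"
```

Run the same diff against your real directories after the cron job fires; any output pinpoints exactly which files are out of sync.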

Summary

To boil it all down, no matter how careful you are or how stable Linux is, chances are you will someday run into a "disastrous" crash and your only recourse will be to restore your system and data. That's why I recommend that data backups be the backbone of your disaster plan. Also, the preceding tips and tricks may be helpful in preventing a disaster. The bottom line is that I sincerely hope you never need to use the tactics outlined in this article, but if you follow these guidelines I think you'll find a Linux desktop crash won't be a disaster.

More Stories By Mark R. Hinkle

Mark Hinkle is the Senior Director, Open Source Solutions at Citrix. He is also a long-time open source expert and advocate. He is a co-founder of both the Open Source Management Consortium and the Desktop Linux Consortium. He has served as Editor-in-Chief for both LinuxWorld Magazine and Enterprise Open Source Magazine. Hinkle is also the author of the book "Windows to Linux Business Desktop Migration" (Thomson, 2006). His blog on open source, technology, and new media can be found at http://www.socializedsoftware.com.


