
Alternative Ways to Download Files with Linux

A sneak peek

Fedora Core (http://fedora.redhat.com) is one of the world's most popular Linux distributions, thanks in part to support from Red Hat and a strong community of users. It has also been a proving ground for Red Hat to develop technologies that eventually make their way into Red Hat Enterprise Linux.

Recently Red Hat has spun off the Fedora Core Project into its own organization, in much the same way that the Mozilla project was spun out of Netscape in the late 1990s. Both Fedora and Red Hat Enterprise Linux have proven to be raging successes in their own right, both for Red Hat and for Linux users who choose to pursue a do-it-yourself approach to Linux. If you are looking for additional resources to learn more about these distributions, consider the Red Hat Fedora and Enterprise Linux 4 Bible, written by Christopher Negus.

Christopher Negus has been working with UNIX systems, the Internet, and (most recently) Linux systems for more than two decades at AT&T Bell Laboratories, UNIX System Laboratories, and Novell, helping to develop the UNIX operating system. Features from many of the UNIX projects Chris worked on at AT&T have found their way into Red Hat Enterprise Linux, Fedora, and other Linux systems. Chris has written several books on UNIX and the Internet; his latest, the Red Hat Fedora and Enterprise Linux 4 Bible (excerpted below), was released in July 2005. It discusses the features and capabilities of both operating systems and is written as a resource for users trying to adopt either of these technologies.

With your Fedora Core system connected to the Internet, you can take advantage of dozens of tools for browsing the Web, downloading files, getting e-mail, and communicating live with your friends. In most cases, you have several choices of GUI and command-line applications for using Internet services from your Linux desktop or shell. However, there are some utilities you may not be familiar with, such as wget, scp, and sftp.

Excerpt from Chapter 9...

Getting Files with wget
If you already know where a file is on the network, there are more efficient ways of downloading that file than opening an FTP session, moving around the FTP server, and running the get command. The wget command is a simple, efficient tool for doing non-interactive downloads of files over the Internet.

If there is a file you want to download from an FTP site or Web server (HTTP), and you know exactly where the file is, wget is a good way to download it. The wget command is also very useful if you want to recursively copy a whole site or directory structure (for example, one containing user home directories) from one computer to another. When downloading from FTP sites, wget lets you download as the anonymous user or add your own user name and password to the command line.
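For example, to grab a single file from an FTP server as the anonymous user, the command can be as simple as the following (the host and file name here are only placeholders for your own):

$ wget ftp://ftp.example.com/pub/notes.txt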

Downloading a File with User Name and Password
If you are doing an FTP file copy and need to log in as a user other than anonymous, you can add that information to the command line or to a .netrc file in your home directory (type man netrc to see the format of that file). Here is an example of adding the password to the command line:

$ wget ftp://joe:my67chevy@ftp.handsonhistory.com/memo1.doc

CAUTION: Adding a password to a command line leaves the password exposed to onlookers. This practice is generally discouraged, except in cases where no one can see your monitor or your history files.

In the previous example, the user logs in as joe with the password my67chevy. The wget command then copies the file memo1.doc from the current directory on the host computer named ftp.handsonhistory.com. That current directory is most likely /home/joe.
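As an alternative to putting the password on the command line, the same login information can go in your .netrc file. Based on the example above, the entry might look something like the following (see the netrc man page for the full format); keep the file readable only by you (for example, chmod 600 $HOME/.netrc), since the password is stored in plain text:

machine ftp.handsonhistory.com login joe password my67chevy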

Downloading a Whole Web Site
Using wget, you can download a large number of files from Web servers as well. The wget command downloads files using the http protocol if file addresses begin with http://. To download a single file, you use the same form as you would for an FTP file (for example, wget http://host/file). The best wget option for HTTP downloads is -r (recursive).

A recursive download lets you choose a point at a Web site and download all content below that point. Here is an example of a recursive download used to download the contents of the www.example.com Web site.

$ wget -r http://www.example.com

In this example, the HTML pages, images, and other content on the www.example.com Web site are copied below the current directory in the directory www.example.com. This is useful if you want to gather the contents of a Web site but don't have login access to that site. Because content is taken by following links, if there is content in a directory at the Web site that isn't in a link, it won't be downloaded.

Downloading an entire Web site can result in a massive amount of data being downloaded. If you want only part of a Web site, start from a point lower in the site's structure. Or, as an alternative, you can limit the number of levels wget will descend into the site structure. Using the -l option (l as in level), the following example gets two levels of HTML content:

$ wget -r -l 2 http://www.example.com

To mirror a site, you can use the -m option instead of -r. Running wget -m http://site is like asking to download an infinite number of levels recursively (-r -l inf), keep current time stamps (-N), and keep FTP directory listings (-nr).
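For example, mirroring the site used earlier might look like the following; because of the time-stamp checking, running the same command again later should download only files that have changed on the server:

$ wget -m http://www.example.com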

Continuing a Download
In the old days, if you were downloading a particularly large file (such as an ISO image of a CD or DVD) and the download stopped for some reason (a disconnected network or an errant reboot), you had to start all over. With wget, you can restart a download and have it continue right where it left off. This has been a lifesaver for me on many occasions.

Let's say that you were downloading a 4GB DVD ISO image named mydvd.iso from the site ftp://ftp.example.com and you killed the wget process by mistake after about 3GB of download. Make sure that your current directory is the one that contains the partially downloaded ISO. Then run the same wget command you did originally, adding the -c (continue) option as follows:

$ wget -c ftp://ftp.example.com/mydvd.iso

If you had not used the -c option in this case, a new download would have started, saving to the file name mydvd.iso.1.

NOTE: Another command that you might be interested in, similar to wget, is the curl command. Like wget, curl can download files using the ftp or http protocols. Curl can also do multiple file transfers on the same connection.
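As a rough sketch, the single-file and continued downloads shown earlier might look like this with curl (-O saves the file under its remote name, and -C - asks curl to figure out where a partial download left off and continue from there):

$ curl -O ftp://ftp.example.com/mydvd.iso
$ curl -C - -O ftp://ftp.example.com/mydvd.iso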

.  .  .

# ssh root@toys.linuxtoys.net "tail -f /var/log/messages"
root@toys.linuxtoys.net's password:

After you typed the password in the preceding case, the last several lines of the /var/log/messages file on the remote computer would be displayed. As messages were received, they would continue to be displayed until you decided to exit (press Ctrl+C to exit the tail command).

NOTE: Find out more about the ssh command from the SSH Web site (www.openssh.org).

Using scp for Remote File Copy
The scp command is a simple yet secure way of copying files among Linux systems. It uses the underlying ssh facility, so if ssh is enabled, so is scp. Here is an example of using scp to copy a file from one computer to another:

# scp myfile toys.linuxtoys.net:/home/chris
root@toys.linuxtoys.net's password: ******

In this example, the file myfile is copied to the computer named toys.linuxtoys.net in the /home/chris directory. If you don't provide a user name (as is the case here), scp assumes you are using the current user name. Unlike some tools that provide remote login, scp and ssh do allow you to log in as the root user over the network by default. (Many people turn off this feature for security reasons.)
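If you are one of the people who turns off remote root login, the usual way is to set the PermitRootLogin directive to no in the sshd configuration file (typically /etc/ssh/sshd_config) and then restart the sshd service:

PermitRootLogin no

On Fedora, restarting the service is typically done with service sshd restart.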

To use scp with a different user name, you can prepend the user name and an @ character to the host name. For example, chris@toys.linuxtoys.net:/home/chris would attempt to log in as the user named chris to do the file copy.
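Putting that together with the earlier example, the full command would look something like this:

$ scp myfile chris@toys.linuxtoys.net:/home/chris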

The first time you connect to a remote computer using scp or ssh, the command tries to establish the authenticity of the remote host. If it cannot establish the host's authenticity, it displays the RSA key fingerprint and asks whether you want to continue. If you type yes, scp will not question the authenticity of that computer again for subsequent scp commands.
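That first-time prompt typically looks something like the following (the address and fingerprint shown here are made up for illustration):

The authenticity of host 'toys.linuxtoys.net (10.0.0.5)' can't be established.
RSA key fingerprint is a1:b2:c3:d4:e5:f6:a7:b8:c9:d0:e1:f2:a3:b4:c5:d6.
Are you sure you want to continue connecting (yes/no)? yes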

However, if the RSA key fingerprint should change in the future for the remote computer (which will happen if, for example, the operating system is reinstalled on that computer), scp will refuse to let you connect to that remote computer. To override that refusal, you need to edit your $HOME/.ssh/known_hosts file and delete the entry for the remote computer. You can then verify the authenticity of the remote computer and continue to use scp.
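The entry can be removed with any text editor. On newer versions of OpenSSH, the ssh-keygen command may also remove it for you (check the ssh-keygen man page to confirm that the -R option is available on your system):

$ ssh-keygen -R toys.linuxtoys.net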

The sftp command, which also communicates using secure ssh protocols, provides an FTP-style way of copying files from a remote server. It is considered a more secure alternative to FTP, and it works with any remote computer that has an sshd server running.
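A short sftp session might look something like the following, retrieving the myfile file copied earlier (type help at the sftp> prompt to see the available commands):

$ sftp chris@toys.linuxtoys.net
chris@toys.linuxtoys.net's password:
sftp> get myfile
sftp> quit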

.  .  . 

You can find more helpful tips on using Fedora and Red Hat Enterprise Linux 4 in Chris's book, Red Hat Fedora and Enterprise Linux 4 Bible, available at your local retailer.

More Stories By Mark R. Hinkle

Mark Hinkle is the Senior Director, Open Source Solutions at Citrix. He is also a long-time open source expert and advocate. He is a co-founder of both the Open Source Management Consortium and the Desktop Linux Consortium. He has served as Editor-in-Chief for both LinuxWorld Magazine and Enterprise Open Source Magazine. Hinkle is also the author of the book "Windows to Linux Business Desktop Migration" (Thomson, 2006). His blog on open source, technology, and new media can be found at http://www.socializedsoftware.com.
