Alternative Ways to Download Files with Linux

A sneak peek

Fedora Core (http://fedora.redhat.com) is one of the world's most popular Linux distributions, thanks in part to support from Red Hat and a strong community of users. It has also been a proving ground for Red Hat to develop technologies that eventually make their way into Red Hat Enterprise Linux.

Recently Red Hat has spun off the Fedora Core Project into its own organization, in much the same way that the Mozilla project was spun out of Netscape back in the late 1990s. Both Fedora and Red Hat Enterprise Linux have proven to be raging successes in their own right, both for Red Hat and for Linux users who choose to pursue a do-it-yourself approach to Linux. If you are looking for additional resources to learn more about these distributions, consider the Red Hat Fedora and Enterprise Linux 4 Bible, written by Christopher Negus.

Christopher Negus has been working with UNIX systems, the Internet, and (most recently) Linux systems for more than two decades at AT&T Bell Laboratories, UNIX System Laboratories, and Novell, helping to develop the UNIX operating system. Features from many of the UNIX projects Chris worked on at AT&T have found their way into Red Hat Enterprise Linux, Fedora, and other Linux systems. Chris has written several books on UNIX and the Internet; his latest, the Red Hat Fedora and Enterprise Linux 4 Bible (excerpted below), was released in July 2005. It discusses the features and capabilities of both operating systems and is written as a resource for users trying to adopt either of these technologies.

With your Fedora Core system connected to the Internet, you can take advantage of dozens of tools for browsing the Web, downloading files, getting e-mail, and communicating live with your friends. In most cases, you have several choices of GUI and command-line applications for using Internet services from your Linux desktop or shell. However, there are some utilities you may not be familiar with, such as wget, scp, and sftp.

Excerpt from Chapter 9...

Getting Files with wget
If you already know where a file is on the network, there are more efficient ways of downloading that file than opening an FTP session, moving around the FTP server, and running the get command. The wget command is a simple, efficient tool for doing non-interactive downloads of files over the Internet.

If there is a file you want to download from an FTP site or Web server (HTTP), and you know exactly where the file is, wget is a good way to download it. The wget command is also useful if you want to copy a whole site or directory structure recursively from one computer to another (for example, a tree containing user home directories). When downloading from FTP sites, wget lets you download as the anonymous user or add your own user name and password to the command line.
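
Here is a minimal example (the host name and file path are hypothetical) of grabbing a single file anonymously from an FTP site:

$ wget ftp://ftp.example.com/pub/README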

Downloading a File with User Name and Password
If you are doing an FTP file copy and need to log in as a user other than anonymous, you can add that information to the command line or to a .netrc file in your home directory (type man netrc to see the format of that file). Here is an example of adding the password to the command line:

$ wget ftp://joe:my67chevy@ftp.handsonhistory.com/memo1.doc

CAUTION: Adding a password to a command line leaves the password exposed to onlookers. This practice is generally discouraged, except in cases where no one can see your monitor or your history files.

In the previous example, the user logs in as joe with the password my67chevy. The wget command then copies the file memo1.doc from the current directory on the host computer named ftp.handsonhistory.com. That current directory is most likely /home/joe.
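
As an alternative to putting the password on the command line, you could keep the same credentials in a .netrc file in your home directory. Here is a sketch using the joe/my67chevy example above (make sure the file is readable only by you, for example with chmod 600 $HOME/.netrc):

machine ftp.handsonhistory.com login joe password my67chevy

With that entry in place, the same download works without the password appearing in the command or your shell history:

$ wget ftp://ftp.handsonhistory.com/memo1.doc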

Downloading a Whole Web Site
Using wget, you can download a large number of files from Web servers as well. The wget command downloads files using the HTTP protocol when file addresses begin with http://. To download a single file, you use the same form as you would for an FTP file (for example, wget http://host/file). The best wget option for HTTP downloads is -r (recursive).

A recursive download lets you choose a point at a Web site and download all content below that point. Here is an example of a recursive download that retrieves the contents of the www.example.com Web site:

$ wget -r http://www.example.com

In this example, the HTML pages, images, and other content on the www.example.com Web site are copied into a directory named www.example.com below the current directory. This is useful if you want to gather the contents of a Web site but don't have login access to that site. Because content is gathered by following links, any content at the Web site that isn't linked to won't be downloaded.

Downloading an entire Web site can result in a massive amount of data being downloaded. If you want only part of a Web site, start from a point lower in the site's structure. Or, as an alternative, you can limit the number of levels wget will descend into the site structure. Using the -l option (l as in level), the following example gets two levels of HTML content:

$ wget -r -l 2 http://www.example.com
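
If you start the download partway down a site, you may also want the -np (no-parent) option so that wget doesn't follow links back up into the site's parent directories. Here is a sketch, assuming a hypothetical /docs/ directory at the example site:

$ wget -r -np http://www.example.com/docs/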

To mirror a site, you can use the -m option instead of -r. Running wget -m http://site is like asking wget to download an infinite number of levels recursively (-r -l inf), keep current time stamps (-N), and keep FTP directory listings (-nr).
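
In other words, the two command lines below should behave roughly the same way (treat this as a sketch; check the wget man page on your system for the exact set of options that -m implies):

$ wget -m http://www.example.com
$ wget -r -l inf -N -nr http://www.example.com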

Continuing a Download
In the old days, if the download of a particularly large file (such as an ISO image of a CD or DVD) stopped for some reason (a disconnected network or an errant reboot), you needed to start all over. With wget, you can restart a download and have it continue right where it left off. This has been a lifesaver for me on many occasions.

Let's say that you were downloading a 4GB DVD ISO image named mydvd.iso from the site ftp://ftp.example.com and you killed the wget process by mistake after about 3GB of download. Make sure that your current directory is the one that contains the partially downloaded ISO. Then run the same wget command you did originally, adding the -c (continue) option as follows:

$ wget -c ftp://ftp.example.com/mydvd.iso

If you had not used the -c option in this case, a new download would have started from scratch, saving to the file name mydvd.iso.1.
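
If your network connection is flaky, you can also tell wget to keep retrying rather than giving up after the default number of attempts. In this sketch, -t 0 asks for an unlimited number of retries:

$ wget -c -t 0 ftp://ftp.example.com/mydvd.iso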

NOTE: Another command that you might be interested in, similar to wget, is the curl command. Like wget, curl can download files using the FTP or HTTP protocols. The curl command can also do multiple file transfers on the same connection.
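
For comparison, here is a rough curl equivalent of the earlier download examples (a sketch; -O saves the file under its remote name, and -C - asks curl to continue a partial transfer):

$ curl -O ftp://ftp.example.com/mydvd.iso
$ curl -C - -O ftp://ftp.example.com/mydvd.iso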

Using ssh, you can also run a command on a remote computer and see its output locally. For example:

# ssh root@toys.linuxtoys.net "tail -f /var/log/messages"
root@toys.linuxtoys.net's password:

After you typed the password in the preceding case, the last several lines of the /var/log/messages file on the remote computer would be displayed. As messages were received, they would continue to be displayed until you decided to exit (press Ctrl+C to exit the tail command).

NOTE: Find out more about the ssh command from the SSH Web site (www.openssh.org).

Using scp for Remote File Copy
The scp command is a simple yet secure way of copying files among Linux systems. It uses the underlying ssh facility, so if ssh is enabled, so is scp. Here is an example of using scp to copy a file from one computer to another:

# scp myfile toys.linuxtoys.net:/home/chris
root@toys.linuxtoys.net's password: ******

In this example, the file myfile is copied into the /home/chris directory on the computer named toys.linuxtoys.net. If you don't provide a user name (as is the case here), scp assumes you are using the current user name. Unlike some tools that provide remote login, scp and ssh do allow you to log in as the root user over the network by default. (Many people turn off this feature for security reasons.)
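
Copying in the other direction works the same way. This sketch (assuming myfile already exists in /home/chris on the remote machine) pulls the file back into the current local directory:

# scp toys.linuxtoys.net:/home/chris/myfile .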

To use scp with a different user name, prepend the user name and an @ character to the host name. For example, chris@toys.linuxtoys.net:/home/chris would attempt to log in as the user named chris to do the file copy.
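
Put together, a copy done as the user chris would look like the following sketch; the password prompt then asks for chris's password on the remote machine:

# scp myfile chris@toys.linuxtoys.net:/home/chris
chris@toys.linuxtoys.net's password: ******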

The first time you connect to a remote computer using scp or ssh, the command tries to establish the authenticity of the remote host. If it cannot establish the host's authenticity, it displays the RSA key fingerprint and asks whether you want to continue. If you type yes, scp will not question the authenticity of that computer again for subsequent scp commands.

However, if the RSA key fingerprint should change in the future for the remote computer (which will happen if, for example, the operating system is reinstalled on that computer), scp will refuse to let you connect to that remote computer. To override that refusal, you need to edit your $HOME/.ssh/known_hosts file and delete the entry for the remote computer. You can then verify the authenticity of the remote computer and continue to use scp.
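
Depending on your version of OpenSSH, you may be able to remove the stale entry with the ssh-keygen -R option instead of editing the file by hand (a sketch; if your ssh-keygen doesn't support -R, just delete the line for that host from $HOME/.ssh/known_hosts in a text editor):

$ ssh-keygen -R toys.linuxtoys.net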

The sftp command, which also communicates over the secure ssh protocol, provides an interactive, FTP-style way of copying files from a remote server. It is considered a more secure alternative to a standard FTP server, provided the remote machine has the sshd service running.
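
A short interactive session gives the flavor of sftp. This sketch reuses the host and user from the scp examples; get and quit are standard sftp commands:

$ sftp chris@toys.linuxtoys.net
chris@toys.linuxtoys.net's password: ******
sftp> get myfile
sftp> quit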

.  .  . 

You can find more helpful tips on using Fedora and Red Hat Enterprise Linux 4 in Chris's book, the Red Hat Fedora and Enterprise Linux 4 Bible, available at your local retailer.

More Stories By Mark R. Hinkle

Mark Hinkle is the Senior Director, Open Source Solutions at Citrix. He is also a long-time open source expert and advocate. He is a co-founder of both the Open Source Management Consortium and the Desktop Linux Consortium. He has served as Editor-in-Chief for both LinuxWorld Magazine and Enterprise Open Source Magazine. Hinkle is also the author of the book "Windows to Linux Business Desktop Migration" (Thomson, 2006). His blog on open source, technology, and new media can be found at http://www.socializedsoftware.com.
