

Linux wget command examples: Your Ultimate Command Line Downloader

Author: Vivek Gite • Last updated: May 12, 2024 • 44 comments

It is a common practice to manage UNIX/Linux/BSD servers remotely over an ssh session. You may need to download software or other files for installation. There are a few compelling graphical download managers for Linux and UNIX-like operating systems. But when it comes to the command line or shell prompt, the non-interactive wget command rules as a downloader. The wget command supports HTTP, HTTPS, FTP, and other Internet protocols, provides an authentication facility, and has tons of other options. Here are some tips to get the most out of the wget command on Linux and Unix-like systems.

This page explains how to use the wget command, with valuable examples and comprehensive explanations of the most common options, so that you can become a pro CLI user. Let us get started with wget.


Linux wget command examples

The syntax for the wget command on Linux, macOS, and Unix systems is as follows:


wget url
wget [options] url

The url where the files are located is always the last parameter. For instance: [Link] Let us see some common Linux wget command examples, syntax, and usage.

How to install wget command on Linux

Use the apt command/apt-get command if you are on Ubuntu/Debian/Mint Linux:

$ sudo apt install wget

Fedora Linux users should type the dnf command:

$ sudo dnf install wget

RHEL/CentOS/Oracle/Rocky and Alma Linux users should type the yum command to install the wget command:

$ sudo yum install wget

SUSE/OpenSUSE Linux users should type the zypper command to install the wget command:

$ zypper install wget

Arch Linux users should type the pacman command to install the GNU wget command:


$ sudo pacman -S wget

Alpine Linux users should try the apk command to install the GNU wget command instead of using the busybox wget:

# apk add wget

Installing GNU wget on Unix-like systems

FreeBSD users should try the pkg command to install the GNU wget command:

$ sudo pkg install wget

macOS users should first install Homebrew to get the brew package manager, and then type the following brew command to install the GNU wget command:

$ brew install wget

OpenBSD and NetBSD users should try the following pkg_add command to install the wget command:

$ doas pkg_add wget

Now that the GNU wget command is installed on your Linux or Unix-like system, it is time to learn how to use it to download stuff from the LAN, WAN, or Internet. Let us get some hands-on experience to increase our productivity.

Downloading a single file using the wget command

Type the following command to grab the latest Linux kernel source code so that you can compile and install the Linux kernel:

$ wget [Link]


The above output clearly shows that the wget command connects to the [Link] server to download the file. It also displays the download progress, file name, download speed, and other information. By default, the file is downloaded into the current working directory. However, you can save the downloaded file under a different directory and file name.

Examples of the wget command

Try some examples of the wget command:

$ wget [Link]
$ wget -q [Link]
$ wget -O [Link] [Link]
$ wget --output-document=[Link] '[Link]rhel-baseos-9.0-beta-0-x86_64-[Link]?_auth_=xyz'
$ wget [Link][Link].bz2

Fig.01: wget command in action on my Linux box

How to save the downloaded file under a different name and directory using the wget

The -O FILE or --output-document=FILE option tells wget to download the given file from the URL and save it under the name stated by this option. Say you are downloading [Link] and you just want to save it as [Link]; then:

$ wget --output-document=[Link] '[Link][Link]?key=xyz'
## OR use the capital '-O' as follows ##
$ wget -O [Link] '[Link]key=xyz'


In other words, wget downloads the file and writes it to the given FILE named [Link]. To save the downloaded file under a different name in /tmp/, pass the -O option as follows:

$ wget -O /tmp/[Link] '[Link]'

You can set the directory prefix to prefix by passing the -P prefix option. The directory prefix is the directory where all other files and subdirectories will be saved to; in other words, the top of the retrieval tree. The default is . (the current directory):

$ wget -P /tmp/ url


$ wget -P /isoimages/ [Link]
$ wget -P /isoimages/ [Link]
$ wget -P /isoimages/ [Link]

Logging messages to FILE when using the wget command

To log messages to FILE, use:

$ wget --output-file=[Link] [Link]


## small '-o' ##
$ wget -o [Link] [Link]

We can combine both options as follows:

$ wget -o [Link] \
-O [Link] \
'[Link]rhel-baseos-9.0-beta-0-x86_64-[Link]?_auth_=xyz'

To view the log, use the more command/cat command/less command:


$ cat [Link]
$ more [Link]
## or use the grep command/egrep command ##
$ grep 'error' [Link]
$ egrep -iw 'warn|error' [Link]

How do I download multiple files using the wget?

Use the following wget syntax when you wish to download stuff from multiple
URLs. For example:

$ wget [Link] \
[Link] \
[Link]

You can create a shell variable that holds all urls and use a ‘BASH for loop’ to download all files using the wget:

URLS="[Link] \
[Link] \
[Link] \
[Link]"
for u in $URLS
do
  wget "$u"
done
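
A small variation on the above loop: if you want the downloads to run in parallel rather than one after another, background each wget with & and then wait for all of them to finish. This is only a sketch using standard shell job control:

for u in $URLS
do
  wget "$u" &
done
wait
## 'wait' blocks until every backgrounded wget has finished ##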

How do I read a list of URLs from a text file and grab files using wget?

You can put all urls in a text file and use the -i option of the wget command to download all files. First, create a text file as follows using a text editor such as nano or vi/vim:


$ vi /tmp/[Link]

Append a list of urls:

[Link]
[Link]
[Link]
[Link]

Then type the wget command as follows:

$ wget -i /tmp/[Link]

Resuming downloads with wget: a time-saving tip

You can also force wget to continue getting a partially-downloaded file, i.e., resume downloads. This is useful when you want to finish a download started by a previous instance of wget, or by another program. For instance:

$ wget -c [Link]
$ wget -c -i /tmp/[Link]

Please note that the -c option only works with FTP/HTTP/HTTPS servers that support the “Range” header.
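
You can check whether a given server advertises resume support before relying on the -c option. The following is only a quick sketch using wget's -S (--server-response) and --spider options to print the response headers without downloading anything; the URL is a placeholder:

$ wget -S --spider '[Link]' 2>&1 | grep -i 'accept-ranges'
## 'Accept-Ranges: bytes' in the output means the server accepts byte-range (resume) requests ##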

Forcing the wget to download all files in the background

The -b option is used to force wget to go into the background immediately after startup. If no log file is specified via the -o option, output is redirected to the wget-log file:

$ wget -cb -o /tmp/[Link] -i /tmp/[Link]

OR

$ nohup wget -c -o /tmp/[Link] -i /tmp/[Link] &

The nohup command runs the given COMMAND (in this example, wget) with hangup signals ignored, so that the command can continue running in the background after you log out.
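
Either way, you can keep an eye on a backgrounded download by following its log file with the tail command. For example, assuming you did not pass the -o option and wget therefore wrote to the default wget-log file in the current directory:

$ tail -f wget-log
## press CTRL+C to stop watching; the background download itself keeps running ##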

How do I limit the download speed with the wget command?

You can limit the download speed to amount bytes per second when using the wget command. Amount may be expressed in bytes, kilobytes with the k suffix, or megabytes with the m suffix. For example, --limit-rate=100k will limit the retrieval rate to 100KB/s for your wget download session. This is useful when, for whatever reason, you don’t want wget to consume the entire available bandwidth. Hence, this is useful when you want to download a large file, such as an ISO image, from mirrors:

$ wget -c -o /tmp/[Link] --limit-rate=50k [Link]

Use the m suffix for megabytes ( --limit-rate=1m ). The above command will limit the retrieval rate to 50KB/s. It is also possible to specify a disk quota for automatic retrievals to avoid a disk DoS attack. The following command will be aborted when the disk quota (100MB+) is exceeded.

$ wget -cb -o /tmp/[Link] -i /tmp/[Link] --quota=100m

From the wget man page:

Please note that Wget implements the limiting by sleeping the appropriate
amount of time after a network read that took less time than specified by the
rate. Eventually this strategy causes the TCP transfer to slow down to
approximately the specified rate. However, it may take some time for this
balance to be achieved, so don’t be surprised if limiting the rate doesn’t work
well with very small files.

Using the wget command with password-protected sites

You can supply the HTTP username/password to the server as follows:

$ wget --http-user=vivek --http-password=Secrete [Link]

Another way to specify username and password is in the URL itself. For example:

$ wget '[Link]'

Either method reveals your password to anyone who bothers to run the ps command:

$ ps aux

Sample outputs:


vivek 27370 2.3 0.4 216156 51100 ? S 05:34 0:06
vivek 27744 0.1 0.0 97444 1588 pts/2 T 05:38 0:00
vivek 27746 0.5 0.0 97420 1240 ? Ss 05:38 0:00 w
(output truncated; the full wget command line would appear in the COMMAND column)

To prevent the passwords from being seen, store them in .wgetrc or .netrc, and make sure to protect those files from other users with the chmod command. If the passwords are really important, do not leave them lying in those files either; edit the files and delete them after wget has started the download. Under Linux, you can hide processes from other users and the ps command. The same can be done on FreeBSD to prevent users from seeing information about processes owned by other users via the ps command or top command/htop command.
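
For reference, here is a minimal sketch of a ~/.netrc entry; the hostname and credentials below are placeholders, not values from this article:

machine downloads.example.com
login vivek
password mySecretHere

Then lock the file down so only you can read it, as suggested above:

$ chmod 600 ~/.netrc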

Download all mp3 or pdf files from a remote FTP server using the wget command

Generally, you can use shell special characters (aka wildcards) such as *, ?, and [] to specify selection criteria for files. The same can be used with FTP servers while downloading files. For instance:

$ wget [Link]
$ wget [Link]

OR

$ wget -g on [Link]

Downloading a file to stdout and shell pipes: wget command example

In this example, the wget command will grab the file to the standard output device ( -O - ) and pipe it to the shell command of your choice (such as the tar command). Here, tar -Jxf assumes an xz-compressed tarball; use -z instead of -J for gzip:

$ wget -q -O - '[Link]' | tar -Jxf - -C /tmp/data/

How to make a mirror of a website using the wget

Some websites/ftp servers may throttle requests or ban your IP address for
excessive requests. So use this option carefully and do not overload remote
servers.

We can create a mirror of a website with wget by passing the -m or --mirror option:

$ wget -m [Link]
$ wget --mirror [Link]
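
To mirror politely, you can combine the --mirror option with the rate limiting covered earlier plus wget's --wait option, which pauses for the given number of seconds between retrievals. A sketch; adjust the values and URL to suit your situation:

$ wget --mirror --wait=2 --limit-rate=100k [Link]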

Changing the wget User-Agent

We can identify as -U AGENT ( --user-agent=AGENT ) instead of the default Wget/VERSION. For example, to change the user-agent to ‘Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:94.0) Gecko/20100101 Firefox/94.0’:

$ wget -U 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:94.0) Gecko/20100101 Firefox/94.0' \
[Link]


Getting around HTTPS (TLS) certificate errors when using the wget command

Don’t validate the server’s HTTPS/TLS certificate by passing the --no-check-certificate option:

$ wget --no-check-certificate \
[Link]

Tip: Use lftp when you need multithreaded downloads instead of wget

lftp fetches HTTP URLs in a manner similar to wget, but segments the retrieval into multiple parts to increase download speed. It gets the specified file using several connections. This can speed up the transfer, but it loads the network heavily, impacting other users. Use it only if you really have to transfer the file ASAP.

$ lftp -e 'pget -n 5 -c url/[Link]; exit'

The above command will download [Link] in 5 segments/connections.

The wget command tutorial in video format

Here is a quick video tutorial:

[Video: Linux / Unix wget command tutorial with examples for new…]

Examples of GUI based file downloaders for Linux and Unix

Not a fan of wget CLI? Try the following GUI based tools:

KGet is a versatile and user-friendly download manager for the KDE desktop.

Gwget is a download manager for the GNOME desktop.

uGet is an easy-to-use download manager written in GTK+.

Conclusion

The GNU wget command is a powerful command-line utility to download files, resume broken partial downloads, mirror HTTP/FTP sites, provide user authentication, throttle download speed, and much more. Please note that the wget command is available not only on Linux but also on *BSD/macOS/Unix-like systems. Hence, do check the wget command documentation and source code for more advanced options. Of course, seasoned developers, users, and sysadmins can always type the following man command or help command to read the docs at the Linux or Unix server prompt:

$ man wget
$ wget --help
# get a short help for wget options using the egrep command as
filter #
$ wget --help | grep -Ewi -- '-(i|w|e)'



44 comments

JZA • Nov 1, 2006 @ 22:49

I remember there was a way to download all .txt files from a URL; however, I haven’t been able to find it. You mention that wget [Link] will do it; however, it generates an error saying:
HTTP request sent, awaiting response… 404 Not Found


🛡️ nixCraft • Nov 1, 2006 @ 22:56


Yes. It can be done, provided that the remote ftp/web server supports this feature. Due to abuse/security concerns or to avoid server load, most remote systems disable this feature.

Try the -i option as described above to fetch a list of files from a text file:
wget -i [Link]



dushyant • Jan 3, 2007 @ 11:32

If some site is not giving (showing) its full URL and is also asking for an http passwd and user login, then how do I get the data directory, and how can I use these with the wget command? Plz mail me on my id; I can’t remember this site name.


mangesh • Jul 13, 2007 @ 7:42

download recursively, resume with passive FTP

wget -c -r --passive-ftp -nH [Link]


akhil • Dec 4, 2007 @ 13:14

wget is not working for the HTTPS protocol... please tell me why?


Ramamoorthy • Nov 21, 2013 @ 7:48

It will work. If it shows a certificate error, then use --no-check-certificate.

For example: wget --no-check-certificate [Link]

Some https servers (or even http servers) will allow the download only if you are logged in. In that case, you need to log in with your browser, get the cookies, and use those cookies with wget.


[Link] • May 9, 2008 @ 4:46

Lol, these instructions are great for my dedicated server for spreading out my
[Link] downloads.


Cheers!

Interesting • Nov 20, 2008 @ 14:54

Very interesting, thanks for the information on wget, my favorite way of uploading to my dedicated servers. Be it an over-complicated and complex way, but my favorite nonetheless.


sara • Feb 17, 2009 @ 6:10

thanks a lot for your help and useful information


David D. • Mar 8, 2009 @ 22:03

Is there a way to specify the filename wget writes to?

Would the following do the trick?
“wget $options $url > [Link]”


🛡️ nixCraft • Mar 8, 2009 @ 22:47


Try the following to save [Link] as [Link]:

wget -O [Link] [Link]


Lakshmipathi.G • Apr 29, 2009 @ 6:42

Really useful info….exactly what i wanted …. thanks



mimo • May 2, 2009 @ 6:57

Nice hints!

What I am missing (and still searching for) is: how to limit the file size of a download? One of my current bash scripts is ‘spidering’ certain sites for certain archive files. If it encountered a 5GB *.zip file, it would happily download it, which I don’t want. So: what would be a good practice to limit downloads to, say, 2 MB?

Cheers


Neela.V • Sep 15, 2009 @ 6:31

Hi, I need to retrieve a file from an http site, which asks for a username and password to access the site. How can I retrieve the file using wget? Please guide me.


vivian • Sep 18, 2009 @ 4:46

Hello, I am not able to use wget on my Ubuntu system. Whenever I try to download anything, for example:

sudo wget [Link]

what I get is:

--2009-09-18 [Link]-- [Link]
Resolving [Link] [Link]
Connecting to [Link] [Link]|:80... ^C

The above is the same for any website. I am able to do sudo apt-get install, and this is evidence that I am connecting to the internet. But in the wget case I am not able to. Are there any config settings that I need to do?

TJ • Mar 15, 2016 @ 13:50

Looks like you are pressing Control-C in your shell, judging by the copy here.


Palash • Oct 15, 2009 @ 8:28

Many, many thanks for giving important tips on wget.


Feseha • Feb 13, 2010 @ 3:50

Quite informative and great time saver!!!


Thanks


zazuge • Mar 7, 2010 @ 12:41

multi-mirror download of a single file trick

wget -c "url1" -O [Link]
then use dd to fill zeros to the other chunks
dd if=/dev/zero of=[Link] bs=1024 count=100000
dd if=/dev/zero of=[Link] bs=1024 count=200000
dd if=/dev/zero of=[Link] bs=1024 count=300000

then on multiple terminals do

wget -c "url2" -O [Link]
wget -c "url3" -O [Link]

now merge
dd if=[Link] of=[Link] bs=1024 count=100000 skip=100000
dd if=[Link] of=[Link] bs=1024 count=100000 skip=200000

^^

Mark • May 14, 2010 @ 3:37

Thanks! $ wget -c -i was very helpful. 🙂



bayyou • Aug 13, 2010 @ 5:14

$wget -cb -o
********** (10stars)


johnwerhun • Aug 27, 2010 @ 18:13

How can I use wget to download Samba in Linux? I’ve tried several times but have been unsuccessful.


Trav • Sep 14, 2010 @ 9:35

Try MultiGet, it also splits the file up into threads to download faster.


James Locke • Dec 13, 2010 @ 14:59



I am trying to download data from my server using wget.

I send the login details and store the cookie. I then rotate through n numbers to copy the data into a new file. The saved file is always blank, i.e., a 0kb file size.

My website stores data on individual pages, e.g. (I have changed my actual site name to “mywebsite”):

‘[Link]

I am trying to rotate through the numbers 50 to 1 and extract the data from each
page.

The code I am using is below:

#!/usr/bin/perl

system("wget --post-data 'username=Test&password=Test&autologin=1' --cookies=on --keep-session-cookies --save-cookies=[Link] [Link]");

$x = 50;
while ($x <= 1) {
system("wget --wait=400 --post-data 'html=true&order_id=50' --referer=[Link] --cookies=on --load-cookies=[Link] --keep-session-cookies --save-cookies=[Link] [Link]");

system("wget --post-data 'html=true&order_id=50' --referer=[Link] --cookies=on --load-cookies=[Link] --keep-session-cookies --save-cookies=[Link] [Link]");

$x++;
}

How do I modify the code above so data is pulled correctly and the saved files
are not blank? Thank you

Priya • Jan 7, 2011 @ 10:51

Sir,
I have used the command given as an example, but an error is coming, as given below.

wget url
--[Link]-- url
Resolving [Link] [Link], [Link]
Connecting to [Link] failed: Network is unreachable.
Connecting to [Link] failed: Network is unreachable.


zack • Feb 9, 2011 @ 4:40

wget --http-user=***** --http-passwd=********** -O [Link] -o Lo

Please help me. I want to download data from this website – [Link] – using the above wget command. The problem is how to download a whole day of data, with the minutes and hours changing, starting from 0000UTC-8am (Malaysia local time) 1/1/2010 until 1/1/2010-2350UTC. The data is available every 10 minutes, and the time is in UTC format (hhmm).

variables:
yyyy=year, mm=month, dd=day, hh=hour, nn=minute

so for this example, yyyy=2010, mm=01, dd=01, hh=hour, nn=minute, user=mmd, passwd=bmlofeb2011

I hope anyone can help me download the data.



Dinesh Jadhav • Feb 15, 2011 @ 7:59

I want to change the owner of the file with the wget command; whatever files are downloaded need their owner changed.


m3nt4t • Apr 17, 2011 @ 11:34

The easiest way to avoid changing the owner/group of downloaded files via wget is to use sudo (run the process as the other user) or set the download path to a mount point with a specific uid/gid (e.g. uid=henry,gid=henry).


Ajay • May 3, 2011 @ 10:09

Hello,

How can I use the wget command if the following situation arises?

1) When connected to a particular server, the wget command will download the file. For this, if we set a crontab, then at the mentioned time the download will happen. But at times, if there is a problem with the server and it is not getting connected, the wget command will overwrite the existing file with a dummy file, thereby losing the contents.

Is there any way to prevent this? i.e., when not connected to the server, the wget command should not create or overwrite.

Bipin • May 20, 2011 @ 20:23

Any idea how to supply the password from a prompt with the wget command?

Thanks,
Bipin Bahuguna


Krishna • Jul 4, 2011 @ 12:13

When I use wget to download 5 files from a server using a script, it sends 1 GET request and waits for the server to respond, then sends the 2nd, and so on. I want the GETs to be sent simultaneously, irrespective of the response from the server at that moment. How do I do this? Any insights? Thanks


Ajay • Jul 4, 2011 @ 13:46

You can write a small shell script such that for 5 different files you write 5 wget commands inside the shell script, and finally run the shell script.


Ajay • Jul 4, 2011 @ 13:55

URLS="[Link]
[Link]
[Link]
[Link]"
for u in $URLS
do
wget $u
done

gbviswanadh • Sep 18, 2011 @ 11:46

How can I extract files that are in the format [Link].bz2 and [Link].bz2? Is it possible to bzip2 and unzip these files in one command?


mohit • Dec 18, 2011 @ 18:24

How can I download a file from an https website using wget?


Baronsed • Jul 1, 2012 @ 12:36

And what about when it comes to getting the list of links contained in a web page?
$ lynx -dump -listonly file > html-list

^^


netlord • Jul 10, 2012 @ 15:05

hi
and how can I (most easily) get files from SourceForge?
This “automatic closest proxy” selection is awful!


deepu • Aug 25, 2012 @ 20:03


hi,
I have to download a .txt file on a daily basis from one of our servers and save the file onto a unix system; all this exercise I have done manually. I want to create an automated script that can download and save the file into a unix directory.

Thanks in advance


mike • Sep 18, 2012 @ 10:57

I’ve been with Windows my whole life, until I started computer science in college. I started playing around with Linux; it’s a whole new world!


hitayezu • Jun 9, 2013 @ 12:43

It’s a great command, thanks.


Tony pezzella • Aug 13, 2013 @ 1:19

How do I upload a file using wget?


Srikanth Ravuri • Apr 20, 2016 @ 18:33

I am using the wget command to pull files from an external website. It executes successfully, but I see a file which has html code rather than the file that I am expecting.

Logs from the command execution…


$ wget --ftp-user=xxxxxxx --ftp-password=yyyyy "[Link]


--2016-04-20 [Link]-- [Link]
Resolving [Link]... [Link], [Link]
Connecting to [Link]|[Link]|:80... connected
HTTP request sent, awaiting response... 302 Found
Location: [Link]
--2016-04-20 [Link]-- [Link]
Resolving [Link]... [Link], [Link]
Connecting to [Link]|[Link]|:443... connecte
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: “RxNorm_full_04042016.zip.3”

2016-04-20 [Link] (98.5 KB/s) - “RxNorm_full_04042016.zip.3”

Can you please help me understand what I am missing using wget?

Thank you

Pecel • Feb 13, 2022 @ 23:58

Hi! May I ask you something? So, if I don’t join Patreon, does that mean this website will be tracking me?


🛡️ Vivek Gite • Feb 14, 2022 @ 14:25


Hello,

No. The Patreon content comes from a separate domain. This site is always supported by ads.

HTH.
