1. sudo jps
Command to see all running Hadoop daemons.
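To check whether a particular daemon is running, the jps output can be filtered with grep. Daemon names such as NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager are typical on a single-node cluster, but the exact list depends on the installed services:
sudo jps | grep -i namenode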
2. hadoop fs
List all the Hadoop file system shell commands.
3. hadoop version
Print the Hadoop version.
4. hadoop fs -ls /
List the contents of the root directory in HDFS.
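To walk the whole directory tree instead of just one level, the -R flag can be added (available on the Hadoop 2.x and later fs shell):
hadoop fs -ls -R /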
5. hadoop fs -df hdfs:/
Report the amount of space used and available on the currently mounted filesystem.
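On Hadoop 2.x and later the -h flag typically prints the sizes in human-readable units instead of raw bytes:
hadoop fs -df -h hdfs:/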
6. hadoop fs -count hdfs:/
Count the number of directories, files and bytes under the paths that match the specified file pattern.
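For reference, the -count output is printed as four columns in this order (as documented for the fs shell; exact spacing varies by version):
DIR_COUNT   FILE_COUNT   CONTENT_SIZE   PATHNAME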
7. hadoop fs -mkdir /sample
Create a directory in HDFS named sample.
8. hadoop fs -put /home/cloudera/sample.txt /sample
General form: hadoop fs -put <local source> <HDFS destination>
Another example: hadoop fs -put /home/cloudera/Desktop/word /sample
Add a local text file named "sample.txt" to the new directory you created in HDFS during the previous step.
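If the HDFS destination path does not exist yet but its parent directory does, -put stores the file under that name, so a file can be renamed while it is uploaded (hypothetical file name):
hadoop fs -put /home/cloudera/sample.txt /sample/renamed.txt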
9. hadoop fs -ls /sample
List the contents of this new directory in HDFS.
10. hadoop fs -put /home/cloudera/abc.txt hdfs:/
Command to put a file from the local file system directly into the HDFS root directory.
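Assuming fs.defaultFS points at HDFS (as on the Cloudera VM used in these notes), the bare path form is equivalent to the hdfs:/ URI:
hadoop fs -put /home/cloudera/abc.txt /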
11. hadoop fs -cat /abc.txt
Command to see the contents of a file.
12. hadoop fs -rm /abc.txt
Command to remove/delete a file from HDFS.
13. hadoop fs -rm -r /sample
Command to remove a directory and its contents recursively.
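If the HDFS trash feature is enabled, deleted paths are first moved to the trash; the -skipTrash option (Hadoop 2.x and later) deletes them immediately instead:
hadoop fs -rm -r -skipTrash /sample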
14. hadoop fs -get /abc.txt /home/cloudera/Desktop
Command to copy a file from HDFS to the local file system.
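As with -put, a new name can be given at the destination (hypothetical file name):
hadoop fs -get /abc.txt /home/cloudera/Desktop/abc_copy.txt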
15. hadoop fs -mv /b.txt /sample
Command to move a file from one location to another within HDFS (both locations must be in HDFS).
PUT COMMAND: copies a file from the local file system to HDFS.
GET COMMAND: copies a file from HDFS to the local file system.
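A minimal round-trip sketch tying the two together, assuming the local home directory /home/cloudera and an existing HDFS directory /sample (both illustrative):
echo "hello hdfs" > /home/cloudera/demo.txt
hadoop fs -put /home/cloudera/demo.txt /sample
hadoop fs -cat /sample/demo.txt
hadoop fs -get /sample/demo.txt /home/cloudera/demo_copy.txt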