This post is part of my preparation series for the Cloudera CCA175 exam, “Certified Spark and Hadoop Developer”.
The Hadoop File System commands
These commands are used to import, export, and manipulate files on HDFS. For example, you can copy a text file from your local drive to HDFS by issuing hdfs dfs -put myfile.txt. There is also an older generic form, hadoop fs ...; the variant that is actually deprecated is hadoop dfs ..., and for HDFS-specific work hdfs dfs ... is the recommended invocation today. The functionality is the same either way.
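As a quick illustration of the equivalence, here is a minimal session, assuming a running HDFS cluster and a home directory such as /user/alice (both the path and the filename are illustrative):

```shell
# Copy a local file into an HDFS home directory
hdfs dfs -put myfile.txt /user/alice/myfile.txt

# The older generic form does the same thing
hadoop fs -ls /user/alice

# Verify the upload
hdfs dfs -ls /user/alice/myfile.txt
```

Both forms dispatch to the same FileSystem shell; only the entry-point script differs.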
You can issue hdfs dfs -help to see a short description of each command. The most important ones are:
-copyFromLocal: “uploads” files from the local filesystem to HDFS. With -f, existing files are overwritten.
-copyToLocal: copies files from HDFS to the local filesystem.
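A round trip with both commands might look like the following sketch; again the cluster, the /user/alice directory, and the report.csv filename are assumptions for illustration:

```shell
# Upload a file, overwriting any existing copy on HDFS with -f
hdfs dfs -copyFromLocal -f report.csv /user/alice/report.csv

# Download it back to the local filesystem under a new name
hdfs dfs -copyToLocal /user/alice/report.csv ./report_copy.csv
```

Note that -copyFromLocal behaves like -put and -copyToLocal like -get, except that they are restricted to a local source or destination, respectively.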
The basic UNIX commands all exist as a dash parameter, for example -ls. The help provided is enough to quickly answer any question you might have.
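To make that concrete, here are a few of the UNIX-style subcommands together with the two forms of -help; the paths are illustrative and assume a running cluster:

```shell
# Short description of every subcommand
hdfs dfs -help

# Help for one specific subcommand
hdfs dfs -help ls

# Familiar UNIX-style operations on HDFS paths
hdfs dfs -ls /user/alice
hdfs dfs -mkdir -p /user/alice/data
hdfs dfs -cat /user/alice/myfile.txt
hdfs dfs -rm /user/alice/data/old.txt
```

The -p flag on -mkdir creates parent directories as needed, mirroring its UNIX counterpart.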