Get List Of Files From Hdfs (hadoop) Directory Using Python Script
How do I get a list of files from an HDFS (Hadoop) directory using a Python script? I have tried the following line: dir = sc.textFile('hdfs://127.0.0.1:1900/directory').collect()
Solution 1:
Use subprocess
import subprocess

p = subprocess.Popen("hdfs dfs -ls <HDFS Location> | awk '{print $8}'",
                     shell=True,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT)
for line in p.stdout.readlines():
    print(line)
EDIT: Answer without Python. The first command can be used to recursively print all the sub-directories as well; the final redirect can be omitted or changed to suit your needs.
hdfs dfs -ls -R <HDFS LOCATION> | awk '{print $8}' > output.txt
hdfs dfs -ls <HDFS LOCATION> | awk '{print $8}' > output.txt
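If you take the shell-only route, the output file can then be read back into a Python list (a minimal sketch, assuming output.txt was produced by one of the commands above):
with open('output.txt') as f:
    hdfs_paths = [line.strip() for line in f if line.strip()]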
EDIT: Correcting a missing quote in awk command.
Solution 2:
import subprocess

path = "/data"
args = "hdfs dfs -ls " + path + " | awk '{print $8}'"
proc = subprocess.Popen(args, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
s_output, s_err = proc.communicate()
all_dart_dirs = s_output.split()  # stores list of files and sub-directories in 'path'
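Note that on Python 3, communicate() returns bytes, so the entries in all_dart_dirs will be byte strings; a minimal sketch of converting them to regular strings (assuming UTF-8 output):
all_dart_dirs = [entry.decode('utf-8') for entry in s_output.split()]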
Solution 3:
Why not have the HDFS client do the hard work by using the -C
flag instead of relying on awk or python to print the specific columns of interest?
i.e. Popen(['hdfs', 'dfs', '-ls', '-C', dirname])
Afterwards, split the output on new lines and then you will have your list of paths.
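A minimal sketch of that idea (assuming Python 3, a hypothetical /data directory, and the hdfs binary on the PATH):
from subprocess import Popen, PIPE

proc = Popen(['hdfs', 'dfs', '-ls', '-C', '/data'], stdout=PIPE, stderr=PIPE)
out, err = proc.communicate()
paths = out.decode().splitlines()  # one full HDFS path per line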
Here's an example along with logging and error handling (including for when the directory/file doesn't exist):
from subprocess import Popen, PIPE
import logging

logger = logging.getLogger(__name__)

FAILED_TO_LIST_DIRECTORY_MSG = 'No such file or directory'

class HdfsException(Exception):
    pass

def hdfs_ls(dirname):
    """Returns list of HDFS directory entries."""
    logger.info('Listing HDFS directory ' + dirname)
    proc = Popen(['hdfs', 'dfs', '-ls', '-C', dirname], stdout=PIPE, stderr=PIPE)
    (out, err) = proc.communicate()
    if out:
        logger.debug('stdout:\n' + out)
    if proc.returncode != 0:
        errmsg = 'Failed to list HDFS directory "' + dirname + '", return code ' + str(proc.returncode)
        logger.error(errmsg)
        logger.error(err)
        if FAILED_TO_LIST_DIRECTORY_MSG not in err:
            raise HdfsException(errmsg)
        return []
    elif err:
        logger.debug('stderr:\n' + err)
    return out.splitlines()
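A possible usage sketch for the function above (hypothetical /data path):
try:
    for entry in hdfs_ls('/data'):
        print(entry)
except HdfsException as e:
    print('Could not list directory: ' + str(e))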
Solution 4:
For Python 3:
from subprocess import Popen, PIPE

hdfs_path = '/path/to/the/designated/folder'
process = Popen(f'hdfs dfs -ls -h {hdfs_path}', shell=True, stdout=PIPE, stderr=PIPE)
std_out, std_err = process.communicate()
list_of_file_names = [fn.split(' ')[-1].split('/')[-1] for fn in std_out.decode().split('\n')[1:]][:-1]
list_of_file_names_with_full_address = [fn.split(' ')[-1] for fn in std_out.decode().split('\n')[1:]][:-1]
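As a variation (not part of the original answer), the column and path parsing can be avoided by combining subprocess.run with the -C flag from Solution 3; a sketch assuming Python 3.7+ and the hdfs binary on the PATH:
import subprocess

result = subprocess.run(['hdfs', 'dfs', '-ls', '-C', hdfs_path],
                        capture_output=True, text=True, check=True)
list_of_file_names_with_full_address = result.stdout.splitlines()
list_of_file_names = [p.rsplit('/', 1)[-1] for p in list_of_file_names_with_full_address]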
Solution 5:
You can use the listdir function from the os library (note that this lists a local or locally mounted path rather than an hdfs:// URI):
import os

files = os.listdir(path)