hadoop - transfer files in/out of HDFS via SSH tunnel


A bit of a complicated setup:

I have the following structure:

localhost -> bastion_host -> server -> hadoop_cluster

I am now able to create an SSH tunnel that allows me to copy files between localhost and the server. Once on the server, I can use hadoop fs -put/-get to transfer files in and out of the cluster. The cluster is not visible from anywhere other than the server.
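For context, a tunnel like the one described might be set up along these lines (a sketch; the usernames and the hostnames `bastion_host` and `server` are placeholders taken from the diagram above):

```shell
# Forward local port 2345 through the bastion to the server's SSH port.
# -f: background after auth, -N: no remote command, just forwarding.
ssh -f -N -L 2345:server:22 user@bastion_host

# With the tunnel up, the server is reachable as localhost:2345,
# so ordinary scp works for plain file copies:
scp -P 2345 testing_scp.txt user@localhost:/tmp/
```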

Is there a way of copying files in and out of the cluster using the existing tunnel?

I was under the impression I could use:

ssh -p 2345 localhost "hadoop fs -put - /user/eron/test_file3" < testing_scp.txt

where 2345 is the local tunnel port and testing_scp.txt is a local file.

However, I get:

"sh: hadoop: command not found"

so the command is not being executed on the server.

When you SSH into the server interactively, $PATH is updated by sourcing .bashrc, .profile, etc. When you run a command through the tunnel, the shell is non-interactive and those files are not sourced, so /usr/local/hadoop/bin is not added to $PATH.
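One way to confirm this (a hypothetical check, reusing the tunnel port from the question) is to compare the PATH that the remote non-interactive shell actually sees with what you get in an interactive login:

```shell
# Print the PATH seen by the non-interactive remote shell;
# /usr/local/hadoop/bin will likely be missing from it.
# Single quotes keep $PATH from being expanded locally.
ssh -p 2345 localhost 'echo $PATH'
```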

It should work if you specify the full path to the hadoop binary:

ssh -p 2345 localhost "/usr/local/hadoop/bin/hadoop fs -put - /user/eron/test_file3" < testing_scp.txt
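The reverse direction works the same way (a sketch following the question's example; the output filename is illustrative). `hadoop fs -cat` writes the file to stdout, which ssh streams back to the local redirect:

```shell
# Stream a file out of HDFS through the tunnel into a local file.
ssh -p 2345 localhost "/usr/local/hadoop/bin/hadoop fs -cat /user/eron/test_file3" > test_file3_copy.txt
```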
