I have a shell script on HDFS that accepts 8-9 parameters. Normally, I run it as follows:
sh sample.sh -mode FULL -status DELETE -id 1456 -region AP -path </path/to/filepath>
I tried:

hadoop fs -cat /dev/test/sample.sh | exec bash -mode FULL -status DELETE -id 1456 -region AP -path /dev/resultsFolder

Even though I pass these arguments, they are not read: the script either executes without them or fails with "No such file or directory". What is the best way to deal with this?
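A likely cause: in `... | exec bash -mode FULL ...`, bash parses `-mode` as its own command-line options and treats the first non-option word as a script path, so the piped script body is ignored. One common fix (a sketch, not tested against your cluster; the HDFS paths are taken from the question) is `bash -s --`: `-s` makes bash read commands from stdin, and `--` ends option parsing so everything after it becomes the script's positional parameters:

```shell
# `-s` = read the script from stdin; `--` ends bash's own option parsing,
# so -mode, FULL, -status, ... arrive as $1, $2, $3, ... in the script.
# With Hadoop (paths from the question):
#   hadoop fs -cat /dev/test/sample.sh | bash -s -- -mode FULL -status DELETE -id 1456 -region AP -path /dev/resultsFolder

# The same mechanism demonstrated locally, without Hadoop:
echo 'echo "first arg: $1, value: $2"' | bash -s -- -mode FULL
```

Note that your script must read its arguments positionally (or loop over `"$@"` to parse `-mode`, `-status`, etc. itself); `bash -s --` only forwards the words, it does not interpret the flags.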
-cat <..>, i.e. using angle brackets. Also, explain what you mean by "these may not work". Either it works or it doesn't; you need to know, and if it doesn't work, you need to describe in what respect it doesn't work.