
I am trying to execute a shell script stored in a bucket from a Dataflow job. I can execute gsutil commands from this job using the Direct Runner:

String[] cmdline = { "cmd.exe", "/c", "gsutil ls gs://Bucketname" };
// Launch the command through the Windows shell and stream its stdout.
Process p = Runtime.getRuntime().exec(cmdline);
BufferedReader reader =
        new BufferedReader(new InputStreamReader(p.getInputStream()));
String line;
while ((line = reader.readLine()) != null) {
    System.out.println(line);
}

Note: I want to use the Dataflow runner to execute the script, because I am working on a Windows machine.

  • Are you trying to run this from within a ParDo? Is this failing? If so, what error are you seeing? What does your pipeline look like? Commented Aug 23, 2017 at 0:03
  • If I write String command = "bash shellDataflow.sh"; Process p = Runtime.getRuntime().exec(command);, I get the error "exception occurred: Cannot run program "bash": CreateProcess error=2, The system cannot find the file specified". This happens because the program above invokes my Windows system's command-line instance. Is there any way to connect to the Cloud Console environment through Java code, so that I can run my shell script in that environment from my Dataflow job? (One workaround is sketched below.) Commented Aug 23, 2017 at 10:54
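
A minimal sketch of one way around that CreateProcess error: choose the shell based on the operating system the code is actually running on, so the same logic works under the Direct Runner on Windows and on the Linux Dataflow workers. The class name is illustrative, the gsutil command is taken from the question, and whether gsutil is present on a given worker depends on the environment.

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ShellCommandSketch {
    public static void main(String[] args) throws Exception {
        // cmd.exe exists only on Windows; "sh" exists on the Linux workers.
        boolean isWindows =
                System.getProperty("os.name").toLowerCase().contains("win");
        String[] cmdline = isWindows
                ? new String[] { "cmd.exe", "/c", "gsutil ls gs://Bucketname" }
                : new String[] { "sh", "-c", "gsutil ls gs://Bucketname" };
        Process p = Runtime.getRuntime().exec(cmdline);
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        p.waitFor();
    }
}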

1 Answer


Try using this; it runs in my case. You have to deploy the code to the cloud as a .jar or as a Maven project. The path /home/*/test.sh is in the Cloud Console environment.

String[] cmd = {"sh",  "/home/akash/test.sh", "/home/akash/"};
        Runtime.getRuntime().exec(cmd);
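
For the Dataflow case specifically, here is a minimal sketch of how an exec call like this might be wrapped in a ParDo so it executes on the Linux workers rather than on the Windows machine that submits the job. It assumes the Apache Beam Java SDK; the class names, pipeline structure, and the idea of passing the script path as an element are illustrative, and the script must actually exist on (or be staged to) the worker for the call to succeed.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;

public class RunScriptPipeline {
    // DoFn whose only job is to shell out to the script on the worker.
    static class RunScriptFn extends DoFn<String, Void> {
        @ProcessElement
        public void processElement(ProcessContext c) throws Exception {
            // Dataflow workers run Linux, so "sh" is available there even
            // though it is not on the Windows machine submitting the job.
            Process p = Runtime.getRuntime().exec(new String[] { "sh", c.element() });
            p.waitFor();
        }
    }

    public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        // Submit with --runner=DataflowRunner so the DoFn runs on workers.
        Pipeline pipeline = Pipeline.create(options);
        pipeline
            .apply(Create.of("/home/akash/test.sh")) // script path from the answer
            .apply(ParDo.of(new RunScriptFn()));
        pipeline.run().waitUntilFinish();
    }
}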