Spark CLI in Java

I want to communicate with my Core from OS X. I can do this with no problem using spark-cli in the terminal, and it works fine.
I have found out how to execute commands in Java with help from places like this.

Only problem is I keep getting an error:

java.io.IOException: Cannot run program "spark": error=2, No such file or directory

Is it possible to send the strings I want to run straight into the terminal from java?

You might need to provide the full path to your ‘spark’ command.

Hi @StudBeefpile,

When exec'ing shell commands from other languages, the key is making sure you've got all the environment variables set. Make sure your PATH includes the location of the CLI, the Node.js environment variables, etc. :slight_smile:

Try running which spark to see where the CLI actually lives.

Thanks,
David
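To see the problem from the Java side, you can check which PATH the JVM actually inherited and resolve a command against it, much like which does (a minimal sketch; "spark" here is just the command name being looked up):

```java
import java.io.File;

public class Which {
    // Resolve a command name against the PATH the JVM inherited,
    // mimicking what `which` does in the shell.
    static String which(String command) {
        String path = System.getenv("PATH");
        if (path == null) return null;
        for (String dir : path.split(File.pathSeparator)) {
            File candidate = new File(dir, command);
            if (candidate.isFile() && candidate.canExecute()) {
                return candidate.getAbsolutePath();
            }
        }
        return null; // not on the JVM's PATH -> "No such file or directory"
    }

    public static void main(String[] args) {
        // If this prints null, Runtime.exec("spark") will fail the same way.
        System.out.println(which("spark"));
    }
}
```

If the GUI-launched JVM was started with a trimmed-down PATH, this can return null even though which spark works in a terminal.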


So I’m trying something like this:

import java.io.File;
import java.util.Scanner;

String homedir = "/usr/local/lib/node_modules/spark-cli/";
File wd = new File(homedir);
Process pwd = Runtime.getRuntime().exec("spark", null, wd);
Scanner scanner = new Scanner(pwd.getInputStream());
while (scanner.hasNextLine()) {
  System.out.println(scanner.nextLine());
}
Output: Cannot run program "spark" (in directory "/usr/local/lib/node_modules/spark-cli"): error=2, No such file or directory

I still get the same problem. How would I also direct the path to the node.js thingy?

I have tried which spark now that I know what it does. It gives me “/usr/local/bin/spark”.

Problem is, that doesn’t seem to exist, and Java returns this: error=20, Not a directory

I cannot even find the directory, even with hidden files showing.

Try:

Process pwd = Runtime.getRuntime().exec("/usr/local/bin/spark", null, wd);

The “working directory” is not the same as the “search path” for commands. It’s probably best to specify the full path directly in your command parameter. (Also, /usr/local/bin/spark is the executable file itself, not a directory, which is why passing it as the working directory gives error=20.)
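That advice can be sketched with ProcessBuilder, which is a little friendlier than Runtime.exec for this. The helper takes the absolute path explicitly, so the binary is found regardless of the working directory; /bin/echo in main is just a stand-in for /usr/local/bin/spark:

```java
import java.io.IOException;
import java.util.Scanner;

public class RunCli {
    // Run an executable by its absolute path and capture its stdout.
    // The working directory is irrelevant to locating the binary;
    // only the path in the command itself matters.
    static String run(String absolutePath, String... args)
            throws IOException, InterruptedException {
        String[] cmd = new String[args.length + 1];
        cmd[0] = absolutePath;
        System.arraycopy(args, 0, cmd, 1, args.length);
        Process p = new ProcessBuilder(cmd)
                .redirectErrorStream(true) // merge stderr into stdout
                .start();
        StringBuilder out = new StringBuilder();
        try (Scanner scanner = new Scanner(p.getInputStream())) {
            while (scanner.hasNextLine()) {
                out.append(scanner.nextLine()).append('\n');
            }
        }
        p.waitFor();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // e.g. run("/usr/local/bin/spark", "list") once you've confirmed the path
        System.out.print(run("/bin/echo", "hello"));
    }
}
```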
