Compile fails while using local cloud

The command `spark compile` works when using the Spark Cloud, but fails when I use my local cloud, with no descriptive reason why. What gives?

I see the following message in my server logs:

10.0.1.18 - - [Mon, 27 Apr 2015 17:31:15 GMT] "POST /v1/binaries?access_token=[redacted] HTTP/1.1" 404 9 "-" "-"

My console:

➜  ~ spark --version
1.2.0
➜  ~ spark config identify
Current profile: santaclara
Using API: http://10.0.1.2:8080
Access token: [redacted]
➜  ~ spark list
Checking with the cloud...
Retrieving cores... (this might take a few seconds)
garagedoor (redacted) is online
  Functions:
    int digitalread(String args)
    int digitalwrite(String args)
    int analogread(String args)
    int analogwrite(String args)
➜  ~ spark compile blinky.ino   ## taken from docs.spark.io
Including:
    blinky.ino
attempting to compile firmware
pushing file: blinky.ino
Compile failed -  compile failed

EDIT:

Directly accessing 10.0.1.2:8080/v1/binaries?access_token=[redacted] simply returns "Not Found".

@needmoreram, the local cloud does not support compiling like the Spark Cloud does.


@peekay123 That explains it! I'm just noticing in the readme that this is currently on the roadmap; can't wait! Thanks for your help!


So how do you go about loading projects onto cores that are controlled locally?

If you write your code in the Web IDE, you can download the binary and flash it locally (over USB or wirelessly). You can also use the CLI to compile your projects on the Particle Cloud and then flash the resulting binary locally, as above. Or you can compile locally with your own toolchain.
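The cloud-compile-then-local-flash workflow might look something like the sketch below with the spark CLI. The profile names are illustrative (they reuse the `santaclara` profile from the transcript above), `spark config spark` assumes the default Spark Cloud profile is named `spark`, and flashing over USB requires dfu-util and the core in DFU mode:

```shell
# Point the CLI at the Spark Cloud, which supports compilation
spark config spark

# Compile in the cloud and save the resulting binary
# (--saveTo chooses the output filename)
spark compile blinky.ino --saveTo firmware.bin

# Switch back to the local-cloud profile for everything else
spark config santaclara

# Flash the downloaded binary over USB
# (core must be in DFU mode; needs dfu-util installed)
spark flash --usb firmware.bin
```

Alternatively, `spark flash <core_name> firmware.bin` flashes the same binary over the air, which works against the local cloud since no compilation step is involved.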
