Does the local cloud come with the Web IDE?

Hi everyone,

The local cloud server, which we can run on our own hardware (with our own core profile to connect to it), does come with the Web IDE, correct?

Otherwise, how would a core connected to a local cloud receive its sketches?

Thank you everyone!

1.) The local :cloud: doesn’t come with the Web IDE

2.) You can use tools like Spark-CLI, or call the API directly, to send a binary file to the core
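For the second option, here is a rough sketch of what "calling the API to send a binary" can look like, based on the Spark cloud REST API of that era. The device ID, access token, filename, and the local server's host/port are all placeholders you would substitute with your own values:

```shell
# Flash a pre-compiled binary over the REST API (placeholders throughout).
curl -X PUT \
  -F file=@firmware.bin \
  "https://api.spark.io/v1/devices/YOUR_DEVICE_ID?access_token=YOUR_ACCESS_TOKEN"

# Against your own local cloud, point at your server instead of api.spark.io
# (the local spark-server listens on port 8080 by default):
curl -X PUT \
  -F file=@firmware.bin \
  "http://192.168.1.100:8080/v1/devices/YOUR_DEVICE_ID?access_token=YOUR_ACCESS_TOKEN"
```

Either way, the core receives the binary over the air from whichever cloud it is connected to.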


Thanks kennethlimcp; that was super fast. OK then, can I use the “official” Web IDE to write code and then, as you suggested, call an API to send a binary file to the core?

How would I compile code using the web IDE without having registered a device ID with it (since the device wouldn’t live there but rather on the local cloud)?

Is the web IDE NOT open source and available to all?


Good question. The most convenient way right now is to use the Spark-CLI tool.

Here’s what you will do:

1.) Compile the code via the Spark :cloud: (you can also compile on your computer if you don’t mind setting up a local toolchain)

spark compile ......

2.) Switch over to your own :cloud: and flash the binary file over

spark config PROFILE_NAME

spark flash CORE_NAME xxxxxx.bin
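Put together, the two steps above can be sketched as the following workflow. The profile names (`spark`, `local`), core name, and file names are placeholders; the `apiUrl` setup line is the step from the local-cloud tutorials that creates the local profile in the first place:

```shell
# One-time setup: create a profile that points at your local cloud server
spark config local apiUrl "http://192.168.1.100:8080"

# 1. Compile against the Spark cloud using the default profile:
spark config spark            # switch to the official cloud profile
spark compile my_sketch.ino   # produces a firmware .bin locally

# 2. Switch to the local cloud profile and flash the binary over the air:
spark config local
spark flash my_core firmware.bin
```

The key idea is that compiling and flashing are separate steps, so each can target a different cloud.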

I don’t think there are plans to open source the Web IDE…

If you are keen, I would recommend that you:

1.) Get familiar with the Spark ecosystem before using a local :cloud:

2.) Follow my tutorial to get things set up


Thanks again; I don’t mean to pester, but I hope this will be useful for others with the same questions in mind, plus your answers are to the point with no marketing BS. Got it; the IDE is proprietary, and that, of course, is your prerogative! In that case, are there plans to allow YOUR IDE to interact with people’s local clouds? For example, could the EC2 instance that runs the IDE push to Spark servers that happen to live outside your EC2 VPC, i.e., the servers people run as per the official local cloud tutorials?

I am digging into this so that I can clarify whether this would be another “Electric Imp” problem or actually a superior choice in terms of how much proprietary environment is imposed on the user base. Thanks again!

@bubba198, there’s no lock-in of any kind, like I mentioned, though I cannot represent or speak on behalf of Spark’s plans.

1.) firmware is open source
2.) cloud code is open source
3.) toolchain is open source
4.) hardware parts can be bought anywhere

Spark’s mission is to help product creators commercialize their products, and more tools are in the works to better manage products out in your users’ hands and let you manage all your devices as a fleet.

I’m confident that you will love the tools they will be announcing; the only catch is… you need to use the Spark :cloud: to enjoy all that stuff!

Unlike the Electric Imp issue (not badmouthing them either), you are free to do anything you want with the hardware you bought. Connecting to the :cloud: is not a must.

Maybe @jeiden or @dan can help with your questions better :smiley:

Hi @bubba198, thanks for the question! Another option I would like to throw out there would be to use Spark Dev, our open-source local development environment. Because it’s open source, I don’t see a reason why this couldn’t be adjusted to point at a local cloud. I haven’t done it before, but like I said, all of the code is fully hackable :). @suda was the engineer behind Spark Dev, perhaps he may be able to confirm what I’m proposing.

Thanks so much for your interest in building with Spark! Happy to have you in the community.

Currently, Spark Dev can only communicate with our cloud, but adding support for the local cloud shouldn’t be a problem. With the Web IDE it’s a little more complicated. Still, you can develop, compile, and flash your devices without using our Spark Cloud.

Since receiving my core I have wondered whether the Web IDE would be open sourced like the cloud (although that is apparently not fully done yet either). The IDE is great and very helpful, but once you go local it’s no help. With Spark Dev I have had nothing but problems. The CLI works perfectly; my problem with it is not functionality but the fact that the CLI is the only option for us in 2015, which is hilarious.

On the other hand, Spark/Particle is still developing and getting better each day. Hopefully Spark Dev will get a local cloud feature and a few more niggles removed so we can get a more modern setup to use.
Until then, I would suggest developing in the cloud until your sketch is complete, then just switching references to the local cloud, keeping development as simple as possible.