[TELL SPARK!] Feedback on Local IDE and Cloud development tools

There have been lots of posts in the last few months centered around local development, and specifically around having a robust software debugger for local development. With the release of Spark Dev, we have started down the road of enabling you to do your development locally with a full-featured IDE. Now that it's out and we're seeing people start to use it, report bugs, and request features, it would be interesting to hear from the community at large about local vs. cloud-based development. Your feedback is very important to us, and while we have definite plans to get the Spark IDEs working together and sharing more features, it would be great to know how they're being used right now. Whether it's Netbeans on Windows, Linux-based solutions, or preferences and feature requests for our Cloud IDE or Spark Dev: if you have an opinion on the topic, sound off in this thread so we can get your ideas wrangled together in one place. We want to build something that works for everyone!

Tavares @ Spark

I program my Spark with Vim and the instructions in firmware/README.md
I especially appreciate that the dependencies were a snap to install on Debian/Ubuntu. This was a major factor in how fast and frictionless it was for me to get up and running. I was impressed!
It was very important to me that the tools were familiar (GCC, Make, Git, etc.) and that the documentation was complete and easy to find.
Continuing to have simple dependencies, available as official Ubuntu packages, would be just great. (It was a surprise and a big difference compared to my frustrating experiences setting up environments to build other firmwares.)
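For reference, the flow I followed boils down to roughly this. It's a sketch from memory, so verify against firmware/README.md; the repo names are the ones I believe the spark GitHub org uses as of this writing:

```shell
# Rough sketch of the firmware/README.md flow: clone the three
# Spark firmware repos side by side in the same parent directory.
REPOS="core-common-lib core-communication-lib core-firmware"
for repo in $REPOS; do
  echo "git clone https://github.com/spark/$repo.git"
done
# Then, with the gcc-arm-none-eabi toolchain and make installed:
#   cd core-firmware/build && make
# which produces core-firmware.bin, ready for flashing with dfu-util.
```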

When it comes to developing software I have a preference for local development. At home I work with Microsoft Visual Studio 2008 for leisure projects, but I find it a rather intimidating IDE and use only the most necessary features. The other stuff Microsoft builds in is wasted money to me. I hope Spark Dev will never be like that.

The simple Arduino development IDE is just too basic.

I love the PureBasic IDE. Everything you need is present and it rocks. You can download a free demo. I have used this compiler for more than 10 years for special developments, and it is multi-platform.

The PICAXE programming editor (used by students at schools) is a good example of an IDE that gives you sufficient tools for programming your PIC chip. It even lets you simulate your program (an interpreter).

Professionally, I spend the whole day at work in the built-in VBA IDEs of the Microsoft Office programs, mostly Excel. Setting breakpoints and being able to examine the variables I use are more than sufficient.

Now and then I use the Microchip MPLAB X IDE to develop a simple program for a PIC controller. I love what this IDE brings, but it is already heading in the intimidating direction.

I like the idea of a Spark Dev IDE with a switchable Novice/Advanced/Super User IDE setting.

Bottom line: "Keep it simple, but smart".

I’ve been toying with the idea of writing an extension to enable Spark development in Visual Studio, especially with the newly released free Community Edition.

The work already happening on the local IDE will provide a great base to work from and will help reduce the effort required to create the extension (I’ve done a bit of Visual Studio integration before, although not a custom project type/language).

It’s a great IDE and I’d love to be able to use it for my Spark development.

Being able to do local debugging would be the holy grail for me!



Hi @Marcel, sorry to hear that! Did you approach Spark for tech support or ask around the community for assistance?

If you did, I’m sure it would have been resolved! Let us know if you are still interested in getting it up and running! :smile:

There are some users who had difficulty initially but with some help, they now have a great platform to tinker with.

Take care!

I had some trouble with downloading the local IDE on Windows; my Avast went nuts haha. Perhaps you can contact Avast or something similar to prevent this in the future :slight_smile:


I definitely second @greg's wish for debugging.

Another wish would be a way to hook up my local toolchain and dfu-util to Spark Dev, so I could choose between local and cloud builds.
Additionally, it would be cool if I could easily build my local code against any combination of spark-common-lib, spark-communication-lib and spark-firmware branches/forks, right from the Git repos, without having to clone them locally (silly idea, but for brainstorming some craziness is compulsory :stuck_out_tongue_winking_eye:)
I have more ideas, but I don't want to make a complete fool of myself :smirk:


I'm with @TheHawk1337; Avast is driving me crazy, especially with all the updates that are going to come out.


The community jumped in and helped me solve the problem with the Spark. I managed to update the Spark Core software and the Spark works now. Thanks for the support; I used your input as well!


I've only been playing with Spark Dev for a little bit, so I haven't gotten into it much yet. It would be helpful if it had tooltips on the buttons like the web IDE. Also, how do I open the libraries?


There are tooltips available, just a little delayed; the tooltip will appear after 2-3 seconds.

Library integration is not available yet, so for now you have to place the library files in your project folder and include them manually :wink:

Thanks Kenneth. As for the tooltips, I can't get them to work at all; even after 20 seconds there's still no tooltip. Don't know if it matters, but I'm on Windows 8.1.


Just to be sure, are you on v0.0.18? https://github.com/spark/spark-dev/releases/tag/v0.0.18

Otherwise, you can file a github issue! :smiley:


I am based in China, so being “tied” to everything online is proving problematic. Also, I move around a lot, and having to set up the Spark Core dev kit each time I want to “play” is slowing me down.

So is there a way to do an “offline” compile and deploy through the serial interface?

This might be useful. https://community.spark.io/t/how-to-video-for-compiling-locally-in-windows/2457
But seeing as setting up Spark Dev is apparently a problem, I believe you'll have a hard time finding something good that's quicker to install.
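Once you have a locally built binary, the dfu-util side is the easy part. Here's a sketch; the USB id and flash offset below are what I believe the Core uses, so double-check against the docs before actually flashing:

```shell
# Sketch only: flashing a locally built binary to the Core over USB DFU.
# 1d50:607f is (to my knowledge) the Core's DFU vendor:product id, and
# 0x08005000 the user-firmware offset in flash; verify both before use.
FIRMWARE=core-firmware.bin
DFU_CMD="dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D $FIRMWARE"
echo "$DFU_CMD"   # put the Core in DFU mode (blinking yellow), then run it
```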

In addition to local compile, local dfu-flash, local debugging and the other things I mentioned earlier, I wish Spark Dev would allow for a project folder with sub-folders, to avoid having all the libs cluttered in one directory and having to manually alter include paths.

As a temporary workaround, an implicit #define SPARK_DEV when building with Spark Dev would also be good, to allow lib providers to do something like this:

#if defined(SPARK_DEV)
  // Spark Dev: all lib files sit flat in the project directory
  #include "this.h"
  #include "that.h"
#else
  // proper project layout: libs live in their own sub-folders
  #include "../this/this.h"
  #include "../that/that.h"
#endif

I have not tried the local IDE Spark Dev. I thought I would at least tell you how I build.

I use Eclipse, and I have separate directories for each project/library. I set each one up as if it were going to GitHub and eventually the Spark Web IDE, so I use create_spark_library for each directory.

I use spark.include in my main application and that sets up links to all of my custom libraries. I compile and flash using the CLI.
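For anyone unfamiliar, a spark.include file is just a list of file paths, one per line, that the CLI pulls into the compile. Mine looks roughly like this (the paths here are made-up examples, not my actual layout):

```
application.cpp
../my-sensor-lib/my-sensor-lib.h
../my-sensor-lib/my-sensor-lib.cpp
```

Then, as far as I know, `spark compile` picks those files up and `spark flash` pushes the result to the Core; check `spark help` for the exact invocation on your CLI version.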

Eclipse works OK for my needs, and it's integrated with Git, so that is very helpful. It would be nice to be able to compile and flash from within Eclipse; I'm sure you can, I just haven't taken the time to figure that out.

Eventually I would like to add building the firmware to the process, but I guess once the Photon is out that may all change anyway.

@mtnscott, in this thread you’ll find how to add dfu-util to Eclipse - it works like a charm

I still haven't gotten rid of some bogus bug indicators in the Eclipse editor when working on core-common-lib or core-communication-lib. Have you seen something similar, or did you get that sorted?
Maybe we could discuss that in the above thread :wink:

I read all the posts and see several mentions of compiling/dfu-flashing, but does Spark Dev work with a local cloud yet?