[TELL SPARK!] Feedback on Local IDE and Cloud development tools

When it comes to developing software I have a preference for local development. I work at home with Microsoft Visual Studio 2008 for leisure projects, but I find it a rather intimidating IDE and use only the most necessary features. The other stuff Microsoft builds in is wasted money. I hope Spark Dev will never be like that.

The simple Arduino development IDE is just too basic.

I love the PureBasic IDE. Everything you need is present and it rocks. You can download a free demo version. I have used this compiler for more than 10 years for special developments and it is multi-platform.

The PICAXE programming editor (aimed at students at schools) is a good example of an IDE that gives sufficient tools for programming your PIC chip. Here you can even simulate your program (interpreter).

Professionally at work I spend the whole day in the built-in VBA IDEs of the Microsoft Office programs, mostly Excel. Setting breakpoints and examining the variables I use is more than sufficient.

Now and then I use the Microchip MPLAB X IDE to develop a simple program for a PIC controller. I love what this IDE brings, but it is already heading in the intimidating direction.

I like the idea of a Spark Dev IDE with a switchable Novice/Advanced/Super User IDE setting.

Bottom line is "Keep it simple, but smart".

I've been toying with the idea of writing an extension to enable Spark development in Visual Studio, especially with the newly released free Community Edition.

The work already happening on the local IDE will provide a great base to work from and will help reduce the effort required to create the extension (I've done a bit of Visual Studio integration before, although not a custom project type/language).

It's a great IDE and I'd love to be able to use it for my Spark development.

Being able to do local debugging would be the holy grail for me!

Greg

3 Likes

Hi @Marcel, sorry to hear that! Did you approach Spark for tech support or ask around the community for assistance?

If you did, I'm sure it would have been resolved! Let us know if you are still interested in getting it up and running! :smile:

There are some users who had difficulty initially but with some help, they now have a great platform to tinker with.

Take care!

I had some trouble downloading the local IDE on Windows; my Avast went nuts haha. Perhaps you could email Avast or do something similar to prevent this in the future :slight_smile:

1 Like

I definitely second @greg's wish for debugging.

Another wish would be a way to hook up my local toolchain and dfu-util to Spark Dev, so I could choose between local and cloud builds.
Additionally, it would be cool if I could easily build my local code against any branch/fork combination of spark-common-lib, spark-communication-lib and spark-firmware right from the Git repos, without having to clone them locally (silly idea, but for brainstorming some craziness is compulsory :stuck_out_tongue_winking_eye:)
More ideas there, but I don't want to make a complete fool of myself :smirk:

1 Like

I'm with @TheHawk1337, Avast is driving me crazy, especially with all the updates that are going to come out.

1 Like

The community jumped in and helped me solve the problem with the Spark. I managed to update the Spark Core software and the Spark works now. Thanks for the support, I also used your input!

2 Likes

I've only been playing with Spark Dev for a little bit, so I haven't gotten into it much yet. It would be helpful if it had tooltips on the buttons like the web IDE. Also, how do I open the libraries?

1 Like

There are tooltips available, but they're just a little delayed - the tooltip will appear 2-3 seconds later.

Library integration is not available yet, so for now you have to place the library files in your project folder and include them from there :wink:
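For example (MyLib is just a made-up name here), you'd copy the library's MyLib.h and MyLib.cpp into the same folder as your app file and include them directly:

#include "application.h"   // Spark firmware API (needed in .cpp files)
#include "MyLib.h"          // header copied into the project folder

MyLib myThing;              // then use the library as usual

void setup() { myThing.begin(); }   // begin() is just a placeholder call
void loop()  { }

Not pretty, but it works until proper library support lands.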

Thanks Kenneth. For the tooltips, I can't get them to work at all. Even after 20 seconds, still no tooltip. Don't know if it matters, but I'm on Windows 8.1.

1 Like

Just to be sure, are you on V0.0.18? https://github.com/spark/spark-dev/releases/tag/v0.0.18

Otherwise, you can file a GitHub issue! :smiley:

Hi,

I am based in China, so being "tied" to everything online is proving problematic. Also, I move around a lot, so having to set up the Spark Core dev kit each time I want to "play" is slowing me down.

So is there a way for "offline" compile and deploy through the serial interface?

This might be useful. https://community.spark.io/t/how-to-video-for-compiling-locally-in-windows/2457
But seeing as setting up the Spark Dev is apparently a problem, I believe you'll have a hard time finding something good that's quicker to install.
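Once you have a locally built .bin, flashing it over USB is just dfu-util with the Core in DFU mode (hold MODE, tap RESET, release MODE once it starts flashing yellow). If I remember the numbers right, the usual command looks something like this (core-firmware.bin being whatever your build produced):

dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware.bin

No cloud connection needed for that part.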

In addition to local compile, local dfu-flash, local debugging and the other things I mentioned earlier, I wish Spark Dev would allow for a project folder with sub-folders, to avoid having all the libs cluttered in one directory and having to manually alter include paths.

As a temporary workaround, an implicit #define SPARK_DEV when building with Spark Dev would also be good, to allow library providers to do this:

#if defined (SPARK_DEV)
  // Spark Dev: all files live flat in the project directory
  #include "this.h"
  #include "that.h"
#else
  // local toolchain: each lib can stay in its own sub-folder
  #include "../this/this.h"
  #include "../that/that.h"
#endif
1 Like

I have not tried the local IDE Spark Dev. I thought I would at least tell you how I build.

I use Eclipse and have separate directories for each project/library. I set each one up as if it were going to GitHub and eventually the Spark Web IDE - so I use the create_spark_library for each directory.

I use spark.include in my main application and that sets up links to all of my custom libraries. I compile and flash using the CLI.
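For anyone curious, spark.include is (as far as I understand it) just a plain-text list of glob patterns telling spark compile which extra files to send along with the cloud build. Mine looks roughly like this (names shortened and made up here):

application.cpp
MyLib/MyLib.h
MyLib/MyLib.cpp

The CLI then bundles those files into the compile request, so the libraries don't have to sit next to the main application file.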

Eclipse works OK for my needs; it's integrated with Git, so that is very helpful. It would be nice to be able to compile and flash from within Eclipse - I'm sure you can, I just have not taken the time to figure that out.

Eventually I would like to add building the firmware to the process, but I guess once the Photon is out that may all change anyway.

@mtnscott, in this thread you'll find how to add dfu-util to Eclipse - it works like a charm

I still haven't got rid of some bogus bug indicators in the Eclipse editor when working on core-common-lib or core-communication-lib. Have you got something similar or did you get that sorted?
Maybe we could discuss that in the above thread :wink:

I read all the posts and see several mentions of compiling/dfu-flashing, but does Spark Dev work with a local cloud yet?

I am just starting out so have mostly been using the cloud IDE. I have installed and played with Spark Dev and it would be nice to be able to either share cloud storage with the cloud IDE or at least import/export with 1-click between the two.

Sometimes the simple cloud IDE is all I need (for example programming from an iPad) and sometimes I want the more advanced local IDE, but moving the code between the IDEs is cumbersome.

Is there already a better way to do this that I have not discovered?

1 Like

I'd love to be able to develop locally, compile locally, and push to the Core or Photon all from a UI that I run locally.

Ideally, what I have in mind is a VS plugin which just works for compiling the .bin and uploading it to the cloud. A configurable cloud server, of course; I know I'm going to work on projects which won't be allowed to use the Spark Cloud because of enterprise policy.

I don't think there's a way, but local debugging on a USB-connected Core or Photon would be absolutely incredible.