When it comes to developing software, I have a preference for local development. At home I work with Microsoft Visual Studio 2008 for leisure projects, but I find it a rather intimidating IDE and use only the most necessary features. The other stuff Microsoft builds in is wasted money for me. I hope Spark Dev will never be like that.
The simple Arduino development IDE is just too basic.
I love the PureBasic IDE. Everything you need is present, and it rocks. You can download a free demo version. I have used this compiler for more than 10 years for special developments, and it is multi-platform.
The PICAXE Programming Editor (aimed at students in schools) is a good example of an IDE that gives you sufficient tools for programming your PIC chip. You can even simulate your program there (the language is interpreted).
Professionally, I spend the whole day at work in the built-in VBA IDEs of the Microsoft Office programs, mostly Excel. Setting breakpoints and examining the variables I use is more than sufficient for me.
Now and then I use the Microchip MPLAB X IDE to develop a simple program for a PIC controller. I love what this IDE brings, but it is already heading in the intimidating direction.
I like the idea of a Spark Dev IDE with a switchable Novice/Advanced/Super User IDE setting.
I've been toying with the idea of writing an extension to enable Spark development in Visual Studio, especially with the newly released free Community Edition.
The work already happening on the local IDE will provide a great base to work from and will help reduce the effort required to create the extension (I've done a bit of Visual Studio integration before, although not a custom project type/language).
It's a great IDE and I'd love to be able to use it for my Spark development.
Being able to do local debugging would be the holy grail for me!
I had some trouble downloading the local IDE on Windows; my Avast went nuts, haha. Perhaps you could contact Avast or do something similar to prevent this in the future.
Another wish would be a way to hook up my local toolchain and dfu-util to Spark Dev, so I could choose between local and cloud builds.
Additionally, it would be cool if I could easily build my local code against any combination of spark-common-lib, spark-communication-lib and spark-firmware branches/forks right from the Git repos, without having to clone them locally (silly idea, but for brainstorming some craziness is compulsory).
More ideas there, but I don't want to make a complete fool of myself.
The community jumped in and helped me solve the problem with the Spark. I managed to update the Spark Core software, and the Spark works now. Thanks for the support; I also used your input!
I've been playing with Spark Dev for a little bit, but haven't gotten into it much yet. It would be helpful if it had tooltips on the buttons like the web IDE. Also, how do I open the libraries?
Thanks Kenneth. As for the tooltips, I can't get them to work at all. Even after 20 seconds, still no tooltip. Don't know if it matters, but I'm on Windows 8.1.
I am based in China, so being "tied" to everything online is proving problematic. I also move around a lot, and having to set up the Spark Core dev kit each time I want to "play" is slowing me down.
So is there a way to compile "offline" and deploy through the serial interface?
In addition to local compile, local DFU flashing, local debugging and the other things I mentioned earlier, I'd wish for Spark Dev to allow a project folder with sub-folders, to avoid having all the libs cluttered in one directory and having to alter include paths manually.
As a temporary workaround, an implicit #define SPARK_DEV when building with Spark Dev would also be good, to allow lib providers to do this.
I have not tried the local IDE Spark Dev, but I thought I would at least tell you how I build.
I use Eclipse and have separate directories for each project/library. I set each one up as if it were going to GitHub and, eventually, the Spark Web IDE, so I use create_spark_library for each directory.
I use spark.include in my main application, and that sets up links to all of my custom libraries. I compile and flash using the CLI.
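To sketch that workflow (illustrative only: the library path, device name and output filename below are made up, and CLI option details may differ between spark-cli versions):

```
# spark.include in the project root lists extra sources
# to send along to the compiler, one pattern per line:
application.cpp
../my-sensor-lib/firmware/*.h
../my-sensor-lib/firmware/*.cpp

# Then, from the project directory:
#   spark compile .                    -> produces a firmware .bin
#   spark flash my_core firmware.bin   -> flashes it over the air
```

The nice part is that the directory layout stays GitHub-ready while the CLI handles compile and flash.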
Eclipse works OK for my needs; it's integrated with Git, so that is very helpful. It would be nice to be able to compile and flash from within Eclipse. I'm sure you can; I have just not taken the time to figure that out.
Eventually I would like to add building the firmware to the process, but I guess once the Photon is out that may all change anyway.
@mtnscott, in this thread you'll find how to add dfu-util to Eclipse - it works like a charm.
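For reference, the dfu-util invocation that gets wired up as an Eclipse External Tool for the Core looks roughly like this (the .bin filename is whatever your build produces; to my knowledge 1d50:607f is the Core's DFU USB ID and 0x08005000 the user-firmware flash address, but check against your setup):

```
dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware.bin
```

The :leave suffix tells the Core to exit DFU mode and run the new firmware after flashing.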
I still haven't gotten rid of some bogus bug indicators in the Eclipse editor when working on core-common-lib or core-communication-lib. Have you seen something similar, or did you get that sorted?
Maybe we could discuss that in the above thread.
I am just starting out, so I have mostly been using the cloud IDE. I have installed and played with Spark Dev, and it would be nice to be able to either share cloud storage with the cloud IDE or at least import/export with one click between the two.
Sometimes the simple cloud IDE is all I need (for example, when programming from an iPad) and sometimes I want the more advanced local IDE, but moving code between the two is cumbersome.
Is there already a better way to do this that I have not discovered?
I'd love to be able to develop locally, compile locally, and push to the Core or Photon, all from a UI that I run locally.
Ideally, a VS plugin that just works for compiling the .bin and uploading it to the cloud is what I have in mind. A configurable cloud server, of course; I know I'm going to work on projects that won't be allowed to use the Spark Cloud because of enterprise policy.
I don't think there's a way, but local debugging on a USB-connected Core or Photon would be absolutely incredible.