There have been lots of posts in the last few months centered on local development, and specifically on having a robust software debugger for local development. With the release of Spark Dev, we have started down the road of enabling you to do your development locally with a full-featured IDE. Now that it’s out and we’re seeing people start to use it, report bugs, and request features, it would be interesting to hear from the community at large about local vs. cloud-based development. Your feedback is very important to us, and while we have definite plans to get the Spark IDEs working together and sharing more features, it would be great to know how they’re being used right now. Whether it’s NetBeans on Windows, Linux-based solutions, or preferences and feature requests for our Cloud IDE or Spark Dev, if you have an opinion on the topic, sound off in this thread so we can get your ideas wrangled together in one place - we want to build something that works for everyone!
I program my Spark with Vim and the instructions in firmware/README.md.
I especially appreciate that the dependencies were a snap to install on Debian/Ubuntu. This was a major factor in how fast and frictionless it was for me to get up and running. I was impressed!
It was very important to me that the tools were familiar (GCC, Make, Git, etc.) and that the documentation was complete and easy to find.
Continuing to have simple dependencies, available as official Ubuntu packages, would be just great. (It was a surprise and a big difference compared to my frustrating experiences setting up environments to build other firmwares.)
When it comes to developing software, I prefer local development. At home I work with Microsoft Visual Studio 2008 for leisure projects, but I find it a rather intimidating IDE and use only the most necessary features. The other stuff Microsoft builds in is wasted money. I hope Spark Dev will never be like that.
The simple Arduino development IDE is just too basic.
I love the PureBasic IDE. Everything you need is present and it rocks. You can download a free demo. I have used this compiler for more than 10 years for special developments, and it is multi-platform.
The PICAXE programming editor (used by students in schools) is a good example of an IDE that gives you sufficient tools for programming your PIC chip. It even lets you simulate your program (it has an interpreter).
Professionally, at work I spend the whole day in the built-in VBA IDEs of the Microsoft Office programs, mostly Excel. Setting breakpoints and examining the variables I use is more than sufficient.
Now and then I use the Microchip MPLAB X IDE to develop a simple program for a PIC controller. I love what this IDE brings, but it is already heading in the intimidating direction.
I like the idea of a Spark Dev IDE with a switchable Novice/Advanced/Super User IDE setting.
I’ve been toying with the idea of writing an extension to enable Spark development in Visual Studio, especially with the newly released free Community Edition.
The work already happening on the local IDE will provide a great base to work from and will help reduce the effort required to create the extension (I’ve done a bit of Visual Studio integration before, although not a custom project type/language).
It’s a great IDE and I’d love to be able to use it for my Spark development.
Being able to do local debugging would be the holy grail for me!
Another wish would be a way to hook up my local toolchain and dfu-util to Spark Dev, so I can choose between local and cloud builds.
Additionally, it would be cool if I could easily build my local code against any combination of spark-common-lib, spark-communication-lib and spark-firmware branches/forks right from the Git repos, without having to clone them locally (silly idea, but for brainstorming some craziness is compulsory)
I have more ideas, but I don’t want to make a complete fool of myself
In addition to local compile, local dfu-flash, local debugging and the other things I mentioned earlier, I wish Spark Dev allowed for a project folder with sub-folders, to avoid having all the libs cluttered in one directory and having to manually alter include paths.
As a temporary workaround, an implicit #define SPARK_DEV when building with Spark Dev would also be good, so lib providers can adapt their code accordingly.
I have not tried the local IDE, Spark Dev, but I thought I would at least tell you how I build.
I use Eclipse, with separate directories for each project/library. I set each one up as if it were going to GitHub and eventually the Spark Web IDE, so I use create_spark_library for each directory.
I use spark.include in my main application and that sets up links to all of my custom libraries. I compile and flash using the CLI.
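For anyone who hasn’t seen it: spark.include is just a plain text file in the application directory listing extra file paths for the CLI to pull into the cloud compile, one pattern per line. A minimal sketch (the paths are hypothetical):

```
../my-libs/my-lib/my-lib.h
../my-libs/my-lib/my-lib.cpp
```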
Eclipse works OK for my needs; it’s integrated with Git, which is very helpful. It would be nice to be able to compile and flash from within Eclipse. I’m sure you can, I just haven’t taken the time to figure that out.
Eventually I would like to add building the firmware to the process, but I guess once the Photon is out that may all change anyway.
@mtnscott, in this thread you’ll find how to add dfu-util to Eclipse - it works like a charm.
I still haven’t gotten rid of some bogus bug indicators in the Eclipse editor when working on core-common-lib or core-communication-lib. Have you seen something similar, or did you get that sorted?
Maybe we could discuss that in the above thread.