Spark-Dev IDE Questions

So, I feel like Spark Dev should be able to grab my cloud IDE projects and load them locally into my IDE, then sync them back to the cloud. That way I could work on the local machine, but still make quick fixes from the cloud IDE when I’m not near my computer, or just when I’m elsewhere.

Now, this functionality may already exist, but I can’t find it. If it doesn’t, then I’d like to see it! We can apparently already compile with the cloud, so it’s sending our sketches to the cloud anyhow… let’s get this thing syncing!


This functionality, unfortunately, is not available as of now.

You have to manually export the project code the old-school way: copy and paste :smiley:

Done. Now, how about a suggestion to include the Spark Windows drivers with the Windows installer? It was confusing trying to figure out why the core wouldn’t communicate.

@damccull, unfortunately the DFU-util license does not allow redistribution of this sort, but I agree that this should be better communicated!

There’s an issue open for this.

Gonna ping @suda about this. :wink:

Ok, thanks for the heads up and for looking into getting that better communicated!

When I click the Flash or Compile buttons in the editor, it seems to cloud-compile just fine, but my core never gets flashed. Using the same code in the web IDE flashes it almost instantly. How do I fix this? Also, what’s with the dialog that pops up when I run this command and says “no matches”?

Click on the core list, choose the one you want to flash, and test again.

I haven’t played around much with Spark Dev, but give that a try and see if it works.

When I go to the Spark menu and Select Core, I see my core, but the dialog from the Flash button says “No Matches Found”.

Does the status bar at the bottom show the name of the core you selected with a breathing cyan light indicator?

Yes. It shows the name of it in the status bar, breathing blue.

Hmm, sorry, but I can’t help much since I haven’t played around enough to understand issues with Spark Dev.

Let’s ping @suda

Hey, sorry for the late response. Answering your questions:

@damccull: Spark-Dev should be able to grab my cloud IDE projects and load them locally into my IDE, then sync them back to the cloud

This is something we really want to have in the future, but as it involves modifying Build, the API, and a couple of other moving parts, we have to plan it the right way. For now you need to copy/paste your code.

@damccull: how about a suggestion to include the Spark Windows drivers with the Windows installer?

We have a plan to do so (along with an option to install the CLI). DFU-util can't be bundled with Dev (but it can be distributed separately).
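
In the meantime, if you install dfu-util yourself, you can flash a compiled binary over USB with the Core in DFU mode (blinking yellow). Just as a sketch, the usual Core invocation looks like this (the 1d50:607f IDs and the 0x08005000 user-firmware address are the standard Core values from the docs, and the .bin file name here is only a placeholder, so double-check both for your setup):

    dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D firmware_123456789.bin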

@damccull: When I go to the Spark menu and Select Core, I see my core, but when I get that dialog on the Flash button, it says "No Matches Found".

Do you have more than one compiled firmware (firmware_*.bin file) in your project's directory?

@suda Do you have more than one compiled firmware (firmware_*.bin file) in your project's directory?

Where is this directory? Just where I saved the .ino file? I couldn't really find a 'new project' button, so I just made a new folder and saved the pasted code from the web IDE in it. If that's the folder you're referring to, then no, I don't have ANY of those files.

Also, for some reason when I try to connect via the Serial Monitor window I'm getting:

Window load time: 10235ms index.js:39
Uncaught Error: Invalid port specified: undefined events.js:85
Uncaught TypeError: Cannot read property 'fd' of null c:\Program Files (x86)\Spark Dev\resources\app\node_modules\spark-dev\lib\views\serial-monitor-view.js:201
module.exports.SerialMonitorView.isPortOpen c:\Program Files (x86)\Spark Dev\resources\app\node_modules\spark-dev\lib\views\serial-monitor-view.js:201
module.exports.SerialMonitorView.disconnect c:\Program Files (x86)\Spark Dev\resources\app\node_modules\spark-dev\lib\views\serial-monitor-view.js:237
module.exports.SerialMonitorView.toggleConnect c:\Program Files (x86)\Spark Dev\resources\app\node_modules\spark-dev\lib\views\serial-monitor-view.js:194
(anonymous function) c:\Program Files (x86)\Spark Dev\resources\app\node_modules\space-pen\lib\space-pen.js:182
handler c:\Program Files (x86)\Spark Dev\resources\app\src\space-pen-extensions.js:104
jQuery.event.dispatch c:\Program Files (x86)\Spark Dev\resources\app\node_modules\space-pen\vendor\jquery.js:4681
elemData.handle c:\Program Files (x86)\Spark Dev\resources\app\node_modules\space-pen\vendor\jquery.js:4359

I’m having similar issues to @damccull. My program compiles fine, but there is no *.bin file in the source directory after the compile. And when I try to flash, I also get “No Matches Found”.

My Core is breathing cyan, and it shows the same at the bottom of the IDE.

Any directory you open in Spark Dev becomes the “project directory”. After you run the Compile in the cloud command, it should create a file whose name starts with firmware_ and ends with .bin. This file is then used by the Flash X in the cloud command.

I see that you have two .ino files in your project. For the compile to work correctly, the project should contain only one .ino file.
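
For example (all the file names here are just placeholders), a valid project directory might look like this after a successful cloud compile:

    my-project/
        myapp.ino                 <- exactly one sketch file
        helper.h                  <- extra .h/.cpp files are fine
        helper.cpp
        firmware_123456789.bin    <- created by Compile in the cloud,
                                     picked up by Flash in the cloud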

@suda Actually there’s only one .ino file. I have a .cpp and a .h file as well. That’s 3 files total. Where are you seeing the second one?

I’m sorry, I’ve merged two topics into one answer :wink: 0.0.17 will be released today and should fix most of the issues, including yours.


Ok, is the Spark Dev IDE supposed to compile with the cloud library files, or do I have to download the library and include it in my directory structure? If I try to include the header file (SparkIntervalTimer/SparkIntervalTimer.h), all I get is an “App code was invalid” message at the bottom when compiling, with no other useful info. If I start commenting out the library includes, that error goes away and I instead get errors related to the now-missing library code. I figured that since it was “compiling in the cloud” it would still work with the cloud libraries.
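
For reference, here’s a stripped-down version of what I’m trying to get working, with the library’s .h/.cpp copied straight into the project folder and included by a local path. This is just a sketch, not tested; the pin and timing values are placeholders, and the begin() call follows my reading of the library’s README:

    // SparkIntervalTimer.h / SparkIntervalTimer.cpp copied from the library
    // repo into this project directory, next to the .ino file
    #include "SparkIntervalTimer.h"

    IntervalTimer blinkTimer;       // hardware timer provided by the library
    const int LED = D7;             // on-board LED on the Core

    volatile bool ledState = false;

    // Timer callback: runs from the timer interrupt, so keep it short
    void toggleLed() {
        ledState = !ledState;
        digitalWrite(LED, ledState ? HIGH : LOW);
    }

    void setup() {
        pinMode(LED, OUTPUT);
        // 1000 half-milliseconds = 500 ms; hmSec scale per the README
        blinkTimer.begin(toggleLed, 1000, hmSec);
    }

    void loop() {
    }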

@suda, any chance Spark Dev can support the same spark.ignore and spark.include files as the Spark CLI? :smile:

There’s an issue open about specifying ignored files in Spark Dev settings, but we could support both the settings and the .ignore/.include files.
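
For anyone following along: as far as I remember, the CLI’s spark.include is just a plain-text list of extra files to send to the cloud compiler, one relative path per line (spark.ignore works the same way for exclusions; check the Spark CLI docs for the exact matching rules). A hypothetical spark.include might look like:

    lib/helper.h
    lib/helper.cpp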
