Spark local Cloud (Beta) - Collection of issues

OK, I have to ask the community:
Hi Spark Community,

why did I write this? Because I'm not sure whether I made a mistake or whether the issue is in the local cloud software itself. So I tried to figure out whether I'm alone with this problem or whether anyone has had similar trouble with one of the following two points:

Repeated (x-times) reconnections
I have noticed that my Core needs up to 30-40 (sometimes 20) tries to establish a connection to the local cloud. I don't know why, but it is a problem for me: until the Core is connected, setup() and the main loop are not executed.

  • I switched on the core (LED is white)
  • The Core tries to connect to my wifi (flashes green)
  • Then it blinks red 5 or 6 times, stops for a second, and blinks red 6 more times
  • After that it begins with the white LED and so on and on

All I can see in the server log is the following, x times over… somewhere between 20 and 40 tries in, the connection seems to be established and the code is executed… but that's not really helpful.

Connection from: 192.168.178.xx, connId: 12
on ready { coreID: 'xxxxx',
  ip: '192.168.178.xx',
  product_id: 65535,
  firmware_version: 65535,
  cache_key: undefined }
Core online!

Spark.publish() did not work

I used @bko's tutorial to stream my sensor readings to my local cloud with Spark.publish(). To get the data onto my website I also used bko's example with (in most cases) some JavaScript/JSON parsing magic. This all worked just fine with the Spark Cloud, but it stops working immediately when I run it against my local cloud. I know, I know, this is a beta. But as I mentioned at the beginning of this thread, I don't know if this is really a software problem. I'm not perfect, so I first check whether I made a mistake before I point my finger at the hardware/software!
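
For reference, the browser side of that approach boils down to subscribing to the cloud's server-sent event stream and parsing each event's JSON payload. A minimal sketch, assuming a local cloud at a placeholder address and token (both hypothetical here, as is the `temperature` event name); the parsing helper is pure, so it behaves the same against either cloud:

```javascript
// Parse the JSON payload of one server-sent event from the /v1/events stream.
// A payload typically looks like:
//   {"data":"23.5","ttl":"60","published_at":"...","coreid":"..."}
function parseEventData(json) {
  const evt = JSON.parse(json);
  return { value: evt.data, coreId: evt.coreid, publishedAt: evt.published_at };
}

// Browser side (bko-style): subscribe to a named event on the local cloud.
// URL and token are placeholders -- substitute your own server IP and token.
function subscribe(onReading) {
  const url = "http://192.168.178.20:8080/v1/events/temperature" +
              "?access_token=YOUR_ACCESS_TOKEN";
  const source = new EventSource(url);
  source.addEventListener("temperature", function (e) {
    onReading(parseEventData(e.data));
  });
  return source;
}
```

If the same page works against the public cloud but not the local one, the difference is likely on the server side of that stream, not in the parsing.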
There is an issue open on GitHub by @kennethlimcp, but again: has anyone else noticed the same? I need that feature and want to make further progress, but if I don't get results I am totally stuck.
Thanks to @Dave, who helped me with flashing new firmware and the CC3000 update, so I should have something like a deep_update.

So far so good; now I need your help! Does anybody have one of these problems, or is there anything I can do to bring some light into the dark? Is there an update for the beta? Something… anything :slight_smile: Help!

Thx a lot clyde

i don’t like to trash what @kennethlimcp wrote, but the “first time” instructions and “raspberry pi” instructions are still not clear and certainly don’t work for me. they also seem to contradict what’s on github regarding which keys you need to upload and if/when you have to break out of spark-cli.

i managed once to get the node app to see the spark but spark-cli couldn’t see it, subsequent attempts didn’t even get that far.

what’s worse is that i had a very hard time trying to get my core to work at all afterwards, i thought it was bricked for a moment. had to factory reset a couple of times, reflash the factory firmware, manually apply the deep update, use keys-doctor etc. my core was stuck on a static blue led when it was supposed to have a flashing blue led, then afterwards it wouldn’t connect to either cloud and was stuck with a green led.

perhaps if each step of the instructions said what they were supposed to be doing it would help, as then we could figure out if the instructions or our interpretation were wrong. it seems to me that you’re pointing your local spark-cli and the spark itself to the local cloud by giving it a new public key, and generating a new private key for your local install. it shouldn’t be that hard.

also the tutorials seem pretty macosx-centric as usual, oddly enough; i’d have thought linux would be the starting point, given that’s what’s on the remote cloud/build servers.

for the time being i’m not prepared to even try the local cloud again.

p.s. we really need to get static ip working, or at least some sort of spark-cli command to show your dhcp ip address.

Totally agree with that!

And what's up with the beta program? There was a time, I remember, when we had to register and then re-register. I had hoped there would be something like a separate forum/section/whatever. For beginners (oops, I am one, sorry for that wording) all this can be very confusing.

What are the right words: local cloud IDE, local cloud, cloud-cli, spark-cloud, core, firmware, deep_update, CC3000 patch, whatsoever. In any case, I'm missing some information on whether the beta has stopped or whether there is much more in the pipeline with higher priority than the spark-cloud beta. Give us some news about what's going on. If you need some time, then I won't ask every week. No offense, promise :wink:

1.) If you think the instructions are not clear, why not write your own or improve on them?

2.) Spark-server was released earlier than the expected summer deadline so that more people could explore and use it. There are definitely bugs, but the majority of the basic functionality works.

3.) I did not see you post any threads about the problems you faced with the local :cloud:, and we would have been happy to assist you in resolving them.

Some other community members had other issues and spent hours with me offline to resolve it.

It’s just offensive that someone would simply jump in and start ranting before even seeking help or troubleshooting the issue.

4.) What about mac-osx centric?

All my tutorials for spark-cli and spark-server include Windows installation instructions.

5.) Static ip

Did you ask about this anywhere in another thread?

Is there a github issue you posted on spark-cli?

Also, command line instructions on mac-osx and software installation on linux are pretty similar.

FYI, many of the Spark Elites are helping the community feed issues back, hoping to benefit everyone, in our FREE time.

I mentioned the beta program to @dave and they are working on it, but we need to give them some time to structure everything in the midst of hiring.

If you think we missed something, give feedback gracefully and rest assured someone will assist you with it. It’s just so uncool for anyone to jump on issues out of the blue.

i did say i didn’t want to trash what you’d done, which is why i never posted before. i was waiting for official documentation, which hasn’t materialised.

Inside voices, guys! Stay calm. We’re all good here. We’re all on the same team.

@clyde and @sej7278, feedback heard. spark-server is still very much a beta product, so it should not be expected to be completely stable upon its first release. If you have issues that need to be addressed, please add them to github:

If you can add your issues with a clear description of how to recreate the bug, that gives us the ammunition we need to fix them quickly.

@sej7278, the official documentation for spark-server is in the README file: https://github.com/spark/spark-server/

If there is something that you would like to see documented there that is not or is improperly documented, please also raise it as a github issue.

Maybe my English is not good enough to express myself clearly.
This is totally wrong. My first post was meant as a call to the community, asking whether someone has the same problems I have. Like I said:

I´m not perfect, therefore I search first if I made a mistake.

So I just wanted to ask… not to offend/attack anybody… no more, no less… So please keep up a good community spirit, guys.
And finally, have mercy on my English; ask if something I say offends you or is completely dumb.
I'm still willing to learn :smile:

ok, here’s what i’ve done. this is a mixture of the github instructions and @kennethlimcp’s tutorial, as neither of them works for me on its own. this doesn’t work either, but maybe it’s a way forward towards clarity.

factory reset:
    hold MODE, then press RESET; keep holding MODE for 10 secs until it flashes white

start server:

    node main.js

edit ~/.spark/spark.config.json (do this now, as ken says, not later as git says) to clear your current remote cloud login and point an empty account at the local cloud:

{
  "access_token": null,
  "username": null,
  "apiUrl": "http://192.168.0.25:8080"
}
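
Since a malformed spark.config.json is an easy way to end up talking to the wrong cloud, here's a quick sanity-check sketch; the field names are the ones shown above, while the validation rules themselves are my own assumption about what a "pointed at the local cloud" config should look like:

```javascript
// Check that a spark.config.json body points at a local cloud:
// access_token/username cleared, apiUrl set to an http(s) URL.
// Returns a list of problems; an empty list means the config looks sane.
function checkSparkConfig(text) {
  const cfg = JSON.parse(text);
  const problems = [];
  if (cfg.access_token !== null) problems.push("access_token should be null");
  if (cfg.username !== null) problems.push("username should be null");
  if (typeof cfg.apiUrl !== "string" || !/^https?:\/\//.test(cfg.apiUrl)) {
    problems.push("apiUrl should be an http(s) URL");
  }
  return problems;
}
```

Running it over the file before `spark setup` would catch a leftover access_token from the public cloud, which is one way to get confusing "offline" results later.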

enter dfu mode:
    hold MODE, then press RESET; keep holding MODE until it flashes yellow

load local public key onto spark (IP is your server not your spark):

    spark keys server default_key.pub.pem 192.168.0.25

enter listening mode:
    press RESET, then hold MODE for 3 secs until it flashes blue

get core id:

    spark identify

create local user:
(enter different user details from your normal spark account; don’t ctrl-c at any point, contrary to what ken says)

    spark setup

enter dfu mode:
    hold MODE, then press RESET; keep holding MODE until it flashes yellow

provision keys
(ken saves them and changes directory; git runs doctor, but from the wrong directory):

    cd core_keys
    spark keys doctor your_core_id

list sparks
(doesn’t work; it either returns none or says they are offline, contrary to what the debug messages on the server say):

    spark list
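
As a cross-check when `spark list` misbehaves, you can hit the cloud's REST API directly. A sketch, assuming the server address from the config above and a placeholder token; the endpoint path mirrors the public cloud's /v1/devices, and whether the beta server implements it identically is an assumption:

```javascript
// Build the device-list URL for a given cloud base URL and token.
function deviceListUrl(apiUrl, accessToken) {
  return apiUrl.replace(/\/$/, "") + "/v1/devices?access_token=" +
         encodeURIComponent(accessToken);
}

// Node example (requires a fetch-capable runtime); token is a placeholder.
async function listDevices() {
  const res = await fetch(deviceListUrl("http://192.168.0.25:8080", "YOUR_TOKEN"));
  return res.json();
}
```

If this returns your core but `spark list` does not, the problem is in the CLI side rather than the server.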

I will tell you guys my experience.
Yes, the local cloud has bugs. If you flash firmware through the local cloud, the Spark Core will flash red, then green, then red again… it behaves weirdly for about 2 minutes until it reconnects.
But if you flash the Tinker app, it connects instantly to your local cloud.

  • So I suggest flashing Tinker first, before any other code, even the most simple one.
    Solution for this bug: a fresh, clean installation fixed it (server, protocol, and CLI).

Another bug is the publish event, which does not work with the CLI/cURL.
Solution: it was fixed manually by @kennethlimcp; waiting for a patch.

Note: I must say that @kennethlimcp helped me for hours and hours to fix the cloud, and he is doing his best to improve it,
so I have to thank you for the tutorial and all the help.

Now, about the keys: I also had a bug there, and going back from the local cloud to the public one… suddenly the core never functioned properly, whatever I did. So I flashed it with the Wi-Fi deep update and then used :smile:

spark keys doctor your_core_id

The doctor command fixed it .

Note: all these tests were made a month ago. I don't know what version it is now, but I guess it's a bit slow. So please, guys, if you want to download it, be aware that it's still far from stable; that's why it's called a beta. I guess Spark released it so we can all improve it and help together, as a team, to make it better.

Thank you all .