Local Cloud - Lost Connection After Any Function Call [solved]

Hello Guys,

I’m playing with my Spark Core against a local cloud setup on my computer. I’m having a problem calling any of my registered functions through the cloud server; a curl request and a CLI call both hit the same problem:

When I call the function, the server sends the function call to the Spark Core, but the core just doesn’t seem to respond, and then the connection is lost. I tried manually putting a Spark.process() call inside loop(), but it didn’t solve the problem. The server-side logs are below. Has anybody run into this before?

FunCall { coreID: '53ff71065075535142331387',
  user_id: 'UbZUu0R+Cl37I170+AUcX9SoOskQvYOf' }
FunCall - calling core  { coreID: '53ff71065075535142331387',
  user_id: 'UbZUu0R+Cl37I170+AUcX9SoOskQvYOf' }
onSocketData called, but no data sent.
1: Core disconnected: socket close false { coreID: '53ff71065075535142331387',
  cache_key: '_24',
  duration: 180.515 }
Session ended for _24
Connection from: 192.168.1.102, connId: 26
192.168.1.100 - - [Tue, 09 Sep 2014 18:59:30 GMT] "POST /v1/devices/53ff71065075535142331387/myfunc HTTP/1.1" 408 27 "-" "-"
on ready { coreID: '53ff71065075535142331387',
  ip: '192.168.1.102',
  product_id: 65535,
  firmware_version: 65535,
  cache_key: '_25' }
Core online!

By the way, I’m using the updated protocol and the latest server and firmware code from the GitHub master branch, with SYSTEM_MODE(AUTOMATIC);. My application.cpp is as follows:

#include "application.h"

int myFunc(String command);

SYSTEM_MODE(AUTOMATIC);

void setup()
{
    pinMode(D7, OUTPUT);  // configure the pin before calling digitalWrite() on it
    Spark.function("myfunc", myFunc);
}

void loop()
{
    //Spark.process();
}

int myFunc(String cmd)
{
    digitalWrite(D7, HIGH);
    return 1;
}

CLI command:

spark call 53ff71065075535142331387 myfunc "asdf"
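For reference, the equivalent raw HTTP call with curl would look something like the following. This is a sketch: the server IP and device ID are taken from the logs in this thread, the port 8080 is assumed to be the local cloud’s default HTTP API port, and `<your_access_token>` is a placeholder for a token issued by your own local cloud.

```shell
# Hypothetical curl equivalent of the CLI call above (assumed port 8080,
# placeholder access token - adjust both to your own local cloud setup).
curl http://192.168.1.100:8080/v1/devices/53ff71065075535142331387/myfunc \
  -d access_token=<your_access_token> \
  -d args=asdf
```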

Did you copy and paste this from your original source? Only the comment is the wrong way around (it won’t compile like that). Please edit your post to include your original source, and I’ll try to reproduce the problem.

Hi @mdma

That’s the original code that I copied and pasted; I only added that one Spark.process(); line to show that I had tried having it there before, but I deleted it later.

Thanks!
Yan

Still, @mdma was right. Don’t comment that way; you’ll forget it’s there later and experience hell when your code grows.

As for your problem: I think the Core disconnects after about 20 seconds in automatic mode if the cloud isn’t serviced. Having a delay() in your loop should solve the problem, though, since delay() calls Spark.process() to ping the cloud on every invocation.

Thanks @metaculus. I’ve updated my application.cpp as follows, but I still have the same problem; it seems the core just won’t call Spark.process(), or something else weird is happening…

#include "application.h"

int myFunc(String command);

SYSTEM_MODE(AUTOMATIC);

void setup()
{
    pinMode(D7, OUTPUT);  // configure the pin before calling digitalWrite() on it
    Spark.function("myfunc", myFunc);
}

void loop()
{
    delay(1);
    //Spark.process();
}

int myFunc(String cmd)
{
    digitalWrite(D7, HIGH);
    return 1;
}

Your server IP address is: 192.168.1.100
server started { host: 'localhost', port: 5683 }
Connection from: 192.168.1.102, connId: 1
on ready { coreID: '53ff71065075535142331387',
  ip: '192.168.1.102',
  product_id: 65535,
  firmware_version: 65535,
  cache_key: '_0' }
Core online!
FunCall { coreID: '53ff71065075535142331387',
  user_id: 'UbZUu0R+Cl37I170+AUcX9SoOskQvYOf' }
FunCall - calling core  { coreID: '53ff71065075535142331387',
  user_id: 'UbZUu0R+Cl37I170+AUcX9SoOskQvYOf' }
192.168.1.100 - - [Wed, 10 Sep 2014 06:37:12 GMT] "POST /v1/devices/53ff71065075535142331387/myfunc HTTP/1.1" 408 27 "-" "-"
onSocketData called, but no data sent.
1: Core disconnected: socket close false { coreID: '53ff71065075535142331387',
  cache_key: '_0',
  duration: 170.186 }
Session ended for _0
Connection from: 192.168.1.102, connId: 2
on ready { coreID: '53ff71065075535142331387',
  ip: '192.168.1.102',
  product_id: 65535,
  firmware_version: 65535,
  cache_key: '_1' }
Core online!

I also tried the original Tinker firmware: after each function call (POST), the core loses the connection, and the curl/CLI command returns a timeout error.

I’ll give it a try when I have time. But for now: why are you declaring myFunc() at the top of your program? I saw that in the example too, but calling Spark.function() without it seemed to work just fine.

I’m also wondering what the server means when it says:

onSocketData called, but no data sent.

No special reason for doing that. It’s a .cpp file, and people do declare functions before main(). We have no main() in application.cpp, of course; I was just following the Tinker example.

@FlyingYanz maybe you can try something like:

void loop() {
  if (Spark.disconnected()) Spark.connect();  // reconnect if the cloud link drops
}

Regarding this question of @metaculus:

A declaration of a function without an implementation is called a function prototype, and using them is considered good practice. Normally these prototypes are found in header files, so that anybody using a precompiled lib knows what the function signatures look like.
Another reason for using them is that not all compilers like to find calls to functions they have not “heard” of before; they will complain with something like ... not declared in this scope. The prototype tells the compiler about the existence of the function and its signature, and that an implementation will follow.

If your compiler does not complain, it might be set up to first “scan” the code completely, register all declared functions/variables, and then compile in a subsequent pass.


Thank you @ScruffR for the heads up. I have to go brush some dust off my C++ :smile:

Hey Guys,

In automatic mode you shouldn’t need to be doing any connection management. There’s no reason the core should be dropping the connection after every function call even with tinker. Hmm… Have you modified the server code, or do you have breakpoints set in it at all? Something that would interrupt it?

Thanks,
David

@Dave, could it be that something blocking in @FlyingYanz’s code exceeds the Spark cloud’s timeout?

Hi @metaculus,

It’s possible it’s something blocking in the firmware, but it’s unlikely since he said he tested it against Tinker. I suspect it’s something different about the server setup, since he’s using the local cloud in this case. :slight_smile:

Thanks,
David

Hi Guys,

The good news is that I got the problem resolved; the bad news is that I have no idea what the problem was. So I’ll just list what I did here. If anybody else has the same problem, try the following steps.

  1. git clone and compile the Core-Firmware locally, with no changes, then flash it onto the core.
    OBS: the core can connect to my local cloud (breathing cyan), but it loses the connection after any function call, even the CLI command spark list.

  2. Redirect the core and CLI to the Spark cloud:
    CLI: spark keys server cloud_key.pub.pem
    OBS: the core kept flashing cyan, having trouble connecting to the public cloud.

  3. Put the core into Listening mode and try to set it up again with my iPhone.
    OBS: the iPhone app was not able to find the Spark Core.

  4. Perform a deep update:
    CLI: spark flash --usb deep_update_2014_06
    OBS: the core connected to the public cloud, and I was able to access the core’s function through curl.

  5. Point the core back to the local cloud (on my Raspberry Pi).
    OBS: it works! sigh… All function calls succeed, with no lost connections.

  6. git clone the core-firmware folder again, update application.cpp with the code above, make, and flash.
    OBS: it also works! Orz…
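For anyone retracing the steps above, the key CLI commands were roughly as follows. This is a sketch assembled from the posts in this thread: the key and firmware file names are the ones mentioned above, and `<local_cloud_key.pub.pem>` is a placeholder for your own local cloud’s public server key.

```shell
# Step 2: point the core's server key at the public Spark cloud
spark keys server cloud_key.pub.pem

# Step 4: apply the deep update over USB (core in DFU mode)
spark flash --usb deep_update_2014_06

# Step 5: point the core back at the local cloud
# (<local_cloud_key.pub.pem> is a placeholder for your own key file)
spark keys server <local_cloud_key.pub.pem>
```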

I kind of feel the deep update somehow worked some magic and fixed something I messed up before? Maybe.

This is really interesting, guys.
