Frequent Spark.variable calls

I have been using HTML/JavaScript to access the Core's Spark.variable values with getJSON, as follows:

var sparkCore = "https://api.spark.io/v1/devices/" + deviceID;

// Request one Spark.variable value and hand the JSON response to processGetJson.
function callGetJson(sparkVar)
{
  var requestURL = sparkCore + "/" + sparkVar + "/?access_token=" + accessToken;
  $.getJSON(requestURL, processGetJson);
}
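
For completeness, processGetJson is not shown above; a rough sketch of what it might look like, assuming the variable value comes back in a "result" field of the JSON response (the field names are assumptions, not confirmed from my code):

// Hypothetical callback: pull the variable value out of the cloud's JSON response.
// The "name" and "result" field names are assumptions about the response shape.
function processGetJson(data)
{
  console.log(data.name + " = " + data.result);
}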

It was working fine, but lately I have discovered an issue.
I am accessing 3 different variables with a single "button" event.
Once in a while, if I push too frequently before the responses come back, the Spark Cloud stops responding. When I check the device, it is flashing blue (meaning it has lost its WiFi credentials).

After the flashing, the device returns to normal.
Is this a known issue?

Do I need to put in a semaphore to protect against multiple calls to the Spark Cloud?

Thanks

After further testing with a semaphore, the result shows that the third Spark.variable call always times out with no response from the cloud. The order of the variables does not matter; I tried changing the order and the result is the same.

If I call just one Spark.variable from the button click, it is OK… I can push as fast as I can without any issue.

Apparently, it takes different variables in sequence to trigger this issue.
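
For reference, the failing pattern is roughly this; the button selector and variable names are placeholders, and callGetJson is the helper from my first post:

// Hypothetical button handler (selector and variable names are placeholders).
// All three requests go out back to back, before any response has returned.
$("#readButton").click(function () {
  callGetJson("temperature");
  callGetJson("humidity");
  callGetJson("pressure");   // the third variable request is the one that times out
});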

I did some further debugging with two request types:

  • GET (for Spark.variable)
  • POST (for the digitalread command)

Case 1:
Send the GET and POST commands using cURL in 2 different command windows.
Each window (a batch file) performs over 50 sequences of GET or POST commands asynchronously.
The result shows OK.

Case 2:
The same GET and POST commands in HTML (Chrome browser).
The sequence is: connect, then keep pushing the push button.
The Core loses its WiFi connection within just 2 tries of the same sequence back to back.
Attached is the timing capture from Chrome.
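
For reference, the POST command calls the digitalread function through the cloud API; a rough sketch of what I mean, assuming the pin name is passed in an "args" form field and the response carries a "return_value" field (both are assumptions for illustration):

// Hypothetical helper: call the "digitalread" function via POST.
// The "args" field name, the pin value "D7", and the "return_value" field
// in the response are assumptions, not copied from my actual code.
function callPostFunction(pin, done)
{
  var requestURL = sparkCore + "/digitalread";
  $.post(requestURL, { access_token: accessToken, args: pin }, done, "json");
}

callPostFunction("D7", function (data) {
  console.log("digitalread returned: " + data.return_value);
});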

Will the Spark team be able to investigate this further and confirm my finding?
Let me know if you need certain parts of the code to duplicate the issue. I could email them over.

Right now, I need to put in a semaphore to wait until each GET and POST completes before calling again. Then it is OK.
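
A rough sketch of that workaround, assuming a simple in-flight flag that drops button pushes while a request is still pending (the flag and the drop-while-pending policy are just my illustration, not the exact code):

// Hypothetical guard: allow only one request to the cloud at a time.
var requestInFlight = false;

function callGetJsonGuarded(sparkVar)
{
  if (requestInFlight) {
    return;                      // drop the button push while a request is pending
  }
  requestInFlight = true;
  var requestURL = sparkCore + "/" + sparkVar + "/?access_token=" + accessToken;
  $.getJSON(requestURL, processGetJson)
    .always(function () {
      requestInFlight = false;   // release the "semaphore" on success or failure
    });
}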

@Dave, I would like to bring this to your attention for some quick feedback :smiley:

Hi @Dilbert

I know that there have been some programming errors that looked like denial-of-service attacks because they hit the Spark API so hard, so it is possible that Spark has taken some steps to meter your API calls. Are you checking the return values of all these GET/POST requests? It might be good to look at those and see if there is a clue.
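
For example, the jqXHR object that $.getJSON returns can tell you why a request failed; a quick sketch based on your callGetJson helper (the logging is just illustrative):

// Log the HTTP status and error text whenever a variable request fails.
function callGetJsonLogged(sparkVar)
{
  var requestURL = sparkCore + "/" + sparkVar + "/?access_token=" + accessToken;
  $.getJSON(requestURL, processGetJson)
    .fail(function (jqXHR, textStatus, errorThrown) {
      console.log(sparkVar + " failed: " + jqXHR.status + " " + textStatus + " " + errorThrown);
    });
}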

I did monitor the return status, and it all returns OK before it stalls. Each time, it is a timeout on the GET or POST request, which gets no response back from the URL being called.

Since the HTML case uses callbacks, it is even more intense: it can just keep sending requests while waiting for each response to come back.
But cURL in a command window performs a send-receive sequence, so the traffic is much lighter.

Hi @Dilbert,

Right now the cloud won’t try to limit how often you ask your core for a variable value, or call a function on your core. Variable requests are generally pretty quick, but a function call can take a bit longer depending on what it’s doing. It’s possible to knock your core offline by sending it too many requests too quickly, but in theory that should be impractical (hundreds of simultaneous requests).

It looks like you’re calling ‘digitalread’ and getting a variable; what firmware are you using? Tinker only has function calls and no variables, so it must be custom firmware, no? Can you share your code?

Thanks,
David

Hi @Dilbert,

Thanks for sending me your code samples. I think what we’re seeing is a few instances where you’re sending 3 simultaneous variable requests, and 4 simultaneous function requests, which is probably overwhelming your core. I don’t like adding rate limiting code when I can avoid it, but the core also shouldn’t crash.

I’ve passed this along to our firmware team who will dig in.

Thanks!
David

Just FYI, we were discussing this today. I just added an issue for those who want to track:
