High latency when calling a particle.io function with node/curl

Hi, I'm trying to call a Particle function from Node.js via the Particle Cloud API (api.particle.io).
The call succeeds, but with a 1500-2000 ms delay.
I tried curl and got the same high delay.

I also tried the JavaScript SDK's device.callFunction; this time it goes up to 2000+ ms.

I can getEventStream and receive events in time (<200 ms), but not callFunction.

Here's my Node.js request code:

    request.post({
        url: "https://api.particle.io/v1/devices/" + data.coreid + "/blink",
        form: { access_token: sparkAccessToken }
    }, function (err, res, body) { /* handle response */ });

Here's my curl command:

    curl "https://api.particle.io/v1/devices/{devid}/blink" --data "access_token={acctoken}"
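As a side note, curl itself can show where the time goes per request: its `--write-out` variable `%{time_appconnect}` covers the TLS handshake, so a wrapper like this sketch would reveal how much of the delay is handshake overhead (the `{devid}`/`{acctoken}` placeholders are from the command above):

```shell
# time_call URL [curl-args...] : print a timing breakdown for one request.
# %{time_appconnect} is the point the TLS handshake finished.
time_call() {
  curl -s -o /dev/null \
    -w 'TLS handshake: %{time_appconnect}s  total: %{time_total}s\n' \
    "$@"
}

# Usage against the Particle API ({devid} and {acctoken} are placeholders):
# time_call "https://api.particle.io/v1/devices/{devid}/blink" --data "access_token={acctoken}"
```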

But when calling from the browser using https://github.com/hpssjellis/spark-core-web-page-html-control, it works (<200 ms), using the same function name, coreid, and access token.

Am I missing something?

PS: I'm using a Particle Core while awaiting my Photon shipment.

UPDATE: I tried running the same code on DigitalOcean's Singapore server, but it still takes 1400-2000 ms.

Hi @lpeachl,

Where are you in the world, and what kind of speed / throughput do you get when testing from your connection to the east or west coast of the United States? (We'll be adding more regions worldwide later this year as well :slight_smile: )


I'm connecting from Thailand. I use an FTTx connection with a 50 Mbps max download speed; I get around 35 Mbps to the US west coast with a ~250 ms ping time.

Neither connection nor location is the problem, because:

I tried using the browser to send the function call via XHR and it responded in time (around 200 ms), but not with the Particle JavaScript SDK, Node's request module, Node's native https client class, or curl.

Hi @lpeachl,

Cool, thanks for testing! When sending commands via the cloud, your latency comes into play on each hop (you -> cloud -> device -> cloud -> you), i.e. 2 hops to reach your device and 2 back to get the result. So a roundtrip time of around 1 s would make sense with a 250 ms ping. When we add a cloud region there later this summer, you should see dramatically improved latency! :slight_smile:

It's interesting that you're seeing different performance based on the app you're using. Sometimes Node apps take a second or two to spin up the first time. Are you running the Node app and then immediately stopping it, i.e. using a fresh process for each request? (That would cause extra delay.)

If you need faster command turnaround, you could use direct TCP / UDP messages to your device. A good example of this is the ‘voodoospark’ app and the firmata libraries.

I hope that helps! :slight_smile:


Thanks for replying.

I know that the round trip for my function call should be 1 s+, but the 1.5-2 s I measured is not the roundtrip you mentioned.

I set it up like this:

1. The Core sends a signal to Node via serial when I press a switch connected to D1.
2. Node makes the function call to particle.io using the JS SDK or the request library.
3. The function call reaches the Core, which measures the time elapsed since I pressed the button.
4. The Core sets a blink flag so loop() blinks D7.
5. The Core returns the elapsed time in ms.

So the flow should be:

node -> cloud -> core

Even with a 250 ms ping, the measured time should be about 500 ms.

I got 1500 ms to 2000 ms returned from the function call, but the time measured on the Node side is much higher (around 4 s).

I don't care if the call takes longer to return to Node, but I want the Core to get the request in time (<500 ms) after Node calls it.

But when I use the browser to send the request via XHR, the LED on the Core blinks almost immediately (under 500 ms).

I tested some more this morning. It seems Node is able to send a call to the US-west web server (www.particle.io) in time using the request library (220-280 ms).

But when I send an invalid API call to api.particle.io and measure the time it takes, I get around 1100-1500 ms.

This does not happen when I run both tests from Google Chrome.

Node.js request (measured using console.time() before sending and console.timeEnd() after receiving the response):

www.particle.io -> 224ms
api.particle.io -> 1149ms

Google Chrome result (measured using the network debugger's request timing):

www.particle.io -> 220ms
api.particle.io -> 280ms
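A minimal timing harness along these lines (a sketch, not the poster's exact code) reproduces the Node-side measurement; the commented usage assumes the same request library from the thread:

```javascript
// Minimal timing helper: run an async operation and report elapsed milliseconds.
function timeIt(fn, done) {
  var start = Date.now();
  fn(function () {
    done(Date.now() - start);
  });
}

// Usage sketch against both hosts, as in the thread (requires the "request"
// module; uncomment to run):
// timeIt(function (cb) { require("request")("https://api.particle.io/", cb); },
//        function (ms) { console.log("api.particle.io -> " + ms + "ms"); });
```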

I think I found the problem.

When I tested Node against www.particle.io, I was missing the "s" in https.

Node.js didn't cache the SSL session, so every time I make an API call, Node needs to do a full SSL handshake.

Ahh, awesome! Cool to see such thorough testing! :slight_smile:


Is there a way I can bypass the SSL handshake when using Node?

Something like a persistent SSL connection stream for function calls?

Most browsers will cache and re-use the initial certificate download / handshake step, I believe. I think you can do some fancy TLS session management with Node; see:

https://nodejs.org/api/tls.html#tls_tlssocket_getsession and sessionIdContext I think?


Reporting back (sorry for the delay).

I tried your suggestion, implementing an HTTPS client on top of a TLSSocket and storing the session.

Then I used the stored session to do TLS session resumption when reconnecting to the API server. (It's not sessionIdContext but tls.connect's options.session; sessionIdContext is for creating a server.)


without session:

request 0: 1728ms
request 1: 1665ms
request 2: 1735ms
request 3: 1611ms
request 4: 1673ms
request 5: 1614ms
request 6: 1671ms
request 7: 1663ms
request 8: 1614ms
request 9: 1610ms

with session:

request 0: 1697ms
request 1: 1396ms
request 2: 1384ms
request 3: 1364ms
request 4: 1392ms
request 5: 1356ms
request 6: 1314ms
request 7: 1352ms
request 8: 1375ms
request 9: 1337ms

As you can see, the results are still far from Chrome's ~300 ms.

I wonder if it's because of my implementation of the HTTPS client, or if Chrome does something else to speed up TLS.

I can't send my data in plain text, because I plan on sending sensitive customer data and I can't let someone on the customer's Wi-Fi sniff it.

But I'm giving up on this. I might use Chrome to send it, or host a US server.

Thank you very much.

Edit: @Dave, do you have a suggestion on cloud hosting?

Hi @lpeachl,

If you want to be close to our main server region right now, anywhere in the US should work. My hope is that we'll have another region closer to you in the next few months. I'm also intending to offer APIs with less overhead than vanilla HTTPS. :slight_smile: