Every time I publish something to my local server I get the error below in my server window. I still see the correct data in the CLI windows. I'm not sure whether this is actually a problem, but I thought I'd ask. I'm running a Photon against a local server on a Raspberry Pi.
Your server IP address is: 192.168.0.10
server started { host: 'localhost', port: 5683 }
Connection from: ::ffff:192.168.0.24, connId: 1
on ready { coreID: '2b0034001547343339383037',
ip: '::ffff:192.168.0.24',
product_id: 6,
firmware_version: 65535,
cache_key: '_0' }
Core online!
onSocketData called, but no data sent.
routeMessage got a NULL coap message { coreID: '2b0034001547343339383037' }
1: Core disconnected: socket close false { coreID: '2b0034001547343339383037',
cache_key: '_0',
duration: 4.168 }
Session ended for _0
Connection from: ::ffff:192.168.0.24, connId: 2
on ready { coreID: '2b0034001547343339383037',
ip: '::ffff:192.168.0.24',
product_id: 6,
firmware_version: 65535,
cache_key: '_1' }
Core online!
My Particle is receiving BLE connections, and those are reported through the server.
So I can have 8+ messages per second being pushed.
With only one device connected the messages go through (1/sec), but as soon as I bump to more than two the messages stop. I can see the Particle “sending” them, but none come through on the server.
Any idea why that could be? By the way, I removed the throughput limit that the original code had.
This still sounds as if you were hitting the documented limit of one Particle.publish() per second (with bursts of up to four allowed, followed by a four-second pause).
Where have you removed that limit?
That limit is part of the Particle firmware on your device. To lift that restriction for your local server, you'd need to alter the "framework" build locally and reflash that tweaked firmware to the device.
communication/src/publisher.h – the function is_rate_limited must return true
communication/src/spark_protocol.cpp – the function bool SparkProtocol::send_event, which checks the rate per second
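The limit described above behaves like a small token bucket: bursts of up to four events are allowed, and the bucket refills at roughly one event per second. Here is a minimal standalone sketch of that kind of logic — this is illustrative only, not the actual code from publisher.h; the class name, method signature, and internals are all made up for the example:

```cpp
#include <cassert>
#include <cstdint>

// Illustrative token-bucket rate limiter: a burst budget of `burst`
// events, refilled at one token per `refill_ms` milliseconds.
// (Hypothetical sketch, NOT the real Particle firmware implementation.)
class PublishRateLimiter {
public:
    explicit PublishRateLimiter(uint32_t burst = 4, uint32_t refill_ms = 1000)
        : capacity_(burst), tokens_(burst), refill_ms_(refill_ms), last_ms_(0) {}

    // Returns true when an event at time now_ms would exceed the limit.
    bool is_rate_limited(uint32_t now_ms) {
        // Refill one token for each full refill interval that has elapsed.
        uint32_t elapsed = now_ms - last_ms_;
        uint32_t refills = elapsed / refill_ms_;
        if (refills > 0) {
            tokens_ = (tokens_ + refills > capacity_) ? capacity_
                                                      : tokens_ + refills;
            last_ms_ += refills * refill_ms_;
        }
        if (tokens_ == 0) {
            return true;  // budget exhausted: this publish would be dropped
        }
        --tokens_;        // spend one token for this event
        return false;
    }

private:
    uint32_t capacity_;   // maximum burst size
    uint32_t tokens_;     // tokens currently available
    uint32_t refill_ms_;  // interval per refilled token
    uint32_t last_ms_;    // timestamp of the last accounted refill
};
```

With these numbers, four back-to-back publishes succeed, the fifth is limited, and after a one-second pause one more goes through — which matches the behavior reported above when more than one device pushes events. Making such a check always report "not limited" is effectively what the patch discussed in this thread does.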
Hi @straccio, shouldn't this return false instead?
what an excellent find for SparkCore.js – I'll be testing this out as well!
@marfife indeed it's a pretty lonely space for spark-server – you can tell just from the name that it hasn't seen any official updates, but at least we've been able to keep it going through these incremental bugfixes from the community.