Local cloud server connection problem: Spark Core LED turns red and then resets

Hello Guys,

I was trying to set up a local Spark server on my computer and have run into a problem:

After using spark setup to configure my Spark Core's Wi-Fi connection, the server first prints a "Core online!" message, then the core's LED turns red and it restarts. After that, the whole system gets stuck in a loop: the core wakes up, talks to the server, the server detects the connection, the core shows the red signal, resets, and wakes up again.

Here's the output from the spark-server terminal:

----END PUBLIC KEY-----

Your server IP address is: 192.168.1.100
server started { host: 'localhost', port: 5683 }
Connection from: 192.168.1.103, connId: 1
on ready { coreID: '53ff71065075535142331387',
ip: '192.168.1.103',
product_id: 0,
firmware_version: 0,
cache_key: '_0' }
Core online!
Connection from: 192.168.1.103, connId: 2
on ready { coreID: '53ff71065075535142331387',
ip: '192.168.1.103',
product_id: 0,
firmware_version: 0,
cache_key: '_1' }
Core online!
Connection from: 192.168.1.103, connId: 3
on ready { coreID: '53ff71065075535142331387',
ip: '192.168.1.103',
product_id: 0,
firmware_version: 0,
cache_key: '_2' }
Core online!

Has anybody seen this before?

Thanks,
Yan

If you are compiling locally, you can use the workaround explained here for now:

The best approach to resolving this issue is to compile locally using the core-firmware's latest "master" branch.

I've tried this several times. Now the core and the local server are able to connect, but only for a few seconds; then the connection drops, and the core reboots and reconnects.

Here's exactly what I did, along with the screen output.

Has anyone run into this situation before? Any suggestions?

Thanks,
Yan

Mac OS 10.9.4
dfu-util 0.7
gcc-arm-none-eabi-4_8-2014q2

CLI installed successfully.

git clone https://github.com/spark/spark-server.git
git clone https://github.com/spark/core-firmware.git
git clone https://github.com/spark/core-common-lib.git
git clone https://github.com/spark/core-communication-lib.git

  1. core factory reset

  2. edit core-firmware/spark-utilities.cpp and comment out line 617: //Multicast_Presence_Announcement();

  3. go to core-firmware/build and run make all

  4. dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware.bin
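Step 2 can also be done with sed instead of a manual edit; a minimal sketch, shown against a stand-in file so it is easy to try (point the path at your real core-firmware checkout, where the file name and line number may differ by branch):

```shell
# Sketch: comment out the Multicast_Presence_Announcement() call before
# building. spark_utilities_demo.cpp is a stand-in for the real file.
FILE=spark_utilities_demo.cpp
printf '    Multicast_Presence_Announcement();\n' > "$FILE"
# Prefix the call with // so the compiler skips it (keeps a .bak copy)
sed -i.bak 's|\([[:space:]]*\)Multicast_Presence_Announcement();|\1//Multicast_Presence_Announcement();|' "$FILE"
cat "$FILE"
```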

message:

dfu-util 0.7

Copyright 2005-2008 Weston Schmidt, Harald Welte and OpenMoko Inc.
Copyright 2010-2012 Tormod Volden and Stefan Schmidt
This program is Free Software and has ABSOLUTELY NO WARRANTY
Please report bugs to dfu-util@lists.gnumonks.org

Filter on vendor = 0x1d50 product = 0x607f
Opening DFU capable USB device... ID 1d50:607f
Run-time device DFU version 011a
Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=0, name="@Internal Flash  /0x08000000/20*001Ka,108*001Kg"
Claiming USB DFU Interface...
Setting Alternate Setting #0 ...
Determining device status: state = dfuERROR, status = 10
dfuERROR, clearing status
Determining device status: state = dfuIDLE, status = 0
dfuIDLE, continuing
DFU mode device DFU version 011a
Device returned transfer size 1024
No valid DFU suffix signature
Warning: File has no DFU suffix
DfuSe interface name: "Internal Flash "
Downloading to address = 0x08005000, size = 78940
...
File downloaded successfully
Transitioning to dfuMANIFEST state
Error during download get_status

cd spark-server/js
npm install
node main.js

message:
npm install

npm WARN package.json spark-server@0.1.1 No description
npm WARN package.json spark-server@0.1.1 No README data

> ursa@0.8.0 install /Users/Yanz/work/spark/spark-server/js/node_modules/ursa
> node-gyp configure build && node install.js

CXX(target) Release/obj.target/ursaNative/src/ursaNative.o
In file included from ../src/ursaNative.cc:3:
../src/ursaNative.h:6:9: warning: 'BUILDING_NODE_EXTENSION' macro redefined
#define BUILDING_NODE_EXTENSION
^
<command line>:4:9: note: previous definition is here
#define BUILDING_NODE_EXTENSION 1
^
1 warning generated.
CXX(target) Release/obj.target/ursaNative/src/asprintf.o
SOLINK_MODULE(target) Release/ursaNative.node
SOLINK_MODULE(target) Release/ursaNative.node: Finished
xtend@4.0.0 node_modules/xtend

node-oauth2-server@1.5.3 node_modules/node-oauth2-server

when@3.4.4 node_modules/when

ursa@0.8.0 node_modules/ursa

when@3.4.4 node_modules/when

request@2.40.0 node_modules/request
├── json-stringify-safe@5.0.0
├── forever-agent@0.5.2
├── aws-sign2@0.5.0
├── oauth-sign@0.3.0
├── stringstream@0.0.4
├── tunnel-agent@0.4.0
├── qs@1.0.2
├── node-uuid@1.4.1
├── mime-types@1.0.2
├── tough-cookie@0.12.1 (punycode@1.3.1)
├── http-signature@0.10.0 (assert-plus@0.1.2, asn1@0.1.11, ctype@0.5.2)
├── hawk@1.1.1 (cryptiles@0.2.2, sntp@0.2.4, boom@0.4.2, hoek@0.9.1)
└── form-data@0.1.4 (mime@1.2.11, async@0.9.0, combined-stream@0.0.5)

hogan-express@0.5.2 node_modules/hogan-express
└── hogan.js@3.0.2 (mkdirp@0.3.0, nopt@1.0.10)

moment@2.8.1 node_modules/moment

spark-protocol@0.1.4 node_modules/spark-protocol
├── buffer-crc32@0.2.3
├── h5.buffers@0.1.1
├── h5.coap@0.0.0
└── hogan.js@3.0.2 (mkdirp@0.3.0, nopt@1.0.10)

ursa@0.8.0 node_modules/ursa

express@3.4.8 node_modules/express
├── methods@0.1.0
├── merge-descriptors@0.0.1
├── range-parser@0.0.4
├── debug@0.8.1
├── cookie-signature@1.0.1
├── fresh@0.2.0
├── buffer-crc32@0.2.1
├── cookie@0.1.0
├── mkdirp@0.3.5
├── commander@1.3.2 (keypress@0.1.0)
├── send@0.1.4 (mime@1.2.11)
└── connect@2.12.0 (uid2@0.0.3, pause@0.0.1, qs@0.6.6, bytes@0.2.1, raw-body@1.1.2, batch@0.5.0, negotiator@0.3.0, multiparty@2.2.0)

  7. spark keys server default_key.pub.pem 192.168.1.100

  8. spark setup with Wi-Fi info
     msg: core LED breathing blue, then flashing cyan
     msg from server:

Expected to find public key for core 53ff71065075535142331387 at /Users/Yanz/work/spark/spark-server/js/core_keys/53ff71065075535142331387.pub.pem
onSocketData called, but no data sent.
1: Core disconnected: socket close false { coreID: '53ff71065075535142331387', cache_key: '_1' }
Session ended for _1

  9. spark keys save core_ID

     core online, breathing cyan temporarily... then flashing cyan again:

message from the server:

on ready { coreID: '53ff71065075535142331387',
ip: '192.168.1.101',
product_id: 65535,
firmware_version: 65535,
cache_key: '_2' }
Core online!
onSocketData called, but no data sent.
1: Core disconnected: socket close false { coreID: '53ff71065075535142331387',
cache_key: '_2',
duration: 25.064 }
Session ended for _2
Connection from: 192.168.1.101, connId: 4
on ready { coreID: '53ff71065075535142331387',
ip: '192.168.1.101',
product_id: 65535,
firmware_version: 65535,
cache_key: '_3' }
Core online!
routeMessage got a NULL coap message { coreID: '53ff71065075535142331387' }
got counter 22158 expecting 22157 { coreID: '53ff71065075535142331387' }
1: Core disconnected: Bad Counter { coreID: '53ff71065075535142331387',
cache_key: '_3',
duration: 0.023 }
Session ended for _3
SparkCore - sendReply before READY { coreID: '53ff71065075535142331387' }
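For anyone following along: the "Expected to find public key" message in step 8 means the local server looks up the connecting core's key by device ID under spark-server/js/core_keys/. A sketch of the expected layout (demo-server is a stand-in directory so this is easy to try; in practice the .pub.pem comes from `spark keys save <coreID>` with the core in DFU mode, not from printf):

```shell
# Sketch: the local cloud expects core_keys/<coreID>.pub.pem.
# "demo-server" stands in for spark-server/js; the printf key is a
# placeholder for the real key written by `spark keys save <coreID>`.
COREID=53ff71065075535142331387
mkdir -p demo-server/core_keys
printf -- '-----BEGIN PUBLIC KEY-----\nplaceholder\n-----END PUBLIC KEY-----\n' \
  > "${COREID}.pub.pem"
# Put the key where the server looks for it, then restart the server
cp "${COREID}.pub.pem" demo-server/core_keys/
ls demo-server/core_keys
```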

Hi @FlyingYanz,

This feels familiar; have you seen this thread?

http://community.spark.io/t/local-cloud-sos-panic-flash-with-user-firmware-solved/6161

I think the bad counter issue is related to a bug I discovered recently and patched. I haven't distributed the patch yet, but should tomorrow. So if you upgrade tomorrow night, I'm hoping it'll be fixed. :slight_smile:

Thanks,
David

That’s great @Dave!

I'll wait for that. So should I just grab the latest spark-server code from GitHub and run it?

Btw, I'm new to this area and really want to learn by playing with my Spark Core, but I frequently get stuck on basic stuff and don't know how to solve it. It's so nice that you guys always actively participate in the community page, answering questions and helping beginners like me. I really appreciate it!

Thanks,
Yan

Okay guys, I just published the update to the spark-protocol module, so you can either do an npm update or:

cd spark-server/js
rm -rf node_modules/spark-protocol
npm install

Thanks!
David

Problem solved!

Thanks!
Yan


Woo hoo! :slight_smile:

Glad that’s working for you!

Thanks!
David