Questions about Firmware in general

Hey all!

I'm trying to understand how the factory reset is 'triggered', and I can't figure it out.

What I understand is that when you hold the Mode button for >= 7s, this gets set:

WLAN_DELETE_PROFILES = 1;

I searched the GitHub repository but can't work out how it then gets into listening mode (WiFiCredentialsReader::read).
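To sanity-check my mental model, here is a purely illustrative sketch of what I imagine happens. This is not the actual Spark firmware; everything except WLAN_DELETE_PROFILES is a made-up name:

    #include <stdint.h>

    volatile uint8_t WLAN_DELETE_PROFILES = 0;   /* the real flag from the firmware */

    /* Hypothetical handler called while the Mode button is held down. */
    void mode_button_held(uint32_t held_ms)
    {
        if (held_ms >= 7000)              /* held for >= 7 seconds */
        {
            WLAN_DELETE_PROFILES = 1;     /* request a credentials wipe */
        }
    }

    /* Hypothetical WLAN loop step: once the profiles are wiped, the Core has
     * nothing left to connect with, so it would fall back into listening mode,
     * which is where WiFiCredentialsReader::read would take over. */
    void wlan_loop_step(void)
    {
        if (WLAN_DELETE_PROFILES)
        {
            /* ... clear the stored Wi-Fi profiles here ... */
            WLAN_DELETE_PROFILES = 0;
        }
    }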


Another question:
@mohit @satishgn

Where can we find the factory reset firmware? I know there are V1 and V2 right now, but how often are they changed during production?

I would like to test the factory reset firmware during a factory reset of my Core, plus other things like WEP configuration, to catch more bugs :smile:

Thanks! :smiley:

I went down that same route one day, like … where is it??? But then I realized the only way it works is because it's in the bootloader:

For factory firmware, the highest number here wins:

However, I would probably go with the latest from the compile-server2 bin, because the spark_2 build has a really slow OTA process compared to newer code. You could also try the master repo, but not everything will be perfect in master.
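If you want to push a different factory binary onto the Core to experiment with, you can write it over DFU. Something along these lines should do it, assuming the usual Core memory map where the factory reset firmware sits at 0x20000 on the external flash (alt setting 1); double-check the address against the memory map before flashing, and factory_firmware.bin is just a placeholder for whichever binary you pick:

    dfu-util -d 1d50:607f -a 1 -s 0x20000 -D factory_firmware.bin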


I checked everything except the bootloader. Haha!

Thanks for the input :smiley:


@Dave SOS ALERT!

I downloaded the factory firmware to the wrong location and ended up at the private key location after missing a 0 in the address. Haha!

Now my Core is rapidly blinking yellow.

I tried the CLI command 'spark keys doctor core_id' but it says no DFU device found (I already placed the Core in DFU mode).
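For reference, this is the check I used to see whether the Core was visible over DFU at all; when it's in DFU mode it should show up as the 1d50:607f device:

    dfu-util -l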

It's fun when something I always wanted to try messing around with becomes a real problem that I need to resolve :smile:

EDIT:

I ran "spark keys new" and "spark keys send", and it was successful.

BUT I'm unable to update my private key using:

e:\dfu>dfu-util -d 1d50:607f -a 1 -s 0x2000 -v -D core.pem

error is:

Downloading to address = 0x00002000, size = 609
Poll timeout 50 ms
Download from image offset 00000000 to memory 00002000-00002260, size 609
Poll timeout 30 ms
Error during download get_status
Failed to write whole chunk: -7 of 609 bytes

It's fun to mess around with this, but well… I'll get some sleep and check your inputs in the morning :sleeping:

Hey @kennethlimcp,

Just got the email about this from the forums; not to worry! It's the “.der” file that needs to be copied onto your Core, and you can do this with the CLI:

spark keys load core.der

The Core’s DFU interaction has a quirk where files need to be an even number of bytes; the CLI will pad the key with an extra byte to work around that. If the CLI isn’t finding the Core, which would be weird if it shows up with dfu-util -l, try running spark keys new until you get a ‘.der’ file with an even number of bytes, then send and load that key.
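So the whole dance would look roughly like this, assuming the generated files keep the core.* names you used above (the dir line is just to check that core.der comes out with an even byte count; regenerate until it does, and swap in your own core ID):

    spark keys new
    dir core.der
    spark keys send your_core_id core.pub.pem
    spark keys load core.der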

Thanks!
David

@Dave, I did as you mentioned above:

  1. I have a core.der of 608 bytes which I managed to generate, and I loaded it with:

     spark keys load core.der

  2. I used dfu-util -d 1d50:607f -a 1 -s 0x2000 -D core.der to upload the key to my Core.

  3. I ran spark keys send 48ff6a065067555008342387 core.pub.pem to update the cloud with my new key.

But the Core still did not connect.

Hi @kennethlimcp,

Hmm… I only have two public keys for your Core: the factory key and one that you uploaded. Didn’t you say earlier that you had also uploaded one?

I did, using the CLI, and it said it was successful :smiley:

Does that mean the API call to do that through the CLI didn't work as expected? I can give you the pub.pem file while we figure that out.

EDIT: Ah @Dave, I got it running now. I used ‘0x00002000’ instead of ‘0x2000’; I had missed one 0 at the front!

So it seems like ‘spark keys doctor’ has an issue on Windows as well, since dfu-util and OpenSSL are in two different folders…
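In case it helps anyone else on Windows, the workaround I'd try is making sure both tools are reachable from the same prompt before running the doctor command (this is just a guess at what the CLI expects):

    where dfu-util
    where openssl
    spark keys doctor 48ff6a065067555008342387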

The API won’t insert duplicate keys, so it would be a question of whether you had sent different keys multiple times, etc. Glad you got it back up and running! CLI success (sorta) :slight_smile:

@Dave I had a fun time doing this, though most people would probably be screaming if it happened by accident :stuck_out_tongue:

Speaking of which, I made some comments about this on the spark-cli repo for your input when you are free :smile:

Hi @kennethlimcp,

Thanks! Much appreciated. :slight_smile: Which issue on spark-cli should I be looking at? I think the only one updated recently is the “serial wifi” backwards-compatibility issue.

Thanks,
David
