The device keys are 1024-bit RSA keys, stored in PKCS#1 (v1.5) DER format, which means the private key is usually around 612 bytes. The cloud public key is 2048 bits and is around 294 bytes.
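As a rough sanity check of those sizes, here is a sketch using the third-party `cryptography` package (an assumption on my part; the actual spark-protocol implementation is Node.js, and key sizes can vary by a few bytes depending on the generated key):

```python
# Sketch: generating keys of the sizes mentioned above and measuring
# their DER encodings. The "cryptography" package is an assumption,
# not part of the spark-protocol codebase.
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives import serialization

# 1024-bit device private key, serialized as PKCS#1 RSAPrivateKey DER
device_key = rsa.generate_private_key(public_exponent=65537, key_size=1024)
private_der = device_key.private_bytes(
    encoding=serialization.Encoding.DER,
    format=serialization.PrivateFormat.TraditionalOpenSSL,  # PKCS#1
    encryption_algorithm=serialization.NoEncryption(),
)
print(len(private_der))  # typically around 608-612 bytes

# 2048-bit cloud public key, serialized as SubjectPublicKeyInfo DER
cloud_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_der = cloud_key.public_key().public_bytes(
    encoding=serialization.Encoding.DER,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)
print(len(public_der))  # 294 bytes for a 2048-bit key with e=65537
```

The ~294-byte figure matches a 2048-bit key wrapped in the standard SubjectPublicKeyInfo structure; the private key length varies slightly from key to key because of DER integer encoding.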
Core creates a protobufs Hello with counter set to the uint32 represented by the most significant 4 bytes of the IV, encrypts the protobufs Hello with AES, and sends the ciphertext to Server.
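The counter derivation described above can be sketched in a few lines of stdlib Python, assuming the "most significant 4 bytes" are read big-endian (the helper name below is mine, not from the protocol code):

```python
# Sketch: extracting the uint32 counter from the top 4 bytes of the
# 16-byte AES IV exchanged during the handshake.
import struct

def counter_from_iv(iv: bytes) -> int:
    """Return the uint32 held in the most significant 4 bytes of the IV."""
    if len(iv) != 16:
        raise ValueError("expected a 16-byte AES IV")
    (counter,) = struct.unpack(">I", iv[:4])  # big-endian uint32
    return counter

# Example with a fixed IV whose top 4 bytes are 0x00 0x01 0x02 0x03:
print(counter_from_iv(bytes(range(16))))  # 0x00010203 == 66051
```

Encrypting the Hello itself would then use the session AES key and this IV; that part is omitted here since it needs a crypto library.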
Is it protobuf as in Protocol Buffers - Wikipedia? I was thinking the spark protocol uses CoAP; can someone explain this part to me, and how CoAP is used by the protocol?
I think that comment may be obsolete. It was way before I joined Particle, but I believe protobufs were considered at some point early in the development of the protocol.
I’ve been organising my thoughts to start a Python framework that would allow devices such as the Raspberry Pi, and the upcoming rash of cheap SoC-based devices running Linux (think Domino.io, CHIP, etc.), to leverage the cloud infrastructure.
The target language is Python, so I think we have something to share. Could you give me more details about your framework project? I’m not able to understand the link between Particle and the devices you’re talking about; maybe I’m missing something.
On my side, the main idea was to offer a Python library for the spark protocol, allowing people to build their own software/cloud solution to manage their devices.
But I’m not a big Python expert; this is also a way for me to learn the language.
It sounds like we would be working different sides of the network connection between device and cloud. I think there is great potential value in having additional options and code for both.
My goal is to create a Python client (and hopefully do it in an extensible, if not pythonic, way) that can connect to the Particle cloud (and hence also your stuff) whilst running on generic Linux hardware (e.g. the Raspberry Pi and the other platforms I mentioned). I’m not personally interested in creating the IDE experience, but I’m sure others might do something creative in that department if the basic functionality of Spark.publish()/Spark.subscribe() and friends were available.
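To make the publish/subscribe idea concrete, here is a purely hypothetical sketch of what such a client API could look like; every name here (`ParticleClient`, `publish`, `subscribe`) is an illustration of the shape, not an existing library, and the real thing would send CoAP messages to the cloud rather than dispatch locally:

```python
# Hypothetical API sketch mirroring Spark.publish()/Spark.subscribe().
# All names are illustrative assumptions; dispatch is local-only here
# to keep the example self-contained.
from typing import Callable, Dict, List

Handler = Callable[[str, str], None]

class ParticleClient:
    def __init__(self, device_id: str):
        self.device_id = device_id
        self._handlers: Dict[str, List[Handler]] = {}

    def subscribe(self, event_prefix: str, handler: Handler) -> None:
        """Register a handler for events whose name starts with event_prefix."""
        self._handlers.setdefault(event_prefix, []).append(handler)

    def publish(self, event_name: str, data: str = "") -> None:
        """A real client would send a CoAP event to the cloud here;
        this sketch just dispatches to matching local subscribers."""
        for prefix, handlers in self._handlers.items():
            if event_name.startswith(prefix):
                for handler in handlers:
                    handler(event_name, data)

# Usage sketch
client = ParticleClient("deadbeef")
received = []
client.subscribe("temperature", lambda name, data: received.append((name, data)))
client.publish("temperature/livingroom", "21.5")
print(received)  # [('temperature/livingroom', '21.5')]
```

Prefix matching on event names mirrors how Particle's event subscriptions behave, which is why `subscribe` takes a prefix rather than an exact name.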
Note: Particle already has the concept of selling/licensing their scalable cloud infrastructure, so I’m not offering a back-door to leech off their cloud - just a way to include generic Linux platforms alongside compound devices, like the Bluz, and the DigiStump Oak.
I work with @nadley on the python server implementation. I’m very happy to see you’re working on the client side. That can be used in our project to manage tests! Have you started your work?
OK, now I understand your purpose. Creating a client is a nice idea; it's fully complementary to our work.
> Note: Particle already has the concept of selling/licensing their scalable cloud infrastructure, so I'm not offering a back-door to leech off their cloud - just a way to include generic Linux platforms alongside compound devices, like the Bluz, and the DigiStump Oak.
I know that Particle can sell/license their cloud infrastructure. They also released an open-source version of a local cloud (GitHub - particle-iot/spark-server: UNMAINTAINED - An API compatible open source server for interacting with devices speaking the spark-protocol), but the project is not very active. I understand this is not their priority; they need to do business first, and everybody needs to eat at the end of the month. To be honest, I can't use software which is not open source. That's why I'm not using the actual Particle Cloud, except to test whether my devices are OK.
I tried to run the local cloud, which works, but it's not complete...