I have started using the Spark Core for my prototype and it is working out really well. Is there any document or tutorial on how to take a prototype all the way to production? Can I use the Spark Core directly in my PCB, or should I take the open design from GitHub and do it afresh? How does the Spark Cloud work when I scale the product out to a huge number of commercial devices? Do I keep using the Spark Cloud service and flash all the devices from there, or should I host the Spark Cloud myself in my own cloud?
The guidelines can be found in the README.md at https://github.com/spark/core
There should be more information released by @zachary soon for people building products using Spark dev kits.
Private Spark Cloud hosting by the Spark team will also be available for people who want control over their own data and connections instead of sharing the public Spark Cloud.
That, of course, comes with a fee, and the details are not yet available.
Great question @develamit, and thanks @kennethlimcp for pinging me.
We are actively working on creating a clear, interactive checklist for product creators. Coming in the next several weeks!
You can either build a PCB with headers for the Core and expect reduced prices if you buy in the thousands, or you can build your own PCB with an STM32 and a TI CC3000 based on our reference design (in the repo Kenneth linked to). The latter option will be more cost-effective at scale but will require more up-front circuit design on your part.
If you don’t buy hardware from us, you’ll just pay a very small fee for cloud access for each device, and each device will need a key provisioned on the manufacturing line.
Thanks @zachary and @kennethlimcp,
I really appreciate your responses. Now that you have set my thoughts about production in motion, here are my questions:
1. Looking forward to the clear, interactive checklist that you mentioned.
2. I bought one Spark Core for $39. If I choose to use the Spark Core directly in my PCB, can you please give me an idea of the volume pricing (price per hundred or thousand cores)? If I can manage my product's BOM cost this way, I can save a lot on circuit design and hires, with the added benefit of a tested off-the-shelf design. I have to do the trade-off carefully here.
3. Is it also implied that if I buy the hardware from Spark, I do not have to pay the cloud service fee?
4. If I do not buy the hardware and do the circuit design all over again, how much would the cloud service fee be per device, and are there any other associated overhead fees?
Spark has already done an awesome job with the hardware, the documentation, and building a community. My desire for an awesome document that explains all these aspects in greater detail has only increased.
Regarding #3, correct. If you buy hardware from Spark, you get lifetime cloud access for free.
Regarding #2 & #4, as we develop new product lines, we determine new pricing options as well. For any particular company or product it’s best to send an email to email@example.com with your specific needs, volumes, timelines, etc. to get specific quotes.
We’re also planning to release a new category soon, for people building products around Spark technologies. Currently working on getting some starter information together for the new category launch.
When will this get launched? And can I order the Spark Core development board now, or will there be a new board/kit launched for this category?
The “category” most likely refers to a forum category specifically for people planning on going into production.
You can most certainly use the current Spark Core (there are already products out there using it!). As for potential new hardware, Spark has said it will be “a new hardware architecture that is backwards-compatible with the Spark Core,” so you needn’t worry about compatibility issues. Go ahead, order one, and start tinkering! :)
Aha, thanks! My bad. How easy is it to go from the Spark Core to our own on-board version? I mean, a big part of the code would have to be rewritten, since it won’t be running the Spark Core bootloader/firmware, right?
Perhaps @BDub can explain some more about that since he, if I recall correctly, built his own a long time ago.
@zachary might be able to explain what it takes to get your home built board to play nicely with the Spark Servers.
Depending on your requirements, you could also consider placing female headers in your product, into which you could place your Core. I believe there are special programs available for production products, depending on the quantities. For that, you’d best write to the previously mentioned email address; they should be able to help you further.
As long as you (1) implement the firmware communication library, (2) have a device ID (we currently use the 96-bit unique ID register in STM32 chips; other devices will require a different ID), and (3) have a key provisioned for that ID that the Spark Cloud knows about (see the spark-cli `spark keys doctor` command to familiarize yourself with the process), you’re good to go!
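For step (2), here is a minimal C sketch of reading the 96-bit unique ID and formatting it as a device ID string. Caveats: the `0x1FFFF7E8` UID address is my recollection of the STM32F1 reference manual (the Spark Core uses an STM32F103), and the 24-hex-digit format is an assumption based on how Spark device IDs look; verify both against the datasheet and docs before relying on them.

```c
#include <stdint.h>
#include <stdio.h>

/* ASSUMPTION: factory-programmed 96-bit unique ID location on STM32F1
 * parts, readable as three 32-bit words. Check the reference manual
 * for your exact chip before using this address. */
#define STM32F1_UID_BASE 0x1FFFF7E8UL

/* Format the three ID words as a 24-hex-digit string (the shape Spark
 * device IDs appear to use). `out` must hold at least 25 bytes. */
void format_device_id(const uint32_t id[3], char out[25])
{
    snprintf(out, 25, "%08lx%08lx%08lx",
             (unsigned long)id[0],
             (unsigned long)id[1],
             (unsigned long)id[2]);
}

/* On target, you would read the ID straight from flash:
 *
 *   char buf[25];
 *   format_device_id((const uint32_t *)STM32F1_UID_BASE, buf);
 */
```

That string is what you would then provision a keypair for on the Spark Cloud side.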
More complete info coming this fall, but those are the basics.