Connecting a Spark Photon

Hi All,

In preparation for the upcoming Spark Photons, can anyone provide some more information about the process of getting one onto a Wi-Fi network? I know it's using a Broadcom chip vs. a TI chip. Are there any docs analogous to this: http://docs.spark.io/connect/#connecting-your-core-smart-config-with-the-spark-app

Thanks
-Matt

@matt_ri, there are no docs yet as the product is still unreleased. However, I can tell you that the Photon will act as a SoftAP (access point) and will be very easy to set up. :smile:

@peekay123 Sure, I realize it's unreleased and it's supposed to be easy. However, I bet the dev team has at least some specs of what they plan/hope to do. I think it would be helpful for the community if those could be shared. That way we could at least have an idea of what needs to be done so we can start roughing out the rest of our development plans.

Even with the caveat that the actual implementation might change before release, having a rough idea of the details would be helpful in facilitating quick releases of products based on the Photon.

@matt_ri, I completely agree! However, the documentation is still in development as the team works diligently at finishing the firmware. Perhaps @zach can provide some guidance as to when documentation will be released. :smile:

AFAIK, there are a few ways to do it:

1.) USB mode (enter SSID, security type and password)

2.) SoftAP mode (connecting to the AP of the Photon)

3.) Mobile App (probably using the SoftAP feature as well)
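For the USB mode, here is a minimal sketch of what sending credentials over the serial port might look like. It assumes the Photon keeps the Core's listening-mode dialog (`w`, then SSID, security type, and password, each CR-terminated) -- that protocol detail is an assumption until the docs are released:

```python
# Hypothetical sketch of USB credential entry. The assumption is that
# the Photon reuses the Core's listening-mode serial dialog: send "w",
# then SSID, a numeric security type, and the password, CR-terminated.

SECURITY_TYPES = {"unsecured": 0, "wep": 1, "wpa": 2, "wpa2": 3}

def credential_lines(ssid, security, password):
    """Build the lines to write to the device's USB serial port."""
    lines = ["w", ssid, str(SECURITY_TYPES[security])]
    if security != "unsecured":
        lines.append(password)
    return [line + "\r" for line in lines]

# The actual write would go through a serial library, e.g. pyserial:
#   import serial
#   port = serial.Serial("/dev/ttyACM0", 9600)
#   for line in credential_lines("MyNetwork", "wpa2", "hunter2"):
#       port.write(line.encode())
```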

The iOS SDK has gone public, but more will probably be added before the announcement is official: https://github.com/spark/spark-sdk-ios :smile:


Perfect! The SDK should be helpful for now.
Thanks

It seems as if this project is just a renamed template based on AFNetworking.
Is it going to be the official starting point for the upcoming SDK, or is this just another project?

Another question is whether AFNetworking is the right way to go. This framework handles all kinds of tasks: for example, it contains serializers for JSON, XML, and more, as well as an implementation for the Amazon API. From what I know, Spark Cores and their upcoming siblings deal only with JSON.

Hence the general question: can the three types of mostly simple tasks (function calls, event listening, and cloud management of tokens and cores) be handled without the added overhead?
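To illustrate that point: the existing Spark Cloud REST API is small enough that a function call fits in a few lines with no framework at all. A language-neutral sketch in Python using only the standard library (device ID and access token are placeholders):

```python
# Calling a Spark function through the cloud REST API without any
# framework -- just an HTTPS POST with a urlencoded body, and the
# cloud answers with plain JSON. Device ID and token are placeholders.
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.spark.io/v1"

def build_function_call(device_id, func_name, arg, access_token):
    """Return the URL and urlencoded POST body for a function call."""
    url = "{}/devices/{}/{}".format(API_BASE, device_id, func_name)
    body = urllib.parse.urlencode({"access_token": access_token,
                                   "args": arg})
    return url, body

def call_function(device_id, func_name, arg, access_token):
    url, body = build_function_call(device_id, func_name, arg, access_token)
    req = urllib.request.Request(url, data=body.encode())
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # the cloud responds with JSON only
```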