Spark Server >> Pairing an app to a Spark Device

Hi All,

Looking for expert advice!

I am working on a service that will pair an iOS app with a Spark Core using the local cloud… More specifically:

In the app, I type the core ID and flip a switch on the Spark side that puts the device in pairing mode.

What I want is for my server to allow multiple apps (users) to be paired with one core, or a single app to be paired with multiple cores.

Once they are paired, they can talk to each other while the server acts only as a proxy.

Now… In your expert opinion, where in the Spark Server would you implement this? Is this something that belongs in the “protocol” code, or what would be the best-practice place for it?

Which class would you extend or override (its prototype)… or where would you start tackling this idea?

Thanks in advance!

You’d need the same authentication method used for the REST API on the normal :spark: cloud, using access tokens (one for each user).

Currently, the cores on the local :cloud: are not tagged to any user account and are accessible by any user with a local cloud account.
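
For reference, the calls look the same as against the public :spark: cloud, just pointed at your local server. Here’s a rough sketch in Node (host, port, and token are placeholders for your own setup):

```javascript
// Rough sketch only: listing the devices a given access token can see,
// the same way you would on the public cloud. Host, port, and token are
// placeholders for your own local setup.
var http = require('http');

var options = {
  host: '192.168.1.10',   // machine running spark-server (placeholder)
  port: 8080,             // local API port (placeholder)
  path: '/v1/devices?access_token=YOUR_ACCESS_TOKEN',
  method: 'GET'
};

http.request(options, function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () {
    console.log('Devices visible to this token:', body);
  });
}).end();
```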

@Dave has the code but it has yet to be released :slight_smile:

Hey All,

Right now the local server doesn’t have a concept of user ownership, so any user locally can use any core. I’ll be fixing that and adding some ownership code.

@frlobo if you want to add your own permissions / change the interface, I would modify the routes exposed in the views folder, api_v1.js, and EventViews001.js. There you can add your own ownership rules, and after I add that baseline code it’ll be easier to modify further. :slight_smile:
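
Very roughly, the kind of check I mean might look like this (names are made up, not actual spark-server code, and it assumes Express-style handlers with whatever auth middleware you’re using):

```javascript
// Hypothetical ownership check, not actual spark-server code.
// coreOwners maps a core ID to the users allowed to use it, which also
// covers the "multiple apps/users paired with one core" case.
var coreOwners = {
  'YOUR_CORE_ID': ['user-app-one', 'user-app-two']
};

function requireOwnership(req, res, next) {
  var coreID = req.params.coreid;
  var userID = req.user ? req.user.id : null;  // depends on your auth middleware

  var owners = coreOwners[coreID] || [];
  if (owners.indexOf(userID) >= 0) {
    return next();   // this user is paired with the core, let the request through
  }
  res.statusCode = 403;
  res.json({ error: 'Core is not paired with this user' });
}

// Then guard the existing routes, e.g.:
// app.get('/v1/devices/:coreid', requireOwnership, existingDeviceHandler);
```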

Thanks,
David

Thanks Dave.

The thing is that multiple apps must have ownership of the same core… Are these rules set in the views?

I am tempted to write my own simplified server so that I can do a simpler integration. Do you recommend that, or do you recommend I hack on the Spark server?

Hi @frlobo,

Right now you can easily allow multiple apps to interact with multiple cores by simply sharing or creating multiple access tokens. The Spark server is also there for you to change and extend, so please feel free to hack on the local server if there is a feature you want that isn’t there yet. Pull requests welcome, and there’s a good chance someone else might want the same feature. :slight_smile:
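
Creating a token per user / app should work the same way it does against the public cloud’s OAuth endpoint. A rough sketch in Node, assuming the local server exposes the same `/oauth/token` route (host, port, and credentials are placeholders):

```javascript
// Rough sketch: requesting a separate access token for each app/user,
// assuming the local server exposes the same /oauth/token endpoint as the
// public cloud. Host, port, and credentials are placeholders.
var http = require('http');
var querystring = require('querystring');

function createToken(username, password, callback) {
  var body = querystring.stringify({
    grant_type: 'password',
    username: username,
    password: password
  });

  var req = http.request({
    host: '192.168.1.10',            // your spark-server machine (placeholder)
    port: 8080,                      // local API port (placeholder)
    path: '/oauth/token',
    method: 'POST',
    auth: 'spark:spark',             // default client id/secret on the public cloud
    headers: {
      'Content-Type': 'application/x-www-form-urlencoded',
      'Content-Length': Buffer.byteLength(body)
    }
  }, function (res) {
    var data = '';
    res.on('data', function (chunk) { data += chunk; });
    res.on('end', function () { callback(null, JSON.parse(data)); });
  });

  req.write(body);
  req.end();
}

// One token per app/user; both can then talk to the same core through the server.
createToken('app-one@example.com', 'password1', function (err, token) {
  console.log('Token for app one:', token);
});
```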

Thanks,
David

Thank you. Is there any documentation on the structure of the code so we know where to start digging in?

Hi @frlobo,

It’s very new so I haven’t had a chance to release all the docs I would want about the structure and extensibility. Here are the fun parts:

adding routes / handlers for web requests:
spark-server -> views : https://github.com/spark/spark-server/tree/master/js/views

changing what the server does when a core connects:
spark-protocol -> clients -> SparkCore : https://github.com/spark/spark-protocol/blob/master/js/clients/SparkCore.js
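
For example, if you wanted to prototype the pairing idea from this thread, a new module alongside the existing views might look roughly like this. This is a sketch only, with illustrative route names and an assumed `loadViews(app)` hook, so mirror how api_v1.js actually registers its routes:

```javascript
// Sketch of a hypothetical "pairing" view module, not part of spark-server today.
// The loadViews(app) hook name is an assumption; copy the pattern from api_v1.js.
var pairings = {};   // core ID -> array of user IDs paired with that core

module.exports = {
  loadViews: function (app) {
    // Pair the calling user with a core (e.g. while it is in "pairing mode")
    app.post('/v1/pairings/:coreid', function (req, res) {
      var coreID = req.params.coreid;
      var userID = req.user ? req.user.id : 'anonymous';  // depends on your auth middleware

      pairings[coreID] = pairings[coreID] || [];
      if (pairings[coreID].indexOf(userID) < 0) {
        pairings[coreID].push(userID);
      }
      res.json({ ok: true, core: coreID, users: pairings[coreID] });
    });

    // See who is paired with a core
    app.get('/v1/pairings/:coreid', function (req, res) {
      res.json({ core: req.params.coreid, users: pairings[req.params.coreid] || [] });
    });
  }
};
```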

Thanks!
David