Direct Local Control of Spark - Reading direct URI requests to the Tiny Webserver IP

As an extension to the discussion about token security and non-API access to the Spark Core here: https://community.spark.io/t/is-it-safe-to-place-spark-core-token-number-in-a-webpage/3301/55 I started this topic to expand on how to gain direct control of the Spark Core using a Tiny Webserver and direct URL requests to the Tiny Webserver IP address. The Tiny Webserver would then read the request and pass it to the Spark Core to control one or more pins.

As suggested by @gruvin, direct control of Spark Core pins is what I need for my project, at the local level and without the Cloud.

**My question is: can the translation and resulting commands that the Spark API sends to the Spark Core be duplicated by a direct HTTP request to the Spark Core?**

The overall Spark API is pretty sophisticated. For those people who don’t or can’t rely on the Spark Cloud, the local cloud option that is yet to be released would allow you to replicate most cloud functionality, but to do this you would need a local server running the cloud software. I don’t think you’re looking for that type of solution though; by local, do you mean on the actual core?

If you require the ability to set or read specific pins, this could be built using the core and an httpserver/httpclient API. The existing httpserver library is a start, and I think the new httpserver @BDub is working on will help with handling HTTP GET requests, but some additional coding would still be required. Are you looking for something like a basic RESTful-style API, like:

Read pin value http://{core-ip-address}/get/d0
Set pin http://{core-ip-address}/set/d0/high
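
Purely as a sketch of that idea, using only the built-in TCPServer/TCPClient classes (not @BDub's library, which will be more complete), something along these lines might work. The URL paths, port 80 and the choice of D0 are just examples, and the pin state is tracked in a variable rather than read back from the hardware:

```cpp
// Sketch: REST-style pin control over a raw TCP socket on the core.
#include "application.h"   // auto-included when building in the Spark web IDE

TCPServer server(80);      // plain HTTP on port 80
bool d0State = false;      // remember D0 so we don't have to read back an output pin

void setup() {
    pinMode(D0, OUTPUT);
    digitalWrite(D0, LOW);
    server.begin();
}

void loop() {
    TCPClient client = server.available();
    if (!client.connected()) {
        return;                       // nobody connected this time around
    }

    // Give the request a moment to arrive, then read the request line,
    // e.g. "GET /set/d0/high HTTP/1.1"
    unsigned long start = millis();
    while (client.connected() && !client.available() && millis() - start < 1000) {
        delay(1);
    }
    String request;
    while (client.available()) {
        char c = client.read();
        if (c == '\n') break;
        request += c;
    }

    String body = "unknown request";
    if (request.indexOf("/set/d0/high") != -1) {
        d0State = true;
        digitalWrite(D0, HIGH);
        body = "d0=high";
    } else if (request.indexOf("/set/d0/low") != -1) {
        d0State = false;
        digitalWrite(D0, LOW);
        body = "d0=low";
    } else if (request.indexOf("/get/d0") != -1) {
        body = d0State ? "d0=high" : "d0=low";
    }

    // Minimal HTTP response, then close the connection
    client.println("HTTP/1.1 200 OK");
    client.println("Content-Type: text/plain");
    client.println("Connection: close");
    client.println();
    client.println(body);
    client.stop();
}
```

You could then hit those URLs from a browser or curl on the same network, no cloud involved.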


@deancs Thanks for the reply to my question, as I know nothing about APIs and don’t want to become an Android programmer just to toggle a digitalWrite on a pin of the Spark Core.

I’m hoping it can be as simple as it is on the Arduino: http://{core-ip-address}/?{variable}=ON

And, even better, if the variable can be set on a Tiny Server within the Spark Core, where the request can be read and passed on to the Spark Core to make a pin High or Low. I know this is like going back to 1985, looking at the blinking cursor and saying “now what”, but a very simple High/Low-triggered cross-platform option would open the Spark Core to all those Arduino users, millions of people.

I’m working on a Tiny Server and I think Dave is working on one as well. I’m sure when it’s done it will do exactly what you want and more. I just need time to finish it! :smile:


That’s great to hear @BDub, as it’s really above my knowledge or ability. Thanks for all that you do (and I really do mean that!)


I have this exact need for a product I am working on. I need to communicate directly with the Spark Core - no Cloud, no router.

Until they release the Cloud software to the community (as promised), I won’t be able to start working on a solution for direct control of the Spark Core. I don’t want my core to be a tracking device, and as long as it’s connected to the cloud, that’s exactly what I have.

Will you still be using the WiFi on the Spark Core to control it, or something else?

What about pairing it with something like this to extend your range, plus a 5 V battery pack: Mini Wifi Router

Also, there are portable WiFi webservers, and plans for the same, like this one: wifi webserver

If you think you can use websockets, I have a tiny websocket server working on the Spark;
it can handle up to 4 clients. Local Websocket Server - Project Share - Particle
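
The websocket handshake and framing live in the linked project, but just to illustrate the general pattern behind “up to 4 clients” with the built-in TCPServer class (port 8080 and the echo behaviour are arbitrary choices here):

```cpp
// Sketch: serving several TCP clients at once from one TCPServer.
#include "application.h"

const int MAX_CLIENTS = 4;
TCPServer server(8080);
TCPClient clients[MAX_CLIENTS];   // one slot per simultaneous client

void setup() {
    server.begin();
}

void loop() {
    // Place a newly accepted connection into the first free slot, if any.
    TCPClient incoming = server.available();
    if (incoming.connected()) {
        bool placed = false;
        for (int i = 0; i < MAX_CLIENTS && !placed; i++) {
            if (!clients[i].connected()) {
                clients[i] = incoming;
                placed = true;
            }
        }
        if (!placed) {
            incoming.stop();          // all four slots busy: refuse the connection
        }
    }

    // Service every connected slot; here we simply echo bytes back.
    for (int i = 0; i < MAX_CLIENTS; i++) {
        while (clients[i].connected() && clients[i].available()) {
            uint8_t b = clients[i].read();
            clients[i].write(b);
        }
    }
}
```

Each connected client occupies one slot, and when it disconnects the slot becomes free for the next connection.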
/N

I would say that the only thing (and it’s a really big thing) that the Spark cloud brings over all other methods is a secure connection to the outside world.

The cloud services are extremely easy to use and have a good, scalable implementation, but the one thing you would have a hard time rolling on your own is the security. There just aren’t enough resources on the core to maintain multiple encrypted connections and do the certificate management.

@bko,

I and many others will be using our Spark Cores inside our local networks only, and in that case security is moot, or rather handled by the router. If someone can bypass the router’s WPA, they can bypass the OAuth 2 used by the Spark API; correct?

The Spark Cloud is awesome for the “Internet of Things” approach, but there is another dynamic here: exposing an IP over the internet is like asking for a criminal to eat you for dinner. I don’t want criminals to eat me for dinner :smile:

So, and I am not the only one, I would like to see a standard set for all users of the Spark Core, but up to now it has been hit and miss. Not that I am trying to dismiss the wonderful work that others have done to reach an “Off Cloud” interaction with their core.

Any thoughts on what would be the best approach for a local-only, “Non-Cloud” and, if possible, non-API direct-access interaction with the Spark Core?

THANKS,

Bobby

@NaAl,

Yes, I will take a look at your tiny websocket server.

Thanks.

Bobby

Check this out: (embedded video)


@NaAl

I am not a network person; did you have to set a static IP for the Spark Core via your router? And WOW, AWESOME!

EVEN BETTER! All without the cloud.

I’m a little confused, though, by all the talk from others about how the Spark API converts HTTP requests into machine language for the Cortex-M3 processor. Do I have that wrong?

Can’t wait to get back home and get this going on my Core.

Thanks for the video and help.

Bobby

Note: I just read your websocket topic and learned more about how it works, and downloaded all the source too. Thanks again!

@spydrop
Yes, I have set a static IP for the core in my router.

I am not sure what you are referring to, but the core is able to handle any TCP/UDP requests (check the TCPClient (spark_wiring_tcpclient.cpp) and TCPServer (spark_wiring_tcpserver.cpp) classes).
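
The server side is sketched earlier in the thread; for the client side, a minimal example of TCPClient making a plain HTTP GET to another machine on the LAN might look like this (the address 192.168.1.50 and the /status path are placeholders for whatever you are testing against):

```cpp
// Sketch: outgoing request from the core with TCPClient.
#include "application.h"

TCPClient client;
IPAddress serverAddr(192, 168, 1, 50);   // placeholder LAN address of a local web server

void setup() {
    Serial.begin(9600);                  // open a serial monitor to see the response
}

void loop() {
    if (client.connect(serverAddr, 80)) {
        // plain HTTP GET; "Connection: close" asks the server to end the connection
        client.println("GET /status HTTP/1.1");
        client.println("Host: 192.168.1.50");
        client.println("Connection: close");
        client.println();

        // copy whatever comes back to USB serial until the server hangs up
        while (client.connected()) {
            while (client.available()) {
                Serial.write(client.read());
            }
        }
        client.stop();
    }
    delay(10000);                        // try again every 10 seconds
}
```

Anything the server sends back is echoed over USB serial, so you can watch it in a serial monitor.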

/N

@NaAl,

Will check out these classes to understand better.

Thanks, Bobby

Hi,
Here is my project, a local HTTP server:
https://github.com/captainigloo/sparkcore-local-http-server-rest-json