Atomiot (formerly Spark Tools) - Cloud services for your Spark projects

There is nothing listed in the device info section. The only code that I’ve been running is this:
http://pastebin.com/iYcDkrLw

I didn’t see any links to the Spark documentation for variables and functions.

Hi @Jackalopes

There are several ways your Spark core can communicate over the cloud: publish, subscribe, variables, and functions.

These are doc’ed here:

http://docs.spark.io/firmware/#cloud-functions-data-and-control

I think what Kareem is saying is that his cool service works with functions and variables, not with published events.
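
For reference, here is a minimal sketch (assuming the Spark firmware API of the time; pins and names are made up) showing how a variable, a function, and a published event are each declared:

```cpp
#include "application.h"

// Illustrative sketch: one cloud variable, one cloud function, and one
// published event. Pin D7 is the Core's onboard LED.
double temperature = 0.0;   // readable from the cloud (GET)
char event[16];

// Cloud-callable function: takes a String argument, returns an int.
int setLed(String command) {
    digitalWrite(D7, command == "on" ? HIGH : LOW);
    return 0;
}

void setup() {
    pinMode(D7, OUTPUT);
    Spark.variable("temperature", &temperature, DOUBLE);
    Spark.function("led", setLed);
}

void loop() {
    temperature = analogRead(A0) * 0.1;                   // placeholder reading
    snprintf(event, sizeof(event), "%d", (int)temperature);
    Spark.publish("temp-reading", event);                 // push-style event
    delay(60000);
}
```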

@bko Thanks for ALL your help around here! You guys make learning this so much easier.


Beat me to it @bko. Thanks for chiming in. I’ll clarify the information at the top of that page.

@Jackalopes are you up and running now?

@kareem613 I’m sending data over to the site and creating graphs! Thanks for the assistance.

Excellent. Please share the graph when you’ve got some data built up.

Nooby question: what if I want to do something different from most people: pull in weather predictions from a service and pull in monitoring data from a Spark, then perform some calculations and send a command back to the Spark core. This is for my automated rainbarrel project discussed in another thread. Any ideas for the not-so-savvy?
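
One way the device end of that loop could look, as a rough sketch with the stock Spark firmware API (the pin, names, and scaling below are assumptions, not details from the rainbarrel thread):

```cpp
#include "application.h"

// Hypothetical device end of that loop: the cloud service does the forecast
// math, then calls this function with "open" or "close". Pin D0 driving a
// valve relay is an assumption.
double barrelLevel = 0.0;    // exposed so the service can read the fill level

int valveControl(String command) {
    if (command == "open")  { digitalWrite(D0, HIGH); return 1; }
    if (command == "close") { digitalWrite(D0, LOW);  return 0; }
    return -1;               // unknown command
}

void setup() {
    pinMode(D0, OUTPUT);
    Spark.function("valve", valveControl);
    Spark.variable("level", &barrelLevel, DOUBLE);
}

void loop() {
    barrelLevel = analogRead(A0) / 4095.0 * 100.0;   // crude percent-full reading
    delay(10000);
}
```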

Awesome concept.
I’ll be rolling out a feature that will let you write your own code that will execute in Atomiot.
That will let you build whatever logic you like.
You’ll have easy access to device data.
The last remaining piece will be weather data, which can be gathered with another feature that will roll out soon after.

You’re rolling out a weather feature? This is like a one stop shop. :slight_smile:
What weather service are you planning to use? My plans are to use forecast.io (formerly used wunderground).

I’m aiming a bit more generic than that: support for any REST API that returns JSON.
Perhaps some canned support for specific services like weather. Honestly, they all seem the same to me. Is there anything particularly special about forecast.io?

Hi, I’ve been using Atomiot for a while and it’s been super stable and useful. However, it seems to be down as of today, giving a 503 Service Unavailable error, in case the site operator is still monitoring the Spark community…

Wow. It’s been running smoothly for months now. Strange issue today. Not sure what it was but it showed a flaw in my monitoring.
All running smoothly again now. Sorry for the downtime.
I’ll have to dig deeper into the issue.

Hello, @kareem613, thank you so much for Atomiot, it is indeed really good and I’ve been using it for a month or so with no fail.

My only question is: how’s the work on downloading the series data (as CSV for example)? The graphs are very useful but raw historic data would be even better.

Thanks again!

Hi @kareem613,

Good one! I just signed up. A little question about the graphs. What format should the variable string have? Or is there some documentation available?

API access (to push data from any device), publish event logging support, and CSV download are next on the list.

I’m hoping to have all of these complete in the next couple of months.

The variable name can be anything that Spark supports.
The value must be an integer or a float.
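
A quick illustration, assuming the stock Spark.variable() call (names are arbitrary):

```cpp
#include "application.h"

// Either of these is graphable; the names are arbitrary.
int humidity = 0;            // integers work
double temperature = 0.0;    // so do floats/doubles

void setup() {
    Spark.variable("humidity", &humidity, INT);
    Spark.variable("temperature", &temperature, DOUBLE);
}

void loop() {
    // update humidity and temperature from your sensors here
}
```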

I suggest trying the tinker firmware and the examples in the device browser to get familiar with things. Tinker firmware has everything as functions but the way values are treated is identical.

Good luck. Post a link to your graph when you’ve got it going.

Thanks @kareem613,
I understand that only one value per variable, e.g. “21.1”, is supported by your system. Do you think it would be possible to make it work like this: “10,20,5,30”, to display a graph with 4 lines? (Each in its own colour ;))

Already supported but by way of multiple variables.
Create as many variables as you want and a schedule for each. Then you can graph as many variables as you like on a graph.
The benefit of this approach is that you can graph data from multiple devices on a single graph.
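
On the firmware side, the “10,20,5,30” example would then become four separate variables, something along these lines (illustrative names):

```cpp
#include "application.h"

// One cloud variable per line on the graph (names are made up).
int tank1 = 10, tank2 = 20, tank3 = 5, tank4 = 30;

void setup() {
    Spark.variable("tank1", &tank1, INT);
    Spark.variable("tank2", &tank2, INT);
    Spark.variable("tank3", &tank3, INT);
    Spark.variable("tank4", &tank4, INT);
}

void loop() {
    // refresh tank1..tank4 from sensors; Atomiot polls each on its own schedule
}
```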

Hi @kareem613,

Good to know. That limits the number of variables to 10, the maximum in a Core, which is sufficient for most cases. I think the efficiency of your system would be much higher if a multiple-variable lookup were possible with one call to a Spark, don’t you agree?
Especially when you become a big hit!
But anyway, Atomiot, looks very good! Thanks for the initiative,
Marcus

Agreed. That would certainly be the best of both worlds.
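
For illustration only, a speculative sketch of what the firmware side of such a single-call lookup might look like, with several readings packed into one string variable:

```cpp
#include "application.h"

// Speculative sketch of a "many values, one call" variable: several readings
// packed into one comma-separated string. Atomiot as described above graphs
// int/float variables, so the consumer would have to split this itself.
char readings[64];

void setup() {
    Spark.variable("readings", readings, STRING);
}

void loop() {
    int level = analogRead(A0);   // raw ADC values, purely illustrative
    int temp  = analogRead(A1);
    snprintf(readings, sizeof(readings), "%d,%d", level, temp);
    delay(10000);
}
```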