Atomiot (Formerly Spark Tools) - Monitor Spark Projects with zero code… and other stuff

I'm working on logging function results today. This should provide logging support for the Tinker firmware out of the box.


Awesome! I’m excited to check it out when it’s ready!

Ok. Function monitoring is live.

You can schedule function execution just like variable reads.

I’ve got a stepper motor rotate function scheduled every ten minutes, returning the number of times it’s been executed since boot.
Works like a charm. Now to connect it to something. :slight_smile:

I tested against the tinker firmware as well.
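For anyone wondering what a schedulable function looks like on the firmware side, here's a minimal sketch of the idea. The stepper code is illustrative, not Atomiot's, and the `String` typedef is just a stand-in for the Wiring type so the sketch compiles on its own:

```cpp
#include <string>

// Stand-in for the Wiring "String" type so this compiles standalone.
typedef std::string String;

// How many times the function has run since boot.
static int executionCount = 0;

// A Spark cloud function takes a String argument and returns an int.
// A scheduler like Atomiot calls it periodically and logs the returned
// int as a data series.
int rotateStepper(String args) {
    // ... step the motor one rotation here ...
    ++executionCount;
    return executionCount;  // executions since boot, as described above
}

// On a real Core this would be registered in setup():
//   void setup() { Spark.function("rotate", rotateStepper); }
```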


You should make some sort of user guide to make it easy for a beginner to understand how to use these features.

Absolutely. I tried putting some info at the top of each page in the interim. I guess not helpful enough?

I haven’t tried it out yet so maybe what you already have is fine.

I just signed up to your site. I see you have updated it. I’ll give it a shot and report back.

@kareem613 I’m playing with your tool and the tinker firmware tonight. Is there a way to see the function params for a schedule after creating it? I’m using ‘A0’ for analogRead, but I don’t see a way to modify the params or even see them in the schedule.

Good point. I’ll do that next. Thanks for the feedback.

Edit: I’m trying to figure out how this would be used. Is there a benefit to multiple schedules for the same function with different parameters?
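One case where multiple schedules for the same function pay off: with the Tinker firmware, a single `analogRead` cloud function samples whichever pin the argument names, so one schedule passing "A0" and another passing "A1" log two independent series. A rough sketch of that dispatch (the parsing here is illustrative, not Tinker's actual implementation):

```cpp
#include <string>

// Stand-in for the Wiring "String" type so this compiles standalone.
typedef std::string String;

// Map a Tinker-style analog pin name ("A0".."A7") to a pin index, or -1.
int pinFromArg(const String& args) {
    if (args.size() == 2 && args[0] == 'A' && args[1] >= '0' && args[1] <= '7')
        return args[1] - '0';
    return -1;
}

// One cloud function, many schedules: each schedule's parameter selects
// a different pin to sample.
int tinkerAnalogRead(String args) {
    int pin = pinFromArg(args);
    if (pin < 0) return -1;      // reject bad parameters
    // return analogRead(pin);   // real firmware would sample the pin here
    return pin;                  // placeholder so the sketch runs standalone
}
```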


  1. Schedules can now be deleted. Deleting a schedule does NOT delete the series data it was logging to. This is so you can get rid of old schedules, but keep the data. If you create another schedule for the same device and function/variable then it will log to the same series as the previous one.

  2. Function parameters are now editable.

I think I’ll turn my attention now to displaying the data until I get feedback on the monitoring capabilities. I’d like to hear what people are trying to achieve with monitoring.


I added a couple of graphing features.

  1. You can now save graphs and share the link.
  2. You can control the series style for each series. Example below. This is my temperature and humidity data that’s been collecting for a couple of weeks.

Link here.

I was wondering if it is practical/possible to use this for real-time graphing of Spark Core variables over wifi/web interface? Would it overload the server the data is being passed through, etc?

There’s really no such thing as real time over the web, so what frequency do you really mean?

Every 5 minutes? 1 minute? Every second? Depending on what you’re monitoring, anything faster than a few minutes of delay is usually overkill.

That being said, all of this is hosted on Azure, so there’s a cost associated with the server storage.

I’d like to keep it low cost. Perhaps I could work out a small pricing structure for heavy users.

I think the best answer, though, is to design your solution with a balance:
keep monitoring/logging less frequent (every 5 or 10 minutes, for example), then use the Spark.publish feature for real-time event logging.

I’ll be integrating with the publishing feature once they release webhooks.

Does this make sense? What’s your application?
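That slow-poll/fast-event split might look like this on the firmware side. The `CloudStub` exists only so the sketch stands alone; on a real Core, `Spark.publish()` is provided by the firmware, and the temperature value is a placeholder:

```cpp
#include <string>
#include <vector>

// Minimal stand-in for the Spark cloud object so this compiles standalone;
// on a real Core, Spark.publish() is provided by the firmware.
struct CloudStub {
    std::vector<std::string> sent;  // records "name:data" for each publish
    void publish(const std::string& name, const std::string& data) {
        sent.push_back(name + ":" + data);
    }
} Spark;

// Slow path: a value a monitor polls on a 5-10 minute schedule for trends.
int readTemperature(std::string args) {
    return 21;  // placeholder sensor reading
}

// Fast path: fire an event the moment something happens, instead of
// waiting for the next scheduled poll.
void onButtonPress() {
    Spark.publish("button-pressed", "1");
}
```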

Hi @helium

I love what @kareem613 has built here, it is very useful and produces beautiful results, but here’s a much simpler idea that I have used:

On a web page where you are listening to Spark.publish() events with your data, use Javascript and the HTML5 <canvas ...> vector drawing commands to plot points on a graph, adding a new point every time a new Spark.publish() event comes in.

In what I am doing, there is no data-logging here, just plotting the last N saved values, since it all runs in Javascript. I am plotting the last 8 temperature points, for instance, but it could be anything.

Does that sound more like what you wanted?

Kudos and a big thank you, Kareem! That’s a really nice piece of software and services that you’re sharing with us here.

I have been using Xively for - just like you - graphing temp and humidity values from a DHT22 in my apartment, but your platform is a no brainer and makes all this so easy to do.

What are you planning next to enhance it?

Regards from Paris.

I always use a feature privately first for a few days before making it available publicly.
Here’s a sneak peek, though.

From my Android device…


I started playing with the publish API, so I built in an event viewer. You don’t have to be registered to use it. Just enter your access token, device ID, and the event name to listen for.

If you’re signed in, you can just click the “Events” link from the devices list to have all but the event name pre-filled for you.

Direct link:

Here’s a screenshot of my simple button press test.

I set up a public Trello board for ideas and feedback.

Anybody should be able to see ideas and vote. I’m not sure if anybody can add new cards to the board. If you can’t, just post the thought here and I’ll add it.

The only features on there now are things I want for my own projects. I’d be interested to see what others are trying to accomplish and how the Spark Tools service might help.


Just wanted to say I’ve started using this to log temperature and barometric pressure data from my MS5607 24-Bit altimeter and I love the interface! Very well done! Once I get my thermostat up and running I’ll be using it to log: Temperature, Humidity, Barometric Pressure, Set Temperature and whether the A/C or Heat Pump is running.

Awesome. Glad you like it. Interested in being a beta user of new features?

Give me your username there and I’ll give you access to the Actions feature, which lets you set up notifications.

Heck yes! I should be timb.