Atomiot (formerly Spark Tools) - Cloud services for your Spark projects


#1

I was getting a few support emails about my Spark Tools service with people thinking it was provided by the Spark team themselves. I figured it best to pick a new name to avoid any confusion.

So the same Spark Tools service has been rebranded to AtomIot!

http://atomiot.com

To commemorate the occasion, the new actions and notifications features are now open to everyone! You can receive push notifications when a value from your device goes above or below a set threshold.
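For illustration, the check behind such an action might look something like this (a hypothetical sketch, not Atomiot's actual code; the function name and thresholds are made up):

```python
def threshold_state(value, low=None, high=None):
    """Classify a reading against optional low/high thresholds.

    Returns "below", "above", or "ok" -- an action service would
    fire a push notification on the first two.
    """
    if low is not None and value < low:
        return "below"
    if high is not None and value > high:
        return "above"
    return "ok"

# Example: temperature alert band of 15-30 degrees C
print(threshold_state(34.2, low=15, high=30))  # prints "above"
```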

The service remains free, as my costs for hosting and data storage on Azure are next to nothing so far (and apparently dropping 18% tomorrow).

The future roadmap includes:

  1. Log Spark.Publish events
  2. Actions based on Spark.Publish events
  3. Scheduled function execution
  4. Trigger functions from actions (ex. When temp is too high, execute function to set color of RGB LED)
  5. More visualizations & dashboards

The roadmap isn’t written in stone. I’m all ears and would appreciate some feedback. If you’re using the service, please post about your project. A link to a graph would be great! Saved graphs are the only thing publicly available.


#2

This is a really great feature! For those looking for an easy way to monitor Spark.publish() events without having to develop their own tools, this is an excellent service! Thank you.


#3

Latest release.

  1. You can now delete series so you can clean up old data you don’t want anymore.
  2. Basic schedules for executing functions. This works the same way as logging function return values: set the params to pass in and have it execute every 5, 10, 30, or 60 minutes.
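Under the hood, a schedule like this presumably just calls the Spark Cloud REST API on a timer. A minimal sketch of building that call (the device ID, token, and function name are placeholders, and I'm writing the argument field as `args` here; check the Spark API docs for the exact parameter name):

```python
import urllib.parse

API_BASE = "https://api.spark.io/v1/devices"

def function_call_request(device_id, func_name, args, access_token):
    """Build the URL and form body for a Spark Cloud function call."""
    url = "{}/{}/{}".format(API_BASE, device_id, func_name)
    body = urllib.parse.urlencode({"args": args, "access_token": access_token})
    return url, body

# A scheduler would POST this every N minutes and log the return value
url, body = function_call_request("0123456789abcdef", "readTemp", "", "my-token")
print(url)
```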

#4

Awesome work! Does Atomiot allow data logging when the browser isn’t open?

I’ve been working on a very similar system, except with the aim of data logging and visualization. I am currently writing a NodeJS middleman EventSource app that binds to Spark Cloud events, logs registered hooks to a database, then pushes the data and event out to any active endpoint watchers. The goal is to add graphs and various visualization capability.
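The Spark Cloud event stream is server-sent events (SSE): `event:` and `data:` lines separated by blank lines. A sketch of parsing one event block, in Python just for illustration (the post describes a NodeJS app; the payload below is a made-up example):

```python
import json

def parse_sse_event(raw):
    """Parse one SSE event block into (event_name, payload_dict)."""
    name, data_lines = None, []
    for line in raw.splitlines():
        if line.startswith("event:"):
            name = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
    payload = json.loads("\n".join(data_lines)) if data_lines else None
    return name, payload

raw = 'event: temperature\ndata: {"data":"23.5","coreid":"0123456789abcdef"}\n'
name, payload = parse_sse_event(raw)
print(name, payload["data"])  # prints: temperature 23.5
```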

Are you interested in teaming up?


#5

The service logs data according to the schedule you define. No need for the browser to be open.
But you’re talking about publish events, right?

Currently logging is only supported for variables and functions. I’ll add support for publish events when the spark team releases their callbacks/webhooks feature.

EventSource (server-sent events) isn’t really intended for server-to-server integration. Keeping that many connections permanently open is resource intensive and doesn’t scale. Be careful with that approach if you want to support more than your own personal projects.

Know what you’re trying to do for visualizations?


#6

@kareem613, I’ve been playing with atomiot, and it’s really cool, something to be proud of! A project I’m working on needs to sleep, so schedules (and thus actions) are fairly unreliable: the schedule fails if it checks while the core is sleeping. Until webhooks/callbacks happen, I’m not sure there’s going to be much of a solution for that, though. To get around it for now, I’m considering having the core post to a PHP page that updates a MySQL DB when it wakes up and does its sensor reads. Keep up the good work, though - atomiot has definitely made my life easier!


#7

Excellent. Glad it’s proving useful.

I was planning on creating an api endpoint for this use. If I did that, would you build an Atomiot client library and open source it?

I’d like a Spark and an Arduino library. Not sure what’s involved in supporting Arduino as well.


#8

I think “if you build it, they will come” applies here :smile: Make the api endpoint and someone WILL build it… especially if it works well.

I just signed up for atomiot finally… looks great so far. I signed up because I saw how dead simple you’ve made push notifications, and it’s secure as far as I can tell! I’m going to be using this for lots of stuff in the future.

I can see why it would be free now as you develop it, but what’s your plan for the future? It would be great if you could always have a free account, and then as things scale you could offer less rate limiting for a fee, or just a much larger number of spark cores per account for a fee. Once you start charging though, it should be very reliable :wink: Great job with this service!


#9

@kareem613 something I’m curious about is when you’ll have the Spark.publish() trigger for Pushover push notifications working :smile: Getting a realtime push notification is really what it’s all about anyway. Right now I’m testing out the scheduling feature by polling a variable on the Spark: if I’ve pressed a button, the Spark latches the state of a variable from 0 to 1. The schedule polls it every 5 minutes, and the Action is set to send me a Pushover notification if it sees that variable as 1.

Obviously there are a lot of problems with this. It’s not real time. I have to save the state of an asynchronous short duration event (button press) longer than 5 minutes so the scheduler can see that it was pressed. And then I have to clear the variable to keep the scheduler from sending push notifications over and over.
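The poll/latch/clear workaround described above, sketched with injected helpers (`read_variable`, `call_function`, and `notify` are hypothetical stand-ins for the Cloud API calls, not real Atomiot code):

```python
def poll_button(read_variable, call_function, notify):
    """One scheduler tick: notify if the latched flag is set, then clear it.

    The helpers are injected so the logic is testable without a real
    Spark core; in practice they would hit the Spark Cloud REST API.
    """
    if read_variable("buttonPressed") == 1:
        notify("Button was pressed since the last poll")
        call_function("clearButton", "")  # reset the latch so we don't re-notify

# Simulated tick: the flag is set, so we expect one notification and one clear
sent, cleared = [], []
poll_button(lambda name: 1, lambda f, a: cleared.append(f), sent.append)
print(sent, cleared)
```

Note the inherent limitations the post points out: this is only as fresh as the polling interval, and a second button press between the read and the clear would be lost.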

I purchased the Pushover app. Now I’m hoping for a simple way to use it :wink:


#10

Glad you like it.
An open endpoint is already in the works. Just working on the security now.
As for charging for the service, you’ve got the right idea. If this grows enough, I’ll have to start charging for it, but there will always be a free tier.
Not sure which parts will get limited. I don’t think I’ll limit anything that’s already available now though.


#11

I’m with you @BDub! It’s the same feature I need myself for the doorbell push notifications.

Everything I can do is already done and waiting for the spark team to release the webhooks feature. As soon as that happens, I’m pretty quick to tie it in.

It’s been “coming in the next sprint” ever since the publish feature was released so not really sure what’s going on.


#12

@kareem613 I’m having difficulty saving my cores. I signed up through the Google login if that means anything. Here’s a screen shot of the homepage with me logged in:
https://dl.dropboxusercontent.com/u/20008825/Web%20Pics/Forums/Spark.io/AtomIOT%20(Medium).jpg


#13

What do you see on the devices screen?
Also, did you enter your spark core token in your profile?


#14

Ah, I think I had an incorrect Access Token on my profile.


#15

Perhaps the messaging needs to be clearer. I’ll review.

What’s your project? Always interested in seeing what people are using Atomiot for.


#16

Just a temp sensor at this point. I want to monitor a beer storage area with the outside temperature and see what the fluctuations are.

Hopefully, I’ll start logging some data this weekend. I’d be interested in seeing what others are doing with your great web app. Maybe we should start a new thread?


#17

I’m still trying to figure out how to create a graph from my data. Do you think it would be possible to create a short tutorial for us total noobs on how to start using your service?


#18

Definitely on my todo list. I think I’ll move that up to the top.

For now:

  1. Go to the series page
  2. Select the series you want to view and click “View Graph”.
  3. If you’re happy with what you see, select “Save Graph”
  4. Now you can view all your saved graphs from the Graphs page.

#19

I don’t see anything on the series page. When I go to the devices page to create a schedule, there is no “browse” option to configure data logging.

The only thing I’ve been able to see from my spark is by using Spark.publish()


#20

Ok. Let’s back up a bit.
Have you managed to read any variables or execute any functions?
From the devices screen, select “device info” for a device.
That screen will show you all the variables and functions available from that core.
There are instructions at the top of the page linking to the Spark documentation on how to expose variables and functions.
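As a sanity check outside the browser, the Spark Cloud itself reports what a core exposes: GET /v1/devices/{id} returns the device’s variables and functions. A sketch of picking those out of the response (the JSON below is a made-up example; field layout per my reading of the Spark Cloud API docs):

```python
import json

def summarize_device(info):
    """Return (variable names, function names) from a device-info response."""
    return sorted(info.get("variables", {})), sorted(info.get("functions", []))

sample = json.loads(
    '{"name":"core1","variables":{"temperature":"double"},"functions":["readTemp"]}'
)
print(summarize_device(sample))  # prints (['temperature'], ['readTemp'])
```

If both lists come back empty, the firmware isn’t registering anything, which would explain seeing nothing on the series page.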