Machine learning API now supports visualization

Hey community!

We just wanted to let you know that our service now supports visualization of output from our machine learning API. So, if you are using us for anomaly detection or other data processing, and you have set up an account with a third-party service, you can forward predictions by selecting the Data Visualization tab in your admin dashboard.

We have a complete example of how to send data to our API using Particle webhooks, predict anomalies in your data, and post the results to ThingSpeak, in our Particle GitHub repo here.
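For a sense of what the last step in that pipeline involves: ThingSpeak's public update endpoint accepts an HTTP POST with a channel write key and up to eight numeric fields. Here is a minimal Python sketch, assuming a hypothetical write key and field layout (not the values from our example repo):

```python
import urllib.parse
import urllib.request

THINGSPEAK_UPDATE_URL = "https://api.thingspeak.com/update"

def build_thingspeak_payload(write_api_key, field_values):
    """Map up to 8 numeric values onto ThingSpeak's field1..field8 slots."""
    payload = {"api_key": write_api_key}
    for i, value in enumerate(field_values[:8], start=1):
        payload["field%d" % i] = value
    return payload

def post_to_thingspeak(write_api_key, field_values):
    """POST one channel update; ThingSpeak responds with the new entry id."""
    data = urllib.parse.urlencode(
        build_thingspeak_payload(write_api_key, field_values)).encode()
    with urllib.request.urlopen(THINGSPEAK_UPDATE_URL, data=data) as resp:
        return resp.read().decode()

# Example: forward an anomaly probability and a class prediction
# (placeholder key shown; do not commit real keys):
# post_to_thingspeak("YOUR_WRITE_KEY", [0.97, 1])
```

Note that ThingSpeak free channels rate-limit updates, so batch or throttle your posts accordingly.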


As a follow-up, we would like to gauge which other visualization services the community would like us to integrate with. Please reply to this post if you have a particular service in mind, such as Ubidots or Librato.


We are working on getting proper how-to docs up for our new Data Visualization functionality, but for now I thought I’d post the names of the fields you will need to set up in Librato/ThingSpeak for those who just can’t wait for the docs. By endpoint, they are:

  1. anomaly_detect: ‘probability_predictions’ and ‘anomaly_predictions’
  2. predict: ‘predictions.class_predictions’.
  3. signal_extract: The feature names of the data being sent in, plus the corresponding 1-to-1 raw inputs. For example, a processed ‘temperature’ feature would be sent on to Librato/ThingSpeak as ‘temperature’ plus the corresponding ‘RAW_temperature’ with the original value.
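To make the signal_extract mapping concrete, here is a small sketch of how the forwarded field set is built: each processed feature name carries the processed value, and a `RAW_`-prefixed counterpart carries the original input. The ‘temperature’ name is just the illustrative example from above:

```python
def signal_extract_fields(processed, raw):
    """For each processed feature, emit the feature name plus its
    RAW_-prefixed counterpart carrying the original input value."""
    fields = {}
    for name, value in processed.items():
        fields[name] = value
        fields["RAW_" + name] = raw[name]
    return fields

# e.g. a smoothed temperature alongside the raw reading:
# signal_extract_fields({"temperature": 22.8}, {"temperature": 23.4})
# → {"temperature": 22.8, "RAW_temperature": 23.4}
```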

As a quick example of what the new functionality can do, here is a “predict” integration with Librato presenting class predictions vs. actuals on accelerometer data (242 sensors) streamed in real time:

More info on the Librato integration and docs to come shortly.
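Until those docs land, here is a rough sketch of what pushing a single class-prediction value to Librato looks like via its metrics API, which takes a JSON body of gauge measurements authenticated with HTTP basic auth (email + API token). The metric name and source below are illustrative assumptions, not our integration's actual names:

```python
import base64
import json
import urllib.request

LIBRATO_METRICS_URL = "https://metrics-api.librato.com/v1/metrics"

def build_librato_body(metric_name, value, source):
    """Librato's metrics API accepts a JSON body listing gauge measurements."""
    return {"gauges": [{"name": metric_name, "value": value, "source": source}]}

def post_gauge(email, api_token, metric_name, value, source="device-1"):
    """POST one gauge measurement using basic auth (email:token)."""
    body = json.dumps(build_librato_body(metric_name, value, source)).encode()
    req = urllib.request.Request(
        LIBRATO_METRICS_URL, data=body,
        headers={"Content-Type": "application/json"})
    auth = base64.b64encode(("%s:%s" % (email, api_token)).encode()).decode()
    req.add_header("Authorization", "Basic " + auth)
    with urllib.request.urlopen(req) as resp:
        return resp.status

# post_gauge("you@example.com", "API_TOKEN", "class_predictions", 1)
```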

Sign up for free access here.


Above is some additional information on posting to ThingSpeak or Librato.



Here is a forum post with more information about Librato and ThingSpeak visualization using our API.