Today, we’re excited to launch an alpha version of the Dashboard, a brand new feature that simplifies and improves the Spark development experience! Our team has been daydreaming and planning this dashboard for months, and we’re super stoked to share it with our favorite people: you, the community!
What is the Dashboard?
Think of the Dashboard as your central hub. It gives you visibility into what’s happening on your devices, both while you’re developing and once your Spark-powered project is being used in the real world.
Specifically, it is the place for:
- Event logging (available now)
- Error tracking (coming soon)
- Rich visualizations from device data (coming soon)
- Interacting with and controlling your Spark devices (coming soon)
This is also the start of a whole bunch of tools that will give those of you fine folks creating products the ability to manage and learn from your fleet of devices (pushing out firmware en masse, monitoring the health of your systems, and more). More details on this are coming soon!
Today, we’re excited to roll out Event Logging as part of the alpha. Many of you already use Spark’s publish/subscribe feature to efficiently pass data between the cloud and your devices. Up until now, the only ways to see events being sent to and from your device were to fire up the command line and run `spark subscribe`, or to hit an API endpoint. Useful, but it ain’t pretty.
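For reference, here’s roughly what those two existing options look like from a terminal (the access token is a placeholder you’d replace with your own):

```
# Stream events from your devices using the Spark CLI
spark subscribe mine

# Or hit the Spark Cloud events endpoint directly
curl https://api.spark.io/v1/devices/events?access_token=YOUR_ACCESS_TOKEN
```

Both work fine, but you’re reading raw event streams in a terminal rather than a friendly UI.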
The Dashboard provides a clean interface to view event information in real-time, scoped to your devices. We’re hoping this is handy both while debugging code during development and when checking out recent activity on your device once you power on your finished project. Here’s a snapshot of a Spark device monitoring the health of a theoretical aquarium:
The top section of the page is a visualization of the number of events from your devices over time. Each color in the bar graph represents a unique event. Here’s a closeup:
Below is a real-time log of events passing through the cloud. You’ll get the name, timestamp, device name, and data associated with each event. Oh yeah! And if you click an event, you can see a raw JSON view of it, if you’re nerdy or whatever.
How Do I Get Started?
To check out the Dashboard alpha, head over to https://dashboard.spark.io and log in with your Spark account.
When you first log in, your Dashboard might look a little lonely. That’s ok! Fill it up with published events.
To take full advantage of the event logging feature right now, you’ll need a Spark device online that is actively publishing events to the Spark cloud. Specifically, this means calling `Spark.publish()` in your firmware.
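If you haven’t used it before, here’s a minimal firmware sketch that publishes an event every few seconds. This is just an illustration: the `aquarium/temperature` event name and the hard-coded reading are made up, and you’d swap in your own sensor code.

```cpp
// Minimal Spark firmware sketch (Wiring). Publishes a hypothetical
// "aquarium/temperature" event every 5 seconds so it shows up in the Dashboard.

unsigned long lastPublish = 0;

void setup() {
    // Nothing to set up for this example.
}

void loop() {
    // Spark.publish() is rate-limited, so don't publish on every loop pass.
    if (millis() - lastPublish > 5000) {
        double tempC = 24.5;  // stand-in for a real sensor reading

        char payload[64];
        snprintf(payload, sizeof(payload), "%.1f", tempC);

        Spark.publish("aquarium/temperature", payload);
        lastPublish = millis();
    }
}
```

Once this is flashed and your device is online, the events should start appearing in the Dashboard’s real-time log.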
We'd love to hear what you think of the Dashboard so far. Please post any feedback in this here thread! That being said, a couple of things to keep in mind:
- This is not a finished product, and you will likely find bugs. We wanted to focus on shipping something to you all quickly so we can start iterating based on feedback and usage. We don’t expect this to be production-ready, and neither should you! There are numerous additional features and elements coming; this is just a start.
- Currently, this view shows only events in real-time; it doesn’t yet show historical events sent before the window was opened. We’ll be rolling out historical events very soon.
- We so appreciate your feedback, thoughts and ideas. Please keep them coming! We’re building this for all of you, and will incorporate your feedback into future versions of the dashboard.
Thanks for being awesome! We’re seriously stoked to see how y’all like it!
P.S. @jtzemp and I will be doing an AMA here on community.spark.io this Thursday, April 9 from 12PM-4PM PDT, where you can ask us anything your heart desires!