Have a look, I started this today, so there is work to be done. But it'd be cool to consolidate efforts if others are interested.
Care to combine forces? I whipped up this guy to handle IO API calls. https://github.com/efatsi/ruby_spark.
Looks like you are wrapping the Tinker API. I’m more interested in wrapping the Cloud API. Combining these should be pretty easy.
I see from looking into efatsi’s GitHub that you guys have the Tinker wrapper running. I am going to play with that tonight. Did anyone ever build a Ruby or Rails wrapper for the Cloud API? I am a beginner Rails developer and am new to API dev… any ideas?
I think @jgoggins has started playing with a ruby wrapper for the API so he might be able to provide some pointers.
This gem would be the start of what you want. It’s missing some features like firmware flashing. But depending on what you want you could just use it from within Rails.
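For anyone following along: under the hood, any such wrapper is just issuing HTTPS requests against the Spark Cloud REST API. A minimal sketch of the function-call endpoint using only the Ruby standard library; the endpoint shape (`POST /v1/devices/:id/:function`) matches the documented Cloud API, but the device ID, function name, and token below are placeholders:

```ruby
require 'net/http'
require 'uri'

# Base URL of the Spark Cloud API (as documented at the time).
API_BASE = "https://api.spark.io/v1"

# Builds (but does not send) a POST request for calling a firmware function.
# The "args" form field name is taken from the Spark Cloud API docs.
def build_function_request(device_id, function_name, argument, token)
  uri = URI("#{API_BASE}/devices/#{device_id}/#{function_name}")
  request = Net::HTTP::Post.new(uri)
  request.set_form_data("access_token" => token, "args" => argument)
  [uri, request]
end

uri, request = build_function_request("0123456789abcdef", "brew", "on", "example-token")
puts uri.path  # => "/v1/devices/0123456789abcdef/brew"

# Actually sending it requires a real device ID and access token:
# response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
```

A gem like this one would wrap exactly this kind of request in a nicer object-oriented interface, which is why it drops straight into a Rails app.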
The link fails; did you move this gem elsewhere?
I joined forces with @efatsi; see the revised post.
Awesome, thanks. I am just getting started with RoR and was hoping for something to learn from.
@nixpulvis @efatsi YES!!! I really like how you’re wrapping the API here, it looks like a pleasure to use. I’d love to contribute to this. Also, it’s great y’all have joined forces, I hope I can throw down something useful too.
Right now all of the Ruby Spark API encapsulation stuff I’ve done is cooked into an internal gem we’re using here at Spark for various Opsy types of things. I’m going to figure out a way to break it into a separate public gem that this project could utilize or perhaps fork this project and issue a pull request. I’ll be sure to carve some time out during the next week or two to make this happen.
Awesome man. I’d say the joint effort should be a single gem that wraps the whole API, with as few dependencies as possible.
@jgoggins would love to see what you’ve cooked up!
@efatsi thanks for the reminder. I can’t believe it’s been two weeks! This is still on my list. I figured out how to go about easily breaking things out from our internal gem, but haven’t gotten around to actually doing it. I was planning on doing it this weekend, but I have family in town so won’t be able to. What if you and I bounced a few emails back and forth to vet and discuss what I was thinking, and if it seems good and useful, take the next steps? If that sounds good, hit me at joe at spark dot io.
It’s been quite some time since anyone has posted to this conversation and I wanted to post a quick update about where things landed.
Short story: The ruby_spark gem looks great! Use it.
Eli and I went back and forth (constructively) over email for a bit and in the end came to the conclusion that our goals with API encapsulation are a bit different. I was hoping for something comprehensive and easy to maintain from Spark’s standpoint, whereas Eli was shooting for something immediately usable for the most important Spark API use cases (i.e. function and variable calls), which his gem already does in a syntactically wonderful way, using great testing tools like RSpec + VCR.
Furthermore, with the availability of the awesome Node.js Spark CLI @Dave released just a couple of weeks ago, the need for what I was shooting for (comprehensive Spark API coverage) is already covered and illustrated there, so there isn’t a huge gain for this kind of encapsulation in Ruby right now.
It’s great to see libraries and wrappers in different languages for the Core, looking forward to more of these! Would you guys want us to try and help organize teams around the different languages?
I’ve been really impressed with the documentation put together by the Spark team. I think with the docs that are available we should be able to adhere to a pretty well-defined set of functionality. Building wrappers in a bunch of languages is just a matter of time.
Agreed. If there are a few people who are motivated to push stuff and make it happen, it’s gonna show up real fast.
Plus, with the rapid feedback from the community on what’s good, what should be better, what’s broken, etc., you can have something real cool done up!
Also wanted to note that the awesome Artoo package (from the Hybrid Group) has Spark support (in Ruby)
But also in JS:
On the other hand, if you are writing custom firmware, the Ruby Spark library is a better route because it is more flexible and generalized to support integration against custom firmware functions and variables.
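To make that concrete: reading a variable exposed by custom firmware boils down to a single GET against the documented Cloud API endpoint `GET /v1/devices/:id/:variable`. A stdlib-only sketch; the device ID, variable name, and token are placeholders:

```ruby
require 'uri'
require 'net/http'

# Builds the URL for reading a firmware-exposed variable via the Spark
# Cloud API. All identifiers here are illustrative, not real.
def variable_uri(device_id, variable_name, token)
  uri = URI("https://api.spark.io/v1/devices/#{device_id}/#{variable_name}")
  uri.query = URI.encode_www_form("access_token" => token)
  uri
end

uri = variable_uri("0123456789abcdef", "coffee_temp", "example-token")
puts uri.to_s
# Actually fetching it requires a real device and token:
# body = Net::HTTP.get(uri)
```

A wrapper gem generalizes this so any variable name your firmware registers can be read the same way, which is what makes it flexible for custom firmware.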
The Ruby Spark repo looks interesting. Looking to understand a little more.
Can the Ruby Spark repo make calls from the Spark Core to a Rails app? I’m looking to record info on the Spark Core and then, over Wi-Fi, POST that data to my Rails app’s database. Am I thinking of your repo in the right way?
Excuse my developer noob status; I also don’t have my Spark accessible right now to test the repo.
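Worth clarifying the direction of data flow here: the Ruby Spark gem calls from your server out to the Cloud API, whereas pushing readings from the Core into a Rails app means the Core's firmware POSTs to an endpoint your Rails app exposes. A hedged sketch of the JSON handling such an endpoint would do; the payload shape, field names, and the `Reading` model are made up for illustration:

```ruby
require 'json'

# Hypothetical JSON body a Core's firmware might POST to a Rails endpoint
# such as POST /readings (all field names here are illustrative):
payload = '{"core_id":"abc123","temperature":21.5,"recorded_at":"2014-03-01T12:00:00Z"}'

attrs = JSON.parse(payload)

# Inside a Rails controller action this would roughly become:
#   Reading.create!(core_id: attrs["core_id"], temperature: attrs["temperature"])
puts attrs["temperature"]  # => 21.5
```

So the gem is not needed on that path; it is useful for the reverse direction, when your Rails app wants to call functions or read variables on the Core.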