I’d like to be able to return strings and customize the REST response from the Spark API so that I can provide something more meaningful than simply an integer.
It’d be great if I could do something like this example.
There’s no mechanism to do that directly now. However, you can build it with the features available in the core at present.
To simply get a string back to a caller, what you could do is have a function that updates a String variable. So you’d first call the function via the cloud, the core updates its variable, and then you fetch the value of the variable from the cloud.
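Here’s a rough sketch of that simple approach - the names `status` and `setStatus` are just placeholders I made up, not anything official:

```cpp
// Minimal sketch: a cloud function updates a char buffer that is
// also exposed as a cloud variable, so callers can read it back.
char statusJson[64] = "{}";          // buffer exposed as a Spark.variable

int setStatus(String args);          // forward declaration

void setup() {
    // Readable via GET /v1/devices/{device_id}/status
    Spark.variable("status", statusJson, STRING);
    // Callable via POST /v1/devices/{device_id}/setStatus
    Spark.function("setStatus", setStatus);
}

void loop() {
}

// The function updates the variable; the caller then fetches the
// variable from the cloud to get the string result.
int setStatus(String args) {
    snprintf(statusJson, sizeof(statusJson),
             "{\"value\":\"%s\"}", args.c_str());
    return 1;   // simple success code
}
```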
This works for simple cases where you just need an updated value, but not if you want to associate the value with a specific function call.
In that case, we need to get a bit more complex. Here, the function would increment a value, such as a counter, and return that as the function result. That counter represents a unique ID for that particular function call. At the same time, the function publishes an event with a name like “update/&lt;id&gt;”, where &lt;id&gt; is the counter value returned by the function. The data for the published event is the string to deliver - the JSON string you posted above.
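Something along these lines on the firmware side (the name `doWork` and the exact event naming are just illustrative):

```cpp
// Sketch of the counter + publish approach: each call gets an ID,
// and the string result is delivered via a published event.
unsigned int callCounter = 0;

int doWork(String args) {
    callCounter++;                              // unique ID for this call

    // Build the event name "update/<id>" and publish the JSON payload
    char eventName[24];
    snprintf(eventName, sizeof(eventName), "update/%u", callCounter);
    Spark.publish(eventName, "{\"result\":\"ok\"}");

    return callCounter;                         // handed back to the caller as the function result
}

void setup() {
    Spark.function("doWork", doWork);
}

void loop() {
}
```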
To get the value, a client does this:
1. Subscribe to “update” events from the core.
2. Call the function and save the result as ID.
3. Wait for an “update/ID” event to arrive (where ID is the function return value from step 2).
I hope that’s clear - I’m happy to elaborate where needed.
I think the integer return value from Spark.function() is meant for success/failure codes, like -1 or 404 for failure and +1 for success.
If you treat Spark.function() as a set method and Spark.variable() as a get method for variables on the core, everything works out nicely. With a char-array Spark.variable() you can return your own JSON as the value, which comes back to you in the “result” field from the cloud. The only small thing to watch out for is that Spark.function() accepts a much shorter string than Spark.variable() can return.
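A quick sketch of that set/get pattern - the names and the JSON shape here are made up just to show the idea:

```cpp
// "Setter" via Spark.function(), "getter" via a char-array Spark.variable()
// that holds JSON returned in the cloud's "result" field.
char state[128] = "{\"led\":\"off\",\"brightness\":0}";

int setState(String args) {
    // Expect input like "on,128" (no real error handling here)
    int comma = args.indexOf(',');
    if (comma < 0) return -1;

    String led = args.substring(0, comma);
    int brightness = args.substring(comma + 1).toInt();

    // Update the JSON that the cloud hands back when the variable is read
    snprintf(state, sizeof(state),
             "{\"led\":\"%s\",\"brightness\":%d}", led.c_str(), brightness);
    return 1;
}

void setup() {
    Spark.variable("state", state, STRING);   // the "get"
    Spark.function("setState", setState);     // the "set"
}

void loop() {
}
```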
That's true, but it's merely a convention - the cloud just passes the return value back without interpreting it, so you can pass back anything, including an ID for that specific function call, and then associate that ID with more data via a published event.
@ubergeek82 - what do you want to do with this json?
I’d like to be able to add more context to the JSON responses, but I understand there are bound to be some limitations. I’m planning on using an intermediary like Apigee to provide more elaborate messaging and “RESTfulness”.
Thanks for the tip about registering events - I think I’ll use this for things like status updates rather than polling Spark.variable().