Spark function and a loop

I think the best approach is to treat a Spark.function() handler like you would an interrupt handler: do the minimum work you can in the handler itself and post flags that you check back in the main loop.
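A minimal sketch of that flag-posting pattern, assuming the standard int handler(String args) signature that Spark.function() expects. The handler name and flag are made up for illustration, and the cloud registration is shown as a comment so the snippet compiles off-device (with std::string standing in for the Wiring String class):

```cpp
#include <string>
typedef std::string String;   // stand-in for the Wiring String class

// Flag posted by the cloud handler, consumed in the main loop
volatile bool updateRequested = false;

// Spark.function() handler: do the minimum work and just post a flag
int requestUpdate(String args) {
    updateRequested = true;   // record the request, nothing more
    return 1;                 // acknowledge the call to the cloud
}

void setup() {
    // On the Core you would register the handler here:
    // Spark.function("update", requestUpdate);
}

void loop() {
    if (updateRequested) {
        updateRequested = false;
        // ...do the real (possibly slow) work here, outside the handler...
    }
}
```

The handler returns almost immediately, so the cloud call never blocks on the slow work; loop() picks the flag up on its next pass.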

In particular, I would not put a while(1) {...} loop inside a Spark.function() or interrupt handler, and I would avoid routine Serial.println() calls there as well, since they can take a long time (several milliseconds at, say, 9600 baud); use them only for debugging.

Your second function, minus the debugging Serial.println() calls, is more in keeping with best practices.

Just like on Arduino, the return value of loop() is ignored (it is declared void), so to return JSON, use a Spark.variable() backed by a string. You can use a Spark.function() to trigger an update to the JSON variable, or use Spark.publish() to let the web side know that the variable was updated, depending on how you compute the values. You can also use Spark.publish() directly to send JSON to the web side; I have a tutorial showing how to do that here in the forum.
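Here is a sketch of that Spark.variable() approach. The variable name "result", the event name, and the temperature reading are all invented for illustration, and the cloud calls are shown as comments so the snippet compiles off-device:

```cpp
#include <cstdio>
#include <string>
typedef std::string String;   // stand-in for the Wiring String class

// JSON string exposed to the web side; in setup() you would register:
//   Spark.variable("result", jsonResult, STRING);
//   Spark.function("refresh", refresh);
char jsonResult[64] = "{}";
volatile bool refreshRequested = false;

// Spark.function() handler: just flag that the JSON needs rebuilding
int refresh(String args) {
    refreshRequested = true;
    return 1;
}

void loop() {
    if (refreshRequested) {
        refreshRequested = false;
        int temperature = 21;   // hypothetical sensor reading
        snprintf(jsonResult, sizeof(jsonResult),
                 "{\"temperature\":%d}", temperature);
        // Optionally tell the web side fresh data is ready:
        // Spark.publish("result-updated", jsonResult);
    }
}
```

The web side then reads the "result" variable over the REST API whenever it likes, and the publish (if you use it) saves it from having to poll.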