Dup, the data that you are producing needs to be in JSON format similar to the example on the logging topic:
sprintf(resultstr, "{\"data1\":%d,\"data2\":%d}", data1, data2);
the result of doing a GET from the spreadsheet will look like this:
"result": "{\"data1\":23,\"data2\":26,}",
The JSON structure here is a string returned under the key "result". That string is itself JSON, so JSON.parse() turns it into an object whose property names (data1, data2) map to the values (23, 26). The Google spreadsheet parses it using (from the example):
var p = JSON.parse(result); // parse the JSON you created
var d = new Date(); // time stamps are always good when taking readings
sheet.appendRow([d, p.data1, p.data2]); // append the date, data1, data2 to the sheet
So your code should read something like this:
sprintf(resultstr, "{\"Ax\":%6d,\"Ay\":%6d,\"Az\":%6d,\"Mx\":%6d,\"My\":%6d,\"Mz\":%6d}", ax, ay, az, mx, my, mz);
Which will result in this JSON output from the Spark Cloud (where 1234 is the actual value of the variables):
"result": "{"Ax":1233,"Ay":1234,"Az":1234,"Mx":1234,"My":1234,"Mx":1234}"
Then, in the spreadsheet, you parse that out like this:
sheet.appendRow([d, p.Ax, p.Ay, p.Az, p.Mx, p.My, p.Mz]); // append the date and data to the sheet
Give that a shot and let me know how it goes.