Google Cloud Integration not getting data

I’m connecting my Photon to Google Cloud Datastore, but I cannot read the value of the data being passed.

I followed the tutorial on how to set up the Google Cloud integration, and it does work: I can see the events and I can save the attributes to my Datastore. The problem I’m having is accessing the data I’m passing from my Photon.

I’m looking at this guide: https://docs.particle.io/tutorials/integrations/google-cloud-platform/

This is what I get when I console.log the message before it is sent to the Datastore:

{ connectionId: '8ee5ea0d-<redacted>-365b6b627947',
  ackId: 'fj09-<redacted>-JLLD4',
  id: '17762767062884',
  attributes:
   { device_id: 'XXXXXXXXXXXXXXX',
     event: 'gc_tankreading',
     published_at: '2018-01-07T12:25:13.754Z' },
  publishTime: 2018-01-07T12:25:13.928Z,
  received: 1515327914367,
  data: <Buffer 30>,
  length: [Getter],
  ack: [Function: ack],
  nack: [Function: nack] } 

When I look at the Particle console I can see the data value from the Photon, but in the console.log output I see <Buffer 30>.

On my Photon I have the following to publish:
Particle.publish("my_event", String(cm), PRIVATE);

Thanks

You may want to show your JS code (assuming you are using JS) which does the logging too.

In order to unwrap nested objects you could use util.inspect()

Yes, I’m using JS on the Node.js App Engine.

Here is the code I have

'use strict';

const express = require('express');
const Firestore = require('@google-cloud/firestore');
const Pubsub = require('@google-cloud/pubsub');

const app = express();

const firestore = new Firestore({
    projectId: 'myprojectId',
    keyFilename: './credentials/mycredentials.json'
});

const pubsub = new Pubsub({
    projectId: 'myprojectid',
    keyFilename: './credentials/mycredentials.json'
});

const topic = pubsub.topic('projects/mydb/topics/mydocument');
const subscription = pubsub.subscription('projects/mytopic/subscriptions/myevent');

function saveEvent(message) {
    const dataRef = firestore.collection('mydocument').doc();
    console.log(message);
    const obj = {
        device_id: message.attributes.device_id,
        //event: message.attributes.event,
        reading: message.data,
        published_at: message.attributes.published_at
    };

    dataRef.set(obj);
}

subscription.on('message', function(message) {
    saveEvent(message);
    message.ack();
});


app.get('/', (req, res) => {
    res.status(200).send('Status: OK');
});

if (module === require.main) {
    // [START server]
    // Start the server
    const server = app.listen(process.env.PORT || 8081, () => {
        const port = server.address().port;
        console.log(`App listening on port ${port}`);
    });
    // [END server]
}
  
module.exports = app;

I based my test code on the Particle tutorial, and I made a small change to use the Firestore database.

Also, the console.log output I showed in the first message is from the saveEvent function.

When I use util.inspect I get the following.

<Buffer 31 36 39>

This is what I see on my particle console.

{"data":"169","ttl":60,"published_at":"2018-01-07T17:51:04.868Z","coreid":"123","userid":"123","version":0,"public":false,"productID":123,"name":"myevent"}

The data field shows the correct value that I’m sending from my Photon.

I was able to fix it and get the data. I had to do the following to message.data:

Buffer.from(message.data, 'base64').toString();
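For reference, a self-contained sketch of that decode step (the simulated message mimics the logged one; note that when message.data is already a Buffer, Buffer.from ignores the 'base64' argument and just copies the bytes, and when it is a base64 string it decodes it, so this handles both cases):

```javascript
// Decode a Pub/Sub message payload to a plain string
function decodeData(data) {
  // If data is a Buffer, the encoding argument is ignored (bytes are copied);
  // if it is a base64 string, it is decoded
  return Buffer.from(data, 'base64').toString();
}

// Simulated message, shaped like the one logged above
const message = { data: Buffer.from('169') };
console.log(decodeData(message.data)); // → '169'
```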

I ran into this same problem. Looks like the encoding is jacked up somewhere.

Sending data that looks like this:

"data":{"temperature":67.28,"humidity":44.00,"thermistor":68.06,"bpm":0}

I used this js code:

  // decode data
  var data = JSON.parse(Buffer.from(message.data, 'base64').toString());
    
  // Copy the data in message.data, the Particle event data, as top-level 
  // elements in obj. This breaks the data out into separate columns.
  for (var prop in data) {
    if (data.hasOwnProperty(prop)) {
      obj[prop] = data[prop];
    }
  }