StorageHelperRK - Inconsistent Results Loading Data

@rickkas7 ,

Thank you for your comprehensive and illustrative response. I now see why my estimates of the size of each object and the required offsets were too small. I think this was one of my key issues.

Another issue I had was the frequency of flushing the data. I have moved to a model where I flush the nodeID and current data on demand rather than polling in the main loop, so I can be sure I catch changes as they occur during the few seconds that the gateway and nodes are awake each hour.
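
For context, here is roughly what the on-demand flush looks like at the end of each node exchange (a simplified sketch; nodeInteractionComplete and nodeDataChanged are placeholder names, and the flush(true) call reflects my reading of the StorageHelperRK API):

	// Sketch: write node data to FRAM only if something changed during the exchange
	void LoRA_Functions::nodeInteractionComplete() {						// hypothetical helper, called once per node exchange
		if (nodeDataChanged) {												// flag set by the accessors that modify node data
			nodeID.flush(true);												// force an immediate StorageHelperRK write to FRAM
			nodeDataChanged = false;
		}
	}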

If I may, there is one more area that is nagging me, and that is where the gateway stores information on the nodes. Most of the gateway’s interaction with the nodes is transactional: the node sends data, the gateway sends a response, and the node’s data is captured in a webhook that is queued for the next connect time. However, the gateway also needs to keep track of some node data over time (a sample layout follows this list):

  • Node number for each node (this is for LoRA and is not too important for reporting)
  • deviceID mapped to each node (used in reporting to the back-end)
  • LastConnect time - helps the gateway tell if there may be an issue in communications and can trigger resetting of the LoRA radio or the Gateway itself.
  • Success rate - the percentage of their data reports that get through, as reported by the nodes
  • Pending alerts that the gateway will send on the next interaction with the node (such as updating the sensor type)
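
For reference, the stored document has roughly this shape (only the "nodes" array and the "pend" key appear in the code below; the other key names and values here are illustrative placeholders):

	{"nodes":[
		{"node":1, "dID":"<deviceID-1>", "last":1679000000, "succ":98, "pend":0},
		{"node":2, "dID":"<deviceID-2>", "last":1679000060, "succ":100, "pend":1}
	]}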

I am storing all of this in a JSON object and saving that big (1024-byte) object from the LoRA_Functions class using StorageHelperRK. I have accessor functions that use JsonParserGeneratorRK to do things like get a node number given a deviceID, store connection times, and test node connection health.
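
One of those accessors, roughly as I have it (a sketch; getNodeNumber and the "node"/"dID" key names stand in for my actual names):

	// Sketch: look up a node number given a deviceID by walking the "nodes" array
	int LoRA_Functions::getNodeNumber(const char *deviceID) {
		const JsonParserGeneratorRK::jsmntok_t *nodesArrayContainer = NULL;
		jp.getValueTokenByKey(jp.getOuterObject(), "nodes", nodesArrayContainer);
		if (nodesArrayContainer == NULL) return 0;							// no "nodes" array in the document

		for (int i = 0; ; i++) {
			const JsonParserGeneratorRK::jsmntok_t *nodeObjectContainer = jp.getTokenByIndex(nodesArrayContainer, i);
			if (nodeObjectContainer == NULL) break;							// ran out of entries

			String dID;
			int node = 0;
			jp.getValueByKey(nodeObjectContainer, "dID", dID);
			jp.getValueByKey(nodeObjectContainer, "node", node);
			if (dID == deviceID) return node;								// found the matching deviceID
		}
		return 0;															// not found
	}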

Here is the question: in this model, I need to store the big JSON object every time there is a change, and while that is not too often, the object is large and the reporting window can get busy. I am creating the token objects in the scope of each function - but is this the right approach here?

Some specifics:

LoRA_Functions class header:

// JSON for node data
JsonParserStatic<1024, 50> jp;						// Make this global - reduce possibility of fragmentation

Then, in my LoRA_Functions::instance().setup() function, I load the JSON object and parse it:

...
	// Here is where we load the JSON object from memory and parse
	jp.addString(nodeID.get_nodeIDJson());				// Read in the JSON string from memory

	if (jp.parse()) Log.info("Parsed Successfully");
	else {
		nodeID.resetNodeIDs();
		Log.info("Parsing error resetting nodeID database");
	}
...
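
In case it helps, resetNodeIDs() just rewrites a minimal default document so the parser always has something valid to work with (a sketch; the class name and the assumed StorageHelperRK flush(true) call may differ from my actual code):

	// Sketch: fall back to an empty node list when parsing fails
	void nodeIDData::resetNodeIDs() {
		set_nodeIDJson("{\"nodes\":[]}");								// minimal valid document - an empty nodes array
		flush(true);													// assumed StorageHelperRK call to commit immediately
	}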

Finally, when I want to access the data, I use functions that create the node array and object containers for the structure and the key-value pairs that I can then examine and update. Here is one such function, again in the LoRA_Functions class:

bool LoRA_Functions::changeAlert(int nodeNumber, int newAlert) {
	int currentAlert;

	if (nodeNumber > 10) return false;										// Function only for configured nodes

	const JsonParserGeneratorRK::jsmntok_t *nodesArrayContainer = NULL;	// Token for the outer array
	jp.getValueTokenByKey(jp.getOuterObject(), "nodes", nodesArrayContainer);
	if (nodesArrayContainer == NULL) return false;							// No "nodes" array in the document
	const JsonParserGeneratorRK::jsmntok_t *nodeObjectContainer;			// Token for the objects in the array

	nodeObjectContainer = jp.getTokenByIndex(nodesArrayContainer, nodeNumber-1);	// find the entry for the node of interest
	if(nodeObjectContainer == NULL) return false;							// Ran out of entries - node number entry not found triggers alert

	jp.getValueByKey(nodeObjectContainer, "pend", currentAlert);			// Get the current pending alert value for this node
	Log.info("Changing pending alert from %d to %d", currentAlert, newAlert);

	const JsonParserGeneratorRK::jsmntok_t *value;							// Now we have the key-value pair for the "pend"ing alerts
	jp.getValueTokenByKey(nodeObjectContainer, "pend", value);

	JsonModifier mod(jp);													// Create a modifier object
	mod.startModify(value);													// Update the pending alert value for the selected node
	mod.insertValue((int)newAlert);
	mod.finish();

	nodeID.set_nodeIDJson(jp.getBuffer());									// This updates the JSON object but does not commit it to persistent storage

	return true;
}

So, after calling this function, if I have this correct, the JSON object “jp”, which is scoped to the LoRA_Functions class, will be updated with the new pending alert data. If I call another function in this class to, for example, print the current node data, it will see the update I made here.
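
Concretely, a typical interaction ends up looking something like this (changeSensorType is a placeholder for my other accessors, and nodeInteractionComplete is the flush helper sketched earlier):

	// Sketch: several in-RAM updates, then a single 1024-byte FRAM write at the end
	if (LoRA_Functions::instance().changeAlert(nodeNumber, newAlert)) nodeDataChanged = true;			// updates jp and the nodeID object in RAM
	if (LoRA_Functions::instance().changeSensorType(nodeNumber, newType)) nodeDataChanged = true;		// placeholder for other updates
	LoRA_Functions::instance().nodeInteractionComplete();												// one flush to FRAM per interaction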

So, I am only flushing the data to FRAM once on each interaction with a node, even if there are multiple calls to functions that query and update the nodeID data. Still, since this is just one huge object, the whole 1024 bytes gets backed up each time. It works, but is this the best way to do it?

As always, any advice appreciated.

Chip