Subscribe to two Spark Core boards and decide which one published data first

In order to use strtok() with const char *data, you need to copy it into a writable char array first.
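A minimal sketch of that copy-then-tokenize step (splitEvent() and the "millis|data" payload shape are assumptions for illustration, not part of the Particle API):

```cpp
#include <cassert>
#include <cstring>

// strtok() writes into the string it parses, so a const char* payload must
// be copied into a writable buffer first. splitEvent() is a hypothetical
// helper: it copies `data` locally and extracts the two '|'-separated
// fields into the caller's arrays.
void splitEvent(const char *data, char *first, char *second) {
    char buf[64];
    strncpy(buf, data, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';              // guarantee termination

    char *tok = strtok(buf, "|");
    strcpy(first, tok ? tok : "");            // part before the '|'
    tok = strtok(NULL, "|");
    strcpy(second, tok ? tok : "");           // part after the '|' (or empty)
}
```

Inside a subscribe handler you would call it with the handler's `data` argument and two of your global arrays.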

Or you use String deviceEventData1; since that already is a String object, you don't want the [64] array size.

Obviously you need to get some of the C basics worked out in order to tackle projects like this.


@ScruffR
I saw your example before but what I used for the strtok is:

since I need every token as a variable so I can use them in my program.

I tried that

but I still have the same errors

The key info about why strtok() does not work with a const char* can be found in the link I provided above.

If you go for String, you don't use strtok() but indexOf() and substring().
These are very important basics to understand before going any further.
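For illustration, the String route splits the same "millis|data" payload with index/substring operations. The sketch below uses std::string's find()/substr(), which mirror Wiring String's indexOf()/substring() (the helper names are my own, not from the Particle API):

```cpp
#include <cassert>
#include <string>

// The same split done with a string class instead of strtok(). On the
// device this would be String::indexOf() and String::substring(); the
// std::string equivalents find()/substr() shown here work the same way.
std::string beforeBar(const std::string &s) {
    size_t p = s.find('|');
    return (p == std::string::npos) ? s : s.substr(0, p);
}

std::string afterBar(const std::string &s) {
    size_t p = s.find('|');
    return (p == std::string::npos) ? std::string() : s.substr(p + 1);
}
```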


@ScruffR Yes, I read that in the link that you provided. After redefining my variables, the code is compiled successfully.

#include "Particle.h"
char *deviceEventMillis1;
char *deviceEventMillis2;
char *deviceEventData1;
char *deviceEventData2;
char *recdata1;
char *recdata2;

void myHandler3(const char *event, const char *data)
{
  strcpy(recdata1, data);
  strcpy(deviceEventData1, strtok(recdata1, "|"));
  strcpy(deviceEventMillis1, strtok(NULL, "|"));
  Particle.publish("Pad3", deviceEventData1);
}

void myHandler4(const char *event, const char *data)
{
  strcpy(recdata2, data);
  strcpy(deviceEventData2, strtok(recdata2, "|"));
  strcpy(deviceEventMillis2, strtok(NULL, "|"));
  Particle.publish("Pad4", deviceEventData2);
}

void setup()
{
  Serial.begin(4800); 
  Particle.subscribe("Carpet3", myHandler3);
  Particle.subscribe("Carpet4", myHandler4);
  if (deviceEventMillis1-deviceEventMillis2>0) {
    strcat(deviceEventData1, deviceEventData2);
    Particle.publish("Pad3,Pad4", deviceEventData1);
  }
  else if (deviceEventMillis2-deviceEventMillis1>0) {
    strcat(deviceEventData2, deviceEventData1);
    Particle.publish("Pad4,Pad3", deviceEventData2);
  }
}

void loop(){}

but I’m not sure if this is the right way or not?

Nope, this will corrupt your RAM contents since you have not allocated space for your char* variables.

Just leave your variables like you had them before:

char deviceEventMillis1[64];
char deviceEventMillis2[64];
char deviceEventData1[64];
char deviceEventData2[64];
char recdata1[64];
char recdata2[64];
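As a side note, even with the arrays allocated, strcpy() will overrun them if a token is longer than 63 characters, and strtok() can return NULL when the separator is missing. A hedged sketch of a bounded copy helper (copyToken() is a made-up name, not part of the Particle API):

```cpp
#include <cassert>
#include <cstring>

// strcpy() trusts the source to fit the destination, and strtok() returns
// NULL when a separator is missing. copyToken() (a hypothetical helper)
// guards against both: it truncates over-long tokens and turns a NULL
// token into an empty string.
void copyToken(char *dst, size_t dstSize, const char *src) {
    if (src == NULL) {            // strtok() found no token
        dst[0] = '\0';
        return;
    }
    strncpy(dst, src, dstSize - 1);
    dst[dstSize - 1] = '\0';      // strncpy() may not terminate; do it here
}
```

In the handlers you would write `copyToken(deviceEventData1, sizeof(deviceEventData1), strtok(recdata1, "|"));` instead of the bare strcpy().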

So what should I do in this case, since I followed what you told me here?

Thanks for that. It also compiled successfully. I will try to connect my sensors and give feedback.

@ScruffR Is it normal that the Spark (receiver) goes into SOS status after I flash my code if the sender Sparks are not connected to the cloud?


Edit: I moved this part of the code inside myHandler3 and myHandler4 to get rid of the SOS status.

if (deviceEventMillis1-deviceEventMillis2>0) {
  strcat(deviceEventData1, deviceEventData2);
  Particle.publish("Pad3,Pad4", deviceEventData1);
}

So the handler looked like this:

void myHandler3(const char *event, const char *data)
{
  strcpy(recdata1, data);
  strcpy(deviceEventMillis1, strtok(recdata1, "|"));
  strcpy(deviceEventData1, strtok(NULL, "|"));
  Particle.publish("Pad3", deviceEventData1);
  if (deviceEventMillis1-deviceEventMillis2<0) {
    strcat(deviceEventData1, deviceEventData2);
    Particle.publish("Pad3,Pad4", deviceEventData1);
  }
}

void myHandler4(const char *event, const char *data)
{
  strcpy(recdata2, data);
  strcpy(deviceEventMillis2, strtok(recdata2, "|"));
  strcpy(deviceEventData2, strtok(NULL, "|"));
  Particle.publish("Pad4", deviceEventData2);
  if (deviceEventMillis2-deviceEventMillis1<0) {
    strcat(deviceEventData2, deviceEventData1);
    Particle.publish("Pad4,Pad3", deviceEventData2);
  }
}

Hello @ScruffR
I tested my receiver Spark code with the sensors and the other two Sparks (senders), but I have an issue with the data that I'm receiving. The scenario should work as below:
1- The senders Spark1 and Spark2 send the data to the receiver (Spark3).
2- The receiver Spark3 receives the data, separates the data from the millis, compares the millis of Spark1 and Spark2, concatenates the first-arriving data with the second-arriving data, and publishes the result to the cloud as data1data2 or data2data1.

What is happening now is that whenever Spark1 sends data, Spark3 publishes it as data1data2 without taking Spark2's data into account. Initially, if Spark1 sends data, Spark3 publishes data1 as the concatenated data since Spark2 hasn't sent anything yet. When Spark2 then sends its data, Spark3 appends it after Spark1's data and sends it to the cloud as data1data2.

If Spark2 starts sending data followed by Spark1, Spark3 will still publish the data as data1data2 (that means Spark1's data comes first, which is not correct).

I believe that the issue is in this condition:

I also tried to add one of these three conditions before the above one but without success:

if (deviceEventData1!=0)
if (deviceEventData2!=0)
if (deviceEventData1!=0 && deviceEventData2!=0)

Any suggestion please?

Can you post a handful of events (from the dashboard) from Spark1 and Spark2, and then what you want the data from Spark3 to look like?


Thanks @justinmy
Before operating Spark3, the data that I receive from Spark1 and 2, labeled as Carpet3 and Carpet4 on the Dashboard, is shown below:

When I operate Spark3 and start sending data from Spark1 only, Spark3 should send Spark1's data to the cloud (labeled as Carpet3) and wait to receive data from Spark2 (labeled as Carpet4), then concatenate them and send them as a single stream, SCXESDXE (labeled as Pad3,Pad4). Right now, whenever I receive data from Spark1, Spark3 sends its data directly without waiting for Spark2, and also sends it as concatenated data, as shown below:

The data that I should receive is:
1- When Spark1 sends data first, followed by Spark2, Spark3 should send SCXESDXE (labeled as Pad3,Pad4).
2- When Spark2 sends data first, followed by Spark1, Spark3 should send SDXESCXE (labeled as Pad4,Pad3).
Even when the data from Spark2 comes before Spark1's, Spark3 sends it to the cloud as shown below:

That means the if condition in handler4 (for Spark2) was not applied even when the data exists, while the if condition in handler3 is applied even when the data from Spark2 does not exist.

You're sending the data like 12345678.1234|SC8E, right? The above code is putting the first part of that string (the number part) into deviceEventData1, and the second part into deviceEventMillis1; that's backwards, isn't it? You should be copying the first part of the string into deviceEventMillis1.


I'm sorry for that; yes, you are correct @Ric. I changed my code above but forgot to change it here in the Particle community, so it should look like this:

@Ric Any suggestion to what is happening that prevents the code to work as expected?

How frequently should devices 1 and 2 be sending data? I ask so I can best come up with a method for concatenating the strings together. In your sample above they are 3 seconds apart; is that common? What's the smallest time frame? And the largest?


@justinmy They should be activated in the sequence device1 then device2, or device2 then device1. The time frame between them should be between 2 and 5 seconds. Here is a figure of the scenario that I want to achieve:

okay I have some thoughts, but it's too hard to type out on my phone :smiley: I'll reply when I get to my desk. But the basic idea is that the handlers check whether the other data exists; if not, they exit. If it does, they copy and zero out the other data. Right now you have no way to reset the data, so there's no way to know whether to process it or not.
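That "check, combine, zero out" idea could be sketched like this. handleEvent() and the buffer layout are my assumptions, not the Particle API; on the device each subscribe handler would call it with its own index (0 for Pad3, 1 for Pad4) and then Particle.publish() the result when it is non-empty. Note also that subtracting the two millis char arrays compares their addresses, not the timestamps, so the strings are converted with atof() before comparing:

```cpp
#include <cassert>
#include <cstdlib>
#include <cstring>

// One buffer pair per sender; evData[i][0] == '\0' means "no unprocessed data".
static char evMillis[2][64];
static char evData[2][64];

// Stores the incoming tokens, then combines the two payloads in arrival
// order once both exist; writes the combined string to `out`, else "".
void handleEvent(int idx, const char *millisTok, const char *dataTok, char *out) {
    strncpy(evMillis[idx], millisTok, 63); evMillis[idx][63] = '\0';
    strncpy(evData[idx], dataTok, 63);     evData[idx][63] = '\0';
    out[0] = '\0';

    int other = 1 - idx;
    if (evData[other][0] == '\0') return;       // other pad not seen yet: wait

    // Compare the timestamps numerically; subtracting the char arrays
    // themselves would only compare their addresses, not their values.
    if (atof(evMillis[idx]) >= atof(evMillis[other])) {
        strcpy(out, evData[other]);             // other pad fired first
        strcat(out, evData[idx]);
    } else {
        strcpy(out, evData[idx]);               // this pad fired first
        strcat(out, evData[other]);
    }
    evData[0][0] = evData[1][0] = '\0';         // zero out: both processed
}
```

The zero-out at the end is what your current code is missing: without it, stale data from the previous round makes the concatenation fire immediately on the next single event.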


Also, how frequently do they both post data (i.e., the time between groups of data)?


Do you mean that I should do something like this in handler3

and

in handler4?

It might be one minute, one hour, or any other period, since Spark1 and 2 are placed in a hallway and will activate when someone comes near them.