Receive data over I2C on the Spark Core and send it to the cloud

Hey Folks
I want to build a project on the Spark Core: a set of four smart carpets. I have 16 pressure sensors embedded in the four carpets, which produce a voltage when someone steps on them. Each carpet has 4 sensors.

Each carpet's data is received by a PIC18 microcontroller board. Each PIC18 converts the analog voltage to digital using its ADC and then encodes it as ASCII. This is the first part of my project, which was already done by one of my friends.

The part I want to do now is sending these ASCII codes over I2C to Spark Core 1 (the master), which forwards the data wirelessly to the cloud. The other three carpets will each have their own system (PIC18 + Spark Core) and act as slaves. They will wait for control from the master Spark Core before sending their data to the cloud or to the master.

For example, if segment 3 has data, it will wait until it receives a permission signal from the master before sending its data to the cloud, or it will send the data to the master and the master will forward the combined data to the cloud. Overall, my system will have four smart carpets, and each carpet's system consists of a PIC18 + Spark Core.

If it is easier to send the data directly to the cloud without asking the master for permission, I'll go with that solution instead. A sample of my data looks like this: SA0001E, where S marks the start, A identifies carpet A, E marks the end of the data, and 0001 encodes which sensors in the carpet are active; here it means that only one sensor is active in this carpet.

Any ideas or sample code that would help me get started?

I appreciate your help, and I hope the overall idea of my project is clear.

Here is a picture that demonstrates the idea.

Why not replace the PIC18 with the Photon/Core?


Hello @tjp
Thanks for your suggestion, but the PIC18 part of my project was already done by my friend, so I can't change it at this time.

@Ahmedsa1983

Unless you have specific requirements for the master, there is no need to have a master collect the data from the slaves and send it to the cloud. You can run the same code on all the Spark modules. Each Spark Core can get the data from its PIC and send it to the cloud using Spark.publish. Please note that you will need a web page to pull the data and take any actions. You can have a look at the tutorial on Spark.publish at the link
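
For illustration, the publishing part on its own could be as small as this sketch. The event name and payload here are placeholders, not part of the actual project:

//Illustration only: publish a hard-coded test payload (placeholder event name)
void setup()
{
}

void loop()
{
  Spark.publish("carpet_test", "SA0001E", 60, PRIVATE);
  delay(5000); // publishes are limited to about one per second, so wait between them
}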

Hope this gives you a good starting point


Thanks @TheVelozGroup
I think I'll go with your suggestion, but does this mean I can't display the published data if I don't have a web page? Is there any way I can view the published data somewhere in the cloud without needing to create a web page?

@Ahmedsa1983

Please check http://docs.particle.io/photon/dashboard/. This page describes the Dashboard tool, which can be used to view events published by your Cores.

Hope this helps.


Thanks @TheVelozGroup
I might change the connection between the Spark Core and the PIC18F to serial instead of I2C.
Here is the code that I'm trying to complete in order to put on my Spark Cores, but I'm not sure whether I will be able to see the serial data on the Dashboard. Are there any errors in the code? Can I publish my data to the Dashboard privately?

Thanks for your help

Here is the code (Reformatted and some comments added by ScruffR)

//Spark1
INT RXPIN = RX;
INT LED = D7;
str incomingBytes1 = 0; // for incoming serial data

void setup()
{
  PINMODE(RXPIN,INPUT);
  PINMODE(LED,OUTPUT);
  Serial1.begin(19200); // start serial for output
  Spark.subscribe("Jag_Smart_Carpet2", myHandler, MY_DEVICES);
}

void loop()
{
  if (Serial1.available()) 
    SPARK_WLAN_Loop(); 
  {   // <-- this block does NOT belong to the if-statement
    if (digitalRead(RXPIN)>0) 
    {
      incomingBytes1 = Serial1.read();
      delay(100);
      Spark.publish("Jag_Smart_Carpet1",incomingBytes1,PRIVATE);
    }
  }   // <-- end of orphan block
}

void myHandler(const char *Jag_Smart_Carpet2, const char *data)
{
  if (strcmp(data,"incomingBytes1")==0) 
  {
    // if there is data from the incomingBytes1
    Serial1.print("incomingBytes1 is");
    Serial1.print(incomingBytes1);
    digitalWrite(LED,HIGH);
    delay(300);
  }
  else 
  {
    digitalWrite(LED,LOW);
    delay(300);
  }
}
---------------------------------------------------
//Spark2
INT RXPIN = RX;
INT LED = D7;
str incomingBytes2 = 0; // for incoming serial data

void setup()
{
  PINMODE(RXPIN,INPUT);
  PINMODE(LED,OUTPUT);
  Serial1.begin(19200); // start serial for output
  Spark.subscribe("Jag_Smart_Carpet1", myHandler, MY_DEVICES);
}

void loop()
{
  if (Serial1.available()) 
    SPARK_WLAN_Loop(); 
  {   // <-- orphan block
    if (digitalRead(RXPIN)>0)
    {
      incomingBytes2 = Serial1.read();
      delay(100);
      Spark.publish("Jag_Smart_Carpet2",incomingBytes2,PRIVATE);
    }
  }   // <-- end of orphan block
}

void myHandler(const char *Jag_Smart_Carpet1, const char *data)
{
  if (strcmp(data,"incomingBytes2")==0) 
  {
    // if there is data from the incomingBytes2
    Serial1.print("incomingBytes2 is");
    Serial1.print(incomingBytes2);
    digitalWrite(LED,HIGH);
    delay(300);
  }
  else 
  {
    digitalWrite(LED,LOW);
    delay(300);
  }
}

Any other suggestions, @tjp @bko @pra?
Thanks to all of you.

Not sure I understand your code, but I'm pretty sure you don't want to be doing digitalRead()s on the RX pin when it's being used for Serial1 input. I also don't see how data in myHandler would ever get the value "incomingBytes1" or "2". Serial1.print(incomingBytesX) is going to write whatever the last input was back out the serial port. Is that your intent? Never having used the Arduino programming model or publish/subscribe and so on, I'm not a good source. Maybe @bko and/or @peekay123 can help you out here.


Oh! This code needs some healing! I’ll look in the morning. :grinning:


This here is an odd construct (the SPARK_WLAN_Loop() call followed by the stray brace block in loop()).
What do you intend with this?

Could you please correct your indentation? That might also help you see structural issues in your code (like the one quoted).

And @pra has a point. Have another look at how to use the built-in Serial1.


@Ahmedsa1983, I am so confused by what you are trying to do! If your goal is to send the carpet sensor data from the PIC18 to a Core and then to the cloud then your code is not reflecting that. What I see is (the intention to have) one Core publish its data to another Core. Instead, you should have a node.js app on a server listening for the published events to gather the data.

Nonetheless, using serial data on the Core is possible because you have a "start" character "S" that the Core can listen for in order to parse the PIC18 data accordingly. However, your code, as it stands, will not work at all, for a whole bunch of reasons. So let's start over. Here are some questions before proceeding with any advice:

  1. How much sensor data (number of bytes) does the PIC prepare and send?
  2. At what rate does the PIC refresh and send the data?
  3. How often do you need to publish this data to the "cloud"?
  4. Does the carpet data need to be ordered in time (i.e. have a timestamp)?
  5. Do you expect to have a cloud server listening for and gathering the published data?

I look forward to your answers :smile:
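
Just to illustrate the start-character idea above: collecting one S...E frame from Serial1 could look roughly like this. The function name and buffer size are made up for the example.

// Sketch only: gather one "S...E" frame from Serial1 into buf.
// Returns true once a complete frame has been received.
bool readFrame(char *buf, size_t len)
{
  static size_t pos = 0;
  static bool inFrame = false;

  while (Serial1.available())
  {
    char c = Serial1.read();
    if (c == 'S')                       // start marker: begin a new frame
    {
      inFrame = true;
      pos = 0;
    }
    else if (c == 'E' && inFrame)       // end marker: frame complete
    {
      buf[pos] = '\0';
      inFrame = false;
      return true;
    }
    else if (inFrame && pos < len - 1)  // payload characters, e.g. "A0001"
    {
      buf[pos++] = c;
    }
  }
  return false;
}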


Hello @pra
Thanks for your help.
Actually, I'm not an expert in C programming, but I tried to write this code by following the example on the Particle.io website together with the instruction format in the firmware documentation. As for the digitalRead, I also think it is unnecessary here since I'm already using Serial1 input. I couldn't understand exactly how the myHandler function works, but I thought that I could see the data on the Dashboard just by calling publish; I'm not sure. What I need to do is read the sensor data (when it is >0) coming from the PIC18F board with the Spark, and publish this data to the cloud for display.
Thanks again.

Dear @ScruffR
As I explained to @pra, I'm new to C programming, and I used SPARK_WLAN_Loop() because I thought it would check the connectivity between the cloud and the Spark to make sure they are connected correctly. I'll take another look at how to use the built-in Serial1, but I couldn't understand how to act on your comment about the structural issues in my code (the one quoted).
Thank you so much.

Hello @peekay123
Thanks for your great reply. My goal, as I mentioned before, is to read the sensor data (when it is >0, such as SA0001E) coming from the PIC18F board with the Spark, either over I2C or Serial1, and publish this data to the cloud for display. If by node.js you mean a server that receives the published data, then yes, I need a server if it is better than the Dashboard for displaying and controlling the data. I only wanted to establish communication between Spark 1 and Spark 2 when I wrote this code. To answer your questions, if I understood them correctly:
1. I need 7 or 8 bytes each time there is data from the sensors (the data may be zero, e.g. SA0000E, meaning there is no active sensor, or have a value >0, e.g. SA0001E), and the PIC18 streams the data continuously whether there is an active sensor or not.
2. The baud rate will be 9600 or 19200.
3. I want to publish the data to the cloud when it is >0, and put the Spark to sleep when the data is zero (no active sensor).
4. Yes, the data needs to be ordered in the cloud.
5. Yes, I expect to have a cloud server listening for and gathering the published data.
Thank you so much.
I appreciate your help.

Are we to assume the 4 carpets are located some distance away from each other, so that an I2C wire between carpets would not be usable?

If the carpets are close enough to run an I2C wire between them, then you should only need one Core rather than 4.


Dear @Jack
I'll put the carpets in the four corners of a room (one in each corner), so I need to build four systems (each system will have a PIC18 and a Spark Core).
Thanks for your help.


@Ahmedsa1983, I think I understand except for a few minor details. Here are a couple of issues with your design:

  1. The PIC18 puts out data constantly which the Core reads, looking for “data frames” representing sensor elements.
  2. By reading and parsing the data frames, the Core can tell if a sensor is on or not. HOWEVER, the Core cannot sleep since it must be reading the data to assess sensor status.
  3. Since a Core cannot publish more than once a second, the sensor data from the PIC18 would have to represent the number of sensor "ON" events that were read in the second prior to the publish, and if there were no events, not publish at all.

For the Core to sleep, the PIC18 would need to set a digital input on the Core when any of its sensors turns on. This would wake the Core which could then sample the PIC18 data.

Is this correct?
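
A very rough sketch of that wake-pin idea, assuming the PIC18 drives, say, D1 high while a sensor is active. The pin choice, buffer size and timings here are arbitrary and untested:

// Rough sketch: sleep until the PIC18 raises a wake pin, then read and publish one frame.
// D1 as wake pin, the 16-byte buffer and the 100 ms read window are arbitrary choices.
const int WAKE_PIN = D1;
char frame[16];

void setup()
{
  pinMode(WAKE_PIN, INPUT_PULLDOWN);
  Serial1.begin(19200);                       // UART from the PIC18
}

void loop()
{
  if (digitalRead(WAKE_PIN) == HIGH)          // PIC signals "a sensor is active"
  {
    int pos = 0;
    unsigned long start = millis();
    while (millis() - start < 100 && pos < (int)sizeof(frame) - 1)
    {
      if (Serial1.available())
      {
        char c = Serial1.read();
        if (c == 'E') break;                  // end of frame
        if (c != 'S') frame[pos++] = c;       // keep payload, skip the start marker
      }
    }
    frame[pos] = '\0';
    Spark.publish("carpet_A", frame, 60, PRIVATE);
    delay(1000);                              // respect the ~1 publish per second limit
  }
  else
  {
    Spark.sleep(WAKE_PIN, RISING);            // stop mode until the PIC raises the pin
  }
}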


How real-time do you want the info? Is this to be just a historic log, or do you need it published quickly? Within 3 seconds? 30 seconds? 5 minutes?


I've reformatted your code above as I would write it, and if you look at this, you might see one of the structural issues that I meant:

  if (Serial1.available()) 
    SPARK_WLAN_Loop(); 
  {   // <-- orphan block
    if (digitalRead(RXPIN)>0)
    {
      incomingBytes2 = Serial1.read();
      delay(100);
      Spark.publish("Jag_Smart_Carpet2",incomingBytes2,PRIVATE);
    }
  }   // <-- end of orphan block

With this kind of indentation you always have opening and closing curly braces in the same column and directly under their "parent" statement.
Since the marked braces have no "parent" - due to the semicolon ; ending the if-statement before the opening brace - your intended structure might be broken.


BTW: Since C/C++ is case sensitive, this might not work

PINMODE(RXPIN,INPUT);

This should most likely be

pinMode(RXPIN,INPUT);
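
Putting those fixes together (lowercase pinMode, no stray SPARK_WLAN_Loop(), no digitalRead() on the RX pin, matching braces), one possible corrected version of the Spark1 sketch could look roughly like this. It is untested and keeps the original event names; the frame buffer size is arbitrary:

//Spark1 (possible corrected version, untested)
// Reads one "S...E" frame from the PIC18 on Serial1 and publishes it.
int led = D7;
char incoming[16];                     // buffer for one incoming frame
int pos = 0;
bool inFrame = false;

void setup()
{
  pinMode(led, OUTPUT);
  Serial1.begin(19200);                // UART from the PIC18
  Spark.subscribe("Jag_Smart_Carpet2", myHandler, MY_DEVICES);
}

void loop()
{
  while (Serial1.available())
  {
    char c = Serial1.read();
    if (c == 'S')                      // start marker: begin a new frame
    {
      inFrame = true;
      pos = 0;
    }
    else if (c == 'E' && inFrame)      // end marker: publish the completed frame
    {
      incoming[pos] = '\0';
      inFrame = false;
      Spark.publish("Jag_Smart_Carpet1", incoming, 60, PRIVATE);
      delay(1000);                     // keep below roughly one publish per second
    }
    else if (inFrame && pos < (int)sizeof(incoming) - 1)
    {
      incoming[pos++] = c;             // payload characters, e.g. "A0001"
    }
  }
}

// Called whenever the other Core publishes "Jag_Smart_Carpet2"
void myHandler(const char *event, const char *data)
{
  digitalWrite(led, HIGH);             // blink D7 to show an event arrived
  delay(300);
  digitalWrite(led, LOW);
}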