SparkFun free data storage

Hey guys!

SparkFun announced you can now store 5KB of data in their cloud!
Check it out!

https://data.sparkfun.com/?utm_source=SparkFun+Customer+Newsletter&utm_campaign=55fd0fb4de-PhantData&utm_medium=email&utm_term=0_fa5287abaf-55fd0fb4de-60800101

It would be interesting to see people integrate this with the Spark Core.

3 Likes

@pzula, I successfully sent temperature data from a DHT22 to SparkFun Data; the following code should work:

#define SPARKFUN_DATA_URL "data.sparkfun.com"
#define SPARKFUN_DATA_PORT 80
#define SPARKFUN_INPUT_PATH "input"
#define PRIVATE_KEY "<<privatekey>>"
#define PUBLIC_KEY "<<publickey>>"
#define STREAM_NAME "<<streamname>>"

void sendToSparkfunData(float temperature){
    char szData[16];

    if(client.connect(SPARKFUN_DATA_URL, SPARKFUN_DATA_PORT)){
        sprintf(szData, "%.2f", temperature);
        
        client.print("GET /");
        client.print(SPARKFUN_INPUT_PATH);
        client.print("/");
        client.print(PUBLIC_KEY);
        client.print("?");
        client.print("private_key=");
        client.print(PRIVATE_KEY);
        client.print("&");
        client.print(STREAM_NAME);
        client.print("=");
        client.print(szData);
        client.println(" HTTP/1.1"); // end the request line with the HTTP version
        client.print("Host: ");
        client.println(SPARKFUN_DATA_URL);
        client.println("Connection: close");
        client.println();
        
        client.flush();
        
        while(client.available()){
            Serial.print((char)client.read());
        }
        
        client.stop();
    }
    else{
        Serial.println("Cannot connect to Sparkfun Data");
    }
}
6 Likes

@krvarma I tried to compile your code, but I got a compile error.

Hi,

I’m working on a library for Phant on the Spark Core. If anyone wants to participate :smile:

https://github.com/romainmp/phant

3 Likes

@Amakaruk, it seems you didn’t declare the TCPClient. The code above is only the function that sends data to SparkFun Data, so you need to declare a TCPClient yourself. The following code should work; please note that you also need to include the DHT library.

// This #include statement was automatically added by the Spark IDE.
#include "DHT.h"

#define DHTPIN D4
#define DHTTYPE DHT22  

#define SPARKFUN_DATA_URL "data.sparkfun.com"
#define SPARKFUN_DATA_PORT 80
#define SPARKFUN_INPUT_PATH "input"
#define PRIVATE_KEY "qzz8b4BBdEs5EaY4YgBR"
#define PUBLIC_KEY "g66mX3rroWuDAQ7q7KJx"
#define STREAM_NAME "temperature"

DHT dht(DHTPIN, DHTTYPE);
TCPClient client;
int delaySend = 60 * 1000;


void setup() {
    pinMode(D7, OUTPUT);
    Serial.begin(115200);
    dht.begin();
    
    Serial.println("Ready, start sending data to Sparkfun.");
    digitalWrite(D7, HIGH);
}

void loop() {
    Serial.print("Sending data to Sparkfun...");
    sendToSparkfunData(dht.readTemperature());
    Serial.println(", Completed");

    delay(delaySend);
}

void sendToSparkfunData(float temperature){
    char szData[16];

    if(client.connect(SPARKFUN_DATA_URL, SPARKFUN_DATA_PORT)){
        sprintf(szData, "%.2f", temperature);
        
        client.print("GET /");
        client.print(SPARKFUN_INPUT_PATH);
        client.print("/");
        client.print(PUBLIC_KEY);
        client.print("?");
        client.print("private_key=");
        client.print(PRIVATE_KEY);
        client.print("&");
        client.print(STREAM_NAME);
        client.print("=");
        client.print(szData);
        client.println(" HTTP/1.1"); // end the request line with the HTTP version
        client.print("Host: ");
        client.println(SPARKFUN_DATA_URL);
        client.println("Connection: close");
        client.println();
        
        client.flush();
        
        while(client.available()){
            Serial.print((char)client.read());
        }
        
        client.stop();
    }
    else{
        Serial.println("Cannot connect to Sparkfun Data");
    }
}
1 Like

@rmp, that’s great! Waiting for your library to be completed, and thanks for sharing.

1 Like

Ok, the library seems to be working, at least for the HTTP POST method and the clearStream function.
I need your help to test it more: dual streams, etc.
I’ll find more time later to document it further and refactor some of the code.

Keep me posted if you find bugs, and send me pull requests!

PS: it is published in the IDE so that it’s easier for you all to test.
PPS: the library example file posts data to this Phant stream

2 Likes

Thanks @rmp, let me play with it this weekend and get back to you on this.

@rmp, your lib is working great for me. Thanks for publishing it!

1 Like

@rmp,

I tried the example, but the data did not seem to be posted to the stream.

Is there something I must add besides the example code to test it out? :smile:

1 Like

Hello.

I am trying to send data to data.sparkfun.com. I can send data, but my core appears to reset or at least drop some type of connection.

First, below is the code I’m using to send data to data.sparkfun.com. As you can see, I’m waiting 60 seconds, incrementing the COUNT variable by 1, and then sending that value as both TEMP and HUMIDITY. That is being done as a placeholder so I can see where connections are dropping.

unsigned long lastEvent = 0; // Time we last updated the database

int interval = 60000; //60 seconds
int count = 0;

TCPClient client;

void setup() {
}

void loop() 
{
  unsigned long currentMillis = millis();

  if (currentMillis - lastEvent > interval)
  {
    lastEvent = currentMillis; //update the time of this new event  
    count = count + 1;
    client.stop();
    if (client.connect("data.sparkfun.com", 80)) 
    {
        client.print("GET /input/");
        client.print("WGR4vz0r17c86j2QayWg");
        client.print("?private_key=");
        client.print("PRIVATEKEY");
        client.print("&");
        client.print("humidity");
        client.print("=");
        client.print(count);
        client.print("&");
        client.print("temp");
        client.print("=");
        client.print(count);
        client.println(" HTTP/1.1");
        client.print("Host: ");
        client.println("data.sparkfun.com");
        client.println("Connection: close");
        client.println();
        client.flush();
    }
  }
}

As you can see on the results page:

https://data.sparkfun.com/streams/WGR4vz0r17c86j2QayWg

I’m missing some entries. While watching the Spark Core itself, I can see it breathing cyan, then later flashing cyan, then flashing green, then breathing cyan again. At first I thought the core was restarting, but if that were the case I wouldn’t be missing numbers in my COUNT sequence; I’d see it start over from 1… right?

I can confirm that the Wi-Fi connection is solid: other code that does not send data stays connected without issue for days at a time.

What am I doing wrong here?

Did you try the POST method instead of GET?
I had some trouble with GET as well.

I’ll look into this as soon as I can.
It should work like that … or at least it worked before. I’ll see if anything has changed.
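For anyone who wants to try POST by hand before the library handles it: the raw request Phant expects looks roughly like the string built below. This is a sketch based on Phant's documented HTTP input interface (private key in a `Phant-Private-Key` header, form-encoded body); the keys and field names are placeholders, not anyone's real stream.

```cpp
#include <sstream>
#include <string>

// Build a Phant-style HTTP POST request as one raw string. On the Core
// you would client.print() this over an open TCP connection instead of
// returning it. The header names follow Phant's documented interface;
// treat this as a sketch, not verified library behavior.
std::string buildPhantPost(const std::string& publicKey,
                           const std::string& privateKey,
                           const std::string& body) {
    std::ostringstream req;
    req << "POST /input/" << publicKey << " HTTP/1.1\r\n"
        << "Host: data.sparkfun.com\r\n"
        << "Phant-Private-Key: " << privateKey << "\r\n"
        << "Content-Type: application/x-www-form-urlencoded\r\n"
        << "Content-Length: " << body.size() << "\r\n"
        << "Connection: close\r\n"
        << "\r\n"
        << body;  // e.g. "temp=21.50&humidity=40"
    return req.str();
}
```

The nice part of POST is that the field values travel in the body, so you avoid any URL-length or query-escaping surprises that can bite a hand-built GET.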

1 Like

Where can I find information on using POST instead of GET for this specific code?

Also, thank you for looking in to this. I really look forward to any assistance you can offer.

I used your PHANT library on the Spark web interface. Here is the stream that I’m posting to.

https://data.sparkfun.com/streams/ZGNYV0WVm5U2mlL991rJ

I am seeing more consistent COUNT numbers but still a couple drops. BUT I am not seeing the Spark Core freak out and appear to restart/reconnect to the cloud/wifi as I did before.

In your phant.cpp, on line 132, I see you have a delay(150);. Is that to allow the data to post to the stream before the connection is closed? If so, and since I’m seeing a few COUNT drops on my end without the Core reconnecting, do you think increasing that delay might help? 250, perhaps?

EDIT: I did restart the core with a forked copy of your PHANT library with the aforementioned delay increased to 250. I’ll let this run a few hours to gather data.

Using delay() and then client.flush() is not an effective way to clear the bytes returned by the server, and you will eventually get failures. A better way is to wait for returned data and then read it until there is no more. I helped @mtnscott with a similar issue with this library and he now has a stable connection.

2 Likes

Are you referring to this thread?

Yes, from post 4 down.

1 Like

Thank you for your help. As you said, the return data is not handled very well (I agree with the term “primitive” you used :wink:).
I’ll implement that correctly ASAP. If you have any suggestions …

1 Like

Hi @rmp

I know that @mtnscott had working code, so maybe he could send a pull request. It is pretty easy: wait for data to be available (with a timeout), then read data until there is no more, then flush and stop.
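That wait-then-drain pattern can be sketched in plain C++ so the logic is easy to test off-device. `FakeClient` below is a made-up stand-in; on the Core the same loop would run against the real TCPClient's available()/read(), use millis() as the clock, and be followed by client.flush() and client.stop().

```cpp
#include <chrono>
#include <string>

// millis()-style clock for desktop testing; on the Core, call millis().
static unsigned long millisNow() {
    using namespace std::chrono;
    return static_cast<unsigned long>(duration_cast<milliseconds>(
        steady_clock::now().time_since_epoch()).count());
}

// Minimal stand-in for TCPClient: available() and read() mirror the
// real method names, but this just replays a canned response.
struct FakeClient {
    std::string data;   // bytes the "server" will return
    size_t pos = 0;
    int available() { return static_cast<int>(data.size() - pos); }
    int read() {
        return pos < data.size()
            ? static_cast<unsigned char>(data[pos++]) : -1;
    }
};

// Wait up to timeoutMs for the first byte, then read until the buffer
// runs dry. Returns what was read; an empty string means we timed out.
std::string readResponse(FakeClient& client, unsigned long timeoutMs) {
    std::string out;
    unsigned long start = millisNow();
    while (!client.available()) {                 // wait for first byte...
        if (millisNow() - start > timeoutMs)      // ...but give up eventually
            return out;
    }
    while (client.available())                    // drain the buffer
        out += static_cast<char>(client.read());
    return out;
}
```

One caveat: on a real socket the response may arrive in several chunks, so a production version would keep re-checking available() until the connection closes or the timeout expires, rather than draining only once.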