Hi,
I am trying to send data with the publish command inside a loop. The calls reach the server, but only the first post sends its data correctly; the following posts arrive with null data.
if (((fr = f_open(&fil, name, FA_READ | FA_OPEN_EXISTING)) == FR_OK) &&
    ((fr = f_read(&fil, buf, sizeof(buf), &dw)) == FR_OK))
{
  Serial.println(dw);
  do {
    primerVez = false; // was "primerVez == false;" - a comparison that did nothing
    for (int i = 0; i < 1024; i++)
    {
      cadenaRespaldo += buf[i];
      if (buf[i] == ',')
      {
        Spark.publish("PulsosTest", cadenaRespaldo); // this only publishes data correctly the first time
        //Spark.publish("PulsosTest", String(i)); // if I publish this instead, it works fine!
        delay(5000);
        Serial.println(cadenaRespaldo);
        cadenaRespaldo = "";
        i++; // to skip the space after the comma
      }
      //Serial.print(buf[i]);
    }
    f_read(&fil, buf, sizeof(buf), &dw);
  } while (dw > 0);
}
It does not make sense to me; maybe it's something with the way strings are handled internally. When I send just the loop counter number it behaves correctly in every way, so my guess is it's something with strings.
I would appreciate it if someone could help! This has been driving me crazy. Thanks!
The delay(5000) might be the issue. Also, are you using Webhooks or listening to events?
I am using webhooks to post data to the server. I placed the delay because there is a limit of 4 publishes per second (or 1 publish per second with a burst of 4). If I remove the delay I get the same result: the POST requests arrive, but with empty data (except the first request). Could you suggest something else?
Thanks by the way!
Try something like:

unsigned long old_time = 0;

void loop()
{
  if (millis() - old_time >= 5000) // a non-blocking "soft delay"
  {
    Spark.publish("PulsosTest", cadenaRespaldo); // cadenaRespaldo and i as declared in your sketch
    Spark.publish("PulsosTest", String(i));
    old_time = millis();
  }
}
@victor_carreon, does this line (Serial.println(cadenaRespaldo);) always report the string values you'd expect?
For my personal opinion - I try to avoid String wherever possible. "Too much going on in the hidden dark" makes debugging harder.
Could you please also show the declarations of cadenaRespaldo and buf?
Do you really want to send 1024 bytes out of a file, multiple times, in chunks of max 63 bytes via Spark.publish()?
BTW: I wouldn't necessarily see an immediate problem in the use of delay(); for test purposes you could even go for delay(1001), to speed things up a bit.
But in general @kennethlimcp's suggestion to rather go for a "soft delay" is the better choice - where applicable.
Just for clarity, you could place a line of Spark.process(); just before the delay() (which does it implicitly from time to time, too).
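For illustration, a small helper along these lines (just a sketch - softDelay is a made-up name) keeps the cloud serviced while you wait:

void softDelay(unsigned long ms)
{
  unsigned long start = millis();
  while (millis() - start < ms)
    Spark.process(); // keep servicing the cloud connection while waiting
}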
Thanks Scruff,
Yeah, the Serial.println throws all the correct values. Should I always use char pointers? I also think the problem is with string handling in the background.
Here are the variable declarations:
char buf[622];
String cadenaRespaldo = "";
I also thought that sending multiple chunks would not be optimal, so I am not doing it that way anymore; I am currently using Spark.variable() to expose larger strings with my data. But I would like to know what is causing this problem anyway.
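For reference, this is roughly what I am doing now (datos is just a placeholder name):

char datos[622]; // 622 bytes is the maximum length of a STRING variable

void setup()
{
  Spark.variable("datos", datos, STRING);
}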
If your buf is 622 bytes long, don't run the loop up to i < 1024 - you'll read past the end of the buffer.
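Rather bound the loop by the byte count f_read() actually returned - a sketch using your dw:

for (unsigned int i = 0; i < dw; i++) // dw = bytes actually read, never past the end of buf
{
  // process buf[i] as before
}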
@victor_carreon, the first thing is that though @kennethlimcp's code using a "soft" timer is great, it fires two Spark.publish() events to the same webhook in a row. I have had issues with rapid firing of webhooks, so you may want to verify on the dashboard that the hooks are firing.
On the subject of strings, is the file content you are trying to send purely ASCII?
Yeah, my bad - the buffer length is actually 621. Thanks for the heads-up!
@peekay123 thanks, the dashboard shows all the requests. The thing is that the data does not show up. I am pretty sure it has something to do with how strings are managed, because if I just publish the loop counter everything works fine.
@victor_carreon, can you tell me if the file content you are trying to send is purely ASCII?
It might be superstition, but I never really trust String.
Would you try Spark.publish("PulsosTest", cadenaRespaldo.c_str()); for me please - just to give me some peace of mind?
@peekay123 yes, the file content is just ASCII. @ScruffR lol, yes, I've seen some posts about the String class not being very reliable. I also tried .c_str() but I get the same result.
Thanks for trying.
But I'm not letting go of this String thing, for another reason: I just find this more elegant.
As I understand your code, you're splitting up a comma-delimited string in your buf char by char. For this task I usually use a C function called strtok():
int offset = 0;
const char whitespace[] = { ' ', '\t', '\n', '\r', '\0' }; // for trimming
const char delimiters[] = { ',', '\0' };                   // use comma as delimiter (more are possible)
                                                           // (\0 terminates the string)
char* token = strtok(buf, delimiters);
while (token)
{
  if (strlen(token)) // e.g. ignore multiple delimiters in a row
  {
    offset = strspn(token, whitespace); // skip all leading whitespace chars
    Spark.publish("PulsosTest", &token[offset]);
    // do your other stuff
    delay(1001); // with implicit Spark.process()
  }
  token = strtok(NULL, delimiters); // NULL means carry on in the same buffer
}
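One caveat: strtok() writes '\0' over each delimiter it finds, so it modifies buf in place, and it keeps static state between calls - so don't run another strtok() loop on a different buffer in between. That matters when you refill buf with the next f_read() chunk.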
I’ve done something similar to your code for the leading blank (and other whitespace chars), but this and the boundaries between file chunks might need some more attention, tho’.
Thanks @ScruffR, nice coding! I will implement it and see what the result is. Thanks again!