Create time out for Spark.connect()?

Hello! I have been working on an Arduino project, and recently ported the code to my Spark Core. I understand that when the core powers up, it connects to the cloud (which is a blocking function). The problem is that I don't want to depend so strongly on a WiFi connection. For example, if my router goes down while the core is doing an important task, the core needs to be able to keep working without WiFi and worry about connecting later.

I’ve read about the new
SYSTEM_MODE(SEMI_AUTOMATIC);
and believe that control over Spark.connect() will help me. But I've noticed that the connect function will not return until it has found a connection. Is there any way to try to find a connection, but only for a set period of time? Then, if the cloud connection is lost, my code can enter a special mode where it saves data to upload later, etc., while only periodically checking for a cloud connection.

I appreciate any help!

2 Likes

@JDvorak yes, you can totally do that. I had the same requirement. I boot in SYSTEM_MODE(SEMI_AUTOMATIC) and start my program offline.

Then in my loop() I check to see if I'm connected. If not, I run a try-to-connect routine. This routine starts a timer-based interrupt using “SparkIntervalTimer”, then calls Spark.connect(). Once connected, I disable the timer-based interrupt. However, if the core is unable to connect, the timer-based interrupt eventually fires and its ISR calls Spark.disconnect().
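For reference, a rough sketch of that routine might look like the following. This is untested; the 30-second timeout, the `hmSec` half-millisecond units, and the `Spark.connected()` check are my own assumptions, not @matt_ri's actual code:

```cpp
#include "SparkIntervalTimer.h"

SYSTEM_MODE(SEMI_AUTOMATIC);

IntervalTimer connectTimer;
const unsigned long CONNECT_TIMEOUT_MS = 30000;   // assumed value

void abortConnect(void) {        // ISR: give up on this connection attempt
    Spark.disconnect();
    connectTimer.end();
}

void tryToConnect(void) {
    // hmSec counts half-milliseconds, hence the 2 *
    connectTimer.begin(abortConnect, 2 * CONNECT_TIMEOUT_MS, hmSec);
    Spark.connect();             // may block until connected or aborted
    if (Spark.connected())
        connectTimer.end();      // connected in time: stop the watchdog
}

void setup() { }

void loop() {
    if (!Spark.connected())
        tryToConnect();
    // ... offline work continues here either way ...
}
```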

This works like a charm for me.

5 Likes

Thanks @matt_ri !

It seems like this should either be documented under Spark.connect() or built into the API… I’m sure lots of people need this functionality for projects / products.

1 Like

Can you share that code? That is exactly what I need to give me a few seconds to reflash a core that is running in manual mode.

Hello @rocksetta,

I think the solution is to use the library here. I haven’t implemented it yet, but the GitHub repo seems to have good docs.

I don’t think the example program is for the Spark; it looks like it’s for the Arduino.

PIN_MAP[ledPin].gpio_peripheral->BSRR = PIN_MAP[ledPin].gpio_pin; // LED High

Other than that line, most of the example looks like it makes sense.

I hope @peekay123 adds a Spark example because it looks like it would be very helpful. For my situation I have my own solution at https://community.spark.io/t/how-to-re-flash-core-when-in-manual-mode/11494/6, but your situation would need the SparkIntervalTimer to occasionally check for connectivity.

It is basic frustrating things like this that remind me of my IoT Rant at

1 Like

@rocksetta, that line IS for the Spark Core!! The entire library was adapted for the Core. What the line you highlighted shows is a direct GPIO pin manipulation which is much faster than digitalWrite(). If you need any help with the library, let me know :grinning:

I will try to write one, but I spend so much time re-flashing dead cores that it would be nice if someone who knows what they are doing would write a basic “Hello World No Wifi” for this situation: if you are away from WiFi, your Spark Core should just work (as if in manual mode); if you are near your WiFi, powering on gives you 30 seconds to reflash the core and update your program.

My hack works fine for me since I carry my cell with a hotspot but not much use to others in the same situation.

// must have wifi working for startup, since automatic mode blocks until wifi is connected
// Can use cell phone hotspot as long as it is setup earlier with another cell phone.

void setup() {
    delay(20000);   // give 20 seconds to reflash the core
    Spark.disconnect(); // basically puts you in manual mode, so should ignore any poor wifi reception issues

}

void loop() {

}

@rocksetta, I am going to take this up with other Elites and the Spark Team. With both WiFi.listen() and Spark.connect() blocking user code, having an “out” for the Core would be good. This, of course, will not be a problem on the Photon :smiley:

4 Likes

Great idea. Here is my suggestion, but I have to get home to try it out. At least it compiles (assuming you include the two SparkIntervalTimer files, .h and .cpp).

#include "SparkIntervalTimer.h"


SYSTEM_MODE(SEMI_AUTOMATIC);


// Create IntervalTimer object
IntervalTimer myTimer;


// Pre-define ISR callback functions
void checkWifi(void);


//const uint8_t ledPin = D7;		// LED for first Interval Timer


void setup(void) {
    pinMode(D7, OUTPUT);  

    digitalWrite(D7,HIGH);      // blink D7 fast once before timer activated
    delay(40);
    digitalWrite(D7,LOW);

    myTimer.begin(checkWifi, 60000, uSec);   // check for wifi every 60 seconds or until stopped
    
    digitalWrite(D7,HIGH);      // blink D7 slow twice when timer activated (checks if blocking)
    delay(500);
    digitalWrite(D7,LOW);
    delay(500); 
    
    digitalWrite(D7,HIGH);     
    delay(500);
    digitalWrite(D7,LOW);

  
}


//volatile unsigned long blinkCount = 0; // use volatile for shared variables seems cool to know


void checkWifi(void) {
    Spark.connect();   // this is normally a blocking function 
}

int myLoop;
int myConnect = 0;       // 0 just starting, 1 connected, 2 manual mode

void loop() {



  if (myConnect == 0) {    // check for connectivity

   if (WiFi.ready()){                 // wifi is connected
        myTimer.end();                // shut down the timer
        myConnect = 1;                // says wifi connected
       
   } else {
            myLoop++;
            delay(1000);
            if (myLoop >= 20) {           // tried to get Wifi for 20 seconds
                Spark.disconnect();       // not sure if this is needed
                myTimer.end();
                myConnect = 2;            // says no wifi
            }
        }
  }
  
  
  if (myConnect == 1) {         // do your wifi connected loop
      
    digitalWrite(D7,HIGH);      // blink D7 fast to show that Wifi is working
    delay(100);
    digitalWrite(D7,LOW);
    delay(100);    
    // other statements here
  }
  
  if (myConnect == 2) {            // do your manual loop
    digitalWrite(D7,HIGH);         // leave D7 on to show manual mode   
    // other statements here  
  }



  if (myConnect >= 1) {           // do statements that are for either manual or automatic mode 
    // other statements here  
  }
  
  

}

I'd guess it won't actually do what you intend, since you should not call a blocking function inside an ISR.

In the other thread I wrote (and am still convinced it's true):

So you'd rather call Spark.connect() in your normal code but "cancel" the connection attempt from your ISR by calling Spark.disconnect() (and possibly WiFi.off() as well).

This might also be of interest

http://docs.spark.io/firmware/#system-modes-semi-automatic-mode

3 Likes

Good points. I will try writing some different code to test it.

Ok, so here is the second attempt, using the timer to shut down the connect process. It compiles, but I have not tested it. I also finally learnt how to use the Spark libraries, so the include statement is a bit easier. Can someone else load this? I have my Spark Core testing my Pixy Camera Rover and I don’t want to break it again.

Of course Spark.io could always send me a few Cores (like a class set=15 one per two students), so I can continue trying to make Spark products easier to use :wink:

// This #include statement was automatically added by the Spark IDE.
#include "SparkIntervalTimer/SparkIntervalTimer.h"

// Attempt number 2 
// Check the Spark Core for wifi on startup; if no wifi, allow it to run non-cloud-dependent activities.





SYSTEM_MODE(SEMI_AUTOMATIC);


// Create IntervalTimer object
IntervalTimer myTimer;


// Pre-define ISR callback functions
void checkWifi(void);


//const uint8_t ledPin = D7;		// LED for first Interval Timer


void setup(void) {
    pinMode(D7, OUTPUT);  

    digitalWrite(D7,HIGH);      // blink D7 fast once before timer activated
    delay(40);
    digitalWrite(D7,LOW);

    myTimer.begin(checkWifi, 20000, uSec);   // check for wifi and shutdown if no connection after 20 seconds
    delay(50);
    
    Spark.connect();   // try to connect
    
    
    
    digitalWrite(D7,HIGH);      // blink D7 slow twice when timer activated (checks if blocking)
    delay(500);
    digitalWrite(D7,LOW);
    delay(500); 
    
    digitalWrite(D7,HIGH);     
    delay(500);
    digitalWrite(D7,LOW);

  
}


volatile unsigned long myConnect = 0; // use volatile for shared variables: 0 just starting, 1 connected, 2 no connectivity, 99 first timer done


void checkWifi(void) {
    if (myConnect != 0){  // not the first activation of the timer
    
      if (WiFi.ready()){   // the timer has given the connect attempt 20 seconds before this check
          myConnect = 1;   // means wifi got connected
      } else {
           Spark.disconnect(); 
           WiFi.off();
           myConnect =2;   // means no wifi
      }
       myTimer.end();  // no matter what happens, shut down the timer since we only want it to fire once
       
       
       
    } else {
        myConnect = 99;  // identifies the first run has completed.
        
    }
       
}

int myLoop;


void loop() {

  if (myConnect == 1) {         // do your wifi connected loop
    digitalWrite(D7,HIGH);      // blink D7 fast to show that Wifi is working
    delay(100);
    digitalWrite(D7,LOW);
    delay(100);    
    // other statements here
  }
  
  if (myConnect == 2) {            // do your no wifi loop
    digitalWrite(D7,HIGH);         // leave D7 on to show manual mode   
    // other statements here  
  }

  if (myConnect == 1 || myConnect == 2) {           // do statements that are for either wifi or no wifi modes
    // other statements here  
  }
  
  

}


Here is information about the Pixy Camera Rover; it follows a certain color. I still have to tweak some values so the rover doesn’t go racing down the street.

That thread is at https://community.spark.io/t/how-to-make-a-spark-toy-car-heel-like-a-dog/10762/15

The github site is at:

1 Like

Almost there, I think.

After the amount of effort you already put into this, I’d like to suggest these adaptations.
At this point I haven’t tested them myself, but from experience this should work - I’ll test it soon and correct it if required :wink:
Edit: This is now working code, but in my tests I got the impression that all the interrupt/timer effort is not needed anyway, since Spark.connect() doesn’t seem to block anymore (despite the docs still stating that it does, and as I remember it used to).
@mdma, has anything been changed “recently” that hasn’t found its way into the docs? Or am I imagining things?

// This #include statement was automatically added by the Spark IDE.
#include "SparkIntervalTimer/SparkIntervalTimer.h"

SYSTEM_MODE(SEMI_AUTOMATIC);

// Create IntervalTimer object
IntervalTimer myTimer;

// Pre-define ISR callback functions
void checkWifi(void);

volatile uint8_t isrTriggered = 0;

const unsigned long CONNECT_TIMEOUT = 30000;
unsigned long lastMillis;    // it's always good to have this for non-blocking delays

void setup(void) {
    pinMode(D7, OUTPUT);  

    digitalWrite(D7,HIGH);   // blink D7 fast once before timer activated
    delay(40);
    digitalWrite(D7,LOW);

    // this would only wait 20ms since uSec is micro seconds
    // myTimer.begin(checkWifi, 20000, uSec);   // check for wifi and shutdown if no connection after 20 seconds
    // the other time scale is half-milliseconds (hence 2*)
    myTimer.begin(checkWifi, 2 * CONNECT_TIMEOUT, hmSec);
    
    lastMillis = millis();  // store soft-delay reference time
    
    // try to connect
    isrTriggered = 0;
    Spark.connect();
    if (!isrTriggered)
    {   // only wait if the timeout ISR hasn't already aborted the attempt
        myTimer.end();  // don't disconnect prematurely
    
        while ((millis() - lastMillis) < CONNECT_TIMEOUT)
        {
            digitalWrite(D7, HIGH); 
            delay(50);             // flash LED about 10Hz
            Spark.process();
            digitalWrite(D7, LOW);  
            delay(50);             // flash LED about 10Hz
        } 
        Spark.disconnect();  // missed your chance for OTA flashing
    }
    WiFi.off();  // deactivate WiFi in any case  
}

// no need to re-check inside loop() now
void loop()
{
    digitalWrite(D7, HIGH); 
    delay(500);            // flash LED at about 1Hz
    digitalWrite(D7, LOW);  
    delay(500);            // flash LED at about 1Hz
}

void checkWifi(void) 
{
    isrTriggered = 1;
    Spark.disconnect();  // no more required inside ISR
}

Edit: Fixed a build error - Spark.connect() does not return any success/fail indicator :frowning:

1 Like

Thanks @ScruffR. I just found another thread that might be useful.

Yep, that’s the one that introduced SYSTEM_MODE().
Only after this did it even become possible to squeeze in some code before Spark.connect() that might be able to stop a running connection attempt.

Nothing has changed since the last release in October 2014. Afaict, Spark.connect() will never block, since it just calls WiFi.connect(), which is non-blocking, and sets a flag to cause the background thread to connect to the cloud.
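If that is the case, no timer interrupt is needed at all; a plain soft-timeout around the connection attempt would do. An untested sketch along those lines (the 30-second timeout is my own choice):

```cpp
SYSTEM_MODE(SEMI_AUTOMATIC);

void setup() {
    Spark.connect();                       // just sets a flag, per the post above
    unsigned long start = millis();
    while (!Spark.connected() && millis() - start < 30000UL) {
        Spark.process();                   // let the background task do the work
        delay(100);
    }
    if (!Spark.connected()) {              // timed out: run offline
        Spark.disconnect();
        WiFi.off();
    }
}

void loop() {
    // works with or without the cloud from here on
}
```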

What about this then?

When the user calls Spark.connect(), the user code will be blocked, and the Core will attempt to negotiate a connection. This connection will block until either the Core connects to the Cloud or an interrupt is fired that calls Spark.disconnect().

http://docs.spark.io/firmware/#system-modes-semi-automatic-mode

@ScruffR, @mdma perhaps some test code is in order to establish whether Spark.connect() or WiFi.connect() actually blocks user code or not. The story until now has been that user code will block when the cloud connection is LOST in AUTOMATIC mode, but the SEMI_AUTOMATIC and MANUAL mode scenarios are not so clear. It would be good to understand the behavior once and for all :stuck_out_tongue:
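A minimal test along those lines could simply timestamp the call. This is untested, and the D7 blink readout is my own improvisation:

```cpp
SYSTEM_MODE(SEMI_AUTOMATIC);

void setup() {
    pinMode(D7, OUTPUT);
    unsigned long before = millis();
    Spark.connect();                       // does this return immediately?
    unsigned long elapsed = millis() - before;
    // With the router off: if elapsed stays near zero the call is
    // non-blocking; if it is whole seconds, it blocked.
    // Crude readout: one slow blink per second of measured delay.
    for (unsigned long i = 0; i < elapsed / 1000; i++) {
        digitalWrite(D7, HIGH); delay(250);
        digitalWrite(D7, LOW);  delay(250);
    }
}

void loop() { }
```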

1 Like

As my code above suggests, it actually doesn’t block in SEMI_AUTOMATIC - which contradicts the official docs, tho’.

After the call to Spark.connect() the 10Hz blinking starts immediately, even when my router is off.