Servo on local network

Hi there,

I’m trying to port my Arduino sketch, in which I control a couple of servos over UDP on the local network. I need it to work even without a connection to the internet/cloud, so I included Spark.disconnect(); in setup().

Is something wrong in my code? It compiles and flashes OK, and the board breathes green signifying it’s connected to the local network, but it does not receive any values.

unsigned int localPort = 8888;
String value = "";

char packetBuffer[24]; // buffer to hold incoming packet

UDP Udp;

Servo Servo1;
Servo Servo2;

void setup() {
  Spark.disconnect();
  Udp.begin(localPort);
  Servo1.attach(A1);  // Servo 1 on pin A1
  Servo2.attach(A2);  // Servo 2 on Pin A2
}

void loop() {
  // if there's data available, read a packet
  int packetSize = Udp.parsePacket();

  char command[4];      // 3-char command + null terminator
  unsigned int vnumber; // the value as an integer

  if(packetSize)
  {

    // read the packet into packetBuffer
    if (packetSize > 24) packetSize = 24;  // don't read past the buffer
    Udp.read(packetBuffer, 24);
    value = "";
    for (int i = 0; i < packetSize; i++)
    {
      if (i < 3)  // the first 3 characters are the command
      {
        command[i] = packetBuffer[i];
      }
      else        // the remaining characters are the value
      {
        value += (char)packetBuffer[i];
      }
    }
    command[3] = '\0';        // terminate the command string
    vnumber = value.toInt();  // string to integer

// to servos
    if ((String)command == "SR1")
    {
      Servo1.write(vnumber);
    }
    if ((String)command == "SR2")
    {
      Servo2.write(vnumber);
    }
  }
}

Any reason you’re using UDP instead of TCP? Do you need to stream (with no SYN/ACK)? Essentially you’re gaining speed but losing reliability.

There seem to be a lot of threads about UDP not working; TCP works fine for me, both client and server.

I’m using UDP because I’m controlling robots with servos. Real time, “instant” response is a priority over reliability.

As @sej7278 said, the UDP library seems to have some bugs at the moment, so there are a few tricks to be aware of. I’ve heard that updating the CC3000 firmware may help with some issues, but I haven’t tried it myself. Here are some tips I learned getting my UDP to work; hopefully one of them solves your problem (my guess is #3 will fix it for you).

  1. Instead of using Spark.disconnect(), put #include "spark_disable_cloud.h" at the top of your code. Apparently Spark.disconnect() doesn’t get called early enough to prevent an initial connection attempt, which could freeze you up if you have no internet, but spark_disable_cloud.h should prevent any attempt at all.

  2. Unlike most standard UDP implementations, the Spark functions don’t really have a concept of a datagram packet, so you are NOT guaranteed that Udp.read() will return either a full packet or nothing - you may in fact receive a partial packet. Since it looks like you’re using an ASCII protocol, you may want to null-terminate your UDP packets so that on receive you can sanity-check that the packet arrived complete.

  3. Don’t call UDP::begin() from setup(). Especially when disabling the cloud, it seems to break the UDP object for some reason. Instead, in loop(), wait until the WiFi is connected before calling begin(), something like this:

// Global flag
bool udpInitialized = false;

void loop()
{
  if (!udpInitialized)
  {
    // Check if WiFi is connected - this works for me, but I don't know if it's a legit way to check
    if (Network.SSID()[0] != '\0')
    {
      Udp.begin(localPort);
      udpInitialized = true;
    }
  }
  else
  {
    // Normal program functionality
  }
}

Additionally, if you ever end up transmitting UDP from the SparkCore, make absolutely sure the address you are transmitting to exists on the network. When disabling the cloud, trying to transmit to a non-existent UDP address seems to trigger a hardfault in the core.

Some more information can be found here and here.


I just wanted to add to number 2. In addition to possibly getting a partial packet, it’s also possible to get more than one packet in the same read - either multiple whole packets, or one or more whole packets plus a partial one. Packet terminators and careful parsing of the received data are almost required; fixed packet sizes may be another option.

Thank you dpursell and erjm for the in-depth info!

It seems moving Udp.begin() to after the core has its SSID did the trick, and then Spark.disconnect() worked as well.

I tried #include "spark_disable_cloud.h" but it threw this error:
In file included from …/inc/spark_wiring.h:30:0,
from …/inc/application.h:31,
from /the_user_app.cpp:2:
…/…/core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning “Defaulting to Release Build” [-Wcpp]
In file included from /the_user_app.cpp:1:0:
…/inc/spark_disable_cloud.h: In constructor ‘SparkDisableCloud::SparkDisableCloud()’:
…/inc/spark_disable_cloud.h:34:3: error: ‘SPARK_CLOUD_CONNECT’ was not declared in this scope
make: *** [/the_user_app.o] Error 1

I’ve had the core running for an hour now, receiving UDP messages without missing a beat as far as I can see.

Here’s my code in case it’s useful for anyone:

unsigned int localPort = 8888;
String value = "";
int led = D7;
// Global flag
bool udpInitialized = false;

char packetBuffer[24]; // buffer to hold incoming packet
    
UDP Udp;

Servo servo1;
Servo servo2;
    
char myIpString[24];

void setup() {
  Spark.disconnect();
  servo1.attach(A0);  // Servo 1 on pin A0
  servo2.attach(A1);  // Servo 2 on Pin A1
  servo1.write(90);
  servo2.write(90);
}
    
void loop() {
  if (!udpInitialized)
  {
    // Check if WiFi is connected
    if (Network.SSID()[0] != '\0')
    {
      IPAddress myIp = Network.localIP();
      sprintf(myIpString, "%d.%d.%d.%d", myIp[0], myIp[1], myIp[2], myIp[3]);
      Spark.variable("ipAddress", myIpString, STRING);
      Udp.begin(localPort);
      udpInitialized = true;
    }
  }
  else
  {
    // Normal program functionality

    // if there's data available, read a packet
    int packetSize = Udp.parsePacket();

    char command[4];      // 3-char command + null terminator
    unsigned int vnumber; // the value as an integer

    if (packetSize)
    {
      // read the packet into packetBuffer
      if (packetSize > 24) packetSize = 24;  // don't read past the buffer
      Udp.read(packetBuffer, 24);
      value = "";
      for (int i = 0; i < packetSize; i++)
      {
        if (i < 3)  // the first 3 characters are the command
        {
          command[i] = packetBuffer[i];
        }
        else        // the remaining characters are the value
        {
          value += (char)packetBuffer[i];
        }
      }
      command[3] = '\0';        // terminate the command string
      vnumber = value.toInt();  // string to integer

      // to servos
      if ((String)command == "SR1")
      {
        servo1.write(vnumber);
      }
      if ((String)command == "SR2")
      {
        servo2.write(vnumber);
      }
    }
  }
}

It seems I was wrong: the router I ran my code on before was connected to the internet. Trying it today on one with no route to the cloud, Spark.disconnect() doesn’t take effect until after the core has connected to the cloud, as dpursell pointed out earlier.

So is it really at the moment not possible to use the spark core without a connection to the cloud?

Hi @blind

You can start without the cloud (and then connect and disconnect later) by using the include

#include "spark_disable_cloud.h"

You have to be careful: with the cloud connected at core start-up, you are guaranteed that the WiFi connection is up and running by the time your setup() function runs, but with the cloud disabled at start-up, you have to test for yourself that the network connection is ready.

It seems that #include "application.h" was needed as well. Now it works :smile:
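For anyone hitting the same `'SPARK_CLOUD_CONNECT' was not declared` error, the combination that apparently works is the following (the ordering rationale is my guess, based on the error coming out of spark_wiring.h):

```cpp
#include "application.h"          // pulls in spark_wiring.h, which declares SPARK_CLOUD_CONNECT
#include "spark_disable_cloud.h"  // must come after, so the declaration is visible
```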

And yes, I’m now checking, as pointed out by dpursell, that Udp.begin() is only called once the core is connected to WiFi.

cheers!