Automatic Time Zone

Does anyone know if it’s possible to automatically set the time zone and/or the time based on the WiFi router’s location? This would help reduce the user intervention needed to set the time zone and daylight saving time manually via a web UI.

Thanks

You can use something like telize to get a JSON back with your IP and timezone in string format.

http://www.telize.com/geoip

This may or may not be accurate depending on network routing, VPN usage, etc. but you could use it to fill in a default value for the user to confirm.

@bko, do you know of a way of obtaining the core’s location without having to use an HTTP client in the firmware?

I used webhooks to get JSON back from telize.com, but as you would imagine it gave me back the IP/location of the Amazon servers where the Spark cloud is hosted.

Is there another path I’m missing?

Thanks!

Hi @sazp96

I don’t know of a way right now to get the core’s location without using HTTP directly, but I think in the near future there will be.

There has been some discussion of private per core publish and subscribe events (similar to the existing on-line/off-line events you can see using IFTTT) that would allow you to get your public IP address that the cloud sees. I know that @Dave was working on this, but there were some problems.

Armed with that IP address, you could have a webhook that uses the per-IP endpoint of a service like telize, or of your own server:

http://www.telize.com/geoip/46.19.37.108

Heya!

If you subscribe to the event “spark/device/ip”, and then publish the event “spark/device/ip” from your device, the cloud should reply with your external ip. :slight_smile: This is a somewhat experimental / undocumented feature, so please let me know if it doesn’t work for you. My goal was to make the experience a bit cleaner before fully releasing it.

Thanks!
David

Hi @dave and @bko,

I have tried “spark/device/ip”, but when I was able to get it to work, my other subscriptions stopped working. I wasn’t sure if I had some string/pointer issue or was just using it wrong. That is why I was looking for alternatives.

In the example below, the other two subscriptions work fine, but “spark/device/ip” does not. @dave, could you help me figure out what I am doing wrong? Thanks.

#include "application.h"

int lastTime = Time.now();
int turn = 1;

int counterSe = 0;
int counterIP = 0;
int counterHo = 0;
int counterMa = 0;

char externalIP[50] = "EmptyIP";
char response[1024] = "EmptyRes";

void hoHandler(const char *name, const char *data)
{
	counterHo++;
	String str = String(data);
	strcpy(response, str.c_str());
}

void ipHandler(const char *name, const char *data)
{
  counterIP++;
	String str = String(data);
	strcpy(externalIP, str.c_str());
}

void maHandler(const char *name, const char *data)
{
	counterMa++;
	String str = String(data);
	strcpy(response, str.c_str());
}

void setup()
{
	delay(5000);
	Spark.variable("externalIP", externalIP, STRING);
	Spark.variable("response", response, STRING);

	Spark.variable("counterSe", &counterSe, INT);
	Spark.variable("counterIP", &counterIP, INT);
	Spark.variable("counterHo", &counterHo, INT);
	Spark.variable("counterMa", &counterMa, INT);

	Spark.subscribe("hook-response/", hoHandler, MY_DEVICES);
  Spark.subscribe("spark/device/ip", ipHandler);
	Spark.subscribe("manualCli", maHandler);
}

void loop()
{
	if (Time.now() - lastTime > 15)
	{
		counterSe++;
		if(turn == 1)
		{
			Spark.publish("spark/device/ip");
			turn = 2;
		}
		else
		{
			Spark.publish("server3"); //This calls the webhook
			turn = 1;
		}
		lastTime = Time.now();
	}
}

Heya @sazp96,

Thanks for posting this, I think there might be an issue, digging in.

Thanks,
David

Hmm, I made another version of your app and I couldn’t get the other subscriptions to stop working:

#include "application.h"

int lastTime = Time.now();
int turn = 1;

char externalIP[50] = "EmptyIP";
char response[256] = "EmptyRes";


void ipHandler(const char *name, const char *data)
{
    Serial.println("heard back " + String(name) + ": " + String(data));
     
    //counterIP++;
	String str = String(data);
	strcpy(externalIP, str.c_str());
}


void setup()
{
	//delay(5000);
	Serial.begin(9600);
	
	Spark.variable("externalIP", externalIP, STRING);
	//Spark.variable("response", response, STRING);

    Spark.subscribe("three", ipHandler);
    Spark.subscribe("two", ipHandler);
	Spark.subscribe("one", ipHandler);
    Spark.subscribe("spark/device/ip", ipHandler);
	//Spark.subscribe("manualCli", maHandler);
}

void loop()
{
	if (Time.now() - lastTime > 15)
	{

	    Serial.println("publishing...");
		Spark.publish("spark/device/ip");

		lastTime = Time.now();
	}
}

Some other users have mentioned there might be an issue with subscribing to multiple hooks, so I’ll give that a look as well.

Thanks,
David

Same over here. In neither your code nor mine do the other subscriptions stop working; they always work.

The problem now is that “spark/device/ip” doesn’t work, even when I strip the code down to only the IP subscription. Is this what you are seeing as well?

Thanks

Hi @Dave, have you had any luck getting “spark/device/ip” working?

Cheers,
Sergio

Hi @sazp96,

I’m sorry I haven’t had time to dig back into this recently, so many things happening at once! I think I’ll have more time early next week to revisit these, sorry about the delay.

Thanks,
David

No worries. Thanks for the update.

I had this working once, but I recall I started having issues with subscribe/publish, so I pulled the code. I don’t know if it will help. I might have had issues because I called the publish for the IP in setup(). I stripped down the code for this posting.

void setup()
{
    Spark.subscribe("spark/device/ip", ipHandler, myDeviceId);
    Spark.publish("spark/device/ip", NULL, 60, PRIVATE);
}

void publishEvent(String eventName, String message)
{
    Spark.publish(eventName, message, 60, PRIVATE);
}

void ipHandler(const char *event, const char *data)
{
    publishEvent("CurrentIp", data);
}

Hi @xcode

I think this code has a race condition. When you call subscribe, a Spark protocol request is queued up to notify the cloud that you want to listen to events of the given name. Then when you publish, the cloud gets a request that should trigger the subscription, but you haven’t given the subscribe request any time to complete and be processed before the publish comes in.

By putting some delay between these calls I was able to make it work fine:

void setup() {
     Spark.subscribe("spark/device/ip", ipHandler, Spark.deviceID());
     delay(1000);
     Spark.publish("spark/device/ip", NULL, 60, PRIVATE);

}

I didn’t play around to find the reliable minimum but 1 second worked fine for me.

Can you guys try simply using Spark.publish("spark/device/ip");?

Or try this demo code: http://docs.spark.io/firmware/#get-public-ip

Hey guys,

I was able to isolate a repro of the issue. In a nutshell, I found that when you subscribe to a webhook, “spark/device/ip” stops working. It only happens with webhooks; “spark/device/ip” works fine alongside other subscriptions.

@dave, let me know if I’m missing something obvious here.

Cheers

Working fine without webhook:

#include "application.h"

int lastTime = Time.now();
char externalIP[15] = "EmptyIP";

void ipHandler(const char *name, const char *data)
{
	String str = String(data);
	strcpy(externalIP, str.c_str());
}

void setup()
{
	Spark.variable("externalIP", externalIP, STRING);
	//Spark.subscribe("hook-response/get_weather", ipHandler, MY_DEVICES);
	Spark.subscribe("spark/device/ip", ipHandler);
	Spark.subscribe("hello", ipHandler, MY_DEVICES);
}

void loop()
{
	if (Time.now() - lastTime > 15)
	{
		Spark.publish("spark/device/ip");
		lastTime = Time.now();
	}
}

Not working when subscribed to a webhook:

#include "application.h"

int lastTime = Time.now();
char externalIP[15] = "EmptyIP";

void ipHandler(const char *name, const char *data)
{
	String str = String(data);
	strcpy(externalIP, str.c_str());
}

void setup()
{
	Spark.variable("externalIP", externalIP, STRING);
	Spark.subscribe("hook-response/get_weather", ipHandler, MY_DEVICES);
	Spark.subscribe("spark/device/ip", ipHandler);
	Spark.subscribe("hello", ipHandler, MY_DEVICES);
}

void loop()
{
	if (Time.now() - lastTime > 15)
	{
		Spark.publish("spark/device/ip");
		lastTime = Time.now();
	}
}

Hi @sazp96

There are many things wrong with this code that make it difficult to tell where the bug might be. Let’s just focus on the non-working second case:

  • You have three subscriptions all calling the same function, and that function copies the event data into a local string array. Is having them all call the same function required to trigger the problem?
  • In the event handler you subscribe with, you first copy the char array data into an Arduino String object and then copy it out again into externalIP. This just wastes memory.
  • Also in the event handler, you pay no attention to whether the string data you received is longer than the 15-element externalIP array, so this can really clobber memory.
  • You are initializing lastTime by calling Time.now(), which is likely to run before the clock is set. This is probably OK since it likely returns 0 anyway, but an explicit 0 would be better.

So does it still fail if your webhook subscribe function is different? Does it still fail if you bounds check the incoming event data before smashing it into externalIP?

If it does, that will be a lot easier to debug.

Hi @bko,

Thanks for the quick reply and good feedback, below are some comments/questions on your points.

No, it is not. I was just being lazy and reused the existing handler for the repro.

Fair point. I wanted to do this instead:

String externalIP = "EmptyIP";

void ipHandler(const char *name, const char *data)
{
	externalIP = String(data);

But it gives me an "undefined reference to 'String operator=(String&&)'" error. What am I doing wrong?

Since I can't get it to work with the String class, I'm using a char array:

strcpy(externalIP, data);

For the repro I didn't validate this. Given that only "spark/device/ip" calls the handler, I believe it should have no impact.

Fair point.

It seems so. Below is the improved code, which still fails:

#include "application.h"

int lastTime = 0;
char externalIP[150] = "EmptyIP";

void ipHandler(const char *name, const char *data)
{
	strcpy(externalIP, data);
}

void webHookHandler(const char *name, const char *data)
{
}

void setup()
{
	Spark.variable("externalIP", externalIP, STRING);
	Spark.subscribe("hook-response/get_weather", webHookHandler, MY_DEVICES);
	Spark.subscribe("spark/device/ip", ipHandler);
}

void loop()
{
	if (Time.now() - lastTime > 15)
	{
		Spark.publish("spark/device/ip");
		lastTime = Time.now();
	}
}

Hi @sazp96

This certainly looks like a bug, and I opened an issue for it here: https://github.com/spark/firmware/issues/420

You can work around this by putting the subscribe calls in the reverse order. I don’t know why.

void setup()
{
	Spark.variable("externalIP", externalIP, STRING);
	Spark.subscribe("spark/device/ip", ipHandler);  //I need to be first!
	Spark.subscribe("hook-response/get_weather", webHookHandler, MY_DEVICES);
}

Thanks @bko!