It’s finally here! There is now a Time library baked into the Core firmware. You can now get the date and time on your Spark Core without the need for an additional library (sorry @bko). The documentation is here, under Core Firmware → Libraries → Time.
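For the impatient, a minimal sketch using the new library might look something like this (function names taken from the docs; untested on my end):

void setup() {
    Serial.begin(9600);
}

void loop() {
    // Unix timestamp: seconds since 1970-01-01 00:00:00 UTC
    Serial.println(Time.now());

    // Individual fields
    Serial.print(Time.hour());
    Serial.print(":");
    Serial.println(Time.minute());

    // Human-readable string for a given timestamp
    Serial.println(Time.timeStr(Time.now()));

    delay(5000);
}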
I also noticed that @zachary snuck in some documentation for Spark.syncTime() that will synchronize the time from the cloud. Ideally, I think you would only run this once a day or so as needed.
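If I’m reading the docs right, the once-a-day pattern would look something like this:

#define ONE_DAY_MILLIS (24 * 60 * 60 * 1000)
unsigned long lastSync = millis();

void loop() {
    if (millis() - lastSync > ONE_DAY_MILLIS) {
        // Request fresh time from the Spark Cloud once a day
        Spark.syncTime();
        lastSync = millis();
    }
}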
Now it’s time for me to rewrite large portions of code that use the NTP library…
I’m very pleased to see the concurrent release of some documentation. But how did you become aware of this, @wgbartley? Were you monitoring commits at Github? Was / is there going to be an official announcement, or is this it?
I am puzzled as to the API - the set of function calls available to the programmer. Are the standard time and date Posix calls supported too? And, if not, why not?
Since @wgbartley is part of the Spark Elites, I guess he’s a bit more up to date on new developments. Besides that, this new RTC stuff has been mentioned on several topics, on several occasions. I still do hope they make some sort of official announcement, since not everyone may read every random post on the forum. Pinning this topic would be a great start to make sure everybody sees it. A blog post may not be out of order either.
As I’ve written in the “documentation, again” topic, this would be a great use case for a “change log/news” page. This would make it easy for people to find all the new updates, from which you could be redirected to the respective documentation.
Looking forward to whatever is coming next, keep up the great work guys!
The firmware code is missing the no-argument version of timeStr(): right now it is only possible to call it by passing a timestamp, while the docs only show it without arguments. It’s really not a problem, just a discrepancy between the code and the docs.
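In the meantime, passing the current timestamp in yourself should give the same result (assuming Time.timeStr() returns a printable String, as the docs imply):

// Workaround until the no-argument overload exists in firmware
Serial.println(Time.timeStr(Time.now()));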
There is a widely accepted de facto standard set of C function calls used by application programmers. These are not O/S kernel calls; they are the set of functions expected to be used as an application programming interface, an API. The C standard varies a little, but the overlap between POSIX, ANSI, Standard C, C99, etc. is very close. We are compiling using gcc, which ought to be able to generate machine code for any supported target CPU from POSIX / ANSI / Standard C/C++. So far I sound knowledgeable. Now I might start to sound dumb: what’s to stop us using any function in the POSIX/ANSI/Standard C/C++ API(s) that gcc supports? That’s what I am trying to work out.
To return to the question I asked and the clarification @wgbartley requested: these are the functions/definitions, all of which (or a subset of which?) should be available for us to use, I think!
What I am also trying to address is the Spark team’s development of arbitrary libraries and APIs when perfectly good, popular alternatives already exist: we ought not to be re-inventing the railway if that means our wagons will have a different gauge to everyone else’s; rather, our engineering should be building a railway that other people’s wagons can travel on unmodified. Substitute API for railway, and application programs for wagons. I suppose I sound ungrateful now. But I don’t want to write code for the Spark that hasn’t a hope in hell of working elsewhere, and vice versa.
I would say this API is inspired by, and most similar to, the Arduino Time library. The Arduino library is function-based rather than object-based and allows for multiple sources of time sync, but otherwise they are very close.
As an ingrate I comment that it would be better were they the same, or as close to the same as possible. It seems to me that the Arduino function-call API should have been what was implemented, and then, for the OO-compulsives, an object/method interface implemented directly on top of that. Now, for someone who wants to write code which is compatible between Arduino and Spark, a C function-call interface will have to be contrived from the OO interface, precisely the wrong way around.
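For what it’s worth, contriving that function-call layer looks like a handful of one-line wrappers. This is a hypothetical, untested sketch; the names mirror the Arduino Time library, and the Time.* methods are the ones in the Spark docs:

// Hypothetical compatibility shim: expose the Arduino Time
// library's free functions on top of the Spark Time object.
inline int hour()   { return Time.hour(); }
inline int minute() { return Time.minute(); }
inline int second() { return Time.second(); }
inline int day()    { return Time.day(); }
inline int month()  { return Time.month(); }
inline int year()   { return Time.year(); }
inline unsigned long now() { return Time.now(); }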
@psb777, have you tried including the library time.h?
#include <time.h>
I know for sure it has at least localtime(), mktime() and strftime() functions because I was using them before. Perhaps the whole thing is there for you.
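For example, something like this ought to work if those routines are indeed linked in (untested, and I’m assuming localtime() behaves as UTC when no timezone is configured):

#include <time.h>

void setup() {
    Serial.begin(9600);
}

void loop() {
    // Feed the cloud-synced timestamp into the standard C library
    time_t t = Time.now();
    struct tm *cal = localtime(&t);

    char buf[32];
    strftime(buf, sizeof(buf), "%Y-%m-%d %H:%M:%S", cal);
    Serial.println(buf);

    delay(5000);
}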
I’m kind of hoping someone is just going to answer the more general question: to say what I can and cannot use out of the POSIX/ANSI etc. standards, or to tell me why I would be an idiot to use them (some memory overhead or something). And/or to confirm that the date and time functions which I should never previously have expected to work at all (such as time()) are now available as I would expect AS A CONSEQUENCE of Spark Cores now always knowing the time. It seems to me that the availability of time() [which returns the time as the number of seconds since the Epoch, 1970-01-01 00:00:00 +0000 (UTC)] should be an aim of the project to get the Spark to know the time, and that this should be documented. time(), it seems, is identical to the new Time.now() method. It should work. I guess I’ll have to try it. If it works, does Time.now() use less RAM than time()?
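If anyone beats me to it, a sketch along these lines should settle the question (hypothetical test, not verified):

#include <time.h>

void setup() {
    Serial.begin(9600);
}

void loop() {
    // If the C library clock is wired to the synced RTC, these
    // two values should agree (seconds since the Unix epoch).
    Serial.print("time():     ");
    Serial.println((unsigned long) time(NULL));
    Serial.print("Time.now(): ");
    Serial.println((unsigned long) Time.now());
    delay(5000);
}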
The function setTime(h, m, s, D, M, Y) is missing.
The Unix timestamp time_t has the wrong integer type; the correct type is unsigned long or unsigned int (maximum 2^32 − 1).
I have created a small Time and Date library with time zone support, universally usable DST/summer time, sunrise and sunset, simple-to-use date and time formatting, and a very compact function for four NTP time servers.
Next week you can download it from my site, kendziorra.nl, including examples for DST, sunrise/sunset, date and time formatting, and NTP.
Hey,
Is there any way to attach an interrupt to these time library functions, like calling a particular function at a certain time (HH:MM:SS)? Is there any other way I could achieve this besides the built-in timers, which are meant for short intervals?
Thanks in advance…
You should check out my port of the Time Alarms library for Spark; I think you will like it. It is available in the web IDE too. It uses the built-in time functions that are baked into Spark now.
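For the scheduling question above: the Arduino TimeAlarms API that the port follows works roughly like this (the include name and exact Spark syntax may differ; check the library’s examples in the web IDE):

#include "TimeAlarms.h"  // include name may differ in the web IDE

void morningAlarm() {
    Serial.println("Alarm fired!");
}

void setup() {
    Serial.begin(9600);
    // Call morningAlarm() every day at 08:30:00
    Alarm.alarmRepeat(8, 30, 0, morningAlarm);
}

void loop() {
    // Use Alarm.delay() instead of delay() so alarms get serviced
    Alarm.delay(1000);
}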