Data type "word" appears not to be implemented

Compilation fails on the data type word (it can be replaced by “uint32_t”, I suppose); nevertheless, this makes porting from other compilers (typically from Arduino) a little cumbersome.
Note that the documentation does list word as a data type, see:

word hello;
void setup() {
    hello = 'H';
}

void loop() {
}

After verification:

In file included from ../inc/spark_wiring.h:30:0,
from ../inc/application.h:29,
from test.cpp:2:
../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning "Defaulting to Release Build" [-Wcpp]
warning "Defaulting to Release Build"
test.cpp:1:1: error: 'word' does not name a type
test.cpp: In function 'void setup()':
test.cpp:3:1: error: 'hello' was not declared in this scope
void setup();
make: *** [test.o] Error 1
Error: Could not compile. Please review your code.

I took a quick look at the source code, and it doesn’t seem to be defined there.

Let’s @ping @satishgn to find out more. :wink:

The same is true for byte; both can be added with

typedef uint8_t byte;
typedef uint16_t word;

at the top of the code.

There are also some useful macros, such as word() to create a word from some other type, or to combine two bytes:

word makeWord(byte h, byte l) { return (h << 8) | l; }
word makeWord(word t) { return t; }
#define word(...) makeWord(__VA_ARGS__)

With this, the user can then write

word w1 = word(123);  // w1==123
word w2 = word(1, 2);  // w2==258

As far as I could test, the other documented data types are accepted besides word (i.e., byte is working).
NB: sorry for my mistake, word should be equivalent to uint16_t and not uint32_t.

I think this gets at the heart of why the Spark team didn’t include this. On Arduino, int is 16 bits and word corresponds to int, but on Spark int is 32 bits. What should word correspond to then, 16 or 32 bits? I think it is always better to use explicit types like uint16_t when you know you need exactly 16 bits, and to reserve things like the int type for loop counters etc.