Several places in the documentation indicate that the range of a signed int is +/-32768. The Spark Core has a 32-bit microcontroller, so the range should be -2,147,483,648 to 2,147,483,647.
BTW, I have not found the ranges for long or float, but I will assume they are the same as in any other 32-bit implementation.
No need for int32_t; int already has a signed range of -2,147,483,648 to 2,147,483,647. See the code below: after about twenty seconds the variable "count" will overflow to -2,147,483,648.
Thanks @sierrasmith71, of course you are absolutely right. Where in the docs do we have it listed as 32768? Also, the docs are open source, so you can submit a pull request: just fork the repo and fix it.