This library is for Arduino. I ported it to C++ for the Spark, but I'm having trouble getting correct values out of my sensor board (using the SparkFun 9 Degrees of Freedom - Sensor Stick).
I think it has to do with the I2C communication, but I can't see what is going wrong. Are there fundamental differences between Arduino and the Spark Core besides the pull-up resistors?
I'm flashing the Spark via USB using the latest firmware from GitHub. I also tried flashing via the cloud, but no luck either.
@stahl I reviewed the schematic for the SparkFun product. What supply voltage are you feeding the board? The board has an onboard voltage regulator that drops your Vin to +3.3 V. I don't know which regulator they are using, but regulators typically need an input somewhat above their output voltage in order to regulate.
@mtnscott Thanks. I have now tried powering the sensor board from VIN on the Spark, but no luck.
I scanned the I2C bus and found that only two of the three sensors on the IMU were detected. Again for reference, here's the IMU: https://www.sparkfun.com/products/10724
Any idea why one address/sensor is not found when scanning?
Is I2C broken or buggy?
Do I need to provide a delay(…) between talking to the different I2C devices?
(BTW, I tested the IMU on an Arduino; it's not a sensor board failure.)
I2C works fine on the Spark Core – it's a hardware implementation in the ARM chip. Two things leap to mind after glancing at the Arduino sample code you pointed to in your first post:
A Spark int is 32 bits, not 16 bits as on Arduino, so code that assembles bytes into wider words should be changed to use an explicit fixed-width type like uint16_t (or int16_t for signed values) instead of int.
Although the I2C transfer speed is the same, the Spark runs the code around the I2C transactions a lot faster than an Arduino does. The sample code I saw already had a bunch of delay(5); statements scattered around, indicating that slow I2C devices are in use on that board. I would find the places that don't have delays, add some, and see if it improves.
One user reported electrical problems because I2C on the Spark had too fast an edge transition time for his device and wiring (long wires), so he added series 100 ohm resistors from the Core pins to the device (at the Core end of the wiring) to slow down the edges. Might help, might not, but it's easy to try.
It really looks like a type overflow problem in my port of the library from Arduino to Spark, though I couldn't figure out how to solve it.
The following code is giving me different results in arduino and spark:
float accel[3];
byte buff[6];
// buff is filled with bytes from i2c device
// let's say the values are the following
buff[3] = 255;
buff[2] = 219;
// create float from the two bytes
accel[0] = (buff[3] << 8) | buff[2];
For accel[0], Arduino gives me -37.00, while the Spark gives me 65499.00 (as does Python).
As far as I understand, Arduino rolls the result over because its int is only 16 bits, but I have no clue how I should deal with this.
This is just really bad coding practice. Try this:
float accel[3];
byte buff[6];
// buff is filled with bytes from i2c device
// let's say the values are the following
buff[3] = 255;
buff[2] = 219;
// create float from the two bytes
int16_t temp = (buff[3] << 8) | buff[2]; // form a signed 16-bit number
accel[0] = (float)temp;