Spark's I2C implementation

I have a question about the current Spark Wire.h implementation.
I am writing a library for the LTC2990 I2C quad-channel voltmeter.

See datasheet screenshot: http://screenshots.ryotsuke.ru/scr_a696a5cbbf7d.png

My questions:
Do I need to append the R/W bit myself? The ACK bit?
I assume I don't need to, and that I don't need to use left-shifted addresses like 0x98. Is that correct?

Please confirm whether I understand correctly:

Write command: address from datasheet: 0x98, command: 0x3, data: 0x14 (see Figure 5)
Spark Code:

Wire.beginTransmission(0x98>>1); 
Wire.write(0x3);
Wire.write(0x14);
Wire.endTransmission();

Read command: address from datasheet: 0x98, command: 0x3 (see Figure 7)
Spark Code:

Wire.beginTransmission(0x98>>1); 
Wire.write(0x3); 
Wire.endTransmission(); //very unsure here
Wire.requestFrom(0x98>>1, 1, true);
if(Wire.available()) {
    uint8_t result = Wire.read();
    return result;
}

The full datasheet is http://cds.linear.com/docs/en/datasheet/2990fc.pdf; the I2C-related material is on pages 13-14.

Nope. Just use the 7-bit address! It handles the R/W bit and ACK/NACK stuff. All you have to do is read and write the data!

???
I am using the 7-bit address in this code, and I am just reading and writing data. What exactly is the "nope" here?

@ryotsuke I would assume @timb's "Nope." response is the answer to the first two questions in your original post.

I see :smiley:
Though the code is still not working for my device. I guess I'll switch to Arduino debugging; it is easier to do and easier to reset.

It doesn't appear you're using the 7-bit address according to that code… Why are you doing …0x98>>1… instead of just the address?

Because the datasheet uses 8-bit addresses (0x98, 0x9A, 0x9C plus the R/W bit), and I want to see the datasheet's address in the code.
The 7-bit address is 0x4C.

Try converting it beforehand and just giving it the 7-bit address. (0x4C if my math is correct!)
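For example, something like this (just a sketch reusing the command and data bytes from your own example; LTC2990_ADDR is only a name I'm using here):

const uint8_t LTC2990_ADDR = 0x98 >> 1;   // 0x4C: the 7-bit address; Wire adds the R/W bit itself

Wire.beginTransmission(LTC2990_ADDR);
Wire.write(0x03);                          // command byte from your example
Wire.write(0x14);                          // data byte from your example
Wire.endTransmission();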

0x98>>1 is 0x4C, so that is not the issue. Can you adjust the code to the correct-for-Spark variation? Ignore the address, it is working fine. You are saying transmitting is done differently on the Spark; I need it to fit the datasheet. Arduino runs this fine.

The Wire library works the same. It's the low-level STM32 I2C drivers that are different. They're the basic out-of-the-box drivers provided by ST and not very fancy. Until I finish my fresh DMA implementation, I2C is going to be a bit buggy and unstable. Make sure you give the read and write operations time to take place after your endTransmission call and you'll generally be alright.
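Roughly this shape, for example (a sketch only; the 200 µs figure is just an illustration to tune for your setup, and addr/reg are placeholders using the values from your example):

uint8_t addr = 0x4C;              // 7-bit device address (0x98 >> 1)
uint8_t reg  = 0x03;              // register/command byte to read

Wire.beginTransmission(addr);
Wire.write(reg);
Wire.endTransmission();
delayMicroseconds(200);           // give the write a moment to actually finish on the bus
Wire.requestFrom(addr, 1, true);
delayMicroseconds(200);           // and give the read a moment to land in the buffer
if (Wire.available()) {
    uint8_t value = Wire.read();
}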

Do you have access to a logic analyzer? A Bus Pirate, or even a Raspberry Pi? If so, I'd log the I2C traffic from the Arduino and the Core and compare the differences.

Can you just adjust the actual real code, please? :smiley: I am not a tech specialist; I have no idea how it works at a low level. That is why I am asking tech specialists for help.

“Add magic delay values to places” is not actually helping ^^’’ Some of the things you say sound like total nonsense to me ^^’’, like the address thing, or “give time for read and write operations to take place”. How am I supposed to give time to a read operation? Like this?

b = Wire.read();
//here the value of b is one value
delay(1);
//and here the value of b can be different? Is the read operation async?

I’ve located the blocking code:

Wire.beginTransmission(_addr);
Wire.write(r);
uint8_t status = Wire.endTransmission(false);
delayMicroseconds(150);
if(status == 0) {
    //IF CODE REACHES HERE SPARK DIES <-------
    if(Wire.requestFrom(_addr, 1, true) == 1) {
        uint8_t result = Wire.read();
        return result;
    }
}
return 0;

Once the code reaches Wire.requestFrom(_addr, 1, true) and Wire.read(), the core reboots. Basically it reboots immediately after the cloud connects.
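One thing that might be worth trying (purely a sketch; the 5 ms timeout is an arbitrary number, not something from the docs): bound the wait for the byte instead of reading unconditionally, so a missing byte can't stall things indefinitely:

if (Wire.requestFrom(_addr, 1, true) == 1) {
    uint32_t start = millis();
    while (!Wire.available() && (millis() - start) < 5) {
        // wait up to ~5 ms for the byte to show up in the receive buffer
    }
    if (Wire.available()) {
        return Wire.read();
    }
}
return 0;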

One more weird issue.

Wire.beginTransmission(_addr);
Wire.write(r);
uint8_t status = Wire.endTransmission(false);

This works fine when no device is present on the I2C line. So I check the status and only try to read when status == 0, meaning the write completed successfully and the device is present.
Now to the interesting part: when I read, the Spark dies. I expected that once I disconnect the I2C device it would not do the read operation and the Spark would recover, but it does not. Does reading from Wire corrupt the firmware?

Currently I have no idea what is happening. I'm now starting monitoring not immediately after the Spark boots, and it is no longer stuck in a reboot loop.

I have a question: do reading a Spark variable and firing a function work on interrupts? If so, how can I prevent them from firing during time-sensitive processes?

I’ve updated the gist http://gist.github.com/Ryotsuke/9100459 with working code.
But its stability rate is only 60%, which is terribly low. Stability = successful reads / all reads.

I am working on it. :wink:

I’ve reached about 99% stability now by adding noInterrupts() (gist updated too).
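For anyone finding this later, the shape of it is roughly this (a sketch only, assuming the whole transaction is simply wrapped in noInterrupts()/interrupts(); the real code is in the gist):

noInterrupts();                              // keep interrupt-driven work out of the transaction
Wire.beginTransmission(_addr);
Wire.write(r);
uint8_t status = Wire.endTransmission(false);
delayMicroseconds(200);
uint8_t result = 0;
if (status == 0 && Wire.requestFrom(_addr, 1, true) == 1) {
    result = Wire.read();
}
interrupts();                                // re-enable interrupts as soon as the bus work is done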

The remaining issues are: why the core hangs if monitoring starts immediately, and what the exact delays in the code should be.

I guess I will never know the answer to the second one :-/ 200 microseconds works fine.

What is the current ETA on this feature?
I feel I need some table for that :slight_smile: I'm keeping an eye on the Serial1 buffer issue as well.

I made the Spark monitor its own supply battery voltage: http://screenshots.ryotsuke.ru/scr_26faf8a64106.png
The reading error rate is high again.
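For reference, the raw-to-volts conversion for a single-ended LTC2990 channel should look roughly like this (the 305.18 µV/LSB scale is my reading of the datasheet, so double-check it; the function name is just a placeholder):

// msb/lsb are the two bytes read back from a Vx MSB/LSB register pair
float ltc2990SingleEnded(uint8_t msb, uint8_t lsb) {
    uint16_t raw = ((uint16_t)(msb & 0x3F) << 8) | lsb;  // drop the data-valid and sign bits, keep 14 bits
    return raw * 0.00030518f;                            // ~305.18 µV per LSB for single-ended inputs
}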

Advanced I2C Implementation/Rework Pull Request for those interested:
