DS2482 Headroom

EDIT: if you know of some other things to adjust, let me know. Also, with some work I could put together a simple GitHub build so you can duplicate my situation.

I'm successfully using the DS2482-RK library to interface with a 1-Wire DS18B20 temperature sensor. I ran into a little difficulty that I wanted to share, and maybe get some suggestions.

My app hammers the I2C bus pretty hard. I've framed the work using timers to even things out over time; my main frame runs on a 0.100 s update interval.
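For reference, the pacing works roughly like this. This is just a minimal sketch with a caller-supplied millisecond clock; `FramePacer`, `frameIntervalMs`, and `framesRun` are my own illustrative names, not anything from the library:

```cpp
#include <cassert>
#include <cstdint>

// Minimal sketch of spreading I2C work across fixed 100 ms frames.
// All names here are illustrative, not from DS2482-RK.
const uint32_t frameIntervalMs = 100;

struct FramePacer {
    uint32_t lastFrameMs = 0;
    int framesRun = 0;

    // Call every loop pass with the current time; work runs at most
    // once per frame interval, which evens the I2C load over time.
    void tick(uint32_t nowMs) {
        if (nowMs - lastFrameMs >= frameIntervalMs) {
            lastFrameMs += frameIntervalMs;  // keep a steady cadence
            framesRun++;                     // the frame's I2C work would go here
        }
    }
};
```

Advancing `lastFrameMs` by the interval (rather than snapping it to `nowMs`) keeps the cadence steady even when a loop pass arrives late.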

There is an array, commandListStack, that seems to act like a buffer. When I stress-test my app, that buffer overruns and I get a 4-blink SOS panic (I2C error). I can avoid it by setting COMMAND_LIST_STACK_SIZE to 12 (nominally 4) and recompiling, at a minor cost in code size.
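My mental model of what's happening is something like the following. This is illustrative only, not the library's actual code; I'm just modeling a fixed-capacity command stack where pushing past the compiled-in size is the overrun condition:

```cpp
#include <cassert>
#include <cstddef>

// Illustrative model only, not DS2482-RK's implementation.
// A fixed-size command stack: pushing past capacity N is the
// overrun that (in the real library) ends in the SOS panic.
template <size_t N>
struct CommandStack {
    int commands[N];
    size_t depth = 0;

    // Returns false on overflow instead of writing out of bounds.
    bool push(int cmd) {
        if (depth >= N) {
            return false;  // this is where the overrun would occur
        }
        commands[depth++] = cmd;
        return true;
    }
};
```

With a capacity of 4, a burst of five queued commands overflows; with a capacity of 12 the same burst fits, which matches what I saw after recompiling.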

The vulnerability to overrun remains, though the consequence of the SOS is small for me: I keep all critical parameters in retained memory (backup SRAM), so the Photon 2 restarts and runs fine.

Others may not have a good experience if they run into this.

Could the failure be a little less dramatic, perhaps dropping data instead of panicking? It may help others if you could address that. I also notice that sometimes the ready flag goes to 0 and stays there after stressing things; I'd like to know how to kick-start the DS2482 logic if that happens.
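For the stuck ready flag, what I have in mind is something like a stall detector that requests a re-initialization after the flag has been 0 for too many consecutive checks. This is a sketch under my own assumptions; `StallDetector` and `resetRequested` are invented names, and I don't know what the library's real recovery hook would be:

```cpp
#include <cassert>

// Hypothetical stall detector, not part of DS2482-RK. If the ready
// flag stays 0 for `limit` consecutive checks, request a re-init.
struct StallDetector {
    int notReadyCount = 0;
    int limit;
    bool resetRequested = false;

    explicit StallDetector(int limit) : limit(limit) {}

    // Call once per frame with the current ready flag.
    void check(bool ready) {
        if (ready) {
            notReadyCount = 0;
        } else if (++notReadyCount >= limit) {
            resetRequested = true;  // caller would re-init the DS2482 here
            notReadyCount = 0;
        }
    }
};
```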

Thanks. I'm enjoying the Photon 2. I think you did a great job. Migrating like this is THE hardest thing developers do.

The command stack does have a finite maximum size. Clearly it's bigger than 4, but it's not infinite: given a maximum CPU speed there will be a value that cannot be exceeded, but it would be hard to calculate.

Yes, it should check whether the bounds have been exceeded and stop. However, it's really hard to unwind the operations at that point, and the chip will need to be reset because it will be in an indeterminate state. I deemed it so unlikely to happen that it wouldn't be worth the effort, but that may have been an incorrect determination.
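The check-and-stop path might look roughly like this: on overflow, drop the new command and flag the chip for a reset rather than trying to unwind mid-transaction. A sketch under my own assumptions (the names are invented, and this sidesteps the hard unwinding problem rather than solving it):

```cpp
#include <cassert>
#include <cstddef>

// Sketch of "check bounds, stop, and schedule a chip reset".
// Not the library's code; all names are invented for illustration.
template <size_t N>
struct GuardedCommandList {
    int commands[N];
    size_t depth = 0;
    bool chipResetNeeded = false;

    bool push(int cmd) {
        if (depth >= N) {
            // Can't safely unwind mid-transaction, so drop the new
            // command and mark the DS2482 as needing a reset.
            chipResetNeeded = true;
            return false;
        }
        commands[depth++] = cmd;
        return true;
    }

    // After the chip has been reset, start from a clean state.
    void resetCompleted() {
        depth = 0;
        chipResetNeeded = false;
    }
};
```

The trade-off is that queued work is lost on overflow, but the device comes back in a known state instead of panicking.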

Rick, I'm hitting it REALLY hard, and I've been hitting various limits that surprised other Particle people. I think I2C is competing with Serial for resources? I'm sending a lot of data out Serial and taking quite a bit in, and increasing the Serial baud rate hasn't helped much. Even for me this is a stress test: I gather data over Serial and overplot it against a digital-twin model running in Python. That takes a lot of fidelity, but only for regression and verification testing, not when deployed, so I wouldn't worry about it. But I do wonder if I'm pulling the right handles; nothing worse than misusing somebody's creation (nice work!).
