Keep getting DEVICE_DISCONNECTED_F back from Spark Dallas Temperature

Ever since I updated my core to use the Spark Dallas Temperature library from the libraries tab, rather than the copy I had directly in my app, I've been getting back a temperature of -196.6 on almost all reads, which is the DEVICE_DISCONNECTED_F error. Unfortunately I have no way to roll back to my previous version, since there's still no versioning in the Web IDE. Running curl in a loop with a 5-second sleep, only 264 out of 3343 attempts came back with a real reading.

I added some debugging but I don’t see why it’s happening.
Anyone know why I might get this error?

This is my update function:

static void update()
{
    // If we don't have a valid address yet, try to (re)fetch the first device
    if (!sensors.validAddress(address)) {
        sensors.getAddress(address, 0);
    }

    // Debug string: the 8-byte ROM address plus validity/connection flags
    sprintf(addrString, "%02x%02x%02x%02x%02x%02x%02x%02x - %s,%s",
        address[0],
        address[1],
        address[2],
        address[3],
        address[4],
        address[5],
        address[6],
        address[7],
        sensors.validAddress(address) ? "VAL" : "INV",
        sensors.isConnected(address) ? "CON" : "DISC"
        );

    if (sensors.validAddress(address)) {
        sensors.requestTemperatures();
        // uint8_t scratchPad[9];
        // float temp = sensors.rawToFahrenheit(sensors.calculateTemperature(address, scratchPad));
        float temp = sensors.getTempF(address);

        temperature = (int) (temp + 0.5f);  // rounded copy for the int variable
        sprintf(tempString, "%f", temp);
    }
    else {
        temperature = -123;  // obviously-bogus value so this branch stands out
        strcpy(tempString, "Invalid Address, Sigh");
    }

    Spark.publish("temperature", tempString, EVENT_EVERY_MS / 1000, PRIVATE);
}

The same thing happens if I just watch the event stream:

event: temperature
data: {"data":"-196.600006","ttl":"30","published_at":"2015-01-19T03:46:43.042Z","coreid":"foo"}

event: temperature
data: {"data":"-196.600006","ttl":"30","published_at":"2015-01-19T03:47:13.041Z","coreid":"foo"}