Internal RTC calibration


We have a product (a clock) which uses the internal RTC of the STM32 processor. Some customers run the clock offline, and we would like to implement RTC calibration for them. The internal 32.768 kHz oscillator is not very accurate by default (a quick test gave me an offset of around 66 ppm), and there is a simple way to calibrate it using the RTC_CALR register of the STM32.
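For a sense of scale, a frequency offset in ppm maps directly to time drift; here is a quick sanity check in plain C (nothing Photon-specific, just the arithmetic):

```c
/* Convert a frequency offset in ppm to expected time drift.
 * A clock that runs 66 ppm fast gains 66 microseconds per second. */
static double drift_seconds_per_day(double ppm)
{
    return ppm * 1e-6 * 86400.0; /* 86400 seconds in a day */
}
/* e.g. drift_seconds_per_day(66.0) is about 5.7 s/day */
```

So a 66 ppm offset works out to roughly 5.7 seconds of drift per day, which is clearly noticeable on an offline clock.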

The question is: how do we access this register on the Photon, and how do we write to it?

Thank you,


Hi @dfarny that sounds like a nice feature to have. Would you be willing to do some research (see if there are any app notes on the subject) and file a Github issue/proposal here?

There are probably some ways to get the job done without an official feature as well. You can typically write to ST registers directly if you know the symbol or address.


Hi @BDub I think there is no need for official support for this feature. I will try to access the register directly. The register name is RTC_CALIBR; it is a 32-bit register at offset 0x18.

Sounds good @dfarny, and it might be nice to make one anyway if it makes the process easier to find. I’m sure we’d all love to hear what calibration process you come up with, so when you figure it out, please share. Thanks!

@BDub great - once I have some results, I will post them here. How can I manipulate registers directly on the Photon? Thanks!

Usually you can grep the source code for things you’re interested in, and try to search for register names used in the STM32F205 datasheet and programming manual.

Then see where they are used and how they are used and how they might be labelled differently. Try to use existing functions as much as possible.

Looks like you might be able to use RTC_CoarseCalibConfig(uint32_t RTC_CalibSign, uint32_t Value)


The function BDub refers to is part of the STM32F2xx standard peripheral library, which you can call from regular user firmware. The documentation is here:


I can confirm that this function works - the time drift is now around +2 seconds/week instead of 2 seconds per day. I used 63 ppm as the highest possible value (according to the datasheet) and it is still not enough - I will tinker with this more…

RTC_CoarseCalibCmd(ENABLE); //enable the calibration feature
RTC_CoarseCalibConfig(RTC_CalibSign_Negative, 63); //negative for slowing down, 63ppm

How does this calibration work? I suppose you have to calibrate against something? I suppose temperature can have an influence; are there any other factors?

I hope you can give some insight in the calibration process.

Unfortunately, I was not able to slow down the clock any further using the functions mentioned above. Even with the highest value (63 ppm) it is still 2 s/week “faster”. Increasing the value has no further effect.
In the datasheet, I noticed two calibration methods for the STM32 MCU - “coarse calibration” and “fine calibration”. I don’t know how the RTC_CoarseCalibConfig function is implemented; maybe there is still an option to slow down the clock even more.

As for the calibration itself - my plan was to implement a function which would run in the background for a couple of days, compare the internal clock with the time from the cloud, and adjust the calibration value according to the result. It is necessary to run the process for days (at least) to get a reasonable time window for comparison. The time synced from the cloud is limited to 1 s resolution - too coarse. And you never know whether the actual time was synced at the beginning or the end of a given second - up to 1 s of offset.
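The comparison step of that plan is just arithmetic. A sketch in plain C (the rtc_*/cloud_* values are hypothetical inputs, not a Particle API - in practice they would come from the device clock and from a cloud time sync):

```c
#include <stdint.h>

/* Estimate the RTC's frequency error from two time syncs.
 * A positive result means the RTC runs fast. Because cloud time
 * has ~1 s resolution (with an unknown sub-second sync phase),
 * the window must be long: over 7 days a 1 s error contributes
 * only ~1.6 ppm of uncertainty, so a multi-day run is needed. */
static double drift_ppm(int64_t rtc_start, int64_t cloud_start,
                        int64_t rtc_end, int64_t cloud_end)
{
    int64_t elapsed = cloud_end - cloud_start;         /* true seconds */
    int64_t drift   = (rtc_end - rtc_start) - elapsed; /* RTC gain     */
    return (double)drift / (double)elapsed * 1e6;
}
```

For example, gaining 2 s over one week (604800 s) works out to about 3.3 ppm, which is already smaller than the coarse calibration step size.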

Our application will run at room temperature all the time, so I don’t expect significant drift caused by temperature.

If you find a solution for this, please let us know. Thank you!