The latest MQTT-TLS uses mbedTLS 2.7.5; as you pointed out, it's easy to customize :)
@hirotakaster I just checked your mbedtls includes in the mqtt-tls lib on GitHub to see how it differs from what I am using. I noticed that in config.h the defines are missing (among other details). The mbedtls versioning is a little unclear to me; on the site it is referred to as 2.13, 2.7.6, and 2.1.15 (I assume there are individual parts that are versioned), so I just checked your library for a date, and the top comment states "Copyright © 2006-2015, ARM Limited, All Rights Reserved", whereas the 2.12 lib I am using lists 2006-2018.
My point is mainly that with a library version that supports the defines I listed above, one can lower the RAM footprint, which in our small Particle world is a big deal. Perhaps you did this in another way and I missed it?
I'm curious as to where your RAM footprints are landing with this client. I've used the WolfSSL TLS 1.2 client in past Particle iterations for AWS IoT endpoints and wondered how this stacks up in comparison?
OUT_CONTENT_LEN and IN_CONTENT_LEN can be used on the 2.13.x version.
On 2.7.x you could change MBEDTLS_SSL_MAX_CONTENT_LEN instead.
I can't decide which mbedTLS version would be good for this library and its developers among the released 2.13.x/2.7.x/2.1.x lines.
The 2.8.x-2.13.x releases carry the latest features (IN/OUT_CONTENT_LEN, etc.), while 2.7.x/2.1.x became maintenance release versions this year.
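A minimal config.h sketch of the difference between the two lines (the define names are from the mbedTLS documentation; the byte values here are illustrative assumptions, not tested recommendations):

```c
/* mbedTLS 2.13.x: input and output buffers can be sized independently */
#define MBEDTLS_SSL_IN_CONTENT_LEN  4096
#define MBEDTLS_SSL_OUT_CONTENT_LEN 2048

/* mbedTLS 2.7.x: only a single shared maximum is available */
#define MBEDTLS_SSL_MAX_CONTENT_LEN 4096
```

The independent in/out sizes are what make the newer line attractive on RAM-constrained devices, since MQTT traffic is often asymmetric.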
Here are the mbedTLS release notes:
Mbed TLS 2.7.6 and 2.1.15 are maintenance releases, and intentionally do not contain new features to avoid changing the library interface and allow users to change library versions easily.
So I will update the MQTT-TLS/TlsTcpClient mbedTLS version to 2.13 (or 2.14?) sometime soon.
@hirotakaster, re-reading the mbedTLS release notes and your comments, I understand now that 2.7.x and 2.1.x are previous versions of the library that have received security fixes but no feature upgrades (pretty confusing for newcomers).
So I upgraded MQTT-TLS to 2.12 a few months ago with minimal effort; it was really a small change with respect to a time() call, if I remember correctly. It gave me the ability to change the buffer sizes, which was important in my app, which uses lots of RAM for data analysis. So my input/output buffer sizes are 4K/2K, which works well with AWS IoT. I also tested pretty much all options in the config.h files to find the minimal set needed to connect to AWS.
Still tricky is the startup of the library, during which it consumes much more memory and depletes the stack rather rapidly. But I am not going to tinker with mbedTLS itself outside the config.h file (though my fingers are itching to do it!). It will be good to see you upgrade to 2.13; thanks again for the nice MQTT library!
I can second this; I had to move a lot of static variables to **const** to reduce the memory footprint enough to get a connection/start, e.g.:

```cpp
const int syncTimeRate = 60*60;
```
Currently this allows it to boot, and System.freeMemory() returns a value of 48832.
I was finding that anything lower than 35-40k was causing instability.
I'm also using a lot of Strings (@ScruffR would say that is a no-no and that I should use chars, but I find it difficult to swap between chars and Strings when dealing with cloud return variables). I think the memory footprint changes quite a lot as the code goes through different functions/timed events.
You can almost always work with `char*` and never touch `String`.
If you have a specific example, I may be able to show you how.
```cpp
// docs for Particle.function() state the callback needs to look like
int someFn(String args);
// but this works just the same
int someFn(const char* args);
```
So I was able to reduce the RAM footprint of the library by 12KB with two config changes. The first one is a no-brainer if you have the space in ROM, as you lose absolutely nothing.
With default config:
Free RAM 16024
With MBEDTLS_AES_ROM_TABLES enabled (this moves an 8KB table from RAM to ROM; it pushed my code to the ROM limit, but I was able to remove some old code and get it to work):
Free RAM 24128
With MBEDTLS_SSL_MAX_CONTENT_LEN reduced to 6144 (and MBEDTLS_SSL_MAX_FRAGMENT_LENGTH enabled, as it is per default; 4096 couldn't connect for me): the handshake seems to work fine, and I was able to send a ~2k payload (my max length) over the connection with no issues. Keep in mind that this can be server dependent, so it may not apply outside of AWS IoT, and it may eventually stop being sufficient if they change something on the server side.
Free RAM 28248
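For reference, the two changes above expressed as config.h lines (values taken from this post; as noted, the 6144 content length is server dependent and was only verified here against AWS IoT):

```c
/* Move the 8KB AES tables from RAM into ROM/flash */
#define MBEDTLS_AES_ROM_TABLES

/* Shrink the TLS I/O buffers; 4096 failed to connect against AWS IoT,
 * 6144 handshakes fine (MBEDTLS_SSL_MAX_FRAGMENT_LENGTH stays enabled) */
#define MBEDTLS_SSL_MAX_CONTENT_LEN 6144
```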
Edit: With these changes, the total approximate observed continuous RAM impact of the library is 24KB instead of 28.5KB (though it still requires an extra 7KB dynamically during the handshake, per my previous post, so a minimum of ~30KB instead of 35KB of free RAM is needed for a handshake).
Edit 2: Here's a post with some deeper optimization possibilities, though these may be risky on AWS IoT since we don't really control the server here. Someone with more knowledge of TLS/SSL may be able to use these to optimize further: https://tls.mbed.org/kb/how-to/reduce-mbedtls-memory-and-storage-footprint