Is anybody aware of any lossless compression algorithms in the Particle Library ecosystem? I've been recommended LZ4, but it doesn't look like that is available.
What are you trying to compress?
How large is the data? And if a dictionary compression, how often does the dictionary reset?
Does it need to be compressed or decompressed on other platforms?
I am trying to compress about 17 KB of data that mostly represents floating-point numbers; these are readings from a medical EEG device. It will need to be decompressed on the iPhone that receives this data. I am currently sending the uncompressed data over BLE, but I would like to improve the transmission speed. The data is also stored on an SD card, so compressing it before writing and decompressing it after reading would reduce the time I spend saving to and loading from the card.
If no one has a better solution, I have a port of zlib for Particle that's almost done. It works; it just needs to be packaged into a library with documentation. I didn't finish it since I didn't end up needing it after all.
That would probably work well for your use case, as it's available for most platforms including iOS.
That would be awesome @rickkas7
Hi @rickkas7, just curious when you think I might be able to get my hands on your compression library. No rush, just doing some engineering planning on my end. Thanks so much!
I found two issues.
One is that compression uses far more RAM than I expected. It works fine on the P2, Photon 2, and M-SoM, but not on earlier devices. It should be possible to decrease the memory requirements, but my attempt at that change did not work, for a reason I still need to determine.
The bigger problem is that there is a memory corruption issue (I think) somewhere. While running the test suite, occasionally the device will SOS fault. I haven't found the cause yet.
Ugh, those memory corruption issues are always difficult to find. Good luck, and thanks for trying to get this to work @rickkas7.