MicroSD and Tinker Questions

Hello there.

So I’m looking into an SD card for my project and I started here: [quote=“BDub, post:32, topic:2666”]
Come one come all… get your red hot SD Card library here: https://github.com/technobly/SparkCore-SD
[/quote]

The main reason I want to add an SD card to my project/system is to ensure data is continuously being logged. I’m sending data from the Core every 30 seconds to a server for logging, graphing, etc. If I ever lose WiFi, I want the Core to carry on without skipping a beat (within reason of course) and continue logging data, but now to a CSV file (or appropriate text file) on the SD card. THEN, when WiFi is back up, the Core reconnects, sends the data stored on the SD card to the server, and resumes logging and sending data via WiFi. So two questions (I’ll try and ask them clearly this time :wink: ):

  1. Is the above feasible? Or should I add a separate device and/or processor that handles only the offline data logging, and then hands the data off to the Spark Core when WiFi is back so it can send it over WiFi?

  2. Almost completely unrelated, but I am also interested in using the Tinker app alongside my project, with the main goal being data logging. I know Tinker uses the Spark Cloud to communicate with the Core so that it can write or read pins in near real time. Would I be able to use the Tinker app without interrupting the Core’s logging functions and its capability of sending data to my specified server? The issue I feel I would run into is that, while the Core is sending data via WiFi to my server, Tinker might be trying to communicate with the Core at the same time; I’m not sure the Core can do both at once.

Sorry for the mouthful, but I would really appreciate any insight or direction on how to tackle #1 and possibly #2. Thank you :slight_smile:

Cheers,
UST

Hi UST.

Let’s tackle each of your points:

  1. Yes and no. Because of the blocking nature of the WiFi driver on the Spark Core, when it loses its connection to the internet it will block user code until it can re-establish the connection. The Photon will fix this, but to work around it on your Core you need to do one of two things: (1) use a separate microcontroller that does the logging/sensing and then talks to the Core for remote logging, or (2) write your own non-blocking driver (I wouldn’t take this option unless you and C/C++ are best buds :slight_smile: )
  2. Yes. Tinker is open source. Simply go here: https://github.com/spark/firmware/blob/master/src/application.cpp and put the function prototypes, functions, and setup code straight into your application. Flash it to your Core and it should behave like Tinker. Remember, however, that you could accidentally interrupt your sensing code if you aren’t careful: telling Tinker to control or read a pin you are already using for sensing or communication will clash with that code. You might want to write your own “limits” into the Tinker functions to keep them from controlling the pins you need.
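For example, such a “limits” check could look something like this. This is a plain C++ sketch, not part of the real Tinker firmware: the reserved pin list and the `tinkerMayUse` name are made up, and on the Core you would call the check at the top of `tinkerDigitalWrite()` etc. and return an error code if it fails.

```cpp
#include <algorithm>
#include <array>
#include <string>

// Hypothetical list of pins reserved by your sensing/SD code.
// Adjust to whatever pins your project actually uses.
static const std::array<std::string, 4> kReservedPins = {"A2", "A3", "A4", "A5"};

// Returns true if Tinker may control the given pin,
// i.e. the pin is not on the reserved list.
bool tinkerMayUse(const std::string& pin) {
    return std::find(kReservedPins.begin(), kReservedPins.end(), pin)
           == kReservedPins.end();
}
```

In each Tinker handler you would then bail out early (e.g. return -1) when `tinkerMayUse()` is false, so the app can never disturb your logging pins.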

@harrisonhjones, depending on the sampling speed, it may be possible to use SparkIntervalTimer to create sampling interrupts that will run even with the blocking call (I think!). Again, depending on the amount of data, the data could be cached and only written out to FRAM or SD at slower intervals to keep the average ISR service time low. I would approach the logging in reverse order where data is queued to the SD by default with the cloud connection disabled. Then every 30 seconds, attempt to connect and de-queue the data via a set of head and tail pointers stored in the file or in a separate file. You get the idea. :smile:
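A rough sketch of that caching idea in plain C++ (class and method names are mine; the real timer ISR and the SD/FRAM write are stubbed out): the ISR only pushes a sample into a RAM buffer, which keeps the average ISR service time low, and the main loop flushes the batch to slower storage when it fills up.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical RAM cache between a fast sampling ISR and slow storage.
class SampleCache {
public:
    explicit SampleCache(std::size_t batchSize) : batchSize_(batchSize) {}

    // Called from the (simulated) timer ISR: just store the sample, cheap.
    void push(int sample) { buffer_.push_back(sample); }

    // Checked from loop(): is a full batch ready to write out?
    bool readyToFlush() const { return buffer_.size() >= batchSize_; }

    // Called from loop(): hand the batch to storage and clear the cache.
    // The caller would write the returned samples to SD or FRAM.
    std::vector<int> flush() {
        std::vector<int> out;
        out.swap(buffer_);
        return out;
    }

private:
    std::size_t batchSize_;
    std::vector<int> buffer_;
};
```

On the real Core you would also guard `push()`/`flush()` against concurrent ISR access (e.g. by briefly disabling the timer interrupt around the swap), which this sketch omits.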

I can’t comment on the SparkIntervalTimer (because I don’t know!) but I think your plan to store in SD by default is a good one.

Assuming you aren’t writing too much data, you could store it to external flash. There’s almost 2MB of available space there…

@harrisonhjones, I or some other member (hey @kennethlimcp!) will have to test the SparkIntervalTimer ISR servicing while the cloud connection is blocking user code. Could be an important feature to have.

Using external flash is a great idea, but the nice thing about SD is that you never have to worry about wear or purging the data. Since the largest file the library can handle is 2GB (I think), the file can grow for a long time. All you need is a head and a tail pointer for the unsent data.
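A minimal sketch of that head/tail idea in plain C++ (the struct and file name are hypothetical; on the Core you would use the SD library’s file API instead of `std::fstream`, and persist the head offset to a second file so it survives a reset): records are appended at the tail (the end of the file), and a single head offset marks the first unsent record.

```cpp
#include <fstream>
#include <string>
#include <vector>

// Hypothetical file-backed queue: append-only data file plus a head offset.
struct FileQueue {
    std::string dataPath;     // e.g. the CSV log file on the SD card
    std::streampos head = 0;  // offset of the first unsent line

    // Tail side: append a record to the end of the file.
    void enqueue(const std::string& line) {
        std::ofstream out(dataPath, std::ios::app);
        out << line << '\n';
    }

    // Head side: read every line from the head offset to the end.
    std::vector<std::string> unsent() {
        std::vector<std::string> lines;
        std::ifstream in(dataPath);
        in.seekg(head);
        std::string line;
        while (std::getline(in, line)) lines.push_back(line);
        return lines;
    }

    // After a successful upload, advance the head past everything sent.
    // The data itself stays on disk; nothing is ever purged.
    void markSent() {
        std::ifstream in(dataPath, std::ios::ate);
        head = in.tellg();
    }
};
```

When WiFi comes back you would call `unsent()`, push the lines to the server, and only call `markSent()` once the upload succeeds, so a failed send leaves the queue untouched.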


Hey @harrisonhjones and @peekay123 , sorry for the late reply - it’s been a busy day!

@harrisonhjones so, long story short, I need to be always connected to the internet in order to have my user code running … which could be a big issue for my project :frowning: … Hmm, I haven’t done C in a while, but if that’s what option 2 takes then maybe it’s necessary. Until the Photon arrives (I know I will be upgrading to it regardless) I will look into option 1. Appreciate the detailed answers, thank you. But in #2 you said Tinker is open source; does this mean it is independent of the Spark Cloud?

@peekay123 thanks for your answers. So essentially, SparkIntervalTimer might be a workaround for logging in the scenario I described; otherwise I’ll need a separate processor to do my logging and only call on the Spark to send via WiFi, until the Photon comes OR I find a non-blocking driver solution as @harrisonhjones described :) …

Tinker is independent in the sense that the Tinker firmware is open source and interaction with it happens simply through API calls to the cloud. If you run your own “local” cloud, you can still use Tinker.
