Hi peekay
I managed to find a way to contact Robert Tidey to ask for suggestions, and he responded immediately. I have not had a chance to check his recommendations as I am working, but I will have a proper look tomorrow.
Robert Tidey’s GitHub
Regards
@bevangg, if you can forward me his message, I can take a look as well. Sorry I don’t have one of these to test 
Hi peekay
Here is a link to his response:
[Robert Tidey’s response on Github][1]
I have had a good look at what he has highlighted. It seems to me that pin variables are being declared in both the .ino and the .cpp files, as are some of the timing variables. These could potentially conflict or cause confusion, and I can see no reason for them to be declared in both places.
Also I wonder if the problem could be with the pulse durations in ticks declared in lwtx.cpp at line 43 as they are time dependent:
```cpp
// These set the pulse durations in ticks
static byte tx_low_count = 7; // total number of ticks in a low (980 uSec)
static byte tx_high_count = 4; // total number of ticks in a high (560 uSec)
static byte tx_trail_count = 2; //tick count to set line low (280 uSec)
// Use with low repeat counts
static byte tx_gap_count = 72; // Inter-message gap count (10.8 msec)
//Gap multiplier byte is used to multiply gap if longer periods are needed for experimentation
//If gap is 255 (35msec) then this to give a max of 9 seconds
//Used with low repeat counts to find if device times out
```
Also I noticed that in the .ino setup the 140 uSec tick count is declared for both the Spark and the Arduino:
```cpp
#ifdef SPARK_CORE
//Transmit on pin D2, 10 repeats,no invert, 140uSec tick)
lwtx_setup(D3, 10, 0, 140);
#else
//Transmit on pin 7, 10 repeats,no invert, 140uSec tick)
lwtx_setup(7, 10, 0, 140);
#endif
```
The line that Robert has pointed out also seems to have the 140 value:
```cpp
#ifndef SPARK_CORE
clock = uSecT; //(-1 ??)
} else {
//default 140 uSec
clock = 140;
}
#else
clock = (uSecT / 4) - 1;
} else {
//default 140 uSec
clock = 34;
}
#endif
```
I really am purely pointing out observations, I don't know enough yet to fully understand the whole code.
Regards
[1]: https://github.com/roberttidey/LightwaveRF/issues/1#issuecomment-44592393
@bevangg, I believe Robert caught the problem!! He was right about the #ifndef at line 242. I must have seen that line a hundred times and missed it! I fixed it in the GitHub repo.
As for the SPARK_CORE “version”, the IntervalTimer library takes the “actual” time instead of the register calculation used on the Arduino. Both the Arduino and the Spark use a 140 uSec default tick time.
Give the fix a shot and see what happens. You will most likely find that the Arduino version now works as well. 
Holy Cow peekay
It looks like that was it. I am trying not to get too excited without testing it for a day or two, but it really does seem to be responding correctly now. This is incredible; I am so pleased to see it work. I will try it on the Arduino ATmega tomorrow and also see how it fares with the cheaper radios on both platforms. The range at the moment is extremely good. I will post results as they are produced. I will also produce a Fritzing of the Spark circuit and full details of the project, and post it wherever project examples are posted once I am 100% sure that it is working.
The Spark running your modded code is now in charge of the PIRs, magnetic sensors and lights on the stairs, doors etc.; we will see how it fares overnight and in the morning. It has run for over an hour so far with no issues, so it really does look like mission accomplished. It is noticeably more responsive than the ATmega: lights come on almost instantly, which is exactly why I started going down the ARM 32-bit path and then chanced on the Spark.
There is one last thing that could be done as a final nice touch: modify the library to also detect and configure for the Due. There must be a few people out there who would find that very useful. I presume that your IntervalTimer library would be applicable to the Due and that just a few pin modifications would be required, but I know nothing!
I will post tomorrow afternoon with final conclusions on the current revision, which is still working fine as I type.
Best regards and thanks loads for the input / help
@bevangg, I am so glad to hear that! You have to give Mr. Tidey some major kudos for catching that little error.
The Due has a set of timers much like the Uno. Adapting the libraries will require a bit more work to access those timers. I have to do a bit of research on this. 
For the Due the hardware timer library [here][1] is probably a good match.
The RX pin-change interrupts should be virtually identical, with a wider range of pins that can be chosen, like the Spark.
[1]: https://github.com/ivanseidel/DueTimer
If the base library is expanding to be potentially used in 3 environments, it may be better to clean up the differences along the following lines:

- Have a separate EEPROM #define that just controls that functionality in the library.
- Move all the timer setup/control logic into an external timer library which then accounts for the differences. For example, on the TX side it would just need a setup attachInterrupt(isr, delay), plus start() and stop() methods. The timer library would then translate these into their equivalents for the different environments. The RX side is already pretty similar. If the EEPROM is handled the same way, then the only difference left to account for is the pin restriction and mapping on the mega.
@bobtidey, I saw that library also! I totally agree with your approach. Overloading a library with conditional compiling can really confuse things. If this library is going to be used a lot, I would make the extra library. If this is a one-off then that’s different. To me, it’s all about the effort. At this point, I am focusing on porting libraries for the Spark. 
Update
Hi peekay
All tested now. Results as follows:
The ATmega works with both radio types, but with somewhat limited range on the cheaper radios.
The Spark Core behaves exactly the same.
So you were right: it does work with both variants, and the only differentiating factor is the quality of the radios. I had to comment out the application.h reference in both the lwrx and lwtx header files for it to compile for the ATmega, but it then works fine.
So yes, confirmation that your great work has come to a positive result, thank you, and thanks must also go to Robert Tidey for his original library and his input in resolving this. I hope that he will get further involved in the Spark community, as he clearly knows what he is talking about and has a great attitude, rather like yourself.
As you have pointed out, you are more interested in porting for the Spark than the Due; I had no idea it would be so difficult considering they both share the same CPU core. I agree that our time would be better spent concentrating on the Spark. The Due has nothing like the potential of the Spark.
I am working on a project to post on the Spark community with a practical application of all this stuff. So far I have created the Fritzing layouts for the Spark master and ATmega slave transceiver circuits, but I can't see any point putting it up until I get a web interface working with the Spark. As I am under great pressure at work right now this might not happen for a while, but I will now be working in my own domain, i.e. server/client, REST, AJAX etc.
This has been a massive learning experience and I now know how little I understand about programming and am determined to learn more about it!
As far as porting the library goes, it is done and working, so would you allow me to send you those bits now? I would really like you to be able to try it out yourself. I should have all the parts in my arsenal ready to pack and send by the end of next week.
I’m starting to do a small bit of rationalisation of the library to handle these variants in a more maintainable way. Basically I’m just moving the timer setup, start and stop routines out to a separate module which has separate #ifdef-controlled variants of these routines. The mega368 code is directly in there. For the Spark it passes through to the SparkIntervalTimer library. I'll probably do the same for the Due with that other library. The EEPROM stuff is then separately, optionally included.
I’m already set up to compile for the 368 and the Due, but I don’t have a setup for the Spark as yet, so I’ll have to check into how that is done. I won’t be able to physically test on a Spark either. The first step will be to make the reorganised code run OK on the 368.
@bevangg and @bobtidey, I am extremely happy that all is working well. One of the products of the last Spark Sprint was an EEPROM emulator for the Core's external flash. It is possible to adapt the LightwaveRF library to use this emulation if you feel that functionality is important. Any thoughts before I publish the library on the web IDE?
I have pushed a restructured version of the libraries onto Github. In the end to keep it simple I separated out the timer interrupt logic into 3 routines, setup, start and stop and put these into the main LwTx code. This makes it much easier to put in different versions of these and control them in one place with #defines. The code has the native AVR code built in and has versions for the Spark Core and the Due which call out to the SparkIntervalTimer library or the DueTimer library. The document is updated to reflect this. Similarly EEPROM support can be turned off more easily from the header file or a compile variable.
I have so far only fully compiled and tested the AVR version. The Spark Core should behave the same way as before, but I haven’t tested that and it is always possible compile issues crept in. Similarly for the Due, but I should be able to have a go at trying that one.
On the subject of EEPROM support for the Spark Core, that may be nice depending on usage. I saw the main need for this on the RX side, where dedicated devices could store their pairing data without any communication other than the incoming RF, and therefore be equivalent to a native LightwaveRF device. If the device has separate comms for configuring then there is less need for this, i.e. if the Spark Core is being used for TX from wifi instructions then the EEPROM is maybe less important.
Robert, your code is a work of art. It is so much more understandable now. I will try it out and post results.
Regards
@bobtidey, during the last Spark Sprint, EEPROM emulation was added to use 100 bytes of onboard flash as “EEPROM”. This will make the EEPROM parts of the library fully functional with the Spark! The documentation is here.
BTW, nice work on the cleanup. I will bring those changes back into the Spark library and once tested and completed, I will post it on the Spark web IDE. 
@bobtidey, I found some missing SPARK_CORE tests around the EEPROM.h inclusion statements. The Core does not use the include as the EEPROM functionality is built in. Did you want me to do a pull request?
That is interesting. Robert's work includes the EEPROM, but I have worked around it, purely to achieve my aims, by hard-coding the commands into the sketch. I have at least 50 commands set as constants and suspect this is not an ideal way to deal with the situation. It also limits the use of dimming and mood commands. It really is a very bad approach, but it works, if not ideally.
It seems that using EEPROM or equivalent could be worth understanding and implementing.
As an aside, my first Spark Core is now sending text-to-speech notifications and Gmail messages via pushingbox.com when a door sensor, laser or PIR is triggered and the alarm state is true. It is incredibly easy to set up. Perhaps I should do a tutorial on it, as it is a great way to get messages from the Spark Core without a lot of work, for somebody who wants a simple and cheap alarm system with mobile and email notification.
@bevangg, I have incorporated Bob’s changes in my GitHub library with some minor fixes. It includes the EEPROM code for storing pairing information in “emulated eeprom” on the Spark. Can you give it a shot to see if it works for you?
Hi peekay
That is great and I can’t wait to have a look. I have hit a brick wall now and am exhausted; it has been totally full-on with work this week and all this happening as well! I think the excitement has done me over. In fact I have had a couple of beers to celebrate the events of this week. You have been a great help, and now we seem to have Bob on the Spark scene. I will try to persuade Bob to let me send him my last spare Spark Core, as he really has suddenly become a great contributor to all this.
When you both have the hardware to test this can only move forwards rapidly.
Regards
Yes, please send where the EEPROM is missed. It is a little difficult to weed them out as I have the EEPROM library, so it always finds the references.
I am looking at setting up the environment to compile for Spark but as I’m new to that device I am still researching it.
@bobtidey, it is not where the EEPROM.h is missed so much as it should be excluded for the Spark. Both LwTx.h and LwRx.h need these minor mods:
```cpp
//Include EEPROM if required to support storing device parameters in EEPROM
#if EEPROM_EN
#ifndef SPARK_CORE
#include <../EEPROM/EEPROM.h>
#endif
//define EEPROMaddr as the location to store the message addr
#define EEPROMaddr 0
#endif
```