@ScruffR, hey, just an update - little steps have been made (and it took ages):
Spark dev Installed (Tick)
simple flash LED code copied into a new .ino TAB (Tick)
Compile .ino file in the cloud, BIN created, core updated (Tick)
LED flashing (Tick)
I have come to learn that (in a small way) happiness is a flashing LED. Is that wrong???
OK, so I guess that to add .cpp and .h files, I would put them into different tabs in the Spark Dev screen and then compile away?
Can you enlarge on -
At the moment it's just a bit of a pain that you have to put all the required files for compilation into one directory and have to alter the #include statements accordingly.
Hi @Julian, good to hear you're getting things working.
For building a larger project you'd create one folder into which you'd copy all required .cpp, .h and .ino files (but don't put other files into it).
Then you'd open the "File" menu, find "Open Folder …" in there, navigate to your project folder and select it.
Then all the files in there will be displayed in a sidebar and you can build away.
The thing with the includes is that libs usually contain something like #include "../somelib/somelib.h", which you'd need to change into #include "somelib.h".
I usually put the original version and the altered one next to each other, something like this
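For illustration, a sketch of that pattern - somelib is just a placeholder name, as in the sentence above, and the exact original path depends on the library repo's layout:

```cpp
// As shipped in the library repo (path assumes the repo's own folder layout):
//#include "../somelib/somelib.h"
// Altered for a flat Spark Dev project folder (original kept as a comment):
#include "somelib.h"
```

Keeping the original line commented out makes it easy to diff against the upstream library later.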
@ScruffR, @Peekay123 Thank you for your explanation. I think I understand. To test that, I am going to try a NeoPixel library example first. The Spark Dev environment seems like it could be halfway between CLI and IDE.
When you add an #include statement to your code, does it then expose the abilities of the library you are invoking, or do you have to add the library yourself in another tab?
So, for example, if I write #include "neopixel.h", do I then have to add a tab, call it neopixel.h and add some code from somewhere? Or is it pulled in just by the #include statement?
If you are talking about Spark Dev, you just do as written in my post above - especially this
No more, no less. As for libraries, it means ALL the .h/.cpp files which belong to the lib.
For terminology:
Instead of calling it "adding a tab", you might rather say "adding a file to the project (folder)" - this might make it clearer what actually needs to be done.
You can add a file from inside Spark Dev (via File ... New File) or just by copying it into the folder - it doesn't matter how.
@ScruffR, So I have it set out in a project folder with files (tabs) inside it. I have changed the #include statements to show no path, just neopixel.h for example.
I am getting an error saying the app code is invalid. I think I need to put in the application.h file as it is referenced in some of the other code.
Where can I get the application.h code from to add to a new file in my folder?
These three are libraries that have to be added to your project folder (gamma is part of the Adafruit_mfGFX lib by @peekay123, and SparkIntervalTimer is a separate one by the same awesome guy).
Usually you'll find an indication of which other libraries are needed in the repo of the individual library.
A good starting point for available libraries is the Web IDE library inventory.
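To tie this back to the earlier question: once all those library files sit flat in the project folder, the top of the .ino might look something like the following. This is only a sketch - the exact header filenames depend on the individual library repos:

```cpp
#include "application.h"        // Spark firmware API (pin functions, delay, etc.)
#include "Adafruit_mfGFX.h"     // graphics primitives (gamma is part of this lib)
#include "SparkIntervalTimer.h" // hardware-timer library by @peekay123
#include "RGBmatrixPanel.h"     // the matrix panel driver itself
```

Note that every .h listed here also needs its matching .cpp (and any companion files) copied into the same folder, per the "ALL .h/.cpp files which belong to the lib" rule above.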
As said already - one 32x32 panel with double buffering, or two 32x32 panels without, is no problem.
Or four 32x16 panels would give you a 128x16 scrolling text display.
Given the experience I've had with my two panels, the video cube should be possible with no real tweaking, since it seems to show the same picture on all six faces: you provide one picture from the Core and it gets forwarded through all the daisy-chained panels. You just need to supply enough power (5V x 2A = 10W per panel, so 60W for all six).
This at least worked for two panels that way.
The video wall is a different kettle of fish, since you need to provide a different picture for each panel, and for that the Core does not have enough RAM.
The Photon will allow more panels, but then speed might become an issue.
If you look at the hardware used for the video wall, you'll find some more computational power.
@ScruffR, the key word here is "video". Both the video cube and the wall take a video feed (DVI, DisplayPort, etc.) to display. Neither the Core nor the Photon could do that. There is some really nice high-speed matrix panel stuff for the Teensy 3 product that uses DMA for high-speed refresh. The biggest issue with the Core, and possibly the Photon, is that none of the GPIO are mapped to a single port, making it difficult for DMA to be used. I would love to optimize the RGBMatrixPanel even more to handle more panels and higher refresh rates!
But what does one consider video? Any moving picture (e.g. the Teensy demo seems to be an animated GIF), or does it have to come from a standard video feed?
If it's animations rather than video, the video-cube demos may also be possible on the Core - sure, proper video would not work.
And as mentioned in one of my earlier posts, I have squeezed a teeny-weeny bit of speed out of your RGBmatrixPanel lib by remapping the R1, G1, B1, R2, G2, B2 pins to the low byte of GPIOA.
I have not timed it yet, tho'.
And I havenât gone the ASM route yet either.
@ScruffR, video is video, dude! I think you are talking about animation. Nonetheless, your point is valid. BTW, I posted the optimized RGB pin library last week. I find that it adds a 30-50% improvement in frame rate!
What I would love to see is FastLed for the Core/Photon, where you could do some serious animation on "smart" LEDs like neopixels. I posted a message on the author's GitHub to see if the Spark community could help move the port along.