Looping inside the loop

Hi Everyone,
Good day. I’m just a newbie with the Spark Core. I have some queries regarding looping… I want to loop the digitalWrite calls… I hope you can help me figure this out :smile: It seems that none of the LEDs I initialized light up.

int led1 = A5;
int led2 = A4;
int led3 = A3;
int led4 = A0;
int led5 = A1;
int led6 = A6;
char led[5];    
void setup() 
{
    // ***init*** //  
  pinMode(led1, OUTPUT);
  pinMode(led2, OUTPUT);
  pinMode(led3, OUTPUT);
  pinMode(led4, OUTPUT);
  pinMode(led5, OUTPUT);
  pinMode(led6, OUTPUT);
}
void loop() {

  for (int e = 1; e < 7; e++) {
    digitalWrite(sprintf(led, "led%d\n", e), HIGH);
    delay(100);
    digitalWrite(sprintf(led, "led%d\n", e), LOW);
  }
}

You don't need to use sprintf - that just formats text into a character buffer and returns the number of characters written, which is what actually gets passed to digitalWrite here. digitalWrite is expecting simply a pin number.

You can do it like this:

    int leds[] = { A5,A4,A3, A0, A1, A6 };
    int ledCount = arraySize(leds);
    void setup() {
       for (int i=0; i<ledCount; i++) {
        pinMode(leds[i], OUTPUT);
       }
    }
    
    void loop() {
       for (int i=0; i<ledCount; i++) {
          digitalWrite(leds[i], HIGH);
       }
       delay(100);
       for (int i=0; i<ledCount; i++) {
          digitalWrite(leds[i], LOW);
       }
       delay(100);
    }

Also notice that with arrays, the first value is at index 0. If you asked anyone else to start counting they would start "1, 2...", so it's quite strange that we programmers count from 0, but that's how it is! :slight_smile:
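
For example, with the leds[] array above, the first entry (A5) lives at index 0 and the last (A6) at index ledCount - 1, which is why the for loops run from 0 up to (but not including) ledCount:

    digitalWrite(leds[0], HIGH);            // first LED in the array (A5)
    digitalWrite(leds[ledCount - 1], HIGH); // last LED in the array (A6)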


I tried to compile your code @mdma and there are a bunch of typos that prevented me from doing so… the extra ) in the for loop was kicking my butt for a little bit.

int leds[] = { A5, A4, A3, A0, A1, A6 };
int ledCount = sizeof(leds)/sizeof(leds[0]); 

void setup() {
  for (int i=0; i<ledCount; i++) {
    pinMode(leds[i], OUTPUT);
  }
}

void loop() {
  for (int i=0; i<ledCount; i++) {
    digitalWrite(leds[i], HIGH);
  }
  delay(100);
  
  for (int i=0; i<ledCount; i++) {
    digitalWrite(leds[i], LOW);
  }
  delay(100);
}

I changed the second line as well… I think it might be more intuitive for new users. It probably needs an explanation like "the total number of bytes used by the array, divided by the number of bytes used by each element, equals the number of array elements."
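
For example, assuming 4-byte ints (as on the Core's 32-bit STM32), the arithmetic works out like this:

int leds[] = { A5, A4, A3, A0, A1, A6 };
// sizeof(leds)    == 24   -> 6 elements x 4 bytes each (assuming a 4-byte int)
// sizeof(leds[0]) ==  4   -> bytes in a single element
// 24 / 4          ==  6   -> so ledCount ends up as the number of elements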

It would be great to just overload the Array type with .length() though!

I wrote the code while in between things at work. Writing code in the Discourse editor is not the most pleasant of tasks… The errors were an extra ‘)’ in the first for loop and a missing ‘void’ on the setup() and loop() functions. Thanks for the heads-up, I’ve fixed my original post.

If you want a method like .length() then you’ll need to use a collection type from the STL. That will hide some of the nastiness of raw arrays, but then you have the learning curve of STL iterators, allocators and all that fun.
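
For instance, std::array carries its own size as a .size() method. A minimal sketch, assuming the toolchain gives you C++11 and the <array> header:

#include <array>

std::array<int, 6> leds = { A5, A4, A3, A0, A1, A6 };

void setup() {
    for (size_t i = 0; i < leds.size(); i++) {   // .size() replaces the macro
        pinMode(leds[i], OUTPUT);
    }
}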

Some people define a macro to do it for statically allocated arrays:

#define ARRAY_SIZE(x) \
   (sizeof(x) / sizeof((x)[0]))

But the danger for the newcomer is that this will also compile when passed a pointer, but give the wrong result.
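
For example (a sketch, assuming 4-byte ints and 4-byte pointers on the Core), the macro quietly measures the pointer instead of the array:

int leds[] = { A5, A4, A3, A0, A1, A6 };
int *p = leds;               // the array decays to a pointer

int n1 = ARRAY_SIZE(leds);   // 6, as expected
int n2 = ARRAY_SIZE(p);      // sizeof(int*) / sizeof(int) == 1 here, not 6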

Also, digitalWrite(led[i]... is an issue; fix that and then it should compile :wink:

BTW, I always recommend throwing snippets into the Web IDE to make sure they compile. It's easy enough.

I like the macro idea... perhaps we should add the one for arrays to the core-firmware. That's a pretty common macro naming convention. I realize it won't work with pointers, but typically you can get around that if you know what your pointer is pointing to. If not, then you have to get creative with special terminating values in the array or use typedefs.

We’re in luck. The arraySize macro is already there, in spark_macros.h, I’ve updated the example. (I did put it through the compiler after you highlighted there were problems, so not sure why there were still errors after the previous update.)

Bah, I looked everywhere but there! There are macros all over the place too.

It kind of bugs me that it's camelCase instead of all caps. Perhaps we should suggest changing it to ARRAY_SIZE...

Meanwhile, I updated the Docs to add this useful reference.

I can recommend "Find in Files" in NetBeans - it works well and is quicker than searching manually.

BTW, nice that you updated the docs too. The macro is in core-common-lib, so it might not be clear to future maintainers that it's part of the Spark's public API. Maybe it's best to define a new macro in spark_wiring.h in the core-firmware repo; that way you get your rename to uppercase and it's then clearly part of the public API.
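
Something along these lines, say (purely illustrative; the name and location are just the suggestion above):

// In spark_wiring.h (hypothetical): an uppercase alias that's clearly part of the public API
#define ARRAY_SIZE(a) arraySize(a)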

Well, all this took far more time than I expected from just writing down a few lines of code to help a fellow spark buddy, but good that something positive has come from it!

Welcome to my world :slight_smile: Isn't it fun?

I've been meaning to get NetBeans installed.... gotta do that!

@mdma thanks for the response… :slight_smile: Big Help… Cheers :wink:
