Spark Door access control system

hi @Dragonsshout!

I finally decided to buy the same RFID reader you have, so I can at least avoid “compatibility” issues. I couldn’t get my current reader to work :frowning:

While it’s on its way, I just wanted to ask: do you need a separate 5V source for the reader, or did you just connect it directly to the Spark? Is there any other “addon” I would need to connect it to the Spark properly?

Hope this one won’t trick me and will be straightforward.

Kind regards,
Noten

Hi @Noten!
Ok, that’s bad…

Actually, I’m powering the Spark Core through the Spark Relay Shield, which supplies 5V to the VIN pin of the Spark Core, and I’ve connected the RFID reader’s 5V line to the same pin.

You will also need to make a voltage divider, as explained in my post above. All the wiring is explained there.

Kind regards,

DSS

hey!

Cool, thanks for the info.

Do you think a single USB cable would do to power both? (Connect the mini USB to the Core, and the reader to the VIN pin.)

That’s all the cabling I will have in the garage, so I really hope it will work like that :slight_smile:

You’re welcome!

Yes, I think it should work :wink:

Hi

I have a problem with your code: it doesn’t seem to be reading my tags correctly.
I have the exact same reader as the one in your picture, the RDM6300.

I tried modifying the code to see what happens, and shortened it so that just the RFID number is printed over serial, viewed in Tera Term for debugging.
But all I get is a lot of 2 2 2 2 2 2.
So I tested the reader on an Arduino to be sure it worked correctly.

First, my code just to verify the RFID read via an Arduino Uno:

#include <SoftwareSerial.h>
SoftwareSerial RFID(2, 3); // RX and TX
 
int i;
 
void setup()
{
  RFID.begin(9600);    // start serial to RFID reader
  Serial.begin(9600);  // start serial to PC 
}
 
void loop()
{
  if (RFID.available() > 0) 
  {
     i = RFID.read();
     Serial.print(i, DEC);
     Serial.print(" ");
  }
}

It outputs the following:
2 48 52 48 48 52 51 52 66 56 69 56 50 3
Which is my RFID ID, and that’s correct; I tested it on a USB reader too.
So the reader works.

On the Spark Core I then use the following code:

int i;
 
void setup() {
   
    Serial1.begin(9600);
    Serial.begin(9600);  
}

void loop() {
  if (Serial1.available() > 0)
  {
    i = Serial1.read();
    Serial.print(i, DEC);
    Serial.print(" ");
  }
}

Which outputs:
2 48 56 2 48 56 2 48 66 3
Other times:
2 52 52 50 2 52 52 50 2 52 51 56 2 48 66 3 2 52 69
And:
2 48 56 3 2 48 66 3 2 52 52 50 2 48 66 3 2 52 69 2 52 52 50 2 48 56 2 48 51 56 2 48 66 50 2 48 56 2 48 51 56 2 48 66 3 2 52 69 2 52 52 56
This depends on how long I hold the tag over the reader.
So none of that gives the correct numbers; it seems to skip a couple of bytes sometimes.

So I am wondering if it’s a problem with the Spark Core IDE and the serial communication.

Have you experienced such a problem, or do you have any idea how to solve it?

Thanks Kim.

Hi @CentauriDK

I have exactly the same problem as you. At first I thought that my tag number was just a lot of 2 2 2 2 2 2…
But, after your reply, it really seems to be a bug.

I will check my code again…

Thanks for your reply!

Regards,

Dragonsshout

Hi Guys,

I think you are running into the problem where the serial port read uses polled I/O with no buffering. Maybe @Dave or @zach can comment on when they are going to address this, but my feeling is that while it was high priority, they had a bunch of other, even higher-priority issues with CFOD etc.

You can help yourself by reading Serial1 more quickly. You are currently reading one byte per call of loop(); reading until no more bytes are available works better:

while (Serial1.available() > 0) {
  char c = Serial1.read();
  Serial.print(c);
}

See the thread on the LinkSprite cameras for a bit more detail.

Hi @bko,

Thank you very much for your reply.

That’s annoying. I didn’t know that the serial port uses polled I/O.
I will try your solution and report back if it helps.

Regards,

Dragonsshout

@Dave here! :slight_smile: Mohit was indeed building a ring buffer implementation for the Serial interface, but was interrupted by CFOD work. So it’s in the pipeline, but won’t be right away.

Usually I’ll read all I can at once:

while (Serial.available()) { ... = Serial.read(); }

Thanks!
David


Ok nice! Hope it will be available soon :smiley:

Thank you, I will try your solutions!

Dragonsshout


Hi all!

It seems I had the same issue with the ID12LA RFID reader as well, and now I have tested the same code with the RDM6300: same issue.

I’m wondering whether it affects only the USB serial port. Does it read the values correctly internally all the time? (E.g. if I wanted to flash an LED on a specific card read, would that work, regardless of what it pushes to the serial port?)

At the moment I’m not sure whether this issue is related to the RX pin or to the USB serial port; can somebody please clarify?

Thanks,
Noten

Hi @Noten

The serial port buffering issue relates to the RX/TX serial port called Serial1 in the code and available on the pins shown here:

http://docs.spark.io/images/core-pin-usart.jpg

The USB port has always worked OK for me, but I have not used it much, so I really can’t say whether it is affected too.

Thank you @bko!

Unfortunately, I think this is the bigger problem :frowning:

Hope the Spark guys will fix it soon. @Dave, any ETA on this? Will it happen within weeks, or months?

Regards,
Noten

@satishgn awesomely committed a fix for this just last night (you can see the commit here: https://github.com/spark/core-firmware/commit/d6ff1b460f28ad9923c3a0e497dc846cbcd9a584 ).

That fix will be automatically included in build IDE projects when we update the “compile-server2” branch next. We try to be careful when we move those branches forward, so that will probably happen Friday, or next week. If you were building locally from master you could pull these changes and test them out now if you wanted. :smile:

Thanks,
David


hey @Dave!

Awesome! As I don’t know how to do that, I think I will just wait till Friday or next week :smile:

Are you going to publish a blog post or a message here once this “compile-server2” update is ready on your end? Also, do I need to do anything other than compile the same code and flash it to the Spark again?

Keep up the good work!

Noten

Hi @Noten,

Definitely. We’ve generally been doing soft rollouts on Fridays, watching things closely during the weekend, and then publishing a blog post on Monday about the changes. Firmware updates will automatically be included when you re-compile / flash on the build site. Just add a space or something to your code so it re-compiles and isn’t cached. :smile:

Thanks,
David

Dear @Dave!

Did you roll out the changes on Friday night?

I just checked whether the same code is OK now, but it seems everything is the same. I tried the code below, and it still shows different output on each read of the same card:

int i;

void setup() {
    Serial1.begin(9600);
    Serial.begin(9600);
}

void loop() {
    while (Serial1.available() > 0) {
        char c = Serial1.read();
        Serial.print(c);
    }
}

Hi Dragonsshout, can you post the code for your Android widget?

I’m working on a similar project but have no experience with Android development, and I want to create a similar widget. Perhaps there is already an easy-to-use widget in the Play Store for making curl-like requests, or I can learn from your code.

Thanks and keep up the good work!

Hi @beklein,

Yes I can, but I have to clean up the code first.

Thank you :blush:

Hi @Noten,

We’re still doing QA on the firmware updates, but we’re hoping to upgrade them to the stable branch early next week. You can follow along on the process here: https://github.com/spark/core-firmware/network (when compile-server2 catches up to master. :slight_smile: )

Thanks,
David