Trying to send data from the Spark to the serial port (using the TX pin)

Hey guys,
I’m trying to use the TX pin on the Spark Core to send data and display it on a terminal such as PuTTY. I’ve already done an experiment receiving data on the RX pin and sending it to the cloud, but now I want to make the communication bidirectional by also sending data through the TX pin. Any suggestions?
Thanks in advance.

Have you actually had a look at the docs?

We had quite an elaborate thread together but please do some research before posting a question.

http://docs.particle.io/photon/firmware/#communication

All the Serial.xxxx() functions refer to the RX/TX pins. (Edit ScruffR: I forgot the 1 - it’s Serial1.xxxx() - sorry :blush: )

1 Like

Dear @ScruffR
I read all of these documents, but I’m confused about the difference between Serial1.write() and Serial1.print(). If I’m reading the data from Serial1 with Serial1.read() and using Serial.write() to display the data in PuTTY, does that mean I’m using the TX pin of the USB serial internally? In that case, can I use Serial1.read() to read the incoming data and write it to PuTTY with Serial1.write()? And of course this will need a TTL-to-USB converter to see the data.
Thanks.

Just to be clear, Serial.read()/Serial.print() will read/write to the serial device via the USB bus, while Serial1.read()/Serial1.print() (note the extra 1) will read/write via the RX/TX pins.

The main difference between write and print is that write only handles single characters or char* strings, while print can also take numbers and Strings, and the println() variant appends a newline.
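To illustrate, here is a minimal sketch of my own (not from the docs) showing both calls side by side on the RX/TX UART:

void setup()
{
   Serial1.begin(9600);              // UART on the RX/TX pins
}

void loop()
{
   Serial1.write('A');               // write(): sends the single byte 0x41
   Serial1.write("raw bytes\r\n");   // write(): sends a char* string as-is

   Serial1.print(42);                // print(): converts the number to the text "42"
   Serial1.println(3.14);            // println(): prints "3.14" followed by a newline

   delay(1000);
}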

2 Likes

Dear @ScruffR
Actually, what I mean is that I’d like to receive data wirelessly through the Spark Core and forward this data to the TX pin so I can display it in PuTTY. Is there a way to do that?

Thanks @mdma for this explanation. Yes, I do understand the difference between Serial and Serial1 since it is described in the documentation, but I asked to see whether I can read the data from Serial1 and write it back to the same Serial1 for display in PuTTY. For now, I’m interested in receiving data from an external source (any digital source) and sending this data to the TX pin with Serial1.write(). How can I do that?

I’m sorry for my English mistakes; it is not my first language.

To read data from an external source, you first have to fetch the data and then write it to Serial1. For example, if the external data arrives via a Spark.function, you could write it to Serial1 like this:

int print(String value)
{
   Serial1.println(value); // write the value to Serial1
   return 0; // have to return some value
}

void setup()
{
   Serial1.begin(9600);             // open the UART on the RX/TX pins at 9600 baud
   Spark.function("print", print);  // expose print() as a cloud function named "print"
}

So when you call the print function through the cloud, it will write the value passed to the function out to PuTTY. You can call functions with the Particle CLI, like this:

particle call <devicename> print "Happy Hacking"

This sends the data to the cloud, which forwards it to your device over WiFi; the device then passes it to the print() function in your code, which writes the string out over Serial1.

For development, it’s often easier to use Serial (USB) rather than Serial1, since you can just connect your device to your computer and point PuTTY at the appropriate COM port.
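If it helps, here is a rough bridge sketch of my own (assuming stock firmware) that forwards everything between Serial (USB) and Serial1, so whatever you type in PuTTY on the USB side comes out of the TX pin, and whatever arrives on the RX pin shows up in PuTTY:

void setup()
{
   Serial.begin(9600);    // USB serial (shows up as a COM port on the PC)
   Serial1.begin(9600);   // hardware UART on the RX/TX pins
}

void loop()
{
   // forward USB -> TX pin
   while (Serial.available())
      Serial1.write(Serial.read());

   // forward RX pin -> USB
   while (Serial1.available())
      Serial.write(Serial1.read());
}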

1 Like

Thanks @mdma
I’ll try that tomorrow. Can I receive data from a device that is not a Spark Core? I mean, is there any way to fetch wireless data from something other than the Spark?

Yes, you can use TCP sockets to retrieve data from any other networked device. The web is one such class of devices (a pretty big one!). In the online IDE there is the HTTPClient library that lets you fetch web pages.
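As a rough example of the raw-socket approach (my own sketch, not part of the library, using the firmware’s built-in TCPClient class), you could fetch a page and forward it to the TX pin like this:

TCPClient client;

void setup()
{
   Serial1.begin(9600);

   if (client.connect("www.timeapi.org", 80))      // open a TCP socket to the server
   {
      client.println("GET /utc/now HTTP/1.0");     // minimal HTTP request
      client.println("Host: www.timeapi.org");
      client.println("Connection: close");
      client.println();                            // blank line ends the headers
   }
}

void loop()
{
   // forward whatever the server sends to the TX pin
   while (client.available())
      Serial1.write(client.read());

   if (!client.connected())
      client.stop();                               // tidy up once the server closes the socket
}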

1 Like

Great. I’ll read about that today, and let you know tomorrow what I ended up with.
Thanks a lot

2 Likes

Dear @mdma
I downloaded the HttpClient application to my Spark Core, and here is the data received on Serial:

HttpClient> Status Code: 403
Application> Response status: 403
Application> HTTP Response Body:

Application> Start of Loop.
HttpClient> Connecting to: www.timeapi.org:80
HttpClient> Start of HTTP Request.
GET /utc/now HTTP/1.0
Connection: close
HOST: www.timeapi.org
Content-Length: 37
Accept: */*
Hello. Here is the Spark Core Carpet1
HttpClient> End of HTTP Request.

HttpClient> Receiving TCP transaction of 128 bytes.
HTTP/1.1 403 Forbidden
Date: Wed, 15 Jul 2015 17:03:07 GMT
Connection: close
X-Frame-Options: sameorigin
X-Xss-Protection: 1; mode=block
Content-Type: text/html;charset=utf-8
Content-Length: 0
Server: thin 1.5.0 codename Knife
Via: 1.1 vegur

HttpClient> End of TCP transaction.

but sometimes it looks like this:

HttpClient> Status Code: 403
Application> Response status: 403
Application> HTTP Response Body:

Application> Start of Loop.
HttpClient> Connecting to: www.timeapi.org:80
HttpClient> Start of HTTP Request.
GET /utc/now HTTP/1.0
Connection: close
HOST: www.timeapi.org
Content-Length: 37
Accept: */*
Hello. Here is the Spark Core Carpet1
HttpClient> End of HTTP Request.

HttpClient> Receiving TCP transaction of 128 bytes.
HTTP/1.1 403 Forbidden
Date: Wed, 15 Jul 2015 17:02:47 GMT
Connection: close
X-Frame-Options: sameorigin
X-Xss-Protection: 1; mode=block
Content-Type: text/html;charset=utf-8
Content-Length: 0
Server: thin 1.5.0 codename Knife
Via: 1.1 vegur
HttpClient> End of TCP transaction.
HttpClient> Error: Timeout while reading response.
HttpClient> End of HTTP Response (5908ms).

Or this:

HttpClient> Status Code: 403
Application> Response status: 403
Application> HTTP Response Body: ntent="width=device-width, initial-scale=1">

  <style type="text/css">
                                 html, body, iframe { margin: 0; padding: 0;

height: 100%; }
iframe { display: block; width: 100%; border: none; }

Application Error

Application Error

Application> Start of Loop.
HttpClient> Connecting to: www.timeapi.org:80
HttpClient> Start of HTTP Request.
GET /utc/now HTTP/1.0
Connection: close
HOST: www.timeapi.org
Content-Length: 29
Accept: */*
Hello. Here is the Spark Core
HttpClient> End of HTTP Request.

HttpClient> Receiving TCP transaction of 128 bytes.
HTTP/1.1 403 Forbidden
Date: Wed, 15 Jul 2015 17:18:13 GMT
Connection: close
X-Frame-Options: sameorigin
X-Xss-Protection: 1; mode=block
Content-Type: text/html;charset=utf-8
Content-Length: 0
Server: thin 1.5.0 codename Knife
Via: 1.1 vegur

HttpClient> End of TCP transaction.
HttpClient> Error: Timeout while reading response.

HttpClient> End of HTTP Response (5907ms).

Do you know the reason for the timeout error and the 403 Forbidden in the output?
Thanks in advance.