Hello,
I am very new to Spark, and I have run into some trouble I could use help with.
I have flashed the following example sketch from the documentation:
#include "application.h"

// Do not connect to the Cloud
SYSTEM_MODE(MANUAL);

const int LED = D0;

#define LISTEN_PORT 6666
TCPServer server(LISTEN_PORT);
TCPClient client;

void init_serial_over_usb() {
    digitalWrite(LED, HIGH);
    Serial.begin(9600);
    // wait for a keypress on the serial terminal before continuing
    while (!Serial.available()) SPARK_WLAN_Loop();
    digitalWrite(LED, LOW);
}

void connect_to_wifi() {
    Serial.println("Connecting to WiFi...");
    WiFi.on();
    WiFi.connect();
    Serial.println("Connected.");
    Serial.println("Acquiring DHCP info:");
    while (!WiFi.ready()) SPARK_WLAN_Loop();
    Serial.print("SSID: ");    Serial.println(WiFi.SSID());
    Serial.print("IP: ");      Serial.println(WiFi.localIP());
    Serial.print("Gateway: "); Serial.println(WiFi.gatewayIP());
}

void setup() {
    pinMode(LED, OUTPUT);
    init_serial_over_usb();
    connect_to_wifi();
    // start listening for clients
    server.begin();
    Serial.print("Listening on ");
    Serial.print(WiFi.localIP());
    Serial.print(":");
    Serial.println(LISTEN_PORT);
}

void loop() {
    if (client.connected()) {
        Serial.println("Client connected!");
        // echo all available bytes back to the client
        while (client.available()) {
            server.write(client.read());
            Serial.println("ECHO");
        }
    } else {
        // if no client is yet connected, check for a new connection
        client = server.available();
        Serial.println("Waiting for connection...");
    }
    delay(2000);
}
Here’s the network info printed to the serial terminal when the Spark starts:
Connecting to WiFi...
Connected.
Acquiring DHCP info:
SSID: vulcan
IP: 192.168.7.121
Gateway: 192.168.7.1
Listening on 192.168.7.121:6666
The test program works fine if I connect to it from a device on the same LAN. So something like this works well:
nc 192.168.7.121 6666
The connection succeeds and my input is echoed back to me. So far so good.
Next, I try to perform exactly the same test, but from a device on another LAN. To achieve this, I configured the WiFi router to forward port 6666 to 192.168.7.121, which is reserved for the Spark’s MAC address. This does not work — the following simply times out:
nc 192.168.6.1 6666
To verify that the port forward itself is correct, I changed the forwarding target to another Linux box on the same network as the Spark, and confirmed that forwarding works by listening with:
nc -l 6666
Switching the forwarding target back to the Spark once again leads to timeouts.
So I can successfully port forward to a Linux box, but not to the Spark running TCPServer to listen for connections. Any idea what might be going wrong here?
I wonder if this might be related to this issue: https://community.spark.io/t/cannot-port-forward-to-spark-outside-local-network/5364
Any advice would be appreciated.
Val