[Resolved] Spark Cloud - IP range and ports for firewall

Can anyone please tell me what outbound port and IP range are being used with Spark core? I need to create firewall rules for the same.

arca1n, the IP is assigned by your local router’s DHCP server and, though this is an educated guess, ports 80 (HTTP) and 443 (HTTPS) should cover it.

Hi Peekay,

Thanks for replying. Unfortunately, what I was looking for is the IP range for the Spark Cloud (the IPs the Spark Core connects to in order to download code), along with the port number it connects on.

Thanks
Arca1n

arca1n, got it! I changed the title of the post to better reflect what you are looking for. I believe the Spark engineers will answer this better than I can. I will poke them to see if I can get an answer for you.

The IP should be 54.208.229.4, and the CoAP protocol is on port 5683.

It should be correct, but let’s see what the team says :smiley:


Thank you guys! This is a lot of help. I will verify this and mark this as resolved.


I’m not sure how the cloud is set up regarding load balancing. It could all be behind a single IP address, or it could be multiple addresses with round-robin DNS. I’d wait for someone from Spark to comment in that regard. I remember @Dave talking about their load-balancing setup the other day, but I didn’t catch all of it.

Heya!

@kennethlimcp is right, you only need to allow outgoing TCP to 54.208.229.4 on port 5683. Most firewalls don’t block outgoing TCP, so you should be fine unless you are extremely strict :smile:
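For anyone who wants to confirm from inside the network that the firewall exception actually works, a quick connectivity check like the following can help. This is only an illustrative sketch: the `can_connect` helper is hypothetical (not part of any Spark tooling), and the IP and port are the ones quoted in this thread.

```python
import socket

def can_connect(host, port, timeout=2.0):
    """Return True if an outbound TCP connection to host:port succeeds."""
    try:
        # create_connection performs the TCP handshake; if the firewall
        # blocks the outbound traffic, this raises an OSError (timeout
        # or connection refused) and we report False.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example check against the Spark Cloud endpoint from this thread
# (requires network access, so left commented out):
# print(can_connect("54.208.229.4", 5683))
```

If this returns False from behind your firewall but True from an unrestricted network, the outbound rule is the likely culprit.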

Thanks,
David

Thank you, Dave, for the quick response! The firewall at my workplace requires explicit outbound exceptions for non-standard ports.


Port 53 (DNS) is probably accepted as a standard port, but note that the Spark will connect to port 53 at IP 8.8.8.8 to verify the WLAN is up and capable of making an internet connection (internet_test()).


As does mine. I made a business use case to get that exception granted. Now I'm just waiting on the local cloud before I can implement that business use case!


In case anyone stumbles on this thread while troubleshooting: I had a similar problem more recently and resolved it by allowing outbound TCP to device.spark.io on port 5683, as @kennethlimcp suggested.


Is this still the correct address? 54.208.229.4 and device.spark.io are different locations.

Also, device.spark.io resolves and device.particle.io does not…

You should allow device.spark.io, port 5683, TCP outbound. That’s the hostname that’s in system firmware.
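If your firewall only accepts IP addresses rather than hostnames, one option is to periodically resolve the hostname and refresh the rules from the result. A minimal sketch, assuming the `device.spark.io` hostname quoted above; the `resolve_ipv4` helper is hypothetical, not an official tool:

```python
import socket

def resolve_ipv4(hostname):
    """Return the sorted, de-duplicated IPv4 addresses a hostname resolves to."""
    # getaddrinfo returns tuples of (family, type, proto, canonname, sockaddr);
    # for AF_INET the sockaddr is (ip_string, port).
    infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    return sorted({info[4][0] for info in infos})

# Example (requires DNS access, so left commented out):
# print(resolve_ipv4("device.spark.io"))
```

Running something like this on a schedule and diffing the output against the firewall's allow-list would catch the address changes worried about below.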


I hope that IP address doesn’t change too often. It’s IP addresses you add to firewall rules, not hostnames.

Thank you.