IP Range for Particle Cloud?

Hey Guys,

I was wondering what IP range the Photon uses for its outgoing connections to the Particle cloud. I know it connects to the AWS servers on port 5683 (CoAP), with load balancing, but I’m not sure of the exact range of addresses it uses. I’m on a closed educational network; the network team cannot unblock the port globally, but they can unblock specific IP addresses and ports if they know the destination IPs.

Thank you,

Sam

I believe it is just the one address. I don’t have it handy, but it’s just one IP, and one port; no ranges.

2 Likes

There is a thread that talks about the required open ports (in addition to the CoAP port).

It also needs some IPs like 8.8.8.8 for DNS testing.
But some things have changed with a recent update, so maybe @Dave or @BDub can chime in on this.

1 Like

Thanks for getting back to me; I look forward to hearing from @Dave and @BDub too. I contacted Particle, and they gave me the main address (54.208.229.4) as well as a fallback address used by pre-0.6.0 firmware (54.225.2.62), but the Photon is connecting to addresses they have not listed. I believe this is the load balancing on the AWS servers. All calls from the Photon picked up on the firewall appear to be on port 5683 too.

Good Question!

For the Photon, you only need outgoing TCP port 5683 like you mentioned, and for hosts we don’t have a single fixed IP anymore. We try to keep that host around for legacy devices / old configurations, but these days you can find the server IPs for your geographic region by querying “device.spark.io”.

Right now that gives me:

dig device.spark.io

; <<>> DiG 9.8.3-P1 <<>> device.spark.io
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 21375
;; flags: qr rd ra; QUERY: 1, ANSWER: 2, AUTHORITY: 4, ADDITIONAL: 4

;; QUESTION SECTION:
;device.spark.io.		IN	A

;; ANSWER SECTION:
device.spark.io.	60	IN	CNAME	device.nodes.spark.io.
device.nodes.spark.io.	60	IN	A	54.225.2.62

;; AUTHORITY SECTION:
nodes.spark.io.		60	IN	NS	ns-2025.awsdns-61.co.uk.
nodes.spark.io.		60	IN	NS	ns-216.awsdns-27.com.
nodes.spark.io.		60	IN	NS	ns-882.awsdns-46.net.
nodes.spark.io.		60	IN	NS	ns-1114.awsdns-11.org.

;; ADDITIONAL SECTION:
ns-216.awsdns-27.com.	111848	IN	A	205.251.192.216
ns-882.awsdns-46.net.	111858	IN	A	205.251.195.114
ns-1114.awsdns-11.org.	111985	IN	A	205.251.196.90
ns-2025.awsdns-61.co.uk. 111755	IN	A	205.251.199.233

However, you should expect this set of IPs to change from time to time, so the domain is the most accurate way of getting the list.

Thanks!
David

3 Likes

Wow, just one month ago I was told it was a single address. Glad I have the new info.

2 Likes

Great, thank you! I will try tweaking the firewall tomorrow and get back to you.

1 Like

Hi @Dave & ‘photoners’,

Thanks for all of your help,

I still had a few extra IPs to unblock on port 5683, which I will post on the forum once I have compiled a full list. I believe this is due to load balancing, but I may be wrong.

As a note for educators: on your network, a setting that we found helped alleviate some issues was ‘Block Unknown HTTP Requests’ in LightSpeed Filtering. If you are in this situation, I would recommend finding the Photon’s MAC address, assigning it a static IP address, and giving the Photon its own filter class within LightSpeed with the ‘Unknown HTTP Requests’ setting.

Thanks,

Sam

1 Like

Hi @Sammy_Herring,

I am facing a similar issue in some schools. The IT staff would like a list of IP addresses to allow. Did you end up compiling one? It would be extremely helpful!

Thanks,
Dan

The Particle Device Cloud comprises many servers running in different places around the world. We routinely start and stop servers automatically to meet demand and during routine maintenance, so it would be difficult to commit to a range of stable IP addresses. Our domain device.spark.io is dynamically updated to reflect which local servers are available for devices, and it is the best way to get a list of local servers to approve, but it is subject to routine change.

One alternative is to whitelist all Photon devices by MAC address, allowing them to make outgoing TCP connections to port 5683 on any host.
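For example, if the school firewall happens to be a Linux gateway, the rule could look something like the sketch below. This is only a minimal sketch, assuming iptables on the gateway and a placeholder MAC address, not an official Particle recipe.

# Minimal sketch, assuming a Linux/iptables gateway that forwards the
# Photon's traffic. The MAC address below is a placeholder.
PHOTON_MAC="aa:bb:cc:dd:ee:ff"    # replace with your Photon's MAC address

# Allow the Photon to open outgoing TCP connections to port 5683 (CoAP).
iptables -A FORWARD -m mac --mac-source "$PHOTON_MAC" -p tcp --dport 5683 -j ACCEPT

# Allow return traffic for connections the Photon initiated.
iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT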

If this is not possible, the list of IP addresses as of the time of writing is below. It is subject to change.

54.175.114.62
107.22.156.56
52.204.226.242
54.90.239.114
52.91.51.61
52.90.147.116
52.90.98.3
107.22.28.43
54.175.227.173
54.221.65.123

If a new server is added and is blocked by your firewall, the device will time out and try again using a different server, but this will increase the time it takes to connect to the cloud.
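If you do end up approving individual addresses, the sketch below shows one way to turn the list above into firewall rules. Again, this assumes a Linux/iptables gateway and simply reuses the list from this post, which will go stale over time.

# Minimal sketch, assuming a Linux/iptables gateway. The IP list is
# the one from this post and is subject to change.
for ip in 54.175.114.62 107.22.156.56 52.204.226.242 54.90.239.114 \
          52.91.51.61 52.90.147.116 52.90.98.3 107.22.28.43 \
          54.175.227.173 54.221.65.123; do
    iptables -A FORWARD -p tcp -d "$ip" --dport 5683 -j ACCEPT
done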

5 Likes

@rickkas7, @Dave,

It seems this has changed since the last reply on this topic. Is there a new place to look for the list of servers?

Here are my dig results:

dig device.spark.io

; <<>> DiG 9.10.3-P4-Ubuntu <<>> device.spark.io
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 53164
;; flags: qr rd ra; QUERY: 1, ANSWER: 2, AUTHORITY: 0, ADDITIONAL: 1

;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 512
;; QUESTION SECTION:
;device.spark.io.		IN	A

;; ANSWER SECTION:
device.spark.io.	59	IN	CNAME	device.nodes.spark.io.
device.nodes.spark.io.	59	IN	A	34.224.22.50

;; Query time: 99 msec
;; SERVER: 2001:4860:4860::8844#53(2001:4860:4860::8844)
;; WHEN: Thu Mar 01 17:53:48 UTC 2018
;; MSG SIZE  rcvd: 87

Following the CNAME to device.nodes.spark.io:

dig device.nodes.spark.io

; <<>> DiG 9.10.3-P4-Ubuntu <<>> device.nodes.spark.io
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 25130
;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 1

;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 512
;; QUESTION SECTION:
;device.nodes.spark.io.		IN	A

;; ANSWER SECTION:
device.nodes.spark.io.	59	IN	A	52.204.226.242

;; Query time: 46 msec
;; SERVER: 2001:4860:4860::8844#53(2001:4860:4860::8844)
;; WHEN: Thu Mar 01 17:53:34 UTC 2018
;; MSG SIZE  rcvd: 66

The current list is below. As before, the list is subject to change, so it’s best to avoid whitelisting using this list, if possible.

34.228.24.195
52.90.98.3
34.224.22.50
52.90.147.116
107.22.28.43
52.91.51.61
54.90.239.114
107.22.156.56
34.207.234.253
52.204.226.242

You can generate the list yourself by doing a dig of device.nodes.spark.io. You need to wait at least a minute between requests (I use 65 seconds to be safe) and repeat until you stop getting new addresses. That takes about 20 minutes now, but it would take longer as the number of servers increases. This method works now, but it is also subject to change.
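As an illustration of that approach, here is a rough shell sketch. The 65-second pause and the 20-pass cap are my own choices rather than anything official, and the DNS name may change as noted above.

#!/bin/bash
# Rough sketch: repeatedly resolve device.nodes.spark.io and collect
# the unique A records returned. The answer rotates roughly once a
# minute, so wait 65 seconds between queries to be safe.
seen="/tmp/particle-cloud-ips.txt"
: > "$seen"

for pass in $(seq 1 20); do                    # roughly 20 minutes of polling
    dig +short A device.nodes.spark.io >> "$seen"
    sort -u "$seen" -o "$seen"
    echo "pass $pass: $(wc -l < "$seen") unique IPs so far"
    sleep 65
done

cat "$seen"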

Thanks! I am going to try to avoid using this, but it’s nice to know how to get the list for quickly diagnosing issues.

Is this still a good method to find the Particle Cloud IPs?

There is a current list here:

https://docs.particle.io/tutorials/device-cloud/introduction/#cloud-services-and-firewalls

1 Like