[Solved] api.spark.io certificate is expired


I have been using api.spark.io to connect with my Photons.
It was working fine until today, when requests started failing with an invalid-certificate error.
Apparently, the certificate expired yesterday.
If I switch to api.particle.io, I have to update my code accordingly, which I hesitate to do since I am not very familiar with the functions.
Is there any other option?

Thanks in advance.

@phyo_tz, I believe that domain is not planned for deprecation, so the SSL cert needs to be updated.

Let me check with the team on the plan.


Ok, the team will be working on updating the SSL cert for api.spark.io. :wink:

Will provide an update in the next few hours.

@phyo_tz, what are your concerns about changing to api.particle.io? I believe they work the same.

If you have a staging environment, you can try making the change there and see if anything breaks.


Thank you so much. I look forward to the update. I almost had a heart attack. :sweat_smile:
Meanwhile, my cloud functions and other services that connect to api.spark.io are failing. :cry:

It is because I wrote my code based on this tutorial, which uses api.spark.io. The other tutorials I followed were also using the Spark ecosystem at the time I implemented the code.
It took me a while (several days) to understand and modify the tutorials to get what I needed.
Therefore, I am quite hesitant to start over again.

Tutorial link

I believe that the change from spark.io to particle.io would not impact the web interface in the tutorial :wink:

I tried replacing all “spark” with “particle” in my files, but it is throwing these errors back at me. :sob:

Hi @phyo_tz,

Don’t do a simple string replace of the word ‘spark’ with ‘particle’; instead, replace only “api.spark.io” with “api.particle.io”.
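To illustrate why (using a hypothetical one-line file; the variable name and code are made up for the example, not taken from the tutorial), a global replace of “spark” also renames the JavaScript variable, while replacing only the hostname leaves the code intact:

```javascript
// Hypothetical line of tutorial code: the module variable "spark" and the
// host "api.spark.io" both contain the word "spark".
const source = "spark.login({ accessToken: token }); // https://api.spark.io/v1";

// Replacing every "spark" also renames the module variable, breaking the code:
const broken = source.replace(/spark/g, "particle");
console.log(broken); // now calls "particle.login", a variable that does not exist

// Replacing only the hostname keeps the code working:
const fixed = source.replace(/api\.spark\.io/g, "api.particle.io");
console.log(fixed); // still calls spark.login, now against api.particle.io
```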



Hello :smiley:

I used this code as a base and modified it extensively.
I bypassed the login process by directly inserting the access token into the code [which I am aware is bad practice]. I did it as a starting point because I needed to see the device status as soon as the page loads.
Since I bypassed the login process, there is no api.spark.io code involved.

I used several spark functions to retrieve all the data, which I think is the reason the new particle API is not working, as can be seen in the code below.
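For reference, a minimal sketch of the hardcoded-token approach, reading one cloud variable straight over the Particle REST API (the token, device ID, and variable name below are placeholders, not values from the original code):

```javascript
// Embedding an access token in client-side code is insecure; it is only
// acceptable as a throwaway testing shortcut.
const ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"; // placeholder
const DEVICE_ID = "YOUR_DEVICE_ID";       // placeholder
const VARIABLE = "temperature";           // hypothetical variable name

// The endpoint shape is unchanged from the old api.spark.io; only the host differs.
const url = "https://api.particle.io/v1/devices/" + DEVICE_ID +
            "/" + VARIABLE + "?access_token=" + ACCESS_TOKEN;

// In the browser one would then do something like:
// fetch(url).then(r => r.json()).then(d => console.log(d.result));
console.log(url);
```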

Thank you.

In this code, “spark” refers to a JavaScript module; it’s probably this file:

<script src="http://cdn.jsdelivr.net/sparkjs/1.0.0/spark.min.js" type="text/javascript" charset="utf-8"></script>  

When I open up that file, I see it’s using the correct URL:


Changing the variable name “spark” to “particle” in your code doesn’t really accomplish anything.
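A tiny demonstration of that point, using a stand-in object rather than the real spark.min.js library: renaming a JavaScript variable only creates a second reference to the same object, so it cannot change which server the library contacts.

```javascript
// Hypothetical stand-in for the object spark.min.js exports (not the real library).
const spark = { baseUrl: "https://api.spark.io" };

// "Renaming" the variable is just an alias to the same object:
const particle = spark;
console.log(particle.baseUrl); // still "https://api.spark.io"
console.log(particle === spark); // true -- same object, nothing changed
```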



In the console log (line 286), it is redirecting to api.spark.io, which is the main cause of the certificate error. :sweat:

Hi @phyo_tz,

That’s what I’m saying: you have some bit of code (perhaps in a module/library) with the URL “api.spark.io”. You should change that to “api.particle.io”.



Hi Dave,

Per your suggestion, I have been tracing api.spark.io through several external JS files.
What happened was that there are duplicate copies of spark.min.js in several locations.
I was looking for api.spark.io in the spark.min.js in location 1.
But I had hardcoded api.spark.io in the spark.min.js in location 2.
Silly me.
Now I have changed it to api.particle.io.
Thanks for your suggestions and help, both @Dave and @kennethlimcp.
Data is now being fetched from api.particle.io just fine. :smile:
Please update me if there is any news about the certificate for api.spark.io.

Again, thanks for your time. =D
