I have been using api.spark.io to connect with my Photons.
It was fine until today, when I started getting an invalid certificate error.
Apparently, the certificate expired yesterday.
If I were to switch to api.particle.io, I would have to update my code accordingly, which I hesitate to do since I am not very familiar with the functions.
Is there any other option?
Thank you so much. I look forward to an update. I almost had a heart attack.
Meanwhile, my cloud functions and everything else that connects to api.spark.io are dying.
It is because I wrote the code based on this tutorial, which uses api.spark.io. The other tutorials were also using the Spark ecosystem at the time I implemented the code.
It took me a while (several days) to understand and modify the tutorials to get what I needed.
Therefore, I am quite hesitant to start over from scratch.
I used this code as a base and modified it extensively.
I bypassed the validation process by inserting the access token directly into the code [ which I am aware is bad practice ]. I did it as a starting point because I needed to see the device status as the page loads.
Since I bypassed the login process, there is no api.spark.io code involved there.
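For context, the shortcut I took looks roughly like this sketch (the token, device ID, and variable name are placeholders, not my real values; the `/v1/devices/...` path is the standard Spark/Particle REST endpoint):

```javascript
// Sketch of the token-embedding shortcut described above.
// WARNING: shipping an access token in client-side code is bad practice;
// anyone who views the page source can control your device.
var ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"; // placeholder
var DEVICE_ID = "YOUR_DEVICE_ID";       // placeholder

// Build the REST URL for reading a cloud variable; the API accepts the
// token as an access_token query parameter, so no separate login step
// is needed before the page can fetch device status.
function variableUrl(variable) {
  return "https://api.spark.io/v1/devices/" + DEVICE_ID +
         "/" + variable + "?access_token=" + ACCESS_TOKEN;
}
```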
I used several Spark functions to retrieve all the data, which I think is the reason the new Particle API is not working, as can be seen in the code below.
That’s what I’m saying: you have some bit of code (perhaps in a module / library) with the URL “api.spark.io”. You should change that to “api.particle.io”.
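In other words, the fix is just a hostname swap; the REST paths stay the same. A minimal sketch (the variable and function names are illustrative, not taken from the actual spark.min.js):

```javascript
// Before: the code pointed at the old host.
// var API_BASE = "https://api.spark.io";

// After: only the hostname changes; endpoints are unchanged.
var API_BASE = "https://api.particle.io";

// Illustrative helper: build a device endpoint URL against the new host.
function deviceUrl(deviceId, endpoint) {
  return API_BASE + "/v1/devices/" + deviceId + "/" + endpoint;
}
```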
Per your suggestion, I traced api.spark.io through several external JS files.
What happened was that there were duplicate copies of spark.min.js in several locations.
I was looking for api.spark.io in the spark.min.js in location 1.
But the hardcoded api.spark.io was in the spark.min.js in location 2.
Silly me.
Now I have changed that one to api.particle.io as well.
Thanks for your suggestions and help, both @Dave and @kennethlimcp.
Now the data is fetching great from api.particle.io.
Please update me if there is any news about the certificate for api.spark.io.