Is there a way to get TCP server/client status?

I continue to have problems with the Spark TCP server/client. I am not familiar enough with Unix sockets to be sure the Spark is at fault, but it looks that way. I created crude logs of the transactions: my host machine wants some info from the Spark, and a child process is spawned to handle each transaction. The child asks the Spark server for a connection, gets it, and sends its request for info to the Spark. Most of the time the Spark replies, but eventually it stops. If I issue a `spark flash ...` command from the CLI, effectively rebooting the Spark, the TCP comms are fine again for a while. So what I want to know is:

  1. Can I check the status of the TCP server to know whether it's still running?
  2. Is there a way to restart the server? Can I issue multiple `server.begin()` calls?
  3. Can I set a timeout on the server or client to close a client/server pair in case something gets stuck?
  4. Is there a `server.stop()` to kill the server, so it can be restarted cleanly?
  5. Is there a "spark reboot" command for the CLI?
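For context, the workaround I have in mind for question 3 looks something like the sketch below. It is untested and assumes the documented `TCPServer`/`TCPClient` firmware API; the port number and timeout value are arbitrary examples, and whether repeated `server.begin()` calls are safe is exactly what I'm asking, so treat it as a pattern to try rather than a known fix:

```cpp
// Untested sketch: enforce a per-client inactivity timeout ourselves,
// since I don't see a built-in one. Port 23 and the 5 s timeout are
// arbitrary example values.
TCPServer server = TCPServer(23);
TCPClient client;
unsigned long lastActivity = 0;
const unsigned long IDLE_TIMEOUT_MS = 5000;

void setup() {
    server.begin();
}

void loop() {
    if (client.connected()) {
        if (client.available()) {
            // Echo whatever the host sends, and note the activity time.
            server.write(client.read());
            lastActivity = millis();
        } else if (millis() - lastActivity > IDLE_TIMEOUT_MS) {
            // Drop a client that has gone quiet so the pair can't stay
            // stuck; the host's child process can then reconnect.
            client.stop();
        }
    } else {
        // Accept the next waiting client, if any.
        client = server.available();
        lastActivity = millis();
    }
}
```

If something like this is the intended usage, it would answer question 3, but it still leaves the server itself unchecked, which is why I'm asking about status and restart.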

@ronm if you use the HttpClient library you can get all the details.
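Something along these lines, roughly; this is untested and assumes the request/response structs of the community HttpClient library for the Spark (the hostname, port, and path are placeholder examples):

```cpp
// Untested sketch using the community HttpClient library for the Spark.
// With explicit request/response structs you can inspect the HTTP status
// and body directly, instead of guessing at raw TCPClient state.
#include "HttpClient/HttpClient.h"

HttpClient http;
http_header_t headers[] = {
    { "Accept", "*/*" },
    { NULL, NULL }  // terminator required by the library
};
http_request_t request;
http_response_t response;

void setup() {
    Serial.begin(9600);
    request.hostname = "example.com";  // placeholder host
    request.port = 80;
    request.path = "/";
}

void loop() {
    http.get(request, response, headers);
    // response.status and response.body tell you exactly how the
    // transaction went, which is the detail @ronm is after.
    Serial.println(response.status);
    Serial.println(response.body);
    delay(10000);
}
```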