Advice on using an RPi and the Particle agent to write to a display and handle touch input

I am looking for some advice on how to develop a Particle sketch that could write to a touch screen connected to the RPi and receive input from touch events. I have the official Raspberry Pi 7" touchscreen display, which works well with the RPi out of the box. Also, would it be possible to drive an HDMI-connected display panel?


Hi, I have just ordered an RPi 3B+ with a 7" touchscreen display and was wondering about the same question…
Have you made any progress in this since January?


@FiDel Not much, sorry to say. The feedback from J Vanier was to look at the Maker Faire project on GitHub. In summary, it is a kiosk-type solution: the RPi runs the Chromium browser and is also set up as a web server. The Particle application on the RPi writes JSON to a file in a specific directory. The browser's default page is set to a local HTML file containing AJAX JavaScript that reads the JSON file and updates the browser output. Input via the browser is possible, but it then has to be handled in the HTML/JS.
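To make the browser side of that architecture concrete, here is a minimal sketch of the kind of polling script the local HTML page could contain. This is not the Maker Faire project's actual code: the file name (`status.json`), the JSON fields, and the element id are illustrative assumptions, and it uses the modern `fetch` API rather than whatever AJAX mechanism the original project used.

```javascript
// Hypothetical sketch of the browser-side reader: poll a JSON status file
// that the Particle application on the RPi writes, and update the page.
// "status.json", the field names, and the "#status" element are assumptions.

// Pure helper: turn the parsed JSON object into display text.
// Kept separate from the DOM code so it can be tested outside a browser.
function formatStatus(status) {
  return `Temperature: ${status.temperature} °C (updated ${status.updatedAt})`;
}

// Browser-only part: poll every 2 seconds and write the result into the page.
// Guarded so the file can also be loaded in Node for testing.
if (typeof document !== "undefined") {
  setInterval(async () => {
    try {
      // "no-store" avoids the browser caching a stale copy of the file.
      const res = await fetch("status.json", { cache: "no-store" });
      const status = await res.json();
      document.getElementById("status").textContent = formatStatus(status);
    } catch (err) {
      // e.g. the Particle app is mid-write and the JSON is momentarily invalid
      console.error("Could not read status.json:", err);
    }
  }, 2000);
}
```

Separating the formatting from the fetch/DOM code also makes the permissions and malformed-JSON failures mentioned later in this thread easier to debug, since a fetch error and a rendering bug show up in different places.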


Thanks for your quick and clear reply @armor!
Did you also use that HTML/JS method for your project?

The Maker Faire project code is quite dense and clever (if J Vanier is reading this). I had to consult a friend who is a web development manager to get the AJAX/JavaScript part working, because it just was not working for me: file permissions, well-formed JSON, and the correct directories all had to be right. There is a lot to get right for it to work!
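For anyone attempting the same setup, the JSON the Particle application writes might look something like the fragment below. The field names are illustrative assumptions, not taken from the Maker Faire project; the two points that tripped me up were that the file must be valid JSON (no trailing commas, quoted keys) and that it must sit inside the web server's document root with read permission for the web server's user.

```json
{
  "temperature": 21.5,
  "humidity": 48,
  "updatedAt": "2018-06-01T12:00:00Z"
}
```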


How about using openHAB on the RPi?