Video Mesh-ages

Good people at Particle,
One thing that still seems beyond reach on the Particle platform is video streaming.
Searching this forum, I gathered that the bottleneck is not the WiFi side; it is getting the data out of the camera.
Is there anything new in the Argon that would facilitate implementing a higher-speed interface, or something similar to the CSI port on the Pi?
And if not streaming, a faster transfer of still images?


The Photon/Electron don’t have the digital camera interface (DCMI) needed to interface with cheap video sensors like the OV7670. However, the real limitation is RAM: with only about 60 KB of free memory, there isn’t enough RAM to buffer even a single frame at any reasonable resolution. That rules out doing anything other than shipping raw data straight off the sensor over TCP.

Besides having much more RAM, the Pi also has a GPU to offload some of the processing to. Neither the current Particle devices nor the new ones are likely to be suited for video, nor are they intended to be.


Second that. You’re looking for a higher class of processor/platform.

You can use DMA to semi-automatically pull data from the camera faster, but you’re pretty much limited to shooting it over the wire as fast as it comes in, in whatever format you’re getting it.

Here is a camera another member here just purchased to see if we can easily interface it with an Electron for taking and then uploading images to the web for viewing. It has a new library, and we’re hoping it’s easy to use and work with.

What do you think about this, @rickkas7?

The uCAM is more practical because it produces small images and does the JPEG compression on the camera module itself. That makes it feasible to simply read the compressed JPEG off the camera and upload it to a server without having to do any processing on the Electron/Photon.


Thank you for the replies.
I agree the processor isn’t meant for this, but I was thinking of piping the data directly out over WiFi. I guess the data would have to come in through the SPI or I2C ports, and it would help if the camera did the compression. The client side could do it too, but I wouldn’t know how to code that.

I looked at the uCam before, but it has low resolution.
The Pixy camera is really impressive. I haven’t read about all its features, but based on the videos I’ve seen, the video quality is low.
There are many interesting ArduCam projects. The frame rate is low, but it seems to be the same concept I had in mind.

Anyway, I thought I would ask because I seem to recall that the new modules will support a higher transfer rate, though I can’t find where I read it.