I2C Camera Modules?

I hear that we can’t use Serial Cameras with the Spark Core due to Serial issues that have not been fixed yet.

I was wondering if it’s possible to use an I2C camera like this one with the Spark Core to get pictures out and onto the web: http://www.uctronics.com/mt9t112-mi-soc3132-31mp-hd-cmos-camera-module-adapter-board-jpeg-out-p-1440.html


@RWB By ‘we can use it’, do you mean we can also stream it over WiFi?

We should be able to send pictures over WiFi to a webpage or email. I’m not sure about continuous video streaming.

Streaming requires far too much overhead to do with the Core. You could get about 1FPS on a really good day.

You’d need a dedicated camera bus, like the one the Raspberry Pi has.

@timb How about just using the I2C-compatible camera modules to transmit one picture?

Spark is great for fast prototyping. I’m heading over to the Raspberry Pi side now :wink:

Thank you,
Hasan

If you want to stream video online you can get some really nice web cameras with software and all that jazz for $100. I just saw about 10 different options at Fry’s the other night.


Yeah, one picture would work. There’s also some cameras that transmit the picture over UART as well (SparkFun sells some).

Shit, if it will work then I guess we’re gonna need to figure out how to make it work.

That UART camera SparkFun sells is super low-res. I found a 2MP camera (linked above) that’s I2C with onboard JPEG compression, so the Core doesn’t have to do it. Not sure it even could.

Who else wants their Spark Core to be able to send a Picture via email or text?

@RWB Unfortunately the camera you linked doesn’t send the picture over I2C. It has two interfaces: MIPI CSI (high-speed serial differential data, this is what the Raspberry Pi uses) and an 8-bit parallel output (which you could hook to D0 to D7 on the Core). I2C is used to control the camera itself, tell it when to take a picture, what format it should be output as, set focus, enable/disable a flash, stuff like that.

I suppose one option would be to use the 8-bit output hooked to a Shift Register and connect that to the SPI port of the Core. That’s still a pretty high resolution, so even with JPEG compression you’d need some SPI based SRAM to hold the image so it could be uploaded/emailed/MMSed. That would make it even slower since you’d have to share the SPI port with the Shift Register and SRAM, basically reading a byte from the camera, writing it to the SRAM, sending an I2C write to tell the camera to send the next byte, rinse, lather, repeat.
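That byte pump can be sketched as a plain C++ simulation. Everything here is hypothetical: the camera, shift register, and SRAM are stand-ins backed by in-memory buffers, just to show the read → store → request-next loop.

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

// Hypothetical stand-ins for the real hardware transactions: reading one
// byte from the camera's 8-bit parallel port via the shift register, and
// writing one byte to SPI SRAM. Simulated with in-memory buffers so the
// data flow can be checked.
static std::vector<uint8_t> cameraJpeg = {0xFF, 0xD8, 0x01, 0x02, 0xFF, 0xD9};
static size_t cameraPos = 0;
static std::vector<uint8_t> spiSram;

// Camera -> shift register -> SPI read of one byte.
uint8_t readCameraByteViaShiftRegister() { return cameraJpeg[cameraPos]; }

// I2C write telling the camera to clock out the next byte.
void requestNextByteOverI2C() { cameraPos++; }

// SPI write of one byte to the external SRAM.
void writeSramByte(uint8_t b) { spiSram.push_back(b); }

// The byte pump: read a byte, stash it in SRAM, ask for the next one.
size_t pumpImageToSram() {
    while (cameraPos < cameraJpeg.size()) {
        writeSramByte(readCameraByteViaShiftRegister());
        requestNextByteOverI2C();
    }
    return spiSram.size();
}
```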

The Core is just not designed to work with a large megapixel camera. :frowning:

Your best bet is to either use a UART camera or get a cheap WiFi webcam and program the Core to talk to it via TCP (basically instruct it to snap a picture and post it to a specific URL).
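For the webcam route, the request you’d push over TCP could be as simple as this sketch; the snapshot path and host are made up, since they depend entirely on your camera’s HTTP API.

```cpp
#include <string>

// Build the HTTP request you would push to the webcam. The path
// "/snapshot.cgi" is a placeholder -- check your camera's documentation.
std::string buildSnapshotRequest(const std::string& host, const std::string& path) {
    return "GET " + path + " HTTP/1.1\r\n"
           "Host: " + host + "\r\n"
           "Connection: close\r\n\r\n";
}
```

On the Core you’d open the socket with the firmware’s `TCPClient` (`client.connect(host, 80)`, then `client.print(...)`) and read the JPEG bytes back out of the response body.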

@timb Now that I managed to get the Core to take pics with the LinkSprite, are there ways to reduce the time needed to copy out the entire JPEG hex contents?

It takes about 2 minutes and the Core gets disconnected from the cloud, but for now I’m compiling locally until I rewrite it in non-blocking form.

Not at the moment. Once the WiFi code is decoupled into a separate process you won’t need to worry about user loop code timing the CC3000 out after 10 seconds. With the current firmware the WiFi should at least gracefully restart without resetting the entire Core.

Tonight I picked up a UART/SPI 640x480 JPEG camera module from Radio Shack. I’m thinking the SPI interface shouldn’t cause a WiFi disconnect: I can read bytes for 1 second and store them in a buffer, keep the CS line low, and let the main loop iterate, then read for another second, append to the buffer, and so on. This isn’t possible over UART (at least without RTS/CTS lines), as the camera will just spit out a stream of data without waiting for an ACK.
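The slice-per-loop idea, simulated in plain C++ (the camera’s SPI stream is stood in for by a byte vector, and `CHUNK` stands in for “however many bytes fit in about a second of reads”):

```cpp
#include <algorithm>
#include <cstdint>
#include <cstddef>
#include <vector>

// With CS held low, read one slice of bytes per pass through loop(),
// append to a buffer, and return so the main loop (and the CC3000
// housekeeping) gets serviced between slices.
static const size_t CHUNK = 64;  // stand-in for ~1 second's worth of SPI reads

struct ChunkedReader {
    const std::vector<uint8_t>& source;  // stand-in for the camera's SPI stream
    size_t pos = 0;
    std::vector<uint8_t> image;

    explicit ChunkedReader(const std::vector<uint8_t>& s) : source(s) {}

    // One loop() pass: read at most CHUNK bytes, then yield.
    // Returns true while more slices remain.
    bool readSlice() {
        size_t end = std::min(pos + CHUNK, source.size());
        image.insert(image.end(), source.begin() + pos, source.begin() + end);
        pos = end;
        return pos < source.size();
    }
};
```

Because CS stays low between slices, the camera holds its place in the stream while the Core goes off and services the cloud connection.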

That sounds good! Hmm, too bad I can’t push it further, but I might be able to rewrite the loops to be non-blocking.

Hi @kennethlimcp

The LinkSprite manual says there are two parameters, chunk size and interval time, that you could try playing around with.

@bko Thanks for the information! I tried messing around with that but the output gets a little erratic with some additional bytes appearing at the end.

The same issue is mentioned on other forums when people attempt to increase the buffer size.

In addition, the ring buffer for Serial1 is 64 bytes, and we won’t be able to push it much faster without modifying the core firmware.
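To illustrate why that 64-byte ring buffer loses data if it isn’t drained fast enough, here’s a minimal simulation; the size and drop-on-full behavior are assumptions about the firmware’s buffer, not a copy of it.

```cpp
#include <cstdint>
#include <cstddef>

// A minimal 64-byte ring buffer like the one behind Serial1. If loop()
// doesn't drain it faster than the camera fills it, incoming bytes are
// silently dropped.
struct RingBuffer64 {
    uint8_t buf[64];
    size_t head = 0, tail = 0, count = 0;

    // Called from the UART receive interrupt. Returns false (byte dropped)
    // when the buffer is already full.
    bool push(uint8_t b) {
        if (count == 64) return false;
        buf[head] = b;
        head = (head + 1) % 64;
        ++count;
        return true;
    }

    // Called from loop(), i.e. what Serial1.read() does. Returns -1 when empty.
    int pop() {
        if (count == 0) return -1;
        uint8_t b = buf[tail];
        tail = (tail + 1) % 64;
        --count;
        return b;
    }
};
```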

Any idea if writing to an SD card would be faster? I think printing the output on the serial console isn’t an accurate way to measure the total time required, since the write might be slowed down by sending to the console.

Sending directly to the SD for storage might speed things up. Or the external flash maybe? :smiley:

You’re going up against the speed of the serial port either way. Whatever speed you’re seeing on the console is the same speed you’d be writing to an SD card. 115200 baud ≈ 11.5 KB/s. A raw 640x480 image is about 900 KB; with JPEG compression let’s say that’s 90 KB. That’s about 8 seconds of transfer time, assuming a raw stream of data. If you have to request each byte, it will take at least twice as long.
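As a quick sanity check of that arithmetic (assuming 24-bit raw pixels, so a 640x480 frame is about 900 KB, and roughly 10:1 JPEG compression, both assumptions):

```cpp
// Back-of-the-envelope transfer time at 115200 baud.
// 8N1 framing means 10 bits on the wire per byte, so ~11.5 KB/s.
double jpegTransferSeconds(double width, double height) {
    const double bytesPerSec = 115200.0 / 10.0;    // ~11.5 KB/s effective
    const double rawBytes    = width * height * 3; // 24-bit color, ~900 KB at 640x480
    const double jpegBytes   = rawBytes / 10.0;    // assume ~10:1 JPEG compression
    return jpegBytes / bytesPerSec;                // ~8 s for a streamed 640x480 frame
}
```

Requesting byte-by-byte adds a round trip per byte, which at least doubles that time.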

The camera board I picked up from Radio Shack actually looks right up the Core’s alley, especially with the SPI output. Adafruit also sells a similar camera using the exact same CCD and DSP chip; however, it doesn’t have SPI broken out!

I’m working on a driver for it now.


It is certainly better to offload image processing to a dedicated chip. I’m super excited about the Pixy (well, once it ships.) I have the original CMUcam and it is a brilliant thing.


@jonathanberi PIXY looks perfect for working with the Spark Core! Looking forward to getting one of those once they are available.