Electron and water pressure sensor

Hello Particle community!

I don’t know if this is the right place to ask, but here goes.

I’ve bought myself an MBS 1900 pressure transmitter for air and water applications.

And I’ve run into some problems.
It takes a 4.5 to 5.5 V supply and outputs 10-90% of that supply voltage depending on the pressure it reads. It has a range of 0-10 bar. I’ve connected it to a tiny water pumping system.

I have a 3.3 V to 5 V boost converter connected between the Particle Electron and the pressure transmitter. I’ve also added a voltage divider (R1 = 10k, R2 = 20k) to the output, to keep it safe for the Electron’s ADC input.

I’ve connected the output to A4 on the Electron.

The code is very simple; I just read the data from the input with a single line:


One of the problems is that I read a varying value around 1250 when the pin is unconnected, which translates to about 1 V. I don’t know if this should just be ignored, but I don’t get it. I’ve read that a pull-down should not be used for analog pins.

The other problem is that I read about 2810 to 3200 (~2.3 to 2.5 V) when I change between 0 and 2 bar, which pretty much means no change at all in the readings. This checks out when I measure the sensor output in volts with a multimeter.

What am I doing wrong? Any help is much appreciated.

Thanks in advance,

Why, 390 units is just short of 10% of the full ADC range, of which you are not using 100% anyway: the source voltage range 0.5~4.5V translates to 0.33~3.0V after your divider (raw ADC ~410~3723), and the 0~2bar range maps to 12.5% of the sensor’s whole pressure range (before your edit from originally “0-16bar” to “0-10bar”). So (before the edit) that did appear to make sense to me.

@ScruffR Thank you for the answer! But I accidentally connected the pressure transmitter wrong. The analog input was shifting back and forth between 2810 and 3200 uncontrollably, no matter what I set the pressure to.

I assumed the wiring was red --> power, black --> ground and brown --> output. It appears however that the brown wire is for power and the red is for output. Black was still for ground.

Now I read 1300 to 1350 on the analog input, varying no matter what the actual pressure (bar) is :frowning:
BUT… I get 0.460 volts on the output at an actual 0.3 bar of pressure and 0.99 volts on the output at an actual 2.0 bar.

It was a rookie mistake on my part to assume anything when it comes to wiring. :sweat_smile:

I’m going to do some calculations and see if they check out with the voltage. The analogRead is still confusing me, though.

If my calculations are correct, then the voltage read from the multimeter on the output after the voltage divider makes sense.

0.99 V is 20% of the way up the possible output voltage range (3.428 - 0.38 = 3.048 V): (0.99 - 0.38) / 3.048 ≈ 20%. 20% of 10 (the pressure transmitter’s range is 0-10 bar) is 2, and it was at an actual 2 bar.

0.46 V is 2.62% of the way up the possible output voltage range: (0.46 - 0.38) / 3.048 ≈ 2.62%. 2.62% of 10 is 0.262, and it was at an actual 0.3 bar.

I’ll check out the analog input problem tomorrow and do some more testing.


With the updated data you provided in the OP, you should be reading ~1230 units at 0.99V, and if your 0.5V @ 0bar and 4.5V @ 10bar is correct and linear, you should get a “vacuum reading” of 0.33V (~410 units) and 3.0V at your analog pin reading ~3725 tops (assuming your supply is exactly 5.0V and the resistors are exactly 10k & 20k).


Works like a charm now! Thank you @ScruffR!