More ADC shenanigans

After measuring the voltage at the divider, it seemed fine: it varied linearly with the output voltage and was right around what I calculated it should be. It's a 90k/10k divider, so the divider's output should be (and is) very close to 10% of the post voltage. However, the output of the op amp (configured as a unity-gain buffer) does not go much above 1.9v, which is why the ADC tops out at ~19v.
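For reference, the divider math is just a ratio. Here's a quick sketch of it (the 90k/10k values are from above; the output voltages in the loop are only illustrative):

/* Divider tap voltage for a 90k/10k divider ahead of the buffer and ADC. */
#include <stdio.h>

int main(void)
{
    const double r_top = 90e3;    /* 90k from the output rail */
    const double r_bottom = 10e3; /* 10k to ground; the tap feeds the buffer */

    double ratio = r_bottom / (r_top + r_bottom); /* 10k / 100k = 0.10 */

    for (double v_out = 5.0; v_out <= 30.0; v_out += 5.0)
        printf("output %5.1f V -> divider tap %5.2f V\n", v_out, v_out * ratio);

    return 0;
}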

I don't know enough about op amps and analog electronics to fully understand why this is, but reading the datasheet again indicates that Voh (high-level output voltage) with Rl (load resistance) >= 2k is Vcc - 1.5v. In this application Vcc is 3.3v, so if the ADC input presents a load in that neighbourhood (which it might, given the recommended maximum source impedance of 2.5k), the op amp's maximum output would be about 1.8v. That is fairly close to the 1.9v I'm seeing, which corresponds to right around 19v at the output.
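Working backwards from those datasheet numbers makes the ~19v ceiling fall out directly. A small sketch, assuming Voh = Vcc - 1.5v and the 10% divider:

/* Rough check of the op-amp swing limit and the resulting readout ceiling. */
#include <stdio.h>

int main(void)
{
    const double vcc = 3.3;            /* op-amp supply */
    const double voh = vcc - 1.5;      /* datasheet Voh with RL >= 2k: 1.8 V */
    const double divider_ratio = 0.10; /* 90k/10k divider */

    /* Anything above this at the supply output clips at the buffer. */
    printf("buffer tops out at %.2f V -> readout clips near %.1f V\n",
           voh, voh / divider_ratio);

    return 0;
}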

If I were to redesign this circuit and board, I would change the op amp to a rail-to-rail device and this issue would go away, but a complete redesign is not happening at the moment. To fix the board I have, I decided to change the voltage divider so that the buffered voltage stays within the usable 0-1.9v range. To do this I desoldered the 90k resistor from both pads and resoldered it to one pad standing on end, soldered another 90k to the other pad (also on end), and then connected the top ends together. This puts the two 90k resistors in series for 180k, which means the maximum divider output is around 1.5v, so it stays in range.
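The same sketch with the stacked 90k resistors shows why the tap now stays under the buffer's ceiling across the whole output range:

/* Divider tap voltage after the modification: two 90k in series over 10k. */
#include <stdio.h>

int main(void)
{
    const double r_top = 180e3;   /* 2 x 90k stacked on end */
    const double r_bottom = 10e3;
    const double ratio = r_bottom / (r_top + r_bottom); /* ~0.0526 */

    printf("divider ratio: %.4f\n", ratio);
    printf("at 30 V output the tap sits at %.2f V\n", 30.0 * ratio);

    return 0;
}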

After making this hardware change, I re-calibrated the conversion line and tested it. Between 0.8v and ~30v the front-panel readout is within 10-20mv of my voltmeter.
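For what it's worth, one simple way to derive a conversion line like this is a two-point linear fit. The sketch below is not the actual firmware, and the calibration points are made up for illustration; it just shows the general idea:

/* Two-point fit from (raw ADC count, meter volts) pairs, then convert a reading. */
#include <stdio.h>

int main(void)
{
    /* Hypothetical calibration points */
    const double adc_lo = 110.0,  v_lo = 0.8;
    const double adc_hi = 3900.0, v_hi = 30.0;

    double slope  = (v_hi - v_lo) / (adc_hi - adc_lo);
    double offset = v_lo - slope * adc_lo;

    /* Convert an arbitrary raw reading with the fitted line. */
    double raw = 2048.0;
    printf("slope %.6f V/count, offset %.4f V\n", slope, offset);
    printf("raw %.0f -> %.3f V\n", raw, slope * raw + offset);

    return 0;
}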
