Batt. Voltage Measurement

Darrell Norquay dnorquay at awinc.com
Tue May 14 03:32:24 GMT 1996


At 11:23 AM 5/13/96 EDT, you wrote:
>Suppose I had an A/D board and wanted to route my car battery's voltage into it
>for purposes of monitoring the voltage that my injectors receive.  What extent
>of filtering or other electrical rigging is necessary in order to get a decent
>signal for the brain to do math with i.e., how do I do it?
>

The simplest way to measure battery voltage for an A/D converter is to scale 
it down with a pair of resistors in voltage divider mode.  This approach, 
however, wastes a lot of your scale, since you are not really interested in
voltages below around 8V or so.  The voltage divider simply divides the input
voltage by a constant, giving you (say) a 0-5V output for a 0-16V input.
A better approach is to subtract a fixed offset from the input, and then do 
your resistive divider to scale the result to 0-5V (or whatever your A/D full
scale reading is).  Assuming your A/D input is 0-5V, which is most common, and 
you want to read an input range of 8-16V, which is about right for this 
application, whip up a little circuit as follows:

                           
                                                                    R2
                                             Vout = (Vin - 8) X ( ------- )
                   /                                              R1 + R2
                  |/|
where    Batt-----| |-------O                     assume  Vin = 16V
                  |\|       |                            Vout = 5V
                 /          \ 
                  8V    R1  /           Arbitrarily assign R1 = 2.0K
                 zener      \                      then    R2 = 3.3K
                            /
                            |
                            O-----O-----Vout
                            |     |
                            \     |
                        R2  /    ---
                            \    ---  Cap
                            /     |
                            |     |
                       gnd ---   ---

This circuit will subtract a fixed 8V from the battery voltage, so for an 
input of 8-16V you will have 0-8V at the top of the voltage divider.  Scale
this down by a factor of .625, and you have 0-5V out for 8-16 V in.  This will
give a nice expanded scale for the input of the A/D, and increase your
resolution.  A 1N4738 zener has a nominal voltage of 8.2V and should do
nicely for this application.  
The resistor values shown are the nearest stock values and, with the 8.2V
zener, actually give about 0-4.8V out.  This should not have much effect; if
you are a perfectionist, you could calculate values using 1% tolerance
resistors for maximum accuracy.
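
If it helps to see the math the brain would do on the software side, here is a
minimal C sketch of the conversion back to battery volts, assuming a 10-bit
A/D with a 0-5V full scale and the 8.2V zener and 2.0K/3.3K divider above (the
constant and function names are just made up for illustration).  The 3.3K
figure itself comes from R2 = 0.625 x R1 / (1 - 0.625), which works out to
about 3.33K for R1 = 2.0K, hence the 3.3K stock part.

    /* Convert a raw A/D count back to battery volts.
     * Assumes a 10-bit converter (0..1023 counts = 0..5V), the 8.2V
     * nominal zener offset, and the R1 = 2.0K / R2 = 3.3K divider.
     */
    #define ADC_FULL_SCALE  1023.0                 /* counts at 5V in      */
    #define ADC_VREF        5.0                    /* A/D full scale, V    */
    #define ZENER_OFFSET    8.2                    /* 1N4738 nominal, V    */
    #define DIVIDER_RATIO   (3.3 / (2.0 + 3.3))    /* R2 / (R1 + R2)       */

    double batt_volts(unsigned int adc_count)
    {
        double vout = ((double)adc_count / ADC_FULL_SCALE) * ADC_VREF;
        return vout / DIVIDER_RATIO + ZENER_OFFSET;
    }

With these numbers one count works out to roughly 8 mV of battery voltage,
versus about 16 mV per count if the whole 0-16V range went straight into a
plain divider, which is the resolution gain mentioned above.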

The cap shown adds some filtering to the output voltage; the value depends on
how much filtering you need.  Start out with a few microfarads, and if your
input is too noisy, jack up the value.  For this application you probably
don't need really fast readings: you wouldn't have to adjust the voltage
correction for every injector cycle, except maybe during cranking, where it
matters most.  The cap will slow the response to fast-changing voltages, so if
you are sampling at a high rate, you may filter out the data of interest.
Some additional software filtering, by averaging a few readings together,
would also help.
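
For the software side of that, a running average of the last few readings is
about the simplest thing that works; a rough sketch in C (the buffer length
and names are arbitrary):

    /* Running average of the last AVG_LEN raw battery readings.
     * AVG_LEN = 8 is arbitrary: bigger means smoother but slower
     * to follow real changes (e.g. during cranking).
     */
    #define AVG_LEN 8

    static unsigned int samples[AVG_LEN];
    static unsigned int next_slot = 0;

    unsigned int batt_filter(unsigned int new_count)
    {
        unsigned long sum = 0;
        unsigned int i;

        samples[next_slot] = new_count;
        next_slot = (next_slot + 1) % AVG_LEN;

        for (i = 0; i < AVG_LEN; i++)
            sum += samples[i];

        return (unsigned int)(sum / AVG_LEN);
    }

Like the cap, a longer average trades noise rejection for response time, so
keep it short if you want to catch the sag during cranking.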

Since this has to live in an automotive environment, it may not hurt to add
some transient protection in the form of a 5.6 or 6.2V zener across R2 to 
protect the A/D input from voltage surges, etc.  Be careful with this, however,
since a zener draws significant current below its rated voltage and may upset
the operation of the divider near full scale.  This is not really much of a
concern in this application, since you are presumably most interested in
accuracy at the lower end of the scale, where battery voltage has the most
effect on injector opening time.
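
As a rough sanity check on that caution: the divider only carries about
(16 - 8.2)V / 5.3K = 1.5 mA at full scale, so clamp leakage of even a couple
hundred microamps near the zener knee would pull the reading down visibly.  A
quick back-of-the-envelope in C (the leakage figure below is just a number to
plug in, not a data-sheet value):

    /* Estimate how much a leaky clamp zener skews the divider output.
     * The leakage current is an assumed figure for illustration only.
     */
    #include <stdio.h>

    int main(void)
    {
        double r1 = 2000.0, r2 = 3300.0;
        double v_top  = 16.0 - 8.2;              /* volts across divider   */
        double i_div  = v_top / (r1 + r2);       /* divider current, A     */
        double i_leak = 200e-6;                  /* assumed leakage, A     */
        double v_err  = i_leak * (r1 * r2 / (r1 + r2));  /* drop at Vout   */

        printf("divider current %.2f mA, clamp error %.2f V at full scale\n",
               i_div * 1e3, v_err);
        return 0;
    }

With those numbers the error is around a quarter of a volt at the top of the
scale, which is why the clamp matters less down at the low end where the
accuracy is actually needed.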

Hope this helps (and that the ASCII schematic survives transmission).
regards
dn



