Current sensing

One last question for the Panel of Experts: I want to build a (reasonably)
accurate current meter for my e15, and for the electric car I am building,
but I am running into a snag:

Usually, when you want to build a current meter, you just measure the
voltage across some small resistor in the circuit you want to measure.
However, at, say, 100 A, the power dissipated, "eye-squared-are" (I^2 * R),
by even a .1 ohm resistor is about 100*100*.1 or 1000 W!  The problem is,
how do you get the resistance to be nice and small, and yet stable, so you
can measure current without pissing away power? (Thermal variations must
take over really quickly ...)
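To make the trade-off concrete, here is a small sketch of the I^2 * R arithmetic above, run over a few candidate shunt resistances (the milliohm values are my own illustrative picks, not from the post). Shrinking R cuts the dissipation by the same factor, but it also shrinks the full-scale voltage the meter has to resolve, which is the other half of the snag.

```python
# Sketch: shunt-sizing trade-off at a 100 A load.
# For each candidate shunt resistance, compute the power burned
# in the shunt (P = I^2 * R) and the full-scale voltage the
# meter must resolve (V = I * R).

def shunt_tradeoff(current_a, shunt_ohms):
    """Return (power_w, full_scale_v) for a given shunt resistance."""
    power_w = current_a ** 2 * shunt_ohms
    full_scale_v = current_a * shunt_ohms
    return power_w, full_scale_v

if __name__ == "__main__":
    amps = 100.0
    # 0.1 ohm is the value from the post; the rest are illustrative.
    for r in (0.1, 0.001, 0.0005, 0.0001):
        p, v = shunt_tradeoff(amps, r)
        print(f"R = {r * 1000:8.2f} mOhm   P = {p:7.1f} W   "
              f"full-scale V = {v * 1000:8.1f} mV")
```

Note how at 0.5 mOhm the shunt only burns 5 W at 100 A, but the meter is now reading a 50 mV signal, so stability and millivolt-level accuracy become the design problem instead of heat.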

I must be missing the design theory for large-current measurement, because
it has to be cheaper and easier than things like....

- Since the current runs pretty much at steady state, one could, I guess,
measure the magnetic field around a straight segment of conductor.
- Measure the resistance of a piece of the circuit's conductor as a
function of temperature, and then temperature-compensate the voltage
reading across that segment (thermistor kissing the side of the conductor).
- Give up, call Bob Batson at EVoA, and shell out the money for a
factory-built one.
- Drink a triple espresso so you can count electrons individually as they
go by (grin).
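For what it's worth, the magnetic-field idea in the first bullet is easy to sanity-check with Ampere's law: around a long straight conductor, B = mu0 * I / (2 * pi * r), so a field sensor at a fixed distance reads a field directly proportional to the current, with nothing inserted in the power path. A quick sketch (the 1 cm sensor distance is my own illustrative assumption):

```python
# Sketch: flux density around a long straight conductor,
# B = mu0 * I / (2 * pi * r)  (Ampere's law).
# A field sensor at fixed distance r sees B proportional to I,
# so no series resistance and no I^2*R loss.
import math

MU0 = 4 * math.pi * 1e-7  # permeability of free space, T*m/A

def field_at(current_a, distance_m):
    """Flux density in tesla at distance_m from a straight conductor."""
    return MU0 * current_a / (2 * math.pi * distance_m)

if __name__ == "__main__":
    for amps in (10.0, 100.0):
        b = field_at(amps, 0.01)  # sensor 1 cm from the conductor
        print(f"I = {amps:6.1f} A  ->  B = {b * 1000:.3f} mT")
```

At 100 A and 1 cm that works out to about 2 mT, which is comfortably within the range of ordinary field sensors, though stray fields and nearby conductors would need shielding or a toroidal core in practice.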

Any thoughts? How does a fancy-pants one work? Thanks, allayou.