Voltage Divider problems
tyler wrote 02/21/2022 at 20:29 • 0 points

Hello! Today I started work on (basically) my first ever analog circuit. It will probably be familiar to most of you: an LDR in a voltage divider to light up an incandescent Christmas light.
I set it up, plugged my multimeter into Vout, and the logic worked as expected. I put a flashlight up to the LDR, and the voltage approached the input. (I used a potentiometer for the other resistor.)
Here's the problem: when I substituted the light for the multimeter, nothing happened. When I put the multimeter in parallel (not series) with the light, it showed no voltage. The circuit works fine on its own, but as soon as I add the light, it quits. Does anyone know what could cause this?
Thanks, Tyler
EDIT: I originally wrote that I put the multimeter in series with the bulb. I have no idea why I said that; in actuality I put the probes in parallel with the bulb. Sorry for any confusion this caused, and thanks for pointing it out.
Discussions
Thank you all for your help. I ended up using a common-emitter transistor amplifier to drive the light, and it worked! Truly magical experience, thanks for your help on my first (real) analog circuit!
This question is often posed by beginners in forums. A voltage divider only behaves ideally when there is no load; a multimeter has a very high input impedance, so it is a good approximation of no load. When you attach a load, it changes the divider ratio.
If voltage dividers were such magical things, all you would need to step down 230V to 110V mains supply would be a pair of resistors. Not!
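To make the loading effect concrete, here is a small sketch comparing the unloaded divider output with the output once a low-resistance bulb is attached. The component values are assumptions for illustration, not numbers from the original post:

```python
# Voltage divider loading sketch (all component values are assumptions).
VIN = 5.0       # assumed supply voltage
R1 = 10_000.0   # assumed top resistor (potentiometer setting)
R2 = 1_000.0    # assumed LDR resistance under a flashlight
R_LOAD = 10.0   # assumed cold resistance of a small incandescent bulb

def divider(vin, r1, r2, r_load=None):
    """Divider output, optionally with r_load in parallel with r2."""
    if r_load is not None:
        r2 = (r2 * r_load) / (r2 + r_load)  # parallel combination
    return vin * r2 / (r1 + r2)

print(f"Unloaded (multimeter): {divider(VIN, R1, R2):.3f} V")
print(f"Loaded (bulb):         {divider(VIN, R1, R2, R_LOAD):.4f} V")
```

With these numbers the output collapses from roughly 0.45 V to a few millivolts the moment the bulb is connected, which matches the behavior Tyler describes.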
Does that improve with higher value resistors (assuming precision is maintained) or is it just something that moves all over depending on the overall circuit?
Not sure what you mean. If you know the value of the resistors in the divider and the resistance of the load, it can all be calculated.
The incandescent bulb will have a resistance of a few ohms, while the LDR will be a few kilo-ohms even when fully illuminated. You need an amplifier, such as a transistor (specifically a Darlington pair), to get towards the 1000x current amplification you need.
E.g. https://technologystudent.com/images2/ldrlght.gif
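The 1000x figure above can be sanity-checked with a quick estimate. All values here are illustrative assumptions, not measurements from the original circuit:

```python
# Rough estimate of the current mismatch (all values are assumptions).
V_SUPPLY = 5.0          # assumed supply voltage
R_BULB = 5.0            # assumed resistance of a small incandescent bulb
R_LDR_BRIGHT = 5_000.0  # assumed LDR resistance when fully illuminated

i_bulb = V_SUPPLY / R_BULB       # current the bulb wants: 1 A
i_ldr = V_SUPPLY / R_LDR_BRIGHT  # roughly what the LDR branch can pass: 1 mA

print(f"Bulb wants ~{i_bulb * 1000:.0f} mA; LDR branch passes ~{i_ldr * 1000:.0f} mA")
print(f"Current amplification needed: ~{i_bulb / i_ldr:.0f}x")
```

Three orders of magnitude is comfortably within reach of a Darlington pair, whose effective gain is roughly the product of the two transistors' gains.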
Impedance is the answer. The meter has a high impedance, so it can measure the divider output without loading it too much, but a light has too low an impedance and thus tries to pull too much current, collapsing the divider output. What you need is effectively an amplifier on the output of the divider to feed the light. You can build a simple single-transistor amplifier using a BJT or MOSFET.
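As a back-of-the-envelope illustration of why a single BJT may be marginal here while a Darlington is comfortable (all numbers below are assumptions for illustration):

```python
# Base-current sketch for driving the bulb (values are assumptions).
I_BULB = 0.2           # assumed bulb current: 200 mA
BETA_SINGLE = 100      # typical small-signal BJT current gain
BETA_DARL = 100 * 100  # Darlington gain: roughly the product of both betas

ib_single = I_BULB / BETA_SINGLE  # 2 mA of base current needed
ib_darl = I_BULB / BETA_DARL      # 20 uA, easy for a high-impedance divider

print(f"Single BJT base current: {ib_single * 1000:.1f} mA")
print(f"Darlington base current: {ib_darl * 1e6:.0f} uA")
```

A kilo-ohm-range LDR divider struggles to source 2 mA without sagging, but tens of microamps barely load it at all; a MOSFET's gate needs essentially no steady current, which is why both approaches come up in this thread.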
Can you share the schematic? Hand-drawn ones would also help me to understand your problem :D
What is the original voltage source?
If you want the analog control, it is possible. It will need a bigger FET. Look up "source follower" for how to hook up the FET and light bulb.
"across" for voltage, "in series" for current......
I had the probes in parallel with the bulb to measure the voltage... Do you mean that I should be doing it in series?
"When I put the multimeter in series with the light" , if the multimeter is not on current measurement, the circuit may not perform as intended.
It sounds like the Christmas light bulb wants more current than the divider can supply. For a low-voltage bulb, a simple N-channel MOSFET used to boost the current from the LDR circuit should work. It gets a little harder if the bulb runs directly off mains voltage.
If I use a FET, will this make me lose the analog control over the output?
Also, I know that the original voltage source had enough current to light the bulb, but I'm not sure if my little circuit was reducing that at all.