This experiment demonstrates how a voltage divider can scale down an input voltage to fit within an ADC's input range, enabling accurate voltage measurement.
To learn how a voltage divider can adjust an analog signal to match the input range of an ADC for accurate digital conversion.
Connect the circuit as shown in the diagram below. The voltage divider scales down the input voltage before it reaches the ADC pin on the microcontroller.
An ADC typically has a limited input range (e.g., 0-5V or 0-3.3V). Using a voltage divider, we can scale down higher input voltages to fall within this range. The output voltage from the divider is given by:
V_out = V_in * (R2 / (R1 + R2))
This equation helps us select resistor values so the maximum expected input voltage is safely within the ADC's range.
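As a worked illustration, the short C++ program below checks whether a chosen resistor pair keeps the divider output inside the ADC range. The resistor values, the 10 V maximum input, and the 5 V ADC reference used here are assumed example numbers, not values prescribed by this experiment.

```cpp
#include <iostream>

// Voltage divider output: V_out = V_in * R2 / (R1 + R2)
double dividerOut(double vIn, double r1, double r2) {
    return vIn * r2 / (r1 + r2);
}

int main() {
    // Assumed example values: measure up to 10 V with a 0-5 V ADC
    // using R1 = R2 = 10 kOhm.
    const double r1 = 10000.0, r2 = 10000.0;
    const double vMaxExpected = 10.0;   // highest input voltage we expect
    const double adcFullScale = 5.0;    // ADC reference (full-scale) voltage

    double vOutMax = dividerOut(vMaxExpected, r1, r2);
    std::cout << "Divider output at " << vMaxExpected << " V input: "
              << vOutMax << " V\n";
    std::cout << (vOutMax <= adcFullScale
                      ? "Within ADC range\n"
                      : "Exceeds ADC range - increase R1\n");
    return 0;
}
```

With equal resistors the divider halves the input, so a 10 V signal appears as 5 V at the ADC pin; a larger R1 relative to R2 gives more attenuation for higher input voltages.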
Record the applied input voltage and the corresponding ADC output value. Observe how the voltage divider brings the input voltage within the ADC’s range, and compare the readings with calculated values based on the resistor ratio.
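The sketch below shows one possible way to take these readings on an Arduino-style board; the pin, resistor values, 10-bit resolution, and 5 V reference are assumptions and should be replaced with the values used in your setup. It converts the raw ADC count back to the original input voltage using the divider ratio, so the measured and calculated values can be compared directly.

```cpp
// Arduino-style sketch (assumed platform: 10-bit ADC, 5 V reference).
// Pin and resistor values are placeholders for this experiment's wiring.
const int   ADC_PIN = A0;
const float R1 = 10000.0;      // top resistor of the divider (ohms)
const float R2 = 10000.0;      // bottom resistor of the divider (ohms)
const float V_REF = 5.0;       // ADC reference voltage
const int   ADC_MAX = 1023;    // full-scale count for a 10-bit ADC

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(ADC_PIN);           // raw ADC count
  float vAdc = raw * V_REF / ADC_MAX;      // voltage at the ADC pin
  float vIn  = vAdc * (R1 + R2) / R2;      // reconstructed input voltage

  Serial.print("ADC count: "); Serial.print(raw);
  Serial.print("  V_adc: ");   Serial.print(vAdc, 3);
  Serial.print("  V_in: ");    Serial.println(vIn, 3);
  delay(500);
}
```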
This experiment shows that a voltage divider allows higher input voltages to be measured safely and accurately by scaling them into the ADC's input range. This approach is valuable for measuring analog signals that exceed the ADC's native voltage range.