
When an AC source is connected to a resistor, what is the phase difference between the current and the voltage?

A

`90^@`

B

`180^@`

C

`0^@`

D

`60^@`

Text Solution

The correct Answer is: C (`0^@`)
To determine the phase difference between the current and voltage when an AC source is connected to a resistor, we can follow these steps:

### Step-by-Step Solution

1. **Understand the circuit**: When an AC voltage source is connected to a resistor, the circuit is purely resistive; the only component is the resistor (R).
2. **Relate current and voltage**: In a purely resistive circuit, the current (I) and the voltage (V) across the resistor are directly proportional at every instant, and the voltage across the resistor equals the source voltage.
3. **Draw the phasors**: To analyse the phase relationship, represent the current and voltage as phasors, with the current phasor along the positive x-axis.
4. **Place the voltage phasor**: Because the circuit is purely resistive, the voltage phasor points in the same direction as the current phasor.
5. **Compute the phase difference**: The phase difference (φ) between the current and voltage is the angle between the two phasors. Since both phasors point in the same direction, that angle is `0^@`.
6. **Conclusion**: Therefore, the phase difference between the current and voltage when an AC source is connected to a resistor is `0^@`.

### Final Answer
The phase difference between the current and voltage when an AC source is connected to a resistor is **`0^@`** (option C).
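As a worked check, applying Ohm's law to an assumed sinusoidal source (the specific waveform `v(t) = V_0 sin(omega t)` is an illustrative choice, not given in the question) makes the zero phase difference explicit:

```latex
v(t) = V_0 \sin(\omega t)
\quad\Rightarrow\quad
i(t) = \frac{v(t)}{R} = \frac{V_0}{R}\sin(\omega t) = I_0 \sin(\omega t + \phi),
\qquad \phi = 0^\circ
```

Both waveforms vary as `sin(omega t)`, so they reach their peaks and zero crossings at the same instants. A minimal numerical sketch in Python (the values `V0 = 10` V, `R = 5` Ω, and `f = 50` Hz are arbitrary assumptions) confirms this by comparing the phase of the fundamental Fourier component of each waveform:

```python
import numpy as np

# Illustrative parameters (assumed, not from the question).
V0, R, f = 10.0, 5.0, 50.0
w = 2 * np.pi * f

# Sample exactly one period so the fundamental lands in FFT bin 1.
t = np.linspace(0, 1 / f, 1000, endpoint=False)

v = V0 * np.sin(w * t)  # voltage across the resistor
i = v / R               # Ohm's law: current through the resistor

# Phase of each waveform from its fundamental Fourier component.
phase_v = np.angle(np.fft.rfft(v)[1])
phase_i = np.angle(np.fft.rfft(i)[1])

print(np.degrees(phase_i - phase_v))  # -> ~0.0 degrees
```

The printed difference is 0 degrees, matching option C.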