
If `alpha` and `beta` be the roots of the equation `x^2-1=0`, then show that
`alpha+beta=(1)/(alpha)+(1)/(beta)`

Text Solution


The correct Answer is:
Since `alpha` and `beta` are the roots of `x^2-1=0`, the sum and product of the roots give `alpha+beta=0` and `alpha beta=-1`.
Then `(1)/(alpha)+(1)/(beta)=(beta+alpha)/(alpha beta)=(0)/(-1)=0=alpha+beta`.
Therefore `alpha+beta=(1)/(alpha)+(1)/(beta)`
Hence proved
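The identity can also be checked symbolically. A minimal sketch using sympy (the variable names `alpha`, `beta`, `lhs`, and `rhs` are my own, not from the solution):

```python
from sympy import symbols, solve

x = symbols('x')

# The roots of x^2 - 1 = 0 are -1 and 1
alpha, beta = solve(x**2 - 1, x)

# Left side: alpha + beta (sum of roots is -b/a = 0 here)
lhs = alpha + beta

# Right side: 1/alpha + 1/beta = (alpha + beta)/(alpha*beta) = 0/(-1) = 0
rhs = 1/alpha + 1/beta

print(lhs == rhs)  # True
```

Both sides evaluate to 0, matching the proof above.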

Similar Questions


If `alpha` and `beta` are the roots of the equation `4x^(2)+3x+7=0`, then `(1)/(alpha)+(1)/(beta)=`

If `alpha` and `beta` are the roots of the quadratic equation `x^(2)-3x-2=0`, then `(alpha)/(beta)+(beta)/(alpha)=`

If `alpha` and `beta` be the roots of the equation `x^(2)+3x+1=0`, then the value of `((alpha)/(1+beta))^(2)+((beta)/(alpha+1))^(2)`

Let `alpha` and `beta` be the roots of the equation `x^(2)+x+1=0`. Then for `y ne 0` in `R`, `|(y+1,alpha,beta),(alpha,y+beta,1),(beta,1,y+alpha)|` is equal to:

If `alpha` and `beta` are the roots of the equation `x^(2)-x-4=0`, find the value of `(1)/(alpha)+(1)/(beta)-alpha beta`:

If `alpha` and `beta` are the roots of the equation `3x^(2)+8x+2=0`, then `((1)/(alpha)+(1)/(beta))=?`

If `alpha, beta` are the roots of the equation `x^(2)-3x+1=0`, then the equation with roots `(1)/(alpha-2), (1)/(beta-2)` will be