Class 12 Maths

Show that the equation `A^2/(x-a)+B^2/(x-b)+C^2/(x-c)+...+H^2/(x-h)=x+1`, where A, B, C, ..., H and a, b, c, ..., h are real, cannot have imaginary roots.

Text Solution
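
A standard argument, sketched below: assume an imaginary root exists and compare imaginary parts on both sides of the equation.

```latex
\text{Suppose } x = p + iq \text{ with } q \neq 0 \text{ satisfies }
\sum \frac{A^2}{x-a} = x + 1.

\text{Rationalising each term,}
\quad
\frac{A^2}{(p-a)+iq}
= \frac{A^2 (p-a)}{(p-a)^2+q^2} \;-\; i\,\frac{A^2 q}{(p-a)^2+q^2}.

\text{Equating imaginary parts of both sides:}
\quad
-q \sum \frac{A^2}{(p-a)^2+q^2} = q
\;\Longrightarrow\;
q \left[ \sum \frac{A^2}{(p-a)^2+q^2} + 1 \right] = 0.

\text{The bracket is a sum of non-negative terms plus } 1,
\text{ hence strictly positive, so } q = 0,
\text{ contradicting } q \neq 0.
\text{ Therefore every root is real.}
```

The same comparison of imaginary parts works with any real constant on the right-hand side, which is why the variant with `=k` below is proved identically.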

Similar Questions

Explore conceptually related problems

Show that the equation `A^2/(x-a)+B^2/(x-b)+C^2/(x-c)+...+H^2/(x-h)=k` has no imaginary root, where A, B, C, ..., H, a, b, c, ..., h and k are real.

If (a+b+c)>0 and a < 0 < b < c, then the equation a(x-b)(x-c)+b(x-c)(x-a)+c(x-a)(x-b)=0 has (i) real and distinct roots (ii) imaginary roots (iii) negative product of roots (iv) positive product of roots

Show that the roots of the equation (x-a)(x-b)+(x-b)(x-c)+(x-c)(x-a)=0 are always real, and that they cannot be equal unless a=b=c.

If a, b, c, d are unequal positive numbers, then the roots of the equation x/(x-a)+x/(x-b)+x/(x-c)+x+d=0 are necessarily (A) all real (B) all imaginary (C) two real and two imaginary (D) at least two real

The roots of the equation (b-c)x^(2)+(c-a)x+(a-b)=0

If a, b, c are real, prove that the roots of the equation 1/(x-a)+1/(x-b)+1/(x-c)=0 are always real, and that the equation cannot have roots if a=b=c.

The roots of the equation a^2x^2+(a+b)x-b^2=0 (a, b != 0) are: (A) real and different (B) real and equal (C) imaginary (D) None