
Prove that the roots of the equation `bx^2+(b-c)x+(b-c-a)=0` are real if those of the equation `ax^2+2bx+b=0` are imaginary, and vice versa, where `a, b, c in R`.
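One standard approach, sketched below using the usual discriminant criterion (a quadratic has imaginary roots exactly when its discriminant is negative): the two discriminants sum to a perfect square, so they can never both be negative.

```latex
% A sketch, not the only possible proof: compare the two discriminants.
% D_1 belongs to ax^2 + 2bx + b = 0, D_2 to bx^2 + (b-c)x + (b-c-a) = 0.
\begin{align*}
D_1 &= (2b)^2 - 4ab = 4b(b-a),\\
D_2 &= (b-c)^2 - 4b(b-c-a)\\
    &= \bigl[(b-c) - 2b\bigr]^2 - 4b^2 + 4ab\\
    &= (b+c)^2 - 4b(b-a).
\end{align*}
% Hence D_1 + D_2 = (b+c)^2 >= 0. If the roots of ax^2 + 2bx + b = 0 are
% imaginary, then D_1 < 0, so D_2 = (b+c)^2 - D_1 > 0 and the roots of
% bx^2 + (b-c)x + (b-c-a) = 0 are real (indeed unequal). Symmetrically,
% D_2 < 0 forces D_1 = (b+c)^2 - D_2 > 0, which gives the converse.
```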


Similar Questions


Prove that the roots of the equation `bx^2+(b-c)x+(b-c-a)=0` are real if those of `ax^2+2bx+2b=0` are imaginary, and vice versa.

If the roots of the equation `ax^2+2(a+b)x+(a+2b+c)=0` are imaginary, then the roots of the equation `ax^2+2bx+c=0` are

If a root of the equation `ax^2+bx+c=0` is the reciprocal of a root of the equation `a'x^2+b'x+c'=0`, then


If `0 < a < b < c` and the roots `alpha, beta` of the equation `ax^2+bx+c=0` are imaginary, then

If the roots of the equation `ax^2+bx+c=0` (where `a, b, c in R` and `a != 0`) are non-real and `a+c > b`, then

If the expression `ax^2+2bx+b` has the same sign as `b` for every real `x`, then the roots of the equation `bx^2+(b-c)x+(b-c-a)=0` are (A) real and equal (B) real and unequal (C) imaginary (D) none of these

If the equation `ax^2+2bx-3c=0` has no real roots and `3c/4 < a+b`, then