
Show that the equation `2(a^2+b^2)x^2+2(a+b)x+1=0` has no real roots when `a != b`.
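A discriminant check settles the question; a sketch of the computation for the quadratic above:

```latex
\begin{align*}
D &= \bigl[2(a+b)\bigr]^{2} - 4 \cdot 2(a^{2}+b^{2}) \cdot 1 \\
  &= 4\bigl[(a+b)^{2} - 2(a^{2}+b^{2})\bigr] \\
  &= 4\bigl[a^{2} + 2ab + b^{2} - 2a^{2} - 2b^{2}\bigr] \\
  &= -4(a-b)^{2}.
\end{align*}
```

Since `(a-b)^2 > 0` whenever `a != b`, the discriminant is strictly negative, so the equation has no real roots.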


Similar Questions


Show that the equation e^(x-1)+x-2=0 has no real root which is less than 1.

Prove that the equation x^(2)(a^(2)+b^(2))+2x(ac+bd)+(c^(2)+d^(2))=0 has no real root if ad != bc.

Show that the equation e^(x-1)+x-2=0 has no real root greater than 1.

Show that the equation 2x^(2)-6x+3=0 has real roots and find these roots.

Prove that the quadratic equation 2(a^(2)+b^(2))x^(2)+2(a+b)x+1=0 will have no real root if a != b.

If a, b, c in R and the quadratic equation x^2 + (a + b) x + c = 0 has no real roots then

If a, b, c, d in R, then the equation (x^2+ax-3b)(x^2-cx+b)(x^2-dx+2b)=0 has a) 6 real roots b) at least 2 real roots c) 4 real roots d) none of these
