
Show that the equation `2(a^2+b^2)x^2+2(a+b)x+1=0` has no real roots when `a != b`.
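A standard way to show this is via the discriminant; the derivation below is a sketch added for clarity, not part of the original page:

```latex
\text{For } 2(a^2+b^2)x^2 + 2(a+b)x + 1 = 0,\ \text{the discriminant is} \\
D = \bigl[2(a+b)\bigr]^2 - 4 \cdot 2(a^2+b^2) \cdot 1
  = 4(a^2 + 2ab + b^2) - 8(a^2+b^2) \\
\phantom{D} = -4(a^2 - 2ab + b^2) = -4(a-b)^2. \\
\text{If } a \neq b,\ \text{then } (a-b)^2 > 0,\ \text{so } D < 0
\text{ and the equation has no real roots.}
```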


Similar Questions

Explore conceptually related problems

Show that the equation 2(a^2+b^2)x^2+2(a+b)x+1=0 has no real roots, when a != b.

Show that the equation 2(a^(2)+b^(2))x^(2)+2(a+b)x+1=0 has no real roots, when a != b.

Show that the equation 2(a^(2)+b^(2))x^(2)+2(a+b)x+1=0 has no real roots when a != b.

Show that the equation e^(x-1)+x-2=0 has no real root which is less than 1.

Prove that the equation x^(2)(a^(2)+b^(2))+2x(ac+bd)+(c^(2)+d^(2))=0 has no real root if ad != bc.

Show that the equation e^(x-1)+x-2=0 has no real root greater than 1.

Show that the equation 2x^(2)-6x+3=0 has real roots and find these roots.

Prove that the quadratic equation 2(a^(2)+b^(2))x^(2)+2(a+b)x+1=0 will have no real root if a != b.

If a, b, c in R and the quadratic equation x^2 + (a + b) x + c = 0 has no real roots then

Show that the equation 2x^(2)-6x+7=0 has no real root.