Show that the equation `2(a^2 + b^2)x^2 + 2(a+b)x + 1 = 0` has no real roots when `a != b`.
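
For reference, here is a brief sketch of the standard discriminant argument. Writing the equation as `Ax^2 + Bx + C = 0` with `A = 2(a^2 + b^2)`, `B = 2(a+b)` and `C = 1`:

```latex
D = B^2 - 4AC
  = [2(a+b)]^2 - 4 \cdot 2(a^2+b^2) \cdot 1
  = 4a^2 + 8ab + 4b^2 - 8a^2 - 8b^2
  = -4(a^2 - 2ab + b^2)
  = -4(a-b)^2 .
```

Since `a != b`, we have `(a-b)^2 > 0`, so `D < 0` and the equation has no real roots.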

Similar Questions

Show that the equation `2x^2 - 6x + 3 = 0` has real roots and find these roots.

The roots of the equation `(p-2)x^2 + 2(p-2)x + 2 = 0` are not real when:

Show that the equation `x^(2n) - 1 = 0` has only two real roots.

Show that `(x^2 + 1)^2 - x^2 = 0` has no real roots.

Show that the equation `x^2 + 2x + 1 + sqrt(x) = 0` has no real roots.

Show that the equation `x^2 + 6x + 6 = 0` has real roots and solve it.

If the equation `x^2 - kx + 1 = 0` has no real roots, then

Show that the equation `x^2 + 5x - 6 = 0` has real roots.