
Prove that the roots of `(x-a)(x-b)=h^(2)` are always real.
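One way to see this (a proof sketch, assuming `a`, `b`, `h` are real, as the question implies): expand the equation into standard quadratic form and check that its discriminant is a sum of two squares.

```latex
% Expanding (x-a)(x-b) = h^2 gives the quadratic
%   x^2 - (a+b)x + (ab - h^2) = 0.
% Its discriminant is
\begin{align*}
\Delta &= (a+b)^2 - 4\,(ab - h^2) \\
       &= a^2 + 2ab + b^2 - 4ab + 4h^2 \\
       &= (a-b)^2 + 4h^2 \;\ge\; 0,
\end{align*}
% a sum of two squares of real numbers. Since \Delta \ge 0,
% both roots are always real (and equal only when a = b and h = 0).
```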


Similar Questions


The roots of `(x-a)(x-b)=b^(2)` are…

If `k in R`, then the roots of `(x-2)(x-3)=k^(2)` are always

Assertion (A): If `3+iy` is a root of `x^(2)+ax+b=0`, then `a=-6`. Reason (R): If `a, b, c` are real and `Delta < 0`, the roots of `ax^(2)+bx+c=0` are conjugate complex numbers.

If `a, b, c` are real, then `(b-x)^(2)-4(a-x)(c-x)=0` will always have roots which are

If `alpha, beta` are the real roots of `x^(2)+px+q=0` and `alpha^(4), beta^(4)` are the roots of `x^(2)-rx+s=0`, then the equation `x^(2)-4qx+2q^(2)-r=0` always has

If `alpha, beta` are the roots of `ax^(2)+bx+c=0` and `alpha+h, beta+h` are the roots of `px^(2)+qx+r=0`, then `h=`

If `a, b` are the roots of `x^(2)+x+1=0`, then `a^(2)+b^(2)=`

If `alpha, beta` are the roots of `x^(2)+bx+c=0` and `alpha+h, beta+h` are the roots of `x^(2)+qx+r=0`, then `h=`