
If `b^(2)<2ac`, then prove that `ax^(3)+bx^(2)+cx+d=0` has exactly one real root.

Text Solution

Verified by Experts

The correct Answer is:
Let `alpha, beta, gamma` be the (possibly complex) roots of `ax^(3)+bx^(2)+cx+d=0`. By Vieta's formulas, `alpha+beta+gamma=-b/a` and `alpha beta+beta gamma+gamma alpha=c/a`. Therefore
`alpha^(2)+beta^(2)+gamma^(2)=(alpha+beta+gamma)^(2)-2(alpha beta+beta gamma+gamma alpha)=(b^(2))/(a^(2))-(2c)/(a)=(b^(2)-2ac)/(a^(2))`
Since `b^(2)<2ac`, the right-hand side is negative. But if all three roots were real, the sum of their squares would be non-negative. Hence not all roots can be real, and since non-real roots of a real polynomial occur in conjugate pairs, exactly two roots are non-real. So the equation has exactly one real root.
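As a quick numerical sanity check (not part of the original solution), the sketch below uses `numpy.roots` to confirm the claim for one set of illustrative coefficients; the choice `a=b=c=d=1` is an assumption made here purely because it satisfies `b^(2)<2ac`.

```python
import numpy as np

# Illustrative coefficients (assumed): a = b = c = d = 1,
# so b^2 = 1 < 2 = 2ac, the hypothesis of the problem.
a, b, c, d = 1.0, 1.0, 1.0, 1.0
assert b**2 < 2 * a * c

# Roots of a*x^3 + b*x^2 + c*x + d = 0.
roots = np.roots([a, b, c, d])

# Count roots with (numerically) zero imaginary part.
real_roots = [r.real for r in roots if abs(r.imag) < 1e-9]

print(roots)       # a complex-conjugate pair plus one real root
print(real_roots)  # exactly one real root (here x = -1)
```

For these coefficients the cubic factors as `(x+1)(x^(2)+1)`, so the real root is `x=-1` and the conjugate pair is `+-i`, matching the argument above: the sum of squares of the roots is `1-1-1=-1=(b^(2)-2ac)/(a^(2))`.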


Similar Questions

Explore conceptually related problems

`(a+b)^(2)+(a-b)^(2)` = ….

If `a/b=3/2`, then `(a^(2)+b^(2))/(a^(2)-b^(2))` = ?

Show that `(a-b)^(2)`, `(a^(2)+b^(2))` and `(a+b)^(2)` are in AP.

`(a-b)^(2)+2ab` = ?
A. `a^(2)-b^(2)`
B. `a^(2)+b^(2)`
C. `a^(2)-4ab+b^(2)`
D. `a^(2)-2ab+b^(2)`