
If `ax^(2)+bx+c=0, a, b, c in R`, then find the condition that this equation would have at least one root in (0, 1).
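A minimal numerical sketch of the standard sufficient condition (not derived on this page): since `f(x)=ax^(2)+bx+c` is continuous, the intermediate value theorem guarantees a root in (0, 1) whenever `f(0)*f(1) = c(a+b+c) < 0`. The helper name below is illustrative, not from the source.

```python
import math

def roots_in_unit_interval(a, b, c):
    """Return the real roots of a*x^2 + b*x + c = 0 that lie in (0, 1)."""
    disc = b * b - 4 * a * c
    if disc < 0:
        return []  # no real roots at all
    sq = math.sqrt(disc)
    candidates = [(-b + sq) / (2 * a), (-b - sq) / (2 * a)]
    return [r for r in candidates if 0 < r < 1]

# Example satisfying the IVT condition f(0)*f(1) = c*(a+b+c) < 0:
a, b, c = 1.0, -3.0, 1.0      # f(0) = 1 > 0, f(1) = -1 < 0
assert c * (a + b + c) < 0
assert roots_in_unit_interval(a, b, c)   # a root exists in (0, 1)
```

Note that the condition is sufficient, not necessary: an equation with both roots in (0, 1) has `f(0)*f(1) > 0`, so the full answer to the question also covers that case separately.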



Similar Questions


If `a+b+c=0`, then show that the quadratic equation `3ax^(2)+2bx+c=0` has at least one root in [0, 1].

If `2a+3b+6c=0`, then show that the equation `ax^(2)+bx+c=0` has at least one real root between 0 and 1.

The quadratic equation `3ax^(2)+2bx+c=0` has at least one root between 0 and 1, if

If `a, b, c in R` and `a+b+c=0`, then the quadratic equation `3ax^(2)+2bx+c=0` has (a) at least one root in [0, 1] (b) at least one root in [1, 2] (c) at least one root in [(3)/(2), 2] (d) none of these

The equation `ax^(2)+bx+c=0`, where a, b, c are the sides of a `DeltaABC`, and the equation `x^(2)+sqrt(2)x+1=0` have a common root. Find the measure of angle C.

If a,b,c in R and (a+c)^(2)

If `a+b+2c=0, c!=0`, then the equation `ax^(2)+bx+c=0` has (A) at least one root in (0, 1) (B) at least one root in (0, 2) (C) at least one root in (-1, 1) (D) none of these

If `a, b, c in R, a!=0` and the quadratic equation `ax^(2)+bx+c=0` has no real roots, then