
Let `f(x)=a_5x^5+a_4x^4+a_3x^3+a_2x^2+a_1x`, where the `a_i`'s are real and `f(x)=0` has a positive root `alpha_0`. Then:

A

f'(x) = 0 has a root `alpha_1` such that `0 < alpha_1 < alpha_0`

B

f' (x) = 0 has at least one real root

C

f''(x) = 0 has at least one real root

D

none of these

Text Solution

Verified by Experts

The correct answer is: A, B, C
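
A short justification (a sketch of the standard argument; the step for option C assumes `a_5 != 0`, i.e. `f` is genuinely of degree 5):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% f(x) = a_5 x^5 + a_4 x^4 + a_3 x^3 + a_2 x^2 + a_1 x has no constant term,
% so x = 0 is a root of f(x) = 0 alongside the given positive root alpha_0.
\[
  f(0) = 0 = f(\alpha_0), \quad \alpha_0 > 0
  \;\Longrightarrow\;
  \exists\, \alpha_1 \in (0, \alpha_0) \text{ with } f'(\alpha_1) = 0
  \quad \text{(Rolle's theorem; options A and B).}
\]
% For option C: assuming a_5 is nonzero, the second derivative
\[
  f''(x) = 20 a_5 x^3 + 12 a_4 x^2 + 6 a_3 x + 2 a_2
\]
% is a real polynomial of odd degree, so it has at least one real root.
\end{document}
```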


Similar Questions

Explore conceptually related problems

Let `f(x)=ax^5+bx^4+cx^3+dx^2+ex`, where `a, b, c, d, e in R` and `f(x)=0` has a positive root `alpha`. Then,

Let `F(x)=1+f(x)+(f(x))^2+(f(x))^3`, where `f(x)` is an increasing differentiable function. If `F(x)=0` has a positive root, then

If `f(x)=ax^2+bx+c` and `g(x)=-ax^2+bx+c`, where `ac != 0`, then prove that `f(x)g(x)=0` has at least two real roots.

Let `f(x)=ax^3+bx^2+cx+1` have extrema at `x=alpha, beta` such that `alpha beta < 0` and `f(alpha) f(beta) < 0`. Then the equation `f(x)=0` has (a) three distinct real roots (b) one positive root if `f(alpha) < 0` and `f(beta) > 0` (c) one positive root if `f(alpha) > 0` and `f(beta) < 0` (d) none of these

If the quadratic equation `f(x)=x^2+ax+1=0` has two distinct positive roots, then

Consider the function `f(x)=x+cos x-a`. The values of `a` for which `f(x)=0` has exactly one positive root are

Let `f(x)=x^2+ax+b`, where `a, b in R`. If `f(x)=0` has all its roots imaginary, then the roots of `f(x)+f'(x)+f''(x)=0` are

Let `f(x)=ax^3+bx^2+cx+d`, `a != 0`. If `x_1` and `x_2` are the real and distinct roots of `f'(x)=0`, then `f(x)=0` will have three real and distinct roots if

If `f(x)` is a real-valued polynomial and `f(x)=0` has real and distinct roots, show that `(f'(x))^2-f(x)f''(x)=0` cannot have real roots.

Let `f(x)=a_0+a_1x^2+a_2x^4+...+a_nx^(2n)`, where `0 < a_0 < a_1 < a_2 < ... < a_n`. Then `f(x)` has