Let `f(x) = ax^3 + bx^2 + cx + d`, `a != 0`. If `x_1` and `x_2` are the real and distinct roots of `f'(x) = 0`, then `f(x) = 0` will have three real and distinct roots if
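A brief sketch of the standard condition (the answer options are not shown on this page; the result is consistent with the first similar problem listed below): since `x_1` and `x_2` are distinct roots of `f'(x) = 0`, `f` has a local extremum at each of them. The cubic crosses the x-axis three times exactly when these two extreme values have opposite signs, that is, when

`f(x_1) * f(x_2) < 0`

By the intermediate value theorem this places one root in each of the intervals `(-oo, x_1)`, `(x_1, x_2)` and `(x_2, oo)`, giving three real and distinct roots.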

Similar Questions

Let f(x) = x^3 + ax^2 + bx + c be the given cubic polynomial and f(x) = 0 be the corresponding cubic equation, where a, b, c in R. Now, f'(x) = 3x^2 + 2ax + b. Let D = 4a^2 - 12b = 4(a^2 - 3b) be the discriminant of the equation f'(x) = 0. If D = 4(a^2 - 3b) > 0 and f(x_1) * f(x_2) < 0, where x_1, x_2 are the roots of f'(x) = 0, then
(a) f(x) has all real and distinct roots
(b) f(x) has three real roots but one of the roots would be repeated
(c) f(x) would have just one real root
(d) None of the above

If f(x) = x^3 - 3x + 1, then the number of distinct real roots of the equation f(f(x)) = 0 is

Let f(x) = x^2 - 5x + 6, g(x) = f(|x|), h(x) = |g(x)|. The set of values of mu such that the equation h(x) = mu has exactly 8 real and distinct roots contains (a) 0 (b) 1/8 (c) 1/16 (d) 1/4

If f(x) is a real-valued polynomial and f(x) = 0 has real and distinct roots, show that the equation (f'(x))^2 - f(x) f''(x) = 0 cannot have real roots.

If a < b < c < d, then show that (x-a)(x-c) + 3(x-b)(x-d) = 0 has real and distinct roots.
