
Prove that, if f(x) = a_0 + a_1x^2 + a_2x^4 and 0 < a_0 < a_1 < a_2, then f(x) has only one minimum, at x = 0.

Text Solution

The correct Answer is:
x = 0
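Only the final answer is given above; the working is a short calculus argument. A sketch of the standard derivation (reconstructed here, not part of the original solution text) is:

```latex
\[
f'(x) = 2a_1 x + 4a_2 x^3 = 2x\,(a_1 + 2a_2 x^2).
\]
Since $a_1 > 0$ and $a_2 > 0$, the factor $a_1 + 2a_2 x^2$ is positive for every
real $x$, so $f'(x) = 0$ only at $x = 0$; moreover $f'(x) < 0$ for $x < 0$ and
$f'(x) > 0$ for $x > 0$. Also
\[
f''(x) = 2a_1 + 12a_2 x^2 \quad\Rightarrow\quad f''(0) = 2a_1 > 0,
\]
so $x = 0$ is the unique critical point of $f$ and is a minimum, with minimum
value $f(0) = a_0$.
```

Note that only the positivity of a_1 and a_2 is actually used; the full ordering 0 < a_0 < a_1 < a_2 given in the problem is stronger than necessary.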


Similar Questions


Prove that ln(1 + x) < x for x > 0.

Prove that x/(1 + x) < log(1 + x) < x for x > 0.

If f(x) = ∫ dx/(1 + x^2)^(3/2) and f(0) = 0, then what is the value of f(1)?

Let f(x) = e^x g(x), g(0) = 2 and g'(0) = 1; then find f'(0).

Show that f(x) = 5x - 4 when 0 < x <= 1, and f(x) = 4x^2 - 3x when 1 < x < 2, is continuous at x = 1.