Let `f(x)=x^(2)+bx+c` and `g(x)=x^(2)+b_(1)x+c_(1)`
Let the real roots of `f(x)=0` be `alpha, beta` and the real roots of `g(x)=0` be `alpha+k, beta+k` for the same constant `k`. The least value of `f(x)` is `-1/4` and the least value of `g(x)` occurs at `x=7/2`.
The roots of `f(x)=0` are
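A sketch of the standard argument, using the fact that a monic quadratic `x^(2)+px+q` has least value `-(p^(2)-4q)/4` attained midway between its roots:

```latex
% Least value of the monic quadratic f is -(b^2-4c)/4 = -1/4,
% so the discriminant is 1 and the roots differ by 1:
b^2 - 4c = 1 \quad\Longrightarrow\quad (\alpha - \beta)^2 = 1 .
% g is monic with roots \alpha+k, \beta+k, i.e. g(x) = f(x-k):
% its roots also differ by 1, and its minimum lies at their midpoint,
\frac{(\alpha+k)+(\beta+k)}{2} = \frac{7}{2},
% so the roots of g(x)=0 are 3 and 4,
% and the roots of f(x)=0 are 3-k and 4-k.
```

Note that the data quoted here pins down the roots of `g(x)=0` as `3` and `4`, but determines the roots of `f(x)=0` only up to the shift `k`; in the full multi-part question, `k` is fixed by an additional condition.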


Similar Questions

Let f(x)=x^(2)+b_(1)x+c_(1), g(x)=x^(2)+b_(2)x+c_(2). Real roots of f(x)=0 be alpha,beta and real roots of g(x)=0 be alpha+gamma,beta+gamma. Least value of f(x) is -(1)/(4); least value of g(x) occurs at x=(7)/(2)

If alpha,beta are the roots of x^(2)-k(x+1)-c=0 then (1+alpha)(1+beta)=
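This one falls to Vieta's formulas directly; a quick check, after rewriting the equation in standard form:

```latex
x^2 - k(x+1) - c = 0 \;\Longleftrightarrow\; x^2 - kx - (k+c) = 0,
% so by Vieta: \alpha+\beta = k and \alpha\beta = -(k+c), giving
(1+\alpha)(1+\beta) = 1 + (\alpha+\beta) + \alpha\beta
                    = 1 + k - (k+c) = 1 - c .
```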

Let f(x)=x^(2)-ax+3a-4 and let g(a) be the least value of f(x); then the greatest value of g(a) is

Let f(x)=ax^(2)+bx+c, g(x)=ax^(2)+qx+r, where a,b,c,q,r in R and a<0. If alpha,beta are the roots of f(x)=0 and alpha+delta,beta+delta are the roots of g(x)=0, then

If f(x)=ax^(2)+bx+c, g(x)=-ax^(2)+bx+c, where ac != 0, then prove that f(x)g(x)=0 has at least two real roots.

Let f(x)=(x+1)(x+2)(x+3)…..(x+100) and g(x)=f(x)f''(x)-(f'(x))^(2). Let n be the number of real roots of g(x)=0, then:

If alpha and beta are the roots of x^(2)+bx+c=0 and alpha+k and beta+k are the roots of x^(2)+qx+r=0 then k=
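Summing the roots of each equation via Vieta gives `k` immediately:

```latex
\alpha + \beta = -b, \qquad (\alpha+k) + (\beta+k) = -q ,
% subtracting the first sum from the second: 2k = -q + b,
\;\Longrightarrow\; k = \frac{b-q}{2}\, .
```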

Let f(x)=ax^(2)+bx+c, where a,b,c in R and a != 0. If f(2)=3f(1) and 3 is a root of the equation f(x)=0, then the other root of f(x)=0 is

If f(x)=Pi_(k=1)^(999)(x^(2)-47x+k), then the product of all real roots of f(x)=0 is