
If `f(x) = ax^(2) + bx + c` and `g(x) = -ax^(2) + bx + c`, where `ac ne 0`, then prove that `f(x)g(x) = 0` has at least two real roots.
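One standard approach (a sketch of the usual discriminant argument; the solution posted with this question may differ) is to note that `f(x)g(x) = 0` forces `f(x) = 0` or `g(x) = 0`, and then compare the discriminants of the two quadratic factors:

```latex
% Discriminants of the two factors
%   f(x) = ax^2 + bx + c   and   g(x) = -ax^2 + bx + c
\[
D_f = b^2 - 4ac, \qquad
D_g = b^2 - 4(-a)c = b^2 + 4ac .
\]
% Since ac \ne 0, exactly one of ac > 0 or ac < 0 holds:
\[
ac > 0 \;\Longrightarrow\; D_g = b^2 + 4ac > 0, \qquad
ac < 0 \;\Longrightarrow\; D_f = b^2 - 4ac > 0 .
\]
```

In either case one of the two factors has a positive discriminant and therefore two distinct real roots, so `f(x)g(x) = 0` has at least two real roots.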


Similar Questions

If p(x) = ax^2 + bx + c and Q(x) = -ax^2 + dx + c, where ac ne 0, then p(x)·Q(x) = 0 has at least ………… real roots.

If f(x) = ax^(2) + bx + c, where a ne 0 and b, c in R, then which of the following conditions implies that f(x) has real roots?

Let f(x) = ax^(2) + bx + c with f(-1) lt 1, f(1) gt -1, f(3) lt -4 and a ne 0, then

Let f(x) = ax^(2) - bx + c^(2), b ne 0 and f(x) ne 0 for all x in R. Then

If ax^(2) + bx + c and bx^(2) + ax + c have a common factor x + 1, then show that c = 0 and a = b.

Let f(x) = ax^(2) + bx + c and g(x) = ax^(2) + qx + r, where a, b, c, q, r in R and a lt 0. If alpha, beta are the roots of f(x) = 0 and alpha + delta, beta + delta are the roots of g(x) = 0, then

The equation 4ax^2 + 3bx + 2c = 0, where a, b, c are real and a + b + c = 0, has

If f(x) = x^2 + x + 3/4 and g(x) = x^2 + ax + 1 are two real functions, then the range of a for which g(f(x)) = 0 has no real solution is (A) (-oo, -2) (B) (-2, 2) (C) (-2, oo) (D) (2, oo)