
Let `f(x) = x^2 + ax + b`, where `a, b in R`. If `f(x) = 0` has all its roots imaginary, then the roots of `f(x) + f'(x) + f''(x) = 0` are
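A quick numerical sketch of the claim (the answer is that the resulting roots are also imaginary, i.e. non-real) can be made with sympy. The specific coefficients `a = 0, b = 1` below are an illustrative assumption, chosen so that `f(x) = x^2 + 1` has purely imaginary roots:

```python
import sympy as sp

x = sp.symbols('x')

# Assumed example coefficients: a = 0, b = 1, so f(x) = x^2 + 1
# has roots +-i, i.e. all roots imaginary.
f = x**2 + 1

# Form f(x) + f'(x) + f''(x) = x^2 + 2x + 3
g = f + sp.diff(f, x) + sp.diff(f, x, 2)

roots = sp.solve(sp.Eq(g, 0), x)
print(roots)  # both roots have a nonzero imaginary part
```

Here the discriminant of `x^2 + 2x + 3` is `4 - 12 = -8 < 0`, so both roots are non-real, consistent with the expected answer for this family of problems.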

Similar Questions

Let f(x) = x^2 + ax + b, where a, b in R. If f(x) = 0 has all its roots imaginary, then the roots of f(x) + f'(x) + f''(x) = 0 are:

Let f(x) = ax^(2) + bx + c, where a, b, c in R and a != 0. If f(2) = 3f(1) and 3 is a root of the equation f(x) = 0, then the other root of f(x) = 0 is

Let f(x) = ax^(2) + bx + c, where a, b, c in R, and suppose the equation f(x) - x = 0 has imaginary roots alpha, beta. If gamma, delta are the roots of f(f(x)) - x = 0, then |(2,alpha,delta),(beta,0,alpha),(gamma,beta,1)| is

Let f(x) = ax^(2) - b|x|, where a and b are constants. Then at x = 0, f(x) is

Let f(x) = ax^(2) - b|x|, where a and b are constants. Then at x = 0, f(x) has

If f(x) = ax^(2) + bx + c, where a, b, c in R, and the equation f(x) - x = 0 has imaginary roots alpha, beta, and gamma, delta are the roots of f(f(x)) - x = 0, then |(1,alpha,delta),(beta,0,alpha),(gamma,beta,1)| is

Let f(x) = ax^(2) - bx + c^(2), where b != 0 and f(x) != 0 for all x in R. Then