
Let f(x) = x^2 + ax + b, where a, b in R. If f(x) = 0 has all its roots imaginary, then the roots of f(x) + f'(x) + f''(x) = 0 are:
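The page shows no worked solution, but the key step is a discriminant comparison. Imaginary roots of f(x) = 0 mean a^2 - 4b < 0, and f + f' + f'' = x^2 + (a + 2)x + (a + b + 2) has discriminant (a + 2)^2 - 4(a + b + 2) = a^2 - 4b - 4, which is then also negative, so the roots of f + f' + f'' = 0 are imaginary as well. A quick symbolic check of this step, sketched with sympy (assuming it is installed):

```python
from sympy import symbols, discriminant, simplify

x, a, b = symbols('x a b')
f = x**2 + a*x + b
g = f + f.diff(x) + f.diff(x, 2)   # f + f' + f'' = x^2 + (a+2)x + (a+b+2)

d_f = discriminant(f, x)           # a**2 - 4*b
d_g = discriminant(g, x)           # (a+2)**2 - 4*(a+b+2)
print(simplify(d_g - d_f))         # -4, so d_g = d_f - 4 < d_f
```

Since d_g is strictly smaller than d_f, d_f < 0 forces d_g < 0: both quadratics have imaginary roots.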


Similar Questions


Let f(x) = x^(2) + ax + b, where a, b in R. If f(x) = 0 has all its roots imaginary, then the roots of f(x) + f'(x) + f''(x) = 0 are

Let f(x) = x^(2) + ax + b, where a, b in R. If f(x) = 0 has all its roots imaginary, then the roots of f(x) + f'(x) + f''(x) = 0 are

Let f(x) = ax^(2) + bx + c, where a, b, c in R and a != 0. If f(2) = 3f(1) and 3 is a root of the equation f(x) = 0, then the other root of f(x) = 0 is

Let f(x) = ax^(2) + bx + c, where a, b, c in R, and the equation f(x) - x = 0 has imaginary roots alpha, beta. If gamma, delta are the roots of f(f(x)) - x = 0, then the determinant |(2, alpha, delta), (beta, 0, alpha), (gamma, beta, 1)| is

Let f(x) = ax^(2) - b|x|, where a and b are constants. Then at x = 0, f(x) has

Let f(x) = (ax + b)/(cx + d). Then f(x) has

Let f(x) = ax^(2) - b|x|, where a and b are constants. Then at x = 0, f(x) is

If f(x) is a quadratic expression such that f(1) + f(2) = 0, and -1 is a root of f(x) = 0, then the other root of f(x) = 0 is: