If `a` and `c` are odd prime numbers and `ax^2 + bx + c = 0` has rational roots, where `b in I`, prove that one root of the equation will be independent of `a, b, c`.
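A minimal sketch of the standard line of attack, shown only for the factorization case that produces the coefficient-independent root; a complete proof would also examine the other ways of splitting `4ac`. Since the coefficients are integers and the roots are rational, the discriminant must be a perfect square, say `k^2`:

\[
b^2 - 4ac = k^2 \quad\Longrightarrow\quad (b - k)(b + k) = 4ac .
\]
Taking \(b - k = 2a\) and \(b + k = 2c\) gives \(b = a + c\), and then
\[
ax^2 + (a + c)x + c = (ax + c)(x + 1) = 0,
\]
so \(x = -1\) is a root that does not depend on \(a\), \(b\), or \(c\).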


Similar Questions


If a, b, c are nonzero real numbers and az^2 + bz + c + i = 0 has purely imaginary roots, then prove that a = b^2 c.

Let a and c be prime numbers and b an integer. Given that the quadratic equation ax^2 + bx + c = 0 has rational roots, show that one of the roots is independent of the coefficients. Find the two roots.

If a, b, c in R and the quadratic equation x^2 + (a+b)x + c = 0 has no real roots, then

If c, d are the roots of the equation (x-a)(x-b) - k = 0, prove that a, b are the roots of the equation (x-c)(x-d) + k = 0.

The equation ax^2 + bx + c = 0 has real and positive roots. Prove that the roots of the equation a^2 x^2 + a(3b - 2c)x + (2b - c)(b - c) + ac = 0 are real and positive.

If b^2 < 2ac, then prove that ax^3 + bx^2 + cx + d = 0 has exactly one real root.

If a + b + c = 0 and a, b, c are rational, prove that the roots of the equation (b + c - a)x^2 + (c + a - b)x + (a + b - c) = 0 are rational.

If a, b and c are the roots of the equation x^3 + 2x^2 + 1 = 0, find the determinant with rows (a, b, c), (b, c, a), (c, a, b).

If p and q are odd integers, then the equation x^2 + 2px + 2q = 0 (A) has no integral root (B) has no rational root (C) has no irrational root (D) has no imaginary root