
Let `alpha, beta` be the roots of `x^2 - ax + b = 0`, where `alpha, beta in RR`. If `alpha + 3beta = 0`, then
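A minimal worked sketch of the standard approach, using Vieta's formulas for the monic quadratic `x^2 - ax + b = 0` (so `alpha + beta = a` and `alpha beta = b`):

```latex
% Vieta's formulas for x^2 - ax + b = 0:
%   alpha + beta = a,   alpha * beta = b.
\begin{align*}
  \alpha + 3\beta = 0 &\implies \alpha = -3\beta \\
  \alpha + \beta = a  &\implies -2\beta = a \implies \beta = -\tfrac{a}{2},\quad \alpha = \tfrac{3a}{2} \\
  \alpha\beta = b     &\implies b = -\tfrac{3a^{2}}{4} \implies 3a^{2} + 4b = 0
\end{align*}
```

In particular `b = -3a^2/4 <= 0`, so `b < 0` whenever `a != 0`.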


Similar Questions


Let alpha, beta be the roots of x^2 - ax + b = 0, where alpha, beta in RR. If alpha + 3beta = 0, then (A) 3a^2 + 4b = 0 (B) 3b^2 + 4a = 0 (C) b < 0 (D) a < 0

If alpha, beta are the roots of ax^2 + bx + c = 0, then alpha beta^2 + alpha^2 beta + alpha beta =

If alpha, beta are the roots of ax^2 + bx + c = 0, then alpha^5 beta^8 + alpha beta^5 =

If alpha, beta are the roots of ax^2 + bx + c = 0, then alpha^5 beta^8 + alpha^8 beta^5 =

If alpha, beta are the roots of ax^2 + bx + c = 0, then (alpha^2 + beta^2)/(alpha^(-2) + beta^(-2)) =

If alpha and beta are the roots of x^2 - ax + b^2 = 0, then alpha^2 + beta^2 is equal to:

If tan alpha and tan beta are the roots of x^2 + ax + b = 0, then (sin(alpha + beta))/(sin alpha sin beta) is equal to

If alpha, beta are the roots of x^2 - ax + b = 0, then the equation whose roots are (alpha + beta)/alpha and (alpha + beta)/beta is

If alpha, beta are the roots of ax^2 + bx + c = 0, then (alpha^3 + beta^3)/(alpha^(-3) + beta^(-3)) =

If alpha, beta are the roots of ax^2 + bx + c = 0, then alpha^2/beta + beta^2/alpha =