
(a) If `alpha` and `beta` are the imaginary cube roots of unity, show that `alpha^2 + beta^2 + alpha beta = 0`.
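The identity follows because with `alpha = omega` and `beta = omega^2`, the expression reduces to `omega^2 + omega^4 + omega^3 = 1 + omega + omega^2 = 0`. A minimal numeric check (assuming the two non-real cube roots of unity, taken as `e^(2*pi*i/3)` and `e^(4*pi*i/3)`):

```python
import cmath

# The two imaginary (non-real) cube roots of unity
alpha = cmath.exp(2j * cmath.pi / 3)   # omega
beta = cmath.exp(4j * cmath.pi / 3)    # omega^2

# alpha^2 + beta^2 + alpha*beta should vanish
value = alpha**2 + beta**2 + alpha * beta
print(abs(value) < 1e-12)  # True
```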


Similar Questions


If `alpha` and `beta` are the complex cube roots of unity, show that `alpha^4 + beta^4 + alpha^(-1) beta^(-1) = 0`.

Factorise `alpha^2 + beta^2 + alpha beta`.

(i) If `alpha`, `beta` are the imaginary cube roots of unity, then show that `alpha^4 + beta^4 + alpha^(-1) beta^(-1) = 0`.

Show that `(sin(alpha + beta))/(sin(alpha - beta)) = 3`, given that `tan alpha = 2 tan beta`.

If `cos^2 alpha - sin^2 alpha = tan^2 beta`, then show that `tan^2 alpha = cos^2 beta - sin^2 beta`.

Using Lagrange's mean value theorem, show that `(beta - alpha)/(1 + beta^2) < tan^(-1) beta - tan^(-1) alpha < (beta - alpha)/(1 + alpha^2)`, where `beta > alpha > 0`.

If `alpha`, `beta` are the roots of `ax^2 + bx + c = 0`, then find `alpha^5 beta^8 + alpha^8 beta^5`.

Prove that `2 sin^2 beta + 4 cos(alpha + beta) sin alpha sin beta + cos 2(alpha + beta) = cos 2 alpha`.

If `tan alpha = 2 tan beta`, show that `(sin(alpha + beta))/(sin(alpha - beta)) = 3`.

If `alpha`, `beta`, `gamma` are the roots of `x^3 + px^2 + qx + r = 0`, then find `sum alpha^2 beta + sum alpha beta^2`.
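For the last problem, Vieta's formulas give `sum alpha = -p`, `sum alpha beta = q`, `alpha beta gamma = -r`, so `sum alpha^2 beta + sum alpha beta^2 = (sum alpha)(sum alpha beta) - 3 alpha beta gamma = 3r - pq`. A quick sketch verifying this on a hypothetical example (roots 1, 2, 3, i.e. `p = -6`, `q = 11`, `r = -6`):

```python
from itertools import permutations

# Hypothetical example: roots 1, 2, 3 satisfy x^3 - 6x^2 + 11x - 6 = 0,
# i.e. p = -6, q = 11, r = -6 in x^3 + p x^2 + q x + r = 0.
roots = [1, 2, 3]
p, q, r = -6, 11, -6

# Direct sum of alpha^2 * beta over all ordered pairs of distinct roots
direct = sum(a**2 * b for a, b in permutations(roots, 2))

# Closed form from Vieta: (sum of roots)(sum of pairwise products) - 3(product)
#                       = (-p)(q) - 3(-r) = 3r - pq
closed = 3 * r - p * q

print(direct, closed)  # 48 48
```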