
If `alpha` and `beta` are the imaginary cube roots of unity, then prove that `alpha^(4) + beta^(4) + alpha^(-1)beta^(-1) = 0`

Text Solution

Verified by Experts

The correct Answer is:
0

The imaginary (non-real) cube roots of unity are `omega, omega^(2)`. Let `alpha = omega, beta = omega^(2)`. Then
`alpha^(4) + beta^(4) + alpha^(-1)beta^(-1) = omega^(4) + (omega^(2))^(4) + (omega^(-1))(omega^(2))^(-1)`
`= omega + omega^(2) + omega^(-3) = omega + omega^(2) + 1 = 0`
using `omega^(3) = 1` and the identity `1 + omega + omega^(2) = 0`.
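The identity can also be checked numerically. A minimal sketch in Python, taking `omega = e^(2*pi*i/3)` as one imaginary cube root of unity:

```python
import cmath

# One imaginary cube root of unity: omega = e^(2*pi*i/3); the other is omega^2
omega = cmath.exp(2j * cmath.pi / 3)
alpha, beta = omega, omega**2

# alpha^4 + beta^4 + alpha^(-1) * beta^(-1) should vanish
result = alpha**4 + beta**4 + alpha**-1 * beta**-1
print(abs(result))  # ~0, up to floating-point rounding
```

The absolute value comes out on the order of machine precision, consistent with the exact value 0.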

Topper's Solved these Questions

  • COMPLEX NUMBERS

    CENGAGE ENGLISH|Exercise EXERCISE3.5|12 Videos
  • COMPLEX NUMBERS

    CENGAGE ENGLISH|Exercise EXERCISE3.6|10 Videos
  • COMPLEX NUMBERS

    CENGAGE ENGLISH|Exercise EXERCISE3.3|7 Videos
  • CIRCLES

    CENGAGE ENGLISH|Exercise Comprehension Type|8 Videos
  • CONIC SECTIONS

    CENGAGE ENGLISH|Exercise All Questions|101 Videos

Similar Questions

Explore conceptually related problems

If alpha, beta be the imaginary cube roots of unity, then show that alpha^4 + beta^4 + alpha^-1 beta^-1 = 0

If alpha and beta are the complex cube roots of unity, then prove that (1 + alpha) (1 + beta) (1 + alpha)^(2) (1+ beta)^(2)=1

If alpha and beta are the complex cube roots of unity, show that alpha^4+beta^4 + alpha^-1 beta^-1 = 0.

If alpha, beta, gamma in {1, omega, omega^(2)} (where omega and omega^(2) are imaginary cube roots of unity), then number of triplets (alpha, beta, gamma) such that |(a alpha + b beta + c gamma)/(a beta + b gamma + c alpha)| = 1 is

If alpha and beta are the roots of 2x^(2) + 5x - 4 = 0 then find the value of (alpha)/(beta) + (beta)/(alpha) .

If alpha and beta are the roots of the equation x^2 + 4x + 1 = 0 (alpha > beta), then find the value of 1/(alpha)^2 + 1/(beta)^2

If alpha , beta are the roots of x^2 +x+1=0 then alpha beta + beta alpha =

If alpha and beta be the roots of equation x^(2) + 3x + 1 = 0 then the value of ((alpha)/(1 + beta))^(2) + ((beta)/(1 + alpha))^(2) is equal to

If alpha and beta are the roots of the equation px^(2) + qx + 1 = 0, find alpha^(2) beta + beta^(2) alpha .

If alpha and beta are the roots of the equation x^2+sqrt(alpha)x+beta=0 then the values of alpha and beta are -