
Let `alpha` and `beta` be two roots of the equation `x^(2) + 2x + 2 = 0`. Then `alpha^(15) + beta^(15)` is equal to

A. `-512`

B. `128`

C. `512`

D. `-256`

Text Solution


The correct Answer is:
D
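Since the solution text above gives only the final answer, here is a brief verification sketch (added here, not part of the original solution): write the roots in polar form and apply De Moivre's theorem.

% Roots of x^2 + 2x + 2 = 0 are alpha = -1 + i and beta = -1 - i.
\[
\alpha = -1 + i = \sqrt{2}\,e^{i\,3\pi/4}, \qquad
\beta = -1 - i = \sqrt{2}\,e^{-i\,3\pi/4}
\]
\[
\alpha^{15} + \beta^{15}
= 2^{15/2}\bigl(e^{i\,45\pi/4} + e^{-i\,45\pi/4}\bigr)
= 2^{17/2}\cos\frac{45\pi}{4}
= 2^{17/2}\cos\frac{5\pi}{4}
= 2^{17/2}\cdot\Bigl(-\tfrac{1}{\sqrt{2}}\Bigr)
= -2^{8} = -256,
\]
which is option D.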

Similar Questions


Let alpha and beta be the roots of the equation x^(2) + x + 1 = 0 . Then

If alpha and beta are the roots of the equation 2x^(2) - 3x + 4 = 0 , then alpha^(2) + beta^(2) = ____

If alpha and beta are the roots of the equation x^(2) - 4x + 5 = 0 , then alpha^(2) + beta^(2) = ________

If alpha and beta are roots of the equation x^(2)+x+1=0 , then alpha^(2)+beta^(2) is equal to

If alpha, beta in C are distinct roots of the equation x^2+1=0 then alpha^(101)+beta^(107) is equal to

Let alpha and beta be the roots of the equation x^(2) + x + 1 = 0 . Then, for y ne 0 in R, the determinant |(y+1, alpha, beta), (alpha, y+beta, 1), (beta, 1, y+alpha)| is

If alpha and beta are the roots of the equation x^(2) + 3x + 1 = 0 , then the value of ((alpha)/(1 + beta))^(2) + ((beta)/(1 + alpha))^(2) is equal to

Let alpha and beta be the roots of the equation x^(2) + x + 1 = 0 , then alpha^(3) - beta^(3)

Let alpha and beta be two real roots of the equation 5cot^2x-3cotx-1=0 , then cot^2 (alpha+beta) =

If alpha and beta are roots of the equation 2x^(2)-3x-5=0 , then the value of (1)/(alpha)+(1)/(beta) is