
Let a, b and c be the roots of `x^(3)-x+1=0`. Then the value of `((1)/(a+1)+(1)/(b+1)+(1)/(c+1))` equals:
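One quick way to evaluate this sum (a sketch using Vieta's formulas, not the page's posted solution): for `x^(3)-x+1=0`, Vieta gives `a+b+c=0`, `ab+bc+ca=-1` and `abc=-1`, so

\[
\frac{1}{a+1}+\frac{1}{b+1}+\frac{1}{c+1}
=\frac{(ab+bc+ca)+2(a+b+c)+3}{abc+(ab+bc+ca)+(a+b+c)+1}
=\frac{-1+0+3}{-1+(-1)+0+1}=\frac{2}{-1}=-2.
\]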


Similar Questions


If the roots of a(b-c)x^(2)+b(c-a)x+c(a-b)=0 are equal then (1)/(a)+(1)/(b)+(1)/(c)=

If a,b,c are the roots of x^(3)-3x^(2)+3x+26=0 and omega is a cube root of unity, then the value of (a-1)/(b-1)+(b-1)/(c-1)+(c-1)/(a-1)=

If the ratio of the roots of a_(1)x^(2) + b_(1) x + c_(1) = 0 be equal to the ratio of the roots of a_(2) x^(2) + b_(2)x + c_(2) = 0 , then (a_(1))/(a_(2)) , (b_(1))/(b_(2)) , (c_(1))/(c_(2)) are in :

If a, b, c are the roots of the equation x^3+px+q=0, then find the value of the determinant |(1+a,1,1),(1,1+b,1),(1,1,1+c)|.

If (1)/(2),(1)/(3),(1)/(4) are the roots of ax^(3)+bx^(2)+cx+d=0 then the roots of a(x+1)^(3)+b(x+1)^(2)+c(x+1)+d=0 are

If a and b are roots of x^2 -p(x+1)-c =0 then (1+a) (1+b) and (a^2 + 2a+1)/(a^2 + 2a+c) + (b^2 +2b +1)/(b^2 + 2b +c) are,