
Let `x_(1), x_(2)` be the roots of the quadratic equation `x^(2) + ax + b = 0`, where a, b are complex numbers, and let `y_(1), y_(2)` be the roots of the quadratic equation `y^(2) + |a|y + |b| = 0`. If `|x_(1)| = |x_(2)| = 1`, then prove that `|y_(1)| = |y_(2)| = 1`.

Solution

To solve the problem, we need to prove that if \( |x_1| = |x_2| = 1 \), then \( |y_1| = |y_2| = 1 \) for the given quadratic equations. Let's go through the steps systematically.

### Step-by-Step Solution:

1. **Apply Vieta's formulas to the first equation**: Since \( x_1 \) and \( x_2 \) are the roots of \( x^2 + ax + b = 0 \), we have
\[ x_1 + x_2 = -a, \qquad x_1 x_2 = b. \]

2. **Evaluate \( |b| \) and bound \( |a| \)**: Using \( |x_1| = |x_2| = 1 \),
\[ |b| = |x_1 x_2| = |x_1||x_2| = 1, \]
and by the triangle inequality,
\[ |a| = |x_1 + x_2| \le |x_1| + |x_2| = 2. \]

3. **Solve the second equation**: The equation \( y^2 + |a|y + |b| = 0 \) becomes \( y^2 + |a|y + 1 = 0 \), with roots
\[ y = \frac{-|a| \pm \sqrt{|a|^2 - 4}}{2}. \]
Since \( |a| \le 2 \), the discriminant satisfies \( |a|^2 - 4 \le 0 \), so the roots are
\[ y = \frac{-|a| \pm i\sqrt{4 - |a|^2}}{2}. \]

4. **Compute the moduli**: For either root,
\[ |y|^2 = \left(\frac{|a|}{2}\right)^2 + \left(\frac{\sqrt{4 - |a|^2}}{2}\right)^2 = \frac{|a|^2 + 4 - |a|^2}{4} = 1. \]
Hence \( |y_1| = |y_2| = 1 \), as required.
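As a numerical sanity check (not part of the proof itself), the claim can be tested in Python by sampling `x_1, x_2` on the unit circle, forming the two quadratics, and checking that the roots of the second all have modulus 1:

```python
import cmath
import random

def roots(p, q):
    """Roots of z^2 + p*z + q = 0 via the quadratic formula."""
    d = cmath.sqrt(p * p - 4 * q)
    return (-p + d) / 2, (-p - d) / 2

random.seed(0)
for _ in range(1000):
    # Choose x1, x2 on the unit circle, so |x1| = |x2| = 1.
    x1 = cmath.exp(1j * random.uniform(0, 2 * cmath.pi))
    x2 = cmath.exp(1j * random.uniform(0, 2 * cmath.pi))
    a = -(x1 + x2)          # Vieta: x^2 + a x + b = 0
    b = x1 * x2
    y1, y2 = roots(abs(a), abs(b))   # y^2 + |a| y + |b| = 0
    assert abs(abs(y1) - 1) < 1e-9 and abs(abs(y2) - 1) < 1e-9
```

Every sampled case lands on the unit circle, matching Step 4 of the proof.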


Similar Questions

Explore conceptually related problems

If `a in (-1, 1)`, then the roots of the quadratic equation `(a-1)x^2 + ax + sqrt(1-a^2) = 0` are

If `alpha` and `beta` are the roots of the quadratic equation `ax^2 + bx + c = 0`, then the quadratic equation `ax^2 - bx(x-1) + c(x-1)^2 = 0` has roots

If one root of the quadratic equation `(a-b)x^2 + ax + 1 = 0` is double the other root, where `a in R`, then the greatest value of `b` is

Let `alpha, beta` be the roots of the quadratic equation `ax^2 + bx + c = 0`. Then the roots of the equation `a(x+1)^2 + b(x+1)(x-2) + c(x-2)^2 = 0` are

If 1 is a root of the quadratic equation `3x^2 + ax - 2 = 0` and the quadratic equation `a(x^2 + 6x) - b = 0` has equal roots, find the value of `b`.

Find the roots of the quadratic equation by using the quadratic formula: `1/2 x^(2) - sqrt(11)x + 1 = 0`

If p, q are the roots of the quadratic equation `x^2 + 2bx + c = 0`, prove that `2log(sqrt(y-p) + sqrt(y-q)) = log 2 + log(y + b + sqrt(y^2 + 2by + c))`
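For the quadratic-formula exercise in the list above (`1/2 x^2 - sqrt(11)x + 1 = 0`), a minimal Python sketch of the formula is shown below; multiplying through by 2 gives `x^2 - 2sqrt(11)x + 2 = 0`, whose roots are `sqrt(11) +- 3`:

```python
import math

def quadratic_roots(a, b, c):
    """Real roots of a*x^2 + b*x + c = 0 (assumes a nonnegative discriminant)."""
    disc = b * b - 4 * a * c
    r = math.sqrt(disc)
    return (-b + r) / (2 * a), (-b - r) / (2 * a)

# 1/2 x^2 - sqrt(11) x + 1 = 0  ->  roots are sqrt(11) + 3 and sqrt(11) - 3
r1, r2 = quadratic_roots(0.5, -math.sqrt(11), 1.0)
```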