If `b^2<2ac`, then prove that `ax^3+bx^2+cx+d=0` has exactly one real root.

Text Solution

The correct Answer is:
Let `alpha, beta, gamma` be the roots of `ax^3+bx^2+cx+d=0` and suppose all three are real. By Vieta's formulas, `alpha+beta+gamma=-b/a` and `alpha beta+beta gamma+gamma alpha=c/a`, so
`alpha^2+beta^2+gamma^2=(alpha+beta+gamma)^2-2(alpha beta+beta gamma+gamma alpha)=(b^(2))/(a^(2))-(2c)/(a)=(b^(2)-2ac)/(a^(2))`
Since `b^2<2ac`, the right-hand side is negative, while a sum of squares of real numbers cannot be negative. This contradiction shows that the three roots cannot all be real. Non-real roots of a polynomial with real coefficients occur in conjugate pairs, and a cubic always has at least one real root, so the equation has exactly one real root.
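
As a quick numerical sanity check (an illustrative aside, not part of the verified solution), the sketch below picks arbitrary sample coefficients satisfying `b^2<2ac` and counts the real roots with NumPy; the values `a, b, c, d = 1, 1, 3, -5` and the imaginary-part tolerance are hypothetical choices for illustration.

```python
import numpy as np

# Arbitrary sample coefficients with b^2 < 2ac:
# here b^2 = 1 and 2ac = 2*1*3 = 6.
a, b, c, d = 1, 1, 3, -5

# np.roots takes coefficients from highest degree to lowest.
roots = np.roots([a, b, c, d])

# Count roots whose imaginary part is numerically zero.
real_roots = [r.real for r in roots if abs(r.imag) < 1e-9]

print("roots:", roots)
print("number of real roots:", len(real_roots))  # expected: 1
```

For this sample cubic `x^3+x^2+3x-5=0`, the real root is `x=1` and the remaining factor `x^2+2x+5` has negative discriminant, so the count comes out to exactly one, as the theorem predicts.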

Similar Questions

If `a^2+b^2=13` and `ab=6`, find: `3(a+b)^2-2(a-b)^2`

If `a^2+b^2=34` and `ab=12`, find: `7(a-b)^2-2(a+b)^2`

Evaluate: `b^2y-9b^2y+2b^2y-5b^2y`

If `(a-b)^2+(a+b)^2=24`, then `a^2+b^2=`

Factorise: `(a+b)^2-5(a^2-b^2)-24(a-b)^2`

If `a, b, c` are in G.P., then show that `(a^2-b^2)(b^2+c^2)=(b^2-c^2)(a^2+b^2)`
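
(As an aside for this related problem, not part of the original page: the identity can be checked symbolically by writing the G.P. as `a, ar, ar^2`; the SymPy sketch below is a minimal verification, not the requested proof.)

```python
import sympy as sp

a, r = sp.symbols('a r')
# Write the geometric progression as a, a*r, a*r**2.
A, B, C = a, a*r, a*r**2

lhs = (A**2 - B**2) * (B**2 + C**2)
rhs = (B**2 - C**2) * (A**2 + B**2)

# Both sides expand to a**4 * r**2 * (1 - r**4), so the difference is 0.
print(sp.simplify(lhs - rhs))  # prints: 0
```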

In Q. No. 126, `c=` (a) `b` (b) `2b` (c) `2b^2` (d) `-2b`

Show that: `|(b^2+c^2,ab,ac),(ba,c^2+a^2,bc),(ca,cb,a^2+b^2)|=4a^2b^2c^2`

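(Similarly, as an illustrative aside: the determinant identity above can be verified symbolically with SymPy. This is a minimal check of the stated result, not the requested proof.)

```python
import sympy as sp

a, b, c = sp.symbols('a b c')

# The determinant from the question above.
M = sp.Matrix([
    [b**2 + c**2, a*b,         a*c],
    [b*a,         c**2 + a**2, b*c],
    [c*a,         c*b,         a**2 + b**2],
])

print(sp.factor(M.det()))  # prints: 4*a**2*b**2*c**2
```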