Solutions to Problem Set 10

Math 310-01/02

10-4-2017

1. Premises: $\displaystyle \left\{\matrix{(A \lor C) \ifthen B \cr A \lor D \cr \lnot C \ifthen \lnot D \cr}\right.$.

Prove: B.

$$\matrix{
\hfill 1. & (A \lor C) \ifthen B \hfill & \hbox{Premise} \hfill \cr
\hfill 2. & A \lor D \hfill & \hbox{Premise} \hfill \cr
\hfill 3. & \lnot C \ifthen \lnot D \hfill & \hbox{Premise} \hfill \cr
\hfill 4. & \lnot B \hfill & \hbox{Premise for proof by contradiction} \hfill \cr
\hfill 5. & \lnot(A \lor C) \hfill & \hbox{Modus tollens (1,4)} \hfill \cr
\hfill 6. & \lnot A \land \lnot C \hfill & \hbox{DeMorgan (5)} \hfill \cr
\hfill 7. & \lnot A \hfill & \hbox{Decomposing a conjunction (6)} \hfill \cr
\hfill 8. & \lnot C \hfill & \hbox{Decomposing a conjunction (6)} \hfill \cr
\hfill 9. & D \hfill & \hbox{Disjunctive syllogism (2,7)} \hfill \cr
\hfill 10. & C \hfill & \hbox{Modus tollens (3,9)} \hfill \cr
\hfill 11. & C \land \lnot C \hfill & \hbox{Constructing a conjunction (8,10)} \hfill \cr
\hfill 12. & B \hfill & \hbox{Proof by contradiction (4,11)} \quad\halmos \hfill \cr
}$$
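As a sanity check (not part of the formal proof), a brute-force pass over all 16 truth assignments confirms that the premises entail B. This is a minimal Python sketch; the variable names simply mirror the propositions above.

    from itertools import product

    # Check that every truth assignment satisfying all three premises
    # also satisfies the conclusion B.
    for A, B, C, D in product([False, True], repeat=4):
        p1 = (not (A or C)) or B   # (A v C) -> B
        p2 = A or D                # A v D
        p3 = C or (not D)          # ~C -> ~D, equivalent to D -> C
        if p1 and p2 and p3:
            assert B, f"counterexample: A={A}, B={B}, C={C}, D={D}"
    print("Every satisfying assignment makes B true.")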


2. Prove that the following curves do not intersect:

$$(x - 1)^2 + y^2 = 3 \quad\hbox{and}\quad -2 x + 2 y + 11 = 0$$

Suppose that the curves intersect in a point $(x, y)$. Then $(x, y)$ satisfies both equations, and I may add the equations to get

$$(x - 1)^2 + y^2 - 2 x + 2 y + 11 = 3.$$

Expand the left side and simplify:

$$x^2 - 4 x + y^2 + 2 y + 12 = 3.$$

Complete the square in x and y:

$$(x^2 - 4 x + 4) + (y^2 + 2 y + 1) + 7 = 3, \quad (x - 2)^2 + (y + 1)^2 = -4.$$

The left side is a sum of two squares, so it must be greater than or equal to 0. Consequently, it can't be equal to -4. This contradiction proves that the curves do not intersect. $\halmos$
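As an independent check (a sketch assuming the sympy library is available), declaring x and y to be real and solving the two equations simultaneously returns no solutions, in agreement with the proof:

    from sympy import symbols, Eq, solve

    # Restricting x and y to real values makes solve discard
    # the complex solutions of the system.
    x, y = symbols('x y', real=True)

    circle = Eq((x - 1)**2 + y**2, 3)
    line = Eq(-2*x + 2*y + 11, 0)

    print(solve([circle, line], [x, y]))  # prints [] -- no real intersection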


3. Use Rolle's Theorem to prove that the function $f(x) = x^4 + 2 x^2 + 17$ has at most two roots.

Suppose on the contrary that f has three roots: say $f(a) = f(b) = f(c) = 0$, where $a < b < c$. Since f is a polynomial, it is continuous and differentiable everywhere, so Rolle's theorem applies: there are numbers u and v such that

$$f'(u) = 0, \quad f'(v) = 0, \quad a < u < b, \quad\hbox{and}\quad b < v < c.$$

Since $u < b < v$, the numbers u and v are distinct. That is, f must have at least two critical points.

However,

$$f'(x) = 4 x^3 + 4 x = 4 x (x^2 + 1).$$

Since $x^2 + 1 > 0$ for all x, I have $f'(x) = 0$ for $x = 0$ only: f has only one critical point.

This contradiction shows that f can't have three roots. Therefore, f has at most two roots. $\halmos$
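For confirmation (again a sketch assuming sympy), solving $f'(x) = 0$ over the reals returns only $x = 0$, matching the factorization above:

    from sympy import symbols, diff, solve

    x = symbols('x', real=True)
    f = x**4 + 2*x**2 + 17

    # The only real critical point is x = 0, since x^2 + 1 > 0.
    print(solve(diff(f, x), x))  # prints [0]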


What would life be if we had no courage to attempt anything? - Vincent Van Gogh



Copyright 2017 by Bruce Ikenaga