Root Finding Algorithms: Bisection, Secant, and Newton's Methods - Prof. Erin K. Mcnelis, Study notes of Mathematical Methods for Numerical Analysis and Optimization

An overview of root finding algorithms used in numerical analysis to solve nonlinear equations in one variable. It covers the bisection method, the secant method, and Newton's method, discussing their algorithms, strengths, weaknesses, and convergence issues. Students can use this document as a study guide for understanding the concepts of root finding and solving nonlinear equations.


MATH 441/541 - Numerical Analysis
Third Meeting: Root Finding Algorithms
Thursday, September 6th, 2007

Root Finding (i.e. Solving Nonlinear Equations) in One Variable

  • Introductory Thoughts:
    1. How are root finding and solving nonlinear equations the same thing?
    2. What are the challenges to doing this analytically?
  • Section 2.1: The Bisection Method (a.k.a. the Binary Search Method)
    1. The Problem:
      • Given: Suppose you have a continuous function f defined on [a, b] with f (a) and f (b) having opposite signs.
      • Some Conclusions: The Intermediate Value Theorem guarantees that f has at least one root on [a, b], call it p.
      • The Big Question: How do you go about finding the root p given this information and the formula for f?
    2. The concept of “Bracketing the Root”
    3. The Bisection Method Algorithm:

OBJECTIVE: Given the continuous function f on the interval [a, b] where f (a) and f (b) have opposite signs, find a solution to f (x) = 0.

INPUT: endpoints a and b; tolerance TOL; maximum number of iterations N (and f ).

OUTPUT: approximate solution p or message of failure.
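The OBJECTIVE/INPUT/OUTPUT description above can be sketched in code. The notes give no implementation, so the following Python version is illustrative only (function name and details are assumptions):

```python
import math

def bisection(f, a, b, tol=1e-4, max_iter=100):
    """Bisection: repeatedly halve the bracketing interval [a, b].

    Requires f(a) and f(b) to have opposite signs, so the Intermediate
    Value Theorem guarantees a root in [a, b].
    """
    fa = f(a)
    if fa * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        p = a + (b - a) / 2            # midpoint of the current bracket
        fp = f(p)
        if fp == 0 or (b - a) / 2 < tol:
            return p                   # approximate solution
        if fa * fp > 0:                # sign unchanged: root lies in [p, b]
            a, fa = p, fp
        else:                          # sign changed: root lies in [a, p]
            b = p
    raise RuntimeError("bisection failed after max_iter iterations")

# The example from these notes, rewritten as f(x) = 0:
f = lambda x: x * math.cos(x) - 3 - 8 * x + x ** 3
p = bisection(f, 2, 5, tol=1e-6)
```

Halving the bracket means the error bound shrinks by a factor of 2 each iteration, which is what makes the iteration-count question below answerable in advance.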

  4. Other Possible Stopping Criteria
  5. Convergence Issues:
    • Will it converge?
    • How fast will it converge?
    • Example: The equation x cos(x) = 3 + 8x − x^3 has a solution on the interval [2, 5]. How many iterations will you need, at most, to find the solution with an accuracy of 10^−4?
  6. Strengths and Weaknesses of the Bisection Method:
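The iteration-count example above can be checked directly: after n bisections the midpoint lies within (b − a)/2^n of the root, so it suffices that (b − a)/2^n ≤ 10^−4. A quick computation (assuming "accuracy" means this absolute error bound):

```python
import math

a, b, tol = 2, 5, 1e-4
# After n iterations the midpoint is within (b - a) / 2**n of the root,
# so we need (b - a) / 2**n <= tol, i.e. n >= log2((b - a) / tol).
n = math.ceil(math.log2((b - a) / tol))
print(n)  # 15 iterations suffice: 3 / 2**15 is about 9.2e-5 <= 1e-4
```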
  • Section 2.3: Newton’s Method (Secant Method, and Method of False Position)
  1. The Problem:
  • Given: Suppose you have a continuous (and possibly continuously differentiable) function f and some reasonable guesses as to a root of the function.
  • The Big Question: How do you go about finding the root p given this information and the formula for f?
  2. The Secant Method:
    (a) Given the continuous function f(x) and two reasonable guesses to the root, p0 and p1.
    (b) A Visual Explanation
    (c) Deriving the Formula for the New Guess at the Root:

    (d) The Secant Method Algorithm
    (e) An Example: Use the Secant Method with initial guesses p0 = 2 and p1 = 3 to find an approximation to the solution to x cos(x) = 3 + 8x − x^3 correct to three decimal places.
    (f) What Problems Could Occur?
    (g) Strengths and Weaknesses of the Secant Method:
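The derivation in (c) leads to the update p_{n+1} = p_n − f(p_n)(p_n − p_{n−1})/(f(p_n) − f(p_{n−1})). One possible sketch of the algorithm in (d), applied to the example in (e) (Python, illustrative; the notes themselves give no code):

```python
import math

def secant(f, p0, p1, tol=1e-6, max_iter=100):
    """Secant method: Newton's method with f'(p_n) replaced by the
    slope of the secant line through the two most recent iterates."""
    f0, f1 = f(p0), f(p1)
    for _ in range(max_iter):
        if f1 == f0:
            raise ZeroDivisionError("secant line is horizontal")
        p2 = p1 - f1 * (p1 - p0) / (f1 - f0)   # new guess at the root
        if abs(p2 - p1) < tol:
            return p2
        p0, f0, p1, f1 = p1, f1, p2, f(p2)
    raise RuntimeError("secant method failed after max_iter iterations")

# Example (e): x cos(x) = 3 + 8x - x^3 with p0 = 2 and p1 = 3
f = lambda x: x * math.cos(x) - 3 - 8 * x + x ** 3
root = secant(f, 2, 3)
```

Note that, unlike bisection, the two guesses need not bracket the root, and the iterates are not guaranteed to stay inside any interval, which is one of the "problems that could occur" in (f).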

  3. Newton’s Method:
    (a) Given continuously differentiable f(x) and one reasonable guess at the root, p0.
    (b) A Visual Explanation:
    (c) Deriving the Formula for the New Guess at the Root:
    (d) Newton’s Method Algorithm:

    (e) The Relation to the Secant Method
    (f) An Example: Use Newton’s Method with initial guess p0 = 2 to find an approximation to the solution to x cos(x) = 3 + 8x − x^3 correct to three decimal places.
    (g) What Problems Could Occur?
    (h) Strengths and Weaknesses of Newton’s Method:
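The tangent-line derivation in (c) gives p_{n+1} = p_n − f(p_n)/f′(p_n). A sketch of the algorithm in (d), applied to Example (f) with the derivative worked out by hand (Python, illustrative only):

```python
import math

def newton(f, fprime, p0, tol=1e-8, max_iter=100):
    """Newton's method: follow the tangent line at p_n down to the
    x-axis: p_{n+1} = p_n - f(p_n) / f'(p_n)."""
    for _ in range(max_iter):
        d = fprime(p0)
        if d == 0:
            raise ZeroDivisionError("derivative vanished at an iterate")
        p1 = p0 - f(p0) / d
        if abs(p1 - p0) < tol:
            return p1
        p0 = p1
    raise RuntimeError("Newton's method failed after max_iter iterations")

# Example (f): f(x) = x cos(x) - 3 - 8x + x^3, with derivative
# f'(x) = cos(x) - x sin(x) - 8 + 3x^2, starting from p0 = 2
f = lambda x: x * math.cos(x) - 3 - 8 * x + x ** 3
fp = lambda x: math.cos(x) - x * math.sin(x) - 8 + 3 * x ** 2
root = newton(f, fp, 2)
```

The relation in (e): replacing f′(p_n) above with the finite-difference slope (f(p_n) − f(p_{n−1}))/(p_n − p_{n−1}) recovers the secant update exactly.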

  4. The Method of False Position:
    (a) Given the continuous function f(x) and two reasonable guesses to the root, p0 and p1, that bracket the root.
    (b) A Visual Explanation
    (c) The Formula for the New Guess at the Root
    (d) An Example: Use the Method of False Position with initial guesses p0 = 2 and p1 = 4 to find an approximation to the solution to x cos(x) = 3 + 8x − x^3 correct to three decimal places.
    (e) What Problems Could Occur?
    (f) Strengths and Weaknesses of the Method of False Position:
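False position uses the same secant-line formula as above but, like bisection, keeps the root bracketed at every step. A possible sketch for Example (d) (Python, illustrative; the stopping test on |f| is one of several reasonable choices, since a bracket endpoint can stagnate):

```python
import math

def false_position(f, p0, p1, tol=1e-6, max_iter=100):
    """Method of False Position: secant-style updates, but the two
    retained points always bracket the root (f changes sign between them)."""
    f0, f1 = f(p0), f(p1)
    if f0 * f1 > 0:
        raise ValueError("p0 and p1 must bracket the root")
    for _ in range(max_iter):
        p2 = p1 - f1 * (p1 - p0) / (f1 - f0)   # secant-line root
        f2 = f(p2)
        if abs(f2) < tol:
            return p2
        if f1 * f2 < 0:        # root now lies between p1 and p2
            p0, f0 = p1, f1
        p1, f1 = p2, f2        # otherwise keep p0 and replace p1
    raise RuntimeError("false position failed after max_iter iterations")

# Example (d): p0 = 2 and p1 = 4 for x cos(x) = 3 + 8x - x^3
f = lambda x: x * math.cos(x) - 3 - 8 * x + x ** 3
root = false_position(f, 2, 4)
```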
  • Section 2.2: Fixed-Point Methods
  1. The Problem:
  • Given: Suppose you have a continuous function f and some reasonable guesses as to a fixed-point of the function.
  • The Big Question: How do you go about finding the fixed point p given this information and the formula for f?
  2. What is a fixed point, and how is finding the fixed point of a function related to root finding?
  3. Fixed-Point Iteration:
    (a) Given the continuous function f(x) and a reasonable guess at the fixed point, p0.
    (b) A Visual Explanation
    (c) Deriving the Formula for the New Guess at the Fixed Point:
    (d) The Fixed-Point Algorithm
  4. Existence and Uniqueness Theorems:
    (a) Existence: If g ∈ C[a, b] and a ≤ g(x) ≤ b for all x ∈ [a, b], then g has at least one fixed point, p, in [a, b].
    (b) Uniqueness (Contraction Mapping Theorem): If there also exists a value 0 < k < 1 such that |g(x) − g(y)| ≤ k|x − y| for all x and y in [a, b] (i.e. g is a contraction mapping on [a, b]), then the fixed point p is unique.
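A sketch of the Fixed-Point Algorithm, step (d) above, with a classic illustration that is not from these notes: g(x) = cos(x) maps [0, 1] into itself, and |g′(x)| = |sin(x)| ≤ sin(1) < 1 there, so by the existence and uniqueness theorems it has exactly one fixed point (about 0.739):

```python
import math

def fixed_point(g, p0, tol=1e-8, max_iter=200):
    """Fixed-point iteration: repeatedly apply g (p_{n+1} = g(p_n))
    until successive iterates agree to within tol."""
    for _ in range(max_iter):
        p1 = g(p0)
        if abs(p1 - p0) < tol:
            return p1
        p0 = p1
    raise RuntimeError("fixed-point iteration failed after max_iter iterations")

# g(x) = cos(x) is a contraction on [0, 1]: unique fixed point p = cos(p)
p = fixed_point(math.cos, 0.5)
```

The connection to root finding: p is a fixed point of g exactly when p is a root of f(x) = x − g(x), so any root problem can be recast as a fixed-point problem and vice versa.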
  1. Theorem (from 2.3): Let f ∈ C^2[a, b]. If p ∈ (a, b) is such that f(p) = 0 and f′(p) ≠ 0, then there exists a δ > 0 such that Newton’s method generates a sequence {p_n}_{n=0}^∞ converging to p for any initial approximation p0 ∈ [p − δ, p + δ].
  2. Theorem: f ∈ C^1[a, b] has a simple zero at p in (a, b) if and only if f(p) = 0 but f′(p) ≠ 0.
  3. Corollary: If f ∈ C^1[a, b] has a simple zero, p, on [a, b], then there exists an interval about p on which Newton’s method converges quadratically to p for any initial approximation p0 in that interval.