

MATH 441/541 - Numerical Analysis
Fourth Meeting: Root Finding Algorithms (cont.)
Thursday, September 13th, 2007

Root Finding (i.e. Solving Nonlinear Equations) in One Variable

  • Section 2.3: Newton’s Method (Secant Method, and Method of False Position)
    1. The Problem:
      • Given: Suppose you have a continuous (and possibly continuously differentiable) function f and some reasonable guesses as to a root of the function.
      • The Big Question: How do you go about finding the root p given this information and the formula for f?
    2. The Secant Method:
      (a) Given the continuous function f(x) and two reasonable guesses at the root, p₀ and p₁.
      (b) A Visual Explanation
      (c) Deriving the Formula for the New Guess at the Root:
      (d) The Secant Method Algorithm (a code sketch follows this item)
      (e) An Example: Use the Secant Method with initial guesses p₀ = 2 and p₁ = 3 to find an approximation to the solution to x cos(x) = 3 + 8x − x³ correct to three decimal places.
      (f) What Problems Could Occur?
      (g) Strengths and Weaknesses of the Secant Method:
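      A minimal Python sketch of the Secant Method algorithm in item (d), applied to the example in item (e); it assumes the equation is rewritten as f(x) = x cos(x) − 3 − 8x + x³ = 0, and the function names, tolerance, and stopping rule are illustrative choices rather than anything prescribed in these notes.

        import math

        def f(x):
            # the example equation x cos(x) = 3 + 8x - x^3, rewritten as f(x) = 0
            return x * math.cos(x) - 3 - 8 * x + x ** 3

        def secant(f, p0, p1, tol=0.5e-3, max_iter=50):
            # replace f'(p) in Newton's update with the slope of the line
            # through the two most recent iterates
            for _ in range(max_iter):
                denom = f(p1) - f(p0)
                if denom == 0:
                    raise ZeroDivisionError("flat secant line; choose different guesses")
                p2 = p1 - f(p1) * (p1 - p0) / denom
                if abs(p2 - p1) < tol:   # tol = 0.5e-3 is one reading of "three decimal places"
                    return p2
                p0, p1 = p1, p2
            raise RuntimeError("no convergence within max_iter iterations")

        print(secant(f, 2.0, 3.0))   # initial guesses p0 = 2, p1 = 3 from the example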
    3. Newton’s Method:
      (a) Given continuously differentiable f(x) and one reasonable guess at the root, p₀.
      (b) A Visual Explanation:
      (c) Deriving the Formula for the New Guess at the Root:
      (d) Newton’s Method Algorithm: (a code sketch follows this item)
      (e) The Relation to the Secant Method
      (f) An Example: Use Newton’s Method with initial guess p₀ = 2 to find an approximation to the solution to x cos(x) = 3 + 8x − x³ correct to three decimal places.
      (g) What Problems Could Occur?
      (h) Strengths and Weaknesses of Newton’s Method:
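      A matching sketch of Newton's update pₙ = pₙ₋₁ − f(pₙ₋₁)/f′(pₙ₋₁) for the example in item (f); the derivative is written out by hand, and as with the secant sketch the names and tolerance are illustrative assumptions.

        import math

        def f(x):
            return x * math.cos(x) - 3 - 8 * x + x ** 3

        def fprime(x):
            # hand-computed derivative of f, required by Newton's Method
            return math.cos(x) - x * math.sin(x) - 8 + 3 * x ** 2

        def newton(f, fprime, p0, tol=0.5e-3, max_iter=50):
            for _ in range(max_iter):
                p1 = p0 - f(p0) / fprime(p0)   # Newton update
                if abs(p1 - p0) < tol:
                    return p1
                p0 = p1
            raise RuntimeError("no convergence within max_iter iterations")

        print(newton(f, fprime, 2.0))   # initial guess p0 = 2 from the example

      With p₀ = 2 the first iterate jumps well past the root before the sequence settles down, which is one kind of trouble item (g) asks about.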

    4. The Method of False Position:
      (a) Given continuous function f(x) and two reasonable guesses at the root, p₀ and p₁, that bracket the root.
      (b) A Visual Explanation
      (c) The Formula for the New Guess at the Root
      (d) An Example: Use the Method of False Position with initial guesses p₀ = 2 and p₁ = 4 to find an approximation to the solution to x cos(x) = 3 + 8x − x³ correct to three decimal places. (A code sketch follows this item.)
      (e) What Problems Could Occur?
      (f) Strengths and Weaknesses of the Method of False Position:
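      A sketch of the false-position step for item (d): the same secant-style update, but the new point always replaces whichever endpoint keeps the root bracketed. The bracket check and the residual-based stopping test below are my choices, not the notes'.

        import math

        def f(x):
            return x * math.cos(x) - 3 - 8 * x + x ** 3

        def false_position(f, p0, p1, tol=0.5e-3, max_iter=100):
            if f(p0) * f(p1) >= 0:
                raise ValueError("initial guesses must bracket the root")
            for _ in range(max_iter):
                p2 = p1 - f(p1) * (p1 - p0) / (f(p1) - f(p0))   # secant-style update
                if abs(f(p2)) < tol:        # simple residual test; |p2 - p1| may stagnate
                    return p2
                if f(p0) * f(p2) < 0:
                    p1 = p2                 # root lies in [p0, p2]
                else:
                    p0 = p2                 # root lies in [p2, p1]
            raise RuntimeError("no convergence within max_iter iterations")

        print(false_position(f, 2.0, 4.0))   # bracketing guesses p0 = 2, p1 = 4 from the example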
  • Section 2.2: Fixed-Point Methods
    1. The Problem:
      • Given: Suppose you have a continuous function f and some reasonable guesses as to a fixed point of the function.
      • The Big Question: How do you go about finding the fixed point p given this information and the formula for f?
    2. What is a fixed point, and how is finding the fixed point of a function related to root finding?
    3. Fixed-Point Iteration:
      (a) Given the continuous function f(x) and a reasonable guess at the fixed point, p₀.
      (b) A Visual Explanation
      (c) Deriving the Formula for the New Guess at the Fixed Point:
      (d) The Fixed-Point Algorithm (a code sketch follows this item)
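      A minimal sketch of the fixed-point algorithm in item (d), iterating pₙ = g(pₙ₋₁) until successive iterates agree; the name fixed_point, the tolerance, and the cos(x) demonstration call are illustrative assumptions.

        import math

        def fixed_point(g, p0, tol=0.5e-3, max_iter=100):
            # iterate p_n = g(p_{n-1}) until two successive iterates agree
            for _ in range(max_iter):
                p1 = g(p0)
                if abs(p1 - p0) < tol:
                    return p1
                p0 = p1
            raise RuntimeError("no convergence within max_iter iterations")

        print(fixed_point(math.cos, 1.0))   # classic demonstration: the fixed point of cos(x), roughly 0.739

      Root finding fits the same loop once f(x) = 0 is rewritten as x = g(x), which is the connection asked about in item 2 above.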
    4. Existence and Uniqueness Theorems:
      (a) Existence: If g ∈ C[a, b] and a ≤ g(x) ≤ b for all x ∈ [a, b], then g has at least one fixed point, p, in [a, b].
      (b) Uniqueness:
        i. Contraction Mapping Theorem: If there also exists a value 0 < k < 1 such that

             |g(x) − g(y)| ≤ k|x − y|

           for all x and y in [a, b] (i.e. g is a contraction mapping on [a, b]), then the fixed point p is unique.
        ii. Text Theorem: If, in addition to the existence criteria, g′(x) exists on (a, b) and a positive constant k < 1 exists with

             |g′(x)| ≤ k for all x ∈ (a, b),

           then the fixed point in [a, b] is unique.
      (c) Fixed Point Theorem: Let g ∈ C[a, b] be such that a ≤ g(x) ≤ b for all x ∈ [a, b]. Suppose, in addition, that g′ exists on (a, b) and that a constant 0 < k < 1 exists with

             |g′(x)| ≤ k for all x ∈ (a, b).

           Then, for any number p₀ in [a, b], the sequence defined by pₙ = g(pₙ₋₁), n ≥ 1, converges to the unique fixed point p in [a, b]. Also, the bounds for the error involved in using pₙ to approximate p are given by:

             |pₙ − p| ≤ kⁿ max{p₀ − a, b − p₀}   and   |pₙ − p| ≤ (kⁿ / (1 − k)) |p₁ − p₀|,   n ≥ 1.

      (d) An Example: Verify that the function f(x) = (1/2)e^(−x) has a unique fixed point on [0, 1]. Use the Fixed-Point Iteration Method with initial guess p₀ = 0.9 to find an approximation to the true fixed point correct to three decimal places.
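      One way to carry out the example: the map g(x) = (1/2)e^(−x) sends [0, 1] into itself and |g′(x)| = (1/2)e^(−x) ≤ 1/2 there, so the Fixed Point Theorem applies with k = 1/2. The sketch below iterates from p₀ = 0.9 and also evaluates the theorem's a-priori error bound; the loop structure and printout are illustrative only.

        import math

        def g(x):
            return 0.5 * math.exp(-x)   # the example map; it sends [0, 1] into itself

        k, p0 = 0.5, 0.9                # |g'(x)| <= 0.5 on [0, 1], so k = 0.5 works
        p_prev, p, n = p0, g(p0), 1
        while abs(p - p_prev) >= 0.5e-3:     # stop once iterates agree to ~3 decimals
            p_prev, p = p, g(p)
            n += 1
        print(n, p)                      # the fixed point is roughly 0.352

        # a-priori bound from the Fixed Point Theorem: |p_n - p| <= k^n/(1-k) * |p1 - p0|
        print(k ** n / (1 - k) * abs(g(p0) - p0))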

  • Müller’s Method
    (a) Given the continuous function f(x) and three reasonable guesses to the root, x₀, x₁, and x₂.
    (b) A Visual Explanation:
    (c) The Formula for the New Guess for the Root: Given three guesses, x₀, x₁, and x₂, generate the new approximation, x₃, as (a code sketch follows this item):

          x₃ = x₂ − 2c / (b + sgn(b)√(b² − 4ac)),

        where

          c = f(x₂)
          b = [(x₀ − x₂)²(f(x₁) − f(x₂)) − (x₁ − x₂)²(f(x₀) − f(x₂))] / [(x₀ − x₂)(x₁ − x₂)(x₀ − x₁)]
          a = [(x₁ − x₂)(f(x₀) − f(x₂)) − (x₀ − x₂)(f(x₁) − f(x₂))] / [(x₀ − x₂)(x₁ − x₂)(x₀ − x₁)]

    (d) Strengths and Weaknesses
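    One way to implement the update above as code; cmath is used so the square root (and hence the iterates) may become complex, and the sgn(b) rule is realized by taking whichever denominator b ± √(b² − 4ac) has the larger magnitude. The helper names and the example call, which reuses the equation from the earlier items, are illustrative assumptions.

      import cmath

      def f(x):
          # the running example equation, written with cmath so complex iterates are allowed
          return x * cmath.cos(x) - 3 - 8 * x + x ** 3

      def muller(f, x0, x1, x2, tol=1e-6, max_iter=50):
          for _ in range(max_iter):
              f0, f1, f2 = f(x0), f(x1), f(x2)
              denom = (x0 - x2) * (x1 - x2) * (x0 - x1)
              c = f2
              b = ((x0 - x2) ** 2 * (f1 - f2) - (x1 - x2) ** 2 * (f0 - f2)) / denom
              a = ((x1 - x2) * (f0 - f2) - (x0 - x2) * (f1 - f2)) / denom
              disc = cmath.sqrt(b * b - 4 * a * c)
              # sgn(b) rule: keep the denominator of larger magnitude to avoid cancellation
              den = b + disc if abs(b + disc) >= abs(b - disc) else b - disc
              x3 = x2 - 2 * c / den
              if abs(x3 - x2) < tol:
                  return x3
              x0, x1, x2 = x1, x2, x3
          raise RuntimeError("no convergence within max_iter iterations")

      print(muller(f, 2.0, 3.0, 4.0))   # three illustrative starting guesses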

  • Specific Rates of Convergence for Methods¹

      Method            Rate of Convergence α
      Bisection         α = 1
      Secant            α = (1 + √5)/2 ≈ 1.62 (for simple roots)
      Newton            α = 2 (for simple roots); α = 1 (for multiple roots)
      False Position    α = 1
      Fixed Point       α ≈ 1
      Müller’s          α ≈ 1.839
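    As an illustration of what the order α in the table means, the ratio log(eₙ₊₁/eₙ) / log(eₙ/eₙ₋₁) estimates α from successive errors eₙ = |pₙ − p|. The sketch below applies it to Newton iterates for the running example and should print values approaching 2; the starting point and iteration count are arbitrary choices.

      import math

      def f(x):
          return x * math.cos(x) - 3 - 8 * x + x ** 3

      def fprime(x):
          return math.cos(x) - x * math.sin(x) - 8 + 3 * x ** 2

      # Newton iterates for the running example, started near the root
      p, iterates = 3.5, [3.5]
      for _ in range(6):
          p = p - f(p) / fprime(p)
          iterates.append(p)

      root = iterates[-1]                       # treat the last iterate as the true root
      errors = [abs(q - root) for q in iterates[:-1]]
      # observed order of convergence; the estimates should approach alpha = 2
      for e0, e1, e2 in zip(errors, errors[1:], errors[2:]):
          if e2 > 0:
              print(math.log(e2 / e1) / math.log(e1 / e0))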

¹ For Müller’s Method rate in particular, see Michael T. Heath, Scientific Computing: An Introductory Survey, 2nd Ed., McGraw Hill, 2002, p. 234.