TW-ZJLin / FiniteDifference

Numerical differentiation for unknown functions

Finite Difference

Introduction

Numerical differentiation of discrete functions.

Provides three methods of numerical differentiation for unknown (discrete) functions:

  1. Forward Finite Difference
  2. Central Finite Difference
  3. Backward Finite Difference

The Central Finite Difference is also compared with numpy.gradient at the end.
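
For an evenly spaced grid x_i with step size h, the three schemes approximate the first derivative as follows (standard formulas, stated here for reference):

```math
\begin{aligned}
\text{Forward:}\quad  y'(x_i) &\approx \frac{y_{i+1} - y_i}{h} &&\text{(first-order accurate)}\\
\text{Central:}\quad  y'(x_i) &\approx \frac{y_{i+1} - y_{i-1}}{2h} &&\text{(second-order accurate)}\\
\text{Backward:}\quad y'(x_i) &\approx \frac{y_i - y_{i-1}}{h} &&\text{(first-order accurate)}
\end{aligned}
```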

Input/Output Format

Case A:

  • Input: (x, y, neighbor)
    • x, y: 1-dimensional NumPy arrays
    • neighbor: integer
  • Output: (y_diff)
    • y_diff: 1-dimensional NumPy array (the derivative of y with respect to x)

Case B:

  • Input: (x, neighbor)
    • x: 1-dimensional NumPy array
    • neighbor: integer
  • Output: (x_diff)
    • x_diff: 1-dimensional NumPy array (the derivative of x with respect to its index, i.e. the natural numbers 0, 1, 2, ...)

The "neighbor" parameter determines how many neighboring data points are used to calculate the derivative; its default value is 1.
The Central Finite Difference method is recommended for better accuracy.

NOTE: As "neighbor" increases, the resulting curve becomes smoother, but the error near the boundaries grows.
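
Based on the I/O description above, a minimal sketch of the Case A central-difference routine could look like the following. The function name central_diff, the interpretation of neighbor as the stencil half-width, and the one-sided boundary handling are assumptions for illustration, not the notebook's actual implementation.

```python
import numpy as np

def central_diff(x, y, neighbor=1):
    """Central finite difference of y with respect to x (Case A).

    Minimal sketch (assumed API, not the notebook's code): `neighbor` is
    treated as the half-width k of the stencil, so interior points use
    (y[i+k] - y[i-k]) / (x[i+k] - x[i-k]), while the first and last k
    points fall back to one-sided differences.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    k = int(neighbor)
    n = len(x)
    y_diff = np.empty(n)

    # Interior: central difference over a stencil of half-width k.
    y_diff[k:n - k] = (y[2 * k:] - y[:n - 2 * k]) / (x[2 * k:] - x[:n - 2 * k])

    # Left boundary: forward differences with the same half-width.
    y_diff[:k] = (y[k:2 * k] - y[:k]) / (x[k:2 * k] - x[:k])

    # Right boundary: backward differences.
    y_diff[n - k:] = (y[n - k:] - y[n - 2 * k:n - k]) / (x[n - k:] - x[n - 2 * k:n - k])

    return y_diff
```

Case B would then amount to calling the same routine with the data passed as y and np.arange(len(data)) as the coordinate array.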

Sample

Analytic Expression:
The test function is y = 4*x**5 + 12*x**3 + 15*x**2 - 20*x + 8.
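Its analytic derivative, dy/dx = 20*x**4 + 36*x**2 + 30*x - 20, can serve as the exact reference when evaluating the numerical error.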

Results:
The three numerical differentiation methods and their first- and second-order errors are shown below.
The results show that the Central Finite Difference has the best accuracy of the three.

Comparing the Central Finite Difference with numpy.gradient shows that the two produce essentially the same results; a quick check is sketched below.
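
The following self-contained snippet reproduces that comparison; the sampling range and number of points are arbitrary illustration choices, not values taken from the notebook.

```python
import numpy as np

x = np.linspace(-2.0, 2.0, 201)                    # illustrative uniform grid
y = 4*x**5 + 12*x**3 + 15*x**2 - 20*x + 8          # sample function from above

# Central finite difference with neighbor = 1 (interior points only).
central = (y[2:] - y[:-2]) / (x[2:] - x[:-2])

# np.gradient applies the same centered formula on the interior and
# one-sided differences at the two endpoints.
gradient = np.gradient(y, x)

print(np.max(np.abs(central - gradient[1:-1])))    # ~0: interior values agree
```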

About

Numerical differentiation for unknown functions


Languages

Language:Jupyter Notebook 100.0%