escline / linefit

LineFit Ruby Gem does weighted or unweighted least-squares line fitting to two-dimensional data (y = a + b * x). (Linear Regression)

Contents:
    NAME
    SYNOPSIS
    DESCRIPTION
    ALGORITHM
    LIMITATIONS
    EXAMPLES
    METHODS
    SEE ALSO
    AUTHOR
    LICENSE
    DISCLAIMER

NAME
    LineFit - Least squares line fit, weighted or unweighted

SYNOPSIS
	require 'linefit'
	lineFit = LineFit.new
	lineFit.setData(x,y)
	intercept, slope = lineFit.coefficients
	rSquared = lineFit.rSquared
	meanSquaredError = lineFit.meanSqError
	durbinWatson = lineFit.durbinWatson
	sigma = lineFit.sigma
	tStatIntercept, tStatSlope = lineFit.tStatistics
	predictedYs = lineFit.predictedYs
	residuals = lineFit.residuals
	varianceIntercept, varianceSlope = lineFit.varianceOfEstimates

DESCRIPTION
    The LineFit class does weighted or unweighted least-squares
    line fitting to two-dimensional data (y = a + b * x). (This is also
    called linear regression.) In addition to the slope and y-intercept, the
    class can return the square of the correlation coefficient (R squared),
    the Durbin-Watson statistic, the mean squared error, sigma, the t
    statistics, the variance of the estimates of the slope and y-intercept,
    the predicted y values and the residuals of the y values. (See the
    METHODS section for a description of these statistics.)

    The class accepts input data in separate x and y arrays or a single 2-D
    array (an array of two arrays). The optional weights are input in a
    separate array. The module can optionally verify that the input data and
    weights are valid numbers. If weights are input, the line fit minimizes
    the weighted sum of the squared errors and the following statistics are
    weighted: the correlation coefficient, the Durbin-Watson statistic, the
    mean squared error, sigma and the t statistics.

    The class is state-oriented and caches its results. Once you call the
    setData() method, you can call the other methods in any order or call a
    method several times without invoking redundant calculations.

    The decision to use or not use weighting can be based on a priori
    knowledge of the data or on supplemental data. If the data is sparse or
    contains non-random noise, weighting can degrade the solution. Weighting
    is a good option if some points are suspect or less relevant (e.g.,
    older terms in a time series, or points that are known to have more
    noise), as in the sketch below.
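    As a sketch of that last case, more recent points in a time series can
    be given proportionally larger weights (the data and weights here are
    made up for illustration; only the relative sizes of the weights
    matter):

        require 'linefit'

        x = [1, 2, 3, 4, 5, 6]
        y = [3.1, 4.8, 7.2, 8.9, 11.1, 12.8]

        # Down-weight older points (smaller x) relative to newer ones;
        # weights must be non-negative and at least two must be nonzero
        weights = [1, 2, 3, 4, 5, 6]

        lineFit = LineFit.new
        lineFit.setData(x, y, weights)
        intercept, slope = lineFit.coefficients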

ALGORITHM
    The least-square line is the line that minimizes the sum of the squares
    of the y residuals:

     Minimize SUM((y[i] - (a + b * x[i])) ** 2)

    Setting the partial derivatives with respect to a and b to zero yields a
    solution that can be expressed in terms of the means, variances and
    covariances of x and y:

     b = SUM((x[i] - meanX) * (y[i] - meanY)) / SUM((x[i] - meanX) ** 2) 

     a = meanY - b * meanX

    Note that a and b are undefined if all the x values are the same.
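    Transcribed into plain Ruby, the formulas above look like this (an
    illustrative sketch, not the gem's internal code, which uses a
    mathematically equivalent but more efficient form, as noted below):

        # Closed-form least-squares fit of y = a + b * x
        def fit_line(x, y)
          n      = x.length.to_f
          mean_x = x.sum / n
          mean_y = y.sum / n
          sxy = x.zip(y).sum { |xi, yi| (xi - mean_x) * (yi - mean_y) }
          sxx = x.sum { |xi| (xi - mean_x) ** 2 }
          return nil if sxx.zero?   # all x values equal: a and b undefined
          b = sxy / sxx
          a = mean_y - b * mean_x
          [a, b]
        end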

    If you use weights, each term in the above sums is multiplied by the
    value of the weight for that index. The program normalizes the weights
    (after copying the input values) so that the sum of the weights equals
    the number of points. This minimizes the differences between the
    weighted and unweighted equations.
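    The normalization step amounts to the following (an illustrative
    sketch, not the gem's internal code; input_weights stands in for the
    weights passed to setData()):

        # Scale a copy of the weights so they sum to the number of points
        weights = input_weights.dup
        factor  = weights.length / weights.sum.to_f
        weights.map! { |w| w * factor }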

    LineFit uses equations that are mathematically equivalent to
    the above equations and computationally more efficient. The module runs
    in O(N) (linear time).

LIMITATIONS
    The regression fails if the input x values are all equal or the only
    unequal x values have zero weights. This is an inherent limit to fitting
    a line of the form y = a + b * x. In this case, the class issues an
    error message and methods that return statistical values will return
    undefined values. You can also use the return value of the regress()
    method to check the status of the regression.
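    For example, checking the status before asking for statistics (a sketch
    that assumes, per the status check described above, that regress()
    returns a false value when the fit fails; the degenerate data here has
    identical x values):

        require 'linefit'

        lineFit = LineFit.new(0, 1)           # hush = 1 suppresses the error message
        lineFit.setData([2, 2, 2], [1, 5, 9]) # all x values equal: the fit fails

        if lineFit.regress
          intercept, slope = lineFit.coefficients
        else
          # slope and intercept are undefined; handle the failure here
        end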

    As the sum of the squared deviations of the x values approaches zero,
    the class's results become sensitive to the precision of floating
    point operations on the host system.

    If the x values are not all the same and the apparent "best fit" line is
    vertical, the class will fit a horizontal line. For example, an input
    of (1, 1), (1, 7), (2, 3), (2, 5) returns a slope of zero, an intercept
    of 4 and an R squared of zero. This is correct behavior because this
    line is the best least-squares fit to the data for the given
    parameterization (y = a + b * x).
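    A minimal script reproducing that example with the calls shown in the
    SYNOPSIS:

        require 'linefit'

        lineFit = LineFit.new
        lineFit.setData([[1, 1], [1, 7], [2, 3], [2, 5]])
        intercept, slope = lineFit.coefficients
        puts "Slope: #{slope}  Y-Intercept: #{intercept}"  # 0 and 4
        puts "R squared: #{lineFit.rSquared}"              # 0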

    On a 32-bit system the results are accurate to about 11 significant
    digits, depending on the input data. Many of the installation tests will
    fail on a system with word lengths of 16 bits or fewer. (You might want
    to upgrade your old 80286 IBM PC.)

EXAMPLES

	require 'linefit'
	lineFit = LineFit.new
	x = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18]
	y = [4039,4057,4052,4094,4104,4110,4154,4161,4186,4195,4229,4244,4242,4283,4322,4333,4368,4389]

	lineFit.setData(x,y)

	intercept, slope = lineFit.coefficients
	rSquared = lineFit.rSquared
	meanSquaredError = lineFit.meanSqError
	durbinWatson = lineFit.durbinWatson
	sigma = lineFit.sigma
	tStatIntercept, tStatSlope = lineFit.tStatistics
	predictedYs = lineFit.predictedYs
	residuals = lineFit.residuals
	varianceIntercept, varianceSlope = lineFit.varianceOfEstimates

	print "Slope: #{slope}  Y-Intercept: #{intercept}\n"
	print "r-Squared: #{rSquared}\n"
	print "Mean Squared Error: #{meanSquaredError}\n"
	print "Durbin Watson Test: #{durbinWatson}\n"
	print "Sigma: #{sigma}\n"
	print "t Stat Intercept: #{tStatIntercept}  t Stat Slope: #{tStatSlope}\n\n"
	print "Predicted Ys: #{predictedYs.inspect}\n\n"
	print "Residuals: #{residuals.inspect}\n\n"
	print "Variance Intercept: #{varianceIntercept}   Variance Slope: #{varianceSlope}\n"
	print "\n"

	newX = 24
	newY = lineFit.forecast(newX)
	print "New X: #{newX}\nNew Y: #{newY}\n"

METHODS
    The class is state-oriented and caches its results. Once you call the
    setData() method, you can call the other methods in any order or call a
    method several times without invoking redundant calculations.

    The regression fails if the x values are all the same. In this case, the
    module issues an error message and methods that return statistical
    values will return undefined values. You can also use the return value
    of the regress() method to check the status of the regression.

  new() - create a new LineFit object
     linefit = LineFit.new
     linefit = LineFit.new(validate)
     linefit = LineFit.new(validate, hush)

     validate = 1 -> Verify input data is numeric (slower execution)
              = 0 -> Don't verify input data (default, faster execution)
     hush     = 1 -> Suppress error messages
              = 0 -> Enable error messages (default)
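    For example:

     # Validate input data and suppress error messages
     linefit = LineFit.new(1, 1)

     # Defaults: no validation, error messages enabled
     linefit = LineFit.new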

  coefficients() - Return the slope and y intercept
     intercept, slope = linefit.coefficients

    The returned list is undefined if the regression fails.

  durbinWatson() - Return the Durbin-Watson statistic
     durbinWatson = linefit.durbinWatson

    The Durbin-Watson test is a test for first-order autocorrelation in the
    residuals of a time series regression. The Durbin-Watson statistic has a
    range of 0 to 4; a value of 2 indicates there is no autocorrelation.

    The return value is undefined if the regression fails. If weights are
    input, the return value is the weighted Durbin-Watson statistic.

  meanSqError() - Return the mean squared error
     meanSquaredError = linefit.meanSqError

    The return value is undefined if the regression fails. If weights are
    input, the return value is the weighted mean squared error.

  predictedYs() - Return the predicted y values array
     predictedYs = linefit.predictedYs

    The returned list is undefined if the regression fails.

  forecast() - Return the predicted (dependent) Y value for a given (independent) X value.
     forecasted_y = linefit.forecast(x_value)

    Uses the fitted slope and intercept to calculate the Y value on the line
    at the given X value. Note: the returned value is only as good as the
    line fit itself.
  
  regress() - Do the least squares line fit (if not already done)
     linefit.regress

    You don't need to call this method because it is invoked by the other
    methods as needed. After you call setData(), you can call regress() at
    any time to get the status of the regression for the current data.

  residuals() - Return predicted y values minus input y values
     residuals = linefit.residuals

    The returned list is undefined if the regression fails.

  rSquared() - Return the square of the correlation coefficient
     rSquared = linefit.rSquared

    R squared, also called the square of the Pearson product-moment
    correlation coefficient, is a measure of goodness-of-fit. It is the
    fraction of the variation in Y that can be attributed to the variation
    in X. A perfect fit will have an R squared of 1; fitting a line to the
    vertices of a regular polygon will yield an R squared of zero. Graphical
    displays of data with an R squared of less than about 0.1 do not show a
    visible linear trend.

    The return value is undefined if the regression fails. If weights are
    input, the return value is the weighted correlation coefficient.

  setData() - Initialize (x,y) values and optional weights
     lineFit.setData(x, y)
     lineFit.setData(x, y, weights)
     lineFit.setData(xy)
     lineFit.setData(xy, weights)

    xy is an array of arrays; x values are xy[i][0], y values are
    xy[i][1]. The method distinguishes the separate-array calling
    signatures from the combined xy signatures by examining the first
    argument.

    The optional weights array must be the same length as the data array(s).
    The weights must be non-negative numbers; at least two of the weights
    must be nonzero. Only the relative size of the weights is significant:
    the program normalizes the weights (after copying the input values) so
    that the sum of the weights equals the number of points. If you want to
    do multiple line fits using the same weights, the weights must be passed
    to each call to setData().

    The method will return false if the array lengths don't match, there are
    fewer than two data points, any weights are negative or fewer than two
    of the weights are nonzero. If the new() method was called with validate =
    1, the method will also verify that the data and weights are valid
    numbers. Once you successfully call setData(), the next call to any
    method other than new() or setData() invokes the regression.
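    For example, reusing the same weights across two fits and checking the
    return value (a sketch; the data values are made up):

     weights = [1, 1, 2, 2]

     # Weights are not retained between calls, so pass them each time
     if lineFit.setData([1, 2, 3, 4], [2, 4, 5, 7], weights)
       intercept1, slope1 = lineFit.coefficients
     end

     if lineFit.setData([1, 2, 3, 4], [3, 5, 8, 9], weights)
       intercept2, slope2 = lineFit.coefficients
     end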

  sigma() - Return the standard error of the estimate
    sigma = linefit.sigma

    Sigma is an estimate of the homoscedastic standard deviation of the
    error. Sigma is also known as the standard error of the estimate.

    The return value is undefined if the regression fails. If weights are
    input, the return value is the weighted standard error.

  tStatistics() - Return the t statistics
     tStatIntercept, tStatSlope = linefit.tStatistics

    The t statistic, also called the t ratio or Wald statistic, is used to
    accept or reject a hypothesis using a table of cutoff values computed
    from the t distribution. The t-statistic suggests that the estimated
    value is (reasonable, too small, too large) when the t-statistic is
    (close to zero, large and positive, large and negative).

    The returned list is undefined if the regression fails. If weights are
    input, the returned values are the weighted t statistics.

  varianceOfEstimates() - Return variances of estimates of intercept, slope
     varianceIntercept, varianceSlope = linefit.varianceOfEstimates

    Assuming the data are noisy or inaccurate, the intercept and slope
    returned by the coefficients() method are only estimates of the true
    intercept and slope. The varianceOfEstimates() method returns the
    variances of the estimates of the intercept and slope, respectively. See
    Numerical Recipes in C, section 15.2 (Fitting Data to a Straight Line),
    equation 15.2.9.

    The returned list is undefined if the regression fails. If weights are
    input, the returned values are the weighted variances.

SEE ALSO
     Mendenhall, W., and Sincich, T.L., 2003, A Second Course in Statistics:
       Regression Analysis, 6th ed., Prentice Hall.
     Press, W. H., Flannery, B. P., Teukolsky, S. A., Vetterling, W. T., 1992,
       Numerical Recipes in C : The Art of Scientific Computing, 2nd ed., 
       Cambridge University Press.

AUTHOR
    Eric Cline, escline(at)gmail(dot)com
    Richard Anderson

LICENSE
    This program is free software; you can redistribute it and/or modify it
    under the same terms as Ruby itself.

    The full text of the license can be found in the LICENSE file included
    in the distribution and available in the RubyForge listing for
    LineFit (see rubyforge.org).

DISCLAIMER
    To the maximum extent permitted by applicable law, the author of this
    module disclaims all warranties, either express or implied, including
    but not limited to implied warranties of merchantability and fitness for
    a particular purpose, with regard to the software and the accompanying
    documentation.
