# Loess

*Local regression, so smooooth!*

This is a pure Julia implementation of loess (local regression), based on the fast kd-tree approximation described in the original Cleveland et al. papers, implemented in the netlib loess C/Fortran code, and used by many packages, including R's `loess` function.
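To make the idea concrete, here is a minimal, naive sketch of what loess computes at each query point: weight the nearby observations with the tricube kernel, then fit a small weighted least-squares line. This is purely illustrative (the function and its names are hypothetical); the package itself uses the much faster kd-tree approximation mentioned above.

```julia
# Naive O(n^2) illustration of degree-1 local regression with tricube
# weights. Loess.jl uses a fast kd-tree approximation instead; this
# function and its names are purely illustrative.
function naive_loess(xs, ys, us; span=0.75)
    n = length(xs)
    q = max(2, ceil(Int, span * n))  # number of neighbors in each local fit
    map(us) do u
        d = abs.(xs .- u)
        h = sort(d)[q]               # bandwidth: distance to the q-th neighbor
        w = (1 .- clamp.(d ./ h, 0, 1) .^ 3) .^ 3  # tricube weights
        X = [ones(n) xs]
        b = (X' * (w .* X)) \ (X' * (w .* ys))     # weighted least squares
        b[1] + b[2] * u              # evaluate the local line at u
    end
end
```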

## Synopsis

Loess exports two functions, `loess` and `predict`, which fit a model and apply it to new data, respectively.

```julia
using Loess

xs = 10 .* rand(100)
ys = sin.(xs) .+ 0.5 .* rand(100)

model = loess(xs, ys)

us = collect(minimum(xs):0.1:maximum(xs))
vs = predict(model, us)
```
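The fit can be tuned. In the released package, `loess` also accepts keyword arguments, notably `span`, the fraction of the data used in each local fit (this is an assumption about the installed version; check `?loess`):

```julia
# Smaller spans follow the data more closely; larger spans smooth more.
# The `span` keyword is assumed available in the installed Loess version.
wiggly = loess(xs, ys, span=0.25)
smooth = loess(xs, ys, span=0.75)
```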

```julia
using Gadfly
p = plot(x=xs, y=ys, Geom.point, Guide.xlabel("x"), Guide.ylabel("y"),
         layer(Geom.line, x=us, y=vs))
draw(SVG("loess.svg", 6inch, 3inch), p)
```

![Example Plot](loess.svg)

There's also a shortcut in Gadfly to draw these plots:

```julia
plot(x=xs, y=ys, Geom.point, Geom.smooth, Guide.xlabel("x"), Guide.ylabel("y"))
```
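Gadfly's `Geom.smooth` uses a loess fit by default, and recent Gadfly versions let you adjust it inline via `method` and `smoothing` arguments (an assumption about the installed Gadfly version):

```julia
# method=:loess is Gadfly's default; smoothing plays the role of span.
# Both arguments are assumed present in the installed Gadfly version.
plot(x=xs, y=ys, Geom.point,
     Geom.smooth(method=:loess, smoothing=0.5),
     Guide.xlabel("x"), Guide.ylabel("y"))
```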

## Status

Multivariate regression is not yet fully implemented, but most of the parts are already in place, so finishing it shouldn't require much additional work.