## Nonparametric Regression Analysis

Regression analysis, broadly construed, traces the conditional distribution
-- or some aspect of the conditional distribution, such as its mean -- of a
dependent variable (Y) as a function of one or more independent variables
(X's). As it is usually applied, however, regression analysis is much more
restrictive -- assuming that the mean of Y is a linear function of the X's,
that the conditional distribution of Y given the X's is a normal
distribution, and that the conditional variance of Y is constant. These
assumptions lead naturally to linear least-squares estimation.
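To make these assumptions concrete, the usual normal linear model can be written as follows (the notation here is standard but my own, not taken from the course materials):

```latex
y_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_k x_{ik} + \varepsilon_i,
\qquad \varepsilon_i \sim N(0, \sigma^2) \text{ independently}
```

Nonparametric regression relaxes the first of these assumptions, replacing the linear function of the X's with an unspecified smooth function to be estimated from the data.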
Nonparametric regression makes minimal assumptions about the dependence of
the average Y on the X's. This short course will introduce nonparametric
regression estimators both for simple-regression analysis (a single X) --
also called scatterplot smoothers -- and for multiple-regression analysis
(several X's). I will describe naive binning estimators, kernel (locally
weighted averaging) estimators, local-polynomial ("lowess") estimators, and
additive nonparametric regression models. There will be some consideration
of methods of statistical inference for nonparametric regression, analogous
to the methods employed for linear least squares.
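The kernel estimator mentioned above can be sketched in a few lines. The following is a minimal illustration, not code from the course: it computes a locally weighted average of Y at each focal value of X, with Gaussian weights that decay as X moves away from the focal point (the function names, bandwidth choice, and simulated data are all my own for illustration):

```python
import numpy as np

def gaussian_kernel(u):
    # Standard-normal density used as the weight function
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def kernel_smooth(x, y, x0, bandwidth):
    # Nadaraya-Watson kernel estimator: a weighted average of the y's,
    # with weights determined by the distance of each x from the focal x0
    w = gaussian_kernel((x - x0) / bandwidth)
    return np.sum(w * y) / np.sum(w)

# Simulated scatterplot data: a smooth curve plus noise
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.3, size=x.size)

# Evaluate the smoother on a grid of focal values
grid = np.linspace(0.5, 2 * np.pi - 0.5, 50)
fit = np.array([kernel_smooth(x, y, x0, bandwidth=0.5) for x0 in grid])
```

The bandwidth plays the role that the bin width plays for the naive binning estimator: a small bandwidth tracks the data closely but is rough, while a large bandwidth gives a smoother but more biased fit.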

A background in linear least-squares regression will be assumed. Most of
the material will be presented at an elementary mathematical and statistical
level. Some of the material on statistical inference requires knowledge of
matrix algebra. I will limit this more demanding material to the last lecture.