Modern Regression and Classification - DC

Trevor Hastie (trevor@stat.Stanford.EDU)
Tue, 20 Jan 1998 21:45:50 -0800 (PST)


++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+++                                                        +++
+++         Modern Regression and Classification:          +++
+++                                                        +++
+++         Widely applicable statistical methods          +++
+++              for modeling and prediction               +++
+++                                                        +++
+++            Washington DC: April 6-7, 1998              +++
+++                                                        +++
+++          Trevor Hastie, Stanford University            +++
+++         Rob Tibshirani, University of Toronto          +++
+++                                                        +++
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

This two-day course will give a detailed overview of statistical
models for regression and classification. Known as machine learning
in computer science and artificial intelligence, and as pattern
recognition in engineering, this is a hot field with powerful
applications in finance, science, and industry.

This course covers a wide range of models, from linear regression
through various classes of more flexible models to fully
nonparametric methods, for both regression and classification.

Although a firm theoretical motivation will be presented, the emphasis
will be on practical applications and implementations. The course
will include many examples and case studies, and participants should
leave the course well-armed to tackle real problems with realistic
tools. The instructors are at the forefront of research in this area.

After a brief overview of linear regression tools, methods for
one-dimensional and multi-dimensional smoothing are presented, as well
as techniques that assume a specific structure for the regression
function. These include splines, wavelets, additive models, MARS
(multivariate adaptive regression splines), projection pursuit
regression, neural networks and regression trees. All of these can be
adapted to the time-series framework for predicting future trends from
the past.
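To fix ideas, here is a minimal sketch of one-dimensional smoothing
with a cubic smoothing spline. Python with numpy and scipy is assumed
purely for illustration (it is not the course software), and the data
are simulated.

    import numpy as np
    from scipy.interpolate import UnivariateSpline

    # Simulated noisy observations of a smooth signal.
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0, 10, 100))
    y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

    # The smoothing parameter s controls the bias/variance tradeoff:
    # small s tracks the data closely, large s gives a smoother fit.
    spline = UnivariateSpline(x, y, k=3, s=x.size * 0.3**2)
    y_hat = spline(x)   # fitted values on the observed grid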

The same hierarchy of techniques is available for classification
problems. Classical tools such as linear discriminant analysis and
logistic regression can be enriched to account for nonlinearities and
interactions. Generalized additive models and flexible discriminant
analysis, neural networks and radial basis functions, classification
trees and kernel estimates are all such generalizations. Other
specialized techniques for classification, including nearest-neighbor
rules and learning vector quantization, will also be covered.
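As a small illustration of this hierarchy, the sketch below fits a
classical linear classifier (linear discriminant analysis), logistic
regression, and a nearest-neighbor rule to the same simulated
two-class data. scikit-learn is assumed here only for illustration.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import KNeighborsClassifier

    # Two Gaussian classes in two dimensions.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
    y = np.repeat([0, 1], 50)

    for clf in (LinearDiscriminantAnalysis(),
                LogisticRegression(),
                KNeighborsClassifier(n_neighbors=5)):
        clf.fit(X, y)
        print(type(clf).__name__, clf.score(X, y))   # training accuracy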

Apart from describing these techniques and their applications to a
wide range of problems, the course will also cover model selection
techniques, such as cross-validation and the bootstrap, and diagnostic
techniques for model assessment.
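For example, cross-validation estimates out-of-sample performance by
repeatedly fitting on part of the data and scoring on the held-out
part; the sketch below tunes a ridge penalty that way. scikit-learn
and the simulated data are again assumptions made only to illustrate
the idea.

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score

    # Simulated linear data with a few irrelevant predictors.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    coef = np.array([1.0, 0.5, 0.0, 0.0, -0.5])
    y = X @ coef + rng.normal(scale=0.5, size=100)

    # 5-fold cross-validation for each candidate penalty; choose the
    # value with the best average held-out score.
    for alpha in (0.01, 0.1, 1.0, 10.0):
        scores = cross_val_score(Ridge(alpha=alpha), X, y, cv=5)
        print(alpha, scores.mean())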

Software for these techniques will be illustrated, and a comprehensive
set of course notes will be provided to each attendee.

Additional information is available at the Website:

http://stat.stanford.edu/~trevor/mrc.general.html

************************************************************
Some quotes from past attendees:

"... the best presentation by professional statisticians I have
ever had the pleasure of attending"
"Superior to most courses in all aspects"
"I really liked how you emphasized concepts rather than
mathematical expressions"
"Your 2-day course has saved me months of research"
*************************************************************

COURSE OUTLINE

DAY ONE:

Overview of regression methods: Linear regression models and least
squares. Ridge regression and the "lasso". Flexible linear models
and basis function methods. Linear and nonlinear smoothers; kernels,
splines, and wavelets. Bias/variance tradeoff; cross-validation and
bootstrap. Smoothing parameters and the effective number of
parameters. Nonlinear and adaptive time series methods. Surface
smoothers.
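The ridge/lasso contrast in this outline can be previewed in a few
lines: both methods shrink the least-squares coefficients, but the
lasso can set some of them exactly to zero. The sketch below uses
scikit-learn and simulated data purely as an illustration.

    import numpy as np
    from sklearn.linear_model import LinearRegression, Ridge, Lasso

    # Simulated data with a sparse true coefficient vector.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))
    beta = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0])
    y = X @ beta + rng.normal(size=200)

    # Least squares keeps all coefficients; ridge shrinks them toward
    # zero; the lasso shrinks some of them all the way to zero.
    for model in (LinearRegression(), Ridge(alpha=5.0), Lasso(alpha=0.1)):
        model.fit(X, y)
        print(type(model).__name__, np.round(model.coef_, 2))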

++++++++

Structured Nonparametric Regression: Problems with high-dimensional
smoothing. Structured high-dimensional regression: additive models,
projection pursuit regression, CART, MARS, radial basis functions,
neural networks. Applications to time series forecasting.
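As one concrete example of a structured nonparametric fit, a
regression tree (CART) partitions the predictor space into boxes and
fits a constant in each; the depth of the tree controls its
flexibility. The sketch below is illustrative only, again assuming
scikit-learn and simulated data.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    # Simulated two-dimensional regression surface with noise.
    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, size=(300, 2))
    y = np.sin(X[:, 0]) * np.cos(X[:, 1]) + rng.normal(scale=0.1, size=300)

    # max_depth limits the number of splits, i.e. the model's flexibility.
    tree = DecisionTreeRegressor(max_depth=4)
    tree.fit(X, y)
    print("training R^2:", round(tree.score(X, y), 3))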

DAY TWO:

Classification: Statistical decision theory and classification
rules. Linear procedures: linear discriminant analysis and logistic
regression. Quadratic discriminant analysis and parametric models.
Nearest-neighbor classification, K-means and LVQ. Adaptive
nearest-neighbor methods.

++++++++

The discrete choice model. Nonparametric classification:
classification trees (CART), flexible/penalized discriminant
analysis, multiple logistic regression models and neural networks,
kernel methods.

THE INSTRUCTORS

Professor Trevor Hastie of the Statistics and Biostatistics
Departments at Stanford University was formerly a member of the
Statistics and Data Analysis Research group at AT&T Bell
Laboratories. He co-authored with Tibshirani the monograph Generalized
Additive Models (1990), published by Chapman and Hall, and has many
research articles in the area of nonparametric regression and
classification. He also co-edited the Wadsworth book Statistical
Models in S (1991) with John Chambers.

Professor Robert Tibshirani of the Statistics and Biostatistics
Departments at the University of Toronto is the most recent recipient
of the COPSS award, an award given jointly by all the leading
statistical societies to the most outstanding statistician under the
age of 40. He also has many research articles on nonparametric
regression and classification. With Bradley Efron he co-authored the
best-selling text An Introduction to the Bootstrap in 1993, and has
been an active researcher on bootstrap technology for the past 12
years.

Both Prof. Hastie and Prof. Tibshirani are actively involved in
research in modern regression and classification and are well known
not only in the statistics community but in the machine-learning and
neural network fields as well. They have given many short courses
together on classification and regression procedures to a wide variety
of academic, government and industrial audiences. These include the
American Statistical Association and Interface meetings, the NATO ASI
Neural Networks and Statistics workshop, AI and Statistics, and the
Canadian Statistical Society meetings.

April 6-7, 1998
Georgetown University Marriott Conference Center
3800 Reservoir Road N.W.
Washington, D.C. 20057
Phone: (202) 687-3242
FAX: (202) 687-3310

To make room reservations, call the hotel directly. Some rooms have
been blocked off at a special rate for this function.

PRICE: $1200 per attendee; a discounted price of $950 is available
for academic and non-profit organizations. Cancellation policy: if
notification is received by March 4, a full refund will be given;
from March 4 to March 23, a 20% administration fee will be charged;
after March 23, refunds are at the discretion of the instructors. A
substitute delegate is always welcome at no extra charge. Attendance
is limited to the first 60 applicants, so sign up soon! These courses
fill up quickly.

TO REGISTER:

Please print this form, and fill in the hard copy to return by postal
mail or FAX.

Registration by March 4 recommended to ensure a spot.

Modern Regression and Classification:
Widely applicable statistical methods
for modeling and prediction

Monday, April 6 and Tuesday, April 7, 1998.
Georgetown University Marriott Conference Center

Please complete this form (type or print)

Name ___________________________________________________
Last First Middle

Firm or Institution ______________________________________

Standard Registration ____

Mailing Address (for receipt) _________________________

__________________________________________________________

__________________________________________________________

__________________________________________________________
Country Phone FAX

__________________________________________________________
email address

__________________________________________ _______________
Credit card # (if payment by credit card) Expiration Date

(Lunch Menu - tick as appropriate):


___ Vegetarian ___ Non-Vegetarian

Fee payment must be made by MONEY ORDER, PERSONAL CHECK, VISA or
MASTERCARD. All amounts must be in US dollars. Make the fee payable
to Prof. Trevor Hastie, and mail it, together with this completed
Registration Form, to:

Prof. T. Hastie
538 Campus Drive
Stanford CA 94305
U.S.A.
FAX: 415-326-0854

ALL CREDIT CARD REGISTRATIONS MUST INCLUDE BOTH CARD NUMBER AND
EXPIRATION DATE.
DO NOT SEND CASH.

Registration fee includes Course Materials, coffee breaks,
and lunch both days.

If you have further questions, email to trevor@stat.stanford.edu
or tibs@utstat.toronto.edu

--------------------------------------------------------------------
Trevor Hastie trevor@stat.stanford.edu
Phone: 415-725-2231 Fax: 415-725-8977
ftp://stat.stanford.edu/pub/hastie/
http://stat.stanford.edu/~trevor
paper: Statistics Department, Stanford University, CA 94305
office: Margaret Jacks Hall, rm 362
--------------------------------------------------------------------