Ryan Tibshirani: Statistics, Nonparametric Regression, Conformal Prediction

Episode 121

I spoke with Professor Ryan Tibshirani about:

* Differences between the ML and statistics communities in scholarship, terminology, and other areas

* Trend filtering (a rough sketch of the estimator appears after this list)

* Why you can’t just use garbage prediction functions when doing conformal prediction (see the conformal sketch after this list)
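
To make the trend filtering discussion concrete, here is a rough sketch (not code from the episode) of first-order ℓ1 trend filtering, which penalizes second differences of the fitted values to produce piecewise-linear estimates. The synthetic data, the choice of λ, and the use of cvxpy as a generic solver are all illustrative assumptions.

```python
import numpy as np
import cvxpy as cp

# Hypothetical noisy signal whose slope changes at x = 0.5.
rng = np.random.default_rng(0)
n = 200
x = np.linspace(0, 1, n)
truth = np.where(x < 0.5, 2 * x, 1.0 - 0.5 * (x - 0.5))
y = truth + 0.05 * rng.normal(size=n)

# Second-difference operator; an l1 penalty on D @ beta encourages
# piecewise-linear estimates (trend filtering of order k = 1).
D = np.diff(np.eye(n), n=2, axis=0)

lam = 1.0  # illustrative tuning parameter, not a recommended value
beta = cp.Variable(n)
objective = cp.Minimize(0.5 * cp.sum_squares(y - beta) + lam * cp.norm1(D @ beta))
cp.Problem(objective).solve()

fitted = beta.value  # adapts to the kink without a single global smoothness choice
```

The 2014 trend filtering paper linked below develops this estimator and its adaptivity properties in depth, with specialized algorithms rather than a generic solver.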
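
And to illustrate that last point: split conformal prediction wraps around any fitted predictor and retains its marginal coverage guarantee under exchangeability, but a poor predictor shows up as wide, uninformative intervals. This is a minimal sketch with made-up data and predictors, not material from the episode.

```python
import numpy as np

def split_conformal_interval(predict, X_cal, y_cal, X_test, alpha=0.1):
    """Split conformal prediction with absolute-residual scores.

    Coverage of roughly 1 - alpha holds for any fixed predictor under
    exchangeability; the predictor's quality only affects interval width.
    """
    scores = np.abs(y_cal - predict(X_cal))  # calibration residuals
    n = len(scores)
    # Finite-sample-corrected quantile of the calibration scores.
    q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")
    preds = predict(X_test)
    return preds - q, preds + q

# Toy comparison: a sensible predictor vs. a "garbage" one on synthetic data.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=2000)
y = np.sin(X) + 0.2 * rng.normal(size=X.shape)
X_cal, y_cal = X[:1000], y[:1000]
X_test, y_test = X[1000:], y[1000:]

predictors = {
    "good": np.sin,                         # close to the truth
    "garbage": lambda x: np.zeros_like(x),  # ignores the inputs entirely
}
for name, f in predictors.items():
    lo, hi = split_conformal_interval(f, X_cal, y_cal, X_test)
    coverage = np.mean((y_test >= lo) & (y_test <= hi))
    print(f"{name}: coverage {coverage:.2f}, average width {np.mean(hi - lo):.2f}")
```

Both predictors should land near 90% coverage; the width of the intervals is what exposes the garbage one.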

Ryan is a Professor in the Department of Statistics at UC Berkeley and a Principal Investigator in the Delphi group. From 2011 to 2022, he was a faculty member in Statistics and Machine Learning at Carnegie Mellon University, and from 2007 to 2011 he did his Ph.D. in Statistics at Stanford University.

Reach me at editor@thegradient.pub with feedback, ideas, and guest suggestions.

The Gradient Podcast on: Apple Podcasts | Spotify | Pocket Casts | RSS

Follow The Gradient on Twitter

Outline:

* (00:00) Intro

* (01:10) Ryan’s background and path into statistics

* (07:00) Cultivating taste as a researcher

* (11:00) Conversations within the statistics community

* (18:30) Use of terms, disagreements over stability and definitions

* (23:05) Nonparametric Regression

* (23:55) Background on trend filtering

* (33:48) Analysis and synthesis frameworks in problem formulation

* (39:45) Neural networks as a specific take on synthesis

* (40:55) Divided differences, falling factorials, and discrete splines

* (41:55) Motivations and background

* (48:07) Divided differences vs. derivatives, approximation and efficiency

* (51:40) Conformal prediction

* (52:40) Motivations

* (1:10:20) Probabilistic guarantees in conformal prediction, choice of predictors

* (1:14:25) Assumptions: i.i.d. and exchangeability — conformal prediction beyond exchangeability

* (1:25:00) Next directions

* (1:28:12) Epidemic forecasting — COVID-19 impact and trends survey

* (1:29:10) Survey methodology

* (1:38:20) Data defect correlation and its limitations for characterizing datasets

* (1:46:14) Outro

Links:

* Ryan’s homepage

* Works read/mentioned

* Nonparametric Regression

* Adaptive Piecewise Polynomial Estimation via Trend Filtering (2014)

* Divided Differences, Falling Factorials, and Discrete Splines: Another Look at Trend Filtering and Related Problems (2020)

* Distribution-free Inference

* Distribution-Free Predictive Inference for Regression (2017)

* Conformal Prediction Under Covariate Shift (2019)

* Conformal Prediction Beyond Exchangeability (2023)

* Delphi and COVID-19 research

* Flexible Modeling of Epidemics

* Real-Time Estimation of COVID-19 Infections

* The US COVID-19 Trends and Impact Survey

* Big data, big problems: Responding to “Are we there yet?”

Get full access to The Gradient at thegradientpub.substack.com/subscribe
