Ted Gibson: The Structure and Purpose of Language

2:13:24
 
Content provided by The Gradient. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by The Gradient or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://uk.player.fm/legal.

In episode 107 of The Gradient Podcast, Daniel Bashir speaks to Professor Ted Gibson.

Ted is a Professor of Cognitive Science at MIT. He leads the TedLab, which investigates why languages look the way they do; the relationship between culture and cognition, including language; and how people learn, represent, and process language.

Have suggestions for future podcast guests (or other feedback)? Let us know here or reach us at editor@thegradient.pub

Subscribe to The Gradient Podcast: Apple Podcasts | Spotify | Pocket Casts | RSS

Follow The Gradient on Twitter

Outline:

* (00:00) Intro

* (02:13) Prof Gibson’s background

* (05:33) The computational linguistics community and NLP, engineering focus

* (10:48) Models of brains

* (12:03) Prof Gibson’s focus on behavioral work

* (12:53) How dependency distances impact language processing

* (14:03) Dependency distances and the origin of the problem

* (18:53) Dependency locality theory

* (21:38) The structures languages tend to use

* (24:58) Sentence parsing: structural integrations and memory costs

* (36:53) Reading strategies vs. ordinary language processing

* (40:23) Legalese

* (46:18) Cross-dependencies

* (50:11) Number as a cognitive technology

* (54:48) Experiments

* (1:03:53) Why counting is useful for Western societies

* (1:05:53) The Whorf hypothesis

* (1:13:05) Language as Communication

* (1:13:28) The noisy channel perspective on language processing

* (1:27:08) Fedorenko lab experiments—language for thought vs. communication and Chomsky’s claims

* (1:43:53) Thinking without language, inner voices, language processing vs. language as an aid for other mental processing

* (1:53:01) Dependency grammars and a critique of Chomsky’s grammar proposals, LLMs

* (2:08:48) LLM behavior and internal representations

* (2:12:53) Outro

Links:

* Ted’s lab page and Twitter

* Re-imagining our theories of language

* Research — linguistic complexity and dependency locality theory

* Linguistic complexity: locality of syntactic dependencies (1998)

* The Dependency Locality Theory: A Distance-Based Theory of Linguistic Complexity (2000)

* Consequences of the Serial Nature of Linguistic Input for Sentential Complexity (2005)

* Large-scale evidence of dependency length minimization in 37 languages (2015)

* Dependency locality as an explanatory principle for word order (2020)

* Robust effects of working memory demand during naturalistic language comprehension in language-selective cortex (2022)

* A resource-rational model of human processing of recursive linguistic structure (2022)

* Research — language processing / communication and cross-linguistic universals

* Number as a cognitive technology: Evidence from Pirahã language and cognition (2008)

* The communicative function of ambiguity in language (2012)

* The rational integration of noisy evidence and prior semantic expectations in sentence interpretation (2013)

* Color naming across languages reflects color use (2017)

* How Efficiency Shapes Human Language (2019)


Get full access to The Gradient at thegradientpub.substack.com/subscribe