Shannon Vallor, "The AI Mirror: How to Reclaim Our Humanity in an Age of Machine Thinking" (Oxford UP, 2024)
Manage episode 428131470 series 2421470
There's a lot of talk these days about the existential risk that artificial intelligence poses to humanity -- that somehow the AIs will rise up and destroy us or become our overlords.
In The AI Mirror: How to Reclaim Our Humanity in an Age of Machine Thinking (Oxford UP), Shannon Vallor argues that the actual, and very alarming, existential risk of AI that we face right now is quite different. Because some AI technologies, such as ChatGPT and other large language models, can closely mimic the outputs of an understanding mind without having actual understanding, the technology can encourage us to surrender the activities of thinking and reasoning. This poses the risk of diminishing our ability to respond to challenges and to imagine and bring about different futures. In her compelling book, Vallor, who holds the Baillie Gifford Chair in the Ethics of Data and Artificial Intelligence at the University of Edinburgh's Edinburgh Futures Institute, critically examines AI doomerism and longtermism, the nature of AI in relation to human intelligence, and the technology industry's hand in diverting our attention from the serious risks we face.
Learn more about your ad choices. Visit megaphone.fm/adchoices
Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/technology
898 episodes