AI Dirty Dozen 2020 Part II
There are many forces that shape the hyped AI stories we read. Media outlets are everywhere and competition is fierce, so articles with provocative headlines and content serve as clickbait for browsing consumers. We’re going to count down the AI Dirty Dozen: the top twelve hyped AI stories of 2020. Join Dr. Robert J. Marks as he discusses some of these stories with Jonathan Bartlett and Dr. Eric Holloway. Listen to Part I here.
Show Notes
- 00:30 | Introducing Jonathan Bartlett
- 00:38 | Introducing Dr. Eric Holloway
- 01:25 | #8: “Is AI really better than physicians at diagnosis?” (Mind Matters News)
- 05:07 | #7: “Learning to Simulate Dynamic Environments with GameGAN” (GameGAN)
- 09:03 | #6: “GPT-3 Is ‘Mindblowing’ If You Don’t Question It Too Closely” (Mind Matters News), “Built to Save Us from Evil AI, OpenAI Now Dupes Us” (Mind Matters News), “There’s a subreddit populated entirely by AI personifications of other subreddits” (The Verge), and “Bot posing as human fooled people on Reddit for an entire week” (The Independent)
- 16:03 | #5: “Lack of Sleep Could Be a Problem for AIs” (Scientific American)
Additional Resources
- Jonathan Bartlett at Discovery.org
- Eric Holloway at Discovery.org
- #8: “Is AI really better than physicians at diagnosis?” (Mind Matters News)
- #7: “Learning to Simulate Dynamic Environments with GameGAN” (GameGAN)
- #6: “GPT-3 Is ‘Mindblowing’ If You Don’t Question It Too Closely” (Mind Matters News), “Built to Save Us from Evil AI, OpenAI Now Dupes Us” (Mind Matters News), “There’s a subreddit populated entirely by AI personifications of other subreddits” (The Verge), and “Bot posing as human fooled people on Reddit for an entire week” (The Independent)
- #5: “Lack of Sleep Could Be a Problem for AIs” (Scientific American)