No one needs or wants a book about AI risk – Grey Enlightenment
Summary by greyenlightenment.com
In an upcoming book co-authored with Nate Soares, Eliezer Yudkowsky doesn’t pull any punches, promising in the title that if anyone builds superhuman AI, ‘everyone will die.’ But it’s more likely that, rather than everyone dying, this book will end up in the dustbin of history. Like most non-fiction books, its putative objective is to raise awareness and serve as a call to action, in this case about AI risk. However, two things: the public …