Paul Christiano's important views on AI go mainstream! (arstechnica.com)
from dgerard@awful.systems to sneerclub@awful.systems on 19 Apr 2024 10:51
https://awful.systems/post/1384354

#sneerclub


skillissuer@discuss.tchncs.de on 19 Apr 2024 16:26

In that post, Christiano breaks down what he estimates are the probabilities of an AI takeover (22 percent), that “most” humans will die “within 10 years of building powerful AI” that makes labor obsolete (20 percent), and that “humanity has somehow irreversibly messed up our future within 10 years of building powerful AI” (46 percent).

He clarified that these probabilities are only intended “to quantify and communicate what I believe, not to claim I have some kind of calibrated model that spits out these numbers.” He said these numbers are basically guesses that often change depending on new information that he receives.

oh yeah i think there’s exactly a 37% chance of chatgpt turning earth into the matrix, but don’t worry too hard about these numbers, i pulled them out of my ass for dramatic effect

long long time ago, in the first year of our lord covid, i read an article where an anthropologist looked at how innumerate people (is that a word? the point is not just people bad at math, it’s people bad at math to such a degree that it impairs their understanding of the world) and specifically conspiracy theorists relate to numbers. the point is, they looked at numbers the way illiterate people looked at writing: numbers mostly transmit vibes of knowledge, accuracy and certainty while being none of these. i probably won’t be able to find it anymore

skillissuer@discuss.tchncs.de on 19 Apr 2024 16:53

ars technica forum aggressively doesn’t get how critihype works