PortaJohn


Fuck that noise. The UN shouldn't even exist, much less any of its staff members having "full diplomatic immunity from prosecution" anywhere in the world.

Did I happen to say "fuck that noise"?
 
AI May Be Faking Stupidity to Take Control of Us

Yampolskiy, a computer scientist with over a decade of AI risk research, told Rogan that many industry leaders privately estimate a 20-30% chance AI could wipe out humanity.

Rogan summarized the common optimistic view: AI could make life easier, cheaper, and better. But Yampolskiy disagreed sharply: “It’s actually not true. All of them are on the record the same: this is going to kill us. Their doom levels are insanely high. Not like mine, but still, 20 to 30 percent chance that humanity dies is a lot.”

Rogan, sounding uneasy, replied: “Yeah, that’s pretty high. But yours is like 99.9 percent.”

Yampolskiy didn’t argue. “It’s another way of saying we can’t control superintelligence indefinitely. It’s impossible.”



Based on the criteria they used, all terrorist groups would appear to be eligible.

The judges said that they possessed the "characteristics" of a nationality, which according to the Convention include belonging to a "group determined by its cultural, ethnic or linguistic identity, common geographical or political origins or its relationship with the population of another state."

 
Yep, there are going to be some really big problems with AI.

I should probably use a different term to describe this, but I just "love" it when a pundit starts pimping AI and pontificating about how "great" it is. Absolutely no self- or situational awareness of the high probability of everything "AI" going sideways in a big way.

Kinda like COVID. Unfortunately, we're all joined at the hip, and we can't/don't control those who develop and introduce these types of things. We just get to deal with all of the negative fallout.