the-real-numbers-deactivated202

I appreciate the desire to do good better. With respect to EA priorities, though, "AI Alignment" is pretty ludicrous to me. I really don't understand why an unrealistically superintelligent AI developing an uncontrollable capability for harm is considered a realistic threat, and nobody can give me an answer that isn't vague science fiction or a sanctimonious scolding. The subject of alignment could have changed, but last I checked it seemed pretty concerned with GAI and other far-out scenarios. I personally believe a lot of the GAI stuff is science-fiction anxiety driving Pascal's wager.

And there seem to be many real examples of far stupider machine learning algorithms being carelessly placed in a position to do harm. If there's any ML area of genuine concern, it's AI Fairness, something EAs in general tend to enjoy dunking on because it's not as grandiose as their pet projects, or because "it's full of wokescolds," or whatever. It's a terrible look. It makes me wonder if they're actually equipped to fairly judge long-term threats.

fruityyamenrunner

the GAI they are afraid of is a psychological extrapolation of their own striving - they are relentlessly self-improving bourgeoises who are being chased by a hyena with multiple very parental looking heads asking them why they aren't making even more [loud superimposed string of phonemes].

someone i think here posted about how becoming initiated into amphetamine usage was an important transformation that let them identify an eigenvector in their hellvectorspace that remained pointed to [loud phoneme of desire], but there are other similar transformations too like "getting a better programming job", "networking with the bay area mafia", "writing some software that automates and optimises a process".

attempting to put a combination of all of these transformations together into a single entity, which will be the perfect entity that will make professor mother therapist rabbi general hyena happy at its ability to obtain [loud phoneme catastrophe], ends up looking like "a general artificial intelligence", whatever that means.

argumate

*nodding wisely* oh nobody will like this

the-real-numbers-deactivated202

the prose poets have logged the fuck on

kontextmaschine

Oh I mostly understood it functionally, as a legitimating myth for spinning up a tech-autist equivalent of the “nonprofit industrial complex” to address elite overproduction and the unevenly distributed wealth of a startup economy