@r000t normal people should just embrace the plausible deniability the AI fakes give. photography is essentially worthless as evidence now. video will soon be worthless too. instead they choose to be triggered that someone might do pr0n with all the pictures they posted out of vanity for the whole fucking world to see, since they have zero impulse control.
legislators want AI gone because of the implications for evidence. they want the fun tools to create incriminating shit to jail anyone who's a bother.
@r000t yip.
maybe people are just unsettled by the thought of someone rubbing one out to fake nudes? which is a bit misguided, as in all likelihood someone is doing that anyway, fake nudes or not.
@bonifartius It's very telling that people are more concerned about fappery than being framed for a crime and jailed or put to death.
It's the normalcy bias. Nobody ever thinks it'll happen to them.
@r000t how anyone still can think "will not happen" after the last few years is a mystery to me :)
@WALFTEAM
They're just preparing for the dimensional merge imo
@bonifartius
@bonifartius ^^^^^^
The response to this tech is to breathe a sigh of relief. If actual nudes of you leak, you can point and say "oh, those darned kids and their AI porn machine"