A lot of discussion on how AI-generated art will impact the artist community and art-making in general. I'm encouraged to see artists starting to organize around what happens next.
It will be a difficult near-future for anyone who doesn't want anything to do with these technologies. They're pretty solidly as transformative as the synthesizer, if not the camera. And they're pretty irrevocably here. So what happens next?
One angle I've seen some convincing presentations on is challenging the legality of how these tools were constructed. I think that's a pretty vertical cliff to scale (it's hard to draw a clean line that kills this tech without also killing other transformative uses of other people's work that are already accepted as non-infringing of copyright).
For the sake of argument, I'm going to assume that such legal challenges carry the day and Stable Diffusion / OpenAI are dismantled / banned from any legal use.
What I need people to understand is it won't be enough, and the next part is much harder.
There is a problem that I call the Balder problem. There is, I'm sure, a better term for it in the literature around collective labor action, but I cop to my ignorance of the space. In Norse legend, when Balder was slain, an opportunity was given for him to come back to the realm of the living... If every living being expressed their sorrow at his passing. One single giant, Thokk, did not. As a result, Balder would be kept in the underworld until Ragnarok.
If the current set of tools are found to be illegally-generated, the *next* set won't be. It will be more than worth it for every major media and advertising company to pay scads of cash to a small subset of artists to generate seed material for their own diffusers. The big ones could *easily* afford to make a hundred or even a thousand artists *lifetime-career-rich* for exclusive rights to their art.
The only way to stop that front is for every visual artist out there to say no to that money unless the money comes with solid guarantees that the livelihood of every artist is preserved.
That's a hell of a temptation, and it will be dangled in front of every visual artist in the world. I'm an optimist, but the odds of all artists standing together with one voice and demanding a right to survive on their talent that doesn't exclude other artists are poor.
But if there's any one demographic, in this country at least, that I'd have the highest hope of pulling this off, it's them. My optimism knows no bounds.
@lauren RealID is rapidly becoming the IPv6 of real-world authentication.
@harpaa01 I don't follow. Self-driving car research is very much alive and ongoing.
Spent a good ol' chunk of the day chasing down the fact that when you set camera zoom in the Phaser game engine, it doesn't immediately update its transform matrix, so `getWorldPoint` doesn't give the right answer.
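For what it's worth, the underlying math is simple even when the engine's cached matrix is stale. Here's a minimal sketch of the screen-to-world conversion for a Phaser-style camera, assuming no rotation and zoom applied about the viewport center; the `CameraState` shape and `screenToWorld` helper are my own illustration, not Phaser's API.

```typescript
// A hypothetical, stripped-down camera state (not Phaser's actual class).
interface CameraState {
  scrollX: number;
  scrollY: number;
  zoom: number;
  width: number;   // viewport width in pixels
  height: number;  // viewport height in pixels
}

// Invert the camera transform by hand: un-scale about the viewport
// center, then offset by the camera scroll.
function screenToWorld(
  cam: CameraState,
  screenX: number,
  screenY: number
): { x: number; y: number } {
  const halfW = cam.width / 2;
  const halfH = cam.height / 2;
  return {
    x: (screenX - halfW) / cam.zoom + halfW + cam.scrollX,
    y: (screenY - halfH) / cam.zoom + halfH + cam.scrollY,
  };
}

// With zoom = 2 and no scroll, the viewport center is a fixed point.
const cam: CameraState = { scrollX: 0, scrollY: 0, zoom: 2, width: 800, height: 600 };
console.log(screenToWorld(cam, 400, 300)); // center maps to itself
console.log(screenToWorld(cam, 0, 0));     // corners pull in toward center
```

Computing it yourself like this sidesteps the stale-matrix window entirely, at the cost of re-deriving state the engine already holds.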
Game engines are great, but they don't remove the need for good old-fashioned debugging.
Twaining an AI to undewstand what "harmful information" is turns out to be reawwy hard, UwU.
@jleedev Weirdly, it does that with other dates such as January 32nd.
@lauren I need to populate my acronym dictionary with the actual definition, so that my brain stops mapping "TASS" to "Tool-Assisted Speedrun Speedrun."
(That, of course, is the game of seeing how quickly you can write a tool to build tool-assisted speedruns for a game. Two sub-categories: 100% for writing your tool starting from a blank code document, and any% for configuring an existing TAS tool).
@joe_no_body Deal, so long as what it does is correct Nov. 31 to December 1st. ;)
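(JavaScript's `Date` constructor, as it happens, already does exactly that: out-of-range day values roll over into the next month instead of erroring.)

```typescript
// Month index 10 = November, which has 30 days, so "Nov 31" rolls
// over to December 1st.
const nov31 = new Date(2023, 10, 31);
console.log(nov31.toDateString()); // December 1, 2023

// Same rollover for "January 32nd".
const jan32 = new Date(2023, 0, 32);
console.log(jan32.toDateString()); // February 1, 2023
```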
@lauren Does your node allow editing, or did you do a discard-and-redraft?
Some bits of Mastodon are still a mystery to me. I should really go read the core protocol.
@lauren If I recall correctly, we learn in 2010 that HAL's breakdown isn't even its fault: the mission designers gave it mutually-conflicting directives of processing data without distortion or concealment and hiding the mission's true objective until it reached Jupiter. The conflict created a breakdown because the only logical solution HAL could find that satisfied all parameters was to remove all humans from the interaction, so there was nobody left to conceal the directives from.
(There's a larger essay I need to get around to writing on this theme: there's a difference in kind between simple tools whose function we can fully understand and tools approaching or exceeding the complexity that can fit in a human brain. A hallmark of sci-fi is creating category-B tools, giving them broad objectives or responsibilities while ignorant of what completing objectives of such broad scope would require, and watching the situation go off the rails. At some level of complexity, you actually *do* want a human instead of a machine, because at least the human will make "human-shaped" mistakes that our squishy empathic circuits are tuned to predict and adapt to).
@ocdtrekkie @lauren @Santaclaus I don't think anyone can guarantee to me that an antitrust shake up that broke Google into smaller companies would preserve the Chromebook. It's an integrated product---it uses Google's authentication, Google's operating system, and Google's web store. I, for one, am uninterested in risking all of those pieces not ending up in the same fragment of a restructured Google so they can continue to work together seamlessly.
@ocdtrekkie @lauren @Santaclaus
> but the complete obliteration of Google, Apple, and Amazon antitrust models would have wide-sweeping benefits for consumers
I'm feeling like that's the sort of assertion French Revolutionaries made before the Jacobins took over.
I can't be sure whether things would be better if the big tech companies were broken up by force of law. In the short run, my in-laws' Chromebook, which has been the absolute ideal platform for them, would no longer be supported.
From where I sit, that's not a boon.
I'm going to have to put more butter on this popcorn while I watch this Musk/Apple dust-up.
Musk is accusing Apple of hating free speech.
They won't, but I really want Apple to respond with "Yeah bro. Have you even heard of us? We're the OG walled garden. The *vertical* monopoly. Our whole *deal* is 'Pay us to think for you.' Our *vibe* is 'safety and comfort.' Our *unofficial logo* is the rounded rectangle. You think you're a problem for us? Our users *pay us money* to deal with *problems like you.* *For them.* *Efffffff* your frozen peaches."
... they won't say it.
...**But they could.**
My head-canon is that Solomon had, like, ten trials lined up for the baby case but the one woman failed so hard on the first one that he ended up with a whole free-ass day, just... Calendar cleared.
"We'll cut the baby in half and... What? You said what? Wow. I... Okay. You know what? There was going to be a whole philosophical journey I was going to take you on here, comparing and contrasting property and people, responsibility to ownership, culminating with recognition of the mystery of soul and the supremacy of G~d in all things, but... No! I don't care who the biological mother is. Give the baby to the woman who *doesn't think it's okay to cut babies in half!* Case closed praise YHWH!"
@mds2 I like this "conservation of curvature" idea and would like to subscribe to your newsletter and / or civic architecture journal. ;)
us pol, kosa
@ocdtrekkie It is probably best to not allow use of YouTube unsupervised (back in the day, we had a similar rule for cable TV. Network TV required the rule less because FCC policies generally kept what was broadly considered unacceptable off-air).
@lauren The first yellow flag on the whole thing for me was when the SVP in charge of the project posted a big rah-rah internal message about G+, a popular Googler responded to that internal post with "FIRST!", and that Googler was then asked side-channel to *take that reply down.*
In hindsight, the leadership who thought that ask was appropriate or sensible was *exactly* the leadership that would blindly blunder into "Let's just fuse YouTube comments to G+ comments, what's the worst that could happen, all comments are fungible and there's no such thing as a 'community culture'."
@estrapade @noracodes The thing is, unfortunately... Mastodon *is* too technical in a few key ways that will matter for adoption.
It's things that a lot of techies will gloss right over, but that will turn off a lot of users of centralized services because they've never had to deal with the added layer before. Really simple but persistent things, like: if you find a user on another node's web UI, following them requires you to copy-and-paste their Mastodon ID into the search bar of your own node's UI, find them again there, and follow them from that side.
It seems like a small thing, but it makes the whole experience look like an absolute hot mess to a naive user (and, technically speaking, it will be hard to improve given the modern domain security model for web sites).
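Under the hood, what that copy-and-paste search triggers is a WebFinger lookup (RFC 7033). Here's a minimal sketch of the URL your node constructs from the pasted handle; the endpoint path is part of the spec, but the handle parsing here is my own illustration.

```typescript
// Turn a federated handle ("@user@host" or "user@host") into the
// WebFinger URL the local node queries to resolve the remote account.
function webfingerUrl(handle: string): string {
  const trimmed = handle.replace(/^@/, "");
  const at = trimmed.lastIndexOf("@");
  if (at < 1) throw new Error(`not a federated handle: ${handle}`);
  const user = trimmed.slice(0, at);
  const host = trimmed.slice(at + 1);
  // RFC 7033 fixes the /.well-known/webfinger path and the
  // resource=acct: query parameter.
  const resource = encodeURIComponent(`acct:${user}@${host}`);
  return `https://${host}/.well-known/webfinger?resource=${resource}`;
}

console.log(webfingerUrl("@gargron@mastodon.social"));
```

Your node fetches that URL, reads the actor link out of the JSON response, and only then can it present a follow button on its own side of the domain boundary.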
@lauren I was there (but not working on Social) for the Great Real Names Considered Harmful debate at Google.
I never got a strong sense of why it was desired other than the two datapoints that (a) it was technically Facebook's (basically unenforced) policy and (b) Google already had experience with Orkut and it seemed like some senior leadership believed Orkut's failure to gain traction was due to how hard it was to find people you knew IRL in the Orkut network because people didn't publish their names.
Do you have any insight on what happened there? The third theory I've heard is "because advertising;" that never seemed sufficient to me, but maybe it was?
@mds2 There was a time we made public works pretty for the sake of making them pretty, on account of we knew we'd have to be looking at them forever.
... not everywhere, and not uniformly. But I went to a public high school with huge Thomas Jefferson busts on it, put there for no other reason than it was WPA times and the state needed an excuse to keep artists employed. Those busts outlived the original artist.
Career software engineer living something approximating the dream he had as a kid.