These are public posts tagged with #AIauthorship.
I suppose I now have a quandary: who is the author of my Python version? I personally never touched it, but the spec, the symbols, the strings, and the pattern of the flow are all mine…
How does #AI influence perceptions of authorship and ownership?
We found an "AI ghostwriter effect" in our study in #TOCHI here: https://lnkd.in/dq_h_zUZ
with @fionadraxler, Anna Werner, @florianlehmann, Matthias Hoppe, Albrecht Schmidt, @robin_welsch
I won't name him here, but I recently got a #YouTube suggestion for someone who makes videos about "How to Make Money #Writing Books with #ChatGPT." The #YouTuber has a few actual books out, all of which are about marketing and not about "how to write #fiction."
Someone who tells you how to get rich, but who only got rich by telling people how to get rich, is called a "Con Artist."
Someone who has never written a novel, but tells you how to get rich by writing novels, is also called a "Con Artist."
"AI can write books for you!" is one of the latest in a long line of #EasyMoney #GetRichQuickSchemes.
Also, #AI cannot hold a #copyright, and neither can you hold one in that #AIgenerated text, unless a #substantive portion of it is human-authored.
In other words, you still have to do the work.
#AI #AIAuthorship #AIAuthored #LLMs #ArtificialIntelligence #WritingCommunity #Scams #Scammers #WritersLife #ConArtist #HumanAuthorship #ConArtistry #ConArtists #Publishing #PublishingMyths
Interesting take. I wonder if it is important in this context to realize that emotions are but a subset of our subjective experience (the subset that is coupled to significant physiological responses, and thus presumably "older" in evolutionary terms). But for sure, the role of subjectivity in science is underappreciated. We hear so often that "science is objective", but it is not: it is "shared subjectivity" (which is actually far more interesting).
Your comment that our major mode of thinking is a rationalizing justification of intuitions is spot on. I wrote on that over the last few days, now posted for #SentientSyllabus, where I analyze why it is so hard to get the question of #AIauthorship right. That mechanism plays a major role.
The "original sin" of emergent writing
sentientsyllabus.substack.com
@vwang93@mstdn.science
Good on you! We need more thinking on science and values.
I was thinking a lot about values while writing my recent analysis for the #SentientSyllabus Project:
Getting #AIauthorship right is harder than one would think.
https://sentientsyllabus.substack.com/p/silicone-coauthors
Key takeaways: arguments based on our usual criteria of /contribution/ and /accountability/ are brittle; the problem lies in authorship being a vague term (cf. the sorites paradox); we use post hoc reasoning to justify our intuitions; and reliable intuitions about the actual nature of the emergent(!) source-AI-author system need more work. A practical policy proposal rounds it off: empower the authors, use meaningful acknowledgements, quantify contributions.
The "original sin" of emergent writing
Getting #AIauthorship right is harder than one would think.
sentientsyllabus.substack.com