walnut 🌱 @walnut@thesoftestpaws.net
1mo
genai, "ethics aside" @lumi
ugh, lost my first draft because my computer froze. Fortunately, I was able to take a photo of most of it.

For me the entire technology just misses the point. The main reason to use it for writing, digital art, music, programming, etc. is that you don't care about the process, only the output. But we already have enough to read, look at, listen to, and, I'd argue, code to last several lifetimes. The only people who care about churning out "content" are capitalists who don't care about the art or what form it takes. And if all you care about is the volume of content, we already have plenty.

There are many reasons to write something of your own besides just the output. You may enjoy writing for its own sake, it may help organize your thoughts, you may want to make something new*, or to have your own voice heard. Using an LLM to churn out slop doesn't help with any of those. There's also an element of "if a human didn't care enough to write this, why would I care enough to read it?"

Same with any other art, like playing music. Someone can already listen to any recording of Bach played by the best musicians in the world, or of their favorite band, but they still learn to play it themselves, attend live shows, or write their own music. You can tell tech bros just don't understand the point when they insist that nobody actually enjoys learning an instrument.

Heck, I guess this is part of why people care about sports too. Nobody (well, very few people) watches sports only for the world records; they watch what's live now, even though Michael Jordan is retired and his entire career is recorded. They haven't yet figured out how to replace football with a bot, but I'm sure they would if they could.

LLMs also can't average everything out into the best. And if you want to make the best (which is not the only reason to make something), you cannot learn without doing, so using an LLM to do it for you will hurt your efforts to learn.

This last part is my worry with programming especially. It's not controversial to say the state of software was pretty bad before LLMs, but adding more code of mediocre quality that people definitely always review the output of (they don't, even if they intend to) is not going to help things. I'd much prefer to stop the treadmill and actually try improving the quality of software, tooling, documentation, accessibility, security, etc. than to use a tool that drowns us in a volume of slop that's impossible to review, made worse by nobody learning any expertise.

*For non-fiction this can include reporting on new events, novel research, or a new analysis. LLMs aren't any good for that. For fiction, new doesn't mean "100% original", but that doesn't mean the average of all previous works is new or interesting either.