James B Maxwell
3 min read · Dec 10, 2024


I think you're really agreeing with Affleck more than you let on, to be honest. Your positive examples of what an AI can do are still examples of what a human can do with an AI tool, and I basically think that's Affleck's perspective. On its own it will likely always be quite bland and lifeless because it's just a statistical model at the end of the day.

The side of this that neither of you quite seems to be talking about is that "Hollywood" is killing itself anyway by making increasingly shitty movies. It's becoming obvious enough that even audiences are recognizing it. And the main reason is that all they give a shit about is the bottom line. That's the same reason relying on AI for cost-cutting will continue to create shit work, and people won't care. There was a time when the tension between artists and producers could still find its way to great work despite the friction. But as our society weakens and becomes little more than a capitalist horror show, the power is simply handed over to the bean counters entirely and the work suffers. The artists fuck off and make great work on their own. And again, the examples you give of good work that uses AI are not at all this. I've always felt that AI should be a tool; that's always been my entire philosophy about generative processes in artistic production.

Where I think you underestimate Affleck's take is that what he's saying comes from a deep understanding of art. Art isn't just aesthetics, and that's why AI can't replicate it. It's something that moves through human channels, in human societies, specifically because it involves humans doing things to express themselves and their situations. It has never really been about how "good" or "bad" it looks. And "good" films of today aren't somehow better than "good" films of yesterday. Art has never worked that way; there is no progression from bad to good in art. This is why "better" models won't matter, at least in terms of just having the model spit out a piece of art. They'll always be slightly shit because they'll never really connect to anything that art is actually about unless they have that connection through a human author or guide. That's not because humans are "superior", but only because they're humans and they live among humans, and we're the ones watching and the ones who care to watch.

Honestly, I think international regulations should be put in place to require that all generated content be watermarked in some way that allows it to be identified with regard to the model used and the generative approach taken (e.g., a prompt). These marks could then be used to determine how much of a work was done by or with AI and how much wasn't. Then people could at least find out, if they were interested in knowing. Anything that is 100% AI from an autonomous model should probably require an immediately recognizable watermark, so that it is absolutely apparent on first sight. This does nothing to hold back technological progress and threatens no valid business of any kind. It's just like money: there's real money and counterfeit money, and you can't pass the fake shit off as real. That's super simple with money, so why should it be any more difficult with media? Simple fact: it shouldn't.
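To make the idea concrete, here's a minimal sketch of what machine-readable provenance tagging could look like, using Python and Pillow's PNG text chunks. The "ai:" field names are made up for illustration, not any existing standard; a real scheme would need signed, tamper-evident manifests (in the spirit of C2PA content credentials) so the tag can't simply be stripped or forged.

```python
# Minimal sketch: attaching provenance metadata to a generated image.
# Assumes Pillow is installed (pip install Pillow). The "ai:*" keys are
# hypothetical; a real standard would define and cryptographically sign them.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Stand-in for a generated image.
img = Image.new("RGB", (256, 256), color="gray")

# Attach provenance as PNG text chunks.
meta = PngInfo()
meta.add_text("ai:model", "example-model-v1")        # hypothetical model ID
meta.add_text("ai:prompt", "a city street at dusk")  # the generative approach
meta.add_text("ai:share", "1.0")                     # fraction AI-generated

img.save("generated_tagged.png", pnginfo=meta)

# Anyone (or any platform) can read the tag back:
tagged = Image.open("generated_tagged.png")
print(tagged.text)  # {'ai:model': 'example-model-v1', ...}
```

Metadata like this is trivially removable, which is exactly why the paragraph above calls for regulation: the technical half is easy; the enforcement half is the hard part.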

Interesting works that draw on all sorts of tools will be exactly what they should be: complex hybrids created by artists who happen to work with those particular tools. It literally threatens nobody to make this explicit. It's just a responsible approach. Somebody just needs to have an ounce of fucking willpower to put something like this in motion.
