like there's a bunch of faff in here about security issues (and sure, fair enough, the context here *is* wired's "security" column) and quality problems, but there is a much darker interpretation of what's happening

bluntly, POSIWID: "open source" is a social system for corporations to externalize infrastructure costs, and, materially, not much else. there are of course principles involved and possible future benefits, but *today*, that's mostly what is going on in the "community"

corporations have been reluctant to "give back" because while it can produce good press, if you start "giving back" to the "community" too much, then the cost savings you got from externalizing your complement start to erode; if you're going to give money to some tech, might as well own it

[remember to like and subscribe and support me on github sponsors and patreon and tidelift, thanks]

but the scary part of this article is the bit where vibe coding *reads* to corporate interests as an alternative to open source, in that you can externalize your infrastructure development and maintenance costs onto OpenAI's VC investors instead. slightly higher overhead per developer, but no need to deal with pesky human beings who might start to agitate for more resources, so the reduction in hassle is worth the cost

it doesn't actually work; a vibe-coded framework is never going to help structure your systems anywhere near as well as one that is the result of human judgement and discernment, even one with a hefty pile of legacy junk associated with it. but management is not going to be able to see this; structurally, managers see the benefits and have a much harder time measuring or even perceiving the costs

this is why there is no such thing as "vibe engineering" and it is farcical to imagine a world where it even could exist. even with all the responsible code review and QA and cross-checking (which is, like, literally impossible, given the amount of vigilance fatigue that AI systems provoke; the metrics we have so far all bear that out), the long-term maintenance cost of a workslop infrastructure is going to be *devastating*

the biggest problem we *already have* in open source right now, which we have oversimplified into the term "supply chain security", is the lack of understanding that putting a dependency in your project's dependency set (package.json, pyproject.toml, requirements.txt, cargo.toml, etc) is not just "downloading some code", it is *establishing an ongoing trust relationship with a set of human beings*. this fact is *way* too obscured in all the tools we use.
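to make that concrete: a rough python sketch (illustrative only, not a real audit tool) that asks an installed environment who you are actually trusting. note that the Author/Maintainer metadata fields are optional and frequently stale, which is itself an instance of the obscured-trust problem described above.

```python
# every entry in your dependency set is a person (or team) you are
# now trusting; importlib.metadata can surface who that nominally is
# for whatever happens to be installed in this environment.
from importlib import metadata

for dist in metadata.distributions():
    meta = dist.metadata
    # these fields are optional and often out of date, which is
    # exactly how the human relationship gets obscured
    who = meta.get("Author") or meta.get("Maintainer") or "unknown humans"
    print(f"{meta.get('Name')}: you are trusting {who}")
```

and that only covers direct metadata; it says nothing about the transitive maintainers, co-maintainers, and release infrastructure behind each name.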

the "vibe engineering" version of this — even in the fantasyland scenario where the tools work and produce correct results, somehow, even in the face of all the evidence that they don't — is that you establish a permanent unfixable dependency on *OpenAI's subscription services*, which are being subsidized in the model of the millennial lifestyle right now, but as soon as you have slipped your organization's neck fully into that pricing noose, it will tighten up real fast

@glyph

As we just witnessed with all video streaming platforms.

You're totally right.

That kind of dependency is the kind of thing that kills companies. OpenAI has every incentive to create an ecosystem of totally dependent partners and then suck them dry.

@jgg @glyph Based on their financials, that's ultimately their only option.

Qoto Mastodon