

OK, now take that to the next step: what do you do when optimizing ease of implementation and support limits the user needs that can be met? How do you decide which objective is higher priority?
I would say, it’s them caring about the product and their needs, rather than the underlying stack.
That’s the idealistic fairy tale that only the most fatuous of UX people believe. But anyone who looks at any process closely enough will soon realize that a system’s stakeholders often have objectives that are in conflict with each other. It’s not all about the users, it’s about low operations cost, it’s about collecting and selling data on user behavior, it’s about minimizing support costs, improving monetization, up-selling, cross-selling, and a number of other things the users neither want nor need. And that is the root cause of enshittification.
every one of them sucks
They went to the cheapest bidders, and those poor devs in Chennai have no fucking clue about proper UX design.
Do you know why banks are still running COBOL on brand-new IBM mainframes built on a decades-old architecture? Sure, it’s partly risk aversion, ignorance, and inertia. But it’s also because, if in the end the result is the same, the tech stack doesn’t matter.
I’ve done extensive consulting for financial firms, including mainframe-replacement projects, and that’s not the reason. There are two reasons.

First, banks regard IT as a cost center, so they systematically underinvest. They never budget for lifecycle maintenance; they just kick the can down the road as far as they can.

Second, hardly anyone who wrote that COBOL code is still alive. When it was written in the 1970s or 80s, the requirements and design were seldom documented, and after decades of code maintenance they’re no longer accurate anyway. So to replace that COBOL, you have to reverse-engineer the whole business process and the app that embodies it. And you can’t do a like-for-like replacement, since a lot of the business process is actually workarounds for the limitations of that legacy system. It’s even worse than that: sometimes the COBOL has custom IBM 360 assembler behind it, and nobody remembers what that shit does either. Finding people who know that, or CICS, or the quirks of ancient DB/2 versions, is even harder than finding someone skillful who can write COBOL.

So until there are new requirements, usually new regulations that cannot be weaseled out of, they let that sleeping dog lie.
They care about your site being compromised or down, and your choice of tech stack has a lot to do with how likely those things will be.
I know a lot of people who have moved away from smartphones to basic feature phones because of the expense and annoyance, as well as the relentless intrusive surveillance.
Economics is much harder to understand when you realize how little the basic fundamentals actually tell you.
When I started in software, my first employer was just phasing out punched cards for programming. One of my jobs was to work out how programming would be done using terminals instead of the old workflow of submitting coding sheets to card-punch operators who would then pass the jobs on to operations. Typically you’d find out if your code compiled the next morning. There was a big basket by the computer hall where they’d deposit your card deck and a massive printout on fanfold paper of the job status and (if it went tits up) the stack dump.
By that time (late 1970s), cards had sequence numbers (usually numbered 10, 20, 30 so you could interpolate a few cards if need be). If you dropped a deck on the floor, you had to carefully gather them back up (so they wouldn’t get bent or torn), feed them into a card sort machine, and wait until the deck was sorted. You could also run a special batch job to clone a card deck, for example if you wanted to box it up and ship it to another location.
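The numbering-by-tens trick is easy to model. Here’s a toy Python sketch of recovering a dropped deck by its sequence field; the column layout (73–80, as in fixed-format Fortran) is an assumption for illustration, since conventions varied by shop and language (COBOL, for instance, put sequence numbers up front).

```python
import random

def make_card(text, seq):
    """A toy 'punched card': 80 columns, code in columns 1-72,
    sequence number right-justified in columns 73-80 (an assumed
    layout, as in fixed-format Fortran; shops varied)."""
    return f"{text:<72}{seq:>8}"

# Number by tens so a card can be interpolated later if need be.
deck = [make_card(f"LINE {n}", n) for n in range(10, 101, 10)]

dropped = deck[:]
random.shuffle(dropped)  # the dreaded dropped deck

# A card sorter effectively radix-sorted on the sequence columns,
# one column pass at a time; in software a single keyed sort does it.
restored = sorted(dropped, key=lambda card: int(card[72:80]))

assert restored == deck  # original order recovered
```

The reason for the trailing sequence field is the same as the reason for gaps of ten: both make repairs cheap. A mis-sorted or inserted card never forces re-punching the rest of the deck.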
The big challenge with cardless programming was that computer reliability wasn’t great in those days, so you needed reliable persistent storage of some kind. That was generally magnetic tape; disk drives were way too expensive, and optical disks didn’t exist yet. So to save your work, you used tar or the equivalent on non-Unix OSes. There was version-control software, but it was primitive (rcs, which later had cvs built on top of it).
On the positive side, you could compile and build without an overnight wait.
Is the US government now a “Russia-aligned threat actor” too? Just wondering.
When I got into the business in the late 1970s, there was strong selective pressure in favor of people being capable and smart. Back then, software didn’t offer a lucrative career path for people with good memories, conformist instincts and a superficial command of MBA jargon. The people who had coding jobs and who didn’t wash out had it in their blood. There were lots of bullshitters, just as there are now, but they failed rapidly and were driven out.
I’m a bit younger than the OG greybeards (and a lot younger than people like Don Knuth). I’ve been in the business for longer than most coders have been alive. During that time, I’ve reskilled more times than I can count, and I still write code, though it’s mainly prototype and proof-of-concept stuff at this stage in my career, when the development team gets stuck.
And that’s the thing: I’m not there to block new people from submitting pull requests. I’m there to help get the job done. If you find the whole process opaque and need mentoring, just ask.
It’s the endarkenment.
Stopping pollution at the source is much more thermodynamically efficient.
I made sure to replace all my comments and posts with obscene lorem ipsum before leaving.
That’s been happening for a while, even before AI came in. Fascist mods pretending to be on the spectrum had the same result: bizarre, contorted interpretations of rules, selectively applied with zero tolerance. Always in subs with a population of fascist mods, and always applied against people making anti-fascist comments. What a fucking coincidence.
It’ll be deeply satisfying if spez ends up with nothing. His mismanagement and tolerance for fascism are why I quit using the site.
If platforms are protected from the speech of their users, they shouldn’t be allowed to censor the speech of their users (unless that speech is actually unlawful, as in defamation or specific, actionable death threats). The big platforms shouldn’t be able to have it both ways.
It works, but not all that well. The hair that comes back tends to be really thin.
I had postcard white-guy Jesus hair, hanging to the middle of my back, straight and reddish blond. A beard too. I went bald in my mid-40s and now what’s left around the fringes is white. People who see pictures of me from back in the day don’t recognize me.
But it’s very much luck of the draw.
Forced speech is just as odious as restricted speech. And with Trump and his band of wreckers, you can be sure that the end state will be a combination of both. Platforms will be forced to disseminate MAGA propaganda, and any user with the temerity to talk back will be banned.