Serious question: there was a time, a few years back, when the Haskell-ish languages seemed poised (finally!) for a breakthrough. Some of that was due to an association between Haskell and cryptocurrency, but there was also a sense that people were *ready* for Haskell’s particular vision. (I’m thinking of, say, PureScript as an alternative to both JavaScript and Elm)

That seems to have fizzled. Have there been writeups as to why?

@marick I'll be curious what you find out about pure functional languages.

I'm making my second go at learning Haskell. My first attempt was last year. In preparation for doing it again this year, I'm dusting off the VSCode environment and doing a few small problems.

I successfully used monads to carry state through a calculation. It was really cool. And it took me a week off-and-on to wrap my head around it. If I'd been doing it for years and it was just another tool in the belt, it would be an awesome tool to have.
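For the curious, here's a toy version of the state-through-a-calculation trick: numbering the elements of a list by threading a counter through `Control.Monad.State` instead of passing it around by hand. (This is my own minimal sketch, not the actual problem I solved.)

```haskell
import Control.Monad.State

-- Number each element of a list, carrying the counter
-- through the computation as State rather than passing
-- it as an explicit argument.
numberItems :: [a] -> [(Int, a)]
numberItems xs = evalState (mapM tag xs) 1
  where
    tag x = do
      n <- get        -- read the current counter
      put (n + 1)     -- bump it for the next element
      return (n, x)

main :: IO ()
main = print (numberItems "abc")
-- prints [(1,'a'),(2,'b'),(3,'c')]
```

Once it clicks, `mapM` quietly does all the plumbing that an explicit accumulator argument would clutter up.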

For me, the mental shift from Java in my day job to Haskell is huge, and takes work. I can see why people don't use functional languages without a strong reason. And I've never been at a company where management provided (or allowed) a strong reason.


@schwaigbub @marick I like the “clone the CTO plan”. As the CTO, that should lower my workload. 🤣

Writing command line tools in Haskell does seem like a good place to start. Unfortunately, I was the one who made the rule that we have a critical mass of people who know a language before we start using it. That way we can count on having somebody who can maintain it.

@bwbeach @schwaigbub I’m not so sure about the “let Haskell take the blame for a project that’s doomed to fail” as a long-term adoption strategy.

I take a guilty pleasure in reading things in fields like “the social construction of technology” (en.wikipedia.org/wiki/Social_c), as it throws up surprising things like:

* We remember those funny big-front-wheel bicycles (“pennyfarthings”)

@bwbeach @schwaigbub * We note that our bicycles look nothing like that.
* We see that there are bicycles that appear evolutionarily intermediate.

@bwbeach @schwaigbub * We assume there was a logical progression from silly bicycles to sensible bicycles.

That doesn’t seem to have been the case. Pennyfarthings and safety bicycles coexisted, served different markets.

Pennyfarthings were for young men.

@bwbeach @schwaigbub Safety bicycles were for women, who obviously couldn’t go riding around with their crotches at eye level. To them, they primarily represented freedom from depending on men to move them around.

@bwbeach @schwaigbub (Images are from a talk I gave in 2016, which wasn’t recorded. I should probably do a podcast episode on it.)

I mention this because I’m old enough to remember when statically-typed functional languages hit the scene. They played heavily on safety: they eliminate the possibility that a null-pointer exception will send a plane crashing into the ground.
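For anyone who hasn't seen the pitch up close: the point was that a lookup returns a `Maybe`, and the type checker won't let you touch the result without saying what happens when the value is absent. A sketch (names are illustrative, not from any real avionics code):

```haskell
-- Prelude's lookup returns Maybe; the type system forces
-- you to handle the missing case, so there's no null to
-- dereference and no NPE to throw at runtime.
altitudeFor :: String -> [(String, Int)] -> Int
altitudeFor callsign db =
  case lookup callsign db of
    Just alt -> alt
    Nothing  -> 0    -- you must decide this case to compile

main :: IO ()
main = print (altitudeFor "N123" [("N123", 3000)])
-- prints 3000
```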

@bwbeach @schwaigbub That was a pretty compelling argument up until the web. With it, a `NullPointerException` would get caught at the top level, where it wouldn’t crash the program, just return a 500 error to one user of many. Who would probably not get the same error if she refreshed the screen.

(This is an unintentional implementation of the Tandem/Erlang/fail-fast approach to fault tolerance, a big deal back in the day.)

@bwbeach @schwaigbub So my argument is that Haskell grew up in an age where program crashes were a Big Deal, and that heavily influenced its development.

It turns out that, today, in a vast number of situations, they’re not a big deal at all (except in niche applications like unpatchable “smart contracts”).

It feels to me like that leaves Haskell struggling for relevance. I'm not sure command-line apps are it. Where's its “unfair advantage”?

@marick @schwaigbub I, too, remember the old days. I worked at HP Labs in the 1980s, when "AI" meant Lisp and "expert systems".

HP and the University of Utah were working on Portable Standard Lisp, which was intended to be a performant platform for building AI systems.

I was into flying, and wrote a flight simulator, and an autopilot to fly it. That's me in the picture, taken from an HP brochure from 1985.

There's no way I would have gotten into an actual plane flown by that software. Without strong typing, and without rigorous testing, it just wasn't reliable enough.

@bwbeach @schwaigbub I remember Portable Standard Lisp! It was the striving dialect between the 500-pound gorilla of MacLisp and the hippy-dippy Californian InterLisp. (I got into Lisp in 1982, when my job was to port CMU Common Lisp to the Gould PowerNode superminis.)

@marick @schwaigbub At the same time, I was at HP Labs porting Portable Standard Lisp to the Motorola 68000-based HP workstations. And also, for some reason, ported it to the DEC 20 in the computer room downstairs. Good times.

@bwbeach @marick Sounds like your jobs were more pioneering and scientific back then. Programming nowadays is more like a craft. Sometimes *how* you do something is more interesting than what you do. I can't imagine ever writing a flight sim by myself. Were you alone with the task?

@schwaigbub @marick I've been thinking about your comment. I don't think it's that programming was different back then. I happened to have a job at a research lab. Other people had jobs writing COBOL for banks, firmware for oscilloscopes, or C compilers. (That COBOL code is probably still running.)

The flight simulator came about because HP was looking for a demo that showed Lisp could be efficient. Portable Standard Lisp compiles to native code, and if you're careful not to allocate much memory, the garbage collector didn't slow things down. My friend, Craig, and I were taking flying lessons, and thought a flight simulator would be fun. He wrote the graphics and I wrote the simulator.

The simulation itself wasn't super complicated. Each time increment, there's a formula to calculate the lift of the wing and the thrust of the propeller. Those are used to update the velocity vector and the angular momentum, which are then used to update the position.
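That update loop is basically forward-Euler integration. A stripped-down 1-D sketch of the idea (made-up constants; the real thing used full 3-D vectors and angular momentum):

```haskell
-- One Euler step of a toy 1-D flight model: thrust minus
-- drag gives acceleration; integrate to get velocity,
-- then position. Constants are invented for illustration.
data Plane = Plane { pos :: Double, vel :: Double }
  deriving (Eq, Show)

step :: Double -> Double -> Plane -> Plane
step dt thrust (Plane x v) = Plane x' v'
  where
    drag  = 0.1 * v * v      -- quadratic drag, toy coefficient
    accel = thrust - drag    -- mass folded into the constants
    v'    = v + accel * dt   -- update velocity first...
    x'    = x + v' * dt      -- ...then position from the new velocity

main :: IO ()
main = print (iterate (step 0.1 5.0) (Plane 0 0) !! 3)
```

Run it tick by tick and the plane settles toward the speed where drag balances thrust, which is about all a demo needs.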

@bwbeach @schwaigbub Was just interviewing David Chapman today about a program he cowrote 40 years ago. There were definitely widely different programming cultures back then. I like to think that’s less true today. That there’s been some convergence on practices. But I dunno.

@marick @schwaigbub I'll look forward to hearing that interview.

My experience in the 1980s both at HP and at Bell Northern Research was that commercial development was consistently waterfall.

I'm sure there were other cultures in other places, though. In the mid-70s, my undergraduate physics class visited Lick Observatory on Mt. Hamilton, near San José. The astronomers there were hacking stuff in FORTH on a PDP-8 to run their observations.

@bwbeach @schwaigbub I probably exaggerate. But in the ‘80s, Unix programming culture (which is where I came up) was distinctly different from the DoD-inspired waterfall style, though it shared an awful lot of its assumptions (it was a waterfall variant). And scientific programming was a complete Wild Wild West.* As was small-shop PC coding, though stylistically different because of different constraints.

@bwbeach @schwaigbub * I once talked to someone at a conference who worked at one of the big national labs (maybe Livermore?). Talking about software development, he described a programmer (or group?) that refused to use the vi editor because their code was a single Fortran file so big that vi crashed on it.
