And obviously the "wow a GUI in 1977" stuff couldn't have happened on the pre-1976 interpreted Smalltalk, because too slow.
But that old reverse-Forth style? I love it so much. It just seems to map so closely onto what actual Internet-era processes are about: parsing tokens from the wire.
Losing that, and getting compiled Smalltalk-76 notation.... sigh.
It just feels heavy, like the death of a dream.
Reading more about Smalltalk-76, and I'm just... yeah. I just never ever have managed to get my head around that keyword trailing-colon syntax. It hurts me.
I don't like it in Ruby, either. Or in REBOL or Red.
Maybe I'm just broken. But I feel like this paragraph, and this diagram, are telling us something very different than the text's face value of "yay we improved it".
I look at this and I see mind-melting complexity right in the kernel where there should be sweetness and simplicity.
Not that I want to be an Alan Kay personality cultist, cos I probably don't agree with him on everything...
But that huge complexity jump corresponds to the shift from "Alan Kay's design" in 1972 to "Daniel Ingalls' design" in 1976.
And I think that complexity jump, because it made hacking the kernel of Smalltalk much harder, froze in place a bunch of experimental design decisions that have haunted OOP ever since. Even if Smalltalk-76 got more right than C++, it still was very far from done.
Chief among those design decisions: the very core original Smalltalk idea, of messages, got lost! And nobody noticed, because Smalltalk seemed to be achieving so many other breakthroughs, and was. But it had already lost its soul.
That soul being the simple equation that "a message is a vector of symbols".
If a Smalltalk-76 message is a vector (sequence/list/array), it's now an extremely obfuscated one.
I blame compilation. But surely it didn't *have* to mangle messages so badly?
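To make that "message is a vector of symbols" idea concrete, here's a rough sketch (in Python, nothing like real Smalltalk-72 internals, and the names are mine): the receiver just gets the rest of the message as a flat list of tokens and parses what it wants, which is exactly the "parsing tokens from the wire" shape I mean.

```python
# Toy sketch only: an object that receives its message as a plain token list
# and parses it itself -- Smalltalk-72-ish in spirit, not in detail.
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def receive(self, tokens):
        # tokens is just a vector of symbols/values, e.g. ['+', other] or ['x']
        head, rest = tokens[0], tokens[1:]
        if head == 'x':
            return self.x
        if head == '+':
            other = rest[0]
            return Point(self.x + other.x, self.y + other.y)
        raise ValueError(f"doesn't understand: {tokens!r}")

p = Point(1, 2).receive(['+', Point(3, 4)])  # the receiver eats the tokens it needs
print(p.receive(['x']))                      # -> 4
```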
What a wonderful conversation!
A couple of objections @zensaiyuki:
1) reading and writing are not "unnatural", they are "natural to humans", or at least "as natural as humans".
So I'd argue that the asynchronous communication that writing (and painting before it) enabled was so fundamental to human evolution that we could consider humans as being shaped by it.
So I'm not surprised that programming and written languages are so tightly coupled: writing is the most precise and effective async communication system we have, designed over several thousand years.
2) #Mathematics is not "a natural feature of the landscape" but a byproduct of the human brain's evolution: a different species would have a different #Math and #Logic, and would see it as "the language of the Universe", "a natural feature of the landscape", and so on.
http://www.tesio.it/2018/10/11/math-science-and-technology.html
@Shamar @natecull here’s an article on just how much of a struggle it is to actually teach kids to read
https://www.apmreports.org/episode/2019/08/22/whats-wrong-how-schools-teach-reading
and here’s roger penrose on the question of whether math is invented or discovered
@icedquinn @natecull @Shamar some of this stuff came up in the recent discussion of a possible ancient prehistoric civilization (the silurian hypothesis). i was put off by the particularly dingbatty tone of one youtube video about it so I looked up the definition of “civilization” that archeologists use. in sum, you need a city, a government, a class system and specialisation, all only really made possible by a writing system of some kind, for receipts and spreadsheets.
@icedquinn @natecull @Shamar you can debate whether any of that stuff has in fact been “good” but there’s no dispute that reading and writing is a technology.
@icedquinn @natecull @Shamar it’s a bit of a stretch, but you could call those rocks a kind of writing system. it’s about a step away from a quipu, which *is* a writing system. of a sort.
@icedquinn @natecull @Shamar it’s presumably uniquely identifiable. that is, technically, data. but, it’s really arguing semantics.
@icedquinn @natecull @Shamar and- to be clear, it’s possible there were writing systems older than 6000 years. the trouble being the complete absence of evidence for them. which could be because the materials have been obliterated by time. paper and wood don’t last very long without climate control.
@icedquinn @natecull @Shamar back to smalltalk- Alan Kay has a talk “doing with images makes symbols” which is basically his argument against the very concept of a “programming language”, in the way it constrains our thinking. symbols are the important thing, not writing.
@icedquinn @natecull @Shamar Bret Victor tries to continue that train of thought in “Media for Thinking the Unthinkable” http://worrydream.com/MediaForThinkingTheUnthinkable/
@icedquinn @natecull @Shamar i know some of those words!
@icedquinn @natecull @Shamar that setup almost suggests an accidental polymorphism. a runtime indexed ducktyping table. interesting!
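@icedquinn @natecull @Shamar what i mean by that, roughly (python sketch, the names and shape are just my guess at the idea, not anything from the paper): dispatch is a table consulted at runtime, keyed by selector, so whatever has an entry for the selector “understands” the message.

```python
# rough sketch of a runtime-indexed duck-typing table (my reading, not the paper's):
# a message send is just a lookup in a per-kind dict of selector -> handler.
dispatch = {
    'duck':  {'speak': lambda: 'quack'},
    'robot': {'speak': lambda: 'beep'},
}

def send(kind, selector):
    handlers = dispatch.get(kind, {})
    if selector not in handlers:
        raise TypeError(f"{kind} doesn't understand {selector}")
    return handlers[selector]()

print(send('duck', 'speak'))   # quack
print(send('robot', 'speak'))  # beep -- same selector, different behaviour
```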
@icedquinn @natecull @Shamar who wants this message! free to anyone who can understand it!!
@icedquinn @natecull @Shamar i am an effete dabbler. i am more interested in ideas than actually doing anything with anything. the concept of older versions of smalltalk being like forth is fascinating. if you go to the paper linked in nate’s original post, it links to live web based emulators of each version of smalltalk to try out.
@icedquinn @natecull @Shamar i lack the big brain energy too- but i did find the interesting project SOM (simple object machine) when researching an adjacent thread nate is in. they were talking about how modern language research lacks backtracking and tail call optimisation because the VMs/ecosystems/runtimes they’re built on lack those features, and thus limit the big risky ideas new languages are willing to try. i thought “big risky ideas vm” would be an interesting project.
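@icedquinn @natecull @Shamar to show what “the runtime lacks tail call optimisation” costs in practice, here’s a python sketch (my own illustration, nothing to do with SOM itself): the naive tail-recursive version blows the stack, and you end up hand-rolling a trampoline, which is exactly the kind of workaround a vm with real TCO would make unnecessary.

```python
# naive tail recursion: every call adds a stack frame, so a runtime without
# tail-call optimisation raises RecursionError on big inputs.
def countdown(n):
    return 'done' if n == 0 else countdown(n - 1)

# trampoline workaround: return a thunk instead of calling, let a loop drive it.
def countdown_tc(n):
    return 'done' if n == 0 else (lambda: countdown_tc(n - 1))

def trampoline(value):
    while callable(value):
        value = value()
    return value

print(trampoline(countdown_tc(100_000)))  # 'done', with a flat stack
# countdown(100_000) would blow past default CPython recursion limits
```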
1. How old is painting? The point is that the asynchronous communication that these tools enabled shaped us as a species.
Yet you are correct that universal literacy is a recent thing. But despite being a slow process, evolution preserves only those traits that become universal.
2. The truth of the #Pythagorean Theorem is an interesting objection, but I'd argue that it supports my point: have you ever seen a triangle in nature? Have you ever seen a square or a circle? They do NOT exist.
#Math IS discovered, actually.
But it's always a discovery of the traits and behaviors of human minds. We see math outside us but it's just a projection of our minds onto our perceptions. It's like looking through invisible glasses that shape what we see. By discovering math, we just discover features of these glasses.
And math is so much about humans that a theorem is not proved until all humans can understand it.
@Shamar @natecull @icedquinn triangles and squares in nature? lots of em. and I don’t think they’re a cultural hallucination.
the asynchrony of painting? ok fine, no objection to that. but it has no bearing on the unnaturalness of reading and writing. staying mindful of the original topic, my point is that “programming languages” based on text unnecessarily constrain our imaginations for what computers are capable of. categorising paintings as valid communication is agreeing with me.
I don't see a single straight line in the images you shared. Even when they look straight (as with some crystals), they are not. And the truth of the Pythagorean theorem breaks down in non-Euclidean spaces and below the Planck scale.
You see straight lines, triangles and so on just because you are abstracting on your perceptions. Which is fine, fun and useful, but abstractions are just in your mind. Since we are both humans, I can see them too. But I'm simply aware they are just constructs of our minds.
Different senses (perceptions) and brains, and thus different evolutions, would lead to completely different maths.
___
As for the constraints that language poses on programming, yes, I could agree that a different expression system would lead to different programs.
I play piano.
I guess that if I designed a programming system with a piano keyboard as the only possible input system, the programs I could invent would be very different from the ones I code in the several programming languages I know.
BUT, the choice of language and writing to express programming was not driven by the available input devices, but by the available humans for whom such devices were built.
It's correlation, not causation.
Writing, so far, is the most precise and efficient way humans have invented to communicate asynchronously, as we do with users when we write software that they run (or even just interact with).
@Shamar @natecull @icedquinn it’s fun to think about universes with different rules, and alien perceptual systems. i think though, that if you have gravity, light, a body that is capable of movement in an environment, a need to eat and reproduce, a lot of the problems that alien perception would need to solve would be the same.
as for writing being precise and efficient? maybe for some things, but not all. Writing can’t capture tacit knowledge.
Actually you don't need to think about alternative universes to meet non-Euclidean geometry: the Pythagorean theorem doesn't work over the Earth's surface.
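To make that concrete (a standard spherical-geometry result, written out here just as an illustration): for a right triangle drawn on a sphere of radius R, the Pythagorean relation becomes a cosine rule, and the familiar flat formula only reappears when the triangle is tiny compared to R.

```latex
% Right triangle on a sphere of radius R, legs a and b, hypotenuse c:
\cos\frac{c}{R} = \cos\frac{a}{R}\,\cos\frac{b}{R}
% For a, b, c \ll R, expanding each cosine as 1 - x^2/2 gives
% 1 - \frac{c^2}{2R^2} \approx 1 - \frac{a^2 + b^2}{2R^2}
% \quad\Rightarrow\quad c^2 \approx a^2 + b^2 .
```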
And you don't need to involve aliens: you can just think about dolphins, blind worms, flies or ants: different perception systems would lead to entirely different abstractions.
For sure writing can't capture tacit knowledge, but programmed computers cannot either!
I wrote about this before https://medium.com/@giacomo_59737/the-delusions-of-neural-networks-f7085d47edb6 and https://medium.com/@giacomo_59737/yet-another-definition-of-intelligence-9bbaaa73086d
So if you want a dumb computer to do what you want, you can't use a language that leverages tacit knowledge that it lacks.
@Shamar @natecull 1. anatomically modern humans have been around for about 100,000 years. writing has been around for about 6000 (6%), the concept of universal literacy is only about 120 years old. sure, writing has been important. but it hasn’t “shaped” us any more than bread or iphones have. you can’t call it natural without rendering the word “natural” meaningless.
2. the pythagorean theorem didn’t suddenly become true when humans discovered it.