My semi-spicy take is that computer science depends on analogy, metaphor, and subtext just as much as the humanities or the arts. But because so many people in tech are willfully oblivious to this, they're especially prone to the kinds of cognitive distortions that people in the humanities and the arts are trained to recognize and understand.

What makes a neural network "neural"? What assumptions does that analogy carry? What does it encourage us to see? What does it prevent us from seeing?
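For what it's worth, stripped of the biological metaphor, a "neuron" in an artificial neural network is just a weighted sum pushed through a nonlinearity. A minimal sketch in Python (the function name and numbers are purely illustrative, not anyone's canonical definition):

```python
# A "neuron" without the neural metaphor: a dot product, a bias, a nonlinearity.
# Nothing here spikes, fires, or adapts the way a biological neuron does.

def neuron(inputs: list[float], weights: list[float], bias: float) -> float:
    # Weighted sum of inputs plus a bias term...
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    # ...squashed by an activation function (ReLU here).
    return max(0.0, total)

# The same computation could be described as "linear regression with a kink"
# rather than anything brain-like.
print(neuron([1.0, 2.0], [0.5, -0.25], bias=0.1))  # -> 0.1
```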


@io My impression was that academic CS people implicitly understand that ill-defined terms can be misused, and that they try to keep a clear distinction between well-defined and ill-defined ones. Do you think otherwise, or am I looking at the wrong set of distortions?
