My biggest frustration with mathematics texts is the notation. Why use Greek letters? They're not even the Greek letters Greek people use. It feels like a difficult-to-justify mix of legacy and vanity. "It's how it was done before" misses the point that the math was fine in Arabic, but Germans used Greek letters when 'discovering' the same math centuries later. "You just learn it" sets up an arbitrary bit of gatekeeping for those without a similar background, making the actual math harder to learn than it needs to be while giving an in-the-know clique an inflated sense of intelligence. I suspect it also makes it harder for those who do know some of it to understand and manipulate the concepts, if the language they use is still basically foreign to them.

Better math might use variable-naming conventions from software. Or mathematicians may as well use Wingdings; they're probably more relevant than math Greek.

"The Greek letter forms used in mathematics are often different from those used in Greek-language text: they are designed to be used in isolation, not connected to other letters, and some use variant forms which are not normally used in current Greek typography.

The OpenType font format has the feature tag "mgrk" ("Mathematical Greek") to identify a glyph as representing a Greek letter to be used in mathematical (as opposed to Greek language) contexts." en.wikipedia.org/wiki/Greek_le


@DenialShown

> It's not even the Greek letters Greek people use.

> the math was actually fine in Arabic

I guess it evens out because we don't use the same numbers Arab people use (1, 2, 3, ... vs ۱, ۲, ۳, ...).

I like the Greek letters because they indicate information about what the symbol represents - in the same way you might use x for a scalar but X for a matrix, or typeset ***x*** in bold to show it's a vector, seeing φ and θ and π floating around tells you that you're likely dealing with angles. In another context I might have x, y in physical space and χ, η in Joukowsky coordinates, so when I see z and ζ I have a good idea which system each belongs to right away, and thus I'm unlikely to get them mixed up as I manipulate my equations.
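For anyone who hasn't run into it, the Joukowsky transform is what ties those two planes together. A minimal sketch in the standard textbook form (a is a real length parameter; nothing here is specific to my example above):

```latex
% Physical plane z = x + iy, Joukowsky (circle) plane \zeta = \chi + i\eta,
% related by the standard Joukowsky transform with length parameter a:
z = \zeta + \frac{a^{2}}{\zeta},
\qquad z = x + i y,
\qquad \zeta = \chi + i \eta
```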

On the other hand, despite their distinctness, they're close enough to Latin letters that they're often intuitive (α looks like "a", β like "B", etc.), so between that and the limited number you use at any one time, it's never felt particularly onerous to learn them. That's an advantage I think you'd lose with Wingdings.

@khird absolutely! As a fun aside, I threw Eastern Arabic numerals into something recently as subsection markers (1.A.۱, 1.B.۲, …) just to play with this notion. Anyway, what you say makes sense. What do you think of these aspects:

Depending on the font, those glyphs can look very similar. I'm thinking of the story of how "patient O" became "patient 0" (radiolab.org/podcast/patient-z). There are other real-world examples of glyphs getting confused, and the confusion spreading to the underlying concepts.

Consider a student in a lecture. How is a student who uses a digital device for note-taking going to capture these as a lecture flies by? You might say, use pen and paper, but what if there are dexterity issues or other constraints that prevent this? Even if they have pen and paper, what if they hear the instructor say a glyph's name but don't recognize it?

Or consider the learner reading a text. When you read a math text, do you think of the symbols as visual symbols, or do you translate them in your mind into utterances in a natural language?

@DenialShown

> As a fun aside, I threw Eastern Arabic numerals into something recently as subsection markers (1.A.۱, 1.B.۲, …) just to play with this notion.

I hope it worked well! One caveat - if the audience was unfamiliar with the symbols, they may not have been particularly useful, because someone who doesn't know the order can't use them to navigate (for example, if you see 1.A.۶ and you want to find 1.A.۵, should you scan up the page or down it?). And if the list isn't ordered, you may as well just use bullet points.
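If anyone wants to check the order for themselves, here's a throwaway sketch using Python's standard unicodedata module - I'm assuming the digits above are the Persian-style extended Arabic-Indic forms, U+06F0 through U+06F9:

```python
import unicodedata

# Extended Arabic-Indic digits U+06F0..U+06F9; unicodedata.digit()
# maps each glyph to the familiar 0-9 value, giving the scan order.
for codepoint in range(0x06F0, 0x06FA):
    glyph = chr(codepoint)
    print(glyph, "=", unicodedata.digit(glyph))
```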

> Depending on the font, those glyphs can look very similar. I'm thinking of the story of how "patient O" became "patient 0" (radiolab.org/podcast/patient-z). There are other real-world examples of glyphs getting confused, and the confusion spreading to the underlying concepts.

This is true, but I don't think it affects Greek letters particularly badly. I used to do peer tutoring as an undergrad, and I can't recall anyone mixing up Greek and Latin letters, but I can recall plenty of occasions where someone would mistake variable x for multiplication ×, or even variable t for addition +, *in his own handwriting*. It really awakened me to the importance of penmanship, and now I always use cursive, even in equations.

> Consider a student in a lecture. How is a student who uses a digital device for note-taking going to capture these as a lecture flies by? You might say, use pen and paper, but what if there are dexterity issues or other constraints that prevent this? Even if they have pen and paper, what if they hear the instructor say a glyph's name but don't recognize it?

Absolutely. Math notation was handwritten for most of its history and so it evolved under very different pressures from programming notation - suppose you want to distinguish multiple kinds of "x". If you were handwriting them, it'd be much faster to just add some kind of distinguishing mark (x₀, ẋ, x⃗, x̅, x̂, ...) than it would be to use programming variable names `x_initial`, `x_vector`, etc. Moreover, the math forms (and the initial choice of "x", for that matter) take up much less space on physical media, which is costly relative to disk space.

On the other hand, to create those symbols at a keyboard requires either a very good memory for Unicode codepoints or extra time to go look them up. To me it's a case where machine must serve man, not the other way around - if you're going to use a computer as a substitute for taking notes by hand, whatever assistive technology you use has to be capable of handling the notation; if it's not, or if it makes things more difficult, the fault is with the technology, not the notation.
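To make the keyboard cost concrete, here's a minimal sketch in Python (just the standard unicodedata module; the string escapes use the official Unicode names, which is exactly the sort of thing you'd have to memorize or look up):

```python
import unicodedata

# Each distinguishing mark is its own codepoint, typed after the base
# letter - a one-stroke job by hand, a name-lookup job at a keyboard.
x_initial = "x" + "\N{SUBSCRIPT ZERO}"               # x₀
x_dot     = "x" + "\N{COMBINING DOT ABOVE}"          # ẋ (time derivative)
x_vector  = "x" + "\N{COMBINING RIGHT ARROW ABOVE}"  # x⃗
x_bar     = "x" + "\N{COMBINING OVERLINE}"           # x̅ (mean)
x_hat     = "x" + "\N{COMBINING CIRCUMFLEX ACCENT}"  # x̂ (estimate)

for s in (x_initial, x_dot, x_vector, x_bar, x_hat):
    # unicodedata.name() recovers the name you'd have had to remember
    print(s, "=", "x +", unicodedata.name(s[-1]))
```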

> Or consider the learner reading a text. When you read a math text, do you think of the symbols as visual symbols, or do you translate them in your mind into utterances in a natural language?

Utterances for sure. Lots of people say they're visual learners, but I've never thought of myself as such - even if I'm reading quickly, a mistake like "well" instead of "we'll" sticks out like a sore thumb, but I can skim right past a homophone like "wheel" without noticing. In a math context, reading fast means I might confuse A and a, because I read them both as "eh" if I don't take the time to read the first one as "capital eh". But I'll never confuse α for either because it sounds completely different.
