total aside, but i was reminded of one of the cooler things i learned about the gameboy: people actually use binary coded decimal, for things like scores, lives, or time

to display text on a gameboy, you need a tile for each letter or digit. to show the number "123" you have to tell the gameboy to use the "1", "2", and "3" tiles

which means that when you store a number in binary coded decimal, it's already in the perfect form to pass on to the tile map: each nibble is one decimal digit, and each digit picks a tile
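
here's a rough C sketch of the idea (not real gameboy code; write_tile and TILE_DIGIT_0 are made-up names): a 4-digit score packed as BCD maps straight onto digit tiles, no divide or modulo needed

#include <stdint.h>

/* sketch only: write_tile() and TILE_DIGIT_0 are assumed names, not a real
 * gameboy API. the score is packed BCD, two decimal digits per byte. */
#define TILE_DIGIT_0 0x30  /* assumed index of the "0" tile in the tileset */

void draw_score(uint8_t bcd_hi, uint8_t bcd_lo,
                void (*write_tile)(int pos, uint8_t tile))
{
    uint8_t digits[4] = {
        bcd_hi >> 4, bcd_hi & 0x0F,   /* thousands, hundreds */
        bcd_lo >> 4, bcd_lo & 0x0F,   /* tens, ones */
    };
    for (int i = 0; i < 4; i++)
        write_tile(i, TILE_DIGIT_0 + digits[i]);  /* nibble -> tile, no conversion */
}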

i used to think BCD was a relic of the past, but it turns out the overhead of converting from binary to decimal adds up when you're frequently displaying numbers as text

for example: i believe one of the big tech companies found that their timestamp-formatting routine was eating a huge chunk of cpu time, because it was called so heavily for logging

if you store the time as binary coded decimal from the outset, printing it becomes an almost free operation
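
a hedged C sketch of that idea (the struct and names are my own, not the actual routine anywhere): with the time packed as BCD, formatting is just nibble extraction plus an ASCII offset

#include <stdint.h>

/* sketch only: time-of-day kept as packed BCD, one byte per pair of digits */
typedef struct { uint8_t hh, mm, ss; } bcd_time;

static void put2(char *out, uint8_t bcd)
{
    out[0] = '0' + (bcd >> 4);    /* high nibble -> ASCII digit */
    out[1] = '0' + (bcd & 0x0F);  /* low nibble  -> ASCII digit */
}

/* writes "HH:MM:SS" into an 8-byte buffer; no division by 10 anywhere */
void format_bcd_time(const bcd_time *t, char out[8])
{
    put2(out + 0, t->hh);
    out[2] = ':';
    put2(out + 3, t->mm);
    out[5] = ':';
    put2(out + 6, t->ss);
}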

the other place i think BCD might be more useful than binary is in things like floating point calculations

i don't think we'd see better precision or accuracy, but i do think rounding would be nicer, as would "the number i type in is the number the computer stores", and it might make floating point just a little bit less weird to use
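
a small C sketch of the "what you type is what you store" point (illustrative only; it uses a scaled-decimal integer as a stand-in for decimal/BCD floating point): 0.1 has no exact binary representation, so repeated addition drifts, while the decimal version keeps exactly the digits you typed

#include <stdio.h>

/* illustrative only: a scaled-decimal integer stands in for decimal/BCD
 * floating point. the binary double drifts; the decimal count does not. */
int main(void)
{
    double acc = 0.0;
    for (int i = 0; i < 10; i++)
        acc += 0.1;                     /* 0.1 is not exact in binary */
    printf("binary double : %.17f\n", acc);   /* typically 0.99999999999999989 */

    int tenths = 0;                     /* decimal fixed point: count of tenths */
    for (int i = 0; i < 10; i++)
        tenths += 1;                    /* "0.1" stored exactly as one tenth */
    printf("decimal tenths: %d.%d\n", tenths / 10, tenths % 10);  /* 1.0 */
    return 0;
}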


@tef Yes, it's better for reproducing human calculations done in decimal.
