A binary distribution of #Haskell’s #GHC compiler takes 1.6GB of disk space, and many of the files therein are highly compressible (e.g. gzipping libHSghc-9.2.6.a shrinks it from 134MB to 22MB).

Would it be worth compressing them by default? Could tools like the linker deal with that?
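(For the curious, here is a minimal sketch of how one might reproduce that measurement from Haskell itself, using the `zlib` package from Hackage. The bare file name is illustrative; in a real bindist the archive sits deeper in the lib directory.)

```haskell
-- Minimal sketch: measure how well a bindist file gzips,
-- using the zlib package from Hackage.
import qualified Codec.Compression.GZip as GZip
import qualified Data.ByteString.Lazy as BL
import Text.Printf (printf)

main :: IO ()
main = do
  raw <- BL.readFile "libHSghc-9.2.6.a"  -- illustrative path into the bindist
  let gz = GZip.compressWith
             GZip.defaultCompressParams
               { GZip.compressLevel = GZip.bestCompression }
             raw
  printf "%d -> %d bytes (%.1fx smaller)\n"
         (BL.length raw) (BL.length gz)
         (fromIntegral (BL.length raw) / fromIntegral (BL.length gz) :: Double)
```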

@nomeata Good question, for which I have no answer, except perhaps running the binaries through `upx -9`? For .a and .so files it's a different problem, however.

@hecate Compressing executables is probably not the lowest hanging fruit. I'm thinking more of all the data files (.o, .hi, .a).

But yes, filesystem-level compression is probably the easier route.

@nomeata we just have to create a nested Btrfs + zstd partition where we install our bindists. :>
Totally not invasive and absolutely justified! /s

@hecate @nomeata Zstd is a great rug to cover size-related issues with!

It is fast and the compression ratios are good. It could be made even faster if GHC shipped pre-trained dictionaries.
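(For a sense of what that would look like, a minimal sketch using the `zstd` package on Hackage, assuming its `trainFromSamples`/`compressUsingDict` API. The sample file names are hypothetical; a real dictionary would be trained on many representative files from a bindist.)

```haskell
-- Sketch: train a zstd dictionary on sample interface files and
-- compress another file with it, using the zstd package from Hackage.
import qualified Codec.Compression.Zstd as Zstd
import qualified Data.ByteString as B

main :: IO ()
main = do
  -- Hypothetical sample files; in practice you would feed in
  -- many representative .hi/.o files from the bindist.
  samples <- mapM B.readFile ["A.hi", "B.hi", "C.hi"]
  case Zstd.trainFromSamples (112 * 1024) samples of  -- 112 KB dictionary
    Left err   -> putStrLn ("dictionary training failed: " ++ err)
    Right dict -> do
      input <- B.readFile "D.hi"
      let out = Zstd.compressUsingDict dict 19 input  -- compression level 19
      putStrLn $ "compressed " ++ show (B.length input)
              ++ " -> " ++ show (B.length out) ++ " bytes"
```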
