pafurijaz

Today I discovered that there is also an excellent compression format, #Zstandard (ZSTD): it's fast, has an excellent compression ratio, was developed by #Meta, and is released as #opensource.
I needed to back up my files because I have to wipe my PC and reinstall #Linux. Now I have a dilemma: choosing between #antiX and #Lubuntu. I have a fairly decent computer, so I could even install a more full-featured OS, but I prefer an OS that doesn't use too many resources.
#zstd
youtube.com/watch?v=k5XsiuxHv_A

Hugo van Kemenade

Brand new PEP by @emmatyping to add Zstandard to the standard library:
peps.python.org/pep-0784/

Will it make it into 3.14 before the feature freeze on 2025-05-06? It'll be close, but it's possible!

The PEP also suggests namespacing the other compression libraries lzma, bz2 and zlib, with a 10-year deprecation for the old names.

Join the discussion to give your support, suggestions or feedback:

discuss.python.org/t/pep-784-a

#PEP #PEP784 #zstd #zstandard #stdlib #Python #compression
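A quick sketch of the round-trip the PEP is about. The `compression.zstd` import is the module PEP 784 proposes for 3.14 and may not exist on your interpreter yet, so this falls back to the stdlib `lzma` module, which exposes the same one-shot `compress()`/`decompress()` interface:

```python
# Hedged sketch: prefer the PEP 784 module when present, else use lzma,
# which shares the same one-shot compress()/decompress() API shape.
try:
    from compression import zstd as codec  # proposed in PEP 784 (3.14+)
except ImportError:
    import lzma as codec  # stdlib fallback with the same call shape

data = b"zstd " * 10_000
blob = codec.compress(data)          # one-shot compress
assert codec.decompress(blob) == data  # one-shot decompress round-trips
print(f"{len(data):,} -> {len(blob):,} bytes")
```

The namespacing idea in the PEP would make `lzma`, `bz2`, and `zlib` importable the same way, as `compression.lzma` and so on.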

PEP 784 – Adding Zstandard to the standard library | peps.python.org

Thorsten Leemhuis (acct. 2/4)

#Zstandard (aka #zstd) v1.5.7 is out:

github.com/facebook/zstd/relea

"[…] a significant release […] brings enhancements across various domains, including performance, stability, and functionality […]

The compression speed for small data blocks has been notably improved at fast compression levels[…]

The --patch-from functionality of the zstd CLI […] v1.5.7 largely mitigates the speed impact of high compression levels 18+,

The compression ratio has been enhanced slightly for large data across all compression levels […]"

R. L. Dane :Debian:

I love playing around with #compression

In this case, it's all text-based data in csv and xml formats.

Size:

32,696,320  202411.tar
 4,384,020  202411.tar.bz2
 4,015,912  202411.tar.zst
 3,878,583  202411.tar.bz3
 3,730,416  202411.tar.xz

zstd was invoked using zstd --ultra -22
xz was invoked using xz -9e
bzip2 was invoked using bzip2 -9
bzip3 has no compression level options

Speed:

zstd   54.31user 0.25system 0:54.60elapsed 99%CPU
xz     53.80user 0.06system 0:53.93elapsed 99%CPU
bzip2   5.33user 0.01system 0:05.35elapsed 99%CPU
bzip3   3.98user 0.02system 0:04.01elapsed 99%CPU

Maximum memory usage (RSS):

zstd   706,312
xz     300,480
bzip3   75,996
bzip2    7,680

*RSS sampled up to ten times per second during execution of the commands in question
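A rough Python sketch of the same ratio/speed measurement, restricted to the two codecs that ship in the current stdlib (zstd and bzip3 need external tools or packages, so they're left out here); the sample data is made up, not the 202411.tar from above:

```python
# Minimal benchmark sketch using only stdlib codecs. bz2 level 9 matches
# "bzip2 -9"; the lzma preset below approximates "xz -9e".
import bz2
import lzma
import time

data = b"name,value,flag\nwidget,12345,true\n" * 20_000  # csv-like text

for name, compress in [
    ("bz2 -9", lambda d: bz2.compress(d, 9)),
    ("xz -9e", lambda d: lzma.compress(d, preset=9 | lzma.PRESET_EXTREME)),
]:
    t0 = time.perf_counter()
    blob = compress(data)
    dt = time.perf_counter() - t0
    print(f"{name}: {len(data):>9,} -> {len(blob):>7,} bytes in {dt:.3f}s")
```

Highly repetitive synthetic text like this compresses far better than real CSV/XML, so treat the numbers as illustrative only.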

#bzip3 is freaking amazing, yo.

#DataCompression #bzip #bz3 #zstd #zst #zstandard #xz #lzma
#CouldaBeenABlost ;)

Alessio Buccino

We started with #LosslessCompression. Across a range of general-purpose (GP) compressors, we found that #Zstandard with
@Blosc2 achieves the best compromise between compression ratio and decompression speed!

NP1: compressed size ~36%
NP2: compressed size ~52%

(3/n)

clacke: exhausted pixie dream boy 🇸🇪🇭🇰💙💛
Oh, this is very cool. I understand the FSE in Zstandard on a very handwavy level after reading some articles, but hopefully the Professor himself might be able to explain the principles more clearly, and also where the lines are drawn between FSE, the more general tANS, and the even more general ANS.

My handwavy layman's understanding is basically "arithmetic coding is like huffman but you have arbitrary-sized ranges rather than binary divisions of the probabilities, and FSE compared to arithmetic coding is kind of analogous to when you take the naive floating point divisions you might use to draw a diagonal line on the screen and then you replace that with integer subtraction and a remainder".
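That "integer subtraction and a remainder" intuition can be made concrete with a toy, non-streaming rANS coder: the whole message is folded into one big integer using nothing but integer division and modulo. This is a simplification of what real tANS/FSE does (no tables, no renormalization), and the frequencies below are made up for the demo:

```python
# Toy (non-streaming) rANS: encode a message into a single integer state
# using only integer div/mod, then decode it back. Illustrative only --
# real FSE/tANS uses precomputed tables and a bounded state instead.
freq = {"a": 3, "b": 1}          # assumed symbol frequencies, total M = 4
M = sum(freq.values())
cum, acc = {}, 0
for s in freq:                   # cumulative frequency table
    cum[s], acc = acc, acc + freq[s]

def encode(msg, x=1):
    for s in reversed(msg):      # encode backwards so decoding runs forwards
        x = (x // freq[s]) * M + cum[s] + x % freq[s]
    return x

def decode(x, n):
    out = []
    for _ in range(n):
        slot = x % M             # which frequency band the state falls in
        s = next(t for t in freq if cum[t] <= slot < cum[t] + freq[t])
        out.append(s)
        x = freq[s] * (x // M) + slot - cum[s]
    return "".join(out), x

msg = "aababa"
assert decode(encode(msg), len(msg)) == (msg, 1)  # round-trips to state 1
```

Frequent symbols grow the state more slowly than rare ones, which is exactly the "arbitrary-sized ranges" idea from arithmetic coding, done with remainders instead of divisions of a probability interval.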

#LibrePlanet #ANS #tANS #FSE #zstd #zstandard #FiniteStateEntropy #AsymmetricNumeralSystems
clacke: exhausted pixie dream boy 🇸🇪🇭🇰💙💛
Wow, this thing is really gaining ground, huh?

$ apt-cache rdepends libzstd1 | xargs echo
libzstd1 Reverse Depends: libzstd-dev tor tor mesa-vulkan-drivers mesa-va-drivers mesa-opencl-icd mesa-vdpau-drivers libzstd-dev libxatracker2 libosmesa6 libgl1-mesa-dri libd3dadapter9-mesa libapt-pkg5.0 dpkg tor libsereal-encoder-perl libsereal-decoder-perl librpmio8 libgroonga0 libblosc1 fsarchiver casync borgbackup btrfs-progs libapt-pkg5.0 dpkg

It's understandable! It is a really cool algorithm, and by cool I don't just mean that the papers are a fascinating read and the history of it is a great story that I still haven't fully traced and written down, I also mean it's an impressive algorithm with great tradeoffs between (de)compression speed and ratio, strictly better than its forebears in several scenarios.

#zstd #zstandard

Yes yes, the name is horrible in at least two ways. I wish they'd just called it LZFH or something, but zstd is the registered MIME type now.
Zrythm DAW

What extension should Zrythm projects have?

#yaml #zstandard #MIME

clacke: exhausted pixie dream boy 🇸🇪🇭🇰💙💛
Early this year #archlinux started using #zstandard for their packages:

> zstd and xz trade blows in their compression ratio. Recompressing all packages to zstd with our options yields a total ~0.8% increase in package size on all of our packages combined, but the decompression time for all packages saw a ~1300% speedup.


www.archlinux.org/news/now-usi…
Arch Linux - News: Now using Zstandard instead of xz for package compression

niconiconi [MOVED]

Started using #Zstandard for compression, just tried compressing my first file. Compression is so fast that it runs faster than the hard drive itself, while still giving a first-tier compression ratio. xz is still the best, but it took 5x-10x longer to complete. #zstd FTW.