These are public posts tagged with #LosslessCompression.
AlexBuz/llama-zip: LLM-powered lossless compression tool
Leverages a user-provided LLM (large language model) as the probabilistic model for an arithmetic coder. This achieves high compression ratios on structured or natural language text, since few bits are needed to encode tokens that the model predicts with high confidence.
#LLM #LosslessCompression #Python #AI #ArtificialIntelligence #Github #PythonProgramming
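The mechanism the post describes can be sketched in a few lines. This is a toy illustration, not llama-zip's actual code: a stand-in "model" supplies next-symbol probabilities, and the coder narrows an interval per symbol, so the ideal output size is -log2(final interval width) bits — tiny when the model predicts confidently.

```python
import math

def encode_interval(symbols, model):
    """Toy arithmetic-coding sketch (assumption: not llama-zip's real code).
    `model(prefix)` returns a dict mapping each next symbol to its probability.
    Returns the final interval width; the ideal code length is -log2(width) bits."""
    low, high = 0.0, 1.0
    for i, sym in enumerate(symbols):
        probs = model(symbols[:i])
        cum = 0.0  # cumulative probability mass before `sym`
        for s, p in probs.items():
            if s == sym:
                width = high - low
                high = low + width * (cum + p)
                low = low + width * cum
                break
            cum += p
    return high - low

# Hypothetical stand-in for an LLM: always predicts 'a' with 0.9, 'b' with 0.1.
toy_model = lambda prefix: {"a": 0.9, "b": 0.1}

w = encode_interval("aaab", toy_model)
bits = -math.log2(w)  # ≈ 3.78 bits for 4 symbols: three cheap 'a's, one costly 'b'
```

Each confidently predicted symbol shrinks the interval only slightly (about 0.15 bits for p = 0.9), which is why structured or natural text the model anticipates well compresses so tightly.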
Text Compression Gets Weirdly Efficient With LLMs - It used to be that memory and storage space were so precious and so limited of a r... - https://hackaday.com/2023/08/27/text-compression-gets-weirdly-efficient-with-llms/ #artificialintelligence #losslesscompression #lossycompression #textcompression #softwarehacks #neuralnetwork #compression #winzip #llm
We started with #LosslessCompression. Across a range of general-purpose (GP) compressors, we found that #Zstandard with
@Blosc2 achieves the best compromise between compression ratio and decompression speed!
NP1: compressed size ~36% of original
NP2: compressed size ~52% of original
(3/n)
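The ratio-versus-speed compromise the thread weighs is the standard knob on general-purpose compressors. Zstandard and Blosc2 are third-party libraries, so as a hedged stand-in this sketch uses the stdlib's zlib, whose compression levels expose the same kind of tradeoff:

```python
import zlib

# Stand-in sketch: the post benchmarks #Zstandard via @Blosc2, neither of
# which is in the Python stdlib, so zlib illustrates the generic
# level-vs-ratio tradeoff that GP compressors expose.
data = b"lossless compression of repetitive structured records " * 2000

for level in (1, 6, 9):
    comp = zlib.compress(data, level)
    print(f"level {level}: {len(comp) / len(data):6.2%} of original size")

# Higher levels shrink the output further but spend more compression time;
# the thread's point is that decompression speed must also be weighed
# against the achieved ratio when picking a codec.
```

Decompressing any of these outputs with `zlib.decompress` recovers the input exactly, which is the lossless guarantee the benchmark presumes.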
A nice overview of #lossless #compression #algorithms with a brief #history:
“History Of Lossless Data Compression Algorithms” [2014], ETHW (https://ethw.org/History_of_Lossless_Data_Compression_Algorithms).
Via HN: https://news.ycombinator.com/item?id=31922396
#LZ77 #LempelZiv #LosslessCompression #ZIP #DEFLATE #LZMA #HuffmanCoding #RLE