Dolly 2.0 is a really big deal: https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm
"The first open source, instruction-following LLM, fine-tuned on a human-generated instruction dataset licensed for research and commercial use"
My notes so far on trying to run it: https://til.simonwillison.net/llms/dolly-2
@fell ah. if you mean why Python is in the position it's in, I think it's mostly not technical and more cultural, and to some extent a historical accident. like, a few motivated ML folks also happened to like Python, built libraries that were easy to play around with, and others in the community picked those up and built on them.
@2ck Forgive me if this sounds rude, but I don't see the point in learning and using a "glue" language when you could simply use C/C++ straight away. It allows compiler optimisations throughout the entire program, direct access to operating system features like memory mapping, and fewer wasted instruction cycles overall. ML is the most compute-intensive application I can think of, and I really don't get why Python of all things became the de facto standard.
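(For context on the "glue language" trade-off being debated here: a minimal sketch of why interpreter overhead often doesn't dominate in practice. NumPy stands in for any ML library with a native backend; the point is that the Python layer only orchestrates one call, while the inner loop runs in compiled C/BLAS.)

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal(1000)
b = rng.standard_normal(1000)

# Pure-Python inner loop: every multiply-add pays interpreter overhead.
slow = sum(x * y for x, y in zip(a, b))

# "Glue" style: one Python call, the loop itself executes in native code.
fast = np.dot(a, b)

# Both compute the same dot product, up to floating-point rounding.
assert abs(slow - fast) < 1e-9
```

Both lines compute the same dot product; the difference is where the per-element work happens. This is the usual argument for why Python's slowness matters less in ML than raw cycle counts suggest.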