Dolly 2.0 is a really big deal: https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm
"The first open source, instruction-following LLM, fine-tuned on a human-generated instruction dataset licensed for research and commercial use"
My notes so far on trying to run it: https://til.simonwillison.net/llms/dolly-2
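For anyone who wants to try it directly, here's a minimal sketch of loading the model through Hugging Face transformers — the package names and pipeline arguments follow the databricks/dolly-v2-12b model card, and it assumes you have the accelerate package installed and enough GPU memory for a 12B-parameter model (my TIL notes above cover the details and pitfalls):

```python
# A minimal sketch, assuming transformers + accelerate and a large enough GPU.
import torch
from transformers import pipeline

generate_text = pipeline(
    model="databricks/dolly-v2-12b",  # the Dolly 2.0 weights on Hugging Face
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # Dolly ships its own instruction-following pipeline code
    device_map="auto",
)

print(generate_text("Explain what makes Dolly 2.0 notable, in one sentence."))
```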
@fell IDK how familiar you are with Python or ML libraries in Python, but for "real" applications (not just learning the basics) all of the actual computation of the model is pushed down to native code. Python remains useful as the glue language, as it always has been.
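To make that "glue language" point concrete, here's an illustrative sketch (not from the original post) using NumPy — the Python code just sets up the data and dispatches the work, while the heavy numeric computation runs in compiled native code inside the library:

```python
# Illustrative only: Python as glue, native code doing the math.
import numpy as np

a = np.random.rand(2048, 2048).astype(np.float32)
b = np.random.rand(2048, 2048).astype(np.float32)

# The @ (matrix multiply) operator dispatches to an optimized BLAS routine
# written in C/Fortran; the Python interpreter never loops over the elements.
c = a @ b
print(c.shape)
```

The same pattern holds for PyTorch, TensorFlow and the rest: the Python layer describes what to compute, and the actual tensor operations execute in C++/CUDA.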
@fell ah, if you mean why Python is in the position it's in, I think it's mostly not technical and more cultural, and to some extent a historical accident. Like, a few motivated ML folks also liked Python and built libraries that were easy to play around with, which others in the community picked up on and built on.