Or perhaps it is no longer known by the name "queuing theory"? 😅 That would probably be the simplest explanation.


Maybe someone out here can help me with this question: why isn't queuing theory more popular in computer science? It seems like an extremely powerful tool for performance modeling, yet I rarely encounter it in HPC performance-modeling papers; perhaps I just don't know where to look? A notable exception is Vasily Volkov's GPU work, although he only seems to use Little's law, and I would hope there are deeper results that could be of use.
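Even just Little's law, L = λW, goes a long way in Volkov's setting: the concurrency L you need in flight equals the throughput λ you want times the latency W of each operation. A minimal sketch of that back-of-the-envelope calculation, with made-up numbers of my own (the latencies and throughputs below are illustrative assumptions, not measurements from Volkov's papers):

```python
# Little's law (L = lambda * W) as Volkov applies it to GPU latency hiding:
# the number of memory accesses that must be in flight (L) to sustain a
# target throughput (lambda) at a given latency (W).
# All numbers here are illustrative assumptions.

mem_latency_cycles = 400.0   # assumed latency per memory access (W)
peak_throughput = 32.0       # assumed accesses completed per cycle (lambda)

# Concurrency needed to run at peak throughput:
required_concurrency = peak_throughput * mem_latency_cycles
print(f"in-flight accesses needed: {required_concurrency:.0f}")

# Conversely, if occupancy limits concurrency, achievable throughput
# is capped at L / W:
available_concurrency = 8000.0  # assumed occupancy-limited in-flight ops
achieved = min(peak_throughput, available_concurrency / mem_latency_cycles)
print(f"achievable throughput: {achieved:.1f} accesses/cycle")
```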

At my school, I don't see any evidence of a CS class being taught on the subject. Both the Industrial/Systems Engineering and Electrical Engineering departments have offered courses in the past, but they don't seem to offer them regularly.

Why don't people seem to care about queuing theory? Was it not as useful as its advocates claimed? Did it not yield any useful models?

Anyone following the drama today? In lieu of a CoC, they adopted the 1500-year-old Rule of St. Benedict 😂

I've been working through the papers that make up SuperLU (a direct solver for sparse linear systems), and I stumbled across this list of freely available linear algebra software - good to have around!

netlib.org/utk/people/JackDong
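If you want to poke at SuperLU without writing the C driver code yourself, SciPy wraps it: scipy.sparse.linalg.splu calls SuperLU under the hood. A minimal sketch (the matrix values here are made up purely for illustration):

```python
# Direct sparse solve via SciPy's SuperLU wrapper.
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import splu

# Small sparse system; splu wants CSC format. Values are illustrative.
A = csc_matrix(np.array([[4.0, 1.0, 0.0],
                         [1.0, 3.0, 1.0],
                         [0.0, 1.0, 2.0]]))
b = np.array([1.0, 2.0, 3.0])

lu = splu(A)        # sparse LU factorization (SuperLU under the hood)
x = lu.solve(b)     # forward/back substitution reusing the factors
print(x, np.allclose(A @ x, b))
```

The nice part of the direct-solver workflow is that the factorization is reusable: once you have lu, you can call lu.solve on as many right-hand sides as you like without refactoring.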
