I just ran a program twice for testing, and both executions took exactly the same time to so many decimal places that I don't even know what that minuscule subdivision of a second is called. Around $10^{-19}$ seconds, I think? (The closest named unit is the attosecond, $10^{-18}$ s.)
The third run took slightly less time, which left me even more amazed by what I had just witnessed.
On top of that, I'm happy because the program was really fast in all three tests.
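For what it's worth, identical timings down to that many digits usually say more about the timer than the program: most system clocks resolve nanoseconds at best, so any digits beyond the clock's reported resolution are meaningless. A minimal Python sketch (the exact resolution printed varies by platform) to check what your timer can actually distinguish:

```python
import time

# Ask CPython what the high-resolution timer actually resolves.
# On typical systems this is around 1e-9 s (a nanosecond),
# many orders of magnitude coarser than 1e-19 s.
info = time.get_clock_info("perf_counter")
print("resolution:", info.resolution)

# Two timestamps taken back to back already differ by far more
# than 1e-19 s, so "identical" timings mean the tool rounded.
t0 = time.perf_counter()
t1 = time.perf_counter()
print("back-to-back delta:", t1 - t0)
```

So two runs matching "to every decimal" most likely means the timing tool printed more digits than it measured, or rounded both runs to the same coarse tick.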