I wonder whether the work done is measured over a fixed amount of time, or whether the speedup is just a factor that scales as the machine scales?
How you define work depends entirely on the problem you are solving and the context in which you are solving it. I think this lecture does a good job showing that there is no single answer to that question. For example, in practice it may be enough for your solution to scale only up to a certain fixed input size. In that case, you could measure and validly claim a speedup that wouldn't extend beyond the problem size of interest.
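To make that concrete (an illustrative sketch, not something from the lecture): one standard model for the fixed-work case is Amdahl's law, which treats speedup as a function of machine size for a problem of fixed total work. It shows why a speedup measured at one machine scale need not keep growing as you add workers: the serial fraction caps it.

```python
def amdahl_speedup(parallel_fraction: float, n_workers: int) -> float:
    """Predicted speedup for a fixed-size problem under Amdahl's law.

    parallel_fraction: share of the total work that parallelizes perfectly.
    n_workers: number of workers the parallel part is spread across.
    """
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_workers)

# For a fixed problem where 90% of the work parallelizes, speedup is
# capped at 1 / 0.1 = 10x no matter how many workers you add:
for n in (1, 10, 100, 1000):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

With 10 workers the model already predicts only about 5.3x, and even 1000 workers stay just under 10x, which matches the answer above: a speedup claim is tied to the scale (and work definition) at which it was measured.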