How big is that performance gain after all?
For me, the natural way of reporting the result would be to use the execution time of the original JSC as the baseline and give the improvement in terms of execution time: 1 - 77/100 = 0.23, i.e., a 23% progression in runtime.
However, the scripts shipped with SunSpider take a different approach: they report that my version is 100/77 = 1.29x as fast as the original. (This is mathematically sound, too.) Often, this is quoted as a 29% progression in performance.
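To make the two conventions concrete, here is a small sketch in Python. The helper names are mine, and the 100 ms baseline and 77 ms improved time are the illustrative figures from above:

```python
def runtime_reduction(baseline, improved):
    """Improvement as a fraction of the baseline execution time."""
    return 1 - improved / baseline

def speedup(baseline, improved):
    """How many times as fast the improved version is."""
    return baseline / improved

# Same measurement, two conventions (baseline 100 ms, improved 77 ms):
print(runtime_reduction(100, 77))  # ~0.23  -> "23% progression in runtime"
print(speedup(100, 77))            # ~1.29  -> "1.29x as fast", quoted as "29% faster"
```

The gap between the two figures grows with the size of the improvement: halving the execution time is a 50% runtime reduction but a 2x (100%) speedup.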
So, for the same result, we have two interpretations, and the only hint at the difference between the figures is the word following 'progression in'. I'm not sure we always pay attention to that last word, especially when it's omitted altogether, as in '10% progression'. Unfortunately, this can make comparisons between different improvements wobbly.
Moreover, from a marketing point of view it does matter whether you 'sell' your work as being 23% or 29% better. Of course, if it's marketing, one should always go for the higher figure... But then there is the risk that someone will accuse you of 'tweaking' the math for your own good...
Well, there is no real conclusion. The question remains on the table: how big is that gain? Shall it be reported in terms of execution time, performance, or both?
Or am I simply overcomplicating things?