What causes the speed-up bug, and how to sort of fix it.

In my efforts to reverse engineer Saints Row 2, I've discovered what causes the speed-up bug, how a bug in Windows fixes it for some people, and how to sort of fix it.

I have found that the game divides the QueryPerformanceFrequency value by 1,000 and by 1,000,000 to get the ticks-per-millisecond and ticks-per-microsecond values respectively. The problem is that the game stores these values as integers rather than floats, so the division truncates, precision is lost, and the game speeds up as a result.
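
Here is a minimal sketch of what that setup appears to amount to; the variable and function names are my own, and only QueryPerformanceFrequency is the real Windows API call:

#include <windows.h>
#include <cstdint>

// A minimal sketch of the flawed timer setup: the frequency is divided
// with integer arithmetic, so any remainder is silently thrown away.
int64_t g_ticks_per_second;
int64_t g_ticks_per_ms;   // stored as an integer, which is the bug;
int64_t g_ticks_per_us;   // these should be doubles (or kept as raw ticks)

void init_timer()
{
    LARGE_INTEGER freq;
    QueryPerformanceFrequency(&freq);

    g_ticks_per_second = freq.QuadPart;
    g_ticks_per_ms = freq.QuadPart / 1000;     // truncates the remainder
    g_ticks_per_us = freq.QuadPart / 1000000;  // truncates badly at ~10 MHz
}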

For example, if the ticks-per-second value is 9,999,999, the ticks-per-millisecond value is stored as 9,999 and the ticks-per-microsecond value as 9. Converting those back to ticks per second gives 9,999,000 and 9,000,000 respectively. For the microsecond value, that's roughly a 10% speed-up.
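
Plugging those numbers in shows the size of the error on each path (a standalone check, nothing game-specific):

#include <cstdint>
#include <cstdio>

int main()
{
    // The example frequency from above.
    int64_t ticks_per_second = 9999999;

    // What the game effectively believes after the truncated divisions.
    int64_t ms_rate = (ticks_per_second / 1000) * 1000;        // 9,999,000
    int64_t us_rate = (ticks_per_second / 1000000) * 1000000;  // 9,000,000

    // Time is measured against the truncated rate, so elapsed time comes
    // out larger than reality and the game runs faster by real/truncated.
    std::printf("ms path speed-up: %.3f%%\n",
                ((double)ticks_per_second / ms_rate - 1.0) * 100.0);  // ~0.010%
    std::printf("us path speed-up: %.3f%%\n",
                ((double)ticks_per_second / us_rate - 1.0) * 100.0);  // ~11.111%
    return 0;
}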

Luckily, the game engine has a variable that changes the speed of the engine; it looks like the play testers reported the bug and a fix was attempted. Unfortunately, it doesn't work with the cutscenes, as different parts of the cutscene engine use different metrics and everything goes out of sync.

I thought I'd report my findings, as some people might find them useful.

Here are the memory addresses of the relevant variables:

Address       Type            Description
0x02346510    64-bit integer  Ticks per second frequency
0x023464F8    64-bit integer  Ticks per millisecond frequency
0x02346500    64-bit integer  Ticks per microsecond frequency
0x00E8438C    float           Game speed divider
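
Using those addresses, here's a rough sketch of what a compensation patch might look like. This is purely a sketch under my own assumptions: I don't know exactly how the engine applies the game speed divider, the scaling direction is a guess, and the addresses only apply to this build of the game.

#include <cstdint>

// Addresses from the table above; they are only valid for this particular
// build of the game and its memory layout.
static auto* const TICKS_PER_SECOND = reinterpret_cast<volatile int64_t*>(0x02346510);
static auto* const TICKS_PER_US     = reinterpret_cast<volatile int64_t*>(0x02346500);
static auto* const GAME_SPEED_DIV   = reinterpret_cast<volatile float*>(0x00E8438C);

// Hypothetical compensation, meant to run from code injected into the game
// process: scale the speed divider by the ratio of the real frequency to
// the truncated microsecond-derived one. Whether the divider should be
// scaled up or down depends on how the engine applies it, which I'm
// guessing at here.
void compensate_speed()
{
    double real      = static_cast<double>(*TICKS_PER_SECOND);
    double truncated = static_cast<double>(*TICKS_PER_US) * 1000000.0;
    if (real > 0.0 && truncated > 0.0)
        *GAME_SPEED_DIV = static_cast<float>(*GAME_SPEED_DIV * (real / truncated));
}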

Here's a quick video I made explaining how it works:

 
This is a big breakthrough! More people should focus on this, as it would help in fixing cutscene and gameplay issues.