Jerry
12-08-2006, 10:49 AM
1) What determines the maximum speed of a timer loop?
Obviously if you are trying to make a visual change, such as
turning a light on and off, there is no point in making the timer
go faster than the frame rate of the video card. But if the timer
loop just makes a simple calculation, say incrementing a variable,
how fast can you make it go?
2) What happens if the loop contains a complex mathematical calculation
that takes longer than the period you specify for the timer? When the
timer times out and the previous calculation has not had time to complete,
does it restart the calculation, or does it just ignore timeouts until the
calculation has finished?
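To make question 2 concrete, here is a minimal sketch (Python, purely illustrative; the function and variable names are my own) of one plausible behavior: a timer loop that, when the work overruns the period, drops the ticks it missed rather than queueing them up or restarting the work:

```python
import time

def run_timer(interval, work, duration):
    """Fire `work` every `interval` seconds. If a call overruns the
    period, coalesce (skip) the missed ticks instead of queueing them."""
    start = time.monotonic()
    next_tick = start
    fired = skipped = 0
    while time.monotonic() - start < duration:
        if time.monotonic() >= next_tick:
            work()          # may take longer than `interval`
            fired += 1
            # Count the ticks we overran past, and schedule the next one.
            missed = int((time.monotonic() - next_tick) // interval)
            skipped += missed
            next_tick += (missed + 1) * interval
        time.sleep(0.001)   # don't busy-spin while waiting
    return fired, skipped

# Work that takes ~3x the timer period: ticks get coalesced.
fired, skipped = run_timer(0.01, lambda: time.sleep(0.03), 0.3)
```

Whether a real timer implementation coalesces missed ticks like this, queues them, or re-enters the handler is exactly what I'm asking about.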