I am making a multi-player JavaScript game with a C# server. I started off with just JavaScript, using setInterval as the game timer to calculate the position of each player every 1000/60 milliseconds, which worked perfectly and everything was smooth. Now I have moved the main game logic to the server in C# and am using System.Timers.Timer with an interval of 1000/60. My game is now really juddery, and it is because the C# timer ticks anywhere between every 15 and 30 milliseconds.

I understand that I could use the multimedia timer for more precision, but it seems crazy that JavaScript would natively support a more accurate timer than C#. Perhaps something is wrong with my computer?

Here is the code to replicate the issue:

C#
using System;
using System.IO;
using System.Timers;

class Program
{
    static StreamWriter _writer;

    static void Main(string[] args)
    {
        using (var timer = new Timer(1000.0 / 60)) // note: 1000 / 60 would be integer division (16ms, not 16.7ms)
        using (_writer = new StreamWriter(Environment.GetFolderPath(Environment.SpecialFolder.Desktop) + @"\TimerTest.csv"))
        {
            timer.Elapsed += timer_Elapsed;
            timer.Start();

            System.Threading.Thread.Sleep(10000);
        }
    }

    static DateTime _lastTick = DateTime.MinValue;
    static void timer_Elapsed(object sender, ElapsedEventArgs e)
    {
        if (_lastTick != DateTime.MinValue)
            _writer.WriteLine((e.SignalTime - _lastTick).TotalMilliseconds);

        _lastTick = e.SignalTime;
    }
}


and JavaScript.js
JavaScript
function RunTest()
{
    var array = [];
    var date;

    var id = setInterval(function ()
    {
        if (date !== undefined)
            array.push(new Date() - date);

        date = new Date();
    }, 1000 / 60);

    setTimeout(function ()
    {
        clearInterval(id);
        DrawGraph(array);
    }, 10000);
}

function DrawGraph(data)
{
    var width = 800;
    var height = 400;
    var padding = 50;
    var max = Math.max(d3.max(data), 20);

    var xScale = d3.scale.linear().domain([0, data.length - 1]).range([0, width]);
    var yScale = d3.scale.linear().domain([0, max]).range([height, 0]);
    var xAxis = d3.svg.axis().scale(xScale).ticks(5).tickFormat(function (d) { return Math.round(d / 60); }).orient('bottom');
    var yAxis = d3.svg.axis().scale(yScale).ticks(5).orient('left');

    var line = d3.svg.line().x(function (d, i) { return xScale(i); }).y(yScale);

    var svg = d3.select("#chart").append("svg:svg").attr("width", width + padding * 2).attr("height", height + padding * 2).append('svg:g').attr("transform", "translate(" + padding + "," + padding + ")");

    svg.append('svg:g').attr('transform', 'translate(0,' + height + ')').call(xAxis);
    svg.append('svg:g').call(yAxis);
    svg.append('svg:path').attr('d', line(data));
}

Index.html
HTML
<!DOCTYPE html>
<html>
<head>
    <title>TimeoutTest</title>
    <script src="http://d3js.org/d3.v3.min.js" charset="utf-8"></script>
    <script src="JavaScript.js"></script>
    <style>
        path {
            stroke:black;
            stroke-width:1;
            fill:none;
        }

        * {
            font-family: Helvetica;
        }
    </style>
</head>
<body onload="RunTest()">
    <div id="chart"></div>
</body>
</html>


The C# program generates a .csv file which I have graphed in Excel, and the JavaScript displays a graph in the browser. Here are some screen captures: http://imgur.com/cStf6IB

I guess I'm wondering if there's anything stupid I've missed, or if I just have to deal with it. If it's the latter, I think I'll stick with this 'juddery' timer and interpolate in the JavaScript to get back to smooth motion in the client.
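For what it's worth, the client-side interpolation idea can be sketched roughly as below. This is a minimal illustration only; the helper names `lerp` and `interpolatePosition` are made up for this sketch and are not part of the game code:

```javascript
// Linear interpolation between two values.
function lerp(a, b, t) {
    return a + (b - a) * t;
}

// Given the last two authoritative positions received from the server and
// the time elapsed since the newest one arrived, estimate where to draw the
// player. serverInterval is the nominal server tick in ms (e.g. ~16.7, or
// ~31 with the juddery timer).
function interpolatePosition(prev, curr, elapsedMs, serverInterval) {
    var t = Math.min(elapsedMs / serverInterval, 1); // clamp to avoid extrapolating
    return {
        x: lerp(prev.x, curr.x, t),
        y: lerp(prev.y, curr.y, t)
    };
}
```

Calling this from a requestAnimationFrame loop keeps rendering at display rate even when server updates arrive at irregular 15-30ms intervals.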

On another note, I have used CompositionTarget.Rendering for animation in WPF and it works very well; we get a tick every 16ms. Obviously that isn't suitable for a server application, though.
Posted
Updated 18-Jul-13 4:28am
Comments
Manfred Rudolf Bihy 18-Jul-13 16:19pm    
A very well put question with a means to reproduce the behaviour. This question deserves a 5!

Cheers!

You are fighting the classic Windows timer resolution issue. With a twist.
By default, the timer resolution in Windows is 15.6ms, which means that when you start a periodic timer with a 16.7ms interval, you might receive your next event exactly when you expect it, or delayed by as much as 15ms. This explains why you are seeing actual intervals of up to 32ms.

As I mentioned above, 15.6ms is the default timer resolution, but it is possible for an application to change that value. You may ask why the default is not the highest possible resolution (1ms). As far as I can tell, the primary reason is that a higher resolution results in a higher CPU load, which translates into higher power usage, which in turn means a higher electric bill and shorter battery life for portable devices.
Read more here: http://msdn.microsoft.com/en-us/windows/hardware/gg463266.aspx
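The rounding behaviour can be modelled with a quick sketch. The ceiling rule and the 15.625ms figure are a simplified assumption based on the explanation above, not an exact model of the Windows scheduler:

```javascript
// Approximate model: with a timer resolution of 15.625ms, a requested
// interval is effectively rounded up to the next multiple of the resolution.
function expectedInterval(requestedMs, resolutionMs) {
    resolutionMs = resolutionMs || 15.625; // assumed Windows default resolution
    return Math.ceil(requestedMs / resolutionMs) * resolutionMs;
}

// A 1000/60 ~= 16.7ms request lands just above one resolution unit, so it
// rounds up to two units (31.25ms), hence observed intervals of up to ~32ms.
```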

So, timers are not more accurate in JavaScript than in C#; in your case it only appears that way because the browser you are running the JavaScript in sets the timer resolution to a finer value than the default. Since you are not doing the same in your C# program, confusion sets in.
Read more here: http://www.nczonline.net/blog/2011/12/14/timer-resolution-in-browsers

timeBeginPeriod() is used for changing the timer resolution in an application (call timeEndPeriod() when you no longer need the accuracy). I thought there was a .NET equivalent, but I am not seeing one, so it looks like you might have to use P/Invoke.
For more on that, see here: http://stackoverflow.com/questions/15071359/how-to-set-timer-resolution-from-c-sharp-to-1-ms


[EDIT]
As I was re-reading my solution, I realized it seems I recommend changing the timer resolution and keep using regular timers in your C# program. That was not my intention, I mentioned the timeBeginPeriod() with the idea that you could try it out to see if that solved the problem.

I recommend that you do use the multimedia timers instead, just as you mentioned in your question, but hopefully my answer helps explain why you were getting your results.
[/EDIT]


Soren Madsen
 
Comments
Member 10160214 19-Jul-13 4:10am    
This was very interesting read. As it happens all I really want is regular intervals rather than exactly 16.67ms intervals, and Alan N's solution seems to achieve that. I've not really used P/Invoke before and I don't have much time to experiment so I'm not going to investigate this route further, but thanks for all your help.
SoMad 19-Jul-13 7:25am    
I understand, but if you run into other problems with your timers (such as a higher load or UI event handling causing even more erratic intervals), you should seriously consider changing to the multimedia timers.

Soren Madsen
Yes, I can see this jittering effect on my computer too. Or, to be more accurate, I could while a programme was being recorded from the USB digital TV tuner; now that the recording has stopped and the tuner has been unplugged, the timer is quite regular.

I was seeing SignalTime intervals of either 15.625ms or 31.2ms and could confirm with the System.Diagnostics.Stopwatch that the variance was real.

If your timer resolution is the same as mine, the intervals will be multiples of 15.625ms, and setting 16ms (1000/60) will give a (rounded up) interval of 31.25ms. This is the case on the unloaded system, but the TV tuner certainly messes things up a bit, and surprisingly the 'bad' Timer.Elapsed events are fired at a shorter interval than expected (~16ms instead of ~31ms), something that is not easy to explain.

Interestingly, setting a timer interval above 16, i.e. 17 to 31ms, gives consistent event intervals of ~31ms, and of course the shortest possible event interval of ~16ms is achieved by setting the timer interval to anywhere between 1 and 15.

The same jittering between the rounded-up and the rounded-down interval was seen when the timer interval was set to a value just above other multiples of 15.625, i.e. 32, 47 and 63ms. At higher multiples the effect disappeared, and as indicated before, there was no jitter unless the TV tuner was operating.

So confirmation rather than explanation, but at least you're not alone!

Alan.

[LATE EDIT the jittering effect can also be reproduced when an mpeg2 video is played in either Windows media player or VLC media player]
 
 
Comments
Member 10160214 19-Jul-13 4:06am    
Okay, I set the interval to 1ms and then 17ms; both of these give a regular interval, at ~15ms and ~31ms respectively, as you say. This is a perfect solution for me, as all I really wanted was regular intervals. Updated with a new graph: http://imgur.com/a/Fc2Fh.

Thanks for your help!
Maybe it is because the C# timer control works on the server side, so it is dependent on the client request (the time it takes for the request and response according to the triggered event),
while JavaScript's setInterval works on the client side, which is why it is more precise.

Hope this solves it for your purpose.

Thanks
 
 
Comments
Member 10160214 18-Jul-13 10:44am    
Well, that's what I thought at first, but the C# code I posted above is a self-contained example with no client-server comms involved, and it still shows the same problem.
Yuriy Loginov 18-Jul-13 11:52am    
It could be because the C# program is writing to disk, which causes the run time of the Elapsed event to be longer than the interval of the timer. From Microsoft: "If the processing of the Elapsed event lasts longer than Interval, the event might be raised again on another ThreadPool thread. In this situation, the event handler should be reentrant." What this tells me is that events may sometimes be ignored if the event handler is still running. Maybe try sending the output to the console to see if it makes a difference.
Member 10160214 18-Jul-13 12:05pm    
Okay tried that, still no difference, I'm still seeing up to 32 ms between ticks, the same as in the graph.
Yuriy Loginov 18-Jul-13 12:12pm    
That's a little surprising... The final thing I would try is eliminating all computations inside the event handler, simply adding e.SignalTime to a list and then printing that list when the thread sleep has elapsed.
Member 10160214 18-Jul-13 13:36pm    
Still the same, and either way there's no chance that a simple subtraction is going to take 16ms. Do you get the same result, or is this limited to my computer?

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
