Several years ago, a coworker made an Arduino "mouse" that traced a square with a 1 kHz report rate. Multiple systems running Windows 10 and reading mouse updates via either regular window events or the Win32 raw mouse API would occasionally lose events, so the square would drift over time. We tried both a custom app and drawing programs like Paint.
We could not reproduce the issue on systems running macOS or Linux, and we chalked it up to a bug in Windows. It was hard to know if it affected real mice, but I expect it did. I haven't tried retesting with more recent versions of Windows to see if it is fixed; maybe it has been.
Anyway, I'm not disputing OP's claim, I can totally believe it, but I always thought it was funny that pro gamers on Windows with high-end mice could be losing the occasional movement and apparently nobody noticed.
I can buy that humans can see at least 120 Hz. 60 Hz is the generally accepted threshold, but I've long suspected that 120 Hz has effects that are mostly imperceptible yet occasionally noticeable.
I can't buy this:
> I've also learnt I do benefit from the 8 kHz setting of my mouse, as even at 3200 DPI with fast & smooth motion, some frames still miss a pointer update
It may be true that pointer updates were being missed. But does that really affect anything?
It turns out that there's a way to test this experimentally: run a double-blind experiment, just like in science. If you can tell which monitor is the 240 Hz one at better than chance, then it matters. Ditto for the pointer updates.
The corollary is that if you can't tell at better than chance, then none of this matters, no matter how much you think it does.
Experiments like this have decisively settled the "Does a higher sampling rate matter when listening to music?" debate, among other questions. People still swear that they can tell there's a difference, but it's expectation bias. They're mistaken.
(10 ms drops every few seconds would definitely be noticeable though; that wasn't the point.)
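If anyone wants to run that kind of test, here's a minimal sketch of the scoring side only (my own, not from the thread; the function names and the 17-out-of-20 example are made up): tally blinded two-alternative trials and check how likely the result would be under pure guessing. Actually switching the refresh rate without tipping off the subject is the hard part and isn't shown.

```typescript
// Minimal scoring for a blinded 2AFC refresh-rate test. Switching 120 Hz
// vs 240 Hz without the subject knowing is not shown; this only tallies
// the guesses.
function binomialCoeff(n: number, k: number): number {
  let c = 1;
  for (let i = 0; i < k; i++) c = (c * (n - i)) / (i + 1);
  return c;
}

// P(X >= k) for X ~ Binomial(n, p): the chance of guessing this well.
function binomialTail(n: number, k: number, p = 0.5): number {
  let prob = 0;
  for (let i = k; i <= n; i++) {
    prob += binomialCoeff(n, i) * Math.pow(p, i) * Math.pow(1 - p, n - i);
  }
  return prob;
}

// trials[i] is true when the subject correctly identified the 240 Hz setup.
function scoreTrials(trials: boolean[]): void {
  const n = trials.length;
  const k = trials.filter(Boolean).length;
  const pValue = binomialTail(n, k);
  console.log(`${k}/${n} correct, p = ${pValue.toFixed(4)} under pure guessing`);
  console.log(pValue < 0.05 ? "probably a real difference" : "consistent with guessing");
}

// Example: 17 correct answers out of 20 blinded trials.
scoreTrials(Array.from({ length: 20 }, (_, i) => i < 17));
```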
> I can buy that humans can see at least 120 Hz. 60 Hz is the generally accepted threshold, but I've long suspected that 120 Hz has effects that are mostly imperceptible yet occasionally noticeable.
There are videos on YouTube showing people perceiving differences at much higher frame rates, e.g. https://www.youtube.com/watch?v=OX31kZbAXsA (long video, so you can skip to the end: they found that even casual players performed measurably more consistently at 240 Hz than at 144 Hz).
Anecdotally, I recently switched to playing racing games at 165 FPS and the difference is massive!
As per the post, I wrote this tool to confirm I was getting jerks of ~10 ms every few seconds on one USB port and not the other. This would _suggest_ I can catch differences in the ballpark of 100 Hz.
I'm game for a randomized blinded test on 120 Hz refresh rate vs 240 Hz refresh rate. I would indeed be very curious to confirm I can tell the difference with a proper protocol.
Many years back (we were on CRTs), I was in similar shoes, convinced my friend couldn't tell the difference between 60 Hz and 90 Hz when playing video games.
Turns out he only needed to watch the pointer during a single push of the mouse to tell right away; he was correct 100% of the time in a blinded experiment.
> Many years back (we were on CRTs), I was in similar shoes, convinced my friend couldn't tell the difference between 60 Hz and 90 Hz when playing video games.
That's a silly experiment. I could look at a CRT with a completely static image and tell almost immediately whether it was at 60 Hz, 90 Hz or 120 Hz. Flicker at 60 Hz was awful, 90 Hz was clearly perceptible, and even 120 Hz was often somewhat noticeable. And most CRT/graphics card combos would become perceptibly blurry in the horizontal direction at 120 Hz at any reasonable desktop resolution, so you could never truly win. Interlaced modes made the flicker much less visible, but the crawling effect was easy to see and distracting.
Yes, this is a (computer) CRT thing. The commenter might be misremembering the exact numbers: a 60 Hz baseline is a flat-panel thing. By the mid-to-late '90s, 75 Hz was what a typical computer CRT would generally aim for, and it was part of various standards and recommendations.
As to how you can perceive the difference between 120 events per second and 240, I have what I hope is a fairly simple explanation.
It's like lightning strokes of tens of microseconds making a lasting impression on your perception of the scene. You don't "count" strokes over time, but in space.
When you make circles fast and large enough on screen, you can evaluate the number of cursors that appear before your eyes. At 4 circles per second, is each circle made of ~60 pointer images (240 Hz) or ~30 (120 Hz)? Belief, not fact: it's not hard to guess.
“If anyone wants to implement this, I think the way to do it is to put the mouse cursor randomly on the edge of a circle whose radius is a few hundred pixels. The randomness is important, though I'm not sure it would be possible to count how many cursors there are.”
And then I realized that doesn't work, for a few reasons.
One is that you won't be able to count how many cursors appear during one second. It'll all look like a jumble.
That leads to the argument that you should place the cursors at a consistent spacing, chosen so that the cursors land on the same spots on the screen each loop around the circle.
Unfortunately that doesn't work either, because you'll end up seeing a trail of cursors going around the circle once per second, and counting the cursors is hopeless.
So I think you'd need to make a list of the spots on the circle where the cursors should go, then randomly select from them as quickly as possible. That would let each cursor be perceptible because successive cursors are spread out across the circle; the next cursor won't be just one pixel away, which eliminates the "trail of cursors" problem.
I'm still a bit skeptical this could work, but I admit I can't think of a reason it wouldn't. You'll need to be careful, because it's really easy to fool yourself that you've done it correctly when you haven't.
It would be interesting to make a WebGL canvas and try this out for real. Or maybe just reposition the mouse cursor with Python instead of doing anything graphical.
It seems important to reposition the mouse cursor rather than use WebGL to draw frames, but I think both could work. Actually, the WebGL route would be more faithful to the question of whether gamers specifically can notice 240 Hz; there are all kinds of reasons why repositioning the mouse cursor wouldn't really tell you that. Vice versa too: it might be possible to notice when repositioning cursors but not when using WebGL, though I can't think of why that would be the case.
Neat idea. Thanks.
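For reference, here's a minimal sketch of the canvas variant described above (plain 2D canvas rather than WebGL; the constants and names are arbitrary, not from the thread): precompute evenly spaced spots on a circle, then each animation frame draw a single dot at a randomly chosen spot. Whether you can actually count anything is exactly the open question.

```typescript
// One dot per animation frame, at a randomly chosen precomputed spot on a
// circle, so consecutive dots land far apart and no trail forms. A 240 Hz
// display flashes twice as many dots per second as a 120 Hz one.
const canvas = document.createElement("canvas");
canvas.width = 800;
canvas.height = 800;
document.body.appendChild(canvas);
const ctx = canvas.getContext("2d")!;

const SPOTS = 16;   // candidate positions around the circle
const RADIUS = 300; // "a few hundred pixels"
const spots = Array.from({ length: SPOTS }, (_, i) => {
  const a = (2 * Math.PI * i) / SPOTS;
  return { x: 400 + RADIUS * Math.cos(a), y: 400 + RADIUS * Math.sin(a) };
});

function frame() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  const s = spots[Math.floor(Math.random() * SPOTS)]; // random, not sequential
  ctx.beginPath();
  ctx.arc(s.x, s.y, 5, 0, 2 * Math.PI);
  ctx.fill();
  requestAnimationFrame(frame); // one dot per display refresh
}
requestAnimationFrame(frame);
```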
Higher refresh rates don't have to be perceptible to be useful: they can shift the balance in head-to-head gaming.
Imagine two identical gaming setups with two players of equal skill. In an FPS game, you'd expect each of those players to win 50% of the games.
Now switch one monitor from 120 Hz to 240 Hz. On average, the player on the 240 Hz monitor will see their adversary about 4 ms earlier than the player on the 120 Hz monitor, and thus be able to push the mouse button earlier too.
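A quick back-of-the-envelope on where a figure like that comes from (my arithmetic, not the parent's): the extra latency before a just-missed update becomes visible is about half a frame period on average and a full period in the worst case.

```typescript
// Back-of-the-envelope: added display latency is ~half a frame period on
// average, a full period in the worst case (my numbers, not the parent's).
const framePeriodMs = (hz: number) => 1000 / hz;

for (const hz of [120, 240]) {
  const period = framePeriodMs(hz);
  console.log(`${hz} Hz: frame = ${period.toFixed(2)} ms, average added latency ≈ ${(period / 2).toFixed(2)} ms`);
}
// 120 Hz: frame = 8.33 ms, average added latency ≈ 4.17 ms
// 240 Hz: frame = 4.17 ms, average added latency ≈ 2.08 ms
// So the edge is roughly 2 ms on average and about 4 ms in the worst case.
```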
I think this sort of effect is what makes people think they can tell the difference: they notice the indirect side effects that correlate with it.
A pro FPS player might notice that they lose contests peeking around corners more often. Obviously network latency in online games will be a factor as well, but since it likely averages out for both players over time, I would guess you can mostly discount it, along with alternating who's doing the peeking.
I don't think anyone could look at a scene on a 120 Hz vs 240 Hz display and tell the difference; there needs to be some indirect clue.
I play video games at a decently high level (like top ~10% in a few competitive games). To support what you're saying, I can tell the difference between 144 Hz and 240 Hz if I'm in control. For example, if I can shake the screen around.
If I'm just watching, I'm not sure I could even tell the difference between 60 Hz and 144 Hz.
Do any competitive FPS games actually render 240 different frames in a second? Because if both players' hardware is doing 60 FPS, that monitor difference changes nothing.
More directly, if the game engine only updates player state 60 times per second (its tick rate), is this 4 ms advantage actually present in the 240 Hz case?
Further, if your network has more than 4 ms of jitter, then I don't think you can make any concrete claim in either direction.
My theory about why a higher frame rate helps gamers is that for something like a whip turn, with a low frame rate your brain has to take a brief moment to work out where it ended up looking after the pan. But if the frame rate is high enough, your brain can keep updating its state during the pan, because the updates are continuous enough not to lose "state" along the way. This means that when you finish the fast move, there is no few-millisecond delay while you reorient yourself.
The author didn't say that they have a use for those 3200 updates per second other than as a workaround for some other issue. With a competently composited desktop and applications that pace input processing and frame generation well, and ignoring pointer acceleration, one correctly timed update per frame is enough. (As far as I know this does not exist from any vendor on a modern system other than for games, although really old Apple II-era software often got it right.) For acceleration, some pointer history is needed. And no one has a mouse with an API that allows the host to pace the updates.
Presumably the 3200 Hz is needed for a combination of reasons:
- Under ideal conditions, if you want less than 10% variation in the number of samples per frame at 240 Hz, you may need ~2400 Hz. This effect is visible even by human eyeballs: you can see multiple cursor images across your field of view, and uneven spacing is noticeable. (See the quick simulation sketched below.)
- The mouse itself may work less well at a lower sampling rate.
- The OS and input stack may be poorly designed and work better at higher rates.
In any case, the application and cursor implementation are unlikely to ask for a mouse location more than once per frame, so the user is not really using 3200 updates per second, but that's irrelevant.
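To put a number on the samples-per-frame bullet above, here's a small simulation (my own sketch; samplesPerFrame is a made-up helper, and it assumes perfectly regular polling, which real mice and USB scheduling don't deliver): it counts how many mouse reports land in each 240 Hz frame at a few polling rates.

```typescript
// Count how many mouse reports land in each 240 Hz frame for a given
// polling rate (idealized: perfectly regular polling, random phase).
function samplesPerFrame(pollHz: number, frameHz = 240, frames = 10000): { min: number; max: number } {
  const framePeriod = 1000 / frameHz;
  const pollPeriod = 1000 / pollHz;
  let nextPoll = Math.random() * pollPeriod; // random phase offset
  let min = Infinity;
  let max = 0;
  for (let f = 0; f < frames; f++) {
    const frameEnd = (f + 1) * framePeriod;
    let count = 0;
    while (nextPoll < frameEnd) {
      count++;
      nextPoll += pollPeriod;
    }
    min = Math.min(min, count);
    max = Math.max(max, count);
  }
  return { min, max };
}

for (const hz of [1000, 2000, 8000]) {
  const { min, max } = samplesPerFrame(hz);
  console.log(`${hz} Hz polling: ${min}-${max} reports per 240 Hz frame`);
}
// 1000 Hz: 4-5 reports per frame (about ±12% around the mean)
// 2000 Hz: 8-9 (about ±6%)
// 8000 Hz: 33-34 (about ±1.5%)
```

Even with perfectly regular polling the per-frame count wobbles by one report either way, so the relative variation only drops to roughly 10% once you're past a couple of thousand reports per second, which matches the ~2400 Hz figure above.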
People were skeptical that 120 Hz vs 60 Hz made any difference in gaming. I'm still not sure how skeptical I should be. The threshold seems to be somewhere between 60 and 240, and it's really hard to believe that 240 makes a measurable difference in real-world performance (e.g. competitive games). But I can believe 120 Hz matters vs 60 Hz.
There's also confusion between human response time and whether you can perceive something. Even if 240 Hz looks slightly different, if a human can't react to that difference (other than to say it looks nicer, which is a personal preference rather than an empirical assessment of "better"), then it doesn't really matter anyway. Kind of like how Avatar looked different at 48 Hz instead of 24 Hz, and at the time it was hailed as some revolution in movies, and then it came and went. Personal preference.
As a direct answer to your question, I was a gamedev from 2005 to 2012, and back then people were arguing that 120 Hz couldn't make a difference and that 60 Hz was fine. It stuck with me, since it seemed mistaken. So I shouldn't have said "generally accepted," just "I vaguely remember the world arguing a decade or so ago that 60 Hz was good enough in all situations, e.g. competitive gaming."
How much latency you can perceive greatly depends on the context. But in the right context, humans can perceive display latency down to close to 1 ms, as demonstrated by Microsoft Research many years ago. There is no excuse to be skeptical about that. https://www.youtube.com/watch?v=vOvQCPLkPt4
So tired of defending against this same old, completely wrong intuition, especially from people saying "do the science" to justify their ignorance instead of looking it up themselves, since the science has already been done and is coming up on a full century old.
From this one paper alone, humans can perceive information from a single frame at 2000 Hz: https://doi.org/10.1080/00223980.1945.9917254
Humans can read a number and reproduce it immediately when a 5-digit number is displayed for 1 frame at 400 fps. This is a single exposure; it is not a looping thing with persistence of vision or anything like that. 7-digit numbers required the frame rate to drop to 333 fps. Another student reproduced a 9-digit number from a single frame at 300 fps. These were the average results. The record was a correct reproduction of a 7-digit number from a single viewing of a single frame at 2000 Hz, which was the limit (within 2% accuracy) of the tachistoscopic equipment in question. From the progression of the students chasing records, no slowing of their progress was ever in sight. The later papers from this author involve considerable engineering difficulty in constructing an even faster tachistoscope and are limited by 1930s-1940s technology.
This research led the US Navy in WW2 to adopt tachistoscopic training methods for aircraft recognition, replacing the WEFT paradigm (which had approximately a 0% success rate) with a 1-frame-at-75-fps paradigm, which led to 95% of cadets reaching 80% accuracy on recognition, and 100% of cadets reaching 62.5% accuracy, after just 50 sessions.
Yes, humans can see 2000 fps.
Yes, humans can see well beyond 2000 fps in later work from this researcher.
Yes, humans can detect flicker well above 1000 fps in daily life at the periphery of vision, with cone cells, as cone cells can fire from a single photon of light and our edge-detection circuits operate at a far higher frequency than our luminance and flicker-fusion circuits. Here's flicker being discriminated from steady light at an average of 2 kHz for 40-degree saccades, with an upper limit above 5 kHz during 20-degree saccades, which would be much more typical for eyes on a computer monitor.
There is no known upper limit to the frequency of human vision that is detectable. As far as I know, all studies (such as the one I link) have measured up to the reliable detection limit of their equipment, never up to a human limit.
I think the original claim got corrupted into what people argue about now: those lower fps were found to be roughly the border between perceiving something as smooth motion and jerky stop-motion (like claymation). Then someone misunderstood "smooth motion" to mean we can't perceive any better than that, and it started getting repeated incorrectly as the upper limit.
I bought a Logitech wireless mouse called the Marathon, which boasted an amazing three-year battery life on two AAs. I initially thought it was broken; it had a maddening delay where the sensor turned off after a short idle time, so when I wanted to use the mouse, it didn't register the first movements since it had to "wake up".
This delay wasn't present on the Logitech gaming mouse I previously used, probably a combination of a high polling rate (500 Hz) and a much longer idle delay. The battery life was also much shorter, only 250 hours in high-performance mode, but I just recharged a set of AA batteries every week so it was never an issue. I ended up returning the Marathon mouse.
I'm happy with my $5 wired Logitech mouse as it's got essentially zero lag, never runs out of batteries and, unlike the high-end Logitech mice, has no "rubber", which tends to invariably go icky over time.
At one point I had a Razer wireless mouse (Mamba, I think?) which had no discernible latency and a nice dock for recharging, and I was very happy with it until one evening it just stopped working. While alone in my flat, I stepped away from my computer for about an hour, didn't even put it to sleep, came back, and the mouse would no longer move the pointer but would still register clicks. I tried contacting customer support asking if there was a way to reset it or reflash the firmware or something, and they were just like "nope". Last piece of Razer hardware I ever bought.
Seems like it doesn't properly handle mouse events in Safari on macOS and only shows "frames with no pointer events". I assume it's because the "pointerrawupdate" event is not supported there.
Also it's interesting that with ProMotion enabled it reports 16.67 ms per frame (indicating a 60 Hz redraw rate) in Safari, but in Chrome it's 8.33.
Yes, I rely on pointerrawupdate. Thanks for letting me know! Unfortunately pointermove is typically synced with graphics in my limited experience, and I think I'd rather not show anything than provide wildly inaccurate numbers.
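For anyone curious what that measurement looks like, here's a rough sketch (not the site's actual code, just an illustration of the idea): count raw pointer updates between animation frames and flag frames that got none.

```typescript
// Sketch of the measurement idea (not the site's actual code): count raw
// pointer updates per rendered frame. pointermove is usually coalesced to
// the frame rate, which is why it makes a poor substitute and why the tool
// shows nothing on browsers without pointerrawupdate support.
if ("onpointerrawupdate" in window) {
  let updatesThisFrame = 0;
  let lastEventTime = 0;
  let lastGapMs = 0;

  window.addEventListener("pointerrawupdate", (e: Event) => {
    const pe = e as PointerEvent;
    updatesThisFrame++;
    if (lastEventTime) lastGapMs = pe.timeStamp - lastEventTime; // ms between reports
    lastEventTime = pe.timeStamp;
  });

  let emptyFrames = 0;
  let frames = 0;
  function onFrame() {
    frames++;
    // Zero updates during continuous motion means the display refreshed
    // without fresh pointer data (the pointer may of course just be idle).
    if (updatesThisFrame === 0) emptyFrames++;
    updatesThisFrame = 0;
    if (frames % 240 === 0) {
      console.log(`${emptyFrames}/${frames} frames without a pointer update, last gap ${lastGapMs.toFixed(2)} ms`);
    }
    requestAnimationFrame(onFrame);
  }
  requestAnimationFrame(onFrame);
}
```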
Although it's for gamepads, it's pretty much indispensable in debugging gamepad-related latency issues. For example, I found that my presumably 1000Hz controller can do only 500Hz in ideal conditions and it starts to drop at a much lower distance from the computer than advertised. Neat stuff.
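I haven't seen that gamepad tool's source, but the underlying trick is simple enough to sketch (my own rough version; main-thread-blocking and Chromium-centric): the Gamepad API bumps gamepad.timestamp whenever the controller delivers new data, so polling getGamepads() for a second and counting timestamp changes gives a rough effective report rate.

```typescript
// Rough estimate of a controller's effective report rate: poll
// navigator.getGamepads() in a busy loop for a short window and count how
// often gamepad.timestamp changes. Blocks the main thread briefly; fine
// for a one-off measurement. Chromium refreshes gamepad state on each
// getGamepads() call; other engines may only update between tasks, in
// which case poll from setTimeout instead.
function estimateGamepadHz(windowMs = 1000): number {
  const start = performance.now();
  let lastTimestamp = -1;
  let changes = 0;
  while (performance.now() - start < windowMs) {
    const pad = navigator.getGamepads()[0];
    if (pad && pad.timestamp !== lastTimestamp) {
      lastTimestamp = pad.timestamp;
      changes++;
    }
  }
  return changes / (windowMs / 1000);
}

// Gamepads only show up after a user gesture and a "gamepadconnected"
// event; wiggle a stick while it measures.
window.addEventListener("gamepadconnected", () => {
  console.log(`~${estimateGamepadHz().toFixed(0)} gamepad updates/s`);
});
```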
haiku2077 - 7 hours ago
https://youtu.be/nqa7QVwfu7s
haiku2077 - 5 hours ago
https://youtu.be/GqhhFl5zgA0
You can film the screen in slow motion and visually see more fluid motion (and see how it reacts to player input).
Games also use predictive methods and client side hit detection to mitigate most of the effects of network latency in the common cases.
leni536 - 6 hours ago
You can present the game state statistically earlier to the player with the higher refresh rate display.
ptramo - 7 hours ago
Second, 3200 was DPI, not Hz. I can trivially tell how much I have to move at 3200 DPI (my sweet spot with two 4K monitors), 4800 DPI, and 6400.
For Hz, it was the polling rate. With a configured 8000 Hz polling rate (which is a lie, or at best a peak figure), I still see stalls in the 4 ms range with my hardware.
As for acceleration, I disable it. To truly lose it at high DPIs I've had to install RawAccel on Microsoft Windows.
tuatoru - 10 hours ago
Not really relevant. Music is experienced after a Fourier transform, in frequency space.
The more telling example is that experienced drummers get frustrated by a lag of 2 ms from computer-generated effects. That's 500 Hz.
BearOso - 13 hours ago
Neat tool, though. I'm also very sensitive to latency.
daft_pink - 12 hours ago
I'm curious if there is a higher-quality USB hub that I could buy, as my Mac doesn't have much I/O.
lostlogin - 9 hours ago
I'd love to be wrong on this but haven't been so far.
ptramo - 10 hours ago
There are other differences in the tools; mine was designed for what I wanted to understand, so I'm biased toward it.