I can buy that humans can see at least 120 Hz. 60 Hz is the generally accepted threshold, but I’ve long suspected that 120 Hz has effects that are mostly imperceptible, yet occasionally noticeable.
I can’t buy this:
> I've also learnt I do benefit from the 8 kHz setting of my mouse, as even at 3200 DPI with fast & smooth motion, some frames still miss a pointer update
It may be true that pointer updates were being missed. But does that really affect anything?
It turns out that there’s a way to test this experimentally. Run a double-blind experiment, just like in science. If you can identify which monitor is the 240 Hz one at better than chance, then it matters. Ditto for the pointer updates.
The corollary is that if you can’t tell with better than random chance, then none of this matters, no matter how much you think it does.
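To put a number on "better than random chance": a minimal sketch of scoring such a blind test, assuming a simple two-alternative forced-choice design (guess which of two setups is the 240 Hz one on each trial); the function name and trial counts are illustrative, not a prescribed protocol.

```typescript
// Exact binomial tail: probability of getting `correct` or more right
// out of `trials` if the subject were purely guessing (p = 0.5).
function binomialPValue(correct: number, trials: number): number {
  const choose = (a: number, b: number): number => {
    let c = 1;
    for (let i = 0; i < b; i++) c = (c * (a - i)) / (i + 1);
    return c;
  };
  let p = 0;
  for (let i = correct; i <= trials; i++) {
    p += choose(trials, i) * Math.pow(0.5, trials);
  }
  return p;
}

// e.g. 15 correct out of 20 trials: p ≈ 0.021, very likely better than
// chance; 12 out of 20 (p ≈ 0.25) proves nothing either way.
console.log(binomialPValue(15, 20).toFixed(3));
console.log(binomialPValue(12, 20).toFixed(3));
```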
Experiments like this have decisively settled the “Does higher sampling rate matter when listening to music?” debate, among other questions. People still swear that they can tell that there’s a difference, but it’s expectation bias. They’re mistaken.
(10ms drops every few seconds would definitely be noticeable though; that wasn’t the point.)
haiku2077 2 hours ago
> I can buy that humans can see at least 120 Hz. 60 Hz is the generally accepted threshold, but I’ve long suspected that 120 Hz has effects that are mostly imperceptible, yet occasionally noticeable.
There are videos on YouTube showing that people perceive differences at much higher framerates, e.g. https://www.youtube.com/watch?v=OX31kZbAXsA (long video, so you can skip to the end: they found that even casual players performed measurably more consistently at 240 Hz than at 144 Hz).
Anecdotally, I recently switched to playing racing games at 165FPS and the difference is massive!
ptramo 3 hours ago
As per the post, I wrote this tool to confirm I was getting jerks of ~10 ms every few seconds on one USB port and not the other. This would _suggest_ I can catch differences in the ballpark of 100 Hz.
I'm game for a randomized blinded test on 120 Hz refresh rate vs 240 Hz refresh rate. I would indeed be very curious to confirm I can tell the difference with a proper protocol.
Many years back (we were on CRTs), I was in similar shoes, convinced my friend couldn't tell the difference between 60 Hz and 90 Hz when playing video games.
Turns out he only needed to look at the pointer through one push of the mouse to tell right away, succeeding 100% of the time in a blinded experiment.
amluto 3 minutes ago
> Many years back (we were on CRTs), I was in similar shoes, convinced my friend couldn't tell the difference between 60 Hz and 90 Hz when playing video games.
That’s a silly experiment. I could look at a CRT with a completely static image and tell almost immediately whether it was at 60Hz, 90Hz or 120Hz. Okay, maybe someone somewhere built a really nice high-persistence CRT that didn’t flicker perceptibly at 60Hz, but I certainly never owned such a thing. And most CRT/graphics card combos would become perceptibly blurry in the horizontal direction at 120Hz, so you could never truly win.
ptramo 2 hours ago
As to how you can perceive the difference between 120 events per second and 240, I have what I hope is a fairly simple explanation.
It's like lightning strokes of tens of microseconds making a lasting impression on your perception of the scene. You don't "count" strokes over time, but in space.
When you make circles fast and large enough on screen, you can evaluate the number of cursors that appear before your eyes. At 4 circles per second, is each circle made of ~60 pointers or ~30? Belief, not fact: it's not hard to guess.
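For the circling test, the arithmetic is easy to check; a tiny sketch restating the numbers above (4 circles per second at 120 Hz vs 240 Hz, names illustrative):

```typescript
// At a given refresh rate, one revolution of the cursor is sampled
// into discrete cursor images; both the count and the angular gap
// between them differ between 120 Hz and 240 Hz.
function cursorsPerCircle(refreshHz: number, circlesPerSecond: number) {
  const cursors = refreshHz / circlesPerSecond; // images per revolution
  const spacingDeg = 360 / cursors;             // angular gap between them
  return { cursors, spacingDeg };
}

console.log(cursorsPerCircle(120, 4)); // { cursors: 30, spacingDeg: 12 }
console.log(cursorsPerCircle(240, 4)); // { cursors: 60, spacingDeg: 6 }
```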
tverbeure 2 hours ago
Higher refresh rates don't have to be perceptible to be useful: they can shift the balance in head-to-head gaming.
Imagine 2 identical gaming setups with 2 players that have the same skill set. In an FPS game, you'd expect each of those players to win 50% of the games.
Now switch one monitor from 120Hz to 240Hz. On average, the player on the 240Hz monitor will see their adversary about 2 ms earlier (up to ~4 ms in the worst case) than the player on the 120Hz monitor, and will thus be able to push the mouse button earlier too.
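A quick Monte Carlo sanity check of that figure, assuming events land at a uniformly random phase within the refresh interval and are shown on the next refresh (a simplification that ignores game, input, and panel latency):

```typescript
// Mean wait between an event occurring and the next refresh showing it.
function meanDisplayDelayMs(refreshHz: number, samples = 1_000_000): number {
  const periodMs = 1000 / refreshHz;
  let total = 0;
  for (let i = 0; i < samples; i++) {
    total += Math.random() * periodMs; // uniform phase within the period
  }
  return total / samples;
}

const d120 = meanDisplayDelayMs(120); // ≈ 4.17 ms
const d240 = meanDisplayDelayMs(240); // ≈ 2.08 ms
console.log((d120 - d240).toFixed(2)); // ≈ 2.08 ms average advantage
```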
nkrisc 1 hour ago
I think this sort of effect is what makes people think they can tell the difference: they notice the indirect side-effects that correlate with the difference.
A pro FPS player might notice that they lose contests peeking around corners more often. Obviously network latency in online games will be a factor as well, but since it likely averages out for both players over time, I would guess you can mostly discount it, along with alternating who’s doing the peeking.
I don’t think anyone could look at a scene on a 120 Hz vs 240 Hz display and tell the difference; there needs to be some indirect clue.
sjoedev 44 minutes ago
I play video games at a decently high level (top ~10% in a few competitive games). To support what you’re saying, I can tell the difference between 144 Hz and 240 Hz if I’m in control, for example if I can shake the screen around.
If I’m just watching, I’m not sure I could even tell the difference between 60 Hz and 144 Hz.
tofof 25 minutes ago
So tired of defending against this same old, completely wrong intuition, especially from people saying "do the science" to justify their ignorance instead of looking it up themselves: the science has already been done, and it's coming up on a full century old.
From this one paper alone, humans can perceive information from a single frame at 2000 Hz.
https://doi.org/10.1080/00223980.1945.9917254
Humans can read and immediately reproduce a 5-digit number displayed for 1 frame at 400 fps. This is a single exposure; it is not a looping thing with persistence of vision or anything like that. 7-digit numbers required the framerate to be 333 fps. Another student reproduced a 9-digit number from a single frame at 300 fps. These were the average results. The record was a correct reproduction of a 7-digit number from a single viewing of a single frame at 2000 Hz, which was the accuracy limit (within 2%) of the tachistoscopic equipment in question. From the progression of the students chasing records, no slowing of their improvement was ever in sight. The later papers from this author involve considerable engineering difficulty in constructing an even faster tachistoscope and are limited by 1930s-1940s technology.
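To make those figures concrete, the frame-rate-to-exposure-time conversion is straightforward (my arithmetic, not the paper's):

```typescript
// One frame at N fps is a flash lasting 1000/N milliseconds.
const exposureMs = (fps: number): number => 1000 / fps;

console.log(exposureMs(400));  // 2.5 ms (5-digit numbers read reliably)
console.log(exposureMs(2000)); // 0.5 ms (record 7-digit reproduction)
```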
This research led the US Navy in WW2 to adopt tachistoscopic training methods for aircraft recognition, replacing the WEFT paradigm (which had approximately a 0% success rate) with a 1-frame-at-75-fps paradigm, which led to 95% of cadets reaching 80% recognition accuracy, and 100% of cadets reaching 62.5% accuracy, after just 50 sessions.
Yes, humans can see 2000 fps.
Yes, humans can see well beyond 2000 fps in later work from this researcher.
Yes, humans can detect flicker well above 1000 Hz in daily life at the periphery of vision, as photoreceptors can fire from a single photon of light and our edge-detection circuits operate at a far higher frequency than our luminance and flicker-fusion circuits. Here's flicker being discriminated from steady light at an average of 2 kHz for 40-degree saccades, and an upper limit above 5 kHz during 20-degree saccades, which would be much more typical for eyes on a computer monitor.
There is no known upper limit to the flicker frequency that humans can detect. As far as I know, all studies (such as the one I link) have always measured up to the reliable detection limit of their equipment, never up to a human limit.
gblargg 2 hours ago
There could be side-effects of these higher rates that are noticeable, or bugs in handling different rates.
tuatoru 1 hour ago
> Experiments like this have decisively settled the “Does higher sampling rate matter when listening to music?” debate...
Not really relevant. Music is experienced after a Fourier transform, in frequency space.
The more telling example is that experienced drummers get frustrated by a lag of 2 ms from computer-generated effects. That's 500 Hz.
ptramo 5 hours ago
With 240 Hz displays you probably want your mouse polling setting at 4000 Hz or, better, 8000 Hz. This tool lets anyone confirm that on their hardware.
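As a reference point, here is a minimal sketch of the kind of per-frame measurement such a tool can make in a browser; this is not the author's actual implementation, and it assumes pointerrawupdate support (discussed further down):

```typescript
// Count pointerrawupdate events between animation frames and flag
// frames that saw none. (A real tool would only count frames while
// the pointer is actually moving.)
let updatesThisFrame = 0;
let framesWithoutUpdate = 0;

document.addEventListener("pointerrawupdate", () => {
  updatesThisFrame += 1;
});

function onFrame(): void {
  if (updatesThisFrame === 0) framesWithoutUpdate += 1;
  updatesThisFrame = 0; // reset for the next frame
  requestAnimationFrame(onFrame);
}
requestAnimationFrame(onFrame);
```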
BearOso 4 hours ago
That's a recent invention. Only the latest gaming mice can poll at that rate, and not particularly well over USB 2. They're usually limited to 1000 Hz.
Neat tool, though. I'm also very sensitive towards latency.
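For background on that 1000 Hz ceiling (general USB 2.0 scheduling facts, not claims from the comment): full-speed interrupt endpoints are polled at most once per 1 ms frame, while high-speed transfers use 125 µs microframes, which is what 4-8 kHz mice rely on.

```typescript
// Polling-rate ceilings implied by USB scheduling intervals
// (interval values from the USB 2.0 spec).
const pollCeilingHz = (intervalMs: number): number => 1000 / intervalMs;

console.log(pollCeilingHz(1));     // 1000: full-speed frame (1 ms)
console.log(pollCeilingHz(0.125)); // 8000: high-speed microframe (125 µs)
```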
naoru 4 hours ago
Seems like it doesn't properly handle mouse events in Safari on macOS and only shows "frames with no pointer events". I assume it's because the "pointerrawupdate" event is not supported there.
Also it's interesting that with ProMotion enabled it reports 16.67ms per frame (indicating 60Hz redraw rate) in Safari, but in Chrome it's 8.33.
ptramo 4 hours ago
Yes, I rely on pointerrawupdate. Thanks for letting me know! Unfortunately pointermove is typically synced with graphics in my limited experience, and I think I'd rather not show anything than provide wildly inaccurate numbers.
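A sketch of that fallback decision, assuming the usual feature check for the event handler attribute (exposed by browsers that implement the event, e.g. Chromium; absent in Safari):

```typescript
// Hypothetical feature check, not the tool's actual code.
if ("onpointerrawupdate" in window) {
  // High-frequency updates available: per-frame counts are meaningful.
} else {
  // pointermove is typically synced with rendering, so per-frame
  // counts would be wildly inaccurate; show nothing instead.
}
```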
Although it's for gamepads, it's pretty much indispensable for debugging gamepad-related latency issues. For example, I found that my supposedly 1000 Hz controller can do only 500 Hz in ideal conditions, and it starts to drop at a much shorter distance from the computer than advertised. Neat stuff.
daft_pink 3 hours ago
I found that plugging my keyboards directly into my Mac’s limited USB ports is noticeably faster.
I’m curious whether there is a higher-quality USB hub I could buy, as my Mac doesn’t have much I/O.
lostlogin 46 minutes ago
‘USB hub’ and ‘quality’ never go together.
I’d love to be wrong on this but haven’t been so far.
There are other differences in the tools; mine was designed for what I wanted to understand, so I'm biased toward it.