How many Ks is real life resolution and at how many fps does it run?
Whatever the resolution of ‘real life’, what matters is at what point our little eyes and brains can no longer perceive a difference.
For average scenery, the general consensus is about 60 pixels per degree of vision. If you have something a bit more synthetic, like a white dot in empty space, that kind of small, high-contrast detail would take maybe 200 pixels per degree before the dot looks just as visible on the display as it does when you see it directly. A 75" display 2 meters away at 4K works out to about 85 pixels per degree, which is comfortable enough for a display.
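As a quick sketch of where that 85 figure comes from (my own illustration, assuming a flat 16:9 panel viewed head-on and counting only the horizontal pixels; the function name is made up):

```python
import math

def pixels_per_degree(diagonal_in, distance_m, horizontal_px, aspect=16 / 9):
    """Approximate pixels per degree for a flat 16:9 panel viewed head-on."""
    diagonal_m = diagonal_in * 0.0254
    width_m = diagonal_m * aspect / math.sqrt(aspect**2 + 1)
    # Total horizontal angle the panel subtends at the viewer, in degrees.
    h_angle_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    return horizontal_px / h_angle_deg

# 75" 4K panel at 2 m -> roughly 85 px/deg, matching the figure above.
print(round(pixels_per_degree(75, 2.0, 3840)))
```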
Similar story with ‘frames per second’. Move something back and forth really fast and you’ll see a blurry smear of the object rather than observing its discrete movement. So if you accurately match the blur you would naturally see, and use a low-persistence backlight/display, you can probably get away with something like 60 FPS. If you are stuck with discrete representations and are unable to blur or turn the display off between meaningful frames, you might have to go a bit further up, to something like 120 or 144 FPS.
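A rough back-of-the-envelope for why persistence matters (my own simplified model, assuming your eye smoothly tracks the moving object while each frame stays lit):

```python
def smear_px(speed_px_per_s, persistence_s):
    """Approximate width of the smear swept across the retina while a frame is lit."""
    return speed_px_per_s * persistence_s

# Object panning at 1000 px/s:
print(smear_px(1000, 1 / 60))   # ~17 px of smear on a full-persistence 60 Hz display
print(smear_px(1000, 0.002))    # ~2 px if the backlight strobes for only 2 ms per frame
```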
It’s possible to argue motion blur looks better, but at least in Rocket League it makes the game insanely hard to play.
I think it’s about 1044 fps, give or take.
I feel like it’s kinda infinite, because you can zoom in to the quantum level and then looking at things sorta fails you… But I’m no scientist.
The question isn’t how high the resolution of reality is, but how well we can process it. There is an upper limit to visual acuity, but I’d have to calculate what an arc-minute at 6 meters would be and I’m too lazy right now. Regarding fps, some people can notice artefacts up to 800 Hz, but I’d think going with 120 Hz would be OK. Remember, you’ll have to generate stereoscopic output.
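For reference, the arc-minute calculation the commenter skips is short (standard 20/20 acuity corresponds to roughly one arc-minute of angular resolution; this sketch just converts that angle to a size at 6 m):

```python
import math

# One arc-minute expressed in radians.
arc_minute_rad = math.radians(1 / 60)

# Smallest resolvable feature at 6 m for ~1 arc-minute acuity.
feature_m = 6 * math.tan(arc_minute_rad)
print(f"{feature_m * 1000:.2f} mm")  # ~1.75 mm
```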
But I asked how much, not how well. I wanna know about the first question, not the second 🥹
Both are practically infinite, or well, the question doesn’t really make sense.
Reality isn’t rasterized, so there’s no resolution. You just have light waves bouncing off of things and into your eyes. They can hit at all kinds of angles and positions, and your brain will interpret different impact frequency distributions as some color or brightness you see in a certain position.
And you don’t have a shutter in your eyes or something else that would isolate individual frames. Light waves just arrive whenever they do and your brain updates its interpreted image continuously.
So, in principle, you can increase the resolution and display rate of a screen to infinity and you’d still perceive it differently (even if it’s not noticeable enough to point it out).
The cost just goes up ever more and the returns diminish, so the question rather has to be, whether it’s worth your money (and whether you want to sink that much money into entertainment in the first place).
It doesn’t matter what refresh rate it says on the box. It depends on the hardware and firmware inside. I’ve seen good 60 Hz TVs and I’ve seen ones with blurry motion that is borderline unwatchable. There are some really cheap TVs out there now, and there’s a reason they’re so cheap.