High resolution VS anti-aliasing. Who will win?
I came up with an idea: if the monitor resolution is high enough, jagged edges simply won't be noticeable. Does this mean that in the future we can forget about anti-aliasing altogether? And which is more efficient in terms of performance for the same picture quality: anti-aliasing or a higher resolution?

Resolution will win. Screen sizes and pixel densities keep growing, and in the era of 8-16K displays, anti-aliasing as we know it today will probably have exhausted itself completely.
In terms of performance, as render complexity grows, anti-aliasing will always be the cheaper option. But, philosophically speaking, anti-aliasing in its usual form will never give you what a higher resolution gives, because the underlying problem is a fundamental lack of information.
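To make the performance side of that claim concrete, here is a back-of-envelope comparison (my own numbers, not from the thread): 8K has four times the pixels of 4K, so brute-force resolution roughly quadruples the fragment-shading work, whereas 4x MSAA keeps roughly one shade per pixel and only multiplies the cheap coverage/depth samples.

```python
# Rough sample counts per frame. Assumption: fragment shading cost scales
# with the number of shaded samples; MSAA shades roughly once per pixel
# (per covered primitive) but stores/resolves extra coverage samples.
res_4k = 3840 * 2160   # ~8.3 M pixels
res_8k = 7680 * 4320   # ~33.2 M pixels (4x the pixels of 4K)

shaded_4k_native  = res_4k       # ~8.3 M fragment shader invocations
shaded_4k_msaa4   = res_4k       # MSAA 4x: still ~8.3 M shades...
coverage_4k_msaa4 = res_4k * 4   # ...plus ~33 M depth/coverage samples
shaded_8k_native  = res_8k       # ~33 M fragment shader invocations

print(f"4K native         : {shaded_4k_native / 1e6:.1f} M shades")
print(f"4K + 4x MSAA      : {shaded_4k_msaa4 / 1e6:.1f} M shades, "
      f"{coverage_4k_msaa4 / 1e6:.1f} M coverage samples")
print(f"8K native (4x px) : {shaded_8k_native / 1e6:.1f} M shades")
```

The coverage samples are far cheaper than full shades, which is why MSAA stays cheaper than brute-force resolution even as scenes get heavier; but it only adds edge information, not new shading detail.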
However, the answer will most likely be some clever upscaling combined with an ultra-precise edge render. Or, if ray tracing develops actively, analytical techniques for resolving pixels and artifacts may develop alongside it: the picture is actually rendered at 4K, then the geometry crossing each pixel (vertices and edge angles) is analyzed at some enormous virtual resolution and filled in, and finally the whole frame is resolved to something like 16K.
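A rough sketch of what such an edge-focused pipeline could look like (entirely my own illustration: a toy analytic "scene" stands in for the renderer, resolutions are tiny instead of 4K/16K, and the names `scene`, `render`, and `edge_mask` are made up): render at a low base resolution, find the pixels that straddle an edge, and re-evaluate only those regions with dense sub-sampling while upscaling the rest naively.

```python
import numpy as np

def scene(x, y):
    """Toy 'geometry': 1.0 inside a disc, 0.0 outside (a hard edge)."""
    return ((x - 0.5) ** 2 + (y - 0.5) ** 2 <= 0.3 ** 2).astype(np.float64)

def render(width, height, samples_per_axis=1):
    """Point-sample (or supersample) the scene on a regular pixel grid."""
    s = samples_per_axis
    xs = (np.arange(width * s) + 0.5) / (width * s)   # sub-pixel centres
    ys = (np.arange(height * s) + 0.5) / (height * s)
    img = scene(xs[None, :], ys[:, None])
    if s > 1:
        # Box-filter the s*s sub-samples down to one value per pixel.
        img = img.reshape(height, s, width, s).mean(axis=(1, 3))
    return img

def edge_mask(img):
    """Mark pixels whose value differs from a 4-neighbour: the jagged edge."""
    mask = np.zeros_like(img, dtype=bool)
    vdiff = img[:-1, :] != img[1:, :]
    hdiff = img[:, :-1] != img[:, 1:]
    mask[:-1, :] |= vdiff
    mask[1:, :]  |= vdiff
    mask[:, :-1] |= hdiff
    mask[:, 1:]  |= hdiff
    return mask

# 1) Cheap base render at a low "4K-like" resolution (tiny for the demo).
base = render(64, 64)

# 2) Find the pixels that would look jagged.
edges = edge_mask(base)

# 3) Build a 4x larger output: flat areas are simply replicated, edge areas
#    are re-evaluated with dense sub-sampling. (A real implementation would
#    evaluate only the edge regions; here the full high-res pass is rendered
#    for brevity and then masked.)
scale = 4
out = np.kron(base, np.ones((scale, scale)))            # naive upscaling
hi = render(64 * scale, 64 * scale, samples_per_axis=4) # precise edge values
edge_hi = np.kron(edges.astype(np.uint8),
                  np.ones((scale, scale), dtype=np.uint8)).astype(bool)
out[edge_hi] = hi[edge_hi]

print(f"edge pixels re-rendered: {edges.sum()} of {base.size} "
      f"({100.0 * edges.mean():.1f}%)")
```

For a simple shape only a small fraction of the pixels lies on an edge, which is the whole point: most of the frame never needed the extra information in the first place.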
So in the end there is only one winner here: the graphics themselves. It's a different story if cloud/remote streaming wins out: then streaming will be everywhere and the picture will turn into a compressed, blurry mess.