27" Monitors

Started by camdave, May 25, 2021, 16:03:29


camdave

I currently use a 23" 1080p monitor and I am thinking of purchasing a 27" replacement.
This is all for 'office work', web browsing and occasional simple audio & photo editing.

I have read many articles saying that one should not consider 1080p at the larger size because the lower pixel density will make the picture look blurred; 1440p, it is said, is the minimum for 27".

I have to confess that I am not convinced; I can barely differentiate the pixels on my monitor even at a distance of just a few inches. My eyes are old but not too bad.
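For what it's worth, the pixel density behind that claim is easy to work out: pixels per inch is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch (the 23"/27" sizes are the ones discussed in this thread):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 23), 1))   # current 23" 1080p  -> ~95.8 PPI
print(round(ppi(1920, 1080, 27), 1))   # 27" 1080p          -> ~81.6 PPI
print(round(ppi(2560, 1440, 27), 1))   # 27" 1440p          -> ~108.8 PPI
```

So a 27" 1080p panel is noticeably coarser than the 23" you have now, while 27" 1440p is noticeably finer; whether that matters at your viewing distance is another question.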

I know I can view various monitors at a retail store but I am interested in people's real-life experience.

zappaDPJ

There's quite a noticeable difference between 1080p and 1440p at 27". I suspect if you viewed the two side by side you would want to walk out of the shop with 1440p. I never went beyond 24" when I was running 1080p because I found that to be the optimal size for graphic design.

Having said that, eyesight, monitor quality, calibration, distance from the monitor, viewing angle, light in the room, etc. all play a part, and for many people 1080p is perfectly usable.
zap
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Simon

I'm always confused by monitor sizes and screen resolutions.  The higher the resolution, the smaller everything seems to be on the screen, and if you adjust the monitor to a non-native resolution, text seems 'fuzzy'.   :dunno:
Simon.

zappaDPJ

Flat panel displays have a fixed raster, i.e. unlike CRTs they can't change resolution to match the input signal. The optimal display quality of a flat panel is therefore only reached when the input signal matches the native resolution, also known as one-to-one pixel mapping.

Moving away from the native resolution loses that one-to-one mapping, so the panel has to rely on interpolation, which costs image quality. Interpolation estimates new pixel values at positions in between the known data points of the incoming signal, and those estimated in-between values are what make the image look soft.
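As a toy illustration of what that estimation looks like (a one-dimensional sketch in Python, not how a real scaler chip works), here a 2-pixel row is stretched to 5 samples with linear interpolation; the values 25, 50 and 75 never existed in the source and are pure estimates:

```python
def scale_row_linear(row, new_len):
    """Linearly interpolate a 1-D row of pixel values to new_len samples."""
    old_len = len(row)
    out = []
    for i in range(new_len):
        # Map the output position back into the source row's coordinate space.
        pos = i * (old_len - 1) / (new_len - 1)
        lo = int(pos)
        hi = min(lo + 1, old_len - 1)
        frac = pos - lo
        # Blend the two nearest known pixels; in-between values are estimates.
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

print(scale_row_linear([0, 100], 5))  # -> [0.0, 25.0, 50.0, 75.0, 100.0]
```

Real panels do this in two dimensions (bilinear or fancier filters), but the principle, and the softening, is the same.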

An exception to the above is the use of 'integer scaling' e.g. running 800x800 on a display that has a native resolution of 1600x1600 should give you a reasonable result.
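The reason integer scaling avoids the problem is that every source pixel maps to an exact block of identical pixels, so nothing has to be estimated. A minimal sketch (illustrative only):

```python
def integer_scale(image, factor):
    """Integer (nearest-neighbour) scaling: each source pixel is repeated
    into an exact factor x factor block, so no new values are invented."""
    out = []
    for row in image:
        # Repeat each pixel 'factor' times horizontally...
        scaled_row = [px for px in row for _ in range(factor)]
        # ...then repeat the whole row 'factor' times vertically.
        for _ in range(factor):
            out.append(list(scaled_row))
    return out

img = [[0, 255],
       [255, 0]]
for row in integer_scale(img, 2):
    print(row)
# -> [0, 0, 255, 255]
#    [0, 0, 255, 255]
#    [255, 255, 0, 0]
#    [255, 255, 0, 0]
```

Every output pixel is an exact copy of a source pixel, which is why e.g. 800x800 on a 1600x1600 panel stays sharp while a non-integer ratio doesn't.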
zap