
4K UHD

Started by March 08, 2015 02:46 PM
34 comments, last by Ravyne 9 years, 7 months ago


For myself, I'm more interested in the screen real-estate of a 4K monitor for development purposes, though I hear that some applications and OS interfaces have difficulty at that resolution. Better OS support for window snapping would be nice too.

Yea, Windows by default only supports left/right and (I think, can't really remember) up/down snapping. I had to get a program that lets me snap to corners like I can on Linux. Interfaces are pretty good on a 28" display; I don't think I could personally deal with anything smaller than that.


My major argument against a single 4K monitor is that it will be impossible to use that resolution optimally.

A 4K monitor at 400 ppi means you're approaching strainless readability of text. The printed page is usually between 600 and 1200 dpi with some high-quality printing on good-quality coated stock even higher. I'd say a 32 inch 4K monitor is pretty near optimal use of the resolution.

It's not a waste to be able to get away from the crap most developers are used to. Today's monitors at 90-100 ppi are pretty much like old bottle-based SD TV. You just don't know how bad things are until you get used to the good stuff.

Stephen M. Webb
Professional Free Software Developer


I think the next generation of consoles will be capable of today's pixel quality at 4K/60 Hz, and perhaps even a little better -- but I don't think we're yet at a point where we can say pixel quality has no room for improvement, so developers will still choose to render at reduced framerate or resolution in order to increase pixel quality. These things are always in balance -- there's nothing physical preventing today's consoles from rendering at 1080p/60 Hz all the time, except that the developers have deemed that pixel quality would suffer too much for it. Same with 720p/60 last generation, and 480p the generation before.

I think we'll probably see an 8 or 10x increase in GPU GFLOPS next generation. The soon-to-arrive flagship PC GPUs from nVidia and AMD are already pushing 4-5x the GFLOPS of the PS4 or Xbox One, so we only need to see today's technology doubled and then commoditized for it to appear in consoles.
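Quick back-of-envelope numbers to put that in perspective -- the TFLOPS figure and the 1080p/60 baseline below are my own rough assumptions, not anything official:

#include <cstdio>

int main() {
    // Pixels pushed per second at a few resolution/refresh combinations,
    // to show how much heavier 4K/60 is than the targets of past generations.
    struct Mode { const char* name; long w, h, hz; };
    const Mode modes[] = {
        {"720p/60",  1280,  720, 60},
        {"1080p/60", 1920, 1080, 60},
        {"1440p/60", 2560, 1440, 60},
        {"4K/60",    3840, 2160, 60},
    };
    const double base = 1920.0 * 1080.0 * 60.0;   // 1080p/60 as the reference
    for (const Mode& m : modes) {
        double pps = double(m.w) * m.h * m.hz;
        std::printf("%-9s %12.0f pixels/s  (%.2fx of 1080p/60)\n",
                    m.name, pps, pps / base);
    }

    // GPU throughput: a flagship PC GPU at roughly 4-5x a PS4 (~1.84 TFLOPS),
    // doubled again, lands right around the 8-10x figure mentioned above.
    const double ps4_tflops = 1.84;   // approximate, from public specs
    std::printf("8x PS4  ~= %.1f TFLOPS\n",  8.0 * ps4_tflops);
    std::printf("10x PS4 ~= %.1f TFLOPS\n", 10.0 * ps4_tflops);
    return 0;
}

On those (assumed) numbers, 4K/60 is exactly 4x the pixel throughput of 1080p/60, which is why the quality-vs-resolution balancing act doesn't go away even with a big GFLOPS jump.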

throw table_exception("(╯°□°)╯︵ ┻━┻");


I'd say a 32 inch 4K monitor is pretty near optimal use of the resolution.

It's right about at the Apple standard of 70 ppd (pixels-per-degree), if we assume an anecdotally determined (and larger than normal) 30 inch viewing distance.
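For what it's worth, here's a quick sketch to check that figure, assuming a 32-inch 16:9 panel and the same 30-inch viewing distance (my arithmetic, nothing authoritative):

#include <cmath>
#include <cstdio>

int main() {
    // Pixels-per-degree for a 32" 16:9 4K panel viewed from 30 inches.
    const double diag_in  = 32.0;
    const double horiz_px = 3840.0;
    const double view_in  = 30.0;
    const double pi       = 3.14159265358979;

    // Width of a 16:9 panel from its diagonal: w = d * 16 / sqrt(16^2 + 9^2)
    const double width_in = diag_in * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
    const double ppi      = horiz_px / width_in;

    // Inches subtended by one degree of visual angle at the viewing distance.
    const double in_per_deg = 2.0 * view_in * std::tan(0.5 * pi / 180.0);
    const double ppd        = ppi * in_per_deg;

    std::printf("width ~%.1f in, ~%.0f ppi, ~%.0f ppd\n", width_in, ppi, ppd);
    return 0;
}

That works out to roughly 28 inches wide, ~138 ppi and ~72 ppd, so the 70 ppd figure checks out under those assumptions.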

Unfortunately, even at that distance, your monitor will still take up a massive 90 degrees of visual field! By comparison, a 22" monitor at the recommended 24" viewing distance only occupies 45 degrees. That's going to require a lot more head motion to see the corners properly...

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

I recently took a look at a Samsung 27" 4k monitor at a local MicroCenter. I have realized I cannot go back and need to buy a monitor like this. Too bad the demo unit was just displaying photos. I wanted to see a game in motion on there. Maybe change the settings to disable AA and then see if I still see some jaggies from 2 feet away.

New game in progress: Project SeedWorld

My development blog: Electronic Meteor


I recently took a look at a Samsung 27" 4k monitor at a local MicroCenter. I have realized I cannot go back and need to buy a monitor like this.

I'm surprised you can see much of a difference between 4k and 1440p at that size.

I have a number of 27" monitors at 1440p, and already you can't discern individual pixels at a normal viewing distance.
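A rough sanity check on that, assuming a ~30-inch viewing distance and the common one-arc-minute (~60 ppd) acuity rule of thumb -- both of those are my own assumptions:

#include <cmath>
#include <cstdio>

int main() {
    // ppi and pixels-per-degree for 27" panels at 1440p vs 4K,
    // compared against the ~60 ppd "can't resolve individual pixels" rule.
    const double diag_in  = 27.0;
    const double view_in  = 30.0;   // assumed viewing distance
    const double pi       = 3.14159265358979;
    const double width_in = diag_in * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
    const double in_per_deg = 2.0 * view_in * std::tan(0.5 * pi / 180.0);

    const double horiz_px[] = {2560.0, 3840.0};
    const char*  label[]    = {"1440p", "4K   "};
    for (int i = 0; i < 2; ++i) {
        double ppi = horiz_px[i] / width_in;
        double ppd = ppi * in_per_deg;
        std::printf("27\" %s : ~%.0f ppi, ~%.0f ppd\n", label[i], ppi, ppd);
    }
    return 0;
}

Under those assumptions a 27" 1440p panel (~109 ppi, ~57 ppd) already sits near the acuity threshold, which would explain why the step up to 4K (~163 ppi, ~85 ppd) is harder to see.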

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]



I'm surprised you can see much of a difference between 4k and 1440p at that size.

I have a number of 27" monitors at 1440p, and already you can't discern individual pixels at a normal viewing distance.

Agreed. The jump from 1080p to 1440p at 27" is quite substantial, but I'd imagine the law of diminishing returns kicks in after that.

CC Ricers, have you compared 1440p and 4K side by side?

if you think programming is like sex, you probably haven't done much of either.-------------- - capn_midnight

I remember setting AF to 16x in Skyrim and the foliage still looked horrible and glittered like a disco ball.

All it took was FXAA to smooth/blur the image and it looked like a totally different game.

EDIT: I also remember how awesome Alien Isolation looked at the very beginning: the first room and the corridor in which you start had absolutely no flicker from high-frequency textures/normal maps. It looked like a movie. But then I got a bit further in and it was flickering ahoy. But those 10 seconds at the start really stayed with me: it's about reducing flickering.

I still use a cathode ray tube television.
Call me 'old-fashioned', but I do not see any value in a very expensive television that will have image quality issues in 4-6 years.
My laptop has the equivalent of a 720p screen.

I cannot remember the books I've read any more than the meals I have eaten; even so, they have made me.

~ Ralph Waldo Emerson

This topic got more attention than I thought! Do you think a developer could use something like this as a value-add? Being the first game with 4K resolution, etc.?

This topic is closed to new replies.
