GeekNights Monday - Displays

Tonight on GeekNights, we consider displays. LCDs, OLEDs, "Smart" TVs, and the current state of the art. We haven't talked about this since 2006, and a lot has changed in 14 years!

In the news, we've migrated our 2-Factor Auth to NFC Yubikeys, Comcast is enforcing data caps nationally, the Apple M1 is revolutionary, Fleets are here I guess, and the GIMP is 25 years old and not much better than it was 20 years ago.

Things of the Day

Episode Links

Live stream:
https://www.youtube.com/watch?v=nG6QUF4ltc4

On Patreon:
https://www.patreon.com/posts/44245080

My thoughts on the M1 chip: I bought my current MacBook Pro four years ago, and got all the upgrades in terms of CPU, GPU and RAM (but not SSD). Compared to the new fanless MacBook Air, it’s garbage. And the Air costs about €2,000 less.

I bought my previous MacBook Pro hoping for four years of work, but it lasted seven years. I bought this MacBook expecting four years, but hoping for seven years again. Without this Apple Silicon transition, it might have made it seven years, but when such a huge step change is possible, it’s hard to hold out.

Since I’m not actually traveling at the moment, I only need a home workstation. The new M1 Mac Mini would be great, but there seems to be room in the lineup above the current M1 Mac Mini for more RAM, more ports, etc. So the M2 or the M1XZX chip in a space grey Mac Mini, if it comes out next year, might be my next work machine.


I can probably hold out 3-5 more years before I have to upgrade, so I’m really just watching this space and seeing how things shake out.

But since my workflows are GPU heavy and I am primarily a PC gamer, unless games run on this thing and serious GPUs are supported, I’m stuck in Windows/x86 land for the foreseeable future. Apple has abandoned gaming for so long that I can’t imagine that suddenly changing.


Actually, I use my Mac for game development, and that takes an eGPU to work the way I want. And it works great!

Which brings up another thing you mentioned in the main topic. My monitor is a 144 Hz display with FreeSync. When gaming on the PC, FreeSync works perfectly and there is no screen tearing.

However, as the monitor only has one DisplayPort input and I don’t want to keep switching cables, I just use an HDMI cable from my Mac/eGPU. The main difference is the lack of FreeSync, so I see screen tearing. But I’m just lazy enough that it’s not worth swapping cables to avoid that issue.


Here’s a case for more, smaller displays instead of maximizing resolution - and this is not an argument against 4K screens. They’re lovely, but not for my workflow.

As a game dev it’s really nice to have a 3-screen setup with Unity/Unreal on one side, a code monitor in the middle, and an email/slack/browser monitor on the other side. I can just drag and maximize windows without snapping or tiling. I also don’t like screens larger than 24" because my desk is not very deep and having to turn my head to scan from one extent to the other is fatiguing to my lazy squishy body. The discrete monitors become “attention zones” that can help me avoid distractions, too.

I also do the same full-screen attention zone thing, albeit with two 24" monitors instead of three. I think you’re assuming that the higher resolution is being used to put more things on one display. That’s not necessarily so. Given 4K monitors I would do things the same; everything would just look better because of smaller pixels.

Yeah that’s fair, I was assuming 4K meant 30" or larger screens which would be too big for my setup. But it looks like you can get 24-27" 4K monitors, which is amazing if a bit pricey. If you have the budget and the video card to drive your setup, I agree more pixels is generally better. I think there’s a subjective point of diminishing returns though. What’s the dollar-per-PPI sweet spot?

I have no idea what the sweet spot is. What I do know is that a 4K 24" screen becomes “retina” at a viewing distance of 21" or more, at least according to this calculator: Is This Retina? - DPI/PPI Display Calculator.

My eye is almost 30" away from my monitor at my desk. That means pixels any smaller than that won’t make much difference.

My current monitor is 1920x1200 and 24" which will only be retina if I am 36" or more away from the screen, and I am closer than that.

Retina basically just means you can’t differentiate the pixels from each other, so making them smaller becomes pointless: the resolution of the screen exceeds the resolution of your eye.
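To make that concrete, here’s a minimal sketch of the usual retina math, assuming the common one-arcminute acuity threshold (the linked calculator evidently assumes slightly different acuity, so its numbers differ a bit):

```python
import math

def retina_distance_inches(h_px, v_px, diagonal_in, arcmin=1.0):
    """Viewing distance beyond which one pixel subtends less than
    `arcmin` arcminutes -- a common "retina" cutoff."""
    ppi = math.hypot(h_px, v_px) / diagonal_in  # pixels per inch
    pixel_in = 1.0 / ppi                        # width of one pixel in inches
    return pixel_in / math.tan(math.radians(arcmin / 60.0))

print(retina_distance_inches(1920, 1200, 24))  # ~36.4" -- matches the 36" above
print(retina_distance_inches(3840, 2160, 24))  # ~18.7" for a 4K 24" panel
```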

If I manage to get a 3080 for my next build, I might as well get a 360 Hz screen too.

The 360 Hz screens are still 1080p, which is fine and perfect for gaming. Since I do mostly photo editing, not so much video, I will probably get a dedicated monitor calibrated for my photography.

The jump from 30 FPS to 60 FPS is crazy enough. 120, 144… it’s nuts.
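To put rough numbers on that, here’s a quick sketch of frame times: each jump up the refresh-rate ladder buys fewer absolute milliseconds than the last.

```python
# Frame time at common refresh rates: the absolute gain shrinks as rates climb.
for hz in (30, 60, 120, 144, 240, 360):
    print(f"{hz:>3} Hz -> {1000 / hz:5.1f} ms per frame")
# 30 -> 60 Hz saves ~16.7 ms per frame; 144 -> 360 Hz saves only ~4.2 ms.
```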

I would much rather have 4k60 than 1080p144+.

I am unconvinced that very high frame rates help with gaming, either visually or in terms of player performance.

Overwatch looks absolutely fantastic at 4K. It’s very hard to even go back to 1080 after playing it like that.

I find it unbelievable that you are unconvinced. The benefit for me isn’t just noticeable but incredible and easily measurable. I posted in another thread, but here’s the nugget:

I’d put in all that training on the 15 inch 60hz screen, and it was like doing a month of marathon training wearing scuba diving gear.

Free of crappy equipment and with a proper setup, I did the same tests with the same scenarios. Here are my best 10 test averages from a month with my laptop vs two days with my gaming PC (60 Hz vs 144 Hz):

  • 1wall6targets TE - 95 vs 115 … up 21%
  • Headglitch 180 (recoil) - 1310 vs 1825 … up 39%
  • 1wall 6targets small - 561 vs 716 … up 27%
  • PUBG Heavy Recoil - 3365 vs 6006 … up 78%

A possible conclusion to this is that upgrading from 60hz laptop screen to a 144hz gaming monitor brings a similar level of aim improvement as a month of regular practice.
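For what it’s worth, those percentages follow straight from the quoted scores (they look truncated rather than rounded); a quick check:

```python
# Recompute the improvements from the 60 Hz vs 144 Hz scores listed above.
scores = {
    "1wall6targets TE": (95, 115),
    "Headglitch 180 (recoil)": (1310, 1825),
    "1wall 6targets small": (561, 716),
    "PUBG Heavy Recoil": (3365, 6006),
}
for name, (at_60, at_144) in scores.items():
    print(f"{name}: up {int(100 * (at_144 - at_60) / at_60)}%")  # 21, 39, 27, 78
```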

But for this experiment there wasn’t a control group. I put in a month of training, and at the end of that process I installed KovaaK’s on a PC connected to a better monitor.

Also, I went from running PUBG on a MacBook Pro that could only output to the monitor at 60 fps maximum (but often only 50 fps) to a PC with a Vega 64 video card that runs PUBG at 130 to 144 fps.


Now print that art … BWAHAHAHHAHAA

(One of the side businesses I’ve been involved in was an on-demand garment printer. We spent an obscene amount of time calibrating the printers and monitors so everything was consistent across the board. We still spent an inordinate amount of time explaining to customers why you can’t print a bright neon green using CMYK and that they needed to use a spot color/screen print process.)
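To illustrate the gamut problem, here’s a minimal sketch using the textbook naive RGB-to-CMYK formula, with a hypothetical neon-green sRGB value; the arithmetic works fine, which is exactly why customers find the dull print surprising.

```python
def rgb_to_cmyk(r, g, b):
    """Naive RGB -> CMYK conversion (all channels in 0..1)."""
    k = 1.0 - max(r, g, b)
    if k == 1.0:
        return 0.0, 0.0, 0.0, 1.0  # pure black
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return c, m, y, k

# A neon green that looks great on an sRGB monitor:
print(rgb_to_cmyk(0.22, 1.0, 0.08))  # ~(0.78, 0.0, 0.92, 0.0)
# The formula happily yields ink percentages, but real cyan + yellow inks are
# far less saturated than a monitor's green primary, so the print comes out
# dull. An ICC-managed soft proof shows that clipping before you commit ink.
```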

I’m currently working as a fabric print technician, and most of my job has been trying to teach people how to soft proof or hard proof before committing to the final print.

Mostly, people want the end product and not the process.

So true. Our main issue was that customers wanted to print one-offs really cheaply with a process that physically could not produce those results. Educating customers was a huge part of the work there.

I’m really behind on listening to episodes, but I wanted to hop in here and say that Google Authenticator does not automatically reset itself after scanning that QR code to transfer between phones. You can have the same data on two phones until you reset one yourself.

Do not buy smart TVs. If you have no choice, do not let them connect to the Internet.