Computer advice thread ("What's the best way to do this?")

Reminder: I made this thing and it still works. Say goodbye to the GeForce Experience!

https://twitter.com/driver_nv


So with a bit more testing: at 1440p with FreeSync on, I’m averaging 200+ fps and everything is smooth, with no tearing (which I have seen occasionally in the past, though nothing bad).

Turning VSync on immediately explodes everything, at least in this case: bad black flashing, stuttering, and horrible input lag. Keeping that one off. I’ve been reading mixed information on whether VSync is worth combining with FreeSync, but the official answer seems to be no.

I’m gonna try bumping to 4K and see how it goes. I should still be able to hit 120 fps or so in BF4, which should keep me well above the line. But my concern is that in the past at 4K I would occasionally drop to 50–60 fps, and if that happens I don’t want to be seeing bad things. Only one way to find out.
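For anyone wondering where “the line” actually is, here’s a rough sketch of how I think about it, assuming a hypothetical 48–144 Hz FreeSync window (the real range is monitor-specific, so check the spec sheet):

```python
# Rough sketch: check whether a measured fps stays inside a FreeSync/VRR
# window. The 48-144 Hz range below is a made-up example; the real window
# is specific to the monitor (check its EDID or spec sheet).
VRR_MIN_HZ = 48   # hypothetical lower bound of the monitor's VRR range
VRR_MAX_HZ = 144  # hypothetical upper bound

def vrr_status(fps: float) -> str:
    if fps > VRR_MAX_HZ:
        return "above range: tearing (or VSync-style behavior) comes back"
    if fps >= VRR_MIN_HZ:
        return "inside range: adaptive sync active, no tearing expected"
    # Below the window, monitors with LFC multiply frames to stay in range,
    # e.g. 40 fps displayed at 80 Hz; without LFC you fall back to tearing
    # or stutter.
    return "below range: relying on LFC (frame doubling) if the monitor has it"

for fps in (220, 120, 55, 30):
    print(f"{fps:>3} fps -> {vrr_status(fps)}")
```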

I’ve only really used G-Sync, but I assume there isn’t much difference.
I think you generally want VSync off unless the game is doing something weird with the rendering. You also need to be in “exclusive fullscreen” mode or the fancy dynamic refresh won’t happen. I had my OS set to 85 Hz recently and had issues with certain games trying to lock things to 60 Hz without telling the monitor.
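If you want to catch that silent 60 Hz lock, you can ask Windows what mode it’s actually driving. A quick ctypes sketch (Windows-only; the DEVMODEW layout below is truncated to just the fields needed, which works because dmSize tells the API how much of the struct you provided):

```python
# Windows-only sketch: ask the OS what mode it is really driving the primary
# display at, to catch the "game quietly locked me to 60 Hz" case.
import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1  # magic value for EnumDisplaySettingsW

class DEVMODEW(ctypes.Structure):
    # Truncated Win32 DEVMODEW: trailing ICM/printer fields are omitted,
    # which is legal since dmSize communicates the struct version we use.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),        # display half of the union
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                             ctypes.byref(dm)):
    print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz")
```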

So this morning my right-hand monitor was blank. Even after the computer woke up and the monitor’s power LED changed from orange to blue, the screen stayed blank. The USB hub in the monitor still works. Windows detects the monitor. The screen is just blank.

Also, when I try to use the buttons on the monitor to change inputs or anything, the screen stays blank. I also don’t hear the beeping noises it usually makes when I press the buttons.

Sadly, the left-hand monitor is the one with the dead pixel :frowning:

I’m wondering if it’s maybe some other issue. I did recently update the Nvidia driver. The monitor’s internal menu not appearing is very troubling, though…

Try swapping cables?

That’s definitely going to happen. Didn’t have time to find a cable before work.

Ahh yeah. My first reaction is: just plug the working monitor’s cable into the blank monitor and see.

But yes, the fact that you get no menu makes it seem like something is not firing up the screen itself.

Also, google that specific monitor and that specific problem. I’ve fixed a couple of monitors by doing so (then buying capacitors and soldering them in to replace the failed ones on the power supply board).

DisplayPort cable? Are you 100% sure it isn’t one of the bad ones? The damage can sneak up on you or strike suddenly.

One of my monitors is HDMI and one is DisplayPort. I don’t know if the busted one is using DisplayPort; I have to check.

Even if it is, I’ve been using that cable for six+ years, and the cable came with the monitor.

It’s pretty hard to see on an IPS panel, but if you shine a bright light on it you may be able to make out whether it’s actually displaying anything (look for a window border or something else high-contrast). If it is, then it’s the backlight or the backlight power supply.


Very possible, but the screen is BLACK. I’ll try using my super flashlight on it.

From what I gather, the voltage problem is insidious. It depends on the net voltage difference between the two ends, which itself can fluctuate.

It can stress power regulation components in unforeseen ways over time.

The only way to know would be to use a voltmeter or cable tester to see if that pin is wired through. If it is…
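To put a number on why a wired-through pin is bad, here’s a back-of-the-envelope sketch; every value in it is made up for illustration:

```python
# Back-of-the-envelope sketch of why a wired-through DP_PWR pin is trouble.
# Both ends drive ~3.3 V onto the same wire from different supplies; any
# mismatch between them pushes current through the cable. All numbers here
# are hypothetical.
v_gpu = 3.40        # volts from the source side (hypothetical)
v_monitor = 3.25    # volts from the sink side (hypothetical)
r_path = 0.5        # ohms: cable wire plus both supplies' output impedance

i_fault = abs(v_gpu - v_monitor) / r_path  # Ohm's law
print(f"~{i_fault * 1000:.0f} mA flowing between the two supplies, continuously")
# ~300 mA with these numbers -- and since both voltages drift with load and
# temperature, the current fluctuates too, which is what makes it sneaky.
```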

I can’t speak to monitor cables, but just a few days ago I did a full test of every pin on a PSU. There, there are only five or so possible voltages any given pin can carry.

5, 12, 3.3, -12, and -5. Any other pin is ground. Unless you’re talking about doing some circuit nonsense after the pins and testing the readout downstream of that, I’m not sure what you’re talking about.
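For anyone who wants to repeat that pin-by-pin test, it basically boils down to this sketch (tolerances are the usual ATX ones as I remember them: ±5% on the positive rails, ±10% on the negative; check your PSU’s spec sheet):

```python
# Quick sketch of the "full test of every pin" idea: compare multimeter
# readings against nominal ATX rails. Tolerances here are the commonly
# cited ATX values; verify against your supply's documentation.
ATX_RAILS = {
    "+5V":   (5.0, 0.05),
    "+12V":  (12.0, 0.05),
    "+3.3V": (3.3, 0.05),
    "-12V":  (-12.0, 0.10),
    "-5V":   (-5.0, 0.10),   # legacy rail; absent from modern supplies
}

def check_rail(name: str, measured: float) -> str:
    nominal, tol = ATX_RAILS[name]
    low, high = sorted((nominal * (1 - tol), nominal * (1 + tol)))
    ok = low <= measured <= high
    return f"{name}: {measured:+.2f} V ({'OK' if ok else 'OUT OF SPEC'})"

# Hypothetical readings from a pin-by-pin test:
for rail, reading in [("+5V", 5.08), ("+12V", 11.31), ("+3.3V", 3.34)]:
    print(check_rail(rail, reading))
```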

I find it hard to believe Dell would ship a faulty cable with the monitor. If they did: class-action lawsuit? Recall?


http://monitorinsider.com/displayport/dp_pin20_controversy.html

I guess it’s not as bad as the problems with USB-C, but imagine how nice it would be if everyone would just spend the penny to include a protection diode on that pin.

What problems does USB-C have? Now that I have four devices that use it (PC, Switch, iPad, camera) it seems to be working fantastically. I can use the same USB-C->HDMI adapter on every device. I can connect devices together with cables to transfer data. Everything seems to “just work”. I want everything to be USB-C now.

The only problem is that the Switch refuses to output video on USB-C if it doesn’t detect the one true legit Switch AC adapter.

It’s getting better, and if all the stuff you have is not cheap garbage you’ll probably never deal with it. But there was a whole thing with out-of-spec cables for a while; IIRC the spec doesn’t allow for >5 V but everyone does it anyway, and there was confusion about USB 2/3/3.1, Thunderbolt/HDMI/DP, etc.

The issue I personally still run into is MacBooks deciding they would rather draw power than transfer data when in target disk mode, but that’s pretty niche. There are also some weird complexities in implementing it properly, but that’s slowly going away as dedicated ICs handle it for you. I like the idea of it and it mostly “just works” now, but the rollout was messy.

Gotta love being able to plug in a “universal” cable and have it blow up your shit.
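For what it’s worth, the cable-blowing-up saga mostly came down to one resistor: legacy A-to-C cables advertise how much current the device may draw via a pull-up (Rp) on the CC line, and the bad ones used the wrong value. A sketch, with resistor values from the Type-C spec as I remember them:

```python
# Sketch of the out-of-spec cable problem: a legacy (USB-A to C) cable or
# adapter advertises how much current the sink may draw via a pull-up
# resistor (Rp) on the CC line. Values per the Type-C spec as I recall them.
RP_ADVERTISEMENT = {
    56_000: "default USB power (what legacy A-to-C cables must use)",
    22_000: "1.5 A",
    10_000: "3.0 A",
}

def advertised_current(rp_ohms: int) -> str:
    return RP_ADVERTISEMENT.get(rp_ohms, "invalid Rp")

# The infamous bad cables put the 10 kΩ resistor on an A-to-C cable,
# telling the device it could pull 3 A from a port that might only be
# built to source 500 mA -- hence the fried ports.
print(advertised_current(10_000))  # what the bad cables claimed
print(advertised_current(56_000))  # what they should have claimed
```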