Computer advice thread ("What's the best way to do this?")

Ages ago, possibly in the old forum, I mentioned that there were consumer-level products serving thousands of customers running entirely on Windows servers. I’m finally proud to say that that is no longer the case, to my great relief. We still use Windows servers for the internal-facing stuff, but I don’t care when that stuff breaks, so it’s fine.

I give it a year or two at best. That’s how long both AMD setups I’ve built lasted. Once is bad luck; twice is a pattern.

[quote=“panfriedmarmot, post:83, topic:288, full:true”]
I give it a year or two at best. That’s how long both AMD setups I’ve built lasted. Once is bad luck; twice is a pattern.
[/quote]Man, y’all must be doing something wrong; every AMD system I’ve built has gone the distance. The only AMD part I’ve had fail on me so far has been a graphics card that had liquid damage (which wasn’t uncommon for parts purchased around that time in this city, since much of it was underwater earlier that year).


Sooo I’ve gone through a ton of different errors with my PC, part of which is my fault. I think the not-booting might have just been a loose cable, in hindsight. I might have been getting some issues from a dead BIOS battery, but that seems OK right now, so I’ll just order a new one when I get a chance. I tried swapping the CPU with one I ordered, since I thought that’s what was borked at the time. I’d read that Windows would just adjust accordingly, but that didn’t seem to be the case. Eventually I swapped back and figured I would just upgrade it when I need to do a reinstall.

The problem now is it seems to work after I do a system restore, but the GPU craps out in various ways. Sometimes it will work fine; I was even able to play a match of Overwatch and a little Dirt. Other times, as soon as I log into Windows, I’m given a blank green, purple, orange, or light blue screen. Sometimes the monitor will just shut off. I noticed it would happen when I would open GeForce Experience. A few of the times I would hear a Windows alert and then the monitor would immediately turn off. I tried updating the drivers from an offline installer and uninstalling GeForce Experience altogether; neither helped. It generally works fine off the integrated graphics. Do you guys think somehow my drivers just got borked and I should do a reinstall of Windows, or is my graphics card dying?

Might be the graphics card there, buddy.

@ninjarabbi I’d probably try a reinstall of Windows first, just in case something has gone crazy. Bad BIOS batteries and such can cause some weird behavior too, but I’d be leaning towards the card.

As to the whole AMD stability/longevity conversation, Intel has been noticeably better in the long run on average. In my experience, though, the motherboards tend to be the point of failure/instability.

I currently have 3 Intel and 3 AMD systems in the house. The new-parts machines are solid (2 each); the cobbled-together machines are less so (1 each). To be fair, the unstable Intel one is a gaming laptop that was used as a weapon before I got it, and the fact that it even turns on is impressive.

Two of mine blew up, my wife’s blew up, and the freaking prebuilt my parents had blew up. It’s the components, friend.

Wow, I haven’t had a single issue with my AMD system, fingers crossed. Although it does SOUND like it’s about to burst into flames whenever I play anything modern and 3D. I do plan on going Intel with my next PC, though; I built this one when I was going to school and working retail, so ‘horsepower per dollar spent’ was my chief concern.

Not consumer level, but I wasn’t very far away from a mid-scale data-center rollout, and after a large amount of negotiations and whatever it is people who don’t wear jeans to work do, the company purchased something like 1500 AMD CPUs. The whole ops team went to a hockey game at the Garden in one of those ritzy booths where you’re above the ice but also inside. From what they tell me, The Great One was in the booth next door, and even though they pounded on the wall, he did not come over and shake their hands.

Anyway, back on topic. Said DC was rolled out many years ago and still serves the company that owns it well. My point is that, in the hands of someone with ops skills, AMD is just fine.

I’ve been Team Blue up to now, and until I started seeing some positive experiences with the Ryzen 7, I was still on the fence about going 7700K. But between the 7700 being just a mostly maxed-out 7600 and my workflow being one where octo-core might make sense, it’s hard to pass up giving it a shot. Being able to have a full CAD session going, run a simulation or render, stream a movie, and then jump into a round of BF1 or something, all without killing apps? That sounds nice.

Worst case here, if Ryzen turns out to be poop, I can take my L, and it’ll put me out maybe $500 for an i7/mobo combo. The GTX 1080 will still be relevant until at least the 12xx series comes out, DDR4 will be plenty fine for years, and everything else doesn’t really care at all what chipset I use.

Seems like the Ryzen has twice as many cores, so it wins out a lot if you can actually make use of those cores. Except individually those cores are still crappier than the Intel ones, and single-core performance is still what matters for most real-world desktop usage.

The only game I can think of that may be able to use all those cores is X-Plane, because it likes to have a dedicated core per AI plane if possible, but most games and desktop apps probably won’t need it unless you’re doing some crazy, heavy parallel calculation stuff. Scientific computing, maybe, as well as high-end rendering perhaps.
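To make the cores-vs-clocks point concrete, here’s a minimal toy sketch (mine, not from anyone in the thread, and not a real benchmark; the busywork function just stands in for a render tile or sim chunk) showing that an embarrassingly parallel job scales with core count, while serial work gets nothing from the extra cores:

```python
# Toy demonstration: the same total work run serially vs. spread across
# all cores. Purely illustrative; real renders and games behave differently.
import time
from multiprocessing import Pool, cpu_count

def burn(n):
    """CPU-bound busywork standing in for one render tile / sim chunk."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 8  # eight independent chunks of work

    start = time.perf_counter()
    for n in jobs:          # one core, one chunk at a time
        burn(n)
    serial = time.perf_counter() - start

    start = time.perf_counter()
    with Pool(cpu_count()) as pool:   # chunks spread across every core
        pool.map(burn, jobs)
    parallel = time.perf_counter() - start

    print(f"{cpu_count()} cores: serial {serial:.2f}s, parallel {parallel:.2f}s")
```

A single-threaded game loop is the serial case no matter how many cores you have, which is why the faster individual Intel cores still win most desktop workloads.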

Every console except the Switch is running AMD lol.
All the bitcoin miners use AMD because the cards have more computation per dollar than nVidia.

I’ve had setups from both companies, so I don’t have an issue. Sure, AMD pushes the cards way harder and they have a far warmer operating temperature, but it still works.

I’ve decided to stay on nVidia due to ShadowPlay, and because nVidia has so much collaboration with game devs and hardware manufacturers that play is minimal hassle. AMD would get there, but the driver update would always be a week or more late; worst case, I would end up finishing the game by the time the optimised driver was released.

I’m open to AMD CPUs but am fine with my current Intel setup at the moment.

I avoid ATI cards because as far as I know, they are pretty useless for video editing/rendering compared to CUDA-enabled nvidia cards.


[quote=“SkeleRym, post:95, topic:288, full:true”]
I avoid ATI cards because as far as I know, they are pretty useless for video editing/rendering compared to CUDA-enabled nvidia cards.
[/quote]Nah man, CUDA and OpenCL are pretty neck-and-neck these days; you can expect any AMD card to perform almost identically to its equivalent Nvidia card. It also depends on the specific card more than you might think, depending on the task. For rendering and transcoding tasks, for example, an Nvidia 970 tends to slightly outperform the 980 and 980 Ti, despite the latter two being more powerful cards. But the more powerful a card is, the better it’ll perform for certain other very high-horsepower tasks, like transcoding chroma-keyed 4K footage down to 1080p in the same format. And even then, except for the last one, the differences are not all that huge.

To be honest, right now with video editing, unless you’re buying seriously specialist equipment, you might as well just pick by card, not by brand, except for the tiny handful of specific tasks that you’re likely not going to be doing often enough to justify the extra expense.

I saw a lot of benchmarks showing abysmal performance for non-CUDA if you have a bunch of Lumetri effects enabled. Maybe that changed in the most recent generation?

144Hz. Worth it over 60 for gaming?

[quote=“SkeleRym, post:97, topic:288, full:true”]
I saw a lot of benchmarks showing abysmal performance for non-CUDA if you have a bunch of Lumetri effects enabled. Maybe that changed in the most recent generation?
[/quote]Maybe, not sure; haven’t tested that specifically. I do recall that for quite a while, Adobe couldn’t use OpenCL for some reason, which made AMD cards absolutely chug by comparison. But along with Windows 10 and one of the CC2015 updates came full OpenCL compatibility (may not be the right word) without workarounds, which really gave it a kick in the arse.

Edit - Actually, come to think of it, I know a chap who is the IT guy/tech for a company that, for lack of a better term, does outsourced video editing, effects, and rendering. I know he used to be all NVIDIA and recently started using some AMD; I’ll ask him and see what’s up when it’s a more reasonable hour of the day.
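If anyone wants to sanity-check the OpenCL side on their own box, here’s a small sketch (my own, assuming the pyopencl package is installed; it’s not anything Adobe uses) that lists what the installed drivers actually expose, since “the card supports OpenCL” and “the driver stack exposes it properly” are two different things:

```python
# Enumerate the OpenCL platforms/devices the installed drivers expose.
# Assumes pyopencl is available (pip install pyopencl).
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.version})")
    for device in platform.get_devices():
        mib = device.global_mem_size // (1024 ** 2)
        print(f"  {device.name}: {device.max_compute_units} compute units, {mib} MiB")
```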

Motherboard and CPU arrived. Amazon might not be getting my computer biz anymore after someone or something there decided that slapping a shipping label on an unsealed motherboard box and tossing it naked into the mail system is acceptable shipping practice. Nevertheless, the box looks clean and all the contents are there, so when I get a chance I’ll dig in and take stock of what I got.

Leaving for paintball this weekend so I won’t get to really dig into my build until next week, but that gives me time for newegg parts to roll in.

I still am not keen on ATI graphics. Maybe someday.

I have a 144Hz G-Sync monitor. I like it a lot, but realistically it’s probably not worth it. The main benefit of having both is the wiggle room: anything between 40 and 144 FPS is smooth. If it’s only marginally more expensive and you need a monitor anyway, go for it. For me it’s a similar feeling to going up a small step in resolution: noticeably better but not game-changing. Above 90 FPS I can now induce motion sickness in shooters by flailing the mouse around, so that’s fun.
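To put rough numbers on the diminishing returns (my own arithmetic, not from the post): the per-frame budget is just 1000 ms divided by the refresh rate, so each step up buys less absolute time than the last:

```python
# Frame-time budget at common refresh rates; shows why each step up
# in Hz buys less absolute smoothness than the previous one.
rates_hz = [40, 60, 90, 144]

prev_ms = None
for hz in rates_hz:
    frame_ms = 1000.0 / hz  # milliseconds available per frame
    note = "" if prev_ms is None else f" ({prev_ms - frame_ms:.1f} ms less than the step before)"
    print(f"{hz:>3} Hz -> {frame_ms:5.1f} ms per frame{note}")
    prev_ms = frame_ms
# 40 Hz is 25.0 ms, 60 is 16.7 ms, 90 is 11.1 ms, 144 is ~6.9 ms.
```

Going from 40 to 60 buys about 8 ms per frame, while 90 to 144 buys only about 4 ms, which lines up with “noticeably better but not game-changing.”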