PC Building V4 - "ok SSDs got cheap, now do RAM next"
999 replies
RTX is indeed a collection of (proprietary) nvidia things, which is why I said raytracing specifically.
From https://developer.nvidia.com/rtx
Ray tracing acceleration is leveraged by developers through NVIDIA OptiX, Microsoft DXR enhanced with NVIDIA ray tracing libraries, and the upcoming Vulkan ray tracing API.
OptiX isn't meant for games so we can ignore it in this case.
You can run games written for DXR just fine on any GPU, it'll just be dog slow if it's not an RTX card.
Nothing is stopping AMD or anyone else (Intel 2020?) from adding DXR acceleration to their hardware and being compatible with it.
You said RTX, which is an Nvidia specific platform.
The RTX platform provides software APIs and SDKs running on advanced hardware to provide solutions capable of accelerating and enhancing graphics, photos, imaging and video processing.
DXR will get cross-vendor support eventually (as well as software created with one vendor in mind ever can, anyway). As for how this will all actually shake out between DXR and whatever Vulkan extensions are coming, that's still way up in the air.
The developer resources (the few that exist) for DXR don't even currently support a safe fallback on AMD devices, lmao. Typical Nvidia.
I'm planning on just starting a complete new build to replace what I have now (i7-3770k, 24GB DDR3, GTX 1060 6GB) and I haven't really looked at PC building too much since Sandy Bridge so looking for some feedback.
I'm thinking about getting the MSI RTX 2080 Ti Gaming X Trio because it scores high in benchmarks, with low fan noise. Bit disappointed about the 3 power connector requirement though, because now I'll need to run two cables. The ASUS RTX 2080 Ti ROG Strix looked interesting too at first, until I found out it has only 2 DisplayPort connectors while I want 3 (triple screen setup, and a Vive).
Because the Gaming X Trio is a whopping 327mm long, I'm thinking of getting the Phanteks Enthoo Evolv X in which it should fit just fine.
Undecided on PSU, probably the Corsair RM850x (2018 revision)
The CPU will be the i9-9900K, motherboard unknown because none are released yet, and the RAM a 2x16GB Kit of Corsair Vengeance LPX DDR4-3200.
Is that RAM the right choice? Should I pick something with a lower frequency and try to overclock it, or a higher frequency? I've also read that two DIMMs instead of four should put less strain on the memory controller and allow for better overclocking.
For storage I'll get a Samsung 970 500GB or 1TB, but what's the currently recommended model of HDD? My current build has a bunch of separate 1 and 2TB drives (Spinpoint F3 and Seagate Barracuda), and I'm looking for at least 4TB.
That leaves the CPU cooler. I thought about the Noctua NH-D15 but I don't really like the idea of hanging 1.5KG of heatsink off my motherboard in addition to the 1.5KG of GPU. I also don't think it's a good idea to start experimenting with custom water loops.
After that I'm basically left with an AIO, but which? Something like a Kraken X62/X72 or Corsair H115i/H150i?
Question... Are RT ops something that specifically HAS to be done by the same GPU drawing the raster frame? Or can other, specialized hardware take on that task and simply pass its results to the GPU along the system bus?
Say, you build a dedicated RT card that's just a GPU-sized chunk of RT cores.
While I've never had a massive GPU (unless you count the 5870, which was super long and had a little sag), I've always had a full-length, two-slot card. I've had an NH-D14, which is a good bit bigger than an NH-D15, attached to my motherboard for 8+ years. I've had to physically move the computer every 3-6 months to dust it out because where I live is very dusty, so it goes up and down stairs, gets picked up, tilted, bounced, and generally moved all around. SecuFirm is absolutely incredible to use; the backplate has not been off my motherboard since it was installed in 2010. I'm positive you would snap the motherboard off the standoffs in the case long before a Noctua causes any issues.
Now, I might not recommend knocking around say a $800 E-ATX EVGA SR2 dual socket board with two massive heatsinks hanging off of it. But any setup like that can't fit a couple NH-Dxx coolers anyways.
AIOs really kinda suck; unless you spend $120+ on the highest-end 280mm+ radiators, you're not going to beat a $70-80 Noctua. AIOs are really best for aesthetics and features.
I've gone the route of a 2TB and a 4TB for space. The 2TB is a WD Black that I use for shitty Steam and Origin games that take up a ton of space and I rarely play; despite only having like 30 games installed, they consume 1.15TB... For my media storage I went with a 4TB IronWolf, which is surprisingly faster than slightly older 7200rpm drives too. I would use a 4TB or 6TB 5400rpm NAS drive for your rarely used storage, something that isn't spun up unless you access it; my thinking is that it'll last longer than a standard drive. I'd then pick up one of those 2TB Micron SSDs for general storage: programs, games, and stuff that's really going to benefit from the faster storage. On top of all that I'd pick out a nice NVMe drive, like 500GB, for a boot drive. Some games seem to not be affected by storage speeds at all, so going all-in on NVMe doesn't seem worthwhile unless you have networked storage.
Storage is really about assessing your personal needs and figuring out what will work best for you. I'm a bit of a data hoarder; I really don't need most games downloaded on my computer since I get 36 MB/s down from Steam anyway, but it's convenient as I have the space.
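For what it's worth, the "I can just redownload it" logic holds up: here's a quick back-of-envelope using the 1.15TB library size and 36 MB/s Steam speed from the post (decimal units assumed, since that's how drives and download meters are labeled):

```python
# Rough check: how long does redownloading the whole library take?
# Inputs come from the post above; decimal TB/MB assumed.
library_tb = 1.15   # installed game library
speed_mb_s = 36.0   # Steam download speed

size_mb = library_tb * 1_000_000   # TB -> MB
seconds = size_mb / speed_mb_s
hours = seconds / 3600
print(f"Redownloading {library_tb} TB at {speed_mb_s} MB/s takes ~{hours:.1f} hours")
```

So a full wipe-and-redownload is an overnight job, not a week, which is why keeping everything local is convenience rather than necessity.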
Nvidia's RT Cores are part of the actual SM:
https://s.gvid.me/s/2018/09/24/BSr872.png
https://s.gvid.me/s/2018/09/24/rvK589.png
So there's probably some dependence on cache locality when using them.
Dunno exactly, but AMD really should be looking at a chiplet solution to the RT problem, since it's the only reasonable way forward with RT in the first place, IMO.
I'm looking to build my first PC with no experience (expect more questions later), so I was wondering, with the new Intel cards shipping, do you think the prices on the GTX 1060/1070/1080 will drop substantially by the time Black Friday rolls around? If anybody has any guesses on how much cheaper they'll be that would be helpful.
Also, what should I be looking for in an optical drive? I'm not regularly burning CDs or anything; I just need something that can play DVDs and CDs without being super slow or getting jammed because of shoddy build quality. Also, how much more will it cost to get something Blu-ray compatible?
I know it might sound wasteful, but... Why not render the game twice? Render only the raster on the GPU, render only the RT scene on the dedicated RT card, and provide the RT card with the resources it needs to operate independently, including its own cache (the CPU would just deliver the same data to both caches and both memory stacks at once as a redundancy measure)
So, the CPU sends a frame to both cards simultaneously, the RT card caches or ignores everything that isn't relevant to RT operation and gets to work on the RT scene. Once it's done, it sends a completed RT dataset to the GPU for it to paint over the next available raster frame as a shader. The GPU would NOT wait for the RT card to finish; if the RT card has not finished its next frame yet, the GPU will reuse the last RT dataset it was sent, and would perhaps intelligently scale/pan/warp it to act as a rough interpolation between it and the next dataset.
If timing between the two cards is still a major concern - IE, if the GPU can turn out rasters way faster than the RT card can compute its own work - the GPU could always read the timing signal from the RT card via the SLI fingers using a simplified bridge cable (or read an AMD card's timing via the Crossfire interface), or vice versa if the RT card is getting its work done too fast for the GPU and is left wasting cycles not doing anything.
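The "never wait, reuse the last dataset" scheme described above can be sketched as a toy loop. This is purely hypothetical, not any real driver or API; `render_loop`, the frame list, and the RT-result queue are all made-up names standing in for the GPU's frame stream and the RT card's output channel:

```python
# Hypothetical sketch of the async RT-card idea: the raster GPU never
# blocks on the RT card. Each raster frame grabs a fresh RT dataset if
# one has arrived, otherwise it reuses (or would warp/interpolate) the
# previous one.
from queue import Queue, Empty

def render_loop(frames, rt_results: Queue):
    """frames: raster frames in order; rt_results: datasets from the RT card."""
    last_rt = None
    composited = []
    for frame in frames:
        try:
            last_rt = rt_results.get_nowait()  # fresh RT data, if any
        except Empty:
            pass  # RT card is behind: reuse the previous dataset
        composited.append((frame, last_rt))   # composite raster + RT pass
    return composited
```

The key property is that raster frame rate is decoupled from RT frame rate, at the cost of the RT layer lagging the geometry by a frame or more, which is exactly why some kind of warp/interpolation step would be needed.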
I probably don't know what I'm talking about, but hey.
Intel cards?
Intel's add-in cards aren't coming around until 2020 (if they don't get delayed/cancelled à la Larrabee)
Pricing on older series generally goes down the longer we get into the current generation, but the state of the GPU market (and Nvidia's new market-manipulation tactics) means prices might not drop as expected. If you can wait, wait, I guess, and see what card you can afford when it comes time to buy.
Blu-ray drives are roughly 2x the price of a good CD/DVD drive.
Just go with a high rated LG BD drive honestly, they're only ~$60.
LG Black 14X BD
Don't bother with UHD/4K Drives on PC, IMO, it's a fucking mess.
I was talking about the RTX 20 cards that just came out.
Just because the TIM is better doesn't mean the die produces less heat, if you're powering two extra cores, you're gonna generate that extra heat.
Just look at the i7-7820X delidded to compare, people can't cool those @ 5GHz on any existing AiO, they require custom loops.
If I already have a Lenovo Yoga Book (W10, 4GB RAM, Atom x5, 64GB eMMC) as a daily use tablet for stuff like art, web browsing and video watching, would a Surface Go be much of an improvement? Same screen size, but instead of the Yoga Book's (actually kinda nice) halo keyboard and integrated Wacom surface, I'd have to buy a keyboard cover and pen, and draw on the screen instead.
The screen of the Go is pretty darn excellent, it's faster on paper, and the build is among the best I've ever felt in a tablet, but factoring in the accessories it's much more expensive, and doesn't have 100% feature parity.
I dunno what to do
No, I'm estimating the 33% core increase to raise power consumption by roughly 33%. The larger die will have greater surface area, and the soldered IHS certainly helps, but that's not going to get around the fact that you'll need to thermally dissipate 250-300+ watts.
Just look at results with delidded i7-7820Xs, people are maxing out their AiOs @ 5GHz.
I'm both extrapolating Coffee Lake results (250 Watts conservative for 8C @ 4.7-4.9GHz) and inferring from 7820X.
Both are Intel Core X86, the only difference being the 7820X is a much larger die and IHS, which should actually give it a better thermal situation.
In virtually all of the OC/thermal tests the only part they stress is the core (e.g. Prime95), so the "extra" stuff like additional PCIe lanes or the beefier IMC isn't pulling anything extra. (Technically the cache configuration is also different, but only by a margin, and it shouldn't impact stress-test wattage much.)
Anyway, you can just look at 8700K wattage figures and multiply by 1.33, and you get a good ballpark of what the 8-core CFL parts are going to use. Also consider that when adding extra cores, the voltage floor for any given OC is higher too, meaning those golden chip voltages on 6-core parts aren't going to be attainable on 8-core parts.
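The multiply-by-1.33 ballpark above is trivially mechanical; here it is as a helper, just to make the scaling explicit. The 150 W input is an assumed example figure for an overclocked 8700K, not a measurement from this thread:

```python
# Back-of-envelope core-count power scaling: assume package power under
# an all-core stress load scales roughly linearly with core count
# (same architecture, same clocks, per the argument above).
def scale_power(six_core_watts: float, cores_from: int = 6, cores_to: int = 8) -> float:
    """Scale a measured 6-core wattage to an estimated 8-core wattage."""
    return six_core_watts * cores_to / cores_from

# Hypothetical input: ~150 W for an overclocked 8700K under Prime95.
print(scale_power(150))  # 8/6 = 1.33x -> ~200 W estimate
```

This is deliberately crude: it ignores the higher voltage floor the post mentions, which pushes the real number further up, so treat the output as a lower bound.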
My bad, I meant Nvidia. I'm not remotely interested in getting the 20XX series, but I was just wondering if prices on the 1060/70/80 might drop because the 20XX just hit.
GTX 1070 is a fantastic value right now; you can get them for $350 or less from a lot of places, and that card eats 1080p games for breakfast.
I wish I could find prices that low over here. They're all over the place, from €400 to €500+.
Money isn't really the issue; like I said, I care more about not having a giant chunk of metal on my motherboard (weight- and aesthetics-wise). What would not be Asetek junk? I thought everything was Asetek anyway.
Stock boost is already 4.7GHz on all cores simultaneously, and these should be soldered, so I'm hoping it'll be fine. In case it's not: how difficult (and maybe how bad an idea) would you say a first custom loop build is?
It's really not difficult. And it's also really not dangerous either, just don't be an idiot and properly run a 24h leak test and you'll be fine.
Pressurize it with air first. You don't need much, just 20-30 PSI; leave it overnight with a pressure gauge on it. If the pressure drops at all, you know there's a leak and it won't hold water.
I upgraded from my 4 year old 980 to a 2080 a couple of days ago. Love every aspect of it. Fight me.
You'd have loved it just as much if it was a 1080 Ti, and been $200 richer.
Considering the cheapest 1080tis in my area would have only saved me 50 bucks I'm not crying a river
That's unfortunate. At that point I can't argue too much against the new card. Hopefully raytracing will be attainable on that card, too.
This really is one of the best times I've seen in years to upgrade your GPU.
900-series owners have a whopping THREE sets of really good upgrade options, getting to take their pick from the 10-series, the 20-series, or AMD's Vega cards. Heck, even the RX 500 series are fine upgrades for 960 holdouts.
Seriously, if you've been putting it off, now is the time. Go. Before it's too late and the tariffs set in.
I've got a 144Hz 1440p monitor; I don't have much choice outside of a 1080 Ti or an even more expensive 2080 Ti to reach 144fps in most games, do I?
1080 Ti does ~100-130fps in almost all new games with Ultra preset. The 2080 Ti I think does a few games at 140-150 but it's mostly 120-140. If you drop down to High or use MSAA instead of TAA you'll probably do 144 just fine on a 1080 Ti
I wish nvidia would add analog out back to their cards so I could upgrade from my 980 Ti at some point, but this card is staying forever unless that happens. One day I will play DOOM 2016 on an FW900... one day.......
You can easily adapt DisplayPort to VGA with virtually no added lag.