PC Building V4 - "ok SSDs got cheap, now do RAM next"
Are these things decent? Doesn't have to be super amazing, as long as it actually is reusable. I'm going to be testing a lot of different CPU coolers and don't want to waste a tube of good paste.
https://www.amazon.com/Innovation-Cooling-Graphite-Thermal-Pad/dp/B07CKVW18G
There's little formal testing on them. GN said they were testing them, but that was like a month ago. Here's an LTT video where IC Graphite performs within 1-2°C of a good thermal paste, but LTT's testing methodology is barely short of a joke.
https://www.youtube.com/watch?v=YpphKzmDiJM
There's no long-term data on them whatsoever, they're too new.
I mean if heat doesn't cause it to actually change its shape or crack over time, I don't really see how its performance could deteriorate.
Maybe I'm wrong but on the surface it seems like it'd be less volatile than paste.
I agree with you but materials science is weird, I'm uninformed and I don't want to make assumptions that seem intuitive.
Okay, yeah, apparently one BIG downside is that it's electrically conductive, meaning it can kill components if it lands on electrical contacts while you're trying to place it
And it's thicker than paste so it isn't recommended for use in laptops, GPUs, or other components manufactured with extremely tight tolerances
So.
I now have a motherboard, with an I7 2600 and 16GB of RAM, that has 6 PCI-E slots on it.
Home-made super-computer? I have been meaning to upgrade the Think-Tank with more serious hardware for server/learning systems...
Man, the nvidia 2000-series is a joke
https://techreport.com/review/34105/nvidia-geforce-rtx-2080-ti-graphics-card-reviewed
More or less equivalent performance at a higher MSRP than the comparable 1000-series cards? Perfect.
The 2080 Ti is way faster than the 1080 Ti though, I'm watching Hardware Unboxed's video
30% faster for 2x the price 🤔
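To put rough numbers on that: a quick perf-per-dollar sketch, assuming launch MSRPs of ~$699 (1080 Ti) and ~$1,199 (2080 Ti Founders Edition) and the ~30% average uplift from the reviews above. All three figures are ballpark assumptions, not measured data:

```python
# Rough perf-per-dollar comparison (prices and uplift are assumptions,
# not benchmark results; MSRPs are launch Founders Edition figures).
price_1080ti = 699.0
price_2080ti = 1199.0
relative_perf_1080ti = 1.00
relative_perf_2080ti = 1.30  # ~30% faster on review averages

value_1080ti = relative_perf_1080ti / price_1080ti
value_2080ti = relative_perf_2080ti / price_2080ti

# How the 2080 Ti's perf-per-dollar compares to the 1080 Ti's:
print(f"2080 Ti delivers {value_2080ti / value_1080ti:.0%} "
      f"of the 1080 Ti's perf per dollar")  # → 76%
```

So even granting the full 30% uplift, you're paying for roughly a quarter of your performance-per-dollar to disappear.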
Yes, please follow up if you do purchase it. I don't know anybody who's bought one. I initially meant to, but I decided against it at the last second.
It amazes me how well Pascal/Turing scale: just throw more SMs at the problem and you get almost linear scaling (provided memory is scaled appropriately).
AMD can't fucking compete with that; GCN's ridiculously fat CU pipeline just isn't balanced.
Oh my god that card price. What the fuck. I hope nobody actually pays that.
DLSS seems to be the core benefit here; the 2080 absolutely wipes the floor with the 1080 Ti with it enabled. As more developers are able to implement it, it should deliver a notable performance uplift.
As more developers implement it
That's always the key
DLSS is upscaling, not a native apples-to-apples rendering comparison.
If it looks better and runs better, it's clearly better, whether or not it works the same way
Once again though, it needs per-game profiles, so we'll see how support goes in the future
It will for sure run better, because it's doing 1/4 the raster work.
It probably won't look better (since, again, it's using DNNs to infer what that missing resolution would look like), but it's far from apples to apples.
And it's a shame too, since on the raster side Nvidia has implemented a lot of great features (basically all the Vega features plus way more, like double-speed FP16 and async compute), so it's ridiculous that they felt the need to cripple their offering.
Hopefully AMD makes Navi work: strip out silicon hogs like HBCC, lean up the GCN core, and get essential features working (DCC, DSBR/TBR, PS).
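To put the "1/4 the raster work" point in numbers, here's pure pixel arithmetic. Which internal resolution DLSS actually renders at is decided per game, so both common candidates are shown as assumptions:

```python
# Pixel counts for common resolutions (just arithmetic; which internal
# resolution DLSS renders at before upscaling is a per-game assumption).
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name in ("1080p", "1440p"):
    frac = pixels[name] / pixels["4K"]
    print(f"{name} internal render = {frac:.0%} of native 4K's pixels")
# → 1080p internal render = 25% of native 4K's pixels
# → 1440p internal render = 44% of native 4K's pixels
```

So "1/4 the raster work" matches a 1080p internal render exactly; a 1440p internal render is closer to half the pixels of native 4K.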
If they actually opened support for Freesync via DisplayPort, and allowed for custom fan profiles in their own software, Nvidia's cards would literally have no practical downsides
Besides financially supporting a company with pretty bad business practices.
Unless you're pixel peeping, it probably doesn't look noticeably worse during gameplay. They use 4K as the usage example, but it should benefit anyone running higher refresh rates as well.
https://www.youtube.com/watch?v=MMbgvXde-YA
DLSS at first appeared to look worse to me, but while there seems to be less pixel detail on distant objects, it did look more accurate.
Is it just me or does the FF demo seem intentionally cherrypicked to muddy the results in favor of DLSS? It's like they specifically picked a game that has a bad TAA implementation and postprocessing in general. It looks like the other settings aren't even the same in both cases as the TAA version has some additional motion blur/DOF that makes parts of the image blurrier than it should be. Even then DLSS still doesn't necessarily look as good or better than the native 4k + TAA image - there's plenty of discussion you could have on the subjective aesthetic aspects of each, but the fact is that the upscaled 1440p doesn't resolve the same amount of detail as 4k + TAA.
What I really want to see is how 1440p DLSS looks vs 4k native without AA/with SMAA/w good TAA implementation. And of course the full native 4k + DLSS vs 4k TAA, that's when we can really start talking about the quality/performance benefits of each technique.
The only takeaway so far is that DLSS looks like a pretty attractive way to reach 4k on hardware that otherwise wouldn't be able to run it smoothly, but I definitely still wouldn't want to run sub-native on lower resolutions.
I'm quite eager to see how it performs in real games rather than specifically selected examples that play along nicely with it. How does it do with small subpixel details that you can't resolve without jittering the sampling positions? Will it completely choke on temporally unstable features like Moire patterns? These are the situations that TAA excels at.
To be fair, AMD is willingly aiding the Chinese Communist Party in spite of the clearly oppressive way they use technology on their own people
Everyone has terrible business practices these days
It is finally complete, this took a week longer than expected.
https://imgur.com/a/R1qrtAh
I'd like a transparent case but I am absolute dogshit at cable management
Neither Intel nor Qualcomm are particularly innocent either
While that's true, so do most tech manufacturers. I'm talking more about shitty business practices that directly affect the segment I'm buying the product in. Pretty much all Nvidia has done for gaming for years is make it increasingly proprietary.
AMD has been actively seeking direct dealings with the CCP for state-sponsored projects, or dealings with state-sponsored entities (like that Threadripper console), to obtain the capital necessary to threaten Intel in the microprocessor space
Bad cable management is usually the fault of the case, not the user. Just make sure the case has holes on ALL THREE EDGES of the motherboard, preferably the ones with cool rubber notched stoppers in them for added sleekness, and it's easy as pie. It doesn't matter how messy it is behind the motherboard.