PC Building V4 - "ok SSDs got cheap, now do RAM next"
Hello Nvidia, how fast is the 2080Ti?
"2"
Thank you Nvidia. I shall now put in my preorder
I wouldn't say that. It's all speculation, of course, but I'm particularly interested in the theory that Nvidia is releasing the 2080Ti at Turing's launch because the 2080 is weaker than the 1080Ti and it'd look quite bad if they didn't immediately come out with a product to beat their last generation's flagship.
In apples-to-apples comparisons, I suspect it's going to be roughly in line with the 1080Ti in most current applications.
It really all hinges on the improvements to CUDA that they've announced, and how much more performance we'll see per CUDA core. And we have absolutely no way of knowing that for now.
Oh also in news:
https://arstechnica.com/gadgets/2018/08/amd-to-use-tsmc-to-make-its-7nm-cpus-gpus-as-glofo-abandons-7nm-development/
GloFo is canning their 7nm node. This is like the 4th time GloFo has fucked AMD. With GloFo inheriting the IBM research, a lot of people were hoping the GloFo 7nm process would be much more salable than their current 12/14nm nodes.
Now the only ~7nm fabs are going to be Samsung, Intel, and TSMC. Meaning price-per-wafer is going to suck, and AMD is going to have to go head-to-head with Nvidia on the same node, instead of potentially having a process advantage.
WTF glofo, fuck right off.
I thought AMD had some kind of shareholder control over GF? This is crazy. I really don't like how dominant TSMC is in the market.
I have an entire box of noctua gaskets of different colors. Do you want anything other than brown? I'll ship it to you for free.
Eh, I don't think they're even doing anything functional in my case, because for some reason Corsair doesn't flush-mount their fans but rather has standoffs for each screw. The rear 120mm is awkwardly mounted, with enough standoff that I could probably jam the wire under it.
I'm sure someone else here could use them, somewhere they'd be far more visible and functional; mine are behind the front panel and a dust filter.
So a former colleague of mine is working with another company now, on small-scale machine learning type stuff, and while he can't go into detail with his current project (due to confidentiality reasons), he described it as "a server inside a miniature wind tunnel." Basically a powerful push/pull setup across a multi-board system, like it's an oversized radiator.
I say this because as a joke a few days ago I told him to boot Windows Server and run Speedfan for laughs.
https://files.facepunch.com/forum/upload/228820/baf52509-15df-432e-8cea-73a9c2395e82/airflow.png
Oh.
https://i.imgur.com/AUsRQ2W.png
Is this any good?
It costs 615 USD here. How much would it cost in the US/UK/EU?
I plan on adding a GPU later. I guess something like a GTX 1060 would be a good match.
Is there anything I'm missing?
https://twitter.com/TUM_APISAK/status/1034262129254092800
This guy supposedly was able to benchmark it and found the 2080 to be slightly faster than the 1080 Ti.
Don't get a Corsair CX; get a Seasonic if they're cheap in your country. The case is awful too. Otherwise it seems fine.
The 2080 is probably going to be (in apples-to-apples situations) a 20-30% uplift over the 1080; in situations where the 1080 was limited by memory bandwidth (4K, HDR), expect higher gains, but still in line with the 1080Ti.
I think there's a reason Nvidia released the 2080Ti on launch, and it's not just because 7nm is around the corner, but because the 2080Ti is only going to beat the 1080Ti by that same ~20%.
Just bought a 1080 Ti for £460, so I'm happy.
In terms of being able to run stuff at 1080p, then upscale to 4K, yeah.
But that's not an apples-to-apples comparison, nor do I consider "fancy upscaling" to be something worth cheering. Nvidia could've put more SMs on their die instead of otherwise-useless Tensor processors.
I'll be happy to buy ray-tracing hardware once it's had more than one generation to mature.
For now, I feel dead-set on a 1080 unless benchmarks blow me away (doubt).
Yeah, except that 1080 Tis aren't even selling for MSRP anymore. They're coming all the way down to $600 and lower. There are at least 5 of them on Newegg right now, in stock for $650 or lower with rebates.
https://www.newegg.com/Product/Product.aspx?Item=N82E16814487381
Unless you're budgeting into the $600+ tier for a video card and cost is no issue, none of these cards are attractive in any way. The 1080 is practically 80% of a 2080, if my card is 83% of a 1080 Ti. Now keep in mind, a 1080 goes for $350-400 vs $750-800 for a 2080. RTX is just not good value until these prices get slashed, like in half, especially with Pascal only getting cheaper and used cards poised to flood the market.
I feel like saving 20% and getting 3GB more video memory would be a much better deal for what's pretty much a low-end Titan Xp. 1GB and 2GB cards were popular in the midrange tier not long ago, and they aged awfully with newer games. If you plan on moving to 1440p or 4K down the road, I'd think that 1080 Ti is going to be much more useful at the resolutions of tomorrow than a card designed at the tail end of this recent DRAM shortage.
With Nvidia forcing 10-series GPUs down their partners' throats if they want access to the 20 series, and pretty much every rumor turning out to be a wild over-expectation, expect Pascal cards to get further price cuts as they try to move them as fast as possible. Or expect Nvidia to price-fix by keeping Pascal on sale alongside the 20 series, not budging on MSRP, and taking out their bad luck with the crypto-bubble pop on gamers.
I really hope Turing shapes up into a decent release with future driver improvements, but for now it's really not looking like anything more than an extension of their current pricing scheme.
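To put rough numbers on the value argument above, here's a quick price-per-performance sketch. The relative-performance figures (0.83 for the 1080, and a hypothetical 1.04 for the 2080 based on the leaked "slightly faster" benchmark) are forum estimates, not measured results, and the prices are the street prices quoted in this thread:

```python
# Rough price-per-performance comparison, normalized to 1080 Ti = 1.00.
# Performance ratios are forum guesses from this thread, not benchmarks.
cards = {
    # name: (street price in USD, relative performance vs 1080 Ti)
    "GTX 1080":    (375, 0.83),
    "GTX 1080 Ti": (650, 1.00),
    "RTX 2080":    (775, 1.04),  # assumes the leaked result holds up
}

for name, (price, perf) in cards.items():
    # Dollars per "1080 Ti unit" of performance: lower is better value.
    print(f"{name}: ~${price / perf:.0f} per unit of relative performance")
```

By this napkin math the 1080 lands around $450 per performance unit versus roughly $745 for the 2080, which is the ~20% savings being argued for.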
Yeah I'll likely be set with my Vega 64 for a couple generations until actual new hardware accelerated tech catches on.
holy fuck, a Vega 64 owner in the wild, WHERE'S MY CAMERA
Checkerboard rendering is more of a temporal/spatial accumulation technique, which I don't really have a problem with; native is obviously better.
My issue is dedicating a whole 25% of the chip to what is basically just "deep rescaling"; they should've pumped that into more SMs and actually hit 4K 60+.
DLSS is the most exciting part of RTX for me, but unfortunately the fact that it has to be implemented per-game could really limit its success.
It's going to be interesting to see how Vega performs on DXR, since it actually does do double rate FP16, and has "RPM".
Wouldn't surprise me if Vega beats pascal handily in ray-tracing, but Turing stomps both.
Maybe AMD should look into simple little FP16/INT8 chiplets they can interconnect (a la EPYC, but smaller-scale) to just blast ray-tracing performance. Ray tracing isn't particularly bandwidth-limited, so your GMI/Infinity Fabric isn't going to get saturated.
I can't remember if I've asked this before, but can I drive a 1080 or 1080Ti on my 520W Seasonic PSU? Power supplies and power consumption are really not something I have a good feel for.
It's literally just upscaling using a neural net. I think that's the most boring use of an on-chip neural net you could come up with.
Man I could not disagree more, the ramifications for performance and render resolution are really exciting to me.
Right when shit started to drop in price in this most recent mining bust, when there were still only like 3 models in stock.
Bonus rare pic: Powered Vega with idle fans
https://files.facepunch.com/forum/upload/109699/379c3999-e244-42e8-922d-a612c3d983dc/IMG_20180827_173600.jpg
I can't imagine raytracing is mature enough for me to want to use yet, especially since I aim for a 144FPS target at the expense of image quality. DLSS is the only application of RTX that even remotely interests me at this time.
Any answer to this? It's an M12-II, if that matters. I honestly don't really know what people mean when they talk about "12V rails" and such.
Should be fine; Seasonic doesn't really do anything weird with rails, and tends to underrate them in any case. What CPU are you running with it again?
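For the PSU question, here's a back-of-the-envelope power budget. The GPU TDPs (180 W for the 1080, 250 W for the 1080 Ti) are Nvidia's official figures; the ~95 W CPU and the 75 W allowance for board, drives, fans, and RAM are assumed round numbers, not measurements:

```python
# Napkin-math PSU headroom check for a 520 W Seasonic.
PSU_WATTS = 520

def system_draw(gpu_tdp, cpu_tdp=95, other=75):
    """Estimate worst-case system draw: GPU TDP + CPU TDP + a rough
    allowance for motherboard, drives, fans, and RAM ('other')."""
    return gpu_tdp + cpu_tdp + other

for name, tdp in [("GTX 1080", 180), ("GTX 1080 Ti", 250)]:
    draw = system_draw(tdp)
    print(f"{name}: ~{draw} W estimated, {PSU_WATTS - draw} W headroom")
```

Even the 1080 Ti build comes out around 420 W estimated worst-case, leaving ~100 W of headroom on the 520 W unit, which squares with the "should be fine" answer above.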