• CIPWTTKT&GC V44 - Vega Appreciation Station
I want a 1080 Ti for the new EVGA ICX coolers. But honestly I don't need a 1080 TI right now. I play like 90% overwatch.
I'll probably pick up a Vega 64 and a water block for it at some point but I'm in no rush.
[QUOTE=Cyberuben;52571722]Maybe I'm completely in the wrong place to ask this question, but I guess the people here are the most experienced. Let's say I have a project that has 3 different servers running the same stateless API server. The type of executable shouldn't matter. There are tools like GitLab CI, but that doesn't seem to serve the purpose of deploying 1 project to multiple servers. When searching for information on this subject, I find that people recommend hard-coding all the servers in a .gitlab-ci.yml file, which is something I don't want to do. How do you guys handle deployment of this kind of application? The programming section of FP doesn't seem to be the right place imo, as it's mostly games, which don't get deployed in such a way.[/QUOTE] I'm unsure if I understand the question correctly, but are you asking for a way to simply deploy a service to multiple servers? If so, there are multiple ways; one of the simplest and most popular today is Ansible. If you need something that constantly keeps the server state consistent, then I'd recommend e.g. Puppet or SaltStack. If you're looking for something to integrate with your development workflow (a.k.a. CI/CD), then you have Jenkins.
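To give a rough idea of what the Ansible route looks like: the server list lives in an inventory file rather than being hard-coded in .gitlab-ci.yml, and a small playbook copies the build to every host and restarts the service. Just a sketch - the hostnames, paths and service name below are made up, and it assumes your API runs as a systemd service:
[code]
# inventory.ini - the three servers live here, not in .gitlab-ci.yml
[api_servers]
api1.example.com
api2.example.com
api3.example.com

# deploy.yml - run with: ansible-playbook -i inventory.ini deploy.yml
- hosts: api_servers
  become: yes
  tasks:
    - name: Copy the freshly built API binary to each host
      copy:
        src: build/api-server
        dest: /opt/api-server/api-server
        mode: '0755'

    - name: Restart the service so the new build goes live
      systemd:
        name: api-server
        state: restarted
[/code]
Since the API is stateless you can also add "serial: 1" to the play to roll it out one host at a time instead of taking all three down at once.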
[QUOTE=Atlascore;52572604]Vega isn't bad, it's just late.[/QUOTE] Well. And the fact that apparently only 5 cards of each type were manufactured, given how everywhere sold out in a few milliseconds.
Article on 1070 vs Vega 56: "but it’s worth remembering that the GTX 1070 still costs at least $450" IT DIDN'T UNTIL CRYPTOCURRENCY ASSHOLES DROVE PRICES UP Still butt-mad I didn't pull the trigger on a 1070 Mini for $300 or a regular 1070 for $350 when I had the chance. [editline].[/editline] Like ffs I might as well keep using my GTX 1060 3GB until Nvidia's next generation at this point.
[QUOTE=Dr. Evilcop;52572656]Article on 1070 vs Vega 56: "but it’s worth remembering that the GTX 1070 still costs at least $450" IT DIDN'T UNTIL CRYPTOCURRENCY ASSHOLES DROVE PRICES UP Still butt-mad I didn't pull the trigger on a 1070 Mini for $300 or a regular 1070 for $350 when I had the chance. [editline].[/editline] Like ffs I might as well keep using my GTX 1060 3GB until Nvidia's next generation at this point.[/QUOTE] My old GTX 1070 (RIP) cost me $399. My friend's 1070 in a build I did for them was $315 open-box. When my 1070 died the cheapest 1070 was $529 at Microcenter, so I got an open-box 1080 for $539... Fuck miners, mang. I hope AMD's Vega bundle plan works out for new builders trying to get a reasonably priced card, though. Vega is really compelling even with its huge power draw, since at worst it's only down about 5% from the 1080, and we haven't even seen good coolers on these cards yet.
[QUOTE=Dr. Evilcop;52572656]Article on 1070 vs Vega 56: "but it’s worth remembering that the GTX 1070 still costs at least $450" IT DIDN'T UNTIL CRYPTOCURRENCY ASSHOLES DROVE PRICES UP Still butt-mad I didn't pull the trigger on a 1070 Mini for $300 or a regular 1070 for $350 when I had the chance. [editline].[/editline] Like ffs I might as well keep using my GTX 1060 3GB until Nvidia's next generation at this point.[/QUOTE] tbh, I gotta wonder if Vega's price is inflated to match the price boom that Eth brought on. Part of me just isn't expecting prices to come down, and expects vendors to use this as a way to just straight-up increase profit.
[QUOTE=Brt5470;52572452]I want a 1080 Ti for the new EVGA ICX coolers. But honestly I don't need a 1080 TI right now. I play like 90% overwatch.[/QUOTE] You out of anyone I would think could use the extra CUDA cores for video rendering. Plus a 1080ti means that you can run OW at higher than 144 fps :v: [editline]14th August 2017[/editline] You could actually make use of overclocking your monitor to 165 Hz too with it.
[QUOTE=wingless;52572265]I think Vega looks pretty alright. It's decently performant without being more expensive. However I will say, the Water 64 has gotta be the worst of the bunch. In terms of price, it's only a small fraction cheaper than a 1080 Ti but the Ti pretty much outclasses it, and you don't have to deal with the extra pain and organisation of a built-in water loop. Air 64 looks pretty okay, comparable to a 1080 at a comparable price point, with the opportunity to be cheaper with a little bit of time. I think with discounts, Air 64 will look pretty nice. 56 also looks alright. Competitive price, competitive performance. I see no issue with that. The only catch with Vega as a whole is that it runs hot and draws a good chunk of power. But, well, Nvidia did too not to long ago, then they invested heavily into power-efficiency and it paid off. I suspect AMD will soon too, depending on how their income goes. Overall I really don't see how it's [B]bad[/B] at all. It's just not the best. No problem with that.[/QUOTE] Next gen will probably focus on optimizing it, now that they finally have a promising base architecture for their GPU's
[QUOTE=wingless;52572619]Well. And the fact that apparently only 5 cards of each type were manufactured, given how everywhere sold out in a few milliseconds.[/QUOTE] This happens all the time though. Pretty much exactly like the 970 release.
[QUOTE=wingless;52572265]I think Vega looks pretty alright. It's decently performant without being more expensive. However I will say, the Water 64 has gotta be the worst of the bunch. In terms of price, it's only a small fraction cheaper than a 1080 Ti but the Ti pretty much outclasses it, and you don't have to deal with the extra pain and organisation of a built-in water loop. Air 64 looks pretty okay, comparable to a 1080 at a comparable price point, with the opportunity to be cheaper with a little bit of time. I think with discounts, Air 64 will look pretty nice. 56 also looks alright. Competitive price, competitive performance. I see no issue with that. The only catch with Vega as a whole is that it runs hot and draws a good chunk of power. But, well, Nvidia did too not to long ago, then they invested heavily into power-efficiency and it paid off. I suspect AMD will soon too, depending on how their income goes. Overall I really don't see how it's [b]bad[/b] at all. It's just not the best. No problem with that.[/QUOTE] It's bad because AMD isn't very good at moving cards in the first place, and this is pretty much as late and as bad as AMD has been in the time I've followed the PC GPU space. I don't see who AMD is gonna sell these to beyond coin miners.
I think people are sleeping on the fact that with vega you can also get an adaptive refresh monitor for cheaper than you would be able to with nvidia. Also was there an information source on the new rendering pipeline amd set up? it sounds interesting but I haven't been able to find anything on it.
[QUOTE=WrathOfCat;52573080]I think people are sleeping on the fact that with vega you can also get an adaptive refresh monitor for cheaper than you would be able to with nvidia.[/QUOTE] Sure, that's nice if you want a new monitor - but most people already have a monitor that they may or may not want to upgrade. FreeSync monitors are still quite expensive. This is a selling point, but it targets a smaller market than everyone who just needs a GPU. You also have to weigh the money you save against the performance you lose and the noise you gain.
I probably won't be buying any more monitors until my current ones die, so FreeSync/G-Sync aren't really a factor for me - and probably many others.
[QUOTE=BackSapper;52572722]You out of anyone I would think could use the extra CUDA cores for video rendering. Plus a 1080ti means that you can run OW at higher than 144 fps :v: [editline]14th August 2017[/editline] You could actually make use of overclocking your monitor to 165 Hz too with it.[/QUOTE] 980 Ti already runs my monitor at 1440p165 just fine. And I can do color grading on 4k30 10bit images in Premiere at 100% scale already and use maybe 35-40% of the GPU. People overestimate how much GPU power they need for editing. Believe it or not, I don't need to upgrade :v:
[QUOTE=WrathOfCat;52573080]I think people are sleeping on the fact that with vega you can also get an adaptive refresh monitor for cheaper than you would be able to with nvidia. Also was there an information source on the new rendering pipeline amd set up? it sounds interesting but I haven't been able to find anything on it.[/QUOTE] I would consider Vega if Blender had better AMD support. Or I'd consider the Vega 56 if it was sub-$350, but neither of these seem to be happening.
What will a 5450 get me? It's AMD.
[QUOTE=Kiwi;52573371]It's 1100 for Vega 64 in NZ currently. Wtb Vega 64. Will suck dick for cash.[/QUOTE] $5, gotta swallow
5450 is far more than 64
[QUOTE=Brt5470;52572452]I want a 1080 Ti for the new EVGA ICX coolers. But honestly I don't need a 1080 TI right now. I play like 90% overwatch.[/QUOTE] Neither do I; I got the EVGA GeForce GTX 1080 Ti FTW3 GAMING and all I play is GTA V, and I just recently got PUBG (who wants to play?). [QUOTE=Brt5470;52573253]980 Ti already runs my monitor at 1440p165 just fine. And I can do color grading on 4k30 10bit images in Premiere at 100% scale already and use maybe 35-40% of the GPU. People overestimate how much GPU power they need for editing. Believe it or not, I don't need to upgrade :v:[/QUOTE] Nice. I'm sadly running everything at 60 Hz because none of my 4 monitors support anything higher :frown: But I now have one of those 34-inch Dell ultrawides! Video editing is so much easier with more screen space.
[QUOTE=Brt5470;52573253]980 Ti already runs my monitor at 1440p165 just fine. And I can do color grading on 4k30 10bit images in Premiere at 100% scale already and use maybe 35-40% of the GPU. People overestimate how much GPU power they need for editing. Believe it or not, I don't need to upgrade :v:[/QUOTE] Also my FPS problems in OW are down to Ryzen + Slow ram. GPU tanks to 50-60% usage during big fights.
[QUOTE=Anderen2;52572526]I'm unsure if I understand the question correctly, but are you asking for a way to simply deploy a service to multiple servers? If so, there are multiple ways; one of the simplest and most popular today is Ansible. If you need something that constantly keeps the server state consistent, then I'd recommend e.g. Puppet or SaltStack. If you're looking for something to integrate with your development workflow (a.k.a. CI/CD), then you have Jenkins.[/QUOTE] It seems like this is actually what I need, thanks. Though, since Ansible is really expensive, I've looked into alternatives and saw Puppet, SaltStack and Chef come up. I'm not sure which exactly I'd need, but this is definitely a push in the right direction!
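If I go the Ansible route, I'm guessing the GitLab CI side just becomes a deploy job that calls the playbook, so the host list stays in the inventory instead of in .gitlab-ci.yml. Something along these lines is what I have in mind (untested, file names are just placeholders):
[code]
# .gitlab-ci.yml - only the deploy part; the servers themselves live in Ansible's inventory
stages:
  - deploy

deploy_api:
  stage: deploy
  script:
    - ansible-playbook -i inventory.ini deploy.yml
  only:
    - master
[/code]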
Any of you guys know any good file server auditing software? Something to keep logs of which users access which files. Also a program that can run client-side and log what they transfer to USB devices and the like.
[QUOTE=Van-man;52572736]Next gen will probably focus on optimizing it, now that they finally have a promising base architecture for their GPU's[/QUOTE] AMD RTG / their GPU division has a rough road ahead. Clearly GCN hasn't paid off for gaming; for quite a few generations the only things that have saved them have been raw compute horsepower and Nvidia bungling something. They've had years and years to get their drivers into good condition, yet GCN still underperforms Nvidia in most graphics situations. I think they should reconsider what they're doing with it, maybe make a new architecture, learn from GCN's strengths and weaknesses, and look at how Nvidia has pulled off their performance per watt.
[QUOTE=glitchvid;52573643]AMD RTG / their GPU division has a rough road ahead. Clearly GCN hasn't paid off for gaming; for quite a few generations the only things that have saved them have been raw compute horsepower and Nvidia bungling something. They've had years and years to get their drivers into good condition, yet GCN still underperforms Nvidia in most graphics situations. I think they should reconsider what they're doing with it, maybe make a new architecture, learn from GCN's strengths and weaknesses, and look at how Nvidia has pulled off their performance per watt.[/QUOTE] They're actually starting to show that they know what to do, and what NOT to do, on top of finally getting their architecture figured out. A do-over now just because they're a bit more power-hungry is foolish when it's clear it's mostly a problem of optimization. They're not doing an Intel and pulling a Pentium 4, that's for sure.
[QUOTE=Van-man;52573733]They're actually starting to show that they know what to do, and what NOT to do, on top of finally getting their architecture figured out. A do-over now just because they're a bit more power-hungry is foolish when it's clear it's mostly a problem of optimization. They're not doing an Intel and pulling a Pentium 4, that's for sure.[/QUOTE] GCN is still largely the same architecture as it was years ago; Vega does have some of the first large-scale changes, but it clearly hasn't worked out so well (e.g. Fury X clock-for-clock performance vs. RX Vega 64). They've had years to manage GCN and get it performing in line with its raw compute potential.
[QUOTE=Atlascore;52573868]I think AMD made a huge mistake by adopting HBM tech so early.[/QUOTE] According to Buildzoid (I think) a big reason was that AMD's memory controllers were already really large in die size, and would basically have to double in size in order to accommodate GDDR5X at the same bandwidth as HBM or whatever, putting the memory controller power draw alone at something like 70W. I can't tell you how true that is, but yeah, it definitely doesn't feel like it's been paying off for them.
[QUOTE=Dr. Evilcop;52573292]I would consider Vega if Blender had better AMD support. Or I'd consider the Vega 56 if it was sub-$350, but neither of these seem to be happening.[/QUOTE] iirc OpenCL has been at feature parity for a while now, and as of recently it's supposedly at performance parity too. [URL="https://wiki.blender.org/index.php/Dev:Source/Render/Cycles/OpenCL"]https://wiki.blender.org/index.php/Dev:Source/Render/Cycles/OpenCL[/URL] [editline]14th August 2017[/editline] Also there's that renderer AMD was making for Blender.
[QUOTE=BackSapper;52572722]You out of anyone I would think could use the extra CUDA cores for video rendering. Plus a 1080ti means that you can run OW at higher than 144 fps :v: [editline]14th August 2017[/editline] You could actually make use of overclocking your monitor to 165 Hz too with it.[/QUOTE] The CUDA cores don't really help with exporting videos that much; it's still heavily on the CPU. However, effects and other shit are accelerated with the GPU. I don't know how much exactly, but it seems to help a little; if you're exporting a video with literally no graphics of any kind, it doesn't seem to help much at all. Then again, this all really depends on the software you're using and how optimized it actually is.
[QUOTE=Cyberuben;52573612]it seems like this is actually what I need, thanks. Though, since Ansible is really expensive, I've looked into alternatives, and saw Puppet, Saltstack and Chef come up. I'm not sure which exactly I'd need, but this is definitely a push in the right direction![/QUOTE] Expensive? It's free if you do not need Ansible Tower (a.k.a. Ansible's "answer" to Puppet's Foreman) or support from Red Hat :) [editline]15th August 2017[/editline] [QUOTE=Levelog;52573621]Any of you guys know any good file server auditing software? Something to keep logs of which users access which files. Also a program that can run client-side and log what they transfer to USB devices and the like.[/QUOTE] Something like Tripwire? Windows or Linux? And what kind of file server?