Building my first PC soon, please tell me if this build is good
Hey guys, so I'm going to be getting a new PC in about a month or so, and I was wondering if you could tell me how my build is. My buddy gave me the link and says I should get it. It looks good to me, but I'm not too sure. I don't need it to run everything on high to very high; I just want a decent gaming rig for my first PC that can run most games at medium settings and up. My budget is 800 dollars. Anyway, here's the build: [URL="http://pcpartpicker.com/p/JHYt99"]http://pcpartpicker.com/p/JHYt99[/URL] Thanks.
Looks pretty good to me, although I'd recommend changing the memory to these modules instead: [url]http://pcpartpicker.com/part/gskill-memory-f312800cl9d8gbrl[/url] I'd also recommend adding an SSD: [url]http://pcpartpicker.com/part/samsung-internal-hard-drive-mz7te120bw[/url]
You're missing a CPU cooler, unless you're not planning on overclocking and will just use the stock cooler, in which case you don't need a 4690K; you could just get a plain old 4690 and save a little money.
In that case, I'd say he's better off getting an i5 4440 instead and changing the 280X into a 290X. [editline]28th November 2014[/editline] [IMG]http://i.imgur.com/OEn5h2u.png[/IMG] [url]http://pcpartpicker.com/p/kK28XL[/url] Pure gaming performance, yo.
Okay, thanks so much guys, I really appreciate it.
Okay, here we go: [img]http://i.imgur.com/sWfFZQJ.png[/img] You're welcome. This is future-proof: Broadwell is coming next year, mate, the board is compatible with Broadwell (no Skylake though), and you can pick up the sweet i5-5000Ks ;) Also, the Pentium K is unlocked; push that baby to 4.5GHz easily, no hassle. The Pro3 mobo is better than the Pro4 because it's cheaper, and the GTX 970 is superior to the 280X every day of the week (newer architecture). Any questions, let me know. [editline]28th November 2014[/editline] If you really want the slow-mo storage, add a 500GB HDD for backups and whatnot; I manage perfectly fine with my 840 EVO 250.
[QUOTE=Xmeagol;46594883]Okay here we go You're welcome. This is future proof, broadwell is coming next year mate, the board is compatible with broadwell (no skylake tho) and you pick up the sweet i5-5000k's ;) also pentium K is unlocked, push that baby to 4.5GHz easily no hassle, mobo pro3 better than pro4 because it's cheaper gtx970 superior to 280X every day of the week, newer architecture any questions let me know [editline]28th November 2014[/editline] if you really want the slowmo storage, add a 500GB HDD for backups and whatnot, i manage perfectly fine my 840 EVO 250[/QUOTE] I really don't think a ~5% performance increase is worth over 200 dollars more. On top of that, he'll have to use a crippled Intel Pentium system for six months first.
lmao, crippled? 5%? This build is way faster than everyone else's. [editline]28th November 2014[/editline] Let's start with the SSD: I suggest using a 250GB EVO for the system and games only, with enough left from the budget to get snail storage. Make no mistake, games that use high-resolution textures benefit more from SSDs.

Second, the motherboard: the ASRock Pro3 has been great value/performance ever since Z68. The Pro4 is for people that want to do RAID arrays, get more PCI slots, etc. He doesn't.

Third, the GPU: don't know about you, but at 1080p the GTX 970 will completely wreck the high-end 290X, using LESS power and having lower load temperatures. Lower load temperatures mean lower airflow temperature, which leads to my fourth point.

The CPU: I'm assuming he can save 200 dollars in the next 6 months to get a CPU; he can even sell the Pentium K for 50 bucks and save some dosh. Do you know how computers work? Do you know that the higher the resolution, the less CPU-dependent you are? Also, did you know NVIDIA cards are more MT-efficient than AMD cards? I love AMD, don't get me wrong, but the facts are the facts, hate to bite the bullet. Google the benchmarks, go up to 4K gaming or whatever, and see the 4.5GHz PK getting 2 fewer FPS than a Core i7. This also gets him to learn how UEFI works and whatever, so when he gets a Broadwell, if he wants to, he's ready to have the time of his life with an i5/i7.

Personally, I would save more money, wait for the first Z170 motherboards, and buy the Skylake i5 K equivalent. Broadwell is having bad yields, which is why it's been delayed for so long, the reason being that the 14nm node is unstable as fuck (14nm and leaky electrons), and if Intel is having trouble, I can't imagine Samsung, GlobalFoundries, TSMC, etc. Hope you have learned a few things today.
[QUOTE=Xmeagol;46595682]i suggest using a 250GB EVO for system and games only, enough left from budget to get snail storage. make no mistake, games that use high resolution textures benefit more from SSD's[/QUOTE] Maybe in load times? Once the textures are in VRAM, disk storage has no purpose. You sacrifice storage capacity for a few extra seconds of load time. [QUOTE=Xmeagol;46595682]third, gpu, don't know about you, but at 1080p the gtx 970 will completely wreck the high end 290X, using LESS power and having less load temperatures, less load temperatures means lower airflow temperature, which leads to my fourth point[/QUOTE] Wreck? Lol. [img]http://blogs-images.forbes.com/antonyleather/files/2014/11/r9-390X-performance.jpg[/img] [img]http://blogs-images.forbes.com/antonyleather/files/2014/11/r9-390X-power-consumption.png[/img] The R9 290X performs better than the GTX 970. And while the R9 290X uses 44% more power, it isn't castrated like the GTX 970 is regarding power. When the GTX 970 is loaded down, you start losing performance because the core clock starts dropping to maintain its ridiculously low power envelope. If the card weren't artificially capped, it would probably draw nearly the same amount of power as the 290X. [QUOTE=Xmeagol;46595682]cpu, i'm assuming he can save 200 dollars in the next 6 months to get a cpu, he can even sell the pentium k for 50 bucks and save some dosh. do you know how computers work? do you know that the higher resolution the less you are cpu dependant?[/QUOTE] Wat. Do you even know how game engines work? Rendering the world takes more than pushing pixels; if you have a garbage CPU paired with an over-the-top GPU, performance is going to suffer in engines that have lots of AI, physics, scripting, etc. going on in the background. [QUOTE=Xmeagol;46595682]hope you have learned a few things today[/QUOTE] That you still have no idea what you're talking about? You don't need to continually remind us.
Thanks for doing the work I was too lazy to, GiGaBiTe. And I'm definitely waiting on picking up a 970/980 until I can see what some custom BIOSes can do with those artificial caps.
[editline]28th November 2014[/editline] [quote] don't know about you, but at 1080p the gtx 970 will completely wreck the high end 290X [/quote] No, the GPUs of the 970 and the 290X are pretty much equal raw-performance-wise. However, the 290X is faster in more VRAM-taxing applications, as it uses a 512-bit memory bus while the 970 uses a 256-bit one (which also turns its 4GB of VRAM into a joke). As for which will "wreck" which at 1920x1080, that depends entirely on the application. The 290X will be slightly faster in some, the 970 slightly faster in others. [quote] cpu, i'm assuming he can save 200 dollars in the next 6 months to get a cpu, he can even sell the pentium k for 50 bucks and save some dosh. [/quote] Which still means he will have wasted ~100 dollars on something that won't give him a significant performance boost, while also having to deal with a slow Pentium for half a year. Also, why would anyone buy a used Intel Pentium for 50 dollars when they can get a new one for 60? [quote] do you know how computers work? do you know that the higher resolution the less you are cpu dependant? [/quote] Yes? What does that have to do with anything? The Pentium will bottleneck him massively at 1080p, and it will also massively bottleneck him in more CPU-heavy titles, 4K or not.
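(For reference on the bus-width point above: peak theoretical bandwidth is just bus width times effective memory transfer rate. A quick sketch using the stock reference specs of both cards; the function name is mine, not from the thread.)

```python
# Peak theoretical memory bandwidth: bus width (bits) / 8 * effective rate (GT/s)

def bandwidth_gbs(bus_bits, effective_gtps):
    """Peak memory bandwidth in GB/s for a GDDR5 card."""
    return bus_bits / 8 * effective_gtps

r9_290x = bandwidth_gbs(512, 5.0)  # 512-bit bus, 5 GT/s effective -> 320 GB/s
gtx_970 = bandwidth_gbs(256, 7.0)  # 256-bit bus, 7 GT/s effective -> 224 GB/s

print(f"R9 290X: {r9_290x:.0f} GB/s, GTX 970: {gtx_970:.0f} GB/s")
```

So the 290X has roughly 43% more raw memory bandwidth, which is why it tends to pull ahead in VRAM-heavy scenarios even though average frame rates are close.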
[QUOTE=Levelog;46596250]Thanks for doing to work I was too lazy to Gigabite. And I'm definitely waiting on picking up a 970/980 until I can see what some custom bios's can do with those artificial caps.[/QUOTE] That all depends on whether the card manufacturers actually put VRMs on the PCB that can stand the additional load without burning/exploding. I've seen more than a few burned cards from people trying to extract more power than the card could deliver. I've also seen a select few with holes blown straight through the PCB where the VRMs used to be.
[QUOTE=GiGaBiTe;46596383]That all depends on if the card manufacturers are actually put VRMs on the PCB that can stand the additional load without burning/exploding. I've seen more than a few burned cards from people trying to extract more power than the card could deliver. I've also seen a select few with holes blown straight through the PCB where the VRMs used to be.[/QUOTE] Yeah, we'll see. I'll have to take a close look at the different models in the spring to see which does best with a custom BIOS. Right now I'm between the MSI and the EVGA. They just need to release a wider variety of waterblocks.
It will not bottleneck anything; I have tested this fucking build because I have it, LOL. GiGaBiTe, performance per watt is more important than performance per dollar.
[QUOTE=Xmeagol;46596644]It will not bottleneck anything i have testing this fucking build because i have it LOL [/QUOTE] Then you are very bad at detecting bottlenecks, I suppose. [QUOTE=Xmeagol;46596644] gigabyte, performance per watt is more important than performance per dollar[/QUOTE] Uhh, that's entirely subjective. Also, are you saying you'd rather pay 500 dollars for graphics card X that consumes 100W than 250 dollars for card Y that performs like card X but consumes 200W?
Remember, rix, those 100 watts you save from playing games with top-end hardware may be used for an fMRI somewhere else. Still, the build I suggested has visible bottlenecks at very low resolutions, of course; the lower you go, the more it's going to draw on CPU resources. I'm just talking from experience; I'm having a great time with those specs. But there's still the problem of top-dog game devs slacking on PC ports, and that's not the hardware's fault, now is it?!
Performance per dollar is far more important. Electricity is cheap. Performance is not.
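(To put rough numbers on "electricity is cheap": a quick sketch of what a 100 W load-power difference actually costs per year. The 4 hours/day of gaming and the $0.12/kWh rate are my assumptions for illustration, not figures from this thread.)

```python
# Yearly cost of a 100 W power-draw difference between two cards,
# under assumed usage and an assumed electricity price.

extra_watts = 100       # e.g. one card drawing ~100 W more under load
hours_per_day = 4       # assumed daily gaming time
price_per_kwh = 0.12    # assumed electricity rate in USD/kWh

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
# -> 146 kWh/year -> $17.52/year
```

Under those assumptions the power gap is worth well under $20 a year, so a card that costs $100+ less up front wins on total cost for years, which is the point being made here.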
[QUOTE=Xmeagol;46596722] still, the build i suggested has visible bottlenecks at very low resolutions, ofc, the lower you go the more it's gonna draw cpu resources. [/QUOTE] As we've already said, the stress on the CPU obviously doesn't just depend on how frequently it has to feed the GPU with new data. Play a CPU-taxing game such as Battlefield 4, ARMA, etc. at 4K using a GTX 980 and an Intel Pentium, and see how well it runs. And I have no idea how you manage to have a "great" experience with that processor; I can only assume that either your standards are extremely low, or you're suffering from a bad case of buyer's remorse. I used to have an AMD Phenom II X4 955 + GTX 770 system. Even that is an obvious bottleneck. And guess what? When I upgraded to my current i5 4690, I got a 40-50 FPS boost in pretty much everything I played. I can't even begin to imagine how much a fucking Intel Pentium must bottleneck a GTX 970. [editline]29th November 2014[/editline] [QUOTE=Levelog;46596789]Performance per dollar is far more important. Electricity is cheap. Performance is not.[/QUOTE] I suppose it depends on how highly you value low temperatures and silence. Also, Xmeagol, sorry for being blunt, but if you're seriously that concerned about saving energy, then you probably shouldn't enter threads just to spew stupid bullshit.
[QUOTE=Xmeagol;46596722]remember rix, those 100 watts you save from playinjg games with top end hardware may be used for an fMRI somewhere else[/QUOTE] Except your argument falls flat here: the R9 290X and GTX 970 have similar performance. [QUOTE=Xmeagol;46596722]still, the build i suggested has visible bottlenecks at very low resolutions, ofc, the lower you go the more it's gonna draw cpu resources.[/QUOTE] [Citation Needed] [QUOTE=Xmeagol;46596722]i'm just talking from experience, i'm having a great time with those specs. But there's still the problem of top dog game devs slacking out on pc ports, that's not the hardwares fault now is it?![/QUOTE] Crappy PC ports are no reason to build a PC out of completely mismatched hardware.
[QUOTE=GiGaBiTe;46596819] [Citation Needed] [/QUOTE] Nah, he's actually right about that. The faster the GPU works, the more stress it puts on the CPU. Look at this: [IMG]http://www.sweclockers.com/image/diagram/2974?k=0aaa4463419779d12136ccc32f723d73[/IMG] (And before you whine about "BUT LOOK, THEY'RE BASICALLY THE SAME!!!": yes, but this is during a pre-scripted singleplayer "cutscene", more or less.) And then take a look at this: [IMG]http://www.sweclockers.com/image/diagram/2973?k=b24988f6ea404ae3629e541a541bfb67[/IMG] Now there are clear differences. (I just noticed these pictures are worthless in this case, as they don't show the CPU/GPU utilization ratio.)
[QUOTE=Rixxz2;46596865]Nah, he's actually right about that. The faster the GPU works, the more stressful it becomes for the CPU.[/QUOTE] Except he was arguing that the lower the resolution the more stress the CPU is under.
[QUOTE=GiGaBiTe;46597016]Except he was arguing that the lower the resolution the more stress the CPU is under.[/QUOTE] Yeah, because it then renders frames faster, yelling at the CPU to feed it data more frequently.
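(The whole bottleneck back-and-forth boils down to a simple model: each frame costs the CPU some fixed amount of work and the GPU an amount that scales with resolution, and whichever takes longer sets the frame rate. A toy sketch; all the millisecond figures below are invented for illustration, not benchmarks.)

```python
# Toy bottleneck model: frame rate is limited by whichever of CPU or GPU
# takes longer per frame (assuming no overlap between the two).

def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower component gates each frame."""
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 12.0  # hypothetical per-frame CPU cost (AI, physics, draw calls)

# Hypothetical GPU per-frame costs, shrinking as resolution drops.
for label, gpu_ms in [("4K", 30.0), ("1440p", 14.0), ("1080p", 8.0)]:
    limiter = "GPU" if gpu_ms > cpu_ms else "CPU"
    print(f"{label}: {fps(cpu_ms, gpu_ms):.0f} FPS ({limiter}-bound)")
```

In this sketch, dropping from 4K to 1080p shrinks the GPU's share until the fixed CPU cost becomes the ceiling, which is why a weak CPU shows up as a hard FPS cap at low resolutions but also bites at any resolution in CPU-heavy games.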