The "Quick Questions That Don't Deserve A Thread"...Thread. v5
5,001 replies
[QUOTE=Van-man;47887272]if the power lines in the house are split across different phases (some homes are supplied with three phases and a neutral, with the phases isolated from each other and 120 degrees out of phase with each other) then those ethernet-over-powerline adapters don't work.
it needs to be on the same phase line.[/QUOTE]Over here in the US most residential homes are single phase; it's commercial buildings that are 3 phase. Seems like it's different in Europe though, I don't know anything about how they do their electrical
[QUOTE=Glitch360;47887322]Over here in the US most residential homes are single phase, it's commercial buildings that are 3 phase. Seems like its different in Europe though, don't know anything about how they do their electrical[/QUOTE]
Our house is a dinky one from the early '60s, yet it had 3 phase power when it was built.
It's nice to be able to use industrial equipment at home.
Weird. Over here it's mostly fancy houses that have 3 phase systems, although a few places do have 3 phases for apartments or farm buildings and the like
Repost from CIP
Anyone know a workaround for this error?
[t]http://i.imgur.com/FP8S2LA.png[/t]
[QUOTE=Uberpro;47887163]However that was at 2000-3000 fps.[/QUOTE]
Arguably, the biggest culprit behind coil whine is an insanely high FPS count. It doesn't matter if the card is flawless in every way - at thousands of frames per second the power-delivery inductors get switched hard at audible frequencies, so some whine is inevitable given how these components work. I can almost guarantee you that enabling v-sync in your games will reduce it significantly.
[QUOTE=Glitch360;47887322]Over here in the US most residential homes are single phase, it's commercial buildings that are 3 phase. Seems like its different in Europe though, don't know anything about how they do their electrical[/QUOTE]
US homes are technically split-phase: a single phase center-tapped at the transformer into two 120V legs and a neutral. The legs are combined for 240V heavy-duty use like electric ranges and dryers. But really it's still single phase.
Is there any way for me to plug my laptop (has Windows installed) into my PC (which hasn't got Windows installed) and boot a file from my laptop on my PC?
Basically I wanna install Windows, but I don't want to shell out 15 quid on a pack of 4.7GB DVDs when I only need one
[QUOTE=loopoo;47892725]Is there any way for me to plug my laptop (has Windows installed) into my PC (which hasn't got Windows installed) and boot a file from my laptop on my PC?
Basically I wanna install Windows, but I don't want to have to shell out 15 quid on 4.7GB DVD's, especially when I only need one[/QUOTE]
Yes, but there are probably better options.
Most computers can boot off USB drives - grab one of those if you don't have one already, and once you're done installing you can use it for other things.
If you want to boot from another computer on the network, the way to do it is called PXE. It's usually used for thin clients and big corporate servers but it can be used for what you want. [url=http://www.rmprepusb.com/tutorials/serva]This tutorial[/url] looks decent, but I've never messed with network booting myself.
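For anyone curious what's actually happening under the hood: the PXE firmware DHCPs for a boot server address, then fetches the boot file over TFTP. A rough sketch of the TFTP packet formats from RFC 1350 - illustrative only, a tool like Serva handles all of this for you, and "pxelinux.0" is just a typical boot filename used as an example:

```python
import struct

# TFTP opcodes (RFC 1350)
RRQ, DATA, ACK = 1, 3, 4

def build_rrq(filename, mode="octet"):
    """Read-request packet: opcode 1, then filename and mode as NUL-terminated strings."""
    return struct.pack("!H", RRQ) + filename.encode() + b"\x00" + mode.encode() + b"\x00"

def build_data(block_num, payload):
    """DATA packet: opcode 3, a 16-bit block number, then up to 512 bytes of payload.
    A payload shorter than 512 bytes marks the final block of the transfer."""
    return struct.pack("!HH", DATA, block_num) + payload

def parse_ack(packet):
    """ACK packet: opcode 4, 16-bit block number. Returns the acknowledged block."""
    opcode, block = struct.unpack("!HH", packet[:4])
    assert opcode == ACK
    return block

# A PXE client requesting its boot file sends something like:
req = build_rrq("pxelinux.0")
```

The server then answers with DATA blocks, each of which the client ACKs - which is why TFTP is simple enough to fit in a NIC's boot ROM, and also why it's slow.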
Thanks a bunch, I guess I'll just go get a USB drive. More use than a pack of CDs I won't even need after this.
Network booting is great when it's done right. On a contract upgrading a hospital's systems to 7, I could shit out upwards of 60 upgraded and configured systems a day.
I dunno if this is the same thing as you guys are talking about, but my dad once helped me hook up two computers via Ethernet to transfer files between them and I got ridiculous transfer speeds. Is that sorta along the same lines as what you guys are talking about?
I figure the easiest option is to just use a USB drive, so I'm hoping it works out well. PC has been sitting unused for almost two weeks now since my hard drive broke.
[QUOTE=loopoo;47893149]I dunno if this is the same thing as you guys are talking about, but my dad once helped me hook up two computers via Ethernet to transfer files between them and I got ridiculous transfer speeds. Is that sorta along the same lines as what you guys are talking about?
I figure the easiest option is to just use a USB drive, so I'm hoping it works out well. PC has been sitting unused for almost two weeks now since my hard drive broke.[/QUOTE]
Your PC boots into a network boot mode where it waits for a server (your laptop) and boots over that just using Ethernet.
But yes, USB is definitely the easier solution.
[QUOTE=loopoo;47893149]I dunno if this is the same thing as you guys are talking about, but my dad once helped me hook up two computers via Ethernet to transfer files between them and I got ridiculous transfer speeds. Is that sorta along the same lines as what you guys are talking about?
I figure the easiest option is to just use a USB drive, so I'm hoping it works out well. PC has been sitting unused for almost two weeks now since my hard drive broke.[/QUOTE]
Ethernet is not that fast. A fast HDD can saturate 1Gbit.
[QUOTE=the_grul;47893280]Ethernet is not that fast. A fast HDD can saturate 1Gbit.[/QUOTE]
It's a lot faster than USB 2.0 though.
[QUOTE=Demache;47893304]It's a lot faster than USB 2.0 though.[/QUOTE]
Yeah, about twice as fast - 480 Mbit/s for USB 2.0 vs 1000 for gigabit.
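The back-of-the-envelope numbers, in case anyone wants to check (these are theoretical line rates; real-world throughput is lower on both):

```python
# Theoretical line rates in Mbit/s (real throughput is lower on both)
gigabit_ethernet = 1000
usb2 = 480

print(gigabit_ethernet / usb2)  # roughly 2.08x

# Ideal-case time to move a 4.7 GB DVD image, in seconds (decimal GB)
size_mbit = 4.7 * 8 * 1000  # 4.7 GB -> megabits
print(size_mbit / gigabit_ethernet)  # ~37.6 s over gigabit
print(size_mbit / usb2)              # ~78.3 s over USB 2.0
```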
I've heard there's been some issues with the memory on the GTX 970, has that been fixed yet? I was considering getting that for a build, but if the issues aren't fixed I might get an overclocked GTX 960 instead.
[QUOTE=daigennki;47893777]I've heard there's been some issues with the memory on the GTX 970, has that been fixed yet? I was considering getting that for a build, but if the issues aren't fixed I might get an overclocked GTX 960 instead.[/QUOTE]
The issue is with the hardware design, it will not be fixed.
In practice, it's also not that important, especially if you don't game in resolutions higher than 1080p; the card still performs very well, it's just that the stated specs were sort of misleading.
[QUOTE=daigennki;47893777]I've heard there's been some issues with the memory on the GTX 970, has that been fixed yet? I was considering getting that for a build, but if the issues aren't fixed I might get an overclocked GTX 960 instead.[/QUOTE]
The issues were not actual issues - there were some complex engineering decisions made that a lot of people failed to understand but made a lot of noise about. Rest assured that the 970 is much better than the 960, even an overclocked one. There is nothing to "fix".
To explain the "issue" fully:
The biggest single factor in GPU performance is memory bandwidth. This is handled by the [I]memory controller[/I], or rather, several memory controllers, which communicate across the [I]memory bus[/I] to the actual RAM chips.
The 970 has, nominally, a 256-bit memory bus. This is quite a lot - the 980 has the same size, and the 960 has only a 128-bit memory bus.
However, due to some internal design changes (to make the card cheaper), the 970 cannot use the full memory bus at once the way the 980 can. It can instead use either a 224-bit memory channel (connected to 3.5GB of RAM) or a 32-bit memory channel (connected to 512MB of RAM). 224-bit is still pretty damn good, but the 32-bit channel is definitely not - the last Nvidia GPU to use a 32-bit memory bus was certain models of the GeForce 210, the bottom-end chip of its generation.
The GPU drivers are aware of this, so they prefer to use only the 224-bit memory channel. However, that only holds 3.5GB of memory. If a program tries to use more than 3.5GB, the GPU is forced to use that 32-bit secondary channel, which causes a steep performance drop.
Not as steep, however, as not having that channel at all. Otherwise, it would have to start storing things in main system RAM, which is even more bottlenecked than the secondary channel. And that's what happens if a program tries to use over 4GB of memory on this card, or on the 980. And every card has some breakdown point like this - for the 960, it's down at 2GB, unless you get a double-memory model.
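To put rough numbers on those channels: peak bandwidth is just bus width times per-pin data rate. The 7.0 Gb/s per-pin figure below is the reference 970's GDDR5 rate - my assumption, not something stated above:

```python
def bandwidth_gbps(bus_bits, data_rate_gbps_per_pin=7.0):
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate.
    7.0 Gb/s per pin is the reference GTX 970 GDDR5 rate (an assumption here)."""
    return bus_bits / 8 * data_rate_gbps_per_pin

print(bandwidth_gbps(256))  # full bus (980): 224.0 GB/s
print(bandwidth_gbps(224))  # fast 3.5GB segment: 196.0 GB/s
print(bandwidth_gbps(32))   # slow 512MB segment: 28.0 GB/s
```

So the fast segment keeps 87.5% of the full-bus bandwidth, while anything spilling into the slow segment gets one-seventh of that.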
So the 970 is clearly faster than the 960 in every possible condition. Furthermore, it is nearly as fast as a 980 in most cases, at a significantly lower cost.
Which is why Nvidia did this: production costs. When they're making the chips, they aren't making specifically 970 or 980 chips - at that point in production, they're just making "GM204" dies. But manufacturing defects are common when making such large chips. GM204 has sixteen "shader modules" (SMMs, containing the actual math-doing stuff) and eight "memory partitions" (MPs, each combining two memory controllers and two segments of L2 cache).
A chip that comes out of the factory with all 16 SMMs and all 8 MPs working at full speed gets put on a 980 board and sold as a 980. A chip with less working has parts disabled - three SMMs are fused off, and one MP is partially disabled - and is then sold as a 970. That partially-disabled MP is the cause of these issues: only one of its two memory controllers can be used at a time, and half its cache is gone.
If they didn't do this, they'd have to throw away those expensively-produced chips, and the cost of every GM204-based card would be higher. This is standard industry practice - AMD does it, Intel does it, and both of them honestly do it way more than Nvidia does.
Ah, now I understand. So as long as I don't play games in 4K or anything like that I should be fine with the GTX 970?
[QUOTE=daigennki;47895927]Ah, now I understand. So as long as I don't play games in 4K or anything like that I should be fine with the GTX 970?[/QUOTE]
Yep. I've been using a pair of them for The Witcher 3 on ultra, and it works excellently, maintaining 60-70 fps - just the normal occasional stutter from loading assets, plus the typical SLI stuff like microstutter.
[QUOTE=daigennki;47895927]Ah, now I understand. So as long as I don't play games in 4K or anything like that I should be fine with the GTX 970?[/QUOTE]
Yep. The 970 should be able to max out everything above 60fps at 1080p, and even 1440p/3x1080p should be fine at moderate settings.
When in doubt, always look at [url=http://www.anandtech.com/bench/product/1355]the benchmarks[/url]. Raw specs only tell you so much - benchmarks, as long as they're done by people who know what they're doing and aren't biased, are the final word on performance.
[QUOTE=gman003-main;47896040]and even 1440p/3x1080p should be fine at moderate settings.
[/QUOTE]
My one problem with surround
[t]http://7proxies.pw/i/2015/06/15-06-07_15-27-42.png[/t]
My displays are not squares
[QUOTE=Scratch.;47896315]My one problem with surround
[t]http://7proxies.pw/i/2015/06/15-06-07_15-27-42.png[/t]
My displays are not squares[/QUOTE]
5:4? They pretty much are.
[QUOTE=Levelog;47896395]5:4? They pretty much are.[/QUOTE]
[t]https://akari.in/vJqP[/t]
Or not
[QUOTE=SEKCobra;47897102]I have a very specific need for a very small tool that hooks into Windows, and since I have no idea how hooking into Windows works, I hope maybe someone else had this need and already made it.
I want to log the title of a program's window (it conveys some info, and the changes are so abrupt because it isn't functioning correctly, but I also need that info to fix it) whenever it changes.[/QUOTE]
This was posted in the GD thread about Robotboy's stuff. I think he meant to post it here but it went to the wrong tab.
Is there a reason why the Nvidia control panel only allows up to 8x AA on program settings?
[img]http://puu.sh/ifBs7/c2c3fd73e5.png[/img]
I googled it, and apparently older versions of the Nvidia drivers allowed up to 32x, but the newer ones don't unless you have SLI, which allows 16x.
On a GTX 960 2GB.
[QUOTE=Scratch.;47896417]Or not[/QUOTE]
Your problem with surround is that your displays aren't the same size.
[QUOTE=daigennki;47895927]Ah, now I understand. So as long as I don't play games in 4K or anything like that I should be fine with the GTX 970?[/QUOTE]
Even if you were playing at 4K, the GTX 970 would still largely be better than the GTX 960. Just not [I]quite[/I] as much as with less memory-intensive workloads.
[editline]7th June 2015[/editline]
[quote=SEKCobra]<window title changing stuff>[/quote]
I think those are windows messages, which you can read with a message spy such as [url=http://blog.nektra.com/main/2009/02/18/deviare-message-spy/]this[/url].
Don't know if there's any such program that will save a log though.
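A lazier alternative to a message spy, if polling is good enough: a Python sketch that samples the window title and prints it whenever it changes. GetWindowTextW and FindWindowW are real Win32 calls via ctypes; the 512-character buffer and 0.25s interval are arbitrary choices of mine, and a proper event-driven version would use SetWinEventHook instead.

```python
import sys
import time

def title_changes(samples):
    """Collapse a stream of sampled titles into just the points where it changed."""
    changes, last = [], object()  # sentinel so the first sample always counts
    for title in samples:
        if title != last:
            changes.append(title)
            last = title
    return changes

def watch(hwnd, interval=0.25):
    """Poll a window's title via the Win32 API and print each change (Windows only)."""
    import ctypes
    user32 = ctypes.windll.user32
    buf = ctypes.create_unicode_buffer(512)
    last = None
    while True:
        user32.GetWindowTextW(hwnd, buf, 512)
        if buf.value != last:
            last = buf.value
            print(time.strftime("%H:%M:%S"), last)
        time.sleep(interval)

if __name__ == "__main__" and sys.platform == "win32":
    # FindWindowW takes (class_name, window_title); pass the target's current title
    import ctypes
    hwnd = ctypes.windll.user32.FindWindowW(None, sys.argv[1])
    watch(hwnd)
```

Redirect stdout to a file and you have your log. If the title changes faster than the poll interval you'll miss updates, which is where a real hook or message spy wins.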
[QUOTE=DrTaxi;47897511]Your problem with surround is that your displays aren't the same size.
[/QUOTE]
Yeah
It's only a gripe that the driver can't handle both custom resolutions and surround.
Or it can and I'm not trying hard enough, but I'm already breaking enough as it is
[editline]7th June 2015[/editline]
Actually, while you're still online,
If I ever think about expanding what I already have,
Would it be worth it just to SLI another 970, or is there going to be a point where I should just upgrade to a higher card?
Honestly I think SLI would work, but there's a limit of how far these things go
[QUOTE=Scratch.;47897530]Yeah
It's only a gripe that the driver can't handle both custom resolutions and surround.
Or it can and I'm not trying hard enough, but I'm already breaking enough as it is
[editline]7th June 2015[/editline]
Actually, while you're still online,
If I ever think about expanding what I already have,
Would it be profitable just to SLI another 970, or is there going to be a point where I should just upgrade to a higher card.
Honestly I think SLI would work, but there's a limit of how far these things go[/QUOTE]
SLI 970's are [url=http://www.pcgamer.com/nvidia-geforce-gtx-980-ti-review/]a little faster than a single 980 Ti[/url], but (as expected) the lead shrinks at higher resolutions.
So it's about whether definitely being free from microstutter and poor SLI utilisation, plus having extra VRAM headroom for the future, is worth the extra 100€ and a few less FPS to you.
[QUOTE=ashxu;47897365]Is there a reason why the Nvidia control panel only allows up to 8x AA on program settings?
[img]http://puu.sh/ifBs7/c2c3fd73e5.png[/img]
I googled it and apparently older versions of the Nvidia drivers allowed up to 32x but not the newer ones unless you have SLI which allows 16x.
On a GTX 960 2GB.[/QUOTE]
You can force a lot of custom settings with Nvidia Inspector. Funny - I entered the thread to ask if anyone knew why Nvidia Inspector stopped saving profiles.