What do you think the next leap with PC's will be?
91 replies, posted
picometre processor dies :c00l:
Within the next couple of decades, transistors will be so small that we'll have to develop a new architecture for microprocessors. 1 nm is only a few atoms wide. You can't physically make transistors smaller than an atom without some crazy-ass quantum mechanics shit.
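For scale, here's a quick back-of-the-envelope check in Python (the ~0.22 nm silicon atom diameter is an assumed round figure, just for illustration):

```python
# Rough sanity check: how many silicon atoms fit across 1 nm?
# Assumes a silicon atom's covalent diameter is roughly 0.22 nm.
si_atom_diameter_nm = 0.22

atoms_per_nm = 1.0 / si_atom_diameter_nm
print(round(atoms_per_nm, 1))  # 4.5 -- only a handful of atoms per nanometre
```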
A computer hooked up to the brain :v:
[QUOTE=Pixel Heart;16433600]Networking evolves into Petabit Ethernet, delivering blistering speeds that won't make your Galaxy of Warcraft game lag in the slightest. Wireless lags behind slightly (As usual) with Wireless 802.11y (Yotabit speeds)[/QUOTE]
Yeah, except in the US we'll still be sending data through the damn phone lines :argh:
Edit: oh, and btw Yotta = 1,000,000,000 x Peta :v:
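A quick sanity check on the prefixes (yotta = 10^24, peta = 10^15):

```python
# SI prefixes as powers of ten
PETA = 10**15
YOTTA = 10**24

# Yotta is a billion times peta
print(YOTTA // PETA)  # 1000000000
```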
cloud computing
"waits for it"
"Hey, look, this game is completely photorealistic!"
"But on my 1080-megapixel monitor I can see flaws!"
"Because your video card sucks, it only has 64 terabytes of VRAM!"
Yeah, this will never end.
Octocores.
[QUOTE=OIFY_only;16438079]cloud computing
"waits for it"[/QUOTE]
Cloud computing is terrible. :argh:
I read somewhere that in around 5 years nobody will be using a mouse anymore, although they said nothing about what would replace it, so I don't really know how that's going to work...
[QUOTE=mattfinch;16438473]I read somewhere that in around 5 years nobody will be using a mouse anymore, although they said nothing of what would replace it so I don't really know how that's going to work....[/QUOTE]
Mouse 4 lyfe yo.
I'm not going to use anything else.
Linux to be the main OS of choice
Pff.
There won't be any computers in the future.
We'll all be using some kind of magic force and shit, dawg.
[QUOTE=OIFY_only;16438711]linux to be the main OS of choice[/QUOTE]
Yes, because that's a brilliant idea when you consider there are people who install 40 toolbars into IE6 and call their tower a hard drive/CPU.
[QUOTE=reapaninja;16439419]yes because that's a brilliant idea when you consider there's people who install 40 toolbars into IE6 and call their tower a hard drive/CPU[/QUOTE]
Eh, those people will soon die off. Either from old age or from suicide when their MySpace gets hacked.
[QUOTE=Roast Beast;16439482]Eh, those people will soon die off. Either from old age or from suicide when their MySpace gets hacked.[/QUOTE]
Let's start hacking Myspaces. :smug:
[QUOTE=PvtCupcakes;16438366]Cloud computing is terrible. [img]http://d2k5.com/sa_emots/emot-argh.gif[/img][/QUOTE]
Yeah, look at what the internet has become. [b]Utter Failure[/b] of cloud computing.
[QUOTE=Master117;16431878]
So what are your thoughts on future computer tech?[/QUOTE]
[url]http://www.gnu.org/philosophy/right-to-read.html[/url]
Everything becoming a black box you can't modify.
[QUOTE=armydude359;16434999][b] OVER 9000 [/b] TB's :D[/QUOTE]
:eng101:
"Explorer performed illegal operation and will shut down" will return and kill us all.
I wonder how much longer binary will be in use.
[QUOTE=Pixel Heart;16433600]4096-bit architecture (x4096)
Processors containing 256 CPU cores on a die, each core with 32 micro cores of their own, and a 3TB L24 Cache (Socket R800).
SATA dies altogether, as everything now connects via M-Sockets. M-Sockets not only hold your 768TB of QDR 8192MHz QIMMs in quintuple-channels, but also interface your holo-magna-optical drives and Samsung's PRAM storage drives.
Networking evolves into Petabit Ethernet, delivering blistering speeds that won't make your Galaxy of Warcraft game lag in the slightest. Wireless lags behind slightly (As usual) with Wireless 802.11y (Yotabit speeds)
Since PCIe's death, graphics cards now reside in the GIB x512 slot (Graphics Interconnect Bus).They now have a 2nm die, and connect you your holo-res display via wireless DisplayLink Technology.
....ok, so i made everything up except for Samsung's PRAM... it's being experimented with to replace flash memory. :P[/QUOTE]
Yotta is bigger than peta... by a lot.
[QUOTE=TrafficMan;16442608]I wonder how much longer binary will be in use.[/QUOTE]
Yeah, we can do a lot with just 0 and 1, so imagine if we used all of 0-9.
[img]http://d2k5.com/sa_emots/emot-psypop.gif[/img]
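For what it's worth, a decimal digit really does carry more information than a bit; a minimal sketch of the comparison:

```python
import math

# One bit distinguishes 2 values; one decimal digit distinguishes 10.
# Information content of a decimal digit, measured in bits:
bits_per_decimal_digit = math.log2(10)
print(round(bits_per_decimal_digit, 2))  # 3.32

# So 8 decimal digits cover about as many values as 27 bits:
print(10**8)   # 100000000
print(2**27)   # 134217728
```

The catch is that a transistor naturally gives you two reliable states, not ten.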
[QUOTE=MelonGuy;16431896]Quatum Computing wee![/QUOTE]
Useless for everyday computing, IIRC.
[QUOTE=OIFY_only;16438711]linux to be the main OS of choice[/QUOTE]
The second Linux gets stable game compatibility, I'm switching.
Well, in a sense I have; I've got one hard drive with Ubuntu. But I mean, if Wine gets a lot more stable, or people begin developing for Linux, it will be my main OS.
[QUOTE=Master117;16431878]I've been thinking. A few years ago, the concept of a "64-bit quad-core with a 4870 with one gig of [b]DDR5 video memory[/b], and a 2TB HD" would be absolutely meaningless. No one would have ever expected something that powerful a few years ago.
Sure there were servers that had more than one physical CPU on it, but never did they have more than one CPU on one die. Now we can cram [b]8 CPU[/b]'s on a single die.[/QUOTE]
DDR5 and GDDR5 aren't the same fucking thing.
They create shitloads of processor dies on a single wafer.
Also, it's cores (or processors, depending how it's defined now; "CPU" is seen as the package they reside on), and there isn't a true octa-core processor just yet (IBM's Cell is not one); Core i7s are all quad-cores with HT technology. IBM's Cell supercomputers have to be backed by AMD processors and hardware to actually work well.
[QUOTE=pentium;16436884]EISA/ISA freaking ruled but motherboard manufacturers had to be a fag and replace it with something else, just like they are right now with the good old PCI slot.[/QUOTE]
Maybe because it's 20 years old and highly superseded.
[QUOTE=TrafficMan;16442608]I wonder how much longer binary will be in use.[/QUOTE]
Computers can only operate in binary. They only see on and off. What would they do with 2-9?
To form digits like 2-9, all a computer does is convert from binary; same with letters (ASCII).
A byte is a set of 8 bits, which can form any ASCII character you want.
The letter 'A' in binary is 01000001, which is 65 (41 in hexadecimal), the ASCII code for 'A'.
01000001 represents a byte; each 1 or 0 is a bit, and 8 bits make a byte. The 1s mean 'on', and when you're talking about memory, a bit can only be on or off.
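That walkthrough can be checked directly in Python (just illustrating the conversions described above):

```python
# The ASCII code for 'A' is 65, which is 0x41 in hex and 01000001 in binary
code = ord('A')
print(code)                  # 65
print(hex(code))             # 0x41
print(format(code, '08b'))   # 01000001

# Going the other way: 8 bits back to the character
print(chr(0b01000001))       # A
```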
I'd say, with the i7, the new GTX lineup coming out, and the glasses that Nvidia makes, we're at what the '80s thought would happen in 10 years.
I bet we'll have something like i7s to throw as frisbees in the next couple of years, once we have chips the size of vitamins that run faster than 4 i7s.
[QUOTE=PvtCupcakes;16449165]Computers can only operate in binary. It only sees on and off. What is it going to do with 2-9?
To form numbers like 2-9 all a computer does is convert it from binary, same with letters (ASCII)
A byte is a set of 8 bits, which you can form any ASCII character you want with.
The letter 'A' in binary is 01000001 which is 65 (41 in hexadecimal) which is the ASCII code for 'A'
01000001 represents a byte, each 1 or 0 is a bit, and 8 bits make a byte. The 1s mean 'on', and when you're talking about memory, a bit can only be on or off.[/QUOTE]
Yes, I understand how it works (it's the reason some Linux servers [url=http://en.wikipedia.org/wiki/Year_2038_problem]will crash in 2038[/url]), but binary will inevitably become obsolete eventually.
[QUOTE=TrafficMan;16451575]Yes I understand how it works (It's the reason some linux servers [url=http://en.wikipedia.org/wiki/Year_2038_problem]will crash in 2038[/url]) but inevitably binary will become obsolete eventually.[/QUOTE]
That problem is because of 32-bit software, not because it uses binary; 64-bit software doesn't have it.
Binary won't become obsolete, because it's the whole basis of a computer: 0 and 1, true and false, on and off.
It can't be done another way.
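To illustrate the 32-bit point: the 2038 problem is just a signed 32-bit counter running out of range, nothing to do with binary itself. A minimal sketch:

```python
from datetime import datetime, timezone

# Unix time_t counts seconds since 1970-01-01 00:00:00 UTC.
# The largest value a signed 32-bit integer can hold:
INT32_MAX = 2**31 - 1  # 2147483647

# That many seconds after the epoch lands in January 2038:
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later, a signed 32-bit time_t wraps around to the minimum:
wrapped = (INT32_MAX + 1) - 2**32
print(wrapped)  # -2147483648 (interpreted as a date back in 1901)
```

A 64-bit time_t pushes the overflow billions of years out, which is why 64-bit software doesn't hit it.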