• Sony Reportedly Unlocks 7th PS4 Core
    80 replies
[QUOTE=Ninja Gnome;49225282]the cool thing about consoles is that they're a standardized platform so once devs get used to it they can push optimization to insane degrees[/QUOTE] Yeah just compare GTAIV and GTAV.
[QUOTE=CrimsonChin;49229299]It also has some sort of audio function for uh, what's it called? That thing that is like surround sound, but not surround sound.[/QUOTE] Head Related Transfer Functions? (HRTF)
[QUOTE=hexpunK;49229242]An ARM core and any x86/x64 core are in no way comparable in terms of core count and pure clock speed. ARM as an architecture is inherently slower than x86/64, allowing it to have a lower thermal output and lower power draw, 1GHz of ARM is considerably slower than 1GHz of any modern x86/64 CPU. Your Samsung Galaxy S6 might have a beef CPU, but it's never going to be as powerful as what a console, even one from last gen, contains by the sheer nature of the architecture. [editline]2nd December 2015[/editline] There isn't anything preventing a console from doing this other than consumer expectations for the hottest new visual effects. Many things could be scaled down a bit to increase resolution and decrease frametime, but then you'd be subjected to shrieks from the "PC master race" of "HAHA LOOK HOW SHITTY THESE ARE!!!!!!!", despite the fact they are still massively outperforming the equivalent hardware if it was in a PC due to the nature of a unified architecture.[/QUOTE] They aren't outperforming PCs with similar specs most of the time. The consoles have two shitty x86 AMD quad-core modules pasted together, with one core locked off, at very low clock speeds and with likely garbage IPC. The main issue with the consoles is these garbage CPUs, as devs have said as well. The graphics card in the PS4 is alright, but the Xbox One's isn't particularly good, and both are slight alterations of mid-range desktop cards. Unified architecture doesn't make it better; if anything it makes it worse, since the 5 GB of usable memory (3 GB goes to the OS) is shared between the GPU and the rest of the system. So textures AND everything that would normally sit in working memory has to fit into that 5 GB. The consoles are underpowered because they aren't being sold at a loss, or only at a small one. They didn't have leading hardware even at launch; it was already mid-range to low-range.
They are very bad for 1080p/60 fps gaming at the level of detail the devs want to push out of them. Call me PC master race all you want, but consoles do have a place: they're mostly for people who play maybe an hour a day, and they also serve as media centers for those people. Performance-wise and hardware-wise, though, they are not great. Optimizing for this configuration will help a little, but there aren't many tricks left to discover because it's x86. Hopefully they won't last that much longer if people want to play 1080p/60 fps games. Last-gen consoles were better value because they were sold at a loss (apparently the PS3 cost around $1000 to make?), and that is why they were much cheaper than PCs.
[QUOTE=vrej;49227460]Just buy a PC people, why go through all this?[/QUOTE] What the hell is your post even supposed to mean? It's Sony's engineers that are going through this work of changing how the console functions, not the customers. We get to just sit here and play games with ease. [editline]2nd December 2015[/editline] [QUOTE=Valiantttt;49229439]They aren't outperforming PC's that have similar specs most of the time.[/QUOTE] Of course they are. That's always been the case with consoles. Why would it change this generation? [QUOTE=Valiantttt;49229439]The consoles has 2 shitty x86 AMD QUAD cores pasted together with 1 core locked off. At very low clock speeds with likely garbage IPC.[/quote] Here's some CPU theoretical peak performance numbers: PS4 Jaguar: 102.4 Gflops XB1 Jaguar: 112.0 Gflops Intel i7 3770K: 112.0 Gflops The XB1 and PS4 score surprisingly high. Does anyone have a good explanation of this? [url]http://www.vgleaks.com/playstation-4-xbox-one-comparison-chart[/url] [url]http://forums.anandtech.com/showthread.php?t=2319793[/url] (vague source this one but it seems correct based on other CPU numbers) [QUOTE=Valiantttt;49229439]Unified architecture doesn't make it better, hell it makes it worse as the memory is already 5 GB(3 GB for os) that is shared by GPU and the system. So you have to put textures AND everything that would normally be in the work memory onto 5 GB.[/quote] Unified memory does improve the performance. If the memory was split up into, say, 6GB CPU (where 3 was taken by the OS) and 2GB GPU RAM, how would it perform better? How is it not better that the dev gets to choose how to spend the RAM? [QUOTE=Valiantttt;49229439]Optimizing for this configuration will help a little bit but [B]there aren't many tricks to discover due to x86.[/B][/quote] This makes no sense. New graphics technologies are invented all the time. Why would it stop now?
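One way to sanity-check those figures: theoretical peak is just cores × clock × FLOPs issued per cycle. A quick sketch (the clock speeds and per-cycle throughputs below are commonly cited figures, not from the thread, so treat them as assumptions):

```python
def peak_gflops(cores, ghz, flops_per_cycle):
    """Theoretical single-precision peak: cores x clock (GHz) x FLOPs per cycle."""
    return cores * ghz * flops_per_cycle

# AMD Jaguar: 8 FP32 FLOPs/cycle per core (two 128-bit FP pipes)
print(peak_gflops(8, 1.60, 8))   # PS4 at 1.6 GHz  -> 102.4
print(peak_gflops(8, 1.75, 8))   # XB1 at 1.75 GHz -> 112.0

# i7-3770K at 3.5 GHz base: the chart's 112 figure only works if you
# count 8 FLOPs/cycle; with full 256-bit AVX (16 FLOPs/cycle) it doubles.
print(peak_gflops(4, 3.5, 8))    # -> 112.0
print(peak_gflops(4, 3.5, 16))   # -> 224.0
```

So the consoles "score high" mainly because eight slow cores add up, and the i7 number in that chart appears to undercount what AVX can issue per cycle.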
[QUOTE=paul simon;49229514] Here's some CPU theoretical peak performance numbers: PS4 Jaguar: 102.4 Gflops XB1 Jaguar: 112.0 Gflops Intel i7 3770K: 112.0 Gflops The XB1 and PS4 score surprisingly high. Does anyone have a good explanation of this?[/QUOTE] That seems off. I thought the Jaguar in the consoles was similar to a lower end i3? [QUOTE=CakeMaster7;49229424]Head Related Transfer Functions? (HRTF)[/QUOTE] I remember now, it's binaural audio. The little box has an audio chip for that.
[QUOTE=CrimsonChin;49229908]That seems off. I thought the Jaguar in the consoles was similar to a lower end i3?[/QUOTE] Oddly enough those are the only numbers I can find, which are apparently confirmed in the PS4 SDK.
[QUOTE=paul simon;49229514]What the hell is your post even supposed to mean? It's Sony's engineers that are going through this work of changing how the console functions, not the customers. We get to just sit here and play games with ease. [editline]2nd December 2015[/editline] Of course they are. That's always been the case with consoles. Why would it change this generation? Here's some CPU theoretical peak performance numbers: PS4 Jaguar: 102.4 Gflops XB1 Jaguar: 112.0 Gflops Intel i7 3770K: 112.0 Gflops The XB1 and PS4 score surprisingly high. Does anyone have a good explanation of this? [url]http://www.vgleaks.com/playstation-4-xbox-one-comparison-chart[/url] [url]http://forums.anandtech.com/showthread.php?t=2319793[/url] (vague source this one but it seems correct based on other CPU numbers) Unified memory does improve the performance. If the memory was split up into, say, 6GB CPU (where 3 was taken by the OS) and 2GB GPU RAM, how would it perform better? How is it not better that the dev gets to choose how to spend the RAM? This makes no sense. New graphics technologies are invented all the time. Why would it stop now?[/QUOTE] They aren't outperforming them by a large enough margin; they oftentimes run below 1080p/60 fps at medium settings. And Gflops are just one part of the big picture of how a CPU will perform, even more so when you consider that a graphics card will have something like 8000 Gflops. The unified memory is not going to improve the performance. How the fuck would it? You get 5 GB of workable memory, and most PCs have that plus 2-4 GB of dedicated VRAM. You could even do the same on PC if you could buy GDDR5 memory sticks, since you can access it and put textures in it. And yes, new graphics technologies are invented, but nothing that will give these consoles an edge over PCs: because it's x86, every new technique will work on the PC platform as well.
You cannot discover many new tricks for the consoles' graphics cards because they're a known platform.
[QUOTE=Valiantttt;49230291]They aren't outperforming them by a large enough margin. They often times run at sub 1080/60 fps at medium settings.[/QUOTE] Yeah, the i3/750ti seems to be a frequent comparison in Digital Foundry articles. Always seems to match the PS4 in performance but at either slightly higher settings or slightly higher res. I dunno if that's good for the PC or bad for the consoles. Probably the latter, since it means that hardware that was "meh" on release is keeping up with them, whereas previous consoles used to be a powerhouse that normally couldn't be matched in price/performance for 2+ years. This gen seems to be a bit of a disappointment console-wise.
Of course unified memory is better than the same size in split. Then you don't have to waste time and bandwidth moving data from VRAM to RAM or vice versa, and total RAM utilization is lower, since you don't need duplicate copies of files. [QUOTE=Matt2468rv;49225195]Don't get me wrong, I love my PS4 but I'd like to think that after 10 years a "next-gen" console would be able to render its games in 1080p.[/QUOTE] Approximately 98% of PS4 games are 1080p. Most of the ones that aren't are Battlefield or Assassin's Creed; others are Watch Dogs, UFC, and another two or three I can't remember.
[QUOTE=paul simon;49229514]What the hell is your post even supposed to mean? It's Sony's engineers that are going through this work of changing how the console functions, not the customers. We get to just sit here and play games with ease. [editline]2nd December 2015[/editline] Of course they are. That's always been the case with consoles. Why would it change this generation? Here's some CPU theoretical peak performance numbers: PS4 Jaguar: 102.4 Gflops XB1 Jaguar: 112.0 Gflops Intel i7 3770K: 112.0 Gflops The XB1 and PS4 score surprisingly high. Does anyone have a good explanation of this? [URL]http://www.vgleaks.com/playstation-4-xbox-one-comparison-chart[/URL] [URL]http://forums.anandtech.com/showthread.php?t=2319793[/URL] (vague source this one but it seems correct based on other CPU numbers) Unified memory does improve the performance. If the memory was split up into, say, 6GB CPU (where 3 was taken by the OS) and 2GB GPU RAM, how would it perform better? How is it not better that the dev gets to choose how to spend the RAM? This makes no sense. New graphics technologies are invented all the time. Why would it stop now?[/QUOTE] Unified memory really isn't great though. That's the one point of yours I'm going to have to disagree with. A CPU prefers extremely low latency and bandwidth doesn't matter nearly as much. A GPU prefers extremely high bandwidth memory and generally doesn't give a fuck about latency (comparatively). When you've got unified memory all of one type, something is going to suffer.
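The bandwidth half of that tradeoff is easy to put numbers on. A rough sketch (the bus widths and transfer rates below are commonly cited figures for the PS4's GDDR5 and a typical dual-channel DDR3 PC, so treat them as assumptions):

```python
def bandwidth_gbps(bus_width_bits, transfer_rate_gtps):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x effective transfer rate."""
    return bus_width_bits / 8 * transfer_rate_gtps

# PS4: 256-bit GDDR5 bus at 5.5 GT/s effective
print(bandwidth_gbps(256, 5.5))   # -> 176.0 GB/s
# Typical PC dual-channel DDR3-1600: 128-bit total at 1.6 GT/s
print(bandwidth_gbps(128, 1.6))   # -> 25.6 GB/s
```

The GPU loves the roughly 7x bandwidth; the catch, as the post says, is that GDDR5's higher access latency is what the CPU side then has to live with.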
The CPU and GPU in the PS4 have their own buses, so they shouldn't compromise each other when in use, from what I understand.
[QUOTE=CrimsonChin;49230444]Of course unified memory is better than the same size in split. Then you don't have to waste time and bandwidth moving data from the vram to ram or vice versa, and you have lower total ram utilization, since you wouldn't need duplicates of files. Approximately 98% of ps4 games are 1080p. Most of the ones that aren't are Battlefield, or Assassin's Creed. Others are Watch Dogs and UFC and another two or three I can't remember.[/QUOTE] what are you talking about? why would they move data from the VRAM to the RAM when you have dedicated ram for that. and how the fuck would you get lower ram usage? and most of those games run at 30 fps or experience drops.
[QUOTE=CrimsonChin;49230444]Approximately 98% of ps4 games are 1080p. Most of the ones that aren't are Battlefield, or Assassin's Creed. Others are Watch Dogs and UFC and another two or three I can't remember.[/QUOTE] Look at the performance of a lot of those games. A large number of them have performance issues.
If a file relevant to the game is in both RAM and VRAM it could take, say, 10 MB of space, as it would be present in both; with one pool of RAM that both CPU and GPU access, it would only take 5 MB, right? Is this wrong? [editline]2nd December 2015[/editline] [QUOTE=BusterBluth;49230555]Look at the performance for a lot of those games. A large amount of them performance issues[/QUOTE] That is not true. Generally, most PS4 games hold a stable 30 fps, and some are 60 fps. The ones that don't (such as AC Unity, or Witcher 3, which has been patched now) get talked about a lot, which makes the problem seem more prevalent than it is (vocal minority). In any case, frame rates are at least much more stable now than last gen.
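The accounting being asked about can be sketched directly. Whether a given asset really lives in both pools in practice is exactly what's disputed, so the 10 MB shared asset and the other sizes here are purely hypothetical numbers:

```python
# Hypothetical sizes in MB: an asset the CPU needs (e.g. collision geometry)
# and the GPU also needs (the render mesh), plus processor-exclusive data.
shared, cpu_only, gpu_only = 10, 30, 60

# Split pools: the shared asset is copied into both system RAM and VRAM.
split_total = (cpu_only + shared) + (gpu_only + shared)

# Unified pool: both processors map the same single copy.
unified_total = cpu_only + gpu_only + shared

print(split_total, unified_total)  # -> 110 100
```

So the arithmetic holds, but only for assets both processors actually touch; the unified design also skips the copy over the bus that a split design would need to get the asset into VRAM in the first place.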
[QUOTE=CrimsonChin;49230588]If a file relevant to the game is in both ram and vram it could take say 10mb of space as it would be present in both, but with one pool of ram that both cpu and gpu access simultaneously, would only take 5mb of space, right? Is this wrong?[/QUOTE] This is wrong because you don't see that happen. (Almost) all of the memory on a GPU is used for graphical data, so why would you also put it in working memory, which is for storing other things related to the game? And 30 fps is a performance issue; it's 2015. Only a few games truly run at a solid 60 fps.
[QUOTE=Valiantttt;49230633]this is wrong because you don't see this happen. (almost)All of the memory in a GPU is used for graphical reasons. Why would you put it in work memory, used for storing other things related to the game. and 30 fps is performance issues, it is 2015. Only a few truely run at a solid 60fps[/QUOTE] Stable 30fps is a performance issue? Mmkay nice to know. I guess stable 60 fps is a performance issue since that is just as old as 30fps when it comes to games.
[QUOTE=Valiantttt;49230633]this is wrong because you don't see this happen. (almost)All of the memory in a GPU is used for graphical reasons. Why would you put it in work memory, used for storing other things related to the game. and 30 fps is performance issues, it is 2015. Only a few truely run at a solid 60fps[/QUOTE] "Only a few". Last year I went and looked at resolution and framerate data for consoles, and the results (at that time) were as follows: [quote]94% of PS4 retail games run at 1080p 65% of PS4 retail games run at 60fps 61% of PS4 retail games run at both 1080p and 60fps (this means the MAJORITY of them) 47% of XB1 retail games run at 1080p 50% of XB1 retail games run at 60fps 26% of XB1 retail games run in both 1080p and 60fps 12.5% of Wii U retail games run at 1080p (Framerate data for Wii U games is largely lacking) These are calculations based on lists of rendering resolutions and framerates for current retail games. This is bound to change somewhat in the future, but is a good pointer towards how resolutions and framerates in consoles are at the moment. [url=https://forum.beyond3d.com/threads/list-of-rendering-resolutions.41152/][1][/url][url=http://www.ign.com/wikis/xbox-one/PS4_vs._Xbox_One_Native_Resolutions_and_Framerates][2][/url][/quote] This ignores downloadable games from the store, because those are often 2D games or generally less demanding games that will always run at 1080/60.
To be fair several of the 60fps games were mostly hovering between 45 and 55.
[QUOTE=CrimsonChin;49230725]To be fair several of the 60fps games were mostly hovering between 45 and 55.[/QUOTE] Yeah, whether they run at [I]stable[/I] 60 is hard to determine. Someone needs to compile a huge database of this stuff, it's very hard to find proper information.
[QUOTE=CrimsonChin;49230588]That is not true. Generally, most ps4 games hold a stable 30 fps, and some also are 60fps. The ones that don't (such as AC Unity, or Witcher 3 which has been patched now) get talked about a lot and make it seem more prevalent than it is (vocal minority). In any case, at least frame rates compared to last gen are much more stable now.[/QUOTE] Witcher 3 and AC Unity still commonly drop below 28 fps with their latest patches. Hell, look at Fallout 4. It's pretty damn prevalent.
[QUOTE=BusterBluth;49230918]Witcher 3 and AC Unity still commonly drop sub 28 fps with their latest patches. Hell look at Fallout 4. It's pretty damn prevalent.[/QUOTE] Fallout 4 is a bad example; it genuinely looks like an early PS3 game on the consoles, and it's a joke tech-wise. AC Unity is also a bad example, though at least it looked pretty good. Both are from studios known for extremely buggy games with performance issues on all platforms. Remember Skyrim going to 1 fps on PS3? But Witcher 3 is 30 fps most of the time now on PS4, according to Eurogamer's fps test video.
[QUOTE=Valiantttt;49230291]They aren't outperforming them by a large enough margin. They often times run at sub 1080/60 fps at medium settings. And Glops are just one part of a big picture how a CPU will perform even more so because if you look at a graphics card you will see they have shit like 8000 Gflops. The unified memory is not going to improve the performance. Like how the fuck would that improve the performance? You got 5 GB workable memory, and most PC's have that + 2/4 GB dedicated VRAM. you could even do the same on PC if you could buy GDDR5 memory sticks because you can access it and put textures on it. And new graphics technologies are invented yes but nothing that will give these consoles an edge over PC's. Because it is x86, all new technologies will be on any PC platform as well. You cannot discover many new tricks for the graphic cards of the consoles because it is a known platform.[/QUOTE] Pascal cards will come with unified memory, which will allow the GPU to access the main system RAM; that should improve performance. [editline]2nd December 2015[/editline] [QUOTE=CrimsonChin;49231054]it's a joke tech wise[/QUOTE] subjective. it doesn't push rendering to its limits, but i love the artstyle, and to say it's a joke is ignoring the complexity of implementing things like tiled deferred rendering in what i'm guessing was a previously forward-rendered engine
[QUOTE=CrimsonChin;49231054]Fallout 4 is a bad example, it genuinely looks like an early ps3 game on the consoles, it's a joke tech wise[/QUOTE] I'd love to know how fully volumetric lighting and kinda-there PBR are "a joke". Certainly things that haven't been seen on the PS3 at least.
[QUOTE=TheAdmiester;49231381]I'd love to know how fully volumetric lighting and kinda-there PBR are "a joke". Certainly things that haven't been seen on the PS3 at least.[/QUOTE] slight correction: it's not fully volumetric. that would imply it's approximating the effects of fog/dust/air as a true volume influencing light distribution; they only model crepuscular rays
[QUOTE=TheAdmiester;49231381]I'd love to know how fully volumetric lighting and kinda-there PBR are "a joke". Certainly things that haven't been seen on the PS3 at least.[/QUOTE] Have you seen the game on consoles? It looks like crap. The textures are mostly ugly and low-resolution, the draw distance is like 5 centimeters, and it's low-poly models galore. The PBR implementation is poor: you'll see a metal object that looks like metal sitting next to a metal object that looks like plastic. At least it's 1080p, I guess?
quantify that, then? i feel like you're arguing an artstyle
[QUOTE=DOG-GY;49231626]quantify that, then? i feel like you're arguing an artstyle[/QUOTE] Artstyle is not the cause of this. [img]https://cdn0.vox-cdn.com/thumbor/SyZ0RGi6GrjT6gwu4xIhbrUUNBw=/1000x0/filters:no_upscale()/cdn0.vox-cdn.com/uploads/chorus_asset/file/4245663/Fallout4_E3_GarageRun.0.jpg[/img] [img]http://i.imgur.com/4gE8HLL.gif[/img] [img]http://cdn.makeagif.com/media/11-09-2015/_DKjg2.gif[/img] [img]http://i.imgur.com/MYpPSnZ.gif[/img] Even maxed out on PC, the game looks incredibly average and outdated: full of ugly flat textures and low-poly terrain with clipping ground textures. The foliage is mostly poor, though at least some tree trunks are decent. [img]http://i.imgur.com/Y5ZMmBE.jpg[/img] [img]http://i.imgur.com/MCXAAuk.jpg[/img] [img]http://abload.de/img/fallout42015-11-1003-dojct.png[/img]
[QUOTE=CrimsonChin;49231787]Artstyle is not the cause of this. [img]http://i.imgur.com/4gE8HLL.gif[/img][/QUOTE] this can't be real
It is, and they even managed to make that ugly janky crap run at like 26 fps on the PS4. The Gamebryo engine is awful, especially on console.
It's not real. Those were taken from the NVIDIA page where they showed comparisons across settings and how to tweak visuals. It's since been taken down, but you can still access some of the images if you know the URL: [url]http://images.nvidia.com/geforce-com/international/comparisons/fallout-4/fallout-4-ugridstoload-tweak-interactive-comparison-001-ugrids-13-vs-ugrids-5.html[/url]