• Anonymous Assassin's Creed Unity developer wrote into the Giant Bombcast to give insight into the 900p parity
    61 replies, posted
[QUOTE=Bruhmis;46253014]ubisoft hasn't been ''professional'' since 2008. [B]I can list to you well over 20 indie games that have been released around the same time as AAA ubisoft titles that ran far better and looked far better than said AAA titles[/B]. there's nothing professional about lashing out at a platform/community for not buying your unplayable games. there is nothing professional about routinely lying to customers and journalists alike then trying to cover up your lies with more lies. there is nothing professional about falsely advertising products and refusing to accept responsibility when you charge people for defective products. there are plenty of people on facepunch who are aspiring game developers and experienced programmers. they likely don't know [B]more[/B] than the average ubisoft dev, but they know enough to catch them in one of their infinite lies, with which the quote in the op is littered.[/QUOTE] Enlighten me.
[QUOTE=redBadger;46252553]Thread prediction: Facepunch will know more about developing for next-gen systems than professional workers[/QUOTE] Thread prediction correction: Facepunch will take "anonymous developer mail with zero confirmation of source and an obvious conflict of interest" as unbiased factual information. [editline]16th October 2014[/editline] Like I mean come on, the guy ADMITS having a deal with Microsoft, but somehow magically a deal with Microsoft and the PS4's verifiable 40% higher GPU processing power have NO EFFECT WHATSOEVER and it's just some kind of weird coincidence? How naive do you have to be to believe that for a second? [editline]16th October 2014[/editline] Ubisoft in the last few years has a track record of horribly bad optimization across all platforms, gimping previously advertised graphics before release, and at best shady business deals influencing development. Suddenly there's an ANONYMOUS DEVELOPER EMAIL and all is forgotten; it perfectly explains everything and justifies it all.
[QUOTE=redBadger;46252553]Thread prediction: Facepunch will know more about developing for next-gen systems than professional workers[/QUOTE] Have you thought that maybe some people on facepunch are professional workers themselves? (and working in the industry doesn't necessarily make you more knowledgeable than people who do it for a hobby, it just gives you more credibility)
Ubi has the same outlook as the makers of CoD: the engine they have runs well enough, and people don't really complain and keep buying the games it runs off of, so why bother changing it beyond some minor tweaks? Until something gives it actual competition, why bother with the extra work and development that won't really change sales and will only lower the profit margin of the game?
Hey Ubisoft, or anonymous dev. Here's a thought. How about instead of attaching so many shiny bells and whistles to your game...you get it running at 60 fps, THEN see how much you can add before the framerate starts tanking. 60 fps is better than graphical fidelity any day.
Wasn't it Gameplay -> Quality -> Graphics? People will go fucking nuts over games that have great gameplay with quality story/atmosphere. Look how big Portal exploded: it was an amazing game and it still had 2004 Half-Life 2 visuals in 2007. Graphics shouldn't be an afterthought, but what's with almost every developer trying desperately hard to make their game look like a movie? "MS and Sony wanted to push graphics first, so that's what we did." So admit that's a problem, and don't whine about it and try to defend all of your work when that is quite obviously a big problem when it comes to making a game. You don't have to be a game developer to see that shit is going wrong with Ubisoft.
[QUOTE=The freeman;46252830]Ubisoft has always had issues with their engines going back to the PS2/Xbox. They should have learned by now but I guess not. Ubisoft is a weird company.[/QUOTE] No, they're just mercenary and insulated from criticism, focused on churning out purely-for-profit titles.
If you can't get a game that barely looks better than one that ran on 3 cores and 512MB of RAM to run at 60 frames per second at 900p, or 30FPS at 1080p, on machines with 8 cores and 8GB of RAM, then you're doing something completely wrong
[QUOTE=JCDentonUNATCO;46255144]Look how big Portal exploded, it was an amazing game and it stiill had 2004 Half-Life 2 visuals in 2007. Graphics shouldn't be an afterthought but whats with almost every developer trying desperately hard to make their game look like a movie?[/QUOTE] Let's not kid ourselves here, 2007 Portal looked miles better than 2004 HL2 for a huge number of reasons, given a new version of the Source engine, higher poly counts, and a closed environment. Portal looked quite good for its time; Valve has always been good about getting their games to run on all sorts of hardware and they've put gameplay first, but 2004 Source and EP1 Source/Orange Box Source/DOTA2 Source have all had significant graphical upgrades from revision to revision. [editline]16th October 2014[/editline] [QUOTE=Awesomecaek;46253869]Like I mean come on the guy ADMITS having a deal with Microsoft but somehow magically a deal with Microsoft and the PS4 having verifiable 40% higher GPU processing power have NO EFFECT WHATSOEVER and it's just some kind of weird coincidence? How naive do you have to be to believe that for a second?[/QUOTE] You can attack Ubisoft's engine for being badly designed/not utilizing its available resources to their fullest extent, but if what the dev source says is true and the game is actually being primarily CPU bottlenecked, then it would make a lot more sense why the difference between the two consoles is nowhere near as large as people would expect. In terms of CPU processing power the difference between the two is a lot closer than when comparing the GPUs, not to mention that unpacking/processing whatever prebaked GI might not scale linearly with CPU clockspeed/available threads, further reducing the performance advantage of the PS4.
I don't think it's unreasonable to think that the engine as designed really can't run faster on the PS4 than on the Xbone, not because it's intentionally crippled, but just as a consequence of the engine's design vs the early lifecycle of the current generation. [editline]16th October 2014[/editline] [QUOTE=TheTalon;46255477]If you can't get a game that barely looks better than a game that runs on 3 Cores and 512MB of RAM, to run at 60 Frames per second at 900p, or 30FPS at 1080, on machines with 8 cores and 8GB of RAM, then you're doing something completely wrong[/QUOTE] Also this isn't really a fair comparison since I'm pretty sure that the Assassin's Creeds didn't run at native 720p/30 on the previous gen consoles Given that for example Halo 3 on the 360 ran at native 640p upscaled to 720p and Blops on the PS3 ran at native 544p upscaled to 720p, the jump in resolution from previous gen to current gen is a lot bigger than it appears nominally. It's not that the devs are trying to cram more shit into the new gen while staying at the same resolution, they never even hit HD resolutions in the last gen; they're trying to cram more shit while doubling or quadrupling the number of pixels they're pushing. It's not all that surprising that especially this early on they're hitting walls everywhere with regards to resolution and graphics given that they're being given mid-range PCs to work with without a significant amount of time to develop ways of getting everything they can out of each system.
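The resolution jump above is easy to put numbers on. A quick back-of-the-envelope check (the rendered resolutions are the commonly cited figures for these games; treat them as approximate):

```python
# Rough pixel-count comparison: last-gen native render resolutions vs
# current-gen targets. Figures are the commonly cited ones, approximate.
def pixels(width, height):
    return width * height

halo3 = pixels(1152, 640)      # Halo 3 native res, upscaled to 720p
blops = pixels(960, 544)       # Black Ops on PS3, upscaled to 720p
p900 = pixels(1600, 900)
p1080 = pixels(1920, 1080)

print(round(p900 / halo3, 2))    # ~1.95x the pixels of Halo 3
print(round(p1080 / blops, 2))   # ~3.97x the pixels of PS3 Black Ops
```

So even "only" 900p is roughly double the pixels Halo 3 actually rendered, and 1080p is nearly quadruple what PS3 Black Ops pushed, which is the "doubling or quadrupling" claim made numerically.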
Bull. Shit. No way is this game bigger than the new versions of GTA5 that are going to be released soon. That game looks pretty as hell! Speaking of Rockstar, how did they make Max Payne 3 so stable I can run it better on my dad's Mac? They made the game properly is how. I know about all the console CPU bottlenecks, but that's not their excuse: 'way too much bullshit about 1080p making a difference is being thrown around' I don't know if it's just because it's typed, but to have someone say that and not have 1080p as an industry standard now is quite sad. tl;dr Ubisoft can't optimise for shit.
[QUOTE=Memobot;46256108]Bull. Shit. No way is this game bigger than the new versions of GTA5 that are going to be released soon. That game looks pretty as hell![/QUOTE] Difference between last gen and current gen GTA 5... Higher resolution and fps? Check. Improved graphical effects? Check. Higher object count/view range? Check. Doing anything you fundamentally can't do on a last gen console? Not so much. I mean, genuinely impressed with what Rockstar did on the PS3 and 360, but if a game runs reasonably well on that hardware then getting the same game with improved graphics to run at 1080p and 60fps on current gen is a walk in the park. Meanwhile, Assassin's Creed Unity appears to be a huge step up in map size, npc ai/numbers, visuals, etc over last gen Assassin's Creed games, and notably is the first installment in the series not to have a 360/PS3 version. So if it runs at "only" 900p/30fps, I am okay with that. Not happy in a dancing on the roof way or trying to make BS claims that 900p and 30fps is better, but that Ubisoft has had to settle for this resolution and fps shows they are really pushing boundaries.
Kinda interesting that they had performance problems this late into development. You'd think they would have solved that early on.
[QUOTE=Memobot;46256108]Bull. Shit. No way is this game bigger than the new versions of GTA5 that are going to be released soon. That game looks pretty as hell! Speaking of Rockstar, how did they make Max Payne 3 so stable I can run it better on my dad's Mac? They made the game properly is how. I know about all the console CPU bottlenecks, but that's not their excuse: 'way too much bullshit about 1080p making a difference is being thrown around' I don't know if it's just because it's typed, but to have someone say that and not have 1080p as an industry standard now is quite sad. tl;dr Ubisoft can't optimise for shit.[/QUOTE] Rockstar's stable Max Payne 3 PC port is, if anything, an argument for Ubisoft, since it demonstrates Rockstar already has a well polished, optimized, and mature x86 port of their engine to rely on, whereas Ubisoft doesn't have any such precedent for their engine. That's not even getting into how GTA5 is highly unlikely to have GI at the level of detail Ubisoft is attempting. If unpacking prebaked GI is CPU centric, it stands to reason that both consoles are going to perform similarly - you can't use the PS4's superior GPU to make up any performance in that case, simply because the available processing power isn't enough to just start doing GI in real time. The jump in necessary graphics capability to go from baked GI to real-time GI is massive - far more than the difference between the PS4 and the Xbone can account for - so both consoles have to rely on pre-baked GI, which means both consoles are getting bottlenecked by their CPUs. Hence, since both consoles have basically identical CPU performance, the engine performs basically identically on both consoles.
The leftover GPU power might allow the PS4 to handle more extra stuff that isn't lighting related, but if the processor is fully occupied with lighting, it leaves little overhead for issuing instructions to the GPU to handle any more rendering, which would line up with the developer's claim that the difference between the two consoles is on the order of just a few FPS rather than dramatic. That bottleneck is an engine design issue, not an optimization issue or an issue of laziness. Hardware bottlenecks are bottlenecks, and just wishing it ran faster isn't going to magically make it run faster.
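A toy model makes the bottleneck point concrete (this is my own simplification, with made-up numbers, not anything from the email): if a frame takes as long as its slowest stage, a dominant CPU stage hides any GPU advantage entirely.

```python
# Toy frame-time model: the frame takes as long as its slowest stage.
# All millisecond figures below are invented for illustration.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound case: an identical CPU load dominates the frame on both
# consoles, so a ~40% faster GPU buys nothing.
print(fps(32.0, 20.0))   # "Xbone": 31.25 fps
print(fps(32.0, 14.3))   # "PS4":   31.25 fps -- identical, because CPU-bound

# GPU-bound case: the same GPU gap now shows up directly in frame rate.
print(fps(10.0, 20.0))   # 50.0 fps
print(fps(10.0, 14.3))   # ~69.9 fps
```

Under this (admittedly crude) model, a few-FPS gap between the consoles is exactly what you'd expect when the CPU stage dominates.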
The resolution and frame rate caps only apply to the console release right? Because if that is the case, who the [B]fuck[/B] gives a shit about that? At the distances people sit from their screens when using consoles, they aren't going to miss 180 rows of pixels. They aren't going to pick up on the finer details 60fps would show as readily. Yes, it would be nice for every game to run native 1080p with solid frame rates, but the new generation is still setting up camp, give it a year or two and the engines running these will have all the hacks and tweaks needed to run games smoothly, exactly as it has been for every generation ever. There will be a point when the developers stop working on the graphics as much thanks to diminishing returns, we're already fairly close to it with some studios work, so graphical improvement should plateau fairly soon. I don't know why everybody was expecting native 1080p this early into a generation, just because the consoles use a similar hardware architecture to a PC does not mean development will shoot straight to the moon, they still have specialised pieces of hardware to work out and find tweaks for. Interestingly Bioshock on the PS3 had a graphics toggle that made the game look like pure sin in the name of higher FPS, I dunno if the XBox version had this toggle, but people have tried this stuff in the past and it went fairly unused by basically everybody.
[quote] If the game is as pretty and fun as ours will be, who cares? [/quote] Because 1080p is a standard. You don't see people going around saying "but the shows are entertaining, who cares" about CRT TVs' 240p resolution.
30 fps isn't even that bad, console gamers are used to it. Why are you guys whining so much don't you play shit on the PC?
As someone who works at a AAA game company, the person who wrote this sounds like they don't know what they're talking about, and doesn't sound like a developer to me (based on the way they talk). "baked global illumination lighting." You cannot "bake" lighting in a day/night cycle world (I'm presuming this is the case, as all Assassin's Creed games have this feature). Baked lighting only works in a static environment, as I understand it. If he/she does work at Ubisoft, I think they must be in QA or some general producer role where they aren't able to understand the low-level workings of an engine. Herp.
[QUOTE=Major.Dump;46257307] You cannot "bake lighting in a day/night cycle world (I'm presuming this is the case as all Assassin's creed games have this feature) Baking lights only work in a static environment as I understand it.[/QUOTE] You could always do a whole lot of bakes at different times of day and interpolate through them, which would help explain the filesize.
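A minimal sketch of that idea (entirely hypothetical - nothing here is Ubisoft's actual pipeline): bake lightmaps at a handful of times of day, then blend the two nearest bakes at runtime. Storing many full bakes is exactly the kind of thing that balloons file size.

```python
# Hypothetical sketch: blend between pre-baked lightmaps for a time-of-day
# cycle. Each "lightmap" is just a flat list of brightness values here.
def blend_lightmaps(bakes, hour):
    """bakes: list of (hour, texels) pairs sorted by hour; wraps around 24h."""
    hour %= 24.0
    n = len(bakes)
    for i in range(n):
        t0, a = bakes[i]
        t1, b = bakes[(i + 1) % n]
        span = (t1 - t0) % 24.0 or 24.0   # hours between this bake and the next
        offset = (hour - t0) % 24.0
        if offset <= span:
            w = offset / span              # 0 at bake i, 1 at bake i+1
            return [x * (1 - w) + y * w for x, y in zip(a, b)]

# Four bakes: midnight, morning, noon, evening (single-texel "maps").
bakes = [(0, [0.1]), (6, [1.0]), (12, [1.0]), (18, [0.3])]
print(blend_lightmaps(bakes, 3))   # 3am: a 50/50 blend of the 0h and 6h bakes
```

A real engine would blend textures on the GPU rather than lists on the CPU, but the scheme - and why it multiplies the amount of stored lighting data by the number of bakes - is the same.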
[QUOTE=Major.Dump;46257307]As someone who works at a AAA game company the person who wrote this sounds like they don't know what they're talking about, and doesn't sound like a developer to me. (based on the way they talk) "baked global illumination lighting." You cannot "bake lighting in a day/night cycle world (I'm presuming this is the case as all Assassin's creed games have this feature) Baking lights only work in a static environment as I understand it. If he/she does work at Ubisoft I think they must be in QA or some general producer role where they aren't able to understand low level workings of an engine. Herp.[/QUOTE] This confused me as well. That said, there is a lot more to lighting these days than just baking textures like you did in Source. He could be referring to a gross oversimplification that amounts to "an assload of lighting data had to be saved that takes up half the game's size," which is not unbelievable at all, especially if they are combining several lighting techniques. And ESPECIALLY if there isn't a dynamic day/night cycle (which makes the comment make far more sense, to the limits of my knowledge). They go into a bit more detail in the actual podcast: a podcast member brought up what he said to a programming contact he has in the industry, and that contact said it all is pretty plausible. There was also proof offered, but they didn't dive that deep into it (being just an email they were bringing up on their podcast question segment). That said, it's still pretty likely the person isn't a core programmer on the team. But that doesn't make it any less valid (if that IS the case) - what matters is the information provided, and the description of how the dev process got the game to where it is now would still be accurate (if it's not fake)
[QUOTE=Stiffy360;46252873]Guys, open world + GI is completely non-existent so far. Real-time GI is super expensive, which is why it doesn't exist yet, but there's also Baked in GI like Sonic Unleashed and most games with lightmaps, but that causes the file size to jump immensely. One stage of Sonic Unleashed had 300mb of GI. The fact that they have an open world and GI is rather impressive, even if it is baked in. (Ryse had no GI, instead it used real-time point and spot lights to fake it) but still, I'd rather have 60 FPS than GI, but that's what we're given with unity.[/QUOTE] Literally every game has GI; Quake and Half-Life 1 have GI. What you're thinking of is dynamic GI, but the post states it's baked.
guys, did everyone miss the part where there are actually 600 NPCs on screen? I think that's the part where the GPU power difference between the two consoles doesn't matter [editline]17th October 2014[/editline] [QUOTE=MaddaCheeb;46256940]Kinda interesting they had performance problems this late into development. You think they would have solved that early on.[/QUOTE] devs do not care about frame rates until the game steps out of alpha; they have bigger issues on their hands [editline]17th October 2014[/editline] [QUOTE=Major.Dump;46257307]As someone who works at a AAA game company the person who wrote this sounds like they don't know what they're talking about, and doesn't sound like a developer to me. (based on the way they talk) "baked global illumination lighting." You cannot "bake lighting in a day/night cycle world (I'm presuming this is the case as all Assassin's creed games have this feature) Baking lights only work in a static environment as I understand it. If he/she does work at Ubisoft I think they must be in QA or some general producer role where they aren't able to understand low level workings of an engine. Herp.[/QUOTE] I'm sorry to burst your bubble, but that's exactly what it means: they baked lightmaps for every stage in the day and night cycle
[QUOTE=Wootman;46257186]30 fps isn't even that bad, console gamers are used to it. Why are you guys whining so much don't you play shit on the PC?[/QUOTE] I recall Ubisoft (or someone else) saying that 30fps is the new standard or so. I hated that sentence, because currently there are games on, for example, PS4 that claim a 1080p/60fps standard. But the trick is, they aren't really 1080p OR 60fps. The resolution is slightly worse and just upscaled, and the fps fluctuates between 30-60. Killzone: Shadow Fall's multiplayer is a neat example: the fps is 60 when you look at an empty landscape or a wall, but instantly drops to sub-40 when ANYTHING happens. And a blind person can see it's not really native 1080p. If the 60fps standard is like that, a 30fps standard would be fucking awful. Optimization would be utter garbage on those titles, with the fps sinking to sub-10 at worst (look at quite a few last-gen multiplats on the PS3), and after a while THAT would be in the acceptable range, with people defending it saying "it's not so bad, we're used to it". And it would probably still happen at garbage resolutions upscaled from below 720p. And games like that won't magically be 20 times better on the PC. If they can't be arsed to make it work well on pre-set hardware, how can they be arsed to make it prettier and run better on thousands of possible PC builds? But it's "pretty" so WHO CARES RIGHT?
[QUOTE=SgtTupelo;46258978]I recall Ubisoft (or someone else) saying that 30fps is the new standard or so. I hated that sentence because currently there are games on, for example, PS4 that are 1080p/60fps -standard. But the trick is, they aren't really 1080p OR 60fps. Resolution is slightly worse and is just upscaled, and the fps fluctuates between 30-60fps. Killzone: Shadowfalls multiplayer is a neat example, the fps is 60 when you look at an empty landscape or a wall, but instantly drops down to sub-40 when ANYTHING happens. And a blind person can see it's not really native 1080p. If the 60fps-standard is like that, 30fps would be fucking awful. Optimization would be utter garbage on those titles, with the fps sinking down to sub-10 at worst (look at quite a few last-gen multiplats on the PS3) and after a while THAT would be in the acceptable range with people defending it saying "it's not so bad we're used to it". And it would probably still happen at garbage-resolutions that are upscaled below-720p. And games like that won't magically be 20 times better on the PC. If they can't be arsed to make it work well on a pre-set hardware how can they be arsed to make it prettier and make it run better on thousands of possible PC-builds? But it's "pretty" so WHO CARES RIGHT?[/QUOTE] you do realize "ubisoft dev" is not synonymous with "ubisoft marketing exec" right
[QUOTE=hexpunK;46257077]The resolution and frame rate caps only apply to the console release right? Because if that is the case, who the [B]fuck[/B] gives a shit about that?[/QUOTE] Rumor has it that the PC release will have the same/similar limitations. Besides, this isn't much of a discussion about the quality of Unity as it is a discussion about Ubisoft's compulsive and obvious lying, their refusal to admit that they're bad at optimizing video games, and their refusal to correct any of these problems that are obviously on their end of development, not the development of current and future hardware.
[QUOTE=Lordgeorge16;46259973]Rumor has it that the PC release will have the same/similar limitations. Besides, this isn't much of a discussion about the quality of Unity as it is a discussion about Ubisoft's compulsive and obvious lying, their refusal to admit that they're bad at optimizing video games, and their refusal to correct any of these problems that are obviously on their end of development, not the development of current and future hardware.[/QUOTE] not a single ac game released on pc has been limited in frames, ever. stop talking out of your ass just because you happen to dislike ubisoft
[QUOTE=Egevened;46260371]not a single ac game released on pc has been limited in frames, ever. stop talking out of your ass just because you happen to dislike ubisoft[/QUOTE] I never said they did it in the past, but some people are saying that they might do it with the PC release of Unity, which honestly wouldn't surprise me at all at this point. And I, like many other people, have very good reason to dislike Ubisoft and even mentioned those reasons in my post. I really miss Bad Reading.
I don't see 900p as such a big deal. On a TV screen you're usually sitting so far away that it's [url=http://arstechnica.com/gaming/2013/10/op-ed-why-im-not-too-worked-up-about-the-next-gen-console-resolution-wars/]barely noticeable[/url]. 30FPS, on the other hand, is absolutely the bare minimum a third- or first-person game has to run at to be enjoyable. Last weekend they had a playable build available for the public (running on XB1s) and that was closer to 15 FPS than 30FPS. There were a few scenes where it ran better (at 30FPS, I assume) when climbing up buildings with most of the screen filled by the skybox, but looking at the city or walking around at street level it was barely playable and certainly not enjoyable. I really hope that was just a shitty alpha build and not representative of the final game.
[QUOTE=Lordgeorge16;46260454]I never said they did it in the past, but some people are saying that they might do it with the PC release of Unity, which honestly wouldn't surprise me at all at this point. And I, like many other people, have very good reason to dislike Ubisoft and even mentioned those reasons in my post. I really miss Bad Reading.[/QUOTE] Didn't they already say, alongside the 900p/30fps console news, that PC will be unaffected? There is no reason to cap the resolution of PC games and only minuscule reasons to cap framerates.
anvil, anvilnext and anvil 2.0 (the engine running unity) are all coded for pc, which is also the exact reason why they have problems porting to consoles. ac3 was a mess on consoles, and it ran 40fps at 1080p on my pc that's from 2008, so go figure
[QUOTE=Cl0cK;46252581]Thread prediction: Facepunch won't lie as much as Ubisoft.[/QUOTE] Thread Prediction: Facepunch lies a billion times more than Ubisoft.