[QUOTE=paul simon;46030726]Great argument. An old video that no longer represents the state of the tech :v:[/QUOTE]
Bad art styles and unseriousness aside, you're right. It seems like the main way to render points is to draw 3D sprites. But with the amount of interest in 3D scanning and visualization of scanned objects, especially in the "open source community", I feel like an open-source version of this is just around the corner.
[del]I'm no math expert, but it seems similar to converting screen coordinates to world coordinates, and then maybe having a coordinate tree which contains the colors. Doing it this way, you only render what your screen sees. Sort of like path tracing, maybe?
I don't know how efficient that would be performance-wise, but I imagine it would at least look better.[/del]
[editline]20th September 2014[/editline]
oops, i completely forgot about depth
They always seem to vanish for an extended period of time, come back and there's no change.
Come on man, give it up.
[QUOTE=Zege;46021646]The bit where he said that we could tell if a scene is from the real world or not, ehh... I could kinda figure that something's not right. It's still kinda neat though, I guess.[/QUOTE]
I instantly realized that it wasn't from the real world...
[QUOTE=mastersrp;46030763]
Static scenery, lighting baked in. What are you trying to say?[/QUOTE]
He's saying that UE4's static scenery with baked lighting looks better than Euclideon's static scenery with baked lighting.
[QUOTE=IrishBandit;46031605]He's saying that UE4's static scenery with baked lighting looks better than Euclideon's static scenery with baked lighting.[/QUOTE]
Well, the reality is that we actually don't know. We haven't actually seen their modern lighting system being put to use yet. What we've seen is, in chronological order of videos being released:
1: Massive amounts of "artistic" point cloud data put together
2: A huge island composed of tonnes of "fiction", "factual", and "hybrid" creations, often in excessively redundant amounts.
3: Several indoor laser scanned areas, and a single outdoor laser scanned area, with supposedly no fancy effects other than the simple rendering of their engine.
In reality, we don't know what their modern dynamic lighting system looks like, how well their shadowing system works, or what post-process effects they have beyond that.
And in reality, we don't actually need to, because those aren't the core issue at hand. What matters is whether the system itself can render those amounts of point cloud data properly and at a reasonable speed, with little (if any) loss of quality.
On top of that, it would likely not be "too difficult" (read: not impossible) to implement post-processing effects such as lighting, flares, motion blur, and so on (AA not needed).
Fucking hate the way he says 'data'
DAHTAH.
But yeah I just tend to think "oh, these guys again" now, they've just made empty promises so far.
Is he seriously suggesting exclusively using laser scanners to create video game art assets? Does he think we're all huge idiots?
Point cloud data are completely useless in normal AAA games anyways though, so he's just a shitty scammer trying to get the public hyped about software that will never be used outside of specialized industries already working with point cloud data.
[QUOTE=Robber;46033142]Is he seriously suggesting exclusively using laser scanners to create video game art assets? Does he think we're all huge idiots?[/QUOTE]
No he isn't.
[QUOTE=Robber;46033142]Point cloud data are completely useless in normal AAA games anyways though, so he's just a shitty scammer trying to get the public hyped about software that will never be used outside of specialized industries already working with point cloud data.[/QUOTE]
Why exactly?
I'm sure that at some point it was absurd to think that polygons would be used in games, and they were only fit for specialized computing and visualization.
I remember when the first video of this was posted. Still waiting for a demo I can put on my PC and run in real time. Until then, I see no reason to believe in anyone claiming to have produced "unlimited detail" - unless, of course, he's also invented "unlimited memory".
I should preface all this by saying I haven't seen the video in the OP, mostly because I don't like those Euclideon guys. I am, however, very interested in seeing how far voxel-based engines can go for games, and even though 'Unlimited Detail' hasn't really borne any fruit for voxel-based games, there are plenty of other voxel-based engines that have been showing great progress.
A few I can list off the top of my head are the Atomontage Engine and Voxel Farm. The following media are all from the Atomontage Engine.
[QUOTE=Grindigo;46023874]This technology is bullshit, everyone with a working brain can tell it and it's nothing extraordinary, it's been used for ages, all the stuff they did was scan bunch of models from multi angled photos, [B]if there was a better way to achieve graphics then entire industry would have already jumped on that[/B], there is a reason why there isn't anything better than polygons, because there isn't, polygons offer both best functionality and control period.[/QUOTE]
You could argue the same about why the majority of the transportation industry hasn't switched from combustion engines to electric drivetrains.
It is an underdeveloped technology that currently sits far behind the established standard, even though it has the potential to surpass it tenfold if given the chance. This is why we have pioneers like Tesla (or even Facebook with Oculus): they have so much money that they can afford not to worry about job security, and can focus solely on iterating and improving the technology.
[QUOTE=Robber;46033142]Is he seriously suggesting exclusively using laser scanners to create video game art assets? Does he think we're all huge idiots?[/QUOTE]
I don't really see a problem with this. It'd be a return to making actual props, much like the movie industry's trend of building real sets rather than green-screening everything.
Even if you don't want to laser-scan everything, it doesn't mean you can't convert 3D models over to voxels.
[t]http://www.atomontage.com/sshots/ae_panels_g.jpg[/t]
[t]http://www.atomontage.com/sshots/ae_tank_1324.jpg[/t]
[t]http://www.atomontage.com/sshots/ae_mm_tank2543.jpg[/t]
Not to mention the true destructibility you can achieve with voxels is something polygons can only dream of.
[t]http://www.atomontage.com/sshots/atomontage_cut5.jpg[/t]
This first video shows off a large scene, which I believe was captured via photography from an airship. It also shows off one of the neat bits about the engine: it still supports polygon-based models, and the two can interact directly with each other. The second one is a truck driving and bouncing over terrain, carving little trails in the sand as it goes.
[media]http://www.youtube.com/watch?v=VYfBrNOi9VM[/media]
[media]http://www.youtube.com/watch?v=Gshc8GMTa1Y[/media]
[QUOTE=MegaJohnny;46034394]I remember when the first video of this was posted. Still waiting for a demo I can put on my PC and run in real time. Until then, I see no reason to believe in anyone claiming to have produced "unlimited detail" - unless, of course, he's also invented "unlimited memory".[/QUOTE]
The point is obviously that there is no hard limit to how much data it can handle, and only so much gets loaded into RAM at a time anyway. Thanks to its intelligent streaming of data, it can even be used across networks.
In the video aimed specifically at the geo-spatial industry, Bruce explained how you could simply send someone a link to a massive database stored in their format (say, an entire scanned city); they'd open the link on a laptop, load it into their program, and the point-cloud data would stream in as they fly around, letting them interact with it seamlessly. And their compression is apparently really good too (10-20% the size of similarly complex point cloud data in other formats), so it works on slower networks.
That shit's pretty awesome for the industry.
Oh, and hey, it could make streaming of videogames a lot more feasible. The games can be terabytes large, but they're stored on a server somewhere and streamed to you as point cloud data. You'd still have the base engine on your computer, so everything except the geometry is done locally: content streaming with no input lag, essentially. It also gives your computer a heck of a lot of headroom for post-process effects.
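The "stream only what the screen needs" idea described above can be sketched as a level-of-detail traversal over a spatial tree of points. To be clear, this is a generic illustration, not Euclideon's actual format or algorithm; the Node layout, the screen-size estimate, and the one-pixel threshold are all assumptions made up for the sketch:

```python
# Generic LOD selection over an octree-style point hierarchy (illustrative
# only): descend into a node's children only while its projected size on
# screen is bigger than the error budget, otherwise use the node's own
# coarse representative points.

class Node:
    def __init__(self, center, size, points, children=None):
        self.center = center            # (x, y, z) of the node's bounding cube
        self.size = size                # edge length of the cube
        self.points = points            # coarse points representing this node
        self.children = children or []  # finer sub-nodes, empty at the leaves

def select_lod(node, cam_pos, focal_px, max_px_error=1.0):
    """Collect the coarsest set of points whose on-screen error is acceptable."""
    dist = max(1e-6, sum((c - p) ** 2 for c, p in zip(node.center, cam_pos)) ** 0.5)
    projected_px = focal_px * node.size / dist  # rough screen-space size
    if projected_px <= max_px_error or not node.children:
        return node.points              # coarse enough (or a leaf): stop here
    selected = []
    for child in node.children:
        selected.extend(select_lod(child, cam_pos, focal_px, max_px_error))
    return selected
```

A streaming client would only fetch the nodes this traversal touches, which is why the whole scanned city never has to fit in RAM at once.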
There are plenty of companies that already have invested in this technology:
Kimoto, McMULLEN NOLAN Group, Aerometrex, C.R. Kennedy survey solutions, Schlenker Mapping, Universal Spatial Solutions, Meixner Imaging, Merrick, Stewart Weir and Mena 3D.
But if you prefer to believe it's fake because the creator is an oddball with a weird sense of humor and terrible advertising skills, go ahead.
[QUOTE=paul simon;46034507]
There are plenty of companies that already have invested in this technology:
Kimoto, McMULLEN NOLAN Group, Aerometrex, C.R. Kennedy survey solutions, Schlenker Mapping, Universal Spatial Solutions, Meixner Imaging, Merrick, Stewart Weir and Mena 3D.
[/QUOTE]
and they still can't hire a proper in-house artist..
[editline]20th September 2014[/editline]
a very efficient way of loading texture memory was developed by a very reputable person and was marketed as unlimited texture detail and met with similar skepticism
and then id's rage happened and proved that the skepticism was well grounded, unlimited texture detail was largely impractical, and that traditional methods of loading textures were more efficient and production friendly and flexible
[QUOTE=Juniez;46034567]and they still can't hire a proper in-house artist..[/QUOTE]
They're a technology company, not a game studio. Art isn't their thing.
This is a tool for others to use. He specifically mentioned in the island video that they're not artists, and that they'd very much like to see what actual artists could achieve when they have no polygon budget.
[editline]21st September 2014[/editline]
[QUOTE=Juniez;46034567]
a very efficient way of loading texture memory was developed by a very reputable person and was marketed as unlimited texture detail and met with similar skepticism
and then id's rage happened and proved that the skepticism was well grounded, unlimited texture detail was largely impractical, and that traditional methods of loading textures were more efficient and production friendly and flexible[/QUOTE]
This will obviously not be as production friendly to begin with, but it has extremely good potential.
Euclideon has a converter that works in Autodesk software, and generally they're trying as hard as they can to make it usable for devs.
And I for one hope they succeed.
Notice how there are no dynamic lighting demonstrations in any of the environments in the new Euclideon videos... The results they show are aesthetically indistinguishable from images that can be produced by projecting HDR photography onto modeled geometry, so as far as we know, everything displayed in this new video could be using flat shading.
even a single senior artist for half a year would give them ample results to demonstrate their technology; if they very much wanted to see what actual artists could achieve then maybe they should bring some aboard
[QUOTE=Juniez;46034672]even a single senior artist for half a year would give them ample results to demonstrate their technology; if they very much wanted to see what actual artists could achieve then maybe they should bring some aboard[/QUOTE]
That's really what they attempted for the island video. And it was pretty cool.
[editline]21st September 2014[/editline]
[QUOTE=Talkbox;46034669]Notice how there are no dynamic lighting demonstrations in any of the environments in the new Euclideon videos... The results that they show are aesthetically indistinguishable to images that can be produced using hdr projections onto modeled geometry, so as far as we know everything displayed in this new video could be using flat shading.[/QUOTE]
They did show dynamic lighting in the island video, but explained that it was still WIP.
I'm sure you can imagine that stuff like that is a challenge when the technology is so different from traditional approaches.
[QUOTE=paul simon;46034745]
They did show dynamic lighting in the island video, but explained that it was still WIP.
I'm sure you can imagine that stuff like that is a challenge when the technology is so different from traditional approaches.[/QUOTE]
If they wanted to demonstrate dynamic lighting that looks as photo-real as the baked lighting in the new video, they would have to generate light sources from their laser-scanned imagery and then use a real-time raytracer to render the environment at a decent framerate. That's not achievable on modern hardware, especially given the sheer number of voxels they're using.
[QUOTE=Talkbox;46034899]If they wanted to demonstrate dynamic lighting that looks as photo-real as the baked lighting in the new video, they would have to generate light sources from their laser scanned imagery and then use a real time raytracer to render the environment with a decent framerate. This is not achievable with modern hardware especially when you see that they are using tons of voxels.[/QUOTE]
Well no shit. When I say dynamic lighting, I mean stuff like basic shadow casting.
Area lights are extremely computationally expensive, and are rarely seen in games.
[QUOTE=Leintharien;46034479]...
Not to mention the true destructibility you can achieve with voxels is something polygons can only dream of.
[t]http://www.atomontage.com/sshots/atomontage_cut5.jpg[/t]
...[/QUOTE]
You can still do that with polygons fairly easily (clip planes, CSG, etc.); it just works better with voxels because they natively have a notion of an "inside": you can clip a voxel volume and still see solid material at the cut, while a clipped polygon mesh just leaves an infinitely thin shell.
L4D2 used something similar for gore rendering: it'd clip out part of the model and overlay another model in its place (something you'd still need to do for a voxel-based renderer, etc.).
I mean, it looks nice, but it doesn't look real.
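The "native inside" point is easy to show with a toy example. This is purely an illustration (a dense Python list, nothing like how Atomontage or any real engine stores volumes): carving a voxel solid is just clearing cells, and the freshly exposed cut surface is still solid material rather than a hollow shell.

```python
# Toy dense voxel grid: True = material present, False = empty.
# Carving removes cells; every cell behind the cut is still filled,
# so the cut face shows solid "interior" with no hole to patch up.

def make_solid(n):
    """An n x n x n block of filled cells."""
    return [[[True] * n for _ in range(n)] for _ in range(n)]

def carve_sphere(grid, cx, cy, cz, r):
    """Remove every cell within radius r of (cx, cy, cz)."""
    n = len(grid)
    for x in range(n):
        for y in range(n):
            for z in range(n):
                if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r:
                    grid[x][y][z] = False
```

With polygons, the same cut needs clip planes or CSG plus a generated cap to close the hole, which is the extra step described above.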
[QUOTE=Juniez;46034567]and they still can't hire a proper in-house artist..
[editline]20th September 2014[/editline]
a very efficient way of loading texture memory was developed by a very reputable person and was marketed as unlimited texture detail and met with similar skepticism
and then id's rage happened and proved that the skepticism was well grounded, unlimited texture detail was largely impractical, and that traditional methods of loading textures were more efficient and production friendly and flexible[/QUOTE]
Except that there's no unlimited texture detail here, because there are no textures at all. There are no classical models either, in the sense that we've come to know from ordinary games.
That said, if you open up RAGE today, you'll see that the game can still look stunningly good. I do agree, though, that in a world of polygon-based assets with limited loading, it can be difficult to make megatextures as flexible as ordinary texture loading/unloading.
On that note, streaming in just the pixels the screen actually needs from a huge index of data on the hard drive is a MUCH more efficient way to display 3D graphics. Storage (as well as loading) of that data *can* be a huge problem, though, especially if you want more and more detail. It obviously IS going to take up more space, even with on-the-fly decompression or binary bit unpacking, but given enough space it can also produce a more realistic, real-world-like image.
And before any idiots come out and say "but it doesn't look real", I can only say that you should stop dreaming. To represent the world 1:1, we would potentially have to recreate it 1:1, and as you might imagine, such a feat is not a simple thing to do.
I'm quite happy with the graphics we have nowadays, but I don't think people want 1:1 graphics. They just want to be fooled with realism, and we're getting closer and closer everyday.
[QUOTE=wauterboi;46038657]I'm quite happy with the graphics we have nowadays, but I don't think people want 1:1 graphics. They just want to be fooled with realism, and we're getting closer and closer everyday.[/QUOTE]
In the gaming industry, maybe, but otherwise I think people would like not to be fooled, but to actually see, e.g., architecture as it WOULD be in the real world: able to inspect it in an environment completely laser-scanned in, with the construction they wanted to inspect added on top.
Where's the proof beyond these terrible videos they shit out? Say what you want, but these videos make everything come off as too good to be true.
[QUOTE=chunkymonkey;46039244]Where's the proof beyond these terrible videos they shit out? Say what you want, but these videos make everything come off as too good to be true.[/QUOTE]
Well, considering various companies have bought their product, I suppose you *could* contact these companies to hear them out on it.
Alternatively, calm the fuck down and wait until they have a public demo. It'll happen, and even if it didn't, why do you care?
[QUOTE=chunkymonkey;46039244]Where's the proof beyond these terrible videos they shit out? Say what you want, but these videos make everything come off as too good to be true.[/QUOTE]
There's quite a bunch of proof if you bother googling around a bit.
They have a working SDK, and they have working converter tools for Autodesk programs.
The Australian government reviewed it and found it promising enough to give them a huge grant (I think it was 2 million Australian dollars).
Something like 10 companies have bought their technology and are currently using it.
Oh, and then there are a couple of interviewers who have gotten to try out a real-time demo on a laptop, and have written or made videos about it.
Where's something us plebs can use?
[QUOTE=chunkymonkey;46039466]Where's something us plebs can use?[/QUOTE]
Nowhere yet, you'll have to be a bit more patient.
Oooor you could request a demo from the distributor of Euclideon Geoverse, Meixner Imaging: [url]http://meixnerimaging.com/geoverse-sdk-info/[/url]
[editline]21st September 2014[/editline]
Here's a video of it in use by one of Euclideon's partners, Merrick.
[media]http://www.youtube.com/watch?v=uHG3bDc_JI4[/media]
What about the Brigade Engine? It uses raytracing.
[media]http://www.youtube.com/watch?v=BpT6MkCeP7Y[/media]
[QUOTE=Daemon White;46040106]What about the Brigade Engine? It uses raytracing.
[media]http://www.youtube.com/watch?v=BpT6MkCeP7Y[/media][/QUOTE]
[quote]Using 2 GTX TITAN[/quote]
That's the difference. The Unlimited Detail system doesn't.