[QUOTE=paul simon;52034001]I'm just not sure you could [I]store[/I] it.[/QUOTE]
Yeah good point actually, I hadn't thought of that. You kinda need the previous state of the universe stored if you want to simulate the next state, right?
This is kind of a cop out but what if whatever is simulating our universe has far more advanced technology that we don't even know about yet? Or is that unlikely?
[QUOTE=paul simon;52034001]I'm just not sure you could [I]store[/I] it.[/QUOTE]
Yep this is the problem. Sure you could maybe build a processor to run a simulation of a volume in a volume smaller than it, but you have to take into account storage. The volume that stores this information can't get any smaller because at that point it lacks the information. Not to mention that you have to handle the non-locality of entanglement and the complex, intertwined interactions across literally every last bit of data of the universe, so all that data is really tied together making the storage problem even worse. Compression is probably not possible for a multitude of reasons.
It just doesn't check out as remotely plausible.
[editline]30th March 2017[/editline]
[QUOTE=kariko;52034139]This is kind of a cop out but what if whatever is simulating our universe has far more advanced technology that we don't even know about yet? Or is that unlikely?[/QUOTE]
The problem is one of physical limits, not of technology level. Everything our science points to says that hoping for some unknown aspect of physics to make this plausible is a resounding nope. There's simply no physics of our universe, nor even a glimmer of any on the horizon, under which a universe simulator works.
[QUOTE=DOG-GY;52034148]Yep this is the problem. Sure you could maybe build a processor to run a simulation of a volume in a volume smaller than it, but you have to take into account storage. The volume that stores this information can't get any smaller because at that point it lacks the information. Not to mention that you have to handle the non-locality of entanglement and the complex, intertwined interactions across literally every last bit of data of the universe, so all that data is really tied together making the storage problem even worse. Compression is probably not possible for a multitude of reasons.
It just doesn't check out as remotely plausible.
[editline]30th March 2017[/editline]
The problem is one of physical limits, not of technology level. Everything our science points to says that hoping for some unknown aspect of physics to make this plausible is a resounding nope. There's simply no physics of our universe, nor even a glimmer of any on the horizon, under which a universe simulator works.[/QUOTE]
Oh, I see!
All this stuff is super interesting though! And kind of confusing. The video started losing me near the end I'll admit... but I got most of it.
[QUOTE=JohnnyMo1;52032244]It depends strongly on your interpretation. My favorites though, particularly [url=http://www.preposterousuniverse.com/blog/2014/06/30/why-the-many-worlds-formulation-of-quantum-mechanics-is-probably-correct/]many-worlds[/url], don't even have collapse.[/QUOTE]
I don't know that I completely understood it but that was a pretty interesting read. Definitely can't disagree with the possibility at the very least.
[QUOTE=DOG-GY;52034148]Yep this is the problem. Sure you could maybe build a processor to run a simulation of a volume in a volume smaller than it, but you have to take into account storage. The volume that stores this information can't get any smaller because at that point it lacks the information. Not to mention that you have to handle the non-locality of entanglement and the complex, intertwined interactions across literally every last bit of data of the universe, so all that data is really tied together making the storage problem even worse. Compression is probably not possible for a multitude of reasons.
It just doesn't check out as remotely plausible.
[editline]30th March 2017[/editline]
The problem is one of physical limits, not of technology level. Everything our science points to says that hoping for some unknown aspect of physics to make this plausible is a resounding nope. There's simply no physics of our universe, nor even a glimmer of any on the horizon, under which a universe simulator works.[/QUOTE]
I see one flaw with your reasoning. You're assuming that the universe isn't deterministic. If it's deterministic then you can boil it down to an equation. If one axis of that equation is time then you can simply jump to any given point in time without having to store large amounts of information. Of course if this was true then it's still not actually a very persuasive argument for the universe being a simulation since having such an equation means they would be capable of knowing the state of the universe under any given circumstances.
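To make the jump-to-any-time idea concrete, here's a toy Python sketch (my own illustration, not real physics; just constant-acceleration motion with made-up values): a deterministic system with a closed form lets you evaluate the state at any t directly, while a step-by-step simulation has to pass through every intermediate state.

```python
# Toy illustration only: a "universe" with the closed-form solution
# x(t) = x0 + v0*t - g*t^2/2, where g is an illustrative constant.

G = 9.81  # m/s^2, illustrative

def state_by_stepping(x0, v0, t, dt=0.001):
    """Integrate step by step -- has to pass through every prior state."""
    x, v = x0, v0
    for _ in range(round(t / dt)):
        x += v * dt
        v -= G * dt
    return x

def state_closed_form(x0, v0, t):
    """Jump straight to time t; no history needed, nothing stored."""
    return x0 + v0 * t - 0.5 * G * t * t

# Both agree up to integration error, but the closed form keeps no state.
print(abs(state_by_stepping(0.0, 20.0, 2.0) - state_closed_form(0.0, 20.0, 2.0)) < 0.1)
```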
[QUOTE=Alice3173;52035338]I don't know that I completely understood it but that was a pretty interesting read. Definitely can't disagree with the possibility at the very least.
I see one flaw with your reasoning. You're assuming that the universe isn't deterministic. If it's deterministic then you can boil it down to an equation. If one axis of that equation is time then you can simply jump to any given point in time without having to store large amounts of information. Of course if this was true then it's still not actually a very persuasive argument for the universe being a simulation since having such an equation means they would be capable of knowing the state of the universe under any given circumstances.[/QUOTE]
Even if it is deterministic would you still not need to at some point contain information of much of, if not the whole system? For lack of a better analogy, and maybe this is where my flaw lies, even after processing the whole system it's still likely going to have a whole universe of information getting "outputted". Or to make a simplification assuming we are the subjects of the system, an output of all information of everything Earth or the solar system can ever physically interact with. So at some point all of this information of the universe at a single point in time must exist all at once. Back to the original storage problem?
There's also not much use in simulating a universe in a black box, so the architects would want to be able to observe and tinker with the universe. Now the entire simulation is crucial to all areas within light's reach because any point of observation can and does interact with the rest of the universe within its reach. We also at least [I]seem[/I] to experience time as continuous which gives the sense that the simulation is ever ticking forwards, making it even more doubtful that such a system exists. Assuming that means a simulation of such a scale must happen in reasonable time for these architects.
[QUOTE=DOG-GY;52035750]Even if it is deterministic would you still not need to at some point contain information of much of, if not the whole system? For lack of a better analogy, and maybe this is where my flaw lies, even after processing the whole system it's still likely going to have a whole universe of information getting "outputted". Or to make a simplification assuming we are the subjects of the system, an output of all information of everything Earth or the solar system can ever physically interact with. So at some point all of this information of the universe at a single point in time must exist all at once. Back to the original storage problem?
There's also not much use in simulating a universe in a black box, so the architects would want to be able to observe and tinker with the universe. Now the entire simulation is crucial to all areas within light's reach because any point of observation can and does interact with the rest of the universe within its reach. We also at least [I]seem[/I] to experience time as continuous which gives the sense that the simulation is ever ticking forwards, making it even more doubtful that such a system exists. Assuming that means a simulation of such a scale must happen in reasonable time for these architects.[/QUOTE]
Think of it as something like Minecraft's procedural landscape generation. It has templates for the objects and has the equation necessary to procedurally generate the world as you go without having to store absolutely everything.
As for the latter part, you can modify the equation being used in order to tinker with things. It'd just result in a different universe being generated. And for the bit on time, that seems to me to be a flaw in your viewpoint on the flow of time. You seem to be of the mind (unless I'm misunderstanding you) that our perception of time has to be the same as their external one when something external to our universe that's modifying it in any way would end up doing so on such a level that it'd effectively be something we can't sense at all.
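For anyone who hasn't seen how that works under the hood, here's a minimal sketch (the seed and the hash trick are toy stand-ins of my own, nothing like Minecraft's actual generator): the world is a pure function of seed + coordinates, so any region can be regenerated on demand instead of stored.

```python
# Minimal sketch: with a fixed seed, world data is a pure function of
# coordinates, so nothing needs storing -- regenerate any region on demand.
import hashlib

SEED = 42  # hypothetical world seed

def terrain_height(x, z):
    """Deterministic 'terrain height' at (x, z), derived from the seed."""
    digest = hashlib.sha256(f"{SEED}:{x}:{z}".encode()).digest()
    return digest[0] % 64  # a height in 0..63

# Same coordinates always regenerate the same value -- no storage needed.
print(terrain_height(10, -3) == terrain_height(10, -3))
```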
[QUOTE=DOG-GY;52035750]Even if it is deterministic would you still not need to at some point contain information of much of, if not the whole system? For lack of a better analogy, and maybe this is where my flaw lies, even after processing the whole system it's still likely going to have a whole universe of information getting "outputted". Or to make a simplification assuming we are the subjects of the system, an output of all information of everything Earth or the solar system can ever physically interact with. So at some point all of this information of the universe at a single point in time must exist all at once. Back to the original storage problem?
There's also not much use in simulating a universe in a black box, so the architects would want to be able to observe and tinker with the universe. Now the entire simulation is crucial to all areas within light's reach because any point of observation can and does interact with the rest of the universe within its reach. We also at least [I]seem[/I] to experience time as continuous which gives the sense that the simulation is ever ticking forwards, making it even more doubtful that such a system exists. Assuming that means a simulation of such a scale must happen in reasonable time for these architects.[/QUOTE]
It's possible for a computer to output a system that is larger than itself through time multiplexing, but for a system that's extremely interdependent and not well compressible (i.e. our universe) the simulation leading up to that point then is [code]Ω(exp(count of particles in the universe / storage, simulated time))[/code], or in other words: at best still ridiculously slow.
What happens if you put detectors at every tiny interval possible (which I know is difficult), and have them take a reading at every tiny time interval possible? Wouldn't you be able to retroactively deduce the particles' speeds and positions?
[QUOTE=Zenreon117;52035824]What happens if you put detectors at every tiny interval possible (which I know is difficult), and have them take a reading at every tiny time interval possible? Wouldn't you be able to retroactively deduce the particles' speeds and positions?[/QUOTE]
If I'm not completely mistaken, you'd be able to tell how the particle behaved classically in that interval (and just two detectors that give you the time are enough for that iinm), but the [I]current[/I] wave function would still be a huge mess so you still would be unable to make a deterministic prediction.
Here's a more practical visualization of the potential behavior of a quantum system:
[video=youtube;WIyTZDHuarQ]https://www.youtube.com/watch?v=WIyTZDHuarQ[/video]
[QUOTE=Alice3173;52035794]Think of it as something like Minecraft's procedural landscape generation. It has templates for the objects and has the equation necessary to procedurally generate the world as you go without having to store absolutely everything.
As for the latter part, you can modify the equation being used in order to tinker with things. It'd just result in a different universe being generated. And for the bit on time, that seems to me to be a flaw in your viewpoint on the flow of time. You seem to be of the mind (unless I'm misunderstanding you) that our perception of time has to be the same as their external one when something external to our universe that's modifying it in any way would end up doing so on such a level that it'd effectively be something we can't sense at all.[/QUOTE]
Right, I had the same concept in mind: Minecraft, NMS, etc. To reformulate my case:
To be useful all this information doesn't just disappear out of the processor. It returns an output, and that output is information needing to be stored (even if "storing" is only data in a cable on the way to its destination). So while with determinism you can perhaps reduce the universe to the range of the observable universe, and definitely to a single state in time, that single state (wherever it exists in the computer's pipeline) still contains the information of an entire universe.
As for the stuff on time I'm not of that mind. I'm making an assumption for the sake of casting doubt because it's likely true. My argument actually falls apart if the system is purely deterministic because that demands our consciousness is not real but merely an artifact of information, thus not a valid perspective to gauge anything about time outside our own system.
[editline]31st March 2017[/editline]
[QUOTE=Zenreon117;52035824]What happens if you put detectors at every tiny interval possible (which I know is difficult), and have them take a reading at every tiny time interval possible? Wouldn't you be able to retroactively deduce the particles' speeds and positions?[/QUOTE]
We don't even know if time is quantized. You also have to consider that the detectors themselves affect the system they're measuring.
[QUOTE=DOG-GY;52035925]Right, I had the same concept in mind: Minecraft, NMS, etc. To reformulate my case:
To be useful all this information doesn't just disappear out of the processor. It returns an output, and that output is information needing to be stored (even if "storing" is only data in a cable on the way to its destination). So while with determinism you can perhaps reduce the universe to the range of the observable universe, and definitely to a single state in time, that single state (wherever it exists in the computer's pipeline) still contains the information of an entire universe.[/QUOTE]
You're overlooking an important detail though. Minecraft in particular (not as sure about NMS) doesn't actively store the entire world at any one time. It only loads what you are currently interacting with. And it only stores to disk what's been changed in areas that have been interacted with. The rest can still be reconstructed from the equation. And in this case, since we're operating under the assumption at the moment that it's deterministic, there's no need to store changes to the disk. You just actively load the parts you need to look at as you need them and subsequently only need the storage space for the bits you need to see at any one given time.
For the time thing, I'll just drop that one. I'm not even sure how to word any further arguments on the matter anyways, lol.
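A rough sketch of that scheme in Python (hypothetical seed and block values, obviously nothing like Minecraft's real format): unmodified blocks come from the generator function, and only the player's edits take up storage.

```python
# Sketch of the store-only-the-diff scheme: unmodified blocks are
# regenerated from the seed; only edits are stored.
import hashlib

SEED = 1337  # hypothetical world seed

def generate_block(x, z):
    """The 'equation' for the world: a pure function of seed + coordinates."""
    return hashlib.sha256(f"{SEED}:{x}:{z}".encode()).digest()[0] % 4

edits = {}  # only the difference from the generated world lives in storage

def get_block(x, z):
    return edits.get((x, z), generate_block(x, z))

def set_block(x, z, block):
    edits[(x, z)] = block

set_block(5, 5, 99)     # the player changes one block
print(get_block(5, 5))  # 99, served from the edit store
print(len(edits))       # 1 -- storage holds one entry, not the whole world
```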
[QUOTE=carcarcargo;52033104]Sort of but not really, we don't really know what the hell is going on with quantum physics. An observer is more just anything it interacts with. There's some who argue it isn't until a person actually reads the results but I think that's more just philosophical babble.
There are genuinely odd quantum physics effects, such as electrons interfering with themselves in slit experiments and causing interference patterns.[/QUOTE]
The second point you mention is a consequence of what the video is talking about.
The probability density function - that is, the squared modulus of the wavefunction, |Ψ(x)|² - tells you the probability of finding a particle in some given region ([I]not[/I] at a given point, as technically that probability would be 0). The experiment you reference was performed by firing these electrons at two slits whose spacing and width were small enough to be contained within the PDF - meaning that unless the wavefunction collapsed through some observation strictly determining the position before the electron reached the slits, the electron is technically able to go through both slits at the same time, as its position is undefined; hence the interference.
Wikipedia's page on the double slit experiment has a good video showing the PDF / Wavefunction and how it can interfere with itself.
[vid]https://upload.wikimedia.org/wikipedia/commons/a/a0/Double_slit_experiment.webm[/vid]
When the electrons are observed after going through these slits, you can see a clear interference pattern in the PDF.
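If anyone wants to play with the numbers, here's a toy far-field calculation of my own (illustrative wavelength and slit geometry; equal amplitudes from both slits assumed): summing the two slits' amplitudes [I]before[/I] squaring produces fringes, whereas summing the probabilities would give a flat pattern.

```python
# Toy double-slit calculation: equal amplitudes from both slits, phases
# set by the path-length difference. Units are arbitrary/illustrative.
import math

WAVELENGTH = 1.0
SLIT_SEPARATION = 5.0
SCREEN_DISTANCE = 100.0

def intensity(y):
    """|psi1 + psi2|^2 at screen height y; works out to 2 + 2*cos(phase)."""
    r1 = math.hypot(SCREEN_DISTANCE, y - SLIT_SEPARATION / 2)
    r2 = math.hypot(SCREEN_DISTANCE, y + SLIT_SEPARATION / 2)
    phase = 2 * math.pi * (r2 - r1) / WAVELENGTH
    return 2 + 2 * math.cos(phase)

# The classical mixture |psi1|^2 + |psi2|^2 would be a flat 2 everywhere;
# the quantum pattern instead swings between ~0 and 4 across the screen.
samples = [intensity(i * 0.5) for i in range(-40, 41)]
print(round(max(samples), 3), round(min(samples), 3))
```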
I'm also just going to go and say: have fun trying to simulate reality. Once you realise how horrendously large Hilbert space is, it becomes fairly obvious from the cost of matrix algebra alone that it's virtually impossible to accurately simulate the universe. Quantum computers don't magically solve this problem: although they can work on every state at the same time, you still need to actually have a quantum computer - and then you've got error correction, decoherence to worry about, etc. etc.
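To put rough numbers on how badly Hilbert space scales, a back-of-the-envelope sketch (the 16 bytes per amplitude and the usual ~10^80-atoms estimate are my assumptions):

```python
# Back-of-the-envelope only: an n-particle two-level system needs 2**n
# complex amplitudes, so state-vector storage is exponential in n.

BYTES_PER_AMPLITUDE = 16               # one double-precision complex number
ATOMS_IN_OBSERVABLE_UNIVERSE = 10**80  # common order-of-magnitude estimate

def state_vector_bytes(n):
    """Memory to hold the full state vector of n two-level particles."""
    return (2 ** n) * BYTES_PER_AMPLITUDE

print(state_vector_bytes(30) // 2**30)  # GiB needed for just 30 particles
# Somewhere around ~270 particles, the amplitudes alone outnumber the
# atoms you could conceivably build the memory out of.
print(state_vector_bytes(270) > ATOMS_IN_OBSERVABLE_UNIVERSE)
```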
[QUOTE=Alice3173;52035943]You're overlooking an important detail though. Minecraft in particular (not as sure about NMS) doesn't actively store the entire world at any one time. It only loads what you are currently interacting with. And it only stores to disk what's been changed in areas that have been interacted with. The rest can still be reconstructed from the equation. And in this case, since we're operating under the assumption at the moment that it's deterministic, there's no need to store changes to the disk. You just actively load the parts you need to look at as you need them and subsequently only need the storage space for the bits you need to see at any one given time.
For the time thing, I'll just drop that one. I'm not even sure how to word any further arguments on the matter anyways, lol.[/QUOTE]
Sorry, still wasn't that clear I guess. I was bringing into question: Of what needs to be output to "display", how much data is necessary? My intuition tells me it's still on physically impossible scales.
[QUOTE=DOG-GY;52038987]Sorry, still wasn't that clear I guess. I was bringing into question: Of what needs to be output to "display", how much data is necessary? My intuition tells me it's still on physically impossible scales.[/QUOTE]
I don't think our intuition really matters at grand scales or infinitesimal scales
[QUOTE=HumanAbyss;52039247]I don't think our intuition really matters at grand scales or infinitesimal scales[/QUOTE]
If this were true we'd never learn anything past our daily experience, but look at all we've done. Would Dirac have ever discovered antimatter in his equations, years before experimental proof, without intuition? Would Feynman have been able to simplify quantum interactions down to child-friendly diagrams, when nobody thought a visual model was even possible, without intuition?
I understand a lot of this stuff (to be fair at a bit below surface level) intuitively now. Took a good number of books, study of optics, rendering programming, and more, but intuition can work at any scale with the right frame of mind.
[QUOTE=DOG-GY;52038987]Sorry, still wasn't that clear I guess. I was bringing into question: Of what needs to be output to "display", how much data is necessary? My intuition tells me it's still on physically impossible scales.[/QUOTE]
Ah, I get you now. I still disagree that it's totally impossible though. Implausible, yes, especially with our current understanding of the universe and current technology, but we don't know how things will change in the future, and we can be certain that anyone who's advanced enough to even attempt something like this is going to have access to technologies we can hardly imagine. I'd hazard a guess that at some point (probably very far off, like hundreds of millennia) in the future, enormous-scale supercomputers are going to be possible. The current limiting factor in such large-scale systems is the speed of light, but if we ever figure out stuff like wormholes or how to properly harness quantum entanglement, this would no longer be a limiting factor.
[QUOTE=Alice3173;52040328][...] how to properly harness quantum entanglement this would no longer be a limiting factor.[/QUOTE]
I agree with most of the post, but quantum entanglement can't be used to transmit information (as per the current understanding of it, but afaik everything known so far strongly suggests it's truly random).
[editline]1st April 2017[/editline]
[QUOTE=DOG-GY;52038987]Sorry, still wasn't that clear I guess. I was bringing into question: Of what needs to be output to "display", how much data is necessary? My intuition tells me it's still on physically impossible scales.[/QUOTE]
This really depends on the problem.
In this case, [URL="https://facepunch.com/showthread.php?t=1558459&p=52035818&viewfull=1#post52035818"]if you recompute all intermediate results and keep computation sequential[/URL], then the memory requirement should be proportional to the simulated time (technically the maximum potential interaction depth, but they grow in the same way) + the display itself.
The display could be on the same order of magnitude as current computer memory, but the rest of the storage would still be a stupidly big stack architecture. Still, it could be smaller than the universe it simulates.
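The trade-off being described can be sketched like this (toy update rule, nothing physical): you either store every intermediate state, or keep O(1) state and pay in recomputation time.

```python
# Toy sketch of the time/space trade-off: read out the state at time t by
# either storing every intermediate state (memory grows with simulated
# time) or recomputing from t=0 on each query (time grows instead).

def step(s):
    """One deterministic update (an arbitrary 64-bit LCG, purely a toy)."""
    return (s * 6364136223846793005 + 1442695040888963407) % 2**64

def state_stored(t, _history=[0]):
    """Memoize every state: O(t) memory, cheap repeated queries.
    (Mutable default arg used as a compact memo for this sketch.)"""
    while len(_history) <= t:
        _history.append(step(_history[-1]))
    return _history[t]

def state_recomputed(t):
    """Keep O(1) state and recompute from the start: O(t) time per query."""
    s = 0
    for _ in range(t):
        s = step(s)
    return s

print(state_stored(1000) == state_recomputed(1000))  # same result either way
```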