A while back I thought of a theoretical way to store an infinite amount of data. It would only work if the following could be accomplished:
maintain perfectly the relative distance from a particle to a point
be able to double and halve the exact distance of this particle to that point
be able to know if the distance is exactly 1 unit or greater
be able to offset the distance by exactly 1 unit
It works like this. The particle starts at exactly 0 units from the origin; this defines an infinite string of 0s.
to write a bit:
  if the bit is a 1:
    add exactly 1 unit to the distance
    halve the distance of the particle
  if the bit is a 0:
    halve the distance of the particle

to read a bit:
  double the distance
  if the distance >= 1 unit:
    subtract 1 unit from the distance
    output 1
  else:
    output 0
So, as an example, to write 1101 to a blank particle:

distance ---- bit written
0 ---- 1
0.5 ---- 1
0.75 ---- 0
0.375 ---- 1
0.875

so the distance of 0.875 encodes the data 1101.
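A minimal sketch of the write/read steps above in Python (using exact fractions to avoid floating-point rounding; this models only the arithmetic, not a physical particle):

```python
from fractions import Fraction

def write_bits(bits):
    """Encode a bit string into a 'distance', following the write steps."""
    d = Fraction(0)          # a blank particle sits at exactly 0 units
    for b in bits:
        if b == "1":
            d += 1           # add exactly 1 unit to the distance
        d /= 2               # halve the distance of the particle
    return d

def read_bits(d, n):
    """Read n bits back out of a distance, following the read steps."""
    out = []
    for _ in range(n):
        d *= 2               # double the distance
        if d >= 1:
            d -= 1           # subtract 1 unit from the distance
            out.append("1")
        else:
            out.append("0")
    return "".join(out), d

d = write_bits("1101")
print(d)                     # 11/16, i.e. 0.6875
print(read_bits(d, 4)[0])    # 1011: bits come back in reverse order of writing
```

Note that reading returns the bits last-written-first, like popping a stack, so a device using this scheme would also have to reverse (or re-write) the stream to recover the original order.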
Theoretically this could hold an infinite amount of data. The drawback is that the data has to be read one bit at a time and can't be read starting at an arbitrary point on the drive. This type of storage would work well for surveillance, since it can store any amount of data and you can then seek through it like a tape. This is obviously beyond the realm of reality, but it is fun to think about, so I thought I would share it.
Theoretical Infinite Storage
Not quite infinite; you'll run into the Planck length sooner or later. Still a theoretically huge amount of data though
This is essentially arithmetic coding. While Huffman coding cannot reduce data below 1/8th of its original size (when coding byte-sized symbols), arithmetic coding has no such limit. In an experiment I once ran, it encoded a 1-megabyte file of 'A's into 3 bytes.
JPEG might use this method instead of Huffman coding, depending on patent licensing.
In the physical world it doesn't work, since the distances quickly become unmeasurable.
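The parallel can be sketched concretely: the particle scheme always halves its interval (one bit per symbol, no compression), while an arithmetic coder splits the interval in proportion to a probability model, so a heavily skewed source shrinks the output toward zero bits. The 99% model below is a made-up illustration, not the actual experiment:

```python
from fractions import Fraction
from math import ceil, log2

def arith_encode(symbols, p_one):
    """Toy arithmetic coder for a binary source with a fixed model:
    narrow [0, 1) in proportion to each symbol's probability, instead
    of always halving as the particle scheme does."""
    lo, width = Fraction(0), Fraction(1)
    for s in symbols:
        if s == "0":
            width *= 1 - p_one           # keep the lower sub-interval
        else:
            lo += width * (1 - p_one)    # skip past the '0' sub-interval
            width *= p_one
    return lo, width

# 100 ones under a model that expects ones 99% of the time:
lo, width = arith_encode("1" * 100, Fraction(99, 100))
print(ceil(-log2(width)))  # about -log2(width) bits suffice: prints 2, not 100
```

Any fraction inside the final interval identifies the whole message, which is why the number of bits needed depends only on the interval's width.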
If you have to be able to accurately move a particle by increments of 1/(2^n), couldn't we just store bits in a fashion where each takes up 1/(2^n) of the space in the physical world, and read them directly without having to decode the single non-binary value you'd otherwise measure?
There are disks out there storing data at near picometer^2/bit sizes. I'm not sure how simple it would be to move a particle with picometer accuracy to make this really any more "infinite" than what we have today.
maintain perfectly the relative distance from a particle to a point
Nice try, but I'll stop you there.
This is impossible. It violates the uncertainty principle, which is fundamental to quantum physics.
http://en.wikipedia....ainty_principle
Bell's inequality dictates that this is a fundamental property of matter/energy which cannot be compensated for by any extreme of measurement or control of local variables without violating relativity.
http://en.wikipedia....Bell_inequality
Even if you invalidated relativity, which would be quite a task in itself, you'd still have to explain Bose-Einstein condensates and common observable wave phenomena (such as the famous double-slit experiment), and the only way to do that without invoking something akin to Heisenberg's principles is to suggest "conscious" meddling in some form.
That is, you'd have to postulate some kind of deity, or computer program (if we're all in the matrix) that deliberately fools us into perceiving wave phenomena when, in fact, there are none.
The "conscious collapse" interpretations of quantum mechanics are not science, but pseudoscience quackery based loosely on the human-centric Copenhagen interpretation of quantum mechanics (which, taken to an extreme, is of no use here because it postulates only random information, which would amount to an infinite amount of storage for what amounts to TV static).
Of course, it's fun to think about. You might be able to get away with it in soft science fiction, but there are good reasons why it violates some of the fundamental laws of reality, and even mathematical logic (like Gödel's incompleteness).
which would amount to an infinite amount of storage for what amounts to TV static
More formally known as MTV.
Ah, this reminds me of point-cloud 3D imagery. Someone does have a project on point clouds for real-time gaming at the molecular level, but I'm skeptical given the lack of a demonstration.
As far as infinite data storage goes, it's a bit tricky. There's really no way to store infinite amounts of data.
BUT if you are procedurally generating and keep all the variables for generation constant at runtime every time, the data can quite easily be drawn up completely identically to the last time. There, at least, you can effectively work around the hardware limits. EDIT: Or rather, stave it off for a while.
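That round-trip is easy to sketch; `generate_chunk` here is a hypothetical stand-in for any generator whose variables are held constant:

```python
import random

def generate_chunk(seed, n=8):
    """Hypothetical content generator: with every variable held constant
    (here, just the seed), the data is drawn up identically each run."""
    rng = random.Random(seed)    # instance-local RNG, immune to global state
    return [rng.randint(0, 255) for _ in range(n)]

print(generate_chunk(42) == generate_chunk(42))  # True: only the seed is stored
```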
Morley
Aspiring programmer, modeler and game designer.
BUT if you are procedurally generating and keep all the variables for generation constant at runtime every time, the data can quite easily be drawn up completely identically to the last time. There, at least, you can effectively work around the hardware limits.
Certainly procedural generation can make it appear as if there is more data stored than there really is, but ultimately it's a visualization of a very limited amount of data in the form of a fractal that suggests more complexity than actually exists (if analyzed sufficiently, one would always find the same repeating patterns).
Just as we can easily write "turtles all the way down" and yield an infinite number of stacked turtles, it's not really an infinite amount of information; just the same piece of finite information repeated over and over again without end: a very, very small amount of actual information.
If your goal is repeatable procedural world generation that is truly infinite (instead of just repeating), you have to use a seed based on an irrational number that can be computed as one progresses through the world (starting locally with the highest decimal place, and moving out from there).
That would give you an infinite world, which is not simply a repeating pattern, and would look the same every time.
Of course, the problem with that is, as you expand outwards, the computation becomes more and more difficult, so either your progress slows to a crawl (taking hundreds of years to take a step, then millions of years, then billions of years) with increasingly larger computers needed (with increasing memory), requiring solar systems of space, then entire galaxies... Or you find a new irrational number to calculate.
Stepping ahead to the next irrational number just delays the problem. Eventually you are into the realm of googolplexth roots, and you run into the wall of processing power and computer memory again merely to fetch the next needed seed.
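As a sketch of the irrational-seed idea (sqrt(2) is an arbitrary choice here; any computable irrational would do), the digits can be produced exactly with integer arithmetic, and the cost of reaching further digits grows just as described:

```python
from math import isqrt

def sqrt2_digits(n):
    """First n decimal digits of sqrt(2) after the point, computed exactly:
    isqrt(2 * 10^(2n)) is floor(sqrt(2) * 10^n); strip the leading '1'.
    The expansion never repeats, so it can serve as an endless non-cyclic
    seed stream, but each call operates on ever larger integers."""
    return str(isqrt(2 * 10 ** (2 * n)))[1:]

print(sqrt2_digits(20))  # 41421356237309504880
```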
Consistent or complete- choose one. Gödel's a bitch, isn't he?
That's not to say we even need truly infinite worlds. Just with the information we could store on a thumb drive, it could take a human lifetime to explore.
If your goal is repeatable procedural world generation that is truly infinite (instead of just repeating), you have to use a seed based on an irrational number that can be computed as one progresses through the world (starting locally with the highest decimal place, and moving out from there).
While you're absolutely right on a theoretical level, you could just use a PRNG which has a period that is larger than the number of bits needed to generate content to last an entire lifetime. That would be enough for all practical purposes.
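For instance (a minimal sketch; the 13/7/17 shifts are Marsaglia's standard xorshift64 triple), even this tiny generator has period 2^64 - 1 over nonzero states, which at a billion outputs per second would take centuries to exhaust:

```python
def xorshift64(state):
    """One step of Marsaglia's xorshift64 PRNG; it cycles through all
    2^64 - 1 nonzero 64-bit states before repeating."""
    state ^= (state << 13) & 0xFFFFFFFFFFFFFFFF  # mask keeps it 64-bit
    state ^= state >> 7
    state ^= (state << 17) & 0xFFFFFFFFFFFFFFFF
    return state

# Stream content from a single small seed; the sequence is deterministic
# and will not repeat within any human lifetime of play.
s = 0x9E3779B97F4A7C15
outputs = []
for _ in range(3):
    s = xorshift64(s)
    outputs.append(s)
print(outputs)
```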
to write a bit:
  if the bit is a 1:
    add exactly 1 unit to the distance
    halve the distance of the particle
  if the bit is a 0:
    halve the distance of the particle

to read a bit:
  double the distance
  if the distance >= 1 unit:
    subtract 1 unit from the distance
    output 1
  else:
    output 0

So, as an example, to write 1101 to a blank particle:

distance ---- bit written
0 ---- 1
0.5 ---- 1
0.75 ---- 0
0.375 ---- 1
0.875

so the distance of 0.875 encodes the data 1101.
Doesn't 0.875 really encode the data 1110? I think 0.6875 would encode 1101.