ID:152318
 
I've been thinking about making a dungeon crawler with random dungeons, and while doing it I came up with an idea for a supermassive overworld that stores the smallest possible amount of data while still letting you travel back to places you've already been: combining procedural map generation with a random number generator. Once someone writes a procedure that returns the same random number for the same seed and bounds (because the built-in rand() function won't return the same value for the same seed and bounds), they could store the seed and bounds (or just the seed, with the bounds coming from an overarching grid system) in a save file. When a player goes back to that area, the seed (and bounds, if you chose to save those) is grabbed again and the map is produced again.
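A rough sketch of the idea in Python (not DM; the chunk size, the SHA-256 seed derivation, and the function names are all just illustrative assumptions): each grid cell gets a seed derived from the world seed plus its coordinates, so revisiting a cell regenerates the identical map from nothing but the seed.

```python
import hashlib
import random

CHUNK_SIZE = 16  # assumed overarching grid: each chunk is 16x16 tiles


def chunk_seed(world_seed, cx, cy):
    """Derive a stable per-chunk seed from the world seed and grid coords."""
    key = f"{world_seed}:{cx}:{cy}".encode()
    return int.from_bytes(hashlib.sha256(key).digest()[:8], "big")


def generate_chunk(world_seed, cx, cy):
    """Regenerate the same chunk every time from only the seed."""
    rng = random.Random(chunk_seed(world_seed, cx, cy))
    # '.' = floor, '#' = wall; a real generator would be fancier.
    return [[rng.choice(".#") for _ in range(CHUNK_SIZE)]
            for _ in range(CHUNK_SIZE)]
```

The save file then only needs the world seed (and the player's position); every chunk is reproducible on demand.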

Do you think this would work, or could you see any eventual problems with it? I don't know if this has been done before, so let me know if it has.
If the environment is changed in any way, you aren't going to be able to keep those changes unless you save the changes specifically.
Another problem is generating the same map twice for two different areas. There's always that small chance.

I think I've seen something like it before, don't remember where.
In response to D4RK3 54B3R
Hence the reason you'd create your own random generation procedure. You could store the info in a datum, and then that data shouldn't get messed with.
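Rolling your own generator is straightforward if determinism is the only requirement. A minimal linear congruential generator, sketched in Python (the constants are the common Numerical Recipes ones; any fixed constants work):

```python
class LCG:
    """Tiny linear congruential generator: same seed -> same sequence,
    independent of whatever the platform's built-in rand() does."""

    def __init__(self, seed):
        self.state = seed & 0xFFFFFFFF

    def next(self, bound):
        """Return a value in [0, bound) and advance the state."""
        self.state = (self.state * 1664525 + 1013904223) & 0xFFFFFFFF
        return self.state % bound
```

Because the whole state is one 32-bit integer, the "info in a datum" is just the seed, and replaying it always yields the same map.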
It could work, but you'd have to be sure that your algorithm was deterministic, and it could be very CPU-intensive.

Also note that bigger is not necessarily better - the big problem with random map generation on a massive scale like this is that everything tends to look pretty similar, so you don't actually gain much.

The basic idea is sound, though. Spore apparently takes a somewhat similar approach for transmitting creatures over the network (transmit the small source information rather than the large result), though of course Spore creatures aren't generated using a random seed.
In response to Crispy
Actually, the idea I had with this was to be able to store the data in a small file. If you generated a map each time and then saved it to a file, that file could become rather large eventually. This way, all you have to do is store the seed value.
In response to Popisfizzy
I see... so you'd only generate areas on demand and keep nearby areas "uncompressed"? Then you'd chuck away the data for distant areas when the file started getting too large? That's a nice way of handling it.
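The generate-on-demand-and-evict idea above can be sketched as a small cache keyed by chunk coordinates; least-recently-used chunks (i.e. the ones the player walked away from) get chucked first. The class name, capacity, and eviction policy here are all illustrative assumptions:

```python
import collections


class ChunkCache:
    """Keep only recently visited chunks 'uncompressed'; regenerate the rest."""

    def __init__(self, generate, max_chunks=9):
        self.generate = generate          # fn (cx, cy) -> chunk data
        self.max_chunks = max_chunks      # e.g. the 3x3 of chunks around the player
        self.cache = collections.OrderedDict()

    def get(self, cx, cy):
        key = (cx, cy)
        if key in self.cache:
            self.cache.move_to_end(key)   # mark as recently used
        else:
            self.cache[key] = self.generate(cx, cy)
            if len(self.cache) > self.max_chunks:
                self.cache.popitem(last=False)  # evict least-recently-used chunk
        return self.cache[key]
```

Discarded chunks cost nothing to throw away, since the seed can rebuild them; only the delta of player changes would need to survive eviction.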