[Case study] Procedural Generation (Computer technology)

Not much of a difference. You just ensure that the results the algorithm produces look somewhat like what you might get from a much more complicated and accurate method. That's what real-time coding for games and/or the demoscene is based upon, after all, cheating. You've got 2 choices. 1) Show them one you made earlier and pretend you made it up on the spot, or 2) Show them one you knocked together on the spot that looks real if you squint, then make sure that they have to squint.

No no, with your approach it is a one-step process where the algorithm is applied, makes extreme changes, and it's done. That is "weathering" and it's not a simulation, it's a visual effect just like a ship skin. Some game engine devkits even have a Weathering slider they can use to make objects look older through an algorithm. They just drag it from left to right, the item changes, it's all done in one swipe. A simulation differs in that small variations early in the simulation can lead to huge differences later on.

If you take that Weathering slider in your typical FPS devkit and slide it back and forth you'll get the same results all day long. If you simulate the weathering effects you will get a different result every time depending on the random "seed" that was put into the algorithm in the beginning. Two entirely different processes are occurring, and only the latter is procedural generation.


Or you use a lot of approximation and false "detail" passes that make what you are generating appear/behave somewhat like you would expect from a simulation, but at a much lower cost in processing terms.

I'm guessing that most cloud generation is done with simple noise algorithms, not a full simulation of the water cycle. But if it appears to be cloud-like, does it matter?

Would you believe me if I told you this actually takes a lot more processing power than simulating weather effects? Approximating involves a lot of floating-point calculations, and computers don't like floating point. At least CPUs don't. GPUs, ASICs and FPGAs work well with floating point, which is why they're typically used in these applications: you can take the shortcuts you are suggesting without it taking more time than doing it step by step. Moreover, this seems to be the exact opposite direction they are heading, because if they had any intention of using floating-point calculations heavily they would have converted the executable to 64-bit long ago.

I have a feeling Mr. Braben doesn't like shortcuts. ;)
 
I'm guessing that changes will be saved on a general level, but will be quickly brushed away over a shortened period of time so that storage doesn't increase exponentially because you like destroying cities.

For it to be realistic it will have to take into account player interaction. It's not a case of wanting to blow up cities. Think of our world now.

In the future it may be a lot worse than now, which would mean damage on a huge scale, happening daily. The game would need to generate the environment and its condition as it happens, for each player to see, because the idea is that we, the players, interact with the game.
However, if the game world has an invisible dome of protection over fixed items, i.e. stations, planets, etc., then where does that leave realism?

That is going to be very difficult for the game to achieve.

I think that is where I got "randomly generated" from, having watched an interview with the makers of NMS a while ago, where they were asked the same question. They said damage may not stay the same for all players, i.e. it's generated back to the initial look.
?
 
However, if the game world has an invisible dome of protection over fixed items, i.e. stations, planets, etc., then where does that leave realism?

You have to draw the line somewhere ...

Whilst I am sure things can and will get tweaked over time, realtime damage which is then fed down to other players is not going to be possible. And in all honesty it wouldn't work from a gameplay point of view anyway.

As stated, I am sure that things can get altered from one week to the next; for example, a city could be fine one day, then the next (due to the state of the system in question) it may be a smouldering wreck. That's certainly possible, but in realtime, as players are affecting it? No, that's just not possible on the scale of a planet, let alone an entire galaxy.

This thread demonstrates where people can get carried away far too easily. Expectations need to be reined in a little here.

I'm sure there will be some surprises, but I'm not expecting anything near that realistic, in all honesty. There are limitations to what you can do with technology - maybe not as a tech demo, but the game needs to be scalable on a variety of systems, and FD can't afford for the backend infrastructure to cost a fortune.

Keep it real people ...
 
No no, with your approach it is a one-step process where the algorithm is applied, makes extreme changes, and it's done. <snip> A simulation differs in that small variations early in the simulation can lead to huge differences later on.

(Have you seen Werkzeug 4?)

Anyway, you can always do multiple passes, and even introduce a little PRNG "chaos" to proceedings so the output doesn't always look the same everywhere.
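As a sketch of what "multiple passes plus a little PRNG chaos" could look like (a hypothetical toy, not any engine's actual weathering code), each pass nudges a height profile towards its neighbours by a seeded random transport factor:

```python
import random

def weather_passes(seed, surface, n_passes=3):
    """Apply several small smoothing passes to a height profile; a seeded PRNG
    varies how much material moves each step, so different seeds give different
    (but always reproducible) weathering."""
    rng = random.Random(seed)
    out = list(surface)
    for _ in range(n_passes):
        for i in range(1, len(out) - 1):
            k = rng.uniform(0.1, 0.3)          # random "transport" factor
            avg = (out[i - 1] + out[i + 1]) / 2
            out[i] += k * (avg - out[i])       # nudge towards neighbours
    return out

ridge = [0.0, 1.0, 4.0, 9.0, 4.0, 1.0, 0.0]
eroded = weather_passes(1234, ridge)           # same seed -> same erosion
```

Run it twice with the same seed and you get the identical "weathered" profile; change the seed and the peak erodes differently, which is the difference being argued about here.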

Would you believe me if I told you this actually takes a lot more processing power than simulating weather effects?

Probably not, but that might be unfair as I don't know you from Adam.

<patronising stuff snipped> if they had any intention of using floating-point calculations heavily they would have converted the executable to 64-bit long ago.

From what I understand, most of the maths stuff is 64-bit already - at least, that's according to what the developers have posted on this forum. Apparently, they could move over to 64-bit EXEs in a second if they wanted - I don't think they have any problem working in 64-bit math.

I have a feeling Mr. Braben doesn't like shortcuts. ;)

I have a feeling he won't have much of a choice but to use them with hundreds of billions of planets with wildly-varying compositions, sizes, tidal effects, climates, and ages to consider. Unless they decide to make us wait in that hyperspace tunnel for a couple of (hours/days/months/years) whilst simulating the developmental history of multiple planetary bodies within our chosen destination system...
 
For it to be realistic it will have to take into account player interaction. It's not a case of wanting to blow up cities. Think of our world now.

In the future it may be a lot worse than now, which would mean damage on a huge scale, happening daily. The game would need to generate the environment and its condition as it happens, for each player to see, because the idea is that we, the players, interact with the game.
However, if the game world has an invisible dome of protection over fixed items, i.e. stations, planets, etc., then where does that leave realism?

Well, our spaceships have magic hyperdrives and magic shields, so realism is ultimately relative. Why not have magic city-wide shields? Anyway, Frontier had domes over some cities. Domes are cool and futuristic.

I would guess that most populated worlds will be fairly calm, stable places by 3300, mainly because the available weaponry should be so potent by that stage that bringing about worldwide annihilation would be easier than making a decent cup of tea.
 
No system is generated until a player visits it, after which it is regenerated once a month unless no player has visited within 10 days of the due date for regeneration, in which case it will be generated when another player visits.

No need to PG 4 billion systems, 3.9 billion of which are probably going to remain unexplored forever.
 
Interesting discussion. I really don't know what FD\DB consider possible in the ED universe; I can just say that some of the PG I've seen looks pretty amazing :)
 
(Have you seen Werkzeug 4?)

Anyway, you can always do multiple passes, and even introduce a little PRNG "chaos" to proceedings so the output doesn't always look the same everywhere.
Then you're just mimicking real weathering the hard way.

Probably not, but that might be unfair as I don't know you from Adam.



From what I understand, most of the maths stuff is 64-bit already - at least, that's according to what the developers have posted on this forum. Apparently, they could move over to 64-bit EXEs in a second if they wanted - I don't think they have any problem working in 64-bit math.
Actually, I believe they said that anything they might have wanted to do in 64-bit is just as easily accomplished in 32-bit, and that's why there is no reason to change over yet. It takes more time to develop in 64-bit and they would see no benefit.

Also nice to see that I've garnered a reputation and every post I make will come with a free tongue lashing, no matter how sincere or polite.
I have a feeling he won't have much of a choice but to use them with hundreds of billions of planets with wildly-varying compositions, sizes, tidal effects, climates, and ages to consider. Unless they decide to make us wait in that hyperspace tunnel for a couple of (hours/days/months/years) whilst simulating the developmental history of multiple planetary bodies within our chosen destination system...

You're doing the same thing everyone else is doing, and assuming that it has to be taken into account down to the minutest detail. Details don't matter on a geological scale.

It may sound like I just contradicted my previous posts, but I didn't. It's... complicated, but let's just say that with true procedural generation you don't need to mimic or approximate nature; you can achieve the natural result, and do it more efficiently. There is literally no reason to approximate or take shortcuts if you know how it works on both a fine and a large scale.

David has had 3 decades of applying procedural generation in a practical manner. I trust him to get that right if nothing else. I may not agree with some of his game mechanic choices, his choices on community development, or any other number of things, but he's got the PG down pat. It's going to be more or less a matter of if he has enough time to get it to work in a way that he is satisfied with.
 
I can say that doing things procedurally doesn't necessarily mean that it's quick to calculate. I had to output four 8K tiles for a landscape for a film recently. I used a program called World Machine, which is procedurally based. What I would do is set the thing to calculate, go home, and come into work the next day and it would be finished. That was around 12 hours of processing time. Granted, the detail was astonishing, but 4x 8K isn't a lot when you're considering a whole planet to zoom around on.

That's just anecdotal. I'm not saying PG can't be fast, but for complex computations like erosion, you can be waiting a while in order to get a good result at a decent resolution. And obviously we're after nice results. ;)

I'm really looking forward to seeing what the devs get up to with this.
 
It needs to be done in real time or close to it - not because it can't be done slower, but because if you did it in advance, where on earth would you store 4,000 billion planets' worth of terrain data?

Take a silly low figure like 1 MB for a whole planet's terrain data.

That alone is 4 million terabytes!
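The arithmetic behind that figure, using the post's own numbers:

```python
planets = 4_000 * 10**9            # 4,000 billion planets
bytes_per_planet = 10**6           # the "silly low" 1 MB each
terabytes = planets * bytes_per_planet / 10**12
print(terabytes)                   # 4000000.0 -> four million terabytes
```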

So the terrain will need to be calculated by the client, either in a loading screen, by streaming in the generation while predicting where you're going next, or in realtime. Either way it needs to be milliseconds to seconds, not minutes or hours.
You are right about the amount of resulting data and it needing to be calculated on the fly. Thank goodness the demo scene worked it out, here's a terrain fly-over made in 4KiB (4096 bytes):
https://www.youtube.com/watch?v=AWcbj7ksqwE
Works fast enough, so if the client software is done properly, it will for us too.
Braben wrote Frontier in 68000 assembly; I don't think FD is afraid of doing such again for the critical parts.
 
Actually, I believe they said that anything they might have wanted to do in 64-bit is just as easily accomplished in 32-bit, and that's why there is no reason to change over yet. It takes more time to develop in 64-bit and they would see no benefit.

Apart from the whole "scale of space" thing, of course, where they have to use doubles for precision at certain times. Mark Allen said as much in some thread previously. So they use doubles already where necessary, and they have stated that they would have no problem building 64-bit EXEs if memory usage demanded it, but right now it doesn't. (Pretty much repeating myself from previously there.)

Also nice to see that I've garnered a reputation and every post I make will come with a free tongue lashing, no matter how sincere or polite.

Okay, "patronising" is not quite right. How about "condescending" then? (That's when you talk down to people - or over simplify things that you assume they don't know). ;)

You're doing the same thing everyone else is doing, and assuming that it has to be taken into account down to the minutest detail. Details don't matter on a geological scale.

No, I'm not - hence the word "approximation". Perhaps "simplification" would be less confusing? I don't know. I'm saying as long as it looks the part, it doesn't need to be an exact simulation.

It may sound like I just contradicted my previous posts, but I didn't. It's.... complicated but let's just say that with true procedural generation you don't need to mimic or approximate nature, you can achieve the natural result, and do it more efficiently. There is literally no reason to approximate or take shortcuts if you know how it works on a fine and large scale.

I think we are basically making the same points here. I know a little bit about a little bit, I've been playing about with some of this stuff for almost 20 years off and on, so I'm not entirely green.

Procedural generation is not that complicated really. Take a seed, generate some noise in varying dimensions and "octaves" as desired, feed that into whatever algorithms you need to generate terrain/textures/clouds/other, rinse and repeat.
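That seed-and-octaves recipe can be sketched in a few lines. This is a hypothetical minimal value-noise fBm in one dimension (real engines use 2D/3D gradient noise, but the structure - seed in, octaves summed, repeatable out - is the same):

```python
import math
import random

def lattice_value(seed, i):
    """A deterministic 'random' value at integer lattice point i."""
    return random.Random(seed * 1_000_003 + i).uniform(-1.0, 1.0)

def value_noise(seed, x):
    """Smoothly interpolate between neighbouring lattice values."""
    i = math.floor(x)
    t = x - i
    t = t * t * (3.0 - 2.0 * t)    # smoothstep easing
    return lattice_value(seed, i) * (1 - t) + lattice_value(seed, i + 1) * t

def fbm(seed, x, octaves=4):
    """Sum octaves of noise: each octave doubles frequency, halves amplitude."""
    total, amp, freq = 0.0, 1.0, 1.0
    for _ in range(octaves):
        total += amp * value_noise(seed, x * freq)
        amp *= 0.5
        freq *= 2.0
    return total

# The same seed reproduces the same "terrain" profile on every run:
profile = [fbm(42, x * 0.37) for x in range(16)]
```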

You can make stuff that looks the part very cheaply (as long as you don't care about caves, or lakes/rivers that make any sense). I've seen that demo posted previously by another poster (if it's the same one) - quite good-looking (albeit vegetation-free) terrain done in a 4KiB EXE with a day-night cycle, a very simple summer-winter cycle, and synthesised music. That's a bit different from simulation, however.

That's the point I was making. You shouldn't need to actually simulate the whole of a planet's lifecycle, just approximate the resulting appearance (and hopefully some plausible enough map data that can be used to fake wind/cloud/tidal patterns, and the location and concentration of various elements/minerals for mining purposes). I was just saying that if algorithms can be used to generate noise in multiple dimensions, one of those dimensions can be time...and if the right combination of algorithm, type of noise, and filters can be found, you should be able to get an accurate enough look without doing any hardcore simulation.
 
If you take that Weathering slider in your typical FPS devkit and slide it back and forth you'll get the same results all day long. If you simulate the weathering effects you will get a different result every time depending on the random "seed" that was put into the algorithm in the beginning. Two entirely different processes are occurring, and only the latter is procedural generation.
Put the same seed in every time and you'll get the same result. That's the basis of procedural generation. It basically encodes a complex result into a single number.
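In other words, the seed is the compressed form of the output. A toy illustration (function name hypothetical):

```python
import random

def weathering(seed, n_spots=5):
    """A toy damage pass: the seed alone determines where every blemish lands."""
    rng = random.Random(seed)
    return [(round(rng.uniform(0, 1), 3), round(rng.uniform(0, 1), 3))
            for _ in range(n_spots)]

# Same seed, same pattern: the whole result is "stored" in one integer.
assert weathering(1234) == weathering(1234)
# A different seed gives a different, but equally reproducible, pattern.
assert weathering(1234) != weathering(5678)
```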

Would you believe me if I told you this actually takes a lot more processing power than simulating weather effects? Approximating involves a lot of floating-point calculations, and computers don't like floating point. At least CPUs don't. GPUs, ASICs and FPGAs work well with floating point, which is why they're typically used in these applications: you can take the shortcuts you are suggesting without it taking more time than doing it step by step. Moreover, this seems to be the exact opposite direction they are heading, because if they had any intention of using floating-point calculations heavily they would have converted the executable to 64-bit long ago.
I dunno about your computer but my computer loves floating point, so much so it has specialised instructions to perform 4 functions at once.

I have a feeling Mr. Braben doesn't like shortcuts. ;)
No, he loves shortcuts. Computer graphics is a series of shortcuts.
 
You are right about the amount of resulting data and it needing to be calculated on the fly. Thank goodness the demo scene worked it out, here's a terrain fly-over made in 4KiB (4096 bytes):
https://www.youtube.com/watch?v=AWcbj7ksqwE
Works fast enough, so if the client software is done properly, it will for us too.
Braben wrote Frontier in 68000 assembly; I don't think FD is afraid of doing such again for the critical parts.

Yeah, I've seen that before. Here's one (albeit not as good looking) in 1K. That's exactly the kind of thing David Braben was trying to avoid, however. Where are the overhangs and caves? Where does the water in the lakes and seas come from and go to? There's a lot more than a height map and textures required to do a convincing job in this game. How they do that procedurally will be interesting to see.
 
Apart from the whole "scale of space" thing, of course, where they have to use doubles for precision at certain times. Mark Allen said as much in some thread previously. So they use doubles already where necessary, and they have stated that they would have no problem building 64-bit EXEs if memory usage demanded it, but right now it doesn't. (Pretty much repeating myself from previously there.)



Okay, "patronising" is not quite right. How about "condescending" then? (That's when you talk down to people - or over simplify things that you assume they don't know). ;)



No, I'm not - hence the word "approximation". Perhaps "simplification" would be less confusing? I don't know. I'm saying as long as it looks the part, it doesn't need to be an exact simulation.



I think we are basically making the same points here. I know a little bit about a little bit, I've been playing about with some of this stuff for almost 20 years off and on, so I'm not entirely green.

Procedural generation is not that complicated really. Take a seed, generate some noise in varying dimensions and "octaves" as desired, feed that into whatever algorithms you need to generate terrain/textures/clouds/other, rinse and repeat.

You can make stuff that looks the part very cheaply (as long as you don't care about caves, or lakes/rivers that make any sense). I've seen that demo posted previously by another poster (if it's the same one) - quite good-looking (albeit vegetation-free) terrain done in a 4KiB EXE with a day-night cycle, a very simple summer-winter cycle, and synthesised music. That's a bit different from simulation, however.

That's the point I was making. You shouldn't need to actually simulate the whole of a planet's lifecycle, just approximate the resulting appearance (and hopefully some plausible enough map data that can be used to fake wind/cloud/tidal patterns, and the location and concentration of various elements/minerals for mining purposes). I was just saying that if algorithms can be used to generate noise in multiple dimensions, one of those dimensions can be time... and if the right combination of algorithm and type of noise can be found, you should be able to get an accurate enough look without doing any hardcore simulation.

Not quite. Procedural generation uses more logarithms and algorithms, and approximation uses more RNG.

Nature uses logarithms and algorithms on both a small and large scale, such as fractals, crystals, spirals, etc... so it's easier to mimic nature with methods that prioritize them over pure chaos.

Everything from a nautilus shell to the distribution of mass in the universe is more easily reproduced through systematic mathematical methods than brute-forcing random parameters into an acceptable result.

That isn't to say that procedural generation doesn't use random elements, or that approximation doesn't use algorithms or logarithms; the difference is that PG uses them as a primary function and RNG is just there to add flavor, while approximation is the exact opposite.

And again, the less RNG the faster the process runs, similar results can be achieved with both methods, but not in a similar amount of time. Procedural Generation will always be faster for producing natural geological formations that are true-to-life.
 
I dunno about your computer but my computer loves floating point, so much so it has specialised instructions to perform 4 functions at once.

That's great, but your average ASIC does 32 per cycle, and that's the preferred architecture of AMD GPUs, which use hundreds of individual processors on a single chip. The GK110 approaches the process differently but gets similar results. Discussing this any further gets very windy and complicated, so I'll just break it down into a GFLOPS comparison, even though it doesn't convey the whole picture.


Premium Intel CPU: 354 GFLOPS
Premium AMD GPU: 5,632 GFLOPS
Premium Nvidia GPU: 4,494 GFLOPS


So unless you've got a $1000 i7-5960X processor overclocked to 6 GHz and cooled with liquid nitrogen 24/7, your processor doesn't even qualify for the qualifying races. And as big as that gap is between a CPU and a GPU, the gap between GPUs and specialized ASICs is just as big.

Not being rude here, just pointing out that it's an entirely different game once you start talking FPU-specialized hardware, which CPUs are not.

Graphics is a series of shortcuts, modeling is not. It's about precision. David is trying to model a galaxy, not design or render one.
 
Not quite. Procedural generation uses more logarithms and algorithms, and approximation uses more RNG.

I don't know where you are getting your definitions from. Are you talking approximation in a mathematical sense? Because I'm not.

I mean that what you want to do is get the approximate shape, then add detail. Look at Perlin noise. It's not random, it's seeded.

Nature uses logarithms and algorithms on both a small and large scale, such as fractals, crystals, spirals, etc... so it's easier to mimic nature with methods that prioritize them over pure chaos.

Yes, phi is a useful number. Who mentioned pure chaos anyway?

Everything...is more easily reproduced through systematic mathematical methods

You mean algorithms, right? Those things that I've been writing walls of text about?

than brute-forcing random parameters into an acceptable result.

You need "random" (in reality pseudo-random, reproducible randomness from a seed value) to make it look different each time, or you end up with billions of identical planets. I'm not sure what you are arguing now.

That isn't to say that Procedural generation doesn't use random elements, or that approximation doesn't use algorithms or logarithms, the difference is that PG uses them as a primary function and RNG is just there to add flavor, while approximation is the exact opposite.

PG uses randomness at its core, converting it to fit the requirements using domain-specific algorithms, heuristics, and data. That's how you get that "flavour". That's what I mean by PG.

If you want basic heightmap terrain, then you use PRNG noise to perturb the surface, and you increase the "octave" (or detail level) of the noise to make finer details. If you use the wrong type of noise or the wrong algorithm, you get unnatural-looking stuff. If you want overhangs, caves or other nice stuff, however, you would probably look at adding noise dimensions and increasing the number of passes to add those features. That is where the algorithms come in handy, but you still need the noise to provide something with that natural randomness to it.

And again, the less RNG the faster the process runs, similar results can be achieved with both methods, but not in a similar amount of time. Procedural Generation will always be faster for producing natural geological formations that are true-to-life.

There aren't any fast PG plate tectonics algos AFAIK. But a combination of the right type of noise, the right algorithms and the right filters could make something that looks almost right.

You know continental plates exist, you know that there's usually a ridge system where new plate rises from the magma, and you know you get folds where plates crumple together, trenches where one slides under the other, faults where they slide alongside each other, and sometimes volcanoes pop up. Given that information, you can generate something that looks the part. Voronoi-tessellate the planet into "plates" (using noise), pick a ridge system out (using noise), set direction vectors for each plate using the location of the ridges (and some noise), and perturb the terrain at the interaction boundaries as appropriate. That could be done relatively quickly. It's a procedural method, it assumes some knowledge of plate behaviour, but it also relies on the noise to make it unique.
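The Voronoi-plates idea can be sketched directly. This is a toy 2D version with illustrative names and parameters; a real planet would use points on a sphere and feed the boundary information into the heightmap pass:

```python
import math
import random

def make_plates(seed, n_plates=8):
    """Derive plate seed points and drift vectors from a single PRNG seed."""
    rng = random.Random(seed)
    sites = [(rng.random(), rng.random()) for _ in range(n_plates)]
    drifts = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(n_plates)]
    return sites, drifts

def plate_of(point, sites):
    """Voronoi assignment: a point belongs to its nearest plate site."""
    return min(range(len(sites)), key=lambda i: math.dist(point, sites[i]))

sites, drifts = make_plates(99)
# Rasterise plate ids over a grid; cells whose neighbours carry a different id
# lie on a plate boundary, where ridges/trenches/folds would perturb the
# terrain according to the relative drift of the two plates.
plate_map = [[plate_of((x / 32, y / 32), sites) for x in range(32)]
             for y in range(32)]
```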
 

The seed provided by the conditions of the system, the protoplanet that the generator starts with, and the number of cycles is all of the noise you need. Those factors are going to vary widely enough that you will never get a duplicate. Asteroids alone are enough RNG to guarantee this. Anything else is randomization, and in order to keep the randomization from producing undesirable results you'd have to add extra steps to the process that limit its effects. Why make things unnecessarily complicated if it's just going to make the process take longer anyway?

Breaking down the process into its individual parts is how you avoid relying on RNG. Take your example of tectonic plates. Do you put the generator for mountains inside the generator for tectonic plates, or do you make a separate generator for mountains, and let the tectonic-plate generator produce seeds for the mountain generator?

Keep in mind you do not need to think linearly. GPUs, ASICs, and FPGAs do not operate linearly like CPUs. The mountain generator can be running at the same time as the tectonic-plate generator if they are separate, whereas if they were combined into one process they would have to take that process step by step, in the order that the steps were described by the programmer.

You could say that hyperthreading alleviates that, and you would be right, but if the programmer combines the two generators and then separates them as simultaneous protocols he is just nesting the mountain generator within the tectonic-plate generator, and adding baggage to any other process that needs to use the mountain generator separately - especially in an environment where nesting isn't even necessary.

That's excluding the fact that hyperthreading pales in comparison to the multitasking methods used by GPUs, ASICs, and FPGAs.

So if we don't have to think linearly what is the limitation? Where do you stop using simultaneously running generators to produce natural results?

Personally I'd say the grass is a good stopping point. :D

Especially if some companies can live up to the promise of the 100 GFLOPS/watt FPGAs that I've seen tech sheets on.

Edit: Kinda found what I was remembering. This is easier to digest anyway.

Adapteva presentation poster.

They've done 25 GFLOPS/watt as a proof of concept and claim scalability to 100 with 28nm architecture. They aren't quite a bunch of no-name hobunks, either; they've got a few notches in their belt to be able to say they know what they're doing.

Second edit: Took the time to poke around some more. The 25 GFLOPS/watt version is available on Amazon and cheap enough to pick up as a toy to experiment with.

How did I miss this? Given David's involvement with the Raspberry Pi he's got to have his eye on this.
 
Breaking down the process into it's individual parts is how you avoid relying on RNG. Take your example of tectonic plates. Do you put the generator for mountains inside the generator for tectonic plates, or do you make a separate generator for mountains, and let the tectonic plates generator produce seeds for the mountain generator?

I suppose you would have to use terrain code to perturb the output data that came from the plate code, so it would be a two-pass deal. However, the noise code could be independent, generating repeatable values based on a seed and n-dimensional coordinates.

The plate code would filter all but the top n values and use those as the seed points for the voronoi cell "plates".

The terrain code would increase the noise magnitude to the difference from the base height of the current point, and use more octaves of noise to fill in details.

(You would want to use the same algorithms operating on normal maps instead of meshes/patches/height maps/whatever, so that any pop-in of terrain features would be kept to a minimum when flying towards the planet.)
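The seed-chaining between those passes might look like this (function names and parameters are entirely illustrative):

```python
import random

def plate_pass(world_seed, n_plates=6):
    """First pass: deterministically derive one sub-seed per tectonic plate."""
    rng = random.Random(world_seed)
    return [rng.getrandbits(32) for _ in range(n_plates)]

def mountain_pass(plate_seed, n_peaks=4):
    """Second pass: an independent generator driven only by its plate's seed."""
    rng = random.Random(plate_seed)
    return sorted(round(rng.uniform(0.5, 8.8), 2) for _ in range(n_peaks))

# The passes share nothing but seeds, so the mountain generator could run
# concurrently with, or completely separately from, the plate generator and
# still reproduce exactly the same ranges.
peaks_per_plate = [mountain_pass(s) for s in plate_pass(2024)]
```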

Keep in mind you do not need to think linearly.

Probably not, but you also can't make an omelette without breaking some eggs.
 
I suppose you would have to use terrain code to perturb the output data that came from the plate code, so it would be a two-pass deal. However, the noise code could be independent, generating repeatable values based on a seed and n-dimensional coordinates.

The plate code would filter all but the top n values and use those as the seed points for the voronoi cell "plates".

The terrain code would increase the noise magnitude to the difference from the base height of the current point, and use more octaves of noise to fill in details.

(You would want to use the same algorithms operating on normal maps instead of meshes/patches/height maps/whatever, so that any pop-in of terrain features would be kept to a minimum when flying towards the planet.)



Probably not, but you also can't make an omelette without breaking some eggs.

And here you've pointed out something crucial that I keep forgetting to mention. More RNG means more saved data, since it means more results that can't be reproduced by running the algorithm again. The more data created, the more expensive the entire process gets in bandwidth and storage costs.

Given the amount of time it should take for a player to safely approach a planet close enough to see detail/make an entry into the atmosphere, there should be no problem with creating a variety of seeds to be fed to the player's PC as you describe. You can have two sets of generators: one server-side, for working out the details accurately and saving the information, and another client-side, for reproducing the results as needed with significantly fewer steps involved. That would be a scenario in which the "once and done" approach would work perfectly.

Offline mode though.... I guess if the seeds were distributed regularly in patches that would reduce bandwidth costs and take care of offline mode.
 
And here you've pointed out something crucial that I keep forgetting to mention. More RNG means more saved data, since it means more results that can't be reproduced by running the algorithm again. The more data created, the more expensive the entire process gets in bandwidth and storage costs.

What are you on about? It's PRNG (PSEUDO-random number generation), not RNG - you don't have to store anything, that's the point of using it! It produces a random-looking sequence but it is repeatable - the same seed will always produce the same sequence.

Given the amount of time it should take for a player to safely approach a planet close enough to see detail/make an entry into the atmosphere, there should be no problem with creating a variety of seeds to be fed to the player's PC as you describe.

Why would you need to do that? You know where the planet is, and that information can be used as the seed for the PRNG. Every client can generate the seeds itself. Obviously, orbits will occur, positions will change, but the orbital path and speed is very unlikely to change, so that information would be a fairly safe bet for a seed.
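Folding stable catalogue data into a seed could be as simple as the following (a hypothetical scheme; the field choices are illustrative):

```python
import hashlib
import struct

def seed_from_orbit(system_id, semi_major_axis_km, period_s):
    """Hash stable orbital parameters into a 64-bit PRNG seed that every
    client derives identically from the same catalogue data."""
    payload = struct.pack('<qdd', system_id, semi_major_axis_km, period_s)
    return int.from_bytes(hashlib.sha256(payload).digest()[:8], 'little')

# No seed ever needs to be sent over the network or stored server-side:
seed = seed_from_orbit(17, 149_597_870.7, 31_557_600.0)
```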

can have two sets of generators: one server-side, for working out the details accurately and saving the information, and another client-side, for reproducing the results as needed with significantly fewer steps involved. That would be a scenario in which the "once and done" approach would work perfectly.

Except the server would eventually have to store millions of terabytes of data if that were the case. Much better to have the client generate that data in the background on the fly each time you hyperspace to a system.

Offline mode though.... I guess if the seeds were distributed regularly in patches that would reduce bandwidth costs and take care of offline mode.

Or you could just do what I said earlier, use PRNG and save the effort.
 