Board Thread:Questions and Answers/@comment-24071602-20141011230416/@comment-2.162.117.97-20141017211128

Sinthoniel wrote: I looked for it but I did not understand this part of the code :/ If anyone can figure out what this means, please tell me:

Let's try, as far as my math goes (annotations follow the code lines):

// Called with probability = 0.1F / 0.3F / 0.9F (rare, uncommon, common) and seconds = 3600 (one hour).
public static int getChance(float probability, int seconds)

{

int ticks = seconds * 20; // 20 ticks per second, so 72,000 ticks per hour.

double d = (double)probability;

d = 1D - d; // Probability that nothing happens in the whole hour: 0.9 / 0.7 / 0.1.

d = Math.pow(d, 1D / (double)ticks); // Per-tick probability that nothing happens (nearly 1).

d = 1D - d; // Per-tick probability that something happens (nearly 0).

d = 1D / d; // Average number of ticks until something happens (mean of a geometric distribution).

d /= 20D; // Average time until something happens, in seconds:
// 34168 secs (9.5 hrs), 10093 secs (2.8 hrs), 1563 secs (0.4 hrs).

int chance = (int)Math.round(d);

return chance;

}
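To double-check the annotated averages, the function can be run as-is in plain Java. A quick sketch (the class name and the main method are just scaffolding I added, not part of the original code):

```java
public class ChanceCheck {
    // Reconstructed unchanged from the snippet above.
    public static int getChance(float probability, int seconds) {
        int ticks = seconds * 20;                // 20 ticks per second
        double d = (double) probability;
        d = 1D - d;                              // chance that nothing happens in the whole hour
        d = Math.pow(d, 1D / (double) ticks);    // per-tick chance that nothing happens
        d = 1D - d;                              // per-tick chance that something happens
        d = 1D / d;                              // average ticks until it happens
        d /= 20D;                                // ticks -> seconds
        return (int) Math.round(d);
    }

    public static void main(String[] args) {
        System.out.println(getChance(0.1F, 3600)); // rare:     34168
        System.out.println(getChance(0.3F, 3600)); // uncommon: 10093
        System.out.println(getChance(0.9F, 3600)); // common:   1563
    }
}
```

The printed values match the three averages annotated above.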

public static int RARE = getChance(0.1F, 3600);

public static int UNCOMMON = getChance(0.3F, 3600);

public static int COMMON = getChance(0.9F, 3600);

I assume that every second a random number between 0 and the corresponding chance value (34168, 10093 or 1563) is generated; when that number is 0, an invasion spawns (check the code where the chances are used).
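If that guess is right, the per-second check would look roughly like this. This is only a sketch of the assumed usage (the names InvasionTick and invasionStartsThisSecond are made up, and I'm assuming java.util.Random; the real mod code may differ):

```java
import java.util.Random;

public class InvasionTick {
    static final Random RNG = new Random(42L);  // fixed seed just to make the demo repeatable

    // Hypothetical per-second check: nextInt(chance) returns 0..chance-1,
    // so this fires with probability 1/chance each call.
    static boolean invasionStartsThisSecond(int chance) {
        return RNG.nextInt(chance) == 0;
    }

    public static void main(String[] args) {
        int chance = 1563;                      // the "common" value computed above
        int seconds = 1_000_000;
        int hits = 0;
        for (int s = 0; s < seconds; s++) {
            if (invasionStartsThisSecond(chance)) hits++;
        }
        // On average one invasion every 1563 seconds, i.e. roughly 640 hits here.
        System.out.println(hits);
    }
}
```

Over a million simulated seconds the hit count comes out near 1,000,000 / 1563 ≈ 640, which is exactly the "invasion every ~26 minutes" behaviour the averages predict.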

When you call the function with the extremes 0.0 (never an invasion) and 1.0 (always an invasion), you divide by zero for 0.0 (in Java that's not an error for doubles: 1D/0D just gives Infinity, i.e. a veeeeery large number) and get a 0 for 1.0 (which means an invasion every second), so I think I got it right.
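The extremes can be checked directly with the same reconstructed function. One Java subtlety worth noting: Math.round(Infinity) returns Long.MAX_VALUE, and the (int) cast then narrows that to -1, so the 0.0 case doesn't actually produce a huge chance value in practice:

```java
public class ChanceExtremes {
    // Same reconstructed function as above, slightly condensed.
    public static int getChance(float probability, int seconds) {
        int ticks = seconds * 20;
        double d = 1D - (double) probability;
        d = Math.pow(d, 1D / (double) ticks);
        d = 1D - d;          // 0.0 when probability == 0.0, so the next line divides by zero
        d = 1D / d;          // in Java this yields Infinity for doubles, not an exception
        d /= 20D;
        // Math.round(Infinity) == Long.MAX_VALUE; the (int) cast narrows that to -1.
        return (int) Math.round(d);
    }

    public static void main(String[] args) {
        System.out.println(getChance(1.0F, 3600)); // prints 0  -> invasion check passes every second
        System.out.println(getChance(0.0F, 3600)); // prints -1 -> the "never" case overflows the int
    }
}
```

So the 1.0 end behaves exactly as described (chance 0, invasion every second), while the 0.0 end relies on that probability never actually being passed in.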

I hope my statistics knowledge is still correct; it's been a long time since I learned these things, and it took me a while to remember.