Calculate risk of death in a computer-simulated world

poizn99

Hi all

This is my first post here.
I have a very difficult math problem that I hope someone can help with.
I am a programmer and I am interested in evolution, so I decided to create a computer-simulated world.
It is a very simple world with only a few functions at the moment.
You can be born, live and then die (I have commented out the pair bonding and reproduction until I can fix my problem).

The problem I am having is calculating when people should die.
Basically, in the program there is a loop, and each iteration of the loop represents a year for the people.
In the loop I have a function that calculates whether each person should die that year.
The calculation goes like this.
It gets a random number between 0 and 100, and if the random number is less than a risk factor (I explain the risk next), the person dies.
The risk factor is the problem.
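
In code, the check looks something like this (a minimal Python sketch of what I mean; dies_this_year and risk_of_death are made-up names, and risk_of_death is whatever function I end up with):

    import random

    def dies_this_year(age, risk_of_death):
        # risk_of_death(age) returns a percentage in [0, 100).
        # Roll a random number in [0, 100); the person dies if it lands below the risk.
        return random.uniform(0, 100) < risk_of_death(age)

The main loop would call this once per living person each simulated year.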

I want the average age at death (in years) for my people to be a specific number; let's use 4 for now.
So over a few hundred people, the average age for the people that have died should be 4.
So if one person dies at age 3 and another dies at age 5, then the average age is 4.

I have tried a few different ways of calculating the risk.
Some have been my own ideas and some are existing math functions, like the binomial distribution, negative binomial distribution, logistic functions, etc.

So my question is: how do I calculate the risk each year, so that the average age at death for my people is a specific number (let's say 4 for now)?
Obviously the equation(s) need to take into account the person's current age and the average that I am aiming for.

For anyone willing to help, please bear in mind that I have been out of school for a while, so my math knowledge is pretty basic at the moment.
Also, my world isn't going to represent the real world 100%, so I'm not concerned about real-life death tables, spikes in mortality rates at birth, etc.
Also, the risk of death should never be 100% or more, so the risk should get close to 100% but never actually get there.
If I am not mistaken, this means the risk curve (risk against age on a graph) can't be linear; it would be some kind of curve, probably something close to the top-left corner of a circle.

Thanks in advance :)
 
It would help if your design were rational. Barring that, look up "Gompertz" and "Makeham" and think on "force of mortality".
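
For instance, the Gompertz–Makeham force of mortality is mu(x) = a + b*c^x, and a per-year death probability that never reaches 100% falls out of it like this (a minimal Python sketch; the parameter values are made up, not calibrated to any particular average age):

    import math

    def p_gompertz_makeham(age, a=0.005, b=0.001, c=1.6):
        # Force of mortality: an age-independent Makeham term a
        # plus an exponentially growing Gompertz term b * c**age.
        mu = a + b * c ** age
        # One year of exposure to hazard mu kills with probability
        # 1 - exp(-mu), which approaches but never reaches 1.
        return 1 - math.exp(-mu)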
 
Let p(x) be the probability that an individual aged x dies during the current year.

Then the probability of dying at exactly age x (that is, of surviving the first x years and then dying) is

    q(x) = [1 - p(0)] * [1 - p(1)] * [1 - p(2)] * ... * [1 - p(x-1)] * p(x)

The average age at death will be q(1) + 2q(2) + 3q(3) + 4q(4) + ...
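
Numerically, for any p(x) you can build up q(x) and the average age in one pass (a minimal Python sketch; average_age and max_age are made-up names, and max_age just truncates the sum once survival is negligible):

    def average_age(p, max_age=10_000):
        survive = 1.0            # probability of being alive at the start of age x
        mean = 0.0
        for x in range(max_age):
            q = survive * p(x)   # probability of dying at exactly age x
            mean += x * q
            survive *= 1 - p(x)
        return mean

For instance, average_age(lambda x: 0.2) comes out at about 4, matching the constant example below.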

An example:

If p(x) = P, a constant, then q(x) = P(1 - P)^x, and the average age is P(1 - P) + 2P(1 - P)^2 + 3P(1 - P)^3 + ..., which equals (1/P) - 1. Turned around: for a target average age A, take P = 1/(A + 1); for A = 4 that gives P = 0.2.
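
A quick Monte Carlo check of that (a Python sketch, nothing more; lifespan is a made-up name):

    import random

    def lifespan(P):
        age = 0
        # Each pass of the loop is one year; die with probability P.
        while random.random() >= P:
            age += 1
        return age

    P = 1 / (4 + 1)  # target average age A = 4, so P = 1/(A + 1) = 0.2
    ages = [lifespan(P) for _ in range(100_000)]
    print(sum(ages) / len(ages))  # prints roughly 4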

I'd suggest either:

* choosing a formula for q(x), and working out the corresponding p(x) via q(x)/q(x-1) = p(x)[1/p(x-1) - 1] (see the sketch after this list). The advantage is that it's easy to choose q(x) to give the right average age.
* choosing a formula for p(x), and working out the corresponding q(x). The advantage is that p(x) is what you'll actually use in your program to decide whether a particular individual drops out.
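
A sketch of the first route in Python, using the equivalent direct form p(x) = q(x)/S(x), where S(x) = 1 - q(0) - ... - q(x-1) is the chance of still being alive at age x (hazards_from_q is a made-up name, and the q table is made up to have mean 4):

    def hazards_from_q(q):
        # Convert a distribution of age at death, q, into per-year
        # death probabilities p via p(x) = q(x) / S(x).
        p = []
        survive = 1.0
        for qx in q:
            p.append(qx / survive if survive > 0 else 1.0)
            survive -= qx
        return p

    # Made-up q with mean 2*0.1 + 3*0.2 + 4*0.4 + 5*0.2 + 6*0.1 = 4.
    q = [0.0, 0.0, 0.1, 0.2, 0.4, 0.2, 0.1]
    print(hazards_from_q(q))
    # Note: with a finite q table the last hazard is 1 (up to rounding);
    # pick a q with an infinite tail, like the geometric q(x) = P(1 - P)^x
    # above, to keep every p(x) below 100%.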
 