I'm trying to implement a scoring system in a small app I'm working on.

Every x minutes the player has a chance to catch an objective, and each location has an average score per catch.

Say a location's average score is 100,000 per objective caught. How would I calculate each objective's score based on its rarity?

For example:

Objective A has a 50% catch chance, so it should be worth x points.

Objective B has a 1% catch chance, so it should be worth y points (far more than A).

Objective C has a 0.1% catch chance, so it should be worth z points (an extremely large number).

After a large enough sample of catches, the average score per catch should come out at ~100,000.
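To make the constraint testable, here's a rough simulation sketch of what I mean. The formula inside `scores_for_chances` (points inversely proportional to catch chance, scaled to hit the target average) is just one candidate I'm illustrating with, and the model of independent per-objective rolls each tick is my own assumption, not necessarily how it should work:

```python
import random

def scores_for_chances(chances, target_avg):
    # Candidate formula: points inversely proportional to catch chance,
    # with a constant k chosen so the expected average per catch
    # works out to target_avg under this model.
    k = target_avg * sum(chances) / len(chances)
    return [k / p for p in chances]

def average_per_catch(chances, scores, attempts, rng):
    # Each attempt, every objective is rolled independently against
    # its own catch chance; track total points and total catches.
    total, caught = 0.0, 0
    for _ in range(attempts):
        for p, s in zip(chances, scores):
            if rng.random() < p:
                total += s
                caught += 1
    return total / caught

chances = [0.50, 0.01, 0.001]  # objectives A, B, C
scores = scores_for_chances(chances, 100_000)
rng = random.Random(42)
avg = average_per_catch(chances, scores, 500_000, rng)
print(scores, round(avg))
```

With enough attempts, `avg` lands near 100,000 and the 0.1% objective is worth vastly more than the 50% one, which is the behaviour I'm after.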

I can't think of a formula that solves this; I'm by no means a mathematician.

Thanks for your help in advance!

~Viscosity