Hi all,
I'm programming a small app that is based on players catching objectives every 5 minutes.
Depending on the location of the player, there will be an average objective point value.
Per location there will be multiple kinds of fish (the objectives) with varying rarities.
Say a location has 100,000 average point value:
Objective A with 50% chance will be worth x points (below average - 50,000 I think?).
Objective B with 1% chance will be worth x points (far above average - around a million I think?).
Objective C with 0.001% chance will be worth x points (EXTREME amount of points - don't even know).
Is there a function where I can plug in the average and the percentages, so that after a large enough sample size the average number of points comes out to ~100,000?
This has been racking my brain for the last couple of days...
Thanks in advance!
~Ryan
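A minimal sketch of the expected-value bookkeeping behind the question, assuming the listed rarities (50%, 1%, 0.001%) and that the remaining ~49% of catches are a "filler" fish whose value you solve for. The Objective C value and the filler bucket are hypothetical, since the post doesn't pin them down:

```python
# Sketch: check the long-run average for a hypothetical drop table.
# The constraint is sum(probability_i * value_i) == target average.
drop_table = [
    # (probability, point value)
    (0.50,    50_000),       # Objective A: below average
    (0.01,    1_000_000),    # Objective B: far above average
    (0.00001, 500_000_000),  # Objective C: extreme (hypothetical value)
]

target = 100_000
covered_p = sum(p for p, _ in drop_table)           # 0.51001
covered_ev = sum(p * v for p, v in drop_table)      # points already "spent"

# Solve for the value of the remaining filler catch so the overall
# expectation hits the target: p_rest * v_rest = target - covered_ev
p_rest = 1.0 - covered_p
v_rest = (target - covered_ev) / p_rest

expected = covered_ev + p_rest * v_rest
print(f"filler catch worth {v_rest:,.0f} points, expected value {expected:,.0f}")
```

With these illustrative numbers the three listed objectives only contribute 40,000 of the 100,000 target, so the filler catch ends up worth more than the target; tweaking any value or probability just has to keep the probability-weighted sum at 100,000.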