John_in_Parrish
A hypothetical program computes for 3 seconds, then outputs a variable-length record to be printed. The printer takes from 0.75 to 4.75 seconds (average 2.75 seconds) to print each output record. (Use a random number generator.)
The hypothetical program loops 500 times for each case of software buffers (0, 1, 2, 3, 4, 5, 10, 25, and 100 software output buffers). Calculate the AVERAGE time for the program to "virtually compute and print" a record from the 500 records, for EACH of the 9 choices of buffer. Plot the results.
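For context, here is a rough sketch of how I'm simulating it in Python. The uniform random print time, the FIFO buffer queue, treating the 0-buffer case as the program waiting synchronously on each print, and averaging the total elapsed time over the 500 records are my own assumptions about the assignment:

```python
import random

COMPUTE_TIME = 3.0                     # seconds to compute one record
PRINT_MIN, PRINT_MAX = 0.75, 4.75      # printer time range (uniform draw is my assumption)
RECORDS = 500


def simulate(num_buffers, seed=None):
    """Return the average time per record for one run of 500 records."""
    rng = random.Random(seed)
    program_time = 0.0     # simulated clock: when the program finishes its current step
    printer_free = 0.0     # simulated clock: when the printer finishes its current record
    pending = []           # FIFO of print-completion times for records still in buffers

    for _ in range(RECORDS):
        program_time += COMPUTE_TIME                    # compute the record
        print_time = rng.uniform(PRINT_MIN, PRINT_MAX)  # random print duration

        if num_buffers == 0:
            # No buffering: the program hands the record straight to the
            # printer and waits for the print to finish before continuing.
            printer_free = max(program_time, printer_free) + print_time
            program_time = printer_free
        else:
            # Buffer slots already drained by the printer are free again.
            pending = [t for t in pending if t > program_time]
            if len(pending) >= num_buffers:
                # All buffers full: block until the oldest record is printed.
                program_time = pending.pop(0)
            # The printer starts this record once it is free and the record exists.
            printer_free = max(program_time, printer_free) + print_time
            pending.append(printer_free)

    # The run ends when the last record has printed; average over the 500 records.
    return printer_free / RECORDS


for n in (0, 1, 2, 3, 4, 5, 10, 25, 100):
    print(n, round(simulate(n, seed=42), 3))
```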
The way I have the problem set up is (3 + avg time) / # of buffers. When I go to plot it, though, it makes sense for the 1-100 buffer cases but not for the 0-buffer case, since that divides by 0. So I am assuming I am setting this problem up incorrectly. It is a programming question, and I have no problem with the programming part; I just need some help getting the mathematical formula for the problem. Thanks in advance for your help,
John