Hi!
I have a question and it may sound silly.
Instead of a long explanation, I'll post an example of the calculation.
3500 / 0.8 = 4375
3500 * 0.2 = 700; 3500 + 700 = 4200
In both cases I am trying to increase the amount by 20%. In the second method I multiply by 0.2 and add the result to the original amount; in the first method I divide the amount by 0.8.
Shouldn't the two answers be much closer to each other? The difference becomes massive for bigger numbers. What is the theory behind this?
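For reference, the two calculations can be reproduced in a short Python sketch (the variable names are my own):

```python
base = 3500

# Method 1: divide by 0.8
divided = base / 0.8

# Method 2: take 20% and add it on
increased = base + base * 0.2

# Dividing by 0.8 is the same as multiplying by 1/0.8 = 1.25,
# i.e. a 25% increase rather than a 20% one, which is why the
# two results drift apart as the base amount grows.
print(divided, increased, 1 / 0.8)
```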
Please help!