I just did some calculations on how much difference a shit monitor makes versus a gaming monitor. I am comparing a 25 ms response time, 60 Hz monitor to a 1 ms response time, 144 Hz monitor. To show the difference, I am analyzing how long you have to react to a skillshot.
Understand that we have no control over when the skillshot is fired. The animation can begin just before a frame is produced on your screen (max time for you to react) or immediately after a frame is produced (shortest time for you to react), so there is error in this measurement. The frame interval is 1000/60 ≈ 16.7 ms at 60 Hz and 1000/144 ≈ 6.9 ms at 144 Hz, a difference of ≈ 9.7 ms, which I'll round to 9 ms. So the refresh-rate part of the difference in time to react between these two monitors averages out to 4.5 ms, with an error of +/-4.5 ms: sometimes you will have 4.5 ms longer than the average to react, and sometimes 4.5 ms less.
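The frame-interval arithmetic behind that 9 ms figure is easy to check. A quick sketch (using the exact values before rounding):

```python
# How long a new frame can take to appear after the skillshot animation
# starts: anywhere from 0 up to one full frame interval.
interval_60 = 1000 / 60    # ms per frame at 60 Hz, ~16.7
interval_144 = 1000 / 144  # ms per frame at 144 Hz, ~6.9

diff = interval_60 - interval_144  # worst-case gap, ~9.7 ms (rounded to 9 in the post)
half_window = diff / 2             # average advantage / error spread, ~4.9 ms

print(f"{interval_60:.1f} ms vs {interval_144:.1f} ms, "
      f"diff {diff:.1f} ms, +/- {half_window:.1f} ms")
```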
Since the difference in panel response time is 25 ms − 1 ms = 24 ms, that part of the calculation is easy.
Thus the final value for the difference in time to react between the two monitors is 24 + 4.5 = 28.5 ms, or (28.5 +/- 4.5) ms if you include the error.
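Putting the two pieces together in code (panel response time plus average refresh advantage, with the rounded 9 ms window from earlier):

```python
response_gain = 25 - 1   # ms: panel response-time difference
refresh_window = 9       # ms: rounded worst-case refresh gap (60 Hz vs 144 Hz)
refresh_gain = refresh_window / 2  # ms: average refresh advantage

total = response_gain + refresh_gain
error = refresh_window / 2

print(f"({total} +/- {error}) ms")  # (28.5 +/- 4.5) ms
```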
This means that since I average a 230 ms reaction time on my old 25 ms, 60 Hz monitor, I will have an effective (201.5 +/- 4.5) ms reaction time on the new 1 ms, 144 Hz monitor.
The % difference between the two times is [(230 − 201.5)/230] × 100% ≈ 12.4%.
Thus you could say you have a roughly 12% increase in time to react.
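The percent figure is just the 28.5 ms gain measured against the old 230 ms reaction time:

```python
old_reaction = 230.0            # ms: average reaction time on the old monitor
new_reaction = old_reaction - 28.5  # ms: effective reaction time on the new one

pct_gain = (old_reaction - new_reaction) / old_reaction * 100

print(f"{pct_gain:.1f}%")  # 12.4%
```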
This monitor costs $200.
$200 = 12% increase in time to react.