Learn Before
Trade-off Between Type I and Type II Errors
There is an inherent inverse relationship between the probability of committing a Type I error and the probability of committing a Type II error. If researchers try to reduce Type I errors by setting a stricter alpha level (e.g., .01 instead of .05), they make it harder to reject true null hypotheses, but they also make it harder to reject false ones, thereby increasing the risk of Type II errors. Conversely, raising the alpha level (e.g., to .10) reduces Type II errors but increases Type I errors. The standard alpha level of .05 serves as a conventional balance that keeps both error rates acceptably low.
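The trade-off can be seen directly in a small simulation. This is an illustrative sketch, not from the source text: it assumes a two-sided one-sample z-test (known population SD of 1, sample size 25, true effect of 0.5 under the alternative) and estimates both error rates at three alpha levels. The function name `error_rates` and all parameter values are hypothetical choices for the demonstration.

```python
import math
import random

def p_value(z):
    # Two-sided p-value for a standard-normal test statistic,
    # using the error function to compute the normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def error_rates(alpha, n=25, effect=0.5, trials=20000, seed=1):
    """Estimate Type I and Type II error rates by simulation."""
    rng = random.Random(seed)
    type1 = type2 = 0
    for _ in range(trials):
        # World 1: H0 is true (population mean 0, SD 1).
        # z = sample mean / (sigma / sqrt(n)) = sum / sqrt(n) when sigma = 1.
        z0 = sum(rng.gauss(0, 1) for _ in range(n)) / math.sqrt(n)
        if p_value(z0) < alpha:
            type1 += 1          # rejected a true H0: Type I error
        # World 2: H0 is false (population mean = effect).
        z1 = sum(rng.gauss(effect, 1) for _ in range(n)) / math.sqrt(n)
        if p_value(z1) >= alpha:
            type2 += 1          # failed to reject a false H0: Type II error
    return type1 / trials, type2 / trials

for alpha in (0.01, 0.05, 0.10):
    t1, t2 = error_rates(alpha)
    print(f"alpha={alpha:.2f}  Type I ~ {t1:.3f}  Type II ~ {t2:.3f}")
```

The estimated Type I rate tracks alpha itself, while the Type II rate moves in the opposite direction: it is largest at alpha = .01 and smallest at alpha = .10, which is exactly the inverse relationship described above.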
Tags
KPU
Research Methods in Psychology - 4th American Edition @ KPU
Learn After
Because there is an inherent inverse relationship between error types in hypothesis testing, what happens to the risk of committing a Type II error if a researcher decides to set a stricter alpha level (e.g., .01 instead of .05) to reduce Type I errors?
A researcher conducting a hypothesis test decides to use an alpha level of .10 instead of the conventional .05. This change will make it easier to detect a real effect if one exists, but it will also increase the chance of concluding that an effect exists when it actually does not.