
Reliability: Measurement Error Sources - Intra-Rater Variability/Reliability
- Comparison of same measure by same tester on different occasions (coach bias)
- If an error occurs, when does it, and can we isolate why?
- How can we limit this?

Reliability: Measurement Error Sources - Inter-Rater Variability/Reliability
- Comparison of same measure between two testers (tester bias)
- Two people looking at the same measures, same tests, etc., but come to different conclusions
- Need to have open discussions
- How can we limit it?

Reliability: Measurement Error Sources - Intra-Subject Variability/Reliability
- This is about the person being tested (subject)
- Consistency of performance by person being tested
- If an error occurs, when does it, and can we isolate it?
- This is why, in fitness testing situations, we want the conditions to be standardized

Reliability: Measurement Error
- Systematic error affects all variables or participants equally (more common)
- Random error affects all variables or participants inconsistently (tends to be a result of human influence)

Interpreting 'r' (table fragment): .90 to 1.0 = Very high (reliability, validity); values near 0 = Little to no relationship
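The distinction between systematic and random error can be seen in a quick simulation. This is a minimal sketch with hypothetical scores (all numbers are made up for illustration): a systematic error shifts every participant equally and so preserves rank order perfectly, while random error scatters participants inconsistently.

```python
import numpy as np

rng = np.random.default_rng(0)
true_scores = rng.normal(50, 5, size=100)  # hypothetical true abilities

# Systematic error: affects all participants equally
# (e.g., a mis-calibrated device adding 2 units to every measurement).
with_systematic = true_scores + 2.0

# Random error: affects participants inconsistently
# (e.g., human timing variability), different on every measurement.
with_random = true_scores + rng.normal(0, 2, size=100)

# Systematic error leaves the correlation with true ability at exactly 1;
# random error pulls it below 1.
r_sys = np.corrcoef(true_scores, with_systematic)[0, 1]
r_rand = np.corrcoef(true_scores, with_random)[0, 1]
print(r_sys, r_rand)
```

This is also why systematic error, although "more common," is less damaging to reliability statistics: a constant shift does not change the relative ordering of participants, whereas random error does.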

Observed Score
- Observed score is due to someone's true ability and random error
- The goal is to reduce as much error as possible (no matter how slight or significant)

Random Error
- Can be many different things
- Difference in the same individual

Interpreting a Pearson's "r" (or an ICC) with respect to reliability and validity
- On a scale from +1 to -1
- Closer to +1 or -1, the stronger the relationship; closer to 0, the weaker the relationship
- Note: the higher the 'r' and 'ICC', the higher the reliability
- Correlation is considered the weakest analysis for reliability
- ICC is popular as it can assess test-retest-retest reliability with one statistic; r cannot

Test-retest Reliability
- Repeated on two or more occasions (could be 3 repeated tests: test-retest-retest)

Rater = who is doing the test
Intra-rater Reliability - Comparison of same measure by same tester on different occasions
Inter-rater Reliability - Comparison of same measure between two or more testers

Note: many exam questions are about error and how to reduce error.
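The observed-score idea and test-retest reliability above can be sketched together. In this simulation (hypothetical numbers, assuming the classical model where observed = true ability + random error), the same people are "tested" on two occasions and Pearson's r between occasions estimates test-retest reliability; shrinking the random error pushes r toward +1.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
true_ability = rng.normal(100, 10, size=n)  # hypothetical true abilities

# Observed score = true ability + random error.
# Each testing occasion draws fresh, independent random error.
test = true_ability + rng.normal(0, 3, size=n)
retest = true_ability + rng.normal(0, 3, size=n)

# Test-retest reliability estimated by Pearson's r between occasions.
r = np.corrcoef(test, retest)[0, 1]
print(round(r, 2))  # high, because random error (sd 3) is small vs. true spread (sd 10)
```

Raising the error standard deviation in the sketch (say, from 3 to 10) drags r toward 0, which is exactly why the notes stress reducing error "no matter how slight or significant."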

Reliability is the degree to which a measure is consistent and dependable. It is statistically determined by Pearson's correlation coefficient (r), the intra-class correlation coefficient (ICC), and other more complex statistics. The ICC describes how strongly units in the same group resemble each other.
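Unlike r, the ICC can summarize three or more occasions (test-retest-retest) in one statistic. A minimal sketch of one common consistency form, ICC(3,1), computed from a subjects-by-occasions table of hypothetical scores (the data and the function name are illustrative, not from the course):

```python
import numpy as np

def icc_consistency(X):
    """ICC(3,1): subjects in rows, raters/occasions in columns."""
    n, k = X.shape
    grand = X.mean()
    row_means = X.mean(axis=1)   # per-subject means
    col_means = X.mean(axis=0)   # per-occasion means
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((X - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)               # between-subjects mean square
    ms_err = ss_err / ((n - 1) * (k - 1))     # residual mean square
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical test-retest-retest data: 5 subjects, 3 occasions.
scores = np.array([
    [10.0, 10.5, 10.2],
    [12.0, 11.8, 12.1],
    [ 9.5,  9.7,  9.4],
    [14.0, 14.2, 13.9],
    [11.0, 11.1, 10.8],
])
icc = icc_consistency(scores)
print(round(icc, 2))  # near 1: subjects differ a lot, occasions agree closely
```

The high ICC here reflects exactly what the notes describe: units (scores) within the same group (subject) resemble each other far more than scores from different subjects do.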

Reliability - Definition
Reliability refers to the consistency or repeatability of a measure.
