Test-retest reliability of the BrainFx 360 performance assessment

dc.contributor.advisor Ragan, Brian en_US
dc.contributor.author Searles, Chelsea M. en_US
dc.contributor.committeemember Jubenville, Colby en_US
dc.contributor.department Health & Human Performance en_US
dc.date.accessioned 2015-08-25T14:39:37Z
dc.date.available 2015-08-25T14:39:37Z
dc.date.issued 2015-06-26 en_US
dc.description.abstract Concussions occur at an estimated rate of 1.6 to 3.8 million injuries per year. Concussion assessment tests have been designed to compare post-injury performance with preseason baseline performance. However, to date, there is no gold standard in concussion assessment testing, and many assessment tests lack acceptable reliability. The purpose of this study was to examine the test-retest reliability of the BrainFx 360 Performance Assessment. Fifteen healthy adults recruited from a large university in the South participated in this study. Participants took the initial assessment and returned seven to fourteen days later for the retest. The overall performance of BrainFx had good reliability, and the results of most categories and subsections displayed moderate to acceptable reliability. Some subtests showed extremely low reliability and should be altered or removed in order to increase the overall reliability of the assessment. Future studies should consider using a randomized test order to assess test-retest reliability. en_US
dc.description.degree M.S. en_US
dc.identifier.uri http://jewlscholar.mtsu.edu/handle/mtsu/4550
dc.publisher Middle Tennessee State University en_US
dc.subject Baseline testing en_US
dc.subject BrainFx en_US
dc.subject Concussion en_US
dc.subject Reliability en_US
dc.subject.umi Health sciences en_US
dc.thesis.degreegrantor Middle Tennessee State University en_US
dc.thesis.degreelevel Masters en_US
dc.title Test-retest reliability of the BrainFx 360 performance assessment en_US
dc.type Thesis en_US