Telepractice and WISC-V


 

Telepractice and the WISC–V UK

A telepractice session (also known as ‘telehealth’ or ‘telemedicine’) involves an examiner in one geographical location and an examinee in a different location. Using a high-speed internet connection and a secure software platform designed for web-based meetings (i.e. a teleconference platform), the examiner and examinee join a shared web-based meeting via computers with audio and video capability. A facilitator may be required at the examinee’s location. The examiner and examinee can see and hear one another throughout the session. Text, pictures, and video can be shared through the teleconference platform.

The Wechsler Intelligence Scale for Children – Fifth UK Edition (WISC–V UK; Wechsler, 2016) can be administered in a telepractice context using digital tools from Q-global®, Pearson’s secure online-testing platform. Specifically, Q-global digital assets (e.g. stimulus books) can be shown to an examinee in another location via the screen-sharing features of teleconference platforms. Details regarding Q-global and how it is used are provided on the Q-global homepage.

A spectrum of options is available for administering the WISC–V UK via telepractice. They vary based on the role and presence of the onsite facilitator:

  1. Trained professional facilitator: If the onsite facilitator is a well-trained professional, telepractice can involve the use of manipulatives (e.g. blocks), response booklets, and audiovisual equipment. This method supports all of the traditional published WISC–V UK composite scores. The supporting equivalency literature is outlined below.
  2. Parent/guardian facilitator: During the COVID-19 (Coronavirus) pandemic, however, the only facilitator available may be someone who is not in a professional role (e.g. a parent/guardian).

These guidelines detail how best to manage an assessment in the second scenario, in which a trained onsite facilitator is not available and the examiner must rely on a parent/guardian to act as facilitator.

Firstly, careful decisions must be made about whether an assessment is necessary and, if so, in what capacity it should be completed. The examiner should use their professional judgement about the capacity of the facilitator to perform the required functions described throughout these guidelines correctly and without interfering in the testing session.

If the onsite facilitator is a parent/guardian, follow the guidelines in the administration and scoring manual regarding the presence of a parent or guardian in the room to ensure adherence to standard administration procedures. As specified in the manual, it is very rare for the parent/guardian to stay in the room during testing. Further detail is presented below; however, it is not recommended that a parent/guardian present blocks or response booklets to examinees. The parent/guardian may only make audiovisual adjustments as required.

 

Nonmotor Full Scale Score alternative to FSIQ

During the COVID-19 (Coronavirus) pandemic, if an assessment is deemed necessary and the onsite facilitator is a parent/guardian, an alternative proposed approach is to administer the WISC–V UK so that a Nonmotor Full Scale Score (NMFSS) is derived (Raiford, 2017). This approach reduces the reliance on an onsite facilitator, as it removes the subtests that require blocks and response booklets. If blocks and response booklets are not used, an NMFSS can be derived using specially developed scores accessible in the Q-global Resource Library. These scores were developed to meet the unique needs of examiners during the COVID-19 (Coronavirus) pandemic and should not be used as a standard approach. They make use of Visual Puzzles in place of Block Design and do not include Coding. As such, the NMFSS can be derived by summing the scaled scores for Similarities, Matrix Reasoning, Digit Span, Vocabulary, Figure Weights and Visual Puzzles. During the COVID-19 (Coronavirus) pandemic, the NMFSS composite may be used in place of the Full Scale IQ composite. Other composites that can be obtained include the Verbal Comprehension Index, Fluid Reasoning Index, Working Memory Index, Quantitative Reasoning Index, and Auditory Working Memory Index.

If utilising the Nonmotor Full Scale Score, examiners must state in the assessment report that this is a nonstandard approach and that not all areas of cognitive function have been assessed. As such, appropriate care should be taken in the interpretation of the assessment results.
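The six-subtest sum described above can be expressed as a quick arithmetic check. The sketch below is illustrative only: the subtest names are taken from the guidance, the scaled scores in the example are hypothetical, and converting the sum of scaled scores to the actual NMFSS composite requires the norms tables in the Q-global Resource Library, which are not reproduced here.

```python
# Illustrative sketch only. The subtest list follows the NMFSS guidance above;
# the example scores are hypothetical, and the final sum-to-composite
# conversion requires the Q-global norms tables (not reproduced here).

NMFSS_SUBTESTS = [
    "Similarities",
    "Matrix Reasoning",
    "Digit Span",
    "Vocabulary",
    "Figure Weights",
    "Visual Puzzles",
]

def nmfss_sum_of_scaled_scores(scaled_scores: dict) -> int:
    """Sum the six scaled scores that feed the Nonmotor Full Scale Score."""
    missing = [s for s in NMFSS_SUBTESTS if s not in scaled_scores]
    if missing:
        raise ValueError(f"Missing scaled scores for: {missing}")
    return sum(scaled_scores[s] for s in NMFSS_SUBTESTS)

# Hypothetical example: all six subtests at the scaled-score mean of 10.
example = {name: 10 for name in NMFSS_SUBTESTS}
print(nmfss_sum_of_scaled_scores(example))  # 60
```

The explicit check for missing subtests reflects the point above: the NMFSS is only defined over exactly these six subtests, with no substitutions.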

 

Conducting Telepractice Assessment

Conducting a valid assessment in a telepractice (telehealth, telemedicine) delivery model requires an understanding of the interplay of a number of complex issues. In addition to the general information on our Telepractice homepage, examiners should address five factors (Eichstadt et al., 2013) when planning to administer and score assessments via telepractice:

 

1.  Audio/visual environment

Computers and connectivity

Two computers with audio and video capability and stable internet connectivity – one for the examiner and one for the examinee – are required. A stationary web camera, microphone, and speakers or headphones are required for both the examiner and the examinee. It is recommended that the examiner have a second computer screen on which to view the Administration and Scoring Manual, although the paper manual can also be used.

Teleconference platform

A secure teleconference platform with screensharing capability is required.

Video 

High-quality video (HD preferred) is required during the administration. Make sure the full faces of the examiner and the examinee are seen using each respective web camera. The teleconference platform should allow all relevant visual stimuli to be fully visible to the examinee when providing instruction or completing items; the video of the examiner should not impede the examinee’s view of visual stimuli.

Screensharing digital components

Digital components are shared within the teleconferencing software as specified in Table 1 (PDF | 158 KB). There are two ways to view digital components in the Q-global Resource Library: through the PDF viewer in the browser window or full screen in presentation mode. Always use full-screen (i.e. presentation) mode for digital components viewed by the examinee. This provides the cleanest presentation of test content without onscreen distractions (e.g. extra toolbars). Refer to Using Your Digital Assets on Q-global in the Q-global Resource Library for complete directions on how to enter presentation mode.

Image/screen size 

When items with visual stimuli are presented, the digital image of the visual stimuli on the examinee’s screen should be at least 9.7” (25 cm) measured diagonally, similar to an iPad (although an iPad is not required). Some teleconferencing platforms shrink the size of images, so the facilitator should verify the image size prior to the testing session. Typically, computer screens used for teleconference assessment are a minimum of 15” (38 cm) measured diagonally. Smaller screens, such as those of iPad minis and smartphones, are not allowed for examinee-facing content, as these have not been examined empirically and may affect stimulus presentation, examinee response, and the validity of the test results. Similarly, presenting stimuli on extremely large screens has not been examined, so the same precaution applies. Prior to testing, ask the onsite facilitator to aim a peripheral camera or device (as described in the next paragraph) at the examinee’s screen to ensure that it is displaying images in the correct aspect ratio and not stretching or obscuring the stimulus images.
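The minimum-size requirement above amounts to a simple geometric check that can be applied before the session. This is a minimal sketch, assuming the displayed image’s width and height can be measured in inches; the 9.7” threshold comes from the guidance, and the helper names are hypothetical.

```python
import math

# Minimum diagonal for examinee-facing stimuli, per the guidance above.
MIN_DIAGONAL_IN = 9.7

def diagonal_inches(width_in: float, height_in: float) -> float:
    """Diagonal of a rectangular image or screen (Pythagorean theorem)."""
    return math.hypot(width_in, height_in)

def stimulus_size_ok(width_in: float, height_in: float) -> bool:
    """True if the displayed stimulus meets the minimum diagonal size."""
    return diagonal_inches(width_in, height_in) >= MIN_DIAGONAL_IN

# A 4:3 image measuring 8" x 6" has a 10" diagonal, so it meets the minimum.
print(round(diagonal_inches(8, 6), 1))  # 10.0
print(stimulus_size_ok(8, 6))           # True
```

Note that this checks only the size of the displayed image, not the aspect ratio; the camera check described above is still needed to confirm the stimuli are not stretched.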

Peripheral camera or device 

A stand-alone peripheral camera that can be positioned to provide a view of the session from another angle, or a live view of the examinee’s progress, is helpful. Alternatively, the onsite facilitator may join the teleconference via a separate device (e.g. a smartphone with a camera or another peripheral device) and set it in a stable position to show the examinee.

The facilitator should silence the audio and mute the microphone on any peripheral devices to prevent feedback. Train the onsite facilitator to position the peripheral camera/device before subtests that elicit pointing or gestured responses (refer to Table 1 [PDF | 158 KB]) so you can view the examinee’s real-time responses. Instruct the facilitator not to capture video or take photos as this is a copyright violation.

Gesturing

When gesturing to the stimulus books is necessary, display them as digital assets onscreen and point using the mouse. Refer to Table 1 (PDF | 158 KB) for specific instructions by subtest. 

Audio considerations

High-quality audio capabilities are required during the administration. An over-the-head, two-ear, stereo headset with attached boom microphone is recommended for both the examiner and examinee.

Audio check

Test the audio for both the examiner and examinee prior to the administration to ensure a high-quality audio environment. This is especially critical for Digit Span, Letter–Number Sequencing, and Arithmetic. Testing the audio should include an informal conversation prior to the administration, during which the examiner listens for any clicks, pops, or breaks in the audio signal that distort or interrupt the voice of the examinee. The examiner should also ask the examinee and facilitator whether there are any interruptions or distortions in the audio signal on their end. Record any connectivity lapses, distractions, or intrusions that occur during testing.

Manage audiovisual distractions

As with any testing session, make sure the examinee’s environment is free from audio and visual distractions. If you are unfamiliar with the examinee’s planned physical location, meet virtually with the facilitator in advance of the testing session. Ask the facilitator to show the intended testing room and provide a list of issues to address to make the environment suitable for testing. For example, remove distracting items, silence all electronics, and close doors. Ask the examinee and facilitator to close all other applications on the computer, laptop, or other device, and to silence alerts and notifications on the peripheral device. Ensure radios, televisions, other mobile phones, fax machines, smart speakers, and other equipment that emits noise are silenced or removed from the room.

Lighting

Establish good overhead and facial lighting for the examiner and examinee. Close blinds or curtains to reduce sun glare on faces and the computer screens.

 

2. Examiner Factors

Practice

During the telepractice setup, and before administering the test to an actual examinee, practise the mechanics and workflow of every item in the entire test using the selected teleconference platform so that you are familiar with the administration procedures. For example, use a colleague as a ‘practice examinee’.

Standardised procedures 

Follow the administration procedures of face-to-face administration as much as possible. For example, if a spoken stimulus cannot be said more than once in face-to-face administration, do not say it more than once in a telepractice administration unless a technical difficulty precluded the examinee from hearing the stimulus.

Facilitator role and training

The onsite facilitator’s role in a telepractice session is largely to manage audiovisual needs. Train the facilitator to troubleshoot audiovisual needs that arise during the testing session, including camera angle, lighting, and audio checks. The facilitator’s role is not to manage rapport, engagement, or attention during the testing session and they are not to interfere with the examinee’s performance or responses.

As detailed above, if using an onsite facilitator who is not in a professional role (e.g. parent/guardian), the examiner should use their professional judgment about the capacity of the facilitator to perform the required functions correctly and without interfering in the testing session. The examiner should communicate expectations about the facilitator’s role in testing tasks immediately prior to the testing session when the examinee is not present to ensure that nothing is disclosed to the examinee about the tasks. Do not allow the facilitator to show or warn the examinee about any portion of the test. Once the audiovisual components have been set up and tested, it is expected the parent/guardian should leave the room. 

Any other roles and responsibilities for which an examiner needs support, such as behaviour management, should be outlined and trained prior to the beginning of the testing session. The examiner is responsible for documenting all behaviours of the facilitator during test administration and taking these into consideration when reporting scores and performance.

 

3.  Examinee Factors

Appropriateness 

Ensure that a telepractice administration is appropriate for the examinee and for the purpose of the assessment. Use clinical judgement, best practice guidance for telepractice (e.g. British Psychological Society Effective Therapy via Video), information from professional organisations, existing research, and any available government regulations in the decision-making process.

Preparedness

Before initiating test administration, ensure that the examinee is well-rested, able, prepared, and ready to appropriately and fully participate in the testing session.

Facilitator role

Explain the role of the facilitator to the examinee so participation and actions are understood.

Headset

It may not be appropriate or feasible for some examinees to use a headset due to behaviour, positioning, physical needs, or tactile sensitivities. Use clinical judgement on the appropriate use of a headset in these situations. If a headset is not utilised, ensure your microphone and the examinee’s speakers are turned up to a comfortable volume. 

Mouse

On some teleconference platforms, you can pass control of the mouse to allow the examinee to point to indicate responses; this is acceptable if it is within the capabilities of the examinee. However, best practice guidelines provide cautions about this (IOPC, 2020).

 

4. Test/test materials

Copyright

Obtain permission for access to copyrighted materials (e.g. stimulus books) as appropriate. Pearson has provided a letter of No Objection (PDF | 77.5 KB) to permit use of copyrighted materials for telepractice via non-public-facing teleconferencing software and tools to assist in remote administration of assessment content during the COVID-19 pandemic. This permission does not extend to photocopying, scanning, or duplication of test protocols, including via any screen capture or session recording technology. Examiners should use a Record Form for all recording, as per standard practice.

Digital assets

Practice using the digital assets until the use of the materials is as smooth as a face-to-face administration. 

Considerations 

Review Table 1 (PDF | 158 KB) for the specific telepractice considerations for each subtest to be administered.

Input and output requirements and equivalence evidence 

Consider the input and output requirements for each task, and the evidence available for telepractice equivalence for the specific task type.

 

Telepractice versus face-to-face administration

A study of the equivalence of WISC–V UK telepractice compared with face-to-face administration and scoring modes in examinees with specific learning disabilities demonstrated that the primary index scores and the Full Scale IQ corresponded to an extremely high degree (Hodge et al., 2019). A study of the equivalence of Wechsler Abbreviated Scale of Intelligence (WASI; Pearson, 1999) telepractice compared with face-to-face modes in examinees with intellectual disability produced similar results (Temple et al., 2010). Several tasks drawn from the Wechsler scales have also produced evidence of equivalence in telepractice and face-to-face modes for examinees with a wide variety of clinical conditions (Cullum et al., 2006, 2014; Galusha-Glasscock et al., 2016; Grosch, Gottlieb, & Cullum, 2011; Grosch, Weiner, Hynan, Shore, & Cullum, 2015; Hildebrand, Chow, Williams, Nelson, & Wass, 2004; Ragbeer et al., 2016; Stain et al., 2011; Temple et al., 2010; Wadsworth, Dhima, et al., 2018; Wadsworth, Galusha-Glasscock, et al., 2016). Other studies support equivalence of tasks that are highly similar to those of the WISC–V UK subtests with nonclinical examinees using telepractice compared with face-to-face administration and scoring (Galusha-Glasscock et al., 2016; Sutherland et al., 2017; Wright, 2016, 2018). In addition, a meta-analysis of telepractice studies provides rigorous support for telepractice and face-to-face mode equivalence (Brearly et al., 2017).

 

Digital versus traditional format

Telepractice involves the use of technology in assessment as well as viewing onscreen stimuli. For these reasons, studies that investigate assessment in digital versus traditional formats are also relevant. 

A number of investigations of the Wechsler Intelligence Scale for Children – Fourth Edition (WISC–IV; Wechsler, 2003) and the WISC–V have produced evidence of equivalence when administered and scored via digital or traditional formats to examinees without clinical conditions (Daniel, 2012; Daniel et al., 2014; Raiford, Zhang, et al., 2016). In addition, equivalence has been demonstrated for examinees with clinical conditions, such as intellectual giftedness or intellectual disability (Raiford et al., 2014; Raiford, Zhang, et al., 2016), attention-deficit/hyperactivity disorder or autism spectrum disorder (Raiford et al., 2015; Raiford, Zhang, et al., 2016), or specific learning disorders in reading or mathematics (Raiford, Drozdick, et al., 2016; Raiford, Zhang, et al., 2016).

 

Evidence by subtest

Table 2 (PDF | 127 KB) lists each WISC–V subtest, the input and output requirements, the direct evidence of subtest equivalence from telepractice–face-to-face and digital–traditional investigations, and the evidence for similar tasks. The numbers in the evidence columns correspond to the studies in the reference list, which is organised alphabetically in telepractice and digital sections. For clarity, each study is denoted either T or D, with T indicating a telepractice–face-to-face mode study and D indicating a digital–traditional format study.

 

5. Other/miscellaneous 

State in your report that the test was administered via telepractice, and briefly describe the method of telepractice used. For example, ‘The WISC–V UK was administered via remote telepractice using digital stimulus materials on the Pearson Q-global system during the live video connection using the [name of telepractice system]. A parent facilitator assisted in the setup and testing of the audiovisual equipment prior to the assessment taking place.’

Make a clinical judgement, as in a face-to-face session, about whether you were able to gather the examinee’s best performance. Report your clinical decision(s) in your report and comment on the factors that led to the decision to report (or not report) the scores. For example, ‘The remote testing environment appeared free of distractions, adequate rapport was established with the examinee via video/audio, and the examinee appeared appropriately engaged in the task throughout the session. No significant technological problems were noted during administration. Modifications to the standardisation procedure included: [list]. The WISC–V UK subtests, or similar tasks, have received initial validation in several samples for remote telepractice and digital format administration, and the results are considered a valid description of the examinee’s skills and abilities.’

 

Conclusion

The WISC–V UK was not standardised in a telepractice mode, and this should be taken into consideration when administering the test via telepractice and interpreting results. Provided that you have thoroughly considered and addressed all five factors and the specific considerations listed above, you are prepared to observe and comment on the reliable and valid delivery of the WISC–V UK via telepractice. You may use the WISC–V UK materials via telepractice without additional permission from Pearson in the following published contexts:

  • WISC–V UK manuals, digital stimulus books, and associated administration materials via Q-global®
  • WISC–V UK via Q-interactive (requires advanced technology skills and mirroring software).

Any other use of the WISC–V UK via telepractice requires prior permission from Pearson and is not currently recommended. This includes, but is not limited to, scanning the paper stimulus books, digitising the paper record forms, holding the materials up physically in the camera’s viewing area, or uploading a manual to a shared drive or site.

 

References

Eichstadt, T. J., Castilleja, N., Jakubowitz, M., & Wallace, A. (2013, November). Standardized assessment via telepractice: Qualitative review and survey data [Paper presentation]. Annual meeting of the American Speech-Language-Hearing Association, Chicago, IL, United States.

Grosch, M. C., Gottlieb, M. C., & Cullum, C. M. (2011). Initial practice recommendations for teleneuropsychology. The Clinical Neuropsychologist, 25, 1119–1133.

Interorganizational Practice Committee [IOPC]. (2020). Recommendations/guidance for teleneuropsychology (TeleNP) in response to the COVID-19 pandemic. Retrieved March 30, 2020, from https://static1.squarespace.com/static/50a3e393e4b07025e1a40d0/t/5e8260be9a64587cfd3a9832/1585602750557/Recommendations-Guidance+for+Teleneuropsychology-COVID-19-4.pdf

Raiford, S. E. (2017). Essentials of WISC–V Integrated assessment (A. S. Kaufman & N. L. Kaufman, Eds.). John Wiley & Sons.

Wechsler, D. (2003). Wechsler Intelligence Scale for Children (4th ed.). Bloomington, MN: Pearson.

Wechsler, D. (2016). Wechsler Intelligence Scale for Children – Fifth UK Edition. London: Pearson Clinical Assessment.

 

Telepractice–Face-to-Face Mode:

  1. Brearly, T., Shura, R., Martindale, S., Lazowski, R., Luxton, D., Shenal, B., & Rowland, J. (2017). Neuropsychological test administration by videoconference: A systematic review and meta-analysis. Neuropsychology Review, 27(2), 174–186.
  2. Cullum, C., Weiner, M., Gehrmann, H., & Hynan, L. (2006). Feasibility of telecognitive assessment in dementia. Assessment, 13(4), 385–390.
  3. Cullum, C. M., Hynan, L. S., Grosch, M., Parikh, M., & Weiner, M. F. (2014). Teleneuropsychology: Evidence for video teleconference-based neuropsychological assessment. Journal of the International Neuropsychological Society, 20, 1028–1033.
  4. Galusha-Glasscock, J., Horton, D., Weiner, M., & Cullum, C. (2016). Video teleconference administration of the Repeatable Battery for the Assessment of Neuropsychological Status. Archives of Clinical Neuropsychology, 31(1), 8–11.
  5. Grosch, M., Weiner, M., Hynan, L., Shore, J., & Cullum, C. (2015). Video teleconference-based neurocognitive screening in geropsychiatry. Psychiatry Research, 225(3), 734–735.
  6. Hildebrand, R., Chow, H., Williams, C., Nelson, M., & Wass, P. (2004). Feasibility of neuropsychological testing of older adults via videoconference: Implications for assessing the capacity for independent living. Journal of Telemedicine and Telecare, 10(3), 130–134. https://doi.org/10.1258/135763304323070751
  7. Hodge, M., Sutherland, R., Jeng, K., Bale, G., Batta, P., Cambridge, A., Detheridge, J., Drevensek, S., Edwards, L., Everett, M., Ganesalingam, K., Geier, P., Kass, C., Mathieson, S., McCabe, M., Micallef, K., Molomby, K., Ong, N., Pfeiffer, S., … Silove, N. (2019). Agreement between telehealth and face-to-face assessment of intellectual ability in children with specific learning disorder. Journal of Telemedicine and Telecare, 25(7), 431–437. https://doi.org/10.1177/1357633X18776095
  8. Ragbeer, S. N., Augustine, E. F., Mink, J. W., Thatcher, A. R., Vierhile, A. E., & Adams, H. R. (2016). Remote assessment of cognitive function in juvenile neuronal ceroid lipofuscinosis (Batten disease): A pilot study of feasibility and reliability. Journal of Child Neurology, 31, 481–487. https://doi.org/10.1177/0883073815600863
  9. Stain, H. J., Payne, K., Thienel, R., Michie, P., Vaughan, C., & Kelly, B. (2011). The feasibility of videoconferencing for neuropsychological assessments of rural youth experiencing early psychosis. Journal of Telemedicine and Telecare, 17, 328–331. https://doi.org/10.1258/jtt.2011.101015
  10. Sutherland, R., Trembath, D., Hodge, A., Drevensek, S., Lee, S., Silove, N., & Roberts, J. (2017). Telehealth language assessments using consumer grade equipment in rural and urban settings: Feasible, reliable and well tolerated. Journal of Telemedicine and Telecare, 23(1), 106–115. https://doi.org/10.1177/1357633X15623921
  11. Temple, V., Drummond, C., Valiquette, S., & Jozsvai, E. (2010). A comparison of intellectual assessments over video conferencing and in-person for individuals with ID: Preliminary data. Journal of Intellectual Disability Research, 54(6), 573–577. https://doi.org/10.1111/j.1365-2788.2010.01282.x
  12. Wadsworth, H., Galusha-Glasscock, J., Womack, K., Quiceno, M., Weiner, M., Hynan, L., Shore, J., & Cullum, C. (2016). Remote neuropsychological assessment in rural American Indians with and without cognitive impairment. Archives of Clinical Neuropsychology, 31(5), 420–425. https://doi.org/10.1093/arclin/acw030
  13. Wadsworth, H. E., Dhima, K., Womack, K. B., Hart, J., Weiner, M. F., Hynan, L. S., & Cullum, C. M. (2018). Validity of teleneuropsychological assessment in older patients with cognitive disorders. Archives of Clinical Neuropsychology, 33(8), 1040–1045. https://doi.org/10.1093/arclin/acx140
  14. Wright, A. J. (2016). Equivalence of remote, online administration and traditional, face-to-face administration of the Woodcock-Johnson IV cognitive and achievement tests. Retrieved March 16, 2020, from https://www.presencelearning.com/app/uploads/2016/09/WJ-IV_Online_Remote_whitepaper_FINAL.pdf
  15. Wright, A. J. (2018). Equivalence of remote, online administration and traditional, face-to-face administration of the Reynolds Intellectual Assessment Scales-Second Edition. Retrieved March 16, 2020, from https://pages.presencelearning.com/rs/845-NEW-442/images/Content-PresenceLearning-Equivalence-of-Remote-Online-Administration-of-RIAS-2-White-Paper.pdf

Digital–Traditional Format

  1. Daniel, M. H. (2012). Equivalence of Q-interactive administered cognitive tasks: WISC–IV (Q-interactive Technical Report 2). Pearson. https://www.pearsonassessments.com/content/dam/school/global/clinical/us/assets/q-interactive/009-s-Technical%20Report%202_WISC-IV_Final.pdf
  2. Daniel, M. H., Wahlstrom, D., & Zhang, O. (2014). Equivalence of Q-interactive and paper administrations of cognitive tasks: WISC®–V (Q-interactive Technical Report 8). Pearson. https://www.pearsonassessments.com/content/dam/school/global/clinical/us/assets/q-interactive/003-s-Technical-Report_WISC-V_092514.pdf
  3. Raiford, S. E., Holdnack, J. A., Drozdick, L. W., & Zhang, O. (2014). Q-interactive special group studies: The WISC–V and children with intellectual giftedness and intellectual disability (Q-interactive Technical Report 9). Pearson. Retrieved from http://www.helloq.com/content/dam/ped/ani/us/helloq/media/Technical_Report_9_WISC-V_Children_with_Intellectual_Giftedness_and_Intellectual_Disability.pdf
  4. Raiford, S. E., Drozdick, L. W., & Zhang, O. (2015). Q-interactive special group studies: The WISC–V and children with autism spectrum disorder and accompanying language impairment or attention-deficit/hyperactivity disorder (Q-interactive Technical Report 11). Pearson. http://images.pearsonclinical.com/images/assets/WISC-V/Q-i-TR11_WISC-V_ADHDAUTL_FNL.pdf
  5. Raiford, S. E., Drozdick, L. W., & Zhang, O. (2016). Q-interactive special group studies: The WISC–V and children with specific learning disorders in reading or mathematics (Q-interactive Technical Report 13). Pearson. https://www.pearsonassessments.com/content/dam/school/global/clinical/us/assets/q-interactive/012-s-Technical_Report_9_WISC-V_Children_with_Intellectual_Giftedness_and_Intellectual_Disability.pdf
  6. Raiford, S. E., Zhang, O., Drozdick, L. W., Getz, K., Wahlstrom, D., Gabel, A., Holdnack, J. A., & Daniel, M. (2015). Coding and Symbol Search in digital format: Reliability, validity, special group studies, and interpretation (Q-interactive Technical Report 12). Pearson. https://www.pearsonassessments.com/content/dam/school/global/clinical/us/assets/q-interactive/002-Qi-Processing-Speed-Tech-Report_FNL2.pdf

 

 