Repeatable Battery for the Assessment of Neuropsychological Status Update (RBANS™ Update) - Telepractice and RBANS
The telepractice information in this document is intended to support clinicians in making informed, well-reasoned decisions around remote assessment. This information is not intended to be comprehensive regarding all considerations for assessment via telepractice. It should not be interpreted as a requirement or recommendation to conduct assessment via telepractice.
Clinicians should remain mindful to:
follow telepractice regulations and legal requirements from local authorities, licensing boards, and professional liability insurance providers
develop competence with assessment via telepractice through activities such as practising, studying, consulting with other professionals, and engaging in professional development.
Clinicians should use their clinical judgment to determine if assessment via telepractice is appropriate for a particular examinee, referral question, and situation. There are circumstances where assessment via telepractice is not feasible and/or is contraindicated. Documentation of all considerations, procedures, and conclusions remains a professional responsibility.
Several professional organisations and experts have provided guidance on telepractice assessment (e.g. the British Psychological Society) to assist clinicians with decision making and with ethical and legal practice issues.
The Repeatable Battery for the Assessment of Neuropsychological Status (RBANS; Randolph, 2012) can be administered in a telepractice context by using digital tools from Q-global®, Pearson’s secure online testing and scoring platform. Specifically, Q-global digital assets (e.g. stimulus books) can be shown to the examinee in another location via the screen-sharing features of teleconference platforms. Details regarding Q-global and how it is used are provided on the Q-global page.
A spectrum of options is available for administering the RBANS via telepractice; however, it is important to consider that the normative data were collected via face-to-face assessment. Telepractice is a deviation from the standardised administration, and methods and approaches for administering the test via telepractice should be supported by research and practice guidelines where available.
Providers engaging in telepractice assessment may train facilitators to work with them on a regular basis to provide greater coverage to underserved populations (e.g. only two providers within a 75-mile radius, or a shortage of school practitioners within a city/county council). If such a facilitator is well trained and in a professional role (i.e. a professional facilitator), they can present the response pages as well as adjust audiovisual equipment. This approach yields the same RBANS scores that are available in face-to-face assessment mode. Administering without a professional facilitator increases measurement error, affects the workflow of the session, and may affect the validity of the derived scores.
In times when social distancing is necessary (such as during the COVID-19 pandemic), using a professional facilitator may not be safe or feasible. If testing must occur under these conditions, the examinee may be able to participate without the help of an onsite facilitator. If the examiner determines that no facilitator is required, the examinee can assist with technological and administrative tasks during testing and should be oriented to these responsibilities prior to the testing session, and again at the beginning of the session. An initial virtual meeting should occur in advance of the testing session to address numerous issues specific to testing via telepractice. This initial virtual meeting is described in the administrative and technological tasks portion of the Examiner Considerations section and is referred to throughout this document. The examiner should consider best practice guidelines, the referral question, and the patient’s condition, as well as the conditions of telepractice equivalence studies, to determine whether independent participation is possible and appropriate. Independent examinee participation may not be possible or appropriate, for example, for examinees with low cognitive ability or with low levels of technological literacy and experience.
If the examiner determines that the examinee cannot participate independently, and testing must occur under social distancing constraints, the only facilitator available may be someone in the examinee’s home (e.g. a parent, guardian, or caretaker). If the onsite facilitator is not in a professional role (i.e. non-professional facilitator), they can assist with technological and administrative tasks during testing and should be oriented to these responsibilities in the initial virtual meeting and again at the beginning of the session.
Professional and non-professional facilitators typically do not remain in the room with the examinee throughout the testing session. The examiner should plan to minimise, as much as possible, the need for the facilitator to remain in the room. In rare cases when the facilitator must remain in the room, they should do so passively and unobtrusively, merely to monitor and address the examinee’s practical needs and any technological or administrative issues as necessary. The facilitator’s role should be defined clearly by the examiner, and the facilitator should only perform those functions the examiner approves and deems necessary. In any case, if a facilitator is necessary, they should remain accessible.
There are three response pages required for RBANS: one each for the Figure Copy, Figure Recall, and Coding subtests. If using these response pages is not feasible, the composite scores for Visuospatial/Construction, Attention, Delayed Memory, and the Total Scale will not be available.
Conducting Telepractice Assessment
Conducting a valid assessment in a telepractice service delivery model requires an understanding of the interplay of a number of complex issues. In addition to the general information on Pearson’s telepractice page, examiners should address five factors (adapted from Eichstadt et al., 2013) when planning to administer and score assessments via telepractice:
1. Telepractice Environment and Equipment
Computers and connectivity
Two computers with audio and video capability and stable Internet connectivity – one for the examiner and one for the examinee – are required. A web camera, microphone, and speakers or headphones are required for both the examiner and the examinee. A second computer screen or split-screen format on a large computer monitor for the examiner is helpful to allow a view of the digital administration and scoring manual, but the examiner can also use the paper format manual or the Q-interactive® platform. The second computer monitor or large screen also tends to make sharing test content more straightforward for the examiner.
When items with visual stimuli are presented, the digital display on which they are shown to the examinee should be at least 24.6 cm measured diagonally, similar to an iPad or iPad Air. Smaller screens, such as those of iPad minis, small tablet PCs, and smartphones, are not allowed for examinee-facing content, as these have not been examined empirically and may affect stimulus presentation, examinee response, and the validity of the test results. It is recommended that computer screens used for teleconference assessment be at least 38.1 cm measured diagonally. Some teleconferencing platforms shrink the size of images, so the facilitator should verify the image size in the initial virtual meeting. Images should be displayed at the same size as in the original paper stimulus book, even when screen size varies. Similarly, presenting stimuli on extremely large screens has not been examined, so the same precaution applies. At the beginning of the testing session, the examiner may ask for a peripheral camera or device (as described later in this section) to be aimed at the examinee’s screen to ensure that it is displaying images in the correct aspect ratio and not stretching or obscuring the stimuli.
A teleconference platform is required. Screensharing capability is required if anything other than items with verbal stimuli and responses are administered.
High-quality video (HD preferred) is required during the administration. Make sure the full faces of the examiner and the examinee are seen using each respective web camera. The teleconference platform should allow all relevant visual stimuli to be fully visible to the examinee when providing instruction or completing items; the view of the examiner should not impede the examinee’s view of visual test stimuli.
Screensharing digital components
Digital components are shared within the teleconferencing software as specified in Table 1. There are two ways to view digital components in the Q-global Resource Library: through the PDF viewer in the browser window or full screen in presentation mode. Always use full screen (i.e. presentation) mode for digital components viewed by the examinee. This provides the cleanest presentation of test content without onscreen distractions (e.g. extra toolbars). Refer to Using Your Digital Assets on Q-global in the Q-global Resource Library for complete directions on how to enter presentation mode.
Test item security in the audiovisual environment
The examiner is responsible for ensuring test item security is maintained, as outlined in the Terms and Conditions for test use. The examiner should address test security requirements with the examinee (and facilitator, if applicable) during the informed consent process. The examiner should make it clear that the video should not be captured, photos should not be taken, and stimuli should not be copied or recorded, as this is a copyright violation. The examinee must agree that they will not record (audio or visual) or take photos or screenshots of any portion of the test materials or testing session, and not permit anyone to observe the testing session or be in the testing room (except for a facilitator, when necessary). Any response pages used in the testing session must be returned to the examiner (see Assessment Procedures and Materials for more information).
Peripheral camera or device
A stand-alone peripheral camera that can be positioned to provide a view of the session from another angle, or a live view of the examinee’s progress, is helpful. Alternatively, a separate device (e.g. a smartphone with a camera or another peripheral device) can be connected to the teleconference and set in a stable position to show the examinee’s pointing or written responses. The device’s audio should be silenced and its microphone muted to prevent feedback. The examiner should guide positioning of the peripheral camera/device before administering written response tasks (i.e. Figure Copy, Figure Recall, and Coding) so that the examiner can see that the examinee’s real-time responses are captured.
In a typical telepractice session, it is more feasible to make a document or moveable camera available in the examinee’s location. However, while social distancing is necessary, the only camera available may be a stationary camera integrated into the examinee’s laptop or computer screen. It is unrealistic to expect examinees to have document cameras within their homes. It may be necessary for examiners to think creatively about how to use a smartphone in the examinee’s location to gain a view of the examinee’s progress in a written response or when pointing at a screen. Prior to attempting this with an examinee, the examiner should work to become fluent and competent at directing examinees in these methods, which can require extensive practice with varied individuals and types of smartphones. In addition, this requires planning and practice in the initial virtual meeting to prevent technical difficulties and to ensure the examinee feels confident doing this when it is time.
Online instructional videos demonstrate how a smartphone may be used with common household objects (e.g. a tower or stack of books, a paperweight, a ruler, and a rubber band or tape) to create an improvised document camera for use during tasks involving response pages. Similarly, for Line Orientation, some examinees may point to responses rather than say the number corresponding to their response. In this situation, other everyday household objects (e.g. books) can be used to form an improvised stand upon which to position the device to provide a second-angle view of the examinee pointing at the screen. Devices typically provide the best view of the examinee’s screen and pointing responses when positioned in landscape format. While using a smartphone as the peripheral camera is not an optimal solution for telepractice, it can be functional if executed well.
For Line Orientation and Coding, gesturing to the stimulus image or response page is necessary. For Line Orientation, the examiner should display the digital assets onscreen and point using the mouse cursor. For Coding, it may be necessary to gesture to areas of a paper copy of a response page to demonstrate how to respond using the examiner’s camera. Refer to Table 1 for specific instructions by subtest.
Capturing written performance (if used)
The examiner may ask for the completed response to be shown on camera immediately at the conclusion of a task, so that the examiner can score it right away and so responses are not lost or modified. One successful approach to protecting test security uses sealed envelopes (i.e. the sealed envelope method) and is described as follows. The examiner places the response pages and a self-addressed stamped envelope into an outer envelope, signs across its seal, and posts or delivers it to the testing location. The examiner emphasises that the sealed envelope containing the response pages must not be opened until the examiner asks. For Figure Copy and Figure Recall, printable response pages are available in the Q-global Resource Library. These pages are required and should be printed single-sided to prevent exposure to the Figure Copy response during Figure Recall. The facilitator or examinee does not open the sealed envelope until the examiner asks. After completing the Figure Copy drawing, the examinee places the Figure Copy response page into the provided self-addressed stamped envelope to prevent continued viewing of the response, as this could affect recall of the image during the delayed condition. At the conclusion of Figure Recall, the examinee or facilitator should place the Coding and Figure Recall pages in the envelope, seal it and sign across the seal on camera, and then post or deliver it to the examiner immediately following the testing session. The Coding response page can be torn from the paper record form (Q-interactive or standard version) and sent to the examinee along with the Figure Copy and Figure Recall pages.
Audio considerations
High-quality audio capabilities are required during the administration. An over-the-head, two-ear, stereo headset with an attached boom microphone is recommended for both the examiner and the examinee. Headphones with a microphone may be used if a headset is not available.
The examiner should test the audio for both the examiner and the examinee in the initial virtual meeting and at the beginning of the testing session to ensure a high-quality audio environment. This is especially critical for List Learning, Story Memory, Picture Naming, Semantic Fluency, Digit Span, List Recall, and Story Recall. Testing the audio should include an informal conversation prior to the administration, during which the examiner listens for any clicks, pops, or breaks in the audio signal that distort or interrupt the voice of the examinee. The examiner should also ask whether there are any interruptions or distortions in the audio signal on the examinee’s end. Any connectivity lapses, distractions, or intrusions that occur during testing should be reported.
Manage audiovisual distractions
As with any testing session, the examiner should do everything possible to make sure the examinee’s environment is free from audio and visual distractions. If the examiner is unfamiliar with the examinee’s planned physical location, a visual tour of the intended testing room should be given during the initial virtual meeting. The examiner can then provide a list of issues to address to transform the environment into one suitable for testing. For example, remove distracting items, silence all electronics, and close doors. The examiner should confirm that these issues have been addressed at the time of testing. If possible, the examinee should be positioned facing away from the door to ensure the examiner can verify through the examinee’s camera that the door remains shut and can monitor any interruptions. The examiner should confirm that all other applications on the computer, laptop, or peripheral device are closed, the keyboard is moved aside or covered after the session is connected, and alerts and notifications are silenced on the peripheral device. Radios, televisions, other mobile phones, fax machines, smart speakers, printers, and equipment that emit noise must be silenced and/or removed from the room.
Good overhead and facial lighting should be established for the examiner and examinee. Blinds or curtains should be closed to reduce sun glare on faces and the computer screens.
The examiner should record any and all atypical events that occur during the testing session. This may include delayed audio or video, disruptions to connectivity, the examinee being distracted by external stimuli, and any other anomalies. These can be noted on the record form or in the Q-interactive notes and should be considered during interpretation and described in the written report.
2. Assessment Procedures and Materials
Permission must be obtained for access to copyrighted materials (e.g. stimulus books, response booklets) as appropriate. Pearson has provided a letter of No Objection to permit use of copyrighted materials for telepractice via teleconferencing software and tools to assist in remote administration of assessment content during the COVID-19 pandemic.
Response pages (if used)
The response pages should be provided in advance of the testing session, and the plan for securing and forwarding/returning materials, real-time and after testing, should be communicated. See the capturing written performance portion of the Telepractice Environment and Equipment section for suggested procedures to facilitate immediate scoring and secure handling of response pages.
For Figure Copy and Figure Recall, print the Remote Figure Copy and Recall Response Pages single-sided (i.e. do not print two-sided) from the Q-global Resource Library. The same response pages can be used across all forms of the RBANS.
For Coding, remove the Coding response page from the standard paper record form or the Q-interactive response form to send to the examinee. Ensure that the correct Coding response page is sent for the form of RBANS you are administering, as Coding differs across forms.
The examiner should practise using the digital assets until the use of the materials is as smooth as a face-to-face administration. It is not recommended that the examiner display items from paper stimulus books on a camera.
Review Table 1 for the specific telepractice considerations for each subtest to be administered.
Input and output requirements and equivalence evidence
The examiner should consider the input and output requirements for each task, and the evidence available for telepractice equivalence for the specific task type.
Telepractice Versus Face-to-Face Administration
Preliminary research has compared results obtained in telepractice and face-to-face administration modes. A study of RBANS equivalence comparing video-teleconference with face-to-face administration in 18 examinees with and without impairment produced high correlations between scores across the two administration modes, and no statistically significant differences between RBANS scores were observed (Galusha-Glasscock et al., 2016). Several tasks similar to those found in the RBANS have produced evidence of equivalence in telepractice and face-to-face modes for examinees with a variety of clinical conditions (Cullum et al., 2006, 2014; Galusha-Glasscock et al., 2016; Grosch et al., 2011; Hildebrand et al., 2004; Ragbeer et al., 2016; Stain et al., 2011; Sutherland et al., 2017; Temple et al., 2010; Wadsworth et al., 2018; Wadsworth et al., 2016). Other studies support equivalence of tasks that are similar to some of the RBANS subtests with non-clinical examinees using telepractice compared with face-to-face administration and scoring (Galusha-Glasscock et al., 2016; Wright, 2018a, 2018b). In addition, a meta-analysis of telepractice studies provides support for telepractice and face-to-face mode equivalence across a variety of neuropsychological tests (Brearly et al., 2017).
While equivalence data on similar measures are relevant, practitioners should be mindful that more research is needed to establish equivalence across all ages and all RBANS tasks. Additional caveats and cautions are described in Grosch et al. (2011). Most telepractice studies have also been conducted with volunteer subjects in controlled environments. When social distancing is necessary (such as during the COVID-19 pandemic), some examinations may need to occur in patients’ homes, and very little research has examined remote assessment in private homes. It is important to consider the conditions under which equivalence studies of telepractice and face-to-face assessment modes were conducted and to reproduce these as closely as possible when testing via telepractice. In typical studies supporting telepractice and face-to-face equivalence, the examiner became very familiar with the teleconference platform by using it for its intended purpose for several hours and administered the tests (even those familiar in face-to-face mode) multiple times to ‘practice examinees’. Some studies that established telepractice and face-to-face mode equivalence involved a professional facilitator; however, preliminary research described by Lana Harder (Stolwyk et al., 2020), with parents serving as in-home facilitators who managed audiovisual needs and response booklets, found no significant differences across modes. Finally, in these studies the examinee was typically in an office- or school-based setting. Therefore, if in-home assessment is taking place, it is advisable to prepare a similar environment as much as possible, as described in the Telepractice Environment and Equipment section.
Digital Versus Traditional Format
Telepractice involves the use of technology in assessment as well as viewing onscreen stimuli. For these reasons, studies that investigate assessment in digital versus traditional formats are also relevant.
The RBANS utilises tasks similar to those investigated within other measures, such as the WAIS-IV and WMS-IV. A number of investigations of the Wechsler scales have produced evidence of equivalence when administered and scored via digital or traditional formats to examinees without clinical conditions (Daniel, 2012a, 2012b; Daniel et al., 2014; Raiford, Zhang, et al., 2016). In addition, equivalence has been demonstrated for examinees with clinical conditions, such as intellectual giftedness or intellectual disability (Raiford et al., 2014; Raiford, Zhang, et al., 2016), attention-deficit/hyperactivity disorder or autism spectrum disorder (Raiford, Drozdick, et al., 2016; Raiford, Zhang, et al., 2016), or specific learning disorders in reading or mathematics (Raiford, Drozdick, et al., 2016; Raiford, Zhang, et al., 2016). However, it is important to note that these studies were not conducted remotely or via video conference.
Evidence by Subtest
Table 2 lists each RBANS subtest, the input and output requirements, the direct evidence of subtest equivalence in telepractice–face-to-face and digital–traditional investigations, and the evidence for similar tasks. The abbreviations in the Input and Output columns correspond to the various input and output requirements of each subtest, and a key appears at the bottom of the table. For example, brief spoken directions as an input requirement is abbreviated as BSD. The numbers in the evidence columns correspond to the studies in the reference list, which is organised alphabetically in telepractice and digital sections. For clarity, each study is denoted by either T or D, with T indicating the study investigated telepractice–face-to-face mode, and D indicating the study addressed digital–traditional format.
3. Examinee Considerations
The examiner should first ensure that a telepractice administration is appropriate for the examinee and for the purpose of the assessment. Clinical judgment, best practice guidance for telepractice (e.g. APA Services, 2020; ASPPB, 2013; IOPC, 2020), information from professional organisations and other professional entities (e.g. licensing boards, legal resources, professional liability insurance providers), consultation with other knowledgeable clinicians, existing research, and any available government regulations should be considered in the decision-making process. Consideration should be given to whether the necessary administrative and technological tasks involved in a telepractice session can be accomplished without influencing results.
Before initiating test administration, the examiner should ensure that the examinee is well-rested, able, prepared, and ready to appropriately and fully participate in the testing session.
If using a facilitator, the role of the facilitator must be explained to the examinee so that the facilitator’s participation and actions are understood.
It may not be appropriate or feasible for some examinees to use a headset due to behaviour, positioning, physical needs, or tactile sensitivities, or if a headset is not available. Clinical judgment on the appropriate use of a headset in these situations should be used. If a headset is not utilised, the examiner’s and examinee’s microphones and speakers should be turned up to a comfortable volume.
On some teleconference platforms, the examiner can pass control of the mouse to allow the examinee to point to indicate responses; this is an option if it is within the capabilities of the examinee. However, best practice guidelines provide cautions about this. For example, the IOPC guidelines suggest examiners be alert throughout administration, return control of the screen once the task is finished, and never leave the computer unattended while the examinee has control over the examiner’s computer (IOPC, 2020).
4. Examiner Considerations
During the telepractice setup, and before administering to any actual examinee, the examiner should rehearse the mechanics and workflow of every item in the entire test using the selected teleconference platform so that the examiner is familiar with the administration procedures. For example, a colleague could be used as a practice examinee.
The examiner must follow the administration procedures of face-to-face administration as much as possible. For example, if a spoken stimulus cannot be said more than once in face-to-face administration, the examiner must not say it more than once in a telepractice administration unless a technical difficulty precluded the examinee from hearing the stimulus.
Administrative and technological tasks
In order to conduct a smooth telepractice session, audiovisual needs and materials must be managed appropriately. The initial virtual meeting involves the examiner, examinee, and/or the facilitator (if used), and presents the opportunity for the examiner to provide information about the audiovisual needs and materials. During the initial virtual meeting, the examiner should provide training in troubleshooting audiovisual needs that arise during the testing session, including camera angle, lighting, and audio checks. The examiner should provide verbal feedback to guide camera adjustment, checking the onscreen video shown by the peripheral camera/device to provide information about how to reposition it until the proper view is shown. The examiner should emphasise that no materials should be opened until the examiner provides instructions to do so, if applicable. The examiner should also expect to provide verbal guidance about these issues during the testing session. Refer to the Telepractice Environment and Equipment section and to Table 1 for specific subtest telepractice considerations.
If used, the facilitator is to assist with administrative and technological tasks and not to manage rapport, engagement, or attention during the testing session. The examiner should direct them not to interfere with the examinee’s performance or responses. Any other roles and responsibilities for which an examiner needs support, such as behaviour management, should be outlined and trained prior to the beginning of the testing session. The examiner is responsible for documenting all behaviours of the facilitator during test administration and taking these into consideration when reporting scores and performance.
5. Other Considerations
There are special considerations for written reports describing testing that takes place via telepractice.
The professional completing the written report should state in the report that the test was administered via telepractice, and briefly describe the method of telepractice used. For example, ‘The RBANS was administered via remote telepractice using digital stimulus materials on Pearson’s Q-global system, and a facilitator monitored the administration onsite using a printed response booklet during the live video connection using the [name of telepractice system, e.g. Zoom] platform.’
The professional should also make a clinical judgment, as in a face-to-face session, about whether the examiner was able to obtain the examinee’s best performance. Clinical decisions should be explained in the report, including comments on the factors that led to the decision to conduct testing via telepractice and to report all (or not to report suspect) scores. In addition, it is recommended that the report include a record of any and all atypical events during the testing session (e.g. delayed video or audio, disruptions to connectivity, extraneous noises such as a phone ringing or a dog barking, a person or animal unexpectedly walking into the room, the examinee responding to other external stimuli). Notes may be recorded about these issues on the record form or in the notes section on Q-interactive. List and describe these anomalies as is typical for reporting behavioural observations in the written report, as well as any observed or perceived impact on the testing session and/or results, and consider these in the interpretation of results. For example, ‘The remote testing environment appeared free of distractions, adequate rapport was established with the examinee via video/audio, and the examinee appeared appropriately engaged in the task throughout the session. No significant technological problems or distractions were noted during administration. Modifications to the standardisation procedure included [list]. The RBANS subtests and similar tasks have received initial validation in several samples for remote telepractice and digital format administration, and the results are considered a valid description of the examinee’s skills and abilities.’
The RBANS was not standardised in a telepractice mode, and this should be taken into consideration when utilising this test via telepractice and interpreting results. For example, the examiner should consider relying on convergence of multiple data sources and/or being tentative about conclusions. Provided that the examiner has thoroughly considered and addressed the factors and the specific considerations as listed above, the examiner should be prepared to observe and comment about the reliable and valid delivery of the test via telepractice. Materials may be used via telepractice without additional permission from Pearson in the following published contexts:
RBANS manual, digital stimulus books, and response booklets via Q-global
RBANS via Q-interactive (requires advanced technology skills and mirroring software)
Any other use of the RBANS via telepractice is not currently recommended. This includes, but is not limited to, scanning the paper stimulus books, digitising the paper record forms, physically holding the stimulus books up to the camera, or uploading a manual to a shared drive or site.
American Psychological Association Services (APA Services). (2020). Guidance on psychological tele-assessment during the COVID-19 crisis. https://www.apaservices.org/practice/reimbursement/health-codes/testing/tele-assessment-covid-19
Eichstadt, T. J., Castilleja, N., Jakubowitz, M., & Wallace, A. (2013, November). Standardized assessment via telepractice: Qualitative review and survey data [Paper presentation]. Annual meeting of the American Speech-Language-Hearing Association, Chicago, IL, United States.
Grosch, M. C., Gottlieb, M. C., & Cullum, C. M. (2011). Initial practice recommendations for teleneuropsychology. The Clinical Neuropsychologist, 25, 1119–1133.
Inter Organizational Practice Committee (IOPC). (2020). Recommendations/guidance for teleneuropsychology (TeleNP) in response to the COVID-19 pandemic. https://static1.squarespace.com/static/50a3e393e4b07025e1a4f0d0/t/5e8260be9a64587cfd3a9832/1585602750557/Recommendations-Guidance+for+Teleneuropsychology-COVID-19-4.pdf
Randolph, C. (2012). Repeatable Battery for the Assessment of Neuropsychological Status-Update. Pearson.
Stolwyk, R., Hammers, D. B., Harder, L., & Cullum, C. M. (2020). Teleneuropsychology (TeleNP) in response to COVID-19. https://event.webinarjam.com/replay/13/pyl2nayhvspsp09
Wechsler, D. (1999). Wechsler Abbreviated Scale of Intelligence. Pearson.
Wechsler, D. (2008). Wechsler Adult Intelligence Scale (4th ed.). Pearson.
Wechsler, D. (2014). Wechsler Intelligence Scale for Children (5th ed.). Pearson.
See Table 1
Barcellos, L. F., Horton, M., Shao, X., Bellesis, K. H., Chinn, T., Waubant, E., Bakshi, N., Marcus, J., Benedict, R. H. B., & Schaefer, C. (2017). A validation study for remote testing of cognitive function in multiple sclerosis. Multiple Sclerosis Journal. https://doi.org/10.1177/1352458520937385
Brearly, T., Shura, R., Martindale, S., Lazowski, R., Luxton, D., Shenal, B., & Rowland, J. (2017). Neuropsychological test administration by videoconference: A systematic review and meta-analysis. Neuropsychology Review, 27(2), 174–186.
Cullum, C. M., Weiner, M., Gehrmann, H., & Hynan, L. (2006). Feasibility of telecognitive assessment in dementia. Assessment, 13(4), 385–390.
Cullum, C. M., Hynan, L. S., Grosch, M., Parikh, M., & Weiner, M. F. (2014). Teleneuropsychology: Evidence for video teleconference-based neuropsychological assessment. Journal of the International Neuropsychological Society, 20, 1028–1033.
Dekhtyar, M., Braun, E. J., Billot, A., Foo, L., & Kiran, S. (2020). Videoconference administration of the Western Aphasia Battery-Revised: Feasibility and validity. American Journal of Speech-Language Pathology, 29, 673–687.
Galusha-Glasscock, J., Horton, D., Weiner, M., & Cullum, C. M. (2016). Video teleconference administration of the Repeatable Battery for the Assessment of Neuropsychological Status. Archives of Clinical Neuropsychology, 31(1), 8–11.
Grosch, M., Weiner, M., Hynan, L., Shore, J., & Cullum, C. M. (2015). Video teleconference-based neurocognitive screening in geropsychiatry. Psychiatry Research, 225(3), 734–735.
Hildebrand, R., Chow, H., Williams, C., Nelson, M., & Wass, P. (2004). Feasibility of neuropsychological testing of older adults via videoconference: Implications for assessing the capacity for independent living. Journal of Telemedicine and Telecare, 10(3), 130–134. https://doi.org/10.1258/135763304323070751
Hodge, M., Sutherland, R., Jeng, K., Bale, G., Batta, P., Cambridge, A., Detheridge, J., Drevensek, S., Edwards, L., Everett, M., Ganesalingam, K., Geier, P., Kass, C., Mathieson, S., McCabe, M., Micallef, K., Molomby, K., Ong, N., Pfeiffer, S., … Silove, N. (2019). Agreement between telehealth and face-to-face assessment of intellectual ability in children with specific learning disorder. Journal of Telemedicine and Telecare, 25(7), 431–437. https://doi.org/10.1177/1357633X18776095
Jacobsen, S. E., Sprenger, T., Andersson, S., & Krogstad, J. (2003). Neuropsychological assessment and telemedicine: A preliminary study examining the reliability of neuropsychology services performed via telecommunication. Journal of the International Neuropsychological Society, 9, 472–478.
Ragbeer, S. N., Augustine, E. F., Mink, J. W., Thatcher, A. R., Vierhile, A. E., & Adams, H. R. (2016). Remote assessment of cognitive function in juvenile neuronal ceroid lipofuscinosis (Batten disease): A pilot study of feasibility and reliability. Journal of Child Neurology, 31, 481–487. https://doi.org/10.1177/0883073815600863
Stain, H. J., Payne, K., Thienel, R., Michie, P., Vaughan, C., & Kelly, B. (2011). The feasibility of videoconferencing for neuropsychological assessments of rural youth experiencing early psychosis. Journal of Telemedicine and Telecare, 17, 328–331. https://doi.org/10.1258/jtt.2011.101015
Sutherland, R., Trembath, D., Hodge, A., Drevensek, S., Lee, S., Silove, N., & Roberts, J. (2017). Telehealth language assessments using consumer grade equipment in rural and urban settings: Feasible, reliable and well tolerated. Journal of Telemedicine and Telecare, 23(1), 106–115. https://doi.org/10.1177/1357633X15623921
Temple, V., Drummond, C., Valiquette, S., & Jozsvai, E. (2010). A comparison of intellectual assessments over video conferencing and in-person for individuals with ID: Preliminary data. Journal of Intellectual Disability Research, 54(6), 573–577. https://doi.org/10.1111/j.1365-2788.2010.01282.x
Vahia, I. V., Ng, B., Camacho, A., Cardenas, V., Cherner, M. ... Agha, Z. (2015). Telepsychiatry for neurocognitive testing in older rural Latino adults. American Journal of Geriatric Psychiatry, 23, 666–670.
Vestal, L., Smith-Olinde, L., Hicks, G., Hutton, T., & Hart, J., Jr. (2006). Efficacy of language assessment in Alzheimer’s disease: Comparing in-person examination and telemedicine. Clinical Interventions in Aging, 1(4), 467–471.
Wadsworth, H., Galusha-Glasscock, J., Womack, K., Quiceno, M., Weiner, M., Hynan, L., Shore, J., & Cullum, C. (2016). Remote neuropsychological assessment in rural American Indians with and without cognitive impairment. Archives of Clinical Neuropsychology, 31(5), 420–425. https://doi.org/10.1093/arclin/acw030
Wadsworth, H. E., Dhima, K., Womack, K. B., Hart, J., Jr., Weiner, M. F., Hynan, L. S., & Cullum, C. M. (2018). Validity of teleneuropsychological assessment in older patients with cognitive disorders. Archives of Clinical Neuropsychology 33(8), 1040–1045. https://doi.org/10.1093/arclin/acx140
Waite, M. C., Theodoros, D. G., Russell, T. G., & Cahill, L. M. (2010). Internet-based telehealth assessment of language with CELF-4. Language, Speech, and Hearing Services in Schools, 41, 445–458.
Wright, A. J. (2018a). Equivalence of remote, online administration and traditional, face-to-face administration of the Woodcock-Johnson IV cognitive and achievement tests. Archives of Assessment Psychology, 8(1), 23–35.
Wright, A. J. (2018b). Equivalence of remote, online administration and traditional, face-to-face administration of the Reynolds Intellectual Assessment Scales-Second Edition. https://pages.presencelearning.com/rs/845-NEW-442/images/Content-PresenceLearning-Equivalence-of-Remote-Online-Administration-of-RIAS-2-White-Paper.pdf
Wright, A. J. (2020). Equivalence of remote, digital administration and traditional, in-person administration of the Wechsler Intelligence Scale for Children, Fifth Edition (WISC-V). Psychological Assessment. Advance online publication. http://dx.doi.org/10.1037/pas0000939
See Table 2
Daniel, M. H. (2012a). Equivalence of Q-interactive administered cognitive tasks: WAIS–IV (Q-interactive Technical Report 1). Pearson. https://www.pearsonassessments.com/content/dam/school/global/clinical/us/assets/q-interactive/007-s-QinteractiveTechnical%20Report%201_WAIS-IV.pdf
Daniel, M. H. (2012b). Equivalence of Q-interactive administered cognitive tasks: WISC–IV (Q-interactive Technical Report 2). Pearson. https://www.pearsonassessments.com/content/dam/school/global/clinical/us/assets/q-interactive/009-s-Technical%20Report%202_WISC-IV_Final.pdf
Daniel, M. (2013). Equivalence of Q-interactive and paper administration of WMS-IV cognitive tasks (Q-interactive Technical Report 6). Pearson.
Daniel, M. H., Wahlstrom, D., & Zhang, O. (2014). Equivalence of Q-interactive and paper administrations of cognitive tasks: WISC®–V (Q-interactive Technical Report 8). Pearson. https://www.pearsonassessments.com/content/dam/school/global/clinical/us/assets/q-interactive/003-s-Technical-Report_WISC-V_092514.pdf
Raiford, S. E., Holdnack, J. A., Drozdick, L. W., & Zhang, O. (2014). Q-interactive special group studies: The WISC–V and children with intellectual giftedness and intellectual disability (Q-interactive Technical Report 9). Pearson. http://www.helloq.com/content/dam/ped/ani/us/helloq/media/Technical_Report_9_WISC-V_Children_with_Intellectual_Giftedness_and_Intellectual_Disability.pdf
Raiford, S. E., Drozdick, L. W., & Zhang, O. (2015). Q-interactive special group studies: The WISC–V and children with autism spectrum disorder and accompanying language impairment or attention-deficit/hyperactivity disorder (Q-interactive Technical Report 11). Pearson. http://images.pearsonclinical.com/images/assets/WISC-V/Q-i-TR11_WISC-V_ADHDAUTL_FNL.pdf
Raiford, S. E., Drozdick, L. W., & Zhang, O. (2016). Q-interactive special group studies: The WISC–V and children with specific learning disorders in reading or mathematics (Q-interactive Technical Report 13). Pearson. https://www.pearsonassessments.com/content/dam/school/global/clinical/us/assets/q-interactive/012-s-Technical_Report_9_WISC-V_Children_with_Intellectual_Giftedness_and_Intellectual_Disability.pdf
Raiford, S. E., Zhang, O., Drozdick, L. W., Getz, K., Wahlstrom, D., Gabel, A., Holdnack, J. A., & Daniel, M. (2016). Coding and Symbol Search in digital format: Reliability, validity, special group studies, and interpretation (Q-interactive Technical Report 12). Pearson. https://www.pearsonassessments.com/content/dam/school/global/clinical/us/assets/q-interactive/002-Qi-Processing-Speed-Tech-Report_FNL2.pdf