
Blog posts


Read our latest blog posts from our sponsor, Hogrefe

NEW 4th October 2022 

Using parallel assessments of neuropsychological status to explore conceptual flexibility


Written by Lucy McIvor | Trainee Clinical Psychologist at the Salomons Institute for Applied Psychology


The capacity to deal with information and concepts flexibly is a subtle yet essential requirement for everyday living. It may involve considering alternatives, abstracting from the concrete to the generalisable, changing tack when the situation requires it, learning from feedback, following what ‘works’, persisting until we get it ‘right’, and holding the ‘bigger picture’ in mind. Is this perhaps the definition of ‘cognitive flexibility’?

 

The SPANS-X1 is a test battery used to measure neuropsychological functioning in individuals with acquired brain injury. The Conceptual Flexibility Index (CFI) of the SPANS-X aims to measure cognitive flexibility (CF). It combines two subtests that involve elements of concept formation, thinking laterally and flexibly, and combining concepts into superordinate categories. Currently, the CFI does not perform like the other SPANS-X index scores on measures of reliability (such as Cronbach’s alpha, alternate-version and test-retest reliability). This raises several considerations. Crucially, there is the matter of whether ‘cognitive flexibility’ exists at all: if a construct cannot be reliably measured, arguably it cannot be classed as a construct. Conversely, if we do assume its existence, consideration must be given to how the SPANS-X CFI could be improved to capture the construct of CF more accurately.
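
For readers less familiar with the reliability statistics mentioned above, the short sketch below shows how Cronbach’s alpha is computed from item-level scores. The data and item structure are entirely hypothetical and are not drawn from the SPANS-X; this is only an illustration of the coefficient itself.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical item scores: 100 examinees, 6 items sharing one underlying ability
rng = np.random.default_rng(0)
ability = rng.normal(size=100)
scores = np.column_stack([ability + rng.normal(scale=1.0, size=100) for _ in range(6)])

print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```

Items that share one underlying ability, as in this simulated example, yield a reasonably high alpha; a comparatively low alpha, of the kind described for the CFI relative to the other indices, would suggest the items do not hang together as a single dimension.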

 

Within the domain of brain injury research, CF is often treated as synonymous with task- or set-shifting.2 CF is therefore viewed as a specific skill, and has historically been examined using shifting paradigms (e.g. the Wisconsin Card Sorting Test3). The outcome of this convention is that CF has been operationalised by the tasks that are used to measure it.4 This circular reasoning has produced neuropsychology’s own chicken-and-egg debate: which came first, the accepted construct of ‘CF’, or its associated tasks? It is also likely that the ability to shift task or set is just one piece of the cognitive flexibility puzzle2, and that when individuals show a ‘flexible response’ in everyday life, away from ‘laboratory conditions’, an interaction of multiple cognitive subsystems (e.g. attentional mechanisms, perception, inhibition) occurs. In addition, task demands and contextual factors (e.g. an individual’s effort level and previous knowledge) will likely contribute. Focusing on individual test paradigms such as set- or task-shifting may therefore not be sufficient to capture the full range of a person’s CF.

 

Our current research aims to add two additional items of greater difficulty to each of the existing CFI subtests, and to add four further CF subtests that theoretically align with the current CFI. To achieve this, we will draw inspiration from a full range of existing neuropsychological tests purporting to measure ‘CF’ (the Wisconsin Card Sorting Test3, the D-KEFS Sorting Test6, the Hayling and Brixton Tests7, the Category Test of the Halstead-Reitan8, and the Rule Shift Cards and Modified Six Elements tests of the BADS9). Our aim in doing this is to avoid using individual paradigms in isolation, instead combining a diverse range of paradigms to gain an overarching view of CF. By introducing multiple diverse subtests that aim to measure the same concept, we can observe which subtests covary with one another; observed covariance among these subtests would suggest that they measure the same construct (possibly CF). We will also gauge participants’ effort levels, to ensure we are considering CF as a property of the cognitive system that is highly influenced by context. We will further look carefully at the validity of these neuropsychological tests: not only whether performance on the tests listed correlates with the nature and severity of brain injury, but also whether an individual’s scores correlate with relevant, real-world outcomes (e.g. social and occupational functioning).
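
As a rough illustration of the covariance logic described above, the sketch below builds an inter-subtest correlation matrix from simulated data: subtests driven by the same latent factor correlate with one another, while an unrelated subtest does not. The subtest names, sample size and scores are invented for illustration and do not represent the planned SPANS-X subtests.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 120                                    # hypothetical participants
cf = rng.normal(size=n)                    # latent 'cognitive flexibility' factor
speed = rng.normal(size=n)                 # unrelated latent factor

# Hypothetical subtest scores: the first three share the CF factor,
# the fourth mostly reflects something else (e.g. processing speed).
data = pd.DataFrame({
    "concept_formation": cf + rng.normal(scale=0.7, size=n),
    "lateral_thinking":  cf + rng.normal(scale=0.7, size=n),
    "category_sorting":  cf + rng.normal(scale=0.7, size=n),
    "symbol_search":     speed + rng.normal(scale=0.7, size=n),
})

# Subtests measuring the same construct should covary / correlate strongly
print(data.corr().round(2))
```

In the real study, strong correlations among the new and existing CF subtests, alongside weak correlations with unrelated measures, would be the kind of evidence that they tap a common construct.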

 

Within our study, we will also ask participants to complete an online version of the Wisconsin Card Sorting Test (WCST). The WCST demonstrates a ceiling effect in non-clinical samples, which affects its reliability; however, it correlates well with certain functional outcomes in patient populations (e.g. returning to work). If participant performance on the WCST correlates with the new and established CF subtests of the SPANS-X, we may infer that the SPANS-X CFI might also predict useful outcomes.

 

Lucy is a trainee clinical psychologist at the Salomons Institute for Applied Psychology. Her published research explores the neuropsychological basis of emotional difficulties and the enactment of violence and self-harm. Her doctoral thesis will explore the construct of ‘cognitive flexibility’ and the validity and reliability of the SPANS-X Conceptual Flexibility Index. Data collection will commence in 2023 and completion is expected by 2024.

 

 

1.    Burgess, G.H. (2022). Short Parallel Assessments of Neuropsychological Status-Extended (SPANS-X). Oxford: Hogrefe.

2.    Ionescu, T. (2012). Exploring the nature of cognitive flexibility. New Ideas in Psychology, 30(2), 190–200. https://doi.org/10.1016/j.newideapsych.2011.11.001

3.    Berg, E. A. (1948). A simple objective technique for measuring flexibility in thinking. The Journal of General Psychology, 39(1), 15–22. https://doi.org/10.1080/00221309.1948.9918159

4.    Dajani, D. R., & Uddin, L. Q. (2015). Demystifying cognitive flexibility: Implications for clinical and developmental neuroscience. Trends in Neurosciences, 38(9), 571–578. https://doi.org/10.1016/j.tins.2015.07.003


Watch the joint BNS-Hogrefe Webinar:

Using parallel assessments of neuropsychological status to explore conceptual flexibility


This webinar is available to watch via Zoom using passcode: =nQ369mX 



11th May 2022 

Remote Assessment of Neuropsychological Status: SPANS-X


How Covid-19 ushered in the first empirical evidence for the administration of online assessments of neuropsychological status

“Can you hear me?... Can you hear me now?”


All of us are familiar with the sounds of remote calling, particularly after two years of learning to cope professionally within a pandemic world. From the early days of lockdown in March 2020, neuropsychologists (and anyone providing critical outpatient appointments and assessments) had to find a way to adapt to a new virtual reality – and to adapt quickly, with very little guidance or evidence to support them.


For patients with neuropsychological needs – including acquired brain injury and suspected or diagnosed dementia or Alzheimer’s – visiting a memory clinic or hospital even on a normal day can be confusing, chaotic and sometimes scary. Add to that the very real issues associated with Covid-19 in terms of shielding vulnerable patients, difficulties with face masks, and generally having been isolated for two years – and suddenly remote assessment has become not just desirable, but critical for some of our most vulnerable patients.

In a well-timed technological advance, an assessment of neuropsychological status has been developed specifically to be administered remotely, via any online videoconference platform. The Short Parallel Assessments of Neuropsychological Status – Extended (SPANS-X), a test battery for accurate cognitive measurement, expands upon the important work of its earlier edition in various ways. Amongst the advancements is essential empirical evidence from a validity/equivalency study, which compared the performance of demographically matched samples under remote SPANS-X administration against traditional in-person administration.


Now, patients with neuropsychological needs can be assessed from home or clinic – and neuropsychologists and other health care workers can be assured of the results.


How does this work?


As the ‘original’ in-person, face-to-face (FTF) SPANS administration required few paper materials, relying mostly on a stimulus book and the administrator’s voice, it was possible to convert the SPANS-X to online remote (REM) administration. As a result, the SPANS-X administrator should not find it difficult to shift between FTF and REM administration as the situation requires.


The basis of REM administration is that you can administer the SPANS-X to anyone in the world who has suitable technology and aptitude, simply by ‘sharing screen’ with the examinee on an online videoconference platform.


A validation study examining equivalency between FTF and REM administrations can be found in the SPANS-X manual; in summary, the two modes were found to be suitably equivalent.
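
The equivalency analysis itself is reported in the manual and is not reproduced here. Purely to illustrate how equivalence between two administration modes can be tested statistically, the sketch below applies a two one-sided tests (TOST) procedure to simulated FTF and REM index scores; the scores, sample sizes and the five-point equivalence margin are assumptions for illustration, not figures from the SPANS-X study.

```python
import numpy as np
from statsmodels.stats.weightstats import ttost_ind

rng = np.random.default_rng(42)
# Hypothetical index scores from two demographically matched groups
ftf_scores = rng.normal(loc=100, scale=15, size=60)   # in-person (FTF) administration
rem_scores = rng.normal(loc=99, scale=15, size=60)    # remote (REM) administration

# TOST: is the difference in means contained within an assumed +/- 5-point margin?
p_value, lower, upper = ttost_ind(ftf_scores, rem_scores, low=-5, upp=5)
print(f"TOST p-value: {p_value:.3f} (p < .05 would support equivalence)")
```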


What technology do I require? And my patient?


Digital materials, including recordings to support a listening task (helping to avoid any confounding bias that may occur with technological delays), are included with every SPANS-X kit.


Familiarity with your computer is essential: it must be capable of taking screenshots, have a built-in or external webcam and microphone, and have access to a videoconferencing platform with ‘share screen’ capability. You should be adept at switching screen sharing on and off, and should use an audio headset for audio clarity and to reassure examinees of their privacy.


Similarly, the examinee will require aptitude and/or assistance (from you or from someone with them). They will need access to a computer with webcam and videoconference capability, as well as a reliable internet connection.


What else can SPANS-X do?


The SPANS-X now boasts:


  • a norm for every age from 18 to 90 years, calculated using an innovative nonlinear regression process, for precise and accurate interpretation (a generic sketch of regression-based norming follows this list)
  • equivalent alternate versions for reliable retest while minimising practice effects
  • a focus on ‘functional’ everyday skills for real-world application
  • simpler administration and new streamlined scoring for quick and accurate results
  • expanded interpretation guidance – including specific guidance for using the measure in different situations (such as assessing mental capacity, testing those with intellectual disability or with yes/no only responding, etc.)
  • new validation studies and updated reliability studies
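
The SPANS-X norming procedure itself is documented in the manual; the sketch below is only a generic illustration of the idea behind regression-based continuous norming, in which a smooth (here, quadratic) trend of score on age replaces separate age-band tables. The data, model form and parameters are assumptions for illustration and are not the SPANS-X method.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical normative sample: raw scores that decline nonlinearly with age
ages = rng.integers(18, 91, size=500).astype(float)
raw = 50 - 0.002 * (ages - 18) ** 2 + rng.normal(scale=4, size=ages.size)

# Fit a nonlinear (quadratic) trend of mean raw score on age
mean_coefs = np.polyfit(ages, raw, deg=2)
residuals = raw - np.polyval(mean_coefs, ages)

# Model the residual spread against age too, so the SD can also vary with age
sd_coefs = np.polyfit(ages, np.abs(residuals), deg=2)

def age_adjusted_z(raw_score: float, age: float) -> float:
    """Convert a raw score to a z-score relative to the fitted age trend."""
    mu = np.polyval(mean_coefs, age)
    sigma = np.polyval(sd_coefs, age) * np.sqrt(np.pi / 2)  # mean |residual| -> SD
    return (raw_score - mu) / sigma

print(f"Age-adjusted z for a raw score of 45 at age 70: {age_adjusted_z(45, 70):.2f}")
```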


For more on the SPANS-X, please contact Hogrefe Ltd at customersupport@hogrefe.co.uk
