
The Top 9 Trends in Psychological Assessments (and What They Mean for You)

16 September 2025

Explore 2025’s key workplace assessment trends and considerations. Stay ahead with insights for planning your 2026 talent strategy.

Author: Kleinjan Redelinghuys

In today's fast-paced business world, artificial intelligence (AI), technology, and data science are frequently applied across the assessment lifecycle, spanning test design, delivery, scoring, interpretation, and feedback (International Test Commission & Association of Test Publishers, 2025). With these advances come opportunities for unparalleled innovation, but they also create an environment where one should tread carefully from ethical, legal, and practical perspectives (International Test Commission & Association of Test Publishers, 2025; OECD, 2019; UNESCO, 2023).

JVR’s stance is clear: Businesses should lean into these trends to stay competitive, but always prioritise fairness, privacy, and reliability. By doing so, you can hire better talent, boost team performance, and avoid costly mistakes. Against this background, we aim to outline key psychological assessment trends in the workplace, while highlighting some important considerations that accompany them.

Key trends and considerations

Before delving into the trends, it is worth noting that adoption or occurrence rates may vary significantly across industries, organisations, and regions due to factors such as infrastructure readiness, availability of funding, regulatory requirements, workforce capabilities, and organisational objectives. Nonetheless, below we examine several notable psychological assessment trends, emphasising their potential value-add, challenges, and documented real-world applications where available.

Trend 1: A shift in how assessments are being administered.

Traditional in-person pencil-and-paper-based tests have declined significantly as organisations adopt digital platforms that enable remote delivery, mobile accessibility, and automation of various aspects of the assessment process (International Test Commission & Association of Test Publishers, 2025; Elosua et al., 2023).

Potential value-add: Digital assessments offer cost and environmental savings, allow candidates to complete tests at any time and from anywhere, scale effortlessly for large groups, and deliver results much faster through automation.

Challenges: Unstable internet connectivity, limited access to devices, data affordability, and varying levels of digital literacy may reduce efficacy. Additional considerations include ensuring secure data storage, protecting candidate privacy, maintaining compliance with relevant data regulations, and providing sufficient human oversight for automated processes (International Test Commission & Association of Test Publishers, 2025; Laher, 2024).

JVR’s position: As a test publisher in Africa, we have also seen the increasing digitisation of assessments and selection processes. However, structural challenges remain, depending on the assessment context, so we will continue to provide a wide range of options spanning different assessment modalities and methods. Digital assessments allow for scalability, but scale may compromise standardised testing conditions and can make cheating easier; in addition to the considerations above, we therefore advise assessment practitioners to be strategic with regard to assessment supervision.

Trend 2: Expanded assessment modes.

Some niche organisations are moving beyond traditional assessment methods to collect data from wearables, such as biometric wristbands, smartwatches, VR headsets, and augmented reality glasses (Maltseva, 2020). Others are also experimenting with social media and digital trace data (Elosua et al., 2023). For example, some employers skim public profiles (e.g., LinkedIn, X, Facebook) as an indicator of a candidate’s professionalism, communication style, or cultural fit. Some also use structured digital trace data (i.e., data gathered from tests or simulations) and unstructured digital trace data (i.e., data obtained through everyday online activity) to reveal how candidates approach tasks and solve problems.

Potential value-add: Wearables capture real-time behavioural data and physiological responses (e.g., activity, heart rate, skin conductance), enabling richer, more dynamic assessments of how individuals think, feel, and perform in their natural work environment (Maltseva, 2020). They can also provide crucial indicators of employee ergonomics, safety, and well-being. In parallel, social media and digital trace data offer indirect insights into personality, communication style, and behavioural tendencies (Elosua et al., 2023).

Challenges: These methods raise ethical concerns around consent, data ownership, and potential surveillance. Technical and practical barriers include cost, integration into existing systems, and the need for robust validation to ensure that data collected from these novel sources are both reliable and fair (Maltseva, 2020).

JVR’s position: We are watching this space closely, as we believe it will yield interesting innovations, research opportunities, and collaborations in the future.

Trend 3: Candidate-centric assessments.

Organisations are increasingly moving beyond traditional test batteries by introducing AI-driven simulations, gamified assessments, adaptive testing, and video interviews. These tools not only allow for the evaluation of candidates’ abilities, behaviours, and potential under realistic or interactive conditions, but are also intentionally designed to create smoother, more engaging, and less stressful candidate experiences. The emphasis is shifting from static, one-size-fits-all tests to assessments that are dynamic, immersive, personalised, and candidate-friendly (Elosua et al., 2023; Laher, 2024; Bharadwaj, 2024).
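To make the adaptive-testing idea concrete: a computerised adaptive test (CAT) typically re-estimates a candidate's ability after each response and selects the next item to match it. The sketch below is purely illustrative, using a simplified Rasch (1PL) setup and a crude fixed-step ability update; it is not any vendor's algorithm.

```python
import math

def prob_correct(theta, b):
    """Rasch (1PL) model: chance of a correct answer given ability theta
    and item difficulty b, both on the same logit scale."""
    return 1 / (1 + math.exp(-(theta - b)))

def next_item(theta, difficulties, used):
    """Pick the unused item whose difficulty is closest to the current
    ability estimate -- roughly the most informative item under 1PL."""
    remaining = [i for i in range(len(difficulties)) if i not in used]
    return min(remaining, key=lambda i: abs(difficulties[i] - theta))

def update_theta(theta, b, correct, step=0.5):
    """Crude fixed-step update (real CATs use maximum-likelihood or
    Bayesian estimation); moves the estimate up on a correct answer."""
    return theta + step if correct else theta - step
```

In practice the loop "select item, record response, update estimate" repeats until a precision or length criterion is met, which is why adaptive tests can reach a stable estimate with fewer items than a fixed-form test.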

Potential value-add: These approaches may provide richer behavioural and performance data, enabling more effective large-scale screening and highlighting critical soft skills (e.g., problem-solving, adaptability, collaboration). They may also offer deeper insights into candidates’ performance in applied settings. Adaptive testing and gamified elements personalise the experience, maintain engagement, and may deliver more precise insights in less time.

Challenges: Ensuring fairness, transparency, and inclusivity is vital, as AI-driven and gamified tools can unintentionally introduce bias or disadvantage certain groups. For example, in certain instances, a gamified test can measure familiarity with the game rather than the intended job-relevant skills, unintentionally favouring people with certain characteristics (e.g., candidates who are younger, more tech-savvy, etc.). Accessibility remains a concern (e.g., familiarity with gaming environments, internet connectivity, adequate bandwidth for video interviews), whereas adaptive algorithms require extensive development and validation. Over-gamification may reduce the perceived seriousness or validity of assessments. Thus, rigorous validation is necessary to confirm that these methods reliably and fairly measure what they intend to (International Test Commission & Association of Test Publishers, 2025).

JVR’s position: JVR supports the trend toward candidate-centric assessments, but not at the expense of scientific rigour. We aim for a win-win balance, focusing on what we do best (assessments) and partnering with others where we still have much to learn. We aim to provide our clients with quality options and are always open to partnering to make things more effective and efficient for all involved.

Trend 4: Increased AI integration.

The use of AI in assessment design, scoring, and interpretation is growing. Among others, this includes automated scoring of complex responses (e.g., video interviews, written essays) using natural language processing (NLP) and machine learning (ML), the creation of assessment content (e.g., items, multimedia) through generative AI, and the analysis of large datasets to reveal patterns and insights that were previously impractical for humans to detect (Association of Test Publishers, 2021).

Potential value-add: AI promises to significantly improve efficiency by reducing scoring time, enabling rapid scaling, and lowering administrative costs. It aims to enhance precision and objectivity in scoring, generate innovative and adaptive test content, and reveal deeper insights into candidate behaviour and performance. By processing large and complex datasets, AI may also support continuous validation and improvement of assessments.

Challenges: Addressing algorithmic bias, improving transparency, and preventing misuse of sensitive candidate data are ongoing priorities. Additional considerations include the quality and representativeness of training data, the need for human oversight to complement AI outputs, ongoing monitoring and retraining to maintain fairness, and ensuring diversity in AI development teams to mitigate bias (Dumas et al., 2025; International Test Commission & Association of Test Publishers, 2025; Royal Society, 2024). “From the perspective of a human-centred approach, AI tools should be designed to extend or augment human intellectual abilities and social skills, and not undermine them, conflict with them or usurp them” (UNESCO, 2023, p. 38).
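One widely used heuristic for the kind of fairness monitoring described above is the "four-fifths rule" from the US Uniform Guidelines on Employee Selection Procedures: if a group's selection rate falls below 80% of the highest group's rate, possible adverse impact warrants review. A minimal illustrative sketch (the group labels and counts are hypothetical, and the rule is a screening heuristic, not a full statistical test):

```python
def adverse_impact_ratios(selected, applicants):
    """Selection rate of each group relative to the highest-rate group.

    Ratios below 0.8 breach the four-fifths heuristic and suggest the
    selection procedure (human or AI-driven) needs closer scrutiny.
    """
    rates = {g: selected[g] / applicants[g] for g in applicants}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}
```

For example, with hypothetical counts `adverse_impact_ratios({"A": 50, "B": 30}, {"A": 100, "B": 100})`, group B's ratio is 0.6 and would be flagged for review.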

JVR’s position: JVR is cautiously optimistic about the use of AI in assessments, as well as in selection and development more broadly. Some areas carry relatively less risk (e.g., development work, personalised feedback) and some carry more (e.g., selection, item generation, scoring). For those wanting to experiment, go for it, but exercise caution: many ideas remain untested yet highly promoted. Put your scientist-practitioner hat on, ask for the technical manual, and do the research to ensure that the tool meets the basic criteria of validity, reliability, and fairness.

Trend 5: Connecting assessment insights to workforce analytics.

Assessment results are increasingly being integrated with Human Resource Information System (HRIS) and Applicant Tracking System (ATS) platforms, performance management systems, and broader people analytics tools. This linkage feeds assessment insights directly into talent management, creating a holistic and continuously updated picture of the workforce (PsycheJunction, 2025).

Potential value-add: Linking assessments to workforce analytics enables predictive modelling for succession planning, development, and retention. It also supports data-driven decision-making by providing leaders with data visualisation in interactive dashboards that highlight strengths, risks, and opportunities across the workforce. For employees, it can offer clearer development pathways and personalised feedback, enhancing engagement and career growth.

Challenges: Integration can be technically complex, requiring interoperability between systems and robust data governance. Risks include data privacy concerns, potential misuse of insights (e.g., overly mechanistic decisions), and ensuring that predictive models are validated and unbiased. High costs of integration and the need for cross-functional expertise (e.g., HR, IT, data science) can also present barriers for many organisations (International Test Commission & Association of Test Publishers, 2025).

JVR’s position: JVR recently launched JVR Solutions, our latest business unit in the group of companies. JVR Solutions is a software development and IT company focused on delivering robust and scalable tech solutions in the assessment space. Our goal is to help clients unlock the value-adds described above and overcome the accompanying challenges.

Trend 6: A stronger emphasis on assessing beyond technical skills.

Despite the continued focus on technical skills, organisations are increasingly seeking tools to evaluate non-technical competencies that are crucial in complex, evolving work environments (PsycheJunction, 2025). The World Economic Forum (2025) highlights this shift, ranking technological literacy only sixth on its list of top ten skills for 2025 while placing greater emphasis on core non-technical skills like analytical thinking, resilience, flexibility, agility, ‘leadership and social influence’, ‘motivation and self-awareness’, ‘empathy and active listening’, and ‘curiosity and lifelong learning’.

Potential value-add: Expanding assessment focus beyond technical skills provides a more holistic understanding of candidate and employee potential. It can improve role and culture fit, enhance team dynamics, and better predict long-term success in rapidly changing contexts. Assessing non-technical skills also supports leadership development, adaptability, and employee well-being, helping organisations future-proof their workforce.

Challenges: Measuring non-technical skills is challenging and can lead to inconsistent or biased evaluations. The development of valid tools depends on meticulous design and thorough validation, which can be both time-consuming and costly. There is also a risk of overlap or redundancy with existing measures, and organisations must ensure that these competencies are assessed fairly across diverse cultural and demographic groups.

JVR’s position: Success in today’s fast-changing world depends on more than just technical expertise. JVR advocates for the assessment of both potential (the likelihood of displaying or ease of developing a skill) and performance (the demonstration of a skill). By integrating psychological constructs (e.g., cognitive ability, personality, and values) with measures of displayed performance, organisations can identify the technical and non-technical skills (e.g., analytical thinking, adaptability, resilience, and social influence) that best predict long-term success.

Trend 7: Enabling global accessibility and inclusivity.

There is a growing need for digital assessments to be designed and adapted to accommodate diverse populations through multilingual interfaces, culturally appropriate content, and accessibility features for candidates with a wide range of physical, sensory, cognitive, or learning differences (International Test Commission & Association of Test Publishers, 2025; PsycheJunction, 2025).

Potential value-add: Inclusive assessment design broadens access to talent, supports fairer hiring and development practices, and strengthens employer reputation. It helps organisations tap into more diverse candidate pools, improves equity across regions and demographic groups, and ensures compliance with global accessibility standards.

Challenges: Creating truly inclusive assessments requires significant investment in localisation, cultural validation, and accessibility testing. Organisations may face technical hurdles in adapting platforms to different contexts. Failing to implement inclusivity measures properly may introduce bias or unintentionally exclude candidates. Thus, balancing standardisation with cultural adaptation remains a complex challenge (International Test Commission, 2017; Laher, 2024).

JVR’s position: We strive to combine rigorous psychometric standards with a deep understanding of local contexts, to help organisations broaden access to talent while maintaining the reliability and comparability needed for sound decision-making. No single, perfect assessment exists, so we design our solutions to balance global consistency with cultural sensitivity, ensuring assessments are meaningful, fair, and relevant across regions.

Trend 8: Enhanced efforts to combat cheating and unusual assessment behaviour.

As remote and digital delivery becomes more common, organisations are adopting measures to maintain the integrity of assessment results. These include locked-down browsers, remote or AI-based proctoring, facial recognition software, and facial analytics designed to detect unusual behaviour during testing (International Test Commission & Association of Test Publishers, 2025).
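Behavioural flags of the kind mentioned above often start from something as simple as response-time outliers: answers far faster than the group norm can indicate item pre-knowledge or automated responding. The sketch below is a deliberately simple illustration (a z-score cut-off on response times, not a production proctoring system, and the threshold is an assumption):

```python
from statistics import mean, pstdev

def flag_fast_responses(times, z_cut=-2.0):
    """Return indices of response times unusually fast for the group.

    Uses a population z-score against the group mean; real systems
    combine many signals and model per-item time norms.
    """
    mu, sigma = mean(times), pstdev(times)
    return [i for i, t in enumerate(times)
            if sigma > 0 and (t - mu) / sigma < z_cut]
```

A flag like this would trigger human review rather than an automatic decision, in line with the human-oversight principle discussed under Trend 4.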

Potential value-add: Stronger security measures help ensure fairness by protecting against cheating, preserving the validity of results, and safeguarding organisational credibility. They also provide confidence to stakeholders that remote assessments can be trusted as much as traditional in-person methods, enabling wider adoption of digital formats.

Challenges: Security measures can raise concerns about candidate privacy, surveillance, and data protection (Wen et al., 2024). Certain tools (e.g., facial recognition or behavioural analytics) may also be prone to bias or errors, potentially disadvantaging some groups. Overly intrusive monitoring can create stress or negative candidate experiences, while technical requirements (e.g., reliable internet, compatible devices) may further limit accessibility.

JVR’s position: Every context is different, but rather than relying on intrusive surveillance technologies, our approach balances built-in security measures with standardised processes, with a focus on candidate experience, privacy, and inclusivity. We combine technical safeguards with careful attention to ethical guidelines, data protection standards, and cultural fairness.

Trend 9: Prioritising ethical and regulatory practices.

With growing reliance on digital platforms and AI, organisations are required to prioritise fairness, data privacy, and compliance with emerging regulations. This includes transparent communication to candidates about how their data is used and stored, monitoring AI systems for bias, ensuring secure data storage and governance, obtaining candidate consent where appropriate, and staying up to date with relevant laws and industry guidelines (Bharadwaj, 2024; PsycheJunction, 2025).

Potential value-add: A strong ethical and regulatory framework builds trust with candidates, protects organisational reputation, and reduces legal and compliance risks. It also promotes fairer assessment outcomes by ensuring responsible AI use and actively addressing bias. Clear communication and consent practices can improve the candidate experience by making processes feel transparent and respectful.

Challenges: Rapidly evolving regulations across jurisdictions create complexity for multinational organisations. Ensuring AI systems are explainable, unbiased, and secure requires ongoing investment. Balancing innovation with compliance can slow adoption of new technologies, and failure to uphold ethical standards may result in reputational damage or loss of candidate trust (Camilleri, 2023).

JVR’s position: JVR strives to combine rigorous adherence to international best-practice guidelines with a proactive focus on local regulatory requirements. We ensure that assessments are designed and delivered in ways that are responsible, explainable, and bias-aware, while also communicating openly with candidates about data use, consent, and privacy protections.

Conclusion

The landscape of workplace psychological assessment is rapidly evolving, driven by advances in AI, digital platforms, and novel assessment modes. These trends offer remarkable opportunities to enhance candidate experience, generate richer behavioural insights, and improve organisational decision-making. However, despite the promise of expanded modalities, gamification, AI-driven analysis, and integration with workforce analytics, organisations must remain alert. Practical constraints, ethical concerns, and regulatory requirements present ongoing challenges that should not be taken lightly.

Even as innovation drives new assessment methods, the fundamental psychometric principles of reliability (consistency across time, raters, and test forms), validity (measuring what is intended), and fairness (equitable outcomes across demographic groups) must stay at the forefront of all assessment practices. No matter how engaging, technologically sophisticated, or efficient an assessment may be, it will only deliver meaningful, defensible results if it is psychometrically sound. Organisations should therefore ensure that new approaches are rigorously validated, inclusive, transparent, and interpretable, while maintaining appropriate human oversight. By balancing innovation with rigorous standards and responsible practices, organisations can harness the potential of new technologies without compromising integrity, inclusivity, and long-term effectiveness.
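As an illustration of the first of these principles, internal-consistency reliability for a multi-item scale is commonly estimated with Cronbach's alpha, which compares the summed item variances to the variance of the total score. A minimal sketch with toy data (population variances, rows as respondents, columns as items):

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(scores[0])                                  # number of items
    item_vars = [pvariance(col) for col in zip(*scores)]
    total_var = pvariance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

Perfectly correlated items yield alpha = 1.0, while weakly related items pull the estimate down, which is why low alpha is a warning sign for any scale, however novel its delivery format.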

References

Association of Test Publishers. (2021, July 6). Artificial intelligence and the testing industry: A primer. https://www.testpublishers.org/assets/ATP%20White%20Paper_AI%20and%20Testing_A%20Primer_6July2021_Final%20R1%20.pdf

Bharadwaj, A. (2024, June 15). The next wave of psychometric assessments: Innovations and insights. Mercer | Mettl. https://blog.mettl.com/the-next-wave-of-psychometric-assessments-innovations-and-insights/

BotPenguin. (2025, June 5). 10 real world recruitment chatbot examples used by top companies. https://botpenguin.com/blogs/recruitment-chatbot-examples

Camilleri, M. A. (2023). Artificial intelligence governance: Ethical considerations and implications for social responsibility. Expert Systems, Article e13406. https://doi.org/10.1111/exsy.13406

Dumas, D., Greiff, S., & Wetzel, E. (2025). Ten guidelines for scoring psychological assessments using artificial intelligence [Editorial]. European Journal of Psychological Assessment, 41(3), 169–173. https://doi.org/10.1027/1015-5759/a000904

Elosua, P., Aguado, D., Fonseca-Pedrero, E., Abad, F. J., & Santamaría, P. (2023). New trends in digital technology-based psychological and educational assessment. Psicothema, 35(1), 50–57. https://doi.org/10.7334/psicothema2022.241

Hale, M. (2025, May 15). Next Gen AI in action: Unilever’s AI-powered recruitment revolution. Global Skill Development Council. https://www.gsdcouncil.org/blogs/next-gen-ai-in-action-unilever-s-ai-powered-recruitment-revolution

International Test Commission. (2017). The ITC guidelines for translating and adapting tests (2nd ed.). https://www.intestcom.org/files/guideline_test_adaptation_2ed.pdf

International Test Commission & Association of Test Publishers. (2025). Guidelines for technology-based assessment. https://www.intestcom.org/page/28

Laher, S. (2024). Assessment futures: Reflections on the next decade of psychological assessment in South Africa. African Journal of Psychological Assessment, 6(0), a166. https://doi.org/10.4102/ajopa.v6i0.166

Maltseva, K. (2020). Wearables in the workplace: The brave new world of employee engagement. Business Horizons, 63(4), 493–505. https://doi.org/10.1016/j.bushor.2020.03.007

Organisation for Economic Co-operation and Development (OECD). (2019). Recommendation of the Council on Artificial Intelligence. https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0449

Phenom. (2025). Transforming Mastercard talent acquisition. Phenom. https://www.phenom.com/resource/transforming-mastercard-talent-acquisition

PsycheJunction. (2025, May 8). Top 7 psychometric trends shaping the future of assessment in 2025. https://www.psychejunction.com/news/top-7-psychometric-trends-shaping-the-future-of-assessment-in-2025

Royal Society. (2024). Science in the age of AI. https://royalsociety.org/news-resources/projects/science-in-the-age-of-ai/

Schmidt, E. (2022, December 21). How Nestlé automated nearly 8,000 hours of recruiting work this year. Paradox. https://www.paradox.ai/blog/8-000-hours-of-work-turned-to-almost-zero-how-nestle-did-it

UNESCO. (2023). Guidance for generative AI in education and research. https://www.unesco.org/en/articles/guidance-generative-ai-education-and-research

Vodacom Group. (2022, April 11). Vodacom’s wearable devices set to boost mineworker safety. https://vodacom.com/news-article.php?articleID=7772

Wen, Y., Liu, B., Song, L., Cao, J., & Xie, R. (2024). Facial recognition technology and the privacy risks. In Face de-identification: Safeguarding identities in the digital era (pp. 15–20). Springer, Cham. https://doi.org/10.1007/978-3-031-58222-6_2

World Economic Forum. (2025, January 7). The Future of Jobs Report 2025. https://www.weforum.org/publications/the-future-of-jobs-report-2025
