
12 Assessment Centre Exercises Used by Global Companies


When organisations rely solely on interviews to make hiring and promotion decisions, they are working with one of the least predictive tools in the HR toolkit. Research published in the Journal of Applied Psychology consistently demonstrates that unstructured interviews have a validity coefficient of around 0.38, whereas well-designed assessment centre exercises combining multiple methods can push predictive validity above 0.65. That is not a marginal improvement. It is the difference between a strategic talent decision and an expensive guess. This guide breaks down every assessment centre exercise that HR professionals, L&D managers, and organisational development consultants are deploying inside Fortune 500 companies, Big 4 firms, and global conglomerates today, and shows you how to use them with precision.

Stat: A 2023 meta-analysis covering 183 studies found that assessment centres outperform single-method selection by up to 27% in predicting on-the-job performance (Schmidt & Hunter, updated meta-analysis, Society for Industrial and Organizational Psychology).

1. What Are Assessment Centre Exercises? (And How They Differ from Standard Tests)

Assessment centre exercises are structured, multi-method behavioural simulations designed to observe and measure how a candidate or employee performs across a defined set of behavioural competencies in conditions that mirror real work situations. The term ‘centre’ refers to a methodology, not a location: assessments can run in person, in a virtual assessment centre format, or in a hybrid model.

Unlike psychometric tests or cognitive ability assessments, which measure underlying traits or aptitudes in a standardised format, assessment centre activities require participants to actively demonstrate competencies through observed, scorable behaviour. A personality questionnaire tells you what someone is likely to do. An assessment centre exercise shows you what they actually do when placed under conditions that matter.

The distinction is fundamental:

| Dimension | Standard Psychometric Test | Assessment Centre Exercise |
| --- | --- | --- |
| What it measures | Traits, aptitudes, preferences | Observable behaviour in context |
| Format | Questionnaire or cognitive test | Simulation, roleplay, task, or group activity |
| Assessor involvement | Minimal to none | Active, trained assessors observe in real time |
| Competency coverage | Typically 1 to 3 constructs | Can cover 4 to 8 competencies simultaneously |
| Predictive validity | 0.30 to 0.45 (personality) | 0.50 to 0.65+ when designed well |
| Legal defensibility | Moderate | High when ORCE methodology is followed |

The ORCE framework (Observe, Record, Classify, Evaluate) is the globally recognised standard for conducting assessor observation in these exercises, and it forms the backbone of quality control in every reputable assessment centre programme.

For a deeper grounding in competency-based assessment design and how it connects to talent decisions, the Able Ventures behavioural assessment practice provides an evidence-based framework tailored to Indian and multinational contexts.

2. The 12 Assessment Centre Exercises: A Deep-Dive Guide

Each exercise below is documented with the depth that HR professionals and L&D managers need to actually deploy them, not just recognise the name.

Exercise 1: In-Tray Exercise (In-Basket Exercise)

What it is: An assessment centre staple, the in-tray exercise presents candidates with a realistic inbox, calendar, and briefing pack from a fictional (or lightly disguised real) organisation. Candidates must work through the materials under time pressure, making prioritisation decisions, drafting responses, delegating tasks, and flagging escalations.

Competencies measured: Planning and organisation, priority management, judgement and decision-making, written communication, analysis of information.

How global companies use it: Unilever has deployed in-tray exercises in its Future Leaders programme as a core screen for analytical thinking and commercial acumen. The exercise is typically set in a realistic Unilever-adjacent context: a regional sales manager returning from leave to a backlog of emails requiring immediate decisions. This contextual anchoring is deliberate. It improves face validity and allows assessors to observe competency-specific behaviour more cleanly.

Best suited for: Middle management selection, graduate schemes, first-line supervisor promotions, and any role requiring information processing under pressure.

Expert note: The in-tray exercise should always be followed by a structured debrief interview where candidates explain their reasoning. The decisions themselves are only half the data.

Exercise 2: Leaderless Group Discussion (LGD)

What it is: A group exercise in which 4 to 8 candidates are given a shared problem or scenario with no designated leader. Assessors observe how participants contribute, influence, listen, challenge, and build consensus without formal authority.

Competencies measured: Teamwork and collaboration, influencing skills, communication, leadership without authority, conflict management, strategic thinking.

How global companies use it: Deloitte UK uses LGDs during consulting intake days, typically presenting a hypothetical client situation. Because no one is assigned a leader role, the exercise surfaces authentic influencing behaviours. Assessors track who initiates, who structures the discussion, who yields unnecessarily, and who holds positions under pressure.

Best suited for: Graduate selection, management development programmes, leadership pipeline identification, and team-based professional roles.

“The leaderless group discussion remains the single most revealing window into how someone actually operates in a team dynamic. It is very difficult to fake for 45 minutes.”  — Dr. Dave Bartram, Chief Psychologist, SHL (formerly), Assessment Matters

Exercise 3: Role Play Exercise (Structured Simulation)

What it is: A role play exercise places the candidate in a one-on-one or small-group scenario with a trained actor or assessor playing a specific role: a difficult client, an underperforming team member, a senior stakeholder. The candidate must navigate the interaction live.

Competencies measured: Interpersonal effectiveness, empathy, active listening, influence, negotiation, conflict resolution, emotional intelligence.

How global companies use it: HSBC uses structured roleplay simulations in its Senior Manager assessment suite, typically framing the scenario as a conversation with a large business client threatening to exit the bank. The scenario design is deliberately ambiguous: the client’s issue contains both a legitimate grievance and an unrealistic demand. This tests whether candidates can de-escalate, find common ground, and protect commercial outcomes simultaneously.

Best suited for: Client-facing roles, HR business partners, sales leadership, customer success managers, and general management.

Exercise 4: Competency-Based Interview (Structured Behavioural Interview)

What it is: Within an assessment centre context, the competency-based interview is a structured conversation anchored in the STAR methodology (Situation, Task, Action, Result), with each question targeting a specific behavioural competency. It is not a general interview; every question maps directly to a pre-agreed competency framework.

Competencies measured: Any competency in the framework. Assessment centres typically allocate 2 to 4 competencies per interview to maintain assessment integrity and avoid data overlap with other exercises.

How global companies use it: Google’s structured interview approach, well-documented through its Project Oxygen research, follows a similar logic: each interviewer in a hiring panel is assigned specific competency areas and uses calibrated rating scales. The result is a multi-assessor, multi-data-point view of the candidate rather than a single interviewer’s general impression.

Best suited for: All levels, but particularly effective for senior leadership roles where track record and past decision-making quality are central evidence points.

Exercise 5: Case Study / Business Case Analysis

What it is: Candidates receive a multi-page business scenario, financial data, market information, or an organisational challenge and are asked to analyse it, develop recommendations, and present their findings to a panel of assessors playing the role of the leadership team.

Competencies measured: Strategic thinking, analytical reasoning, commercial awareness, structured communication, decision quality, stakeholder management.

How global companies use it: McKinsey, BCG, and Bain use live business cases throughout their recruitment process. At the internal talent level, companies like Tata Group use case studies in their senior leadership development centres to evaluate readiness for P&L accountability. The exercise typically involves a 30-minute preparation window followed by a 20-minute presentation and a 15-minute challenging Q&A.

Best suited for: Senior management, C-suite pipeline, strategy roles, and any leadership development centre focused on executive readiness.


Exercise 6: Presentation Exercise

What it is: Candidates are given a topic, briefing document, or problem and asked to prepare and deliver a structured presentation to an assessor panel within a fixed timeframe (typically 10 to 20 minutes), followed by a question session.

Competencies measured: Structured thinking, verbal communication, executive presence, persuasion, composure under pressure, ability to synthesise information.

How global companies use it: PwC uses presentation exercises in its Director and Partner assessment centres, where candidates are asked to present a recommendation on a fictional client engagement. The Q&A session is deliberately probing: assessors introduce new information mid-session to assess adaptability and composure rather than just delivery skill.

Best suited for: Senior hires, leaders requiring board-level communication, commercial roles, and development centres focused on executive readiness.

Exercise 7: Written Communication Exercise

What it is: Candidates are asked to produce a written output under timed conditions: a briefing note, a memo to the board, a policy recommendation, a response to a stakeholder complaint, or an analysis of a scenario. The format mirrors an actual deliverable from the target role.

Competencies measured: Written communication clarity and conciseness, structuring of argument, professional tone, accuracy under pressure.

How global companies use it: The UK Civil Service Fast Stream uses a written exercise as a core component of its final assessment centre, requiring candidates to draft a ministerial briefing from a data pack. The scoring rubric evaluates structure, accuracy, conciseness, and appropriateness of tone separately, ensuring assessors do not conflate writing quality with intelligence or domain knowledge.

Best suited for: Policy roles, legal, compliance, HR, communications, senior management, and any role where written output is a significant part of the day-to-day.

Exercise 8: Analysis Exercise (Data Interpretation)

What it is: Candidates receive a dataset, set of reports, or mixed-format information pack and must analyse it within a time limit, draw conclusions, and either write a recommendation or present findings. This is distinct from a full case study in that the emphasis is on data handling and quantitative reasoning rather than strategic narrative.

Competencies measured: Analytical thinking, numerical reasoning, attention to detail, evidence-based decision-making.

How global companies use it: Amazon uses data analysis exercises in its Operations Manager assessment centres, providing candidates with supply chain or logistics data and asking them to identify inefficiencies and prioritise interventions. The exercise intentionally contains more data than a candidate can process fully, testing prioritisation of analysis as much as the analysis itself.

Best suited for: Operations, finance, data roles, commercial management, supply chain, and any analytically intensive function.

Exercise 9: Stakeholder Meeting Simulation

What it is: A more complex simulation than the one-on-one role play. The candidate must navigate a meeting involving multiple stakeholders (played by assessors or actors) who have competing priorities, different levels of seniority, and conflicting agendas. The candidate must chair or participate in the meeting while achieving a defined outcome.

Competencies measured: Stakeholder management, political awareness, influencing at multiple levels, conflict facilitation, meeting management, strategic communication.

How global companies use it: Procter and Gamble uses multi-stakeholder simulations in its Brand Management assessment centres, presenting a scenario where a candidate must align a cross-functional team (represented by assessors playing marketing, supply chain, and finance) on a product launch decision under budget constraints.

Best suited for: Project managers, programme directors, senior HR business partners, general managers, and organisational development roles.

“Assessment centres are not about catching people out. They are about creating conditions in which a person’s genuine capability has the best possible chance to surface. The assessor’s job is to create that space and observe it without bias.”  — Prof. Ivan Robertson, Occupational Psychology, University of Manchester

Exercise 10: 360-Degree Feedback Integration (Developmental AC)

What it is: In development centre contexts, 360-degree feedback data collected prior to the centre is integrated with live exercise performance. Assessors compare how an individual is perceived by others versus how they actually perform in observed simulations. This triangulation is one of the most powerful diagnostic tools in leadership assessment.

Competencies measured: Self-awareness, alignment between perceived and demonstrated behaviour, leadership impact, developmental areas.

How global companies use it: General Electric’s leadership pipeline has historically used 360-degree data integration in its Crotonville development centre sessions, where leaders are asked to reflect on the gap between their self-assessment, peer assessments, and observed behaviour in exercises. The integration forces a structured reflection conversation that is itself an assessment data point.

Best suited for: Mid to senior leadership development centres, high-potential programmes, and succession planning initiatives rather than external selection.

Exercise 11: Psychometric and Cognitive Ability Testing (Within AC Framework)

What it is: While standalone psychometric tests are not assessment centre exercises, they are routinely integrated into the assessment centre methodology as one data stream among many. The critical difference from standalone use is that psychometric data is never treated as decisive on its own. It is triangulated against exercise performance and interview evidence.

Competencies measured: Cognitive ability, reasoning, personality traits that underpin behavioural tendencies, emotional resilience.

How global companies use it: Shell integrates SHL-developed cognitive ability tests and OPQ personality assessments into its Graduate Assessment Centre as one of six data sources. The scoring model explicitly weights psychometric data at no more than 20% of the overall assessment outcome, ensuring that a high or low psychometric score cannot override strong or weak performance in the simulation exercises.

Best suited for: All levels, but most effective when used as a screening or enrichment tool within a broader AC framework rather than as a standalone gate.
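The capped-weighting logic described above can be expressed in a few lines. The following is a minimal sketch; all exercise names, weights, and scores are invented for illustration and do not reflect Shell's (or any company's) actual scoring model.

```python
# Illustrative composite scoring with a hard cap on psychometric weight.
# All weights and scores below are invented for illustration only.

weights = {
    "in_tray": 0.20,
    "group_discussion": 0.20,
    "role_play": 0.20,
    "case_study": 0.20,
    "psychometric": 0.20,  # capped: must never exceed 0.20 of the total
}
assert weights["psychometric"] <= 0.20
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1

scores = {  # competency-anchored ratings on a 1-to-5 scale
    "in_tray": 4.0,
    "group_discussion": 3.5,
    "role_play": 4.5,
    "case_study": 3.0,
    "psychometric": 5.0,
}

composite = sum(weights[k] * scores[k] for k in weights)
print(f"Composite rating: {composite:.2f}")
```

Because the psychometric weight is capped, even a perfect (or poor) psychometric score can move the composite by at most one-fifth of the scale, which is the design intent the weighting encodes.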

Exercise 12: Virtual Assessment Centre Exercises

What it is: This category encompasses all of the preceding exercise types delivered through digital platforms. Video-based role plays, asynchronous in-tray simulations, online group discussions via video conferencing, digital case studies with AI-assisted scoring, and remote presentation exercises all qualify. Virtual ACs became mainstream during 2020 and have since been refined into a rigorous methodology.

Competencies measured: All competencies measurable in face-to-face formats, with additional data available around digital communication effectiveness, self-management in a remote context, and technology adaptability.

How global companies use it: IBM and Accenture both shifted significant portions of their global graduate and experienced hire assessment to virtual formats post-2020 and have retained them. Accenture’s virtual AC uses AI-assisted observation tools to flag observable behavioural indicators across video-based exercises, which trained human assessors then review and rate. The hybrid model has improved assessor calibration consistency and reduced per-candidate assessment cost.

Best suited for: Any context where geographic reach, cost efficiency, or speed-to-hire are priorities, as long as assessor training and platform quality meet the standards required for fair and valid assessment.

Able Ventures designs and delivers both in-person and virtual assessment centre programmes. Learn more about our Assessment and Development Centre services and how they are structured for Indian and multinational organisations.

3. Comparison Table: 12 Exercises Mapped to Behavioural Competencies

Use this matrix to plan your assessment centre design. A well-designed centre ensures each competency is measured by at least two independent exercises (multi-trait, multi-method principle).

| Exercise | Leadership | Communication | Analysis | Interpersonal |
| --- | --- | --- | --- | --- |
| In-Tray Exercise | Moderate | High | High | Low |
| Leaderless Group Discussion | High | High | Low | High |
| Role Play Exercise | Moderate | High | Low | High |
| Competency-Based Interview | High | High | Moderate | High |
| Business Case Analysis | High | High | High | Moderate |
| Presentation Exercise | Moderate | High | High | Moderate |
| Written Communication Exercise | Low | High | Moderate | Low |
| Analysis / Data Exercise | Low | Moderate | High | Low |
| Stakeholder Meeting Simulation | High | High | Moderate | High |
| 360-Degree Feedback Integration | High | Moderate | Moderate | High |
| Psychometric Testing (within AC) | Moderate | Low | High | Moderate |
| Virtual AC Exercises | Varies | Varies | Varies | Varies |

Note: Competency coverage varies significantly by exercise design and briefing. This table reflects typical usage in well-calibrated assessment centre programmes.

4. How to Choose the Right Assessment Centre Exercise for Different Roles and Needs

Choosing the correct mix of assessment centre exercises is not a matter of preference. It should follow a structured job analysis and competency mapping process. Here is a practical decision framework:

Step 1: Define the Competency Framework First

Every exercise selection decision must start with a clearly defined competency framework for the role. Without this, there is no principled basis for choosing one exercise over another. If your organisation does not have an existing framework, the competency mapping step is not optional.

Step 2: Apply the Multi-Trait, Multi-Method Principle

Each targeted competency should be observable across at least two different exercise types. This reduces measurement error and increases the reliability of the final assessment rating.
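The multi-trait, multi-method check can be made concrete with a small script. The exercise-to-competency mappings below are illustrative assumptions, not a prescribed design; the point is the coverage count, which flags any competency observed by fewer than two independent exercises.

```python
# Minimal sketch of a multi-trait, multi-method coverage check.
# Exercise-to-competency mappings are illustrative, not prescriptive.
from collections import Counter

design = {
    "In-Tray Exercise": ["Planning", "Judgement", "Written Communication"],
    "Leaderless Group Discussion": ["Influence", "Teamwork", "Communication"],
    "Role Play": ["Influence", "Empathy", "Communication"],
    "Case Study": ["Judgement", "Strategic Thinking", "Planning"],
}

target = ["Planning", "Judgement", "Influence", "Communication", "Strategic Thinking"]

# Count how many exercises observe each competency
coverage = Counter(c for comps in design.values() for c in comps)
under_covered = [c for c in target if coverage[c] < 2]

for comp in target:
    print(f"{comp}: covered by {coverage[comp]} exercise(s)")
if under_covered:
    print("Add a second method for:", ", ".join(under_covered))
```

In this hypothetical design, Strategic Thinking is observed by only one exercise and would be flagged, prompting the designer to add a second method (for example, a presentation exercise) before running the centre.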

Step 3: Match Exercise to Role Level

| Role Level | Recommended Core Exercises |
| --- | --- |
| Graduate / Entry Level | In-Tray, LGD, Written Exercise, Psychometric |
| Middle Management | Case Study, Roleplay, Competency Interview, LGD |
| Senior Leadership | Stakeholder Simulation, Presentation, 360 Integration, Business Case |
| Executive / C-Suite | Stakeholder Simulation, Board Presentation, Strategic Case, Psychometric |

Step 4: Consider Delivery Context

If candidate volumes are high and geographic spread is wide, virtual assessment centre exercises offer strong validity with reduced logistical overhead. If the role is senior and requires direct behavioural observation, in-person delivery with trained assessors is still the gold standard.

Able Ventures supports organisations through the full assessment and development centre design process, from job analysis through to assessor calibration and post-centre reporting.

Need a Custom Assessment Centre Design for Your Organisation?

5. Common Mistakes HR Makes When Designing or Running These Exercises

Mistake 1: Selecting exercises before defining competencies. The exercise mix should emerge from the competency framework, not precede it. When HR teams choose ‘a group exercise and a roleplay’ without first mapping what they are measuring, they produce uninterpretable data.

Mistake 2: Over-reliance on a single exercise type. An assessment centre built around one dominant exercise (typically the group discussion) sacrifices the multi-method principle that gives the methodology its validity advantage.

Mistake 3: Untrained assessors. Using line managers as assessors without structured training in the ORCE methodology produces assessor contamination, halo effects, and legally indefensible decisions. This is the single most common quality failure in internally run assessment centres.

Mistake 4: Conflating the exercise briefing with the assessment criteria. Candidates should receive enough context to perform meaningfully in the exercise. Assessors should work from a separate, detailed scoring guide that candidates do not see.

Mistake 5: Ignoring the candidate experience. A poorly run assessment centre with unclear instructions, logistical confusion, or disrespectful handling of candidates destroys employer brand. Candidates talk. A candidate who does not get the role but respects the process becomes an ambassador. One who feels disrespected becomes a detractor.

Mistake 6: No assessor calibration session. Assessors must align their ratings and interpretations before the wash-up meeting. Without calibration, individual assessor biases go undetected and final ratings reflect personality clashes between assessors as much as candidate performance.

6. How to Train Internal Assessors to Evaluate These Exercises Objectively

This section deserves far more attention than it typically receives in assessment centre literature. The quality of an assessment centre is only as high as the quality of its assessors. A well-designed exercise run by a poorly trained assessor panel produces unreliable data. Here is the full training pathway for internal assessors:

Phase 1: Competency Framework Orientation (Half Day)

Assessors must be able to define each target competency in behavioural terms, distinguish between high and low demonstrations, and understand why the framework was structured as it was. This is not a light briefing. It is a structured workshop.

Phase 2: ORCE Methodology Training (Full Day)

Observe: Assessors learn to maintain structured attention throughout an exercise, using an observation record sheet to capture verbatim behavioural evidence.

Record: Evidence is captured in behavioural, not evaluative, language. ‘Candidate proposed three alternative solutions’ is valid evidence. ‘Candidate was creative’ is an inference and must not appear in raw observation notes.

Classify: Each piece of recorded evidence is mapped to the competency it most cleanly demonstrates. Assessors learn to identify overlap and ambiguity.

Evaluate: Using a predefined rating scale (typically 1 to 5 with behavioural anchors), assessors produce an independent rating for each competency before any group discussion.

Phase 3: Practice and Calibration (Half Day to Full Day)

Assessors practise on video-recorded exercises, produce independent ratings, and then compare and discuss discrepancies. This calibration step is where individual bias becomes visible and correctable. Assessors who consistently rate higher or lower than the group norm receive coaching on their anchor interpretation.
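The calibration logic — spotting assessors whose ratings consistently sit above or below the panel norm — can be sketched numerically. The assessor names, ratings, and deviation threshold below are hypothetical; a real programme would set its tolerance from its own rating-scale design.

```python
# Sketch of assessor calibration: flag raters whose mean rating deviates
# from the panel norm by more than a chosen threshold. Data is hypothetical.
from statistics import mean

ratings = {  # independent 1-to-5 ratings on the same recorded exercises
    "Assessor A": [3, 4, 3, 4, 3],
    "Assessor B": [4, 5, 5, 5, 4],  # consistently high
    "Assessor C": [3, 4, 4, 3, 4],
}

panel_norm = mean(r for rs in ratings.values() for r in rs)
THRESHOLD = 0.5  # arbitrary calibration tolerance, for illustration only

for name, rs in ratings.items():
    deviation = mean(rs) - panel_norm
    status = "needs coaching on anchors" if abs(deviation) > THRESHOLD else "within tolerance"
    print(f"{name}: mean {mean(rs):.2f}, deviation {deviation:+.2f} ({status})")
```

Here Assessor B's mean sits well above the panel norm and would trigger the anchor-interpretation coaching described above, while A and C fall within tolerance.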

Phase 4: Live Assessment Centre Support

For first-time assessors, a shadow run alongside an experienced assessor on live exercises builds confidence and ensures fidelity to the methodology before they take on an independent assessment role.

Able Ventures delivers accredited Competency Assessor Certification programmes that train internal HR professionals and line managers to the standard required for defensible, high-quality assessment centre delivery.

“An untrained assessor in an assessment centre is not just ineffective, they are actively harmful. They introduce systematic bias that undermines every piece of valid data collected that day.”  — Dr. Filip Lievens, Professor of Industrial-Organisational Psychology, Singapore Management University

7. How Able Ventures Can Help

Able Ventures is an organisational development and learning practice working with HR teams, L&D functions, and senior leaders across India and internationally. Our work in assessment and development centres covers the full end-to-end process: competency framework development, exercise design, assessor training, live delivery, and post-centre developmental reporting.

Our assessment practice is built on three principles: validity (every tool we use has an evidence base), fairness (every candidate is assessed against the same behavioural criteria), and utility (every assessment generates data that is actionable, not just reportable).

Organisations we work with use assessment centres for graduate selection, internal promotions, succession planning, and leadership development. We design for context: the exercise mix for a manufacturing company selecting first-line supervisors is not the same as the design for a financial services firm identifying its next generation of senior leaders.

If you are rebuilding or launching an assessment centre programme, our behavioural assessment and leadership assessment and coaching pages provide an entry point into how we approach this work.

Talk to Able Ventures About Your Assessment Centre Needs

Frequently Asked Questions

What are the most commonly used assessment centre exercises?

The most widely used assessment centre exercises globally are the in-tray exercise, leaderless group discussion, competency-based interview, and role play simulation. Most well-designed assessment centres use a minimum of four exercises to ensure each target competency is observed across at least two independent methods.

What are assessment centre exercises examples for senior leadership roles?

For senior leadership selection or development, typical examples include a multi-stakeholder simulation, a board-level presentation with live Q&A, a strategic business case analysis, a 360-degree feedback integration session, and a structured competency-based interview targeting leadership and commercial competencies. The exercise design should reflect the actual complexity of the target role.

How do virtual assessment centre exercises compare to in-person ones?

Research published in the Journal of Business and Psychology indicates that virtual assessment centre exercises, when properly designed and delivered with trained assessors, produce validity levels comparable to in-person formats. The key differentiators are platform quality, assessor training for the virtual environment, and exercise design adapted to the constraints of video-based interaction. Virtual formats are not inferior by default. They are inferior only when the methodology is adapted poorly.

How long does a typical assessment centre last?

A single-day assessment centre typically runs between 6 and 8 hours and accommodates 3 to 5 exercises. Multi-day centres are used for senior leadership and executive selection, where depth of assessment across 6 to 8 competencies requires more observation time. Graduate-level assessment centres can be designed to run in 4 to 5 hours when virtual, using asynchronous pre-work to offset the reduction in live exercise time.

What is the in-tray exercise assessment centre format and how is it scored?

The in-tray exercise presents candidates with a simulated inbox of emails, memos, reports, and calendar items. Candidates must work through the material within a fixed time (typically 45 to 90 minutes) and produce a set of decisions, delegations, and written responses. Scoring is competency-anchored: assessors evaluate the quality of prioritisation decisions, the reasoning behind delegations, the appropriateness of written tone, and the accuracy of analysis rather than simply tracking which items were addressed.

What is the difference between an assessment centre and a development centre?

An assessment centre is used primarily for selection and promotion decisions, with final ratings used to make a pass or fail determination. A development centre uses the same exercise methodology but with the primary purpose of generating individual developmental feedback. In a development centre, candidates typically see their own assessor observation notes and participate in a facilitated debrief designed to build self-awareness. The exercise formats can be identical. What differs is the use of the data.

How long does it take to see results from a shift to potential-based hiring?

The most meaningful results typically become visible twelve to eighteen months after a potential-based hire enters a role, once the candidate has had enough time to demonstrate growth and performance in conditions where experienced hires might plateau. Organisations running systematic pilots typically have enough data within eighteen months to make a statistically meaningful comparison between experience-led and potential-led hire cohorts. Culture change in hiring practice takes longer, often two to three years of consistent reinforcement before potential-based criteria become the natural default for hiring panels.

How does India's talent market specifically affect the case for potential-based hiring?

Several factors make the case more urgent in the Indian context. A young and growing workforce means a large proportion of available talent has limited experience by definition, and organisations that cannot evaluate potential are effectively excluding this cohort from consideration. Constrained senior talent supply in high-demand sectors means experience-first hiring concentrates competition among organisations for the same small pool. And the pace of role evolution across technology, financial services, manufacturing, and consumer sectors means that experience in the previous version of many roles has declining predictive value.
