Independent School Examination

Reimagining examinations

Year: 2022-2023

Company: Century TECH

Client: ISEB

Role: Lead Designer

Users: 14K+

Project Overview

ISEB provides digital entrance examinations used by independent schools during their admissions process.

For more than eight years the organisation had remained the industry standard. During this time, examination content continued to be updated annually and maintained a strong academic reputation.

However, the digital platform itself had seen little structural or visual evolution. As new EdTech competitors entered the market with more modern tools, the gap between ISEB’s trusted assessment content and its ageing product experience became increasingly visible.

This shift was reflected in declining subscription numbers, signalling an urgent need to reassess how the platform supported the wider examination process.

This project explored how the platform could evolve to better support the full examination ecosystem.

Problem

Although the exam journey appears simple, the operational workflow behind it involves multiple organisations and stakeholders.

The platform was designed primarily for applicants completing assessments, leaving much of the administrative process unsupported.

As a result, schools and invigilation centres relied heavily on email, spreadsheets and manual coordination to manage examinations.

This created delays, inconsistencies and a significant administrative burden across participating institutions.

Solution

Research revealed that many of the platform’s challenges came from fragmented communication and manual data handling between stakeholders.

Rather than focusing only on interface improvements, the opportunity was to rethink how the platform supported the wider examination ecosystem.

The proposed direction introduced role-based access for the key participants involved in the examination process — applicants, guardians, invigilation centres, senior schools and ISEB administrators.

Bringing these roles into a shared system enabled secure data sharing, clearer communication and more consistent workflows.

Current Workflow

Expected Platform Flow

Actual Operational Workflow

Early discovery revealed that while the examination process appeared straightforward, the majority of operational tasks occurred outside the platform.

Communication between schools, invigilation centres and administrators relied heavily on manual processes, including email exchanges, spreadsheet exports and manual result declarations.

This fragmentation created delays, inconsistencies and increased administrative burden across all participating institutions.

Research findings

Discovery research highlighted several structural challenges within the examination ecosystem.

Application Overlap

Students frequently applied to multiple senior schools, yet the system lacked a mechanism to track or manage these relationships.

Manual Processes

Senior schools manually onboarded applicants, while many exams were administered through invigilation centres rather than the schools themselves.

Communication Gaps

Invigilation centres lacked direct access to testing tools, requiring exam-related information to be exchanged through email or phone calls.

Operational Burden

Invigilation centres often hired additional staff to manage examinations despite receiving no formal compensation for the administrative workload.

Post-Exam Complexity

Test declarations were submitted manually for each student and distributed to schools via email, creating delays and opportunities for errors.

High-Stress Environment

The high-stakes nature of these examinations created significant pressure on applicants, often resulting in requests for retakes and additional administrative coordination.

Multiple Test Attempts

Due to communication gaps, some students were able to take the same test multiple times by applying through different schools.

Limited System Access

ISEB administrators had no interface to oversee the process, and guardians had no direct access to exam information.

Rethinking the Platform Infrastructure

ISEB’s digital assessment platform had historically been built around a single user group: applicants completing examinations.

However, the wider admissions process involved multiple organisations working outside the platform, coordinating through emails, spreadsheets and manual communication.

As the ecosystem grew, this fragmented structure created increasing operational complexity for schools, invigilation centres and administrators.

Rather than simply improving the applicant interface, the project explored how the platform could better support the entire examination ecosystem.

This required shifting the platform from a single-user testing tool to a shared infrastructure supporting multiple organisations.

Expanding the system actors

The first step was recognising the full set of stakeholders involved in the admissions process.

Beyond applicants, the system needed to support:

  • Guardians managing applications

  • Senior schools linking applicants to their admissions processes

  • Invigilation centres administering assessments

  • ISEB administrators coordinating the examination framework

Mapping these roles revealed how responsibilities and communication flowed between organisations, highlighting opportunities to centralise key processes within the platform.

Introducing a shared applicant record

To support these interactions, the platform architecture was restructured around a shared applicant profile.

This profile became the central system record connecting all participating organisations.

Different actors could access the same applicant data with role-based permissions:

  • Guardians could create and manage applicant profiles

  • Schools could link candidates to applications and review results

  • Invigilation centres could schedule assessments and manage sessions

  • ISEB administrators could oversee accessibility requirements and system governance

By introducing a shared data structure, the platform reduced duplication and enabled organisations to collaborate within a single system.
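The role-based access pattern described above can be sketched as a simple permission map over the shared applicant record. This is an illustrative sketch only: the four roles and their responsibilities come from the case study, but every identifier here (`Role`, `PERMISSIONS`, `ApplicantRecord`, `can`) is hypothetical, not the production system.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Role(Enum):
    GUARDIAN = auto()
    SCHOOL = auto()
    INVIGILATION_CENTRE = auto()
    ISEB_ADMIN = auto()

# Hypothetical permission map: which actions each role may
# perform against the shared applicant record.
PERMISSIONS = {
    Role.GUARDIAN: {"create_profile", "edit_profile"},
    Role.SCHOOL: {"link_application", "view_results"},
    Role.INVIGILATION_CENTRE: {"schedule_assessment", "manage_session"},
    Role.ISEB_ADMIN: {"set_accessibility", "audit"},
}

@dataclass
class ApplicantRecord:
    """The single shared record that all organisations reference."""
    applicant_id: str
    linked_schools: list = field(default_factory=list)

def can(role: Role, action: str) -> bool:
    """Check whether a role is allowed to perform an action."""
    return action in PERMISSIONS.get(role, set())
```

The point of the sketch is the shape, not the detail: one record, many roles, and permissions resolved centrally rather than negotiated over email.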

With these roles connected through a shared applicant record, the platform could now coordinate the operational workflow of examinations across institutions.

Supporting the full examination workflow

With this infrastructure in place, the system could coordinate each stage of the examination workflow across institutions.

Guardians initiate the process by creating an applicant profile and selecting an invigilation centre.
Schools connect applicants to their admissions processes, while invigilation centres manage assessment scheduling and session logistics.

Applicants access the assessment player through a temporary access code provided at the invigilation centre, ensuring secure and time-restricted access to examination content.

Following completion, invigilators submit assessment declarations and results are made available to participating schools through the platform.

This structure allows the entire assessment lifecycle—from applicant registration to result distribution—to be coordinated within a unified system.
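The temporary access-code step can be sketched in the same spirit. The case study specifies only that codes are temporary and provided at the invigilation centre; everything else here is an assumption I have made for illustration, including the validity window, single-use redemption, and all function names.

```python
import secrets
import time

# Assumed two-hour validity window; the real window is not
# stated in the case study.
CODE_TTL_SECONDS = 2 * 60 * 60

_issued = {}  # code -> (applicant_id, expiry timestamp)

def issue_access_code(applicant_id: str, now=None) -> str:
    """Invigilation centre issues a short random code before the session."""
    now = time.time() if now is None else now
    code = secrets.token_hex(4).upper()
    _issued[code] = (applicant_id, now + CODE_TTL_SECONDS)
    return code

def redeem(code: str, now=None):
    """Assessment player exchanges the code for the applicant session,
    or returns None if the code is unknown, already used, or expired."""
    now = time.time() if now is None else now
    entry = _issued.pop(code, None)  # single-use: removed on redemption
    if entry is None or now > entry[1]:
        return None
    return entry[0]
```

Single-use, time-limited codes give the secure, time-restricted access the platform needed without issuing applicants permanent credentials.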

Design Exploration

Early wireframes explored the shared applicant record, accessibility requirements, and cross-organisation workflows.

The final interface unified these workflows into role-specific portals for guardians, schools and invigilation centres.
