The story so far

It all began with a Question

Welcome to our journey. From 2015 to this very day, we have invented, invested and re-invented daily to refine, tune and perfect our offerings.


Using a skills-based approach to teaching and learning, how do we transform and sustain a learner’s performance in the classroom and exam hall?

Phase 0:

At the very initial stage (Phase 0), we partnered with Oxford Brookes University to develop an understanding of the characteristics of high academic achievers.

The interview questions targeted specific dimensions of each student, including their:


foresight and clarity with respect to their own strengths and weaknesses,


aspirations and evidence of being able to plan ahead,


emotional balance, and


ability to respond and cope with stress, anxiety and pressure.

The analysis of the interviews revealed four characteristics that the higher performing pupils in this cohort tended to manifest.

These included the ability to:


retain and recall the content of their modules and assignments with relative ease and with little stress or effort;


manage their time, showing a clear set of organisation skills such as prioritising, planning, scheduling and forecasting;


schedule their social life around their academic ambitions, with the discipline to prioritise their academic work when needed;


channel anxiety and cope with pressure in a positive manner, e.g. by increasing their effort to succeed.

The results of the initial pilot were encouraging and led the company to invest in trialling its approach in primary and secondary schools in the UK.

This next phase (henceforth Phase 1) also consisted of a more systematic definition and refinement of the psychological and behavioural dimensions identified during Phase 0, and of an improvement of PL's assessment procedures and methods.

With respect to the latter, the company invested in the development of the Performance Learning Online Analysis (PLOA) tools, which served both to record and to analyse students’ responses with respect to 27 traits identified during this phase as common to pupils across the full range of academic ability (lowest to highest performing). 

The 27 traits were determined through further research, mainly in secondary schools in the UK. The traits described a pupil's present PL diagnosis regarding grades, attitude to learning, behaviour, class attendance and participation. They also provided information about students' mindset towards their learning, possible barriers to their learning, their response to different learning environments (e.g. home or school) and their general well-being.

These traits were then further segmented into PL's five assessment risk levels, ranging from assessment level 1 (extremely high risk) to assessment level 5 (no risk), where risk was defined in terms of the degree to which a learner is believed to be able to reach a target or predicted grade.
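As a purely illustrative sketch (the actual PLOA scoring rules, band thresholds and intermediate level labels are PL's own and are not given in this document), an estimated likelihood of a learner reaching a target grade could be mapped onto the five risk levels like this:

```python
# Hypothetical illustration only: level 1 and level 5 labels come from
# the text; the intermediate labels and all cut-offs are assumptions.
RISK_LEVELS = {
    1: "extremely high risk",
    2: "high risk",        # assumed label
    3: "moderate risk",    # assumed label
    4: "low risk",         # assumed label
    5: "no risk",
}

def risk_level(likelihood: float) -> int:
    """Map an assumed 0.0-1.0 likelihood of reaching a target or
    predicted grade to one of the five assessment risk levels,
    using illustrative, evenly spaced bands."""
    if not 0.0 <= likelihood <= 1.0:
        raise ValueError("likelihood must be between 0 and 1")
    if likelihood < 0.2:
        return 1
    if likelihood < 0.4:
        return 2
    if likelihood < 0.6:
        return 3
    if likelihood < 0.8:
        return 4
    return 5

print(risk_level(0.1), RISK_LEVELS[risk_level(0.1)])   # 1 extremely high risk
print(risk_level(0.95), RISK_LEVELS[risk_level(0.95)]) # 5 no risk
```

The point of the sketch is only the shape of the mapping: a continuous judgement about a learner is bucketed into five discrete, named risk bands.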

In this phase, PL's assessment procedures were also refined, with the PLOA assessment being conducted at the start of a pupil's PL curriculum, mid-way through and at the end, to establish any changes in pupils' self-assessment over time.

During the assessment, students answer a set of 64 questions, with each response being placed in a report outlining their behavioural and psychological characteristics. The PLOA tool offers its assessment back to the students and teachers in the form of a score for each psycho-behavioural category assessed. It also offers each pupil a set of behavioural targets to achieve over the course of PL's curriculum.
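To make that flow concrete, here is a minimal sketch of rolling question responses up into per-category scores. Both the 1-to-5 answer scale and the question-to-category mapping are assumptions for illustration; the actual PLOA questions, scales and scoring are not described in the source.

```python
from collections import defaultdict

def category_scores(responses, question_category):
    """Average each question's answer (assumed 1-5 scale) within its
    psycho-behavioural category. The mapping and scale are illustrative
    assumptions, not PLOA's actual scheme."""
    grouped = defaultdict(list)
    for question_id, answer in responses.items():
        grouped[question_category[question_id]].append(answer)
    return {cat: sum(vals) / len(vals) for cat, vals in grouped.items()}

# Toy four-question subset standing in for the 64-question assessment:
mapping = {1: "time management", 2: "time management",
           3: "emotional balance", 4: "emotional balance"}
answers = {1: 4, 2: 2, 3: 5, 4: 3}
print(category_scores(answers, mapping))
# {'time management': 3.0, 'emotional balance': 4.0}
```

The per-category averages are what a report like the one described above would surface back to the student and teacher.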

The students are grouped according to their needs and risk levels. Face-to-face sessions are delivered to each group on a weekly and then fortnightly basis, with a view to gradually guiding the learners into a habit of independent, critical and regular self-appraisal, goal-setting and action. As well as undertaking regular PLOA assessments, pupils receive a printed manual outlining each lesson.

The PL lessons are geared towards each pupil reaching their target PLOA score, with the teacher scoring the pupil within the system at the end of each lesson to assess the pupil's progress towards their PLOA targets.

In this phase, two schools participated in the trial, with data from 412 pupils across the two secondary schools included in the analysis. Of those, a total of 113 pupils participated in PL's programme either fully (over a nine-month period) or partially (over a six-month period): 37 pupils from one school and 76 from the other. Descriptive analysis was conducted on the data from each school to ascertain any changes in the grades obtained by the PL student cohort following the PL curriculum, as compared to those predicted for them before commencing the intervention.

The final grades were also compared to the grades obtained by the students who did not participate in the programme (PL Nil). In both schools, PL students achieved higher grades than predicted across the core subjects (English, Maths and Science), and, in one of the schools, also in Science as an additional subject. In both schools, the improvements in performance were particularly noticeable in English, where the percentage of PL students who achieved one or more grades higher than predicted was nearly double that of students in the PL Nil group (see Fig. 1 and Fig. 2 respectively).

Crucially for PL's overarching vision, the improvement in performance of the PL cohort was even more pronounced for students on free school meals: the percentage of students who achieved one or more grades higher than predicted was four times that of their PL Nil peers in English, and around three times in Maths.

Phase 2:

The focus is presently on automating and further refining PL's approach to pupil assessment. Working with University College London's Knowledge Lab, PL sees a particular opportunity in mining the data generated to help:


better understand the behavioural patterns of relevance to self-reflection and self-regulation, and


inform further development of PL’s technology, mainly focusing on real-time modelling of learners’ behaviours and metacognitive competencies.

To date, PL’s Phase 2 has brought about significant enhancements in its:


method of assessment, to allow the students to self-assess using non-discrete social, emotional and mental categories in a way that captures the nuance of their states; this is achieved through a modal interface as shown in Fig. 3;


frequency of assessment, which is now conducted at the beginning and at the end of each PL lesson, whereby the learner is asked fundamental questions about individual lessons to assess their motivation, their understanding, and their willingness and likelihood of applying the skill in their wider academic and personal contexts;



delivery of lessons: each lesson is now delivered digitally through game-like interactions, with the system tracking and recording data such as time on task, accuracy, completion attempts and quantity of usage;


volume and nature of data collected, gathered from the teachers' assessments of the individual pupils as well as pupils' self-assessments, providing a unique opportunity for a systematic comparison between the two perspectives;


expanded set of behavioural traits, along with the definition of a scoring mechanism for modelling and qualifying students' behaviours along a spectrum of their strengths and weaknesses.

In this phase, the assessment categories have been expanded from the original 27 to 35 to enable the system to qualify pupils' self-assessments and teachers' assessments along a spectrum of students' strengths and weaknesses. Furthermore, the initial PLOA assessment categorises the pupil along a set of behaviours, allocating a score per behavioural category and a definition explaining the category and the specific score (see Fig. 4).

This, in turn, provides a concrete basis for the pupils’ reflection and gives teachers the ability to check their “gut feeling” perceptions and assessments of their students.

As such, the definitions provide a common ground for both the teachers and the learners to discuss the individual assessments in a way that is targeted, systematic and inspectable over time.
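The score-plus-definition record described above might look, hypothetically, like this. The category names, the 1-to-10 scale and the definition texts are all invented for illustration; only the overall structure (a score per behavioural category, paired with an explanatory definition) comes from the source.

```python
from dataclasses import dataclass

@dataclass
class CategoryAssessment:
    """One entry of a pupil's PLOA-style profile (illustrative shape)."""
    category: str    # one of the behavioural categories, e.g. "memory"
    score: int       # assumed 1-10 scale; the real range is not stated
    definition: str  # text explaining the category and this score

# Invented example profile a teacher and pupil could discuss together:
profile = [
    CategoryAssessment("sleep management", 7,
                       "Usually keeps a regular sleep routine on school nights."),
    CategoryAssessment("memory", 4,
                       "Recalls recently taught content, but with noticeable effort."),
]

for entry in profile:
    print(f"{entry.category}: {entry.score}/10 - {entry.definition}")
```

Pairing each score with a shared written definition is what gives teachers and learners the common, inspectable ground the text describes.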

The expanded set of Phase 2 behavioural categories is based on an in-depth analysis of existing literature which both provides research evidence for the categories and explains their nature and relationship to metacognitive competencies and academic success.

Of particular interest here are four broad psycho-behavioural domains requiring self-regulatory control to support successful cognitive performance and learning. These domains include:


sleep management,


outcome-oriented mindset,


memory, and



While distinct, these domains are interrelated and mutually impacting.

Reflecting on Phases 0 to 2:

The work undertaken was essential to creating solid, evidence-based foundations. While laborious and time-consuming, the iterative research and development methodology adopted at every stage ensured that a unique product was developed.

The data generated through the technology was invaluable to future development. Specifically, the platform was designed so that its data could be mined exhaustively in relation to well-refined research questions of pertinence to metacognition and learning achievement.

Work was undertaken to gather evidence of the effectiveness of the selected approach and the real educational value offered. Data was analysed as a basis for developing artificially intelligent learner modelling tools, which were to serve as the foundation of a full AI-driven adaptive assessment and feedback tool. Learner modelling was quickly identified as a key component of intelligent tutoring systems and intelligent learning environments.

This phase of work provided a position from which to start developing such modelling capabilities, resulting in the delivery at scale of good-quality, effective support: for learners, personalised and tailored to each individual's needs; and for teachers, informative about their learners at a fine-grained level of detail.

We had our work reviewed by UCL, who said that 'the three phases of research and development completed by PL Education were core to the production of a strong foundation for AI components'.

They said that these phases of research ‘involved brokering strong relationships between academic researchers and a commercial organisation. This brokerage is a key to taking AI research to scale and demonstrating its impact on learning.’

During the partnership with UCL, we progressed together through three stages of refinement, validation and technological implementation. Each stage produced key conclusions that informed the following iteration.

The results of the first intervention, involving 28 subject-independent lessons delivered on a one-to-one basis using paper-based training materials, were encouraging, with the 14 PL students outperforming their peers who were not using PL.

The results of the initial phase helped to shape the development of the PLOA tools, which recorded and analysed students' self-assessment responses with respect to the 27 traits identified as common to pupils across the full range of academic ability. Again the results were encouraging, with PL learners achieving higher grades than predicted across their core subjects, and with disadvantaged learners performing particularly well.

Following this, further refinement of pupil assessments commenced, along with an in-depth analysis of existing literature, both to provide research evidence for the behavioural categories in the assessment and to explain their nature and relationship to metacognitive competencies and academic success.

The PL tools have the potential to enable both learners and teachers to articulate their assessments of the pupils in an individualised way and in relation to factors that are fundamental to learning, including lifestyle changes such as sleep and self-management, to improve learners' attitudes to learning and attainment.

The goal is to instil a habit, in both teachers and learners, of regularly reflecting on key factors, as such reflection is known to lead to targeted planning and action, and ultimately to better learning outcomes.

The PL approach also provides a very concrete basis for inspection, verification and discussion, with real-time assessment of the pupil at the end of each lesson providing the teacher with real-time feedback on the pupil's mindset, reactions and understanding of academic content.

Phase 3:

Through the existing engagement with the UCL Knowledge Lab at the UCL Institute of Education and our software development partners, and through the hiring of two headteachers who took a secondment from their headships to join us, we began to develop the following areas of the model produced in the last phase of research and development.

The importance/weighting of the variables within the context of the prescribed learner journey, which links directly to the weightings within the machine-learning algorithm.

The accuracy of the predictor algorithm and its feedback loop to the user (the learner, the learner's guardians/parents and the teachers).

Ensuring the Learning Management System (LMS) is fully configurable by an administrator.

The differences necessary in the assessments used for secondary schools, further education (F.E.) colleges and higher education universities; specifically, the development of multiple user interfaces, each with a unique user experience.
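Purely as an illustration of how per-variable weightings shape a prediction (PL's actual machine-learning model, its features and its learned weights are not described here), a linear weighted sum over behavioural variables might look like:

```python
def predict_score(features, weights):
    """Illustrative linear predictor: a weighted sum of behavioural
    variables on an assumed 0-1 scale. This only demonstrates why the
    weighting of each variable matters; it is not PL's real model."""
    missing = set(weights) - set(features)
    if missing:
        raise KeyError(f"missing features: {sorted(missing)}")
    return sum(weights[name] * features[name] for name in weights)

# Toy example with two invented behavioural variables:
weights = {"sleep_management": 0.4, "memory": 0.6}
features = {"sleep_management": 0.5, "memory": 0.8}
print(round(predict_score(features, weights), 2))  # 0.68
```

In a sketch like this, adjusting a weight directly shifts how strongly that variable influences the predicted outcome, which is why calibrating the weightings against real learner journeys is listed as a development priority.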

Our Product Journey

Version 1.0

Version 2.0

Version 3.0

Version 4.0