Assessment Research Grants

The purpose of the Assessment Research Grants is to contribute, together with internal British Council research activities, to the validation of Aptis.

The research agenda upon which the grants are premised is based on the modified socio-cognitive validation framework (Weir, 2005; O’Sullivan & Weir, 2011) and will build a significant body of substantive evidence of the validity of Aptis.

The grants will support projects directly focused on Aptis. Projects which are more exploratory in nature and utilise instruments other than Aptis will be considered provided the proposer can demonstrate the potential impact of their research on Aptis.

Projects will be for a period of one year.


Educational institutions and suitably qualified individuals are invited to apply for funding to undertake applied research projects of relevance to Aptis. To foster our relationship with clients, we also encourage applications from test users interested in understanding how the test impacts their institutions.


Financial support for individual projects will, in principle, be limited to a maximum of £17,500, although it is expected that projects requesting financial support in the region of £10,000 to £15,000 will be most positively considered.

Applications for extensive travel or large items of equipment will not be supported. N.B. Applications should NOT at this stage include requests for funding for conference presentations; if completion of the research project results in acceptance of a paper at an international conference, the British Council will consider supporting the attendance of one of the authors through a separate application.


The research grants support researchers around the world in conducting and disseminating the highest quality research. The areas of particular interest to the British Council in 2016 include:


Studies investigating the attitudes towards, or perceptions of, Aptis among stakeholders (e.g. test takers, commissioning clients etc.).

Studies investigating the relationship between the use of feedback derived from Aptis and the impact on teaching and learning.

Studies investigating innovative methodologies to facilitate interaction between test users and test developers in the planning, execution and interpretation of impact studies.

Studies investigating how different test preparation methods affect scores and attitudes towards Aptis.


Studies providing an explicit framework for justifiable forms of modification to the content of Aptis tests in order to take account of local contextual needs.

Studies investigating the impact of localisation of Aptis on different stakeholders.


Studies focusing on cognitive processing in speaking or writing.

Studies investigating key criteria involving features of performance in speaking and writing at the Common European Framework of Reference for Languages (CEFR) levels assessed in Aptis.

Studies investigating the usefulness of applying automated analysis techniques to investigate lexical thresholds and lexical profiles across the Common European Framework of Reference for Languages (CEFR) levels assessed in Aptis.

Studies investigating the predictive validity of Aptis test scores in specific educational or employment contexts.


Studies exploring the impact of delivery mode on test taker and/or rater performance.

Studies may compare the range of delivery modes currently available in the Aptis testing system. However, preference will be given to studies focusing on new technologies or considering possibilities for innovative new modes of delivery.


Consideration will also be given to other issues of current interest in applied linguistics and second language acquisition as they relate to language assessment.


  • Stephen Bax & Prithvi Shrestha (Open University, UK) for their project which will explore lexical thresholds and lexical profiles across the Common European Framework of Reference for Languages (CEFR) levels assessed in the Aptis test.
  • Sally O'Hagan & Kellie Frost (University of Melbourne) for their project which will examine test taker processes and strategies and stakeholder perceptions of relevance of the Aptis for Teachers speaking test in the Australian context.
  • Parvaneh Tavakoli & Fumiyo Nakatsuhara (University of Reading) for their project which looks at the scoring validity of the Aptis Speaking test: Investigating fluency across tasks and levels of proficiency.
  • Xun Yan, Ha Ram Kim & Ji Young Kim (University of Illinois at Urbana-Champaign) for their project which explores the complexity, accuracy and fluency features of speaking performances on Aptis across different CEFR levels.
  • Nguyen Thi Thuy Minh & Ardi Marwan (National Institute of Education, Nanyang Technological University, Singapore) for their project which seeks to analyse test-takers' pragmatic performance and cognitive processing in the Aptis General Writing Test, Task 4.


  • Yo In’nami and Rie Koizumi (Chuo University, Japan). Factor structure and four-skill profiles of the Aptis test
  • Jesús García Laborda, Marian Amengual Pizarro, Mary Frances Litzler, Soraya García-Esteban and Nuria Otero de Juan (University of Alcalá, Spain). Student perceptions of the CEFR levels and the impact of guided practice on Aptis oral test performance.
  • Carol Spöttl, Franz Holzknecht, Kathrin Eberharter, Benjamin Kremmel and Eva Konrad (University of Innsbruck, Austria). Looking into listening: Using eye-tracking to establish the cognitive validity of the Aptis Listening Test.


  • Khaled Barkaoui (York University, Canada). Roles of delivery mode and computer ability in performance on Aptis writing tasks.
  • Noriko Iwashita, Lyn May and Paul Moore (The University of Queensland, Australia). Features of discourse and lexical richness at different performance levels in the Aptis speaking test.
  • Andrea Phillott and Rosalind Warfield Brown (Asian University for Women, Bangladesh). The influence of sociocultural context on tertiary students’ understanding and performance on Aptis written tests.
  • Susan Sheehan, Peter Sanderson and Ann Harris (University of Huddersfield, UK). Identifying key criteria of written and spoken English at C1.



November 2015 Call for proposals
30 January 2016 Applications close
February 2016 Preliminary review of applications
February – March 2016 Evaluation and selection
March 2016 Notification of decisions to applicants

If additional information about the Assessment Research Grants is required, contact the British Council prior to application.