The purpose of the Assessment Research Grants is to contribute, in parallel with internal British Council research activities, to innovation in assessment practice, as well as to the validation of Aptis and other British Council language assessment projects.

The research agenda upon which the grants are premised is based on the modified socio-cognitive validation framework (Weir, 2005; O’Sullivan & Weir, 2011; O’Sullivan, 2014). Proposals using alternative validation frameworks that contribute evidence of the validity of British Council assessments will also be considered. The goal of this research agenda is to build a significant body of substantive evidence of the validity of British Council assessments and to contribute to innovation in language assessment development more generally.

Projects which are more exploratory in nature and utilise instruments other than Aptis will be considered provided the proposer can demonstrate strong potential for impact in the field of language assessment.

Projects will usually run for a period of one year, but projects lasting up to 18 months will also be considered.

Who can apply?

Educational institutions and suitably qualified individuals are invited to apply. To foster our relationships with partners, we also encourage applications from test users interested in understanding how British Council tests affect their institutions. Please note that regardless of the geographical location of applicants, and regardless of where the research will actually take place, all contracts for research projects issued by the British Council are subject to the laws of England and Wales; this is non-negotiable.

Financial support

The research grants support researchers around the world in conducting and disseminating the highest quality research.

Financial support for individual projects will, in principle, be limited to a maximum of £17,500, although it is expected that projects requesting financial support in the region of £10,000 to £15,000 will be most positively considered.

Applications for extensive travel or large items of equipment will not be supported. N.B. Applications should NOT at this stage include requests for funding for conference presentations; if completion of the research project results in acceptance of a paper at an international conference, the British Council will consider supporting the attendance of one of the authors through a separate application.

Areas of interest for 2021

We would like to invite proposals that explore innovation in task types and task construction as well as approaches to scoring/marking.

The following are the five main areas that are of particular interest to the British Council in 2021:  

NEW TASK TYPES

  • New and innovative ways of testing all four skills, particularly those exploiting computer delivery or the use of new technologies to introduce new task types.
  • New and innovative ways of testing grammar and vocabulary.
  • Exploring new ways of designing and implementing integrated tasks.
  • Integration of scenario-based assessment in computer-delivered test systems.

TASK CONSTRUCTION

  • Identifying a range of features which can be manipulated in ways that will have stable and predictable effects on item difficulty.
  • Automated item generation – issues related to ensuring consistent difficulty across items at the item generation stage.

METHODS OF MARKING

  • Develop innovative ways of scoring and reporting, including the use of IRT ability values as the basis for direct score transformation.
  • Investigate the use of technology to enhance the rating of productive skills, including the use of automated scoring and artificial intelligence; of particular interest are proposals integrating human and automated scoring. This could include corpus-driven approaches to spoken grammar and/or incorporating the use of spoken grammar within a rating framework.
  • Explore innovative ways of capturing more contextual information around the test score and incorporating that into diagnostic feedback (e.g. the use of eye-tracking, pupil dilation etc. in estimating the amount of effort/concentration of students while completing test items).
  • Investigate the nature of comprehensibility within task specifications and rating scales for speaking (e.g. phonological control in the CEFR Companion Volume – see below).

TEST DELIVERY IN RESPONSE TO COVID

  • Implications of remote invigilation for online assessments (e.g. on test taker anxiety, security, accessibility, etc.)

USE OF THE CEFR COMPANION VOLUME

We welcome proposals which address the potential implications for assessment of updated materials in the Companion Volume of the Common European Framework of Reference for Languages (Council of Europe, 2020), particularly with respect to their application in new task types and rating scales:

  • New descriptor scales
  • Existing scales updated with additional/revised descriptors
  • Mediation skills
  • Plurilingual/Pluricultural competence 

CORPUS-BASED APPROACHES TO TEST VALIDATION

We invite proposals that make use of the British Council Lancaster Aptis Corpus, the Aptis spoken corpus of test-taker responses elicited through the Aptis speaking test.

TEST IMPACT

  • Test impact on a variety of stakeholders
  • Test washback

Consideration will also be given to other issues of current interest in the fields of applied linguistics and second language acquisition in relation to language assessment.

Previously funded projects

2019

  • Shangchao Min and Hongwen Cai (Zhejiang University and Guangdong University of Foreign Studies), Exploring the cognitive demands of Aptis listening tasks using cognitive diagnostic assessment.
  • Vahid Aryadoust, Guoxing Yu, Nathaniel Owen, Mikako Nishikawa (National Institute of Education Singapore, University of Bristol, Open University, Kyoto University), An Eye-Tracking Investigation of the Relationships between Test Takers’ Attention, Item Difficulty, and Item Type in the Aptis Grammar and Vocabulary Test.
  • Janina Iwaniec, Ana Halbach, Lyndsay Renee Buckingham Reynolds, Miguel Fernández Álvarez (University of Bath), The effect of bilingual schooling in Madrid on SES: The student perspective.
  • Soo Jung Youn (Northern Arizona University), The role of pragmatic variables in predicting the Aptis speaking test difficulty.

2018

Sathena Chan, Daniel Lam and Tony Green (CRELLA, University of Bedfordshire) for their project, which will investigate the textual features and revising processes of EFL and L1 English writers in China in Aptis for Teens Writing Task 4.

Nivja de Jong and Jos Pacilly (Leiden University Centre for Linguistics, Leiden University, the Netherlands) for their project, which will look at new techniques for measuring fluency in speech automatically.

Ute Knoch, Catherine Elder, Jason Fan and Tina Peixin Zhang (University of Melbourne, Australia) for their project to investigate the discourse produced at score levels B2.2 to C2 on the Aptis Advanced Writing Test.

Judit Kormos (Department of Linguistics and English Language, Lancaster University) for her project which will explore time-extension and the second language reading performance of children with different first language literacy profiles.

2017

Trisevgeni Liontou (University of Athens) for her study, which will apply automated analysis techniques to investigate discourse features in the Aptis for Teens Writing test: Evidence from lower secondary EFL students.

Nathaniel Owen (Open University) for Exploring rater behaviour with test-taker responses in Aptis Writing.

Okim Kang (Northern Arizona University) for her study on linguistic features and automatic scoring of Aptis speaking performances.

Carol Spöttl, Eva Konrad, Franz Holzknecht & Matthias Zehentner (Language Testing Research Group, University of Innsbruck) for their project on assessing writing at lower levels: research findings, task development locally and internationally, and the opportunities presented by the extended CEFR descriptors.

Azlin Zaiti Zainal, Ng Lee Luan and Tony Green (Faculty of Languages and Linguistics, University of Malaya) for their study, which looks into the impact of the ProELT 1 training programme and Aptis on Malaysian English teachers’ classroom practice.

KEY DATES FOR ASSESSMENT RESEARCH GRANTS 2021

  • February 2021: Call for proposals
  • 31 March 2021: Applications close
  • April 2021: Preliminary review of applications
  • May 2021: Evaluation and selection
  • June 2021: Notification of decisions to applicants

If additional information about the Assessment Research Grants is required, contact the British Council prior to application at arag@britishcouncil.org.