
Language tests are formal instruments of assessment. They can be used either to measure proficiency without reference to a particular programme of learning or to measure the extent to which learners have achieved the goals of a specific course. The language tests that adult migrants are sometimes required to take (to secure entry to their host country, permanent residence or citizenship) may fall into either of these categories.

Language tests are not necessarily the most appropriate form of assessment to use with adult migrants, especially when they are linked to financial or social sanctions, because they can undermine motivation to learn. In some circumstances, particularly when assessment is associated with a language course, it may be preferable to use an alternative instrument, for example a portfolio. The European Language Portfolio (ELP) is especially suitable for this purpose because it is explicitly linked to the categories of language use and the levels of proficiency described in the Common European Framework of Reference for Languages (CEFR). However, use of the ELP as an assessment instrument requires continuous support from the teacher, especially as self-assessment will not have played a role in the previous educational experience of many adult migrants. Self-assessment should always be supported by evidence of achievement, and its validity is enhanced when it is complemented by other forms of continuous assessment. Alternative forms of assessment are especially useful when certification is localised.

Language tests that are properly designed, constructed and administered have the following advantages:

  • results are standardised and reliable, which means that it is easy to compare candidates across the same or different administrations
  • candidates are assessed with a high degree of independence and objectivity
  • large numbers may be tested in a short space of time
  • test validity helps to ensure fairness

Good practice in test design requires that developers first determine the purpose of their test and the real-world demands it places on test-takers. These real-world demands must then be translated into linguistic requirements: the knowledge and skills that test-takers are likely to need, which can be mapped on to the proficiency levels and “can do” statements of the CEFR. The next step is to produce a test specification, which describes the item and task types to be used, the format of the test, the criteria by which performance will be measured, and other practical matters. The test specification must then be broken down into specific testing points so that a suitable combination of test tasks and task types can be developed. The goal should be to give test-takers adequate opportunities to demonstrate that they meet the assessment criteria. Test development also requires pre-testing of items to confirm that they function as intended.

Language tests should be taken under conditions which are equally fair for all test-takers. This entails that test centres are suitably accredited for the administration of the tests and meet general quality requirements; test centre staff are professionally competent; a high level of security and confidentiality is maintained throughout the testing process; physical conditions in the test centre are appropriate (e.g. noise and temperature levels, distance between candidates); and all necessary arrangements are made for test-takers with special requirements. If not appropriately managed, each aspect of test administration has the potential to infringe the human rights of test-takers.

Objectively marked test items (e.g. multiple-choice questions used in tests of listening and reading) can be accurately scored by machines or by trained markers; subjectively marked items (used to assess speaking and writing) need to be scored by trained raters whose work must be constantly monitored. In general, tests should be kept under continuous review in order to ensure that they test the abilities they claim to test, that those abilities are measured in a consistent way by all versions of the same test, and that each test works in a way that is fair to all test-takers, whatever their background. These issues are clearly of central importance when tests are aimed at adult migrants. So too is the issue of access to the test: requiring adult migrants to pay a fee may act as a disincentive and lead to discrimination.

DL

Related resources

  • Providers of courses for adult migrants – Self-assessment Handbook, 2012, Richard Rossner (enabling language course providers to assess the quality of their programmes for adult migrants)
    EN   FR   IT   SL
  • Quality assurance in the provision of language education and training for adult migrants – Guidelines and options, 2008, Richard Rossner
    EN   FR
  • Language tests for social cohesion and citizenship – an outline for policy makers, 2008, ALTE Authoring Group (Association of Language Testers in Europe)
    EN   FR
    These versions were kindly provided by ALTE members: BG   DE   IT   NO
  • Tailoring language provision and requirements to the needs and capacities of adult migrants, 2008, Hans-Jürgen Krumm, Verena Plutzar
    EN   FR
  • Report on a survey: Language requirements for adult migrants in Council of Europe member states, 2011, Claire Extramiana, Piet van Avermaet, Language Policy Division
    EN   FR
  • Manual for relating Language Examinations to the Common European Framework of Reference for Languages (CEFR)
  • Manual for Language test development and examining for use with the CEFR, 2011, produced by ALTE on behalf of the Language Policy Division, Council of Europe