Apparent flaws in the Ordinary National Educational Test (O-Net) have provided another strong reminder that Thailand's education system badly needs improvement.
For years, agencies have lamented the falling average O-Net scores among students in all subjects. But if the O-Net exam papers themselves are poorly designed and riddled with mistakes, how can we expect students to do well?
Thailand Development Research Institute (TDRI) academic Dr Dilaka Lathapipat pointed out that while the PISA test reflected the vision of the 21st century, Thailand’s O-Net was stuck in the 20th century.
“The test-design standards for these two tests are much different,” Dilaka said.
If the test designers cannot provide reliable and efficient tests, teachers and students will lack trusted indicators of their performance. Relevant agencies will also find it hard to identify which areas they should concentrate on to improve the country’s education system.
As part of the country’s educational reform, the National Institute of Educational Testing Service (NIETS) sprang into operation and the O-Net was launched.
The O-Net has also replaced the decades-old, academically rigorous university entrance exams.
From its inception, NIETS talked about developing standardised tests, but years have passed and the results remain far from satisfactory.
Many universities became so worried about the O-Net’s ability to select qualified students for some of their fields that they allocated fewer and fewer seats to the central admission system, which uses O-Net scores as admission criteria.
Apart from such reactions from universities, students and parents have often criticised the O-Net questions too. To them, some questions appear either ridiculous or rather stupid. The multiple-choice options given for each question have also raised eyebrows.
In fact, when the former director of NIETS looked at one O-Net question for Mathayom students this year, she gave an incorrect answer.
Dr Utumporn Jamornmann, who is now an adviser to the Ombudsman, chose “b” when presented with the following choices for the question “If you have a sexual urge, what should you do?”: a) Ask friends if you can play football together; b) Consult family members; c) Try to sleep; d) Go out with a friend of the opposite sex; or e) Invite a close friend to watch a movie together.
According to NIETS director Dr Samphan Phanphrut, the answer was “a”.
But critics pointed out that both “a” and “b” could be seen as correct answers for boys, while for most girls “a” would be a strange option and therefore not a viable choice. Utumporn said good test questions must be clear and in line with facts.
So far, test takers who have sat the O-Net do not believe NIETS has succeeded in ensuring that only good questions appear in the test papers.
Institute for Research and Quality Development Foundation chairman Chainarong Indharameesup said even a minor mistake could ruin the credibility and reliability of the tests.
“If internal panels can’t notice the minor mistakes, recruit the help of outsiders,” he suggested.
He also recommended that test-design panels first try out some questions with a small group of students to gauge children’s reactions.
“In my opinion, tests should use open-ended questions. A few sentences written by students will tell a lot about what they know,” Chainarong said.
He said that when it came to assessing educational quality, agencies should not tighten the budget too much.
“Look at the PISA exam paper; it contains no multiple-choice questions. We have to adjust to improve,” he said.
The Programme for International Student Assessment (PISA), first conducted in 2000, has already won respect around the world. It is a worldwide evaluation of 15-year-old school pupils’ scholastic performance, coordinated by the OECD with 65 countries and economies currently taking part, and is repeated every three years.
TDRI vice-chairman Dr Somkiat Tangkitvanich said he was not an education expert, but he saw a clear need for NIETS to improve its services.
“Let’s do some research on test design. I also think it’s a good idea to publicly release the answers to past test questions and explain the rationale behind them,” he said.
TDRI’s Dilaka said the tests were tools to evaluate not just students’ academic performance but also the performance of teachers and schools.
“Good tests should determine children’s ability to apply knowledge to their daily life,” he commented.
Dilaka said the United States had a clear policy to put emphasis on subjects crucial for the development of the country.
If tests reveal that students’ knowledge of those important subjects is still inadequate, schools are required to reduce learning hours for other subjects and allocate more time to the priority areas.