October 22, 2012 00:00 By Wannapa Khaopa The Nation
But govt undecided on response to assessment by TIMSS
The results of the world’s best-known international maths and science test will be released again at year-end – and academics are questioning whether Thailand will seriously use them to analyse flaws in science and maths education and make improvements, or will merely acknowledge the results without any follow-up measures from the relevant agencies.
The latest results of the Trends in International Mathematics and Science Study (TIMSS) will be announced in December. First conducted in 1995, TIMSS reports every four years on the mathematics and science achievement of fourth- and eighth-grade students worldwide. TIMSS 2011 is the fifth in the International Association for the Evaluation of Educational Achievement (IEA)’s series of international assessments dedicated to improving teaching and learning in mathematics and science. Findings from the survey are used to inform education policy-makers and to improve teaching.
Following previous TIMSS results, other countries have already made changes to their education policies and to science and maths teaching and learning – Germany in particular.
Germany reacted to what it called “the TIMSS Shock”, making major changes to education after its Grade 8 students showed a weak competence profile in TIMSS 1995. That effort has now paid off, as German students have performed better in international tests, according to Eckhard Klieme, a professor of education at the German Institute of International Educational Research.
He and two other representatives, from Hong Kong and Botswana, presented how each country has used TIMSS results at the 53rd IEA General Assembly Meeting, hosted recently by the Office of the Education Council (OEC) in Phuket.
Thai students’ achievement has declined continuously since Thailand joined TIMSS. Thai students scored 525 in science and 522 in mathematics in 1995. The scores dropped to 482 (science) and 467 (maths) in 1999, and fell further to 471 (science) and 441 (maths) in 2007. Meanwhile, the TIMSS average has stayed at about 500.
“We have not seen concrete projects and policies from the Education Ministry to address the declines in the international tests,” Prof Sirichai Kanjanawasee, dean of Chulalongkorn University (CU)’s Faculty of Education, told The Nation at a separate event.
“Thailand has been passive in reacting to the declining results, but other countries and territories, including Korea, Hong Kong and Taiwan, have had experts seriously analyse their TIMSS results and have improved their science and maths education until their students made obvious progress.
NOT ENOUGH ANALYSIS DONE HERE
“Thailand has not done deep enough analysis of its own, beyond what TIMSS itself analyses. As a result, it has not found what exactly affects students’ learning in maths and science, and no concrete projects or policies have been started to solve the problem,” he said.
Sirichai said he had analysed IEA’s Second International Mathematics Study decades ago. He offered to help the ministry conduct multilevel analysis, drawing on his expertise and assistance from CU researchers. Based on data delivered by TIMSS, such analysis would examine the different factors – the environment at schools, in classes and so on – that affect students’ learning and achievement.
“I hope that after December, when we acknowledge the latest TIMSS results together with other countries, Thai educational administrators will be alert to the results and start to react. The Office of the Basic Education Commission and the Institute for the Promotion of Teaching Science and Technology should conduct deeper analysis that will have an impact on teaching and policies. [And] we researchers from Chula are pleased to help,” Sirichai said.
So, what can Thailand learn from other countries’ practices?
Klieme said Germany’s policy reaction to the TIMSS Shock combined more control and more support for schools, teachers and students, built around standards.
“We always thought our students were doing well and that we had good teaching, but in fact there were shortcomings,” he added.
He said national standards had been in place since 2009 for monitoring and school evaluation, with school inspectors and a national indicator-based report. Certification had been based on statewide exit exams, and the country had participated regularly in national and international surveys. Germany then implemented new pedagogical initiatives focused on maths and science.
According to Prof Frederick KS Leung, director of Education and Development for Research Integrity at the University of Hong Kong, Hong Kong’s 2001 mathematics curriculum change was partly based on a study that drew on TIMSS results.
“Our science curriculum has put too much emphasis on the learning of details and has overlooked the learning of principles. We might not have become aware of these shortcomings in our curriculum if we had not participated in the TIMSS studies,” Leung said, quoting a book.
“Education is a complex endeavour – we cannot expect the TIMSS results to produce answers for all our national problems in education,” he said.
“TIMSS is meant for individual countries to find out the truth about their maths and science education and seek improvement in their educational practices,” Leung said, adding that Hong Kong had given books to teachers to learn from TIMSS.
Dr Serara Moahi, executive secretary of Botswana’s Examinations Council, said Botswana responded to TIMSS 2003 recommendations by reviewing its 2007 curriculum and proposing adjustments; related adjustments were then made in the 2010 junior secondary curriculum. Botswana also ran the “Strengthening of Mathematics and Science in Secondary Education” teacher-support programme, in which teachers were taught strategies for teaching learners with diverse learning styles.
She said the TIMSS 2003 report had also shown that Botswana performed poorly in both subjects.