By Nepomuceno Malaluan
On Dec. 5 (11 a.m. Paris time), the Organization for Economic Cooperation and Development (OECD) will release the results of the 8th round of the Program for International Student Assessment (PISA). Conducted in the Philippines from March to April 2022, the PISA measures how 15-year-old students perform in reading, science, and mathematics, particularly how they apply their knowledge and skills in these subjects to real-world situations. This is the second time that the country has participated in the international assessment.
Knowing that education quality does not transform overnight, and considering as well the education disruption brought about by COVID-19, we do not expect a significant change in the country’s results for PISA 2022. The dismal performance of the Philippines in PISA 2018 generated a lot of noise and shrill voices on the state of education quality. I hope that this time, education reformers will take a more sober and forward-looking approach to the PISA 2022 results, and focus on how specific analysis and insights from PISA and other large-scale assessments can effectively inform the next steps in addressing the challenge of education quality.
Large-scale assessments (LSAs) form part of the Department of Education’s (DepEd) assessment system for kindergarten to grade 12. They complement classroom formative and summative assessments. National standardized tests, such as the National Achievement Test (NAT), provide a system-level check on whether curricular standards are met at key stages of education progression, while international tests allow us to benchmark against international standards. LSA targets are included as key indicators of system-level basic education quality in the Philippine Development Plan 2023-2028.
The administration of and participation in LSAs entail huge budgetary costs. To justify this investment, LSAs must serve their purpose, and not be relegated to unused data. Picking several low-hanging fruits would make LSAs a smart investment for education quality and post-COVID recovery.
One, LSA results should be integrated into the Enhanced Basic Education Information System (EBEIS) of DepEd. The EBEIS is a web-based information system that contains vast school-level information on attributes, resources, and programs. Data points include school identifiers, teaching and non-teaching personnel, health and nutrition, electricity, sanitation, infrastructure and other resources inventory, and disaster risk reduction and management (DRRM) information. EBEIS easily cross-references with the web-based Learner Information System that tracks enrollment and important learner data.
Unfortunately, for the longest time, LSA data have not been integrated into these information systems. Thus, the correlation of education inputs and school and learner attributes with indicators of learning outcomes has not been developed as a practice in DepEd planning at all levels. During the last DepEd administration, there was a directive to the Planning Service and the Bureau of Education Assessment to integrate school-level LSA data into the EBEIS. I hope this directive is being pursued.
Two, the data should be made available within and outside DepEd for research and analysis. Properly stored and retrievable LSA data must be made accessible, alongside the other relevant datasets in the EBEIS, to maximize their use for research and analysis. DepEd can set reasonable parameters and levels of use, while users should commit to exercising reasonable responsibility in handling the data. For its part, DepEd should allocate an appreciable portion of its research funds to support rigorous and high-quality analysis of large-scale assessment and context data to inform national policy and school-level interventions.
Three, LSA outcomes should be an integral component of the planning, monitoring, and evaluation standards of DepEd. DepEd planning, monitoring, and evaluation parameters are concentrated on education inputs and education access. Outcomes are assumed to follow from fulfillment of the inputs and access targets. This partly explains why LSA results are not integrated into the information system. We need to close the loop on inputs and outcomes in planning, monitoring, and evaluation.
Four, a comprehensive professional development course on assessment should be a compulsory offering for teachers, school leaders, and relevant units. Professional development will be critical in improving teachers’ assessment literacy and content knowledge, which should help them align classroom practice with national assessments, as well as with the emerging literacies measured by international assessments. One such program has been developed by a consortium of DepEd units and outside assessment experts. The program includes the following course titles: Enhancement of Teachers’ Assessment Competencies; Assessment of Learning in DepEd; The Philippine K-12 Curriculum and the ILSA; Adapting Assessment Principles and Practices to the Emerging Literacies; and Monitored Application of Assessment Practice in the Classroom Setting. We hope the program will continue to be offered.
Finally, inter-school exchanges on effective school management and teaching practices, in terms of school-level large-scale assessment performance, should be facilitated and promoted. As important as centralized evidence-based policies are school-based approaches and interventions to address LSA performance. There is no one-size-fits-all approach to education. Exchanges of best practices from top-performing schools can provide ideas, inspiration, and common threads for school-based planning and management.
Nepomuceno Malaluan is a trustee and senior fellow at Action for Economic Reforms. He served as education undersecretary in the last administration.