Republic of the Philippines
Department of Education
REGION V - BICOL

October 5, 2021

REGIONAL MEMORANDUM
No. 101, s. 2021

REVISED GUIDELINES ON THE CONTEXTUALIZED SCHOOL-BASED MANAGEMENT ASSESSMENT PROCESS AND TOOL (SBM-APAT)

To: Schools Division Superintendents
Regional Functional Division Chiefs
Division SBM Coordinators
All Others Concerned

1. In light of the new normal and in reference to Regional Memorandum No. 67, s. 2019, titled "Implementing Guidelines on the Contextualized School-Based Management (SBM) Assessment, Process, and Tool (APAT)," the Department of Education (DepEd) Region V, through the Field Technical Assistance Division (FTAD), hereby issues the revised guidelines on the assessment and validation of the SBM Level of Practice.

2. For uniformity in assessing and validating the level of practice of every school in the region, the latest guidelines on the use of the contextualized SBM-APAT shall be utilized in all divisions of this region.

3. The Schools Division Superintendents (SDSs) are hereby directed to proceed with the activities related to the implementation of the SBM assessment, following the attached guidelines.

4. A reorientation and advocacy campaign on the guiding principles of the contextualized SBM-APAT shall be conducted before undertaking the assessment proper.

5. Provisions contained in previous Regional issuances that are inconsistent with this Memorandum are hereby repealed.

6. The process flow for the different levels of assessment, the scoring sheets, and the templates are found as enclosures to this Memorandum.

7. For any concern, please contact Dr. Evangeline A. Saculo, Chief, Field Technical Assistance Division (FTAD), DepEd ROV, Rawis, Legazpi City, at CP No. 09662068937, or send a message to [email protected] / [email protected].

8. Immediate dissemination of and strict compliance with this Memorandum are enjoined.

GIL
Regional Director

Encl.: As stated
References: DepEd Order No. 83, s. 2012
To be indicated in the Perpetual Index under the following subjects:
ASSESSMENT   PROGRAMS   FUNDS, PROJECTS

FTAD/easfoc 10/05/2021

Enclosure No. 1 to Regional Memorandum No. 101, s. 2021

REVISED GUIDELINES ON THE CONTEXTUALIZED SCHOOL-BASED MANAGEMENT ASSESSMENT, PROCESS AND TOOL (SBM-APAT)

I. RATIONALE

School-Based Management (SBM) has been embedded in the Governance of Basic Education Act of 2001 to provide the guidance and enabling mechanism for the governance and operations of schools. Since its inception in 2012 through DepEd Order No. 83, s. 2012, schools have used the mechanism to assess their level of practice and to maintain and sustain their improvement efforts and practices. This has resulted in a self-directed, self-sustaining, and much improved system of governance. However, since the issuance of that DepEd Order, and with the turn of events in the education landscape and in health emergencies, nothing has been done to adjust the mechanism and guidelines to make them better suited to school contexts.
Anchored on the provisions of DepEd Order No. 83, s. 2012, or the Implementing Guidelines on the Revised School-Based Management (SBM) Framework, Assessment Process and Tool (APAT), DepEd Region V has come up with a contextualized tool that specifies all the possible Means of Verification (MOVs) to support each indicator in assessing the Level of Practice of the school, as embodied in Regional Memorandum No. 67, s. 2019, titled "Implementing Guidelines on the Contextualized School-Based Management (SBM) Assessment, Process and Tool (APAT)." However, said guidelines need to be adjusted in order to respond to the new normal brought about by the COVID-19 pandemic; hence the development of these revised guidelines on the SBM-APAT.

These guidelines are issued to provide the rules and clarifications on the use of the Contextualized School-Based Management (SBM) Assessment Process and Tool (APAT) by all public elementary, secondary, and integrated schools in DepEd Region V in light of the pandemic.

II. COVERAGE

These guidelines shall cover the:

A. Revised implementing guidelines in light of the COVID-19 pandemic;
B. Contextualized SBM-APAT initiated by DepEd Region V along the principles of Leadership and Governance, Curriculum and Learning Resource, Accountability and Continuous Improvement, and Management of Resources; and
C. Scoring Sheets and Templates.

III. DEFINITION OF TERMS

A. Contextualized Tool - refers to the SBM Assessment Process and Tool initiated by DepEd Region V, which specifies particular Means of Verification (MOVs) for each indicator to assess the SBM-APAT Level of Practice of the schools in the region.

B. Means of Verification (MOV) - refers to all documents, artifacts, and other evidence that support the systems, processes, and accomplishments of the school along the various indicators of the contextualized tool.

C. SBM Validation Process - the step-by-step procedure of validating the SBM Level of Practice of a particular school following the D-O-D (Document Analysis, Observation, Discussion) process.

D. Focus Group Discussion - a small group meeting conducted to discuss the process of SBM validation among the SBM Validating Team and the school's internal and external stakeholders, to corroborate documents and clarify the processes and systems that support the claims for the SBM Level of Practice of the school.

E. Internal Stakeholders - refer to the school administration, faculty and staff, as well as parents and learners.

F. External Stakeholders - refer to partner agencies, Local Government Units, Non-Government Organizations, People's Organizations, Alumni Associations, and other community organizations.

G. Learner - a child who attends classes in any level of the basic education system, under the supervision and tutelage of a teacher or facilitator.

H. Emergent Saturation Sampling - a sampling technique that counts on new opportunities to recruit participants or to gain access to a new site which may develop after the fieldwork has begun; it is also called opportunistic or emergent sampling.¹ As used in the SBM validation process, this sampling technique means data sufficiency: the validating teams ensure that the data and documents collected are enough to satisfy the validity of the claims and of the rating or scores. It also means that SBM Validating Teams can reasonably assure that further document collection and validation would yield similar results and serve to confirm the analysis and conclusions.
When Regional SBM Validating Teams can claim that they have collected enough documents to achieve the percentage weights per principle, they should report how, when, and to what degree they achieved data saturation.

I. Snowballing - Snowball sampling is a technique in which research participants recruit other participants for a test or study. It is used where potential participants are hard to find. It is called snowball sampling because, in theory, once you have the ball rolling, it picks up more "snow" along the way and becomes larger and larger.² As used in this regional SBM validation process, the snowball technique means the creation of a pool of participants for onsite and/or face-to-face focus group discussions, as the case may be, one-on-one interviews, or other validation schemes that involve key informants. The pool may be composed of participants recommended by the Division SBM Team or any other potential participants.

¹ ² Khalifa Elmusharaf, "Qualitative Sampling Techniques," Training Course in Sexual and Reproductive Health Research, Geneva, 2016.

J. SBM Coordinating Team - refers to the group that shall validate the SBM Level of Practice of the school, composed of the following:

1. School Level
   Chair: School Head
   Members: Faculty members in charge of the SBM Principles; School Governing Council

2. Division Level
   Chair: Schools Division Superintendent or his/her authorized representative
   Co-Chair: Chief Education Supervisor (SGOD)
   Members: Division SBM Focal Person; other SDO officials in charge of the SBM Principles

3. Regional Level
   Chair: Regional Director or his/her authorized representative
   Co-Chair: Chief, Field Technical Assistance Division
   Members: Regional SBM Focal Person; other ROV officials in charge of the SBM Principles

4. National Level
   Chair: Undersecretary for Regional Operations
   Co-Chair: National SBM Focal Person
   Members: Technical Working Group from the Central Office; Quality Assurance/Monitoring and Evaluation Team; Core Group on SBM-PASBE; Planning Service Team

K. SBM Principles - refers to the four core principles of a school system that guide the SBM processes, as follows:

1. Principle of Leadership and Governance. A network of leadership and governance guides the education system to achieve its shared vision, mission, and goals, making them responsive and relevant to the context of diverse environments.

2. Principle of Community-based Learning. The curriculum and the learning systems, anchored on the community and learners' contexts and aspirations, are collaboratively developed and continuously improved.

3. Principle of Accountability for Performance and Results. A clear, transparent, inclusive, and responsive accountability system is in place, collaboratively developed by the school community, which monitors performance and acts appropriately on gaps and gains.

4. Principle of Convergence to Harness Resources for Education.
Resources are collectively organized, judiciously mobilized following ethical standards, and managed with transparency, effectiveness, and efficiency to support the targeted educational outcomes.

L. Performance Indicators - refer to indicators of performance based on actual data about the learners, the teachers, the facilities, and the like. In assessing the SBM Level of Practice, these performance indicators reflect school performance along access, efficiency, and quality (learning outcomes). As used in these contextualized guidelines, the indicators are construed simply as follows (an illustrative computation appears after this list):

1. Enrolment Rate - the percentage of learners who enrolled in a particular school year compared to the previous year's enrolment. For this purpose, the enrolment data shall cover the current enrolment over the previous year's, from Kindergarten to Grade 6 in the elementary, Grade 7 to Grade 10 in the Junior High School, and Grade 11 to 12 in the Senior High School.

2. Dropout Rate - the percentage of learners or pupils who left school during the year and did not finish the curriculum year for the particular grade or year level. This includes learners who transferred to another school but did not finish, or who dropped out of school, within the same curriculum year.

3. Cohort-Survival Rate - the proportion of enrollees at the beginning grade or year who reach the end of the required number of years of study, multiplied by 100 percent.

4. Promotion Rate - the percentage of learners who are promoted to the next grade level.

5. Learning Outcomes - the average of the final ratings of all students from Kindergarten to Grade 6 in the elementary and Grade 7 to 12 in the secondary. The coverage of the ratings shall be by grade level and by subject area.
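For clarity, the sketch below (in Python) illustrates how the rates defined above are commonly computed as percentages. It is only an illustration of the plain-language definitions in this enclosure; the function and variable names, and the sample figures, are the sketch's own and are not prescribed by the tool.

```python
# Illustrative computation of the performance indicators defined above.
# The formulas follow the plain-language definitions in this enclosure;
# the names and sample numbers are illustrative only.

def enrolment_rate(current_enrolment, previous_enrolment):
    """Current SY enrolment as a percentage of the previous SY enrolment."""
    return current_enrolment / previous_enrolment * 100

def dropout_rate(dropouts, enrolment):
    """Learners who left and did not finish the curriculum year, as a percentage of enrolment."""
    return dropouts / enrolment * 100

def cohort_survival_rate(finishers, beginning_cohort):
    """Proportion of the beginning-grade cohort still enrolled at the end of the
    required number of years of study, multiplied by 100."""
    return finishers / beginning_cohort * 100

def promotion_rate(promoted, enrolment):
    """Learners promoted to the next grade level, as a percentage of enrolment."""
    return promoted / enrolment * 100

def learning_outcome(final_ratings):
    """Average of the final ratings of all learners covered (by grade level and subject area)."""
    return sum(final_ratings) / len(final_ratings)

# Hypothetical figures for one school
print(round(enrolment_rate(412, 400), 1))        # 103.0
print(round(cohort_survival_rate(180, 200), 1))  # 90.0
```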
IV. POLICY STATEMENT

DepEd Region V recognizes the importance of School-Based Management and the corresponding level of practice that schools may attain as an articulation of shared governance, shared responsibility, and accountability of the school and community.

DepEd Region V believes that, in keeping with the current slogan "Education must continue," schools should be guided on how to make adjustments in light of the pandemic; that schools shall continue to operate and improve accordingly; and that their efforts to improve are most critical to school success in this time of pandemic. The efforts of schools in stepping up to elevate their level of SBM practice, consistent with the guidelines set forth in DepEd Order No. 83, s. 2012, shall be acknowledged and recognized accordingly. To achieve this, a contextualized tool and the corresponding processes, procedures, and structures need to be designed or revised.

V. PROCEDURE

A. SBM Assessment Process

Purpose of Assessment. The SBM assessment is administered to determine the depth of SBM practice alongside the principles of ACCESs (A Child- and Community-Centered Education Systems). It is conducted to determine the profile of schools which need assistance or recognition. Schools with effective practices are recommended for awards and recognition as well as for benchmarking. Schools that fall short of the expectations along SBM shall be given priority for technical assistance.

A.1 Self-Assessment (School Level)

Pre-assessment

Step 1: Organize a team of faculty members, parents and other external stakeholders, and non-teaching personnel, preferably the one in charge of finance. The School Head acts as team leader, a teacher as secretary, and the rest of the members shall gather and organize documents and other evidence under each SBM indicator in the contextualized assessment tool.

Step 2: Let team members select the area (principle) they want to assess. There should be at least two members in charge of each principle. For schools with fewer than 10 teaching and/or non-teaching personnel, a teacher may be a member for one or all of the core principles.

Step 3: In the pre-assessment meeting, decide whether to use the whole or the part method. In the whole method, all team members work as one group to validate one principle after another. In the part method, at least two (2) members are assigned to every principle. The team leader acts as coordinator and facilitator while the secretary acts as documenter. The team should study the Contextualized Assessment Process and Tool, especially the D-O-D (Document Analysis, Observation, and Discussion) process and the use of the Scoring Templates.

A.2 Validation Procedure (D-O-D)

Step 4: Conduct Document Analysis. Obtain and assemble all existing artifacts related to the indicator being assessed. Artifacts are the things used by the school community to achieve educational goals, e.g., lesson plans, annual reports, assessment tools, test results, community learning centers, computers, organizational charts, development plans, things made by the learners, and the like. Evaluate the validity or truthfulness of each artifact against the RACCS criteria, namely (a simple record-keeping sketch for this step is given below, after the D-O-D operational principles):

* Relevance. The evidence must be appropriate to the indicator being assessed.
* Accuracy. The evidence must be concise, correct, and accurate.
* Currency. The evidence must be present, existing or actual, and within the time frame.
* Consistency. The evidence must be verifiable, must generate the same results from most of the sources, and must be consistent across the indicators.
* Sufficiency. The evidence must be adequate and supportive of the indicators.

Who conducts the D-O-D? A school assessment committee conducts the D-O-D if the assessment is school-initiated. A Division Assessment Committee conducts the D-O-D if the assessment is Division-initiated or is requested by the school.

What operational principles guide the D-O-D process?

* Collaboration. The assessors work as a team. Leadership is shared, decisions are made by consensus, and every member is accountable for the performance of the team.
* Transparency. The validation of evidence is open to stakeholders' view and review.
* Confidentiality. Information obtained from the D-O-D process that may prejudice individuals, groups, or the school is handled judiciously and in accordance with the Data Privacy Act.
* Validity. Documentary analysis and observation are rigorous in procedure and demanding in quality of results.
* Reform-oriented. The D-O-D comes up with informed recommendations and action programs that continuously move the school to higher levels of practice.
* Principle-oriented. The D-O-D is guided by the ACCESs principles.

Stakeholders' satisfaction is an existing growth experience. The analysis of documents, artifacts, and processes unfolds the progress made, the objectives achieved, the new techniques developed, the best practices mainstreamed, and the prizes won despite limited resources and physical, social, and political constraints.
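As a record-keeping aid for Step 4, the sketch below shows one possible way for a team to log each artifact against the RACCS criteria before moving on to Step 5. It is a minimal illustration under its own assumptions; the class, field, and indicator names are not part of the prescribed tool, and the pass/fail rule shown is only one way a team might summarize its consensus.

```python
# Minimal, illustrative log of artifacts reviewed against the RACCS criteria
# (Relevance, Accuracy, Currency, Consistency, Sufficiency). Names are the
# sketch's own, not prescribed by this Memorandum.

from dataclasses import dataclass

RACCS = ("relevance", "accuracy", "currency", "consistency", "sufficiency")

@dataclass
class ArtifactReview:
    indicator: str            # SBM indicator the artifact is offered for
    artifact: str             # e.g. "School Improvement Plan"
    relevance: bool = False
    accuracy: bool = False
    currency: bool = False
    consistency: bool = False
    sufficiency: bool = False

    def passes(self) -> bool:
        """One possible rule: treat the artifact as valid evidence
        only if the team ticks all five RACCS criteria."""
        return all(getattr(self, criterion) for criterion in RACCS)

# Hypothetical entry agreed on by the team
review = ArtifactReview("Leadership and Governance 1.1", "School Improvement Plan",
                        relevance=True, accuracy=True, currency=True,
                        consistency=True, sufficiency=True)
print(review.passes())  # True
```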
Step 5: Collect and analyze evidence horizontally (by subject) and vertically (by year and grade level) to ensure content validity, and then synthesize the results of the document analysis.

Step 6: Conduct online or onsite observation to obtain process evidence. Documentary evidence may show the school's focus on learner-centered learning, such as cooperative, interactive, problem-solving, and decision-making activities. There is a need to obtain process evidence to know whether these are actually being practiced. In lieu of face-to-face classes, video footage of how learners do their tasks while taking their lessons may also be used.

Process evidence is obtained by scrutinizing the instructional leadership and management styles, methods, techniques, approaches, and activities used by the school community to achieve the SBM goal. Evidence is identified through participant or non-participant observation, which may be conducted formally or informally. Individual or group interviews are held to verify or clarify the evidence. Evidence is scrutinized for validity using the RACCS criteria.

Determining the number of observations, interviews, and documents to be scrutinized is a sampling problem in conducting the D-O-D. The problem is commonly addressed by using saturation sampling; the technique is described in the attachment to the SBM Assessment Tool. Use the process evidence to cross-validate the documentary evidence, and synthesize the process evidence for the group discussion.

Step 7: Discuss the synthesized documentary and process evidence. Conduct the discussion as a friendly, non-confrontational conversation to explain, verify, clarify, and augment the evidence. Invite members of the school community who were engaged in the collection and presentation of evidence to participate in the discussion. As the team arrives at a consensus on the level of practice of the indicator being assessed, a check mark (✓) is placed in the appropriate box to indicate the compliance of the artifacts with the indicator being assessed. The process continues until all four dimensions are assessed.

Practices vary in establishing the level of practice of an indicator. The most common is the integrative approach, in which the entire body of evidence for all indicators of a standard is assembled first, scrutinized for internal consistency, and finally used as a guide in making a consensual decision on which level of practice an indicator belongs to. The other practice is non-integrative: the indicators of a standard are scrutinized one by one for evidence and classified one by one for level of practice, with less attention given to the relationships among indicators.

In this period of pandemic, online assessment and validation may be conducted. A school visit may be done for schools whose barangay is COVID-free; however, strict health and safety protocols shall be observed per IATF and/or LGU guidelines. Proper courtesies must be accorded to the School Heads in planning the assessment process.

Step 8: Discussion of documents and process evidence. Summarize the data for each principle/indicator. Clarify issues, problems, opportunities, etc. Then the team scores the indicators. The scoring of the indicators is a consensual decision of the team based on concrete evidence and the merits of the claim.

Step 9: Closure or exit conference. This part of the procedure is equally important, as it solidifies the overall process and decision of the team. This may be done onsite or online, as deemed applicable and allowable.
Aside from the final decision made, the process undertaken by the team should be discussed. Other significant and/or critical observations must also be discussed or clarified.

Step 10: Report writing by the team. This should be done within a week after the exit conference. All members of the team should sign the report. A copy of the report should be provided to the members of the school SBM team and to the proper offices and authorities.

A.3 Division Level Assessment

Upon request of the school, the Schools Division Office (SDO) SBM Coordinating Team shall schedule the validation. The school should be informed of the schedule through a written notice. The Division validating team shall act on the request within one week of its receipt.

The Division SBM team shall convene to decide on the mode of validation. Thereafter, the team shall commence the validation process. The same process enumerated in Steps 1-10 of the school level assessment is observed. During the assessment proper, an onsite or online platform, or both, may be used. Specify the duration of each phase, to wit: document analysis (one day), observation analysis (one day), and discussion of data (half a day).

Classify and organize the documents by principle. Take note that some documents can serve as a source of evidence for several indicators across principles. For the online platform, scanned copies of documents, video clips of activities, documentaries, and testimonials will be sufficient evidence. For onsite validation, strict adherence to health protocols is enjoined.

Conduct process validation. The purpose is to verify processes and validate documented evidence using observation and interview. The Division/Regional Team shall conduct a Focus Group Discussion (FGD) to further clarify the processes, procedures, and systems carried out by the school to arrive at a particular rating per indicator in the contextualized tool. The platform to be used shall be decided on by the school and the validating team.

Gather evidence using the D-O-D process and select samples of documents using emergent saturation sampling and snowballing. Summarize the evidence and arrive at a consensus, especially on the rating to be given to each indicator based on the documented evidence.

If the attainment of the SBM Level II Level of Practice is validated and confirmed by the division team, the SDO shall give certification and/or recognition to the school for achieving such level. If the claim of Level II is not achieved, the team shall provide technical assistance to the school to attain the target Level II practice. Furthermore, if the school is assessed by the SDO team to have attained SBM Level III, the Division Office shall request validation from the Regional Office.

A.4 Regional Validation

The Regional Office, through the Regional SBM Coordinating Team, shall act only on requests of schools which have been assessed as Level III and endorsed by the Schools Division Office. The Regional Coordinating Team shall decide on the mode of the validation process. Prior to the actual conduct of validation, the SDO shall first submit the documents and other evidence to support the claim. The Regional Coordinating Team shall, within one week, do an initial assessment of the documents submitted. Then the team decides consensually whether the documents are sufficient to proceed to the validation process. The SDO shall be notified of the decision and of the succeeding actions of the regional team.
The following are the steps in the regional validation:

Step 1: An endorsement and formal request for SBM validation shall be forwarded by the Division Office, through the Division SBM Coordinator, to the Regional Office, Attention: FTAD.

Step 2: The School, Division, and Regional Office shall agree on the date and mode of validation.

Step 3: If the parties agree on onsite validation, all documentary requirements must be prepared ahead of time. On the other hand, should those involved agree on online validation, a Google Drive folder will be created by the Regional Office and sent to the school through the Division SBM Coordinator, where all the documentary evidence will be uploaded two weeks prior to the scheduled focus group discussion. If both online and onsite validation are employed, a schedule for the onsite visit shall be made, while all the documentary evidence for the online validation will have to be uploaded to a designated Google Drive folder.

Step 4: For the documents submitted to the Regional Office via the Google Drive folder, the regional coordinator will conduct an assessment and determine the sufficiency of the evidence presented to proceed with the validation.

Step 5: If the evidence is found insufficient, the school, through the Division SBM Coordinator, will be formally informed. The school will be given another week to comply with the additional requirements needed by the Regional Validation Team.

Step 6: If the additional requirements have been complied with and duly assessed, the Regional SBM Team will schedule a date for an online validation through a focus group discussion with the School SBM Team.

Step 7: Then, Steps 7-10 of the school and division level assessment shall be followed. Furthermore, an in-depth process validation using the guide questions developed by the Regional Technical Working Group shall be done.

If the regional team confirms the level of practice at Level III, the Regional Office shall give certification and/or recognition to the school for the attainment of such level. If the team determines the practice to be at Level II or below, the regional team shall provide technical assistance to the SDO; likewise, the SDO will provide technical assistance to the school to attain the highest level of SBM practice. Thereafter, the Regional Office shall recommend the school to the Central Office for recognition.

Figure 1. The Validation Process Flow (The SBM Level of Practice Validation and Quality Assurance Flow Chart): school-level self-assessment with final computation on the SBM Validation Form, combining learning outcomes (60%) and validated SBM implementation (40%), endorsed by the PSDS to the Division; division-level assessment with final computation on the SBM Validation Form, endorsed by the SDS to the Region; and regional-level validation with final computation by the RFTAT, leading to recommendation to the Central Office.

C. The Contextualized SBM Assessment Tool

The Contextualized School-Based Management (SBM) Assessment Tool is guided by the four principles of ACCESs (A Child- and Community-Centered Education Systems). The indicators of SBM practice were contextualized from the ideals of an ACCESs school system.
The unit of analysis is the school system, which may be classified as Developing, Maturing, or Advanced.

Characteristics and Features. The revised tool is systems-oriented, principle-guided, evidence-based, learner-centered, process-focused, non-prescriptive, user-friendly, collaborative in approach, and results/outcomes-focused.

Parts of the Tool. The tool shall contain the following parts: (a) basic school/learning center information; (b) principle-guided indicators; (c) description of SBM practice scaled in terms of the extent of community involvement; (d) learner-centeredness; and (e) scoring instructions.

Awards or wins in Brigada Eskwela and WASH in Schools (WinS), which were included under the Curriculum and Learning principle in the original tool, were transferred to the Management of Resources principle.

Users. The users of the tool are the teachers, school heads, learners, parents, LGUs, the private sector, NGOs/POs, and the different administrative governance levels of DepEd.

Scoring Instructions

1. The four (4) principles have the following assigned percentage weights on the basis of their relative importance to the aim of the school (improved learning outcomes and school operations):
   * Leadership and Governance - 30%
   * Curriculum and Learning - 30%
   * Accountability and Continuous Improvement - 25%
   * Management of Resources - 15%

2. Each principle has several indicators. Based on the results of the D-O-D (Document Analysis, Observation, and Discussion), summarize the evidence and arrive at a consensus on the rating that will be given to each indicator.

3. Rate the items by checking the appropriate boxes. These are the points earned by the school for the specific indicator. The rating scale below shall be used:
   0 - No evidence
   1 - Evidence indicates early or preliminary stages of implementation
   2 - Evidence indicates planned practices and procedures which are fully implemented
   3 - Evidence indicates practices and procedures that satisfy quality standards

4. Gather the rubrics rated by the respondents; edit them for errors such as double entries or incomplete responses.

5. Count the number of check marks for each indicator and record it in the appropriate box in the summary table for the area/standard rated.

6. Multiply the number of check marks in each column by the corresponding points (1-3).

7. Get the average rating for each principle by dividing the total score by the number of indicators of the principle.

8. Record the average ratings for the principles in the Summary Table for the computation of the General Average.

9. Multiply the rating for each principle by its percentage weight to get the weighted average rating.

10. To get the total rating for the four principles, get the sum of all the weighted ratings. The value derived is the school rating based on the D-O-D.

11. The level of practice shall be computed based on the criteria below (an illustrative computation follows this list):
    * 60% based on the improvement of learning outcomes
    * 40% according to validated practices using the D-O-D
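The sketch below (Python, illustrative only) walks through the arithmetic in items 7-11: averaging the 0-3 consensus ratings per principle, applying the percentage weights, and combining the result with the learning-outcomes component on the 60/40 split. The function and variable names, the sample ratings, and the assumption that both components are expressed on the same scale are the sketch's own and are not prescribed by this Memorandum.

```python
# Illustrative computation of the SBM rating described in the Scoring
# Instructions. The principle weights and the 0-3 indicator scale come from
# this enclosure; everything else (names, sample ratings, the common-scale
# assumption for the learning-outcomes component) is illustrative.

PRINCIPLE_WEIGHTS = {
    "Leadership and Governance": 0.30,
    "Curriculum and Learning": 0.30,
    "Accountability and Continuous Improvement": 0.25,
    "Management of Resources": 0.15,
}

def principle_average(indicator_ratings):
    """Item 7: average of the 0-3 consensus ratings for one principle's indicators."""
    return sum(indicator_ratings) / len(indicator_ratings)

def dod_rating(ratings_by_principle):
    """Items 9-10: weighted sum of the per-principle averages
    (the school rating based on the D-O-D)."""
    return sum(
        PRINCIPLE_WEIGHTS[principle] * principle_average(ratings)
        for principle, ratings in ratings_by_principle.items()
    )

def level_of_practice_score(learning_outcomes_component, dod_component):
    """Item 11: 60% improvement of learning outcomes, 40% validated practices (D-O-D).
    Both inputs are assumed to be expressed on the same scale."""
    return 0.60 * learning_outcomes_component + 0.40 * dod_component

# Hypothetical consensus ratings per indicator for each principle
sample_ratings = {
    "Leadership and Governance": [3, 2, 3, 2],
    "Curriculum and Learning": [2, 3, 3, 2],
    "Accountability and Continuous Improvement": [2, 2],
    "Management of Resources": [3, 3],
}

dod = dod_rating(sample_ratings)
print(round(dod, 2))  # prints 2.45, the weighted school rating based on the D-O-D
```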
Description of SBM Levels of Practice. The resulting levels are described as follows:

Level I: DEVELOPING - Developing structures and mechanisms with an acceptable level and extent of community participation and impact on learning outcomes.

Level II: MATURING - Introducing and sustaining continuous improvement processes that integrate wider community participation and significantly improve performance and learning outcomes.

Level III: ADVANCED - Ensuring the production of intended outputs/outcomes and meeting all standards of a system fully integrated in the local community and which is self-renewing and self-sustaining.

The Scoring Sheets and Templates are found in the enclosures.

D. RECOGNITION AND INCENTIVES

To accelerate implementation and reward best practices, the contextualized SBM assessment provides a systematic recognition and incentives program in terms of a cash incentive, subject to the availability of funds and to auditing and accounting rules and regulations, and/or learning materials, a plaque of recognition, and the sharing of best practices in a Regional SBM Congress.

To ensure the achievement of targets and sustained best and improved practices, a surveillance audit for two consecutive years shall be conducted by the division and/or regional office. It is only after recertification by the regional office that the school shall be recommended for recognition to the Central Office. The highest recognition, after two consecutive reaccreditations, may be awarded to a school as a Center of Excellence in a specific field, i.e., Technical-Vocational, Special Education, Multigrade School, etc.

VI. MONITORING AND EVALUATION

On SBM Quality Implementation

1. To ascertain the current status of SBM quality implementation and assessment, all concerned divisions, through the SGOD, are tasked to provide baseline data and a Division SBM Profile quarterly, which will serve as the basis for the provision of relevant and timely technical assistance (TA) to SDOs and schools. A summary report shall be submitted to the Regional Office, Attention: FTAD Chief ES, every end of the quarter. The same report shall be reflected in the DMEA quarterly.

2. Based on identified priority needs, with the end in view of improving the SBM level of practice, the Division Field Technical Assistance Teams (DFTAT) shall conduct onsite visits or virtual meet-ups to provide TA to schools.

3. Tracking of agreements and interventions after the provision of TA shall be done by the Division SBM Task Force.

4. The template for the M&E Report is enclosed and shall be uploaded via the link bit.ly/SBM_RS.

On SBM Level of Practice Validation

1. DepEd ROV, through the assigned RFTAT and in coordination with the SDOs, shall conduct validation of the SBM Level of Practice to ensure that the school meets the prescribed standards set forth by regional memorandum, anchored on RM 67, s. 2019 and DO 83, s. 2012, provided that:
   a. The SDO endorsed the school with Level III of practice to ROV;
   b. The school rating on the SBM level of practice is certified by the Division SBM Task Force and is based on the result of the quality assessment; and
   c. The certification is duly approved by the Schools Division Superintendent.

REFERENCES

DepEd Order No. 83, s. 2012, re: "Implementing Guidelines on the Revised School-Based Management (SBM) Framework, Assessment Process and Tool (APAT)"

Regional Memorandum No. 67, s. 2019, re: "Implementing Guidelines on the Contextualized School-Based Management (SBM) Assessment, Process and Tool (APAT)"

ANNEXES

b. Process Flow of the Levels of Assessment
c. Assessment Tool
d. Validation Form
e. Score Sheets

VII. EFFECTIVITY

These implementing guidelines shall take effect immediately upon the signing of the Memorandum to this effect.
