Why are CTE centers/regions required to have common program evaluation tools?
The evaluation section of the Perkins application requires that a common evaluation tool be used across all programs at Maine career and technical education (CTE) centers/regions. Each CTE center/region is responsible for selecting an evaluation tool that best fits the needs of all programs at that center/region.
Who is responsible for designing the common program evaluation tool for each school?
It is recommended that the director of the CTE center/region, in collaboration with their team, design a common yearly evaluation tool (CYET) reflective of the school's needs.
Who completes the program evaluation tool?
Common yearly evaluation tools must be reviewed with Program Advisory Committees (PACs). A PAC consists of teachers, students, secondary constituents, postsecondary constituents, business/industry partners, and other interested or concerned stakeholders.
How often should the program evaluation tool be used?
The tool should be used annually; it is designed to provide a yearly snapshot of each program.
What’s the difference between the program evaluation tool and the program self-study that is part of the comprehensive school review process?
The program self-study process used for a comprehensive school review is highly comprehensive and detailed. The common program evaluation tool, by contrast, provides an annual snapshot of where the program is heading in terms of short-term goals and accomplishments. As such, it need not encompass an extensive review process.
What should be included in the common program evaluation tool?
The following components may be part of the program evaluation tool:
1. Statement of purpose for program evaluation
2. Instructions for using the evaluation tool
3. CTE center/region name, address, phone number, website
4. CTE center/region director
5. Name of program
6. Program description
7. Program technical skills standards
8. Industry-related technical skills assessments that have been selected or are being piloted
9. Student requirements, prerequisites
10. Program planning process
11. Student recruitment process
12. Program curriculum
13. Instructional process
14. Assessment process
15. Connections with sending schools, business and industry, postsecondary schools
16. Enrollment and placement statistics, including special populations and non-traditional students
17. Equipment and facilities
18. Program advisory committee
19. Feedback regarding the program from students, media, community, etc.
20. Names/signatures of program evaluators
What should the common program evaluation tool look like?
In designing the program evaluation tool, the director may decide to use a specific format, or they may opt to incorporate several formats within the tool to best suit the program component being evaluated.
Suggested Formats for Common Program Evaluation Tool

Advantages: Comprehensive in nature; provides for an in-depth review
Disadvantages: Time consuming for the evaluator to record; difficult for data collection

Short answer response forms
Advantages: Time and effort are minimal for the evaluator; responses are easy to read and record
Disadvantages: Not all program evaluation components can be accurately described in short answers

Rubrics
Advantages: Responses are easily recorded, tabulated, and evaluated; the data lends itself to driving program and staff development
Disadvantages: Not all program components can be easily evaluated with rubrics

Advantages: Quantitative data; easy for the evaluator to record responses
Disadvantages: May not reveal the total picture needed for an accurate program evaluation

Portfolios
Advantages: Qualitative data gives a realistic description of the program; portfolios by definition are showcases of work accomplished
Disadvantages: Qualitative data may be difficult to manage