
2c. Use of Data for Program Improvement

2c.1. Describe the unit’s plans to regularly and systematically use data to evaluate the efficacy of its courses, programs, and clinical experiences. Describe the unit’s plans to ensure use of these data to initiate changes to its courses, programs, and clinical experiences.

The unit has developed and implemented a comprehensive, systematic assessment system that reflects professional and state standards. The outcomes-based assessment model enables evaluation of teacher candidates’ mastery of the knowledge, skills, and content defined by the unit’s Conceptual Framework and the standards of the Georgia PSC. Data from all key teacher candidate assessments are entered in the LiveText system, where they are aggregated, disaggregated, trended, and analyzed against standards to inform program assessment and ongoing program improvement. The university’s Director of Institutional Assessment assists the unit’s Coordinator of Field Experiences and Assessment in producing meaningful reports from the LiveText system and provides statistical analysis of candidate performance data.

Each May, an assessment retreat brings together all program faculty and program stakeholders to review data, discuss what the data mean, and consider what changes the data may indicate. Participants in the annual assessment retreat are provided with relevant data regarding applicant/candidate qualifications, candidate proficiencies, graduates’ competence, and program operations and quality. They are also provided with a template for the Annual Program Report that includes questions related to candidate proficiencies, program operations, and the assessment system. For example, in the section on Candidate Proficiencies, faculty and stakeholders are asked to consider: “What do the summarized key assessment, exit survey, and graduate follow-up survey data sets show about candidate performance on each standard that was assessed? Please address each program standard separately by providing a brief analysis of the data findings and an interpretation of how those data provide evidence for meeting the standards.” Then they are asked to consider the action implications of those data: “What specific short-term actions will be taken during the following academic year in order to improve candidate performance? What are the long-term action implications? Please specify tasks and timelines for planned actions.” The completed Annual Program Reports are reviewed and monitored by the respective program coordinators, the division chair, and the division’s Coordinator of Field Experiences and Assessment.

Throughout the year, teacher candidates’ performance is reviewed by multiple committees. The Division of Education Assessment Review Committee meets twice a semester to review all teacher candidate performance data reports from LiveText and presents findings to the Faculty Council for the purpose of program improvement. It also makes recommendations to the Advisory Committee and to the Curriculum Committee as needed. The Division of Education Curriculum Committee meets twice a semester to consider programmatic improvements related to curriculum, based on analysis of data provided through the LiveText system. A central task of this committee is an ongoing audit of all course syllabi, with particular focus on course outcomes and assessments aligned with GaPSC professional and content standards. Before any program change becomes final, such as adding a new course, deleting an existing course, or modifying program admission requirements, it must go through the Division and University governance structure.

The Field Experience Committee evaluates all aspects of operations related to field and clinical experiences. The Coordinator of Field Experiences oversees the collection of evaluation data from the field/clinical experience triad: the teacher candidate, the mentor teacher, and the university supervisor each evaluate one another. The Field Experience Committee uses these data, along with teacher candidate data reports from LiveText, to determine the changes needed to field and clinical experiences.

The unit’s assessment system includes a strong component focused on faculty quality that involves both university- and unit-level assessment. Thomas University systematically conducts a comprehensive evaluation process for full-time and adjunct/part-time faculty with the goal of continuous improvement. The formal evaluation process requires faculty to submit annually a Self-Reflective Profile demonstrating teaching effectiveness, scholarship/professional development, and service activities. During the fall and spring semesters and the summer term, student evaluations of all full- and part-time faculty members are administered by the university’s Office of Institutional Assessment. Results of student evaluations are analyzed, and course summaries are distributed to faculty and division chairs as aggregated and disaggregated data after each semester ends. If a faculty member is evaluated as not meeting expectations in one of the performance areas of Teaching Excellence; Scholarly and Professional Involvement and Achievement; or Service to Student Body, University, and Wider Community, the Division Chair develops a corrective action plan in discussion with the faculty member, who agrees to it. Action plans typically pair the faculty member with a university support person, such as the Academic Technology Specialist, a senior academic advisor, or a peer mentor, who meets regularly with the faculty member to address the areas needing improvement. Action plans require a statement of measurable changes in behavior or productivity; the faculty member addresses these targets with the support person, and the Division Chair closely supervises progress over the following academic year. Progress is formally evaluated in the next year’s annual evaluation. Additionally, faculty who supervise field and clinical experiences are evaluated by both the mentor teacher and the teacher candidate.

2c.2. Describe plans to give faculty members access to candidate assessment data and/or data systems.

Faculty have constant access, through their individual faculty accounts, to aggregated teacher candidate performance data analyzed through the LiveText system. The system allows disaggregation by various variables, a critical feature in support of ongoing program improvement. Program-level data collected and analyzed are shared at regular meeting times with the division’s governance structures: the Administrative Council, Faculty Council, Assessment Review Committee, and Advisory Committee, several of which involve PK-12 school partners and Division of Arts and Science faculty.

In addition, as described in 2c.1, all program faculty and stakeholders receive these data at the annual May assessment retreat, where they analyze candidate performance against each program standard and specify short- and long-term improvement actions, with tasks and timelines, in the Annual Program Report. The completed reports are reviewed and monitored by the respective program coordinators, the division chair, and the division’s Coordinator of Field Experiences and Assessment.

2c.3. Describe plans to share assessment data with candidates, faculty, and other stakeholders to help them reflect on and improve their performance and programs.

The unit’s assessment system has program/unit improvement at its core. Assessment data are shared with candidates, faculty, and other stakeholders in many ways. All individual candidate performance data submitted through the LiveText system are available to the candidate as soon as the assessment is complete. This feature allows candidates to receive immediate feedback, reflect on it, and improve their performance. Throughout all programs in the unit, teacher candidates are required to reflect on and improve their practice from entry through induction. For example, during field experiences and student teaching in all programs, teacher candidates are required to record lessons for analysis and reflection. Faculty provide ongoing feedback and mentor each candidate in ways that support and encourage professional and personal growth.

As noted in 2c.2, aggregated teacher candidate performance data analyzed through the LiveText system are available to faculty through their individual faculty accounts, and the system’s capacity for disaggregation by various variables supports ongoing program improvement. Program-level data collected and analyzed are shared with the division’s governance structures: the Administrative Council, Faculty Council, Assessment Review Committee, and Advisory Committee, several of which involve PK-12 school partners and Division of Arts and Science faculty. At the annual May assessment retreat, faculty receive data report tables, and Advisory Committee members participate alongside faculty in reviewing them.

There are established processes through which faculty continuously use data to reflect on and improve their own practice. Course evaluations completed by candidates serve as an important source of data for the annual evaluation, as do peer evaluations. Faculty work with their mentors, peers, and the division chair to reflect on their performance and to develop ways to improve their teaching, scholarship, and service.
