The unit has an assessment system that collects and analyzes data on applicant qualifications, candidate and graduate performance, and unit operations to evaluate and improve the performance of candidates, the unit, and its programs.
Introduction
History of Assessment in the PEU
Assessment in the Professional Education Unit (PEU) at the University of San Diego’s
School of Leadership and Education Sciences has evolved in stages. For more than 20 years, the School of Education, later the School of Leadership and Education Sciences (SOLES), has employed a credential analyst to ensure that candidates for teacher, school counselor, and administrative services credentials have met all of the state requirements for their respective credentials. The person in this position monitors changes in California legislation and shifting requirements for teacher and other school credential candidates. The credential analyst’s website contains the most up-to-date credential information.
Although the state of California has long had assessment requirements that must be met
by all teacher candidates, faculty members wanted to assess more than these content-focused, end-of-program assessments could capture. In 2001-2002, the Department of Learning and Teaching implemented a portfolio system to assess candidate development in all of its teacher preparation programs. Students submitted centerpiece assignments for each credential course. Faculty evaluated student work using criteria that addressed California Standards for teacher candidates. This system permitted the first program-wide systematic assessment of candidates at the midpoint and end of their credential programs.
In 2004, recognizing the importance of assessment across the PEU, Dean Paula Cordeiro
created a half-time position exclusively dedicated to facilitating assessment efforts for teacher education programs. That assessment person launched TaskStream’s Learning Achievement Tool (LAT) for faculty and students in the teacher credential programs. Each faculty member developed a site in which student work could be submitted and evaluated on a set of criteria. At the same time, faculty in the credential programs developed a set of standardized criteria to be used to evaluate the centerpiece assignments according to the appropriate California Teacher Performance Expectations (TPEs).
Also in 2004, Stanford University was piloting its Performance Assessment for California Teachers (PACT), a comprehensive, authentic assessment system for elementary and secondary (multiple and single subject) teacher candidates. The University of San Diego
agreed to be part of this project because of our belief that the set of assessments the PACT provides gives both faculty members and teacher candidates an accurate, in situ assessment of the knowledge, skills, and abilities that beginning teachers should
possess. Faculty knew that the teacher candidate data from the PACT would permit a more careful examination of areas of strength and weakness for candidates. Because of the powerful evidence of learning in the PACT assessment, faculty decided to discontinue the use of the portfolio with multiple and single subject teacher candidates.
Research across institutions
within the consortium has found PACT to be an effective tool for measuring individual teacher competence and a powerful tool for teacher learning and program improvement (Pecheone & Chung, 2006). At USD, adoption of the PACT assessment system has led us to more systematically examine our teacher preparation program, working to ensure that coursework and field experiences work together to progressively prepare candidates to respond to all areas of planning, instruction, assessment, reflection, and academic language. Course readings, course assignments, and field supervision expectations have all been updated to respond to the rigorous expectations set by PACT. In addition,
the Embedded Signature Assignments (ESAs) and TPEs required by PACT have
provided USD’s teacher education faculty with substantive samples of candidate
work to collaboratively review in order to measure individual student progress
and assess areas of strength and weakness within the program.
During the same period, candidates in school counselor and administrator preparation programs were also being assessed through formative and summative assignments and projects, such as program portfolios. For example, preliminary and professional administrative services credential candidates develop extremely detailed portfolios that respond to the California Professional Standards for Educational Leaders (CPSEL) as part of their development as K-12 administrators.
As part of the PACT teaching event, credential candidates are required to maintain
daily reflection logs in which they monitor student learning and make appropriate adjustments to instruction within and between lessons. In addition, candidates are expected to demonstrate that they are able to use research, theory, and reflections on
teaching and learning to guide practice and support continual professional
growth. Candidates for master’s degrees within the Department of Learning and Teaching engage in action research investigations within their classrooms. These in-depth research projects require candidates to assess their students’ learning, to evaluate the efficacy of instructional practices, and to reflect on their own learning and the conditions that support their professional growth. Such critical reflection creates the disposition to be a life-long learner, which supports both professional growth (TPE 13) and a commitment to reflection and learning as ongoing processes. It is also crucial for becoming effective and proactive in meeting the professional, legal, and ethical obligations of the teaching profession (TPE 12).
In 2005-2006, SOLES began to use the University of Washington’s standardized
course assessment system. The University of Washington’s course assessment was chosen over several alternatives because it had been shown to be valid and reliable and had been adopted by many other universities. The system replaced course evaluations that varied by department and, in some cases, by instructor, and it provided a way to examine candidate evaluations of courses and instructors on a school- and department-wide basis. Faculty received reports summarizing the evaluations each semester.
In 2006-2007, Dean Cordeiro decided to expand the assessment position to a full-time Director of Assessment position supported by a .5 FTE doctoral-level graduate assistant. The new director developed a PEU-wide tracking system for all credential candidates, which permitted more accurate monitoring of each teacher or school counselor candidate’s progress from entry through program completion and the formal filing of the credential documents with the State of California by the SOLES credential analyst. The director also worked closely with an Information Technology support person to pull data from TaskStream’s LAT.
An important contribution made during this time was the development and delivery
of electronic exit surveys for all programs in SOLES. Early program assessment data were retained only on paper documents. The new exit surveys were the first electronic data collection in SOLES. The surveys were developed by program faculty and delivered by the Office of Assessment using Survey Monkey. The first year in which programs used
electronic data collection in a systematic manner was 2006-2007. Survey Monkey
permitted data to be exported into Excel spreadsheets, which could be more easily analyzed for program use than data collected using paper documents. By 2007-2008, every initial and advanced program was using an electronic exit survey that included a set of questions standardized across the PEU and a set of program specific questions.
In addition, the Director of Assessment was able to develop a set of electronic assessments to be used by the key stakeholders involved in student teaching (i.e., the candidates, university supervisors and cooperating teachers), delivered via Survey Monkey. Electronic delivery has two major advantages over paper assessments. First, the forms can be completed from any location with internet access. Second, the data are recorded in a permanent repository from which reports can be generated.
Of special note during the 2006-2008 time frame is that faculty in the School
Counseling specialization in the Master of Arts in Counseling program decided
to seek accreditation by the Council for Accreditation of Counseling and Related Educational Programs (CACREP). This process involved extensive documentation, including the development and assessment of student learning outcomes. Program faculty
created a midpoint assessment, which has since evolved into a Fieldwork Readiness Assessment, to ensure that candidates were prepared to advance to the fieldwork components of the program. They developed a fieldwork assessment that is used by on-site supervisors and university faculty to assess the key components of the counseling candidate’s development: clinical skills, assessment skills, and professional skills. This assessment
instrument is used in all three phases of the fieldwork: pre-practicum, practicum, and internship. The School Counseling Specialization was granted an eight-year accreditation by CACREP in January 2009.
Assessment Today
In summer 2008, the first full-time Director of Assessment left USD for another position. Following a national search, SOLES hired a new Director of Assessment, who was tasked with working with all program faculty in SOLES on program assessment and documentation for accreditation and accountability. Her office was renamed the Office of Assessment Support to more accurately describe its function. To improve accuracy in data reporting, the SOLES independent data system was integrated into the USD data system, Banner. This integration involved several phases, including special training and the development of close working relationships with the Office of Institutional Research and Planning, the Registrar’s Office, and Information Technology Services. Historical data for credential candidates were updated with assistance from the credential analyst and a graduate student assistant.
At the same time, SOLES administration decided to invest even more resources into
assessment and program improvement. Because of recognized gaps in the clinical experiences of credential candidates, and because the Chair of the Department of Learning and Teaching recognized the crucial role of clinical experience in the education of teacher candidates, a new position was created in the department: Director of Professional Services. This person was initially hired to coordinate clinical experiences for teacher candidates and to serve as the PACT liaison and administrator. Two additional positions, pathway managers, were created to act as placement liaisons with schools. One of the pathway managers also provides support for PACT and TaskStream. After a year of careful assessment of department needs, the chair expanded the Director of Professional Services’ role to include oversight of the teacher credential programs for elementary and secondary education. She also determined that the program specialist who serves as liaison to the College of Arts and Sciences for undergraduate students in credential programs should report to the Director. The Director initiated the reorganization of
clinical experiences for teacher candidates and is a pivotal member of the team
that developed USD’s Transformation Initiative.
From the time they were both hired, the new directors collaborated to improve assessment of the clinical experiences of teacher candidates, even as those experiences were redefined. To improve documentation of student learning outcomes assessment, SOLES’ master’s-level programs, including the advanced master’s programs in the PEU, track program assessment using TaskStream’s Accountability Management System (AMS). AMS provides a place for programs to record their student learning outcomes and the measures used to collect evidence of those outcomes. Faculty then provide interpretations of the results. TaskStream’s AMS also includes the annual development of an action plan based on the results of the measures for each learning outcome. All phases of the use of AMS are facilitated by the SOLES Director of Assessment Support.
Faculty across the PEU have been working to raise the bar in terms of program
assessment. In two of the PEU program areas, the Department of Learning and Teaching and the School Counseling program, designated faculty are assigned assessment activities and collaborate with the Director of Assessment Support as part of their assigned faculty
responsibilities.
An example of faculty members taking initiative to improve an assessment occurred
in the Department of Learning and Teaching regarding the assessment of Master of Education (M.Ed.) candidates. Faculty had developed a rubric to assess the Action Research (AR) projects of students in the Curriculum and Instruction M.Ed., and they wanted to use the rubric systematically to provide key evidence of student learning. All of the full-time Learning and Teaching faculty involved in the master’s programs worked with the Director of Assessment Support to implement this rubric across all M.Ed. programs, testing and revising it three times into its present form. In addition, because they want to ensure that the scoring of AR projects is reliable across raters, faculty decided to hold a calibration session with the director every fall and spring semester.
The Director of Assessment Support is invited to department, program, and school-wide faculty meetings and is included in all discussions about the use of data to enhance academic programs (pedagogy, curriculum, and assessment). All programs have posted student learning outcomes and have mapped the assessment of those outcomes to appropriate courses in the curriculum. Exit surveys and clinical assessment forms that were previously delivered using Survey Monkey have been migrated to Qualtrics, a survey tool that provides greater analytic capability. Most of these surveys have been refined by program faculty members working with the Office of Assessment Support; many have added direct measures through observation. In addition, more than 60 new
assessment instruments have been developed for use in programs across the PEU and other SOLES programs (i.e., Leadership Studies and Marital and Family Therapy).
The Office of Assessment Support oversaw a change in course evaluations for spring
2010. After four years of using the University of Washington (UW) course evaluation system, three drawbacks had been identified. First, there was a lag of several months between the collection of candidates’ scantron forms and the receipt of reports back in SOLES. Typically, the next semester was well underway before an instructor received feedback from the previous semester. Second, UW analyzed only the data from scantron forms; candidates’ written comments had to be entered into reports, by course, by SOLES administrative assistants each term. Third, not all of the questions were relevant for a graduate school, most likely because a majority of the universities participating in the University of Washington course evaluation program use it to evaluate undergraduate courses. Taking advantage of an opportunity presented by USD’s Information Technology (IT) area, SOLES was able to launch a course evaluation system embedded within the candidates’ university e-mail accounts. In summer 2009, the Director of Assessment Support convened a team of tenured professors from the four program areas in the School of Leadership and Education Sciences to develop a more appropriate set of questions for course evaluations. The questions were then brought to the full faculty in fall 2009. Some modifications were made, and the Office of Assessment Support worked with IT to successfully implement the new course evaluation system at the end of the spring 2010 semester.
Exhibits
- Exhibit 1: Description of the unit’s assessment system in detail including the requirements and key assessments used at transition points
- Exhibit 2: Data from key assessments used at entry to programs
- Exhibit 3: Procedures for ensuring that key assessments of candidate performance and evaluations of unit operations are fair, accurate, consistent, and free of bias
- Exhibit 4: Policies and procedures that ensure that data are regularly collected, compiled, aggregated, summarized, analyzed, and used to make improvements
- Exhibit 5: Sample of candidate assessment data disaggregated by alternate route, off-campus, and distance learning programs
- Exhibit 6: Policies for handling student complaints
- Exhibit 7: File of student complaints and the unit’s response [Available on Site]
- Exhibit 8: Examples of changes made to courses, programs and the unit in response to data gathered from the assessment system