Standard 2 COE Accreditation/Self-Study Report

2c. Use of Data For Program Improvement

2c.1. In what ways does the unit regularly and systematically use data to evaluate the efficacy of and initiate changes to its courses, programs, and clinical experiences?

The unit’s assessment timeline (Std2c.Exh11) in the Assessment Handbook shows the various tasks and schedules. COE faculty participate in a University-wide Outcomes Assessment Report process managed by the UAA Associate Vice Provost for Curriculum and Assessment. This process entails an annual review, completed each June, of program effectiveness as indicated by performance on key assessments. The data on these key assessments guide faculty to make decisions regarding program improvement. Data are reviewed not only by the faculty of each individual program, but by faculty on a University-wide committee that provides feedback to the unit. The assessments chosen by COE programs align with state and national standards, thus data gathered for this University-wide self-improvement report also indicate effectiveness in meeting standards set forth by the various SPAs.

Faculty submitting new or revised curriculum (course and program) proposals must provide to the COE Course and Curriculum Committee the data that support the proposals. Proposals cannot proceed to second reading without these data. In addition, UAA supports a protocol calling for all courses to be reviewed for possible updates at least every 5 years. This provides an opportunity to revise courses in light of new research or changes in the field, and prompts each program to remain current in course content. COE Course and Curriculum Committee summaries and the university-wide undergraduate (UAB) and graduate (GAB) curriculum boards meeting summaries provide evidence of this robust process.

The IDEA faculty evaluation system can be tailored by the individual faculty member for each course. This system gathers data about student satisfaction and perception, and can be linked to skills and dispositions that faculty select to fit the course methods and goals. The results of the candidate surveys are provided to faculty, department chairs, and the Dean each semester, and thus allow for fine-tuning pedagogy and assessment strategies in a relatively short timeline. They also constitute a core component of the faculty promotion, tenure, and annual review process.

On an annual basis, the A&A Committee, the LT, and the departments review the results of exit, alumni, and employer surveys to assess perception of the curriculum, services, and field experiences offered by the unit. In addition to the surveys, individual program areas use feedback from school-based mentors of graduating interns, conversations with employers, focus groups, advisory boards and councils, and other means of gathering data. Faculty regularly review these means, in addition to enrollment data, graduation rates, and anecdotal evidence gathered during candidate advising, at program and department meetings.

2c.2. What data-driven changes have occurred over the past three years?

In 2005, COE received an Area for Improvement for Standard 2. The COE Assessment and Accreditation Committee and Data Manager have worked diligently with departments and faculty to support and improve the functioning of the assessment system, and the College continues to evolve in its use of data-informed decision-making processes. COE relies not only on quantitative data but also on qualitative data gathered through long-term relationships with candidates, knowledge of school climates, and understanding of the uniqueness of the Alaskan context. We believe in a quote attributed to Einstein: “Not everything that can be counted counts, and not everything that counts can be counted.”

Numerous examples over the past three years provide evidence that COE uses data to inform decisions. The data informing these decisions came directly from components of COE’s assessment system, including exit, alumni, and employer surveys; faculty and department chair concerns raised when analyzing course-level or transition point assessments; stakeholder feedback through advisory boards, councils, and focus groups; UAA Outcomes Assessment and Program Review processes; faculty and unit resource and productivity data, including enrollments and other performance metrics, revenue, and expenditures; state regulation; and supply and demand data. A more detailed description can be found in Standard 2c Exhibit 12 (Std2c.Exh12), Examples of changes made to courses, programs, and the unit in response to data gathered from the assessment system.

  • The COE Graduate Committee provided leadership to create a common graduate writing rubric, revised research requirements for M.Ed.s, and revised graduate admission and exit criteria based on data obtained from candidates, faculty, and stakeholders.
  • UAA’s cyclic and special Program Review process provided the data informing the revision of the M.Ed. in Counselor Education and the deletion of the M.Ed. in Adult Education.
  • Candidate usage data resulted in the closure of the COE Computer Lab.
  • Supply and demand data, candidate feedback, and stakeholder feedback led to the implementation of three program tracks for secondary education (MAT), implementation of an alternate route option to special education certification, and addition of Career and Technical Education and Physical Education endorsement areas to the MAT.
  • State regulation resulted in the modification of internship evaluation instruments to incorporate new standards for beginning teachers.
  • Internship placement and budget data prompted the implementation of videoconferencing for remote and rural internship supervision.
  • Workload and annual activity report analyses led to the implementation of department databases for tracking workload credit and compensation for thesis/project courses.
  • Internal and external stakeholder feedback prompted the hiring of a Recruitment and Retention Coordinator.
  • Lack of sufficient data at transition points related to candidate knowledge, skills, and dispositions led to the implementation of the NASSP Leadership Skills Assessment in the principal preparation programs.
  • Candidate and school district feedback on the success of a pilot cohort model prompted the implementation of a district/university collaborative cohort model of delivery of the principal preparation program.
  • Professional standards as well as candidate, alumni, and employer feedback informed the design of new research courses to meet the needs of program candidates.
  • Praxis II score analyses led to the revision of social sciences content course requirements in the undergraduate programs.
  • Candidate, mentor teacher, mentor principal, methods faculty, and clinical faculty feedback prompted a revised structure for the Elementary Education internship year.
  • Focus group discussions resulted in the development of a new course requirement in children’s literature for Early Childhood Education candidates.

2c.3. What access do faculty members have to candidate assessment data and/or data systems?

Faculty have a variety of sources for candidate assessment data. Data may be accessed through the Banner database, OnBase, TaskStream, Blackboard, UAOnline, department databases, the unit’s file sharing and PETaL systems, and department paper-based files.

TaskStream is COE’s system for portfolios, and departments are in various stages of implementation. The system collects and reports data on key assessments.

Blackboard is used in both distance education courses and campus courses. Faculty and candidates use it to exchange electronic documents: faculty store and manage course documents and records in their course shells, and course assignments can be submitted and graded within Blackboard. Thus it serves as an accessible communication and record-keeping system for candidate assessment and achievement data.

The UAA Outcomes Assessment Reports, which provide aggregate reports on candidate performance related to learning outcomes, are public documents and accessible to everyone through the UAA Assessment web page.

Candidate achievement and status data are available through Banner and OnBase, including admission and enrollment status, test scores, and program plans. The unit’s PETaL system allows easy access to information in Banner and other sources. PETaL, with its filtering and criteria selection options, provides an adaptable and responsive system for faculty to manage program review, assessment, and refinement. Both individual and aggregate candidate data can be accessed.

COE’s shared drive is a source for assessment data and records. These include, but are not limited to, candidate assessment data that may be stored by departments; survey results; and records of meeting summaries and reports from unit committees, boards, and councils.

In addition to electronic sources for candidate data, department staff retain paper files; the College’s goal is to eliminate these paper-based files as it continues to refine its electronic data systems.

2c.4. How are assessment data shared with candidates, faculty, and other stakeholders to help them reflect on and improve their performance and programs?

Candidates receive assessment data through graded course assignments, field experience observation instruments, and feedback at transition points. Scoring guides provide specific information to candidates about both achievement and areas for improvement. The Blackboard “Grade Center” calculates a class average on a given assignment and displays this to candidates so they know how their performance compares to others’. Class discussions help candidates reflect on their developing competence and areas for improvement, and formative assessments help them reach desired outcomes.

Faculty and staff regularly conduct advising sessions with candidates to assess academic progress. Communication between school placement sites and COE ensures that problems and successes within field placements are shared.

College data on various measures, such as enrollments and ethnicity, are shared through annual print and online publications (Fast Facts). Aggregate candidate performance (degrees/certificates; test scores), aggregate faculty data, and budget information are similarly shared.

Faculty receive candidate evaluations each semester, and annual or periodic evaluations through the Peer Review Committee and Dean. Aggregate faculty productivity data are reported to LT, departments, advisory boards and councils, and central administration.

Annual program assessment reports prompt data analysis and discussion among faculty regarding program improvement. These reports are posted on the UAA Assessment web site and are publicly available.

The Director of Clinical Services, department chairs, and faculty maintain ongoing contact with school districts regarding program effectiveness, candidate preparation, and other relevant data. Mentor workdays and meetings are forums for exchanging data and considering program improvements with school-based and University faculty. Employer surveys provide further data about success in preparing qualified and employable candidates over the long term.


1. What does your unit do particularly well related to Standard 2?

All programs within COE focus on ensuring that candidates are familiar with state and national standards and the core values of the conceptual framework. COE is making great strides toward inspiring candidates to reflect on the core values of the unit. This encouragement toward reflection and self-assessment pervades instruction, and faculty consider it a key attribute of an effective educator. All syllabi incorporate both relevant standards and core values linked to outcomes and assignments.

The Early Childhood Education program provides candidates with multiple opportunities for reflection as they progress through transition points. Key assignments require reflective essays, descriptions of performance, and analysis of child data. Candidates consistently express their satisfaction with both the support they receive and their growing ability to be reflective as they progress through the program. From the introductory course, through multiple practicum experiences, to the final internship, the program emphasizes the national standards. Candidates reflect on these standards many times throughout their four years of study. A service-learning project conducted semiannually by the entire program encourages reflection on a shared experience and self-analysis against these standards.

The Educational Leadership principal program has worked hard to implement district/university partnerships that offer cohort models in four of the five largest districts in Alaska. The cohort program brings together participants from a single district who engage in collaboration and reflection; it builds trusting relationships among candidates and enhances their ability to be reflective and self-evaluative.

The Special Education faculty have been proactive in conducting self-review of programs and courses. They have focused on ensuring they adequately address standards and eliminate redundancies or gaps in key content.