Assessment data are collected at multiple points and multiple assessments are used
to gather both internal and external data. Data are regularly compiled, summarized,
analyzed and used. For example, candidate data are used by programs to make decisions
regarding candidate admission, matriculation, and program completion. Program assessments
are used internally to measure program quality and manage and improve Unit operations
and programs. SPA program reports are external evaluations used to strengthen the
overall performance of the Unit and ensure that graduates have the knowledge, skills,
and dispositions to meet program standards. SPA program approval reflects on the quality
of the Unit and its operations. Employer surveys are used to ascertain candidate proficiencies
in the workplace as well as the quality of Unit and program operations. Follow-up surveys of
completers also provide data for improvement of Unit operations (exhibits 2.1.1; 2.2.2; & 2.1.3).
Course and Instructor Evaluations are completed by candidates and compiled by IT.
Results of these evaluations are shared with faculty members to improve the teaching
and learning environment. They are also used by departmental chairs during annual
faculty evaluations and as an indicator of Unit and program operations quality.
Faculty submit the Annual Faculty Report (exhibit 2.1.4). Faculty evaluations by
department chairs are conducted annually (exhibit 2.1.5) and feedback is used to improve
faculty productivity and to assist faculty in meeting tenure and promotion goals.
Data also provide evidence of Unit and program operations quality. Tenure-track faculty
are evaluated for tenure and promotion according to criteria and procedures established
in the GSU Faculty Handbook (exhibit 2.1.6). Faculty are also evaluated by peers using
an observation form (exhibit 2.1.7).
GSU supervisors and cooperating teachers are evaluated using multiple instruments.
Data from these evaluations are used to make future FEX assignments and as an indicator
of the effectiveness of the Unit and the quality of program operations. The following
evaluations are completed at the end of each semester: 1) student teaching candidate
evaluation of GSU supervisor and cooperating teacher, 2) cooperating teacher evaluation
of GSU supervisor, and 3) GSU supervisor evaluation of cooperating teacher. For more
information, please see the Student Teaching Handbook (exhibit 2.1.8).
The annual departmental goals and objectives form (exhibit 2.1.9) is used to guide
the planning and operations of each department and is used as an indicator of Unit
and program operations quality. Each fall, departmental faculty set goals, objectives,
strategies, and performance measures for the upcoming fiscal year and evaluate performance
measures from the previous year.
The Unit Assessment system is continually evolving to further meet the needs of faculty,
staff, and candidates. For example, when compiling Key Assessment data for the 2014-2015
school year, the Assessment Coordinator noted Assessment as an area of weakness for
the candidates. This will be brought to the Assessment System Review Panel for further
action in Fall 2015. Courses of action may include updating courses to include more
instruction on how to create and analyze valid and reliable assessment instruments
and adding resources for faculty and candidates to the online library housed in TaskStream.
The Unit is on a path toward continuous improvement. We strive to stay current and
align courses with state and local policies. Changes have been made to course syllabi
to align them to PARCC, COMPASS, and our Unit's Conceptual Framework (Exhibits 2.2.b.1
& 2.2.b.2). The Unit also adopted the use of 4-point rubrics across programs. However,
the Assessment System Review Panel has noted that not all courses are using
the syllabus template (Exhibit 2.2.b.3). In addition, some courses do not use a 4-point rubric,
and many syllabi have not been updated to align with the revised Conceptual
Framework. Therefore, it was suggested that Mrs. Jones speak at various meetings to
facilitate the updates. Likewise, syllabus updates will be scheduled during faculty
meetings in Fall 2015. The Assessment System Review Panel will discuss the possibility
of appointing a syllabus steward or requesting that panel members themselves review
syllabi. The Unit continually uses data to improve the programs offered.
For example, analysis of Praxis I data showed that students scored
poorly on the writing portion; ED 111 was therefore added to address this
deficiency.
Unit decisions about courses are linked to Specialized Program Assessments (exhibit
2.2.b.4). Programs are revised to align with feedback from the various
accreditation bodies. For example, based upon feedback from the IRA report,
the Unit has updated the key assessments that will be used to gather data for
that report. These data, in turn, will be used to improve the programs and courses
offered.
Recently, our Completer Survey and Employer (Principal) Survey were revised in response
to the annual EPP report (exhibit 2.2.b.5 & 2.2.b.6). Recent surveys of graduates
tell us that we need to: offer more courses online; provide more opportunities for
practicing teachers to interact with teacher candidates through videos, workshops,
PowerPoints, etc.; expose candidates to the logistics of operating a classroom; facilitate
more hands-on activities with teacher candidates; update the curriculum to include
more technology integration; prepare candidates to effectively deal with severe behavior
issues in the classroom; and shadow/closely mentor candidates to keep them on track.
Recent survey results from principals (employers) tell us that our completers
should continue to encourage students to take initiative for their own learning and that of others
in the classroom, and that completers need to keep abreast of new trends while exploring the curriculum
changes for Louisiana. These results will be discussed and addressed in the Fall 2015
meeting of the Assessment System Review Panel and a plan for addressing these needs
will be created (completer results exhibit 2.2.b.7; employer results exhibit 2.2.b.8).
The Unit also uses current student surveys to gather data for each individual course
(exhibit 2.2.9). Changes to individual courses can be viewed in exhibit 2.2.b.10.
However, the Assessment System Review Panel has noted that the student surveys disseminated
for online courses are the same as those used for face-to-face courses. Some items do not
apply to both types of courses. For example, the item "my instructor spoke audibly
and clearly" is not applicable to an online course. If a student decides to skip a
question that does not apply, the instructor receives a score of zero for that
item because there is no N/A option and unanswered questions default to a score of zero.
The Assessment System Review Panel recommends that a request for revisions to the
student survey be sent to the College of Education Administrative Council. If approved,
the suggestion would then be sent to the PK-16 Council, which would then send the approved
suggestion back to the dean, who would take the recommendation to the provost.
After assisting in aggregating key assessment data and conversing with instructors,
the Assessment Coordinator noted that Assessment is an area of need
in both undergraduate and graduate programs (exhibit 2.2.b.11). A plan to address this
need will be presented to the PK-16 Council for consideration
in Fall 2015. One possible approach is to have the Assessment Coordinator
and other willing faculty members offer faculty development to help instructors develop
activities around assessment. This development could consist of resources uploaded
to TaskStream, other web-based resources shared via e-mail, and presentations at
faculty meetings.
One area that will need to be addressed in Fall 2015 is the assessment of our academic
advisement process. Because we use a dual advisement system, not all advisement goes
through the Care Center. The development of a system to assess centralized advisement and
certification will need to be added to the agenda of the Fall 2015 meeting of the
Assessment System Review Panel. The Panel will need to devise a way to assess the
effectiveness of the services in the Care Center and the dual advisement process.
A possible way to do this is through student surveys similar to the ones used to assess
instructor effectiveness.
Previously, the Unit's areas for improvement (AFIs) were cited under 2b. Data Collection,
Analysis, and Evaluation. The AFI stated that the Unit's assessment data "were summarized but
not analyzed to provide valid, reliable, and consistent information about programs
and candidates. Since the last visit, the Unit has not regularly and systematically
analyzed assessment data to evaluate the efficacy of courses, programs, and field
experiences. A systematic approach is not evident."
Response:
In response to this AFI, the Unit has implemented procedures and programs to ensure
that data are collected, analyzed, summarized, and used for continuous improvement
in courses and in programs across the college.
The Unit maintains a data collection, analysis, and review plan that details when assessments
are administered, the frequency of data collection, the responsibility for data collection,
the frequency of data analysis and summary, the responsibility for data analysis and
summary, the responsibility for evaluation and monitoring of the use of data, and
how data are used (see exhibit 2.3.b.1 Assessment System Handbook).
Assessment data are collected at multiple points and analyzed by both individuals
and the Unit through faculty meetings, Assessment System Review Panel meetings, and
Data Days. Multiple assessments are used including both internal and external data.
For example, candidate data are used by programs to make decisions regarding candidate
admission, matriculation, and program completion (exhibit 2.3.b.2). Program assessments
are used internally to measure program quality and manage and improve Unit operations
and programs. SPA program reports are external evaluations used to strengthen the
overall performance of the Unit and ensure that graduates have the knowledge, skills,
and dispositions to meet program standards. SPA program approval reflects on the quality
of the Unit and its operations (exhibit 2.3.b.3). Employer surveys are used to ascertain candidate
proficiencies in the workplace as well as the quality of Unit and program operations. Follow-up surveys
also provide data for improvement of Unit operations.
Course and Instructor Evaluations are completed by candidates. Results of these evaluations
are shared with faculty members to improve the teaching and learning environment and
are used by departmental chairs during annual faculty evaluations and as an indicator
of Unit and program operations quality.
Faculty members submit the Annual Faculty Report. Faculty evaluations by department
chairs are conducted annually and feedback is used to improve faculty productivity
and to assist faculty in meeting tenure and promotion goals. Data also provide evidence
of Unit and program operations quality. Tenure-track faculty members are evaluated
for tenure and promotion according to criteria and procedures established in the GSU Faculty
Handbook. Faculty members are also evaluated by peers using the Faculty Peer Evaluation.
GSU supervisors and cooperating teachers are evaluated and data are used to make
future assignments and as an indicator of Unit and program operations quality (see
Student Teaching Handbook). These evaluations are completed at the end of each semester:
1) student teaching candidate evaluation of GSU supervisor and cooperating teacher,
2) cooperating teacher evaluation of GSU supervisor, and 3) GSU supervisor evaluation
of cooperating teacher.
The annual departmental goals and objectives form is used to guide the planning and
operations of each department and is used as an indicator of Unit and program operations
quality. Each fall, departmental faculty set goals, objectives, strategies, and performance
measures for the upcoming fiscal year and evaluate performance measures from the previous
year.
2b. Data Collection, Analysis, and Evaluation. The AFI also stated that "The Unit
did not provide evidence that information technology is used systematically across
all advanced programs for data collection, analysis, and evaluation. With the exception
of doctor of education in Curriculum and Instruction (C & I) and doctor of education
in EDLD, which are part of the Louisiana Education Consortium (LEC), the use of information
technologies is not consistently evidenced in the advanced teacher preparation programs.
At GSU, TaskStream is an information technology tool that is used by some but not
all advanced programs."
Response:
In fall 2009, an internal review of the Unit's operations was conducted. As a result,
the Unit adopted TaskStream for all courses that include a key assessment, with implementation
beginning in spring 2010. Although TaskStream was implemented Unit-wide, it remained widely
underutilized for several semesters. In spring 2014, the Unit was able to hire a full-time
Assessment Coordinator. The Coordinator
was able to provide additional training as well as a more systematic approach to data
collection through the use of TaskStream. Currently, all advanced programs that have
courses and data used for SPA reports have been loaded into TaskStream. Most of these
courses have begun collecting and analyzing data in TaskStream. Some are undergoing
initial construction of key assessments for their SPA, and others are being revised.
As revisions are made, courses are opened and/or updated in TaskStream. In addition,
beginning in fall 2015, instructors of both initial and advanced programs will schedule
a date for the Assessment Coordinator to facilitate the uploading of data for all
key assessment SPA courses. In this way, the Unit will ensure that all programs have
data housed and analyzed in the TaskStream system and thus data will become available
to the entire Unit for use in various ways (exhibit 2.3.b.1).