March 6, 2009
Bridge Program Evaluation Plan David Brigham, DCAS
The following evaluation plan for the Bridge Program was prepared at the request of the Office of Academic Affairs (OAA) at Empire State College by David Brigham, Director of College Academic Support, in consultation with Yong-Lyun Kim, Lead Research Analyst, Outcomes Assessment and Institutional Research. The plan contains a description of proposed evaluation activities for the following college terms: Term 1 (September 2008), Term 3 (January 2009), Term 5 (May 2009), and Term 1 (September 2009).
The purpose of this evaluation is to determine the extent to which the Bridge Program is meeting its goals and to track the academic progress of students participating in the Bridge Program. Evaluation data obtained at the end of each college term will be reviewed and used to make adjustments in the Bridge Program in an ongoing manner throughout the course of the evaluation (September 2008-December 2009).
This evaluation uses an emergent, formative design that will change in response to issues, concerns, or circumstances arising during the course of the evaluation. Possible design changes include but are not limited to revisions to program goals, the focus of the evaluation, evaluation questions, and types of data collected.
The Bridge Program is a 15-week program of instruction designed to strengthen the writing and reading of students identified as having near-college level skills in these areas. The program has its roots in Empire State College’s Front Porch Project, a college-wide initiative which began in 2003 with the intention of providing increased college support for students during their first two terms of enrollment. The local Directors of Academic Support developed the Bridge Program under the leadership of the Assistant Vice President of Academic Affairs for Academic Services.
The Bridge Program began enrolling students in September 2008 as a voluntary, non-credit, tuition-free program of instruction. As part of the process of applying for admission to the college, students complete a skills assessment administered by the Admissions Office. Based on the results of the admissions assessment, the Admissions Office refers students determined to have near-college level writing and reading skills to the Director of Academic Support (DAS) at the student’s center for review. The DAS reviews the student’s admission assessment and, if indicated, may do further testing with the student. In addition to new students, currently enrolled students may also enter the Bridge Program by referral to center DAS from a mentor or instructor or through self-referral.
In December 2008 the DAS proposed a college-wide approach to assessing the skills of students referred to them for possible placement in the Bridge Program. They recommended that all DAS administer the COMPASS reading assessment and the same writing assessment (to be determined) to student referrals. Students who score below established cut scores would be placed in the Bridge Program. This evaluation plan assumes this recommendation will be implemented.
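The proposed placement rule can be expressed as a simple decision procedure. The sketch below is illustrative only: the actual COMPASS and writing cut scores have not yet been established, so the threshold values shown are hypothetical placeholders.

```python
# Hypothetical sketch of the proposed college-wide placement rule.
# The cut scores below are illustrative placeholders; the actual
# COMPASS and writing assessment cut scores are yet to be determined.

READING_CUT = 80   # hypothetical COMPASS reading cut score
WRITING_CUT = 7    # hypothetical writing assessment cut score

def place_referral(reading_score: int, writing_score: int) -> str:
    """Return a placement decision for a referred student.

    Per the December 2008 DAS recommendation, a student scoring
    below either cut score is placed in the Bridge Program.
    """
    if reading_score < READING_CUT or writing_score < WRITING_CUT:
        return "Bridge Program"
    return "No Bridge Program placement"

# Example: a student below the reading cut score is placed.
print(place_referral(reading_score=75, writing_score=8))  # Bridge Program
```

Tracking outcomes for both placed and non-placed referrals, as described later in this plan, would allow these cut scores to be adjusted over time.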
In its current form, the instructional content of the Bridge Program consists of eight modules developed in ANGEL, the college’s online learning course management system. Modules contain learning objectives, readings, activities, discussions, and writing assignments. DAS teach the same Bridge Program content and administer the same writing and reading assessment at the end of the Bridge Program to measure skill development. This assessment generates one of four ratings: does not meet expectations, approaches expectations, meets expectations, or exceeds expectations. Students who receive a rating of meets expectations or exceeds expectations are considered to have satisfactorily achieved Bridge Program learning objectives.
Before the start of a college term, each DAS receives a copy of the ANGEL master to customize for the local center. The DAS works with students to decide on the appropriate delivery method, as the Bridge Program may be offered as an individualized study, group study, blended study, or fully online study. The Bridge Program is currently scheduled to run in terms 1, 3, and 5 each year.
This evaluation plan focuses on the following four aspects of the Bridge Program: placement and nonplacement, program completion and student achievement, academic progress, and program implementation. Exploring each aspect should yield information that will help determine the extent to which Bridge Program goals are being met and provide information that will enable changes to be made in the program to facilitate goal attainment.
Through the establishment of the Front Porch Project, Empire State College has made an institutional commitment to provide academic support to students in their first two terms of enrollment. The 2006-2010 Strategic Plan Implementation in Progress (March 2008) provides a checklist of objectives accomplished in several strategic areas including in the area of promoting student success by supporting “vulnerable” students in their first two terms of enrollment. Over the past two years, the college has also demonstrated its commitment to providing academic support for students by creating and filling eight DAS positions at the centers and establishing an Office of College Academic Support headed by a new Director of College Academic Support (DCAS). Among the DCAS’s responsibilities is implementing and monitoring the Bridge Program.
The Bridge Program is being implemented at a time of significant change at Empire State College. The Front Porch Project began under former President Moore, continued for one year under Acting President Joyce Elliot, and has now transferred to a new president, Alan Davis, who began his tenure in August 2008.
The Bridge Program is new to the college and represents a change in how students receive academic support. Before the DAS began working in the centers, academic support was provided primarily by mentors.
The following groups have a stake in the outcome of this evaluation. If the results of this evaluation are to be accepted and effectively implemented, it is important that the evaluation address their concerns. Stakeholders include students, DAS, mentors, and college administrators.
The following sets of evaluation questions and sub-questions focus on each of the four areas: placement and nonplacement, program completion and achievement, academic progress, and program implementation. These questions will drive data collection and analysis.
For the purposes of this evaluation, “placement” will be defined as student referrals that DAS designate for enrollment in the Bridge Program. “Nonplacement” will be defined as student referrals that DAS do not designate for enrollment in the Bridge Program. Based on secondary assessment test scores (COMPASS and writing assessment), DAS may recommend that students not placed in the Bridge Program receive another form of academic support or no additional academic support. Data will be collected on students in each group, those placed and not placed in the Bridge Program, so that the academic progress of both groups of students can be tracked and compared. Collecting this information will enable assessment cut scores to be adjusted, if necessary.
The following evaluation questions focus on student placement and nonplacement:
Of students placed in the Bridge Program, it is important to know the percentage of students who complete the program and how successful they were in achieving learning outcomes. It would also be useful to understand why some students fail to complete the program and why some students fail to achieve course outcomes. Therefore, this evaluation includes questions that seek to understand the characteristics of completers and non-completers as well as the characteristics of students who achieve learning outcomes and those who do not. In addition, since the Bridge Program could potentially be delivered in four modes (individualized, group study, blended, and online), and administered at up to eight different centers, information will be collected to detect patterns that may be associated with delivery mode or center.
The following evaluation questions focus on program completion and student achievement:
Tracking the progress of students who participate in the Bridge Program is a crucial aspect of this evaluation. The Bridge Program was implemented with the intention of facilitating the academic progress of students with near-college level skills through the first two terms of college, if not to graduation. Therefore it is necessary to monitor the short- and long-term progress of students placed in this program. In addition, it will be useful to track the short- and long-term progress of students referred to DAS for skills assessment but who were not placed in the Bridge Program. This information will help to inform whether the skills assessment cut scores need to be adjusted.
The evaluation of academic progress will use the college’s existing indicators of student progress as these are commonly understood and accepted by the stakeholders. In addition, using existing indicators facilitates data collection and reliability. These indicators will include: satisfactory academic progress (SAP), student grade point average (GPA), credits earned, and course or study grades.
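Of the indicators listed above, GPA is a credit-weighted average of grade points. The following is a minimal sketch of that calculation, assuming the common 4.0 grading scale; the college's actual grading scheme may differ.

```python
# Minimal sketch of the GPA indicator: a credit-weighted average of
# grade points. Assumes the common 4.0 scale for illustration; the
# college's actual grading scheme may differ.

def grade_point_average(studies: list[tuple[float, float]]) -> float:
    """studies: (credits, grade_points) pairs for each completed study."""
    total_credits = sum(credits for credits, _ in studies)
    if total_credits == 0:
        return 0.0
    weighted = sum(credits * points for credits, points in studies)
    return round(weighted / total_credits, 2)

# Example: 4 credits at 4.0 (A) plus 4 credits at 3.0 (B) yields 3.5.
print(grade_point_average([(4, 4.0), (4, 3.0)]))  # 3.5
```

Because indicators such as GPA and credits earned are already computed and stored by the college, the evaluation can draw on existing records rather than new instruments.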
The following evaluation questions focus on academic progress:
When new programs such as the Bridge Program are launched, it is important that the evaluation design provides a way to capture feedback from those in the field who are delivering the program (i.e., the DAS) and those who experience the program (students). This feedback is necessary to obtain information that will permit improvements to be made to successive program launches. Therefore, this evaluation plan includes questions that attempt to obtain feedback on specific aspects of Bridge Program implementation with an eye toward identifying and correcting implementation weaknesses. These areas include communication about the program, the student selection process, registering students for the program, distributing and setting up course shells, the instructional quality of the modules, the usefulness of assessments, and unanticipated issues.
The following questions focus on program implementation:
The Office of College Academic Support (OCAS) will manage the collection of information for this evaluation. The table below summarizes the types of information OCAS staff will collect to address the evaluation questions. The table is organized by evaluation focus (column 1) and information source (columns 2-5). Individual cells under the information sources indicate the collection method (e.g., questionnaire). OCAS will be responsible for developing collection tools and protocols and storing collected information in a secure place.
The Office of College Academic Support (OCAS), in collaboration with the Office of Outcomes Assessment and Institutional Research (OAIR), will be responsible for the overall analysis and interpretation of evaluation information. OCAS will organize and maintain descriptive and qualitative data and bring it to bear on the evaluation questions and subquestions. OCAS will request assistance from OAIR with the statistical analysis and interpretation of quantitative data and with reviewing OCAS's analysis and interpretation of evaluation information.
Question 1: What percentages of referrals were enrolled in the Bridge Program, recommended for another form of academic support, or found not to require academic support?
Method: frequency, content analysis
Question 2: What are the differences between the characteristics of student referrals placed in the Bridge Program, recommended for another form of academic support, and found not to require academic support?
Method: frequency, t-test of group differences
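The "t-test of group differences" named above can be sketched as follows. This is an illustrative computation of Welch's two-sample t statistic comparing a numeric characteristic (for example, prior credits earned) between placed and non-placed referrals; the sample values are invented for illustration, and in practice OAIR would run this analysis with standard statistical software.

```python
# Illustrative Welch's two-sample t statistic for comparing a numeric
# characteristic between two independent groups (e.g., referrals placed
# in the Bridge Program vs. those not placed). Sample data are invented.
from math import sqrt
from statistics import mean, variance

def welch_t(group_a: list[float], group_b: list[float]) -> float:
    """Welch's t statistic for two independent samples."""
    na, nb = len(group_a), len(group_b)
    va, vb = variance(group_a), variance(group_b)  # sample variances
    return (mean(group_a) - mean(group_b)) / sqrt(va / na + vb / nb)

placed = [12.0, 8.0, 16.0, 4.0, 10.0]        # hypothetical prior credits
not_placed = [20.0, 24.0, 16.0, 28.0, 22.0]  # hypothetical prior credits
print(round(welch_t(placed, not_placed), 2))  # -4.24
```

A large-magnitude t statistic such as this would suggest the two groups differ on the characteristic examined; the corresponding p-value and degrees of freedom would be obtained from the statistical package used.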
Question 1: What are the characteristics of Bridge Program completers and non- completers?
Method: frequency, t-test of group differences, content analysis
Question 2: To what extent do students who complete the Bridge Program improve their reading and writing skills?
Method: t-test of group differences, content analysis
Question 3: What percentages of students who complete the Bridge Program are rated at meets expectations or higher versus below meets expectations?
Question 1: To what extent is achievement level in the Bridge Program associated with short-term (two terms) and long-term (until graduation) academic progress?
Method: frequency, correlation, content analysis
Question 2: What is the short-term (two terms) and long-term (until graduation) academic progress of referrals recommended for another form of academic support and those found not to require academic support?
Method: t-test of group differences
Question 1: To what extent did centers implement the Bridge Program as planned?
Method: frequency, t-test of group differences, content analysis
Question 2: What aspects of the Bridge Program implementation need improvement?
Method: frequency, survey, interviews, focus group, content analysis
The reporting schedule for this evaluation seeks to meet the information needs of key evaluation stakeholders (students, DAS, mentors, and administrators) in a timely fashion, starting with the evaluation design itself and ending with the final report of evaluation results. Because this is a formative evaluation, reports will be issued frequently and may be revised to accommodate unanticipated circumstances. Report formats will vary with the purpose of the report and the audience. Planned reports include mid-term updates, end of term progress reports, and a comprehensive final report covering the full period of the evaluation. Various derivative reports such as PowerPoint presentations will be created as appropriate to communicate with various audiences.
The Director of College Academic Support will be responsible for managing evaluation activities. The following table displays major evaluation tasks and indicates who will carry them out. A detailed work plan including subtasks and due dates will guide this work.