ECP Assessment Practices
The MSE/ESM Engineering Communications Program conducts two types of regular assessments:
- Annual assessments of senior design portfolios against ABET student outcomes for both departments
- Biennial assessments (every other summer) of the MSE Communications Portfolios against selected program outcomes
The sections below describe these assessments in more detail.
Background: In 2005, through the MSE/ESM Engineering Communications Program (ECP), we began asking departmental advisory boards to review a subsample of the capstone design portfolios against the ABET Criterion 3 student outcomes (a-k, along with discipline-specific outcomes). In the spring of 2006, after completing the process twice (once in MSE and once in ESM), Drs. Marie Paretti (ECP Director), Steven Kampe (MSE Asst. Dept. Head), and Scott Case (ESM Asst. Dept. Head) received a grant from the Virginia Tech Center for Excellence in Undergraduate Education to formalize the method and validate the findings. With this funding, we have revised the scoring rubrics (see attachments) to include concrete rankings and tested the revised rubric with both departmental advisory boards.
Description: For the purposes of assessment, students’ capstone design portfolios are subsampled to include projects at the low, middle, and high end of the grading spectrum; because of the small number of projects, we typically review 50% or more of the projects each year. Should a project fail (which happens rarely), that project is included in the subsample. Each portfolio includes the following materials:
1. A written proposal, PowerPoint slides, and a presentation video submitted/presented to the course instructors and the team’s faculty advisor at the beginning of the project to gain approval to move forward.
2. A written report, PowerPoint slides, and a presentation video submitted/presented to the course instructors and the team’s faculty advisor in December to report progress. These reports include work breakdowns for each team member.
3. A written report, PowerPoint slides, and a presentation video submitted/presented to the course instructors in February to report progress. These reports include work breakdowns for each team member.
4. A written report, PowerPoint slides, and a presentation video submitted/presented to the course instructors in March to report progress.
5. A final written report, PowerPoint slides, and a presentation video submitted to the course instructors at the conclusion of the project in April.
Approximately 2-3 weeks prior to the annual Advisory Board Meeting, board members are each sent a set of CDs containing folders for each project included in the subsample; each folder contains items 1-5 (without any grades or faculty comments). They also receive an instruction letter and a copy of the evaluation rubric. Each advisory board member is asked to review one portfolio in detail, and to skim the other portfolios as time permits, prior to the board meeting. At least two advisory board members are assigned to each portfolio to check for agreement across evaluators.
During the annual board meeting, we allot 1-2 hours to the portfolio review. The board addresses each portfolio in turn; the ECP Director and the Associate Department Head take notes on the discussion for review. Board members submit completed rubrics for later analysis, and the meeting concludes with the board drafting a letter summarizing the review.
To obtain copies of the evaluation rubrics, please send a request to Dr. Marie C. Paretti.
Limitations: This assessment method provides external programmatic review. Because the capstone design projects are team projects, the advisory board largely evaluates each project as a whole. However, because the presentation videos include every member of the team and the progress reports provide work breakdowns by team member, the board members also provide some degree of feedback on individuals, noting students who appeared more or less competent than their team members.
Moreover, because of the small sample size (e.g., 5 portfolios were reviewed in 2006, encompassing 10 students), meaningful quantitative analysis is not viable. Although quantitative results could be tabulated, the low n makes tests of statistical significance problematic. Instead, the findings are primarily qualitative. However, the strong agreement among board members regarding the quality of each project, both on the rubrics and in discussion, along with the close alignment between board rankings and rankings by course grade, supports the viability of the method.
Every other summer, a team of external evaluators assesses a purposive subsample of the communications portfolios completed in MSE 4900. The subsample is designed to include high-, middle-, and low-scoring students based on average grades across the ECP curriculum. The evaluators are expert technical communications instructors; typically, they are VT faculty who teach English 3764: Technical Writing, a course that serves students across the university.
The evaluators score each portfolio using a rubric that addresses approximately one-third of the ECP outcomes; correctness and conciseness are addressed in each assessment, and the other outcomes are rotated to ensure that over a six-year period, all outcomes are assessed. Each outcome is scored on a 5-point scale, with 1 as the lowest and 5 as the highest possible score (a score of 0 indicates the evaluator did not have enough information to assess the item; to date, no scores of 0 have been reported).
To norm the scoring, all three evaluators, along with the Program Directors and Assistant Director, read and score sample portfolios. In any area where evaluators differ by more than one point, the evaluation team discusses the criteria and reaches a consensus regarding scoring standards.
All portfolios in the sample are then scored by two evaluators and the scores averaged. If the two evaluators differ by more than one point on more than one question, the portfolio is scored by a third evaluator, and all three scores are averaged.
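The averaging and third-evaluator escalation rule described above can be sketched in code as follows. This is a minimal illustration only; the function names and data layout are hypothetical and not part of the ECP procedure itself.

```python
def needs_third_evaluator(scores_a, scores_b):
    """Return True if the two evaluators differ by more than 1 point
    on more than 1 rubric question."""
    large_gaps = sum(1 for a, b in zip(scores_a, scores_b) if abs(a - b) > 1)
    return large_gaps > 1

def portfolio_score(scores_a, scores_b, scores_c=None):
    """Average two evaluators' per-question scores; if they disagree
    too often, average all three evaluators' scores instead."""
    if needs_third_evaluator(scores_a, scores_b):
        if scores_c is None:
            raise ValueError("third evaluator required for this portfolio")
        evaluators = [scores_a, scores_b, scores_c]
    else:
        evaluators = [scores_a, scores_b]
    # Average across evaluators, question by question.
    return [sum(q) / len(evaluators) for q in zip(*evaluators)]

# Evaluators agree within 1 point everywhere, so only two scores are averaged.
print(portfolio_score([4, 3, 5], [5, 3, 4]))  # → [4.5, 3.0, 4.5]
```

Escalating only when disagreement spans multiple questions (rather than on any single gap) keeps the third evaluator's workload limited to genuinely contested portfolios.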
For details on the evaluation rubrics currently in use, please contact Dr. Marie C. Paretti.