Continuous Improvement Program

for

Undergraduate Education


Department of Electrical Engineering

University of Washington

Seattle, Washington, USA

Version 1.0

December 7, 1999

Table of Contents

1 Preamble
2 Overview
3 General Guidance for Review and Improvement Activities
4 Review and Improvement Schedule
5 Specific Guidance for Review and Improvement Activities
  5.1 Undergraduate Educational Program Constituents
  5.2 Undergraduate Education Mission and Objectives
  5.3 Undergraduate Education Outcomes
  5.4 Course Offerings
  5.5 Course Syllabi
  5.6 Group Curricula
  5.7 Department Curriculum
  5.8 Undergraduate Advising
  5.9 Faculty Competencies
  5.10 Classroom Facilities
  5.11 Laboratory Facilities
  5.12 Computing Facilities
  5.13 Financial and Institutional Resources
  5.14 Asynchronous Assessment Program
  5.15 Continuous Improvement Program
  5.16 Continuous Improvement Program Performance
6 Asynchronous Assessment Overview and Schedule
7 Specific Guidance for Asynchronous Assessment Activities
  7.1 Constituent, outcomes and objectives survey
  7.2 Course Review and Improvement Report
  7.3 Student Evaluation of Course Outcome Coverage
  7.4 Student Portfolio Assessment
  7.5 Student Grades
  7.6 Student Graduation Requirements Audit
  7.7 Outcome Achievement Surveys
  7.8 Alumni Surveys
Appendix A Example Course Offering Review and Improvement Report
Appendix B Constituents
Appendix C Mission and Objectives
Appendix D Outcomes
Appendix E Responsibilities by Position
Appendix F Program Summary

Record of Revisions

Nov. 1, 1999 Version 1.0α - Initial version to UGSC

Dec. 7, 1999 Version 1.0 - First version approved by the faculty. Minor changes in Table 1 and to R&I report contents and portfolio assessment instructions. Summary pages added.

 

  1. Preamble

    The Department of Electrical Engineering of the University of Washington is committed to educational excellence. We seek to achieve excellence through cutting-edge innovation and mastery of fundamentals. Innovation is guaranteed by the strong research orientation of the faculty. Fundamentals are addressed by this Continuous Improvement Program. The program mandates systematic review and improvement of every aspect of the undergraduate education program, and ensures institutional memory of improvement activities through documentation requirements.

    Improvement of undergraduate education is the core concept and guiding principle of the program. Stated simply, improvement comes first. Many aspects of the program are also informed by ABET Engineering Criteria 2000 requirements.

     

  2. Overview

    The Continuous Improvement Program (CIP) is composed of review and improvement (R&I) activities. These activities cover every element of the undergraduate education program. These elements range from individual course offerings to the entire undergraduate curriculum, and include the Continuous Improvement Program itself. Each program element has an R&I schedule, a person responsible for performing the R&I activity, and a reporting requirement.

    Review and improvement requires assessment to determine needed improvements, as well as proactive improvement. Some assessment activities are conducted asynchronously from review and improvement activities. These assessment results are archived and provided to the review and improvement activity at the appropriate time. Assessment may also be performed as part of a review and improvement activity, and documented in the review and improvement activity report.

    It is insufficient merely to schedule review and improvement activities. They must also be performed! Performance of review and improvement activities is monitored by the Continuous Improvement Program subcommittee (CIPSC) of the Undergraduate Studies Committee. The CIPSC keeps track of completion of scheduled activities through received review and improvement reports, and issues an annual performance report for the program. In addition to summarizing completed improvement activities, the report details uncompleted scheduled activities and the responsible individuals. The written report is issued to the faculty and Department Chair, and presented to department constituents.


    In monitoring review and improvement activities, the CIPSC's interest is to ascertain whether the activity has been performed and whether the associated report contains the required information. The CIPSC does not monitor the quality of the improvement activities. Existing processes in the department already address quality of work. For example, course revisions must be approved by the appropriate curriculum group, and by the UGSC. Curriculum revisions must be approved by vote of the faculty. Course offerings are evaluated by peer review of teaching, student course evaluations, and the promotion, tenure and salary evaluation process. Activities assigned to staff, such as advising and computing services, are similarly evaluated by the staff evaluation process. The performance record and the reports from review and improvement activities are of course available as additional inputs to these existing quality control processes.

    Those making improvements in the undergraduate educational program must be guided by the goals of the department. These goals, which appear in the form of mission, objectives and outcomes to conform to ABET Engineering Criteria 2000 (EC 2000), are themselves subject to assessment and review and improvement processes under the aegis of the CIP.

    Outcomes - student abilities in different areas - are addressed in the undergraduate courses that make up the undergraduate curriculum. Each course has specific outcomes it is expected to address. These are assigned in the department curriculum. The way the course addresses the outcomes is described in the Master Course Description document. Certain assessment activities measure the extent to which each course addresses its assigned outcomes.

    ABET EC 2000 imposes specific requirements on the CIP and on performance of activities under the CIP. Those performing review and improvement and assessment activities are expected to be fully aware of EC 2000 requirements, and to incorporate them into their activities. Many, perhaps all, of the EC 2000 requirements would exist in some form in a competent CIP. EC 2000 essentially codifies these requirements into one specific form.

     

  3. General Guidance for Review and Improvement Activities

This section contains general guidance on the performance of review and improvement (R&I) activities.

Persons assigned the responsibility for the review and improvement of a specific element of the undergraduate studies program ("improvers") should be aware of the periodic schedule for their R&I activity. The schedule indicates the longest time interval between R&I activities for a given program element. More frequent R&I activities are permitted. For example, if R&I for a course is scheduled for every five years, and the last R&I was done in 1999-2000, R&I must be done again in the 2004-05 school year. However, R&I could be done sooner, for example in 2001-02, perhaps in connection with a desire to proactively revise the course, or because of a curriculum change. In that case the next R&I would be due in 2006-07. The R&I schedule clock starts when the CIP is adopted by the faculty, or when a course or program element is first implemented.
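The schedule-clock rule above amounts to simple arithmetic. The following sketch (illustrative only, not part of the CIP) shows the rule, using the five-year course example from the text:

```python
# Illustrative sketch of the R&I schedule clock described above.
# Years are identified by the starting year of the school year
# (e.g., 1999 means the 1999-2000 school year).

def next_due_year(last_ri_year: int, period_years: int) -> int:
    """School year (by starting year) in which the next R&I is due.

    Performing R&I early simply restarts the interval; the next due
    date is always measured from the most recent R&I, whenever it
    actually occurred.
    """
    return last_ri_year + period_years

# Last R&I in 1999-2000 on a five-year cycle: due again in 2004-05.
assert next_due_year(1999, 5) == 2004
# R&I performed early, in 2001-02: the clock restarts, due in 2006-07.
assert next_due_year(2001, 5) == 2006
```

The point of the sketch is that the period is a maximum interval, not a fixed calendar slot: doing R&I early never creates an extra obligation, it only moves the next deadline later.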

The next step is to review the guidance in the CIP for performing the R&I activity, both this general guidance and specific guidance located in the specific R&I activity's section below. This guidance is just that - guidance, to be followed or superseded by creativity and innovation at the discretion of the improver. The requirements are that the R&I activities be performed in accordance with the schedule, and that R&I reports be written.

For high-periodicity program elements, specifically for course offerings, the Advising Office will automatically provide assessment materials to course instructors at the start of the quarter. Assessment results and previous R&I reports for other R&I activities can be obtained by requesting them from the Advising Office, which has the responsibility to archive and provide this material.

Improvers should review current department undergraduate program mission, objectives and outcomes to ensure their R&I activity is consistent with these guides.

Improvers should consider early in the process how they plan to involve constituents in their R&I activities. Close involvement of all classes of constituents, especially students and industry, is encouraged.

Improvers may wish to perform assessment activities specific to the R&I activity they are performing. For example, asynchronous assessment may reveal that students have relatively lower ability in a specific outcome. The improver may want to investigate the specific causes of this lower ability, first by talking to a few students, and then performing a survey designed specifically to verify the impressions gained from conversation, in order to make changes that will result in the most improvement.

Improvers will have to synthesize the assessment results when they are contradictory, confusing or incomplete. It is up to the improver to extract signal from noise, and in turn use the signal to generate improvement.

Reactive improvements, based on assessment results, should be supplemented by proactive improvements motivated by creativity and innovation to lead the moving target and advance our undergraduate program to where it should be in the future.

Improvers may want to consider the possibility of obtaining funding for innovative and creative program element improvement activities and related assessment activities. In addition to NSF funds from various programs, the University Tools for Transformation program is a good potential source of local funding for educational improvement activities. Improvers may also want to consider publication of their improvements.

In many, but not all, cases changes resulting from the R&I activity will require a formal approval process. For example, course revisions must be approved by the UGSC, and CIP changes must be approved by the faculty.

On completion of the R&I activity, the improver must write a report to document the activity. The report provides an institutional memory of R&I activities, communicating the basis for various changes in program elements to the future independent of the memory of individuals. The report also fulfills a key ABET requirement that the use of assessment input to improve achievement of objectives and outcomes be documented.

While there is an expectation that most R&I activities will result in some form of improvement, it is also expected that some reviews will show no need for change, or that desired improvements are beyond the scope of available resources. In these cases, there is still a need to write an R&I report synthesizing assessment and explaining the absence of need for improvement.

R&I reports for frequent R&I activities should be concise. An example R&I report for a course offering is in Appendix A. R&I reports for larger tasks should be correspondingly more detailed. The format of the R&I report is not as important as the content, although it would be convenient for future users if the format in the appendix was followed. The report should cover the content illustrated in the example in Appendix A.

The R&I report should be sent to the chair of the Continuous Improvement Program subcommittee (CIPSC). The CIPSC will review the reports for content, and forward them to the archives, where they will be important inputs to future R&I activities for the same program elements. The R&I report is the documentation of the completion of scheduled R&I activities, and so should be submitted prior to the R&I activity's scheduled due date.

 

  4. Review and Improvement Schedule

    This section lays out the detailed schedule of review and improvement activities. For each activity, the person responsible for performing the review and improvement activity, and for writing the associated report, is identified. The maximum time between review and improvement activities is identified. Table 1 summarizes the program elements and R&I schedule.

    Table 1 - Review and Improvement Schedule

    Program Element                                  Responsible Person           Schedule (Periodicity)
    Undergraduate Education Program Constituents     UGSC Chair                   3 years
    Undergraduate Education Mission and Objectives   UGSC Chair                   3 years
    Undergraduate Education Outcomes                 UGSC Chair                   3 years
    Course Offerings                                 Course Instructors           Every offering (2 wks after qtr)
    Course Syllabi                                   Course Coordinators          3 years
    Group Curricula                                  Group Chairs                 3 years
    Department Curriculum                            UGSC Chair                   3 years
    Undergraduate Advising                           Advisor                      Annual
    Faculty Competencies                             UGSC Chair                   Annual
    Classroom Facilities                             Assoc. Chair for Education   Annual
    Laboratory Facilities                            Laboratory Committee Chair   Annual
    Computing Facilities                             Computing Committee Chair    Annual
    Financial and Institutional Resources            Department Chair             Annual
    Asynchronous Assessment Program                  CIPSC Chair                  Annual
    Continuous Improvement Program                   CIPSC Chair                  Annual
    Continuous Improvement Program Performance       CIPSC Chair                  Annual

    For activities with a period of one or more years, the due date is the end of the spring quarter of the relevant school year, specifically by June 15. For the course offering activity, the due date is two weeks after the end of the quarter, by the university academic calendar.

    For major revisions such as course syllabi and curricula, the revision may require several years for implementation and assessment. ABET's six-year periodicity suggests this as the maximum time frame between R&I activities. Three years as a time frame seems to allow sufficient time for changes to take effect, yet still generates frequent attention to these program elements.

    A general recommendation is that group curricula R&I should occur in conjunction with, or the year after, department curriculum R&I, and that course syllabus R&I should in turn follow on group curriculum R&I.

    Assigned persons, while responsible for the R&I activity and associated report, may of course delegate their activities.

     

  5. Specific Guidance for Review and Improvement Activities

    5.1 Undergraduate Educational Program Constituents

      Review the constituent list in Appendix B. Consider whether constituents should be added or removed from the list. Update Appendix B as required. Review constituent involvement in the undergraduate program. Consider how constituent involvement can be improved.

      Assessment material includes annual surveys of faculty, students and industry concerning constituents.

    5.2 Undergraduate Education Mission and Objectives

      Review the Undergraduate Mission and Objectives in Appendix C. Consider whether the mission and objectives are consistent with College and University mission and objectives, and whether they reflect the needs of our constituents. Consider whether the objectives are consistent with ABET EC 2000 Criterion 2. Consider whether the mission and objectives are widely published. Update Appendix C as required.

      Assessment materials include annual surveys of faculty and constituents about mission and objectives.

    5.3 Undergraduate Education Outcomes

      Review the Undergraduate Education Outcomes in Appendix D. Consider whether they are consistent with ABET EC 2000 Criterion 3 and the department mission and objectives, and how well they meet the needs of our students. Consider adding, dropping or revising outcomes. Update Appendix D as required.

      Assessment materials include annual surveys of faculty and constituents about the importance and achievement of outcomes.

    5.4 Course Offerings

      Review how the course has previously been offered, and the results from those offerings. Consider how the outcomes the course is intended to address can be improved. Implement the improvements in your offering of the course. Write a short (1-2 page) Course Offering Review and Improvement Report focusing on the most significant improvements you made. Include the average grade in the course. See Appendix A for an example of a report.

      Assessment materials including previous R&I reports for your course and student evaluation information related to the course (not the instructor) will be provided to you at the start of the quarter. Additional materials may be requested from the Advising Office.

    5.5 Course Syllabi

      Review and improve the syllabus of your course. Check the department curriculum concerning outcome coverage required from your course. Ensure that the Master Course Description is up to date. Assessment materials include those available for the course offering, and alumni and exit surveys.

    5.6 Group Curricula

      Review and improve the curriculum of your curriculum group. Consider whether courses should be added or dropped, and the general contents of the courses. Consider the outcome coverage required by the department curriculum.

    5.7 Department Curriculum

      Review and improve the department curriculum, including graduation requirements. Consider how outcomes are addressed by the courses in the curriculum. Consider how the curriculum addresses the requirements of ABET EC 2000 Criteria 3 and 4, and the Electrical Engineering Program Criteria.

    5.8 Undergraduate Advising

      Review and improve undergraduate advising practices and resources. Spot check transcripts to ensure that graduation requirements are met and transfer policies are in force. Consider how advising practices address the requirements of ABET EC 2000 Criterion 1.

    5.9 Faculty Competencies

      Review the number and curricular capabilities of the faculty with respect to the needs of the undergraduate educational program. Consider how they address the requirements of ABET EC 2000 Criterion 5. Results should provide guidance on the number of faculty and on search areas.

    5.10 Classroom Facilities

      Review the classroom facilities, other than laboratory facilities, used by the undergraduate educational program, including location, availability of technology and classroom support services. Consider how they address the requirements of ABET EC 2000 Criterion 6.

    5.11 Laboratory Facilities

      Review the laboratory facilities, not including computer laboratories, used by the undergraduate educational program, including availability of technology and hardware support. Consider how they address the requirements of ABET EC 2000 Criterion 6.

    5.12 Computing Facilities

      Review the computing facilities used by the undergraduate educational program, including availability of hardware and software, and hardware and software support services. Consider how these address the requirements of ABET EC 2000 Criterion 6.

    5.13 Financial and Institutional Resources

      Review the institutional support and financial resources available to the undergraduate program. Consider how these address the requirements of ABET EC 2000 Criterion 7. One source of assessment input could be R&I reports from other program elements that report on improvements not made due to lack of support.

    5.14 Asynchronous Assessment Program

      Review the assessment activities conducted independently of other program element R&I. Consider whether new activities should be implemented, and old ones terminated. Examine R&I reports for use of existing assessments, and for R&I-related assessment techniques that should become asynchronous assessment. Consider how archiving and retrieval can be enhanced. Revise the assessment sections of the CIP.

    5.15 Continuous Improvement Program

      Review the overall Continuous Improvement Program. Consider how it can be changed to generate more improvement in the undergraduate program. Consider what items in the program should be added or removed. Revise this document.

    5.16 Continuous Improvement Program Performance

    Determine the level of performance of CIP activities, measured by percentage of R&I activities completed on schedule as documented by R&I reports. Identify responsible persons who have not performed scheduled R&I activities. Make changes to improve R&I performance. Report performance to the faculty, students and industry. Issue reminders of R&I activities due in the coming academic year.
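The performance measure described above is a straightforward calculation. The sketch below (illustrative only; the activity names are invented examples, not actual CIP data) shows how completion percentage and the list of overdue activities might be derived from the set of received R&I reports:

```python
# Hypothetical sketch of the CIP performance measure: the percentage of
# scheduled R&I activities completed on schedule, as evidenced by
# received R&I reports, plus the list of uncompleted activities.

def cip_performance(scheduled, reports_received):
    """Return (percent complete, list of activities with no report)."""
    completed = [a for a in scheduled if a in reports_received]
    missing = [a for a in scheduled if a not in reports_received]
    pct = 100.0 * len(completed) / len(scheduled) if scheduled else 100.0
    return pct, missing

# Invented example data for one academic year:
scheduled = ["Course Syllabi", "Undergraduate Advising", "Group Curricula"]
received = {"Course Syllabi", "Undergraduate Advising"}

pct, missing = cip_performance(scheduled, received)
assert round(pct) == 67
assert missing == ["Group Curricula"]
```

The `missing` list corresponds to the report's identification of uncompleted scheduled activities and, by extension, the responsible persons to whom reminders should be issued.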

  6. Asynchronous Assessment Overview and Schedule

    Assessment is a vital component of improvement. Without measurement of accomplishment, it is impossible to know whether something is improved - or whether it should be improved. Further, formal assessment provides the documentary evidence required by ABET EC 2000. Assessment methods must be efficient - they must return as much useful information as possible at minimum expenditure of time and effort. Further, they must measure the right things. These are not easy things to do, and the assessment program is expected to be the subject of continuous improvement in this direction.

    Assessment can be divided into asynchronous assessment, which is performed independently of review and improvement (R&I) activities, and assessment performed as part of R&I activities. In most cases the latter is left to the discretion of the person conducting the R&I activity. Specific guidelines are provided for assessment as part of the course improvement report.

    The assessment program laid out below seeks to obtain three broad categories of information. First, assessment activities seek to gather information from constituents about the identity of constituents, and about department objectives and outcomes. Second, assessment activities seek to determine whether outcomes that are supposed to be addressed in undergraduate courses are being addressed in those courses. Third, and most important, assessment activities seek to measure achievement of outcomes by our students. The outcomes primarily represent abilities that students are expected to have upon graduation, and in a few cases attitudes students should have.

    The first category, constituent input on constituents, outcomes and objectives, represents expressions of opinion by our constituents, and is therefore ideal for survey techniques.

    The second category, determining whether outcomes scheduled to be addressed in courses are addressed in courses, is assessed by a combination of auditing and surveys. Course instructors are required to write an R&I report on each course offering indicating how they have improved the way the outcomes assigned to the course are addressed. Students in every course are surveyed to determine their opinion of how well outcomes are addressed in the course. Student work collected from courses is audited to determine whether outcomes are addressed.

    The third category, determining how well outcomes are achieved, is assessed by a combination of grading, auditing, student surveys, alumni surveys, and portfolio assessment.

    Table 2 gives the schedule for asynchronous assessment activities. The Continuous Improvement Program Subcommittee is responsible for monitoring the accomplishment of assigned assessment activities.


    Table 2 - Assessment Activity Schedule

    Assessment Activity                                       Responsible Person                    Schedule (Periodicity)
    Constituent, outcomes and objectives survey of students   CIPSC Chair                           Annual
    Constituent, outcomes and objectives survey of industry   CIPSC Chair                           Annual
    Constituent, outcomes and objectives survey of faculty    CIPSC Chair                           Annual
    Course Improvement Report                                 Course Instructors                    Every offering*
    Student Evaluation of Course Outcome Coverage             Course Instructors                    Every offering**
    Student Portfolio Assessment                              Course Coordinators                   Annual
    Student Grades                                            Course Instructors                    Every Course
    Student Graduation Requirements Audit                     Advisor                               Before Graduation
    Junior Outcome Achievement Survey                         Advisor                               Annual
    Senior Outcome Achievement Survey                         Advisor                               Annual
    Graduating Senior Exit Survey                             Advisor                               Annual
    One-Year Alumni Survey                                    Office of Educational Assessment***   Biannual
    Five-Year Alumni Survey                                   Office of Educational Assessment***   Biannual

    * - This is the same activity as the Course Offering R&I activity

    ** - In conjunction with student course evaluations

    *** - The Office of Educational Assessment is not bound by this schedule

    Assessment results are archived by the advising office and supplied on request to review and improvement activities. Assessment results for course offering review and improvement activities are automatically supplied to course instructors at the beginning of the course.

     

  7. Specific Guidance for Asynchronous Assessment Activities

    7.1 Constituent, outcomes and objectives survey

      This survey is performed annually. The same survey is given to accessible major constituent groups of the department, presently identified as undergraduate students, industry in the form of the Corporate Advisory Board, and the faculty of the department. The survey asks respondents to indicate their opinion of the importance of current department constituents, outcomes and objectives, and to suggest other constituents, outcomes or objectives that the department should adopt. The results are directly useful to the Constituents, Objectives and Outcomes review and improvement activities. The surveys are conducted by the Continuous Improvement Program subcommittee.

       

    7.2 Course Review and Improvement Report

      The Course Review and Improvement Report is the R&I report associated with the Course Offering R&I activity, written by the course instructor at the end of each course offering. This report summarizes assessment results known prior to each course offering and gives course improvements in the context of the outcomes addressed by the course. A series of these reports is valuable information for new course instructors, for providing a historical perspective on course improvement, and as input to course syllabus and curriculum revision.

       

    7.3 Student Evaluation of Course Outcome Coverage

      In each undergraduate course, students are asked to rate the degree to which the course addressed each of the department outcomes. This information is gathered using the optional questions on the back of the student course evaluations. All undergraduate courses, except seminars and individual project work, must complete course evaluations. Advising is responsible for distributing the list of questions, in overhead form, to course instructors when course evaluations are distributed. Course instructors are responsible for showing the questions to the students and asking them to answer them. The information is provided to course instructors the next time the course is offered, and is also used in syllabus and curriculum revision.

       

    7.4 Student Portfolio Assessment

      Course instructors in all undergraduate courses, with the exception of seminars and individual project courses, are asked to copy student work handed in for each assignment, including homework, laboratory reports, examinations and quizzes, and course projects. For each assignment, three students should be selected, one whose work represents high achievement, one whose work represents low achievement, and one whose work represents average achievement. A copy of the assignment should also be provided. The selected students need not be the same from assignment to assignment.

      Annually, the course coordinator for each course should meet with that year's course instructor(s) to review collected student work. The review should address the following questions:

      1. Does the work show that the course is addressing the outcomes assigned to it in its Master Course Description?

      2. On an absolute scale, rate the achievement of each assigned outcome by the students as unacceptable, acceptable, or exemplary, in comparison to the ability level expected of a recently graduated engineer with a BS degree. The suggested approach is to separate the work into high, average and low sets, then for each person at the meeting to rate each set separately, then to discuss any major differences in rating.

      3. For concepts or problems that students had difficulty working with or doing correctly, suggest teaching ideas or syllabus improvements that would reduce the number of students having difficulty.

      The course coordinator writes a summary report of the portfolio assessment, reporting on the results of answering these questions.

      The results are important for a variety of assessment activities.

       

    7.5 Student Grades

      Student grades are assigned to students in every course by course instructors. Grades are generally a relative measure. Moreover, grades represent a composite measure of outcome achievement, rather than being associated with specific outcomes. While students can judge their improvement by increased grades, the program cannot say that its outcomes are being better achieved if average grades increase. However, course grades below 2.0 (C on the letter grade scale) are an indication of unacceptable achievement of the outcomes addressed by a course.
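The 2.0 threshold described above is the only absolute signal grades provide. A minimal sketch (illustrative only; the grade data is invented, not actual student records) of applying it:

```python
# Illustrative sketch of the grade threshold described above: course
# grades below 2.0 (C) indicate unacceptable achievement of the
# outcomes addressed by the course. Grade values are invented examples.

GRADE_FLOOR = 2.0  # C on the numeric grade scale

def flag_unacceptable(grades):
    """Return the grades that fall below the 2.0 floor."""
    return [g for g in grades if g < GRADE_FLOOR]

# A 2.0 itself is acceptable; only grades strictly below the floor flag.
assert flag_unacceptable([3.5, 1.8, 2.0, 0.7]) == [1.8, 0.7]
```

Note that, per the text, this is a one-sided indicator: grades above the floor say nothing about which specific outcomes were achieved, only that achievement was not unacceptable overall.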

       

    7.6 Student Graduation Requirements Audit

      Advising is responsible for ensuring that students have taken all courses needed to meet graduation requirements, and thus that all outcomes have been addressed in their courses. Advising is also responsible for ensuring that grade requirements for graduation are met. Note that minimum grades control unacceptable outcome performance. Advising will write an annual report on problems meeting graduation requirements, which will be input to the department curriculum R&I activity.

       

    7.7 Outcome Achievement Surveys

      Advising will administer an annual survey to juniors, seniors and exiting students, asking them to evaluate their level of confidence in their abilities with respect to each of the department outcomes, in comparison to what they would expect of a recently graduated electrical engineer. The survey may be conducted by email, with the same survey sent to each undergraduate. Additional information may be sought in the exit survey.

       

    7.8 Alumni Surveys

The Office of Educational Assessment (OEA) conducts biannual surveys of recently graduated students, eleven months after they receive their BS degrees. While the questions are directed at all students, the surveys are separated by department. The outcome-related questions in the existing survey map fairly well to a number of department outcomes.

OEA also conducts biannual surveys of alumni five years after graduation, with a similar overlap to department outcomes.

The Office of Educational Assessment is not a part of the Department of Electrical Engineering and therefore is not bound by the requirements of this plan. In the event that OEA decides to discontinue its alumni surveys, the Continuous Improvement Plan will be revised to provide a replacement assessment activity.

 

Appendix A Example Course Offering Review and Improvement Report

Course Offering Review and Improvement Report

Date: April 5, 1998

Course: EE 455 Power System Analysis II

Instructor: Christie

Quarter: W98

Avg grade: 3.12

Available assessment material included student course evaluations from three previous offerings. Review of these materials indicated that many of the students had difficulty using the assigned transient stability software. This falls under outcome (k), use of modern engineering tools.

The lecture associated with the transient stability computer homework was upgraded to include an in-class demonstration of the computer tool, with a chance for students to ask questions about its operation.

A guest lecturer from Puget Sound Energy gave one lecture on practical transient stability assessment. In conversation, he suggested improving the coverage of time domain stability in the course. The material on theoretical multi-machine stability was abridged from two hours to one, and the recovered hour was spent on time domain analysis, using class notes.

Course evaluations at the end of the course indicated that students were more comfortable with the computer homework, although a number of them still had difficulty with the design part of the problem. More emphasis on design techniques may be warranted. I was able to ask a time domain analysis question on the final. Use of a different stability analysis tool might be beneficial.

 

Appendix B Constituents

Based on faculty surveys taken in the fall of 1999, the department's major constituents are:

Minor constituents include:

 

Appendix C Mission and Objectives

The mission and objectives of the undergraduate program of the Department of Electrical Engineering were drafted by the Undergraduate Studies Committee during the 1998-99 school year. Membership included a student representative. The mission and objectives were then reviewed and approved by the Corporate Advisory Board, and adopted on May 18, 1999 by a vote of the Electrical Engineering faculty. They may be found in the Undergraduate Handbook, which is supplied to each undergraduate and also posted on the EE department web page. This text is controlling.

UW Electrical Engineering Undergraduate Program Mission & Objectives

Adopted by the EE Faculty, May 18, 1999

The mission of the undergraduate program of the Department of Electrical Engineering at the University of Washington is excellence in undergraduate education. We shall be among the best in the quality of our undergraduate program, preparing our graduates for successful careers in engineering, postgraduate education, and life-long learning.

Our program has been carefully designed to provide our students with excellent classroom and laboratory instruction. Our educational mission shall be fulfilled by meeting the following set of objectives. Our graduates will:

  1. Be instructed by outstanding faculty, whose expertise covers a wide range of specialties, and who actively participate in advanced research and development;
  2. Learn the fundamentals of electrical engineering through a broad set of required core courses that apply science and mathematics to engineering and require effective oral and written communications;
  3. Apply engineering fundamentals to a selected specialty of electrical engineering, culminating in a significant design experience;
  4. Apply a variety of modern software tools and laboratory equipment to engineering design and analysis in an environment that emphasizes teamwork;
  5. Explore the opportunity for significant extra-curricular undergraduate experience, through participation in research projects, industrial co-op, EE student organizations, and engineering service to the community, to better understand the societal impact of engineering activities; and
  6. Exhibit the creativity and innovation needed for life-long learning in the rapidly changing field of electrical engineering.

 

 

Appendix D Outcomes

The outcomes mandated by ABET EC 2000 have been adopted verbatim. Any set of department outcomes must include or subsume the ABET-mandated outcomes; confusion is averted by simply adopting them. To date, no additional outcomes have been generated by the outcome review and improvement activity, although additional outcomes may appear in the future. The correspondence between the pre-existing alumni survey outcome set and the ABET outcomes suggests that there is no major gap in the ABET outcomes. The outcomes are:

(a) an ability to apply knowledge of mathematics, science, and engineering

(b) an ability to design and conduct experiments, as well as to analyze and interpret data

(c) an ability to design a system, component, or process to meet desired needs

(d) an ability to function on multi-disciplinary teams

(e) an ability to identify, formulate, and solve engineering problems

(f) an understanding of professional and ethical responsibility

(g) an ability to communicate effectively

(h) the broad education necessary to understand the impact of engineering solutions in a global and societal context

(i) a recognition of the need for, and an ability to engage in life-long learning

(j) a knowledge of contemporary issues

(k) an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice.

Note that the Electrical Engineering program-specific guidelines add more detail to outcome (a) by requiring the knowledge of mathematics and science to include

(a1) an ability to apply knowledge of probability and statistics, including applications appropriate to the program name and objectives

(a2) an ability to apply knowledge of mathematics through differential and integral calculus, basic sciences, and engineering sciences necessary to analyze and design complex electrical and electronic devices, software, and systems containing hardware and software components

(a3) an ability to apply knowledge of advanced mathematics, typically including differential equations, linear algebra, complex variables, and discrete mathematics

 

Appendix E Responsibilities by Position

This table lists, for each position, the responsibilities identified in the review and improvement and asynchronous assessment activity tables, together with other responsibilities associated with the Continuous Improvement Program.

Responsible Person               | Activity                                                  | Schedule                 | Section
---------------------------------|-----------------------------------------------------------|--------------------------|----------
Advisor                          | Student Graduation Requirements R&I                       | Before Graduation        | 7.6
Advisor                          | Junior Outcome Achievement Survey                         | Annual                   | 7.7
Advisor                          | Senior Outcome Achievement Survey                         | Annual                   | 7.7
Advisor                          | Graduating Senior Exit Survey                             | Annual                   | 7.7
Advisor                          | Undergraduate Advising R&I                                | Annual                   | 5.8
Advisor                          | Archive and Retrieve Assessment Materials and R&I Reports | Quarterly, and on demand | 3, 5.4, 6
Assoc. Chair for Ed.             | Classroom Facilities R&I                                  | Annual                   | 5.10
CIPSC Chair                      | Constituent, outcomes and objectives survey of students   | Annual                   | 7.1
CIPSC Chair                      | Constituent, outcomes and objectives survey of industry   | Annual                   | 7.1
CIPSC Chair                      | Constituent, outcomes and objectives survey of faculty    | Annual                   | 7.1
CIPSC Chair                      | Asynchronous Assessment Program R&I                       | Annual                   | 5.14
CIPSC Chair                      | Continuous Improvement Program R&I                        | Annual                   | 5.15
CIPSC Chair                      | Continuous Improvement Program Performance R&I            | Annual                   | 5.16
Comp. Comm. Chair                | Computing Facilities R&I                                  | Annual                   | 5.12
Course Coordinators              | Course Syllabi R&I                                        | 3 year                   | 5.5
Course Coordinators              | Student Portfolio Assessment                              | Annual                   | 7.4
Course Instructors               | Course Offerings R&I and Course Improvement Report        | Every offering           | 5.4, 7.2
Course Instructors               | Student Evaluation of Course Outcome Coverage             | Every offering           | 7.3
Course Instructors               | Student Grades                                            | Every offering           | 7.5
Department Chair                 | Financial and Institutional Resources R&I                 | Annual                   | 5.13
Group Chairs                     | Group Curricula R&I                                       | 3 year                   | 5.6
Lab. Comm. Chair                 | Laboratory Facilities R&I                                 | Annual                   | 5.11
Office of Educational Assessment | One-Year Alumni Survey                                    | Biannual                 | 7.8
Office of Educational Assessment | Five-Year Alumni Survey                                   | Biannual                 | 7.8
UGSC Chair                       | Constituents R&I                                          | 3 year                   | 5.1
UGSC Chair                       | Mission and Objectives R&I                                | 3 year                   | 5.2
UGSC Chair                       | Outcomes R&I                                              | 3 year                   | 5.3
UGSC Chair                       | Department Curriculum R&I                                 | 3 year                   | 5.7
UGSC Chair                       | Faculty Competencies R&I                                  | Annual                   | 5.9

Appendix F Program Summary