A Database System for IS Curriculum Assessment using ISA Examination Performance

Mark Segall
segallm@mscd.edu
Metropolitan State College of Denver
Denver, CO 80217 USA

Biswadip Ghosh
bghosh@mscd.edu
Metropolitan State College of Denver
Denver, CO 80217 USA

Joe Morrell
morrellj@mscd.edu
Metropolitan State College of Denver
Denver, CO 80217 USA

Abstract

This research paper describes the design and proposed use of a database-driven system for conducting curriculum assessment activities using data from student performance on the national ISA examination. The system goes beyond existing web site capabilities by supporting a two-step mapping from ISA exam questions to objectives in individual courses in a CIS curriculum, via the IS 2002 model curriculum Learning Units (LUs). The system allows access and updates by multiple faculty members and provides a means of proposing and tracking course changes over time. It also produces a variety of reports that can be used in the ABET reaccreditation cycle.

Keywords: Accreditation, ABET, IS 2002 Curriculum Model

1. Introduction

The ISA national exam is an annual exam that Information Systems (IS) majors can take to receive national certification of their IS skills after completing an IS degree. The exam is part of an overall effort to assess the knowledge and practical readiness of IS students and professionals. Its purpose is to assess individual student performance in six skill areas defined in the IS 2002 model curriculum (Gorgone et al. 2002) and in IS entry-level job ad criteria, and it represents what IS students need to know upon graduation (Landry, Longenecker, Haigood, and Feinstein 2000).

Recently, information systems programs have been pursuing accreditation under ABET guidelines to enhance program quality and appeal to stakeholders (ABET CAC 2002). The ABET guidelines call for continuous assessment of the curriculum through measurement of the outcomes of the program's courses and course objectives, as well as overall program objectives and outcomes. The periodic ABET recertification process also requires documentation of such assessment efforts and of any curriculum or course changes made as part of the assessment process. These assessment activities require the efforts of a large group of faculty members, each teaching one or more courses, and can be coordinated more effectively if a suitable information system is put into practice.

The IS 2002 model curriculum and its web-based mapping tools (www.iseducation.org) provide a framework for mapping the curriculum learning units (LUs) to course-level objectives in an institution's curriculum of courses (Daigle et al. 2004). This mapping assigns a value from 1 to 4 for the depth of coverage of the LU in the course. Such mappings have been used by IS faculty in curriculum design and change and have been reported in the IS education literature. Landry et al. (2004) used course-LU mappings to compare two course offerings in order to decide on a required course for their IS program. Other possible uses of the mapping in curriculum assessment include objectively measuring overlaps in LU coverage among courses, finding coverage gaps in courses, and objectively comparing courses.

The ISA exam can be a useful tool for assessing the curriculum in CIS programs (Longenecker, McKell and White, 2007). Assessing student performance on selected exam questions may provide a more detailed assessment of the institution's CIS curriculum and of candidate courses or teaching methods.
The student scores attained on the exam vary widely, even among students graduating from the same institution (Segall, Gollhardt and Morrell, 2007). Segall, Gollhardt and Morrell (2007) found several factors outside the program of study that can explain the variability in student scores: student age, ESL (English as a Second Language) status, and physiological status on the day of the exam. The study also found that student success (attained grade) in certain courses can account for a large portion of the variance in exam scores. This raises the question: "Can performance on certain questions of the ISA exam be used to evaluate an institution's curriculum and teaching methods, and to make recommendations to faculty for improving individual courses and/or the IS program?"

2. Curriculum Assessment

Typically, curriculum assessment should answer the following questions (Acharya, 2003): (1) What do we want the students to learn? (2) Why do we want them to learn it? (3) How can we help them to learn it? (4) How do we know what they have learned?

The first two questions can be answered using the IS 2002 model curriculum and reviews of the learning units with stakeholders such as employers, former students, faculty and college administrators. However, the latter two questions require more planning and effort. Generally, curriculum objectives are achieved through course objectives, and individual courses can address one or more curriculum objectives at different coverage depths. By measuring students' proficiency on the curriculum objectives we answer question 4, "How do we know what they have learned?" This is a prerequisite for a robust curriculum assessment process, which must close the loop and provide feedback to instructors about objectives that are not being met. This feedback gives instructors the opportunity to modify, add or drop teaching strategies to help students learn all of the curriculum objectives.

Given the comprehensive nature of the ISA exam, it is a unique instrument for answering these assessment questions. Therefore, our primary objective is to devise a database-driven system that can help analyze a selected subset of ISA exam questions, on which our students did well or poorly, in order to assess and improve our IS curriculum. The system will store the mapping of IS 2002 LUs to the courses in our curriculum where they are covered and at what level (1-4). Individual faculty members are given rights to view and manage the mappings and to propose changes in their courses, which can be tracked using the system. Moreover, after each ISA exam, the exam questions and their mapping to the LUs will be loaded into the system along with student performance on each question, to produce reports on problem or success areas in courses on which to focus assessment efforts. The purpose of this paper is to describe the database design for a system whose objectives are described below.

3. Detailed System Objectives

The detailed objectives of this system, which go beyond currently available alternatives, include the following:

1. To allow teaching faculty to map IS 2002 curriculum objectives to objectives in their course(s), and to review and change the mapping. Note that the existing web sites do not allow multiple faculty members from a single institution to be administered into the system with change permissions.

2. To map ISA exam questions to individual courses in the CIS curriculum. The ISA exam web site shows the relation of ISA exam questions to LUs in the IS model curriculum. The isedg.org website allows mapping the IS model curriculum to the CIS program at our college. However, the two-step mapping from ISA exam questions to course objectives is not possible with these tools; the new system is needed to achieve this two-step mapping from ISA exam questions to the CIS program courses (a minimal sketch of this mapping appears after this list).

3. To convey reports to teaching faculty on student performance on ISA exam questions (from one or more semesters) that map to objectives in their course(s).

4. To provide a way to set and store goals on curriculum objectives in the system, entered by individual faculty members as they review test results. The system provides a historical view of test results and of the course changes implemented as a result of the exam results.

5. To allow faculty to track changes in ISA exam performance over time for questions that are related to their course(s), to see the results of changes in teaching methods that they have instituted and recorded in the system, and to produce reports of course changes and the assessment cycle for accreditation review purposes. Such reports of assessment activities are important input into the ABET reaccreditation process.
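To illustrate the two-step mapping of objective 2, the following minimal Python sketch resolves an exam question to its LU and then to the local courses that cover that LU at some depth. The question identifiers and course numbers are hypothetical placeholders, and the in-memory dictionaries merely stand in for the database tables described in Section 5; only the two LU codes and titles are taken from the IS 2002.1 mapping in the Appendix.

```python
# Illustrative two-step mapping: ISA exam question -> IS 2002 LU -> local course.
# All question IDs and course numbers below are hypothetical placeholders.

# Step 1: each exam question is tied to an IS 2002 Learning Unit (per the exam web site).
question_to_lu = {
    "Q17": "IS.1 (7)",   # IT Hardware and Software
    "Q42": "IS.1 (12)",  # Ethics and the IS Professional
}

# Step 2: each LU is covered by one or more local courses at a depth of 1-4
# (per the faculty mapping forms).
lu_to_courses = {
    "IS.1 (7)": [("CIS 1010", 3)],
    "IS.1 (12)": [("CIS 1010", 2), ("CIS 4050", 4)],
}

def courses_for_question(question_id):
    """Resolve an exam question to the local courses that cover its LU."""
    lu = question_to_lu.get(question_id)
    return lu, lu_to_courses.get(lu, [])

if __name__ == "__main__":
    for q in ("Q17", "Q42"):
        lu, courses = courses_for_question(q)
        print(q, "->", lu, "->", courses)
```

In the database itself, the same resolution corresponds to joins across the exam question, learning unit, and course-LU tables described in Section 5, rather than dictionary lookups.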
4. Profile of Metro State

Metro State follows the published IS model curriculum, IS 2002. The required course sequence for the Major in CIS degree has been mapped to the IS 2002 curriculum using the mapping tools on the www.iseducation.org web site. The CIS program at Metro State has received ABET accreditation and draws a diverse pool of students. Metro State is an urban institution which serves non-traditional, working students. The student demographics were reported by Segall, Gollhardt and Morrell (2007) and can be summarized as follows. The average student age is roughly 30 years, with approximately 15% of students reporting English as a Second Language (ESL). A large set of IS degree-seeking students work in the IT, retail, government and banking industries. This diversity of the student base calls for rigor in assessment activities, to ensure that best efforts are made to enhance student learning across the spectrum using a diverse set of methods.

5. Database Design

The database was created by taking data from the www.iseducation.org website and is implemented as an Access 2007 database. The process of populating data into the IS Education website starts with typing in each course taught in the program. When students register for the ISA exam they are asked to select which courses they took and the grade they received in each course. The next step is to map the courses taught to the 150 LUs of the IS 2002 model curriculum. Faculty members who teach each course are asked to fill out a form that describes the level of coverage for each of the LUs in their course. Each course can cover material from more than one of the ten IS 2002 courses. Figure 1 (see the Appendix) shows the mapping document for the LUs associated with the IS 2002.1 "Fundamentals of Information Systems" course. This mapping is useful for identifying the coverage of each LU across the entire curriculum, and the IS Education website provides reports identifying which LUs are covered by the courses in the program's curriculum.

One note about the ISA exam is that not all LUs are covered on the exam. Each LU that is tested has at least 4 questions for increased reliability, so only about 40 percent of the LUs are covered on the exam. Covering every single LU would require 600 questions (4 questions x 150 learning units) and would take more than the 3 hours allotted for the current version of the exam. When students take the exam, the average scores for the school (Local) and for all students taking the exam (National) are reported for each semester.

The relationship diagram for the database is shown in Figure 2. The three main elements of data taken from the IS Education website for the assessment database are the Courses, Learning Units, and Exam Results. The course information was copied and pasted into the Course table from the Local Courses web page. The Learning Units data was first copied into an Excel spreadsheet from the Map LUs web page and then imported into Access. The data for the local Course-LU associative entity table was entered manually. The "Model Curriculum Course Learning Unit Analysis" report was exported as an Excel file from IS Education, processed to match the format of the database, and selectively imported into Access for the Exam Score and Exam Questions tables.

One example of a report (Figure 3) created with the assessment database is a hierarchical form with exam question information as the parent and the associated courses as the child details. This form is sorted by the test score so the user can find which questions are the most difficult and which courses cover the LU for each question. It can be thought of as an exception report that highlights problem areas and allows an assessment coordinator to craft recommendations targeted to the courses that teach the specific LU objectives.
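To make the preceding description concrete, the sketch below expresses the main tables and the Figure 3 exception report as SQL, using Python's built-in sqlite3 module for illustration. The actual system is an Access 2007 database; the table and column names here are assumptions inferred from the entities described above rather than the exact schema of Figure 2, and the semester value is a placeholder.

```python
# A minimal sketch of the assessment database, using sqlite3 for illustration.
# The real system is an Access 2007 database; names here are assumptions.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Course (
    course_id    TEXT PRIMARY KEY,     -- local course number
    title        TEXT
);
CREATE TABLE LearningUnit (
    lu_id        TEXT PRIMARY KEY,     -- IS 2002 LU code, e.g. 'IS.1 (7)'
    title        TEXT
);
CREATE TABLE CourseLU (                -- associative table: course covers LU at depth 1-4
    course_id    TEXT REFERENCES Course,
    lu_id        TEXT REFERENCES LearningUnit,
    depth        INTEGER CHECK (depth BETWEEN 1 AND 4),
    PRIMARY KEY (course_id, lu_id)
);
CREATE TABLE ExamQuestion (            -- ISA exam question mapped to an LU
    question_id  TEXT PRIMARY KEY,
    lu_id        TEXT REFERENCES LearningUnit
);
CREATE TABLE ExamScore (               -- per-question Local/National averages by semester
    question_id  TEXT REFERENCES ExamQuestion,
    semester     TEXT,
    local_avg    REAL,
    national_avg REAL,
    PRIMARY KEY (question_id, semester)
);
""")

# The exception report of Figure 3: questions ordered from lowest local score,
# joined to the courses that cover each question's LU.
exception_report = """
SELECT q.question_id, lu.lu_id, lu.title, s.local_avg, c.course_id, cl.depth
FROM ExamScore s
JOIN ExamQuestion q  ON q.question_id = s.question_id
JOIN LearningUnit lu ON lu.lu_id = q.lu_id
JOIN CourseLU cl     ON cl.lu_id = lu.lu_id
JOIN Course c        ON c.course_id = cl.course_id
WHERE s.semester = ?
ORDER BY s.local_avg ASC;
"""
rows = con.execute(exception_report, ("Spring 2008",)).fetchall()  # placeholder semester
```

Sorting by the local average brings the most difficult questions to the top, and the parent-child layout of Figure 3 groups the covering courses under each question.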
6. Reports distributed to faculty

An ISA exam retrospective report was generated from the system after the IS 2002 Learning Units (LUs) had been mapped to the local IS courses, along with the depth of coverage of each LU in each assigned local course. This retrospective report listed, by assigned local course, the LUs that students performed well on and those that students performed poorly on, based on results from the ISA exam. A picture of this report is shown in Figure 4. The report was circulated to each course coordinator and to the faculty members teaching one or more of the courses. The faculty members are using the report to develop plans for possible course improvements in areas where the local score was much lower (by 15 points) than the national score. Knowledge exchange is also taking place among faculty about learning units where the local score is much higher (by 15 points) than the national score. Further analysis is needed to determine whether certain teaching methods work more effectively for students at MSCD and how they might be used more widely across multiple courses.
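The retrospective report can be expressed as a query over the tables sketched in Section 5. The following sketch assumes that "much lower (by 15 points)" is treated as a simple threshold on the difference between the local and national averages of the questions mapped to each LU; the threshold and the aggregation are interpretations for illustration, not the report's actual definition.

```python
# Sketch of the faculty retrospective report (Figure 4): for each course, the LUs
# where the local average differs from the national average by 15 or more points.
# Assumes the illustrative schema from the Section 5 sketch; the 15-point threshold
# is an interpretation of the report described above, not a value from the system.
import sqlite3

REPORT_SQL = """
SELECT c.course_id,
       lu.lu_id,
       lu.title,
       AVG(s.local_avg)    AS local_avg,
       AVG(s.national_avg) AS national_avg,
       AVG(s.local_avg) - AVG(s.national_avg) AS gap
FROM ExamScore s
JOIN ExamQuestion q  ON q.question_id = s.question_id
JOIN LearningUnit lu ON lu.lu_id = q.lu_id
JOIN CourseLU cl     ON cl.lu_id = lu.lu_id
JOIN Course c        ON c.course_id = cl.course_id
GROUP BY c.course_id, lu.lu_id, lu.title
HAVING ABS(AVG(s.local_avg) - AVG(s.national_avg)) >= 15
ORDER BY c.course_id, gap;
"""

def retrospective_report(con: sqlite3.Connection):
    """Return (course, LU, local average, national average, gap) rows for the report."""
    return con.execute(REPORT_SQL).fetchall()
```

Rows with a large negative gap identify candidate areas for course improvement, while rows with a large positive gap point to teaching practices that may be worth sharing across courses.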
7. Future Work

In the future, the learning objectives for each local course will be added to the database and mapped to the IS 2002 Learning Units. Since mappings are being developed between local course objectives and both program and ABET objectives, the assessment provided by the ISA exam will also yield an assessment of both program and ABET objectives.

8. Conclusion

"Organizations are being compelled to capture, understand, and harness their data to support decision making in order to improve business operations" (Turban et al. 2007). This quote sums up the assessment process in education today. Our goal is to improve the education process (our business operation), but the assessment process requires clear documentation of the measurements taken to achieve this goal.

This paper describes the creation of a database used to improve the assessment process by giving more control over the assessment information provided by the ISA exam. It moves control of the information closer to the end user by giving all faculty access to the data used to make assessment decisions.

9. References

ABET CAC (2002). "2003-2004 Criteria for Accrediting Computing Programs." Computing Accreditation Commission, Accreditation Board for Engineering and Technology, Inc., Baltimore, MD. Retrieved June 12, 2008 from http://www.abet.org/criteria.html.

Acharya, C. (2003). "Outcome-based Education (OBE): A New Paradigm for Learning." CDTLink, November. Retrieved May 22, 2008 from http://www.cdtl.nus.edu.sg/link/nov2003/obe.htm.

Daigle, R., Longenecker, H.E., Landry, J.P. and Pardue, J. (2004). "Using the IS 2002 Model Curriculum for Mapping an IS Curriculum." Information Systems Education Journal, 2 (1). http://isedj.org/2/1/.

Gorgone, J.T., Davis, G.B., Valacich, J.S., Topi, H., Feinstein, D.L. and Longenecker, H.E. (2002). "IS 2002 Model Curriculum and Guidelines for Undergraduate Degree Programs in Information Systems." Atlanta: Association for Information Systems.

Landry, J.P., Reynolds, J.H. and Longenecker, H.E. (2003). "Assessing Readiness of IS Majors to Enter the Job Market: An IS Competency Exam Based on the Model Curriculum." Proceedings of the 2003 Americas Conference on Information Systems, August 4-6.

Landry, J.P., Purdue, H.J., Daigle, R.J. and Longenecker, H.E. (2004). "Using IS 2002 to Assess the Impact of a Proposed Curriculum Change." Proceedings of the 2004 Americas Conference on Information Systems, August.

Longenecker, H.E., McKell, L.J. and White, B.A. (2007). "Using the IS Assessment Test in a Program of Continuous Improvement." Panel at the Information Systems Education Conference, November 3, 2007.

McKell, L., Reynolds, J.H., Longenecker, H.E., Landry, J. and Pardue, J. (2005). "Information Systems Analyst (ISA): A Professional Certification Based on the IS 2002 Model Curriculum." The Review of Business Information Systems, Summer (9:3), pp. 19-24.

Reynolds, J.H., Longenecker, H.E. and Landry, J.P. (2004). "Information Systems National Assessment Update: The Results of a Beta Test of a New Information Systems Exit Exam Based on the IS 2002 Model Curriculum." Information Systems Education Journal, 2 (24). http://isedj.org/2/24/.

Segall, M., Gollhardt, L. and Morrell, J.S. (2007). "The Information Systems Analyst National Assessment Exam: Factors for Success." Information Systems Education Journal, 5 (40). http://isedj.org/5/40/.
Appendix

Figure 1. Mapping Document of IS 2002.1 (Fundamentals of Information Systems) Learning Units.

Course Number and Title: ______________________
Coverage levels: 1 = Recognize, 2 = Differentiate, 3 = Use, 4 = Apply
Learning units (count = 9):

IS.1 (5) Systems and Quality: to introduce systems and quality concepts
IS.1 (6) Information and Quality: to provide an introduction to the organizational uses of information to improve overall quality
IS.1 (7) IT Hardware and Software: to present hardware, software and related information technology concepts
IS.1 (8) IT Systems Specification: to provide concepts and skills for the specification and design, or the re-engineering, of organizationally related systems of limited scope using information technology
IS.1 (9) IT and Attaining Objectives: to show how information technology can be used to design, facilitate and communicate organizational goals and objectives
IS.1 (10) Characteristics of an IS Professional: to explain the concepts of individual decision making, goal setting, trustworthiness and empowerment
IS.1 (11) IS Careers: to show career paths in Information Systems
IS.1 (12) Ethics and the IS Professional: to present and discuss the professional and ethical responsibilities of the IS practitioner
IS.1 (13) IS Personal Level Systems: to identify, investigate, analyze, design, develop (with either packages and/or high-level languages) and use personal-level information systems to enhance individual productivity

Figure 2. Assessment Database Relationship Diagram.

Figure 3. Hierarchical Form for Mapping Exam Questions to Courses.

Figure 4. Faculty Report Example.