The Information Systems Analyst National Assessment Exam: Factors for Success

Mark Segall, segallm@mscd.edu
Loren Gollhardt, gollhard@mscd.edu
Joe Morrell, morrellj@mscd.edu
Computer Information Systems, Metropolitan State College of Denver
Denver, CO 80220, USA

ABSTRACT

This paper reports on a study of the Information Systems Analyst National Assessment Exam results from students in the Computer Information Systems Department at Metropolitan State College of Denver. It investigates the relationship between exam success and individual student characteristics such as academic performance, work experience, and age. The study was prompted by the widely varying scores earned by students in the same course of study. Its purpose is to identify the characteristics associated with success on this exam and to use the results to improve curriculum and advising so that students can achieve greater success on it. The paper concludes that academic performance, native language, age, and the state of the test taker at the time of the test are important characteristics for success.

Keywords: Information Systems Analyst Assessment Exam, ISA, CCER, ICCP, Information System Education, Model Curricula, System Analysis and Design, Academic Performance, Age, English as a Second Language

1. INTRODUCTION

The Information Systems Analyst (ISA) Assessment Exam has been available since 2004 (McKell 2004, McKell 2005). It allows students with an undergraduate degree to earn certification as an ISA. The exam is administered through the Center for Computing Education Research (CCER), a division of the Institute for Certification of Computing Professionals (ICCP) Education Foundation. The ICCP was founded in 1973 to promote "professional standards for the computer industry" (ICCP History 2006). The exam questions map to the IS 2002 model curriculum for undergraduate Information Systems (IS) programs (Gorgone 2003). CCER identifies six IS core areas covered by the exam: (1) Analysis and Design, (2) Role of IS in Organizations, (3) Data Management, (4) Networking and Telecommunications, (5) Modern Programming Language, and (6) Hardware and Software.

The exam is a valuable tool for undergraduate departments to determine how well their students are prepared to be IS professionals, and it can also assist with program accreditation (e.g., http://www.abet.org/). Assessment is currently one of the highest priorities in higher education and a vital concern for accreditation. The ISA exam is therefore important to any CIS department, and especially to one undergoing accreditation, as ours is.

Our Computer Information Systems (CIS) department at Metropolitan State College of Denver (MSCD) is part of a School of Business and has about 500 students. The institution has been using this exam since the fall of 2004, and student performance on it is a vital concern. Scores earned by students in the CIS Department have ranged from a low of seventeen percent to a high of seventy-nine percent. Nationally, our students have ranged from some of the lowest scores in the country to the top scores, including first- and fifth-place finishes and many scores in the top ten percent. Although the high scores are a source of pride for the department, the low scores are an area of concern. The profile of the student who succeeds (or fails) therefore becomes an important issue in curriculum development and advising.
Research Questions

The primary question for this research is "What are the characteristics for success on the ISA Exam?" The authors determined that a wealth of information was available from the students' academic records and ISA Exam results. The authors were specifically interested in addressing the following research questions with respect to scores on the ISA Exam:

* How are academic factors related to these scores?
* How is work experience related to these scores?
* How are personal factors related to these scores?
* How are environmental factors related to these scores?

2. METHODOLOGY

Data Collection

The primary instrument for data collection was a request to the Office of Information Resources (OIR) at the institution. The request, keyed by student ID number, covered the items in the student database listed below.

Student Information: Student ID, First Name, Middle Name, Last Name, Age, Gender, Email, Area Code, Phone, Street, City, State, Zip, Cumulative GPA, Major GPA, Expected Graduation.

MSCD Credits: Student ID, Term, Subject, Course Number, Credit Hours, CRN, Grade.

Transfer Credits: Student ID, Subject, Course Number, Credit Hours.

A secondary instrument for data collection was a survey sent to all students, consisting of identification questions and personal, work, and environmental questions. The survey items (summarized in Table 2) were:

* Company type
* Years of experience in line of work
* Job title
* Job skills
* Amount of work (full time, part time, none)
* Motivation to do well on the ISA Exam (1 lowest – 10 highest)
* Rate your IS skill level (1 lowest – 10 highest)
* Is English your primary language?
* How well do you control your own emotional state in general? (1 lowest – 10 highest)
* Rate your own general problem-solving ability (1 lowest – 10 highest)
* Age (15-20, 21-25, 26-30, 31-35, 36-50, 51-100)
* Are you married?
* Do you have children who live with you?
* How many minor children live with you?
* Gender
* How many hours did you prepare for this test? (0, 1-5, 6-10)
* How did you feel physically at test time? (1 lowest – 10 highest)
* Rate your own emotional state at test time (1 lowest – 10 highest)
* How many credit hours were you taking the semester in which you took the test?

A third source of data was the www.iseducation.org website, through which the ISA exam is delivered and administered. The English as a Second Language variable was collected from this source for all students.

This paper focuses on the analysis of the categories in Table 1 with respect to ISA Exam scores, followed by an analysis of the characteristics in Table 2. The OIR request (Table 1) generated results for the 140 students who took the exam over four semesters. The survey (Table 2) was emailed to those 140 students; 48 responded, a 34.3% usable response rate.

3. SAMPLE CHARACTERISTICS

Student Academic Data

Our college is an urban institution that serves non-traditional, working students. Of the 140 students who took the exam from Fall 2004 to Spring 2006, the average score was 49.7 (± 12.2 SD). The average age is 30.1 (± 8.1 SD), which is typical of our students. The split of male to female students is 68% to 32%. Twenty-two students (15.7%) report English as a Second Language (ESL). Sixty-three percent of the students are listed as Caucasian, 17% Asian, 6% Hispanic, 5% Black, 5% Other, and 4% Non-resident Alien. The average overall GPA (3.03 ± 0.52 SD) and major GPA (2.97 ± 0.59 SD) are similar.
The students take an average of 9.9 (± 2.39 SD) credits per semester, since most are working full or part time; as a result, a typical student completes the degree in about six years. The average number of transfer credits is 16.6 (± 14.4 SD), with a maximum of 73. Of the 74% who have transfer credits, most transferred in one to two semesters' worth of courses.

Survey Data

A total of 48 of the 140 students completed the survey (a 34% return rate). The average age was 30.1 years both for those who completed the survey and for those who did not. Students who completed the survey had an average GPA of 3.1, while those who did not had a GPA of 3.0. There was, however, a significant difference in average ISA exam scores: 53.1 (± 12.7 SD) for those who completed the survey versus 48.0 (± 11.7 SD) for those who did not.

Sixty-nine percent of the students in the survey report working full time, while 19 percent report working part time. This heavy workload outside of school is typical of our students. One third of the survey respondents report working in the IT industry; the next most frequent responses are Student, Retail, Government, and Banking & Finance.

Company Type              Percent
Information Technology         31
Student                        19
Retail                         13
Government                      8
Banking & Finance               8
Other (7 industries)           21
Total                         100

Sixty-three percent of the students report three or more years of work experience.

Years Employed            Percent
Less than one year             14
1 – 2 years                    19
3 – 5 years                    21
More than 5 years              42
Not Applicable                  4
Total                         100

4. ANALYSIS

The MINITAB® release 14.20 statistical software was used for the data analysis. The primary analysis correlated the various factors with the ISA exam score through single-variable regressions; the goal is to find which factors predict exam performance. ANOVAs and t-tests were used where appropriate (Anderson et al. 2005).

Student Academic Data

See Table 1 for a summary of the student academic variables.

Demographic Factors: Gender does not predict the ISA score. The average scores for males and females are 50.4 and 48.3 respectively, which is not a significant difference (t = 0.94, p = 0.35). As a single-variable predictor, age accounts for a small amount of the variance in the ISA score (see Figure 1). The slope of 0.26 suggests that for every 10 years of age the exam taker will score 2.6 points higher; older students are more likely to have work experience, which should improve their score. English as a Second Language (ESL) is the best demographic predictor, accounting for 13.3% of the variance in score (see Figure 2). About 16% of the sample reported ESL, and this group scored 12 points lower than native speakers. A two-way ANOVA between the CMS 4050 grade and ESL indicated no interaction between these two factors, so the decrease in score is consistent for good, average, and poor students.

Semester: In general the ISA score did not differ significantly by semester, except for Spring 2006, where the score was six points higher. We cannot say that the Spring 2006 cohort simply had better students: there was no significant difference (t = 0.53, p = 0.60) between the GPA of the Spring 2006 students (2.99) and that of all other students (3.04). (National data on ISA scores was not available at the time this paper was submitted.)

Academic Background: The average number of transfer credits is about 16 credit hours, but the range runs from 0 to 73 credits. About 75% of the students in this sample transferred some credits (average = 22 credits). The number of transfer credits did not predict the ISA score.
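The authors ran these single-predictor fits in MINITAB 14.20. Purely as an illustration of the kind of analysis reported in Table 1, the sketch below shows how one such regression and the gender t-test could be reproduced in Python; the file name and column names ("isa_data.csv", "isa_score", "age", "gender") are hypothetical, not the authors' actual data set.

# Illustrative sketch only: a single-variable regression of ISA score on age,
# plus the two-sample t-test used for gender. File and column names are
# hypothetical, not the authors' data.
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("isa_data.csv")                  # one row per exam taker

model = smf.ols("isa_score ~ age", data=df).fit()
print(model.params["age"])                        # slope: ISA points per year of age
print(model.rsquared)                             # share of variance explained (R2)
print(model.fvalue, model.f_pvalue)               # F-statistic and p-value, as in Table 1

male = df.loc[df["gender"] == "M", "isa_score"]
female = df.loc[df["gender"] == "F", "isa_score"]
print(stats.ttest_ind(male, female))              # t-test comparing male and female means

Each predictor in Table 1 corresponds to one such single-variable fit.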
Students take an average of ten credits per semester, ranging from four to sixteen. The average credits per term also had no predictive power for the ISA exam score. We also wanted to see whether students who took more upper-division CMS courses than required did better on the ISA exam; the number of upper-division courses taken did not predict ISA performance.

Academic Performance: Both overall GPA and major GPA are significant predictors of performance, each accounting for about 20% of the variance in ISA score. Since these two GPA variables are highly correlated (r = 0.94), they measure the same underlying academic ability. Having a high overall GPA predicts that a student will do well on the ISA exam: each increase of a letter grade predicts about a ten-point increase on the exam.

Individual Courses: The grade a student receives in CMS 4050 (Advanced Systems Analysis and Design) was the best single predictor of performance on the ISA exam, accounting for 38% of the variance in the exam score (see Figure 3). Each increase of a letter grade in CMS 4050 predicts a 9.5-point increase on the exam. This is consistent with the fact that 42% of the questions on the ISA exam cover the Analysis and Design IS core area.

The next best course predictor was the grade in CMS 3230 (Telecommunication Systems), which accounted for 18.3% of the variance in exam scores. The slope suggests that an increase of one letter grade in the course raises the predicted ISA exam score by 5.8 points. This is a surprising result because only 5% of the questions on the exam cover the Networking and Telecommunications IS core area. There is, however, a high correlation (r = 0.64) between the grades a student received in CMS 3230 and CMS 4050. While the topics differ, there seems to be a higher than usual commonality in the skills needed to do well in Advanced SAD and Telecommunications.

The third course that predicts student performance on the ISA exam is CMS 2110 (Business Problem Solving: A Structured Programming Approach), which accounts for 17.3% of the variance. As with CMS 3230, each letter-grade increase predicts a 6.0-point increase in the exam score. However, the correlation between the grades in CMS 2110 and CMS 4050 is not as high (r = 0.39) as with the telecommunications course. CMS 2110 is the gateway course for the CIS major; it introduces (1) the idea of breaking a problem into component modules and (2) the programming concepts of sequence, selection, and iteration. Mastering these concepts early is important for doing well on the ISA exam.

CMS 3060 (File Design & Database Management) is the last course that shows strong predictive power, accounting for 9.7% of the variance in exam scores. Each increase of a letter grade predicts a 4.1-point increase on the exam. The Data Management IS core area accounts for 17% of the ISA exam questions.

Two courses show weaker predictions for the ISA exam: CMS 2010 (Computer Applications for Business, R2 = 5.0%, slope = 3.6) and CMS 3340 (Advanced Business Statistics, R2 = 4.4%, slope = 3.0). Both courses are required of all School of Business majors, including CIS majors. CMS 2010 gives students an overview of business information systems and overlaps with the Role of IS in Organizations IS core area, which comprises 28% of the ISA exam.
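To make the letter-grade slopes above concrete, the short sketch below converts a grade difference into predicted exam points. It assumes the conventional 4-point grade coding (A = 4 through F = 0), which the paper does not state explicitly, and uses the reported CMS 4050 slope of 9.5 points per letter grade purely as an example.

# Illustration (not the authors' code): mapping a course letter-grade difference
# to a predicted ISA-score difference using a reported regression slope.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}  # assumed coding
CMS_4050_SLOPE = 9.5  # ISA points per letter-grade step (Table 1)

def predicted_gap(higher_grade: str, lower_grade: str, slope: float = CMS_4050_SLOPE) -> float:
    """Predicted ISA-score difference between students differing only in this grade."""
    return slope * (GRADE_POINTS[higher_grade] - GRADE_POINTS[lower_grade])

print(predicted_gap("A", "C"))  # 19.0: two letter-grade steps at 9.5 points each

On this reading, an A student in CMS 4050 would be predicted to score roughly 19 points higher on the exam than a C student, all else being equal.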
The letter grade a student received in CMS 3050 (Fundamentals of Systems Analysis and Design) did not predict the student's ISA exam score (R2 = 6.1%, slope = 2.8). This could be due to the small number of students who took CMS 3050: the course is required only for some students, depending on their catalog year, so only 42 students were available for the regression. A larger sample will be needed before this course can be ruled out as an important factor.

Most students in this sample were required to take one of several programming courses (VB, Java, C, C++, or C#) to satisfy the CIS Department's programming requirement; a curriculum change in the 2005/2006 catalog now requires all students to take Visual Basic. The programming variable is therefore a composite of the grade in whichever programming course the student took. This variable did not predict the student's ISA score (R2 = 0.9%, slope = 1.0). One possible reason is that the ISA exam is vendor-neutral, while the programming courses are each geared to a specific programming language. Another is that this is the second programming course a student is required to complete; the first, CMS 2110, is a good predictor of the ISA score. It should also be noted that the Modern Programming Language IS core area covers only 5% of the exam.

Survey Data

See Table 2 for a summary of the survey variables.

Employment Variables: The type of company a student works for (IT, retail, student) does not predict the ISA exam score, nor does working full time or part time. Having five or more years of work experience (F = 2.88, p = 0.097) was the employment variable that came closest to significance at the 5% level. The larger sample (n = 140) of student academic data did indicate that age was a significant predictor of the exam score, and it seems likely that older students have more work experience that helps them on the exam. The survey data were not able to confirm this hypothesis, but it is a question worth pursuing in a larger study.

Personal Variables: The self-reported variables of motivation level, IS skill level, emotional control, problem-solving ability, number of credits taken during the semester of the exam, number of children, and marital status did not predict the ISA exam score. The self-reported physical condition (F = 5.56, p = 0.023) and emotional state (F = 10.4, p = 0.002; see Figure 4) at the time of the exam did have a positive correlation with the ISA exam score. Each one-point increase on the 10-point physical rating scale predicted roughly a 2-point increase in the ISA exam score, and each one-point increase on the emotional scale predicted a 3-point increase. It is interesting that the questions about general personality traits were not predictive, while those about the student's state at the time of the exam were. It may be, however, that students, knowing how they did on the exam, reported a good or bad state to match their test performance. It will be important to replicate these results with a short survey given immediately before the exam begins.

The ISA exam is a comprehensive exam covering a wide array of topics. In the standard letter we send to students, we do not recommend that they study for the ISA exam, yet about 40% of the students reported that they did prepare for it.
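The survey comparisons above were reported as single-predictor tests (Table 2). As a rough sketch of the same kind of group comparison, a one-way ANOVA across the three self-reported preparation-time categories could be run as follows; the file and column names are hypothetical, and this pooled ANOVA illustrates the technique rather than replicating the authors' dummy-variable regressions.

# Illustrative sketch: one-way ANOVA of ISA score across the three
# preparation-time groups ("0", "1-5", "6-10" hours). "survey.csv",
# "prep_hours", and "isa_score" are hypothetical names.
import pandas as pd
from scipy import stats

survey = pd.read_csv("survey.csv")

groups = [g["isa_score"].to_numpy()
          for _, g in survey.groupby("prep_hours")]

f_stat, p_value = stats.f_oneway(*groups)
print(f_stat, p_value)  # F-test for any mean difference among the groups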
It is interesting to note an inverse correlation between the amount of time spent preparing for the exam and the ISA exam score. Those who did not prepare at all scored 11 points higher than all of the other students, and those who reported preparing 6 hours or more scored 17 points lower. Those who reported preparing 6 or more hours also reported an average emotional state (4.8 ± 2.3 SD) significantly lower than that of the zero-hours group (7.0 ± 1.8 SD) or the 1 – 5 hour group (7.1 ± 1.2 SD). One possibility is that this is a small group of students who think they will not do well on the exam, are anxious about it, and do try to prepare for it but are not able to raise their scores through that preparation.

5. DISCUSSION

This study was designed to determine the factors that indicate success for students taking the ISA Exam. The analyses of the academic data show that the following factors affect the ISA Exam score:

* ESL has a substantial negative impact.
* Age has a weak positive effect.
* Academic performance (GPA) is a positive predictor.
* Performance in Advanced Systems Analysis & Design is the best predictor of performance on the ISA Exam.
* The Telecommunications, Structured Programming / Problem Solving, and Database courses are all substantial predictors.
* The general courses in Business Information Systems and Business Statistics are weak predictors.

The importance of a student doing well in a few key core courses cannot be overstated as a factor affecting the ISA exam. Based on these results, students who want to prepare for the exam would be advised to focus on Advanced Systems Analysis & Design; the other areas worth reviewing are telecommunications, programming, structured problem solving, and databases. The strong predictive power of the Telecommunications course was a surprising result because that material constitutes only a small percentage of the ISA exam.

The analysis of the survey data indicates that the physical and emotional state of the students at the time of the exam might affect the ISA Exam score. Paradoxically, preparation for the exam was a negative predictor of performance. These three variables may, however, be an important indicator of a student's self-expectations.

6. AVENUES TO IMPROVE STUDENT ISA EXAM SUCCESS

The results of this study should first be communicated to students preparing for the exam, to give them some sense of the history of the exam and how it pertains to their expectations and preparation. Our CIS Department should also consider requiring a C or better in all core CIS courses.

7. ACKNOWLEDGEMENTS

We would like to thank Ellen Boswell and Bill Lind from the MSCD Office of Budgets and Institutional Research for their assistance with student data collection.

8. CONCLUSION

The evidence of the impact of academic, personal, and environmental factors on ISA Exam scores is very strong. This study indicates that there is a strong relationship between grades in specific courses and the ISA exam score. These results come from students in just one CIS department taking the exam over a period of two years. Clearly this study should be expanded to all CIS departments in all institutions using the ISA Assessment Exam; this is both an invitation and a challenge for those institutions to join us in this endeavor.

9. CITATIONS

Anderson, D., D. Sweeney, and T. Williams (2005) Statistics for Business and Economics. South-Western, Mason, Ohio.

Gorgone, J., Davis, G., Valacich, J., Topi, H., Feinstein, D., and Longenecker, H.
(2003) "Model Curriculum and Guidelines for Undergraduate Programs in Information Systems," Database for Advances in Information Systems, Winter (Vol. 34, No. 1).

ICCP History (2006) http://www.iccp.org/iccpnew/about.html.

McKell, L., J. Reynolds, H. Longenecker, J. Landry, and H. Pardue (2004) "Integrating Program Evaluation and a New Certification for Information Technology Professionals," Proceedings of SIGITE'04, Oct 28-30, Salt Lake City, Utah, USA.

McKell, L., J. Reynolds, H. Longenecker, J. Landry, and H. Pardue (2005) "Information Systems Analyst (ISA): A Professional Certification Based on the IS2002 Model Curriculum," The Review of Business Information Systems, Summer (9:3), pp. 19-24.

Table 1
Student Academic Variables: Predictors of ISA Score

Predictor          n     Slope   R2 Value   F-score   P-value      Course Name
Gender             140   -2.2    0.7%       1.0       0.331
Age                140   0.26    3.0%       4.3       0.040 *
ESL                140   -12.2   13.3%      21.2      0.000 **
Fall 2004          140   -2.4    0.5%       0.7       0.413
Spr 2005           140   -3.8    2.5%       3.5       0.063
Fall 2005          140   1.8     0.3%       0.4       0.515
Spr 2006           140   6.1     4.1%       6.0       0.016 *
Transfer Credits   140   0.1     1.5%       2.1       0.151
Credits / Term     140   -0.3    0.4%       0.5       0.479
CMS Upper Div      140   0.5     0.5%       0.7       0.410
GPA                140   10.7    20.6%      35.8      0.000 **
Major GPA          140   9.3     20.0%      34.6      0.000 **
CMS 4050           124   9.5     38.0%      74.9      0.000 **     Adv. SAD
CMS 3230           134   5.8     18.3%      29.6      0.000 **     Telecom Systems
CMS 2110           122   6.0     17.3%      25.2      0.000 **     Bus. Prob. Solving: Structured Prog. Approach
CMS 3060           139   4.1     9.7%       14.7      0.000 **     File Design & DB Management
CMS 2010           112   3.6     5.0%       5.7       0.018 *      Comp. Apps. for Bus.
CMS 3340           134   3.0     4.4%       6.0       0.015 *      Adv. Bus. Stat.
CMS 3050           42    2.8     6.1%       2.6       0.115        Fund. of SAD
Programming        138   1.0     0.9%       1.2       0.274        VB, Java, C, or C++

Table 2
Survey Variables: Predictors of ISA Score

Predictor                                n    Slope   R2 Value   F-score   P-value
IT Company                               48   -1.3    0.2%       0.10      0.748
Student                                  48   -1.3    0.2%       0.08      0.783
Retail                                   48   -1.4    0.1%       0.07      0.798
Less than 1 year                         48   -5.9    2.7%       1.26      0.267
1 – 2 years                              48   -4.9    2.3%       1.09      0.303
3 – 5 years                              48   -2.6    0.7%       0.33      0.569
More than 5 years                        48   6.2     5.9%       2.88      0.097
Full Time                                48   -0.7    0.1%       0.03      0.871
Part Time                                48   -0.6    0.0%       0.01      0.904
No Work                                  48   2.1     0.3%       0.14      0.712
Motivation Level                         48   0.7     2.1%       0.98      0.327
IS Skill Level                           48   0.7     0.9%       0.40      0.531
Emotional Control                        48   0.6     0.7%       0.33      0.571
Problem Solving                          48   1.3     2.6%       1.21      0.277
# of Children                            48   -2.1    1.5%       0.71      0.405
Married                                  48   2.4     0.9%       0.40      0.532
Credit Hours (semester test was taken)   48   0.1     0.1%       0.03      0.86
Physical Condition                       48   1.9     10.8%      5.56      0.023 *
Emotional State                          48   3.0     18.4%      10.4      0.002 **
Zero Hours of Preparation                48   11.0    18%        10.1      0.003 **
1 – 5 Hours of Preparation               48   -3.9    1.9%       0.87      0.355
6+ Hours of Preparation                  48   -17.0   19.8%      11.3      0.002 **