Computer Literacy Proficiency Testing

Craig VanLengen
craig.vanlengen@nau.edu
Computer Information Systems
Northern Arizona University
Flagstaff, AZ 86011-5066, USA

Abstract

A pre- and post-test of our introduction to computer information systems students was conducted to provide information on the level of their computer conceptual knowledge. Faculty members within the college of business and from other colleges at the university have raised questions about the appropriateness of the course, given the “increased” computer proficiency of our incoming freshmen. The study results showed that most students did not possess sufficient proficiency to “test out” of the course, and even those students who achieved a passing score (60%) increased their computer conceptual knowledge by 15%.

Keywords: Computer Literacy, Proficiency Testing, Computer Proficiency, IS Education, Student Computing Ability

1. INTRODUCTION

Employers would like to hire graduating seniors who are comfortable with information technology and who possess analytical and problem-solving skills (Wolk, 2006). Most colleges of business and university degree programs require students either to demonstrate computer or technology literacy or to enroll in a course similar to IS 2002.1: Fundamentals of Information Systems. VanLengen and Haney (2006) reviewed the web sites of their peer and non-peer institutions and found that some universities assumed or expected incoming students to possess the necessary computer skills. Other universities required their students to pass an examination, with tutorial assistance available, or to show certification of computer skills. Many universities required a computer literacy course but also allowed the requirement to be met through a CLEP examination or other credit-by-examination options.

One of the major difficulties in teaching computer literacy is that its definition is constantly changing (McDonald, 2004). The technology used by business organizations and the software functions, capabilities, and features available have not remained static over the last 20 to 30 years (McDonald, 2004). To keep the introduction to computer information systems (CIS) course relevant, colleges and universities must continuously modify the course as technology, student capability, and employer demand change. Twenty years ago the major computer lab activity was teaching programming in BASIC. Today our computer lab activities cover word processing, spreadsheets, databases, presentations, and use of the Internet. We have reduced our coverage of word processing, since most incoming students have used some type of word processing software during high school.

Universities also need to start including more of the Web 2.0 technologies, such as blogs, wikis, and other social networking experiences. The scary, fun, and hopefully exciting part will be the students participating in the creation of course content and experience. The problem with the Web 2.0 technologies is how to ensure a certain level of proficiency in computer and information technology while entering a user-directed Web 2.0 environment.

Microsoft Canada conducted an online survey of “students from Grade 11 through second-year University.” The results were encouraging in that 92% of the students thought that technological experience was important for their future careers; however, only a little over 40% reported that their school encouraged development of technology skills (Smith, 2007).
Our college of business has a computer literacy course that is required of all business majors. The course is also used by other colleges for their computer literacy requirement. Some concern has been expressed, from within the college and from other colleges at the university, that many students arrive at the university already in possession of the necessary computer skills and should not be required to take the computer literacy course. Dettori, Steinbach, and Kalin (2005) report that the computer conceptual knowledge of incoming students is usually varied and that “students tend to believe they are better prepared than they really are.” Wallace and Clariana (2005) found that incoming freshman business students did not “possess adequate knowledge of both computer concepts and computer literacy skills.” In their study only one-third of the incoming freshman business students could achieve a passing score on their proficiency examination. Wallace and Clariana (2005) also found that the students who took the introductory computer course achieved average gains of greater than 20% in both computer concepts and software. Having incoming freshmen take the course appears to be worthwhile, since two-thirds of incoming freshmen do not possess sufficient computer conceptual knowledge and computer skills, and those taking the course achieved gains of greater than 20%. Besides an increase in knowledge of computer concepts and computer skills, an examination of student self-reported data from the College Student Experiences Questionnaire found relationships between students’ computing ability and their perceived analytical and problem-solving skills (Wolk, 2006).

How large is the lack of computer proficiency among our incoming freshmen? Ceccucci (2005) conducted a nationwide study of “over one hundred randomly selected public secondary schools.” The findings showed that 99% of the schools surveyed offered a course in software applications. However, only “13% of the surveyed schools required that students take at least one semester of Computer Applications for graduation.” Since only 13% of students are required to take a computer applications course, one could assume that most students acquire computer conceptual knowledge and software skills at home or from friends. That computer knowledge is therefore informal, acquired only as students recognize a need for it. McDonald (2004) reports on a case study in which six exams were created to test CIS majors’ “Computer Skills Prerequisites” (CSPs). In the pilot study CIS majors were tested using the six exams, and over 50% failed to achieve passing scores on all six exams. If incoming CIS majors are unable to demonstrate proficiency, how can we expect non-CIS majors to demonstrate proficiency?

The author has conducted informal surveys of the introductory CIS students about their prior computer experience. The types of experience reported include creating graphics, working on photo books, playing computer games, and creating profiles on Facebook or MySpace. It would appear that this informal computer use does not provide the in-depth knowledge of computer concepts and software applications that business organizations expect. For example, when the author was covering computer storage concepts in the introductory course, the students were asked who owned an MP3 player. Almost all the students raised their hands. The students were then asked how their songs were stored on their MP3 players. None of the students in the class had an answer.
So even though they owned and used the technology, they had not even wondered how the data (music) was stored on their devices. This question about MP3 players did provide a “teachable moment,” and students became more interested in the different types of storage devices and media.

To evaluate the concerns of some university faculty that our incoming freshmen already possess adequate computer skills, and also to determine whether the computer literacy course provided value (improved computer knowledge) for those who would be deemed proficient, a pre- and post-test was administered in two sections of our introduction to CIS course.

2. PURPOSE OF THE STUDY

The major purpose of the study was to assess the level of conceptual computer knowledge of students taking the introductory CIS course. A secondary purpose was to determine what percentage of students could achieve a passing score (60% was used as the equivalent of a “D”) and be considered for “test-out” of the course. Currently the CLEP is the only approved “test-out” option at the university, so the study participants were not offered a “test-out” for passing the study instrument.

3. RESEARCH METHODOLOGY

Two sections of the introduction to CIS course, consisting of approximately 80 students, were used as the sample for this study during spring 2007. The exam questions were chosen from a publisher’s test bank and reviewed by two different instructors teaching the course for coverage of general computer concepts. The publisher’s test bank was used instead of questions developed by the course instructors to limit any test question bias from the individual instructors. The questions were selected as being appropriate for a final exam in the course and as measures of long-term conceptual knowledge. Since this was an introduction to CIS course, and no computer proficiency could be assumed, the testing was done using paper exams and answer sheets. Paper exams were used so that a student’s problems with computer use while taking the exam would not affect the score. The topics covered in the course and tested on the exam included:

1. Fundamentals of Computer Systems: input, processing, storage, and output
2. Information Processing Models
3. Hardware components
4. Software: systems and applications (productivity software includes word processing, spreadsheets, database, business presentations, and the Internet)
5. Telecommunication models and uses
6. Business systems concepts and components
7. Internet and World Wide Web concepts
8. Societal, ethical, and global issues surrounding computers and information technology, including privacy, security, and crime

During the first class meeting of the second week of the semester, all students in the two sample sections were given the exam and awarded participation points for taking it. The usual content of the introduction to CIS course was covered during the semester. At the end of the semester the students were administered the post-test, which consisted of the same questions as the pre-test with the questions scrambled. Again, all students taking the post-test were awarded participation points. Sixty-one students completed both the pre- and post-test; several students withdrew from the course, and the remaining students were absent on either the pre- or post-test date.
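To make the scoring procedure concrete, the sketch below (in Python) shows how raw counts of correct responses on the 75-question instrument would translate into percentage scores, and how the 60% “test-out” criterion described above would be applied. The per-student counts are hypothetical placeholders; the study’s raw data are not reproduced here.

    # Minimal sketch (hypothetical data): converting raw counts of correct
    # responses on the 75-question instrument into percentage scores and
    # flagging students who meet the 60% "test-out" threshold.

    TOTAL_QUESTIONS = 75
    PASS_FRACTION = 0.60                          # 60% was treated as equivalent to a "D"
    PASS_SCORE = PASS_FRACTION * TOTAL_QUESTIONS  # 45 correct answers

    # Hypothetical pre-test results: student id -> number of correct responses
    pretest_correct = {"s01": 52, "s02": 38, "s03": 47, "s04": 29}

    for student, correct in pretest_correct.items():
        percent = 100.0 * correct / TOTAL_QUESTIONS
        eligible = correct >= PASS_SCORE          # would be considered for "test-out"
        status = "test-out candidate" if eligible else "below 60% threshold"
        print(f"{student}: {correct}/{TOTAL_QUESTIONS} = {percent:.1f}% ({status})")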
4. RESULTS

A summary of the test results is shown in Table 1: All Students Pre- and Post-Test and in Table 2: “Test-Out” Students Scoring Greater Than 60% on Pre-Test, found in Appendix 1.

As can be seen in Table 1, the average pre-test score for all students was 40.61 out of a possible 75. The 61 students who completed both the pre- and post-test gained an average of 9.78 points, a 24% increase, for an average post-test score of 50.39. A one-tailed, paired t-test showed a significant difference between the two means. This indicates that completing the CIS course produced a significant gain in students’ computer conceptual knowledge, which supports the view that students should be required to take the course.

The secondary purpose of the study was to determine what percentage of students could achieve at least 60% on the pre-test and could be considered for “test-out” of the course. As can be seen in Table 2, 19 of the original 80 students (a little under 24%) achieved a score of 60% or greater on the pre-test. However, these “test-out” students, with a pre-test mean score of 47.9, achieved a mean increase of 7.2 points, a 15% increase in computer conceptual knowledge. This increase in the mean score was also significant, showing that even the top 24% of students could benefit from being required to take the introductory computer course. An alternative would be to offer a second-level course that the top students could take, giving them more of a challenge and an enhanced learning experience.

5. DISCUSSION AND RECOMMENDATIONS

The significant result for all students shows that the introduction to CIS course is helping students increase their knowledge of computer concepts. We also need to look at the significant results for the students who would have “tested out” if that option had been available. These students gained almost 10 percentage points in their performance by participating in the course for the semester. Would the students be better served by allowing them to “test out,” or is the gain in conceptual knowledge justification for requiring them to remain in the course for the entire semester? If they were allowed to “test out,” should they simply receive credit for the introductory course, or should they be required to take a more advanced course that could enhance their conceptual knowledge? If a student receives a passing score, does the college retain the credit hours for the course being credited, or, as with the CLEP, does the student receive credit for the requirement while the college receives no credit hours?

Another consideration is that our introduction to CIS course is taught as an integrated lecture/lab course and loosely follows the fluency with information technology approach reported by Waldman and Ulema (2005). Several exams would have to be prepared and updated frequently to cover the computer concepts and each of the software packages covered in the course.

Since the possible “test-out” students gained almost 10 percentage points by participating in the course, we are recommending that all students be required to complete the course. This recommendation also includes a requirement that the content of the introduction to CIS course be continuously reviewed so that it remains relevant as technology, student capability, and employer demand change. A second recommendation is for the CIS area to explore the use of proficiency testing in the future. Proficiency testing can raise a number of issues about the validity and reliability of the test instruments used.
Another issue with locally developed test instruments is the resource requirement to constantly update the test questions and to develop a number of different versions to prevent sharing of test content. Since “outsourcing” has become common in the information systems field, we may want to look into outsourcing the testing function. Any college that decides to pursue proficiency testing should investigate the following testing organizations, among others:

Tek.Xam (2007) provides separate 35-minute proficiency tests for “General Computing Concepts, Knowledge and Use of the Internet, Word Processing, Spreadsheets, Presentations, Databases, and Web Authorship.”

First Advantage Assessment Solutions (2007) offers a large number of skill assessments, including skills and abilities in Microsoft Office, computer literacy, and IT.

ExpertRating (2007) is an interesting alternative. It offers a free computer skills test covering computer settings, hardware, networking, keyboard usage, terminology, software, the Internet, Windows, and e-mail. Test-takers who pass can order a certificate to prove proficiency.

The main reasons for using an outside assessment exam are to reduce the resources needed to create and maintain the assessment instruments and to obtain proof of the validity and reliability of the test.

The significant increase in computer conceptual knowledge shown in the results of this study has provided the college of business with justification for requiring all business majors to complete the introduction to CIS course. The results of this study are also being provided to the other colleges on campus that currently require our course as part of the general studies requirement. The results have also allowed us to show our critics that incoming students do not “know everything about computers and technology” and that the course adds value for the student.

6. REFERENCES

Ceccucci (2006). What Do Students Know When They Enter College? Information Systems Education Journal, 4 (90). http://isedj.org/4/90/. ISSN: 1545-679X. (Also appears in The Proceedings of ISECON 2005: §2344. ISSN: 1542-7382.) (Accessed June 4, 2007)

Dettori, Steinbach, and Kalin (2006). Is this Course Right for You? Using Self-Tests for Student Placement. Information Systems Education Journal, 4 (77). http://isedj.org/4/77/. ISSN: 1545-679X. (Also appears in The Proceedings of ISECON 2005: §2312. ISSN: 1542-7382.) (Accessed June 4, 2007)

ExpertRating (2007). “Computer Skills Test.” http://www.expertrating.com/Computer-Skills-test.asp (Accessed June 4, 2007)

First Advantage Assessment Solutions (2007). “SkillCheck Assessment Portfolio.” http://www.fadvassessments.com/products.pl (Accessed June 4, 2007)

First Advantage Assessment Solutions (2004). “Assessing ICT Literacy of Students.” http://www.fadvassessments.com/docs/papers/whitepaper_ICT_literacy.pdf (Accessed June 4, 2007)

“Information Literacy Competency Standards for Higher Education.” American Library Association, 2006. http://www.ala.org/acrl/ilcomstan.html (Accessed June 4, 2007)

McDonald, D. S. (2004). “Computer Literacy Skills for Computer Information Systems Majors: A Case Study.” http://findarticles.com/p/articles/mi_qa4041/is_200404/ai_n9406522 (Accessed June 4, 2007)

National Center for Academic Transformation (2007). “Winward Community College Abstract.” http://www.center.rpi.edu/States/Hawaii/WCC_Abstract.htm (Accessed June 4, 2007)

Smith, B. (2007). “MS student survey identifies lack of IT education, soft skills: CIPS exec says computer class should be mandatory in high school.” http://www.itbusiness.ca/it/client/en/home/News.asp?id=42909 (Accessed June 4, 2007)

Tek.Xam (2007). “Course Exemption.” http://www.tekxam.com/testout/testout.htm (Accessed June 4, 2007)

University of Scranton (2006). “C/IL 102L Computing & Information Literacy Lab.” http://www.cs.scranton.edu/~dmb2/syllabus.doc (Accessed June 4, 2007)

University of Texas at Austin (2007). “Computer Proficiency Test.” http://www.utexas.edu/academic/mec/cbe/cpt.html (Accessed June 4, 2007)

VanLengen and Haney (2006). Fundamentals of Information Systems Alternatives. Information Systems Education Journal, 4 (30). http://isedj.org/4/30/. ISSN: 1545-679X. (Also appears in The Proceedings of ISECON 2005: §4112. ISSN: 1542-7382.) (Accessed June 4, 2007)

Virginia Commonwealth University (2007). “MASC 203 Enrollment and Computer Literacy Procedures.” http://www.has.vcu.edu/mac//pdf_s/MASC203EnrollmentformFeb2007.pdf (Accessed June 4, 2007)

Wallace and Clariana (2005). Perception versus Reality—Determining Business Students’ Computer Literacy Skills and Need for Instruction in Information Concepts and Technology. Journal of Information Technology Education, 4. http://informingscience.org/jite/documents/Vol4/v4p141-151Wallace59.pdf (Accessed June 4, 2007)

Waldman and Ulema (2006). Utilizing Snyder’s “Fluency with Information Technology” in an Undergraduate “Introduction to Information Systems” Class. Information Systems Education Journal, 4 (38). http://isedj.org/4/38/. ISSN: 1545-679X. (Also appears in The Proceedings of ISECON 2005: §2352. ISSN: 1542-7382.) (Accessed June 4, 2007)

Wolk, R. M. (2006). How Important is Student Computing Ability? The Role of Information Technology Competence in Business School Accreditation. The Proceedings of ISECON 2006, v 23 (Dallas): §3142. ISSN: 1542-7382. (Accessed June 4, 2007)

APPENDIX 1: PRE- AND POST-TEST RESULTS

Table 1: All Students Pre- and Post-Test (N = 61)

                               Pre-Test Score*   Pre-Test Percent   Post-Test Score*   Post-Test Percent
Average                        40.61             54.1%              50.39              67.2%
t-test (one-tailed, paired)    p = 8.59566189E-17

* correct responses out of 75 questions

Table 2: “Test-Out” Students Scoring Greater Than 60% on Pre-Test (N = 19)

                               Pre-Test Score*   Pre-Test Percent   Post-Test Score*   Post-Test Percent
Average                        47.9              63.9%              55.1               73.4%
t-test (one-tailed, paired)    p = 7.26916E-05

* correct responses out of 75 questions
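As an illustration of how the summary statistics in Tables 1 and 2 could be reproduced from per-student scores, the following Python sketch uses SciPy’s paired t-test. The score arrays are hypothetical placeholders (the raw per-student data are not published here), and the paper does not state which software produced the reported p-values.

    # Minimal sketch (hypothetical data): reproducing the kind of summary
    # statistics reported in Tables 1 and 2 from per-student raw scores.
    # The arrays below are placeholders, not the study's actual data.
    import numpy as np
    from scipy import stats

    TOTAL_QUESTIONS = 75

    # Hypothetical paired scores: pre[i] and post[i] belong to the same student.
    pre = np.array([41, 36, 52, 47, 33, 45, 39, 50])
    post = np.array([49, 44, 60, 55, 42, 57, 48, 61])

    def summarize(pre_scores, post_scores, label):
        """Print mean scores, percentages, gain, and a one-tailed paired t-test."""
        pre_mean, post_mean = pre_scores.mean(), post_scores.mean()
        gain = post_mean - pre_mean
        # One-tailed, paired t-test: is the post-test mean greater than the pre-test mean?
        result = stats.ttest_rel(post_scores, pre_scores, alternative="greater")
        print(f"{label} (N={len(pre_scores)})")
        print(f"  pre-test:  {pre_mean:.2f} ({100 * pre_mean / TOTAL_QUESTIONS:.1f}%)")
        print(f"  post-test: {post_mean:.2f} ({100 * post_mean / TOTAL_QUESTIONS:.1f}%)")
        print(f"  gain: {gain:.2f} points ({100 * gain / pre_mean:.1f}% increase)")
        print(f"  one-tailed paired t-test p-value: {result.pvalue:.3g}")

    summarize(pre, post, "All students")

    # "Test-out" subgroup: students scoring at least 60% (45 of 75) on the pre-test.
    cutoff = 0.60 * TOTAL_QUESTIONS
    mask = pre >= cutoff
    summarize(pre[mask], post[mask], "Test-out students")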