A Case Study in the Redesign of an Introductory MIS Course to Account for the Multiple Learning Styles of Online Students

Gary Ury
garyury@nwmissouri.edu
CSIS, Northwest Missouri State University
Maryville, MO 64468

Abstract

In previous research the author discovered distinct and significant differences between online and traditional classroom student performance, as measured by final course grade, in a Management Information Systems course. A decision was made to redesign the course to provide access to its various components through multiple media. In particular, we were interested in increasing opportunities for socialization along with audio and visual interactivity. This article reports on the methods of the redesign and the tools used, and compares student performance under the new design to the findings of the previous study. In short, student performance as measured by final course grade percentage remained the same, with traditional students performing better than online students. Neither group's performance level increased significantly as a result of the course redesign.

Keywords: online student performance, online instruction, online course design, online course delivery, online pedagogy

1. INTRODUCTION

Ten years ago, online learning was a unique, if not unknown, idea. Today, 11 percent of all students are taking classes online (Roach, 2003). The growth rate of enrollment in online courses was 24.8% in 2004, which equates to roughly 2.4 million students. Students are attracted to the 24/7 availability and resource-saving nature of the online format (Smith, 2005; Carlson, 2004).

"Technology capabilities have led the way to new and recently popular course formats in higher education that involve online instruction, in which the instructor and students do not see each other face-to-face, but interact via a discussion board or email. Online course content has multiple components and is typically available on the Internet in written, photographic, video, and/or audio arrangements" (Smith, 2005, p. 52).

The development of technologies such as wireless broadband, streaming video and audio, and better compression algorithms makes it possible to download recorded lectures and demonstrations onto an increasing variety of devices, including notebook computers, PDAs, cell phones, and iPods. Online technologies have also changed today's traditional classroom environment, with Microsoft PowerPoint presentations, Microsoft OneNote, computer games, and Flash applications replacing the blackboard/whiteboard illustrations of old. Wireless networked portable computer labs on a cart can instantly convert any classroom into a connected, high-tech environment.

By now it should be obvious to educators and administrators in higher education that online learning and distance education have become, or soon will become, a fact of everyday life. In the 2003 Sloan Survey of Online Learning, two out of three academic leaders polled reported that online learning was critical to their institution's long-term strategy; the most positive views and the highest rates of online student enrollment were found at large public institutions (Carlson, 2004).

Online delivery presents two distinct questions for educators: 1) Are online students demonstrating an adequate understanding of the learning objectives presented for the course? 2) Are online courses developed to address the multiple learning styles of the diverse student population enrolled?
2. ONLINE VERSUS TRADITIONAL COURSES

"Because online learning has been available for less than a decade, the number of empirical studies examining online pedagogy is limited and conflicting" (Smith, 2005, p. 53). Many studies to date have fallen into the categories of pretest-posttest models and opinion surveys. More recently, studies have begun to surface that compare online and traditional student performance through cumulative course grades coupled with demographic factors such as class standing and overall GPA. Many of these studies have small sample sizes, were performed over short time periods, and measured a single teacher's experience with traditional and online instruction (Bearden, Robinson, & Deis, 2002; Holman, 2000; Miller, Cohen, & Beffa-Negrini, 2001; Smith, Smith, & Boone, 2000; Thirunarayanan & Perez-Prado, 2002). The results of these studies have been mixed: some show online students outperforming classroom students, many show no difference between the two formats, and others found that classroom students outperformed online students.

Lorenzetti (2005) reported on a study by Joseph Cavanaugh, associate professor of economics at Wright State University (Dayton, Ohio), that compared a sampling of students taking six online courses to all students in all courses offered at Wright State. Cavanaugh's study found that online students outperformed traditional classroom students. However, the same study reported that the average online student's overall GPA (3.45) was about one-half point higher than that of the average Wright State student (2.91), a difference that tracked consistently with the higher performance observed in the online courses. Roach (2003) suggests that "from the Ivy League to tiny community colleges, a majority of higher education institutions report that online learning is just as good as traditional, face-to-face classroom instruction" (p. 44).

Fortune, Shifflett, and Sibley (2006) reported on a study of two courses: one taught online, using highly technical Internet tools such as WebCT, video streaming, instant messaging, and e-mail, and the other taught on campus in a traditional classroom environment. With respect to perceived skill development, students judged the online mode of instruction to be just as effective as traditional in-class delivery. The study also reported that students selecting the online environment may be more independent than students who select a traditional on-campus course.

Summers, Waigandt, and Whittaker (2005) examined differences between online and traditional classroom students in an introductory undergraduate statistics course. Two outcome dimensions were measured: students' final grades and student satisfaction with the course. "There was no significant difference in grades between the online and traditional classroom contexts. However, students enrolled in the online course were significantly less satisfied with the course than the traditional classroom students on several dimensions" (p. 233).

O'Connell (2002) studied two groups of motivated corporate learners who were taught the same subject, Economics, at the same time in different formats. Even though modern online teaching tools, including streaming video of classroom lectures, were fully employed, online students scored significantly lower than their classroom counterparts. O'Connell determined that even with the most modern technologies, the student/teacher interaction gap is difficult to fill.
3. LEARNING STYLES

"Each student comes to class with certain learning experiences, expectations, and needs that have to be addressed, and to which instructors need to be sensitive, to maximize the students' learning experiences" (Mupinga, Nora, & Yaw, 2006, p. 187). Regardless of delivery method or level of instruction, those who deliver instruction should remain cognizant of these differences. For the online course designer, this means providing instruction through multiple formats and media.

There are many learning style theories and variations on those theories, but three major models recur in much of the literature. The Dunn and Dunn model includes five strands (environmental, emotional, sociological, physiological, and psychological), and within those five strands are 20 to 22 variables. Gardner's theory of multiple intelligences defines four learning styles: Visual/Verbal Learners (who learn best by reading and writing), Visual/Nonverbal Learners (who learn best by seeing pictures, charts, maps, and illustrations), Auditory/Verbal Learners (who learn best by hearing and watching), and Tactile/Kinesthetic Learners (who learn best by doing, participating, and experiencing) (Reese, 2002; Garland & Martin, 2005). Finally, Terry (2001) references Kolb's Experiential Learning Model (ELM), which depicts learning as a four-stage cycle consisting of four learning modes: concrete experience (CE), reflective observation (RO), abstract conceptualization (AC), and active experimentation (AE).

There is some disagreement in the field of education as to whether a particular learning style must be exactly matched to the individual student to achieve the most efficient learning model. There is also disagreement as to which learning style model is best used to predict student success. According to Terry (2001), "Researchers disagree, not over the idea that different people have different styles of thinking and learning, but over the claims that these styles can be measured reliably using currently available instruments, and that tailoring instruction to match these styles (by whatever learning styles measure) improves classroom learning performance" (p. 70). According to Grasha and Yangarber-Hicks (2000), "the literature on the connections of technology to teaching and learning styles is not well developed" (para. 39).

Reese (2002) states that most educators and researchers who have studied different learning styles say that there is no "right" or "wrong" way to learn, and there are no "good" or "bad" learning styles. There are simply different learning styles, and the best one for the student is the one that works. Some people learn actively and interactively, others focus on facts, some prefer visual forms of information, and some learn from written and spoken explanations (Mupinga, Nora, & Yaw, 2006). In the classroom, good teachers incorporate the various learning styles instinctively, but in the online environment a conscious effort must be made to design redundant materials that match different learning styles.

Regardless of model or learning style, there is agreement that each student has a preferred learning style. In the classroom, instructors have the ability to adjust to individual learning styles or provide a variety of exposures. Online instructors must predict and preplan to accommodate the variety of learning styles possessed by their students.
4. PURPOSE

In previous research involving 728 students over a 5-year period, the author determined that students enrolled in traditional classroom sections of a Management Information Systems course significantly outperformed online students enrolled in the same course, with an average course grade of 84% (traditional) compared to 81% (online). There was no significant difference between the groups in overall GPA, ACT scores, class standing, or instructor.

The purpose of this project was to reengineer the Management Information Systems course to include a wider variety of content delivery methods, driven specifically by the goal of addressing the variety of learning styles found among online students. The course is offered by the Computer Science/Information Systems department of this University and is the initial information systems course taken by all MIS majors. Management Information Systems is also one of the core courses required of all majors in the School of Business at this University. The course was designed to meet the scope of the IS 2002.01 Fundamentals of Information Systems course defined in the IS 2002 Model Curriculum and Guidelines for Undergraduate Degree Programs in Information Systems. The Association for Information Systems (2002) defines the scope of the IS 2002.01 course as "… an introduction to systems and development concepts, information technology, and application software. It explains how information is used in organizations and how IT enables improvement in quality, timeliness, and competitive advantage" (p. 24).

5. METHODOLOGY

Course redesign was accomplished during the summer of 2005. Lectures were recorded in small topic units of less than 10 minutes each using two different applications. Tegrity (http://www.tegrity.com/) hardware and software were used to record both the instructor and PowerPoint slides for some lectures. The Impatica (http://www.impatica.com/) software application was used to create streaming, audio-only files from narrated PowerPoint slides for other lectures. Narrated and illustrated demonstrations of Microsoft Access and Excel were added to the course website to assist students with the activities and assignments involving these applications. Written lecture notes, outlines, and chapter slides were also provided at the course website, and "Fundamentals of Information Systems" by Stair and Reynolds was adopted as the required text. Discussion groups were added to the website, and instructor interactions using announcements and email distribution lists were designed into the course.

6. FINDINGS

SPSS version 11.5 statistical software was used to perform independent-samples t-tests to determine differences between the average course grades of 113 traditional classroom students and 46 online students; the data were gathered in the fall and spring semesters of the 2005-2006 school year. The course grades were compared by delivery method and by instructor. Additionally, students were compared by total credit hours completed (to establish class standing), overall GPA, and composite ACT scores. Table 1, located in Appendix 1, demonstrates that the students enrolled in the online sections and the students enrolled in the traditional sections of MIS were academically equivalent. The profile of the average student in both groups was a college junior with an overall GPA of around 2.90 and an average composite ACT score of 22 to 23.
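For readers who want to reproduce this type of comparison outside of SPSS, the short sketch below shows an equivalent independent-samples t-test in Python using SciPy. The grade arrays are hypothetical placeholders, not the study data.

# Minimal sketch of an independent-samples t-test comparing two groups of
# final course grades, analogous to the SPSS procedure described above.
# The grade lists are hypothetical placeholders, not the study data.
from scipy import stats

traditional_grades = [88, 79, 85, 91, 76, 83, 90, 81, 87, 84]  # hypothetical
online_grades = [80, 74, 82, 77, 85, 71, 79, 83, 76, 78]       # hypothetical

# Two-sample (independent) t-test on the group means.
t_stat, p_value = stats.ttest_ind(traditional_grades, online_grades)
print("t = %.3f, p = %.3f" % (t_stat, p_value))

# A p-value below 0.05 would indicate a statistically significant difference
# between the average grades of the two delivery formats.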
Table 2, located in Appendix 1, illustrates that there was no significant difference in course grade by instructor (t = 1.850, p > 0.05). Both instructors taught one online section and multiple traditional sections during the study timeframe. Table 3, located in Appendix 1, shows that traditional classroom students (83%) significantly outperformed online students (79%) when average final course grades were compared (t = 1.734, p < 0.05).

Finally, course grades from the redesigned course groups were compared to those of 800 traditional classroom students and 219 online students who completed the course under the previous course design; these data were collected over a 6-year period. Table 4 illustrates that there was essentially no difference (in fact, a decrease in the traditional sections) in the performance levels of students in either delivery format after the redesign efforts. A breakdown of traditional and online scores by major activity is shown in Table 5. Four of the major categories (Access lab exam, Access project, Excel lab exam, and Excel project) exhibit significantly lower scores for online students. Online students outperformed traditional students when average exam scores were compared.

7. LIMITATIONS

The groups were self-selected: students were allowed to enroll in traditional or online sections as they chose, and no random selection method was employed. In some cases, students who enrolled after all traditional sections were full may have been forced into the online section against their will. This study was based on a single course at a mid-sized, moderately selective public university in the Midwest. Generalization of results would be difficult without duplicating the study in similar courses at other universities. The study limited the measure of performance to the quantitative criteria of course and project grades, and the data were collected over a single academic year. No attempt was made to separate out students who stopped participating but did not officially drop the course.

8. CONCLUSIONS

While online students exhibited significantly lower course grades than traditional classroom students, it is important to point out that the online students performed successfully, with an average course grade of 79%. As other case studies and publications indicate, sometimes there is simply a difference in performance between online students and traditional classroom students, and that difference seems to become more dramatic as the complexity and technical requirements of the course increase.

When individual elements of the course grade were compared, it was apparent that online and traditional students produced exam grades that were relatively close. The components that established the major differences were the technical, hands-on projects. While both groups used the same tutorial/case book, and great care was taken to provide additional guided instruction through online demonstrations, the traditional classroom students outperformed the online students in all four activities. It is logical to assume that the ability to work on the practice exercises in class and to interact with the instructor in a structured environment added significantly to traditional students' understanding of the technical aspects of the course. Based on the statistical measures of this study, it appears that the redesign efforts produced no payback when student performance was measured by grades on major course modules and final course grades.
Many online students verbally indicated that they appreciated the new content, but statistics generated by the course delivery system indicated that, even though the new tools were present, some students were not using them. This was especially true of the Access and Excel tutorial videos.

It is possible that online students who received artificially low scores had stopped participating in the course while failing to officially drop it. Such extremely low scores could have exerted downward pressure on average grades within the short time frame of this study, and a method of excluding such cases from the study would be beneficial. Since students for the most part chose the online sections of the MIS course, it is possible that they were willing to sacrifice points for convenience. Students may have mistakenly perceived the online delivery as easier since class attendance was not required. Some students may view this course, required of all majors in the School of Business, as just another hurdle in the process of earning their degree. These students may never come to a realization of how this course applies to their major and therefore sacrifice performance levels to concentrate on other major-specific courses.

Two major questions remain: 1) How can technical hands-on labs be designed to become more interactive and valuable to online students? 2) How can online students be motivated to stay in the course and perform to the best of their abilities?

Educators must persevere in their efforts to continually improve online courses, not necessarily to emulate the classroom, but to provide new and equal learning experiences to online students. Many studies and surveys predict that the demand for online classes and degree programs will continue to grow. Teachers must use evolving technologies and established tools to develop an online pedagogy that works for the variety of student learning styles. Students must fully utilize these online elements if they are to understand the concepts as completely as their classroom counterparts. This partnership must be embraced by both students and teachers if online courses and degree programs are to be viewed as equal to a traditional college and university education. Sustained improvement of online methods of instruction, driven by continued research and new technologies, is essential in this highly competitive field of education.

9. REFERENCES

Association for Information Systems (AIS). (2002). IS 2002 model curriculum and guidelines for undergraduate degree programs in information systems. Retrieved June 28, 2006 from http://www.aisnet.org/Curriculum/

Bearden, E. B., Robinson, K., & Deis, M. H. (2002, Summer). A statistical analysis of dental hygiene students' grades in online and on-campus courses and performance on the National Board Dental Hygiene Exams. Journal of Dental Hygiene, 76(3), 213-217. Retrieved June 1, 2006 from the Expanded Academic database.

Carlson, S. (2004, November 26). Online-education survey finds unexpectedly high enrollment growth. Chronicle of Higher Education, 51(14), 30. Retrieved June 10, 2006 from the Academic Search Premier database.

Fortune, M. F., Shifflett, B., & Sibley, R. E. (2006, Mar/Apr). A comparison of online (high tech) and traditional (high touch) learning in business communication courses in Silicon Valley. Journal of Education for Business, 81(4), 210-214. Retrieved June 5, 2006 from the Expanded Academic database.
Garland, D., & Martin, B. N. (2005, Fall). Do gender and learning style play a role in how online courses should be designed? Journal of Interactive Online Learning, 4(2), 67-81. Retrieved June 5, 2006 from the Expanded Academic database.

Grasha, A. F., & Yangarber-Hicks, N. (2000, Winter). Integrating teaching styles and learning styles with instructional technology. College Teaching, 48(1), 2-10. Retrieved June 5, 2006 from the Expanded Academic database.

Holman, L. (2000, Fall). A comparison of computer assisted instruction and classroom bibliographic instruction. Reference & User Services Quarterly, 40(1), 53. Retrieved June 1, 2006 from the Expanded Academic database.

Lorenzetti, J. P. (2005, November). How online students can improve overall student quality. Distance Education Report, 9(22). Magna Publications, Inc. Retrieved June 15, 2006 from the Education Full Text database.

Miller, B., Cohen, N. L., & Beffa-Negrini, P. (2001, Winter). Factors for success in online and face-to-face instruction. Academic Exchange Quarterly, 5(4), 4-10. Retrieved June 1, 2006 from the Expanded Academic database.

Mupinga, D. M., Nora, R. T., & Yaw, D. C. (2006, Winter). The learning styles, expectations, and needs of online students. College Teaching, 54(1), 185-193. Retrieved June 15, 2006 from the Expanded Academic database.

O'Connell, B. (2002, July). A poor grade for e-learning. Workforce, 81(7), 15. Retrieved June 10, 2006 from the Academic Search Premier database.

Reese, S. (2002, January). Understanding our differences. Techniques (Association for Career and Technical Education), 77(1). Retrieved June 15, 2006 from the Expanded Academic database.

Roach, R. (2003, September 25). Survey says online learning equal to classroom instruction. Black Issues in Higher Education, 20(16), 44. Retrieved June 10, 2006 from the Academic Search Premier database.

Smith, S. (2005, Spring). The positive and challenging aspects of learning online and in traditional face-to-face classrooms: A student perspective. Journal of Special Education Technology, 20(2), 52-59. Retrieved June 15, 2006 from the Education Full Text database.

Smith, S. B., Smith, S. J., & Boone, R. (2000, Spring). Increasing access to teacher preparation: The effectiveness of traditional instructional methods in an online learning environment. Journal of Special Education Technology, 15(2), 37-46. Retrieved June 15, 2006 from the Education Full Text database.

Summers, J., Waigandt, A., & Whittaker, T. (2005, Spring). A comparison of student achievement and satisfaction in an online versus a traditional face-to-face statistics class. Innovative Higher Education, 29(3), 233-250. Retrieved June 10, 2006 from the Academic Search Premier database.

Terry, M. (2001, Fall). Translating learning style theory into university teaching practices: An article based on Kolb's experiential learning model. Journal of College Reading and Learning, 32(1), 68-85. Retrieved June 15, 2006 from the Education Full Text database.

Thirunarayanan, M. O., & Perez-Prado, A. (2002, Winter). Comparing web-based and classroom-based learning: A quantitative study. Journal of Research on Technology in Education, 34(2), 131-137. Retrieved June 10, 2006 from the Education Full Text database.