Student Usage of Instructional Technologies: Differences in Online Learning Styles

Robert M. Ballenger
ballengerb@wlu.edu

Dennis M. Garvis
garvisd@wlu.edu

Washington and Lee University
Lexington, Virginia 24450 USA

Abstract

We contribute to the MIS education literature by empirically examining Web server log data generated by undergraduate students enrolled in multiple sections of an MIS course in which an online Learning Management System (LMS) was used to complement a traditional classroom environment. We identify online learning styles by investigating differences in LMS usage patterns, finding four distinct usage patterns as well as differences in the level and variation of LMS usage by male and female students. We suggest that online learning styles are important considerations for instructors using instructional technologies as well as for researchers.

Keywords: online learning styles, instructional technologies, empirical investigation, log file data

1. INTRODUCTION

Higher education has adapted to innovations in the development of instructional technology with initiatives ranging from hybrid courses offered on traditional campuses to Web-based degrees conferred by online schools. In connection with the growth in these programs, researchers in MIS education and other fields have empirically investigated the use of these technologies (Alavi & Gallupe, 2003; Lu, Yu, & Liu, 2003; Arbaugh, 2005a). The present study adds to this empirical literature by examining Web server log data generated by undergraduate students enrolled in multiple sections of a Management Information Systems (MIS) course in which an online Learning Management System (LMS) was used to complement a traditional classroom environment. Specifically, we identify online learning styles by investigating differences in LMS usage patterns. As reported below, we find four distinct usage patterns as well as differences in the level and variation of LMS usage by male and female students.

This study represents a portion of a larger research project investigating the relationships between instructional technologies, student learning styles, instructor teaching styles, and student learning outcomes. Accordingly, this project responds to calls for additional depth in theoretically grounded empirical research on the integration of technology, instructional method and environment, and student learning (Alavi & Leidner, 2001; Woods, Badzinski, & Baker, 2007). We also contribute to the literature by conducting an empirical examination of primary data representing student usage of instructional technologies recorded on a Web server.

The remainder of this paper is divided into the following sections. First, we review research from several disciplines to focus on an integrative model of student learning with instructional technologies. Second, we outline our research questions regarding online patterns of student behavior and the potential effects on student learning. Next, we describe our methodology, followed by our results. We then close the paper with a discussion of our findings.

2. RESEARCH FOCUS

Three broad research streams crossing many disciplines have examined the development of instructional technologies. One stream has prescribed how instructors and institutions should use instructional technologies to create innovative projects, course Web sites, courses, management systems, and programs (Bergman & Bergman, 2003; Morgan, 2003; Twigg, 2001).
For MIS educators, research of this type has recognized key organizational, learning, and teaching practices and processes (Alavi & Gallupe, 2003; Kraft, Kakish, & Steenkamp, 2009).

A second stream of work has sought to determine whether there are significant differences in outcomes between technology-based and traditional courses (Harley et al., 2003; Phipps & Merisotis, 1999). Although the majority of this early research indicated no significant differences in student outcomes, the work has been criticized for methodological problems (Phipps & Merisotis, 1999) and for omitting constructs such as instructional design and teaching style (Arbaugh, 2000a; Hiltz & Wellman, 1997). Subsequent findings from more rigorous research are still mixed. Benbunan-Fich and Hiltz (2002) reported no significant difference in perceived learning across course delivery modes (completely online, mixed, completely on campus) but significantly lower grade performance in on-campus learning environments in MIS and non-MIS classes. Newlin, Lavooy, and Wang (2005), using a true experimental design that randomly assigned students to a traditional face-to-face lecture, a Web-based synchronous lecture, or a Web-based asynchronous lecture, reported no significant difference in student learning performance but more positive attitudes towards learning among students in the Web-based lectures. Koch and colleagues (2007) found that significant mid-term differences in communication ambiguity and student grade performance between students enrolled in online and face-to-face courses disappeared by the end of the term.

The third and most recent stream of research has developed integrative models focusing on teaching and learning with instructional technologies. Researchers in this stream have empirically explored the potential influences of both extrinsic and intrinsic factors on several dimensions of student outcomes, including student satisfaction, learning, and course performance. For example, Alavi and Leidner (2001) suggested that examination of the crucial question of how technology enhances learning requires attention to the relationships between instructional, psychological, and environmental factors. Similarly, Arbaugh and Stelzer (2003) suggested that relationships with and interactions between student characteristics, student learning, and instructor pedagogical styles are fundamental to understanding the role of faculty in Web-based courses. Graf and colleagues (2007) have focused on understanding student learning styles as a means of improving the student modeling necessary for developing adaptive learning systems that enhance student learning. Clearly, this research emphasizes the crucial influences on and interdependencies between learning and teaching that affect student outcomes.

Whereas much of the second research stream has sought to discover comparative differences between courses that use instructional technology and those that do not, our research focus is directed towards differentiation in the usage of instructional technologies within courses. As such, we build on the third stream of work examining a comprehensive model of teaching and learning.
While the learning context model suggested by Alavi and Gallupe (2003) has influenced us, our focus on student learning styles is more consistent with work performed by MIS (Graf et al., 2007; Liegle & Janicki, 2006), business (Brokaw & Merz, 2000), management (Marks, Sibley, & Arbaugh, 2005), marketing (Young, Klemz, & Murphy, 2003), and engineering (Zwyno, 2003) education researchers.

We investigate potential differences in the effects of student usage of instructional technologies by focusing specifically on the Learning Management System (LMS) as a particular online instructional technology. LMS are commonly adopted or developed by educational institutions: approximately 90% of all higher education institutions have adopted either a proprietary or open-source LMS such as Blackboard, WebCT, Moodle, or Sakai (Hawkins, Rudy, & Madsen, 2003). While there are multiple forms of and uses for LMS (Boetcher, 2003; Morgan, 2003), an LMS is a platform for both asynchronous tools (file transfers, email, text, graphics, video, audio, and discussion forums) and synchronous tools (whiteboards, videoconferencing, and chat) that extend conventional learning environments.

3. STUDENT LEARNING STYLES AND LEARNING MANAGEMENT SYSTEMS

Student learning has been conceptualized as an individual's perceptual and intellectual activities relating to information processing, problem solving, and decision-making (Armstrong, 2000). Student learning styles are the preferences and behaviors that serve as indicators of how learners perceive, interact with, and respond to the learning environment. Although more than fifty different cognitive learning style theories and models have been proposed by scholars (Armstrong, 2000), three prominent streams of empirical research have followed from the works of Grasha, Kolb, and Felder. Whereas Grasha's (1996) research is premised on a social interaction model of learning and teaching, and Kolb's (Kolb & Kolb, 2005) work is based on experiential learning theory, the Felder-Silverman Learning Style Index (LSI) model is focused on differences in information acquisition, processing, and understanding (Felder & Silverman, 1988; Felder & Spurlin, 2005). Because of this emphasis, Felder's LSI model is adopted in our study of the usage of instructional technologies.

Felder's LSI model consists of four dimensions. The active/reflective dimension contrasts active learners, who retain and understand information by discussing, applying, or explaining it, with reflective learners, who prefer contemplation and consideration. The sensing/intuitive dimension reflects a preference for concrete information versus abstraction: students with sensing tendencies prefer facts and well-established methods, whereas intuitive learners prefer discovering possibilities and looking for innovative problem-solving techniques and solutions. The visual/verbal dimension rests upon the means by which information is presented. Visual learners remember best what they see: pictures, diagrams, charts, and demonstrations. Verbal learners, on the other hand, rely upon words, either written or spoken. Finally, the sequential/global dimension is grounded in the sequence by which information is understood. Sequential learners move in linear steps, with each intermediate step logically followed until a complete solution is understood. Global learners move in large, seemingly random jumps before they "get it."

Over the last two decades, researchers from multiple disciplines, including MIS and related fields, have empirically confirmed differences in student learning styles.
The investigation by Graf and colleagues (2007) into the LSI learning style preferences of 207 students, most of whom were studying Information Systems, led to the identification of additional characteristics within the four LSI learning dimensions. Most recently, Sandman (2009) reported that the Felder-Silverman learning style preferences of 307 undergraduate business telecommunications students skewed strongly towards the sensing and very strongly towards the visual dimensions. He also reported that only one learning style type (reflective-sensing-verbal-sequential) scored significantly higher grades than the rest of the sample.

A few researchers have begun to incorporate instructional technologies into their learning style analyses, typically using the LSI. Young and his colleagues (2003) found that the LSI learning style preferences of undergraduate students enrolled in required marketing courses at a U.S. university did not extend to differences in preferences for instructional technology usage. Similarly, Lu and colleagues (2003) reported no differences in learning by graduate students in an MIS course despite differences in learning styles, online usage patterns, and demographic characteristics. In contrast, Zwyno and Waalen's (2002) examination of Canadian undergraduate students enrolled in an upper-level engineering course showed that students with LSI preferences for intuitive, visual, and active learning had the highest average number of page hits, logins, and pages viewed, whereas students preferring verbal learning were the highest users of email but had the lowest number of logins, hits, and use of Web resources. In a study of undergraduate marketing courses at an Australian university, Morrison and his colleagues (2003) reported that the LSI learning styles of traditional students differed from those of online students: on-campus students tended to prefer the visual and active learning style dimensions, whereas online students preferred the sensing, reflective, and verbal dimensions. Furthermore, cluster analysis identified three learning style groups for traditional as well as online students, each reflecting a different combination of learning style factors. In examining the outcomes of graduate business students enrolled at a U.S. university, Clouse and Evans (2003) reported that an on-campus face-to-face class had more LSI active learners, whereas an off-campus online course had more reflective learners. More recently, Garland and Martin (2005) used Kolb's learning style instrument, rather than the LSI, to investigate learning styles in business and non-business courses. They found differences indicating that online students exhibited an assimilating learning style while face-to-face students showed a diverging learning style.

The empirical trend in this research suggests that instructional technologies influence student learning styles. Usage of instructional technologies reflects the process by which students access, process, and understand the information that is necessary to learning. Thus, because students exhibit measurable differences in their learning styles, it is reasonable to expect that there will likewise be differences in instructional technology usage and online learning styles. It is through this lens that we seek answers to our first research question: Are there differences in online instructional technology usage by students that reflect online learning styles?
Our second research question concerns whether gender matters in instructional technology usage and online learning styles. This follows from prior findings that gender affects the usage of instructional technology. Arbaugh's (2000a) examination of MBA students found that the instructional technology embedded in an Internet-based MBA course did not lead to gender differences in learning (i.e., grades). It did show, however, that female students used the communication tools as a means of learning in a participatory and collaborative way that was distinguishable from male students. Similarly, Garland and Martin (2005) found partial support for gender differences in the relationship between student learning style and Blackboard usage. Accordingly, our second research question asks: What differences in instructional technology usage and online learning styles are noticeable when gender is taken into account?

4. METHODOLOGY

The data for this study were collected from five undergraduate sections of a survey Management Information Systems course offered over a two-and-one-half-year period and taught in the business school of a small, highly selective, private liberal arts university. A single instructor taught all sections of the course, employing a hybrid instructional format consisting of traditional face-to-face class meetings three days a week integrated with extensive use of an LMS developed by the instructor. Students had access to the LMS by means of desktop computers available in labs and laptop computers used intermittently in class, both provided by the university, as well as their own personal (desktop or laptop) computers from both on and off campus.

The LMS contains virtually all of the course's required pedagogical resources, other than the lectures, discussions, and related material presented in face-to-face classroom time. The LMS course content resources include the Course Syllabus, Assignments, Student Grades, Textbook Online, Topical Articles, Real-World Scenarios, Case Guidelines, Case Studies, CyberShows (McCray, 2000), Software Development Projects, Software Tutorials, and a Final Case Study. These are all asynchronous resources. The LMS also includes a variety of pages for team project management, password management, instructor contact information, online textbook password and access, and general navigation (home, menu, and header pages), none of which contain substantive course content. Future research is planned using a commercial LMS. To create consistency in resource availability and to allow examination of the usage patterns of a larger sample of students, no changes were made to the LMS for the period of this study. Table 1 provides a brief description of the content available on the LMS.

Data Collection and Preparation

During the first day of class, students were instructed on how to access and use the instructor's LMS. Students were also instructed to check the LMS regularly for new or revised assignments, which were posted approximately ten days before due dates. Occasionally during the semester, students were informed that new material or assignments had recently been posted to the LMS. Each enrolled student was assigned a username and selected his or her own password; LMS access was not granted to unauthorized users.
Web servers tracked and logged all student "click-stream" activity over the entire academic term, including which students accessed resources, which resources were accessed, when resources were accessed, from where resources were accessed, and how resources were used. In raw form, the log file data collected by the server is unsuitable for meaningful data analysis. In addition, because the Web server logs any and all activity on the LMS, a considerable amount of log file data is not pertinent to learning style activity. Consequently, a conversion process is necessary to remove non-pertinent data and convert the raw data into a format suitable for analysis (Ballenger & Garvis, 2005). After completing the data scrubbing and conversion process, the total number of pedagogical content hits attributable to the students in this sample is 49,499.

Measures

The sample consists of eighty-seven traditional undergraduate students enrolled in five sections of the same MIS course: 1) thirty-three females (37.9%) and fifty-four males (62.1%); 2) forty-one seniors (47.1%) and forty-six juniors (52.9%); and 3) majors from the fields of Accounting (n=3, 3.4%), Business Administration (n=76, 87.4%), Computer Science (n=3, 3.4%), Engineering (n=1, 1.1%), Economics (n=2, 2.3%), Mathematics (n=1, 1.1%), and Politics (n=1, 1.1%).

Patterns in online learning styles were measured by frequency of access to the thirteen LMS pedagogical resources. While this technique was considered unusual in the early stages of empirical research on instructional technology usage (Nachmias & Segev, 2003; Peled & Rashty, 1999), it has become a typical method of examining student usage of technology resources (Baugher, Varnelli, & Weisbord, 2003; Lu, Yu, & Liu, 2003; Garland & Martin, 2005). As previously mentioned, student access to LMS content was recorded in the log file on the LMS server under the student's username. The number of times each student accessed each type of pedagogical content resource was then counted, allowing us to analyze patterns of LMS resource access by student. These measures of student usage are grouped by resource type in Table 2. For example, all of the variables relating to procedural content appear under the "Procedural" heading, while all content relating to online readings is grouped under the "Reading" heading.

Five measures of student performance were adopted. The grades for three case studies and two software development projects were averaged separately, resulting in two variables, Case Studies Average and Software Projects Average. The Course Contribution Grade and the Final Case Study Grade were also included as performance measure variables. The final variable, Course Average, represents a student's overall weighted average of all performance measures in the course and the final grade awarded to the student. A summary of these variables is presented in Table 3.
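To illustrate the kind of scrubbing and conversion step described above, the following sketch parses a raw access log, discards entries that are unauthenticated, unsuccessful, or unrelated to pedagogical content, and tallies hits per student per resource. It is illustrative only: the log format, URL prefixes, and function names are assumptions made for this example, not the authors' actual conversion programs (Ballenger & Garvis, 2005).

    import csv
    import re
    from collections import defaultdict

    # Hypothetical mapping from URL path prefixes to the pedagogical resource
    # categories in Table 2; the real LMS paths are not given in the paper.
    RESOURCE_PREFIXES = {
        "/syllabus": "Syllabus",
        "/assignments": "Assignments",
        "/textbook": "Textbook Online",
        "/cybershows": "CyberShows",
        "/tutorials": "Software Tutorials",
        "/articles": "Topical Articles",
        "/scenarios": "Real-World Scenarios",
        "/cases": "Case Studies",
        "/grades": "Student Grades",
    }

    # Common Log Format with the authenticated username in the third field.
    LOG_PATTERN = re.compile(
        r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
    )

    def classify(path):
        """Return the resource category for a request path, or None for
        navigational/logistical hits that should be scrubbed out."""
        for prefix, category in RESOURCE_PREFIXES.items():
            if path.startswith(prefix):
                return category
        return None

    def count_hits(log_path):
        """Build a {username: {resource: hit_count}} table from a raw access log."""
        hits = defaultdict(lambda: defaultdict(int))
        with open(log_path) as fh:
            for line in fh:
                m = LOG_PATTERN.match(line)
                if not m or m.group("user") == "-":
                    continue                      # unauthenticated or malformed entry
                if not m.group("status").startswith("2"):
                    continue                      # drop errors and redirects
                category = classify(m.group("path"))
                if category is None:
                    continue                      # menu, header, password pages, images, etc.
                hits[m.group("user")][category] += 1
        return hits

    def write_matrix(hits, out_path):
        """Flatten the nested counts into a student-by-resource CSV for analysis."""
        resources = sorted({r for per_user in hits.values() for r in per_user})
        with open(out_path, "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow(["username"] + resources)
            for user, per_user in sorted(hits.items()):
                writer.writerow([user] + [per_user.get(r, 0) for r in resources])

The resulting student-by-resource matrix corresponds to the frequency counts, grouped by resource type, that are summarized in Table 2.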
5. RESULTS AND ANALYSIS

Table 2 reports the full set of descriptive statistics (mean, median, standard deviation, minimum, and maximum) for the thirteen LMS resource variables. Textbook Online (14,864 hits), Assignments (6,835 hits), and CyberShows (6,361 hits) were the most widely used LMS resources, and Online Presentations (156 hits), Final Case Study (419 hits), and Real-World Scenarios (568 hits) the least used. Textbook Online and CyberShows also showed the highest levels of usage variation.

Cluster analysis of the thirteen pedagogical content variables was used to identify patterns in LMS usage. Ward's hierarchical cluster analysis was initially used to determine the number of clusters, followed by K-means cluster analysis (Hair, Black, Babin, Anderson, & Tatham, 2006). Cross-tabulation between the initial Ward's method results and the final K-means clustering results indicated that 83.6% of the students were placed in the same clusters by both methods, providing evidence of convergent validity (Hair et al., 2006; Morrison et al., 2003). The clustering procedures yielded four clusters, as shown in Table 4.

Examination of the variable means and medians for each cluster provides a mechanism for comparative analysis of LMS resource usage patterns. Overall, Cluster 1 has the lowest values for all variables with the exception of the CyberShows variable, while Cluster 4 is the group of heaviest users, with the highest values for all variables except CyberShows and Online Presentations. Values for Clusters 2 and 3 generally lie between those of the other two clusters but vary relative to each other.

While the mean and median values of the LMS variables assist in the initial identification of the clusters, they do not completely address all elements that require interpretation and understanding. Information regarding the degree of dispersion of the clustering variables in each cluster allows us to assess the relative strength of each clustering variable across the four clusters, and thus provides richer information that can be used to interpret them. To assess this dispersion, we calculated a Z-Score for each clustering variable in each of the four clusters: Z = (cluster mean - sample mean) / sample standard deviation. The derived Z-Scores are reported in Table 4. A graphical representation of these results provides the final step in our understanding and interpretation of the four online learning style clusters, as presented in Figure 1.

Inspection of the Z-Score values in Table 4 and of the graph of those scores in Figure 1 reveals four distinct patterns of student behavior in accessing content on the LMS. In the remainder of the paper, all references to the values of clustering variables refer to the mean value of that variable for a given cluster. The Z-Scores are based on these mean values; therefore, when comparing a Z-Score for one variable across clusters, we are actually comparing the mean behavior of students in that cluster to the mean of the entire class or to the mean behavior of students in another cluster.
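The two-stage clustering and Z-Score profiling described above can be sketched as follows. This is not the authors' original analysis, which was presumably carried out in a standard statistics package; it is a minimal illustration using scipy, scikit-learn, and pandas, assuming a student-by-resource hit matrix like the one produced in the previous section. The simple label-matching heuristic used to estimate agreement between the two clustering methods is also an assumption, included only to show how a figure such as the reported 83.6% agreement might be derived.

    import pandas as pd
    from scipy.cluster.hierarchy import fcluster, linkage
    from sklearn.cluster import KMeans

    def cluster_usage(usage: pd.DataFrame, k: int = 4, seed: int = 0) -> pd.DataFrame:
        """Two-stage clustering of the student-by-resource hit matrix:
        Ward's hierarchical method suggests k, K-means gives the final solution."""
        X = usage.to_numpy(dtype=float)

        # Stage 1: Ward's method (in practice, inspect the dendrogram or
        # agglomeration schedule to choose the number of clusters).
        ward_labels = fcluster(linkage(X, method="ward"), t=k, criterion="maxclust")

        # Stage 2: K-means with k clusters for the final assignment.
        km_labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)

        out = usage.copy()
        out["ward"] = ward_labels
        out["kmeans"] = km_labels
        return out

    def agreement(out: pd.DataFrame) -> float:
        """Share of students placed in the same group by both methods, after
        mapping each Ward cluster to its modal K-means cluster (a heuristic)."""
        tab = pd.crosstab(out["ward"], out["kmeans"])
        best = tab.idxmax(axis=1)
        matched = sum(tab.loc[w, best[w]] for w in tab.index)
        return matched / len(out)

    def zscore_profiles(out: pd.DataFrame, variables: list[str]) -> pd.DataFrame:
        """Z = (cluster mean - sample mean) / sample standard deviation, per variable."""
        mu, sd = out[variables].mean(), out[variables].std()
        return out.groupby("kmeans")[variables].mean().sub(mu, axis=1).div(sd, axis=1)

The Z-Score profiles returned by the last function correspond to the cluster-by-variable values reported in Table 4 and plotted in Figure 1.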
As presented in Table 1, some types of LMS resources may be seen as text oriented and static, whereas others are dynamic, multimedia, and interactive. Focusing on the learning style dimensions of the various resources, active learners should prefer using the interactive Software Tutorials more often because they can immediately apply what they have learned, while reflective learners would rather think about the material and use that resource less often. Visual learners, because they remember best what they see, should have a stronger preference for the multimedia/interactive content (CyberShows and Software Tutorials), while verbal learners get more out of written words and therefore should prefer the reading content (Textbook Online, Topical Articles, Real-World Scenarios, and Online Presentations). Sensing learners prefer learning material that is connected to the real world; they like learning facts and problem-solving methods they have previously been taught. Therefore they should prefer the Real-World Scenarios, Topical Articles, Case Studies, Software Projects, and Final Case Study content. Intuitive learners, on the other hand, because they dislike dealing with details, may shy away from the Software Projects, Case Studies, and Final Case Study content. Consequently, the frequency with which students used LMS resources represents usage patterns that measure learning styles.

The students in Clusters 2 and 3 exhibited similar behavior when accessing the Procedural resources but divergent patterns for Multimedia/Interactive, Reading, and Performance Outcome content. Both clusters were relatively close to the class mean for Procedural resources, indicating a pragmatic approach to this content. However, as is very noticeable for Multimedia/Interactive content, Cluster 3's frequency of access for CyberShows is the highest while Cluster 2's is the lowest. Based on Felder's visual/verbal dimension, this suggests that the students in Cluster 3 may be more visually oriented learners and those in Cluster 2 more verbally, or text, oriented. When Reading content is considered, these visual and verbal orientations are affirmed, as Cluster 2 students accessed the Textbook Online and Real-World Scenarios reading content, which constitutes the bulk of the assigned reading for the course, more frequently than the Cluster 3 students. This supports the interpretation that the reading preference of the Cluster 2 students indicates a verbal orientation as explained by Felder. Accordingly, good descriptive labels for Learning Style Clusters 2 and 3 are "Verbally Oriented" and "Visually Oriented," respectively.

The usage pattern of the students in Cluster 4 shows that they enthusiastically embraced all of the LMS technology. Their frequency of accessing the Multimedia/Interactive materials suggests active learners comfortable with visual as well as verbal resources. They also appear to be very comfortable with reading assignments online. Accordingly, an appropriate descriptive label for Learning Style Cluster 4 is "Enthusiast."

In contrast, the students in Learning Style Cluster 1 had the lowest frequency of usage for almost all of the LMS resources, which leads to the label "Minimalist" for this cluster. Indeed, on those important items that normally require continual access by students throughout the entire semester, such as Assignments, Textbook Online, and Topical Articles, the students in this cluster had the lowest frequency. Several different scenarios could explain this behavior. First, the students in this cluster may be a type of technologically averse student. Rather than accessing online resources on a regular basis, these students may be printing out the content, filing it, and using hard copy in a more traditional manner. Similarly, since the results indicate that these students accessed the Multimedia/Interactive content considerably less than students in the other three clusters, they may have been gaining the relevant information from traditional face-to-face classroom instruction. Overall, this pattern may represent a reflective, verbally oriented learning style that did not fit with the emphasis on technology embedded in the teaching style of the instructor.
It may be that an objectivist, teaching-centered style would have been more consistent with the learning styles of the students in this group. Alternatively, these students may simply have been unmotivated, and regardless of the instructor's teaching style, their online activity would lag. As shown in Table 5, the grade performance of this group was notably the lowest in three of the five categories, including the Course Average. This is in direct contrast to the Enthusiast cluster, which had the highest grades in all five categories.

In addition to general differences in student usage of resources, we also sought to investigate whether there are gender differences. Descriptive statistics for resource usage by gender are reported in Table 6. Female students showed higher levels of overall average resource usage, with higher usage for nine of the thirteen resources. Female students were noticeably heavier users of the Assignments, CyberShows, and Software Tutorials resources, whereas male students were heavier users of the Textbook Online resource. Male students showed higher variation in the overall usage of resources, with higher variation in usage for eight of the thirteen variables. Male students showed noticeably higher variation in the Software Tutorials, Textbook Online, and Topical Articles resources, whereas female students showed more variation in CyberShows usage.

Gender is taken into account in the clusters in Table 7 by tabulating the number of female and male students within each cluster. These results show that both male and female students are found in each learning style cluster, with noticeable results in Clusters 3 and 4. With respect to Cluster 3, there were more female students showing a Visually Oriented style, both in comparison to the number of male students in this cluster and to what would be expected from the overall sample. In Cluster 4, the Enthusiast learning style, there were nine male students but only one female student. Of further note is that over 44% of the male students are in Cluster 1. Overall, there is some evidence suggesting that there are gender differences in the online learning styles reflected in LMS usage.

6. DISCUSSION

Before we discuss the interpretation and contribution of our findings, we must note one limitation. Our sample was drawn from multiple sections of a single course at a single institution taught by a single instructor, which is common in prior education research in this area (Arbaugh & Stelzer, 2003). While this approach provides benefits in controlling for teaching style and learning environment, it also limits the generalizability of results. The resource usage patterns comprising the online learning styles evidenced here may not be the same as those that would be observed with other teaching styles or in completely online courses. Extending these findings to more varied teaching styles and learning environments is clearly a necessary development for future work (Arbaugh & Benbunan-Fich, 2003).

Despite this limitation, we can answer our two primary research questions regarding differences in instructional technology usage and online learning styles. First, we found evidence of four distinguishable patterns in the usage of the instructional technologies from a relatively large sample of students. In essence, these differences suggest that students do have differences in online learning styles. Recognizing learning style differences is the first step for instructors seeking appropriate teaching methods (Brokaw & Merz, 2000).
While some have recommended that instructors modify their teaching styles to accommodate the wide variety of student learning preferences (De Vita, 2001; Felder, 1993), instructional technologies such as LMS offer supplemental mechanisms by which instructors may be able to address a wider variety of instructional needs. Individual instructors unwilling to adapt well-established and successful personal classroom teaching styles may find that developing new uses for instructional technologies offers a means of responding to previously overlooked learning styles. Accordingly, rather than being simply an additional static channel for course delivery, instructional technologies such as LMS represent an alternative means to increase consistency between student learning style and instructor teaching style, and thereby improve student learning.

Second, our work shows that gender differences exist in the use of instructional technology: the female students were higher overall users of the LMS, and considerably more female students than expected were in the Visually Oriented cluster. Thus, similar to research that found gender differences in the learning styles of students (Litzinger, Lee, Wise, & Felder, 2005), we have evidence of differences in online learning styles. It is important to highlight, however, that our findings should not be read simplistically to conclude that learning styles can be divided merely into gender types. Instead, what should be noticed is that female students were distributed across all of the learning style clusters. Thus, instructors should focus on differences in learning styles, not gender, when designing their course content.

In addition to answering our two research questions, we also contribute to the empirical research in this area by using a new source and type of primary data. Prior research focusing on student learning has relied almost exclusively on self-report data of student learning preferences collected through surveys. While a great deal has been learned from this type of data, server data provides complementary evidence of actual usage of instructional technologies and online learning styles over an extended period of time. This direct information can serve to overcome some of the methodological problems associated with self-reported and perceptual measures. A necessary means of obtaining this new source of data, and the final contribution of this exploratory study, is the development and application of a methodology to collect, process, analyze, and interpret Web log data. While this process has been applied in many fields, it clearly has a place in MIS education research.

7. ENDNOTES

1. The various literatures have not adopted a common terminology regarding the term instructional technology. Terms such as online technology, Web-based courses, learning management system, asynchronous learning network, and computer-mediated instruction have been used to describe many approaches to the use of instructional technologies. We use the term instructional technologies to be inclusive of all the terminologies and systems previously used.

8. REFERENCES

Alavi, M. & R. B. Gallupe (2003) "Using Information Technology in Learning: Case Studies in Business and Management Education Programs." Academy of Management Learning & Education, 2, 139-153.

Alavi, M. & D. E. Leidner (2001) "Research Commentary: Technology-Mediated Learning -- A Call for Greater Depth and Breadth of Research." Information Systems Research, 12, 1-10.
Arbaugh, J. B. (2000a) "An Exploratory Study of the Effects of Gender on Student Learning and Class Participation in an Internet-Based MBA Course." Management Learning, 31, 533-549.

Arbaugh, J. B. (2000b) "Virtual Classroom Versus Physical Classroom: An Exploratory Comparison of Class Discussion Patterns and Student Learning in an Asynchronous Internet-Based MBA Course." Journal of Management Education, 24, 207-227.

Arbaugh, J. B. (2005a) "How Much Does 'Subject Matter' Matter? A Study of Disciplinary Effects in On-Line MBA Courses." Academy of Management Learning & Education, 4, 57-73.

Arbaugh, J. B. (2005b) "Is There an Optimal Design for On-Line MBA Courses?" Academy of Management Learning & Education, 4, 135-149.

Arbaugh, J. B. & R. Benbunan-Fich (2003) "Testing the Applicability of Learning Theories to Web-Based MBA Courses." Best Paper Proceedings of the 2003 Meeting of the Academy of Management, A1-A6.

Arbaugh, J. B. & L. Stelzer (2003) "Learning and Teaching Management on the Web: What Do We Know?" In C. Wankel & R. DeFillipi (Eds.), Educating Managers with Tomorrow's Technologies (pp. 17-51). Greenwich, CT: Information Age Publishing.

Armstrong, S. J. (2000) "The Influence of Individual Cognitive Style on Performance in Management Education." Educational Psychology, 20, 323-339.

Ballenger, R. M. & D. M. Garvis (2005) "Assessment of Web-Based Instructional Technologies: A Methodology to Identify Patterns in Student Usage of Learning Management Systems." Association of Management/International Association of Management 2005 Conference Proceedings, 22(1), Norfolk, VA.

Benbunan-Fich, R. & S. R. Hiltz (2002) "Correlates of Effectiveness of Learning Networks: The Effects of Course Level, Course Type and Gender on Outcomes." Proceedings of the 35th Hawaii International Conference on System Sciences, 1-8.

Bergman, T. J. & M. M. Bergman (2003) "Application of Internet Technology to Facilitate Student Team Projects." Journal of the Academy of Business Education, 4(Fall), 85-102.

Boetcher, J. V. (2003) "Course Management Systems and Learning Principles: Getting to Know Each Other." Syllabus, 16(12), 33-36.

Brokaw, A. J. & T. E. Merz (2000) "The Effects of Student Behavior and Preferred Learning Style on Performance." Journal of the Academy of Business Education, 1(Spring), 44-53.

Carlson, T. (2005) "The Net Generation Goes to College." Chronicle of Higher Education, 52(7), 7.

De Vita, G. (2001) "Learning Styles, Culture and Inclusive Instruction in the Multicultural Classroom: A Business and Management Perspective." Innovations in Education and Teaching International, 38(2), 165-174.

Felder, R. (1993) "Reaching the Second Tier: Learning and Teaching Styles in College Science Education." Journal of College Science Teaching, 23(5), 286-290.

Felder, R. & L. K. Silverman (1988) "Learning and Teaching Styles in Engineering Education." Engineering Education, 78(7), 674-681.

Felder, R. & J. Spurlin (2005) "Applications, Reliability, and Validity of the Index of Learning Styles." International Journal of Engineering Education, 21(1), 103-112.

Grasha, A. F. (1996) "Teaching with Style: A Practical Guide to Enhancing Learning by Understanding Teaching and Learning Styles." Pittsburgh, PA: Alliance Publishers.

Hair, J. F., Jr., W. C. Black, B. J. Babin, R. E. Anderson & R. L. Tatham (2006) "Multivariate Data Analysis" (6th ed.). Upper Saddle River, NJ: Prentice-Hall, Inc.

Harley, D., J. Henke, S. Lawrence, F. McMartin, M. Maher, M. Gawlik, et al. (2003) "Costs, Culture, and Complexity: An Analysis of Technology Enhancements in a Large Lecture Course at UC Berkeley" [Electronic Version]. Center for Studies in Higher Education, University of California, Berkeley. Retrieved November 10, 2003 from http://www.repositories.cdlib.org/chse/CSHE3-03
Hiltz, S. R. & B. Wellman (1997) "Asynchronous Learning Networks as Virtual Classrooms." Communications of the ACM, 40(9), 44-49.

Kolb, A. Y. & D. A. Kolb (2005) "Learning Styles and Learning Spaces: Enhancing Experiential Learning in Higher Education." Academy of Management Learning & Education, 4(2), 193-212.

Kraft, T. A., K. M. Kakish & A. L. Steenkamp (2009) "Bridging the Digital Divide in Undergraduate Business Information Systems Education." Information Systems Education Journal, 7(4). http://isedj.org/7/4/. ISSN: 1545-679X. (A preliminary version appears in The Proceedings of ISECON 2007: §4123. ISSN: 1542-7382.)

Litzinger, T. A., S. H. Lee, J. C. Wise & R. M. Felder (2005) "A Study of the Reliability and Validity of the Felder-Soloman Index of Learning Styles." Proceedings of the 2005 ASEE Annual Conference & Exposition.

Marks, R. B., S. D. Sibley & J. B. Arbaugh (2005) "A Structural Equation Model of Predictors for Effective Online Learning." Journal of Management Education, 29(4), 531-563.

McCray, G. E. (2000) "The Hybrid Course: Merging On-Line Instruction and the Traditional Classroom." Information Technology and Management, 1(4), 307.

Morgan, G. (2003) "Key Findings: Faculty Use of Course Management Systems" [Electronic Version]. Educause Center for Applied Research. Retrieved November 10, 2003 from http://www.educause.edu/ir/library/pdf/ecar_so/ers/ers0302/ekf0302.pdf

Morrison, M., A. Sweeney & T. Heffernan (2003) "Learning Styles of On-Campus and Off-Campus Marketing Students: The Challenge for Marketing Educators." Journal of Marketing Education, 25(3), 208-217.

Nachmias, R. & L. Segev (2003) "Students' Use of Content in Web-Supported Academic Courses." Internet and Higher Education, 6, 145-157.

Newlin, M., M. Lavooy & A. Wang (2005) "An Experimental Comparison of Conventional and Web-Based Instructional Formats." North American Journal of Psychology, 7(2), 327-336.

Peled, A. & D. Rashty (1999) "Logging for Success: Advancing the Use of WWW Logs to Improve Computer Mediated Distance Learning." Journal of Educational Computing Research, 21(4), 413-431.

Phipps, R. & J. Merisotis (1999) "What's the Difference? A Review of Contemporary Research on the Effectiveness of Distance Learning in Higher Education" [Electronic Version]. The Institute for Higher Education Policy. Retrieved November 10, 2003 from http://www.ihep.com/Pubs/PDF/Difference.pdf

Twigg, C. A. (2001) "Innovations in Online Learning: Moving Beyond No Significant Difference" [Electronic Version]. The Pew Learning and Technology Program. Retrieved November 10, 2003 from http://www.center.rpi.edu/PewSym/Mono4.pdf

Wang, A. Y. & M. Newlin (2002) "Predictors of Performance in the Virtual Classroom." Technological Horizons in Education Journal, 29, 10.

Woods, R., D. Badzinski & J. Baker (2007) "Student Perceptions of Blended Learning in a Traditional Undergraduate Environment." In A. G. Picciano & C. D. Dziuban (Eds.), Blended Learning: Research Perspectives. Sloan-C, United States of America.

Young, M. R., B. R. Klemz & J. W. Murphy (2003) "Enhancing Learning Outcomes: The Effects of Instructional Technology, Learning Styles, Instructional Methods, and Student Behavior." Journal of Marketing Education, 25(2), 130-142.
Zwyno, M. S. (2003) "Student Learning Styles, Web Use Patterns, and Attitudes Toward Hypermedia-Enhanced Instruction." Proceedings of the 33rd ASEE/IEEE Frontiers in Education Conference.

Zwyno, M. S. & J. K. Waalen (2002) "The Effect of Individual Learning Styles on Student Outcomes in Technology-Enabled Education." Global Journal of Engineering Education, 6(1), 35-44.

Appendices

Table 1. Description of Learning Management System Content

Pedagogical Content

Syllabus: A Web page containing lecture meeting location and times, where to purchase the access key for the online textbook, course description and objectives, major topics covered, course requirements, due dates for major assignments, grading scale, method of evaluating student performance, course policies, and links to various course content.

Assignments: This Web page lists the homework or major assignment due for each class period. The date of the class, the topic to be covered, and the assignment due for that class are provided in a chronological table. The assignment usually consists of a set of hyperlinks to various pedagogical resources contained within the LMS. Most of the resources on the LMS must be accessed through the Assignments page.

Student Grades: Students may access their grades on individual assignments and their overall average for the course using this dynamically generated Web page. The page is updated as assignments are returned to students. Students may view only their individual grades and the class averages.

Textbook Online: An online version of the textbook provided by the publisher and hosted on the LMS. Because it is online and a subset of the textbook, it is considerably less expansive than the full paper version. Students purchase an access key at the university bookstore in order to gain access to the textbook on the LMS. Students may access individual chapters or sections of the textbook through the Assignments page on the LMS.

Topical Articles: A variety of topic-specific online articles from business newspapers, periodicals, and academic journals hosted on the LMS as Adobe Acrobat files.

Real-World Scenarios: Online mini-case studies, usually 2 to 2.5 pages in length, which present information technology issues being evaluated by real organizations. The real-world scenarios are an integral part of classroom discussion at the conclusion of each major IT topic. Four of these mini-case studies are included in the LMS.

Case Guidelines: A set of pages that provide guidelines on how to analyze a case study and prepare a written document of the analysis and the subsequent recommendations.

Case Studies: Three online case studies are hosted on the LMS. Students must prepare a written analysis and a set of recommendations for each case, as well as be prepared to actively discuss the case during class the day the case is due.

CyberShows: Online multimedia mini-lectures, 10 to 15 minutes in duration, developed by the instructor to cover various IT topics before the students attend class. Five CyberShows are hosted on the LMS.

Software Development Projects: Web pages containing the business scenario along with the functional and deliverable requirements for two Microsoft Access application development projects.
Software Tutorials: A set of Web pages containing hyperlinks to ElementK's online interactive multimedia software tutorials, as well as the actual ElementK tutorials. The software tutorials cover the material necessary to complete the software development projects and some homework assignments.

Online Presentations: PowerPoint presentations used during class. The presentations are made available for students to download after they are presented in class.

Final Case Study: The course "final" is an in-depth comprehensive case study. The case narrative, tables, and figures, along with the preparation requirements and guidelines, are hosted on the LMS. The final case study is due in the middle of final exams.

Logistical Content

Team Management: This page contains a narrative describing why teams are used for the software development projects, the process used to form teams, and general team management information.

Team Registration: Students in the class use this dynamic Web page hosted on the LMS to form and register their teams.

Change Password: This page allows students to change their default password to a password of their choosing.

Instructor Contact Information: A Web page listing the instructor's office hours, office location, email address, and phone number for the semester.

Access Code: A dynamic Web page where students enter the access code they purchased at the bookstore to gain access to the online textbook. When a valid access code is entered, it is linked to the student's username in a database, so students only need to enter the access code once during the semester.

Navigational Content

Home Page: A splash page that serves as a visual introduction and portal to the LMS. It is also a frame within a frameset that defaults to the home, menu, and header pages.

Menu Page: The menu page is the primary navigational page on the LMS. This page contains hyperlinks to the main content areas of the site and is always visible to the student.

Header Page: The header page appears at the top of the frameset and contains graphics and text identifying the course LMS. This page is also always visible to the student.

Table 2. LMS Pedagogical Content Variables and Descriptive Statistics

Variable                    Hits      Mean    Median    StDev.   Minimum   Maximum
Procedural
  Syllabus                 1,027     11.80     10.00     7.994        1        43
  Assignments              6,835     78.56     72.00    38.060        5       201
  Case Guidelines          1,954     22.46     21.00    12.170        5        80
  Case Studies               845      9.71      9.00     4.557        2        26
  Software Projects        3,553     40.84     35.00    22.131        3       123
  Final Case Study           419      4.82      4.00     3.529        1        23
Multimedia/Interactive
  CyberShows               6,361     73.11     53.00    64.022        0       322
  Software Tutorials (1)   3,227     37.09     53.00    26.549        2       127
Reading
  Textbook Online         14,864    170.85    145.00   116.456        2       613
  Topical Articles         5,687     65.37     60.00    41.976        0       216
  Real-World Scenarios       568      6.53      5.00     4.976        0        30
  Online Presentations       156      1.79       .00     3.159        0        17
Performance Outcomes
  Student Grades           3,889     44.70     26.00    54.403        2       420
Total Hits
  Total Content Hits      49,499    568.95    523.00   274.264       52     1,569

(1) Total software tutorial hits (1,018 on the LMS server and 2,209 on the ElementK server).
Table 3. Summary of Student Performance Outcomes

Performance Measure        Mean    Median    StDev.   Minimum   Maximum
Case Studies              83.26     84.33     6.224     69.00     93.00
Software Projects         86.63     87.00     7.711     63.00     97.50
Final Case Study          82.95     85.00     9.752     37.00     94.34
Course Contribution       86.74     87.00     5.856     67.00     96.00
Course Average            84.39     84.78     5.439     69.77     92.93

Table 4. Clustering Variable Profiles for the Online Learning Style Clusters

Cluster 1 - Minimalist (n = 34)
Content                    Mean    Median    Z-Score
Syllabus                   8.35      8.00     -0.432
Assignments               53.12     47.50     -0.669
Case Guidelines           16.85     15.00     -0.461
Case Studies               8.56      8.00     -0.253
Software Projects         33.59     26.50     -0.328
Final Case Study           4.18      3.50     -0.181
CyberShows                50.29     37.00     -0.356
Software Tutorials        24.06     24.00     -0.491
Textbook Online           74.56     81.50     -0.827
Topical Articles          37.26     36.00     -0.670
Real-World Scenarios       3.68      3.00     -0.573
Online Presentations       1.35       .00     -0.139
Student Grades            28.44     20.00     -0.299

Cluster 2 - Verbally Oriented (n = 26)
Content                    Mean    Median    Z-Score
Syllabus                  12.23     11.00      0.053
Assignments               88.50     89.00      0.261
Case Guidelines           24.08     24.00      0.133
Case Studies              10.31     10.50      0.131
Software Projects         38.65     31.00     -0.099
Final Case Study           4.38      4.00     -0.122
CyberShows                37.69     36.00     -0.553
Software Tutorials        37.73     33.00      0.024
Textbook Online          212.35    208.50      0.356
Topical Articles          72.35     69.50      0.166
Real-World Scenarios       8.04      7.50      0.303
Online Presentations       1.88       .50      0.029
Student Grades            28.15     22.50     -0.304

Cluster 3 - Visually Oriented (n = 17)
Content                    Mean    Median    Z-Score
Syllabus                  14.47     11.00      0.334
Assignments               94.24     84.00      0.412
Case Guidelines           24.88     22.00      0.199
Case Studies               8.59      8.00     -0.247
Software Projects         49.18     47.00      0.377
Final Case Study           4.88      4.00      0.019
CyberShows               147.41    133.00      1.160
Software Tutorials        51.35     46.00      0.537
Textbook Online          161.76    177.00     -0.078
Topical Articles          74.12     71.00      0.208
Real-World Scenarios       7.12      7.00      0.118
Online Presentations       2.53      1.00      0.233
Student Grades            66.35     58.00      0.398

Cluster 4 - Enthusiast (n = 10)
Content                    Mean    Median    Z-Score
Syllabus                  17.90     14.50      0.763
Assignments              112.60    104.00      0.894
Case Guidelines           33.20     25.50      0.883
Case Studies              14.00     14.00      0.941
Software Projects         57.00     57.00      0.730
Final Case Study           8.00      7.50      0.902
CyberShows               116.50    110.00      0.678
Software Tutorials        55.50     55.00      0.693
Textbook Online          405.80    385.50      2.018
Topical Articles         127.90    124.00      1.490
Real-World Scenarios      11.30      9.50      0.959
Online Presentations       1.80      1.50      0.002
Student Grades           106.20     71.00      1.130

Table 5. Student Performance Outcomes by Cluster

Cluster 1 - Minimalist (n = 34)
Outcome                    Mean    Median    StDev.
Case Studies              82.12     82.68     5.637
Software Projects         85.50     87.00     7.311
Final Case Study          80.78     83.65    11.885
Course Contribution       84.94     84.00     6.466
Course Average            82.92     82.02     5.127

Cluster 2 - Verbally Oriented (n = 26)
Outcome                    Mean    Median    StDev.
Case Studies              82.03     83.68     6.473
Software Projects         84.79     85.75     8.771
Final Case Study          82.64     83.87     7.727
Course Contribution       86.27     86.00     5.807
Course Average            83.23     84.04     5.582

Cluster 3 - Visually Oriented (n = 17)
Outcome                    Mean    Median    StDev.
Case Studies              84.04     85.33     6.575
Software Projects         88.00     87.00     6.643
Final Case Study          85.51     87.00     5.865
Course Contribution       88.76     89.00     3.930
Course Average            86.03     86.55     4.681

Cluster 4 - Enthusiast (n = 10)
Outcome                    Mean    Median    StDev.
Case Studies              88.97     89.17     3.710
Software Projects         92.95     94.25     4.180
Final Case Study          86.72     90.00    10.844
Course Contribution       90.60     90.50     4.006
Course Average            89.62     90.64     3.639
Table 6. LMS Log File Hits - Pedagogical Content by Gender

                           Female (n = 33)               Male (n = 54)
Content                    Mean    Median    StDev.      Mean    Median    StDev.
Syllabus                  12.91     10.00     9.183     11.13      9.00     7.180
Assignments               86.30     77.00    36.918     73.83     68.00    38.308
Case Guidelines           20.70     21.00     9.809     23.54     20.50    13.384
Case Studies               9.06      8.00     4.183     10.11     10.00     4.765
Software Projects         41.48     35.00    21.782     40.44     35.50    22.535
Final Case Study           4.64      5.00     2.434      4.93      4.00     4.074
CyberShows                90.94     97.00    70.869     62.22     47.00    57.444
Software Tutorials        42.85     42.00    24.792     33.57     95.80    79.500
Textbook Online          156.30    145.00    75.971    179.74    142.00   135.311
Topical Articles          68.94     68.00    29.277     63.19     51.00    48.257
Real-World Scenarios       7.52      7.00     3.768      5.93      5.00     5.535
Online Presentations       2.09      1.00     3.574      1.61       .00     2.897
Student Grades            52.27     29.00    73.775     40.07     24.50    38.195
Total Hits               597.67    572.00   256.734    551.41    478.00   285.365

Table 7. Cluster Demographic Distribution

                            Female (n = 33)    Male (n = 54)
Cluster 1 - Minimalist
  Count                           10                 24
  % Within Cluster                29.4               70.6
  % Within Column                 30.3               44.4
  Expected Count                  12.9               21.1
Cluster 2 - Verbally Oriented
  Count                           11                 15
  % Within Cluster                42.3               57.7
  % Within Column                 33.3               27.8
  Expected Count                   9.9               16.1
Cluster 3 - Visually Oriented
  Count                           11                  6
  % Within Cluster                64.7               35.3
  % Within Column                 33.3               11.1
  Expected Count                   6.4               10.6
Cluster 4 - Enthusiast
  Count                            1                  9
  % Within Cluster                10.0               90.0
  % Within Column                  3.0               16.7
  Expected Count                   3.8                6.2

Figure 1. Graphical Representation of the Cluster Variable Profiles Used to Form Online Learning Style Profiles
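The counts, within-cluster and within-column percentages, and expected counts reported in Table 7 are the standard components of a two-way contingency table. The short sketch below shows how they could be computed with pandas and scipy, assuming a hypothetical data frame with one row per student and 'cluster' and 'gender' columns; the paper itself reports only these descriptive figures, so the chi-square routine is used here solely as a convenient way to obtain the expected counts.

    import pandas as pd
    from scipy.stats import chi2_contingency

    def cluster_gender_table(students: pd.DataFrame):
        """Rebuild the structure of Table 7 from assumed 'cluster' and 'gender'
        columns in a frame with one row per student."""
        observed = pd.crosstab(students["cluster"], students["gender"])          # Count
        pct_within_cluster = observed.div(observed.sum(axis=1), axis=0) * 100    # % Within Cluster
        pct_within_column = observed.div(observed.sum(axis=0), axis=1) * 100     # % Within Column
        # chi2_contingency is called only for its expected-count by-product.
        _, _, _, expected = chi2_contingency(observed)
        expected = pd.DataFrame(expected, index=observed.index, columns=observed.columns)
        return observed, pct_within_cluster, pct_within_column, expected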