How Effective is Student-Centric Edutainment in Large Introductory IS Survey Courses?

Sharen Bakke
Department of Computer and Information Sciences
Cleveland State University
Cleveland, OH 44115 USA
missives@sharenbakke.com

Robert H. Faley (rfaley@kent.edu)
Geoff Steinberg (gsteinb@kent.edu)
Management and Information Systems Department
Kent State University
Kent, OH 44240 USA

Abstract

IS teachers who teach large introductory courses face two primary challenges: a curriculum that is perceived as dull and impersonal, and students who require constant stimulation to remain engaged with the course materials. This paper examines the effectiveness of a student-centric approach to learning embedded within an edutainment-oriented environment. This curriculum engaged students by providing a game-show atmosphere and other interactive learning exercises that facilitate active learning during class time. Students were kept engaged outside the classroom by completing mandatory exercises and homework assignments. These online activities gave students control over when they learned and offered immediate feedback to check their understanding of the concepts. A regression discontinuity analysis of mean test scores across thirteen semesters demonstrates a significant increase in student performance after the implementation of the new curriculum.

Keywords: IS curriculum design, edutainment, active learning, regression discontinuity

1. INTRODUCTION

In most undergraduate business programs, students are first exposed to information systems (IS) through an introductory survey course. The typical curriculum covers a vast array of subjects, including data and information characteristics, operating systems, information system development, database theory and design, hardware and software concepts, the use of information systems in decision-making, information systems in e-commerce, and Internet security.
While the objectives of this approach are laudable, they are often not realized: introductory classes are usually too large, focus too broadly on most topics, enroll students with a wide variety of backgrounds and interests, and are often perceived as impersonal. Students often enroll in these courses primarily to satisfy graduation requirements rather than out of inherent interest in the subject matter. Hence, neither students nor professors are satisfied with the learning experience and, more importantly, business school graduates do not master fundamental IS skills before entering the professional world.

We evaluate the effect on performance of a student-centric approach to large introductory IS survey courses (Bakke, Faley et al., 2007). The curriculum for this course is delivered and administered primarily online; students regulate the pace and direction of the learning process. This student-controlled learning process rests on a sound pedagogical foundation that includes active learning techniques, ample practice opportunities, formative assessments that provide a constant supply of course-related feedback, and mandatory, sequential learning exercises that structure the overall learning process. Students can earn extra-credit tokens that are redeemable for “gifts” (e.g., homework and quiz extensions or the deletion of a low grade) listed in the course’s online gift catalog. The overall intent of this student-centric, IS-based curriculum is to maintain student interest and engagement in order to increase course-related knowledge.

The contribution of this paper is twofold. First, we describe a novel, IT-intensive learning environment that significantly increases student performance. Second, we use the regression-discontinuity design to evaluate the effect of this learning environment on performance. This overlooked technique is especially valuable for evaluating the effect of interventions such as the one we describe.
This paper has three parts. The first part examines and synthesizes the pedagogical literature on curriculum design and discusses how the IS curriculum described above meets important design-related considerations. The second part describes the methodology used in the empirical study of the impact of this IS curriculum on performance. The third part provides an analysis of the results and discusses the implications of the study.

2. THE STUDENT-CENTRIC APPROACH – A DESCRIPTION

The IS-based curriculum described above employs active-learning techniques and structured sequential learning to facilitate student performance. Students control their learning process by choosing when they want to complete a series of prerequisite exercises. Feedback about their performance is automatically generated during and/or after completing an assignment. Students use this feedback to better understand their strengths and weaknesses, and the professor uses it to tailor lectures to address any systemic learning problems.

Giving students considerable control over when and how they learn can increase their motivation to complete course-related activities (Malone, 1980; Lepper and Malone, 1987; Malone and Lepper, 1987; Csikszentmihalyi, 2000; Liao and Tai, 2006). Students are more motivated to complete homework and practice exercises when given the opportunity to regulate their learning process. Online technology facilitates this process by allowing students to access course-related materials at the times and places of their choosing (Bostow, Kritch et al., 1995).

During class time the professor creates an enjoyable classroom experience by setting the mood and delivering edutainment – a form of entertainment that is designed to be educational. Students are motivated to attend classes and learn the material because the classes are engaging and because they have an opportunity to earn tokens (i.e., extra credits) that can be used to purchase gifts from an online gift catalog.
A detailed description of the classroom environment, the online gift catalog, outside-of-class activities and other grading criteria, as well as the evolution of the curriculum, follows.

The Classroom Environment

At the start of each semester, approximately 400 – 450 students enrolled in an introductory IS course enter an auditorium-style lecture hall. To their surprise, the students are greeted with soft music and a slide show of artwork projected on the main viewing screen. The dimmed lights and soft music have an obvious calming effect. Before the class officially starts, the professor engages nearby students in conversation, asking topical questions that often lead to stimulating dialogue. Participation is optional and clearly enjoyable. The primary goal of these interactions is to establish a trusting, safe and supportive environment that facilitates learning.

Course content is delivered in a lecture format; the professor uses PowerPoint slides to guide the discussion. During the lecture portion of the course, game-show-like activities, accompanied by specific theme music and colorful graphics, randomly appear on the projection screen. Student-contestants for the game shows are randomly selected based on their seat assignment. Contestants are coaxed with a “Come on Down” cry from the professor and plenty of applause from the audience. Depending on the game show, students may compete alone or as part of a team. Students who correctly answer the question(s) are rewarded with extra credits in the form of tokens that can be used to purchase gifts from an online gift catalog.

As can be seen, the environment of the course has been specifically designed to maximize student participation and satisfaction in order to facilitate student performance (Fulton, 1991). The use of educational games (the backbone of “edutainment”) helps both to engage the students and to maximize the amount of active learning that takes place.
These games must be enjoyable but challenging (Lawson, 1995), intrinsically motivating (Watson, Kessler et al., 1996), and able to help students increase their confidence with the class material (Townsend, Moore et al., 1998). They must also improve students’ higher-order thinking and reasoning skills (Hogle, 1996).

The Online Gift Catalog

Students redeem their tokens through Orion, an online classroom management tool. Each token is worth a specified number of points and can be used only once. Students log into Orion, enter their token number, and select their gift. Orion tracks which extra-credit points have been redeemed and by whom, and distributes the selected gifts. Available gifts include due-date extensions for quizzes or homework, permission to retake a quiz, the erasure of a low grade, or one additional submission of a homework assignment. The professor manages the types of awards available, the number of points required for each award, and the number of awards offered each week.

Outside of Class Activities

Students are required to complete twelve regular homework assignments and five objective quizzes through Orion. The homework assignments are application-oriented and cover topics such as working with an Excel spreadsheet (shown in Figure 1), creating a simple web page, and querying information from a database. Orion grades these assignments automatically and provides sufficient hints for students to correct their mistakes and resubmit their answers. Students may complete each assignment up to three times before the assignment’s due date. Their recorded grade is the grade they earn on their final attempt.
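The resubmission policy just described – up to three attempts, with the final attempt as the grade of record – can be sketched in a few lines. This is a hypothetical illustration only; the function and variable names below are invented and are not part of Orion.

```python
# Hypothetical sketch of the resubmission policy described above: up to
# three attempts per assignment, and the recorded grade is the grade
# earned on the final attempt (names invented for illustration).

MAX_ATTEMPTS = 3

def record_submission(attempts, new_score):
    """Record one graded attempt; the last entry is the grade of record."""
    if len(attempts) >= MAX_ATTEMPTS:
        raise ValueError("attempt limit reached")
    attempts.append(new_score)
    return attempts[-1]

attempts = []
record_submission(attempts, 60)
record_submission(attempts, 85)
final = record_submission(attempts, 78)
print(final)  # 78 -- the final attempt counts, even if an earlier score was higher
```

Note that under this policy the final attempt counts even when an earlier attempt scored higher, which encourages students to resubmit only after correcting their mistakes.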
Online technology is an extremely effective way to deliver the formative assessments students need to assess their level of understanding of the course material (Brown and Knight, 1994; Wiliam and Black, 1996; Seale, Chapman et al., 2000), especially relative to the course norms set by the professor (Smith and Ragan, 1993; Wiliam, 2006). As noted by Buchanan (2000), a web-based formative assessment strategy increases student interest in learning and improves student test scores. It is no surprise that timely and accurate feedback is extremely important to the learning process (Trabasso, 1987; Kritch and Bostow, 1998).

As noted above, many of the learning activities related to the class are sequentially structured. For example, prior to attempting unit quizzes students must successfully complete a series of required exercises. These exercises can be attempted as often as necessary to attain the mandatory perfect score. Students can track their progress by checking the Prerequisite Readiness chart in Orion (shown in Figure 2). While completing these exercises, students receive feedback through Orion’s “On Screen Tips” feature. All of this allows students to learn through trial and error without fear of penalty. In this non-threatening environment, more difficult concepts can be introduced into the curriculum (e.g., SQL statement creation and Excel macro generation). The structured-sequential nature of this part of the learning process, in effect, forces students to learn the material before they attempt the important unit quizzes (Buchanan, 2000; Wiliam, 2006).

Five unit quizzes that account for 60 percent of the course grade are administered online in a proctored computer lab. Students can take each quiz any time during the specified “quiz period” (one week – five weekdays only). Each quiz is made up of multiple-choice, true/false, and fill-in-the-blank questions.
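The prerequisite gating rule described above – every required exercise must reach the mandatory perfect score before a unit quiz unlocks – can be sketched as follows. The function signature and exercise names are hypothetical and do not reflect Orion's actual implementation.

```python
# Hypothetical sketch of the prerequisite gating rule: a unit quiz
# unlocks only after every required exercise reaches the mandatory
# perfect score (function and exercise names are invented).

def quiz_unlocked(exercise_scores, required_exercises, perfect=100):
    """True only if every required exercise has the perfect score."""
    return all(exercise_scores.get(ex) == perfect for ex in required_exercises)

scores = {"sql_basics": 100, "excel_macros": 100, "html_intro": 80}
print(quiz_unlocked(scores, ["sql_basics", "excel_macros"]))                # True
print(quiz_unlocked(scores, ["sql_basics", "excel_macros", "html_intro"]))  # False
```

Because exercises can be retried without penalty, the gate never punishes failure; it simply withholds the quiz until mastery is demonstrated.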
Students can use their notes as well as the software applications to determine the correct answers. Each quiz can be taken only once.

Evolution of the Curriculum

The curriculum for the introductory IS course has been a work in progress since January 2001. At that time there had been an ongoing decline in mean test scores, the dropout rate was increasing, and students collaborated freely on what were supposed to be individual homework assignments. This last problem was solved by electronically tagging homework assignments with student ID numbers to prevent duplicate submissions. A major change in the curriculum occurred during the Spring 2004 semester, when edutainment and sequential learning exercises were rolled out. In Figure 3 this is denoted as the implementation (i.e., treatment) period, which occurred during Term 7. In the spirit of continuous process improvement, refinements to the overall curriculum have continued throughout the post-implementation period (shown in Table 1).

3. THE EMPIRICAL STUDY

The primary goal of the curriculum was to improve the level of student learning while increasing the breadth and depth of the curriculum. Thus, we investigated the following research question: Did the mean test scores of students improve significantly after the implementation period?

Methodology

We analyzed the data using regression discontinuity, a technique that is especially useful when an event takes place that changes the intercept and/or slope of a regression line (Campbell and Stanley, 1963; Shadish, Cook et al., 2002). This quasi-experimental design is used to examine the effect of an intervention on important outcome variables (see Figure 4). To determine whether the intervention has had an effect, the slope (β1) and intercept (β0) of the regression line before the intervention are compared with the slope (β1 + β2) and intercept (β0 + β3) after it. An intercept change (β3) is deemed more salient because a slope change may indicate only that the regression equation is curvilinear.
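The regression discontinuity model just described can be written as score = β0 + β1(term − cutoff) + β2·z·(term − cutoff) + β3·z + ε, where z is the treatment indicator. A minimal sketch of fitting this model follows; the per-term scores below are invented illustrative numbers, not the study's actual data.

```python
import numpy as np

# Sketch of the regression discontinuity model described above, fit to
# invented per-term mean scores (NOT the study's actual data): Terms 1-6
# are pre-treatment, Terms 8-13 post-treatment, Term 7 is the cutoff.
terms = np.array([1, 2, 3, 4, 5, 6, 8, 9, 10, 11, 12, 13], dtype=float)
scores = np.array([61, 60, 59, 60, 58, 59, 66, 67, 66, 68, 69, 67], dtype=float)

cutoff = 7.0
z = (terms > cutoff).astype(float)   # treatment indicator (0 = pre, 1 = post)
x = terms - cutoff                   # assignment variable, centered at the cutoff

# Model: score = b0 + b1*x + b2*(z*x) + b3*z + error.
# b3 is the intercept discontinuity -- the estimated treatment effect.
X = np.column_stack([np.ones_like(x), x, z * x, z])
b0, b1, b2, b3 = np.linalg.lstsq(X, scores, rcond=None)[0]
print(f"estimated treatment effect (b3): {b3:.2f}")
```

In the paper's analysis, the corresponding β3 estimate was 6.959, the post-implementation jump in mean test scores reported in Table 3.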
If no significant difference exists in either the intercept or the slope of the regression line after the treatment implementation period, the treatment has not had the expected effect. In this two-group pretest-posttest design, participants are assigned to a particular group based on a predetermined and known cutoff score on an assignment variable. This procedure controls for some of the random differences between the pre- and post-treatment groups and ensures that the differences measured can be attributed to the treatment rather than to random fluctuations. Assuming that assignment to the treatment group is based on this cutoff score and that both the treatment indicator and assignment variables are included as covariates in the hypothesized regression model, an unbiased estimate of the treatment effect can be obtained (Rubin, 1977; Cook and Campbell, 1979; van der Klaauw, 2002).

The treatment was implemented in Spring 2004, which is denoted as Term 7. Hence, the predetermined cutoff value for the regression discontinuity design is Term 7. Mean test scores and frequencies obtained from Terms 1 – 6 are in the pretest group; mean test scores and frequencies obtained from Terms 8 – 13 are in the posttest group. The mean test scores and frequencies for Term 7, the implementation term, were not used in the analysis.

Sample

The sample used in this study consists of 6,448 undergraduate business students enrolled in an introductory IS course between 2001 and 2007. The course was required for all incoming freshmen in the College of Business of a medium-sized public university in the Midwest. The number of students enrolled in each semester and the mean test scores for each term are shown in Table 2.

Dependent Variable

The research question was examined using mean student test scores. The mean test scores were an average of the quiz scores, homework assignments, and extra-credit points obtained throughout the term for students who earned 40 or more points.
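The construction of this dependent variable – a per-term mean computed only over students earning at least 40 points – can be sketched as follows, with invented point totals rather than the study's data.

```python
# Hypothetical sketch of the dependent-variable construction: the mean
# test score for a term is computed only over students who earned 40 or
# more points (illustrative point totals, not the study's data).

MIN_POINTS = 40

def term_mean(student_points):
    """Per-term mean score, excluding students below the 40-point cutoff."""
    kept = [p for p in student_points if p >= MIN_POINTS]
    return sum(kept) / len(kept) if kept else None

print(term_mean([12, 55, 80, 35, 90]))  # 75.0 -- only 55, 80 and 90 are averaged
```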
We set the cutoff at 40 points because a score of less than 40 points indicated that the student completed only a very small portion of the course. These students most likely either formally dropped the course or simply stopped completing the required assignments. A random sampling of students with scores below 40 points confirmed this outcome.

Results

A regression discontinuity equation was used to analyze the data. The parameter of interest is β3, the coefficient on the treatment indicator variable, which captures a discontinuity in the regression line intercept. Evidence of a program effect is found when the treatment effect parameter, β3, is significantly different from 0. Consequently, the research hypothesis, H1, can be stated as follows: H1: β3 ≠ 0. The results in Table 3 indicate a significant difference between the pre- and post-implementation intercepts. The β3 value of 6.959 indicates that the mean score is 6.959 points higher after the treatment. Therefore, H1 is supported.

Discussion

IS teachers who teach large introductory courses face two primary challenges: a curriculum that is perceived as dull and impersonal, and students who require constant stimulation to remain engaged with the course materials. A student-centric approach to learning embedded within an edutainment-oriented environment was designed and implemented to meet these challenges. This curriculum engaged students by providing a game-show atmosphere and other interactive learning exercises that facilitated active learning during class time. Winners of the game shows received tokens that could be used to buy homework or quiz extensions or a second opportunity to take a quiz. Students were kept engaged outside the classroom by completing mandatory exercises and homework assignments. These online activities gave students control over when they learned and offered immediate feedback to check their understanding of the concepts.
Students could submit their homework up to three times, so there were ample opportunities to earn as many points as possible. Before taking the unit quizzes, students had to prove their mastery of the concepts by scoring 100% on a series of prerequisite exercises. Students responded well to this curriculum; their comments ranged from “the program lets me do things when I want to – I like that”, “I like the feedback the program gives me – it helps me get my work done”, and “the program keeps me focused and requires me to do the work” to “the pop extra credit is nice – I used the points several times to buy gifts from the catalog”.

The regression discontinuity analysis demonstrated that there was a significant increase in student performance after the implementation period. This increase is most likely due to the use of mandatory prerequisite exercises that forced the students to master the material before they took the important quizzes. At the conclusion of the current course, students are comfortable operating the standard Microsoft Office suite applications (Word, Access, PowerPoint, and Excel) and have learned the basic concepts of more advanced topics such as SQL (Structured Query Language), HTML (Hypertext Markup Language), and, most impressively, XML (Extensible Markup Language). This study demonstrates that the use of structured sequential learning makes it possible to increase the difficulty level of the material covered in large introductory IS classes. In fact, the highest mean test performance score was recorded during semester 12, when some of the most difficult material (e.g., XML) was added to the curriculum.

In sum, including student-centric edutainment in the classroom and providing structured sequential learning outside of class increases mean test scores. Professors wishing to make their introductory IS survey courses more appealing should consider including these didactic techniques.
REFERENCES

Bakke, S., R. H. Faley and G. Steinberg (2007) "A student-centric approach to large introductory survey courses." Journal of Information Systems Education, forthcoming.

Bostow, D. E., L. M. Kritch and B. F. Tomkins (1995) "Computers and pedagogy: Replacing telling with interactive computer-programmed instruction." Behavior Research Methods, Instruments & Computers, 27, pp. 297-300.

Brown, S. A. and P. Knight (1994) Assessing Learners in Higher Education. London: Kogan Page.

Buchanan, T. (2000) "The efficacy of a World-Wide Web mediated formative assessment." Journal of Computer Assisted Learning, 16, pp. 193-200.

Campbell, D. T. and J. C. Stanley (1963) Experimental and Quasi-Experimental Designs for Research. Dallas: Houghton Mifflin Company.

Cook, T. D. and D. T. Campbell (1979) Quasi-Experimentation: Design and Analysis Issues for Field Settings. Chicago, IL: Rand McNally.

Csikszentmihalyi, M. (2000) Beyond Boredom and Anxiety, 25th Anniversary Edition. San Francisco: Jossey-Bass Publishers.

Hogle, J. G. (1996) "Considering games as cognitive tools: In search of effective 'edutainment.'" Educational Resource Information Center, ED 425 737, pp. 1-28.

Kritch, K. M. and D. E. Bostow (1998) "Degree of constructed-response interaction in computer-based programmed instruction." Journal of Applied Behavior Analysis, 31, pp. 387-398.

Lawson, T. J. (1995) "Active-learning exercises for consumer behavior courses." Teaching of Psychology, 25, pp. 200-202.

Lepper, M. and T. W. Malone (1987) "Intrinsic motivation and instructional effectiveness in computer-based education." In Aptitude, Learning, and Instruction. Hillsdale, NJ: Erlbaum.

Liao, W. and W. Tai (2006) "Organizational justice, motivation to learn, and training outcomes." Social Behavior and Personality, 34(5), pp. 545-556.

Malone, T. W. (1980) "What makes things fun to learn? A study of intrinsically motivating computer games." Unpublished doctoral dissertation, Stanford University.

Malone, T. W. and M. Lepper (1987) "Making learning fun: A taxonomy of intrinsic motivations for learning." In Aptitude, Learning, and Instruction (Vol. 3, pp. 223-253). Hillsdale, NJ: Erlbaum.

Rubin, D. (1977) "Assignment to treatment group on the basis of a covariate." Journal of Educational Statistics, 2(1), pp. 1-26.

Seale, J., J. Chapman and C. Davey (2000) "The influence of assessments on students' motivation to learn in a therapy degree course." Medical Education, 34, pp. 614-621.

Shadish, W. R., T. D. Cook and D. T. Campbell (2002) Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston/New York: Houghton Mifflin Company.

Smith, P. L. and T. J. Ragan (1993) "Designing instructional feedback for different learning outcomes." In Interactive Instruction and Feedback. Englewood Cliffs, NJ: Educational Technology Publications.

Townsend, M. A. R., D. W. Moore, B. F. Tuck and K. M. Wilton (1998) "Self-concept and anxiety in university students studying social science statistics within a co-operative learning structure." Educational Psychology, 18, pp. 41-54.

Trabasso, T. (1987) "Discussion." In Aptitude, Learning, and Instruction. Hillsdale, NJ: Erlbaum.

van der Klaauw, W. (2002) "Estimating the effect of financial aid offers on college enrollment: A regression-discontinuity approach." International Economic Review, 43(4), pp. 1249-1287.

Watson, D. L., D. A. Kessler, S. Kalla, C. M. Kam and K. Ueki (1996) "Active learning exercises are more motivating than quizzes for underachieving college students." Psychological Reports, 78, pp. 131-134.

Wiliam, D. (2006) "Formative assessment: Getting the focus right." Educational Assessment, 11(3-4), pp. 283-289.

Wiliam, D. and P. Black (1996) "Meanings and consequences: A basis for distinguishing formative and summative functions of assessment." British Educational Research Journal, 22, pp. 537-548.