Student Interpretations of Disasters: Do Online Students Respond Differently Than Face-to-Face Students? – Evidence From the Classroom
Challenge as Opportunity: The Academy in the Best and Worst of Times
A National Symposium
November 20-21, 2009
Clark Atlanta University, Morehouse College, and Spelman College
Richard Vogel, Farmingdale State College
Students’ beliefs and opinions about disaster situations, formed from their personal experiences and from recent seminal events such as the terrorist attacks of 9/11 and Hurricane Katrina, are often their starting points when they begin a disaster studies course. As they are exposed to a wide range of analyses of different disaster events throughout the semester, students’ conceptual frameworks concerning the impacts and policy implications of a disaster situation are challenged. This paper assesses student comprehension and understanding of the disaster situation based upon an analysis of student performance on final projects across three sections of a disaster studies course offered in spring and summer 2008 and spring 2009.
All three courses used the same set of course materials and course outline, with the exception of the assignments, which were adjusted to reflect different teaching environments. Two sections of this course were taught online, while the third section was offered in the traditional classroom. Online, student learning was facilitated by a series of written mini-lectures, asynchronous discussions on the various topics, a series of short written assignments and a course project. The traditional face-to-face class met once a week for two and a half hours during which time class material was delivered by a combination of lecture and facilitated discussion. Student assignments for the traditional course were submitted in hardcopy format in the classroom.
Few economics departments at U.S. colleges and universities offer formal courses in disaster studies. There is also very little research evaluating disaster studies courses in economics education or other fields. This paper adds to this limited literature by examining student outcomes and performance in an economics-based disaster studies course.
Courses in disaster studies serve a twofold purpose. First, they provide students with the opportunity to apply discipline-specific skills and knowledge to real-life natural, social, and physical phenomena. Second, they help students better understand the potential consequences of various types of disaster events and leave them better prepared for future events.
1. Online versus the Traditional Classroom
Over the last decade, a number of studies have been published evaluating the effectiveness of online and distance learning environments versus traditional classroom experiences. Anstine and Mark’s (2005) analysis highlights some of the problems associated with evaluating outcomes between traditional and online courses, suggesting that students select either learning environment in a nonrandom manner, e.g., online students tend to have greater family responsibilities and are more likely to be working full-time than traditional students. After accounting for the self-selection bias, they conclude that students do not perform as well in the online environment as they do in the traditional classroom environment. Coates and Humphries (2004), using a similar experimental design, find that students perform better in the online environment.
Terry, Lewer, et al. (2003) find that student performance in the online format is below that of students enrolled in traditional or hybrid formats. While Brown and Liedholm (2002) find similar results, they also find that students in the online learning environment tend to have stronger academic backgrounds. Harmon and Lambrinos (2007) focus on student comprehension as evidenced by student responses to individual exam questions on material covered either in an online format or in the classroom. They find that a student is more likely to answer an exam question correctly if the material had been covered online than if it had been presented in the classroom.
2. The Economics of Global Natural Disasters Course: Online and Traditional
The three courses of “Economics of Global Natural Disasters” (EGND) consisted of eight units of material organized thematically, including overviews of disaster impact, methods of regional economic analysis, disaster mitigation and planning issues, recovery from disaster, and public policy issues. Students were assigned readings from a combination of surveys and journal articles, primary readings, and government and NGO reports that were embedded as pdf files directly in the course management system.
Students in the three classes were required to respond to a minimum of four Problem/Issue Response assignments and to participate in eight asynchronous discussions in the online forums (or in-person classroom discussions). Students were also required to complete a course project. The course project consisted of (1) an overview of the particular hazard, (2) an overview of the region in question, (3) a discussion of pre-disaster mitigation/planning policies, and (4) the short-term impact of the event and any discernible long-term impacts. Parts 1 and 2 were submitted together; parts 3 and 4 were submitted separately.
The project essentially demonstrates the students’ comprehension of the course material and their ability to apply economic concepts and analytical techniques to the disaster situation. Thus, the project serves as an assessment tool for various aspects of the overall course goals across three dimensions. Students should be able to identify and explain the issues and problems associated with disaster impact on a particular region (Goal 1). They then should be able to assess regional mitigation policies and community responses to the particular disaster event (Goal 2). The students are subsequently asked to apply the tools of economic analysis to the particular disaster event and evaluate the overall short-term and long-term economic impacts of the event (Goal 3).
3. Outcomes Analysis
Student outcomes for the course are evaluated using a basic production function applied to the classroom. Assignment and course grades for individual students are hypothesized to be a function of attribute and effort variables as well as completion of course modules/course material. The basic functional form of the equation that I estimate is:
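The equation itself did not survive in this copy of the paper; reconstructed from the variable definitions that follow (the linear specification, subscripts, and coefficient labels are inferred, not taken from the original), it would read:

$$ Gr_i = \beta_0 + \beta_1\,Attr_i + \beta_2\,Eff_i + \beta_3\,CM_i + \varepsilon_i \qquad (1) $$

where $i$ indexes students and $Attr_i$, $Eff_i$, and $CM_i$ may each stand for a set of variables with corresponding coefficient vectors.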
The student’s final course grade or project grade, Gr, is determined by student attributes, Attr, which include factors such as the student’s major, level in college (sophomore, junior, senior), and gender; student effort, Eff, which includes factors such as attendance; and completion of course modules, CM, which is reflected in the student’s participation in class discussions and in whether they completed all of the required class assignments.
Final grades, project grades, and the percentage of student attendance enter the analysis as continuous variables for all students who completed the course. All other variables enter the analysis as dummy variables: gender equal to one for a female student and zero otherwise; business major equal to one and zero otherwise; completion of more than seventy percent of the module materials equal to one and zero otherwise; a discussion score greater than seventy percent (completion of course module materials) equal to one and zero otherwise; summer enrollment (Summer) equal to one and zero otherwise; and online enrollment equal to one if the course was taken online and zero otherwise.
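The dummy coding above can be sketched in code. This is a minimal illustration, not the author's actual procedure: the record field names and the `code_student` helper are hypothetical, and a score above 69 is treated as completion, following the operationalization in the text.

```python
def code_student(record):
    """Map one raw student record to the regression variables described above.

    Field names in `record` are hypothetical placeholders.
    """
    return {
        "GENDER": 1 if record["gender"] == "F" else 0,          # female = 1
        "MAJOR": 1 if record["major"] == "business" else 0,      # business major = 1
        "CMASG": 1 if record["assignment_score"] > 69 else 0,    # completed assignments
        "CMDISC": 1 if record["discussion_score"] > 69 else 0,   # completed discussions
        "SUMMER": 1 if record["term"] == "summer" else 0,        # summer enrollment
        "ONLINE": 1 if record["format"] == "online" else 0,      # online section
        # attendance as a share of total possible attendance
        "PATTEND": record["classes_attended"] / record["classes_held"],
    }

student = {"gender": "F", "major": "business", "assignment_score": 85,
           "discussion_score": 72, "term": "summer", "format": "online",
           "classes_attended": 12, "classes_held": 15}
print(code_student(student))
```

Each coded record would then supply one observation for the regression in equation 1.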
Summary data for the variables listed above are presented in Table 1. CMASG and CMDISC represent student completion of issues/problem responses and online/in-class moderated discussions of the course material, respectively. These are dummy variables equal to one if a student’s overall score is greater than 69, and zero otherwise, indicating that the student has completed most of the course material. FGRADE and PGRADE represent the student’s total final course grade and project grade, respectively. PATTEND is the student’s total attendance during the semester as a percentage of total possible attendance. GENDER is a dummy variable equal to one if female, and zero otherwise. LEVEL represents the student’s year in college, taking on a value of one for freshman, two for sophomore, three for junior, and four for senior. MAJOR is a dummy variable equal to one if the student is a business major and zero otherwise. ONLINE and SUMMER are both dummy variables that take on a value of one if the course was taken online or during the summer semester, respectively, and zero otherwise. Out of a total of 66 students taking the course across the three semesters, 25 were female, 37 were business majors, 41 took the course online, and 16 were enrolled during the summer.
While students earning a high score for their final grade are more likely to have performed well on the project, a course grade from 65 to 75 does not necessarily imply that the student did not understand the course material. It may simply reflect the student’s level of diligence and attendance during the semester. The project grade better represents the student’s understanding and comprehension of the course material and their ability to analyze, model and apply economic analysis to the disaster situation.
The results of the analysis are presented in Tables 2 and 3. Factors such as completion of written assignments and class discussions, as well as class attendance, are estimated to be positive and significant. Coefficients for gender, major, and online, though negative, are not significant. Summer enrollment is estimated to be positive and significant. Online instruction did not appear to have a significant impact on overall student outcomes, though summer students appear to have performed better overall than students during the rest of the school year. Possible explanations for this result include a combination of student self-selection (more motivated summer students) and lighter course loads during the summer.
In the project grade analysis (Table 3), the dummy variable for online is positive and significant. The coefficients on CMASG and CMDISC, the student completion-of-course-material variables, are also both positive and significant.
Student performance on each part of the assignment is evaluated by examining the arguments in the different sections of the project and coding them according to a qualitative scale related to Goals 1, 2, and 3. If a student has met 70 percent or more of one of the stated goals, that goal’s indicator is equal to one, and zero otherwise. An ordered logistic model similar to equation 1 is then estimated, of the following form:
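The equation is missing from this copy; a standard ordered logit consistent with the surrounding text (the notation here is an inferred reconstruction) would be:

$$ \Pr(Outcome_i \le j) = \Lambda\!\left(\kappa_j - \beta_1\,Attr_i - \beta_2\,Eff_i - \beta_3\,CM_i\right) \qquad (2) $$

where $\Lambda(z) = 1/(1+e^{-z})$ is the logistic CDF and the $\kappa_j$ are the estimated cut points between adjacent outcome categories (the Cut 1, Cut 2, and Cut 3 values reported in Table 5).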
‘Outcome’ represents the degree to which a student has met any or all of the three stated outcomes.
Summary data on student goal achievement are presented in Table 4. Of the total number of students in the course, 43 met Goal 1, 44 met Goal 2, and 49 met Goal 3. These data were coded in the following manner: a student meeting no goal is equal to zero; meeting Goal 1 is equal to one; meeting Goal 2 is equal to two; meeting Goal 3 is equal to three; meeting Goals 1 and 2 is equal to four; meeting Goals 1 and 3 is equal to five; meeting Goals 2 and 3 is equal to six; and meeting Goals 1, 2, and 3 is equal to seven.
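The coding scheme above maps the three binary goal outcomes to a single 0–7 value; it can be written out directly (the `code_goals` function name is illustrative, not from the paper):

```python
def code_goals(goal1, goal2, goal3):
    """Map the three boolean goal outcomes to the paper's 0-7 ordinal code."""
    mapping = {
        (False, False, False): 0,  # no goal met
        (True,  False, False): 1,  # Goal 1 only
        (False, True,  False): 2,  # Goal 2 only
        (False, False, True):  3,  # Goal 3 only
        (True,  True,  False): 4,  # Goals 1 and 2
        (True,  False, True):  5,  # Goals 1 and 3
        (False, True,  True):  6,  # Goals 2 and 3
        (True,  True,  True):  7,  # all three goals
    }
    return mapping[(goal1, goal2, goal3)]

print(code_goals(True, True, True))  # 7
```

The resulting code serves as the dependent variable in the ordered logit of Table 5.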
Regression results for the ordered logit are presented in Table 5. The estimated coefficients indicate that completion of course content (CMASG) leads to higher attainment of learning outcomes. Students who took the course online or in the summer are also more likely to satisfy one of the three learning goals. Additionally, the analysis indicates that the student’s level in college had a significant impact. The variables listed as Cut 1, Cut 2, and Cut 3 refer to the lower limits of the index values, indicating the degree to which students have met each individual goal.
Overall, the analysis indicates that students in the online course environment achieve the stated learning outcomes at a rate as high as (if not higher than) students in the traditional classroom, and that all three goals are equally achievable. The analysis also implies that as students apply themselves to participating in and completing the course material, they become able to apply some level of economic analysis to disasters, regardless of whatever initial biases and viewpoints they may have held.
Students coming into a disaster studies course, as with any other subject, bring with them a number of preconceptions and ideas based on previous courses that they may have taken, as well as beliefs that they have formed from their own personal experiences. Anecdotal evidence from my own classes suggests that students’ initial arguments regarding issues related to disasters are often rooted in their own personal experiences and are very often related to popular news coverage of recent local or seminal disaster events. Once they begin to read and work through the course materials, their evaluations and analyses of disaster events and policies should expand.
The analysis demonstrates that students are able to effectively evaluate and interpret the actual and potential consequences of disaster events employing the tools of economic analysis. By focusing the analysis on the course project, which is the same for both the online and traditional versions of the class, the analysis reduces some of the issues associated with potential student self-selection bias. Regardless of the learning environment, approximately 70 percent of students met the stated learning goals of both the course and the course project. The analysis indicates that students in the online environment performed better in achieving the learning goals than students in the traditional classroom. Summer students also appeared to perform better than students during one of the regular spring semesters regardless of learning environments.
Anstine, J. & Mark, S. (2005). A Small Sample Study of Traditional and Online Courses with Sample Selection Adjustment. Journal of Economic Education. 36(2): 107-127.
Brown, B.W. & Liedholm, C.E. (2002). Can Web Courses Replace the Classroom in Principles of Microeconomics? American Economic Review. 92(2): 444-448.
Brown, B.W. & Liedholm, C. E. (2002.) Can Web Courses Replace the Classroom in Principles of Microeconomics? American Economic Review. 92(2): 444-448.
Coates, D., Humphries, B.R., et al. (2004). ‘No Significant Distance’ Between Face to Face and Online Instruction: Evidence from Principles of Economics. Economics of Education Review. 23(6): 533-546.
Harmon, O.R. & Lambrinos, J. (2007). Student Performance in Traditional versus Online Format: Evidence from an MBA Level Introductory Economics Course. Working Paper 2007-03. University of Connecticut, Storrs, CT.
Terry, N., Lewer, J., et al. (2003). The Efficacy of Alternative Instruction Modes in Economics. Journal of Economics and Economic Education Review. 4(1): 23-34.