
Li, L., & Titsworth, S. (2015). Student misbehaviors in online classrooms

The Amer. Jrnl. of Distance Education, 29:41–55, 2015
Copyright © Taylor & Francis Group, LLC
ISSN: 0892-3647 print/1538-9286 online
DOI: 10.1080/08923647.2015.994360
Student Misbehaviors in Online Classrooms: Scale
Development and Validation
Li Li
University of Wyoming
Scott Titsworth
Ohio University
The current program of research included two studies that developed the Student Online Misbehaviors
(SOMs) scale and explored relationships between the SOMs and various classroom communication
processes and outcomes. The first study inductively developed initial SOM typologies and tested
factor structure via an exploratory factor analysis. Subsequently, the second study evaluated the
model fit through a confirmatory factor analysis (CFA) and assessed relationships between students’
perceptions of their online misbehaviors, perceived learning, and various demographic characteristics. Four factors were found in the SOMs scale: Seeking Unallowed Assistance, Internet Slacking,
Aggressiveness, and Lack of Communication. Reliability and validity were established. Results indicated certain demographics were related to perceptions of use and severity of SOMs; SOMs were
minimally related to students’ perception of learning.
Researchers continue to assess multiple issues related to academic quality and student success. Among these issues is the connection between communication and student classroom
misbehaviors (Plax, Kearney, McCroskey, and Richmond 1986); however, most research on this
topic has been conducted in traditional classroom settings. Teachers and researchers would benefit from understanding how student misbehaviors and subsequent communication-based reactions
from teachers manifest in online learning environments.
A growing body of research documents behaviors necessary to student success in online
classes. For example, independent learners are more self-motivated, organized, self-disciplined,
and able to think more critically (Lowes 2005). Because online classes typically utilize asynchronous learning, being independent is strongly related to student success (DiBiase and Kidwai
2010). Consequently, mature learners tend to succeed at higher levels than younger learners in
online settings (Knowles, Holton, and Swanson 2005). In addition to independence, research suggests that feeling comfortable connecting, communicating, and collaborating with other learners
promotes academic success in online classes (Nagel, Blignaut, and Cronjé 2009).
Correspondence should be sent to Li Li, University of Wyoming, UU432, 125 College Drive, Casper, WY 82601.
E-mail: [email protected]

Although researchers have documented behaviors linked to success in online classes, challenges arise when scholars apply student misbehavior research to those settings. Student
misbehaviors are actions considered inappropriate for classroom settings because they disrupt
learning (Durmuscelebi 2008; Kearney et al. 1985, 1991; Plax, Kearney, and Tucker 1986).
Research in traditional face-to-face classrooms has conceptualized student misbehaviors as either
active or passive (Dreikurs, Grunwald, and Pepper 1971; Plax and Kearney 1990). Active
misbehaviors are recognizable actions that are immediately disruptive to the classroom environment and student learning (e.g., talking out of turn and shouting). Passive misbehaviors are
covert behaviors that may not be readily identified as destructive but that interfere with teaching
and learning (e.g., refusing to do homework and truancy).
Freestone and Mitchell (2004) suggested, “The Internet has paved the way for many new forms
of aberrant behavior, of which some are entirely new and others are technologically updated versions of long standing ethical debates” (122). Because online pedagogy has steadily grown, and
in fact has become a common form of education delivery, there is a practical need to better understand behaviors that facilitate and degrade learning. Although a significant body of research has
explored students' misbehaviors in face-to-face classes, the nature of online instruction and learning
does not permit easy adaptation of that work to mediated contexts.
This article reports results of two studies to address this issue. The first study inductively
developed the Student Online Misbehaviors (SOMs) scale using a typology of behaviors reported
by students who took online classes. The second study provided additional evidence for the factor validity by testing model fit through a confirmatory factor analysis (CFA) and by assessing
the impact of demographics on student-perceived SOMs as well as the influence of SOMs on
students’ perceived learning.
STUDY 1
Because student misbehavior research has not been expanded to online settings, the purpose of
the first study was to inductively generate a typology of online student misbehaviors that could
form an inventory for use in subsequent studies of online classes.
Stage 1
Participants. A total of 13 teachers (8 females, 5 males) and 110 students (67 females,
42 males, 1 unidentified) from a university in the midwestern United States took part in the study.
The mean age of the students was 25.21 years (SD = 7.83). The mean age of the teachers was
43.58 years (SD = 10.21). Eleven teachers held doctoral degrees, and two indicated that their highest
degree was at the master's level.
Procedures. After approval from the Institutional Review Board, we sent e-mail to all teachers of online courses to invite them and their students to complete an online survey. Invitations
were e-mailed to the teachers twice throughout the quarter. To maximize the response rate,
we created two versions of the survey: one anonymous and the other recording names so that
teachers could grant students participation credit. Participants were asked to provide answers
to an open-ended question as well as demographic information. For the open-ended question, respondents described students’ misbehaviors that they had experienced or seen in online
classes—participants were prompted to record up to five misbehaviors.
Generation of scale item pool. Using Johanson and Brooks’s (2009) recommendation
regarding adequate sample size, the 160 valid student misbehavior messages collected from
110 students were deemed sufficient for the study. The 38 messages created by the teacher sample
were integrated into the initial typology and served as additional evidence for validation of the
categories.
Using the constant comparative method (Glaser and Strauss 1967), we identified twenty student misbehavior types (Table 1). The process of refining the categories was iterative. Face validity for the categories was established by pairing participants' actual wordings of examples with
conceptual definitions. This process ensured internal validity for identified misbehaviors because
all such behaviors were grounded in participants' stated experiences. Additional content validity
was established by asking two layperson coders to match all the original misbehavior examples
with the categories conceptualized by the researchers. Two expert scholars were also consulted
on the established categories.
Stage 2
Participants. A total of 412 students (282 females, 124 males, 6 unidentified) enrolled in
online classes at several U.S. universities responded to the survey. Participants were predominantly Caucasian. The mean age of the students was 24.96 years (SD = 8.16), with a range
from 17 to 60 years. Participants had taken an average of 4.4 prior online classes (SD = 5.63),
ranging from 0 to 48 classes.
Data were screened for missing values and outliers. Missing values (8.94%) were imputed by
the “multiple imputations” procedure in the LISREL 8.80 analysis program. As a standard procedure to detect multivariate outliers, Mahalanobis Distance was evaluated “as χ2 with degrees
of freedom equal to the number of variables" (Tabachnick and Fidell 2007, 99). Because the SOMs scale
includes twenty variables, each case's Mahalanobis distance was examined against 45.315, the critical
chi-square value at p < .001 with 20 degrees of freedom. Ten cases were removed from the
data file. The final data set contained 397 cases.
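The outlier-screening step described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' code: the data matrix is simulated in place of the actual 397 × 20 ratings, and the planted outliers are hypothetical.

```python
import numpy as np
from scipy.stats import chi2

# Simulated stand-in for the ratings matrix: 397 cases x 20 items
rng = np.random.default_rng(0)
X = rng.normal(size=(397, 20))
X[:3] += 6  # plant a few extreme multivariate cases for illustration

# Squared Mahalanobis distance of each case from the centroid
mean = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
diff = X - mean
d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

# Compare against the chi-square critical value at p < .001, df = 20 (~45.315)
critical = chi2.ppf(1 - 0.001, df=X.shape[1])
clean = X[d2 <= critical]
print(round(critical, 3), X.shape[0] - clean.shape[0])
```

With the paper's 20 variables, `chi2.ppf(0.999, 20)` reproduces the 45.315 cutoff cited from Tabachnick and Fidell.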
Sample size is a very important factor to consider in scale development because insufficient
cases might severely influence the factor structure produced. According to Bryant and Yarnold
(1995), “One’s sample should be at least five times the number of variables. The subjects-tovariables ratio should be 5 or greater. Furthermore, every analysis should be based on a minimum
of 100 observations regardless of the subjects-to-variables ratio” (100). Meyers, Gamst, and
Guarino (2006) recommended a sample size target ratio of ten cases for every variable, with
at least two hundred cases. We conducted a split-file procedure to utilize approximately half of
the 397 participants (SOMs1 data set: n = 200) in this stage for an exploratory factor analysis
(EFA); the other half was retained for use in Study 2 (SOMs2 data set: n = 197). Both sample
sizes were close to the recommended criteria.
Procedure. In the subsequent three quarters, we sent e-mail to online class instructors to
recruit student participants. Four methods were utilized. First, e-mails were sent to 120 instructors who taught online classes at the same midwestern university in Stage 1, excluding all the
teachers who had been contacted from the previous stage. Second, our survey was enrolled in
the Communication Research Pool of the university, thus securing some responses from students
who took online classes from the School of Communication Studies. Third, we sent recruitment
TABLE 1
Initial SOMs: Student Online Misbehavior Types and Examples

1. Bad textual manners: Incorrect punctuation, bad grammar, slang, or typos in e-mail or on discussion boards; rude or inappropriate language or topics in e-mails, on discussion boards, and in course chat rooms
2. Bad nontextual mannersᵃ: Inappropriate use of nontextual manners such as capitalizing or boldfacing; smoking when face-to-face on Skype; inappropriate posters in the background when video blogging
3. Technology failure: Failure to open or send attached files on Blackboard; unable to upload video to Blackboard; not muting oneself on conference calls; unable to take online tests
4. Aggressive toward teacherᵃ: Being argumentative toward or hostilely communicating with the teacher on discussion boards or via e-mail, for example, demanding credit for late work despite the teacher's policy against it; making grade threats ("You MUST give me at least a B or I won't be able to start my new job") via e-mail; accusing the instructor/teaching assistant of unfair grading; snarky references to the assignment on blogs; gripes about the teacher sent to all students in classes
5. Aggressive toward classmatesᵃ: Becoming offended easily by opposing ideas; attacking (negative feedback, insulting, bad-mouthing, cursing, rudely criticizing) other students' thoughts or group members' comments on discussion boards, in blogs, or in online classroom chat
6. Excessive communication: Sending too many e-mails to the teacher and other students too frequently
7. Lack of communication with teacherᵃ: Rarely initiating communication with the teacher; rarely responding to teacher-initiated communication; asking the teacher few or no questions; not clarifying the teacher's instructions; responding slowly to e-mail inquiries
8. Lack of communication with classmatesᵃ: Rarely initiating communication with classmates; rarely responding to classmate-initiated communication; failure to e-mail classmates or to clarify classmates' posts on the discussion board; responding slowly to e-mail inquiries
9. No communicationᵃ: No communication from the student; students often disappear and the teacher and other students never hear from them again regardless of how many times or through how many channels they try to make contact; never checking e-mail; not responding to e-mails; not logging on to the online class for days, even when it has a daily assignment
10. Being inattentiveᵃ: Ignoring or carelessness in reading instructions; forgetting deadlines/exams
11. Lack of critical thinkingᵃ: Only filling out some information instead of reading, discussing, or giving one's own opinion and why; poor discussion board comments; submitting very short responses; posts containing very little relevant information; focusing just on the exam, not the knowledge
12. Multitasking: Watching online TV and movies; listening to music; playing games, text messaging, or e-mailing people; engaging in Facebook, Yahoo chats, or Twitter, or checking other unrelated websites
13. Irrelevant communication: Posting irrelevant topics; straying from the subject on the discussion board; taking the discussion boards out of context
14. Procrastinationᵃ: Late submissions, postings, assignments
15. Slacking (group work)ᵃ: Being uninvolved when assigned to partner/group work via Blackboard; not doing one's part in a group activity; reliance on group members to complete work; not cooperating/contributing
16. Slacking (individual work)ᵃ: Failure to do the reading, notes, or review of the lectures; in terms of coursework, not following guidelines (short responses, incomplete assignments, never submitting any work); not participating in required discussion board postings
17. Cheating individuallyᵃ: Cheating on an exam by checking a related book
18. Unallowed collaborationᵃ: Working together during the essay portions of exams/tests/quizzes; sharing work/file exchange
19. Plagiarismᵃ: Googling during tests/quizzes; copying from the Internet; having other people do the work
20. Abusing technologyᵃ: Taking advantage of technology features of the online classroom to gain unallowed personal benefits; making use of different testing times to get the test questions; blaming technology for failures of communication, assignment completion, or submission

ᵃ Indicates items included in the final scale.
e-mail to The Communication, Research, and Theory Network (CRTNET), an e-mail Listserv
managed by the National Communication Association. Fourth, we used convenience sampling by
soliciting help from acquaintances who taught, or knew instructors who were teaching, online
classes at U.S. higher education institutions at the time.
Similar to the procedure in Stage 1, two online surveys were created: one version asked
for anonymous responses whereas the other recorded names for participation credit. Following
Durmuscelebi (2008) and Kearney et al. (1985), participants were asked to indicate frequency of
use (1 = never, 5 = very often) and the severity (1 = least severe, 5 = extremely severe) for each
SOM.
Initial Instrument Development
Four preliminary examinations were performed to check the factorability of the SOMs1 data
set. First, correlations were calculated among use ratings for the twenty variables; coefficients
ranged between .30 and .90, suggesting a lack of both independence and singularity (Tabachnick
and Fidell 2007). Second, Principal Axis Factoring with a Promax rotation was applied to the
use data. Third, Bartlett's Test showed that significant correlations existed (χ2 = 2306.369,
df = 190, p < .05), and the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy (.927)
indicated the appropriateness of the factor analysis approach. Finally, 4 items (Items 3, 6, 12,
13) were eliminated from the scale because their communality values did not meet acceptable
levels.
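Bartlett's test of sphericity, used in the factorability check above, can be computed directly from a correlation matrix with the standard formula. The sketch below uses simulated ratings (one shared factor across 20 items), not the study's data, so only the degrees of freedom are expected to match the reported values.

```python
import numpy as np

def bartlett_sphericity(data: np.ndarray):
    """Bartlett's test of sphericity: chi-square statistic and degrees of
    freedom for H0 that the correlation matrix is an identity matrix."""
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    _, logdet = np.linalg.slogdet(corr)  # numerically stable log-determinant
    chi_sq = -(n - 1 - (2 * p + 5) / 6) * logdet
    df = p * (p - 1) // 2
    return chi_sq, df

# Simulated ratings: one shared factor across 20 items, 200 respondents
rng = np.random.default_rng(1)
shared = rng.normal(size=(200, 1))
X = shared + 0.8 * rng.normal(size=(200, 20))
chi_sq, df = bartlett_sphericity(X)
print(df)  # 190, matching a 20-item scale
```

A significant chi-square (against df = 190 here) indicates the items are intercorrelated enough for factor analysis to be sensible.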
Both a parallel analysis (see Patil et al. 2008) and a scree plot suggested a four-factor structure
for the 16 items. Nine items met the .60/.40 loading criterion advocated by McCroskey and
Young (1979). Goodboy (2011) suggested that items with borderline primary loadings (close to
.60) whose secondary loading does not exceed 50% of the primary loading should be retained;
Items 10, 11, 14, 15, and 17 met that threshold. Although Item 2 had a primary loading close to
.60 and a secondary loading above .40, we chose to retain it to maintain at least three items in
each factor. Item 1 was eliminated from the pool because it did not meet any of the criteria and
was not needed to maintain a sufficient number of items in any particular factor. The final scale
included 15 items (Table 2).

TABLE 2
Rotated Factor Structure of the SOMs Scale

Item                                        F1      F2      F3      F4
19. Plagiarism                            .896    .188   −.090   −.135
20. Abusing technology                    .722    .047   −.052   −.028
18. Unallowed collaboration               .685   −.148    .239    .130
17. Cheating individually                 .592    .194    .027    .065
16. Slacking over individual work         .052    .737   −.155    .104
14. Procrastination                       .132    .596    .075   −.023
15. Slacking over group work              .148    .582   −.004    .054
10. Being inattentive                     .095    .576    .035    .085
11. Lack of critical thinking             .081    .567    .033    .124
1. Bad textual manners                    .011    .434    .397   −.033
5. Aggressive toward classmates           .096   −.127    .868   −.004
4. Aggressive toward the teacher         −.008   −.065    .862    .051
2. Bad nontextual manners                −.194    .458    .588   −.077
8. Lack of communication with classmates  .071   −.041    .027    .856
7. Lack of communication with teachers   −.133    .151    .004    .733
9. No communication                       .003    .116   −.025    .728

Eigenvalue                                7.93    1.41    1.31     .89
% of variance                            47.21    6.78    4.96    3.13
Alpha                                      .87     .86     .83     .93

Note: Principal Axis Factoring with Promax rotation was used. Items are grouped under the factor
on which their primary (retained) loading falls.
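As a rough illustration of how a parallel analysis reaches a retention decision, the sketch below compares the observed eigenvalues of a correlation matrix against the mean eigenvalues of same-shaped random data (Horn's method). The two-factor data set is fabricated for demonstration; this is not the procedure or data from the study.

```python
import numpy as np

def parallel_analysis(data: np.ndarray, n_iter: int = 100, seed: int = 0) -> int:
    """Horn's parallel analysis: count the factors whose observed eigenvalues
    exceed the mean eigenvalues obtained from random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand_mean = np.zeros(p)
    for _ in range(n_iter):
        r = rng.normal(size=(n, p))
        rand_mean += np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    rand_mean /= n_iter
    return int(np.sum(obs > rand_mean))

# Fabricated two-factor structure: items 1-5 load on one factor, 6-10 on another
rng = np.random.default_rng(2)
f = rng.normal(size=(200, 2))
loadings = np.zeros((2, 10))
loadings[0, :5] = 0.9
loadings[1, 5:] = 0.9
X = f @ loadings + rng.normal(size=(200, 10))
print(parallel_analysis(X))  # suggests retaining 2 factors
```

Only eigenvalues larger than what chance correlations alone would produce count toward the factor total, which is why parallel analysis tends to retain fewer factors than the eigenvalue-greater-than-one rule.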
The four factors had strong face validity when analyzed in comparison to literature on online
class communication. Factor 1, Seeking Unallowed Assistance (M = 2.29, SD = .96) consisted
of four items related to students’ behaviors of seeking inappropriate help for their work. Factor
2, Internet Slacking (M = 2.69, SD = .92), included five items describing ways in which students took advantage of Internet technology to do less work. Factor 3, Aggressiveness (M = 1.61,
SD = .69), contained three items related to students’ aggressive communication behaviors toward
their classmates and teachers. Factor 4, Lack of Communication (M = 2.48, SD = .96), included
three items indicating students' noncommunication or lack of communication with their teachers
or classmates. The four factors were significantly correlated (Table 3). The scale's overall
reliability was .93.
STUDY 2
Study 1 provided initial evidence of validity, reliability, and dimensionality of the SOMs
scale. Study 2 gave further evidence of validity, reporting a CFA and assessing relationships
TABLE 3
Correlation Matrix of SOMs Dimensions

Factor                             2       3       4
1. Seeking Unallowed Assistance  .698∗   .492∗   .556∗
2. Internet Slacking             —       .570∗   .661∗
3. Aggressiveness                        —       .521∗
4. Lack of Communication                         —

∗p < .01.
between students’ perceptions of their online misbehaviors, their perceived learning, and various
demographic characteristics.
To test the model fit of the scale's four-factor structure, a CFA was performed with maximum
likelihood estimation using LISREL 8.80 on the SOMs2 data set (N = 197). Five model fit indices
were used: (a) the chi-square, (b) the root mean square error of approximation (RMSEA), (c) the
comparative fit index (CFI), (d) the non-normed fit index (NNFI), and (e) the standardized root
mean square residual (SRMR). Model fit is generally considered acceptable if the RMSEA statistic
does not exceed .08 (and preferably is less than .05), the values of CFI and NNFI are above .90,
and the SRMR value is less than .08 (Kline 2005; MacCallum, Browne, and Sugawara 1996).
Ideally, the chi-square statistic should be nonsignificant; however, the large sample sizes common
in CFAs rarely allow this criterion to be tenable. To confirm the four-factor structure of the scale,
an adequate model fit should be observed:
H1: The four-factor structure observed in the first study will have adequate fit with the SOMs2
data set.
Scholars have suggested that technology is related to multiple variables in a learning environment (Freestone and Mitchell 2004; Selwyn 2008) in addition to students’ actual behaviors
(Patchin and Hinduja 2006; Rocco and Warglien 1995). Consequently, we considered the possibility that online misbehaviors stemming from a technology-mediated learning environment
would be related to other variables and behaviors.
First, students’ maturity level (e.g., age) influences their learning behaviors because older
students tend to be more motivated and more reflective learners (DiBiase and Kidwai 2010).
Related scholarship has also suggested that students’ previous experience with Internet technology influences their experience with online learning (Lim 2001). Consequently, we explored
whether previous experience with technology as well as age are related to students’ responses to
the SOMs scale:
RQ1: Are students' perceptions of SOMs (frequency and severity) a function of (a) the number of
online classes taken or (b) student age?
Although on-site student misbehavior results in diminished learning (Seidman 2005), it is
unknown whether a similar relationship exists with online misbehaviors. To explore this relationship in online settings, we considered both cognitive and affective learning. Because students
were from a variety of classes, we could not assess actual cognitive learning. As such, a scale
developed by Frymier and Houser (1999) was used to assess students’ behaviors normally
associated with cognitive learning (e.g., studying for exams). Affective learning emphasizes
students’ “interests, attitudes, appreciations, values” (Krathwohl, Bloom, and Masia 1964, 7).
RQ2: How is student learning (affective and cognitive) related to student use of SOMs?
Method and Measures
In addition to completing the previously described SOMs scale, participants completed two scales
assessing cognitive learning indicators and affective learning.
The Revised Cognitive Learning Indicators Scale. The Revised Cognitive Learning
Indicators Scale (Frymier and Houser 1999) includes seven items assessing learner behaviors
or activities associated with learning course content. Sample items include “I review the course
content” and “I think about the course content outside the class.” Previous findings have demonstrated construct validity and satisfactory reliability, with alpha coefficients ranging from .83 to
.86 (Frymier and Houser 1999; Hsu 2012). In this study, Cronbach’s alpha was .85.
The Affective Learning Scale. The Affective Learning Scale (McCroskey 1994;
McCroskey et al. 1985) includes twenty-four items measuring students’ attitudes toward the
course, subject matter, and the teacher as well as the likelihood of students’ related behavior. Each
of these dimensions is evaluated through four seven-point bipolar adjective subscales (good–
bad, worthless–valuable, fair–unfair, and positive–negative). Through repeated uses, the scale
has resulted in reliability estimates around .90 (Hsu 2012; McCroskey et al. 1985; Plax et al.
1986). In this study, Cronbach's alpha was .95. Specifically, the reliabilities for the subscales were
affect toward the behaviors recommended in the course (α = .94), the class’s content (α = .83),
the instructor (α = .91), likelihood of taking future courses with the specific instructor (α =
.96), likelihood of taking future courses in the content area (α = .94), and likelihood of actually
attempting to engage in behaviors recommended in the course (α = .95).
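The alpha coefficients reported for these measures can be computed directly from an item-score matrix with the usual variance-ratio formula. The sketch below uses simulated responses (seven items, echoing the length of the cognitive indicators scale), not the study's data, so the printed value will not match the reported .85.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) score matrix:
    k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated Likert-style responses: a common trait plus item-level noise
rng = np.random.default_rng(3)
trait = rng.normal(size=(300, 1))
scores = trait + 0.7 * rng.normal(size=(300, 7))
print(f"{cronbach_alpha(scores):.2f}")
```

Alpha rises as the items share more variance with the common trait; with no shared trait at all, the same function returns a value near zero.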
RESULTS
Results of the CFA revealed acceptable fit for a four-factor model: χ2 (84) = 179.753, p < .01;
CFI = .98, NNFI = .97, SRMR = .06, RMSEA = .076 [90% CI = .061–.092]. Inspection of the
λ loadings and accompanying z scores indicated that all fifteen items loaded significantly (factor
loadings ranged from .48 to .98) on their respective factors (see Table 4). When the error variances
of unallowed collaboration and aggressiveness toward teacher were allowed to correlate, the model
had slightly better fit: χ2 (83) = 162.887, p < .01; CFI = .98, NNFI = .98, SRMR = .06,
RMSEA = .070 [90% CI = .054–.086]. The corresponding lambda loadings and z scores indicated
that all fifteen items' loadings on their respective factors remained the same. Obtained Cronbach
alphas were .84 for Seeking Unallowed Assistance, .87 for Internet Slacking, .75 for Aggressiveness,
and .85 for Lack of Communication. The CFA procedure therefore confirmed the four-factor
structure suggested by the previous EFA by showing satisfactory model fit.
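The RMSEA values above follow directly from the reported chi-square statistics under the standard point-estimate formula; the quick check below recomputes both (point estimates only, not the 90% confidence intervals, which require a noncentral chi-square search).

```python
import math

def rmsea(chi_sq: float, df: int, n: int) -> float:
    """Point estimate of RMSEA from a model's chi-square, degrees of
    freedom, and sample size: sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi_sq - df, 0.0) / (df * (n - 1)))

# Values reported for the two CFA models (N = 197)
print(f"{rmsea(179.753, 84, 197):.3f}")  # 0.076, initial four-factor model
print(f"{rmsea(162.887, 83, 197):.3f}")  # 0.070, with the error covariance freed
```

Both recomputed values match the figures reported in the text, which is a useful sanity check when transcribing fit statistics from software output.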
The first research question asked whether the frequency and severity of SOMs were related to
the number of previous online classes taken or students’ ages. Simple correlations were assessed
to detect any significant relationship (see Table 5). Results revealed that the number of online
classes students had taken was negatively related to students’ use of unallowed collaboration and
TABLE 4
Confirmatory Factor Analysis of the SOMs

Latent construct item                      M      SD      λ      SE
Factor 1. Seeking Unallowed Assistance
  Cheating                                2.03    .99    .80    .06
  Unallowed collaboration                 2.02   1.05    .79    .07
  Plagiarism                              2.09   1.09    .91    .07
  Abusing technology                      2.08    .97    .59    .07
Factor 2. Internet Slacking
  Inattentiveness                         2.34   1.01    .79    .06
  Lack of critical thinking               2.68   1.04    .72    .07
  Procrastination                         2.75   1.25    .98    .08
  Slacking over group work                2.46   1.09    .86    .07
  Slacking over individual work           2.64   1.08    .83    .07
Factor 3. Aggressiveness
  Nontextual                              1.59    .84    .54    .06
  Aggressiveness toward teacher           1.35    .63    .48    .04
  Aggressiveness toward classmates        1.49    .66    .52    .04
Factor 4. Lack of Communication
  Little communication with teacher       2.44   1.05    .84    .07
  Little communication with classmates    2.44   1.09    .93    .07
  No communication                        2.19   1.08    .83    .07

Note: All factor loadings are standardized and significant at p < .01.
TABLE 5
Correlations Between SOM Use/Severity, Students' Experiences With Online Classes, and Age

        Online classes         Student age
SOM     Use      Severity      Use      Severity
2       .02      .06           −.01     .08
4       .03      .03           −.06     −.01
5       .13      .11            .05     −.00
7      −.06     −.03           −.03     −.12
8       .03      .13            .09     −.01
9       .03      .07            .07     −.00
10      .12      .06            .08     −.05
11      .06      .07            .05      .05
14      .04      .20∗∗          .01      .12
15      .20∗     .21∗           .14      .14
16      .13      .22∗∗          .06      .14
17     −.03      .14           −.04      .13
18     −.19∗     .07           −.18∗     .18∗
19     −.01      .05           −.14      .08
20     −.02      .12           −.18∗     .15

∗p < .05. ∗∗p < .01.
TABLE 6
Correlations of SOMs With Cognitive Learning Indicators and Affective Learning

SOM   Cognitive  Affective    A1      A2      A3      A4      A5      A6
2      −.07      −.07       −.12     .07    −.04    −.11    −.06    −.03
4      −.12      −.08       −.10    −.00    −.06    −.14    −.07    −.09
5      −.03      −.05       −.11     .09    −.11    −.11    −.15     .01
7      −.21∗∗    −.05       −.13     .06    −.04    −.13    −.05    −.03
8      −.08      −.03       −.04     .07     .03    −.15∗   −.03     .03
9      −.03      −.01       −.02     .11    −.07    −.09     .07    −.09
10     −.09      −.06       −.11     .06    −.09    −.12    −.02    −.05
11     −.16∗     −.07       −.06     .01    −.03    −.10    −.09    −.06
14     −.12      −.02       −.02     .08    −.01    −.03     .04    −.09
15      .02       .05        .09     .14     .02     .02     .06    −.07
16     −.09      −.06       −.06     .05     .03    −.06    −.05    −.10
17     −.13       .04        .05     .06     .04    −.06    −.02     .05
18     −.15       .02       −.06    −.03     .09     .05    −.05     .07
19     −.05       .07       −.01     .14     .03    −.00    −.06     .12
20     −.06       .06        .01     .17∗    .01     .06    −.02     .01

Note: A1 = Affect toward class's content; A2 = Likelihood of taking future courses in the content area; A3 =
Affect toward the instructor; A4 = Affect toward the behaviors recommended in the course; A5 = Likelihood of actually
attempting to engage in behaviors recommended in the course; A6 = Likelihood of taking future courses with the specific
instructor.
∗p < .05. ∗∗p < .01.
positively related to slacking over group work. For SOMs severity, correlations indicated that
students' online class experience was positively related to their perceptions of the severity of
procrastination and slacking over individual work. Simple correlations also indicated that students'
age was negatively related to their perceptions of use of unallowed collaboration and abusing
technology. However, students' age was positively related to their perception of the severity of
unallowed collaboration. These results suggest that even though older students are less likely
to engage in unallowed collaboration, they perceive such behavior as more severe.
The second question asked about the relationship between students' SOMs and their affective
learning and learning behaviors. Table 6 reports the results of simple correlations of SOMs with
cognitive and affective learning. Two SOMs (lack of communication with teacher and lack of
critical thinking) were negatively related to students’ cognitive learning. None of the SOMs was
significantly associated with affective learning in general. When the subscales of affective learning measure were further investigated, lack of communication with classmates was negatively
associated with the subscale of “affect toward the behaviors recommended in the course.” In fact,
the absolute average value of Pearson correlations between SOMs and affective learning was
below .10, suggesting minimal association between the two constructs.
DISCUSSION
This study represents an initial attempt at creating a scale assessing online student misbehaviors.
Four factors were retained from the SOMs scale: Seeking Unallowed Assistance, Internet
Slacking, Aggressiveness, and Lack of Communication. The observed factors suggest similarities to misbehaviors in face-to-face classes, but they also present misbehaviors unique to the
online learning environment. Our discussion begins with an analysis of these similarities and
differences.
The fifteen-item scale conforms to the active/passive dichotomy (Dreikurs, Grunwald, and Pepper
1971). More specifically, Aggressiveness shares the characteristics of active misbehavior in that
students' aggressive behavior is easily detected and recognizable. Seeking Unallowed Assistance
emphasizes students' behaviors of actively looking for inappropriate ways to improve their
performance in a class. Lack of Communication can be categorized as passive misbehavior
because students try to avoid communicating with classmates or teachers. Similarly, Internet
Slacking depicts students' passivity in minimizing their efforts in online learning. In all, passive
misbehaviors are not as clearly visible as active misbehaviors, which are direct and up front.
Furthermore, the four factors point to four types of problematic student behaviors in online
classes (Nagel, Blignaut, and Cronjé 2009) that are related to the visibility of student participation
(Beaudoin 2002; Nagel, Blignaut, and Cronjé 2009). More specifically, Seeking Unallowed
Assistance reflects students' intention and behavior of seeking vicarious participation, Internet
Slacking corresponds to inadvertent participation, Aggressiveness to aggressive participation, and
Lack of Communication to nonparticipation. Correspondingly, Internet Slacking and Lack of
Communication reveal students' use of the technological nature of online classes to remain
invisible participants, whereas Aggressiveness and Seeking Unallowed Assistance inadvertently
make students more visible.
The results also support Diaz and Cartnal's (1999) claim that online students need to be more
independent learners. The findings add evidence to the andragogical model (Knowles, Holton,
and Swanson 2005), which holds that adult learners tend to be independent learners: older students
are less likely to engage in SOMs such as "unallowed collaboration" and "abusing technology."
More important, the older students are, the more severe they perceive the misbehavior of
"unallowed collaboration" to be. Even though younger students are more familiar and comfortable
with the technology involved in online classes (Lim 2001), results pertaining to the first research
question lend support to DiBiase and Kidwai's (2010) claim that older students, rather than
younger students, are better online learners.
Interestingly, although on-site communication research has suggested that student misbehaviors are negatively related to students’ affective and cognitive learning (e.g., McCroskey et al. 1985; Plax, Kearney, McCroskey, and Richmond 1986), that argument was not supported for students’ online class communication. Instead, the present study indicates that SOMs are not related to students’ affective learning and are only minimally related to their cognitive learning. Two reasons could explain these findings. As mentioned previously, students tend to attribute their communication manners to different technology use habits (Stephens, Houser, and Cowan 2009). It is likely that students differentiate the appropriateness of their behaviors from their actual attitude toward learning. Meanwhile, Chaiken and Eagly (1983) suggested that affect, as a type of peripheral cue, is more salient in nonverbal channels (e.g., video and audiotape) than in verbal channels. They also argued that textual messages are more related to central processing of information (i.e., critical thinking). Because most of the messages in this study took place in verbal (textual) formats, it is likely that students’ affect could not be detected here.
52
LI AND TITSWORTH
LIMITATIONS AND FUTURE STUDY
Although this study offers insights into online classroom communication research, it is not without limitations. First, we did not use random sampling to collect data. We solicited possible student misbehavior types from only one university during Stage 1 and tried to apply those types to all the undergraduate and graduate students enrolled in online courses at U.S. higher education institutions that we could access at Stage 2. Although the broader sample of students in the SOMs1/SOMs2 combined data set was asked to provide any additional observed or experienced misbehaviors, it is still possible that our approach has not fully captured the range of misbehaviors displayed by students in online classes. In an effort to maximize participation, the sampling technique used for Stage 2 was purposeful rather than random. Most of our data came from students who were either studying communication or enrolled in courses at the same midwestern university where we worked. Therefore, generalization of the findings might be appropriate only for that specific university or the specific discipline of communication.
The use of self-report data also poses limitations. Generally speaking, social desirability could inhibit students from identifying and admitting to misbehaviors. This tendency could result in underreporting of both the nature of misbehaviors and the frequency with which they are enacted. Future research could address this issue by asking students to identify only the misbehaviors they have observed others use in online classes. Future studies could also triangulate students’ use of SOMs by comparing self-reports of misbehaviors with externally measurable student behaviors captured through learning management systems, such as the times and contents of e-mail communication, discussion threads, and time on task. Research could also compare student misbehavior and learning scale results from face-to-face classrooms with those from online settings, and could address the critical issue of retention by relating students’ SOM scale responses and learning measure responses to completion of the class.
The measures of affective learning and cognitive learning indicators were also based on students’ self-reports. Valid questions can be raised as to whether such measures, especially the cognitive learning indicators scale, should be privileged over actual tests in measuring students’ learning of the whole course content. McCroskey and Richmond (1992) argued, “The study of variables that impact cognitive learning has long been impeded by the difficulty in establishing valid measures of this type of learning” (106). Even though a pretest–posttest design sounds appealing, Hooker and Denker (2014) warned that this type of assessment must be specific to the course and thus is not widely generalizable across disciplines. Furthermore, Hooker and Denker (2014) examined the Learning Loss Scale (a self-perceived measure) through two studies that illustrated validity concerns with previous findings by showing either a small or no relationship between the scale scores and performance on other cognitive learning measures. Therefore, students’ self-reports in the current study might have little relationship with their actual learning outcomes in online classes. Of course, to achieve a broad sample from multiple classes, a controlled assessment of actual cognitive learning was simply not possible. Future studies could focus on a specific class to triangulate students’ misbehaviors, learning outcomes, and self-reported cognitive learning.
Meanwhile, the participants were a composite of students from varied backgrounds. Because this study showed that demographic characteristics (e.g., age, number of online classes) can significantly influence reports of SOMs, that demographic variety inevitably affects the integrity of the collected data. Future research could extend this line of work and develop more targeted SOMs for various populations. One way to do so is to solicit messages from students of different age groups and majors.
Finally, because previous online pedagogy literature assumes that student online visibility and participation define proper learning behaviors (Lei 2004; Nagel, Blignaut, and Cronjé 2009), questions have already been raised as to whether invisible participation necessarily encumbers students’ learning (Beaudoin 2002). Although related studies have addressed those questions in on-site classroom communication (Meyer 2007, 2008), future research needs to pursue the same questions in online settings.
IMPLICATIONS AND CONCLUSION
Findings of this study offer important implications for students, teachers, and administrators.
First, students should be independent and cooperative participant learners. Unfortunately, as the study indicates, students reported themselves or other students as likely to abuse the technology that supports online communication. Students are not making an effort to communicate with teachers or classmates, they are not performing to their full capacity, and they tend to be aggressive in communication. In other words, students are more likely not to be independent, participating learners. Even though students do not associate their SOMs with their affective learning, they do indicate that lack of communication with teachers and lack of critical thinking negatively influence their cognitive learning. Therefore, in order to succeed, online learners need to be both independent and cooperative participants.
In order to promote students’ independence and participation, teachers should treat online
classroom interactions as interpersonal communication, attending to various factors (e.g.,
age, class experience) that might influence students’ perception of various online behaviors.
Administrators who supervise online education programs should bear in mind that effective classroom management is achieved by educating both teachers and students. As previous research has pointed out, student misbehaviors are inappropriate and disruptive (Durmuscelebi 2008; Kearney et al. 1991; Plax, Kearney, and Tucker 1986) and therefore need to be managed to enhance student learning. Even though the present study indicates that students differentiate their misbehaviors from their attitudes toward online learning, those misbehaviors nevertheless negatively impact their cognitive learning. Therefore, administrators should consider providing prerequisite training sessions for both students and teachers prior to their first online classes.
REFERENCES
Beaudoin, M. F. 2002. Learning or lurking? Tracking the “invisible” online student. Internet and Higher Education 5 (2): 147–155.
Bryant, F. B., and P. R. Yarnold. 1995. Principal-components analysis and exploratory and confirmatory factor analysis.
In Reading and understanding multivariate statistics, ed. L. G. Grimm and P. R. Yarnold, 99–136. Washington, DC:
American Psychological Association.
Chaiken, S., and A. H. Eagly. 1983. Communication modality as a determinant of persuasion: The role of communicator
salience. Journal of Personality and Social Psychology 45 (2): 241–265.
Diaz, D. P., and R. B. Cartnal. 1999. Students’ learning styles in two classes: Online distance learning and equivalent
on-campus. College Teaching 47 (4): 130–135.
DiBiase, D., and K. Kidwai. 2010. Wasted on the young? Comparing the performance and attitudes of younger and older
U.S. adults in an online class on geographic information. Journal of Geography in Higher Education 34 (3): 299–326.
Dreikurs, R., B. B. Grunwald, and F. C. Pepper. 1971. Maintaining sanity in the classroom: Illustrated teaching
techniques. New York: Harper & Row.
Durmuscelebi, M. 2008. Investigating students’ misbehavior in classroom management in state and private primary
schools with a comparative approach. Education 130 (3): 377–383.
Freestone, O., and V. Mitchell. 2004. Generation Y attitudes towards e-ethics and Internet-related misbehaviors. Journal
of Business Ethics 54 (2): 121–128.
Frymier, A. B., and M. L. Houser. 1999. The revised learning indicators scale. Communication Studies 50 (1): 1–12.
Glaser, B. G., and A. L. Strauss. 1967. The discovery of grounded theory. Chicago: Aldine.
Goodboy, A. K. 2011. The development and validation of the instructional dissent scale. Communication Education 60
(4): 422–440.
Hooker, J., and K. Denker. 2014. The Learning Loss Scale as an assessment tool: An empirical examination of convergent
validity with performative measures. Communication Teacher 28 (2): 130–143.
Hsu, C. 2012. The influence of vocal qualities and confirmation of nonnative English-speaking teachers on student
receiver apprehension, affective learning, and cognitive learning. Communication Education 61 (1): 4–16.
Johanson, G. A., and G. P. Brooks. 2009. Initial scale development: Sample size for pilot studies. Educational and
Psychological Measurement 70 (3): 1–7.
Kearney, P., T. G. Plax, E. R. Hays, and M. J. Ivey. 1991. College teacher misbehaviors: What students don’t like about
what teachers say and do. Communication Quarterly 39 (4): 309–324.
Kearney, P., T. G. Plax, V. P. Richmond, and J. C. McCroskey. 1985. Power in the classroom III: Teacher communication
techniques and messages. Communication Education 34 (1): 19–28.
Kline, R. B. 2005. Principles and practice of structural equation modeling. New York: Guilford Press.
Knowles, M. S., E. F. Holton, and R. A. Swanson. 2005. The adult learner. Amsterdam: Elsevier.
Krathwohl, D. R., B. S. Bloom, and B. B. Masia. 1964. Taxonomy of educational objectives: Handbook II. New York:
David McKay.
Lei, L. W. 2004. Evaluation of computer-assisted instruction in histology. Ph.D. diss., University of Washington, Seattle.
Lim, C. K. 2001. Computer self-efficacy, academic self-concept, and other predictors of satisfaction and future
participation of adult distance learners. The American Journal of Distance Education 15 (2): 41–51.
Lowes, S. 2005. Online teaching and classroom change: The impact of virtual high school on its teachers and
their schools. Available online at http://www.academia.edu/1106534/Online_teaching_and_classroom_change_The_
impact_of_Virtual_High_School_on_its_teachers_and_their_schools
MacCallum, R. C., M. W. Browne, and H. M. Sugawara. 1996. Power analysis and determination of sample size for
covariance structure modeling. Psychological Methods 1 (2): 130–149.
McCroskey, J. C. 1994. Assessment of affect toward communication and affect toward instruction in communication. In
1994 SCA summer conference proceedings and prepared remarks, ed. S. Morreale and M. Brooks, 55–68. Annandale,
VA: Speech Communication Association.
McCroskey, J. C., and V. P. Richmond. 1992. Increasing teacher influence through immediacy. In Power in the classroom:
Communication, control, and concern, ed. V. P. Richmond and J. C. McCroskey, 101–119. Hillsdale, NJ: Erlbaum.
McCroskey, J. C., V. P. Richmond, T. G. Plax, and P. Kearney. 1985. Power in the classroom: Behavior alteration
techniques, communication training and learning. Communication Education 34:214–226.
McCroskey, J. C., and T. J. Young. 1979. The use and abuse of factor analysis in communication research. Human
Communication Research 5 (4): 375–382.
Meyer, K. R. 2007. Student engagement in the classroom: An examination of student silence and participation. Paper
presented at the Annual Convention of the National Communication Association, November, Chicago, IL.
———. 2008. Student classroom engagement: A multiple linear regression analysis of the variables predicting student
silence and participation. Paper presented at the Annual Convention of the National Communication Association,
November, San Diego, CA.
Meyers, L. S., G. Gamst, and A. J. Guarino. 2006. Applied multivariate research: Design and interpretation. Thousand
Oaks, CA: Sage.
Nagel, L. L., A. S. Blignaut, and J. C. Cronjé. 2009. Read-only participants: A case for student communication in online
classes. Interactive Learning Environments 17 (1): 37–51.
Patchin, J., and S. Hinduja. 2006. Bullies move beyond the schoolyard. Youth Violence and Juvenile Justice 4 (2):
148–169.
Patil, V. H., S. N. Singh, S. Mishra, and D. D. Donavan. 2008. Efficient theory development and factor retention criteria:
Abandon the “eigenvalue greater than one” criterion. Journal of Business Research 61 (2): 162–170.
Plax, T. G., and P. Kearney. 1990. Classroom management: Structuring the classroom for work. In Teaching communication: Theory, research, and methods, ed. J. Daly, G. Friedrich, and A. Vangelisti, 223–236. Hillsdale, NJ:
Erlbaum.
Plax, T. G., P. Kearney, J. C. McCroskey, and V. P. Richmond. 1986. Power in the classroom VI: Verbal control strategies,
nonverbal immediacy and affective learning. Communication Education 35 (1): 43–55.
Plax, T. G., P. Kearney, and L. K. Tucker. 1986. Prospective teachers’ use of behavior alteration techniques on common
student misbehaviors. Communication Education 35:32–42.
Rocco, E., and M. Warglien. 1995. Computer mediated communication and the emergence of electronic opportunism.
Available online at http://eprints.biblio.unitn.it/34/1/CEEL96_01.pdf
Seidman, A. 2005. The learning killer: Disruptive student behavior in the classroom. Reading Improvement 42 (1): 40–46.
Selwyn, N. 2008. A safe haven for misbehaving? An investigation of online misbehavior among university students.
Social Science Computer Review 26 (4): 446–465.
Stephens, K., M. Houser, and R. Cowan. 2009. R U able to meat me: The impact of students’ overly casual email messages
to instructors. Communication Education 58 (3): 303–326.
Tabachnick, B. G., and L. S. Fidell. 2007. Using multivariate statistics. Boston: Allyn & Bacon.