Classroom or cyberspace? Ethical and methodological challenges of on-line gambling surveys for adolescents
Abstract

This paper outlines the practical and ethical implications of a recent trial of an on-line adolescent gambling survey conducted in Australia's capital city, Canberra. The main aim of the survey was to explore the potential suitability of an on-line methodology for future national gambling studies. The trial identified a number of important methodological and ethical advantages and disadvantages associated with using an on-line methodology. The principal advantage of this method is that it minimises disruption to school routines because it allows greater flexibility in the timing of the survey and in the amount of teacher time required for administration. However, the trial also provided useful insights into the potential disadvantages of this methodology, including difficulties in obtaining adequate response rates, lack of control over the administration context, and missed opportunities to obtain more detailed open-ended responses.

Key words: on-line methodology, adolescents, surveys, schools, gambling

Introduction

This paper reflects on the merits of a recent pilot on-line gambling survey of Australian adolescent school students. The on-line survey was conducted as part of a larger research project into adolescent gambling, where the primary methodology consisted of an identical pencil-and-paper survey (Delfabbro, Lahn, & Grabosky, 2005). The on-line survey was undertaken with a view to identifying the most appropriate method to be used in future cross-jurisdictional studies. During planning, it was envisaged that on-line surveys would enable smooth dissemination and collection of student surveys across Australia while minimising the organisational difficulties faced by participating school staff. We also anticipated that the electronic accessibility of on-line surveys could streamline survey administration and allow surveys to be completed by adolescents away from the classroom, using any computer terminal with Internet access. During the course of conducting this survey, it became apparent that this methodology had further unforeseen advantages and disadvantages that prompted the question of whether surveys are better located in classrooms or in cyberspace. The purpose of this paper is to summarise our experiences and to provide recommendations for future research conducted using an on-line methodology.

Surveys in adolescent gambling research

Research into adolescent gambling practices is a relatively recent but absorbing area of scholarly inquiry that is expanding at a rapid pace. The major research tool of adolescent gambling studies has been the survey, which came to dominate gambling research more generally during the 1990s (McGowan, 2004; see this paper for a review of qualitative and quantitative methodologies used in gambling research). Schools have been the locale of the bulk of adolescent gambling research, and the vast majority of studies have employed surveys administered to school populations. This school-based survey approach seems likely to remain the dominant approach to adolescent gambling research, despite the appearance of qualitative alternatives based on individual interview or focus group methodologies (e.g., ACOSS, 1997; Derevensky & Gupta, 2001; Wiebe & Falkowski-Ham, 2003; Wood & Griffiths, 2002).
Although alternative qualitative research projects can yield rich data, they are also labour intensive and time consuming, and they focus more upon individual experiences and the linguistic expression or construction of those experiences than on the prevalence of different behaviours or beliefs. Accordingly, larger-scale surveys are likely to remain the most effective way to obtain information from a large number of participants in the shortest possible timeframe. Curiously, in this burgeoning field, our methodologies and their ethical implications rarely feature in debate (notwithstanding recent contributions on the measurement of problem gambling, e.g., Derevensky, Gupta, & Winters, 2003; Ladouceur et al., 2000). As a specialised and relatively new field of inquiry engaging with vulnerable research subjects, adolescent gambling research constitutes fertile ground for exploring the ethical and practical implications of our research methods for the benefit of both future student participants and school authorities. Despite this, relatively little discussion has been directed towards the processes by which we obtain information about youth gambling.

A notable exception is a recent paper (McPhee & Canham, 2002) that focuses on improving research processes, including our engagement with policy makers and community agencies, and the relationship between researchers and the educators who assist in gambling projects. McPhee and Canham (2002) usefully draw attention to the various issues that researchers need to take into account when interacting with host institutions and, in particular, the school staff who are required to facilitate or implement the research. Pointing to the enormous demands placed on schools by researchers (both within and beyond the field of gambling research), McPhee and Canham suggest that researchers need to be more attuned to the pressures experienced by school staff who assist in the administration of surveys, that researchers should take these factors into account in the organisation and execution of surveys, and that they should consider carefully which tasks are delegated to school staff rather than to their own project staff.

Their general view resonates with our own experience in approaching Canberra schools in 2004 to participate in a conventional pencil-and-paper gambling survey for adolescents. Very few schools appeared to be concerned about what could be considered a somewhat sensitive research topic. Assurances of school anonymity and data confidentiality were certainly noted. However, by far the most common and immediate source of disquiet, and of schools declining to participate, was the placement of additional demands on teachers' time. School representatives tend to feel overwhelmed by the number of research applications they receive each year. Educators, including principals, teachers, and school counsellors, who are involved in approving and facilitating academic research in their schools state that they are not opposed to research itself but feel that research places extra pressures on an already under-resourced sector. The processing of consent forms and survey supervision are considered very time-consuming activities. Indeed, some school principals refused to be involved because of active-consent procedures, while others with tight teaching schedules refused because of the disruption to regular class activities.
On occasion, despite formal agreement from a school administration to participate in the survey, classroom teachers remained reluctant to take part because of the time-consuming work of encouraging students to return consent forms if they wished to participate. In our view, such issues should be a concern not only to researchers, who rely on school co-operation as a straightforward means of access to adolescent populations, but also to university ethics committees. Ethics procedures understandably tend to focus far more on issues of consent among students than on the additional burdens being placed on frequently overworked and busy teaching staff.

In Australia, there is little room to shift the informed-consent process from active to passive, as "consent to a child's or young person's participation in research must be obtained from … the parents/guardian in all but exceptional circumstances" (National Health and Medical Research Council, 1999). In these guidelines, a young person is defined as someone who has the maturity to consent without parental involvement; in practice, however, ethics committees appear to err on the side of caution and classify everyone under 18 as requiring active parental consent. While ethically grounded in protecting parental rights over children's activities, active-consent requirements have negative effects on participating schools and on data validity (e.g., Bridwell, Ford, Ewing, & Ferguson, 1999; see also Haggerty, 2004). As McPhee and Canham (2002) point out, active-consent procedures produce low response rates and, likely, a biased subject population. Moreover, classroom teachers note that active-consent requirements create burdens for teachers, who inevitably facilitate the bulk of this process by reminding students (often daily) to return consent forms signed by a parent or guardian. For these reasons, McPhee and Canham rightly state that researchers could do a lot to take the pressure off schools by managing the process of obtaining active consent themselves. However, in Australia, there are legal impediments to implementing their suggested strategy of "mailing consent forms directly to parents, tracking responses, forwarding reminder slips, conducting telephone follow-ups …". While this process seems ideal, there are impediments to Australian research following this path, because ethics protocols are bound by the 'Information Privacy Principles' derived from the Privacy Act 1988 (Commonwealth). In accordance with these privacy principles, schools can only release parental contact details under extreme or life-threatening circumstances. Thus, in our own context, it would be inappropriate for researchers, rather than schools, to liaise with students and parents. This means that only the school can perform any mail-outs or initiate personal contact with parents, which effectively rules out the prospect, at least in Australia, of researchers actively managing the informed-consent process in the manner suggested by McPhee and Canham's Canadian work.

In addition to modifying consent procedures, there may be other aspects of methodology that could relieve some of the pressure on schools. One of these is the requirement for teachers to administer and retrieve surveys. Thus, the aim of our trial was to investigate whether an on-line methodology could alleviate some of these difficulties and whether it could be used in future cross-jurisdictional studies potentially involving many hundreds of schools.
The principal advantages we envisaged from using on-line surveys were that they would require minimal teacher supervision and would allow electronic access to, and retrieval of, surveys. This would mean that students could undertake the survey at any time of the day, and it would eliminate the need for researchers to make repeated visits to schools to collect surveys and hand out reminders. Reminders could instead be sent by e-mail to each student, with the names of the nonrespondents suppressed.

The Canberra study

Prior to the completion of this survey, relatively few studies of adolescent gambling had been conducted in Australia (e.g., Delfabbro & Thrupp, 2003; Moore & Ohtsuka, 1997; Moore & Ohtsuka, 2000; Victorian DHS, 1999), and nothing was known about adolescent gambling in the national capital, Canberra. Thus, the aim of the study was to extend previous Australian findings by investigating the prevalence of gambling, gambling-related beliefs, and problem gambling in a sample of Canberra schools. Eighteen schools agreed to participate, and a total of 926 completed surveys were returned. Students were drawn from years 7 to 12, with an age range of 11 to 19 years (only 4.2% were 18 years or older). For each of the participating schools, the methodology involved a pencil-and-paper survey administered by teachers in classrooms.

The research process involved several stages. After permission had been obtained from the relevant education boards, school authorities, and teachers at the respective grade levels, the first stage was to distribute consent forms to students. In some schools, access was provided to all school grades, whereas in others it was only possible to obtain the cooperation of teachers of specific year levels or particular classes. Teachers who agreed to survey their classes were asked to distribute the consent forms to students and to ask them to return the forms by the following week. Each take-home set of documents included an information sheet for both parents and students as well as two consent forms. For students to be able to participate in the survey, they had to return the consent form signed by their parents. This further reduced the eligible population for the study to those students who obtained active parental consent (45% of all surveys handed out). Some teachers specifically set aside class time to administer the surveys themselves, whereas others arranged times when the researcher could be present during survey administration. Of the total number of pencil-and-paper surveys, 56% were supervised by the researchers and 44% by teachers. The project results are reported elsewhere (see Delfabbro, Lahn, & Grabosky, 2005).

The same active written-consent procedure was followed for the on-line version of the survey: both parents and students were required to give written consent prior to participating. In addition to the information contained on the take-home information sheet, a privacy statement was attached to the survey describing the project, the purpose of data collection, how the data were to be used (e.g., publications), how long the data were to be kept, the contact address of the researchers, how the Web site was secured, how the data were to be secured, and a warning about the insecurity of the Internet as a means of transferring data, as well as a link to the university's disclaimer and its own privacy statement. Two methods were tested to allow students access to a single on-line survey.
In the first method, a Web site address was distributed, along with single-use user identifications and passwords, to each student who provided a consent form. Passwords and user IDs were not linked to particular individuals. The second method involved e-mailing students, at the preferred e-mail address they gave on the consent form, a hotlink to the survey. In total, the on-line survey was completed by 21 students.

To build our on-line survey, the first author used Apollo, a Web-based polling package designed by the Australian National University.1 No technical problems were encountered with the Apollo software during the survey period. Apollo has a number of features.
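To illustrate the first access method, the following is a minimal sketch, in Python, of how a batch of anonymous, single-use access codes might be generated and stored. It is not the mechanism used by Apollo; the function names, code format, and CSV storage are assumptions made for the example.

```python
# Illustrative sketch only: generate anonymous, single-use access codes of the
# kind described for the first access method. This is NOT the Apollo mechanism;
# names, file formats, and quantities are assumptions made for the example.
import csv
import secrets

def generate_codes(n, length=8):
    """Create n random access codes that are not linked to any student identity."""
    alphabet = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"  # avoids easily confused characters
    return ["".join(secrets.choice(alphabet) for _ in range(length)) for _ in range(n)]

def save_codes(codes, path="access_codes.csv"):
    """Store issued codes with a 'used' flag so each one can be accepted only once."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["code", "used"])
        for code in codes:
            writer.writerow([code, "no"])

if __name__ == "__main__":
    # One code per student who returned a signed consent form (number is illustrative).
    codes = generate_codes(n=30)
    save_codes(codes)
    print(f"Generated {len(codes)} single-use access codes.")
```

Because the codes are random and stored without any student identifiers, a completed survey cannot be traced back to an individual, mirroring the anonymity arrangement described above; marking a code as used once a survey is submitted is what makes it single-use.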
Advantages of the on-line survey

The on-line survey format enabled a number of enhancements to the existing pencil-and-paper survey:

Eliciting precise answers from students (e.g., more than one response could not be recorded for single-item response formats).

The on-line survey enabled us to obtain more precise answers. For example, in the pencil-and-paper surveys, respondents were asked to indicate their gender; a small number of respondents ticked both boxes, male and female. In the on-line format, respondents could only choose male or female. Moreover, the gender question, along with the other sociodemographic questions, was mandatory and had to be answered before moving on. A further example is that in the pencil-and-paper surveys, where respondents were asked to rate their activities (none or very little, some of the time, a lot of the time, most of the time), some students placed ticks on the line between columns, creating responses that were difficult to code. In the on-line survey, students had to make a choice among the answer options, preventing them from indicating a middle position.

Enhanced privacy protections for students, as surveys are immediately secured on a password-protected Web site.

This is a marked improvement over surveys being administered by teachers and later retrieved by researchers, where there is a danger that staff may examine or misplace the completed surveys, or hand them on to the wrong researcher. The latter happened during the course of the pencil-and-paper surveys, when surveys belonging to another university researcher were mistakenly given to the first author. Some schools and school staff do not have the time to properly manage survey retrieval. The advantage of on-line surveys is that they can be stored on the Web and password protected so that only researchers can access the data. If they wish, researchers can generate paper copies of the on-line surveys in a controlled university environment. A further advantage is that students can choose to complete the survey privately, without any teacher or other adult supervising. This is important given the potential for gambling to be a sensitive topic for some participants (see Chambers, 2003).

Minimising pressure on school resources.

Teachers do not have to retrieve and store surveys for collection by a researcher; retrieval can occur electronically.

Rapid and accurate data entry.

Apollo surveys can be downloaded into SPSS. This circumvents lengthy data entry and the costs to the research budget of hiring data-entry personnel.

Facilitating cross-jurisdictional studies.

Across Australian states, different types of gambling are available, and different age thresholds apply to their legal use. For example, there are no poker machines (VLTs) available outside casinos in Western Australia, and in South Australia adolescents are legally able to purchase scratch tickets at 16, two years earlier than in other states. Given these differences in the gambling environment, there is merit in conducting large comparative surveys of adolescents across a number of state jurisdictions. On-line surveys would facilitate such a process by eliminating the need for large, multi-sited teams of researchers: in a larger survey drawing on a sample from multiple state jurisdictions, no collection staff would be required at each location, as completed surveys would be sent automatically to a host Web site.

Disadvantages of the on-line survey

Potentially no supervision.
As the surveys could be completed at any time, the social context in which students completed the survey could not be controlled or known. This is the downside of the time saved by removing the need for supervised survey administration. Although students have the advantage of undertaking the survey at a time or location that may be more convenient for them, researchers cannot prevent students from undertaking the survey under less than desirable conditions (e.g., with music playing or friends present) or with other survey participants in the room. Further, there are no researchers available to respond to queries about the on-line survey. From our experience in conducting the pencil-and-paper surveys, this is an important consideration, particularly for younger participants, who tended to consult available researchers more frequently than their older counterparts.

No opportunity for data validity checks at the data-entry phase.

For the pencil-and-paper surveys, some written comments and visual patterns of responding were useful in identifying the approximately 1 to 1.5% of aberrant responses at the data-entry stage. These included answers appearing over several pages in a series of 'z' patterns. Pencil-and-paper surveys allowed a two-stage data-checking process: the first stage occurred at data entry, where unusual response patterns and contradictory responses were tagged, and the second occurred when cross-checking responses using SPSS. It is worth noting that all suspect surveys identified during the data-entry phase were independently flagged during cross-checking in SPSS, without visual inspection of the paper surveys. Statistical identification of aberrant responding (e.g., as indicated by illogical findings such as scored problem gambling items amongst students who did not gamble, or inconsistent responses to semantically similar items) would still be possible for data obtained on-line (an illustrative sketch of such a check is given in the concluding section below), but without the capacity to visually inspect the paper survey for other evidence of noncompliance with the survey requirements.

Reduced space for participant commentary.

A number of students completing the pencil-and-paper survey wrote messages on, or illustrated, the surveys and the envelopes provided. These students commented on specific questions that they felt were unnecessary or difficult to understand, and their illustrations may have indicated when they were getting bored. The on-line survey provided one space for commentary at the very end of the survey, and most students wrote nothing in the space provided.

Methodology only effective where students have Internet access.

The on-line survey methodology can minimise the involvement of school staff and researchers in survey administration where students fill out the on-line survey outside class times. However, under this arrangement, students from low socioeconomic backgrounds have fewer opportunities to complete the survey, as they are less likely to have Internet access at home; their only Internet access may be at school.

Reduced survey completions.

Not all students who returned consent forms and received a user ID and password completed a survey. Only 70% of students returning consent forms completed the on-line survey, despite a reminder e-mail being sent. This raises questions of representativeness. In the case of classroom-based pencil-and-paper surveys, every person who returned a consent form and attended school on the day of the survey completed it. It may be worth investigating the reasons behind student noncompletion of surveys.
For instance, it may be that completing a survey is reminiscent of schoolwork, or simply that there are more interesting things to do on the Internet than complete a survey about gambling.

Conclusions concerning on-line methodologies

On-line surveys can potentially deliver enormous resource savings for schools and researchers. On-line surveys also afford considerable protection of students' privacy by allowing them to complete questionnaires privately and by eliminating the possibility of lost surveys or of staff examining the surveys before passing them on to researchers. Such methods also have advantages over telephone interview methods in that the cost is minimal, no call-backs are required, and there is no danger of other people (e.g., parents) overhearing the young person's responses. In addition, as with computer programs designed to administer telephone surveys, data entry is replaced by direct downloading of survey data into SPSS. On-line methods therefore seem particularly suited to conducting multijurisdictional, national, or international studies because a survey collection person is not required for each location (Fox, Murray, & Warm, 2003).

The main disadvantage was that completion rates for the on-line method were poor by comparison with pencil-and-paper surveys. Completion rates could be improved in two ways. Firstly, in traditional surveys, incentives (such as money and product vouchers) have been used to improve completion rates. Such initiatives could easily be incorporated into the research process, though this does raise issues about the quality of the resulting data (see, e.g., Davern, Rockwood, Sherrod, & Campbell, 2003). Secondly, completion rates could be improved by paying greater attention to the creation of a more visually and technically appealing on-line survey interface. Such interfaces could be designed, tested, and modified in conjunction with adolescents. Efforts to make on-line surveys more appealing are likely to be important given that participants are solely responsible for completing the on-line survey when neither a teacher nor a researcher is present.

However, in considering the poor completion rates, it is important to bear in mind that those students who returned a completed consent form and received a password and user ID but did not complete an on-line survey were exercising their right to refuse to participate, in the absence of the researcher or teacher. Schools are hierarchical institutions, and the unequal distribution of power between staff and students raises questions about the extent to which genuine consent, relying on a high level of student agency, is possible in school contexts. There is an inherent danger that, in basing our research in schools, students may actually feel compelled to consent to participate (Forster, 2003). Students may also wish to participate in surveys during class time because it offers a break in class routine. It is likely that the option of completing on-line surveys away from school staff, and from researchers, is actually a positive outcome for students, as it enables them to assume greater control over their ability to consent to participate in research. However, this is not a desirable result for researchers and may require the inclusion of more extensive feedback or reward structures so that students receive greater benefits and/or compensation for their time.
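Because on-line responses are downloaded in electronic form, the statistical screening for aberrant responding noted among the disadvantages above can be scripted directly on the exported data. The following is a minimal sketch in Python using pandas; the column names, codings, and file name are assumptions made for illustration and do not reflect the actual survey instrument used in this study.

```python
# Illustrative sketch only: flag internally inconsistent on-line survey responses.
# Column names, codings, and the export file name are assumed for the example.
import pandas as pd

def flag_aberrant(df):
    """Return a boolean Series marking respondents whose answers contradict each other."""
    # Check 1: problem-gambling items scored by students who report never gambling.
    never_gambled = df["gambled_past_year"] == 0
    problem_item_cols = [c for c in df.columns if c.startswith("pg_item_")]
    scored_problem_items = df[problem_item_cols].sum(axis=1) > 0
    contradiction_1 = never_gambled & scored_problem_items

    # Check 2: inconsistent answers to two semantically similar items.
    contradiction_2 = (df["spends_money_gambling"] == 0) & (df["weekly_gambling_spend"] > 0)

    return contradiction_1 | contradiction_2

if __name__ == "__main__":
    responses = pd.read_csv("online_survey_export.csv")  # hypothetical data export
    responses["aberrant"] = flag_aberrant(responses)
    print(responses["aberrant"].value_counts())
```

Cases flagged in this way could then be reviewed or excluded in much the same way that suspect pencil-and-paper surveys were set aside at the data-entry stage.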
From our trial on-line survey, it seems likely that, while hampered by active-consent procedures, on-line methods of administering surveys can indeed ease the pressure on school resources, secure greater participant privacy, and save on data-entry time for researchers. These outcomes are worthy of further testing, particularly in the conduct of large-scale, cross-jurisdictional studies of adolescent gambling. It seems clear from our pilot survey that the conduct of cross-jurisdictional studies of youth gambling can benefit enormously from placing our surveys in cyberspace. Given the reduced participation rates among students who returned a consent form but failed to complete a survey on-line, one option could involve conducting on-line surveys within class periods with teachers supervising; most Australian schools have computer labs, and students have their own e-mail accounts. From this study, it seems that cyberspace does have benefits for research, but classroom supervision may still be the best way to ensure maximum student participation rates. Nevertheless, it would be useful in future research to conduct a more extensive investigation of the utility of the two on-line methodologies developed for this pilot study, to determine whether such methods could be used to conduct national surveys that avoid both disruptions to school routines and significant costs to researchers.

Acknowledgements

We thank Adrian Thompson, DOI Business Solutions, at the Australian National University, for assistance with the Apollo software, and two anonymous reviewers for their thoughtful suggestions on an earlier draft. Any errors remain the responsibility of the authors.

References
Manuscript history: Submitted January 27, 2005; accepted June 7, 2005. All URLs were available at the time of submission. This article was peer-reviewed.

For correspondence: Dr Julie Lahn, LPO Box 8279, Australian National University, Canberra, ACT, 2601, Australia. E-mail: julie.lahn@hotmail.com

Contributors: JL is a co-researcher attached to the Adolescent Gambling project. She is the primary author of this paper. She transferred the project survey into an on-line format and administered the survey to school students. PD is one of two chief investigators on the Adolescent Gambling project. He designed the survey and is the originator of the idea to test an on-line format to compare it with the paper format. PG is the second of two chief investigators on the Adolescent Gambling project. He engaged in all discussions concerning the design, implementation, and results of the trial.

Competing interests: None declared.

Ethics approval: Human Research Ethics Committee, The Australian National University, gave ethics approval for the study "Adolescent gambling: Prevalence, risk factors and opportunities for controls and interventions" on June 27, 2003 and November 4, 2003.

Funding: PG and JL are employed by the Australian National University. PD is employed by the University of Adelaide. PG and PD were awarded funding for the project by The Australian Research Council and the Australian Capital Territory's Gambling and Racing Commission (LPO348759).

Julie Lahn, PhD, is a researcher in the Centre for Gambling Research, Regulatory Institutions Network (RegNet) at the Australian National University, Canberra. She holds a PhD in sociocultural anthropology and her interests lie in the application of ethnographic and other qualitative research methods to the study of gambling as an intercultural social practice. She has contributed to Australian gambling-related research in relation to offenders and, most recently, with adolescents, with Dr Paul Delfabbro and Professor Peter Grabosky.

Paul Delfabbro, PhD, is a senior lecturer in the Department of Psychology, University of Adelaide, South Australia, teaching statistics, social psychology, language development, and learning theory. He has numerous journal articles and conference presentations on gambling-related topics and on adolescent adjustment, foster care, parenting, and methodology. In addition to his gambling prevalence research with adults and adolescents, he has conducted applied experimental studies on irrational thinking in gambling and the application of learning principles to real-life gambling behaviour. He has acted as a consultant on state and federal government projects in Australia, and has been the principal supervisor of postgraduate projects in the area of gambling.

Peter Grabosky, PhD, is a professor in the Regulatory Institutions Network (RegNet) of the Research School of Social Sciences at the Australian National University. He holds a PhD in Political Science from Northwestern University and has written extensively on regulation, public policy, and criminal justice. Professor Grabosky's general interests are in the use of nongovernmental resources in the furtherance of public policy. In 2002, he was appointed to the National Advisory Body on Gambling.

1 Details about Apollo can be found at https://apollo.anu.edu.au/default.asp?script=true.