Too many sources, too little time...

A few days ago I was worried I wouldn't have enough sources to fill up my literature review. Now, I'm worried I'll have too many! In order to broaden my scope of sources, I consulted the bibliographies of my already-selected sources for helpful texts. I read a few interesting and relevant ones, which then led me to more studies of interest, and so on and so on. It was as if the floodgates had opened, and I couldn't stop the material from pouring in. This is the problem with research, especially research in a popular area: there's so much material! I couldn't possibly read every important content analysis study ever written, but with so many out there, I can't help but feel like if I don't look at them, I'll be missing something. That's just a feeling I'm going to have to get used to, because in research there's probably always going to be something you're not reading. I guess the important thing is to read as much as you can of the fundamental, cornerstone works that define your field, so that even if a new study appears on some small part related to yours, the foundation of your own study remains intact.
That is not to say that I've neglected my research; it's just that when using a popular method and studying in a broad field, widening your search parameters can be both useful and problematic. Too narrow and you've got nothing; too wide and you've got everything, and that is simply too much to go through in a few weeks.

A New Appreciation for Quantitative Methods

I am currently finishing up an edition review for another class, in which I discuss editorial decisions made throughout the Riverside Chaucer (1987). The research conducted is quite familiar to me, mainly focussing on literature and content analysis, with some manuscript studies in there as well. In doing my research for this assignment, I was struck by how similar these methods are to some of the studies we have discussed in class, especially ethnography and discourse analysis, as well as some of the more quantitative methods.
In compiling an edition of the Canterbury Tales, scholarly editors have to go through all of the existing (and acceptable) manuscripts and collate them, determining variants in spelling or word choice, etc., in order to decide which manuscript to use as a base text, or whether to compile several manuscripts in order to create a new edition. I had never thought of it before, but thanks to my newfound knowledge of research methods, I realize that doing this work involves far more quantitative analysis than I had ever appreciated. You would have to count and record every variant in every manuscript, and then use your data to decide which edition is most accurate for your purposes. This is a far cry from the textual analysis I had always seen it as.
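Just to picture for myself how quantitative this collation work could get, here is a tiny, hypothetical Python sketch of tallying the readings of a single word across a handful of witnesses; the manuscript names are real Canterbury Tales witnesses used only as labels, and the readings are invented for illustration rather than taken from any actual collation.

    from collections import Counter

    # Hypothetical readings of one word across five witnesses (values invented for illustration)
    readings = {
        "Hengwrt": "whan",
        "Ellesmere": "whan",
        "Corpus Christi": "whanne",
        "Harley": "whan",
        "Lansdowne": "whanne",
    }

    # Tally how often each variant occurs - the kind of count an editor would
    # record for every variant in every manuscript before choosing a base text
    tally = Counter(readings.values())
    print(tally.most_common())  # e.g. [('whan', 3), ('whanne', 2)]

Multiply that by thousands of lines and dozens of witnesses, and the scale of the counting becomes obvious.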
I have been doing research in this area for a few years now, but it was not until this past week that I saw manuscript studies as being so data-centric and quantitative, something I can surely attribute to my introduction to those methods in this course.

Adieu or not?

Just as research is never-ending in the life of a researcher, I do not want this blog to be totally over for us. I started out on an uncertain process, as I had never blogged before, nor was I very fond of publishing my private thoughts. But this was another domain, one where you shared your ideas with your peers, as well as with the world to see! Sometimes I was unsure whether to be formal, and at other times I just gave in to the feelings, the tensions and the questions that overwhelmed me. Gradually a realization came over me that this was also a way of finding out for ourselves what we were searching for at that moment in time. It was like research into our deepest thoughts and obligations, brought out in print.

Now, as I struggle to make my real research more meaningful, my methods click, and my hypothesis achievable, I feel that I have certainly gained as much from this blogging and unravelling of my thought process as from the Research Methods class itself.

Texts and more...

Around the beginning of the Research Methods course, I was excited to have found a book like Luker as a text. I did not have the patience then to wait for each class and read the assigned chapters; I went through the entire book immediately in the hope of learning about an important aspect of the academic world, and I enjoyed it immensely. I finished the preliminary reading within a few days, but the thoughts lingered with me. As I started work on the research proposal, I realized the necessity and utility of Knight. Knight, and other research texts I have come across in the past, somehow fill me with more confidence about the task at hand. They provide a rigorous understanding of the definitions, descriptions and explanations of methods, and of how to go about practically doing them. Luker, in retrospect, is more of an overview for me. Even though it is not "a sugar-coated bitter pill", in the words of Shaw, Luker's book nicely and gently introduces us to research and tries to rid us of some of the inherent fears we may have of committing to and accomplishing such work.

Comparing Luker and Knight close to the end of the proposal provided me with an understanding of how much information and knowledge I needed from whom, and when. I realize that it has surely been fruitful to experience the different ways of approaching a subject and to accept the various perspectives, as that has allowed us to open up our minds and enabled us to see more.

Source Dilemma

I'm having a bit of a dilemma right now over a certain source and whether to use it or not. First of all, the research I'm proposing involves computer-mediated discourse analysis and political discussions in certain forums online. Sound familiar? It's similar to the research conducted by Kushin and Kitchener in their article "Getting political on social network sites", on which I did my peer review assignment (I had picked my topic long before reading these articles). My dilemma is that in reviewing this article and the research conducted, I am aware of all the flaws in the research design and implementation. However, their research would be very useful in order to provide a sort of background and support for the research I'm proposing. But since I have found it to be flawed, is it ill-advised to use this article as a source in my proposal?
In fact, I think it might not be that much of a dilemma after all. The only similarities between their study and mine are the method employed (which isn't unique to their study) and the general subject area of focus. Also, it might be easier to justify the need for my research if one of the fundamental studies in this area has serious flaws and biases all the way from research design through to implementation and data analysis. I don't think I really have a choice but to use this study as one of my sources, as long as I acknowledge its weaknesses and don't repeat them.

Thankful for Salsa Dancing

I'm currently working on my research proposal, and I'm finding that the methods section is giving me more trouble than I expected. I'm not used to doing research in a way that requires me to define my method and explain its parameters, but I guess doing discourse analysis with human subjects is a bit different from a literature analysis. I am struggling a bit to define my method, as I will be drawing on a few different ones in order to do my hypothetical research. Herring's computer-mediated discourse analysis will likely serve as my main method, but I am not ruling out drawing on other methods as I move forward and as they are appropriate. However, this leaves me trying to find out what else would be useful in my study, so I've been doing a lot of research on research methods (which is a bit too meta for me).
Luker's salsa-dancing social scientist idea has never been more useful to me than now, because picking one method and sticking with it seems to be a bit too narrow a path for me and my research tastes. Following her approach allows me more freedom when deciding on a method for my research, which suits me perfectly, especially when the research I'm proposing would require a wider range of methods in order to reach any conclusive findings.

User-centred research at U of T

The University of Toronto has its own guidelines for researchers, and I thought I would have a look at them given that we are having a guest speaker who can address our concerns. I also plan to have University of Toronto students as my user base for a part of the study. I would then need to "gain access to data about students, staff and faculty held by the University of Toronto". Interestingly, the Office of the Vice-President and Provost's site notes that a considerable amount of research is conducted involving members of the university community. The Office of the Vice-President, Research has a Research Ethics Board (REB), which has to approve the research to be done and must be contacted at the research planning stages. I was excited to learn that I could request data held in the Repository of Student Information (ROSI), the Human Resources Information System, or collected through surveys such as the National Survey of Student Engagement (NSSE). This would help in conducting research on 'student users' from various departments independently of each other.

Guidelines are provided in keeping with the following aims (which sounded very thoughtful to me, with respect both to those researched and to the researcher): "to prevent survey fatigue, protect confidentiality and employee rights, and ensure that access does not conflict with any current or planned research to be conducted by the University or its administrative/academic units".

Disclosing Cognition

The ethics of research was an interesting topic to read about this week (though, to continue my last post's line of thought, this segmentation is a bit artificial for me: it's not 'ethics' as a subcategory of 'research'; rather, ethical choices seem to underlie the entire research initiative). My own research topic doesn't dive very deep into these professional ethics, but Knight's discussion of achieving 'disclosure' with participants touched on what I've been grappling with lately.

My research proposal began with a strong interest in the cultural meaning assigned to an object (a typeface), so I have spent a lot of time working out my historical analysis methodology. I had always worried that this approach would veer off into a grand narrative that might lose contact with the current world. In my proposal, I counterbalanced this method with the addition of a focus group to maintain the individual perspective.

Describing the focus group in more detail has been a struggle; how do you test for the psychology of aesthetics? While Knight suggests building a trusting relationship or taking an insider approach to understand what the participants think, I doubt that our aesthetic preferences are so well thought out that they can be readily articulated by the participants.

User and research

A critical aspect of user-centred research is understanding who the potential users are, especially because early user involvement is a primary principle of user-centred design. This can be extremely relevant to HCI research as well. Though HCI research concerns the user experience and the probable usability of a website, interface or software, it does not always involve user participation from the initial stages. User-centred processes try to include the actual users in the development process at the earliest possible time, in an effort to correspond to the needs and behaviours of the users.

As I progress with my proposal I realize how much the method can be linked to the area of study. That is because my research findings are dependent on user feedback. My quest depends as much on how I embark on it as on the findings themselves, since querying usability becomes an important part of the study too. When conducting user research, theorists recommend that several methods be used in order to obtain rich qualitative data that helps build a holistic view of the studied user group. Thus the most common methods used include interviews, observations and questionnaires, with other methods such as cultural probes or artefact analyses being applied at times too. Interestingly, it also falls under the purview of exploratory research into the unknown.

Murphy's Law

Knight brings up a very important point in this week's reading: technology and the human element have the potential to present very real headaches during the course of interview research! I'm positive that everyone has had issues with technology at one point or another. In my personal experience, I have forever been cursed with computers that have some hardware issue or another. Even the overpriced laptop I currently own locks up randomly due to a faulty processor. Due to these wonderful experiences I have learned to have data backed up to at least three hard drives/flash memory/etc. at any given time. I'm sure these habits are applicable in the research realm. Knight even mentions that data should be collected through several recorders at once.

The human element can be equally hit and miss. Knight says to “try and distribute tasks so as to use partners' strengths and avoid [the] weaknesses [of research assistants]” (p. 163). I'm sure everyone has been in group projects in which everyone contributes, and others in which no one seems to contribute. The only method that seems to work in the latter situation is to assign tasks (in my experience). I suppose that it is equally challenging to develop camaraderie with research subjects.

-Martin

Ugh...Consent...this is a long one

Reading the required bits of the University of Toronto Office of Research Ethics Guidelines and Practices Manual while trying to write my proposal has been very informative yet also frustrating. So many things to consider! In my research I am looking at two age groups, and unfortunately one of those groups is teenagers, aged 13 to 18. So I don't just have to convince them to participate; I have to convince their parents or guardians to let them. For my proposal I have written a rough consent letter, which I shall post here. Please let me know if I've made any errors or left anything out...thanks!

Dear Parent/Guardian,

Your child has been invited to participate in an academic study run by a graduate student in the iSchool at the University of Toronto. This study will be supervised by Prof. Awesome of the U of T iSchool faculty and will follow all guidelines designated by the University of Toronto's Office of Research Ethics.

The purpose of this research is to investigate how a female's personal identity is internally constructed and perceived through the use of digital media. The study examines two age groups of females, 13-18 year olds and 35-40 year olds, and compares the results. We hope to gain insight into the thoughts of women introduced to technology later in life as compared to those born into it. This study will contribute to discussions surrounding women, identity and the use of digital media. Following David Gauntlett's "Making is Connecting" method (www.artlab.org.uk), the participants will create visual representations of their identity both online and offline and then discuss and reflect on what they have created. The participants will spend 3 hours on a Saturday afternoon on the 5th floor of the University of Toronto Bissell Building and will be compensated with community service hours.

Your child has been approached because she is a female of the correct age and she uses digital media daily. If she chooses to participate, she will be working individually but discussing her creation in a group of 6 of her peers. In the session itself, the participants will sign confidentiality agreements with one another, agreeing not to share others' personal thoughts and feelings.

Your child's participation is completely voluntary and she may refuse to participate or withdraw at any time. However, upon withdrawal the allotted community service hours will not be given. Photographs will be taken of the visual representations created, and video recordings will be made of the discussion. The photographs will be used in the research findings presentation, but the video footage will not. Video recording will be used only by the researchers to identify the participants and link each person to what they say. The recordings are completely confidential and will not be used outside of transcribing the study. Some sections of the transcript may be used in the findings presentation, but no name or identifying characteristics will be attached to them. If you have any questions regarding any aspect of this study, please contact the Office of Research Ethics at ethics.review@utoronto.ca or 416-946-3273, or the researcher at aurianne.steinman@utoronto.ca.

Please sign below to allow your child to participate in this study.

Second research daisy

Since one of my first posts for this course was of a research daisy for my intended course of study, I can't resist posting a second daisy, one that is closer to what I will probably submit with my research proposal. This one illustrates the fields involved in my case study - of one aspect of the United States Government's open data system. It was done using OmniGraffle, the equivalent of Visio for Macs.

Yin on case study research

Since I'm looking at doing a case study for my thesis and we've been discussing Robert Yin in class, I borrowed his book Case study research: Design and methods (2003) from the library. This book has been extremely useful in building the justification for a case study design and in understanding and addressing the strengths and weaknesses of the approach. When defining the research approach, Yin notes that "a case study is an empirical inquiry that investigates a contemporary phenomenon within its real-life context, especially when the boundaries between the phenomenon and context are not clearly evident" (p. 13).

By pressing the importance of examining the phenomenon within its context, Yin contrasts case studies with, for example, experiments, where phenomena can be tested removed from their environment - in the laboratory context. This makes me think of Walsham's synthesized framework, which many of us studied in INF1003. In Interpreting Information Systems in Organizations, Walsham suggests different points of entry for IS research, including examining context, content and processes. Walsham also wrote a detailed methodological article called "Doing interpretive research" (2006). Because interpretive IS research fits well with the case study approach defined by Yin, I think that the two authors complement each other well. Chapter 7 of Knight's Small-scale research (2002) also gives very practical advice on data collection and interaction with research subjects.

Interestingly, Yin notes the weaknesses of case studies, particularly single-case studies, which do not benefit from the comparative element of multi-case studies. He explains that, like single experiments, single-case studies are vulnerable to misinterpretation and access issues. This loops back to last week's post, in which I briefly discussed the importance of, and anxiety related to, obtaining appropriate access to the subjects of the case study. I do think, however, that Yin's six sources of data for case studies - documentation, archival records, interviews, direct observation, participant observation and physical artifacts - can address these problems. When studying online information systems, some of this data can be collected through interaction with the system itself and by accessing publicly available records.

Going back to the question of studying phenomena within their environments, I am also reminded of Bruno Latour's commentary on the separation between lab and field research in science, in Pandora's Hope. As an outsider, he is struck by the abstraction and subjectivity necessary for lab studies in botany. He explains that a plant sample, for example, has no meaning outside of the context in which it has evolved and that the recall of the field researcher is necessary to fill in that context. When using the case study approach, it may thus be possible to reduce the gap between the case itself and the researcher's abstraction and categorization, as it is never removed from its context.

The Morality of Researchers

One of my favourite insights into research methodologies has been Knight's remark that "the most important thing in small-scale research is to be mindful about the sorts of claims that research is intended to enable" (p. 114). While a popular arena for considering the impact of claims is the academic field (with peer reviews, for example), I've started to consider research methods as having an impact on the social arena (not in the practical translation of the results, but in how the methodology is a reflection of society back onto itself). Isn't there a moral backbone required in claims-making, and if so, where does it come from; what instructs researchers of the value of their research claim? Not to suggest that researchers all value the same types of claims, because research values are as diverse and multiple as the individuals. But I am wondering where Luker's "imposition of a schema on the social world" (p. 214) comes from, and whether its ontology matters.

Our Research Question, and Hypotheses of the Project

Before we begin writing a grant proposal, we should take some time to map out our research strategy. A good first step is to formulate a research question. A research question is a statement that identifies the phenomenon to be studied. For example, "What resources are helpful to new and minority drug abuse researchers?"
To develop a strong research question from our ideas, we should ask ourselves these things:
  • Do I know the field and its literature well?
  • What are the important research questions in my field?
  • What areas need further exploration?
  • Could my study fill a gap? Lead to greater understanding?
  • Has a great deal of research already been conducted in this topic area?
  • Has this study been done before? If so, is there room for improvement?
  • Is the timing right for this question to be answered? Is it a hot topic, or is it becoming obsolete?
  • Would funding sources be interested?
  • If you are proposing a service program, is the target community interested?
  • Most importantly, will my study have a significant impact on the field?
Think about the potential impact of the research we are proposing. What is the benefit of answering our research question? Who will it help (and how)? If you cannot make a definitive statement about the purpose of your research, it is unlikely to be funded. A research focus should be narrow, not broad-based. For example, “What can be done to prevent substance abuse?” is too large a question to answer. It would be better to begin with a more focused question such as “What is the relationship between specific early childhood experiences and subsequent substance-abusing behaviors?”

Luker vs Knight

The combination of readings this week was really interesting, as both Luker and Knight discuss very similar things. So similar, in fact, that it really shows how much their writing and teaching styles diverge from one another. Even the titles of the chapters are indicative of the differences: Luker's 'Getting Down to the Nitty-Gritty' versus Knight's 'Research Design: Bringing it All Together'. Both discuss the synthesis of the research and method elements that we have learned about thus far. While Luker and Knight both use helpful anecdotes in giving examples, they part ways when it comes to their chosen teaching method.

Luker tends to coddle the reader a bit by repeatedly accounting for any negative feelings they may have about their own research. She discusses anxiety and reiterates important questions and information over and over again. Luker is a bit of a hand-holder, which I totally appreciate, as most of us are going into unknown territory and may not even know exactly what we are studying until the end of our study. It is like she is prepping us to take a leap of faith. Luker also leaves a lot of room for the creativity of the researcher by making only loose suggestions about what to do, how to do it and what has worked for her.

Knight on the other hand is a lot more pragmatic in all matters of research design and implementation. He provides great detail and excellent definitions for all aspects of small-scale research. Knight injects just enough charm so the reader does not die of dehydration. He does cram an insane amount of information into each chapter, so much so that it can seem a bit overwhelming. He is quite a bit more objective in his teaching method and writing style than Luker.

The truth is, I go to Luker when I'm feeling confused and like I don't really know what I'm doing; reading her is like getting a big hug and a pat on the head. I go to Knight for serious information and guidelines when I already have an idea of what I want to do.

I realize that the different ways Luker and Knight tackle small-scale research are the reason both of these textbooks were chosen for this class. I just find it really interesting how two people took basically the same method and design information and created such different but equally helpful artifacts.

Cronbach's alpha and research anxiety

As I was going through Luker (2008) today wondering what I might write about for my blog post, I hit upon a completely new concept: Cronbach's alpha, a coefficient of internal consistency. As Luker explains, once you've coded your data and created a coding book (a list of what codes relate to what themes and sub-themes), you ask someone who doesn't know your research to use it to code a sample of your data. Then, you run the test on the resulting analysis, along with your own analysis of the same data, and have a measure of the extent to which you have been consistent in coding.

Because I'm not a mathematician, I imagine this could be done with SPSS. In fact, in an FAQ document, SPSS shows the formula "for conceptual purposes" before giving more familiar (to me) screen shots of what this would look like in the program.



I liked Luker's coding by hand methodology, but wonder how long it would take to input the data for the blind-coded sample into SPSS in order to run the test. It might be easier to sit down with a "for dummies" book and learn how to do it myself.
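Out of curiosity, I tried to picture what computing the coefficient outside of SPSS might look like. Below is a minimal Python sketch, under the assumption that my codes and the blind coder's codes have been turned into numbers and lined up segment by segment; the function name and the sample data are my own inventions for illustration, not Luker's procedure or SPSS output.

    import numpy as np

    def cronbach_alpha(ratings):
        # Cronbach's alpha for a cases-by-coders matrix of numeric codes
        ratings = np.asarray(ratings, dtype=float)
        n_items = ratings.shape[1]                          # here, the number of coders
        item_variances = ratings.var(axis=0, ddof=1)        # variance of each coder's column
        total_variance = ratings.sum(axis=1).var(ddof=1)    # variance of the per-segment totals
        return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

    # Hypothetical codes for ten text segments: mine vs. a blind coder's
    mine = [1, 2, 2, 3, 1, 2, 3, 3, 1, 2]
    other = [1, 2, 3, 3, 1, 2, 3, 2, 1, 2]
    print(round(cronbach_alpha(np.column_stack([mine, other])), 2))

Treating the two coders as the 'items' in the formula is my own simplification; the point is just that the arithmetic itself is not the hard part.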

What attracted me to this method is not so much, as Luker posits, that it would legitimize my research to canonical social scientists, but rather that it would keep me accountable and disciplined during the analysis process. Working towards a logical, clear coding book would keep my thoughts in order, and seeing the coefficient (which, we hope, would be high) might relieve my own anxieties about the research process.

In fact, I've been having some anxieties about my research as I've been working on my research proposal. I started the blog post with a remedy for analysis anxiety, but there is also data collection anxiety. I have an exciting case study in mind. How do I ensure that I gain entry to my research participants? Walsham (2006), in a paper about interpretive research methods for the study of information systems, devotes a section to the social skills of researchers and places importance on being liked and respected by the participants. Similarly, Luker discusses the need to make your research relevant to the participants and to show reciprocity. When she gained access to an abortion clinic that she wanted to study, she donated blood in exchange. It sounds like a good idea, but I'm not sure I have enough blood for the number of interviews I would like to do. I imagine that what Walsham and Luker describe are simply normal social relations - why would our relationships and interactions with other human beings be any different because we are doing research? In any case, I can see why Luker discusses researcher anxiety near the end of her book. During my research journey, I will want to test my process many, many times, primarily to reassure myself that my work is sound.

Codification/Reduction of Data

I wrote my peer review on the McMillan "Soap Box" article. My primary criticism of the research centred on McMillan's coding or "reduction" of data. All I ended up seeing was the seemingly arbitrary sorting of interviewees into neat (but incredibly vague and questionable) categories in a table. McMillan didn't seem to consider that the interviewees might use a variety of media outlets on a day-to-day basis. Personally, I find it difficult to define "media," and tend to use a variety of different sources. McMillan also didn't seem to consider researcher bias (Luker's "fish problem") or alternate viewpoints. Some of the categories consisted of only one interviewee, which led me to think that a much larger interviewee base (or quantitative methods) might have helped. I really don't think that this kind of "redundancy" can be claimed after 18 interviews.
Because of the Soap Box article, I was re-considering incorporating a mixed method approach in my research design. However, I was brought back to Earth after reading Knight, who points out that drawbacks of cost and faulty judgments are involved with quantitative research. I would tend to agree with his advice, “The temptation to exploit the potential of numbers needs to be resisted unless the data really are of the right sorts” (p.177).
-Martin

Census and Research

Recently, I had to take part in a debate concerning the Long Form Census and whether it should be cancelled or not. That exposed me to this singular method, and authority, for collecting data from a large proportion of the population. It also brought forth the fact that an equally immense number of organizations depend on that data for their research work, as well as the fact that the information collected through the census is subjected to considerable and continuous analysis.

The Canadian Research Data Centre Network (CRDCN) states on its website that, since 2000, it has been in partnership with Statistics Canada to "transform quantitative social science research in Canada". Researchers analyze the census data to enhance their understanding of Canadian society. The census functions as the primary source of information about the population of Canada. It is in fact a benchmark against which all other data are measured and evaluated. It provides knowledge about language, education, income, housing, geographic mobility, ethnicity and so on. It is widely used by policy makers and city planners, as well as businesses, marketing researchers and NGOs. Even though it does raise issues regarding privacy, it is also true that it provides us researchers with an unimaginable treasure trove to dig into.

Immersed in literature

Hine's (2004) examination of Internet ethnography, just like Wheeler (2010) and Miller and Slater (2000), among many others, conceptualizes the Internet as a 'place', a 'network' or a 'community'. One thus studies a part of the Internet just as one would study a village, a grassroots association or a practice - by examining the people linked to them and the relationships between them.

However, discussing my thesis proposal with Professor Grimes today, it hit me that not all parts of the Internet are conducive to this type of study. In fact, some parts of the Internet would be better qualified as 'technologies' or even 'artifacts' than as 'places'. This applies to my chosen area of study, the US Government's geographic information system. While an ethnographic lens, particularly the one described by Star (1999), may be useful in examining the politics of the GIS, Pinch and Bijker's (1984) social construction of technology framework/method might provide the right bridge between relationships and technology.

In this line of thought, at the DIY Citizenship conference at the University of Toronto this weekend, Ron Deibert of the Citizen Lab talked about the methods that his team used to study cyber attacks on the Office of the Dalai Lama in Dharamsala, India. Deibert discussed what he termed 'fusion methodology' which consists of field methods (participant observation + focused interviews) and technical interrogation (in-depth analysis of the technologies in play). This gives equal value to the social interactions and the technology itself, differing from Star's method which examines technology only as a small part of the ethnographic study.

The final report, entitled Shadows in the Cloud and produced by the Information Warfare Monitor and the Shadowserver Foundation, provides an interesting description of the mixed method - definitely worth considering for those approaching their research through science and technology studies.

Oh my Facebook

I'm using an article entitled “The Librarian as Video Game Player” (Kirriemuir, 2006) for an INF1300 Annotated Bibliography project, and I thought it brought up an interesting argument. Kirriemuir states that gamers are typically capable of multi-tasking, of using sophisticated information-locating resources (online and offline), of installing hardware, and of using social networking tools effectively (all of which happen to be fairly relevant skills in a library). I narcissistically like to think that I possess many of these strengths, save one.

Yes, in a futile effort to recover my humanity, I have deleted my Facebook account. I hope I don't come across here as someone who thinks he is “above” social networking sites, but part of it had to do with the way people whip out their smartphones in the middle of social gatherings. Another part of it had to do with the fact that I don't actually care how well my friends are doing in Farmville. The list goes on but I'm pretty certain I'll have to reactivate at some point, for some reason or another. It's simply too ingrained in our culture.

I thought this topic was quite relevant to the Orgad article, since the blurring of the lines between online and offline lately seems fairly substantial. I would agree that the Internet is an extension of people's lives, and that studying online/offline in conjunction could yield valuable insights, depending on the research question.

-Martin

Detecting spam in Twitter Network

Nowadays Twitter has become one of the biggest social networking sites around. Twitter has some useful features which make it especially helpful for research on almost any topic under the sun. This is because Twitter pages are viewable to anyone, even those without accounts, and the site has a search function which pulls in all recent posts dealing with a given topic or phrase. Twitter is a micro-blogging service where users can post 140-character messages called tweets. Unlike Facebook and MySpace, Twitter is directed, meaning that one user can follow another, but the second user is not required to follow back. Most accounts are public and can be followed without requiring the owner's approval. With this structure, spammers can easily follow legitimate users as well as other spammers.
As online social networking sites become more and more popular, they have also attracted the attention of spammers. In the First Monday article "Detecting spam in a Twitter network", Yardi and others study Twitter, a popular micro-blogging service, as an example of spam-bot detection on online social networking sites. A machine learning approach is proposed to distinguish the spam bots from normal users. To facilitate spam-bot detection, three graph-based features, such as the number of friends and the number of followers, are extracted to explore the unique follower and friend relationships among users on Twitter.
Unfortunately, spam is becoming an increasing problem on Twitter, as on other online social networking sites. Spammers use Twitter as a tool to post multiple duplicate updates containing malicious links, abuse the reply function to post unsolicited messages to users, and hijack trending topics.
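Since the approach described above boils down to extracting graph-based features and feeding them to a classifier, here is a rough Python sketch of that general idea; the feature choices, the toy data and the logistic regression model are my own assumptions for illustration, not the exact features or algorithm used by Yardi and her co-authors.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical accounts: [friends (accounts followed), followers, friends-to-followers ratio]
    accounts = np.array([
        [2000, 15, 2000 / 15],    # follows many, followed by few: labelled spam
        [150, 180, 150 / 180],    # roughly balanced: labelled legitimate
        [5000, 40, 5000 / 40],    # spam-like
        [80, 120, 80 / 120],      # legitimate
    ])
    labels = np.array([1, 0, 1, 0])  # 1 = spam bot, 0 = normal user

    # Train a simple classifier on the graph-based features
    model = LogisticRegression().fit(accounts, labels)

    # Score a new account that follows 3000 users but has only 25 followers
    print(model.predict(np.array([[3000, 25, 3000 / 25]])))

A real study would of course need many labelled accounts and careful validation, but the sketch shows how follower and friend relationships can be turned into features that help separate spammers from legitimate users.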

Network( -ed / -ing)

Though there is no novelty in thinking about cultural complexity, this week's readings have underscored how awe-inspiring it is through their discussions of how research methods take shape on the Internet. The same metaphor kept popping up for me, and was even explicitly stated a number of times: that of the network.

Through Hine’s ethnographic analysis (as studying phenomena requires a point of origin and a definition of perimeter) to Orgad’s online/offline discussion (to which these difference data should “mutually contextualize themselves” [pg 48]), the readings presented phenomenons as a blurred interconnected array of factors that any research method must be conscious of. 

I am a little intrigued by the network metaphor's doubling in both the Internet's architecture and this theoretical understanding of the world (I doubt it's done tongue-in-cheek, though I don't think it's completely haphazard either). Regardless, the metaphor serves as a valuable insight into both the Internet and ethnography broadly. I was reminded of a recent lecture (image below) about the increasing intention of web design to tap into this network idea, where offline/online behaviours are so connected that the boundary becomes blurred (think Foursquare, NikeRun, and continuing experiments with networked objects and cities). While canonical methods approach the Internet as an artifact, web designers are attempting to evolve the Internet as culture; in the meantime, researchers like us are attempting to adapt to both.

Dark Days

Last night my partner and I watched the Marc Singer documentary "Dark Days". The name of this film had been stuck in my head since Prof. Grimes mentioned it in a class long, long ago, sometime in October. I know that this week's readings focus on online research, which has absolutely nothing to do with this film, but they also discuss ethnography, boundaries and the fact that field sites do not exist in a vacuum.

This film is not a pure ethnographic study and Singer is not an ethnographer; neither claims to be. But Singer fully immerses himself in a marginalized culture and is able to record his experience. He filmed bits of the lives and thoughts of a group of people who lived in a section of the New York City underground railway system. They built their own homes out of what they could find and had free electricity, but lived mostly in the dark.

These underground dwellings were Singer's main field site; he didn't stray from there much, except to occasionally follow his subjects while they looked for food or tried to make money above ground. This culture was so small and unique, and Singer's access so complete and rare, that I don't think he had any issues with defining where this project started and stopped. I suppose he could have expanded it to tackle issues like New York's crack problem, or the effects of living underground, but he didn't.

Singer was specific in recording only these people (and, later on, some authorities). He chose to focus on their relationships, their back stories and sometimes their aspirations. He also took a personal interest in their well-being. Singer played a large role in finding actual housing for many of the people. I suppose this could be seen as similar to action research. Singer wanted to make a change for these people, and by researching them and making outsiders aware of their plight, he was able to make a real difference.

Again, I repeat that Marc Singer is not an ethnographer nor claims to be one. But, this documentary really did show his immersion into a completely alien culture, a very bounded and precise culture. He was so immersed that he was able to entirely change the culture forever.

A Strategic Use of Methodology


My weekly post has been delayed because I really wanted to talk about the peer-review process and feared the dangers of posting too early. Alas, it's been a few days, so I'm considering it safe to discuss.

My peer review was an examination of Facebook as a platform for political debate. One of the first things I did was follow the references the paper gave (the majority linked to the field of CMC, computer-mediated communication) and sit down with a couple of books on the subject. What I found was, in sum, a gravely outdated theory from the 90s: the central claim was that all Internet communication is a series of rude, childish insults. The authors theorize about the importance of social cues in civility, and create models ranking communication media by how much of the physical person they present: the Internet, lacking a visual or audio persona, somehow psychologically triggers our inner animal, as it unleashes us from the perceived social pressure of civility.

The bulk of this academic field has grown with this (biased) perception. The particular research study that was up for review, however, approached this with a purely strategic method: take a unit of the Internet, and count how many posts are civil and uncivil. Lo and behold, the majority of them were completely civil! It's a rather simple procedure, but its power rests in quantitatively debunking the theoretical work of CMC. To put it a different way, the study played by the academic rules and was still able to beat the academy at its own game.

That last sentence wasn’t meant to turn out as negative as it did, but it brings me into an overarching point: researching with the academic discourse and researching against it. That is, I think to a point, what peer-review taps into: how well does it fit with the academic body of work happening? While you can adopt it or challenge it, there is an expectation that you are working with it.

How important is it to have this relationship with the academic body (again, I'm considering dissent a type of response)? Class discussions suggested that 'breaking free' and just researching whatever you find interesting is much more liberating, but I worry that this privatized research just disguises your working methods: your own perceptions and biases are not explicitly formalized. To extend, challenge or adapt the scholarly work (with references and citations) at least allows the reader the luxury of accessing this history.

Although this fault works the other way too, as working with the academy means adopting all of its biases (the Facebook study, though it challenged the civility argument, adopted many other biases without hesitation; it is as if you can only challenge one idea at a time).

I feel myself losing track of this post, but this has been an area of tension that I've been trying to grapple with: the academy, with all its embedded faults and limitations, provides the strength of context.

Peer Review

I guess I am too full of assignments right now. I was doing a peer review of a paper that uses ethnography, and since ethnography makes use of pictures as well, I was inspired to use some images myself in this posting to depict what peer review is, instead of writing yet again. Hoping the pictures will serve better than a million words...












Bias in Research Methodology Design

This week's peer review made me realize just how complex research methodology can be, especially in terms of the design. Those in charge of the study have to make sure that their methodology tests everything they intend it to and that these tests are reliable. The problem is, every researcher approaches the study with a bias, and this can affect the results. It's one of those things that can't really be avoided, but it can sometimes be argued that a bias can be good for the study, as it might bring a new and unique perspective. However, sometimes the bias can completely distort a study because the researcher had something particular in mind and, subconsciously or not, designed the study towards that bias.
Biases are something that every researcher has to deal with, but it just becomes a question of how aware you are of your bias and whether the bias affects the research design in a negative way.

Public relations, funding and case studies

As I started reading Beaulieu et al.'s (2007) article, I was struck by their mention of the cultural and institutional context of case study use in science and technology studies and how this might link the research to multiple audiences. This made me think about the fact that case studies can provide an extremely powerful illustration of a more complex issue for a public outside of the research field.

While research popularization can contribute to general understanding of an issue, if the research is publicized adequately it can also lead to increased funding, as grant-making institutions want to fund relevant research. I haven't examined this enough to be in a position to make a link between case studies and the funding they receive, but my experience with the mock SSHRC proposal tells me that the clearer the link between research and a current social issue, the better the chances that it will be funded. My work in public relations has also taught me that the more concrete and emotionally engaging a story, the more likely it is to be picked up by the media, and thus reach the public through traditional channels. Therefore, an engaging case study that can be covered by the media and insert the researcher into a societal debate is also more likely to receive financial support. What's more, the funding agency might benefit from a little publicity as well.

Since I have diagrams on the brain (and I'm sure I'm not the only one in this situation), I thought that I would illustrate my point with a little drawing. I think that this applies whether the research is in the social sciences, humanities or hard sciences. The 'generalization' ring is what links the case study to societal debate.
I then thought about how the diagram above might translate into a press release, and because I have a lot of time on my hands, I made a model of a press release that might get media attention by making the case study relevant to the public.

I'm not sure to what extent this reflects reality for researchers who have been funded, but I think that examining the politics of research methods and funding is a very interesting way to position the researcher within a broader societal context. Taking this into consideration is, in my opinion, a good way to make sure that our research can both happen and have an impact.


Science and technology

I found aspects of the article by Pinch and Bijker quite interesting this week. Written in 1987, it's ideal for doing your own mental comparison study using contemporary thoughts. The part of the article that I became particularly engrossed with was their brief and admittedly partial discussion of the science-technology relationship. I feel like it could have been a contemporary piece. Nowadays, science and technology are still grouped together and people are still trying to define the distinction between the two, even though they are inextricably intermixed. Perhaps this is because technology has evolved at such an exponential rate. Pinch and Bijker dismiss some prominent philosophers' claims that "science is about the discovery of truth whereas technology is about the application of truth" as overidealized and simplified. When I googled "what is the difference between science and technology", an answer that popped up a lot was "Science is knowing and technology is doing". One of the same prevailing thoughts as 23 years ago!
Pinch and Bijker take a social constructivist view of science and technology: "Scientists and technologists can be regarded as constructing their respective bodies of knowledge and techniques with each drawing on the resources of the other". All this time, well, the last two months, I've been under the impression that the meshing of disciplines and blurring of classification lines was a new phenomenon. Obviously, it started long ago. The category of science and the category of technology are socially constructed; only now have we started to take down the barriers that we ourselves erected, or have we actually? Two decades later we are still wrestling with the same issues.

Peer Review Assignment

The article I have chosen for peer review is "Getting political on social network sites: Exploring online political discourse on Facebook". This article is about the use of a social network site like Facebook for online political discussion. It examines the impact of Facebook on political discourse, as the site gives people a tool to interact online and extend their social lives beyond working hours. Of the many narratives exploring the use and outcomes of social network websites, perhaps the most common explores the public sharing of personal identity information. Over the past four years, social network websites (SNS) have achieved strong market penetration with a wide range of participants. Sites such as Facebook.com are indicative of the phenomenal growth social network sites have seen in the past few years. The site was launched in 2004 and, as of April 2008, surpassed 70 million active users. With the rapid popularization of social network sites, the potential for individuals to engage in online discussion about social and political issues has grown exponentially in a few short years. Due to this explosive growth, scholars have little understanding of the nature of online political discourse as it is occurring in these new social spaces. This study explores how Facebook is serving as an arena for political debate among members. Research has shown that online political discussion does serve to expose participants to non-like-minded partners. Yet, despite the potential of the Internet to bring opposing camps together in a common space and provide exposure to different ideas, some evidence suggests that this may not necessarily be occurring. I found this to be a very interesting subject for research.
-Meenaxi

Article hardships

I have to admit, I found this week's readings a little confusing and verbose. My brief summaries/interpretations of the first two readings are as follows:
Beaulieu et al.: A research approach allows for debate and discussion, whereas work in a laboratory tends to constrict workers to their own tasks. A research approach can also be useful in examining elements of space, time and relationships. In studying the relationship between the local and the universal, it is helpful to draw comparisons across cases and disciplines.
Yin: Case studies do not necessarily entail a certain kind of evidence (such as "qualitative") or a particular data collection method (such as ethnography or participant observation). The fundamental thing about case studies is that they attempt to examine a contemporary phenomenon in a real-life context. A number of techniques are effective in a case study approach, such as focusing on a topic, noting meaningful events, and creating explanations for outcomes. Case studies yield more than single data points or single observations, so cross-case analysis should facilitate reflexivity.
Did I completely miss the point of the articles? It would be great if anyone could shed light on what the authors were trying to say!
-Martin

INF1300 Interview

I don't really plan to include the study of artifacts or a literature review in my research design (although that could change), so I'm going to instead talk about my experience conducting an interview for the INF1300 course that many of us are taking:
It didn't go quite as expected. Although I told my subject (who happens to be the most extroverted person in the world) that it was a one-on-one interview, she was visiting with two friends when I arrived. Since it was obvious that asking them to leave would have made the situation uncomfortable, I ended up letting them stay. Thankfully, the friends did not (for the most part) answer for the interviewee, give her suggestions, or pose questions of their own to her. The interview also ended up going much longer than anticipated, and I was left with more material than I could possibly have used. I omitted tangential responses from my report and was sure to address all of these issues in it.
Comparing Meenaxi's experience to mine, it seems that the flow of an interview largely depends on the nature and background of the interviewee. The INF1300 interview was about impressions of the library, and I think I had a relatively easy time getting responses because my interviewee was a student. Did anyone else have any interesting experiences/difficulties with the interview?
-Martin

An Introduction of Critical Discourse Analysis

While going through this week's readings, I found critical discourse analysis to be a very different topic in research; I personally didn't know what exactly it was. So I looked up the definition of CDA and found that it is an interdisciplinary approach to the study of discourse that views language as a form of social practice and focuses on the ways social and political domination are reproduced by text and talk.
In my opinion, the term 'critical' has become 'little more than a rallying cry demanding that researchers consider whose side they are on'. From personal experience, I have found that it also seems to cause the hackles of other discourse analysts to rise, because of the implication that they are 'non-critical' or even 'sub-critical' and therefore somehow in favour of things like oppression, exploitation and inequality: by commandeering the moral high ground of being critical, CDA thus 'others' mainstream discourse analysis and performs the very kind of domination through language that it seeks to oppose.
Discourse analysis challenges us to move from seeing language as abstract to seeing our words as having meaning in particular historical, social and political conditions. Even more significantly, our words (written or oral) are used to convey a broad range of meanings, and the meaning we convey with those words is shaped by our immediate social, political and historical conditions. This is a powerful insight for home economists and family and consumer scientists. We should never again speak, or read or hear others' words, without being conscious of the underlying meaning of those words. Our words are politicized, even if we are not aware of it, because they carry the power that reflects the interests of those who speak. The words of those in power are taken as "self-evident truths" and the words of those not in power are dismissed as irrelevant, inappropriate or without substance, as van Dijk mentions in his article.
-Meenaxi

The Contextual Scope of an Artifact's Meaning

I have to admit, I feel much more at home with this week's readings; although I found the quantitative and ethnographic research methods interesting, I was struggling to apply them to my own current research area. Thomas' Artifactual Study in the Analysis of Culture, in particular, sparked my interest.

One of the first problems Thomas discusses is the substitution problem: can the study of artifacts be equivalent to the study of human behaviour? She points out two shaky assumptions this problem is built on: that of 'a direct method' (as if all of human behaviour can be captured in one ultimate method) and that of attempted equivalency (as if artifacts are trying to be a pale reproduction of human behaviour).

The substitution paradox, I think, rests on the hierarchical dichotomy of the effable and the ineffable. Rudolf Arnheim, a psychologist most noted for his work on the visual arts, argues that this word-over-image bias rests in a linguistic-deterministic framework: the visual world is so chaotic and meaningless that the only way to liberate it is to impose the structure of language, a mold in which order and meaning are created. The implication of this, then, is that the visual is innately chaotic and represents no meaning or order in and of itself.

This seems to be where Thomas' problem with artifactual study rests: can artifacts, a physical realm, be innately meaningful, or do they require the linguistic, a rational realm, to give them meaning? While she argues that they do not, and I agree, I wonder where the scope of that innate meaning lies (i.e. is this meaning culture-bound?).

The dehumanizing effect of critical discourse analysis

In the closing paragraph of his article, van Dijk writes: "this paper has sketched a rather simplified picture of power, dominance and their relations to discourse" (1993). Unfortunately, that is exactly why critical discourse analysis should not be used, in my view, for social research.

Dualistic frameworks about power and oppression, dominance and hegemony can be applied in the formulation of a research question, for example during an ethnographic study. They might also be useful in examining social phenomena, such as Paulo Freire's popular education movement. However, they should ideally only represent a portion of the researcher's work.

Van Dijk's statement that "critical scholars should not worry about the interests or perspectives of those in power, who are best placed to take care of their own interests anyway" is frightening, because it implies that there are two classes of human beings: the powerful and the powerless. It also indicates that the former class is less worthy of study, and even of human compassion, than the latter.

I would venture that, in reality, power dynamics are much more complicated than this, and that human beings, whether they hold more or less power, remain multi-dimensional and unclassifiable. Any researcher who splits a population into two groups, discards one group and promotes the perceived interests of the other, is not only misguided but can also do serious damage in a community.

Dilemma concerning ethnography

I plan to make use of ethnography and am presently in the process of understanding how to “cover my bases” and be as transparent as possible, and today’s class seemed to help a lot in this respect. Previously, I had come across some articles addressing issues concerning ethnography which stated that it is not regarded as having as high a standing as certain other methods. And it is not just ‘quant’ researchers who hold such views; ‘qual’ people share similar ideas. It always has to be substantiated by statistical calculations or concrete qualitative analysis. One of the basic reasons for this is the bias that a researcher or group of researchers can develop while conducting the field work: the research is analysed from one person or group’s point of view, and their perception often comes into play. One of the ways of protecting ourselves from such criticism would be by being “extra super vigilant” while situating oneself in a culture or system. Thus the “importance of distance” plays an exceptionally major role in making the research credible.

Yet another attack on this method of research is that it studies one culture, organization, or system and tries to ‘see the world in a grain of sand’. This is true in some sense, when the local provides an image of the global, the micro that of the macro. But it is also true that in a world as varied as ours, such a study and its relevance can be highly limited. If I study one department in an organization, it is unlikely that the gleanings from that study could be applied to another, very different department. So studying one aspect of a company might not reveal much about the entire organization. It is difficult to defend ethnography in such a case. However, keeping in mind that all methods have their drawbacks, I guess that, conversely, ethnography too has its own (confident) position... enough to gain ground amongst the other research methodologies.

A face-to-face interview

Many of us have experience with the interview method, which we used in the INF1300 assignment. That assignment involved conducting an interview on the public's view of the library. Last week I conducted the interview and found that this class has really assisted me in determining the method I would employ as a researcher. The goal of this particular research is to discover what a person knows about libraries, library services, facilities, etc., but while conducting the interview I discovered that a lack of knowledge about the library also gives a researcher valid information, which can lead to questions of why there is such a lack of knowledge about the resources the library offers. Although I used open-ended questions, it became apparent that there needed to be more structure than I had originally thought. The questions that I thought would draw long answers received very short ones, and I had to compensate by asking questions in different ways or being more specific, without telling the interviewee what they should know about the library and its services.
In other words, the face-to-face interview is one of the most widely used methods in research on almost any topic, and it is based on a direct meeting between interviewer and interviewee. Through personal communication it is possible not only to obtain much more information, but also to use visual materials (cards, pictures, logos, etc.) to encourage response.

Meenaxi

"In the midst..."

The view from the “vantage point” is what ethnography aims to provide. According to Shaffir, “hanging around” has been and can be the best advice for conducting such field work. Other similar terms have been coined or associated with ethnography over time to reveal what it truly is. “Acculturation” is one such term, which means that one becomes a part of the culture being investigated. It involves “direct engagement”, interaction and integration into a system or culture. “Going behind the scenes” to conduct research can prove to be consistently interesting, as it also involves being “present” and “participating” in the natural, daily activities of that which is being studied. The participatory aspect of this form of research consists of becoming something close to a “member” or part of the whole, or even a non-member “participant-as-observer” (Stebbins), and “learning” the practices of the community, people or system. Of extreme importance is “fitting in”, as that is what makes those being studied openly provide insights into their inner workings. And acquiring “first-hand information” is no doubt valuable and can possibly reveal many unknown or unthought-of issues.

Even though ethnography can be regarded as one of the most in-depth research methods, it does have several drawbacks: there is the risk of over-participation and the subsequent abandonment of unprejudiced work, leading to a biased outcome; it is time consuming – not simply in conducting field work but in analysis as well; it is potentially expensive and therefore limited to considerably small-scale research; it can be lacking in range or breadth; and so on. Hopefully, in my small-scale proposal, it does find a place and provides me with "truths" that prove beneficial enough to answer the deadly “So What?”

Offline and online research

This week's readings made me think about how useful ethnography can be in studying online activity, as we talk more and more about 'online communities' and 'social networks'. However, they have raised one question for me: how do you set the limits of an online community? Physical communities, such as the one described by Shaffir, have perhaps more delineated borders. Often, in fact, researchers travel to a community and physically embed themselves within it.

Online communities are a little different. I was recently studying the Etsy social e-commerce site, examining how buyers and sellers interacted with the companies and what kind of social dynamics were present on the site, if one dug a little. I soon realized, however, that the network had tentacles which extended into many other social media and, through them, into face-to-face meetings. In this sense, an ethnography of the network might include observing how its members interacted on and offline. This tendency is made even more visible by websites such as Meetup, which allow users to network online before meeting in person.

Perhaps what is so daunting about studying online communities is that often, each member of that community is part of one or several physical communities, distributed around the world. A solution might be to examine how distributed communities are studied, such as international communities of practice. I may simply be pointing out the obvious - that in our current highly networked societies, framing the subject for an ethnography is a challenging endeavour.

Respect the Gap

This week's readings, particularly the one by Shaffir, talked about the importance of distance in the participant observation methodology for maintaining valid results. While I don’t personally plan on using PO in my current research, the discussion brought to my mind larger questions about the nature of scientific knowledge.

In his book Pandora’s Hope (a reading I have been doing for a different class), Latour discusses the validity of scientific knowledge through the concept of circulating reference. While the original context was that of the physical sciences, I am going to (attempt to) present this concept through a more ethnographic lens, as applicable to the participant observation methodology. I’m not sure what I’m hoping to achieve with this, but I’m finding these philosophies of validity much more interesting than a step-by-step guide to executing method ‘x’.

Interview Tactics

Much like Yuliya, I also found it difficult to address the “So what?” Obviously it's an issue that I feel matters, and since then I have thought about clarifying certain aspects of my larger question.
I am trying to tie the subject of the ethnographic approach to my own interview approach. I admit that I'm a fan of the interview method – as Luker points out, I believe that recurring patterns of opinion amongst a substantial number of individual interviewees lend a lot of credence to research and suggest broader social implications. I was initially considering conducting an epic mixed-method approach, complete with hundreds of interviews and thousands of surveys. Wouldn't that have been awesome?
Interestingly enough, I'm conducting an interview today for another class I'm taking! My main concerns are 1) to make the interviewee as much at ease as possible, 2) to ask appropriately open-ended questions, and 3) to document the findings as quickly as possible (as Luker urges us to do). I am also considering using Luker's reverse-psychology tactic of asking intentionally leading questions in order to elicit clarification – although I would have to justify this method to the professor. Shaffir notes that the idea of removing the researcher's political views is a “facade.” As such, in my own interview I will attempt to remain as receptive as possible, and then follow up with a reflexive approach.
-Martin

I'm an ethnographer and I didn't even know it!

Ethnography. I've gone from never really hearing the word before to it permeating my weekly readings and discussions. This word has popped up in all four of my classes, so it must be important. I come from a humanities background, where personal research was not foregrounded or necessary to get one's degree (in my personal experience). Ethnography was a method reserved for those in anthropology or the social sciences, those studying actual people... whereas I felt I was just investigating the artifacts or output created by people. Now I see that this was kind of a silly distinction that I made up in my head. I saw studying belief and studying action as divorced from each other. However, as Luker points out, when belief and action are combined we get 'practices', and this is what good ethnography studies.

In December of 2006 I began my own ethnographic study... without even knowing it. Like many fresh graduates from university, I was unsure what to do, so I moved to Asia. I situated myself in a small town named Hualien in Taiwan to be an English teacher. Admittedly, I was very much overwhelmed at first. Reading Luker's explanation of living in a different culture actually made me laugh out loud, because it was so true. Everything was a puzzle: I couldn't read, write, speak or listen. I never ended up getting a bank account, because it was too confusing. I had local friends, but I still lived completely on the periphery and was recognized as an outsider just by how I looked. Obviously, I was not aware that I was doing an ethnographic study, so I didn't keep field notes, but I did keep a sketchbook and a journal where I recorded my thoughts about my life there. After living there for almost a year, and being 'let in' to the culture to a certain degree, I believe I can say I know a very minute bit about small-town Taiwanese practices. In retrospect, I wish I had paid closer attention instead of just being in awe. It would have been interesting to study more of the subtle practices, rather than just the big obvious ones. For instance, I was the only foreigner working at my school, and it was unclear to me how authority worked there; I was told in a very roundabout way if I was doing something incorrectly. Using ethnography to study power structures would have been really neat (and would probably have helped me out, as I was confused most of the time).

Using ethnography to study library and information science seems like an odd fit at first glance. But if you look at both libraries and information as the products of practices or entities in which particular practices happen, ethnography seems like a logical method of study to use.

Playing digital games after writing a SSHRC proposal

A perspective on the SSHRC proposal: I am afraid that in my proposal I didn’t answer the question “So what?” for the reader. The things discussed in it seem so obvious and connected, flowing from each other: (i) computer games are fun, (ii) work usually lacks this property (not that people don’t like their jobs; simply that the tasks they may have to conduct at the computer are not always exciting, engaging or conducive to personal growth), (iii) games have specific mechanics that are considered to be the reasons for that fun, and fun at work that doesn’t exclude work itself is a benefit for employees and consequently for organizations, (iv) so why not apply those game mechanics to work and observe the fruitful results of work becoming fun? It is simple to say, and it is all there, but I am not sure that the ideas in the proposal are interconnected enough to justify the proposed research, mainly because I somehow focused more on games than on work: was that the right focus for two pages or not? Also, I now intend to re-examine my methodology and add more study of context, documents, and the history of the organization or department (deciding where to put the boundaries will be a tough job, I guess). This is all under the influence of the INF1003 course materials on interpretive research; I only have to figure out how to apply it to small-scale research without the luxury of several years to conduct it.

From doubts and intentions to pleasant thoughts. I was intrigued to hear a discussion at the lecture about digital games (it isn’t hard to guess why, if you’ve seen a glimpse of my proposal above). I have to admit that I had never reviewed the literature on game violence research, and I found it insightful to learn about the biases and politics involved, which are especially distinct in this observation: if games can aid in learning and teach good things, why would they simultaneously be considered absolutely neutral with regard to teaching bad things? My new, general, and probably greatly biased thoughts on this account are as follows.

I am conscious of the nature of the skepticism in that observation, but I can remember only a very few games that actually “teach” bad things, in the sense of encouraging them. If violence is present in a game but is punished or treated as a disgusting act (which is mostly the case; even if you choose the “bad” path in the game, you don’t get attractive results or a pat on the shoulder; if you find destruction, tears, and pure power attractive, isn’t that your nature? if the game gave you a choice, isn’t it your choice? but I digress...), then the game mostly teaches things about punishment and about violence’s disgusting nature. At the same time, people put great effort into designing good teaching tools, and research and debate a lot about how to do it; so, if we consider that a violent game teaches violence, we could, with every reason, reward its designers with PhDs in psychology and education. ...And on the other hand we have the “susceptible” child’s psyche (the “bad” path is easier, I always follow it in games, let me try it in real life) and the factor of getting used to things (there is so much violence on the screen, I can easily stand one more case in real life), which are themselves full of contradictions. On the grounds of my opinion (that sounds almost too academic :) ), I would suggest that maybe these researchers – those who study games as violence promoters and those who study teaching tools in the form of games – don’t need to learn so diligently from each other, but rather apply their research methods to see how games do as promoters of right things, and how games do as a form of having fun, respectively. In my view, these studies differ greatly and the term “teaching” has to be used cautiously, which is why in one of the cases I chose to use “promotion”.

Last-minute observations

My initial crisis with the SSHRC proposal was that I felt it covered too broad a topic. It also seemed to me that a large-scale, mixed method approach (while ideal in my opinion) would be completely unrealistic for small-scale research. To remedy the situation, I reduced the scope of my research to a relatively small, specialized population.
I felt that a semi-structured interview was the most “suitable” for the kind of qualitative research I was thinking of. However, as a result, I feel that I'm in a position where my potential findings could be interpreted as “lacking generalizability” or, as Luker puts it, “spurious!” For a long time I was questioning whether or not it would be a struggle to draw greater implications from the potential research results, but I think addressing significance and specifying questions helped to alleviate that fear.
For me, the theme of the Kline article seemed to be that research is interpreted differently depending on the audience, and can be rendered subjectively insubstantial through this process. It seemed quite relevant to the concerns I had about my SSHRC proposal, because my topic also deals with legal battles around media freedom!
-Martin

(Re)shaping the question

My primary area of interest – other than archival studies – is English literature, and in particular Shakespeare and the English Renaissance. When writing my research proposal, the greatest difficulty that I found was how to articulate my interests, which are literary (and therefore belong more to the ‘humanities’ school) in nature, within the framework of the social sciences. In particular, the methodology section was confusing, as I am used to thinking of ‘the method’ as simply reading a text, thinking about what the work says, researching the criticism that has been written about the work, and then attempting to contribute to the scholarly tradition with my own analysis. I thus found myself having to think of how the project, originally entirely theoretical, might have some practical applications which could be researched and established through a more ‘sociological’ approach.

Overall, however, I found the experience of writing a SSHRC proposal extremely rewarding. In particular, it helped me to refine my interests and general objective. I will try to use this approach again (if I am ever in that position again) when I write literary criticism. By thinking about how my analysis can be applied to practical, real-world questions, I should be able to present my work in a more convincing shape.