Too many sources, too little time...

A few days ago I was worried I wouldn't have enough sources to fill up my literature review. Now, I'm worried I'll have too many! In order to broaden my scope of sources, I consulted the bibliographies of my already-selected sources for helpful texts. I read a few interesting and relevant ones, which then led me to more studies of interest, and so on and so on. It was as if the floodgates had opened, and I couldn't stop the material from pouring in. This is the problem with research, especially research in a popular area: there's so much material! I couldn't possibly read every important content analysis study ever written, but with so many out there, I can't help but feel that if I don't look at them, I'll be missing something. Which is just a feeling I'm going to have to get used to, because in research there's probably always going to be something you're not reading. I guess the important thing is to read as much as you can of the fundamental, cornerstone works that define your field, so that even if a new study appears on some narrow slice of your topic, the foundation of your study will remain intact.
That is not to say that I've neglected my research; it's just that when you're using a popular method and studying in a broad field, widening your search parameters can be both useful and problematic. Too narrow and you've got nothing; too wide and you've got everything, and that is simply too much to go through in a few weeks.

A New Appreciation for Quantitative Methods

I am currently finishing up an edition review for another class, in which I discuss editorial decisions made throughout the Riverside Chaucer (1987). The research conducted is quite familiar to me, mainly focussing on literature and content analysis, with some manuscript studies in there as well. In doing my research for this assignment, I was struck by how similar these methods are to some of the studies we have discussed in class, especially ethnography and discourse analysis, as well as some of the more quantitative methods.
In compiling an edition of the Canterbury Tales, scholarly editors have to go through all of the existing (and acceptable) manuscripts and collate them, determining variants in spelling or word choice, etc., in order to decide which manuscript to use as a base text, or whether to compile several manuscripts in order to create a new edition. I had never thought of it before, but thanks to my newfound knowledge of research methods, I can see that this work involves much more quantitative analysis than I ever realized. You would have to count and record every variant in every manuscript, and then use your data to decide which text is most accurate for your purposes. This is a far cry from the textual analysis I had always seen it as.
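To picture just how quantitative this collation work gets, here is a toy sketch of the counting step. The sigla (Hg, El, Cp) echo real Chaucer witnesses, but the readings are invented for illustration, and nothing here reflects an actual editorial workflow:

```python
from collections import Counter

# Toy collation: each "manuscript" gives a reading for the same line.
# Sigla echo real witnesses, but the readings are made up for illustration.
manuscripts = {
    "Hg": ["whan", "that", "aprill", "with", "his", "shoures", "soote"],
    "El": ["whan", "that", "aprill", "with", "his", "shoures", "soote"],
    "Cp": ["whan", "that", "aprille", "with", "his", "shoures", "sote"],
}

def count_variants(witnesses):
    """For each word position, tally the distinct readings across witnesses."""
    positions = zip(*witnesses.values())
    return [Counter(readings) for readings in positions]

tallies = count_variants(manuscripts)

# Positions where the witnesses disagree are candidate variants
# an editor would have to record and weigh.
variant_positions = [i for i, tally in enumerate(tallies) if len(tally) > 1]
print(variant_positions)  # → [2, 6]
```

Even in this tiny example, the editorial decision ("which reading goes in the edition?") rests on a table of counts, which is exactly the kind of data-centric work I never used to associate with manuscript studies.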
I have been doing research in this area for a few years now, but it was not until this past week that I saw manuscript studies as being so data-centric and quantitative, something I can surely attribute to my introduction to those methods in this course.

Adieu or not?

Just as research is never-ending in the life of a researcher, I do not want this blog to be totally over for us. I started out on an uncertain process, as I had never blogged before, nor was I very fond of publishing my private thoughts. But this was another domain, one where you shared your ideas with your peers, and for the world to see! Sometimes I was unsure whether to be formal, and at other times I just gave in to the feelings, the tensions and the questions that overwhelmed me. Gradually the realization came over me that this was also a way of finding out for ourselves what we were searching for at that moment in time. It was like research into our deepest thoughts and obligations, brought out in print.

Now, as I struggle to make my real research more meaningful, my methods click, and my hypothesis achievable, I feel that I have certainly gained as much from this blogging and unravelling of the thought process as from the Research Methods class itself.

Texts and more...

Around the beginning of the Research Methods course, I was excited to have found a book like Luker as a text. I did not have the patience then to wait for each class and read the assigned chapters; I went through the entire book immediately, hoping to learn an important aspect of the academic world and enjoying it immensely. I finished the preliminary reading within a few days, but the thoughts lingered with me. As I started work on the research proposal, I realized the necessity and utility of Knight. Knight and other research texts I have come across in the past somehow fill me with more confidence about the task at hand. They provide a rigorous understanding of the definitions, descriptions, and explanations of methods, and of how to go about practically doing them. Luker, in retrospect, is more of an overview for me. Even though it is not “a sugar-coated bitter pill,” in the words of Shaw, Luker’s book nicely and gently introduces us to research and tries to rid us of some of the inherent fears we may have about committing to and accomplishing such work.

Comparing Luker and Knight near the end of the proposal provided me with an understanding of how much information and knowledge I needed, from whom, and when. I realize that it has surely been fruitful to experience the different ways of approaching a subject and to accept the various perspectives, as that has allowed us to open up our minds and enabled us to see more.

Source Dilemma

I'm having a bit of a dilemma right now over a certain source and whether to use it or not. First of all, the research I'm proposing involves computer-mediated discourse analysis and political discussions in certain forums online. Sound familiar? It's similar to the research conducted by Kushin and Kitchener in their article "Getting political on social network sites", on which I did my peer review assignment (I had picked my topic long before reading these articles). My dilemma is that in reviewing this article and the research conducted, I am aware of all the flaws in the research design and implementation. However, their research would be very useful in order to provide a sort of background and support for the research I'm proposing. But since I have found it to be flawed, is it ill-advised to use this article as a source in my proposal?
In fact, I think it might not be that much of a dilemma after all. The only similarities between our studies are in the method employed (which isn't unique to their study) and the general subject area of focus. Also, it might be easier to justify the need for my research if one of the fundamental studies in this area has serious flaws and biases all the way down from research design to implementation and data analysis. I don't think I really have a choice but to use this study as one of my sources, as long as I acknowledge its weaknesses and don't repeat them.

Thankful for Salsa Dancing

I'm currently working on my research proposal, and I'm finding that the methods section is giving me more trouble than I expected. I'm not used to doing research in a way that requires me to define my method and explain its parameters, but I guess doing discourse analysis of human subjects is a bit different from a literature analysis. I am struggling a bit to define my method, as I will be drawing on a few different ones in order to do my hypothetical research. Herring's computer-mediated discourse analysis will likely serve as my main method, but I am not ruling out drawing on other methods as I move forward and as they are appropriate. However, this leaves me trying to figure out what else would be useful in my study, so I've been doing a lot of research on research methods (which is a bit too meta for me).
Luker's salsa-dancing social scientist idea has never been more useful to me than now, because picking one method and sticking with it seems too narrow a path for me and my research tastes. Following her lead allows me more freedom when deciding on a method for my research, which suits me perfectly, especially when the research I'm proposing would require a wider range of methods in order to reach any conclusive findings.

User-centred research at U of T

The University of Toronto has its own guidelines for researchers, and I thought I would have a look at them given that we are having a guest speaker who can address our concerns. I also plan to have University of Toronto students as my user base for part of the study. I would then need to “gain access to data about students, staff and faculty held by the University of Toronto”. Interestingly, the Office of the Vice-President and Provost’s site informs us that a considerable amount of research is conducted involving members of the university community. The Office of the Vice-President, Research has a Research Ethics Board (REB), which has to approve the research to be done and must be contacted at the research-planning stage. I was excited to learn that I could request data held in the Repository of Student Information (ROSI), the Human Resources Information System, or collected through surveys such as the National Survey of Student Engagement (NSSE). This would help in conducting research on 'student users' from various departments independently of each other.

Guidelines are provided in keeping with the following aims (which sounded very thoughtful to me, both with respect to those researched and the researcher): “to prevent survey fatigue, protect confidentiality and employee rights, and ensure that access does not conflict with any current or planned research to be conducted by the University or its administrative/academic units”.

Disclosing Cognition

The ethics of research was an interesting topic to read about this week (though, to continue my last post’s line of thought, this segmentation is a bit artificial for me: it's not ‘ethics’ as a subcategory of ‘research’; rather, ethical choices seem to underscore the entire research initiative). My own research topic doesn’t dive very deep into these professional ethics, but Knight’s discussion of achieving ‘disclosure’ with participants touched on what I’ve been grappling with lately.

My research proposal began with a strong interest in the cultural meaning assigned to an object (a typeface), so I have spent a lot of time working out my historical analysis methodology. I had always worried that this approach would veer off into a grand narrative that might lose contact with the current world. In my proposal, I counterbalanced this method with the addition of a focus group to maintain the individual perspective.

Describing the focus group in more detail has been a struggle; how do you test for the psychology of aesthetics? While Knight suggests building a trusting relationship or an insider approach to understand what the participants think, I doubt that our aesthetic preferences are so well thought out that they can be readily articulated by the participant.

User and research

A critical aspect of user-centred research is understanding who the potential users are, especially because early user involvement is a primary principle of user-centred design. This can be extremely relevant to HCI research as well. Though HCI research concerns the user experience and probable usability of a website, interface, or piece of software, it does not always involve user participation from the initial stages. User-centred processes try to include the actual users in the development process at the earliest possible time, in an effort to correspond to the needs and behaviours of the users.

As I progress with my proposal, I realize how much the method can be linked to the area of study. That is because my research findings are dependent on user feedback. My quest depends as much on how I embark on it as on the findings themselves, since querying usability becomes an important part of the study too. When conducting user research, theorists recommend that several methods be used in order to obtain rich qualitative data that helps build a holistic view of the studied user group. Thus the most common methods include interviews, observations and questionnaires, with other methods such as cultural probes or artefact analyses applied at times too. Interestingly, this kind of work also falls under the purview of exploratory research into the unknown.

Murphy's Law

Knight brings up a very important point in this week's reading: technology and the human element have the potential to present very real headaches during the course of interview research! I'm positive that everyone has had issues with technology at one point or another. In my personal experience, I have forever been cursed with computers that have some hardware issue or another. Even the overpriced laptop I currently own locks up randomly due to a faulty processor. Due to these wonderful experiences I have learned to have data backed up to at least three hard drives/flash memory/etc. at any given time. I'm sure these habits are applicable in the research realm. Knight even mentions that data should be collected through several recorders at once.

The human element can be equally hit and miss. Knight says to “try and distribute tasks so as to use partners' strengths and avoid [the] weaknesses [of research assistants]” (p. 163). I'm sure everyone has been in group projects in which everyone contributes, and others in which no one seems to contribute. The only method that seems to work in the latter situation is to assign tasks (in my experience). I suppose that it is equally challenging to develop camaraderie with research subjects.

-Martin

Ugh...Consent...this is a long one

Reading the required bits of the University of Toronto Office of Research Ethics' Guidelines and Practices Manual while trying to write my proposal has been very informative yet also frustrating. So many things to consider! In my research I am looking at two age groups, and unfortunately one of those groups is teenagers, 13 to 18. So I don't just have to convince them to participate; I have to convince their parents or guardians to let them. For my proposal I have written a rough consent letter, which I shall post here. Please let me know if I've made any errors or left anything out...thanks!

Dear Parent/Guardian,

Your child has been invited to participate in an academic study run by a graduate student in the iSchool at the University of Toronto. This study will be supervised by Prof. Awesome of the U of T iSchool faculty and will follow all guidelines designated by the University of Toronto's Office of Research Ethics.

The purpose of this research is to investigate how a female's personal identity is internally constructed and perceived with the use of digital media. The study is examining two age groups of females, 13-18 year olds and 35-40 year olds, and comparing the results. We hope to gain insight into the thoughts of women introduced to technology later in life as compared to those born into it. This study will contribute to discussions surrounding women, identity and the use of digital media. Following David Gauntlett's "Making is Connecting" method (www.artlab.org.uk), the participants will be able to create visual representations of their identity both online and offline and then discuss and reflect on what they have created. The participants will spend 3 hours on a Saturday afternoon on the 5th floor of the University of Toronto Bissell Building and will be compensated with community service hours.

Your child has been approached because she is a female of the correct age and she uses digital media daily. If she chooses to participate, she will be working individually but discussing her creation in a group of 6 of her peers. In the session itself, the participants will sign confidentiality agreements with one another not to share others' personal thoughts and feelings.

Your child's participation is completely voluntary, and she may refuse to participate or withdraw at any time. However, upon withdrawal the allotted community service hours will not be given. Photographs will be taken of the visual representations created, and video recordings will be made of the discussion. The photographs will be used in the research findings presentation, but the video footage will not. Video recording will be used only by the researchers to identify the participants and link each person to what they say. The recordings are completely confidential and will not be used outside of transcribing the study. Some sections of the transcript may be used in the findings presentation, but no names or identifying characteristics will be attached to them. If you have any questions regarding any aspect of this study, please contact the Office of Research Ethics at ethics.review@utoronto.ca or 416-946-3273, or the researcher at aurianne.steinman@utoronto.ca.

Please sign below to allow your child to participate in this study.

Second research daisy

Since one of my first posts for this course was of a research daisy for my intended course of study, I can't resist posting a second daisy, one that is closer to what I will probably submit with my research proposal. This one illustrates the fields involved in my case study - of one aspect of the United States Government's open data system. It was done using OmniGraffle, the equivalent of Visio for Macs.

Yin on case study research

Since I'm looking at doing a case study for my thesis and we've been discussing Robert Yin in class, I borrowed his book entitled Case study research: Design and methods (2003) from the library. This book has been extremely useful in building the justification for a case study design and understanding/addressing the strengths and weaknesses of the approach. When defining the research approach, Yin notes that “a case study is an empirical inquiry that investigates a contemporary phenomenon within its real-life context, especially when the boundaries between the phenomenon and context are not clearly evident” (p. 13).

By pressing the importance of examining the phenomenon within its context, Yin contrasts case studies with, for example, experiments, where phenomena can be tested removed from their environment - in the laboratory context. This makes me think of Walsham's synthesized framework, which many of us studied in INF1003. In Interpreting Information Systems in Organizations, Walsham suggests different points of entry for IS research, including examining context, content and processes. Walsham also wrote a detailed methodological article called "Doing interpretive research" (2006). Because interpretive IS research fits well with the case study approach defined by Yin, I think that the two authors complement each other well. Chapter 7 of Knight's Small scale research (2002) also gives very practical advice for data collection and interaction with the research subjects.

Interestingly, Yin notes the weaknesses of case studies, particularly single-case studies, which do not benefit from the comparative element of multi-case studies. He explains that, like single experiments, single-case studies are vulnerable to misinterpretation and access issues. This loops back to last week's post, in which I briefly discussed the importance of, and anxiety related to, obtaining appropriate access to the subjects in the case study. I do think, however, that Yin's six sources of data for case studies can address these problems: documentation, archival records, interviews, direct observation, participant observation, and physical artifacts. When studying online information systems, some of this data can be collected through interaction with the system itself and by accessing publicly available records.

Going back to the question of studying phenomena within their environments, I am also reminded of Bruno Latour's commentary on the separation between lab and field research in science, in Pandora's Hope. As an outsider, he is struck by the abstraction and subjectivity necessary for lab studies in botany. He explains that a plant sample, for example, has no meaning outside of the context in which it has evolved and that the recall of the field researcher is necessary to fill in that context. When using the case study approach, it may thus be possible to reduce the gap between the case itself and the researcher's abstraction and categorization, as it is never removed from its context.