User and research

A critical aspect of user-centred research is understanding who the potential users are, especially because early user involvement is a primary principle of user-centred design. This can be extremely relevant to HCI research as well. HCI research, though, concerns the user experience and probable usability of a website, interface or software; it does not always involve user participation from the initial stages. User-centred processes, by contrast, try to include the actual users in the development process at the earliest possible time, so that the design corresponds to the needs and behaviors of the users.

As I progress with my proposal I realize how closely the method can be linked to the area of study. That is because my research findings are dependent on user feedback. My quest depends as much on how I embark on it as on the findings themselves, since questioning usability becomes an important part of the study too. When conducting user research, theorists recommend that several methods be used in order to obtain rich qualitative data that helps build a holistic view of the studied user group. Thus the most common methods include interviews, observations and questionnaires, with other methods such as cultural probes or artefact analyses being applied at times too. Interestingly, this kind of work also falls under the purview of exploratory research.

Murphy's Law

Knight brings up a very important point in this week's reading: technology and the human element have the potential to present very real headaches during the course of interview research! I'm positive that everyone has had issues with technology at one point or another. In my personal experience, I have forever been cursed with computers that have some hardware issue or another. Even the overpriced laptop I currently own locks up randomly due to a faulty processor. Due to these wonderful experiences I have learned to have data backed up to at least three hard drives/flash memory/etc. at any given time. I'm sure these habits are applicable in the research realm. Knight even mentions that data should be collected through several recorders at once.

The human element can be equally hit and miss. Knight says to “try and distribute tasks so as to use partners' strengths and avoid [the] weaknesses [of research assistants]” (p. 163). I'm sure everyone has been in group projects in which everyone contributes, and others in which no one seems to contribute. The only method that seems to work in the latter situation is to assign tasks (in my experience). I suppose that it is equally challenging to develop camaraderie with research subjects.


Ugh...Consent...this is a long one

Reading the required bits of the University of Toronto Office of Research Ethics Guidelines and Practices Manual while trying to write my proposal has been very informative yet also frustrating. So many things to consider! In my research I am looking at two age groups and unfortunately one of those groups is teenagers, 13 to 18. So I don't just have to convince them to participate, I have to convince their parents or guardians to let them. For my proposal I have written a rough consent letter which I shall post here, please let me know if I've made any errors or left anything out...thanks!

Dear Parent/Guardian,

Your child has been invited to participate in an academic study run by a graduate student in the iSchool at the University of Toronto. This study will be supervised by Prof. Awesome of the U of T iSchool faculty and will follow all guidelines designated by the University of Toronto's Office of Research Ethics.

The purpose of this research is to investigate how a female's personal identity is internally constructed and perceived with the use of digital media. The study examines two age groups of females, 13-18 year olds and 35-40 year olds, and compares the results. We hope to gain insight into the thoughts of women introduced to technology later in life as compared to those born into it. This study will contribute to discussions surrounding women, identity and the use of digital media. Following David Gauntlett's "Making is Connecting" method, the participants will create visual representations of their identity both online and offline and then discuss and reflect on what they have created. The participants will spend 3 hours on a Saturday afternoon on the 5th floor of the University of Toronto Bissell Building and will be compensated with community service hours.

Your child has been approached because she is a female in the target age range and she uses digital media daily. If she chooses to participate, she will work individually but discuss her creation in a group of 6 of her peers. In the session itself, the participants will sign confidentiality agreements with one another, promising not to share others' personal thoughts and feelings.

Your child's participation is completely voluntary and she may refuse to participate or withdraw at any time. However, upon withdrawal the allotted community service hours will not be given. Photographs will be taken of the visual representations created and video recordings will be made of the discussion. The photographs will be used in the research findings presentation, but the video footage will not. The video recordings will be used only by the researchers to identify the participants, linking each person to what she says. The recordings are completely confidential and will not be used outside of transcribing the study. Some sections of the transcript may be used in the findings presentation, but no names or identifying characteristics will be attached to them. If you have any questions regarding any aspect of this study, please contact the Office of Research Ethics at or 416-946-3273 and the researcher at

Please sign below to allow your child to participate in this study.

Second research daisy

Since one of my first posts for this course was of a research daisy for my intended course of study, I can't resist posting a second daisy, one that is closer to what I will probably submit with my research proposal. This one illustrates the fields involved in my case study - of one aspect of the United States Government's open data system. It was done using OmniGraffle, the equivalent of Visio for Macs.

Yin on case study research

Since I'm looking at doing a case study for my thesis and we've been discussing Robert Yin in class, I borrowed his book entitled Case study research: Design and methods (2003) from the library. This book has been extremely useful in building the justification for a case study design and understanding/addressing the strengths and weaknesses of the approach. When defining the research approach, Yin notes that “a case study is an empirical inquiry that investigates a contemporary phenomenon within its real-life context, especially when the boundaries between the phenomenon and context are not clearly evident” (p. 13).

By pressing the importance of examining the phenomenon within its context, Yin contrasts case studies to, for example, experiments, where phenomena can be tested removed from their environment - in the laboratory context. This makes me think of Walsham's synthesized framework, which many of us studied in INF1003. In Interpreting Information Systems in Organizations, Walsham suggests different points of entry for IS research, including examining context, content and processes. Walsham also wrote a detailed methodological article called "Doing interpretive research" (2006). Because interpretive IS research fits well with the case study approach defined by Yin, I think that the two authors complement each other well. Chapter 7 of Knight's Small scale research (2002) also gives very practical advice for data collection and interaction with research subjects.

Interestingly, Yin notes the weaknesses of case studies, particularly single-case studies, which do not benefit from the comparative element of multi-case studies. He explains that, like single experiments, single-case studies are vulnerable to misinterpretation and access issues. This loops back to last week's post, in which I briefly discussed the importance of, and anxiety related to, obtaining appropriate access to the subjects in the case study. I do think, however, that Yin's six sources of data for case studies can address these problems - documentation, archival records, interviews, direct observation, participant observation, and physical artifacts. When studying online information systems, some of this data can be collected through interaction with the system itself and by accessing publicly available records.

Going back to the question of studying phenomena within their environments, I am also reminded of Bruno Latour's commentary on the separation between lab and field research in science, in Pandora's Hope. As an outsider, he is struck by the abstraction and subjectivity necessary for lab studies in botany. He explains that a plant sample, for example, has no meaning outside of the context in which it has evolved and that the recall of the field researcher is necessary to fill in that context. When using the case study approach, it may thus be possible to reduce the gap between the case itself and the researcher's abstraction and categorization, as the case is never removed from its context.

The Morality of Researchers

One of my favourite insights into research methodologies has been Knight's remark that “the most important thing in small-scale research is to be mindful about the sorts of claims that research is intended to enable” (p. 114). While a popular arena for considering the impact of claims is the academic field (with peer review, for example), I've started to consider how research methods themselves impact the social arena (not through the practical translation of the results, but through how the methodology reflects society back onto itself). Isn't there a moral backbone required in claims-making, and if so, where does it come from? What instructs researchers about the value of their research claims? Not to suggest that researchers all value the same types of claims, because research values are as diverse and multiple as the individuals who hold them. But I am wondering where Luker's “imposition of a schema on the social world” (p. 214) comes from, and whether its ontology matters.

Our Research Question, and Hypotheses of the Project

Before we begin writing a grant proposal, we should take some time to map out our research strategy. A good first step is to formulate a research question: a statement that identifies the phenomenon to be studied. For example, “What resources are helpful to new and minority drug abuse researchers?”
To develop a strong research question from our ideas, we should ask ourselves these things:
  • Do I know the field and its literature well?
  • What are the important research questions in my field?
  • What areas need further exploration?
  • Could my study fill a gap? Lead to greater understanding?
  • Has a great deal of research already been conducted in this topic area?
  • Has this study been done before? If so, is there room for improvement?
  • Is the timing right for this question to be answered? Is it a hot topic, or is it becoming obsolete?
  • Would funding sources be interested?
  • If you are proposing a service program, is the target community interested?
  • Most importantly, will my study have a significant impact on the field?
Think about the potential impact of the research we are proposing. What is the benefit of answering our research question? Who will it help (and how)? If you cannot make a definitive statement about the purpose of your research, it is unlikely to be funded. A research focus should be narrow, not broad-based. For example, “What can be done to prevent substance abuse?” is too large a question to answer. It would be better to begin with a more focused question such as “What is the relationship between specific early childhood experiences and subsequent substance-abusing behaviors?”

Luker vs Knight

The combination of the readings this week was really interesting, as both Luker and Knight discuss very similar things. So similar, in fact, that it really shows how much their writing and teaching styles diverge from one another. Even the titles of the chapters are indicative of the differences: Luker's 'Getting Down to the Nitty-Gritty' versus Knight's 'Research Design: Bringing it All Together'. Both discuss the synthesis of the research and method elements that we have learned about thus far. While Luker and Knight both use helpful anecdotes in giving examples, they part ways when it comes to their chosen teaching method.

Luker tends to coddle the reader a bit, repeatedly acknowledging any negative feelings they may have about their own research. She discusses anxiety and reiterates important questions and information over and over again. Luker is a bit of a hand-holder, which I totally appreciate, as most of us are going into unknown territory and may not even know exactly what we are studying until the end of our study. It is as if she is prepping us to take a leap of faith. Luker also leaves a lot of room for the creativity of the researcher by making only loose suggestions about what to do, how to do it, and what has worked for her.

Knight on the other hand is a lot more pragmatic in all matters of research design and implementation. He provides great detail and excellent definitions for all aspects of small-scale research. Knight injects just enough charm so the reader does not die of dehydration. He does cram an insane amount of information into each chapter, so much so that it can seem a bit overwhelming. He is quite a bit more objective in his teaching method and writing style than Luker.

The truth is, I go to Luker when I'm feeling confused and like I don't really know what I'm doing; reading her is like getting a big hug and a pat on the head. I go to Knight for serious information and guidelines when I already have an idea of what I want to do.

I realize that the different ways Luker and Knight tackle small-scale research are the reason both of these textbooks were chosen for this class. I just find it really interesting how two people took basically the same method and design information and created such different but equally helpful artifacts.

Cronbach's alpha and research anxiety

As I was going through Luker (2008) today wondering what I might write about for my blog post, I hit upon a completely new concept: Cronbach's alpha, a coefficient of internal consistency. As Luker explains, once you've coded your data and created a coding book (a list of which codes relate to which themes and sub-themes), you ask someone who doesn't know your research to use it to code a sample of your data. Then you run the test on their coding, alongside your own coding of the same data, and get a measure of how consistently the coding scheme has been applied.

Because I'm not a mathematician, I imagine this could be done with SPSS. In fact, in an FAQ document, SPSS shows the formula "for conceptual purposes" before giving more familiar (to me) screen shots of what this would look like in the program.

I liked Luker's coding by hand methodology, but wonder how long it would take to input the data for the blind-coded sample into SPSS in order to run the test. It might be easier to sit down with a "for dummies" book and learn how to do it myself.
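As it turns out, the formula itself is small enough that it doesn't strictly require SPSS. Here is a minimal Python sketch of the standard Cronbach's alpha calculation, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The data is entirely made up for illustration: two coders scoring the same five passages on a 1-5 scale, with each coder treated as an "item".

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a table of scores.

    scores: list of rows, one per observation (e.g. coded passage),
    each row a list of item scores (e.g. one score per coder).
    """
    k = len(scores[0])  # number of items (columns)

    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Invented example: two coders rating the same five passages.
data = [[4, 4], [3, 3], [5, 4], [2, 2], [4, 5]]
print(round(cronbach_alpha(data), 3))  # -> 0.894
```

The higher the coefficient (conventionally above 0.7 or so), the more consistently the coding book is being applied; a low value would suggest the codes are ambiguous and need revision.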

What attracted me to this method is not so much, as Luker posits, that it would legitimize my research to canonical social scientists, but rather that it would keep me accountable and disciplined during the analysis process. Working towards a logical, clear coding book would keep my thoughts in order, and seeing the coefficient (which, we hope, would be high) might relieve my own anxieties about the research process.

In fact, I've been having some anxieties about my research as I've been working on my research proposal. I started the blog post with a remedy for analysis anxiety, but there is also data collection anxiety. I have an exciting case study in mind. How do I ensure that I gain entry to my research participants? Walsham (2006), in a paper about interpretive research methods for the study of information systems, devotes a section to the social skills of researchers and places importance on being liked and respected by the participants. Similarly, Luker discusses the need to make your research relevant to the participants and to show reciprocity. When she gained access to an abortion clinic that she wanted to study, she donated blood in exchange. It sounds like a good idea, but I'm not sure I have enough blood for the number of interviews I would like to do. I imagine that what Walsham and Luker describe are simply normal social relations - why would our relationships and interactions with other human beings be any different because we are doing research? In any case, I can see why Luker discusses researcher anxiety near the end of her book. During my research journey, I will want to test my process many, many times, primarily to reassure myself that my work is sound.