Interviewee selection criteria

Our FCL group has been asked to participate in the mid-summer check-in for the Oregon Sea Grant Summer Scholars Program. Members of our group will be giving a 2-hour seminar for the six undergraduate students participating in the program. The workshop will be about science communication and outreach, and I have been helping with the planning process. Consequently, I have been thinking a lot about science communication and its frequent association with the “broader impacts” components of research grants. What would be important to include in such a workshop to introduce the debate around science communication to these young scholars at the beginning of their careers?

If science education needs reform, how important is it for educators to partner with scientists in order for that reform to occur? I think it is very important, but mostly once science outreach comes to be viewed as more than a voluntary activity with tangential benefits for scientists and takes on broader significance for them. Thinking interpretively, this will only be possible when outreach and science education opportunities accommodate their interests, time, and talents. Sooner or later, every scientist will be required to engage in some sort of outreach, but the key is whether the role they fall into is one they feel comfortable with.

In the Fall 1998 newsletter of the Forum on Education of the American Physical Society, Rodger W. Bybee and Cherilynn A. Morrow (1998) wrote about “Improving Science Education: The Role of Scientists” and presented a matrix that sorts out the roles scientists could or do play in science outreach. The roles were classified across formal and informal educational settings and fell into one of three categories: Advocate, Resource, and Partner. For example, if a scientist assumes the role of advocate within an informal education setting such as a science center, he or she could serve on the board and participate in decision making. If a scientist chooses to be a resource, he or she can review science content in exhibits or programs, give a talk at a science center, and so on. As a partner, a scientist would collaborate on the creation of an exhibit or program from the start. Here is the link for this article:

This matrix of possible scientists’ roles in outreach and science communication is an important resource for the proposed workshop. I think it is imperative for young scientists to understand the possibilities for involvement, the possible venues, and the roles they may find themselves in someday. But I have also come to think it is very important that these young scientists reflect on who they are and how their talents can best fit within the matrix. Are they advocates, resources, or partners? Regardless, they need to feel comfortable in their roles in order to contribute effectively to science education reform.

As the next crop of scientists graduates from universities, what role will they see themselves playing within science outreach and communication? Do they see themselves in an outreach role at all? Motivations should not only be external, such as the requirements of a grant-funded project, but also internal, such as relevance and usefulness within a scientist’s work scope and interests. Below is some more food for thought on the subject:

Thiry et al 2008

Halvesen & Tran 2011

Larsen et al 2008

MarBEF article

Where does science come from?

As I work toward a coherent research question for my dissertation, I find myself challenging assumptions I have never dealt with before. One is that visitors trust the science being presented in museums. There is a lot of talk about learning science, public understanding of science, public engagement, and so on, but trust is frequently glossed over. When we ask someone what they learned from an exhibit, we don’t also ask how reliable they feel the information is. Much as within the various fields of science, there is an assumption that what is being presented is accurate and unbiased in the eyes of the visitor.

Along the same lines, it is also frequently assumed that visitors know the difference between good science, bad science, pseudoscience, non-science, science in fiction, and science fiction, and that this is reflected in their experiences in science museums. Especially in the internet age, where anyone can freely and widely distribute their thoughts, opinions, and agendas, how do people build their understanding of science, and how do these various avenues of information affect trust in science? Media sources have been exposed in scandals where false “science” was disseminated. Various groups deliberately distort information to suit their purposes. In this melee of information and misinformation, are science centers still viewed by the public as reliable sources of science information?

Getting those research technology ducks in a row

Since defending and getting back into our lab duties full time, Katie and I realized today that there are a lot of tasks to tackle for the lab before the summer break! Basically, it’s all about getting the lab’s ducks in a row and putting in place procedures for both research and equipment that will help the lab move forward and ease transitions as new students and new scholars join the lab.

First, we’ve hit a space crunch with the visitor center tech storage, so the initial task is to clear out unused or unwanted pieces we have been holding on to, in order to minimize our space needs and make room for tech development and new equipment in the future. Second, we need to inventory everything we have so far. There are many pieces of cameras, mics, etc. that we have experimented with and tested that need to be accounted for and documented. This is an essential step because it will help us decide what is still missing from our suite of tools and put equipment loan procedures in place to protect the quality and security of the equipment. Third, there are research agendas that need developing. These will determine the next stages for getting research and data collection moving in the visitor center within the overarching lab agenda, and help drive the lab’s technology development for the future. They will also help us plan next steps for ongoing IRB applications. Research tools aren’t much good if you’re not sure how you will use them for research. Lastly, there are the next steps themselves: we will be making data collection and technology development plans for the upcoming months to help build research “game plans” and task lists for the future.

All in all, it’s quite an exciting time for the lab, and personally I love all this organizing and “cleaning house.” Knowing where you are in a project and where you are going makes, in my opinion, for great future products.

Abandoned Ship

Last weekend there was a wonderful free-choice learning event in Lincoln City, Oregon: the Remotely Operated Vehicle Competition. It was great fun to watch and to serve as a judge. The event is sponsored by the Marine Advanced Technology Education Center and numerous local and national sponsors. The most interesting thing to me is the level of excitement that surrounds these events among all involved. Today, however, I am going to write about one particular participant from last Saturday’s event. This sophomore chaired his team for the ROV portion of the competition, which meant they were competing for the only slot to move forward to the international competition, along with prize money to help offset costs. He experienced a series of events on Saturday that would most likely make any person, young or old, walk away from the competition. In my mind, his actions truly embodied not only what it means to be a good sport, but also the spirit of free-choice learning.

First of all, during the briefing it became clear that another team his team was competing against had not brought all their materials, nor had they read the rules. He instantly offered to share his supplies and printed materials with them, which he was not required to do. When the head judge said he did not have to do that, since the instructions were clear online, he replied, “It’s all for learning and fun, isn’t it? Am I allowed to share?” We said sure. Next, his team members did not show up. This meant he would be instantly disqualified if he did not have at least one more person with him “on deck” for the trials and the competition, so he enlisted the help of a family member. The judges told him that he still most likely would not advance, as the team had changed since the date of submission. He said okay, but asked if he could still go through the event. Yes was the answer. Then his ROV did not meet specs. He was given 20 minutes to alter it; he did, and it passed. He proceeded with the trials and placed higher than I thought his ROV could achieve. Impressive driving for the limited machine. And that is not all: he watched other competitors, cheered on the younger ones, walked around and read the posters that other teams had produced, and encouraged the other teams throughout the event. When chatting with him, he remarked how much fun this was and how much he was learning, all by his own choice! He didn’t win, and he didn’t make the paper, but his actions stood out enough that he was voted to receive a Spirit Award he didn’t even know existed. Congratulations, “Abandoned Ship”!

The “Transformative” Process of Analyzing Qualitative Data

Hi all!

I have been doing some reading for my Advanced Qualitative Methods class and ran into some interesting remarks about the challenges of qualitative data analysis. I thought I would share them with you. If you have yet to dive into data analysis for your projects, I think these are good references to have, as they offer many strategies for coping with the challenges of analyzing qualitative data.

The readings brought forth the idea that the steps and rationale of qualitative data analysis are often obscured in research reports. There is no widespread understanding in the field as to how qualitative analysis is to be done. Can there ever be such an understanding? Given the very nature of qualitative analysis, no single cookbook is possible, but various researchers have proposed strategies that have proven helpful in aiding the analysis of data.

Bulmer (1979) discusses concept generation, referring to previous work by researchers who attempted to address the “categorization paradox” and the problem of validating concepts defined and used in qualitative analysis. The “sensitizing” concepts of Blumer, the “analytic induction” of Znaniecki, and the “grounded theory” of Glaser and Strauss are all, within their limitations, sources of insight for thinking about concept validation, as they bring forth the importance of conceptualizing in a way that is faithful to the data collected. I believe this was important to the development of inductive research in more rigorous ways that allowed for appropriate generalizations.

Since then, other publications have emphasized the practice of qualitative data analysis and strategies to consider along the way (e.g., Emerson et al. 1995; Lofland et al. 1984; Weiss 1995). Progress has been made in discussing concerns about faithfulness to the data and its interplay with the subjectivity of the researcher. I particularly like the Lofland et al. (1984) definition of analysis as a transformative process, turning raw data into “findings/results.” Here the researcher is a central agent in the inductive analysis process, which is highly interactive, labor intensive, and time consuming, and therefore requires a systematic approach to analyzing data in order to account for the interplay between the data and the researcher-produced theoretical constructs. The authors suggest several strategies to use while analyzing data, two of which I would like to elaborate on here: normalizing and managing anxiety, and memoing.

I have read many qualitative methods materials, and they all discuss the need for the qualitative researcher to recognize and be aware of his or her subjectivity in the course of preparing for, conducting, and writing about a research problem. Lofland et al. (1984) touch further on a point that I now believe to be key to subjective interference in data analysis: the issue of researcher anxiety. At first it seemed an overstatement, but the more I read, the more substance I found in the issue. Understanding a social situation is no easy task and requires an open-ended approach that can cause much anxiety as the researcher confronts the challenge of finding significance in the materials. Ethical and emotional issues come into play in the midst of making sense of and organizing a rapidly growing body of data, and they can negatively affect the research experience if not dealt with properly.

The authors emphasized five anxiety-management principles for researchers: 1) recognize and accept anxiety; 2) start analysis during data collection; 3) be persistent and methodical; 4) accumulate information, which at a minimum ensures some content to talk about; and 5) discuss with others in the same situation. These strategies really addressed my worries about the process of data analysis. High emotions, fears, and wanting to quit are all part of the anxiety reactions I have been feeling myself. I believe starting early and being methodical and persistent are key strategies for dealing with anxiety, because they assure you have time to address the challenges and make changes without becoming so frustrated in the course of doing so.

If one starts early, initial coding can be done before focused coding begins, giving the researcher time away from the data that may be needed to reduce anxiety. Early coding also makes early memos possible, which can help clarify connections along the way and ensure that persistence prevails thanks to observable progress. I believe memos are the start of the “transformative process” that Lofland et al. (1984) were referring to in defining data analysis. They are the bridge between the data and the researcher’s meanings, a first draft of a completed analysis where the interplay between data and theoretical constructs takes place. Consequently, writing memos becomes necessary rather than optional.

Both Lofland et al. (1984) and Emerson et al. (2011) extensively discuss the memoing process. Operational memos are notes to self about research procedures and strategies. Code memos clarify assumptions underlying written codes. Theoretical memos record the researcher’s ideas about the codes and their relationships. These are the memos that can take place even before coding starts, and they provide the basis for the “integrative” memoing that Emerson et al. (2011) refer to when they discuss identifying, developing, and modifying broader analytic themes and arguments into narrower, focused core themes. Furthermore, while Lofland et al. (1984) explore the art of writing memos, Emerson et al. (2011) emphasize the “reading” of memos, noting the benefit of reading notes as a whole, and in the order they were written, to this integrative process of making meaning. This adds a fourth layer of subjectivity to the layers of observing, deciding, and writing about a phenomenon: the layer of reading one’s notes and making sense of them.

In the course of doing so, the researcher’s assumptions, interests, and theoretical commitments influence analytical decisions. In this sense, data analysis is not just a matter of “discovering” but a matter of giving priority to certain incidents and events in the data materials in order to understand them in a given case or in relation to other events. This idea is interesting to me because I used to think of theoretical constructs as emerging from the data in a process of discovery, and now I see it as a process of immersion. The researcher not only can immerse him- or herself in the phenomenon being studied during data collection, but is also immersed during data analysis, as these inseparable subjective decisions shape the theoretical constructs. While I still think there is an aspect of discovery, it is somewhat created rather than naturally occurring.

In sum, there are several methodological attempts to clarify the logic of qualitative data analysis. However, the use of such guidelines and strategies is not very transparent in research reports, and one may be left wondering how the data analysis was actually done and how exactly the concepts came to be in a given study. Nevertheless, such methodological strategies strongly emphasize the interplay between concept use and empirical data observation. Although a logical process does take place in analysis, and it is indeed crucial to the systematization of ideas and the formation of concepts, it seems to me this process is as logical as the researcher makes it within his or her sociological orientation, the substantive framework of the study, and the nature of the phenomenon under study. In this sense, nothing is really created but rather transformed through a logical theorizing process that is unique to the research in question. Nothing is discovered by chance; qualitative analysis is rather an “analytical” discovery.


Bulmer, M. (1979). Concepts in the analysis of qualitative data. Sociological Review, 27(4), 651-677.

Emerson, R. M., Fretz, R. I., & Shaw, L. L. (1995). Writing ethnographic fieldnotes. University of Chicago Press, Chicago, IL.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Aldine de Gruyter.

Lofland, J., Snow, D., Anderson, L., & Lofland, L. (2011). Analyzing social settings: A guide to qualitative observation and analysis. Wadsworth.

Weiss, R. S. (1995). Learning from strangers: The art and method of qualitative interview studies. Simon and Schuster, New York.

Metal and culture

This post will be a light one, as most of my waking—and non-waking—hours are now occupied by a very small person who emerged from my wife recently. This very small person falls asleep when I play a certain type of music at a low volume, which got me thinking.

What makes a thing or circumstance “metal?” I’m not referring to metal in the material sense, but in the cultural and aesthetic sense. “Metal” as in “Slayer,” not “metal” as in “aluminum.” It’s a tough question I often amuse myself with, but it does have some relevance to my work as I wait to collect data.

The target audience for my game project is adult tabletop gamers, and I’ve observed a significant overlap between the tabletop gamer/metalhead communities of practice. I think it has something to do with an affinity for dragons and medieval imagery, but that’s conjecture on my part. I’m a very enthusiastic but somewhat peripheral participant in both areas.

I’ve had difficulty identifying the exact criteria used to determine whether something is metal, yet it’s fairly easy to reach consensus as to what is or is not metal. It would be easy to call it a subjective assessment, but that doesn’t appear to be the case. The criteria are hard to pin down, but there’s a high degree of intersubjectivity here nonetheless. This is what intrigues me.

“Metalness” is a valuable, if not strictly necessary, aesthetic attribute for a large potential audience segment for my work. Ian Christe’s “Sound of the Beast” is a good primer on metal music and culture. Sam Dunn has done some work on metal as a cultural force and musical form, constructing a handy “heavy metal family tree” and several documentaries.

Aquarist Sid defined it rather succinctly: “Metal is black. Metal is contrast.” He elaborated that contrariness is an important aspect of a thing’s metalness. Volunteer coordinator Becca noted the importance of pain, while her husband cited common elements like death, depression, long hair, distorted guitars, double bass drum work and “long Scandinavian winters.”

What do you think? How would you define metal, musically and aesthetically? Can you give an example? What purpose do metal and its meanings serve to the audience(s)?

Let’s talk.