Comps readings this week
Hess, D. J. (1997). Critical and cultural studies of science and technology. In Science studies: An advanced introduction (pp. 112-147). New York: New York University Press.
Not useful in the sense that I'll run out and apply things, but useful as an overview and discussion; it's really more conversational than anything. I'm not at all sure the title really fits the content. His version of culture is definitely different from standpoint theory or critical race theory or the like... he's really talking about a continuation of STS after SSK and into more recent times. He hits the gender/sex thing and public understanding of science, as well as mentioning ethnographic studies like Traweek's and dropping some famous names.
Bishop, A. P. (1999). Document structure and digital libraries: how researchers mobilize information in journal articles. Information Processing & Management, 35(3), 255-279. doi:10.1016/S0306-4573(98)00061-2
...current manner in which the content of articles is constrained to a traditional, linear structure is an artifact of both the technology of printing and accepted beliefs about the scientific method that prevailed in the seventeenth century. Kircz [1998] argues that the electronic environment, where storage and presentation are no longer integrated, is hospitable to shattering the linear structure of the article so that research reporting more clearly serves the needs of research readers (p. 217):
... we have reached the stage where comprehensive communication no longer needs a linear build-up of a single document. A complete set of modules, each being in themselves (small) texts emphasizing aspects of the message that together establish a complete message from the author to reader, is the next natural step in scientific communication.
modules - that sounds like blog posts :) This article gathers together findings from a whole bunch of different studies of how people use the various parts and pieces of journal articles. It's hard to separate issues with the content and the interface from the differences the researchers were looking for, but she does describe various uses of the metadata, headings, tables and images, etc., to get an overview/orientation, to judge relevance, and sometimes in place of reading the article. Unfortunately, the users really didn't take to their interface, so there weren't demonstrated implications for interface design.
Thelwall, M. (2007). Blog searching: The first general-purpose source of retrospective public opinion in the social sciences? Online Information Review, 31(3), 277-289
Sort of a general overview of, and some approaches to, searching blogs for social science research using commercial blog search tools, both subscription services and free ones like Google blog search. As he points out, blogs are one of the few sources of retrospective or historical opinion information recorded as things were happening. Nothing new to see here, but I had to include some blog search articles because my committee doesn't recognize my street cred in that regard.
Mishne, G., & de Rijke, M. (2006). A study of blog search. Advances in information retrieval (LNCS 3936) (pp. 289-301). New York: Springer. DOI: 10.1007/11735106_26
More evidence that people treat google boxes differently based on where they are, what they need, and what they expect to find in the collection (see the discussion of Wolfram's article last week; yes, it's obvious). The authors took the transaction logs from Blogdigger from May 2005 (hey, Blogdigger's still around, that's a surprise). They categorize the queries into filtering and ad-hoc. Filtering is setting up an alert. 81% of all queries were filtering, but once you look at unique queries only, filtering queries are just 30%. Terms per query is pretty much the same as in regular web search, but filtering queries are much shorter (2.71 terms/query for unique ad-hoc queries vs. 1.98 for filtering). They tried to make the standard categorization work - informational, navigational, transactional - but it doesn't really fit these queries, so they looked at 1000 random queries, half of each type. They come up with context queries, which answer the question "in what context does this thing appear?", and concept queries, which locate blogs on a topic. They go on to look at percentages and then popular queries. People really were looking for different things on the blog search engine than they did on regular search engines. For one thing, they searched for news and named entities a lot more. So this was pretty interesting.
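For my own benefit, here's a rough sketch of the kind of log breakdown they report - shares of filtering vs. ad-hoc queries over all queries and over unique queries only, plus mean terms per query. The log records and field names are entirely made up for illustration; this is not Blogdigger's actual log format or the paper's data.

```python
# Toy query-log analysis in the spirit of Mishne & de Rijke (2006).
# Records and field names ("type", "query") are hypothetical.
log = [
    {"type": "filtering", "query": "london bombings"},
    {"type": "filtering", "query": "ipod"},
    {"type": "adhoc", "query": "star wars episode iii review"},
    {"type": "adhoc", "query": "cnn"},
    {"type": "filtering", "query": "ipod"},  # repeated filtering query
]

def share(records, qtype):
    """Fraction of records with the given query type."""
    return sum(r["type"] == qtype for r in records) / len(records)

def mean_terms(records, qtype):
    """Average number of whitespace-separated terms per query of a type."""
    qs = [r["query"] for r in records if r["type"] == qtype]
    return sum(len(q.split()) for q in qs) / len(qs)

# Deduplicate on (type, query) to contrast "all" vs. "unique" queries,
# the distinction that flips the filtering share in the paper.
unique = list({(r["type"], r["query"]): r for r in log}.values())

print(share(log, "filtering"), share(unique, "filtering"))
print(mean_terms(log, "adhoc"), mean_terms(log, "filtering"))
```

The point of the toy data is just the mechanism: repeated filtering queries (standing alerts) inflate the filtering share when you count every log entry, which is why the paper's 81% drops to 30% on unique queries.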
Leydesdorff, L., & Vaughan, L. (2006). Co-occurrence matrices and their applications in information science: Extending ACA to the web environment. Journal of the American Society for Information Science and Technology, 57(12), 1616-1628. DOI: 10.1002/asi.20335
We join this debate currently in progress... So there's a pile of papers about which similarity measures to use (Jaccard, cosine, Pearson...) and how to go about it for co-citation or author co-citation (ACA) work. This paper is in that family, but different. The authors discuss the difference between starting from the asymmetrical matrices (where the columns are cited papers or authors and the rows are citing papers or authors) vs. using a symmetrized version, cited x cited, where each cell holds the number of times the two co-occur. A proximity measure shows the co-occurrences and can be entered directly into multidimensional scaling (MDS) algorithms. You can get from the asymmetrical matrix to the proximity one either by doing a correlation-type thing (Pearson for metric similarities, Spearman for ranks...) or by multiplying the matrix by its transpose. Farther on in the paper they basically say that you can ditch MDS and just use Pajek to lay out the network... So I guess I'm a little confused. I get how you aren't supposed to do the correlation after you've symmetrized, but I don't know when to do correlation vs. multiplying by the transpose. Maybe a computing power issue depending on the size of the network? I guess correlation makes sense if you want the correlation coefficients for some other reason...
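To keep the two routes straight in my head, here's a toy numpy sketch of both: symmetrizing the asymmetric citing-x-cited matrix by multiplying it with its transpose, vs. correlating its columns directly. The 4x3 matrix is invented for illustration, not the paper's data.

```python
import numpy as np

# Rows = citing papers, columns = cited authors (toy example).
# A[i, j] = 1 if citing paper i cites author j.
A = np.array([
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
    [1, 1, 1],
])

# Route 1: co-occurrence counts via the transpose. Off-diagonal entry
# (i, j) is the number of citing papers citing both author i and author j;
# the diagonal holds each author's total citations.
cooc = A.T @ A

# Route 2: Pearson correlation between the cited-author columns of the
# *asymmetric* matrix, giving a similarity matrix one could feed to MDS.
pearson = np.corrcoef(A, rowvar=False)

print(cooc)
print(np.round(pearson, 2))
```

Both routes land on a symmetric cited-x-cited matrix; the difference is that the transpose product keeps raw counts while Pearson normalizes each author's citation profile, which is exactly the choice the similarity-measure debate is about.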
At this point, I think I've now read all of the things I hadn't read before, and I'm back to reading things I've read at some point. I'm concentrating on things that I don't remember too well, or read a long time ago (started library school in 2000!), and are more important. Of course, reading things now is different from reading them when I was coming from a science undergrad and work experience outside of libraries, you never dip your foot into the same river twice. I'm going to try to pay more attention to tying things together, too, to get ready for the actual exam.
Pettigrew, K. E., Fidel, R., & Bruce, H. (2001). Conceptual frameworks in information behavior. In M. E. Williams (Ed.), Annual Review of Information Science and Technology (ARIST) 35 (pp. 43-78). Medford, NJ: Information Today.
Ignoring the little defensive bit about how we are much more theoretically inclined than we used to be and how information behavior research really does use theory (not so much)... I think the most valuable aspect of this article is how it compares and contrasts the cognitive view of information behavior with the social view. The cognitive view was really hot in the late 80s and 90s, partly in response to Dervin and Nilan's call for it. This work - like Kuhlthau's model - addresses how individuals perceive information need, seek information, and use information based on their knowledge structures and other aspects, in context but separate from other people. The social view came slightly later, and deals with the impact that a person's place in a social network, or the relationships they have, has on their perception of information need, their decision to seek information, the sources from which they can seek it, and how they use it. Interesting, and frequently overlooked, is the research covered here on decisions not to seek or attend to information that might be useful, based on social or other factors (blunting in the medical literature, or Chatman's findings about self-protective behaviors and world views). They spend less time on organizational approaches - and actually, I didn't realize I'd read about these before I read some of the structuration stuff a few weeks ago.
Of course now, with fMRI and such, we are trying to get into the black box of cognitive processes, so to the psychological, social, organizational, and other levels of analysis we also add the chemical/electrical/biological/physiological - but information scientists aren't using these much yet.
Schramm, W. (1971). The nature of communication between humans. In W. Schramm & D. F. Roberts (Eds.), The process and effects of mass communication (Revised ed., pp. 3-53). Urbana, IL: University of Illinois Press.
Good stuff here. He discusses the 1950s view of communication - the "Bullet Theory of Communication," in which audiences were passive and you just had to target them to change their minds and motivate action. Funny how this still comes up sometimes; people still don't get it.
Labels: comps