the other kind of peer review

I think a lot about peer review, but it’s almost all about the journals side of things – the related-but-not-the-same issues of open access and peer review.  By that I mean what is called “editorial peer review,” the term that distinguishes it from peer review in the grants/funding world – a kind of peer review that is probably much more important to a lot of people than the journals-specific kind.

But a couple of recent notes about the other kind of peer review jumped out at me and connected – what do these, taken together, suggest about how we – beyond higher ed, as a society and a culture – value knowledge creation?  Or maybe what I really mean is: what do they suggest about how we should value knowledge creation?

First, there’s this note today from Female Science Professor.  She’s responding to an article in Slate, and it’s the piece she’s responding to that interests me here as well – the amount of time that faculty in different disciplines (and in different environments) spend writing proposals to get funding for their research.  The Slate article includes a quote suggesting that med school faculty at Penn spend half their time writing grant proposals.  That number has increased, it goes on to suggest, because of the effort to get in on stimulus funding.

The comments, with a few exceptions, suggest that the 50% number is not out of line in that environment.

So that, connected with this item from EcoTone last month – has to make you think, right?

(quoting the abstract of an article in Accountability in Research)  Using Natural Science and Engineering Research Council Canada (NSERC) statistics, we show that the $40,000 (Canadian) cost of preparation for a grant application and rejection by peer review in 2007 exceeded that of giving every qualified investigator a direct baseline discovery grant of $30,000 (average grant).
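
Stripped down, the comparison in that abstract is per-head arithmetic.  Here’s a back-of-the-envelope version in Python – the $40,000 and $30,000 figures come straight from the quoted abstract, while the applicant count is invented, purely to show the shape of the comparison:

```python
# Back-of-the-envelope version of the NSERC claim quoted above.
# The per-application and per-grant figures come from the abstract;
# the applicant count is hypothetical.
applicants = 1_000                # hypothetical pool of qualified investigators

cost_per_application = 40_000     # preparing + peer reviewing one proposal (CAD)
baseline_grant = 30_000           # average discovery grant (CAD)

competition_cost = applicants * cost_per_application
direct_funding_cost = applicants * baseline_grant

print(f"Running the competition: ${competition_cost:,}")
print(f"Funding everyone directly: ${direct_funding_cost:,}")
# Whatever the pool size, $40K per head > $30K per head – the competition
# machinery costs more than simply giving every qualified applicant a grant.
```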

Obviously, there are stark differences in scope and scale between these disciplines.  Also obviously, the process of writing grant proposals isn’t entirely divorced from the goal of knowledge creation – the researcher undoubtedly benefits from going through the process, and the project benefits from the work done on the proposals – in some ways.

In other ways, though, the proposals are undoubtedly a distraction, and the process becomes more about the process than about the knowledge creation.  No solutions offered here, not even a coherent articulation of a problem – it just makes you wonder what it says about us when, within the knowledge creation process itself, the problems and issues of getting funding take precedence over the problems and issues connected to the direct experience of creating new knowledge.

talking on the web

What is it about spring term that it always ends up being overloaded?  Sometimes it is travel, and this term definitely has its share of that – informational visits to the University of Minnesota and the University of Wisconsin-Madison, WILU in Montreal, and an insane 30-hour total trip across 3 time zones for a college reunion.  But unlike other travel-crazy terms, this time around it’s the presentations that have me stuck in that “there’s always something more to be working on” feeling.

Upcoming – my first real forays into web-based presentations.

First, there is this one with Rachel: Social Media and the Ethics of Information.

Then a few days later, Kate and I are going to do a version of the Peer Review 2.0 talk as a professional development workshop for community college libraries in Seattle.

Given budget realities for all of us, I would expect that this form of presentation and sharing will only become more important, so I am excited to try it out.  But I’m also nervous.  I’ve definitely sat in on online presentations where the content and/or presentation style didn’t translate very well.  And it’s not something you can practice, or at least I haven’t figured out how; every run-through feels even more artificial than practicing a traditional presentation in front of the mirror.

(Not that I’d know anything about that – I am a practice-while-driving type of presenter)

So yeah, if you’ve ever sat in on a really great webcast presentation, or a really bad one, I’d love to hear what works and what doesn’t.

it’s the math

I’m not sure that even my tendency to see information literacy connections everywhere will explain why I’m posting this, but I just thought it was really interesting.  This morning, I got pointed to this article (via a delicious network), which argues that hands-on, unstructured, discovery-based learning doesn’t do the trick for many science students at the secondary level.  Using preparedness for college science as the definition of success, the study found that most students are more successful if their high school science learning is significantly structured for them by their teachers.

Structure More Effective in High School Science Classes, Study Reveals

What jumped out at me here was that the reason seemed to be linked to the math – students with good preparation in math did benefit from unstructured, discovery-based learning.  And then there was a “similar articles to this one” link at the bottom of the page, pointing to another study, making another point – one which supports this idea too (which is not hugely surprising, because both items point to different papers by the same researchers).

You do better in college chemistry if you’ve taken high school chemistry, better in physics if you’ve taken physics – but the one big exception to the “success in one doesn’t generalize” argument?  You do better in everything if you’re well-prepared in math.

College Science Success Linked to Math and Same-Subject Preparation

After that there are more “articles like this one” links, leading to articles about middle-school math teachers in the US being really ill-prepared, or about gender and math and science, which really got me thinking about further implications of those findings – if math is such a lynchpin.  So there is something there about how this dynamic, browsable environment makes your brain work in ways that make research better.

There’s also something there about context – getting the “math teachers aren’t prepared” article in the context of the “math is key” research made the significance of the former clearer, made how I could *use* that research much clearer than it would have been if I had come upon it alone.  There’s also something there about the power of sites like ScienceDaily (and ScienceBlogs, and ResearchBlogging.org, and others) to pull together research and present it in an accessible way, in spaces where researchers/readers can make those connections.

And there might even be something there about foundational, cognitive skills that undergird other learning. But mostly, I just found it interesting.

—————

Studies referenced were reported on here:

Sadler, Philip M. & Tai, Robert H.  The two high-school pillars supporting college science (Education Forum).  Science, 27 July 2007: Vol. 317, No. 5837, pp. 457-458.  DOI: 10.1126/science.1144214  (paywall)
Tai, Robert H. & Sadler, Philip M.  Same science for all?  Interactive association of structure in learning activities and academic attainment background on college science performance in the USA.  International Journal of Science Education, Vol. 31, No. 5, March 2009, pp. 675-696.  DOI: 10.1080/09500690701750600

what do huge numbers + powerful computers tell us about scholarly literature? (peer-reviewed Monday)

ResearchBlogging.org

A little more than a month ago, I saw a reference to an article called Complexity and Social Science (by a LOT of authors).  The title intrigued me, but when I clicked through I found out that it was about a different kind of complexity than I had been expecting.

Still, because the authors had made the pre-print available, I started to read it anyway and found myself making my way through the whole thing. The article is about what might be possible with powerful computers able to crunch lots of data – what might be possible for the social sciences, not just the life sciences or the physical sciences. The reason it grabbed me was this –

Computational social science could easily become the almost exclusive domain of private companies and government agencies. Alternatively, there might emerge a “Dead Sea Scrolls” model, with a privileged set of academic researchers sitting on private data from which they produce papers that cannot be critiqued or replicated. Neither scenario will serve the long-term public interest in the accumulation, verification, and dissemination of knowledge.

See, the paper opens by making the point that research in fields like biology and physics has been incontrovertibly transformed by the “capacity to collect and analyze massive amounts of data,” but while lots and lots of people are doing stuff online every day – stuff that leaves “breadcrumbs” that can be noticed, counted, tracked, and analyzed – the literature in the social sciences includes precious few examples of that kind of data analysis.  Which isn’t to say that it isn’t happening – it is and we know it is, but it’s the Googles and the Facebooks and the NSAs that are doing it. The quotation above gets at the implications of that.

The article is brief and well worth a scan even if you, like me, need a primer to really understand the kind of analysis they are talking about.  I read it, bookmarked it, briefly thought about writing about it here, but couldn’t really come up with the information literacy connection I wanted (there is definitely stuff there – if nowhere else it’s in the discussion of privacy, but the connection I was looking for wasn’t there for me at that moment) so I didn’t.

But then last week, I saw this article, Clickstream Data Yields High-Resolution Maps of Science, linked in the ResearchBlogging twitter feed (and since then at Visual Complexity, elearnspace, Stephen’s Web, Orgtheory.net, and EcoTone).

And they connect – because while this specific type of inquiry isn’t one of the examples listed in the Science article, this is exactly what happens when you turn the huge amounts of data available, all of those digital breadcrumbs, into a big picture of what people are doing on the web — in this case what they are doing when they work with the scholarly literature. And it’s a really cool picture:

The research is based on data gathered from “scholarly web portals” – from publishers, journals, aggregators and institutions.  The researchers collected nearly 1 billion interactions from these portals, and used them to develop a journal clickstream model, which was then visualized as a network.
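
The paper’s actual model is more involved than this, but the core move – turning ordered click sequences into a weighted network – is easy to sketch.  Here’s a toy version in Python with networkx; the sessions and journal names are invented, purely to illustrate the general idea, not to reproduce the authors’ pipeline:

```python
# Toy sketch of a journal clickstream network. Sessions are invented.
from collections import Counter

import networkx as nx

# Each session: the ordered list of journals one user viewed at a portal.
sessions = [
    ["PLoS ONE", "Nature", "Science"],
    ["Nature", "Science"],
    ["The Library Quarterly", "portal: Libraries and the Academy"],
]

# Count how often one journal is clicked directly after another.
transitions = Counter()
for session in sessions:
    for src, dst in zip(session, session[1:]):
        transitions[(src, dst)] += 1

# Edge weight = transition frequency; a real analysis would normalize
# these counts into transition probabilities before visualizing.
G = nx.DiGraph()
for (src, dst), weight in transitions.items():
    G.add_edge(src, dst, weight=weight)

print(G.edges(data=True))
```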

For librarians, this is interesting because it adds richness to our picture of how people – scholars – engage with the scholarly literature, dimensions not captured by traditional impact measures.  For example, what people cite and what they actually access on the web aren’t necessarily the same thing, and a focus on citation as the only measure of significance has always provided only a part of whatever picture there is out there.  Beyond this, as the authors point out, clickstream data allows analysis of scholarly activity in real-time, while to do citation analysis one has to wait out the months-and-years delay of the publication cycle.

It’s also interesting in that it includes data not just from the physical or natural sciences, but from the social sciences and humanities as well.

What I also like about this, as an instruction librarian, is the picture that it provides of how scholarship connects.  It’s another way of providing context to students who don’t really know what disciplines are, don’t really know that there are a lot of different scholarly discourses, and who don’t really have the tools yet to contextualize the scholarly literature they are required to use in their work.  Presenting it as a visual network only highlights the potential of this kind of research.

And finally – pulling this back to the Science article mentioned at the top – this article is open, published in an open-access journal, and I have to think that the big flurry of attention it has received in the blogs I read, blogs with no inherent disciplinary or topical connection to each other, is in part because of that.

———————-

Lazer, D., Pentland, A., Adamic, L., Aral, S., Barabasi, A., Brewer, D., Christakis, N., Contractor, N., Fowler, J., Gutmann, M., Jebara, T., King, G., Macy, M., Roy, D., & Van Alstyne, M. (2009). Computational Social Science. Science, 323 (5915), 721-723. DOI: 10.1126/science.1167742

Bollen, J., Van de Sompel, H., Hagberg, A., Bettencourt, L., Chute, R., Rodriguez, M., & Balakireva, L. (2009). Clickstream Data Yields High-Resolution Maps of Science. PLoS ONE, 4 (3). DOI: 10.1371/journal.pone.0004803

peer review, what is it good for? (peer-reviewed Monday)

ResearchBlogging.org

In a lot of disciplines, the peer reviewed literature is all about the new, but while the stories may be new, they’re usually told in the same same same old ways.  This is a genre that definitely has its generic conventions.  So while the what of the articles is new, it’s pretty unusual to see someone try to find a new way to share the what.  I’ll admit it, that was part of the attraction here.

And also attractive is that it is available.  Not really openly available, but it is in the free “sample” issue of the journal Perspectives on Psychological Science.  I’m pulling this one thing out of that issue, but there are seriously LOTS of articles that look interesting and relevant if you think about scholarship, research, or teaching/learning those things — articles about peer review, IRB, research methods, evidence-based practice, and more.

Trafimow, D., & Rice, S. (2009). What If Social Scientists Had Reviewed Great Scientific Works of the Past? Perspectives on Psychological Science, 4 (1), 65-78 DOI: 10.1111/j.1745-6924.2009.01107.x

So here’s the conceit – the authors take several key scientific discoveries, pretend they have been submitted to social science/ psychology journals, and write up some typical social-science-y editors’ decisions.  The obvious argument of the paper is that reviewers in social science journals are harsher than those in the sciences, and as a result they are less likely to publish genius research.

[Image: No Matter Project (flickr)]

I think that argument is a little bit of a red herring; the real argument of the paper is more nuanced.  The analogy I kept thinking about was the search committee with 120 application packets to go through – that first pass through, you have to look for reasons to take people out of the running, right?  That’s what they argue is going on with too many reviewers – they’re looking for reasons to reject.  They further argue that any reviewer can find things to criticize in any manuscript, and that just because an article can be criticized doesn’t mean it shouldn’t be published:

A major goal is to dramatize that a reviewer who wishes to find fault is always able to do so.  Therefore, the mere fact that a manuscript can be criticized provides insufficient reason to evaluate it negatively.

So, according to their little dramatizations, Eratosthenes, Galileo, Newton, Harvey, Stahl, Michelson and Morley, Einstein, and Borlaug would each have been rejected by social science reviewers, or at least some social science reviewers.  I won’t get into the specifics of the rejection letters – Einstein is called “insane” (though also genius – reviewers disagree, you know) and Harvey “fanciful” but beyond these obviously amusing conclusions are some deeper ideas about peer review and epistemology.

In their analysis section, Trafimow and Rice come up with 9 reasons why manuscripts are frequently rejected:

  • it’s implausible
  • there’s nothing new here
  • there are alternative explanations
  • it’s too complex
  • there’s a problem with methodology (or statistics)
  • incapable of falsification
  • the reasoning is circular
  • I have different questions I ask about applied work
  • I am making value judgments

A few of these relate to the inherent conservatism of science and peer review, which has been well established (and which was brought up here a few months ago).  For example, plausibility: reviewers are inclined to accept what is already “known” as plausible, and to treat challenges to that received knowledge as implausible, no matter how strong the reasoning behind the challenging interpretation.

A few get at that “trying to find fault” thing I mentioned above.  You can always come up with some “alternative explanation” for a researcher’s results, and you can always suggest some other test or some other measurement a researcher “should have” done.  The trick is to suggest rejection only when you can show that the missing test or the alternative explanation really matters, but they suggest that a lot of reviewers don’t do this.

Interestingly, Female Science Professor had a similar post today, about reviewers who claim that things are not new but who do not provide citations to verify that claim.  Trafimow and Rice spend a bit of time themselves on the “nothing new” reason for rejection.  They suggest that there are five levels at which new research or knowledge can make a new contribution:

  • new experimental paradigm
  • new finding
  • new hypothesis
  • new theory
  • new unifying principle

They posit that few articles will be “new” in all of these ways, and that reviewers who want to reject an article can focus on the dimension where the research isn’t new, while ignoring what is.

Which relates to the value judgments, or at least to the value judgment they spend the most time on – the idea that social science reviewers value data, empirical data, more than anything else, even at the expense of potentially groundbreaking new theory that might push the discourse in that field forward.  They suggest that a really brilliant theory should be published in advance of the data – that other, subsequent researchers can work on that part.

And that piece is really interesting to me because the central conceit of this article focuses our attention, with hindsight, on rejections of stuff that would fairly routinely be considered genius.  And even the most knee-jerk, die hard advocate of the peer review process would not make the argument that most of the knowledge reported in peer-reviewed journals is also genius.  So what they’re really getting at here isn’t does the process work for most stuff so much as it is, are most reviewers in this field able to recognize genius when they see it, and are our accepted practices likely to help them or hinder them?

[Image: More Revision, Djenan (flickr)]

And here’s the thing – I am almost thinking that they think that recognizing genius isn’t up to the reviewers.  I know!  Crazytalk.  But one of the clearest advantages to peer review is that revision based on thoughtful, critical, constructive commentary by experts in the field will, inherently, make a paper better.  That’s an absolute statement but one I’m pretty comfortable making.

What I found striking about Trafimow and Rice’s piece is that over and over again I kept thinking that the problem with the practices they were identifying was that they led to reviews that weren’t helpful to the authors.  They criticize suggestions that won’t make the paper better, conventions that shouldn’t apply to all research, and the like.  They focus more on bad reviews than good, and they don’t really talk explicitly about the value of peer review, but if I had to point at the implicit value of peer review as suggested by this paper, that would be it.

There are two response pieces alongside this article, and the first one picks up this theme.  Raymond Nickerson does spend some time talking about one purpose of reviews being to ensure that published research meets some standard of quality, but he talks more about what works in peer review and what authors want from reviewers – and in this part of his response he talks about reviews that help authors make papers better.  In a small survey he did:

Ninety percent of the respondents expected reviewers to do substantially more than advise an editor regarding a manuscript’s publishability.  A majority (77%) expressed preferences for an editorial decision with detailed substantive feedback regarding problems and suggestions for improvement…

(Nickerson also takes issue with the other argument implied by the paper’s title – that the natural and physical sciences have been so much kinder to their geniuses.  And in my limited knowledge of this area, that is a point well taken.  That inherent conservatism of peer review certainly attaches in other fields – there’s a reason why Einstein’s theory of special relativity is so often put forward as the example of the theory published in advance of the data.  It’s not the only one, but it is not like there are zillions of examples to choose from.)

Nickerson does agree with Trafimow and Rice’s central idea – that just because criticisms exist doesn’t mean new knowledge should be rejected.  M. Lynne Cooper, in the second response piece, also agrees with this premise but spends most of her time talking about the gatekeeper, or quality control, aspects of the peer review process.  And as a result, her argument, at least to me, is less compelling.

She seems too worried that potential reviewers will read Trafimow and Rice and conclude that they should not ever question methodology, or whether something is new — that just because Trafimow and Rice argue that these lines of evaluation can be mis-used, that potential reviewers will assume that they cannot be properly used.  That seems far-fetched to me, but what do I know?  This isn’t my field.

Cooper focuses on what Trafimow and Rice don’t: what makes a good review.  A good review should:

  • Be evaluative and balanced between positives and negatives
  • Evaluate connections to the literature
  • Be factually accurate and provide examples and citations where criticisms are made
  • Be fair and unbiased
  • Be tactful
  • Treat the author as an equal

But I’m less convinced by Cooper’s suggestions for making this happen.  She rejects the idea of open peer review in two sentences, but argues that the idea of authors giving (still anonymous) reviewers written feedback at the end of the process might cause reviewers to be more careful with their work.  She does call, as does Nickerson, for formal training.  She also suggests that the reviewers’ burden needs to decrease to give them time to do a good job, but other things I have read make me wonder about her suggestion that there be fewer reviewers per paper.

In any event, these seem at best like band-aid solutions for a much bigger problem.   See, what none of these papers do (and it’s not their intent to do this) is talk about the bigger picture of scholarly communication and peer review.  And that’s relevant, particularly when you start looking at these solutions.  I was just at a presentation recently where someone argued that peer review was on its way out, not for any of the usual reasons but because they were being asked to review more stuff while having time to review less.  Can limiting reviewing gigs to the best reviewers really work; can the burden on those reviewers be lightened enough?

The paper’s framing device includes science that pre-dates peer review, that pre-dates editorial peer review as we know it, that didn’t go through the full peer-review process, which begs the question – do we need editorial peer review to make this knowledge creation happen?  Because the examples they’re putting out there aren’t Normal Science examples.  These are the breaks and the shifts and the genius that the Normal Science process, kind of by definition, has trouble dealing with.

And I’m not saying that editors and reviewers and traditional practices don’t work for genius; that would be ridiculous.  But I’m wondering if the peer-reviewed article is really the only way to get at all of the kinds of knowledge creation, of innovation, that the authors talk about in this article – is this process really the best way for scholars to communicate all of the five levels/kinds of new knowledge outlined above?  I don’t want to lose the idealistic picture of expert, mentor scholars lending their expertise and knowledge to help make others’ contributions stronger.  I don’t want to lose what extended reflection, revision, and collaboration can create.

I am really not sure that all kinds of scholarly communication or scholarly knowledge creation benefit from the iterative, lengthy, revision-based process of peer review.  I guess what I’m saying is that I don’t think problems with peer review by themselves are why genius sometimes gets stifled, and I don’t think fixing peer review will mean genius gets shared.  I don’t think the authors of any of these pieces think that either, but these pieces do beg that question – what else is there?

doodling as pedagogy

ResearchBlogging.org

This one has been all over the news in the last two days, but if you haven’t seen it, it’s an Early View article in the journal Applied Cognitive Psychology. The article suggests that people who doodle while they are listening to stuff retain more of what they hear than non-doodlers do.

As an unabashed doodler (for me it’s usually fancy typography-like versions of my dog’s name), this isn’t all that surprising. But my brain keeps going back to it — should we be figuring out ways to encourage our students to doodle in library sessions?

See, the article doesn’t say definitively why the doodling works.  But the author, Jackie Andrade, does suggest that it might have something to do with keeping the brain engaged just enough to prevent daydreaming, but not enough to be truly distracting:

A more specific hypothesis is that doodling aids concentration by reducing daydreaming, in situations where daydreaming might be more detrimental to performance than doodling itself.

So you’ve got an information literacy session in the library, with a librarian-teacher you have no relationship with at all, about a topic about which you may or may not think you need instruction.  That sounds like a perfect situation for daydreaming.

And it’s not too hard to think of ways to encourage doodling.  Handouts with screenshots of the stuff you’re talking about – encourage them to draw on the handouts.  Maybe even provide pencils?  I don’t know – it’s not an idea where I’ve fully figured out the execution, but I’m interested.

My students, most of the time, don’t take notes while I’m talking.  Part of this is my style: I talk fast, and I don’t talk for very long in any one stretch before switching to hands-on.  But I don’t think that’s all of it – most of them don’t even take out note-taking materials unless they are told to do so by their professor (and then they ALL do) or unless I say “you should make a note of this” (then most of them do).   And this isn’t something I’ve worried about.  I have course pages they can look at if they need to return to something, and I’m confident that most of them know how to get help after the fact if they need it.

But the no-notetaking thing means that they aren’t even in a position to do any doodling.  And as someone who needs that constant hands/part of the brain occupation to stay focused, I wonder why I’ve never thought about that as a problem before.

This study specifically tried to make sure that the subjects were prone to boredom.  They had them do this task right after they had finished another colleague’s experiment, thinking that would increase the chance that they would be bored.  And the task itself was boring – monitoring a voice message.  Half doodled, half did not, and then they were tested on their recall of the voice message.

I don’t mean to suggest that information literacy sessions are inherently boring; I don’t actually think they are.  But I think some of the conditions for boredom are there, particularly in the one-shot setting, and I don’t think there’s stuff that we can do about all of those conditions.  Some of them are inherent.  The idea of using the brain research that’s out there to figure out some strategies for dealing with that interests me a lot.

——————–
Jackie Andrade (2009). What does doodling do? Applied Cognitive Psychology. DOI: 10.1002/acp.1561

Peer-reviewed Monday (plus 24 hours) – has anyone tried out this Delphi method?

ResearchBlogging.org

So this is a little different for peer-reviewed Monday, even though it is a peer-reviewed article about information literacy. It’s different in that I chose the article because of the research method – the infolit topic was just a bonus. I’m going to be involved in a project for the Oregon Library Association that is going to be using this same method, so I wanted to check it out.

(It’s the Delphi method, if you’re curious. And I talked about another project that used it about halfway through this post.)

In the January issue of portal: Libraries and the Academy, Laura Saunders from Simmons College uses the Delphi method to do some forecasting about information literacy and academic libraries. The Delphi method is frequently used for forecasting, which is actually how my project will be using it, so we’re off to a good start.

So, in short, the Delphi method involves identifying a set of experts on a topic. Then these experts are asked to complete a survey – an open-ended survey with lots of room for them to talk about what they think is important. The researcher collects the surveys and synthesizes the responses, and then sends out another set of questions (which may be new and which may be repeats) for another round of responses. This goes on with the goal being consensus – expert consensus on the topic in question. In this case, the experts were asked to examine some potential scenarios for the future of information literacy:

This study develops possible scenarios for the future of library instruction services and offers practitioners, administrators, and library users a sense of how existing technologies, resources, and skills can best be employed to meet this vision.
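
The loop at the heart of the method is simple enough to sketch in code. Here’s a toy, runnable version in Python – the simulated experts, the 75% consensus threshold, and the lean-toward-the-majority feedback rule are all my inventions; the stubs just stand in for the human work of questionnaires, synthesis, and judging agreement:

```python
# Hypothetical simulation of the Delphi loop; nothing here reproduces
# the actual study, it only illustrates rounds + feedback + consensus.
import random
from collections import Counter

random.seed(0)

SCENARIOS = ["status quo", "faculty take over", "collaboration"]

def survey(expert, feedback=None):
    # Stand-in for an open-ended questionnaire: each simulated expert
    # picks a scenario, leaning toward the prior round's majority.
    if feedback:
        majority = feedback.most_common(1)[0][0]
        return majority if random.random() < 0.6 else random.choice(SCENARIOS)
    return random.choice(SCENARIOS)

def has_consensus(responses, threshold=0.75):
    # Consensus here = three-quarters of the panel on one scenario.
    top_count = Counter(responses).most_common(1)[0][1]
    return top_count / len(responses) >= threshold

experts = [f"expert_{i}" for i in range(13)]  # a 13-person panel, like the study's
feedback = None
for round_number in (1, 2):                   # two rounds, like the study
    responses = [survey(e, feedback) for e in experts]
    feedback = Counter(responses)             # the synthesis fed back to the panel
    print(f"Round {round_number}:", dict(feedback))
    if has_consensus(responses):
        print("Consensus reached")
        break
```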

Saunders identified her experts by their participation in information literacy organizations, publishing, presenting, and research. She identified 27; 14 agreed to participate and 13 eventually did. She did two rounds of surveys. She pulled some potential futures for information literacy out of the literature (things stay the same; librarians get replaced by faculty, who take over all information literacy instruction; librarians and faculty collaborate) and asked the expert panel to talk about four things:

  1. the scenario they thought was most reasonable or likely and why
  2. any obstacles they could see getting in the way of realizing these scenarios
  3. alternative possibilities or scenarios
  4. other comments

After the first round, she kept one scenario (the collaborative one) and created another composite scenario based on responses from the experts (this last scenario posited a reduced need to teach information literacy because of improvements in the technology). Participants could reiterate their initial choice or choose the new scenario. It isn’t clear from the methods section whether or not the participants were given the same four questions again, nor is it clear what information they were given from the first round – did they see everyone’s responses, a synthesis or summary of those responses, or just the new questions?

The research showed that these experts were largely optimistic about the future of information literacy, and that they overwhelmingly thought the collaborative scenario was the most likely. They identified faculty resistance as a major obstacle, and also mentioned staffing and money issues as obstacles. They believed that librarians should leverage their expertise to play a stronger role in establishing information literacy goals at the institutional level.

They saw partnerships on instructional design and assignment design as a place where librarians would continue to play a role in information literacy instruction, even if classroom faculty took on more responsibility for teaching, but they expressed concern that librarians aren’t ready to take on those roles. They were also not sure that library schools were providing new practitioners with these skills and that knowledge:

For librarians to be truly integrated into the curriculum rather than offering one-shot sessions, they must have much more pedagogical and theoretical knowledge. Although practicing librarians might have experience with library instruction, few have the background to transition easily to the [consulting] roles being described. Furthermore, respondents were unsure that library school programs were developing courses to adequately prepare future graduates for these responsibilities

The experts also argued that assessment needs to be a concern, and they also raised the age-old question of what do we really mean by information literacy anyway. Following along the lines of the researchers discussed in two recent peer reviewed Mondays, they generally agreed that context is significant when it comes to information literacy, and that information literacy must be understood more broadly than “library research skills.” At the same time, some argued for the more reductionist, standards-based definitions because they are easier to assess.

On a methods level, I found this study compelling, though there were two things that nagged at me. First was the idea that 13 is just not enough experts. Too often Saunders was forced to spend significant time on a point only to have its significance undercut when the reader realizes that it had been articulated by only one person on the panel. Some of those points were necessary correctives or added important subtleties to the conversation, so I don’t fault her for including them. But when it’s just one voice, it is just not possible to entirely dismiss the idea that that voice is alone because it’s wacky.

The second thing was related to this: I didn’t get any sense from the article that any kind of real consensus had been reached, or that the experts involved had changed or refined their views as a result of the process. Those who were outliers after round one of the surveys remained outliers. As it was described to us, the Delphi process offers the interactivity and social learning benefits of a focus group, while allowing the participants to provide individual, thoughtful, reflective feedback. That may have gone on in this study, but I don’t feel like I really saw it if it did.

On a content level, I found the argument that instruction librarians needed more pedagogical and theoretical knowledge intriguing. I was struck by the extent to which the expert panel focused on teaching knowledge as the thing separating the faculty and the librarians, as shown by this quote from one of the panelists: “faculty ‘view librarians as having no pedagogic understanding.'”

Librarians frequently talk about teaching as something faculty don’t think we can do, but in my experience it isn’t teaching but research that causes this gap. I haven’t run across many faculty members who don’t think librarians can teach, though some certainly don’t know that they do, but I regularly run across faculty members who are surprised to hear that there is information science research. And when it comes to what would make me feel comfortable approaching faculty and saying “this is what your students need” – it is research on what students actually do need, on what they do and don’t know, that would help me do that – not better pedagogical knowledge or teaching techniques. This may be a function of spending a lot of time at research-focused institutions, but I’m not sure. And it may be that this is exactly what these experts mean by knowledge that is more pedagogical and theoretical, but again, I’m not sure.

And this relates to the assessment piece and the definitions of information literacy piece as well. Because these are both examples of places where there is a danger of following the path of least resistance – of defining information literacy like faculty understand it, or of assessing what other people think is important. Not that that has to be what happens, but the danger exists.

And I do wonder how these experts have themselves avoided the gaps they see in others – how they have developed the knowledge of theory and pedagogy that they think librarians need. Or maybe they haven’t – maybe they are including themselves in the number of librarians who aren’t ready to be faculty partners in this way. I’m not sure. But I have been thinking lately that the Delphi method might be useful for getting at that question as well – how do expert instruction librarians develop the knowledge they need to do what they do?

————-
Laura Saunders (2009). The Future of Information Literacy in Academic Libraries: A Delphi Study. portal: Libraries and the Academy, 9 (1), 99-114. DOI: 10.1353/pla.0.0030

Peer-reviewed Monday – Reflective Pedagogy

ResearchBlogging.org
When I wrote that one theory and practice post last November I was thinking about reflective practice, but I didn’t really talk about it.  Luckily, Kirsten at Into the Stacks picked up that thought for me.  The whole post is great, but here’s the reflection piece:

But the purpose of theory, it seems to me, is as much to cause us to reflect on our practice as it is to inform our practice.

In my own post, I over-used the term “inform,” because while that is important, I think that reflection is just as, if not more, important.  Reflection is the point where the practice part of the job mixes with the theory part, with the writing part and the presenting part and the reading-outside-the-discipline part.  It’s not just a matter of taking what someone else has done and saying “I could do that.”  It’s taking what someone else has said and saying “wow, this makes me think/feel/understand something about what I do.”

So this article from last year’s Journal of Academic Librarianship jumped out at me – as it brings together the ideas of praxis and information literacy:

H. Jacobs (2008). Information Literacy and Reflective Pedagogical Praxis. The Journal of Academic Librarianship, 34 (3), 256-262. DOI: 10.1016/j.acalib.2008.03.009

The article is well-done, and I recommend it if you’re interested in the why of reflective practice, particularly where it comes to teaching and information literacy, but for me it felt a little like that one song in Dirty Dancing – the one they dance to at the end that sounds like it is going to be classic 80’s overwrought pop and you keep thinking it’s going to take off into the saxophones and dance beat and it never does because in that last scene they’re doing the mambo that Jennifer Grey’s character learned as a novice and it can never really deviate from its initial beat as much as it sounds like it is going to?

The whole thing is why we should think about reflective practice, with no how or even how I do it.  Which is fine, and important, but when you’ve already drunk that particular kool-aid it lacks a certain punch.

Anyway, quick summary.  Jacobs argues that librarians need to think more about pedagogy and not just about teaching.  She briefly touches on the lack of teaching/ pedagogy training in library school, and argues that even if one has had a teaching methods class that isn’t enough.   Because so much of the teaching/learning work we do happens outside of the classroom setting, teaching methods alone won’t give us the coherence or the big picture we need to be effective.

She also argues for a broad, inclusive definition of information literacy.  Based on the UN’s Alexandria Proclamation on Information Literacy and Lifelong Learning, the definition she favors includes the stuff in the standards-based definitions, but “goes on to make explicit what is implied in the other definitions by emphasizing the democratizing and social justice elements inherent in information literacy.”  This broad definition, she says, forces an understanding of information literacy that has to extend beyond the classroom.

Which brings us to the crux of the paper’s argument:

What I am suggesting is that the dialogues we have surrounding information literacy instruction strive to find a balance in the daily and the visionary, the local and the global, the practices and the theories, the ideal and the possible. One of the ways we can begin to do this in our daily teaching lives is to work toward creating habits of mind that prioritize reflective discussions about what it is that we are doing when we ‘do’ information literacy. This means thinking about pedagogy and talking about how we might work toward making the global local, the visionary concrete, the theoretical practicable, and, perhaps, the ideal possible. But how can we, as individual librarians, begin to work toward making information literacy ideals possible?

She argues that letting external standards, or quantitative measures, or standards-based rubrics define what we “do” when we do information literacy is not the way to go.  Not only will that keep us from understanding IL in the broad way advocated here, but it also reinforces an old-school, disempowering vision of education itself – Paulo Freire’s banking metaphor, where the teacher deposits knowledge in the students.

Finally, she gets to the argument mentioned in the abstract, that composition and rhetoric offer a lot to librarians trying to figure out how to understand information literacy in broader contexts.  She points out that the rhet/comp literature pushes back on the idea of standards-based assessment or pedagogy.  For one thing, this kind of approach makes it that much harder to really critically interrogate the assumptions underlying the standards or models themselves.

Which brings her to praxis:

Praxis — the interplay of theory and practice — is vital to information literacy since it simultaneously strives to ground theoretical ideas into practicable activities and use experiential knowledge to rethink and re-envision theoretical concepts.

She points to a particular article from the rhet/comp field, Shari Stenberg and Amy Lee’s College English article Developing Pedagogies: Learning the Teaching of English.  Drawing on Stenberg and Lee, Jacobs argues that we must develop ways to study our actual practice as texts, our teaching as texts.  She further argues that most of what we do when it comes to pedagogy is articulate different visions of it – visions that are not grounded in what practitioners actually do.

Beyond this, she argues that we need to study these things together, have critical, reflective conversations together about what it is that we do.  At the heart of this is the idea that teaching can’t be mastered, that developing our understanding of what we do is an inherently ongoing process.

And here’s the thing – I really like all of what she has to say here.  I do find it interesting that given the large body of literature on reflective practice she doesn’t draw from that, but what she says is consistent with the parts of that literature I like so overall, I don’t mind.  But here is where she ends.  She’s made the case for reflective practice and reflective conversations, for reading our practice like texts – but she doesn’t go on to the how of things.

Partly, because she refuses to do so:

For these reasons, I resist offering answers, solutions, or methods to questions about how to engage theory and practice within information literacy initiatives.

But recognizes that this is frustrating:

At the same time, I acknowledge that refusing to provide answers to questions such as “how do I teach information literacy” or “how do I become a reflective pedagogue” or “how might I foster a reflective pedagogical environment in my library” often seems evasive and counter productive.

She argues that librarians should engage in reflective dialogue, and that they should in effect walk the walk in front of their students – that the best way to get students engaged in the learning process is for teachers to be engaged in it as well.   That teachers should interrogate their own assumptions about their own learning process, examine why they set the problems they set, be engaged in their own learning process as they would want their students to be engaged.  To encourage students to develop their curiosity, to set meaningful problems for themselves to investigate – librarians should do that too.  Especially when it comes to their own practice.

But again, no how.  And I will admit I find the “articulating this for you would be against what I am arguing” move unsatisfying.  Because I don’t really think that Jacobs is letting us see her process – I don’t think that she is letting us see her walk the walk.  I don’t see her problem-setting on a personal, engaged level – instead I see her telling us that there is a problem, arguing that in very traditional, very objective scholarly language, and then positing a solution to the problem that doesn’t fit in that rhetorical structure.  It’s late, and I’m tired, and I will definitely accept that I may not have caught something that is here – but I don’t think that personal engagement is here.

Don’t get me wrong – one of the things I like about the article is what I do see of Jacobs’ passion for this subject, her ability to draw connections and connect the dots.  But I want to see her reflecting on her practice – as a teacher, maybe, and as a scholar, certainly.  I think that would have allowed her to be true to her “no prescriptive reflection recipes” principles, while still offering something more satisfying than “creative, reflective dialogue.”

Perhaps my perspective is skewed, though, because I am increasingly starting to believe that showing students how we use the tools we describe in our own research and scholarship is the best way to communicate their value.  I do think that modeling what we preach is crucial.   So I may be glomming onto what is a less important part of her overall argument than I would have you believe.

Still, my favorite part of this article is buried in footnote 59 – where she can’t resist weighing in with some ideas.  And I find the peek into the reviewing process entirely charming:

59. The question of how to go about enacting this creative, reflective dialogue is undeniably pressing. In response to this piece, an anonymous reviewer asked a crucial question: “am I simply to include more problem based learning into my teaching of information literacy, or do I need to start from scratch and sit alongside the classes I work with, understanding how they think, and walking with them on their path to critical thinking and information literacy. God please give me the time for this.” The reviewer concludes, “However, this is perhaps the nature of the reflective activity the author is recommending.” Indeed, the answer the reviewer provides to his or her question is the answer I too would offer. The act of asking questions such as the ones quoted above is precisely the kind of reflective activity I am advocating. Pedagogical reflection does not mean we need to dismantle and rebuild our information literacy classes, programs, and initiatives from the ground up (though we may, after reflection, choose to do so). Instead pedagogical reflection means that we ask questions like the ones quoted above of ourselves and our teaching and that we think critically and creatively about the small and large pedagogical choices we make.

I know what paleophidiothermometry means because of scholarly blogging

Way back when we first started talking about talking about peer review, Kate made a point that has stuck with me ever since – that we talk about being accessible a lot in libraries, but we usually talk about it in only one sense of the word. To be fair, it is the first sense listed by the OED:

1. Capable of being used as an access; affording entrance; open, practicable. Const. to.

This meaning gets at access to information – our ability to physically get our hands (or our eyes) on the information we want or need, our ability to get past technical barriers, bad interfaces, or paywalls.  It also gets at our accessibility in terms of open hours, our availability to answer questions, and maybe even a little bit our openness in terms of friendliness.

What Kate pointed out is that for our students – actually, for a lot of us – the scholarly or scientific discourse is inaccessible in another way (OED’s 3rd):

c. Able to be (readily) understood or appreciated. Freq. applied to academic or creative work.

How many times do we teach students how to find scholarly articles by showing them the physical access points – the databases (or results-limiting options) that will bring back articles that have been peer-reviewed, that will meet their professor’s requirement for one, three, or five “scholarly articles” – while all the time we know that they will struggle with reading, understanding, and really USING these articles in their work?

How often do all of us begin poking around on a new topic only to find scholarly articles that are too narrowly focused, that assume too much about what we know about context and significance, that are full of technical terms and just plain inaccessible to us, at least early on in our investigation?

And it’s not the articles’ fault.  The authors of peer reviewed articles have an audience to consider, and it’s not us.  Which is why I love the idea of these same authors writing for a different audience – and academic or science blogging is a great way to do that.  I know I’ve made this point here before, and I’ll probably do it again, but I thought this post today at Dracovenator (by Adam Yates, an Australian palaeontologist) was such a great example of it that I wanted to put it out there.

I only clicked on the link today (out of ResearchBlogging’s Twitter feed) because the title of the post was SO inaccessible to me.  I was just delighted by the post though – look how accessible it is, on every level.  Quick explanations of technical terms, a short summary of the research, an explanation of the context.  That context piece is one of my favorite parts, actually.  But then also a critique of the research.

And you can tell from the first line of the post that this isn’t a dumbed-down explanation written for the uninformed – the author assumes we’ve all heard about the study. I think there is so much positive potential in scholars and experts simply showing how they interact with the work in their field, how they understand it, how they read it, and how they talk about it.

Peer-reviewed Monday post-conference-drive-by

ResearchBlogging.org

Oh who am I kidding.  It probably won’t be short.  But it might be disjointed.  My good intentions were foiled by intermittent Internet access at the Super Conference, which was not that unexpected.  And by a seriously limited amount of power for my computer, which was totally unexpected except for my expected ability to do boneheaded things like leaving my power adapter at home.

{FYI – the Canadians, they know how to treat their speakers.  It’s been great.}

I do have something to say about peer reviewed research today though – it’s about this 2005 Library Quarterly article by Kimmo Tuominen, Reijo Savolainen and Sanna Talja.

Fair warning, I really liked this article.  I first read something by Savolainen when I was working on an annotated bibliography in library school (I think the topic was genre) and I’ve been something of a fan ever since.  Like AnneMaree Lloyd, who was discussed here two weeks ago, these authors argue that we need to expand our definitions of information literacy.  And the expansion they’re arguing for is similar to Lloyd’s.  I find more food for thought here – more connections between the different things I’m thinking about and working on.

Perhaps this is because this is not a research article – these authors are not bound by their own sample, questions, or data.  Perhaps it is because they do a better job of placing their vision of information literacy in its theoretical context, or at least of explaining what that context is and why we should care about it.  Or perhaps it is just because their vision is broader.

In any event, their starting point is similar to Lloyd’s -

The predominant view of information literacy tends to conceive of IL as a set of attributes – or personal fluencies – that can be taught, evaluated, and measured independently of the practical tasks and contexts in which they are used.

And they have similar conclusions -

We argue that understanding the interplay between knowledge formation, workplace learning, and information technologies is crucial for the success of IL initiatives.  Needs for information and information skills are embedded in work practice and domain-dependent tasks.

So from here the authors look back at the IL discussion over time. They locate its start in the 1960’s and early 1970’s, trace the initial involvement of professional associations in the 1980’s, touch on the Big 6 model in the 1990’s and then argue that the concept of information literacy began to be associated with the broader concept of lifelong learning in the 1990’s.  They conclude this history section with the argument that since the 1990’s there have been many attempts to define competency standards for information literacy.

From here, they move to talking about challenges to the idea of information literacy.  Interestingly, they place the argument that IL instruction requires cooperation with faculty, integration into the curriculum, and a grounding in content-focused classroom assignments as one such challenge.  Given that that model has been presented to me as the norm (with the separate, credit-course instruction idea as the exception) since I was in library school, this rang a little strange to me.

The authors dismiss the challenges to IL.  They argue that so long as definitions of IL take the individual as subject, and outline a set of generic, transferable skills that individual can master – there is broad agreement as to what the potentially vague concept of “information literacy” means.  They argue that the ACRL IL Standards for Higher Education, for example, define a set of generic skills that are supposed to have relevance across the disciplines and across contexts.

This is very interesting to me, because we spent a long time on my campus defining just that kind of generic, transferable information literacy standards.  We did so in conjunction with faculty across the disciplines – from all of our colleges, and who taught all levels of undergraduates.  The thing is, this was a really invigorating process.  We held focus groups with faculty and had conversations with a lot of programs and units across campus, and I’m really, really proud of the document we came up with.  As a model for objectives/goal-writing, this document is not bad.  Look at the action verbs!

And more than that, the repeated conversations with faculty were really morale-boosting.  Getting faculty to come over to the library and talk, and talk in-depth and really, really intelligently about information literacy wasn’t a challenge – it was easy.  And the faculty had such useful and smart things to say about the stuff we all cared about.  It was a good process.

Since then, we’ve been wondering what to do with the document.  Our campus doesn’t have any institution-wide learning goals; we don’t have a structure where our competencies could fit in or be adopted on a campus-wide level.   So that’s an issue.  But even within the library, we’ve struggled with where to go next.

And I think that the factors mentioned in this article may have something to do with it.  We use the course-integrated, there-should-be-an-assignment, IL-has-more-meaning-when-taught-in-the-context-of-an-actual-information-need model.  And we thought, and still think, that we *could* define disciplinary or context-specific versions of our competencies (or at least of the examples), but we haven’t done that.  Our one attempt to do so got bogged down in a nightmare mire of granularity.

We want to define a program that integrates the branch campuses, the archives/ special collections, faculty programs and all levels of graduate/undergraduate student instruction and I’m not sure that the competency document is that helpful in doing that.  It was a useful reflective exercise for us, and the process of creating it collaboratively with faculty was very useful.  But beyond that, I’m not sure how to make it useful for us as we try to structure a doing-more-with-less type of instruction program.

And it might be because of what is articulated in this article.  Think of how to create a document like this for beginning composition, for example, which, while multidisciplinary, has a clearly articulated goal of introducing students to academic writing and knowledge creation.  And that goal is a context – academic writing provides a context.  Context is even easier to conceptualize in different fields.

The authors argue that these standards-based ideas of IL are based on an assumption of information as something factual and knowable (I think our collaborative process with faculty undercut this in our case).  They also suggest that the standards are too focused on the individual as agent seeking and using information.  This piece I have a little bit of a problem with.  It’s kind of a typical criticism of constructivism – arguing that it’s too individual, too grounded in individual cognitive processes:

Most of the published IL literature draws from constructivist theories of learning stressing that individuals not only absorb the messages carried by information but are also active builders of sense and meaning.

What they’re missing here – or probably not giving the same emphasis that I would give, more than ignoring – is Vygotsky.  Kuhlthau, who they acknowledge as influential, deliberately focuses on Vygotsky’s brand of constructivism, which was a deliberate effort to integrate the social and cultural back IN to constructivism.  Still, much as I love Vygotsky, and much as I respect Kuhlthau for going that route — I have to agree that the *image* of the solitary scholar undergirds the picture painted by most IL competency standards.

Beyond this, the authors’ idea of the social in information literacy is very specific – grounded in this idea of sociotechnical practice.  As they suggest, “the most important aspects of IL may be those that cannot be measured at the level of the individual alone.”   By this they mean that it is not the individual but the community that decides what kinds of sources are useful, and valued, and important — which things you have to master to be successful within the community:

Groups and communities read and evaluate texts collaboratively.  Interpretation and evaluation in scientific and other knowledge domains is undertaken in specialized “communities of practice,” or “epistemic communities.”

Which is why I think the “academic writing” context in beginning composition is not so broad as to be useless.  For new or neophyte scholars, the ideas that there are practices for communicating knowledge, that there are types of knowledge more valued than others – these ideas are new enough that they deserve an introduction all their own.  Expecting students to jump into the epistemic community of a discipline before they really understand that there is such a thing as epistemology… that seems unreasonable to me.

The authors tend to argue that IL misses the communities-of-practice aspect because it’s too grounded in school.  I think I would probably argue (though this just occurred to me, and I’m a classic introvert, which means I need more processing time and thus must reserve the right to argue the opposite of this later) … anyway … I would probably argue that the problem isn’t that we focus on generic academic writing skills instead of grounding things in context – it’s that we present generic academic writing skills without really grounding them in their context.  I agree that we have an assumption that these skills, mastered in any context, will be useful and valuable – that we don’t have to explain their significance across contexts because students will be able to draw those connections.  I’m not sure that’s true unless we specifically, and deliberately, explain the academic context in the first place.

And I really like the idea of grounding those pieces of information literacy – the recognition that what you will even be looking for is determined by the discourse, by the practice-standards of a particular discourse community — in the community or the context.  So, to these authors, “sociotechnical practice” means recognizing that it is the community that determines what it means to be information literate.    And that’s really, really valuable.

Beyond this, I think we need to start deliberately teaching our students how to figure out what those community standards are.  Not teaching them what they are – but how to figure that out.  I wondered the other day if students are using search engines to figure out how to enter the scholarly discourse even if they aren’t taught specifically what “peer reviewed” means, or anything like that.  Looking at my referral logs, I don’t think they are.  That kind of bothers me – they should know how to go looking for the how-to information they need.  And I suspect we should start teaching it.

********
Kimmo Tuominen, Reijo Savolainen, & Sanna Talja (2005). Information Literacy as a Sociotechnical Practice. The Library Quarterly, 75 (3), 329-345. DOI: 10.1086/497311