I am an unscrupulous, unscrupulous formatter

Knowing about my constant and abiding interest in all things peer-review, a colleague handed me this pamphlet the other day.  Published by a project I like, Sense about Science (and funded by, among others, Elsevier, Blackwell, the Royal Pharmaceutical Society of Great Britain, the Institute of Biology and the Medical Research Council), this pamphlet provides a good summary of a lot of reasons why people should value peer-reviewed research.

I really like its focus on the reproducibility of research, the role that peer review plays in getting science out there to be acted upon by other scientists.  And this statement here gets at a lot of what I have been thinking about information evaluation lately – about how important it is that we evaluate sources within contexts, not in a vacuum:

If it is peer-reviewed, you can look for more information on what other scientists say about it, the size and approach of the study and whether it is a part of a body of evidence pointing towards the same conclusions.

But this has me mystified.  A callout box titled “How can you tell whether reported results have been peer reviewed?”  A question any academic reference librarian has struggled to answer at some point, right?

Their answer totally mystifies me.  I keep reading it and reading it and I can’t make it make any sense.  Seriously – they say the full reference to peer-reviewed papers is likely to look like this, and then they present two formatted article citations, one from the New England Journal of Medicine and one from Science.  The Science one is APA, but I’m not even sure exactly what style the NEJM one follows.

just formatted citations, right?

So under the citations, there’s a word balloon that says that unscrupulous people might “use this style on websites and articles to cite work that is not peer reviewed. But fortunately, this is rare.”

!

Wait, what?   So yeah, it turns out that I’m totally unscrupulous!  And so are you if you use APA to cite an article from the New Republic, or Time or The Journal of Really Lousy Non-Peer Reviewed Science!

I am so confused!  What do they mean by this?

mental debrief from WILU

There’s something about spring term that’s always crazy.  Last week was my last presentation obligation of the term – the WILU conference in Montreal.  WILU is one of my favorite conferences, based on the one time I’ve been before, and luckily we presented on Tuesday, so I was able to enjoy most of it without imminent presentation pressure looming over my head.

Kate and I presented on some very early findings from a research project we have been working on for the last several months – examining stories that instruction librarians tell.  I told Kate at the end that if I ended up blogging about this presentation at this early stage, it would be to write something up about how incredibly valuable it can be to present on research in the early stages, even in the very early stages.

Basically, the segment of the research that we presented on at WILU was drawn from an online survey where we asked instruction librarians to share some stories.  Our interest is … epistemological.  We were hoping to identify some themes that would suggest what we “know” as instruction librarians and professionals, as well as  some ideas of what we talk about, worry about, and feel proud about when it comes to  our professional practice.  This work was primarily intended to inform another round of story-gathering, done as interviews, but we were also hoping that these preliminary results would be interesting on their own.

ETA – it was brought to my attention that some more information about the kinds of stories we gathered might be useful.  This is the slide listing the prompts we used to elicit work stories.  They’re adapted from a chapter in this book.

story prompts

So beyond the obvious benefit of a deadline and potential audience forcing you into your data to figure out what it might say early on, presenting even those early findings was a really positive experience.  For one, other people are as interested in the story method as we are, which is awesome.

For another, a whole room full of other pairs of eyes is a fabulous thing.  The conversations that started this project grew out of this conversation between Kate and me and some others (and were further framed into reflective-practice talk by Kirsten at Into the Stacks), though I don’t think the project has stayed there.  There has definitely been research-question creep along the way.

We started the project thinking about theory/practice, as is obvious from the conversation linked above.  And we made the connection to reflective practice based on that as well – based on the idea that scholarship represents another way of knowing what we know, and thinking about ways that scholarship can inform and push our reflections on practice.

And we got a great question about whether it makes sense to conflate scholarship with theory in this context, especially when, as another commenter mentioned, much of the LIS literature isn’t clear when it comes to any theoretical frameworks the author used.  A really useful question for thinking about that project-scope creep – and also exactly the kind of question I can never answer on the spot.

Theory vs. practice is useful shorthand, especially in short sessions like these.  And I do think that including non-theory-generating scholarship in the initial conversations that sparked the project reflected some of the ambivalence we were seeing.  As I said at the time, I really don’t think all of that ambivalence is tied up in “if the scholarship in librarianship were more useful, or more rigorous, or more scholarly, or better-written, or more theoretically grounded, I would totally use it.”

I also think that Schön’s Reflective Practitioner allows these things to be discussed together as well, not because he conflates them, but because he sets the reflective practitioner in contrast to both the pure theorist and the applied scientific researcher:

As one would expect from the hierarchical model of professional knowledge, research is institutionally separate from practice, connected to it by carefully defined relationships of exchange.  Researchers are supposed to provide the basic and applied science from which to derive techniques for diagnosing and solving the problems of practice.  Practitioners are supposed to furnish researchers with problems for study and with tests of the utility of research results.

Schön argues that this hierarchical model of professional knowledge has dominated the way we understand, and teach, professional practice – and it is in both the development of grounding theory (basic, disciplinary knowledge) and the development of a body of rigorous, scientific applied knowledge for problem-solving that the practitioners, and the practitioner’s unique ways of knowing, are left out of the equation.

Which is a long way of saying that the initial connections we were making still have value for me mentally when thinking about these questions, but I’m not sure we want to stop there.  All of this raises the question of whether thinking about these questions, and about the stories, with a clearer distinction between theory and practice in mind might be more useful.  I think maybe it would be.  For one thing, clarity is good, and a lack of clarity in prior discussions might actually suggest the need for more clarity all by itself.

But the conversation brought a couple of additional thoughts to the forefront, neither of which was really clear until the mental presenting-dust settled.

Here and there along the way, I’ve been thinking about the real-world information literacy literature and its connection to this discussion.  One reason to not discuss it in our 30 minutes was the fact that some of what I have read in that literature recently (as relates to real-world information literacy in professional contexts) examines the differences between the ideal knowing captured in our professional texts/ training/ theories and the real-world/ tacit/ experiential knowing that comes with actually dealing with the uncertainty of practice.  The connections to our original questions probably seem clear, but I wasn’t comfortable calling the peer-reviewed literature our abstract, ideal text-based knowing in the same way as the firefighter’s manuals were understood in this article, for example.

Which on the one level is part of the subject of our next steps with this project – figuring out what our abstract, ideal, text-based knowing IS in instruction librarianship.  But on another level points to the problems with conflating theory and scholarship – parsing them out more clearly I think would make the connections to this body of literature more useful.

Related to this comes the question of our training (or lack thereof) as instruction librarians, in LIS education and after.  Between us, we saw several sessions about professional development for new librarians, which dovetailed with conversations we’d had about the distinction between the stuff we read related to information literacy in grad school and most of the stuff in the literature today.

Kate mentioned that the articles she read in library school instruction classes weren’t the articles about practice, but about theory.  I didn’t take a specific instruction class, but I would say the same was true at my school, and was definitely true in the learning theory class that I took.  I think to follow up on that question usefully will also require parsing that discussion more clearly.

So thanks to all of the people who participated in this great (for us) conversation, and we’ll be contacting people soon for the next round of work on the project.

Final lesson from WILU?  I’m still useless when I try to speak from notes.  Not necessarily the speaking part, though it is definitely not natural for me, but more the actually using the information in my notes part.  I tried in this talk, not throughout but just in one moment at the end, and I still made a total mess of the process.  I walk away from them, get lost, talk past where I am in the notes, and leave things out anyway.  It’s weird that speaking from notes is as much a learned skill as speaking without them is, but it totally is.  I think I blame high school debate, and I suspect it’s too late for me now.

pay no attention to all that money behind the curtain

I give up.

You know that there is an intersection between science and marketing  – 4 of 5 doctors agree that X works for Y?

Most of the marketing goes on below public radar; it’s not directed at us, but at other medical professionals.   This 2005 article at PLoS Medicine couldn’t state it more strongly:  Medical Journals are an Extension of the Marketing Arm of Pharmaceutical Companies.

This article is talking about sponsored trials – research that is sponsored by drug companies, that finds that the drug in question works:

Overall, studies funded by a company were four times more likely to have results favourable to the company than studies funded from other sources. In the case of the five studies that looked at economic evaluations, the results were favourable to the sponsoring company in every case.  The evidence is strong that companies are getting the results they want, and this is especially worrisome because between two-thirds and three-quarters of the trials published in the major journals—Annals of Internal Medicine, JAMA, Lancet, and New England Journal of Medicine—are funded by the industry (citation here, Egger M, Bartlett C, Juni P. Are randomised controlled trials in the BMJ different? BMJ. 2001;323:1253.)

Which has been a topic of conversation for a while, but why stop there?  If the drug companies can create a bunch of the research, why don’t they create the journals too?  Just create a journal.  Don’t pretend that it’s reporting knowledge for the public good, don’t make it so the public can even find it, don’t make it so the doctors can even find it – don’t index it in Medline, don’t even put a website up.

That’s apparently what Merck and Elsevier did.

The full original story is behind The Scientist’s registration-wall, so here’s a good summary, with added TOC analysis, from Mitch André Garcia at Chemistry Blog.

See, I talked briefly here a while back about my frustration with people like Andrew Keen and Michael Gorman when they accept uncritically the idea of traditional media gatekeepers serving a quality-control or talent-identifying role, without acknowledging that the corporate media makes many decisions that are not based on a mission of guaranteeing quality or identifying genius.

And Kate and I talk frequently about how traditional methods of scholarly publishing are not intended to guarantee quality in terms of identifying the best articles, or even the most true or accurate articles, but that those methods are instead intended to create a body of knowledge that supports further knowledge creation.

We’ve managed to fill presentations about peer review pretty easily without focusing on the corporatization of scholarly publishing — there’s a lot of discussion of this corporatization in open access conversations already and a lot of confusion that comes up about the implications of open access for peer review.  Sometimes it seems like every open access conversation in the broader higher education world gets bogged down by misunderstandings about peer review.  So it has seemed true that drawing this artificial, but workable, line between what we are talking about and what we’re not just makes it easier to keep our focus on peer review itself.

But man – it might be just too artificial.  Maybe we can’t talk about peer review at all anymore without talking about the future of a system of knowledge reporting that is almost entirely dependent upon the volunteer efforts of scholars and researchers, almost entirely dependent upon their professionalism and commitment to the quality of their disciplines, in a world where ultimate control is passing away from those scholars’ and researchers’ professional societies and into the hands of corporate entities whose decisions are not driven by commitment to quality, knowledge creation, or disciplinary integrity.

We’ve been focusing on “why pay attention to scholarly work and conversations going on on the participatory web” mostly in terms of how these things help us give our students access to scholarly material, how they help our students contextualize and understand scholarly debates, how they lay bare the processes of knowledge creation that lie under the surface of the perfect, final-product article you see in scholarly journals.  And all of those things are important.  But I think we’re going to have to add that “whistleblower” aspect — we need to pay attention to scholars on the participatory web so they can point out where the traditional processes are corrupt, and where the gatekeepers are making decisions that aren’t in the interests of the rest of us.

I was pointed to the story by friends on Twitter and Facebook.

Here’s the article at BoingBoing

blog.bioethics.net (American Journal of Bioethics)

Drug Injury watch blog has links to reports of the Australian court case where the story was noticed earlier.

the other kind of peer review

I think a lot about peer review, but it’s almost all about the journals side of things – the related-but-not-the-same issues of open access and peer review.  And by that I mean what is called “editorial peer review,” to distinguish it from peer review in the grants/funding world – a kind of peer review that is probably much more important to a lot of people than the journals-specific kind.

But a couple of recent notes about the other kind of peer review jumped out at me and connected – what do these, taken together, suggest about how we – beyond higher ed, as a society and a culture – value knowledge creation?  Or maybe what I really mean is: what do they suggest about how we should value knowledge creation?

First, there’s this note today from Female Science Professor.  She’s responding to an article in Slate, and it’s the piece she’s responding to that I am interested in here as well – the amount of time that faculty in different disciplines (and in different environments) spend writing proposals to get funding for their research.  The Slate article includes a quote suggesting that med school faculty at Penn spend half their time writing grant proposals.  That number has increased, it goes on to suggest, because of the effort to get in on stimulus funding.

The comments, with a few exceptions, suggest that the 50% number is not out of line in that environment.

So that, connected with this item from EcoTone last month, has to make you think, right?

(quoting the abstract of an article in Accountability in Research)  Using Natural Science and Engineering Research Council Canada (NSERC) statistics, we show that the $40,000 (Canadian) cost of preparation for a grant application and rejection by peer review in 2007 exceeded that of giving every qualified investigator a direct baseline discovery grant of $30,000 (average grant).
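Back-of-the-envelope arithmetic makes the abstract’s comparison concrete.  This is only an illustrative sketch: the two per-investigator figures come from the quote above, but the size of the applicant pool is a made-up placeholder.

```python
# Per-investigator figures from the NSERC abstract quoted above (CAD).
COST_PER_APPLICATION = 40_000   # preparing one application + peer reviewing it
BASELINE_GRANT = 30_000         # average discovery grant

def system_costs(num_investigators: int) -> tuple[int, int]:
    """Total cost of the competitive process vs. simply funding everyone."""
    competition_cost = num_investigators * COST_PER_APPLICATION
    baseline_cost = num_investigators * BASELINE_GRANT
    return competition_cost, baseline_cost

# Hypothetical pool of 10,000 qualified investigators (invented number).
competition, baseline = system_costs(10_000)
print(f"competition process: ${competition:,}  vs  direct grants: ${baseline:,}")
```

Since $40,000 exceeds $30,000 per investigator, the competitive process costs more than the direct-grant alternative no matter how large the applicant pool is – which is the abstract’s point.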

Obviously, there are stark differences in scope and scale between these disciplines.  Also obviously, the process of writing grant proposals isn’t entirely divorced from the goal of knowledge creation – the researcher undoubtedly benefits from going through the process, and the project benefits from the work done on the proposals – in some ways.

In other ways, though, proposals are undoubtedly a distraction, and the process becomes more about the process than about the knowledge creation.  No solutions offered here – not even a coherent articulation of a problem.  It just makes you wonder what it says about us when, within the knowledge creation process itself, the problems and issues of getting funding take precedence over the problems and issues connected to the direct experience of creating new knowledge.

talking on the web

What is it about spring term that it always ends up being overloaded?  Sometimes it is travel, and this term definitely has its share of that – informational visits to the University of Minnesota and the University of Wisconsin-Madison, WILU in Montreal, and an insane 30-hour total trip across 3 time zones for a college reunion.  But unlike other travel-crazy terms, this time around it’s the presentations that have me feeling that “there’s always something more to be working on” feeling.

Upcoming – my first real forays into web-based presentations.

First, there is this one with Rachel: Social Media and the Ethics of Information.

Then a few days later, Kate and I are going to do a version of the Peer Review 2.0 talk as a professional development workshop for community college libraries in Seattle.

Given budget realities for all of us, I would expect that this form of presentation and sharing will only become more important, so I am excited to try it out.  But I’m also nervous.  I’ve definitely sat in on online presentations where the content and/or presentation style didn’t translate very well.  And it’s not something you can practice, or at least I haven’t figured out how; every practice run feels even more artificial than practicing a traditional presentation in front of the mirror.

(Not that I’d know anything about that – I am a practice-while-driving type of presenter)

So yeah, if you’ve ever sat in on a really great webcast presentation, or a really bad one, I’d love to hear what works and what doesn’t.

it’s the math

I’m not sure that even my tendency to see information literacy connections everywhere will explain why I’m posting this, but I just thought it was really interesting.  This morning, I got pointed to this article (via a delicious network) which argues that hands-on, unstructured, discovery-based learning doesn’t do the trick for many science students at the secondary level.  With preparedness for college science as the definition of success, most students are more successful if their high school science learning is significantly structured for them by their teachers.

Structure More Effective in High School Science Classes, Study Reveals

What jumped out at me here was that the reason seemed to be linked to the math – students with good preparation in math did benefit from unstructured, discovery-based learning.  And then there was a “similar articles to this one” link at the bottom of the page, pointing to another study making another point, which supports this idea too (not hugely surprising, because both items point to different papers by the same researchers).

You do better in college chemistry if you’ve taken high school chemistry, better in physics if you’ve taken physics – but the one big exception to the “success in one doesn’t generalize” argument?  You do better in everything if you’re well-prepared in math.

College Science Success Linked to Math and Same-Subject Preparation

After that there are more “articles like this one” links, leading to articles about middle-school math teachers in the US being really ill-prepared, or about gender and math and science, which really got me thinking about further implications of those findings – if math is such a lynchpin.  So there is something there about how this dynamic, browsable environment makes your brain work in ways that make research better.

There’s also something there about context – getting the “math teachers aren’t prepared” article in the context of the “math is key” research made the significance of the former clearer, made how I could *use* that research much clearer than it would have been if I came upon it alone.  There’s also something there about the power of sites like ScienceDaily (and ScienceBlogs, and ResearchBlogging.org and others) to pull together research, present it in an accessible way in spaces where researchers/readers can make those connections.

And there might even be something there about foundational, cognitive skills that undergird other learning. But mostly, I just found it interesting.

—————

Studies referenced were reported on here:

Sadler, Philip M. & Tai, Robert H.  The two high-school pillars supporting college science (Education Forum).  Science, 27 July 2007: Vol. 317, no. 5837, pp. 457–458.  DOI: 10.1126/science.1144214  (paywall)
Tai, Robert H. & Sadler, Philip M.  Same science for all?  Interactive association of structure in learning activities and academic attainment background on college science performance in the USA.  International Journal of Science Education, Volume 31, Issue 5, March 2009, pages 675–696.  DOI: 10.1080/09500690701750600

what do huge numbers + powerful computers tell us about scholarly literature? (peer-reviewed Monday)

ResearchBlogging.org A little more than a month ago, I saw a reference to an article called Complexity and Social Science (by a LOT of authors).  The title intrigued me, but when I clicked through I found out that it was about a different kind of complexity than I had been expecting.

Still, because the authors had made the pre-print available, I started to read it anyway and found myself making my way through the whole thing.  The article is about what might be possible with powerful computers able to crunch massive amounts of data – what might be possible for the social sciences, not just the life sciences or the physical sciences.  The reason it grabbed me was this passage:

Computational social science could easily become the almost exclusive domain of private companies and government agencies. Alternatively, there might emerge a “Dead Sea Scrolls” model, with a privileged set of academic researchers sitting on private data from which they produce papers that cannot be critiqued or replicated. Neither scenario will serve the long-term public interest in the accumulation, verification, and dissemination of knowledge.

See, the paper opens by making the point that research in fields like biology and physics has been incontrovertibly transformed by the “capacity to collect and analyze massive amounts of data,” but while lots and lots of people are doing stuff online every day – stuff that leaves “breadcrumbs” that can be noticed, counted, tracked, and analyzed – the literature in the social sciences includes precious few examples of that kind of data analysis.  Which isn’t to say that it isn’t happening – it is, and we know it is, but it’s the Googles and the Facebooks and the NSAs that are doing it.  The quotation above gets at the implications of that.

The article is brief and well worth a scan even if you, like me, need a primer to really understand the kind of analysis they are talking about.  I read it, bookmarked it, briefly thought about writing about it here but couldn’t really come up with the information literacy connection I wanted (there is definitely stuff there – if nowhere else, it’s in the discussion of privacy – but the connection I was looking for wasn’t there for me at that moment), so I didn’t.

But then last week, I saw this article, Clickstream Data Yields High-Resolution Maps of Science, linked in the ResearchBlogs twitter feed (and since then at Visual Complexity, elearnspace, Stephen’s Web, Orgtheory.net, and EcoTone).

And they connect – because while this specific type of inquiry isn’t one of the examples listed in the Science article, this is exactly what happens when you turn the huge amounts of data available, all of those digital breadcrumbs, into a big picture of what people are doing on the web — in this case what they are doing when they work with the scholarly literature. And it’s a really cool picture:

The research is based on data gathered from “scholarly web portals” – from publishers, journals, aggregators and institutions.  The researchers collected nearly 1 billion interactions from these portals, and used them to develop a journal clickstream model, which was then visualized as a network.
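The basic shape of that pipeline – turn ordered click sequences into weighted transitions between journals, then treat those transitions as a network – can be sketched in miniature.  This is a toy illustration only, not the authors’ actual method or code; the sessions and journal names here are invented.

```python
from collections import Counter

# Toy clickstream sessions: each is one user's ordered journal accesses.
# (The real study used nearly 1 billion portal interactions; these are invented.)
sessions = [
    ["Science", "Nature", "PLoS ONE"],
    ["Science", "PLoS ONE"],
    ["Nature", "Science", "Nature"],
]

# Count within-session transitions between journals -> weighted edges.
edges = Counter()
for session in sessions:
    for src, dst in zip(session, session[1:]):
        if src != dst:
            edges[(src, dst)] += 1

# Normalize each journal's outgoing counts into transition probabilities –
# the kind of "journal clickstream model" that can be drawn as a network.
totals = Counter()
for (src, _dst), n in edges.items():
    totals[src] += n
model = {edge: n / totals[edge[0]] for edge, n in edges.items()}

for (src, dst), p in sorted(model.items()):
    print(f"{src} -> {dst}: {p:.2f}")
```

The edge weights are what a network-visualization tool would then lay out: journals that readers move between frequently end up drawn close together, which is how the “map of science” emerges.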

For librarians, this is interesting because it adds richness to our picture of how people, scholars, engage with the scholarly literature – dimensions not captured by traditional measures of impact data.  For example, what people cite and what they actually access on the web aren’t necessarily the same thing, and a focus on citation as the only measure of significance has always provided only a part of whatever picture there is out there.  Beyond this, as the authors point out, clickstream data allows analysis of scholarly activity in real-time, while to do citation analysis one has to wait out the months-and-years delay of the publication cycle.

It’s also interesting in that it includes data not just from the physical or natural sciences, but from the social sciences and humanities as well.

What I also like about this, as an instruction librarian, is the picture that it provides of how scholarship connects.  It’s another way of providing context to students who don’t really know what disciplines are, don’t really know that there are a lot of different scholarly discourses, and who don’t really have the tools yet to contextualize the scholarly literature they are required to use in their work.  Presenting it as a visual network only highlights the potential of this kind of research more.

And finally – pulling this back to the Science article mentioned at the top – this article is open, published in an open-access journal, and I have to think that the big flurry of attention it has received in the blogs I read, blogs with no inherent disciplinary or topical connection to each other, is in part because of that.

———————-

Lazer, D., Pentland, A., Adamic, L., Aral, S., Barabasi, A., Brewer, D., Christakis, N., Contractor, N., Fowler, J., Gutmann, M., Jebara, T., King, G., Macy, M., Roy, D., & Van Alstyne, M. (2009). Computational social science. Science, 323(5915), 721–723. DOI: 10.1126/science.1167742

Bollen, J., Van de Sompel, H., Hagberg, A., Bettencourt, L., Chute, R., Rodriguez, M., & Balakireva, L. (2009). Clickstream data yields high-resolution maps of science. PLoS ONE, 4(3). DOI: 10.1371/journal.pone.0004803