On Newspapers as Sources (A Historian’s Craft)
The Academic Manuscript (Wicked Anomie)
Useful Chemistry (Open Notebook Science)
I don’t know how many of you are aware of the explosion of copyright discussion surrounding Emily the Strange and Nate the Great (and the alleged intersections between the two). I read probably more than my share of comics news, and most of it entirely passed me by.
At issue: does this character equal this character (the one in the top right corner)?
It’s an interesting case if you’re interested in copyright, copyleft, creativity and the like – Shaun outlines a lot of the salient points here – both about the situation itself, and about the discussion of the situation.
Because there’s a picture going around that looks like this -
He argues, really well, that to focus on that physical similarity is to miss the point, or what should be the point:
However unoriginal her figure may be, Emily is not a direct copy of Rosamond. She is an adaptation. Most importantly, the two characters exist in entirely different contexts. The fact that Rosamond is a supporting character in someone else’s narrative while Emily is at the center of her own storyworld, is, or should be, the most salient point in this discussion.
Copyright should afford people, and notably the actual creators of a work, protection against actual plagiarism, or at least a right to proper attribution, but that is a far distance from being able to lock up all references to, pieces of, or derivations of a work, especially in, or something very much approaching, perpetuity. The fact that creators and other copyright owners feel compelled, and empowered, to assert such rights is a threat to continued creativity.
Lots to think about.
Okay, not really. And OMG Peer-Reviewed Monday is back! But there are connections to the one-shot here, really.
One thing that came out, over and over, in the research that Kate and I just presented at WILU was the idea that students in information literacy classes aren’t motivated to do the work, and that the instructors in those classes have to work super hard and super constantly on engagement.
So this special issue of the Journal of Research in Science Teaching caught my eye because of its focus on motivation. And this article in particular further caught my eye because of its focus on situational interest.
Palmer, D. (2009). Student interest generated during an inquiry skills lesson. Journal of Research in Science Teaching, 46(2), 147–165. DOI: 10.1002/tea.20263
(I couldn’t find an online copy, even though this is a RoMEO green publication)
Palmer defines “situational interest” in contrast to “personal interest” where the latter is a deep, lasting, engaged interest in a topic or domain. The former, in contrast, is “short-term interest that is generated by aspects of a specific situation.”
The relevance to information literacy instruction is obvious when Palmer’s example of the kind of “situation” that can spark “interest” is a particularly engaging or awesome demonstration. Like I said, one idea that came through in many, many stories we gathered was the idea that if we as librarian instructors can be engaging, exciting, fun and compelling enough, our students will be motivated to learn.
Palmer synthesizes several factors that lead to situational interest from the literature: novelty, surprise, autonomy, suspense, social involvement, ease of comprehension and background knowledge. So one could expect that there are things that could be in an IL session that are not spectacular demonstrations that could still tap into this idea of situational motivation – content they’re not expecting (novelty and surprise), giving them real, authentic choices (autonomy) and group activities (social involvement).
So on to the study itself – Palmer’s purpose was to evaluate different parts of a science lesson to determine how much situational interest was generated, and to identify the sources of that interest. Students participated in a 40 minute lesson (the topics of the lesson varied, though the basic structure did not, to protect against the possibility that some topics are just more motivating than others). The data gathered was qualitative, gathered via group interviews at the end of the lesson.
His population was younger than you’d find in my academic environment – 14-15 years old. And, of course, they were studying science, not library research. On the other hand, he chose a hands-on lesson, delivered in one shot, which does have relevance for my sessions.
The results are interesting in part because of how similar they are, in some specific ways, to the ways librarians and faculty describe student library research skills. For example, the researchers examined how students engaged in inquiry skills (problem-setting, observation, reporting, analyzing, etc.) during the lesson. While they had many chances to use these skills, most of the time they “were not of a high standard.” Students were more likely to describe their method than articulate meaningful questions, and more likely to describe their experiment than analyze their results.
When it comes to motivation, students demonstrated a significant preference for certain parts of the lesson. The lesson was broken into the following pieces for analysis: Proposal, Demonstration, Experiment, Copying Notes, and Report.
Of these, students showed the lowest amount of interest during the Copying Notes phases, by a lot. Anyone surprised?
Of the others, they showed the most interest during the Experiment phase, with Demonstration next. Both of these were noticeably higher than Proposal and Report. The two pieces with the highest total level of interest also had high levels of interest in terms of how many students showed it: 95% showed interest during the Experiment phase, and 90% during the Demonstration.
In the interviews, students said copying notes wasn’t interesting because it was what they were used to doing in science class. That’s kind of sad. This piece was meant to get at the domain knowledge needed for motivation, but there must be a better way to do that. I copy notes of my own volition regularly, but it sounds nightmarish as an in-class activity.
Learning came up over and over as a source of interest, which explains the popularity of both the Demonstration and the Experiment phases. Students in 68% of the groups said that having a choice about what to do was a source of interest, even though it only really came up in the Proposal phase. Physical activity was also a source of interest, and this one connects most strongly to the Experiment phase. Novelty and surprise came in a little lower. These codes actually came up most often in the Demonstration phase. Palmer points out, however, that the learning they said they liked could in fact be a form of responding to novelty – enjoying the learning because it was something new.
Palmer, in fact, seems to credit most of the “learning” responses to the idea of novelty — he concludes that three factors are most responsible for student motivation:
In summary, it has been argued above that the situational interest experienced by students in this study was basically derived from three separate sources — novelty, autonomy (choice) and social involvement.
The decision to dismiss learning as a factor here seems a bit abrupt. I would have liked to see it unpacked a little more – as it is, I don’t see the evidence Palmer saw. This connects to the biggest gap in the paper, from my perspective: there’s no reporting of any assessment of the learning that did happen in the lesson, with the exception of the evaluation of inquiry skills (which is presented separately from any content or domain knowledge learning).
It seems a little incomplete to talk about motivation to learn without talking about learning. As Palmer says himself:
a student might be very highly motivated to learn in a lesson, but if the teacher does not use appropriate teaching techniques by guiding and scaffolding the direction of learning, then very little science will be learnt. For optimal learning to occur, motivational strategies need to be used in tandem with instructional strategies which focus on the development of scientific understandings.
One of the inherently interesting things about this paper for instruction librarians is its focus on immediate classroom practice. There is nothing in this research method or in the analysis of the results that wouldn’t totally apply to the one-shot, which is pretty rare in the education literature. Of course, the author counts this as a limitation in the study, because real inquiry “is usually developed over a longer time frame than the 40-minute procedure used in this study.” But still, his limitation is our relevance.
Connected to this is his finding that there is a lot of variability in situational interest throughout this lesson. The different pieces were only a few minutes long, so that suggests that students’ interest and motivation can change very quickly. On the one hand, this suggests that we could lose them quickly. On the other hand, it also suggests that perhaps we shouldn’t worry so much about those normal ebbs and flows. If one piece isn’t hugely motivating to them, the next one could be.
Other implications for instruction librarians are found in the lit review, where Palmer uses research that suggests “multiple experiences of situational interest” can develop into long-term interest. At best, this suggests that students would need repeated exposure to awesome information literacy teachers to develop a long-term interest in research or inquiry from IL classes alone. In fact, Palmer suggests that one reason for the mediocrity he observed in inquiry skills was that students didn’t really have enough experience with independent inquiry to know how to talk about what they were doing.
Situational motivation seems like a fruitful line of further inquiry for instruction librarians, though even this easy intro to the subject suggests that it’s not a panacea for what ails the one-shot, or for what ails the librarians who teach too many of them.
If we define “good information” as something that makes you go “OMG I have to find out more about that,” instead of defining it as something that makes you go “wow, I never knew that, and this source is entirely trustworthy so it must be true and I don’t have to look into it any further,” does that change how we think about evaluation?
(yes, I read a cool thing on the Internet before my coffee was ready)
There’s something about spring term that’s always crazy. Last week was my last presentation obligation of the term – the WILU conference in Montreal. WILU is one of my favorite conferences, based on the one time I’ve been before, and luckily we presented on Tuesday, so I was able to enjoy most of it without imminent presentation pressure looming over my head.
Kate and I presented on some very early findings from a research project we have been working on for the last several months – examining stories that instruction librarians tell. I told Kate at the end that if I ended up blogging about this presentation at this early stage, it would be to write something up about how incredibly valuable it can be to present on research in the early stages, even in the very early stages.
Basically, the segment of the research that we presented on at WILU was drawn from an online survey where we asked instruction librarians to share some stories. Our interest is … epistemological. We were hoping to identify some themes that would suggest what we “know” as instruction librarians and professionals, as well as some ideas of what we talk about, worry about, and feel proud about when it comes to our professional practice. This work was primarily intended to inform another round of story-gathering, done as interviews, but we were also hoping that these preliminary results would be interesting on their own.
ETA: it was brought to my attention that some more information about the kinds of stories we gathered might be useful. This is the slide listing the prompts we used to elicit work stories. They’re adapted from a chapter in this book.
So beyond the obvious benefit of a deadline and potential audience forcing you into your data to figure out what it might say early on, presenting even those early findings was a really positive experience. For one, other people are as interested in the story method as we are, which is awesome.
For another, a whole room full of other pairs of eyes is a fabulous thing. Kate and I started the conversations that started this project by talking about this conversation between Kate and me and some others (further framed into reflective-practice talk by Kirsten at Into the Stacks), though I don’t think it has stayed there. There has definitely been research-question creep along the way.
We started the project thinking about theory/practice, as is obvious from the conversation linked above. And we made the connection to reflective practice based on that as well – based on the idea that scholarship represents another way of knowing what we know, and thinking about ways that scholarship can inform and push our reflections on practice.
And we got a great question about whether it makes sense to conflate scholarship with theory in this context, especially when, as another commenter mentioned, much of the LIS literature isn’t clear about any theoretical frameworks the author used. A really useful question for thinking about that scope creep – and also exactly the kind of question I can never answer on the spot.
Theory vs. practice is useful shorthand, especially in a short session like these were. And I do think that including non-theory generating scholarship in the initial conversations that sparked the project reflected some of the ambivalence we were seeing. As I said at that time, I really don’t think all of that ambivalence is tied up in “if the scholarship in librarianship was more useful, or more rigorous, or more scholarly, or better-written, or more theoretically grounded, I would totally use it.”
I also think that Schon’s Reflective Practitioner allows these things to be discussed together as well, not because he conflates them, but because he sets the Reflective Practitioner in contrast to both the pure theorist and the applied scientific researcher:
As one would expect from the hierarchical model of professional knowledge, research is institutionally separate from practice, connected to it by carefully defined relationships of exchange. Researchers are supposed to provide the basic and applied science from which to derive techniques for diagnosing and solving the problems of practice. Practitioners are supposed to furnish researchers with problems for study and with tests of the utility of research results.
Schon argues that this hierarchical model of professional knowledge has dominated the way we understand, and teach, professional practice – and it is in both the development of grounding theory (basic, disciplinary knowledge) and the development of a body of rigorous, scientific applied knowledge for problem-solving that the practitioners, and the practitioner’s unique ways of knowing, are left out of the equation.
Which is a long way of saying that the initial connections we were making still have value for me when thinking about these questions, but I’m not sure we want to stop there. All of this raises the question of whether thinking about these questions, and about the stories, with a clearer distinction between theory and practice in mind might be more useful. I think maybe it would be. Clarity is good, and a lack of clarity in prior discussions might actually suggest the need for more clarity all by itself.
But the conversation brought a couple of additional thoughts to the forefront, neither of which were really clear until the mental presenting-dust settled.
Here and there along the way, I’ve been thinking about the real-world information literacy literature and its connection to this discussion. One reason to not discuss it in our 30 minutes was the fact that some of what I have read in that literature recently (as relates to real-world information literacy in professional contexts) examines the differences between the ideal knowing captured in our professional texts/ training/ theories and the real-world/ tacit/ experiential knowing that comes with actually dealing with the uncertainty of practice. The connections to our original questions probably seem clear, but I wasn’t comfortable calling the peer-reviewed literature our abstract, ideal text-based knowing in the same way as the firefighter’s manuals were understood in this article, for example.
Which on the one level is part of the subject of our next steps with this project – figuring out what our abstract, ideal, text-based knowing IS in instruction librarianship. But on another level points to the problems with conflating theory and scholarship – parsing them out more clearly I think would make the connections to this body of literature more useful.
Related to this comes the question of our training (or lack thereof) as instruction librarians, in LIS education and after that. Between us, we saw several sessions about professional development for new librarians, which dovetailed with conversations we’d had about the distinction between the stuff we read related to information literacy in grad school and most of the stuff in the literature today.
Kate mentioned that the articles she read in library school instruction classes weren’t the articles about practice, but about theory. I didn’t take a specific instruction class, but I would say the same was true at my school, and was definitely true in the learning theory class that I took. I think to follow up on that question usefully will also require parsing that discussion more clearly.
So thanks to all of the people who participated in this great (for us) conversation, and we’ll be contacting people soon for the next round of work on the project.
Final lesson from WILU? I’m still useless when I try to speak from notes. Not necessarily the speaking part, though it is definitely not natural for me, but more the actually using the information in my notes part. I tried in this talk, not throughout but just in one moment at the end, and I still made a total mess of the process. I walk away from them, get lost, talk past where I am in the notes, and leave things out anyway. It’s weird that speaking from notes is as much a learned skill as speaking without them is, but it totally is. I think I blame high school debate, and I suspect it’s too late for me now.
So the whirlwind trip back to Philadelphia is over. It was definitely fun; still might end up being ill-advised. The WILU presentation is looming large now, and we’ll see how much I end up missing that 36 hours.
We walked up to see the hullabaloo that is alumni weekend/ reunions at my alma mater, and while we were there we came upon the alumni edition of the student newspaper, which featured this story below the fold on the front page — LexisNexis offers free access to law school grads pursuing nonprofit work.
The program in question is called the ASPIRE program – Associates Serving Public Interests Research. This sounds like something that might have been around forever – a corporate donation to public good work type of thing, but it’s actually in response to the current economy, which I thought was really interesting. First, because being past the friends graduating from law school phase of my life, this is an impact of the current situation I didn’t know about –
Almost a fifth of Penn Law’s graduating class had full-time post-graduate plans secured – only to have their hiring firm delay start dates and withhold expected salaries.
Since a large number of firms are recommending, or even requiring, that graduates pursue nonprofit work during the deferred interim, a number of graduates are finding themselves without adequate resources.
It’s also interesting to me because of what it says about the significance of information, and access to information. Obviously, keeping law school grads addicted to the kind of access, convenience and research power that LexisNexis affords has an economic upside for this company – I regularly tell students that I am going to have to stay in academia my whole life because I don’t want to give up that access. But that doesn’t change the fact that doing research with access to LexisNexis is a different thing than doing research without it. One of the things the haves have is this kind of access, and this program makes me think more about that.
I have a more substantive post brewing, but I am about to head out on a whirlwind (and probably ill-advised) trip back east to visit friends, and I don’t think it will get done tonight.
I wanted to mention a couple of things today, though, while they are still fresh in my mind. I taught WR 222 again today, which is a composition class that focuses on non-scholarly public discourses. When I do sessions for this class, I don’t have to talk about finding books, or scholarly articles so much as editorials, opinion pieces, letters to the editor, blogs, tweets, and the like.
The students come in very early in their process too; about half of each class is still in the “what should I write about” phase. So I also focus on showing them places where they can browse lots of ideas, arguments, opinions and points of view.
This year, I showed Newsmap again. It was the first time I really looked at it since it upgraded, and it is improved. Unlike last year, there is a search function now. It’s still a slice of Google News, and the search bits have some glitches. But for the students who don’t want to browse, and who still want the visual interface, it’s an improvement. This tool was a big hit again, most of the students at least tried it, and some stayed in it the whole session.
I also showed some of the new additions to Google search- search options, unveiled this week at the Searchology conference. I haven’t really figured out how to make the timeline thingy work interestingly. And I haven’t browsed the options for narrowing down to recent information very much.
Mostly, I wanted to show the Wonder Wheel, which offers its own visual search interface (h/t Caleb). It’s pretty fun to browse up and down connected searches. This one was popular as well, especially in the second class I taught today – in that group, I saw at least half of the group trying it out.
Usually, when students come in to the library and I’m teaching, they have topics and they’ve done some work. I honestly don’t know how it would work to encourage open browsing in library databases – it might work out great, but still, I’m glad that when I have a reason to encourage it, it’s in this class where we can use these fun tools to do it.
I give up.
You know that there is an intersection between science and marketing – 4 of 5 doctors agree that X works for Y?
Most of the marketing goes on below public radar; it’s not directed at us, but at other medical professionals. This 2005 article at PLoS Medicine couldn’t state it more strongly: Medical Journals are an Extension of the Marketing Arm of Pharmaceutical Companies.
This article is talking about sponsored trials – research that is sponsored by drug companies, that finds that the drug in question works:
Overall, studies funded by a company were four times more likely to have results favourable to the company than studies funded from other sources. In the case of the five studies that looked at economic evaluations, the results were favourable to the sponsoring company in every case. The evidence is strong that companies are getting the results they want, and this is especially worrisome because between two-thirds and three-quarters of the trials published in the major journals—Annals of Internal Medicine, JAMA, Lancet, and New England Journal of Medicine—are funded by the industry (citation here, Egger M, Bartlett C, Juni P. Are randomised controlled trials in the BMJ different? BMJ. 2001;323:1253.)
Which has been a topic of conversation for a while, but why stop there? If the drug companies can create a bunch of the research, why don’t they create the journals too? Just create a journal. Don’t pretend that it’s reporting knowledge for the public good, don’t make it so the public can even find it, don’t make it so the doctors can even find it – don’t index it in Medline, don’t even put a website up.
The full original story is behind The Scientist’s registration-wall, so here’s a good summary with extra added TOC analysis from Mitch André Garcia at Chemistry Blog.
See, I talked briefly here a while back about my frustration with people like Andrew Keen and Michael Gorman when they accept uncritically the idea of traditional media gatekeepers serving a quality-control or talent-identifying role, without acknowledging that the corporate media makes many decisions that are not based on a mission of guaranteeing quality or identifying genius.
And Kate and I talk frequently about how traditional methods of scholarly publishing are not intended to guarantee quality in terms of identifying the best articles, or even the most true or accurate articles, but that those methods are instead intended to create a body of knowledge that supports further knowledge creation.
We’ve managed to fill presentations about peer review pretty easily without focusing on the corporatization of scholarly publishing — there’s a lot of discussion of this corporatization in open access conversations already, and a lot of confusion comes up about the implications of open access for peer review. Sometimes it seems like every open access conversation in the broader higher education world gets bogged down by misunderstandings about peer review. So it has seemed true that drawing this artificial, but workable, line between what we are talking about and what we’re not just makes it easier to keep our focus on peer review itself.
But man – it might be just too artificial. Maybe we can’t talk about peer review at all anymore without talking about the future of a system of knowledge reporting that is almost entirely dependent upon the volunteer efforts of scholars and researchers, almost entirely dependent upon their professionalism and commitment to the quality of their disciplines, in a world where ultimate control is passing away from those scholars’ and researchers’ professional societies and into the hands of corporate entities whose decisions are driven not by commitment to quality, knowledge creation or disciplinary integrity, but by the bottom line.
We’ve been focusing on “why pay attention to scholarly work and conversations going on on the participatory web” mostly in terms of how these things help us give our students access to scholarly material, how they help our students contextualize and understand scholarly debates, how they lay bare the processes of knowledge creation that lie under the surface of the perfect, final-product article you see in scholarly journals. And all of those things are important. But I think we’re going to have to add that “whistleblower” aspect — we need to pay attention to scholars on the participatory web so they can point out where the traditional processes are corrupt, and where the gatekeepers are making decisions that aren’t in the interests of the rest of us.
Here’s the article at BoingBoing
blog.bioethics.net (American Journal of Bioethics)
Drug Injury watch blog has links to reports of the Australian court case where the story was noticed earlier.
Or kind of. After writing this post last winter, I started thinking about this idea as a way to connect with some of the classes I work with. Quick recap: I was looking at craft tutorials online and came up with some common characteristics they had that our library tutorials don’t always have.
I work with some of our distance education classes, the writing classes for example, and having some very quick and easy “here’s how to get this thing done” how-tos makes so much sense for those students. I tend to answer the same questions over and over, and I have access to their class space in the LMS and to their email addresses.
But it’s not just the distance classes that I am thinking about. I taught for a business writing class and it was exactly the kind of class I frequently have trouble with. The students need to do a little bit of very specific kinds of research for every project they have in this class — there’s no way to time a single instruction session so that it works for this class.
To show them how to find the specific types of stuff (information on non-profits, job listings, community statistics, opinion polls, company information, annual reports, and on and on and on) they need to find, in a face-to-face session inherently means spending most of the session doing straight-up how-to demos to support assignments they don’t even have yet. There’s no way around it. There’s no way the instructor could have structured the class any better, and there’s no way that I could make these topics more relevant in a traditional one-shot.
And the stuff is pretty straightforward – it’s mostly a matter of pointing to where the stuff is, and a few tips on the how, and they can take it and run with it from there. The complex part of what students need to do in this class is to figure out what kind of evidence they need to write about the project they’ve come up with for the audience they have — that’s good, interesting work but it’s also not well-suited to a one-shot because they have to do this over and over again for every project they do. So multiple one-shot sessions would make no sense for this class either.
What makes sense to me is to connect with this class at the start of the term, by visiting them in person since they are an in-person class. But the quick connection at the start would be pretty easy to replicate online. And once that first connection is made, it makes sense to me to send the class quick how-to information about the stuff they need to find, when they need to find it.
I am also thinking about some of the large general education classes that I would like to support, but which we could never support with face-to-face sessions given current staffing levels. We are already embedded into the First Year Composition curriculum, which is the only course required for all of our undergraduates. But there are a lot of other courses that have a lot of undergraduates enrolled and some of those have assignments that require outside sources. Thinking about the opportunity to reach 500 or so students with some point-of-need help (that reinforces the FYC lessons) in each of those classes, while continuing to reach 700-800 in FYC – that would make me pretty happy about our impact on the first-year experience.
So, is copying the craft tutorials the way to go? Maybe it is – not entirely copying, but there are some opportunities there, I think. Our web developer, Susan McEvoy, put together a blog for me to use just for this – that should let us track the same kind of statistics we track on the overall website. It’s very simple, stripped-down. The posts are just text and images. Because I write fast, putting together one of these takes 20-40 minutes, with most of that taken up uploading images.
So that means I can be really responsive and tailor things to assignments. It’s also easy to send students a link and announcement from within the LMS. In fact, there’s a DIY Tutorial on how to do that.
So how do they match up with the craft tutorials? Do these concepts translate? Sometimes yes, sometimes no.
1. They’re kind of at the point of need, they’re kind of not.
This is true in that they are sent to students at the point of need, and they also persist, so they can be found or re-found later. But I don’t think they’re very searchable now, given that I haven’t done much to make that happen. The images are all on flickr, which is something I think could be utilized better – at first I thought putting together Joe Murphy-style tutorials at the same time as the DIY tutorials made sense, but then I realized that I re-use a lot of the images. But I think the tagging here could connect people to the finished products too, if I think about it more.
2. They’re all about how to make something.
This one, I have trouble with. The bibliographic management ones work in this way – “make a bibliography.” A lot of the others are more process-focused. I tried to focus the titles at least on the thing(s) that could be found with the process, but I think this one needs more work too.
3. They usually assume some level of knowledge on the part of the user.
I did link out to other tutorials when I thought there might be things people didn't know. But otherwise these are limited to the specific thing they are about, not all of the building-block knowledge people will need, or the additional questions they might spark.
4. They are presented using social tools.
Yes, and this is important. There is an issue with the comments, since most of our users don’t log in to the system before using it. But putting it on a blog allows for the content to be repurposed easily into our Course Pages:
and our LMS:
5. There’s value added. They do some of the work for the user.
This, I haven’t figured out. Perhaps if I was working in less of a teaching environment this would be easier.
6. A lot of the time, they’re marketing tools.
7. They are created within an existing community.
To the extent that they are being created and conceptualized entirely within existing classes, yes, this works. To the extent that being of a community makes them findable, I think that is less clear.
So, we’ll see how it goes. I have no plans for assessment at this point beyond web log information – including time spent and return visits, which is more interesting than straight hit counts. And I have a fairly modest definition of success – these take so little effort to make that I don’t need all of the students in the class to find them useful, or even to try them. I will keep you posted.
So slammed, so briefly (well, for me). Via CrookedTimber, a pointer to this post by Julian Sanchez on argumentative fallacies, experts, non-experts and debates about climate change. It’s well worth reading, especially if you are interested in the question of how non-experts can evaluate and use expert information, a topic that I think should be of interest to any academic librarian.
Obviously, when it comes to an argument between trained scientific specialists, they ought to ignore the consensus and deal directly with the argument on its merits. But most of us are not actually in any position to deal with the arguments on the merits.
Sanchez argues that most of us have to rely upon the credibility of the author — which is a strategy many librarians also espouse — in part because someone who truly wants to confuse them can do so, and sound very plausible.
Give me a topic I know fairly intimately, and I can often make a convincing case for absolute horseshit. Convincing, at any rate, to an ordinary educated person with only passing acquaintance with the topic.
Further, he suggests that the person who wants to confuse a complex issue actually has an advantage over those who want to talk about the complexity:
Actually, I have a plausible advantage here as a peddler of horseshit: I need only worry about what sounds plausible. If my opponent is trying to explain what’s true, he may be constrained to introduce concepts that take a while to explain and are hard to follow, trying the patience (and perhaps wounding the ego) of the audience:
Come to think of it, there’s a certain class of rhetoric I’m going to call the “one way hash” argument.
And that’s where we get to the evaluation piece. We need to know how much we know in order to decide whether it even makes sense to try to evaluate the arguments. Because if we don’t know enough, trying to evaluate the quality of the actual argument will probably steer us astray more often than using credibility as our evaluation metric.
If we don’t sometimes defer to the expert consensus, we’ll systematically tend to go wrong in the face of one-way-hash arguments, at least outside our own necessarily limited domains of knowledge.
(Note: I skipped most of the paragraph where he really explains the one-way hash argument – you should read it there.)
The thing I really want to focus on is this – that one word, consensus. Because I don’t think we do much with that idea in beginning composition courses, or beginning communication courses, or many other examples of “beginning” courses which often serve as a student’s first introduction to scholarly discourse.
And by “we” here, I’m talking about higher ed in general, not OSU in particular. I think we ask students in these beginning classes to find sources related to their argument; their own argument or interest is the thing that organizes the research they find. They work with each article outside of any context, except that which might be presented in the literature review – they don’t know if it’s solidly mainstream, a freakish outlier, or suggesting something really new.
So they go out and find their required scholarly sources, they read them, they think about the argument in the scholarly paper and how it relates to the argument they are making in their own paper, and they try to evaluate it – and of course, they evaluate mostly on the question of how well it fits into their paper. And what other option do they have?
Sanchez argues, and it rings true to me, that we usually don’t have the skills to evaluate the quality of the argument or research ourselves. And I know that I am not at all comfortable with the “it was in a scholarly journal so it is good” method of evaluation. Even if they find the author’s bona fides, I’m not sure that helps unless they can find out what their reputation is in the field, and isn’t that just another form of figuring out consensus?
In some fields, meta-analyses would be helpful here, or review essays in others, but so many students choose topics where neither of those tools would be available that it’s hard to figure out how to use them in the non-disciplinary curriculum.
And perhaps it doesn’t matter – maybe just learning that there are scholarly journals and that there are disciplinary conventions is enough at the beginning level. But if that’s the case, then maybe we should also let go of the question of evaluating scholarly arguments at that level?