Peer Reviewed Monday – Scaffolding Evaluation Skills

So this week we’re also behind a paywall, I think.  Someday I will have time to actually go looking for Peer Reviewed Monday articles that meet a set of standards, but right now we’re still in the “something I read in real life this week” phase.

And this one was interesting – so far, when I have found articles that are specifically about deliberate interventions designed to teach something about peer review or about research articles, they have almost always come from this literature, the literature on teaching science.  Not surprising, but it does raise the question of disciplinary differences.  Still, the overarching takeaway of this article isn’t so much that everyone should teach about evaluating scientific evidence the way these authors did as it is that everyone should be teaching this on purpose, and over and over.

Which is a message I can get behind.  And one, I suspect, that is true across disciplines.

So the article has two parts.  One is a presentation of the model the authors used to teach students to evaluate evidence, and the second is a report on their research assessing the use of the model in a class.  Their students are not college students, but advanced high school students.

The authors open by arguing for the significance of evaluation skills in science -

Students, more frequently now than before, are faced with important socio-scientific dilemmas and they are asked or they will be asked in the future to take action on them.  They should be in position to have reflective discussions on such debates and not accept data at face value.

They further argue that students are not being taught this now – that most problem-based or inquiry-based curricula take the data as a given, and don’t include “question the data” as part of the lesson.  (I almost think they are arguing that this is even more important now than it was before because of the current emphasis on active, experiential learning.  That they’re suggesting that this type of pedagogy requires evaluation skills that the old lecture model didn’t, but that teaching evaluation skills hasn’t been built into these curricula.  That’s an interesting idea.)

In the lit review, the authors spend some time on the question of what “credibility” means.  For the purposes of this paper, they argue that there are two main components to the assessment of the credibility of evidence: the source of the evidence and the method by which it was constructed.  This interpretation is heavily influenced by Driver et al. (2001).

Questions to ask of the source:

  • Is there evident bias or not?
  • Was it peer-reviewed?
  • Who is the author? What is their reason for producing the evidence? What is their background?
  • What is the funding source?

Questions to ask of the methodology:

  • Does the evidence refer to a comparison of two different groups?
  • Is there any control of variables?
  • Were the results replicated?

The review of the literature suggests that there is ample evidence to support the claim that students are uncertain about how to evaluate evidence and assess claims.  This holds true across grade levels and disciplines.  They also suggest that there is very little research on whether these skills can be improved.

[Image: steel building framework]

Credibility Assessment Framework

The authors then turn their attention to the Credibility Assessment Framework, which they believe will help high school students build the skills they need to assess evidence in inquiry situations.  The framework is based on two specific theoretical concepts: the learning-for-use framework (Edelson, 2001) and the scaffolding design framework (Quintana et al., 2004).  The framework is intended to help designers create good learning activities that include:

  • authentic contexts
  • authentic activities
  • multiple perspectives
  • coaching and scaffolding by the teacher at critical times
  • authentic assessment of learning within the tasks
  • support for the collaborative construction of knowledge
  • support for reflection about and articulation of learning

What they did

The team spent eight months building the learning environment for a class of secondary school science students.  They built their evaluation learning activities around a project where students were supposed to be doing hands-on work on an ill-structured and complex problem (food and GMOs) — a context where their work should naturally and authentically benefit from the critical evaluation of multiple sources of evidence.

One thing that is significant here is that the authors supplied the research for the students to evaluate — they didn’t include a “finding stuff” piece in this work.  But they also modified the sources that the students were going to use, when they felt it was important to do so to decrease the cognitive load on students.  What was really interesting to me was what they added in – context: why the study was done and where it fit.  This is exactly what I feel (feel, because I haven’t got data) my students are missing when they’re just assigned “peer-reviewed articles.”

This information was put in a database in the students’ online learning space.  This space includes both an “inquiry” environment and a reflective “WorkSpace” environment; the project used both.

Scaffolding was built in, using both human-provided information (from the teacher) and computer-supported information (available online for the duration of the unit).  The unit as a whole lasted eleven weeks, with eleven 90-minute lesson plans.  The students started out doing hands-on experiments, and then spent the remainder of the unit doing groupwork which included data evaluation.  At the end, the groups presented their findings.

In the first four lessons, the students were evaluating the provided sources without direct instruction. In the fifth lesson, they did a specific exercise where they evaluated the credibility of two sources unrelated to the class’ topic — this was done to reveal the criteria that the students had been unconsciously using as they attempted to evaluate provided sources in the first four weeks.

What they found out

The authors gathered pre- and post- test data using two instruments.  One measured the mastery of concepts and the other the evaluation skills.  They also videotaped the class sessions and used data captured from the online learning environment.  There was a control class as well, which did not have any of the specific evaluation lessons. The authors found that for the study group, there was a statistically significant difference between the pre- and post- tests for both conceptual understanding and evaluation skills.  For the control group there was no significant difference.

Two findings I found particularly interesting:

  • Including the qualitative data gave more insight.  In the pre-tests students were able to identify the more credible sources, but they were not able to articulate WHY those sources were more credible.
  • Within the particular components of credibility that the authors identified (source and method), the students did fine on author and author background by themselves, but needed help with type of publication and funding source.

The students needed scaffolding help on methodological criteria, and even with it, many students didn’t get it (though they got more of it than they had coming in – this was a totally new concept for most of them).

Here’s the piece that I found the most interesting.  The impact of the study, as interpreted by me, was not so much on the students’ ability to pick out the really good or the really bad sources.  It sounded to me like the real impact was that the students were able to do more meaningful navigation of the sources in the middle.  And I think that’s really important — and something that most students don’t know they need to know on their own.  Related to this – the students were likely to mistrust ALL “internet” sources at the beginning, but by the end they were able to identify a journal article, even if that journal article was published online.  That’s significant to me too – it shows the start of the more sophisticated understanding of evaluation that I think is necessary to really evaluate the scholarly literature.

Finally, the authors found that most of the conversations the students did have about evaluation came as the result of instruction – not on their own – which they took as evidence that instruction was needed.

As I said before, the point of the paper seemed to me to be more about the fact that this kind of direct intervention is needed, not that this specific intervention is the be-all and end-all of instruction in this area.  Beyond this, I think the paper is interesting because it illustrates how big a job “evaluation” is to teach – that it includes not only a set of skills but a related set of epistemological ideas — that students need to know something about knowledge and why and how it’s created.  That’s a big job, and I’m not surprised it took three months to do here.

Nicolaidou, I., Kyza, E., Terzian, F., Hadjichambis, A., & Kafouris, D. (2011). A framework for scaffolding students’ assessment of the credibility of evidence. Journal of Research in Science Teaching. DOI: 10.1002/tea.20420

it has been a while, yes

Wow, that was kind of an unplanned hiatus.  Since I last posted: my library has hired a new University Librarian, I received tenure, I gave some talks, almost all of Spring term has gone by, I was surprised by a completely unexpected but lovely award, I finally finished the IS executive committee minutes from Midwinter, and I submitted an epic proposal for IRB approval.

I am also almost done with an actual blog post.  Until that’s done, though, here’s something awesome:  a news website article (from The Guardian UK) entitled This is a news website article about a scientific paper.

A sneak preview -

This paragraph elaborates on the claim, adding weasel-words like “the scientists say” to shift responsibility for establishing the likely truth or accuracy of the research findings on to absolutely anybody else but me, the journalist.

If I could summarize one of my goals for library instruction it would be – to make sure OSU students understand the scholarly article better than this.

While using my iPad for article-reading, a blog post about Storify appeared

It has been ages since I talked about a new tool/service like this, but Shaun came home talking about Storify the other day and it sounded good, so I got myself an invite.

Basically, it lets you pull content from the dynamic web, including all of the social media suspects plus search results, into a timeline-like interface. You add text (or not) and you have a story.

Reading the “one year out” iPad posts that have been popping up, I have been thinking about how I use mine — especially how I use it differently than I expected.  One thing I didn’t expect was the extent to which I have used it to replace some of the paper in my life.  Not all of it, but some of it.  And one of the most interesting pieces of that story, to me, has been the extent to which some of the paper being replaced is the reams and reams of article printouts I used to create.

Those printouts were totally outside my workflow in so many ways – but I had to be able to:

  • Take them places (even my laptop is so much less mobile than a folder of paper and a pen).
  • Read them (which I could technically do, but not really do on my phone).
  • Take notes on them (typing doesn’t count for me.  I wish it did.  But it doesn’t).

With the iPad, some of that started to change.  Here’s a story about how.

 

[Screenshot: the top few lines of a story created using the Storify tool]

 

There are definitely some glitches – the integration with Flickr wasn’t working at all for me, for example.  But it was quick and intuitive and I like the output a lot.  I have some more interesting ideas for using it than this one.

I wrote a book chapter!

When I first started at OSU, I was browsing through some composition texts because I knew that part of my job was going to involve working closely with the writing program on the beginning composition class. While I was doing that, I came across some descriptions of different writing styles outlined by OSU professor Lisa Ede in her book Work in Progress and immediately recognized myself in her description of the “heavy reviser.”

(Seriously, she could have included my picture)

Reading that really had an impact on me – not that it changed how I write, at all, but it changed how I felt about it.  And most of all, it made it easier for me to write collaboratively.  Knowing my style as one of many meant knowing what to warn people about – knowing that my willingness to slash and burn through a draft just might freak the heck out of someone who writes that draft more deliberately than I do.

So it is especially wonderful that now I have collaborated with Lisa herself.  A year-plus ago she told me that she was substantially revising her textbook The Academic Writer, and asked if I would collaborate with her on the research chapter.  Chapter 6 and I spent many hours together over the next several months, and I am pretty happy with the results – even if the scope of the whole made it difficult for me to see the forest for all the trees while I was immersed in creation.

Doing Research: Joining the Scholarly conversation is available here, in OSU Libraries’ Scholars’ Archive – I hope it’s of interest and ultimately of use!

Zotero assignment revisions

So, in the end the Zotero assignment worked very well on the Zotero side, and less well on the information literacy side.  So I’m spending this week revising it and designing some new activities.  A few quick takeaways:

The assignment was trying to do too much.  It was the main way to assess:

  • Students’ ability to recognize different source types and explain where they fit into the scholarly process.
  • Students’ ability to track down those different source types.
  • Students’ understanding of the scholarly and creative output of their department (and, by extension, of the scope of intellectual activity within their discipline).
  • Students’ ability to use research tools to organize and manage their sources.

Way too much – illustrated mainly by the fact that there were a few students who managed to do all of those things in their work.  That made it very clear what the others were missing, and made me want to figure out a way for all students to get to where those few did in this class.

So here’s the thing – the first two outcomes up there were the problem, not the technology or logistics of syncing libraries and the like.  The bibliography project should really be about the 3rd and 4th outcomes.  The collaborative nature of the bibliography (and the ability to see the breadth of what our faculty produces) was lost on students who had to work too hard to meet all of the format requirements that were in place to measure the first two outcomes.  Those format requirements took away from the authenticity of the experience, and from the evaluation and contextualization I had hoped the students would be able to do.

So this term, I am planning to get at those first two outcomes in different ways, and then make some changes to the bibliography assignment:

  1. drop the number of sources required in the annotated bibliography from 5 to 3.
  2. increase the emphasis on evaluation (and multiple methods of evaluation) in the annotations.
  3. change the workflow a bit – have students create a broad, pre-evaluated body of resources in a personal library and then have them select their 3 sources from that larger pool, annotate them and add them to the collaborative bibliography.
  4. build in a required conference so that I talk directly to each student about the process fairly early on.
  5. drop the format requirements altogether and allow students to add any 3 resources they want (while increasing their responsibility to justify those choices in multiple ways in their annotations).
  6. push the due date for the sources up a week, add a week between the final sources due date and the final reflection due date, and target and focus the scope of the final reflection essay significantly.

(Big hat tip to my students.  Many of these changes were also articulated by them when I asked them to help – in some cases their input was what really allowed me to put my finger on the problems).

What about the tech?

In the end, syncing did cause problems for a few, and Zotero hurdles did cause problems for a few.  Students who were, for whatever reason, not able to spend a focused amount of time at some point earlier in the term learning the mechanics of Zotero found it very challenging to manage finding sources and figuring out Zotero in the context of a last-minute scramble.

I had thought that my students would have to do the bulk of their Zotero work at home because of having to re-download and sync Zotero every time in the classroom.  MY Zotero library was still very difficult to sync in the classroom (I assume the hugeness is a factor) but the students rarely had to wait for more than 2-3 minutes.  Clearly, I can and should rely a lot more on classroom time as a place where students can be working with Zotero.

Most students were very positive about Zotero.  A few found it cumbersome.  There was a clear pattern though that I found interesting, but troubling in that there is nothing I can do with it.  The pattern was this — those students who had reason to use Zotero for real, for a real research project, during the term were much, much clearer in their evaluation of its value.  And by extension, I believe that they are the ones most likely to keep using it.

My class is a 1-credit class.  I can’t assign an authentic, student-y scholarly research project that would fit in that little coursework.  But whether or not they have reason to use it in another class is nothing I can control.  It’s troubling because it points to a deeper issue about this class’ place within the major – issues we all know about but aren’t sure how to fix.

Yes, we did write that up

Finally!

Kate and I finally got an article related to our LOEX of the West presentation (from 2008!) finished and published.  The delay on this peer-reviewed article had nothing to do with publishing cycles and everything to do with writing process.  But it’s available (in pre-print) now, and I pretty much like it.

Beyond Peer-Reviewed Articles: Using Blogs to Enrich Students’ Understanding of Scholarly Work

Critical Literacy for Research – Sort of Peer-Reviewed Friday

Unexpectedly, it’s Peer Reviewed Friday.  Well, sort of.  Harvard Educational Review is a student-run journal, with an editorial board made up of graduate students who decide which articles get published.

I was teaching a class in our small classroom – where I never teach – so I went up early to make sure that I still knew how to work the tech.  It’s on the 5th floor, where the L’s are shelved, so I was flipping through the Fall 2009 issue while I waited for them to show up.  This article caught my eye — well worth reading, both for the content/ideas and because it is very enjoyably written.

Harouni, Houman (Fall 2009). High School Research and Critical Literacy: Social Studies with and Despite Wikipedia. Harvard Educational Review, 79(3), 473-493.

It’s a reflective, case-study type description of the author’s experiences reworking his research assignments in high school social studies classes. There’s a ton here to talk about – the specific exercises he developed and describes, the way the piece works as an example of critical reflective practice — but mainly I want to unpack this bit, which I think is the central theme of the work:

If students do not engage in the process of research inside the classroom, then it is natural for them to view the assignment in a results-oriented manner — the only manifestation of their work being their final paper and presentation.  It is not surprising then, that they are willing to quickly accept the most easily accessible and seemingly accurate information that satisfies the assignment and spares them the anxiety of questioning their data.  And when their final products did not meet my expectations, the students responded not by rethinking the research process itself but by simply attempting to adjust the product in light of what they perceived to be personal preferences. (476-77)

(emphasis mine)

Basically, the narrative he lays out says that his research projects had been unsuccessful for a while, but it wasn’t until he noticed his students’ heavy and consistent reliance on Wikipedia as a source that he started digging into why, what that meant, what he really wanted to teach, and what he really wanted students to learn.  And he changed stuff based on those reflections.

Harouni’s thinking about information literacy (which he calls “critical literacy for research”) was initially sparked by students who were not evaluating sources or showing any sign of curiosity as they researched, and it was sparked again when his first attempts at addressing those gaps didn’t work – when students tried, and failed, to evaluate texts they weren’t yet ready to evaluate.

Along the way, he talks about the limitations of a checklist, or “algorithmic,” approach to evaluation — limitations he discovered when he reflected on what his students actually did when he tried to use that approach in his classroom:

Two observations confirmed the shallowness of the learning experience created through the exercise: first, the students did not apply their learning unless I asked them to do so; second, they remained dependent on the list of rules and questions to guide their inquiry. (480)

In other words, they could do the thing he asked them to do (apply the checklist to information sources) but it didn’t affect their actual practice as researchers, nor did it change how they viewed the information they were getting from Wikipedia.

He also gets at why it is important to help students understand the openness and dynamism of Wikipedia, and why that by itself is not enough: “knowledge of the uncertainties of a source does not automatically translate into an awareness of one’s relationship with the information” (477).

This piece is, I think, essential to getting at the real value of his insights and experience — many of our students want to find certainty in their research processes.  They want to know that a source is good or bad.  Wikipedia bans feed that.  Checklists feed that too, especially when they are taught not as an initial step in an evaluation process but as the process itself.  What we really want students to be able to do when they research is to manage uncertainty — to say “I know this is uncertain, and I can figure out what it means for me as I try to answer my real, important, and complex question.”

Harouni’s process is an excellent reminder of how teachers want clarity too – and how they have to be willing to embrace uncertainty themselves if they are to guide students through a process of authentic inquiry:

In teaching critical literacy for research, I have had to separate research from its dry, academic context and consider it as an everyday practice of becoming informed about issues that have an impact on students’ lives.  I must value not answers but instead questions that represent the continued renewal of the search.  I must value uncertainty and admit complexity in the study of all things. (490)

In this, he knocks on the door of a question that I frequently have as an instruction librarian (one which I think many instruction librarians have — how much can I really accomplish as a teacher on my own?).  If the classroom instructor – the person who creates, assigns, explains, and evaluates the research assignment – isn’t actively engaged with the students’ research process, are there limits to what I can do?  I do think there are.  I don’t think those limits mean that I should do nothing, far from it – but I do think they affect what I should be trying to accomplish on my own, and affect the other ways I should be thinking about furthering my goals for students, inquiry, and learning.

At the end of the day, one of Harouni’s basic assumptions is that it is part of his job as a social studies teacher to foster inquiry and curiosity in his students: “[f]or two semesters, research projects remained a part of my curriculum — not because they were wonderful learning experiences, but because I could not justify, to myself, a social studies class that did not work to improve the way students navigated the ocean of available information” (474-5).  In other words, he believes that teaching information literacy is an essential part of what he does.  And that is key.  You can’t have that perspective and also value coverage – of content information – above all else.  It’s one or the other.  (Is it?  Yeah, I think it is.)

Not every faculty member is going to have that idea of what their job is.  And not every librarian will either – but I think maybe for instruction librarians it should be.  It is true that rules and clarity make coverage easier.  There was a question on ILI-L yesterday from someone (responding to an ongoing discussion about teaching web evaluation) asking “how do you even have time to talk about web evaluation when you have to cover all this other stuff?”

Rules make it easier to “cover” web evaluation.  Faculty want us to “cover” lots of different tools.  WE want to “cover” lots of different tools.

(N.B. I am not suggesting that everyone who engaged in the “web evaluation” discussion just “covers” it and doesn’t teach it.  Nor am I suggesting that the people who worry about covering what the faculty want them to cover are only interested in coverage.  I do think, though, that the pressure to “cover” is as true for us as it is for people in the disciplines, and these discussions spark reminders of that.)

But if we want students to think about research as a process, if we want research to BE a learning process, then we have to engage in teaching the process.  And that’s extra hard for us – we can’t do that in the one-shot by ourselves.  And we can’t do it if we’re worried about coverage — about covering everything the library has to offer.  And I’m not just saying that about “we can’t teach everything about the library in a one-shot” — I think we all know that.  I think I am saying that it can’t be about that at all – that the point has to be about the process, about authenticity, about this -

I now understand that whatever research strategies students use in their day-to-day lives, which no doubt will vary depending on who the learners are, must be investigated and taken into account by their teacher.  Neither this goal nor the goal of improving these strategies can be attained unless students have time to engage in research while they are in the classroom.  And inviting students to the computer lab and remaining attentive to their interaction with online sources is as important as accompanying students to the library. (490)

And maybe this means not worrying about teaching research as a recursive learning process in the one-shot.  Maybe this means rethinking what and where we teach and maybe it’s work with faculty that gets at that overarching goal.  I don’t know.  I do know, though, that I have some great ideas for rethinking my credit class next term.

Classroom activities to promote critical literacy for research:

1. A (relatively innocuous) vandalism example demonstrated in class.  He didn’t change the content of pages, just the accompanying photo to illustrate the process of editing.

2. Students work in pairs to evaluate a Wikipedia article on a topic they know a lot about (for example, one student used the article about her former high school). Through this exercise he was able to teach about:  skepticism & its place in the research process, identifying controversial claims in a text, citations and footnotes, and verifying claims by checking outside sources.

3. Judging a book by its first sentence. He brought in 5 history textbooks, showed the covers and provided the first sentence.  Then he asked students to describe what they could figure out about the book from that first sentence.  With this exercise he was able to teach: authorial bias or point of view; finding the author’s voice.

4. Research beyond the first sentence.  When they tried to apply these critical skills to the texts they found in their research projects, though, the students still had trouble because they didn’t know enough about the topics they were researching.  So he looked for a way through this problem. Enter Wikipedia.  He provided a list of pages identified by Wikipedia editors as biased or lacking a neutral point of view, and asked the students to choose an article on a somewhat familiar topic and write a brief essay, with specific references to the text, suggesting improvements that would bring the piece up to Wikipedia’s neutrality standard.

5. Contributing as an author.  Similar to other projects like this, it was one option for his students as a final project.  Interesting in that he collaboratively developed the assignment and rubric with interested students.

Zotero assignment update

So the first mini-deadline on the Zotero assignment has come and gone, and I’m pretty happy with the results so far.  They’re not very impressive to look at, but when you compare what is actually happening with what I thought could happen, I think we are well on our way to getting this done.

For the first section, which has 21 students:

  • 11 successfully added a scholarly source to their Zotero library AND successfully synced to the group library.  Another one got the sync to work, but what got saved isn’t in very good shape yet.  Three more are waiting on ILL to decide which article they want to save to the bibliography.
  • Of those 11, 6 have added an original annotation and tags.

There are a few who added something in another format (and I’m not sure if that is a result of still not knowing how to find a scholarly article for their person, or if it is a matter of the best sources authored by their person not being scholarly articles).  I’ll find out more about that in class this week.

In the second, which has 24 students registered:

  • 13 successfully added a source to their Zotero library AND successfully synced.  Another one did the sync okay, but what got added was wonky.  There is one person who has added two things.  There is also an example article that I added still in there.
  • And there is a weird article from the medical literature that is still mysterious.  The author doesn’t share a last name with one of our target authors, so I am thinking maybe it was left in one of the classroom computers’ Zotero libraries and accidentally got dragged into our group library?
  • Nine have added original annotations.
  • Another handful are waiting on their articles from ILL.

Most of these have wonky notes/attachments from the databases, and some need some of their metadata cleaned up.  Batting .500+, though, was more than I expected at this point.  Why?  A few reasons, actually -

  1. First, these students have never used Zotero before at all.  Most of them have never used any kind of Firefox plugin.  That whole process of downloading and installing Firefox, then the plugin, was conceptually something new.  I expected this to be a hurdle in and of itself, before we even got to the group library and syncing piece of the puzzle.  And it was, for sure, for some.  But not for most – most got themselves set up with Firefox no problem, and got the plugin working just fine.
  2. I want to be really clear here – it’s not that I thought these students weren’t intelligent enough to do this, nor did I think it was really hard – I just thought it was going to be new, and made more difficult by the fact that I asked them to do most of this new thing on their own, on their own computers.  I did this mostly because I wasn’t at all certain that syncing the classroom computers to the Zotero group library would work with any kind of reliability.  So it comes down to this – I thought that showing them in class and then asking them to do the work at home was not necessarily setting them up for success (for all that that is how homework usually works).
  3. I really didn’t give them much instruction on how to do this at all.  We went over Zotero on the first day of class, and then I asked them to test different features of it along the way.  But here’s the thing – most of them didn’t do that along-the-way stuff, because I wasn’t grading it and it wasn’t on the syllabus.  It was mostly a “please do this for your own good” thing and wasn’t at the top of anybody’s priority list.  So that .500+ batting average comes from students figuring stuff out with the tutorial I provided and what they could find in their notes and on the Zotero website.
  4. Some of the problems that have happened are undoubtedly not about Zotero at all, but are about navigating library systems and databases and the difficulties that come up during the process of finding scholarly articles — those are the primary reason for this class, after all!
  5. The syncing with the classroom computers is working really well – or at least it has for the last two sessions.  I have to tell you that I was worried about this with good reason.  Every time I have attempted to show this in the classroom, the sync has churned and churned and churned without any end (or any sync) in sight.  So when the students were having no trouble syncing the Zotero libraries in the classroom to their group accounts in class two weeks ago and again last week, I was shocked.  But what this means is that this week we can treat the classroom like a lab and troubleshoot most of the remaining problems together.

Onward!

Citations 101

This term our first-year seminar/orientation classes (called U-Engage) have given me the opportunity to do some different things, teaching-wise.  One of the sections asked for resources for a “Citations 101” unit.  This is what I’ve put together so far.

Does this work because it has a workable focus, and because it treats citing as something that has value, instead of something to do to avoid getting in trouble (or because no one respects the ideas of college students, which is a message that I think some students take away from lessons about the rhetorical uses of outside sources)?  I do think the rhetorical uses are crucial, but they were beyond the scope here – and I think they would have taken the focus beyond “workable.”

ALS 199: Citations 101

(Built with our Library a la Carte tutorials extension.)

cream colored ponies and crisp apple strudel

Another post that is essentially no more than bullet points — I have a lot of formal writing I have to be doing right now, so this will end at some point.  So, cool stuff…

via Dave Munger (twitter) Alyssa Milano pushing peer-reviewed research — see, it is relevant after you leave school!

via A Collage of Citations (blog).  Former OSU grad student/ writing instructor turned Penn State PhD candidate Michael Faris’ First-Year Composition assignment using archival sources to spark inquiry and curiosity.  Note especially the research-as-learning-process focus of the learning goals.

via Erin Ellis (facebook), and then via a bunch of other people — proof that, in the age of social media, an awesome title can boost your impact factor.  But the content stands on its own as well – I’ve been thinking a lot about different information seeking styles, and how different people gravitate naturally towards different approaches.  By Karen Janke and Emily Dill: “New shit has come to light”: Information seeking behavior in The Big Lebowski

via @0rb (twitter) Journalism warning labels

via Cool Tools (blog) Longform to Instapaper.  Longform by itself is pretty cool: it aggregates some of the best long-form (mostly magazine) writing on all kinds of topics.  But what makes it really cool is that it integrates seamlessly with Instapaper, meaning that I can find something there, push a button, and have it available on my iPad to read offline the next time I am stuck somewhere boring.

Related – Cool Tools’ post on the best magazine articles ever.

via Cliopatria (blog).  Obligatory history-related resource — London Lives: 1690-1800.  Pulling together documents from 8 archives & 15 datasets, this online archive asks “What was it like to live in the world’s first million person city?”