The DH & Assessment session Saturday afternoon started with the usual mini-rants about our regional accreditor and reductionist assessment and turned into an “oh, wow, here’s a tool for this” discussion. The core of the discussion revolved around the open-source <emma> assessment tool built for the University of Georgia’s first-year composition classes. (Links: the <emma> front-end, which will be frustrating because it’s just the sign-in for UGA students, and the website of the Calliope Initiative, the non-profit that continues development and handles the business end for other institutions.)
At the lunchtime Dork Shorts, Robin Wharton had demonstrated the gist of <emma>: students submit papers in Open Document Format. Then instructors and peers can comment on specific passages and code their comments by category (e.g., thesis development might be coded as green, something else as yellow, etc.). Students’ revisions are linked to their original documents, they declare when a revision is the final document, etc. So far, this looks like a useful, user-friendly way to comment on student work.
In the afternoon session, it became clear that <emma> was also being used for institutional assessment: it can aggregate the comments and the comment categories; a sample can be drawn for assessment by a set of readers, with disagreement on basic judgments between two readers kicked to a third reader or some other moderation process; and the system can support conclusions such as shifts in comment categories (i.e., student skill development) across a course or a longer span of time. In other words, institution-level judgments based on the day-to-day evaluative culture within composition instruction.
Those at the session had the obvious questions about the system (expensive? it was developed by one person in the English department who taught himself programming, along with two graduate students) and then we started talking about what would be necessary to develop parallel systems for performances (e.g., faculty-juried music performances at the end of the semester). So we gabbed a bit about Pear Note, Transana, and some other options. And then we discovered that as ODF documents, the base documents students submit can include media. Hmmn…
Bottom line for me: Huge thanks to Ron Balthazor and his team at UGA for showing how digital humanities can put assessment on a much less shaky footing.
#1 by Robin Wharton on March 6, 2011 - 2:08 pm
Just wanted to add a bit of clarification to Sherman’s post here. First, with regard to the building of emma, Ron Balthazor has done the lion’s share of coding, but the awesome technology folks at UGA help him keep it running. Development of emma, though, has been an ongoing ten-plus-year enterprise involving many graduate assistants and faculty who teach with emma every semester, the graduate assistants who help run the emma computer lab, and the students who use the program in their classes, all of whom provide feedback, critique, and suggestions. In addition, Christy Desmet, the director of FYC, and Nelson Hilton, the director of UGA’s CTL, have provided programmatic and institutional support, as well as collaborating on development. The English department has also demonstrated its commitment to the project in various ways.
In responding to the question, “Is it expensive?” as I did, I wanted to emphasize that, to build something like emma, you do not need dozens of programmers and software engineers, or hundreds of thousands of dollars of expensive equipment. I should have also emphasized, though, that you do need departmental and institutional commitment to the project.
Now some clarification regarding the nuts and bolts: emma supports documents in ODF (created and marked using OpenOffice) as well as documents created and marked using an in-browser WYSIWYG HTML editor. In terms of assessment, the multiple-rating system at UGA applies to the FYC portfolio and only to final, holistic ratings. Marking and grading of individual essays in a particular course, or of portfolios created for courses other than FYC, involves only individual instructors, as far as I know. Finally, with regard to data and assessment, emma is collecting all kinds of data, all the time, and those data can be used and interpreted in a variety of ways for a variety of purposes, including careful and expert assessment. Dr. Desmet is the best person to answer questions regarding exactly how emma data are and might be used in a productive way.
Thank you for this write-up of the assessment session. It was a blast!