Assessment Work

This is a brief, informal description of the Institute for Writing and Rhetoric's assessment initiatives in recent years.

General Notes

All of our analyses are grounded in “knowledge transfer” research, a growing field within Writing Studies. This research suggests that writing knowledge gained in one context is not necessarily re-used, adapted, or transformed in another unless certain ways of teaching and learning are in place. Each of our projects seeks to better understand how students’ writing knowledge evolves and how that evolution is apparent in their learning, so that we can select the most effective teaching strategies and create the most productive learning contexts.

The work described here has developed over the past four years. Next steps will include re-studying student work after curricular changes to determine how well students are doing, and coordinating the results of these different initiatives into one coherent picture. That will help us identify gaps in our understanding of students’ learning in and beyond the first year. We see the work to date as one piece of a broader move toward a systematic, institution-wide approach to assessing students’ writing knowledge.

The Institute for Writing and Rhetoric Assessment Project

In 2009, Dartmouth's Institute for Writing and Rhetoric received a $200,000 grant from the Davis Educational Foundation to improve the effectiveness of our first-year writing programs:  Writing 2-3, Writing 5, the First-Year Seminars, and Humanities 1-2.

Supported by Davis funds, the Institute launched a three-pronged assessment project motivated by two important and related questions: How do students transfer knowledge about writing from course to course and task to task?  And how does composing with new technologies improve the transfer of more traditional writing abilities?

THE PROJECT

PART ONE: READING AND CODING STUDENT WRITING

The centerpiece of the assessment project is a close examination of first-year student writing.  To initiate this examination, the Institute collected first and final papers from students in every first-year writing class for three years.  From these papers, a random representative sample was drawn, rendered anonymous by an independent observer, and then coded by three groups of readers comprising nine faculty members and two graduate students.  Papers were coded to address particular questions that had evolved from the faculty's year-long efforts to define the first-year writing courses' learning outcomes.  The questions explored are as follows.

  • One group read entire papers and considered questions including: Does the paper have a guiding claim?  What kinds of evidence does it offer?  What are the strategies for introductions and conclusions?
  • A second group read two-paragraph samples from the same papers, asking: Does the paragraph have a controlling claim?  What kinds of evidence does the writer use?  What sorts of transitions?
  • A third group examined the sources that students cite in order to determine how these sources are being represented.  Are students quoting?  Paraphrasing?  Summarizing?  Patchwriting?

We expect the data we collect to allow us to identify patterns in first-year student writing across four student groups: those who take only an FYS, those who take WRIT 2-3 and an FYS, those who take Fall WRIT 5 and a Winter FYS, and those who take Winter WRIT 5 and a Spring FYS.  We also intend to study individual student writers to observe how knowledge about writing transfers (or doesn't) from course to course.

 

PART TWO: LINKED COURSES

The second component of our assessment project was designed to offer another, close-up look at knowledge transfer, informed by the assessment work described above.  Here, WRIT 5 and First-Year Seminar instructors teamed up (1) to track knowledge transfer from course to course and (2) to consider which teaching methods best facilitate that transfer.

 

PART THREE: MULTIMODAL COMPOSITION

The third component of our assessment project convened a group of faculty to examine how multimodal arguments are read and composed in first-year writing classrooms.  In particular, the group paid attention to how composing with media might improve the transfer of writing capabilities that students use regularly in more traditional writing tasks.  Faculty familiar with using multimodal assignments in their teaching designed professional development opportunities for colleagues interested in incorporating multimodal assignments into their courses.

 

ASSESSMENT AND FACULTY DEVELOPMENT

The Institute's Davis Assessment project was designed so that every step in the process doubles as faculty development.  Foundationally, each component of the research was supported by faculty who reviewed and summarized current scholarship in relevant fields.  This research not only informed how we framed assessment and interpreted results; it also increased faculty expertise on matters such as knowledge transfer and multimodal composition.

Moreover, faculty involved in the actual assessment of student writing—reading and coding student papers—reported the benefit of seeing student writing in new ways.  In this way, their assessment work came to inform their teaching.  And finally, the assessment project offered many opportunities for writing instructors to talk about their shared enterprise:  a strong first-year writing education for Dartmouth students.  

Initiatives within the Institute

Davis Study of Student Writing, 2010-2013 (CPHS approved)

Purpose: To ascertain change in student writing over time, across disciplines, and across course types via analysis of student writing samples for evidence of course outcomes.

Approach: We collected a stratified random sample of first-year students’ first and last essays across four types of first-year experience [1] (First-year Seminar (FYS) only; Writing 2-3 and FYS; Fall Writing 5 to Winter FYS; Winter Writing 5 to Spring FYS) for three academic years. A set of 50 students from each group was studied, for a total of approximately 700 essays a year. We scored these essays, using a coding scheme, against a set of features drawn from the learning outcomes statements for the courses. We compared results across types of experience, and this year we have begun to analyze 25 case studies of individual students across courses. The preliminary results of this work have informed faculty discussions and faculty development sessions targeting specific issues raised in the study. The analyses are ongoing, and we have begun faculty meetings to determine how this initial study sets the stage for ongoing program assessment of student progress in the first year.

[1] Dartmouth offers different types of first-year writing experiences. At the time the Davis study was conceived, we offered Writing 2-3 (a two-term developmental course), Writing 5 in fall or winter term, and the First-year Seminar. Students could be exempted from Writing 2-3/5. Currently, exemption is not possible; students take at least two terms of writing and can fulfill the writing sequence by taking Humanities 1 and 2.
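For readers who want a concrete picture of the bookkeeping, the sketch below shows one way the sampling and coding described above could be organized. It is an illustrative Python sketch only, not the Institute's actual instrument: the group labels, field names, and feature list are invented for the example.

```python
import random
from collections import defaultdict

# Hypothetical sketch only: the Institute's actual sampling and coding tools are
# not described in this report. The group labels, field names, and feature list
# below are invented for illustration.

GROUPS = [
    "FYS only",
    "Writing 2-3 and FYS",
    "Fall Writing 5 to Winter FYS",
    "Winter Writing 5 to Spring FYS",
]
STUDENTS_PER_GROUP = 50

def stratified_sample(students, seed=0):
    """Draw 50 students from each type of first-year writing experience."""
    random.seed(seed)
    by_group = defaultdict(list)
    for student in students:
        by_group[student["group"]].append(student)
    sample = []
    for group in GROUPS:
        sample.extend(random.sample(by_group[group], STUDENTS_PER_GROUP))
    return sample

# Features drawn, loosely, from the course learning outcomes statements; each
# anonymized essay receives one code per feature from its reading group.
FEATURES = ["guiding_claim", "evidence_type", "introduction_strategy", "conclusion_strategy"]

def score_essay(essay_id, reader_codes):
    """Record one code per feature for a single first or last essay."""
    return {"essay_id": essay_id, **{f: reader_codes.get(f) for f in FEATURES}}
```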

Portfolio Project, ongoing (CPHS approved)

Purpose: To identify, via both student self-report and student work, the degree to which students are consciously experiencing coherence in their first-year writing courses, are successfully adapting knowledge from one context into the next, and are developing metacognitive writing knowledge.

Approach: We invite up to 70 students annually to participate in a portfolio project. They turn in all work from their first-year writing courses, identify their best work, complete a self-reflection at the end of each course, and meet with their FYS instructor at the start of the second term to share the essay they tagged as their best work from the first term. Students load their work into e-portfolios on the Bedford St. Martin’s platform. So far, we have studied the students’ self-reflections to describe what they consider to be “good writing,” what they believe they should re-use when writing in future contexts, how they describe “growth” as writers, and what affect accompanies their writing.

The preliminary results of this work have informed faculty discussions and faculty development sessions targeting specific issues raised in the study. The analyses are ongoing, and we are embarking on a second year of portfolio collection and analysis. This year, we will survey faculty as well. The project already has an embedded faculty development purpose: First-year Seminar faculty members meet with students to discuss the best work those students bring from Writing 2-3 or 5.

The plan is to move to full portfolio implementation for all first-year students by 2015-16. The full portfolio approach will allow systematic full-scale annual assessment.

NSSE Questions Entrance/Exit Survey, 2010-2013 (CPHS approved)

Purpose: To gain insight into students’ writing work and experiences across all courses in a year (not only writing courses) and to compare Dartmouth results to national results; to develop additional insights into the writing activities of the students in the Davis study described above.

Approach: For three years, we administered to the first-year class the National Survey of Student Engagement’s set of 27 questions specifically focused on students’ writing and reading practices at the university. The response rate was 50%. This past spring, we administered the same set of questions to the graduating class, which had taken the survey as incoming students. We are currently asking the following questions of the data: What kinds of writing and writing activities do students do in their first and final years? How do these compare with results from the hundreds of other institutions administering the same survey? How do student responses compare across disciplines? The survey results provide benchmarking data.

We are just beginning to process this data and will use it for discussion not only with writing faculty but also with faculty across the College.
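As a simple illustration of the kind of comparison we intend to make, the sketch below contrasts hypothetical local item averages with a benchmark. The item names and numbers are invented; NSSE supplies the actual comparison data.

```python
# Hypothetical sketch: comparing local NSSE writing-item averages with a
# benchmark. The item names and values are invented, not actual survey results.

local_first_year = {"pages_of_writing": 3.1, "drafts_before_submitting": 2.4}
local_senior_year = {"pages_of_writing": 3.6, "drafts_before_submitting": 2.1}
national_first_year = {"pages_of_writing": 2.9, "drafts_before_submitting": 2.2}

def compare(local, benchmark):
    """Report the local-minus-benchmark difference for each shared survey item."""
    return {item: round(local[item] - benchmark[item], 2)
            for item in local if item in benchmark}

print(compare(local_first_year, national_first_year))
# {'pages_of_writing': 0.2, 'drafts_before_submitting': 0.2}
print(compare(local_senior_year, local_first_year))
# {'pages_of_writing': 0.5, 'drafts_before_submitting': -0.3}
```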

The Citation Project (CPHS approved)

Purpose: To ascertain students’ existing strategies in source use and citation via systematic, in-depth analysis; to compare our students’ strategies to those of students at other institutions.

Approach: We joined the national Citation Project, initiated at Syracuse University. The project’s PI asked each participating university to work with ten first-year student papers, using a method designed to analyze in fine detail how students use sources in their writing; we chose to study thirty. Study readers locate and read all of the sources and references in a student’s essay. Readers then identify citations, quotations, paraphrases, summaries, and “patchwriting” in the students’ work. The Citation Project sent us the results from the study of our students, along with comparison data from 17 other schools, which we can use as benchmarking data.
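To give a concrete sense of the tallying step, the sketch below shows one way the per-paper counts could be kept. It is illustrative only: the Citation Project defines its own coding instrument, and the example records here are invented.

```python
from collections import Counter

# Hypothetical sketch of the tallying step. The coded records below are
# invented; the Citation Project defines its own coding instrument.

CODES = ["citation", "quotation", "paraphrase", "summary", "patchwriting"]

def tally_paper(coded_uses):
    """Count how often each source-use strategy appears in one student paper."""
    counts = Counter(use for use in coded_uses if use in CODES)
    return {code: counts.get(code, 0) for code in CODES}

def tally_corpus(papers):
    """Aggregate counts across all of the papers in the study for benchmarking."""
    totals = Counter()
    for coded_uses in papers:
        totals.update(tally_paper(coded_uses))
    return dict(totals)

print(tally_paper(["quotation", "quotation", "patchwriting", "summary"]))
# {'citation': 0, 'quotation': 2, 'paraphrase': 0, 'summary': 1, 'patchwriting': 1}
```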

This work informed a series of workshops offered to writing and FYS faculty at Dartmouth; we asked faculty to do similar analyses and to discuss how the results affect teaching source use and citation.

We are developing a follow-up study to determine the effect of this work on students’ source use. We have also collected, but not yet analyzed, junior-year papers from the same students who participated in the Citation Project in their first year. These students come from several disciplines.

Multimodal Curricular Pilots with Assessment

Purpose: To assess the impact of multimodal composition activity on students’ learning in first-year writing courses.

Approach: The Institute has piloted, for three years, the integration of “multimodal” assignments into first-year writing courses. These assignments can include PowerPoint presentations, films, visuals embedded in texts, and so on, and are intended to offer additional powerful ways to achieve course outcomes. Twenty-six faculty from Writing 2-3, Writing 5, and FYS joined this pilot. They identified the specific course outcomes their assignment was designed to achieve and provided assessment results for those outcomes, sometimes in the form of student self-reports and sometimes in the form of evaluations of student activity or work for identifiable demonstration of the targeted outcomes. All 26 pilot multimodal assignments produced positive results, both in increased student engagement and in increased achievement of the intended learning outcomes. Every faculty member who piloted a multimodal assignment has continued to use it.
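One way to picture the pilot's record-keeping is the sketch below. The assignment names, outcome labels, and results are invented for illustration; the actual evidence took the forms described above (student self-report or evaluation of student work).

```python
# Hypothetical record-keeping sketch for the multimodal pilot. The assignment
# names, outcome labels, and results are invented for illustration.

pilot_reports = [
    {"assignment": "audio essay", "outcomes_met": {"guiding_claim": True, "use_of_evidence": True}},
    {"assignment": "video argument", "outcomes_met": {"guiding_claim": True, "audience_awareness": True}},
]

def summarize(reports):
    """Count, for each targeted outcome, how many assignments showed evidence of it."""
    counts = {}
    for report in reports:
        for outcome, met in report["outcomes_met"].items():
            counts.setdefault(outcome, 0)
            if met:
                counts[outcome] += 1
    return counts

print(summarize(pilot_reports))
# {'guiding_claim': 2, 'use_of_evidence': 1, 'audience_awareness': 1}
```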

As a program, we must now ask ourselves to what degree we should move from recommending to requiring that faculty assign this kind of work in their first-year courses.

 

Initiatives with Other Departments and Programs

International Graduate Students’ Needs Assessment, ongoing (CPHS approved)

Purpose: To ascertain the actual needs of international multilingual graduate students for language and writing support; to inform new initiatives.

Approach: In collaboration with Graduate Studies, we have undertaken a needs assessment of Dartmouth’s international multilingual graduate students and post-docs. The study’s PI, Michelle Cox, surveyed graduate students, post-docs, and faculty; she has also interviewed key faculty and staff who work with international multilingual students.

The needs assessment data is currently being analyzed. Michelle Cox will make a proposal to Graduate Studies in 2014 to develop specific resources and curricular options for Dartmouth’s international multilingual graduate students, as well as for the faculty working with them. After these new initiatives are in place, we will re-assess students’ work, progress, and experiences.

 

Dartmouth Summer Seminar for Writing Research (ongoing)

Every summer we offer a two-week, tuition-driven intensive seminar here at Dartmouth for faculty from around the country and around the world. The Summer Seminar’s purpose is to introduce faculty to the questions, data collection methods, and data analysis approaches used in empirical research about writing, learning to write, and teaching writing. We are gaining a strong reputation for creating research capacity in the field of writing studies. While the Seminar is not designed to teach assessment strategies and methods, it is decidedly affecting faculty understanding of learning outcomes, empirical methods, and evidence-based curricular design.

On Analytics and Other Initiatives

On analytics: We see a great opportunity to use analytics in addition to our current approaches for assessment and research. But for studying the complex activity that is learning to write and to adapt writing knowledge in a variety of contexts, analytics can only be one relatively limited part of our strategy.


On other initiatives: We are working on additional initiatives designed to map the teaching and learning of writing across the College, using interviews, focus groups, and surveys. A group of 14 faculty from across the Divisions has proposed implementation of a sophomore portfolio for assessing students’ general education work and readiness for writing in their majors. We look forward to seeing this proposal take more detailed shape this year.