I recently had the pleasure of participating in an online forum called Interactive Café, sponsored by the National Art Education Association (NAEA) Research Commission. For a week, I exchanged ideas virtually with my co-hosts, Olga Hubard of Teachers College, Columbia University; Michelle Grohe of the Isabella Stewart Gardner Museum; and Benjamin Tellie of the Charles E. Smith Jewish Day School in Rockville, Maryland, about assessing students’ responses to works of art. Olga began the forum by posing the provocative question, “What is worth assessing in students’ responses to works of art?” For me, the answer lies in another question: “For what purpose are you assessing students?” As a professional evaluator, I usually assess in order to help a museum understand the impact it has on the students it serves. As Olga noted, there are many possible outcomes or benefits for students when they look at and respond to works of art, and it is my job to help a museum articulate its unique intentions for students. Is the program designed to increase students’ critical thinking skills, curiosity, creativity, personal connections, or something else? Once I truly understand a museum’s intent, the work of developing the assessment can begin. In this post, I describe my work with one museum to illustrate intentionality in the process of developing a student assessment.
For the last eight months I have been working with the Katonah Museum of Art in Westchester County, New York, to assess the impact that one of its programs, ArteJuntos/ArtTogether, has on the bilingual preschool students it serves. The program is a partnership with a nearby preschool that serves immigrant families. Staff from the Museum visit the children (and their parents) at their school once a week to look at, talk about (through inquiry), and make art; the program also includes two visits to the Museum and parent training (an important part of the program that I have to leave out here for the sake of brevity). I feel honored to be working with such a unique program and with people who understand that quality assessment takes time. Fortunately, a full year of assessment (and other program activities) was generously funded by the National Endowment for the Arts. As mentioned previously, I began by asking the very basic question, “How is your program designed to affect students?” and we continued from there. To illustrate the intentional approach we took to developing the assessment, below I outline and explain the steps we have taken thus far.
- The Museum described its intent for students primarily around literacy, especially emergent literacy. Remember, these children are very young (on average 3 years old). Museum staff believed (and had witnessed) that through facilitated, inquiry-based discussions about works of art, students have a unique opportunity to use and develop rich, descriptive language. Furthermore, they had heard from Pre-K teachers that students who had participated in ArteJuntos in previous years seemed more verbal and better prepared for Pre-K. The staff was eager to find out what was happening.
- To hone and better understand the idea of literacy as it manifests in the context of ArteJuntos, we assembled a team of experts including Museum staff, teachers from the preschool, school administrators, and representatives from another community organization to talk about what literacy looks like for these young bilingual children and how the program affects their literacy. By the end of the day, I had a list of key indicators that would serve as evidence of students’ literacy in relation to looking at and talking about art.
- I refined and honed the list of student outcomes and indicators and drafted a rubric that would be used to assess students’ literacy. The draft rubric included relatively simple indicators such as “The child names shapes (triangles, circles) to describe the work of art” and “The child names colors to describe the work of art” as well as more complex indicators like, “The child names two objects that are similar and/or two that are different and accurately describes similarities or differences.” Museum staff, preschool teachers, and the principal reviewed the rubric and provided feedback.
- I developed a protocol for the assessment. In this protocol, each student sits one-on-one with a bilingual educator and a reproduction of a work of art (see the work of art by Carmen Lomas Garza below) and is asked a series of open-ended and closed-ended questions that closely mirror the kinds of questions they are asked in the program. For example, questions include: “What do you see in this picture?” “What can you tell me about that?” and “What colors can you find in this picture?”
- We tested the protocol by trying it out with some of the Museum staff’s children. As a result, we identified problem areas in the line of questioning and revised it as necessary.
- In early fall, before ArteJuntos began, we did the first set of one-on-one student assessments with 12 children who would participate in the program that year. These assessments serve as the pre-program assessment. We videotaped each assessment for later review.
- Museum staff, preschool teachers, and I watched the videos together, discussing the emerging literacy we saw in the children. As a result of that meeting, I revised the rubric. It remains focused on literacy, but now more closely aligns with what we saw happening among students.
We are about three-quarters of the way through the project. This spring, once ArteJuntos ends, we will do a second round of assessments with the same children; these will serve as the post-program assessments. At that point, we will score all of the videotaped assessments using the rubric, compare the pre- and post-program results, and draw some preliminary conclusions about the way ArteJuntos affects students’ literacy. Our sample is small, and we recognize the problems inherent in comparing preschool children from pre- to post-program given how rapidly they develop; nevertheless, given our intentional process of developing the assessment tool, we feel confident that we will capture an accurate measure of students’ responses to works of art in the context of this unique program, and we hope that the assessment can continue to be used in years to come.