You may recall my previous post, which introduced our two-part visitor study about the Connecting Cultures installation. Well, we are a little more than halfway through our study and we have learned some surprising things so far, none of which have to do with the installation. Rather, we have learned that some of our basic assumptions about running this study were dead wrong, and we have made some adjustments accordingly.
Assumption Number 1: Placing the survey table near the exit would encourage visitors to stop on their way out.
By the time visitors are heading to the exit, they have mentally checked out of the museum. Many won’t even make eye contact with you (maybe this is a New York thing?), so trying to get their attention as they are headed out the door does not work. I think many visitors see a table at the exit and assume we are trying to sell them something.
Adjustment: We moved the table from the exit area of the lobby to the entry area near the gift shop. This has actually made a big difference, especially to the staff manning the table who now have a line of sight to the admissions desk. The table feels like it is part of the museum experience and not an afterthought.
Assumption Number 2: A sign inviting people to participate in the survey will draw people to the table.
The sign is simply not enough. Some people read it, some people don’t. Some read it and then ask what the table is for. Some people simply assume it is not for them.
Adjustment: While the interview was always by personal invitation, the survey was not. Sure, we might smile and say hello, but we would still sit behind the table waiting for the visitors to come to us. By getting up from the table and actively telling people about the exhibition and inviting them to participate in the survey, there is no question in the visitor’s mind that they are welcome to participate. You simply can’t beat personal interaction.
Assumption Number 3: Visitors would be more willing to participate in the computer survey than the interview.
This one is perhaps the most surprising to me. I truly thought visitors would be less likely to spend time on an interview-based survey than on a computer-based survey. Somehow I thought the interview would be more daunting, but not for our visitors. We reached our target number of completed interviews in two weeks! My guess is that more than half the people asked to participate did. The same enthusiasm is simply not there for the computer-based survey. Some visitors will approach the survey table simply to tell us what they thought, but when invited to take the survey on the laptop, they decline. Simply put, they'd rather talk than type.
Adjustment: I reorganized the schedule to decrease the number of interview days and increase the number of survey days. Even after almost exclusively running the survey for several weeks, we still weren't getting the number of responses we needed, due in part to mistaken assumptions 1 and 2. Another adjustment was to increase awareness of the survey by having admissions staff tell people about it when they purchase a ticket and by adding a sign at the entrance of the exhibition. These are our most recent adjustments. My fingers are crossed that they work.
Although we are only halfway through our visitor study, I have already learned several lessons, the most important of which is that it pays to be flexible. I can't wait to see what the actual responses teach me.