The Free-Choice Learning Lab employs a variety of observation tools and custom software for data analysis and storage. These tools can tell us a great deal about our visitors, from simple demographic information to how they interact with exhibits: what they say, look at, and activate. Coupled with surveys and inquiry, this information can help us design better exhibits, kiosks, and programs for informal science learning. These tools can also be used to ask a wide range of sociological, psychological, and linguistic questions.

Cyberlab Infrastructure

The FCL Lab’s network of unobtrusive video and audio observation equipment is accessible worldwide through a hosted video server. It allows researchers to watch live footage or review weeks to months of stored video data.
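
As a rough illustration of how a researcher might sample footage from such a server programmatically, the sketch below pulls frames from a stream with OpenCV. The stream URL, credentials, and camera name are hypothetical placeholders; the lab's actual video server product and access method are not described here.

# Minimal sketch: sampling frames from a hosted camera stream for later coding.
# The RTSP URL, credentials, and camera name below are hypothetical placeholders.
import cv2

STREAM_URL = "rtsp://user:password@video-server.example.org/touch-pool-cam-1"

def sample_frames(url, every_n=30, max_frames=100):
    """Grab every Nth frame from a live or archived stream."""
    cap = cv2.VideoCapture(url)
    frames = []
    count = 0
    while cap.isOpened() and len(frames) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        if count % every_n == 0:
            frames.append(frame)
        count += 1
    cap.release()
    return frames

if __name__ == "__main__":
    sampled = sample_frames(STREAM_URL)
    print(f"Collected {len(sampled)} frames for observation coding.")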

Eye Tracking

Advanced eye-tracking systems can capture what individual visitors attend to and how they interact with an exhibit. This allows for very fine-grained study and evaluation of user interfaces and interpretive content, especially when coupled with other observation tools.
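
To illustrate the kind of fine-grained analysis eye tracking supports, the sketch below maps raw gaze samples onto labeled areas of interest (AOIs) on an exhibit screen and totals dwell time per AOI. The AOI names, pixel coordinates, and 60 Hz sampling rate are illustrative assumptions, not a description of the lab's actual eye-tracking hardware or software.

# Minimal sketch: tallying gaze dwell time per area of interest (AOI).
# The AOI rectangles, labels, and 60 Hz sample rate are hypothetical examples.
from collections import defaultdict

AOIS = {
    "species_label": (0, 0, 400, 200),      # (x_min, y_min, x_max, y_max) in pixels
    "live_video":    (0, 200, 400, 800),
    "quiz_buttons":  (400, 600, 1024, 800),
}
SAMPLE_INTERVAL_S = 1.0 / 60.0  # assuming a 60 Hz eye tracker

def dwell_times(gaze_samples):
    """gaze_samples: iterable of (x, y) screen coordinates, one per sample."""
    totals = defaultdict(float)
    for x, y in gaze_samples:
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                totals[name] += SAMPLE_INTERVAL_S
                break
    return dict(totals)

# Example: three samples on the species label, one on the quiz buttons.
print(dwell_times([(50, 60), (55, 62), (60, 65), (700, 700)]))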

Face Recognition

The FCL Lab uses facial recognition to collect demographic, timing, and tracking data about visitors. Our system of face-detection cameras uses FaceVACS software to recognize facial features and build a profile for each visitor, creating a user record with demographic information such as gender and age along with the times and locations at which the visitor was detected.
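
The sketch below shows one plausible shape for the visitor record described above: demographic estimates plus a running log of detection times and camera locations. The field names and structure are illustrative assumptions, not the schema used by FaceVACS or the lab's database.

# Minimal sketch of a visitor record built from face-detection events.
# The Detection/VisitorRecord structure and field names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Detection:
    timestamp: datetime   # when the visitor's face was matched
    location: str         # e.g., camera or exhibit identifier

@dataclass
class VisitorRecord:
    visitor_id: str
    estimated_gender: str
    estimated_age: int
    detections: list = field(default_factory=list)

    def add_detection(self, timestamp, location):
        self.detections.append(Detection(timestamp, location))

    def dwell_locations(self):
        """Locations where this visitor was seen, in order of first detection."""
        seen = []
        for d in self.detections:
            if d.location not in seen:
                seen.append(d.location)
        return seen

# Example: a visitor detected at two exhibits over one visit.
record = VisitorRecord("v-0001", "female", 34)
record.add_detection(datetime(2014, 7, 12, 10, 5), "touch-pools")
record.add_detection(datetime(2014, 7, 12, 10, 22), "wave-tanks")
print(record.dwell_locations())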

Research Platforms

The hands-on exhibits in the Hatfield Marine Science Center Visitor Center have been designed based on years of research findings and best practices. Our new installations include embedded data collection and adaptive content to address a variety of questions about science learning. Many of our exhibits focus on real-time data and current research topics of interest to scientists, but they are also designed to communicate those topics to visitors. Exhibits emphasize marine science, particularly topics addressed by research labs elsewhere at HMSC. While our equipment can be set up to observe any of the Visitor Center's exhibits, a few that we consider question-rich environments are permanently equipped for advanced audio-visual evaluation and recording.

Touch Pools:

Four artificial indoor tide pools containing live Pacific Northwest tide pool fauna that visitors can touch serve as a convenient platform for audience observation. Trained docents are assigned to this area to provide information and oversight. Research topics that could be investigated at this platform include encounters with live animals, interactions with and the influence of docents, family learning interactions, and more.

Wave Tanks:

The wave tank exhibit is designed to promote interaction around build-and-test and tinkering activities related to real-world engineering challenges. The exhibit area features a wave energy tank, an erosion tank, and a two-flume Tsunami Wave Tank with a LEGO building challenge. The latter tank features an overhead camera for detailed observation of visitor interactions. Here researchers can study topics such as iterative design, gender differences in approaching engineering problems, social interactions among age groups, and other STEM-related research questions.

Vertical and Horizontal Multi-Touch Tables:

These exhibits are like oversized tablet computers and contain a wealth of information on a range of topics. Research questions here could pertain to the use of new interactive display technologies, family interactions, and other topics.

Umbrella Institutional Review Board Protocol

All of our infrastructure and approaches, including the use of all Cyberlab recording devices within the Visitor Center, are covered by our Oregon State University IRB permit. You will not need a separate IRB permit for work conducted on the premises.

Augmented Reality Sandbox

UC Davis' W.M. Keck Center for Active Visualization in the Earth Sciences, together with the UC Davis Tahoe Environmental Research Center, the Lawrence Hall of Science, and the ECHO Lake Aquarium and Science Center, developed an NSF-funded hands-on exhibit that combines a real sandbox with virtual topography and water, created using a closed loop of a Microsoft Kinect 3D camera, powerful simulation and visualization software, and a data projector.

The resulting augmented reality (AR) sandbox allows users to create topography models by shaping real sand, which is then augmented in real time by an elevation color map, topographic contour lines, and simulated water.
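
As a rough sketch of that closed loop, the code below turns a single Kinect-style depth frame (represented as a NumPy array) into an elevation color map with contour lines ready to project back onto the sand. The calibration constants and the synthetic depth frame are illustrative assumptions; the real exhibit runs the UC Davis simulation and visualization software rather than this simplified rendering step.

# Minimal sketch of the AR sandbox loop: depth frame -> colored elevation map + contours.
# The calibration constants and synthetic depth frame are illustrative assumptions.
import numpy as np
import matplotlib.pyplot as plt

def render_elevation(depth_mm, sensor_to_table_mm=1000.0):
    """Convert a Kinect-style depth frame (in mm) to sand height and render it."""
    height_mm = sensor_to_table_mm - depth_mm  # taller sand = shorter depth reading
    fig, ax = plt.subplots(figsize=(8, 6))
    ax.imshow(height_mm, cmap="terrain", origin="lower")               # elevation color map
    ax.contour(height_mm, levels=10, colors="black", linewidths=0.5)   # topographic contour lines
    ax.contour(height_mm, levels=[30.0], colors="blue", linewidths=2)  # crude "water line" at 30 mm
    ax.set_axis_off()
    fig.savefig("projection.png", bbox_inches="tight", pad_inches=0)   # image sent to the projector
    plt.close(fig)

if __name__ == "__main__":
    # Synthetic stand-in for one depth frame: a single sand mound in the middle of the box.
    y, x = np.mgrid[0:480, 0:640]
    mound = 120 * np.exp(-(((x - 320) ** 2 + (y - 240) ** 2) / (2 * 80.0 ** 2)))
    depth_frame = 1000.0 - mound  # sensor reads shorter distances over the mound
    render_elevation(depth_frame)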

Cyberlab faculty are using the AR sandbox as a research platform to answer questions about how the public reads and understands scientific visualizations.
