
Communications of the ACM

Interaction design and children

Ubi-Learning Integrates Indoor and Outdoor Experiences


Ubiquitous computing and mobile technologies provide much scope for designing innovative learning experiences that can take place in a variety of outdoor (for example, parks, city centers, woodlands) and indoor settings (for example, museums, learning centers, labs, home). While learning activities already occur in these contexts, pervasive technologies can help integrate them. Outdoor field trips and computer-based indoor learning activities are typically performed separately; for example, children may go on a field trip and observe and collect data that, on another occasion, they will input into a software simulation package back in the classroom. This separation of interlinked activities can make it difficult for children to see and understand the connections between what are essentially the same representations and processes being studied, albeit in different contexts.

Our research—a core strand of the U.K. Equator Interdisciplinary Research Collaboration—seeks to bridge this gap, enabling children to broaden and connect their understandings, reflections, and hypotheses in both real-world and classroom settings. Our approach is to design and build pervasive environments (WiFi and sensor-based technologies) and connect them with a variety of mobile and standalone computational devices that provide digital augmentation in novel ways. A particular aim is to encourage children to carry out scientific inquiry in the context of discovering and exploring an environment, system, or process.

E-learning implies learning "anytime anywhere" [5]. To support this form of learning, researchers and teachers have begun experimenting with the use of handheld technologies [9]. For example, field trips have been augmented with PDAs (for example, [3, 4]) and software has been developed to provide individualized scaffolding for activities such as bird watching [2] and concept map building [6]. Mobile computers can become "ready-to-hand" tools; children can use them within the context of their learning, such as collecting data or accessing the Internet, whether in the field or eating lunch in the cafeteria [10]. Ubi-learning seeks to extend the use of these tools even further, by interconnecting them with other mobile devices and learning tools. The purpose of doing so is to provide a pervasive environment that enables children to more extensively reflect on, explain, and hypothesize about the physical world around them in relation to their formal learning experiences from the classroom. An example of such a configuration is the Ambient Wood project.



The Ambient Wood Project

As stressed by Ackermann [1], in order to learn from experience, it is necessary to step back from it and to reflect momentarily upon it before diving back into the experience. The Ambient Wood project was designed to enable children to intermittently switch between experiencing the physical world (for example, observing a butterfly drinking nectar from a thistle) and reflecting upon the ecological processes that lie behind this interdependency (for example, pollination). To this end, a learning experience was designed that encouraged children to explore and hypothesize about different habitats found in a woodland. In addition, a variety of mobile devices and visualization tools was provided for the children to access and share contextually relevant digital information (for example, animation of seasonal changes) when indoors and outdoors.

There are a number of ways that digital information can be presented or discovered in a physical environment using ubiquitous computing. Information can be displayed on handheld devices (PDAs) or presented sonically (via speakers). It can be requested or obtained, or it can be serendipitously pinged when a person is detected in the vicinity. A primary concern when determining how, when, and where to deliver such information is not to overload the children with digital content that distracts from their interactions with and explorations of the physical world. Maintaining a balance between the physical and the digital was central to the design of Ambient Wood.

An infrastructure was built that monitored the children's positions in the woodland, tracked any data they collected, and triggered location-based information [11]. The data was passed between devices and a central server using a WiFi local area network installed in the woods. The WiFi network was used to send images and sounds to the devices and to monitor the children's activities and their probe readings in real time. Short-range FM transmitters, called pingers, located throughout the woodland were used to broadcast to receivers carried by the children. The location pingers had a range of about 10 meters.
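
As a rough sketch of this kind of triggering logic (the project's actual server software is not documented here, and all names and content entries below are hypothetical), location-based delivery might look something like the following in Python:

    # Hypothetical sketch: pinger IDs reported by a child's receiver are looked
    # up and matching media is pushed to that child's device over the WiFi link.
    # All identifiers and content entries are illustrative, not the real system.
    from typing import Callable, Dict

    CONTENT_BY_PINGER: Dict[str, dict] = {
        "pinger-07": {"image": "thistle.png", "audio": "thistle_voiceover.wav"},
        "pinger-12": {"image": "woodlice.png", "audio": "woodlice_voiceover.wav"},
    }

    ACTIVITY_LOG: list = []  # simple record of which device triggered which pinger

    def on_pinger_detected(device_id: str, pinger_id: str,
                           send: Callable[[str, dict], None]) -> None:
        """Handle a 'device X is near pinger Y' event reported over the network."""
        ACTIVITY_LOG.append((device_id, pinger_id))
        content = CONTENT_BY_PINGER.get(pinger_id)
        if content is not None:
            send(device_id, content)  # deliver the image and voice-over to the PDA

    # Example usage with a stand-in delivery function:
    if __name__ == "__main__":
        on_pinger_detected("pda-03", "pinger-07",
                           send=lambda dev, c: print(f"send to {dev}: {c}"))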

Several handcrafted devices were built and combined with off-the-shelf devices to provide the different forms of digital augmentation. These were a PDA pinger, a probe tool, a periscope, wireless speakers, an ambient horn, and reflection tools. Not all were used at the same time; different combinations were experimented with in separate studies.

The PDA pinger was programmed to sporadically show an image of a plant or animal together with a voice-over about an aspect of its habitat. This happened whenever the children walked past a pinger hidden in a predetermined spot. The information displayed was intended to draw their attention to a section of the woods at pertinent times and to encourage them to think and reflect upon it.

The probe tool was designed to enable children to collect real-time measurements of light and moisture in the area (see Figure 1). Readings from the probes appeared on the PDA display as dynamic visualizations. These were intended to provoke the children into hypothesizing about what they meant with respect to their surroundings. The probe tool also transmitted and stored all the readings, together with the locations at which they were collected (determined using GPS), to enable the children to reflect on them later at a more abstract level.
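
A minimal sketch of how such readings might be modeled and stored is shown below; the field names and the simple in-memory store are assumptions for illustration, not the project's actual data format.

    # Illustrative data model for a probe reading: a light or moisture value
    # plus the GPS fix and time at which it was taken. Field names are assumed.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import List

    @dataclass
    class ProbeReading:
        pair_id: str        # which pair of children took the reading
        kind: str           # "light" or "moisture"
        value: float        # raw sensor value shown on the PDA
        lat: float          # GPS latitude where the probe was taken
        lon: float          # GPS longitude
        taken_at: datetime  # time of the reading

    STORED_READINGS: List[ProbeReading] = []

    def record_reading(reading: ProbeReading) -> None:
        """Keep the reading so it can be revisited later on the visualization display."""
        STORED_READINGS.append(reading)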

The periscope was designed as a standalone viewing tool to provide children with access to prerecorded videos about the habitat [12]. Here, the idea was to provide dynamically relevant information conveying seasonal changes and life cycles.

Wireless speakers were hidden in sections of the woods to provide a range of realistic sounds of animals in the habitat and abstract sounds that represented various plant processes. These included a chiff-chaff bird song, a butterfly sipping nectar, and photosynthesis. The different kinds of sounds were designed to make the children think about what they meant and their significance in that part of the habitat. The pinger technology was again used to deliver the sounds, triggering them to play in certain spots whenever the children walked past.

Another sound device was the ambient horn [8], a handheld device the children held to their ears to hear the sounds. These sounds were also triggered via the location pingers, according to the children's position, but playback was under the children's control.

The visualization tools were developed to enable students to reflect upon their outdoor discoveries in indoor settings (see Figure 2). These tools included an interactive visualization display that showed a bird's-eye view of the woods, overlaid with all of the children's collected probe readings (as dots that could be opened to reveal the data collected), and an interactive tangible board designed to allow the children to reconstruct what they had seen, collected, and heard at a higher level of abstraction, using various graphical representations embedded with RFID tags. Feedback was provided on an adjoining display whenever certain combinations of tagged tokens were detected by the board.
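
One plausible way to express the board's combination rules is sketched below; the tag identifiers, combinations, and feedback messages are invented for illustration and are not taken from the actual system.

    # Sketch of the tangible board's feedback rule: when a recognized combination
    # of RFID-tagged tokens is detected on the board, show a message on the
    # adjoining display. Tag IDs and messages are invented examples only.
    from typing import Dict, FrozenSet, Iterable, Optional

    FEEDBACK_RULES: Dict[FrozenSet[str], str] = {
        frozenset({"tag-butterfly", "tag-thistle"}):
            "The butterfly drinks nectar from the thistle and helps pollinate it.",
        frozenset({"tag-woodlice", "tag-leaf-litter"}):
            "Woodlice are often found in the leaf litter of this habitat.",
    }

    def feedback_for(tags_on_board: Iterable[str]) -> Optional[str]:
        """Return a feedback message if the detected tags match a known combination."""
        present = frozenset(tags_on_board)
        for combination, message in FEEDBACK_RULES.items():
            if combination <= present:  # every token of the combination is present
                return message
        return None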


How Successful Was Ambient Wood?

Two studies were carried out over a 12-month period to assess the children's learning, using different combinations of the pervasive technologies. In the first study, eight pairs of students, ages 11–12, took part, and in the second study 12 pairs of the same age participated. Initially, two pairs of children were each asked to discover as much as possible about a different part of the environment by looking, touching, smelling, and listening. They were then provided with the devices to uncover more. To facilitate reflection, the children were encouraged to talk with one another and with a remote facilitator, via walkie-talkies, reporting on what they had discovered, what its significance was, and what they planned to do next. The learning activity was deliberately designed to be open-ended rather than task-driven. This was to encourage the children to discover and observe different aspects of the habitat and to generate hypotheses about their relationships and interdependencies. An aim was to see what connections they made when presented with various sounds and images in particular parts of the woods in relation to what they were experiencing and anticipating in the environment.

There was much evidence of the children integrating the findings and information obtained from the devices with their own observations of the physical environment. The outcome was that they generated hypotheses and explained to one another their ideas about habitat relationships, distributions, and the underlying processes of the ecology. For example, one pair used the probe tool to generate hypotheses about why certain parts were drier (for example, leaves) than others (grass) and what the implications were for what was able to survive there. Another pair made inferences about the interrelationships between readings from the probe device (dry), sightings of organisms in the same location (woodlice), and information they received on the PDA about woodlice. Thus, information from the devices, linked to the environment, enabled the children to begin identifying these relationships, distributions, and processes for themselves.




The two pairs of children then came together to reflect on and share their explorations in a makeshift classroom housed in a tent, in another part of the woods. A follow-up session was subsequently held in a real classroom where all the pairs of students came together with their teacher and the facilitators to draw further inferences from their explorations.

The findings from the studies illustrated how children used the different devices and forms of digital augmentation to further their exploration [7]. In particular, the information presented on the periscope, ambient horn, and PDA pinger led the students to look for what they had seen or heard and also provoked discussions about what they discovered in relation to relevant ecological issues.

One of the most successful forms of digital augmentation was the combination of the probing tool and the interactive visualization display. The children were able to integrate their present understanding of the woods, derived from their individual probes of it, with their subsequent reflections on the patterns appearing in the bird's-eye visualization compiled from the different paired children's readings.

At ground level, children probed many different aspects of the terrain, taking turns to either probe or read the outcome on the PDA. On average, each pair took about 80 readings, of which half were for light and half for moisture. This frequency of probing suggested the collaborative activity was highly successful at provoking further exploration. After taking a reading, the children would suggest to each other a different place to go to confirm or refute their hypotheses about what the reading would be there. They also suggested where to take the most extreme readings, and again, this involved making and testing predictions about the environment. In addition, probing sometimes led to the discovery of new plants when the children were looking for places or organisms that would provide them with different readings. The spontaneous conversations that took place suggested this method of interacting with the environment provided the children with many opportunities to undertake scientific inquiry.

At bird's-eye level, the children were fascinated that every probe reading they had collected and recorded was now available as an interactive data point on the visualization display (a feature unknown to them earlier). They were readily able to abstract and explain to each other what the patterns of their personalized data points represented. By clicking on these points they were able to bring up the same readings they had seen before on their PDAs. This caused much interest and amusement, especially when they tried to find the data points where they had probed parts of their bodies. Being able to see each other's data in this highly engaging way enabled the children to develop an overall picture of the different distributions of moisture and light in the two areas and to make generalizations about why different types of organisms lived in each of the contrasting habitats and why they would not survive well in the other.
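
As an illustrative sketch only (the grid size, record layout, and function names are assumptions, not the project's implementation), readings could be grouped into clickable dots like this:

    # Rough sketch of how probe readings might be binned into dots on a
    # bird's-eye map: readings are grouped by a coarse grid cell, and selecting
    # a dot returns the readings recorded there.
    from collections import defaultdict
    from typing import Dict, List, Tuple

    Reading = Tuple[str, str, float, float, float]  # (pair_id, kind, value, lat, lon)
    CELL = 0.0001  # roughly 10 meters of latitude/longitude; an arbitrary choice

    def build_dots(readings: List[Reading]) -> Dict[Tuple[int, int], List[Reading]]:
        """Group readings into map 'dots' keyed by a coarse grid cell."""
        dots: Dict[Tuple[int, int], List[Reading]] = defaultdict(list)
        for r in readings:
            _, _, _, lat, lon = r
            dots[(int(lat / CELL), int(lon / CELL))].append(r)
        return dots

    def open_dot(dots: Dict[Tuple[int, int], List[Reading]],
                 cell: Tuple[int, int]) -> List[Reading]:
        """Return the readings behind a selected dot, as when a child clicks it."""
        return dots.get(cell, [])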


Conclusion

Our research has shown that learning experiences can be designed that broaden and connect children's understandings, reflections, and hypotheses across both real-world and classroom settings. More generally, we suggest ubi-learning experiences can be designed in a number of ways, namely:

  • Mobile devices can be connected to wireless networks to enable children to access, compare, and input information while in the field.
  • Information and data collected can be sent and commented on by others, who may be in different physical or virtual environments, enabling novel forms of collaborative problem solving to occur in real time over distance.
  • Contextually relevant digital information (for example, images, sounds, visualizations, questions) can be delivered through location and person sensing, via handheld devices at relevant times and situations, to focus particular kinds of learning activity in the field.
  • Novel viewing and tangible computational devices can be designed to enable information and live data to be presented, collated, and interacted with collaboratively indoors.
  • Interactive tabletops and large public displays can be situated in public settings, such as museums and libraries, showing personal data collected by community members over time and space, enabling the members to identify and track their own data relative to others.

To conclude, we propose that digital augmentation offers a promising way for enhancing the learning process, especially encouraging the dovetailing of exploring and reflecting when indoors and outdoors. Here, we have described how scientific experiments can be extended. Digital augmentation could equally be used to integrate learning in other contexts, such as the application of math during team games and understanding chronology while visiting various historical sites. In sum, ubi-learning experiences offer great potential for stretching children's minds [1].


References

1. Ackermann, E. Perspective-taking and object construction: Two keys to learning. Constructionism in Practice: Designing, Thinking and Learning in a Digital World. Y. Kafai and M. Resnick, Eds. Lawrence Erlbaum, Mahwah, NJ, 1996.

2. Chen, Y., Kao, T., and Sheu, J. A mobile learning system for scaffolding bird watching learning. J. Computer-Assisted Learning 19 (2003), 347–359.

3. Gay, G., Rieger, R., and Bennington, T. Using mobile computing to enhance field study. Carrying the Conversation Forward. N. Miyake, R. Hall, and T. Koschmann, Eds. Lawrence Erlbaum, Mahwah, NJ, 2001, 507–528.

4. Grant, W.C. Wireless Coyote: A computer-supported field trip. Comm. ACM 36, 2 (Feb 1993), 57–59.

5. LineZine (2000); www.linezine.com/elearning.htm

6. Luchini, K., Quintana, C., Krajcik, J., Farah, C., Nandihalli, N., Reese, K., Wieczorek, A., and Soloway, E. Scaffolding in the small: Designing educational support for concept mapping on handheld computers. In Proceedings of Human Factors in Computing (CHI 2002) Extended Abstracts. ACM Press, NY, 792–793.

7. Price, S., Rogers, Y., Stanton, D., and Smith, H. A new conceptual framework for CSCL: Supporting diverse forms of reflection through multiple interactions. In Proceedings of the International Conference on CSCL'03 (June 14-18, 2003, Bergen, Norway). Kluwer Academic Publishers, 513–522.

8. Randell, C., Price, S., Rogers, Y., Harris, E., and Fitzpatrick, G. The Ambient Horn: Designing a novel audio-based learning experience. Personal and Ubiquitous Comput. J. 8, 3 (2004), 144–161.

9. Roschelle, J. Unlocking the learning value of wireless mobile devices. J. Computer-Assisted Learning 19, 3 (2003), 260–272.

10. Soloway, E., Grant, W., Tinker, R., Roschelle, J., Resnick, M., Berg, R., and Eisenberg, M. Science in the palms of their hands. Comm. ACM 42, 8 (Aug. 1999), 21–26.

11. Weal, M., Michaelides, D., Thompson, M., and DeRoure, D. The Ambient Wood journals: Replaying the experience. In Proceedings of Hypertext and Hypermedia. ACM Press, NY, 2003, 20–27.

12. Wilde, D., Harris, E., Rogers, Y., and Randell, C. The periscope: Supporting a computer-enhanced field trip for children. Personal and Ubiquitous Comput. J. 7, 3–4 (2003), 227–233.


Authors

Yvonne Rogers ([email protected]) is a professor of Informatics and Information Science at Indiana University, Bloomington (formerly at Sussex University).

Sara Price ([email protected]) is a research fellow in the Interact Lab, Department of Informatics at Sussex University, U.K.

Cliff Randell ([email protected]) is a research associate in the Department of Computer Science, University of Bristol, U.K.

Danae Stanton Fraser ([email protected]) is a senior lecturer in the Department of Psychology at Bath University, U.K. (formerly at Nottingham University).

Mark Weal ([email protected]) is a senior research fellow in the School of Electronics and Computer Science at the University of Southampton, U.K.

Geraldine Fitzpatrick ([email protected]) is a senior lecturer in the Interact Lab, Department of Informatics at Sussex University, U.K.


Footnotes

This research was carried out as part of the EQUATOR IRC, funded through the UK's EPSRC. Contributing to this research were the universities of Bristol, Nottingham, RCA, Southampton, and Sussex.


Figures

Figure 1. The probe tool in action.

Figure 2. Interacting with the visualization tools.



©2005 ACM  0001-0782/05/0100  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.



 
