
Introduction

For distance education to be effective and for learning to begin immediately, the mediating technology must be as invisible as possible. It must be available, usable and effective whenever and wherever it is required. Otherwise most of the student’s learning effort may be expended mastering the technology rather than the course content (Smulders, 2003), and the teacher-student relationship will be disrupted.

Over the past 150 years distance education has passed through many metamorphoses involving a wide range of delivery media, from the postal system to satellite transmission (Moore and Kearsley, 2012). Until recently, however, the book-based correspondence course has remained at its core, especially for home-based study. Now, though, there is a widespread trend towards internet-based delivery of distance education, with the personal computer replacing the book as the principal study medium.

While web-based distance education offers the promise of a more interactive, collaborative, integrated, and multi-dimensional learning environment, with almost instantaneous communication paths, its greater complexity and its reliance on a well-developed electrical and telecommunication infrastructure also have disadvantages compared with the much simpler and more accessible technology of the book and the postal system. It increases the digital divide between those who have access to and know how to use information technology and those who do not.

This article reports on the evaluation of a novel approach for improving the usability and accessibility of computer-based distance learning environments, drawing upon the researchers’ personal backgrounds as teachers and distance students, and as engineers in software development and in whiteware manufacture. The hypothesis is that on-line distance learning technology can be simplified and rendered “invisible” to the student if the design approach that has marked successful consumer appliances is applied. To test this hypothesis the authors developed and evaluated a virtual learning appliance for the distance study of second languages. This included reviewing the concept of invisibility as it applies to the design of interactive computer systems, and developing a framework for evaluating it based upon an extension of Nielsen’s usability heuristics. This research is part of the ongoing IMMEDIATE e-learning project at Massey University, the principal provider of university-level distance education in New Zealand.

Motivation – Reducing the Visibility of Digital Technologies

The authors’ interest in reducing the visibility of digital technologies began over a decade ago from observations that as computerisation has extended into more and more aspects of everyday life – from the factory floor to distance learning – many have found the new technologies intimidating and harder to use, and expressed a preference for the older non-digital methods. This section summarises the main ideas considered in the search for ways of addressing this problem in distance education and reducing the visibility of e-learning technology.

Alfred Bork, an early researcher of interactive computer-based education, points out that learning does not need the complexities of today's interfaces. For Bork, the incorporation of standard Windows and web browser interface features and functions, which have no connection with the learning activity, amounts to "visual garbage" which distracts the student from this task. "Nothing should be on the screen that is not directly relevant for learning the material at hand" (Bork, 2001, p64).

Almost all e-learning at the university level today is based upon some variation of the Learning Management System (LMS) model, in which study material is delivered live over the web from a central repository to a browser running atop the student's general-purpose desktop or laptop computer. Bork's comments highlight the downside of delivering learning content to students in this manner. Rather than providing a simpler alternative to the general-purpose computing environment, it adds another layer of complexity to it, which the student must learn to navigate. Moreover, because learning content is accessed across the internet, interactivity is more limited and wait times longer than if the content is stored locally, increasing the visibility of the technology.

An alternative to the general purpose approach is exemplified by successful domestic appliances like the automatic washing machine. The modern washing machine is as much an interactive computer system as the PC. The user sets up the machine through a control panel and display, and the washing process is controlled and monitored by inputs and outputs from the user and from system (machine) components. However, nobody thinks of this task as "computing"; they are "doing the laundry". The technology simplifies the task while the user retains control throughout the activity. Through a minimalist, reliable, fit-for-purpose design, the engineers have successfully rendered the empowering technology invisible to the user. It has become psychologically subsumed in the task itself. The question is whether this approach can be applied to more complex information processing activities.

Don Norman has been the leading advocate of applying the appliance approach to the design of information processors as a method for rendering the computer invisible to the user and breaking the complexity barrier of the PC. He distinguishes between two kinds of invisibility. The first kind is where the technology is embedded in products or hidden beneath the surface, like the household waste disposal system, which the user may be unaware of until something goes wrong. The second kind is “when the device fits the need so perfectly that I forget it’s a complex technical device.” (Bergman, 2000, pp13-14). The key to the information appliance is simplicity, for the tool to fit the task so well that it feels to the user like a natural part of the task (Norman, 1998, pp52-53).

Ubiquitous computing, in which small interlinked computing devices are embedded in the task environment itself, is another approach towards rendering technology invisible. It can be traced back to the work led by Mark Weiser at the Xerox Palo Alto Research Centre in California in the late 1980s (PARC, 2007). Weiser’s concept was that people would manipulate familiar objects to accomplish tasks and may not even be aware of the computation in the background, leaving them with the sense of having done the task themselves (Weiser, 1993). In this way it was not fundamentally a question of technology that makes the ubiquitous computer disappear, but of human psychology, of people ceasing to be aware of it (Weiser, 1991). He placed the user in the box seat: “If computers are everywhere they better stay out of the way, and that means designing them so that the people being shared by the computers remain serene and in control” (Weiser and Brown, 1998).

Weiser’s and Norman’s ideas are built upon the foundation of the pioneers of the Human Computer Interaction discipline in the 1970s like Ben Shneiderman who emphasised the “distinctions between human reason and computer power” and argued that computer scientists should study how the human mind works in order to better understand how to design more usable computer systems (Shneiderman, 1980, p.273).

Technological advances since Weiser's initial work two decades ago have seen the proliferation of small embedded computing devices, mobile phones, portable computers and internetworking technologies. A considerable body of research carried out under the umbrella of "ubiquitous computing" concerns the integration of laptops and handheld technologies into education for anytime, anyplace learning (Hill et al, 2000). This is also called mobile e-learning, or "m-learning". M-learning explores ways of using mobile computing devices like PDAs and cellular phones for supporting and/or delivering some elements of teaching and learning processes, especially with a view to drawing young people alienated from the traditional classroom back into learning (Attewell, 2004). Today the focus is increasingly on "smart phones", which integrate the functions of PDAs, mobile phones, cameras and other digital technologies into a single hand-held device. In developing countries in particular, mobile phones are by far the most accessible digital information technology.

However, by and large these technologies remain highly visible to the learner, in part because of their multi-purpose functionality as "shrunk down personal computers" (Jones and Marsden, 2006, p12), and in part because interactions with handheld devices are quite limited and difficult to accomplish. Waycott and Kukulska-Hulme (2003) from the Open University UK evaluated the efficacy of PDAs in distance learning. They concluded that while the mobility of PDAs offered some advantages as a support tool for note-taking and communications, their restricted interface made them unsuited to the delivery of the body of a course.

In summary, designing for invisibility is not primarily a technical issue of hiding or embedding the technology, but a psychological issue of having people feel that they have accomplished a task or activity for themselves. Consumer appliances provide some of the best examples of invisible technologies. This suggested that the most promising approach towards decreasing the visibility of technology in distance learning is that of the specialised appliance. Bork (2001, p64) extended the concept of an activity-oriented information appliance to education by proposing a simplified learning appliance, which he thought could be built more cheaply than a PC and would be especially useful in the developing countries.

The concept of an ultra-cheap learning appliance based on recycled hardware and open-source software has been successfully tested and evaluated as part of the IMMEDIATE learning project. This project demonstrated the viability of using recycled rather than purpose-built hardware for cheap learning appliances. It also showed that lean but useful educational software can perform well on "obsolete" technology without the need for high-end telecommunications and computers (Allan and Johnson, 2008).

The IMMEDIATE project has also explored the idea of a virtual learning appliance installed on top of a general-purpose personal computing platform to provide a simplified, specialised learning environment tailored to the student and the learning domain (Johnson et al, 2007). For the purposes of testing their hypothesis regarding invisibility in e-learning, the IMMEDIATE team decided to develop and evaluate a second version of this appliance, together with language teaching experts, to support the online study of second languages. This is discussed next.

The experiment – the IMMEDIATE learning appliance

The IMMEDIATE project had begun as a practical search for ways in which information technology could be applied to distance learning so as to reach beyond the “wired cities” to narrow the digital divide in education and to enhance all distance students’ learning experiences. From an extensive literature review eight guidelines for designing effective on-line systems for distance education were postulated. The most important of these were: to target the distance student as the primary user, to prioritise the student view as the front end of a learning system rather than the back end of a teaching system, and to use the system to extend a real university community of learning to distance students, rather than try to create a parallel virtual one (Johnson et al, 2007). IMMEDIATE v.1 was built to evaluate these guidelines.

In order to achieve the desired universality the approach was to design for a specific persona (Cooper, 2004, p124), an imaginary single user who represented a combination of the most challenging circumstances that the distance e-learning system would have to meet, including rural isolation, unreliable internet connection and limited computing experience.

To satisfy the requirements of this persona, the research team envisioned the learning appliance as a low bandwidth tool providing a simplified, specialised e-learning environment, which works with or without an Internet connection and combines the simplicity and user control of the household appliance with the navigational flexibility and visual cues of a book. Like a book it would define the framework in which learning is delivered while leaving the content to be provided by the teacher. But it would not be an electronic book. Computer functionality would enable the learning material to be delivered in a variety of forms – individual and collaborative, active and passive, formal and informal – analogous with the multidimensional learning of a university institution. And it would provide contextual learning support through a simple mechanism for querying a local knowledge base or contacting an online tutor when available.

While the IMMEDIATE researchers previously experimented with building dedicated Linux-based appliances (Allan and Johnson, 2008), the current prototype is implemented as an application that installs and boots itself on top of the Windows operating system on a specific computer. Once booted it disables and hides the Windows GUI and substitutes its own specialised learning interface, which seamlessly conflates operating system, browser and application functionality (Fig. 1).

Figure 1. IMMEDIATE virtual appliance disables and hides the Windows GUI and substitutes its own specialised learning interface.
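The appliance itself is a Windows application, but the basic interface-takeover idea illustrated in Figure 1 can be sketched in a few lines: a full-screen, always-on-top window covers the desktop so that only the learning interface remains visible. The sketch below is only a crude approximation under that assumption, not the IMMEDIATE implementation, and the labels are invented for illustration.

import tkinter as tk

# A crude approximation of an "appliance shell": a full-screen, always-on-top
# window hides the desktop so that only the learning interface is visible.
root = tk.Tk()
root.attributes("-fullscreen", True)   # cover the whole screen
root.attributes("-topmost", True)      # stay above other windows
root.configure(bg="white")

tk.Label(root, text="Unit 1.01 Introduction I",
         font=("Helvetica", 24), bg="white").pack(pady=40)
for mode in ("Reading practice", "Writing practice", "Group practice"):
    tk.Button(root, text=mode, width=30).pack(pady=10)

# A single, explicit way back to the ordinary Windows environment.
root.bind("<Escape>", lambda event: root.destroy())
root.mainloop()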

The IMMEDIATE student client can be installed on a USB "memory stick". Plugging the stick into an available computer will temporarily convert it into a specialised "learning appliance", requiring only a periodic internet connection to remain updated and fully functional. It uses a "Push-To-Talk" simplex approach to extend interactive conversation to low-speed internet connections (Ye and Johnson, 2008). Timely access to all required resources is assured by storing learning content on the learner's machine, with automatic updating in background threads via the internet or portable storage devices.
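A minimal sketch of this local-store-plus-background-update idea is given below. The file layout, manifest URL and field names are hypothetical, not the IMMEDIATE source code: the point is simply that the student always reads from the local store, while a background thread quietly refreshes it whenever a connection happens to be available.

import json
import threading
import time
import urllib.request
from pathlib import Path

CONTENT_DIR = Path("local_content")                      # hypothetical local content store
UPDATE_URL = "http://example.org/course/manifest.json"   # hypothetical update feed
CHECK_INTERVAL = 15 * 60                                 # try to update every 15 minutes


def read_topic(topic_id: str) -> str:
    """The student always reads from the local store, so there is no download wait."""
    return (CONTENT_DIR / f"{topic_id}.html").read_text(encoding="utf-8")


def refresh_content() -> None:
    """Fetch a manifest and download any topics not yet stored locally; fail quietly."""
    try:
        with urllib.request.urlopen(UPDATE_URL, timeout=10) as resp:
            manifest = json.load(resp)
        for topic in manifest["topics"]:                 # e.g. {"id": "1.01", "url": ...}
            target = CONTENT_DIR / f"{topic['id']}.html"
            if not target.exists():
                with urllib.request.urlopen(topic["url"], timeout=10) as item:
                    target.write_bytes(item.read())
    except (OSError, ValueError):
        pass  # offline or slow link: keep using the content already on disk


def start_background_updates() -> threading.Thread:
    def loop() -> None:
        while True:
            refresh_content()
            time.sleep(CHECK_INTERVAL)

    t = threading.Thread(target=loop, daemon=True)       # never blocks the learner's UI
    t.start()
    return t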

The design objective was to use the targeted simplicity of an appliance to reduce the visibility of the technology by providing an integrated learning environment with an intuitive interface that would be readily mastered by the student. The guiding principles were that nothing should be on the screen that is not directly relevant for learning the material at hand (Bork, 2001, p64) and that the appliance fits the learning activity so well that it feels to the user like a natural part of the task (Norman, 1998, pp52-53). Learning should be improved through removing the distractions provided by a general purpose environment, eliminating the need to wait for content to download associated with web-based systems, and minimising the effort required to understand and use the software.

To maximise simplicity, usability and teaching effectiveness, IMMEDIATE is built around a set of templates that support various teaching styles and study modes, enabling the learning environment to be tailored to a particular course and a particular student. Through a separate tool, the teacher defines the course structure, chooses the appropriate study modes to support the particular subject and their preferred teaching styles, and adds content. This tool also enables the teacher to communicate with the students and maintain contextual learning support linked to key words in each study topic, based on direct or indirect feedback from students. In the learning appliance, the distance student selects from the available study modes to support their preferred learning style, accesses the learning support provided and communicates directly with the teacher and other students.

IMMEDIATE's template approach differs from low-level "fill-in-the-blanks"-type templates for simplifying the authoring of learning objects or other content for an LMS (Microsoft, 2011; eLearning Brothers, 2011). It is primarily aimed at providing the teacher with a high-level guide as to what content to add, and where and how to add it, rather than directly assisting with the authoring and formatting of learning materials. As such it is closer to the pedagogical templates for the integration of technology into teaching and learning developed at the University of London (Jara and Mohamad, 2007). However, Jara and Mohamad's templates are technology-focused. In contrast, IMMEDIATE's templates are domain-focused. Developed in conjunction with subject experts, they model learning activities associated with successful correspondence education and face-to-face teaching in specific domains. The aim is to extend these proven teaching methods to the distance student rather than to implement any specific online pedagogy. The enabling technologies are embedded in the learning appliance and are activated when required by the student's choice of study mode.
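As a rough illustration of this template approach (a sketch only, with hypothetical names rather than the actual IMMEDIATE data model), a course can be represented as a set of study modes chosen and labelled by the teacher, where each mode prompts the teacher for the kind of content to supply rather than prescribing its format.

from dataclasses import dataclass, field
from typing import List


@dataclass
class StudyMode:
    name: str                       # label the teacher chooses, e.g. "Reading practice"
    prompts: List[str]              # high-level guidance on what content to add
    content: List[str] = field(default_factory=list)   # filled in by the teacher


@dataclass
class CourseTemplate:
    title: str
    modes: List[StudyMode] = field(default_factory=list)

    def add_mode(self, mode: StudyMode) -> None:
        self.modes.append(mode)


# A teacher sets up a course by choosing and naming the modes they want,
# then adding content against each prompt.
course = CourseTemplate(title="Academic English 1.01")
course.add_mode(StudyMode("Reading practice",
                          ["Add a long or short reading passage",
                           "Optionally add comprehension questions with answers"]))
course.add_mode(StudyMode("Vocabulary practice",
                          ["Add new words with meanings and examples of usage"]))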

Using IMMEDIATE for second language teaching

In demonstrating the original prototype to educationalists, one domain for which IMMEDIATE had been seen as particularly appropriate was computer-assisted second language teaching at any level, from Maori for beginners to academic English for those required to pass a test such as IELTS or TOEFL as a prerequisite for international study. For the second version, therefore, as well as improving the usability of the interface, it was important to see whether IMMEDIATE could support the pedagogical goals of second language teachers. As Otto and Pusack stated, "we are almost certainly moving towards new models of teaching that rely heavily on technology" (2009, p787). Levy believes that technology should support a modular approach to the teaching of language given the various strands involved, e.g. grammar and vocabulary, as this provides an "effective structure for representing the scope and range of technologies in use." (2009, p769). With regard to distance-based language teaching, White (2006) points out that it is not the delivery of materials per se that is important but supporting the interaction between the learner and the learning context. Low technology environments can have a role to play in this respect.

There may be problems, therefore, if the learners are provided with an unsuitable environment for second language learning. As Garrett (2009, p723) notes “Simply providing students with web links does not of itself constitute Computer-Assisted Language Learning.” Students need to be able to focus not only on dialogue and conversation but also, if they wish, on developing specific skills in speaking or reading the language.

IMMEDIATE 2.0 was developed in collaboration with teachers of Maori as a second language. It was then customised (IMMEDIATE 2.1) to handle the requirements of teachers of English as a second language. Joint research was undertaken with staff of a language school who taught English as a second language (to meet university entry requirements for overseas students). They explained their key goals, which related to the teaching of academic English. As a result of discussions based on the expertise of the teachers and the material provided to students, the required functionality was identified (Table 1). Templates were provided based on the relevant required components.

Learning objectives: makes the higher-level learning requirements available.

Reading practice: this study mode provides functionality for supporting both long and short reading practice. Comprehension questions with answers can optionally be included.

Writing practice: this study mode supports both short-answer and long-answer practice. Answers can optionally be included. It should be possible to handle exercises where students are expected to match a word with an appropriate meaning.

Group practice: this mode supports oral practice (speaking and listening) between pairs or group members.

Dictionary: this mode focuses on the introduction of new vocabulary, showing words, their meanings and examples of usage.

Table 1. Requirements for online learning of English for Speakers of Other Languages


Students’ dual roles as users and learners

It is important to make a distinction between designing for usability and designing for learning (Smulders, 2003). Table 2 shows the comparison between these dimensions of student interaction. There is a set of typical usability goals that a system should support (recognition of commands, error avoidance, ease of use, etc.). For students, though, it is also vital to meet their educational objectives. Some key issues relate to support for student problem solving, recall of material, trial by error, etc., which contrast with the usability goals which seek to minimise the cognitive load. The challenge in designing for online learning is that the requirements for usability and the requirements for learning must each be clearly delineated and evaluated. If we consider screen presentation, for example, the key from a learning perspective is the content, that it is appropriate to the learning objective and makes the student think. From a usability perspective, however, the key is the form in which the material is presented, that it is easily accessible and can be navigated intuitively to enable the student to concentrate on the content. Poor form distracts the student and inhibits learning. Poor content, regardless of how well it is presented, is no help to the student either.

Learner | Computer User
Content | Form
Recall | Recognition
Reflection | Intuition
Academic Rigour | Ease of use
Stop and think | Point and click
Soak it up | Skimming
Deep reading | Scanning
Problem solving | Problem avoidance
Critical thinking | Inquisitive browsing
Tough | Delicate
Trial by error | Avoiding errors
Figure it out | Make it obvious
Answers open to interpretation, discussion and feedback | Customer is always right
End product = end of course | End product = launch of course

Table 2. E-learning interaction design must accommodate the student's contrasting requirements as learner and computer user. From Smulders (2003).


Ultimately, to evaluate along both dimensions, the learning appliance needed to be tested in a realistic teaching situation. Before this could be done, however, it was necessary to follow an iterative software development methodology which integrates evaluation with each cycle. An evaluation of this type would focus on heuristic evaluation by people with expertise in HCI, but would also involve some evaluation by domain experts such as language teachers and second language students. For teachers, it is important to know that the appliance supports the styles in which they want to teach. For students, the issue is the extent to which they believe the appliance supports them as learners.

Heuristics for invisibility

It has been a general problem in e-learning that the design of the student interface has not been prioritised (Kruse, 2002; Murray, 1999; Bork, 2001). Kruse considered the interface between students and computers to be the "single most neglected topic in the field of e-learning" and a major reason for students expressing a preference for classroom-based over computer-based instruction (Kruse, 2002, para. 1). If the invisibility of information technology is primarily a question of the user remaining "serene and in control" (Weiser and Brown, 1998), then at its core is more effective, user-oriented interface design. The IMMEDIATE team considered that an essential element in developing the appliance to support the distance teaching of second languages was to conduct heuristic evaluation of the interface to find serious usability problems. This would permit evaluation of the information appliance as a ubiquitous tool where the goal is for the technology to recede into the background whilst the task is in the foreground.

Nielsen's well-known usability heuristics (Nielsen, 1994) (see numbers 1-9 in Table 3) provided the starting point. One of the issues the team faced was the need to adapt and extend Nielsen's general heuristics to evaluate invisibility in e-learning.

Heuristic

Description

Visibility of System Status

The system should always keep the user informed about what is going on, through appropriate feedback within a reasonable time.

Match between system and the real world

The system should speak the user’s language, with words, phrases and concepts familiar to the user, rather than system-oriented terms.

User control and freedom

The system should support a user driven approach. Users should be able to easily escape from places they unexpectedly find themselves in.

Consistency and standards

The ways of performing similar actions should be consistent throughout the system.

Recognition rather than recall

Make objects, actions and options visible. The user should not have to remember information from one part of the dialogue to another.

Error prevention

Even better than good error messages is a careful design which prevents a problem from occurring in the first place.

Flexibility and efficiency of use

The system should be accessible from anywhere, and should be portable and reusable for the user.

Aesthetic and minimalist design

The system should provide 'just-in-time, just-enough' functionality rather than 'just-in-case' functionality. The user should not be presented with any unnecessary or irrelevant information.

Help and documentation

Help information should be provided that can be easily searched and easily followed by the user. The system should provide context-specific help when it is requested.

Frequent updates

Updates should be available to the user on a regular basis.

Timeliness

The tasks required by the user should be able to be completed in the least amount of time, without time being wasted by technology.

Ease of use

The system should be easy to use for the target user group. The system should provide support for the user in their learning.

Unique to the online medium

The system should provide benefits to users that enhance their learning ability compared with what they could achieve with print-based learning material.

Focus

The system technology should ‘blend into the background’ and allow the users to focus solely on learning.

Familiarity

The user’s interactions with the system should enhance the quality of their work. The user should be treated with respect. The design should be aesthetically pleasing with artistic as well as functional value.

Awareness

The system should provide ease of communication for multiple users without colliding with the activities of others.

Effectiveness

The system needs to be able to support users and their learning requirements. Therefore, users should be able to easily and effectively carry out their tasks.

Trust/ethics/responsibility

The system should help the user to protect personal or private information.

Table 3. Invisibility heuristics (Kemp et al, 2008).


Given the importance of the on-line dimension of the information appliance, the first step was to add to Nielsen's general heuristics some of those he suggests for assessing commercial web sites: material often updated, minimal download time, ease of use and material unique to the online medium (Preece et al, 2002, p409). Evaluating the interface of ubiquitous systems, given their complexity, is a major undertaking. Scholtz and Consolvo (2004) proposed a framework and metrics for the user evaluation of ubiquitous computing. This enables a systematic approach to checking for usability and user acceptance of applications of this kind. Their framework was based not only on research into the interface evaluation of typical desktop applications but also took into account requirements for adaptive interfaces (Jameson, 2003), sensing systems (Bellotti et al., 2002) and ethical issues (Friedman et al., 2001).

The issues particularly relevant to the learning appliance, where it is important to lessen the barriers to learning, include: attention (focus), adoption (flexibility), trust (privacy), conceptual models (predictability of application behaviour and awareness of application capability), interaction (effectiveness, transparency) and invisibility (control, customisation). Sometimes these issues, e.g. flexibility, overlap with those already identified by Nielsen (1994) and Scholtz and Consolvo (2004). A final list of 18 heuristics was developed to cover all relevant interface issues (Table 3).

The results – multi-phase evaluation

IMMEDIATE 2 has progressed through a sequence of “iterate and revise” cycles using different evaluation techniques to assess and improve the invisibility of the system (see Table 4). We report on the results of these evaluations below.

Iteration | Evaluation Method
IMMEDIATE 2.0 | Heuristic evaluation by HCI experts
IMMEDIATE 2.1 | Evaluation with domain expert – language teacher
IMMEDIATE 2.1 | Heuristic evaluation by HCI experts
IMMEDIATE 2.1 | Evaluation with domain experts – second language students at a NZ university
IMMEDIATE 2.2 | User testing with ESOL students in Thailand

Table 4. Evaluation “iterate and revise” cycles


IMMEDIATE 2.0 – Heuristic evaluation

The initial heuristic evaluation of IMMEDIATE 2.0 was carried out by an assessor with an understanding of both the interface and learning aspects of IMMEDIATE (Kemp et al, 2008). The evaluation was extensive, lasting four hours, with the evaluator going through the system twice. Even so, not every heuristic could be tested, e.g. that for awareness, since there were not multiple users on the system. Overall, five key usability issues were identified that needed to be addressed to achieve a greater level of invisibility. These related to the "visibility of system status", "consistency and standards", "familiarity", "recognition rather than recall", and "ease of use" heuristics.

The heuristic evaluation results were subsequently reviewed by two other people with relevant experience. Various changes were proposed; for instance, it was decided to change the screen layout in order to make better use of space. Consistency was also seen as important, with the need to standardise terminology and the use of icons. Colour coding of study modes would remain but be limited.

After the evaluation, it was decided to review the suitability of the framework itself. A small number of questions were removed or associated with another guideline. In particular, it was decided that the heuristic “Ease of use” should no longer include the statement “The system should provide support for the user in their learning.” Instead, the relevant issues concerning learning support are now covered by “Focus” and “Effectiveness.”

IMMEDIATE 2.1 – Teacher evaluation

The opinion of a lecturer involved in teaching Maori as a second language was sought to determine the suitability of IMMEDIATE for distance language teaching. One of the main goals of this expert evaluation was to determine whether the proposed framework for the integration of course material (reading, writing and group practice) was appropriate and would support the academic's pedagogical objectives. A structured interview was carried out to ensure that all relevant topics were covered (Cordingley, 1989). Prior to the evaluation, the lecturer was provided with a copy of the interview questions (Table 5), which related principally to educational issues, although there was a more general question about the overall impression of the system. After a demonstration centred on a scenario, the lecturer explored further, clarifying how IMMEDIATE worked before answering the questions.

In general, the system was found to support the range of functionalities that would benefit and enhance second language teaching (in line with Levy's modular approach to second language teaching). With regard to his own teaching requirements, however, certain changes would have to be made, including the addition of two modes to complement group practice – speaking and listening – that were not required by the teachers of academic English as a second language.

1. In your opinion, do you think the way the templates present the material impacts on the pedagogical objectives of the material?

2. Do you have any suggestions for improving the templates? Is there anything else you would like to include?

3. In your opinion does the current learning support provide pedagogical benefits?

4. Do you believe that the technology will inhibit the pedagogical efforts of the students?

5. In your opinion do you think the students will have a clear understanding of the required learning outcomes?

6. Do you think that the facilities to monitor the students' progress are sufficient/adequate?

7. What are your overall impressions of the system?

8. Are there any other comments or suggestions you would like to make?

Table 5. Interview Questions for IMMEDIATE


The goals of this lecturer with regard to second language teaching related not only to the formal teaching of grammar, etc., but also to supporting the everyday use of language in context. This lecturer wanted a learning environment where the terminology was a little less formal than that used in the academic English course. IMMEDIATE allows lecturers to tailor courses to meet their educational objectives and to provide an experience for users that matches their real needs, as advocated by Norman (1998). When setting up a course, lecturers can select from a range of available modes (including listening and speaking) and name them appropriately, e.g. using "vocabulary practice" instead of "comprehension".

IMMEDIATE 2.1 – Heuristic Evaluation

The heuristic evaluation of IMMEDIATE 2.1 was carried out by six people: one group consisting of three postgraduates with experience of carrying out evaluations of this type, and the other of three postgraduate students who had learned English as a second language. The views of the latter were particularly important in seeing whether the interface helped or hindered learning.

A task scenario was provided for evaluators to work through (Table 6). It was made clear to participants that the context was the use of IMMEDIATE by inexperienced distance students enrolled on a paper for teaching English as a second language. They were asked to answer a series of questions relating to 14 of the 18 usability principles successfully employed in the previous study (Kemp et al, 2008). The heuristics relevant to the multi-user situation (10, 13, 16, 18) were omitted as only individual usage was tested. The participants were also able to comment on problems that arose during the course of the evaluation. The main goal was to see whether IMMEDIATE was seen as a suitable tool for delivering language teaching. The specific issues were whether there was any significant difference between the results for the two groups, and what the strengths and weaknesses of IMMEDIATE were (by heuristic and question).

1. The course explorer shows there is a written practice exercise for Unit 1.01 Introduction I; complete this exercise.

2. Having done the exercise you are still unsure of the key word construct. Find out more information about it.

3. The extra information has not helped. Contact the lecturer for more information.

4. Having had enough for today, exit the system.

5. Deciding that you wish to resume from where you left off, launch IMMEDIATE.

6. You wish to see if the lecturer has replied to your previous question. Check your messages.

7. You remember you have not sent a message to Bob asking when he is free to work on your group practice. Send a message to him.

8. You wish to complete the readings for this unit. Go to the readings and complete these and the ‘Main Idea’ questions.

9. The next unit’s (Unit 1.02) reading has further useful reading practice hints. Go to it.

10. Exit the system.

Table 6. Scenario used for demonstration and heuristic evaluation purposes


All of the evaluators spent approximately one hour using the system before going through the checklist of questions, rating their opinion on a 3-point scale from strongly disagree to strongly agree (scored from 1 to 3). The results showed that there was a remarkable degree of consistency between and within each group. For most of the questions the answers were identical or almost identical (that is, with only one person not in agreement). This was reflected in the similarity of the means (based on the scores of all the items) for each group: 2.2 was the average for those with HCI experience and 2.25 for the former second language students (Table 7). These were both over the midpoint of 2. Of great interest was the fact that only 5% of the responses indicated dissatisfaction with the interface.

Heuristic | Mean
Focus | 2.66
Flexibility | 2.4
Recognition | 2.4
Help | 2.33
Aesthetic | 2.3
Timeliness | 2.25
Consistency | 2.2
Error prevention | 2.2
Match system to real world | 2.1
Effectiveness | 2.1
Familiarity | 2.1
Ease of use | 2
User control | 2
Visibility system status | 2

Table 7. Mean of each heuristic


The heuristic that was rated most highly (mean of 2.66 out of a possible 3) was "Focus", which was very pleasing as the central aim is to make the technology blend into the background and allow users to concentrate on learning. Indeed, the former second language students rated IMMEDIATE very highly in this regard (2.8). IMMEDIATE also scored well for "Flexibility and Efficiency of Use" and "Recognition rather than Recall". Both of these are very important if, from a psychological perspective, people are to be oblivious of the technology (Weiser, 1991). The value of context-specific help was also recognised. These are all key issues for educational software that aims to provide a suitable experience for both individual and group learning. Since none of the heuristics had an average below 2, this indicated, again, that all aspects of IMMEDIATE were generally acceptable.

Some minor problems were specifically reported, e.g. icons were easy to overlook and one (for looking at answers) was missing. The lengthy load-up time when IMMEDIATE was run from a USB stick was also annoying. In the main, however, IMMEDIATE was seen as well presented, supporting a range of functionality appropriate for an e-learning application. No major problem was found that would impact on a user's ability to perform the learning tasks which IMMEDIATE supported.

This was the first time that IMMEDIATE had been exposed to experienced language learners and the results were promising. Overall, the ratings for heuristics such as "Focus", "Flexibility and Efficiency of Use", and "Recognition rather than Recall" indicated that the participants could complete the required tasks without difficulty. It appeared that they were able to develop an appropriate conceptual model of the interface (Norman, 1998). It was particularly pleasing that the large number of displays did not impede the participants and that the technology appeared to recede into the background. Considerable effort had gone into re-designing the interface after the previous evaluation (Kemp et al, 2008), which had shown undue reliance on recall rather than recognition. It was satisfying, therefore, to see the improvement in the rating of this heuristic.

From the two evaluations of IMMEDIATE 2.1, it seemed that, with minor amendments, the learning appliance was ready to be tested in a multi-user environment. To this end, field-testing was carried out at a university in Thailand with a new iteration of the prototype supporting the immersive/communicative approach in a multi-user environment.

IMMEDIATE 2.2 – Field testing

An extensive evaluation of IMMEDIATE 2.2 (Figure 2) was conducted at a large open university in Thailand by a team of Thai, Malaysian and New Zealand researchers. A pilot study identified not only problems in the evaluation process but also technical problems. Once these had been sorted out, the field test took place.

Eleven participants who were current or former ESOL students and used computers on a daily basis took part in the full-day evaluation. Material adapted by New Zealand ESOL teachers from the Thai university's teaching materials was made available to the participants, who were required to carry out tasks specified in scenarios that were available in both Thai and English. The tasks involved not only self-practice (reading and writing) but also live audio and text conversations between students and with the tutor. Contextual learning support and feedback were available. In order to ascertain the views of the participants as they proceeded with their tasks, a series of questions about the features of IMMEDIATE had to be answered in situ. A questionnaire about the interface (helpfulness, efficiency, learnability, etc.) was completed after all the tasks had been finished. Ratings for both the scenario statements and the questionnaire items were on a 5-point Likert-type scale, with values ranging from 1 (strongly agree) to 5 (strongly disagree). Medians were calculated, plus the percentage of responses in the top part of the range (1, 2) and the bottom part (4, 5). Finally, the views of the participants were ascertained in a focus group meeting at the end of the day. Views were expressed in Thai and translated into English.
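The sketch below shows how such summary statistics can be computed from the 5-point responses; the sample ratings are purely illustrative and are not the study's raw data.

from statistics import median
from typing import Dict, List

# Illustrative ratings only: 1 = strongly agree ... 5 = strongly disagree.
ratings: Dict[str, List[int]] = {
    "This mode is stimulating and motivating": [1, 2, 2, 1, 2, 1, 2, 2, 1, 2, 2],
    "Using this mode is frustrating":          [2, 3, 4, 3, 2, 4, 3, 1, 5],
}

for statement, scores in ratings.items():
    n = len(scores)
    top2 = 100 * sum(s <= 2 for s in scores) / n       # strongly agree or agree
    bottom2 = 100 * sum(s >= 4 for s in scores) / n    # disagree or strongly disagree
    print(f"{statement}: median={median(scores)}, "
          f"top-2={top2:.0f}%, bottom-2={bottom2:.0f}% (n={n})")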

Figure 2. Interaction mode. IMMEDIATE 2.2 for ESOL. Black and white images indicate all users are offline.

The results were extremely encouraging from a pedagogical perspective. All modes and features of IMMEDIATE were seen as supporting the learning process, with no median greater than 2. In four cases (Listening, Assignment, Reflection and Self Assess), all of the responses (100%) were in the top part of the range. Sending messages was also seen as easy and there appeared to be no problem completing exercises or using the reflection mode to monitor progress. Other questions related to the stimulating and motivational aspects. The exercises and assignment scored very highly, with a median of 1, and again no median was greater than 2.

The least positive results concerned the Interaction (audio communication) mode. While all participants agreed or strongly agreed that this mode supported the learning process (median 1), forty-four percent of the respondents strongly agreed or agreed that they found it frustrating to use (median 3). A low rating for error prevention also related largely to problems encountered in the Interaction mode. This showed the need for further "iterate and revise" cycles on this feature. The real-time communication functionality was introduced for IMMEDIATE 2.2, which meant it had not been through the earlier iterate and revise cycles, and it could not be fully tested prior to the evaluation in a multi-user environment. It became apparent that the Thai students were not familiar with "Push-to-Talk" simplex communication systems.

With regard to interface issues, the responses from the questionnaire revealed that the majority of the participants were satisfied with aspects of IMMEDIATE such as Efficiency, Affect, Focus, Privacy and Suitability for the online medium (all with medians of 2). However, the system's Error Prevention mechanism needs to be further improved, as the results were similar to those from the first evaluation (median=5, top-2=0%). The learnability of the system was not rated very highly either (median of 3). Additional discussion of the questionnaire results can be found in Hussain et al (2010).

The focus group discussions showed that for some participants the learnability was impaired by an incorrect conceptual model of the learning appliance, shaped by their extensive prior experience with Windows PCs and by cultural attitudes towards learning that diverged from the course designers' expectations. Nevertheless, the focus group members saw IMMEDIATE as a usable system with features such as learning support that should be included in other e-learning systems. Interestingly, a number of the participants suggested that Thai students would prefer a more prescriptive, linear approach to the more self-directed, exploratory style encouraged by IMMEDIATE, which is based on New Zealand distance learning experience. These cultural aspects of the pilot study are discussed in depth in Boonphadh et al (2011).

This evaluation was a first attempt at using IMMEDIATE in realistic circumstances with a focus on educational issues. It was possible to consider issues such as the provision of appropriate activities, content, feedback and assistance. Support for metacognitive behaviour such as self-reflection and monitoring could also be examined. Seeing whether students could work in a flexible way was also important. From the ratings of the modes/features of IMMEDIATE made when students were actually engaged in carrying out tasks, it can be concluded that all the activities supported were seen as conducive to learning (Table 8). None of the medians for these was greater than 2. Assistance could be satisfactorily provided by asking the tutor (median of 2) or using the learning support (focus group finding). Students found the self assess and reflection activities useful. Whilst the learnability of the system did not rate well in the questionnaire, focus group members believed that they could quickly become familiar with the features of IMMEDIATE.

No | Mode/Feature | Statement | Mean | Median | Top-2 | Bottom-2 | Respondents
1 | Listening Mode | 1. This mode is stimulating and motivating | 1.6 | 2 | 100% | 0% | 11
1 | Listening Mode | 2. This mode supports the learning process | 1.4 | 1 | 100% | 0% | 10
2 | Assignment Mode | 3. This mode is stimulating and motivating | 1.33 | 1 | 100% | 0% | 9
2 | Assignment Mode | 4. This mode supports the learning process | 1.5 | 1.5 | 100% | 0% | 8
3 | Reflection Mode | 5. I like the use of this mode to monitor my progress in this subject | 1.8 | 2 | 70% | 10% | 10
3 | Reflection Mode | 6. This mode supports the learning process | 2.2 | 2 | 67% | 11% | 9
4 | Interaction Mode | 7. Using this mode is frustrating | 3 | 3 | 44% | 33% | 9
4 | Interaction Mode | 8. This mode supports the learning process | 1.4 | 1 | 100% | 0% | 9
5 | Ask the Tutor | 9. This feature assists in the learning process | 1.9 | 2 | 80% | 10% | 10
6 | Self Practice | 10. The exercises are stimulating and motivating | 1.4 | 1 | 89% | 0% | 9
6 | Self Practice | 11. The exercises can be completed easily | 2.4 | 2 | 78% | 11% | 9
7 | Messaging | 12. Sending the message is easy | 2.25 | 1.5 | 63% | 25% | 8
7 | Messaging | 13. This feature supports the learning process | 2.1 | 2 | 78% | 11% | 9
8 | Self Assess | 14. This feature is stimulating and motivating | 1.4 | 1 | 88% | 0% | 8
8 | Self Assess | 15. This feature supports the learning process | 1.5 | 1.5 | 100% | 0% | 8

Table 8. Results of in situ questionnaires for each major learning feature, on a scale of 1 (strongly agree) to 5 (strongly disagree).


With regard to the content, whilst the exercises were seen as stimulating and motivating, there was not sufficient material to provide a rich learning environment. Without this there was not enough flexibility for the self-directed user. There was also the cross-cultural problem that can occur when testing ESOL material. Despite the wish to immerse students in the language being taught, it may still be necessary to have some instructions and examples in the native language. This allows the autonomous learner to work at their own pace at any time (when no tutor is available to help them). Calmness and serenity cannot be promoted if people do not know how to proceed or do not have the relevant material available.

Overall, IMMEDIATE performed well in a difficult setting. Whilst the Interaction mode was highly visible to the students this was not unexpected given that the multi-user feature had never been tested before.

As a result of the Thai field test a new version, IMMEDIATE 3.0, has been developed which addresses some of the issues identified with the Interaction and other modes (Figure 3). IMMEDIATE 2 required some minor configuration of the student’s machine before it would run from a USB stick. IMMEDIATE 3 removes this requirement.

Figure 3. Interaction Mode as revised from Thai pilot study. Colour image indicates student is online and available for live conversation.

Conclusion

The goal of the IMMEDIATE research group at Massey University, New Zealand, is to explore methods for improving the effectiveness of online education by reducing the visibility of the enabling technology. This article has reviewed the group’s efforts to develop and evaluate a virtual learning appliance for the distance study of second languages, to test their hypothesis that on-line distance learning technology can be simplified and rendered “invisible” to the student, if the design approach that has marked successful consumer appliances is applied.

The overriding idea behind this thesis, drawn both from the authors’ practical engineering experience and from a review of the work of computer scientists and cognitive psychologists, is that invisibility is not primarily a physical question of hiding the technology, but one of designing the technology to fit the need so perfectly that the user forgets that it is a complex technical device. By designing the student client in a distance e-learning system as a simplified, specialised learning appliance, the complications and distractions of a general-purpose computing environment can be avoided, making the appliance a more effective and usable learning tool.

To aid the iterative prototyping process by which the virtual learning appliance has been built, the team developed a set of invisibility heuristics as an extension of Nielsen's usability heuristics. These were used to conduct laboratory evaluations with small groups of usability experts, second language students and teachers, with encouraging results that helped to improve the prototype and suggested that the researchers were on the right track. But it was also clear that more definitive conclusions could not be drawn without field testing the appliance with actual second language students. To this end, collaboration was established with a research group at a large open university in Thailand, where an initial field test was conducted in 2010.

The results from the Thai field test were largely positive, with the evaluators able to complete most of the learning scenarios during their first contact with the system and agreeing that most features supported their learning. However, they found one key feature, the Interaction Mode, to be frustrating and difficult to use, in part because they were unfamiliar with the Push-To-Talk communication on which it was modelled. Some additional difficulties were caused by the clash between the exploratory New Zealand learning culture embodied in the ESOL appliance and the more prescriptive learning style the Thais were accustomed to. Invisible technology in e-learning is difficult to achieve, but not impossible. It requires repeated cycles of heuristic evaluation and field testing with the target users. In the Thai field tests, those features that had been thoroughly evaluated along these lines tended to recede quickly, whereas those which had been introduced without much prior evaluation, or did not match the students' conceptual model, remained highly visible and impeded learning.

The results from the project so far are encouraging. However, further larger-scale field testing is needed and planned, with form and content modified on the basis of this initial field test. Collaboration has also been established with a research group in southern Thailand who wish to use the IMMEDIATE software to evaluate the appliance approach to facilitate occupational health and safety training among rubber farmers. This will include distributing part of the system to mobile phones to enable help and advice to be accessed from the field.

References

Allan, C.J.; Johnson, R.S.: Taking e-Learning Across The Divide. Presented at 5th International Workshop on Technology for Education in Developing Countries (TEDC’08), Kampala, Uganda, July 31-Aug 2, 2008.

Attewell, J.: Mobile technologies and learning. A technology update and m-learning project summary. Learning and Skills Development Agency, London, 2004. (ISBN 1-84572-140-3).

Bergman, E. (ed.): Information Appliances and Beyond: Interaction Design for Consumer Products. Morgan Kaufman Publishers, San Francisco, 2000. (ISBN 1-55860-600-9).

Boonphadh, P.; Johnson, R.; Kemp, E.; Wipassilapa, S.; Hussain, N.: Crossing the Cultural Divide: Challenges Involved in Bringing a New Zealand-Designed Interactive Computer-based ESOL Package to Thailand. In: Proceedings of the 11th International Conference on Social Implications of Computers in Developing Countries, Kathmandu, Nepal, May 2011.

Bork, A.: Tutorial Learning for the New Century. In: Journal of Science Education and Technology. 10(1), 2001, pp. 57-71.

Cooper, A.: The Inmates are Running the Asylum. SAMS Publishing. Indiana, 2004.

Cordingley, E. S.: Knowledge elicitation techniques for knowledge-based systems. In Diaper, D. (Ed.): Knowledge elicitation: Principles, techniques and applications. Ellis Horwood Ltd., Chichester, England, 1989.

eLearning Brothers: eLearning Activities. eLearningTemplates. 2011 http://elearningtemplates.com/elearning-activities . (last check 2011-09-06)

Friedman, B.; Kahn Jr. P.H.; Borning, A.: Value Sensitive Design: Theory and Methods, tech. report 02-12-01, Univ. Washington, Dec., 2001.

Garrett, N.: Computer-assisted language learning trends and issues revisited: Integrating innovation. In: The Modern Language Journal, 93(s1), 2009, pp. 719-740.

Hill, J.R.; Reeves, T.C.; Heidemeier, H.: Ubiquitous Computing for Teaching, Learning, and Communicating: Trends, Issues & Recommendations. White Paper. Department of Instructional Technology, College of Education, The University of Georgia, Athens, USA, 2000. http://lpsl.coe.uga.edu/Projects/AAlaptop/pdf/UbiquitousComputing.pdf , (last check 2011-09-06)

Hussain, N.; Johnson, R.; Kemp, E.: An evaluation of a specialized portable system for tertiary distance teaching of ESOL. In: Wong, S. L. et al. (Eds.): Proceedings of the 18th International Conference on Computers in Education, November 29 to December 3, 2010. Asia-Pacific Society for Computers in Education, Putrajaya, Malaysia.

Jameson, A.: Adaptive Interfaces and Agents. In: Jacko, J.; Sears, A. (Eds.): Human-computer interaction handbook, pp. 305-330. Erlbaum, Mahwah, NJ , 2003.

Jara, M.; Mohamad, F.: Pedagogical templates for e-learning. In: WLE Centre. Occasional Papers in Work-based Learning 2, 2007. London. (ISSN 1753-3385)

Johnson, R.; Kemp, E.; Kemp, R.; Blakey, P.: The learning computer: low bandwidth tool that bridges digital divide. In: Educational Technology & Society, 10(4), 2007, pp 143-155. (ISSN 1436-4522 (online) and 1176-3647 (print).)

Jones, M.; Marsden, G.: Mobile Interaction Design. Wiley, Chichester, 2006.

Kemp, E. A.; Thompson, A.; Johnson, R. S.: Interface evaluation for invisibility and ubiquity - an example from e-learning. In: Proceedings of CHINZ 2008, The 9th ACM SIGCHI-NZ Annual Conference on Computer-Human Interaction (Wellington, New Zealand, 2 July). ACM Press, 2008.

Kruse, K.: E-Learning and the Neglect of User Interface Design. In: E-Learning Guru, 2002. http://www.e-learningguru.com/articles/art4_1.htm . (not available at 2011-09-06)

Levy, M.: Technologies in use for second language learning. In: The Modern Language Journal, 93, 2009, pp 769–782.

Microsoft.com: Create Online Courses and Silverlight Learning Snacks with LCDS. 2011. http://www.microsoft.com/learning/en/us/training/lcds.aspx (last check 2011-09-06)

Moore, M.; Kearsley, G.: Distance Education: A Systems View of Online Learning. 3rd edition, Wadsworth CENGAGE Learning, 2012, (ISBN-13: 978-1-111-52099-1).

Murray, T.: Authoring Intelligent Tutoring Systems: An analysis of the state of the art. In: Int. J. of AI and Education. 10 (1), 1999, pp 98-129.

Nielsen, J.: Heuristic evaluation. In: Nielsen, J.; Mack, R.L. (Eds.): Usability Inspection Methods, John Wiley & Sons, New York, NY, 1994.

Norman, D. A.: The Invisible Computer. MIT Press, Cambridge, Massachusetts, 1998.

Otto, S.; Pusack, J.: CALL Authoring Issues. In: The Modern Language Journal, 93, Supplement 1, December 2009, pp. 784-801. http://www.ingentaconnect.com/content/bpl/modl;jsessionid=13lsmr000m811.victoria (last check 2011-09-06)

PARC, 2007. Retrieved January 18, 2007 from http://www.parc.xerox.com/about/history/default.html . (not available at 2011-09-06)

Preece, J.; Rogers, Y.; Sharp, H.: Interaction Design: Beyond human-computer interaction. 1st Edition. Wiley, New York, 2002. (ISBN 0-471-49278-7).

Shneiderman, B.: Software Psychology. Human factors in Computer and Information Systems. Winthrop. Cambridge, Massachusetts, 1980.

Smulders, D.: Designing for Learners, Designing for Users. 2003. http://elearnmag.org/subpage/sub_page.cfm?section=3&list_item=11&page=1 . (last check 2011-09-06)

Waycott, J.; Kukulska-Hulme, A.: Students' experiences with PDAs for reading course materials. In: Personal and Ubiquitous Computing, 7, 2003, pp 30-43.

Weiser, M.: The Computer for the 21st Century. In: Scientific American, 265(3), 1991, pp 94-104.

Weiser, M.: Ubiquitous Computing. In: IEEE Computer, Hot Topics, 26(10), 1993, October.

Weiser, M.; Brown, J.S.: The Coming Age of Calm Technology. In: Denning, P.J.; Metcalfe, R.M. (Eds.): The Next Fifty Years of Computing. New York, 1998, pp 75-85.

White, C.: Distance learning of foreign languages. In: Language Teaching. 39(4), 2006, pp 247-264.

Ye, J.; Johnson, R.S.: An embedded bimodal tool to enable second language learners to practise conversation online over unreliable internet connections. Presented at the 5th International Workshop on Technology for Education in Developing Countries (TEDC’08), Kampala, Uganda, July 31-Aug 2, 2008
