Computer Literacy and Skills System (CLaSS): A Software Development Project into Computer and Information Literacy for Nursing Students

Ian John Cole, MSc DipCom (Open)

Cole, I. J. (October, 2004). Computer Literacy and Skills System (CLaSS): A Software Development Project into Computer and Information Literacy for Nursing Students. Online Journal of Nursing Informatics (OJNI). Vol. 8, No. 3 [Online]. Available at


This paper considers the research, development and the evaluation processes that were involved in the design and implementation of a computer and information literacy software artefact for a specific group of undergraduate nursing students. It draws on existing literature and applies a specific learning model to the software while considering software engineering and user-centered design methodologies.

The technical processes involved in designing and creating the software are described with post-software development data analysis discussed.

The project has attempted to address the learning requirements of a group of health care professionals involved in part-time education with software that focuses on computer literacy problems that some nurses returning to educational study may have.


Keywords: computer, information, literacy, software, design


The acquisition of communication and information technology (C&IT) skills presents Universities with significant challenges (NCEHE, 1997), and since the integration of nurse training into higher education in the early 1990s, the problems associated with computer literacy have been particularly hard to solve for nurse education.

These problems are well documented (Chambers & Coates, 1990; Gassert & McDowell, 1995; Graveley, Lust and Fullerton, 1999; Topp & Kinn, 1999), and with widening participation being the ‘top of the (UK) government’s agenda for higher education’ (Woodrow, 2000) in the last few years, accessibility, inclusion and key skills (NCEHE, 1997; DfES, 2004) are issues that nurse education has continually tried to address.

This project has attempted to address some of the computer and information literacy skills problems faced by a group of nursing students involved in part-time educational studies at a British University, where a piece of learning technology has been created called the ‘Computer Literacy and Skills System’ (CLaSS) to assist students improve specific areas of their computer and information literacy (C&IL).

The ‘CLaSS’ software is an interactive computer training package consisting of five individual tutorials. The software has been designed to run across a wide area network (WAN) or to be delivered via CD-ROM.

It covers the following criteria:

The outcomes of the research into the validity of the project are being published in Nurse Education in Practice (Cole & Kelsey, 2003), but a short summary of the research results is shown here to identify the learning needs of the target students.

Image 1: CLaSS software main menu screen

Contextual Background:

Literature Review:

In the United Kingdom, nursing students have been considered inhibited by a lack of computer training integration into nursing courses. Armstrong (1989) conducted a national study of nurse educators about the use of computers in nurse education and found that few nursing institutions were providing students with computer courses. In the following year, Chambers and Coates (1990) concluded that a national strategy was needed to provide education in computer skills for nurses. Jones, Navin, Barrie, Hillan, & Kinane (1991) considered that Glasgow University needed to change their undergraduate curriculum to include the teaching of computers for clinical practice.

Computer literacy skills problems continued to be identified throughout the 1990s (Blythe and Royle, 1993; Anthony, 1997; Saranto and Leino-Kilpi, 1997), with institutions making assumptions as to the level of computer competency students had.

A study of academics by Grant (1995) from the University of South Australia, indicated that Universities assumed a level of computer literacy but did not define the level of competence required.

Reid (1997) also found that there was an assumption that students already have computer literacy skills, an assumption that is often mistaken. He argues that institutions should attempt to define computer literacy to ensure that their students are computer literate. He went on to discuss the seven methods (Ferren, 1993) by which higher education institutions can integrate computer literacy training into undergraduate programs (see Table 1).

Distribution Requirements: Computer literacy is accepted as a discipline and students take one or more subjects in that discipline.

Core Curriculum: All students take a core, or foundation, group of subjects which include computer literacy content.

Correlated Courses: A parallel course in computer literacy is taught that uses the content of another discipline as the vehicle to teach the necessary skills.

Freshman Studies: As part of a general introduction to the ways of working at University, computer literacy is included in a program to develop library skills, information retrieval and communication skills, on the basis that computer literacy is a generic skill.

General Education Module: A computer literacy module is included in all programs as a general education experience, to increase students’ general awareness of computer literacy issues.

Certification of Proficiency: The institution takes a purely ‘gatekeeping’ role and sets up a standard assessment in computer literacy that students must achieve to participate in a degree.

Integration into All Courses: The teaching of computer literacy skills is integrated into all subjects and is taught in a ‘just-in-time’ fashion, or is assumed to exist.

Table 1: Seven Methods of Integrating Computer Literacy (Reid, 1997)

There are advantages and disadvantages to implementing these methods, and universities and colleges need a clear strategy for implementing them.

The seriousness with which some institutions regard this problem, and their willingness to integrate a computer literacy strategy, can be illustrated by an American example. Western Illinois University has a Senate Committee on Computer Competency (CCC), which in 2000 drafted computer literacy competencies (minimum knowledge) considered necessary for academic success.

The CCC’s basic competencies are defined as:

“Using word processing programs effectively; use of library databases and catalogues to locate print materials; finding information on the Internet and evaluate its reliability and usefulness; being able to write email effectively and appreciating the ethical issues of computing.”

(Leland et al., 2000)

These competencies were the result of a cross-university study into computer literacy. The report also outlines what are described as professional-level competencies, which include the use of spreadsheets, databases, graphics programs, desktop publishing, writing web pages and other discipline-specific applications.

Some of the main recommendations from the report are that:

Western Illinois University’s basic competencies can be considered information literacy competencies rather than computer literacy competencies, as they are concerned with word processing, library databases and Internet searching. If we consider the mouse control, keyboard and Windows navigation skills that are needed to undertake information literacy, then information literacy should be viewed as a broader concept that includes computer literacy. For example, it is very difficult to undertake a CD-ROM database search (information literacy) if the student is unable to log on to the computer and use a mouse (computer literacy).

In balancing these observations, the San Francisco State University (SFSU) School of Nursing evaluated their integrated programme of information literacy and found that in comparing a nursing cohort in 1992 to one in 1996, the 1996 cohort of students expressed a ‘significantly greater lack of knowledge’ regarding the use of information resources (Verhay, 1999). These findings have implications for the development of a computer and information literacy curriculum, as the expectations from the SFSU study were an improvement in knowledge gained and not the reverse. 

Integrating Skills:

In the UK, the 1997 report of the National Committee of Inquiry into Higher Education (commonly known as the Dearing Report) identified that the key to the future success of graduates lay in four key skills.

These skills are identified as:

The report clearly outlined that

“Students will expect to leave higher education competent and confident in the use of Communication & Information Technology so that they can use it in their future careers and personal learning”

(SKSAIHE, 2001).

In integrating these key skills, the problems of numeracy skills for the health care professions have been well documented (Adams & Duffield, 1991; Blais & Bath, 1992; Edwards, Adrian, Matthews, Pill, Bloor, 1998; Sabin, 2002). Numeracy is defined as the mathematical skills that enable people to function in the practical demands of everyday life (Steen, 1991; Schwartz et al., 1997). The Moser Report (DfEE, 1999) suggested that as many as 40% of UK adults have some numeracy problems.

In an attempt to start addressing this problem for nursing and midwifery in the United Kingdom, in 2003 the Nursing and Midwifery Council (NMC) made it a requirement that:

"Higher Education Institutions ensure that applicants for pre-registration nursing and midwifery education have provided evidence of literacy and numeracy. This may be integral to academic or vocational qualifications, or alternative evidence such as keyskills abilities."

(NMC, 2003).

There was no requirement or recommendation within this document about the specific use of computer or information literacy, and these key skills are being integrated into UK University programmes with varying degrees of success.

At the University of York, the Department of Health Sciences has integrated all four key skills into their undergraduate Diploma of Higher Education in Nursing Studies programme and now includes literacy and numeracy assessment for admission to their nursing programme.

At the University of Strathclyde, Johnson & Webber (2000) have integrated a credit-bearing elective class in information literacy, drawing on three of the four key elements from the Dearing report. They excluded numeracy from their information literacy course because they considered that it did not explicitly cover the aims and objectives of their class.

In the USA a comprehensive account of the skills and abilities needed to exploit information literacy is provided in a document from the American National Research Council Computer Science and Telecommunications Board (1999). Their document describes the concept of ‘fluency with information technology’, whereby “fluency” is defined thus:

“Fluency with information technology…entails a process of lifelong learning in which individuals continually apply what they know to adapt to change and acquire more knowledge to be more effective at applying information technology to their work and personal lives” (p2)

(NRCCST, 1999)

Fluency with information technology requires three types of knowledge:

  1. Foundational concepts – underpinning principles of computing.
  2. Contemporary skills – include the ability to use software applications.
  3. Intellectual capabilities – include the ability to apply information technology in complex situations.

These types of knowledge have a synergy with the concepts of computer and information literacy: the foundational concepts equate to computer literacy, the contemporary skills marry with the application of information literacy, and the intellectual capabilities match the concepts of critical thinking (Wilkinson, 1996). Critical thinking can only be applied to computer and information literacy (C&IL) skills once those skills are sufficiently developed.

It has become clear in this review that there is no single agreed definition of computer and information literacy (C&IL); the definition depends on the institution and learning context. C&IL is an ever-changing concept that can be moulded to institutional needs. The definition used throughout this paper is that ‘computer literacy is defined as the skills needed to undertake information literacy objectives’.

Summary of the research study:

The research undertaken for this project set out to discover what levels of computer and information literacy the target nursing students had already gained, and whether there was a need for a piece of learning technology such as the CLaSS software. Quantitative research was carried out via a questionnaire designed to elicit the students’ own assessment of their computer skills and knowledge. The questionnaire used a five-point Likert-type scale with the descriptors excellent, good, adequate, poor and very poor; throughout the analysis, “poor” and “very poor” were considered together as both less than “adequate”.
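The recoding step described above, counting “poor” and “very poor” together as a single below-adequate category, can be sketched as follows. The responses here are hypothetical (the study’s actual analysis scripts are not published); the recoding logic is what the paper describes.

```python
from collections import Counter

# Hypothetical responses on the five-point scale used in the study.
responses = ["Excellent", "Good", "Adequate", "Poor", "Very Poor",
             "Poor", "Good", "Very Poor", "Adequate", "Poor"]

# Collapse "Poor" and "Very Poor" into a single "below adequate" category,
# as was done throughout the analysis.
def recode(r):
    return "Below adequate" if r in ("Poor", "Very Poor") else r

counts = Counter(recode(r) for r in responses)
below = counts["Below adequate"]
print(f"{below}/{len(responses)} responses "
      f"({100 * below / len(responses):.0f}%) below adequate")
```

The percentages in Table 2 (e.g. “21% felt their ability … was below adequate”) are of this collapsed form.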

497 students were identified for the study. These were undergraduate post-registration nursing students (qualified nurses undertaking further educational courses); only 12% of the students involved in the study were on full-time courses, with the balance being employed while undertaking their study. Questionnaires were distributed and collected by lecturers within teaching sessions, which gave a high response rate of 69% (n=342).

Questionnaire Results:

The results were analysed in the two main categories of computer literacy and information literacy. The computer literacy category is concerned with the knowledge and ability to use computers, whereas the information literacy category is concerned with the knowledge and ability to perform searches and communicate electronically.

(A summary of the results is shown in Table 2.)

The results from this survey show a serious skills deficit. Whether these students are suffering from a lack of confidence or a real lack of computer skills, the issue is that a large number of them perceive themselves as unskilled in the use of computers, and that is a problem that needed to be addressed.

The research study recommended that, apart from the need for an introductory computer skills course for nursing students, there is a need for on-going learning support in the form of workbooks and/or computer software (Cole & Kelsey, 2003). Based on these recommendations, CLaSS software went into full development.

Summary of Questionnaire Results

92% of students were female, giving a ratio of roughly 12 female students to every male student.

The youngest student was 24 years old and the oldest was 61 (average age 38).

21% felt their ability to use a mouse or keyboard was below adequate.

36% felt their ability to navigate through Microsoft Windows was below adequate.

26% felt their ability to use a word processor was below adequate.

25% did not know how to search for a book or journal article in a library.

48% were not able to use an electronic library catalogue.

40% felt their ability to request journal articles through the University library was below adequate.

34% didn't know how to access information on the Internet and 46% felt their knowledge of the Internet was below adequate.

50% of the students did not use electronic mail.

46% felt their understanding of 'what a bibliographical database is' was below adequate.

57% felt their ability to access bibliographical databases through their University was below adequate.

72% did not know what 'Boolean searching' was and only 16% felt their understanding was adequate or above.

When asked if a computer skills system was a good idea 94% felt it was and 87% said they would use the system occasionally or frequently.

58% felt their understanding of disk drives (floppy A and C drives) was below adequate.

46% felt their knowledge of 'what a CD-ROM is' was below adequate.

39% felt their knowledge of file management (i.e. how and where to save work) was below adequate.

Table 2: Summary of research results.

Learning Model

CLaSS software was designed as five interactive tutorials covering the use of the mouse and keyboard, navigating the Windows environment, and the use of bibliographical databases and the Internet. The learning model that the software has adopted is based on David Ausubel’s subsumption theory (Bowen, 2004). Ausubel’s work was concerned with verbal/textual lessons in schools and, in his subsumption theory, he contended that "the most important single factor influencing learning is what the learner already knows" (Ausubel, 1968). Ausubel argued that when learning, the learner needs to link background knowledge (knowledge the student already has) to the foreground knowledge (what is being taught to the student), and to do this he advocated the use of ‘organisers’.

It is possible to apply Ausubel’s theory to the implementation of the CLaSS software if we consider the background knowledge to be computer literacy and the foreground knowledge to be information literacy. A large proportion of the students have limited or no background knowledge (computer literacy), and because of this the computer literacy tutorials (keyboard, mouse and Windows) can be viewed as Ausubel’s organisers, preparing the way for the foreground knowledge in the form of information literacy (Internet and database searching). Once the background knowledge has been attained or revisited, it leads on to the information literacy foreground knowledge, so completing the process.

Product Development Process:

Throughout the project development process the design criteria of Analysis, Planning, Design, Implementation and Testing have been applied as a ‘system development life cycle’ (Parsons & Oja, 1996); see Figure 1.



Figure 1: A System Development Life Cycle (Parsons & Oja, 1996).

Software Engineering Methodology:


The project analysis began with the background research study and identifying the need for the learning technology artefact. The target audience was identified and technical and environmental issues such as software requirements and methods of delivery were also identified.


Project planning continued with the identification of key milestones on a project timetable and the recruitment of a usability evaluation team. The choice of a predominantly female usability evaluation team (two-thirds female to one-third male) was made to give the evaluations a female bias, reflecting the large number of female students who would potentially use the software.

It was decided that the software would be created in the Macromedia Director authoring package with Macromedia Flash and other software tools being used for specific image manipulation tasks.


The design of each tutorial began with a storyboard prototype version created in Microsoft PowerPoint presentation software. The rationale for using PowerPoint for each prototype was that it was possible to get a feel for how the final software would look and run. It was also possible to create and evaluate many of the images that could be used in the finished product. The use of PowerPoint as a ‘rapid prototyping tool’ (Kreitzberg, 1996) made the prototyping quick, cheap and easy to create.

The navigational model was designed with letters A to E representing each of the five tutorials. The user could start at tutorial A and work sequentially through each one. Alternatively, the system is flexible enough for the user to choose which tutorials to work through, in any order. The design of each tutorial was broken down into separate software creation tasks and, where a tutorial was particularly complex, sub-tasks were created.
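The A-to-E navigational model might be represented as a simple mapping from letters to tutorials with two traversal paths. The letter-to-tutorial assignment below is an assumption (the paper names the tutorials but not which letter each received); the sequential/free-choice behaviour is as described above.

```python
# Hypothetical letter-to-tutorial mapping for the five CLaSS tutorials.
TUTORIALS = {
    "A": "How to Use a Keyboard",
    "B": "How to Use a Mouse",
    "C": "Navigating Windows",
    "D": "Database Searching",
    "E": "Using the Internet",
}

def sequential_order():
    """Default path: work through the tutorials A to E in turn."""
    return [TUTORIALS[k] for k in sorted(TUTORIALS)]

def chosen_order(choices):
    """Flexible path: the user picks tutorials from the menu in any order."""
    return [TUTORIALS[c] for c in choices if c in TUTORIALS]

print(sequential_order())          # the full default path
print(chosen_order(["E", "B"]))   # a user jumping straight to two tutorials
```

The design point is that both paths lead through the same tutorial content; the menu simply does not force the sequential route on more confident users.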

Particular problems were solved in isolation and integrated into each tutorial after it had been created. The design metaphor used is a fixed-size window with predominantly two different-sized and slightly different-coloured panels to carry text and images (see Images 2, 4, 5 & 6). This metaphor continues throughout the software, with the only exception being the Windows section, where the metaphor was changed to represent the Windows 2000 desktop (see Image 3).


During the implementation of the software a ‘bug’ was found and resolved; as is typical with problems of this type, it only appeared with a particular sequence of events.

The software was installed onto a computer network to check for any networking implementation problems. The installation was successful, and the evaluation team were asked to test the software and feed back initial comments.

Image 2: CLaSS software keyboard tutorial screen


Initial testing by the usability evaluation team was undertaken as soon as the tutorials were completed, and it was also possible to allow two groups of students to evaluate the software. Upon completion of prototyping, the formal evaluation by the usability evaluation team (UET) took the form of an 85-question questionnaire. As a result of the feedback from these questionnaires, the ‘system development life cycle’ (Parsons & Oja, 1996) was implemented again and software changes were made (see Software Piloting).

User-Centered Design Methodology


  1. Use both knowledge in the world and knowledge in the head.
  2. Simplify the structure of tasks.
  3. Make things visible: bridge the gulfs of Execution and Evaluation.
  4. Get the mappings right.
  5. Exploit the power of constraints, both natural and artificial.
  6. Design for error.
  7. When all else fails, standardize.

Table 3: Seven Principles of User-Centered Design (Norman, 1988)

A user-centred design approach (Norman & Draper, 1986) has been utilised throughout the project, incorporating Norman’s ‘Seven Principles of User-Centered Design’ (1988); see Table 3.

Examples of applying the principles of user-centered design are:

  1. Recognising the difference between inexperienced and experienced users and the awareness of implicit knowledge (visual cues) and explicit knowledge (labels on interfaces and instructions).
  2. Simplifying the structure of some of the tasks has been difficult; feedback from the usability evaluation team showed problems in the procedural content of the copy and paste task in the ‘Navigating Windows’ tutorial, requiring this task to be simplified.
  3. Visibility throughout the CLaSS software project has been of great concern and the acquisition of high quality images has been problematical. Some of the computer screen images were hard to see when reduced in size leading to slices of images being used to keep the quality high.
  4. Every effort has been made to ensure that the mapping models (Newman & Lamming, 1995) (i.e. the models a user retrieves from memory) of the software are accurate.
  5. The ‘power of constraints’ was used to particular effect in the ‘How to use a Keyboard’ and ‘How to Use a Mouse’ tutorials. In the keyboard tutorial the user is constrained from using a mouse in an effort to make them use the keyboard, and this constraint continues throughout the mouse tutorial until the user is required to practise mouse tasks.
  6. Back buttons were integrated into the software to help reverse the error of clicking on the wrong link or button and irreversible operations have been eradicated as much as possible.
  7. There has been much standardisation in the form of text, buttons, links and images throughout all five tutorials.

The synergy of the two methodologies, User-Centered Design and Software Engineering has been a consideration throughout the design process. It was Ben Shneiderman (1998) who stated:

‘any user-centered design methodology must also mesh with any software-engineering methodology used’

(Shneiderman, 1998).

Shneiderman also outlined that the relationship between software engineering and user-centered design has not always been smooth, but the relationship may now be considered to have arrived at a ‘second-generation business-oriented design approach’ (1998, p. 104). Throughout the process of development, the project has been based upon business principles with regard to the viability, planning and scheduling of the project.

Software Piloting:

Software piloting was undertaken using three methods: the usability evaluation team, a student evaluation process and a cognitive walkthrough (Shneiderman, 1998). The student and usability team evaluations were completed by the use of self-assessment questionnaires, while the cognitive walkthrough evaluation was videotaped and summarised.

Student Evaluation:

The opportunity arose to have two groups of study skills students evaluate the CLaSS software. These nursing students were ideal candidates to ‘user test’ the software, as a study skills course is taken prior to starting other vocational modules and courses. The CLaSS software was integrated into a taught basic computer literacy session, with the students being asked to complete a short quantitative questionnaire at the end of the session. The questionnaire was based on Brooke’s (1986) ‘System Usability Scale’ (SUS) questionnaire, which had been developed for the Digital Equipment Corporation.

“It is a five-point Likert-type scale that yields a single number representing a composite measure of the overall usability of the system”

(Brooke, 1986)

The scale descriptors used were ‘Strongly Disagree’ through to ‘Strongly Agree’. The questions were designed in such a way that ‘Strongly Agree’ is not always the positive answer indicating approval of the software. The two groups of students totalled 19 and were all female; the questionnaires were anonymous and were only catalogued when the data was entered into SPSS software for analysis.
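The standard SUS scoring procedure, on which the questionnaire here was based, handles exactly this mixture of positively and negatively worded items: each of the ten 1-to-5 responses is converted to a 0-to-4 contribution (response minus 1 for the positively worded odd items, 5 minus response for the negatively worded even items) and the sum is multiplied by 2.5 to give a single 0-to-100 figure. A minimal sketch of that standard calculation follows; the adapted CLaSS questionnaire may have differed in wording and item count.

```python
def sus_score(responses):
    """Compute the standard SUS score from ten 1-5 Likert responses.

    Odd-numbered items (1st, 3rd, ...) are positively worded: each
    contributes (response - 1). Even-numbered items are negatively
    worded: each contributes (5 - response). The summed contributions
    (0-40) are scaled by 2.5 to give a 0-100 score.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# A hypothetical respondent who strongly agrees (5) with every positive
# item and strongly disagrees (1) with every negative item scores the
# maximum:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # -> 100.0
```

The reverse scoring of even items is why ‘Strongly Agree’ is not always the favourable answer, which also discourages respondents from ticking the same column throughout.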

Image 3: CLaSS software mouse tutorial screen.

Summary of Student Evaluation:

The overall results from this group of students were very positive; only 16% of the students (n=3) did not feel very confident using the software.

Summary of the Usability Team Evaluation:

The usability evaluation team questionnaire was designed as a very detailed 85-question ‘mixed method’ (Brannen, 1995) questionnaire that aimed to combine qualitative and quantitative methods. After each question and at the end of a section the questionnaire included comment boxes for the particular question or section just completed. These comment boxes gave the questionnaire a qualitative data approach that complemented the quantitative data collection.

Each section of the software (Introduction and Main Menu, Keyboard Tutorial, Mouse Tutorial, Navigating Windows, Database Tutorial and Internet Tutorial) was rated on four descriptor scales: Terrible to Wonderful, Dull to Stimulating, Difficult to Easy, and Frustrating to Satisfying. The Overall CLaSS Software section was additionally rated on Rigid to Flexible and Learning to use – Difficult to Easy. The highest mean average was 7.86 (Mouse Tutorial, Terrible to Wonderful) and the lowest was 4.86 (Navigating Windows, Difficult to Easy).

Table 4: Usability Team Reactions to CLaSS Software.

The rationale for this mixed method approach was to elicit as much information as possible from the usability team. The comment boxes worked like open questions, which according to Brannen (1995) make it possible to modify the conclusions of a study in a way that would not be possible with a quantitative-only method. This was certainly the case with these results, whereby the comments had a direct effect upon the implementation and improvement of the software.

One of the nine members of the usability evaluation team was excluded from completing the questionnaire due to workload. Out of the eight other members of the team only one participant failed to return their questionnaire by the deadline.

Many of the questions in the questionnaire were duplicated so that team members could complete small sections of the questionnaire when time and workloads permitted. It was designed so that the team could work through a numbered section of the software and then complete that section’s questions in the questionnaire. Working through the software and questionnaire took on average two hours.

The questionnaire used a 10-point scale (0-9) that had multiple descriptors ranging from:

0 = Terrible, Dull, Difficult, Pointless, Confusing, Frustrating, Rigid.

9 = Wonderful, Stimulating, Easy, Appropriate, Clear, Satisfying, Flexible.

The mean averages for the overall reactions to each section of the software have been compiled into Table 4, with the highest and lowest scores highlighted. The mean ratings varied between 4.86 and 7.86. It is clear that the low scores centred on the Navigating Windows tutorial, while the highest scores were around the Mouse tutorial results. These results also showed that the lowest mean score was on the ‘Difficult to Easy’ descriptor scale. This was a clear indicator that more development work was needed on the usability of the Windows tutorial.
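The compilation of these per-section, per-descriptor means can be illustrated with a minimal sketch. The individual ratings below are hypothetical (the paper publishes only the means), chosen so that the two extremes reproduce the reported 7.86 and 4.86 from seven returned questionnaires.

```python
from statistics import mean

# Hypothetical 0-9 ratings from the seven returned questionnaires,
# keyed by (section, descriptor scale). Only two of the many
# (section, descriptor) cells are sketched here.
ratings = {
    ("Mouse Tutorial", "Terrible to Wonderful"): [8, 7, 9, 8, 7, 8, 8],
    ("Navigating Windows", "Difficult to Easy"): [4, 5, 6, 4, 5, 5, 5],
}

# Mean average per cell, rounded to two decimal places as in Table 4.
means = {k: round(mean(v), 2) for k, v in ratings.items()}

highest = max(means, key=means.get)
lowest = min(means, key=means.get)
print(f"Highest: {highest} = {means[highest]}")
print(f"Lowest: {lowest} = {means[lowest]}")
```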

Image 4: CLaSS software Windows tutorial screen.

Summary of the cognitive walkthrough:

By using a walkthrough technique it was possible to analyse a user working through each tutorial within the software for the first time. Newman & Lamming (1995) highlighted the benefits of this process:

“Walkthrough analyses often tells us a great deal about the design, the ease of learning and the likelihood of user errors”.

(Newman & Lamming, 1995).

It would have been preferable to conduct the cognitive walkthrough with several of the usability team instead of the one user’s interaction that was analysed, but this was deemed unworkable due to the amount of time each experiment takes (approximately 4 hours) and the time that would be needed to transcribe each one (approximately 8 to 16 hours per experiment). Usability team member J (a qualified female nurse with self-confessed poor computer skills) agreed to undertake the walkthrough and consequently was not given access to the software with the rest of the team. The rationale for excluding this member from access to the software prior to the walkthrough was to see how she dealt with the software when seeing it for the first time.

The software developer monitored the walkthrough session, making section notes and numbering where the user was in the software environment. It was made clear to the user that the software developer’s role was purely to monitor events and not to guide her through the software. It was decided to videotape the walkthrough session, and although the user was initially nervous at being filmed she soon settled into her role.

There were several important issues raised by the cognitive walkthrough that hadn’t arisen from the usability questionnaires.

  1. Introduction: The user was confused with one element of the keyboard tutorial introduction as to whether she should press the spacebar on screen with the mouse or press the spacebar on the real keyboard to navigate to the next screen.
  2. Keyboard Tutorial: An interactive exercise to use Alt and Tab together to toggle between programs didn’t work for the user because the CLaSS software was the only program open.
  3. Navigating Windows Tutorial: The user found working through the Navigating Windows tutorial very confusing and frustrating. The tutorial asks the user to switch between the CLaSS software and the real Windows environment; this needed to be simplified. This problem was hinted at by some members of the usability evaluation team in their questionnaires, but the scale of the problem could only be seen through a cognitive walkthrough.
  4. Database Tutorial: The timed screens jumped or changed before the user was ready, causing confusion.
  5. Internet Tutorial: The user felt there was too much information on some screens and that the information was too technical.
  6. Overall Reaction to CLaSS software: The user felt the software succeeded in its aims but would not recommend completing all five tutorials in one session, as she felt 'information overload' set in towards the end of the session. She felt the leap in skills ability from the mouse tutorial to the Windows tutorial was too large and would certainly 'put some people off'. She felt that she would use the Internet tutorial and 'bits of' the Windows tutorial again, but not the mouse and keyboard tutorials, as she now felt competent in those areas.

Image 5: CLaSS software database tutorial screen.

Evaluation Conclusions and Recommendations:

The comments from the questionnaires and the cognitive walkthrough session were collated into a list of recommendations, most of which were implemented in the software prior to the first (12-monthly) software update.

Software Constraints:

The aim within the CLaSS software has been to “keep the locus of control with the user” (Hix & Hartson, 1993). It is particularly important that the user feels in control at all times and this has been maintained throughout.

Constraints such as a 640 x 480 screen window, rather than full screen, were chosen to help the inexperienced user avoid feeling 'lost in the software'. The user can see the Start button, taskbar and desktop behind the software window, giving the feeling of being only partially immersed in the software environment.

Colour and Capitalisation:

Some members of the usability team commented negatively on the use of colour within the software, and changes have been made throughout to give clarity to its use. A specific blue has been used for all major navigation buttons, important text messages and instructions. In response to the team's comments, the design recommendations of the Royal National Institute of the Blind (RNIB, 2001) have been adopted throughout the revised software. The RNIB recommend that 'Capitalisation of whole sentences should be avoided, as it is not easy to read sentences written in capitals' (RNIB, 2001).

This issue has particular implications for dyslexic students, and capitalisation has therefore been avoided throughout the software except in speech bubbles, where the comic-book convention of using capitals has been retained.

To address any problems users may have with this remaining capitalisation, whenever important information is delivered in capitals the text is duplicated so that the dyslexic user may read it more easily.

Image 6: CLaSS software database tutorial screen.

Visibility, affordance and feedback:

Don Norman's general design principles of 'visibility, affordance and feedback' (1988) have been applied with differing degrees of success within the software. One good example of these design principles is the round blue navigation button used throughout the software. The button, created in the Macromedia Flash 5 authoring software, is raised, changes colour when the mouse rolls over it, and changes colour again with a slight movement when pressed; all of these properties follow Norman's principles. In contrast, the coloured hypertext 'submenu' links in the Windows, Database and Internet tutorials fail to comply with Norman's principles, and consequently the user needs to be told that each is a link.
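The feedback behaviour of the navigation button described above can be sketched as a simple state-to-view mapping. This is a hypothetical modern re-creation in TypeScript, not the original Flash 5 implementation, and the colour values are illustrative placeholders rather than the CLaSS palette:

```typescript
// The three interaction states a user can put the button into.
type ButtonState = "idle" | "hover" | "pressed";

// What the user actually sees for a given state.
interface ButtonView {
  colour: string;   // fill colour of the button
  offsetY: number;  // pixels of downward movement (signals "pressed")
}

// Map each state to its visual presentation, following Norman's principles:
// the hover colour change gives visibility of the mouse position, and the
// pressed colour change plus slight movement gives feedback that the
// press has registered.
function render(state: ButtonState): ButtonView {
  switch (state) {
    case "idle":    return { colour: "#2255aa", offsetY: 0 };
    case "hover":   return { colour: "#3377cc", offsetY: 0 };
    case "pressed": return { colour: "#113366", offsetY: 2 };
  }
}
```

The point of the sketch is that every state is visually distinct: a button that rendered identically in all three states would, like the plain hypertext submenu links, give the user no cue that it is interactive.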

Usability Audit:

The planned usability audit cycle based on the System Development Life Cycle (SDLC; see Figure 1) (Parsons & Oja, 1996) has been continued. The audit cycle initially involved quarterly student evaluations and upgrades; at the end of the first year this changed to half-yearly evaluations and upgrades as and when appropriate.


The initial objectives of this project were to identify the level of computer skills of the target student group and then, if viable, to develop the CLaSS software as a tool to assist students. The research results clearly showed that students would benefit from such a tool. In an attempt to balance the student learning experience, CLaSS software has been designed, developed and rigorously tested according to the strict development protocols outlined within this paper. Educators, nurses and students have successfully evaluated the resulting software, and the analysed data supported the validity of the software. It is hoped that CLaSS software will continue to develop and will improve and assist students who need computer and information literacy support.

A web demo version of CLaSS software can be seen at


Adams, A. & Duffield, C. (1991). The value of drill in developing and maintaining numeracy skills in an undergraduate nursing programme, Nurse Education Today. 11, 213-219.

Anthony, D. (1997). Computer networks in the NHS and academic sector, Nursing Standard [on-line]. 11 (18), 34-38. Available from: [Accessed 29th January 2004].

Armstrong, M. (1989). Computer competencies identified for nursing staff development, Journal of Nursing Staff Development, 5 , 187-191.

Ausubel, D. (1968). Educational Psychology, A Cognitive View. New York: Holt, Rinehart and Winston, Inc.

Blais, K., & Bath, J. (1992). Drug calculation errors of baccalaureate nursing students. Nurse Educator, 17 (1), 12-15.

Blythe, J., & Royle, J.A. (1993). Assessing nurses' information needs in the work environment. Bulletin of the Medical Library Association, 81 (4), 433-435.

Bowen, B. (2001), Educational Psychology - David Ausubel [online], Available at: ausubel.htm/ [Accessed 19 Jan 2004]

Brannen, J. (Ed.). (1995). Mixing Methods: Qualitative and Quantitative Research. Aldershot: Avebury.

Brooke, J. (1986). SUS – A "quick and dirty" usability scale. User Information Architecture A/D Group, Digital Equipment Co. Ltd. Available from: [Accessed 4th Feb 2004].

Chambers, M., & Coates, V. (1990). Computer training in nurse education: a bird's eye view across the UK. Journal of Advanced Nursing, 15, 16-21.

Cole, I., & Kelsey, A. (2003). Computer and information literacy in post-qualifying education, [online]. Nurse Education in Practice, In Press, Corrected Proof, Available at: [Accessed 4th Feb 2004]

DfES, Department for Education & Skills, (2004). Key skills web site. [online]. Available at: [Accessed 4th Feb 2004].

Edwards, A., Matthews, E., Pill, R., & Bloor, M., (1998). Communication about risk: the responses of primary care professionals to standardizing the `language of risk` and communication tools. Family Practice, 15 (4), 296-300.

Ferren, A. S. (1993). General Education Reform and the Computer Revolution. The Journal of General Education, 42 (3), 164-177.

Gassert, C.A., & McDowell, D. (1995). Evaluating graduate and undergraduate nursing students' computer skills to determine need to continue teaching computer literacy. In Greenes, R.A., Peterson, H.E., & Protti, D.J. (Eds). Medinfo'95 Proceedings of the Eighth World Congress on Medical Informatics, Vancouver. pp. 1370. Edmonton: Healthcare Computing & Communications Canada Inc.

Grant, R. (1995). Report of the Working Group on Student Access to Computers. In Reid, I. (1997). Computer Literacy in Higher Education. ASCILITE97 (The Australian Society for Computers in Learning in Tertiary Education) conference proceedings. [online]. Perth, Western Australia, 7-10 December 1997. Available at: [Accessed 16 August 2004].

Graveley, E.A., Lust, B.L., & Fullerton, J.T. (1999). Undergraduate Computer Literacy - Evaluation and Intervention. Computers in Nursing, 17 (2), 166-170.

Hix, D., & Hartson, R.X. (1993). Developing user interfaces: ensuring usability through product & process. New York: Wiley.

Johnson, B., & Webber, S. (2000). Towards the Information Literate Graduate: Rethinking the Undergraduate Curriculum in Business Studies. In: Appleton, K., Macpherson, C., & Orr, D. (eds). Lifelong Learning Conference: Yeppoon, Queensland, Australia: 17-17 July 2000. pp. 194-202. Rockhampton: Lifelong Learning Conference Committee.

Jones, R.B., Navin, L.M., Barrie, J., Hillan, E., & Kinane, D. (1991). Computer literacy among medical, nursing, dental and veterinary undergraduates. Medical Education. 25 (3), 191-195.

Kreitzberg, C. (1996). Managing for usability. In Alber, F.R. (Ed), Multimedia: A Management Perspective. Belmont, CA: Wadsworth, pp. 65-88.

Leland, B (Chair)., Dallinger, J., DeVolder, D., Isele, F., Kaul, T., Mathers, R., Murphy, J., Radlo, S., & Stierman, J. (2000). Report of the Computer Competency Committee, [online]. Western Illinois University. Available at: [Accessed 13th August 2004].

Macaulay, L. (1995). Human-Computer Interaction for Software Designers. London: International Thomson Publishing.

DfEE (1999). A Fresh Start - improving literacy and numeracy (The Moser Report). [online]. Ref: CMBS 1. [Accessed 16th August 2004].

National Committee of Enquiry into Higher Education (NCEHE). (1997). Higher Education in the Learning Society (The Dearing Report). [online]. Available at: [Accessed 16th August 2004].

Newman, W.M., & Lamming, M.G. (1995). Interactive System Design. Harlow: Addison-Wesley Publishers Ltd.

Norman, D.A. (1988). The psychology of everyday things. New York: Basic Books.

Norman, D.A., & Draper, S. (Eds.), (1986). User Centered System Design: New Perspectives on Human-Computer Interaction . Hillsdale, NJ: Lawrence Erlbaum Associates.

Nursing & Midwifery Council (NMC), Circular 18, (2003). [online]. Interim advice and guidance on the introduction of entry requirements for pre-registration nursing and midwifery education programmes. [Accessed 16th August 2004].

National Research Council Computer Science and Telecommunications Board, (NRCCST 1999). Being Fluent with Information Technology, Washington, D.C.: National Academy Press.

Parsons, J.J., & Oja, D. (1996). New Perspectives on Computer Concepts. Cambridge, MA: CTI.

Reid, I. (1997). Computer Literacy in Higher Education. ASCILITE97 (The Australian Society for Computers in Learning in Tertiary Education) conference proceedings. [online]. Perth, Western Australia, 7-10 December 1997. Available at: [Accessed 16 August 2004].

Royal National Institute of the Blind. (RNIB 2001). Get the message online: Making Internet shopping accessible to blind and partially sighted people. Campaign Report 15. ISBN 1858784506.

Sabin, M. (2002). Competence in Practice-Based Calculation: Issues for Nursing Education. [online]. London: LTSN. [Accessed 17 August 2004].

Saranto, K., & Leino-Kilpi, H. (1997). Computer Literacy in Nursing: developing the information technology syllabus in nursing education. Journal of Advanced Nursing, 25, 377-385.

Schwartz, L.M., Woloshin, S., Black, W.C., & Welch, H.G. (1997). The role of numeracy in understanding the benefit of screening mammography. Annals of Internal Medicine, 127 (11), 966-972.

Shneiderman, B. (1998). Designing the User Interface. Reading, Massachusetts: Addison Wesley Longman, Inc.

SKSAIHE. (2001). Supporting Key Skills Achievement in Higher Education web site, [online] Available at: [Accessed 28 July 2003].

Steen, L.A. (1991). Numeracy. In S.R. Graubard (Ed.) Literacy: An overview by fourteen experts. New York, NY: Hill and Wang.

Topp, H., & Kinn, S. (1999). The use of learning technology in nurse education - a survey of Scottish and Welsh nurse educators. ITIN, 11 (4), 6-9.

Verhey, M.P. (1999). Information Literacy in an Undergraduate Nursing Curriculum: Development, Implementation, and Evaluation. Journal of Nursing Education, 38 (6), 252-259.

Wilkinson, J.M. (1996). Nursing Process - A Critical Thinking Approach. Harlow: Pearson Education.

Woodrow, M. (2000). Widening Participation: What's It Really All About? [online]. Available at: [Accessed 27 January 2004].

Author Bio

Ian J Cole, MSc DipCom (Open)

Mr. Cole is a Lecturer in Information & Communication Technology, in the Department of Health Sciences at the University of York.