Functional requirements

The detailed requirements that make up your Invitation to Tender (ITT) and further testing plan are likely to focus on end-user functionality and technical requirements. You need to find the level of detail that is right for your institution.

Very often when asked about requirements, people delve straight into the detail of how a system should work without stopping to reflect on broader issues about what they really want to achieve and how this will impact the learning experience. In many cases what people tend to describe is the way they do things at the moment with some minor improvements. There is more on this topic in the section on 'Planning for the future'.


"There is no subjectivity in this process so you have to be really, really careful that you write down exactly what you want because what you've written down and agreed is exactly what you are scoring.

                      Correy Murphy, Blended Learning Co-ordinator, Glasgow School of Art 

 

How much detail do you need?

There is no right answer. Our research for this Toolkit found requirements specifications covering the full spectrum from fewer than 100 to c.2,000 separate items.

One university in favour of considerable detail was the University of Huddersfield, which specified around 2,000 separate items. They explained that this level of detail has to be viewed in the context of an institution with a very strong focus on learning technology and some very innovative teaching.

The University of Sussex found that c.150 criteria were about right for them, particularly as their existing VLE was not feature-rich and they wanted to be very open to new ways of doing things.

Some universities feel that taking too granular an approach makes the scoring and evaluation process very difficult. On average, universities seem to identify c.150-200 separate requirements, although often with multiple criteria grouped under a single heading.

A few examples from our research

[Figure: example numbers of VLE requirements specified]


"The more criteria you have the harder the evaluation becomes so make sure your specification is not too granular.

                      David Walker, Head of Technology Enhanced Learning, University of Sussex 

 

VLE functional areas: University of Huddersfield example

The University of Huddersfield identified 14 different functional areas of their VLE and created a sub-team to look at each area. The areas were:

  • usability and accessibility;
  • content upload and delivery;
  • assessment;
  • communications and time management;
  • collaboration and social affordances;
  • integrations, interfaces and data feeds;
  • migration and rollover;
  • learning analytics and institutional reporting;
  • VLE entry page;
  • module management and module menus;
  • portal functionality;
  • maintenance and support;
  • technical;
  • project services (implementation and consultancy services included in the tender price).

Project sub-teams put together detailed requirements for each of these functional areas, resulting in more than 2,000 functional requirements.

Each of these was then graded as essential, highly desirable or desirable. The project manager told us that, if doing this again, they would prefer to add a level so that the requirements were prioritised as: mandatory; critical; highly desirable; desirable.
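To make the grading concrete: one common way to turn graded requirements into an evaluation score is to weight each priority band and multiply by how well a product meets each item. The sketch below is purely illustrative; the band weights and the 0-2 scoring scale are assumptions for the example, not Huddersfield's actual method.

```python
# Illustrative only: the band weights and 0-2 scores are invented for this sketch.
WEIGHTS = {"mandatory": 10, "critical": 5, "highly desirable": 3, "desirable": 1}

def weighted_score(requirements):
    """Sum (band weight x assessor score) over all graded requirements.

    Each requirement is a (priority_band, score) pair, where the score
    might be 0 = not met, 1 = partially met, 2 = fully met.
    """
    return sum(WEIGHTS[band] * score for band, score in requirements)

product_a = [("mandatory", 2), ("critical", 1), ("desirable", 2)]
print(weighted_score(product_a))  # 10*2 + 5*1 + 1*2 = 27
```

Whatever the exact weights, the point of adding a fourth band is that a genuinely mandatory requirement can act as a gate rather than just a heavily weighted score.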

 


"Requirements definition takes a long time and you need to achieve the right level of granularity that suits your institution and will support existing and innovative practice.

                      Andrew Raistrick, Business Analyst and Project Manager, University of Huddersfield 

 

VLE functional areas: University of York example

The requirements specification produced by the University of York in 2013 has been adapted by a number of other universities. Here we give a rough indication of the number of separate criteria in each section. You can find a copy of the full document in the resources section.

University of York: number of criteria

Heading                                                          No. of items
Pedagogical Functionality
- Content                                                                  53
- Assessment                                                               35
- Communications                                                           29
- Course Management                                                        25
System Functionality
- Administrator Requirements                                                6
- Integrations and Interoperability                                         9
- Technical Requirements                                                   31
Service Requirements
- Future Developments
- Deployment (Implementation, Migration Support and Training)               7
Total                                                                     198

 

How do you define 'usability'?

Most of the contributors to this Toolkit told us that usability is a critical factor in selecting a VLE. However, defining what you mean by usability is not always easy.

 


"My language when I was initially writing about usability just wasn't tight enough. Words like 'robust' are subjective and meaningless. Specific examples are necessary.

                      Correy Murphy, Blended Learning Co-ordinator, Glasgow School of Art 

 

The Scottish universities and colleges procurement consortium known as APUC (Advanced Procurement for Universities and Colleges) set up a framework agreement for VLE supply and found that clarity was essential when working as a group to define requirements.

The Glasgow School of Art was a pilot institution for the framework. Correy Murphy, Blended Learning Co-ordinator at the School, gave us some tips:

  • you need clearly defined criteria based on knowledge of existing good practice; in other words, don't just look at the systems and then reverse-engineer what you wanted;
  • ensure the language you use is precise and accompanied by examples, rather than subjective and meaningless;
  • make your definitions explicit, e.g. 'easy-to-use' means three clicks or fewer to undertake a task;
  • consistent navigation is an important factor in usability;
  • give guidance to your assessors on precisely what they have to do and check whether they were able to do it; they need to be able to respond:
    - yes/no: could they actually do the task? and
    - give a lower score to a particular product if the task took longer, needed more clicks or was confusing (see the sketch after this list).
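Murphy's yes/no-plus-penalty approach can be encoded as a simple rubric. The sketch below is one possible encoding: the three-click threshold reflects the School's definition of 'easy-to-use' quoted above, but the starting score and penalty values are invented for illustration.

```python
def task_score(completed, clicks, confusing):
    """Score one assessor task: 0 if the task could not be completed;
    otherwise start from 5 and deduct (invented) penalties."""
    if not completed:
        return 0                      # the yes/no gate: failure scores nothing
    score = 5
    score -= max(0, clicks - 3)       # one point per click beyond the 3-click target
    if confusing:
        score -= 1                    # flat penalty if the assessor found it confusing
    return max(score, 1)              # a completed task keeps at least 1 point

print(task_score(True, clicks=5, confusing=True))    # 5 - 2 - 1 = 2
print(task_score(False, clicks=2, confusing=False))  # 0
```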

 

Views on usability

"Usability is the perception of the users not a quantitative analysis of the performance of the system.

                      Andrew Raistrick, Business Analyst and Project Manager, University of Huddersfield 

"It's very obvious if the system has an e-portfolio or calendar component or if it doesn't. But when you get to usability you have to spell out exactly what you mean by something like 'easy-to-use': we defined that as three clicks or less to do an activity." 

                      Correy Murphy, Blended Learning Co-ordinator, Glasgow School of Art 

 

A 'quick and dirty' tool for measuring usability

The System Usability Scale (SUS) method is often described as a 'quick and dirty' tool for measuring usability, although it has been in use since 1986 and has been found to be very reliable.

Participants are asked a set of 10 questions, to which they respond on a five-point scale from strongly agree to strongly disagree. The questions are:

[Figure: the 10 SUS questions for measuring usability]

 

The tool is easy to administer, although the scoring is somewhat complex. You end up with a score out of 100 that needs to be interpreted in terms of its percentile ranking: a score of 68 is average, so a score of 70 is slightly above average but not outstanding.
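The standard SUS arithmetic is public: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the total is multiplied by 2.5 to give a score out of 100. A minimal Python version (the function name is ours) might look like this:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 responses.

    Odd-numbered items are positively worded (contribution = response - 1);
    even-numbered items are negatively worded (contribution = 5 - response).
    The summed contributions (0-40) are scaled by 2.5 to a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # 80.0, comfortably above the 68 average
```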

The SUS approach is used quite frequently in VLE procurement elsewhere in Europe. You can find more information about SUS and how to use it in the resources at the end of this section.