Conducting the evaluation

 


 

A well planned evaluation should give you all the information you need to make the right decision. However, you will need to think through the logistics to make sure that everything runs smoothly on the days concerned:

  • don't underestimate how tiring these sessions will be for the assessors and make sure they have regular breaks to maintain concentration;
  • allow time for assessors to sum up their thoughts after each session and at the end of each day;
  • ensure catering and refreshment breaks fit appropriately with your evaluation schedule;
  • consider having a facilitator to keep things on track.

 

 
Top Tips 

 

Creating a good scoring rubric and giving detail on how you are assessing features can help protect you in the event of a challenge.
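As an illustration only, a rubric along these lines, with a weight for each criterion and descriptors for what each mark means, makes it easier to evidence afterwards how scores were awarded. The criteria, weights and 0-4 scale in this short Python sketch are hypothetical:

    # Hypothetical rubric: the criteria, weights and 0-4 scale are
    # illustrative only, not a recommended set.
    DESCRIPTORS = {0: "not demonstrated",
                   2: "demonstrated, but with workarounds",
                   4: "fully demonstrated as specified"}
    WEIGHTS = {"Assignment handling": 0.6, "Reporting": 0.4}

    def weighted_total(scores):
        """scores: criterion -> 0-4 mark awarded by one assessor."""
        return sum(WEIGHTS[criterion] * mark for criterion, mark in scores.items())

    print(weighted_total({"Assignment handling": 3, "Reporting": 2}))  # 2.6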

 

 

Facilitator role

A facilitator, ideally somebody who is not actually scoring the products, can be very helpful in keeping the evaluation on track. Tasks associated with this role might include:

  • ensuring that each session sticks strictly to time;
  • ensuring each session is proceeding at the correct rate so that all elements will be covered - you will not be in a position to make fair comparisons between suppliers if each is allowed to spend extra time on their strengths and skate over weaknesses when time runs out;
  • managing the questioning of your own assessors to ensure that the evaluation doesn't get 'hijacked' by somebody with an interest in one particular area;
  • checking with your team whether a point has been adequately covered and moving the supplier on or requesting further explanation as necessary;
  • collating score sheets at the end of each session and checking for missing or anomalous results - if several people have been unable to evaluate a point due to insufficient information, or if the scores of individual team members differ greatly, this can highlight areas which should be followed up whilst the supplier is still on site (a simple way of spotting these is sketched after this list);
  • facilitating the final summing up of scores - there are bound to be some genuine and valid differences of opinion about the products and it is worth exploring the reasons for these differences rather than resorting to a simple average score which might give a compromise solution that isn't a best fit in any area.
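A minimal Python sketch of how that collation check might look, assuming each assessor's sheet records a 0-4 mark per criterion and leaves a blank (None here) where they had too little information; the assessor and criterion names are placeholders:

    from statistics import mean

    # One score sheet per assessor: criterion -> 0-4 mark, or None where the
    # assessor had too little information to score the point.
    sheets = {
        "Assessor 1": {"Assignment handling": 3, "Reporting": 2},
        "Assessor 2": {"Assignment handling": 3, "Reporting": None},
        "Assessor 3": {"Assignment handling": 3, "Reporting": 2},
    }

    for criterion in ["Assignment handling", "Reporting"]:
        marks = [sheet[criterion] for sheet in sheets.values()]
        blanks = marks.count(None)
        valid = [m for m in marks if m is not None]
        spread = max(valid) - min(valid) if valid else 0
        # Blanks suggest a point wasn't covered; a wide spread suggests the
        # assessors read the demonstration very differently. Either is worth
        # raising whilst the supplier is still on site.
        if blanks or spread >= 2:
            print(f"Follow up '{criterion}': {blanks} blank(s), spread {spread}")
        else:
            print(f"'{criterion}': average mark {mean(valid):.1f}")
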
Top Tips 

 

There is a lot to take in: you may wish to make an audio recording of the sessions in case you can't remember the answer to a question or there are differing interpretations of what was said. It can also help to tone down some of the more optimistic sales promises if the supplier knows you have a full record of the discussions! You should, however, make sure you have the agreement of everyone in the room before recording.

 

 

 


The University of Huddersfield is a good example of applying what we know about good assessment practice to selecting a VLE.

The University had four sets of usability tests: basic and advanced academic staff use, student use and use for administrative tasks.

The vendors had to commit to this testing as part of their tender and it involved a lot of preparation on their part. There was a detailed specification of modules to be set up, content to be uploaded and structure for what this had to look like. The University defined the dataset to be used and the testers followed a prescribed script.

Students were involved in the usability testing and were offered Amazon vouchers for participating.

The format of the staff and student tests was very similar: they had to carry out basic tasks on each of the VLEs (1 hour per VLE) and then complete a survey. The basic tasks involved uploading material, participating in discussion boards and so on, and there were more advanced tasks for staff. The academic staff had to give up three full days (one day per VLE).

The University took care to remove any bias from the usability testing: for example, the students were split into three groups and each group tested the VLEs in a different order, so there was no bias as to which system the students saw first or last.
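As a small illustration of that kind of counterbalancing, assuming three shortlisted systems and three student groups (the names are placeholders), a simple rotation gives each group a different running order:

    # Rotate the order in which each student group tests the systems so that
    # no VLE is always seen first or last (placeholder names).
    vles = ["VLE A", "VLE B", "VLE C"]
    groups = ["Group 1", "Group 2", "Group 3"]

    for i, group in enumerate(groups):
        order = vles[i:] + vles[:i]  # shift the starting point by one per group
        print(group + ":", " then ".join(order))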


Scottish universities and colleges are part of a procurement consortium known as APUC (Advanced Procurement for Universities and Colleges). The group has set up a framework agreement for VLE supply and Glasgow School of Art acted as the pilot institution for this.

When it came to the testing stage, there were two testing groups:

  • The framework committee;
  • Glasgow School of Art staff and students.

Each supplier was provided with a set of resources of different types (e.g. quiz, SCORM package, assignment, video) and asked to build a course using all of them, creating an instance for each of the testing groups.

The tests did not involve use of actual institutional course or student data, but suppliers were asked for details about data migration. The testers were charged only with looking at functionality, not with other considerations such as cost.