Ensure you evaluate
A vital step in any learning technology adoption project is a thorough, unbiased evaluation of the solutions available in the market against organisational requirements. Too often I have seen learning technology implementation projects fail due to improper or uninformed options analysis and evaluation, or worse, because no such analysis was ever undertaken. The steps below walk through one approach to evaluating learning technology solutions.
1. Find your stakeholders
A fair evaluation should involve all relevant stakeholders early in the project. Where possible (and appropriate), an organisation-wide learning technology solution should be selected. This means undertaking an organisation-wide evaluation project that pulls together stakeholders, end users and learning technology experts from across the business. Where the end solution will be used by learners, the evaluation should ideally include one or more student perspectives as well.
2. Compile organisational requirements
Why are you completing the learning technology evaluation? Why is your organisation looking at a new technology solution? These are important questions to ask yourself and your stakeholders. Documenting the answers can help you define some of the high-level project objectives.
Once you have your high-level objectives, you then need to drill down to determine every possible feature, function, educational and technical requirement for a potential technology solution.
Ideally, by the time you have finished compiling your requirements and run them past your stakeholders, you should have a solid list of items. When I have undertaken such requirements analysis in the past, I have ended up with lists anywhere from 80 to 120 items long, but this will, of course, vary by organisation and learning technology project.
3. Weighting is key
Not all requirements are created equal. Some will be must-haves (e.g. accessible via any modern browser), while others will be nice to have (e.g. the ability to video chat while completing an online assessment).
This is where weighting comes in. By giving each requirement a weighting from 1 (nice to have) to 3 (essential), you can highlight which requirements are most important. This weighting also comes in handy when completing the overall analysis of a learning technology solution, as the final score is adjusted based on the weightings. I go into weighting in more detail in the “Tally the results” section of this article.
Which requirements are important will vary greatly depending on your organisational needs.
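As a minimal sketch of how a weighted requirements list might be captured, the snippet below pairs each requirement with its 1–3 weighting. The requirement names here are illustrative examples, not taken from any real evaluation:

```python
# Illustrative weighted requirements list (names are made up for this
# example): 3 = essential, 2 = important, 1 = nice to have.
requirements = [
    ("Accessible via any modern browser", 3),       # must-have
    ("Single sign-on with existing accounts", 2),
    ("Video chat during online assessments", 1),    # nice to have
]

# Sorting by weighting (highest first) surfaces the essentials at the top.
for name, weight in sorted(requirements, key=lambda r: -r[1]):
    print(weight, name)
```

Keeping the weightings alongside the requirements from the start makes the tally step later on a simple mechanical calculation.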
4. Research the technology landscape
What learning technology solutions are out there? Too often I see an option chosen because somebody has heard of the technology before or it has previously been used in the organisation. These could be great learning technology solutions, but without proper analysis of the other options available in the market, you limit your chances of finding the best educational solution.
Once you have surveyed the potential technologies in the market, use your knowledge (and that of other experts around you) to narrow the field to three or four options for in-depth evaluation.
5. Evaluate your options (and ensure others evaluate too)
With your requirements list developed and your shortlist of learning technologies selected, you are ready to complete your in-depth evaluation. Using any demo environments or test systems you have access to, plus any vendor documentation, evaluate each requirement thoroughly and give it a score out of ten (five meaning the solution meets the requirement at the most basic level, ten meaning it far exceeds organisational requirements). By scoring each requirement out of ten, rather than simply stating whether a potential solution has a particular feature, you can evaluate each technology more fairly. Just because a learning technology has the feature you are looking for does not necessarily mean it is easy to access or easy to use. A score lets you differentiate between good and bad feature implementations (or note where the feature simply doesn’t exist at all).
Ideally, the formal evaluation of possible technology solutions should be completed by as many individuals and stakeholders as possible. However, this can be a time-consuming process when comparing against so many criteria, and some elements require a certain level of technical expertise to score appropriately. Because of this, having the requirements scored by two or three people is usually enough to get an indication of a technology’s suitability.
6. Tally the results
Once an evaluation of each technology has occurred, you can use the weightings set earlier to calculate an overall percentage fit against organisational requirements. To do this, first determine the maximum weighted score for each requirement (e.g. a requirement with a weighting of three has a maximum score of 30). Then add these weighted maximums together to get the maximum possible total score (i.e. the score if every requirement scored 10 out of 10).
Next, determine the actual weighted score for each requirement by multiplying the score out of 10 by the weighting (e.g. a requirement scored 8 out of 10 with a weighting of 2 gives a weighted score of 16). Add up all the weighted scores to determine the total score for a particular technology solution.
To determine a learning technology’s percentage fit against organisational requirements, simply divide its total score by the maximum possible score, then multiply the result by 100.
Of course, if you use a spreadsheet to automate these calculations then you will only need to set them up once.
Repeat this for each technology solution being evaluated.
When all evaluations have been completed, collate each individual’s findings and average the percentages for each technology solution. This gives you a fair evaluation percentage to use when presenting findings to your organisation.
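The tally steps above can be sketched in a few lines of code. This is a minimal illustration, assuming each evaluator’s results are stored as (weighting, score) pairs; the function names and sample numbers are my own, not from any particular spreadsheet:

```python
def percentage_fit(scored_requirements):
    """Percentage fit for one evaluator's scores on one solution.

    scored_requirements: list of (weighting, score) pairs, where
    weighting is 1-3 and score is out of 10.
    """
    # Maximum possible total: every requirement scores 10 out of 10,
    # e.g. a weighting of 3 gives a maximum weighted score of 30.
    max_total = sum(weight * 10 for weight, _ in scored_requirements)
    # Actual total, e.g. a score of 8 with a weighting of 2 gives 16.
    actual_total = sum(weight * score for weight, score in scored_requirements)
    return actual_total / max_total * 100

def average_fit(all_evaluators):
    """Average the percentage fit across every evaluator's scores."""
    fits = [percentage_fit(reqs) for reqs in all_evaluators]
    return sum(fits) / len(fits)

# One evaluator scoring three requirements (weighting, score out of 10):
one_evaluator = [(3, 8), (2, 8), (1, 5)]
print(percentage_fit(one_evaluator))  # 45 of a possible 60 -> 75.0
```

As the article notes, a spreadsheet does this just as well once set up; the point of the sketch is simply that the whole tally reduces to two weighted sums and a division, repeated per solution and averaged per evaluator.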
7. Consider a pilot project
For the selected learning technology, the last step before a formal business case and potential implementation project is a small pilot project with end users (staff, teachers, learners, etc). Even the most robust evaluation may not pick up every issue with implementing and using a learning technology. A small pilot lowers the risk of overall project failure and allows problems to be found and resolved before moving forward with a large-scale technology roll-out.
Cost is important too
Realistically, while it is great that an evaluation of a specific learning technology returns a 98% match to organisational requirements, if the solution far exceeds the project budget (e.g. the solution will cost $1,500,000 over three years, but your budget is only $200,000), you may have to return to the research phase and determine whether any alternative options would equally meet organisational needs.
On the other hand, if the cost of the learning technology solution only slightly exceeds the budget, but the evaluation has shown it will best meet organisational and, even more importantly, learner needs, then the in-depth analysis you have performed is an excellent base from which to develop a formal business case showing the business why additional funding is required.
Evaluate to improve project outcomes
At the end of the day, this is only one approach to learning technology evaluation; the main point is to ensure that a formal evaluation process occurs. This gives you the best chance of successful learning technology adoption, an enhanced learner experience and improved learning outcomes.