As the adage goes, a system is only as good as the data you put into it.
Its success also depends on the effort spent developing and maintaining it. This is where Su Thomas comes in. She's worked tirelessly to help develop our 'Evaluation Platform', a system that is having an enormous impact on our ability to assess performance in real time.
We’ve been using the system to support the delivery of all outcomes initiatives, including benchmarking, 360 reviews, competency framework assessments, facilitated development, assessment events and much more.
Su explains:
“The platform gives a holistic view of all projects. We can prepare all of the agendas and arrange the logistics of each session, such as timeframes and the number of evaluations per person. Assessors are able to complete evaluations during an event. This allows us to provide live updates which demonstrate how performance against the model is progressing and where further improvements can be made. It all helps to enrich the customer and participant experience.”
Outcomes delivered inside 48 hours
Post-event, Su uses data collected in the platform to generate event reports and works with the rest of the outcomes team to supply recommendations based on the outcomes delivered, all inside 48 hours.
We can even generate individual event certificates or benchmark reports tailored to the customer’s specification and branding, which are sent directly via the platform. Managers can also receive reports relating specifically to a cohort of employees.
Each assessor has a unique link allowing them direct access to evaluations relating only to the participants assigned to them, eliminating the need for assessors to select their participants from dropdown boxes.
Information was previously stored on a project-by-project basis, and often in multiple locations. Now, everything is consolidated in one place, meaning there’s less of a burden on Su and less of a demand on our coaches. Learner data is also easily accessed via the learning log. It all means we are improving the standard, accuracy and value of the feedback given to participants.
“Monitoring the progress of projects has become so much easier,” says Su. “I no longer rely solely on coaches updating me with progress reports or evaluation forms. I can instantly see which evaluations have been submitted and by whom. And it’s all live.”
Less hassle and less likelihood of mistakes
The Evaluation Platform allows us to frame any given set of questions or behaviour indicators, with the added benefit of supporting multiple languages per project. All of this can be stored for future use.
Take a recent development centre that incorporated presentations, benchmarking and group interaction sessions over a three-day period. With a typical tool such as SurveyMonkey, a coach would be presented with a different hyperlink for every set of questions.
By configuring the platform, we were able to provide one consistent hyperlink across all three days, seamlessly changing the set of questions for each particular session. It all means less hassle for the assessor and less likelihood of mistakes occurring.
And it’s that kind of flexibility which offers further exciting opportunities for the future.
“We’re currently developing a new process that will automatically generate a notification whenever the system is updated, which means a coach no longer has to contact us directly by email to confirm an evaluation is complete, further streamlining the workflow,” adds Su. “Previously, many of our coaches experienced frustration with the old Benchmarx system, with screens freezing and data losses. The Evaluation Platform has a built-in feature which allows the assessor to save an evaluation part-way through and come back to it at a later date to pick up where they left off.
“We have the flexibility to develop the platform even further to seamlessly support the outcomes initiatives, and we welcome any suggestions which enhance the user experience.”