Putting artificial intelligence at the service of learning

13 December 2022
During our daily training missions and meetings with HR decision-makers, we realize that training is often the element of Human Resources that is hardest to evaluate, particularly when it comes to quantifying the real transfer of learning to the field. However, as we know, competence is more a process than a state, and developing it requires several successive elements:
- a theoretical basis,
- concrete application in a real situation,
- feedback on the learning, which closes the learning loop.
This learning becomes a skill through regular questioning of practice. The traditional training modules we knew before certainly offered this possibility (provided the company and its employees had done everything possible to encourage this transfer). However, Covid and the rise of “remote everything” changed everything.
During the pandemic, people became interested in using their time wisely, and the main way they found to optimize it was to go online to learn and grow. As a result, hundreds of new companies, and as many methods for achieving the holy grail of skills development, have flourished in recent years, and investment in training technology companies has exploded. This abundance of solutions creates increased competition and requires an online platform to retain its users. To do this, it must achieve its ultimate goal: that users learn something that will be useful to them. Achievable… but not too easy.
What are the success factors for distance learning?
The conclusion that can be drawn from these three years, two of which were spent under Covid, is the following: adapting to the learner and staying close to him or her are two essential ingredients for successful learning. As training and human resources professionals, we see this every day.
What are the rules to respect in order to succeed in this combination?
– The first rule, obviously, is that the learning content must correspond to what learners expect and provide elements they can use in practice. The content must also match the learner’s current level and, above all, his or her target level: not too easy, not too hard. For the adults we train, content that links directly to their concrete problems will always speak to them more. It is therefore necessary to have a precise target audience: frontline managers, managers of managers, leaders, etc.
– Second rule: adapt again and again to the learner. What does this mean? Simply that the remote tool must react in a relevant way to every interaction with the learner: learning statistics, advice, targeted questions, and so on. Previously, as learners, we would sit alone in front of our screen and simply look for the correct answer to our mistake. What did this teach us? Not much: we would say to ourselves, “ah, OK, that’s it”. But as we know, we learn much better when we have to search for the answer and think harder. If you’re faced with a flat “too bad, that’s wrong”, chances are you’re not going to look for the answer on your own in the learning content, right?
What we’ve learned from experience at SCA is that we need both to tailor our content to the learners and to personalize the experience of assessing their learning, to keep them engaged in their development. Here we present a concrete case study that we have been developing for a year in the SCA training engineering laboratory.
Adapting the learning assessment experience in incremental steps
We proceeded by regular iterations. Indeed, the first questions we had written for our leadership modules did not seem to address our learners’ problems. The results were not up to par: too many people did not answer or abandoned the evaluation along the way. And for good reason! We had created standard tests that did not account for the difficulty levels of the course. When learners started a module, they were immediately confronted with a content question without ever really having absorbed the material or having been able to discuss it with their instructor or their virtual-classroom peers. We took this into consideration and established multiple levels of question difficulty, linked them to learning objectives, and split the initial assessments, tests, and final assessments.
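The structure described above can be sketched as a small data model. This is a minimal, hypothetical illustration, not SCA’s actual system: the class names, difficulty levels, and sample questions are all assumptions made for the sake of the example.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical difficulty levels tied to where the learner is in the course.
class Difficulty(Enum):
    INTRO = 1     # before the topic has been discussed in class
    CORE = 2      # once the concept has been practiced
    TRANSFER = 3  # when applying the concept to real situations

# The three assessment moments mentioned in the text.
class Stage(Enum):
    INITIAL = "initial assessment"
    TEST = "test"
    FINAL = "final assessment"

@dataclass
class Question:
    text: str
    objective: str          # the learning objective this question checks
    difficulty: Difficulty
    stage: Stage

def questions_for(bank, stage, max_difficulty):
    """Select only questions suited to the learner's current stage and level,
    so beginners are not hit with transfer-level questions on day one."""
    return [q for q in bank
            if q.stage == stage and q.difficulty.value <= max_difficulty.value]

# Illustrative question bank (invented content).
bank = [
    Question("What is active listening?", "communication",
             Difficulty.INTRO, Stage.INITIAL),
    Question("Rank these feedback techniques by impact.", "feedback",
             Difficulty.CORE, Stage.TEST),
    Question("How would you handle this team conflict?", "conflict",
             Difficulty.TRANSFER, Stage.FINAL),
]

early = questions_for(bank, Stage.INITIAL, Difficulty.INTRO)
```

With this split, a module’s opening assessment can be filtered to introductory questions only, while transfer-level questions are held back for the final assessment.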
Create questions you want to answer
Sometimes, when we watch or read content, questions can feel annoying or unnecessary. However, they are an integral part of learning and help to anchor it more durably. For that to happen, they must be intelligently formulated and allow an association with experience and concrete elements.
We therefore dug deep and searched rigorously for what constitutes a quality question, be it multiple choice, open-ended, ranking or definition matching. We explored the full range of good and bad questions and used Bloom’s taxonomy to create a logical, coherent system of question categories ranging from recall to comprehension to real-world application.
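As a rough illustration of this categorization, one can map question formats to the Bloom levels they can reasonably test. The mapping below is a simplified assumption for the sake of the example, not the actual grid used in our modules.

```python
# Bloom's taxonomy levels, from recall up to creation.
BLOOM_LEVELS = ["remember", "understand", "apply",
                "analyze", "evaluate", "create"]

# Assumed mapping from question format to the Bloom levels it can test.
FORMAT_TO_LEVELS = {
    "definition matching": {"remember"},
    "multiple choice": {"remember", "understand"},
    "ranking": {"understand", "analyze"},
    "open-ended": {"apply", "analyze", "evaluate", "create"},
}

def formats_for_level(level):
    """Return the question formats suitable for a given Bloom level."""
    return sorted(f for f, levels in FORMAT_TO_LEVELS.items()
                  if level in levels)
```

Such a grid makes it easy to check that an assessment does not stay stuck at the recall level but also reaches comprehension and real-world application.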
Analyze feedback and practice continuous improvement
Once our groups of learners had answered these sets of questions, we analyzed their responses using an artificial-intelligence system as well as our own human analysis, thankfully! There was real progress. However, we still had some steps to take, especially because our feedback was too superficial and did not challenge our learners when they answered incorrectly or did not answer at all! So we worked on the best way to help them absorb the key knowledge even when their answer was wrong.
How did we do this? By rephrasing the question in a different form, adapted to their answer, to encourage them to try again. We know that learning requires a safe environment. To give learners the safest learning experience possible, our modules let them repeat questions and try again with additional cues that encourage them to search for the right answer. Repetition helps with long-term retention of content, and since the course lasts a year, learners can repeat things often; the tests support that.
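The retry mechanic just described can be sketched in a few lines: when an answer is wrong, the question is re-asked with a progressively stronger cue instead of revealing the answer. The function and the sample question below are illustrative assumptions, not our actual implementation.

```python
def ask_with_retries(question, cues, answer, get_response, max_tries=3):
    """Re-ask a question with progressively stronger cues until it is
    answered correctly or the tries run out.  Returns the number of
    tries needed, or None if the learner never found the answer."""
    prompt = question
    for attempt in range(max_tries):
        if get_response(prompt) == answer:
            return attempt + 1
        if attempt < len(cues):
            # Rephrase with an extra cue rather than revealing the answer.
            prompt = f"{question} (Hint: {cues[attempt]})"
    return None

# Simulated learner (invented data) who succeeds on the second try.
responses = iter(["empathy", "active listening"])
tries = ask_with_retries(
    "Which skill helps a manager understand a report's concerns?",
    ["It involves fully concentrating on what is being said."],
    "active listening",
    lambda prompt: next(responses),
)
```

The point of the design is that the learner is nudged back into the content rather than handed the solution, which is what makes the repetition worth something.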
Putting AI to work for people
Neuroscience teaches us that our brain needs to understand why it is learning this or that thing: it, too, is in search of meaning! It is therefore necessary to create learning paths in which individuals can project themselves and see the usefulness of their learning, while continuing to interact with a trainer-coach throughout their course. And… to use artificial intelligence wherever it can support humans!