Best practices for incorporating competency-based assessment with one45

With the new year now well underway for most of us, I'm taking the opportunity to start a fresh discussion about how programs like yours are using one45 for competency-based assessment. Over the last few months I've had the chance to talk with administrators and directors at a number of programs, and you've shared some of the ways you're using one45 for competency-based assessment. I've also fielded a number of questions about what other programs are doing, so I wanted to share what I've learned, and I encourage you to share some of your own experiences.

Here are a few of the things I have learned from my conversations:

  1. Most programs incorporate competency-based questions on some, if not most, of the assessment forms they use in one45.
  2. Some programs use the competency reports (“spider graph”) feature in one45 to visualize competencies and to generate reports on learner competence across forms.
  3. Some programs use the procedure log tools in one45 to track competence. A few use the procedure log tools for daily assessments like field notes or mini-CEX. Competency grids and other advanced log features let you quantify exposure to procedures and progress toward competence.
  4. While some programs use one45 for daily evaluations or mini-CEX, others use paper forms. In some cases, the paper forms are manually transcribed into one45; in others, the information stays on paper.
  5. A few programs have experimented with other tools, including SurveyMonkey and SharePoint, for implementing daily assessments.

The biggest pain points I’ve heard about are similar to those that often get brought up at conferences when discussing the challenges of implementing competency-based assessment.

  1. It’s a lot of work to collect daily (or even weekly) feedback about all of your learners, and form-completion compliance is a huge problem.
  2. Keeping track of whether your learners are being assessed often enough is difficult, and ensuring that learners are getting the right assessments is even trickier.
  3. Faculty aren’t trained on how to do competency-based assessment, the rubrics are new, and time for faculty development is short or non-existent.
  4. Even if you've managed to get past all the barriers described above, making sense of the wave of information you've collected and reporting on competence effectively is still hard.

I believe the key challenge is removing the barriers that prevent faculty from giving feedback. The two biggest barriers are ready access to the necessary evaluation forms and the time it takes to fill them out, which is why we're building a mobile assessments app. With the app, faculty and learners will be able to request and provide assessments from, and optimized for, the device they carry with them everywhere: their phone.

Your feedback

If you have any feedback, or want to share some of the work that you’re doing, I’d love to hear from you.

  1. How is your program incorporating competency-based assessment into your workflows?
  2. Are there any best practices you can share based on your experience?
  3. Are there any workflows you've tried that worked really well? Are there any that didn't go well that others should avoid?

I've posted a copy of this message on our user forum, and if you're not already a member, you can sign up here. Please share this document with your program administrators and other colleagues who are working on CBE at your institution; I would love to hear from them as well!

Jason Ladicos

About Jason Ladicos

Jason is one45's Senior Product Manager. Having provided technical support for a number of web-based projects at the University of British Columbia's Faculty of Medicine early in his career, he has a hands-on understanding of the unique needs of medical school users. Jason has been involved in one45's user interface design since 2002, and he is an invited expert in the MedBiquitous Competencies Working Group.