On November 25, 2015, the OFC held its first Learning Day at the YWCA in downtown Toronto. The event was attended by almost 100 participants from Ontario's 42 regulators.
It created an opportunity to increase collaboration among regulators by sharing best practices, identifying concerns, and working towards resolving common issues and problems. The event specifically addressed some of the issues and challenges that regulators pinpointed during a recent survey, in addition to those identified during the OFC's last assessment of registration practices. The event also promoted new resources for regulators and continued to build and strengthen the OFC's relationships with regulators.
The Master of Ceremonies was Zubin Austin, Professor at the Leslie Dan Faculty of Pharmacy, University of Toronto, and inaugural holder of the Murray Koffler Chair in Management.
Interim Fairness Commissioner Mary Shenstone welcomed the participants and inaugurated the event. In her address, she shared some general updates from the OFC and announced the release of the 2014–2015 annual report.
Applicants must have an opportunity to demonstrate their ability to practise by showing that they have the required knowledge, skills and judgment through relevant learning and experience. Regulatory bodies continue to find new ways to improve the fairness of their registration requirements. The panel of regulators provided case studies of how they are reviewing these requirements and how they are exploring alternative ways of assessing and accommodating applicants' qualifications.
Gill Pichler, Director, Registration, Association of Professional Engineers and Geoscientists of British Columbia (APEGBC), shared alternatives to the work-experience requirement for engineers. These alternatives are currently being piloted through APEGBC's Canadian Environment Experience Project. The project, which is expected to be completed by the end of 2016, was initiated through a consultation with engineering regulators from across Canada.
The project's objectives are to:
The proposed alternative methods focus on four competencies. Depending on qualifications and competencies demonstrated during the review, the proposed alternatives would include:
This new system is not necessarily easier, but it is attainable, and it could redefine where and for how long the required Canadian experience must be acquired.
Jan Robinson, Registrar and Chief Executive Officer at the College of Veterinarians of Ontario (CVO), discussed how the College assesses the clinical competencies of graduates and undergraduates from unaccredited programs.
One of the main challenges lies in the very definition of what constitutes the practice of veterinary medicine, which varies according to cultural context. Because of this, the CVO does not have a work-experience requirement, and it is exploring restricted licensing, with specific terms and conditions, as an acceptable alternative to general licensing. The College is proposing more alternative pathways to licensing based on specific skill sets and competencies, aiming to bridge the disconnect in a system that has historically led to a general licence even for veterinarians with a specific practice focus. It remains to be seen whether this route will ensure easier access for veterinarians with a narrow scope of practice.
Susan James, Director, Competence, at the Ontario College of Pharmacists, discussed the College's revised Structured Practical Training (SPT) program. In 2012, in response to the OFC's review of work-experience and practical-training requirements and to new trends in the profession, the College evaluated the program to assess its necessity and relevance, the impact of the new PharmD programs, and emerging trends among international applicants.
The results showed that:
The redesign of the SPT program, being done in consultation with the relevant faculties, is consistent with education and residency programs. The redesign includes a behaviourally anchored rating scale, where ratings are based on consistency of performance, effectiveness and quality, and on the level of guidance or support required at entry to practice. For the redesigned program, standards have been set using entry-level competencies. This model is still in the development phase and is being implemented gradually with a small sample of applicants to help the College overcome new challenges as they arise. So far, early trends show that candidates likely to succeed in the new program tend to have been educated and to have practised in comparable practice environments, to have graduated within the past five years, and to have completed all other assessments.
Dr. Antoni Marini is a psychometrician specializing in competency assessments and exams that lead to licensing in regulated professions.
In his presentation, he described how the principles of transparency, objectivity, impartiality and fairness manifest in assessment methods. He also described some of the challenges related to each of these principles in the assessment process and some of the common mistakes associated with them.
He also gave some tips on preparing for competency-based assessments, such as:
He also outlined approaches for building scoring methods and described the need to provide meaningful feedback. He discussed some of the issues around reassessment and elaborated on some of the benefits of Universal Design Assessments.
Third-party assessment agencies operate outside of the OFC's direct oversight. However, the fair-access law requires regulatory bodies to hold third parties accountable for fair assessments. This panel provided perspectives from agencies and regulatory bodies on how to best ensure that third-party assessment practices comply with fair-access principles. The panel was conducted in a Q&A format. Following is a summary of the panellists' speaking points.
Tim Owen, from World Education Services (WES), discussed the issue from the perspective of an academic credential assessment agency. WES verifies and evaluates educational credentials from any country in the world and provides a Canadian equivalency.
WES is an academic credential assessment agency that offers services such as verifications, equivalencies and document-storing services. It is an international charitable organization with a volunteer board, which includes a Canadian Advisory Committee to ensure that Canadian concerns are addressed.
The agency ensures that assessments are transparent, objective, impartial and fair (TOIF) through a Memorandum of Understanding (MOU) it has with many regulators. The MOU outlines WES's responsibilities and the nature of WES's work and processes. WES's work is based on UNESCO standards.
Its qualifications framework is monitored nationally by the Alliance of Credential Evaluation Services of Canada. WES often helps regulators complete the reports that explain how WES and the regulator work together to ensure fair assessments. It has an online system that allows applicants and regulators to track an application through the assessment process and monitor issues that might delay it (e.g., missing documents). This increases transparency and makes the organization more accountable to individuals and regulators.
WES believes that it, not the regulators, is responsible for dealing with complaints. Most complaints from individual clients arise during the document-gathering phase, rather than after the assessment decision. However, WES does have an appeals process for dealing with complaints about assessment decisions.
The appeal process is essentially a policy review, because WES's assessment is based on verified information contained in its assessment database.
Pierre Lemay, from the Medical Council of Canada (MCC), described how the MCC works with medical regulators to ensure that assessments are fair. The MCC evaluates medical students and graduates through exams that it develops and administers, and verifies credentials of internationally educated physicians.
Regulators, educators, medical residents, students and public members are all represented on the MCC. The MCC offers two national exams that all candidates seeking licensure as physicians in Canada must write. It also offers two ways of assessing internationally educated physicians:
The MCC ensures TOIF assessments in many ways, notably by publishing on its website future exam blueprints, the objectives of all its exams, and information on appeals and test accommodations. The MCC has put in place an oversight committee to ensure that regulations for test accommodation are followed. It is also proactive in training its staff. Further, the committees that created the exams have diverse membership from across the country, with representation from different specialities and language groups. The diversity enhances oversight and reduces potential biases.
Candidates can contact the MCC Service Desk or email their concerns using the correspondence module in their physiciansapply.ca account.
HealthForceOntario (HFO) is also an important source of information, guiding candidates through the different pathways to licensing and practice. The relationship also gives the MCC and HFO an opportunity to discuss the overall process and the challenges candidates face.
MCC offers both a reconsideration and formal appeals process. The candidate receives detailed exam results, which he or she can choose to share with registered organizations, including medical regulators. Candidates who have failed the exam can request a rescore, which is done in the presence of either the candidate or MCC staff. Rescore fees are refunded if the procedure leads to a change in the final result (that is, a change from a fail to a pass).
Sten Ardal, from Touchstone Institute, discussed the topic from the perspective of a provincial agency.
Touchstone mainly assesses international applicants in the health professions. It originally worked only with international medical graduates (IMGs), but it has expanded to nurses, optometrists and others. Touchstone is also developing expertise in the assessment of communication competency.
Sound exam methodology is paramount to transparency, objectivity, impartiality and fairness. Touchstone uses competencies developed by regulators to develop exams. Challenges arise when the client is the individual candidate or applicant, as opposed to the regulatory body. In these cases, Touchstone wants to provide as much information as possible about the assessment while protecting the integrity of the exam.
The way Touchstone deals with complaints depends on how exam results are given to the clients. If the results are communicated by the regulatory body, then that body should receive the complaints. Otherwise, if Touchstone is working directly with the individual client, then Touchstone should receive the complaint directly. The complaints are generally about how the assessment process is conducted, rather than the assessment decision itself, which means that Touchstone does not deal with many complaints overall.
There is no common appeals process for all assessments. Some health groups have their own independent appeals process. Touchstone mostly provides information about performance related to individual competencies. What people want to know from an assessment is not just a pass/fail, but detailed information about where they did well and where they need improvement.
Irwin Fefergrad, from the Royal College of Dental Surgeons of Ontario (RCDSO), discussed its perspective on working with third parties and holding them accountable.
The RCDSO's mandate is to protect the public interest. It does not advocate for the profession, and it will register anyone who demonstrates competency.
The RCDSO relies on other institutions to determine competency: universities for admission standards, national examination boards for assessments, and federations of schools for systems and processes. It is important for the RCDSO to monitor how these institutions do their jobs; as such, RCDSO board members are expected to monitor their exams and participate in the development of their tools.
Applicants need consistency in the information they are given about the exams provided by third parties. The RCDSO's two national examining bodies were created by federal statute, and thus see themselves as independent. Since the creation of the OFC and the emphasis on TOIF principles, there has been an increase in the transparency of these two bodies and how they work.
The RCDSO has made a point of working with bodies that operate in line with TOIF principles. These bodies now elect regulatory representatives to their boards, post all minutes and agendas online, and use psychometrically tested exams.
Overall, the RCDSO understands that it is accountable to the fairness commissioner and has implemented most of the OFC's recommendations.
Most complaints the RCDSO receives are about outcomes, not the exam process. It is not the RCDSO's business to interfere in the examining bodies' appeals processes.
The content of the exams needs to be protected, but an appeal of the exam's conduct is permissible and an applicant can retake an exam.
Providing information about gaps is important in order to enable an applicant to fill them. For those who want to be specialists, the RCDSO provides the opportunity to do a gap assessment and create an individualized educational program in order to fill the gap. This can be challenging.
Nuzhat Jafri, the OFC's Executive Director, provided an update about the OFC's current work. She shared infographics about the current landscape of regulatory bodies and applicants. She then highlighted the OFC's revised Strategy for Continuous Improvement and its approach to reducing the reporting burden in various reports (Fair Registration Practices Reports, Entry-To-Practice Review Reports, and Audit Reports). Last, she talked about the development of the OFC's online learning modules, the modernization and release of the Fair Registration Practices reporting website, and the upcoming cycle of assessments.
Feedback about the event has been positive, and the OFC plans to hold another Learning Day in 2016.