Journal of Dental Hygiene

Research Article
Innovations in Dental Hygiene Education

Effectiveness of Online Faculty Calibration Activities

Camille M. Biorn, Ellen J. Rogo and Rachelle Williams
American Dental Hygienists' Association, October 2023, 97 (5) 103-115.
Camille M. Biorn, RDH, MS
Department of Dental Hygiene, Idaho State University, Pocatello, ID, USA
For correspondence: camillebiorn{at}isu.edu

Ellen J. Rogo, RDH, PhD
Department of Dental Hygiene, Idaho State University, Pocatello, ID, USA

Rachelle Williams, RDH-EA, MS
Department of Dental Hygiene, Idaho State University, Pocatello, ID, USA

Abstract

Purpose Dental hygiene faculty members must be able to provide evidence of skill calibration for clinical evaluation of students. The purpose of this study was to evaluate the effectiveness of online instructional videos compared to in-person sessions for faculty calibration.

Methods A randomized crossover pretest/posttest design was used to evaluate online and in-person faculty calibration activities. Fifteen faculty members from a baccalaureate dental hygiene program were randomly assigned to an AB or BA sequence for calibration sessions on two different instruments. Following a 2-week washout period, the groups switched activity modalities. A pretest, a posttest, and a retention test administered 10 weeks following the activity were used to determine learning levels and new and retained knowledge. A 7-point Likert scale questionnaire evaluated the reaction to and impact of the calibration activities. Descriptive statistics were used to analyze demographic and Likert scale data. Paired samples t-tests were used to analyze the research questions (p≤0.05).

Results Online calibration activities yielded higher posttest scores than in-person activities (p=0.01). A greater percentage of participants agreed that the online calibration activities increased their confidence in the ability to evaluate student performance. Equal percentages of participants agreed that the online and in-person activities increased their preparedness to teach dental hygiene instrumentation. There was no significant difference between in-person and online retention test scores (p=0.235).

Conclusion Faculty members agreed that both online and in-person calibration activities were an effective use of their time and contributed to greater feelings of confidence and preparedness. However, the online calibration activities seemed to be more effective at increasing calibration on instrumentation. More research is needed to determine additional effective strategies for online calibration of clinical faculty.

Keywords
  • dental hygiene education
  • clinical education
  • faculty calibration
  • professional development
  • online education

INTRODUCTION

To evaluate competencies, dental hygiene program administrators must implement strategies for student evaluation. One of the most widespread methods to assess clinical competency is through clinical faculty evaluation.1 However, the lack of validity and reliability among instructors when assessing clinical competence has been well established in the literature.2-10 Calibration is needed to increase interrater reliability, especially in areas of subjectivity.3 Clinical instructor calibration is defined by Tabussum as the degree to which various faculty members consistently agree with one another during student evaluation, or the reliability of one faculty member's assessment of student performance on different occasions.11 Clinical instructors with unique expertise, backgrounds, and clinical skills work toward the common goal of creating competent graduates. However, variability in evaluation methods, teaching strategies, verbiage, and clinical skills detracts from student learning, attention to patient care, and the motivation for a student to perform to the highest standard.2,3,12 Conflicting information from evaluators can cause frustration, confusion, and dissatisfaction among students.13,14

Understanding students’ and faculty members’ observations of the barriers they perceive impacting learning is necessary to address variances in evaluation. Variability in evaluation increased stress in the clinical learning environment and negatively impacted learning.15,16 Discrepancies in evaluation have been shown to lead students to change their performance based on which instructor was evaluating them to please that instructor and earn a higher grade.3,12,17 Additionally, students reported that differences among faculty members were a major concern, and discrepancies in assessment negatively impacted program evaluation.18 Clinical faculty have indicated that they would have more positive and effective interactions with students if they received more professional preparation.19 These inadequacies highlight the need for sufficient, effective, and convenient training resources for clinical faculty members.

The Commission on Dental Accreditation (CODA) Standards require dental hygiene program faculty members to provide evidence of skill calibration for clinical evaluation.15 However, researchers have reported that most clinical calibration has been discussion-based rather than actual skill standardization.3,12,16 One suggested method to close the gap between discussion and actual skill development is the use of multimedia instruction.17 Multimedia instruction is defined as the use of visual aids as well as words to foster learning.18

Little is known about the effects of instructional videos on dental hygiene clinical faculty calibration. An instructional technique video was shown to increase the knowledge level of dental hygiene clinical faculty for head and neck examinations in a 2017 pilot study.19 In another study, discussion-based orientation sessions were offered to novice dental hygiene faculty transitioning into clinical teaching; however, participants reported that visual aids such as instructional videos and hands-on demonstrations would have better helped them prepare for new teaching roles.20

Video instructional training has also been shown to be beneficial during the COVID-19 pandemic when virtual meetings with colleagues were introduced.21 The use of technology for instruction is currently gaining momentum, and its practicality and cost-effective nature have demonstrated importance during the pandemic era and continue to be developed post-COVID.22

The purpose of this study was to evaluate the effectiveness of online instructional videos as compared to in-person sessions for faculty calibration. Based on the literature reviewed, research questions were developed regarding calibration effectiveness for dental hygiene clinical faculty members:

  1. Is there a significant difference in the pretest, posttest, and retention instrumentation evaluation scores between the online calibration and in-person calibration groups?

  2. How does the reaction (effectiveness and feelings of confidence and preparedness) compare between the online calibration and in-person calibration groups?

  3. How does the impact of the calibration compare between the online calibration and in-person calibration groups?

METHODS

This randomized crossover study received expedited ethical approval under OHRP (DHHS) and FDA guidelines from the Idaho State University Institutional Review Board (IRB-FY2022-255). The convenience sample consisted of part-time and full-time clinical faculty members assigned to first- and second-year clinical education during the fall 2022 semester at Idaho State University. The Kirkpatrick Model for Training Evaluation provided the framework for this investigation.23 The model consists of four levels of evaluation (Reaction, Learning, Behavior, and Impact). The definitions of each level and the evaluation mechanisms used to collect data are shown in Table I.

Table I.

Study Evaluation Mechanisms using the Kirkpatrick Model for training evaluation24

Level one of the Kirkpatrick Model, Reaction, was evaluated using a questionnaire developed by the primary investigator. Evidence of content validity for the Reaction questionnaire was established by the content validity index (CVI). Content validity forms were provided online to 15 clinical faculty with expertise in clinical teaching. The experts rated each domain on its degree of relevance (1=not relevant to 4=highly relevant). After content validation, each expert’s relevance rating was recoded as 1 (relevance scale of 3 or 4) or 0 (relevance scale of 1 or 2).24 Next, the number of experts in agreement with relevance was divided by the number of experts. This computation provided the item CVI (I-CVI) or the proportion of content experts giving the item a relevance rating of 3 or 4. Items with an I-CVI of 0.78 or higher for three or more experts were considered evidence of good content validity.25 All items on the questionnaire received an I-CVI of 0.78 or higher. The Reaction questionnaire was administered immediately after both phases of the online and in-person calibration activities.
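The I-CVI computation described above can be sketched in a few lines of Python; the ratings shown are hypothetical, not the study's data:

```python
# Sketch of the item-level content validity index (I-CVI): each expert's
# 1-4 relevance rating is recoded to 1 (rating of 3 or 4) or 0 (rating of
# 1 or 2), and the proportion of experts in agreement is the I-CVI.
# The ratings below are illustrative only.

def item_cvi(ratings):
    """Return the proportion of experts rating the item 3 or 4."""
    relevant = [1 if r >= 3 else 0 for r in ratings]
    return sum(relevant) / len(relevant)

# Hypothetical ratings from 15 experts for one questionnaire item
ratings = [4, 3, 4, 4, 2, 3, 4, 3, 4, 4, 1, 3, 4, 4, 3]
print(round(item_cvi(ratings), 2))  # 13/15 -> 0.87, above the 0.78 threshold
```

An item clearing the 0.78 threshold would be retained as evidence of good content validity.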

Level two, Learning, determined the degree to which participants acquired the intended knowledge from the calibration sessions and was evaluated by comparing online and in-person posttest scores for instrumentation technique evaluation. This measurement occurred directly after the online and in-person calibration activities. Data were analyzed using a paired samples t-test in Jamovi, a free, open-source statistical analysis program.
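A paired samples t-test of this kind can be sketched in plain Python (the study used Jamovi); the two score lists below are fabricated for illustration, not the study's data:

```python
# Minimal sketch of a paired samples t-test comparing posttest scores from
# the same raters under two conditions. Data are hypothetical.
import math

def paired_t(x, y):
    """Paired-samples t statistic for two equal-length lists of scores."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample standard deviation of the paired differences
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d / (sd_d / math.sqrt(n))

# Hypothetical posttest scores for 15 paired online/in-person evaluations
online   = [55, 48, 52, 60, 47, 51, 58, 45, 50, 53, 49, 56, 44, 54, 46]
inperson = [42, 40, 45, 50, 38, 41, 47, 36, 39, 44, 37, 46, 35, 43, 40]

t_stat = paired_t(online, inperson)
T_CRIT = 2.145  # two-tailed critical value of t for df=14, alpha=0.05
print(abs(t_stat) > T_CRIT)
```

A `True` result indicates the mean paired difference is significant at the study's p≤0.05 threshold.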

For this study, the gold standard for measuring performance consisted of the evaluation of the 11/12 explorer and 1/2 Gracey curette as determined by the principal investigator and a co-investigator. The two investigators standardized on the evaluation of both instruments during the compilation of the online instructional videos and the gold-standard keys for the instrumentation technique scoring. The gold standard served as the pretest and posttest evaluation key to determine the percentage of correct responses faculty achieved. Pre-existing grading rubrics for evaluation of instrumentation performance using the Gracey 1/2 curette and the 11/12 explorer were used for the pretest and posttest. The instrumentation grading rubrics were developed by the dental hygiene faculty in 2015, prior to an accreditation site visit in 2017, and have been validated over time.

Prerecorded simulation videos for mock testing were created for both pretests and posttests; each video ranged from 3 to 4 minutes. For the online calibration, the 11/12 explorer pretest video was created in the maxillary right quadrant and the posttest video in the maxillary left quadrant. For the in-person calibration, the Gracey 1/2 curette pretest video was created in the mandibular left quadrant and the posttest video in the mandibular right quadrant. Intentional errors in technique were incorporated equally into the pretest and posttest videos.

Level three of the Kirkpatrick Model, Behavior, or the degree to which participants applied what they learned during the semester in the clinical setting, was evaluated using a retention instrumentation technique evaluation for both instruments, administered 10 weeks after completion of all calibration activities. Online pretest scores were compared to online retention test scores, and in-person pretest scores were compared with in-person retention test scores. Online retention test scores and in-person retention test scores were also compared. Data were analyzed using paired samples t-tests and Jamovi software. The simulated instrumentation videos created for the posttests for both the 11/12 explorer and the Gracey 1/2 curette were used for the retention test. The gold standard for retention test assessment consisted of an evaluation of the 11/12 explorer and Gracey 1/2 curette as determined by the principal investigator and a co-investigator. The same preexisting grading rubrics used for the pretests and posttests were used for the retention tests.

The fourth level, Impact, or the degree to which desired outcomes occurred in the clinical setting as a result of the calibration, was evaluated by administering a questionnaire developed by the primary investigator. Evidence of content validity for the Impact questionnaire was also established by the content validity index (CVI) as previously described. The Impact questionnaire was administered 10 weeks after all calibration activities for both the online and in-person groups.

The study was conducted August through December of 2022. The convenience sample consisted of fifteen clinical faculty members from a baccalaureate dental hygiene program who were randomly assigned to one of the two crossover sequences: participants' names were placed in a secure box and drawn at random by the primary investigator and a co-investigator. In the fall semester of 2022, clinical faculty members were emailed an invitation to participate in the calibration study. The email contained a cover letter informing respondents of the study's purpose, risks, benefits, and voluntary nature. Respondents who consented to participate returned the signed informed consent to the principal investigator with a chosen pseudonym to maintain confidentiality.

A two-group randomized crossover AB/BA design (Figure 1) was used to determine the effect of incorporating online instructional videos into faculty calibration exercises. Participants were randomly assigned to an AB or BA sequence. The group assigned to AB participated in a pretest (evaluation of prerecorded instrumentation technique) and then received access to the online instructional video for the calibration session which included reviewing the 11/12 explorer instrumentation criteria and viewing correct and improper techniques. The instructional video was approximately 25 minutes long. A posttest was then administered (Learning).
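The random allocation into the two sequences can be illustrated with a short Python sketch; the participant labels and seed are hypothetical stand-ins for the draw-names-from-a-box procedure described in the Methods:

```python
# Hypothetical sketch of allocating 15 clinical faculty members to the two
# crossover sequences (AB: online activity first; BA: in-person first).
# Names and seed are illustrative, not study data.
import random

participants = [f"faculty_{i:02d}" for i in range(1, 16)]
random.seed(2022)             # fixed seed so the illustration is reproducible
random.shuffle(participants)  # stands in for drawing names from the box

group_ab = participants[:8]   # online video first, in-person after washout
group_ba = participants[8:]   # in-person first, online video after washout
print(len(group_ab), len(group_ba))  # 8 7
```

With an odd sample of 15, one sequence necessarily receives one more member than the other.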

Figure 1.

Two-group Randomized cross-over AB/BA design

After a 2-week washout period, during phase II, Group AB participated in a pretest (evaluation of prerecorded video instrumentation) and then received the in-person Gracey 1/2 curette calibration session, which consisted of reviewing the instrumentation criteria, breaking into dyads to model correct and improper technique, and participating in a question-and-answer discussion. A posttest was then administered (Learning). Likewise, the participants assigned to BA received the in-person Gracey 1/2 activity with pretests and posttests (Learning) in the first phase; after a 2-week washout period, during phase II, they received the online 11/12 explorer instructional video calibration session with pretests and posttests (Learning). The Reaction to the instructional video and in-person calibration activities was assessed using a Likert scale questionnaire after each calibration activity. The 7-point Likert scale ranged from 1=completely disagree to 7=completely agree.

Although there is a discrepancy in the literature about how much time should pass before recalibration, experts have reported that, to maintain intra- and interrater reliability, calibration exercises should be offered anywhere from 10 weeks to one year after the initial training.26 Ten weeks after phase II of the study, all participants completed two retention (Behavior) tests (evaluation of prerecorded instrumentation technique) to assess retained knowledge of both the online calibrated instrument (11/12 explorer) and the in-person calibrated instrument (Gracey 1/2 curette) and were provided Likert scale questionnaires (Impact) to evaluate the degree to which desired outcomes occurred as a result of both the online and in-person activities. The same Likert scale used for the evaluation of Reaction was used to evaluate Impact.

RESULTS

Demographics

Twelve of fifteen participants (80%) responded with demographic information. Of the twelve, 8% (n=1) were male and 92% (n=11) were female. The highest degree earned was a master’s degree, held by 50% (n=6), and the remaining 50% (n=6) held bachelor’s degrees. Sixty-seven percent (n=8) of the respondents had full-time teaching experience. Most (92%, n=11) had taught in preclinic and in junior (first year) or senior (second year) clinics. Additionally, 92% (n=11) had full-time clinical experience ranging from zero years to forty years and 100% (n=12) had part-time experience ranging from one year to forty years.

Reaction

There were fifteen responses (100%) to the online Reaction questionnaire and eleven responses (73%) to the in-person Reaction questionnaire. More participants (67%, n=10) mostly or completely agreed the online calibration activities increased their feelings of confidence in the ability to evaluate student performance compared with in-person participants (27%, n=3). Sixty percent (n=9) of respondents indicated they mostly or completely agreed the online calibration left them feeling confident in their ability to verbalize instrumentation feedback for students versus 55% (n=6) who mostly or completely agreed the in-person calibration increased their feelings of confidence in verbalizing feedback. Just over half of the online participants (53%, n=8) and the in-person participants (55%, n=6) felt the activities increased feelings of confidence in the ability to identify student strengths and weaknesses. Seventy-three percent of respondents mostly or completely agreed both the in-person calibration (n=8) and the online calibration (n=11) activities increased feelings of preparedness to teach dental hygiene instrumentation. (Table II).

Table II.

Reaction Responses (n=26)

A majority of online (93%, n=14) and in-person (90%, n=10) participants agreed that the calibration sessions were an effective use of their time. A majority (80%, n=12) agreed that the online calibration provided sufficient instruction to facilitate the evaluation of student performance, whereas 73% (n=8) agreed that the in-person calibration did. Seventy-five percent (n=6) of participants agreed the online instruction provided sufficient resources to guide their evaluation, and 75% (n=3) agreed the in-person instruction did; however, this question was inadvertently omitted from the questionnaire for phase II of the study. Nearly all of the online (93%, n=14) and in-person (91%, n=10) participants were satisfied with the calibration training styles. Participant responses are shown in Table II.

Learning

A statistically significant difference in posttest scores between the online and in-person calibration activities was noted using a paired samples t-test (online=50.1, in-person=40.6; p=0.01; 95% CI=2.70, 17.8). These findings suggest greater knowledge and learning were gained with the online video calibration. Although posttest scores were higher than pretest scores in the online group, the difference was not statistically significant (p=0.634). Mean instrument evaluation scores in the in-person group decreased by approximately 10 points between the pretest and posttest (pretest=50.2, posttest=40.6; p<0.001; 95% CI=5.32, 13.9).

Behavior

There was no statistically significant difference in retention (Behavior) test scores ten weeks after the calibration efforts between the online calibrated instrument (11/12 explorer) and the in-person calibrated instrument (Gracey 1/2 curette) as noted by a paired samples t-test (online=53.3, in-person=49.3; p=0.235; 95% CI=−2.86, 10.7). Additionally, there was no significant difference between pretest scores and retention test scores ten weeks after the activities for the online activities (online pre=49.2, online retention=53.3; p=0.385; 95% CI=−13.8, 5.65) nor the in-person activities (in-person pre=50.2, in-person retention=49.3; p=0.783; 95% CI=−5.75, 7.48).

Impact

Most participants completed the Impact questionnaire (in-person: 73%, n=11; online: 87%, n=13). The majority (85%, n=11) of participants agreed they used what they learned in the online calibration activities, and nearly all (91%, n=10) agreed they used what they learned in the in-person calibration activities, to accurately evaluate student performance. More participants agreed that they were experiencing positive interactions with students following the online calibration (61%, n=8) than following the in-person calibration (54%, n=6).

DISCUSSION

This study examined the effect of online instructional video technology on faculty calibration as a means of increasing faculty members' instrumentation evaluation scores and feelings of preparedness and confidence. Results supported McMillan's suggestion of incorporating online instruction for clinical faculty calibration, complementing the traditional calibration of checklists and rubrics.27 Online calibration has been shown to enhance the intellectual processes involved in how adult learners retain information.17 The online calibration activity has also been shown to assist clinical faculty members in recognizing relevant information, forming a mental image, and integrating new knowledge with prior knowledge.17 These strategies are consistent with previous research on adult learning.28,29

The first two levels of the Kirkpatrick Model of Evaluation, Reaction and Learning, offered data related to the quality and effectiveness of the online calibration program and the degree to which knowledge and skill were learned.22 Levels three and four, Behavior and Impact, provided the data needed to assess the application of information learned. These levels measured clinical evaluation performance and subsequent results related to the reinforcement of calibration. The Behavior and Impact levels are not routinely included as part of the evaluation process; however, these levels provided valuable data regarding the implementation efficacy of what was learned during each educational program.23 For this study, the desired learning outcomes were increased learning (test scores) and increased feelings of confidence and preparedness to evaluate instrumentation. The Kirkpatrick Model continues to be useful, appropriate, and applicable in the evaluation of all types of training activities30 and has recently been used to assess both student and faculty educational programs in dentistry.24,31,32

The overall results of the online calibration exercises were positive. Mean online instrument posttest evaluation scores were higher than online pretest scores, suggesting an increase in faculty knowledge and learning. The increase in performance outcomes is consistent with previous research suggesting that adult learners recall more information, and produce more problem-solving solutions, when videos are accompanied by narration rather than text only.33 Participants were able to replay the content of the online instructional videos as many times as they wanted to solidify concepts, and more participants responded that the online calibration activities provided sufficient instruction as compared to in-person. Carter had similar findings regarding the use of an instructional video and the increased knowledge level of dental hygiene instructors for head and neck examinations.19 A systematic review and meta-analysis of faculty development outcomes showed that faculty development and calibration sessions helped improve the self-confidence of clinical faculty.34 The online participants in this study reported feeling more confident in thoroughly evaluating student performance, verbalizing feedback, and identifying student strengths and weaknesses.

Additionally, increased numbers of part-time clinical faculty members, diverse responsibilities outside of academia, and most recently, the guidelines for social distancing amid a pandemic have shown a need for a more flexible approach to faculty calibration. Previous literature reported difficulty getting all faculty together for participation, and program administrators have identified finding time for clinical calibration sessions as one of their biggest challenges.12 A 2019 systematic review and meta-analysis reported that current calibration programs required resources, funds, effort, space, commitment, and flexibility; however, these support services were not available.34 The online calibration activity in this study was convenient to access at any time and from anywhere and had 100% participation from clinical faculty. The online program required very few resources and no space. Moreover, participants reported being more satisfied with the online training style versus the in-person style.

In contrast, the in-person calibration activities showed mixed results. Mean instrument evaluation scores decreased between the pretest and posttest, suggesting a disconnect between the information gained and the implementation of that knowledge. Unlike the online calibration, which could be viewed repeatedly, the in-person activities, once executed, were complete. DiGiacinto reported that adult learners given text-only instruction may not recall information as well as when provided narration accompanied by video instruction.33 Also, for a calibration program to be effective, it must occur regularly, and technical skills must be addressed and maintained.12 However, program administrators report frustration with finding recurrent time periods that accommodate all faculty members' attendance and with providing compensation for their time.17 For instance, participants attending the in-person session were asked to attend the 1.5-hour calibration on a predetermined day and at a predetermined time and had to adjust their schedules accordingly.

More in-person participants reported there were concepts of student evaluation that they still did not understand after the activities. Fewer in-person participants than online participants agreed that the calibration activity was an effective use of their time. In-person activities are more difficult to schedule at a time that is convenient for all faculty members to attend and are more difficult to repeat on a regular basis to maintain technical skills. Respondents of one study were undecided regarding their satisfaction with calibration sessions because they felt it was not a wise use of time and resources when the activities were mainly discussion-based rather than technical skill evaluations with comparisons between faculty members.12 Respondents were also not consistently compensated monetarily for their time. In the current study, participants received continuing education credit for both online and in-person calibration activities. Documented continuing education in calibration aligns with the CODA requirement that all dental hygiene program faculty have a current knowledge of the specific subjects they are teaching and a documented background in current educational methodology concepts consistent with their teaching assignments.15 McDermid et al. stated that clinical faculty dissatisfaction stemmed from a feeling of unpreparedness and lack of confidence due to insufficient instruction and educational resources offered by the institution.35,36 A majority of the participants in the current study reported having sufficient instruction to evaluate student instrumentation performance after the online calibration instruction, as compared to the in-person calibration instruction.

The literature specifies that both students and faculty members want better calibration and training efforts for clinical educators to enable the more effective transfer of clinical skills to students.12,37 Clinical faculty reported they would have more positive interactions with students if they received more preparation and calibration.38 The majority of participants in the current study agreed they felt they were interacting with students more positively after participating in the online calibration, congruent with previous research that disclosed participants felt they would have more positive and effective interactions with more teaching preparation.36

Lastly, evidence of technical skill standardization for clinical teaching is a required CODA standard.15 Previous research reported that most calibration occurring in the clinical setting was discussion-based rather than concrete skill standardization, and could be enhanced by implementing a standard for measuring performance.12 Dicke et al. suggested all clinical faculty (novice and seasoned) should be held to the same standards to evaluate student performance.12 When all clinical faculty, regardless of their experience level, agreed with the same standard, they also agreed with each other. In the current study, a gold standard was used to which everyone was compared so that a plan for resolving inconsistencies and re-evaluating outcomes to ensure reliability could be established.

An online calibration activity similar to the one used in this study can be used to mitigate some of the barriers to calibration found in dental hygiene programs. Results demonstrated that online calibration activities are convenient, accessible, effective, and were a desired training style compared to in-person calibration. Online calibration is always available and easy to repeat as part of necessary recalibration efforts to maintain psychomotor skills.

This study had limitations. The small convenience sample from a single dental hygiene program limits the generalizability of the results, and the small sample size may also have limited the statistical power to detect a difference. Questionnaires used in research studies often have low response rates, and respondents may provide answers they think are desired rather than what they feel.39 In phase II, one question was inadvertently omitted from the questionnaire, so an accurate representation of how participants felt about having sufficient resources to evaluate student performance could not be obtained. Using a continuous prerecorded instrument video covering several teeth for evaluation presented challenges. Future investigations using brief video clips of instrumentation on one tooth surface and asking simple yes or no questions about basic fundamentals of instrumentation may reduce the complexity of evaluation. Using yes/no questions would also allow for statistical analysis of inter-rater reliability using Cohen's Kappa.
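As a sketch of the suggested future analysis, Cohen's kappa for two raters answering yes/no items can be computed as follows; the rating lists are hypothetical:

```python
# Hypothetical sketch of Cohen's kappa, the inter-rater reliability statistic
# suggested above for yes/no evaluation items. Rating lists are fabricated.
def cohens_kappa(rater_a, rater_b):
    """Kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal "yes"/"no" proportions
    p_a, p_b = rater_a.count("yes") / n, rater_b.count("yes") / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

a = ["yes", "yes", "no", "yes", "no", "yes", "no", "no", "yes", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "no", "yes", "yes", "yes"]
print(round(cohens_kappa(a, b), 2))  # 0.58: agreement beyond chance
```

Unlike raw percent agreement, kappa corrects for the agreement two raters would reach by chance alone.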

Future research should compare various online methods for dental hygiene faculty calibration. Evaluation of a program is an essential element of curriculum development, and consideration should be given to which outcomes need measuring and how they will be calculated. The Kirkpatrick Model for Evaluation of Training Programs, though available since the 1970s, has more recently been reported in the dental literature as a tool for evaluating both students and faculty.24,31,32

CONCLUSION

Study participants agreed that both online and in-person activities were an effective use of their time and contributed to greater feelings of confidence and preparedness. However, a greater percentage of the participants agreed the online calibration activities increased their feelings of confidence in the ability to evaluate student performance, and more participants agreed the online activities provided sufficient instruction compared to the in-person activities. Moreover, participants reported being more satisfied with the online training style compared to the in-person style. The online activities were more effective at increasing calibration on instrumentation and the on-demand availability of the videos may have contributed to greater feelings of confidence, preparedness, and knowledge. More research is needed to determine additional effective online calibration methods and how online calibration can be utilized for recalibration.

ACKNOWLEDGMENTS

The authors thank Melinda Siler, Administrative Assistant, for her assistance in creating the Qualtrics surveys; the clinical faculty for their participation and contributions; and Irene van Woerden for her statistical advice and consultation.

Footnotes

  • NDHRA priority area, Professional development: Education (educational models).

  • DISCLOSURES

    The authors have no conflicts of interest to disclose. This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

  • Received April 24, 2023.
  • Accepted September 1, 2023.
  • Copyright © 2023 The American Dental Hygienists’ Association

REFERENCES

  1. Tucker K. The lived experience of clinical nurse experts transitioning to the role of novice educators. Order No. 10194156 [Internet]. Greeley (CO): University of Northern Colorado; 2022 [cited 2023 May 17]. Available from: https://www.proquest.com/docview/1882665881/abstract/785C9395295849FDPQ/1
  2. Brame JL, AlGheithy DS, Platin E, Mitchell SH. Use of a self-instructional radiographic anatomy module for dental hygiene faculty calibration. J Dent Hyg. 2017 Jun;91(3):5-13.
  3. Garland KV, Newell KJ. Dental hygiene faculty calibration in the evaluation of calculus detection. J Dent Educ. 2009 Mar;73(3):383-89.
  4. La Chimea T, Kanji Z, Schmitz S. Assessment of clinical competence in competency-based education. Can J Dent Hyg. 2020 Jun;54(2):83-91.
  5. Lanning SK, Best AM, Temple HJ, et al. Accuracy and consistency of radiographic interpretation among clinical instructors in conjunction with a training program. J Dent Educ. 2006 May;70(5):545-57.
  6. Park RD, Susarla SM, Howell TH, Karimbux NY. Differences in clinical grading associated with instructor status. Eur J Dent Educ. 2009 Feb;13(1):31-8.
  7. Partido BB, Jones AA, English DL, et al. Calculus detection calibration among dental hygiene faculty members utilizing dental endoscopy: a pilot study. J Dent Educ. 2015 Feb;79(2):124-32.
  8. Santiago LJ, Freudenthal JJ, Peterson T, Bowen DM. Dental hygiene faculty calibration using two accepted standards for calculus detection: a pilot study. J Dent Educ. 2016 Aug;80(8):975-82.
  9. Seabra RC, Costa FO, Costa JE, Van Dyke T, Soares RV. Impact of clinical experience on the accuracy of probing depth measurements. Quintessence Int. 2008 Aug;39(7):559-65.
  10. Sharaf AA, AbdelAziz AM, El Meligy OAS. Intra- and inter-examiner variability in evaluating preclinical pediatric dentistry operative procedures. J Dent Educ. 2007 Jul-Aug;71(4):540-4.
  11. Tabassum A, Alhareky M, Madi M, Nazir MA. Attitudes and satisfaction of dental faculty toward calibration: a cross-sectional study. J Dent Educ. 2022 Jun;86(6):714-20.
  12. Dicke NL, Hodges KO, Rogo EJ, Hewett BJ. A survey of clinical faculty calibration in dental hygiene programs. J Dent Hyg. 2015 Aug;89(4):264-73.
  13. Jacks ME, Blue C, Murphy D. Short- and long-term effects of training on dental hygiene faculty members' capacity to write SOAP notes. J Dent Educ. 2008 Jun;72(6):719-24.
  14. Kang I, Foster Page LA, Anderson VR, et al. Changes in students' perceptions of their dental education environment. Eur J Dent Educ. 2015 May;19(2):122-30.
  15. Commission on Dental Accreditation (CODA). Accreditation standards for dental hygiene education programs 2022 [Internet]. Chicago (IL): American Dental Association; 2022 [cited 2022 Apr 14]. Available from: https://coda.ada.org/-/media/project/ada-organization/ada/coda/files/dental_hygiene_standardspdf?rev=aa609ad18b504e9f9cc63f0b3715a5fd&hash=67CB76127017AD98CF8D62088168EA58
  16. Haj-Ali R, Feil P. Rater reliability: short- and long-term effects of calibration training. J Dent Educ. 2006 Apr;70(4):428-33.
  17. Issa N, Mayer RE, Schuller M, et al. Teaching for understanding in medical classrooms using multimedia design principles. Med Educ. 2013 Apr;47(4):388-96.
  18. Mayer R. Using multimedia for e-learning. J Comput Assist Learn. 2017 Jun;33(5):403-23.
  19. Carter BB. Faculty calibration with instructional videos for head and neck examinations [Internet]. Chapel Hill (NC): The University of North Carolina; 2017 [cited 2023 Aug 2]. Available from: http://www.proquest.com/pqdtglobal/docview/1952165365/abstract/2B28D469E35E4CB3PQ/1
  20. Smethers RD, Smallidge DL, Giblin-Scanlon LJ, Perry KR. Experiences and challenges of clinical dental hygienists transitioning into teaching roles. J Dent Hyg. 2018 Dec;92(6):40-46.
  21. Shah S. The technological impact of COVID-19 on the future of education and health care delivery. Pain Physician. 2020 Aug;23(4S):S367-S380.
  22. Sahi PK, Mishra D, Singh T. Medical education amid the COVID-19 pandemic. Indian Pediatr. 2020 Jul;57(7):652-57.
  23. Kirkpatrick Partners. The Kirkpatrick model [Internet]. Newnan (GA): Kirkpatrick Partners; 2022 [cited 2023 Aug 2]. Available from: https://kirkpatrickpartners.com/the-kirkpatrick-model/
  24. Gianoni-Capenakas S, Lagravere M, Pacheco-Pereira C, Yacyshyn J. Effectiveness and perceptions of flipped learning model in dental education: a systematic review. J Dent Educ. 2019 Aug;83(8):935-45.
  25. Polit DF, Beck CT, Owen SV. Is the CVI an acceptable indicator of content validity? Appraisal and recommendations. Res Nurs Health. 2007 Aug;30(4):459-67.
  26. McAndrew M. Faculty calibration: much ado about something. J Dent Educ. 2016 Nov;80(11):1271-72.
  27. McMillan JL. Participation in an online faculty development program to support novice nursing faculty [Internet]. Flagstaff (AZ): Northern Arizona University; 2020 [cited 2023 Aug 2]. Available from: http://www.proquest.com/pqdtglobal/docview/2412268816/abstract/D5C5051CA404478FPQ/1
  28. Goddu K. Meeting the challenge: teaching strategies for adult learners. Kappa Delta Pi Rec. 2012 Oct;48(4):169-73.
  29. Sanger PA, Pavlova IV. Applying andragogy to promote active learning in adult education in Russia. Int J Eng Ped. 2016 Nov;6(4):41.
  30. Alsalamah A, Callinan C. The Kirkpatrick model for training evaluation: bibliometric analysis after 60 years (1959-2020). Ind Commer Train. 2022 Jan;54(1):36-63.
  31. Badran AS, Keraa K, Farghaly MM. Applying the Kirkpatrick model to evaluate dental students' experience of learning about antibiotics use and resistance. Eur J Dent Educ. 2022 Nov;26(4):756-766.
  32. Johnston C, Ganas J, Jeong YN, et al. Faculty development initiatives in academic dentistry: a systematic review. J Dent Educ. 2019 Sept;83(9):1107-1117.
  33. DiGiacinto D. Using multimedia effectively in the teaching-learning process. J Allied Health. 2007 Jun;36(3):176-79.
  34. Bilal, Guraya SY, Chen S. The impact and effectiveness of faculty development program in fostering the faculty's knowledge, skills, and professional competence: a systematic review and meta-analysis. Saudi J Biol Sci. 2019 May;26(4):688-97.
  35. Forbes H, Oprescu FI, Downer T, et al. Use of videos to support teaching and learning of clinical skills in nursing education: a review. Nurse Educ Today. 2016 Jul;42:53-6.
  36. McDermid F, Peters K, Daly J, Jackson D. 'I thought I was just going to teach': stories of new nurse academics on transitioning from sessional teaching to continuing academic positions. Contemp Nurse. 2013 Aug;45(1):46-55.
  37. Belinski DE, Kanji Z. Intersections between clinical dental hygiene education and perceived practice barriers. Can J Dent Hyg. 2018 Jun;52(2):132-39.
  38. Paulis MR. Comparison of dental hygiene clinical instructor and student opinions of professional preparation for clinical instruction. J Dent Hyg. 2011 Fall;85(4):297-305.
  39. Patten ML. Questionnaire research: a practical guide. 3rd ed. Glendale (CA): Pyrczak Publishing; 2011. 154 p.
Keywords

  • dental hygiene education
  • clinical education
  • faculty calibration
  • professional development
  • online education

About

  • About ADHA
  • About JDH
  • JDH Reviewers
  • Contact Us

Helpful Links

  • Submit a Paper
  • Author Guidelines
  • Permissions
  • FAQs

More Information

  • Advertise
  • Subscribe
  • Email Alerts
  • Help

ISSN #: 1553-0205

Copyright © 2025 American Dental Hygienists’ Association

Powered by HighWire