2012 AAMC/SACME Harrison Survey of Academic Medical Centers Links CME to Quality Improvement

Since 1981, the Society for Academic CME (SACME) Research Committee has surveyed continuing medical education (CME) units at medical schools in the U.S. and Canada.  Beginning in 2008, the Association of American Medical Colleges (AAMC) partnered with SACME to prepare this report.  The Biennial Survey of Society members includes questions about the organization of the CME unit, its relationship to the larger organization in which it resides, the ‘product’ of the CME unit (courses and other activities and interventions), its funding base, research and innovation, and other items related to the operation of the CME unit. 

The survey is electronically distributed to all U.S. and Canadian medical schools, teaching hospitals in the U.S., and clinical academic societies that are members of the AAMC’s Council of Academic Societies (CAS).  We covered the survey in 2009 and 2010; SACME has now released its 2012 data. 

The fifth iteration of the AAMC/SACME Harrison Survey documents a highly viable and robust academic enterprise increasingly integrated into the functions and mission of the academic medical centers (AMCs) and medical schools of the U.S. and Canada.  While there are several limitations to interpretation of this survey, it generates broad but important findings for discussion and analysis, namely: 

  • An increasing linkage of the academic continuing medical education (CME) unit to  the quality and performance improvement programs and initiatives of the hospital  and health system. In particular, extensive interaction among these areas has grown from below 10 percent to more than 15 percent since 2008, and the relationship has  become increasingly important over a five-year period.
  • Continued well-developed relationships with programs for other health  professions, graduate medical education (GME), and faculty development; however,  missed opportunities for the academic CME unit and the AMC in building  collaborations with faculty practice plans, undergraduate medical education,  hospital accreditation, and other functions.
  • A clear trend to assess outcomes beyond the scope of the traditional post-course  ‘happiness index’ using a variety of methods to assess competence, performance,  and patient outcomes to evaluate their impact on the health system.
  • Growing institutional support, demonstrated by comparing median institutional support to full-budget figures, representing a commitment on the part of most, if not all, institutions to academic CME.
  • A widespread commitment to regional community-based hospitals, health systems, and health professionals, reflected in a growing array of educational methods, including academic detailing. This regional alignment is important to considerations of ‘accountable care’ structures.
  • Increasing use of evidence-based educational methods that have been shown to  more frequently change clinical performance over a five-year period.
  • A reasonably steady, if still relatively small, cohort of CME units committed to  scholarship that contribute to the research enterprise in health professional learning  and change, the product of collaboration both within and across AMCs, derived  from funding sources internal and external to the institution. 

“Thus, academic CME demonstrates, despite external financial and regulatory pressures  (and perhaps because of them), several major changes over a five-year period,” the report maintained.  “There is  evidence of an increasing integration into the functions of the AMC; an uptake in the  use of effective educational methods; a wide variety of outreach activities geared to the  needs of the communities served by AMCs; and an impressive, if not yet widespread,  record in scholarly activities and best practices.” 

Methods 

In June and July of 2012, an Internet search identified a total of 465 academic CME units—315 U.S. teaching hospitals, 17 Canadian medical schools, and 133 U.S. medical schools—for which a defined CME office and/or institutional contact information could be identified, and in which a central national or regional CME office did not accredit the activities (e.g., VA hospitals).  

Regarding medical schools, only 45 academic CME units represented U.S. medical schools alone, while all 17 Canadian medical schools were represented by CME units.  An additional 88 CME units in the U.S. indicated that they provided CME services to both their medical school and one or more teaching hospitals or health care systems, covering a total of 150 institutions. 

In all, this generated a total of 239 academic CME units—17 Canadian medical schools, 133 U.S. medical schools, and 89 teaching hospitals/health care systems, with cross-representation as indicated above.  The 2012 Harrison Survey report describes: 

  • response rate and characteristics of the respondents
  • mission and scope of activities of academic CME units
  • organization of AMCs in relationship to their CME units
  • use of effective CME methods
  • funding issues in academic CME
  • research, development, and best practices 

Of the 239 eligible CME units in U.S. teaching hospitals and in U.S. and Canadian medical schools, 184 (77%) responded to the survey.  Of these, roughly 94% were U.S.-based and 6% Canadian-based.  One hundred twenty-seven (79%) reported national accreditation in the U.S. by the ACCME and 22 (14%) by state accrediting agencies.  All 12 reporting Canadian schools (6% of the total) indicated accreditation by the Committee on Accreditation of Canadian Medical Schools (CACMS).  
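The headline response figures can be verified arithmetically. A minimal sketch using only the counts quoted above:

```python
# Respondent counts reported in the 2012 Harrison Survey (quoted above).
eligible = 239   # eligible academic CME units surveyed
responded = 184  # units that responded
canadian = 12    # reporting Canadian schools

print(f"Response rate: {responded / eligible:.0%}")  # matches the reported 77%
print(f"Canadian share: {canadian / responded:.1%}")  # reported as roughly 6%
```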

  • Roughly three-quarters (76%) of these sites provided quality and performance improvement activities or planned continuing education for an interprofessional audience.  
  • Similar percentages provided clinical professional development for faculty and staff and faculty development to improve teaching skills.  
  • Of interest, 35% provided at least a small number of noncertified educational services, and
  • 15% provided patient or public education programming.  

In addition to reporting staff with responsibilities in educational development, event planning, and support roles, CME units also reported staff expertise in: 

  • Research and grant writing
  • Accreditation and compliance
  • Business analysis and operations
  • Marketing and communications
  • Academic detailing
  • Strategic affairs and planning 

Developing activities to achieve the missions of the academic medical center (AMC) requires an understanding of the organizational and reporting structures of CME units in modern health care settings. 

Internal Relationships 

The relationships developed within the AMC foster the achievement of goals of the entire center and those of the academic CME unit.  Respondents were provided a list of programs, departments, or units internal to the AMC, which may exist in their respective settings.  These included faculty development programs, library services, conflict of interest committees, medical student or resident educational programs, compliance education, physician performance or quality improvement units, faculty practice plans, continuing education for other health professions, health services research, public health, employee or staff professional development, and public education. 

Further, respondents were asked to describe the relationship between the CME office and each of those programs on a scale ranging from no or minimal interaction to extensive interaction.  In this case, minimal interaction was described as “irregular or occasional activity linked to the program or unit,” while extensive interaction was characterized as “ongoing planning or developmental activity, conjoint programming, shared goals and strategic directions, or shared resources.”  Of the 91 U.S. medical school-based units: 

  • 74% indicated a combined moderate/extensive interaction with continuing education programs for other health professions,
  • 62% described similar interactions with physician or hospital quality improvement programs, and
  • 61% expressed a combined moderate/extensive interaction  with faculty development programs. 

One area represents an important change in the degree of interaction over a five-year period.  Previous Harrison Survey reports ranked relationships with QI/PI functions fifth in frequency in 2008, with combined moderate to extensive interaction at roughly 58%.  In the 2012 survey, QI/PI relationships rank second, and the combined moderate/extensive figure has risen from 58% to 63%, with virtually all of the growth occurring in the ‘extensive interaction’ category. 

Also reported at levels of greater than 50% in this year’s survey was involvement at moderate or extensive levels with conflict of interest committees, with graduate or residency medical education, and with the allied health professions. Several interactions were much less frequently reported, providing examples of opportunities for academic CME, and the AMC itself.  

The 2012 Harrison Survey focuses on four aspects of programming that reflect a growing awareness of the literature driving changes in the delivery of continuing education and professional development activities, making them more effective in the process, namely: 

  • use of evidence-based methods in regular course planning, implementation, and  follow-up
  • use of assessment methods to determine the effect of these activities
  • growing use of alternative methods and strategies to reach a diverse audience  external to the AMC
  • role of faculty development 

Rigorous research evidence, including systematic reviews, demonstrates the positive effect on health professional performance when research-based educational methods are employed.  In particular, this research encourages CME providers to: 

  • Use objective data and understand barriers to change as they plan activities (e.g.,  employing quality data in planning and development).
  • Increase the use of interaction in planned educational sessions (e.g., by using case  discussion methods or simulations, or by providing in-program practice aids such  as flow charts).
  • Employ sequential learning so that practice and education are mutually reinforced. 

Since the first AAMC/SACME Harrison Survey in 2008, respondents have been asked about the use of these methods, along with other activities in pre-activity planning, course development, and post-course evaluation.  

Pre-activity Methods 

Reported here are several pre-activity planning methods.  Among them, planning based on quality metrics to augment subjective needs assessments appears to be an important, evidence-based step.  Of the 145 reporting CME units, 

  • 80% indicated regular use and 19% occasional use of quality metrics in planning.
  • 35% regularly and 47% occasionally developed meaningful interprofessional planning methods.
  • 41% regularly and 37% occasionally considered barriers to changing professional performance.
  • Only 40% of units occasionally or regularly undertook presenter training or professional development to train presenters in more effective methodologies. 

Academic CME providers report using evidence-based educational methods—objective needs assessments, teaching methods such  as interactivity and simulations,  and post-activity follow-up—to a  significantly greater extent over a  five-year period. 

Several evidence-based educational methods in courses and conferences themselves are also reported as occasionally or regularly used by academic CME providers.  Chief among these was the use of ‘interactivity,’ defined as devoting more than a quarter of educational time to case discussion, audience interaction and participation, and/or question/answer sessions.  The majority of CME units reported employing such methods to a significant extent (>25% of programming time), with 60% reporting interactive techniques and methods regularly.  Other effective methods used regularly or occasionally included: 

  • practice facilitators or enablers such as flow charts for use in the practice setting (81%),
  • evidence-based tools and resources (69%),
  • simulations (69%), and
  • team-based learning (64%).
  • Only 45% used methods described as ‘sequential,’ i.e., learning sessions separated by practice periods, in which new knowledge or skills can be acquired and then built upon in further educational sessions. 

Post-program Methods 

Respondents were asked about the use of quality metrics to assess the impact of their programs: 75% did so either regularly or occasionally.  In addition, reflecting the research evidence that reinforcing strategies enable the uptake of knowledge and practice change, respondents were asked if they followed up with program participants post-course—emailing new information, reinforcing commitments to change, asking further questions—and 70% reported doing so, more than half regularly.

Comparison with the first (2008) AAMC/SACME Harrison Survey report provides an instructive means to judge progress in academic CME.  Several items have been tracked since that report’s publication, and the 2012 figures demonstrate progressive change from the 2008 baseline, which showed: 

  • the use of quality or performance improvement as needs assessment occasionally or  regularly (85%)
  • in-program practice enablers (61%)
  • simulations (56%)
  • post-performance quality measures to track the outcomes of educational activities (54%)
  • follow-up methods to reinforce participant learning (64%) 

Beyond the post-course ‘happiness index’, many academic CME providers use measures of competence, self-reported or actual performance data, and (a smaller number) patient and population health data.  New to the 2012 survey, respondents were asked to indicate what percentage of their activities used each type of outcome measure. 

  • 63% of respondents’ activities involved measures of competence (e.g., post-course multiple-choice examinations),
  • 40% employed general, self-reported performance measures,
  • 16% used objective data such as quality measures to track outcomes, and
  • smaller percentages (13% and 9%, respectively) measured actual patient or population health outcomes.  

When asked, “Does your CME unit participate in faculty development activities?” 119 units, or 84%, indicated ‘yes’.  The majority of these activities involved educational or accreditation aspects of faculty development touching on teaching methods in UME, GME, and CME.   

  • 91% had responsibilities of some type for CME teaching,
  • 61% for GME (or, in Canada, PGME) teaching, and
  • 52% for UME teaching improvements.  

This percentage of activity has remained relatively stable over several years.  One hundred and four CME units also reported some responsibility for faculty development activities involving basic research, regulatory matters, or clinical issues.  Ninety-six units (92%) developed faculty-focused activities in clinical issues such as team training or quality improvement, and 60 units (58%) in the area of basic research (e.g., regulatory, conflict of interest, or ethical issues). 

Reaching Out: Serving the Regional Community 

In addition to providing traditional courses and conferences, academic CME providers also reach out to regional community-based practitioners.  Respondents reported using live teleconferencing (audio or video), visiting speakers’ programs, opinion leader and train-the-trainer activities, academic detailing, social networking, and other means.  

Regional health professionals and systems are also served by academic CME providers, using teleconferencing, opinion leader and train-the-trainer programs, and (to a lesser but growing extent) social networking methods.  Academic detailing—outreach visits by trained health professionals—appears to have undergone substantial growth over a five-year period.  Out of 140 respondents, CME providers reported engaging in the following types of activities: 

  • over 80%: live teleconferences (video, audio, webcasts)
  • over 70%: visiting speakers at medical society or community hospital meeting series
  • over 60%: opinion leader/train-the-trainer programs
  • almost 50%: academic detailing
  • over 40%: individual traineeships or tutorials
  • over 35%: communities of practice
  • roughly 35%: individual learning or coaching programs
  • roughly 35%: social networking 

Academic Detailing 

Half (50%) of CME units reported the use of academic detailing—educational visits by trained health professionals to individual physicians or teams.  This method has been demonstrated to improve prescribing and health promotion/screening performance to at least a moderate extent.  Academic CME units have reported a steady rise in the number of such programs, from under 10 in 2008 to almost 70 in 2012 (up from roughly 45 in 2011). 

Academic CME Budgets 

For calendar year 2011, CME units were asked about the size of their total fixed  operating budgets, reflecting a wide spread of means, medians, and budget ranges by  institution type.  Less variability was noted among the eight reporting Canadian medical schools, which indicated budgets with the following characteristics: a median of $1.3 million, a maximum of $4.5 million, and a minimum of $364,000. 

In contrast, the 90 U.S. medical schools reported lower median budget figures (roughly $500,000), but greater variability (an $11.8 million maximum and a $25,000 minimum).  CME fixed operating budget (127 respondents): 

  • Mean budget for U.S. Medical School $797,097
  • Median budget U.S. Medical School $497,506

Even more variability was noted among U.S. teaching hospitals.  Twenty such units reported median figures of $275,000, indicating a maximum of $30 million and a minimum of $5,000. 

  • Mean budget for Teaching Hospital $2,126,509
  • Median budget for Teaching Hospital $275,000, of which $200,000 (roughly 73%) was derived from institutional (hospital and health system) sources. 

Total revenue from institution to CME unit (128 respondents): 

  • Mean U.S. Medical School $235,536
  • Median U.S. Medical School $141,461
  • Mean Teaching Hospital $1,307,074
  • Median Teaching Hospital $200,000 

Among Canadian medical schools, with a median budget of $1.3 million, just under $400,000 (roughly one-third) comes from institutional sources.  U.S. medical schools reported slightly less than $500,000 in median total revenues (though with a wide range) and $141,000 in median institutional support (slightly under one-third).  
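The institutional-support shares discussed in this section follow directly from the reported median figures. A quick arithmetic check, using only the numbers quoted above:

```python
# Median total budget and median institutional support, by institution type,
# using the figures reported in the 2012 Harrison Survey text above.
medians = {
    "Canadian medical school": (1_300_000, 400_000),
    "U.S. medical school": (497_506, 141_461),
    "U.S. teaching hospital": (275_000, 200_000),
}

for site, (budget, support) in medians.items():
    share = support / budget
    print(f"{site}: institutional support is {share:.1%} of the median budget")
```

Both medical-school shares land just under one-third, matching the report's characterization; the teaching-hospital figures work out to roughly 73% of the median budget coming from institutional sources.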

The 2012  survey asked the question, “As a percentage of the fixed CME budget, has institutional  support increased, decreased, or stayed the same in the last year?”   

  • 54% of respondents indicated similar year-to-year support,
  • 16% indicated an increase in support, and
  • 30% a decrease. 

Research Activity 

Research activity is reported by a sizable minority of academic units, primarily in medical schools—totaling millions of dollars and over two hundred studies.  Research includes formal evaluation processes related to physician or health professional learning, the effect of CME, the outcomes derived from educational activities, and related matters.  Some research was externally funded through peer-reviewed grants or commercial sources, and some internally funded.  In the U.S. and Canada, 43 units reported research activity.  These units undertook:  

  • a median of two research studies per unit.
  • 22 respondents reported receiving grant support for CME-related research studies. 
  • The maximum reported grant among those with support was almost $10 million. 
  • The minimum was $20,000, with a mean of $924,337 and a median of $70,000. 

Collaboration Within and Across CME Units 

Those units reporting research activity were then asked to what extent these studies were collaborative within the institution and/or multi-institutional.  The majority of respondents reported undertaking both: 86.4% (51 respondents) said they engaged in collaborative research within their own institution, while 57.9% (33 respondents) reported multi-institutional collaborative research with other medical institutions.  

Examples of research include: outreach activities, PI-CME or quality improvement strategies, personal learning programs, faculty development, and others; knowledge translation or implementation science; new technologies such as those in social networking or blended learning; new audiences including allied health professionals, patients, and public members; planning and assessment strategies; and outcomes assessment methods.  A host of administrative, financial, organizational, and collaborative activities were also described. 

Limitations of Survey 

There are several limitations to the interpretation of this survey.  First, the survey is based on a self-reported questionnaire to which 77% of academic CME units responded; responses are thus absent from roughly one-quarter of academic CME units.  Notably, traditionally defined teaching hospitals in the U.S. have not generally been the target of these surveys and are not, as a rule, members of the major co-sponsor of this survey—the Society for Academic Continuing Medical Education.  

Further, between-year comparisons may be marred by a sampling of non-identical CME units and by questions which have been worded slightly differently, the result of an ongoing process to improve the clarity of the questions asked.  Finally, the wide variation in reporting of some figures (e.g., those related to the budget) makes judgment difficult and casts some doubt on respondents’ understanding of specific questions; median figures were used in this regard.  

Academic CME: Internal Alignment and Value  

The first notable feature of this year’s report, when compared to similar surveys over a five-year period, is the increasing linkage of the academic CME unit to the quality and performance improvement programs and initiatives of the hospital and health system.  In particular, extensive interaction has grown from below 10% to more than 15% since 2008, and the relationship moved from fifth to second place during this time.  Also apparent is a well-developed internal relationship with CE programs for other health professions, with GME, and, notably, with faculty development.  These linkages across the medical school, teaching hospital, and clinical settings make the traditional boundaries between teaching hospital and medical school appear arbitrary at best, leading to the clear conclusion that there exists an entity—academic CME—with significant roles and demonstrated impact within the AMC.  

The process of alignment, however, is not universal across sites and systems and appears to neglect areas of possible interest to continuing education providers and the AMC— namely building relationships with faculty practice plans, undergraduate medical education, hospital accreditation, and other functions.  Further, staffing of academic CME units rarely includes individuals skilled in such areas as quality measurement or performance improvement.  At a minimum, these observations identify missed opportunities.  

Academic CME providers appear to use a variety of methods to assess competence, performance, and even patient outcomes to evaluate their impact on the health system.  A further, if arguable, means of assessing the degree of integration of academic CME into the mission of the AMC is to judge the extent of funding support from institutional sources.  Comparing median institutional support to full-budget figures suggests a sizable commitment on the part of most, if not all, institutions that support academic CME. 

Research, Scholarship, and Evidence-based Education: How Academic is ‘Academic’ CME?  

First, and most notably, academic CME programmatic commitment appears to be more evidence-based over a five-year period, i.e., employing educational methods that have been demonstrated to more frequently change clinical performance.  These methods include pre-activity planning (e.g., using quality metrics as an objective means to determine performance), intraprogram activities (sequencing of learning activities, creating interactivity, and using enabling materials such as flow sheets and patient educational materials to support change in the clinical setting), and post-course methods to reinforce learning.  The growth in use of these methods over a five-year period is notable.  

This shift from traditional, didactic, and, thus, marginally effective CME, reflects more than just a contemporary trend. Instead, it demonstrates—much as the practices of academic physicians do—the uptake of best evidence with regards to practice.  In this case, the practice is educational and reflects the undertaking of scholarship, research, and study, to analyze the effect of educational interventions, to test new ones, and to study their outcomes.  

Second, it appears that there is a reasonably steady, if still relatively small, cohort of CME units which are committed to scholarship and which contribute to the research enterprise in health professional learning and change. This commitment is the product of collaboration both within and across AMCs, and is derived from funding sources internal and external to the institution.  

The 2012 Harrison Survey also makes apparent a strong commitment, equal to its internal integration, to regional community-based hospitals, health systems, and health professionals.  This is reflected in a growing array of educational methods such as teleconferencing, online learning activities, opinion leader and train-the-trainer programs, and the growing use of social networking to link to community-based health professionals. In addition, the growth of academic detailing during this time further shows CME providers’ innovation and attention to effective educational engagement and an awareness of external funding opportunities in this area.  Finally, this regional alignment is important to considerations of accountable-care structures in which community-based health professionals—and the linkage that academic CME represents to them—play a large and important role. 

Conclusion 

Academic CME demonstrates, despite external financial and regulatory pressures (and perhaps because of them), several major features: evidence of an increasing integration into the functions of the AMC; an uptake in the use of effective educational methods; a wide variety of outreach activities geared to the needs of the communities served by AMCs; and an impressive, if not yet widespread, record in scholarly activities and best practices.  

Challenges are also presented in this report.  Academic CME providers, AMC leadership, and faculty need to create significant, functional alignments between CME and other relevant internal units, and to enhance community engagement strategies and methods.  Support for, and understanding of, the potential role of academic CME units in scholarship, in improving patient care, and in achieving other missions of the AMC remain incomplete and patchy.
