
Credentialing is the process of assessing and confirming the qualifications of a licensed, registered, or certified healthcare professional. The main goal of the credentialing process is to ensure that health professionals such as physicians, dentists, registered nurses, and others are skilled and knowledgeable about the current best practices of appropriate and effective care. To be responsible to the public and to meet legal obligations, healthcare organizations must verify the competency of their staff members. Credentialing should be conducted by an independent third party to ensure the accuracy of the information obtained on the staff members. Some of the elements that are normally verified in the credentialing process include the individual's current licensure; relevant education, training, or experience; current competence; and health fitness or the ability to perform the required tasks. Requirements of credentialing, however, vary depending on specialty or area of practice. For example, an internship or residency may not be deemed necessary to ensure that a laboratory technician has the appropriate knowledge and experience to perform his or her job; surgeons, on the other hand, are required to complete lengthy and ongoing training activities.

Background

The general public's awareness of the importance of credentialing has grown over the years. In the past, what health practitioners learned varied widely across specialty areas and schools, especially in the field of medicine. In the 19th century, the majority of medical schools in the United States were run for profit; they were not associated with a university or college, and their curricula lacked extensive hands-on learning opportunities such as laboratory work or dissection. As a result, many poorly trained physicians entered the profession, patients suffered high mortality rates, and the public's faith in the medical field was low. Communities found it difficult to certify physicians because no established guidelines existed by which their training could be assessed.

In the early 1900s, a number of professional medical organizations advocated for the establishment of stricter, science-based, national requirements for medical education. As part of this effort, the American Medical Association (AMA) and its Council on Medical Education sought an assessment of the current state of medical training. With funding from the Carnegie Foundation, Abraham Flexner (1866–1959), a professional educator, was hired to conduct on-site visits to assess all medical schools in North America. Flexner compiled his findings in a landmark report, Medical Education in the United States and Canada, published in 1910. The Flexner Report, as it became known, criticized the state of medical education and the training process, and Flexner made a number of recommendations. Specifically, he recommended that medical schools be integrated with colleges or universities, that the length of education be extended to at least 4 years, and that curriculum content be agreed on and standardized by a reputable body. The report's findings led to significant changes in the nation's medical education, including more standardized curricula for medical students. Its findings also carried over to the areas of accreditation and credentialing.

...
