The Kappa coefficient can be thought of as a way of quantifying the amount of agreement between raters beyond what would be expected by chance. It distinguishes chance agreement (agreement that would occur if the raters assigned ratings independently at random) from real agreement (agreement above and beyond chance). The coefficient equals one when the raters agree perfectly, zero when the observed agreement is no better than chance, and less than zero when agreement is worse than would be expected by chance.
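To make this concrete, Cohen's kappa is computed as (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement and p_e is the proportion of agreement expected by chance from each rater's marginal frequencies. The following is a minimal sketch in Python; the two-rater ratings are illustrative placeholder data, not taken from the article.

```python
# Minimal sketch: Cohen's kappa for two raters rating the same items.
# The ratings below are hypothetical example data.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Return Cohen's kappa: (p_o - p_e) / (1 - p_e)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)

    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: for each category, the product of the two raters'
    # marginal probabilities, summed over all categories.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(rater_a) | set(rater_b))

    return (p_o - p_e) / (1 - p_e)

# Example: two raters classifying 10 cases as "pos" or "neg".
a = ["pos", "pos", "neg", "pos", "neg", "neg", "pos", "neg", "pos", "pos"]
b = ["pos", "neg", "neg", "pos", "neg", "pos", "pos", "neg", "pos", "pos"]
print(round(cohens_kappa(a, b), 2))  # 0.58: agreement moderately above chance
```

In this example the raters agree on 8 of 10 cases (p_o = 0.80), but because both label most cases "pos", chance alone would produce agreement on about half of them (p_e = 0.52), so kappa comes out around 0.58 rather than 0.80.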