Review
Recommendations for pathologic practice using digital pathology: consensus report of the Korean Society of Pathologists
Yosep Chong1, Dae Cheol Kim2, Chan Kwon Jung1, Dong-chul Kim3, Sang Yong Song4, Hee Jae Joo5, Sang-Yeop Yi6, Medical Informatics Study Group of the Korean Society of Pathologists
Journal of Pathology and Translational Medicine 2020;54(6):437-452.
DOI: https://doi.org/10.4132/jptm.2020.08.27
Published online: October 8, 2020

1Department of Hospital Pathology, College of Medicine, The Catholic University of Korea, Seoul, Korea

2Department of Pathology, Dong-A University College of Medicine, Busan, Korea

3Department of Pathology, Seoul Clinical Laboratories, Yongin, Korea

4Department of Pathology, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul, Korea

5Department of Pathology, TCM Laboratory, Seongnam, Korea

6Department of Pathology, Catholic Kwandong University College of Medicine, Gangneung, Korea

Corresponding Author: Sang-Yeop Yi, MD, PhD, Department of Pathology, Catholic Kwandong University College of Medicine, 25 Simgok-ro 100beon-gil, Seo-gu, Incheon 22711, Korea. Tel: +82-32-290-2972, Fax: +82-32-290-3440, E-mail: pathysy@naver.com
Received: July 14, 2020; Accepted: August 27, 2020

© 2020 The Korean Society of Pathologists/The Korean Society for Cytopathology

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Digital pathology (DP) using whole slide imaging (WSI) is becoming a fundamental issue in pathology with recent advances and the rapid development of associated technologies. However, the available evidence on its diagnostic uses and practical advice for pathologists on implementing DP remains insufficient, particularly in light of the exponential growth of this industry. To inform DP implementation in Korea, we developed relevant and timely recommendations. We first performed a literature review of DP guidelines, recommendations, and position papers from major countries, as well as a review of relevant studies validating WSI. Based on that information, we prepared a draft. After several revisions, we released this draft to the public and the members of the Korean Society of Pathologists through our homepage and held an open forum for interested parties. Through that process, this final manuscript has been prepared. This recommendation contains an overview describing the background, objectives, scope of application, and basic terminology; guidelines and considerations for the hardware and software used in DP systems and the validation required for DP implementation; conclusions; and references and appendices, including literature on DP from major countries and WSI validation studies.
Digital pathology (DP) refers to the use of a digital scanner to convert and save pathologic slides as digital images and the use of those images for pathologic diagnosis [1,2]. DP includes various processes of pathologic diagnosis, including primary diagnosis based on whole slide imaging (WSI) that enables diagnosis through display devices such as monitors instead of a microscope; telepathology that allows opinions to be obtained and digital images to be shared with experts at different locations over a network connection; and computer-aided diagnosis based on quantitative and morphometric analyses made using image analysis software [1].
Because the implementation of DP is based on automatic sample tracking systems, it can prevent sample contamination and switching caused by human error and play an important role in patient safety [3,4]. Moreover, DP allows easier and faster consultation between experts, which could minimize problems such as overtreatment and loss of treatment opportunity due to differences in diagnosis [3-5]. Furthermore, the permanent storage of high-resolution digital images as data eliminates the risks of discoloration, damage, and loss of glass slides over time while also enabling efficient, accurate, and comprehensive reading by allowing pathologists to easily compare current results with past pathologic examinations [3,4]. In addition, the future development of digital image sharing systems among medical institutions could significantly reduce the healthcare costs associated with the production of additional slides and duplicate tests when a patient transfers to another hospital [3,4].
Recent advances in relevant technologies and equipment have led many countries to implement DP systems for primary diagnosis using WSI. Based on the results of recent studies showing high concordance between primary WSI-based diagnosis and conventional pathologic diagnosis using a microscope [5,6], the United States, European Union, and Japan are already accepting the use of DP for medical practice in clinical settings, with the US Food and Drug Administration (FDA) approving the registration of WSI equipment as medical devices in 2017 [5,6]. Moreover, teleconsultation and telediagnosis using DP have been approved, with Japan creating charges associated with DP systems in its insurance system in 2018 [7].
However, because the associated technologies are still maturing, more data and experience are required to determine whether and to what extent primary diagnoses based on WSI can safely replace conventional pathologic diagnoses using a microscope.
Accordingly, the United States, Japan, and European countries, including the United Kingdom, Spain, and Germany, have released national-level guidelines for the safe implementation and quality assessment of DP to promote the active application of this technique (Table 1) [2,3,5-30]. Those guidelines suggest detailed principles to inform primary diagnosis based on WSI, teleconsultation, and telepathology; considerations regarding hardware, such as digital scanners, medical image archiving and communication systems, and image display systems; considerations for image acquisition, viewing, archiving, and analysis software; strategies for connection to laboratory information systems; and basic principles for the validation and quality assessment of DP systems.
As of 2019, several university hospitals, including the Catholic University of Korea Seoul St. Mary’s Hospital, Seoul National University Hospital, and Samsung Medical Center, have introduced DP systems and started making primary diagnoses based on WSI, although it has not yet fully replaced conventional microscopic diagnoses [3].
To keep pace with these domestic and international trends, the Korean Society of Pathologists (KSP) launched a research project in June 2019 by which the members of the Medical Informatics Study Group (MISG) were asked to develop a recommendation for pathologic practices using DP. Accordingly, the MISG spent 5 months conducting a literature search and comprehensive review, proposing recommendation principles, and establishing a draft recommendation through 4 rounds of discussion and solicitation of expert opinions. On November 20, 2019, the MISG held an open public forum to hear from relevant corporations, stakeholders, and regulatory authorities and then revised the draft recommendations accordingly.
This recommendation was prepared based on DP guidelines released by the U.S. Digital Pathology Association [29], College of American Pathologists (CAP) [30], Royal College of Pathologists (RCP, UK) [6], Canadian Association of Pathologists [11], Royal College of Pathologists of Australasia [25], Federal Association of German Pathologists (FAGP) [10], Japanese Society of Pathology (JSP) [5,7,26], and Spanish Society of Anatomic Pathology [9] and the final report of the “Preparation of Reimbursement Assessment Guidelines for AI-based Medical Technology (Pathology)” research project funded by the Health Insurance Review and Assessment Service and carried out and published by the KSP Committee of Informatics in August 2019 [3].
The goal for this recommendation is to include only the most critical and fundamental principles suitable for the current situation and environment in Korea based on the opinions of the experts in the MISG and various stakeholders. DP will be broadly applied only after rationalization of the implementation costs, additional technical advances, the establishment of an adequate reimbursement policy within the domestic health insurance system, and the resolution of technical issues in clinical practice. Accordingly, it will be necessary to regularly update this recommendation to promptly integrate relevant knowledge and additional considerations as they evolve along with the DP environment.
The Korean version of this report is also provided separately as Supplementary Material 1.
This study was approved by the Institutional Review Board of the Catholic University of Korea, College of Medicine (SC19ZCSI0173). We searched major electronic databases from January 1, 2000, to May 31, 2019, for relevant literature on DP guidelines, recommendations, and position papers published or officially announced from major countries. In addition, all the relevant studies validating DP and WSI, including original articles and reviews, were searched. The searched databases included MEDLINE (PubMed) and Google Scholar. After reviewing the relevant articles, more references were added by cross-referencing (Tables 1, 2) [2,3,5-51]. We next selected a dozen references, mostly recent versions of DP guidelines and recommendations from the major countries and recent reviews of DP and WSI, to extract relevant opinions applicable to the current Korean environment. Synthesizing them, we prepared a draft of this recommendation and revised it through 4 rounds of discussion among experts. On November 20, 2019, the MISG of the KSP held an open forum to present the draft to the public and hear from interested parties, including companies, industry and academic associates, and stakeholders. The draft recommendation was also released on the KSP webpage, and opinions and suggestions from KSP members were accepted and considered. After this hearing, relevant revisions were made based on the submitted opinions.
Objectives
This recommendation suggests that the following objectives should be recognized and considered when implementing and operating DP systems in pathology labs.
1) To define the standard terminology generally used for DP.
2) To regulate the scope and boundaries of DP application and suggest its evidentiary foundation.
3) To present institutional-level considerations regarding the hardware and software needed to implement DP systems.
4) To provide relevant guidelines and considerations for the validation and in-house quality control (QC) of DP systems during implementation and operation.
That information is expected to provide background knowledge about DP and serve as a reference for preparing checklists for the Quality Control Program of the KSP (The Red Book), the external QC program (proficiency tests), and resident training programs regarding DP. Furthermore, this recommendation also provides basic information toward establishing an appropriate reimbursement system, including incorporation into the coverage of the National Health Insurance System.
Scope of application
This recommendation includes guidelines and considerations for the implementation and operation of the hardware, systems, and software listed below, which can be used in DP systems for WSI-based diagnosis.
1) Whole slide scanner (WSS) used to generate and acquire digital images from glass slides and image acquisition software used to operate this scanner.
2) Pathology picture archiving and communication system (PACS) and image storage systems for saving, archiving, managing, and preserving the digital images acquired by image input devices.
3) Image viewing software that allows the observation of DP images through image display devices and records measurements or annotations.
4) Image display devices used for image observation (monitors and displays).
5) Network or data sharing systems used for transmitting images.
This recommendation does not cover the acquisition, transmission, observation, and annotation of microscopic images acquired with digital cameras set up on microscopes, smartphones, or tablets (digitizer pen tablets).
In terms of staining methods and sample types applicable to WSI-based diagnosis using DP systems, most basic tissue slides stained with hematoxylin and eosin (H&E), most special stains, immunohistochemical stains, and frozen section slides are expected to be applicable, though they will require appropriate validation studies followed by trial periods until the users reach a stable learning level [30]. However, for the few special cases listed below, a higher level of validation and longer period of trial operation are recommended.
1) Cytology slides: In most cases with cell smears, liquid-based cytology, or blood smears, the results are similar to those from microscopic diagnoses made using glass slides with appropriate focus stacking (Z-stacking) during scanning and sufficient validation [52]. However, the optimal focus stacking method must be carefully selected for certain sample types (e.g., samples with many 3-dimensional structures, such as thyroid gland fine-needle aspirates [53]), staining conditions, and smear conditions. Excessive focus stacking and the acquisition of higher-magnification images (60×, 100×, or higher) could lead to excessively large WSI files, which could negatively affect the operation of the entire system [6]. Therefore, the determination of optimal scanning conditions during the implementation of DP for cytology slides is essential and should be based on a balance between desirable scan quality and file size. Partial image acquisition of the slides is not recommended because it can severely impair diagnostic integrity and accuracy. In conclusion, importing cytology slides into a DP system requires a more extensive validation and trial operation period than is needed for other types of slides, including simultaneous comparison periods of diagnostic results using both WSI- and microscope-based diagnosis.
2) Samples clinically or morphologically suspected of lymphoreticular neoplasms: Lymphoreticular neoplasms have similar morphology at low magnification and similar nuclear features, such as chromatin patterns and nucleoli features, that are important for histologic diagnosis [5,37,48,54]. Accordingly, the acquisition of high-magnification images is generally recommended. Although evidence is lacking about diagnostic agreement at different scanning magnifications, results to date have shown a major discordance rate of less than 1% between WSI-based and microscopic diagnoses using a basic 20× scan [54]. Moreover, the recorded 6.25% minor discordance rate was mostly due to differences in grading follicular lymphoma and was similar to the inter- and intra-observer diagnostic discordance in microscopic diagnosis [54]. Because pathologic diagnosis of lymphoreticular neoplasms is almost always made in combination with the results of additional tests, such as immunohistochemical staining, diagnostic differences in the findings of H&E slide images alone do not seem to significantly affect diagnostic accuracy. On the other hand, the image comparison function available in the image viewing software of DP systems might offer the benefit of improved accuracy in the interpretation of immunohistochemical staining. Carefully designed validation studies and trial operations based on those considerations are needed.
3) Detection of microorganisms such as Helicobacter pylori: The detection of H. pylori infections through the microscopic examination of gastric biopsy tissue samples, especially when special staining such as the Giemsa stain is used, showed specificity and sensitivity comparable to that in other H. pylori tests [5]. However, concerns remain about whether microorganisms such as H. pylori can be detected successfully in the 40×-scanned WSI most commonly used today [5,44,55]. A recent study demonstrated that increasing the number of focus stacking layers provided results similar to those obtained via microscopic examination [56]. Without focus stacking, WSI-based detection showed an impaired sensitivity of 0.562 and specificity of 0.818 [57]. Therefore, when detecting microorganisms, special care is required, such as using appropriate focus stacking for assessment and mentioning the limitations of such examination results in the report.
Basic terminology
– Digital pathology (DP): Dynamic imaging environment (or academic field related to this environment) that involves the acquisition and management of pathologic information by converting microscopic glass slides into digital files and the pathologic diagnosis and interpretation of those images by means of an image display device. The scope of application includes education, research, image analysis, archiving, retrieval, connection to laboratory information systems, consultations among specialists, and image sharing.
– Digital pathology system (DPS): Image data–based computer system that enables the collection, management, and interpretation of pathologic information by digitalizing glass slides. It includes a scanner that contains an optical microscope and digital camera connected to a computer, software, and a network connection.
– Digital image analysis: Analytical method for quantifying or detecting the unique features of enhanced or processed digital images using a computer, such as chromosomal and morphometric analyses of fluorescence in situ hybridization or immunohistochemical staining images.
– Computer-aided diagnosis: Computerized assistance in the interpretation of medical images, such as providing differential diagnoses or detecting lesions by means of a digital image analysis.
– Telepathology: Digital or real-time pathologic image communication environment using wired or wireless networks or a related academic field. Telepathology could be used widely for consultation with specialists in other areas or the diagnosis of samples in a remote facility [11]. Generally two methods are available: the conventional method uses a remote-controlled microscope for the real-time transmission of glass slide images, and the DP method transmits WSIs over a wired or wireless network [11].
– Whole slide image/imaging (WSI): A single high-resolution glass slide image file or associated technology that has been scanned and converted from a single glass slide using a whole slide scanner. With this high-resolution copy or mirrored image of a glass slide with equivalent quality, image viewing software can create a virtual environment for pathologic diagnosis that mimics the conventional pathology environment of microscopic diagnosis. WSIs are also referred to as virtual slides or virtual microscopy.
– Image input device: The initial processing device for converting actual images, including glass slides, into electronic signals and recording them as digital data.
– Whole slide scanner (WSS): Device used to scan glass slides and digitally convert them to WSIs. A WSS is generally run by image acquisition (operating) software, and a WSI is generated by combining multiple small, continuously acquired, high-resolution image tiles or strips at various magnifications, such as 20×, 40×, 60×, or 100× (corresponding to 200, 400, 600, or 1,000 times magnification under a general light microscope). The digital image data can be saved using a variety of compression methods.
– Focus stacking (Z-stacking): An image processing technique that combines digital images acquired at varying focus levels to obtain a much greater depth of field than that of any individual original image. When obtaining images of samples with many 3-dimensional microstructures and cell clusters, such as cytology slides, it is difficult to obtain the appropriate depth of field with a single focus. Thus, multiple images at slightly different levels of the Z-axis are combined using various image processing methods to convert them into a single image file.
– Image acquisition software: Computer software that operates and controls the WSS device to allow images to be acquired and saved using the appropriate format, compression rate, and compression method.
– Image viewing software: Computer software that makes acquired image data viewable through an image display device such as a monitor. This software can also provide observation functions to compare two or more images, pan the image laterally, or zoom in and out of areas of interest, along with other functions such as making basic length and area measurements, saving screenshots in compatible image file formats, and recording user annotations during review.
– Image database system: Computer system and software used for the compression, management, and mass storage of acquired image data.
– Picture archiving and communication system (PACS): A system that archives, processes, and transmits digital medical images in accordance with the international standard Digital Imaging and Communications in Medicine (DICOM) format. A PACS comprises image viewing and archiving software, a mass storage device, and a computer hardware system. Its typical functions include data archiving and transfer, including text data such as interpretation reports and data acquired by medical imaging devices (computed tomography, magnetic resonance imaging, etc.). Systems based on a similar concept include a pathology PACS that manages pathologic images.
– Laboratory information management system (LIMS): Also referred to as a laboratory information system (LIS), this software-based system is designed to manage information related to the overall operation of a laboratory.
– Electronic medical record (EMR): An EMR contains a patient’s digital medical information, all the data obtained during diagnosis and treatment. The EMR and order communication system constitute a hospital information system (HIS), which is vital in the digitalization of medicine.
– Quality assurance (QA): Activities performed by a QC manager to ensure that certain material, data, products, or services (in this recommendation draft, QA refers to examination services inside a laboratory) have functions or results that comply with or satisfy established technical requirements.
– Quality control (QC): QC, also referred to as quality management, refers to laboratory analysis activities designed to improve the quality of test results by detecting and correcting defects that can occur during the experimental processes of all tests conducted within a laboratory. QC can be divided into internal QC, standard operating procedures and regulations set by the laboratory itself, and external QC, verified and approved by the FDA, member organizations of the International Laboratory Accreditation Cooperation, or agencies that operate proficiency assessment programs in accordance with international standards.
– Validation: Validation describes the process of confirming whether equipment, reagents, and test methods that have already been verified can be appropriately applied to the individual laboratory in question according to certain standards before they are implemented. Validation should be established by documents that provide a high level of assurance.
Considerations for the hardware and software used in DP systems
This recommendation combines the essence of various research articles and guidelines, position papers, and instructions announced by major countries to present the functional requirements for DP hardware and software, along with related information to be considered when implementing a DP system. Using this information, each institution can select an appropriate system suitable for its particular circumstances when implementing a DP system, training experts to manage the DP system, and educating the pathologists and residents who will be using the DP system.
Considerations and recommended functional requirements for a WSS

Considerations for a WSS

The process of scanning to acquire images might be the most significant aspect of the DP system. When implementing a DP system using a WSS, it is important to understand that a WSI is a high-resolution copy of a glass slide image [6]. In other words, the actual glass slide image might not be 100% accurately replicated into a digital image due to various factors involved in image acquisition using a scanner. During the scanning process, some image data can be missed or inadvertently omitted because of inappropriately set scan parameters or tissue samples that are too thin or too small (e.g., fine needle aspirate of breast fat tissue or highly necrotic tissue) or an automatic tissue detection system error [6,10]. Therefore, the person in charge must verify that all important areas of interest are included in the scan range and prepare a plan to prevent errors that can occur for those reasons. Commercially available WSSs are listed in Supplementary Table S1 organized by manufacturer.

Recommended functional requirements for a WSS

The WSS requires an optical system that can illuminate glass slides using a bright field method as well as appropriate luminous intensity to scan the entire area [7,58]. Users should have detailed information about the light source, color temperature, and mode of illumination in the optical system [7].
The scanner is recommended to have an optical system capable of at least 40× magnification [6,7,58]. Resolutions of 500 and 250 nm per pixel can be achieved with 20× and 40× scans, respectively; thus, the scan magnification should be set based on the type of sample and purpose of the test.
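As a rough illustration of how the scan magnification drives image size, the sketch below converts the per-pixel resolutions quoted above into pixel dimensions and uncompressed data volume. The tissue area and 3 bytes per RGB pixel are assumptions for illustration only; actual stored files are far smaller after compression and vary by scanner and file format.

```python
# Illustrative estimate only: relates scan magnification to pixel count and
# uncompressed image size for a hypothetical tissue area.
RESOLUTION_UM_PER_PIXEL = {20: 0.5, 40: 0.25}  # nominal values quoted in the text

def uncompressed_size_gb(width_mm: float, height_mm: float, magnification: int) -> float:
    """Estimate the uncompressed RGB image size (gigabytes) of a scanned area."""
    um_per_px = RESOLUTION_UM_PER_PIXEL[magnification]
    width_px = width_mm * 1000 / um_per_px
    height_px = height_mm * 1000 / um_per_px
    return width_px * height_px * 3 / 1e9  # 3 bytes per pixel (24-bit color), assumed

# Example: a 15 mm x 15 mm tissue area
for mag in (20, 40):
    print(f"{mag}x scan: ~{uncompressed_size_gb(15, 15, mag):.1f} GB uncompressed")
```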
The scanner must be designed such that the glass slide is safely maintained without being damaged, dislodged, or shifted during the slide exchange process as a scanned glass slide is removed and the next glass slide to be scanned is mounted [58]. A slide that is not firmly secured could shift slightly during this exchange, potentially creating artifacts in the scanned digital image [6].
The scanner must have an identification function, such as the ability to scan and recognize the labels of glass slides or identification information such as barcodes or QR codes to match the information from a slide to its digital image [4,7,58]. Scanners that support an identification function using a barcode or QR code play an important role in the automation of pathology laboratories and could help to reduce errors such as switched samples [4].
In addition to acquiring magnified images, the scanner should provide overview images (also called preview or macro-images) to allow users to observe the entire tissue on a glass slide in a single view [7,58]. During image acquisition, the horizontal and vertical resolution must be the same, and the color range and gradation of color images must satisfy the quality standard designated by the manufacturer [6,58].
The scanner must have an auto-focus function that satisfies the quality standard designated by the manufacturer, and it must be able to tell the user whether the image was successfully scanned with the normal auto-focus function [7,58].
The scanner must provide a function that allows users to check whether the digital image was scanned satisfactorily [7,58]. The manager who evaluates the quality of scanned images must carefully examine whether faint stains, pen marks, foreign objects, air bubbles during sealing, or damage to the cover slide affected the quality of scanned digital images and whether errors such as misalignment of strips or tiles when combining have occurred [6]. A work process enabling screening for those and similar factors must be established.
When implementing a DP system, the scanner type, performance, and quantity must be selected based on the scale of testing at the institution, sample size, sample type, tests being applied, required turn-around time (TAT), and amount of labor required to carry out the scan work [4]. The number of scanners appropriate for each institution can be calculated by determining the total time needed for scanning, available time for scanning, and scanner utilization rate [4]. The total time needed for scanning can be calculated by multiplying the number of samples to be scanned and average scan time per slide of the equipment; however, other factors, such as the time required to mount and dismount slides, time required for maintenance and repair, changes in workflow, and available labor, must also be considered [4]. Moreover, even digital images acquired from the same glass slide can show slight differences in saturation, color density, and color temperature depending on the scanner manufacturer [59]; thus, comparison tests of suitable equipment from various vendors should be performed before system implementation [55].
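The scanner-count estimate described above reduces to a simple calculation. The following is a minimal sketch; the slide volume, scan and handling times, available hours, and utilization rate are hypothetical placeholders that each institution would replace with its own measured figures.

```python
# Minimal sketch of the scanner-count estimate; all input values are hypothetical.
import math

def scanners_needed(slides_per_day: int,
                    scan_minutes_per_slide: float,
                    handling_minutes_per_slide: float,
                    available_hours_per_day: float,
                    utilization_rate: float) -> int:
    """Estimate the number of scanners required to cover the daily slide volume."""
    total_minutes = slides_per_day * (scan_minutes_per_slide + handling_minutes_per_slide)
    effective_minutes_per_scanner = available_hours_per_day * 60 * utilization_rate
    return math.ceil(total_minutes / effective_minutes_per_scanner)

# Example: 800 slides/day, 2 min scan + 0.5 min mounting/dismounting per slide,
# 20 h of scanner availability per day, 80% utilization
print(scanners_needed(800, 2.0, 0.5, 20, 0.8))  # -> 3 scanners
```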
Considerations and recommended functional requirements for image database systems

Considerations for image database systems

An image database system comprises a computer system to manage image data, a storage device such as a server, and image-archiving software related to the database that manages data storage [6,7,58]. Data can also be stored on a hard disk drive managed by the computer operating system (OS) without image archiving software [10]. In the case of an image database system, a pathology PACS using an independent server is recommended to accommodate the size of digital pathologic image data and the amount of data transmitted. However, depending on the situation and needs of each individual institution, data storage could also be integrated and use the same server as the general institutional PACS [6,7].
Each institution should determine how long digital image data should be preserved. The guidelines by CAP (USA), RCP (UK), and FAGP (Germany) recommend data preservation for at least 10 years, whereas the guidelines from the JSP (Japan) recommend permanent preservation, with a minimum of 5 years [6,10,26,30]. The JSP guidelines also recommend that data from the past 5 years be preserved in hot storage, meaning that they are available for immediate use [26]. For reference, the preservation period recommended by the Medical Act of South Korea for glass slides containing pathological tissue is 5 years, which is the requirement for test records or findings among general medical records.
Each institution must set a preservation period based on its particular situation and the relevant laws.
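For planning purposes, the chosen preservation period translates directly into required storage capacity. The sketch below is a back-of-the-envelope estimate only; the annual slide volume, average compressed WSI size, and redundancy factor are assumptions, not recommended values.

```python
# Illustrative archive sizing for a chosen preservation period; inputs are assumptions.
def archive_size_tb(slides_per_year: int,
                    avg_compressed_gb_per_slide: float,
                    preservation_years: int,
                    redundancy_factor: float = 2.0) -> float:
    """Estimate total storage (terabytes), including a mirroring/backup factor."""
    raw_tb = slides_per_year * avg_compressed_gb_per_slide * preservation_years / 1000
    return raw_tb * redundancy_factor

# Example: 200,000 slides/year, ~1 GB per compressed WSI, 10-year retention, one mirrored copy
print(f"~{archive_size_tb(200_000, 1.0, 10):.0f} TB")  # ~4000 TB
```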
During implementation of an image database system, institutional IT specialists should be consulted because their cooperation is required for integration with the LIMS, connection to electronic health records that include clinical information, compliance with the institution’s information security policies, and seamless interconnection and integration with existing PACS [6,7,10].

Recommended functional requirements for image database systems

The image database system must be able to guarantee that the identification information of the glass slide matches that of the digital image [26]. Moreover, even if the version of the image archiving software changes, using a preserved image should not be problematic, and the possibility that digital images could be damaged by overheating of the storage device or recording medium should be eliminated [7].
The storage method must incorporate backup or mirroring to ensure that data are safely preserved (e.g., using network attached storage or a redundant array of inexpensive disks) [10].
The image database system (or software) could support the DICOM format, the standard file format for medical imaging designated by the American College of Radiology and National Electrical Manufacturers Association, to ensure compatibility with other scanners or PACS [7,10]. This is an important function that must be considered when implementing image database systems to enable future data exchange or transmission to other institutions and combined use with other image acquisition/storage devices within the institution [7,10].
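As an illustration of how DICOM compatibility supports such identity checks, the sketch below (assuming the pydicom package and a DICOM-format WSI file) reads standard header tags and compares the accession number against the LIS record; the file name and accession number shown are hypothetical.

```python
# Minimal sketch: verify that a stored DICOM image still carries the identifiers
# needed to match it to the LIS record. File path and accession number are hypothetical.
import pydicom

def check_slide_identity(dicom_path: str, expected_accession: str) -> bool:
    """Read standard DICOM identification tags and compare with the LIS record."""
    ds = pydicom.dcmread(dicom_path, stop_before_pixels=True)  # header only
    print("Modality:          ", ds.get("Modality"))
    print("Patient ID:        ", ds.get("PatientID"))
    print("Accession number:  ", ds.get("AccessionNumber"))
    print("Study instance UID:", ds.get("StudyInstanceUID"))
    return ds.get("AccessionNumber") == expected_accession

# Example (hypothetical file and accession number):
# check_slide_identity("S21-01234_A1_HE.dcm", "S21-01234")
```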
Considerations and recommended functional requirements for image display devices and image viewing software

Considerations for image display devices and image viewing software

Image display devices include flat-screen monitors, occasionally with touch-input function. The devices can be portable, such as a tablet PC, for telepathology [7,58]. The image viewing software should run on various platforms suitable for the image display devices used by an institution and should support operation over a network connection [58].
Image display devices, including monitors, are part of an imaging chain (also called a visualization pipeline), as are optical components such as scanner lenses and image acquisition components such as a charge-coupled device, an electronic component for data processing [6,10]. Consideration should be given to the following factors that determine the quality of image display devices: the type of display (such as the size of the device), the structure of the light source (light-emitting or light-receiving), the liquid crystal alignment mode (in-plane switching or vertical alignment), and the structure of the liquid-crystal display and flat panel; the mechanical characteristics of the device, including the resolution (dots per inch), luminance, contrast ratio and contrast, color temperature, color profile of the monitor, viewing angle, response rate, image retention, and burn-in [6,7,10]; the mechanical characteristics of the image display system associated with the speed and capacity of the graphic memory in the computer system; and environmental factors such as room lighting, window placement, distance from the observer, eye level, and differences in user height [6,10].
When making diagnoses using a DP system, it is often necessary to check clinical information from the patient EMR or radiologic data from the PACS, for which multiple monitors can be used. When comparing or observing two or more DP images using multiple monitors, the monitors should have been manufactured in the same year and should be the same model to minimize differences in the images caused by the different monitors [6,10]. Moreover, the performance of an image display device can degrade after extended use in terms of decreased luminance, decreased contrast ratio, and burn-in. Therefore, each device must be regularly validated by appropriate methods, or actions must be taken to maintain its performance [6,7,10,58]. For reference, examples of displays used in current DP deployments and the largest validation studies presented in best-practice recommendations by the RCP for implementing DP are listed in Supplementary Table S2 [6].

Recommended functional requirements for image display devices and image viewing software

Because technologies for image display devices such as monitors are advancing rapidly, it is difficult to define the minimum requirements for a DP system based simply on numeric values [7,58]. Moreover, it is difficult to define the absolute requirements because the luminance, luminance ratio, and contrast can change depending on the office environment. Therefore, optimal functional conditions should be defined according to the situation and needs of each institution [58].
Various international DP guidelines present minimum functional requirements with slightly different specifications that all gradually increased depending on when they were announced (Table 3) [7,9,10,55,58].
These guidelines generally recommend the following: horizontal resolution ≥1,280–2,560 pixels, screen size (diagonal) ≥17–27 inches, luminance ≥170–300 cd/m2, luminance ratio ≥250–1,600:1, pixel pitch ≤0.33 mm, and minimum luminance ≥0.5 cd/m2. However, certain guidelines do not specify values and recommend that each laboratory select suitable monitors at its discretion, with validation of the overall performance [2,6]. Increasing the monitor resolution does not enable digital images acquired by a scanner to be viewed at a resolution higher than the original resolution. However, if the monitor resolution is too low, original digital images acquired at higher resolutions might not be accurately displayed [2,6].
Image viewing software can include the following functions: an observation field display that shows overview images (also referred to as preview or macro-images), with the part of the total overview image being observed indicated within a square; an annotation function that displays the objective magnification and a length scale bar on the images and allows users to insert figures or words to mark areas of interest; a function to screen-capture partial or entire images of interest displayed on the monitor; a function that allows side-by-side comparison of DP images from different tests performed on the same patient, such as immunohistochemical or special stains, or DP images from similar cases for reference; and basic morphometric functions, such as measuring the length and area of certain microstructures [7,10]. Whether these functions can be adequately performed in the workflow of real practice should be determined in advance [4,6,10]. The image viewing software can also include additional functions such as morphological classification, morphometry, tumor grading, and tumor diagnosis and detection with the aid of machine- or deep-learning computer algorithms [1,3]. It is necessary to determine whether sufficient evidence, based on the results of comparative analyses with conventional microscopy, supports the use of those functions within the diagnostic process [1,3].
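As an illustration of the basic morphometric functions mentioned above, the sketch below converts pixel-space annotations into physical length and area using the scan resolution. The microns-per-pixel value is assumed here for illustration; a real viewer would read it from the image metadata.

```python
# Minimal sketch of basic morphometry on a WSI: pixel coordinates -> physical units.
import math

MICRONS_PER_PIXEL = 0.25  # assumed 40x scan, per the resolution figures quoted earlier

def length_um(x1: float, y1: float, x2: float, y2: float) -> float:
    """Physical length (micrometers) of a line annotation drawn in pixel coordinates."""
    return math.hypot(x2 - x1, y2 - y1) * MICRONS_PER_PIXEL

def polygon_area_um2(vertices: list[tuple[float, float]]) -> float:
    """Physical area (square micrometers) of a polygon annotation (shoelace formula)."""
    area_px = 0.0
    for (x1, y1), (x2, y2) in zip(vertices, vertices[1:] + vertices[:1]):
        area_px += x1 * y2 - x2 * y1
    return abs(area_px) / 2 * MICRONS_PER_PIXEL ** 2

# Example: a 4,000-pixel line at the assumed 40x resolution corresponds to 1,000 um (1 mm)
print(length_um(0, 0, 4000, 0))  # 1000.0
```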
Other considerations

Issues related to integration/links with LIS and EMR systems

The DP system must be linked appropriately to the LIS that stores and manages test records from the pathology laboratory and the HIS or EMR system that manages clinical patient records inside the hospital [7,10,58]. The ATA (US), Canadian, and European guidelines recommend linkage and management using standard methods such as HL7 [9-11,60]. The DP system must include metadata associated with the digital images [58], i.e., overview images (preview, macro-images), scan parameters, and data on the scanned area. When the system is linked to the LIS, data such as the test number, tissue information, block number, and staining information must be appropriately linked. Moreover, the linked systems must be checked to confirm the smooth operation of both systems and the link between them under actual workflow conditions [4].
The DP system must also be linked to the institutional PACS and EMR or HIS so that the patient clinical and radiologic image data needed for diagnosis are easily accessible [7,10,61,62]. It is best to follow the international standard DICOM. When linking to other information systems within an institution, IT specialists within that institution should be consulted in advance so that they can ensure safety by minimizing the effects of those links on network or information security [61-64].

Issues related to telepathology, firewalls, protection of personal information, and mobile device use

Rapid advances in wired/wireless networks and mobile technologies are expected to increase WSI diagnosis by means of remote systems or portable devices such as tablet PCs [6]. Diagnostic systems once used real-time images acquired by remotely controlled microscopes [13,60,65]. In South Korea, which has a relatively small territory with a highly developed wireless network environment and equally distributed access to healthcare, transmitting and sharing WSIs through a wired/wireless network is more likely than using the older telepathology system [1,3]. Moreover, rapid advances in the wireless network environment are likely to accelerate diagnosis or consultation using portable devices.
In both cases, strict technical measures must be in place to ensure information security and protect personal information regardless of the type of terminal being used [6,10]. Therefore, measures are needed to ensure that transmitted data are not easily released outside the network and that transmitted metadata do not contain personal information to minimize the risk to personal data even if a data leak were to occur.
When diagnoses are made using portable terminals such as a tablet PC, the use of a relatively small screen is a major concern. The results of a recent study suggest that diagnosis using portable terminals should be limited to special cases that require relatively low accuracy but rapid reporting of results, such as frozen section tests [1,53,65]. These terminals are generally not reliable for routine diagnostic work. Appropriate caution and considerations are deemed necessary.
Portable terminals run on different OSs, usually iOS (Apple) or the Android OS (Google). Image viewing software for portable terminals must be built to run smoothly on each OS. Cross-platform software that can run on any OS could be built based on HTML5 [58].
Guidelines and considerations for validation needed for the implementation of DP systems and internal QC needed during operation
For primary pathologic diagnosis by WSI, the DP system must undergo appropriate validation by the managing personnel before implementation. Moreover, internal QC must be performed regularly while the system is in operation to ensure that the system is operating normally and that the test results are reliable. Upon the identification of a system defect that could cause serious errors in the test results, immediate action must be taken to resolve the issue.
This recommendation is based on guidelines related to the validation of DP systems published by CAP (US) (Table 4) [30] and FAGP (Germany) (Supplementary Table S3) [10] and contains general principles and instructions that could be referenced for internal QC during operation, validation of the DP system during implementation at each institution, and the development of QA items related to DP systems for the KSP QC program.
Laboratory QA includes verification and validation. Verification refers to the approval of equipment or the production of reagents by regulatory authorities, such as the Ministry of Food and Drug Safety, before the implementation of a specific test or experiment. Validation is an institutional-level testing process before implementation in a laboratory. Validation can be divided into internal validation performed in-house by each laboratory, which is the case for DP systems, and external validation performed by third-party institutions. During DP operation after implementation, a quality management program should be performed continuously and routinely. This program also includes internal quality management programs carried out in-house according to internal instructions and an external quality management program performed by independent institutions. A typical example of an external quality management program is the KSP Quality Control Program (proficiency test). Because that program does not currently include quality assessment items related to DP systems, appropriate items should be developed and included soon; in doing so, the following rules should be considered. The following recommendation statements are summarized in Table 5.

1. All pathology laboratories operating WSI-based DP systems for clinical diagnosis must conduct in-house validation studies (Expert consensus)

Variable factors in the testing process could influence DP system performance and validity; thus, a validation study before system implementation is essential. Simply because the DP system has already been approved by relevant authorities through a verification process and is being operated according to the manufacturer’s recommended operating protocol does not guarantee the validity of the system for the samples and environment at each institution. The validation results must be appropriately documented and maintained accordingly [6,10,30].

2. The validation study should be conducted under conditions that are consistent with the clinical use intended by the DP system manufacturer (Recommendation)

Validation is intended to prove that the WSI system is operating as expected according to its intended purpose [6,10,30]. Therefore, the specific methods and design of the validation must be consistent with the purpose at the time that the WSI system was manufactured. For example, even if a DP system that was manufactured to run gynecological liquid-based cytology slides has been successfully validated using gynecological liquid-based cytology samples before implementation, it would not be safe to assume that this system would demonstrate the same quality for centrifuged urine cytology samples. Therefore, separate testing must be conducted when the DP system is to be used for purposes other than originally planned. However, if the overall process of sample preparation and interpretation is the same, then a single validation study could be sufficient [6,10,30]. For example, when testing immunohistochemical stain slides, sharing the same sample preparation process would obviate the need to individually test all antibodies.

3. The validation study should be designed to be as similar as possible to the actual clinical settings in which the technology will be used (Recommendation)

It is not advisable to conduct the validation study by selecting samples that show “typical” pathologic findings for each diagnosis favorable for testing [6,10,30]. The validation must represent common cases and should include a sufficient number of borderline cases that could be difficult to diagnose using the DP system, such that the spectrum of diagnostic complexity and difficulty found in actual workflow is adequately represented. In addition to a comparison of diagnostic accuracy, the validation must also include an assessment of system performance on cases that are expected to be more difficult to assess with WSI than with microscopy, such as dysplasia grading, calcium oxalate crystal detection, mitosis counting, eosinophil counting, microorganism detection, and viral inclusion detection. This process can be used to facilitate user training and learning, as well as proper validation. For frozen section cases, whether the TAT from scanning to diagnosis is similar to that of microscopic diagnosis must also be assessed [6,10,30]. If the system is used in a single institution, comparative assessment with other laboratories is not necessary. However, if samples prepared in other institutions are used, then the method must be tested in advance to simulate the same workflow.

4. The validation study should cover the entire DP system (Recommendation)

The validation study is a QA process intended to test the entire process; thus, separate testing of individual system components (e.g., the computer system, monitor, and scanner) or processes is unnecessary [6,10,30].

5. Significant changes in the composition of the DP system necessitate re-validation (Expert consensus)

Validation must be repeated whenever significant changes are made to the composition of the DP system, such as the use of a new type of scanner or hardware or software upgrades [6,10,30]. The validation could be performed with a smaller number of samples (i.e., 20 samples) if the new scanner was manufactured by the same manufacturer; is the same model as the previously validated scanner; and is used with the same network, image database system, image viewing software, and image display device. Minor changes can be managed according to internal guidelines [6,10,30].

6. Validation is intended to be conducted by at least one pathologist who has been acclimated to the DP system (Recommendation)

The validation process assumes that a pathologist who has been acclimated to the DP system will make a diagnosis [6,10,30]. Therefore, validation should be performed by someone familiar with using the DP system, rather than inexperienced individuals, to eliminate results biased by the tester’s level of education and training. Moreover, although the system does not need to be validated by every pathologist who uses it, the validation could include other laboratory personnel (e.g., laboratory managers, histo- or cytotechnicians, and residents), IT managers, or technical advisors. The validation should also include personnel who perform slide scanning [6,10,30].

7. The validation must be performed on at least 60 samples for a single applicable field (e.g., histopathologic H&E-stained slides, frozen sections, cytology slides, blood smear slides) according to the type of sample or test. For additional applicable fields (e.g., immunohistochemical staining, special staining), validation could be performed by adding 20 or more samples (Recommendation)

The number of people involved in the validation and the scale of the validation could vary significantly between institutions [6,10,30]. Moreover, it is difficult to accurately calculate the minimum number of samples needed to guarantee 100% validity. The manager of the DP system at each institution must fully consider the scale and characteristics handled by the institution as well as the relevant personnel and include samples with varying degrees of diagnostic difficulty when selecting the appropriate number of samples needed to ensure reliable operation of the DP system [4,6,10,30]. A prospective validation process during 1–3 months of actual operation, as well as a retrospective validation study using prior tests, could also be considered.

8. Validation must be carried out using a comparative analysis of concordance between microscopic and WSI-based diagnoses made by a single observer (intra-observer variability assessment) (Suggestion)

Validation is intended to assess the diagnostic concordance between microscopic and WSI-based diagnoses; thus, it must be conducted as an intra-observer variability assessment with repeated assessments by the same observer [4,6,10,30]. The degree of diagnostic concordance can be assessed using a 3-tier system according to the clinical implications (i.e., major discordance that could drastically affect patient prognosis and treatment; minor discordance that could affect the diagnosis severity without causing changes in patient prognosis and treatment; and minimal discordance with little or no difference in the diagnosis severity and patient prognosis or treatment) [4,6,10,30]. The goal of comparative analysis should be to identify the cause of problems related to image quality, such as artifacts during digital image scanning, rather than diagnostic variability resulting from interpretational changes by the observer [6].
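The 3-tier concordance assessment can be tallied in a few lines of code. The sketch below is illustrative only; the case counts are hypothetical, and each laboratory would record its own paired microscopic and WSI-based diagnoses and assigned tiers.

```python
# Minimal sketch of tallying the 3-tier intra-observer concordance assessment.
# The case data are hypothetical placeholders.
from collections import Counter

# Each validation case is assigned one tier after comparing the paired diagnoses.
validation_results = ["concordant"] * 55 + ["minimal"] * 3 + ["minor"] * 2  # 60 assumed cases

def summarize(results: list[str]) -> None:
    """Print the count and percentage of cases in each concordance tier."""
    counts = Counter(results)
    total = len(results)
    for tier in ("concordant", "minimal", "minor", "major"):
        n = counts.get(tier, 0)
        print(f"{tier:>10}: {n:3d} ({100 * n / total:.1f}%)")

summarize(validation_results)
```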

9. Validation can be performed using either randomly or sequentially arranged samples (Recommendation)

Intuitively, the random arrangement of samples would seem to minimize the influence of recall bias on validation. However, relevant studies have reported no significant difference between random and sequential assessments [2,4,6,10,30].

10. During the validation, a washout period of at least 2 weeks is needed to minimize the influence of recall bias (Recommendation)

An observer remembering tissue slide images previously examined and their corresponding diagnoses can cause recall bias that could affect validation concordance [2,4,6,10,30]. Therefore, it is important to perform the validation with a sufficient washout period between the observations. Previous studies and major guidelines recommend a washout period of at least 2 weeks; however, a longer washout period might be more favorable as long as it does not burden the operation of the institution.

11. During validation, data integrity during image acquisition must be assessed by verifying whether all tissues on the glass slide have been properly scanned to form the digital image (Expert consensus)

In addition to assessing the diagnostic concordance, the assessment of data integrity during image acquisition is also important with respect to the QA of the DP system [2,4,6,10,30]. Slides with poor staining quality, images of very small tissues that are out-of-focus, and errors or scan failure during image acquisition should be checked and appropriate measures taken during validation. In addition, it is important to check whether the metadata of the digital images and the slide identification numbers (slide labels) match [2,4,6,10,30].

12. Pathology laboratories must maintain documentation regarding the validation of the DP system, including the methods, results, and final approval (Expert consensus)

Pathology laboratories must keep and manage the documents demonstrating their successful validation of their DP systems, including the methods used, results, and final approval [2,4,6,10,30]. During the validation, system users should be educated and trained to operate the system, and supporting documents showing that this education has been conducted must be prepared and maintained. The final document must contain the signature of the DP system manager or designated representative. In addition, the inclusion of a statement in the pathologic report that a DP system was used for diagnosis is recommended [26].
The technical innovations in the past decade have advanced DP enough for it to replace conventional microscopic diagnosis [1]. However, caution is still needed in certain situations that require specific pathologic determination, such as microbial infection assessment [1]. Ultimately, accumulating experience and data could lead to solutions to these current technical limitations.
The successful implementation of DP systems provides a foundation from which pathology laboratories can enhance their services and create innovative workflows. DP could change the daily lives of pathologists in the next 20–30 years. The convergence of DP with various cutting-edge scientific fields, such as computing based on big data and artificial intelligence, would be a game-changer in the upcoming 4th Industrial Revolution. Therefore, government-led planning and systemic support for the timely implementation of DP systems is needed. The KSP-MISG continues to introduce relevant and timely technologies to meet demand and strives to provide standards and advice on their safe implementation in actual clinical practice.
The Data Supplement is available with this article at https://doi.org/10.4132/jptm.2020.08.27.
Supplementary Material 1
jptm-2020-08-27-suppl1.pdf
Supplementary Table S1
jptm-2020-08-27-suppl2.pdf
Supplementary Table S2
jptm-2020-08-27-suppl3.pdf
Supplementary Table S3
jptm-2020-08-27-suppl4.pdf

Ethics Statement

This work was reviewed and approved by the Institutional Review Board of The Catholic University of Korea (SC19ZCSI0173).

Author contributions

Conceptualization: YC, DCK (Dae Cheol Kim), CKJ, DCK (Dong-chul Kim), SYS, HJJ, SYY. Data curation: YC, DCK (Dae Cheol Kim), DCK (Dong-chul Kim). Funding acquisition: SYY. Investigation: YC, DCK (Dae Cheol Kim), CKJ, DCK (Dong-chul Kim), SYS, HJJ, SYY. Methodology: YC, DCK (Dae Cheol Kim), SYY. Project administration: DCK (Dae Cheol Kim), CKJ, SYS, HJJ, SYY. Resources: YC, DCK (Dae Cheol Kim), CKJ, DCK (Dong-chul Kim). Supervision: SYS, HJJ, SYY. Validation: YC, DCK (Dae Cheol Kim), CKJ, DCK (Dong-chul Kim), SYS, HJJ, SYY. Visualization: YC, DCK (Dae Cheol Kim). Writing—original draft: YC, DCK (Dae Cheol Kim), SYY. Writing—review & editing: YC, DCK (Dae Cheol Kim), CKJ, DCK (Dong-chul Kim), SYS, HJJ, SYY. Approval of final manuscript: all authors.

Conflicts of Interest

C.K.J., the editor-in-chief, and Y.C., a contributing editor of the Journal of Pathology and Translational Medicine, were not involved in the editorial evaluation or decision to publish this article. All remaining authors have declared no conflicts of interest.

Funding Statement

This research was supported by The Korean Society of Pathologists Grant (KSPG2019-03).

We appreciate Prof. Se Jin Jang from Asan Medical Center, University of Ulsan, College of Medicine, Seoul, the Chairman of the KSP, Prof. Kyoung Bun Lee from Seoul National University, Seoul, the Executive Director of the Committee of Informatics of the KSP, Prof. Ho-Chang Lee from Chungbuk National University College of Medicine, Cheongju, the Executive Director of the Committee of Scholarship of the KSP, Prof. Ju Han Lee from Korea University Ansan Hospital, the Executive Director of the Committee of Insurance of the KSP, and Prof. Dong Hoon Kim from Kangbuk Samsung Hospital, Sungkyunkwan University School of Medicine, Seoul, the Executive Director of the Committee of General Affairs of the KSP for generously providing practical advice during the project.
Table 1.
Digital pathology guidelines, position papers, and relevant instructions in leading countries
Country Guideline and instruction
Canada: Canadian Association of Pathologists (CAP-ACP)
2014: Guidelines from the Canadian Association of Pathologists for establishing a telepathology service for anatomic pathology using whole-slide imaging [11]
United States: College of American Pathologists (CAP)
2013: Validating whole-slide imaging for diagnostic purposes in pathology [30]
2011: Anatomic pathology checklist: CAP accreditation program [12]
American Telemedicine Association (ATA)
2014: Clinical guidelines for telepathology [13]
Digital Pathology Association (DPA)
2019: Computational pathology definitions, best practices, and recommendations for regulatory guidance: a white paper from the Digital Pathology Association [29]
2011: Validation of digital pathology in a healthcare environment [14]
2011: Archival and retrieval in digital pathology systems [15]
2011: Interoperability between anatomic pathology laboratory information systems and digital pathology systems [16]
2011: Validation of digital pathology systems in the regulated nonclinical environment [17]
Food and Drug Administration (FDA)
2015: Technical performance assessment of digital pathology whole-slide imaging devices: draft guidance for industry and Food and Drug Administration staff [18]
Centers for Medicare & Medicaid Services (CMS)
2015: Clinical Laboratory Improvement Amendments (CLIA) [19]
Centers for Disease Control and Prevention (CDC)
2013: Clinical Laboratory Improvement Advisory Committee (CLIAC) [20]
Society of Toxicologic Pathology
2013: Validation of digital pathology systems in the regulated nonclinical environment [17]
2007: Pathology position paper on pathology image data [22]
European Union: European Commission (EC)
2012: Guidelines on the qualification and classification of stand alone software used in healthcare within the regulatory framework of medical devices [23]
Spain: Spanish Society of Anatomic Pathology (SEAP-IAP)
2015: Practical guidelines for digital pathology implementation [9]
United Kingdom: The Royal College of Pathologists (RCP)
2018: Best practice recommendations for implementing digital pathology [6]
2013: Telepathology: guidance from The Royal College of Pathologists [24]
Germany: Federal Association of German Pathologists (FAGP-BDP)
2018: Guidelines digital pathology for diagnosis on (and reports of) digital images [10]
Australia: The Royal College of Pathologists of Australasia (RCPA)
2014: Position statement: telepathology [25]
Japan: Japanese Society of Pathology (JSP)
2019: Guidelines for pathologic diagnosis using digital pathology images (clinical questions and answers) [5]
2018: Technical standards for digital pathology system for pathologic diagnosis ver. 3 [7]
2016: Guidelines for pathologic diagnosis using digital pathology images [26]
2016: Technical standards for digital pathology system for pathologic diagnosis ver. 2 [27]
2015: Technical standards for digital pathology system for pathologic diagnosis ver. 1 [28]
Korea: Korean Society of Pathologists (KSP)
2019: Preparation of reimbursement assessment guidelines for AI-based medical technology (pathology) [3]
Table 2.
List of whole-slide imaging validation studies
Year Author Journal No. of samples/observers Results Evidence level Reference No.
2006 Gilbertson et al. BMC Clin Pathol 25 Mixed/3 32% discordancy IV [31]
2011 Jukic et al. Arch Pathol Lab Med 101 Mixed/3 3%–7% discordancy III [32]
2012 Al-Janabi et al. J Clin Pathol 100 Breast Kappa = 0.92 IV [33]
2012 Al-Janabi et al. Hum Pathol 100 GI 5% discordancy (minor) IV [34]
2012 Al-Janabi et al. J Clin Pathol 100 Skin 6% discordancy (minor) IV [35]
2013 Al-Janabi et al. J Clin Pathol 100 Pediatric WSI: 10% discordancy; glass: 7% discordancy IV [36]
2013 Bauer et al. Arch Pathol Lab Med 607 Mixed WSI: 1.65% discordancy; glass: 0.99% discordancy III [37]
2013 Krishnamurthy et al. Arch Pathol Lab Med 100 Breast WSI: 9.5% discordancy; glass: 7.9% discordancy III [38]
2013 Pantanowitz et al. Arch Pathol Lab Med Meta-analysis of 27 papers - III [30]
2014 Al-Janabi et al. J Renal Inj Prev 100 GU 13% discordancy III [39]
2014 Buck et al. J Pathol Inform 150 Mixed/6 WSI: 2.1%–10.1% discordancy; glass: 3.3%–13.3% discordancy III [40]
2014 Reyes et al. J Pathol Inform 103 Breast/3 WSI: 1%–4% discordancy; glass: 0%–7% discordancy III [41]
2015 Ordi et al. J Clin Pathol 452 GYN/2 5.8% discordancy III [42]
2016 Pekmezci et al. J Pathol Inform 97 Neuro/2 5.1%–12% discordancy III [43]
2016 Snead et al. Histopathology 3,017/17 (2,666 biopsy, 340 surgery, 11 frozen, 10 organs) 1.3% discordancy III [44]
2016 Wack et al. J Pathol Inform 33 Mixed/16 WSI: 20.9% discordancy; glass: 23.5% discordancy III [45]
2017 Kent et al. JAMA Dermatol 499 Skin/3 WSI: 6% discordancy; glass: 6% discordancy III [46]
2017 Saco et al. Dig Liver Dis 176 Liver/2 3.4%–9.7% discordancy III [47]
2017 Tabata et al. Pathol Int 1,070 Mixed/9 4.4% discordancy III [48]
2018 Araujo et al. Virchows Arch 70 Oral/2 3% discordancy III [49]
2018 Lee et al. Am J Dermatopathol 77 Skin/2 0.3% discordancy III [50]
2018 Mukhopadhyay et al. Am J Surg Pathol 1,992 Mixed/16 WSI: 4.9% discordancy; glass: 4.6% discordancy III [51]

Evidence levels in the table are defined as follows.

I, systematic review or meta-analysis; II, at least one randomized controlled study; III, non-randomized clinical trial (NRCT); IV, analytic epidemiological study (cohort or case-control study); V, descriptive study (case report or case series); VI, expert opinion.

GI, gastrointestinal; WSI, whole slide imaging; GU, genitourinary; GYN, gynecologic pathology.

Table 3.
Minimum requirements for image display devices recommended by international guidelines
CAPa (US), published 2015 [55]: screen resolution 1,280 × 1,024 pixels; screen size 17 or 19 inches.
SEAP (Spain), published 2015 [9]: screen resolution 1,920 × 1,080 pixels; screen size 22 inches; luminance ≥ 100 cd/m2; luminance (contrast) ratio 1,000–1,600:1.
FAGP (Germany), published 2018 [10]: screen resolution 2,560 × 1,600 pixels; screen size 27 inches; luminance ≥ 300 cd/m2; minimum luminance ≥ 0.5 cd/m2.
JSP (Japan), published 2019 [7]: screen resolution 1,280 × 1,024 pixels; screen size 19.3 inches; pixel pitch ≤ 0.33 mm; luminance ≥ 170 cd/m2; luminance (contrast) ratio ≥ 250:1.

CAP, College of American Pathologists; SEAP, Spanish Society of Anatomic Pathology; FAGP, Federal Association of German Pathologists; JSP, Japanese Society of Pathology.

a Values are taken from validation studies conducted based on the U.S. CAP guideline [55].
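As an illustration only, the JSP column of Table 3 could be encoded as a simple acceptance check when procuring or revalidating review monitors. The following Python sketch treats the tabulated values as lower bounds (and pixel pitch as an upper bound); the dictionary keys, the function name, and the candidate monitor specifications are hypothetical and are not part of the JSP standard.

```python
# Minimum display requirements from the JSP (2019) column of Table 3,
# interpreted as lower bounds (pixel pitch as an upper bound).
JSP_MIN = {
    "resolution": (1280, 1024),      # pixels (width, height)
    "screen_size_inch": 19.3,
    "max_pixel_pitch_mm": 0.33,
    "min_luminance_cd_m2": 170,
    "min_contrast_ratio": 250,
}

def failed_requirements(display):
    """Return the names of JSP minimum requirements that the display fails."""
    failures = []
    if display["resolution"][0] < JSP_MIN["resolution"][0] or \
       display["resolution"][1] < JSP_MIN["resolution"][1]:
        failures.append("screen resolution")
    if display["screen_size_inch"] < JSP_MIN["screen_size_inch"]:
        failures.append("screen size")
    if display["pixel_pitch_mm"] > JSP_MIN["max_pixel_pitch_mm"]:
        failures.append("pixel pitch")
    if display["luminance_cd_m2"] < JSP_MIN["min_luminance_cd_m2"]:
        failures.append("luminance")
    if display["contrast_ratio"] < JSP_MIN["min_contrast_ratio"]:
        failures.append("contrast ratio")
    return failures

# Hypothetical 27-inch QHD review monitor offered by a vendor
candidate = {
    "resolution": (2560, 1440),
    "screen_size_inch": 27,
    "pixel_pitch_mm": 0.233,
    "luminance_cd_m2": 350,
    "contrast_ratio": 1000,
}
print(failed_requirements(candidate) or "meets the JSP 2019 minimums")
```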

Table 4.
College of American Pathologists guidelines for validating whole slide imaging systems for diagnostic purposes in pathology [30]
Guideline statement, with grade of evidence in parentheses
1. All pathology laboratories implementing WSI technology for clinical diagnostic purposes should carry out their own validation studies. (Expert consensus opinion)
2. Validation should be appropriate for and applicable to the intended clinical use and clinical setting of the application in which WSI will be used. Validation of WSI systems should involve specimen preparation types relevant to the intended use (e.g., formalin-fixed paraffin-embedded tissue, frozen tissue, immunohistochemical staining, cytology slides, hematology blood smears). (Recommendation, Grade A)
 Note: If a new intended use for WSI is contemplated and this new use differs materially from the previously validated use, a separate validation for the new use should be performed.
3. The validation study should closely emulate the real-world clinical environment in which the technology will be used. (Recommendation, Grade A)
4. The validation study should encompass the entire WSI system. (Recommendation, Grade B)
 Note: It is unnecessary to separately validate each individual component of the system (e.g., computer hardware, monitor, network, scanner) or the individual steps of the digital imaging process.
5. Revalidation is required whenever a significant change is made to any component of the WSI system. (Expert consensus opinion)
6. At least one pathologist adequately trained to use the WSI system must be involved in the validation process. (Recommendation, Grade B)
7. The validation process should include a sample set of at least 60 cases for one application (e.g., H&E stained sections of fixed tissue, frozen sections, cytology, or hematology) that reflects the spectrum and complexity of specimen types and diagnoses likely to be encountered during routine practice. (Recommendation, Grade A)
 Note: The validation process should include another 20 cases for each additional application (e.g., immunohistochemistry, special stains).
8. The validation study should establish diagnostic concordance between digital and glass slides for a single observer (i.e., intra-observer variability). (Suggestion, Grade A)
9. Digital and glass slides can be evaluated in random or nonrandom order (as to which is examined first and second) during the validation process. (Recommendation, Grade A)
10. A washout period of at least 2 weeks should occur between viewing digital and glass slides. (Recommendation, Grade B)
11. The validation process should confirm that all of the material present on a glass slide to be scanned is included in the digital image. (Expert consensus opinion)
12. Documentation recording the method, measurements, and final approval of the validation results for the WSI system should be maintained. (Expert consensus opinion)

WSI, whole slide imaging; H&E, hematoxylin and eosin.

Table 5.
Recommendation statements from the Medical Informatics Study Group (MISG) of the Korean Society of Pathologists (KSP) for validation of digital pathology systems for primary diagnosis during implementation
Recommendation statement, with grade of evidence in parentheses
1. All pathology laboratories operating whole slide image–based digital pathology systems for clinical diagnosis must conduct in-house validation studies. (Expert consensus)
2. The validation study should be conducted under conditions that are consistent with the clinical use intended by the digital pathology system manufacturer. (Recommendation)
3. The validation study should be designed to be as similar as possible to the actual clinical settings in which the technology will be used. (Recommendation)
4. The validation study should cover the entire digital pathology system. (Recommendation)
5. Significant changes in the composition of the digital pathology system necessitate re-validation. (Expert consensus)
6. Validation should be conducted by at least one pathologist who is adequately familiar with the digital pathology system. (Recommendation)
7. The validation must be performed on at least 60 samples for a single applicable field (e.g., histopathologic H&E-stained slides, frozen sections, cytology slides, blood smear slides) according to the type of sample or test. For each additional applicable field (e.g., immunohistochemical staining, special staining), validation could be performed by adding 20 or more samples. (Recommendation)
8. Validation must be carried out using a comparative analysis of diagnostic concordance between microscopic and WSI-based diagnoses by a single observer (intra-observer variability assessment). (Suggestion)
9. Validation can be performed using either randomly or sequentially arranged samples. (Recommendation)
10. During the validation, a washout period of at least 2 weeks is needed to minimize the influence of recall bias. (Recommendation)
11. During validation, data integrity during image acquisition must be assessed by verifying whether all tissues on the glass slide have been properly scanned to form the digital image. (Expert consensus)
12. Pathology laboratories must maintain documentation regarding the validation of the digital pathology system, including the methods, results, and final approval. (Expert consensus)

H&E, hematoxylin and eosin; WSI, whole slide imaging.
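Statements 7, 8, and 10 in Table 5 describe a study design whose results are typically summarized as an intra-observer discordancy rate and, in some of the studies listed in Table 2, Cohen's kappa. The following Python sketch shows one way such a summary might be computed from paired glass-slide and WSI diagnoses; the diagnostic categories and the 60-case example data are hypothetical, and this is not a prescribed analysis method.

```python
from collections import Counter

def concordance_summary(glass_dx, wsi_dx):
    """Summarize intra-observer agreement between glass-slide and WSI diagnoses.

    glass_dx and wsi_dx are equal-length lists of categorical diagnoses rendered
    by the same pathologist on the same cases, separated by a washout period.
    Returns the discordancy rate and Cohen's kappa.
    """
    if len(glass_dx) != len(wsi_dx) or not glass_dx:
        raise ValueError("Paired, non-empty diagnosis lists are required")

    n = len(glass_dx)
    agree = sum(g == w for g, w in zip(glass_dx, wsi_dx))
    po = agree / n                 # observed agreement
    discordancy = 1 - po

    # Expected agreement by chance, from each modality's marginal frequencies
    glass_freq = Counter(glass_dx)
    wsi_freq = Counter(wsi_dx)
    pe = sum(glass_freq[c] * wsi_freq[c]
             for c in set(glass_dx) | set(wsi_dx)) / (n * n)

    kappa = (po - pe) / (1 - pe) if pe < 1 else 1.0
    return {"n": n, "discordancy_rate": discordancy, "cohens_kappa": kappa}

# Hypothetical example: 60 biopsy cases read on glass and, after a 2-week
# washout, on WSI by the same pathologist.
glass = ["benign"] * 30 + ["malignant"] * 28 + ["atypical"] * 2
wsi = ["benign"] * 29 + ["atypical"] * 1 + ["malignant"] * 28 + ["atypical"] * 2
print(concordance_summary(glass, wsi))
```

Each laboratory remains responsible for defining its own acceptance criteria when interpreting such figures.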

• 1. Nam S, Chong Y, Jung CK, et al. Introduction to digital pathology and computer-aided pathology. J Pathol Transl Med 2020; 54: 125–34.
• 2. Hanna MG, Pantanowitz L, Evans AJ. Overview of contemporary guidelines in digital pathology: what is available in 2015 and what still needs to be addressed? J Clin Pathol 2015; 68: 499–505.
• 3. Lee K, Jang S, Kim D, Lee J. Preparation of reimbursement assessment guidelines for AI-based medical technology (pathology). Seoul: Health Insurance Review and Assessment Service (HIRA), 2019.
• 4. Treanor D, Williams B. The Leeds guide to digital pathology. Leeds: The Leeds Teaching Hospitals NHS, University of Leeds, 2019.
• 5. Japanese Society of Pathology. Digital Pathology Assessment Committee: guidelines for pathologic diagnosis using digital pathology images (clinical questions and answers). Tokyo: Japanese Society of Pathology, 2019.
• 6. Cross S, Furness P, Igali L, Snead DR, Treanor D. Best practice recommendations for implementing digital pathology. London: The Royal College of Pathologists, 2018; 1–43.
• 7. Japanese Society of Pathology. Digital Pathology Assessment Committee: technical standards for digital pathology system for pathologic diagnosis. Tokyo: Japanese Society of Pathology, 2018.
• 8. College of American Pathologists. Validating whole slide imaging for diagnostic purposes in pathology. Northfield: College of American Pathologists, 2013.
• 9. Garcia Rojo M, Conde A, Ordi J, et al. Guia practica para la implantacion de la patologia digital. In: Guerra Merino I, ed. Libro Blanco de la Anatomia Patologica en Espana 2015. Vitoria: Sociedad Espanola de Anatomia Patologica, 2015; 247–78.
• 10. Federal Association of German Pathologists Bundesverband Deutscher Pathologen (FAGP-BDP). Guidelines digital pathology for diagnosis on (and reports of) digital images, 2018. Berlin: Federal Association of German Pathologists Bundesverband Deutscher Pathologen (FAGP-BDP), 2018.
• 11. Canadian Association of Pathologists Telepathology Guidelines Committee, Bernard C, Chandrakanth SA, et al. Guidelines from the Canadian Association of Pathologists for establishing a telepathology service for anatomic pathology using whole-slide imaging. J Pathol Inform 2014; 5: 15.
• 12. College of American Pathologists. Anatomic pathology checklist: CAP accreditation program. Northfield: College of American Pathologists, 2011.
• 13. Pantanowitz L. Clinical guidelines for telepathology. Arlington: American Telemedicine Association, 2014.
• 14. Lowe A, Chlipala E, Elin J, Kawano Y, Long RE, Tillman D. Validation of digital pathology in a healthcare environment. San Diego: Digital Pathology Association, 2011.
• 15. Chlipala E, Elin J, Eichhorn O, Huisman A, Krishnamurti M, Sabata B. Archival and retrieval in digital pathology systems. Madison: Digital Pathology Association, 2011.
• 16. Ellin J, Haskvitz A, Premraj P, et al. Interoperability between anatomic pathology laboratory information systems and digital pathology systems. Madison: Digital Pathology Association, 2010.
• 17. Cann J, Chlipala E, Ellin J, et al. Validation of digital pathology systems in the regulated nonclinical environment. Madison: Digital Pathology Association, 2013.
• 18. US Food and Drug Administration. Technical performance assessment of digital pathology whole slide imaging devices: draft guidance for industry and Food and Drug Administration staff. Silver Spring: US Department of Health and Human Services, 2015.
• 19. Centers for Medicare and Medicaid Services. Clinical laboratory improvement amendments (CLIA). Baltimore: Centers for Medicare and Medicaid Services, US Department of Health and Human Services (HHS), 2015.
• 20. US Department of Health and Human Services. Clinical Laboratory Improvement Advisory Committee: summary report. Washington, DC: US Department of Health and Human Services, 2013.
• 21. Long RE, Smith A, Machotka SV, et al. Scientific and Regulatory Policy Committee (SRPC) paper: validation of digital pathology systems in the regulated nonclinical environment. Toxicol Pathol 2013; 41: 115–24.
• 22. Tuomari DL, Kemp RK, Sellers R, et al. Society of Toxicologic Pathology position paper on pathology image data: compliance with 21 CFR Parts 58 and 11. Toxicol Pathol 2007; 35: 450–5.
• 23. European Commission; DG Health and Consumer, Directorate B, Unit B2 ‘Health Technology and Cosmetics’. Guidelines on the qualification and classification of stand alone software used in healthcare within the regulatory framework of medical devices. Brussels: European Commission, 2012.
• 24. Rashbass J, Furness P. Telepathology: guidance from The Royal College of Pathologists. London: The Royal College of Pathologists, 2005.
• 25. The Royal College of Pathologists of Australasia (RCPA). Position statement: telepathology. Sydney: The Royal College of Pathologists of Australasia, 2014.
• 26. Japanese Society of Pathology. Digital Pathology Assessment Committee: guidelines for pathologic diagnosis using digital pathology images. Tokyo: Japanese Society of Pathology, 2016.
• 27. Japanese Society of Pathology. Digital Pathology Assessment Committee: technical standards for digital pathology system for pathologic diagnosis. Tokyo: Japanese Society of Pathology, 2016.
• 28. Japanese Society of Pathology. Digital Pathology Assessment Committee: technical standards for digital pathology system for pathologic diagnosis. Tokyo: Japanese Society of Pathology, 2015.
• 29. Abels E, Pantanowitz L, Aeffner F, et al. Computational pathology definitions, best practices, and recommendations for regulatory guidance: a white paper from the Digital Pathology Association. J Pathol 2019; 249: 286–94.
• 30. Pantanowitz L, Sinard JH, Henricks WH, et al. Validating whole slide imaging for diagnostic purposes in pathology: guideline from the College of American Pathologists Pathology and Laboratory Quality Center. Arch Pathol Lab Med 2013; 137: 1710–22.
• 31. Gilbertson JR, Ho J, Anthony L, Jukic DM, Yagi Y, Parwani AV. Primary histologic diagnosis using automated whole slide imaging: a validation study. BMC Clin Pathol 2006; 6: 4.
• 32. Jukic DM, Drogowski LM, Martina J, Parwani AV. Clinical examination and validation of primary diagnosis in anatomic pathology using whole slide digital images. Arch Pathol Lab Med 2011; 135: 372–8.
• 33. Al-Janabi S, Huisman A, Nap M, Clarijs R, van Diest PJ. Whole slide images as a platform for initial diagnostics in histopathology in a medium-sized routine laboratory. J Clin Pathol 2012; 65: 1107–11.
• 34. Al-Janabi S, Huisman A, Vink A, et al. Whole slide images for primary diagnostics of gastrointestinal tract pathology: a feasibility study. Hum Pathol 2012; 43: 702–7.
• 35. Al-Janabi S, Huisman A, Vink A, et al. Whole slide images for primary diagnostics in dermatopathology: a feasibility study. J Clin Pathol 2012; 65: 152–8.
• 36. Al-Janabi S, Huisman A, Nikkels PG, ten Kate FJ, van Diest PJ. Whole slide images for primary diagnostics of paediatric pathology specimens: a feasibility study. J Clin Pathol 2013; 66: 218–23.
• 37. Bauer TW, Schoenfield L, Slaw RJ, Yerian L, Sun Z, Henricks WH. Validation of whole slide imaging for primary diagnosis in surgical pathology. Arch Pathol Lab Med 2013; 137: 518–24.
• 38. Krishnamurthy S, Mathews K, McClure S, et al. Multi-institutional comparison of whole slide digital imaging and optical microscopy for interpretation of hematoxylin-eosin-stained breast tissue sections. Arch Pathol Lab Med 2013; 137: 1733–9.
• 39. Al-Janabi S, Huisman A, Jonges GN, Ten Kate FJ, Goldschmeding R, van Diest PJ. Whole slide images for primary diagnostics of urinary system pathology: a feasibility study. J Renal Inj Prev 2014; 3: 91–6.
• 40. Buck TP, Dilorio R, Havrilla L, O'Neill DG. Validation of a whole slide imaging system for primary diagnosis in surgical pathology: a community hospital experience. J Pathol Inform 2014; 5: 43.
• 41. Reyes C, Ikpatt OF, Nadji M, Cote RJ. Intra-observer reproducibility of whole slide imaging for the primary diagnosis of breast needle biopsies. J Pathol Inform 2014; 5: 5.
• 42. Ordi J, Castillo P, Saco A, et al. Validation of whole slide imaging in the primary diagnosis of gynaecological pathology in a University Hospital. J Clin Pathol 2015; 68: 33–9.
• 43. Pekmezci M, Uysal SP, Orhan Y, Tihan T, Lee HS. Pitfalls in the use of whole slide imaging for the diagnosis of central nervous system tumors: a pilot study in surgical neuropathology. J Pathol Inform 2016; 7: 25.
• 44. Snead DR, Tsang YW, Meskiri A, et al. Validation of digital pathology imaging for primary histopathological diagnosis. Histopathology 2016; 68: 1063–72.
• 45. Wack K, Drogowski L, Treloar M, et al. A multisite validation of whole slide imaging for primary diagnosis using standardized data collection and analysis. J Pathol Inform 2016; 7: 49.
• 46. Kent MN, Olsen TG, Feeser TA, et al. Diagnostic accuracy of virtual pathology vs traditional microscopy in a large dermatopathology study. JAMA Dermatol 2017; 153: 1285–91.
• 47. Saco A, Diaz A, Hernandez M, et al. Validation of whole-slide imaging in the primary diagnosis of liver biopsies in a University Hospital. Dig Liver Dis 2017; 49: 1240–6.
• 48. Tabata K, Mori I, Sasaki T, et al. Whole-slide imaging at primary pathological diagnosis: validation of whole-slide imaging-based primary pathological diagnosis at twelve Japanese academic institutes. Pathol Int 2017; 67: 547–54.
• 49. Araujo AL, Amaral-Silva GK, Fonseca FP, et al. Validation of digital microscopy in the histopathological diagnoses of oral diseases. Virchows Arch 2018; 473: 321–7.
• 50. Lee JJ, Jedrych J, Pantanowitz L, Ho J. Validation of digital pathology for primary histopathological diagnosis of routine, inflammatory dermatopathology cases. Am J Dermatopathol 2018; 40: 17–23.
• 51. Mukhopadhyay S, Feldman MD, Abels E, et al. Whole slide imaging versus microscopy for primary diagnosis in surgical pathology: a multicenter blinded randomized noninferiority study of 1992 cases (pivotal study). Am J Surg Pathol 2018; 42: 39–52.
• 52. Bongaerts O, Clevers C, Debets M, et al. Conventional microscopical versus digital whole-slide imaging-based diagnosis of thin-layer cervical specimens: a validation study. J Pathol Inform 2018; 9: 29.
• 53. Canberk S, Behzatoglu K, Caliskan CK, et al. The role of telecytology in the primary diagnosis of thyroid fine-needle aspiration specimens. Acta Cytol 2020; 64: 323–31.
• 54. Amin S, Mori T, Itoh T. A validation study of whole slide imaging for primary diagnosis of lymphoma. Pathol Int 2019; 69: 341–9.
• 55. Thrall MJ, Wimmer JL, Schwartz MR. Validation of multiple whole slide imaging scanners based on the guideline from the College of American Pathologists Pathology and Laboratory Quality Center. Arch Pathol Lab Med 2015; 139: 656–64.
• 56. Kalinski T, Zwonitzer R, Sel S, et al. Virtual 3D microscopy using multiplane whole slide images in diagnostic pathology. Am J Clin Pathol 2008; 130: 259–64.
• 57. Aoyama H, Daimon Y, Tamaki T, Matsumoto H, Matsuzaki A, Yoshimi N. H. pylori infection in gastric biopsy specimens by whole slide image and inflammation assessment. Kure: Japanese Society of Digital Pathology, 2018; 40.
• 58. Garcia-Rojo M. International clinical guidelines for the adoption of digital pathology: a review of technical aspects. Pathobiology 2016; 83: 99–109.
• 59. Roy S, Kumar Jain A, Lal S, Kini J. A study about color normalization methods for histopathology images. Micron 2018; 114: 42–61.
• 60. Evans AJ, Krupinski EA, Weinstein RS, Pantanowitz L. 2014 American Telemedicine Association clinical guidelines for telepathology: another important step in support of increased adoption of telepathology for patient care. J Pathol Inform 2015; 6: 13.
• 61. Tuominen VJ, Isola J. Linking whole-slide microscope images with DICOM by using JPEG2000 interactive protocol. J Digit Imaging 2010; 23: 454–62.
• 62. Singh R, Chubb L, Pantanowitz L, Parwani A. Standardization in digital pathology: supplement 145 of the DICOM standards. J Pathol Inform 2011; 2: 23.
• 63. DICOM Standards Committee. Digital Imaging and Communications in Medicine (DICOM). Supplement 145: whole slide microscopic image IOD and SOP classes. Rosslyn: DICOM, 2010.
• 64. Zwonitzer R, Kalinski T, Hofmann H, Roessner A, Bernarding J. Digital pathology: DICOM-conform draft, testbed, and first results. Comput Methods Programs Biomed 2007; 87: 181–8.
• 65. Ribback S, Flessa S, Gromoll-Bergmann K, Evert M, Dombrowski F. Virtual slide telepathology with scanner systems for intraoperative frozen-section consultation. Pathol Res Pract 2014; 210: 377–82.
