Wednesday, March 3, 2021

The Role of PACS IIPs and Medical Physicists in Legal Disputes

Unfortunately, patient safety, privacy, and security violations, as well as disputes about PACS functionality and contracting, can end up in arbitration or even in court. Because a judge and/or jury is unfamiliar with the field of imaging and informatics, especially when it comes to understanding issues with DICOM connectivity or other details, lawyers often call on Imaging Informatics Professionals (IIPs) who are prepared to serve as expert witnesses to help settle a dispute. The same can be said for medical physicists when it comes to image quality and/or radiation safety matters.

Here are some typical examples, drawn from our many years of industry experience as consultants and/or expert witnesses, that illustrate the role of IIPs and medical physicists in legal (or at least potentially legal) disputes and show how you can be better prepared as a professional to anticipate potential legal proceedings.


1. Disputes between vendors and providers – These disputes are almost always caused by a mismatch between vendor capabilities and provider expectations. If there is a detailed RFP that spells out the requirements, the dispute is relatively easy to settle. In one case, a provider did not want to pay for the PACS delivery, arguing that the PACS did not meet the requirements. I did an onsite audit checking each RFP item and concluded in my final report that the vendor had complied with 95% of all items and was due full payment.

In another case, the vendor did NOT meet even 50% of the requirements, which forced them to pull out the system and pay for a replacement. If there is no detailed RFP to go back to, it becomes hard to establish what the expectations were, in which case a settlement is often reached somewhere in between.

Lesson learned: make sure you have very well-defined requirements, including performance, features, and functionality, when purchasing a PACS so you are covered.

2. Technical issues – A radiation therapy vendor installed both the planning system and the therapy delivery system (linac). A programming error in the DICOM interface sent the treatment parameters to the therapy system with the isocenter inverted, causing patients who were supposed to be treated for throat cancer to be irradiated on their spine, which left them paraplegic. One patient became so depressed that he took his own life. An FDA warning was issued for the device, and it was eventually corrected, but not before several patients were incorrectly treated.


Lesson learned: never trust a vendor; make sure you check all settings, parameters, etc., especially anything that could cause direct patient harm.

3. Misdiagnosis – “I did not see that lesion” is unfortunately still a common refrain. Humans are not perfect, but some professionals are simply incompetent, and the general public should be protected from those who should not be treating patients. In one case, a radiologist argued that the monitor was defective and that he could not see the lesion (even I could see the hemorrhage in the brain). The dispute required the calibration files for that workstation to be pulled. In another case, a radiologist flatly denied having viewed the chest image to check the tube placement, which was refuted by audit trail proof that he (or someone with his login and password) did indeed look at the image. In both cases the patient died: in the first case, blood thinners were given to a patient with a bleed in the brain; in the second, the patient was in the ICU and died because of incorrect tube placement.

Lesson learned: keep the calibration files for your monitors safe, and retain all your audit trails, as they might be needed for a case.

4. QA/QC policies not being followed – A technologist selected the wrong study, and a report indicating a terminal disease was connected to the wrong patient. The good news was that the terminally ill patient already knew about the disease; the bad news was that a perfectly healthy patient was told her days were numbered. After the discovery of the error, you can imagine the stress and agony she went through, which caused her to sue the hospital.

Lessons learned: have very well-defined QA policies in place that require the technologist to double-check the patient demographics. Defining these policies is typically the job of an IIP.

5. Privacy violations – A radiologist was fired because his “supervisor,” i.e., the medical director, looked at his brain CT and deemed him unfit to practice radiology because of an apparent brain cancer. The medical director argued that it was her “duty” to perform regular QA checks on all of the department’s studies; the radiologist argued that his privacy had been violated and that he should not have been fired, as he had sought medical attention and was being treated accordingly.

Lesson learned: again, make sure you keep your audit trails available; in this case, they were needed to prove that it was indeed a privacy violation.


6. Commercial-grade vs. medical-grade displays for diagnostic interpretation – During an evaluation of a radiologist’s home set of 5 MP displays used primarily for mammography reporting, I was asked to look at her >5 MP Apple display as well for comparison. The question she brought to my attention was, “Are these satisfactory for mammography interpretation?” The AAPM TG18-QC test pattern was brought up and looked great (all details that were supposed to be visible were visible). However, further measurements of the luminance patches across the entire dynamic range showed gross non-conformance to the GSDF curve.

Lessons learned: never make assumptions about the calibration status of your displays, even if an image looks acceptable for interpretation and/or review. Calibration software with a front sensor and/or external photometer measurements should be used on a regular basis, with documentation and records retention.
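The non-conformance in case 6 can be quantified rather than eyeballed. Below is a minimal Python sketch of such a check, assuming the Grayscale Standard Display Function polynomial published in DICOM PS3.14; the luminance inputs would come from a photometer, and the function and variable names are my own.

```python
import math

def gsdf_luminance(j):
    """DICOM PS3.14 GSDF: luminance in cd/m^2 for JND index j (1..1023)."""
    a, b, c = -1.3011877, -2.5840191e-2, 8.0242636e-2
    d, e, f = -1.0320229e-1, 1.3646699e-1, 2.8745620e-2
    g, h, k, m = -2.5468404e-2, -3.1978977e-3, 1.2992634e-4, 1.3635334e-3
    x = math.log(j)
    num = a + c*x + e*x**2 + g*x**3 + m*x**4
    den = 1 + b*x + d*x**2 + f*x**3 + h*x**4 + k*x**5
    return 10.0 ** (num / den)

def jnd_index(lum):
    """Invert the GSDF by bisection (it is monotonically increasing)."""
    lo, hi = 1.0, 1023.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if gsdf_luminance(mid) < lum:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def gsdf_targets(l_min, l_max, n=18):
    """Expected luminances for n patches equally spaced in JND index
    between the display's measured black and white levels."""
    j0, j1 = jnd_index(l_min), jnd_index(l_max)
    return [gsdf_luminance(j0 + i * (j1 - j0) / (n - 1)) for i in range(n)]
```

A conformance check would then compare each measured patch against `gsdf_targets(L_min, L_max)` and flag any deviation beyond the tolerance your QC program specifies.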

7. Phantom image retention – Routine quality control imaging of phantoms and/or other test objects is valuable not only for compliance/accreditation requirements but also as documentation proving that some form of constancy evaluation of image quality is being performed on your imaging unit. Regulatory officers/inspectors may audit a facility to verify that these phantom images have been retained and are consistent. Failing to image phantoms routinely, or to retain the images, can result in fines should an officer decide to enforce this. Furthermore, image retention can be extremely valuable in the event your facility is hit with a lawsuit and/or allegation.

Lessons learned: know your regulatory/accreditation requirements and adhere to them fully. Even where these requirements do not apply to a given modality, routine phantom imaging and retention should still occur for the reasons mentioned previously.

8. Dental facility being sued – I was called in by a lawyer to act as an expert consultant for a dental facility being sued by one of its staff members. The staff member claimed that her breast cancer (resulting in a double mastectomy) and cataracts were caused by use of the facility’s panoramic x-ray unit. I reviewed documentation covering past service/x-ray compliance reports, the Ministry approval to install the x-ray unit, and other related records. Furthermore, I conducted a comprehensive evaluation of the unit according to AAPM TG 175 (Acceptance Testing and Quality Control of Dental Imaging Equipment). Based on my evaluation, the unit passed every compliance requirement evaluated or reviewed.

Lessons learned: even if a modality produces a low amount of radiation relative to others (e.g., dental panoramic vs. diagnostic CT), ensure that you follow all compliance requirements and retain the documentation for the required amount of time. Lawsuits are a real thing and can occur at ANY facility!

9. Accidental exposure – An x-ray technologist performed both AP and LAT views of the tibia/fibula of a two-year-old child with a suspected fracture. During the AP view, with the child crying and the father restraining the child to keep still for the exam, the technologist forgot to provide shielding to both the child and the father. Lead protection was provided before the LAT view was taken, however. I was called in to perform a dose/risk analysis. Output measurements of the x-ray unit were taken, and the images in the PACS were reviewed.

Upon review of the image headers, including the attributes pertaining to exposure techniques, I asked the technologist if she knew what range the exposure indicator should fall within (the exposure indicator being a way to quantify how much radiation the detector received). The response was a blank stare. In a follow-up discussion with the diagnostic imaging manager and chief radiologist, it was explained that the increased radiation risk to the child and father from the first exposure, taken without any lead protection, was minimal. However, it was also recommended that at least this technologist receive some training on how much radiation is administered (via the exposure indicator).

Lessons learned: accidental exposures, even overexposures, deserve due diligence and the use of an expert to add value to the matter. As an incidental finding of this review, a knowledge gap was revealed for this particular technologist. If radiation is administered, it is crucial to know how much is being used. Training in current best practices and standards should be implemented.
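As a point of reference for what “knowing how much radiation is used” means in practice, the IEC 62494 exposure index standard defines a deviation index (DI) relating the measured exposure index (EI) to the target value; the example readings below are hypothetical. A quick sketch in Python:

```python
import math

def deviation_index(ei, ei_target):
    """IEC 62494 deviation index: DI = 10 * log10(EI / EI_T).
    DI = 0 means the detector exposure was on target; +3 is roughly
    double the target, -3 roughly half."""
    return 10.0 * math.log10(ei / ei_target)

# Hypothetical readings for an extremity exam:
print(deviation_index(400, 400))  # on target -> 0.0
print(deviation_index(800, 400))  # about +3: twice the intended exposure
```

A technologist who can read the DI immediately knows whether a repeat or a technique adjustment is warranted, which is exactly the gap this case exposed.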

10. Investigation of high radiation badge readings – An x-ray facility had a general radiography room with a control badge directly behind a glass window. Since the room went into use, the badge had not registered anything, but it was kept to monitor secondary radiation just in case. After about one year of use, however, the badge began recording values, which surprised and concerned staff members, particularly some pregnant x-ray workers. I was asked to investigate.

The facility had retained the documentation describing the shielding requirements it had been approved for. Upon reviewing this documentation and taking various radiation measurements, I noticed that the glass windows were half the lead-equivalent thickness they had been approved for (1.6 mm Pb vs. 3.2 mm Pb). Furthermore, use of the room had increased to approximately three times what it had been approved for. Putting two and two together with other related evaluations, I concluded that the increased use and under-specified glass windows were the cause of the higher badge readings.

Lessons learned: when planning the design of an x-ray room for approval, be sure to verify and document what shielding is actually installed. One should also be proactive and plan ahead when intending to increase the utilization of an x-ray machine, as this will increase the shielding requirements.
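The badge finding in case 10 can be reproduced as back-of-envelope arithmetic. The sketch below is illustrative only: it uses a simplistic single half-value-layer model with a hypothetical HVL for lead, whereas a real barrier evaluation would use broad-beam transmission data (e.g., from NCRP Report 147).

```python
# Hypothetical half-value layer of lead at general radiography energies.
HVL_PB_MM = 0.27

def transmission(thickness_mm, hvl_mm=HVL_PB_MM):
    """Fraction of radiation transmitted through a lead barrier,
    using a simplistic exponential (single-HVL) model."""
    return 0.5 ** (thickness_mm / hvl_mm)

approved = transmission(3.2)    # barrier as designed
installed = transmission(1.6)   # barrier as actually installed
workload_factor = 3             # room use roughly tripled

# Badge reading relative to what the original design anticipated:
relative_reading = workload_factor * installed / approved
print(f"badge reading roughly {relative_reading:.0f}x the design expectation")
```

Even under these rough assumptions, halving the barrier thickness and tripling the workload multiplies the expected badge reading by two orders of magnitude, which is consistent with a badge that suddenly starts registering.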

In conclusion, there are several kinds of unfortunate events (patient injury and/or death) that might end up in a legal case. You, as an IIP or medical physicist, might be asked to provide data or an opinion, or to serve as an expert witness, to get to the bottom of a situation so that corrective actions and/or compensation for those impacted can be achieved. If you have any questions, don’t hesitate to contact us.

Monday, February 8, 2021

PACS Professional Certification: CIIP vs. PARCA

PACS administrators, also called Imaging Informatics Professionals (IIPs), have been around for about 20 years, ever since providers started to realize that it takes dedicated support staff to manage a PACS and be responsible for its data integrity and operation. In 2004, a certification agency, the PACS Administrators Registry and Certification Association (PARCA), was formed to provide proof of a certain level of proficiency, demonstrated by passing a certification exam. Not long after that, the American Board of Imaging Informatics (ABII) was formed in a partnership between the ARRT and SIIM, providing another certification called the CIIP. The PARCA and CIIP certifications are quite different in their origins and intended target groups of professionals. This write-up is intended to clarify the different certification paths for potential candidates looking to become certified and for those trying to recruit IIP professionals. Here are the main differences:

1.       PARCA has four different certification exams (see figure). The basic level, shown at the bottom-left, is the Certified PACS Associate (CPAS), which requires both IT and clinical proficiency and therefore requires both the CPAS Clinical and CPAS IT certification exams. ONLY after passing this first level of certification can a candidate continue on to become a Certified PACS System Administrator (CPSA). There is also a special, stand-alone certification dedicated to DICOM, the CDIP or Certified DICOM Integration Professional, which is becoming increasingly important because IIPs have to do their own troubleshooting when there is a dispute between a vendor and the healthcare enterprise. CDIP teaches important skills such as how to use a DICOM sniffer to investigate performance issues and integration problems. ABII offers only one level of certification, the Certified Imaging Informatics Professional (CIIP). Note that ABII requires both a certain level of experience AND a minimum level of relevant imaging informatics training (i.e., semester credits) prior to CIIP certification, neither of which is required for the PARCA certifications. The CIIP qualification requirements are explained in more detail on the ABII website and consist of a point system; for example, one could qualify with 2 years of healthcare experience and a relevant Master’s degree, or with 5 years of experience, 36 hours of imaging informatics education, and an Associate degree, or with combinations thereof.

2.       The subject areas covered by the CIIP and PARCA certifications are quite different. Because ABII has only a single certification, there is some overlap between the CIIP and each of the PARCA certifications across the complete area of PACS: about 10% overlap between CIIP and PARCA CPAS IT, 30% between CIIP and CPAS Clinical, 30% between CIIP and CPSA, and 10% between CIIP and CDIP, as shown in the figure by the red dashed line; 80% of the CIIP subject matter is unique and different. To fully appreciate the differences between the CIIP and PARCA subject areas, one would need to compare the test content outlines and/or requirements item by item, but in general, the CPAS is more technically oriented and the CIIP more managerial, as it covers areas such as project management, procurement, education, and others. It is not uncommon for a CPAS candidate to become CPSA and/or CIIP certified as well; several professionals have sat for all of the exams.

3.       The CPAS, the basic PARCA level, has two distinct target audiences. First, CPAS is appropriate for non-healthcare professionals, or professionals in a related field, who are looking for a career change and want to get into the PACS profession. These are mostly radiologic technologists or IT professionals who need additional training in either the IT or the clinical area. The second audience is professionals who already work in PACS but never had the opportunity to gain either the clinical or the IT skills and would like to become trained and certified in that area. The first audience, individuals without any PACS background, is not served by the CIIP certification because CIIP requires a certain number of years of experience. The same is true for those who were unable to complete formal education in the form of a certain number of college credits: they can take the CPAS certification but are excluded from taking the CIIP exam.

4.       PARCA and ABII took very different approaches to developing what are called certification requirements or a Test Content Outline (TCO). The PARCA exam requirements were developed top-down by educators in the field, starting with outcome measurements and learning objectives. The ABII requirements (TCO) were developed bottom-up, based on a survey and a subsequent committee of volunteers. The good news is that SIIM is in the process of “reverse engineering” the ABII requirements to make the curriculum more coherent. The same applies to the supporting materials: the PARCA textbooks are written by a single author, while the CIIP textbook was created by various authors, which makes it challenging to maintain a consistent style and to prevent gaps and overlaps.

5.       PARCA has an international focus. The majority of its board members are from outside the US, 20% of its candidates are from outside the US, and that share is growing. One should also realize that in many countries, especially LMIC (lower- and middle-income country) regions, PACS is still in its infancy and therefore a prime target for CPAS certification, while CIIP certification, which is aimed at people with experience, would not be an option.

6.       CIIP exams have to be taken at a physical testing center. In contrast, PARCA has offered online exams since 2005, which allows them to be taken “any time, anywhere.” The exams are online-proctored: each candidate has a dedicated one-on-one proctor who watches continuously during the examination. Any potential distraction and/or indication that a candidate might be using an external resource, or even looking away from the screen, will invalidate the exam. This guarantees a secure test environment. Electronic exams by telepresence, instead of physical presence at a test center, have other advantages as well: no two exams are identical, as the questions are pulled randomly from a test pool at least twice the actual test size, with the multiple-choice answers rearranged for each test. In addition, online certification testing is the only way to offer international certification exams, because many countries do not have access to physical testing centers. Lastly, online testing is much more cost-effective, which is most likely one of the reasons the PARCA certification exam costs less than half the CIIP certification, while the PARCA retake fee ($20) is less than one-tenth of the CIIP retake cost ($250).

7.       The difference between PARCA and CIIP is also reflected in the training options provided by third parties such as SIIM, OTech, and others (note that PARCA and ABII do not offer any training themselves, as they try to keep a "barrier" between trainers and examiners). PARCA CPAS training is typically a multi-day event that goes through the materials in depth, because much of the content is new to the candidates. CIIP training is typically a bootcamp that gives experienced professionals the opportunity to refresh their knowledge prior to taking the exam. The PARCA IT and CDIP training are also more hands-on than the CIIP, as those topics are difficult to master without hands-on exercises.

8.       There are other, minor differences between PARCA and CIIP, such as the requirements for continuing education and re-certification, and how current the content is: the PARCA requirements have recently been updated to include new technologies such as AI and cybersecurity, while the CIIP Test Content Outline is up to date as of 2019.

In conclusion, both the PARCA and CIIP certifications are valuable, and given that after 16 years thousands of professionals have become certified, both have certainly established a new standard of professionalism that benefits the imaging and informatics community. It would be beneficial for the IIP community if PARCA and ABII were to merge and work toward a more consolidated approach, taking the best of both worlds. Time will tell whether this happens.

For more details about either certification see PARCA and ABII.

About the author: Herman Oosterwijk has taught the PARCA curriculum since its inception. He created detailed study guides for both the CIIP and PARCA certifications. He is also part of the SIIM committee creating a new internship program based on the CIIP curriculum and is part of the SIIM CIIP bootcamp faculty.

Tuesday, January 19, 2021

Common HL7® Order to DICOM Mapping Issues

Most PACS administrators are not that familiar with HL7®, which can be a disadvantage when troubleshooting issues at the PACS back end, as many of the Attributes in the DICOM header are initially created and/or mapped from an HL7 message. HL7 messages are not that hard to interpret, as they are encoded in plain ASCII text. A simple browser or even Notepad will show you the message, allowing you to find out why a PACS either rejects a DICOM file or stores it in the “lost and found” input queue instead of making it available to a radiologist at the workstation.

An HL7 order message (ORM) is typically received by a PACS, RIS, or broker, which adds the order to its scheduling database. Upon a query request from a digital modality, it provides a DICOM worklist by mapping the HL7 fields into DICOM Attributes, which the modality subsequently copies into the DICOM header of the images it sends to the PACS. The most common reasons for rejection by the PACS can be traced back to a handful of HL7-to-DICOM mapping issues, described below:
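To make the plain-text nature of HL7 concrete, here is a minimal sketch of pulling fields out of an HL7 v2 order with nothing but string splitting. The message content is made up; the field positions follow the usual v2 conventions (e.g., PID-5 for patient name).

```python
# A made-up HL7 v2 ORM message: segments end in \r, fields are
# separated by |, components within a field by ^.
msg = (
    "MSH|^~\\&|RIS|HOSP|PACS|HOSP|202101190830||ORM^O01|123|P|2.3\r"
    "PID|1||MR12345^^^HOSP||DOE^JANE^A||19700101|F\r"
    "OBR|1|ORD9876||71020^CHEST PA AND LAT^C4\r"
)

# Index each segment by its three-letter ID (fine for this sketch;
# real messages can contain repeating segments).
segments = {s.split("|")[0]: s.split("|") for s in msg.strip().split("\r")}

patient_id   = segments["PID"][3]  # PID-3: patient identifier list
patient_name = segments["PID"][5]  # PID-5: patient name (XPN data type)
procedure    = segments["OBR"][4]  # OBR-4: universal service ID
print(patient_name)  # DOE^JANE^A
```

With this level of access, each of the mapping issues below can be spotted by inspecting the raw field before it ever reaches the DICOM worklist.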

1.       Length mismatch – Many HL7 fields allow a greater length than the corresponding DICOM Attributes. When the value arrives at the PACS, the PACS might check its length, decide that it exceeds the database record length, and reject it. As an example, the person name data type in HL7 (XPN) can be a maximum of 1103 characters long with 14 subcomponents, including items such as name validity range, expiration date, etc., while the DICOM person name data type (PN) is a maximum of 64 characters long with 5 components.
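One defensive approach, sketched below (the length table is abbreviated and the function name is my own), is for the interface engine to enforce the DICOM value-length limits itself rather than letting the PACS reject the whole object:

```python
# Maximum lengths for a few DICOM value representations.
DICOM_VR_MAX = {"PN": 64, "LO": 64, "SH": 16}

def fit_to_vr(value, vr):
    """Clip an HL7-sourced value to the DICOM VR limit, warning
    so the truncation is visible in the interface logs."""
    limit = DICOM_VR_MAX[vr]
    if len(value) > limit:
        print(f"warning: {vr} value truncated from {len(value)} to {limit} chars")
        return value[:limit]
    return value
```

Silently truncating is itself a trade-off; logging the event at least makes the data loss auditable.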

2.       Formatting mismatch – Some HL7 fields encode their components in a different order than the corresponding DICOM fields. An example is the order of the name components: the HL7 person name (XPN) is <last>^<first>^<middle>^<suffix>^<prefix>, while in DICOM it is <last>^<first>^<middle>^<prefix>^<suffix>; note the suffix/prefix reversal.
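The reversal above amounts to a one-line swap; here is a sketch of the corresponding mapping function (the helper name is my own):

```python
def xpn_to_pn(xpn):
    """Map an HL7 XPN name (last^first^middle^suffix^prefix) to a
    DICOM PN value (last^first^middle^prefix^suffix)."""
    # Pad to five components, then swap the last two.
    last, first, middle, suffix, prefix = (xpn.split("^") + [""] * 5)[:5]
    return "^".join([last, first, middle, prefix, suffix]).rstrip("^")

print(xpn_to_pn("DOE^JOHN^Q^JR^DR"))  # DOE^JOHN^Q^DR^JR
```

An interface that simply copies the field verbatim will produce names like “DOE^JOHN^Q^JR^DR” in the DICOM header, with JR displayed where the prefix belongs.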

3.       Contents mismatch – Some HL7 fields have a different set of defined terms. For example, in HL7 v2.1 the set of recommended values for patient sex is F, M, O, U; in version 2.7 it has been expanded to F, M, O, U, A, and N. In DICOM, the list of defined terms is F, M, and O. All of the HL7 codes other than M and F should therefore be mapped to O; if this is not done, the PACS will complain.
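A small translation table makes this mapping explicit; the fallback-to-O behavior for unrecognized codes is a design choice in this sketch, not something either standard mandates:

```python
# HL7 v2.7 administrative sex codes mapped onto the DICOM
# Patient's Sex (0010,0040) defined terms F, M, O.
HL7_TO_DICOM_SEX = {"F": "F", "M": "M", "O": "O", "U": "O", "A": "O", "N": "O"}

def map_sex(hl7_code):
    # Unrecognized codes also fall back to O rather than being
    # passed through and rejected downstream.
    return HL7_TO_DICOM_SEX.get(hl7_code, "O")
```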

4.       Location mismatch – The patient ID in HL7 can appear in different fields depending on whether it is the internal ID, external ID, social security number, or alternate patient ID; as of version 2.3.1, they can all appear in a single field as a list that includes the issuer of each patient ID. One needs to select the right ID to be used as the primary ID and map it into the Patient ID Attribute (0010,0020). Any others that a PACS wants to carry along in the header should be mapped into the Other Patient IDs Attribute (0010,1000). Incorrect mapping might cause issues with patient identification.
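A sketch of selecting the primary ID from a repeating PID-3 field follows. The issuer comparison against the fourth XPN-style component (the assigning authority) is simplified; real assigning authorities have subcomponents, and all names here are my own:

```python
def split_patient_ids(pid3, primary_issuer):
    """Split an HL7 v2.3.1+ PID-3 repetition list into the value for
    Patient ID (0010,0020) and the leftovers for Other Patient IDs.
    Each repetition looks like id^checkdigit^scheme^issuer."""
    reps = [rep.split("^") for rep in pid3.split("~")]
    primary = next(r[0] for r in reps
                   if len(r) >= 4 and r[3] == primary_issuer)
    others = [r[0] for r in reps if r[0] != primary]
    return primary, others

print(split_patient_ids("MR12345^^^HOSP~999-99-9999^^^SSA", "HOSP"))
# ('MR12345', ['999-99-9999'])
```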

5.       Coding issues – Procedures are encoded using their CPT-4 codes along with descriptions and abbreviations. Hospitals might use internal code systems, might mix abbreviations with full descriptions, etc., causing incorrect procedure selection by the technologist at the modality.

6.       Mapping the procedure to the right modality – An HL7 order does not typically include the modality type (e.g., CT, MRI). The scheduling application has to map the procedure to a modality based on the procedure code; e.g., it knows which procedure codes are performed on a CT, MRI, ultrasound, or other modality. A problem occurs when there are multiple devices of the same type, e.g., an inpatient CT in radiology and an outpatient CT in the ER, as the order might be listed on both. The same applies to having ultrasound units in the delivery, radiology, and cardiology departments: mapping all of these procedures to a single modality type, e.g., “US,” would cause the procedures to show up on every US unit. In this case, the scheduler has to look at additional information, such as the patient location and class, and map the order to a particular Scheduled Station AE Title that the modality can use to request the correct worklist for that device.
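One way to implement such routing is a lookup keyed on both the procedure code and the patient class; everything below (codes, classes, AE Titles) is hypothetical:

```python
# Hypothetical routing table: (procedure code, patient class) decides
# which Scheduled Station AE Title receives the worklist entry.
ROUTES = {
    ("71260", "I"): "CT_RAD",  # inpatient chest CT -> radiology scanner
    ("71260", "O"): "CT_ER",   # outpatient chest CT -> ER scanner
    ("76805", "I"): "US_OB",   # obstetric US -> delivery department unit
}

def scheduled_ae_title(procedure_code, patient_class):
    """Return the AE Title whose worklist should carry this order,
    or None so unrouted orders can be flagged for manual review."""
    return ROUTES.get((procedure_code, patient_class))
```

Returning None for unmapped combinations, rather than a default device, keeps routing gaps visible instead of silently putting the order on every worklist.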

7.       User entry errors – These are not mapping errors per se; they are caused by data entry mistakes. For example, a data entry person might put the last name AND first name in the last name field. The information will be mapped correctly, but the meaning of the data is incorrect.

In conclusion, the integrity of the image data in a PACS or enterprise imaging system is essential. Issues with incorrect, inconsistent, or missing information can often be traced back to the source, i.e., the HL7 order that was mapped into the DICOM worklist retrieved by a modality and subsequently mapped into the DICOM image header. Knowing how to interpret HL7 transactions will go a long way toward troubleshooting these issues when they occur.