Saturday, June 17, 2017

SIIM 2017 Top Ten Observations.

The 2017 SIIM (Society for Imaging Informatics in Medicine) meeting was held in Pittsburgh, PA on June 1-3.
View back to the city from Allegheny
The meeting was well attended, both by users and by an increasing number of exhibitors. This meeting is mostly attended by PACS professionals, typically PACS administrators, in addition to several “geeky” radiologists who have a special interest in medical informatics. Pittsburgh, in addition to being somewhat “out of the way,” was not a bad choice for a conference; downtown was quite nice and readily accessible, actually better than I expected. Here are my top ten takeaways from the meeting:

1.     AI (Artificial Intelligence) is still a very popular topic. The title of the keynote speech by Dr. Dreyer from Mass General says it all: “Harnessing Artificial Intelligence: Medical Imaging’s Next Frontier.” AI also goes by the name of “deep learning,” reflecting the fact that it uses large databases of medical information to determine trends, make predictions, support precision medicine approaches, and provide decision support for physicians. Another term people use is “machine learning,” and I would argue that CAD (Computer Aided Diagnosis) is a form of AI as well. One of the major draws for this new topic is that some professionals are arguing that we won’t need radiologists anymore in the next 5-10 years as they are going to be replaced with machines. In my opinion, much of this is hype, but I believe that in two areas there will be a potentially significant impact on the future of radiology. First, for radiography screening, AI could help to rule out “normal.” Imagine breast screening or TB screening of chest images: one could potentially eliminate the reading of the many exams that appear normal to the computer, freeing the physician to concentrate on the “possible positives” instead (a minimal sketch of this triage idea follows below). Second, several new startup companies showed some kind of sophisticated processing that can assist a radiologist with diagnosis for very specific niche applications. There are a couple of issues with the latter. A radiologist might have to perform some extra steps and/or analyses, which could impact performance and throughput, so the application will have to provide a significant clinical advantage. Also, licensing additional software is a cost that might or might not be reimbursed. In conclusion, AI’s initial impact will be small, and despite the major investments (GE investing $100M in analytics), I don’t think it will mean the end of the radiology profession in the near future. A quote from Dr. Dreyer summed it up: “it will not be about Man vs. AI, but rather Man with AI vs. Man without AI.”
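
To make this triage idea concrete, here is a minimal sketch in Python, assuming a hypothetical AI model that outputs an abnormality score per study; the class, threshold, and numbers are invented for illustration, not any vendor’s product:

```python
# Hypothetical "rule out normal" triage for screening exams.
from dataclasses import dataclass

@dataclass
class ScreeningStudy:
    accession_number: str
    abnormality_score: float  # 0.0 (certainly normal) .. 1.0 (certainly abnormal)

def triage(studies, normal_threshold=0.05):
    """Split studies into an 'assumed normal' pile and a radiologist worklist.

    The threshold is a placeholder; it would have to be validated clinically.
    """
    auto_normal, needs_read = [], []
    for study in studies:
        if study.abnormality_score < normal_threshold:
            auto_normal.append(study)   # computer calls it normal
        else:
            needs_read.append(study)    # "possible positive" goes to the radiologist
    return auto_normal, needs_read

normal, worklist = triage([ScreeningStudy("ACC001", 0.01), ScreeningStudy("ACC002", 0.62)])
print(len(normal), "ruled out,", len(worklist), "to the radiologist")
```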

2.     Cyber warfare is getting real. The recent WannaCry incident shut down 16 hospitals in the UK, which created chaos as practitioners had to go back to paper. As we are now living in the IoT (Internet of Things) era, we should be worried about ransomware and hacking. Infusion pumps, pacemakers and other devices can be accessed, and their characteristics and operating parameters can be modified. It is interesting that HIPAA regulations already covered many of the security measures that could prevent and/or manage these incidents, but in the past most institutions focused mostly on patient privacy. Of course, patient privacy is a major issue, but it might be prudent for institutions to shift some of the emphasis to network security, as a breach there could be potentially more damaging. Imagine the potential impact of one patient’s privacy being compromised vs. infusion pumps going berserk, or a complete hospital shutdown.

3.     Facilitating the management of images created by the “ologies” is still very challenging. Enterprise imaging, typically done using an enterprise archive such as a VNA as the imaging repository, is still in its infancy. The joint HIMSS/SIIM working group has done a great job outlining all of the needed components and has defined somewhat of an architecture, but there are still several issues to be resolved. When talking with the VNA vendors, the top issue that seems to come up universally is that the workflow of non-traditional imaging is poorly defined and does not lend itself very well to being managed electronically. For example, imagine a practitioner performing an ultrasound during anesthesia, or an ER physician taking a picture of an injury with his or her smartphone. How do we match up these images with the patient record in such a way that they can be managed? Most radiology imaging is order driven, which means that a worklist entry is available from a DICOM Modality Worklist provider; however, most of the “ologies” are encounter driven. There is typically no order, so hunting for the patient demographics from a source of truth can be challenging. There are several options: one could query a patient registration system using HL7, using a patient RFID or wristband as a key; if FHIR takes off, one could use a FHIR resource as the source; one could use admission (ADT) transactions instead; or one could build a direct interface to a proprietary database (a sketch of the FHIR option follows below). There are probably another handful of options, which is exactly the problem: there is no single standard that people are following. The good news is that IHE is working on the encounter-based workflow, so we are eagerly awaiting their results.
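
As an illustration of the FHIR option, here is a minimal Python sketch that looks up patient demographics by medical record number; the server URL and identifier value are made-up examples, and any DSTU2/STU3 server with a Patient endpoint would look similar:

```python
import requests

FHIR_BASE = "https://fhir.example-hospital.org/baseDstu3"  # hypothetical endpoint

def lookup_patient(mrn: str) -> dict:
    """Search the Patient resource by business identifier (e.g. from a wristband scan)."""
    resp = requests.get(
        f"{FHIR_BASE}/Patient",
        params={"identifier": mrn},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()
    if bundle.get("total", 0) == 0:
        raise LookupError(f"No patient found for MRN {mrn}")
    return bundle["entry"][0]["resource"]  # first matching Patient resource

patient = lookup_patient("123456")
print(patient["name"][0]["family"], patient.get("birthDate"))
```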

4.     Patient engagement is still a challenge. There is no good definition of patient engagement in my opinion, and different vendors are implementing only piecemeal solutions. Here is what HIMSS has to say about this topic:
Patient engagement is the activity of providers and patients working together to improve health. A patient’s greater engagement in healthcare contributes to improved health outcomes, and information technologies can support engagement. Patients want to be engaged in their healthcare decision-making process, and those who are engaged as decision-makers in their care tend to be healthier and have better outcomes.
 
Many think of patient engagement as being equivalent to having a patient portal. The top reasons patients want to use a portal are making appointments, renewing prescriptions, and paying their bills. However, none of these is a true clinical interaction. Face-to-face communication using, for example, Skype or another video service, or simply an email exchange dealing with clinical questions, is very important. One of the issues is that the population group that is the first to use these portals is also the group that already takes responsibility for its own health.
The challenge is to reach the non-communicative, passive group of patients and keep a check on their blood pressures, glucose levels, pacemaker records, etc. Also, portals are not always effective unless they can be accessed using a smartphone. This assumes, of course, that people have a phone, which one of the participants in the discussion solved by providing free phones to the homeless, so that texts can be sent for medication reminders and to check up on them. Different approaches are also needed; as a point in fact, Australia made massive investments in patient portals, but because patients were not enrolled by default and had to opt in, only 5 percent of them were using the portals.
One of the vendors showed a slick implementation whereby the images of a radiology procedure were sent to a personal health record in the cloud, and from there could easily be forwarded to any physician authorized by the patient. This is a major improvement and could impact the CD exchange nightmare we are currently experiencing. I personally take my laptop, with my images loaded on it, to my specialists, as I have had several issues in the past with specialists having no CD reader on their computers or lacking a decent DICOM viewer. There are still major opportunities for vendors to make a difference here.

Packed rooms for educational sessions
5.     FHIR (Fast Healthcare Interoperability Resources) is getting traction, albeit limited. If you want one good example of hype, it would be the new FHIR standard. It has been touted as the one and only solution for every piece of clinical information and has even made it into several of the federal ONC standard guidelines. Now back to reality. We are on the third release of the Draft Standard for Trial Use (DSTU3); typically, there is only one draft before a standard, and it is still not completely done. Its number of options is concerning as well. And then, assuming you have an EMR that has just introduced a FHIR interface (maybe DSTU version 2 or 3) for one or more resources, are you going to upgrade it right away to make use of it? But to be honest, yes, it will very likely be used for some relatively limited applications; examples are the physician resource used by the HIE here in Texas to find information about referrals, or, as one of the SIIM presenters showed, a FHIR interface to get reports from an EMR to a PACS viewing station (a minimal sketch of that pattern follows below). But there are still many questions to be addressed before we get what David Clunie calls “universal access to mythical distributed FHIR resources”.

6.     The boundary between documents and images remains blurry. When PACS were limited to radiology images, and document management systems were limited to scanned documents, life was easy and there was a relatively clear division between images and documents. However, this boundary has become increasingly blurry. Users of PACS systems started to scan documents such as orders and patient release forms into the PACS, archiving them as encapsulated DICOM objects, either as bitmaps (aka “Secondary Captures”) or encapsulated PDFs. Some modalities, such as ophthalmology devices, started to create native PDFs, and bone densitometry (“DEXA”) scanners were showing thumbnail pictures of the radiographs with a graph of the measurements in a PDF format. Then we got the requirement to store native PNG, TIFF, and JPEG files, and even MPEG videos, in the PACS as well. At the same time, some of the document management systems were starting to store JPEGs as well as ECG waveforms that were scanned in. By the way, there has been a major push for waveform vendors to create DICOM output for their ECGs, which means they would now be managed by a cardiology PACS. And managing diagnostic reports is an issue by itself: some store them in the EMR, some in the RIS, some in the PACS, and some in the document management system. The fact that the boundary is not well defined is not so much of an issue; what matters is that each institution decides where the information resides and creates a universal document and image index and/or resource so that viewers can access the information in a seamless manner.

7.     The DICOMweb momentum is growing. DICOMweb is the DICOM equivalent of FHIR and includes what most people know as WADO, i.e. Web Access to DICOM Objects, but there is more to it, as it also allows for images to be uploaded (STOW) or queried (QIDO), and it even provides a universal worklist allowing images to be labelled with the correct patient demographics before sending them off to their destination. There have been three iterations of web access in DICOM, each building on the previous one with regard to functionality and using more advanced technology, keeping them current with state-of-the-art web services. One should realize that the core of DICOM, i.e. its pixel encoding and data formats, has not changed (we still deal with “DICOM headers”), but the protocol, i.e. the mechanism to address a source and destination as well as the commands to exchange information, has become much simpler. As a matter of fact, as the SIIM hackathon showed, it is relatively easy to write a simple application using the DICOMweb services (see the sketch below). As with FHIR, DICOMweb is still somewhat immature, and IHE is still trying to catch up. Note that the XDS-I profile is based on the second iteration, which used SOAP (XML-encapsulated) messaging that has recently been retired by the DICOM standards committee. The profile dealing with the final version of WADO, called MHD-I, is still very new. There is a pretty good adoption rate though, and many PACS systems are implementing WADO, which, unlike FHIR, can be done by a simple proxy implementation on top of an existing traditional DICOM interface.
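
For a flavor of how simple these RESTful services are, here is a minimal QIDO-RS/WADO-RS sketch; the server root is hypothetical, and real deployments differ in paths and authentication:

```python
import requests

BASE = "https://pacs.example.org/dicomweb"  # hypothetical DICOMweb root

# QIDO-RS: search for CT studies on a given date, returned as DICOM JSON
studies = requests.get(
    f"{BASE}/studies",
    params={"ModalitiesInStudy": "CT", "StudyDate": "20170601"},
    headers={"Accept": "application/dicom+json"},
    timeout=10,
).json()

for study in studies:
    study_uid = study["0020000D"]["Value"][0]  # Study Instance UID tag
    # WADO-RS: retrieve the whole study as a multipart DICOM response
    resp = requests.get(
        f"{BASE}/studies/{study_uid}",
        headers={"Accept": 'multipart/related; type="application/dicom"'},
        timeout=60,
    )
    print(study_uid, len(resp.content), "bytes")
```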

The radworkflow space
8.     Ergonomics is critical for radiology. I can feel it in my arm when I am typing or using a mouse for an extended time. Imagine doing that day-in and day-out while staring at a screen in the half-dark; no wonder radiology practitioners have issues with their arms, necks, and eyes. Dr. Mukai, a practicing radiologist who started to rethink his workspace after having back surgery, is challenging the status quo with what he calls the radworkflow space, i.e. don’t think about a workspace but rather a flow space (see link to his video). He built his own space addressing the following requirements:
a.     You need a curved area when looking at multiple monitors, with a table and chair that can rotate, making sure you always have a perpendicular view. Not only does this reduce the viewing-angle distortion from the monitors, it is also easy on your neck muscles.
b.    Everything should be voice activated, and all audio in and out should be integrated: voice activation, dictation software, and phone.
c.     Two steps are too many, and two seconds for a retrieval is too long. It is amazing to think that image retrievals in the 1990s, using a dedicated fiber to the big monitors of the first PACS systems used by the Army, were as fast as or possibly faster than what is state-of-the-art today. Moore’s law of faster, better, and more computing power apparently does not apply to PACS.
d.    Multiple keyboards are a no-no, even when controlling three different applications on six imaging monitors (one set for the PACS, one set for the 3-D software, and one set for outside studies).
Hopefully, vendors are taking notes and will start implementing some of these recommendations; it is long overdue.

Camera mounted at the X-ray source
9.     Adding a picture to the exam to assist in patient identification. As we know, there are still way too many errors made in healthcare delivery that could potentially be prevented. Any tool that allows a practitioner to double-check patient identity in an easy manner is recommended. A company exhibiting at SIIM had a simple solution: a small camera that can be mounted at the x-ray source takes a picture of the patient and makes it part of the study as a DICOM Secondary Capture image. I noticed two potential issues that need to be addressed. First, does it work with an MRI, i.e. what is the impact of a strong magnetic field on its operation? Second, now that we can better identify the patient, how do we de-identify the study when needed? We would need to delete that image from the study prior to sharing it for clinical trials or teaching files, or when sharing it through any public communication channel (see the sketch below).
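
A minimal sketch of that de-identification step, assuming the patient photos were stored as Secondary Capture objects; a real de-identification pipeline would of course do much more (names, UIDs, burned-in annotations, etc.):

```python
from pathlib import Path
import pydicom

SC_SOP_CLASS = "1.2.840.10008.5.1.4.1.1.7"  # Secondary Capture Image Storage

def keep_for_sharing(dicom_file: Path) -> bool:
    """Reject Secondary Capture instances (e.g. the patient photo) from the share set."""
    ds = pydicom.dcmread(dicom_file, stop_before_pixels=True)
    return ds.SOPClassUID != SC_SOP_CLASS

study_dir = Path("study_to_share")  # hypothetical folder of .dcm files
all_files = list(study_dir.glob("*.dcm"))
shareable = [f for f in all_files if keep_for_sharing(f)]
print(f"{len(shareable)} of {len(all_files)} instances kept for sharing")
```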

Nice dashboard from Cincinnati Children's
10.  Dashboards assist in department awareness. I am all in favor of dashboards, both clinical and operational, as they typically allow one to see graphically what is going on. I liked the poster shown by Cincinnati Children’s: a display placed in a prominent space in the department shows its operational performance, such as the number of unread procedures, turnaround time, and a list of doctors who are on call, along with a news and weather link. They pulled this data from their PACS/RIS system with some simple database queries (a sketch of the idea follows below). This is a good example of how to provide feedback to the staff.
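
The queries behind such a dashboard can be as simple as the sketch below; the table and column names are invented, since every RIS/PACS schema is different, and many vendors only allow read-only access to a reporting replica:

```python
import sqlite3

conn = sqlite3.connect("ris_replica.db")  # hypothetical reporting copy of the RIS

# Number of procedures still waiting for a report
unread = conn.execute(
    "SELECT COUNT(*) FROM studies WHERE report_status = 'UNREAD'"
).fetchone()[0]

# Average report turnaround in hours over the last 7 days
avg_tat_hours = conn.execute(
    """SELECT AVG((julianday(report_signed) - julianday(study_completed)) * 24)
       FROM studies
       WHERE report_signed IS NOT NULL
         AND study_completed >= date('now', '-7 days')"""
).fetchone()[0]

print(f"Unread procedures: {unread}, 7-day avg turnaround: {avg_tat_hours:.1f} h")
```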


As mentioned earlier, I thought that SIIM2017 was a pretty good meeting, not only for networking with fellow professionals, but also for learning what’s new, seeing a couple of innovative small start-up companies, especially in the AI domain, and, last but not least, enjoying a bit of Pittsburgh, which pleasantly surprised me. Next year’s meeting will be in the DC area again, actually National Harbor, MD, which despite its proximity to Washington will not be a match for this year’s location, but regardless, I’ll be looking forward to it.

Wednesday, June 14, 2017

Top 10 lessons learned when installing digital imaging in developing countries.

Patient at Zinga Children's Hospital, close to Dar es Salaam, Tanzania, recipient of a Rotary International grant for imaging equipment
Installing a digital medical imaging department in a developing country is challenging, which is probably an understatement. The unique environment and the lack of resources, money and training pose barriers to creating a sustainable system.

As anyone who has worked in these countries will attest, sustainability is key, as witnessed by the numerous empty buildings, sometimes half finished, and the non-working equipment due to a lack of consumables or spare parts, or simply not having the correct power, A/C or infrastructure.

I learned quite a bit when deploying these systems as a volunteer, especially through gracious grants by Rotary International and other non-profits, which allowed me to travel and support these systems in the field. Some of these lessons seem obvious, but I had to re-learn that what is obvious in the developed world is not necessarily the case in the developing countries of the world.

So, here are my top 10 lessons learned in the process:

1.       You need a “super user” at the deployment site with a minimum set of technical skills. Let’s take, as an example, a typical digital system for a small hospital or large clinic, which has one or two ultrasounds, a digital dental system, and a digital X-ray using either Direct or Computed Radiography (DR or CR). These modalities require a network to connect them to a server, a diagnostic monitor, and a physician viewer. Imagine that the images don’t show up at the view station: someone needs to be able to check the network connection and run some simple diagnostics to make sure the application software is running (see the sketch below). In addition to being able to do basic troubleshooting on-site, that person needs to function as the single point of contact for a vendor trying to support the system and be the ears and eyes for support.
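
As an example of such a basic check, here is a sketch of a DICOM C-ECHO (verification) using the open-source pynetdicom library (2.x); the AE titles, IP address and port are site-specific assumptions:

```python
from pynetdicom import AE
from pynetdicom.sop_class import Verification

ae = AE(ae_title="SITE_CHECK")
ae.add_requested_context(Verification)

# Try to associate with the archive/server (address and port are assumptions)
assoc = ae.associate("192.168.1.10", 104, ae_title="ARCHIVE")
if assoc.is_established:
    status = assoc.send_c_echo()  # DICOM "ping"
    print("C-ECHO status:", status.Status if status else "no response")
    assoc.release()
else:
    print("Could not associate: check cabling, IP address, and that the server application is running")
```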

2.       Talking about a “single point of contact,” I learned that it is essential to have a project manager on-site: one person who arranges for the equipment to be there, knows what the configuration looks like, checks that the infrastructure is ready, does the follow-up, etc. It is unusual for the local dealer to do all of this. There also might be construction needed to make a room suitable for taking X-rays (shielding etc.), A/C to be installed to prevent the computers from overheating, network cables to be pulled, and so on; there has to be a main coordinator to do this.

3.       You also need a clinical coordinator on site. This person takes responsibility for X-ray radiation safety (which is a big concern) and also does the QA checks, looking for dose creep (over-exposing patients) and performing reject analysis (what is the repeat rate for exams and why are they repeated). With regard to radiation safety, I have yet to see a radiation badge in a developing country, while wearing one is common practice in the developed world for any healthcare practitioner who could be exposed to X-ray radiation. As a matter of fact, I used to carry one with me all the time when I was on the vendor side and in radiology departments on a regular basis; I would get calls from the radiation safety officer in my company when I forgot and left the badge in my luggage going through the airport security X-ray scanners. There is little radiation safety infrastructure available in developing countries, and the use of protective gloves, lead aprons and other protective devices is not always strictly enforced; this is definitely an area where improvements can be made.

4.       Reporting back to the donors is critical. There are basically three kinds of reports, which are preferably shared on a monthly basis; as a matter of fact, this is a requirement for most projects funded by Rotary International grants: 1) operational reports that include information such as the number of exams performed by modality (X-ray, dental, ultrasound), age, gender, presenting diagnosis, exam type, etc.; 2) clinical reports that include quality measures such as exposure index, kV, mAs, etc.; and 3) outcomes reports that include demographics, trends, diagnoses, etc.
The operational reporting will indicate potential operational issues; for example, if the number of exams shows a sudden drop, there could be an equipment reliability issue. The clinical reporting will show whether the clinic follows good practices. The outcomes reporting is not only the hardest to quantify but also the most important, as it proves to potential donors, investors and the local government the societal and population health impact of the technology. This information is critical to justify future grant awards. (A sketch of the operational report follows below.)
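
Very little is needed to produce the operational counts; here is a sketch assuming the site can export a simple CSV exam log (the file name and column layout are assumptions):

```python
import csv
from collections import Counter

by_modality = Counter()
with open("exam_log_2017_05.csv", newline="") as f:  # hypothetical monthly export
    for row in csv.DictReader(f):  # assumed columns: date, modality, age, gender, exam_type
        by_modality[row["modality"]] += 1

for modality, count in sorted(by_modality.items()):
    print(f"{modality}: {count} exams")
```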

5.       Power backup and stabilizers are essential. Power outages are a way of life; every day there can be a four-hour or longer power outage. Therefore, having backup batteries and/or generators, in addition to a local UPS for each computer for short-term outages, is a requirement. One thing we overlooked is that even when there is power from the grid, the variation can be quite large; for example, a nominal 220V can fluctuate between 100 and 500 Volts. Needless to say, most electronic equipment will not withstand such high spikes, so we had to go back in and install a stabilizer at one site after we had a burnout; a stabilizer is now part of the standard package for new installs.

6.       Staging and standardization is a must. When I tried to install dental software on a PC on-site in Tanzania, it required me to enter a password. After getting back to a spot where I could email the supplier, I found that the magic word “Administrator” allowed me to start up the software, but not until a day’s work was lost, as the time difference between the US and East Africa is 9 hours. After that, it took me only 5 minutes to discover the next obstacle, “device not recognized,” which did not allow the dental bite-wing sensors to be used for capturing the X-rays. This caused another day’s delay, as it took another night to get an answer to that question. This shows that installing software on-site in the middle of nowhere is not very efficient unless you have at least two weeks’ time, which is often a luxury. And this was just a simple application; imagine a more complex medical imaging (PACS) system requiring quite a bit of configuration and setup; it would take weeks.

There are a few requirements to prevent these issues:

1) Virtualize as much as you can, i.e. use a pre-built software VM (virtual machine) that can be “dropped in” on site. The other advantage of a virtual machine is that it is easy to restore to its original condition, or to any intermediate state that was saved. It is interesting that the “virtualization trend,” which is common in the western IT world in order to save on computers, servers, and, most importantly, power and cooling capacity, is advantageous in these countries as well, but more for ease of installation and maintenance.

2) Stage as much as you can, but do it locally. If you preload the software on a computer in the US and ship it to, let’s say, Kenya, you will first be charged an import duty that can easily be 40%, and you also might be sending the latest and greatest server hardware that nobody knows how to support locally. Therefore, the solution is to source your hardware locally, providing local support and spare parts, stage it at a central local location that has internet access to monitor the software installation, and then ship it to the remote site.

3) Use standard “images,” which goes back to the “cookie-cutter” approach, i.e. have a single standardized software solution, for maybe three different facility sizes (small, mid-size and large), so that the variation is minimal.

7.       Use a dedicated network. This goes back to the early days of medical imaging in the western world. I remember that when we would connect a CT to the hospital network to send the images to the PACS archive, it would kill the network because of its high bandwidth demands. It is quite a different story right now: hospital IT departments have caught up and have been configuring routers into VLANs with fiber and/or gigabit-speed connections to facilitate the imaging modalities. But we are back to square one in the developing world; networks, if available, are unreliable, and might be open to the internet and/or to computers that are allowed to use flash drives (the number one virus source), so connecting these new devices to such a network would be asking for trouble. Therefore, when planning a medical imaging system, plan to pull your own cables and use dedicated routers and switches. If you use high-quality programmable and managed devices, they could become the core of the future hospital network, expanding beyond the imaging department.

8.       Have an Internet connection. The bad news is that there is typically no reliable or affordable wired internet connection; the good news is that the phone system leapfrogged the cable infrastructure, so you should plan for a 3G-compatible hotspot that allows a remote support expert to connect and take a look at the system in case there are any issues.

9.       Training is critical. Imagine buying a car for your 16-year-old daughter and just giving her the keys and telling her that she’ll be on her own. No one would do that, but now imagine deploying a relatively complicated system in the middle of nowhere, one that will allow people to make life-and-death decisions, without any proper training. I am not talking about clinical training on how to take an X-ray or do an ultrasound, but training on how to support the systems that are taking the images, communicating, archiving and displaying them. You need a person who takes the weekly back-ups to make sure that if there is a disk crash the information can be recovered, who will do the database queries to get the report statistics, who does the troubleshooting in case an image has been lost or misidentified, who is the main contact for the support people at the vendor, and so on. On-the-job training will not be sufficient. The good news is that it is relatively easy to create training videos and upload them to YouTube (or better, send them on a CD, as internet access might not always be available).

10.   Do not compromise on clinical requirements. I have seen darkroom processors being replaced with a CR and a commercial (i.e. non-medical) grade monitor to look at the images in a bright environment. This is very poor medical practice. No, you don’t need two medical-grade 3 megapixel monitors at a cost of several thousands of dollars; clinical trials have shown that a 2 MP monitor has the same clinical efficacy as a 3 MP one, but requires the user to use the zoom and pan tools a little bit more, which is acceptable in these countries. The key is to use a medical-grade monitor, which is calibrated so that each individual grayscale value can be distinguished from the next; if this is not the case, there is no question that valuable clinical information will be lost. Also, the so-called luminance ratio (the ratio between the brightest and darkest levels) does not have to be as high as long as the viewing environment is dark enough. So, as a rule of thumb, use an affordable medical-grade monitor and put it in a dark room (paint the windows and walls, hang curtains), and don’t skimp on these monitors.


In conclusion, none of these lessons learned are new; we learned most of them 20 years ago, but they tend to be forgotten or assumed, at least that is what happened to me when venturing out to these developing countries. The good news is that we can apply most of what we have learned, and therefore be successful in providing imaging to the remaining two-thirds of the world that does not yet have access to basic imaging capabilities, and thereby still make a major difference.

Monday, May 1, 2017

Digital Pathology: the next frontier for digital imaging; top ten things you should know about.

Typical pathology workstation (see note)
As the first digital pathology system has finally passed FDA muster and is ready to be sold and used in the USA, it is time for healthcare institutions to prepare for this new application. Before jumping in head first, it is prudent to familiarize yourself with the challenges of this application and learn from others, notably in Europe, who have been doing this for 5+ years. Here is a list of the top ten things you should be aware of.

1.       The business case for digital pathology is not obvious. Unlike the experience in radiology, where film was replaced by digital detectors and we could argue that the elimination of film, processors, file rooms and personnel would at least pay for some of the investment, digital pathology does not hold the promise of the same amount of savings. Lab technicians will still need to prepare the slides, and, as a matter of fact, additional equipment is needed to digitize the slides so they can be viewed electronically.
The good news is that pathology contributes very little to the overall cost of healthcare (0.2%), and therefore, even though the investment in scanners, viewers, and archiving storage is significant, the impact on the bottom line is small. Of course, there are lots of “soft” savings, such as never losing slides, being able to conference and get second opinions without having to send slides around, much less preparation time for tumor boards, much faster turnaround through telepathology, and the potential for Computer Aided Diagnosis. So, going digital makes perfect sense, but it might just be a little bit hard to convince your CFO.

2.       Most institutions are “kind of” ready to take the jump from an architecture perspective. Many hospitals are strategizing how to capture all of their pathology imaging, in addition to radiology and cardiology, in a central enterprise archiving system (aka Vendor Neutral Archive), and they might already have made small steps towards that by incorporating some of the other “ologies.” However, pathology is definitely going to be challenging, as the file sizes are huge. A sub-sampled, compressed digital slide can easily top 1.5 GB, so you should be ready to multiply your digital storage requirements by a factor of 10. As a case in point, the University of Utrecht, which has been doing this for 7 years, is approaching 1 petabyte of storage. So, even if you have an enterprise image management, archive and exchange platform in place, it will definitely need an adjustment. (A back-of-the-envelope sizing sketch follows below.)
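
A back-of-the-envelope sizing sketch; only the roughly 1.5 GB per slide figure comes from the text above, while the slide volume and working days are invented examples:

```python
# Rough annual storage estimate for digital pathology
slides_per_day = 200   # hypothetical workload
gb_per_slide = 1.5     # approximate size of a sub-sampled, compressed slide
working_days = 250     # assumed working days per year

tb_per_year = slides_per_day * gb_per_slide * working_days / 1000
print(f"~{tb_per_year:.0f} TB per year")  # ~75 TB/year at this volume
```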

3.       Pathology viewers are different from those of other “ologies.” Pathologists look at specimens in three dimensions, unlike radiologists who, in many cases, look at a 2-D plane (e.g. when looking at a chest radiograph). One could argue that looking at a set of CT or MRI slices is “kind of 3-D,” but it is still different from having to simulate looking at a slide under a microscope. The pathologist requires a 3-D mouse, which is readily available, to navigate the images. The requirements for the monitors are different from other imaging specialties as well; a large, good-quality color monitor will suffice for displaying the images, which is actually much less expensive (by a factor of 10) than the medical-grade monitors needed for radiology.

4.       Standard image formats are still in their infancy. This is something to be very aware of; most pathology management systems are closed systems, with the archive, viewer and workflow manager from the same vendor, and with little incentive to use the existing DICOM pathology standard for encoding the medical images. Dealing with proprietary formats not only locks you in to the same vendor, possibly making migration of the data to another vendor costly and lengthy, but also jeopardizes the whole idea of a single enterprise imaging archiving, management and exchange platform. Hopefully, user pressure will change this so that vendors will begin to embrace the well-defined standards that the DICOM and IHE communities have been working on for several years.

5.       Digital pathology will accelerate access to specialists. I remember, from several years back, visiting a remote area in Alaska when it switched to digital radiology and all the images were sent to Anchorage to be diagnosed. Prior to that, a radiologist would fly in two days a week, weather permitting, to read the images, so if you needed a diagnosis over the weekend, you were out of luck. The same scenario applies to having a pathologist at those locations; as of now, the samples are sent, weather permitting, to a central location to be read. Some locations have a surplus of pathologists; others have a shortage or even a complete lack of these medical professionals. Digital pathology will level the playing field from a patient access perspective. Without having to physically ship the slides and/or specimens, it will significantly decrease the report turnaround time and impact patient care positively.

Typical Slide scanner (see note)
6.       Digital pathology is the next frontier. Here is some more good news: vendors are spending hundreds of millions of dollars developing this new technology. Digital scanners that can load stacks of slides and scan them while matching them with the correct patient using barcodes are available, and workflow management software has improved. Last but not least, automatic detection and counting of certain structures in the images, instead of doing this manually, is a big step towards characterizing patterns, so that diagnoses can be made more accurately.

7.       Don’t expect to become 100% digital. Some applications still require a microscope. The experience at the Utrecht Medical Center in the Netherlands is that you may achieve a 95% conversion to digital, but there are still some outliers that require a microscope because of the nature of the specimen. However, this is very manageable and only a relatively small subset.

8.       Digital pathology has ergonomic advantages. Imagine having to bend over most of the day while looking through a microscope; you can imagine that doing that day-in, day-out for many years can strain your neck and back. Sitting in a comfortable chair, or working at a stand-up desk, is definitely better, although one still needs to be careful to pick the right mouse to avoid carpal tunnel syndrome.

There is a lot of opportunity for automated counting and detection (see note)
9.       Viewing capabilities are an order of magnitude better. This is obvious for professionals who read medical images as radiologists or cardiologists, but for pathologists, who were bound to a single view through a microscope and can now put multiple images next to each other and annotate them electronically, it is a completely new world.

10.   Research and education get a major boost. Imagine the difference when teaching a group of pathology students who used to look at similar tissue through their own microscopes and now can all access the same image on their computer monitors. One can build a database of teaching files and easily share them electronically. All of this seems obvious to anyone who is involved with medical imaging in other specialties, but for pathology this is a major step.

In conclusion, digital pathology is finally here in the USA. However, there are some hurdles, starting with convincing the people who hold the purse strings that it is a good investment, then adjusting the architecture and workflows to accommodate the huge image sizes, and making sure that these systems support open standards so you are not locked into a specific vendor. There are definitely major advantages, and the benefits may soon become so evident that it is only a matter of time before everyone jumps on the digital pathology bandwagon. It is strongly recommended that you learn from others, notably in Europe, who have been implementing this technology for several years.

Note: Illustrations courtesy of Prof. Paul van Diest, UMC Utrecht.

Tuesday, April 11, 2017

Top Ten VNA Requirements

The term VNA (Vendor Neutral Archive) has been loosely defined by different vendors, and its functionality varies widely among providers. Early implementations have seen some good success stories but have also, in several cases, caused confusion, initial frustration, and unmet expectations. The list below concentrates on the key features that are necessary for a successful implementation. The VNA should:

1.    Facilitate enterprise archiving: Enterprise archiving requires many different components; as a matter of fact, the joint SIIM/HIMSS working group has done a great job listing key components, including governance, a strategy definition, image and multimedia support, EHR integration and a viewer, but most importantly a platform definition, which can be provided by a VNA. The VNA needs to be the main enterprise image repository, the gateway to viewers and the EMR, taking in information encoded as DICOM as well as other formats, following the XDS (cross-enterprise document sharing) repository requirements. A true VNA needs to be able to provide that functionality.

2.    Facilitate cross-enterprise archiving: The VNA should be the gateway to the outside world for any imaging and image-related documents. Examples of image-related documents are obviously the imaging reports, but also measurements (Structured Reports) and other supporting documentation, which can be scanned documents or native digital formats. It also needs to be the gateway for external CD import and export, for portals, and for cloud sharing and archiving solutions.

3.    Support non-DICOM objects (JPEG, MPEG, waveforms): Even though DICOM has proven to be an excellent encapsulation of medical images and other objects, such as waveforms, PDFs, documents, etc., there are cases where this is not as easy or possible. A use case is when one needs to archive a native MPEG video file from surgery or another specialty. As long as there is sufficient metadata to manage the object, this should be possible and be provided by the VNA.

4.    Be truly vendor neutral: Even if the VNA is from the same vendor as one or more of your PACS systems, its interface with any PACS system(s) should be open and non-proprietary. This is one of the most important requirements: plugging a PACS from another vendor into your VNA should be very close to “plug-and-play.”

5.    Synchronize data with multiple archives: Lack of synchronization is probably the number one complaint that I hear from early implementers. To be fair to the VNA vendors, in many cases synchronization is lacking on the PACS side. Even if the VNA is able to handle the IOCM (Imaging Object Change Management) messages, which are basically Key Object Selection notes carrying the reason for the changes (rejects, corrections for safety or quality reasons, or worklist selection errors), if the PACS has no IOCM support, you are left with manual corrections at multiple locations. At the very least, there should be some kind of web-based interface that allows a PACS administrator to make the changes. It might be possible to adjust the workflow to minimize corrections; for example, one institution does not send the copy to the VNA until one day after the images are acquired, which means that the majority of the changes have already been applied at the PACS. However, if the VNA is the main gateway for physician access, this is not feasible. Without synchronization, a PACS administrator has to repeat the changes at different locations. (A sketch of the IOCM rejection-note idea follows below.)
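
For illustration, here is a heavily simplified pydicom sketch of the IOCM idea: a Key Object Selection object whose document title carries the rejection reason. This is not a complete KOS IOD (the full content tree, request attributes, file meta information, etc. are omitted), and all UIDs are placeholders; it only shows where the “reason for change” lives:

```python
from pydicom.dataset import Dataset
from pydicom.uid import generate_uid

KOS_SOP_CLASS = "1.2.840.10008.5.1.4.1.1.88.59"  # Key Object Selection Document

ds = Dataset()
ds.SOPClassUID = KOS_SOP_CLASS
ds.SOPInstanceUID = generate_uid()
ds.Modality = "KO"

# The document title carries the IOCM rejection reason (code 113001 from the DCM scheme)
title = Dataset()
title.CodeValue = "113001"
title.CodingSchemeDesignator = "DCM"
title.CodeMeaning = "Rejected for Quality Reasons"
ds.ConceptNameCodeSequence = [title]

# Reference the rejected instance(s) so PACS and VNA know what to hide or delete
ref_sop = Dataset()
ref_sop.ReferencedSOPClassUID = "1.2.840.10008.5.1.4.1.1.1.1"  # e.g. a DX image
ref_sop.ReferencedSOPInstanceUID = "1.2.3.4.5.6.7.8.9"         # placeholder UID
ref_series = Dataset()
ref_series.SeriesInstanceUID = "1.2.3.4.5.6.7.8"               # placeholder UID
ref_series.ReferencedSOPSequence = [ref_sop]
evidence = Dataset()
evidence.StudyInstanceUID = "1.2.3.4.5.6.7"                    # placeholder UID
evidence.ReferencedSeriesSequence = [ref_series]
ds.CurrentRequestedProcedureEvidenceSequence = [evidence]
```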

6.    Provide physician access: A key feature of the VNA is that it provides “patient-centered” image access, i.e. instead of a physician having to log into a radiology, cardiology, surgery or oncology PACS, with different viewers using different log-ins and disparate interfaces, there is now a single point of access. This access point is also used for the EMR plug-in, i.e. the VNA should provide an API that allows a physician to open the images referred to in the EMR with a single click. Note that accessing the data with a different viewer could create some training and support issues, as the features and functions will most likely differ from those of the PACS viewer.

7.    Take care of normalizing/specializing: As soon as images are shared between multiple departments, and even enterprises, the lack of standardization becomes obvious with regard to Series and Study Descriptions, procedure codes/descriptions, and body parts. The differences can be obvious, such as using “skull” vs. “brain” for the same body part, or subtle, such as between “CT Head w/o contrast” and “CT HD without contrast.” Any difference, even a minor one, could cause prior images not to be fetched for comparison. That is where what is sometimes referred to as “tag morphing” comes in: the data is “normalized” according to a common set of descriptions and/or codes before it is archived in the VNA. When a specific PACS expects certain information to be encoded in a specific manner, the data has to be modified again to accommodate its local quirks, which I would call “specialization.” (A sketch of the normalization step follows below.)
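
A minimal sketch of such a normalization (“tag morphing”) step; the mapping table is an invented example, and real products drive this from configurable rule sets rather than hard-coded dictionaries:

```python
from pydicom import dcmread
from pydicom.dataset import Dataset

# Invented example mapping from local values to the normalized vocabulary
NORMALIZE = {
    "CT HD without contrast": "CT Head w/o contrast",
    "SKULL": "HEAD",
}

def normalize(ds: Dataset) -> Dataset:
    """Map local descriptions/body parts to the normalized form before archiving."""
    desc = ds.get("StudyDescription", "")
    ds.StudyDescription = NORMALIZE.get(desc, desc)
    body_part = ds.get("BodyPartExamined", "")
    ds.BodyPartExamined = NORMALIZE.get(body_part, body_part)
    return ds

ds = dcmread("incoming.dcm")  # hypothetical inbound object
ds = normalize(ds)
ds.save_as("normalized.dcm")
```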

8.    Handle multiple identities: Images will be presented to the VNA with local patient identifiers that need to be indexed and cross-referenced. The same applies to studies and orders. Most VNAs can prefix an Accession Number to make it unique in the VNA domain and remove that prefix when sending the information back. This assumes that the Accession Numbers are not already using the maximum allowed 16-byte length; otherwise, it has to be dealt with in the database. (See the sketch below.)
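
A sketch of that prefixing logic, including the length check (Accession Number uses the DICOM SH value representation, limited to 16 characters); the prefix scheme itself is illustrative:

```python
MAX_SH_LENGTH = 16  # DICOM SH (Short String) limit, used by Accession Number

def to_vna(accession: str, site_prefix: str) -> str:
    """Make an accession number unique in the VNA domain by prefixing the site code."""
    candidate = f"{site_prefix}{accession}"
    if len(candidate) > MAX_SH_LENGTH:
        raise ValueError("Prefixed value exceeds 16 chars; use a database lookup table instead")
    return candidate

def from_vna(accession: str, site_prefix: str) -> str:
    """Strip the site prefix when sending the information back to the source PACS."""
    return accession[len(site_prefix):] if accession.startswith(site_prefix) else accession

print(to_vna("A123456", "STL"))   # 'STLA123456'
```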

9.    Be the gateway to the outside world using open standards: Many states, regions, or, if small enough, countries are rolling out a central registry (HIE, or Health Information Exchange) so that an institution can register the presence of images and related information for anyone outside the enterprise who is authorized to access it. Registration and discovery use the IHE-defined XDS profiles, while the PIX/PDQ profiles take care of patient cross-referencing and queries.

10. Meet your specific needs: More than 50 percent of US-based institutions are apparently installing or planning to install a VNA, according to a recent survey. I suspect that the main reason is that many are getting tired of yet another data migration, which is lengthy (months to years) and potentially costly in terms of both money and lost studies. The elimination of future migrations is somewhat of a moot point, as PACS migrations will likely be replaced with VNA migrations, so the issue is shifted rather than eliminated. The real reason for getting a VNA has to be some of the key features listed above. If, on the other hand, you have a relatively small institution, with images created only in radiology and possibly cardiology but not in any other specialty, and there is no immediate need for image exchange, then I would argue that you might be better off staying with your current PACS architecture, as the business case for a VNA is not quite clear yet.

In conclusion, VNAs are here to stay, assuming they have most, if not all, of the features listed above. However, a VNA might not be for you, so you need to make a business case and look at the potential pros and cons. When you are thinking about getting one, talk with your VNA and PACS vendors about the features listed above to make sure you understand the clinical, technical and personnel impact if your vendor does not support one or more of them. By the way, we'll have a VNA seminar coming up, see details here.



Monday, February 27, 2017

HIMSS17: Where the Brightest Minds Inspire Action.

A view of the floor
Here are my top ten hot topics from this year’s HIMSS meeting, which was held Feb. 19-23 in Orlando. Before I comment, one should note that I look at this meeting with an imaging background and an interoperability interest, so I probably missed many things outside my scope of interest. Second, I was a little bit turned off by this year’s slogan, “Where the Brightest Minds Inspire Action.” I have to admit that it did its job, because as soon as I arrived at the airport I took a picture of myself with the poster to post on social media, but after the fact I thought, “what about those even smarter physicians, or, even better, shouldn’t it be all about the patient?” In any case, it was a nice marketing scheme and it definitely worked.
So, what about my top 10 hot topics of this year?
  1. Cognitive computing is the solution to all problems in healthcare (or maybe not?) – Virginia “Ginni” Rometty (IBM CEO) said in her keynote speech, “We’re at a moment where we can change large pieces of healthcare and we are at a point where cognitive computing could usher in a new golden age in healthcare.” Note that cognitive computing is defined as “the simulation of human thought processes in a computer using techniques such as AI, machine learning, and neural networks.” So, is the solution to our healthcare issues to replace physicians with computers? Maybe not quite yet. It can be used to guide precision medicine, such as tailored drug and therapy treatment for cancer, and yes, it can also be used to create better outcomes using data mining to indicate more effective treatments. But I believe we have some major issues to deal with first, such as information about patients sitting in different data silos, still incompatible, hard to extract and exchange, and semantically very differently encoded. And then, assuming we do eventually get all that data at our fingertips, replacing the thought process of an experienced physician is somewhat more complex than winning a chess or Jeopardy game (which is what Watson is known for). So, in my opinion, Watson may not be a solution for a while.
  2. Clinical Document Architecture (CDA) is not (quite) dead but “in hospice.” – This is a quote from Keith Boone, an interoperability expert and guru who wrote a blog about this particular subject two years ago. CDA is the document standard that was defined as part of HL7 version 3 and was supposed to become the norm for exchanging documents out of, and between, EMRs. For example, a physician can access a CDA-encoded discharge record for a patient and import it into his or her own EHR by requesting the CDA from the hospital. There are several different templates defined as part of the CDA definition, such as care record summaries, clinical notes, and several others. The expectation is that CDA is going to be replaced by Fast Healthcare Interoperability Resources (FHIR) (see below) in the next 2 or 3 years. However, I saw quite a few working demonstrations of CDA exchanges, and talking with HIE executives, it appears that for several applications this is the most common interface. And as we know, in healthcare there is big resistance to changing something that works (witness the death of HL7 version 3). So, even though FHIR will start to take off in a few years, there remains plenty of opportunity to fix the issues with CDA, which mostly concern its semantic interoperability, and to keep it for certain use cases. So, in my opinion, CDA is not going to die soon and will, for certain use cases, continue to be a good way to exchange information between EMRs.
    Use case demo with FHIR
  3. FHIR year three is still in its teens. – If you have teenagers you know what this means: it is unpredictable, and you are often holding your breath, hoping that common sense will prevail (eventually). Despite (or maybe thanks to) the strong support from the ONC, many people don’t realize that FHIR is still a draft; it has many resources still to be defined, it has many options because of its loose definition, and there is also the issue of having different releases out there. So, it will take a few years to get there, and at that time it will be used next to conventional HL7 version 2, CDA, and the occasional version 3 message to exchange information.
    One of the several presentations on FHIR at the HL7 booth
  4. Imaging is still an IT stepchild. – As with any stepchild, imaging does not get the attention it deserves and is underappreciated and underserved at this meeting. The HIMSS program committee does not seem to realize that the CIOs who visit HIMSS will never set foot in any of the imaging trade shows, so in order to bridge the gap between IT and imaging, it is essential that there be education about, and exposure to, the complexities of storing, archiving, managing and exchanging patient images. It is not the vendors’ fault; if you wanted to learn about the new enterprise imaging solutions using VNA technology, all of the major (and minor) players were present. But the lack of educational sessions on this topic was discouraging.
  5. The HIMSS Interoperability Showcase is growing up. – As of the second day, there were more visitors at the HIMSS showcase booth than last year (more than 7,400). The showcase demonstrated true interoperability by having multiple vendors in the same booth walking through real-life scenarios. One might wonder why there is so much more interest. Here are some of the questions I got when working at the IHE information booth this year:
    One of the many use-cases shown
    - From a large insurance company – Why do I have to develop custom interfaces for each provider’s EMR to get data so we can use it to optimize our billing, reimbursement, actuarial predictions, etc.?
    - From a device vendor – How can I upload my data into the EMR using standards?
    - From an HIE provider – Why are there so few people supporting XDS for cross-enterprise data exchange?
    - From a PACS vendor – We provide XDS capability, but why don’t we get any “takers” in the US?
    - From hospital IT people – When can I get what I see here in my hospital?
    So, in my opinion, the increased interest in these exhibits is due to the fact that people are starting to realize that there indeed is another way of integrating systems, i.e. by using industry standards such as the IHE profiles, but that there is still a big difference between what was shown (the “show and tell,” or what we call in our own jargon the “dog and pony show”) and what is available in the real world.
  6. The IHE Connectathon is maturing. – The IHE Connectathon took place in Cleveland just a few weeks prior to the HIMSS17 meeting. It is somewhat of a “testing ground” to prepare for the IHE showcase event, but it also stands on its own merits, as it is a great opportunity for the vendor community to test their applications among themselves. As the Connectathon attendees can attest, it was obvious that, in contrast to the IHE showcase attendance, Connectathon attendance has been declining over the past few years; the number of Health IT systems tested dropped between 2015 and 2017, as did the number of organizations testing. IHE USA is still investigating the trends and the reasons for the drop, but in my opinion, it might just be time to pause and make sure that there is time to implement all of these profiles from the different domains. It is good for standards to set the path ahead, but the industry needs time to implement them, and users need time to demand, test and deploy them and make the appropriate workflow and other changes that come with implementing new technologies.
    Connectathon in Cleveland
  7. Enterprise-wide image exchange is still a challenge. – The joint SIIM/HIMSS workgroup on enterprise imaging gave their report at the meeting, detailing the results, which include several white papers that are available for free on their website and cover all aspects, including governance, image ownership, encounter-based imaging issues and viewing. The problem is not only how to manage and exchange these images, but also how to acquire them in a consistent manner, especially from the non-DICOM “ologies.” Taking a picture on a smartphone to be uploaded into the patient’s EMR is not trivial, as it requires consistent and unique metadata to be generated, which sometimes can be derived from order information, but often there is no order available. A couple of follow-up working groups have been established that basically try to take these white papers a step further and educate users about the issues and resources. In addition, there is a working group that is evaluating the Digital Imaging Adoption Model (DIAM), as defined and used by the European HIMSS division, to see how it can be made applicable to the US.
  8. It is very hard (if not impossible) to get physicians to give up their pagers. – There are still about 1.5 million pagers(!) in use in the US, almost exclusively by healthcare practitioners. There are several reasons for this, most of them related to habit, as there is no reason that, with the appropriate secure messaging software, one couldn’t use a smartphone instead. A physician using a smartphone can link directly to the EMR, look up online resources as needed, and even pull up an image, none of which is possible with a pager. Dr. Sean Spina from Island Health in Canada did an experiment with his pharmacy staff and found that when using smartphones, the average response time for messages was reduced from five and a half to three minutes, and the time for high-priority calls was reduced from 19 to 5 minutes. However, it really requires top-down enforcement: one vendor commented that if they sold, let’s say, 1,000 licenses for their messaging application, after one year they found that at most 200 people would be using it; the remainder would still be hooked on their pagers.
  9. Patient engagement through messaging is critical – Another important type of secure messaging that is evolving, and is also critical for outcomes, is messaging with patients. A high percentage of patients do not take critical medications, and simple follow-up texts have been proven to make a major difference, even potentially impacting hospital readmission rates.
  10. The last observation is that gadgets and happy hour rule at these types of events. – If, walking down the aisles, you wondered why there was more traffic and people seemed to be more vocal late in the afternoon, it would have been the fact that the “bar is open” after 4 pm. Not only could you purchase beer, but you could get free Belgian beer at certain European vendors and, of course, Bud Light at the US vendors, and here and there you could get a good glass of wine. And of course gadgets still rule: get those stuffed animals, pens, USB sticks, shopping bags and much more in return for vendors scanning your badge so they can follow up with junk mail.
    My favorite give-away: Vespa scooters
This was another good trade show; it had record attendance because the “who’s who” in healthcare IT was there, and there were some pretty good talks, even though it was often hard to spot them in the myriad of presentations. As one of my vendor colleagues commented, “your customers expect you to be at HIMSS, whether you like it or not.” One common complaint was that the trade show floor was very elongated; to get from one side to the other was more than half a mile, and it took me at least 10 minutes walking at a brisk pace to get from one place to the next. But I heard few complaints about the content. HIMSS18 will be in Vegas again, a favorite location for many, including myself. Looking forward to it!