Tuesday, March 20, 2018

DICOM Experts, Where Art Thou?

This past month alone, I received three inquiries from high-tech imaging companies looking for seasoned DICOM professionals; two positions are on the east coast (Boston), two in rural Arkansas, and if you like skiing and hiking, there is a vacancy in Boulder, Colorado.

One of these positions does not even require US residency, as they are willing to sponsor a work visa for qualified applicants. The reason these inquiries came to me is that literally thousands of students have gone through the OTech DICOM training over the past 25 years, and therefore, I have a large base of “alumni” among my Facebook and LinkedIn friends.

This raises the question: what is an expert anyway? My first source is always the (un)-official source of truth, i.e. Wikipedia:
Historically, an expert was referred to as a sage (Sophos), a profound thinker distinguished for wisdom and sound judgment. Informally, an expert is someone widely recognized as a reliable source of technique or skill whose faculty for judging or deciding rightly, justly, or wisely is accorded authority and status by peers or the public in a specific well-distinguished domain.
The next question would then be, how to define a DICOM “expert?” To define his or her skills, I like to refer to the official DICOM certification for professionals, which is managed and administered by PARCA. The requirements for this certification include knowing:
1.     Negotiation – How DICOM connections (Associations) are negotiated and established, i.e. the handshake and agreement on the type of images to be exchanged and how they are encoded, such as compression. Note that “images” means any DICOM file, including dose reports, measurements, presentation states containing overlays, etc.
2.     Messages and data elements – How DICOM metadata (literally “data about data”), aka the DICOM header that is part of the DICOM file, is encoded and can be interpreted.
3.     Storage and image management – How the DICOM protocol services allow a modality to query a worklist, images to be exchanged, an archive to commit to permanent storage of those images, and study status and changes to the procedure to be communicated using “Modality Performed Procedure Step.”
4.     Print, Query/Retrieve and compression – There are still a lot of DICOM printers, especially in emerging and developing countries, communicating with the DICOM print protocol, while Query/Retrieve is the interface to a PACS database/archive. Compression negotiation specifies which compression schemes are supported, such as JPEG, JPEG2000, MPEG, and others.
5.     DICOM media – Reliable CD interchange is still a major headache and pain point for many institutions; if only everyone would follow the DICOM standard closely, it would be much easier. One should be familiar with how images are stored on a CD, i.e. as so-called “part-10” files, and how the DICOMDIR, or directory, is structured.
6.     Image quality and Structured Reports – DICOM defines a so-called pixel pipeline, which specifies all the steps the pixel data goes through prior to being displayed, such as different greyscale/color schemes, annotations, Look-Up Tables, etc. Displaying the images on a monitor that is calibrated using the DICOM-defined standard greyscale and color mapping is critical to ensuring that every discrete pixel value is mapped into a distinguishable greyscale or color value. Structured Reports are used for measurements, CAD marks, dose information, key images and other information related to image metrics.
7.     VRs and conformance – A VR or Value Representation defines the data type, i.e. the maximum length and encoding, of a DICOM data element. Knowing where and how to evaluate these allows for spotting errors, the most frequent being exceeding the maximum length, invalid codes in the fields, invalid characters, etc. Conformance is critical as the conformance statements allow checking whether two DICOM devices can communicate.
8.     Networking – This includes addressing, i.e. use of IP address, port number, and AE-Title, using tools such as DICOM network sniffers as well as interpreting the communication logs and dumps.
9.     Troubleshooting – To troubleshoot DICOM connections, one would use simulators and test tools. The most basic tool is DICOM Verification; one should also be able to use multiple test images, such as those for testing the imaging pipeline, and to change negotiation parameters.
10.  New DICOM extensions – There are several DICOM extensions, such as the specification “for processing,” aka raw data, which is typically used to perform CAD; the definition of the new multi-frame enhanced CT, MR and other image types; the Unified Worklist; and the new pathology image definition. Last but not least is DICOMweb, which uses RESTful services, is mostly used for mobile access and through web browsers, and is the counterpart of the HL7 FHIR services.
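To make points 2 and 7 a bit more concrete, here is a minimal Python sketch of how a single DICOM data element is laid out in the Explicit VR Little Endian transfer syntax. It only handles the short-form VRs with a 2-byte length field; a real parser would also cover the long-form VRs (OB, OW, SQ, etc.) and Implicit VR, so treat this purely as an illustration of the encoding.

```python
import struct

def parse_explicit_vr_element(buf, offset=0):
    """Parse one data element encoded in Explicit VR Little Endian:
    2-byte group, 2-byte element, 2-char VR, 2-byte length, then the value.
    (Short-form VRs only; long-form VRs use a 4-byte length field.)"""
    group, elem = struct.unpack_from("<HH", buf, offset)
    vr = buf[offset + 4:offset + 6].decode("ascii")
    (length,) = struct.unpack_from("<H", buf, offset + 6)
    value = buf[offset + 8:offset + 8 + length]
    return (group, elem), vr, value

# (0010,0010) Patient's Name, VR=PN, length 6, value "DOE^JO"
raw = struct.pack("<HH", 0x0010, 0x0010) + b"PN" + struct.pack("<H", 6) + b"DOE^JO"
tag, vr, value = parse_explicit_vr_element(raw)
```

Being able to read a hex dump of a DICOM stream at this level is exactly the skill that lets you spot the invalid lengths and characters mentioned under point 7.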
As you can see, there is quite a bit involved with being a “DICOM expert.” If you feel like honing your skills, you might want to check out available textbooks, training or pursue certification. If you feel you would qualify for one of the “expert positions,” feel free to forward your resume and I’ll be happy to share it with those inquiring about hiring.
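As a small taste of the pixel pipeline mentioned under point 6: the familiar “window/level” operation is the linear VOI LUT step, which DICOM defines precisely. Below is a rough Python sketch of that mapping to an 8-bit display range, simplified from the function in the standard, so treat the details as illustrative rather than a conformant implementation.

```python
def window_level(stored_value, center, width):
    """Linear VOI LUT ('window/level') step of the pixel pipeline, mapping a
    stored pixel value to an 8-bit display value (0-255). Values below the
    window clip to black, values above it clip to white, and everything in
    between is mapped linearly."""
    lower = center - 0.5 - (width - 1) / 2
    upper = center - 0.5 + (width - 1) / 2
    if stored_value <= lower:
        return 0
    if stored_value > upper:
        return 255
    return round(((stored_value - (center - 0.5)) / (width - 1) + 0.5) * 255)
```

For example, with a soft-tissue window (center 40, width 400), a stored value of 40 lands mid-grey, while -1000 (air) and +3000 (dense bone) clip to black and white respectively.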

Monday, March 12, 2018

HIMSS 2018: Wake-up call for the sleeping giants.

As I browsed through the vendor exhibits among the more than 45,000 healthcare IT professionals gathered in Las Vegas last week for HIMSS 2018, I noticed that the big IT giants Amazon, Google, and Microsoft (Apple was noticeably absent from the exhibit floor), as well as other businesses in the CRM space (Salesforce), are finally taking notice of the opportunities in healthcare. It was also not a coincidence that Eric Schmidt, former chairman of Alphabet, Google’s parent company, was the keynote speaker for the conference. I believe that this is very promising, as healthcare in many ways is very much behind other industries and can learn from their experiences.

As anecdotal evidence of the need for better technology in healthcare, I listened to a presentation by a vascular surgeon who explained how he annotates relevant images on a PACS viewing station, then takes a picture of the screen with his iPhone and shares it via a chat app with his residents and surgical team to prep for surgery. The reason he has to use his phone is that we don’t yet have the “connectors” that tie these phones, tablets, and other smart devices to our big, semi-closed healthcare imaging and IT systems. The good news is that Apple just announced an interface allowing information exchange, which can be used, among other things, for patients to access their medical information from a hospital EMR. Also, Google Cloud announced an open API.
Here are my top observations from HIMSS2018:

[Image: Demonstration of new Apple app accessing health records]
·       Patients are taking control of their medical information: Apple announced a FHIR based interface on the iPhone that provides access to personal health records. The interface is built into the recent Apple phone as part of its health app. Information such as recorded allergies, medications, lab results, etc. is copied to the person’s phone. Note that this is different from solutions where this information is stored in the cloud (e.g. Google, etc.).
Regardless, it allows patients to access and keep their own information. It provides a mechanism for patients to share the information, as the hospitals are struggling to meet that demand (only one out of three hospitals can share information according to a recent AHA study, despite the fact that more than 90% of them use electronic health records).
In reverse, it is not that hard to upload this information back into an EMR of a physician or a specialist, together with information collected from blue-tooth enabled blood pressure, pacemaker, insulin pump, and other intelligent healthcare devices as well as wearables. At the IHE interoperability showcase demonstration areas, there were several demonstrations of how this upload can be achieved using standard interface protocols, often using FHIR.

·       FHIR is gaining more traction: The new HL7 protocol allowing easy access, especially by mobile devices, to so-called resources such as lab results, reports, and also patient information is getting more traction. However, there is still a big disparity between what is shown as “work in progress,” such as the demonstrations at the IHE interoperability showcase, and what is actually deployed. Almost every use case that was demonstrated at the showcase had one or more FHIR elements, such as those used for patient information access, uploading images or labs, accessing registries, etc. However, when I asked vendors on the exhibit floor where they had deployed the FHIR interface, many of them told me that yes, they have their FHIR interface available, but are still waiting for the first customer to actually use it.
There are a couple of exceptions; for example, the Mayo Clinic is using FHIR to access diagnostic reports, utilizing the EPIC FHIR interface, but such deployments are still very few. One of the major obstacles with FHIR implementations is that it has taken a long time (5 years to date) to get to a standard that has at least some normative parts in it, which will be release 4, to be balloted soon. This means that any implementation you do right now is subject to changes, as upgrades are not backwards compatible. As an example, the Apple FHIR interface is based on release 2. So, I am officially upgrading my FHIR implementation status from “very limited” to “spotty,” but I believe that there is definitely a lot of potential.

[Image: Demonstration of VA to DOD gateway based on FHIR technology]
·       The VA is making major strides in healthcare interoperability: I feel compelled to call out the US Department of Veterans Affairs, as there is a push to shift some of their care to the private sector, while the fact of the matter is that research shows the VA scores higher than the industry on many quality scores, even though, yes, there is still a lot of disparity between the different VA facilities. The high quality of care is due in no small part to the early implementation of electronic medical records and the ability to be paperless. But their current medical record system is becoming out-of-date, hence the intention to replace it with a new EMR at a cost of about $10 billion over the course of the next 10 years. Nevertheless, in many ways, their current system still outshines what can be achieved today by commercial vendors.
As a case in point, there is a connection between the VA EMR and the one from the DOD, based on FHIR, that allows for a smooth transition of veteran data between these two entities. What is significant is that of the many resources that FHIR has defined (more than 100 up to now, with plans for about 150), the VA is able to exchange all of the information needed with only a very few FHIR resources, notably Patient, Imaging Study, Questionnaire, Observation, Clinical Impression, Diagnostic Report, Encounter, Condition, Composition, Allergy and Medications. This means that implementing a relatively limited subset can still be very effective. Hopefully their replacement EMR (Cerner?) will have the same kind of interoperability, which seems to be a point of contention right now in the contract negotiations for the replacement.

·       The big EMR companies are doomed (or are they?): This millennium has shown a major shift in healthcare IT: over the past ten years, the number of hospitals in the US with an electronic record has gone from 10% to more than 90%.
However, these monolithic, semi-closed systems, which accumulate all the patient information in big databases that are hard to access, offer limited tools for dashboarding and quality metrics, and often come with a hefty fee for yet another interface to get information in or out, might be on their way out unless they change their architecture and focus. For what it’s worth, even the White House is taking notice, as Jared Kushner mentioned during the conference that “Trump has a new plan for interoperability.”
Let’s look at an analogy of how other industries solve the information access problem, for example, a website for a hotel. If you would like to find directions to the hotel, you click on a link to Google Maps; if you want to know what the local sightseeing tours are, you click on “TripIt”; for reviews you click on “TripAdvisor”; and so on.
Now let’s go back to our ideal EMR user screen. Wouldn’t it be nice if you could get the patient information from a “source of truth,” i.e. a web-accessible source for patient information; the latest lab results from the lab, either internal and/or external; the past 6 months’ progress on a weight loss program from the patient’s Fitbit in the cloud; diagnostic reports from the radiology reporting system; and so on? And by the way, arranging transportation for the patient is just another click on the Uber or Lyft app (note the announcement from Allscripts to embed a Lyft interface in their EMR).
The EMR would be a mash-up of multiple resources accessible through standard protocols (FHIR), in some cases guaranteed immutable, using blockchain technology, and the only functionality left would be a temporary cache and workflow engine that guides health care practitioners through their job in a very easy to use manner.
User friendliness, especially, still leaves a lot to be desired: a recent study showed that during an average patient visit, providers spent 18.6 minutes entering or reviewing EHR data on digital devices, and only 16.5 minutes in face-to-face time with patients. We’ll see what happens over the next 5 years and who will win and who will lose, but it appears that FHIR might facilitate a disruptive development.

[Image: Standing room only for blockchain]
·       Blockchain has some (limited) applications in healthcare: I purposely did not mention blockchain in the title of this write-up so as not to overload my ISP, as I found it to be the most hyped (according to the dictionary: “extravagant or intensive publicity or promotion”) subject of the conference. Presentations on this subject went beyond standing room only.
What is blockchain? It is an immutable, decentralized public ledger that could be used to securely share transactions without a central authority. Knowing that most of the patient’s health information is not intended to be public, and that some of the files (think a 1.5GB digital pathology slide) are just too big to simply move around and copy multiple times, it makes the application for blockchain very limited in scope. The immutable aspect is also hard to accomplish, even for objects or entities that you might think are immutable such as a patient/person.
Imagine that you would store patient information in a blockchain (e.g. a URL and a “fingerprint” or “signature” of the data); can you really guarantee that there would be no changes? Some of the content might need to be updated, such as a “disease status” in case someone dies or a name change when a woman marries, and it is not uncommon anymore for a patient to change sex.
Apart from the “content,” the structure might change as well, due to database changes such as allowing storage of multiple middle names, aliases, etc. Some of these solutions such as providing a unique, immutable person identification, will be resolved by other industries anyway as financial institutions have a lot of interest in making sure that they provide credit to “real persons” and identify if a financial transaction is requested by the actual person instead of a hacker or intruder.
There are, however, a few blockchain candidates for healthcare. One example was shown at the recent RSNA show, dealing with certification and accreditation of physicians, which should be public and from a reliable source. Another example deals with consents, so that a healthcare provider can trust that patient information can be shared with, for example, a parent or caretaker, and knows which parts of the record can be shared and which cannot (e.g. limit access to mental illness records or the fact that a 16-year-old daughter uses contraceptives). So, in conclusion, yes, there are some limited applications for blockchain technology; many of them we can “borrow” from other industries, and some of them we can implement for medical purposes, but in practice they will be few.
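The “fingerprint” idea above is easy to illustrate: what gets anchored in a blockchain is typically a hash of the record, and any later edit, however legitimate (a name change, an updated disease status), invalidates it. A minimal sketch, with made-up record contents:

```python
import hashlib
import json

def fingerprint(record):
    """Deterministically hash a record, the way a 'fingerprint' would be
    anchored in a blockchain: serialize with sorted keys so the same data
    always produces the same hash, then take a SHA-256 digest."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

before = {"name": "Jane Doe", "status": "alive"}
after = {"name": "Jane Smith", "status": "alive"}   # marriage: name change
```

The same record always yields the same hash, but the renamed record no longer matches, so the original on-chain fingerprint can no longer vouch for it; this is exactly why mutable data such as patient demographics fits the immutable-ledger model poorly.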

·       Healthcare is learning from CRM companies: According to one of the major CRM companies, Salesforce, Customer Relationship Management (CRM) is a technology for managing all your company’s relationships and interactions with customers and potential customers. Replace the word “customers” with “patients” and you have a perfect system that allows a healthcare institution to manage their patients in a better manner. That is why not only Salesforce but other companies (I saw a demo at Microsoft) are using the CRM core to provide patient management solutions.

·       Artificial Intelligence is making small progress: It would not be right not to mention AI in this report, as it is in the top ten tweets about the conference. However, machine learning and Artificial Intelligence are still not as easy as one might think. Some researchers estimate the IQ of intelligent machines to be equivalent to that of a 4-year-old right now. But, as of now, machines are unbeatable at chess and Jeopardy, so there are definitely some applications that can benefit from AI. Examples are predicting ER re-admission rates of certain patients and taking action accordingly, assisting a physician to make a better diagnosis, or, even better, ruling out any findings with almost 100% accuracy, which would assist in routine screenings. In addition to the technology having to become more mature, there is also an issue with data access: I talked with one user who is in charge of manually entering textual data from old records in a structured format, and much of the accessible data is not very structured. There is a lot of emphasis on AI, so much so that some companies are re-branding their whole healthcare business around it (think IBM: Watson Health), which seems like overkill to me. But AI will silently enter into many applications where it can impact workflow, and enhance diagnosis and clinical outcomes.

[Image: Yes, I want the Vespa]
·       HIMSS is still an IT tradeshow: Imagine walking around the RSNA (radiology conference) and being asked if you want to enter a $200 drawing, participate in a magician’s performance, or enter a drawing for a motorcycle. It would be unthinkable, but it is still common at HIMSS, which indicates that it is geared towards a different audience than clinicians. In contrast with last time, however, I did not see any showgirls on the floor this year for photo-ops, so the only decision I had to make was whether to enter the motorcycle or the scooter drawing. Having driven a Vespa myself when I was young, it was not a hard choice for me.

In conclusion, this was another great event, with some hype as usual, but I found the promise of “outsiders” getting involved in the business of healthcare especially encouraging. A “fresh look” from these companies, using some of the practices that make our lives easier when we are not sick, could definitely make our lives easier and improve patient care when we are sick. There is no reason that financial transactions can move freely between banks, so that I can go to an ATM any place in the world and access my account, while my physician has trouble getting timely lab results, medications, allergies and other pertinent information. I can’t wait for the sleeping giants to not only wake up but get actively involved and make an impact.

Herman Oosterwijk is a healthcare imaging and IT trainer/consultant. If you would like to learn more about new standards, in particular FHIR, check out the upcoming web training and in-depth face-to-face training.

Thursday, February 8, 2018

FHIR myths and truths, and its relationship to HL7 V2

If you had attended the HL7 Working Group meeting in New Orleans this February, you would have had a hard time finding anyone who is still involved with HL7 version 2.x development. Yes, HL7 version 2.8 has been issued and there is work going on to specify 2.9, but given the fact that most of the US deployments (est. 80 percent) are still on version 2.3 or 2.3.1, which date from the last century, one might wonder how useful it is to keep on developing these new versions.

The question you might ask is “will version 2 be replaced by FHIR (Fast Healthcare Interoperability Resources)?” My answer to that question is “possibly, but only to a certain degree.” In order to have an opinion about this, you have to understand what FHIR is, its benefits, and what it could do for an implementation. Here are my observations about common myths and truths about FHIR:

1.      FHIR is just another message encoding – This is incorrect; it is much more than that. Being just another encoding was the problem with HL7 version 3, which does not provide any more benefit than being another way of encoding messages. Hence, v3 failed miserably, because there was no business case to support upgrading all those existing interfaces. A v3 message, which is encoded in XML, is more verbose than the compact, pragmatic “pipe”-encoded v2 messages, and early implementations choked the interface engines and IT infrastructure.
Also, XML is not really “mobile-friendly,” as it requires overhead for decoding. Instead, most FHIR implementations use JSON (JavaScript Object Notation) encoding for the messages, which is more compact, efficient and easy to interpret. However, this is not the most important benefit. FHIR messaging is more “normalized” than a “composite” v2 message and uses so-called resources to encode the information. As an example, think about mapping the v2 registration message PID segment into a FHIR Patient resource, the PV1 segment into Organization and Practitioner resources, and the OBX segment into an Observation resource, each being web-accessible. The messaging paradigm is one of the four FHIR paradigms and an important part of the standard, but not the only one, and not the most important part.
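The PID-to-Patient mapping can be sketched in a few lines of Python. Note that the field positions and the minimal output are simplifications for illustration: a real mapping needs the v2 and FHIR specifications, and things like component/repetition handling and date reformatting are omitted here.

```python
def pid_to_fhir_patient(pid_segment):
    """Map a (simplified) pipe-delimited HL7 v2 PID segment to a minimal
    FHIR Patient resource. PID-3 = patient ID, PID-5 = name (family^given),
    PID-7 = birth date; many details are deliberately left out."""
    fields = pid_segment.split("|")
    family, _, given = fields[5].partition("^")
    return {
        "resourceType": "Patient",
        "identifier": [{"value": fields[3]}],
        "name": [{"family": family, "given": [given]}],
        "birthDate": fields[7],   # FHIR actually wants YYYY-MM-DD format
    }

patient = pid_to_fhir_patient("PID|1||12345||DOE^JOHN||19650312|M")
```

The point is the normalization: the composite v2 segment becomes a self-contained, web-addressable resource that any application can reference instead of carrying its own copy.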

2.      FHIR uses resources that are shared – This is correct and, in my opinion, the most powerful capability of FHIR. To explain this, let’s look at the current duplication of data in a typical scenario. A patient is registered when admitted to the ER and the patient information is stored in the registration system. Clinical information is entered into the EMR together with patient information. An order is placed for a lab and radiology procedure; results are stored locally, again with corresponding patient information. The radiology department produces images and a report; the images are labeled with the patient information again. This scenario shows the duplication of data just for patient information (four times in this simple use case).
What happens if someone discovers that the patient information was incorrectly entered, or there were previous records available under a different name (e.g. a maiden name), or additional demographic information becomes available, etc.? HL7 version 2 defines many transactions for updates, moves, and merges to synchronize these records. But what if there were only one single source of patient information, which could be accessed by any application through a simple web API? It would greatly simplify the development and management of patient demographics. The same applies to many other resources, such as practitioner information, medications, scheduling, appointments, etc. This is the beauty of FHIR: using easily accessible resources for important medical information.

3.      There could be hundreds of shared FHIR resources – Not quite; a FHIR resource is a small, logically related set of parameters of interest to healthcare, having a known identity and location so it can be accessed, such as a “patient,” “organization,” “practitioner,” “consent,” “schedule,” and about 150 more that are defined.
Access is provided using the second of the four FHIR paradigms, i.e. it is based on REST (REpresentational State Transfer). REST is a client-server protocol that provides access to a resource to create, retrieve, update or delete information. As an analogy, let’s look at how resources are used outside healthcare. Think about a website for a hotel, which contains access to a map resource (Google Maps), customer ratings (TripAdvisor), activities (TheCrazyTourist), and secure checkout (PayPal).
Similarly, one could create an EMR webpage that has links to the patient resource, linked to its scheduled appointments and list of practitioners, and showing medications, recorded observations and diagnostic reports. The EMR provider can merely use these resources to create a “mash-up,” which is very lightweight as it relies on these resources instead of having to manage all of this data itself. This has been demonstrated at various “FHIR hackathon” events, showing that one can build a simple client using standard resources in literally a couple of hours.
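The create/retrieve/update/delete interactions map directly onto HTTP verbs and simple URLs, which is a big part of why those hackathon clients come together so quickly. A sketch (the server address and resource id are made up for illustration):

```python
BASE = "https://fhir.example.org"   # hypothetical FHIR server base URL

def fhir_request(interaction, resource_type, resource_id=None):
    """Build the (HTTP method, URL) pair for the four basic FHIR RESTful
    interactions on a resource; an actual client would then issue the
    request with any HTTP library."""
    verbs = {"create": "POST", "read": "GET", "update": "PUT", "delete": "DELETE"}
    url = f"{BASE}/{resource_type}"
    if resource_id is not None:
        url = f"{url}/{resource_id}"
    return verbs[interaction], url
```

For example, `fhir_request("read", "Patient", "123")` yields a GET on `https://fhir.example.org/Patient/123`, and creating a new Patient is simply a POST to the resource-type URL.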

4.      FHIR is ready to be implemented – Here is where one could have a difference of opinion. The issue is that the FHIR standard is still very much a living standard based on drafts, called DSTU or “Draft Standard for Trial Use,” which means that subsequent changes are not necessarily backwards compatible. Knowing that, implementing the latest version, draft release 4, means that implementations based on versions 1, 2 and 3 won’t be compatible. This has become one of the major pain points and causes of interoperability problems among FHIR implementations.
The good news is that parts of the standard have passed the balloting process and are reliable and stable enough. For example, 12 of the defined resources, including “patient,” are now called “normative,” but one should remember that 12 resources represent less than 10 percent of the list of all defined resources. So if one would, for example, implement access to a radiology report using the “diagnostic report” resource, it would have to be based on a draft standard.
Even though there are a couple of early FHIR implementations, there are not a lot of production systems that even partially use FHIR. The first draft was published in 2012, and most of the standard is still in draft format as of today. This is one of the most common complaints: why is it still mostly in draft and why does it take so long? The short answer is that standardization just takes time, but it definitely impacts early implementations.

5.      FHIR is an all-encompassing standard – Incorrect; FHIR covers only the 80 percent most common use cases. The remaining 20 percent are covered by so-called “extensions.” The good news is that the conformance and related extensions are well documented, retrievable and interchangeable. The bad news is that extensions are semi-proprietary and require code changes in case they are important enough to be interpreted by a client. There are two different types of extensions: the “normal” extensions, which can be ignored by a recipient, and the “modifier” extensions, which cannot be ignored as they change the meaning and/or context of the information to be exchanged. The 80 percent rule implies that the FHIR standard is going to be relatively compact, unlike HL7 v2, which has lots of options, but that is not necessarily better from an interoperability perspective. However, if one considers how poorly v2 interoperates with its many options (patient registration, aka ADT, alone has 60 event types), the relatively “bare bones” FHIR might, in the long term, be better, especially with well-defined conformance and extension definitions. Time will tell.
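The normal-versus-modifier distinction translates into a simple receiver-side rule: unknown “extension” entries may be skipped, but a resource carrying an unrecognized “modifierExtension” must not be processed as if it were fully understood. A sketch of that check (the extension URL is fabricated for illustration):

```python
def can_safely_process(resource, understood_urls):
    """Return False if the resource carries a modifierExtension whose url
    the receiver does not recognize; plain extensions may simply be
    ignored, so they are not checked here."""
    for ext in resource.get("modifierExtension", []):
        if ext.get("url") not in understood_urls:
            return False
    return True

report = {
    "resourceType": "DiagnosticReport",
    # made-up extension url marking the report as retracted
    "modifierExtension": [{"url": "http://example.org/fhir/retracted"}],
}
```

A client that does not know what “retracted” means must refuse to treat this report as an ordinary one, which is exactly the code change the author warns about.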

6.      FHIR is a document standard – Yes, this is correct, unlike v2, which is primarily a messaging standard. The third paradigm of FHIR is the capability to exchange persistent objects such as documents. By linking and exchanging multiple resources in a so-called “bundle,” you can create a document. For example, a patient resource combined with resources for his/her allergies, a list of medications, observations and reports could be exchanged between a hospital and a referring physician to document a hospital visit. These documents can be signed and authenticated. A receiver can either store the document, e.g. in an EMR, or use the information available in the document to update its internal database or record, for example, with details from a lab report.
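Structurally, such a document is just a Bundle of type “document” whose first entry is a Composition that ties the other resources together. A minimal sketch (resource contents abbreviated; signing and the full Composition structure are omitted):

```python
def make_document_bundle(composition, *resources):
    """Assemble a FHIR document: a Bundle of type 'document' whose first
    entry must be the Composition, followed by the resources it refers to
    (patient, allergies, medications, reports, ...)."""
    return {
        "resourceType": "Bundle",
        "type": "document",
        "entry": [{"resource": composition}] + [{"resource": r} for r in resources],
    }

doc = make_document_bundle(
    {"resourceType": "Composition", "title": "Hospital visit summary"},
    {"resourceType": "Patient"},
    {"resourceType": "AllergyIntolerance"},
)
```

Because the document is just a bundle of the same resources used everywhere else in FHIR, a receiver can either store it whole or pick individual entries out of it to update its own records.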
FHIR can also exchange existing CDA (Clinical Document Architecture) documents. CDA, and its US version called Consolidated CDA or CCDA, is the most used and pretty much the only surviving part of the HL7 v3 standard; it is used to exchange information between EMRs from different institutions and physicians. These CDAs are being created and exchanged routinely by several Health Information Exchanges (HIEs); therefore, we can use the FHIR messaging standard to exchange them as well.

7.      The FHIR interface provides for all use cases – Incorrect; the FHIR REST interface is somewhat limited, which is where we use the FHIR Services, the fourth FHIR paradigm, based on SOA (Service-Oriented Architecture). Services provide higher-level functionality that can be accessed using a standard Application Programming Interface or API. These services maintain responsibility for certain related information, for example terminology management, which keeps medical terminology up-to-date, or identity management, which is responsible for patient demographics and for updating and cross-referencing different identities among different domains and institutions.

8.      FHIR implementation is optional – This is correct, but that might change, at least for the US. To comply with what used to be called “Meaningful Use” and is now called MACRA, healthcare providers need to use an EMR to get full Medicare/Medicaid reimbursement payments. To comply, the Office of the National Coordinator for Health Information Technology (ONC) requires an EMR to be certified using a so-called Open API, which is basically the REST interface; consequently, FHIR is strongly recommended.

9.      FHIR is a US standard – Incorrect; as a matter of fact, two of the main FHIR standard architects are based in Australia and the Netherlands, and initial implementations seem to be more prevalent outside the US than inside. This is because centralized national healthcare systems can sometimes make changes happen more easily than in the US. For example, there is a project up and running in Canada for prescription management and one in the Netherlands for medications.

10.   FHIR will replace all existing interfaces – Incorrect; HL7 version 2 is widespread and entrenched in many systems. If there is no incentive to modify or change, v2 will continue to be used and form the backbone of healthcare delivery communication. There are going to be v2 interfaces for a long time, and there will also be interfaces using v3 CDA document exchanges. But for new applications, such as a physician directory resource in the state of Texas that will be used through a public HIE, a web portal for patients to access diagnostic reports, or a scheduling or prescription renewal application accessed from a smart phone, FHIR will become the standard of choice.

There is no question that there is a lot of hype around FHIR, but the fact is that implementations are (still) very limited. Part of it has to do with the required learning curve, especially if you are an HL7 v2 developer; part with the immaturity of the FHIR standard, which is still very much in draft status, meaning that changes and updates are inevitable; and part with the lack of a business case for converting existing applications. But for new developments, FHIR deserves, at a minimum, a fresh look and an evaluation of whether one could use this new paradigm for implementation. Just don’t go overboard and implement something just because it is new and “hot,” but do your due diligence before making this choice.

From a support and clinical perspective, it is likely that you’ll see some initial limited niche applications, which could very well be expanded as experience and support grow in day-to-day software applications such as EMRs, HIEs, mobile health apps, and others. Note that more information can be found at the FHIR website (www.hl7.org/fhir) or, if you are interested in learning more, you can sign up for the OTech FHIR core webcast training.

Wednesday, December 6, 2017

My 2017 RSNA top ten.

The atmosphere was very positive in Chicago among the 50,000 or so visitors during the 2017 RSNA radiology tradeshow. As one of the vendors mentioned: “Everyone seems to be upbeat,” which is good news for the industry and end users.
Here are my top ten observations from this year’s meeting at McCormick Place, held during weather so balmy (for Chicago) that I did not need my thick coat this year:

Dr. Al Naqvi, director of SAIMAH
1.      The AI hype is in full swing – Artificial Intelligence, deep learning, or whatever it is called, has gotten the attention of the radiologists, who packed sessions, and of the trade press, as shown by all the front-page coverage and the many vendors touting this technology. It is fueled in part by fear that computers will replace radiologists, which in my opinion won’t happen for many years to come. In the meantime, there will be additional tools that might assist a radiologist by eliminating some of the mundane screening exams which can definitely be labeled as “negatives,” but there is still a lot of work to be done, and many of the so-called AI tools are nothing more than sophisticated CAD (Computer-Aided Diagnosis) tools that have been around for many years.

There is also a new society being established for AI, the Society of Artificial Intelligence in Medicine and Healthcare (SAIMAH). I guess every new technology needs its champion and corresponding non-profit to promote its use and efficacy.

Ultrasound CAD
2.      Speaking of CAD, this has become rather commonplace in the US for digital mammography screening, which, interestingly enough, is not the case in many other countries, especially in Europe, where they do double-reads for mammography. Algorithms are now becoming available for other modalities as well, for example for ABUS (Automated Breast Ultrasound System). One company showed its CAD algorithm, which will become available in the market as soon as it gets FDA approval. I expect that CAD for several other modalities and body parts will follow suit, in addition to the already commercialized CAD for lung nodule detection in chest radiographs and CT, and for breast MRI.
Ultrasound robot

3.      The robots are coming! Not only are radiologists being threatened by AI, technologists might also become obsolete and be replaced by robots. Well, maybe not quite, but there is definitely potential. One of the large robot manufacturers was showing its device performing ultrasounds, which can be used for remote “tele-ultrasound” procedures when no local technologist is available, while delivering repeatable procedures with uniform pressure over the whole sweep of the robotic arm. It will be interesting to see what new applications become possible using these robotic devices.

4.      Virtual currency (Bitcoin) in medicine? Blockchain technology has become the main vehicle propelling the popularity of virtual currency to new heights. The underlying blockchain technology is very useful for managing public records, which need to be secured against unauthorized changes. The records are automatically duplicated on tens of thousands of computers and accessed through an extension of a common browser that connects to the blockchain network, in this case Ethereum.

A demonstration of a possible application that manages the licensing of physicians in the state of Illinois was shown at the show. This technology might have certain niche applications in healthcare IT. However, for managing medical records, which must stay private, it would be a problematic solution (unless the records are encrypted, which defeats the purpose of public access), as it would be for images, which are definitely too large for this application.
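The tamper-evidence property that makes blockchain attractive for public records such as physician licenses can be sketched in a few lines. This is a toy hash chain, not how Ethereum actually stores data; real networks add consensus protocols and replication across thousands of nodes:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministic SHA-256 hash of a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, record: str) -> None:
    # Each new block stores the hash of its predecessor.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "record": record})

def verify(chain: list) -> bool:
    # Any change to an earlier record invalidates every hash that follows.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list = []
append_block(chain, "License issued: Dr. A, Illinois")   # hypothetical records
append_block(chain, "License renewed: Dr. A, 2017")
assert verify(chain)

chain[0]["record"] = "tampered"  # altering history breaks the chain
assert not verify(chain)
```

This also illustrates the privacy problem noted above: the records are readable by anyone who holds a copy of the chain, which is exactly what you want for licenses and exactly what you do not want for medical records.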

VR using wrap-around goggles
5.      VR is getting more traction. I counted three vendors (there could have been more) who were demonstrating 3-D stereoscopic displays, which could be especially useful for surgery applications. It still looks kind of weird to see users with these large wrap-around glasses on while waving a wand into space, and I think it might take a few years for this application to mature beyond its “gadget” state into real practical use. But this is a field where the gaming industry has provided some real spin-offs into practical applications that might potentially benefit patient care.

6.      Cloud-phobia is getting stronger – Moving your images to the cloud for image sharing was one of the previous years’ hot topics, especially as cloud providers (Amazon, Google, Microsoft and others) have been offering very competitive rates for their storage capabilities. Despite the fact that the data are probably safer and more protected at one of these cloud providers than at many corporate healthcare IT facilities, there is still a concern among users about potential HIPAA violations and potential hackers looking for patient demographics, which, if accessed, could be downloaded and resold on the black market. As an alternative, some of the image sharing service companies are starting to provide secure point-to-point transfers managed by their own patient-driven portal. They provide an access key to the patient, who then controls access by physicians, and obviously themselves as well. This looks to be a good alternative if you don’t trust the cloud.

Lightweight probes with processing in a dedicated, customized tablet
7.      Ultrasound units are becoming commoditized. Miniaturization and ever more powerful tablets allow ultrasound units to be carried in your pocket, facilitating easy bedside procedures and giving workers in the field, particularly in remote areas, the capability to do basic ultrasound exams.

For potentially high-risk pregnancies this has become a great asset. These units range from US$10k to $25k depending on the functionality and number of probes you want to have.

Heavy (wireless) probe, standard tablet
There are two different approaches to this technology: the first puts all the processing in the tablet, allowing for very lightweight probes; the second is the opposite, putting the technology and processing in the probe, which can even be wireless and uses standard tablets (iOS or Android). The latter results in probes that produce heat and are rather heavy, especially if they need to contain a battery in the wireless case.

SPECT lounge
8.      Ergonomics is gaining traction. Patients can be intimidated by these large diagnostic radiology machines while having to lie on a table staring at the ceiling when being scanned. Instead, being able to sit in a comfy chair while your scan is taking place could provide a more pleasant experience while allowing for eye contact with the technologist as well. Case in point: a SPECT scanner offered by one of the vendors.

Who is this company again?
9.      Mergers and acquisitions are still the order of the day (or year?). When walking into one of the exhibit halls I was puzzled by a major booth from “Change Healthcare,” a confusing name to me, but one which probably cost its owner a lot of money at a market branding company. I had to ask one of the booth attendants, who explained, “yeah, we used to be McKesson.” Similarly, Acuo went from Perceptive to Hyland, Toshiba is now Canon, Merge disappeared into IBM, and others were swallowed by different vendors, spun off, or re-branded themselves. The good news is that there were probably as many “first-time vendors” as there were “last-time vendors,” showing that there is still room for new start-ups bringing in fresh ideas and innovative products. But it is always with a little bit of nostalgia, having worked for one of these “giants” (notably Kodak) for a few years in my past life, to see these names disappear.

Shopping downtown Chicago
10.   Last but not least, Chicago (still) rocks. After attending 30 RSNAs I stopped counting, but every year it gets better. The food prices and lodging costs are still exorbitant, but instead of the notorious cab drivers who talk non-stop on their cell phones in a language you don’t understand and don’t get out of the car to help you load your luggage, there is now Uber or Lyft to bring you where you want to go for half the price. Even better, there is the opportunity to provide feedback on your driver from your phone after the ride (what a great concept!). And it is always fun to watch the many international participants at the show and try to guess where they are from based on their gestures and clothing (the Italians always stick out!). This was another great year: new hype, good vibes and fun; I am looking forward to next year already!

Monday, October 30, 2017

What is the future of PACS?

I get this question a lot, i.e. where is PACS headed? It comes from different professionals: from decision-makers ready to spend another large sum of money on the next generation of PACS, and from those who made PACS a career, such as PACS administrators who come to my training and want to make sure that their newly acquired skills and/or PACS professional certification will be of use 5 or 10 years from now.

I also get it from users who are often frustrated by limitations and/or issues with their current system, and from start-up companies that are planning to spend a lot of time and energy developing yet another and better PACS in the already crowded marketplace. Where do I think PACS is going, and is there still a future in this product and market? I don’t have a crystal ball, but based on what I have seen in my interactions with PACS professionals, here is my assessment and prediction:

When we talk about a PACS (Picture Archiving and Communication System) in the traditional sense as a “product,” instead of as a “functionality,” then yes, the PACS product is indeed the equivalent of a gas-powered car that needs a dedicated driver at the steering wheel, doomed to disappear in favor of electric, self-driving cars using AI technology developed by Google and others. Every car manufacturer is scrambling to get on the bandwagon and change its product development to meet the new demands, out of fear of going the same route as Kodak. Similarly, if I were a truck driver who operates a vehicle mostly on interstate highways, I would be worried about my long-time career path.

PACS systems viewed as a “function,” however, will still be around, as the need to interpret and manage images and related information will continue. But many of those functions will become more autonomous using AI. The Wall Street Journal recently proclaimed AI to be the latest Holy Grail for the tech industry, and there is definitely going to be a spillover into the field of healthcare imaging and IT.

Self-learning systems, using algorithms like those with which Facebook and Amazon predict which friends you might want to follow or which product you might purchase next, will anticipate your steps and tasks, reduce mouse clicks, and present the information you want to consult in the form and presentation you prefer (think self-learning hanging protocols), allowing you to become more efficient and effective. This will address the number one complaint that users currently voice about their PACS, i.e. that it does not support their preferred workflow well.

PACS will give up its autonomy regarding workflow. In several institutions the workflow is starting to shift from being PACS- or RIS-driven to being EMR-driven. Unlike PACS, the traditional RIS systems are quickly becoming obsolete: order entry is shifting to CPOE functionality in the EMR, and even modality worklists are starting to become available in the EMR. Not every EMR, however, is quite ready to incorporate the entire process; consequently there are many holes that are covered with interface engines, routers, brokers, workflow managers, etc. from several “middleware” vendors who bridge the gaps and integrate these systems smoothly. If I were to invest in healthcare imaging and IT, that is the niche where I would bet my money.

Another major application for AI will be the elimination of the majority of negative findings from screening exams. Early experience has shown that AI can eliminate perfectly “normal” mammography images and reduce the images that need to be reviewed by a person to about 20 or 30 percent of the caseload. Computer-Aided Diagnosis (CAD) will also become a mainstay, not just for the current niches in breast imaging, but in other types of exams as well.

At the periphery, i.e. on the acquisition side, we will also see a shift as new modalities are introduced and/or existing modalities are replaced. Mammography screening exams could be replaced by low-cost MRI combined with ultrasound and potentially thermography imaging. We can already look inside arteries and veins using IV-OCT (Intravascular Optical Coherence Tomography) through a small catheter; who knows what we will be able to visualize next, maybe the brain?

Note that this transition assumes a “deconstructed PACS,” of which the core is stripped down to an image cache of a few months with diagnostic viewing stations tightly coupled to this core, using an enterprise VNA image manager/archive that could be from another vendor and is driven by the EMR, tied together for now by multiple routers and prefetching gateways. Some institutions will opt to archive their images in the cloud, which will become very inexpensive as cloud storage rapidly turns into a commodity with Google, Amazon, Microsoft and others all vying for your business. If nothing else, the cloud will replace the many tape libraries that are still out there. View stations will become super-fast as solid-state memory replaces disk drives, so we will finally be able to improve on today’s requirement of a “3 second maximum image retrieval” at a workstation, which has been the semi-gold standard for the past 25 years.

Unlimited image sharing is going to be common practice; CD image exchange will go the way of floppy disks, or of the large 14-inch optical disks we used to have for image storage. At my last company we used to turn those big optical disk platters into wall clocks; I still have one in my office. I should save a CD as well to hang next to it. Accessing information across organizational boundaries will use web services, much like what you see on an Amazon web page right now, where you can purchase a product from Amazon or from an external vendor that is seamlessly linked.

Compare that with the physician portal: he or she can access the local lab results or jump to an outside lab that provides the results in a nice graph, while image access in the local or remote VNA is also just a click away. And of course, access to many on-line educational resources and good practices are all simple apps on that same desktop, or should I say dashboard, which also displays the current wait time in the ER, the number of unread reports in the queue, and the report turn-around time, in addition to the weather forecast and the radiology Facebook share page.

So, do I think that PACS is dead, as some people are declaring? I don’t think so, especially if you consider PACS as a function. Just as some see radiology as a doomed career with a need for fewer radiologists (think truck drivers?), I believe their functional role will shift to that of a consultant, and the job will be less focused on cranking out reports, many of which are “normals” that will be automated. PACS will continue as an important function in clinical decision-making.

Finally, what about the people who support these sophisticated systems, i.e. the PACS administrators? Their role will shift too: many of the mundane jobs will be automated, and they will be able to focus on re-engineering workflows, planning, and solving tricky integration problems. So, the future of PACS is bright in my opinion, but it will be a different color of bright, and as always with transitions, there will be people and companies that anticipate and embrace these changes, and others that will have blinders on and be left out.