On the 19th of April 2013, the current batch of students of the Master's Programme in Health Informatics at Karolinska Institutet, together with a group of teachers and PhD students, visited the IBM Client Center in Stockholm to learn more about IBM's ongoing health care projects.
Let me highlight some of the most interesting points covered during the presentation…
The word "smart" is used to tag many of the products we see these days, but are these products really smart? Smartness has always been associated with the human brain and its ability to reason, understand and reflect, and in my opinion it is unfair to grant machines this status when they cannot perform these tasks. That is how I have always perceived computer systems, but I think that has changed a bit now.
We probably know that, unlike us human beings, computers are capable of understanding only structured information. According to our lecturers, 90% of the data that exists today was created in the last two years, and 80% of it is unstructured, with healthcare contributing a great deal to this. There are many reasons why healthcare is associated with unstructured data; one of the main ones is that healthcare generates a massive amount of hugely varied data types, collected in very different ways.
To move further with innovation in the healthcare IT sector, there are two possibilities: either find ways to structure the data (which is not at all an easy task) or adapt to the nature of this data, which will probably remain unstructured for a long time!
Watson may not be the only attempt to develop Artificial Intelligence (AI), but it is one of the closest I have seen to mimicking the way we human beings interact with information. We can work with unstructured data, extract the valuable information and process it (though, of course, the degree to which we can do this varies from person to person).
This brings us to the question: how does Watson work?
Again, I am going to refer to an example that our lecturers used to highlight the steps Watson takes to respond to a question asked in natural language.
Let’s imagine we’re going to ask Watson “Where was Einstein born?”
Watson would go through four stages to answer such a question, which can be summarized as:
1. Analyzing: Watson first separates the important parts defining the question, represented mainly by "Where", "Einstein" and "born". Here you can see that Watson can genuinely differentiate and understand grammar, resulting in a more accurate understanding, and consequently analysis, of the given information.
2. Generating hypotheses: Watson then searches (its own database or connected databases) for candidate answers related to the identified keywords, accumulating candidates throughout the search until it is done.
3. Validating and scoring: Watson then searches for evidence supporting the accumulated candidate answers, assigning each a score according to the evidence found in its favour.
4. Ranking and presentation: Finally, Watson presents the answers in descending order of supporting evidence (the best-supported answer first).
This also means that in some cases Watson might not find matching answers, or evidence supporting them, and it can then "admit" that it cannot answer, which I think is another good feature mimicking the human way of thinking.
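The four stages above can be sketched as a toy question-answering pipeline. This is my own illustrative simplification, not IBM's implementation: the tiny "corpus", the keyword extraction and the overlap-based scoring are all invented stand-ins for the real analysis and evidence-gathering machinery, and the threshold shows how a system can "admit" it does not know.

```python
# Toy sketch of a four-stage QA pipeline in the spirit of Watson
# (illustrative only, not IBM's actual implementation).

CORPUS = [
    "Albert Einstein was born in Ulm, Germany, in 1879.",
    "Einstein developed the theory of relativity.",
    "Ulm is a city on the river Danube in Germany.",
]

def analyze(question):
    """Stage 1: crude parse -- pull out the question word and keywords."""
    words = question.rstrip("?").split()
    question_word = words[0].lower()                     # e.g. "where"
    keywords = [w.lower() for w in words[1:] if len(w) > 3]
    return question_word, keywords

def generate_hypotheses(keywords):
    """Stage 2: collect candidate passages mentioning any keyword."""
    return [p for p in CORPUS if any(k in p.lower() for k in keywords)]

def score(candidate, keywords):
    """Stage 3: score a candidate by keyword overlap
    (a stand-in for real evidence gathering)."""
    return sum(k in candidate.lower() for k in keywords)

def answer(question, min_score=2):
    """Stage 4: rank candidates; below the threshold, admit ignorance."""
    _, keywords = analyze(question)
    ranked = sorted(generate_hypotheses(keywords),
                    key=lambda c: score(c, keywords), reverse=True)
    if not ranked or score(ranked[0], keywords) < min_score:
        return "I don't know."
    return ranked[0]

print(answer("Where was Einstein born?"))
# -> Albert Einstein was born in Ulm, Germany, in 1879.
```

A question with no supporting passage, such as "Who painted the Mona Lisa?", falls below the threshold and yields "I don't know.", mirroring Watson's ability to abstain.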
Knowing how it works, we can start to think about how Watson could become a powerful tool in healthcare. Given access to Evidence-Based Medicine (EBM) databases, Watson can perform the steps above to find answers to medical questions.
Moreover, Watson can acquire information either from other systems (e.g. an Electronic Health Record) or from direct input (e.g. from a physician seeing a patient), and based on that it can offer the physician suggestions, about management for example, together with references supporting each suggestion.
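One way to picture that output is as suggestions that each carry their supporting evidence, so the physician can verify them. The data shape below is my own hypothetical illustration, not Watson's actual API; the suggestion texts and scores are placeholders.

```python
# Hypothetical shape of a CDSS response: each suggestion carries the
# references that support it (my own illustration, not Watson's API).

from dataclasses import dataclass, field

@dataclass
class Suggestion:
    text: str                                        # proposed management step
    confidence: float                                # evidence-based score, 0..1
    references: list = field(default_factory=list)   # supporting EBM citations

def present(suggestions):
    """List suggestions best-supported first, each with its references."""
    for s in sorted(suggestions, key=lambda s: s.confidence, reverse=True):
        refs = "; ".join(s.references) or "no references"
        print(f"{s.confidence:.0%}  {s.text}  [{refs}]")

present([
    Suggestion("Order follow-up lab test", 0.85, ["guideline reference A"]),
    Suggestion("Start first-line therapy", 0.60, ["trial reference B"]),
])
```

Ranking by confidence mirrors the evidence-scoring stage described earlier, only applied to management suggestions instead of trivia answers.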
Speaking of direct input, we saw a demonstration of voice-enabled input (using voice recognition techniques), which allows the physician to speak directly to Watson instead of typing the information, with accurate results.
Currently, Watson is being tested at Memorial Sloan-Kettering Cancer Center (MSKCC) with the purpose of "fighting cancer". According to the information we got, Watson is doing well and is getting very positive feedback on its performance from the healthcare staff.
To wrap things up, Watson clearly has great potential, being able to handle unstructured data and deal with natural language. It could be a very powerful tool not only in healthcare but also in many other fields. The biggest challenge facing Watson as a Clinical Decision Support System (CDSS) is the degree of acceptance among end-users, who are often disappointed, tired and in many cases reluctant to include yet another computer-based system in their workflow.
How will physicians react to and interact with Watson? Only time will tell! But speaking as a physician interested in IT, I can say that Watson has great potential.