Localized Artificial Intelligence As Healthcare Data Harvesters

I’m no longer working because of my cancer. To make a long story short, I miss it. Such is life. How does that song go? Oh yeah, you can’t always get what you want. 🙂 I used to be a software engineer. I used to create entire applications by myself. It was a blast. I did the graphics (GUI), the front-end coding (HTML, JavaScript and so on), and the back-end stuff (PL/SQL, C and C++, Java and JavaBeans, databases, and a ton of scripting languages). You get the idea. I pulled it all together and made fast web-based apps that were a blast to write and even fun to use. So even though I can no longer sit at a desk, I find myself still poking around in stuff like that. 🙂 I can’t help it. You might have noticed that on my FB page. 😀 So, finally getting to the point: I’m wondering if the new trends in artificial intelligence can be used locally by cancer centers with minimal investment. I’m wondering if localized artificial intelligence as healthcare data harvesters is a good idea.

Big Data

If you are or ever were involved in the tech industry, then you know that “big data” is a big trend right now. You see, we have too much data to handle. Tons of companies write huge applications to give larger institutions ways to handle the flood. I doubt any are very useful. Trust me on that one. 🙂 Here’s a pretty good definition, from a pretty good article at TechTarget.

Big data is often characterized by 3Vs: the extreme volume of data, the wide variety of data types and the velocity at which the data must be processed. Although big data doesn’t equate to any specific volume of data, the term is often used to describe terabytes, petabytes and even exabytes of data captured over time. Read More

OK, yawn… I know, but just know it’s a lotta crap coming at you fast! 😀 It’s enough to overwhelm the standard systems and people put in place to handle and analyze the data. So big companies jump in with big applications for big money and offer to solve the mess. It never works. Also, many companies write “home grown” applications that may or may not do the job. The problem is that these “home grown” applications are never shared. They stay within the confines of the company that created them and are only good for that company’s particular data sets. For example, a large cancer institution might have data on lung cancer that it wants to analyze. Another cancer center may have the exact same problem. Both might write applications that solve their own version of the problem, but they never share the information with each other, and their applications can’t communicate with each other. In fact, the two teams probably don’t even know the other exists. So, you can see the problem.

Will Artificial Intelligence Help?

Do you remember when Ken Jennings was beaten by IBM’s Watson? That was pretty fun to watch. What about when chess grandmaster Garry Kasparov was beaten by Deep Blue? Boy, was he angry! Again, fun. Until recently, artificial intelligence was used mainly as entertainment and as a way to experiment with new techniques in machine learning. I always loved that stuff. 🙂 For the most part, Kasparov was beaten only because the computer could churn out more possibilities faster than the human mind. The machine was dumb. It was only cranking out scenarios for a win quicker than Kasparov could. It would explore every possible path after each of Kasparov’s moves and then execute the most logical move next. The machine won by “brute force,” not skill. There was no creativity, no intuition, and no reading of the opponent’s “psyche.” You can see the same thing happening now with computerized poker sites. It’s the same sort of paradigm. Technically this is considered AI, but in my opinion it really isn’t.
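If you’ve never seen that kind of search written out, here’s a toy sketch of the brute-force idea in Python. The “game,” the moves, and the scoring below are all made up for illustration; Deep Blue’s real search was vastly more sophisticated, but the paradigm is the same.

```python
# A toy sketch of brute-force game search: try every line of play, score the
# end positions, and pick the move with the best guaranteed outcome.

def minimax(state, depth, maximizing, moves, score):
    """Exhaustively explore every move sequence down to `depth` plies."""
    options = moves(state)
    if depth == 0 or not options:
        return score(state)
    values = [minimax(s, depth - 1, not maximizing, moves, score)
              for s in options]
    return max(values) if maximizing else min(values)

# A tiny made-up "game": a state is just a number, and a move nudges it
# up or down. Higher numbers are better for the maximizing player.
moves = lambda s: [s + 1, s - 1] if abs(s) < 3 else []
score = lambda s: s

best = max(moves(0), key=lambda s: minimax(s, 4, False, moves, score))
print("Brute-force pick:", best)
```

Nothing in there thinks; it just tries everything and keeps score.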

This brute-force paradigm is what most companies have been using to handle their data until now. Call it the meat-grinder effect: data chunks go in and a final product comes out. No intelligence, no creativity… just fast computation and sorting. That type of thing. With the onset of “big data,” the standard brute-force applications got old and tired and could no longer keep up. When you need half a day to run a process, and it ties up the entire database while it runs, it’s time for a new solution. Still, many companies are using these older solutions today.

Baby Watson

…and then we all saw Jeopardy. Ken Jennings was beaten by IBM’s Watson computer. There was a huge difference between the losses of Jennings and Kasparov. Kasparov lost because he could not calculate enough scenarios quickly enough. He lost because he was not a computer. Jennings lost because he was “out-thought.” Jennings lost because he was human. Jennings lost fair and square, but Kasparov… maybe not so much. 🙂

When you watched Watson on Jeopardy, you were watching the public birth announcement of real artificial intelligence. Here’s what set Watson apart from Deep Blue. Deep Blue, as we said, was simply faster than Kasparov, whereas Watson uses logic and “reason” to make decisions. Watson can change its mind with new information. Watson can choose between two logical outcomes that are very close and pick what it deems the best “answer.” Watson can make mistakes, whereas Deep Blue could not; Deep Blue always made the correct choice for the circumstance because its decision paths were limited and not variable. Watson can choose. Now, Watson is not artificial life, which is what many are striving for and which, I believe, will be achieved. After all, we are made in God’s image, and God is a creator. I predict humanity will create artificial life. Scary, right? I think so. Still, Watson is big… and needs very skilled programmers to pull off what is really only one step beyond the previous attempts at handling big data with brute force.

Localized Artificial Intelligence As Healthcare Data Harvesters

So, think of it this way. Think of data as sunshine: little points of light floating around. Now think of little localized software programs, designed to harvest the data, as solar panels. Everybody knows how this works. Typically, you install a solar energy system and sell the extra energy back to the electric company. If we built tiny, localized artificial intelligence programs (the solar energy systems) that harvested data (the sunlight) with certain goals in mind, like “neuroendocrine cancer survival rates,” we could create programs that don’t harvest all data but only data that conforms to the goal. You could set them loose like Internet spiders to harvest the data, report back locally, and eventually report back to a larger program. Here’s a rough idea of what one of those little harvesters might look like.
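This is only a toy sketch in Python, not a real medical crawler; the record fields, keywords, and sample data are all invented for illustration. A real one would read from a center’s actual systems.

```python
# A toy goal-directed "harvester": instead of hoovering up everything, it
# keeps only the records that match its goal. All field names, keywords,
# and sample data below are invented for illustration.

GOAL_KEYWORDS = {"neuroendocrine", "survival"}  # the harvester's "goal"

def harvest(records):
    """Yield only the records relevant to the goal."""
    for rec in records:
        text = rec.get("notes", "").lower()
        if any(kw in text for kw in GOAL_KEYWORDS):
            yield rec

local_records = [
    {"notes": "Neuroendocrine tumor, survival 7 years post-diagnosis", "age": 54},
    {"notes": "Basal cell carcinoma, excised", "age": 61},
]

for hit in harvest(local_records):
    print("Harvested:", hit)  # reported locally first, then upstream
```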

If all of these programs were written to generate reports with a common “syntax,” or protocol, for communicating results, then all of the data could be used by everyone. It could become an “open source” solution within the medical community, all the while upholding privacy regulations because the reports would be anonymized. The harvesters could be locally written, as long as they conformed to the reporting standard (I sketch what such a report might look like below). Then the data could be shared with larger artificial intelligence programs designed to handle “big data.” Hospitals would be freed up to run simple AI programs instead of massive data-crunching programs. Each organization could write its own. The larger “big data” programs could be built by places like Google or Oracle or… well, let’s not say the government… we all know how that would turn out. 😀 Well, now I really wish that I could get access to some data sets and a development environment. Ha ha ha ha!!! I’ll bet you never heard anybody say that before. 😀 NERD ALERT!!!
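Again, just a toy in Python: the schema, field names, and version number are all hypothetical, meant only to show the flavor of a shared, anonymized reporting protocol.

```python
# A hypothetical shared reporting "syntax": however a center's harvester is
# built internally, it emits the same anonymized summary structure, so a
# larger aggregator can combine reports from any center. Only counts and
# aggregates leave the building, never patient-level data.

import json

def make_report(center_id, goal, harvested):
    """Summarize harvested records as anonymized aggregates."""
    ages = [r["age"] for r in harvested]
    return {
        "protocol_version": "1.0",   # hypothetical standard
        "center": center_id,         # identifies the institution, not a patient
        "goal": goal,
        "record_count": len(harvested),
        "mean_age": sum(ages) / len(ages) if ages else None,
    }

report = make_report("center-042", "neuroendocrine cancer survival rates",
                     [{"age": 54}, {"age": 58}])
print(json.dumps(report, indent=2))  # what gets sent upstream
```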

Live big, think big, fight big,

Ed – To find out how to use my images on your blog for free – Click Here
