In 1995, when I received the textbook for my major subject in Information Systems, “Expert Systems and Applied Artificial Intelligence” by Efraim Turban, I felt like this was the stuff movies were made of.
The application of artificial intelligence
This subdivision of computer science -- devoted to creating computer software and hardware that attempt to produce results like those produced by people, letting us automate and enhance complex tasks once done manually -- seemed like an area best left to the gurus researching their days away in MIT labs. But now, thanks to the rise of big data, the case for applied artificial intelligence is upon us.
Artificial Intelligence is closer than we think.
To help join the dots, consider two defining aspects of Artificial Intelligence: first, it involves studying the thought processes of humans; second, it deals with representing those processes in machines (computers, robots, etc.).
Regarding the first point, the fantastic thing about the human brain is “how a little bit of knowledge goes a long way” in allowing a person to decide what is truly important. Intuition and gut feel may never find their way into machine learning capabilities, but honestly, how many decisions do we make in the business world that don't first require a fair amount of studying and analysing of factual data?
Secondly, the way we as humans thin-slice through all that data is not much different from the Map/Reduce frameworks inherent in Big Data architectures. The principle is the same: consolidate disparate data, big or small, and distil it to a simple truth, so that a decision -- whether operational or strategic in nature -- can be made.
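To make the analogy concrete, here is a minimal map/reduce sketch in Python. Everything in it -- the branch names, the figures, the idea of summing sales per branch -- is invented purely for illustration; the point is only the shape of the computation: map raw records to key/value pairs, group them, then reduce each group to a single figure a decision can rest on.

    # Minimal map/reduce sketch: distil many raw records into one figure per key.
    from collections import defaultdict
    from functools import reduce

    sales = [  # stand-in records; real inputs could come from any number of sources
        {"branch": "Cape Town", "amount": 1200.0},
        {"branch": "Durban", "amount": 800.0},
        {"branch": "Cape Town", "amount": 450.0},
    ]

    # Map: emit a (key, value) pair from each record.
    mapped = [(record["branch"], record["amount"]) for record in sales]

    # Shuffle: group the values by key.
    grouped = defaultdict(list)
    for branch, amount in mapped:
        grouped[branch].append(amount)

    # Reduce: fold each group down to a simple truth -- total sales per branch.
    totals = {branch: reduce(lambda a, b: a + b, amounts)
              for branch, amounts in grouped.items()}
    print(totals)  # {'Cape Town': 1650.0, 'Durban': 800.0}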
Watson, I presume
The exciting part is that this is not just talk anymore. Consider IBM’s best-known cognitive system to date, Watson. Watson understands natural language; it generates hypotheses and recognizes that different outcomes carry different probabilities; and it learns, tracking feedback -- success and failure alike -- to improve future responses.
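That generate-score-learn loop can be caricatured in a few lines. The sketch below is purely illustrative of the general pattern -- the candidate answers, evidence scores, and weight-update rule are all invented here, and it is in no way a description of Watson's actual pipeline.

    # Illustrative only: rank candidate hypotheses by weighted evidence,
    # then adjust the weights based on feedback about the chosen answer.
    evidence_weights = {"keyword_match": 0.5, "source_reliability": 0.5}

    def rank(candidates):
        # Combine each candidate's evidence scores into a single confidence value.
        scored = [(sum(evidence_weights[k] * v for k, v in c["evidence"].items()), c["answer"])
                  for c in candidates]
        return sorted(scored, reverse=True)

    def learn(evidence, was_correct, rate=0.1):
        # Reinforce (or penalise) the evidence types that backed the chosen answer.
        for k, v in evidence.items():
            evidence_weights[k] += rate * v * (1.0 if was_correct else -1.0)

    candidates = [
        {"answer": "Toronto", "evidence": {"keyword_match": 0.9, "source_reliability": 0.7}},
        {"answer": "Chicago", "evidence": {"keyword_match": 0.6, "source_reliability": 0.9}},
    ]
    confidence, best = rank(candidates)[0]                # "Toronto" wins with confidence 0.80
    learn(candidates[0]["evidence"], was_correct=False)   # feedback: the right answer was "Chicago"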
In February 2011, Watson appeared on Jeopardy!, the quiz show known for its complex, tricky questions and very smart champions. The clues given to contestants require analysis and understanding of subtle meaning, irony, riddles, and other language complexities at which humans excel and computers traditionally do not. Watson took the first prize of $1 million.
Watson had access to 200 million pages of structured and unstructured content, consuming four terabytes of disk storage and including the full text of Wikipedia. In August 2011, Watson entered that favorite big data arena, healthcare, and in March 2012 it expanded into financial services.
Beating the Turing test
Alan Turing designed a test to determine whether a computer exhibits intelligent behavior. According to this test, a computer could be considered smart only when the human interviewer, conversing with both an unseen human being and an unseen computer, could not determine which was which.
Replace the interviewer with a high-powered CEO who picks up the telephone and calls the ABI (Artificial Business Intelligence) department to ask for the top 10 performing branches for the last three months -- and after a few minutes, he gets a report emailed to him. The scenario is not as far-fetched as it was for me in 1995.
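The request itself is routine analytics; the novelty lies in who (or what) fulfils it. Purely as a hypothetical sketch -- the field names, branches, and figures below are invented -- the query behind that emailed report might look something like this:

    # Hypothetical sketch: top 10 performing branches over (roughly) the last three months.
    from collections import defaultdict
    from datetime import date, timedelta

    transactions = [  # stand-in for whatever the real sales feed looks like
        {"branch": "Sandton", "date": date(2013, 1, 15), "revenue": 52000.0},
        {"branch": "Rosebank", "date": date(2012, 12, 3), "revenue": 61000.0},
        {"branch": "Sandton", "date": date(2012, 7, 1), "revenue": 99000.0},  # too old, filtered out
    ]

    cutoff = date(2013, 2, 1) - timedelta(days=90)
    totals = defaultdict(float)
    for t in transactions:
        if t["date"] >= cutoff:
            totals[t["branch"]] += t["revenue"]

    top10 = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:10]
    for position, (branch, revenue) in enumerate(top10, start=1):
        print(f"{position}. {branch}: {revenue:,.0f}")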
Today, with tools such as Tableau (which can suggest the best visualization for a selected set of data), combined with the underlying computing power described above, even the “sexy” data scientists may one day find themselves battling the likes of Watson in the recruitment process.
Realistically, though, it's good to end with an excerpt from my textbook:
AI is an excellent technology. Look for ways to use it, but do not expect miracles. On rare occasions, AI may give a miracle-like solution, but more likely it will not. It will deliver evolutionary rather than revolutionary improvements.
Reader comments

Re: 20 years ago is history
These early teens of the 21st century are shaping up to be pretty fertile ground for AI development. Those older theories are arriving at a time when big data technologies are becoming more accessible... we should be approaching a sweet spot for development.
Re: 20 years ago is history
It's funny how text that was written 20 years ago could still be relevant, even accurate in describing current technologies today -- it might be old, but there can still be something learned from it.
Blade Runner one day?
As Officer Deckard witnessed at the beginning of the dystopian classic film Blade Runner, administering the Turing test can have dire consequences. Until then, tread carefully with our present-day replicants like Watson!
I'm not sure what the concern is? I suppose it depends which book one references. Humanity surely does put huge emphasis on books thousands of years old. I think it's amazing how 20 years ago applied artificial intelligence ideas (not talking about science fiction) were floating around, and now they are being implemented in real-world business applications.
I suppose a good parallel could be that of a scientist 20 years ago with a theory but no proof. Twenty years later he finds the proof for his theory. Surely there can be nothing unrealistic in looking back to the original theory and seeing how it has manifested itself in the present.
The fact that an application like Watson exists proves that the idea is being used today.
AI is an excellent technology. Look for ways to use it, but do not expect miracles. On rare occasions, AI may give a miracle-like solution, but more likely it will not. It will deliver evolutionary rather than revolutionary improvements.
Re: The Watson Wars
Watson may come out on top or get trounced by some Chinese supercomputer. The latest report I read said the world's fastest computer currently is a US computer that snatched the crown back from a Chinese computer. Watson is going to have rivals.
Re: The Watson Wars
Big Data is the food for AI. We look at AI with the same awe the creators of ENIAC (celebrating an anniversary in February) did when the behemoth produced calculations in five seconds. It could be argued that, absent Big Data, AI would remain a lab queen. Now AI can be part of the business vernacular.
Re: The Watson Wars
Yes. AI and machine learning algorithms have far more potential applications with big data infrastructure. It is exciting to see how they will be applied to real business problems.