Software, like electronics, is full of hype and trends. Remember when 3D TVs were going to take over, or when everyone had a Walkman?
Software has its own jargon that comes up periodically and fades away. UML and aspect-oriented programming are just two trends that once seemed to be the panacea for all software problems. Teams adopted them in droves, and over time they were absorbed into existing software practices without ever causing a significant paradigm shift.
So how is AI any different from these trends?
The basic fact that separates AI from these trends and buzzwords is the kind of problems it can solve. Speech recognition, traffic prediction, image recognition and autonomous driving are just a few examples of problems that can be solved with AI and machine learning. These kinds of problems cannot be addressed with traditional programming techniques.
Traditional programs were mostly logical workflows of decisions: relatively simplistic applications that applied logic to business data. Granted, there were always statistical analysis programs that applied more advanced mathematical concepts, but for the most part the problems that could be solved amounted to glorified calculators or jazzed-up Excel sheets.
The fundamental difference between AI and traditional programming is that in traditional programming the programmers have to code in the logic. They have to instruct the program on what to do, and to do that they need to know the business problem and how it can be solved. This created a need for a team of business experts who know how the business problem should be solved. The development team would then interact with the business team, take those instructions and code them. If all went well, the end result was a reasonable approximation of the ideal solution.
There are multiple challenges to this. The business experts need a thorough understanding of both the problem and the solution. Communication between the business and technical teams needs to be efficient enough that the solution translates well into the technical landscape, and the development team needs to be skilled enough to code whatever is described to them.
In the case of a simple problem like routing a purchase order for approval, this is not such a big deal. The rules are simple enough to be enumerated by the business experts and translated into code by the development team. For example: if the purchase order amount is less than 50,000, send it to the procurement head; a purchase order between 50,000 and 500,000 needs to be approved by the VP of procurement; and so on up the food chain as the amount of the PO increases. There could be other rules, such as always sending the purchase order to the VP when it is for items on a restricted list, which complicate the process, but it is still a well-defined problem with a concrete solution.
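Rules like these translate almost directly into code. Here is a minimal sketch in Python; the thresholds come from the example above, while the restricted-item list, the role names beyond the two mentioned, and the top-level "CFO" escalation are invented for illustration, not a real approval policy.

```python
# Hypothetical routing rules for purchase-order approval.
# Thresholds follow the example in the text; the restricted-item set
# and the "CFO" top tier are assumptions for illustration.

RESTRICTED_ITEMS = {"controlled-chemicals", "export-restricted-hardware"}

def route_purchase_order(amount, items):
    """Return the approver role for a purchase order."""
    # Restricted items always escalate to the VP, regardless of amount.
    if any(item in RESTRICTED_ITEMS for item in items):
        return "VP Procurement"
    if amount < 50_000:
        return "Procurement Head"
    if amount < 500_000:
        return "VP Procurement"
    # Larger orders go further up the food chain.
    return "CFO"

print(route_purchase_order(20_000, ["laptops"]))    # Procurement Head
print(route_purchase_order(120_000, ["servers"]))   # VP Procurement
```

The whole problem fits in a handful of if-statements because the business experts can state every rule explicitly; that is exactly the property the next example lacks.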
Let us take a slightly more involved business problem: a hospital surgery center. Picture a surgery center with twenty operating rooms, performing a few hundred procedures every week. You can easily visualize a department humming with activity: nurses shuttling patients in and out, cleaning crews turning rooms around as fast as they can, busy surgeons working on extremely complicated procedures, and anxious relatives waiting for their loved ones in the waiting room.
A seemingly simple problem like predicting when a room will be available for the next patient now becomes very complicated. There is no simple workflow that can be drawn to decide which patient to prep next or how to communicate delays to nurses anxious to move their patients into the procedure rooms.
The business experts here are the nurses, hospital administration staff and process specialists who have been following this choreography. Predicting when a procedure will end depends on many different factors, and predicting when the next patient can be moved in likewise depends on many criteria: how busy the nurse is, how busy the cleaning crew is, and how backed up the post-anesthesia care unit is at that time. The bottlenecks don't even stop at the operating room, as bed availability in the inpatient section also affects this workflow. This makes it difficult for the business experts to document all the rules involved in these predictions, and even more difficult for the developers to code them.
In these cases AI takes a radically different approach from traditional programming. Machine learning algorithms can take in all the inputs about everything that happened in the OR; there are many signals that can be fed in. Once the algorithms digest this data, they can be trained to find patterns from which case lengths and start/end times can be predicted. During these training iterations the AI routines use deep learning to pick up patterns that are not obvious to anyone who is not looking at the complete set of inputs. If the inputs span events that occurred over multiple years, the AI algorithms have an advantage over traditional analysis.
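The key idea is that no one writes the prediction rules; the model derives them from historical cases. As a minimal sketch of that learning loop, here is a tiny linear model fitted by gradient descent on invented data. The feature names (procedure complexity, surgeon pace, patient risk) and all the numbers are assumptions for illustration; a real OR-scheduling system would use far richer models and thousands of real cases.

```python
# Hypothetical historical cases: a few numeric signals per case and the
# actual case length in minutes. All values are invented for illustration.
# Features: [procedure complexity, surgeon's average pace, patient risk score]
cases = [
    ([1.0, 0.9, 0.2], 55.0),
    ([2.0, 1.1, 0.5], 95.0),
    ([3.0, 1.0, 0.8], 140.0),
    ([1.5, 1.2, 0.3], 70.0),
    ([2.5, 0.8, 0.6], 120.0),
]

weights = [0.0, 0.0, 0.0]
bias = 0.0
lr = 0.01  # learning rate

def predict(x):
    """Predicted case length for a feature vector x."""
    return sum(w * xi for w, xi in zip(weights, x)) + bias

# Training loop: repeatedly nudge the weights to shrink the squared
# prediction error on each historical case. The "rules" relating the
# signals to case length are learned, not hand-coded.
for _ in range(5000):
    for x, y in cases:
        err = predict(x) - y
        for i in range(len(weights)):
            weights[i] -= lr * err * x[i]
        bias -= lr * err

# Estimate the length of a new, unseen case.
print(round(predict([2.0, 1.0, 0.4])), "minutes")
```

The point of the sketch is the division of labor: the historical data carries the expertise, and the training loop extracts it, which is exactly what the hand-written rules of the purchase-order example could not do at this level of complexity.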
So AI moves away from the traditional format of solving a problem by first relying on business experts to identify the solution and then on technical experts to translate it into programming logic.
AI and machine learning come with their own challenges. The skill sets required for these solutions are very different from regular programming skill sets. These algorithms also need a lot of data to generate good predictions, which in turn requires a lot of computing power and can be expensive to set up. There are good options for running these workloads in the cloud that can ease some of that burden.
In spite of these initial barriers to entry, it is this transformative aspect of AI and machine learning that will make it stick, and adoption will increase dramatically over the next decade.