
Supercomputing solutions


Accelerated computing affords us many luxuries: faster results, the ability to compute on larger data sets, and one that is rarely mentioned, the luxury of exploring multiple methods for solving a particular problem. Luxury? Hold on a minute; exploring various methods for solving a particular challenge is a necessity. If the reader indulges me in a little abstraction, we find that for every problem there is a set of possible solutions. This solution space is often referred to as the event space, and it has some interesting features and topography. I promise that's as far as I'm going with this stuff; you can exhale now. What this means is that for a given problem there is potentially more than one solution, and for the sake of correctness, should we not evaluate either all possible solutions or at least a selection of methods for generating them?



This is where accelerated computing makes strong sense. If an accelerated machine can run a model in half the time a non-accelerated machine can, we have a conundrum. Are we satisfied with the result of one model, or is it more economical, in terms of time, to run two different models in the time it would usually take to run one? In essence, it is a balance between volume and speed: would I rather have two answers to evaluate, or one answer in half the time? Scale the speedup further and we end up with many candidate solutions, along with the need to review them all to find the one best suited to our application.
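The arithmetic behind this tradeoff is simple enough to sketch. The function below is purely illustrative (the name and parameters are mine, not from any benchmark): it counts how many independent model runs fit into a fixed wall-clock budget on a machine with a given speedup.

```python
def models_per_budget(baseline_hours, speedup, budget_hours):
    """How many independent model runs fit in a fixed wall-clock budget
    on a machine that runs each model `speedup` times faster.

    Illustrative only: assumes runs are serial and identically sized.
    """
    per_model_hours = baseline_hours / speedup
    return int(budget_hours // per_model_hours)

# A 2x-accelerated machine turns one 10-hour run into two runs
# in the same 10-hour window; a 4x machine gives four.
two = models_per_budget(10, 2, 10)
four = models_per_budget(10, 4, 10)
```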



One way to look at this is to assume that all models are fundamentally incorrect (after all, they are approximations, even of the simplest systems); running multiple models then lets us examine solution methods that have been optimized differently or computed in different ways. This makes a more holistic view of our proposed solutions possible and gives us confidence in our interpretation. Multiple solutions also mitigate the risk that a single result is faulty because of its input conditions. If, by chance, the input data sends the analysis down a path that produces incorrect results, relying on that single instance can be very detrimental. The ability to run multiple cases concurrently and arrive at an array of solutions reduces this risk: the incorrect result becomes an outlying data point while the majority of solutions fall within the norm. The luxury of multiple solutions, of course, also creates the necessity of reviewing all of those solution sets and selecting the one that works best for our application.
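The idea that a bad run shows up as an outlier against the ensemble can be sketched with a simple robust statistic. This is a minimal illustration with made-up numbers, assuming each run reduces to a single scalar result; a real analysis would compare full solution fields.

```python
import statistics

def flag_outliers(results, k=3.0):
    """Flag runs that deviate strongly from the ensemble consensus.

    A result is flagged if it lies more than k median-absolute-deviations
    from the ensemble median. Illustrative sketch, not a production check.
    """
    med = statistics.median(results)
    mad = statistics.median(abs(r - med) for r in results)
    if mad == 0:
        # All runs agree exactly; nothing stands out.
        return [False] * len(results)
    return [abs(r - med) / mad > k for r in results]

# Five runs of the "same" analysis with perturbed inputs; one run went
# down a bad path and produced a wildly different answer.
runs = [10.2, 9.8, 10.1, 47.5, 10.0]
flags = flag_outliers(runs)  # only the fourth run is flagged
```

With a single run there is nothing to compare against; the ensemble is what makes the faulty case visible.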



The notion of gaining confidence from multiple corroborating sources is often referred to as evidence-based support, and it is commonly seen in decision support applications. In the decision support context, a question is posed, evidence is collected, and from that evidence support for the conclusion is either present or it is not. That support is then evaluated for validity. Multiple solutions also pose a new problem: the user must determine which solution is best for the application. This can quickly become difficult to decipher, particularly as the tolerances on the variables and boundary conditions established for the analysis grow.



In many cases, operating in the decision support context provides more freedom for interpretation than a single model does. In fields based primarily on expert interpretation, the decision support model has considerable traction. One such field is the practice of medicine, in particular anatomic pathology. Pathologists are highly trained specialist physicians (their specialty training adds a minimum of five years after medical school) who diagnose disease based on laboratory analysis. One of the most common pathological techniques is histological examination of surgical specimens. If you ever have surgery where something is removed, that bit of tissue is sent "to the lab" for analysis. As part of that examination, the lab cuts thin sections of the surgical sample, mounts them on microscope slides, and stains them with chemicals that highlight different structures in different colors. The pathologist examines the slides to make a determination as to the nature of the pathology of the surgical sample, then writes up a report and sends it to the referring physician, frequently the surgeon. Now, the pathologist is an expert diagnostician and is often the final word on whether that tumor removed from your lung is cancerous or not. If it is cancerous, that sets in motion a series of life-changing events; if not, life continues as usual. Given that the pathologist is often the first to detect or confirm cancer, would you want them to make that decision based on one data point? Not me! I'd want them to run the full battery of tests and assays and paint the most comprehensive picture possible to support their diagnosis.



Engineering and other consumers of high performance computing are becoming more like pathologists. A practical engineering example where multiple solutions are reviewed is the dynamic analysis of a car crash. Engineers build a model of the car in question and test the crash in simulation. The slightest change in input can produce significantly different results. If the tests were conducted with only a single, precise frontal contact and the passengers came through unharmed, it would be irresponsible for the engineer to conclude that the car is safe on impact based on that single solution. Shifting the impact location by just a few inches up or down can cause drastic changes downstream in the analysis. Supporting information is becoming more and more critical; exploring the solution space with multiple models to support a particular conclusion will therefore become ever more necessary. Accelerated computing solves this economically, in that it allows a larger area of the solution space to be explored in any given unit of time. So I guess multiple solutions aren't a luxury after all; they are a necessity.
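As a closing illustration, the impact-location sweep described above can be sketched as a concurrent parameter study. Everything here is hypothetical: `crash_sim` stands in for a real solver, and its response curve is invented, not crash physics.

```python
from concurrent.futures import ThreadPoolExecutor

def crash_sim(offset_inches):
    """Stand-in for a real crash solver. Returns a fictional peak cabin
    deceleration (g) that grows as the impact moves away from the
    designed crumple zone at offset 0. Invented numbers, not physics."""
    return 22.0 + 4.0 * abs(offset_inches)

def sweep(offsets):
    """Run each impact scenario concurrently, as an accelerated
    cluster would, and map offset -> result for review."""
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(crash_sim, offsets))
    return dict(zip(offsets, results))

scenarios = [-4, -2, 0, 2, 4]        # impact offsets in inches
results = sweep(scenarios)
worst_case = max(results, key=results.get)
```

The point is not the concurrency machinery but the review step at the end: the engineer examines the whole array of results, not a single run.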