Big Data doesn’t have to be complicated or expensive when you deploy the right integrated Cloud-based solution to tackle it.

There are three key phases of Big Data and Analytics, each requiring very different compute and storage capabilities.  First, there is “capture”, which includes curation and storage of the data.  Simply put, this is the act of listening for, “mining”, or filtering information from one or more sources into a set for future processing.  This phase typically requires large amounts of storage with relatively minor computational capabilities.
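
To make the “capture” idea concrete, here is a minimal sketch in Python, assuming a hypothetical in-memory event stream in place of a real source such as a socket, log tail, or message queue; the field names, filter rule, and output file are purely illustrative and not any particular Nimbix tooling.

```python
import json

# Hypothetical raw event stream; in practice this would be a socket,
# message queue, or log tail rather than an in-memory list.
raw_events = [
    {"source": "sensor-1", "level": "INFO", "value": 0.2},
    {"source": "sensor-2", "level": "ALERT", "value": 9.7},
    {"source": "sensor-1", "level": "ALERT", "value": 8.4},
]

def capture(events, keep_level="ALERT", path="captured.jsonl"):
    """Filter incoming events and persist the interesting ones for later analysis."""
    with open(path, "a") as out:
        for event in events:
            if event["level"] == keep_level:          # simple filter / "mining" rule
                out.write(json.dumps(event) + "\n")   # light on compute, heavy on storage

capture(raw_events)
```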

The second phase is “analysis”, which may also include search and filtering.  When we hear the term Analytics by itself, it typically refers to these activities.  This is where we actually make sense of the data we captured in the first phase.  There are different types of Analytics, and they are driven by the application, or the problem being solved.  Predictive Analytics, for example, can be used to predict how a particular person or machine will infiltrate a secure facility (e.g., a computer network) based on the actions taken so far and how those actions match known patterns of behavior.  Descriptive and Prescriptive Analytics are typically used in business to answer “what happened” and “what should happen next” for past and future events, respectively.  Analytics can be performed after the “capture” phase or, in some cases, during it, helping to reduce the “noise” and focus on relevant information as it arrives.  In either case, these processes require large amounts of compute capacity with fast access to the data.
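
As a toy illustration of the predictive pattern matching described above (not a description of any particular Analytics product), the Python sketch below scores an observed sequence of actions against hypothetical known patterns of behavior; the pattern names and scoring rule are invented for the example.

```python
# Illustrative only: score observed actions against known behavior patterns,
# a toy stand-in for predictive analytics on captured events.
KNOWN_PATTERNS = {
    "port_scan_then_login": ["port_scan", "failed_login", "failed_login", "login"],
    "privilege_escalation": ["login", "new_admin_user", "config_change"],
}

def match_score(observed, pattern):
    """Return the fraction of the pattern seen so far, in order, in the observed actions."""
    matched, idx = 0, 0
    for action in observed:
        if idx < len(pattern) and action == pattern[idx]:
            matched += 1
            idx += 1
    return matched / len(pattern)

observed_actions = ["port_scan", "failed_login", "failed_login"]
for name, pattern in KNOWN_PATTERNS.items():
    print(f"{name}: {match_score(observed_actions, pattern):.0%} of pattern observed")
```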

The final phase is “visualization”, which is where we present the results of the “analysis” in a human-readable form.  This can range from a graphical representation of air flowing around a vehicle at different speeds and directions to a map of reported cases of infectious disease.  In many cases the user interacts with the visualization by rotating, zooming, and so on.  In short, “visualization” is where the human touch graces Big Data.  This phase requires less computational power than “analysis”, but must offer real-time performance and responsiveness for a good user experience.
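
As a minimal sketch of the “visualization” phase, assuming matplotlib is available and using made-up case counts, the Python snippet below turns analysis output into an interactive figure the user can pan and zoom.

```python
import matplotlib.pyplot as plt

# Hypothetical analysis output: reported cases per week for two regions.
weeks = list(range(1, 11))
region_a = [3, 5, 9, 14, 22, 30, 41, 38, 29, 20]
region_b = [1, 2, 2, 4, 7, 12, 18, 25, 27, 24]

fig, ax = plt.subplots()
ax.plot(weeks, region_a, marker="o", label="Region A")
ax.plot(weeks, region_b, marker="s", label="Region B")
ax.set_xlabel("Week")
ax.set_ylabel("Reported cases")
ax.set_title("Reported cases of infectious disease")
ax.legend()

# plt.show() opens an interactive window where the user can pan and zoom --
# the point in the pipeline where the human interacts with the results.
plt.show()
```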

The Nimbix Cloud, powered by JARVICE, tackles Big Data by harmonizing heterogeneous resources and processes into one seamlessly integrated experience.  Rather than a “one size fits all” approach that leverages virtualized instances for every phase, we optimize cost and performance for each capability with the right tools for the job.  Our managed hosting and colocation services for High Performance Computing systems solve “capture”, while our accelerated data processing platform scales seamlessly, on demand, to conquer “analysis”.  When you are ready for “visualization”, we offer the latest GPU-powered applications on a pay-per-use model as well.  If you prefer a hybrid approach, we even automate data movement so you can integrate our processing platform into your workflow at any phase.

At Nimbix we are committed to solving your Big Data and Analytics platform and application challenges so you can focus on what really matters: making sense of it all.