Big Data Problem Solving Capabilities Shift From Analytic To Synthetic
Fremont, CA: The tech market is awash with demand for big data and its allied products. So much hope and expectation is pinned to it that new innovations emerge every day. The growth of big data and the abundance of programmatic interfaces to new fields and industries have altered the way we solve problems.
Zavain Dar, writing for VentureBeat, explains how big data trends can be connected to analytic and synthetic truth. An analytic truth can be derived from a logical argument alone; a synthetic truth, on the other hand, is a statement whose correctness cannot be decided without access to empirical evidence or external data.
Similarly, with the emergence of big data, the world has shifted from creating novel analytic models to building the infrastructure and capabilities to solve problems via synthetic means.
Until now, the emphasis was on either creating new models or using existing models to derive new statements and outcomes. But the development of computer systems and software has driven the transformation toward synthetic innovation.
For instance, to understand the workings of the web, the synthetic approach collects and synthesizes past click streams and links in order to predict future user behavior.
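The idea above can be made concrete with a minimal sketch, assuming a hypothetical set of click streams (the page names and data are illustrative, not from any real system): past transitions between pages are tallied, and the most frequent follower of a page becomes the prediction for what a user will click next.

```python
from collections import Counter, defaultdict

# Hypothetical click streams: each list is one user's page-visit sequence.
click_streams = [
    ["home", "search", "product", "cart"],
    ["home", "search", "product", "product"],
    ["home", "deals", "product", "cart"],
]

# "Synthesize" past behavior: count which page tends to follow each page.
transitions = defaultdict(Counter)
for stream in click_streams:
    for current_page, next_page in zip(stream, stream[1:]):
        transitions[current_page][next_page] += 1

def predict_next(page):
    """Predict the most likely next page from observed transitions."""
    followers = transitions.get(page)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("search"))   # "product" — it followed "search" in both observed cases
print(predict_next("product"))  # "cart" — the most common follower of "product"
```

No axioms about user intent are posited here; the prediction is derived entirely from accumulated external data, which is the essence of the synthetic approach the article describes.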
But two infrastructural steps must occur before synthetic methodologies can be applied to new fields. First, the underlying data must be in digital form; second, the stack from the data to the scientist and back to the data must be automated.
Several instances of digitizing existing data can be cited: Estimote, an Innovation Endeavors company, is collecting physical-world data that, applied to commercial purposes, can help brick-and-mortar retailers; SportVU cameras are being adopted to track the location of players and the ball by the microsecond; and Bitcoin represents an entirely digitized financial system.
Emphasis is placed not only on collecting new data but also on storing it and automating its actionability. But big data makes more sense when it is seen as a byproduct, or as a means, of solving a problem. View "big data" not in terms of data size or database type, but rather as a necessary infrastructural evolution as the trend shifts from analytic to synthetic problem solving.
Essentially, a shift in how problems are approached is taking place. Freed from the intellectual and philosophical yoke of positing structures and axioms, one no longer relies on step-function-driven analytic insights. Rather, what is happening is a widespread infrastructural buildout that accelerates the adoption of synthetic problem solving.
Conventionally, these techniques were confined to sub-domains of computer science, but as new data sets are digitized and the necessary automation is built on top of them, synthetic applications can be employed in entirely new fields.