Thursday, July 9, 2020

Is Your Data Smart Enough?

The state of data affairs over the last ten years or so has revolved around big data. Of course, size matters, but big data promises to morph into monster data as more data sources hit the cloud through tributaries like voice, video, IoT, events, and business patterns. So what about all this parked data? Are we going to keep storing it and bragging about how much cloud space it consumes? Are you going to make it cleaner and smarter or just admire it? I would suggest we make data more intelligent and faster rather than just figuring out how to catalog and park it for later use. Making it faster means treating the data as a database of now, not of the future. Making data smarter can be tricky, but it is worth it.

Gleaning Data is Basic Intelligence

Capturing data of different types and classifying it is pretty normal. Deciding how long and where to keep it is essential. Determining whether it is worthy of long-term archiving is doing the data a solid. Knowing some basics about the data source, cost of acquisition, and relative purity is pretty much a given these days. Some data cleansing and organization will help usage down the road.
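As a minimal sketch of what this basic cataloging might look like (the record fields and the retention policy below are hypothetical, not any particular product's API):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DatasetRecord:
    """Basic catalog entry for an incoming dataset (illustrative fields)."""
    name: str
    source: str              # e.g., "iot-sensors", "call-center-voice"
    acquisition_cost: float  # relative cost to obtain this data
    purity: float            # 0.0 (raw/dirty) to 1.0 (cleansed)
    received: date

def retention_tier(record: DatasetRecord) -> str:
    """Decide how long and where to keep the data (toy policy)."""
    if record.purity >= 0.8 and record.acquisition_cost > 100.0:
        return "archive-long-term"   # expensive, clean data is worth keeping
    if record.purity >= 0.5:
        return "warm-storage-1yr"
    return "cold-storage-90d"        # cheap, dirty data ages out quickly

rec = DatasetRecord("pump-telemetry", "iot-sensors", 250.0, 0.9, date.today())
print(retention_tier(rec))           # -> archive-long-term
```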

Giving Data Meaning is Average Intelligence

Knowing the data about the data (AKA metadata) is essential for interpreting it. The simplest form is understanding the data's domain and its relative relationship to other data (logical or physical). Data representation and transformation options are pretty essential when combining it with other data. Knowing the key or identifier of groups of related data is pretty standard. This step is where some of the impurities can be dealt with before heavy use. First use usually revolves around visualization and reporting to find actionable insights. At times, this step turns descriptive data into a prescription.
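To make that concrete, here is a hypothetical sketch of a metadata record capturing a domain, a key, transformation options, and neighborhood relationships (the field names are illustrative assumptions, not a standard):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ColumnMeta:
    """Metadata for one field: its domain and how to transform it."""
    name: str
    domain: str              # e.g., "currency-USD", "ISO-8601 date"
    transform: str = "none"  # e.g., "normalize", "cast-to-utc"

@dataclass
class EntityMeta:
    """Metadata describing a group of related data and its relationships."""
    entity: str                                          # e.g., "product"
    key: str                                             # identifier of the group
    columns: List[ColumnMeta]
    related_to: List[str] = field(default_factory=list)  # logical/physical links

product_meta = EntityMeta(
    entity="product",
    key="sku",
    columns=[ColumnMeta("list_price", "currency-USD", "normalize")],
    related_to=["market", "channel"],  # neighborhood relationships
)
print(product_meta.key, [c.domain for c in product_meta.columns])
```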

Granting Data Representation in Its Context is Very Smart

Most data is gathered and used within one or two base contexts. One is undoubtedly timing/frequency, and the other is the primary home of the data, for instance, the entity family it belongs to, such as product data. Sophisticated context representation will go beyond an original context or source to include others that have a neighborhood relationship with the data grouping/entity. An example would be a product within multiple markets and channels. This level is where statistical and predictive models enable more actions to either react to or intercept the trends indicated in the data. This level turns prescription into prediction, creating and placing data, event, or pattern sentinels on processes or at the edge to look for prediction completion or variants.
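A hedged sketch of such a sentinel, assuming a hypothetical forecast feed and a tolerance chosen for illustration, might look like this:

```python
def prediction_sentinel(stream, predicted, tolerance=0.1):
    """Yield ("complete" | "variant", value) events for each observation.

    stream    -- iterable of observed values (e.g., demand per interval)
    predicted -- forecast values aligned with the stream
    tolerance -- allowed relative deviation before flagging a variant
    """
    for observed, expected in zip(stream, predicted):
        deviation = abs(observed - expected) / max(abs(expected), 1e-9)
        if deviation <= tolerance:
            yield ("complete", observed)  # prediction holding
        else:
            yield ("variant", observed)   # intercept: trend is deviating

forecast = [100.0, 105.0, 110.0]  # hypothetical model output
live = [101.0, 104.0, 140.0]      # fresh observations from a process/edge
for event in prediction_sentinel(live, forecast):
    print(event)  # the last reading fires a ("variant", 140.0) event
```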

Grinding Data to a Fine Edge is Smarter

Here we interrogate data to learn whether important adjustments are needed to the goals, rules, or constraints of operating processes that include humans, software systems, or machines. This level can build a change that works in a supervised or unsupervised change process. This level starts with machine learning and extends to deep learning, which peels back layers and interrogates more data. In extreme cases, the data can be used to support judgment, reason, and creativity. The worm turns from data-driven to goal-driven, established by cognitive collaboration with management principles, guidelines, and guardrails.
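As an illustrative sketch only (the process, thresholds, and guardrail below are invented for the example), a goal-adjustment loop with a supervised or unsupervised gate could look like this:

```python
# A toy goal-adjustment loop: a reading of recent data proposes a change
# to an operating constraint, applied directly (unsupervised) or routed
# through a human approver (supervised). All names are hypothetical.
def propose_adjustment(recent_defect_rates, current_limit):
    """Suggest a new line-speed limit from observed defect rates."""
    avg_defects = sum(recent_defect_rates) / len(recent_defect_rates)
    if avg_defects > 0.05:           # quality slipping: slow the line
        return current_limit * 0.9
    return current_limit * 1.05      # quality holding: try a faster goal

def apply_adjustment(new_limit, supervised=True, approver=None):
    if supervised:
        return approver(new_limit)   # human-in-the-loop gate
    return new_limit                 # unsupervised: act within guardrails

limit = apply_adjustment(
    propose_adjustment([0.02, 0.03, 0.01], current_limit=100.0),
    supervised=True,
    approver=lambda proposed: min(proposed, 110.0),  # guardrail cap
)
print(limit)  # -> 105.0
```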

Grappling with Data in Motion Right Now is Brilliance

The pinnacle of smart data is where fresh incoming data is used to create the “database of now”. At this level, all of the approaches above can be applied in a hybrid/complex fashion on a near-time/real-time basis. This level uses the combined IQ of all the AI and algorithm-driven approaches in a poly-analytical way that leverages that brainpower combined with fast data. Dynamic smart-parts creation feeding a dynamic assembly line would be a non-combat example.
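A minimal sketch of that idea, with stand-in models invented purely for illustration, is scoring each fresh event on arrival rather than parking it for later analysis:

```python
# A toy "database of now": fresh events are folded into a live window and
# scored immediately by a blend of models. The models and event source
# here are hypothetical stand-ins, not a real streaming engine.
import statistics

def descriptive_score(window):
    return statistics.mean(window)  # what is happening right now

def predictive_score(window):
    # naive trend extrapolation over the live window
    return window[-1] + (window[-1] - window[0]) / len(window)

def process_event(event, window, max_window=10):
    """Fold a fresh event into the live window and score it immediately."""
    window.append(event)
    if len(window) > max_window:
        window.pop(0)                # keep only the "now" horizon
    return {
        "now": event,
        "descriptive": descriptive_score(window),
        "predictive": predictive_score(window),
    }

window = []
for event in [10.0, 12.0, 15.0, 11.0]:  # stand-in for a live feed
    print(process_event(event, window))
```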

Net; Net:

Data: Use it or lose it, but let the data lead to the learnings that sense, decide, and suggest responses appropriate to the action windows necessary to meet the timing need. If it is a sub-second-focused problem domain, the patterns in the data and intelligent methods may make the decisions and take action within governance constraints. If not sub-second focused, let smart notifications or options be presented to humans supervising the actions. Don't leave all the precious data parked for the future only.


1 comment:

  1. Building adaptive operational policies (using KEEL Technology) allows systems to interpret complex (dynamic, non-linear, inter-related, multi-dimensional) data sets in real-time and react in real time with what to do, INCLUDING how, how much, where and when to act while considering multiple, sometimes conflicting goals.
