Thursday, July 9, 2020
Is Your Data Smart Enough?
The state of data affairs over the last ten years or so has
revolved around big data. Of course, size matters, but big data promises to
morph into monster data as more data sources hit the cloud with more tributaries
like voice, video, IoT, events, and business patterns. So what about all this
parked data? Are we going to keep storing it and bragging about how much cloud
space it consumes? Are you going to make
it cleaner and smarter, or just admire it? I would suggest we make data more
intelligent and faster rather than just figuring out how to catalog and park it so we
can use it later. Making it faster means treating the data as a database of now,
not of the future. Making data smarter can be tricky, but it is worth it.
Gleaning Data is Basic Intelligence
Capturing data of different types and classifying it is
pretty normal. Deciding how long, and where, to keep it is essential.
Determining whether it is worthy of long-term archiving is doing data a solid.
Knowing some basics about the data source, the cost of acquisition, and the relative
purity is pretty much a given these days. Some data cleansing and organization
will help usage down the road.
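As a sketch of what that basic gleaning might capture, the following hypothetical Python catalog entry (the names and thresholds are illustrative, not any particular product's schema) records the source, acquisition cost, relative purity, and retention decision for a data set:

from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class CatalogEntry:
    name: str
    source: str
    acquisition_cost: float          # relative cost of acquiring the data
    purity_score: float              # 0.0 (raw) to 1.0 (fully cleansed)
    retention_days: int              # how long we decided to keep it
    captured_on: date = field(default_factory=date.today)

    def archive_worthy(self) -> bool:
        # only clean-enough data earns long-term archiving
        return self.purity_score >= 0.8

    def expires_on(self) -> date:
        return self.captured_on + timedelta(days=self.retention_days)

entry = CatalogEntry("web_clickstream", "cdn_logs",
                     acquisition_cost=0.02, purity_score=0.6, retention_days=365)
print(entry.archive_worthy(), entry.expires_on())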
Giving Data Meaning is Average Intelligence
Knowing the data about the data (AKA meta-data) is essential
for interpreting it. The simplest form is understanding the data’s domain and its
relative relationship to other data (logically or physically). Data
representation and transformation options are pretty essential when combining
with other data. Knowing the key or identifier of groups of related data is
pretty standard. This step is where some of the impurities can be dealt with
before heavy use. First use usually revolves around visualization and reporting
to find actionable insights. At times, this step turns descriptive data into a
prescription.
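One way to picture this kind of meta-data, purely as an illustrative sketch with made-up field names, is a small record per field that names its domain, its entity family, whether it is a key, and how it relates or transforms when combined with other data:

from dataclasses import dataclass, field

@dataclass
class FieldMetadata:
    name: str                        # e.g. "list_price"
    domain: str                      # e.g. "currency_usd"
    entity: str                      # the entity family the field belongs to
    is_key: bool = False             # identifies a group of related data
    related_to: list[str] = field(default_factory=list)   # logical/physical links
    transform: str | None = None     # representation rule when combining sources

product_id = FieldMetadata("product_id", domain="identifier",
                           entity="product", is_key=True)
list_price = FieldMetadata("list_price", domain="currency_usd", entity="product",
                           related_to=["product_id"], transform="round(value, 2)")
print(list_price)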
Granting Data Representation in Its Context is Very Smart
Most data is gathered and used within one or two base
contexts. One is undoubtedly timing/frequency, and the other is the primary
home of the data: the entity family it belongs to, such as product
data. Sophisticated context representation goes beyond an original context
or source to include others that have a neighborhood relationship with the data
grouping/entity. An example would be a product within multiple markets and
channels. This level is where statistical and predictive models enable more
actions to either react to or intercept the trends indicated in the data. This
level turns prescription into prediction, creating and placing data, event, or
pattern sentinels on processes or at the edge to look for prediction completion or
variants.
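A minimal sketch of that neighborhood idea, with invented entity and context names, might link one product to the multiple markets and channels it participates in so models can read it in each context:

from collections import defaultdict

context_graph: dict[str, set[str]] = defaultdict(set)

def link(entity: str, context: str) -> None:
    # record a neighborhood relationship between an entity and a context
    context_graph[entity].add(context)

link("product:widget-42", "market:EMEA")
link("product:widget-42", "market:APAC")
link("product:widget-42", "channel:retail")
link("product:widget-42", "channel:e-commerce")

# A predictive model for EMEA retail demand would pull only the contexts it needs.
contexts = context_graph["product:widget-42"]
print(sorted(c for c in contexts if c.startswith("market:")))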
Grinding Data to a Fine Edge is Smarter
We interrogate data to learn whether important
adjustments are needed to the goals, rules, or constraints of operating processes that
include humans, software systems, or machines. This level can build a change to
work in a supervised or unsupervised change process. This level starts with
machine learning and extends to deep learning, which peels back layers and
interrogates more data. In extreme cases, the data can be used to support
judgment, reason, and creativity. The worm turns from data-driven to
goal-driven, established by cognitive collaborations with management
principles, guidelines, and guardrails.
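As a toy illustration only (not a production machine-learning pipeline, and with invented process numbers), the snippet below interrogates hypothetical cycle-time data and proposes an adjustment to an operating constraint, routing it through a supervised approval step:

from statistics import mean, stdev

cycle_times = [4.1, 4.0, 4.3, 4.2, 6.8, 7.1, 6.9]   # hypothetical process readings
baseline, recent = cycle_times[:4], cycle_times[-3:]

drift = mean(recent) - mean(baseline)
if drift > 2 * stdev(baseline):
    proposal = {"constraint": "max_cycle_time",
                "new_value": round(mean(recent) * 1.1, 1)}
    # supervised change process: route the proposal to a person for approval
    print("Proposed adjustment awaiting approval:", proposal)
else:
    # an unsupervised change process could apply small adjustments automatically
    print("No significant drift detected")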
Grappling with Data in Motion Right Now is Brilliance
The pinnacle of smart data is where data coming in fresh
is used to create the “database of now”. At this level, all of the approaches above
can be applied in a hybrid/complex fashion on a near-real-time or real-time basis. This
level uses the combined IQ of all the AI and algorithm-driven approaches in a
poly-analytical way that leverages that brainpower combined with fast data. A
dynamic smart-parts creation and dynamic assembly line would be a non-combat
example.
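A minimal sketch of the “database of now” idea, with a placeholder scoring function standing in for whatever blend of models a real deployment would combine, scores events as they arrive instead of parking them first:

import time
from typing import Iterator

def event_stream() -> Iterator[dict]:
    # stand-in for live sensor or event data arriving continuously
    for reading in (0.2, 0.4, 0.9, 0.3):
        yield {"sensor": "press-7", "value": reading, "ts": time.time()}

def score(event: dict) -> float:
    # placeholder for the poly-analytical model output
    return event["value"]

for event in event_stream():
    if score(event) > 0.8:
        print("act now:", event)     # sub-second action path with governance limits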
Net; Net:
Data: Use it or lose
it, but let the data lead to the learnings that sense, decide, and suggest
responses appropriate to the action windows necessary to meet the timing need. If
it is a sub-second-focused problem domain, the patterns in the data and
intelligent methods may make the decisions and take action within governance
constraints. If it is not sub-second focused, let smart notifications or options be
presented to the humans supervising the actions. Don't leave all that precious data
parked for the future only.
Wednesday, July 1, 2020
Context: The Connecting Clues for Data
The “database of now” demands a quick understanding of
data, particularly in context. There are many opportunities for understanding or
misunderstanding data in terms of the contexts it participates in, is
connected to, or is connected to at the edge of a data neighborhood. Because each
context has its own unique vocabulary, you can see the opportunity for disconnects
in meaning when the full context of any statement or set of
proven facts is not understood.
If someone says, "I like the blue one", how can
you evaluate what that means? If it is a swimsuit on the beach, it means one
thing; if it's a lobster from the same beach, it means something totally different.
Context is what gives data real meaning. There are three primary forms
of context that help uncover the true meaning of the base data: the
real-world contextual meaning, the business contextual meaning,
and the technical contextual meaning. Obviously, finding meaning
in big or monster data is a challenge, but that difficulty increases as the
speed increases, particularly if the data is hard to manage or access.
Figure 1: Representation of Interconnected Contexts
Real-World Context
Data has meaning in terms of its definitional domain. When you mention "blue", it usually comes from the color domain. However, in the world of mental health, it means a kind of feeling or mood. So understanding the base context in which a data element exists is essential. If blue is associated with a human context, it could be physical and mean a lack of oxygen. It could also mean that the person is adorned in something blue. This is usually cleared up by understanding the base subject or entity that the data element is associated with, by having a precise name, meaning, and basic subject-area association. Underlying meaning can be tricky when just looking at the data value alone. Having proper meta-data and associations is the ideal solution to this problem.
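A hypothetical lookup, shown below only to make the point concrete, resolves the same value, "blue", to very different meanings depending on the subject area the meta-data attaches it to:

# Illustrative subject-area/attribute pairs; any real mapping would be richer.
DOMAIN_MEANINGS = {
    ("apparel", "color"): "the item is colored blue",
    ("mental_health", "mood"): "the person is feeling sad",
    ("vital_signs", "skin_tone"): "possible lack of oxygen",
}

def interpret(value: str, subject_area: str, attribute: str) -> str:
    if value != "blue":
        return f"no special handling for {value!r}"
    return DOMAIN_MEANINGS.get((subject_area, attribute),
                               "meaning unknown without context")

print(interpret("blue", "apparel", "color"))
print(interpret("blue", "vital_signs", "skin_tone"))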
Business Context
The contextual areas that relate to business fall into three
basic categories of meaning. Every data item or group of data items needs to be
viewed in terms of the context it is being viewed in or from. Internal
contexts are those where the vocabulary is understood and defined within a particular
organization; these usually revolve around the organization and skill definitions. There
is also the external context, which represents the outside world irrespective of
the organization itself. The third context is where the outside world touches
the internal world. Listed below are the typical contexts in each of these
categories, followed by a short sketch of how they might be attached to a data item:
Common External Contexts:
Communities, Brand/Reputation, Public, Legal Frameworks/Courts,
Geographical Regions, Countries, Local Culture, Governmental Agencies, Industries,
Dynamic Industry 4.0, Value Chains, Supply Chains, Service Vendors, Markets,
Competitors, Prospects, and Competitors' Customers.
Common Internal Contexts:
Organizational Culture, Goals, Constraints, Boundaries, Actual
Customers, Products, Services, Suppliers, Employees, Contractors, Departments,
Divisions, General Accounts, Contracts, Physical Infrastructure, Technical
Infrastructure, Properties, Investments, Intellectual Capital, Business
Competencies, Knowledge, Skills, Patents, Success Measures and Statements.
Common Interactive Contexts:
Marketing Channels, Advertisements, Customer Journeys,
Customer Experience, Loyalty, Satisfaction Scores, Processes, Applications,
User Interfaces, Websites, Webpages, and System Interfaces.
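The sketch below, with invented identifiers and tags, shows one way a single data item might carry tags from all three categories so downstream analytics can filter by the vocabulary that applies:

# A hypothetical order record tagged with its business contexts.
customer_order = {
    "order_id": "SO-10231",
    "contexts": {
        "external": ["Geographical Regions:EMEA", "Markets:Industrial Pumps"],
        "internal": ["Divisions:Fluid Handling", "Products:PX-200"],
        "interactive": ["Marketing Channels:Distributor Portal",
                        "Customer Journeys:Reorder"],
    },
}

def in_context(item: dict, category: str, prefix: str) -> bool:
    # check whether the item carries a tag from the given context category
    return any(tag.startswith(prefix) for tag in item["contexts"][category])

print(in_context(customer_order, "external", "Markets:"))   # True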
Technical Context
Data must also be understood in terms of physical contexts,
limitations, and potential lag times. Data sources need to be understood in
terms of their currency and their ability to be integrated easily with other sources.
While many views, interactions, and integrations work well at the logical
level, physically they may not be ready in terms of near-real-time
capabilities, transformation potential, or performance levels on or off-prem.
While meta-data may exist to understand possible joins and combinations,
executing them fast enough to be useful in multiple business contexts may not
be possible. The physical data types and file storage mechanisms may not be
conducive to the demands of new usage scenarios. New low-lag databases that are
near real-time will become the standard going forward.
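As a minimal sketch of that physical-readiness concern (the source names, timestamps, and threshold are invented), the check below verifies that two sources are fresh enough before serving a near-real-time join:

from datetime import datetime, timedelta, timezone

MAX_LAG = timedelta(seconds=5)

source_last_updated = {                       # hypothetical ingestion timestamps
    "orders_stream": datetime.now(timezone.utc) - timedelta(seconds=2),
    "pricing_batch": datetime.now(timezone.utc) - timedelta(hours=6),
}

def ready_for_join(*sources: str) -> bool:
    # every participating source must be within the allowed lag window
    now = datetime.now(timezone.utc)
    return all(now - source_last_updated[s] <= MAX_LAG for s in sources)

if ready_for_join("orders_stream", "pricing_batch"):
    print("safe to serve a near-real-time view")
else:
    print("fall back to the last materialized view")   # pricing_batch lags too far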
Net; Net:
Data, information, and knowledge are quite dependent on the
context(s) they participate in or the perspective they are viewed from. Knowledge
worlds often interact; therefore, meanings can overlap and connect in ways
that are essential for ultimate understanding, manipulation, or utilization.
Knowing the context of your data is absolutely critical for leveraging
understanding. All of this is happening
at greater speeds, approaching the “database
of now” speed necessary to make critical decisions, actions, adjustments, or
improvements.