Tuesday, July 14, 2020

Best Visual Options for Process Mining

Until bots are cognitive enough to complete closed-loop improvements on processes or data stores on their own, visualization for humans will be key to making process improvements. Today many of those improvements are made by humans through data mining, in real time or after the fact. They do it by setting tolerances and monitoring outcomes, or by looking at visualizations of the process instances that travel through processes or collaborations. The best visual options for any organization will depend on its culture, maturity, and desired business outcomes. I've laid out three categories of process mining visualization techniques that typically match maturity levels. I have used examples from vendors to help sort out the options, so your favorite vendor may have been left out of this post.



Basic Visualizations

Basic visual analysis often starts with an ideal process, sometimes called a "happy path," and looks for the actual paths taken by a process. Some organizations start with the outliers and try to rein them in closer to the ideal. Other organizations start with clusters of the most common deviant paths and try to improve them. See the visualization below for a representation of this approach. Most organizations do a before-and-after to measure change effects, also depicted below: the process before changes are made and the resulting process, with deltas in certain instances.
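As a minimal sketch of the happy-path idea (the event log, activity names, and happy path below are all invented for illustration), one way to surface the actual paths behind such a visualization is to count process variants from an event log:

```python
from collections import Counter

# Hypothetical event log: (case_id, activity) pairs, ordered in time per case.
event_log = [
    ("c1", "Receive"), ("c1", "Approve"), ("c1", "Ship"),
    ("c2", "Receive"), ("c2", "Approve"), ("c2", "Ship"),
    ("c3", "Receive"), ("c3", "Rework"), ("c3", "Approve"), ("c3", "Ship"),
]

def variant_counts(log):
    """Group events by case and count each distinct path (variant) taken."""
    cases = {}
    for case_id, activity in log:
        cases.setdefault(case_id, []).append(activity)
    return Counter(tuple(path) for path in cases.values())

happy_path = ("Receive", "Approve", "Ship")
counts = variant_counts(event_log)
on_path = counts[happy_path]               # cases on the ideal path
outliers = sum(counts.values()) - on_path  # deviant cases to rein in
```

The variant counts are exactly what the basic visualizations render: the thick happy-path line plus the clusters of deviant paths worth improving.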







Intermediate Visualizations

More mature organizations try to add important business contexts to show the actual delivery made by processes in terms of key measures. One of the more important contexts, shown below, is actions plotted on a timeline. This gives "time to results" a high priority while counting key costs and resource utilization specifics, and it is an effective way to eyeball opportunities. Another key approach is to show process instances in light of desired outcomes versus real outcomes, usually represented by dashboards or scorecards, also depicted below. This is the start of the journey to adding more intelligence to process mining efforts. Simple step-through visualization, with or without simulation of proposed changes, is another nifty approach pictured below.
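The "time to results" measure behind a timeline view can be sketched in a few lines; the timestamped log and activity names here are hypothetical:

```python
from datetime import datetime

# Hypothetical timestamped log: (case_id, activity, timestamp).
log = [
    ("c1", "Receive", "2020-07-01 09:00"), ("c1", "Ship", "2020-07-01 17:00"),
    ("c2", "Receive", "2020-07-01 09:30"), ("c2", "Ship", "2020-07-03 12:00"),
]

def time_to_result(log, fmt="%Y-%m-%d %H:%M"):
    """Elapsed hours from the first to the last event of each case."""
    spans = {}
    for case_id, _activity, ts in log:
        t = datetime.strptime(ts, fmt)
        first, last = spans.get(case_id, (t, t))
        spans[case_id] = (min(first, t), max(last, t))
    return {c: (last - first).total_seconds() / 3600
            for c, (first, last) in spans.items()}

durations = time_to_result(log)  # c1 finishes in 8 hours; c2 takes 50.5
```

Plotting these per-case durations on a timeline, next to cost and resource figures, is what makes the slow instances easy to eyeball.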







Advanced Visualizations

One of the proven visualization techniques is animation, which draws humans to opportunities through speed or color indicators. This typically shows choke points and bottlenecks, but it can also simulate alternatives to show the value of different change opportunities. See below for an example. Predictive analytics combined with virtual reality can be used to visualize points of view or personas, fine-tuning processes from different perspectives by walking through a process or journey as depicted below. Organizations that want to learn as they go can add machine or deep learning to improve processes, as depicted below.
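A simple way to find the choke points such an animation would highlight is to compute the mean wait on each hand-off between consecutive activities; the log below is invented for illustration:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical log: (case_id, activity, timestamp), ordered within each case.
log = [
    ("c1", "Receive", "2020-07-01 09:00"),
    ("c1", "Approve", "2020-07-01 10:00"),
    ("c1", "Ship",    "2020-07-02 10:00"),
    ("c2", "Receive", "2020-07-01 09:00"),
    ("c2", "Approve", "2020-07-01 11:00"),
    ("c2", "Ship",    "2020-07-02 13:00"),
]

def mean_waits(log, fmt="%Y-%m-%d %H:%M"):
    """Mean hours spent on each hand-off (edge) between consecutive activities."""
    by_case, waits = defaultdict(list), defaultdict(list)
    for case_id, activity, ts in log:
        by_case[case_id].append((activity, datetime.strptime(ts, fmt)))
    for events in by_case.values():
        for (a1, t1), (a2, t2) in zip(events, events[1:]):
            waits[(a1, a2)].append((t2 - t1).total_seconds() / 3600)
    return {edge: sum(hrs) / len(hrs) for edge, hrs in waits.items()}

waits = mean_waits(log)
bottleneck = max(waits, key=waits.get)  # the edge to color red or slow down
```

An animation engine would then render the `bottleneck` edge in a hot color or slow the token flow across it, which is what attracts the eye to the opportunity.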






Net; Net:

The visualization approach you choose can have a great impact on the resulting processes: finding opportunities for more automation, tuning for better results, and trying alternatives without the negative impact of breaking optimized processes. Your chosen visualization might be a personal preference, but as organizations mature, more sophisticated visualizations will be needed until smart autonomous bots or agents can do this work as a partner or autonomously.


Thursday, July 9, 2020

Is Your Data Smart Enough?

The state of data affairs over the last ten years or so has revolved around big data. Of course, size matters, but big data promises to morph into monster data as more data sources hit the cloud with more tributaries like voice, video, IoT, events, and business patterns. So what about all this parked data? Are we going to keep storing it and bragging about how much cloud space it consumes? Are you going to make it cleaner and smarter, or just admire it? I would suggest we make data more intelligent and faster rather than just figuring out how to catalog and park it so we can use it later. Making it faster means treating the data as a database of now, not of the future. Making data smarter can be tricky, but it is worth it.

Gleaning Data is Basic Intelligence

Capturing data of different types and classifying them is pretty normal. Deciding how long and where to keep it is essential. Determining whether it is worthy of long-term archiving is doing data a solid. Knowing some basics about the data source, the cost of acquisition, and relative purity is pretty much a given these days. Some data cleansing and organization will help usage down the road.

Giving Data Meaning is Average Intelligence

Knowing the data about the data (AKA metadata) is essential for interpreting it. The simplest form is understanding the data's domain and its relative relationship to other data (logically or physically). Data representation and transformation options are pretty essential when combining with other data. Knowing the key or identifier of groups of related data is pretty standard. This step is where some of the impurities can be dealt with before heavy use. First use usually revolves around visualization and reporting to find actionable insights. This step turns descriptive data into prescriptive data at times.
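As a rough illustration of this level of intelligence (the field names and attributes below are assumptions, not any standard), field-level metadata might be captured like this:

```python
from dataclasses import dataclass
from typing import Optional

# Minimal sketch of "data about the data"; names are illustrative only.
@dataclass
class FieldMeta:
    name: str                        # precise field name
    domain: str                      # definitional domain, e.g. "currency"
    entity: str                      # the entity/grouping it belongs to
    is_key: bool = False             # identifier of its group of related data?
    transform: Optional[str] = None  # representation rule for combining data

order_id = FieldMeta(name="order_id", domain="identifier",
                     entity="order", is_key=True)
order_total = FieldMeta(name="order_total", domain="currency",
                        entity="order", transform="normalize to USD")
```

Even a lightweight record like this answers the questions above: what domain the value lives in, which entity it belongs to, whether it is a key, and how to transform it before joining it with other data.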

Granting Data Representation in Its Context is Very Smart

Most data is gathered and used within one or two base contexts. One is undoubtedly timing/frequency, and the other is the primary home of the data; for instance, the entity family it belongs to, like product data. Sophisticated context representation goes beyond an original context or source to include others that have a neighborhood relationship with the data grouping/entity. An example would be a product within multiple markets and channels. This level is where statistical and predictive models enable more actions to either react to or intercept the trends indicated in the data. This level turns prescription into prediction, creating/placing data, event, or pattern sentinels on processes or at the edge to look for prediction completion or variants.

Grinding Data to a Fine Edge is Smarter

Here we are interrogating data to learn the need for important adjustments to goals, rules, or constraints for operating processes that include humans, software systems, or machines. This level can build a change to work in a supervised or unsupervised change process. It starts with machine learning and extends to deep learning, which peels back layers and interrogates more data. In extreme cases, the data can be used to support judgment, reason, and creativity. The worm turns from data-driven to goal-driven, established by cognitive collaborations with management principles, guidelines, and guardrails.

Grappling with Data in Motion Right Now is Brilliance

The pinnacle of smart data is where fresh incoming data is used to create the “database of now”. At this level, all of the approaches above can be applied in a hybrid/complex fashion on a near-time/real-time basis. This level uses the combined IQ of all the AI and algorithm-driven approaches in a poly-analytical way that leverages that brainpower combined with fast data. A dynamic smart-parts creation and dynamic assembly line would be a non-combat example.

Net; Net:

Data: use it or lose it, but let the data lead to the learnings that sense, decide, and suggest responses appropriate to the action windows necessary to meet the timing need. If it is a sub-second-focused problem domain, the patterns in the data and intelligent methods may make the decisions and take action within governance constraints. If not sub-second focused, let smart notifications or options be presented to the humans supervising the actions. Don't leave all that precious data parked for the future only.


Tuesday, July 7, 2020

Art for 2Q 2020

I hope you and yours are safe and healthy during these pandemic days. I delivered on a promise to my granddaughter and painted Gabriella a beach scene that we designed on the phone right around her birthday, when she called to thank us for her birthday gift. It was a fun piece to do even though sea scenes are difficult. To say the least, I learned some lessons for the next one, but she was quite pleased with the results.

It was a great quarter for art sales. I sold seven pieces to two collectors. Two were fractals and the rest were paintings, which is quite different from my norm; fractals have sold two to one in the past. I also completed a couple more fractals for you to see. If you are interested in seeing my portfolio or buying a piece, please click here.



Serenity Beach

Happy Koi

Circle Saw

Wednesday, July 1, 2020

Context: The Connecting Clues for Data

The “database of now” demands a quick understanding of data, particularly in context. There are many opportunities to understand or misunderstand data in terms of the contexts it participates in, is connected to, or borders at the edge of a data neighborhood. Because each context has its own unique vocabulary, you can see the opportunity for misconnects in meaning when the full context of any statement or set of proven facts is not understood.

If someone says, "I like the blue one," how can you evaluate what that means? If it is a swimsuit on the beach, it means one thing; if it's a lobster from the same beach, it means a totally different thing. Context is what gives data real meaning. There are three primary forms of context that help uncover the true meaning of the base data: the real-world contextual meaning, the business contextual meaning, and the technical contextual meaning. Obviously, finding meaning in big or monster data is a challenge, but that difficulty increases as the speed increases, particularly if the data is hard to manage or access.


Figure 1 Representation of Interconnected Contexts.

Real-World Context

Data has meaning in terms of its definitional domain. When you mention "blue," it usually comes from the color domain. However, in the world of mental health, it means a kind of feeling or mood. So understanding the base context in which a data element exists is essential. If blue is associated with a human context, it could be physical and mean a lack of oxygen. It could also mean that the person is adorned in something blue. This is usually cleared up by understanding the base subject or entity the data element is associated with, giving it a precise name, meaning, and basic subject-area association. Underlying meaning can be tricky when just looking at the data value alone. Having proper metadata and associations is the ideal solution to this problem.

Business Context

The contextual areas that relate to business fall into three basic categories of meaning, and every data item or group of data items needs to be viewed in terms of the context it is being viewed in or from. First are internal contexts, where the vocabulary is understood and defined within a particular organization; these usually revolve around the organization and skill definitions. Second is the external context, which represents the outside world irrespective of the organization itself. The third context is where the outside world touches the internal world. Listed below are the typical contexts in each of these categories:

Common External Contexts:

Communities, Brand/Reputation, Public, Legal Frameworks/Courts, Geographical Regions, Countries, Local Culture, Governmental Agencies, Industries, Dynamic Industry 4.0, Value Chains, Supply Chains, Service Vendors, Markets, Competitors, Prospects, and Competitors' Customers.

Common Internal Contexts:

Organizational Culture, Goals, Constraints, Boundaries, Actual Customers, Products, Services, Suppliers, Employees, Contractors, Departments, Divisions, General Accounts, Contracts, Physical Infrastructure, Technical Infrastructure, Properties, Investments, Intellectual Capital, Business Competencies, Knowledge, Skills, Patents, Success Measures and Statements.

Common Interactive Contexts:

Marketing Channels, Advertisements, Customer Journeys, Customer Experience, Loyalty, Satisfaction Scores, Processes, Applications, User interfaces, Websites, Webpages, and System Interfaces.

Technical Context

Data must also be understood in terms of physical contexts, limitations, and potential lag times. Data sources need to be understood in terms of their currency and their ability to be integrated easily with other sources. While many views, interactions, and integrations work well at the logical level, physically they may not be ready in terms of near-real-time capabilities, transformation potential, or performance levels on- or off-prem. While metadata may exist to identify possible joins and combinations, executing them fast enough to be useful in multiple business contexts may not be possible. The physical data types and file storage mechanisms may not be conducive to the demands of new usage scenarios. New low-lag databases that are near real-time will become the standard going forward.

Net; Net:

Data, information, and knowledge are quite dependent on the context(s) they participate in or the perspective they are viewed from. Knowledge worlds often interact; therefore, meanings can overlap and connect in ways that are essential for ultimate understanding, manipulation, or utilization. Knowing the context of your data is absolutely critical for leveraging understanding. All of this is happening at greater speeds, approaching the “database of now” speed necessary to make critical decisions, actions, adjustments, or improvements.