Monday, August 3, 2020

Increasing Corporate Performance with the Database of Now

Organizations no longer have the luxury of sitting back and waiting for an opportunity to react. Corporate performance depends on intercepting the emerging future quickly, which puts a premium on the Database of Now. There are many examples of emerging conditions that organizations could not pre-build strategic responses for, such as inverted yield curves, new super competitors, hyper-disinflation, currency shifts, pandemics, ECO events, and geopolitical shifts. So how do organizations take advantage of the Database of Now and build interacting response cycles? The answer is to create a Database of Now and leverage it differently at each level of the organization (see Figure 1) while trying to extend from reaction to preemption. The interaction will differ across the cascading levels of strategy, tactics, and operations.


Figure 1 Interacting Response Cycles

Organizations run in an automatic mode under normal conditions; they take actions without a lot of thinking or bother. The problem today is that automatic mode no longer delivers profitability as consistently as it has in the past because of emergent conditions. These conditions can emerge from a variety of sources, represented by fast and rapidly growing streams of data. They can come from outside the organization, anticipated or unanticipated, where they are not as controllable. They can also come from inside the organization, in a controlled fashion, through observation or management influence aimed at optimizing business outcomes. See Figure 2 for the common sources of new conditions. The causes include changes in data, patterns, contexts, decision parameters, results from actions, changes in goals, or new risk management desires/demands. These changes are occurring more frequently and at greater speed, thus creating the need for the Database of Now.


Figure 2. Sources of Emergent Conditions

Keep in mind that each level's triggers in Figure 1 will likely be different, iterative, and possibly influenced by or interconnected with the other levels.

Operations of Now:

Operations are focused on completing business events, customer journeys, and work journeys with the support of humans, software, bots, and physical infrastructure. Operations are often iterative and monitored in near real time. Today's operations require a Database of Now, where dashboards reflect the actual progress and completion of work. When exceptions emerge, responses are required within the constraints of existing operational goals to make minor adjustments; significant adjustments need projects that may leverage a fail-fast approach to make corrections. Operational goals are often influenced by changes initiated by tactical and strategic decisions and adjustments. Analysis and reporting help make appropriate adjustments without unseating other operations.

Tactics of Now:

Tactical management, within the constraints of strategy, tends to optimize interrelated outcomes that may look across multiple operational domains. Real-time forecasting based on real-time data is essential to predict the direction of aggregated operations. The Database of Now plays a crucial role in making better decisions by quickly changing rules to optimize business outcomes in support of strategic goals and directions. Tactical changes may imply shifting resources or changing the rules, goals, and constraints of aggregated operations. Key projects are often identified at this level, such as recognizing patterns that might indicate the need for a new product or service. This is also the level that decides the amount and type of automation that will help reach the currently selected strategy.
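As a rough illustration of what real-time forecasting over aggregated operations can look like, here is a minimal Python sketch that smooths per-minute order counts and projects the next window. The column names (order_ts, order_id), the 30-minute smoothing span, and the 15-minute horizon are hypothetical choices, not taken from any particular product.

```python
# Hypothetical sketch: short-horizon forecast over a stream of operational events.
# Column names, smoothing span, and horizon are illustrative assumptions.
import pandas as pd

def forecast_next_window(events: pd.DataFrame, horizon_minutes: int = 15) -> float:
    """Forecast the next window's order volume from an exponentially weighted
    average of per-minute counts (a stand-in for richer forecasting models)."""
    per_minute = (
        events.set_index("order_ts")
              .resample("1min")["order_id"]
              .count()
    )
    # Exponentially weighted mean gives the most recent minutes more influence.
    smoothed = per_minute.ewm(span=30).mean()
    return float(smoothed.iloc[-1] * horizon_minutes)

if __name__ == "__main__":
    demo = pd.DataFrame({
        "order_id": range(120),
        "order_ts": pd.date_range("2020-08-03 09:00", periods=120, freq="30s"),
    })
    print(f"Expected orders in the next 15 minutes: {forecast_next_window(demo):.0f}")
```

A forecast like this is only useful tactically if the underlying data arrives without long synchronization lags, which is the point of the Database of Now.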

The Strategy of Now:

Strategies tend to stay stable and are highly linked to the organization's missional operations within its typical communities and common scenarios. Predictive and prescriptive analytics help shape expected scenarios that may be sitting on the shelf with their associated tactics and operations, ready to jump in at a moment's notice. Of course, the Database of Now can point to a playbook for switching scenarios when expected patterns emerge. Still, unexpected patterns can generate the need to apply new scenarios generated by more predictive and prescriptive analytics.

Net; Net:

It is pretty easy to see that fast monster data will create the need for the Database of Now, which is necessary for better performance at all levels (strategy, tactics, and operations). It is also clear that fast data, without the time lags generated by too many synchronizations and transformations, is necessary for better corporate performance while keeping all contributing resources aimed at business outcomes.


Tuesday, July 14, 2020

Best Visual Options for Process Mining

Until bots can be cognitive enough to complete closed-loop improvements on processes or data stores on their own, visualization for humans will be key to making process improvements. Today many of those improvements are made by humans mining the data in real time or after the fact. They do it by setting tolerances and monitoring outcomes, or by looking at visualizations of process instances traveling through processes or collaborations. The best visual options for any organization will depend on its culture, maturity, and desired business outcomes. I've laid out three categories of process mining visualization techniques that typically match maturity levels. I have used examples from vendors to help sort out the options, so your favorite vendor may have been left out of this post.



Basic Visualizations

Basic visual analysis sometimes starts with an ideal process, sometimes called a "happy path", and looks for the actual paths taken by a process. Some organizations start with the outliers and try to rein them in closer to the ideal. Other organizations start with clusters of the most common deviant paths and try to improve them. See the visualization below for a representation of this approach. Most organizations also do a before-and-after to measure change effects, also depicted below; this shows the process before changes are made and the resulting process, with deltas in certain instances.
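Before these views can be drawn, the event log has to be reduced to variants. Here is a minimal Python sketch of that step, assuming the common case_id/activity/timestamp log layout; the column names and the "happy path" itself are illustrative, and commercial miners compute this directly from real system logs.

```python
# Hypothetical sketch: derive process variants from an event log and compare them
# to a declared "happy path". Columns and the path are illustrative assumptions.
import pandas as pd

HAPPY_PATH = ("Receive Order", "Check Credit", "Ship Goods", "Send Invoice")

def variant_table(log: pd.DataFrame) -> pd.DataFrame:
    """Order each case's events by time, count every distinct activity sequence
    (variant), and mark whether it matches the ideal path."""
    ordered = log.sort_values(["case_id", "timestamp"])
    variants = (
        ordered.groupby("case_id")["activity"]
               .apply(tuple)              # one activity sequence per case
               .value_counts()            # how many cases follow each sequence
               .rename_axis("variant")
               .reset_index(name="cases")
    )
    variants["is_happy_path"] = [v == HAPPY_PATH for v in variants["variant"]]
    return variants

if __name__ == "__main__":
    log = pd.DataFrame({
        "case_id":  [1, 1, 1, 1, 2, 2, 2, 2, 2],
        "activity": ["Receive Order", "Check Credit", "Ship Goods", "Send Invoice",
                     "Receive Order", "Check Credit", "Request Info", "Ship Goods",
                     "Send Invoice"],
        "timestamp": pd.date_range("2020-07-14", periods=9, freq="h"),
    })
    print(variant_table(log))
```

The outliers and clusters of deviant paths mentioned above are simply the rows of a table like this, ranked by case count or by distance from the happy path.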







Intermediate Visualizations

More mature organizations try to add important business contexts to show the actual delivery made by processes in terms of key measures. One of the more important contexts, shown below, is the set of actions placed on a timeline. This gives "time to results" a high priority while counting key costs and resource utilization specifics. It is an effective way to eyeball opportunities. Another key approach is to show the process instances in light of desired outcomes versus real outcomes, usually represented by dashboards or scorecards, also depicted below. This is the start of the journey toward adding more intelligence to process mining efforts. Simple step-through visualization, with or without simulation of proposed changes, is another nifty approach pictured below.
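For a sense of how such a "time to results" timeline is assembled, here is a small Python sketch using the same hypothetical case_id/activity/timestamp event-log columns as above; real mining products derive these views, and far richer ones, out of the box.

```python
# Hypothetical sketch: lay one case's activities out on a timeline so long-running
# steps stand out. Assumes an illustrative case_id/activity/timestamp event log.
import pandas as pd
import matplotlib.pyplot as plt

def plot_case_timeline(log: pd.DataFrame, case_id) -> None:
    """Draw each activity as a bar from its start to the start of the next one."""
    case = log[log["case_id"] == case_id].sort_values("timestamp").reset_index(drop=True)
    starts = case["timestamp"]
    # Assume the last activity runs 30 minutes; a real log would carry end times.
    ends = starts.shift(-1).fillna(starts.iloc[-1] + pd.Timedelta(minutes=30))
    offsets = (starts - starts.iloc[0]).dt.total_seconds() / 60.0
    durations = (ends - starts).dt.total_seconds() / 60.0

    fig, ax = plt.subplots()
    ax.barh(case["activity"], durations, left=offsets)
    ax.set_xlabel("Minutes since case start")
    ax.set_title(f"Case {case_id}: time to results")
    plt.tight_layout()
    plt.show()

if __name__ == "__main__":
    log = pd.DataFrame({
        "case_id":  [1, 1, 1],
        "activity": ["Receive Order", "Check Credit", "Ship Goods"],
        "timestamp": pd.to_datetime(["2020-07-14 09:00", "2020-07-14 09:20",
                                     "2020-07-14 11:05"]),
    })
    plot_case_timeline(log, 1)
```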




                                    



Advanced Visualizations

One of the proven visualization techniques is animation that draws humans to opportunities through speed or color indicators. This typically shows choke points and bottlenecks, but it can also be used to simulate alternatives and show the value of different change opportunities. See below for an example. Predictive analytics combined with virtual reality can be used to visualize points of view or personas, letting someone walk through a process or journey and fine-tune it from different perspectives, as depicted below. Organizations that want to learn as they go can add machine or deep learning to improve processes, as depicted below.
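On the machine learning angle, here is a deliberately small Python/scikit-learn sketch, on synthetic data, of the kind of model that flags in-flight cases likely to run long so attention or automation can be aimed at them; the features, threshold, and data are invented purely for illustration.

```python
# Hypothetical sketch: train a simple model to predict which in-flight cases will
# run long. Features, labels, and data are synthetic and purely illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
n = 500
features = np.column_stack([
    rng.integers(1, 6, n),    # number of rework loops so far
    rng.uniform(0, 48, n),    # hours spent waiting in queues
    rng.integers(0, 2, n),    # escalated to a specialist (0/1)
])
# Synthetic label: cases with rework and long waits tend to blow their SLA.
slow = (features[:, 0] * 6 + features[:, 1] + features[:, 2] * 10
        + rng.normal(0, 5, n)) > 40

X_train, X_test, y_train, y_test = train_test_split(features, slow, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```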






Net; Net:

The visualization approach chosen can have a great impact on the resulting processes: finding opportunities for more automation, tuning for better results, and trying alternatives without the negative impact of breaking or degrading already-optimized processes. Your chosen visualization might be a personal preference, but as organizations mature, more sophisticated visualizations will be needed until smart autonomous bots or agents can do this work as a partner or on their own.


Thursday, July 9, 2020

Is Your Data Smart Enough?

The state of data affairs over the last ten years or so has revolved around big data. Of course, size matters, but big data promises to morph into monster data as more data sources hit the cloud with more tributaries like voice, video, IoT, events, and business patterns. So what about all this parked data? Are we going to keep storing it and bragging about how much cloud space it consumes? Are you going to make it cleaner and smarter, or just admire it? I would suggest we make data more intelligent and faster rather than just figuring out how to catalog and park it for later use. Making it faster means treating the data as a database of now, not an archive for the future. Making data smarter can be tricky, but it is worth it.

Gleaning Data is Basic Intelligence

Capturing data of different types and classifying it is pretty normal. Deciding how long and where to keep it is essential. Determining if it is worthy of long-term archiving is doing data a solid. Knowing some basics about the data source, the cost of acquisition, and relative purity is pretty much a given these days. Some data cleansing and organization will help usage down the road.

Giving Data Meaning is Average Intelligence

Knowing the data about the data (AKA meta-data) is essential for interpreting it. The simplest step is understanding the data's domain and its relative relationship to other data (logically or physically). Data representation and transformation options are pretty essential when combining with other data. Knowing the key or identifier of groups of related data is pretty standard. This step is where some of the impurities can be dealt with before heavy use. First use usually revolves around visualization and reporting to find actionable insights. At times this step turns descriptive data into something prescriptive.

Granting Data Representation in Its Context is Very Smart

Most data is gathered and used within one or two base contexts. One is undoubtedly timing/frequency, and the other is the primary home of the data, for instance, the entity family it belongs to, like product data. Sophisticated context representation will go beyond an original context or source to include others that have a neighborhood relationship with the data grouping/entity. An example would be a product within multiple markets and channels. This level is where statistical and predictive models enable more actions to either react to or intercept the trends indicated in the data. This level turns prescription into prediction, creating and placing data, event, or pattern sentinels on processes or at the edge to look for prediction completion or variants.
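To make the "sentinel" idea concrete, here is a toy Python sketch of a watcher that compares fresh observations against a predicted band and flags variants; the metric, prediction, and tolerance are illustrative stand-ins rather than any particular product's mechanism.

```python
# Hypothetical sketch of a "sentinel": a small watcher placed on a process or at
# the edge that compares fresh observations with a predicted band and flags
# variants. The metric, prediction, and tolerance below are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sentinel:
    metric: str
    predicted: float   # what the predictive model expects for the next period
    tolerance: float   # acceptable deviation before a variant is flagged

    def check(self, observed: float) -> Optional[str]:
        """Return an alert message when the observation leaves the predicted band."""
        if abs(observed - self.predicted) > self.tolerance:
            return (f"{self.metric}: observed {observed:.1f} is outside "
                    f"{self.predicted:.1f} +/- {self.tolerance:.1f}")
        return None  # the prediction is completing as expected

if __name__ == "__main__":
    sentinel = Sentinel(metric="daily product returns", predicted=120.0, tolerance=15.0)
    for value in (118.0, 127.0, 161.0):   # pretend these arrive from a data stream
        alert = sentinel.check(value)
        if alert:
            print("variant detected ->", alert)
```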

 Grinding Data to a Fine Edge is Smarter

Here we are interrogating data to learn when important adjustments are needed to the goals, rules, or constraints of operating processes that include humans, software systems, or machines. This level can build a change into the work through a supervised or unsupervised change process. It starts with machine learning and extends to deep learning, which peels back layers and interrogates more data. In extreme cases, the data can be used to support judgment, reason, and creativity. The worm turns from data-driven to goal-driven, with goals established by cognitive collaborations within management principles, guidelines, and guardrails.

Grappling with Data in Motion Right Now is Brilliance

The pinnacle of smart data is where fresh incoming data is used to create the "database of now". At this level, all of the approaches above can be applied in a hybrid/complex fashion on a near-real-time basis. This level uses the combined IQ of all the AI and algorithm-driven approaches in a poly-analytical way that leverages that brainpower combined with fast data. A dynamic smart-parts creation and dynamic assembly line would be a non-combat example.

Net; Net:

Data: use it or lose it, but let the data lead to the learnings that sense, decide, and suggest responses appropriate to the action windows necessary to meet the timing need. If it is a sub-second-focused problem domain, the patterns in the data and intelligent methods may make the decisions and take action within governance constraints. If it is not sub-second-focused, let smart notifications or options be presented to the humans supervising the actions. Don't leave all that precious data parked for the future only.


Tuesday, July 7, 2020

Art for 2Q 2020

I hope you and yours are safe and healthy during these pandemic days. I delivered on a promise to my granddaughter and painted Gabriella a beach scene that we designed on the phone right around her birthday, when she called to thank us for her birthday gift. It was a fun piece to do, even though sea scenes are difficult. To say the least, I learned some lessons for the next one, but she was quite pleased with the results.

It was a great quarter for art sales. I sold seven pieces to two collectors. Two were fractals and the rest were paintings, which is quite different from my normal mix; fractals have sold two to one in the past. I also completed a couple more fractals for you to see. If you are interested in seeing my portfolio or buying a piece, please click here.



                                                         Serenity Beach


                                                              Happy Koi
                                           
                             
                                                             Circle Saw

Wednesday, July 1, 2020

Context: The Connecting Clues for Data

The “database of now” demands a quick understanding of data, particularly in context. There are many opportunities for understanding or misunderstanding data in terms of the contexts it participates in, is connected to, or sits at the edge of in a data neighborhood. Because each context has its own unique vocabulary, you can see the opportunity for misconnects in meaning when the full context of a statement or set of proven facts is not understood.

If someone says, "I like the blue one", how can you evaluate what that means? If it is a swimsuit on the beach, it means one thing; if it's a lobster from the same beach, it means a totally different thing. Context is what gives data real meaning. There are three primary forms of context that help uncover the true meaning of the base data: real-world contextual meaning, business contextual meaning, and technical contextual meaning. Obviously, finding meaning in big or monster data is a challenge, but that difficulty increases as the speed increases, particularly if the data is hard to manage or access.


Figure 1 Representation of Interconnected Contexts.

 Real-World Context

Data has meaning in terms of its definitional domain. When you mention "blue", it usually comes from the color domain. However, in the world of mental health, it means a kind of feeling or mood. So understanding the base context in which a data element exists is essential. If blue is associated with a human context, it could be physical and mean a lack of oxygen. It could also mean that the person is adorned in something blue. This is usually cleared up by understanding the base subject or entity the data element is associated with, giving it a precise name, meaning, and basic subject area association. Underlying meaning can be tricky when looking at the data value alone. Having proper meta-data and associations is the ideal solution to this problem.

Business Context

The contextual areas that relate to business fall into three basic categories of meaning, and every data item or group of data items needs to be viewed in terms of the context it is being viewed in or from. Internal contexts are those where the vocabulary is understood and defined within a particular organization; they usually revolve around the organization and skill definitions. External contexts represent the outside world irrespective of the organization itself. The third category, interactive contexts, is where the outside world touches the internal world. Listed below are the typical contexts in each of these categories:

Common External Contexts:

Communities, Brand/Reputation, Public, Legal Frameworks/Courts, Geographical Regions, Countries, Local Culture, Governmental Agencies, Industries, Dynamic Industry 4.0, Value Chains, Supply Chains, Service Vendors, Markets, Competitors, Prospects, and Competitors' Customers.

Common Internal Contexts:

Organizational Culture, Goals, Constraints, Boundaries, Actual Customers, Products, Services, Suppliers, Employees, Contractors, Departments, Divisions, General Accounts, Contracts, Physical Infrastructure, Technical Infrastructure, Properties, Investments, Intellectual Capital, Business Competencies, Knowledge, Skills, Patents, Success Measures and Statements.

Common Interactive Contexts:

Marketing Channels, Advertisements, Customer Journeys, Customer Experience, Loyalty, Satisfaction Scores, Processes, Applications, User interfaces, Websites, Webpages, and System Interfaces.

Technical Context

Data must also be understood in terms of physical contexts, limitations, and potential lag times. Data sources need to be understood in terms of their currency and their ability to be integrated easily with other sources. While many views, interactions, and integrations work well at the logical level, physically they may not be ready in terms of near-real-time capabilities, transformation potential, or performance levels on or off prem. While meta-data may exist to describe possible joins and combinations, executing them fast enough to be useful in multiple business contexts may not be possible. The physical data types and file storage mechanisms may not be conducive to the demands of new usage scenarios. New low-lag databases that operate in near real time will become the standard going forward.

Net; Net:

Data, information, and knowledge are quite dependent on the context(s) they participate in or the perspective they are viewed from. Knowledge worlds often interact; therefore, meanings can overlap and connect in ways that are essential for ultimate understanding, manipulation, or utilization. Knowing the context of your data is absolutely critical for leveraging understanding. All of this is happening at greater speeds, approaching the “database of now” speed necessary to make critical decisions, actions, adjustments, or improvements.



Tuesday, June 30, 2020

Generative AI+ Art is Gaining Momentum

I thought a post on generative art might be of interest in the spirit of all things AI. This kind of art leverages AI, algorithms, randomness, programs, and humans to create exciting and beautiful art. As you may know, I now collaborate with fractal Software to develop compelling and award-winning artwork. In fact, some of my fractals are my best sellers. I have a great friend and fellow artist, Bob Weerts, who is pushing this collaboration even further. Below are two of his early generative pieces:












Bob employs lines as his fundamental stylistic element and incorporates chance in determining line length, density, and color. He cedes some control over the work's final outcome to a process, enabled by Software he's written, that allows the piece to "emerge" over time. He plans to let the Software take more control of these emergent pieces over time, letting AI/algorithms expand their range. I find his early pieces quite pleasing and interesting already.

One source of Bob’s original inspiration is Casey Reas's "Process Compendium," which, among other ideas, explored a synthesis of the Complexity Science notion of “emergence” and generative art in the early 2000s. An example of Reas's Compendium work is below (Click Here for Other Examples).

Reas is an internationally admired artist, but he is perhaps best known as the author, along with Ben Fry, of the graphical sketching tool called "Processing," which is widely used in the domains of art, design, and media.

The significance of the generative art trend is perhaps exemplified by Christie's record $432,500 sale of "Portrait of Edmond de Belamy". The image is one of a series created by a group of young French students collaborating collectively as "Obvious". Obvious borrowed heavily from open-source Generative Adversarial Network (GAN) code developed by Robbie Barrat, then a recent high school graduate, building on an approach originally conceived by the AI researcher Ian Goodfellow. This got the ball rolling, and there is new momentum behind the "GAN" movement. Generative adversarial networks (GANs) are algorithmic architectures that use two neural networks, pitting one against the other (thus the "adversarial") to generate new, synthetic instances of data that can pass for real data. They are used widely in image, video, and voice generation.
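To make the two-network idea concrete, here is a deliberately tiny, generic GAN sketch in Python/PyTorch on one-dimensional toy data rather than images; the network sizes, learning rates, and target distribution are illustrative choices of mine, and this is unrelated to the Obvious or Barrat code.

```python
# Minimal, generic GAN sketch on 1-D toy data (illustrative only): a generator
# learns to mimic samples from N(4, 1.5) while a discriminator learns to tell
# real samples from generated ones.
import torch
import torch.nn as nn

latent_dim = 8
generator = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))
discriminator = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 1.5 + 4.0                  # the "real" data
    fake = generator(torch.randn(64, latent_dim))           # the synthetic data

    # Discriminator: score real samples toward 1 and generated samples toward 0.
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1))
              + loss_fn(discriminator(fake.detach()), torch.zeros(64, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator: adjust weights so the discriminator scores fakes as real.
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

print("generated sample mean:", generator(torch.randn(1000, latent_dim)).mean().item())
```

The same adversarial loop, scaled up to convolutional networks and image data, is what produces the portraits and "deep fake" media described below.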

GANs' potential for both good and evil is huge because they can learn to mimic any distribution of data. GANs can be taught to create worlds eerily similar to our own in any domain: images, music, speech, prose. They are robot artists in a sense, and their output is impressive. But they can also be used to generate fake media content, often called "deep fakes."

Net; Net:

AI generative art is quite striking. Since the whole field is shifting more toward the AI and away from the artist/programmer, we can expect some exciting results in the future. I will likely pursue a more intimate collaboration with all kinds of generative art going forward. Keep your eye on Bob Weerts, as he is a creative guy chasing this edge faster than many other artists.

 

If you want to see my works, check out the fractals section here 

If you want to know more about my collaborations with Software to create, check out this post 

Read about more right-brained AI by clicking here 


Tuesday, June 16, 2020

Exploring Data Delivers

We hear about organizations mining data looking for benefits nearly every day now. Just like the prospectors of old, people are trying to mine gems out of the big patch of ground under their claim while searching adjacent areas. The case studies abound, so the appeal is strong. Are these mining efforts really paying off, and how should one go about it? Just start digging a big hole and hope for the best? With all the buzz around data mining and process mining, there are some proven paths to successful mining.




Identify the Benefits of Data Mining 

It's pretty easy to justify mining efforts on the promise of benefits today because there are so many success stories floating around. The typical benefits that keep repeating include improved decision making, improved risk mitigation, improved planning, competitive advantage, cost reduction, customer acquisition, customer loyalty, new revenue streams, and new product/service development. The crucial step here is to find the benefits that will resound in your organization and situation. These days organizations are dealing with a multitude of challenges, from plagues to politics. It is always a win to save costs, but there has to be more to it: a compound set of appropriate benefits is needed to justify mining efforts.


Scoping Efforts Properly Delivers Better Results

While we all believe that data mining has the potential to improve and even transform organizations, the amount of data to mine is growing by the second, and the number of advancements in making data smart is expanding. It's not difficult to understand why the majority of organizations are struggling to find the right strategy or solution. The first step is to discover where there is significant potential, just as miners drill boreholes to discover the potential in the ground. That means organizations will have to sample areas of data that promise potential. To that end, many organizations start with process mining because it promises cost and time savings that often improve customer experiences while leveraging smaller scopes. For those organizations that want an outside-in perspective, starting with customer journeys and large-scoped processes that cross system boundaries has been quite successful.


Incremental Learning is Essential to Continued Success

Starting small and expanding as success allows seems to be the most common model for mining. What is really popular at the moment is using mining to find opportunities for more automation. Savvy organizations will look at adjacent systems, organizational units, and contexts. Feedback loops and iteration will teach the best lessons for mining results. Alternative visualization techniques such as timelines, animation, and limits also help the learning process. Some organizations will also combine visualization with discrete simulation to test alternative outcomes.


Net; Net: 

If you are not practicing focused data mining and looking for productive patterns in your ever-growing data inventory, you are missing many opportunities. As successes emerge, the more savvy organizations are looking to widen their scopes and use approaches that cut through the jungle of their organization. Mining is here to stay and brings a valuable set of methods, techniques, and tools for organizations looking to thrive under all conditions.