Tuesday, September 22, 2020

Monster Data is Headed Our Way

If you thought Big Data was a challenge for us all, wait until the new wave of Monster Data hits us. We will have to manage it, make decisions with it, and build processes and applications that leverage it. Just what is Monster Data, and how will it affect us?

Monster Data represents data that is overwhelmingly large, unduly complex, and that can't be trusted for accuracy. Typically it is composed of multiple kinds of data, including structured records, unstructured text, voice, images, and video. Some monster data may be unknown or emergent, making it scary for most individuals, technologies, or organizations to deal with.

Even Larger Volumes        

We have long been concerned about "the IoT Awakening" exposing large amounts of critical data that would likely need immediate attention, often at the edge. While managing all the moving parts of Industry 4.0 is a challenge, we see new value chains that employ GPS, tracing, and original digital identities adding data to the mix. As organizations want to leverage data for more refined business outcomes, more data will be needed.

Organizations are leveraging more powerful AI and computer-analysis techniques to gain insight into human behavior using personality, social, and organizational psychology data. This need will yield data sets that are much larger than what we have today and certainly too large for traditional processes and applications. The data will likely include recorded conversations that could be processed into usable information.

More and more data is piling up from digital footprints left in social media, cell phones, business transactions in various contexts, shopping, surfing, and other devices that record our every moment, freely given or not. Sometimes this new data is simply taken from sites as people pass through, leaving crumbs behind.

Even More Complex  

In order to utilize technology to empower us, the data will also become much more diverse and complex. Because large data collections can be computationally analyzed to reveal new signals, patterns, and trends, the complexity of that data will have to be managed well. Organizations want to deliver insights from human behavior and interactions collected everywhere, every second of the day.

The data will come from various contexts that imply context-sensitive meaning. Hopefully, this new and emergent data will be available in the cloud, but cost and security issues will make it more hybrid cloud in nature. The data will likely be a hybrid of structured and unstructured data and require new data management approaches that address ownership and dynamism challenges.

This complex and dynamic set of data sources will become more challenging to manage, but it is on its way to becoming a precious asset that can be leveraged by machine and deep learning. While dynamic and emergent, its use will become more stunning over time.

Even More Inaccurate

Because of speed and size alone, the accuracy of monster data will be a constant challenge. When combining data in new ways, understanding its source, context, and ultimate meaning at all levels of granularity becomes a critical problem for data management professionals as well as end-users.

There will be ownership issues and questions about who will be held accountable for the accuracy of any data leveraged. All of this will have to be sorted and managed under the gun, with the pressure of speedy results. Of course, internal data sets will have a better-understood pedigree than data sources from outside an organization and in contexts not well understood.

Net; Net:

As we grow to zettabytes, the amount and variety of data being accessed, collected, stored in the cloud, stored on-premises, and analyzed will keep increasing exponentially. This seems like a near-impossible task until the promise of better analysis and prediction to correct problems takes over our desires. Business outcomes will likely drive this growth in today's extremely competitive and dynamic environment.

 

Tuesday, September 8, 2020

Decisions Need to Drive Data Science

There has been and will continue to be a significant shift in how data is leveraged in this continually changing world. Until recently, the data science process of collecting, cleaning, exploring, model building, and model deployment ruled the data management mindset. In a world that is "steady as she goes," this makes a great deal of sense. The amount of data to be curated is growing impressively, and the data science mindset is still on the scene, though pressed to its limits by big data. Two things will break this sole reliance on the data science process.



Dynamic Business Scenarios

In a world with operational KPIs staying steady with minor adjustments over time, focusing on data makes sense. That world is virtually gone, with the elephant in the room being a pandemic at the moment. Tomorrow it could be natural disasters and the downstream effects of climate change impacting geopolitical behaviors. There are many business scenario possibilities and combinations headed our way. We can't afford just to explore data and have knee-jerk responses.

Monster Data is Lurking

If you think big data is worthy of concern, just think about the monster data around the corner, driven by higher volumes, more complexity, and even more inherent inaccuracy. Organizations are bound and determined to take advantage of behavioral data that is further away from standard core operational data. Monster data includes all kinds of unstructured data that will contain digital footprints worthy of new types of decisions.

Either of these alone would require a major addition of new data processes; combined, the data science process on its own just won't suffice. I am not saying that data science will dim, but it needs some additional turbocharging and methods that are not focused solely on exploring structured, clean data.

Dealing with Changing Scenarios

There are several ways of dealing with scenario planning and practicing responses, but here is what I would encourage organizations to do. Many decisions will drive the data that is leveraged during these efforts.

  • Plan probable scenarios by having executives brainstorm and list likely scenarios and their outcomes.
  • Simulate and practice these likely scenarios so they become part of the muscle memory of the organization. This will involve leveraging key data sources cascading to tactics and operations. Build communications mechanisms ahead of time and communicate readiness.
  • Identify unlikely but dangerous scenarios, simulate their effects, and plan responses appropriately.
  • Identify critical decisions, events, and patterns to scour appropriate data resources (owned or not).
  • Identify key leverage points in processes, systems, applications, and the data that could be involved.

Dealing with Changing Tactics

Middle management is always trying to optimize outcomes for their functional areas, though savvy organizations try to link results to remove friction points for overall optimization. Optimization often leads to self-imposed changing goals that need to be operationalized or tweaked in operations. When executives want different outcomes based on a refined organizational charter, new governance rules, and critical trends delivered by the business scenarios in place, a bigger picture is in play. Tactics are the essential glue that holds operational outcomes together, guided by goals. As these goals shift in a dynamic set of business demands, managers would be wise to be ready for new guidance coming at faster speeds by following this list of practices.

  • Understand the impact of significant changes by modeling or simulating the effects of change.
  • Be aware of all executive-expected sets of scenarios and search for critical events and patterns to detect new scenario emergence.
  • Implement various approaches to near real-time responses, including digital war rooms, dynamic process/application changes, and low-code methods.

Dealing with Operational Change

Typically, operational processes and systems are in place to deal with day-to-day operations. Changes in behaviors, markets, tactics, or scenarios will cascade down to operations. There may be additions and changes to procedures dictated by outside factors, over and above the routine operational optimizations that occur on an ongoing basis. Processes tend to be more stable, but some changes could rock the house. To deal with operational change, I would encourage the following activities.

  • Model key decisions that affect KPIs and desired business outcomes.
  • Generate procedures from the models—manuals for human resources and code for processes and applications.
  • Perform a volatility analysis based on past changes to identify hot spots, and enable those hot spots for change, particularly for code, using late-binding techniques or low-code (a minimal sketch of such an analysis follows this list).
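Here is a minimal sketch of what a volatility analysis could look like, assuming a simple change-log file; the file name and the 'component' column are hypothetical, not a prescribed format, and a real effort would weigh recency, severity, and cost of change as well.

```python
# Hypothetical sketch: count how often each process/application component
# changed in the past, then flag the most volatile ones as "hot spots" worth
# enabling for late binding or low-code. Column names are assumptions.
import csv
from collections import Counter

def find_hot_spots(change_log_path: str, top_n: int = 10) -> list[tuple[str, int]]:
    """Return the top_n components with the most recorded changes."""
    changes = Counter()
    with open(change_log_path, newline="") as f:
        for row in csv.DictReader(f):          # expects a 'component' column
            changes[row["component"]] += 1
    return changes.most_common(top_n)

if __name__ == "__main__":
    for component, count in find_hot_spots("change_log.csv"):
        print(f"{component}: {count} changes")
```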

Net; Net:

We are entering an era of significant change linked to constant change. It means that just shining up and studying data alone no longer cuts it as a sole strategy. While data affects decisions as they are made, deciding what is going to change is emerging as a dominant new organizational competency area. We need to add some new disciplines/practices to thrive going forward and call it Decision Science.


Wednesday, August 26, 2020

Budgeting Technologies for 2021

Organizations are now challenged in new ways; therefore, they must budget very carefully as we advance. There will be a tug of war between accelerating digital and dealing with budget reductions for IT investment. Gartner, affectionately known as "The Big G" in my circles, has predicted IT budget reductions. At the same time, businesses are being pushed to accelerate digital transformation. Savvy organizations will save with technology to invest in technology, breaking through this set of apparently conflicting goals. Organizations will be careful in deciding what to invest in to survive, thrive, and capitalize on these dynamic and challenging times. I will try to lay out the five most important technologies to invest in to keep these conflicting goals in balance.



Continuous Intelligent Automation & Cost Optimization

Automation has come on strong of late through the use of RPA, Workflow/iBPMS, and Low Code solutions. Now I see the accelerated use of both extended, guided by Process Mining and AI. Process mining continuously offers extreme visibility into opportunities to handle outliers or optimize processes and cases, yielding time and labor savings. Machine and Deep Learning will also play a guiding role in finding more optimization opportunities over and above what the human eye can detect across various mining visualizations. The pressure for quick improvements with fast feedback cycles will push more detection of options to intelligent software or machines as responsible AI continues to develop. The savings from these profitable efforts can be applied to future digitization efforts. These efforts can be multiplied by using Software as a Service (SaaS) in some instances.
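To make the outlier-hunting idea concrete, here is a minimal sketch of how a mining-style analysis could flag slow cases from an event log. The tuple format and the three-sigma threshold are assumptions for illustration; real process mining tools also discover variants, check conformance, and suggest fixes.

```python
# Hypothetical sketch: flag outlier cases in an event log by cycle time.
# Assumes events arrive as (case_id, ISO timestamp) pairs.
from datetime import datetime
from statistics import mean, stdev

def outlier_cases(events, threshold: float = 3.0):
    """Return case ids whose duration is far above the norm."""
    spans = {}
    for case_id, ts in events:
        t = datetime.fromisoformat(ts)
        lo, hi = spans.get(case_id, (t, t))
        spans[case_id] = (min(lo, t), max(hi, t))
    durations = {c: (hi - lo).total_seconds() for c, (lo, hi) in spans.items()}
    mu, sigma = mean(durations.values()), stdev(durations.values())
    return [c for c, d in durations.items() if d > mu + threshold * sigma]
```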

Human Augmentation & Skills Expansion

As more automation pushes humans to higher-skilled pattern detection, advanced Decision Intelligence, and smarter actions, humans will use technology to enhance their cognitive and physical experiences. There may be sensory augmentation, perception augmentation, and AI cognitive assists enabling higher-skilled work levels. For physical responses, appendage assistance and exoskeleton leverage may be enabled. Imagine having the assistance of experienced experts in your ear, eye, or mind to accomplish more challenging tasks. The new worker will have interactions with technologies that enable super skills and accelerated outcomes. This will start small and accelerate by the end of 2021.

Immersive Experiences & Visibility

All constituents will have more immersive and pleasing experiences, making them more informed and satisfied. Customer Journey Mapping/Mining technologies will allow organizations to get real-time and truthful feedback from their customers, employees, partners, and vendors to help improve their experiences on a continuous basis. Virtual Reality and Mixed Reality have the potential to radically influence the direction of improved customer experiences, product supply, and value chain services. These new visibility assists can give customers a real sense of progress towards their outcomes when balanced with organizational processes. Some organizations have seen the value in onboarding and immersive training in a safe and realistic virtual environment.

Augmented & Real-Time Data Management

The amount and speed of data are increasing faster than our ability to manage it. Big data is turning into a complex, multi-headed monster of data types with varying requirements. Managing all the data and data types will require assistance. Data marketplaces and exchanges are emerging to add to the data chaos. Managing the various data sources will need to leverage the Database of Now, integrating various data sources in the cloud, AI's ability to learn from the incoming flood of data, and the metadata that defines it within its various contexts and workloads. Dark data will start to be better understood. Data journeys and transparency will be assisted by practical Blockchain that enables data traceability.

Autonomous Bots & Edge Computing

Autonomous bots/agents will, at most, bid on work and, at minimum, perform activities on "the edge" with AI help. Edge processing, data collection, and decisions are placed closer to the information and activity source to sense, decide, and respond in the proper context. Often the IoT is where this occurs when machines, sensors, and controllers are involved with physical activity, but there are instances where software, with or without a Digital Twin, has a presence at the edge. Today these activities are often semi-autonomous and supervised, but we are moving to more autonomy over time. Smart spaces, smart production, and smart value chains will drive these kinds of efforts. Look for robots as a service (RaaS) to alleviate some of the data density issues.

Net; Net:

Every organization will have to match its operating plans to the technologies above and decide what it wants to take on within its cultural and risk limits. The danger here is focusing on technologies that contribute to short-term financial results to the detriment of the future. This is true of short-term cloud efforts undertaken without thinking of the total cost of ownership. Grab some profits, but invest wisely to compete digitally in the future. Please negotiate with your financial folks, or hope they get more innovative.

Thursday, August 20, 2020

Are Masks the New Accessory?

When COVID-19 first emerged, a number of us scrambled to get any mask we could. In our home, we first went for the standard paper mask from the pharmacy. As COVID-19 got to be a bit more pervasive and scary, we upgraded to N95s or KN95s and some double-layered cloth masks, mostly in black. Now masks are getting better looking, so I thought I'd put some of my art on masks to see if there was a demand. Indeed there was. It seems the folks that want masks want something nice to look at. Here are the masks that I'm offering. I can be reached through this blog if you are so inclined to give me feedback or even have one in your possession. You can see more of my art by clicking here.




Wednesday, August 19, 2020

Acceleration of Decisions Helped by the Database of Now

Things were going along nicely until COVID-19 hit, and it was "game on" for rapid decision making. Executives were slammed, pushed from operational optimization with known, static decision models, while transforming incrementally to digital, into a world of large numbers of decisions made in short time frames. The vast majority of organizations had not planned for this kind of scenario and thus had not practiced handling it. The acceleration to some form of digital and remote workers was instant. Our executives were bombarded with one critical decision opportunity after another, and thankfully many were up to the task. The question is: "Is this a one-off situation?" I would argue that, while it may not be the exact same scenario, it is the beginning of many emergent situations at various corporate performance levels. How does the Database of Now help?



Integrated Data to Support the Lateral Thinking in Decision Making

Traditionally, decisions have been generated by new management goals and by finding "aha" discoveries in data. While this approach will continue, forced innovation will be a necessity driven by these large-scale and emergent scenarios, such as changing markets, customer demands, extreme competition, and the desire for better outcomes. This shift will require more data and new complexities to continue to monitor, decide, and take appropriate actions. Intercepting the changing future will also drive adapting to and integrating new data resources, many of which will be cloud resident. The Database of Now supports dynamic and easy integration of new/emergent data sources.

Fast Data for Analytic Assistance, Guided, and Flexible Implementations

Decision-makers will demand assistance in making informed decisions fast and in understanding the ultimate impact of their actions in planning and execution modes. The first demand will be for fast data support before the actual decision occurs. Lack of speed kills, but so does speed without anticipating potential outcomes. Analytics will greatly assist decision-makers in understanding the possible consequences of their impending decisions. Once the decisions are made, the emergent effect will need to be tracked, monitored, and measured during rollout. Once implemented, fast feedback loops will help guide adjustments for better performance while sensing emergent patterns for potential new decisions. The Database of Now is designed for speed.

Smart Data that Leverages Machine Learning and Other Forms of AI

Today, most decision-makers are highly involved with the decision-making process unless decisions can be easily automated, usually using decision models. Typically these decisions are operational and static in nature, but there is a strong trend towards flexible change and emergent business outcomes driven by new responses to integrated response scenarios. Either way, forms of AI can speed the decision-making process, starting with machine learning that watches conditions and outcomes. Deep learning can sharpen the focus for even better results. This kind of leverage is often described as smart data, commonly used in supervised learning situations. With the advent of emergent and complex conditions driven by expected or unexpected events and emerging patterns, AI will take a more judgmental role in an unsupervised fashion. The Database of Now is ideal for having the most up-to-date and contextually sensitive data sources necessary for high intelligence.
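As a minimal sketch of the "machine learning that watches conditions and outcomes" idea, the snippet below trains a classifier on historical decision conditions and outcomes so routine operational decisions can be suggested at speed. It assumes scikit-learn is available; the synthetic features and the approve/decline rule are purely illustrative.

```python
# Hypothetical sketch: supervised learning over past decision conditions and
# outcomes. Feature meanings (order size, credit score, etc.) are stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X = np.random.rand(500, 4)                      # e.g., order size, credit score, region, backlog
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)       # stand-in for historical approve/decline outcomes

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```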

Net; Net:

The future will require rapid decision making that needs speedy data traversing many data types, monster data volumes, and growing complexity. There will be quiet periods of optimization that will also benefit from the Database of Now, but get ready for waves of emergent situations potentially never seen before by modern decision-makers. It may turn decision making on its head, expanding from only modeling operational decisions into crisp responses to also include emergent decisions dependent on complex, fast, and shifting data sources. Will you be ready for fast and effective decisions for customer needs and operational effectiveness?

  • Customer experience demands responsive and instantaneous data.
  • Business operations insights enable instant adaptations for changing market needs.

Additional Reading:

Increasing Corporate Performance with the Database of Now

Context: The Connecting Clues for Data

Delivering Success with Smart Data Streams

Monday, August 3, 2020

Increasing Corporate Performance with the Database of Now

Organizations no longer have the luxury of sitting back and waiting for an opportunity to react. Corporate performance depends on intercepting the emerging future quickly, thus putting a premium on the Database of Now. We can see many examples of the inability to pre-build strategic responses to emerging conditions such as inverted yield curves, new super competitors, hyper disinflation, currency shifts, pandemics, ECO events, and geopolitical shifts. So how do organizations take advantage of the Database of Now and build for interacting response cycles? The answer is to create a database of now and leverage it differently at different levels in the organization (see Figure 1) while trying to extend reaction into preemption. The interaction will differ at the cascading levels of strategy, tactics, and operations.


Figure 1 Interacting Response Cycles

Organizations run in an automatic mode within normal conditions; they take actions without a lot of thinking or bother. The problem today is that automatic mode is not delivering profitability as consistently as it has in the past because of emergent conditions. These conditions can emerge from a variety of sources represented by fast and rapidly growing sources of data. They can come from outside the organization, in either an anticipated or unanticipated manner, where they are not as controllable. They can also come from inside the organization, to optimize business outcomes through observation or management influence, in a controlled fashion. See Figure 2 for the common sources of new conditions. The causes include changes in data, patterns, contexts, decision parameters, results from actions, changes in goals, or new risk management desires/demands. These changes are occurring more frequently and at a faster speed, thus creating the need for the Database of Now.


Figure 2. Sources of Emergent Conditions

Keep in mind that each level's triggers in Figure 1 will likely be different, iterative, and possibly influenced/interconnected by other levels.

Operations of Now:

Operations are focused on completing business events, customer journeys, and work journeys with the support of humans, software, bots, and physical infrastructure. The operations are often iterative and monitored in a near real-time fashion. Today's operations require a Database of Now where the dashboards reflect actual progress/completion of work. When exceptions emerge, responses are required within the constraints of existing operational goals to make minor adjustments. Also, significant adjustments need projects that may leverage a fail-fast approach to make corrections. Operational goals are often influenced by changes initiated by tactical and strategic decisions and adjustments. Analysis and reporting help make appropriate adjustments without unseating other operations.

Tactics of Now:

Tactical management, within the constraints of strategy, tends to optimize interrelated outcomes that may look across multiple operational domains. Real-time forecasting based on real-time data is essential to predict the direction of aggregated operations. The Database of Now plays a crucial role in making better decisions by quickly changing rules to optimize business outcomes in support of strategic goals and directions. Tactical changes may imply shifting resources or changing the rules, goals, and constraints of aggregated operations. Key projects are often identified at this level, like recognizing patterns that might indicate the need for a new product or service. This is also the level that decides the amount and type of automation that will help reach the currently selected strategy.
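As a minimal sketch of forecasting the direction of an aggregated operational metric, the snippet below applies simple exponential smoothing to a live series. The metric (hourly order volume) and the smoothing factor are assumptions; real tactical forecasting would use richer models and fresher feeds.

```python
# Hypothetical sketch: simple exponential smoothing over a stream of an
# aggregated operational metric to project its near-term direction.
def smoothed_forecast(values, alpha: float = 0.3) -> float:
    """Return the one-step-ahead forecast for the next observation."""
    level = values[0]
    for v in values[1:]:
        level = alpha * v + (1 - alpha) * level
    return level

hourly_orders = [120, 132, 128, 150, 161, 158, 170]   # illustrative feed
print("next-hour forecast:", round(smoothed_forecast(hourly_orders), 1))
```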

The Strategy of Now:

Strategies tend to stay stable and are highly linked to the organization's missional operations within its typical communities and common scenarios. Predictive and prescriptive analytics help shape expected scenarios that may be sitting on the shelf with their associated tactics and operations, ready to jump in at a moment's notice. Of course, the Database of Now can point to a playbook for switching scenarios when expected patterns emerge. Still, unexpected patterns can generate the need to apply new scenarios generated by more predictive and prescriptive analytics.

Net; Net:

It is pretty easy to see that fast monster data will create the need for the Database of Now necessary for better performance at all levels (strategy, tactics, and operations). It is also clear that fast data without time lags generated by too many synchronizations and transformations is necessary for better corporate performance while keeping all contributing resources aimed at business outcomes. 

Tuesday, July 14, 2020

Best Visual Options for Process Mining

Until bots can be cognitive enough to complete closed-loop improvements on processes or data stores on their own, visualization for humans will be key to making process improvements. Today many of those improvements are made by humans through mining data in real time or after the fact. They do it by setting tolerances and monitoring outcomes or by looking at the visualization of process instances that travel through processes or collaborations. The best visual options for any organization will depend on its culture, maturity, and desired business outcomes. I've laid out three categories of process mining visualization techniques that typically match maturity levels. I have used examples from vendors to help sort out the options, so your favorite vendor may have been left out of this post.



Basic Visualizations

Basic visual analysis sometimes starts with an ideal process, sometimes called a "happy path," and looks for the actual paths taken by a process. Organizations sometimes start with the outliers and try to rein them in closer to the ideal. Other organizations start with clusters of the most common deviant paths and try to improve them. See the visualization below for a representation of this approach. Most organizations do a before and after to measure change effects, also depicted below. This shows the process before changes are made and the resulting process with deltas in certain instances.
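As a minimal sketch of the comparison behind that visual, the snippet below ranks observed process variants that deviate from a happy path by how often they occur. The trace format is an assumption; mining tools reconstruct these traces from timestamped event logs before visualizing them.

```python
# Hypothetical sketch: compare observed variants to a "happy path" and rank
# the most common deviant variants by case count.
from collections import Counter

happy_path = ("receive", "validate", "approve", "fulfill", "close")

observed_traces = [
    ("receive", "validate", "approve", "fulfill", "close"),
    ("receive", "validate", "rework", "validate", "approve", "fulfill", "close"),
    ("receive", "validate", "rework", "validate", "approve", "fulfill", "close"),
    ("receive", "approve", "fulfill", "close"),
]

variants = Counter(observed_traces)
deviants = {v: n for v, n in variants.items() if v != happy_path}
for variant, count in sorted(deviants.items(), key=lambda kv: -kv[1]):
    print(count, "cases:", " -> ".join(variant))
```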

Intermediate Visualizations

More mature organizations try to add important business contexts to show the actual delivery made by processes in terms of key measures. One of the more important contexts, shown below, is the actions shown on a timeline. This gives "time to results" a high priority while counting key costs and resource utilization specifics. It is an effective way to eyeball opportunities. Another key approach is to show the process instances in light of desired outcomes versus real outcomes, usually represented by dashboards or scorecards, also depicted below. This is the start of the journey to adding more intelligence to process mining efforts. Simple step-through visualization, with or without simulation of proposed changes, is another nifty approach pictured below.

Advanced Visualizations

One of the proven visualization techniques is animation that draws humans to opportunities through speed or color indicators. This typically shows choke points and bottlenecks, but there are additional uses to simulate alternatives and show the value of different change opportunities. See below for an example. Predictive analytics combined with virtual reality can be used to visualize points of view or personas to fine-tune processes from different perspectives, walking through a process or journey as depicted below. Organizations that want to learn as they go can add machine or deep learning to improve processes, as depicted below.

Net; Net:

The visualization approaches can have a great impact on the resulting processes and on finding opportunities for more automation, tuning for better results, and trying alternatives without the negative impact of breaking otherwise optimized processes. Your chosen visualization might be a personal preference, but as organizations mature, more sophisticated visualizations will be needed until smart autonomous bots or agents can do this work as a partner or autonomously.


Thursday, July 9, 2020

Is Your Data Smart Enough?

The state of data affairs over the last ten years or so has revolved around big data. Of course, size matters, but big data promises to morph into monster data as more data sources hit the cloud with more tributaries like voice, video, IoT, events, and business patterns. So what about all this parked data? Are we going to keep storing it and bragging about how much cloud space it consumes? Are you going to make it cleaner and smarter or just admire it? I would suggest we make data more intelligent and faster rather than just figuring out how to catalog and park it so we can use it later. Making it faster means treating the data as a database of now, not of the future. Making data smarter can be tricky, but it is worth it.

Gleaning Data is Basic Intelligence.

Capturing data of different types and classifying them is pretty normal. Deciding how long and where to keep it is essential. Determining if it is worthy of long-term archiving is doing data a solid. Knowing some basics about the data source, cost of acquisition, and relative purity is pretty much a given these days. Some data cleansing and organization will help usage down the road.

Giving Data Meaning is Average Intelligence

Knowing the data about the data (AKA metadata) is essential for interpreting it. The simplest form is understanding the data's domain and its relative relationship to other data (logically or physically). Data representation and transformation options are pretty essential when combining with other data. Knowing the key or identifier of groups of related data is pretty standard. This step is where some of the impurities can be dealt with before heavy use. First use usually revolves around visualization and reporting to find actionable insights. At times this step turns descriptive data into a prescription.

Granting Data Representation in Its Context is Very Smart

Most data is gathered and used within one or two base contexts. One is undoubtedly timing/frequency, and the other is the primary home of the data, for instance, the entity family it belongs to, like product data. Sophisticated context representation will go beyond an original context or source to include others that have a neighborhood relationship with the data grouping/entity. An example would be a product within multiple markets and channels. This level is where statistical and predictive models enable more actions to either react to or intercept the trends indicated in the data. This level turns prescription into prediction to create/place data, event, or pattern sentinels on processes or the edge to look for prediction completion or variants.

 Grinding Data to a Fine Edge is Smarter

We are interrogating data to learn the need for important adjustments to goals, rules, or constraints for operating processes that include humans, software systems, or machines. This level can build a change to work in a supervised or unsupervised change process. It starts with machine learning and extends to deep learning, which peels back layers and interrogates more data. In extreme cases, the data can be used to support judgment, reason, and creativity. The worm turns from data-driven to goal-driven, established by cognitive collaborations with management principles, guidelines, and guardrails.

Grappling with Data in Motion Right Now is Brilliance

The pinnacle of smart data is where the data coming in fresh is used to create the "database of now." At this level, all of the approaches above can be applied in a hybrid/complex fashion on a near-real-time basis. This level uses the combined IQ of all the AI and algorithm-driven approaches in a poly-analytical way that leverages that brainpower combined with fast data. A dynamic smart-parts creation and dynamic assembly line would be a non-combat example.
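To make "data in motion right now" concrete, here is a minimal sketch that scores each event as it arrives against a rolling window instead of parking it for later. The event source, window size, and alert rule are stand-ins for real streams and models.

```python
# Hypothetical sketch: keep a rolling window over a live feed and flag values
# that spike well above the recent average, so decisions use the freshest data.
from collections import deque

class NowWindow:
    def __init__(self, size: int = 100):
        self.window = deque(maxlen=size)

    def ingest(self, value: float) -> dict:
        self.window.append(value)
        avg = sum(self.window) / len(self.window)
        return {"latest": value, "rolling_avg": avg, "alert": value > 2 * avg}

stream = NowWindow(size=50)
for reading in [10, 12, 11, 13, 45]:        # pretend sensor/transaction feed
    print(stream.ingest(reading))
```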

Net; Net:

Data: use it or lose it, but let the data lead to the learnings that sense, decide, and suggest responses appropriate to the action windows necessary to meet the timing need. If it is a sub-second-focused problem domain, the patterns in the data and intelligent methods may make the decisions and take action within governance constraints. If not sub-second focused, let smart notifications or options be presented to humans supervising the actions. Don't leave all the precious data parked for the future only.


Tuesday, July 7, 2020

Art for 2Q 2020

I hope you and yours are safe and healthy during these pandemic days. I delivered on a promise to my granddaughter and painted Gabriella a beach scene that we designed on the phone right around her birthday, when she called to thank us for her birthday gift. It was a fun piece to do, even though sea scenes are difficult. To say the least, I learned some lessons for the next one, but she was quite pleased with the results.

It was a great quarter for art sales. I sold seven pieces to two collectors. Two were fractals and the rest were paintings, which is quite different from my norm; fractals have sold two to one in the past. I also completed a couple more fractals for you to see. If you are interested in seeing my portfolio or buying a piece, please click here.

Serenity Beach

Happy Koi

Circle Saw

Wednesday, July 1, 2020

Context: The Connecting Clues for Data

The “database of now” demands a quick understanding of data, particularly in context. There are many opportunities to understand or misunderstand data in terms of the contexts it participates in, is connected to, or touches at the edge of a data neighborhood. Because each context has its own unique vocabulary, you can see the opportunity for misconnects in meaning by not understanding the full context of any statement or set of proven facts.

If someone says, "I like the blue one," how can you evaluate what that means? If it is a swimsuit on the beach, it means one thing; if it's a lobster from the same beach, it means a totally different thing. Context is what gives data real meaning. There are three primary forms of context that help establish the true meaning of the base data: the real-world contextual meaning, the contextual business meaning, and the technical contextual meaning. Obviously, finding meaning in big or monster data is a challenge, but that difficulty increases as the speed increases, particularly if the data is hard to manage or access.


Figure 1 Representation of Interconnected Contexts.

 Real-World Context

Data has meaning in terms of its definitional domain. When you mention "blue," it usually comes from the color domain. However, in the world of mental health, it means a kind of feeling or mood. So understanding the base context in which a data element exists is essential. If blue is associated with a human context, it could be physical and mean a lack of oxygen. It could also mean that the person is adorned in something blue. This is usually cleared up by understanding the base subject or entity that the data element is associated with by having a precise name, meaning, and basic subject area association. Underlying meaning can be tricky when just looking at the data value alone. Having proper metadata and associations is the ideal solution to this problem.
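As a minimal sketch of carrying that metadata alongside the value, the snippet below tags "blue" with a domain, entity, and context so downstream code can interpret it correctly. The schema is illustrative, not a proposed standard.

```python
# Hypothetical sketch: a value plus the metadata needed to disambiguate it.
from dataclasses import dataclass

@dataclass
class DataElement:
    value: str
    domain: str        # e.g., "color", "mood", "medical_sign"
    entity: str        # the subject the value describes
    context: str       # business or real-world context it was captured in

def interpret(elem: DataElement) -> str:
    if elem.domain == "color":
        return f"{elem.entity} is colored {elem.value}"
    if elem.domain == "mood":
        return f"{elem.entity} is feeling {elem.value}"
    return f"{elem.entity}: {elem.value} ({elem.domain})"

print(interpret(DataElement("blue", "color", "swimsuit", "beach retail")))
print(interpret(DataElement("blue", "mood", "patient", "mental health intake")))
```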

Business Context

The contextual areas that relate to business fall into three basic categories of meaning. Every data item or group of data items needs to be viewed in terms of the context it is being viewed in or from. Internal contexts are where the vocabulary is understood and defined within a particular organization; these usually revolve around the organization and skill definitions. There is also the external context that represents the outside world irrespective of the organization itself. The third context is where the outside world touches the internal world. Listed below are the typical contexts in each of these categories:

Common External Contexts:

Communities, Brand/Reputation, Public, Legal Frameworks/Courts, Geographical Regions, Countries, Local Culture, Governmental Agencies, Industries, Dynamic Industry 4.0, Value Chains, Supply Chains, Service Vendors, Markets, Competitors, Prospects, and Competitors' Customers.

Common Internal Contexts:

Organizational Culture, Goals, Constraints, Boundaries, Actual Customers, Products, Services, Suppliers, Employees, Contractors, Departments, Divisions, General Accounts, Contracts, Physical Infrastructure, Technical Infrastructure, Properties, Investments, Intellectual Capital, Business Competencies, Knowledge, Skills, Patents, Success Measures, and Statements.

Common Interactive Contexts:

Marketing Channels, Advertisements, Customer Journeys, Customer Experience, Loyalty, Satisfaction Scores, Processes, Applications, User interfaces, Websites, Webpages, and System Interfaces.

Technical Context

Data must also be understood in terms of physical contexts, limitations, and potential lag times. Data sources need to be understood in terms of their currency and ability to be integrated easily with other sources. While many views, interactions, and integrations work well at the logical level, physically they may not be ready in terms of near-real-time capabilities, transformation potential, or performance levels on or off-prem. While metadata may exist to understand possible joins and combinations, executing them fast enough to be useful in multiple business contexts may not be possible. The physical data types and file storage mechanisms may not be conducive to the demands of new usage scenarios. New low-lag databases that are near real-time will become the standard going forward.

Net; Net:

Data, information, and knowledge are quite dependent on the context(s) they participate in or the perspective they are viewed from. Often knowledge worlds interact; therefore, meanings can overlap and connect in ways that are essential for ultimate understanding, manipulation, or utilization. Knowing the context of your data is absolutely critical for leveraging understanding. All of this is happening at greater speeds, approaching the “database of now” speed necessary to make critical decisions, actions, adjustments, or improvements.



Tuesday, June 30, 2020

Generative AI + Art is Gaining Momentum

I thought a post on generative art might be of interest to all things AI. This kind of art leverages AI, algorithms, randomness, programs, and humans to create exciting and beautiful art. As you may know, I now collaborate with Fractal Software to develop compelling and award-winning artwork. In fact, some of my fractals are my best sellers. I have a great friend and fellow artist, Bob Weerts, who is pushing this collaboration even further. Below are two of his early generative pieces:

Bob employs lines as his fundamental stylistic element and incorporates chance in determining line length, density, and color. He cedes some control over the work's final outcome to a process enabled by Software he's written, allowing the piece to "emerge" over time. He plans to let the Software take more control of these emergent pieces over time, letting AI/algorithms expand the range. I find his early pieces quite pleasing and interesting already.

One source of Bob's original inspiration is Casey Reas's "Process Compendium," which, among other ideas, explored a synthesis of the Complexity Science notion of "emergence" and Generative Art in the early 2000s. An example of Reas's Compendium work is below (Click Here for Other Examples).

Reas is an internationally admired artist, but perhaps best known as the author, along with Ben Fry, of the graphical sketching tool called "Processing," which is widely used in the domains of Art, Design, and Media.

The significance of the generative art trend is perhaps exemplified by Christie's record $432,500 sale of "Portrait of Belamy." The image is one of a series created by a group of young French students collaborating collectively as "Obvious." Obvious borrowed heavily from open-source Generative Adversarial Network (GAN) algorithms developed by then-high school graduate Robbie Barrat but originally conceived by the AI researcher Ian Goodfellow. This got the ball rolling, and there is new momentum behind the "GAN" movement. Generative adversarial networks (GANs) are algorithmic architectures that use two neural networks, pitting one against the other (thus the "adversarial") in order to generate new, synthetic instances of data that can pass for real data. They are used widely in image generation, video generation, and voice generation.
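Here is a minimal sketch of that adversarial setup, assuming PyTorch is available: a generator learns to mimic a simple one-dimensional data distribution while a discriminator learns to tell real samples from generated ones. It is a toy illustration of the idea, not how Obvious or Barrat built their models.

```python
# Hypothetical sketch of a GAN: generator vs. discriminator on toy 1-D data.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                # generator
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())  # discriminator
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0          # "real" data drawn from N(3, 0.5)
    fake = G(torch.randn(64, 8))                   # generated samples from random noise

    # Discriminator: label real -> 1, fake -> 0
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: try to fool the discriminator into predicting 1 for fakes
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print("generated sample mean:", G(torch.randn(1000, 8)).mean().item())
```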

GANs' potential for both good and evil is huge because they can learn to mimic any distribution of data. GANs can be taught to create worlds eerily similar to our own in any domain: images, music, speech, prose. They are robot artists in a sense, and their output is impressive. But they can also be used to generate fake media content, often called "deep fakes."

Net; Net:

AI Generative Art is quite striking. Since the whole field is moving more towards AI and less towards the artist/programmer, we can expect some exciting results in the future. I will likely pursue a more intimate collaboration with all kinds of generative art going forward. Keep your eye on Bob Weerts, as he is a creative guy seeking this edge faster than many other artists.

 

If you want to see my works, check out the fractals section here.

If you want to know more about my collaborations with Software to create, check out this post.

Read about more right-brained AI by clicking here.

Tuesday, June 16, 2020

Exploring Data Delivers

We hear about organizations mining data looking for benefits nearly every day now. Just like the prospectors of old, people are trying to mine gems out of the big patch of ground under their claim while searching adjacent areas. The case studies abound, so the appeal is strong. Are these mining efforts really paying off, and how should one go about it? Just start digging a big hole and hope for the best? With all the buzz around data mining and process mining, there are some proven paths to successful mining.




Identify the Benefits of Data Mining 

It's pretty easy to justify the mining efforts on the promise of benefits today because there are so many success stories floating out there. The typical benefits that keep repeating include improved decision making, improved risk mitigation, improved planning, competitive advantage, cost reduction, customer acquisition, customer loyalty, new revenue streams, and new product/service development. The crucial step here is to find the benefits that will resound in your organization and situation. These days organizations are dealing with a multitude of challenges from plagues to politics. It is always a win to save costs, but there has to be more to it to create a compound set of appropriate benefits needed to justify mining efforts. 


Scoping Efforts Properly Delivers Better Results

While we all believe that data mining has the potential to improve and even transform organizations, the amount of data to mine is growing by the second, and the number of advancements in making data smart is expanding. It's not difficult to understand that the majority of organizations are struggling to find the right strategy or solution. The first step is to discover where there is significant potential, like miners do by drilling boreholes to discover the potential in the ground. That means organizations will have to sample areas of data that promise potential. To that end, many organizations start with process mining because it promises cost and time savings that often improve customer experiences while leveraging smaller scopes. For organizations that want an outside-in perspective, starting with customer journeys and large-scoped processes that cross system boundaries has been quite successful.


Incremental Learning is Essential to Continued Success

Starting small and expanding as success allows seems to be the most common model for mining. What is really popular at the moment is using mining to find opportunities for more automation. Savvy organizations will look at adjacent systems, organizational units, and contexts. Feedback loops and iteration will teach the best lessons for mining results. Alternative visualization techniques such as timelines, animation, and limits also help the learning process. Some organizations will also combine the visualization with discrete simulation to test alternative outcomes.


Net; Net: 

If you are not practicing focused data mining and looking for productive patterns in your ever-growing data inventory, you are missing many opportunities. As successes emerge, the more savvy organizations are looking to widen their scopes and use approaches that cut through the jungle of their organization. Mining is here to stay and brings a valuable set of methods, techniques, and tools for organizations looking to thrive under all conditions.

Tuesday, June 9, 2020

Organizations are a Jungle of Journeys

The simple idea of selling a product or a service for a price to make a profit is still the underpinning of most organizations. Still, it's gotten more sophisticated and intertwined than even five years ago. Many organizations participate in broader contexts like value and supply chains while dealing with dynamic change and emerging scenarios driven by geopolitical or environmental trends/events. For continued organizational health, organizations will need to understand the journeys that exist and interact in their footprint of impact, learn the levers that can adapt their jungle to change, and in some cases practice the response to the emergent conditions of "NOW." Listed below are the typical journeys that organizations need to participate in or manage, in no particular order of importance:



Customer Journeys

The journey that a customer takes is a crucial journey to manage, as your organization's contribution to it leaves an indelible memory of good or bad for all steps involved. It is essential to understand the customer's real journey, not just where a customer might touch your organization. Getting customers to be attracted and stay loyal to your organization depends significantly on your understanding of their real journey, not just the optimization or automation inside of your organization for cost savings.

Work Journeys

Work arrives, gets assigned, and moves through your organization, and it is key to cost and timing outcomes. Understanding where work gets stuck, deep in the innards of your organization, is essential for cost optimization and customer satisfaction improvements. It could be a competency/skill deficiency, a data deficiency, an overburdened shared resource, or just a situation never contemplated in the work design. These are some of the thickest vines in the jungle.

Employee Journeys

Employees are some of the most critical and expensive resources an organization manages. Making sure their time is optimized and used correctly is crucial for resource leverage and optimization. Concurrent with employee participation in various journeys, they must be augmented and have their skills enhanced and expanded. Assistance may occur through bot augmentation or knowledge turbocharging, but investment in employees is the often forgotten sub-journey. Lack of investment in employees is an easy way to lose in the long term.

Product/Service Journeys

Every product or service must be designed with the greatest of care and the best knowledge/ skills available. The journey from design to production should be planned, managed, built, and tested with the greatest of attention as they are often the competitive differentiator along with the customer journey and experience. Organizations tend to be very good at these kinds of journeys except when they become out of touch with trends or their customers, partners, and employees. 

Infrastructure Journeys

Organizations have to build and establish the infrastructure necessary to support the business. Service software has to run somewhere and needs to be built/maintained and supported by infrastructural software. These are parts of the infrastructure that must be made carefully and promptly, and retired if necessary as time progresses. The support will need to be built, maintained, or outsourced to other organizations if it is a product. Managing the portfolio of infrastructure during the building and maintenance periods is itself a journey to manage.

Capital/Funds Journeys

Organizations are usually very concerned with money: how it is raised, how it's used, and what becomes of excesses or losses. While these journeys are better established and repeatable, they often try to dictate the level of investment in the other journeys. Visionary management will satisfy short-term results expectations along with building for the future, thus funding the various journeys incrementally. Having a proper governance journey or two is essential for the investors.

Community Journeys

All organizations participate in physical and logical communities that can affect them positively or negatively, including their reputation and operations. As organizations join legal frameworks, it is necessary to plan and execute the journeys that fit those contexts. The results will affect the kind of outside direct or indirect governance for organizations and may set the policies or rules for other notable journeys or processes.

Net; Net:

Each journey must be thought through and managed collectively and individually. Traditionally only portions of individual journeys participated in digital optimization or automation. For organizations to thrive, these journeys need to be served digitally from one end to another. The interaction between these journeys will show where organizational friction will occur over time. Also, the interaction within these journeys must be orchestrated in the context of continuous foresight with emerging expected and unexpected scenarios.

The good news is that new digital business platforms (DBPs) are emerging to integrate digital functions to serve journeys better. Some will help with process fabrics; others will manage the intelligence well for better decisions or reaction/guidance; some will manage data integration; and still others will work at the edge to manage emergence.

Thursday, June 4, 2020

Delivering Success with Smart Data Streams

It is becoming clear that AI will be a critical competitive differentiator for organizations, industries, and even countries. It is also clear that many are looking for success stories to leverage into learning opportunities. As AI embeds its intelligence throughout organizations, the sophistication of the data usage will increase to a point where traditional data approaches will need to extend to include real-time data streams of images, videos, speech, events, and operational data. It means that new data approaches will be necessary. As AI gets more sophisticated at speed, its hunger for complex data becomes insatiable. As organizations learn to leverage AI, emergent problems can now be attempted. You can find strong case studies of emergent AI acting on data streams by clicking here.



Leveraging AI Starting with Machine Learning (ML)

Machine learning allows applications to learn from data in order to make better decisions at speed. There is significant value in creating predictive applications that can select smart actions to meet or intercept emergent data from multiple and intersecting contexts. These iterative learning and improvement cycles are driven by emergent data, shifting goals, and guardrails that are invaluable for organizations that want to stay in step with or ahead of their marketplace and constituents.

Intermediate AI Applications Leverage Smart Streaming

As AI gets more sophisticated in its learning ability by applying deep learning and even cognitive thinking leveraging interpretation, recognition, scoring, intuition, reasoning, and judgment, the hunger for faster multiple data sources will grow. Streams of complex and evolving data will need to be utilized in solving both static and emerging problems.

Putting a Premium on Emergent, Fast and Agile Data Sources

Looking to the past is valuable, but today's demands require organizations to get in front of business events, constituents, and competitors. The data sources will include traditional and non-traditional data such as voice, video, and images. The speed and mixes of data types and sources will be dynamic and agile. Instant integration and transformation will be the norm to satisfy prediction and intelligence needs fueled by AI and analytics.

Net; Net:

AI is gaining momentum and is taking on predictive applications that leverage fast and agile data sources. As AI migrates to the edge over time, the notion of fast streams of event and pattern data will grow along with traditional big and fast operational data. Organizations that want to thrive and capitalize on leveraging AI and smart streams will get ahead of the curve by learning from successful implementations. Please click here to access an E-book for some impressive case studies that leverage AI-enabled smart data streams.

 

Click here for the E-Book entitled "The Future Starts Now" subtitled "Achieving Successful Operations of ML & AI-Driven Applications."

 

This blog and this breakthrough E-book are sponsored by MemSQL (an agile real-time database).

 


Tuesday, June 2, 2020

AI Devours Data!

Those who have worked on Machine Learning (ML) projects know that ML requires a large amount of data to train the resulting algorithms. Some would say you can never have too much data. There is usually a correlation between the amount of data and the sophistication of the resulting ML model. This data hunger is only going to get more intense as AI progresses towards new benefit pools while leveraging more sophisticated AI capabilities. Since there are other contributing trends besides the sophistication of AI, the question that looms for organizations is, "Do they have the right data to fuel successful AI efforts?" If they don't have enough, should they inventory more in anticipation of the AI feast?




Figure 1:  The AI / Data Continuum

It’s not likely that all that big data that organizations have been hoarding is the correct data, but understanding where AI is going will give an organization a "leg up" on culling and collecting more of the correct data as AI progresses during the next decades.

The Progression of AI Changes the Data Game

While ML requires significant amounts of data to self-modify its behavior, the appetite of AI increases quickly as the sophistication of the AI capabilities increases. There is a big step from machine learning to Deep Learning (DL) in that DL requires much more data than ML. The reason is that DL is usually only able to identify concept differences with the layers of neural networks; DL determines the edges of concepts when exposed to millions of data points. DL allows machines to represent concepts via neural networks as the human brain does, thus allowing more complex problem-solving. AI can also work on fuzzier problems where the answers are more uncertain or ambiguous. These are typically judgment or recognition problems that can extend to creation or other right-brained activities. This again requires more data, which in some cases may be emergent or real-time in nature.

The Shift from Data-Driven to Outcome Driven

As AI moves up in the sophistication of the problems it assists with or solves, it will become both data-driven and goal/outcome-driven. It means that the AI may request data on the fly that it needs to solve a particular problem or make a specific deduction, thus complicating data management. It may involve the interaction of inductive, data-driven portions of a solution with deductive needs for data based on a hypothesis to reach a target. This kind of dynamic interaction is needed for outcome-oriented problems. It is much different than just interrogating the data looking for interesting events and patterns. Decision-driven approaches fit right in the middle of these two distinct approaches. Some decisions are operationally focused and improved through matching data with outcomes. More strategic decisions will pick up on both inductive and deductive approaches. This is just another demand channel to boost data usage.

The Shifting Problem Scopes Impact Data Needs

The scope of AI solutions will typically start narrow and move to wider scopes over time, thus requiring more data. Complex solutions typically target more than one answer and will require more data to support the tributary solution sets contributing to a complex/hybrid result. As the scope of decisions, actions, and outcomes spans more contexts inside and outside an organization, more data will need to be obtained to understand each context and their interactions. Each of these contexts could be changing and morphing at different rates, therefore requiring yet more data.

Net; Net:

It's clear that more data will be the hallmark of AI-assisted solutions. The data appetite might come from more challenging problems, the better leverage of advanced AI/analytics, or growing end-to-end value chains. One thing is for sure: organizations had better get ready for the new world of "AI/Data Interaction." It could change or extend data management policies, methods, techniques, or technologies.

Thursday, May 14, 2020

Dealing with Emergent Data


The recent and ongoing battle with COVID-19 has raised a goodly number of issues in and around getting surprised, arguments around emergent data, and slow versus appropriate responses. The lessons learned so far are pretty rich, but I think there is more to discover. Scenario planning seemed to be lacking, the early warning systems seemed to have broken down, and responses seemed slow and unpracticed once the denial hurdle was overcome. This was not a "Black Swan" event, so why did this pandemic seem to throw a monkey wrench into humankind's systems and processes? It may be a bit early as all the dust hasn't settled yet, but there are some obvious conclusions even now. What can we say now?





Scenario Planning


While pandemics are a practiced and expected scenario, the level of detail in this kind of scenario was tested in new ways. There were new discoveries in how trends in supply chains were working against us: longer supply chains for medical supplies and equipment, the stress put on existing supply chains by panic buying of many items (including food), and the question of how to deal with receding demand in finely tuned supply chains. The detailed scenario planning and modeling really seemed off the mark this time on a worldwide basis. There were also some emergent geopolitical effects not completely thought through, for sure. Scenario planning needs to handle models with more emergence in a fine-tuned fashion. Businesses and individuals need to up their game in this arena as well.

Early Warning

It is not surprising that less than effective scenario planning would lead to missing emergent data that was not expected, but as the emergent data morphed and changed, there were shadow events, signals, and patterns that took longer to recognize. Early warning needs to be able to recognize event patterns that go beyond expected events. These events and patterns need to participate in more complexity theory and real-time recognition that understands emergence in complex and interconnected systems and supply/value chains. This starts with endpoint detection and merges associated events with real-time unassociated events to create emergent patterns. Agents/bots sniffing at the edge and responding to events in known situations and contexts is the minimum; merging and emergent complexities also need to be tested, and models/scenarios need to be updated in near real time.
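As a minimal sketch of the edge-sentinel idea, the snippet below flags an emergent pattern when related warning events cluster within a short window. The event names, window, and threshold are illustrative assumptions, not a real early-warning rule set; real detection would correlate many event types and contexts.

```python
# Hypothetical sketch: a sliding-window detector that escalates when watched
# warning events cluster within a 24-hour span.
from collections import deque
from datetime import datetime, timedelta

class PatternSentinel:
    def __init__(self, watch_events, window=timedelta(hours=24), threshold=3):
        self.watch = set(watch_events)
        self.window = window
        self.threshold = threshold
        self.recent = deque()

    def observe(self, event_type: str, at: datetime) -> bool:
        if event_type in self.watch:
            self.recent.append(at)
        while self.recent and at - self.recent[0] > self.window:
            self.recent.popleft()
        return len(self.recent) >= self.threshold   # True => escalate for scenario review

sentinel = PatternSentinel({"supply_delay", "demand_spike", "port_closure"})
now = datetime(2020, 5, 1, 8, 0)
for i, event in enumerate(["supply_delay", "demand_spike", "port_closure"]):
    if sentinel.observe(event, now + timedelta(hours=i)):
        print("emergent pattern detected, escalate to scenario review")
```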

Appropriate Responses

It seemed pretty clear that many responses were unpracticed, emergency stores were overwhelmed, and tactical responses were being invented on the fly. Since appropriate responses are dependent on scenarios and early warning, the compounding effect on responses was evident. There were some pretty impressive examples of human creativity/inventiveness and sacrifice to make up for the deficiencies, but can we plan on this always happening for the good of all? We saw governments taking over supply chains, people quarantined late and long, and decisions walking the razor edge between mortality and economic suffocation.
  
Net; Net:

We can do better.  We have to do better as more negative scenarios are emerging as nature deals out an accelerating frequency of earthquakes, hurricanes, volcanoes, pandemics, regional famines, and shifting geopolitical events. I suggest we invest in Emergent Data Recognition (EDR) tied to improvements in scenario planning and practiced responses from a strategic and tactical perspective. There are lessons in emergent data that we have to be ready to leverage. The more prepared we are, the better it will be for us all. It's worth the investment in terms of life and livelihood.