Tuesday, December 22, 2020

Thankful During a Difficult Christmas

This is certainly a tough year for mankind because of all the pain and suffering COVID-19 continues to deal out. Despite this, we have hope and thankful hearts. First, at the physical health level, we have at least two vaccines rolling out at this time. Our prayers and positive vibes go out to all affected by this monster plague.

Secondly, we have the hope of our Savior's birth, which preceded his death, resurrection, and ascension to care for us. This is a cure for our spirits and our mistakes. It is not lost on us that God and the heavens show us the Bethlehem Star as a sign of hope during this serious plague that besets our lives and livelihoods. Here is a spooky-good picture of it, taken by Cindy Morrison Maes over the Arizona desert skies outside the city. We are thankful for God's salvation through our Savior, Jesus Christ.


                                      Happy Holidays to all from the Sinur clan!!!!  






Tuesday, December 15, 2020

What Are People Reading?

As the number of hits on my blog eclipses 700,000 in seven-plus years, I want to thank those who have been loyal readers. In the spirit of transparency, I wanted to share what others have been interested in over the last year and beyond. I hope the numbers below help you pick out some interesting blog posts to visit. Use the search bar to help you home in on your topics of interest.

Figure 1 represents the greatest hits over the years. The top areas of interest were Automation (RPA), Customer Journeys (CJM), Process (BPM), Cognitive (AI), Digital, and Industry 4.0 (IoT).


                                                     Figure 1 Greatest Hits

While most hits come from the US (65%), the offshore hits are significant. I do not get much action from Asia, but Europe, Eastern Europe, and Russia are quite active, as seen in Figure 2.

                                                           Figure 2 Off-Shore Hits

New topics have emerged to shift the demand to also include Corporate Performance, Decisions, Real-Time, Data Management, Data Mining, and the Database of Now. Figure 3 slices the 2020 themes for us.


                                                              Figure 3 Top 2020 Hits

The second half of 2020 has some new topics in and around AI & Art, Monster Data, and the Value of Voice Data.

                                                       Figure 4 Second Half of 2020

Net; Net:

I hope that these charts help you in pursuing emerging topics as well as the traditional ones. Please contact me with topics you would like me to cover in the future. 



Wednesday, December 9, 2020

Are You Hearing Voices? Maybe You Should

Many organizations are hungry to understand what customers and prospects think of their brand, products, services, and unique relationship. To this end, organizations are trying to move from reactive responses to negative experiences to proactive real-time action at the time of both positive and negative interactions. Supporting this shift is a movement toward more sentiment analysis, leveraging emotion AI and other forms of opinion mining to help spot, extract, and quantify subjective information around the customer's voice in real time. Leveraging natural language processing, voice/text analytics, computational linguistics, and biometrics is becoming a strong trend.

Maybe You Should be Listening:

Waiting to listen until the damage is evident is not a smart way to do business today. When an organization hears phrases like "Can you believe this?", "Shame on them," "Let's get 'em," or "Do the right thing" in social media streams, it is getting to the point of damage. If there are positive responses, organizations would be foolish not to take advantage of those beautiful gems of customer delight. It is important to sense harmful and helpful responses in social media and to tap into real-time sentiments in the middle of voice interactions happening in the servicing channels.
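
As a rough illustration of this kind of phrase-spotting, here is a minimal Python sketch; the phrase lists and labels are illustrative assumptions, not a production sentiment model:

# Minimal sketch: flag social posts that contain damage-signal or delight-signal
# phrases so a team can act before the relationship breaks down.
DAMAGE_PHRASES = ["can you believe this", "shame on them", "let's get 'em", "do the right thing"]
DELIGHT_PHRASES = ["love this", "great service", "thank you so much"]

def classify_post(text: str) -> str:
    lowered = text.lower()
    if any(p in lowered for p in DAMAGE_PHRASES):
        return "damage-risk"
    if any(p in lowered for p in DELIGHT_PHRASES):
        return "delight"
    return "neutral"

posts = [
    "Shame on them for ignoring my refund request!",
    "Great service today, thank you so much!",
]
for post in posts:
    print(classify_post(post), "->", post)

A real deployment would swap the keyword lists for an emotion AI or opinion-mining model, but the flow of classifying each post and routing it for action stays the same.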


What Should Organizations Listen to?

Organizations should be listening for brand health, product/service satisfaction, and the overall relationship. Listening for mentions of competitors' names is essential and usually means you are under pressure of some kind. The customer is either growing in frustration or is beyond their tolerance for unsatisfactory service. Voice inflections will tell you if it is just a need or a real frustration. It is essential to distinguish between a want, a wish, and a real need. Of course, listening for the good news is just as important, so knowing that your customer or prospect is happy and loves the service they are getting is a great thing. Listening for the positive is particularly vital if you want to upsell or cross-sell a customer.

What are the Typical Indicators to Focus on?

There are everyday things to listen for in sentiment. If there is a lot of customer silence, they may be learning or just controlling their emotions. Listen for overtalk where the agent is pushing their script over customer needs. Also, listen for impatient customers who feel they are not being heard; they talk over the agent and the script. These calls are great for training, but real-time action is the first prize.
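
To make the silence and overtalk indicators concrete, here is a small Python sketch that scans timestamped speaker segments from a transcribed call; the segment format and the five-second threshold are assumptions for illustration:

# Sketch: detect long silences and overtalk from (speaker, start_sec, end_sec) segments.
SILENCE_THRESHOLD = 5.0   # seconds of dead air worth flagging (assumed value)

def call_indicators(segments):
    """segments: list of (speaker, start, end) tuples sorted by start time."""
    silences, overtalk = [], []
    for prev, curr in zip(segments, segments[1:]):
        gap = curr[1] - prev[2]
        if gap >= SILENCE_THRESHOLD:
            silences.append((prev[2], curr[1]))
        if curr[1] < prev[2] and curr[0] != prev[0]:
            overtalk.append((curr[1], prev[2]))   # overlapping speech by different speakers
    return {"silences": silences, "overtalk": overtalk}

call = [("agent", 0.0, 12.0), ("customer", 11.0, 20.0), ("agent", 27.0, 35.0)]
print(call_indicators(call))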

It is also essential to listen to the drivers for the calls to determine if positive or negative trends are forming. Customers will let you know what they want, so it is essential to focus on their goals, not just the CPA-driven goal of reducing time to apparent resolution. This kind of listening helps improve first-call resolution and even net promoter scores (NPS).

Net; Net:

It's crucial to spot bad customer experiences early in your contact center. Real-time voice and text analysis leveraging AI helps ferret out real sentiment. Finding the right combination of digital technologies is much easier today than even a year ago. Many options will help improve customer relationship metrics. Voice is the growing approach today, as it captures real customer sentiment before the relationship breaks down and organizations are left trying to bandage a large wound without much success.

Additional Reading

Voice; Voice Baby               

Got Customer Excellence? 

Let Your Voice Be Heard


Tuesday, December 8, 2020

Art for the 4th Quarter 2020

I sure hope you and your loved ones are doing well during this COVID-19 pandemic. Creating art doesn't seem to stop during tough times, and demand is up, much to my surprise. I was able to complete a commission for my buddy Jonathan Yarmis, proudly standing in front of his new Mustang convertible in the middle of a large Texas sky. On a whim, I put a number of my better fractals on masks, and they sold out three times. I painted a cute little octopus with neon paint for creativity's sake and created a couple of interesting fractals. I hope you enjoy the pieces here. If you want to see more or even buy a piece, visit my art website or email me at jim.sinur@gmail.com. Happy Holidays to all.



                                                MUSTANG JOHNNY 





                                                         ART MASKS



                                                      HAPPY OCTO




                                                      BLUE SWIRL



                                                       PURPLE DELIGHTS 

Monday, November 23, 2020

Real-Time Use Cases Enabled by the Database of Now

While we all know that real-time applications and analytics are constrained to respond on the order of microseconds, real-time systems have been difficult to justify and attain until recently. There is a whole new class of real-time analytics that is now open to more organizations and applications with the advent of "The Database of Now." This blog investigates some of the new and emerging uses and is meant to give the reader some real-world examples. Hopefully, this list of successful uses will inspire others to follow suit, mainly where real-time dashboards and analytics deliver significant benefits.

Real-Time was for Special Applications

Real-time computing started with operating systems and then only a handful of applications because of specialized software costs. The first implementations revolved around real-time networks and market-driven applications where results demanded no significant delays or instant results. Real-time was an excellent fit for physical systems that need instant responses, like fly-by-wire controls or anti-lock brakes, or for single-purpose applications. These use cases required extreme correctness, deep concurrency, and durable stability while being distributed and sometimes autonomous. Real-time was a limited set of applications and systems until recently.



What Has Changed?

Business drivers require more speed. It used to be good enough for businesses to have dashboards that were relatively up to date. Now that kind of speed is not acceptable even if applications aren't hooked up to devices on the internet's edge (IoT). Almost all the leaders of business-focused software organizations believe that speed is the new currency of business. We are in an era of extreme competition and dynamic adaptability. Those organizations that can handle emerging trends by sensing them, making rapid decisions, and implementing quickly are the ones that will emerge as the winners, as long as they consider the voice of the customer and other constituents. Savvy organizations will switch from reactive to proactive by planning alternatives, practicing them, and putting in listening posts for emergent change.

 

Technology enablers have been emerging to meet the need at lower cost and for broader use cases. It started with complex event processing's ability to sense signals, events, and patterns of interest and even respond in limited situations. Fast forward to today, and we see the accelerating trend of adopting real-time applications with the mainstream use of streaming data. Ventana Research states that more than 50% of enterprises will leverage real-time streaming data in their enterprise next year. The trend reaches full bloom now with databases that can handle various kinds of complex monster data in the cloud, managed as a single logical store rather than multiple special-purpose datastores, greatly simplifying and accelerating data delivery. Organizations can then simplify the complexity around data location, meaning, and transformation connected to the smart IoT, often embedded in Industry 4.0 solutions, and thus increase speed.
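
As a toy illustration of querying the "now" view from a single logical store, here is a Python sketch that uses the standard-library sqlite3 module as a stand-in for a cloud-scale database; the table layout and one-minute window are assumptions, not a vendor-specific example:

import sqlite3, time

# Sketch: ingest events and run a dashboard-style query over only the most recent window.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (ts REAL, source TEXT, amount REAL)")

now = time.time()
rows = [(now - 10, "web", 120.0), (now - 30, "iot", 3.5), (now - 3600, "web", 99.0)]
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)

# Totals per source over the last 60 seconds only -- the "state of now".
cutoff = now - 60
for source, total in conn.execute(
        "SELECT source, SUM(amount) FROM events WHERE ts >= ? GROUP BY source", (cutoff,)):
    print(source, total)

The point is the shape of the workload: fresh events land and the same store answers the aggregate question immediately, without a batch hop to a separate analytics system.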

Sample List of Successful New Real-Time Use Cases

They are listed in no particular order, with the most common uses highlighted. All industries will have emergent situations that cry out for real-time assists.

  • FinTech
    • Portfolio Management & Analytics
    • Fraud Detection
    • Algorithmic Trading, Crypto Exchange
    • Dashboards & APIs
  • Software & SaaS
    • Improved CX for Internet Services
    • Supply Chain Visibility  
    • Machine Learning Pipelines & Platforms
    • Dashboards & APIs
  • Media & Communications
    • Ad Optimization & Ad Serving
    • Streaming Media Quality Analytics
    • Video Game Telemetry Processing
    • Network Telemetry & Analytics
  • Energy 
    • IoT & Smart Meter Analytics
    • Predictive Maintenance
    • Geospatial Tracking & Calculations
    • Dashboards & APIs

 

Net; Net:

While there are many more industry examples not listed here, the evidence is clear that it is time to rethink real-time utilization. The Database of Now has lowered the hurdles keeping organizations from leveraging real-time implementations. This is especially true of dashboards, analytics, and other transparency-rich situations. If you need to know the status of work or outcomes on the spot and right now, you should be considering the real-time use cases that the Database of Now enables. Real-time is for everybody now, so start planning and preparing for new competitive implementations.

 

Additional Reading:

Corporate Performance in Real-Time

Database of Now

Monster Data

Ventana Source


Thursday, November 19, 2020

Customers; Let Your Voice Be Heard

We all know that customers are the lifeblood of organizations, so organizations should treat their relationship with the customer as extremely important. Applying intelligence to voice data, either through analytics or artificial intelligence (AI), is particularly useful to both the customer and the organization. Organizations can identify patterns to improve the relationship, and customers will, in turn, get a better experience. Typically, organizations use crucial voice interactions to extract vital information and measure and improve performance, but there is much more potential in leveraging voice data. The power of voice data in the customer experience goes beyond organizational excellence to relational effectiveness.


Using Voice Data for Organizational Performance

Organizations are always looking for efficiency while optimally managing their resources. Contact centers and agents need the utmost efficiency and performance, so voice data can be searched for desired and undesired behavior. Customer service usually means measuring call handling times to deal with call volume. Sometimes call center agents are measured only on time to complete, without considering whether the customers' outcomes were delivered or whether they experienced frustration getting their needs translated into corporate transactions. Measuring customer sentiment in real time adds a differentiating factor to the overall call handling perspective. If a customer has to call multiple times to get something done while the organization and agents hit handle-time goals, they may not be impressed and may leave permanently.

Indeed, voice data helps improve agent script adherence and the actual scripts themselves. Voice is also helpful in detecting when competitors' names are mentioned for further evaluation. When fine-tuning agent training programs, voice data is invaluable, enabling first-call resolution and even reducing average call handling times. Some "bigger picture" efforts, such as payment compliance, including redaction for terms like "credit card number" or "social security number," also benefit from voice data and analytics. You can even fine-tune your marketing campaigns by picking up on themes and slogans.
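
A minimal sketch of the redaction idea, using Python regular expressions; the patterns are simplified assumptions, and real payment-compliance redaction would be more thorough:

import re

# Sketch: redact card-number-like and SSN-like digit patterns from a call transcript.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){12,15}\d\b")   # rough 13-16 digit card shape
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")       # rough SSN shape

def redact(transcript: str) -> str:
    transcript = CARD_PATTERN.sub("[REDACTED CARD]", transcript)
    return SSN_PATTERN.sub("[REDACTED SSN]", transcript)

print(redact("My credit card number is 4111 1111 1111 1111 and SSN 123-45-6789."))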

Using Voice Data for Voice of the Customer

Voice data delivers significant opportunities for a better customer experience without the customer knowing it. An example would be finding the best call center agent for each customer on a real-time basis. AI can understand which agent is suited to the type of call, because not every call needs your best agent. Understanding who is needed will reduce frustration, call transfers, and escalations to managers or higher-skilled agents. Another terrific use of AI and voice data is understanding the customer's emotions. This emotional understanding allows a real-time sense of frustration, anger, or sweet satisfaction. Appropriate actions can be taken in the moment, and lessons learned can be recorded for future analysis. Instant adaptation is a much more reasonable approach to responding to customers in a way that optimizes their experience.
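
Here is a small, hypothetical Python sketch of the routing idea: match the predicted call type to the least-skilled agent who can still handle it, so the best agents stay free for the hardest calls. The skill map and call types are illustrative assumptions:

# Sketch: route a call to the least-skilled capable agent, reserving top agents
# for complex or emotionally charged calls.
AGENTS = [
    {"name": "Ana",  "skill": 1},   # handles routine calls
    {"name": "Ben",  "skill": 2},
    {"name": "Cleo", "skill": 3},   # top agent, reserved for hard calls
]
REQUIRED_SKILL = {"balance inquiry": 1, "billing dispute": 2, "cancellation threat": 3}

def route_call(call_type, available_agents):
    needed = REQUIRED_SKILL.get(call_type, 2)  # default to mid-skill when unsure
    capable = [a for a in available_agents if a["skill"] >= needed]
    return min(capable, key=lambda a: a["skill"]) if capable else None

print(route_call("billing dispute", AGENTS)["name"])       # Ben
print(route_call("cancellation threat", AGENTS)["name"])   # Cleo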

Customer pain points can be sensed with voice analysis and dealt with before the customer sours and spreads a negative reputation. This analysis will reduce customer churn and increase net promoter scores. Imagine learning what is causing customer churn and adjusting strategies on the fly. Customer analysis helps implement predictive speech analytics to identify the customer at risk and act immediately. It might include a link to smart calling software for callbacks when an expert can reach back out to the customer. In turn, call center supervisors can implement strategies to prevent churn by building a knowledge base, better training, and improved scripts.

Even if caller emotions can't be handled in real time, an aggregate picture of overall customer satisfaction can be built by measuring and classifying caller emotions into buckets. Examples would likely include want, like, frustration, anger, annoyance, need, passion, and pleasure. Not only can sentiment be classified; so can call drivers, topic discovery, and brand health. Keywords can surface emerging trends and feed real-time dashboards focused on customer experience.
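
A tiny Python sketch of the bucketing idea: tally per-call emotion labels into an aggregate satisfaction picture. The labels and calls are illustrative assumptions; a real system would derive them from voice analytics:

from collections import Counter

# Sketch: roll individual call emotion labels up into an aggregate dashboard view.
call_emotions = ["frustration", "pleasure", "anger", "need", "pleasure", "annoyance", "pleasure"]

buckets = Counter(call_emotions)
total = len(call_emotions)
for emotion, count in buckets.most_common():
    print(f"{emotion}: {count} calls ({count / total:.0%})")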

Using Voice Data for Voice of the Employee

Often forgotten in the shuffle is the actual employee voice data. Look for comments from the agents that say things like "I wish we could do ….." Employees are a great source for understanding your customers' journey, even when parts of the journey are outside your organization's purview, which might encourage communicating or partnering with other organizations. Indeed, voice data should be analyzed for additional agent knowledge, improvement projects, and training curriculum adjustments. Lack of experience might be addressed by AI knowledge bots that adapt scripts in real time instead of relying on the batch script creation process.

Net; Net:

Analyzing voice data with analytics and AI will accelerate time-to-value for both the customer and the organization. It makes organizations more personal for both customers and employees while demonstrating the power of voice data plus AI-rich analytics in the customer experience.

 

Additional Reading

Voice; Voice Baby               

Got Customer Excellence? 

Journey Maps  

Tuesday, November 3, 2020

Voice, Voice Baby

Voice-enabled devices are everywhere, and there is a high awareness of the potential for voice data on a personal level. Devices are in our living rooms and even in our bedrooms, but we are just scratching the surface of the use of voice data in organizations beyond simple search, understanding, and basic commerce. People talk to speech recognition systems as if they were real people and say things like "please" and "thank you". Organizations need to up their game to take on smarter uses of voice over the coming decade. Organizations will have to gear up for "Big Voice Data" that goes beyond rudimentary search to analytic leverage and secure redaction of sensitive voice phrases.


Popular Uses Today

Voice search is prevalent and driven by the younger generations, but deeper usage is coming from the middle-aged wealth builders. Search is a critical participant in understanding what something is, how to leverage it, and the best way to go about it. Search is also a significant participant in simple commerce to find and buy specific products and services. It is particularly useful for lower-cost items like groceries, entertainment, and essential clothing. Voice is poised for explosive growth in the enterprise for large and complex voice sources.

Needed Uses for Tomorrow

The voice influence cuts both ways: voice can influence commerce and help organizations get better. Organizations will have to ramp up their use of AI-powered voice analytics to gain an advantage. Savvy organizations are examining the critical moments of sales or service, leveraging conversation analysis for speed and accuracy at optimal cost levels. By understanding what was said and making the intent discoverable, new opportunities arise. This process allows organizations to analyze interactions and prescribe proper actions. The typical uses are offers, complaints, first-call resolution, compliments, outages, and escalation analysis.

On The Edge Uses

Organizations need to stop thinking about voice as just an interface or the outcome of a conversation. Voice data is a unique source of critical insights for the business. The leading organizations create the next generation of communication and collaboration with AI bots/agents that identify call drivers and trends, predict new directions, and design or optimize products and service bundles. This kind of smart voice leverage can apply to prospects or customers and all constituents involved with product/service creation, including employees and vendors. Voice analysis could cover end-to-end supply or value chains. Voice analytics applies to both external and internal voice moments of truth while differentiating amongst diverse individuals at the same time.

Net; Net:

We all know that voice has much more value than any other means of communication because it expresses context, sentiment, intent, emotions, and action potential. Speech technology should be a vital part of any digital transformation as there are critical insights about product, service, and customers. Better accuracy, smarter implementations, and bulletproof security will allow more complex solutions while making it easier for users. It's time to up your game at getting more value out of the voice in your organization.

 




Wednesday, October 28, 2020

The Database of Now is About Leveraging Business Moments

Recently I have been writing about the Database of Now without really defining it. While the definition was somewhat implied in the context of the write-ups, it is time to make it an official thing by defining it. Keep in mind that the definition is ideal at the moment and will emerge over time as digital technology advances to enable it. I have invited a coauthor who has some real-world examples to help flesh out this definition in context. Please welcome Domenic Ravita, whose bio is included.

Database of Now Definition

The Database of Now is a set of structured or unstructured data representing the present time or a moment without any processing or location delays. The Database of Now represents the most accurate and up-to-date state from which to make observations, reach decisions, and take appropriate actions within the context of desired outcomes and goals.

The Database of Now must be considered within context and time continuums, as the data's meaning will vary with these vectors. An accurate state of now is a must to interpret the past and potentially predict the picture in emergent situations. Maintaining an accurate database of now is exceptionally challenging with the current amount and speed of data.

Benefits of the Database of Now

The Database of Now delivers operational insights and advantages by providing the current state of the business. It is a modern, efficient approach to cloud data management which broadens, accelerates, and simplifies access to all the relevant in-the-moment and historical data while unifying data access styles and patterns.

Why Care About the Database of Now? 

Digital transformation projects have accelerated to meet the increased demand for digital products and services, providing answers or solutions with immediacy. With the onset of the "always-on" culture, the pervasive use of smartphones and ubiquitous devices has driven a global shift in customer experience and consumer expectations. Business is now won or lost in a moment.

The Database of Now delivers these operational insights and advantages by providing the current state of the business so organizations can proactively identify, capture, and capitalize on the most crucial moments for their endeavors and their customers' success. It achieves this by simplifying the data infrastructure required to execute diverse workloads across various data styles, patterns, and types. Data professionals, application stakeholders, and end-users gain the advantages of speed, scale, and simplicity.

Characteristics of the Database of Now

Instant Integration & transformation

Self-Managed with Incremental Ingestion/Cleaning

Run Anywhere Dynamically

Pattern & Event Recognition

Ability to Learn from Analytics, Machine Learning, and AI

Ability to Suggest Alternatives 

Ability to Take Action within Allowed Freedom Levels




Real-World Needs/Examples

 Cash Burn & Flow Need the Database of Now

Managing cash flow and burn-down in uncertain times requires a "hands-on" real-time look at the monies and where they are going. Having a pulse on money and how much is flowing out right now is vital when revenues are unpredictable. Having a chokehold on expenses can be just as much of a mistake today, so watching corporate performance in real time is a fantastic advantage. Watching spending patterns in the market and in your organization in a real-time fashion is a key to organizational success. All of this monetary care must be watched in the context of current business scenarios that could be subtly changing and appearing as new events and patterns.
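
As a toy illustration of watching burn in real time, here is a Python sketch that keeps a rolling view of cash outflows over the trailing week; the figures and window size are assumptions:

from datetime import date, timedelta

# Sketch: rolling cash-burn view over the trailing 7 days (window size is an assumption).
WINDOW_DAYS = 7

def trailing_burn(outflows, as_of):
    """outflows: list of (date, amount) spend records."""
    cutoff = as_of - timedelta(days=WINDOW_DAYS)
    recent = [amt for d, amt in outflows if cutoff < d <= as_of]
    return sum(recent)

spend = [(date(2020, 11, 1), 12_000.0), (date(2020, 11, 20), 4_500.0), (date(2020, 11, 22), 3_250.0)]
print(trailing_burn(spend, as_of=date(2020, 11, 23)))   # 7750.0 over the last week

In a Database of Now setting, the same rolling calculation would run continuously against live spend events rather than a static list.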

The Digital Customer Experience Needs the Database of Now

Measuring what your customer is experiencing in real time yields much better satisfaction and results than looking at slanted surveys and inconsistently timed customer forums. Real-time data mining leveraging the Database of Now will give insights into customer behavior. While customers generally have consistent goals in the way they behave, they are now experiencing new pressures. Many organizations are declaring victory because they could quickly continue supporting customers with a remote workforce, but few look at the customer experience impact of the recent moves. The long-term future of many organizations hangs in the balance with the customer experience. Digital, along with instant adaptation, will go beyond remote workers to customers and other partners.

Saving Lives Needs the Database of Now:

Seconds can save lives too. True Digital of Thailand's mission is to protect children from sex trafficking, and they do that by continuously processing massive amounts of web data to identify children in danger. They were able to decrease law enforcement investigation time by as much as 63%. True Digital also seeks to proactively reduce the likelihood of new viral hotspots, including COVID-19, by monitoring mass population movement trends and rates of population density changes through anonymized cellphone location data.

Industry 4.0 Needs the Database of Now

When organizations interact to create outcomes with each other, especially when hyper-automation is involved, having instant transparency and acting nearest to the emergent events or patterns is essential. The Database of Now operates well at the edge, leveraging cloud presence, and work shifting capabilities. Measuring results at the edge and adapting within a set of guidelines and goals set up by inter-organizational governance bodies is essential for success.

Each of these examples has make-or-break moments in time. “Now Scenarios” are time-critical, but the length of time available for effective action varies by situation, as does the variety, volume, and velocity of data required. What is essential for these “Now Scenarios” is to leverage all the relevant data to establish the most accurate, complete, and timely context to drive proactive responses. For business operations and management, real-time revenue is lost or gained in a split second with each passing moment. 

Net; Net:

The new world is much faster in terms of reactive, proactive, or predictive decisions or actions. The Database of Now is now or will be a critical fundamental contributor to many digital transformation efforts. Organizations that can sense shifts and intercept the outcomes in time will be those who flourish best.

Additional Case Studies Can Be Found Here:

https://www.memsql.com/

Domenic Ravita is MemSQL’s Field Chief Technology Officer and Head of Product Marketing. He brings 24 years of experience across consulting, software development, architecture, and solution engineering leadership. He brings product knowledge and a senior technical perspective to field teams and to customers, and represents the voice of the customer to the product & engineering organization. He is a trusted technical and business advisor to the Fortune 500 having served global customers through multi-year transformations enabled by event-driven architectures. He has experience in distributed, in-memory data grids, streaming analytics, databases, integration, and data science.

Additional Reading on Database of Now:

Better Corporate Performance

https://jimsinur.blogspot.com/2020/08/increasing-corporate-performance-with.html

Accelerated Decisions

https://jimsinur.blogspot.com/2020/08/acceleration-of-decisions-helped-by.html

Contextual Intelligence

https://jimsinur.blogspot.com/2020/07/context-connecting-clues-for-data.html

Better Customer Experiences

https://jimsinur.blogspot.com/2017/09/do-you-need-technology-to-assist.html

Smart Data

https://jimsinur.blogspot.com/2020/07/is-your-data-smart-enough.html

 

 

Wednesday, October 14, 2020

Is Chasing Perfect Data a Reasonable Quest?


We have heard many quotes about the poor quality of data. In fact, there are those who want perfect data before they make a decision. Is that a realistic attitude toward data quality? While in some situations nearly perfect data is an absolute must, there are other situations where you make the best decision under the circumstances. Let's explore some of the issues a bit more.


Figure 1 Interaction of Data Quality and Decisions. 


What is Data Quality?

Gartner defines quality this way:

"The term "data quality" relates to the processes and technologies for identifying, understanding, and correcting data that support effective data and analytics governance across operational business processes and decision making. The packaged solutions available include a range of critical functions, such as profiling, parsing, standardization, cleansing, matching, enrichment, monitoring, and collaborating. 

I'd like to add my analysis to this solid base by referring to Figure 1 above, which tries to show the interaction of time, decisions, and data. Looking at the X-axis, we see data quality increasing as efforts move it to a cleaner, more concise, and crisper state. The Y-axis represents a time continuum that goes from right this instant to all the time ever needed. Given all the time and money necessary, data can approach or even attain perfection, until entropy enters the equation.

When data is new or first brought into an organization, it is good for emergent and morphing sets of problems, where there is often pressure to make decisions and even take action on them. Things are fuzzier at this point, and a precise answer is often not possible, but progress can be made even with data of lesser quality. Often the decisions needed are not fully understood at this point in time.

As the decisions become more known and even routine, the priority of the crucial decisions, goals, and outcomes tends to sort itself out. This helps identify the critical data sources that need attention and efforts to get better over time. Some decisions become so important that operational excellence and customer interaction drive the need to make the decisions excellent too. This excellence demands more perfect data in most instances.

When is Perfect Data Needed? 

Circumstances that Demand Perfect Data

When it comes to safety and the lives of people, it is hard to argue against perfect data. The problem occurs when the timing of getting that data perfect flies in the face of a need to make a decision to avoid downstream negative consequences. Sometimes decisions have to be made with less than perfect data. There are two strong forces pushing here that have to be balanced within the context of emergent or known scenarios.

Good Enough Data Sometimes Works

While it is easy to demand perfect data in order to make decisions and take action, sometimes real leadership finds good enough data to move forward. There is a danger in relying on just gut feel, so leveraging the data the best way possible, considering its flaws, is sometimes the best road to take. Sometimes additional scenario simulation is the answer. Sometimes quick and dirty approaches to incremental data bolstering make sense. It may mean that you look at other similar decisions and data for insight.

Net; Net: 

While we all chase better data, all data can't be perfect, so don't die trying. Certain decisions demand perfect data because of their importance, but the critical nature of decision timing fights against this desire. It is very easy to sit back and say that all data has to be perfect and not take any responsibility or action because the data isn't perfect. Don't get caught in that trap. On the other hand, don't stop investing in great data quality because it might cost some effort. There is a delicate and dynamic balance to be struck here. 

Tuesday, October 6, 2020

Art for the 3rd Quarter 2020

I hope you and yours are doing well during this challenging year of 2020. My art business is getting an unexpected lift in sales demand and production. Here are some of the new pieces. One was a request from my granddaughter in Loveland, Colorado. She wanted a painting of a Brontosaurus because she thinks they are gentle dinos :) I also created two new paintings on a whim. The first was a raven flying past a full moon. It reminded me of my late father, who had a raven for a pet. It sold immediately to a good Canadian friend, Lin Whaley. The second was a night scene of a park in Paris that reminded me of the night Sherry said "yes" in Paris in 1997. I posted it recently and sold a giclee to another friend, Kari Moeller, in Kansas City. I also completed a brilliantly colored fractal for you to see. If you are interested in seeing my portfolio or buying a piece, please click here. Some of the newer pieces haven't made the website yet, so contact me at jim.sinur@gmail.com if you have a desire.


                                                          BRONTO 



                                                 RAVEN NIGHT



                                                   PARC DU PARIS



                                                   BIG BANG



                                             



Tuesday, September 29, 2020

Prioritizing Data Management in the Monster Data Era

As much as organizations would like perfect data under their complete governance/management rules and guidelines, those ideal outcomes already disappeared with the reality of the big data era. In the "monster data" era, it will be much rarer to have complete data management, especially with the inclusion of more data types, more complexity, and less reliable data sources coming at organizations at an accelerated speed. This writing aims to give some advice on deciding which data sources are likely to get more intense data management. Prioritizing will allow data management to focus better on balancing its efforts. It will also enable data usage to be separated from ideal/excellent data management, allowing new data resources to enter the organization and grow into data management over time. Prioritizing data sources for robust data management should consider goals, measures, and change channels.


Prioritize on Critical Goals

Organizations can't focus on all the data coming their way, and they certainly can't depend on AI-supercharged data scientists to find interesting anomalies from the bottom up. The first cut at the data under management should directly relate to the business/organizational outcomes necessary as a base. Many organizations set up strategic goals that represent the most desirable outcomes for the circumstances at hand. Savvy organizations plan alternative strategies/scenarios and practice the most likely emergent strategies. Most organizations, these days, tie both their functional and operational goals to the strategy du jour. Savvy organizations look at end-to-end processes/journeys to link goals together, minimize organizational conflict, and consider constituent outcomes as legitimate goals. These efforts would include customer journeys laced with highly desirable customer experiences. These goals will help narrow down the data sources that need proper data management, although there will be many more of them than in the past.

Prioritize on Real World Measures

An excellent place to start is where most of the data activity occurs. Many organizations create heat-maps of highest use and apply the highest level of governance for these sources, regardless of whether they are operational or decision-focused data. The more enlightened organizations push the data timing to real-time, representing the Database of Now. These same organizations practice data mining to look at the reality of current and past situations that may point to future operational optimizations. Aware organizations also mine new data sources like voice and video for customer experience improvement potential. While much of the mining is descriptive, the more advanced efforts start prescribing potential changes and even predicting future behaviors. In this kind of environment, decision optimization is considered a major improvement arena.
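
As a rough sketch of the heat-map idea, here is Python that ranks data sources by access counts so governance effort can follow actual usage; the access log is an illustrative assumption:

from collections import Counter

# Sketch: rank data sources by how often they are actually touched,
# so the heaviest-used sources get the strongest governance first.
access_log = [
    "orders_db", "orders_db", "voice_transcripts", "clickstream",
    "orders_db", "clickstream", "voice_transcripts", "orders_db",
]

usage = Counter(access_log)
for rank, (source, hits) in enumerate(usage.most_common(), start=1):
    print(f"{rank}. {source}: {hits} accesses")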

Prioritize on Change Channels

Like it or not, things change and almost always get more complicated. Rapid change may mean cobbling together creative responses in a quick fashion. To that end, organizations need to sniff many data sources for potential change impacts that could affect strategy or operations. Sometimes it means watching for subtle signals and events in these data streams, but it can range up to finding emerging patterns within or across data sources. The likely channels would often include competitors, markets, industries, demographics, vendors, suppliers, geographies, legal frameworks, and governmental pushes. Additionally, best practices involve finding new contexts and sources.





Figure 1 Optimal Data Governance Defined

 

Net; Net:

Applying excellent data management governance, as depicted in Figure 1, is no longer possible for all data resources. Organizations will have to focus on the data sources that are key for business outcomes and that enable organizations to survive, thrive, and even optimize their impact. While data management is necessary for specific resources at a high level, the time has come to identify those particular resources and set data governance levels across various data sources.

Tuesday, September 22, 2020

Monster Data is Headed Our Way

If you thought Big Data was a challenge for us all, wait until the new wave of Monster Data hits us. We will have to manage it, make decisions with it, and build processes and applications leveraging this monster data. Just what is monster data, and how will it affect us?

Monster Data represents data that is overwhelmingly large, unduly complex, and can't be trusted for accuracy. Typically it is composed of multiple kinds of data, including structured data, unstructured text, voice, image, or video. Some monster data may be unknown or emergent, making it scary to deal with for most individuals, technologies, or organizations.

Even Larger Volumes        

We have long been concerned about "the IoT Awakening," exposing large amounts of critical data that would likely need immediate attention, often at the edge. While managing all the moving parts of Industry 4.0 is a challenge, we see new value chains that employ GPS, tracing, and original digital identities adding data to the mix. As organizations want to leverage data for more refined business outcomes, more data will be needed.

Organizations leverage more powerful AI and computer-analysis techniques to gain insight into human behavior using personality, social, and organizational psychology data. This need will yield data sets that are much larger than what we have today and certainly too large for traditional processes and applications. Data will likely include recorded conversations that could be processed into usable information.

More and more data is piling up from digital footprints left in social media, cell phones, business transactions in various contexts, shopping, surfing, and other devices that record our every moment, freely given or not. Sometimes this new data is just taken from sites as people pass through, leaving crumbs behind.

Even More Complex  

In order to utilize technology to empower us, the data will also become much more complex and diverse. Because large data collections can be computationally analyzed to reveal new signals, patterns, and trends, the complexity of that data will have to be managed well. Organizations want to deliver insights from human behavior and interactions collected everywhere, every second of the day.

The data will come from various contexts that imply context-sensitive meaning. Hopefully, this new and emergent data will be available on the cloud, but the cost and security issues will make this data more hybrid cloud in nature. The data will likely be a hybrid of structured and unstructured data and require new data management means with ownership and dynamism challenges.

This complex and dynamic set of data sources will become more challenging to manage, but it is on its way to becoming a precious asset that can be leveraged by machine and deep learning. While dynamic and emergent, its use will become more stunning over time.

Even More Inaccurate

Because of speed and size alone, the accuracy of monster data will be a constant challenge. When combining data in new ways, understanding its source, context, and ultimate meaning at all levels of granularity becomes more of a critical problem for data management professionals as well as end-users.

There will be ownership issues and questions about who will be held accountable for the accuracy of any data leveraged. All of this will have to be sorted and managed under the gun, with the pressure of speedy results. Of course, internal data sets will have a better-understood pedigree than data sources from outside an organization and in contexts not well understood.

Net; Net:

As we grow to zettabytes, the amount and variety of data being accessed, collected, stored in the cloud, stored on-premises, and analyzed will keep increasing in an exponential fashion. This seems like a near-impossible task until the promise of better analysis and prediction to correct problems takes over our desires. Business outcomes will likely drive this growth in today's extremely competitive and dynamic environment.

 

Tuesday, September 8, 2020

Decisions Need to Drive Data Science

There has been and will continue to be a significant shift in how data is leveraged in this continually changing world. Until recently, the data science process of collecting, cleaning, exploring, model building, and model deployment ruled the data management mindset. In a world that is "steady as she goes," this makes a great deal of sense. The amount of data to be curated is growing impressively, but the data science mindset is still on the scene, though pressed to its limits dealing with big data. Two things will break this sole reliance on the data science process.



Dynamic Business Scenarios

In a world with operational KPIs staying steady with minor adjustments over time, focusing on data makes sense. That world is virtually gone, with the elephant in the room being a pandemic at the moment. Tomorrow it could be natural disasters and the downstream effects of climate change impacting geopolitical behaviors. There are many business scenario possibilities and combinations headed our way. We can't afford just to explore data and have knee-jerk responses.

Monster Data is Lurking

If you think big data is worthy of concern, just think about the monster data just around the corner, driven by higher volumes, more complexity, and even more inaccurate by nature. Organizations are bound and determined to take advantage of behavioral data that is further away from standard core operational data. Monster data includes all kinds of unstructured data that will contain digital footprints worthy of new types of decisions.

Either of these alone would require a major addition of new data processes; combined, data science processes alone just won't suffice. I am not saying that data science will dim, but it needs some additional turbocharging and methods that are not just focused on exploring structured and clean data.

Dealing with Changing Scenarios

There are several ways of dealing with scenario planning and practicing responses, but here is what I would encourage organizations to do. Many decisions will drive the data that is leveraged during these efforts.

  • Plan probable scenarios by having executives brainstorm and list likely scenarios and their outcomes.
  • Simulate and practice these likely scenarios, so they become part of the muscle memory of an organization. It will involve leveraging key data sources cascading to tactics and operations. Build communications mechanisms ahead of time and communicate readiness.
  • Identify unlikely dangerous scenarios and simulate the effects and plan responses appropriately.
  • Identify critical decisions, events, and patterns to scour appropriate data resources (owned or not).
  • Identify key leverage points in processes, systems, applications, and the data that could be involved.

Dealing with Changing Tactics

Middle management is always trying to optimize outcomes for their functional areas, though savvy organizations try to link results to remove friction points for overall optimization. Optimization often leads to self-imposed changing goals that need to be operationalized or tweaked in operations. When executives want different outcomes based on a refined organizational charter, new governance rules, and critical trends delivered by the business scenarios in place, a bigger picture is in play. Tactics are the essential glue that holds together operational outcomes guided by goals. As these goals shift in a dynamic set of business demands, managers would be wise to be ready for new guidance coming at faster speeds by following this list of practices.

  • Understand the impact of significant changes by modeling or simulating the effects of change.
  • Be aware of all executive expected sets of scenarios and search for critical events and patterns to detect new scenario emergence.
  • Implement various approaches to near real-time responses, including digital war rooms, dynamic process/application changes, and low-code methods.

Dealing with Operational Change:

Typically, operational processes and systems are in place to deal with day-to-day operations. Changes in behaviors, markets, tactics, or scenarios will cascade down to operations. There may be additions and changes to procedures dictated by outside factors, beyond the routine operational optimizations that occur on an ongoing basis. Processes tend to be more stable, but some changes could rock the house. To deal with functional change, I would encourage the following activities.

  • Model key decisions that affect KPIs and desired business outcomes (a minimal sketch of this idea follows the list).
  • Generate procedures from the models—manuals for human resources and code for processes and applications.
  • Perform a volatility analysis based on past changes to identify hot spots. Enable hot spots for change, particularly for code, using late-binding techniques or low-code.
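
Here is a minimal Python sketch of the "model the decision once, generate procedures from it" idea: a small decision table is both printed as human-readable guidance and evaluated as code. The rules themselves are illustrative assumptions:

# Sketch: one decision model drives both the human-readable manual and the executable logic.
DISCOUNT_RULES = [
    # (condition description, predicate, outcome)
    ("order total >= 1000 and loyal customer", lambda o: o["total"] >= 1000 and o["loyal"], 0.10),
    ("order total >= 1000",                    lambda o: o["total"] >= 1000,               0.05),
    ("otherwise",                              lambda o: True,                             0.00),
]

def print_manual():
    for desc, _, outcome in DISCOUNT_RULES:
        print(f"If {desc}: apply {outcome:.0%} discount")

def decide_discount(order):
    for _, predicate, outcome in DISCOUNT_RULES:
        if predicate(order):
            return outcome

print_manual()
print(decide_discount({"total": 1500, "loyal": True}))   # 0.1

Because the table is the single source of truth, a change to a rule flows to both the documentation and the running decision, which is the kind of hot-spot flexibility the volatility analysis is meant to enable.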

Net; Net:

We are entering an era of significant change linked to constant change. It means that just shining up and studying data alone doesn't cut it as a sole strategy. While data affects decisions as they are made, deciding what is going to change is emerging as a dominant new organizational competency area. We need to add some new disciplines/practices to thrive going forward and call it Decision Science.


Wednesday, August 26, 2020

Budgeting Technologies for 2021

Organizations are now challenged in new ways; therefore, they must budget very carefully as we advance. There will be a tug of war between accelerating digital and dealing with budget reductions for IT investment. Gartner, affectionately known as "The Big G" in my circles, has predicted IT budget reductions. At the same time, businesses are being pushed to accelerate digital transformation. Savvy organizations will save with technology to invest in technology, breaking through this set of apparently conflicting goals. Organizations will be careful in deciding what to invest in to survive, thrive, and capitalize on these dynamic and challenging times. I will try to lay out the five most important technologies to invest in to keep these conflicting goals in balance.



Continuous Intelligent Automation & Cost Optimization

Automation has come on strong lately through the use of RPA, Workflow/iBPMS, and Low Code solutions. Now I see an extension of the accelerated use of both, guided by Process Mining and AI. Process mining continuously offers extreme visibility into opportunities to handle outliers or optimize a process/case, yielding time and labor savings. Machine and Deep Learning will also play a guiding role in finding more optimization opportunities over and above what the human eye can detect across various mining visualizations. The pressure for quick improvements with fast feedback cycles will push more detection of options to intelligent software or machines as responsible AI continues to develop. The savings from these profitable efforts can be applied to future digitization efforts. These efforts can be multiplied by using Software as a Service (SaaS) in some instances.

Human Augmentation & Skills Expansion

As more automation pushes humans to higher-skilled pattern detection, advanced Decision Intelligence, and smarter actions, humans will use technology to enhance their cognitive and physical experiences. There may be sensory augmentation, perception augmentation, and AI cognitive assists enabling higher-skilled work levels. In the case of physical responses, appendage assistance and exoskeleton leverage may be enabled. Imagine having the assistance of experienced experts in your ear, eye, or mind to accomplish more challenging tasks. The new worker will have interactions with technologies that enable super skills and accelerated outcomes. This will start small and accelerate by the end of 2021.

Immersive Experiences & Visibility

All constituents will have more immersive and pleasing experiences, making them more informed and satisfied. Customer Journey Mapping/Mining technologies will allow organizations to get real-time and truthful feedback from their customers, employees, partners, and vendors to help improve their experiences on a continuous basis. Virtual Reality and Mixed Reality have the potential to radically influence the direction of improved customer experiences, product supply, and value chain services. These new visibility assists can give customers a real sense of progress toward their outcomes when balanced with organizational processes. Some organizations have seen the value in onboarding and immersive training in a safe and realistic virtual environment.

Augmented & Real-Time Data Management

The amount and speed of data are increasing faster than our ability to manage it. Big data is turning into a complex, multi-headed monster of data types with varying requirements. Managing all the data and data types will require assistance. Data marketplaces and exchanges are emerging, adding to the data chaos. Managing the various data sources will need to leverage the Database of Now, integrating various data sources in the cloud; AI's ability to learn from the incoming flood of data and the metadata that defines it within its various contexts and workloads is essential. Dark data will start to be better understood. Data journeys and transparency will be assisted by practical Blockchain that enables data traceability.

Autonomous Bots & Edge Computing

Autonomous bots/agents will, at a maximum, bid on work and, at a minimum, perform activities on "the edge" with AI help. Edge processing, data collection, and decisions are placed closer to the information and activity source to sense, decide, and respond in the proper context. Often the IoT is where this occurs, when machines, sensors, and controllers are involved with physical activity, but there are instances where software, Digital Twin or not, has a presence at the edge. Often these activities are semi-autonomous and supervised today, but we are moving to more autonomy over time. Smart spaces, smart production, and smart value chains will drive these kinds of efforts. Look for robots as a service (RaaS) to alleviate some of the data density issues.

Net; Net:

Every organization will have to match their operating plans to the technologies above and decide what they want to take on within their cultural and risk limits. The danger here is focusing on technologies that contribute to short-term financial results to the detriment of the future. This is especially true with short-term cloud efforts undertaken without thinking of the total cost of ownership. Grab some profits, but invest wisely to compete digitally in the future. Negotiate with your financial folks, please, or hope they get more innovative.