Monday, November 23, 2020

Real-Time Use Cases Enabled by the Database of Now

While we all know that real-time applications and analytics must respond on the order of milliseconds, or even microseconds, real-time systems have been difficult to justify and attain until recently. With the advent of “The Database of Now,” a whole new class of real-time analytics is open to more organizations and applications. This blog investigates some of the new and emerging uses and is meant to give the reader some real-world examples. Hopefully, this list of successful uses will inspire others to follow suit, particularly where real-time dashboards and analytics deliver significant benefits.

Real-Time was for Special Applications

Real-time computing started with operating systems and then spread to only a handful of applications because of specialized software costs. The first implementations revolved around real-time networks and market-driven applications where results demanded no significant delays. Real-time was an excellent fit for physical systems that need instant responses, like fly-by-wire or ABS brakes, and for single-purpose applications. These use cases required extreme correctness, deep concurrency, and durable stability while being distributed and sometimes autonomous. Real-time remained a limited set of applications and systems until recently.



What Has Changed?

Business drivers require more speed. It used to be good enough for businesses to have dashboards that were relatively up to date. Now that kind of speed is not acceptable, even if applications aren't hooked up to devices on the internet's edge (IoT). Almost all the leaders of business-focused software organizations believe that speed is the new currency of business. We are in an era of extreme competition and dynamic adaptability. The organizations that can handle emerging trends by sensing them, making rapid decisions, and implementing quickly are the ones that will emerge as winners, as long as they consider the voice of the customer and other constituents. Savvy organizations will switch from reactive to proactive by planning alternatives, practicing them, and putting in listening posts for emergent change.

 

Technology enablers have been emerging to meet the need at a lower cost and for broader use cases. It started with complex event processing's ability to sense signals, events, and patterns of interest, and even respond in limited situations. Fast forward to today, and we see the accelerating trend toward real-time applications with the mainstream use of streaming data. Ventana Research states that more than 50% of enterprises will leverage real-time streaming data next year. The trend reaches full bloom now with databases that can handle various kinds of complex monster data in the cloud, managed as a single logical store rather than multiple special-purpose datastores, greatly simplifying and accelerating data delivery. Organizations can now simplify the complexity around data location, meaning, and transformation for the smart IoT, often embedded in Industry 4.0 solutions, thereby increasing speed.
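Complex event processing of the kind described can be sketched as a small detector that watches a stream for an ordered pattern of events inside a time window. The event names, window, and scenario below are illustrative assumptions, not drawn from any particular product:

```python
from collections import deque

def make_pattern_detector(pattern, window_seconds):
    """Return a function that reports when `pattern` (an ordered list of
    event types) has been seen within `window_seconds` on the stream."""
    recent = deque()  # (timestamp, event_type) pairs still inside the window

    def observe(timestamp, event_type):
        recent.append((timestamp, event_type))
        # Drop events that have aged out of the window.
        while recent and timestamp - recent[0][0] > window_seconds:
            recent.popleft()
        # Check whether the pattern occurs, in order, within the window.
        it = iter(evt for _, evt in recent)
        return all(step in it for step in pattern)

    return observe

# Illustrative: flag a login followed by a large transfer within 60 seconds.
detect = make_pattern_detector(["login", "large_transfer"], window_seconds=60)
detect(0, "login")
hit = detect(30, "large_transfer")  # pattern completed inside the window
```

The same shape scales up in real streaming engines; the point is that sensing a pattern of interest is a cheap, incremental computation rather than a batch query.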

Sample List of Successful New Real-Time Use Cases

They are listed in no particular order. All industries will have emergent situations that cry out for real-time assists.

  • FinTech
    • Portfolio Management & Analytics
    • Fraud Detection
    • Algorithmic Trading, Crypto Exchange
    • Dashboards & APIs
  • Software & SaaS
    • Improved CX for Internet Services
    • Supply Chain Visibility  
    • Machine Learning Pipelines & Platforms
    • Dashboards & APIs
  • Media & Communications
    • Ad Optimization & Ad Serving
    • Streaming Media Quality Analytics
    • Video Game Telemetry Processing
    • Network Telemetry & Analytics
  • Energy 
    • IoT & Smart Meter Analytics
    • Predictive Maintenance
    • Geospatial Tracking & Calculations
    • Dashboards & APIs

 

Net; Net:

While there are many more industry examples not listed here, the evidence is clear that it is time to rethink real-time utilization. The Database of Now has lowered the hurdles keeping organizations from leveraging real-time implementations. This is especially true of dashboards, analytics, and other transparency-rich situations. If you need to know the status of work or outcomes on the spot, right now, you should be considering the real-time use cases that the Database of Now enables. Real-time is for everybody now, so start planning and preparing for new competitive implementations.

 

Additional Reading:

Corporate Performance in Real-Time

Database of Now

Monster Data

Ventana Source

 

 


Thursday, November 19, 2020

Customers; Let Your Voice Be Heard

We all know that customers are the lifeblood of organizations, so organizations should treat their relationship with the customer as extremely important. Applying intelligence to voice data, either through analytics or artificial intelligence (AI), is particularly useful to both the customer and the organization. Organizations can identify patterns to improve the relationship, and customers will, in turn, get a better experience. Typically, organizations extract vital information from crucial voice interactions to measure and improve performance, but there is much more potential in leveraging voice data. The power of voice data in the customer experience goes beyond organizational excellence to relational effectiveness.


Using Voice Data for Organizational Performance

Organizations are always looking for efficiency while optimally managing their resources. Contact centers and agents need the utmost efficiency and performance, so voice data can be searched for desired and undesired behaviors. Customer service usually means measuring call handling times to deal with call volume. Sometimes call center agents are measured only on time to complete, without considering whether the customers' outcomes were delivered or whether customers experienced frustration in getting their needs translated into corporate transactions. Measuring customer sentiment in real time adds a differentiating factor to the overall call handling perspective. If a customer has to call multiple times to get something done while the organization and its agents hit handle-time goals, that customer may not be impressed and may leave permanently.

Indeed, voice data helps improve agent script adherence and the actual scripts themselves. Voice is also helpful in detecting when competitors' names are mentioned, for further evaluation. When fine-tuning agent training programs, voice data is invaluable, enabling first-call resolution and even reducing average call handling times. Some "bigger picture" efforts, such as payment compliance, including redaction of terms like "credit card number" or "social security number," also benefit from voice data and analytics. You can even fine-tune your marketing campaigns by picking up on themes and slogans.
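Redaction of sensitive values like these can be sketched with simple pattern matching over a call transcript. Real compliance products use far more robust speech and entity models, so treat the two patterns below as illustrative assumptions:

```python
import re

# Illustrative patterns: digit runs that look like card numbers or SSNs.
PATTERNS = [
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "[REDACTED-CARD]"),
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[REDACTED-SSN]"),
]

def redact(transcript: str) -> str:
    """Replace likely card/SSN values in a transcribed call."""
    for pattern, replacement in PATTERNS:
        transcript = pattern.sub(replacement, transcript)
    return transcript

print(redact("My card number is 4111 1111 1111 1111 and my SSN is 123-45-6789."))
```

In practice the redaction would run on the transcript and on the stored audio segment together, so the sensitive value never survives in either form.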

Using Voice Data for Voice of the Customer

Voice data delivers significant opportunities for a better customer experience without the customer knowing it. An example would be finding the best call center agent for each customer on a real-time basis. AI can understand which agent is suited for the type of call, because not every call needs your best agent. Understanding who is needed will reduce frustration, call transfers, and escalations to managers or higher-skilled agents. Another terrific use of AI and voice data is understanding the customers' emotions. This emotional understanding would allow a real-time sense of frustration, anger, or sweet satisfaction. Appropriate actions could be taken in the moment, and lessons learned could be recorded for future analysis. Instant adaptation is a much more reasonable approach to responding to customers in a way that optimizes the customer's experience.

Customer pain points can be sensed with voice analysis and dealt with before a customer sours and spreads a negative reputation. This analysis will reduce customer churn and increase net promoter scores. Imagine learning what is causing customer churn and adjusting strategies on the fly. Customer analysis helps implement predictive speech analytics to identify the customer at risk and act immediately. It might include a link to smart calling software for callbacks, so an expert can reach back out to the customer. In turn, call center supervisors can implement strategies to prevent churn by building a better knowledge base, improving training, and refining scripts.

Even if caller emotions can't be handled in real time, measuring and classifying caller emotions into buckets yields an aggregate picture of overall customer satisfaction. Examples would likely include want, like, frustration, anger, annoyance, need, passion, and pleasure. Not only can sentiment be classified; so can call drivers, topic discovery, and brand health. Keywords can surface emerging trends and feed real-time dashboards focused on customer experience.
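A toy version of that bucketing can be sketched with keyword cues. Production systems use acoustic and language models rather than word lists, so the lexicon below is purely an illustrative assumption:

```python
from collections import Counter

# Illustrative keyword cues per emotion bucket (an assumption, not a product lexicon).
LEXICON = {
    "frustration": {"again", "still", "waiting", "ridiculous"},
    "anger": {"unacceptable", "furious", "cancel"},
    "pleasure": {"great", "thanks", "perfect", "wonderful"},
}

def bucket_call(transcript: str) -> str:
    """Assign a call transcript to the emotion bucket with the most cue hits."""
    words = transcript.lower().split()
    scores = Counter()
    for bucket, cues in LEXICON.items():
        scores[bucket] = sum(w.strip(".,!?") in cues for w in words)
    best, hits = scores.most_common(1)[0]
    return best if hits else "neutral"

bucket_call("I have been waiting again and this is ridiculous")
```

Aggregating those bucket labels across thousands of calls is what produces the satisfaction picture described above.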

Using Voice Data for Voice of the Employee

Often forgotten in the shuffle is the actual employee voice data. Look for comments from agents that say things like "I wish we could do ….." Employees are a great source for understanding your customers' journey, even when the journey is outside your organization's purview, which might encourage communicating or partnering with other organizations. Indeed, voice data should be analyzed to create more knowledge for the agents, new projects, and training curriculum adjustments. Gaps in agent experience might lead to AI knowledge bots that adapt scripts in real time instead of through a batch script creation process.

Net; Net:

Analyzing voice data with analytics and AI will accelerate time-to-value for both the customer and the organization. It makes organizations more personal for both customers and employees while demonstrating the power of voice data plus AI-rich analytics in the customer experience.

 

Additional Reading

Voice, Voice Baby               

Got Customer Excellence? 

Journey Maps  

Tuesday, November 3, 2020

Voice, Voice Baby

Voice-enabled devices are everywhere, and there is high awareness of the potential of voice data on a personal level. Devices are in our living rooms and even in our bedrooms, but organizations are just scratching the surface of using voice data beyond simple search, understanding, and basic commerce. People talk to speech recognition systems as if they were real people, saying things like "please" and "thank you." Organizations need to up their game to take on smarter uses of voice over the coming decade. They will have to gear up for “Big Voice Data” that goes beyond rudimentary search to analytic leverage and secure redaction of sensitive voice phrases.


Popular Uses Today

Voice search is prevalent and driven by the younger generations, but deep usage is coming from middle-aged wealth builders. Search is a critical participant in understanding what something is, how to leverage it, and the best way to go about it. Search is also a significant participant in simple commerce, finding and buying specific products and services. It is particularly useful for lower-cost items like groceries, entertainment, and essential clothing. Voice is poised for explosive growth in the enterprise for large and complex voice sources.

Needed Uses for Tomorrow

Voice influence cuts both ways: voice can drive commerce and help organizations get better. Organizations will have to ramp up their use of AI-powered voice analytics to gain an advantage. Savvy organizations are examining the critical moments of sales or service, leveraging the speed and accuracy of conversation analysis at optimal cost levels. By understanding what was said and making the intent discoverable, new opportunities arise. This process allows organizations to analyze interactions and prescribe proper actions. Typical uses are offers, complaints, first-call resolution, compliments, outages, and escalation analysis.

On The Edge Uses

Organizations need to stop thinking about voice as just an interface or the outcome of a conversation. Voice data is a unique source of critical insights for the business. Leading organizations create the next generation of communication and collaboration with AI bots/agents to identify call drivers and trends, predict new directions, and design or optimize product and service bundles. This kind of smart voice leverage can apply to prospects or customers and to all constituents involved with product/service creation, including employees and vendors. Voice analysis could cover end-to-end supply or value chains. Voice analytics applies to both external and internal voice moments of truth, while differentiating amongst individuals' diversity.

Net; Net:

We all know that voice has much more value than any other means of communication because it expresses context, sentiment, intent, emotion, and action potential. Speech technology should be a vital part of any digital transformation, as it yields critical insights about products, services, and customers. Better accuracy, smarter implementations, and bulletproof security will allow more complex solutions while making things easier for users. It's time to up your game at getting more value out of the voice data in your organization.

 




Wednesday, October 28, 2020

The Database of Now is About Leveraging Business Moments

Recently I have been writing about the Database of Now without really defining it. While the definition was somewhat implied in the context of the write-ups, it is time to make it official by defining it. Keep in mind that the definition is ideal at the moment and will evolve as digital technology advances to enable it. I have invited a coauthor who has some real-world examples to help flesh out this definition in context. Please welcome Domenic Ravita, whose bio is included below.

Database of Now Definition

The Database of Now is a set of structured or unstructured data representing the present time, or a moment, without any processing or location delays. The Database of Now represents the most accurate and up-to-date state from which to make observations and decisions and take appropriate actions within the context of desired outcomes and goals.

The Database of Now must be considered in context and time continuums, as the data's meaning will vary with these vectors. An accurate state of now is a must for interpreting the past and potentially predicting the picture in emergent situations. Maintaining an accurate database of now is exceptionally challenging given the current amount and speed of data.

Benefits of the Database of Now

The Database of Now delivers operational insights and advantages by providing the current state of the business. It is a modern, efficient approach to cloud data management that broadens, accelerates, and simplifies access to all the relevant in-the-moment and historical data while unifying data access styles and patterns.

Why Care About the Database of Now? 

Digital transformation projects have accelerated to meet the increased demand for digital products and services, providing answers or solutions with immediacy. With the onset of the "always-on" culture, the pervasive use of smartphones and ubiquitous devices has driven a global shift in customer experience and consumer expectations. Business is now won or lost in a moment.

The Database of Now delivers these operational insights and advantages by providing the current state of the business, so organizations can proactively identify, capture, and capitalize on the most crucial moments for their endeavors and their customers' success. It achieves this by simplifying the data infrastructure required to execute diverse workloads across various data styles, patterns, and types. Data professionals, application stakeholders, and end-users gain the advantages of speed, scale, and simplicity.

Characteristics of the Database of Now

  • Instant Integration & Transformation
  • Self-Managed with Incremental Ingestion/Cleaning
  • Run Anywhere Dynamically
  • Pattern & Event Recognition
  • Ability to Learn from Analytics, Machine Learning, and AI
  • Ability to Suggest Alternatives
  • Ability to Take Action within Allowed Freedom Levels




Real-World Needs/Examples

Cash Burn & Flow Need the Database of Now

Managing cash flow and burn in uncertain times requires a “hands-on,” real-time look at the monies and where they are going. Having a pulse on money and how much is flowing out right now is vital when revenues are unpredictable. Keeping a chokehold on expenses can be just as much of a mistake today, so watching corporate performance in real time is a fantastic advantage. Watching spending patterns in the market and in your organization in a real-time fashion is a key to organizational success. All of this monetary care must be watched in the context of current business scenarios that could be subtly changing and appearing in new events and patterns.

The Digital Customer Experience Needs the Database of Now

Measuring what your customer is experiencing in real time yields much better satisfaction and results than looking at slanted surveys and inconsistently timed customer forums. Real-time data mining leveraging the Database of Now will give insights into customer behavior. While customers generally have consistent goals in the way they behave, they are now experiencing new pressures. Many organizations are declaring victory because they could quickly continue supporting customers with a remote workforce, but few look at the customer experience impact of those recent moves. The long-term future of many organizations hangs in the balance with the customer experience. Digital, along with instant adaptation, will go beyond remote workers to customers and other partners.

Saving Lives Needs the Database of Now

Seconds can save lives too. True Digital of Thailand's mission is to protect children from sex trafficking, and they do that by continuously processing massive amounts of web data to identify children in danger. They were able to decrease law enforcement investigation time by as much as 63%. True Digital also seeks to proactively reduce the likelihood of new viral hotspots, including COVID-19, by monitoring mass population movement trends and rates of population density change through anonymized cellphone location data.

Industry 4.0 Needs the Database of Now

When organizations interact to create outcomes with each other, especially when hyper-automation is involved, having instant transparency and acting nearest to the emergent events or patterns is essential. The Database of Now operates well at the edge, leveraging cloud presence, and work shifting capabilities. Measuring results at the edge and adapting within a set of guidelines and goals set up by inter-organizational governance bodies is essential for success.

Each of these examples has make-or-break moments in time. “Now Scenarios” are time-critical, but the length of time available for effective action varies by situation, as does the variety, volume, and velocity of data required. What is essential for these “Now Scenarios” is to leverage all the relevant data to establish the most accurate, complete, and timely context to drive proactive responses. For business operations and management, revenue is lost or gained in a split second.

Net; Net:

The new world is much faster in terms of reactive, proactive, or predictive decisions and actions. The Database of Now is, or will be, a critical fundamental contributor to many digital transformation efforts. Organizations that can sense shifts and intercept the outcomes in time will be those who flourish best.

Additional Case Studies Can Be Found Here:

https://www.memsql.com/

Domenic Ravita is MemSQL’s Field Chief Technology Officer and Head of Product Marketing. He brings 24 years of experience across consulting, software development, architecture, and solution engineering leadership. He offers product knowledge and a senior technical perspective to field teams and to customers, and represents the voice of the customer to the product & engineering organization. He is a trusted technical and business advisor to the Fortune 500, having served global customers through multi-year transformations enabled by event-driven architectures. He has experience in distributed in-memory data grids, streaming analytics, databases, integration, and data science.

Additional Reading on Database of Now:

Better Corporate Performance

https://jimsinur.blogspot.com/2020/08/increasing-corporate-performance-with.html

Accelerated Decisions

https://jimsinur.blogspot.com/2020/08/acceleration-of-decisions-helped-by.html

Contextual Intelligence

https://jimsinur.blogspot.com/2020/07/context-connecting-clues-for-data.html

Better Customer Experiences

https://jimsinur.blogspot.com/2017/09/do-you-need-technology-to-assist.html

Smart Data

https://jimsinur.blogspot.com/2020/07/is-your-data-smart-enough.html

 

 

Wednesday, October 14, 2020

Is Chasing Perfect Data a Reasonable Quest?


We have heard many quotes about the poor quality of data. In fact, there are those who want perfect data before they make a decision. Is that a realistic attitude toward data quality? While there are some situations where nearly perfect data is an absolute must, there are others where you make the best decision under the circumstances. Let's explore some of the issues a bit more.


Figure 1 Interaction of Data Quality and Decisions. 


What is Data Quality?

Gartner defines quality this way:

"The term "data quality" relates to the processes and technologies for identifying, understanding, and correcting data that support effective data and analytics governance across operational business processes and decision making. The packaged solutions available include a range of critical functions, such as profiling, parsing, standardization, cleansing, matching, enrichment, monitoring, and collaborating."
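Two of the functions Gartner lists, standardization and matching, can be sketched in a few lines. The normalization rules and similarity threshold here are illustrative assumptions, not a packaged solution:

```python
from difflib import SequenceMatcher

def standardize(name: str) -> str:
    """Normalize a customer name: trim, lowercase, collapse whitespace."""
    return " ".join(name.strip().lower().split())

def match(a: str, b: str, threshold: float = 0.85) -> bool:
    """Treat two standardized records as the same entity if similar enough."""
    return SequenceMatcher(None, standardize(a), standardize(b)).ratio() >= threshold

match("  ACME  Corp ", "Acme Corp")    # duplicates detected after standardization
match("Acme Corp", "Apex Industries")  # kept as distinct records
```

Even this toy pipeline shows why quality is a process, not a one-time fix: the threshold itself is a judgment call that trades false merges against missed duplicates.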

I'd like to add my analysis to this solid base by referring to Figure 1 above, which tries to show the interaction of time, decisions, and data. Looking at the X-axis, we see data quality increasing as efforts move it to a cleaner, more concise, and crisper state. The Y-axis represents a time continuum that goes from right this instant to all the time ever needed. Given all the time and money necessary, data can approach or even attain perfection, until entropy enters the equation.

When data is new or first brought into an organization, it is good for emergent and morphing sets of problems, where there is often pressure to make decisions and even take actions on those decisions. Things are fuzzier at this point, and a precise answer is often not possible, but progress can be made even with data of lesser quality. Often the decisions needed are not fully understood at this point in time.

As the decisions become more known and even routine, the priorities of the crucial decisions, goals, and outcomes tend to sort themselves out. This helps identify the critical data sources that need attention and efforts to get better over time. Some decisions become so important that operational excellence and customer interaction drive the need to make the decisions excellent too. This excellence demands more perfect data in most instances.

When is Perfect Data Needed? 

Circumstances that Demand Perfect Data

When it comes to safety and the lives of people, it is hard to argue against perfect data. The problem occurs when the timing of getting that data perfect flies in the face of a need to make a decision to avoid downstream negative consequences. Sometimes decisions have to be made with less-than-perfect data. There are two strong forces pushing here that have to be balanced within the context of emergent or known scenarios.

Good Enough Data Sometimes Works

While it is easy to demand perfect data in order to make decisions and take action, sometimes real leadership finds good-enough data to move forward. There is a danger in relying on just gut feel, so leveraging the data the best way possible, considering its flaws, is sometimes the best road to take. Sometimes additional scenario simulation is the answer. Sometimes quick-and-dirty approaches to incremental data bolstering make sense. It may mean that you look at other similar decisions and data for insight.

Net; Net: 

While we all chase better data, all data can't be perfect, so don't die trying. Certain decisions demand perfect data because of their importance, but the critical nature of decision timing fights against this desire. It is very easy to sit back, say that all data has to be perfect, and not take any responsibility or action because the data isn't perfect. Don't get caught in that trap. On the other hand, don't stop investing in great data quality because it might cost some effort. There is a delicate and dynamic balance to be struck here.

Tuesday, October 6, 2020

Art for the 3rd Quarter 2020

I hope you and yours are doing well during this challenging year of 2020. My art business is getting an unexpected lift in sales demand and production. Here are some of the new pieces. One was a request from my granddaughter in Loveland, Colorado. She wanted a painting of a Brontosaurus because she thinks that they are gentle dinos :) I also created two new paintings on a whim. The first was a raven flying past a full moon. It reminded me of my late father, who had a raven for a pet. It sold immediately to a good Canadian friend, Lin Whaley. The second was a night scene of a park in Paris that reminded me of the night Sherry said "yes" in Paris in 1997. I posted it recently and sold a giclee to another friend, Kari Moeller, in Kansas City. I also completed a brilliantly colored fractal for you to see. If you are interested in seeing my portfolio or buying a piece, please click here. Some of the newer pieces haven't made the website yet, so contact me at jim.sinur@gmail.com if you have a desire.


                                                          BRONTO 



                                                 RAVEN NIGHT



                                                   PARC DU PARIS



                                                   BIG BANG



                                             



Tuesday, September 29, 2020

Prioritizing Data Management in the Monster Data Era

As much as organizations would like perfect data under their complete governance/management rules and guidelines, that ideal disappeared with the reality of the big data era. In the "monster data" era, complete data management will be much rarer, especially with the inclusion of more data types, more complexity, and less reliable data sources coming at organizations at an accelerated speed. This writing aims to give some advice on deciding which data sources should get more intense data management. Prioritizing will allow data management to focus better and balance its efforts. It will also enable data usage to be separated from ideal/excellent data management, allowing new data resources to enter the organization and grow into data management over time. Prioritizing data sources for robust data management should consider goals, measures, and change channels.


Prioritize on Critical Goals

Organizations can't focus on all the data coming their way, and they certainly can't depend on AI-supercharged data scientists to find interesting anomalies from the bottom up. The first cut at the data under management should directly relate to the business/organizational outcomes needed as a base. Many organizations set up strategic goals that represent the most desirable outcomes for the circumstances at hand. Savvy organizations plan alternative strategies/scenarios and practice the most likely emergent strategies. Most organizations, these days, tie both their functional and operational goals to the strategy du jour. Savvy organizations look at end-to-end processes/journeys to link goals together, minimize organizational conflict, and consider constituent outcomes as legitimate goals. These efforts would include customer journeys laced with highly desirable customer experiences. These goals will help narrow down the data sources that need proper data management, though there will be many more of them than in the past.

Prioritize on Real World Measures

An excellent place to start is where most of the data activity occurs. Many organizations create heat-maps of highest use and apply the highest level of governance to these sources, regardless of whether they are operational or decision-focused data. The more enlightened organizations push the data timing to real time, representing the Database of Now. These same organizations practice data mining to look at the reality of current and past situations that may point to future operational optimizations. Aware organizations also mine new data sources like voice and video for customer experience improvement potential. While much of the mining is descriptive, the more advanced efforts start prescribing potential changes and even predicting future behaviors. In this kind of environment, decision optimization is considered a major improvement arena.
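The heat-map idea boils down to ranking data sources by access volume and giving the busiest ones the strongest governance. A minimal sketch, with hypothetical source names and an assumed top-N cutoff:

```python
from collections import Counter

def governance_tiers(access_log, top_n=2):
    """Given (source, query_count) events, return the sources that warrant
    the highest level of data governance."""
    usage = Counter()
    for source, count in access_log:
        usage[source] += count
    return [source for source, _ in usage.most_common(top_n)]

# Hypothetical access log aggregated from query auditing.
log = [("orders_db", 950), ("clickstream", 700), ("hr_archive", 12),
       ("orders_db", 300)]
governance_tiers(log)  # the busiest sources get top-tier governance
```

Real heat-maps would weight by decision criticality as well as raw volume, but the prioritization mechanic is the same.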

Prioritize on Change Channels

Like it or not, things change and almost always get more complicated. Rapid change may mean cobbling together creative responses in a quick fashion. To that end, organizations need to sniff many data sources for potential change impacts that could affect strategy or operations. Sometimes it means looking for subtle signals and events in these data streams, and it can range to finding emerging patterns within or across data sources. The likely channels would often include competitors, markets, industries, demographics, vendors, suppliers, geographies, legal frameworks, and governmental pushes. Additionally, best practices involve finding new contexts and sources.
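Sniffing a change channel can be sketched as flagging observations that deviate sharply from the recent baseline. The window size, threshold, and competitor-mention scenario below are illustrative assumptions:

```python
from collections import deque
from statistics import mean, stdev

def make_signal_detector(window=20, threshold=3.0):
    """Flag a new observation more than `threshold` standard deviations
    away from the recent baseline."""
    history = deque(maxlen=window)

    def observe(value):
        flagged = False
        if len(history) >= 2:
            mu, sigma = mean(history), stdev(history)
            flagged = sigma > 0 and abs(value - mu) > threshold * sigma
        history.append(value)
        return flagged

    return observe

# Illustrative channel: daily mentions of a competitor in a news feed.
detect = make_signal_detector(window=7, threshold=3.0)
for mentions in [10, 11, 9, 10, 12, 11, 10, 55]:
    if detect(mentions):
        print("signal: investigate spike of", mentions)
```

A detector per channel is cheap to run, which is what makes watching many channels at once practical.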





Figure 1 Optimal Data Governance Defined

 

Net; Net:

Applying excellent data management governance, depicted in Figure 1, is no longer possible for all data resources. Organizations will have to focus on the data sources that are key to business outcomes that enable organizations to survive, thrive, and even optimize their impact. While data management is necessary for specific resources at a high level, the time has come to identify those particular resources and set data governance levels across the various data sources.