Thursday, August 27, 2015

Components of a Fast Data Architecture

How does an organization differentiate itself in an emerging digital world where data older than an hour is no longer helpful for making decisions that deliver customer-pleasing results? You could build a Fast Data architecture yourself, wait for an alliance to agree and deliver, or buy one out of the box. Some organizations want a designed-for-purpose, integrated approach that is proven to work today, combining an integrated source of signals and patterns, sense-and-alert facilities, a technology-assisted human insight suite, and automated action capabilities, all built on a faster big data infrastructure. Let's dig into each of these contributing stacks.

Integration of Data Sources:

Managing the multitude of signal emitters, whether heterogeneous hardware devices or heterogeneous software activity, is crucial at the mouth of the data funnel that selects, condenses, analyzes, and takes guided or independent actions based on organizational goals. This requires a rich integration and aggregation stack that can handle the variety and complexity of the sources generating data and events.

Real-Time Intelligent Technologies:

When the speed of data increases, new capabilities are needed to deal with that speed and get the most out of the data streams. To sense patterns of organizational interest, whether opportunities or threats, there needs to be a complex event processing capability that can recognize predefined patterns. For an unanticipated event or pattern, real-time analytics can translate an ordinary event into a pattern to watch and add it to the list of threats or opportunities. In either case, alerting mechanisms need to be available to gain attention, and for anticipated patterns there can be inventoried actions.
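To make the sense-and-alert idea concrete, here is a minimal sketch of a complex-event detector over a sliding window. The pattern name, window size, and threshold are illustrative assumptions of mine, not any vendor's API; a real CEP engine would offer a far richer pattern language.

```python
from collections import deque

class PatternDetector:
    """Tiny illustrative complex-event detector (hypothetical names throughout)."""

    def __init__(self, window_size=5):
        self.window = deque(maxlen=window_size)   # keep only recent events
        self.watch_list = {"failed_login_burst"}  # predefined patterns of interest
        self.alerts = []

    def observe(self, event):
        """Add an event, then check the window for patterns on the watch list."""
        self.window.append(event)
        failures = [e for e in self.window if e["type"] == "failed_login"]
        if len(failures) >= 3 and "failed_login_burst" in self.watch_list:
            self.alerts.append(("failed_login_burst", len(failures)))

    def promote(self, pattern_name):
        """An unanticipated pattern spotted by analytics becomes one to watch."""
        self.watch_list.add(pattern_name)
```

The `promote` method mirrors the point above: real-time analytics can turn an ordinary event into a new entry on the list of threats or opportunities.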

Human Insight Assistance:

In this class of capabilities lies a significant set of features to support prediction and decisions. It requires assists that help humans visualize conditions, predict the next best action, and create action or task lists (with automated responses where anticipated), or deal with the unknown. The keys here are the power of the visualization and the ease of use. Some single events or patterns do not need a response, but decision activity may still be involved in either case.

Guided and Automated Actions:

Decisions and actions are important not only in the context of a pattern but also in the context of an overall end-to-end process, a simple set of tasks, or both. The context of channel and of customer needs and desires, which are dynamic by nature, needs to be factored into appropriate actions. This requires support for a variety of process styles, from straight-through processes to dynamically changing and contextual ones. It may even require kicking off a longer-running case that demands deep knowledge and strong collaboration.

Net; Net:

As you can see, Fast Data architectures have to include more than support for data. They have to support visual decisions and smart actions, and all of it has to work seamlessly and flawlessly, removing as much complexity as possible. A number of vendors have built, or are pursuing, speedier big data architectures and are at various stages of maturity. Many view fast data as a better way to stay competitive.
A Sample Integrated Architecture from Tibco that is available and expanding today:

Tuesday, August 25, 2015

Announcing a New Series on Business Rules

"Business rules are everywhere" - they are one of the key dynamics in the new digital world. Rules play an even bigger role than guiding logic (think fraud detection), governing organizations (think ACA and HIPAA), and keeping organizations compliant (think Basel II). The new roles for rules revolve around the growing control that customers have over their own user experience, and around acting as constraints on emergent and evolving behaviors in processes, applications, and the Internet of Things (IoT). This is going to require some new thinking about business rules: how they are authored and by whom, how they are managed, and how they are most efficiently executed (think a cloud-based decision service, an enterprise-class version of IFTTT).

In the coming weeks, Paul Hessinger and I will be doing a series on "Business Rules are Everywhere". We will identify the opportunities and threats of getting business rules right as they permeate processes, systems, resources (human or otherwise), and applications. First let me introduce Paul, with whom I have collaborated for an ungodly number of years. He is secretly one of my heroes, but don't tell him :)

Paul Hessinger is an accomplished technology and business strategist who brings 30+ years of experience and effective leadership to Chicago-based InRule Technology, where he has been the CEO since 2004. He has held a broad span of responsibilities as a Chief Technology Officer, Chief Marketing Officer, Chief Operating Officer, and Chief Executive Officer in the professional services and software industries. Paul spent 16 years with Computer Task Group, the first professional services firm to be listed on the NYSE. Hessinger has acted as executive counsel to leading IT vendors and Fortune 500 firms, assisting with long-range IT strategies, acquisitions, leveraged buyouts, and fundraising.

Outline for the Business Rules Series

1.  Rules are Cool; Rules are Cruel

2.  Rules Case Study

3.  No Code Attitude

4.  Many Faces of Rules in and Around Processes

Net; Net: 

I think you will enjoy this series, as Paul is a sage in the area of business rules and how to successfully manage and apply them.

Tuesday, August 18, 2015

Decisions: Key to Your Destiny Between Stimulus and Response

In today's world of hyper speed and pressure, assisting decisions in the best way possible will determine success. Some of the decisions that organizations make under pressure will be strategic in nature with significant consequences. Others will be tactical and reactive, but better assistance for these kinds of decisions will loom large over time as their effects accumulate. There are a number of assistance capabilities that will enable better decisions in fast times. Organizations will have to get better at planning scenarios and recognizing the emerging stimuli that indicate the need to leverage inventoried responses.


Proactive or Reactive:

Until recently, organizations made decisions by looking through the rear-view mirror, but with improved analytic capabilities and big data, this is no longer the way forward except for trending. Even trending will be combined in real time with up-to-the-nanosecond activity. With speed increasing, organizations should identify the decisions that will have significant impact and prepare for them ahead of time. Really savvy organizations practice crucial scenarios with predetermined stimuli and predetermined responses. Some organizations even use simulation.


Clarifying and Condensing Stimulus:

Stimulus can come from many directions, all at once or over time. This puts a premium on gathering a wide range of events and patterns, which implies a strong integration capability that can collect from many sources (markets, systems, machines, and people). Once gathered, signals are condensed and set aside for combination or comparison. First they should be compared to the crucial stimuli of crucial scenarios; then they can be sorted for interest and notification.


Considering Proper Contexts for Action:

Automatic actions can be taken when they are directly linked to stimulus or decisions; however, context needs to be considered before acting. For dangerous automated responses, approval may have to occur so that downstream impacts can be considered. For responses to customers, the history of interactions and tendencies would be a proper context. At a minimum, the response channel needs to be considered.
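The logic above can be sketched as a simple context gate that sits between a triggered action and its execution. The action attributes, channels, and outcomes here are hypothetical, invented purely to illustrate the idea.

```python
def decide_action(action, context):
    """Gate a triggered action on its context before letting it fire.

    `action` and `context` are plain dicts with illustrative keys.
    """
    if action.get("dangerous"):
        # Dangerous automated responses route to a human for approval,
        # so downstream impacts can be considered first.
        return "queue_for_approval"
    if context.get("audience") == "customer":
        # Customer-facing responses consult interaction history and,
        # at a minimum, the preferred response channel.
        channel = context.get("preferred_channel", "email")
        return f"respond_via_{channel}"
    # Safe, internal actions can execute automatically.
    return "execute_automatically"
```

The point of the sketch is that the gate, not the trigger, owns the decision to act; new contexts can be added without touching the stimulus detection.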


Net; Net: 

Controlling your destiny in pressured times will require additional resources and capabilities. As the cognitive inventory expands in the cloud for use, more decisions will get sophisticated assistance.


Tuesday, August 11, 2015

Top Three Invasive Techniques for Leveraging Legacy

Everyone is afraid of messing with legacy code unless forced to do so. In fact, some organizations have systems written in languages they no longer have the knowledge and skills to maintain, much less take along for the digital ride. In my last post I identified the top five non-invasive approaches to leveraging legacy in the new digital world, all of which keep you out of the code from a change perspective.

This post focuses on surgical methods that reduce risk by taking an incremental remodeling approach to legacy, as opposed to reverse engineering. These techniques are comparable to robotic surgery: precise, minimally invasive, and safe. These invasive approaches isolate and change the areas that need help the most, which implies some study of where changes have occurred most often and where they are most likely to occur.

Creating Explicit Business Rules for Change Hot Spots:

Once you have studied where change has occurred the most and discussed future changes, you will end up with a list of hot spots where externalizing the business rules makes sense to speed up responses to the digital pace of change. You can make these rules explicit through data variables or by leveraging a rule engine that supports some form of rule visualization, such as a decision table, a decision tree, or a decision modeling representation such as DMN. This gives both IT and business professionals rapid control of change, synchronized with the new digital capabilities you are building or have completed.
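A minimal sketch of what an externalized decision table looks like once pulled out of a change hot spot. The discount rules and order fields are invented for illustration; a real rule engine or DMN tool would let business users edit the table directly rather than touch code.

```python
# First-match decision table: (condition on the order, outcome).
# Rules live in data, so they can change without touching callers.
DISCOUNT_TABLE = [
    (lambda o: o["loyalty_years"] >= 5 and o["total"] >= 1000, 0.15),
    (lambda o: o["total"] >= 1000,                             0.10),
    (lambda o: o["loyalty_years"] >= 5,                        0.05),
]

def evaluate(order, table=DISCOUNT_TABLE, default=0.0):
    """Return the outcome of the first rule whose condition matches."""
    for condition, outcome in table:
        if condition(order):
            return outcome
    return default
```

The first-match ordering plays the role of a DMN "first" hit policy: rule priority is visible in the table itself, which is what makes the hot spot reviewable by business people.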

Creating Reusable Code Snippets:

Your completed change study, combined with a digital target architecture, should identify the portions of code that will be leveraged going forward (with or without explicit business rules). These process flows or base code snippets can be isolated and called from the legacy for digital reuse. This is better than a mass SOA approach in that you are focusing on what is leveraged the most, and the granularity does not have to be consistent, though it does have to be practical and inventoried. Combined with the notion of agents, these snippets can be leveraged to create composites of legacy and new digital functionality.

Creating a New Interaction Layer That Leverages Old Logic: 

Quite often legacy systems just need a face lift and have to interact differently with participants in a new context, such as a user-based workbench, a profile, or an adaptable process. Imagine a stronger interface that can understand a client's needs and submit orders based on mood, likes, and CRM history. The order-submission legacy code is used in a completely new context with a new interface and new calling points (invocations). It could also just be screen scraping, but I think the digital world is headed toward dynamic and customized personas that leverage back-end transactions where appropriate, until the back end changes.
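The new-interaction-layer idea can be sketched as a thin façade over unchanged legacy logic. Everything here is hypothetical: `legacy_submit_order` stands in for an existing back-end transaction, and the CRM-history enrichment is an invented example of the kind of context the new layer can add.

```python
def legacy_submit_order(customer_id, item, quantity):
    """Placeholder for the untouched legacy transaction."""
    return {"customer": customer_id, "item": item,
            "qty": quantity, "status": "submitted"}

class DigitalOrderFacade:
    """New calling point: enriches the request, then reuses the legacy code."""

    def __init__(self, crm_history):
        # e.g. {"cust-1": {"likes": ["widgets"]}}
        self.crm_history = crm_history

    def submit(self, customer_id, item, quantity, channel="mobile"):
        profile = self.crm_history.get(customer_id, {})
        result = legacy_submit_order(customer_id, item, quantity)
        # The façade adds context the legacy system never knew about:
        # the channel of interaction and a personalization signal.
        result["channel"] = channel
        result["personalized"] = item in profile.get("likes", [])
        return result
```

Because the legacy function is called, not changed, the back end can be replaced later without disturbing the new personas built on top of the façade.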

Net; Net: 

Light surgery can deliver strong benefits when thought out. Invasive change is not always bad, especially when combined with some strong analysis of when, where, and how best to do it. Legacy can be the budget-eating pig or the bacon in your new digital banquet. You can use invasive and non-invasive approaches together, but mapping and adjusting your digital target is essential in guiding where legacy can help.

Monday, August 10, 2015

Do You Know Digital Well Enough?

We are all being told that organizations will have to change their business models or delivery systems to compete in the new digital world. I think that unless you are faced with an "uber-like" upstart in your industry, business model change will be phased in incrementally, but delivery systems are changing right now. If you want to learn more about the approaches, techniques, and technology groups that work together for the best digital outcomes, you should attend the Executive Insights Track of the Technology Conference in Vegas next month. It may help you turn chaos into digital order over time while giving you valuable information that you can put into practice.

See the attached Video for the Executive Track

I am very excited by the team that will be delivering this fantastic content. Jim Lundy will give his perspective on what it takes to compete as a C-level executive in the digital world with his agenda of presentations and panels. Adrian Bowles will give his views on the new digital world, leveraging big data, predictive analytics, and the new cognitive world. David Mario Smith will show us the digital future with mobile, collaboration, and active content. In addition, I will present the combination of process, the Internet of Things, and cognitive computing, and how to transform to digital while leveraging legacy systems, using real-world examples in production today that deliver on the digital promise.

See the exciting and detailed agenda of the Executive Track:

For more detail on my contribution see

Thursday, August 6, 2015

Top Five Non-Invasive Techniques for Leveraging Legacy

Many organizations are hamstrung by a legacy portfolio they have invested in for decades. They are now challenged with moving forward to a new digital world without the drag of legacy affecting their progress. Don't let legacy eat your budget in a change-happy, customer-focused world, as pictured below. I say it's best to leverage legacy if you can't kill it or outsource it. In this post I will identify the top five techniques for leveraging legacy in a non-invasive way.


Legacy Pigging Out on Your IT Budget

A Mobile Face Lift:

Many organizations have taken a mobile first approach to going digital and it makes a lot of sense for organizations that want to buy time to survive until they can do a better job of designing a set of digital targets. Some smart organizations combine this effort with mapping real customer journeys from the customer viewpoint.


Tapping Fast Data:

Some organizations have leveraged new big data pools and lakes to focus on the data that represents events of interest, producing patterns that require responses. Traditionally, organizations have tapped integration data to find opportunities, but big data combined with the IoT and complex events creates new opportunities to speed up actions, even from legacy applications that run at slower speeds.


Modeling for Legacy Leverage & Elimination:

Understanding the state of your legacy assets is very important these days for deciding whether to leverage those assets going forward, where appropriate. Modeling the process flows inside these applications and processes is key to understanding their scope and impact. The process flow can be elicited from the people who work with the applications, or you can use a scientific mining approach.


Mining Rules for Documentation & Analysis: 

It is important to understand process flows, but it may be more important to understand the business policies, rules, and constraints that are coded into your legacy logic. Finding the proper code parsers, rule aggregators, and rule classifiers that can deal with your pieces of legacy code is important. I found a "one stop" vendor who has helped many of my clients in the past.


Hybrid Mix of New & Legacy:

A major theme of digital is hybrid: in resources, speeds, channels, and the mixture of technologies to leverage. Mixing new functionality with legacy applications in a surround fashion is a very popular way to deal with the challenge of new digital efforts. Legacy can serve as the core transaction processor while the new functionality deals with the dynamically changing portion of digital. Hopefully you can use the legacy in a large-grained way, but there are approaches that allow slicing and dicing legacy for more focused usage.

Net; Net: 

In an ideal world, organizations would be able to cope with legacy using non-invasive approaches and techniques. While I have seen successful efforts without legacy invasion in the short term, over time invasive techniques will likely be needed. I will document the popular invasive techniques in a future post. Watch this space.

Monday, August 3, 2015

Fast Data is Better Than Big Data

Big Data is here to stay and it is a base for many possibilities in the new "Digital World" of extreme competition. One of the brightest areas is the notion of fast data as it looks at the "Right Now" opportunities for organizations. With the combination of the internet of things and emerging business patterns, fast data will play an important role.

With intelligent actions based on a combination of strong analytical approaches, prediction, and cognitive computing capabilities, the future for fast data looks strong. But before we jump on this movement, it is important to take a moment to explore the business benefits of fast data. I wrote a blog for Tibco on this topic that I think you will find interesting, and I will be writing more on Fast Data going forward.