As the saying goes, it isn’t the size of the data but how you use it. You can have the best data analytics in the world, but they are useless without the business processes required to act on the data. Once data is gathered and analyzed, it is time to enact change based on the information gleaned from the data. Here are some tips on how to create quality procedures and avoid common mistakes.
The first common mistake to avoid is called “analysis paralysis.” This phenomenon affects businesses and individuals alike: you become so inundated with data, and so obsessed with analyzing a situation, that you cannot act. Let’s face facts: most decisions are made on gut impulse, and then we go through the motions of verifying our gut feelings with information.
With an ever-increasing amount of information available to consider, there is a temptation to put off final decisions until more information comes in. The truth is that you can never really get enough information to be 100% certain of any decision. At some point, you just have to jump in.
In business, this problem can be avoided by setting up clear guidelines on how much data to gather before making a decision. Make sure your time frames are set from the start. Create set timelines for testing. You can always go back and reevaluate later if you need to.
The second thing to avoid is creating “data silos.” This is where different applications do not share data and information with each other. It is the latest incarnation of an age-old problem in business: the “silo mentality.” It is up to management to break down organizational barriers between departments and prevent silo mentality. Likewise, it is up to management to implement software that can integrate the data from a variety of applications. Think of each application as a frame in a movie. You need to string all of the different frames together in order to get the whole moving picture.
To truly eliminate silos, you must attack their human and digital counterparts simultaneously. Departments should be sharing data. Departments can better understand each other when they see the data upon which decisions are made. This also allows them to cross-pollinate ideas and introduce different perspectives on how to address a problem. You do not need to go to the extreme of making all data available to all employees; doing so may create security problems. It is important, however, to have department heads in a position where they can see the larger picture, and upper management, as always, needs to be able to integrate everything into a cohesive whole. Also, don’t assume that just because upper management collaborates, employees further down the chain do the same.
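To make the "frames of a movie" idea concrete, here is a minimal sketch of stitching siloed records into one unified view. The department names, field names, and sample records are illustrative assumptions, not a real schema:

```python
# Hypothetical exports from two siloed applications, keyed by customer_id.
sales = [
    {"customer_id": 1, "name": "Acme Co", "last_order": "2024-05-01"},
    {"customer_id": 2, "name": "Globex", "last_order": "2024-04-12"},
]
support = [
    {"customer_id": 1, "open_tickets": 3},
    {"customer_id": 3, "open_tickets": 1},
]

def merge_by_key(*datasets, key="customer_id"):
    """Combine records that share a key into a single unified record."""
    merged = {}
    for dataset in datasets:
        for record in dataset:
            # Start a record for unseen keys, then layer in each app's fields.
            merged.setdefault(record[key], {}).update(record)
    return merged

unified = merge_by_key(sales, support)
```

In practice an integration platform or warehouse does this joining for you, but the principle is the same: a shared key lets each department's "frame" contribute to one picture.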
On the more technical side of things, it is critical to maintain your data reservoir (or “data lake”) so it doesn’t deteriorate into a “data swamp.” While some of these processes can be automated, technology is still at the point where an actual human being needs to tend to the data. Utilizing metadata is critical, and a logical, clear organizational structure will save a lot of headaches in the long run. If you don’t keep up with organizing and categorizing your data, it will quickly deteriorate into an unintelligible morass. Without metadata and some form of organizational structure, each time you try to analyze the data you will be starting from scratch.
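As a rough illustration of what "utilizing metadata" can mean in practice, here is a sketch of a tiny catalog that records where each dataset came from, when it was ingested, and how it is tagged. The specific fields (source, ingested, tags) are assumptions for the example, not a standard:

```python
from datetime import date

# A toy metadata catalog: dataset path -> descriptive metadata.
catalog = {}

def register(path, source, tags):
    """Record a dataset's origin, ingest date, and tags in the catalog."""
    catalog[path] = {
        "source": source,
        "ingested": date.today().isoformat(),
        "tags": set(tags),
    }

def find(tag):
    """Return every cataloged dataset carrying the given tag."""
    return [path for path, meta in catalog.items() if tag in meta["tags"]]

register("lake/sales_2024.csv", "crm_export", ["sales", "2024"])
register("lake/web_logs.json", "web_server", ["logs", "2024"])
```

Even this much structure means an analyst can ask "what sales data do we have?" instead of rummaging through files from scratch; real data-catalog tools build on the same idea.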
Duplication in a data reservoir is a common problem, and it isn’t always easy to spot, so the best way to prevent it is to be proactive. The overarching theme is that you have a wide variety of data, and often you are comparing apples to oranges. There isn’t always an easy way to organize it, and sometimes the best approach is to utilize several different software products to organize and coordinate your reservoir of data. Sometimes you will need to organize the “data lake” into smaller “data pools.” Try to find a solution that is customized to your specific needs.
This concludes the three-part Thought Leadership series about data. Be sure to check out the other two installments: Gathering Data and Analyzing Data.