Analytics is a risky proposition. Deployment is complex, expensive, and fraught with pitfalls that can doom any project. As such, the jobs of many CAOs/CIOs are on the line with every analytics project undertaken. The payoff has to outweigh the risk, and it’s not always easy to guarantee that. But what if you could reduce that risk? What if you could go into any analytics implementation project knowing you were doing it right?

Actually, you can. You just have to build an environment where analytics is deployed with the right tools and architecture, enough flexibility and scalability, and the right deployment model to meet your needs now and as they change.

Don’t Re-invent the Wheel

Everyone has their ideal set of analytics tools, languages, and platforms. And, typically, if they can justify the cost, function heads get to deploy their own analytics systems without much regard for what’s best for the organization. The result is analytics anarchy: overlapping capabilities deployed across multiple siloed environments, where, instead of leveraging what other departments have done, teams build models and applications de novo, wasting time and resources. Out of this chaos comes very little except blurred insights and slow decision-making.

Keys to De-Risking Analytics

To make order out of the chaos, it’s essential to operationalize analytics. Make it a process that uses pre-built models and applies those models consistently across the organization. Also, consolidate your analytics architecture: realistically, 90% of analytics tools can be deployed on one comprehensive architecture that supports the ingestion, storage, and analysis of multi-structured data.

Make sure you have support for your analysts’ favorite languages, such as R, Python, and SAS, and for tools such as RStudio, Jupyter, and SAS. But keeping them all on the same platform is a must to achieve consistent insights.
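
To picture what “same platform” means in practice: a Python analyst in Jupyter and an R or SAS colleague all query the same governed tables on one endpoint. Here’s a minimal sketch of that idea in Python; the teradatasql SQLAlchemy dialect (from the teradatasqlalchemy package), the host, credentials, and table names are illustrative assumptions, not specifics from this post.

```python
# Minimal sketch: one shared warehouse, many client tools.
# Connection string, schema, and table names are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Every tool (Jupyter, RStudio, SAS) points at this one endpoint; here via
# the teradatasqlalchemy dialect -- swap in whatever your platform uses.
engine = create_engine("teradatasql://analyst:secret@warehouse.example.com")

# The same governed table an R or SAS colleague would query -- no per-team copy.
revenue_by_region = pd.read_sql(
    "SELECT region, SUM(revenue) AS revenue FROM sales.orders GROUP BY region",
    engine,
)
print(revenue_by_region.sort_values("revenue", ascending=False))
```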

Deploy with Flexibility

Currently, you might be satisfied with having your data storage and analytics capabilities fully on premises. But what if you want to port to the cloud? What if you want to deploy a hybrid storage model? Could you do that? Does your software and/or hardware vendor support such a move? Does your architecture? If so, could you make the move quickly? If you’re like most companies, the answer to at least one of those questions is no.

It’s critical, both to meet market demands and to find and solve problems quickly, that you have the flexibility to store data in any environment (on-premises, in the cloud, or a hybrid of the two) and to access it quickly for analysis. Pick a vendor with portable licensing and deployment models that are essentially environment agnostic, one who will help you deploy in whatever environment makes sense for your business and port your workloads and analytics capabilities as your requirements change.
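
One concrete way to think about that portability: keep the analysis code constant and make the environment a configuration choice. A minimal sketch, reusing the same hypothetical SQLAlchemy setup as above; the endpoint names are invented for illustration.

```python
# Minimal sketch: the workload is identical on-premises and in the cloud;
# only configuration selects where the data lives. Endpoints are hypothetical.
import os

import pandas as pd
from sqlalchemy import create_engine

ENDPOINTS = {
    "on_prem": "teradatasql://analyst:secret@dc1.corp.example.com",
    "cloud": "teradatasql://analyst:secret@warehouse.cloud.example.com",
}

# Porting the workload is a one-line config change, not a rewrite.
env = os.environ.get("ANALYTICS_ENV", "on_prem")
engine = create_engine(ENDPOINTS[env])

late_shipments = pd.read_sql(
    "SELECT TOP 10 * FROM supply_chain.shipments WHERE days_late > 0",
    engine,
)
```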

Right-Size It and Make It Scalable

Analytics is risky enough without having to bet the farm: paying for everything up front, and accepting bundles that make you pay for capabilities you don’t need and might never need. And many companies, especially those without robust IT staffs, find it difficult to manage all that complex technology. Yet, despite the strain this arrangement puts on IT budgets, vendors hot to sell you the latest analytics wonder tools rarely present other options.

Don’t buy it. Literally. Instead, find a vendor who’ll work with you to build a bundle of services that meets your needs and that can be reconfigured as those needs change. It’s also critical to be honest about your IT capabilities, and your risk tolerance, for managing your analytics functionality. Give serious consideration to subscription-based and/or as-a-service analytics to get the capabilities you need, when you need them, with the ability to scale up or down quickly and seamlessly.

The Payoff: What You Get

Simply put: less risk. How? The benefits are many, but let’s examine one very common scenario. Say you’re having a problem with your supply chain. In a typical situation, it might take six to nine months to build a one-off analytical model to solve it.

However, in an environment where analytics is operationalized, with pre-built models and effective tools to apply to just about any problem set, and with the ability to scale up as your data needs warrant, you’ll be able to solve the problem in weeks, not months. That reduces your opportunity cost and your risk. You get to a solution more quickly, and you have the peace of mind that comes with the deeper insights needed to ensure it’s the right solution. Deeper insights, quicker problem-solving, and better decision-making: that’s a pretty big payoff.
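
As a sketch of what “operationalized” can look like (the post doesn’t prescribe a tool, so take MLflow and every name below as assumptions): the supply-chain team loads an already-validated, organization-wide model from a shared registry and scores its own data, rather than building a model de novo.

```python
# Hypothetical sketch of reusing a pre-built model from a shared registry,
# using MLflow as an example; model, file, and column names are invented.
import mlflow.pyfunc
import pandas as pd

# Load the production version of an org-wide, pre-validated model.
model = mlflow.pyfunc.load_model("models:/supply_chain_delay_risk/Production")

shipments = pd.read_csv("shipments_this_week.csv")  # hypothetical extract
shipments["delay_risk"] = model.predict(shipments)

# Hand the riskiest lanes to the ops team in days, not months.
print(shipments.nlargest(10, "delay_risk")[["lane_id", "delay_risk"]])
```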

I’d love to hear what you think. Please comment or DM me on Twitter, and please follow me! You can also message me on LinkedIn, or email me at anuraag.jain@thinkbiganalytics.com.

General manager of Teradata Consulting and Go-To-Market Analytic Solutions. Thought leader in analytics, business intelligence, big data, and business transformation. My passion is helping my clients drive value through data and achieve a sustainable competitive advantage.
