“Making predictions is hard, especially about the future” is a well-known witticism. When it comes to making predictions about how companies will make predictions, it can be even harder to know what to say. Nevertheless, the folks over at KDnuggets recently asked some of the leading experts in Data Science and Predictive Analytics for their thoughts on developments in 2016 and trends for 2017. I was one of those who participated, and you can see the article here – Data Science, Predictive Analytics Main Developments in 2016 and Key Trends for 2017. Several key themes emerged from the various expert responses:
- More focus on ROI and business results and less on technology experiments
- More cynicism about predictive analytics because of the election results and the perception that predictions were “wrong”
- Broader adoption of analytic techniques by mainstream organizations, not just early adopters and innovators
- More automated machine learning and, as a result, less reliance on data science experts to build models
- More embedding of analytics, machine learning and even cognitive computing into real-time operational systems – like the example Tom Davenport discussed in his post on printing money.
- A more fragmented analytic tools landscape, with more companies adopting a mix-and-match approach that includes both proprietary and open source tools.
Personally, I think these trends are going to force companies to get real about how they identify analytic opportunities and frame analytic requirements. Only with a clear sense of what they hope to achieve can they deliver business results, make sure the analytic is believable in context, and determine the right mix of technologies to use. Expect to see growth in the use of decision modeling, for instance, to capture the business understanding needed in analytic projects. See our materials on framing analytic requirements and on the questions to ask before you build an analytic model for more.