Decision Management Solutions Blog
Standards in Predictive Analytics: Hadoop
With the fourth post in this series I am going to talk about Hadoop, something with even more hype than R or predictive analytics. As we all know, the era of Big Data has arrived. As anyone who reads the IT or business press knows, there is more data available today; it is no longer just structured data suitable for storing in a relational database but is increasingly unstructured and semi-structured; and it is arriving at organizations more rapidly than ever. All this is putting pressure on traditional data infrastructures, and Hadoop has emerged as a key technology for dealing with these Big Data challenges. Grossly simplifying, Hadoop consists of two core elements: the Hadoop Distributed File System (HDFS) and the MapReduce programming framework. HDFS is a highly fault-tolerant distributed file system that runs on low-cost commodity hardware.
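The MapReduce idea is easiest to see in Hadoop Streaming, which lets ordinary executables act as the map and reduce steps, reading lines on stdin and emitting tab-separated key/value lines on stdout. The sketch below is mine, not from the post: a toy word count in Python, with the single-file layout and the "map"/"reduce" argument switch as my own convention.

```python
#!/usr/bin/env python3
# A toy word count in the Hadoop Streaming style (a sketch, not from the post).
import sys


def mapper():
    # Emit one "word<TAB>1" line per word; Hadoop sorts these by key
    # before handing them to the reducer.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word.lower()}\t1")


def reducer():
    # Input arrives sorted by word, so all counts for a word are contiguous.
    current, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t", 1)
        if word == current:
            count += int(value)
        else:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, int(value)
    if current is not None:
        print(f"{current}\t{count}")


if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

Locally you can approximate a run with `cat input.txt | python3 wordcount.py map | sort | python3 wordcount.py reduce`; on a cluster the same script would be passed to the hadoop-streaming jar as both the -mapper and -reducer commands.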
Online Decision Management and Decision Modeling training in February
We are running our “Decision Management and Decision Modeling” online training in February.
This 6-session online live training class will prepare you to be immediately effective in using the Decision Management approach and a modern, collaborative and standards-based approach to decision modeling. You will learn how to identify and prioritize the decisions that drive your success, see how to analyze and model these decisions, and understand the role these decisions play in delivering more powerful information systems. This course is newly designated an IIBA Endorsed Course, so you will earn 9 PDs/CDUs for attending.
Sessions begin at 10am Pacific/1pm Eastern and are 90 minutes in length. The 6 sessions are spread over two weeks:
February 4, 5, and 6
February 18, 19, and 20
Decision Management, of course, is a proven approach for adopting business rules and predictive analytics.
Standards in Predictive Analytics: R
The third post in my series on standards in Predictive Analytics is on R, a hot topic in analytic circles these days. R is fundamentally an interpreted language for statistical computing and for the graphical display of the results associated with these statistics. Highly extensible, it is available as free and open source software. The core environment provides standard programming capabilities as well as specialized capabilities for data ingestion, data handling, mathematical analysis and visualization. The core contains support for linear and generalized linear models, nonlinear regression, time series, clustering, smoothing and more. The language has been in development and use since 1997, with the 1.0 release coming in 2000; the core is now at release 3.0. New capabilities can be added by creating packages, typically written in the R language itself, and over 5,000 packages have been added through the open source community.
Webinar: Nurture Customer Loyalty With Personalized Messages and Channel Alignment
I am giving a webinar, Nurture Customer Loyalty With Personalized Messages and Channel Alignment, for the Association of Strategic Marketing on March 17 at 1:00pm Eastern.
More customers are using more channels, and with the growth in mobile and other automated channels it’s challenging to deliver personalized, targeted messages across channels and over time. One-size-doesn’t-really-fit-anyone marketing based on simplistic ‘customer-centric’ strategies isn’t getting it done. Companies need to take what they know about their customers and use it to deliver a truly personalized, 1:1 interaction every time a customer interacts with them, no matter which channel they use.
5 Reasons to Attend
- Best practices in responding to customers in real time.
- Why decision making is central to personalized customer interactions.
Standards in Predictive Analytics: PMML
Continuing my series on standards in Predictive Analytics, I am going to talk first about PMML. PMML is an XML standard for the interchange of predictive analytic models, developed by the Data Mining Group (DMG). The basic structure is an XML document that contains a header, a data dictionary, data transformations and one or more models, each consisting of a mining schema based on the type of model, a target and other outputs such as measures of accuracy. PMML started in 1998 and the most recent release was 4.1 in 2011. The 4.x releases marked a major milestone with support for pre- and post-processing, time series, explanations and ensembles. Support for PMML is widespread and growing, with an active community, and many analytic vendors are either members of the DMG or provide support for the standard in their products.
PMML has particular value for organizations as they move away from a batch scoring mindset to a more real-time scoring approach.
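To make that structure concrete, here is a minimal sketch of my own (not from the post) that lists the header, data dictionary and model elements of a PMML 4.1 file using only the Python standard library; the file name model.pmml is an assumption, and exact attributes vary by model type, so check the DMG schema for details.

```python
# A minimal sketch (mine, not from the post): peek at the main sections of a
# PMML 4.1 document with the Python standard library only.
import xml.etree.ElementTree as ET

NS = {"pmml": "http://www.dmg.org/PMML-4_1"}  # PMML 4.1 namespace

root = ET.parse("model.pmml").getroot()  # model.pmml is a placeholder file name

# Header: who produced the model and with what tool.
app = root.find("pmml:Header/pmml:Application", NS)
if app is not None:
    print("Produced by:", app.get("name"), app.get("version"))

# Data dictionary: every field the model expects, with type information.
for field in root.findall("pmml:DataDictionary/pmml:DataField", NS):
    print("Field:", field.get("name"), field.get("dataType"), field.get("optype"))

# One or more models, each carrying its own mining schema and outputs.
for child in root:
    tag = child.tag.split("}")[-1]  # strip the namespace prefix
    if tag.endswith("Model"):
        fields = child.findall("pmml:MiningSchema/pmml:MiningField", NS)
        print("Model:", tag, "with", len(fields), "mining fields")
```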
White Paper of the Week: Beyond Business Agility
Happy New Year. This week’s white paper is one from 2012, Beyond Business Agility: Becoming Adaptive and Analytic, or how to use more than just business rules to develop systems more effectively.
Business rules are a powerful tool for developing agile systems. Moving forward, becoming analytic and adaptive as well as agile will be a source of competitive advantage. This white paper is based on my keynote from the Building Business Capability conference.
Contents:
- Introducing Decision Management Systems
- Becoming Analytic
- Becoming Adaptive
- Next Steps
You can find this white paper, and many others, on our website at decisionmanagementsolutions.com.
The growing impact of Business Rules and Decision Management. What a difference 5 years makes!
Back in 2008 I wrote this post, The small impact of business rules on the big players, bemoaning the lack of serious investment in business rules on the part of major software companies. As we enter 2014 I thought I would revisit it and consider what a difference 5 years makes. Let’s consider some of the world’s top enterprise software vendors:
- IBM
IBM has become a major booster for both Decision Management and business rules. The old ILOG product is now Operational Decision Management and has seen significant and continuous investment since IBM acquired it. Decision Management is one of three pillars of IBM’s Smarter Process initiative, and every year it seems to have more visibility at events like IMPACT as well.
The Role of Standards in Predictive Analytics: A Series
I am working on a paper, for publication in early 2014, on the role of standards such as R, Hadoop and PMML in the mainstreaming of predictive analytics. As I do so I will be publishing a few blog posts. I thought I would start with a quick introduction to the topic now and then finish the series in the new year.
Just a few years ago it was common to develop a predictive analytic model using a single proprietary tool against a sample of structured data. The model would then be applied in batch, storing scores for future use in a database or data warehouse. That approach has been disrupted in recent years:
- There is a move to real-time scoring, calculating the value of predictive analytic models when they are needed (see the sketch after this list).
- At the same time, the variety of model execution platforms has expanded, with in-database analytics as well as MapReduce-based execution becoming increasingly common.
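To illustrate the shift, here is a toy contrast of my own (none of the names, coefficients or data come from the paper): the same scoring function used once in a batch pass that stores scores ahead of time, and once on demand at the moment a decision is needed.

```python
# A toy batch vs. real-time scoring contrast (hypothetical names and numbers).
import math

COEFFS = {"intercept": -1.2, "balance": 0.0004, "late_payments": 0.9}  # made-up model


def score(record):
    """Return a probability-like risk score for one customer record."""
    z = (COEFFS["intercept"]
         + COEFFS["balance"] * record["balance"]
         + COEFFS["late_payments"] * record["late_payments"])
    return 1.0 / (1.0 + math.exp(-z))


def batch_score(customers):
    # Batch mindset: score everyone overnight and persist the results for lookup.
    return {c["id"]: score(c) for c in customers}


def score_on_request(customer):
    # Real-time mindset: score a single record at decision time, on fresh data.
    return score(customer)


if __name__ == "__main__":
    customers = [
        {"id": 1, "balance": 5200.0, "late_payments": 0},
        {"id": 2, "balance": 300.0, "late_payments": 3},
    ]
    print(batch_score(customers))
    print(score_on_request(customers[1]))
```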