Decision Management Solutions Blog
Another analytic practitioner speaks – an interview with Tracy Altman
Last year I interviewed Andrea Scarso, CEO of MoneyFarm, about analytics. This was a hugely popular post so I thought I would continue the series this year by interviewing some other analytic practitioners. The first in this continuing series is an interview with Tracy Allison Altman, co-founder of Ugly Research. Ugly Research is developing PepperSlice, an analytics application for explaining recommendations to decision makers. She’s on Twitter @EvidenceSoup, blogs at EvidenceSoup.com and you can read about her work at Ugly Research.
What’s your background, how did you come to be working in analytics?
I started out as an engineer, evaluating oil and gas investment decisions. This sparked my interest in data and analytical methods, so I got a Master’s (Computer Science concentration) and then a PhD in Pub
Decision Model and Notation officially adopted
Well it’s official – the Object Management Group’s Board of Directors has voted to publish the Decision Model and Notation (DMN) specification. You can see the press release here. Decision Management Solutions is excited to be a submitter for this new standard. As I say in the press release:
“Decision modeling is transforming how organizations adopt powerful technologies such as business rules and advanced analytics to automate and improve their decision-making. The Decision Model and Notation (DMN) standard provides an open and effective specification for this emerging practice. Decision Management Solutions is committed to supporting and using DMN and we’re delighted that the standard has been adopted and garnered broad support among OMG members.”
If you are interested in this standard and how it can complement proce
Webinar: Agile and Cost Effective Financial Compliance
I am giving a webinar on Agile and Cost Effective Financial Compliance: Going Beyond Business Rules with Decision Management on February 13 at 10am Eastern. I will be joined by Jan Purchase, Director and Founder of Lux Magi, an investment banking specialist and Decision Management Solutions partner.
Decision management and business rules management systems are the ideal platform for an agile and cost-effective compliance approach. In regulated industries like financial services, leading companies are building compliance into every process and system with consistency and transparency across the entire organization and with the agility to meet ever more challenging deadlines. Companies that fail to do so incur huge costs with manual checks and balances and risk significant fines. We’ll share know-how and best pra
Standards in Predictive Analytics: Futures
In this series so far we have discussed a number of standards – R, PMML and Hadoop – that are well established. There are also some future developments that are worth considering—the emergence of the Decision Model and Notation standard, growing acceptance of Hadoop 2 and planned updates to PMML specifically.
As regular readers of this blog know, the Object Management Group recently accepted the Decision Model and Notation standard as a beta specification for finalization in 2014. DMN provides a common modeling notation, understandable by both business and technical users, that allows decision-making approaches to be precisely defined. These decision models can include the specification of detailed decision logic and can model the role of predictive analytics at a requirements level and at an implementation level through the inclusion of PMML models as functions.
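To make the idea of decision logic invoking a predictive model concrete, here is a rough sketch in R. The score_churn_risk() function merely stands in for a PMML model that a DMN decision model would reference; the function, its formula and the thresholds are all invented for illustration.

# Hypothetical stand-in for a PMML scoring function: returns a churn risk in [0, 1].
score_churn_risk <- function(customer) {
  0.3 * customer$months_inactive / 12 + 0.7 * (1 - customer$satisfaction)
}

# Decision logic of the kind a DMN decision table would capture,
# expressed here as plain conditions over the model's score.
retention_decision <- function(customer) {
  risk <- score_churn_risk(customer)
  if (risk > 0.7) {
    "offer retention discount"
  } else if (risk > 0.4) {
    "schedule follow-up call"
  } else {
    "no action"
  }
}

retention_decision(list(months_inactive = 6, satisfaction = 0.4))

In a real DMN model the scoring step would be an imported PMML model and the thresholds would sit in a decision table rather than in code.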
Standards in Predictive Analytics: Hadoop
With the fourth post in this series I am going to talk about Hadoop – something with even more hype than R or predictive analytics. As we all know, the era of Big Data has arrived. As anyone who reads the IT or business press knows, there is more data available today, this data is no longer just structured data suitable for storing in a relational database but is increasingly unstructured and semi-structured data, and this data is arriving at organizations more rapidly than ever. All this is putting pressure on traditional data infrastructures and Hadoop has emerged as a key technology for dealing with these Big Data challenges. Grossly simplifying, Hadoop consists of two core elements—the Hadoop Distributed File System or HDFS and the MapReduce programming framework. HDFS is a highly fault-tolerant distributed file system that runs on low-cost comm
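To make the MapReduce half of that picture concrete, here is a minimal word-count mapper written as a Hadoop Streaming script in R; the submission command and file names are assumptions for illustration, not something from the post. The mapper reads raw text from standard input and emits tab-separated word/count pairs, which Hadoop's shuffle phase groups by word before a reducer sums the counts.

# Minimal Hadoop Streaming mapper in R (a sketch; submitted with something like
# `hadoop jar hadoop-streaming.jar -mapper mapper.R -reducer reducer.R ...`).
input <- file("stdin", open = "r")
while (length(line <- readLines(input, n = 1, warn = FALSE)) > 0) {
  words <- unlist(strsplit(tolower(line), "[^a-z]+"))   # crude tokenization
  for (w in words[nzchar(words)]) {
    cat(w, "\t1\n", sep = "")                           # emit <word, 1>
  }
}
close(input)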
Online Decision Management and Decision Modeling training in February
We are running our “Decision Management and Decision Modeling” training class online in February.
This 6-session online live training class will prepare you to be immediately effective in using the Decision Management approach and a modern, collaborative and standards-based approach to decision modeling. You will learn how to identify and prioritize the decisions that drive your success, see how to analyze and model these decisions, and understand the role these decisions play in delivering more powerful information systems. This course is newly designated an IIBA Endorsed Course so you will earn 9 PDs/CDUs for attending.
Sessions begin at 10am Pacific/1pm Eastern and are 90 minutes in length. The 6 sessions are spread over two weeks:
February 4, 5, and 6
February 18, 19, and 20
Decision Management, of course, is a proven approach for adopting business rules and pr
Standards in Predictive Analytics: R
The third post in my series on standards in Predictive Analytics is on R, a hot topic in analytic circles these days. R is fundamentally an interpreted language for statistical computing and for the graphical display of results associated with these statistics. Highly extensible, it is available as free and open source software. The core environment provides standard programming capabilities as well as specialized capabilities for data ingestion, data handling, mathematical analysis and visualization. The core contains support for linear and generalized linear models, nonlinear regression, time series, clustering, smoothing and more. The language has been in development and use since 1997 with the 1.0 release coming in 2000. The core is now at release 3.0. New capabilities can be added by creating packages typically written in the R language itself. Over 5,000 packages have been added through the open source community.
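As a small illustration of those core capabilities, the following lines use only base R and its built-in example datasets; nothing here is specific to any particular release, and the package named at the end is just an example.

# Generalized linear model: predict transmission type (am) from weight and horsepower.
fit <- glm(am ~ wt + hp, data = mtcars, family = binomial)
summary(fit)

# Clustering: k-means on the four numeric measurements in the iris dataset.
clusters <- kmeans(iris[, 1:4], centers = 3)
table(clusters$cluster, iris$Species)

# Extensibility: community packages are installed and loaded like this.
# install.packages("randomForest")
# library(randomForest)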