Using Analytics Results
Deploying an analytics model is a significant milestone, but it is not the end of the journey! A model needs to contribute to desired outcomes to be successful. Frequently, this means delivering model outputs to investigators in a useful form, such as a report or dashboard, that can be integrated with existing case-management systems to prioritize investigations. It can also mean using the results to identify processes and controls that can prevent future fraud.
If you have ever had a credit or debit card blocked from transactions while traveling, you have experienced this first-hand: the location of card usage became an outlier, and the card issuer attempted to prevent future fraud by locking the account and requiring the account owner to verify whether the change was legitimate. For a good client, this "false alarm" creates a cost (a negative experience) that must be weighed against the opposite error, a "false dismissal" of actual fraud, to find the best operating balance.
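This trade-off can be made concrete by attaching a cost to each type of error and comparing the expected cost per transaction at different operating points. The costs, error rates, and fraud prevalence below are illustrative assumptions, not figures from any real card issuer; they only sketch the calculation.

```python
# Hypothetical costs (assumptions for illustration only):
COST_FALSE_ALARM = 20.0       # customer friction / service cost of wrongly blocking a card
COST_FALSE_DISMISSAL = 500.0  # average loss when actual fraud is not blocked

def expected_cost(false_alarm_rate, false_dismissal_rate, fraud_prevalence):
    """Expected cost per transaction at a given operating point."""
    legit = 1.0 - fraud_prevalence
    return (legit * false_alarm_rate * COST_FALSE_ALARM
            + fraud_prevalence * false_dismissal_rate * COST_FALSE_DISMISSAL)

# Compare two hypothetical operating points of the same model:
aggressive = expected_cost(false_alarm_rate=0.05, false_dismissal_rate=0.10,
                           fraud_prevalence=0.002)
lenient = expected_cost(false_alarm_rate=0.01, false_dismissal_rate=0.40,
                        fraud_prevalence=0.002)
print(f"aggressive blocking: {aggressive:.4f} per transaction")
print(f"lenient blocking:    {lenient:.4f} per transaction")
```

With these particular numbers the lenient operating point is cheaper overall, because false alarms are common enough that their cumulative cost outweighs the rarer fraud losses; with different costs or fraud prevalence, the balance can tip the other way.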
Model maintenance is critical for generating long-term value from a fraud analytics model. Circumstances change over time, and models need to reflect these changes to remain effective. For credit cards, the volume of online transactions has grown over time; models that have not been updated would not reflect current rates of online transactions accurately and might therefore wrongly select cases for investigation. Similarly, inflation shifts the average dollar amounts of transactions, so these must be monitored and the model updated to preserve accuracy.
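A minimal version of this monitoring is to compare a feature's distribution at training time with its distribution in recent data and flag a meaningful shift. The sketch below uses the share of online transactions as the monitored feature; the field names, threshold, and data are illustrative assumptions, and a production monitor would use more robust drift statistics.

```python
# Minimal drift check (a sketch, not a production monitor): compare the share
# of online transactions seen at training time with the share in recent data.

def share_online(transactions):
    """Fraction of transactions whose channel is 'online'."""
    return sum(1 for t in transactions if t["channel"] == "online") / len(transactions)

def drift_alert(train_share, recent_share, tolerance=0.10):
    """Flag when the recent rate drifts more than `tolerance` (absolute) from training."""
    return abs(recent_share - train_share) > tolerance

# Hypothetical data: online share grew from 30% at training time to 55% today.
training = [{"channel": "online"}] * 30 + [{"channel": "card_present"}] * 70
recent = [{"channel": "online"}] * 55 + [{"channel": "card_present"}] * 45
print(drift_alert(share_online(training), share_online(recent)))  # True -> time to retrain
```

The same pattern extends to other monitored quantities, such as average transaction amount under inflation: track the training-time baseline, recompute on recent data, and trigger a model review when the gap exceeds a tolerance.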
Changes can also come from more direct sources, such as feedback from investigators, new business understanding, or a new fraud-prevention process that alters the landscape of risk assessment. For example, a new procedure that adds oversight to a previously high-risk activity, effectively reducing its risk, necessitates model updates so that events from this source are less likely to be flagged as fraud.
To improve the CRISP-DM framework, we need to include model maintenance: the connection between deployment and business understanding. Maintenance begins after model deployment and is initiated by renewed understanding of the environment in which the model operates.