Data Science Consulting - When Algorithms Work Too Well

July 5, 2017

Data Science Consulting Sometimes Predicts Too Well

 

In data science consulting, we enjoy sharing stories of successes and failures. It helps us learn and gain further insight into how to implement data science processes and policies (the hardest part of the job). One such example of a data science project that went a little too well involves a telecommunications company.

About five years ago, a telecommunications company wanted to be able to predict whether a customer would cancel their service. It was a great starting project and question for a new data science team to work on.

The executives loved the idea when the project manager pitched it. Who wouldn’t? It makes a lot of sense. If you can catch a customer right before they drop off and convince them to stay on your plan, then you keep the revenue. That means a bigger bonus for the executives, or higher dividends for shareholders.

So the analysts and data scientists were off to the races, developing a model that could help the company better predict when customers were planning to quit their service.

After about a year and a couple hundred thousand dollars’ worth of resource hours, the team had developed a successful model. It was 85-90% accurate. They then worked with process managers and customer service teams to develop playbooks and guides for approaching customers who might no longer be interested in the service. A few hundred thousand more dollars went into the project. Once they felt the algorithm was set and the team members were trained, they moved forward with implementation.

On the first day, the customer service representatives were given a list of people to call and check up on. They were supposed to figure out how they could keep the customer happy. A great motive.

 

Data Science Project in Action

The first call, according to the customer rep, went a little like this:

Customer Rep.: “Hello, my name is Joe. I work for ACME Telecommunications. We are working to better appeal to our customers, and we were wondering if you were having any problems with our service.”

Customer: “Yes, in fact. I have been meaning to call you guys for the last two months to cancel my service. Thank goodness you called. Since I have you on the line anyway, can you help me cancel my service?”

A good 30-40% of the calls were like this one. The algorithm worked amazingly well. It was actually predicting people who wanted to cancel the service. The only problem was that some of those people were determined to cancel, and the calls simply made it easier for them, costing the company monthly charges that might not have been cancelled for another 3-6 months. This led to a loss in profit, and the project was cancelled. In the end, it was a successful failure.

 

Conclusion

Sometimes, algorithms do exactly what you tell them to, yet the customers respond very differently. A common remark from the Go masters who faced off against Google’s AlphaGo was that they had never faced an opponent that thought so unlike a human. That’s just it: humans don’t always act the way we calculate. We can try, but it’s not always perfect. So when implementing a new data-driven strategy, it is important to incorporate the human aspect. Don’t let the numbers and metrics lead you away from the human factor.

10 Responses

  1. jgreenemi says:

    For what it’s worth, it’s awesome that the project helped to ease customer pain by giving them an easier time canceling their service instead of helping to bleed them for their dollars.

    • research@theseattledataguy.com says:

      If only all companies thought like that! Can you imagine the world we would be in if they actually put the customer over profit….

      Supposedly Amazon does it at the cost of their employees.

  2. research@theseattledataguy.com says:

    Comments will go through; for some reason something is wrong with Bluehost. Working on a fix! Probably won’t have time until tomorrow or Friday though!

  3. Ogaday says:

    What technical advice would you give to a new data science team looking at working on this exact problem (churn)? Are there specific approaches or paradigms that lend themselves more easily to the problem?

    • research@theseattledataguy.com says:

      This would depend somewhat on what type of data you have. However, if you are just looking to tell whether someone will cancel a service or quit a job, you are looking at a yes/no classification problem.

      Algorithms like logistic regression and decision tree classifiers work great, just to name a few. If you have a large enough dataset, you could use a neural network as well. (A rough code sketch of the logistic regression approach follows this thread.)

      Kaggle.com actually has a great test dataset for this that I have used. It is HR data, and you can try to build a model to predict whether someone is likely to quit.

      • Ogaday says:

        Thanks for the reply and the link to the Kaggle set, I’ll definitely have a look!

        Am I getting mixed up, or is assigning a probability of leaving to each customer / staff member (i.e. predicting employee x has a 32% chance of leaving) still a classification problem using a different type of model, or is it a different approach altogether?
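
The reply above frames churn as a yes/no classification problem solvable with logistic regression or a decision tree classifier. As a rough illustration only, here is a minimal sketch of that approach, assuming pandas and scikit-learn are available; the file name and column names are placeholders, not details from the original project.

# Minimal churn-classification sketch (illustrative; not the telecom project's actual code).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Placeholder customer table with a binary "churned" label (1 = cancelled, 0 = stayed).
df = pd.read_csv("customers.csv")
X = df[["monthly_charges", "tenure_months", "support_calls"]]  # placeholder feature columns
y = df["churned"]

# Hold out a test set so accuracy is measured on customers the model has not seen.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

# predict_proba returns a per-customer probability of churning rather than a hard yes/no,
# which is the probabilistic output the follow-up comment asks about.
print(model.predict_proba(X_test)[:5, 1])

Swapping the model line for sklearn.tree.DecisionTreeClassifier() gives the decision tree variant mentioned in the reply. Both models expose predict_proba, so assigning each customer a probability of leaving is still the same classification problem, just read off as probabilities instead of labels.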
