


Using Agile Data Science Methods to Manage Shifting Priorities

Andy Janaitis

[fa icon="calendar"] February 10, 2017


After working with a client’s data for over three weeks with no real progress, you finally hit a breakthrough. You’ve been searching for insights to identify which customers are most likely to become regular purchasers; the ultimate goal is to focus the company’s marketing efforts on this group in order to earn more revenue per advertising dollar. Studying customer purchase history has been fruitless. Suddenly, you find that customer geography seems to be a better predictor of future purchases. You have a few more weeks to explore that connection, so you should be able to deliver some real value for the client, right?

Read More [fa icon="long-arrow-right"]

Taking Action on Technical Success: A Fable of Data Science and Consequences

William Proffitt

[fa icon="calendar"] January 31, 2017


TalkThree’s new Analytics Director, Michael, has had a sobering month. What he had hoped would be his first major contribution to his company has fallen flat. His team created a model intended to address a pressing challenge for TalkThree: a steady stream of departing cellular phone service customers, known as “churners.” Their model predicts who is most likely to leave, and though he delivered it enthusiastically, it received an unexpectedly lukewarm reception from the customer retention team. The churn solution was a poorly conceived data product that didn’t reach its audience in a way that worked for them.

Read More [fa icon="long-arrow-right"]

Analytics 4.0 - Are You Ready For The Future of Competition?

Jordan Barr, Ph.D.

[fa icon="calendar"] January 13, 2017


The evolution of analytics is commonly divided into three distinct eras (i.e., 1.0, 2.0, and now 3.0), but a new era, Analytics 4.0, looms on the horizon. Before we are catapulted into the future, let us visit the present and past of analytics. As we will see, in Analytics 3.0 the most successful companies incorporate data analytics into all aspects of their business processes to gain and sustain a competitive advantage. But such advances are very recent.

Read More [fa icon="long-arrow-right"]

Empathy and Data Science: A Fable of Near-Success

William Proffitt

[fa icon="calendar"] December 28, 2016


Michael is an analytics director. This evening we find him frazzled. As he pulls out of the TalkThree parking lot, contemplating his next move, his radio whimpers a slow rock ballad. Michael let down the executives of his company today. The recipients of his first project aren’t experiencing the success he loudly promised, and his team has ended up back at the drawing board. How did he get here? Michael failed to consider his audience. Let’s rewind the story.

Read More [fa icon="long-arrow-right"]

Operationalizing Analytics Deployment with SPSS Collaboration & Deployment Services

Joy McKinney

[fa icon="calendar"] December 16, 2016


As analytics professionals, we can spend days or weeks building and validating predictive or descriptive models. However, the success of an analytics project does not depend on model building alone. To truly succeed, we need to deploy our models, integrate them into workflows, and use them to change business processes. Operationalizing these analytic assets can be the most rewarding part of the process, but how do we know if we are successful? How can we ensure that, once analytics are deployed in production, the environment will remain stable and continue to produce the desired results?

Read More [fa icon="long-arrow-right"]

DoD’s Third Offset – Science Fact or Fantasy?

Jordan Barr, Ph.D.

[fa icon="calendar"] December 6, 2016


An offset strategy is critical to the US Department of Defense’s plan to gain and maintain a strategic defense advantage over our enemies, real or potential. Since the beginning of the Cold War, the vision-casting and implementation of these offsets have followed three distinct phases. The first was nuclear, the second depended on precision-guided missiles, and the third and most ambitious, currently underway, relies on emerging technologies centered on human-machine integration.

Read More [fa icon="long-arrow-right"]

How Do You Know Your Predictive Model Results Are Valid?

Will Goodrum

[fa icon="calendar"] November 28, 2016


Across our diverse portfolio of clients, we have repeatedly seen the value of getting an early second opinion on the validity of predictive analytics models. Why bring in an outside expert? Even well-intentioned, highly capable, technically sophisticated data analytics initiatives can fail if they lack proper context and support from within the business. Having an accurate model is not good enough. Are you confident that you’re on the right track with your analytics initiatives? What changes does the model demand of the organization? Most importantly, how do you know the model results are valid?

Read More [fa icon="long-arrow-right"]

Finding Balance: Model Accuracy vs. Interpretability in Regulated Environments

Will Goodrum

[fa icon="calendar"] November 4, 2016


At Elder Research, we are a “whiteboard on every wall” kind of office. Inspiration happens spontaneously, and any conversation, however casual, can easily drift into an involved discussion that uncovers the hidden route to move a project forward. Recently, one of these whiteboard discussions took place between our Lead Data Scientist, Mike Thurber, and Founder, Dr. John Elder IV. The conversation started innocently enough over lunch but then stretched over the next several hours. Although the tone never rose above a murmur, this was clearly a vibrant discussion. What was the subject of this whiteboard-fueled philosophizing? It was another chapter in the familiar balancing act between accuracy and interpretability in predictive modeling.

Read More [fa icon="long-arrow-right"]

Data Transparency - What Does the OPEN Government Data Act Mean for U.S. Citizens?

Bryan Jones

[fa icon="calendar"] October 27, 2016


The OPEN (Open, Public, Electronic and Necessary) Government Data Act, or S.2852, as stated in a Data Coalition press release, “directs all federal agencies to publish their information as machine-readable data, using searchable formats.” Essentially, this means government data would be open by default for use or reuse by the public, the private sector, non-profits, or other government agencies. For those providing analytics consulting services to government agencies, this data transparency is very good news, but what is the reality of implementation if this act is signed into law?

Read More [fa icon="long-arrow-right"]

Product Usage Analytics: Transforming Software Reliability and Customer Experience

Anna Godwin

[fa icon="calendar"] October 11, 2016


Among software companies, frequent version releases are a promising way to improve the customer experience and prevent churn. Yet, now more than ever, determining software reliability between releases remains a major challenge for these companies. Is there a way to quantify the software performance that shapes the customer experience? The answer is yes. Elder Research has developed a powerful approach to measuring software reliability by leveraging log data and Product Usage Analytics.

Read More [fa icon="long-arrow-right"]