Taking a data-driven approach to pricing to optimize profit

Research around pricing has consistently shown its importance. Deloitte found that, on average, a 1 percent price increase translates into an 8.7 percent increase in operating profits (assuming no loss of volume, of course). Yet Deloitte also estimated that up to 30 percent of the thousands of pricing decisions companies make each year fall short of delivering the best price. That's a lot of money left on the table!
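To see why the math works out that way, here is a minimal sketch of the operating-leverage arithmetic behind that figure. The 11.5% operating margin is an assumption chosen so the numbers line up with Deloitte's average; actual margins vary by business.

```python
def profit_lift_from_price_increase(revenue, operating_margin, price_increase):
    """Percent change in operating profit from a price increase,
    assuming costs and volume stay constant (so the extra revenue
    drops straight to the bottom line)."""
    profit_before = revenue * operating_margin
    profit_after = profit_before + revenue * price_increase
    return (profit_after - profit_before) / profit_before

# With an 11.5% operating margin, a 1% price increase lifts profit ~8.7%.
lift = profit_lift_from_price_increase(
    revenue=1_000_000, operating_margin=0.115, price_increase=0.01
)
print(f"{lift:.1%}")  # ≈ 8.7%
```

The thinner your margin, the bigger the multiplier, which is exactly why small pricing mistakes are so expensive.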

Too often, pricing is an afterthought, and pricing decisions are made by 'gut feeling' based on a quick look at competitor websites. Ideally, pricing decisions involve determining the value to the customer relative to the competition, factoring in pricing power and pricing strategy. (For more on this, see 'Pricing Power. What it is. How to get it.' from our partners at Ibbaka.) Understanding value requires data, and that data can change over time, so even the best-thought-out pricing model can soon be out of date. The solution is data science plus data integration: data from many different sources can be connected to value and pricing models, and when these get out of alignment due to changes in the market or competitor actions, alerts can be triggered.
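As a hypothetical sketch of that alerting idea (the function and field names are illustrative, not a real API), integrated market data can be checked against the prices a value model recommends, flagging products where the two have drifted apart:

```python
def check_price_alignment(products, threshold=0.10):
    """Yield an alert for each product whose modeled price deviates
    from the observed market price by more than `threshold`
    (expressed as a fraction of the market price)."""
    for p in products:
        deviation = abs(p["model_price"] - p["market_price"]) / p["market_price"]
        if deviation > threshold:
            yield {"sku": p["sku"], "deviation": round(deviation, 3)}

products = [
    {"sku": "A-100", "model_price": 95.0, "market_price": 100.0},   # 5% off: fine
    {"sku": "B-200", "model_price": 130.0, "market_price": 100.0},  # 30% off: alert
]
print(list(check_price_alignment(products)))
```

With fresh market data flowing in automatically, a check like this can run on every load rather than waiting for someone to notice the model has gone stale.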

When we think about B2B sales analysis, the things that initially come to mind usually involve reporting on CRM data to understand sales by product, region, deal velocity, and the like. Whereas the most innovative B2C companies take full advantage of the mountains of valuable data they have, B2B companies have been a bit slower to adopt this approach. After identifying customer segments, data from not only CRM but also ERP, third-party economic data sources, and others can be used to understand past purchases, prices, preferences, and more to determine the optimal price. As mentioned, even a 1% increase can have a huge impact.

Automation is king

One of the critical keys to getting more out of your data is automating, or even better eliminating, as many mundane processes as possible. This allows organizations to focus on the test-and-evaluate part of the pricing process, not on configuring infrastructure or monitoring and maintaining data flows. It also allows the people doing the analysis to take advantage of larger and more diverse data sets and make adjustments without a huge headache.

Sell pricing internally

As someone who has spent years in sales, I can tell you that pricing is so much more than just a number. It's led to heated discussions around many tables in innumerable offices. It's one of the big "hows" sales reps have to keep in mind when trying to close deals. Providing B2B sales organizations with the information and context to help reps understand the factors behind pricing is mission critical. Keeping reps educated on this topic builds confidence, and that translates to clients who better understand how the cost of the product translates to business value. Introducing optimal pricing earlier in the sales cycle can also speed up deal closings and increase win rates.


Test and evaluate

It's important to remember that pricing is not a static, set-it-once type of event. It's important to try, test, and learn. This process is so much better when we can use data to influence and validate the decisions made.

Pricing is an important part of the foundation for strong sales and marketing organizations and using data effectively as part of your strategy can reveal key insights to make sure you get it right.

Want to learn more?

Join us for our upcoming webinar with Ibbaka, where we'll dive deeper into data-driven pricing strategies!


Holiday Gift Ideas for the Data Geek in Your Life

I can't believe it's already been a year since we covered some great gift ideas for data people! We're back with more last-minute ideas; some may look familiar, albeit bigger (or smaller) and better, while others are new arrivals. Whatever you're looking for, we hope at least one of these ideas will help you find something that really excites the techie / data lover in your life this holiday season!


Amazon Echo Second Gen (Dot)

How to take an agile (Minimum Viable Product) approach to analytics projects

By now, the idea of agile development and a Minimum Viable Product, or MVP, is prevalent. The problem is, while most people have the minimum part down, they often haven't mastered the viable part, especially when it comes to analytics.

To quickly recap: a Minimum Viable Product is an approach focused on creating a product with just enough features to solve a particular problem. This first iteration is used to collect user feedback and develop the complete set of features for the final product to be delivered.

That's all well and good, but you may be wondering what the benefits of this approach are when it comes to analytics projects...

Learning, and learning quickly

Is your solution actually delivering the value you are trying to create? In a typical project, you may be months down the road before what you're building is actually in front of users, which makes it difficult to determine its viability for solving the business case. The whole point is to prove or disprove your initial assumptions sooner.

  • What part of your users' current process is really frustrating them?

  • Are the analytics we designed actually guiding them through their workflow and making their life better?

By getting a usable set of features in front of users earlier in the process, we can collect feedback and determine whether we are in fact on the right track.

Is there untapped value in your data?


Embedded analytics, data products, data monetization, big data… there are plenty of buzzwords we can use to "categorize" the idea.

IDC reports that the big data and business analytics market grew at a rate of over 11% in 2016 and will grow at a compound annual growth rate of 11.7% through 2020. This rapidly growing area of investment can't be for naught… can it?

Let’s look beyond the hype at some specific approaches for extracting additional value (and ultimately dollars) from your data.

According to Gartner, Data Monetization refers to using data for quantifiable economic benefit.

The first thing that may come to mind is the outright selling of data (via a data broker or independently). Although a potentially viable option, with increased data privacy policies and the sheer amount of data needed to be successful with this approach, it can be quite limiting.

There are many other approaches to monetizing your data, such as:

How to build data products that increase user engagement

Think about all the social media platforms out there: which ones do you use the most, and why? I'm not talking about giving your LinkedIn profile a facelift before you put in a job application or searching for a long-lost friend on Facebook; which of these apps are actually driving user engagement? For me, it's Instagram; the interface is easy to navigate, and more than once I've found myself re-opening it right after I've closed it. Have you thought about why many of these platforms have exploded in user engagement, with many people posting to their Twitter or Facebook accounts multiple times per day? Meanwhile, according to a recent Gartner blog, the adoption rate for some of the BI tools in their Magic Quadrant is at a low but not too surprising 21%. Are people sick and tired of "all that data," or is there something more sinister at work?

We’ve thought a lot about social media platforms (and other apps) that seem to drive such high user engagement and put together a few thoughts on how you can do the same within your data product to ensure you keep users coming back for more. Before we reveal the secret sauce for building engagement in your data products, let’s take a quick look at how many analytics teams approach the problem.

Too often, teams building an analytics product for their customers approach the project in the wrong way, and the story is oh so familiar. As we covered in a recent blog, it means taking the reports that exist in an Excel spreadsheet and web-ifying them in a cloud BI tool. It's essentially surfacing the exact same information as before, but now with shiny new charts and graphs, more color choices, and some interactivity. After the initial excitement over the new toy in the room wears off, the latter solution isn't doing any better than the former at driving engagement, let alone delivering "insights" or creating a new revenue stream.

One of the big reasons customers aren't lining up to write a check for the latest, greatest data product a vendor has rolled out is that the analytics team failed to make it engaging. Simply put, product teams need to let users know "hey, check this out," "hey, we've got some important information for you," and "hey, you should come back and see us." Most teams do the second part, the "we've got insights" piece, but they fail to give users a reason to keep coming back for more. These are essential elements of establishing engagement; not building them in is like skipping the foundation of a new skyscraper. "It's like when you see a skyscraper; you're impressed by the height, but nobody is impressed by the foundation. But make no mistake, it's important," said Akshay Tandon, Head of Strategy & Analytics at LendingTree.

Want to avoid the killer mistakes of failing to build engagement into your data product? Here’s how:

Creating Intelligent Narratives with Narrative Science & Keboola

Intelligent Narratives are the data-driven stories of the enterprise. They are automated, insightful communications packed with the information that matters most to you—specific to your role or industry—written in conversational language, and at machine scale. By giving your employees and your customers a richer, more nuanced understanding of your business, they can make more informed decisions and realize their greatest potential.

Narrative Science is the leader in advanced natural language generation (Advanced NLG) for the enterprise. Quill™, its Advanced NLG platform, learns and writes like a human, automatically transforming data into Intelligent Narratives—insightful, conversational communications packed with audience-relevant information that provide complete transparency into how analytic decisions are made.

As we all know, one of the biggest barriers to successful data projects is having the right data in the right place; that's why Narrative Science and Keboola have partnered to bring the next generation of analytics to you faster. Automate data workflows, reduce the time and complexity of implementations, and start gaining new insights now! Leverage this app, powered by Narrative Science, to produce machine-generated narratives from data ingested by Keboola.

Freethink + Keboola: Understanding cross-channel video analytics

Video is one of the hottest trends in digital marketing. YouTube, which has grown more than 40 percent since last year, reaches more 18-49-year-old viewers than any of the cable networks and has a billion users watching hundreds of millions of hours of video every day.

Freethink, a modern media publisher, uses online video to tell the stories of passionate innovators who are solving some of humanity's biggest challenges by thinking differently. While telling important stories is their primary focus, data underlies all of their decisions. As a publisher, they need to understand how well each piece of content performs, as well as how that content performs across platforms (they currently publish videos on their website, YouTube, and Facebook).

Prior to working with Keboola, collecting and combining data for cross-channel video analysis was a time-consuming, manual effort (particularly because Facebook has separate APIs for tracking page content and promoted content). In addition, this process made performing time-over-time analyses a real challenge.

The goal was to provide a dashboard solution that gave the team better visibility into their data. Keboola Connection (KBC) overcame these hurdles by leveraging existing API connections to get data from Facebook and YouTube. In addition, Keboola utilized its partnership with Quintly (social media analytics) to pick up cleaned and verified data from their API. All this data is combined with additional sources, including Google Sheets, which provide extra metadata for advanced reporting and segmentation. This blended data enables universal reporting across platforms, giving a 360-degree picture of each piece of content.
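The blending step can be pictured with a small sketch (this is an illustration of the general idea, not Keboola's actual pipeline; the field names and the "series" metadata are made up for the example): per-platform metrics are unified on a shared content key, then enriched with spreadsheet metadata, so every video can be reported on cross-channel.

```python
def blend_metrics(youtube_rows, facebook_rows, metadata):
    """Combine per-platform video metrics into one record per video,
    then attach extra metadata (e.g. tags kept in Google Sheets)."""
    combined = {}
    for source, rows in (("youtube", youtube_rows), ("facebook", facebook_rows)):
        for row in rows:
            rec = combined.setdefault(row["video_id"], {"views": 0, "platforms": []})
            rec["views"] += row["views"]          # cross-channel total
            rec["platforms"].append(source)       # where it was published
    for vid, rec in combined.items():
        rec.update(metadata.get(vid, {}))         # enrich with sheet metadata
    return combined

result = blend_metrics(
    [{"video_id": "v1", "views": 1200}],
    [{"video_id": "v1", "views": 800}],
    {"v1": {"series": "example-series"}},
)
print(result["v1"]["views"])  # 2000, summed across platforms
```

In practice the heavy lifting (API extraction, cleaning, scheduling) happens upstream; the point is simply that once everything lands keyed on the same content ID, cross-channel totals become a trivial aggregation.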


Freethink now has all of their data populated in Redshift, where Chartio connects to create beautiful dashboards for reporting. They are able to go into the Keboola platform and manually adjust and run configurations to get exactly the data they need. The biggest gains have been in time saved, being able to show change over time, and freeing the team up to focus on more complicated analyses. It has also opened up data access to the broader team, promoting collaboration and data-driven decision making.


"Keboola really helped simplify and automate the process of collecting and combining data. Working together, Chartio and Keboola Connection deliver a full stack solution for modern analytics, taking full advantage of the cloud. I’m able to give my team better insights into our performance and make better decisions, quicker."

-Brandon Stewart, Executive Editor at Freethink


Thanks,

Colin


The Best Tool for Your Data Product Journey? A Good Map


For anyone creating an analytics product, the pressures of engaging customers and generating revenue while protecting your core product and brand can be overwhelming, especially when aiming to hit so many goals on the horizon:

  • Does it target users effectively?

  • Will it guide users to a solution to their business problem?

  • Can it scale to many customers?

  • Will it deliver real results that customers are willing to pay for?

Fortunately, we've been there, done that, and understand what it takes to build a great data product. That's why we've created a map to help you navigate your way to success, built on the experience of countless voyagers who have sailed the same seas before you: the Data Product Readiness Assessment.

Why your data product needs a good elevator pitch


In recent years, a term started appearing across the technology world: "data monetization," i.e., turning your data into dollars (as we mentioned in a previous post, you can Find Gold in Your Data!). Businesses reacted to the hype, started spending on every solution under the sun, and then… nothing. Nada. Zilch. In many cases the revenues never materialized, and frustrated buyers blamed the whole concept of data monetization for the lack of results. The problem is, you've got to avoid certain mistakes... and they're silent killers.

In truth, data products are a great opportunity for most businesses to engage customers and create new streams of revenue. Untapped, dormant data can, when refined properly, become a crucial resource for your company. Fortunately, we've worked on many analytics projects ourselves, have seen these mistakes made, and have put together a guide to help you avoid making them yourself.

To provide some quick insight, we thought we’d share one of the tips we’ve found most helpful when starting to create an analytics product.

Creating an elevator pitch

What is "modern" business intelligence anyway...?


Last week, Tableau hosted a session in Portland on the evolution of business intelligence that I had the chance to attend. Although I did review their Top 10 Trends in BI when they were released earlier this year, the presentation and discussion ended up being pretty interesting. A few of the topics really resonated with me, and I thought we could dig into them a bit more.

For starters:

Modern BI becomes the new normal

The session (and report) kick off by highlighting Gartner's Business Intelligence Magic Quadrant and the shift away from IT-centric BI over the last 10 years. Regardless of who's discussing the trends (Gartner, Tableau, or otherwise) and if or when they come to fruition, it's important to dig deeper. Reports like Gartner's are good guideposts for trends and technologies to examine. (I saw that point made somewhere recently; comment for credit.)

That said, I think we can agree that the overall landscape of technology and the way that organizations of all sizes are taking advantage of it in the domain of business intelligence has improved over the last decade.

So does that mean modern BI has truly arrived?

Although some ideas come to mind when I hear the phrase…

What is modern business intelligence?  

And do we all think of the same things when we discuss it?

