Keboola and Slalom Consulting Team up to host Seattle’s Tableau User Group

On Wednesday, May 18th, Keboola’s Portland and BC teams converged in Seattle to co-host the city’s monthly Tableau User Group (SeaTUG) alongside its regular hosts and organizers, Slalom Consulting. Together we put on a full evening of discussion around how to solve complex Tableau data problems using KBC. With 70+ people in attendance, Seattle’s Alexis Hotel was buzzing with excitement!

The night began with Slalom’s very own Anthony Gould, consultant, data nerd and SeaTUG host extraordinaire, welcoming the group and getting everyone riled up for the night’s contest: awarding the attendee whose SeaTUG-related tweet got the most retweets! He showed everyone how we used Keboola Connection (KBC) to track that data, and let them know the tally would be updated at the end of the night, when prizes would be handed out.

Cleaning Dirty Address Data in KBC

There is an increasing number of use cases and data projects for which geolocation data can add a ton of value - e-commerce and retail, supply chain, sales and marketing, and more. Unfortunately, one of the most challenging asks of any data project is relating geographic information to the various components of the dataset. On a more positive note, KBC’s easy integration with Google apps of all kinds lets users leverage Google Maps to add geocoding functionality. Since we have so many clients taking advantage of geocoding capabilities, one of our consultants, Pavel Boiko, outlined the process of adding this feature to your KBC environment. Check it out!
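Pavel’s walkthrough covers the KBC configuration; purely as an illustration of the underlying call, here is a minimal Python sketch against the Google Maps Geocoding API. The API key is a placeholder and the error handling is deliberately thin - this is a sketch, not the exact code the KBC component runs:

```python
# Minimal geocoding sketch using the Google Maps Geocoding API.
# Simplified for illustration: no retries, rate limiting, or batching,
# all of which a production pipeline would need.
from typing import Optional, Tuple

import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

def geocode(address: str, api_key: str) -> Optional[Tuple[float, float]]:
    """Return (latitude, longitude) for an address, or None if no match."""
    resp = requests.get(GEOCODE_URL, params={"address": address, "key": api_key})
    resp.raise_for_status()
    results = resp.json().get("results", [])
    if not results:
        return None
    loc = results[0]["geometry"]["location"]
    return loc["lat"], loc["lng"]

# Hypothetical usage:
# lat, lng = geocode("1124 Pike St, Seattle, WA", api_key="YOUR_KEY")
```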

Anatomy of an Award-Winning Data Project Part 3: Ideal Problems, Not Ideal Customers

Hopefully you’ve had a chance to read about our excitement and pride upon learning that two of our customers had won big awards for the work we’d done together. To jog your memory: Computer Sciences Corporation (CSC)’s marketing team won the ITSMA Diamond Marketing Excellence Award as a result of the data project we built together, using KBC to bridge together 50+ data sources and push those insights out to thousands of CSC employees. To catch up on what you missed, or to read it again, revisit Part 1 of our Anatomy of an Award-Winning Data Project series.

Additionally, the BI team at Firehouse Subs won Hospitality Technology’s Enterprise Innovator Award for its Station Pulse dashboard, built on a KBC foundation. The dashboard measures each franchise’s performance on 10 distinct metrics, pulling data from at least six sources. To catch up on what you missed, or to read it again, revisit Part 2 of our Anatomy of an Award-Winning Data Project series.

We’re taught that most businesses have a “typical” or “ideal” customer. When crafting a marketing strategy or explaining your business to partners, customers and your community, the question comes up repeatedly - and we don’t really have a ready-made answer. A data-driven business can be in any industry, and the flexibility and agility of the Keboola platform make it, by its very nature, data-source and use-case agnostic.

And so, when these two customers of ours both won prestigious awards highlighting their commitment to data innovation, it got us thinking. These two use cases are pretty different. We worked with completely different departments, different data sources, different end-users, different KPIs, etc. And yet both have been successful, award-winning projects.

We realized that perhaps the question of an ideal customer isn’t really relevant for us. Perhaps we’d been asking the wrong question all along. We can’t define our target customer, but we can define the target problem that our customers need help solving.

Anatomy of an Award-Winning Data Project Part 2: Firehouse Subs Station Pulse BI Dashboard


As we reported last week, we are still beaming with pride, like proud parents at a little league game or a dance recital. Not one, but two, of our customers won big fancy awards for the work we did together. The concept of a data-driven organization has been discussed and proposed as an ideal for a while now, but how we define and identify those organizations is certainly still up for debate. We’re pretty confident that the two customers in question - Computer Sciences Corporation (CSC) and Firehouse Subs - would be prime contenders. These awards highlight their commitment to going further than their industry counterparts to empower employees and franchisees to leverage data in new and exciting ways.

If you missed last week’s post with CSC’s Chris Marin, check it out here. Today, let’s learn more about Firehouse Subs’ award-winning project. In case you don’t know much about Firehouse Subs, let me bring you up to speed: the sandwich chain started in 1994 and, as of March 2016, has more than 960 locations in 44 states, Puerto Rico and Canada. Firehouse Subs is no stranger to winning awards, either. In 2006, KPMG named it “Company of the Year,” and it has also been recognized for its commitment to community service and public safety through the Firehouse Subs Public Safety Foundation®, created in 2005.


Now let’s hear from our project champion and our main ally at Firehouse Subs, Director of Reporting and Analytics, Danny Walsh.

Anatomy of an Award-Winning Data Project Part 1: CSC and Marketing Analytics

Here at Keboola, we take pride in working closely with partners and customers to ensure that each project is a success. Typically we’re there from the beginning: to understand the problem the client needs to solve; to help them define the scope and timeline of the implementation; to provide the resources needed to get buy-in from the rest of their team; to offer alternative perspectives and options when mapping out the project; and to be their ally and guide throughout every step of the process. With all that work and all that dedication, it turns out we develop quite a soft spot for both our clients and their projects.

We’ve got skin in the game, so when one of our clients receives an award because of the project we worked on together, we get pretty excited. And when two clients receive an award because of our work together, well, then we’re downright ecstatic and ready to celebrate!

At the end of 2015, two of our customers were honored for their commitment to data innovation: Firehouse Subs® was awarded Hospitality Technology’s Enterprise Innovator Award, and the digital marketing team at Computer Sciences Corporation (CSC) received the ITSMA Diamond Marketing Excellence Award.

Since new partners and clients often ask us what components and environment cultivate a successful data project, we thought we’d take this exceptional opportunity to ask our customers themselves: Danny Walsh, Director of Reporting and Analytics at Firehouse Subs, and Chris Marin, Senior Principal, Digital Marketing Platform & Analytics at CSC.

Over the next couple of weeks, we’ll share each of their stories and explain how we feel these separate use cases in two distinctly different industries are reflective of what we at Keboola view as the ideal conditions for creating a wildly successful - award-winning even - data project.

Empowering the Business User in your BI and Analytics Environment


There’s one trend on Gartner’s radar that hasn’t changed much over the last few years, and that’s the increasing move toward a self-service BI model. Gone are the days of your IT or analytics department being a report factory. And if those days aren’t gone for you, it’s time to make some substantive changes to your business intelligence environment. When end-users are forced to rely on another department to deliver the reports they need, the entire concept of being a “data-driven” organization goes right out the window.

So other than giving your users access to ad hoc reporting capabilities, how do you empower the user?

Bi-Modal BI: Balancing Self-Service and Governance


The age-old conflict: IT needs centralization, governance, standards and control; on the other side of the coin, business units need the ability to move fast and try new things. How can we give lines of business access to the data they need for their projects so they can spend their time focused on discovering new insights? Typically they get stuck in a bottleneck of IT requests, or spend 80% of their time on data integration and preparation. Neither group is particularly excited about that work, and I don’t blame them. For the analyst, it increases the complexity of their tasks and seriously raises the technical knowledge requirements. For IT, it’s a major distraction from their main purpose in life, an extra thing to do. Self-serve BI set out to destroy the backlogged “report factories,” only to replace them with “data stores,” which are sadly even less equipped for the job at hand. Either way, the result is a painfully inefficient process, straining both ends of the value chain in any company that embarks on the data-driven journey.

The Bi-Modal BI Answer?

An organization’s ability to effectively extract value from data and analytics while maintaining a well-governed source of truth is the difference between competitive advantage on the one hand and sunk costs and missed opportunities on the other. How can we create an environment that provides the agile data access business users need while still maintaining sound data governance? Gartner has proposed a bi-modal IT strategy as one answer. A big challenge with bi-modal IT is that it pushes IT management to divide their efforts between IT’s traditional focus and a more business-focused, agile methodology.

The DBA and Analyst Divide

Another major challenge in data access comes from the separation between DBAs and business users. Although the technical side may have the expertise needed to implement ETL projects, they often lack the business domain knowledge needed to make the correct assumptions about context and how the data should be interpreted. With so many projects competing for resources, we shouldn’t have to assign a DBA to every one of them. On the flip side of the coin, data analysts and scientists want the right data for their tools of choice, and they want it fast. Even though there is a growing set of data integration tools that let individual business units create and maintain their own data projects, this typically requires a lot of manual data modeling and can lead to siloed data or inconsistent metrics.

Instead of controlling all of BI, IT can enable the business to develop their analytics without sacrificing control and governance standards.  So how can we get the right data in the hands of people who understand and need it in a timely manner?

3 Critical Steps to Evangelize the New in Business Intelligence and Analytics

The rapid evolution in business intelligence and analytics capabilities is both exhilarating and overwhelming. 

How do you protect the stability of the work you’ve already done, while evangelizing experimentation, exploration and progress within your organization?

We’ve got a few tips for you.

Keboola: Data Monetization Series - How data science can help


Having access to the right data in a clean and accessible format is the first step (or series of steps) leading up to actually extracting business value from your data.  As much as 80% of the time spent on data science projects involves data integration and preparation.  Once we get there, the real fun begins.  With the continued focus on big data and analytics to drive competitive advantage, data science has been spending a lot of time in the headlines.  (Can we fit a few more buzzwords into one sentence?)

Let’s take a look at a few data science apps available on our platform and how they can help with our data monetization efforts.

Basket analysis

One of the most popular algorithms is market basket analysis. It provides the power behind things like Amazon’s product recommendation engine, identifying that if someone buys product A, they are likely to buy product B. More specifically, it isn’t about surfacing products placed next to each other on the site that get bought together, but rather products that aren’t placed next to each other. This can be useful for improving in-store and on-site customer experience, targeted marketing and even the placement of content items on media sites.
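To make that concrete, here is a toy sketch of basket analysis in Python using the open-source mlxtend library. The data and thresholds are invented for illustration - this is the standard apriori/association-rules pattern, not the actual app that ships with KBC:

```python
# Toy market basket analysis with mlxtend (illustrative data and thresholds).
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# One row per transaction, one boolean column per product.
baskets = pd.DataFrame([
    {"bread": True,  "butter": True,  "jam": False},
    {"bread": True,  "butter": True,  "jam": True},
    {"bread": False, "butter": True,  "jam": False},
    {"bread": True,  "butter": False, "jam": True},
])

# Itemsets that appear in at least half of all baskets.
itemsets = apriori(baskets, min_support=0.5, use_colnames=True)

# "If A is in the cart, B is likely too" rules, ranked by lift.
rules = association_rules(itemsets, metric="lift", min_threshold=1.0)
print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])
```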

Anomaly detection

Anomaly detection refers to identifying events that don’t conform to the pattern expected from the data. This could take the form of fraud detection, identifying medical problems or even detecting subtle changes in consumer buying behavior. Looking at that last example, anomaly detection could help us identify new buying trends early and take advantage of them. For an eCommerce company, you could flag anomalies in carts created per minute, a spike in cart abandonment, an odd shift in orders per minute or a significant variance in any number of other metrics.
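As a simple illustration of the idea (a baseline sketch, not the production logic behind any particular KBC app), one common approach is to flag a metric such as orders per minute whenever it drifts several standard deviations away from its recent rolling average:

```python
# Baseline anomaly detection via rolling z-score (illustrative thresholds).
import pandas as pd

def flag_anomalies(series: pd.Series, window: int = 60,
                   threshold: float = 3.0) -> pd.Series:
    """Mark points deviating more than `threshold` standard deviations
    from the trailing `window`-point mean."""
    mean = series.rolling(window).mean()
    std = series.rolling(window).std()
    return ((series - mean) / std).abs() > threshold

# Hypothetical usage on a per-minute order count:
# orders = pd.read_csv("orders.csv", index_col="minute")["order_count"]
# alerts = orders[flag_anomalies(orders)]
```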

Using a Data Prep Platform: The Key to Analytic Product Agility


Guest post by Kevin Smith

For a product owner, one of the biggest fears is that the product you’re about to launch won’t get the adoption necessary to achieve success. This might happen for a variety of reasons; two of the most common are a lack of fit to the customers’ needs and confusing design (it’s just too hard to use!).

To combat the possibility of failure, many product owners have adopted the “agile” approach, building products that have enough functionality to meet minimum needs but are still lean enough to facilitate easy change.

As a data product builder — someone building customer-facing analytics that will be part of a product — the needs are no different but achieving agility can be a real challenge. Sure, every analytics platform provider you might consider claims that they can connect to any data, anywhere, but this leaves a lot of wiggle room. Can you really connect to anything? How easy is it? How hard is it to change later? What about [insert new technology on the horizon here] that I just heard about? If you want to build an agile data product, you've got a tough road ahead... as I found out.

Recently I started working on the analytics strategy for a small start-up firm focused on providing services to large enterprises. As they delivered their services, they wanted to show the results in an analytics dashboard instead of the traditional PowerPoint presentation. It would be more timely, easier to deliver, and could be an on-going source of revenue after an engagement was completed. As I spoke with the team, a few goals surfaced:

  1. They wanted to buy an analytics platform rather than build from scratch. The team realized that they would be better off developing the methodology that would differentiate them from the competition instead of creating the deep functionality already provided by most analytics platforms.
  2. The system had to be cost-effective both to set up and to operate. As a start-up, there simply wasn't the cashflow available for costly analytics platforms that required extensive professional services to get started.
  3. The product had to be flexible and "configurable" by non-Engineers. With little to no budget for an Engineering staff, the team wanted a BI platform that could be configured easily as customer needs changed.
  4. Up and running quickly. This company had customers ready to go and needed a solution quickly. It would be essential to get a solution in front of the customers NOW, rather than try to migrate them to a new way of operating once the dashboards were ready. Changes would certainly be needed post-launch, but this was accepted as part of the product strategy.

None of this seemed to be impossible. I've worked on many data products with similar goals and constraints. Product teams always want to have a platform that's cost-effective, doesn't strain the technical capabilities of the organization, is flexible, and is launched sooner rather than later. It was only after a few more conversations that the problem arose: uncertain data sources.

Most data-driven products work like this: you've got a workflow application such as a help desk application or an ordering system that generates data into a database that you control. You know what data is flowing out of the workflow application and, therefore, you understand the data that is available for your analytics. You connect to your database, transform the data into an analytics-ready state, then display the information in analytics on a dashboard.

The situation here was different. As a services company, this business had to operate in a technology environment dictated by the customer. Some customers might use Salesforce, some might use SugarCRM. Still others might use Zoho or one of the myriad other CRM platforms available. Although the team would structure the dashboards and analytics based on their best practices and unique methodology, the data driving the analytics product would differ greatly from customer to customer.
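To make the challenge concrete, the crux is a mapping layer that normalizes whichever CRM a customer runs into one analytics-ready schema. The sketch below is hypothetical: the field names are invented for illustration and are not the real Salesforce or SugarCRM payloads.

```python
# Hypothetical normalization layer for multi-CRM data sources.
# Field names are illustrative, not actual CRM API schemas.
FIELD_MAPS = {
    "salesforce": {"Id": "deal_id", "Amount": "amount", "StageName": "stage"},
    "sugarcrm":   {"id": "deal_id", "amount": "amount", "sales_stage": "stage"},
}

def normalize(record: dict, source: str) -> dict:
    """Rename source-specific fields to the common schema; drop the rest."""
    mapping = FIELD_MAPS[source]
    return {common: record[raw] for raw, common in mapping.items() if raw in record}

print(normalize({"Id": "006x", "Amount": 1200, "StageName": "Closed Won"},
                "salesforce"))
# {'deal_id': '006x', 'amount': 1200, 'stage': 'Closed Won'}
```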