Creating Intelligent Narratives with Narrative Science & Keboola

Intelligent Narratives are the data-driven stories of the enterprise. They are automated, insightful communications packed with the information that matters most to you—specific to your role or industry—written in conversational language, and at machine scale. Because they give your employees and your customers a richer, more nuanced understanding of your business, Intelligent Narratives help them make more informed decisions and realize their greatest potential.

Narrative Science is the leader in advanced natural language generation (Advanced NLG) for the enterprise. Quill™, its Advanced NLG platform, learns and writes like a human, automatically transforming data into Intelligent Narratives—insightful, conversational communications packed with audience-relevant information that provide complete transparency into how analytic decisions are made.

As we all know, one of the biggest barriers to successful data projects is having the right data in the right place; that's why Narrative Science and Keboola have partnered to bring the next generation of analytics to you faster. Automate data workflows, reduce the time and complexity of implementations, and start gaining new insights now! Leverage this app, powered by Narrative Science, to produce machine-generated narratives of data ingested by Keboola.

Freethink + Keboola: Understanding cross-channel video analytics

Video is one of the hottest trends in digital marketing. YouTube, which has expanded more than 40 percent since last year, reaches more 18-49 year-old viewers than any of the cable networks and has a billion users watching hundreds of millions of hours every day. 

Freethink, a modern media publisher, uses online video to tell the stories of passionate innovators who are solving some of humanity’s biggest challenges by thinking differently. While telling important stories is their primary focus, data underlies all of their decisions. As a publisher, they need to understand how well each piece of content performs, as well as how that content performs across platforms (they currently publish videos on their website, YouTube, and Facebook).

Prior to working with Keboola, collecting and combining data for cross-channel video analysis was a time-consuming, manual effort (particularly because Facebook has separate APIs for tracking page content and promoted content). This process also made performing time-over-time analyses a real challenge.

The goal was to provide a dashboard solution that gave the team better visibility into their data. Keboola Connection (KBC) overcame the manual effort by leveraging existing API connections to get data from Facebook and YouTube. In addition, Keboola utilized its partnership with Quintly (social media analytics) to pick up cleaned and verified data from their API. All this data is combined with additional sources, including Google Sheets, which provide extra metadata for advanced reporting and segmentation. This blended data enables universal reporting across platforms and a 360-degree picture of each piece of content.
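The core of a cross-channel blend like this is joining per-video metrics from each platform on a shared key. Here is a minimal sketch of that idea; the platform names and metric fields (`video_id`, `views`) are hypothetical stand-ins for what the real APIs return:

```python
def blend_channels(sources):
    """Merge per-video metrics from several platforms into one row per video.

    `sources` maps a platform name to rows of {"video_id": ..., "views": ...};
    the field names here are illustrative, not the actual API schemas.
    """
    combined = {}
    for platform, rows in sources.items():
        for row in rows:
            video = combined.setdefault(row["video_id"], {"video_id": row["video_id"]})
            video[f"{platform}_views"] = row["views"]
    return list(combined.values())

# Example: the same video reported by two platforms ends up as one blended row
blended = blend_channels({
    "youtube": [{"video_id": "v1", "views": 1200}],
    "facebook": [{"video_id": "v1", "views": 800}],
})
```

In practice KBC does this join at the warehouse level, but the shape of the result is the same: one row per piece of content, with a column per platform metric.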


Freethink now has all their data populated in Redshift, where Chartio connects to create beautiful dashboards for reporting. The team can go into the Keboola platform and manually adjust and run configurations to get exactly the data they need. The biggest gains have been time saved, the ability to show change over time, and freeing the team up to focus on more complicated analyses. It has also opened up data access to the broader team, promoting collaboration and data-driven decision making.


"Keboola really helped simplify and automate the process of collecting and combining data. Working together, Chartio and Keboola Connection deliver a full stack solution for modern analytics, taking full advantage of the cloud. I’m able to give my team better insights into our performance and make better decisions, quicker."

-Brandon Stewart, Executive Editor at Freethink


Thanks,

Colin


The Best Tool for Your Data Product Journey? A Good Map

[Image: map]

For anyone creating an analytics product, the pressures of engaging customers and generating revenue while protecting your core product and brand can be overwhelming, especially when aiming to hit so many goals on the horizon:

  • Does it target users effectively?

  • Will it guide users to a solution to their business problem?

  • Can it scale to many customers?

  • Will it deliver real results that customers are willing to pay for?

Fortunately, we've been there, done that, and understand what it takes to build a great data product. That's why we've created a map to help you navigate your way to success, built on the experience of countless voyagers who have sailed the same seas before you: the Data Product Readiness Assessment.

Why your data product needs a good elevator pitch

[Image: elevator pitch]

In recent years, a term started appearing across the technology world: “data monetization,” turning your data into dollars (as we mentioned in a previous post, you can Find Gold in Your Data!). Businesses reacted to the hype, started spending on every solution under the sun, and then… nothing. Nada. Zilch. In many cases the revenues never materialized, buyers became frustrated with the lack of results, and they blamed the whole concept of data monetization. The problem is, you’ve got to avoid certain mistakes... and they’re silent killers.

In truth, data products are a great opportunity for most businesses to engage customers and create new streams of revenue. Untapped, dormant data can, when refined properly, become a crucial resource for your company. Fortunately, we’ve worked on many analytics projects ourselves, have seen these mistakes made and have put together a guide to help you avoid making them yourself.

To provide some quick insight, we thought we’d share one of the tips we’ve found most helpful when starting to create an analytics product.

Creating an elevator pitch

What is "modern" business intelligence anyway...?

[Image: The Thinker]

Last week, Tableau hosted a session on the evolution of Business Intelligence in Portland that I had the chance to attend. Although I did review their Top 10 trends in BI when they released them earlier this year, the presentation and discussion ended up being pretty interesting. A few of the topics really resonated with me and I thought we could dig into them a bit more.  

For starters:

Modern BI becomes the new normal

The session (and report) kicked off by highlighting Gartner’s Business Intelligence Magic Quadrant and the shift away from IT-centric BI over the last 10 years. Regardless of who’s discussing the trends (Gartner, Tableau or otherwise), and if or when they come to fruition, it’s important to dig deeper. Reports like Gartner’s are good guideposts for trends and technologies to examine.

That said, I think we can agree that the overall landscape of technology and the way that organizations of all sizes are taking advantage of it in the domain of business intelligence has improved over the last decade.

So does that mean modern BI has truly arrived?

Some ideas come to mind when I hear the phrase, but:

What is modern business intelligence?  

And do we all think of the same things when we discuss it?


[Image: carrier pigeon]

Find Gold in Your Data

[Image: data monetization]

"Data Monetization" is a term you might have heard a lot lately.  But what does it really mean for you and your business?  There is gold in your data, but how can you extract it to gain all its benefits without adding resource burdens on your business?  We collected the main approaches successful companies are using to give you inspiration and insight into how you can use data you already have to improve efficiencies, create new revenue streams or increase value and hence your wallet share from your current customer base. 

Use data to make better decisions

It is not always about the big, earth-shaking decisions. What if we could empower our employees to choose better paths in an incremental fashion? Which ad to place in an available space? How to utilize remaining capacity on a shipment? Each of those decisions may mean just $50, or $1,000, but a person can easily make 50 decisions like that per day.
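A rough back-of-the-envelope calculation shows how quickly those small decisions compound. The figures below are purely illustrative, using the low end of the range above and an assumed 250 working days a year:

```python
decisions_per_day = 50
value_per_decision = 50      # dollars; the low end of the range above
working_days_per_year = 250  # assumed

# Even small per-decision gains add up to a large annual figure per employee.
annual_impact = decisions_per_day * value_per_decision * working_days_per_year
print(annual_impact)  # 625000
```

That is $625,000 a year from a single employee making slightly better everyday choices, before counting the bigger decisions at all.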

First Principles: The Foundation of a Great Data Product

To kick-off our new series about creating data products, we decided to write a white paper. This sounds simple, but this time it was a little more difficult than we expected.

Specifically, where do we start when we want to explain the difficulties data product teams face and how to overcome the critical obstacles? Should we begin with user personas and how to design data products that engage users? Do we kick things off with a piece about pricing data products and the finer points of ensuring future up-sell paths? How about a few words explaining why data products that don’t use Keboola are doomed to fail and bring shame upon their product teams and ultimately their entire companies? Hmmm... All possibilities, but none of these seemed the best way to start our series.
 
After much thought and coffee, we decided to start at the beginning with “first principles”—those foundational attributes which distinguish successful analytical applications from those that don’t quite meet their objectives. Our white paper would discuss these principles that make a data product truly great.
 
Wait—isn’t that a little vague, a little “fluffy”? Not at all. We felt compelled to start with these principles because it can be hard to know where to begin when you’re part of a product team charged with building an analytics product; first principles, while not as mathematical as pricing or as black and white as dashboard design, act as guideposts to help you stay on the right path.
 
These guideposts are essential for product teams because it isn’t easy to create analytics that have a positive impact both for users and on your company’s bottom line. Do you start by setting revenue targets and determining the cost structure that needs to be achieved for the data product to be profitable? Maybe you start by defining the various reports and information you need to put in the hands of your customers to solve their problems and reduce the deluge of “more data” requests. Or perhaps you could start by brainstorming a list of all the features that might make users engage with the analytics—requests you’ve received or functionality present in your competitors’ products.
 
Each of these paths is a reasonable starting point, but are any of them the best way to begin the process of building a great data product? That's where first principles come into play.
 
First principles don’t have anything to do with bar charts versus pie charts, or even technology selection. Instead, they are a set of guiding beliefs about what makes a data product great. They are foundational truths, and from them everything else—features, pricing, and product strategy—follows.
 
As we start our series on creating data products, we felt that our first principles were a great place to begin, and so we’d like to share them with you in a white paper. Before you start to think that these principles are a rehash of modern catchphrases such as “embrace the change” or “empower each other”—no, they are directly targeted at creating successful data products. They are a collection of elements we’ve seen in great, successful analytics-based products, and the place where we always begin when considering each project.
 
We hope that you find these elements of a great data product useful in your journey to deliver analytics to your customers and, as always, we’re here to help if you’d like to build a data product together.
 
 
Thanks!

Facebook Prophet - Forecasting library

It all started yesterday morning when, on my way to work, I saw multiple tweets mentioning a newly published forecasting library:

[Image: tweet announcing Prophet]

Sounds interesting, I thought. I bookmarked the link for “weekend fun with code” and moved on. The minute I stepped into the office, Amazon S3 had an outage (coincidence?) that impacted half of the internet, KBC included. OK, what could I do then?

I opened the link to the Facebook engineering page and started reading about the forecasting module. The instructions they supplied were quite simple, and I was tempted to test it out. Wouldn’t it be great to use it in some KBC projects?

Since the code needed for forecasting is pretty simple, I mocked up a script suitable for KBC use before lunch, and when Amazon (us-east-1) got back up, I could implement the code as a Custom Science app.

The algorithm requires two columns: a date column and a value column. The current script gets the source and result tables’ information from the input and output mapping, plus the parameters specified by the user. Those parameters define:

  • Date column name

  • Value column name

  • Required prediction length (period)
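A sketch of how those three parameters map onto Prophet's input is below. Prophet expects its training frame to have columns named `ds` and `y`, so the script's main job is renaming the user-specified columns; `run_forecast` is illustrative only, since it needs pandas and the `fbprophet` package installed, and the sample CSV is made-up data in the shape a KBC input mapping delivers:

```python
import csv
from io import StringIO

def prepare_for_prophet(rows, date_col, value_col):
    """Rename the user-specified columns to the ds/y names Prophet expects."""
    return [{"ds": r[date_col], "y": float(r[value_col])} for r in rows]

def run_forecast(rows, date_col, value_col, periods):
    """Illustrative only: requires the pandas and fbprophet packages."""
    import pandas as pd
    from fbprophet import Prophet
    df = pd.DataFrame(prepare_for_prophet(rows, date_col, value_col))
    model = Prophet()
    model.fit(df)
    # Extend the timeline by the requested prediction length, then predict.
    future = model.make_future_dataframe(periods=periods)
    return model.predict(future)[["ds", "yhat", "yhat_lower", "yhat_upper"]]

# A tiny sample in the shape a KBC input mapping provides (CSV with a header):
sample_csv = "day,sales\n2017-01-01,10\n2017-01-02,12\n"
rows = list(csv.DictReader(StringIO(sample_csv)))
prepared = prepare_for_prophet(rows, "day", "sales")
```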

This is how it looks in Keboola:

[Image: Custom Science app configuration]

To see the output in visual form, I used Jupyter, which was recently integrated into KBC. Not bad for a day’s work, what do you say?

[Image: forecast chart]


Just imagine how easy it would be for a user to orchestrate the forecasting process:

  1. Extract sales data

  2. Run forecasting

  3. Enrich the data with forecasted values

  4. Publish them to sales and marketing teams

[Image: orchestration setup]

Notes:

  • The sample data I used sucks. I bet yours will be better!
  • Here is the link to the Jupyter notebook.
  • Feel free to check some other custom science apps I did: https://bitbucket.org/VFisa/

Where Prophet shines (from Facebook page)

Not all forecasting problems can be solved by the same procedure. Prophet is optimized for the business forecast tasks we have encountered at Facebook, which typically have any of the following characteristics:
  • hourly, daily, or weekly observations with at least a few months (preferably a year) of history
  • strong multiple “human-scale” seasonalities: day of week and time of year
  • important holidays that occur at irregular intervals that are known in advance (e.g. the Super Bowl)
  • a reasonable number of missing observations or large outliers
  • historical trend changes, for instance due to product launches or logging changes
  • trends that are non-linear growth curves, where a trend hits a natural limit or saturates

Martin Fiser (Fisa)

Keboola, Vancouver, Canada

Twitter: @VFisa



Webhooks and KBC - How to trigger orchestration by form submission (Typeform)


Use case

Keboola just implemented a product assessment tool dedicated to OEM partners. The form's results will show how submitters fare in the various dimensions of data product readiness, areas on which to focus, and specific next steps to undertake.

We wanted to trigger the orchestration that extracts the responses (have you noticed the new Typeform extractor?), processes the data, and updates our GoodData dashboard with the answers. There was no option to use the "Magic Button" to do so, because there is no guarantee the respondent would click on it at the end of the form.
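A minimal webhook receiver for this pattern might look like the sketch below: Typeform POSTs the submission to your endpoint, and the handler kicks off the orchestration run. The orchestrator URL, header name, orchestration ID, and token are assumptions for illustration, so check them against the current Keboola API documentation before using anything like this:

```python
import json
from http.server import BaseHTTPRequestHandler
from urllib import request as urlrequest

# Assumed endpoint shape (Syrup-era orchestrator API); verify against current docs.
ORCHESTRATOR_URL = "https://syrup.keboola.com/orchestrator/orchestrations/{id}/jobs"

def build_trigger_request(orchestration_id, storage_token):
    """Build the POST request that starts an orchestration run."""
    return urlrequest.Request(
        ORCHESTRATOR_URL.format(id=orchestration_id),
        data=b"{}",
        headers={"X-StorageApi-Token": storage_token,
                 "Content-Type": "application/json"},
        method="POST",
    )

class TypeformHook(BaseHTTPRequestHandler):
    """Receives Typeform's submission webhook and triggers the orchestration."""
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        _submission = json.loads(body or b"{}")  # the Typeform payload, unused here
        urlrequest.urlopen(build_trigger_request(123456, "YOUR-STORAGE-TOKEN"))
        self.send_response(200)
        self.end_headers()
```

Because the hook fires on every submission, the orchestration runs whether or not the respondent clicks anything at the end of the form.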

Keboola + InterWorks Partnership Offers End-to-End Solutions for Tableau

[Image: InterWorks logo]


We’re always keeping an eye out for BI and analytics experts to add to our fast-growing network of partners, and we are thrilled to add a long-standing favorite in the Tableau ecosystem! InterWorks, holder of multiple Tableau Partner Awards, is a full-spectrum IT and data consulting firm that leverages experienced talent and powerful partners to deliver maximum value for their clients. (Original announcement from InterWorks here.) This partnership is focused on enabling consolidated end-to-end data analysis in Tableau.

Whether we’re talking Tableau BI services, data management or infrastructure, InterWorks can deliver everything from quick-strikes (to help get a project going or keep it moving) to longer-term engagements with a focus on enablement and adoption. Their team has a ton of expertise and is also just generally great to work with.

InterWorks will provide professional services to Keboola customers, with the focus on projects using Tableau alongside Keboola Connection, both in North America and in Europe, in collaboration with our respective teams.  “We actually first got into Keboola by using it ourselves,” said InterWorks Principal and Data Practice Lead Brian Bickell. “After seeing how easy it was to connect to multiple sources and then integrate that data into Tableau, we knew it had immediate value for our clients.”

What does this mean for Keboola customers?

InterWorks brings world-class Tableau expertise into the Keboola ecosystem. Our clients using Tableau now have a one-stop shop for professional services, leveraging both platforms to fully utilize their respective strengths. InterWorks will also utilize Keboola Connection as the backbone of their white-glove offering: a fully managed BI stack crowned by Tableau.

Shared philosophy

Whether working on projects with customers or partners, we both believe that aligning people and philosophy is even more critical than the technology behind it. To that end, we’ve found a kindred spirit in InterWorks: we believe in being ourselves and having fun while ensuring we deliver the best results for our shared clients. The notion of continuous learning and trying new things was one of the driving factors behind the partnership.

Have a project you want to discuss with InterWorks?

Contact InterWorks, or if you want to learn a bit more about the types of projects they work on, check out their blog!