Understanding project management workflow by integrating Keboola + Asana


Asana is on a mission to help humanity thrive by enabling all teams to work together effortlessly, improve the productivity of teams, and increase the potential output of every team’s effort. They provide a great web-based project management tool which allows users across teams to keep track of their work. 

Although Asana offers a fantastic UI and features that help managers and project managers gauge a project’s progress, it lacks a simple way to build a dashboard for reporting on that progress: for example, the number of tasks completed last week, the number of tasks tagged first priority, or the number of tasks assigned to each user. With the Asana extractor, users can transform and enrich the data in Keboola to gain better insight into the projects contained in Asana. The integration enhances collaboration and builds an actionable, 360-degree view of project management and usage, as well as the customer experience.

See How Keboola Automated Personalized MailChimp + SurveyMonkey Campaigns


Our client is in the events business, organizing regular meetups for CEOs of technology companies in the Vancouver area. To organize these ad-hoc conferences for hundreds of people, they use Excel, Salesforce as their CRM, MailChimp for emails, and SurveyMonkey, well, for surveys. For those of you who have been using Keboola for some time, you might know this is the optimal setup for us.

After an initial discussion to understand their current in-house processes, we narrowed the biggest pain point down to the emailing component. Do you remember when I mentioned those meetups are for CEOs?

Well, they have 14 groups, and each group meets once a month. For each event, they need to contact the host two weeks in advance with a reminder. Another reminder goes out to all the guests a week before the event. Finally, after each meeting, a survey is sent to those who attended.

All those emails are being prepared and sent manually by one person. You can do the math, but believe me, it is almost a full-time job just to check every day which emails have to be sent out to whom.

This is where Keboola stepped in…

Let’s skip the part where we moved the Excel files into Google Sheets and replaced nicely formatted bar charts and roadmaps with simple data tables so we could crunch all the data in Keboola.

SurveyMonkey

After each meeting, the organization collects feedback using SurveyMonkey. They measure several KPIs and also let respondents comment in each section. The survey results are distributed the following month as part of the MailChimp campaign reminding guests that the next event is coming up soon.

Then we wrote a short API call (a JSON configuration) using our almighty Generic Extractor to get the data from SurveyMonkey. They have good documentation, so it’s not difficult to obtain the survey results.
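In Keboola this lives in a Generic Extractor JSON configuration, but the equivalent request looks roughly like the sketch below. The token and survey ID are placeholders, and the endpoint paths should be double-checked against SurveyMonkey’s current v3 documentation:

```python
import requests

# Placeholder credentials and survey ID; in Keboola these come from the
# Generic Extractor configuration rather than hard-coded values.
API_TOKEN = "YOUR_SURVEYMONKEY_TOKEN"
SURVEY_ID = "123456789"

session = requests.Session()
session.headers.update({"Authorization": f"Bearer {API_TOKEN}"})

# Fetch all responses for one survey; the v3 "bulk" endpoint returns
# answers nested inside pages and questions.
url = f"https://api.surveymonkey.com/v3/surveys/{SURVEY_ID}/responses/bulk"
responses = []
while url:
    payload = session.get(url).json()
    responses.extend(payload["data"])
    # The API paginates; "links.next" holds the next page URL, if any.
    url = payload.get("links", {}).get("next")

print(f"Downloaded {len(responses)} survey responses")
```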

The more complicated part of the process was joining together all the output tables from their API, because the endpoint produced around 20 tables full of parent and child records.
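To make the pattern concrete, here is a minimal pandas sketch of that join chain. The table and column names are hypothetical; the extractor’s actual outputs carry generated parent-key columns whose names depend on the configuration:

```python
import pandas as pd

# Hypothetical extractor outputs; real names depend on the configuration.
responses = pd.read_csv("responses.csv")   # parent: one row per survey response
pages = pd.read_csv("pages.csv")           # child: responses_pk -> responses.pk
answers = pd.read_csv("answers.csv")       # grandchild: pages_pk -> pages.pk

# Walk the parent/child keys back up to one flat table per answer.
flat = (
    answers.merge(pages, left_on="pages_pk", right_on="pk",
                  suffixes=("", "_page"))
           .merge(responses, left_on="responses_pk", right_on="pk",
                  suffixes=("", "_resp"))
)
flat.to_csv("survey_flat.csv", index=False)
```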

Mailchimp Part 1

The most important piece of the puzzle is that all the emails are distributed from MailChimp.

We had to figure out how to automatically feed each MailChimp template with personalized data. It might sound easy at first, but by default, MailChimp offers only two variable fields connected to the recipient: unsurprisingly, the first and last name.

However, for each template we needed much more than that! We needed to personalize the location, time, and date of the event, and even add a personal note. There were special sections for the survey results and user comments from the previous meeting, and the two out-of-the-box fields were not going to cut it.

This was exactly the moment when we started asking “what if” questions. Soon after opening the API documentation, the right question hit us: could we push the whole content via an API call, and not merely trigger the campaign remotely?

Keboola Python Part

This is where Leo, our Python ninja, stepped in. He will briefly go over how Keboola “hacked” the MailChimp API.

With a Custom Science component written in Python and the MailChimp API, users no longer have to manually create multiple campaigns and enter the “variables” (e.g., time of the event, location of the event) every time a reminder or email needs to go out. Users only need to maintain the templates within MailChimp and the Google documents, which contain detailed descriptions of each event and the list of participants. Combined with automation via Keboola Orchestration, the Custom Science component runs periodically, fetching event and participant details. With the fetched data, the component uses the configured templates in MailChimp to create a new campaign for every event it can list. If repeated event names are found within the same sheet/run, the component injects the details into the first encountered row with that event name.

Upon completing all the campaigns, the component outputs a CSV to Keboola Storage indicating whether or not each campaign executed successfully. “Sent” campaigns are kept in the user’s MailChimp account as a record, so users can see what content was sent and to which participants. Stats and activity for each campaign can also be found under the Reports tab in the user’s MailChimp account.
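A heavily simplified sketch of that loop appears below. The endpoints follow MailChimp’s v3 API, but the list ID, template ID, section names, and CSV columns are placeholders rather than the component’s real configuration:

```python
import csv
import requests

API_KEY = "YOUR_MAILCHIMP_KEY"        # placeholder
DC = API_KEY.split("-")[-1]           # data center suffix, e.g. "us6"
BASE = f"https://{DC}.api.mailchimp.com/3.0"
AUTH = ("anystring", API_KEY)
LIST_ID = "abc123"                    # placeholder audience ID
TEMPLATE_ID = 424242                  # placeholder template ID

results = []
with open("in/tables/events.csv") as f:
    for event in csv.DictReader(f):
        # 1. Create a campaign shell pointed at the audience.
        campaign = requests.post(f"{BASE}/campaigns", auth=AUTH, json={
            "type": "regular",
            "recipients": {"list_id": LIST_ID},
            "settings": {
                "subject_line": f"Reminder: {event['event_name']}",
                "from_name": "Events Team",
                "reply_to": "events@example.com",
            },
        }).json()

        # 2. Push the fully rendered content: the template's placeholder
        #    sections are filled with this event's details.
        requests.put(f"{BASE}/campaigns/{campaign['id']}/content", auth=AUTH, json={
            "template": {
                "id": TEMPLATE_ID,
                "sections": {
                    "event_date": event["event_date"],
                    "event_location": event["event_location"],
                    "personal_note": event["note"],
                },
            },
        })

        # 3. Trigger the send and record the outcome for Keboola Storage.
        sent = requests.post(
            f"{BASE}/campaigns/{campaign['id']}/actions/send", auth=AUTH
        )
        results.append({"event": event["event_name"], "sent": sent.ok})

with open("out/tables/campaign_results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["event", "sent"])
    writer.writeheader()
    writer.writerows(results)
```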

The last step was to create a different CSV output for each scenario. But since we could use as many variable fields in the MailChimp template as we wished, it was just a matter of a couple of SQL queries and a few Python transformations to get the results with the right numbers in.
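For illustration, the splitting step can be as small as this pandas sketch; the input table and the scenario column are hypothetical stand-ins for the actual transformation outputs:

```python
import pandas as pd

# Hypothetical joined table: one row per upcoming event, with a "scenario"
# column produced by the SQL step (host_reminder, guest_reminder, ...).
events = pd.read_csv("in/tables/events_enriched.csv")

for scenario, rows in events.groupby("scenario"):
    # One output file per campaign type; each feeds its own template.
    rows.to_csv(f"out/tables/{scenario}.csv", index=False)
```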

Mailchimp Part 2

Now we had a functioning writer that could replace any predefined variable field with data from the corresponding column in the output file.

All we had to do was to adjust the existing templates and we were ready to go!

Conclusion

I’m not sure how this solution would hold up with thousands of recipients, but in our case, with tens of emails at most on the busiest day, it works well.

The event manager’s job is now to maintain the master sheet, where all they need to do is add a new event. Keboola downloads the sheet every day, a transformation detects whether any kind of email campaign is needed, and the custom writer pushes personalized campaigns through MailChimp.
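The detection logic itself boils down to simple date arithmetic. A minimal sketch, assuming a master sheet with an event_date column (the column names and file paths are placeholders):

```python
from datetime import date

import pandas as pd

# Master sheet maintained by the event manager; names are placeholders.
events = pd.read_csv("in/tables/master_sheet.csv", parse_dates=["event_date"])
days_until = (events["event_date"] - pd.Timestamp(date.today())).dt.days

# The rules from the post: hosts get a reminder two weeks out, guests one
# week out, and attendees receive a survey the day after the event.
events[days_until == 14].to_csv("out/tables/host_reminder.csv", index=False)
events[days_until == 7].to_csv("out/tables/guest_reminder.csv", index=False)
events[days_until == -1].to_csv("out/tables/survey_followup.csv", index=False)
```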


Thanks!

Michal

Data Geek / BI Developer


Taking a data-driven approach to pricing to optimize profit

Research around pricing has consistently shown its importance. Deloitte found that, on average, a 1 percent price increase translates into an 8.7 percent increase in operating profits (assuming no loss of volume, of course). Yet Deloitte also estimated that up to 30 percent of the thousands of pricing decisions companies make each year fall short of delivering the best price. That’s a lot of money left on the table!
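To see where the leverage comes from: with volume and costs unchanged, a 1 percent price increase adds 1 percent of revenue straight to operating profit. A quick sketch, assuming an operating margin of 11.5 percent (an illustrative figure, not one published by Deloitte):

```python
revenue = 100.0
operating_margin = 0.115          # assumed for illustration
profit = revenue * operating_margin

# A 1% price increase with unchanged volume and costs adds 1% of revenue
# directly to operating profit.
new_profit = profit + revenue * 0.01
lift = (new_profit - profit) / profit
print(f"Operating profit lift: {lift:.1%}")   # -> 8.7%
```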

Too often, pricing is an afterthought, and pricing decisions are made by “gut feeling” based on a quick look at competitor websites. Ideally, pricing decisions involve determining the value to the customer relative to the competition, factoring in pricing power and pricing strategy. For more on this, see Pricing Power. What it is. How to get it. from our partners at Ibbaka. Understanding value requires data, and that data can change over time, so even the best-thought-out pricing model can soon be out of date. The solution is data science plus data integration: data from many different sources can be connected to value and pricing models, and when these get out of alignment due to changes in the market or competitor actions, alerts can be triggered.
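In practice, those alerts can start out as a simple scheduled comparison between current prices and what the value model recommends. A minimal sketch, with hypothetical table names and an arbitrary 5 percent threshold:

```python
import pandas as pd

# Hypothetical inputs: current list prices and the model's recommendations.
prices = pd.read_csv("in/tables/current_prices.csv")   # sku, list_price
model = pd.read_csv("in/tables/model_prices.csv")      # sku, recommended_price

merged = prices.merge(model, on="sku")
merged["gap_pct"] = (
    (merged["list_price"] - merged["recommended_price"])
    / merged["recommended_price"]
)

# Flag anything drifting more than 5% from the model's recommendation.
alerts = merged[merged["gap_pct"].abs() > 0.05]
alerts.to_csv("out/tables/pricing_alerts.csv", index=False)
```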

When we think about B2B sales analysis, what initially comes to mind usually involves reporting on CRM data to understand sales by product, region, deal velocity, and the like. Whereas most innovative B2C companies take full advantage of the mountains of valuable data they have, B2B companies have been a bit slower to adopt this approach. After identifying customer segments, data from not only the CRM but also the ERP, third-party economic data sources, and others can be used to understand past purchases and prices, preferences, and more to determine the optimal price. As mentioned, even a 1% increase can have a huge impact.

Automation is king

One of the critical keys to getting more out of your data is automating, or even better, eliminating as many mundane processes as possible. This allows organizations to focus on the test-and-evaluate part of the pricing process, not on configuring infrastructure or monitoring and maintaining data flows. It also allows the people doing the analysis to take advantage of larger and more diverse data sets and make adjustments without a huge headache.

Sell pricing internally

As someone who has spent years in sales, I can tell you that pricing is so much more than just a number. It’s led to heated discussions around many tables in innumerable offices. It’s one of the big “hows” sales reps have to keep in mind when trying to close deals. Providing B2B sales organizations with the information and context to help reps understand the factors behind pricing is mission critical. Keeping reps educated on this topic builds confidence and translates into clients who better understand how the cost of the product maps to business value. Introducing optimal pricing earlier in the sales cycle can also speed up deal closing and increase win rates.


Test and evaluate

It’s important to remember that pricing is not a static, set-it-once type of event: you have to try, test, and learn. This process works so much better when we can use data to inform and validate the decisions made.

Pricing is an important part of the foundation of strong sales and marketing organizations, and using data effectively as part of your strategy can reveal the key insights needed to get it right.

Want to learn more?

Join us for our upcoming webinar with Ibbaka, where we’ll dive deeper into data-driven pricing strategies!


Holiday Gift Ideas for the Data Geek in Your Life

I can’t believe it’s already been a year since we covered some great gift ideas for data people! We’re back with some more last-minute ideas; some may look familiar, albeit bigger (or smaller) and better, while others are new arrivals. Whatever you’re looking for, we hope at least one of these ideas will help you find something that really excites the techie / data lover in your life this holiday season!


Amazon Echo Second Gen (Dot)

How to take an agile (Minimum Viable Product) approach to analytics projects

By now, the idea of agile development and a Minimum Viable Product, or MVP, is prevalent. The problem is that while most people have the minimum part down, they often haven’t mastered the viable, especially when it comes to analytics.

To quickly recap, a Minimum Viable Product is an approach where you focus on creating a product with just enough features to solve a particular problem. This first iteration is used to collect user feedback and shape the complete set of features for the final product.

That’s all well and good, but you may be wondering what the benefits of this approach are when it comes to analytics projects...

Learning, and learning quickly

Is your solution actually delivering the value that you are trying to create? In a typical project, you may be months down the road before what you’re building is actually in front of users. This makes it difficult to determine its viability for solving the business case. The whole point is to prove or disprove your initial assumptions sooner.

  • What part of your users’ current process is really frustrating them?

  • Are the analytics we designed actually guiding them through their workflow and making their life better?

By getting a usable set of features in front of users earlier in the process, we can collect feedback and determine whether we are in fact on the right track.

Is there untapped value in your data?


Embedded analytics, data products, data monetization, big data… there are plenty of buzzwords we can use to “categorize” the idea.

IDC reports that the big data and business analytics market grew at a rate of over 11% in 2016 and will continue at a compound annual growth rate of 11.7% through 2020. This rapidly growing area of investment can’t be for naught… can it?

Let’s look beyond the hype at some specific approaches for extracting additional value (and ultimately dollars) from your data.

According to Gartner, Data Monetization refers to using data for quantifiable economic benefit.

The first thing that may come to mind is the outright selling of data (via a data broker or independently). Although a potentially viable option, with increasingly strict data privacy policies and the sheer amount of data needed to be successful with this approach, it can be quite limiting.

There are many other approaches to monetizing your data, such as:

How to build data products that increase user engagement

Think about all the social media platforms out there: which ones do you use the most (and why)? I’m not talking about giving your LinkedIn profile a facelift before you put in a job application or searching for a long-lost friend on Facebook; which of these apps are actually driving user engagement? For me, it’s Instagram; the interface is easy to navigate, and more than once I’ve found myself re-opening it right after I’ve closed it. Have you thought about why many of these platforms have exploded in user engagement, with many people posting to their Twitter or Facebook accounts multiple times per day? According to a recent Gartner blog, adoption rates for some of the BI tools in their Magic Quadrant sit at a low but not too surprising 21%. Are people sick and tired of “all that data,” or is there something more sinister at work…

We’ve thought a lot about social media platforms (and other apps) that seem to drive such high user engagement and put together a few thoughts on how you can do the same within your data product to ensure you keep users coming back for more. Before we reveal the secret sauce for building engagement in your data products, let’s take a quick look at how many analytics teams approach the problem.

Too often, teams building an analytics product for their customers approach the project in the wrong way, and the story is oh so familiar. As we covered in a recent blog, this means taking the reports that existed in an Excel spreadsheet and web-ifying them in a cloud BI tool. It’s essentially surfacing the exact same information as before, but now with shiny new charts and graphs, more color choices, and some interactivity. After the initial excitement over the new toy in the room, the latter solution isn’t doing any better than the former at driving engagement, let alone delivering “insights” or creating a new revenue stream.

One of the big reasons customers aren’t lining up to write a check for the latest, greatest data product a vendor has rolled out is that the analytics team failed to make it engaging. Simply put, product teams need to let users know “hey, check this out,” “hey, we’ve got some important information for you,” and “hey, you should come back and see us.” Most teams do the second part, the “we’ve got insights” piece, but they fail to show users why they need to keep coming back for more. These are essential elements of establishing engagement; not building them in is like skipping the foundation of a new skyscraper. “It’s like when you see a skyscraper; you’re impressed by the height, but nobody is impressed by the foundation. But make no mistake, it’s important,” said Akshay Tandon, Head of Strategy & Analytics at LendingTree.

Want to avoid the killer mistakes of failing to build engagement into your data product? Here’s how:

Creating Intelligent Narratives with Narrative Science & Keboola

Intelligent Narratives are the data-driven stories of the enterprise. They are automated, insightful communications packed with the information that matters most to you (specific to your role or industry), written in conversational language, and produced at machine scale. By giving your employees and your customers a richer, more nuanced understanding of your business, Intelligent Narratives help them make more informed decisions and realize their greatest potential.

Narrative Science is the leader in advanced natural language generation (Advanced NLG) for the enterprise. Quill™, its Advanced NLG platform, learns and writes like a human, automatically transforming data into Intelligent Narratives—insightful, conversational communications packed with audience-relevant information that provide complete transparency into how analytic decisions are made.

As we all know, one of the biggest barriers to successful data projects is having the right data in the right place; that's why Narrative Science and Keboola have partnered to bring the next generation of analytics to you faster. Automate data workflows, reduce time and complexity of implementations and start gaining new insights now! Leverage this app, powered by Narrative Science, to produce machine-generated narratives of data ingested by Keboola. 

Freethink + Keboola: Understanding cross-channel video analytics

Video is one of the hottest trends in digital marketing. YouTube, which has grown more than 40 percent since last year, reaches more 18-49-year-old viewers than any cable network and has a billion users watching hundreds of millions of hours every day.

Freethink, a modern media publisher, uses online video to tell the stories of passionate innovators who are solving some of humanity’s biggest challenges by thinking differently. While telling important stories is their primary focus, data underlies all of their decisions. As a publisher, they need to understand how well each piece of content performs, as well as how that content performs across platforms (they currently publish videos on their website, YouTube, and Facebook).

Prior to working with Keboola, collecting and combining data for cross-channel video analysis was a time-consuming, manual effort (particularly because Facebook has separate APIs for tracking page content and promoted content). In addition, this process made performing time-over-time analyses a real challenge.

The goal was to provide a dashboard solution that gives the team better visibility into their data. Keboola Connection (KBC) overcame the manual effort by leveraging existing API connections to get data from Facebook and YouTube. In addition, Keboola utilized its partnership with Quintly (social media analytics) to pick up cleaned and verified data from their API. All this data is combined with additional sources, including Google Sheets, which provide additional metadata for advanced reporting and segmentation. This blended data enables universal reporting across platforms and a 360-degree picture of each piece of content.
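Conceptually, the blending step maps each platform’s metrics onto a common shape, stacks them, and joins the metadata from the sheet. A simplified sketch with hypothetical column names:

```python
import pandas as pd

# Hypothetical per-platform extracts, each reduced to the same grain:
# one row per video per day, with platform-specific columns mapped here.
youtube = pd.read_csv("in/tables/youtube_stats.csv").rename(
    columns={"videoId": "video_id", "viewCount": "views"})
facebook = pd.read_csv("in/tables/facebook_stats.csv").rename(
    columns={"post_id": "video_id", "total_video_views": "views"})

youtube["platform"] = "youtube"
facebook["platform"] = "facebook"

# Metadata maintained in a Google Sheet: series, topic, publish date, etc.
metadata = pd.read_csv("in/tables/content_metadata.csv")  # video_id, series, ...

combined = pd.concat(
    [youtube[["video_id", "date", "views", "platform"]],
     facebook[["video_id", "date", "views", "platform"]]]
).merge(metadata, on="video_id", how="left")

combined.to_csv("out/tables/cross_channel_views.csv", index=False)
```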


Freethink now has all of their data populated in Redshift, where Chartio connects to create beautiful dashboards for reporting. They can go into the Keboola platform and manually adjust and run configurations to get exactly the data they need. The biggest gains have been time saved, the ability to show change over time, and freeing the team up to focus on more complex analyses. This has also opened up data access to the broader team, promoting collaboration and data-driven decision making.


"Keboola really helped simplify and automate the process of collecting and combining data. Working together, Chartio and Keboola Connection deliver a full stack solution for modern analytics, taking full advantage of the cloud. I’m able to give my team better insights into our performance and make better decisions, quicker."

-Brandon Stewart, Executive Editor at Freethink


Thanks,

Colin


The Best Tool for Your Data Product Journey? A Good Map


For anyone creating an analytics product, the pressures of engaging customers and generating revenue while protecting your core product and brand can be overwhelming, especially when aiming to hit so many goals on the horizon:

  • Does it target users effectively?

  • Will it guide users to a solution to their business problem?

  • Can it scale to many customers?

  • Will it deliver real results that customers are willing to pay for?

Fortunately, we've been there, done that, and understand what it takes to build a great data product. That's why we've created a map to help you navigate your way to success, built on the experience of countless voyagers who have sailed the same seas before you: the Data Product Readiness Assessment.