Transformation Version Management

We're introducing simple version management for transformation buckets. You can quickly access the basic functions via a dropdown on the transformation bucket page.

The dropdown shows only the last 5 versions and lets you quickly roll back to any of them or copy the current version to a new bucket. All versions are available on a separate page via the Show all versions menu option.

There you can see all versions, run a full-text search across their descriptions, dates and authors, roll back to any version in the history, or copy any version to a new bucket.

Versions always work with the whole bucket; you cannot roll back or copy a single transformation.

Please consider this a first version - we were too excited to show you what we have so far to wait until everything was ready. There are plenty of enhancements to come:

  • better change descriptions - we'll be fixing the messages to be more accurate and descriptive
  • versions for all components and their configurations
  • access to versions from transformations and not only buckets
  • diffing
  • dev workspace and committing changes
  • ability to set your own change description

Please let us know your thoughts - happy versioning! And of course, if you experience any unexpected behavior or find a bug, reach out to us at support@keboola.com.

New Application for Personalized Recommendation by Recombee

Have you ever thought about the hidden potential in your data? If you have many products and many customers, Personalized Recommendations by Recombee can boost your business.

You probably have your customers' purchases stored in the Keboola cloud. We have prepared an application to show you how our recommendations can change your business. Using your data, the application generates a list of recommended products for each of your users based on their personal preferences. The recommendations can be used, for example, in your marketing campaigns. The application can also generate a list of related products for each product in your catalog, which can be used directly on your e-commerce web pages. Furthermore, we can deliver valuable insights to help you understand your customers and discover complex relationships among the products in your data.

Follow this link to learn how to use our application and upload your data simply by selecting the right columns in your database table.

 https://git.recombee.net/keboola/recombee-app-description

Week in Review -- March 29, 2016

Another week has gone by at lightning speed. Here's what's new in Keboola Connection.

Documentation

We started overhauling our docs. About time. Read more...

Temporary access to projects

Our support staff (that's us, developers, btw) will stop littering your projects with unused accounts. We're now only allowed to enter your project for a limited period and we must provide a reason. Everything is saved, so you can see what's happening. Thoughts on security, control and privacy? Talk to us!

Bugfixes
  • Truncating Redshift tables now correctly updates bucket stats
New components or component changes
  • Apache Impala extractor
  • Bing Ads extractor, a 3rd party extractor provided by David Ešner
  • S3 writer now lets you specify the format of the timestamp suffix and can optionally compress the files

Week(s) in Review -- March 21, 2016

Writer News:

The GoodData Writer has successfully migrated all of its configurations from sys buckets to the components API. The sys buckets are no longer in use and will be removed this week.

Also, we're thrilled to now officially support writing to Looker and Qlik.

Extractor News:

Get your hardhats on!  Our DB2 and Firebird extractors now support SSH tunnelling.

In Other News:

Events and Notifications:
  • Newly invited users will no longer see old project notifications when they join; they will only see notifications created after they joined the project
  • Checking the mark all as read checkbox will now mark all notifications as read, not just those loaded on the display
  • Newly joining users will now see all project events, whereas before they'd only see the last 200 events prior to joining the project.
Transformations and Applications:
  • Inputs in the standard interface for input and output mappings now support the [space] character
  • We've noticed that SQL comments larger than 8 KB will fail without an exception ID, so please take note if your SQL is very comment-heavy.
  • We also now support Docker containers from private repositories on quay.io

GoodData API issues

Some older GoodData Writer configurations may experience failures of update model jobs, accompanied by the error message "You can not use a model that contains two or more facts identified by ids having same two tailing segments.". The problems are caused by unexpected changes in GoodData's Project Model API after their Saturday release. We are working to fix the situation in cooperation with GoodData support. Thanks for your patience.

UPDATE (18:00 CET): The problem still isn't solved and will apparently take some more time. We will keep you updated. However, you should still be able to load data if you avoid updating the project model - that means using the API calls load-data or load-data-multi instead of update-table and update-project (see the API documentation).
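To make the workaround concrete, here is a rough Python sketch of a load-only call. Please treat it as an illustration only - the endpoint URL, the X-StorageApi-Token header and the request body shape are our assumptions here, and the API documentation linked above remains the authoritative reference.

```python
# Hypothetical sketch only -- the endpoint URL, header and body shape are
# assumptions; see the GoodData Writer API documentation for the real values.
import requests

STORAGE_API_TOKEN = "your-storage-api-token"   # placeholder
WRITER_ID = "main"                             # placeholder writer configuration id

response = requests.post(
    "https://syrup.keboola.com/gooddata-writer/load-data-multi",  # assumed URL
    headers={"X-StorageApi-Token": STORAGE_API_TOKEN},
    json={
        "writerId": WRITER_ID,
        # Only load data into existing datasets; by skipping update-table and
        # update-project, the project model is never touched.
        "tables": ["out.c-main.orders", "out.c-main.customers"],
    },
)
response.raise_for_status()
print(response.json())  # the writer returns an asynchronous job you can poll
```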

UPDATE (Mar 22 17:30 CET): GoodData released a fix which prevents these failures of update model jobs. The writer should now work without problems. Thanks for your patience.

Amazon S3 Writer

It's my pleasure to announce another addition to Keboola Connection - an Amazon S3 Writer. 

Many of you are already familiar with the REST Box component and its functionality related to S3 manipulation (reading from and/or writing to S3). However, there are certain situations where a custom component is a better fit, and for that reason we developed one.

The main advantage is speed. The writer is written in Node.js using the AWS SDK library, and the actual file upload is handled via stream processing. Even somewhat larger files (1 GB+) are uploaded in a very reasonable timeframe.
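The component itself is Node.js, but if you're curious what stream processing of an upload means in practice, here is a rough Python equivalent using boto3 (the bucket name and file path are placeholders) - the file is sent in chunks instead of being read into memory first:

```python
# Rough illustration of a streamed (multipart) upload -- not the component's
# actual Node.js code. Bucket name and file path are placeholders.
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Files above the threshold are split into chunks and uploaded piece by piece,
# so even 1 GB+ files never need to fit into memory at once.
config = TransferConfig(
    multipart_threshold=8 * 1024 * 1024,   # switch to multipart above 8 MB
    multipart_chunksize=8 * 1024 * 1024,   # upload in 8 MB chunks
)

with open("export.csv", "rb") as source:
    s3.upload_fileobj(source, "my-destination-bucket", "exports/export.csv", Config=config)
```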

Configuration is very straightforward as well (similar to the SFTP/WebDAV Writer) and you should be able to start using this component very quickly.

This writer was developed independently by Blue Sky Media. For more information on how to use the writer, please refer to the documentation. In case you run into any issues or have more questions, please contact me directly (radek@bluesky.pro).

Transformation UI bug

A recent minor update to the transformation UI introduced a bug where certain input/output mappings wouldn't open. The bug has been fixed and deployed. If you're still experiencing this issue, please refresh your browser.

We're sorry for the inconvenience.

Deleting Buckets and Snapshots

To simplify cleanup tasks in KBC Storage, we've added support for deleting non-empty buckets. You no longer have to delete all tables before deleting a bucket, so this should save some time when cleaning up your workspace.

We've also added support for deleting table snapshots, which should be helpful when working with tables and snapshots.

Both functions are available through the KBC Storage API, documented here: "drop bucket" and "delete snapshot".
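If you prefer calling the API directly instead of using the UI, the two calls could look roughly like the Python sketch below. The endpoint paths and the force parameter are assumptions on our side - please double-check them against the linked documentation.

```python
# Minimal sketch of the two Storage API calls. The endpoint paths, the `force`
# parameter and the snapshot id are assumptions/placeholders -- the linked
# documentation is the authoritative reference.
import requests

TOKEN = "your-storage-api-token"                      # placeholder
BASE_URL = "https://connection.keboola.com/v2/storage"
HEADERS = {"X-StorageApi-Token": TOKEN}

# Drop a non-empty bucket in a single call (no need to delete its tables first)
requests.delete(
    f"{BASE_URL}/buckets/in.c-my-bucket",
    params={"force": "true"},
    headers=HEADERS,
).raise_for_status()

# Delete a table snapshot by its id
requests.delete(f"{BASE_URL}/snapshots/12345", headers=HEADERS).raise_for_status()
```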

Gmail Extractor

We are proud to announce a new member of our group of Docker extractors - the Gmail Extractor.

This extractor helps you fetch messages from your inbox by specifying simple queries. These queries are identical to those you use in Gmail's web interface. If you're more into advanced search, check Gmail's Help site.
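A few example queries, just to show the idea - it's plain Gmail search syntax, exactly what you'd type into the Gmail search box (the addresses and labels below are made-up placeholders; how queries are entered into the configuration is covered in the documentation):

```python
# Example queries -- plain Gmail search syntax, exactly as you'd type it into
# the Gmail search box. The addresses and labels are placeholders.
queries = [
    "from:billing@example.com after:2016/01/01",  # invoices received since January 2016
    "label:receipts has:attachment",              # everything labelled 'receipts' with an attachment
    "subject:(weekly report) newer_than:30d",     # weekly reports from the last 30 days
]
```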

Main features of the Gmail Extractor:

  • you can specify multiple queries
  • output is divided into multiple tables, which gives you more freedom when selecting/joining data
  • multiple runs of the same query won't download the same data again
  • you can specify the headers you want to download

The process for adding the Gmail Extractor is very similar to that of other extractors (e.g. Adwords Extractor v2):

1. Find Gmail Extractor in the list of extractors

2. Create a new configuration for this extractor

3. Authorize the extractor to access your inbox

4. Configure the extractor and specify the queries you want to run

For more detailed information about the extractor configuration options, please see the documentation.