GoodData API issues

Some older GoodData Writer configurations may experience failures of update model jobs, accompanied by the error message "You can not use a model that contains two or more facts identified by ids having same two tailing segments.". The problems are caused by unexpected changes in GoodData's Project Model API after their Saturday release. We are working to fix the situation in cooperation with GoodData support. Thanks for your patience.

UPDATE (18:00 CET): The problem still isn't solved and will apparently take some more time. We will keep you updated. However, you should be able to load data if you avoid updating the project model: use the API calls load-data or load-data-multi instead of update-table and update-project (see the API documentation).
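
As a concrete illustration, a load that avoids the model update might look like the sketch below. This is a minimal sketch only: the base URL, header, and request parameters are assumptions following common Keboola conventions, and the token, writer ID, and table name are hypothetical placeholders, so check the API documentation for the exact contract.

```python
import requests

TOKEN = "your-storage-api-token"  # hypothetical placeholder
BASE_URL = "https://syrup.keboola.com/gooddata-writer"

# Load data without touching the project model: call load-data
# instead of update-table / update-project.
response = requests.post(
    BASE_URL + "/load-data",
    headers={"X-StorageApi-Token": TOKEN},
    json={
        "writerId": "my_writer",            # hypothetical writer ID
        "tables": ["out.c-main.my_table"],  # tables to load as-is
    },
)
response.raise_for_status()
print(response.json())
```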

UPDATE (Mar 22 17:30 CET): GoodData released a fix which prevents those failures of update model jobs. The writer should now work without problems. Thanks for your patience.

Amazon S3 Writer

It's my pleasure to announce another addition to Keboola Connection - an Amazon S3 Writer. 

Many of you may already be familiar with the REST Box component, which offers functionality for S3 manipulation (reading from and/or writing to S3). However, there are certain situations where a custom component is a better fit, and for that reason we developed one.

The main advantage is speed. The writer is written in Node.js using the AWS SDK library, and the actual file upload is handled via stream processing. Even somewhat larger files (1 GB+) are uploaded in a very reasonable timeframe.
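
The streaming idea itself is easy to illustrate. The sketch below is in Python with boto3, purely as an illustration of the technique (the writer's actual code is Node.js and is not shown here); the file and bucket names are hypothetical.

```python
import boto3

# Illustration of stream-based upload (not the writer's actual Node.js code).
# upload_fileobj streams the file in chunks (multipart upload under the hood),
# so even 1 GB+ files never need to be held in memory all at once.
s3 = boto3.client("s3")

with open("big-export.csv", "rb") as source:  # hypothetical local file
    s3.upload_fileobj(source, "my-bucket", "exports/big-export.csv")
```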

Configuration is very straightforward as well (similar to the SFTP/WebDAV Writer), and you should be able to start using this component very quickly.

This writer is developed independently by Blue Sky Media. For more information on how to use the writer, please refer to the documentation. If you run into any issues or have more questions, please contact me directly (radek@bluesky.pro).

Transformation UI bug

A recent minor update to the transformation UI introduced a bug where certain input/output mappings wouldn't open. The bug has been fixed and deployed. If you're still experiencing this issue, please refresh your browser.

We're sorry for this inconvenience.

Deleting Buckets and Snapshots

To simplify cleanup tasks in KBC Storage, we've added support for deleting non-empty buckets. You no longer have to delete any tables before deleting your bucket, which should save some time when cleaning up your workspace.

We've also added support for deleting table snapshots, which should be helpful when cleaning up snapshots you no longer need.

Both functions are available through the KBC Storage API, documented here: "drop bucket" and "delete snapshot".
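
For reference, calling both from a script might look like the sketch below. This is a minimal sketch only: the exact paths and the force flag are assumptions based on the linked documentation, and the token, bucket ID, and snapshot ID are hypothetical.

```python
import requests

TOKEN = "your-storage-api-token"  # hypothetical placeholder
BASE = "https://connection.keboola.com/v2/storage"
HEADERS = {"X-StorageApi-Token": TOKEN}

# Drop a non-empty bucket in one call -- no need to delete its tables first.
# The "force" flag is an assumption; see the "drop bucket" documentation.
requests.delete(BASE + "/buckets/in.c-my-bucket",
                headers=HEADERS, params={"force": "true"}).raise_for_status()

# Delete a table snapshot by its (hypothetical) ID.
requests.delete(BASE + "/snapshots/12345", headers=HEADERS).raise_for_status()
```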

Gmail Extractor

We are proud to announce a new member of our group of Docker extractors - the Gmail Extractor.

This extractor helps you fetch messages from your Inbox by specifying simple queries. These queries are identical to those you use in Gmail's web interface. If you're more into advanced search, check Gmail's Help site.
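
For example, queries like these (standard Gmail search operators; the addresses and labels are hypothetical) would work as-is:

```
from:invoices@example.com after:2016/01/01
subject:"weekly report" has:attachment
label:newsletters -in:spam
```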

Main features of the Gmail Extractor:

  • you can specify multiple queries
  • the output is divided into multiple tables, which gives you more freedom when selecting/joining data
  • multiple runs of the same query won't download the same data again
  • you can specify which headers you want to download

The process for adding the Gmail Extractor is very similar to the other extractors (e.g. the AdWords Extractor v2):

1. Find Gmail Extractor in the list of extractors

2. Create a new configuration for this extractor

3. Authorize the extractor to access your inbox

4. Configure the extractor and specify the queries you want to run

For more detailed information about the extractor configuration options, please see the documentation.


AdWords Extractor API Update

The extractor will be updated to version v201601 of the AdWords API at the beginning of April. Please review your configuration by the end of March to make sure it does not use metric names that are no longer supported. Also, do not use any new metrics until the update is done; we will let you know when that happens.

GoodData Writer configuration status update

By now, all writers read configurations of datasets, date dimensions and filters from the Components Configurations API only. The corresponding tables in Storage API still exist and are updated, even though they are no longer used for reading.

The users table is no longer used for reading or updating; information about users created by writers is stored exclusively in the Writer's backend. You can access the data using the API: http://docs.keboolagooddatawriterv2.apiary.io/#reference/projects-and-users-provisioning/users.
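
Fetching that data from a script might look like the sketch below. This is a minimal sketch only: the endpoint path and parameters are assumptions, so see the Apiary documentation linked above for the exact contract; the token and writer ID are hypothetical.

```python
import requests

TOKEN = "your-storage-api-token"  # hypothetical placeholder
URL = "https://syrup.keboola.com/gooddata-writer/users"

# List users created by the writer (writerId value is hypothetical).
response = requests.get(URL,
                        headers={"X-StorageApi-Token": TOKEN},
                        params={"writerId": "my_writer"})
response.raise_for_status()
print(response.json())
```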

The filters_projects table is no longer used; the GoodData URIs of the filters were moved to the filters section of the component configuration.

The filters_users table is no longer used; information about filters assigned to users is obtained directly from the GoodData API. Note that this implicitly makes the sync-filters API call obsolete: the call still rebuilds filter-user relations according to the filters_users table, but the table itself is no longer updated.

The last tables actively used by writers are projects and project_users; they will be migrated soon, probably this week, moving to the Writer's backend just like the users table.

Please don't delete the configuration tables yourself; whole buckets will be deleted automatically once they are no longer used, probably within two weeks.

Project Limits

Today we’re introducing limits to all Keboola Connection projects. You can find them in the “Users & Settings” section.

It will let you know what your limits are for storage, user licenses, orchestrations, etc.

If your project or component is over a limit, the metric will be shown in a red box.

Keep in mind that these are just soft quotas which can easily be exceeded. So if you go over a limit, you don’t need to be afraid of anything happening to your project (we are all in the cloud after all, lots of room up here :). Our goal is just to keep the red metrics at a minimum, so you may be hearing from us if too many red boxes hang around for too long. The end result should be that you get your desired performance, and we get our profit :)

The project settings will also list your monthly cost, project type and days remaining until project expiration (typically proof-of-concept or demo projects will have expiration conditions).

Project expiration will also be announced on the project homepage (Overview).

Since we’re moving to this new system from our old filing cabinet and fax machine solution, there might be some glitches in the numbers displayed. If you find any discrepancies with the numbers there, please let us know. 

Week in Review

SSL for Transformation Sandboxes

Connection through SSL is now available when creating sandboxes for transformations.


Intercom.io Extractor

An extractor for https://www.intercom.io/ has been added to Keboola Connection.


"Load more" button added in Notifications

You can now browse older notifications with the "Load more" button on the Keboola Connection Notifications page.


Changes to R transformations

We have made some internal changes to the R transformation backend which resolve a couple of edge-case issues and bring a few new features:

  • We have unified R Transformations and R Custom Science to use exactly the same environment - this solves issues when using code from transformations in Custom Science.
  • R Transformation scripts now have their encoding properly set to UTF-8 - manually setting the encoding with Sys.setlocale("LC_CTYPE", "cs_CZ.UTF-8") is no longer necessary.
  • R packages are now installed from multiple backup CRAN repositories - this solves an issue where a package installation would randomly fail in case of a CRAN outage.
  • When installing R packages, we now automatically load them - this solves an issue where a package was successfully installed but failed to load. It also has the nice side effect that you no longer need to call library() explicitly to load packages in your R code.
  • R script output is now available in Transformation events, so you can do some basic logging with print('message') or, a bit more nicely, write('message', stdout()).

You don't need to make any changes to your R scripts. If you run into any incompatibilities, let us know.

Other posts this week