Security improvements

We're announcing a few security improvements:

  • All our client-facing servers use EV security certificates (what is EV?).
  • All our servers have disks encrypted using Amazon AWS KMS.
  • All our Elasticsearch clusters encrypt all events.
  • Amazon Redshift backends are encrypted by default. Existing customers can request to be moved to encrypted backends.
  • Storage API employs native Amazon S3 file encryption by default.
  • All our Multi-AZ RDS metadata servers have encrypted data by default.
  • New Amazon RDS servers are encrypted by default. Existing customers can request to be moved to encrypted backends.

Long story short: if you're connecting to Keboola Connection, the client-facing servers are covered by strong SSL encryption with our identity displayed in the browser's address bar, and all client data in Keboola Connection is encrypted by default.

Pigeon Extractor issues

The backend API powering our Pigeon Extractor (email data import) is having incidents with its infrastructure (caused by Rackspace maintenance). Their status page is here, their Twitter account here. Once they stabilise their platform, everything should be OK. Please be patient.

Table Aliases with your custom SQL

A few weeks ago, we silently launched the ability to create Storage API Aliases using your own SQL code. These Alias Tables with custom SQL can only be created on the Redshift backend.

Create New Alias Table:

Define your own SQL code:


Alias Tables can help you structure your data. Think of them as "Transform on Demand" - everything happens on the fly (i.e. in real time). Say we have business transactions in a table called "data". This is an example of how to define a "derived" table with a weekly sum of all transactions that can't be joined with our Customer table (alarm, wrong data!! :-)

Raw Result of this simple alias table:


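Under the hood, an alias like this is just a SQL query evaluated on the fly. Here is a minimal sketch of the same idea using an SQLite view (table and column names are assumptions for illustration, not Keboola's actual schema):

```python
import sqlite3

# A minimal "Transform on Demand" sketch: a view is recomputed on every
# read, just like an Alias Table. Schema below is assumed, not Keboola's.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE data (id INTEGER, customer_id INTEGER, amount REAL, created TEXT);
INSERT INTO data VALUES
  (1, 10,  100.0, '2015-01-05'),
  (2, NULL, 50.0, '2015-01-06'),
  (3, NULL, 25.0, '2015-01-13');
-- The "alias table": a weekly sum of transactions that cannot be joined
-- to a customer (customer_id IS NULL) - the "wrong data" from above.
CREATE VIEW orphaned_weekly AS
  SELECT strftime('%Y-%W', created) AS week, SUM(amount) AS total
  FROM data
  WHERE customer_id IS NULL
  GROUP BY week;
""")
rows = conn.execute("SELECT week, total FROM orphaned_weekly ORDER BY week").fetchall()
print(rows)  # [('2015-01', 50.0), ('2015-02', 25.0)]
```

Because the view holds no data of its own, any new row inserted into "data" shows up in the derived table on the very next query.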
Thanks to the almost unlimited power of the Redshift cluster, you can also create much more complex examples. For instance, this one creates a denormalised table of transactions that occur during the night on weekends, in EUR, outside of the Czech Republic, and don't have one specific product code:
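As a hedged illustration of such a filter (again sketched in SQLite rather than Redshift, with an assumed schema and assumed country/product values):

```python
import sqlite3

# Illustrative sketch of the multi-condition alias described above.
# All column names and filter values are assumptions, not Keboola's schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE transactions (
  id INTEGER, amount REAL, currency TEXT, country TEXT,
  product_code TEXT, created TEXT
);
INSERT INTO transactions VALUES
  (1, 9.9, 'EUR', 'DE', 'A1', '2015-01-10 02:15:00'),  -- Saturday night: kept
  (2, 5.0, 'EUR', 'CZ', 'A1', '2015-01-10 02:30:00'),  -- Czech Republic: excluded
  (3, 7.5, 'USD', 'DE', 'A1', '2015-01-10 03:00:00'),  -- not EUR: excluded
  (4, 3.3, 'EUR', 'AT', 'X9', '2015-01-11 23:30:00'),  -- excluded product code
  (5, 4.4, 'EUR', 'FR', 'B2', '2015-01-12 02:00:00');  -- Monday: not a weekend
CREATE VIEW night_weekend_eur AS
  SELECT * FROM transactions
  WHERE currency = 'EUR'
    AND country <> 'CZ'
    AND product_code <> 'X9'
    AND strftime('%w', created) IN ('0', '6')          -- Sunday or Saturday
    AND CAST(strftime('%H', created) AS INTEGER) < 6;  -- "during the night"
""")
ids = [r[0] for r in conn.execute("SELECT id FROM night_weekend_eur")]
print(ids)  # [1]
```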

Feature Wishlist

We're announcing a public Trello board, where you can lobby for Keboola Connection features - it's your wishlist.

You can add your ideas by emailing them to and vote like crazy for anything you find interesting - we'll be watching this board very closely and, hopefully, your dreams will come true soon :-) More votes mean higher visibility on our board!

If you create your own Trello account, you can subscribe to our board (or just to a specific card) and receive all notifications. This might be handy if you'd like to look over our shoulders - we'll be tracking all customer-facing features here, so voting on our cards will affect our priorities. You can separate our features from your wishes with filters - your cards have a green label.

Maintenance Announcement

On January 17th, 2015 we will perform a scheduled upgrade of our metadata servers. This will cause a maintenance window from 2:00 pm to 3:00 pm (GMT+1), or 5:00 am to 6:00 am (PST).

During the maintenance, you won't be able to access your data. All network connections will be terminated with an "HTTP 503 - down for maintenance" status message.

We will monitor all running tasks and restart them in case of any interruption. Orchestrations and running transformations will generally be delayed, but not interrupted. However, feel free to re-schedule your Saturday orchestrations to avoid this maintenance window.
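If your own scripts hit our APIs during the window, a simple retry loop that waits out the 503 responses may help. This is a generic sketch, not an official client:

```python
import time

# Hedged sketch: retry a request while the API answers
# "HTTP 503 - down for maintenance". `request` is any callable
# returning an HTTP status code.
def call_with_retry(request, retries=3, delay=0.0):
    status = None
    for attempt in range(retries):
        status = request()
        if status != 503:
            return status
        time.sleep(delay)  # back off before the next attempt
    return status

# Simulated API that is in maintenance for the first two calls.
responses = iter([503, 503, 200])
status = call_with_retry(lambda: next(responses))
print(status)  # 200 once maintenance ends
```

In real code you would use an exponential `delay` and cap the total wait at slightly over the announced one-hour window.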

Direct (r/o) Access to any Redshift Bucket

Today we're announcing a new Storage API feature: Bucket Credentials (API here).

If you're using Keboola Connection with the Redshift backend, you can get read-only credentials (direct SQL access) to any Redshift bucket.

In the Storage API Console, go to Bucket Detail > Credentials and press the "Create new credentials" button:

Describe new credentials (you can have multiple credentials assigned to each bucket!):

When you create credentials, carefully copy & paste them to your SQL client or preferred remote service (, etc.). Once you close the displayed credentials, you can't view their settings again:

If you need to re-use already created credentials, you have to delete them and create a new username and password combination. All existing credentials are listed under their bucket:

  1. credentials can be used for accessing just one bucket
  2. write access isn't supported 

WARNING: Always use SSL when accessing your data. Generated credentials open access to your dedicated AWS Redshift cluster. Please read "Configure Security Options for Connections". The Redshift cluster's CA certificate can be downloaded here.
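For example, a connection string for a libpq-based SQL client might look like this (host, database, and credential values are placeholders, not real Keboola endpoints):

```python
# Hedged sketch: composing a libpq-style DSN for read-only bucket access
# over SSL. All values below are placeholders for illustration.
def redshift_dsn(host, dbname, user, password, port=5439):
    # sslmode=verify-full forces SSL and validates the server certificate
    # against the downloaded Redshift CA bundle (sslrootcert).
    return (
        f"host={host} port={port} dbname={dbname} "
        f"user={user} password={password} "
        f"sslmode=verify-full sslrootcert=redshift-ca-bundle.crt"
    )

dsn = redshift_dsn("example.redshift.amazonaws.com", "bucket_db", "ro_user", "secret")
print(dsn)
```

With `sslmode=verify-full`, a client will refuse to connect if the connection can't be encrypted or the server's certificate doesn't match, which is exactly the behaviour you want for credentials that open your cluster.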

Tableau Writer

For those who love Tableau, we're officially launching the "Tableau Writer". At this time, it can provision a MySQL database or push data to your own MySQL server.

Our Tableau Writer allows you to rename tables and their columns and to define data types for better handling on the Tableau side (whose configuration is pretty straightforward too - just let Tableau connect to the MySQL server using the proper credentials).
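The kind of mapping the writer applies could be sketched like this (the configuration format and all names shown here are illustrative assumptions, not Keboola's actual schema):

```python
# Hypothetical mapping: Storage table -> Tableau-friendly names and types.
config = {
    "in.c-main.orders": {
        "name": "Orders",                    # table rename
        "columns": {
            "id":     {"name": "OrderId", "type": "INT"},
            "amt":    {"name": "Amount",  "type": "DECIMAL(12,2)"},
            "c_date": {"name": "Created", "type": "DATETIME"},
        },
    },
}

def rename_row(table, row):
    """Apply the column renames from the mapping to one row of data."""
    cols = config[table]["columns"]
    return {cols[k]["name"]: v for k, v in row.items()}

out = rename_row("in.c-main.orders", {"id": 1, "amt": "9.90", "c_date": "2015-01-01"})
print(out)  # {'OrderId': 1, 'Amount': '9.90', 'Created': '2015-01-01'}
```

Declaring proper types (DECIMAL, DATETIME) up front is what saves you from Tableau guessing everything as text.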

If you're a "Tableau Desktop Personal Edition" user, you'll be able to enjoy TDE files very soon (delivered to your Google Drive account automatically)!

Storage API outage

Our Storage API servers were offline from 2:33 UTC to 6:47 UTC on July 21st, 2014. We're still identifying the root cause of this outage, but the servers are up and running. You can expect slow performance for the next 1-2 hours. All nightly jobs failed.

We let you down and we know it. We take our responsibilities — and the trust you place in us — very seriously. I cannot express how sorry I am to those of you who were inconvenienced. 

Please contact us at in case you have any trouble - we'd like to assist you!