Investigating EU overload

We experienced overload in the EU region.

Since 2020-10-12 11:40 UTC we have been investigating overload in the EU region. Next update in 1 hour or as new information becomes available.

UPDATE 2020-10-12 12:56 UTC: Everything should be back to normal. No jobs should be affected. We'll keep monitoring our platform closely.

UPDATE 2020-10-12 13:57 UTC: All operations are back to normal and everything is fully working.

Telemetry Data Fixes and Upcoming Telemetry Component

Dear customers,

We are in the final phase of shifting and finalizing the calculation of time credits. We thank you for your patience and partnership in working with us and transitioning your contracts over to the new credit system.

As part of this finalization, we are implementing some changes that weren't applied previously; these will be reflected in the calculation starting now.

  • Transformations with multiple transformation backends - If a transformation runs an SQL transformation followed by an R/Python transformation in the same bucket, the SQL transformation's run time has so far been calculated until the R/Python transformations finished. From now on, these SQL jobs will have their child jobs' run time deducted from their own, resulting in lower time credit consumption for affected clients. The change will cover jobs since the beginning of 2020.
  • SQL sandboxes time credits usage - Most of our clients don't have SQL sandbox consumption added to their overall consumption. In our analysis, this is a small fraction of total consumption, so the impact is not material for the majority of clients. If the impact is material for you, your CSM will reach out to explain it; otherwise you can assume your usage will not be affected.
  • COVID-19 error jobs - We haven't been calculating credits for error jobs since April 2020 due to COVID-19 and a higher-than-normal rate of Snowflake-related issues. We will begin adding the consumption from job errors starting in October.
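To make the first change above concrete, here is a minimal sketch of the deduction: a parent SQL job's billable time is its own run time minus the run time of its R/Python child jobs. All names (`billed_seconds`, the `start`/`end` dict keys) are hypothetical illustrations, not Keboola's actual implementation.

```python
from datetime import datetime, timedelta

def billed_seconds(job, child_jobs):
    """Return the parent job's billable run time in seconds with its
    child jobs' run times deducted, clamped at zero.

    `job` and each element of `child_jobs` are dicts with `start`
    and `end` datetimes. This is an illustrative sketch only.
    """
    total = (job["end"] - job["start"]).total_seconds()
    children = sum((c["end"] - c["start"]).total_seconds() for c in child_jobs)
    return max(0.0, total - children)

# Example: an SQL transformation ran for 10 minutes, but 6 of those
# minutes were spent waiting for an R child transformation to finish.
t0 = datetime(2020, 1, 1, 12, 0)
sql_job = {"start": t0, "end": t0 + timedelta(minutes=10)}
r_job = {"start": t0 + timedelta(minutes=2), "end": t0 + timedelta(minutes=8)}
print(billed_seconds(sql_job, [r_job]))  # 240.0, i.e. 4 billable minutes
```

Under this scheme, an SQL job with no child jobs is billed for its full run time, as before.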



Telemetry Component

We’re planning to release a new Keboola Connection component that will allow users to get telemetry data about their project or organization. It will eventually replace our temporary solutions, such as the GoodData telemetry dashboard or direct writing of telemetry data to selected projects. In the end, all Keboola Connection users will have easy access to documented telemetry data on demand.

Snowflake issues in US region

We're investigating an issue with Snowflake in the US region that causes some Storage table operations to get stuck in the processing state. This can cause jobs to run longer than expected or seemingly "forever". Terminating and restarting the job does not help in such a case. Only certain projects are affected.

Next update in 1 hour or as new information becomes available.

Update 10:50 UTC: The stuck jobs are unblocked now and should be finishing; we're monitoring the situation in case the issue reappears. A post-mortem will be published once we get an RCA from Snowflake.


Keboola-provided credentials for Snowflake and Redshift database writers

When configuring a Snowflake or Redshift database writer, you can use a Keboola-provided database.

In the past, when you selected this option, the credentials were stored in the configuration in plain text. Storing the credentials this way allowed you to copy the password and use it in your favorite database client (or another system) even if you didn't copy it right after creation.

To improve the overall security, we decided to show you your password only once and store it encrypted. From now on, when you create a new Keboola-provided database (Snowflake or Redshift), you will see the password only once, right after its creation.

Backward compatibility

Existing credentials will remain untouched. However, if you delete them, there is no option to recreate them the old way.

Week in Review - September 14th, 2020

New Components

  • LiveRamp Identity Resolution application - solving some of the main challenges with customer and prospect data by returning people-based identifiers and metadata for your consumer records

  • KBC Project Metadata extractor - downloads metadata about all objects in your Keboola project.

  • Avro2CSV processor - Avro is a row-oriented remote procedure call and data serialization framework developed within Apache's Hadoop project. It uses JSON for defining data types and protocols, and serializes data in a compact binary format.

Updated Components

  • Generic extractor - added the "caCertificate" option, which allows you to configure a custom certificate authority bundle in CRT/PEM format (documentation)
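As a rough illustration, a Generic Extractor configuration using the new option might look like the fragment below. The surrounding keys (`baseUrl`, `jobs`) are only sketched from the typical Generic Extractor config shape; treat everything except `caCertificate` as an assumption and consult the linked documentation for the exact placement.

```json
{
  "api": {
    "baseUrl": "https://example.com/api/",
    "caCertificate": "-----BEGIN CERTIFICATE-----\nMIIB...snip...\n-----END CERTIFICATE-----"
  },
  "config": {
    "jobs": [
      {"endpoint": "users"}
    ]
  }
}
```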

Minor Improvements

  • Google BigQuery - updated the google-cloud-bigquery package

  • Python updated to 3.8.5

  • Julia updated to 1.5.0

Snowflake US - Performance degradation

We are investigating slight performance degradation of Snowflake in the US region. There are no job failures or increased queue backlog, but everything seems to run slightly slower. The degradation started around 00:00 UTC. Next update in 120 minutes or as new information becomes available.

UPDATE 2020-09-02 14:59 UTC: We still see slight performance degradation of some queries. We are in touch with Snowflake support. Next update tomorrow or as new information becomes available. 

UPDATE 2020-09-03 06:31 UTC: The issue is now resolved, performance went back to normal around 2020-09-03 00:00 UTC. We are waiting for more details about the issue from Snowflake.



Snowflake Incident in US region

On September 1st, between 8:25 PM UTC and 9:48 PM UTC, there was an incident with the Snowflake service which led to Storage job failures.

The issue is now resolved and all systems are operational. We apologize for the inconvenience caused by this incident.

Snowflake Incident in US region

Since 2020-08-25 8:35 UTC we have been experiencing Storage errors in the US region due to a reported incident in Snowflake. We will monitor the situation and keep you posted within 90 minutes.

UPDATE 2020-08-25 9:20 UTC: Snowflake incident update:

We have identified a problem with the Snowflake Service that is interrupting the following:
1. Access to the Snowflake UI
2. Query execution

Incident Start Time: 01:20 PT Aug 25, 2020
We will provide an update within 30 minutes or as soon as we have more details on the status of the issue.

UPDATE 2020-08-25 9:50 UTC: The problem seems to have disappeared; we haven't seen any errors since 9:30 UTC. However, Snowflake hasn't updated the incident yet, so we are still monitoring the situation.

UPDATE 2020-08-25 10:15 UTC: A fix to resolve the issue is applied and the situation is still being monitored.

Week in Review - August 14th, 2020

Updated Components

  • AWS S3 Extractor supports Authentication with an AWS role (documentation)

  • Twitter Extractor supports Direct Messages

To extract direct messages, you must reauthorize the account, since an additional permission (DMs) is needed.

  • MongoDB Extractor supports custom URI connection

  • MSSQL Extractor supports encrypted (SSL) connections


UI Improvements

  • Storage job detail now has a permalink

To get a permalink for the job detail, click the job ID. A popup with the job detail will appear, and the URL in your browser will change.


Minor Improvements