- Files can be tagged and filtered by tags
- Files are searchable
It is handy for integration with other AWS services such as Redshift, and you can also utilize existing AWS SDKs for file uploads.
The file uploads backend has also been migrated to Elasticsearch; expect search and tagging capabilities in the near future.
There’s a new extractor you can use from the Keboola Connection’s Extractor tab.
It extracts call logs from Telfa into Storage API.
If you set the “from” filter (the date since which you want to extract calls) to auto, the first run will load all calls, and every subsequent run of the extractor (using the same configuration, of course) will extract only the calls since the previous execution (with a minimal overlap).
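The incremental behavior described above can be sketched roughly as follows; the function name, the state representation, and the overlap size are assumptions for illustration, not the extractor's actual internals:

```python
from datetime import datetime, timedelta

# A small overlap is re-fetched on each run so that calls logged close to
# the previous execution time are not missed (the size is an assumption).
OVERLAP = timedelta(minutes=5)

def extraction_window(last_run, now):
    """Return the (from, to) date range to extract on this run.

    `last_run` is the timestamp of the previous successful execution,
    or None on the first run -- in which case all calls are loaded,
    matching the "from" filter set to auto.
    """
    if last_run is None:
        # First run: load all calls from the beginning.
        return datetime.min, now
    # Subsequent runs: continue from the previous execution, minus overlap.
    return last_run - OVERLAP, now
```

After a successful extraction, the run would persist `now` as the new `last_run`, so the next execution continues from there.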
Several orchestrations failed in the last few hours because of a bug in the Writer, which tried to recreate existing date dimensions in the GoodData project. The error message looks like: “Dimension … already exists”.
- a new UI has been deployed under the ‘Extractors’ tab in the KBC UI
- when adding a report to extract, it must come from a GoodData project configured (via the Writer) under the same KBC project.
Unification with the latest Keboola Connection UI, in preparation for upcoming features.
In the transformation detail you can see which transformations depend on the current transformation. The list is not editable; clicking a label redirects to that transformation’s detail.
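Conceptually, the dependents list is a reverse lookup over each transformation’s declared requirements. A minimal sketch, assuming a plain mapping rather than KBC’s actual internal representation:

```python
def dependents_of(name, requires):
    """Given a mapping {transformation: [transformations it requires]},
    return the transformations that depend on `name` -- i.e. the
    read-only list shown in the transformation detail."""
    return sorted(t for t, deps in requires.items() if name in deps)

# Hypothetical configuration: "report" and "denormalize" both
# require "load", so both appear as dependents of "load".
requires = {
    "load": [],
    "denormalize": ["load"],
    "report": ["denormalize", "load"],
}
```

Calling `dependents_of("load", requires)` would list `denormalize` and `report`.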
- feature: added run button with ability to specify date range and run configuration
- bug fix: adding a new query to an empty configuration
- Jobs can be filtered by creator
- Load type (incremental/full) is shown in jobs table
- Bugfix - a data type can be specified for LABEL in the dataset configuration
Creation of date dimensions has been separated from uploadTable and is now visible in the jobs queue as an independent job (called uploadDateDimension).