GoodData API issues

Some older GoodData Writer configurations may experience failures of model update jobs, accompanied by the error message "You can not use a model that contains two or more facts identified by ids having same two tailing segments.". The problems are caused by unexpected changes in GoodData's Project Model API after their Saturday release. We are working on a fix in cooperation with GoodData support. Thanks for your patience.

UPDATE (18:00 CET): The problem still isn't solved and will apparently take some more time. We will keep you updated. However, you should be able to load data if you avoid updating the project model. That means using the API calls load-data or load-data-multi instead of update-table and update-project (see the API documentation).
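
For illustration, here is a minimal sketch of that workaround in Python. The base URL, authentication header and parameter names are assumptions for the example only; please follow the API documentation for your setup.

    import requests

    # Assumed endpoint and credentials for illustration only; check the API documentation.
    BASE_URL = "https://syrup.keboola.com/gooddata-writer"
    HEADERS = {"X-StorageApi-Token": "YOUR_STORAGE_API_TOKEN"}

    # Load data without touching the project model, which avoids the failing model update.
    response = requests.post(
        f"{BASE_URL}/load-data",
        headers=HEADERS,
        json={
            "writerId": "main",                 # hypothetical writer id
            "tables": ["out.c-main.products"],  # hypothetical table id
        },
    )
    response.raise_for_status()
    print(response.json())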

UPDATE (Mar 22 17:30 CET): GoodData released a fix which prevents these model update job failures. The Writer should now work without problems. Thanks for your patience.

GoodData Writer configuration status update

By now, all writers read configurations of datasets, date dimensions and filters exclusively from the Components Configurations API. The corresponding tables in Storage API still exist and are kept updated, even though they are no longer used for reading.

The users table is no longer used for reading or updating; information about users created by writers is stored exclusively in the Writer's backend. You can access the data using the API: http://docs.keboolagooddatawriterv2.apiary.io/#reference/projects-and-users-provisioning/users.
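
As an illustration, here is a minimal Python sketch of listing the provisioned users through that API. The base URL, the writerId parameter and the response shape are assumptions for the example; the linked apiary documentation is the authoritative reference.

    import requests

    # Assumed endpoint and credentials for illustration only; see the linked documentation.
    BASE_URL = "https://syrup.keboola.com/gooddata-writer"
    HEADERS = {"X-StorageApi-Token": "YOUR_STORAGE_API_TOKEN"}

    # List users created by the writer; the "writerId" parameter and the
    # "users" key in the response are assumptions, not the documented contract.
    response = requests.get(f"{BASE_URL}/users", headers=HEADERS, params={"writerId": "main"})
    response.raise_for_status()
    for user in response.json().get("users", []):
        print(user)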

The filters_projects table is no longer used; the GoodData URIs of the filters were moved to the filters section of the component configuration.

The filters_users table is no longer used; information about which filters are assigned to which users is obtained directly from the GoodData API. Note that this implicitly makes the sync-filters API call obsolete: the call still rebuilds the filter-user relations according to the filters_users table, but the table itself is no longer updated.

The last tables actively used by Writers are projects and project_users, and they will be migrated soon, probably this week. They will be moved to the Writer's backend in the same way as the users table.

Please don't delete the configuration tables yourself; the whole buckets will be deleted automatically once they are no longer used, probably within two weeks.

Some failing data uploads in GoodData Writer

In recent days, several errors with messages like "Could not export table out.c-main.products from Storage API: Table Activities not found in bucket out.c-main." appeared unexpectedly. This is a direct consequence of this change from December. When you call the upload project, load-data or load-data-multi API calls without the tables parameter, the Writer takes all configured tables and tries to upload them. When a table is missing from Storage API but still has a configuration in the Writer, this failure occurs. It didn't happen earlier because the Writer used to remove configurations of deleted SAPI tables automatically.

Because this problem confused several of our clients, we decided to make the behaviour more forgiving. Now, if you call upload project, load-data or load-data-multi without the tables parameter, the Writer ignores configurations of non-existent tables and doesn't fail. However, if you call load-data or load-data-multi with explicitly listed tables (in the tables parameter) and some of those tables don't exist, the job will still fail.
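
For illustration, a hedged Python sketch of the two cases. The endpoint, header and parameter names are assumptions for the example; consult the API documentation for the exact contract.

    import requests

    # Assumed endpoint and credentials for illustration only.
    BASE_URL = "https://syrup.keboola.com/gooddata-writer"
    HEADERS = {"X-StorageApi-Token": "YOUR_STORAGE_API_TOKEN"}

    # Without the "tables" parameter the Writer uploads all configured tables
    # and now silently skips configurations whose Storage API table no longer exists.
    requests.post(f"{BASE_URL}/load-data-multi", headers=HEADERS,
                  json={"writerId": "main"}).raise_for_status()

    # With an explicit "tables" list, a missing table still makes the job fail.
    requests.post(f"{BASE_URL}/load-data-multi", headers=HEADERS,
                  json={"writerId": "main",
                        "tables": ["out.c-main.products"]}).raise_for_status()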

We apologize for the confusion.

Tables list change in GoodData Writer

Tables are no longer added to the GoodData Writer UI automatically; you have to add them there explicitly. Naturally, you can also delete tables that are no longer needed. This improvement should benefit clarity and unifies the behaviour with other writers. Non-configured tables (i.e. those with all columns ignored) will soon be removed from your existing configurations by an automated script.

On-demand SSO access to GoodData projects from KBC

Access to GoodData projects from their writers in Keboola Connection has changed. When you enter a project in KBC, you are no longer automatically added to any GoodData projects. Instead, each writer has a new "Enable Access to Project" button which needs to be confirmed explicitly first.


After that, you can access the project directly using an SSO link (no need to log in first any more), or you can disable access to the project.


Also, when you leave a KBC project, you are automatically removed from all of its GoodData projects.

GoodData Writer automatically creates one GoodData account for each KBC user. Unfortunately, for security reasons we cannot connect this feature to your existing GoodData accounts, but once you are logged into GoodData via an SSO link, you can switch to other projects you joined earlier, so it should not be a big obstacle.

Access to projects is provisioned with the editor role by default, but you can ask support to grant you access with the admin role if you are sure you need it.

GoodData Writer Failures

There has been a problem with the GoodData API since about 12:00 CEST (10:00 UTC) which causes some model updates and data loads to fail. We are investigating the problem with GoodData support and will keep you updated.

UPDATE 16:15 CEST: GoodData support is still investigating the problem, but our logs show that it hasn't reappeared for almost three hours. A preliminary report indicates that they had some trouble with the connection to the S3 storage where the LDM models of projects are stored.

Changes to GoodData Writer's upload project and upload table behaviour

The upload-project and upload-table API calls now enforce a model update every time they are called. Deciding whether or not to update the model by comparing the dataset's last change date in GoodData and in the Writer had not been reliable in some situations, so we decided to drop it altogether.

Now the upload-table API call generates one UpdateModel task and one LoadData task every time it is called, while the upload-project API call generates only one UpdateModel task for the whole project at once.