
pentaho-pdi-plugin-jdbc-metadata's People

Contributors

andr3al3x, dependabot[bot], rpbouman, slawo-ch

pentaho-pdi-plugin-jdbc-metadata's Issues

EE repository serialization errors

The step has trouble serializing to the Pentaho Enterprise Repository.

In particular:

  • output fields are not restored when reading from the Enterprise Repository

When exporting from a file-based repository and importing into the Enterprise Repository:

  • boolean values are not preserved after import (e.g. the "always pass input row" checkbox)

  • empty method arguments cause the argument list to be truncated after import
    example: "show tables" with

    • catalog: [empty]
    • schema pattern: %
    • name pattern: %
    • table type: TABLE

    results in the table type not being preserved after import (a serialization sketch follows this list)
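
The symptoms above are consistent with how PDI step plugins persist their settings to a repository: booleans and indexed list attributes each have to be written and read explicitly by the step meta. The sketch below is not the plugin's actual code; it only illustrates the usual pattern with hypothetical field and attribute names, using the standard org.pentaho.di.repository.Repository attribute API and with signatures simplified relative to StepMetaInterface. If a boolean is never written, or an empty argument is skipped instead of being stored as "", exactly the lost-checkbox and truncation behaviour described here can result.

    import org.pentaho.di.core.exception.KettleException;
    import org.pentaho.di.repository.ObjectId;
    import org.pentaho.di.repository.Repository;

    // Illustrative only: typical save/read pattern for PDI step settings.
    // Field and attribute names are hypothetical; signatures are simplified.
    class SerializationSketch {
      private boolean alwaysPassInputRow;
      private String[] arguments = new String[0];

      void saveRep(Repository rep, ObjectId idTransformation, ObjectId idStep) throws KettleException {
        // Booleans must be written explicitly or they come back as false after import.
        rep.saveStepAttribute(idTransformation, idStep, "always_pass_input_row", alwaysPassInputRow);

        // Indexed attributes: write every slot, storing empty values as "" rather
        // than skipping them, so the argument list keeps its length on re-import.
        for (int i = 0; i < arguments.length; i++) {
          rep.saveStepAttribute(idTransformation, idStep, i, "argument",
              arguments[i] == null ? "" : arguments[i]);
        }
      }

      void readRep(Repository rep, ObjectId idStep) throws KettleException {
        alwaysPassInputRow = rep.getStepAttributeBoolean(idStep, "always_pass_input_row");

        int nr = rep.countNrStepAttributes(idStep, "argument");
        arguments = new String[nr];
        for (int i = 0; i < nr; i++) {
          arguments[i] = rep.getStepAttributeString(idStep, i, "argument");
        }
      }
    }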

Issue when using the plugin embedded in java

I am trying to execute a transformation in Java using the Pentaho SDK. When pentaho-pdi-plugin-jdbc-metadata is used in a transformation, the step does not seem to set its output fields correctly, but only when the transformation is started from Java; when the same transformation is executed in Spoon, everything works fine. To rule out incorrect usage on our side, we used the example code that ships with the plugin's source code on Git.

Concrete case:

The example transformation "generate-scd2.ktr" runs in Spoon without any problems.
In Java, the following code was used to execute the transformation:

    import org.pentaho.di.core.KettleEnvironment;
    import org.pentaho.di.core.logging.LogLevel;
    import org.pentaho.di.repository.Repository;
    import org.pentaho.di.trans.Trans;
    import org.pentaho.di.trans.TransMeta;

    // Initialize the Kettle environment and load the transformation from disk.
    KettleEnvironment.init();
    TransMeta transMeta = new TransMeta("/home/mhe/Downloads/test/generate-scd2.ktr", (Repository) null);
    transMeta.setParameterValue("CATALOG_NAME", "el2export");
    transMeta.setParameterValue("SCHEMA_NAME", "el2export");
    transMeta.setParameterValue("TABLE_NAME", "meldung");
    // Run the transformation and wait for it to finish.
    Trans transformation = new Trans(transMeta);
    transformation.setLogLevel(LogLevel.DETAILED);
    transformation.execute(new String[0]);
    transformation.waitUntilFinished();

The log output looks like this:

log4j:WARN No appenders could be found for logger (org.apache.commons.vfs2.impl.StandardFileSystemManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
2016/02/01 14:07:56 - generate-scd2 - Dispatching started for transformation [generate-scd2]
2016/02/01 14:07:56 - generate-scd2 - Nr of arguments detected:0
2016/02/01 14:07:56 - generate-scd2 - This is not a replay transformation
2016/02/01 14:07:56 - generate-scd2 - I found 13 different steps to launch.
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets...
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 0 --> Get parameters
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [Get parameters.0 - constants.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 1 rowsets for step 0 --> Get parameters
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 1 --> constants
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [constants.0 - PK column(s).0]
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [constants.0 - Table columns.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 3 rowsets for step 1 --> constants
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 2 --> PK column(s)
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [PK column(s).0 - lookup PK column.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 4 rowsets for step 2 --> PK column(s)
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 3 --> Table columns
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [Table columns.0 - lookup PK column.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 5 rowsets for step 3 --> Table columns
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 4 --> lookup PK column
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [lookup PK column.0 - Column DDL.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 6 rowsets for step 4 --> lookup PK column
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 5 --> Column DDL
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [Column DDL.0 - Columns DDL.0]
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [Column DDL.0 - Is PK Column?.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 8 rowsets for step 5 --> Column DDL
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 6 --> Columns DDL
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [Columns DDL.0 - Combine.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 9 rowsets for step 6 --> Columns DDL
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 7 --> Is PK Column?
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [Is PK Column?.0 - PK Columns.0]
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [Is PK Column?.0 - Non PK Columns.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 11 rowsets for step 7 --> Is PK Column?
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 8 --> PK Columns
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [PK Columns.0 - Combine.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 12 rowsets for step 8 --> PK Columns
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 9 --> Non PK Columns
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [Non PK Columns.0 - Combine.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 13 rowsets for step 9 --> Non PK Columns
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 10 --> Combine
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [Combine.0 - Generate Transformation.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 14 rowsets for step 10 --> Combine
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 11 --> Generate Transformation
2016/02/01 14:07:56 - generate-scd2 - prevcopies = 1, nextcopies=1
2016/02/01 14:07:56 - generate-scd2 - Transformation allocated new rowset [Generate Transformation.0 - write ktr.0]
2016/02/01 14:07:56 - generate-scd2 - Allocated 15 rowsets for step 11 --> Generate Transformation
2016/02/01 14:07:56 - generate-scd2 - Allocating rowsets for step 12 --> write ktr
2016/02/01 14:07:56 - generate-scd2 - Allocated 15 rowsets for step 12 --> write ktr
2016/02/01 14:07:56 - generate-scd2 - Allocating Steps & StepData...
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [Get parameters] of type [GetVariable]
2016/02/01 14:07:56 - Get parameters.0 - distribution activated
2016/02/01 14:07:56 - Get parameters.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - Get parameters.0 - Step info: nrinput=0 nroutput=1
2016/02/01 14:07:56 - Get parameters.0 - output rel. is 1:1
2016/02/01 14:07:56 - Get parameters.0 - Found output rowset [Get parameters.0 - constants.0]
2016/02/01 14:07:56 - Get parameters.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [Get parameters].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [constants] of type [Constant]
2016/02/01 14:07:56 - constants.0 - distribution de-activated
2016/02/01 14:07:56 - constants.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - constants.0 - Step info: nrinput=1 nroutput=2
2016/02/01 14:07:56 - constants.0 - Got previous step from [constants] #0 --> Get parameters
2016/02/01 14:07:56 - constants.0 - input rel is 1:1
2016/02/01 14:07:56 - constants.0 - Found input rowset [Get parameters.0 - constants.0]
2016/02/01 14:07:56 - constants.0 - output rel. is 1:1
2016/02/01 14:07:56 - constants.0 - Found output rowset [constants.0 - PK column(s).0]
2016/02/01 14:07:56 - constants.0 - output rel. is 1:1
2016/02/01 14:07:56 - constants.0 - Found output rowset [constants.0 - Table columns.0]
2016/02/01 14:07:56 - constants.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [constants].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [PK column(s)] of type [jdbcMetaData]
2016/02/01 14:07:56 - PK column(s).0 - distribution activated
2016/02/01 14:07:56 - PK column(s).0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - PK column(s).0 - Step info: nrinput=1 nroutput=1
2016/02/01 14:07:56 - PK column(s).0 - Got previous step from [PK column(s)] #0 --> constants
2016/02/01 14:07:56 - PK column(s).0 - input rel is 1:1
2016/02/01 14:07:56 - PK column(s).0 - Found input rowset [constants.0 - PK column(s).0]
2016/02/01 14:07:56 - PK column(s).0 - output rel. is 1:1
2016/02/01 14:07:56 - PK column(s).0 - Found output rowset [PK column(s).0 - lookup PK column.0]
2016/02/01 14:07:56 - PK column(s).0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [PK column(s)].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [Table columns] of type [jdbcMetaData]
2016/02/01 14:07:56 - Table columns.0 - distribution activated
2016/02/01 14:07:56 - Table columns.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - Table columns.0 - Step info: nrinput=1 nroutput=1
2016/02/01 14:07:56 - Table columns.0 - Got previous step from [Table columns] #0 --> constants
2016/02/01 14:07:56 - Table columns.0 - input rel is 1:1
2016/02/01 14:07:56 - Table columns.0 - Found input rowset [constants.0 - Table columns.0]
2016/02/01 14:07:56 - Table columns.0 - output rel. is 1:1
2016/02/01 14:07:56 - Table columns.0 - Found output rowset [Table columns.0 - lookup PK column.0]
2016/02/01 14:07:56 - Table columns.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [Table columns].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [lookup PK column] of type [StreamLookup]
2016/02/01 14:07:56 - lookup PK column.0 - distribution activated
2016/02/01 14:07:56 - lookup PK column.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - lookup PK column.0 - Step info: nrinput=2 nroutput=1
2016/02/01 14:07:56 - lookup PK column.0 - Got previous step from [lookup PK column] #0 --> Table columns
2016/02/01 14:07:56 - lookup PK column.0 - input rel is 1:1
2016/02/01 14:07:56 - lookup PK column.0 - Found input rowset [Table columns.0 - lookup PK column.0]
2016/02/01 14:07:56 - lookup PK column.0 - Got previous step from [lookup PK column] #1 --> PK column(s)
2016/02/01 14:07:56 - lookup PK column.0 - input rel is 1:1
2016/02/01 14:07:56 - lookup PK column.0 - Found input rowset [PK column(s).0 - lookup PK column.0]
2016/02/01 14:07:56 - lookup PK column.0 - output rel. is 1:1
2016/02/01 14:07:56 - lookup PK column.0 - Found output rowset [lookup PK column.0 - Column DDL.0]
2016/02/01 14:07:56 - lookup PK column.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [lookup PK column].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [Column DDL] of type [ScriptValueMod]
2016/02/01 14:07:56 - Column DDL.0 - distribution de-activated
2016/02/01 14:07:56 - Column DDL.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - Column DDL.0 - Step info: nrinput=1 nroutput=2
2016/02/01 14:07:56 - Column DDL.0 - Got previous step from [Column DDL] #0 --> lookup PK column
2016/02/01 14:07:56 - Column DDL.0 - input rel is 1:1
2016/02/01 14:07:56 - Column DDL.0 - Found input rowset [lookup PK column.0 - Column DDL.0]
2016/02/01 14:07:56 - Column DDL.0 - output rel. is 1:1
2016/02/01 14:07:56 - Column DDL.0 - Found output rowset [Column DDL.0 - Columns DDL.0]
2016/02/01 14:07:56 - Column DDL.0 - output rel. is 1:1
2016/02/01 14:07:56 - Column DDL.0 - Found output rowset [Column DDL.0 - Is PK Column?.0]
2016/02/01 14:07:56 - Column DDL.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [Column DDL].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [Columns DDL] of type [GroupBy]
2016/02/01 14:07:56 - Columns DDL.0 - distribution activated
2016/02/01 14:07:56 - Columns DDL.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - Columns DDL.0 - Step info: nrinput=1 nroutput=1
2016/02/01 14:07:56 - Columns DDL.0 - Got previous step from [Columns DDL] #0 --> Column DDL
2016/02/01 14:07:56 - Columns DDL.0 - input rel is 1:1
2016/02/01 14:07:56 - Columns DDL.0 - Found input rowset [Column DDL.0 - Columns DDL.0]
2016/02/01 14:07:56 - Columns DDL.0 - output rel. is 1:1
2016/02/01 14:07:56 - Columns DDL.0 - Found output rowset [Columns DDL.0 - Combine.0]
2016/02/01 14:07:56 - Columns DDL.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [Columns DDL].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [Is PK Column?] of type [FilterRows]
2016/02/01 14:07:56 - Is PK Column?.0 - distribution activated
2016/02/01 14:07:56 - Is PK Column?.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - Is PK Column?.0 - Step info: nrinput=1 nroutput=2
2016/02/01 14:07:56 - Is PK Column?.0 - Got previous step from [Is PK Column?] #0 --> Column DDL
2016/02/01 14:07:56 - Is PK Column?.0 - input rel is 1:1
2016/02/01 14:07:56 - Is PK Column?.0 - Found input rowset [Column DDL.0 - Is PK Column?.0]
2016/02/01 14:07:56 - Is PK Column?.0 - output rel. is 1:1
2016/02/01 14:07:56 - Is PK Column?.0 - Found output rowset [Is PK Column?.0 - PK Columns.0]
2016/02/01 14:07:56 - Is PK Column?.0 - output rel. is 1:1
2016/02/01 14:07:56 - Is PK Column?.0 - Found output rowset [Is PK Column?.0 - Non PK Columns.0]
2016/02/01 14:07:56 - Is PK Column?.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [Is PK Column?].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [PK Columns] of type [GroupBy]
2016/02/01 14:07:56 - PK Columns.0 - distribution activated
2016/02/01 14:07:56 - PK Columns.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - PK Columns.0 - Step info: nrinput=1 nroutput=1
2016/02/01 14:07:56 - PK Columns.0 - Got previous step from [PK Columns] #0 --> Is PK Column?
2016/02/01 14:07:56 - PK Columns.0 - input rel is 1:1
2016/02/01 14:07:56 - PK Columns.0 - Found input rowset [Is PK Column?.0 - PK Columns.0]
2016/02/01 14:07:56 - PK Columns.0 - output rel. is 1:1
2016/02/01 14:07:56 - PK Columns.0 - Found output rowset [PK Columns.0 - Combine.0]
2016/02/01 14:07:56 - PK Columns.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [PK Columns].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [Non PK Columns] of type [GroupBy]
2016/02/01 14:07:56 - Non PK Columns.0 - distribution activated
2016/02/01 14:07:56 - Non PK Columns.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - Non PK Columns.0 - Step info: nrinput=1 nroutput=1
2016/02/01 14:07:56 - Non PK Columns.0 - Got previous step from [Non PK Columns] #0 --> Is PK Column?
2016/02/01 14:07:56 - Non PK Columns.0 - input rel is 1:1
2016/02/01 14:07:56 - Non PK Columns.0 - Found input rowset [Is PK Column?.0 - Non PK Columns.0]
2016/02/01 14:07:56 - Non PK Columns.0 - output rel. is 1:1
2016/02/01 14:07:56 - Non PK Columns.0 - Found output rowset [Non PK Columns.0 - Combine.0]
2016/02/01 14:07:56 - Non PK Columns.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [Non PK Columns].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [Combine] of type [JoinRows]
2016/02/01 14:07:56 - Combine.0 - distribution activated
2016/02/01 14:07:56 - Combine.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - Combine.0 - Step info: nrinput=3 nroutput=1
2016/02/01 14:07:56 - Combine.0 - Got previous step from [Combine] #0 --> Non PK Columns
2016/02/01 14:07:56 - Combine.0 - input rel is 1:1
2016/02/01 14:07:56 - Combine.0 - Found input rowset [Non PK Columns.0 - Combine.0]
2016/02/01 14:07:56 - Combine.0 - Got previous step from [Combine] #1 --> PK Columns
2016/02/01 14:07:56 - Combine.0 - input rel is 1:1
2016/02/01 14:07:56 - Combine.0 - Found input rowset [PK Columns.0 - Combine.0]
2016/02/01 14:07:56 - Combine.0 - Got previous step from [Combine] #2 --> Columns DDL
2016/02/01 14:07:56 - Combine.0 - input rel is 1:1
2016/02/01 14:07:56 - Combine.0 - Found input rowset [Columns DDL.0 - Combine.0]
2016/02/01 14:07:56 - Combine.0 - output rel. is 1:1
2016/02/01 14:07:56 - Combine.0 - Found output rowset [Combine.0 - Generate Transformation.0]
2016/02/01 14:07:56 - Combine.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [Combine].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [Generate Transformation] of type [ScriptValueMod]
2016/02/01 14:07:56 - Generate Transformation.0 - distribution activated
2016/02/01 14:07:56 - Generate Transformation.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - Generate Transformation.0 - Step info: nrinput=1 nroutput=1
2016/02/01 14:07:56 - Generate Transformation.0 - Got previous step from [Generate Transformation] #0 --> Combine
2016/02/01 14:07:56 - Generate Transformation.0 - input rel is 1:1
2016/02/01 14:07:56 - Generate Transformation.0 - Found input rowset [Combine.0 - Generate Transformation.0]
2016/02/01 14:07:56 - Generate Transformation.0 - output rel. is 1:1
2016/02/01 14:07:56 - Generate Transformation.0 - Found output rowset [Generate Transformation.0 - write ktr.0]
2016/02/01 14:07:56 - Generate Transformation.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [Generate Transformation].0
2016/02/01 14:07:56 - generate-scd2 - Transformation is about to allocate step [write ktr] of type [TextFileOutput]
2016/02/01 14:07:56 - write ktr.0 - distribution activated
2016/02/01 14:07:56 - write ktr.0 - Starting allocation of buffers & new threads...
2016/02/01 14:07:56 - write ktr.0 - Step info: nrinput=1 nroutput=0
2016/02/01 14:07:56 - write ktr.0 - Got previous step from [write ktr] #0 --> Generate Transformation
2016/02/01 14:07:56 - write ktr.0 - input rel is 1:1
2016/02/01 14:07:56 - write ktr.0 - Found input rowset [Generate Transformation.0 - write ktr.0]
2016/02/01 14:07:56 - write ktr.0 - Finished dispatching
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated a new step: [write ktr].0
2016/02/01 14:07:56 - generate-scd2 - This transformation can be replayed with replay date: 2016/02/01 14:07:56
2016/02/01 14:07:56 - generate-scd2 - Initialising 13 steps...
2016/02/01 14:07:56 - constants.0 - Released server socket on port 0
2016/02/01 14:07:56 - PK column(s).0 - Released server socket on port 0
2016/02/01 14:07:56 - Column DDL.0 - Released server socket on port 0
2016/02/01 14:07:56 - lookup PK column.0 - Released server socket on port 0
2016/02/01 14:07:56 - Is PK Column?.0 - Released server socket on port 0
2016/02/01 14:07:56 - Table columns.0 - Released server socket on port 0
2016/02/01 14:07:56 - Get parameters.0 - Released server socket on port 0
2016/02/01 14:07:56 - Generate Transformation.0 - Released server socket on port 0
2016/02/01 14:07:56 - Combine.0 - Released server socket on port 0
2016/02/01 14:07:56 - Non PK Columns.0 - Released server socket on port 0
2016/02/01 14:07:56 - PK Columns.0 - Released server socket on port 0
2016/02/01 14:07:56 - Columns DDL.0 - Released server socket on port 0
2016/02/01 14:07:56 - write ktr.0 - Released server socket on port 0
2016/02/01 14:07:56 - generate-scd2 - Step [Get parameters.0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [constants.0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [PK column(s).0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [Table columns.0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [lookup PK column.0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [Column DDL.0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [Columns DDL.0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [Is PK Column?.0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [PK Columns.0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [Non PK Columns.0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [Combine.0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [Generate Transformation.0] initialized flawlessly.
2016/02/01 14:07:56 - generate-scd2 - Step [write ktr.0] initialized flawlessly.
2016/02/01 14:07:56 - Get parameters.0 - Starting to run...
2016/02/01 14:07:56 - lookup PK column.0 - Starting to run...
2016/02/01 14:07:56 - Column DDL.0 - Starting to run...
2016/02/01 14:07:56 - Table columns.0 - Starting to run...
2016/02/01 14:07:56 - PK column(s).0 - Starting to run...
2016/02/01 14:07:56 - constants.0 - Starting to run...
2016/02/01 14:07:56 - lookup PK column.0 - Reading from stream [PK column(s)]
2016/02/01 14:07:56 - Is PK Column?.0 - Starting to run...
2016/02/01 14:07:56 - Columns DDL.0 - Starting to run...
2016/02/01 14:07:56 - Get parameters.0 - field [CATALOG_NAME] has value [el2export]
2016/02/01 14:07:56 - Get parameters.0 - field [TABLE_NAME] has value [meldung]
2016/02/01 14:07:56 - Get parameters.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2016/02/01 14:07:56 - PK Columns.0 - Starting to run...
2016/02/01 14:07:56 - Combine.0 - Starting to run...
2016/02/01 14:07:56 - Non PK Columns.0 - Starting to run...
2016/02/01 14:07:56 - Generate Transformation.0 - Starting to run...
2016/02/01 14:07:56 - write ktr.0 - Starting to run...
org.pentaho.di.core.exception.KettleStepException:
Unable to find field [TABLE_CAT] in the source rows
child index = 4, logging object : org.pentaho.di.core.logging.LoggingObject@6bcc7462 parent=79c48e3d-4211-4e0b-ab01-dd94f44243d1

            at org.pentaho.di.trans.steps.streamlookup.StreamLookup.readLookupValues(StreamLookup.java:175)
            at org.pentaho.di.trans.steps.streamlookup.StreamLookup.processRow(StreamLookup.java:382)
            at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
            at java.lang.Thread.run(Thread.java:745)

2016/02/01 14:07:56 - lookup PK column.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Unexpected error
2016/02/01 14:07:56 - lookup PK column.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : org.pentaho.di.core.exception.KettleStepException:
2016/02/01 14:07:56 - lookup PK column.0 - Unable to find field [TABLE_CAT] in the source rows
2016/02/01 14:07:56 - lookup PK column.0 -
2016/02/01 14:07:56 - lookup PK column.0 - at org.pentaho.di.trans.steps.streamlookup.StreamLookup.readLookupValues(StreamLookup.java:175)
2016/02/01 14:07:56 - lookup PK column.0 - at org.pentaho.di.trans.steps.streamlookup.StreamLookup.processRow(StreamLookup.java:382)
2016/02/01 14:07:56 - lookup PK column.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2016/02/01 14:07:56 - lookup PK column.0 - at java.lang.Thread.run(Thread.java:745)
2016/02/01 14:07:56 - lookup PK column.0 - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
2016/02/01 14:07:56 - generate-scd2 - Transformation detected one or more steps with errors.
2016/02/01 14:07:56 - generate-scd2 - Transformation is killing the other steps!
2016/02/01 14:07:56 - generate-scd2 - Transformation has allocated 13 threads and 15 rowsets.
2016/02/01 14:07:56 - constants.0 - Finished processing (I=0, O=0, R=1, W=2, U=0, E=0)
2016/02/01 14:07:56 - Table columns.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2016/02/01 14:07:56 - Generate Transformation.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=0)
2016/02/01 14:07:56 - Column DDL.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=0)
2016/02/01 14:07:56 - PK column(s).0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2016/02/01 14:07:56 - write ktr.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=0)
2016/02/01 14:07:56 - Is PK Column?.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=0)
org.pentaho.di.core.exception.KettleStepException:
Return value KEY_SEQ can't be found in the input row.

            at org.pentaho.di.trans.steps.streamlookup.StreamLookupMeta.getFields(StreamLookupMeta.java:184)
            at org.pentaho.di.trans.TransMeta.getThisStepFields(TransMeta.java:2042)
            at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1871)
            at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1834)
            at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1834)
            at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1940)
            at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1905)
            at org.pentaho.di.trans.steps.groupby.GroupBy.processRow(GroupBy.java:106)
            at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
            at java.lang.Thread.run(Thread.java:745)

org.pentaho.di.core.exception.KettleStepException:
Return value KEY_SEQ can't be found in the input row.

            at org.pentaho.di.trans.steps.streamlookup.StreamLookupMeta.getFields(StreamLookupMeta.java:184)
            at org.pentaho.di.trans.TransMeta.getThisStepFields(TransMeta.java:2042)
            at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1871)
            at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1834)
            at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1940)
            at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1905)
            at org.pentaho.di.trans.steps.groupby.GroupBy.processRow(GroupBy.java:106)
            at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
            at java.lang.Thread.run(Thread.java:745)

org.pentaho.di.core.exception.KettleStepException:
Return value KEY_SEQ can't be found in the input row.

            at org.pentaho.di.trans.steps.streamlookup.StreamLookupMeta.getFields(StreamLookupMeta.java:184)
            at org.pentaho.di.trans.TransMeta.getThisStepFields(TransMeta.java:2042)
            at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1871)
            at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1834)
            at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1834)
            at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1940)
            at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1905)
            at org.pentaho.di.trans.steps.groupby.GroupBy.processRow(GroupBy.java:106)
            at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
            at java.lang.Thread.run(Thread.java:745)

2016/02/01 14:07:56 - Non PK Columns.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Unexpected error
2016/02/01 14:07:56 - PK Columns.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Unexpected error
2016/02/01 14:07:56 - Columns DDL.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Unexpected error
2016/02/01 14:07:56 - PK Columns.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : org.pentaho.di.core.exception.KettleStepException:
2016/02/01 14:07:56 - PK Columns.0 - Return value KEY_SEQ can't be found in the input row.
2016/02/01 14:07:56 - PK Columns.0 -
2016/02/01 14:07:56 - PK Columns.0 - at org.pentaho.di.trans.steps.streamlookup.StreamLookupMeta.getFields(StreamLookupMeta.java:184)
2016/02/01 14:07:56 - PK Columns.0 - at org.pentaho.di.trans.TransMeta.getThisStepFields(TransMeta.java:2042)
2016/02/01 14:07:56 - PK Columns.0 - at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1871)
2016/02/01 14:07:56 - PK Columns.0 - at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1834)
2016/02/01 14:07:56 - PK Columns.0 - at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1834)
2016/02/01 14:07:56 - PK Columns.0 - at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1940)
2016/02/01 14:07:56 - PK Columns.0 - at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1905)
2016/02/01 14:07:56 - PK Columns.0 - at org.pentaho.di.trans.steps.groupby.GroupBy.processRow(GroupBy.java:106)
2016/02/01 14:07:56 - PK Columns.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2016/02/01 14:07:56 - PK Columns.0 - at java.lang.Thread.run(Thread.java:745)
child index = 8, logging object : org.pentaho.di.core.logging.LoggingObject@426c57e5 parent=79c48e3d-4211-4e0b-ab01-dd94f44243d1
2016/02/01 14:07:56 - Non PK Columns.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : org.pentaho.di.core.exception.KettleStepException:
2016/02/01 14:07:56 - Non PK Columns.0 - Return value KEY_SEQ can't be found in the input row.
2016/02/01 14:07:56 - Non PK Columns.0 -
2016/02/01 14:07:56 - Non PK Columns.0 - at org.pentaho.di.trans.steps.streamlookup.StreamLookupMeta.getFields(StreamLookupMeta.java:184)
2016/02/01 14:07:56 - Non PK Columns.0 - at org.pentaho.di.trans.TransMeta.getThisStepFields(TransMeta.java:2042)
2016/02/01 14:07:56 - Non PK Columns.0 - at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1871)
2016/02/01 14:07:56 - Non PK Columns.0 - at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1834)
2016/02/01 14:07:56 - Non PK Columns.0 - at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1834)
2016/02/01 14:07:56 - Non PK Columns.0 - at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1940)
2016/02/01 14:07:56 - Non PK Columns.0 - at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1905)
2016/02/01 14:07:56 - Non PK Columns.0 - at org.pentaho.di.trans.steps.groupby.GroupBy.processRow(GroupBy.java:106)
2016/02/01 14:07:56 - Non PK Columns.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2016/02/01 14:07:56 - Non PK Columns.0 - at java.lang.Thread.run(Thread.java:745)
child index = 9, logging object : org.pentaho.di.core.logging.LoggingObject@56fe9d74 parent=79c48e3d-4211-4e0b-ab01-dd94f44243d1
2016/02/01 14:07:56 - Columns DDL.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : org.pentaho.di.core.exception.KettleStepException:
2016/02/01 14:07:56 - Columns DDL.0 - Return value KEY_SEQ can't be found in the input row.
2016/02/01 14:07:56 - Columns DDL.0 -
2016/02/01 14:07:56 - Columns DDL.0 - at org.pentaho.di.trans.steps.streamlookup.StreamLookupMeta.getFields(StreamLookupMeta.java:184)
2016/02/01 14:07:56 - Columns DDL.0 - at org.pentaho.di.trans.TransMeta.getThisStepFields(TransMeta.java:2042)
2016/02/01 14:07:56 - Columns DDL.0 - at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1871)
2016/02/01 14:07:56 - Columns DDL.0 - at org.pentaho.di.trans.TransMeta.getStepFields(TransMeta.java:1834)
2016/02/01 14:07:56 - Columns DDL.0 - at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1940)
2016/02/01 14:07:56 - Columns DDL.0 - at org.pentaho.di.trans.TransMeta.getPrevStepFields(TransMeta.java:1905)
2016/02/01 14:07:56 - Columns DDL.0 - at org.pentaho.di.trans.steps.groupby.GroupBy.processRow(GroupBy.java:106)
2016/02/01 14:07:56 - Columns DDL.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2016/02/01 14:07:56 - Columns DDL.0 - at java.lang.Thread.run(Thread.java:745)
child index = 6, logging object : org.pentaho.di.core.logging.LoggingObject@6613574a parent=79c48e3d-4211-4e0b-ab01-dd94f44243d1
2016/02/01 14:07:56 - PK Columns.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2016/02/01 14:07:56 - Columns DDL.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2016/02/01 14:07:56 - Non PK Columns.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2016/02/01 14:07:56 - Combine.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=0)
2016/02/01 14:07:56 - generate-scd2 - Transformation detected one or more steps with errors.
2016/02/01 14:07:56 - generate-scd2 - Transformation is killing the other steps!
2016/02/01 14:07:57 - generate-scd2 - Transformation detected one or more steps with errors.
2016/02/01 14:07:57 - generate-scd2 - Transformation is killing the other steps!
2016/02/01 14:07:57 - generate-scd2 - Transformation detected one or more steps with errors.
2016/02/01 14:07:57 - generate-scd2 - Transformation is killing the other steps!
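
The failure ("Unable to find field [TABLE_CAT] in the source rows") indicates that the two jdbcMetaData steps did not add their output fields to the row metadata when the transformation was run embedded. One possible explanation for this Spoon-versus-embedded difference, offered only as an assumption and not as a confirmed cause, is that the plugin is not on the plugin search path of the embedded Kettle environment. A minimal sketch of pointing an embedded application at an extra plugin base folder before initialization, with a hypothetical path, would look like this:

    import org.pentaho.di.core.KettleEnvironment;

    // Hypothetical base folder that contains a "steps" subfolder holding the
    // pdi-plugin-jdbc-metadata plugin.
    System.setProperty("KETTLE_PLUGIN_BASE_FOLDERS", "/opt/pdi/plugins");

    // Initialize Kettle only after the property is set, so the plugin registry
    // scans the extra folder and the jdbcMetaData step is registered.
    KettleEnvironment.init();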

Add "Kettle Datatype" to the output columns

I'd like to know the Kettle Datatype in addition to the database datatype. Could that be added?

It would save us from re-implementing mappings that already exist, e.g. VARCHAR -> String.

Is it even possible? Does the request make sense? (Essentially I am using the data for metadata injection, so I need PDI datatypes for that to work.)
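
For context, such a mapping would translate the java.sql.Types codes reported by DatabaseMetaData into PDI value types. The sketch below is not part of the plugin; it only illustrates, with a deliberately incomplete switch, what the requested mapping could look like using the standard ValueMetaInterface type constants.

    import java.sql.Types;
    import org.pentaho.di.core.row.ValueMetaInterface;

    // Illustrative, deliberately incomplete mapping from JDBC type codes to
    // Kettle value types; a real implementation would cover all of java.sql.Types.
    class JdbcToKettleTypes {
      static int toKettleType(int jdbcType) {
        switch (jdbcType) {
          case Types.CHAR:
          case Types.VARCHAR:
          case Types.LONGVARCHAR:
            return ValueMetaInterface.TYPE_STRING;
          case Types.TINYINT:
          case Types.SMALLINT:
          case Types.INTEGER:
          case Types.BIGINT:
            return ValueMetaInterface.TYPE_INTEGER;
          case Types.REAL:
          case Types.FLOAT:
          case Types.DOUBLE:
            return ValueMetaInterface.TYPE_NUMBER;
          case Types.NUMERIC:
          case Types.DECIMAL:
            return ValueMetaInterface.TYPE_BIGNUMBER;
          case Types.DATE:
          case Types.TIMESTAMP:
            return ValueMetaInterface.TYPE_DATE;
          case Types.BIT:
          case Types.BOOLEAN:
            return ValueMetaInterface.TYPE_BOOLEAN;
          default:
            return ValueMetaInterface.TYPE_STRING;
        }
      }
    }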

Error saving file: java.lang.NullPointerException

Environment:
Fedora 20
PDI v5.0.1-stable
java version "1.7.0_45"
"Get JDBC Metadata" step plugin downloaded from marketplace

Error:
When adding the Get JDBC Metadata step and then saving the transformation, I get the following error message:
Error saving file: java.lang.NullPointerException

Removing the step and then saving the file works.

This is what it shows in the Error details window:
java.lang.NullPointerException
at org.pentaho.di.steps.jdbcmetadata.JdbcMetaDataMeta.getXML(JdbcMetaDataMeta.java:787)
at org.pentaho.di.trans.step.StepMeta.getXML(StepMeta.java:210)
at org.pentaho.di.trans.step.StepMeta.getXML(StepMeta.java:189)
at org.pentaho.di.trans.TransMeta.getXML(TransMeta.java:2650)
at org.pentaho.di.trans.TransMeta.getXML(TransMeta.java:2476)
at org.pentaho.di.ui.spoon.Spoon.saveMeta(Spoon.java:5419)
at org.pentaho.di.ui.spoon.TransFileListener.save(TransFileListener.java:99)
at org.pentaho.di.ui.spoon.Spoon.save(Spoon.java:5404)
at org.pentaho.di.ui.spoon.Spoon.saveToFile(Spoon.java:4652)
at org.pentaho.di.ui.spoon.Spoon.saveFile(Spoon.java:4610)
at sun.reflect.GeneratedMethodAccessor44.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:329)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:139)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:123)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem.access$100(JfaceMenuitem.java:26)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem$1.run(JfaceMenuitem.java:88)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:545)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:490)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:402)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1227)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7368)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:8673)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:625)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:134)
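
The trace ends in JdbcMetaDataMeta.getXML, which suggests that a field of the not-yet-configured step is still null when the step meta is serialized to XML. The following is not the plugin's actual code; it only sketches the usual defensive pattern for this class of NullPointerException, using XMLHandler.addTagValue and hypothetical field names.

    import org.pentaho.di.core.xml.XMLHandler;

    // Illustrative only: a null-tolerant getXML() for a step that was saved
    // before being configured. Field names ("connection", "methodName") are hypothetical.
    class GetXmlSketch {
      private String connection;   // may still be null if the dialog was never filled in
      private String methodName;

      public String getXML() {
        StringBuilder xml = new StringBuilder();
        // Normalize nulls to "" instead of dereferencing the fields blindly.
        xml.append(XMLHandler.addTagValue("connection", connection == null ? "" : connection));
        xml.append(XMLHandler.addTagValue("methodName", methodName == null ? "" : methodName));
        return xml.toString();
      }
    }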

Argument Cannot be Null Exception when trying to edit step after it was saved unconfigured

You get an "Argument cannot be null" exception when you add the step to a transformation, do not fully configure it (in particular the output fields), save the transformation, close and reopen it, and then try to edit the step.

java.lang.IllegalArgumentException: Argument cannot be null
at org.eclipse.swt.SWT.error(Unknown Source)
at org.eclipse.swt.SWT.error(Unknown Source)
at org.eclipse.swt.SWT.error(Unknown Source)
at org.eclipse.swt.widgets.Widget.error(Unknown Source)
at org.eclipse.swt.widgets.TableItem.setText(Unknown Source)
at org.pentaho.di.steps.jdbcmetadata.JdbcMetaDataDialog.updateOutputFields(JdbcMetaDataDialog.java:332)
at org.pentaho.di.steps.jdbcmetadata.JdbcMetaDataDialog.populateDialog(JdbcMetaDataDialog.java:972)
at org.pentaho.di.steps.jdbcmetadata.JdbcMetaDataDialog.open(JdbcMetaDataDialog.java:905)
at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep(SpoonStepsDelegate.java:124)
at org.pentaho.di.ui.spoon.Spoon.editStep(Spoon.java:8720)
at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:3027)
at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDoubleClick(TransGraph.java:744)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1310)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7931)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9202)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:648)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
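
The exception is thrown by SWT's TableItem.setText, which rejects null strings, so JdbcMetaDataDialog.updateOutputFields is presumably handing it a null output-field name read from the half-configured step. The sketch below is not the dialog's actual code; it only shows the usual guard, with hypothetical variable names, when populating an SWT table from step metadata.

    import org.eclipse.swt.widgets.TableItem;

    // Illustrative only: TableItem.setText(int, String) throws
    // IllegalArgumentException("Argument cannot be null") for null strings,
    // so substitute "" when the step meta has not been fully configured yet.
    // "item", "outputField" and "outputRename" are hypothetical names.
    class DialogGuardSketch {
      static void populateRow(TableItem item, String outputField, String outputRename) {
        item.setText(1, outputField == null ? "" : outputField);
        item.setText(2, outputRename == null ? "" : outputRename);
      }
    }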
