AdemGuler
Participant
I have written this article to give general information about creating a Table Function and using it in a Data Flow in SAP Datasphere.


We can write SQLScript (Table Function) code in a powerful SQL editor in SAP Datasphere.

Please refer to this link for SQLScript: SQLScript Reference | SAP Help Portal

When we start coding a Table Function in the SQL editor, the initial structure appears on the screen as follows.
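In essence, the generated structure is a RETURN statement wrapping a SELECT. A minimal sketch of a valid Table Function body (the table and column names here are hypothetical) looks like this:

```sql
RETURN
    SELECT
        "COUNTRY"
    FROM "COUNTRY_TEXTS";
```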



Declaration of the Field(s)


The columns we define in the "Model Properties" side panel must match the columns we return in our SQL code. This means that, in order to validate our code, we must first define each column in the "Model Properties" side panel and then use it in the SQL editor.


If we use a field in the SQL editor that we have not defined in the "Model Properties", we get the following error.


The solution is to add the "COUNTRYFR" field in the Model Properties -> Columns side panel as follows.



This fixes the error.


In this way, we add all the fields we need in the output, both in the SQL editor and in the "Model Properties" side panel.

Another point to remember is that the fields must be defined in the same order in both the "Model Properties" and the SQL editor. Otherwise, we will get an error similar to the following.
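To illustrate the two rules above: if the "Model Properties" side panel defines the columns COUNTRY and COUNTRYFR in that order, the SELECT list must return exactly those columns in the same order (a sketch with hypothetical names):

```sql
RETURN
    SELECT
        "COUNTRY",   -- 1st column in Model Properties
        "COUNTRYFR"  -- 2nd column in Model Properties
    FROM "COUNTRY_TEXTS";
```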



Declaration of the Join Operator(s)



These are the Join types supported by the system.


Be aware that the system is case-sensitive.

The declaration of the Join operators has to be done inside the parentheses of the FROM (); statement, as shown here:
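For example, an INNER JOIN between two hypothetical sources could be declared like this:

```sql
RETURN
    SELECT
        mat."MATERIAL",
        txt."MATERIAL_TEXT"
    FROM (
        "MATERIALS" AS mat
        INNER JOIN "MATERIAL_TEXTS" AS txt
            ON mat."MATERIAL" = txt."MATERIAL"
    );
```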



Declaration of the Conversion(s)


Here is what we need to know in order to use the CONVERT_UNIT or CONVERT_CURRENCY functions in our Table Function code:

  • To use the CONVERT_UNIT function, the unit conversion tables T006 and T006D must be available in the SAP HANA database (in other words, they must be available in our Datasphere space).

  • To use the CONVERT_CURRENCY function, the currency conversion tables TCURV, TCURX, TCURN, TCURR, TCURF, and TCURC must be available in the SAP HANA database (in other words, they must be available in our Datasphere space).


Then we need to declare the tables in the SQLScript code as follows:

Declare the tables used in the UNIT CONVERSION function.
CONVERT_UNIT(
    "QUANTITY"        => "RELATED_FIELD",
    "SOURCE_UNIT"     => "RELATED_FIELD",
    "SCHEMA"          => 'YOUR_SCHEMA_ID',
    "DIMENSION_TABLE" => 'T006D', -- the table used for the conversion must be declared here
    "RATES_TABLE"     => 'T006',  -- the table used for the conversion must be declared here
    "TARGET_UNIT"     => 'KM',
    "ERROR_HANDLING"  => 'fail on error',
    "CLIENT"          => '100'
)

Declare the tables used in the CURRENCY CONVERSION function.
CONVERT_CURRENCY(
    "AMOUNT"              => "RELATED_FIELD",
    "SOURCE_UNIT"         => "RELATED_FIELD",
    "TARGET_UNIT"         => 'EUR',
    "CONVERSION_TYPE"     => 'M',
    "REFERENCE_DATE"      => "RELATED_FIELD",
    "CLIENT"              => '100',
    "SCHEMA"              => 'YOUR_SCHEMA_ID',
    "ERROR_HANDLING"      => 'set_to_null',
    "PRECISIONS_TABLE"    => 'TCURX', -- the table used for the conversion must be declared here
    "CONFIGURATION_TABLE" => 'TCURV', -- the table used for the conversion must be declared here
    "PREFACTORS_TABLE"    => 'TCURF', -- the table used for the conversion must be declared here
    "RATES_TABLE"         => 'TCURR'  -- the table used for the conversion must be declared here
)
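Putting the pieces together, a complete Table Function that returns a converted amount might look like the following sketch (the source table, field names, and conversion settings are hypothetical, not the code from the original article):

```sql
RETURN
    SELECT
        "COMPANY_CODE",
        "CURRENCY",
        "AMOUNT",
        CONVERT_CURRENCY(
            "AMOUNT"              => "AMOUNT",
            "SOURCE_UNIT"         => "CURRENCY",
            "TARGET_UNIT"         => 'EUR',
            "CONVERSION_TYPE"     => 'M',
            "REFERENCE_DATE"      => "POSTING_DATE",
            "CLIENT"              => '100',
            "SCHEMA"              => 'YOUR_SCHEMA_ID',
            "ERROR_HANDLING"      => 'set_to_null',
            "PRECISIONS_TABLE"    => 'TCURX',
            "CONFIGURATION_TABLE" => 'TCURV',
            "PREFACTORS_TABLE"    => 'TCURF',
            "RATES_TABLE"         => 'TCURR'
        ) AS "AMOUNT_EUR"
    FROM "FINANCIAL_DOCUMENTS";
```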

At the end of my work, the complete code looked like this.


This was my first personal experience of writing a Table Function in Datasphere.

Using Table Function in Data Flow


We can now use the Table Function we created in a Data Flow.

Be aware that the Transformation (TRFN) operations of the BW system are replaced by Data Flows in Datasphere.

Data Flow currently supports the following operators to be used within our transformation process:

  • Join Operator

  • Union Operator

  • Projection Operator

  • Aggregation Operator

  • Script Operator
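As an illustration of the last operator: the Script Operator in a Data Flow runs a Python function that receives the incoming records as a pandas DataFrame and returns the transformed DataFrame. Below is a minimal sketch; the column names ("AMOUNT", "AMOUNT_K") and the filtering logic are hypothetical, and the local check at the bottom only simulates what Datasphere would pass in:

```python
import pandas as pd

def transform(data):
    # Keep only rows with a positive amount and add a derived column.
    # "AMOUNT" and "AMOUNT_K" are hypothetical column names.
    out = data[data["AMOUNT"] > 0].copy()
    out["AMOUNT_K"] = out["AMOUNT"] / 1000.0
    return out

# Local check outside Datasphere: simulate the operator's input DataFrame.
sample = pd.DataFrame({"AMOUNT": [2500.0, -100.0, 4000.0]})
result = transform(sample)
print(result["AMOUNT_K"].tolist())  # [2.5, 4.0]
```

Inside Datasphere, only the `transform` function body is needed; the surrounding sample data is just for trying the logic out locally.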



 

Best Regards.