
Introduction


As stated in our latest Business Intelligence Statement of Direction, SAP BusinessObjects multi-source universes will not be maintained after release SAP BusinessObjects BI 4.3, which is planned to be supported until the end of 2027.

This means you will need to replace your multi-source universes with a supported alternative in our next release, SAP BusinessObjects BI 2025. I listed your different options in this previous post.

But nothing is more convincing than a real use case, so I'm very happy to publish here the testimony of one of our customers, Mike Radics-Saunders from ICBC Standard Bank, who successfully turned a multi-source universe into a single-source one.

 

Mike, can you introduce ICBC Standard Bank and tell us about your role in the organization?

ICBC Standard Bank was formed in 2015 when the Industrial and Commercial Bank of China Limited (ICBC) acquired a controlling stake in Standard Bank’s London-based Global Markets business. The Bank’s purpose is ‘to serve our clients globally as the Commodities and Financial Markets hub of ICBC’. The Bank pursues strong, lasting relationships with its clients, leveraging the strength of its shareholder banking groups.

My main role is to act as the SAP BI Platform administrator. My background is in server-side consultancy, so I cover all patching and installation activities myself, along with daily BAU activities. There is also time for me to be involved in report and universe design. I enjoy the mix of the two areas very much.

 

What is your BusinessObjects configuration (how many users, reports, universes, and so on), and what is it used for?

We have Development, UAT and Production environments running the full SAP BI stack and are using SAP BI 4.3 SP02 Patch 3. We are moving to 4.3 SP03 Patch 4 in September 2023. SAP BI is used by several business units across the Bank, and we have around 130 active users. We are decommissioning Crystal Reports 2020, as we are finding that Webi can now satisfy our reporting requirements. The several hundred Webi reports we have use ten UNX-format universes linking to a mix of on-premise and cloud data sources.

 

What was the reason for initially using multi-source universes?

Most universes we have were created before I joined the Bank. SAP BI 4.1 was the first version of the product installed here, and the UNX format was used from the outset. For one particular universe it looked like multiple sources would prove really useful over time, so it was designed as multi-source to keep all options open.

 

When did you realize your multi-source universe could not be used in the future, and how did you react?

I'm pretty sure I heard the news in a webinar hosted by Wiiisdom covering one of the 4.3 Service Pack releases. It was a shock initially, but once I realized that none of the other components listed as being removed from SAP BI 2025 were ones we used, it felt like a small price to pay for the fact that on-premise SAP BI was going to live on beyond SAP BI 4.3!

My mind jumped to our one multi-source universe, though: with over 300 tables and 600 joins, the idea of rebuilding the Data Foundation manually didn't appeal to me.

I sent a message to sapaskanalytics@sap.com to ask initially if there were any plans to include a converter in the product. That got the conversation started, which led to this article.

 

What were the different solutions you tried to create a single-source universe out of your multi-source universe?

From the first discussion with SAP it became clear that copying and pasting tables and joins between the existing multi-source Data Foundation and a new single-source one in the Information Design Tool might be possible. This did work well for tables, but I found that the names of derived tables were lost, replaced with just Derived1, Derived2 and so on. It was possible to match SQL definitions by eye and rename the derived tables in the single-source Data Foundation, but it was not a very smooth process. Joins appeared to copy and paste at first glance, but I could never get them to appear on the Data Foundation canvas. At this point the SDK seemed the next obvious place to go.

 

Can you describe the script you have implemented to create the SSU?

I have the SAP BI 4.3 Client Tools installed on my local workstation, and the code runs in Eclipse on the same machine. I've used the IDT to save my existing multi-source data foundation and a target, empty, single-source data foundation in a local folder. You could of course orchestrate all of this via code, but it didn't really make sense for me with just a single universe to focus on.

 

Step 1 – Preamble

  • Declare an IEnterpriseSession, an SL Context and a LocalResourceService, and load both data foundations.

  • Get a list of the tables and a list of the joins in the multi-source data foundation via the getTables() and getJoins() methods on the MultiSourceDataFoundation interface respectively.
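
For orientation, here is a minimal sketch of what this preamble could look like in Java. It is not Mike's actual code: the logon details and file paths are placeholders, and the package names and the save() overwrite flag should be checked against the Semantic Layer SDK Javadoc for your release.

```java
import java.util.List;

import com.crystaldecisions.sdk.framework.CrystalEnterprise;
import com.crystaldecisions.sdk.framework.IEnterpriseSession;
import com.sap.sl.sdk.authoring.datafoundation.Join;
import com.sap.sl.sdk.authoring.datafoundation.MonoSourceDataFoundation;
import com.sap.sl.sdk.authoring.datafoundation.MultiSourceDataFoundation;
import com.sap.sl.sdk.authoring.datafoundation.Table;
import com.sap.sl.sdk.authoring.local.LocalResourceService;
import com.sap.sl.sdk.framework.SlContext;

public class MsuToSsu {
    public static void main(String[] args) throws Exception {
        // Log on to the CMS; user, password, CMS host and authentication
        // type below are placeholders.
        IEnterpriseSession session = CrystalEnterprise.getSessionMgr()
                .logon("Administrator", "password", "mycms:6400", "secEnterprise");

        // Create the semantic layer context and its local resource service.
        SlContext context = SlContext.create();
        LocalResourceService service =
                context.getService(LocalResourceService.class);

        // Load the two data foundations previously saved from the IDT
        // (.dfx files; the paths are placeholders).
        MultiSourceDataFoundation msuDf = (MultiSourceDataFoundation)
                service.load("C:/Universes/MyUniverse_msu.dfx");
        MonoSourceDataFoundation ssuDf = (MonoSourceDataFoundation)
                service.load("C:/Universes/MyUniverse_ssu.dfx");

        // The lists that drive Steps 2 and 3.
        List<Table> msuTables = msuDf.getTables();
        List<Join> msuJoins = msuDf.getJoins();

        // ... Steps 2 to 4 go here ...

        // Persist the populated single-source data foundation and clean up.
        service.save(ssuDf, "C:/Universes/MyUniverse_ssu.dfx", true);
        context.close();
        session.logoff();
    }
}
```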


 

Step 2 – Tables

  • Loop through all tables that were found in the multi-source data foundation in Step 1.

  • Determine what type of table you are dealing with in each case via the instanceof operator.

  • Create a table of the same kind in the single-source data foundation via the appropriate DataFoundationFactory method, taking information from the table in the multi-source data foundation as required.
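
A hedged sketch of this loop, continuing from the preamble above. The interview confirms the instanceof dispatch and the use of DataFoundationFactory; the specific factory and accessor names used below (createTable, createDerivedTable, getSqlDefinition) are my assumptions and should be verified in the Javadoc.

```java
// Step 2 sketch (continues the preamble). DataFoundationFactory comes from
// com.sap.sl.sdk.authoring.datafoundation, like the table types.
DataFoundationFactory factory = context.getService(DataFoundationFactory.class);

for (Table table : msuTables) {
    if (table instanceof DerivedTable) {
        // Re-creating derived tables by name preserves the names that
        // copying and pasting in the IDT lost (Derived1, Derived2, ...).
        DerivedTable derived = (DerivedTable) table;
        factory.createDerivedTable(derived.getName(),
                derived.getSqlDefinition(), ssuDf);
    } else {
        // Regular database tables; alias tables would need their own branch.
        factory.createTable(table.getName(), ssuDf);
    }
}
```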


 

Step 3 – Joins

  • Use the MonoSourceDataFoundation getJoins() method to get the (initially empty) join list of the single-source data foundation, which will receive the joins as they are created.

  • Loop through all joins that were found in the multi-source data foundation in Step 1.

  • Use the DataFoundationFactory createJoin() method to create a new join against the single-source data foundation.

  • Use the replace() method on the expression read from the multi-source join to remove any @catalog(…) syntax and arrive at an expression that is valid in the single-source foundation.

  • Set the cardinality and type of the single-source join, using the values read from the multi-source join.

  • Add the join to the join list for the single-source data foundation.
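
Sketched in the same style below. Mike mentions replace(); a regex replaceAll() is shown here to catch every occurrence in one call. The exact @catalog pattern, the createJoin() signature, and the cardinality/type setter names are assumptions to check against your data foundation and the Javadoc.

```java
// Step 3 sketch: re-create each join with single-source syntax.
List<Join> ssuJoins = ssuDf.getJoins();  // empty at this point

for (Join msuJoin : msuJoins) {
    // Strip the multi-source @catalog('...') qualifiers from the join
    // expression; the exact pattern may differ in your data foundation.
    String expression = msuJoin.getExpression()
            .replaceAll("@catalog\\('[^']*'\\)\\.", "");

    // createJoin() is named in the text; its signature is assumed here.
    Join ssuJoin = factory.createJoin(expression, ssuDf);
    ssuJoin.setCardinality(msuJoin.getCardinality());
    ssuJoin.setType(msuJoin.getType());

    // Register the new join with the single-source data foundation.
    ssuJoins.add(ssuJoin);
}
```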


 

Step 4 – Data Foundation Views

  • Step 2 created all the required tables in the Master Data Foundation View of the new single-source data foundation but if you were to open it in the IDT at this point you would see that all tables were positioned on top of each other in the very top left of the screen. No custom Data Foundation Views exist in the single-source data foundation either at this point, so…

  • Loop through all the custom Data Foundation Views found in the multi-source data foundation (found via the MultiSourceDataFoundation getDataFoundationViews() method).

  • Create the Data Foundation View in the single-source foundation.

  • Loop through all Table Views in the current Data Foundation View, capturing table name, X and Y position, table width and table state in arrays.

  • Loop through all tables in the single-source foundation.

  • If the table name is found in your table name array, create a new Table View in the Data Foundation View in the single-source foundation and set its X and Y position, table width and table state from the elements of your other arrays with the same index.

  • Finally, process the Master Data Foundation View (MultiSourceDataFoundation getMasterView() method). A TableView will already be present for every table here (as all tables appear in the Master View by default), so it's just a case of looping through them, looking up and applying the settings held in your X and Y position, table width and table state arrays.
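
A sketch of the view-copying logic, under the same caveats as above. getDataFoundationViews() and getMasterView() are named in the text; createDataFoundationView(), createTableView() and the layout accessors are assumed names, and a map keyed on table name stands in for the parallel arrays Mike describes.

```java
// Step 4 sketch: reproduce each custom Data Foundation View and its layout.
// (Map and HashMap come from java.util.)
for (DataFoundationView msuView : msuDf.getDataFoundationViews()) {
    DataFoundationView ssuView =
            factory.createDataFoundationView(msuView.getName(), ssuDf);

    // Capture the layout of every table in this view, keyed by table name.
    Map<String, TableView> layout = new HashMap<>();
    for (TableView tv : msuView.getTableViews()) {
        layout.put(tv.getTable().getName(), tv);
    }

    // Add a TableView for each single-source table that appeared in the
    // original view, restoring its position, width and state.
    for (Table table : ssuDf.getTables()) {
        TableView msuTv = layout.get(table.getName());
        if (msuTv != null) {
            TableView ssuTv = factory.createTableView(table, ssuView);
            ssuTv.setX(msuTv.getX());
            ssuTv.setY(msuTv.getY());
            ssuTv.setWidth(msuTv.getWidth());
            ssuTv.setState(msuTv.getState());
        }
    }
}

// The Master view (msuDf.getMasterView()) already holds a TableView per
// table in the single-source foundation, so only the layout values need
// to be looked up and applied, not the TableViews themselves.
```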


 

With a single-source data foundation available the ‘Change Data Foundation’ option in the IDT can be used to simply re-point your existing Business Layer at the single-source foundation.

This action alone removed almost all occurrences of the @catalog() function in the SQL definitions in the Business Layer. Running an integrity check flagged up any objects where multi-source syntax was still present, allowing them to be addressed quickly and easily.

 

How did you validate the new universe?

I made use of the Save As option in the IDT to create descriptions of my data foundations in text format at various points. I could then use a text comparison tool to check that my single-source data foundation matched the multi-source one I started with, with regard to table, view and join counts and so on.

We don’t currently have an automated testing tool so here I leaned on the SDK to help schedule every report we had built against the universe to a file location in text format, using text comparison to check a result set created using the multi-source universe against one created with the new single-source version.
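
Mike's scheduling code isn't shown here, but the comparison half of this approach needs nothing beyond the JDK. A minimal sketch, assuming each report was scheduled twice to text files with matching (placeholder) names:

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

// Diff two scheduled text exports of the same report, line by line:
// one produced against the multi-source universe, one against the
// single-source version.
public class CompareExports {
    public static void main(String[] args) throws Exception {
        List<String> msu = Files.readAllLines(Paths.get("exports/report_msu.txt"));
        List<String> ssu = Files.readAllLines(Paths.get("exports/report_ssu.txt"));

        int max = Math.max(msu.size(), ssu.size());
        int differences = 0;
        for (int i = 0; i < max; i++) {
            String left = i < msu.size() ? msu.get(i) : "<missing line>";
            String right = i < ssu.size() ? ssu.get(i) : "<missing line>";
            if (!left.equals(right)) {
                differences++;
                System.out.printf("Line %d:%n  MSU: %s%n  SSU: %s%n",
                        i + 1, left, right);
            }
        }
        System.out.println(differences == 0
                ? "Result sets match."
                : differences + " differing line(s) found.");
    }
}
```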

Several measures tracking task durations were flagged up as differences via this method. Investigation showed that the timestampDiff() function had been used in their definitions in the multi-source universe and that it was rounding inconsistently: for example, a duration of 21 seconds would be rounded to zero minutes in some cases and 1 minute in others! The native Oracle SQL in the new single-source universe objects always rounds as you would expect, so the conversion to single source has actually improved the accuracy of these measures.

 

Have you delivered the SSU in a production environment?

Yes. We have been running the single source universe in Production for almost two months without issue.

 

What would you recommend to other customers facing the same situation?

That would depend on the number and complexity of universes to convert, in-house skills and budget. Third-party conversion tools are already available, I believe. If you have a large number of complex universes to convert and sufficient budget, then this would probably be the most attractive route.

This exercise has shown that there are alternatives. If you have in-house Java and SAP BI SDK skills, then automating the conversion process is possible, and the road is a fairly smooth one compared to some tasks I've tackled with the Java SDKs. Running code in Eclipse has served my needs. It could be extended with a GUI and turned into a fully fledged app if required. Maybe mixing in more manual options, like copying and pasting tables between data foundations, will work for you rather than doing everything in code? Are some of your data foundations actually simple enough to just rewrite manually using the single-source option?

Our core driver was trying to future-proof the platform, as SAP BI 2025 will be here before we know it!

 

In conclusion, I want to thank Mike for sharing his deep expertise, which I'm sure will be very helpful to many of our customers.

If you have any questions or feedback, please write to sapaskanalytics@sap.com.