Dynamic source filtering is a big issue in SAP Datasphere. There are several approaches that may work for one scenario but not for another.
The discussion started again on LinkedIn after I read a post about dynamic filtering that was actually set up with a fixed filter. Wanda then showed us a solution he uses to filter data. But Christopher pointed out that the filter is not pushed down to the source when you use a DP agent, e.g., with the ABAP connection.
I remembered that there was a blog post about how to push a filter down, even on an ABAP connection. So I tried Wanda's idea and combined it with that knowledge. This blog post will show you how it works.
If you already know SAP Note 2567999, you know that you can filter data with a stored procedure. And as of June, you can run stored procedures directly in task chains, which is a really nice feature. But back to the other solution.
First, we need a local table to store the load parameter for this approach. The table just makes it easy to set the parameter without hard-coding it.
It is quite simple: three columns, one for the application, one for the parameter, and one for the value. Your table may look different, but keep in mind to adjust the coding accordingly.
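To make this concrete, here is a minimal sketch of such a parameter table; all table, column, and value names are just illustrative and not taken from the original setup:

```sql
-- Illustrative parameter table: one row per application/parameter combination.
CREATE TABLE "LOAD_PARAMETER" (
    "APPLICATION" NVARCHAR(30),   -- which data load the entry belongs to
    "PARAMETER"   NVARCHAR(30),   -- name of the filter field, e.g. CALYEAR
    "VALUE"       NVARCHAR(50)    -- filter value to be used during the load
);

-- Sample entry: load only calendar year 2024 for a hypothetical sales application.
INSERT INTO "LOAD_PARAMETER" VALUES ('SALES', 'CALYEAR', '2024');
```

A stored procedure in the sense of SAP Note 2567999 can then read the maintained value and use it as a WHERE condition when loading from the remote source, which is the part that can be pushed down. Again, SALES_REMOTE and SALES_LOCAL are placeholders, not the objects from this post:

```sql
-- Hypothetical load procedure: reads the filter value and loads only matching rows.
CREATE PROCEDURE "LOAD_SALES_FILTERED" LANGUAGE SQLSCRIPT AS
BEGIN
    DECLARE lv_year NVARCHAR(50);

    -- Read the load parameter maintained in the table above.
    SELECT "VALUE" INTO lv_year
      FROM "LOAD_PARAMETER"
     WHERE "APPLICATION" = 'SALES' AND "PARAMETER" = 'CALYEAR';

    -- Reload the local table; the WHERE condition on the remote table
    -- is what gets pushed down to the source.
    DELETE FROM "SALES_LOCAL";
    INSERT INTO "SALES_LOCAL"
        SELECT * FROM "SALES_REMOTE" WHERE "CALYEAR" = :lv_year;
END;
```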
It's just before the summer break, and I want to share an idea on how to get a dynamic prior year in an analytical model.
There are several questions in the SAP community about how to achieve the same result as in SAP BW, like this one: https://community.sap.com/t5/finance-q-a/offsetting-input-parameters-in-datasphere-s-analytic-models-restricted/qaq-p/13723947
In this post, I want to share an idea on how to get such an offset in SAP Datasphere. Let's go to our fact view and add some logic that we will use later in our analytic model.
First, we will add an input parameter to the bicycle data model; later, I will describe another way if you don't want to use an input parameter.
The input parameter in this case is called IP_YEAR, has no input help, and has a string data type with a length of 4.
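To illustrate the kind of logic added to the fact view, here is a minimal sketch using the input parameter; the view name BICYCLE_SALES and the columns CALYEAR and AMOUNT are placeholders for your own fact data, not the names from this post:

```sql
-- Hypothetical fact view logic: split the measure into current year and
-- prior year relative to the IP_YEAR input parameter (a 4-character string).
SELECT
    src.*,
    CASE WHEN src."CALYEAR" = :IP_YEAR
         THEN src."AMOUNT" ELSE 0 END AS "AMOUNT_CURRENT_YEAR",
    CASE WHEN src."CALYEAR" = TO_VARCHAR(TO_INTEGER(:IP_YEAR) - 1)
         THEN src."AMOUNT" ELSE 0 END AS "AMOUNT_PRIOR_YEAR"
FROM "BICYCLE_SALES" AS src;
```

The two columns could then be exposed as measures in the analytic model, so the prior year follows the selected IP_YEAR automatically.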
In this post I want to describe how you can use hierarchies in SAP Datasphere just like in good old SAP BW.
To be able to use a hierarchy in SAP Datasphere like in SAP Business Warehouse, you need an object with the semantic type "Hierarchy with Directory".
There are several blog posts on this topic that explain it in detail:
The entity relationship model for hierarchies with directories looks like this:
I know it's been a long time since I wrote the last post, and I also did some housekeeping on the site. But the last quarter was, as always, very busy with different topics. I now have four pilot projects with SAP Datasphere, which I have to manage while developing the cool stuff. 😉 For example, how to use the command line interface (CLI) for SAP Datasphere to create views or tables based on a remote table. I had a really cool meeting with Ronald & Tim about this topic - thank you guys for the input, now I have more ideas and less time.
I also developed some cool quarter slices in my projects to fulfil customer needs. But besides the Datasphere projects I had this year, I also did a lot of other stuff. Let's start in March with the DSAG Technologie Tage in Mannheim. It was quite fun. You can find all the slides and posts on LinkedIn.
It has been quite silent here. That's a fact. I have had a lot on my plate since starting a new job, and also before that with the promotion. But now I have a super cool topic on my mind that I want to share with you.
Hierarchies are a common topic in companies. They offer business users the flexibility to navigate in frontend reports. In this example, I flatten the hierarchy structure to consume a hierarchy with text nodes and InfoObjects. The Product Group hierarchy looks like the following screenshot.
The Data Warehouse Cloud Analytic Model is now available on all tenants. With this wave, you now have the option to use the new Analytic Model instead of an Analytical Dataset (ADS). The Analytic Model is a kind of cube which allows you to slice and dice your data model. I wrote about the preview access in December, and now I want to show you some details.
Wow! It's been a long time since I wrote the last blog post. I think many of my posts start with this now. But anyway, let's get back to the subject. I had the opportunity to have a look at the new Data Warehouse Cloud (DWC) Analytic Model. It was presented in the keynote by Hagen Jander and Eric Schemer at the DSAG Jahreskongress.
It has been a while since I wrote the last blog post. But a lot has happened in the last two months. We had our Deep Dives about Self Service with SAP Data Analytics Cloud Architecture, and I also had some weeks of vacation. Now I am back, and I want to share some ideas I had over the last months.
This post is about Microsoft PowerPoint and how I use it to create a master PowerPoint file for different purposes. The idea was to have one place for all my SAP Data Warehouse Cloud slides and use them in different customer scenarios.
Therefore, I searched a little for what I could do. If you have Microsoft Office 365, PowerPoint offers the option of Custom Slide Shows under the Slide Show tab.
Unfortunately, I didn't manage to publish a post last month. There were several reasons for that, like the internal BI days or preparing the next deep dive for Data Warehouse Cloud. But back to the topic: SAP published Analysis for Office 2.8 SP14. Maybe there is something about SP14, because Analysis for Office 1.4 also had an SP14 before Analysis for Office 2.0 was released. So perhaps we will see some new features in the future?
But back to Analysis for Office 2.8 SP14. Now you are able to connect to Data Warehouse Cloud and consume the analytical datasets directly in Excel. This is the biggest feature update in a long time. The last updates were mostly bug fixes and some technical setting parameters, but nothing fascinating.
Besides the function introduced in Analysis for Office 2.8 SP12 to repeat the titles of a crosstab, the latest version also offers a new API method called SaveBwComments and some new technical settings like
But for me the best part is now the Data Warehouse Cloud connection. For this, you have to create a connection in the Insert Data Source dialog.
In this blog post, I want to share my experience with the Data Warehouse Cloud Bridge. There are several blogs out there that describe how to set up Eclipse (https://www.seaparkconsultancy.com/single-post/create-a-dwc-sap-bw-bridge-project and https://blogs.sap.com/2022/01/24/using-sap-bw-bridge-for-data-warehouse-cloud-part-1-creating-simple-objects-demo/), so I won't cover that here. I will focus on problems I had during the conversion of an SAP Business Warehouse 7.4 on any DB into Data Warehouse Cloud.
So here is the configuration of my systems
First, I had a look into this blog post and into SAP Note 3141688. After I checked with the Note Analyzer which notes were missing in my system, I had a list of around 1,000 notes I had to implement.
Okay, to be fair, the system release with BW 7.4 SP19 is from 2018 or so, so that normally should not happen in the real world. After I implemented the notes over a long period and created several SAP tickets because of some faulty notes and undocumented issues, I can now start with the steps of the blog post.
Let's get started and look into the system. Because our system is a demo system, I have no real data and also no InfoCubes or DSOs I can convert. Let's be honest, who has created an InfoCube or a DSO in recent years? First, I had to think about how to create these objects in the SAP GUI ;)
But after creating my first InfoCube in years, I could finally start. So let's open the transaction code stc01 and select the task list SAP_BW4_TRANSFER_CLOUD_SHELL as described in the blog post above.
There are different ways to create hierarchies in Data Warehouse Cloud. One way is to use a CSV file and upload it into DWC. Another way is to use the existing hierarchies of your SAP BW system. In this post, I want to show how to use a hierarchy from BW and transform it into a parent-child hierarchy in Data Warehouse Cloud. After the transformation, we use the hierarchy in a view.
First, we need a hierarchy on one InfoObject in SAP BW.
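To sketch the idea of the transformation (not the exact view from this post): assuming the hierarchy table of the InfoObject, for example /BI0/HPRODUCT with the columns NODEID, PARENTID and NODENAME, is available in Data Warehouse Cloud as a remote or replicated table called PRODUCT_HIER, a parent-child structure can be derived with a self-join. All names here are illustrative:

```sql
-- Hypothetical self-join on the replicated BW hierarchy table:
-- every node is mapped to its parent node, giving a parent-child hierarchy.
SELECT
    child."NODEID"    AS "NODE_ID",
    child."NODENAME"  AS "NODE_NAME",
    parent."NODEID"   AS "PARENT_ID",
    parent."NODENAME" AS "PARENT_NAME"
FROM "PRODUCT_HIER" AS child
LEFT JOIN "PRODUCT_HIER" AS parent
    ON child."PARENTID" = parent."NODEID";
```

The result can then be modeled as a dimension with a parent-child hierarchy in the Data Builder.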
There are different ideas and logics to determine year-to-date. Besides my post, which is also available on blogs.sap.com, there is another post on determining week-to-date (WTD) and year-to-date (YTD) values. I think that idea is also a good starting point, and I looked into it.
Instead of a control table that I would have to fill manually, I created a new SQL view based on the standard SAP time tables in Data Warehouse Cloud. First, create a new SQL view in the Data Builder of Data Warehouse Cloud (DWC).
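As an illustration of the idea (not necessarily the exact view from this post), assume the generated time data provides a day-granular table called TIME_DIMENSION_DAY with a DATE_SQL column; the table and column names are placeholders and need to be adjusted to your tenant. A year-to-date flag can then be derived directly from the current date:

```sql
-- Hypothetical SQL view: flag every date that falls into the current year
-- up to today; the same pattern works for month-to-date or quarter-to-date.
SELECT
    "DATE_SQL",
    CASE WHEN "DATE_SQL" <= CURRENT_DATE
          AND YEAR("DATE_SQL") = YEAR(CURRENT_DATE)
         THEN 1 ELSE 0 END AS "YTD_FLAG"
FROM "TIME_DIMENSION_DAY";
```

Joining this view to the fact data on the date column then restricts or flags the year-to-date records without any manually maintained control table.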
In this blog post, I want to share an idea of how you can generate month-to-date (MTD), quarter-to-date (QTD), and year-to-date (YTD) values in SAP Data Warehouse Cloud (DWC). This is only one way; I think there are several other ways to solve this. I am happy to discuss your ideas in the comment section. In my old post, I describe the same logic for SAP HANA Calculation Views.
It has been a while since I published my last post here. There are several reasons I haven't written anything; for example, SAP has not published new features for Analysis for Office, and my current project has no especially cool new things I can talk about because it is mostly just maintenance and nothing hip. So it was very quiet here, and this is what I want to change. If you follow me on Twitter, you could have seen this post.
The year is almost over, and I haven't written a lot of post on my blog this year. I have to do it a lot more next year. Some #DataWarehouseCloud, #Python and also #ABAP topics are on my list.
— Tobias Meyer (@reyemsaibot) December 17, 2021
So I will write some posts about Python, Data Warehouse Cloud, and some ABAP topics in the near future. This post starts with Python and how to analyze the Apple Health data.
You can export the health data from your iPhone and receive a ZIP file that contains an XML file. How to do this can be found via Google; it is uncomplicated. In my case, the XML file was around 1 GB in size and contained about 3 million entries up to May 2021. So I could not analyze it with Microsoft Excel or Notepad++, and since I only needed some of the information, I tried Python. I have worked with Python for just over a year, so please be kind if the code is not perfectly written.
It has been quite a while since I published my last post. A lot has happened since then: Analysis for Office 2.8 SP10 is now available, and the summer and the vacation are over. In the meantime I developed some new ABAP tools, had quite a few interesting exchanges, and took the SQL Script course by Jörg Brandeis. But in this blog post I want to share with you the latest tool I developed, which compares transformations in SAP Business Warehouse systems across the landscape.
Here is a short overview:
Since last week, Analysis for Office 2.8 SP8 has been available, and you can download it with your S-User. I just got a question asking if I could write about it, so here are the notes which fix some bugs:
In the last post I wrote about authorizations in SAP Data Warehouse Cloud, and I had an open topic about authorization on hierarchy nodes in SAP DWC. So I looked into it, and here is one example of how it could work at the moment. I don't know if SAP will change something in future releases.
So let us start with a CSV file to create the authorization we can use in SAP Data Warehouse Cloud. I now want to authorize my user for a Product Category, because my hierarchy looks like this:
So we have the same structure as in the last post:
In this blog post I want to show you how you can use data authorizations in SAP Data Warehouse Cloud. First, we have to log on to our SAP DWC and select the space we want to use for this.
After we have selected our space, we open the Data Builder of SAP Data Warehouse Cloud. Here we have to import a new table with our authorizations. In my case, we want to filter on the Product ID, so the table has two columns: User and ProductID.
Someone asked me how to create a time hierarchy in SAP Data Warehouse Cloud (DWC) to use in SAP Analytics Cloud (SAC), because out of the box, just creating the time dimension does not work right now. So here are the steps you have to take.
In BW/4HANA, SAP offers you the possibility to link restricted or calculated key figures across different Composite Providers. The advantage of the linked component concept is that the linked components can be automatically synchronized whenever you make changes in the source or master component. If you are not familiar with this concept, here is the description from the SAP Help:
"A linked component can be automatically synchronized whenever changes are made to the corresponding source component.
Example scenario: You have two highly similar InfoProviders, IP_A and IP_B. You have created the query Q_A for IP_A. You now want to create the query Q_B for InfoProvider IP_B. You want this query to be very similar to query Q_A and to be automatically adjusted whenever changes are made to query Q_A.
To do this, you use the link component concept: You create the linked target query Q_B for source query Q_A. This is more than just a copy, as the system also retains the mapping information. This mapping information makes it possible to synchronize the queries."
So the concept is really nice because you can ensure that key figures with the same name also have the same configuration, e.g. your turnover is configured correctly on every Composite Provider in your system. Now let's create a new linked component. First, we select a restricted key figure on our Composite Provider and go to the "Linked Component" view in Eclipse.