Dynamic source filtering is a big topic in SAP Datasphere. There are several approaches, and what works in one scenario may not work in another.
The discussion started again on LinkedIn after I read a post about dynamic filtering that was actually implemented with a fixed filter. Wanda then showed us a solution he uses to filter data. But Christopher pointed out that the filter is not pushed down to the source when you use a DP Agent connection, e.g., an ABAP connection.
I remembered that there was a blog post about how to push a filter down, even on an ABAP connection. So I tried Wanda's idea and combined it with that approach. This blog post will show you how it works.
If you already know SAP Note 2567999, you know that you can filter data with a stored procedure. And as of June, you can run stored procedures directly in task chains, which is a really nice feature. But back to the other solution.
First, we need a local table to store the load parameter for this approach. The table is only there so that you can set the parameter easily instead of hard-coding it.
It is quite simple: three columns, one for the application, one for the parameter, and one for the value. Yours may look different, but keep in mind that you then have to adjust the coding as well.
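To make this more concrete, here is a minimal sketch of such a parameter table together with a stored procedure that reads it and loads only the filtered data. All names (LOAD_PARAMETER, SRC_SALES, TGT_SALES, FISCAL_YEAR) are placeholders for this example, not the actual objects in my space, so adjust them to your own model.

```sql
-- Sketch only: table and object names are placeholders, adjust them to your model.
CREATE TABLE "LOAD_PARAMETER" (
    "APPLICATION" NVARCHAR(30),
    "PARAMETER"   NVARCHAR(30),
    "VALUE"       NVARCHAR(100)
);

-- Example entry: the SALES application should only load fiscal year 2024.
INSERT INTO "LOAD_PARAMETER" VALUES ('SALES', 'FISCAL_YEAR', '2024');

-- Procedure that reads the parameter and loads only the matching records.
-- "SRC_SALES" stands for the (remote) source, "TGT_SALES" for the local target table.
CREATE PROCEDURE "LOAD_SALES_FILTERED"()
LANGUAGE SQLSCRIPT AS
BEGIN
    DECLARE lv_year NVARCHAR(100);

    SELECT "VALUE" INTO lv_year
      FROM "LOAD_PARAMETER"
     WHERE "APPLICATION" = 'SALES'
       AND "PARAMETER"   = 'FISCAL_YEAR';

    -- The WHERE condition on the source is what should be pushed down to the remote system.
    DELETE FROM "TGT_SALES" WHERE "FISCAL_YEAR" = :lv_year;
    INSERT INTO "TGT_SALES"
        SELECT * FROM "SRC_SALES" WHERE "FISCAL_YEAR" = :lv_year;
END;
```

Whether the filter really reaches the source still depends on the connection and the adapter capabilities, which is exactly the point from the LinkedIn discussion above.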
It's just before the summer break, and I want to share an idea on how to get a dynamic prior year in an analytic model.
There are several questions in the SAP Community about how to achieve the same result as in SAP BW, like this one: https://community.sap.com/t5/finance-q-a/offsetting-input-parameters-in-datasphere-s-analytic-models-restricted/qaq-p/13723947
In this post, I want to share an idea of how to get such an offset in SAP Datasphere. Let's go to our fact view and add some logic that we will use later in our analytic model.
First, we will add an input parameter to the bicycle data model; later, I will describe another way in case you don't want to use an input parameter.
The input parameter in this case is called IP_YEAR, has no input help, and has a string data type with a length of 4.
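As a rough illustration of what we will build, here is a SQL sketch of the offset logic. The table and column names (BICYCLE_SALES, CALYEAR, AMOUNT) are assumptions for this example; the point is only that the prior year is derived from IP_YEAR instead of being hard-coded.

```sql
-- Sketch: expose current-year and prior-year amounts based on the input parameter IP_YEAR.
-- BICYCLE_SALES, CALYEAR and AMOUNT are placeholder names.
SELECT
    "CALYEAR",
    CASE WHEN "CALYEAR" = :IP_YEAR
         THEN "AMOUNT" ELSE 0 END AS "AMOUNT_CURRENT_YEAR",
    CASE WHEN "CALYEAR" = CAST(CAST(:IP_YEAR AS INTEGER) - 1 AS NVARCHAR(4))
         THEN "AMOUNT" ELSE 0 END AS "AMOUNT_PRIOR_YEAR"
FROM "BICYCLE_SALES"
WHERE "CALYEAR" IN (:IP_YEAR, CAST(CAST(:IP_YEAR AS INTEGER) - 1 AS NVARCHAR(4)));
```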
In this post I want to describe how you can use hierarchies in SAP Datasphere just like in good old SAP BW.
To be able to use a hierarchy in SAP Datasphere like in SAP Business Warehouse, you need an object with the semantic type "Hierarchy with Directory".
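To give a rough idea of the entities involved (the actual modeling, semantic types, and associations are of course maintained in the Data Builder), the underlying tables could look roughly like this; all names are placeholders for this example:

```sql
-- Hierarchy directory: one row per hierarchy (placeholder names).
CREATE TABLE "HIER_DIRECTORY" (
    "HIERARCHY_ID" NVARCHAR(20) PRIMARY KEY,
    "DESCRIPTION"  NVARCHAR(60)
);

-- Parent-child hierarchy entity that points to the directory.
CREATE TABLE "PRODUCT_HIERARCHY" (
    "HIERARCHY_ID" NVARCHAR(20),  -- which hierarchy the node belongs to
    "NODE_ID"      NVARCHAR(40),  -- child node
    "PARENT_ID"    NVARCHAR(40),  -- parent node, empty for the root
    "NODE_TEXT"    NVARCHAR(60)
);
```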
There are several blog posts on this topic that explain it in detail:
The entity relationship model for hierarchies with directories looks like this:
I know it's been a long time since I wrote the last post, and I also did some housekeeping on the site. But the last quarter was, as always, very busy with different topics. I now have four pilot projects with SAP Datasphere to manage and to develop the cool stuff for. 😉 For example, how to use the command line interface (CLI) for SAP Datasphere to create views or tables based on a remote table. I had a really cool meeting with Ronald & Tim about this topic - thank you guys for the input, now I have more ideas and less time.
It has been quite silent here. That's a fact. I have had a lot on my plate since I started a new job, and also before the promotion. But now I have a super cool topic on my mind that I want to share with you.
Hierarchies are a common topic in companies. They offer business users the flexibility to navigate in frontend reports. In this example, I flatten the hierarchy structure to consume a hierarchy with text nodes and InfoObjects. The Product Group hierarchy looks like the following screenshot.
The Data Warehouse Cloud Analytic Model is now available on all tenants. With this wave, you now have the option to use the new Analytic Model instead of an Analytical Dataset (ADS). The Analytic Model is a kind of cube that allows you to slice and dice your data model. I wrote about the preview access in December, and now I want to show you some details.
Wow! It's been a long time since I wrote the last blog post. I think many of my posts start like this now. But anyway, let's get back to the subject. I had the opportunity to have a look at the new Data Warehouse Cloud (DWC) Analytic Model. It was presented in the keynote by Hagen Jander and Eric Schemer at the DSAG Jahreskongress.
Unfortunately, I didn't manage to publish this post last month. There were several reasons, like the internal BI days or preparing the next deep dive for Data Warehouse Cloud. But back to the topic: SAP published Analysis for Office 2.8 SP14. Maybe there is something about SP14, because Analysis for Office 1.4 also had an SP14 before Analysis for Office 2.0 was released. So perhaps we will see some new features in the future?
But back to Analysis for Office 2.8 SP14. Now you are able to connect to Data Warehouse Cloud and consume the analytical datasets directly in Excel. This is the biggest feature update in a long time. The last updates were mostly bug fixes and some technical setting parameters, but nothing really fascinating.
Besides the function introduced in Analysis for Office 2.8 SP12 to repeat the titles of a crosstab, the latest version also offers a new API method called SaveBwComments and some new technical settings like
But for me, the best part is the new Data Warehouse Cloud connection. For this, you have to create a connection in the Insert Data Source dialog.
In this blog post, I want to share my experience with the Data Warehouse Cloud Bridge. There are several blogs out there that describe how to set up Eclipse (https://www.seaparkconsultancy.com/single-post/create-a-dwc-sap-bw-bridge-project and https://blogs.sap.com/2022/01/24/using-sap-bw-bridge-for-data-warehouse-cloud-part-1-creating-simple-objects-demo/), so I won't cover that here, because there are other sources for it. I will focus on the problems I had during the conversion from an SAP Business Warehouse 7.4 on any database to Data Warehouse Cloud.
There are different ways to create hierarchies in Data Warehouse Cloud. One way is to use a CSV file and upload it into DWC. Another way is to use the existing hierarchies of your SAP BW system. In this post, I want to show how to use a hierarchy from BW and transform it into a parent-child hierarchy in Data Warehouse Cloud. After the transformation, we use the hierarchy in a view.
First, we need a hierarchy on one InfoObject in SAP BW.
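To sketch where we are heading: assume the hierarchy table has been replicated from BW and exposes columns such as NODEID, PARENTID, NODENAME, and IOBJNM (the names in your system may differ). A simple view like the following already gives you a parent-child structure:

```sql
-- Sketch: map a replicated BW hierarchy table to a parent-child structure.
-- "BW_HIERARCHY" and its column names are assumptions; check the replicated table in your system.
SELECT
    "NODEID"   AS "CHILD_ID",
    "PARENTID" AS "PARENT_ID",
    "NODENAME" AS "NODE_NAME",  -- node key, e.g. the InfoObject value or the text node name
    "IOBJNM"   AS "NODE_TYPE"   -- tells you whether it is a text node or an InfoObject node
FROM "BW_HIERARCHY";
```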
There are different ideas and logics to determine year-to-date values. Besides my post, which is also available on blogs.sap.com, there is another post about determining week-to-date (WTD) and year-to-date (YTD) values. I think that idea is also a good starting point, and I looked into it.
In this blog post, I want to share an idea of how you can generate month-to-date (MTD), quarter-to-date (QTD), and year-to-date (YTD) values in SAP Data Warehouse Cloud (DWC). This is only one way; I think there are several other ways to solve this, and I am happy to discuss your ideas in the comment section. In my old post, I describe the same logic for SAP HANA calculation views.
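To make the idea tangible up front, here is a condensed SQL sketch of the to-date logic. The table and column names (SALES, SALES_DATE, AMOUNT) are placeholders for this example; only the logic itself matters.

```sql
-- Sketch: derive MTD, QTD and YTD amounts relative to the current date.
-- "SALES", "SALES_DATE" and "AMOUNT" are placeholder names.
SELECT
    "SALES_DATE",
    "AMOUNT",
    CASE WHEN YEAR("SALES_DATE") = YEAR(CURRENT_DATE)
          AND MONTH("SALES_DATE") = MONTH(CURRENT_DATE)
          AND "SALES_DATE" <= CURRENT_DATE
         THEN "AMOUNT" ELSE 0 END AS "AMOUNT_MTD",
    CASE WHEN YEAR("SALES_DATE") = YEAR(CURRENT_DATE)
          AND FLOOR((MONTH("SALES_DATE") - 1) / 3) = FLOOR((MONTH(CURRENT_DATE) - 1) / 3)
          AND "SALES_DATE" <= CURRENT_DATE
         THEN "AMOUNT" ELSE 0 END AS "AMOUNT_QTD",
    CASE WHEN YEAR("SALES_DATE") = YEAR(CURRENT_DATE)
          AND "SALES_DATE" <= CURRENT_DATE
         THEN "AMOUNT" ELSE 0 END AS "AMOUNT_YTD"
FROM "SALES";
```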
In the last post, I wrote about authorizations in SAP Data Warehouse Cloud, and there was an open topic about authorization on hierarchy nodes in SAP DWC. So I looked into it, and here is one example of how it can work at the moment. I don't know whether SAP will change something in future releases.
So let us start with a CSV file to create the authorizations we can use in SAP Data Warehouse Cloud. I now want to authorize my user for a Product Category, because my hierarchy looks like this:
In this blog post I want to show you how you can use data authorizations in SAP Data Warehouse Cloud. First, we have to log on to our SAP DWC and select the space we want to use for this.
After we have selected our space, we open the Data Builder of SAP Data Warehouse Cloud. Here we have to import a new table with our authorizations. In my case, we want to filter on the Product ID, so the table looks like this:
Someone asked me how to create a time hierarchy in SAP Data Warehouse Cloud (DWC) to use in SAP Analytics Cloud (SAC), because out of the box, just creating the time dimension doesn't work right now. So here are the steps you have to take.