
openSAP

Modern Data Warehousing with SAP BW/4HANA


Week 1 Unit 1

00:00:09 Hello and welcome to Week 1, First Steps with SAP BW/4HANA,
00:00:13 Unit 1, the Basics of SAP BW/4HANA. So after the preparations of Week 0,
00:00:18 we are now actually ready to get rolling in the system. Gordon, what are we going to see today?
00:00:23 All right, in this week we will talk about the components of BW/4HANA,
00:00:30 what is inside BW/4HANA, and what data modeling fundamentals we have in BW/4HANA,
00:00:37 as well as the modeling and administration environment. So this means we will jump into the
00:00:42 Eclipse-based modeling tools as well as into the BW/4HANA administration cockpit.
00:00:48 All right, so let's have a look at the overall architecture of a BW/4HANA installation.
00:00:54 On the right-hand side, you see the back-end parts of it, which basically consist of the SAP
BW/4HANA server,
00:01:02 an SAP HANA database underneath, and especially for planning scenarios,
00:01:07 we have add-on capabilities with SAP Business Planning and Consolidation 11.0.
00:01:14 If you look at the architecture picture of the BW/4HANA server, you see the metadata
repository
00:01:21 which contains all the model information, the information about runtime objects,
00:01:27 which we discussed last week already. You also see the analytical manager,
00:01:31 which is basically responsible for all the OLAP capabilities of BW/4HANA and you see the
00:01:37 data warehousing component which is basically responsible for managing,
00:01:41 storing the data during the transformations, and all that kind of stuff.
00:01:46 And on the left side of the picture, you can see the developer and administration tools we offer.

00:01:53 So here you can see the BW/4HANA modeling tools. The BW/4HANA modeling tools are a
desktop installation
00:02:01 based on the Eclipse environment; the BW modeling tools are an add-on to Eclipse.
00:02:09 And we offer as well, with SP8, the BW/4HANA administration cockpit,
00:02:16 which is web browser-based and based on SAP Fiori UIs.
00:02:21 So this means you can easily open it in a web browser on any mobile device.
00:02:28 This is what we are talking about, the HANA modeling tools here,
00:02:32 on the left side of the picture. One thing I forgot to mention is, of course,
00:02:37 the important components of the SAP HANA database, which are also relevant for data
warehousing.
00:02:42 So, of course, generally database management, being able to store data, keep track of tables,

00:02:47 manage tables, that's one of the important points. Another very important functionality is
00:02:52 the data integration and data quality functionality which comes with SAP HANA.
00:02:58 We will see in Week 2 how we can leverage this in combination with BW/4HANA,
00:03:03 and then we have some additional advanced analytic capabilities
00:03:07 which are also very interesting and play a big role in the
00:03:11 analytics story of SAP BW/4HANA. When we are talking about data modeling on BW/4HANA,
00:03:21 then we, of course, provide the fully integrated set of data, data types, and data models.
00:03:30 So what we have or what we offer here is we have reusable data types and dimensions
00:03:37 for InfoObjects and other objects. We have comprehensive data management capabilities here

00:03:43 as well as the virtual data models and analytic reports. So this means we offer full flexibility
here
00:03:50 to build advanced data models based on persistency and/or virtual data models.
00:03:57 Right, and one thing to add of course, something which is very attractive
00:04:02 to a lot of customers as well is that BW/4HANA comes with pre-built
00:04:05 data models which are shipped from SAP, can be used as templates,
00:04:09 but can also be used as best practices for certain scenarios. So you basically already get
some business content
00:04:15 which you can leverage right out of the box. Here you can see the core modeling objects in
BW/4HANA.
00:04:25 We dramatically simplified the number of objects here. What we already mentioned is the
architecture
00:04:32 of the BW/4HANA, containing the data warehousing part,
00:04:36 the analytical manager, as well as the metadata repository. And therefore we have special
BW/4 objects here
00:04:44 for the data warehousing part, we have the DataStore Objects for persistency,
00:04:49 we have the InfoObject for data persistency... Especially for the master data.
00:04:54 ... especially for the master data, right. We have the Open ODS View
00:04:58 to integrate data virtually and to connect the objects in between,
00:05:06 we have the transformation for ETL processes, and for manipulating the data
00:05:11 as well as the data transfer process to load data from A to B,
00:05:15 from the source to a target, and we have our DataSources to integrate data,
00:05:23 SAP data as well as non-SAP data, of course. Okay, so let's look a little bit into the details
here.
00:05:30 The first thing which you mentioned were basically the InfoObjects on the slide before.
00:05:34 So what are the InfoObjects? They're basically the most granular building blocks
00:05:37 for SAP BW/4HANA. They can just be like global data type definitions,
00:05:43 but they can also range to basically fully fledged, even slowly changing dimensions.
00:05:49 So what are the different types of InfoObjects? The first thing is, we have characteristics.
00:05:54 Characteristics are basically, as I just mentioned, more or less like dimensions in the general
analytic world.
00:06:01 So they are basically able to model time dependency, even slowly changing dimensions of
type two for example.
00:06:07 They support quite complex hierarchy features. You can also model authorizations based on
one InfoObject,
00:06:14 so it's all in one place with all the relevant things which are important for modeling
00:06:22 a dimension set up in a reusable way. Multi-language text is a big deal as well.
00:06:30 Time and date hierarchies, that's also a very nice feature
00:06:32 because you don't have to deal with this on your own. It comes out of the box,
00:06:36 you have all the calendar capabilities which you know, for example, from an S/4HANA
00:06:41 or an ECC system as well. And in addition, we also have some geospatial support.
00:06:48 And we have measures. In BW it's called key figures.
00:06:54 Here you can aggregate the data, you can SUM the data,
00:06:57 you can MIN, MAX all the data. You can aggregate the data here,

00:07:02 and you can assign currency or units. We have a full unit and currency support,
00:07:08 including, of course, conversion options, as well as special key figures,
00:07:15 like inventory key figures and non-cumulative key figures for the retail world.
00:07:21 And of course, we offer display properties for reporting, so this means we have a format
conversion here
00:07:30 and you can assign a number of decimal places for what the key figures and the other objects should look like.
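To make the aggregation and currency behavior a bit more concrete, here is a minimal SQL sketch (hypothetical table and column names, not the model that BW/4HANA generates): an amount key figure is only summed within one currency, so euro and dollar values never end up in a single number.

    -- Sketch: aggregate an amount key figure per currency so that
    -- EUR and USD values are never added together into one number.
    SELECT currency,
           SUM(total_due_amount) AS total_due_amount,
           MIN(order_date)       AS first_order_date,
           MAX(order_date)       AS last_order_date
      FROM sales_orders                -- hypothetical fact table
     GROUP BY currency;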
00:07:37 So all this is, as we said, basically built and modeled in one single object
00:07:42 for basically maximum consistency and maximum reuse, right? There's one object where
00:07:46 you model all these different properties, and wherever you use this object,
00:07:51 all these properties apply automatically. Now let's switch to the fact side
00:07:57 of a multidimensional model: the "advanced DataStore Objects".
00:08:00 That's basically the key data persistency in BW. The place where most of the data is typically
stored.
00:08:08 And the approach of BW/4HANA is that we don't just use a regular table,
00:08:14 but we actually have some smartness in this table which allows us to do out-of-the-box things
00:08:21 like parallel loads, we also have automatic or standard recovery options
00:08:27 for erroneous loads. So in case something goes wrong with a load
00:08:30 there's a clear way how to get this back into a consistent state again.
00:08:37 You don't have to figure out: How much of my load has succeeded?
00:08:40 What parts have failed? How do I get the partially loaded data out again?
00:08:45 There's a clear way to correct this in BW/4HANA because of the mechanisms and the
smartness
00:08:50 of the aDSO in between. And a final thing, which is also
00:08:55 very important about the aDSO: It has automatic data change capture capabilities.
00:09:00 So whenever data is loaded into an aDSO, it also tracks the changes between the previous
state
00:09:06 and the current state after the load. And you can use these captured changes
00:09:12 to propagate data to the next layer. And, of course, the aDSO is based on the
00:09:19 InfoObjects, as we mentioned earlier. But we have the possibility to use field dimensions
00:09:26 or field-based objects here as well. Additionally, it is possible to use partitioning
00:09:34 for large data sets. So this means you can partition your persistency,
00:09:40 your advanced DataStore Object, logically as well as physically.
00:09:46 Both are possible. And we have a smart solution to tier the data
00:09:52 between different tiers. So this means we can easily assign a data temperature
00:09:56 to a partition of the advanced DataStore Object here. When we compare the advanced
DataStore Object
00:10:02 to a classic DataStore Object on earlier releases, then you can see we have a quite simple
data structure rather than a complex one. The data structure is quite simple.
00:10:16 In the end, it's a flat table. We don't have an extended star schema
00:10:20 or star schema any longer. So this is for fact data and the assignment
to the star schema is done in the virtualization layer, which is the CompositeProvider.
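As a rough sketch of the flat-table persistence and the change capture mentioned a moment ago (hypothetical tables, not the actual aDSO table layout), the delta that gets propagated to the next layer is essentially the difference between the newly loaded state and the previously active state:

    -- Sketch: find records that are new or changed compared with the
    -- previously active state of a flat sales order table.
    SELECT inb.*
      FROM sales_orders_inbound inb          -- freshly loaded data (hypothetical)
      LEFT JOIN sales_orders_active act
        ON act.sales_order_id = inb.sales_order_id
     WHERE act.sales_order_id IS NULL         -- record did not exist before
        OR act.total_due_amount <> inb.total_due_amount;  -- record changed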
00:10:33 Right, so let's look at that side. We have now basically reached the
00:10:38 top of the analytic modeling world, which is basically assembling the facts and the dimensions

00:10:45 to basically build a star schema. In BW/4HANA, this is done in the "CompositeProvider".

00:10:51 So this is the place where you take the fact tables, maybe you even create a kind of logical
fact table
00:10:58 out of basically, for example, a union of multiple physical fact tables.
00:11:02 Or you can even do joins and mixes of both. So that's the fact part.
00:11:07 And then you basically associate the right dimension tables to the fact tables
00:11:12 to really build a multidimensional star schema. As we said, so you have the possibility to do
00:11:19 unions and joins of aDSOs in a CompositeProvider, and you define the relations between
00:11:26 the facts and the master data. What about the query side?
00:11:32 Yeah, last but not least, we have the analytic query for end user reports
00:11:38 and accessed by the business users, of course. And here we have the variety of defining a
report layout.
00:11:46 So this means you can easily build a query consisting of InfoObjects, of key figures,
00:11:54 as we explained before. And then you have different options here.
00:11:58 We have drilldown options, we have calculations on key figures via formulas,
00:12:03 filters, and all that stuff is available here. We have extended display properties here.
00:12:10 We have exceptional aggregation. All the query features based on the OLAP engine
00:12:15 can be used here. And of course, it's the base to define
00:12:19 complex analytical processes to cover the business needs. Exactly.
00:12:25 So, I guess we're ready for the system demo. Let me briefly describe what we're going to show.

00:12:30 We're not going to go into details of the objects which we just described.
00:12:33 But we will give you a first glimpse of the BW modeling environment.
00:12:37 So we will show you the modeling environment in Eclipse for the modeling part,
00:12:43 and we will also give you a brief introduction into the SAP BW/4HANA
00:12:47 cockpit for administration. So here we are.
00:12:53 This is the Eclipse-based BW modeling tool. And here you can see on the left side
00:13:00 we have our system. The system is connected to the BW modeling tools,
00:13:04 and here you can see we have a section for favorites. You can store your personal favorites
here.
00:13:12 We have a section for the BW repository. Here you can see we have different info areas,
00:13:18 which is more or less a logical partitioning of the... Folder structure.
00:13:22 ... the content, exactly. And in this area you would basically see
00:13:25 all the objects which you've built. You would see all your CompositeProviders,
00:13:28 you would see all your aDSOs, your InfoObjects,
00:13:31 even your queries. So all that stuff would be found here.
00:13:33 Yeah, so you can see that... We can show it here for the stuff which we
00:13:36 prepared for this course. So here you can see it.
00:13:40 Here we have a data flow object, we have CompositeProvider,
00:13:43 we have the advanced DataStore Object, as well as the characteristics,
00:13:47 which are the InfoObjects. And we have the key figures here as well.
00:13:51 So you can see that folder structure here. And, for ETL purposes,
00:13:58 we have the DataSources to load data from any source into the BW/4HANA as a target,
00:14:05 and here you can see we have connected our BW/4 system to different sources.
00:14:11 We have ODP sources here, we have CDS views connected,
00:14:15 we have an S/4 system connected. Here we have the possibility to, let's say,
00:14:22 connect any system to the BW/4HANA system, SAP systems as well as non-SAP systems.
00:14:28 Yeah, right, that's something which we mentioned in the previous week, in Week 0.

00:14:32 And, of course, the openness of SAP BW/4HANA is one of its main strengths.
00:14:37 So we basically can connect to any kind of external data. What we see here is basically mainly

00:14:44 classic SAP applications, but of course, in the upcoming weeks,


00:14:49 we will also see how we can connect to other stuff. And from the BW modeling environment,
00:14:54 it's quite easy to directly jump into the web-based administration tools.
00:15:01 This is just one click, here we are. Open the BW/4 cockpit, which system, and here we are.
00:15:09 I have to log on of course. And here you can see the SAP BW/4HANA cockpit.
00:15:16 Here you can see the monitoring part, we have a manage part, as well as a modeling part.
00:15:21 And here it's possible to create process chains, to run a process chain, to schedule process chains,
00:15:28 manage the loads, the ETL loads, manage the DataStore objects,
00:15:31 manage the InfoObjects, as we talked about earlier. So basically we have the possibility to see

00:15:37 all the loads and the status of all loads into these objects,
00:15:40 you get an idea of how much data the objects contain, how many rows basically.
00:15:46 All that can basically be done here and viewed here. You mentioned the process chain,
00:15:51 so the assembly of individual process steps into end-to-end process.
00:15:57 That's modeled and monitored here as well. And you have some statistic information
00:16:01 about the runs of your process chains and all that kind of information.
00:16:06 And as I said, this BW/4HANA cockpit can be accessed via any mobile device.
00:16:12 Right, it's Fiori-based and HTML-based as a consequence.
00:16:16 So as you say, basically every browser can be used to display it. And it has all the
configuration and
00:16:24 personalization functionalities which you know from Fiori.
00:16:27 So you can really personalize this to your needs, you can remove the functionalities
00:16:34 and the apps which you don't need, you can add some additional ones which are relevant for
you.
00:16:38 It's really configurable and personalizable to your own needs.
00:16:43 And I guess that's it for the very first demo of this course for today.
00:16:49 Now let's see what we have learned today. We have basically two key takeaways.
00:16:54 The first thing which you should remember after this class is the three core modeling objects
which we mentioned.
00:17:00 The InfoObjects for the dimensions and the key figure part of BW/4HANA,
00:17:07 the aDSOs for the facts pretty much, and the CompositeProviders to assemble a star schema
00:17:12 out of these two objects. And the second part is that you should now
00:17:16 have a rough idea of what the modeling environment of BW/4HANA looks like,
00:17:21 as I said, from a modeling perspective, but also from an administrative perspective.
00:17:25 And I guess that's it for today. Don't forget to do your self-test.

Week 1 Unit 2

00:00:08 Hello, and welcome to Week 1, Unit 2: Modeling Data Flows with SAP BW/4HANA, Part 1.
00:00:14 So what are we going to do in this unit? We will first give you an overview
00:00:17 of the data modeling capabilities of SAP BW/4HANA, especially focusing on the persistence
areas,
00:00:23 so the question of how to model the persistencies for master data and transaction data as well.

00:00:29 The virtualization layer with the CompositeProvider and the virtual data marts on top is
00:00:34 basically going to be part of the following unit, and we will, of course, give you a system demo,

00:00:39 which shows you the end-to-end process of modeling these artifacts,
00:00:43 including all the ETL stuff as well. Data flow modeling in BW/4HANA means
00:00:50 the combination of specific BW objects to fulfill the business requirements.
00:00:56 So this means we have a predefined semantics here, or you can model your data flow fully
freestyle
00:01:03 by using field-based objects as well as InfoObjects. Both are possible.
00:01:11 The idea is that you introduce one layer of persistency, and then it is possible
00:01:18 to build special data marts to access the data from the persistency layer fully virtualized.
00:01:25 That's basically the scope of pure BW/4HANA modeling. One of the nice features of SAP
BW/4HANA
00:01:31 is that we also have the chance and the possibility to use hybrid modeling, which combines the
modeling artifacts
00:01:39 and the modeling world of SAP BW/4HANA with the SQL world of SAP HANA itself.
00:01:43 So we can actually build data models which combine artifacts from both sides,
00:01:47 but that's going to be part of Week 2, basically. Another aspect which we are also going to see

00:01:53 in a later unit of this week is the Data Tiering Optimization,


00:01:58 which basically means that we have a very cost-efficient mechanism
00:02:02 to handle large data loads and introduce data tiering with various data temperatures from hot,
warm, to cold.
00:02:09 Basically, distribute the data in a cost-efficient manner across a larger system landscape.
00:02:14 And the last part is, of course, that BW/4HANA comes with a lot of predefined content
00:02:21 with predefined business models delivered by SAP, which show you basically best practices
00:02:26 and the semantics of standard SAP applications from an analytic perspective.
00:02:35 As you already know, we have virtualized modeling and we have persistent data modeling.
00:02:41 The idea is to combine both worlds into one data model. From a persistency point of view,
00:02:48 we have the DataStore object. The DataStore object is our central object
00:02:52 for data storage and data consolidation. It consists of different tables
00:02:59 and we have delta capabilities. Multiple data load scenarios are possible,
00:03:05 and we have the InfoObject here. The InfoObject is the central object
00:03:10 for master data characteristics and for key figures with unit and currency conversion.
00:03:16 Unit and currency conversion, it's possible during the loads into an InfoObject.
00:03:21 InfoObjects can be persistent, as well as virtualized. Right, so the core idea of InfoObjects
00:03:27 is basically consistency. They combine all the aspects of a dimension
00:03:31 as we have it here in the green box. In one object, making sure that you
00:03:37 have consistent data types all over the place. You never come into a situation where maybe
00:03:42 a customer ID is a character ten in one place and a character eight in another place.

00:03:46 As long as you use the same InfoObject, this kind of consistency is guaranteed.
00:03:50 Even from a data perspective to a certain extent, we can guarantee referential integrity
00:03:54 between facts and master data. So if this consistency is needed,
00:04:00 then InfoObjects are, of course, the way to go. If that's not your primary focus,
00:04:04 but if you want to work quickly and maybe a little bit dirty,
00:04:07 we also have the "field-based modeling capabilities", which basically is an extension of what you can do
00:04:13 with InfoObjects, just working on fields, so with much less metadata and much less effort
00:04:19 to build, but on the other hand, fully integrated with all the services of BW/4HANA.
00:04:23 So you can use these fields in reporting. You can assign authorizations to fields.
00:04:31 You also have the full loading capabilities of aDSOs when it comes to fields.
00:04:36 So from that perspective there is basically no big difference. It's really from a semantic
perspective
00:04:43 how much functionality you assign to a field. If there's a lot of that, then InfoObject is the way
to go.
00:04:50 If it's just about having a quick way to load data and start working with data,
00:04:54 then fields are probably the better way to go. And now the question is how we can load the data
00:05:01 into BW/4HANA persistency objects for these ETL processes. We have a variety of options here.
00:05:10 You can load the data in batch. You can access the data in real time,
00:05:14 or you can use a remote data acquisition. Therefore, we have predefined content, as well.
00:05:20 And the idea here is, during the transformation, to transform and enrich the dataset.
00:05:26 Here we have a lot of delta capabilities for delta mechanisms, as well as for sophisticated error
handling.
00:05:35 That's possible via the transformation, via the ETL processes. And we have different ways to
transform your data.
00:05:44 It is possible to use, let's say, a standard way, you can use expert routines.
00:05:49 You can use field routines. You can use fully freestyle routines here
00:05:55 to transform and enrich the data. If this is not enough, we offer the possibility
00:06:02 to use adapters, SDIs – smart data integration – adapters, to connect to a variety of different
sources,
00:06:10 to load the data into BW/4HANA. All these capabilities will, basically,
00:06:17 also be part of, or the focus of, Week 2. Yeah.
00:06:21 So let's start with an overview of the demo data model, which we are going to build now in
BW/4HANA.
00:06:28 This is basically a look at it from a database perspective, or entity perspective, possibly.
00:06:33 So we're basically dealing with a fairly simple and straightforward star schema.
00:06:38 We have a central fact table which contains sales orders. And we have two main dimensions,
00:06:44 which are the product dimension on the left-hand side and the customer dimension on the right-hand side.
00:06:48 Each of these dimensions also has language-dependent texts. We basically mentioned that this is
00:06:53 one of the key aspects of InfoObject modeling. And we have some additional texts
00:07:00 for the city and for the country because it's also, to a certain extent, language dependent.
00:07:06 So that's, from a relational or database or entity perspective, what the data model looks like.
00:07:12 You basically see the relations between the various objects.
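For readers who like to see this entity view as tables, a simplified, purely hypothetical DDL sketch follows (in BW/4HANA these objects are generated from InfoObjects and the aDSO rather than created by hand; names and types are illustrative only):

    -- Sketch: the demo star schema as plain tables (hypothetical names).
    CREATE TABLE products (
        product_id       VARCHAR(20) PRIMARY KEY,
        color            VARCHAR(20),
        product_size     VARCHAR(20),
        product_weight   DECIMAL(15,3),
        product_category VARCHAR(40)
    );
    CREATE TABLE product_texts (
        product_id VARCHAR(20),
        langu      CHAR(2),              -- language key for multi-language texts
        descr      VARCHAR(60),
        PRIMARY KEY (product_id, langu)
    );
    CREATE TABLE customers (
        customer_id VARCHAR(10) PRIMARY KEY,
        city        VARCHAR(40),
        postal_code VARCHAR(10),
        country     CHAR(2)
    );
    CREATE TABLE sales_orders (            -- central fact table
        sales_order_id   VARCHAR(10),
        order_date       DATE,
        product_id       VARCHAR(20) REFERENCES products,
        customer_id      VARCHAR(10) REFERENCES customers,
        currency         CHAR(3),
        total_due_amount DECIMAL(17,2)
    );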
00:07:17 Now, let's have a look at how this looks from a BW perspective.

00:07:21 On the BW side, as you can see, here is the central object. We have, as an example, the sales order.
00:07:27 In the sales order, we have different objects like the SalesOrderID, OrderDate, and so on.
00:07:32 And each object represents an InfoObject as well. And as you can see, on the dotted lines,
00:07:39 this InfoObject has InfoObjects as attributes as well. And then you can use those as navigational attributes
00:07:46 in that case, on the sales order ID. As you can see, we have different InfoObjects
00:07:52 as well as texts. You can assign texts to the InfoObjects as well.
00:07:58 And you can see, each line here represents one InfoObject. That's quite important here.
00:08:05 And besides the texts, you can, of course, use hierarchies as well.
00:08:09 What you didn't mention... so the central fact table basically corresponds to an aDSO
00:08:14 built out of a lot of InfoObjects here. And as you said, the other things,
00:08:18 the attributes of the customer, which would be InfoObjects as well,
00:08:21 are assembled into a bigger dimensional InfoObject, or "characteristic", which is the customer,

00:08:27 and the same is true for the product. So, let's directly jump into the demo,
00:08:32 and see what this looks like in the system. What are we going to do in this demo?
00:08:36 We are basically going to show you how to model a data flow. A data flow is a dedicated object
in SAP BW/4HANA.
00:08:45 We'll have a look at the InfoObjects which are relevant in this scenario.
00:08:49 We'll show you how to model an aDSO. And we'll also create transformations and DTPs
00:08:54 so that we are ready to load data in the subsequent units. So now we are back in the BW
modeling tools.
00:09:04 You can see on the left side the different info areas, and you can see the BW/4-specific objects.
00:09:14 We have the data flow object. We have a CompositeProvider, a DataStore object,
00:09:18 the characteristic, and the key figures. Let's have a closer look at
00:09:22 the characteristics and the key figures. And here you can see the different key figures
00:09:28 and the characteristics we described in the slides before. Exactly, so if you look at this list of
InfoObjects,
00:09:33 it should basically be a one-to-one match with the list of attributes which we had
00:09:38 on the entities of the previous slide. Yeah, let's jump into one...
00:09:42 Maybe let's have a look at the product. Yeah.
00:09:47 So this is one of the more complex InfoObjects, which is not just a single data type
00:09:51 but one of the things which has multiple attributes, which also has texts.
00:09:55 And we see these attributes here on this screen. On the left-hand side, we basically see
00:10:00 the type definition, it's a character of length 20. On the right-hand side, in the Properties area,

00:10:08 you see that it has master data because it has all these attributes,
00:10:11 and you can actually load master data into it, right? We're focusing on the persistence part of
the
00:10:15 data model in this unit. It also has texts.
00:10:19 In addition, it has hierarchies, because that's quite typical. I think you have retail experience,
Gordon, right?
00:10:25 When it comes to products, you very often have a product hierarchy.
00:10:30 Let's jump to the attributes. Exactly, that was the overview.
00:10:32 And now, on the more detailed level, we see now the list of attributes.
00:10:36 Again, you can compare this to what we had on the previous slide.
00:10:39 It should be exactly the same list of attributes from color, product size, product weight,

00:10:45 over to product category. And basically, all of this is now assembled
00:10:49 into one larger characteristic which is also an InfoObject. And where all these properties,
00:10:56 including later on authorizations and all this stuff, are assembled into one thing
00:11:00 and ready for reuse whenever you need it. Okay, so let's create a new data flow object.
00:11:07 Exactly, we're basically switching now to the transaction data side, for the moment.
00:11:10 That was the master data. Now we go over to the transaction data.
00:11:13 And to load and show you how the modeling here works, Gordon is opening the
00:11:19 data flow modeler of BW/4HANA. So here we are.
00:11:26 This is the data flow object, more or less naked, of course. It looks like a whiteboard.
00:11:31 Now we have the possibility, via drag and drop, to use the BW objects on the right side.
00:11:37 Or you can use existing BW/4 objects, and drag and drop these objects into the data flow.
00:11:45 - From the left side.
- From the left side. If you reuse objects, you do it from the left side.
00:11:48 If you build new objects, you do it from the right side. Exactly.
00:11:51 I will do that for an object which already exists. So here you can see the objects in the system.

00:11:58 So that's, like, half ready. We basically did the initial parts of the modeling,
00:12:03 and now we'll show you how to complete this by adding the remaining InfoObjects here.
00:12:08 So let's maybe jump into the object, look at the details, and then complete the object,
00:12:13 and then start working on the data load. So here you basically see the list of InfoObjects.
00:12:21 Again, compare this to the slide where we had the overview of the data model.
00:12:26 Some of the InfoObjects are missing, and we're now basically going to add them.
00:12:32 Click on Add InfoObjects, and we basically select... I think what's missing is the product which we just looked at.
we just
00:12:37 What's also missing is the customer, and I think one of the key figures was also missing.
00:12:42 And we basically select this from the list of InfoObjects which we have here.
00:12:47 Customer is very high up. And I think that the last one was also missing,
00:12:50 which is the Total Due Amount. And now we bring these into the selection area.
00:12:58 Click on add and the system adds them here to the list of InfoObjects of the aDSO.
00:13:03 Now, basically, the aDSO is completed. So all the properties, including data types
00:13:08 and all the other relevant properties of the InfoObjects which we have built
00:13:13 are now baked into the aDSO. So now we'll activate the DataStore object.
00:13:20 Pressing the activate button. Right, and after activation is finished,
00:13:26 we will go back to the data flow. I think that's right now, right?
00:13:30 Yeah. We'll go back to the data flow and connect this
00:13:32 to a DataSource and basically work on the loading part. Yeah, therefore I will open the
DataSource area here.
00:13:42 By the way, we have connected an S/4 system to our BW/4HANA system.
00:13:48 So here we are. We are using ABAP CDS views.
00:13:53 Here, this is the S/4 system, the connected S/4 system. And here we have a list of
DataSources
00:14:02 for each of the entities which we saw in the overview. Here's now the Sales Orders
DataSource.
00:14:12 And we now connect this to the aDSO. We will create a transformation
00:14:16 and a data transfer process, which basically is responsible for the runtime properties of the
load.
00:14:23 Transformation basically models, as the name says, the transformation rules, so basically how
data

00:14:29 is to be modified and adjusted during loads. Whereas the data transfer process,
00:14:34 basically, is responsible for the execution of the load, how packaging is supposed to be done,

00:14:38 how errors are supposed to be handled, all that kind of stuff.
00:14:41 Now the DataSource is connected to the data target, to the advanced DSO, and now I will
create a transformation.
00:14:48 Quite simple. Go to Transformation and Create Transformation.
00:14:54 Finish. And as we said, all of this is part of the
00:14:57 Eclipse modeling environment. There's no switch between Eclipse and SAP GUI anymore,
00:15:04 as of SAP BW/4HANA 1.0 SP8. And here you can see where the transformation is executed.

00:15:10 Of course, here in the HANA runtime, you can switch between the runtimes.
00:15:15 And you have the possibility to build your rules using SQL functionality
00:15:20 or using ABAP functionality. And there are also some modeling capabilities,
00:15:24 for example, for lookups, you can do this without any coding, in a modeled way.
00:15:32 So the system proposes some of the mappings already here. It seems quite complete.
00:15:37 Yep, it is. The system also has some smartness built in
00:15:40 to match the fields of the source and fields of the target. If we were to adjust some of the rules,

00:15:47 we could of course pick one. For example, just pick any of them.
00:15:51 Then in the lower area, on the bottom, we would basically have the possibility
00:15:55 to adjust the details of this rule. For our purposes here, we just take a one-to-one mapping.
00:16:00 I think that's good enough. Here you can see the different options we have.
00:16:04 We have the option to use a formula, to create a lookup, assign a constant,
00:16:11 use a routine, or no update options. Exactly.
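To give a rough idea of what such a modeled lookup amounts to conceptually, here is a plain SQL sketch with hypothetical names (not the code the transformation actually generates): the incoming records are enriched with an attribute read from master data.

    -- Sketch: enrich source records with a looked-up master data attribute.
    SELECT src.sales_order_id,
           src.product_id,
           prd.product_category AS looked_up_category,  -- lookup result
           src.total_due_amount,
           src.currency
      FROM src_sales_orders src        -- hypothetical DataSource extract
      LEFT JOIN products prd
        ON prd.product_id = src.product_id;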
00:16:14 But that's all good for the moment. Let's just activate it
00:16:16 and go back to the data transfer process. Now the transformation is getting activated.
00:16:26 So that's done. So close.
00:16:31 And now you can see the transformation has switched to green, which means the
transformation is now active.
00:16:38 Green is a good sign usually, as opposed to red. Yeah, that's true, yeah.
00:16:43 Now we create a data transfer process. Here we are.
00:16:49 The system already sees, okay, there's one transformation between the source and the target,

00:16:55 so that's probably the way along which the data is supposed to be moved.
00:17:00 And it takes this into account.
00:17:02 Here we have different options, as well. Extraction settings, execution settings...
00:17:07 Package size, that's relevant if you want to parallelize, for example, as well.
00:17:11 All these aspects. Yeah, update rules...
00:17:14 So I will simply activate... The standard settings are typically okay,
00:17:18 at least to start with. Of course, in more complex scenarios
00:17:21 and if you have high data volumes, you may want to revisit this. So if you want to use special features
00:17:27 of the data transfer process. But in the standard case, which we're showing here,
00:17:31 the standard settings are good. So we can also activate this,
00:17:35 and then we're basically done with that part of the demo. Yeah.
00:17:40 Back to the data flow object, now the DTP is created as well.

00:17:45 Now the data flow is more or less done. It's ready for loading
00:17:49 and that's going to be part of the next unit. So let's summarize what we saw in this unit.
00:17:56 Let's look at the key takeaways. So we basically saw two of the core modeling objects
00:18:02 when it comes to modeling persistency in BW/4HANA. That's the advanced DataStore object
00:18:06 typically for the transaction data. But, of course, it also plays a role
00:18:12 when it comes to manipulating and storing master data on the data warehouse layer.
00:18:18 InfoObjects are basically the core entities when it comes to modeling master data
00:18:21 for reporting purposes. You've seen that all the modeling is basically now
00:18:27 a part of the Eclipse environment, so there's no switch between SAP GUI and Eclipse
anymore.
00:18:33 It's all contained in the Eclipse experience. And you've gotten a rough idea
00:18:39 of how the enriching and transforming capabilities of transformations in the DTP work.
00:18:45 And with that, I want to remind you not to forget your self-test.

Week 1 Unit 3

00:00:07 Hello, and welcome to Week 1, Unit 3: Modeling Data Flows with SAP BW/4HANA, Part 2.
00:00:13 So, now we're basically continuing what we did in the previous unit, and complete the data
model
00:00:19 which we started to build by adding the virtual layer on top.
00:00:23 So, first of all we will build a CompositeProvider, which basically assembles the star schema
00:00:28 out of the InfoObjects and the aDSO which we have built,
00:00:31 to basically really give us a multi-dimensional object for analysis.
00:00:35 And on top we will build an analytic query, which actually designs the report layout
00:00:39 and maybe some additional calculations on top of the star schema which we defined.
00:00:44 Another aspect which we will show is completion of what we did last time.
00:00:48 We will not only... last time we only modeled the data load
00:00:51 and the transformation and the DTP. Now we will also execute it
00:00:54 to actually have data in our data model so that we can play around with the query
00:00:58 and really start to analyze data. Or at least get an impression of what's possible.
00:01:03 All right, so let's have a look at the objects in a little more detail.
00:01:08 Yeah, as we already learned in the previous units, we have the objects for persistency,
00:01:13 the advanced DataStore object and the InfoObjects, and now we build, on top of the persistent
data model,
00:01:21 the virtual objects. Here we have the CompositeProvider. The CompositeProvider joins the
data,
00:01:28 or does a simple union, to join or union the persistency objects like the advanced DataStore object and InfoObject,
00:01:39 and bring in the star schema here. We have as well the open ODS view
00:01:44 to consume external data or external tables and views into the CompositeProvider.
00:01:51 That's something which we will deal more with in the next week actually.
00:01:55 So, the next layer is the analytic query. You know that the analytic manager
00:02:00 is the built-in OLAP engine of SAP BW/4HANA. And the analytic query is basically the object
00:02:07 which is analyzed by the analytic manager and transformed into kind of an execution plan
00:02:12 for the HANA database. So what you can do in an analytic query is
00:02:16 basically define the layout of a report. You have a rich set of functionalities,
00:02:22 which range from hierarchies, like generic hierarchies of certain objects, date hierarchies, a lot
of formatting
00:02:29 information, you can define what options you have for drilldown,
00:02:34 which key figures you actually want to present. As I said, do certain restrictions on these key
figures
00:02:40 for example, restrict sales amount by calendar year and do a year-over-year comparison,
00:02:47 or things like that. All such computations on top of the actual data
00:02:53 in the data model. It's important to know that with SAP BW/4HANA,
00:03:00 the vast majority of these calculations is actually not executed on the application server side,
00:03:04 but down in the database. So, for performance reasons we have put a lot of effort into
00:03:10 really optimizing the execution of such an analytic query, and pushing down as much
functionality as possible
00:03:16 into the database, to do the calculations where the data actually resides.
00:03:21 And I already mentioned some of the analytic capabilities. So hierarchy handling is one of the
very strong points

00:03:27 of the BW/4HANA OLAP engine. Currency and unit conversions are also important.
00:03:32 Right? You're typically dealing in a larger enterprise
00:03:35 with multiple currencies, and you probably have something like a group currency into which
you have to convert
00:03:42 the data coming from subsidiaries. Unit conversions are also very typical
00:03:47 because certain objects are dealt with in different units. So, that's basically all in-built.
00:03:53 We have quite complex aggregation schemes, so it's not just sum, min, and max.
00:03:58 We also have the possibility to combine count with other aggregations.
00:04:04 That's something we call exception aggregation. And the examples of restricted and calculated
key figures
00:04:10 which I just mentioned, like doing year-over-year comparisons and all that kind of stuff.
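As a rough analogy in plain SQL (hypothetical columns and years; in BW/4HANA this is modeled in the query, not hand-coded), a pair of restricted key figures for a year-over-year comparison boils down to conditional aggregation:

    -- Sketch: sales amount for two calendar years side by side, per currency.
    SELECT currency,
           SUM(CASE WHEN EXTRACT(YEAR FROM order_date) = 2024
                    THEN total_due_amount ELSE 0 END) AS amount_2024,
           SUM(CASE WHEN EXTRACT(YEAR FROM order_date) = 2023
                    THEN total_due_amount ELSE 0 END) AS amount_2023
      FROM sales_orders
     GROUP BY currency;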
00:04:14 And the last example I will leave to you, Gordon because that's a typical retail functionality.
00:04:20 Sure. We have, of course, the inventory handling for the non-cumulative key figures.
00:04:27 To calculate different stock or in-stock scenarios. Exactly. So that's also very business-oriented functionality
00:04:34 which you typically don't find in a generic database-oriented tool.
00:04:39 But now it's already time to switch to the demo. So what are we going to show today?
00:04:43 As I said, we are going to load data first to be ready for the actual analysis, then we are going
to build a
00:04:49 CompositeProvider on top of the data model which we saw in the previous unit.
00:04:54 We will build an analytic query on top of that and we'll start one of the front-end tools.
00:05:00 In this case, it's going to be Analysis for Microsoft Office to visualize the data and play a little
bit around
00:05:04 with the query which we're building. All right Gordon, let's roll.
00:05:11 So... Maybe one brief remark on the data flow.
00:05:14 It's basically the data flow which we started to build last time,
00:05:17 but we've added a little bit on top. So what we added, we added an InfoObject
00:05:24 to demonstrate what the load to an InfoObject looks like. Here you can see the InfoObject
Product on top,
00:05:33 and then you can see the different streams. One is a hierarchy stream, the other is the
00:05:40 attribute stream, and the other is the text stream. And here you can see, under the attributes
and the text,
00:05:46 we have data sources, and via the data sources it's possible to load data into the persistency
00:05:52 of the InfoObject. Exactly. So it's basically the same schema
00:05:55 as we see it on the left-hand side for the transaction data, but it's a little bit more complex
00:05:59 because we have these three different types of data, which come from different sources
typically as well,
00:06:05 and therefore, for InfoObjects, we in many cases have these three targets for data loads.
00:06:11 The hierarchies, the attributes, and the texts separately. But now, let's start the data load for the transaction data.
00:06:17 Yep. Let's start a data load. Therefore, I will jump into the data transfer process here.
00:06:24 We can display the data transfer process. And then we have the button here to execute
00:06:30 the data transfer process. I will execute the data transfer process here.
00:06:37 Of course, that's not always a manual task. We will see in one of the next units how you can
00:06:41 automate this, how you can build more complex or assemble different process steps
00:06:46 in a data warehouse load scenario into something which is called a process chain.
00:06:51 But for our purpose here it's just a manual task because we just want to load the data once, right?

00:06:55 So now we load the data in the background, but now I will jump into a Web UI to administrate
the data
00:07:04 because I have to activate the data. We will show you that later on,
00:07:07 in a later unit, that it's possible to use a so-called process chain to automate these processes.

00:07:17 So right here. Here we are.


00:07:20 Let's just do it quickly. As we said, this is basically a little bit ahead
00:07:26 of what we're actually doing. We are going to show you the details
00:07:29 of these administration UIs in the upcoming two units. Right now, it's just necessary to jump
into this
00:07:37 because we have to activate the data, which is the step which happens after loading
00:07:41 and which typically, well more or less, compares the new data, which has been loaded,
00:07:47 to the current state of data. And basically does the merge of these two data sets.
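Conceptually, this activation step resembles merging the newly loaded records into the active data set; a minimal SQL sketch with hypothetical tables (the real mechanism is handled internally by the aDSO) could look like this:

    -- Sketch: merge freshly loaded records into the active data.
    MERGE INTO sales_orders_active act
    USING sales_orders_inbound inb
       ON act.sales_order_id = inb.sales_order_id
    WHEN MATCHED THEN UPDATE
         SET total_due_amount = inb.total_due_amount,
             currency         = inb.currency
    WHEN NOT MATCHED THEN INSERT
         (sales_order_id, order_date, currency, total_due_amount)
         VALUES (inb.sales_order_id, inb.order_date, inb.currency, inb.total_due_amount);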
00:07:51 So all right, now we are basically ready to start analyzing the data.
00:07:56 First, building the CompositeProvider, and then building the analytic query.
00:08:01 Yeah. I will start to build the CompositeProvider.
00:08:05 Therefore I will create a new one. Or, we can simply drag and drop it from here.
00:08:16 So here we are. So I will simply connect our source,
00:08:23 which is an advanced DataStore object, to the newly created CompositeProvider.
00:08:28 Double-click on the CompositeProvider. Name.
00:08:36 Zero two I would guess.
00:08:38 I don't know where the zero one is coming from, but we can just ignore that one on the right-hand side.
00:08:44 There you can see we have automatically unioned the sales orders advanced DataStore object.

00:08:50 Right. We mentioned that you could actually do even joins or combinations of joins and unions

00:08:55 of multiple fact tables here. In our scenario, since we only have one,
00:08:59 it's just a union of the single table. I mean you can imagine that a lot of scenarios
00:09:04 now are possible with the combination of joins and unions. Exactly.
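As a hedged illustration of what such a combination means in plain SQL terms (hypothetical tables, including a second, archived fact table that does not exist in the demo; the CompositeProvider itself is modeled graphically):

    -- Sketch: a "logical fact table" as a union of two physical fact tables,
    -- joined to a dimension and aggregated per currency.
    SELECT prd.product_category,
           f.currency,
           SUM(f.total_due_amount) AS total_due_amount
      FROM (SELECT product_id, currency, total_due_amount FROM sales_orders
            UNION ALL
            SELECT product_id, currency, total_due_amount FROM sales_orders_archive) f
      JOIN products prd
        ON prd.product_id = f.product_id
     GROUP BY prd.product_category, f.currency;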
00:09:09 So... All right, so what we see here is basically
00:09:15 the overview page. Nothing important here for the moment on this screen.
00:09:21 We're going to talk a little bit about the details maybe next time, or next week actually.
00:09:28 But now let's define the star schema in more detail. So here you can see the source fields,
00:09:35 you can see the different InfoObjects, and the key figures, and the key fields.
00:09:41 And now I will simply... move the source fields to the target fields,
00:09:47 via drag and drop. Here you can see... Maybe one thing I would like to mention here.
00:09:51 You see the InfoObject for the customer and for the product, but you don't see the attributes of
customer and product here.
00:09:57 That's exactly the point. We only see the keys here.
00:10:01 But all the additional information like all the additional attributes which you potentially want
00:10:05 to drill down by, which the customer provides. Remember, we had address information in the
customer.
00:10:13 We had some information like color in the product. If you want to join these and drill down by
these,
00:10:19 that's something we should do actually in the next step. Yeah, in the next step.
00:10:22 That's the pure structure of the fact table. Yeah.
00:10:26 Now we'll jump to the Output tab. Here you can see the designated objects

00:10:31 on the right side in the middle. And here you can see, for instance, we have product.
00:10:39 Exactly. Now the point is we actually don't want to drill down just
00:10:42 by product, but maybe by some of the attributes of product. That's why here we are.
00:10:47 Now we can assign the navigational attributes to the product.
00:10:52 I will select all here. Right.
00:10:54 That's good.
00:10:56 This means we then have color, product size, and product category for browsing.
00:11:00 Exactly. And we do the same with the customer.
00:11:03 For example, to add address information, if you want to do an analysis not by individual
customers
00:11:07 but maybe by zip code or by region. So we have country, for example, as one option or city.
00:11:14 This geographic information is probably very relevant for your drilldowns.
00:11:19 So, let's add these as well. And with that maybe you can open the two folders here
00:11:26 so that you actually see the additional fields which we have now added to really make this a
star schema.
00:11:30 My guess now is we're ready to activate this. On the right-hand side,
00:11:34 you have the possibility to add some or maintain some detailed properties
00:11:40 of each of the InfoObjects on the left-hand side. For example, conversion routines which
change the data
00:11:46 or reformat the data for output purposes. Such things. For the key figures here, we'd
00:11:54 have the possibility to define the accuracy, the number of digits, and that kind of stuff.
00:11:59 But for the moment, again in this example we're happy with the standard settings.
00:12:03 We'll activate and do the next step to go to the query. So object is activated.
00:12:11 Now I will jump back to our data flow. And now we can create a query
00:12:17 on top of the CompositeProvider. So, here we are.
00:12:28 The right mouse button, new, and query. Creates a new query here.
00:12:32 So name of the query, maybe ZOSQ01. Okay.
00:12:44 And as we said, this is basically the point where you define the layout
00:12:47 of your report in more detail. So, let's maybe go over to the sheet definition...
00:12:54 Yeah, here we are. ...where we basically define what,
00:12:59 basically the initial drilldown state. So, what columns do we want to see?
00:13:03 What key figures we typically want to see in the columns? Maybe let's just add all of the key
figures here.
00:13:09 Or just pick a few. Whatever you like.
00:13:12 In the row, since we are dealing with money here, it probably makes a lot of sense to drill down

00:13:16 by currency right away, because otherwise you would either


00:13:21 have the threat of adding up numbers which don't make sense if you add up US dollars and euros.
00:13:27 On the other hand, in BW/4HANA this would not happen because BW/4HANA knows the
relation between
00:13:33 a key figure and its unit and would just display a star if it knows
00:13:36 that you're adding up amounts for different currencies. But still to really see data,
00:13:43 it's important to drill down by currency here. What else is relevant?
00:13:51 Maybe you can use the color in the free section. Let's maybe also use all
00:13:56 the geographic information, the address information for the free characteristics
00:14:00 so that we can actually drill down by city or by postal code afterwards.

00:14:05 Color is maybe nice to have one of the... Maybe city.
00:14:08 Maybe postal code. Country.
00:14:11 Postal code. Whatever.
00:14:12 That's fine. And I guess, that's almost all we need for the moment.
00:14:16 Again, you have a lot of possibilities to define details here.
00:14:20 For example, you can see... Can you go back?
00:14:24 You can see that you could, for example, leverage hierarchy functionalities
00:14:27 of the InfoObject. For example, if we had the product
00:14:30 as a free characteristic, or even in the rows, we could actually use the product hierarchy
00:14:36 to have a hierarchy display of the product.
00:14:40 For the key figures, we could also define some additional calculations.
00:14:43 As I said, we could define these restricted key figures for year-over-year comparison.
00:14:47 For example, quarter comparisons. We could do some additional calculations
00:14:53 for various purposes. So all this can be done here in the query.
00:14:56 Defined here at this point and then leveraged from all the analytic UIs.
00:15:00 But for the moment, I guess that's it again. Let's maybe save this query
00:15:05 and see if we can really have a look at the data. I will now open Analysis for Microsoft Office here.
00:15:14 Right. So we have this
00:15:16 especially for preview and verification purposes built right into the Eclipse UI
00:15:22 that you can directly start either some in-built preview screens
00:15:28 or even an Excel plug-in which we sell called Analysis for Microsoft Office.
00:15:36 So that you can right away start analyzing the data,
00:15:38 do a little bit of slicing and dicing and see what's going on.
00:15:41 So, here it is. And you see that we basically have exactly the layout
00:15:47 which we wanted. We have the two key figures which we modeled.
00:15:51 Not all the four which we actually have in the CompositeProvider,
00:15:53 but we only decided to use the two here. We have a drilldown
00:15:56 by currency and unit in the rows. You also see the overall result line
00:16:03 which displays the stars which I mentioned before
00:16:05 because the system knows, okay, it doesn't make sense to add up euros and US dollars
00:16:10 just as numbers. Right?
00:16:13 And you also see the possibilities to drill down by all the free characteristics
00:16:18 which we added in the query. So, maybe let's...
00:16:21 Let's do a postal code as an example. Postal code is always nice. Let's see what happens
00:16:24 if we drill down by postal code. And you see a list of postal codes.
00:16:29 Okay, maybe that's not so easy to read. But you see we actually have various countries.
00:16:34 Maybe we can also remove the postal code and drill down by country
00:16:39 to get something which is not as detailed. Just to get a rough overview of what's going on.
00:16:44 I think we should have something like three or four countries here.
00:16:47 Right. Interesting.
00:16:50 Countries are, for some reason... it looks like a problem with our data load.
00:16:55 Countries are not displayed as texts here. You see that?
00:16:58 So it doesn't say, unlike the currencies
00:17:00 where you have the key display as EUR and the text as euro,
00:17:05 we only see a shortcut for the country. It doesn't say Canada, United Kingdom,

00:17:12 and United States. It just says the shortcuts.
00:17:16 Anyway, that's also something which you can define in a query by the way.
00:17:19 Maybe we didn't maintain this properly. But in any case,
00:17:23 all these options you have in an analytic query. And of course much more details
00:17:28 which we only briefly mentioned. And I guess that's it for the demo today.
00:17:33 So, let's go back to the presentation and summarize what we learned.
00:17:40 Well, you saw the possibilities of the virtual data modeling in BW/4HANA.
00:17:46 Those of you who remember classic BW, or are still running classic BW,
00:17:51 should also be aware of the difference and the more strict separation
00:17:54 between the persistence layer and the virtual data model,
00:17:58 which wasn't as clear in classic BW. So all the navigation attributes,
00:18:03 all the star schema modeling is really done on the CompositeProvider level.
00:18:07 No longer on the persistence layer with the aDSO.
00:18:11 You got a rough idea of the capabilities of the CompositeProvider doing unions and joins
00:18:18 of fact tables. And then associating the relevant navigation attributes
00:18:24 of the master data. You've seen the capabilities of analytic queries.
00:18:30 And I guess with that, we're ready for today, we're done for today.
00:18:34 We are. You're not
00:18:36 because you're supposed to do your self-test.

Week 1 Unit 4

00:00:07 Hello, and welcome to week 1, unit 4: Operating Data Flows with SAP BW/4HANA, Part 1.
00:00:14 So, after the last unit, we're basically ready with all the ETL stuff,
00:00:18 with all the preparations for the data loads, and we have actually even manually
00:00:21 executed one of them. Today, we are going to show you how to basically
00:00:26 turn this into a recurring process. Basically, the model artifacts which you need
00:00:33 to be able to schedule a more complex process, which not only consists of loading,
00:00:37 but also of the activation, which were two manual steps which were executed last time,
00:00:41 and even maybe more complex dependencies between master data loads and transaction
data loads.
00:00:45 So basically, that part of the modeling will be important in this unit and also the administration
of these processes:
00:00:52 How do you monitor whether such a process fails, where it fails, or whether it was successful?

00:00:58 How do you get statistical information about how many of your processes are doing well,
00:01:03 and which ones are not doing so well, and maybe identify certain hotspots there.
00:01:08 Another aspect is that you probably also want to monitor the objects which you're dealing with.

00:01:12 See how much data they contain, how many loads are going into the objects.
00:01:16 All that stuff is done in the SAP BW/4HANA administration cockpit, which is a Web-based UI.

00:01:22 We briefly saw it in the last unit; today, this is going to be the focus.
00:01:26 We will also show you the process chain dashboard, which gives you the rough overview of
00:01:31 all the processes which are running in your system, and we will show you a demo of all these
aspects
00:01:38 of the administration cockpit. So, as you already know,
00:01:43 we identified different user groups for BW/4HANA, and one of the user groups is the
administrator,
00:01:50 and for the administrator we introduced the SAP BW/4HANA cockpit.
00:01:56 It's an entry point to administrate and monitor processes on BW/4HANA.
00:02:03 ETL processes, process chain dashboards. You can see aborted or failed process chains.
00:02:10 You can manage your DataStore objects as well as the InfoObjects,
00:02:14 all that stuff is possible in the BW/4HANA administration cockpit.
00:02:20 You have different tiles here. We have a section for, as I said, monitoring.
00:02:24 We have one section for managing the objects, and we have one section for modeling objects.

00:02:30 Today we will show you how to build and execute a process chain in the BW/4HANA cockpit.
00:02:38 This is Fiori based so it's quite easy to configure and personalize, and you can access
00:02:47 this BW/4HANA cockpit from any device. So, the next step, which we will also show you,
00:02:55 is the process chain modeler. This is actually a modeling artifact
00:02:59 which you could debate whether it should be part of Eclipse or should be part of a Web-based UI.

00:03:04 We decided to move it into the Web-based UIs, and it is basically the place where
00:03:08 you assemble the individual steps which are relevant for data loads, as we said,
00:03:12 for example running a data transfer process, doing data activation afterwards,
00:03:16 assembling these steps in a certain order, and being able to execute these larger processes
00:03:23 and schedule these larger processes, this is what you do in the process chain modeler.
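To make the structure of a process chain more concrete, here is a minimal Python sketch; it is purely illustrative and not the BW/4HANA runtime or any of its APIs. A chain is essentially an ordered list of steps, here a DTP load followed by an aDSO activation, and the runtime walks through them, recording a status per step and stopping when one fails.

# Illustrative sketch only; not the SAP BW/4HANA process chain runtime or API.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Step:
    name: str                      # e.g. "DTP load" or "aDSO activation"
    action: Callable[[], None]     # what the step does when executed
    status: str = "PLANNED"        # PLANNED -> ACTIVE -> SUCCESS / FAILED

@dataclass
class ProcessChain:
    name: str
    steps: List[Step] = field(default_factory=list)

    def run(self) -> None:
        # Execute the steps in their modeled order; stop on the first failure
        # so a dependent step (like activation) never runs on a failed load.
        for step in self.steps:
            step.status = "ACTIVE"
            try:
                step.action()
                step.status = "SUCCESS"
            except Exception:
                step.status = "FAILED"
                break

# Hypothetical steps mirroring the demo: load via DTP, then activate the aDSO.
chain = ProcessChain("MY_OPENSAP_CHAIN", [
    Step("Run DTP",            lambda: print("loading inbound requests")),
    Step("Activate aDSO data", lambda: print("activating loaded requests")),
])
chain.run()
print([(s.name, s.status) for s in chain.steps])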

00:03:29 You see a screenshot here in the background, the one which is slightly overlapped
00:03:34 is the actual model of a process chain. In the foreground, you actually see
00:03:39 the monitor for it where you basically can really look at the detail information
00:03:43 about each of the steps, whether it was successful or not. All the process logs, which are
written during the process,
00:03:49 you can investigate and look at here. This process chain monitor also gives you
00:03:57 very detailed information. As you see here, it is again a state-of-the-art, Web-based UI.
00:04:05 You can do all the detailed error analysis here. You can get a list of all your process chains to
get an overview.
00:04:10 You can dive into details about each of the process chains and really drill down
00:04:16 individual steps and see where they failed and what happened, and correct the situations.
00:04:20 You can even restart certain processes which have failed, and continue the run of a process
chain.
00:04:27 So all that is possible from this UI. Those of you who are aware of the functionality
00:04:32 of classic BW of course know that we had similar functionalities in the past in SAP GUI.
00:04:38 Now all this is a new experience which is based on the Fiori paradigm
00:04:42 and here in a Web-based UI. Besides the monitoring and scheduling tool,
00:04:50 we have of course a process chain dashboard. This dashboard gives you an overview
00:04:55 of your process chain statistics. You can see the latest one,
00:04:58 you can see failed ones, you can find the schedules of each process chain,
00:05:03 you can compare runtimes. You can calculate average runtimes
00:05:08 and the interesting thing is, you can fully personalize this tile
00:05:12 and store it on your individual BW/4HANA cockpit. You have, as I said, detailed information
about the status
00:05:22 of the process chain, average runtimes, start delays, everything is possible here and what's
interesting to know is
00:05:29 you can fully personalize that view to get your personalized view in the overview screen.
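As a rough illustration of the kind of figures the dashboard aggregates, average runtime and average start delay can be derived from per-run records like the ones below. The record layout here is assumed for the example; the real dashboard reads BW's own runtime statistics.

# Dashboard-style aggregates over process chain runs; the record layout is
# assumed for this example, the real cockpit reads BW's runtime statistics.
from datetime import datetime
from statistics import mean

runs = [
    # (scheduled start, actual start, end, status)
    (datetime(2018, 9, 3, 2, 0), datetime(2018, 9, 3, 2, 1), datetime(2018, 9, 3, 2, 9), "SUCCESS"),
    (datetime(2018, 9, 4, 2, 0), datetime(2018, 9, 4, 2, 4), datetime(2018, 9, 4, 2, 20), "SUCCESS"),
    (datetime(2018, 9, 5, 2, 0), datetime(2018, 9, 5, 2, 0), datetime(2018, 9, 5, 2, 2), "FAILED"),
]

runtimes = [(end - start).total_seconds() / 60 for _, start, end, _ in runs]
delays = [(start - sched).total_seconds() / 60 for sched, start, _, _ in runs]
failed = sum(1 for *_, status in runs if status == "FAILED")

print(f"average runtime:     {mean(runtimes):.1f} min")   # 8.7 min
print(f"average start delay: {mean(delays):.1f} min")     # 1.7 min
print(f"failed runs:         {failed} of {len(runs)}")    # 1 of 3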
00:05:37 Alright, I think it's already time for the demo today. So what we are going to show you today
00:05:41 is basically how to create a process chain to load data into a DataStore object
00:05:45 and do the activation afterwards, so the steps you did briefly manually in the last unit.
00:05:51 We will show you what you can do when it comes to scheduling the process chains,
00:05:54 the options you have there. When it comes to monitoring the process chains
00:05:57 to see what steps were successful, what step failed and so on.
00:06:01 Same thing for advanced DataStore objects and potentially also InfoObjects.
00:06:05 We're not going to show it for InfoObjects. Basically, it's about understanding
00:06:08 how many data loads have happened into an advanced DataStore object,
00:06:13 how many activations have happened, what the relation between data loads and
00:06:17 activations was, and all these things, and we will also have a look at the process chain
dashboard
00:06:24 to see, to get a high-level overview of the statistics of your process chain runs.
00:06:30 So with that I'll hand over to Gordon to do the demo. Maybe the first thing that we should
mention before
00:06:35 we jump into the details is the personalization aspect which we did here. If you compare this
screen to the layout
00:06:42 which you see in the slides, we actually see we have an additional tile here.
00:06:46 Yeah, this is a personalized view of a search of a managed DataStore object. A search, you
can see, that is filtered

00:06:55 by ZSOD_03 and then you can see the different objects. So, that's similar to variants in classic SAP GUI.
00:07:03 You can store such things basically as variants here, as an individual tile and place it
somewhere on your desktop
00:07:10 or on your cockpit. And if that's the typical list of process chains
00:07:15 which you are responsible for, then you only have to click this tile and
00:07:18 you don't have to do the search anymore. So, here you can see,
00:07:21 you can send this link by email, important as well. And you can save that as a tile on your
home screen.
00:07:27 So, save as email is what you do if someone is supposed to do the work.
00:07:30 And save as tile if it is you. Alright, here you can see we have the Monitoring view,
00:07:38 we have a Manage view, and we have a Modeling for creating a process chain.
00:07:44 We have to jump into the Model Process Chains view, that's that one.
00:07:51 Here we are. And then I would say "Create". We will create a new process chain, of course.
00:08:00 So, then you can see here on the left side, the description. I will enter a description here.
00:08:12 [Both] "My Process Chain"... OpenSAP, whatever...
00:08:22 Okay, that's good enough. Right...
00:08:28 And now we will add the individual steps of this process chain. As I said, it's going to be quite
simple.
00:08:32 We are going to do a DTP run here as a first step. First of all, I have to maintain the start
process.
00:08:41 Here you can see, we have different types for scheduling this process chain. We can choose
the immediate start,
00:08:47 we can choose event-based scheduling, or workday-based scheduling - it totally depends on
the requirements here.
00:08:55 You can set some recurrence patterns here. We can generate emails if this step of the
00:09:03 process chain runs successfully or not. But for that example, I will simply use immediate start.

00:09:11 I guess that's good enough. Yeah.
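The scheduling variants boil down to calendar logic. As a hedged sketch of what workday-based scheduling means conceptually (the actual BW/4HANA scheduler works with factory calendars and events, which this does not model), the next start time could be computed like this:

# Simplified idea of workday-based recurrence; the real BW/4HANA scheduler
# uses factory calendars and events, which this sketch does not model.
from datetime import datetime, timedelta

def next_workday_start(after: datetime, hour: int = 2) -> datetime:
    """Next Monday-to-Friday occurrence of the given hour after 'after'."""
    candidate = after.replace(hour=hour, minute=0, second=0, microsecond=0)
    if candidate <= after:
        candidate += timedelta(days=1)
    while candidate.weekday() >= 5:        # 5 = Saturday, 6 = Sunday
        candidate += timedelta(days=1)
    return candidate

# A chain triggered late on a Friday evening would next run on Monday, 02:00.
print(next_workday_start(datetime(2018, 9, 7, 23, 30)))   # 2018-09-10 02:00:00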


00:09:13 And now we will add two processes. One process for running the DTPs
00:09:20 and the other process for activating the data in the advanced DataStore object.
00:09:29 So, then, I have to go to the properties and the General button, and here you can see the
different process types.
00:09:37 You can see what is possible in the process chains. It starts from General Services,
00:09:44 like interrupt, ABAP programs, and goes to BPC-based jobs and so on and so forth.
00:09:52 For our example, we will use the DTP load and then I have to select the DTP.
00:10:05 So, you can jump into search and here you can see this is my DTP.
00:10:05 It's interesting, what you actually did was not select the DTP itself but enter the target of the DTP,

00:10:16 that was the only thing which we remembered, and the system, of course, it's not such hard
stuff,
00:10:21 but the system finds the right DTP right away. And you're basically done modeling the DTP.
00:10:26 Otherwise, knowing or remembering the technical name of a DTP can be quite tedious.
00:10:31 Okay, so that was the first step, and the next step would basically be to do the activation of this

00:10:37 DataStore object again. So, we pick the right process type,
00:10:45 which would be... Here we are!
00:10:49 ...Alias or Activation. And name.

00:10:59 And again we know the name of the object, can search for it, and we select it.
00:11:08 I guess that's all. So, here we are.
00:11:14 And we are basically done modeling the process chain, right? That's it, quite fast.
00:11:18 I will save the process chain here. A name.
00:11:22 It's a technical name... "ZOSPC01" (openSAP PC 01)...
00:11:39 $TMP... Here we are, just a local package.
00:11:53 Now, the process chain has been created. And now I will activate the process chain.
00:12:01 So with this we are basically ready to run the process chain and then see from the monitoring
perspective what happened.
00:12:09 So, therefore, I will go to the properties. It's okay to double-check if this is right...
00:12:23 So, jumping to the Monitoring button, and now we can schedule the process chain.
00:12:29 Since we selected "immediate execution", the process chain is going to start right away.
00:12:36 And now you see that the process chain is running. You see that the DTP has currently, or the
process step
00:12:42 executing the DTP has the status ACTIVE, so currently data is being loaded. If we refresh, we
will probably see it has finished, or failed, potentially.
00:12:52 [Both] Of course not! Potentially you see the process chain has actually finished now
00:12:57 because we are not dealing with huge data, with huge amounts of data,
00:13:01 and it's just taking a couple of seconds for the whole execution. But we can basically also now
get really detailed information
00:13:09 down to the lowest level of information the system has on each of the process steps.
00:13:15 I don't think that we need to go into all the details here. I think that's probably good enough.
00:13:20 Maybe let's go back to the monitoring perspective. And see what we can analyze from here.
00:13:30 So, one thing we could do is go into the Manage DataStore Objects view to really see
00:13:38 whether the data load and activation are visible there. Remember, last time we did this
manually in the last unit,
00:13:45 you actually still see the data load from the last unit there.
00:13:53 Let's go here. Here you can see the process, you can see the timestamp,
00:14:02 and in the log you can see detailed information. You can see that we inserted 3,840 records.
00:14:14 That's the activation request, right, that we're talking about? That's right, yeah.
00:14:18 Okay. Because it also mentions how many records have been updated or deleted.
00:14:26 And the second one is activating a request as well. And we also have here the advanced view,

00:14:32 which is quite interesting because it shows you the correspondence between the load requests
00:14:37 on the left-hand side and the activation requests on the right-hand side.
00:14:41 Interesting to know maybe is the Metadata view as well. Here you can see the name of the
DDIC tables, if you like.
00:14:53 Technical names of the dictionary tables, that's good to know. You can actually even navigate
into the table maintenance here
00:15:00 to have a direct look on database level at the tables. That's something which you'll probably be
familiar with from SAP GUI as well.
00:15:08 So, all this information is now kept here in this UI also. What's next after DTP and aDSO
maintenance
00:15:20 would be to have an overview of the process chain dashboard, I guess. Yeah.
00:15:25 The Process Chain Dashboard, here we are. You simply press the button,
00:15:29 and now, in the background, the system reads the statistics and prepares a
dashboard here.
00:15:37 And here you can see the latest ones, the average start delays, the average runtimes,

00:15:41 the runtimes, from chains with no executions to chains running too long. We have, as I said, average times,
00:15:49 and you can personalize that view as well by filtering it to the process chains which
are of interest.
00:15:58 On the right-hand side, which I think is also interesting, you can also see certain deviations,
00:16:03 certain process chains that were running exceptionally long, all that kind of stuff, so that you
get an impression of, for example, if this bar is
00:16:10 increasing and increasing, then it looks like your system doesn't have enough resources
anymore.
00:16:14 So, all this information you basically have here at hand, and you can also assemble this and
configure
00:16:20 to the extent which you need. I guess that's it for the demo, or do we have any additional steps
here?
00:16:28 That's it for the demo. So, let's summarize this chapter as well.
00:16:32 What did we see today? Well, we had a more detailed look at the BW/4HANA administration
cockpit,
00:16:40 which is the new entry point for administrators and people who are monitoring the system.
00:16:46 We've seen how to model, execute, and schedule data loads and process chains. How to look
at or administrate and monitor the data objects.
00:16:56 We only did it for the advanced DataStore objects in our case but you saw the tile and the app

00:17:01 which does the same thing for the InfoObjects as well. So basically all the data loads, both for
the master data
00:17:07 and the transaction data, you model in the same way from this UI now. And we have also at
least given you an idea of how you
00:17:15 can start personalizing this to your needs. Exactly.
00:17:18 And with that, it's time for you to do your self-test.

Week 1 Unit 5

00:00:07 Hello and welcome to week one unit five, Operating Data Flows with SAP BW/4HANA, part
two.
00:00:14 In this unit, we are basically going to complete the journey through the BW/4HANA Cockpit,
00:00:18 which we started in the last unit, and we'll show you the manage capabilities
00:00:23 of the BW/4HANA cockpit for advanced DataStores and InfoObjects.
00:00:28 We will show you the request maintenance functionalities which we have.
00:00:32 We'll also have a look at the in-app help functionality in the cockpit, and of course we'll give you a demo
00:00:37 that rounds off basically the stuff which we didn't show last time.
00:00:42 And so, in the BW/4HANA web administration tools, we have a request maintenance for
00:00:50 advanced DataStore objects and InfoObjects, and here you can see, you can slide between
the source
00:00:57 and the target in one single screen. You can see the detailed information about data loads
00:01:04 and the activations as well as the data targets. You can manage the requests, you can jump
into activation,
00:01:11 deletion, and maintenance of the data. You can select the status of the requests
00:01:17 between yellow, red, and green. You can select the latest requests here,
00:01:23 you can see an overview of the runtimes, the statistics, the data volume,
00:01:29 and all the other stuff in one single screen. Right, from a metadata perspective, you also get
00:01:36 an overview of the objects. So there are basically screens incorporated in the BW/4HANA
cockpit,
00:01:42 which show you certain statistical information about the data volume contained in certain
objects,
00:01:47 especially when you go in the direction of data tiering. You'll see here in the screenshot on the
right-hand side
00:01:53 that the system shows you what part of the data, what percentage of the data is in hot, warm,
and cold.
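The hot/warm/cold chart is essentially a share-of-total calculation over the data volume per tier; with made-up sizes, the percentages come about like this:

# Made-up sizes per temperature tier (in GB), just to show how the
# percentages in the data tiering overview come about.
sizes = {"hot": 120.0, "warm": 300.0, "cold": 580.0}
total = sum(sizes.values())
for tier, size in sizes.items():
    print(f"{tier:4s}: {size / total:6.1%} ({size:.0f} GB)")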
00:01:57 We'll come to the details of data tiering optimization later on, but here's already one of the
screens where you get
00:02:03 some statistical information about the data distribution. You also get the possibility to have
00:02:10 a look at the generated table. So basically all the database side of the BW artifacts
00:02:17 is included in the BW/4HANA cockpit as well. Furthermore, of course, in the manage part
00:02:27 of the BW/4HANA Web administration cockpit, you can see different tiles for managing
DataStore objects,
00:02:36 manage InfoObjects, you have the possibility to use data tiering maintenance here,
00:02:42 data tiering adjustment is possible in the Manage tile, as well as SAP HANA remote
subscription.
00:02:51 You can see maintenance and simple manage functions for DataStore objects, for InfoObjects,

00:02:58 for user maintenance. But this is just the beginning.


00:03:02 We are currently working on more and more integration of the SAP GUI administration part into
00:03:08 the BW/4HANA Web administration program. We'll see some of these functionalities in a
minute
00:03:14 when we start the demo. To be honest, some of these apps actually currently still
00:03:20 start the SAP Web GUI, so you will basically end up with a user experience which was
basically taken
00:03:28 from SAP GUI, in a Web-based fashion and included in the Fiori theme in general.

00:03:34 So basically here we have the overview page from which you can start
00:03:38 and jump into all the details. Some of them, some of the details screens
00:03:42 are still being reworked and will be Fiori-ized, Fiori-stylized, or whatever you want to call it,
00:03:48 made Fiori compatible over the next months. What we have as well is in-app help
00:03:57 for the BW/4HANA cockpit. It's embedded in the Fiori apps.
00:04:03 Here you have different options for help functionalities. I mean in the end it's a state-of-the-art

00:04:09 Web assistant, just press the question mark button and then an interactive session starts and
then you can
00:04:17 move between the different options we have. You can press each button and then help is
popping up
00:04:24 and explains to you exactly the steps which are possible.
00:04:29 It's of course, context sensitive, and we have guided tours as well.
00:04:35 All the stuff is integrated in this one cockpit here. So that was basically the summary of the
functionality
00:04:44 which we wanted to present here. Now let's maybe jump into the system again
00:04:47 and show you a bit of a demo. We've already seen some parts of the administration
00:04:52 of advanced DSOs in the last unit, so today the focus will be more on the InfoObject,
00:04:57 on the master data side. We will also show you a detailed demo on the
00:05:02 personalization features. We showed the possibility of personalization last time as well.
00:05:07 Today, we'll basically do a quick demo which really shows it end-to-end. And we will also show
you that
00:05:12 this really works on mobile devices. Gordon has his phone with him and we'll show
00:05:17 a screenshot of the Web administration cockpit right on his phone.
00:05:25 So then let's jump into the BW/4HANA cockpit. You can see we have the home button again,
00:05:35 the monitoring screen's here, and here you can see we have
00:05:38 the Manage InfoObjects button. I will jump into that tile, and I will start to filter.
00:05:45 Maybe just look for the InfoObjects which we created for openSAP.
00:05:49 So here are the ones which actually contain master data. Remember, the list was much larger,
but most of them
00:05:53 did not contain master data. Those are the ones which actually have the persistence,
00:05:58 which have tables generated into which we load. So yeah, as you can see here, we have City,

00:06:04 we have Country, we have Customer, and we have Product. Now I will save this as a tile for...
00:06:11 So, basically, this selection, so that we can start right from one tile and see this,
00:06:17 the screen as we see it now... So that would be InfoObjects for openSAP...
00:06:30 Okay, I guess that's good enough for the moment. Yeah, that's it.
00:06:34 Enough information. So when we now go back to the home button,
00:06:38 then we can see we have this tile on top of the BW/4HANA administration cockpit.
00:06:47 Jump into that one. We get exactly the layout which we just had.
00:06:51 So that's very handy, if you're responsible for a certain area of the data warehouse, for
operations
00:06:56 in a certain area, it's very useful to personalize using your dedicated tiles.
00:07:03 Exactly, so I will now jump to maybe to the customer InfoObject.
00:07:10 And here you can see the overview of the requests, we have just one request, of course.
00:07:16 You can see the different statuses the request could have. We have green, red, and yellow, of
course.
00:07:25 You can select it here, you can select the attributes, or if you like to see the attributes,

00:07:30 the text, and/or the hierarchies, if the InfoObject has some of these functions,
00:07:36 functionalities here. You can see the log here, and pressing the log,
00:07:40 then you can see that data was loaded, how much data, how many records and so on and so
forth...
00:07:48 These are really the logs from the batch jobs, right? Exactly. You can see the affected tables...
00:07:59 In the Advanced View, as we already showed, here we can see the sources, the activations,
and the targets.
00:08:06 In that case, of course we have only activation. Right, so this looks a little bit different
00:08:11 than we saw last time in the aDSO because in the aDSO we had the distinction between
loading and activation.
00:08:18 For this, for the InfoObjects with the settings which we used for this scenario, there's no
distinction
00:08:26 between loading and activation, right? Yeah.
00:08:29 We have the manage button here, of course. And that brings us basically more to the
metadata,
00:08:35 the metadata side of it, right? If we have a look at the generated tables
00:08:39 and all that stuff, this is, this would be the right screen. So here you get an overview of how
00:08:45 the InfoObject is configured. You see it has master data texts, what kinds of texts.
00:08:50 You also see because it has text, there's a text table generated and you can jump into the text
table.
00:08:56 You also see the M view which is generated over the time-dependent and the time-independent
00:09:04 parts of the master data. You see the data type, so that basically gives
00:09:10 a metadata overview of the InfoObject. Yeah, what you can see as well is whether an
00:09:15 external HANA view is generated or not. Topic of next week, but...
00:09:19 Exactly. Again, so it's all part of this overview.
00:09:24 Okay, I guess that's it for the master data. That's it for the master data section,
00:09:28 and here in the Manage section, you can see the data tiering maintenance, data tiering
adjustments,
00:09:34 user maintenance as well as HANA remote subscription. I will jump into one of these, one of
these tiles here.
00:09:41 Let's jump into the Data Tiering Maintenance tile, and here you can see the Web GUI is
opening.
00:09:47 Currently it's the Web GUI, but we are working on a deeper integration into a Fiori style.
00:09:54 Exactly. But I think for the moment it's very important to have the possibility to include this in
the cockpit
00:09:59 and really have one place from where you can start working, even though you don't always
have the possibility
00:10:07 to really enjoy Fiori-style apps, at least from a functional perspective,
00:10:11 this is very convenient and you have one place to start your work.
00:10:16 Okay, so then we are back in the home button and here you can see the personalization.
00:10:24 You can fully personalize your own cockpit. So you could, for example, also delete
00:10:30 some of the apps which are not relevant for you. All that would basically be possible here,
yeah.
00:10:36 You can probably rearrange things as well, change the order. Exactly, here you can edit your
home page,
00:10:41 you can delete stuff, you can add groups, add tiles, and so on and so forth.
00:10:46 Okay, that's quite nice. So, all right.
00:10:53 That's it pretty much from the browser side. Now, we would like to show you that, of course,

00:11:00 all of this also works on a mobile device. Gordon, start up your phone and
00:11:04 show the attendees what's going on here. See, so that's basically the same view,
00:11:11 you see the same tiles as we had them right here now on Gordon's phone.
00:11:20 It's working, great, and I guess with that let's wrap up what we saw today.
00:11:26 We basically had a complete journey, including the previous unit, part one.
00:11:33 We had a complete journey through the BW/4HANA cockpit, showing you all the details, how
to manage advanced
00:11:39 DataStore objects or InfoObjects, how to do the request maintenance for both of these objects.

00:11:48 We also showed you the stuff around the pure persistence management,
00:11:54 like the user management which you can do there, the details about data tiering optimization,
00:11:59 all that kind of stuff and, of course, the personalization capabilities which you get from a Fiori-
style application.
00:12:07 And with that, I guess you're ready for the self-test.

Week 1 Unit 6

00:00:07 Hello and welcome to week one, unit six, Managing Large Data Volumes.
00:00:11 So far, we've been dealing with the basic modeling and administration concepts of SAP
BW/4HANA.
00:00:17 Now we come to a slightly more advanced topic, which is basically how to deal with large data
volumes,
00:00:23 how to distribute data, how to partition data in order to make such large data volumes
00:00:29 easier to handle. So we'll basically discuss
00:00:32 two different partitioning concepts for big data volumes.
00:00:35 One is the so-called semantic groups where the point is that we basically distribute
00:00:41 over multiple database tables or advanced DataStore objects, so to speak.
00:00:45 The other one is a partitioning schema more on the database level, where it's about having
00:00:52 a single big advanced DataStore object, and splitting this on database level
00:00:56 into multiple partitions. Of course, we will conclude this
00:01:00 and complete this with a demo which shows the various aspects
00:01:03 and how this looks and feels. So the question is, of course,
00:01:07 why partitioning? Why is partitioning so important
00:01:10 for managing large data volumes? There are two aspects.
00:01:14 One aspect is the performance aspect here. You will have a better performance with mass data
00:01:21 when you partition the data. And data pruning is then possible here,
00:01:25 which basically means that certain parts of the data will not even have to be read
00:01:28 because you know in advance that this data will not be relevant for certain queries.
00:01:33 Exactly. Data tiering optimization,
00:01:36 which means moving the data between the different tiers, hot, cold, and warm, will be
00:01:42 much more flexible, because data tiering optimization is based on partitions, partitions on a table,
00:01:50 and that's why it's much easier to distribute the data between these different tiers.
00:01:57 Last but not least, administration, because the tables are considerably smaller then,
00:02:02 when we're talking about semantic partitioning here. With that approach,
00:02:09 we support very large column store tables, so each partition can be up to two billion records.
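The arithmetic behind this is simple: the per-partition limit is roughly two billion records, so the upper bound for one aDSO grows linearly with the number of partitions, as the small sketch below shows.

# Back-of-the-envelope capacity: roughly two billion records per partition.
MAX_RECORDS_PER_PARTITION = 2_000_000_000

def max_records(partitions: int) -> int:
    return partitions * MAX_RECORDS_PER_PARTITION

print(f"{max_records(10):,}")   # 10 partitions -> 20,000,000,000 records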
00:02:15 And you can have multiple partitions... Exactly. Then you have multiple partitions
00:02:18 and you can have more than this limitation of two billion. So from a business point of view,
00:02:25 we will, of course, minimize the impact across different time zones.
00:02:29 So this means when you load data between... when you have different time zones in your
company,
00:02:34 then you can load and report on the different advanced DataStore objects,
00:02:38 and, of course, then we avoid impacts here. And error handling is then possible
00:02:43 on each data stream here as well. So there's no impact between the different time zones
00:02:48 and the different countries then. Right, so let's look at an example here.
00:02:53 Basically, what we see here is both partitioning schemas which we just described.
00:02:59 So on the top level, we basically split the data into two
00:03:02 different aDSOs: one for, say, the Northern Hemisphere, one for the Southern Hemisphere.
00:03:07 Typically, you would rather do something according to time zones because, as you just
described,
00:03:12 this would allow you to minimize the impact of loads in one time zone with reporting happening

00:03:17 at the same time in another time zone. But basically, geography is very often a good
00:03:22 partitioning scheme here. So that's really about splitting up the data
00:03:22 a partitioning scheme here. So that's really about splitting up the data
00:03:26 into multiple aDSOs and doing a union in the CompositeProvider.
00:03:31 The other option, which we only see for the Northern advanced DataStore object is
00:03:37 that you can split up this single advanced DataStore object into multiple partitions on database
level.
00:03:42 Here, for example, we see that we do this in this image
00:03:48 on a calendar-year level. So you would basically have partitions
00:03:51 for each calendar year, and that would basically then
00:03:54 also allow you to use things like data tiering optimization,
00:03:58 which we will cover in detail later, but many of you will have heard about it.
00:04:01 It's basically about storing aging data, older data,
00:04:06 which is not as relevant any more at an efficient cost-point.
00:04:13 For automation of this semantic partitioning, we introduced so-called semantic groups here.
00:04:21 A semantic group is more or less a flexible concept for semantic partitioning
00:04:28 of multiple advanced DataStore objects. So the idea here is to create
00:04:32 one semantic partitioning object, and then you can add different advanced DataStore objects
00:04:40 based on the reference structure. The reference structure needs to be
00:04:43 an advanced DataStore object here. In this reference structure advanced DataStore object,
00:04:50 a key is needed to assign the different criteria here. Based on the criteria,
00:04:55 you can define flexible split information, and then the system will automatically generate
00:05:03 the advanced DataStore objects according to the split criteria,
00:05:07 and, of course, the CompositeProvider to do the union on top of the partitioned advanced
DataStore objects.
00:05:16 So when you talk about partitions here, it really means individual advanced DataStore objects.

00:05:20 It's not database partitions but individual aDSOs. Therefore, we also see individual data flows
00:05:27 into each of these objects and the union done in the CompositeProvider on top.
00:05:32 Exactly. The semantic group
00:05:37 will create the DTPs, including the filters, as well as the CompositeProvider.
00:05:42 On the transport then, you can see the individual advanced DataStore objects
00:05:48 and the CompositeProvider, as well as the data. Data flow objects, isn't it?
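Conceptually, a semantic group turns one logical object into several physical members plus a union with pruning. The following Python sketch mimics that behaviour with plain data structures; the member names and the routing logic are hypothetical and only illustrate the idea, they are not generated BW artifacts.

# Conceptual sketch of a semantic group split by country, including pruning.
# Member names and routing are hypothetical, not generated BW artifacts.
from typing import Dict, List, Optional

members: Dict[str, List[dict]] = {"DE": [], "RU": [], "US": []}

def load(record: dict) -> None:
    # Each member has its own data flow; the generated DTP filter routes
    # records by the split characteristic, here the country.
    members[record["country"]].append(record)

def query(country_filter: Optional[str] = None) -> List[dict]:
    # The CompositeProvider unions the members; with a filter on the split
    # characteristic, non-matching members are pruned and never read.
    relevant = [members[country_filter]] if country_filter else list(members.values())
    return [row for member in relevant for row in member]

load({"country": "DE", "amount": 100})
load({"country": "US", "amount": 250})
print(query("DE"))   # only the DE member is touched at all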
00:05:53 All right, so what's next here?
00:05:58 Here you can see a screenshot of creating a semantic group
00:06:02 and its objects here. You can see
00:06:07 the semantic group is based on a reference structure. The reference structure is, of course,
00:06:11 an advanced DataStore object for persistency. You can choose that one,
00:06:15 and then you have to choose an element, the element for creating the split criteria
00:06:23 or creating the semantic partition. In that case, we have a calendar year which is quite easy.
00:06:28 It can be country, or region, or whatever. Then you can easily add in or delete elements
00:06:36 based on InfoObjects or fields. For fields, it's possible here, as well.
00:06:41 Then you can create new components at any time. It's possible to change the components
here.
00:06:48 Then, of course, remodeling may be needed after the generation of the
00:06:54 corresponding advanced DataStore objects. The criteria can be modified.
00:07:00 You can see in that example, the system creates

00:07:06 10 advanced DataStore objects based on the criterion calendar year.
00:07:13 In our example, we will use country as a split criterion.
00:07:18 Then in the advanced DataStore object itself, we will create different...
00:07:25 We will have a physical partitioning on the table, on the database table here, as well.
00:07:30 What's nice about this is that the individual aDSOs which are generated don't necessarily
00:07:36 have to have the exact same structure, they can be slightly different.
00:07:39 If you need some additional attributes for certain regions, you can do that.
00:07:45 There's a little bit more flexibility than we used to have in the past.
00:07:48 Yeah, that's important here. You can maintain
00:07:50 the advanced DataStore objects independently from each other. And still have this umbrella
00:07:57 or this bracket of the CompositeProvider on top. Okay, so it's time for the system demo.
00:08:03 What we're basically going to show you is in the first step, we're going to take
00:08:08 an advanced DataStore object which is not partitioned but contains data,
00:08:12 and we'll partition it for you, and even show you how to deal with the data
00:08:16 and the redistribution of data afterwards. That's the remodeling job, basically.
00:08:20 We'll also show you the semantic group part of so basically, how you
00:08:26 take a reference structure and create multiple aDSOs out of that,
00:08:30 and then add these components to a semantic group. All right, then.
00:08:38 Let's start with adding the physical partitioning
00:08:42 to the created advanced DataStore object for sales orders.
00:08:46 We have the object here. On the Settings tab,
00:08:50 here you can see that it's possible to create partitions here.
00:08:54 No partition is created at the moment. I will create a new one.
00:08:57 Here you can see the options we have. We have to select a key here.
00:09:02 We will use Order Date, and then we will add a new partition
00:09:07 starting at 2010 until the end of 2020.
00:09:17 I'll say OK. Then you can see here,
00:09:20 now we have one partition for 10 years containing the 10 years.
00:09:24 Now I will split that partition into 10 partitions, one partition for one year each.
00:09:32 You don't have to do this manually, you can actually have some support by the system for that.

00:09:36 It's quite simple with the system here.
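The year-based split is just a derivation of one value range per calendar year from the overall range; a sketch of that derivation (the exact boundary handling in the tool is a detail this does not claim to reproduce):

# Deriving one value range per calendar year from the overall range
# (illustrative only; boundary handling in the tool is not reproduced here).
def yearly_partitions(start_year: int, end_year: int) -> list:
    """Half-open ranges [year, year + 1), one per calendar year."""
    return [(year, year + 1) for year in range(start_year, end_year)]

for low, high in yearly_partitions(2010, 2020):
    print(f"ORDER_DATE in [{low}, {high})")   # ten partitions, 2010 ... 2019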


00:09:42 I will just add split criteria based on calendar years. We'll Apply,
00:09:48 and now you can see the system will automatically generate
00:09:52 10 partitions. Now we will activate.
00:09:55 So right now, it's not active yet, so nothing has happened on the database yet.
00:09:58 During activation, I guess, we'll get a notification that the object contains data
00:10:03 and that we'll have to do some more work here
00:10:06 because it's not just working on an empty object.
00:10:09 Yeah. Here we are.
00:10:12 Remodeling is pending. Now we have to remodel the advanced DataStore object.
00:10:16 I will, of course, do that. Jump into the embedded GUI.
00:10:21 Here you can see the request.
00:10:25 I will start the request. Immediate. Yeah, but it's very convenient that the system
00:10:31 detects out of the box: okay, for this object some extra work is needed.
00:10:35 You have to remodel because the object contains data,

00:10:38 and it actually does all the conversion for you. Now we should actually be able...
00:10:43 Now it's done. To see the object and have it to...
00:10:47 I will activate it again. Now it's fine.
00:10:53 Now we have one object, one advanced DataStore object
00:10:57 with 10 partitions, one partition for each year. Now I will create a new semantic group
00:11:05 based on that advanced DataStore object as a reference. Okay.
00:11:10 I will go into the modeling part here and say New,
00:11:14 Semantic Group here. I have to assign a name,
00:11:19 Z-O-S, S-G, for semantic group, 02, maybe.
00:11:27 Finish. Now I have to add a CompositeProvider.
00:11:32 If I would like to create a CompositeProvider as an umbrella on top of the
00:11:38 independent advanced DataStore objects, I will, of course, do that.
00:11:43 04, maybe.
00:11:48 Then we will jump into the criteria here. Now we have to choose a reference
00:11:53 advanced DataStore object. I will use
00:11:59 the already changed... Z-O-S.
00:12:03 Z-03 already changed advanced DataStore object. I have to browse for an element here.
00:12:11 For that simple example, I will use country. Let's take country, I think that's a good example.
00:12:16 Then I have to create, New. Then you can see here a new advanced DataStore object
00:12:21 will be created here. I have to maintain the description.
00:12:26 Let's say this is Germany. I can add a criterion here
00:12:33 and then say a New one, maybe one new for Russia.
00:12:42 Add Criterion. A new one
00:12:47 for US, maybe. Add this criterion.
00:12:51 Now the system will create three completely independent advanced DataStore objects
00:12:57 and combine the three DataStore objects via a CompositeProvider on top here.
00:13:03 And even more, the CompositeProvider would detect at query runtime that, for example, when you
run
00:13:07 a query which selects only data for Germany, then only this individual advanced DataStore
object
00:13:13 would be read and the other ones would not even be touched, so that really saves system
resources.
00:13:18 So I will now... That's the pruning mechanism.
00:13:20 Start generation here. You can see it.
00:13:24 The system will create a semantic group object, a CompositeProvider,
00:13:30 and three independent member advanced DataStore objects, of course.
00:13:37 After that, now I can generate it. It will take some time here.
00:13:43 Now the system creates these advanced DataStore objects
00:13:47 and the CompositeProvider as an umbrella to report directly on the objects here.
00:13:54 And in contrast to the first example, where we only did the partitioning on database level,
00:13:58 we would now have the possibility to define multiple data flows,
00:14:01 three data flows with three transformations, and three DTPs into these three aDSOs.
00:14:06 In the first case, we had multiple partitions,
00:14:10 but only one ETL route into this
00:14:16 single aDSO, of course. So that kind of flexibility,
00:14:19 which we also showed in the slides, that's one of the key differences

00:14:23 between the two concepts, as well. So activation is done,
00:14:28 objects are created. Do we see them somewhere in the system?
00:14:33 Not today?
00:14:36 Not today. Anyway, so let's sum up what we saw
00:14:40 and what we learned in this unit. Basically, two key takeaways:
00:14:46 There are two ideas or two schemas for managing large data volumes by partitioning.
00:14:52 One is the so-called semantic partitioning which splits the data into multiple aDSOs,
00:14:58 different objects with different ETL routes into them,
00:15:04 everything individual and more independent, which is very useful, as we said, typically for
00:15:11 managing different time zones, for example, also leveraging pruning, of course, to a large
extent.
00:15:17 The other mechanism is the database partitioning which splits the active table
00:15:22 of an advanced DataStore object into multiple database partitions.
00:15:26 There's a pruning mechanism there, of course, as well. So from a resource perspective
00:15:30 and a performance perspective, you benefit here as well.
00:15:34 You don't have the aspect of independent loads, but from a performance perspective,
00:15:40 that's actually in many cases even a little bit more efficient
00:15:43 because everything is based on one table in the database as opposed to unions of multiple
tables.
00:15:50 With that, today there's no self-test, but a weekly assignment for you.

www.sap.com/contactsap

© 2018 SAP SE or an SAP affiliate company. All rights reserved.


No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP SE or an SAP affiliate company.

The information contained herein may be changed without prior notice. Some software products marketed by SAP SE and its distributors contain proprietary software components of other software vendors.
National product specifications may vary.

These materials are provided by SAP SE or an SAP affiliate company for informational purposes only, without representation or warranty of any kind, and SAP or its affiliated companies shall not be liable
for errors or omissions with respect to the materials. The only warranties for SAP or SAP affiliate company products and services are those that are set forth in the express warranty statements
accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.

In particular, SAP SE or its affiliated companies have no obligation to pursue any course of business outlined in this document or any related presentation, or to develop or release any functionality
mentioned therein. This document, or any related presentation, and SAP SE’s or its affiliated companies’ strategy and possible future developments, products, and/or platform directions and functionality are
all subject to change and may be changed by SAP SE or its affiliated companies at any time for any reason without notice. The information in this document is not a commitment, promise, or legal obligation
to deliver any material, code, or functionality. All forward-looking statements are subject to various risks and uncertainties that could cause actual results to differ materially from expectations. Readers are
cautioned not to place undue reliance on these forward-looking statements, and they should not be relied upon in making purchasing decisions.

SAP and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP SE (or an SAP affiliate company) in Germany and other
countries. All other product and service names mentioned are the trademarks of their respective companies. See http://www.sap.com/corporate-en/legal/copyright/index.epx for additional trademark
information and notices.
