SAP Datasphere Administration Guide
Public
Warning
This document has been generated from SAP Help Portal and is an incomplete version of the official SAP product documentation.
The information included in custom documentation may not reflect the arrangement of topics in SAP Help Portal, and may be
missing important aspects and/or correlations to other topics. For this reason, it is not for production use.
This is custom documentation. For more information, please visit SAP Help Portal. 1
7/9/25, 8:45 AM
Prepare Connectivity
Tip
The English version of this guide is open for contributions and feedback using GitHub. This allows you to get in contact with responsible authors of SAP Help Portal pages and the development team to discuss documentation-related issues. To contribute to this guide, or to provide feedback, choose the corresponding option on SAP Help Portal:
Feedback, Edit page: Contribute to a documentation page. This option opens a pull request on GitHub.
Feedback, Create issue: Provide feedback about a documentation page. This option opens an issue on GitHub.
More information: Contribution Guidelines
We recommend that you link your tenant to an SAP Analytics Cloud tenant (see Review and Manage Links to SAP Analytics
Cloud and SAP Business Data Cloud Tenants).
You can enable SAP HANA for SQL data warehousing on your tenant to exchange data between your HDI containers and
your SAP Datasphere spaces without the need for data movement (see Enable SAP HANA for SQL data warehousing on
Your SAP Datasphere Tenant).
You can enable the SAP HANA Cloud script server to access the SAP HANA Automated Predictive Library (APL) and SAP
HANA Predictive Analysis Library (PAL) machine learning libraries (see Enable the SAP HANA Cloud Script Server on Your
SAP Datasphere Tenant).
An administrator creates SAP Datasphere users manually, from a *.csv file, or via an identity provider (see Managing SAP
Datasphere Users).
You must assign one or more roles to each of your users via scoped roles and global roles (see Managing Roles and Privileges). You
can create your own custom roles or use the following standard roles delivered with SAP Datasphere:
System Owner - Includes all user privileges to allow unrestricted access to all areas of the application. Exactly one
user must be assigned to this role.
DW Administrator - Can create users, roles and spaces and has other administration privileges across the SAP
Datasphere tenant. Cannot access any of the apps (such as the Data Builder).
DW Space Administrator (template) - Can manage all aspects of the spaces users are assigned to (except the
Space Storage and Workload Management properties) and can create data access controls.
DW Scoped Space Administrator - This predefined scoped role is based on the DW Space Administrator role
and inherits its privileges and permissions.
Note
Users who are space administrators primarily need scoped permissions to work with spaces, but they
also need some global permissions (such as Lifecycle when transporting content packages). To provide
such users with the full set of permissions they need, they must be assigned to a scoped role (such as the
DW Scoped Space Administrator) to receive the necessary scoped privileges, but they also need to be
assigned directly to the DW Space Administrator role (or a custom role that is based on the DW Space
Administrator role) in order to receive the additional global privileges.
DW Integrator (template) - Can integrate data via connections and can manage and monitor data integration in a
space.
DW Scoped Integrator - This predefined scoped role is based on the DW Integrator role and inherits its
privileges and permissions.
DW Modeler (template) - Can create and edit objects in the Data Builder and Business Builder and view data in
objects.
DW Scoped Modeler - This predefined scoped role is based on the DW Modeler role and inherits its privileges
and permissions.
DW Viewer (template) - Can view objects and view data output by views that are exposed for consumption in
spaces.
DW Scoped Viewer - This predefined scoped role is based on the DW Viewer role and inherits its privileges
and permissions.
Roles providing privileges to consume the data exposed by SAP Datasphere spaces:
DW Consumer (template) - Can consume data exposed by SAP Datasphere spaces, using SAP Analytics Cloud, and
other clients, tools, and apps. Users with this role cannot log into SAP Datasphere. It is intended for business
analysts and other users who use SAP Datasphere data to drive their visualizations, but who have no need to access
the modeling environment.
DW Scoped Consumer - This predefined scoped role is based on the DW Consumer role and inherits its
privileges and permissions.
Roles providing privileges to work in the SAP Datasphere catalog:
Catalog Administrator - Can set up and implement data governance using the catalog. This includes connecting the
catalog to source systems for extracting metadata, building business glossaries, creating tags for classification, and
publishing enriched catalog assets so all catalog users can find and use them. Must be used in combination with
another role such as DW Viewer or DW Modeler for the user to have access to SAP Datasphere.
Catalog User - Can search and discover data and analytics content in the catalog for consumption. These users may
be modelers who want to build additional content based on official, governed assets in the catalog, or viewers who
just want to view these assets. Must be used in combination with another role such as DW Viewer or DW Modeler for
the user to have access to SAP Datasphere.
Note
To activate SAP Business AI features in your SAP Datasphere tenant, see Enable SAP Business AI for SAP Datasphere.
All data acquisition, preparation, and modeling in SAP Datasphere happens inside spaces. A space is a secure area - space data
cannot be accessed outside the space unless it is shared to another space or exposed for consumption.
An administrator must create one or more spaces and allocate resources to them. See Creating Spaces and Allocating Resources.
Prepare Connectivity
Administrators prepare SAP Datasphere for creating connections to source systems in spaces (see Preparing Connectivity for
Connections).
Space Management
In Space Management, you can set up, configure, and monitor your spaces, including assigning users to them. For more information, see Preparing Your Space and Integrating Data.
System Monitor
In the System Monitor, you can monitor the performance of your system and identify storage, task, out-of-memory, and other issues. For more information, see Monitoring SAP Datasphere.
Security
Users: Create, modify, and manage users in SAP Datasphere. See Managing SAP Datasphere Users.
Activities: Track the activities that users perform on objects such as spaces, tables, views, data flows, and others, track changes to users and roles, and more. See Monitor Object Changes with Activities.
System Configuration
Data Integration: Live Data Connections (Tunnel): For SAP BW∕4HANA and SAP S/4HANA model import, you need Cloud Connector. This requires a live data connection of type tunnel. See Create Live Data Connection of Type Tunnel (SAP BW∕4HANA) and Create SAP S/4HANA Live Data Connection of Type Tunnel (SAP S/4HANA).
Third-Party Drivers: Upload driver files that are required for certain third-party cloud connections to use them for data flows. See Upload Third-Party ODBC Drivers (Required for Data Flows).
Password Policy Configuration: Define your password policy settings for the database users. The policy can be enabled when configuring your database users. See Set a Password Policy for Database Users.
Audit: Audit View Enablement: Configure a space that gets access to audit views and allows you to display the audit logs in that space. See Logging Read and Change Actions for Audit.
Tasks: Clean up task logs to reduce storage consumption in your SAP Datasphere tenant. See Delete Task Logs to Reduce Storage Consumption. Also allows you to view a list of users whose authorization consent will expire within a given timeframe (by default, four weeks). See Check Consent Expirations.
Tenant Configuration: Allocate the capacity units to storage and compute resources for your tenant. See Configure the Size of Your SAP Datasphere Tenant.
SAP BW Bridge: Create an SAP BW bridge tenant. See Provisioning the SAP BW Bridge Tenant.
Business Data Products: Select spaces to which SAP Business Data Cloud data products from an activated data package can be installed. See Authorize Spaces to Install SAP Business Data Cloud Data Products.
AI Services: Enable Artificial Intelligence services in SAP Datasphere. See Enable SAP Business AI for SAP Datasphere.
System Information: Add a visual tenant type indicator to show all users which system they are using, for example a test or production system. See Display Your System Information.
Workload Management: Set a priority for a particular space when querying the database and set limits to the amount of memory and threads that the space can consume. See Set Priorities and Statement Limits for Spaces or Groups.
System Administration
System Configuration
Session timeout: Set the amount of time before a user session expires if the user doesn't interact with the system. By default, the session timeout is set to 3600 seconds (1 hour). The minimum value is 300 seconds, and the maximum value is 43200 seconds.
Allow SAP support user creation: Let SAP create support users based on incidents. See Request Help from SAP Technical Support.
Tenant Links: Product Switch: Link an SAP Analytics Cloud tenant to your SAP Datasphere tenant to enable the product switch in the top right of the shell bar, and be able to easily navigate between them. See Review and Manage Links to SAP Analytics Cloud and SAP Business Data Cloud Tenants.
Data Source Configuration: SAP BTP Core Account: Get subaccount information for SAP Datasphere. You need the information to configure the Cloud Connector that SAP Datasphere uses to connect to sources for data flows and model import. See Set Up Cloud Connector in SAP Datasphere.
Security: Authentication Method: Select the authentication method used by SAP Datasphere. See Enabling a Custom SAML Identity Provider (Legacy Custom IdP).
App Integration: OAuth Clients: You can use the Open Authorization (OAuth) protocol to allow third-party applications access. See Create OAuth2.0 Clients to Authenticate Against SAP Datasphere.
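As a concrete illustration of the session timeout bounds listed above (minimum 300 seconds, maximum 43200 seconds, default 3600 seconds), the following Python sketch validates a proposed value. The function name and API are hypothetical and purely illustrative; only the numeric bounds come from the documentation.

```python
def validate_session_timeout(seconds: int) -> int:
    """Validate a proposed session timeout against the documented bounds.

    Hypothetical helper: only the bounds (300 s minimum, 43200 s maximum)
    come from the documentation above; the function itself is illustrative.
    """
    MIN_TIMEOUT, MAX_TIMEOUT = 300, 43_200
    if not MIN_TIMEOUT <= seconds <= MAX_TIMEOUT:
        raise ValueError(
            f"session timeout must be in [{MIN_TIMEOUT}, {MAX_TIMEOUT}] seconds, got {seconds}"
        )
    return seconds

# The default of 3600 seconds (1 hour) lies within the allowed range.
validate_session_timeout(3600)
```
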
System About
Every user can view information about the software components and versions of your system, in particular:
Build Date: Displays the date and time when the current version of the SAP Datasphere tenant was built.
Platform Version: Displays the version of the SAP Analytics Cloud components used in SAP Datasphere.
Users with the DW Administrator role can open a More section to find more details. For example, they can find outbound and database IP addresses that might be required for allowlists in source systems or databases of SAP Datasphere (see Finding SAP Datasphere IP addresses). Administrators can also upgrade their SAP HANA database patch version. For details, see Apply a Patch Upgrade to Your SAP HANA Database.
Desktop browser: Google Chrome, latest version. Google releases continuous updates to their Chrome browser. We make every effort to fully test and support the latest versions as they are released. However, if defects are introduced with OEM-specific browser software, we cannot guarantee fixes in all cases.
Microsoft Edge based on the Chromium engine, latest version. Microsoft makes continuous updates to their new Chromium-based Edge browser available for download. We make every effort to fully test and support the latest versions as they are released.
Network bandwidth: Minimum 500-800 kbit/s per user. In general, SAP Datasphere requires no more bandwidth than is required to browse the internet. All application modules are designed for speed and responsiveness with minimal use of large graphic files.
JavaScript: Enabled.
Power Option Recommendation: High Performance mode for improved JavaScript performance (for Microsoft-based operating systems).
Supported Languages
Menus, buttons, messages, and other elements of the user interface are available in: Bulgarian (bgBG); Catalan (caES); Chinese (zhTW); Chinese (Simplified) (zhCN); Croatian (hrHR); Czech (csCZ); Danish (daDK); Dutch (nlNL); English (enGB); English (enUS); Estonian (etEE); French (frCA); French (frFR); Finnish (fiFI); German (deDE); German (deCH); Greek (elGR); Hindi (hiIN); Hungarian (huHU); Indonesian (idID); Italian (itIT); Japanese (jaJP); Korean (koKR); Latvian (lvLV); Lithuanian (ltLT); Malay (msMY); Norwegian (noNO); Polish (plPL); Portuguese (Brazil) (ptBR); Portuguese (Portugal) (ptPT); Romanian (roRO); Russian (ruRU); Serbian (srRS); Slovakian (skSK); Slovenian (slSL); Spanish (esES); Spanish (esMX); Swedish (svSE); Thai (thTH); Turkish (trTR); Ukrainian (ukUA); Vietnamese (viVN); and Welsh (cyGB).
Data Connectivity
We recommend that you always use the latest released version of the Data Provisioning Agent, but at least the recommended minimum version from SAP Note 2419138. Make sure that all agents that you want to connect to SAP Datasphere have the same latest version.
For more information, including information on minimum requirements for source systems and databases, see:
Configure Data Provisioning Adapters in the SAP HANA Smart Data Integration and SAP HANA Smart Data Quality
documentation
You can create an SAP support incident on the SAP Support Portal (S-user login required). For detailed information about what to
include in an incident, see SAP Note 2854764 .
In SAP Datasphere, users with an administrator role can make sure that a support user is created in the tenant. Two options are
available:
Option 1: Allow SAP Technical Support to Create Support Users for Incidents
To generally allow SAP Technical Support to create support users based on incidents, proceed as follows:
2. Choose Edit.
4. Click Save.
In case of an incident, the assigned support engineer from SAP Technical Support can request and generate a personalized
support user for the affected tenant. This user is enabled for multi-factor authentication.
Support engineers can request the support user with one of the following roles:
the global extended role DW Support User along with the scoped role DW Scoped Support User
DW Support User gives support users read-only access privileges to all functionalities of SAP Datasphere, enabling
them to analyze the incident.
When support engineers request the DW Scoped Support User role, they can specify the spaces that need to be
added as scopes to this role. This gives the support user read-only access to these spaces.
the global DW Administrator role, if the customer confirms this in the incident
The support user does not consume a user license, and it will be automatically deleted after two days or after the incident
has been closed.
2. In the Support dialog, click Create Support User and then choose OK to confirm the support user creation.
An email is automatically sent to SAP Support to notify them of the newly created support user, and it is listed with your
other users at Security Users .
The support user has minimum privileges and does not consume a user license.
You can assign an appropriate scoped role to the support user and add the user to the required space, or assign the DW Administrator or Catalog Administrator role if required.
For more information about creating a support user and assigning appropriate roles, see SAP Note 2891554.
You can create your own tenant in the SAP BTP Cockpit. The procedure is the same for both subscription-based and consumption-
based contracts. Some details may vary depending on the chosen service plan (free or standard). For more information about
limitations for a free plan, see SAP Note 3227267.
When the tenant is configured, a data center region is selected. The main role of a data center is to guarantee the uninterrupted
operation of computer systems. It also provides secure storage, processing, and networking capabilities for your data. A data
center refers to the physical location, which could be a building or a group of buildings, housing computer systems and their
components.
Each data center region has multiple availability zones. Your workloads are deployed in these various zones. By distributing
workloads across different zones, we ensure our services remain available, even if a specific zone experiences issues. By keeping
backup data within the same data center region, the latency for data transfers and access is minimized. This infrastructure
strategy balances the workload and enhances performance. The zone deployment contributes to a more robust and reliable
infrastructure, ensuring near-zero downtime for your critical processing needs.
For information about enabling multiple availability zones, see this SAP Knowledge Base Article .
Provisioning: For information about region availability, see the SAP Discovery Center. The SAP BTP subaccount administrator must trigger the SAP Datasphere instance creation. Tenant creation will not be triggered by SAP. You must create and configure the to-be-provisioned SAP Datasphere service instance (tenant) in SAP BTP. See Create Your SAP Datasphere Service Instance in SAP BTP. The system owner of SAP Datasphere, who has been specified during the provisioning, is notified via email when the tenant is provisioned.
Size Configuration
Standard plan: Tenants are initially created with a minimal configuration that includes 128 GB of storage and 32 GB of memory (2 compute blocks). Once logged in to your tenant, upscaling can be done at any time. See Configure the Size of Your SAP Datasphere Tenant. For maximum size configuration options, see the tables below.
Note
After finalizing the configuration, you can only change the size of your SAP BW Bridge storage later if you don't have any SAP BW Bridge instances.
Free plan: Tenants are created with 128 GB of storage and 32 GB of memory (2 compute blocks). You cannot upscale free plan tenants. You need to update your plan from free to standard if any sizing configuration is required.
Metering
Standard plan: The number of consumed capacity units is reported on an hourly basis to your SAP BTP account.
Free plan: The usage of a free plan tenant is reported to your SAP BTP account, but SAP does not charge you for using this tenant.
The maximum configuration size of your tenant depends on regional availability and your server type.
Note
Data integration includes 200h/month from the minimum free package.
Australia: 5970 GB | 16000 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
Brazil (São Paulo): 5970 GB | 16000 GB | 4096 GB | Not supported | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
Canada (Montreal): 5970 GB | 16000 GB | 4096 GB | Not supported | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
Europe (Frankfurt): 12000 GB | 61440 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 442 (High Memory Performance Class)
EU Access (Frankfurt): 5970 GB | 16000 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
Japan (Tokyo): 5970 GB | 16000 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
Singapore: 5970 GB | 16000 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
South Korea: 5970 GB | 16000 GB | 4096 GB | Not supported | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
US East: 12000 GB | 61440 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 442 (High Memory Performance Class)
Microsoft Azure
Brazil: 5600 GB | 27840 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (Memory Performance Class)
Europe (Amsterdam): 11150 GB | 55760 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (Memory Performance Class)
Europe (Switzerland): 5600 GB | 27840 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (Memory Performance Class)
US West: 11150 GB | 55760 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (Memory Performance Class)
US East: 5600 GB | 27840 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (Memory Performance Class)
Brazil: 5750 GB | 28928 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 124 (High Memory Performance Class)
India (Mumbai): 5750 GB | 28928 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 204 (High Memory Performance Class)
Saudi Arabia: 5750 GB | 28928 GB | Not supported | 90 TB | 7200 h/month | 20.5 GB/h | 204 (High Memory Performance Class). Note: Only available to customers representing critical national infrastructure or the public sector.
Saudi Arabia: 5750 GB | 28928 GB | Not supported | 90 TB | 7200 h/month | 20.5 GB/h | 204 (High Memory Performance Class). Note: Available to non-regulated customers.
Note
Creating an SAP Datasphere service instance in SAP Business Technology Platform (SAP BTP) results in provisioning an SAP
Datasphere tenant.
For both subscription-based contracts (initiated in November 2023) and consumption-based contracts, you can access the SAP BTP cockpit and view all currently available services in a global account. You need to structure this global account into subaccounts and other related artefacts, such as directories and/or spaces.
Prerequisites
To create your SAP Datasphere service instance in SAP BTP, you need the following prerequisites:
Your global account has a commercial entitlement either via cloud credits (in case of a consumption-based contract) or via
a subscription-based contract.
A Cloud Foundry subaccount which is entitled for SAP Datasphere. For more information, see Configure Entitlements and
Quotas for Subaccounts.
You have SAP BTP administration authorization on the subaccount that is entitled to SAP Datasphere.
You are using Google Chrome to properly view popups in SAP BTP.
Service Plans
Standard The standard plan provides an SAP Datasphere tenant for productive and non-productive use, which is
represented by a service instance.
Free The free plan provides an SAP Datasphere tenant for a limited time for trial use, which is represented by a
service instance.
Note
For information about region availability, see the SAP Discovery Center .
Create a Tenant
The following procedure uses the SAP BTP cockpit to create the service instance.
In the SAP Datasphere Administration Guide, we provide high-level steps to create an SAP Datasphere tenant on SAP BTP. For
more detailed information, or for instructions that use the Cloud Foundry Command-Line Interface, see the SAP Business
Technology (SAP BTP) documentation.
Note
You can create only one free tenant under the global account. If your SAP BTP service causes issues, you can open an incident
ticket via SAP for Me.
1. In the SAP BTP cockpit, navigate to the space in which you want to create the service instance, and click Services
Service Marketplace in the left navigation area.
2. Search for "Datasphere", and click the SAP Datasphere service to open it.
A wizard opens, in which you can select or specify the following parameters:
Parameter Description
Space: [no selection needed if you're creating the instance from the space area] Select the SAP BTP space in which you want to create the service instance.
Note
Not all runtime environments are available for free.
4. Click Next and enter the following information about the SAP Datasphere system owner, who will be notified when the
service instance is created: First Name, Last Name, Email, and Host Name.
Note
Alternatively, you can use a JSON file to provide the information above. See Create an API Access Configuration JSON
File.
5. Click Next to go to the final page of the wizard where you can review your selections, and then click Create to exit the
wizard.
An information message is displayed to confirm that the service instance creation is in progress.
Note
The creation of the instance can take a while.
6. Click View Instance to go to your space Service Instances page, where the new instance is listed and you can view the
progress of its creation.
7. When the service instance is created, the SAP Datasphere system owner receives an email confirming its availability, and
providing a link to navigate to the SAP Datasphere tenant, which the service instance represents.
Note
If the creation of the service instance fails (the "failed" status is displayed), you must first delete the failed instance and
then create a new SAP Datasphere service instance. If you need support, you can open an incident via SAP for Me with
the component DS-PROV.
When configuring the service instance in SAP Business Technology Platform, you can create and upload a JSON configuration file
with the required parameters to authenticate the service instance. You can use the JSON file when creating a new instance or
recovering a deleted instance.
When creating a new instance, use these parameters in the JSON file:
Sample Code
{
  "first_name": "",
  "last_name": "",
  "email": "",
  "host_name": ""
}
Parameter Description
email The email address of the authorized user. For example, [Link]@[Link].
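If you prefer to generate the configuration file programmatically rather than writing it by hand, a minimal Python sketch could look as follows. The parameter names match the sample above; the values and file name shown are placeholders that you would replace with your own details.

```python
import json

# Placeholder values: replace with the actual system owner details.
instance_params = {
    "first_name": "Jane",
    "last_name": "Doe",
    "email": "jane.doe@example.com",
    "host_name": "my-datasphere-tenant",
}

# Write the JSON configuration file to upload when creating the instance.
with open("datasphere-instance.json", "w") as f:
    json.dump(instance_params, f, indent=2)
```
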
Sample Code
{
  "tenantUuid": "",
  "access_token": ""
}
Parameter Description
tenantUuid The tenant universally unique identifier (UUID) for the source system.
You can configure the size of a subscription-based tenant and a consumption-based tenant with a standard plan.
Caution
Once you save the size configuration of your tenant, some resources cannot be resized later. Storage cannot be downsized. If
you require storage downsize, you must recreate the tenant. Exception: If you need to decrease the memory, see SAP note
3224686 .
The whole process could take more than 90 minutes. The configuration process is not long, but the operational process in
the background can take a while.
In case an error occurs, you are notified that the configuration cannot be completed and that you need to try again later by
clicking the Retry button (which replaces the Save button in such a case). The delay depends on the error (for example, if
there is an error on the SAP HANA Cloud database side, you can retry after 60 minutes).
You can only change SAP HANA Compute and SAP HANA Storage once every 24 hours.
If you try to change your SAP HANA configuration, SAP HANA Cloud functionality (Spaces, DPServer, Serving of Queries)
will not be available for around 10 minutes. If you run into issues after the configuration, use the Retry button.
Supported Sizes
To view all supported size combinations for compute and storage resources and the number of capacity units consumed, go to the
[Link]
Base Configuration
Property Description
Memory: Select the amount of memory for your tenant, with a performance class of Compute, High-Memory, or High-Compute (see the vCPU Allocation table below). You can increase the amount of memory from 32 GB (minimum) by increments of 16 GB. You can reduce the amount of memory, but the lower limit depends on how much space you have assigned to space management.
vCPU: Displays the number of vCPUs allocated to your tenant. The number is calculated based on the selected performance class and the memory used by your tenant.
Enable the SAP HANA Cloud Script Server: Enable this option to access the SAP HANA Automated Predictive Library (APL) and SAP HANA Predictive Analysis Library (PAL) machine learning libraries.
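The memory sizing rule above (32 GB minimum, growing in 16 GB increments) can be sketched as a small helper that lists the valid sizes up to a chosen ceiling. This is an illustrative calculation only, not an SAP API; the function name is hypothetical.

```python
def valid_memory_sizes(max_gb: int) -> list[int]:
    # Memory starts at the 32 GB minimum and increases in 16 GB steps.
    return list(range(32, max_gb + 1, 16))

# For example, up to 96 GB the valid sizes are 32, 48, 64, 80, and 96 GB.
print(valid_memory_sizes(96))
```
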
Additional Configuration
Property Description
Data Lake Storage: [optional] Select the size of data lake disk storage. The performance class you select determines the number of vCPUs allocated to your tenant. You can specify from 0 TB (minimum) to 90 TB (maximum), by increments of 1 TB. To reduce the size of your data lake storage, you must first delete your data lake instance and re-create it in the size that you want.
Note
Deletion cannot be reversed and all data stored in the data lake instance is deleted.
You cannot delete your data lake storage if it's connected to a space. You must first disconnect the space:
2. Select Edit.
3. Under General Settings, clear the Use this space to access data lake checkbox.
Data lake is not available in all regions. See SAP Note 3144215.
SAP BW Bridge Storage: [optional] Enter a value to select the size of SAP BW bridge storage. The system updates to the nearest value automatically. You can also click the + and - buttons to adjust to your desired size. SAP BW Bridge includes SAP BTP, ABAP environment, and its own SAP HANA Cloud runtime and compute.
Caution
It is not possible to downsize an SAP BW Bridge tenant.
Note
First finalize the size configuration of your tenant. Then, you can create the SAP BW bridge instance on the dedicated SAP BW Bridge page of the Configuration area with the size you've allocated (see Provisioning the SAP BW Bridge Tenant). As soon as you click Save, the allocated capacity units will be assigned to SAP BW Bridge.
Object Store
For this option to be enabled, the Memory option must be configured to 128 GB or more. See the Base Configuration table.
Property Description
Note
You may incur higher consumption costs because data lake files keep a previous copy of any file affected by an operation for a given retention time to allow for operations such as RESTORESNAPSHOT. These previous copies incur data lake storage costs. For example, you may have a 10 MB table, and the storage will be higher than that because of the number of operations initiated and copied. For more information, see Restoring data in Data Lake Files and Limitations of Data Lake files.
Storage is rounded up to the next whole GB. For example, if all of the files in storage consume 1.2 GB, then the storage is rounded up to the next full gigabyte. In this example, it would round up to 2 GB.
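The rounding rule described in the note above can be expressed as a one-line ceiling calculation. This is an illustrative sketch only; the function name is hypothetical and not part of any SAP API.

```python
import math

def billed_storage_gb(used_gb: float) -> int:
    # Storage is rounded up to the next whole GB: 1.2 GB counts as 2 GB.
    return math.ceil(used_gb)

print(billed_storage_gb(1.2))
```
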
Requests Select the number of API calls needed per month in units of 1000.
Property Description
Performance Class [optional] Select a performance class for your elastic compute node block-hours:
Memory
Compute
High-Compute
Note
The performance class you select determines the number of vCPUs and the RAM allocated to
your tenant.
You can only use one performance class at a time. To use a different performance class, you must
re-configure your Tenant Configuration settings.
Block Specifications Displays the number of vCPUs and the amount of RAM allocated to your tenant.
Block-Hours [optional] Set the number of block-hours scheduled for elastic compute node consumption. Each
block-hour is an additional block of vCPU and RAM for your tenant to use in one hour. The
maximum number of block-hours you can consume in one hour is four.
Elastic Compute Node Usage: Allocated Block-Hours Displays the number of block-hours currently scheduled for elastic compute node consumption.
Elastic Compute Node Usage: Used Block-Hours Displays the total number of block-hours consumed by elastic compute nodes. The total is independent of which performance class is selected.
Elastic Compute Node Usage: Exceeded Block-Hours Displays the block-hours you have used that exceed the amount allocated by your tenant configuration.
Note
This option only appears if you have used more block-hours than allocated.
Data Integration
Property Description
Data Integration [optional] Enter the number of blocks to allocate to data integration applications (replication
flows and transformation flows).
Even if you don’t allocate blocks here, you have a default number of execution hours for data
integration (depending on your contract). For more information about this, as well as about how
many execution hours you get per block, see the SAP Datasphere and SAP Datasphere, Test
Tenant Supplemental Terms and Conditions, which are a part of the Service Level Agreement.
Execution Hours Displays the number of execution hours available for data integration applications per month. It is
calculated by multiplying the number of allocated compute blocks by the number of execution
hours per block (per your contract). You can increase or decrease the data integration node hours
without downtime. The amount of parallel job processing is automatically adjusted within the set
limits: every 100 hours of your allocated data integration hours gets one extra parallel pod for job
processing. For example, if you have 400 hours of data integration, you will have a maximum of four
parallel pods available for processing.
Note
If you exceed the available execution hours, your data integration processes (such as
replication flow runs) continue running to avoid interrupting critical integration scenarios,
which can result in additional costs (depending on your plan).
Maximum Parallel Jobs Displays the maximum number of jobs that can run in parallel.
The minimal configuration of SAP Datasphere supports 2 parallel jobs. For every additional 100
execution hours allocated, you get one extra parallel job, up to a maximum of 10.
Each parallel job means that roughly 5 replication objects (from one or more replication flows) can
be processed in parallel.
If the number of running replication flows exceeds the maximum number of parallel jobs,
processing is queued, and replication occurs less frequently.
Data Integration: Allocated Execution Hours Displays the number of execution hours allocated to data integration applications so that you can easily compare it against the used execution hours.
Data Integration: Used Execution Hours Displays the number of hours used by data integration applications in the current month. The value is updated once every 6 hours.
Data Integration: Exceeded Execution Hours Displays the execution hours that you have used in the current month that exceed the amount allocated in tenant configuration.
Note
This option only appears if you have used more hours than allocated.
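The scaling rule above can be sketched as follows (an assumption-laden illustration: it takes the stated baseline of 2 jobs, adds one job per additional 100 execution hours, and caps the result at 10; how your contract's default hours factor in may differ):

```shell
# Sketch: maximum parallel jobs = 2 (minimal configuration) plus one per
# additional 100 execution hours allocated, capped at 10. Values hypothetical.
additional_hours=400
jobs=$(( 2 + additional_hours / 100 ))
if [ "$jobs" -gt 10 ]; then jobs=10; fi
echo "$jobs"   # 2 + 400/100 = 6 parallel jobs
```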
Property Description
Outbound Blocks Enter the number of blocks to be used for premium outbound integration. Having at least one
block assigned here is a prerequisite for using a non-SAP target in a replication flow. For more
information, see Premium Outbound Integration.
Outbound Volume Displays the data volume available for premium outbound integration per month. It is calculated
by multiplying the number of allocated blocks by 20 GB.
Note
If you exceed the assigned volume, your data integration processes (such as replication flow
runs) continue running to avoid interrupting critical integration scenarios, which can result in
additional costs (depending on your plan).
Premium Outbound Usage: Allocated Data Volume Displays the monthly allocated data volume (in GB) for premium outbound integration so that you can easily compare it against the used volume.
Premium Outbound Usage: Used Data Volume Displays the used data volume (in GB) for premium outbound integration in the current month. The value is updated once every 6 hours.
Premium Outbound Usage: Exceeded Data Volume Displays the data volume that you have used in the current month that exceeds the amount allocated in tenant configuration.
Note
This option only appears if you have used more data than allocated.
Catalog
Property Description
Catalog Storage Included by default. You can increase or decrease the number of storage blocks allocated for the
catalog.
Storage The amount of storage available for the catalog is calculated from the number of allocated blocks.
Catalog Usage: Allocated Storage Displays the number of GB allocated to the catalog.
Catalog Usage: Used Storage Displays the number of GB used by the catalog.
Catalog Usage: Exceeded Storage Displays the amount of storage that you have used that exceeds the amount allocated by your
tenant configuration.
Note
This option only appears if you have used more storage space than allocated.
Capacity Units
Property Description
Purchased units Displays the capacity units purchased for the month.
Estimated Units Displays the number of units anticipated to be charged to the user by the end of the month. This
calculation assumes that the current configuration stays unchanged and all pay-per-use services
are fully utilized.
Available Units Displays the estimated capacity units left for this month. This number is calculated as Purchased
Units - Estimated Units = Available Units.
Your Consumption Displays the amount charged to the users this month for all services. This calculation accounts for
any configuration changes made during the month and the precise usage of pay-per-use services.
vCPU Allocation
When you set your base configuration, the performance class you select and the hyperscaler you are using determine the amount
of memory in GB available for each vCPU in the system. For example, an AWS system with 32 GB of memory has 2 vCPUs, whereas
an AWS system with 320 GB of memory has 20 vCPUs. The tables below list the memory-to-vCPU ratio for each hyperscaler.
Hyperscaler Memory Memory per vCPU (GB)
AWS 32-960 GB 16
AWS 1024-1792 GB 16
AWS 1800 GB 15
GCP 32-960 GB 16
GCP 1024-1856 GB 16
GCP 1904 GB 16
Azure 32-1024 GB 16
Azure 1088-1920 GB 16
Azure 3744 GB 16
AWS 3600 GB 30
AWS 32-912 GB 8
GCP 32-608 GB 8
Azure 32-480 GB 8
AWS 32-352 GB 4
GCP 32-288 GB 4
Azure 32-352 GB 4
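As a sketch of how the ratio translates into vCPUs (illustrative values only, using the 320 GB AWS example from above):

```shell
# Illustration of the memory-to-vCPU ratio: an AWS system in the 32-960 GB
# range has 16 GB of memory per vCPU.
memory_gb=320
gb_per_vcpu=16
vcpus=$(( memory_gb / gb_per_vcpu ))
echo "$vcpus"   # 320 GB / 16 GB-per-vCPU = 20 vCPUs
```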
In SAP Business Technology Platform (SAP BTP), if you have an SAP Datasphere service instance with a free plan, which you can
use for 90 days, you can update it to a standard plan (no time limitation) for productive purposes. The number of days before the
expiration is displayed in the top panel of SAP Datasphere.
Note
If you do not update to a standard plan within 90 days, your SAP Datasphere tenant will be suspended. While the tenant is
suspended, you can still upgrade your service instance from the free to standard plan, but after 5 days of suspension, your
tenant will be deleted and there is no way to recover it.
If your tenant is deleted, the service instance will still be shown in your Global Account, but it is not functional. You can delete it
and create a new SAP Datasphere service instance with a free plan.
To do so, you must have SAP BTP administration authorization on the subaccount that is entitled to SAP Datasphere.
1. In SAP BTP, select the subaccount and the space where the service instance with a free plan was created.
3. In the Service Instances page, find the SAP Datasphere service instance with the free plan, click the button at the end of
the row and select Update.
Note
After updating your free plan to standard plan, you must wait at least 24 hours before changing the tenant settings on
the Tenant Configuration page.
4. In the Update Instance dialog, select standard and click Update Instance.
You can view the progress of the update. The status of the instance becomes green when the update is completed.
Note
The update process takes around 30 minutes, and during this time some features might not work as expected.
Prerequisites
To view the Administration page containing the Tenant Links tab, you must have a global role that grants you the following
privileges:
The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.
Note
To select an SAP Analytics Cloud tenant to make available via the (Product Switch), you must have the System Owner role.
Property Description
SAP Datasphere URL [read-only] Displays the URL for the current SAP Datasphere
tenant.
SAP Analytics Cloud URL Displays the URL for the SAP Analytics Cloud selected by the
System Owner to be accessible via the (Product Switch).
If your tenant is included in an SAP Business Data Cloud formation, then the following links are also displayed:
Property Description
SAP Business Data Cloud Cockpit URL [read-only] Displays the URL of the SAP Business Data Cloud
Cockpit tenant.
SAP Analytics Cloud URL (SAP Business Data Cloud Formation) [read-only] Displays the URL of the SAP Analytics Cloud tenant in
the formation.
In this situation, both the SAP Business Data Cloud Cockpit and the SAP Analytics Cloud tenants are shown as tiles when users
click the (Product Switch) and, if they have a user in the relevant target tenant, they can click the tile to navigate there.
Integrate SAP Business Data Cloud Provisioned Systems (in the SAP Business Data Cloud documentation)
Specify an SAP Analytics Cloud Tenant to Access via the Product Switch
You can link your SAP Datasphere tenant to a SAP Analytics Cloud tenant accessible in the (Product Switch) in the top right of
the shell bar, to help your users easily navigate between them.
To select an SAP Analytics Cloud tenant to make available via the (Product Switch), you must have the System Owner role.
Note
You must select an SAP Analytics Cloud tenant hosted in a Cloud Foundry environment. Linking to NEO tenants is not
supported.
The selected SAP Analytics Cloud tenant tile is shown when users click the (Product Switch) and, if they have a user with
an appropriate role in the SAP Analytics Cloud system, they can click the tile to navigate there.
Note
An SAP Analytics Cloud user must create a live connection before they can consume data from SAP Datasphere.
Multiple SAP Analytics Cloud tenants can create live connections to your SAP Datasphere tenant, but only one SAP
Analytics Cloud tenant can be accessed via the (Product Switch).
For more information, see Consume Data in SAP Analytics Cloud via a Live Connection.
Property Description
SAP Datasphere URL [read-only] Displays the URL for the current SAP Datasphere tenant.
SAP Analytics Cloud URL [read-only] Displays the URL of the SAP Analytics Cloud tenant storing planning data in the current SAP
Datasphere tenant.
Tenant Link Artifacts [read-only] Displays the name of the OAuth client the SAP Analytics Cloud tenant uses to connect to SAP
Datasphere.
For more information about storing SAP Analytics Cloud planning data in SAP Datasphere, see:
Configure Data Storage for Planning (in the SAP Analytics Cloud documentation)
Prerequisites
To enable SAP HANA for SQL data warehousing on your SAP Datasphere tenant, you must have a global role that grants you the
following privileges:
System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.
The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.
Context
To enable SAP HANA for SQL data warehousing on your SAP Datasphere tenant, you must map your tenant to your SAP Business
Technology Platform account.
Note
The SAP Datasphere tenant and SAP Business Technology Platform organization and space must be in the same data center
(for example, eu10, us10). This feature is not available for Free Tier plan tenants (see SAP Note 3227267 ).
A tenant may have both Kyma mappings and Cloud Foundry mappings simultaneously. You can also map instances using the
Instance Mapping REST API (see Create and Manage Instance Mappings Using the REST API in the SAP HANA Cloud
Administration Guide).
For information about working with SAP Datasphere and HDI containers, see Exchanging Data with SAP HANA for SQL data
warehousing HDI Containers.
Procedure
1. In the side navigation area, click (Configuration) Instance Mapping .
2. Add a mapping for one of these environment types: Cloud Foundry or Kyma.
Cloud Foundry: Enter the organization and space GUIDs that you are mapping to:
Your SAP Business Technology Platform organization GUID.
Tip
You can use the Cloud Foundry CLI to find your organization GUID:
See [Link] .
[Optional] Your SAP Business Technology Platform space inside the organization. If you only specify the organization
GUID, the instance is mapped to all spaces in that organization.
Tip
You can use the Cloud Foundry CLI to find your space GUID:
See [Link] .
Kyma: Enter the cluster ID and namespace that you are mapping to:
Your cluster ID. The cluster ID must be a GUID and the namespace can be no longer than 64 characters. Allowed
characters are all lowercase letters, numbers, and dash (-).
[Optional] Your namespace. If no namespace is provided, the instance is mapped to all namespaces in the cluster.
Tip
You can use the following methods to find your cluster ID GUID and your namespace:
kubectl CLI
Kyma Console
Cluster ID GUID
a. In the left sidebar, click Namespaces. Then select kyma-system from the main page.
b. In the left sidebar, click Configuration Config Maps . Then select sap-btp-operator-config from the main page.
Namespace
In the left sidebar, click Namespaces to see all namespaces in the Kyma cluster.
For more information, see SAP HANA Instance Mapping in the SAP HANA Cloud Administration Guide.
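As a sketch, the GUIDs and cluster ID mentioned above can also be retrieved from the command line. The commands are printed rather than executed so the snippet stands alone; the org and space names are hypothetical, and the CLUSTER_ID key name in the ConfigMap is an assumption - verify it against your cluster:

```shell
# Hypothetical org and space names: replace with your own.
ORG="my-btp-org"
SPACE="dev"

# Cloud Foundry CLI (requires an active `cf login` session, with the org targeted):
CF_ORG_CMD="cf org $ORG --guid"       # prints the organization GUID
CF_SPACE_CMD="cf space $SPACE --guid" # prints the space GUID
echo "$CF_ORG_CMD"
echo "$CF_SPACE_CMD"

# Kyma: the cluster ID is stored in the sap-btp-operator-config ConfigMap in
# the kyma-system namespace (the CLUSTER_ID key name is an assumption).
K8S_CMD="kubectl get configmap sap-btp-operator-config -n kyma-system -o jsonpath={.data.CLUSTER_ID}"
echo "$K8S_CMD"
```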
Your tenant is mapped to another environment context. You can now create HDI containers outside of SAP Datasphere.
4. Build one or more new HDI containers in the SAP Business Technology Platform Space and they will be created in the SAP
Datasphere run-time database (identified by the Database ID on the SAP Datasphere About dialog).
For information about setting up your build, see Binding Applications to an SAP HANA Cloud Instance.
Next Steps
When one or more HDI containers are available in your tenant, users with a space administrator role can work with them
(Exchanging Data with SAP HANA for SQL data warehousing HDI Containers).
To enable the SAP HANA Cloud script server, go to the Tenant Configuration page and select the checkbox in the Base
Configuration section. For more information, see Configure the Size of Your SAP Datasphere Tenant.
Note
The script server cannot be enabled in an SAP Datasphere consumption-based tenant with a free plan.
Once the script server is enabled, the Enable Automated Predictive Library and Predictive Analysis Library option can be
selected when creating a database user (see Create a Database User).
For detailed information about using the machine learning libraries, see:
Prerequisites
Your SAP Datasphere tenant is on a landscape that supports SAP Business AI. See SAP Note 0003491182 .
You've purchased the SAP AI Units license. For more information about SAP AI Units license, contact your Account
Executive.
To activate an SAP Business AI feature, you need the tenant administrator role.
You may have access to the AI Services tab even if the tenant has not yet been activated for SAP Business AI,
or if SAP Business AI features are not yet supported. For more information, see SAP Note
[Link] .
Procedure
1. In the side navigation area, click System Configuration .
3. In the AI Features section, check the options that you want to use.
AI-Assisted Catalog Content Generation - AI-Enhanced Metadata Enrichment - Generate asset summaries and
descriptions, and assign tag relationships. See Enriching and Managing Catalog Assets and Manage Tag
Relationships for Assets.
AI-Assisted Natural Language Search - AI-Enhanced Metadata Discovery - Enter your search string in natural
language and SAP Datasphere interprets your phrase and filters your results appropriately. See Natural Language
Search.
4. Click Save.
Next Steps
Grant access to use SAP Business AI features by assigning the DW AI Consumer role or another global role that grants the Data
Warehouse AI Consumption privilege (see Assign Users to a Role).
When users have been granted the privilege to SAP Business AI, they will see this icon in areas of SAP Datasphere where AI is
available for use:
To use the datasphere command line interface (Log into the Command Line Interface via an OAuth Client).
To consume data via the OData API (see Consume SAP Datasphere Data in SAP Analytics Cloud via an OData
Service).
To use the SCIM 2.0 API (see Create Users and Assign Them to Roles via the SCIM 2.0 API).
To transport content through SAP Cloud Transport Management (see Transporting Your Content through SAP Cloud
Transport Management).
Context
You create an OAuth2.0 client with an Interactive Usage purpose:
To use the datasphere command line interface (Log into the Command Line Interface via an OAuth Client).
To consume data via the OData API (see Consume SAP Datasphere Data in SAP Analytics Cloud via an OData Service).
Note
Consuming exposed data in third-party clients, tools, and apps via an OData service requires a three-legged OAuth2.0
flow with type authorization_code.
Procedure
1. In the side navigation area, click (System) (Administration) App Integration .
Property Description
Authorization Grant [read-only] Authorization Code is automatically selected and cannot be changed.
Secret [read-only] Allows the secret to be copied immediately after the client is created.
Note
Once you close the dialog, the secret is no longer available.
Note
Clients created before v2024.08 have a Show Secret button, which allows you to display and copy
the secret at any time after the client is created.
Redirect URI Enter a URI to indicate where the user will be redirected after authorization. If the URI has dynamic
parameters, use a wildcard pattern (for example, [Link]).
The client, tool, or app that you want to connect is responsible for providing the redirect URI:
When working with the datasphere command line interface, set this value to
[Link] (see Accessing SAP Datasphere via the Command Line).
When connecting SAP Analytics Cloud to SAP Datasphere via an OData services connection,
use the Redirect URI provided in the SAP Analytics Cloud connection dialog (see Consume SAP
Datasphere Data in SAP Analytics Cloud via an OData Service).
Token Lifetime Enter a lifetime for the access token from a minimum of 60 seconds to a maximum of one day.
Default: 60 minutes
Refresh Token Lifetime Enter a lifetime for the refresh token from a minimum of 60 seconds to a maximum of 180 days.
Default: 30 days
4. Click Add to create the client and generate the ID and secret.
5. Copy the secret, save it securely, and then close the dialog.
Note
You won't be able to copy the secret again. If you lose it, you will need to create a new client.
6. Provide the following information to users who will use the client:
Client ID Client ID
Secret Secret
Context
You create an OAuth2.0 Client with an API Access purpose:
To use the SCIM 2.0 API (see Create Users and Assign Them to Roles via the SCIM 2.0 API).
To transport content through SAP Cloud Transport Management (see Transporting Your Content through SAP Cloud
Transport Management).
Procedure
1. In the side navigation area, click (System) (Administration) App Integration .
Property Description
User Provisioning - To use the SCIM 2.0 API (see Create Users and Assign Them to Roles via the
SCIM 2.0 API).
Analytics Content Network Interaction - To transport content through SAP Cloud Transport
Management (see Transporting Your Content through SAP Cloud Transport Management).
Client Credentials - If the client application is accessing its own resources or when the
permission to access resources has been granted by the resource owner via another
mechanism. To use the SCIM 2.0 API, select this option (see Create Users and Assign Them to
Roles via the SCIM 2.0 API).
SAML2.0 Bearer - If the user context is passed using SAML or to control access based on user
permissions using SAML. This option requires specific client-side infrastructure to support
SAML.
Secret [read-only] Allows the secret to be copied immediately after the client is created.
Note
Once you close the dialog, the secret is no longer available.
Note
Clients created before v2024.08 have a Show Secret button, which allows you to display and copy
the secret at any time after the client is created.
Token Lifetime Enter a lifetime for the access token from a minimum of 60 seconds to a maximum of one day.
Default: 60 minutes
4. Click Add to create the client and generate the ID and secret.
5. Copy the secret, save it securely, and then close the dialog.
Note
You won't be able to copy the secret again. If you lose it, you will need to create a new client.
6. Provide the following information to users who will use the client:
Client ID Client ID
Secret Secret
Context
The OAuth 2.0 SAML Bearer Assertion workflow allows third-party applications to access protected resources without prompting
users to log into SAP Datasphere when there is an existing SAML assertion from the third-party application identity provider.
Note
Both SAP Datasphere and the third-party application must be configured with the same identity provider. The identity provider
must have a user attribute Groups set to the static value sac. See also the blog Integrating with SAP Datasphere Consumption
APIs using SAML Bearer Assertion (published March 2024).
Procedure
1. Go to System Administration App Integration .
2. In the Trusted Identity Providers section, click Add a Trusted Identity Provider.
Property Description
Name Enter a unique name, which will appear in the list of trusted identity providers.
Provider Name Enter a unique name for the provider. This name can contain only letters (a-z, A-Z),
numbers (0-9), underscore (_), dot (.), and hyphen (-), and cannot exceed 36 characters.
Signing Certificate Enter the signing certificate information for the third-party application server in X.509 Base64 encoded
format.
4. Click Add.
The identity provider is added to the list. Hover over it and select Edit to update it or Delete to delete it.
You may need to use the Authorization URL and Token URL listed here to complete setup on your OAuth clients.
To do so, you must have SAP BTP administration authorization on the subaccount that is entitled to SAP Datasphere.
Note
If you delete your service instance by accident, it can be recovered within seven days. After seven days have passed, the tenant
and all its data will be deleted and cannot be recovered.
1. In SAP BTP, select the subaccount and the space where the service instance was created.
3. In the Service Instances page, find the SAP Datasphere service instance that you want to delete, click the button at the end
of the row and select Delete, then click Delete in the confirmation dialog.
You can view the progress of the deletion. The tenant stays in a suspended state for seven days. During that time, you
cannot use the same tenant host name.
Context
If you accidentally delete your SAP Datasphere service instance in SAP BTP, you can restore it within seven days. For more
information, see SAP Note 3455188 .
Note
Restoring your service instance is only supported for standard service plans.
Procedure
1. Create a customer incident through ServiceNow using the component DS-PROV. Set the priority to High, and ask SAP
support to restore the impacted SAP Datasphere tenant. You must provide the tenant URL.
Once completed, SAP Support informs you that the impacted tenant has been restored and unlocked successfully.
h. Select Save.
i. Copy and save the OAuth Client ID and OAuth Client Secret for Step 3.
3. Create an OAuth Client in the impacted SAP Datasphere tenant. You can name it something like
DSP_SERVICE_INSTANCE_LINKING.
4. Fetch your access token via http POST request to the OAuth Client Token URL.
The Token URL is displayed on the App Integration tab, above the list of Configured Clients.
Replace <TokenURL> with your OAuth Client Token URL. Replace <OAuthClientSecret> with the OAuth Client
Secret. The secret must be Base64 encoded.
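A minimal sketch of such a token request, assuming an OAuth client using the client-credentials grant (all values below are hypothetical; check your client's actual grant type and Token URL):

```shell
# Hypothetical values: substitute your OAuth client ID, secret, and Token URL.
CLIENT_ID="sb-example-client"
CLIENT_SECRET="example-secret"
TOKEN_URL="https://example.authentication.eu10.hana.ondemand.com/oauth/token"

# Base64-encode "<client_id>:<client_secret>" for the HTTP Basic Authorization header.
AUTH=$(printf '%s:%s' "$CLIENT_ID" "$CLIENT_SECRET" | base64 | tr -d '\n')
echo "$AUTH"

# The request itself (commented out so the sketch is self-contained):
# curl -s -X POST "$TOKEN_URL" \
#   -H "Authorization: Basic $AUTH" \
#   -d "grant_type=client_credentials"
```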
b. Go to System About .
6. Create a new BTP service instance for SAP Datasphere and link it to the impacted SAP Datasphere tenant.
b. Navigate to the subaccount where the deleted SAP BTP service instance was assigned.
d. Click Create.
i. Click Next.
{
"tenantUuid": "<TenantUUID>",
"access_token": "<AccessToken>"
}
Replace <TenantUUID> with the ID that you retrieved in Step 4c. Replace <AccessToken> with the token that you
fetched in Step 3b.
l. Click Next.
n. Back in the SAP Datasphere tenant, go to System Administration and delete the OAuth Client named
DSP_SERVICE_INSTANCE_LINKING, previously created in Step 3.
Results
A new service instance is created and linked to the SAP Datasphere tenant that was accidentally deleted. All tenant data is
restored.
Elastic compute nodes take over peak read loads and support the SAP HANA Cloud database.
Note
Users of SAP Datasphere can consume data via elastic compute nodes only in SAP Analytics Cloud (via a live connection) and
Microsoft Excel (via an SAP add-in).
Using elastic compute nodes can lower the overall cost of ownership: instead of sizing your tenant on the basis of the maximum
load, you can use elastic compute nodes to handle short periods of exceptional peak load. For example, you can use an elastic
compute node for two months in the year to support end-of-year reporting, or you can use an elastic compute node to cover a
specific eight-hour period in the working day.
To identify peak loads, you can look at the following areas in the System Monitor: out-of-memory widgets in the Dashboard tab,
key figures in Statement Logs, views used in MDS statements in Statement Logs. See Monitoring SAP Datasphere.
Depending on the resources allocated to your tenant in the Tenant Configuration page, the administrator decides how many
compute blocks they will allocate to elastic compute nodes. See Configure the Size of Your SAP Datasphere Tenant.
Create an Elastic Compute Node
Prerequisites
To create and manage elastic compute nodes, you must have the following privileges:
Space Files (-------M) - To add spaces and objects to an elastic compute node
System Information (--U-----) - To access tenant settings needed to manage elastic compute nodes
The DW Administrator global role, for example, grants these privileges (see Roles and Privileges by App and Feature).
You can select the following objects for an elastic compute node: perspectives, analytic models, and views of type analytical
dataset that are exposed for consumption. To make the data of the objects available for consumption, their sources - persisted
views, local tables, and, if enabled, open SQL schema tables and HDI container tables - are replicated to the elastic compute node.
Users of SAP Analytics Cloud and Microsoft Excel (with the SAP add-in) will then automatically benefit from the improved
performance of the elastic compute nodes when consuming data exposed by SAP Datasphere. See Consuming Data Exposed by
SAP Datasphere.
2. In the Create Elastic Compute Node dialog, enter the following properties, and then click Create:
Property Description
Business Name Enter the business name of the elastic compute node. It can contain a maximum of 30 characters,
including spaces and special characters.
Technical Name Enter the technical name of the elastic compute node. The technical name must be unique. It can only
contain lowercase letters (a-z) and numbers (0-9). It must contain the prefix: ds (which helps to
identify elastic compute nodes in monitoring tools). The minimum length is 3 and the maximum length
is 9 characters. See Rules for Technical Names.
Note
As the technical name will be displayed in monitoring tools, including SAP internal tools, we
recommend that you do not mention sensitive information in the name.
Performance Class The performance class, which has been selected beforehand for all elastic compute nodes, is displayed
and you cannot modify it for a particular elastic compute node.
Note
The performance class is selected when purchasing additional resources in the Tenant
Configuration page (see Configure the Size of Your SAP Datasphere Tenant) and applies to all
elastic compute nodes. The default performance class is High Compute and you may want change it
in specific cases. For example, if you notice that the memory usage is high and the CPU usage is low
during the runtime and you want to save resources, you can select another the performance class,
which will change the memory/CPU ratio.
If the performance class is changed in the Tenant Configuration page and you want to edit your
elastic compute node by selecting it and clicking Configure, you will be asked to select the changed
performance class.
Compute Blocks Select the number of compute blocks. You can choose 4, 8, 12, or 16 blocks. The amount of memory and
vCPU depends on the performance class you choose:
Memory: 1 vCPU and 16 GB RAM per block
Default: 4
The number of GB for memory and storage and the number of CPU are calculated based on the
compute blocks and you cannot modify them.
Note
You can modify the number of compute blocks later on by selecting the elastic compute node and
clicking Configure.
The price you pay for additional resources depends on the compute blocks and the performance class.
If a node that includes 4 compute blocks runs for 30 minutes, you pay for 2 block-hours.
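The naming rules, per-block sizing, and block-hour pricing described above can be sketched as follows. This is an illustrative Python sketch assuming the rules stated in the table (lowercase letters and digits, "ds" prefix, 3 to 9 characters, 1 vCPU and 16 GB RAM per block, billing as blocks multiplied by runtime hours); it is not an official SAP API.

```python
import re

# Technical name rules, as stated in the table above: lowercase letters
# and digits only, "ds" prefix, 3 to 9 characters in total.
TECHNICAL_NAME = re.compile(r"^ds[a-z0-9]{1,7}$")

def is_valid_technical_name(name: str) -> bool:
    return TECHNICAL_NAME.fullmatch(name) is not None

# Per-block sizing for the Memory ratio stated above: 1 vCPU and 16 GB RAM per block.
def node_resources(compute_blocks: int) -> dict:
    return {"vcpu": compute_blocks * 1, "ram_gb": compute_blocks * 16}

# Pricing is assumed to be compute blocks x runtime in hours ("block-hours"),
# per the example in the text.
def block_hours(compute_blocks: int, runtime_minutes: float) -> float:
    return compute_blocks * (runtime_minutes / 60)

print(is_valid_technical_name("dsnode01"))  # True: ds prefix, 8 characters
print(is_valid_technical_name("DSNODE"))    # False: uppercase not allowed
print(node_resources(4))                    # {'vcpu': 4, 'ram_gb': 64}
print(block_hours(4, 30))                   # 2.0 block-hours, matching the example
```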
Select the spaces and objects whose data you want to make available in an elastic compute node. The data of the objects you've
selected, which is stored in local tables, persisted views, and, if enabled, open SQL schema tables and HDI container tables, will be
replicated to the node and available for consumption when the elastic compute node is run.
1. In the side navigation area, click (Space Management), then select the elastic compute node.
2. Click Add Spaces, then in the dialog box, select the spaces that contain objects whose data you want to make available in
an elastic compute node and click Add Spaces.
Note
File spaces are not displayed in the dialog box as they cannot be added to an elastic compute node.
The number of spaces added to the elastic compute node is displayed in the list of nodes on the left part of the screen.
By default, all current and future exposed objects of the selected spaces are automatically assigned to the elastic compute
node and All Exposed Objects is displayed in the space tile.
You can deactivate the automatic assignment and manually select the objects.
There are 3 types of exposed objects: analytic models, perspectives, and views (of type analytical dataset, exposed for consumption). See Consuming Data Exposed by SAP Datasphere.
3. To manually select the objects of a space, select the space and click Add Objects. Uncheck Add All Objects Automatically,
then select the objects you want and click Add Objects.
All the objects added across all the added spaces are displayed in the Exposed Objects tab, whether they've been added manually or automatically via the option All Exposed Objects.
Note
To enable the following tables to be replicated:
Open SQL schema tables - see Allow the Space to Access the Open SQL Schema
HDI container tables - see Prepare Your HDI Project for Exchanging Data with Your Space
Note
Remote Tables - Data that is replicated from remote tables in the main instance cannot be replicated to an elastic compute
node. If you want to make data from a replicated remote table available in an elastic compute node, you should build a view on
top of the remote table and persist its data in the view (see Persist Data in a Graphical or SQL View). You should then make sure
that the object (analytic model, perspective or view) does not consume the remote table but now consumes the persisted view.
Shared Table Example - Making data from a shared table available in an elastic compute node:
The IT space shares the Products table with the Sales space.
The analytical model in the Sales space uses the shared Products table as a source.
If you want the Products table to be replicated to an elastic compute node, you need to add to the node both the Sales
space and the IT space. The shared Products table will not be replicated to the node if you only add the Sales space.
1. In the side navigation area, click (Space Management), then select the elastic compute node.
All spaces and their objects are removed from the elastic compute node.
3. To remove one or more objects that you've manually added, in the Exposed Objects tab, select one or more objects and
click Remove Objects.
The Delete button is disabled if the status of the elastic compute node is Running.
Not Ready - The node cannot be run because no spaces or objects are assigned to it.
Ready - Spaces or objects are assigned to the node, which can be run, either by starting the run manually or scheduling it.
The status displayed in grey indicates that the elastic compute node has never run whereas green indicates that it has
already run.
Starting - You’ve started the elastic compute node manually by clicking the Start button or it has been started via a
schedule: persisted views and local tables are being replicated and routing is created to the elastic compute node.
Starting Failed (displayed in red) - You’ve started the elastic compute node manually by clicking the Start button or it has been started via a schedule: issues have occurred. You can start the elastic compute node again.
Updating - You’ve started the elastic compute node manually by clicking the Update button: persisted views and local
tables that have failed to be replicated are now replicated and routing is created to the elastic compute node.
Running - The node is in its running phase: the data that has been replicated during the starting phase can be consumed in SAP Analytics Cloud for the spaces and objects specified.
Note
The Running status displayed in red indicates that the elastic compute node contains issues. We recommend that you stop and restart the node or, alternatively, that you stop and delete the node and create a new one.
Stopping - You’ve stopped the elastic compute node manually by clicking the Stop button or it has been stopped via a
schedule: persisted view replicas, local table replicas and routing are being deleted from the node.
Stopping Failed (displayed in red) - You’ve stopped the elastic compute node manually by clicking the Stop button or it has been stopped via a schedule: issues have occurred. You can stop the elastic compute node again.
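The status lifecycle described above can be summarized as a small state-transition sketch. This is a hypothetical model inferred from the status descriptions; SAP Datasphere does not expose such an API, and the transitions are an assumption for illustration only.

```python
# Hypothetical transition table inferred from the status descriptions above.
TRANSITIONS = {
    "Not Ready": ["Ready"],                       # spaces/objects assigned
    "Ready": ["Starting"],                        # manual start or schedule
    "Starting": ["Running", "Starting Failed"],   # replication and routing
    "Starting Failed": ["Starting"],              # start the node again
    "Updating": ["Running"],                      # re-replicate failed objects
    "Running": ["Stopping", "Updating"],
    "Stopping": ["Ready", "Stopping Failed"],     # replicas and routing removed
    "Stopping Failed": ["Stopping"],              # stop the node again
}

def can_transition(src: str, dst: str) -> bool:
    return dst in TRANSITIONS.get(src, [])

print(can_transition("Ready", "Starting"))    # True
print(can_transition("Running", "Starting"))  # False: a running node is stopped first
```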
Note
Up to 4 elastic compute nodes can run at the same time.
Updates of local tables or persisted views while an elastic compute node is running - An elastic compute node is in its running
phase, which means that its local tables and persisted views have been replicated. Here is the behavior if these objects are
updated while the node is running:
If local table data is updated, it is updated on the main instance and the local table replica is also updated in parallel on the elastic compute node. The runtime may take longer and more memory may be consumed.
If persisted view data is updated, it is first updated on the main instance, then as a second step the persisted view replica
is updated on the elastic compute node. The runtime will take longer, and more memory and compute will be consumed.
If local table or persisted view metadata is changed (a new column is added, for example) or the object is deleted from the main instance, the local table replica or the persisted view replica is deleted from the elastic compute node. The data of these objects is therefore read from the main instance and not from the elastic compute node.
To create and manage elastic compute nodes, you must have the following privileges:
Space Files (-------M) - To add spaces and objects to an elastic compute node
System Information (--U-----) - To access tenant settings needed to manage elastic compute nodes
The DW Administrator global role, for example, grants these privileges (see Roles and Privileges by App and Feature).
1. In the side navigation area, click (Space Management), then select the elastic compute node.
2. Click Start.
1. In the side navigation area, click (Space Management), then select the elastic compute node.
2. Click Stop.
1. In the side navigation area, click (Space Management), then select the elastic compute node.
3. In the Create Schedule dialog, specify the options of the schedule, just like for any other integration task. See Schedule a
Data Integration Task (Simple Schedule) and Schedule a Data Integration Task (with Cron Expression).
4. In addition, specify in the Duration area the total number of hours and minutes of an elastic compute node run, from the
starting to the stopping stages.
Example - The elastic compute node is scheduled to run on the first day of every month for a duration of 72 hours (uptime of 3
days).
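The example above could be encoded as follows. The cron expression uses standard crontab field order as an illustration; the exact cron dialect accepted by SAP Datasphere may differ (see the scheduling topics referenced above), and the dictionary shape is hypothetical.

```python
# Hypothetical encoding of the example schedule: run on the first day of
# every month at midnight, for a total duration of 72 hours.
schedule = {
    "cron": "0 0 1 * *",   # minute hour day-of-month month day-of-week
    "duration_hours": 72,  # from the starting to the stopping stages
}

uptime_days = schedule["duration_hours"] / 24
print(f"Node uptime per run: {uptime_days} days")  # 3.0 days, as in the example
```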
Once you've created the schedule, a schedule icon is displayed next to the elastic compute node in the list of nodes in the left-hand
side area of the Space Management.
You can then perform the following actions for the schedule by clicking Schedule: edit, pause, resume, delete or take over the
ownership of the schedule (see Scheduling Data Integration Tasks).
1. In the side navigation area, click (Space Management), then select the elastic compute node.
2. Click Update.
1. In the side navigation area, click (Space Management), then select the elastic compute node.
2. Click View Logs.
The Statement Logs tab of the System Monitor opens, displaying information filtered on the elastic compute node. For
more information about logs in the System Monitor, see Monitoring SAP Datasphere.
If local tables or persisted views were not replicated, you can go back to the elastic compute node and update it to replicate
them.
Note
To monitor the start and stop runs for all elastic compute nodes, you can click View Logs in the left-hand area of the Space
Management.
You can monitor key figures related to an elastic compute node (such as start and end time of the last run; amount of memory
used for data replication), in the Elastic Compute Nodes tab of the System Monitor (see Monitoring SAP Datasphere).
Context
You can add a tenant type indicator to show all users which system they are using. For example, it allows users to differentiate between a test and a production system. When enabled, a colored information bar is visible to all users of the tenant, and the browser favicon is updated with the matching color.
Procedure
1. Go to System Configuration System Information .
2. If you have not set system information before, select Customize Visual Settings. If you have previously set system
information, select Edit.
3. Select a tenant type from the list. If you select a Custom type, you must add a Title. The tenant type will be displayed in the
information bar.
4. Select a color.
A preview of the favicon and information bar will be displayed.
5. Select Confirm.
Results
The tenant information that you set is displayed to all users above the shell bar. For example:
Context
Automated database upgrades are not impacted by your ability to upgrade your patch version manually. You can follow this
procedure in cases where a patch upgrade resolves an issue with the previous patch version.
Note
To upgrade the SAP HANA database, you must have a global role that grants you the privilege System Information with the
Update permission. The DW Administrator global role, for example, grants this privilege (see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere).
This task is limited to patch upgrades. For example, if your current database version is 2024.28.3 and the next patch version, 2024.28.4, is available, you can upgrade. You cannot go from version 2024.28.4 to 2024.29.0, because that is a larger upgrade, not a patch upgrade.
Procedure
1. In the side navigation area, click System About.
A confirmation dialog is shown informing you that the upgrade will cause a short downtime where SAP Datasphere is not
connected to SAP HANA.
The patch upgrade begins. When the patch is finished, you'll receive a notification.
Note
Allowed authentication methods for SAP Datasphere tenants before or after the bundling with SAP Cloud Identity Services
tenants feature was enabled:
Existing tenants: the default IdP, a custom IdP, or bundled SAP Cloud Identity Services tenants.
New tenants: the default IdP or bundled SAP Cloud Identity Services tenants, which support forwarding all SSO requests to a corporate IdP.
Note
Bundling with SAP Cloud Identity Services tenants is being rolled out over the course of a number of versions. For more details,
see SAP Note 3619907.
To allow users to sign in to your SAP Datasphere tenant and to other SAP products via single sign-on (SSO), you can provision an
SAP Cloud Identity Services tenant for your SAP Datasphere tenant (see Configure Your Bundled SAP Cloud Identity Services
Tenant). SAP Cloud Identity Services tenants support forwarding all SSO requests to a corporate IdP (see What Are Cloud Identity
Services).
If you are having trouble signing in, you can use the Identity Provider Administration tool to repair your custom IdP. For more
information, see Access the Identity Provider Administration Tool.
Configure Authentication
Modify Your Authentication Setup
Disable Your SAP Cloud Identity Services Tenant and Revert to Default IdP
Disable Your SAP Cloud Identity Services Tenant and Revert to Your Custom IdP
Note
Bundling with SAP Cloud Identity Services tenants is being rolled out over the course of a number of versions. For more details,
see SAP Note 3619907.
Note
If you currently use a custom IdP, we recommend that you migrate to an SAP Cloud Identity Services tenant and configure the
tenant to work with your corporate IdP (see Forward All SSO Requests to Corporate IdP). Using SAP Cloud Identity Services to
federate the identity of a custom IdP brings several benefits (see What Are Cloud Identity Services).
You provision and configure a bundled SAP Cloud Identity Services tenant for your SAP Datasphere tenant to allow users to sign in
via single sign-on (SSO) to SAP Datasphere and to other SAP products that use the same SAP Cloud Identity Services tenant. For
more information about bundles, see Bundles.
Prerequisites
To configure your bundled SAP Cloud Identity Services tenant for your SAP Datasphere tenant:
You must have the system owner role for your SAP Datasphere tenant and have multi-factor authentication enabled (see
Multi-Factor Authentication)
You must have an S-user with the same email as the system owner of the SAP Datasphere tenant. If you do not have an S-user, click the Register button and create a user with the email address used by the system owner.
1. Sign in to the Identity Provider Administration tool (see Access the Identity Provider Administration Tool).
All tenants that you own will appear as cards on the My Tenants page. Under Current IdP, the name and type of
authentication method used will appear: default, bundled, or custom.
2. On the tenant card that you want to bundle, select + Add SAP Cloud Identity Services.
All the SAP Cloud Identity Services tenants that you own will be listed.
3. Select an existing SAP Cloud Identity Services tenant or + Provision New SCI Tenant.
Option Description
Provision a New SAP Cloud Identity Services Tenant
a. If you are provisioning a new SAP Cloud Identity Services tenant, enter information for the SAP Cloud Identity Services tenant administrator's first name, last name, and email. By default, the fields will be populated with your information, but the fields are editable in case you want to provide different information.
b. Click Step 3.
When you provision a new tenant, its tenant role will match the tenant type used by SAP
Datasphere. For example, if your SAP Datasphere tenant is a test system, the SAP Cloud Identity
Services tenant will be assigned the tenant role: Test.
Note
This option is only available if you do not have an existing SAP Cloud Identity Services tenant
with a role that matches your SAP Datasphere tenant type.
Use an Existing SAP Cloud Identity Services Tenant
a. Select the SAP Cloud Identity Services tenant you want to use with the authentication bundle.
b. Click Step 2. A summary page will appear containing the tenant name and information about the tenant administrator.
Note
If you do not already have a user on the SAP Cloud Identity Services tenant, a standard user will be created for you. A standard user does not have administration rights on the tenant, and you must request permissions from an existing administrator if you want to make changes to the tenant in the future.
4. Select Finish.
Tenant provisioning can take up to 1 hour to complete. A progress bar will indicate your provisioning status. It will change to
a success message once the provisioning is complete.
If you select Close to leave the progress dialog before it is complete, tenant provisioning will continue in the background. If
you return to the My Tenants page, the tenant's provisioning status will appear on the tenant card. You can click the status
to return to the progress bar.
You then need to configure authentication to work with your SAP Cloud Identity Services tenant.
Configure Authentication
As a prerequisite, an SAP Cloud Identity Services tenant must be provisioned for the selected SAP Datasphere tenant.
a. From the SAP Cloud Identity Services tenant provisioning progress bar, select Continue.
b. From the My Tenants page, go to the tenant you are bundling, then select Configure Authentication .
The attribute will be used to map users from your existing user list to SAP Datasphere.
Determine what your Subject Name Identifier maps to in your SAP Datasphere system. It should map to User ID,
Email or a custom attribute. You can view your SAP Datasphere user attributes in Security Users .
Note
Subject Name Identifier is case sensitive. The User ID, Email, or Custom Authentication User Mapping must match the values exactly. For example, if the Subject Name Identifier returned by your identity provider is user@[Link] and the email you used in SAP Datasphere is User@[Link], the mapping will fail.
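The exact-match behavior described in this note can be illustrated with a short sketch. The email values are hypothetical placeholders, and the comparison logic is an illustration of the stated rule, not SAP Datasphere's internal implementation.

```python
# Illustrative sketch of the case-sensitive mapping described above.
idp_subject = "user@example.com"       # Subject Name Identifier from the IdP (hypothetical)
datasphere_email = "User@example.com"  # email stored in SAP Datasphere (hypothetical)

# The values are compared exactly, so a difference in case fails the mapping:
print(idp_subject == datasphere_email)                  # False - mapping fails
print(idp_subject.lower() == datasphere_email.lower())  # True - but no normalization is applied
```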
Note
If you select this option, there will be a new column named SAML User Mapping in Security Users . After
switching to your SAML IdP, you must manually update this column for all existing users.
When dynamic user creation is enabled, new users will be automatically created using the default role and will be able to
sign into SAP Datasphere. After users are created, you can set roles using SAML attributes. For more information, see
Assign Users to a Role Using SAML Attributes.
Note
If this option is enabled, dynamic user creation still occurs in SAP Datasphere even when user attributes have not been
set for all SAP Cloud Identity Services tenant users. To prevent a user from being automatically created, your SAP Cloud
Identity Services tenant must deny the user access to SAP Datasphere.
4. Click Step 2.
5. [Optional] Do additional configuration in SAP Cloud Identity Services if you don't want to use the default settings.
SAP Datasphere should appear in the Bundled Applications list on your SAP Cloud Identity Services tenant.
For example, you can change the subject name identifier used by SAP Cloud Identity Services to match the attribute you
selected in Step 2. You can also configure the tenant to forward authentication from SAP Cloud Identity Services to a
corporate IdP. For more information, see Forward All SSO Requests to Corporate IdP.
6. Verify that you can sign in to your SAP Cloud Identity Services tenant: in another browser, sign in to the URL provided in the
Verify Your Account dialog, using your SAP Cloud Identity Services tenant credentials.
Note
You can copy the URL by selecting (Copy).
You must use a private session to sign into the URL; for example, Guest mode in Google Chrome. This ensures
that when you sign in to the dialog and select SAP Datasphere, you are prompted to sign in and do not reuse an
existing browser session.
7. On the Validate Login page, select Validate Login, then click Step 3.
8. Select Finish.
Applying the configuration can take up to 1 hour to complete. A progress bar will indicate the status. It will change to a success message once the operation is complete.
9. If you select Close to leave the progress dialog before it is complete, tenant provisioning will continue in the background. If you return to the My Tenants page, the tenant's provisioning status will appear on the tenant card. You can click the status to return to the progress bar.
You are taken to the My Tenants page, and the status of your tenant will be updated.
1. In the My Tenants page, go to the bundled tenant you want to modify, then select Configure Authentication .
2. In the Select User Attribute Type step, you can change the user attribute type, or enable dynamic user creation.
3. Click Step 2.
Disable Your SAP Cloud Identity Services Tenant and Revert to Default IdP
You can disable your bundled SAP Cloud Identity Services tenant and revert to using the default IdP.
Note
Users who sign in to SAP Datasphere will not be able to sign in to other SAP products via SSO.
1. Sign in to the Identity Provider Administration tool (see Access the Identity Provider Administration Tool).
2. In the My Tenants page, go to the bundled tenant you want to modify, then select Disable Bundling .
Disabling the bundled SAP Cloud Identity Services tenant can take up to 1 hour to complete. A progress bar will indicate your
provisioning status. It will change to a success message once the provisioning is complete.
If you select Close to leave the progress dialog before it is complete, tenant provisioning will continue in the background. If you
return to the My Tenants page, the tenant's provisioning status will appear on the tenant card. You can click the status to return to
the progress bar.
Disable Your SAP Cloud Identity Services Tenant and Revert to Your Custom IdP
You can disable your bundled SAP Cloud Identity Services tenant and revert to using your custom IdP only if your SAP Datasphere tenant was provisioned before bundling with SAP Cloud Identity Services tenants was enabled.
To establish a trust relationship between your SAML IdP and your SAP Datasphere tenant, you'll need to exchange their metadata.
Note
Users who sign in to SAP Datasphere will not be able to sign in to other SAP products via SSO.
1. Sign in to the Identity Provider Administration tool (see Access the Identity Provider Administration Tool).
2. In the My Tenants page, go to the bundled tenant you want to disable, then select Disable Bundling .
4. Click Download and save the file, which contains the SAP Datasphere metadata.
5. In your custom IdP, upload the file containing the SAP Datasphere metadata.
Configure your SAML IdP to map user attributes to the following case-sensitive allowlisted assertion attributes. We
recommend that you map only the user attributes and roles that will be used in SAP Datasphere. Mapping additional user
attributes may result in a large SAML assertion, which could produce a login error.
Groups Required. The value must be set to "sac", even in the case of SAP Datasphere. The Groups attribute is a custom attribute and must be added if it does not exist yet. You need to contact your administrator to get the path where the mapping needs to be changed.
displayName Optional.
functionalArea Optional.
preferredLanguage Optional.
<AttributeStatement>
  <Attribute Name="email">
    <AttributeValue>[Link]@[Link]</AttributeValue>
  </Attribute>
  <Attribute Name="givenName">
    <AttributeValue>Abc</AttributeValue>
  </Attribute>
  <Attribute Name="familyName">
    <AttributeValue>Def</AttributeValue>
  </Attribute>
  <Attribute Name="displayName">
    <AttributeValue>Abc Def</AttributeValue>
  </Attribute>
  <Attribute Name="Groups">
    <AttributeValue>sac</AttributeValue>
  </Attribute>
  <Attribute Name="custom1">
    <AttributeValue>Domain Users</AttributeValue>
    <AttributeValue>Enterprise Admins</AttributeValue>
    <AttributeValue>Enterprise Key Admins</AttributeValue>
  </Attribute>
</AttributeStatement>
Note
Map the Groups "sac" attribute under Default Attributes for your SAP Datasphere tenant. The remaining attributes
should be mapped under Assertion Attributes for your SAP Datasphere tenant.
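As an illustrative check, an AttributeStatement like the one above can be parsed to confirm that the required Groups attribute carries the value "sac". This sketch uses a simplified, namespace-free snippet and a hypothetical helper function; real SAML assertions are namespaced and are validated by the IdP and service provider, not by a script like this.

```python
import xml.etree.ElementTree as ET

# Simplified, namespace-free assertion fragment with hypothetical values.
ASSERTION = """
<AttributeStatement>
  <Attribute Name="Groups">
    <AttributeValue>sac</AttributeValue>
  </Attribute>
  <Attribute Name="email">
    <AttributeValue>user@example.com</AttributeValue>
  </Attribute>
</AttributeStatement>
"""

def attribute_values(xml_text: str, name: str) -> list:
    """Collect all AttributeValue texts for a given Attribute Name."""
    root = ET.fromstring(xml_text)
    return [
        value.text
        for attr in root.findall("Attribute")
        if attr.get("Name") == name
        for value in attr.findall("AttributeValue")
    ]

print(attribute_values(ASSERTION, "Groups"))           # ['sac']
print("sac" in attribute_values(ASSERTION, "Groups"))  # True: required mapping present
```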
8. In the Identity Provider Administration tool, click Upload and select the file containing the SAML metadata of your IdP.
9. Click Step 3.
The attribute will be used to map users from your existing user list to SAP Datasphere.
Determine what your Subject Name Identifier maps to in your SAP Datasphere tenant. It should map to User ID,
Email or a custom attribute. You can view your SAP Datasphere user attributes in Security Users .
Note
Subject Name Identifier is case sensitive. The User ID, Email, or Custom Authentication User Mapping must match the values exactly. For example, if the Subject Name Identifier returned by your identity provider is user@[Link] and the email you used in SAP Datasphere is User@[Link], the mapping will fail.
Note
If you select this option, there will be a new column named SAML User Mapping in Security Users . After
switching to your SAP Cloud Identity Services tenant, you must manually update this column for all existing
users.
When dynamic user creation is enabled, new users will be automatically created using the default role and will be able to
sign into SAP Datasphere. After users are created, you can set roles using SAML attributes. For more information, see
Assign Users to a Role Using SAML Attributes.
Note
If this option is enabled, dynamic user creation still occurs in SAP Datasphere even when user attributes have not been
set for all SAP Cloud Identity Services tenant users. To prevent a user from being automatically created, your SAP Cloud
Identity Services tenant must deny the user access to SAP Datasphere.
13. Verify that you can sign in to your custom IdP: in another browser, sign in to the URL provided in the Verify Your Account
dialog, using your custom IdP credentials.
Note
You can copy the URL by selecting (Copy).
You must use a private session to sign into the URL; for example, Guest mode in Google Chrome. This ensures
that when you sign in to the dialog and select SAP Datasphere, you are prompted to sign in and do not reuse an
existing browser session.
14. On the Validate Login page, select Validate Login, then click Step 5.
Disabling the bundled SAP Cloud Identity Services tenant can take up to 1 hour to complete. A progress bar will indicate
your provisioning status. It will change to a success message once the provisioning is complete.
16. If you select Close to leave the progress dialog before it is complete, tenant provisioning will continue in the background. If you return to the My Tenants page, the tenant's provisioning status will appear on the tenant card. You can click the status to return to the progress bar.
Prerequisites
Note
Bundling with SAP Cloud Identity Services tenants is being rolled out over the course of a number of versions. For more details,
see SAP Note 3619907.
To configure your bundled SAP Cloud Identity Services tenant for your SAP Datasphere tenant:
You must have the system owner role for your SAP Datasphere tenant and have multi-factor authentication enabled (see
Multi-Factor Authentication)
You must have an S-user with the same email as the system owner of the SAP Datasphere tenant. If you do not have an S-user, click the Register button and create a user with the email address used by the system owner.
Procedure
1. Access the Identity Provider Administration tool using the following URL: [Link]<data center>.[Link]/idp-admin/
For example, if your SAP Datasphere system is on eu10, then the URL is: [Link]
2. Log in with an S-user that has the same email address as the system owner of the SAP Datasphere tenant. If you don't yet have such an S-user, you can click the Register button and create a P-user.
If you create a new P-user, you'll receive an email with an activation link that will let you set your password.
3. Once you're signed in, the list of SAP Datasphere tenants for which you are the system owner is displayed. Select the tenant
you want to work on by clicking on the card.
Note
Your SAP Datasphere tenant is connected to the Identity Provider Administration tool by default. If you'd like to
disconnect your tenant from the console, you can do so in either of two places:
In SAP Datasphere, navigate to System Administration Security Optional: Configure Identity Provider
Administration Tool , click the Connected switch, and then save the changes.
Click Disconnect IdP Admin from your system after selecting your tenant in the Identity Provider Administration
tool.
Prerequisites
SAP Datasphere can be hosted on non-SAP data centers.
You must be the system owner of the SAP Datasphere tenant. For more information see Transfer the System Owner Role.
If your users are connecting from Apple devices using the mobile app, the certificate used by your IdP must be compatible
with Apple's App Transport Security (ATS) feature.
Context
A custom identity provider is a separate solution, for example Azure AD, and is not part of SAP Analytics Cloud or SAP Datasphere. Therefore, the configuration change is applied directly in that solution, not within SAP Datasphere. No access to SAP Datasphere is required to make the change, only access to the IdP.
Note
Be aware that the SAML attributes for SAP Datasphere roles do not cover user assignment to spaces. A user who logs into an SAP Datasphere tenant through SSO must be assigned to a space in order to access it. If you do not assign a user to a space, the user will not have access to any space.
Procedure
1. From the side navigation, go to (System) → (Administration) → Security.
2. Select (Edit).
3. In the Authentication Method area, select SAML Single Sign-On (SSO) if it is not already selected.
Note
By default, SAP Cloud Identity is used for authentication.
An SAP Datasphere metadata file is saved.
If you are creating a new SAP Datasphere application on the Identity Authentication Service (IAS) side with the type
"unknown", set the type to "Unknown".
The file includes metadata for SAP Datasphere, and is used to create a trust relationship between your SAML Identity
Provider and your SAP Datasphere system.
Configure your SAML IdP to map user attributes to the following case-sensitive allowlisted assertion attributes. We recommend that you map only the user attributes and roles that will be used in SAP Datasphere. Mapping additional user attributes may result in a large SAML assertion, which could produce a login error.
Groups Required. The value must be set to "sac", even in the case of SAP Datasphere. The Groups attribute is a custom attribute and must be added if it does not exist yet. You need to contact your administrator to get the path where the mapping needs to be changed.
displayName Optional.
functionalArea Optional.
preferredLanguage Optional.
<AttributeStatement>
  <Attribute Name="email">
    <AttributeValue>[Link]@[Link]</AttributeValue>
  </Attribute>
  <Attribute Name="givenName">
    <AttributeValue>Abc</AttributeValue>
  </Attribute>
  <Attribute Name="familyName">
    <AttributeValue>Def</AttributeValue>
  </Attribute>
  <Attribute Name="displayName">
    <AttributeValue>Abc Def</AttributeValue>
  </Attribute>
  <Attribute Name="Groups">
    <AttributeValue>sac</AttributeValue>
  </Attribute>
  <Attribute Name="custom1">
    <AttributeValue>Domain Users</AttributeValue>
    <AttributeValue>Enterprise Admins</AttributeValue>
    <AttributeValue>Enterprise Key Admins</AttributeValue>
  </Attribute>
</AttributeStatement>
Note
If you are using the SAP Cloud Identity Authentication service as your IdP, map the Groups "sac" attribute under
Default Attributes for your SAP Datasphere tenant. The remaining attributes should be mapped under Assertion
Attributes for your SAP Datasphere tenant.
8. In Step 2, select Upload, and choose the metadata file you downloaded from your SAML IdP.
The attribute will be used to map users from your existing SAML user list to SAP Datasphere. NameID is used in your custom
SAML assertion:
Determine what your NameID maps to in your SAP Datasphere system. The user attribute you select must match the User ID,
Email, or a custom attribute. You can view your SAP Datasphere user attributes in Security → Users.
Note
NameID is case-sensitive. The User ID, Email, or Custom SAML User Mapping must match the values in your SAML IdP
exactly. For example, if the NameId returned by your SAML IdP is user@[Link] and the email you used in SAP
Datasphere is User@[Link], the mapping will fail.
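The exact-match requirement above can be illustrated with a short sketch (the addresses are placeholders for illustration, not values from your system):

```python
def name_id_matches(idp_name_id: str, datasphere_value: str) -> bool:
    """SAML user mapping requires an exact, case-sensitive match."""
    return idp_name_id == datasphere_value

# Placeholder addresses for illustration only.
print(name_id_matches("user@example.com", "user@example.com"))  # True
print(name_id_matches("user@example.com", "User@example.com"))  # False
```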
Note
If your NameID email is case-insensitive and contains mixed case, for example User@[Link],
consider choosing Custom SAML User Mapping instead.
Note
If you select this option, a new column named SAML User Mapping appears in Security → Users.
After switching to your SAML IdP, you must manually update this column for all existing users.
Note
If you are using a live connection to SAP S/4HANA Cloud Edition with OAuth 2.0 SAML Bearer Assertion, NameId must
be identical to the user name of the business user on your SAP S/4HANA system.
For example, if you want to map an SAP Datasphere user with the user ID SACUSER to your SAP S/4HANA Cloud user
with the user name S4HANAUSER, you must select Custom SAML User Mapping and use S4HANAUSER as the Login
Credential in Step 10.
If you are using SAP Cloud Identity as your SAML IdP, you can choose Login Name as the NameID attribute for SAP
Datasphere, then you can set the login name of your SAP Datasphere user as S4HANAUSER.
When dynamic user creation is enabled, new users will be automatically created and assigned the default role and will be
able to use SAML SSO to log onto SAP Datasphere. Once the users are created, you can assign roles using SAML attributes
(see Assign Users to a Role Using SAML Attributes).
Note
Automatic user deletion is not supported. If a user in SAP Datasphere is removed from your SAML IdP, you must go to
Security → Users and manually delete the user. For more information, see Delete Users.
If this option is enabled, dynamic user creation still occurs in SAP Datasphere even when SAML user attributes have not
been set for all IdP users. To prevent a user from being automatically created, your SAML IdP must deny the user access
to SAP Datasphere.
This value must identify the SAP Datasphere system owner. The Login Credential provided here is automatically set for
your user.
Note
The Login Credential depends on the User Attribute you selected under Step 3.
12. Test the SAML IdP setup by logging in with your IdP, and then clicking Verify Account to open a dialog for validation.
In another browser, log on to the URL provided in the Verify Your Account dialog, using your SAML IdP credentials. You can
copy the URL by selecting (Copy).
You must use a private session to log onto the URL; for example, guest mode in Chrome. This ensures that when you log on
to the dialog and select SAP Datasphere, you are prompted to log in and do not reuse an existing browser session.
Note
When starting the verification step, you will see a new screen when logging into SAP Datasphere. Two links will be
displayed on this page. One will link to your current IdP and the other will link to the new IdP you will switch to. To
perform the Verify Account step, use the link for the new IdP. Other SAP Datasphere users can continue logging on with
the current IdP. Once you have completed Step 16 and the IdP switch has completed, this screen will no longer appear.
If the verification was successful, a green border should appear around the Login Credential box.
The URL should link to the password management page of your SAML IdP.
15. [Optional] Configure the logout by choosing one of the following logout options:
Application log out: Log out of SAP Datasphere and remain signed in to your IdP system.
Note
By default, when users log out of SAP Datasphere, they are automatically logged out of their SAML IdP.
17. Select Convert.
When the conversion is complete, you will be logged out and directed to the logon page of your SAML IdP.
18. Log on to SAP Datasphere with the credentials you used for the verification step.
19. From the side navigation, go to (Security) → (Users), and look for the column of the User Attribute you selected in
step 9.
The values in this column should be a case-sensitive match with the NameId sent by your IdP's SAML assertion.
Note
If you selected Custom SAML User Mapping as User Attribute, you must manually update all fields in the SAML User
Mapping column.
Results
Users will be able to use SAML SSO to log onto SAP Datasphere.
Note
You can also set up your IdP with your Public Key Infrastructure (PKI) so that your users are automatically logged in with a
client-side X.509 certificate.
Next Steps
Switching to a Different Custom IdP: If SAML SSO is enabled and you would like to switch to a different SAML IdP, you can repeat
the above steps using the new SAML IdP metadata.
A common use case is to upload new metadata from your identity provider when a new signing certificate has been generated.
If you can't sign in to SAP Datasphere, you can use the Identity Provider Administration tool to upload new metadata or download
your tenant signing certificate.
Prerequisites
You must have the metadata file that contains the new certificate from your custom IdP, and you must be logged into SAP
Datasphere before your IdP switches over to using the new certificate.
Update the SAML IdP Signing Certificate Using the Security Page
Upload new metadata to reconfigure trust between your custom IdP and your SAP Datasphere system.
2. Select (Edit).
3. Under Step 2, select Update and provide the new metadata file.
4. Select (Save) and confirm the change to complete the update.
Update the SAML IdP Signing Certificate Using the Identity Provider Administration
Tool
Upload new metadata to reconfigure trust between your custom IdP and your SAP Datasphere system using the Identity Provider
Administration tool.
1. Sign in to the Identity Provider Administration tool (see Access the Identity Provider Administration Tool).
2. On the card for the tenant that you want to update, select Repair IdP .
3. Select Upload new metadata for the current custom identity provider.
4. Click Browse to select the new metadata file for your current custom identity provider.
5. Click Upload File to upload the provided metadata file. After the upload is successful, it can take up to five minutes for the
new metadata file to be applied.
7. Click Log into SAP Datasphere to open a new tab and navigate to your SAP Datasphere system.
If you have any sign in problems related to the identity provider configuration, as opposed to a user-specific problem, you can
return to the Identity Provider Administration tool and either re-upload the metadata file or revert to the default identity provider.
For more information, see Revert to Default Authentication (Legacy Custom IdP).
Reacquire the SAP Datasphere SAML Signing Certificate Using the Identity Provider
Administration Tool
If you need to reacquire your system metadata, it can be downloaded from the Identity Provider Administration tool.
1. Sign in to the Identity Provider Administration tool (see Access the Identity Provider Administration Tool).
2. On the card for the tenant that you want to use, select Repair IdP .
To revert your custom IdP authentication back to the default method, you can use the Security page in SAP Datasphere. If you are
having problems signing in, you can use the Identity Provider Administration tool.
2. Select (Edit) .
The system will perform a check to see if all users also exist in SAP Cloud Identity.
If users do not exist in SAP Cloud Identity, you will be prompted to Synchronize Users.
Note
A user validation error may occur if users do not have a valid email address, or if duplicate email addresses are found.
User validation errors must be corrected on the Users page before synchronization can be completed.
4. Select (Save) .
When conversion is complete, you will be logged out and directed to the SAP Cloud Identity logon page.
1. Sign in to the Identity Provider Administration tool. For more information, see Access the Identity Provider Administration
Tool.
2. On the card for the tenant that you want to revert, select Repair IdP .
5. Select Yes in the confirmation dialog to revert your authentication method back to the default IdP.
The system will perform a check to see if all custom SAML IdP users also exist in SAP Cloud Identity. If any users do not
exist in SAP Cloud Identity, you will be prompted to Synchronize Users. A progress bar will track the status of the
synchronization process.
Note
A user validation error may occur if users do not have a valid email address, or if duplicate email addresses are found.
User validation errors must be corrected before synchronization can be completed.
9. Click Log into SAP Datasphere to open a new tab and navigate to your SAP Datasphere system. Log in with your default
identity provider credentials. If you get an error saying “Your profile is not configured”, please create a support ticket under
the component LOD-ANA-BI.
Once the reversion has finished, you can exit the Identity Provider Administration tool and do additional configuration in SAP
Datasphere.
Prerequisites
To manage users, you must have a global role that grants you the following privileges:
User (CRUD----) - To access the (Users) area in the (Security) tool and to create, update, and delete users.
The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.
Creating Users
You can create users in the following ways:
Import multiple users from a CSV file (see Import or Modify Users from a File).
Modifying Users
You can modify existing users in the following ways:
Export user data to a CSV file, to synchronize with other systems (see Export Users).
Update the email address a user logs on with (see Update a User Email Address).
Create a User
You can create individual users in SAP Datasphere.
Prerequisites
You can select one or more roles while you're creating the user. Before you start creating users, you may want to become
familiar with global roles and scoped roles. You can still assign roles after you've created the users.
Global Roles: A role that enables users assigned to it to perform actions that are not space-related, typically a role that
enables administration of the tenant. A standard or custom role is considered global when it includes global privileges.
See Managing Roles and Privileges.
Scoped Roles: A role that inherits a set of scoped privileges from a standard or custom role and grants these privileges
to users for use in the assigned spaces. See Create a Scoped Role to Assign Privileges to Users in Spaces.
Context
The method described here assumes that SAP Datasphere is using its default authentication provider. If you are using a custom
SAML Identity Provider, you must provide slightly different information, depending upon how your SAML authentication is
configured.
Procedure
1. Go to (Expand) (Security) (Users).
Each user needs a unique ID. Only alphanumeric and underscore characters are allowed, and the maximum length is 20
characters.
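The ID rule above can be expressed as a simple check (a sketch for illustration, not an SAP API):

```python
import re

# User IDs: alphanumeric and underscore only, maximum 20 characters.
USER_ID_RE = re.compile(r"[A-Za-z0-9_]{1,20}")

def is_valid_user_id(user_id: str) -> bool:
    return USER_ID_RE.fullmatch(user_id) is not None

print(is_valid_user_id("SACUSER"))        # True
print(is_valid_user_id("too.long-name"))  # False: contains . and -
```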
Only Last Name is mandatory, but it is recommended that you provide a First Name, Last Name, and Display Name. The
Display Name is what will appear in the screens.
Note
The Manager column is not relevant for SAP Datasphere users.
6. In the Roles column, select the icon and choose one or more roles from the list.
If one or more default roles have already been created, you can leave Roles empty. Default roles will be assigned to the user
when you click Save.
7. Select (Save).
Results
A welcome email including an account activation URL will be sent to the user, so that the user can set an initial password
and access the system. Optionally, you can disable the welcome email notification (see Configure Notifications).
When you create a user, it is activated by default. You may want to deactivate a user in specific cases, for example when a
user is on long-term leave. To deactivate a user, select the relevant check box in the leftmost column of the table, click the
icon (Deactivate Users) and optionally select Email users to notify them that their accounts have been deactivated.
Deactivated users cannot log in to SAP Datasphere until they are activated again.
Note
In addition to the standard workflows, you can also create users via the command line (see Manage Users via the Command
Line).
Prerequisites
The user data you want to import must be stored in a CSV file. At minimum, your CSV file needs columns for UserID, LastName,
and Email, but it is recommended that you also include FirstName and DisplayName.
If you want to assign new users different roles, include a Roles column in the CSV file. The role IDs used for role assignment are
outlined in Standard Roles Delivered with SAP Datasphere.
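As a sketch, a minimal import file could be built like this (the user values and the "DW_Modeler" role ID are hypothetical placeholders; check Standard Roles Delivered with SAP Datasphere for the real role IDs):

```python
import csv

# Minimal user-import file: UserID, LastName, and Email are required;
# FirstName, DisplayName, and Roles are recommended/optional.
users = [
    {"UserID": "JDOE", "FirstName": "Jane", "LastName": "Doe",
     "DisplayName": "Jane Doe", "Email": "jane.doe@example.com",
     "Roles": "DW_Modeler"},  # hypothetical role ID
]

with open("users.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=list(users[0]))
    writer.writeheader()
    writer.writerows(users)
```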
For existing users that you want to modify, you can create the CSV file by first exporting a CSV file from SAP Datasphere. For more
information, see Export Users.
Note
The first name, last name, and display name are linked to the identity provider, and can't be changed in the User list page, or
when importing a CSV file. (In the User list page, those columns are grayed out.)
To edit those values, you'll need to use the user login, and edit that user's profile.
Edit the downloaded CSV file to remove columns whose values you don't want to modify, and to remove rows for users whose
values you don't want to modify. Do not modify the USERID column. This ensures that entries can be matched to existing users
when you re-import the CSV.
These are the available mapping parameters when importing CSV user data:
Parameter Description
User ID
First Name
Last Name
Display Name
Manager
Roles
Mobile
Phone
Office Location
Job Title
Closed Item Picker Tips: Closed tooltips are tracked so that they won't be reopened again (for first-time users).
Last Maintenance Banner Version: The version when the last maintenance banner was shown.
Homescreen content is initialized: Whether default tiles have been set for the home screen.
Procedure
1. Go to (Expand) (Security) (Users).
3. In the Import Users dialog, choose Select Source File to upload your CSV file.
4. Choose Create Mapping to assign the fields of your user data from the CSV file to the fields in user management.
5. Select the appropriate entries for the Header, Line Separator, Delimiter, and Text Qualifier.
7. In the Import Users dialog, choose Import to upload your CSV file according to the defined mapping.
Export Users
If you want to synchronize SAP Datasphere user data with other systems, you can export the data to a CSV file.
Procedure
On the Users page of the Security area, choose (Export).
Results
The system exports all user data into a CSV file that is automatically downloaded to your browser's default download folder.
Column Description
USER_NAME
FIRST_NAME
LAST_NAME
DISPLAY_NAME
MANAGER
On the Edit Home Screen dialog, a user can override all the default preferences that
have been set by the administrator for the system ( System Administration
Default Appearance ). These are the preferences:
OVERRIDE_BACKGROUND_OPTION
OVERRIDE_LOGO_OPTION
OVERRIDE_WELCOME_MESSAGE_FLAG
OVERRIDE_HOME_SEARCH_TO_INSIGHT_FLAG
OVERRIDE_GET_STARTED_FLAG
OVERRIDE_RECENT_FILES_FLAG
OVERRIDE_RECENT_STORIES_FLAG
OVERRIDE_RECENT_PRESENTATIONS_FLAG
OVERRIDE_RECENT_APPLICATIONS_FLAG
OVERRIDE_CALENDAR_FLAG
OVERRIDE_FEATURED_FILES_FLAG
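A synchronization check against another system might look like this sketch (the file name and sample rows are placeholders, using the exported column names listed above):

```python
import csv

# Write a two-row sample in the export format (placeholder values).
with open("UserListExport.csv", "w", encoding="utf-8") as f:
    f.write("USER_NAME,FIRST_NAME,LAST_NAME,DISPLAY_NAME\n"
            "JDOE,Jane,Doe,Jane Doe\n"
            "ASMITH,Al,Smith,Al Smith\n")

# Read the export and find users unknown to the other system.
with open("UserListExport.csv", newline="", encoding="utf-8") as f:
    exported = {row["USER_NAME"] for row in csv.DictReader(f)}

external = {"JDOE"}                 # users known to the other system
print(sorted(exported - external))  # ['ASMITH']
```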
Context
When you create a user, you must add an email address. The email address is used to send logon information.
Procedure
1. In the side navigation area, click (Security) (Users).
2. Select the email address you want to modify, enter a new email address, and press Enter or select another cell.
If the email address is already assigned to another user, a warning appears and you must enter a new address, as every
user must be assigned a unique email address.
A new logon email will be sent to the updated address. As long as a user has not logged on to the system with the new email
address, the email address will appear in a pending state in the Users list.
3. If the user has not received the logon email, you can resend it. To do so, select the checkbox corresponding to the
user, click the envelope icon, and click Resend in the dialog box that opens.
Related Information
Create a User
Import or Modify Users from a File
Delete Users
You can delete users.
Procedure
1. In the Users management table, select the user ID you want to delete by clicking the user number in the leftmost column of
the table.
Related Information
Create a User
Import or Modify Users from a File
Update a User Email Address
Context
Users with the DW Space Administrator role (space administrators) can create database users in their spaces to allow the
connection of ETL tools to write to and read from Open SQL schemas attached to the space schema (see Integrating Data via
Database Users/Open SQL Schemas).
Procedure
1. In the side navigation area, click (System) (Configuration) Security .
2. In the Password Policy Configuration section, enter the number of days after which a database user's password will expire.
After this period, the user will be prompted to set a new password.
Note
The password policy applies only to database users where the Enable Password Policy property is selected, for both
existing and new users. If a user does not log on with their initial password during this period, they will be deactivated
until their password is reset.
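For example, the expiry date is simply the day the password was set plus the configured number of days (the 180-day value and the date here are illustrations, not product defaults):

```python
from datetime import date, timedelta

lifetime_days = 180              # value configured on the Security page
password_set = date(2025, 7, 9)  # example date the password was set
expires = password_set + timedelta(days=lifetime_days)
print(expires)  # 2026-01-05
```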
Prerequisites
To manage roles and privileges, you must have a global role that grants you the following privileges:
Role (CRUD----) - To access the (Roles) and (Authorization Overview) areas in the (Security) tool and to create,
update, and delete roles.
Spaces (-------M) - To add spaces to scoped roles.
The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.
SAP Datasphere delivers a set of standard roles and you can create your own custom roles:
Standard role - A role delivered with SAP Datasphere that includes a set of privileges. As a best practice, a tenant
administrator can use these roles as templates for creating custom roles for different business needs. See Standard Roles
Delivered with SAP Datasphere.
Custom role - A role that a tenant administrator creates to choose specific privileges as needed. See Create a Custom Role.
Each standard or custom role is either a global role or a template for scoped roles:
Global role - A role that enables users assigned to it to perform actions that are not space-related, typically a role that
enables administration of the tenant. A standard or custom role is considered global when it includes global privileges. A
tenant administrator can assign a global role to the relevant users. See Assign Users to a Role.
Scoped role - A role that inherits a set of privileges from a standard or custom role and assigns them to one or more users
for one or more spaces. Users assigned to a scoped role can perform actions in the assigned spaces. A tenant
administrator can create a scoped role. See Create a Scoped Role to Assign Privileges to Users in Spaces.
For more information on global and scoped privileges, see Privileges and Permissions.
Users have the relevant privileges depending on which actions they can perform in their spaces.
Claret administers the SAP Datasphere tenant and also has modeler privileges in the two spaces Sales Europe and Sales
US.
Jorge has purchasing modeler privileges in the Purchasing space and has viewer privileges in the Worldwide Purchasing
space.
Maeve and Ahmed have modeler privileges in the two spaces Sales Europe and Sales US.
A DW Administrator can use standard roles as templates for creating custom roles with a different set of privileges (see Create a
Custom Role). You can also use the standard roles that include scoped privileges as templates for creating scoped roles (see
Create a Scoped Role to Assign Privileges to Users in Spaces). You can assign the standard roles that contain global privileges
(such as DW Administrator, Catalog Administrator and Catalog User) directly to users.
Note
You cannot delete or edit standard roles.
In the side navigation area, click (Security) (Roles). The following standard roles are available:
System Owner - Includes all user privileges to allow unrestricted access to all areas of the application. Exactly one
user must be assigned to this role.
DW Administrator - Can create users, roles and spaces and has other administration privileges across the SAP
Datasphere tenant. Cannot access any of the apps (such as the Data Builder).
DW Space Administrator (template) - Can manage all aspects of the spaces users are assigned to (except the
Space Storage and Workload Management properties) and can create data access controls.
DW Scoped Space Administrator - This predefined scoped role is based on the DW Space Administrator role
and inherits its privileges and permissions.
Note
Users who are space administrators primarily need scoped permissions to work with spaces, but they
also need some global permissions (such as Lifecycle when transporting content packages). To provide
such users with the full set of permissions they need, they must be assigned to a scoped role (such as the
DW Scoped Space Administrator) to receive the necessary scoped privileges, but they also need to be
assigned directly to the DW Space Administrator role (or a custom role that is based on the DW Space
Administrator role) in order to receive the additional global privileges.
DW Integrator (template) - Can integrate data via connections and can manage and monitor data integration in a
space.
DW Scoped Integrator - This predefined scoped role is based on the DW Integrator role and inherits its
privileges and permissions.
DW Modeler (template) - Can create and edit objects in the Data Builder and Business Builder and view data in
objects.
DW Scoped Modeler - This predefined scoped role is based on the DW Modeler role and inherits its privileges
and permissions.
DW Viewer (template) - Can view objects and view data output by views that are exposed for consumption in
spaces.
DW Scoped Viewer - This predefined scoped role is based on the DW Viewer role and inherits its privileges
and permissions.
Roles providing privileges to consume the data exposed by SAP Datasphere spaces:
DW Consumer (template) - Can consume data exposed by SAP Datasphere spaces, using SAP Analytics Cloud, and
other clients, tools, and apps. Users with this role cannot log into SAP Datasphere. It is intended for business
analysts and other users who use SAP Datasphere data to drive their visualizations, but who have no need to access
the modeling environment.
DW Scoped Consumer - This predefined scoped role is based on the DW Consumer role and inherits its
privileges and permissions.
Catalog Administrator - Can set up and implement data governance using the catalog. This includes connecting the
catalog to source systems for extracting metadata, building business glossaries, creating tags for classification, and
publishing enriched catalog assets so all catalog users can find and use them. Must be used in combination with
another role such as DW Viewer or DW Modeler for the user to have access to SAP Datasphere.
Catalog User - Can search and discover data and analytics content in the catalog for consumption. These users may
be modelers who want to build additional content based on official, governed assets in the catalog, or viewers who
just want to view these assets. Must be used in combination with another role such as DW Viewer or DW Modeler for
the user to have access to SAP Datasphere.
Note
To activate SAP Business AI features in your SAP Datasphere tenant, see Enable SAP Business AI for SAP
Datasphere
Note
Please do not use the roles DW Support User and DW Scoped Support User as they are reserved for SAP Support.
Users are assigned roles in particular spaces via scoped roles. One user may have different roles in different spaces depending on
the scoped role they're assigned to. See Create a Scoped Role to Assign Privileges to Users in Spaces.
Planning Professional, Planning Standard as well as Analytics Hub are SAP Analytics Cloud specific license types. For more
information, see Understand Licenses, Roles, and Permissions in the SAP Analytics Cloud documentation.
Overview
A role represents the main tasks that a user performs in SAP Datasphere. Each role has a set of privileges with appropriate levels
of permissions. The privileges represent areas of the application like the Space Management or the Business Builder and the files
or objects created in those areas.
The standard roles provide sets of privileges and permissions that are appropriate for that role. For example, the DW
Administrator role has all the Spaces permissions, while the DW Viewer role has none.
You can use the standard roles (see Standard Roles Delivered with SAP Datasphere) and create your own custom roles to group
together other sets of privileges and permissions (see Create a Custom Role).
Global versus scoped privileges - Global privileges are privileges that are used at the tenant level and are not space-related, and
can therefore be included in a global role, typically a tenant administrator role. Scoped privileges are privileges that are space-
related and can therefore be included in a scoped role.
Caution
The permission Manage should be granted only to
tenant administrators.
Note
The permissions Read, Update and Delete are scoped
permissions and are described in the scoped privileges and
permissions table (see Scoped Privileges and Permissions).
Space Files (-------M): Allows access to all objects inside a space, such as views and tables.
Caution
The permission Manage should be granted only to tenant
administrators.
Note
The permissions Create, Read, Update and Delete are scoped
permissions and are described in the scoped privileges and
permissions table (see Scoped Privileges and Permissions).
Data Warehouse General -R------ Allows users to log into SAP Datasphere. Included in all standard
roles except for DW Consumer.
Data Warehouse Runtime -R--E--- Read - Allows users of the View Analyzer to download the
generated SQL analyzer plan file.
Data Warehouse AI Consumption ----E--- Allows users to use SAP Business AI features.
See Enable SAP Business AI for SAP Datasphere.
Note
To enable SAP Business AI in your SAP Datasphere tenant, go
to SAP note 3522010 .
Other Datasources ----E--- Some connection types require this privilege. For more
information, see Permissions in the SAP Analytics Cloud Help.
Note
The permissions are included in the DW Administrator role. When you create a custom role based on the DW
Administrator role, the permissions are automatically included and you cannot edit them.
Activity Log -R-D---- Allows access to the Activities page in the Security tool.
Lifecycle -R---MS- Allows importing content from the Content Network and importing
and exporting content via the Transport tool.
Note
The permissions -R---MS- are included in the DW
Administrator role. When you create a custom role based on
the DW Administrator role, the permissions are automatically
included and you cannot edit them.
Read:
Manage:
Catalog Glossary CRUD---- Create: Use with the Update privilege to create a glossary.
Create a category.
Manage:
Catalog KPI Object CRUD---M Create: Use with Catalog KPI Template with Read
permission to create a KPI.
Manage:
Catalog KPI Template -RU----- Read: Use with Catalog KPI Object with Read permission
to:
Catalog Log -R------ Read: View and search extraction logs for assets and
batch job details.
Cloud Data Product (------S) Share: Use with Catalog Asset with Read permission to:
Note
Some permissions require others and may automatically set them. For example, setting the Delete permission for the Data
Warehouse Data Builder privilege automatically sets the Read permission as well.
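The eight-character strings used throughout these tables (CRUD----, -R---MS-, and so on) encode one permission per position. The position order in this sketch (Create, Read, Update, Delete, Execute, Maintain, Share, Manage) is inferred from the examples in this guide and is an assumption, not an official reference:

```python
# Permission names by mask position, inferred from examples in this guide.
PERMISSIONS = ("Create", "Read", "Update", "Delete",
               "Execute", "Maintain", "Share", "Manage")

def decode(mask: str) -> list[str]:
    """Return the permission names granted by an 8-character mask."""
    if len(mask) != 8:
        raise ValueError("permission mask must be 8 characters")
    return [name for ch, name in zip(mask, PERMISSIONS) if ch != "-"]

print(decode("-R---MS-"))  # ['Read', 'Maintain', 'Share']
print(decode("-------M"))  # ['Manage']
```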
Note
The permission Manage is a global
permission and is described in the
global privileges and permissions table
(see Global Privileges and Permissions).
Data Warehouse Data Builder CRUD--S- Allows access to all objects in the Data
Builder app.
Note
The following feature needs an
additional permission:
Data Warehouse Data Integration -RU-E--- Allows access to the Data Integration
Monitor app:
Update:
Note
In addition to these permissions, the
following Data Integration Monitor
actions require the Data Warehouse
Data Builder (Read) privilege:
Note
To run and schedule flows, you must
have the privilege Data Warehouse Data
Integration with Read, Update and
Execute permissions.
Data Warehouse Data Access Control CRUD---- Allows access to data access controls in the
Data Builder app:
Data Warehouse Business Builder -R------ Allows access to the Business Builder app.
Data Warehouse Fact Model CRUD---- Allows access to fact models defined in the Business Builder. Fact models are shaped like consumption models but offer reusability in other consumption models.
Data Warehouse General -R------ Allows users to log into SAP Datasphere.
Included in all standard roles except for DW
Consumer.
Note
Custom roles cannot be assigned this
privilege.
Permissions
The following table displays the available permissions and their definitions.
Permission Description
Create: Permits creating new objects of this item type. Users need this permission to create spaces, views, or tables, upload data into a story, or upload other local files.
Update: Permits editing and updating existing items. Compare this permission with the Maintain permission, which doesn't allow changes to the data structure. Note: some object types need the Maintain permission to update data. See the Maintain entry.
Maintain: Permits the maintenance of data values, for example adding records to a model, without allowing changes to the actual data structure. Compare this permission with the Update permission, which does allow changes to the data structure. When granted on the Lifecycle privilege, permits importing and exporting objects.
Manage: When granted on Spaces and Space Files, permits viewing all spaces and their content (including data), regardless of whether the user is assigned to the space. To perform actions on spaces, you need the Manage permission in combination with other permissions for Spaces and other privileges. See Roles and Privileges by App and Feature.
Caution
This permission should be granted only to tenant administrators.
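The permission codes used throughout this guide (for example -RU-E---) appear to be positional, with one character per permission in the order listed in the table above: create, read, update, delete, execute, maintain, share, manage. As an illustrative sketch only (this helper is not part of SAP Datasphere), such a code could be decoded like this:

```python
# Illustrative helper (not an SAP API): decodes the 8-character permission
# codes used in this guide, assuming the position order
# Create, Read, Update, Delete, Execute, Maintain, Share, Manage.
PERMISSION_POSITIONS = [
    ("C", "Create"), ("R", "Read"), ("U", "Update"), ("D", "Delete"),
    ("E", "Execute"), ("M", "Maintain"), ("S", "Share"), ("M", "Manage"),
]

def decode_permissions(code: str) -> list[str]:
    """Return the permission names granted by a code such as '-RU-E---'."""
    code = code.strip("()")  # some tables wrap codes in parentheses
    if len(code) != len(PERMISSION_POSITIONS):
        raise ValueError(f"expected {len(PERMISSION_POSITIONS)} characters, got {code!r}")
    granted = []
    for ch, (letter, name) in zip(code, PERMISSION_POSITIONS):
        if ch == letter:
            granted.append(name)
        elif ch != "-":
            raise ValueError(f"unexpected character {ch!r} in {code!r}")
    return granted

print(decode_permissions("-RU-E---"))  # ['Read', 'Update', 'Execute']
print(decode_permissions("CRUD---M"))  # ['Create', 'Read', 'Update', 'Delete', 'Manage']
```

For example, the Lifecycle code -R---MS- from the tables above decodes to Read, Maintain, and Share.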
Apps
Administration Tools
A user is granted a set of global privileges for the tenant via a global role. The global role can be:
A standard global role that is delivered with SAP Datasphere (such as DW Administrator).
A custom role that you create from a template (a standard global role or another custom role containing global privileges).
To assign a user to a global role, see Assign Users to a Role.
A user is granted a set of scoped privileges for one or more spaces via a scoped role. The scoped role inherits a role template, which can be:
A standard scoped role template that is delivered with SAP Datasphere (such as DW Space Administrator).
A custom role template that you create from another template (a standard scoped role or another custom role).
To assign a user to a scoped role, see Create a Scoped Role to Assign Privileges to Users in Spaces.
Note
For complete lists of standard roles, privileges and permissions, see:
Apps
To access an app, tool, or editor, a user must have a global or scoped role inheriting from a role template which contains the listed
privileges:
See Semantic Onboarding. Each section requires a specific permission:
SAP Systems: Data Warehouse Data Builder (CRU-----) and Data Warehouse Business Entity (CRU-----)
Content Network: Lifecycle (-R---MS-)
Roles: DW Space Administrator (all sections), DW Modeler (SAP Systems and Data Products)
(Business Builder) Each page or editor requires a separate permission:
Start page: Data Warehouse Business Builder (-R------)
Dimension editor: Data Warehouse Business Entity (CRUD----)
Fact editor: Data Warehouse Business Entity (CRUD----)
Fact model editor: Data Warehouse Fact Model (CRUD----)
Consumption model editor: Data Warehouse Consumption Model (CRUD----)
Authorization scenario editor: Data Warehouse Authorization Scenario (CRUD----)
See Modeling Data in the Business Builder
Roles: DW Space Administrator, DW Modeler, DW Viewer (read-only access)
Note
The DW Viewer role includes Data Warehouse [Link], which allows these users to preview only data from fact models and consumption models.
(Data Builder) All pages and editors share a single permission:
Table editor
Graphical view editor
SQL view editor
Entity-relationship model editor
Data access control editor
See:
Acquiring Data in the Data Builder
Preparing Data in the Data Builder
Modeling Data in the Data Builder
Securing Data with Data Access Controls
The following features need additional permissions (which are included in the DW Modeler role):
Preview data from any object in the Data Preview panel - Data Warehouse [Link]
Access the local table Data Editor screen - Data Warehouse Data [Link]
See remote objects in Data Builder editors - Data Warehouse [Link]
The following features need additional permissions (which are included in the DW Integrator role):
Run an intelligent lookup - Data Warehouse Data [Link]
Run a task chain - Data Warehouse Data [Link]
Delete data in a local table - Data Warehouse Data [Link]
Note
The DW Modeler role includes Data Warehouse Data Access [Link], which allows them to apply an existing data access control to a view.
Roles: DW Space Administrator, DW Viewer (read-only access)
(Data Integration Monitor) Data Warehouse Data Integration (-RU-E---)
See Integrating Data via Connections
The following feature needs an additional permission (which is included in the DW Administrator role):
Select a location ID - [Link]
Roles: DW Space Administrator, DW Integrator, DW Modeler (read-only access), DW Viewer (read-only access)
Administration Tools
To access an app, tool, or editor, a user must have a global or scoped role inheriting from a role template which contains the listed
privileges:
See Data Marketplace - Data Provider's Guide. Role: DW Space Administrator
Note
To create a new data provider profile, or edit an existing one, you must have the Spaces (Update) privilege assigned to your role.
A user with a viewer role can log into SAP Datasphere, but has no Spaces permissions and cannot see the Space
Management tool.
A user with a modeler or integrator role has Spaces (-R------) permission. They have read-only access to the page for
their space (though they cannot see all its properties).
A user with a space administrator role has Spaces (-RUD----) permissions. They can see all the space properties, and
edit those outside the General Settings and Workload Management sections.
A user with an administrator role has Spaces (CRUD---M) permissions. They can create spaces and edit some space
properties, including modifying the storage allocated and the space priority.
Various privileges and permissions are required to see and edit different parts of the Space Management tool:
Note
In addition to all the privileges listed in the table below that are required to work with the Space Management tool, the
following privileges are required:
Data Warehouse General (-R------) (both global and scoped privilege) - To access SAP Datasphere.
Global privilege Space Files (-------M) or scoped privilege Space Files (-R------) - To view objects in your space.
The global privilege Spaces (-------M) enables users with a global role to perform the following actions in all the
spaces of the tenant: read, update and delete.
Create a Space: Global privileges Spaces (C------M) and User (-R------). Role: DW Administrator
See Create a Space
Note
In addition, you also need the following permissions to view these properties:
Users: Global privilege Role (-R------) or scoped privilege Scoped Role User Assignment (-------M)
Data Consumption and Database Users: Global privilege Spaces (-------M) or scoped privilege Spaces (-R------)
Note
A user with a role based on the DW Modeler or DW Integrator role template has read-only access to the page for their space but cannot view all its properties.
Note
A DW Administrator cannot see the HDI Containers area in a space.
Note
A DW Administrator cannot see the Time Data area in a space.
Modify General Settings (except for Space Storage): Global privilege Spaces (-------M) or scoped privilege Spaces (-RU-----). Roles: DW Administrator and DW Space Administrator
See Create a Space
Modify Space Storage, Data Lake Access, Workload Management: Global privilege Spaces (-------M). Role: DW Administrator
Modify Users: Global privileges Spaces (-------M) and Role (-------M), or scoped privileges Spaces (--U-----) and Scoped Role User Assignment (-------M). Roles: DW Administrator and DW Space Administrator
See Control User Access to Your Space
Modify Data Consumption and Database Users: Global privilege Spaces (-------M) or scoped privilege Spaces (-RU-----). Roles: DW Administrator, DW Space Administrator
See Create a Database User
Note
A user with a role based on the DW Integrator role template needs in addition the privilege Spaces (--U-----) to create database users.
Modify HDI Containers: Scoped privileges Spaces (--U-----) and Data Warehouse Connection (--U-----). Role: DW Space Administrator
See Prepare Your HDI Project for Exchanging Data with Your Space
Note
A DW Administrator cannot access the HDI Containers area in a space.
Modify Time Data: To update time data: scoped privileges Spaces (--U-----) and Data Builder (--U-----). To delete time data: scoped privileges Spaces (--U-----) and Data Builder (---D----). Role: DW Space Administrator
See Create Time Data and Dimensions
Note
A DW Administrator cannot access the Time Data area in a space.
Modify Auditing: Global privilege Spaces (-------M) or scoped privilege Spaces (-RU-----). Roles: DW Administrator and DW Space Administrator
See Logging Read and Change Actions for Audit
Delete a Space: Global privileges Spaces (-------M) and User (-------M). Roles: DW Administrator and DW Space Administrator
See Delete Your Space
When creating a custom role for using or administering the catalog, you must set the permissions for the privileges in certain ways
so that you can complete various tasks. Review the following table of tasks to see which permissions and privilege combinations
you need.
To be able to access the Catalog app from the side navigation, all custom catalog roles need the Read permission on Catalog
Asset.
Note
All custom catalog roles need the SAP Datasphere read permission on Space Files to allow users to mark assets, terms, and
KPIs as their favorite.
Assets: Search for an asset and view the detailed information for it. Catalog Asset: (-R------)
See Searching for Data Products and Assets in the Catalog
Assets: Edit the name of the asset that appears in the catalog. Catalog Asset: (-RU-----), Catalog Tag Hierarchy: (-R------)
See Enriching and Managing Catalog Assets
Assets: Add a catalog description for the asset. Catalog Asset: (-RU-----), Catalog Tag Hierarchy: (-R------)
See Enriching and Managing Catalog Assets
Assets: Add a term, tag, or KPI relationship to the asset from the asset’s detailed information page. Catalog Asset: (-RU-----), Catalog Tag Hierarchy: (-R------), Catalog Glossary Object: (-R------)
See Enriching and Managing Catalog Assets
Assets: Manage the relationship for a term and an asset. Catalog Glossary Object: (-R------), Catalog Asset: (-RU-----)
See Create and Manage Glossary Terms
Assets: Manage the relationship for a KPI and an asset. Catalog KPI Object: (-R------), Catalog Asset: (-RU-----)
See Create and Manage Key Performance Indicators
Tags: Add a tag relationship to the asset from the asset’s detailed information page. Catalog Asset: (-RU-----), Catalog Tag Hierarchy: (-R------)
See Enriching and Managing Catalog Assets
Terms: Manage the relationship for a term and an asset. Catalog Glossary Object: (-R------), Catalog Asset: (-RU-----)
See Create and Manage Glossary Terms
KPIs: Manage the relationship for a KPI and an asset. Catalog KPI Object: (-R------), Catalog Asset: (-RU-----)
See Create and Manage Key Performance Indicators
KPIs: Create a KPI category. Catalog KPI Object: (C-------), Catalog KPI Template: (-R------)
See Create and Manage Key Performance Indicator Categories
Marketplace Data Products: Search for a data marketplace data product, view the detailed information for it, and install it to a space. Spaces: (-R------), Space Files: (CRUD----), Data Warehouse Connection: (CRUD----), Data Warehouse Data Integration: (-RU-----)
See Searching for Data Products and Assets in the Catalog and Evaluating and Installing Marketplace Data Products.
SAP Business Data Cloud Data Products: Search for an SAP Business Data Cloud data product and view the detailed information for it. Catalog Asset: (-R------)
SAP Business Data Cloud Data Products: Search for an SAP Business Data Cloud data product, view the detailed information for it, and install it to a space and use it. Catalog Asset: (-R------), Spaces: (-R------), Space Files: (CRUD----)
See Evaluating and Installing Data Products.
SAP Business Data Cloud Data Products: Search for an SAP Business Data Cloud data product, view the detailed information for it, and share it with external users. Catalog Asset: (-R------), Cloud Data Product: (------S)
Here are a few examples for catalog roles and permissions.
Review assets: update asset names and descriptions, add tags, and publish assets. Catalog Asset: (-RU----M)
Manage and publish glossaries, terms, and KPIs; also add terms and KPI relationships to assets. Catalog Asset: (-RU-----), Catalog Glossary: (CRUD----)
Manage terms within existing glossaries and manage tags, but do not add these relationships to assets. Catalog Asset: (-R------), Catalog Glossary: (-R------)
Consume data in SAP Analytics Cloud, Microsoft Excel, and other clients, tools, and apps: Space Files (-R------). Roles: All roles
See Consuming Data Exposed by SAP
Note
If a user does not need to access SAP
To use the command line interface (see Manage Spaces via the Command Line), a user must have the following standard role or a
custom role containing the listed privileges:
Spaces (C------M)
User (-R------)
Team (-RUD---M)
Prerequisites
To create a custom role, you need the DW Administrator role.
Context
You can create a custom role to enable users to do either global actions on the tenant or actions that are specific to spaces.
If you create a custom role for global purposes, you should include only global privileges and permissions. You can then
assign the role to the relevant users.
If you create a custom role for space-related purposes, you should include only scoped privileges and permissions. As a
second step, you need to create a scoped role based on this custom role to assign users and spaces to the set of privileges
included. See Create a Scoped Role to Assign Privileges to Users in Spaces.
You should not mix global and scoped privileges in a custom role.
If you include a scoped privilege in a custom role that you create for global purposes, the privilege is ignored.
If you include a global privilege in a custom role that you want to use as a template for a scoped role, the privilege is ignored.
Note
Some users, such as space administrators, primarily need scoped permissions to work with spaces, but they also need some
global permissions (such as Lifecycle when transporting content packages). To provide such users with the full set of
permissions they need, you can include both the relevant global privileges and scoped privileges in the custom role you will use
as a template for the scoped role. Each space administrator is then assigned to the scoped role to receive the necessary
scoped privileges, but they are also assigned directly to the custom role in order to receive the additional global privileges.
For more details about global and scoped privileges, see Privileges and Permissions.
Procedure
1. Go to (Expand) (Security) (Roles).
2. To create a custom role, click (Add Role) and select Create a Custom Role.
Property Description
Name Enter a unique name for the role. The name can only contain upper and lower case letters, numbers,
and underscores and its maximum length is 20 characters.
Description [optional] Enter a description, which can be changed at any time. The description can only contain
upper and lower case letters, numbers, spaces, and dashes and its maximum length is 155 characters.
4. Click Create.
The role templates are the predefined standard roles associated with the SAP Datasphere license type. If you wish to create
a role without extending a predefined standard role, choose the blank template. After you select a template, a page opens
showing you the individual permissions assigned to the privileges that have been defined for the role template you chose.
6. Select the permissions for your new role for every privilege type. The privileges represent an area, app, or tool in SAP Datasphere, while the permissions (create, read, update, delete, execute, maintain, share, and manage) represent the actions a user can perform. For more details about global and scoped privileges, see Privileges and Permissions.
7. [optional] If you want to change the role template that your new custom role will be based on, select (Select Template),
and choose a role.
8. [optional] To define the custom role as a default role, which will be assigned to all new users when no other role is assigned
to them, select (Role Configuration) and select the option Use as Default Role.
Note
The option Enable Self-Service is not relevant for SAP Datasphere.
Note
You can assign the role to a user from the Users page or - only if you've created a custom role for global purposes (and
not for space-related purposes) - from the Roles page. Whether you create users first or roles first does not matter. See
Assign Users to a Role.
A DW Administrator can assign a role to multiple users in multiple spaces, in a single scoped role. As a consequence, a user can
have different roles in different spaces: be a modeler in space Sales Germany and Sales France and a viewer in space Europe Sales.
You can create a scoped role based on a standard role or on a custom role. In both cases, the scoped role inherits the privileges
from the standard or custom role. You cannot edit the privileges of a scoped role or of a standard role. You can edit the privileges of
a custom role. To create a scoped role with a different set of privileges, create a custom role with the desired set of privileges and then create the scoped role from the custom role. You can then change the privileges of the custom role as needed, which will also change the privileges of all the scoped roles that are based on the custom role.
Users who are granted the DW Space Administrator role via a scoped role can add or remove users to or from their spaces and the
changes are reflected in the scoped roles. See Control User Access to Your Space.
In the following example, the DW administrator begins assigning users to the three Sales spaces by creating the appropriate
scoped roles:
She creates three scoped roles based on standard and custom roles and assigns the users to the spaces as follows:
Senior Sales Modeler: Custom role “Senior Modeler” based on the DW Modeler role, with Data Warehouse Connection (Create, Read, Update and Delete). User: Jim. Space: Sales Europe
If Bob no longer needs to work in the space Sales US, the DW administrator can unassign Bob from Sales US in the scoped role
Sales Modeler.
As Joan has the role of space administrator for the space Sales US, she can also unassign Bob from Sales US directly in the space
page (in the Space Management). The user assignment change is automatically reflected in the Sales Modeler scoped role.
Later on, Bob needs the space administration privileges for the space Sales Asia. From the page of the space Sales Asia, Joan
assigns Bob to the space with the Sales Space Admin scoped role.
For more information on scoped roles, see the blog Preliminary Information SAP Datasphere – Scoped Roles (published in September 2023).
Note
In addition to the standard workflows, you can also create scoped roles and assign scopes and users to them via the command
line (see Manage Scoped Roles via the Command Line).
1. In the side navigation area, click (Security) (Roles) and click your scoped role to open it.
Note
As an alternative to creating a scoped role, you can use one of the predefined scoped roles that are delivered with SAP
Datasphere in the Roles page and directly assign spaces and users to them.
Property Description
Name Enter a unique name for the role. The name can only contain upper and lower case letters, numbers,
and underscores and its maximum length is 20 characters.
Description [optional] Enter a description, which can be changed at any time. The description can only contain
upper and lower case letters, numbers, spaces, and dashes and its maximum length is 155 characters.
4. Click Create.
5. Select the role template, which can either be a standard role template or a custom role and click Save.
6. As your scoped role inherits privileges from the template you've chosen, you cannot edit the privileges, except for the privilege Scoped Role User Assignment (Manage). If you're creating a scoped role for space administration purposes, you should select this privilege, which allows managing user assignment in a space.
You can then assign spaces and users to the new scoped role. The spaces and users must be created beforehand and you must
assign spaces before assigning users to them.
Note
If you’re creating a scoped role to assign space administration privileges to certain users in certain spaces, you can do either of the following:
Create a scoped role based on the standard role template DW Space Administrator and, to allow user assignment, select the permission Manage for the privilege Scoped Role User Assignment, which is the only privilege you can select, as the rest of the privileges are inherited from the template. Then, assign one or more spaces and one or more users to the spaces.
Open the predefined scoped role DW Scoped Space Administrator and assign one or more spaces and one or more
users to the spaces. Scoped Role User Assignment (Manage) is selected by default.
1. In the side navigation area, click (Security) (Roles) and click your scoped role to open it.
2. Click [number] Scopes, select one or more spaces in the dialog Scopes and click Save.
Note
By default, all users of the scoped role are automatically assigned to the spaces you've just added. You can change this
and assign only certain members to certain spaces in the Users page of the scoped role.
3. In the Selected Scopes area of the dialog Scopes, click the cross icon for each space that you want to remove from the role,
then click Save.
All users that were assigned to the spaces you've just removed are automatically removed from the scoped role.
1. In the side navigation area, click (Security) (Roles) and click your scoped role to open it.
2. Click Users. All user assignments are displayed in the Users page.
To individually select users and assign them to spaces, click (Add Users to Scopes), then Add New Users to
Scopes. Select one or more users in the wizard Add Users to Scopes and click Next Step.
Note
By default, the added users are automatically assigned to all the spaces included in the scoped role. If you want to modify this, select only the spaces to which you want to assign the users.
Note
You can also add a user to a scoped role from the (Users) area. In such a case, the user is automatically
assigned to all the spaces included in the scoped role. See Assign Users to a Role.
To assign all users included in the scoped role to one or more spaces, click (Add Users to Scopes), then Add All Current Users to Scopes. Select one or more spaces in the wizard Add Users to Scopes and click Next Step and Save.
To assign all users of the tenant to one or more spaces, click (Add Users to Scopes), then Add All Users to Scopes.
Select one or more spaces in the wizard Add Users to Scopes and click Next Step and Save.
Restriction
A user can be assigned to a maximum of 100 spaces across all scoped roles.
Note
In the Users page, you can filter users and spaces to see, for example, to which spaces and roles a user is assigned.
Once you've assigned a user to a space with the DW Space Administrator role via a scoped role, this user can manage the users of their space directly in the space's page (in the Space Management). See Control User Access to Your Space.
2. Click Users. All user assignments are displayed in the Users page.
3. Check the relevant rows (a row corresponding to a combination of one user and one space) and click the garbage icon. The
users cannot access the spaces they were previously assigned to in the scoped role.
Prerequisites
Users with an administrator role can assign roles to users in the Users and Roles pages.
You can assign an individual user to a role (global or scoped) in the Users page.
3. In the user's row, select the icon in the Roles column. A list of Available Roles will appear.
The icon is not available if the user has the system owner role, which means that, from the Security Users page, you
cannot assign an additional role to a user who has the system owner role. You can do so from the Security Roles page
(see Create a Scoped Role to Assign Privileges to Users in Spaces).
5. Select OK.
Note
If you assign a user to a scoped role, be aware that the user is automatically assigned to all the spaces included in the scoped
role. You can change the user assignment in the scoped role. See Create a Scoped Role to Assign Privileges to Users in Spaces.
You can assign several users to a global role at the same time in the Roles page.
Note
This is not relevant for scoped roles. For information about how to assign users to spaces in a scoped role, see Create a Scoped
Role to Assign Privileges to Users in Spaces.
3. At the bottom of the role box, click the link Add Users.
4. Select one or more users from the Assign Role to User dialog.
5. Select OK.
For example, you want to give a specific role to all employees that are assigned to a specific cost center. Once you've done the role
mapping, if new users are assigned to the cost center in the SAML identity provider (IdP), the users will be automatically assigned
to the role when logging onto SAP Datasphere via SAML authentication.
Prerequisites
Your custom SAML Identity Provider (IdP) must be configured and the authentication method selected must be SAML Single Sign-On (SSO) in (System) → (Administration) → Security. See Enabling a Custom SAML Identity Provider (Legacy Custom IdP).
Procedure
1. In the side navigation area, click (Security) (Roles).
2. Select a role (or open the role) and click (Open 'SAML Role Mapping').
3. Under Conditions, select a SAML Attribute, select a Condition, and enter a Value if required.
4. (Optional) Select + (New mapping definition) to add additional mappings to the role assignment.
For each additional mapping, under Conditions, select a SAML Attribute, select a Condition, and enter a Value if required.
If AND is selected, the conditions for all attributes must be met for the mapping to be applied. If OR is selected, the
conditions for only one of the attributes must be met for the mapping to be applied.
The selected role will be applied to all users who meet the specified conditions when logging onto SAP Datasphere via
SAML authentication. If the selected role was previously assigned to a user, but the user does not meet the specified
conditions, the role will be revoked when the user logs in.
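The AND/OR semantics described above can be sketched in a few lines of Python. This is an illustrative model only, not the product's implementation; the attribute names and the condition operators ("equals", "contains") are assumptions for the example.

```python
# Illustrative sketch (not an SAP API): evaluates SAML role-mapping
# conditions with the AND/OR semantics described in the procedure above.
def condition_met(attributes: dict, attribute: str, condition: str, value: str) -> bool:
    actual = attributes.get(attribute, "")
    if condition == "equals":
        return actual == value
    if condition == "contains":
        return value in actual
    raise ValueError(f"unknown condition {condition!r}")

def mapping_applies(attributes: dict, conditions: list, logic: str = "AND") -> bool:
    """With AND, all conditions must be met; with OR, at least one suffices."""
    results = (condition_met(attributes, *c) for c in conditions)
    return all(results) if logic == "AND" else any(results)

# Example: a user whose SAML assertion carries these (invented) attributes.
user = {"costCenter": "CC-1000", "department": "Sales"}
rules = [("costCenter", "equals", "CC-1000"), ("department", "equals", "Finance")]
print(mapping_applies(user, rules, "AND"))  # False: both conditions must match
print(mapping_applies(user, rules, "OR"))   # True: one matching condition suffices
```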
Note
If a user is assigned to a scoped role via SAML attributes, the user is automatically assigned to all the spaces included in
the scoped role.
In the Roles page, a dedicated icon in the role tile is displayed, indicating that the users are assigned to the role via SAML
attributes. When you hover over the icon, the conditions defined for the role are displayed.
In (Security) (Authorization Overview), a user with the DW Administrator global role can see all the users, roles, and
spaces in the tenant and how they relate to each other. You can filter by user, role, or space to see:
As you type, the field will begin proposing objects and search strings. Click on a string to trigger a search on it.
For example, to display all roles that are assigned to the user Lucia, enter "Lucia" in the Search.
Filter by Criteria
You can filter the list by any of the categories listed in the Filter By area of the left panel: user (in User Name), space (in Scope
Name) and role (in Role Name).
You can select one or more values in each filter category in the Filter By section:
Each value selected in a category acts as an OR condition. For example, to display all roles that are assigned to the users
Lucia and Ahmed, select Lucia and Ahmed in the User Name category.
Values selected in separate categories act together as AND conditions. For example, to display all the scoped roles that enable Lucia to access the Sales Asia space, select Lucia in the User Name category and Sales Asia in the Scope Name category.
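The filter semantics above (values within one category are ORed, categories are ANDed) can be sketched as a predicate. The row data here is invented for illustration; only the category names mirror the Filter By section.

```python
# Illustrative sketch of the Filter By semantics: a row matches when, for
# every selected category, its value is one of the selected values
# (OR within a category, AND across categories).
def row_matches(row: dict, filters: dict) -> bool:
    return all(row[category] in values for category, values in filters.items())

rows = [
    {"User Name": "Lucia", "Role Name": "Sales Modeler", "Scope Name": "Sales Asia"},
    {"User Name": "Ahmed", "Role Name": "Sales Viewer", "Scope Name": "Sales Europe"},
]
# One category, two ORed values: assignments for Lucia or Ahmed (both rows).
print([r for r in rows if row_matches(r, {"User Name": {"Lucia", "Ahmed"}})])
# Two ANDed categories: Lucia's assignments in Sales Asia (first row only).
print([r for r in rows if row_matches(r, {"User Name": {"Lucia"}, "Scope Name": {"Sales Asia"}})])
```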
Create Users and Assign Them to Roles via the SCIM 2.0 API
You can create, read, modify and delete users and add them to roles via the SCIM 2.0 API.
Introduction
List Users
Create a User
Modify a User
Delete a User
Bulk Operations
Introduction
This API allows you to programmatically manage users using a SCIM 2.0 compliant endpoint.
SAP Datasphere exposes a REST API based on the System for Cross-domain Identity Management (SCIM 2.0) specification. This
API allows you to keep your SAP Datasphere system synchronized with your preferred identity management solution.
Note
You cannot create new roles using this API.
List users.
Get information on the identity provider, available schemas, and resource types.
This API uses SCIM 2.0. For more information, see SCIM Core Schema.
Note
The OAuth client must be configured with the following properties:
To log in to the OAuth client, send a GET (or POST) request with the following elements:
[Link]
Note
You can find the token URL in (System) (Administration) App Integration OAuth Clients Token URL .
The response body returns the access token, which you then use as the bearer token to obtain the CSRF token.
Authorization: Bearer <access token retrieved when logging in with the OAuth client>
<tenant_url>/api/v1/csrf
The CSRF token is returned in the x-csrf-token response header. This token can then be included in the POST, PUT, PATCH, or
DELETE request in the x-csrf-token:<token> header.
List Users
To retrieve users, use the GET request with the /api/v1/scim2/Users endpoint and the following elements:
Authorization: Bearer <access token retrieved when logging in with the OAuth client>
[Link]
You can control the list of users to retrieve by using one or more of the following optional URL parameters:
Parameter Description
sortBy Specifies the attribute used to sort the returned items. For example, to sort by user name:
sortBy=userName
sortOrder Specifies the order in which items are returned, either ascending or descending. By default, an ascending sort order is used. To retrieve the list of users in descending order:
sortOrder=descending
startIndex Specifies the 1-based index of the first item to return. For example, to start at the tenth item:
startIndex=10
count Specifies the maximum number of items to return. For example, to return no more than eight items:
count=8
filter Specifies a filter expression to restrict the users returned. See the user schema for available attributes. All operators are supported. For example, to return users whose user name contains "K":
filter=userName co "K"
[Link] co "a"&sortOrder=descending&sta
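Query strings like the one above can be assembled with the standard library so the filter expression is percent-encoded correctly. A minimal sketch; the tenant host is a placeholder:

```python
from urllib.parse import urlencode

# Sketch: assembling the optional URL parameters for GET /api/v1/scim2/Users.
def users_url(tenant_url, **params):
    # Only include the parameters the caller actually set.
    allowed = {"sortBy", "sortOrder", "startIndex", "count", "filter"}
    unknown = set(params) - allowed
    if unknown:
        raise ValueError(f"unsupported parameters: {unknown}")
    query = urlencode(params)  # percent-encodes spaces and quotes
    base = tenant_url.rstrip("/") + "/api/v1/scim2/Users"
    return f"{base}?{query}" if query else base

url = users_url("https://mytenant.example.com",
                filter='userName co "a"', sortOrder="descending",
                startIndex=10, count=8)
```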
Caution
GET requests send personal identifiable information as part of the URL, such as the user name in this case. Consider using the
POST request with the /api/v1/scim2/Users/.search endpoint instead for enhanced privacy of personal information. Syntax of
POST request:
[Link]
Note
In the response body, if the users listed are assigned to roles, you can identify the roles as they are prefixed with PROFILE.
To retrieve a specific user based on its ID, use the GET request with the /api/v1/scim2/Users/<user ID> endpoint and the
following elements:
Authorization: Bearer <access token retrieved when logging in with the OAuth client>
To retrieve a specific user based on its ID, enter the GET request:
[Link] ID>
The user ID must be the UUID (universally unique identifier), which you can get by sending the GET request:
[Link]
Note
In the response body, if the user is assigned to roles, you can identify them by their PROFILE prefix.
Create a User
To create a user, use the POST request with the /api/v1/scim2/Users/ endpoint and the following elements:
Authorization: Bearer <access token retrieved when logging in with the OAuth client>
Note
The following information is required: userName, name, and emails. Any information that is not provided will be either left empty or set to its default value.
If you are using SAML authentication, idpUserId should be set to the property you are using for your SAML mapping. For
example, the user's USER ID, EMAIL, or CUSTOM SAML MAPPING. If your SAML mapping is set to EMAIL, the email address
you add to idpUserId must match the email address you use for email.
The userName attribute can only contain alphanumeric and underscore characters. The maximum length is 20 characters.
Note
When creating or modifying a user, you can add optional properties to the user. The following example shows how to create a new user:
{
"schemas": [
"urn:ietf:params:scim:schemas:core:2.0:User",
"urn:ietf:params:scim:schemas:extension:enterprise:2.0:User"
],
"userName": "LGARCIA",
"name": {
"familyName": "Garcia",
"givenName": "Lisa",
"formatted": "Lisa Garcia"
},
"displayName": "Lisa Garcia",
"preferredLanguage": "en",
"active": true,
"emails": [
{
"value": "[Link]@[Link]",
"type": "work",
"primary": true
}
],
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters": {
"idpUserId": "[Link]@[Link]"
}
}
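The request body above can be generated programmatically, enforcing the userName rule stated earlier (alphanumeric and underscore characters only, maximum 20 characters). A minimal sketch; the helper name and the example email address are hypothetical:

```python
import re

# Sketch: building the minimal body for POST /api/v1/scim2/Users.
# userName: alphanumeric and underscore only, max 20 characters.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{1,20}$")

def new_user_payload(user_name, family_name, given_name, email, idp_user_id):
    if not USERNAME_RE.match(user_name):
        raise ValueError("userName: alphanumeric/underscore only, max 20 characters")
    return {
        "schemas": [
            "urn:ietf:params:scim:schemas:core:2.0:User",
            "urn:ietf:params:scim:schemas:extension:enterprise:2.0:User",
        ],
        "userName": user_name,
        "name": {
            "familyName": family_name,
            "givenName": given_name,
            "formatted": f"{given_name} {family_name}",
        },
        "emails": [{"value": email, "type": "work", "primary": True}],
        # With EMAIL-based SAML mapping, idpUserId must match the email address.
        "urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters": {
            "idpUserId": idp_user_id,
        },
    }

payload = new_user_payload("LGARCIA", "Garcia", "Lisa",
                           "lisa@example.com", "lisa@example.com")
```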
The following example shows how to create a new user and assign it to a role:
{
"schemas": [
"urn:ietf:params:scim:schemas:core:2.0:User",
"urn:ietf:params:scim:schemas:extension:enterprise:2.0:User"
],
"userName": "LGARCIA",
"name": {
"familyName": "Garcia",
"givenName": "Lisa",
"formatted": "Lisa Garcia"
},
"displayName": "Lisa Garcia",
"preferredLanguage": "en",
"active": true,
"emails": [
{
"value": "[Link]@[Link]",
"type": "work",
"primary": true
}
],
"roles": [
{
"value": "PROFILE:t.V:Sales_Modeler",
"display": "Sales_Modeler",
"primary": true
} ],
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters": {
"idpUserId": "[Link]@[Link]"
}
}
The response body returns the ID of the user created, which is the user UUID (universally unique identifier).
Note
When creating or modifying a user via the API, you can also assign the user to one or more roles, either global or scoped, provided that the roles already exist in the tenant:
At least one space must be assigned to a scoped role before you can add users to it.
When a user is added to a scoped role, the user is given access to all the spaces included in the scoped role.
All roles are prefixed with PROFILE. Custom and scoped roles have IDs in the following format: PROFILE:<t.#>:<role_name>.
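The role ID format can be decoded with a small helper. This is an illustrative sketch; the handling of IDs without a tenant segment is an assumption, not documented behavior:

```python
# Sketch: decoding role IDs returned in a user's "roles" list.
# Custom and scoped role IDs look like PROFILE:<t.#>:<role_name>.
def parse_role_id(role_id):
    prefix, sep, rest = role_id.partition(":")
    if prefix != "PROFILE" or not sep:
        raise ValueError(f"not a role ID: {role_id!r}")
    tenant_part, sep, role_name = rest.partition(":")
    # An ID with no second segment is treated as having no tenant part
    # (an assumption for illustration).
    return (tenant_part, role_name) if sep else (None, rest)

scoped = parse_role_id("PROFILE:t.V:Sales_Modeler")
```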
Modify a User
You can modify a specific user in either of two ways:
To override all information related to a specific user, use a PUT request. The user properties are updated with the properties
you provide and all the properties that you do not provide are either left empty or set to their default value.
To update only some information related to a specific user, use a PATCH request. The user properties are updated with the
changes you provide and all properties that you do not provide remain unchanged.
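The PUT/PATCH difference above can be sketched as a simulation. This is illustrative only, not the server implementation; the default values shown are hypothetical:

```python
# Illustrative simulation: PUT replaces the whole resource (omitted
# properties fall back to empty/default values); PATCH changes only
# the properties you send.
DEFAULTS = {"preferredLanguage": "en", "active": True}

def put(resource, body):
    # The existing resource is discarded entirely.
    updated = dict(DEFAULTS)
    updated.update(body)
    return updated

def patch(resource, changes):
    # Properties missing from the changes keep their current values.
    updated = dict(resource)
    updated.update(changes)
    return updated

user = {"userName": "LGARCIA", "preferredLanguage": "fr", "active": True}
after_put = put(user, {"userName": "LGARCIA"})      # language reset to default
after_patch = patch(user, {"preferredLanguage": "de"})  # only language changes
```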
You can use either the PUT (override) or PATCH (update) request with the /api/v1/scim2/Users/<user ID> endpoint and the following elements:
Authorization: Bearer <access token retrieved when logging in with the OAuth client>
[Link] ID>
Note
If you are using SAML authentication, and you are using USER ID as your SAML mapping, you cannot change the userName
using this API. The userName you use in the request body must match the user <ID>.
Note
When creating or modifying a user, you can add optional properties to the user.
The following example shows how to add a user to a role with a PUT request:
{
"schemas": [
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters",
"urn:ietf:params:scim:schemas:core:2.0:User"
],
"id": "userID-00001",
"meta": {
"resourceType": "User",
"location": "/api/v1/scim2/Users/userID-00001"
},
"userName": "LGARCIA",
"name": {
"familyName": "Garcia",
"givenName": "Lisa",
"formatted": "Lisa Garcia"
},
"displayName": "Lisa Garcia",
"preferredLanguage": "en",
"active": true,
"emails": [
{
"value": "[Link]@[Link]",
"type": "work",
"primary": true
}
],
"roles": [
{
"value": "PROFILE:t.V:Sales_Modeler",
"display": "Sales_Modeler",
"primary": true
} ],
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters": {
"idpUserId": "[Link]@[Link]"
}
}
The following example shows how to remove a user from a role and add it to another role with a PATCH request:
{
"schemas": [
"urn:ietf:params:scim:api:messages:2.0:PatchOp"
],
"Operations": [
{
"op": "replace",
"path": "roles",
"value": [
{
"value": "PROFILE:t.V:Sales_Modeler_US",
"display": "Sales_Modeler_US",
"primary": true
}
]
}
]
}
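A PatchOp body like the one above can be generated from a list of role IDs. A minimal sketch; the helper names are hypothetical:

```python
# Sketch: building a SCIM PatchOp body that replaces a user's roles.
def replace_roles_op(role_ids):
    return {
        "op": "replace",
        "path": "roles",
        "value": [
            # The display name is the last segment of the role ID;
            # the first role is marked primary (an illustrative choice).
            {"value": rid, "display": rid.rsplit(":", 1)[-1], "primary": i == 0}
            for i, rid in enumerate(role_ids)
        ],
    }

def patch_body(*operations):
    return {
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": list(operations),
    }

body = patch_body(replace_roles_op(["PROFILE:t.V:Sales_Modeler_US"]))
```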
The following example shows how to make the following changes with a PATCH request: remove a user from a role and add it to another role, and modify the user's email address and idpUserId.
{
"schemas": [
"urn:ietf:params:scim:api:messages:2.0:PatchOp"
],
"Operations": [
{
"op": "replace",
"path": "roles",
"value": [
{
"value": "PROFILE:t.V:Sales_Modeler_US",
"display": "Sales_Modeler_US",
"primary": true
}
]
},
{
"op": "replace",
"path": "[Link]",
"value": "[Link]+1@[Link]"
},
{
"op": "replace",
"path": "urn:sap:params:scim:schemas:extension:sac:2.0:[Link]",
"value": "[Link]+1@[Link]"
}
]
}
Note
When creating or modifying a user via the API, you can also assign the user to one or more roles, either global or scoped, provided that the roles already exist in the tenant:
At least one space must be assigned to a scoped role before you can add users to it.
When a user is added to a scoped role, the user is given access to all the spaces included in the scoped role.
All roles are prefixed with PROFILE. Custom and scoped roles have IDs in the following format: PROFILE:<t.#>:<role_name>.
Delete a User
To delete a user, use the DELETE request with the /api/v1/scim2/Users/<user ID> endpoint and the following elements:
Authorization: Bearer <access token retrieved when logging in with the OAuth client>
To delete a specific user based on its ID, enter the DELETE request:
[Link] ID>
The user ID must be the UUID (universally unique identifier), which you can get by sending the GET request:
[Link]
When creating or modifying a user, you can add optional parameters in addition to the required properties (userName, name, and emails).
Parameter Description
preferredLanguage Specifies the language in which to view the SAP Datasphere interface.
Allowed values:
Concatenation of an ISO 639-1 two-letter language code, a dash, and ISO 3166-1
two-letter country code. For example, en-us.
Default value: en
Example
"preferredLanguage": "en",
dataAccessLanguage Specifies the default language in which to display text data in SAP Analytics Cloud.
Allowed values:
Concatenation of an ISO 639-1 two-letter language code, a dash, and ISO 3166-1
two-letter country code. For example, en-us.
Default value: en
dateFormatting Specifies the format in which to display dates.
Allowed values:
MMM d, yyyy
[Link]
[Link]
[Link]
yyyy/MM/dd
dd/MM/yyyy
MM/dd/yyyy
timeFormatting Specifies the format in which to display times.
Note
H:mm:ss corresponds to 24-Hour Format. For example, [Link].
Example:
"urn:ietf:params:scim:schemas:extension:sap:user-custom-parameters:1.0": {
"dataAccessLanguage": "en",
"dateFormatting": "MMM d, yyyy",
"timeFormatting": "H:mm:ss",
"numberFormatting": "1,234.56",
"cleanUpNotificationsNumberOfDays": 0,
"systemNotificationsEmailOptIn": true,
"marketingEmailOptIn": false
},
Bulk Operations
To create, modify or delete users in bulk, use the POST request with the /api/v1/scim2/Bulk/ endpoint and the following
elements:
Authorization: Bearer <access token retrieved when logging in with the OAuth client>
Note
A maximum of 30 operations per request can be processed.
The following example shows how to create two users and assign them to a role:
{
"schemas": ["urn:ietf:params:scim:api:messages:2.0:BulkRequest"],
"Operations": [
{
"method": "POST",
"path": "/Users",
"bulkId": "bulkId1",
"data":{
"schemas": [
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters",
"urn:ietf:params:scim:schemas:core:2.0:User"
],
"userName": "LGARCIA",
"name": {
"familyName": "Garcia",
"givenName": "Lisa "
},
"displayName": "Lisa Garcia",
"emails": [
{
"value": "[Link]@[Link]"
}
],
"roles":[
{
"value": "PROFILE:t.V:Sales_Modeler",
"display": "Sales_Modeler",
"primary": true
}
],
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters": {
"dataAccessLanguage": "en",
"numberFormatting": "1,234.56",
"idpUserId": "[Link]@[Link]",
"timeFormatting": "H:mm:ss",
"dateFormatting": "MMM d, yyyy"
}
}
},
{
"method": "POST",
"path": "/Users",
"bulkId": "bulkId2",
"data": {
"schemas": [
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters",
"urn:ietf:params:scim:schemas:core:2.0:User"
],
"userName": "JOWEN",
"name": {
"familyName": "Owen",
"givenName": "Joe"
},
"displayName": "Joe Owen",
"emails": [
{
"value": "[Link]@[Link]"
}
],
"roles":[
{
"value": "PROFILE:t.V:Sales_Modeler",
"display": "Sales_Modeler",
"primary": true
}
],
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters": {
"dataAccessLanguage": "en",
"numberFormatting": "1,234.56",
"idpUserId": "[Link]@[Link]",
"timeFormatting": "H:mm:ss",
"dateFormatting": "MMM d, yyyy"
}
}
}
]
}
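Because the endpoint accepts at most 30 operations per request, a long list of operations has to be split into batches. A minimal sketch of that chunking; the helper name is hypothetical:

```python
# Sketch: splitting many operations into bulk requests of at most
# 30 operations each, as required by /api/v1/scim2/Bulk.
MAX_OPS_PER_REQUEST = 30

def bulk_requests(operations):
    for start in range(0, len(operations), MAX_OPS_PER_REQUEST):
        yield {
            "schemas": ["urn:ietf:params:scim:api:messages:2.0:BulkRequest"],
            "Operations": operations[start:start + MAX_OPS_PER_REQUEST],
        }

# e.g. deleting 65 users takes three requests: 30 + 30 + 5 operations
ops = [{"method": "DELETE", "path": f"/Users/id-{i}"} for i in range(65)]
batches = list(bulk_requests(ops))
```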
The following example shows how to delete two users using their IDs:
{
"schemas": ["urn:ietf:params:scim:api:messages:2.0:BulkRequest"],
"Operations": [
{
"method": "DELETE",
"path": "/Users/<userID_User1>"
},
{
"method": "DELETE",
"path": "/Users/<userID_User2>"
}
]
}
/scim2/ServiceProviderConfig - Gets information about the identity provider being used with your SAP
Datasphere tenant.
Converted Roles
Before conversion:
A DW Administrator assigned a role to a user and assigned the user as a member of a space.
As a consequence:
A user had the same one or more roles in all the spaces they were a member of.
A DW Administrator assigned users space by space, going to each space page.
After conversion:
A DW Administrator assigns a role to one or more users and one or more spaces within a new role: a scoped role.
As a consequence:
A user can have different roles in different spaces: for example, be a modeler in the spaces Sales Germany and Sales France and a viewer in the space Europe Sales.
A DW Administrator can give a role to many users in many spaces, all in one place in a scoped role. See Create a Scoped Role to Assign Privileges to Users in Spaces.
Converted Roles
You can now use global roles for tenant-wide actions and scoped roles for space-related actions.
The Roles page lists the same standard and custom roles as before the conversion, and in addition the scoped roles that have been
automatically created.
DW Administrator, Catalog Administrator, and Catalog User: these standard roles are considered global roles. They now include only privileges that are global, which means privileges that apply to the tenant and are not space-related. For example, the DW Administrator role no longer grants access to any of the modeling apps of SAP Datasphere (such as the Data Builder).
Users who previously had these roles are still assigned to them after conversion.
Users who previously had the DW Administrator role and were members of certain spaces are assigned to the new DW
Scoped Space Administrator role for those spaces they previously had access to.
The user who previously had the System Owner role and was member of certain spaces is assigned to the new DW Scoped
Space Administrator role for those spaces the user previously had access to.
A single scoped role is created for each standard role (other than DW Administrator, Catalog Administrator, and Catalog User) and each custom role, and all the users who previously had that standard or custom role are assigned to the new scoped role, but only for those spaces they previously had access to.
Note
All the spaces of the tenant are included in each scoped role created, but not all users are assigned to all spaces. See the
example of scoped role below.
For each standard or custom role, two roles are available after the conversion: the initial standard or custom role (which
acts as a template for the scoped role) and the scoped role created.
Each scoped role includes privileges that are now considered scoped privileges.
Users who previously had the DW Space Administrator role are assigned to these two roles: the standard role DW Space
Administrator and the new scoped role DW Scoped Space Administrator. Users who manage spaces primarily need scoped
permissions to work with spaces, but they also need some global permissions (such as Lifecycle when transporting content
packages). To provide such users with the full set of permissions they need, each space administrator is assigned to the
scoped role DW Scoped Space Administrator to receive the necessary scoped privileges, and they are also assigned directly
to the DW Space Administrator role in order to receive the additional global privileges.
Note
Specific case - no role assigned to a user: Before conversion, a DW Administrator assigned a user to certain spaces but
did not assign a role to the user. As no role was assigned to the user, the user-to-spaces assignment is not kept after
conversion.
Privileges and permissions are now either global or scoped. See Privileges and Permissions.
In this example, users assigned to a custom role called "Senior Modeler" were members of certain spaces before the conversion, as shown below.
The custom role "Senior Modeler" has been converted to the scoped role "Custom Scoped Senior Modeler" and the users who previously had the custom role "Senior Modeler" are assigned to the scoped role, but only for the spaces they previously had access to.
In this example, the following scoped roles have been automatically created during conversion:
DW Scoped Modeler
DW Scoped Viewer
DW Scoped Consumer
There are four spaces: Sales US, Sales Europe, Finance US, and Finance Europe, which can be logically organized into one Sales group and one Finance group.
You should create a set of scoped roles for each logical group of spaces, add the relevant spaces and the relevant users and assign
the users to the spaces in the scoped roles. The users will have access to the spaces with the appropriate privileges.
Scoped Roles: DW Sales Space Administrator, DW Finance Space Administrator
Spaces: Sales US, Finance US
For more information about creating a scoped role, see Create a Scoped Role to Assign Privileges to Users in Spaces.
Note
In addition to the standard workflows, you can also create scoped roles and assign scopes and users to them via the command
line (see Manage Scoped Roles via the Command Line).
Prerequisites
You must be logged on as a user with the System Information Update privilege.
Note
Transferring the system owner role is not possible if you only have one license for SAP Datasphere.
Context
1. On the Users page of the Security area, select the user you want to assign the system owner role to.
3. Under New Role, enter a new role for the previous system owner, or select to open a list of available roles.
Note
One or more roles may be selected.
4. Select OK.
Delete a Role
You can delete a custom or a scoped role when it is no longer needed.
Context
You can delete custom roles and scoped roles (except for the predefined scoped roles that are delivered with SAP Datasphere as
examples).
Procedure
1. In the side navigation area, click (Security) (Roles).
Results
The selected roles are deleted. All users that were assigned the role will lose access to certain features depending on the privileges
and permissions that were included in the role.
All data acquisition, preparation, and modeling in SAP Datasphere happens inside spaces. A space is a secure area - space data
cannot be accessed outside the space unless it is shared to another space or exposed for consumption.
SAP HANA Database (Disk and In-Memory) - You allocate disk and memory storage, set a priority, and can limit how much
memory and how many threads its statements can consume.
SAP HANA Data Lake Files (file spaces) - You allocate compute resources. File spaces are intended for loading and
preparing large quantities of data in an inexpensive inbound staging area and are stored in the SAP Datasphere object
store.
You can then assign one or more users to the space via scoped roles. The users can start acquiring and preparing data in the
space.
If you assign users to a space with a space administrator role, they can manage users, create connections to source systems,
secure data with data access controls, and manage other aspects of the space (see Managing Your Space).
Create a Space
Create a space, allocate storage, and set the space priority and statement limits.
Prerequisites
To create a space, you must have a global role that grants you the following privileges:
User (-R------) - To allow the creation of spaces.
The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.
Context
Note
Only administrators can create spaces, allocate storage, and set the space priority and statement limits. The remaining space
properties can be managed by the space administrators that the administrator assigns to the space via a scoped role.
Procedure
1. In the side navigation area, click (Space Management), and click Create.
2. In the Create Space dialog, enter the following properties, and then click Create:
Property Description
Space Name Enter the business name of the space. Can contain a maximum of 30 characters, and can contain
spaces and special characters.
Space ID Enter the technical name of the space. Can contain a maximum of 20 uppercase letters or numbers and
must not contain spaces or special characters other than _ (underscore). Unless advised to do so, must
not contain prefix _SYS and should not contain prefixes: DWC_, SAP_ (See Rules for Technical Names).
As the technical name will be displayed in the Open SQL Schema and in monitoring tools, including SAP
internal tools, we recommend that you do not include sensitive business or personal data in the name.
Storage Type [Default] Select SAP HANA Database (Disk and In-Memory).
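The Space ID rules above can be checked with a short sketch. This is illustrative only (the authoritative validation happens in SAP Datasphere); the function name and sample IDs are hypothetical:

```python
import re

# Sketch of the Space ID rules: max 20 characters, uppercase letters,
# numbers, and underscores only, and no reserved prefixes.
SPACE_ID_RE = re.compile(r"^[A-Z0-9_]{1,20}$")
RESERVED_PREFIXES = ("_SYS", "DWC_", "SAP_")

def validate_space_id(space_id):
    if not SPACE_ID_RE.match(space_id):
        return False
    return not space_id.startswith(RESERVED_PREFIXES)

checks = {sid: validate_space_id(sid) for sid in
          ("SALES_EUROPE",                 # valid
           "DWC_TEST",                     # reserved prefix
           "sales",                        # lowercase not allowed
           "A_VERY_LONG_SPACE_ID_INDEED")} # longer than 20 characters
```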
Property Description
Space ID Enter the technical name of the space. Can contain a maximum of 20 uppercase letters or numbers and
must not contain spaces or special characters other than _ (underscore). Unless advised to do so, must
not contain prefix _SYS and should not contain prefixes: DWC_, SAP_ (See Rules for Technical Names).
Space Name Enter the business name of the space. Can contain a maximum of 30 characters, and can contain
spaces and special characters.
Space Status [read-only] Displays the status of the space. Newly-created spaces are always active.
Space Type [read-only] Displays the type of the space. You can only create spaces of type SAP Datasphere.
Created On [read-only] Displays the date and time when the space was created.
Deployment Status [read-only] Displays the deployment status of the space. Newly-created spaces are deployed, but when
you make changes, you need to save and re-deploy them before they are available to space users.
Deployed On [read-only] Displays the date and time when the space was last deployed.
Description [optional] Enter a description for the space. Can contain a maximum of 4 000 characters.
Note
Once the space is created, users with space administrator privileges can use the Translation area to choose the
language from which business textual information will be translated. For more information, see Translating Metadata for
SAP Analytics Cloud.
4. [optional] Use the Space Storage properties to allocate disk and memory storage to the space and to choose whether it
will have access to the SAP HANA data lake.
Property Description
Expose for Consumption by Default Choose the default setting for the Expose for Consumption
property for views created in this space.
Data Access/Database Users - Use the list in the Database Users section to create users who can connect external
tools and read from and write to the space. See Create a Database User.
Data Access/HDI Containers - Use the list in the HDI Containers section to associate HDI containers to the space.
See Prepare Your HDI Project for Exchanging Data with Your Space.
Note
A user who has only the DW Administrator role cannot see the HDI Containers area.
Time Data/Time Tables and Dimensions - Click the button in the Time Tables and Dimensions section to generate
time data in the space. See Create Time Data and Dimensions.
Note
A user who has only the DW Administrator role cannot see the Time Tables and Dimensions area.
Auditing/Space Audit Settings - Use the properties in the Space Audit Settings section to enable audit logging for
the space. See Logging Read and Change Actions for Audit.
Add your space to an existing scoped role (see Add Spaces to a Scoped Role).
Create a scoped role and add your space and at least one user to the scoped role (see Create a Scoped Role).
For more information, see Create a Scoped Role to Assign Privileges to Users in Spaces.
All users assigned to the space via the scoped roles are automatically displayed in the Users area of the space page. In this
area, you can add or remove users to/from scoped roles for your space (see Control User Access to Your Space). Either an
administrator or a user with space administrator privileges can do so.
8. [optional] The properties in the Workload Management section are set to their default values. To change them, go to the side navigation area and click (System) (Configuration) Workload Management (see Set Priorities and Statement Limits for Spaces or Groups).
Prerequisites
To create a file space, you must have a global role that grants you the following privileges:
The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.
Context
Note
For additional information on working with data in the object store, see SAP note 3538038 .
Note
You cannot create or manage a file space via the command line, add a file space to an elastic compute node, or choose a file
space as a monitoring space. You cannot monitor, lock, or unlock a file space. You cannot generate time data, enable audit
logging, create database users, or associate HDI containers in a file space.
Users with an administrator role can create spaces, allocate compute resources and assign users. The remaining space properties
can be managed by users with a space administrator role.
Procedure
1. In the side navigation area, click (Space Management), and click Create.
Property Description
Space Name Enter the business name of the space. Can contain a maximum of 30 characters, and can contain
spaces and special characters.
Space ID Enter the technical name of the space. Can contain a maximum of 20 uppercase letters or numbers and
must not contain spaces or special characters other than _ (underscore). Unless advised to do so, must
not contain prefix _SYS and should not contain prefixes: DWC_, SAP_ (See Rules for Technical Names).
As the technical name will be displayed in the Open SQL Schema and in monitoring tools, including SAP
internal tools, we recommend that you do not include sensitive business or personal data in the name.
Storage Type Select SAP HANA Data Lake Files. The option is greyed out if no resources have been allocated to the
object store (see Configure the Size of Your SAP Datasphere Tenant).
3. Click Create. The space page opens. The creation and provisioning of a file space may take several minutes. You must wait
for the notification message indicating that the file space is deployed before you can start working with the file space.
Property Description
Space ID Enter the technical name of the space. Can contain a maximum of 20 uppercase letters or numbers and
must not contain spaces or special characters other than _ (underscore). Unless advised to do so, must
not contain prefix _SYS and should not contain prefixes: DWC_, SAP_ (See Rules for Technical Names).
Space Name Enter the business name of the space. Can contain a maximum of 30 characters, and can contain
spaces and special characters.
Space Status [read-only] Displays the status of the space. Newly-created spaces are always active.
Space Type [read-only] Displays the type of the space. You can only create spaces of type SAP Datasphere.
Created On [read-only] Displays the date and time when the space was created.
Deployment Status [read-only] Displays the deployment status of the space. Newly-created spaces are deployed, but when
you make changes, you need to save and re-deploy them before they are available to space users.
Deployed On [read-only] Displays the date and time when the space was last deployed.
Description [optional] Enter a description for the space. Can contain a maximum of 4 000 characters.
5. Apache Spark section - The maximum amount of compute resources that the file space can consume when processing statements is allocated to its Apache Spark instance. The resources allocated to the file space depend on the resources allocated to the object store in the Tenant Configuration page (see Configure the Size of Your SAP Datasphere Tenant).
The following applications, which are listed in the Applications area, are available for the instance and are used to run the
tasks that are listed in the Task Assignment area.
Property Description
Cluster Size [read-only] Qualifies the overall size of resources allocated to the application. For
example, micro or medium.
Driver [read-only] Shows the amount of resources allocated to the driver for the application.
Executor [read-only] Shows the amount of resources allocated to the executor for the
application.
Max. Used [read-only] Shows the maximum amount of resources that can be used for the
application.
You can view which applications are used by default to run which tasks in the Task Assignment area.
To modify the size of the instance at any time, change the amount of memory and click Update. You should change the size of your instance based on the resource amounts displayed in the Max. Used column of the table. For example, you can see that the application used to run transformation flows is allocated 168 vCPUs and 672 GB of memory. If you want four transformation flows to run in parallel, you must enter 2688 in Memory (GB). The number of vCPUs is automatically calculated from the amount of memory with a ratio of 4:1 (for example, 2688 GB of memory and 672 vCPUs). The minimum size for the instance is 1632 GB of memory (and 408 vCPUs), and its maximum size is 8192 GB of memory (and 2048 vCPUs).
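The sizing arithmetic above can be sketched in a few lines. The figures (672 GB of memory per parallel transformation flow, 4:1 memory-to-vCPU ratio, 1632-8192 GB bounds) are taken from the text; the function name is hypothetical:

```python
# Sketch of the Apache Spark instance sizing arithmetic.
MEMORY_PER_FLOW_GB = 672    # memory allocated per transformation flow
MEMORY_PER_VCPU = 4         # 4:1 memory-to-vCPU ratio
MIN_MEMORY_GB, MAX_MEMORY_GB = 1632, 8192

def instance_size_for_parallel_flows(n_flows):
    memory = n_flows * MEMORY_PER_FLOW_GB
    # Clamp to the allowed instance size range.
    memory = max(MIN_MEMORY_GB, min(memory, MAX_MEMORY_GB))
    return memory, memory // MEMORY_PER_VCPU  # vCPUs derived automatically

four_flows = instance_size_for_parallel_flows(4)
```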
Add your space to an existing scoped role (see Add Spaces to a Scoped Role).
Create a scoped role and add your space and at least one user to the scoped role (see Create a Scoped Role).
For more information, see Create a Scoped Role to Assign Privileges to Users in Spaces.
All users assigned to the space via scoped roles are automatically displayed in the Users area of the space page. In this area, you can add users to or remove them from scoped roles for your space (see Control User Access to Your Space). Either a user with an administrator role or a user with a space administrator role can do so.
Note
If you've made some changes in the General Settings area, such as changing the space name or entering a description,
click Save.
If your file space and its data lake instance or Apache Spark instance run into communication errors, click Deploy.
For more information about working with data in the object store, see Acquiring and Preparing Data in the Object Store.
Prerequisites
To allocate disk and memory storage to your space, you must have a global role that grants you the following privileges:
The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.
Note
Relevant only for spaces with a storage type SAP HANA Database (Disk and In-Memory), and not for SAP HANA Data Lake
Files spaces.
Context
SAP Datasphere supports data tiering using the features of SAP HANA Cloud:
Memory Storage (hot data) - Keep your most recent, frequently-accessed, and mission-critical data loaded constantly in
memory to maximize real-time processing and analytics speeds.
When you persist a view, the persisted data is stored in memory (see Persist Data in a Graphical or SQL View).
Disk (warm data) - Store master data and less recent transactional data on disk to reduce storage costs.
When you load data to a local table or replicate data to a remote table in SAP Datasphere, the data is stored on disk by
default, but you can load it in memory by activating the Store Table Data in Memory switch (see Accelerate Table Data
Access with In-Memory Storage).
Data Lake (cold data) - Store historical data that is infrequently accessed in the data lake. With its low cost and high
scalability, the data lake is also suitable for storing vast quantities of raw structured and unstructured data, including IoT
data. For more information, see Integrating Data to and From SAP HANA Cloud Data Lake.
You can allocate specific amounts of memory and disk storage to a space or disable the Enable Space Quota option, and allow the
space to consume all the storage it needs, up to the total amount available in your tenant.
Procedure
1. In the side navigation area, click (Space Management), locate your space tile, and click Edit to open it.
2. Use the Space Storage properties to allocate disk and memory storage to the space and to choose whether it will have
access to the SAP HANA data lake.
- Enable Space Quota: Disable this option to allow the space to consume any amount of disk and memory storage up to the total amounts available in your tenant. If this option was disabled and then subsequently re-enabled, the Disk and Memory properties are initialized to the minimum values required by the current contents of the space. Default: Enabled.
- Disk (GB): Enter the amount of disk storage allocated in GB. You can use the buttons to change the amount by whole GBs or enter fractional values in increments of 100 MB by hand. Default: 2 GB.
- Memory (GB): Enter the amount of memory storage allocated in GB. This value cannot exceed the amount of disk storage allocated. You can use the buttons to change the amount by whole GBs or enter fractional values in increments of 100 MB by hand. Note: The memory allocated is used to store data and is not related to processing memory. For more information on limiting processing memory in a space, see Set Priorities and Statement Limits for Spaces or Groups. Default: 1 GB.
- Use This Space to Access the Data Lake: Enable access to the SAP HANA Cloud data lake. Only one space can connect to the data lake. Note: Even though the option is available for selection, you should first check that no other space already has access to the data lake. To do so, you can choose the table layout in the Space Management overview page and sort on the Data Lake Access column. Default: Disabled.
Note
If a space exceeds its allocations of memory or disk storage, it will be locked until a user of the space deletes the excess
data or an administrator assigns additional storage. See Unlock a Locked Space.
3. Click Save to save your changes to the space, or Deploy to save and immediately make the changes available to users
assigned to the space.
Results
To view the total storage available and the amount assigned to and used by all spaces, see Monitoring SAP Datasphere.
Export Workload Management Settings
Prerequisites
To set priorities and statement limits for spaces or groups, you must have a global role that grants you the following privileges:
System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.
The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.
Note
Relevant only for spaces with a storage type SAP HANA Database (Disk and In-Memory), and not for SAP HANA Data Lake
Files spaces.
Note
You can use the SAP Datasphere command line interface, datasphere, to set space or group priorities and statement limits.
See Manage Priorities and Statement Limits for Spaces or Groups via the Command Line.
2. If Space is not already selected, select it and click Save. Be aware that if you've modified the default settings for the Group
option, once you confirm the reorganization by space, your settings will be deleted. To keep your settings, you can export
them (see Export Workload Management Settings).
3. In the confirmation message that opens, click Yes. The process may take some time but you can continue to work in other
areas of SAP Datasphere.
4. Click on the row of the space for which you want to edit the properties.
Note
You can search for a space based on its ID by entering one or more characters in the Search field. Only the spaces
whose space ID includes the entered characters are displayed in the table.
5. To prioritize between spaces, specify in the Space Priority section the prioritization of this space when querying the
database. You can choose a value from 1 (lowest priority) to 8 (highest priority). The default value is 5. In situations where
spaces are competing for available threads, those with higher priorities have their statements run before those of spaces
with lower priorities.
6. To manage other workload parameters, you can select either of the following in the Configuration dropdown list:
Default. The default configuration provides generous resource limits, while preventing any single space from
overloading the system. The default configuration is applied by default to new spaces.
These statement limit and admission control parameters are taken into account in the default configuration and
cannot be changed:
Custom. These statement limit and admission control parameters are taken into account in the custom
configuration. You can specify only the value for statements limits to set maximum total thread and memory limits
that statements running concurrently in the space can consume:
Caution
Be aware that changing the statement limits may cause performance issues.
- Total Statement Thread Limit: In the Total Statement Thread Limit area, enter the maximum number (or percentage) of threads that statements running concurrently in the space can consume. You can enter a percentage between 1% and 100% (or the equivalent number) of the total number of threads available in your tenant.
  Note: 100% represents the maximum of 80% of CPU resources reserved for workload generated by spaces, user group users, and agent users. The remaining 20% of CPU resources are reserved to ensure that the system can respond under heavy load.
  Setting this limit prevents the space from consuming too many threads, and can help with balancing resource consumption between competing spaces.
  Caution: Setting this limit too low may impact statement performance, while excessively high values may impact the performance of statements in other spaces.
  Default: 70%
- Total Statement Memory Limit: In the Total Statement Memory Limit area, enter the maximum number (or percentage) of GBs of memory that statements running concurrently in the space can consume. You can enter any value or percentage between 0 (no limit) and the total amount of memory available in your tenant.
  Setting this limit prevents the space from consuming all available memory, and can help with balancing resource consumption between competing spaces.
  Caution: Setting this limit too low may cause out-of-memory issues, while excessively high values or 0 may allow the space to consume all available system memory.
  Default: 80%
7. Click Save. The changes are reflected in read-only mode on the space details page.
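As a rough sketch, the percentage-based limits described above can be converted into absolute values like this. The function names and the tenant totals passed in are hypothetical examples, not part of the product.

```python
# Illustrative helpers; tenant totals are made-up example inputs.

def thread_limit(percent, tenant_threads):
    """Absolute thread limit for a 1-100% Total Statement Thread Limit.

    Per the note above, 100% corresponds to the 80% share of CPU resources
    reserved for workload generated by spaces; the remaining 20% is kept so
    the system can respond under heavy load.
    """
    if not 1 <= percent <= 100:
        raise ValueError("thread limit must be between 1% and 100%")
    return tenant_threads * percent // 100

def memory_limit_gb(percent, tenant_memory_gb):
    """Absolute memory limit in GB; 0 means no limit."""
    if not 0 <= percent <= 100:
        raise ValueError("memory limit must be between 0% and 100%")
    return tenant_memory_gb * percent // 100

# Defaults from the table above: 70% of threads, 80% of memory.
print(thread_limit(70, 400), memory_limit_gb(80, 2000))  # → 280 1600
```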
You can set priorities and statement limits by group (groups of processes) and distribute the workload between the 8 groups provided.
For example, you can choose total thread and memory limits for the Analytic Consumption group that are higher than those for the Modeling or Data Management groups, which can typically run more slowly in the background.
- Analytic Consumption: Analytic data preview inside the Analytic Model editors.
- Data Management: Actions related to data integration, such as running task chains, persisting data, or importing data.
- Modeling: Data preview actions in the Business Builder and the Data Builder.
- SAP Analytics Cloud Data Management: Import and export of data in SAP Analytics Cloud.
- SAP Analytics Cloud Interactive Operations: Navigation in SAP Analytics Cloud, primarily for loading stories.
- SAP Analytics Cloud Long-Running Operations: Background jobs for planning workflows in SAP Analytics Cloud.
- SAP Analytics Cloud System Operation: Background operations in SAP Analytics Cloud, such as collecting statistics and cleaning up jobs.
2. If Group is not already selected, select it and click Save. Be aware that if you've modified the default settings for the option
Space, once you confirm the reorganization by group, your settings will be deleted. To keep your settings, you can export
them (see Export Workload Management Settings).
3. In the confirmation message that opens, click Yes. The process may take some time but you can continue to work in other
areas of SAP Datasphere.
The groups are provided with the following default priorities and statement limits:
The default configuration provides generous resource limits, while preventing any single group from overloading the
system.
4. To change the priority or statement limits of a group, click it and change the settings as follows:
Note
The statement timeout and admission control parameters mentioned in the table of the previous step are also taken
into account in the custom configuration but you cannot change their values.
- Priority: To prioritize between groups, specify the priority of this group when querying the database. You can choose a value from 1 (lowest priority) to 8 (highest priority). The default value depends on the group. In situations where groups are competing for available threads, those with higher priorities have their statements run before those of groups with lower priorities.
- Total Statement Thread Limit: Select Configuration Custom and enter the maximum number (or percentage) of threads that statements running concurrently in the group can consume. You can enter a percentage between 1% and 100% (or the equivalent number) of the total number of threads available in your tenant.
  Setting this limit prevents the group from consuming too many threads, and can help with balancing resource consumption between competing groups.
  Caution: Setting this limit too low may impact statement performance, while excessively high values may impact the performance of statements in other groups.
- Total Statement Memory Limit: Select Configuration Custom and enter the maximum number (or percentage) of GBs of memory that statements running concurrently in the group can consume. You can enter any value or percentage between 0 (no limit) and the total amount of memory available in your tenant.
  Setting this limit prevents the group from consuming all available memory, and can help with balancing resource consumption between competing groups.
  Caution: Setting this limit too low may cause out-of-memory issues, while excessively high values or 0 may allow the group to consume all available system memory.
5. Click Save.
3. Save it.
To apply the customized workload settings, for spaces or groups, that you have previously exported, you can import the saved
.json file.
3. Select the .json file that you have previously exported and click Import.
Prerequisites
To copy a space and its contents, you must have a global role that grants you the following privileges:
The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.
Note
A space cannot be copied if it contains any Business Builder objects.
If the space is used as storage by an associated SAP Analytics Cloud tenant, then it cannot be copied if any SAP Analytics
Cloud objects are exposed (see Exposing Objects for Consumption in SAP Datasphere).
Note
Relevant only for spaces with a storage type SAP HANA Database (Disk and In-Memory), and not for SAP HANA Data Lake
Files spaces.
Context
If you copy a space that contains objects protected by a namespace, the copied objects will be modified so that they are removed
from the namespace and become editable. Copying protected content in this way allows you to extend content delivered through
SAP Business Data Cloud (see Extending Intelligent Applications).
Procedure
1. In the side navigation area, click (Space Management), locate your space tile, and click Edit to open it.
3. Enter the name of the new space that you want to copy to.
By default, the contents of the space will be copied to the new space, but will not be deployed. To have them deployed,
select Deploy Objects.
4. Click Copy.
The new space is configured exactly as the original space but has a new Space ID and Space Name:
- All connections are copied (credentials need to be re-entered unless the connection is a shared UCL connection).
  Note: Replication task schedules are not copied and must be recreated manually.
- Any objects shared to the original space are shared to the new space.
- The new space is added as a scope to all scoped roles that the original space belongs to, but no users are added to the new space, by default.
For information about adding users to a space, see Create a Scoped Role to Assign Privileges to Users in Spaces.
When specifying the technical name of an object, bear in mind the following rules and restrictions:
- Space: The space ID can only contain uppercase letters, numbers, and underscores (_). Reserved keywords, such as SYS, CREATE, or SYSTEM, must not be used. Unless advised to do so, the ID must not contain the prefix _SYS and should not contain the prefixes DWC_ or SAP_. Also, the keywords that are reserved for the SAP HANA database cannot be used in a space ID. See Reserved Words in the SAP HANA SQL Reference Guide for SAP HANA Platform. The maximum length is 20 characters.
- Elastic Compute Node: The elastic compute node technical name can only contain lowercase letters (a-z) and numbers (0-9). It must contain the prefix ds. The minimum length is 3 and the maximum length is 9 characters.
- SAP BW bridge instance, or remote table generated during the import of analysis authorizations from an SAP BW or SAP BW∕4HANA system: The technical name can contain any characters except for the asterisk (*), colon (:), and hash sign (#). Also, tab, carriage return, and newline must not be used, and a space must not be used at the start of the name. The maximum length is 50 characters.
- Object created in the Data Builder (for example, a table, view, E/R model, flow, intelligent lookup, task chain, or data access control): The technical name can only contain alphanumeric characters and underscores (_). The maximum length is 50 characters.
- Element in the Data Builder (for example, a column, or a join, projection, or aggregation node): The technical name can only contain alphanumeric characters and underscores (_). The maximum length is 30 characters.
- Object created in the Business Builder (for example, a fact, dimension, fact model, consumption model, or authorization scenario): The technical name can only contain alphanumeric characters and underscores (_). The maximum length is 30 characters.
- Association: The technical name can only contain alphanumeric characters, underscores (_), and dots (.). The maximum length is 20 characters.
- Input parameter: The technical name can only contain uppercase letters, numbers, and underscores (_). The maximum length is 30 characters.
- Database analysis user: The user name suffix can only contain uppercase letters, numbers, and underscores (_), with a maximum length of 31 characters (40 minus prefix). This suffix is added to the default prefix DWCDBUSER# to create your full user name, which can be at most 41 characters. Note that you cannot change the prefix as it is a reserved prefix.
- Database user group user: The user name suffix can only contain uppercase letters, numbers, and underscores (_), with a maximum length of 30 characters (40 minus prefix). This suffix is added to the default prefix DWCDBGROUP# to create your full user name, which can be at most 41 characters. Note that you cannot change the prefix as it is a reserved prefix.
- Database user (Open SQL schema): The user name suffix can only contain uppercase letters, numbers, and underscores (_), with a maximum length of 40 minus the space name (or 41 minus the prefix). This suffix is added to the default prefix <space ID># to create your full user name, which can be at most 41 characters. Note that you cannot change the prefix.
- Connection: The technical name can only contain alphanumeric characters and underscores (_). Underscore (_) must not be used at the start or end of the name. The maximum length is 40 characters.
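Several of the rules above are simple enough to check mechanically. The following sketch restates two of them as regular expressions; it is illustrative only, not an SAP-provided validator, and the reserved-word list contains only the examples named above, not the full SAP HANA reserved-word list.

```python
import re

# Illustrative validators restating the documented naming rules.
SPACE_ID_RE = re.compile(r"^[A-Z0-9_]{1,20}$")        # uppercase, digits, _; max 20
DATA_BUILDER_RE = re.compile(r"^[A-Za-z0-9_]{1,50}$") # alphanumeric and _; max 50

RESERVED = {"SYS", "CREATE", "SYSTEM"}        # example reserved keywords only
FORBIDDEN_PREFIXES = ("_SYS", "DWC_", "SAP_") # prefixes to avoid in space IDs

def valid_space_id(space_id: str) -> bool:
    """Check a candidate space ID against the rules quoted above."""
    return (bool(SPACE_ID_RE.match(space_id))
            and space_id not in RESERVED
            and not space_id.startswith(FORBIDDEN_PREFIXES))

print(valid_space_id("MY_SPACE_01"), valid_space_id("DWC_SALES"))  # → True False
```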
The technical name is synchronized with the business name by default. While you enter the business name, invalid characters are replaced in the technical name.
Note
Relevant only for spaces with a storage type SAP HANA Database (Disk and In-Memory), and not for SAP HANA Data Lake
Files spaces.
To use datasphere to create spaces, you must have an SAP Datasphere user with the DW Administrator role or equivalent
permissions (see Roles and Privileges by App and Feature).
For more information, see Manage Spaces via the Command Line.
Restore a Space
Prerequisites
To restore spaces, or delete them from the Recycle Bin, you must have a global role with the following privileges:
Spaces (-------M) - To access the Recycle Bin in the Space Management tool.
The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.
Context
Once a space has been deleted and moved to the Recycle Bin (see Delete Your Space), you can either restore the space or
permanently delete the space from the database to recover the disk storage used by the data in the space.
Restore a Space
1. In the side navigation area, click (Space Management).
2. In the Recycle Bin area, locate and select your space and click the Restore button.
The space is moved to the All Spaces area and you can work with it.
Note
Once a space is restored, its status is "active", regardless of the status the space had before deletion (active or locked).
The following actions are not performed automatically when a space is restored, so you may need to perform them manually:
Run the replication for remote tables connected via SAP HANA smart data access, with real-time replication, and for
replication flows with the load type "Initial and Delta" (see Replicating Data and Monitoring Remote Tables).
Re-enable real-time replication for remote tables connected via SAP HANA smart data integration, with real-time
replication (see Replicating Data and Monitoring Remote Tables).
If replication flows were stopped before the space was deleted, ensure that they get started again (see Working With
Existing Replication Flow Runs).
Synchronize the source system with the catalog (see Manually Synchronizing a System).
Caution
This action cannot be undone.
All objects and data contained in any Open SQL schema associated with the space.
All audit log entries generated for the space, including audit log entries related to any Open SQL schema associated with
the space.
Note
For spaces deleted before version 2023.05, all related audit logs are kept. A user with an administrator role can decide to delete them (see Delete Audit Logs).
2. In the Recycle Bin area, locate and select one or more spaces and click the Delete button.
3. In the confirmation message, enter DELETE if you are sure that you no longer need the spaces and any of their content or
data, then click the Delete button.
The spaces are permanently deleted from the database and cannot be recovered.
Note
When you delete a file space, the deletion can take more than 5 minutes, and your browser might display a timeout message even though the space is being properly deleted. To make sure that your space has been permanently deleted, check later that it no longer appears in the Recycle Bin.
The following overview lists the most common prerequisites per connection type and points to further information about what
needs to be prepared to connect and use a connection.
For each connection type, the overview answers the following questions: Is a Data Provisioning Agent required for remote tables? Is the installation of a third-party JDBC library required for remote tables? Is the Cloud Connector required for data flows and replication flows from on-premise sources? Is a third-party driver upload required for data flows? Is the SAP Datasphere IP required in the source allowlist? Is the source IP required in the SAP Datasphere IP allowlist?
- Amazon Athena Connections: no to all of the above. Additional information and prerequisites: Prepare Connectivity to Amazon Athena.
- Hadoop Distributed File System Connections: no to all of the above. Additional information and prerequisites: n/a.
- Microsoft Azure Blob Storage Connections: no to all of the above. Additional information and prerequisites: n/a.
- Open Connectors Connections: no to all of the above. Additional information and prerequisites: Prepare Connectivity to SAP Open Connectors.
- SAP HANA Connections: Data Provisioning Agent: yes (for on-premise). Third-party JDBC library: no. Cloud Connector: yes (for on-premise: for data flows and replication flows, or when using Cloud Connector for the remote tables feature). Driver upload: no. SAP Datasphere IP in source allowlist: no. Source IP in SAP Datasphere IP allowlist: Cloud Connector IP (for on-premise when using Cloud Connector for the remote tables feature). Additional information and prerequisites: Prepare Connectivity to SAP HANA.
- SAP HANA Cloud, Data Lake Files Connections: no to all of the above. Additional information and prerequisites: no.
Engine
Connections
Note
For information about supported versions of sources that are connected via SAP HANA smart data integration and its Data
Provisioning Agent, see the SAP HANA smart data integration and all its patches Product Availability Matrix (PAM) for SAP
HANA SDI 2.0.
For information about necessary JDBC libraries for connecting to sources from third-party vendors, see:
SAP HANA smart data integration and all its patches Product Availability Matrix (PAM) for SAP HANA SDI 2.0
Context
The Data Provisioning Agent is a lightweight component running outside the SAP Datasphere environment. It hosts data
provisioning adapters for connectivity to remote sources, enabling data federation and replication scenarios. The Data
Provisioning Agent acts as a gateway to SAP Datasphere providing secure connectivity between the database of your SAP
Datasphere tenant and the adapter-based remote sources. The Data Provisioning Agent is managed by the Data Provisioning
Server. It is required for all connections with SAP HANA smart data integration.
Through the Data Provisioning Agent, the preinstalled data provisioning adapters communicate with the Data Provisioning Server
for connectivity, metadata browsing, and data access. The Data Provisioning Agent connects to SAP Datasphere using JDBC. It
needs to be installed on a local host in your network and needs to be configured for use with SAP Datasphere.
Note
A given Data Provisioning Agent can only be connected to one SAP Datasphere tenant (see SAP Note 2445282).
For an overview of connection types that require a Data Provisioning Agent setup, see Preparing Connectivity for Connections.
Note
See also the guide Best Practices and Sizing Guide for Smart Data Integration (When used in SAP Datasphere) (published
June 10, 2022) for information to consider when creating and using connections that are based on SDI and Data Provisioning
Agent.
Procedure
To prepare connectivity via Data Provisioning Agent, perform the following steps:
1. Download and install the latest Data Provisioning Agent version on a host in your local network.
Note
We recommend that you always use the latest released version of the Data Provisioning Agent. For information on
supported and available versions of the Data Provisioning Agent, see the SAP HANA Smart Data Integration
Product Availability Matrix (PAM).
Make sure that all agents that you want to connect to SAP Datasphere have the same latest version.
2. Add the external IPv4 address of the server on which your Data Provisioning Agent is running to the IP allowlist in SAP
Datasphere. When using a proxy, the proxy's address needs to be included in the IP allowlist as well.
Note
For security reasons, all external connections to your SAP Datasphere instance are blocked by default. By adding
external IPv4 addresses or address ranges to the allowlist you can manage external client connections.
This includes configuring the agent and setting the user credentials in the agent.
For more information, see Connect and Configure the Data Provisioning Agent.
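Before entering addresses in the allowlist, a quick local sanity check of candidate entries can help. This sketch is not an SAP tool and the function name is made up; it simply accepts IPv4 addresses or address ranges in CIDR notation and rejects private addresses, since the external address of the agent (or proxy) host is what the allowlist needs.

```python
import ipaddress

def check_allowlist_entry(entry: str) -> str:
    """Return the normalized IPv4 network for a candidate allowlist entry."""
    net = ipaddress.ip_network(entry, strict=False)
    if net.version != 4:
        raise ValueError("the allowlist described here takes IPv4 entries")
    if net.is_private:
        raise ValueError("use the external (public) address of the agent or proxy host")
    return str(net)

print(check_allowlist_entry("1.2.3.0/24"))  # → 1.2.3.0/24
```

A single address such as 1.2.3.4 is normalized to the /32 network 1.2.3.4/32; internal addresses such as 10.x.x.x are rejected.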
Note
For third-party adapters, you need to download and install any necessary JDBC libraries before registering the adapters.
Results
The registered adapters are available for creating connections to the supported remote sources and enabling these connections
for creating views and accessing or replicating data via remote tables.
Context
Procedure
1. Plan and prepare the Data Provisioning Agent installation.
a. Plan your installation to ensure that it meets your system landscape's needs.
You can install the agent on any host system that has access to the sources you want to access, meets the minimum
system requirements, and has any middleware required for source access installed. The agent should be installed on
a host that you have full control over to view logs and restart, if necessary.
Planning and Preparation in the SAP HANA Smart Data Integration and SAP HANA Smart Data Quality
documentation.
Supported Platforms and System Requirements in the SAP HANA Smart Data Integration and SAP HANA
Smart Data Quality documentation.
b. Download the latest Data Provisioning Agent HANA DP AGENT 2.0 from the SAP Software Download Center .
Note
We recommend that you always use the latest released version of the Data Provisioning Agent.
Make sure that all agents that you want to connect to SAP Datasphere have the same latest version.
Software Download in the SAP HANA Smart Data Integration and SAP HANA Smart Data Quality
documentation
SAP HANA Smart Data Integration Product Availability Matrix (PAM) (for supported and available versions
for the Data Provisioning Agent and operating system support)
For more information, see Install from the Command Line in the SAP HANA Smart Data Integration and SAP HANA Smart
Data Quality documentation.
Note
If you have upgraded your Data Provisioning Agent to version 2.5.1 and want to create an Amazon Redshift connection,
apply SAP note 2985825 .
Related Information
Install the Data Provisioning Agent
Update the Data Provisioning Agent
Procedure
1. In SAP Datasphere, register the Data Provisioning Agent.
b. In the On-Premise Agents section, add a new tile to create a new agent registration in SAP Datasphere.
c. In the following dialog, enter a unique name for your new agent registration.
Note
The registration name cannot be changed later.
d. Select Create.
The Agent Settings dialog opens and provides you with information required to configure the Data Provisioning
Agent on your local host:
Agent name
HANA port
Note
Either keep the Agent Settings dialog open, or note down the information before closing the dialog.
2. At the command line, connect the agent to SAP HANA using JDBC. Perform the following steps:
a. Navigate to <DPAgent_root>/bin/. <DPAgent_root> is the Data Provisioning Agent installation root location.
By default, on Windows, this is C:\usr\sap\dataprovagent, and on Linux it is /usr/sap/dataprovagent.
On Windows: dpagent_servicedaemon_start.bat
c. Start the command-line agent configuration tool using the following command:
On Linux:
<DPAgent_root>/bin/agentcli.sh --configAgent
On Windows:
<DPAgent_root>\bin\agentcli.bat --configAgent
Tip
An encrypted connection is always required when connecting to SAP HANA in a cloud-based environment.
h. Enter the host name (HANA server) and port number (HANA port) for the SAP Datasphere instance.
i. If HTTPS traffic from your agent host is routed through a proxy, enter true and specify any required proxy
information as prompted.
i. Enter true to specify that the proxy is an HTTP proxy.
iii. If you use proxy authentication, enter true and provide a proxy user name and password.
j. Enter the credentials for the HANA user for agent messaging.
The HANA user for agent messaging is used only for messaging between the agent and SAP Datasphere.
k. Confirm that you want to save the connection settings you have made by entering true.
Note
Any existing agent connection settings will be overwritten.
On Linux:
<DPAgent_root>/bin/[Link] --configAgent
On Windows:
<DPAgent_root>/bin/[Link] --configAgent
i. To stop the agent, choose Start or Stop Agent, and then choose Stop Agent.
iii. Choose Agent Status to check the connection status. If the connection succeeded, you should see
Agent connected to HANA: Yes.
Note
For agent version 2.7.4 and higher, if in the agent status the message No connection established yet is
shown, this can be ignored.
For more information about the agent/SAP HANA connection status in agent version 2.7.4 and higher, see
SAP Note 3487646 .
3. In SAP Datasphere, if you have kept the Agent Settings dialog open, you can now close it.
Results
If the tile of the registered Data Provisioning Agent doesn’t display the updated connection status, select Refresh Agents.
Related Information
Troubleshooting the Data Provisioning Agent (SAP HANA Smart Data Integration)
Prerequisites
For third-party adapters, ensure that you have downloaded and installed any necessary JDBC libraries. Place the files in the
<DPAgent_root>/lib folder before registering the adapters with SAP Datasphere. For connection types Amazon Redshift and
Generic JDBC, place the file in the <DPAgent_root>/camel/lib folder.
For information about the proper JDBC library for your source, see the SAP HANA Smart Data Integration Product Availability Matrix (PAM) for SAP HANA SDI 2.0 and all its patches. Search for the library on the internet and download it from an appropriate web page.
Procedure
1. In the side navigation area, click (System) (Configuration) Data Integration .
2. In the On-Premise Agents section, click the Adapters button to display the agents with their adapter information.
4. In the Agent Settings dialog, under Agent Adapters select the adapters.
5. Click Close to close the dialog and register the selected adapters with SAP Datasphere.
Note
Saving is not required to update the agent settings.
The registered adapters are now available for creating connections to the supported on-premise sources.
Next Steps
To use new functionality of an already registered adapter or to update the adapter in case of issues that have been fixed in a new
agent version, you can refresh the adapter by clicking the (menu) button and then choosing Refresh.
You need to create an RFC destination in the ABAP source system. With the RFC destination, you register the Data
Provisioning Agent as a server program in the source system.
Using transaction SM59, you create a TCP/IP connection with a user-defined name. The connection should be created with
“Registered Server Program” as “Activation Type”. Specify “IM_HANA_ABAPADAPTER_*” as a filter for the “Program ID”
field, or leave it empty.
Note
You can ignore failing SM59 connection tests because the RFC connection is only built up when the replication is
running to query records from the SAP system. For more information, see SAP Note 3206908 .
Successful registration on an SAP Gateway requires that suitable security privileges are configured. For example:
Set up an Access Control List (ACL) that controls which host can connect to the gateway. That file should contain
something similar to the following syntax: <permit> <ip-address[/mask]> [tracelevel] [#
comment]. <ip-address> here is the IP of the server on which Data Provisioning agent has been installed.
For more information, see the Gateway documentation in the SAP help for your source system version, for example
in the SAP NetWeaver 7.5 documentation:
You may also want to configure a reginfo file to control permissions to register external programs.
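As a purely illustrative sketch of the ACL syntax quoted above, an entry can be composed as follows. All concrete values (the IP address, mask, trace level, and comment) are hypothetical, and the actual ACL file location depends on your gateway configuration:

```python
# Hypothetical example of a gateway ACL entry following the syntax
# <permit> <ip-address[/mask]> [tracelevel] [# comment] described above.
# The IP address is the host on which the Data Provisioning Agent runs;
# the value below is illustrative only.
agent_ip = "10.20.30.40"  # hypothetical agent host IP
entry = f"permit {agent_ip}/32 1 # Data Provisioning Agent host"
print(entry)
```

The `/32` mask restricts the entry to exactly one host, which keeps the gateway's ACL as narrow as possible.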
Context
Cloud Connector serves as a link between SAP Datasphere and your on-premise sources and is required for connections that you
want to use for:
Data flows
Replication flows
SAP BW/4HANA Model Transfer connections (Cloud Connector is required for the live data connection of type
tunnel that you need to create the model import connection)
SAP S/4HANA On-Premise connections (Cloud Connector is required for the live data connection of type tunnel
that you need to search for the entities in the SAP S/4HANA system)
Remote tables (only for SAP HANA on-premise via SAP HANA Smart Data Access)
For an overview of connection types that require a Cloud Connector setup to be able to use any of these features, see Preparing
Connectivity for Connections.
Procedure
To prepare connectivity via Cloud Connector, perform the following steps:
For more information, see Cloud Connector Installation in the SAP BTP Connectivity documentation.
2. Make sure to have the SAP Datasphere subaccount information ready. You can find the information in (System)
(Administration) Data Source Configuration .
3. In the Cloud Connector administration, set up and configure Cloud Connector according to your requirements.
4. If you have connected multiple Cloud Connector instances to your subaccount and you want to use these locations for your
connections, add the location IDs in (System) (Administration) Data Source Configuration .
5. If you want to create SAP BW/4HANA Model Transfer connections or SAP S/4HANA On-Premise connections for model
import, make sure you have switched on Allow live data to securely leave my network in (System)
(Administration) Data Source Configuration .
Result
The Cloud Connector, or the respective Cloud Connector instances, are available for creating connections and enabling them for the supported features.
Related Links
Frequently Asked Questions (about the Cloud Connector) in the SAP BTP Connectivity documentation
Prerequisites
Before configuring the Cloud Connector, the following prerequisites must be fulfilled:
For more information, see Cloud Connector Installation in the SAP BTP Connectivity documentation.
If you are using egress firewalling, add the following domains (wildcard) to the firewall/proxy allowlist in your on-premise
network:
*.[Link]
*.[Link]
Before configuring the Cloud Connector, you or the owner of your organization will need an SAP Business Technology
Platform (SAP BTP) account. If you don't have an account yet, create an account by clicking Register in the SAP BTP
cockpit.
During Cloud Connector configuration you will need information for your SAP Datasphere subaccount. Make sure that you
have the subaccount information available in System Administration Data Source Configuration SAP BTP Core
Account .
Note
If you have an account but cannot see the account information here, enter the SAP BTP user ID. This ID is typically the
email address you used to create your SAP BTP account. After you have entered the ID, you can see the Account
Information for SAP Datasphere.
Context
For more information about the supported use cases depending on the connection type, see Preparing Cloud Connector
Connectivity.
Procedure
1. Log on to the Cloud Connector Administration on [Link]
<hostname> refers to the machine on which the Cloud Connector is installed. If installed on your machine, you can simply
enter localhost.
2. To connect the SAP Datasphere subaccount to your Cloud Connector, perform the following steps:
a. In the side navigation area of the Cloud Connector Administration, click Connector to open the Connector page
and click Add Subaccount to open the Add Subaccount dialog.
b. Enter or select the following information to add the SAP Datasphere subaccount to the Cloud Connector.
Note
You can find the subaccount, region, and subaccount user information in SAP Datasphere under System
Administration Data Source Configuration SAP BTP Core Account Account Information .
Property Description
Password Add your S-User password for the SAP Business Technology Platform.
Location ID [optional] Define a location ID that identifies the location of this Cloud Connector for the subaccount.
c. Click Save.
In the Subaccount Dashboard section of the Connector page, you can see all subaccounts added to the Cloud
Connector at a glance. After you added your subaccount, you can check the status to verify that the Cloud
Connector is connected to the subaccount.
3. To allow SAP Datasphere to access systems (on-premise) in your network, you must specify the systems and the accessible
resources in the Cloud Connector (URL paths or function module names depending on the used protocol). Perform the
following steps for each system that you want to be made available by the Cloud Connector:
a. In the side navigation area, under your subaccount menu, click Cloud To On-Premise and then (Add) in the
Mapping Virtual To Internal System section of the Access Control tab to open the Add System Mapping dialog.
Note
The side navigation area shows the display name of your subaccount. If the area shows another subaccount,
select your subaccount from the Subaccount field of the Cloud Connector Administration.
b. Add your system mapping information to configure access control and save your configuration.
The procedure to add your system mapping information is specific to the protocol that you are using for
communication. The relevant protocols are:
SAP ABAP: on-premise only (remote tables via ABAP SQL service)
SAP HANA: on-premise only (data flows, replication flows, remote tables via SAP HANA Smart Data Access and Cloud Connector) - TCP
Confluent - Confluent Platform: on-premise only (replication flows) - TCP, for the Kafka broker
For more information, see Configure Access Control in the SAP BTP Connectivity documentation.
Note
When adding the system mapping information, you enter internal and virtual system information. The
internal host and port specify the actual host and port under which the backend system can be reached
within the intranet. It must be an existing network address that can be resolved on the intranet and has
network visibility for the Cloud Connector. The Cloud Connector tries to forward the request to the
network address specified by the internal host and port, so this address needs to be real. The virtual host
name and port represent the fully qualified domain name of the related system in the cloud.
We recommend using a virtual (cloud-side) name that is different from the internal name.
For ABAP-based connection types: When using load balancing, make sure to directly specify the message
server port in the System ID field of the system mapping information.
For ABAP-based connection types: The Connection Type selected in the system mapping information
(load balancing logon or connecting to a specific application server) must match the SAP Logon
Connection Type selected in SAP Datasphere connection management (Message Server or Application
Server).
If encrypted communication using TLS/SSL is defined in the SAP Datasphere connection (to establish
end-to-end encryption), ensure that the associated system mapping in the Cloud Connector does not use
TLS.
For SAP S/4HANA On-Premise connections using the ABAP SQL service for data federation with remote
tables:
Using the model import feature is not supported with the same connection.
If you want to use the same connection for remote tables and flows, you need to create two
system mappings. For more information about what to consider when creating the required
system mappings, see Using ABAP SQL Services for Accessing Data from SAP S/4HANA.
For SAP ABAP (on-premise) connections using the ABAP SQL service for data federation with remote
tables: If you want to use the same connection for remote tables and flows, you need to create two
system mappings. For more information about what to consider when creating the required system
mappings, see Using ABAP SQL Services for Accessing Data from SAP S/4HANA.
c. To grant access only to the resources needed by SAP Datasphere, select the system host you just added from the
Mapping Virtual To Internal System list, and for each resource that you want to allow to be invoked on that host
click (Add) in the Resources Of section to open the Add Resource dialog.
d. Depending on the connection type, protocol, and use case, add the required resources:
For data federation with remote tables via the ABAP SQL service:
Enter the service path of the SQL service endpoint on the SAP S/4HANA system. For example:
/sap/bc/sql/sql1/sap/s_privileged – Path and all sub-paths
Note
In older Cloud Connector versions, the option might appear as WebSocket or WebSocket Upgrade.
SAP S/4HANA On-Premise – Function Name (name of the function module for RFC):
For accessing data using CDS view extraction:
DHAMB_ – Prefix
DHAPE_ – Prefix
RFC_FUNCTION_SEARCH
LTAMB_ – Prefix
LTAPE_ – Prefix
RFC_FUNCTION_SEARCH
SAP BW, SAP ECC – Function Name (name of the function module for RFC):
For accessing data using ODP connectivity (for legacy systems that do not have the ABAP Pipeline Engine extension or DMIS Addon installed):
/SAPDS/ – Prefix
RFC_FUNCTION_SEARCH
RODPS_REPL_ – Prefix
For more information, see Configure Access Control (HTTP) and Configure Access Control (RFC) in the SAP BTP
Connectivity documentation.
e. Choose Save.
4. [optional] To enable secure network communication (SNC) for data flows, configure SNC in the Cloud Connector.
For more information, see Initial Configuration (RFC) in the SAP BTP Connectivity documentation.
Next Steps
1. If you have defined a location ID in the Cloud Connector configuration and want to use it when creating connections, you
need to add the location ID in (System) (Administration) Data Source Configuration .
2. If you want to create SAP BW/4HANA Model Transfer connections or SAP S/4HANA On-Premise connections for model
import, you need to switch on Allow live data to securely leave my network in (System) (Administration)
Data Source Configuration
For answers to the most common questions about the Cloud Connector, see Frequently Asked Questions in the SAP BTP
Connectivity documentation.
Context
The Cloud Connector allows you to connect to on-premise data sources and use them in various use cases depending on the
connection type.
Procedure
1. In the side navigation area, click (System) (Administration) Data Source Configuration .
Receive the SAP Datasphere subaccount information that is required during Cloud Connector configuration.
To receive the SAP Datasphere subaccount information, the subaccount needs to be linked to the user ID of your
SAP BTP account. In the SAP BTP Core Account section, you can check if this has been done and the information is
already available in Account Information.
During Cloud Connector configuration, you will then need to enter the following information from your SAP
Datasphere subaccount:
Subaccount
Region Host
Subaccount User
If you have an account but cannot see the Account Information, enter the SAP BTP user ID. This ID is typically the
email address you used to create your SAP BTP account. After you have entered the ID you can see the Account
Information for SAP Datasphere:
Note
If you don't have an SAP Business Technology Platform (SAP BTP) user account yet, create an account in the
SAP BTP cockpit by clicking Register in the cockpit.
To be able to use the Cloud Connector for SAP BW/4HANA Model Transfer connections to import analytic queries
with the Model Transfer Wizard and for SAP S/4HANA On-Premise connections to import ABAP CDS Views with the
Import Entities wizard, switch on Allow live data to securely leave my network in the Live Data Sources section.
Note
The Allow live data to securely leave my network switch is audited, so that administrators can see who switched
this feature on and off. To see the changes in the switch state, go to (Security) (Activities), and search
for ALLOW_LIVE_DATA_MOVEMENT.
If you have connected multiple Cloud Connector instances to your subaccount with different location IDs and you
want to offer them for selection when creating connections using a Cloud Connector, in the On-premise data
sources section, add the appropriate location IDs. If you don't add any location IDs here, the default location will be
used.
Cloud Connector location IDs identify Cloud Connector instances that are deployed in various locations of a
customer's premises and connected to the same subaccount. Starting with Cloud Connector 2.9.0, it is possible to
connect multiple Cloud Connectors to a subaccount as long as their location ID is different.
Manage IP Allowlist
Add IP addresses to the IP Allowlist by either directly entering them or importing them from a CSV file. You can also export the IP
Allowlist.
Context
To secure your environment, you can control the range of IPv4 addresses that get access to the database of your SAP Datasphere tenant by adding them to an allowlist.
You need to provide the external (public) IPv4 address (range) of the client directly connecting to the database of SAP
Datasphere. This client might be an SAP HANA smart data integration Data Provisioning Agent on a server, a 3rd party ETL or
analytics tool, or any other JDBC-client. If you're using a network firewall with a proxy, you need to provide the public IPv4 address
of your proxy.
Internet Protocol version 4 addresses (IPv4 addresses) have a size of 32 bits and are represented in dot-decimal notation,
[Link] for example. The external IPv4 address is the address that the internet and computers outside your local network
can use to identify your system.
The address can either be a single IPv4 address or a range specified with a Classless Inter-Domain Routing suffix (CIDR suffix). An
example for a CIDR suffix is /24 which represents 256 addresses and is typically used for a large local area network (LAN). The
CIDR notation for the IPv4 address above would be: [Link]/24 to denote the IP addresses between [Link] and
[Link] (the leftmost 24 bits of the address in binary notation are fixed). The external (public) IP address (range) to enter
into the allowlist will be outside of the range [Link]/16. You can find more information on Classless Inter-Domain Routing on
Wikipedia .
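The CIDR arithmetic described above can be checked with Python's standard ipaddress module. Since the concrete addresses in this extract are redacted, the reserved documentation range 198.51.100.0/24 is used as a stand-in example:

```python
# Verify the CIDR arithmetic described above using the standard library.
# 198.51.100.0/24 is a reserved documentation range, used here as a
# stand-in for the redacted example address.
import ipaddress

net = ipaddress.ip_network("198.51.100.0/24")
print(net.num_addresses)  # a /24 suffix covers 256 addresses
print(net[0], net[-1])    # first and last address in the range
print(ipaddress.ip_address("198.51.100.42") in net)  # membership check
```

The leftmost 24 bits are fixed, so the range runs from 198.51.100.0 to 198.51.100.255, matching the /24 example in the text.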
Note
The number of entries in the allowlist is limited. Once the limit has been reached, you won't be able to add entries.
Therefore, consider carefully which IP addresses should be added, and whether the number of allowlist entries can be reduced by using ranges so that as few entries as possible are needed.
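One way to reduce the number of entries, as the note suggests, is to collapse adjacent or overlapping networks into larger ranges. A small sketch using Python's standard ipaddress module (the addresses below are hypothetical, from a reserved documentation range):

```python
# Collapse adjacent/overlapping networks into as few CIDR entries as possible.
# The addresses are illustrative only (reserved documentation range).
import ipaddress

nets = [ipaddress.ip_network(s) for s in
        ["203.0.113.0/25", "203.0.113.128/25", "203.0.113.64/26"]]
collapsed = list(ipaddress.collapse_addresses(nets))
print(collapsed)  # the three entries merge into a single /24 range
```

Three separate allowlist entries become one, which helps stay under the entry limit.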
Procedure
1. In the side navigation area, click (System) (Configuration) IP Allowlist .
Trusted IPs: For clients such as a Data Provisioning Agent on a server, third-party ETL or analytics tools, or any other
JDBC-client
Trusted Cloud Connector IPs: For Cloud Connectors that you want to use for federation and replication with remote
tables from on-premise systems such as SAP HANA
The selected list shows all IP addresses that are allowed to connect to the SAP Datasphere database.
Note
Once the number of entries in the allowlist has reached its limit, the Add button will be disabled.
4. In the CIDR field of the dialog, either provide a single IPv4 address or a range specified with a CIDR suffix.
Note
Please make sure that you provide the external IPv4 address of your client or, when using a network firewall, of your proxy. The IP you enter needs to be your public internet IP.
5. [optional] You can add a description of up to 120 characters to better understand your IP entries.
7. To save your newly added IP to the allowlist on the database, click Save in the pushbutton bar of your list.
Note
Updating the allowlist in the database requires some time. To check if your changes have been applied, click Refresh.
Next Steps
You can also select and edit an entry from the list if an IP address has changed, or you can delete IPs if they are not required
anymore to prevent them from accessing the database of SAP Datasphere. To update the allowlist in the database with any change
you made, click Save and be reminded that the update in the database might take some time.
Context
You could find yourself in a situation where you need many IP addresses added to your current list of IP addresses. Rather than
manually entering them, an easier way to move IP addresses is to import or export a list from SAP Datasphere. When importing,
the file should be a CSV type using a semicolon, comma, tab, or pipe as the value that separates the IP addresses and their
descriptions. The column headings must include CIDR (Classless Inter-Domain Routing) and Description. Here is an example of a
basic comma-separated CSV file:
CIDR,Description
[Link],Computer1
[Link]/1,Computer 2
You can use a file produced in the same or on a different SAP Datasphere tenant.
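As a sketch of the file format described above, an allowlist CSV can be built and checked with Python's standard csv module. The file name, delimiter choice, and row values are illustrative; the guide's own example addresses are redacted in this extract:

```python
# Build an allowlist file with the columns described above (CIDR, Description),
# using a comma as the separator. File name and values are illustrative.
import csv

rows = [
    {"CIDR": "203.0.113.10", "Description": "Computer 1"},
    {"CIDR": "203.0.113.0/28", "Description": "Office range"},
]

with open("allowlist.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["CIDR", "Description"])
    writer.writeheader()
    writer.writerows(rows)

# Read the file back to verify its structure before importing it.
with open("allowlist.csv", newline="") as f:
    entries = [r["CIDR"] for r in csv.DictReader(f)]
print(entries)
```

A semicolon, tab, or pipe delimiter works equally well; pass the chosen character as the `delimiter` argument to `DictWriter`.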
Procedure
1. In the side navigation area, click (System) (Configuration) IP Allowlist .
Trusted IPs: For clients such as a Data Provisioning Agent on a server, third-party ETL or analytics tools, or any other
JDBC-client
Trusted Cloud Connector IPs: For Cloud Connectors that you want to use for federation and replication with remote
tables from on-premise systems such as SAP HANA
The selected list shows all IP addresses that are allowed to connect to the SAP Datasphere database.
Option Action

Import:
b. Click Select Source File, and choose the allowlist file. Click Open.
c. Choose one of the following options:
Append new IPs: Add the unique IP addresses to the current list or update existing IP descriptions.
Overwrite existing IPs: Remove the old IP addresses and add only those addresses in this file.
d. Click Import.

Export:
c. Click Export.
Remote applications might restrict access to their instances. Whether an external client such as SAP Datasphere is allowed to access the remote application is often decided by the remote application based on allowlisted IPs. Any external client trying to access the remote application must be made known to the remote application before the first access attempt, by adding the external client's IP address(es) to an allowlist in the remote application. As an SAP Datasphere administrator, or as a user with the System Information = Read privilege, you can find the necessary information in the About dialog.
Particular remote applications or sources that you might want to access with SAP Datasphere restrict access to their instances
and require external SAP Datasphere IP address information to be added to an allowlist in the remote application before first trying
to access the application.
Users with the DW Administrator role can open a More section to find more details.
Administrators can find the Replication/Data Flow NAT IP (egress) from the side navigation area by clicking (System)
(About) More Replication/Data Flow NAT IP (egress).
Examples
The network for Amazon Redshift, Microsoft Azure SQL Database, or SAP SuccessFactors instances, for example, is protected by a
firewall that controls incoming traffic. To be able to use connections with these connection types for data flows or replication flows,
the connected sources require the relevant SAP Datasphere network address translation (NAT) IP address to be added to an
allowlist.
For Amazon Redshift and Microsoft Azure SQL Database, find the Replication/Data Flow NAT IP (egress) in the last step of the
connection creation wizard.
(IP address of SAP Datasphere's SAP HANA Cloud database instance)
If connecting a REST remote source to the HANA Cloud instance through SDI (for example, OData Adapter), then the REST remote
source is accessed using one of the NAT / egress IPs.
If connecting a remote source using SDA to the HANA Cloud instance, then the connection uses the NAT / egress IP in case the
Cloud Connector is not used in the scenario.
Administrators can find the NAT IPs from the side navigation area by clicking (System) (About) More SAP
HANA Cloud NAT IP (egress).
For more information, see Domains and IP Ranges in the SAP HANA Cloud documentation.
Access to SAP SuccessFactors instances is restricted. To be able to use an SAP SuccessFactors connection for remote tables and view building, the connected source requires the externally facing IP addresses of the SAP Datasphere tenant to be added to an allowlist.
For more information about adding the IP addresses in SAP SuccessFactors, see Adding an IP Restriction in the SAP
SuccessFactors platform documentation.
If you're using SAP Datasphere on Microsoft Azure and want to connect to an Azure storage service in a firewall-protected
Microsoft Azure storage account within the same Azure region, an administrator must allow SAP Datasphere's Virtual Network
Subnet ID in the Microsoft Azure storage account. This is required for connections to Azure storage services such as Microsoft
Azure Data Lake Store Gen2.
Administrators can find the ID from the side navigation area by clicking (System) (About) More Virtual
Network Subnet ID (Microsoft Azure).
Related Links
SAP Note 3456052 (FAQ: About IP Addresses used in SAP Datasphere)
Prerequisites
You have downloaded the required SSL/TLS certificate from an appropriate website. As one option for downloading, common
browsers provide functionality to export these certificates.
Note
Only X.509 Base64-encoded certificates enclosed between "-----BEGIN CERTIFICATE-----" and "-----END CERTIFICATE-----" are supported. The common filename extension for the certificates is .pem (privacy-enhanced mail). We also support the filename extensions .crt and .cer.
A certificate used in one region might differ from those used in other regions. Also, some sources, such as Amazon
Athena, might require more than one certificate.
If you have a problem with a certificate, please contact your cloud company for assistance.
Context
For connections secured by leveraging HTTPS as the underlying transport protocol (using SSL/TLS transport encryption), the
server certificate must be trusted.
Note
You can create connections to remote systems that require a certificate upload without having uploaded the necessary certificate. Validating a connection without a valid server certificate will fail, however, and you won't be able to use the connection.
Procedure
1. In the side navigation area, click (System) (Configuration) Security .
3. In the Upload Certificate dialog, browse your local directory and select the certificate.
4. Enter a description to provide intelligible information on the certificate, for example to point out to which connection type
the certificate applies.
5. Choose Upload.
Results
In the overview, you can see the certificate with its creation and expiry date. From the overview, you can delete certificates if
required.
Prerequisites
Search for the required driver files in the internet, make sure you have selected the correct driver files (identified by their
SHA256-formatted fingerprint) and download them from an appropriate web page (see below).
Context
Drivers are required for the following connection types (if several driver versions are supported, we recommend using the newest supported version mentioned below):
Make sure to select the Basic Light package zip file. The package applies to all versions supported by the Oracle connection type (Oracle 12c, Oracle 18c, and Oracle 19c). [Link] [Link]
Before uploading the files, you must rename them following the names already indicated:
[Link], osdt_cert.jar, osdt_core.jar.
When uploading the drivers, they are identified by their SHA256-formatted fingerprint. You can verify the fingerprint with the
following command:
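The verification command itself is not preserved in this extract. As one equivalent check (the file name "driver.jar" is a placeholder, and an empty file is created here purely so the example is self-contained), a SHA256 fingerprint can be computed with Python's standard hashlib module:

```python
# Compute a SHA256 fingerprint of a driver file, to compare against the
# fingerprint shown during upload. "driver.jar" is a placeholder name;
# the empty file created below stands in for a real downloaded driver.
import hashlib

def sha256_fingerprint(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large driver files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

open("driver.jar", "wb").close()  # empty placeholder file for illustration
print(sha256_fingerprint("driver.jar"))
```

Compare the printed hexadecimal digest with the fingerprint expected for the driver before uploading.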
Upload a Driver
Perform the following steps before creating the first Amazon Redshift, Oracle, or Google BigQuery connection that you want to use
for data flows.
2. Go to Third-Party Drivers and choose Upload.
3. In the following dialog box, choose Browse to select the driver file from your download location.
Note
The fingerprint of the driver file name to be uploaded must match the fingerprint mentioned above.
4. Choose Upload.
5. Choose sync to synchronize the driver with the underlying component. Wait about 5 to 10 minutes for the synchronization to finish before you start creating connections or using data flows with the connection.
You might need to remove a driver when you want to upload a new version of the driver or your license agreement has terminated.
2. If you're using a connection that requires the removed driver for data flows, choose Upload to re-upload the driver to
make sure that you can continue using the data flows.
3. Choose sync to synchronize the driver changes with the underlying component. Once the synchronization has finished,
you can continue using data flows with the connection, or, if you haven't uploaded a new driver, you won't be able to use
data flows with the connection anymore unless you re-upload the driver.
Troubleshooting
If a data flow fails with the error message saying that the driver could not be found, check that the drivers are uploaded and start
synchronization.
Note
This procedure only applies to manual data product installation. It doesn't apply to the installation of SAP Business Data Cloud
intelligent applications.
Context
SAP systems provide their SAP Business Data Cloud data products to SAP Datasphere via SAP Business Data Cloud formations
(see Integrate SAP Business Data Cloud Provisioned Systems in the SAP Business Data Cloud documentation). When your SAP
Datasphere tenant is added to an SAP Business Data Cloud formation, the connections to the source systems of the formation
become available in SAP Datasphere. Both systems and connections can be found under (System) (Configuration)
Business Data Products .
Before an SAP Datasphere modeler can install data products from an SAP system to any target spaces, an SAP Datasphere
administrator must authorize these spaces in (System) (Configuration) Business Data Products .
Procedure
1. In the side navigation area of SAP Datasphere, click (System) (Configuration) Business Data Products .
2. [optional] Select the system (with its connection) and choose Edit Business Name to provide a more reasonable business
name to the connection.
3. Select the system and click (Details) to open the side panel.
4. In the side panel, choose Add to authorize one or more spaces to install data products from the system, and confirm your
selection.
Note
You can only remove a space if no data products are installed in this space (see the Data Products Installed column in
the list of selected spaces).
Results
An SAP Datasphere modeler installing the data products in the catalog can select authorized spaces as target spaces (in the
Import Entities wizard). When a data product is installed:
The data product objects are created and deployed in the ingestion space and shared with the target spaces selected
during installation.
In an Adverity workspace, you have prepared a datastream that connects to the data source for which you want to create
the connection.
In SAP Datasphere, you have added the necessary Adverity IP addresses to the IP allowlist. For more information, see
Manage IP Allowlist.
Note
To get the relevant IP addresses, please contact your Adverity Account Manager or the Adverity Support team.
Related Information
Adverity Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
A DW administrator has uploaded the server certificates to SAP Datasphere. Two certificates are required, one for Amazon
Athena and one for Amazon S3. Region-specific certificates might be required for Amazon Athena. Alternatively, if the
common root CA certificate contains trust for both endpoints, Amazon Athena and Amazon Simple Storage Service
(API/Athena and Data/S3), you can upload the root certificate.
Related Information
Amazon Athena Connections
Replication Flows
Before you can use the connection for replication flows, the following is required:
An administrator has installed and configured Cloud Connector to connect to the Apache Kafka on-premise
implementation.
Related Information
Apache Kafka Connections
Replication Flows
Before you can use the connection for replication flows, the following is required:
An administrator has installed and configured Cloud Connector to connect to Confluent Platform (Kafka brokers) and to
the Schema Registry.
Note
Separate Cloud Connector instances might be used for the two endpoints. The Schema Registry might be reached through one
Cloud Connector location while the connection to the Kafka brokers goes through another location.
Related Information
Confluent Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the CamelJdbcAdapter.
An administrator has downloaded and installed the required JDBC library in the <DPAgent_root>/camel/lib folder
and restarted the Data Provisioning Agent before registering the adapter with SAP Datasphere.
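As a sketch, the driver installation step above might look like the following shell commands. This is an illustration only: <DPAgent_root> is represented by a demo directory and the jar file name is a placeholder, since the actual driver file depends on your source.

```shell
# Hypothetical sketch of placing a JDBC driver into <DPAgent_root>/camel/lib.
# DPAGENT_ROOT and the jar name are placeholders for your installation.
DPAGENT_ROOT=./dpagent_demo            # stand-in for <DPAgent_root>
mkdir -p "$DPAGENT_ROOT/camel/lib"     # already exists in a real agent install
touch source-jdbc-driver.jar           # stand-in for the downloaded JDBC jar
cp source-jdbc-driver.jar "$DPAGENT_ROOT/camel/lib/"
# After copying, restart the Data Provisioning Agent, then register the
# adapter with SAP Datasphere.
ls "$DPAGENT_ROOT/camel/lib"
```

In a real installation you would copy the driver jar for your specific source (as listed in the PAM) instead of the placeholder file.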
Data Flows
Before you can use the connection for data flows, the following is required:
For information on where a DW administrator can find the IP address, see Finding SAP Datasphere IP addresses.
A DW administrator has uploaded the required ODBC driver file to SAP Datasphere.
For more information, see Upload Third-Party ODBC Drivers (Required for Data Flows).
Related Information
Amazon Redshift Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the CloudDataIntegrationAdapter.
For ABAP-based cloud SAP systems such as SAP S/4HANA Cloud or SAP Marketing Cloud: A communication arrangement
has been created for communication scenario SAP_COM_0531 in the source system.
For more information, see Integrating CDI in the SAP S/4HANA Cloud documentation.
Data Flows
Before you can use the connection for data flows, the following is required:
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
For ABAP-based cloud SAP systems such as SAP S/4HANA Cloud or SAP Marketing Cloud: A communication arrangement
has been created for communication scenario SAP_COM_0531 in the source system.
For more information, see Integrating CDI in the SAP S/4HANA Cloud documentation.
Related Information
Cloud Data Integration Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the CamelJdbcAdapter.
It has been checked that the data source is supported by the CamelJdbcAdapter.
For the latest information about supported data sources and versions, see the SAP HANA Smart Data Integration Product
Availability Matrix (PAM).
Note
For information about unsupported data sources, see SAP Note 3130999 .
An administrator has downloaded and installed the required JDBC library in the <DPAgent_root>/camel/lib folder
and restarted the Data Provisioning Agent before registering the adapter with SAP Datasphere.
For more information, see Set up the Camel JDBC Adapter in the SAP HANA Smart Data Integration and SAP HANA Smart
Data Quality Installation and Configuration Guide.
For information about the proper JDBC library for your source, see the SAP HANA smart data integration Product
Availability Matrix (PAM).
Related Information
Generic JDBC Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
Data Flows
Before you can use the connection for data flows, the following is required:
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
Related Information
Generic OData Connections
The expected format of the file provided as the host key is one or more lines, each composed of the following elements:
Example
The following is a valid file with two entries:
Provide the host public key through a trusted channel. If your Windows 10, Linux, or macOS machine has a trusted channel,
perform the following steps, replacing the listed elements with the specified values. Use the resulting file host_key.[Link],
in the directory where you run the specified command, as the Host Key for your connection. The specified commands are as
follows:
Linux/macOS: In a Unix-compliant shell where both the ssh-keyscan and sed commands are installed, obtain the key with the
following command:
Note
If your machine doesn't have a trusted channel, we recommend asking your administrator for the public host key to avoid man-
in-the-middle attacks.
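The Linux/macOS command described above could be sketched as follows. This is an illustration under assumptions: the host name sftp.example.com, port 22, the output file name host_key.txt, and the sed expression (which drops the leading host-name column from ssh-keyscan output) are placeholders, not values from the official documentation.

```shell
# Hypothetical sketch: fetch the SFTP server's public host key and strip the
# leading host-name column, leaving "<key-type> <base64-key>" lines.
# With placeholder host, port, and file name, the command could look like:
#   ssh-keyscan -p 22 sftp.example.com 2>/dev/null \
#     | sed -e 's/^[^ ]* //' > host_key.txt
# The same sed step, demonstrated on simulated ssh-keyscan output:
printf 'sftp.example.com ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQ\n' \
  | sed -e 's/^[^ ]* //'
```

Before trusting the resulting file, verify the key fingerprint with your administrator, as recommended in the note above.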
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
Related Information
Generic SFTP Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
Note
The root certificates GTS Root R1 and GTS Root R4 (valid until 2036) are required. In your browser, open
[Link] (Google Trust Services Repository) to download the certificates (supported filename
extensions are .pem and .crt).
A DW administrator has uploaded the required ODBC driver file to SAP Datasphere.
For more information, see Upload Third-Party ODBC Drivers (Required for Data Flows).
Related Information
Google BigQuery Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the MssqlLogReaderAdapter.
An administrator has downloaded and installed the required JDBC library in the <DPAgent_root>/lib folder before
registering the adapter with SAP Datasphere.
To use Microsoft SQL Server trigger-based replication, the user entered in the connection credentials needs to have the
required privileges and permissions. For more information, see Required Permissions for SQL Server Trigger-Based
Replication in the Installation and Configuration Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data
Quality
For information on where a DW administrator can find the IP address, see Finding SAP Datasphere IP addresses.
Related Information
Microsoft Azure SQL Database Connections
If you're using SAP Datasphere on Microsoft Azure and want to connect to Microsoft Azure Data Lake Store Gen2 in a
firewall-protected Microsoft Azure storage account within the same Azure region: An Azure administrator must grant SAP
Datasphere access to the Microsoft Azure storage account.
Related Information
Microsoft Azure Data Lake Store Gen2 Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the MssqlLogReaderAdapter.
An administrator has downloaded and installed the required JDBC library in the <DPAgent_root>/lib folder before
registering the adapter with SAP Datasphere.
For more information, see Required Permissions for SQL Server Trigger-Based Replication in the SAP HANA Smart Data
Integration and SAP HANA Smart Data Quality Installation and Configuration Guide.
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
Note
Cloud Connector is not required if your Microsoft SQL Server database is available on the public internet.
Related Information
Microsoft SQL Server Connections
1. Set up an SAP BTP account and enable the SAP Integration Suite service with the SAP Open Connectors capability.
Note
You need to know your SAP BTP subaccount information (provider, region, environment, trial - yes/no) later to select the
appropriate SAP BTP subaccount region in SAP Datasphere when integrating the SAP Open Connectors account in your
space.
For information about setting up an SAP BTP trial version with the SAP Integration Suite service, see Set Up Integration
Suite Trial . To enable SAP Open Connectors, you need to activate the Extend Non-SAP Connectivity capability in the
Integration Suite.
For information about setting up SAP Integration Suite from a production SAP BTP account, see Initial Setup in the SAP
Integration Suite documentation.
For information about SAP Open Connectors availability in data centers, see SAP Note 2903776 .
2. In your SAP Open Connectors account, create connector instances for the sources that you want to connect to SAP
Datasphere.
For more information about creating an instance, see Authenticate a Connector Instance (UI) in the SAP Open Connectors
documentation.
For more information about connector-specific setup and connector-specific properties required to create an instance, see
Connectors Catalog in the SAP Open Connectors documentation. There, click the connector in question and then
<connector name> API Provider Setup or <connector name> Authenticate a Connector Instance.
3. In your SAP Open Connectors account, record the following information which you will require later in SAP Datasphere:
Organization secret and user secret - required when integrating the SAP Open Connectors account in your space.
Name of the connector instance - required when selecting the instance in the connection creation wizard
a. In the SAP BTP Sub Account Region field, select the appropriate entry according to your SAP BTP subaccount
information (provider, region, environment, trial - yes/no).
3. Click OK to integrate your SAP Open Connectors account with SAP Datasphere.
Results
With connection type Open Connectors you can now create connections to the third-party data sources available as connector
instances with your SAP Open Connectors account.
Related Information
Open Connectors Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the OracleLogReaderAdapter.
An administrator has downloaded and installed the required JDBC library in the <DPAgent_root>/lib folder before
registering the adapter with SAP Datasphere.
For more information about the supported JDBC libraries, see the SAP HANA smart data integration Product Availability
Matrix (PAM) for SAP HANA SDI 2.0 and all its patches. Search for the required library on the internet and download it
from an appropriate web page.
For more information, see Required Permissions for Oracle Trigger-Based Replication in the SAP HANA Smart Data
Integration and SAP HANA Smart Data Quality Installation and Configuration Guide.
If encrypted communication is used (connection is configured to use SSL), the server certificate must be uploaded to the
Data Provisioning Agent.
To retrieve the certificate, you can use for example the following command: openssl s_client -showcerts -
servername <host name of the Oracle database server>:<port number of the Oracle database
server> -connect <host name of the Oracle database server>:<port number of the Oracle
database server>
For more information about uploading the certificate to the Data Provisioning Agent, see:
Configure the Adapter Truststore and Keystore in the SAP HANA Smart Data Integration and SAP HANA Smart Data
Quality documentation
Configure SSL for the Oracle Log Reader Adapter in the SAP HANA Smart Data Integration and SAP HANA Smart
Data Quality documentation
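As an illustration of the certificate-retrieval step above, the following sketch extracts only the PEM certificate blocks from saved openssl s_client output, producing a file that could then be uploaded to the Data Provisioning Agent. The file names and the certificate content are fabricated for this example.

```shell
# Hypothetical sketch: given raw `openssl s_client -showcerts ...` output
# saved in server_output.txt, keep only the PEM certificate blocks.
cat > server_output.txt <<'EOF'
depth=0 CN = oradb.example.com
-----BEGIN CERTIFICATE-----
MIIBfakecertificatecontentforillustration
-----END CERTIFICATE-----
EOF
# sed prints everything between the BEGIN and END certificate markers.
sed -n '/BEGIN CERTIFICATE/,/END CERTIFICATE/p' server_output.txt > server.pem
cat server.pem
```

In practice the first file would come from running the openssl command shown above against your Oracle database server.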
Data Flows
Before you can use the connection for data flows, the following is required:
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
Note
Cloud Connector is not required if your Oracle database is available on the public internet.
A DW administrator has uploaded the required ODBC driver file to SAP Datasphere.
To use encrypted communication (connection is configured to use SSL), additional files must be uploaded.
For more information, see Upload Third-Party ODBC Drivers (Required for Data Flows).
To retrieve the certificate, you can use for example the following command: openssl s_client -showcerts -
servername <host name of the Oracle database server>:<port number of the Oracle database
server> -connect <host name of the Oracle database server>:<port number of the Oracle
database server>
Related Information
Oracle Connections
In Precog, you have added the source for which you want to create the connection.
In SAP Datasphere, you have added the necessary Precog IP addresses to the IP allowlist. For more information, see
Manage IP Allowlist.
Note
You can find and copy the relevant IP addresses in the final step of the connection creation wizard.
Related Information
Precog Connections
Remote Tables
If you want to use federated access to CDS view entities using the ABAP SQL service exposure from SAP S/4HANA (on-premise),
see Using ABAP SQL Services for Accessing Data from SAP S/4HANA (recommended for federation scenarios).
If you want to use federated access to CDS view entities using the ABAP SQL service exposure from SAP S/4HANA Cloud, see
Using ABAP SQL Services for Accessing Data from SAP S/4HANA Cloud (recommended for federation scenarios).
If you want to federate and replicate data from ABAP-based on-premise systems using SAP HANA smart data integration, the
following is required before you can use the connection (legacy):
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the ABAPAdapter.
For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data
Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required.
The ABAP user specified in the credentials of the SAP ABAP connection needs to have a specific set of authorizations in the
SAP ABAP system. For more information, see: Authorizations in the SAP HANA Smart Data Integration and SAP HANA
Smart Data Quality documentation.
To access and copy data from SAP BW objects such as InfoProviders or characteristics, the appropriate authorization
objects like S_RS_ADSO or S_RS_IOBJA are required for the ABAP user. For more information, see Overview: Authorization
Objects in the SAP NetWeaver documentation.
If you want to access data from SAP BW Queries, make sure that the ABAP user has the required analysis authorizations to
read the data and that characteristic 0TCAIPROV (InfoProvider) in the authorization includes @Q, which is the prefix for
Queries as InfoProviders. For more information, see Defining Analysis Authorizations in the SAP NetWeaver documentation.
If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to
configure suitable security privileges for successful registration on an SAP Gateway and you need to create an RFC
destination of type TCP/IP in the ABAP source system. With the RFC destination you register the Data Provisioning Agent
as server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming.
To use ABAP Dictionary tables from connections to an SAP BW∕4HANA system for remote tables and for creating views,
make sure that SAP Note 2872997 has been applied to the system.
Data Flows
Note
The availability of the data flow feature depends on the version and Support Package level of the ABAP-based SAP system
(SAP S/4HANA or the DMIS add-on in the source). Make sure your source systems meet the required minimum versions. We
recommend using the latest available version of SAP S/4HANA and the DMIS add-on where possible and implementing the
latest SAP Notes and TCI notes in your systems.
For more information about required versions, recommended system landscape, considerations for the supported source
objects, and more, see SAP Note 2890171.
Before you can use the connection for data flows, the following is required:
If the connected system is an on-premise source, an administrator has installed and configured Cloud Connector.
In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.
See also: SAP Note 2835207 (ABAP connection type for SAP Data Intelligence)
If you want to connect to SAP S/4HANA Cloud to replicate extraction-enabled, C1-released CDS views: Consider the
information about preparing an SAP S/4HANA Cloud connection for data flows.
Replication Flows
Note
The availability of the replication flow feature depends on the version and Support Package level of the ABAP-based SAP
system (SAP S/4HANA or the DMIS add-on in the source). Make sure your source systems meet the required minimum
versions. We recommend using the latest available version of SAP S/4HANA and the DMIS add-on where possible and
implementing the latest SAP Notes and TCI notes in your systems.
For more information about required versions, recommended system landscape, considerations for the supported source
objects, and more, see SAP Note 2890171.
Before you can use the connection for replication flows, the following is required:
If the connected system is an on-premise source, an administrator has installed and configured Cloud Connector.
In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.
See also: SAP Note 2835207 (ABAP connection type for SAP Data Intelligence)
If you want to connect to SAP S/4HANA Cloud to replicate extraction-enabled, C1-released CDS views or you want to
replicate CDS view entities using the SQL service exposure: Consider the information about preparing an SAP S/4HANA
Cloud connection for replication flows.
Related Information
SAP ABAP Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the ABAPAdapter.
For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data
Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required.
The ABAP user specified in the credentials of the SAP ABAP connection needs to have a specific set of authorizations in the
SAP ABAP system. For more information, see: Authorizations in the SAP HANA Smart Data Integration and SAP HANA
Smart Data Quality documentation.
To access and copy data from SAP BW objects such as InfoProviders or characteristics, the appropriate authorization
objects like S_RS_ADSO or S_RS_IOBJA are required for the ABAP user. For more information, see Overview: Authorization
Objects in the SAP NetWeaver documentation.
If you want to access data from SAP BW Queries, make sure that the ABAP user has the required analysis authorizations to
read the data and that characteristic 0TCAIPROV (InfoProvider) in the authorization includes @Q, which is the prefix for
Queries as InfoProviders. For more information, see Defining Analysis Authorizations in the SAP NetWeaver documentation.
If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to
configure suitable security privileges for successful registration on an SAP Gateway and you need to create an RFC
destination of type TCP/IP in the ABAP source system. With the RFC destination you register the Data Provisioning Agent
as server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming.
To use ABAP Dictionary tables from connections to an SAP BW∕4HANA system for remote tables and for creating views,
make sure that SAP Note 2872997 has been applied to the system.
Data Flows
Before you can use the connection for data flows, the following is required:
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.
Related Information
SAP BW Connections
For accessing SAP BW∕4HANA, HTTP is used to securely connect to the SAP BW∕4HANA system via Cloud Connector, and SAP
HANA SQL is used to connect to the SAP HANA database of SAP BW∕4HANA via the Data Provisioning Agent. Using Cloud
Connector to make HTTP requests to SAP BW∕4HANA requires a live data connection of type tunnel to SAP BW∕4HANA.
For information on supported SAP BW/4HANA source versions, see Supported Source Versions for SAP BW∕4HANA Model
Transfer Connections.
Before creating a connection for SAP BW/4HANA Model Transfer in SAP Datasphere, you need to prepare the following:
1. In SAP BW∕4HANA, make sure that the following services are active in transaction code SICF:
/sap/bw/ina/GetCatalog
/sap/bw/ina/GetResponse
/sap/bw/ina/GetServerInfo
/sap/bw/ina/ValueHelp
/sap/bw/ina/BatchProcessing
/sap/bw/ina/Logoff
/sap/bw4
2. In SAP BW∕4HANA, activate OData service ESH_SEARCH_SRV in Customizing (transaction SPRO) under SAP
NetWeaver Gateway OData Channel Administration General Settings Activate and Maintain Services .
3. Install and configure Cloud Connector. For more information, see Configure Cloud Connector.
4. In the side navigation area of SAP Datasphere, click System Administration Data Source Configuration Live Data
Sources and switch on Allow live data to leave my network.
5. In the side navigation area of SAP Datasphere, click System Administration Data Source Configuration On-premise
data sources and add the location ID of your Cloud Connector instance.
6. In the side navigation area of SAP Datasphere, open System Configuration Data Integration Live Data Connections
(Tunnel) and create a live data connection of type tunnel to SAP BW∕4HANA.
For more information, see Create Live Data Connection of Type Tunnel.
7. Install and configure a Data Provisioning Agent and register the SAP HANA adapter with SAP Datasphere:
Install the latest Data Provisioning Agent version on a local host or update your agent to the latest version. For more
information, see Install the Data Provisioning Agent.
In SAP Datasphere, add the external IPv4 address of the server on which your Data Provisioning Agent is running, or
in case you are using a network firewall add the public proxy IP address to the IP allowlist. For more information, see
Manage IP Allowlist.
Connect the Data Provisioning Agent to SAP Datasphere. For more information, see Connect and Configure the Data
Provisioning Agent.
In SAP Datasphere, register the SAP HANA adapter with SAP Datasphere. For more information, see Register
Adapters with SAP Datasphere.
Related Information
SAP BW∕4HANA Model Transfer Connections
Prerequisites
See the prerequisites 1 to 5 in Preparing SAP BW/4HANA Model Transfer Connectivity.
Procedure
1. In the side navigation area, click (System) (Configuration) Data Integration .
2. In the Live Data Connections (Tunnel) section, click Manage Live Data Connections.
4. Expand Connect to Live Data and select SAP BW.
5. Enter a name and description for your connection. Note that the connection name cannot be changed later.
When tunneling is enabled, data from the connected source is always transferred through the Cloud Connector.
Note
In the next step, you will need to specify the virtual host that is mapped to your on-premise system. This depends on the
settings in your selected Cloud Connector location.
8. Add your SAP BW∕4HANA host name, HTTPS port, and client.
Use the virtual host name and virtual port that were configured in the Cloud Connector.
This language will always be used for this connection and cannot be changed by users without administrator privileges.
Note
You must know which languages are installed on your SAP BW∕4HANA system before adding a language code. If the
language code you enter is invalid, SAP Datasphere will default to the language specified by your system metadata.
11. Enter the user name (case-sensitive) and password of the technical user for the connection.
Read authorizations for SAP BW∕4HANA metadata (Queries, CompositeProviders and their InfoProviders)
Using authorizations for SAP BW∕4HANA metadata, you can restrict a model transfer connection to a designated
semantic SAP BW/4HANA area.
For more information, see Overview: Authorization Objects in the SAP BW∕4HANA documentation.
12. Select Save this credential for all users on this system.
Note
While saving the connection, the system checks if it can access /sap/bc/ina/ services in SAP BW∕4HANA.
Results
The connection is saved and now available for selection in the SAP Datasphere connection creation wizard for the SAP BW∕4HANA
Model Transfer connection.
2989654 BW/4 - Enable DWC "Import from Connection" for BW/4 Query - Revision 1
2976147 Import of query views in the BW/4 hybrid scenario: No search results of BW back ends with SAP_BASIS
Release 753
SAP BW∕4HANA 2.0 SPS01 to SPS06 after you have applied the following SAP Notes:
2945277 BW/4 - Enable DWC "Import from Connection" for BW/4 Query
2989654 BW/4 - Enable DWC "Import from Connection" for BW/4 Query - Revision 1
2976147 Import of query views in the BW/4 hybrid scenario: No search results of BW back ends with SAP_BASIS
Release 753
Remote Tables
Before you can use the connection for remote tables, the following is required:
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the ABAPAdapter.
For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data
Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required.
The ABAP user specified in the credentials of the SAP ABAP connection needs to have a specific set of authorizations in the
SAP ABAP system. For more information, see: Authorizations in the SAP HANA Smart Data Integration and SAP HANA
Smart Data Quality documentation.
If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to
configure suitable security privileges for successful registration on an SAP Gateway and you need to create an RFC
destination of type TCP/IP in the ABAP source system. With the RFC destination you register the Data Provisioning Agent
as server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming.
Data Flows
Before you can use the connection for data flows, the following is required:
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.
Related Information
SAP ECC Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the CloudDataIntegrationAdapter.
Related Information
SAP Fieldglass Connections
A DW administrator has uploaded the TLS server certificate DigiCert Global Root CA
([Link]).
Remote Tables
Before you can use the connection for remote tables, the following is required:
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere
and registered the HanaAdapter.
If you use encrypted communication (see the Security properties in the connection creation wizard):
An administrator has already correctly configured Data Provisioning Agent for SSL support.
For more information, see Configure SSL for SAP HANA On-Premise [Manual Steps] in the SAP HANA Smart Data
Integration and SAP HANA Smart Data Quality documentation.
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
If you use encrypted communication and the server certificate should be validated (see the Security properties in
the connection creation wizard):
For SAP HANA (on-premise), before you can use the connection for data flows and replication flows, the following is required:
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
Related Information
SAP HANA Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the CloudDataIntegrationAdapter.
For more information, see Preparing Data Provisioning Agent Connectivity.
A communication arrangement has been created for communication scenario SAP_COM_0531 in the source system.
For more information, see Integrating CDI in the SAP Marketing Cloud documentation.
Data Flows
Before you can use the connection for data flows, the following is required:
A communication arrangement has been created for communication scenario SAP_COM_0531 in the source system.
For more information, see Integrating CDI in the SAP Marketing Cloud documentation.
Related Information
SAP Marketing Cloud Connections
For more information, see Registering Your OAuth2 Client Application in the SAP SuccessFactors platform
documentation.
A SAML assertion needs to be generated so that it can be provided when creating or editing the connection.
For an overview of the available options to generate a SAML assertion, see Generating a SAML Assertion in the SAP
SuccessFactors platform documentation.
In SAP SuccessFactors IP restriction management, you have added the externally facing SAP HANA IP addresses and the
outbound IP address for SAP Datasphere to the list of IP restrictions. IP restrictions are a specified list of IP addresses from
which users can access your SAP SuccessFactors system.
Related Information
SAP SuccessFactors Connections
Remote Tables
Before you can use the connection for remote tables, the following is required:
For federated access to CDS view entities using the ABAP SQL service exposure from SAP S/4HANA Cloud (recommended
for federation scenarios):
See Using ABAP SQL Services for Accessing Data from SAP S/4HANA Cloud.
For federated access to and replication of ABAP CDS Views exposed as OData services for data extraction using Cloud Data
Integration (legacy):
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere
and registered the CloudDataIntegrationAdapter.
A communication arrangement has been created for communication scenario SAP_COM_0531 in the source
system.
For more information, see Integrating CDI in the SAP S/4HANA Cloud documentation.
Data Flows
Before you can use the connection for data flows, the following is required:
A communication arrangement has been created for communication scenario SAP_COM_0532 in the SAP S/4HANA Cloud
system.
For more information, see Integrating CDS Views Using SAP Datasphere in the SAP S/4HANA Cloud documentation.
Replication Flows
Before you can use the connection for replication flows, the following is required:
For replicating CDS view entities using the ABAP SQL service exposure from SAP S/4HANA Cloud (recommended for
replication scenarios):
See Using ABAP SQL Services for Accessing Data from SAP S/4HANA Cloud.
For both replicating CDS view entities using the ABAP SQL service exposure and replicating CDS view entities using
the ABAP Pipeline Engine:
A communication arrangement has been created for communication scenario SAP_COM_0532 in the SAP S/4HANA Cloud
system.
For more information about replicating CDS view entities using the ABAP SQL service exposure, see:
Integrating SQL Services Using SAP Datasphere in the SAP S/4HANA Cloud documentation
Creating a Communication Arrangement to Enable Replication Flows in SAP Datasphere in the ABAP Cloud
documentation
For more information about replicating CDS views using the ABAP Pipeline Engine, see:
Integrating CDS Views Using SAP Datasphere in the SAP S/4HANA Cloud documentation
Note
The same communication user must be added to all communication arrangements you're using for the connection.
If you want to use RFC fast serialization for your replication flows, see SAP Note 3486245 .
Model Import
Before you can use the connection for model import, the following is required:
A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered
CloudDataIntegrationAdapter.
In the SAP S/4HANA Cloud system, communication arrangements have been created for the following communication
scenarios:
SAP_COM_0532
For more information, see Integrating CDS Views Using SAP Datasphere in the SAP S/4HANA Cloud documentation.
SAP_COM_0531
For more information, see Integrating CDI in the SAP S/4HANA Cloud documentation.
SAP_COM_0722
For more information, see Integrating SAP Data Warehouse Cloud in the SAP S/4HANA Cloud documentation.
Note
The same communication user must be added to all communication arrangements you're using for the connection.
Related Information
SAP S/4HANA Cloud Connections
For more information, see Accessing ABAP-Managed Data Using SQL Services for Data Integration Scenarios in the SAP S/4HANA
Cloud Public Edition documentation.
Note
This feature requires developer extensibility in SAP S/4HANA Cloud (including ABAP development tools), which is only
available in a 3-system landscape. For more information, see the SAP S/4HANA Cloud Public Edition documentation:
Developer Extensibility
System Landscapes in SAP S/4HANA Cloud
For both consumption scenarios using the SQL service, data federation and data replication, privileged data access needs to be
enabled for communication users in SAP S/4HANA Cloud. For more information about the consumption scenarios and privileged
access, see Data Integration Patterns in the ABAP Cloud documentation for SAP S/4HANA Cloud Public Edition.
There are some prerequisites and constraints that must be considered before using the SQL service.
For more information, see Prerequisites and Constraints in the ABAP Cloud documentation. Note that for SAP Datasphere
the ODBC driver installation is not required (the driver is pre-installed on the SAP HANA database).
To expose CDS view entities using the SQL service, an SAP S/4HANA Cloud business user has created a service definition
and a corresponding service binding of type SQL in the ABAP Development Tools. The service definition lists the set of
CDS view entities that are to be exposed, and a service binding of type SQL for that service definition enables their exposure
via the ABAP SQL service.
In the Enabled Operations area of the service binding, the business user must select access type SELECT to enable
federated access.
For more information, see Creating a Service Definition and an SQL-Typed Service Binding in the ABAP Cloud
documentation.
To expose the SQL service to get privileged access to the CDS view entities with a communication user, a communication
arrangement is required. This involves the following steps:
1. An SAP S/4HANA Cloud business user has created a custom communication scenario in the ABAP Development
Tools.
When filling out the authorizations for authorization object S_SQL_VIEW in the communication scenario, note the
following:
On the Sources tab of the Data Builder view editors in SAP Datasphere, the service binding name from the
SQL_SCHEMA authorization field is visible as (virtual) schema.
In the SQL_VIEWOP authorization field, select the option SELECT to grant federated access.
2. An administrator has created a communication system and user in the SAP Fiori launchpad of the ABAP
environment.
Note
The same communication user must be added to all communication arrangements you're using for the
connection.
3. An administrator has created a communication arrangement for exposing the SQL service in the SAP Fiori
launchpad of the ABAP environment.
For more information, see Exposing the SQL Service for Data Federation and Replication with Privileged Access in the ABAP
Cloud documentation.
You can now create a connection to consume the ABAP SQL service for data federation with remote tables using the ABAP SDA
adapter in SAP HANA.
In SAP S/4HANA Cloud, a business user and administrator must perform the following steps to prepare data replication with
replication flows:
There are some prerequisites and constraints that must be considered before using the SQL service.
For more information, see Prerequisites and Constraints in the ABAP Cloud documentation. Note that for SAP Datasphere
the ODBC driver installation is not required (the driver is pre-installed on the SAP HANA database).
To expose CDS view entities using the SQL service, an SAP S/4HANA Cloud business user has created a service definition
and a corresponding service binding of type SQL in the ABAP Development Tools. The service definition lists the set of
CDS view entities that are to be exposed, and a service binding of type SQL for that service definition enables their exposure
via the ABAP SQL service.
In the Enabled Operations area of the service binding, the business user must select access type REPLICATE to enable
data replication.
For more information, see Creating a Service Definition and an SQL-Typed Service Binding in the ABAP Cloud
documentation.
To expose the SQL service to get privileged access to the CDS view entities with a communication user, a communication
arrangement is required. This involves the following steps:
1. An SAP S/4HANA Cloud business user has created a custom communication scenario in the ABAP Development
Tools.
When filling out the authorizations for authorization object S_SQL_VIEW in the communication scenario, note the
following:
In the SQL_VIEWOP authorization field, select the option REPLICATE to allow replication on the specified
views.
2. An administrator has created a communication system and user in the SAP Fiori launchpad of the ABAP
environment.
Note
The same communication user must be added to all communication arrangements you're using for the
connection.
3. An administrator has created a communication arrangement for exposing the SQL service in the SAP Fiori
launchpad of the ABAP environment.
For more information, see Exposing the SQL Service for Data Federation and Replication with Privileged Access in the ABAP
Cloud documentation.
An administrator has created a communication arrangement for communication scenario SAP_COM_0532 in the SAP Fiori
launchpad of the ABAP environment.
You can now create a connection to consume the ABAP SQL service for data replication with replication flows using the ABAP
Pipeline Engine.
This topic contains the following sections:
Remote Tables
Data Flows
Replication Flows
Remote Tables
If you want to use federated access to CDS view entities using the ABAP SQL service exposure from SAP S/4HANA, see Using
ABAP SQL Services for Accessing Data from SAP S/4HANA (recommended for federation scenarios).
If you want to federate and replicate data using SAP HANA smart data integration, the following is required before you can use the
connection (legacy):
An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the ABAPAdapter.
For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data
Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required.
The ABAP user specified in the credentials of the SAP ABAP connection needs to have a specific set of authorizations in the
SAP ABAP system. For more information, see: Authorizations in the SAP HANA Smart Data Integration and SAP HANA
Smart Data Quality documentation.
If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to
configure suitable security privileges for successful registration on an SAP Gateway, and you need to create an RFC
destination of type TCP/IP in the ABAP source system. With the RFC destination, you register the Data Provisioning Agent
as a server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming.
Data Flows
Note
The availability of the data flow feature depends on the version and Support Package level of SAP S/4HANA or the DMIS
add-on in the source. Make sure your source systems meet the required minimum versions. We recommend using the latest
available version of SAP S/4HANA and the DMIS add-on where possible and implementing the latest SAP Notes and TCI Notes
in your systems.
For more information about required versions, recommended system landscape, considerations for the supported source
objects, and more, see SAP Note 2890171 .
Before you can use the connection for data flows, the following is required:
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.
See also: SAP Note 2835207 (ABAP connection type for SAP Data Intelligence)
Replication Flows
Note
The availability of the replication flow feature depends on the version and Support Package level of SAP S/4HANA or the
DMIS add-on in the source. Make sure your source systems meet the required minimum versions. We recommend using the
latest available version of SAP S/4HANA and the DMIS add-on where possible and implementing the latest SAP Notes and TCI
Notes in your systems.
For more information about required versions, recommended system landscape, considerations for the supported source
objects, and more, see SAP Note 2890171 .
Before you can use the connection for replication flows, the following is required:
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.
See also: SAP Note 2835207 (ABAP connection type for SAP Data Intelligence)
The endpoint is either RFC or RFCLB (for load balancing via message server). Fast serialization is not available for
endpoints WSRFC or SQL.
In the SAP S/4HANA on-premise system, the feature has not been disabled via parameter
APE_DISABLE_RUCKSACK in DHBAS_RUNTIME configuration table.
For more information about using fast serialization in SAP Datasphere and its prerequisites, see SAP Note 3486245 .
Before you can use the connection to import entities with data access Remote Tables, the following is required:
In SAP S/4HANA
An administrator has followed the instructions from SAP Note 3081998 to properly set up the SAP S/4HANA system,
which includes:
1. SAP Note 3283282 has been implemented to provide the required infrastructure in the SAP S/4HANA system.
2. The required corrections have been implemented and checks have been performed to make sure that SAP Note
3283282 and subsequent corrections have been applied properly and all required objects to provide the
infrastructure are available and activated.
3. Report ESH_CSN_CDS_TO_CSN has been run to prepare the CDS Views for the import.
Authorization object S_SERVICE - service authorizations for the Enterprise Search search service:
SRV_NAME: EF608938F3EB18256CE851763C2952
SRV_TYPE: HT
Authorization object SDDLVIEW - search access authorization for the search view CSN_EXPOSURE_CDS:
DDLSRCNAME: CSN_EXPOSURE_CDS
ACTVT: 03
An administrator has checked that the required InA services are active in transaction code SICF:
/sap/bw/ina/GetCatalog
/sap/bw/ina/GetResponse
/sap/bw/ina/GetServerInfo
/sap/bw/ina/ValueHelp
/sap/bw/ina/BatchProcessing
/sap/bw/ina/Logoff
An administrator has activated OData service ESH_SEARCH_SRV in Customizing (transaction SPRO) under SAP
NetWeaver Gateway OData Channel Administration General Settings Activate and Maintain Services .
Cloud Connector
An administrator has installed and configured Cloud Connector to connect to your on-premise source.
For the remote tables that will be created during the import, the respective prerequisites have to be met, including a Data
Provisioning Agent with the ABAPAdapter registered in SAP Datasphere.
In SAP Datasphere
In System Administration Data Source Configuration Live Data Sources , you have switched on Allow live data to
securely leave my network.
In System Administration Data Source Configuration On-premise data sources , you have added the location ID of your
Cloud Connector instance.
In System Configuration Data Integration Live Data Connections (Tunnel) , you have created a live data
connection of type tunnel to SAP S/4HANA.
For more information, see Create SAP S/4HANA Live Data Connection of Type Tunnel.
Before you can use the connection to import entities with data access Replication Flow to Local Tables, the following is required:
1. You have met all prerequisites mentioned in section Model Import (Data Access: Remote Tables).
Related Information
SAP S/4HANA On-Premise Connections
Prerequisites
See: Model Import (Data Access: Remote Tables)
Procedure
1. In the side navigation area, click (System) (Configuration) Data Integration .
2. In the Live Data Connections (Tunnel) section, click Manage Live Data Connections.
5. Enter a name and description for your connection. Note that the connection name cannot be changed later.
Note
In the next step, you will need to specify the virtual host that is mapped to your on-premise system. This depends on the
settings in your selected Cloud Connector location.
8. Add your SAP S/4HANA host name, HTTPS port, and client.
Use the virtual host name and virtual port that were configured in the Cloud Connector.
9. Optional: Choose a Default Language from the list.
This language will always be used for this connection and cannot be changed by users without administrator privileges.
Note
You must know which languages are installed on your SAP S/4HANA system before adding a language code. If the
language code you enter is invalid, SAP Datasphere will default to the language specified by your system metadata.
11. Enter user name (case sensitive) and password of the technical user for the connection.
12. Select Save this credential for all users on this system.
Results
The connection is saved and now available for selection in the SAP Datasphere connection creation wizard for the SAP S/4HANA
On-Premise connection.
Note
This feature requires developer extensibility in SAP S/4HANA (including ABAP development tools). For more
information, see Developer Extensibility in the ABAP Platform documentation for your SAP S/4HANA system.
For data federation using the SQL service, privileged data access needs to be enabled for communication users in SAP
S/4HANA. For more information, see Access Scenarios in the ABAP Platform documentation for your SAP S/4HANA
system.
Make sure the SAP S/4HANA system you want to connect is based on the ABAP platform 2021 FPS01 or higher where
the ABAP SQL service is available.
Data federation with remote tables using the ABAP SQL service is supported for SAP Logon connection type
Application Server and basic authentication with User Name and Password.
When a connection is configured for using the ABAP SQL service for data federation with remote tables, you can't use
the same connection for model import.
Perform the following steps to prepare data federation with remote tables:
Configure Cloud Connector to use the ABAP SQL service (see Configure Cloud Connector), paying particular attention to
the following configuration steps:
1. When adding the system mapping to the SAP S/4HANA system, select HTTPS protocol.
Note
When you want to use a connection for both data or replication flows and remote tables, you need to create two
system mapping entries in the Cloud Connector considering the following:
Data flow and replication flow:
Protocol: RFC
Host: Enter the same host for both system mapping entries.
Virtual host: Enter the same virtual host for both system mapping entries. The virtual host must also be the same in
the Cloud Connector system mapping and in the connection's Cloud Connector properties.
Virtual port: The virtual port is derived from the instance number (system number) entered in the system mapping:
sapgw<system number>.
Remote tables:
Protocol: HTTPS
Host: Enter the same host for both system mapping entries.
Virtual host: Enter the same virtual host for both system mapping entries.
Virtual port: Enter the same virtual port in the Cloud Connector system mapping and in the connection's Cloud
Connector properties.
In the SAP Datasphere Connections app, you must enter the virtual port for the HTTPS protocol and the virtual
host in separate fields. Deriving virtual host and port is not supported in the Connections app because of the
different virtual ports used in the two system mappings.
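The virtual-port derivation for the RFC system mapping described above can be sketched as follows. This is a minimal illustration, not an SAP tool; the instance number "00" is an example value:

```python
def rfc_virtual_port(system_number: str) -> str:
    """Derive the RFC virtual port (gateway service name) from the
    two-digit instance number (system number): sapgw<system number>."""
    if len(system_number) != 2 or not system_number.isdigit():
        raise ValueError("instance number must be two digits, for example '00'")
    return f"sapgw{system_number}"

print(rfc_virtual_port("00"))  # sapgw00
```

For instance number 00, the virtual port entered for the RFC mapping would therefore be sapgw00.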
a. Enter the service path of the SQL service endpoint on the SAP S/4HANA system. For example:
/sap/bc/sql/sql1/sap/s_privileged.
Note
In older Cloud Connector versions, the option might appear as WebSocket or WebSocket Upgrade.
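The service path from the example above follows a fixed shape. Assembling it can be sketched as follows; the binding segment sql1 and the access segment s_privileged are taken from the example and are system-specific placeholders, not universal values:

```python
def sql_service_path(binding: str = "sql1", access: str = "s_privileged") -> str:
    """Assemble an SQL service endpoint path in the shape of the
    example above. Replace both segments with your system's values."""
    return f"/sap/bc/sql/{binding}/sap/{access}"

print(sql_service_path())  # /sap/bc/sql/sql1/sap/s_privileged
```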
In SAP S/4HANA, a business user and administrator must perform the following steps to prepare data federation with
remote tables:
1. Consider the prerequisites and constraints that apply before using the SQL service.
For more information, see Prerequisites and Constraints in the ABAP Platform documentation for your SAP
S/4HANA system.
2. To expose CDS view entities using the SQL service, an SAP S/4HANA business user has created a service definition
and a corresponding service binding of type SQL in the ABAP Development Tools. The service definition lists the set
of CDS view entities that are to be exposed, and a service binding of type SQL for that service definition enables their
exposure via the ABAP SQL service.
For more information, see Creating a Service Definition and an SQL-Typed Service Binding in the ABAP Platform
documentation for your SAP S/4HANA system.
3. To expose the SQL service to get privileged access to the CDS view entities with a communication user, a role is
required.
For more information, see Creating a Role for Privileged Access in the ABAP Platform documentation for your SAP
S/4HANA system.
In Configuration Data Integration On-Premise Agents choose the Monitor button to display the agents with the following:
Information about free and used physical memory and swap memory on the Data Provisioning Agent server.
Information about when the agent was connected the last time.
Information about the overall number of connections that use the agent and the number of connections that actively use
real-time replication, with active real-time replication meaning that the connection type supports real-time replication and
for the connection at least one table is replicated via real-time replication.
You can change to the Connections view to see the agents with a list of all connections they use and their real-time
replication status. You can pause real-time replication for the connections of the agent while applying changes to the agent.
For more information, see Pause Real-Time Replication for an Agent.
With the integrated log access, you don’t need to leave SAP Datasphere to monitor the agent and analyze agent issues. Accessing
the log data happens via the Data Provisioning Agent File adapter which reads the log files and saves them into the database of
SAP Datasphere.
Log File Name and Location on Data Provisioning Agent Server Description
<DPAgent_root>/log/framework_alert.trc Data Provisioning Agent adapter framework log. Use this file to
monitor data provisioning agent statistics.
<DPAgent_root>/log/[Link] Data Provisioning Agent adapter framework trace log. Use this file
to trace and debug data provisioning agent issues.
You can review the logs in SAP Datasphere after log access has been enabled for the agent in question. We display the actual log
files as well as up to ten archived log files that follow the naming conventions [Link].<x> and
framework_alert.trc.<x>, respectively, with <x> being a number between 1 and 10.
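The archived log file names described above can be generated with a small sketch; the base name framework_alert.trc is the one from the table, and the numbering follows the stated convention:

```python
def archived_log_names(base: str, max_archives: int = 10) -> list[str]:
    """Archived agent log files follow the naming convention
    <base>.<x>, with <x> between 1 and 10."""
    return [f"{base}.{x}" for x in range(1, max_archives + 1)]

names = archived_log_names("framework_alert.trc")
print(names[0], names[-1])  # framework_alert.trc.1 framework_alert.trc.10
```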
Related Information
Enable Access to Data Provisioning Agent Logs
Review Data Provisioning Agent Logs
Prerequisites
A Data Provisioning Agent administrator has provided the necessary File adapter configuration with an access token that you need
for enabling the log access in SAP Datasphere.
To define the access token in the agent's secure storage, the administrator has performed the following steps in the agent
configuration tool in command-line interactive mode:
3. Choose Set FileAdapter Access Token and define a new token: Under Enter File Adapter Access Token, enter the token,
make a note of it, confirm it, and press Enter to quit the configuration tool.
For more information about the File adapter configuration, see File in the Installation and Configuration Guide of the SAP HANA
Smart Data Integration and SAP HANA Smart Data Quality documentation.
Procedure
1. From the main menu, open Configuration Data Integration .
4. In the FileAdapter Password field that appears, enter the File adapter access token.
Results
The Review Logs entry in the menu of the agent’s tile is enabled and the framework_alert.trc and [Link] logs are
written to the database of SAP Datasphere. You can now review the current and archived log files from the agent's tile.
Prerequisites
The logs are written to the database of SAP Datasphere. For more information, see Enable Access to Data Provisioning Agent Logs.
Procedure
1. From the main menu, open Configuration Data Integration .
The Review Agent Logs dialog initially shows 50 log entries. To load further chunks of 50 entries each, scroll down to the
bottom of the dialog and use the More button.
3. To show the complete message for a log entry, click More in the Message column.
4. You have the following options to restrict the results in the display of the logs:
Search: In the <agent name> field, enter a search string and click (Search) to search in the messages of the logs.
Filters: You can filter based on time, message type and log file name. When you’ve made your selection, click Apply
Filters.
Note
If your local time zone differs from the time zone used in the Data Provisioning Agent logs and you're applying a
time-based filter, you might get other filter results than expected.
5. Optional: Export the logs as a CSV file to your local system. Note that filters and search restrictions are applied to the
exported file.
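The time-zone note in step 4 above can be illustrated with a short sketch: the same log instant falls into different wall-clock hours depending on the zone used for filtering. The timestamps and offsets below are made up for illustration:

```python
from datetime import datetime, timezone, timedelta

# A log entry stamped 06:45 UTC on the agent server...
log_ts = datetime(2025, 7, 9, 6, 45, tzinfo=timezone.utc)

# ...reads 08:45 in a local zone at UTC+2, so a local "after 08:00"
# filter matches it even though the raw log timestamp says 06:45.
local_ts = log_ts.astimezone(timezone(timedelta(hours=2)))
print(log_ts.hour, local_ts.hour)  # 6 8
```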
Prerequisites
To run recurring scheduled tasks on your behalf, you need to authorize the job scheduling component of SAP Datasphere. In your
profile settings under Schedule Consent Settings, you can give and revoke your consent for SAP Datasphere to run your scheduled
tasks in the future. Note that if you don't give your consent, or if you revoke it, tasks that you own won't be executed and will
fail.
Context
A recurring task will check for any status changes according to the configured frequency and send the notifications to the user
who is the owner of the configuration. The initial owner is the user who created the configuration. Any user with the appropriate
administration privileges can take over the ownership for this task if required, for example in case of vacation replacement or when
the previous owner left the department or company.
Procedure
1. In the side navigation area, click (System) (Configuration) Data Integration .
2. Go to the On-Premise Agents section and click (menu) Configure Sending Notifications.
3. If you haven't authorized SAP Datasphere yet to run your scheduled tasks for you, you will see a message at the top of the
Configure Sending Notifications dialog asking for your consent. Give your consent.
An additional field Owner appears that shows that you have been automatically assigned as the owner of the task.
5. Select the frequency in which the status of the Data Provisioning Agent should be checked.
This will start the first status check. After the first check, the status check will be performed according to the defined
frequency.
Results
If the status check finds any status change for the agent, a notification will be sent that you can find by clicking (Notifications) on
the shell bar.
When you click on the notification, you’ll get to the On-Premise Agents section in (System) (Configuration) Data
Integration where you can start searching for the root cause in case the agent is disconnected.
Next Steps
If you need to take over the ownership and receive the notifications for an agent's status changes, go to the Configure Sending
Notifications dialog as described above, click Assign to Me, and save the configuration. From now on, you will receive the
notifications about any status changes for the agent. If you haven’t done so yet, you need to provide your consent before you can
take over the ownership.
Context
If you need to perform maintenance activities in a source system, you can pause real-time replication for the corresponding
connection. For more information, see Pause Real-Time Replication for a Connection.
Procedure
1. In SAP Datasphere, from the main menu, open Configuration Data Integration On-Premise Agents .
2. To show the Data Provisioning Agent tiles with a list of all connections they use, click the Connections button.
Active: The connection type supports real-time replication, and at least one table for the connection is replicated via
real-time replication (even if the status in the Remote Table Monitor is Error).
Inactive: The connection type supports real-time replication, and currently no table for the connection is replicating via
real-time replication.
Paused: The connection type supports real-time replication, and real-time replication is paused for at least one table of
the connection.
3. To pause the agent's connections with replication status Active or Inactive, on the tile of the agent choose (menu) and
then Pause All Connections.
In the list of connections shown on the tile, the status for affected connections changes to Paused. You can also see the
status change for the connections in the Connections application.
In the Remote Table Monitor the status for affected tables changes to Paused and actions related to real-time replication
are not available for these tables. Also, you cannot start real-time replication for any table of a paused connection.
4. You can now apply the changes to your Data Provisioning Agent.
5. Once you're finished with the changes, restart real-time replication for the agent. Choose (menu) and then Restart All
Connections.
The status in the list of connections shown on the tile, in the Connections application as well as in the Remote Table
Monitor changes accordingly and you can again perform real-time related actions for the tables or start real-time
replication.
The following sections provide information about checks, logs, and actions that you can take to troubleshoot problems with the
Data Provisioning Agent:
Initial Checks
Configuration Checks
Performance
Validating the Connection from the Server the Agent is Running to SAP Datasphere
SAP Notes
Support Information
Note
In the following sections, file paths and screenshots are based on a Linux-based installation of the agent. If you have installed
the agent on a Microsoft Windows server, replace the slashes "/" with backslashes "\".
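As a quick illustration of this path conversion (the helper below is illustrative only and not part of the agent tooling, and the example path is hypothetical):

```shell
# Illustrative only: convert a Linux-style agent path to its Windows
# equivalent by replacing forward slashes with backslashes.
to_windows_path() {
  printf '%s\n' "$1" | tr '/' '\\'
}

# Hypothetical example path under the agent installation:
to_windows_path "log/example.trc"   # prints: log\example.trc
```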
Initial Checks
A Data Provisioning Agent administrator can perform the following checks:
Firewall
Agent version
Make sure to always use the latest released version of the Data Provisioning Agent. For information on supported and
available versions for the Data Provisioning Agent, see the SAP HANA Smart Data Integration Product Availability Matrix
(PAM) .
Make sure that all agents that you want to connect to SAP Datasphere have the same latest version.
Java Installation
Check whether a Java installation is available by running the command java -version. If you receive a response like
java: command not found, use the Java installation that is part of the agent installation. The Java executable can be
found in the <DPAgent_root>/sapjvm/bin folder.
Configuration Checks
The agent configuration is stored in the <DPAgent_root>/[Link] file in the agent installation root location. For a successful
connection, make sure that outbound connections from the Data Provisioning Agent to the target host and port (provided in the
Data Provisioning Agent registration information) are allowed. A Data Provisioning Agent administrator can double-check the
following values (do not maintain the parameters directly in the configuration file; the values are set with the command-line
agent configuration tool):
[Link]=<Agent Name>: the agent name defined by the user who registered the agent in SAP Datasphere; the name is case-sensitive.
[Link]=false: n/a
[Link]=true: n/a
If you use a proxy server in your landscape, additionally check for the following parameters:
Parameters in the [Link] file:
proxyType=http
[Link]=true
For more information, see Agent Configuration Parameters in the SAP HANA Smart Data Integration and SAP HANA Smart Data
Quality documentation.
Agent Logs
Change the logging level to INFO (default), ALL, DEBUG, or TRACE according to your needs. For more information, see SAP
Note 2496051 - How to change "Logging Level" (Trace level) of a Data Provisioning Agent - SAP HANA Smart Data
Integration.
The parameters for the logging level in the <DPAgent_root>/[Link] file are:
[Link]
[Link]
Note
Changing the level to DEBUG or ALL generates a large amount of data. We therefore recommend changing the logging
level to these values only for a short period of time while you are actively debugging, and switching back to a lower
information level after you have finished debugging.
See also SAP Note 2461391 - Where to find Data Provisioning Agent Log Files
JDBC Trace
For information about activating JDBC tracing, see Trace a JDBC Connection in the SAP HANA Service for SAP BTP in AWS
and Google Cloud Regions documentation.
To set the trace level, execute the JDBC driver *.jar file from the <DPAgent_root>/plugins directory.
Performance
If you experience performance issues when replicating data via the Data Provisioning Agent, a Data Provisioning Agent
administrator can consider increasing the agent memory as described in SAP Note 2737656 - How to increase DP Agent
memory.
For general memory sizing recommendations for SAP HANA Smart Data Integration, see
Data Provisioning Agent - Best Practices and Sizing Guide in the SAP HANA Smart Data Integration and SAP HANA Smart
Data Quality documentation.
SAP Note 2688382 - SAP HANA Smart Data Integration Memory Sizing Guideline
Validating the Connection from the Server the Agent is Running to SAP Datasphere
In SAP Datasphere
In (System) (Configuration) Data Integration On-Premise Agents a green bar and status information on the
agent tile indicates if the agent is connected.
In On-Premise Agents, click Refresh Agents if the tile of a newly connected agent doesn’t display the updated connection status.
Note
When you connect a new agent, it might take several minutes until it is connected.
Via Data Provisioning Agent Configuration Tool (for agent versions lower than 2.7.4)
4. If the output doesn't show that the agent is connected, it may show an error message. Resolve the error, and then select
option Start or Stop Agent, and then option Start Agent to start the agent.
Note
For agent version 2.7.4 and higher, if in the agent status the message No connection established yet is shown, this can be
ignored. You can check the connection status in SAP Datasphere instead. For more information about the agent/SAP HANA
connection status in agent version 2.7.4 and higher, see SAP Note 3487646 .
The Data Provisioning Agent framework trace file [Link] in the <DPAgent_root>/log/ folder should contain entries
indicating that the agent has been successfully connected.
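A quick way to scan the agent's log directory for such entries can be sketched as follows (the helper and search term are illustrative, not part of the agent tooling):

```shell
# Illustrative helper: list log files under a directory that contain
# connection-related entries (case-insensitive, recursive search).
scan_logs() {
  grep -ril "connected" "$1" 2>/dev/null
}

# Usage (path is a placeholder for your <DPAgent_root>/log directory):
# scan_logs /usr/sap/dataprovagent/log
```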
To validate the connection, you can directly use the JDBC driver jar file from the command line interface. You must ensure that
you’re using the same JDBC driver as used by the Data Provisioning Agent. The JDBC driver jar file ([Link].jdbc_*.jar)
is located in the <DPAgent_root>/plugins directory.
Navigate to the <DPAgent_root>/plugins/ directory and run one of the following commands, replacing the variables as
needed and depending on your landscape:
Without proxy:
java -jar <[Link].jdbc_*.jar> -u <HANA User Name for Messaging Agent>,"<HANA User Password for Messaging Agent>"
With proxy:
If the connection works properly, the command output confirms a successful connection.
If you see an error instead, it is most likely related to a missing entry in the IP allowlist in SAP Datasphere.
Verify that the external (public) IPv4 address of the server where the agent is installed is in the IP allowlist. When using a proxy,
the proxy's address needs to be included in the IP allowlist as well.
Manage IP Allowlist
Authentication failed
Authentication fails because of invalid HANA User for Agent Messaging credentials in the agent secure storage. To update the
credentials, use the agent configuration tool and then restart the agent.
For more information, see Manage the HANA User for Agent Messaging Credentials in the SAP HANA Smart Data Integration and
SAP HANA Smart Data Quality documentation.
Firewall/Proxy Issues
This issue typically indicates that the JDBC driver cannot resolve the SAP HANA server URL to connect to the SAP
Datasphere tenant and/or establish a correct outbound call. Check your firewall/proxy settings and make sure that outbound
connections are enabled accordingly.
If encryption is missing, the log contains the following statement: "only secure connections are allowed".
When testing the connectivity directly with the JDBC driver, add the parameter -o encrypt=true.
The logs are located in the <DPAgent_root>/log directory. For more information on the available log files, see SAP Note
2461391 .
If the agent is connected, you can review the framework log (framework_alert.trc) and the framework trace log
([Link]) directly in SAP Datasphere. For more information, see Monitoring Data Provisioning Agent Logs.
SAP Notes
SAP Note 2511196 - What ports are used by Smart Data Integration
SAP Note 2091095 - SAP HANA Smart Data Integration and SAP HANA Smart Data Quality
SAP Note 2400022 - FAQ: SAP HANA Smart Data Integration (SDI)
SAP Note 2688382 - SAP HANA Smart Data Integration Memory Sizing Guideline
Support Information
Support Component: HAN-DP-SDI
The following error occurs if you try to connect to a remote source using the Cloud Connector, but the connectivity proxy hasn’t
been enabled:
SAP Datasphere takes care of enabling the connectivity proxy. This might take a while.
2. The connectivity proxy is enabled but not fully ready to serve requests
The following error occurs if the connectivity proxy has been enabled but is not yet ready to be used:
SAP Datasphere takes care of enabling the connectivity proxy. This might take a while.
The following error occurs if you’ve used a virtual host name with an underscore, for example, hana_01:
The following error occurs when the Cloud Connector's IP is not included in the allowlist:
You can find the related logs in the ljs_trace.log file in the Cloud Connector. For example:
For information about renewing a subaccount certificate, see Update the Certificate for a Subaccount in the SAP BTP Connectivity
documentation.
Related Information
Troubleshooting Connection Issues with the Cloud Connector (SAP HANA Cloud, SAP HANA Database documentation)
Prerequisites
To create a database user group, you must have a global role that grants you the following privileges:
System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.
The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.
Context
When creating a database user group, an administrator is also created. This administrator can create other users, schemas, and
roles using SAP Datasphere stored procedures. The administrator and their users can create data entities (DDL) and ingest data
(DML) directly into their schemas and prepare them for consumption by spaces.
For detailed information about user groups, see User Groups in the SAP HANA Cloud documentation.
Note
Users with the DW Space Administrator role can create database users, which are associated with their space (see Integrating
Data via Database Users/Open SQL Schemas).
Procedure
1. In the side navigation area, click (System) (Configuration) Database Access Database User Groups .
3. Enter a suffix for your database user group and click Create.
The group is created and the connection details and administrator credentials are displayed.
If you want to work with the SAP HANA database explorer, you will need to enter your password to grant the explorer access
to the database user group schema. When connecting to SAP HANA Cloud with other tools, users will need the following
properties:
Database Group Administrator (name and password)
Host Name
Port
Prerequisites
Create a User
Create a Schema
Revoke a Role
Drop a Role
Prerequisites
To create users, schemas, and roles in a database user group, you must have a database user group administrator and a password
(Creating a Database User Group).
The SAP HANA database explorer opens with your database user group at the top level. You can now use the SQL editor to create
users, roles and schemas.
Create a User
You can create a user in your user group with the following statement:
Note
To avoid possible conflicts, we recommend that you use the prefix DWCDBGROUP#<DBgroup_name># when naming users,
schemas, and roles in your group (see Rules for Technical Names).
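The recommended naming pattern can be sketched with a small helper (illustrative only; the group and object names match the DWMIGRATE examples used in this section):

```shell
# Build a name following the recommended DWCDBGROUP#<DBgroup_name># prefix.
prefix_name() {
  printf 'DWCDBGROUP#%s#%s\n' "$1" "$2"
}

prefix_name DWMIGRATE BOB       # prints: DWCDBGROUP#DWMIGRATE#BOB
prefix_name DWMIGRATE STAGING   # prints: DWCDBGROUP#DWMIGRATE#STAGING
```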
Create a Schema
You can create a schema in your database user group by using the following SAP Datasphere stored procedure:
CALL "DWC_GLOBAL"."CREATE_USERGROUP_SCHEMA"
(
SCHEMA_NAME => '<schema_name>',
OWNER_NAME => '<user_name>'
);
Note
To avoid possible conflicts, we recommend that you use the prefix DWCDBGROUP#<DBgroup_name># when naming users,
schemas, and roles in your group (see Rules for Technical Names).
The owner of the new schema must be a user of the database user group. If the owner name is set to null, then the database user
group administrator is set as the owner.
In our example, we create a new schema, DWCDBGROUP#DWMIGRATE#STAGING, and set BOB as the owner:
CALL "DWC_GLOBAL"."CREATE_USERGROUP_SCHEMA"
(
SCHEMA_NAME => 'DWCDBGROUP#DWMIGRATE#STAGING',
OWNER_NAME => 'DWCDBGROUP#DWMIGRATE#BOB'
);
Create a Role
You can create a role in your database user group by using the following SAP Datasphere stored procedure:
CALL "DWC_GLOBAL"."CREATE_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => '<schema_name>',
ROLE_NAME => '<role_name>'
);
Note
To avoid possible conflicts, we recommend that you use the prefix DWCDBGROUP#<DBgroup_name># when naming users,
schemas, and roles in your group (see Rules for Technical Names).
Once the role is created, you can grant it to a user or to another role, revoke it, and drop it.
In our example, we create a new role, DWCDBGROUP#DWMIGRATE#DWINTEGRATOR, in the schema DWCDBGROUP#DWMIGRATE#STAGING:
CALL "DWC_GLOBAL"."CREATE_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => 'DWCDBGROUP#DWMIGRATE#STAGING',
ROLE_NAME => 'DWCDBGROUP#DWMIGRATE#DWINTEGRATOR'
);
You can grant a role to a user or to another role in your database user group by using the following SAP Datasphere stored
procedure:
CALL "DWC_GLOBAL"."GRANT_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => '<schema_name>',
ROLE_NAME => '<role_name>',
GRANTEE => '<user_name>',
GRANTEE_ROLE_NAME => NULL,
WITH_ADMIN_OPTION => FALSE
);
The role schema, grantee, and grantee role must all be in the same database user group.
CALL "DWC_GLOBAL"."GRANT_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => 'DWCDBGROUP#DWMIGRATE#STAGING',
ROLE_NAME => 'DWCDBGROUP#DWMIGRATE#DWINTEGRATOR',
GRANTEE => 'DWCDBGROUP#DWMIGRATE#BOB',
GRANTEE_ROLE_NAME => NULL,
WITH_ADMIN_OPTION => FALSE
);
Revoke a Role
You can revoke a role from a user by using the following SAP Datasphere stored procedure:
CALL "DWC_GLOBAL"."REVOKE_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => '<schema_name>',
ROLE_NAME => '<role_name>',
GRANTEE => '<user_name>',
GRANTEE_ROLE_NAME => NULL
);
CALL "DWC_GLOBAL"."REVOKE_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => 'DWCDBGROUP#DWMIGRATE#STAGING',
ROLE_NAME => 'DWCDBGROUP#DWMIGRATE#DWINTEGRATOR',
GRANTEE => 'DWCDBGROUP#DWMIGRATE#BOB',
GRANTEE_ROLE_NAME => NULL
);
Drop a Role
You can drop a role by using the following SAP Datasphere stored procedure:
CALL "DWC_GLOBAL"."DROP_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => '<schema_name>',
ROLE_NAME => '<role_name>'
);
CALL "DWC_GLOBAL"."DROP_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => 'DWCDBGROUP#DWMIGRATE#STAGING',
ROLE_NAME => 'DWCDBGROUP#DWMIGRATE#DWINTEGRATOR'
);
Prerequisites
To allow a space to read from the database user group schema, you must have a database user group administrator and a
password (Creating a Database User Group).
Context
You can grant read privileges by running an SAP Datasphere specific stored procedure in the SQL console in the SAP HANA
Database Explorer.
Procedure
1. From the side navigation area, go to (System) → (Configuration) → Database Access → Database User Groups.
2. Select the database user group and click Open Database Explorer.
3. In the SQL console in SAP HANA Database Explorer, call the stored procedure to grant the 'SELECT' privilege to a space
using the following syntax:
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => <operation>,
PRIVILEGE => <privilege>,
SCHEMA_NAME => <schema name>,
OBJECT_NAME => <object name>,
SPACE_ID => <space ID>);
operation ('GRANT' or 'REVOKE') [required]: Enter whether you want to grant the privilege to, or revoke it from, the space.
privilege ('SELECT') [required]: Enter the read privilege that you want to grant to (or revoke from) the space.
schema_name ('[name of database user group schema]') [required]: Enter the name of the schema you want the space to be able to read from.
space_id ('[ID of the space]') [required]: Enter the ID of the space you are granting the read privileges to.
For example, to grant the SELECT privilege on the whole schema (OBJECT_NAME left empty):
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => 'GRANT',
PRIVILEGE => 'SELECT',
SCHEMA_NAME => 'SALE#ETL',
OBJECT_NAME => '',
SPACE_ID => 'SALES');
To grant the SELECT privilege on a single table only:
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => 'GRANT',
PRIVILEGE => 'SELECT',
SCHEMA_NAME => 'SALE#ETL',
OBJECT_NAME => 'MY_TABLE',
SPACE_ID => 'SALES');
Results
If the run is successful, you receive a confirmation message in the Result pane. You can then open the Data Builder, create a data
flow, and select the tables as sources.
Prerequisites
To allow a space to write to the database user group schema, you must have a database user group administrator and a password
(Creating a Database User Group).
Context
You can grant write privileges by running an SAP Datasphere specific stored procedure in the SQL console in the SAP HANA
Database Explorer.
Procedure
1. From the side navigation area, go to (System) → (Configuration) → Database Access → Database User Groups.
2. Select the database user group and click Open Database Explorer.
3. In the SQL console in SAP HANA Database Explorer, call the stored procedure to grant the 'INSERT', 'UPDATE', or 'DELETE'
privilege to a space using the following syntax:
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => <operation>,
PRIVILEGE => <privilege>,
SCHEMA_NAME => <schema name>,
OBJECT_NAME => <object name>,
SPACE_ID => <space ID>);
privilege ('INSERT', 'UPDATE', or 'DELETE') [required]: Enter the write privilege that you want to grant to (or revoke from) the space. Note: You can grant one privilege at a time.
schema_name ('[name of database user group schema]') [required]: Enter the name of the schema you want the space to be able to write to.
space_id ('[ID of the space]') [required]: Enter the ID of the space you are granting the write privileges to.
For example, to grant the UPDATE privilege on the whole schema (OBJECT_NAME left empty):
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => 'GRANT',
PRIVILEGE => 'UPDATE',
SCHEMA_NAME => 'SALE#ETL',
OBJECT_NAME => '',
SPACE_ID => 'SALES');
To grant the UPDATE privilege on a single table only:
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => 'GRANT',
PRIVILEGE => 'UPDATE',
SCHEMA_NAME => 'SALE#ETL',
OBJECT_NAME => 'MY_TABLE',
SPACE_ID => 'SALES');
Results
If the run is successful, you receive a confirmation message in the Result pane. You can then open the Data Builder, create a data
flow, and select the tables as targets.
Monitor Tasks
Monitor Statements
Monitoring File Space Storage Consumption and Apache Spark Application Usage
Click (System Monitor) to access the main monitoring tool. The System Monitor allows you to monitor the performance of your
system and identify storage, task, out-of-memory, and other issues across all spaces.
For example, you can see all the errors (such as failed tasks and out-of-memory errors) that occurred yesterday or the top five
statements with the highest peak memory consumption.
Note
For optimal performance, we recommend staggering the scheduled run times of tasks such as data flows and task chains that
may contain these tasks. There is no specific numerical limit on how many tasks can be scheduled, but too many tasks running
at once can cause resource distribution issues. Use the System Monitor to check your workload distribution. For more
information, see Monitoring SAP Datasphere or Persisted Views and Memory Consumption.
Note
SAP Datasphere is integrated into SAP Cloud ALM for health monitoring, which enables you to check the health of one or more
SAP Datasphere tenants from the Health Monitoring app in SAP Cloud ALM. See Health Monitoring in the SAP Cloud ALM -
Application Help.
You can monitor available disk and memory storage on your tenant with the following cards:
Card Description
Disk Storage Used Shows the total amount of disk storage used in all spaces, broken down between:
Audit Log Data: Data related to audit logs (see Audit Logging).
Note
Audit logs can grow quickly and consume a great deal of disk storage (see Delete
Audit Logs).
Other Data: Includes data stored in database user group schemas (see Creating a
Database User Group) and SAP HANA data (such as statistics schemas).
Administrative Data: Data used to administer the tenant and all spaces (such as space
quota, space version). Includes all information stored in the central schemas
(DWC_GLOBAL, DWC_GLOBAL_LOG, DWC_TENANT_OWNER).
Disk Used by Spaces for Storage: Shows the amount of disk storage used by spaces out of the total amount of disk storage. You can see a
breakdown of this amount in the card Disk Storage Used.
Memory Used by Spaces for Storage: Shows the amount of memory storage used by spaces out of the total amount of memory storage.
b. Display the list of spaces in the table layout and order by column. For example, you can display at the top of the table
the spaces that use the highest amount of storage by choosing the descending order for the column Used Storage.
c. Open a space and click Monitor in the space details page to see the storage amount assigned to and used by the
space (see Monitor Your Space Storage Consumption).
Monitor Tasks
For example, you can find out if tasks have to be scheduled at another time so that high-memory consuming tasks do not run at
the same time. If single tasks consume too much memory, some additional views may need to be persisted or the view partitioning
may need to be used to lower the memory consumption.
To investigate issues:
You can identify issues with tasks with the following cards:
Card Description
Shows the number of failed tasks by day for the last 7 days.
Shows the 5 tasks whose run duration time was the longest in the last 48 hours.
Shows the 5 tasks whose processing memory consumption was the highest in the last
48 hours.
2. Click View Logs in a card to go to the Task Logs tab, which displays information filtered on the card criteria. For more
information on the Task Logs tab, see Task Logs Tab.
Activity column - For the spaces you have access to (via scoped roles), a link opens the run in the Data Integration
Monitor (see Managing and Monitoring Data Integration).
Object Name column - For the spaces you have access to (via scoped roles), a link opens the editor of the object.
Monitor Statements
Note
Expensive statement tracing is enabled by default. If disabled, statement information and errors are not traced and you cannot
see them in the System Monitor. For more information on enabling and configuring expensive statement tracing, see Configure
Monitoring.
You can monitor statements with the following cards:
Card Description
Shows the 5 statements whose processing memory consumption was the highest in the
last 48 hours.
Shows the number of out-of-memory errors that have occurred in tasks and statements,
by day for the last 7 days.
Top 5 MDS Requests by Processing Memory Consumption: Shows the 5 SAP HANA multi-dimensional services (MDS) requests (used, for example, in SAP Analytics Cloud consumption) whose processing memory consumption is the highest.
Out-of-Memory Errors (MDS Requests): Shows the out-of-memory errors that are related to SAP HANA multi-dimensional services (MDS) requests, which are used, for example, for SAP Analytics Cloud consumption.
Top 5 Out-of-Memory Errors (Workload Class) by Space: Shows the schemas in which out-of-memory errors have occurred in the last 7 days because the statement limits have been exceeded. To set the statement limits for spaces, see Set Priorities and Statement Limits for Spaces or Groups.
2. Click View Logs in a card to go to the Statement Logs, which displays information filtered on the card criteria. For more
information on the Statements tab, see Statement Logs Tab.
You can monitor statements that are rejected or queued with the following cards.
Card Description
Top 5 Admission Control Rejection Events by Space: Shows the 5 spaces with the highest number of rejected statements in the last 7 days.
Note
A space that has been deleted is prefixed with an asterisk character.
Shows the number of statements that have been rejected in the last 7 days because they've exceeded the threshold percentage of CPU usage.
Top 5 Admission Control Queuing Events by Space: Shows the 5 spaces with the highest number of queued statements in the last 7 days.
Shows the number of statements that have been queued in the last 7 days because they've exceeded the threshold percentage of CPU usage.
If you've created a database analysis user, you're connected to the SAP HANA Cockpit without entering your credentials
(see Create a Database Analysis User to Debug Database Issues).
For more information about admission control thresholds, see Set Priorities and Statement Limits for Spaces or Groups.
1. In the side navigation area, click (System Monitor), then click the Elastic Compute Nodes tab.
2. From the dropdown list, select the elastic compute node that you want to monitor.
Note
If one elastic compute node exists, related monitoring information is automatically displayed in the tab. If several elastic
compute nodes exist, you must select a node from the dropdown list to display monitoring information in the tab.
You can view elastic compute node key figures or identify issues with the following cards:
Card Description
Configuration Shows the following information about the current elastic compute node:
Technical name.
The performance class and the resources allocated to the node: number of compute
blocks, memory, disk storage and number of vCPUs.
Run Details Shows the following information about the latest or the previous run of the current elastic
compute node:
The date and time at which the elastic compute node has started and stopped.
The total run duration (uptime) from the starting to the stopping phase.
The number of block-hours, that is, the number of hours consumed by the run. The number of
block-hours is the run duration in hours multiplied by the number of compute blocks. If a node
that includes 4 compute blocks runs for 5 hours, 20 block-hours have been consumed; if a node
that includes 8 compute blocks runs for 5 hours, 40 block-hours have been consumed. (For a
node with a single compute block, the uptime equals the block-hours.)
Monthly Uptime Shows the following information about the elastic compute node runs for the current month or
the last month:
The total duration (uptime) of all runs in the current or last month.
The total number of block-hours consumed by all the runs in the current or last month.
Average CPU Shows the average percentage of the number of vCPUs consumed, during the latest or previous
run of the elastic compute node.
The trend icon (up or down arrow) indicates if the percentage is higher or lower than the
previous run.
To see the real-time average CPU utilization in percentage for the elastic compute node, click
Performance Monitor, which opens the Performance Monitor page in the SAP HANA Cockpit
(see The Performance Monitor in the SAP HANA Cloud Database Administration with SAP HANA
Cockpit).
Average Memory Shows the average amount of memory consumed (in GiB), during the latest or previous run of
the elastic compute node.
The trend icon (up or down arrow) indicates if the amount is higher or lower than in the
previous run.
To see the real-time average memory utilization in GB for the elastic compute node, click
Performance Monitor, which opens the Performance Monitor page in the SAP HANA Cockpit
(see The Performance Monitor in the SAP HANA Cloud Database Administration with SAP HANA
Cockpit).
Total Uptime Shows the total duration in hours of all runs of the elastic compute node.
Top 5 Statements by Shows the 5 statements whose memory consumption was the highest during the last run of the
Processing Memory elastic compute node.
Consumption To see detailed information about the statements, you can click View Logs, which takes you to
the Statement Logs tab. See Monitoring SAP Datasphere.
Out-of-Memory Errors Shows the number of out-of-memory errors that have occurred in tasks and statements related
to the elastic compute node, during the last run.
To see detailed information about the errors, you can click View Logs, which takes you to the
Statement Logs tab. See Monitoring SAP Datasphere.
Memory Distribution Shows the amount of memory allocated to the elastic compute node, if in a running state, broken
down between:
Unused Memory - Shows the amount of memory available for the elastic compute node.
Memory Used for Data Replication - Shows the amount of memory used to store
replicated data for the elastic compute node.
Memory Used for Processing - Shows the amount of memory used by the processes
that are currently running for the elastic compute node. For example: consumption of the
queries running on the elastic compute node.
Note
If the elastic compute node is not in a running state, no data is displayed.
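The block-hours arithmetic described in the Run Details card can be sketched as:

```shell
# Block-hours consumed by an elastic compute node run:
# run duration (hours) multiplied by the number of compute blocks.
block_hours() {
  echo $(( $1 * $2 ))   # $1 = run duration in hours, $2 = compute blocks
}

block_hours 5 4   # 4 compute blocks running 5 hours -> prints 20
block_hours 5 8   # 8 compute blocks running 5 hours -> prints 40
```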
3. To investigate further, you can do the following:
To view statement details, click View Logs in a card to go to the Statement Logs tab, which displays information
filtered on the card criteria. Then, click the links in the Statement Details column. For more information on the
Statement Logs tab, see Statement Logs Tab.
To view details on a run, click View Logs in a card to go to the Task Logs tab, which displays information filtered on
the card criteria. In the Activity column, click the link to open the run in the Data Integration Monitor (see Managing
and Monitoring Data Integration).
To navigate to the elastic compute node in the Space Management app, click Manage Elastic Compute Node (see
Create an Elastic Compute Node and Run an Elastic Compute Node).
To analyze the performance of the SAP HANA database, click Database Overview (SAP HANA Cockpit), which
opens the Database Overview page in the SAP HANA Cockpit (see The Database Overview Page in the SAP HANA
Cloud Database Administration with SAP HANA Cockpit).
Your daily consumption for the current month is shown, and you can track usage relative to your subscription. You can also download
detailed hourly data. See Monitor Capacities.
Monitoring File Space Storage Consumption and Apache Spark Application Usage
You can monitor the storage consumption for file spaces (of storage type SAP HANA Data Lake Files) and their usage of the
Apache Spark application for task runs. This allows you to understand where storage consumption is the highest and to target
specific tasks.
1. In the side navigation area, click (System Monitor), then click the Object Store tab.
2. Select the file space of your choice in the Spaces drop-down list. All file spaces of the tenant are listed.
3. You can monitor the storage utilization of your selected file space and of all file spaces with the two following cards:
Card Description
SAP HANA Data Lake Files: Storage Utilization Shows the amount of storage used in terabyte (TB) for the
selected space.
Note
As an administrator, you can see the storage of all spaces
even if you aren't a member.
You can see the space usage of task runs in the Apache Spark:
Tasks table below.
SAP HANA Data Lake Files: Storage Utilization of All Spaces Shows the amount of storage used in terabytes (TB) for all
spaces.
4. You can investigate the selected space's tasks further in the Apache Spark: Tasks table:
a. Select a time frame in the Date and Time Range options (Single Dates, Date Ranges, Weeks, Months, or Custom
Options).
Number of Tasks: Shows the total number of tasks that ran in the application during the selected time range. Select
the line or click (Details) to show more information about the tasks, such as Application Configuration
details (Executor CPU, Executor Memory, Driver CPU, Driver Memory, Maximum CPU, and Maximum
Memory) and Tasks details (Object Type, Task Activity, and Number of Tasks). Sorting and filtering abilities
are available.
Property Description
Start Time Shows the date and hour at which the task started to run.
Duration (sec) Shows how many seconds the task has run.
Object Type Shows the type of object that was run in the task. For example: view, remote table, data flow.
Activity Shows the action that was performed on the object. For example: persist, replicate, execute. You can
click on the activity name, which takes you to the Data Integration Monitor.
Space Name Shows the name of the space in which the task is run.
Object Name Shows the name of the object. You can click on the object name, which opens the object in the Data
Builder.
SAP HANA Peak Memory Shows the maximum amount of memory (in MiB) the task has used during the runtime in SAP HANA.
Note
You can see this information:
If the option Enable Expensive Statement Tracing is enabled. It is enabled by default and traces tasks
exceeding the thresholds specified in (Configuration) → Monitoring.
And if the task is run for these objects (and activities): views (persist,
remove_persisted_data), remote tables (replicate, enable_realtime), data flows (execute)
and intelligent lookup (execute, delete_data).
SAP HANA CPU Time Shows the maximum amount of CPU time (in ms) the task has used in SAP HANA.
Note
You can see this information:
If the option Enable Expensive Statement Tracing is enabled and if the task exceeds the
thresholds specified in (Configuration) → Monitoring. See Configure Monitoring.
And if the task is run for these objects (and activities): views (persist,
remove_persisted_data), remote tables (replicate, enable_realtime), data flows (execute)
and intelligent lookup (execute, delete_data).
Note
The CPU time covers all threads. If the CPU time is significantly higher than the duration of the
statement, then many threads were used. If many threads are used for a long time, no other tasks
should be scheduled at that point in time, or resource bottlenecks may occur and tasks may even be
canceled.
Records Shows the number of records of the target table after the task has finished running.
Note
You can see this information only if the task is run for these objects (and activities): views (persist),
remote tables (replicate, enable_realtime), data flows (execute) and intelligent lookup (execute,
delete_data). Otherwise, no number is displayed.
SAP HANA Used Memory Shows the amount of memory (in MiB) that is used by the target table in SAP HANA after the task has
finished running.
SAP HANA Used Disk Shows the amount of disk space (in MiB) that is used by the target table in SAP HANA after the task
has finished running.
Substatus For tasks with the status “failed”, shows the substatus and a message describing the cause of failure.
For more information about failed task substatuses, see Understanding Statuses and Substatuses.
Target Table Shows the SAP HANA database technical name of the target table.
Statements Shows a link you can click to view all the statements of the task in the Statements tab, if the
information is available.
Note
You can see this information if the option Enable Expensive Statement Tracing is enabled
in (Configuration) → Monitoring. See Configure Monitoring.
However, as statements are traced for a limited period, you may not be able to see the
statements used in the task.
Out-of-Memory Shows whether the task has an out-of-memory error ("Yes") or not ("No").
Start Date Shows the date on which the task started to run.
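The note on SAP HANA CPU Time above can be turned into a quick rule of thumb. The sketch below is an illustrative helper (not part of SAP Datasphere) that estimates the average number of threads a task kept busy by dividing its CPU time by its wall-clock duration:

```python
def estimated_parallelism(cpu_time_ms: float, duration_sec: float) -> float:
    """Rough average thread count implied by a task's SAP HANA CPU Time.

    CPU time is summed over all threads, so dividing it by the wall-clock
    duration approximates how many threads were busy on average.
    """
    duration_ms = duration_sec * 1000.0
    if duration_ms <= 0:
        raise ValueError("duration must be positive")
    return cpu_time_ms / duration_ms

# A task that ran 10 s of wall-clock time but consumed 80,000 ms of CPU time
# kept roughly 8 threads busy on average.
print(estimated_parallelism(80_000, 10))  # -> 8.0
```

A high result over a long duration is the situation the note warns about: consider rescheduling other tasks away from that time window.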
You can cancel a task run by selecting one single task and clicking Cancel Task. You can cancel a task run on the following objects:
Transformation flow
Data flow
Task chain
Canceling a task run may be required when it takes too long or when it negatively impacts other runs by taking too many
resources away. Canceling a task via the System Monitor is the most reliable option. Its access isn't restricted when resource
consumption is too high (as in the Data Integration Monitor), and it is the fastest way to cancel a task (compared to the Database
Explorer). The data is rolled back and restored to the state that existed before the task run was initially triggered.
Note
Data on tasks are kept for the time specified in (Configuration) → Tasks.
You may not be able to cancel a task via the Data Integration Monitor or the Database Explorer when resource
consumption is too high. You will always be able to cancel a task via the System Monitor.
The option Enable Expensive Statement Tracing is enabled by default, so you can see all the database statements that
exceed the specified thresholds.
If the option Enable Expensive Statement Tracing is disabled, then the Statements tab is disabled.
Property Description
Start Time Shows the date and hour at which the statement started to run.
Duration (ms) Shows how many milliseconds the statement has run.
Object Type
Shows the type of object that was run in the statement (for example: view, remote table, data
flow).
Business Layer Modeling - the statement was run in the Business Builder.
Data Layer Modeling - the statement was run in the data preview of the view editor in
the Data Builder.
DWC Space Management - the statement was run in the Space Management, for
example, when deploying an object.
DWC Administration - the statement was run for an administration task such as writing
a task framework status.
Activity Shows the action that was performed. For example: update, compile, select.
Object Name If the statement is related to a task, it shows the name of the object for which the statement was run.
Schema Name Shows the name of the schema in which the statement is run.
SAP HANA Peak Memory Shows the maximum amount of memory (in MiB) the statement has used during the runtime in SAP
HANA.
Note
You can see the information if the option Enable Expensive Statement Tracing is enabled and if the
statement exceeds the thresholds specified in (Configuration) → Monitoring. See Configure
Monitoring.
SAP HANA CPU Time Shows the amount of CPU time (in ms) the statement has used in SAP HANA.
Note
You can see the information if the option Enable Expensive Statement Tracing is enabled and if the
statement exceeds the thresholds specified in (Configuration) → Monitoring. See Configure
Monitoring.
Note
The CPU time covers all threads. If the CPU time is significantly higher than the duration of the
statement, then many threads were used. If many threads are used for a long time, no other tasks
should be scheduled at that point in time, or resource bottlenecks may occur and tasks may even be
canceled.
Statement Details Shows the More link that you can click to view the complete SQL statement.
Note
For MDS queries - If you’ve enabled the tracing of MDS information (see Configure Monitoring), the
payload of the MDS query that is run by SAP Analytics Cloud is displayed. If identified in the
payload, the following information is also displayed: story ID, story name and data sources. You can
copy or download the displayed information.
Parameters Shows the values of the parameters of the statement that are indicated by the character "?" in the
popup that opens when clicking More in the Statement Details column.
Out-of-memory Shows whether the statement has an out-of-memory error. If there is one, a timestamp is displayed;
otherwise, NULL is displayed.
Task Log ID If the statement is related to a task, it shows the identifier of the task within a link, which takes you to
the Tasks tab filtered on this task.
Elastic Compute Node If the statement exceeds the thresholds specified in the option Enable Expensive Statement Tracing
in (Configuration) → Monitoring (see Configure Monitoring):
Shows the name of the elastic compute node if the statement is run on an elastic compute
node.
Error Code If the statement has failed, it shows the numeric code of the SQL error. See SQL Error Codes in the
SAP HANA SQL Reference Guide for SAP HANA Platform.
Error Message If the statement has failed, it shows a description of the SQL error.
Workload Class If the statement has an out-of-memory error, it shows the name of the workload class whose limit has
been exceeded.
Start Date Shows the date on which the statement started to run.
Note
Data on statements is kept for a time that depends on the thresholds specified in (Configuration) → Monitoring (see
Configure Monitoring). As only a fixed number of statements is kept (30,000 by default), setting very low thresholds can shorten
the covered time period considerably (for example, to only a few hours). To keep the statements for a longer time, set the
thresholds accordingly.
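The retention behavior described in the note can be estimated with simple arithmetic. The helper below is a hypothetical sketch: the 30,000-record default comes from the text, while the tracing rate is something you would observe on your own tenant.

```python
def retention_hours(max_records: int, traced_statements_per_hour: float) -> float:
    """Approximate how far back the Statement Logs can reach.

    The monitor keeps a fixed number of expensive statements (30,000 by
    default), so the covered time window shrinks as the tracing rate grows.
    """
    if traced_statements_per_hour <= 0:
        raise ValueError("rate must be positive")
    return max_records / traced_statements_per_hour

# Low thresholds that trace 5,000 statements/hour fill the buffer in ~6 hours...
print(round(retention_hours(30_000, 5_000), 1))     # -> 6.0
# ...while 250 statements/hour keep roughly 5 days of history.
print(round(retention_hours(30_000, 250) / 24, 1))  # -> 5.0
```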
Sort on a column by clicking the column header and then clicking (Sort Ascending) or (Sort Descending).
Filter on a column by using the quick filtering or the advanced filtering options.
Quick Filters
Filter Description
Date and Time Range Enter one date and time range or click to see the available options:
Date Ranges - From/To, From/To (Date and Time), From, To, From (Date and
Time), To (Date and Time), Last X Minutes/Hours/Days/Weeks
Spaces Select or enter the name of at least one space. All spaces of the tenant are available, even
the ones you are not a member of.
Example
You are looking for all failed and running records that happened last week in the space ACME_TF. Define the time
range by clicking to display the list of available options and selecting From/To in Date and Time Range and
selecting the dates relevant to you in the calendar. Then, select the space ACME_TF in the Space filter drop down
list. Finally, select Failed and Running in the Statuses drop down list. The log list automatically updates after
each filter definition.
Note
Defined quick filters are shown in the Define Filter dialog. If you add an advanced filter in the Define Filter
dialog, the quick filters fields will be cleared. The filters are not deleted.
Filters defined in the Define Filter dialog are not shown in the quick filters fields.
Advanced Filters
1. Click a column header, then click (Filter). The Define Filter dialog opens and advanced filtering options are
available.
2. Choose the appropriate section for your filter. If your filter is meant to include data in the table (you could say
"I want my Data Preview to show"), add your filter in the Include section. If your filter is meant to exclude
data from the table (you could say "I want my Data Preview to hide"), add your filter in the Exclude section.
When in the appropriate section, click (Add Filter) to add a filter.
3. Select a column to filter on, a filtering option, and a value. You can add several filters. Click OK to apply the
filter(s). The currently applied filters are displayed above the table.
Example
To only see the tasks that have failed on remote tables, in the Include area, select the column Object
Type, then the filtering value contains and enter "REMOTE". Then, add a filter, select the column Status,
then the filtering value contains and enter "FAILED". Once applied, the filter is displayed above the table.
4. Click Clear Filter in the filter strip or (Remove Filter) in the Define Filter dialog to remove the filter.
Note
The filtering options available depend on the data type of the column you filter on.
To increase performance, only the first 1,000 rows are displayed. Use filters to find the data you are looking for.
Filters are applied to all rows, but only the first filtered 1,000 rows are displayed.
Note
If you filter on one of the following columns and you enter a number, use the “.” (period) character as the decimal
separator, regardless of the decimal separator used in the number formatting that you’ve chosen in the general user
settings ( Settings Language & Region ): SAP HANA Peak Memory, SAP HANA CPU Time, SAP HANA Used
Memory and SAP HANA Used Disk.
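The include/exclude logic of the Define Filter dialog boils down to combining "contains" conditions. Below is a minimal sketch of that logic (not SAP code); the column names and values mirror the remote-table example above:

```python
def apply_filters(rows, include=(), exclude=()):
    """Filter rows the way the Define Filter dialog combines conditions.

    include / exclude are (column, substring) pairs using the "contains"
    option; a row must match every include pair and no exclude pair.
    """
    def contains(row, column, value):
        return value.upper() in str(row.get(column, "")).upper()

    return [
        row for row in rows
        if all(contains(row, c, v) for c, v in include)
        and not any(contains(row, c, v) for c, v in exclude)
    ]

tasks = [
    {"Object Type": "REMOTE_TABLES", "Status": "FAILED"},
    {"Object Type": "VIEWS", "Status": "FAILED"},
    {"Object Type": "REMOTE_TABLES", "Status": "COMPLETED"},
]
# "Tasks that have failed on remote tables": include Object Type contains
# REMOTE and Status contains FAILED.
print(apply_filters(tasks, include=[("Object Type", "REMOTE"), ("Status", "FAILED")]))
# -> [{'Object Type': 'REMOTE_TABLES', 'Status': 'FAILED'}]
```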
Show or hide columns by clicking (Columns Settings) to open the Columns Settings dialog, selecting columns as
appropriate. To return to the default preview columns, click Reset.
Configure Monitoring
You can control which monitoring data is collected and also obtain independent access to the underlying SAP HANA monitoring
views that power the System Monitor.
Prerequisites
To control which monitoring data is collected, you must have a global role that grants you the following privileges:
System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.
The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.
Procedure
1. In the side navigation area, click (System) (Configuration) and then select the Monitoring tab.
2. To obtain independent access to the underlying SAP HANA monitoring views that power the System Monitor:
a. Select a space from the drop-down list and click Confirm Selected Space.
Note
File spaces are not available in the list as they cannot be chosen as the monitoring space.
b. If you've created the <SAP_ADMIN> space and you want to enable it, click Enable access to SAP Monitoring
Content Space. If there isn't any space named <SAP_ADMIN> in your tenant, this is not available for selection.
For more information, see Working with SAP HANA Monitoring Views.
3. To analyze individual SQL queries whose execution exceeds one or more thresholds, select Enable Expensive Statement
Tracing. Keep the default settings or specify the following parameters to configure and filter the trace details, then save
your changes.
Property Description
In-Memory Tracing Records Specify the maximum number of records that are stored in the monitoring tables.
For example, if about 5 days are traced in the expensive statement tables and you don’t want to change
the thresholds, you can double the number of records in In-Memory Tracing Records so that about 10
days are traced. Be aware that increasing this number will also increase the used storage.
Default: 30,000
Maximum: 100,000
Threshold CPU Time Specify the threshold CPU time of statement execution.
When set to 0, all SQL statements are traced.
Default: 0
Threshold Memory Specify the threshold memory usage of statement execution.
Default: 1,024 MB
Maximum: 1 GB
Trace Parameter Values In SQL statements, field values may be specified as parameters (using a "?" in the syntax). If these
parameter values are not required, then do not select the option to reduce the amount of data traced.
Default: False
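The doubling example in the table implies a simple linear relationship between In-Memory Tracing Records and the traced time window. The sizing helper below is a hypothetical sketch; the 100,000 maximum comes from the table above:

```python
def records_for_target_days(current_records, current_days, target_days, maximum=100_000):
    """Scale In-Memory Tracing Records linearly to reach a retention target.

    Mirrors the example in the table: if 30,000 records cover about 5 days,
    60,000 cover about 10. The result is capped at the documented maximum
    of 100,000 records.
    """
    needed = int(current_records * target_days / current_days)
    return min(needed, maximum)

print(records_for_target_days(30_000, 5, 10))  # -> 60000
print(records_for_target_days(30_000, 5, 30))  # -> 100000 (capped at the maximum)
```

Remember that increasing this number also increases the storage used by the monitoring tables.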
If expensive statement tracing is not enabled, then statement information and errors are not traced and you cannot see
them in the System Monitor (see Monitoring SAP Datasphere).
For more information about these parameters, see Expensive Statements Trace in the SAP HANA Cloud, SAP HANA
Database Administration Guide.
4. To analyze individual SAP HANA multi-dimensional services (MDS) queries, select Enable MDS Information Tracing and
save.
Property Description
MDS Tracing Records Specify the maximum number of records that are stored for MDS requests in the monitoring tables.
You can increase this number in order to trace more data in the System Monitor.
If the tracing is enabled, you can view information on MDS queries when clicking More in the column Statement Details of
the Statement Logs tab in the System Monitor (see Monitoring SAP Datasphere).
5. To trace elastic compute node data, select Enable Elastic Compute Node Data Tracing and save.
If the tracing is disabled, only the statements of currently running nodes are displayed in the System Monitor. If a
node is stopped, its information is deleted.
If the tracing is enabled and a node is started and stopped more than once, only the information about the previous
run is displayed. The information is kept for 10 days or is deleted if more than 100 individual elastic compute nodes
have run.
Monitoring Views
Note
The data from these monitoring views is available directly in the System Monitor (see Monitoring SAP Datasphere). Working
with them independently is optional and allows you to do further analysis that is not supported in the standard monitor.
As the monitoring spaces you choose will provide unfiltered access to monitoring views, be aware that the users assigned to the
spaces will be able to see all metadata and object definitions of all spaces.
You can dedicate one or two spaces to monitoring:
Note
If you have already selected a space for monitoring before version 2021.19, you need to select another space, then select
the initial space again so that you can access all the views.
<SAP_ADMIN> space - This space can contain the pre-configured monitoring views provided by SAP via the Content
Network. First create the space with the space ID <SAP_ADMIN> and the space name <Administration (SAP)>, enable
access to it, and import the package from the Content Network.
Note
Do not create a space with the space ID <SAP_ADMIN> for another purpose.
Monitoring Views
The following monitoring views are available:
SAP HANA SYS Schema Monitoring Views - All SAP HANA monitoring views start with M_. For more information, see
Monitoring Views in the SAP HANA Cloud, SAP HANA Database SQL Reference Guide.
SAP HANA _SYS_STATISTICS Schema Statistics Service Views (see Embedded Statistics Service Views
(_SYS_STATISTICS schema)).
SAP HANA _SYS_BI Schema Tables and Views (see BIMC Tables and Views in the SAP HANA Cloud, SAP HANA Analytics
Catalog (BIMC Views) Reference).
SAP HANA DWC_GLOBAL Schema Monitoring Views (see Working with SAP HANA Monitoring Views).
SAP Datasphere Monitoring Views - Delivered via the Content Network in the <SAP_ADMIN> space (see SAP Datasphere
Monitoring Views (Delivered via the Content Network)).
The following monitoring views have the suffix _V_EXT and are ready to use in the DWC_GLOBAL schema:
SPACE_SCHEMAS_V_EXT:
Column Description
SPACE_ID Identifier of the SAP Datasphere space. Note that one space can contain several schemas.
SPACE_USERS_V_EXT:
Column Description
SPACE_ID Identifier of the SAP Datasphere space. Note that one space can contain several users.
USER_TYPE Type of user, such as space technical user (for example database user for open SQL schemas) or global user.
TASK_SCHEDULES_V_EXT:
Column Description
SPACE_ID (Key) Identifier of the SAP Datasphere space which contains the object with the defined schedule.
OBJECT_ID (Key) Identifier of the SAP Datasphere object for which the schedule is defined.
Note
For each application, you can have multiple activities (for example, replicating or deleting data).
For example: PERSIST (View), EXECUTE (Dataflow), REPLICATE (Remote Tables), RUN_CHAIN (Task
Chain)
OWNER Identifier of the user responsible for the schedule. The schedule is executed on this user's behalf,
and consent is checked against this user (< DWC User ID >).
CHANGED_AT Timestamp (date and time) at which the schedule was last changed.
TASK_LOGS_V_EXT:
Column Description
SPACE_ID Identifier of the SAP Datasphere space which contains the object with the defined schedule.
OBJECT_ID Identifier of the SAP Datasphere object for which the schedule is defined.
ACTIVITY For each application, there can be multiple activities (for example, replicating or deleting data).
For example: VIEWS, REMOTE_TABLES, DATA_FLOWS, TASK_CHAINS
PEAK_MEMORY Captures the highest peak memory consumption (in bytes). Not available for all apps. Requires Enable
Expensive Statement Tracing (see Configure Monitoring).
Returns NULL (if not available for the application, if Enable Expensive Statement Tracing is not
set, or if the defined threshold is not reached), 0, or the value of the memory consumption.
PEAK_CPU Total CPU time (in microseconds) consumed by the task. Not available for all apps. Requires Enable
Expensive Statement Tracing (see Configure Monitoring).
Returns NULL (if not available for the application, if Enable Expensive Statement Tracing is not
set, or if the defined threshold is not reached), 0, or the value of the CPU time consumption.
RECORDS Shows the number of records of the target table after the task has finished running.
Returns NULL (not applicable or not measured), 0, or the number of records.
START_TIME Timestamp (date and time) at which the scheduled task was started.
END_TIME Timestamp (date and time) at which the scheduled task was stopped.
TRIGGERED_TYPE Indicates if task execution was triggered manually (DIRECT) or via schedule (SCHEDULED).
APPLICATION_USER The user on whose behalf the schedule was executed (the owner at this point in time).
DURATION Duration in seconds of the task execution (also works for ongoing execution).
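For independent analysis, TASK_LOGS_V_EXT can be queried with plain SQL. The sketch below uses an in-memory SQLite table as a stand-in for SAP HANA purely to illustrate the query shape; the column names come from the table above, while the space and object names are made up:

```python
import sqlite3

# SQLite stands in for SAP HANA here; on a real tenant you would run the
# SELECT against the view in the DWC_GLOBAL schema of your monitoring space.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE TASK_LOGS_V_EXT (
        SPACE_ID TEXT, OBJECT_ID TEXT, ACTIVITY TEXT,
        DURATION INTEGER, PEAK_MEMORY INTEGER, TRIGGERED_TYPE TEXT
    )
""")
conn.executemany(
    "INSERT INTO TASK_LOGS_V_EXT VALUES (?, ?, ?, ?, ?, ?)",
    [
        # Hypothetical sample rows (space and object names are invented).
        ("SALES", "V_ORDERS", "PERSIST", 4200, 9_000_000_000, "SCHEDULED"),
        ("SALES", "RT_ITEMS", "REPLICATE", 90, 1_000_000, "DIRECT"),
    ],
)
# Long-running scheduled tasks, longest first -- the kind of custom analysis
# the standard System Monitor does not offer out of the box.
rows = conn.execute("""
    SELECT SPACE_ID, OBJECT_ID, ACTIVITY, DURATION
      FROM TASK_LOGS_V_EXT
     WHERE TRIGGERED_TYPE = 'SCHEDULED' AND DURATION > 3600
     ORDER BY DURATION DESC
""").fetchall()
print(rows)  # -> [('SALES', 'V_ORDERS', 'PERSIST', 4200)]
```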
TASK_LOG_MESSAGES_V_EXT:
Column Description
MESSAGE_NO (Key) Order sequence of all messages belonging to a certain Tasklog ID.
SEVERITY Indicates if the message provides general information (INFO) or error information (ERROR).
DETAILS Technical additional information. For example, it can be an error stack or a correlation ID.
TASK_LOCKS_V_EXT:
Column Description
LOCK_KEY (Key) Identifier, flexible field as part of the lock identifier, usually set to WRITE or EXECUTE.
SPACE_ID (Key) Identifier of the SAP Datasphere space which contains the object with the defined schedule.
OBJECT_ID (Key) Identifier of the SAP Datasphere object for which the schedule is defined.
TASK_LOG_ID Uniquely identifies the task execution that set the lock.
Note
Cross-space sharing is active for all SAP HANA monitoring views. The row-level access of shared views is bound to the space
read access privileges of the user who consumes the view.
See the blogs SAP Datasphere: Data Integration Monitoring – Sample Content for Reporting (published in October 2021) and
SAP Datasphere: Data Integration Monitoring – Running Task Overview (published in November 2021).
You must:
Create a space with the space ID <SAP_ADMIN> and the space name <Administration (SAP)> and configure it as a
monitoring space by enabling the toggle Enable Access to SAP Monitoring Content Space (see Configure Monitoring).
Import the Technical Content: Task Monitoring package from the Content Network (see Importing SAP and
Partner Business Content from the Content Network).
Task properties, such as duration and execution status (e.g. failed, completed, ...).
Locking status
Best Practice: To enable the navigation between SAP Datasphere and SAP Analytics Cloud, you must change the constant
for the url_host to your SAP Datasphere instance. Open the view in the view editor, and update the URL host:
Uses the view TASK_SCHEDULES_V_EXT and adds a row-count to be compatible with OLAP reporting.
Error code, header line and first stack line parsed out from detailed message.
An indicator that the task_id has an error (facilitate filtering of messages).
Best Practice: To enable the navigation between SAP Datasphere and SAP Analytics Cloud, you must change the constant
for the url_host to your SAP Datasphere instance. Open the view in the view editor, and update the URL host:
Prerequisites
To monitor database operations with audit logs, you must have a global role that grants you the following privileges:
System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.
The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.
Context
If Space Administrators have enabled audit logs to be created for their space (see Logging Read and Change Actions for Audit),
you can get an overview of these audit logs. You can do analytics on audit logs by assigning the audit views to a dedicated space
and then work with them in a view in the Data Builder.
Note
Audit logs can consume a large quantity of GB of disk in your database, especially when combined with long retention periods
(which are defined at the space level). You can delete audit logs when needed, which will free up disk space. For more
information, see Delete Audit Logs.
Procedure
1. Go to System Configuration Audit . Enable saving and later displaying the audit logs directly in a certain space by
choosing a space from the drop-down list. We recommend creating a dedicated space for audit logs, as you might not want
all users to view sensitive data.
2. Open the Data Builder, create a view, and add one or more of the following views from the DWC_AUDIT_READER schema
as sources:
AUDIT_LOG_OVERVIEW - Contains audit policies (read or change operations) and the number of audit log entries.
ANALYSIS_AUDIT_LOG - Contains audit log entries for database analysis users. For more information, see Create
a Database Analysis User to Debug Database Issues.
Spaces for which auditing is enabled. For each space, you can delete separately all the audit log entries recorded for read
operations and all the audit log entries recorded for change operations. All the entries recorded before the date and time
you specify are deleted.
All read audit logs recorded for all database analysis users. They are grouped together into the audit policy
DWC_ANALYSIS_USERS_AUDIT_ALL.
2. Select the spaces (and the audit policy names - read or change) or the database analysis user audit policy
(DWC_ANALYSIS_USERS_AUDIT_ALL) for which you want to delete all audit log entries and click Delete.
All entries that have been recorded before this date and time are deleted.
Deleting audit logs frees up disk storage, which you can see in the Disk Storage Used card in System Monitor
Dashboard .
Note
Audit logs are automatically deleted when performing the following actions: deleting a space, deleting a database user (open
SQL schema), disabling an audit policy for a space, disabling an audit policy for a database user (open SQL schema),
unassigning an HDI container from a space. Before performing any of these actions, you may want to export the audit log
entries, for example by using SAP HANA Database Explorer (see Logging Read and Change Actions for Audit).
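To gauge whether a retention period is sustainable, a back-of-the-envelope size estimate can help. This is an illustrative calculation only; the average entry size used here is an assumption, not an SAP figure, so measure your own tenant before relying on the result:

```python
def audit_log_gib(entries_per_day: float, retention_days: int, avg_entry_kib: float = 2.0) -> float:
    """Rough disk estimate for a space's audit logs.

    avg_entry_kib (2 KiB here) is an illustrative assumption; actual entry
    sizes depend on the audited operations and payloads.
    """
    kib = entries_per_day * retention_days * avg_entry_kib
    return kib / (1024 * 1024)

# 1 million read-audit entries per day kept for 180 days at ~2 KiB each:
print(round(audit_log_gib(1_000_000, 180), 1))  # -> 343.3 (GiB)
```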
Prerequisites
Context
Download and Delete the Activity Log for a Specific Time Period
Prerequisites
To monitor object changes with activities, you must have a global role that grants you the following privileges:
Activity Log (-R------) - To view and download activities. The DW Administrator role template, for example, grants this
privilege.
For more information, see Privileges and Permissions and Standard Roles Delivered with SAP Datasphere.
Context
Actions that are performed by users are logged in Security Activities .
For example:
Table changes
Logged users
To view activities, you must have a global role that grants you the privilege:
In the Set Filters dialog, you can select one or more parameters to filter in the Available Filters list. In the Active Filters list,
type or choose a filter value for each parameter that you select. When you click OK, the log is filtered according to your
selections.
If you apply filters to the log, the entries that you filter out are also excluded if you download the activity data.
Download and Delete the Activity Log for a Specific Time Period
To download and delete activity logs, you must have a global role that grants you the privilege:
When the size of the activity log approaches the limit, users who have the Delete permission for the Activity Log privilege will
receive an email and an alert notification. Further alerts will be sent if the log continues to grow closer to the limit.
When the activity log reaches its limit, final notifications are sent, and then the oldest rows will be deleted from the system to keep
the log size below the limit. To reduce the size of the log, you can first download part or all of the log as CSV files, and then delete
those log entries from the system.
The default limit for the activity log is 500,000 rows. You can request that this number be changed to a higher number, or changed
to a one-year rolling period, by entering a support ticket.
1. In the side navigation area, click Security Activities .
You can also open the Activities page directly from the link in the notification email.
2. If you want to filter the activities that you will download, select (Filter).
Tip
Filtering the activity log can be useful when collecting troubleshooting data, but is usually not necessary for archiving
activity log data.
In the Set Filters dialog, select the filters that you want to apply, and choose a value for each filter. Time Stamp filters will
be overridden by your settings in the Download Activities dialog.
4. In the Download Activities dialog, type a file name for the download in the Name field.
The rows within the dates and filters that you specified are downloaded as CSV files with up to 75,000 rows each.
8. If you choose Specific range, set a Starting Date and an End Date.
Note
Filters applied in the Activities page don't apply to the delete operation.
9. Select Delete.
All activity rows in the specified time period are deleted from the system.
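The 75,000-row chunking used by the download can be sketched as follows. This is an illustrative reimplementation, not the actual export code; the file-naming scheme is an assumption:

```python
import csv
import io

CHUNK_ROWS = 75_000  # the documented per-file limit for activity downloads

def write_activity_csvs(rows, header, name, chunk_rows=CHUNK_ROWS):
    """Split activity rows into CSV files of at most chunk_rows rows each,
    mimicking how the Download Activities dialog chunks its output.
    Returns {file_name: csv_text}; a real export would write to disk.
    """
    files = {}
    for i in range(0, len(rows), chunk_rows):
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(header)
        writer.writerows(rows[i:i + chunk_rows])
        files[f"{name}_{i // chunk_rows + 1}.csv"] = buf.getvalue()
    return files

# Nine sample rows with a chunk size of 4 produce three files.
rows = [(f"2025-07-{d:02d}", "USER_A", "CREATE") for d in range(1, 10)]
files = write_activity_csvs(rows, ["Timestamp", "User", "Activity"], "activities", chunk_rows=4)
print(sorted(files))  # -> ['activities_1.csv', 'activities_2.csv', 'activities_3.csv']
```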
Prerequisites
To delete task logs and reduce storage consumption, you must have a global role that grants you the following privileges:
System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.
The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.
Context
Each time an activity runs in SAP Datasphere (for example, replicating a remote table), task logs are created to allow you to
check whether the activity is running smoothly or whether there is an issue to solve. You access these detailed task logs by navigating to the
Data Integration Monitor - Details screen of the relevant object. For example, clicking the button of the relevant remote table.
See Managing and Monitoring Data Integration.
However, task logs can consume a lot of spaces in a tenant. Deleting old task logs that are no longer needed can be useful to
release storage space. This is why SAP Datasphere has a log deletion schedule activated by default. You can change the schedule
defining your own criteria or decide to take immediate deletion actions.
Procedure
1. In the side navigation area, click (Configuration) → Tasks.
2. Check how much storage the task logs consume on your tenant. If needed, decide how you want to delete the task logs:
Scheduled Deletion: SAP Datasphere automatically triggers log deletion based on default criteria. To change these criteria, go to the Schedule Task Log Deletion section, update the deletion criteria according to your needs, and click Save.
Manual Deletion: You want to manually delete task logs to take immediate action. Go to the Manually Delete Task Log section and determine how long you want to keep the logs. For example, delete the logs that are older than 100 days. Click Delete.
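The manual-deletion criterion above (delete logs older than a retention window) is a simple age filter. A sketch under the stated rule; the log record structure and field names here are illustrative, not the actual task log schema:

```python
from datetime import datetime, timedelta

def logs_to_delete(logs, keep_days, now=None):
    """Return log entries older than keep_days (candidates for deletion)."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=keep_days)
    return [log for log in logs if log["created"] < cutoff]

now = datetime(2025, 7, 9)
logs = [
    {"id": 1, "created": datetime(2025, 1, 1)},  # well beyond 100 days old
    {"id": 2, "created": datetime(2025, 6, 1)},  # about 38 days old
]
old = logs_to_delete(logs, keep_days=100, now=now)
```

With a 100-day retention window and a reference date of 2025-07-09, only the January entry falls past the cutoff and would be deleted.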
Prerequisites
To create a database user to monitor, analyze, trace, or debug your SAP Datasphere database, you must have a global role that
grants you the following privileges:
System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.
The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.
Context
Note
You should only create a database analysis user to resolve a specific database issue and then delete it immediately after the
issue is resolved (see Manage Database Analysis Users). This user can access all SAP HANA Cloud monitoring views and all
SAP Datasphere data in all spaces, including any sensitive data stored there.
Procedure
1. In the side navigation area, click (System) → (Configuration) → Database Access → Database Analysis Users.
Property Description
Database Analysis User Name Suffix - Enter the suffix, which is used to create the full name of the user. Can contain a maximum of 31 uppercase letters or numbers and must not contain spaces or special characters other than _ (underscore). See Rules for Technical Names.
Enable Space Schema Access - Select only if you need to grant the user access to space data.
Database analysis user expires in - Select the number of days after which the user will be deactivated. We strongly recommend creating this user with an automatic expiration date.
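The naming rule for the suffix (at most 31 characters; only uppercase letters, digits, and underscore) can be expressed as a short validator. This is a sketch of the rule as documented, not SAP's own validation code:

```python
import re

# Documented rule: max 31 characters, uppercase letters, digits, underscore only
SUFFIX_RE = re.compile(r"^[A-Z0-9_]{1,31}$")

def is_valid_suffix(suffix):
    """Check a database analysis user name suffix against the documented rule."""
    return bool(SUFFIX_RE.match(suffix))
```

For example, `DEBUG_2025` passes, while a lowercase name, a name with spaces, or a 32-character name is rejected.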
The host name and port, as well as the user password are displayed. Note these for later use.
4. Select your user in the list and then click one of the following and enter your credentials:
Open SAP HANA Cockpit - Open the Database Overview Monitoring page for the SAP Datasphere run-time
database, which offers various monitoring tools.
For more information, see Using the Database Overview Page to Manage a Database.
Open Database Explorer - Open an SQL Console for the SAP Datasphere run-time database.
For more information, see Getting Started With the SAP HANA Database Explorer.
A database analysis user can run a procedure in Database Explorer to stop running statements. For more
information, see Stop a Running Statement With a Database Analysis User.
Note
All actions of the database analysis user are logged in the ANALYSIS_AUDIT_LOG view, which is stored in the space
that has been assigned to store audit logs (see Logging Read and Change Actions for Audit).
Audit logs can consume a large amount of disk storage in your SAP Datasphere tenant database. The audit log entries for
database analysis users are kept for 180 days, after which they are automatically deleted. You can also manually delete
the audit logs to free up disk space (see Delete Audit Logs). Also, a database analysis user can be automatically
deactivated due to a large amount of disk storage consumed by audit logs (see Manage Database Analysis Users).
Delete a Database Analysis User
Status Description
Deactivated - The database analysis user has been deactivated because its audit logs have exceeded the disk storage threshold.
Locked - The database analysis user has been locked after too many failed login attempts.
Expired - The database analysis user has passed the expiration date that you set when creating it.
A database analysis user is deactivated and its status set to Deactivated because its audit logs have exceeded the disk storage
threshold.
If the total size of all audit logs in the tenant exceeds 40% of the tenant's disk storage, the system automatically deactivates any database analysis users - and locks any spaces - whose audit logs consume more than 30% of the total audit log size.
You can reactivate a deactivated database analysis user by deleting its audit log entries so that they fall below the threshold (see
Delete Audit Logs). The database analysis user will be automatically reactivated after a few minutes.
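The two thresholds above combine as follows: deactivation is considered only once total audit log size exceeds 40% of the tenant's disk storage, and it then applies to users whose own logs exceed 30% of the total audit log size. A sketch of that rule, with sizes in GB and an illustrative data structure:

```python
def users_to_deactivate(tenant_disk_gb, user_log_sizes_gb):
    """Apply the documented thresholds: if total audit logs exceed 40% of
    tenant disk, flag users whose logs exceed 30% of the total log size."""
    total = sum(user_log_sizes_gb.values())
    if total <= 0.4 * tenant_disk_gb:
        return []  # below the tenant-wide threshold: nobody is deactivated
    return [user for user, size in user_log_sizes_gb.items()
            if size > 0.3 * total]

# Tenant with 100 GB of disk; audit logs total 50 GB (over the 40 GB threshold),
# so users holding more than 15 GB (30% of 50 GB) are flagged
flagged = users_to_deactivate(100, {"ANALYSIS_A": 20, "ANALYSIS_B": 5, "SPACE_X": 25})
```

Deleting audit log entries shrinks both the total and the per-user share, which is why deletion can bring a deactivated user back under the threshold.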
You can unlock a locked database analysis user by requesting a new password for it.
1. In the side navigation area, click (System) → (Configuration) → Database Access → Database Analysis Users.
2. Click the icon next to the Locked status of the database analysis user.
1. In the side navigation area, click (System) → (Configuration) → Database Access → Database Analysis Users.
2. Click the icon next to the Expired status of the database analysis user.
3. In the dialog box that opens, select the number of days after which the user will expire and click Reactivate Analysis User.
1. In the side navigation area, click (System) → (Configuration) → Database Access → Database Analysis Users.
2. Select the user you want to delete and then click Delete.
Deleting a database analysis user does not delete its audit logs. The audit logs will be deleted after a retention period of 180 days.
As they can consume a large amount of disk storage, you may want to manually delete them before the end of the retention period
(see Delete Audit Logs).
You may for example want to stop a statement that has been running for a long time and is causing performance issues.
You can only stop statements that have been run by space users, analysis users, user group users, and Data Provisioning Agent users.
In SAP HANA Database Explorer, run a database procedure using the following syntax:
ACTION - Enter CANCEL to run the statement ALTER SYSTEM CANCEL [WORK IN] SESSION (see ALTER SYSTEM CANCEL [WORK IN] SESSION Statement (System Management) in the SAP HANA Cloud, SAP HANA Database SQL Reference Guide).
Note
You can find the connection ID in (System Monitor) → Statement Logs, in the Connection ID column.
For more information on the Database Explorer, see Getting Started With the SAP HANA Database Explorer.
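The statement named above takes a connection ID. A small helper that only builds the SQL text following the documented syntax, without executing it; the connection ID 203 is a made-up example, and the exact argument quoting should be confirmed against the SAP HANA SQL reference:

```python
def cancel_session_sql(connection_id, work_in=False):
    """Build an ALTER SYSTEM CANCEL [WORK IN] SESSION statement for a connection ID.

    work_in=True cancels the current statement only; work_in=False
    disconnects the whole session.
    """
    action = "CANCEL WORK IN SESSION" if work_in else "CANCEL SESSION"
    return f"ALTER SYSTEM {action} '{int(connection_id)}'"

stmt = cancel_session_sql(203)
```

You would paste the resulting statement into the SQL Console opened via Open Database Explorer, using the connection ID found in the Statement Logs.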
Configure Notifications
Configure notifications about system events and network connection issues, and define the SMTP server to be used for email
deliveries.
Prerequisites
To configure notifications, you must have a global role that grants you the following privileges:
System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.
The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.
When the notification is on, everyone who uses the application on that tenant will see the notification in the top right corner of
their application.
By default, when users are added to your SAP Datasphere tenant, they receive a welcome email which contains a link to the tenant
so they can activate their account and log in for the first time. You can disable the welcome email from being sent to new users. You
may want to do so in the following cases:
When SAML single sign-on (SSO) is set up and it's not necessary for users to activate their accounts.
When you want to set up single sign-on (SSO) before users are given the go-ahead to access the system.
When you need to import users from a public tenant to a private tenant.
4. Click Save.
Note
If you disable the welcome email and then add a user who doesn't have an activated account for SAP Datasphere, they will not be able to access the system. The new user needs to go to the tenant log-on page and click "Forgot password?". They must enter the email address associated with their account and follow the instructions in the email they receive to set up a password.
Configuring an email server of your choice ensures greater security and flexibility while delivering email for your business.
3. In the Email Server Configuration section, select Custom, and complete the following properties.
Prerequisites
To check consent expirations, you must have a global role that grants you the following privileges:
System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.
The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.
Procedure
1. To view a list of users whose authorization consent will expire within the next four weeks, click (Configuration) → Tasks.
2. In the Consent Expiration section of the Tasks page, click the View Expiration List link. SAP Datasphere displays a dialog in which you can view a list of users whose authorization consent will expire within a given timeframe.
By default, the dialog displays a list of users whose consent will expire within four weeks. You can change the default expiration
timeframe to anywhere between one and four weeks. In addition to displaying the list of users whose consent will soon expire, you
can also select a user in the list and click the Show Affected Tasks link to view the collection of tasks that user has scheduled.
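The expiration list described above is essentially a date filter over consent expiry dates with an adjustable horizon of one to four weeks. A sketch of that filter; the user records are illustrative, not the actual consent data model:

```python
from datetime import date, timedelta

def expiring_consents(users, weeks, today=None):
    """Return users whose authorization consent expires within the given weeks."""
    today = today or date.today()
    horizon = today + timedelta(weeks=weeks)
    return [u for u in users if today <= u["consent_expires"] <= horizon]

today = date(2025, 7, 9)
users = [
    {"name": "alice", "consent_expires": date(2025, 7, 20)},  # within 4 weeks
    {"name": "bob",   "consent_expires": date(2025, 9, 1)},   # beyond 4 weeks
]
soon = expiring_consents(users, weeks=4, today=today)
```

Narrowing `weeks` from 4 down to 1 corresponds to shortening the timeframe in the dialog.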
Monitor Capacities
View the number of capacity units you have used each month.
Context
Monitor Capacities provides insights into monthly and daily capacity unit consumption, allowing users to track usage relative to
their subscription and download detailed hourly data. This tool is useful for optimizing resource allocation and ensuring efficient
subscription management.
Card Description
Total CU Consumption - Shows the number of capacity units consumed this month.
Total CU Consumption: Relative to Your Subscription - Shows the percentage of your capacity unit subscription that is used this month.
Total CU Consumption: Daily - Displays a bar chart showing the number of capacity units consumed each day of this month.
You can also download a CSV file to view the hourly consumption of capacity units. Related rows in the file are grouped together for processing.
OBJECT_NAME - Shows the name of the object when the consumption is associated with a specific object.
SPACE_NAME - Shows the name of the space when the consumption is associated with a specific space.
The values shown for CONSUMED_CU and CONSUMED_BLOCKS are not final and can change. For metrics involving multiple tasks that generate CU consumption, such as Premium Outbound Integration or ECN, the values are approximate due to the concurrent nature of those tasks. When the values in these columns are empty, the cause could be one of the following:
The entry did not register any consumption. This can happen when multiple workflows are present, as entries are still displayed even though they were not running at the time.
The values are not available because the measurement has not been consolidated yet.
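Because CONSUMED_CU can be empty for entries that registered no consumption or are not yet consolidated, any aggregation over the downloaded CSV should skip blank values rather than treat them as zero failures. A sketch, assuming the column names from the document and an illustrative DATE column for the row layout:

```python
import csv
import io
from collections import defaultdict

def daily_cu_totals(csv_text):
    """Sum CONSUMED_CU per day, skipping rows where the value is empty."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        value = row["CONSUMED_CU"].strip()
        if value:  # empty = no consumption, or measurement not yet consolidated
            totals[row["DATE"]] += float(value)
    return dict(totals)

sample = (
    "DATE,OBJECT_NAME,CONSUMED_CU\n"
    "2025-07-01,flow_a,1.5\n"
    "2025-07-01,flow_b,\n"  # empty: not yet consolidated
    "2025-07-02,flow_a,2.0\n"
)
totals = daily_cu_totals(sample)
```

Note that such totals remain approximate until the monthly measurement is consolidated, as the document states.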
Procedure
1. From the side navigation menu, click (System Monitor) → Capacities.
2. To download a CSV file of the consumption, click Download Capacity Metrics as CSV.
3. Click and select the beginning and end dates for the report.
4. Click Download.
The CSV file is downloaded, and you can view it in your spreadsheet application.
Note
The CSV file contains approximate data that may not reflect the finalized monthly total.