SAP Datasphere Administration Guide

The document provides guidance on administering SAP Datasphere, detailing the responsibilities of administrators, including user and role management, space creation, connectivity preparation, and system monitoring. It outlines the roles available for users, such as System Owner and DW Administrator, and explains how to create spaces for data management. Additionally, it emphasizes that this is custom documentation and not for production use, with a link to the official SAP Help Portal for comprehensive information.

7/9/25, 8:45 AM

Administering SAP Datasphere


Generated on: 2025-07-09 [Link] GMT+0000

SAP Datasphere | cloud (2025.14)

Public

Original content: [Link]



Warning

This document has been generated from SAP Help Portal and is an incomplete version of the official SAP product documentation.
The information included in custom documentation may not reflect the arrangement of topics in SAP Help Portal, and may be
missing important aspects and/or correlations to other topics. For this reason, it is not for production use.

For more information, please visit [Link]

This is custom documentation. For more information, please visit SAP Help Portal.

Users with an administrator role are responsible for managing users and roles for the SAP Datasphere tenant, preparing
connectivity for data integration, and creating spaces and allocating storage to them, as well as monitoring and maintaining the
tenant.

This topic contains the following sections:

Configure Your SAP Datasphere Tenant

Create Users and Assign Roles

Create Spaces and Allocate Storage

Prepare Connectivity

Monitor and Maintain SAP Datasphere

 Tip
The English version of this guide is open for contributions and feedback using GitHub. This allows you to get in contact with
responsible authors of SAP Help Portal pages and the development team to discuss documentation-related issues. To
contribute to this guide, or to provide feedback, choose the corresponding option on SAP Help Portal:

 Feedback > Edit page: Contribute to a documentation page. This option opens a pull request on GitHub.

 Feedback > Create issue: Provide feedback about a documentation page. This option opens an issue on GitHub.

You need a GitHub account to use these options.

More information:

Contribution Guidelines

Introduction Video: Open Documentation Initiative

Blog Post: Introducing the Open Documentation Initiative

Configure Your SAP Datasphere Tenant


Either SAP will provision your tenant or you can create an instance in SAP BTP (see Creating and Configuring Your SAP Datasphere
Tenant).

We recommend that you link your tenant to an SAP Analytics Cloud tenant (see Review and Manage Links to SAP Analytics
Cloud and SAP Business Data Cloud Tenants).

You can enable SAP HANA for SQL data warehousing on your tenant to exchange data between your HDI containers and
your SAP Datasphere spaces without the need for data movement (see Enable SAP HANA for SQL data warehousing on
Your SAP Datasphere Tenant).

You can enable the SAP HANA Cloud script server to access the SAP HANA Automated Predictive Library (APL) and SAP
HANA Predictive Analysis Library (PAL) machine learning libraries (see Enable the SAP HANA Cloud Script Server on Your
SAP Datasphere Tenant).


Create Users and Assign Roles

An administrator creates SAP Datasphere users manually, from a *.csv file, or via an identity provider (see Managing SAP
Datasphere Users).
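When users are created in bulk from a *.csv file, it can help to sanity-check the file before uploading it. The sketch below is a minimal, hypothetical validator in Python; the column names are assumptions for illustration, not the exact header names SAP Datasphere expects (check your tenant's CSV import template for those).

```python
import csv
import io

# Hypothetical column layout -- verify the exact header names against
# your tenant's CSV import template before uploading.
REQUIRED_COLUMNS = {"User Name", "Email", "First Name", "Last Name"}

def validate_user_csv(text: str) -> list[str]:
    """Return a list of problems found in a user CSV, empty if it looks OK."""
    problems = []
    reader = csv.DictReader(io.StringIO(text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        problems.append(f"missing columns: {sorted(missing)}")
        return problems
    for i, row in enumerate(reader, start=2):  # row 1 is the header
        if "@" not in row["Email"]:
            problems.append(f"row {i}: invalid email {row['Email']!r}")
        if not row["User Name"].strip():
            problems.append(f"row {i}: empty user name")
    return problems

sample = "User Name,Email,First Name,Last Name\njdoe,jdoe@example.com,Jane,Doe\n"
print(validate_user_csv(sample))  # -> []
```

A file that fails validation can then be corrected before the import, rather than producing partially created users.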

You must assign one or more roles to each of your users via scoped roles and global roles (see Managing Roles and Privileges). You
can create your own custom roles or use the following standard roles delivered with SAP Datasphere:

Roles providing privileges to administer the SAP Datasphere tenant:

System Owner - Includes all user privileges to allow unrestricted access to all areas of the application. Exactly one
user must be assigned to this role.

DW Administrator - Can create users, roles and spaces and has other administration privileges across the SAP
Datasphere tenant. Cannot access any of the apps (such as the Data Builder).

Roles providing privileges to work in SAP Datasphere spaces:

DW Space Administrator (template) - Can manage all aspects of the spaces users are assigned to (except the
Space Storage and Workload Management properties) and can create data access controls.

DW Scoped Space Administrator - This predefined scoped role is based on the DW Space Administrator role
and inherits its privileges and permissions.

 Note
Users who are space administrators primarily need scoped permissions to work with spaces, but they
also need some global permissions (such as Lifecycle when transporting content packages). To provide
such users with the full set of permissions they need, they must be assigned to a scoped role (such as the
DW Scoped Space Administrator) to receive the necessary scoped privileges, but they also need to be
assigned directly to the DW Space Administrator role (or a custom role that is based on the DW Space
Administrator role) in order to receive the additional global privileges.

DW Integrator (template) - Can integrate data via connections and can manage and monitor data integration in a
space.

DW Scoped Integrator - This predefined scoped role is based on the DW Integrator role and inherits its
privileges and permissions.

DW Modeler (template) - Can create and edit objects in the Data Builder and Business Builder and view data in
objects.

DW Scoped Modeler - This predefined scoped role is based on the DW Modeler role and inherits its privileges
and permissions.

DW Viewer (template) - Can view objects and view data output by views that are exposed for consumption in
spaces.

DW Scoped Viewer - This predefined scoped role is based on the DW Viewer role and inherits its privileges
and permissions.

Roles providing privileges to consume the data exposed by SAP Datasphere spaces:

DW Consumer (template) - Can consume data exposed by SAP Datasphere spaces, using SAP Analytics Cloud and
other clients, tools, and apps. Users with this role cannot log into SAP Datasphere. The role is intended for business
analysts and other users who use SAP Datasphere data to drive their visualizations, but who have no need to access
the modeling environment.

DW Scoped Consumer - This predefined scoped role is based on the DW Consumer role and inherits its
privileges and permissions.

Roles providing privileges to work in the SAP Datasphere catalog:

Catalog Administrator - Can set up and implement data governance using the catalog. This includes connecting the
catalog to source systems for extracting metadata, building business glossaries, creating tags for classification, and
publishing enriched catalog assets so all catalog users can find and use them. Must be used in combination with
another role such as DW Viewer or DW Modeler for the user to have access to SAP Datasphere.

Catalog User - Can search and discover data and analytics content in the catalog for consumption. These users may
be modelers who want to build additional content based on official, governed assets in the catalog, or viewers who
just want to view these assets. Must be used in combination with another role such as DW Viewer or DW Modeler for
the user to have access to SAP Datasphere.

Role providing privileges to use AI features in SAP Datasphere:

DW AI Consumer - Can use SAP Business AI features.

 Note
To activate SAP Business AI features in your SAP Datasphere tenant, see Enable SAP Business AI for SAP
Datasphere.

Create Spaces and Allocate Storage

All data acquisition, preparation, and modeling in SAP Datasphere happens inside spaces. A space is a secure area - space data
cannot be accessed outside the space unless it is shared to another space or exposed for consumption.

An administrator must create one or more spaces and allocate resources to them. See Creating Spaces and Allocating Resources.

Prepare Connectivity
Administrators prepare SAP Datasphere for creating connections to source systems in spaces (see Preparing Connectivity for
Connections).

Monitor and Maintain SAP Datasphere


Administrators have access to various monitoring logs and views and can, if necessary, create database analysis users to help
troubleshoot issues (see Monitoring SAP Datasphere).

Administration Apps and Tools


You administer SAP Datasphere using apps and tools in the side navigation area.

(Space Management)
In Space Management, you can set up, configure, and monitor your spaces, including assigning users to them. For more
information, see Preparing Your Space and Integrating Data.

 (System Monitor)
In the System Monitor, you can monitor the performance of your system and identify storage, task, out-of-memory, and other
issues. For more information, see Monitoring SAP Datasphere.


 Security

Tool: Users
  Task: Create, modify, and manage users in SAP Datasphere.
  More information: Managing SAP Datasphere Users

Tool: Roles
  Task: Assign pre-defined standard roles or custom roles that you have created to users.
  More information: Managing Roles and Privileges

Tool: Activities
  Task: Track the activities that users perform on objects such as spaces, tables, views, data flows, and others, track changes to users and roles, and more.
  More information: Monitor Object Changes with Activities

 System  Configuration

Tab: Data Integration
  Live Data Connections (Tunnel): For SAP BW∕4HANA and SAP S/4HANA model import, you need Cloud Connector. This requires a live data connection of type tunnel. (See Create Live Data Connection of Type Tunnel (SAP BW∕4HANA) and Create SAP S/4HANA Live Data Connection of Type Tunnel (SAP S/4HANA).)
  On-Premise Agents: Manage Data Provisioning Agents, which are required to act as a gateway to SAP Datasphere to enable using connections to on-premise sources for remote tables and building views. (See Connect and Configure the Data Provisioning Agent, Register Adapters with SAP Datasphere, Monitoring Data Provisioning Agent in SAP Datasphere, and Pause Real-Time Replication for an Agent.)
  Third-Party Drivers: Upload driver files that are required for certain third-party cloud connections to use them for data flows. (See Upload Third-Party ODBC Drivers (Required for Data Flows).)

Tab: Security
  SSL/TLS Certificates: Upload server certificates to enable secure SSL/TLS-based connections to certain sources. (See Manage Certificates for Connections.)
  Password Policy Configuration: Define your password policy settings for the database users. The policy can be enabled when configuring your database users. (See Set a Password Policy for Database Users.)

Tab: Audit
  Audit View Enablement: Configure a space that gets access to audit views and allows you to display the audit logs in that space. (See Logging Read and Change Actions for Audit.)
  Audit Log Deletion: (See Delete Audit Logs.)

Tab: Monitoring
  Control which monitoring data is collected and enable access to pre-configured monitoring views prepared by SAP. (See Configure Monitoring.)

Tab: IP Allowlist
  Trusted IPs: Control the range of external public IPv4 addresses that get access to the database of your SAP Datasphere tenant by adding them to an allowlist. (See Manage IP Allowlist.)
  Trusted Cloud Connector IPs.

Tab: Tasks
  Clean up task logs to reduce storage consumption in your SAP Datasphere tenant. (See Delete Task Logs to Reduce Storage Consumption.)
  Also allows you to view a list of users whose authorization consent will expire within a given timeframe (by default, four weeks). (See Check Consent Expirations.)

Tab: Database Access
  Database Analysis Users: Create a database analysis user to connect to your SAP HANA Cloud database to analyze, diagnose, and solve database issues. Only create this user for a specific task and delete it right after the task has been completed. (See Monitoring SAP Datasphere.)
  Database User Groups: Create an isolated environment with corresponding administrators where you can work more freely with SQL in your SAP HANA Cloud database. (See Creating a Database User Group.)

Tab: Tenant Configuration
  Allocate the capacity units to storage and compute resources for your tenant. (See Configure the Size of Your SAP Datasphere Tenant.)

Tab: SAP BW Bridge
  Create an SAP BW bridge tenant. (See Provisioning the SAP BW Bridge Tenant.)

Tab: Business Data Products
  Select spaces to which SAP Business Data Cloud data products from an activated data package can be installed. (See Authorize Spaces to Install SAP Business Data Cloud Data Products.)

Tab: AI Services
  Enable Artificial Intelligence services in SAP Datasphere. (See Enable SAP Business AI for SAP Datasphere.)

Tab: System Information
  Add a visual tenant type indicator to show all users which system they are using, for example a test or production system. (See Display Your System Information.)

Tab: Workload Management
  Set a priority for a particular space when querying the database and set limits to the amount of memory and threads that the space can consume. (See Set Priorities and Statement Limits for Spaces or Groups.)
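Conceptually, the Trusted IPs setting described above is a membership test: an external IPv4 address is granted database access only if it falls inside one of the allowlisted ranges. A minimal Python sketch of that check (the CIDR ranges shown are documentation examples, not real SAP addresses):

```python
import ipaddress

# Example allowlist entries -- in practice these are whatever ranges your
# administrator has registered in the IP Allowlist tab.
ALLOWLIST = [ipaddress.ip_network(cidr)
             for cidr in ("203.0.113.0/24", "198.51.100.17/32")]

def is_allowed(client_ip: str) -> bool:
    """True if the external IPv4 address falls inside any allowlisted range."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in net for net in ALLOWLIST)

print(is_allowed("203.0.113.42"))  # True: inside the /24 range
print(is_allowed("192.0.2.1"))     # False: not allowlisted
```

This is why a /32 entry admits exactly one address while a /24 entry admits a whole range.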

 System  Administration


Tab: System Configuration
  Session timeout: Set the amount of time before a user session expires if the user doesn't interact with the system. By default the session timeout is set to 3600 seconds (1 hour). The minimum value is 300 seconds, and the maximum value is 43200 seconds.
  Allow SAP support user creation: Let SAP create support users based on incidents. Support users generated by SAP will be deleted after their validity has expired or after the incident has been closed. (See Request Help from SAP Technical Support.)

Tab: Tenant Links
  Product Switch: Link an SAP Analytics Cloud tenant to your SAP Datasphere tenant to enable the product switch in the top right of the shell bar, and be able to easily navigate between them. (See Review and Manage Links to SAP Analytics Cloud and SAP Business Data Cloud Tenants.)

Tab: Data Source Configuration
  SAP BTP Core Account: Get subaccount information for SAP Datasphere. You need the information to configure the Cloud Connector that SAP Datasphere uses to connect to sources for data flows and model import. (See Set Up Cloud Connector in SAP Datasphere.)
  Live Data Sources: If you want to use SAP BW∕4HANA model import, you need to allow data from your live data connection of type tunnel to securely leave your network.
  On-premise data sources: Add location IDs if you have connected multiple Cloud Connector instances to your SAP Datasphere subaccount and you want to offer them for selection when creating connections using a Cloud Connector.

Tab: Security
  Authentication Method: Select the authentication method used by SAP Datasphere. (See Enabling a Custom SAML Identity Provider (Legacy Custom IdP).)
  SAML Single Sign-On (SSO) Configuration: Configure SAML SSO if you selected it as authentication method.

Tab: App Integration
  OAuth Clients: You can use the Open Authorization (OAuth) protocol to allow third-party applications access. (See Create OAuth2.0 Clients to Authenticate Against SAP Datasphere.)
  Trusted Identity Providers: If you use the OAuth 2.0 SAML Bearer Assertion workflow, you must add a trusted identity provider.
  Trusted Origins: Enter the origins that will be hosting your client application.

Tab: Notifications
  Make sure that users are notified appropriately about issues in the tenant. (See Configure Notifications.)
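As a rough illustration of how a third-party application authenticates once an OAuth client has been created in the App Integration tab, the sketch below builds a standard OAuth 2.0 client-credentials token request in Python. The token URL and credentials are placeholders; the actual endpoints and supported grant types are shown in your tenant's App Integration settings.

```python
import base64
import urllib.parse
import urllib.request

def build_token_request(token_url: str, client_id: str,
                        client_secret: str) -> urllib.request.Request:
    """Build (but do not send) an OAuth 2.0 client-credentials token request."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    body = urllib.parse.urlencode({"grant_type": "client_credentials"}).encode()
    return urllib.request.Request(
        token_url,
        data=body,
        headers={
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
    )

# Placeholder URL and credentials -- copy the real values from your tenant.
req = build_token_request("https://example.com/oauth/token",
                          "my-client-id", "my-client-secret")
# with urllib.request.urlopen(req) as resp:
#     ...  # parse the JSON response and use its access_token as a Bearer token
```

The returned access token is then sent as an `Authorization: Bearer <token>` header on subsequent API calls.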


 System  About
Every user can view information about the software components and versions of your system, in particular:

Version: Displays the version of the SAP Datasphere tenant.

Build Date: Displays the date and time when the current version of the SAP Datasphere tenant was built.

Tenant: Displays the SAP Datasphere tenant ID.

Database: Displays the ID of the SAP Datasphere run-time database.

Platform Version: Displays the version of the SAP Analytics Cloud components used in SAP Datasphere.

Users with the DW Administrator role can open a More section to find more details. They can find outbound and database IP
addresses that might be required for allowlists in source systems or databases of SAP Datasphere for example (see Finding SAP
Datasphere IP addresses). Administrators can also upgrade their SAP HANA database patch version. For details, see Apply a Patch
Upgrade to Your SAP HANA Database.

System Requirements and Technical Prerequisites


SAP Datasphere is a fully web-based offering. You will need an internet connection and a system that meets certain requirements.

The requirements listed here are for the current release.

Client Software Requirements

Client Software: Desktop browser
  Version: Google Chrome, latest version
  Additional information: Google releases continuous updates to their Chrome browser. We make every effort to fully test and support the latest versions as they are released. However, if defects are introduced with OEM-specific browser software, we cannot guarantee fixes in all cases. For additional system requirements, see your web browser documentation.

  Version: Microsoft Edge based on the Chromium engine, latest version
  Additional information: Microsoft releases continuous updates to their Chromium-based Edge browser. We make every effort to fully test and support the latest versions as they are released.

Client Software: Additional software
  Version: Adobe Acrobat Reader 9 or higher

Client Configuration Requirements

Network bandwidth: Minimum 500-800 kbit/s per user. In general, SAP Datasphere requires no more bandwidth than is required to browse the internet. All application modules are designed for speed and responsiveness with minimal use of large graphic files.

Screen resolution: XGA 1024x768 (high color) or higher; widescreen: 1366x768 or higher.

Minimum recommended browser cache size: 250 MB. SAP Datasphere is a Web 2.0 application. We recommend allowing browser caching because the application uses it heavily for static content such as image files. If you clear your cache, the browser will not perform as well until the deleted files are downloaded again to the browser and cached for use next time. To set browser cache size, see your browser documentation.

HTTP 1.1: Enable.

JavaScript: Enable.

Cookies: Enable web browser session cookies (non-persistent) for authentication purposes.

Pop-up windows: Allow pop-up windows from SAP Datasphere domains.

Power Option Recommendation: High Performance mode for improved JavaScript performance (for Microsoft-based operating systems).

Supported Languages

Menus, buttons, messages, and other elements of the user interface are supported in: Bulgarian (bg-BG); Catalan (ca-ES); Chinese (zh-TW); Chinese (Simplified) (zh-CN); Croatian (hr-HR); Czech (cs-CZ); Danish (da-DK); Dutch (nl-NL); English (en-GB); English (en-US); Estonian (et-EE); French (fr-CA); French (fr-FR); Finnish (fi-FI); German (de-DE); German (de-CH); Greek (el-GR); Hindi (hi-IN); Hungarian (hu-HU); Indonesian (id-ID); Italian (it-IT); Japanese (ja-JP); Korean (ko-KR); Latvian (lv-LV); Lithuanian (lt-LT); Malay (ms-MY); Norwegian (no-NO); Polish (pl-PL); Portuguese (Brazil) (pt-BR); Portuguese (Portugal) (pt-PT); Romanian (ro-RO); Russian (ru-RU); Serbian (sr-RS); Slovakian (sk-SK); Slovenian (sl-SL); Spanish (es-ES); Spanish (es-MX); Swedish (sv-SE); Thai (th-TH); Turkish (tr-TR); Ukrainian (uk-UA); Vietnamese (vi-VN); and Welsh (cy-GB).

Data Connectivity

Connectivity with SAP HANA Smart Data Integration

We recommend always using the latest released version of the Data Provisioning Agent, but at least the recommended minimum
version from SAP Note 2419138 . Make sure that all agents that you want to connect to SAP Datasphere have the same latest
version.

For more information, including information on minimum requirements for source systems and databases, see:

SAP HANA Smart Data Integration Product Availability Matrix (PAM)

Configure Data Provisioning Adapters in the SAP HANA Smart Data Integration and SAP HANA Smart Data Quality
documentation
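The version rule above (all agents on the same version, and at least the minimum from SAP Note 2419138) can be checked mechanically. A small Python sketch with illustrative version strings; the real minimum version is in the SAP Note:

```python
def parse_version(v: str) -> tuple[int, ...]:
    """Turn a dotted version string like '2.7.4' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def check_agents(agent_versions: dict[str, str], minimum: str) -> list[str]:
    """Return warnings for agents below the minimum or out of sync."""
    warnings = []
    parsed = {name: parse_version(v) for name, v in agent_versions.items()}
    for name, ver in parsed.items():
        if ver < parse_version(minimum):
            warnings.append(f"{name}: below minimum {minimum}")
    if len(set(parsed.values())) > 1:
        warnings.append("agents are not all on the same version")
    return warnings

# Illustrative agent names and versions, not from a real landscape.
print(check_agents({"agent-a": "2.7.4", "agent-b": "2.6.1"}, "2.7.0"))
```

Tuple comparison handles multi-digit components correctly (2.10.0 sorts after 2.9.0), which naive string comparison would get wrong.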

Request Help from SAP Technical Support


You can request help from SAP Technical Support by creating a support incident. In many cases, a support user is required to allow
an SAP support engineer to log into and troubleshoot your system.

You can create an SAP support incident on the SAP Support Portal (S-user login required). For detailed information about what to
include in an incident, see SAP Note 2854764 .

In SAP Datasphere, users with an administrator role can make sure that a support user is created in the tenant. Two options are
available:

Option 1: Allow SAP Technical Support to Create Support Users for Incidents
To generally allow SAP Technical Support to create support users based on incidents, proceed as follows:

1. In the side navigation area, click (System) > (Administration) > System Configuration.

2. Choose Edit.

3. Set the Allow SAP support user creation setting to ON.

4. Click Save.

In case of an incident, the assigned support engineer from SAP Technical Support can request and generate a personalized
support user for the affected tenant. This user is enabled for multi-factor authentication.

Support engineers can request the support user with one of the following roles:

the global extended role DW Support User along with the scoped role DW Scoped Support User

DW Support User gives support users read-only access privileges to all functionalities of SAP Datasphere, enabling
them to analyze the incident.

When support engineers request the DW Scoped Support User role, they can specify the spaces that need to be
added as scopes to this role. This gives the support user read-only access to these spaces.

the global DW Administrator role, if the customer confirms this in the incident

The support user does not consume a user license, and it will be automatically deleted after two days or after the incident
has been closed.

Option 2: Create Support User for an Incident

Before creating an incident with SAP, proceed as follows:

1. In the shell bar, click  (Support).

2. In the Support dialog, click  Create Support User and then choose OK to confirm the support user creation.

An email is automatically sent to SAP Support to notify them of the newly created support user, and it is listed with your
other users under Security > Users.

The support user has minimum privileges and does not consume a user license.

You can assign an appropriate scoped role to the support user and add the user to the required space, or assign the DW
Administrator or Catalog Administrator role if required.

3. Delete the support user when your issue is resolved.

For more information about creating a support user and assigning appropriate roles, see SAP Note 2891554 .

Creating and Configuring Your SAP Datasphere Tenant


Creating and Configuring Your SAP Datasphere Tenant: Learn how to create and configure your own tenant in SAP BTP, select a
data center region, and distribute workloads across availability zones. This information is essential for managing and optimizing
your data center infrastructure.

You can create your own tenant in the SAP BTP Cockpit. The procedure is the same for both subscription-based and consumption-
based contracts. Some details may vary depending on the chosen service plan (free or standard). For more information about
limitations for a free plan, see SAP Note 3227267.

When the tenant is configured, a data center region is selected. The main role of a data center is to guarantee the uninterrupted
operation of computer systems. It also provides secure storage, processing, and networking capabilities for your data. A data
center refers to the physical location, which could be a building or a group of buildings, housing computer systems and their
components.

Each data center region has multiple availability zones. Your workloads are deployed in these various zones. By distributing
workloads across different zones, we ensure our services remain available, even if a specific zone experiences issues. By keeping
backup data within the same data center region, the latency for data transfers and access is minimized. This infrastructure
strategy balances the workload and enhances performance. The zone deployment contributes to a more robust and reliable
infrastructure, ensuring near-zero downtime for your critical processing needs.

For information about enabling multiple availability zones, see this SAP Knowledge Base Article .

Provisioning
  Standard Plan and Free Plan: For information about region availability, see the SAP Discovery Center. The SAP BTP subaccount administrator must trigger the SAP Datasphere instance creation. Tenant creation will not be triggered by SAP. You must create and configure the to-be-provisioned SAP Datasphere service instance (tenant) in SAP BTP. See Create Your SAP Datasphere Service Instance in SAP BTP. The system owner of SAP Datasphere, who has been specified during the provisioning, is notified via email when the tenant is provisioned.

Size Configuration (for maximum size configuration options, see the tables below)
  Standard Plan: Tenants are initially created with minimal configuration that includes 128 GB of storage and 32 GB of memory (2 compute blocks). Once logged in to your tenant, upscaling can be done at any time. See Configure the Size of Your SAP Datasphere Tenant. Note: After finalizing the configuration, you can only change the size of your SAP BW Bridge storage later if you don't have any SAP BW Bridge instances. To view all supported size combinations, go to the SAP Datasphere Capacity Unit Estimator.
  Free Plan: Tenants are created with 128 GB of storage and 32 GB of memory (2 compute blocks). You cannot upscale free plan tenants. You need to update your plan from free to standard if any sizing configuration is required.

Metering
  Standard Plan: The number of consumed capacity units is reported on an hourly basis to your SAP BTP account.
  Free Plan: The usage of a free plan tenant is reported to your SAP BTP account, but SAP does not charge you for using this tenant.

Time Limitation
  Standard Plan: Subscription contract: dependent on the contract. Consumption-based contract: no time limitation.
  Free Plan: The time limitation is 90 days and the trial duration cannot be extended. You can update the tenant from free to standard plan before the 90-day expiration (the number of days before the expiration is displayed in the top panel of the SAP Datasphere free-plan tenant). If you do not perform the update within 90 days, the tenant is automatically deleted. The remaining service instance cannot be reused and can be deleted at any time by an administrator of your SAP BTP account. See Update Your Free Plan to Standard Plan in SAP BTP.

Number of tenants
  Standard Plan: No limitation. In the case of a subscription contract, the available capacity units can be distributed among all your tenants.
  Free Plan: 1 per SAP BTP global account.

Maximum Configuration Values

The maximum configuration size of your tenant depends on regional availability and your server type.

 Note
Data integration includes 200 h/month from the minimum free package.

Catalog includes 0.5 GB/h from the minimum free package.


Amazon Web Services (AWS)

Columns: Memory | Storage | BW Bridge | Data Lake | Data Integration | Catalog | vCPU

Australia: 5970 GB | 16000 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
Brazil (São Paulo): 5970 GB | 16000 GB | 4096 GB | Not supported | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
Canada (Montreal): 5970 GB | 16000 GB | 4096 GB | Not supported | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
Europe (Frankfurt): 12000 GB | 61440 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 442 (High Memory Performance Class)
EU Access (Frankfurt): 5970 GB | 16000 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
Japan (Tokyo): 5970 GB | 16000 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
Singapore: 5970 GB | 16000 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
South Korea: 5970 GB | 16000 GB | 4096 GB | Not supported | 7200 h/month | 20.5 GB/h | 440 (Memory Performance Class)
US East: 12000 GB | 61440 GB | 4096 GB | 90 TB | 7200 h/month | 20.5 GB/h | 442 (High Memory Performance Class)

Microsoft Azure

Columns: Memory | Storage | BW Bridge | Data Lake | Data Integration | Catalog | vCPU

Brazil: 5600 GB | 27840 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (Memory Performance Class)
Europe (Amsterdam): 11150 GB | 55760 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (Memory Performance Class)
Europe (Switzerland): 5600 GB | 27840 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (Memory Performance Class)
US West: 11150 GB | 55760 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (Memory Performance Class)
US East: 5600 GB | 27840 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (Memory Performance Class)

Google Cloud Platform (GCP)

Columns: Memory | Storage | BW Bridge | Data Lake | Data Integration | Catalog | vCPU

Australia: 3700 GB | 28928 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 204 (High Memory Performance Class)
Brazil: 5750 GB | 28928 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 124 (High Memory Performance Class)
Europe (Frankfurt): 11520 GB | 28928 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (High Memory Performance Class)
India (Mumbai): 5750 GB | 28928 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 204 (High Memory Performance Class)
Israel: 3700 GB | 28928 GB | Not supported | 90 TB | 7200 h/month | 20.5 GB/h | 124 (High Memory Performance Class)
Japan: 5750 GB | 28928 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 204 (High Memory Performance Class)
Saudi Arabia (Note: Only available to customers representing critical national infrastructure or the public sector.): 5750 GB | 28928 GB | Not supported | 90 TB | 7200 h/month | 20.5 GB/h | 204 (High Memory Performance Class)
Saudi Arabia (Note: Available to non-regulated customers.): 5750 GB | 28928 GB | Not supported | 90 TB | 7200 h/month | 20.5 GB/h | 204 (High Memory Performance Class)
US Central: 11520 GB | 28928 GB | Supported | 90 TB | 7200 h/month | 20.5 GB/h | 412 (High Memory Performance Class)

Create Your SAP Datasphere Service Instance in SAP BTP


Create your SAP Datasphere service instance in SAP Business Technology Platform.

 Note
Creating an SAP Datasphere service instance in SAP Business Technology Platform (SAP BTP) results in provisioning an SAP
Datasphere tenant.

For both subscription-based contracts (initiated in November 2023 or later) and consumption-based contracts, you can access the SAP
BTP cockpit and view all currently available services in a global account. You need to structure this global account into
subaccounts and other related artifacts, such as directories and/or spaces.

Prerequisites
To create your SAP Datasphere service instance in SAP BTP, you need the following prerequisites:

Your global account has a commercial entitlement either via cloud credits (in case of a consumption-based contract) or via
a subscription-based contract.

A Cloud Foundry subaccount which is entitled for SAP Datasphere. For more information, see Configure Entitlements and
Quotas for Subaccounts.

You have SAP BTP administration authorization on the subaccount that is entitled to SAP Datasphere.

You are using Google Chrome to properly view popups in SAP BTP.

Service Plans

Service Plan Description

Standard The standard plan provides an SAP Datasphere tenant for productive and non-productive use, which is
represented by a service instance.

Free The free plan provides an SAP Datasphere tenant for a limited time for trial use, which is represented by a
service instance.

 Note
For information about region availability, see the SAP Discovery Center .

Create a Tenant

The following procedure uses the SAP BTP cockpit to create the service instance.

In the SAP Datasphere Administration Guide, we provide high-level steps to create an SAP Datasphere tenant on SAP BTP. For
more detailed information, or for instructions that use the Cloud Foundry Command-Line Interface, see the SAP Business
Technology Platform (SAP BTP) documentation.

 Note
You can create only one free tenant under the global account. If your SAP BTP service causes issues, you can open an incident
ticket via SAP for Me.

1. In the SAP BTP cockpit, navigate to the space in which you want to create the service instance, and click Services
Service Marketplace in the left navigation area.

For more information, see Navigate to Orgs and Spaces.

2. Search for "Datasphere", and click the SAP Datasphere service to open it.

3. Click Create in the top-right corner.

A wizard opens, in which you can select or specify the following parameters:

Parameter Description

Service Select SAP Datasphere.

Plan Select Standard or Free.

Runtime Environment Select Other.

 Note
Not all runtime environments are available for free.

Space [no selection needed if you're creating the instance from the space
area] Select the SAP BTP space in which you want to create the
service instance.

Instance Name Enter a name to identify your instance (up to 32 alphanumeric


characters, periods, underscores, and hyphens; cannot contain
white spaces).

4. Click Next and enter the following information about the SAP Datasphere system owner, who will be notified when the
service instance is created: First Name, Last Name, Email, and Host Name.

 Note

Alternatively, you can use a JSON file to provide the information above. See Create an API Access Configuration JSON
File.

5. Click Next to go to the final page of the wizard where you can review your selections, and then click Create to exit the
wizard.

An information message is displayed to confirm that the service instance creation is in progress.

 Note
The creation of the instance can take a while.

6. Click View Instance to go to your space Service Instances page, where the new instance is listed and you can view the
progress of its creation.

7. When the service instance is created, the SAP Datasphere system owner receives an email confirming its availability, and
providing a link to navigate to the SAP Datasphere tenant, which the service instance represents.

 Note
If the creation of the service instance fails (the "failed" status is displayed), you must first delete the failed instance and
then create a new SAP Datasphere service instance. If you need support, you can open an incident via SAP for Me with
the component DS-PROV.

Create an API Access Configuration JSON File


Use these parameters in a JSON file to authenticate the service instance in SAP Business Technology Platform.

When configuring the service instance in SAP Business Technology Platform, you can create and upload a JSON configuration file
with the required parameters to authenticate the service instance. You can use the JSON file when creating a new instance or
recovering a deleted instance.

When creating a new instance, use these parameters in the JSON file:

 Sample Code

{
  "first_name": "",
  "last_name": "",
  "email": "",
  "host_name": ""
}

Parameter Description

first_name The first name of the authorized user.

last_name The last name of the authorized user.

email The email address of the authorized user. For example, [Link]@[Link].

host_name The unique name of the host or domain on the network.

When recovering a deleted tenant, use these parameters in a JSON file:


 Sample Code

{
  "tenantUuid": "",
  "access_token": ""
}

Parameter Description

tenantUuid The tenant universally unique identifier (UUID) for the source system.

access_token The OAuth string used to authorize the user.

See SAP note 3455188 .
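The parameter tables above can also be generated programmatically. The following Python sketch writes both JSON payloads with valid straight-quote syntax; all values are illustrative placeholders (the names, email address, and file name are assumptions, not part of the SAP procedure):

```python
import json

# Payload for creating a new instance; replace the placeholder values
# with the details of the authorized user and host.
new_instance_params = {
    "first_name": "Jane",                 # placeholder
    "last_name": "Doe",                   # placeholder
    "email": "jane.doe@example.com",      # hypothetical address
    "host_name": "my-datasphere-host",    # unique name on the network
}

# Payload for recovering a deleted tenant; both values are placeholders.
recovery_params = {
    "tenantUuid": "<tenant-uuid>",
    "access_token": "<oauth-token>",
}

# Write the create payload to a file that can be uploaded in the wizard.
with open("create_params.json", "w") as f:
    json.dump(new_instance_params, f, indent=2)
```

Note that JSON requires straight double quotes and does not allow trailing commas, so payloads copied from word processors should be validated (for example with `json.load`) before upload.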

Configure the Size of Your SAP Datasphere Tenant


Configure the size of your tenant by specifying resource sizes based on your business needs. Capacity Units (CU) are allocated to
obtain storage and compute resources for your tenant.

You can configure the size of a subscription-based tenant and a consumption-based tenant with a standard plan.

To do so, you must have an SAP Datasphere administrator role.

Configuring the Sizes of Resources


In the Tenant Configuration page of the Configuration area, you can increase the sizes for the various resources, within the
permitted size combinations, to obtain a configuration that fits your exact needs, and then click Save.

 Caution
Once you save the size configuration of your tenant, some resources cannot be resized later. Storage cannot be downsized; if
you need to reduce storage, you must recreate the tenant. Exception: if you need to decrease the memory, see SAP note
3224686 .

Also, once you click Save:

The whole process can take more than 90 minutes: the configuration step itself is quick, but the operational process in the
background can take a while.

If an error occurs, you are notified that the configuration cannot be completed and that you need to try again later by
clicking the Retry button (which replaces the Save button in such a case). The delay depends on the error (for example, if
there is an error on the SAP HANA Cloud database side, you can retry after 60 minutes).

You can only change SAP HANA Compute and SAP HANA Storage once every 24 hours.

If you try to change your SAP HANA configuration, SAP HANA Cloud functionality (Spaces, DPServer, Serving of Queries)
will not be available for around 10 minutes. If you run into issues after the configuration, use the Retry button.

Supported Sizes

To view all supported size combinations for compute and storage resources and the number of capacity units consumed, go to the
[Link]

Tenant Configuration Page Properties

Base Configuration

Property Description

Performance Class Select a performance class for your tenant:


Memory

Compute

High-Memory

High-Compute

 Note
For the number of vCPUs allocated per performance class, see the vCPU Allocation tables below.

Storage Set the size of disk storage.


You can specify from 128 GB (minimum), by increments of 64 GB.

Memory Set the memory allocated to your tenant.

You can increase the amount of memory from 32 GB (minimum), by increments of 16 GB.

You can reduce the amount of memory, but the lower limit depends on how much space you have
assigned to space management.

For more information, see Allocate Storage to a Space.

vCPU Displays the number of vCPUs allocated to your tenant. The number is calculated based on the
selected performance class, and memory used by your tenant.

Enable the SAP HANA Cloud Script Enable this option to access the SAP HANA Automated Predictive Library (APL) and SAP HANA
Server Predictive Analysis Library (PAL) machine learning libraries.
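The minimum sizes and increments above can be checked with simple arithmetic. This is an illustrative sketch (not an SAP API); it only encodes the documented minimums and step sizes for disk storage and memory:

```python
def valid_storage(gb: int) -> bool:
    # Disk storage: from 128 GB (minimum), in increments of 64 GB.
    return gb >= 128 and gb % 64 == 0

def valid_memory(gb: int) -> bool:
    # Memory: from 32 GB (minimum), in increments of 16 GB.
    return gb >= 32 and gb % 16 == 0
```

For example, 192 GB is a valid storage size (128 + 64), while 100 GB is not; 48 GB is a valid memory size (32 + 16), while 40 GB is not.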

Additional Configuration

Property Description

Data Lake Storage [optional] Select the size of data lake disk storage. The performance class you select determines
the number of vCPUs allocated to your tenant.
You can specify from 0 TB (minimum) to 90 TB (maximum), by increments of 1 TB.

Data lake storage includes data lake compute.

To reduce the size of your data lake storage, you must first delete your data lake instance, and re-
create it in the size that you want.

 Note
Deletion cannot be reversed and all data stored in the data lake instance is deleted.

You cannot delete your data lake storage if it's connected to a space. You must first disconnect
the space:

1. Go to Space Management and choose a space.

2. Select Edit.

3. Under General Settings, clear the Use this space to access data lake checkbox.

Data lake is not available in all regions. See SAP Note 3144215 .

SAP BW Bridge Storage [optional] Enter a value to select the size of SAP BW bridge storage. The system updates to the
nearest value automatically. You can also click the + and - buttons to adjust to your desired size.

SAP BW Bridge includes SAP BTP, ABAP environment, and its own SAP HANA Cloud runtime and
compute.

 Caution
It is not possible to downsize an SAP BW Bridge tenant.

 Note

First finalize the size configuration of your tenant. Then, you can create the SAP BW
bridge instance in the dedicated page SAP BW Bridge of the Configuration area with
the size you’ve allocated (see Provisioning the SAP BW Bridge Tenant).

For data center availability, check SAP note 3144215 .

As soon as you click Save, the allocated capacity units will be assigned to SAP BW
Bridge.

Object Store

For this option to be enabled, the Memory option must be configured with 128 GB or more. See the Base Configuration table.

Property Description

Storage Select the size of storage in TB.

You can specify the storage size starting from 1 TB.

Storage is correlated to SAP HANA Data Lake Files (HDLF) size.

 Note
You may incur higher consumption costs because data lake files keep a previous copy of any
file affected by an operation for a given retention time to allow for operations such as
RESTORESNAPSHOT. These previous copies incur data lake storage costs. For example, you
may have a 10 MB table, and the storage will be higher than that because of the number of
operations initiated and copied. For more information, see Restoring data in Data Lake Files
and Limitations of Data Lake files.

Storage is rounded up to the next whole GB. For example, if all of the files in storage consume 1.2
GB, the value is rounded up to 2 GB.

Compute Select the number of block-hours starting from 1.

4 GB of Spark Compute are equal to 0.149 capacity units.

Requests Select the number of API calls needed per month in units of 1000.

1000 Object Store Requests are equal to 0.026 capacity units.
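The figures above can be combined into a rough capacity-unit estimate. In this sketch, the rates (0.149 CU per 4 GB Spark Compute block and 0.026 CU per 1000 requests) come from the table; the function names and the assumption that blocks and request units scale linearly are illustrative, not an SAP API:

```python
import math

def object_store_capacity_units(spark_blocks: int, request_thousands: int) -> float:
    # 4 GB of Spark Compute = 0.149 CU; 1000 Object Store Requests = 0.026 CU.
    return spark_blocks * 0.149 + request_thousands * 0.026

def billed_storage_gb(used_gb: float) -> int:
    # Storage is rounded up to the next whole GB (e.g. 1.2 GB -> 2 GB).
    return math.ceil(used_gb)
```

For example, one Spark Compute block plus one unit of 1000 requests corresponds to 0.175 capacity units, and 1.2 GB of files is billed as 2 GB.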


Elastic Compute Node

Property Description

Performance Class [optional] Select a performance class for your elastic compute node block-hours:

Memory

Compute

High-Compute

 Note
The performance class you select determines the number of vCPUs and the RAM allocated to
your tenant.

You can only use one performance class at a time. To use a different performance class, you must
re-configure your Tenant Configuration settings.

Block Specifications Displays the number of vCPUs and the amount of RAM allocated to your tenant.

Block-Hours [optional] Set the number of block-hours scheduled for elastic compute node consumption. Each
block-hour is an additional block of vCPU and RAM for your tenant to use in one hour. The
maximum number of block-hours you can consume in one hour is four.

Elastic Compute Node Usage: Displays the number of blocks currently scheduled for elastic compute node consumption.
Allocated Block-Hours

Elastic Compute Node Usage: Used Displays the total number of blocks consumed by elastic compute nodes. The total is independent
Block-Hours of which performance class is selected.

Elastic Compute Node Usage: Displays the block-hours you have used that exceed the amount allocated by your tenant
Exceeded Block-Hours configuration.

 Note
This option only appears if you have used more block-hours than allocated.

Data Integration

Property Description

Data Integration [optional] Enter the number of blocks to allocate to data integration applications (replication
flows and transformation flows).

Even if you don’t allocate blocks here, you have a default number of execution hours for data
integration (depending on your contract). For more information about this, as well as about how
many execution hours you get per block, see the SAP Datasphere and SAP Datasphere, Test
Tenant Supplemental Terms and Conditions, which are a part of the Service Level Agreement.

Execution Hours Displays the number of execution hours available for data integration applications per month. It is
calculated by multiplying the number of allocated compute blocks by the number of execution
hours per block (per your contract). You can increase or decrease the data integration node hours
without downtime. The amount of job processing in parallel is automatically adjusted within the
limits set: every 100 h of your allocated data integration hours gets one extra parallel pod for job
processing. For example, if you have 400 h of data integration, you will have a maximum of four
parallel pods available for processing.

 Note
If you exceed the available execution hours, your data integration processes (such as
replication flow runs) continue running to avoid interrupting critical integration scenarios,
which can result in additional costs (depending on your plan).

Maximum Parallel Jobs Displays the maximum number of jobs that can run in parallel.

The minimal configuration of SAP Datasphere supports 2 parallel jobs. For every additional 100
execution hours allocated, you get one extra parallel job, up to a maximum of 10.

Each parallel job means that roughly 5 replication objects (from one or more replication flows) can
be processed in parallel.

If the number of running replication flows exceeds the maximum number of parallel jobs,
processing is queued, and replication occurs less frequently.

Data Integration: Allocated Displays the number of execution hours allocated to data integration applications so that you can
Execution Hours easily compare it against the used execution hours.

Data Integration: Used Execution Displays the number of hours used by data integration applications in the current month. The
Hours value is updated once every 6 hours.

Data Integration: Exceeded Displays the execution hours that you have used in the current month that exceed the amount
Execution Hours allocated in tenant configuration.

 Note
This option only appears if you have used more hours than allocated.
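The parallel-job rule above can be sketched as follows. This is a hedged interpretation reconciling the two statements in the table (the minimal configuration supports 2 jobs, roughly one job per 100 execution hours, capped at 10, with 400 h yielding 4); it is not an SAP API:

```python
def max_parallel_jobs(execution_hours: int) -> int:
    # Base of 2 parallel jobs; one job per 100 allocated execution hours,
    # capped at 10. Matches the documented 400 h -> 4 jobs example.
    return min(10, max(2, execution_hours // 100))
```

For example, 400 allocated hours yield 4 parallel jobs, the minimal configuration (0 extra hours) yields 2, and very large allocations are capped at 10.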

Premium Outbound Integration

Property Description

Outbound Blocks Enter the number of blocks to be used for premium outbound integration. Having at least one
block assigned here is a prerequisite for using a non-SAP target in a replication flow. For more
information, see Premium Outbound Integration.

Each block gives you 20 GB of data volume for transfer.

Outbound Volume Displays the data volume available for premium outbound integration per month. It is calculated
by multiplying the number of allocated blocks by 20 GB.

 Note
If you exceed the assigned volume, your data integration processes (such as replication flow
runs) continue running to avoid interrupting critical integration scenarios, which can result in
additional costs (depending on your plan).

Premium Outbound Usage: Displays the monthly allocated data volume (in GB) for premium outbound integration so that you
Allocated Data Volume can easily compare it against the used volume.

Premium Outbound Usage: Used Displays the used data volume (in GB) for premium outbound integration in the current month.
Data Volume The value is updated once every 6 hours.

Premium Outbound Usage: Displays the data volume that you have used in the current month that exceeds the amount
Exceeded Data Volume allocated in tenant configuration.

 Note
This option only appears if you have used more data than allocated.
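The premium outbound arithmetic above is straightforward: each allocated block provides 20 GB of monthly transfer volume. A minimal sketch (function names are illustrative):

```python
def outbound_volume_gb(blocks: int) -> int:
    # Each premium outbound block gives 20 GB of monthly data volume.
    return blocks * 20

def exceeded_volume_gb(blocks: int, used_gb: float) -> float:
    # Volume used beyond the monthly allocation (0 if within it).
    return max(0.0, used_gb - outbound_volume_gb(blocks))
```

For example, 3 blocks allow 60 GB per month; with 1 block and 25 GB used, 5 GB counts as exceeded volume.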

Catalog


Property Description

Catalog Storage Included by default. You can increase or decrease the number of storage blocks allocated for the
catalog.

Storage The amount of storage available for the catalog is calculated from the number of allocated blocks.

Catalog Usage: Allocated Storage Displays the number of GB allocated to the catalog.

Catalog Usage: Used Storage Displays the number of GB used by the catalog.

Catalog Usage: Exceeded Storage Displays the amount of storage that you have used that exceeds the amount allocated by your
tenant configuration.

 Note
This option only appears if you have used more storage space than allocated.

Capacity Units

Property Description

Purchased units Displays the capacity units purchased for the month.

Estimated Units Displays the number of units anticipated to be charged to the user by the end of the month. This
calculation assumes that the current configuration stays unchanged and all pay-per-use services
are fully utilized.

Available Units Displays the estimated capacity units left for this month. This number is calculated as Purchased
Units - Estimated Units = Available Units.

Your Consumption Displays the amount charged to the users this month for all services. This calculation accounts for
any configuration changes made during the month and the precise usage of pay-per-use services.

vCPU Allocation

When you set your base configuration, the performance class you select and the hyperscaler you are using determine the amount
of memory in GB available for each vCPU in the system. For example, an AWS system with 32 GB of memory has 2 vCPUs, whereas
an AWS system with 320 GB of memory has 20 vCPUs. The tables below list the memory-to-vCPU ratio for each hyperscaler.

Performance Class: Memory

Hyperscaler | Memory | Memory per vCPU (GB)
AWS | 32-960 GB | 16
AWS | 1024-1792 GB | 16
AWS | 1800 GB | 15
AWS | 5970 GB | 13.57
GCP | 32-960 GB | 16
GCP | 1024-1856 GB | 16
GCP | 1904 GB | 16
Azure | 32-1024 GB | 16
Azure | 1088-1920 GB | 16
Azure | 2800 GB | 13.72
Azure | 3744 GB | 16
Azure | 5600 GB | 13.59

Performance Class: High Memory

Hyperscaler | Memory | Memory per vCPU (GB)
AWS | 3600 GB | 30
AWS | 9000 GB | 20.36
AWS | 12000 GB | 27.15
GCP | 3700 GB | 23.72
GCP | 5750 GB | 28.19
GCP | 8630 GB | 20.95
GCP | 11520 GB | 27.96
Azure | 3776 GB | 31.47
Azure | 5595 GB | 27.43
Azure | 7440 GB | 18.05
Azure | 11150 GB | 20.95

Performance Class: Compute

Hyperscaler | Memory | Memory per vCPU (GB)
AWS | 32-912 GB | 8
GCP | 32-608 GB | 8
Azure | 32-480 GB | 8

Performance Class: High Compute

Hyperscaler | Memory | Memory per vCPU (GB)
AWS | 32-352 GB | 4
GCP | 32-288 GB | 4
Azure | 32-352 GB | 4
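Dividing tenant memory by the ratio in the tables above reproduces the vCPU counts quoted in the text (AWS Memory class at 16 GB per vCPU: 32 GB gives 2 vCPUs, 320 GB gives 20 vCPUs). A minimal sketch:

```python
def vcpus(memory_gb: float, gb_per_vcpu: float) -> float:
    # vCPU count = tenant memory divided by the memory-per-vCPU ratio.
    return memory_gb / gb_per_vcpu
```

The fractional ratios in the tables (for example AWS 5970 GB at 13.57 GB per vCPU) correspond to the rounded vCPU counts shown in the regional availability tables (here, approximately 440 vCPUs).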

Update Your Free Plan to Standard Plan in SAP BTP


Update your service instance from free plan to standard plan.

In SAP Business Technology Platform (SAP BTP), if you have an SAP Datasphere service instance with a free plan, which you can
use for 90 days, you can update it to a standard plan (no time limitation) for productive purposes. The number of days before the
expiration is displayed in the top panel of SAP Datasphere.

 Note
If you do not update to a standard plan within 90 days, your SAP Datasphere tenant will be suspended. While the tenant is
suspended, you can still upgrade your service instance from the free to standard plan, but after 5 days of suspension, your
tenant will be deleted and there is no way to recover it.

If your tenant is deleted, the service instance will still be shown in your Global Account, but it is not functional. You can delete it
and create a new SAP Datasphere service instance with a free plan.

To do so, you must have SAP BTP administration authorization on the subaccount that is entitled to SAP Datasphere.

1. In SAP BTP, select the subaccount and the space where the service instance with a free plan was created.

2. Navigate to Instances and Subscriptions.

3. In the Service Instances page, find the SAP Datasphere service instance with the free plan, click the button at the end of
the row and select Update.

 Note
After updating your free plan to standard plan, you must wait at least 24 hours before changing the tenant settings on
the Tenant Configuration page.

4. In the Update Instance dialog, select standard and click Update Instance.

You can view the progress of the update. The status of the instance becomes green when the update is completed.

 Note
The update process takes around 30 minutes, and during this time some features might not work as expected.

Review and Manage Links to SAP Analytics Cloud and SAP Business Data Cloud Tenants
You can link your SAP Datasphere tenant to an SAP Analytics Cloud tenant accessible in the  (Product Switch) in the top right of
the shell bar, to help your users easily navigate between them. In addition, SAP Analytics Cloud and SAP Business Data Cloud
administrators can link their tenants to yours.

Prerequisites

To view the Administration page containing the Tenant Links tab, you must have a global role that grants you the following
privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

System Information (-RU-----) - To access the Administration area.

The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.


 Note
To select an SAP Analytics Cloud tenant to make available via the  (Product Switch), you must have the System Owner role.

Product Switch Tenant Links

In the side navigation area, click  (System)  (Administration) Tenant Links .

The following properties are available in the Product Switch section:

Property Description

SAP Datasphere URL [read-only] Displays the URL for the current SAP Datasphere
tenant.

SAP Analytics Cloud URL Displays the URL for the SAP Analytics Cloud selected by the
System Owner to be accessible via the  (Product Switch).

If your tenant is included in an SAP Business Data Cloud formation, then the following links are also displayed:

Property Description

SAP Business Data Cloud Cockpit URL [read-only] Displays the URL of the SAP Business Data Cloud
Cockpit tenant.

SAP Analytics Cloud URL (SAP Business Data Cloud Formation) [read-only] Displays the URL of the SAP Analytics Cloud tenant in
the formation.

In this situation, both the SAP Business Data Cloud Cockpit and the SAP Analytics Cloud tenants are shown as tiles when users
click the  (Product Switch) and, if they have a user in the relevant target tenant, they can click the tile to navigate there.

For more information about SAP Business Data Cloud, see:

Integrating Data from SAP Business Data Cloud

Integrate SAP Business Data Cloud Provisioned Systems (in the SAP Business Data Cloud documentation)

Specify an SAP Analytics Cloud Tenant to Access via the Product Switch
You can link your SAP Datasphere tenant to an SAP Analytics Cloud tenant accessible in the  (Product Switch) in the top right of
the shell bar, to help your users easily navigate between them.

To select an SAP Analytics Cloud tenant to make available via the  (Product Switch), you must have the System Owner role.

1. In the side navigation area, click  (System)  (Administration) Tenant Links .

2. Enter the URL of your SAP Analytics Cloud tenant.

 Note
You must select an SAP Analytics Cloud tenant hosted in a Cloud Foundry environment. Linking to NEO tenants is not
supported.

3. Click Save to confirm the connection.

The selected SAP Analytics Cloud tenant tile is shown when users click the  (Product Switch) and, if they have a user with
an appropriate role in the SAP Analytics Cloud system, they can click the tile to navigate there.

 Note
An SAP Analytics Cloud user must create a live connection before they can consume data from SAP Datasphere.

Multiple SAP Analytics Cloud tenants can create live connections to your SAP Datasphere tenant, but only one SAP
Analytics Cloud tenant can be accessed via the  (Product Switch).

For more information, see Consume Data in SAP Analytics Cloud via a Live Connection.

Data Storage for Planning


If your SAP Datasphere tenant has been selected to store planning data for an SAP Analytics Cloud tenant, then the following
information is displayed:

Property Description

SAP Datasphere URL [read-only] Displays the URL for the current SAP Datasphere tenant.

SAP Analytics Cloud URL [read-only] Displays the URL of the SAP Analytics Cloud tenant storing planning data in the current SAP
Datasphere tenant.

Tenant Link Artifacts [read-only] Displays the name of the OAuth client the SAP Analytics Cloud tenant uses to connect to SAP
Datasphere.

For more information about storing SAP Analytics Cloud planning data in SAP Datasphere, see:

Integrate with SAP Analytics Cloud for Planning

Configure Data Storage for Planning (in the SAP Analytics Cloud documentation)

Enable SAP HANA for SQL data warehousing on Your SAP Datasphere Tenant
Use SAP HANA for SQL data warehousing to build calculation views and other SAP HANA Cloud HDI objects directly in your SAP
Datasphere run-time database and then exchange data between your HDI containers and your SAP Datasphere spaces. SAP HANA
for SQL data warehousing can be used to bring existing HDI objects into your SAP Datasphere environment, and to allow users
familiar with the HDI tools to leverage advanced SAP HANA Cloud features.

Prerequisites
To enable SAP HANA for SQL data warehousing on your SAP Datasphere tenant, you must have a global role that grants you the
following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.

The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.


Context
To enable SAP HANA for SQL data warehousing on your SAP Datasphere tenant, you must map your tenant to your SAP Business
Technology Platform account.

 Note
The SAP Datasphere tenant and SAP Business Technology Platform organization and space must be in the same data center
(for example, eu10, us10). This feature is not available for Free Tier plan tenants (see SAP Note 3227267 ).

A tenant may have both Kyma mappings and Cloud Foundry mappings simultaneously. You can also map instances using the
Instance Mapping REST API (see Create and Manage Instance Mappings Using the REST API in the SAP HANA Cloud
Administration Guide).

For information about working with SAP Datasphere and HDI containers, see Exchanging Data with SAP HANA for SQL data
warehousing HDI Containers.

Procedure
1. In the side navigation area, click  (Configuration) Instance Mapping .

2. Add a mapping for one of these environment types: Cloud Foundry or Kyma.

Environment Environment Instance ID and Group


Type

Cloud Enter the organization and space GUIDs that you are mapping to:
Foundry Your SAP Business Technology Platform organization GUID.

 Tip
You can use the Cloud Foundry CLI to find your organization GUID:

cf org <ORG> --guid

See [Link] .

[Optional] Your SAP Business Technology Platform space inside the organization. If you only specify the organization
GUID, the instance is mapped to all spaces in that organization.

 Tip
You can use the Cloud Foundry CLI to find your space GUID:

cf space <SPACE> --guid

See [Link] .

Kyma Enter the cluster ID and namespace that you are mapping to:
Your cluster ID. The cluster ID must be a GUID and the namespace can be no longer than 64 characters. Allowed
characters are all lowercase letters, numbers, and dash (-).

[Optional] Your namespace. If no namespace is provided, the instance is mapped to all namespaces in the cluster.

 Tip
You can use the following methods to find your cluster ID GUID and your namespace:

kubectl CLI

To find your cluster ID GUID, run the following command:

kubectl get configmap sap-btp-operator-config -n kyma-system -o jsonpath='{.[Link]

To find your namespace, run the following command:

kubectl get namespaces

Kyma Console

Cluster ID GUID

a. In the left sidebar, click Namespaces. Then select kyma-system from the main page.

b. In the left sidebar, click Configuration Config Maps . Then select sap-btp-operator-config from the main page.

c. The cluster ID GUID is located on the main page.

Namespace

In the left sidebar, click Namespaces to see all namespaces in the Kyma cluster.

For more information, see SAP HANA Instance Mapping in the SAP HANA Cloud Administration Guide.

3. Click the Save button to apply your changes.

Your tenant is mapped to another environment context. You can now create HDI containers outside of SAP Datasphere.

4. Build one or more new HDI containers in the SAP Business Technology Platform Space and they will be created in the SAP
Datasphere run-time database (identified by the Database ID on the SAP Datasphere About dialog).

For information about setting up your build, see Binding Applications to an SAP HANA Cloud Instance.

Next Steps
When one or more HDI containers are available in your tenant, users with a space administrator role can work with them
(Exchanging Data with SAP HANA for SQL data warehousing HDI Containers).

Enable the SAP HANA Cloud Script Server on Your SAP Datasphere Tenant
You can enable the SAP HANA Cloud script server on your SAP Datasphere tenant to access the SAP HANA Automated Predictive
Library (APL) and SAP HANA Predictive Analysis Library (PAL) machine learning libraries.

To enable the SAP HANA Cloud script server, go to the Tenant Configuration page and select the checkbox in the Base
Configuration section. For more information, see Configure the Size of Your SAP Datasphere Tenant.

 Note
The script server cannot be enabled in an SAP Datasphere consumption-based tenant with the free plan.

Once the script server is enabled, the Enable Automated Predictive Library and Predictive Analysis Library option can be
selected when creating a database user (see Create a Database User).

For detailed information about using the machine learning libraries, see:

SAP HANA Automated Predictive Library Developer Guide

SAP HANA Cloud Predictive Analysis Library (PAL)

Enable SAP Business AI for SAP Datasphere


SAP Business AI is a fully managed service by SAP that allows you to integrate artificial intelligence (AI) models in different
business solutions. SAP Business AI provides a simple and easy-to-use API with various endpoints that you can use in your
solution for different tasks such as text generation, summarization, language translation, and creative content development.

Enable SAP Business AI


SAP Business AI is integrated to generate AI content recommendations in various areas of SAP Datasphere. The integration uses
AI models with prompts that are preconfigured to use input parameters to generate AI content recommendations. For example,
you might want to show all objects that have been created by a specific user in the past three months.

Prerequisites

Your SAP Datasphere tenant is on a landscape that supports SAP Business AI. See SAP Note 0003491182 .

You've purchased the SAP AI Units license. For more information about SAP AI Units license, contact your Account
Executive.

To activate an SAP Business AI feature, you need the tenant administrator role.

You may have access to the AI Services tab even if the tenant has not yet been activated with SAP Business AI, or SAP
Business AI features are not yet supported. For more information, see SAP Note [Link] .

Procedure
1. In the side navigation area, click  System Configuration .

2. Click the AI Services tab.

3. In the AI Features section, check the options that you want to use.

AI-Assisted Catalog Content Generation - AI-Enhanced Metadata Enrichment - Generate asset summaries and
descriptions, and assign tag relationships. See Enriching and Managing Catalog Assets and Manage Tag
Relationships for Assets.

AI-Assisted Natural Language Search - AI-Enhanced Metadata Discovery - Enter your search string in natural
language and SAP Datasphere interprets your phrase and filters your results appropriately. See Natural Language
Search.

4. Click Save.

Next Steps

Grant access to use SAP Business AI features by assigning the DW AI Consumer role or another global role that grants the Data
Warehouse AI Consumption privilege (see Assign Users to a Role).

When users have been granted the privilege to SAP Business AI, they will see this icon in areas of SAP Datasphere where AI is
available for use: 


Create OAuth2.0 Clients to Authenticate Against SAP Datasphere
Users with an administrator role can create OAuth2.0 clients and provide the client parameters to users who need to connect
clients, tools, or apps to SAP Datasphere.

You can create OAuth2.0 clients with:

An Interactive Usage purpose:

To use the datasphere command line interface (Log into the Command Line Interface via an OAuth Client).

To consume data via the OData API (see Consume SAP Datasphere Data in SAP Analytics Cloud via an OData
Service).

An API Access purpose:

To use the SCIM 2.0 API (see Create Users and Assign Them to Roles via the SCIM 2.0 API).

To transport content through SAP Cloud Transport Management (see Transporting Your Content through SAP Cloud
Transport Management).

Create an OAuth2.0 Client with an Interactive Usage Purpose


Users with an administrator role can create OAuth2.0 clients with an interactive usage purpose and provide the client parameters
to users who need to connect clients, tools, or apps to SAP Datasphere.

Context
You create an OAuth2.0 client with an Interactive Usage purpose:

To use the datasphere command line interface (Log into the Command Line Interface via an OAuth Client).

To consume data via the OData API (see Consume SAP Datasphere Data in SAP Analytics Cloud via an OData Service).

 Note
Consuming exposed data in third-party clients, tools, and apps via an OData service requires a three-legged OAuth2.0
flow with type authorization_code.

Procedure
1. In the side navigation area, click  (System)  (Administration) App Integration .

2. Under Configured Clients, select Add a New OAuth Client.

3. In the dialog, enter or review the following properties as appropriate:

Property Description

Name Enter a name to identify the OAuth client.

OAuth Client ID [read-only] Displays the ID once the client is created.

Purpose Select Interactive Usage.


Authorization Grant [read-only] Authorization Code is automatically selected and cannot be changed.

Secret [read-only] Allows the secret to be copied immediately after the client is created.

 Note
Once you close the dialog, the secret is no longer available.

 Note
Clients created before v2024.08 have a Show Secret button, which allows you to display and copy
the secret at any time after the client is created.

Redirect URI Enter a URI to indicate where the user will be redirected after authorization. If the URI has dynamic
parameters, use a wildcard pattern (for example, [Link]).

The client, tool, or app that you want to connect is responsible for providing the redirect URI:

When working with the datasphere command line interface, set this value to
[Link] (see Accessing SAP Datasphere via the Command Line).

When connecting SAP Analytics Cloud to SAP Datasphere via an OData services connection,
use the Redirect URI provided in the SAP Analytics Cloud connection dialog (see Consume SAP
Datasphere Data in SAP Analytics Cloud via an OData Service).

Token Lifetime Enter a lifetime for the access token from a minimum of 60 seconds to a maximum of one day.

Default: 60 minutes

Refresh Token Lifetime Enter a lifetime for the refresh token from a minimum of 60 seconds to a maximum of 180 days.

Default: 30 days

4. Click Add to create the client and generate the ID and secret.

5. Copy the secret, save it securely, and then close the dialog.

 Note
You won't be able to copy the secret again. If you lose it, you will need to create a new client.

6. Provide the following information to users who will use the client:

Standard OAuth2 Authorization Flow:

Client ID

Secret

Authorization URL

Token URL

Users must manually authenticate against the IDP in order to generate the authorization code before continuing with the
remaining OAuth2.0 steps.

OAuth2SAMLBearer Principal Propagation Flow:

Client ID

Secret

OAuth2SAML Token URL

OAuth2SAML Audience

Users authenticate with their third-party app, which has a trusted relationship with the IDP, and do not need to re-
authenticate (see Add a Trusted Identity Provider). See also the blog Integrating with SAP Datasphere Consumption APIs
using SAML Bearer Assertion (published March 2024).
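For the standard flow, the token request can be sketched as follows. All values are hypothetical placeholders, not parameters of a real client; the code only assembles the Basic Authorization header and the form-encoded body that would be sent to the Token URL:

```shell
# Hypothetical client parameters copied from the OAuth client dialog:
CLIENT_ID='sb-example-client'
CLIENT_SECRET='example-secret'
AUTH_CODE='example-auth-code'          # returned as ?code=... on the redirect URI
REDIRECT_URI='http://localhost:8080'

# The Authorization header carries base64(client_id:client_secret):
BASIC=$(printf '%s:%s' "$CLIENT_ID" "$CLIENT_SECRET" | base64 | tr -d '\n')
echo "Authorization: Basic $BASIC"

# Form-encoded body of the POST request to the Token URL:
echo "grant_type=authorization_code&code=$AUTH_CODE&redirect_uri=$REDIRECT_URI"
```

The Token URL responds with a JSON document containing the access token, which the client then sends as Authorization: Bearer <token> on subsequent requests.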


Create an OAuth2.0 Client with an API Access Purpose


Users with an administrator role can create OAuth2.0 clients with an API access purpose and provide the client parameters to
users who need to connect clients, tools, or apps to SAP Datasphere.

Context
You create an OAuth2.0 Client with an API Access purpose:

To use the SCIM 2.0 API (see Create Users and Assign Them to Roles via the SCIM 2.0 API).

To transport content through SAP Cloud Transport Management (see Transporting Your Content through SAP Cloud
Transport Management).

Procedure
1. In the side navigation area, click  (System)  (Administration) App Integration .

2. Under Configured Clients, select Add a New OAuth Client.

3. In the dialog, enter or review the following properties as appropriate:

Property Description

Name Enter a name to identify the OAuth client.

OAuth Client ID [read-only] Displays the ID once the client is created.

Purpose Select API Access.

Access Select the appropriate access:

User Provisioning - To use the SCIM 2.0 API (see Create Users and Assign Them to Roles via the
SCIM 2.0 API).

Analytics Content Network Interaction - To transport content through SAP Cloud Transport
Management (see Transporting Your Content through SAP Cloud Transport Management).

Authorization Grant Select one of the following:

Client Credentials - If the client application is accessing its own resources or when the
permission to access resources has been granted by the resource owner via another
mechanism. To use the SCIM 2.0 API, select this option (see Create Users and Assign Them to
Roles via the SCIM 2.0 API).

SAML2.0 Bearer - If the user context is passed using SAML or to control access based on user
permissions using SAML. This option requires specific client-side infrastructure to support
SAML.

Secret [read-only] Allows the secret to be copied immediately after the client is created.

 Note
Once you close the dialog, the secret is no longer available.

 Note
Clients created before v2024.08 have a Show Secret button, which allows you to display and copy
the secret at any time after the client is created.

Token Lifetime Enter a lifetime for the access token from a minimum of 60 seconds to a maximum of one day.


Default: 60 minutes

4. Click Add to create the client and generate the ID and secret.

5. Copy the secret, save it securely, and then close the dialog.

 Note
You won't be able to copy the secret again. If you lose it, you will need to create a new client.

6. Provide the following information to users who will use the client:

Standard OAuth2 Authorization Flow:

Client ID

Secret

Authorization URL

Token URL

Users must manually authenticate against the IDP in order to generate the authorization code before continuing with the
remaining OAuth2.0 steps.

OAuth2SAMLBearer Principal Propagation Flow:

Client ID

Secret

OAuth2SAML Token URL

OAuth2SAML Audience

Users authenticate with their third-party app, which has a trusted relationship with the IDP, and do not need to re-
authenticate (see Add a Trusted Identity Provider). See also the blog Integrating with SAP Datasphere Consumption APIs
using SAML Bearer Assertion (published March 2024).

Add a Trusted Identity Provider


If you use the OAuth 2.0 SAML Bearer Assertion workflow, you must add a trusted identity provider to SAP Datasphere.

Context
The OAuth 2.0 SAML Bearer Assertion workflow allows third-party applications to access protected resources without prompting
users to log into SAP Datasphere when there is an existing SAML assertion from the third-party application identity provider.

 Note
Both SAP Datasphere and the third-party application must be configured with the same identity provider. The identity provider
must have a user attribute Groups set to the static value sac. See also the blog Integrating with SAP Datasphere Consumption
APIs using SAML Bearer Assertion (published March 2024).

Procedure
1. Go to System Administration App Integration .

2. In the Trusted Identity Providers section, click Add a Trusted Identity Provider.

3. In the dialog, enter the following properties:


Property Description

Name Enter a unique name, which will appear in the list of trusted identity providers.

Provider Name Enter a unique name for the provider. This name can contain only alphabetic characters (a-z and A-Z),
numbers (0-9), underscore (_), dot (.), and hyphen (-), and cannot exceed 36 characters.

Signing Certificate Enter the signing certificate information for the third-party application server in X.509 Base64 encoded
format.

4. Click Add.

The identity provider is added to the list. Hover over it and select Edit to update it or Delete to delete it.

You may need to use the Authorization URL and Token URL listed here to complete setup on your OAuth clients.
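If your identity provider exports its signing certificate as a PEM file, the raw X.509 Base64 string expected by the Signing Certificate field can be obtained by stripping the PEM header and footer. A minimal sketch using a made-up certificate body (a real certificate comes from your IdP):

```shell
# Write a made-up PEM file for illustration; export the real certificate
# from your identity provider instead:
cat > idp-signing.crt <<'EOF'
-----BEGIN CERTIFICATE-----
MIIBexampleBase64Body
MoreExampleBase64Lines==
-----END CERTIFICATE-----
EOF

# Drop the BEGIN/END armor lines and join the Base64 body onto one line:
grep -v 'CERTIFICATE' idp-signing.crt | tr -d '\n'
```

The single-line result is what you paste into the Signing Certificate field.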

Delete Your Service Instance in SAP BTP


Delete your SAP Datasphere service instance in SAP BTP.

To do so, you must have SAP BTP administration authorization on the subaccount that is entitled to SAP Datasphere.

 Note
If you delete your service instance by accident, it can be recovered within seven days. After seven days have passed, the tenant
and all its data will be deleted and cannot be recovered.

1. In SAP BTP, select the subaccount and the space where the service instance was created.

2. Navigate to Instances and Subscriptions.

3. In the Service Instances page, find the SAP Datasphere service instance that you want to delete, click the button at the end
of the row and select Delete, then click Delete in the confirmation dialog.

You can view the progress of the deletion. The tenant stays in a suspended state for seven days. During that time, you
cannot use the same tenant host name.

Restore an Accidentally Deleted Service Instance

Context

If you accidentally delete your SAP Datasphere service instance in SAP BTP, you can restore it within seven days. For more
information, see SAP Note 3455188 .

 Note
Restoring your service instance is only supported for standard service plans.

Procedure
1. Create a customer incident through ServiceNow using the component DS-PROV. Set the priority to High, and ask SAP
Support to restore the impacted SAP Datasphere tenant. You must provide the tenant URL.

Once completed, SAP Support informs you that the impacted tenant has been restored and unlocked successfully.

2. Get the OAuth Client ID and OAuth Client Secret:

a. Log on to the impacted SAP Datasphere tenant.

b. From the side navigation, choose System Administration .

c. Choose the App Integration tab.

d. Select Add a New OAuth Client.

e. From the Purpose list, select API Access.

f. Choose at least one API option from the Access list.

g. Set Authorization Grant to Client Credentials.

h. Select Save.

i. Copy and save the OAuth Client ID and OAuth Client Secret for Step 4.

3. Create an OAuth Client in the impacted SAP Datasphere tenant. You can name it something like
DSP_SERVICE_INSTANCE_LINKING.

4. Fetch your access token via http POST request to the OAuth Client Token URL.

The Token URL is displayed on the App Integration tab, above the list of Configured Clients.

a. Provide the following information with your POST request:

curl --location --request POST '<TokenURL>?grant_type=client_credentials' \
--header 'Content-Type: application/json' \
--header 'Authorization: Basic <OAuthClientSecret>' \
--data ''

Replace <TokenURL> with your OAuth Client Token URL. Replace <OAuthClientSecret> with the OAuth Client
Secret. The secret must be Base64 encoded.

b. Save the access token returned by the POST request.

5. Get the UUID for your tenant.

a. Log on to the impacted tenant.

b. Go to System About .

c. Copy the ID under Tenant.

6. Create a new BTP service instance for SAP Datasphere and link it to the impacted SAP Datasphere tenant.

a. Log on to the SAP BTP Cockpit.

b. Navigate to the subaccount where the deleted SAP BTP service instance was assigned.

c. Navigate to Services Instances and Subscriptions .

d. Click Create.

e. Select Service: SAP Datasphere.

f. Select Plan: Standard.

g. Select Runtime Environment: Other.

h. Enter a name for the service instance.

i. Click Next.

j. In the parameters dialog, switch from Form to JSON.

k. Maintain the following two parameters in JSON format:

{
  "tenantUuid": "<TenantUUID>",
  "access_token": "<AccessToken>"
}

Replace <TenantUUID> with the ID that you retrieved in Step 5c. Replace <AccessToken> with the token that you
fetched in Step 4b.

l. Click Next.

m. In the review dialog, click Create.

n. Back in the SAP Datasphere tenant, go to System Administration and delete the OAuth Client named
DSP_SERVICE_INSTANCE_LINKING, previously created in Step 3.

Results
A new service instance is created and linked to the SAP Datasphere tenant that was accidentally deleted. All tenant data is
restored.

Add Scalable Processing Capacity via Elastic Compute Nodes


If certain views and SAP HANA multi-dimensional services (MDS) requests regularly require more resources than are available on
your SAP Datasphere tenant, you can now purchase additional on-demand compute and processing memory. You can then create
elastic compute nodes, allocate the additional resources to them and schedule them to spin up to handle read peak loads.

The elastic compute nodes will take over the read peak loads and support the SAP HANA Cloud database.

 Note
Users of SAP Datasphere can consume data via elastic compute nodes only in SAP Analytics Cloud (via a live connection) and
Microsoft Excel (via an SAP add-in).

Using elastic compute nodes can lower the overall cost of ownership: instead of sizing your tenant on the basis of the maximum
load, you can use elastic compute nodes to handle short periods of exceptional peak load. For example, you can use an elastic
compute node for two months in the year to support end-of-year reporting, or you can use an elastic compute node to cover a
specific eight-hour period in the working day.

To identify peak loads, you can look at the following areas in the System Monitor: out-of-memory widgets in the Dashboard tab,
key figures in Statement Logs, and views used in MDS statements in Statement Logs. See Monitoring SAP Datasphere.

Purchase Resources for Elastic Compute Nodes


You can purchase a number of compute blocks to allocate to elastic compute nodes.

Depending on the resources allocated to your tenant in the Tenant Configuration page, the administrator decides how many
compute blocks they will allocate to elastic compute nodes. See Configure the Size of Your SAP Datasphere Tenant.

Create an Elastic Compute Node


Once you've purchased additional resources, you can create an elastic compute node to take over peak loads.

This topic contains the following sections:

Introduction to Elastic Compute Nodes

Create an Elastic Compute Node

Add Spaces and Objects to an Elastic Compute Node

Remove Spaces and Objects from an Elastic Compute Node

Delete an Elastic Compute Node

Prerequisites
To create and manage elastic compute nodes, you must have the following privileges:

Spaces (C------M) - To create, manage and run an elastic compute node

Space Files (-------M) - To add spaces and objects to an elastic compute node

System Information (--U-----) - To access tenant settings needed to manage elastic compute nodes

The DW Administrator global role, for example, grants these privileges (see Roles and Privileges by App and Feature).

Introduction to Elastic Compute Nodes


Once an administrator has purchased additional resources dedicated to elastic compute nodes, they can create and manage
elastic compute nodes in the Space Management. You can create an elastic compute node and allocate resources to it, assign
spaces and objects to it to specify the data that will be replicated to the node, and start the node (manually or via a schedule) to
replicate the data to be consumed.


You can select the following objects for an elastic compute node: perspectives and analytic models, and views of type analytical
dataset and that are exposed for consumption. To make the data of the objects available for consumption, their sources - persisted
views, local tables, and, if enabled, open SQL schema tables and HDI container tables - are replicated to the elastic compute node.

Users of SAP Analytics Cloud and Microsoft Excel (with the SAP add-in) will then automatically benefit from the improved
performance of the elastic compute nodes when consuming data exposed by SAP Datasphere. See Consuming Data Exposed by
SAP Datasphere.

Create an Elastic Compute Node


1. In the side navigation area, click (Space Management), then click Create in the Elastic Compute Nodes area.

2. In the Create Elastic Compute Node dialog, enter the following properties, and then click Create:

Property Description

Business Name Enter the business name of the elastic compute node. Can contain a maximum of 30 characters, and
can contain spaces and special characters.


Technical Name Enter the technical name of the elastic compute node. The technical name must be unique. It can only
contain lowercase letters (a-z) and numbers (0-9). It must contain the prefix: ds (which helps to
identify elastic compute nodes in monitoring tools). The minimum length is 3 and the maximum length
is 9 characters. See Rules for Technical Names.

 Note
As the technical name will be displayed in monitoring tools, including SAP internal tools, we
recommend that you do not mention sensitive information in the name.

Performance Class The performance class, which has been selected beforehand for all elastic compute nodes, is displayed
and you cannot modify it for a particular elastic compute node.

 Note
The performance class is selected when purchasing additional resources in the Tenant
Configuration page (see Configure the Size of Your SAP Datasphere Tenant) and applies to all
elastic compute nodes. The default performance class is High Compute and you may want to change it
in specific cases. For example, if you notice that the memory usage is high and the CPU usage is low
during the runtime and you want to save resources, you can select another performance class,
which will change the memory/CPU ratio.

If the performance class is changed in the Tenant Configuration page and you want to edit your
elastic compute node by selecting it and clicking Configure, you will be asked to select the changed
performance class.

Compute Blocks Select the number of compute blocks. You can choose 4, 8, 12, or 16 blocks. The amount of memory and
vCPU depends on the performance class you choose:
Memory: 1 vCPU and 16 GB RAM per block

Compute: 2 vCPUs and 16 GB RAM per block

High Compute: 4 vCPUs and 16 GB RAM per block

Default: 4

The amounts of memory and storage (in GB) and the number of vCPUs are calculated based on the
compute blocks and you cannot modify them.

 Note
You can modify the number of compute blocks later on by selecting the elastic compute node and
clicking Configure.

The price you pay for additional resources depends on the compute blocks and the performance class.
If a node that includes 4 compute blocks runs for 30 minutes, you pay for 2 block-hours.
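As a sketch of the sizing arithmetic above (assuming the High Compute class, and assuming billing is linear in blocks times uptime, as the 4-blocks-for-30-minutes example suggests):

```shell
BLOCKS=4                 # allowed values: 4, 8, 12, or 16
VCPU_PER_BLOCK=4         # High Compute class: 4 vCPUs and 16 GB RAM per block
RAM_PER_BLOCK=16
RUNTIME_MINUTES=30

echo "vCPUs: $((BLOCKS * VCPU_PER_BLOCK))"
echo "RAM (GB): $((BLOCKS * RAM_PER_BLOCK))"
# Billed block-hours = blocks x runtime in hours (may be fractional):
awk -v b="$BLOCKS" -v m="$RUNTIME_MINUTES" 'BEGIN { printf "Block-hours: %g\n", b * m / 60 }'
```

With these values, 4 blocks running for half an hour yield the 2 block-hours quoted above.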

Add Spaces and Objects to an Elastic Compute Node

Select the spaces and objects whose data you want to make available in an elastic compute node. The data of the objects you've
selected, which is stored in local tables, persisted views, and, if enabled, open SQL schema tables and HDI container tables, will be
replicated to the node and available for consumption when the elastic compute node is run.

1. In the side navigation area, click (Space Management), then select the elastic compute node.

2. Click Add Spaces, then in the dialog box, select the spaces that contain objects whose data you want to make available in
an elastic compute node and click Add Spaces.


 Note
File spaces are not displayed in the dialog box as they cannot be added to an elastic compute node.

The number of spaces added to the elastic compute node is displayed in the list of nodes on the left part of the screen.

By default, all current and future exposed objects of the selected spaces are automatically assigned to the elastic compute
node and All Exposed Objects is displayed in the space tile.

You can deactivate the automatic assignment and manually select the objects.

There are 3 types of exposed objects: analytic models, perspectives and views (of type analytical dataset and that are
exposed for consumption). See Consuming Data Exposed by SAP Datasphere.

3. To manually select the objects of a space, select the space and click Add Objects. Uncheck Add All Objects Automatically,
then select the objects you want and click Add Objects.

All the objects added across all the added spaces are displayed in the Exposed Objects tab, whether they've been added manually
or automatically via the option All Exposed Objects.

 Note
To enable the following tables to be replicated:

Open SQL schema tables - see Allow the Space to Access the Open SQL Schema

HDI container tables - see Prepare Your HDI Project for Exchanging Data with Your Space

 Note
Remote Tables - Data that is replicated from remote tables in the main instance cannot be replicated to an elastic compute
node. If you want to make data from a replicated remote table available in an elastic compute node, you should build a view on
top of the remote table and persist the view's data (see Persist Data in a Graphical or SQL View). You should then make sure
that the object (analytic model, perspective, or view) consumes the persisted view instead of the remote table.

Shared Table Example - Making data from a shared table available in an elastic compute node:

The IT space shares the Products table with the Sales space.

The analytical model in the Sales space uses the shared Products table as a source.

If you want the Products table to be replicated to an elastic compute node, you need to add to the node both the Sales
space and the IT space. The shared Products table will not be replicated to the node if you only add the Sales space.


Remove Spaces and Objects from an Elastic Compute Node

1. In the side navigation area, click (Space Management), then select the elastic compute node.

2. Select one or more spaces and click Remove Spaces.

The selected spaces and their objects are removed from the elastic compute node.

3. To remove one or more objects that you've manually added, in the Exposed Objects tab, select one or more objects and
click Remove Objects.

Delete an Elastic Compute Node


1. In the side navigation area, click (Space Management), then select the elastic compute node.

2. Click Delete then in the confirmation dialog click Delete again.

The Delete button is disabled if the status of the elastic compute node is Running.

Run an Elastic Compute Node


Once you've created an elastic compute node and added spaces and objects to it, you can run it and make data available for
consumption.

This topic contains the following sections:

Introduction to Elastic Compute Node Run Process

Start an Elastic Compute Node Manually

Stop an Elastic Compute Node Manually

Schedule an Elastic Compute Node

Update an Elastic Compute Node

Monitor an Elastic Compute Node

Introduction to Elastic Compute Node Run Process


When you start an elastic compute node, it will pass through the following phases:


An elastic compute node can have the following statuses:

Not Ready - The node cannot be run because no spaces or objects are assigned to it.

Ready - Spaces or objects are assigned to the node, which can be run, either by starting the run manually or scheduling it.
The status displayed in grey indicates that the elastic compute node has never run, whereas green indicates that it has
already run.

Starting - You’ve started the elastic compute node manually by clicking the Start button or it has been started via a
schedule: persisted views and local tables are being replicated and routing is created to the elastic compute node.

Starting Failed (displayed in red) - You’ve started the elastic compute node manually by clicking the Start button or it has
been started via a schedule: issues have occurred. You can start the elastic compute node again.

Updating - You’ve started the elastic compute node manually by clicking the Update button: persisted views and local
tables that have failed to be replicated are now replicated and routing is created to the elastic compute node.

Running - The node is in its running phase: the data that was replicated during the starting phase can be consumed
in SAP Analytics Cloud for the spaces and objects specified.

 Note
The Running status displayed in red indicates that the elastic compute node contains issues. We recommend that you
stop and restart the node or, alternatively, that you stop and delete the node and create a new one.

Stopping - You’ve stopped the elastic compute node manually by clicking the Stop button or it has been stopped via a
schedule: persisted view replicas, local table replicas and routing are being deleted from the node.

Stopping Failed (displayed in red) - You’ve stopped the elastic compute node manually by clicking the Stop button or it has
been stopped via a schedule: issues have occurred. You can stop the elastic compute node again.

 Note
Up to 4 elastic compute nodes can run at the same time.

Updates of local tables or persisted views while an elastic compute node is running - An elastic compute node is in its running
phase, which means that its local tables and persisted views have been replicated. Here is the behavior if these objects are
updated while the node is running:

If a local table's data is updated, it is updated on the main instance and the local table replica is also updated in parallel on
the elastic compute node. The runtime may take longer and more memory may be consumed.

If a persisted view's data is updated, it is first updated on the main instance, then as a second step the persisted view replica
is updated on the elastic compute node. The runtime will take longer, and more memory and compute will be consumed.

If local table or persisted view metadata is changed (a new column is added, for example) or the object is deleted from the
main instance, the local table replica or the persisted view replica is deleted from the elastic compute node. The data of
these objects is then read from the main instance and not from the elastic compute node.

To create and manage elastic compute nodes, you must have the following privileges:

Spaces (C------M) - To create, manage and run an elastic compute node

Space Files (-------M) - To add spaces and objects to an elastic compute node

System Information (--U-----) - To access tenant settings needed to manage elastic compute nodes

The DW Administrator global role, for example, grants these privileges (see Roles and Privileges by App and Feature).

Start an Elastic Compute Node Manually


If the status of an elastic compute node is Ready, you can start it.

1. In the side navigation area, click (Space Management), then select the elastic compute node.

2. Click Start.

The status of the elastic compute node changes to Starting.

Stop an Elastic Compute Node Manually


If the status of an elastic compute node is Starting or Running, you can stop it.

1. In the side navigation area, click (Space Management), then select the elastic compute node.

2. Click Stop.

The status of the elastic compute node changes to Stopping.

Schedule an Elastic Compute Node


You can schedule an elastic compute node to run periodically at a specified date or time. You can also pause and then later resume
the schedule. You create and manage a schedule to run an elastic compute node as any other data integration task (see
Scheduling Data Integration Tasks) and, in addition, you can specify the duration time frame as follows.

1. In the side navigation area, click (Space Management), then select the elastic compute node.

2. Click Schedule, then Create Schedule.

3. In the Create Schedule dialog, specify the options of the schedule, just like for any other integration task. See Schedule a
Data Integration Task (Simple Schedule) and Schedule a Data Integration Task (with Cron Expression).

4. In addition, specify in the Duration area the total number of hours and minutes of an elastic compute node run, from the
starting to the stopping stages.

Example - The elastic compute node is scheduled to run on the first day of every month for a duration of 72 hours (uptime of 3
days).
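The Duration arithmetic in the example above can be sketched with plain date arithmetic (an illustrative sketch only; the dates are hypothetical, and the actual schedule and duration are configured in the Create Schedule dialog, not in code):

```python
# Illustrative calculation of the example above: a run scheduled for the
# first day of the month with a Duration of 72 hours. The dates are
# hypothetical; the real schedule is configured in the UI.

from datetime import datetime, timedelta

def planned_window(start: datetime, hours: int, minutes: int = 0):
    """Return the (start, stop) times of one scheduled node run."""
    return start, start + timedelta(hours=hours, minutes=minutes)

start, stop = planned_window(datetime(2025, 8, 1, 0, 0), hours=72)
# A 72-hour duration keeps the node up for 3 days per run.
```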

Once you've created the schedule, a schedule icon is displayed next to the elastic compute node in the list of nodes in the left-hand
side area of the Space Management.

You can then perform the following actions for the schedule by clicking Schedule: edit, pause, resume, delete or take over the
ownership of the schedule (see Scheduling Data Integration Tasks).

Update an Elastic Compute Node


In some cases, you can partially re-run an elastic compute node to replicate tables or persisted views that were not
replicated: if changes have been made to a local table or a persisted view assigned to the node while or after the node was
running, or if a table or a view has failed to be replicated. In such cases, the Update button is available.

1. In the side navigation area, click (Space Management), then select the elastic compute node.

2. Click Update.

The status of the elastic compute node changes to Starting.
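Taken together, the start, stop, and update actions above form a small state machine. Here is an illustrative sketch (not an SAP API; the status names and the actions allowed in each status follow this section):

```python
# Illustrative model of the elastic compute node lifecycle described in
# this section. Not an SAP API; status names and allowed actions are
# taken from the documentation above.

ALLOWED_ACTIONS = {
    "Ready": {"start"},
    "Starting": {"stop"},
    "Running": {"stop", "update"},   # Update partially re-runs the node
    "Stopping": set(),               # wait for stopping to finish
    "Stopping Failed": {"stop"},     # you can try to stop again
}

TRANSITIONS = {
    ("Ready", "start"): "Starting",
    ("Starting", "stop"): "Stopping",
    ("Running", "stop"): "Stopping",
    ("Running", "update"): "Starting",
    ("Stopping Failed", "stop"): "Stopping",
}

def apply_action(status: str, action: str) -> str:
    """Return the new status, or raise if the action is not allowed."""
    if action not in ALLOWED_ACTIONS.get(status, set()):
        raise ValueError(f"Cannot {action!r} a node in status {status!r}")
    return TRANSITIONS[(status, action)]
```

For example, starting a Ready node moves it to Starting, while trying to start a node that is already Stopping raises an error.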

Monitor an Elastic Compute Node


Monitor an elastic compute node to see, for example, all its start and stop runs or whether all local tables and persisted views
have been replicated.

1. In the side navigation area, click (Space Management), then select the elastic compute node.

2. Click View Logs.

The Statement Logs tab of the System Monitor opens, displaying information filtered on the elastic compute node. For
more information about logs in the System Monitor, see Monitoring SAP Datasphere.

If local tables or persisted views were not replicated, you can go back to the elastic compute node and update it to replicate
them.

 Note
To monitor the start and stop runs for all elastic compute nodes, you can click View Logs in the left-hand area of the Space
Management.

You can monitor key figures related to an elastic compute node (such as start and end time of the last run; amount of memory
used for data replication), in the Elastic Compute Nodes tab of the System Monitor (see Monitoring SAP Datasphere).

Display Your System Information


Add a visual tenant type indicator to your system.

Context
You can add a tenant type indicator to show all users which system they are using, for example to differentiate between a test
and a production system. When enabled, a colored information bar is visible to all users of the tenant, and the browser
favicon is updated with the matching color.

Procedure
1. Go to System Configuration System Information .

2. If you have not set system information before, select Customize Visual Settings. If you have previously set system
information, select Edit.

3. Select a tenant type from the list. If you select a Custom type, you must add a Title. The tenant type will be displayed in the
information bar.

Example Custom dialog:

4. Select a color.

A preview of the favicon and information bar will be displayed.

5. Select Confirm.

6. Turn on the Display System Information toggle.

Results
The tenant information that you set is displayed to all users above the shell bar. For example:

Apply a Patch Upgrade to Your SAP HANA Database


As an SAP Datasphere administrator, you can manually upgrade your SAP HANA database. This ensures your system is up to date
and running smoothly.

Context

Automated database upgrades are not impacted by your ability to upgrade your patch version manually. You can follow this
procedure in cases where a patch upgrade resolves an issue with the previous patch version.

 Note
To upgrade the SAP HANA database, you must have a global role that grants you the privilege System Information with the
Update permission. The DW Administrator global role, for example, grants this privilege (see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere).

This task is limited to patch upgrades. For example, if your current database version is 2024.28.3, and the next patch version of
2024.28.4 is available, you can upgrade. You cannot go from version 2024.28.4 to 2024.29.0, because that is a larger upgrade,
not a patch upgrade.
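The rule above can be expressed as a small check (an illustrative sketch, not an SAP API; version strings are assumed to follow the three-component pattern shown in the example):

```python
# Illustrative check of the "patch upgrades only" rule above: an upgrade
# is allowed only when the first two version components are unchanged and
# the patch component increases. Not an SAP API.

def is_patch_upgrade(current: str, target: str) -> bool:
    cur = [int(part) for part in current.split(".")]
    tgt = [int(part) for part in target.split(".")]
    return cur[:2] == tgt[:2] and tgt[2] > cur[2]

# 2024.28.3 -> 2024.28.4 is a patch upgrade; 2024.28.4 -> 2024.29.0 is not.
```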

Procedure
1. In the side navigation area, click  System  About.

2. Click More and scroll to the bottom of the dialog.

3. Click Trigger Patch Upgrade.

A confirmation dialog is shown informing you that the upgrade will cause a short downtime where SAP Datasphere is not
connected to SAP HANA.

4. Click Trigger upgrade to continue.

The patch upgrade begins. When the patch is finished, you'll receive a  notification.

Managing Users and Roles


Users with an administrator role can create SAP Datasphere users and manage secure access to the tenant through roles and
privileges.


Managing User Identity and Authentication


The system owner can choose how to manage user identity and authentication for its SAP Datasphere tenant.

There are several methods to authenticate to SAP Datasphere.

 Note
Allowed authentication methods for SAP Datasphere tenants before or after the bundling with SAP Cloud Identity Services
tenants feature was enabled:

Existing tenants: the default IdP, a custom IdP, or bundled SAP Cloud Identity Services tenants.

New tenants: the default IdP or bundled SAP Cloud Identity Services tenants, which support forwarding all SSO
requests to a corporate IdP.

Default Identity Provider


SAP Cloud Identity is the default authentication method for SAP Datasphere. Users can sign in to SAP Datasphere but not to other
SAP products.

Bundled SAP Cloud Identity Services Tenants

 Note
Bundling with SAP Cloud Identity Services tenants is being rolled out over the course of a number of versions. For more details,
see SAP Note 3619907.

To allow users to sign in to your SAP Datasphere tenant and to other SAP products via single sign-on (SSO), you can provision an
SAP Cloud Identity Services tenant for your SAP Datasphere tenant (see Configure Your Bundled SAP Cloud Identity Services
Tenant). SAP Cloud Identity Services tenants support forwarding all SSO requests to a corporate IdP (see What Are Cloud Identity
Services).

(Legacy) Custom Identity Provider


For your SAP Datasphere tenant, you can enable single sign-on authentication to a custom IdP, which is a separate solution, for
example Azure AD. For more information, see Enabling a Custom SAML Identity Provider (Legacy Custom IdP).

If you are having trouble signing in, you can use the Identity Provider Administration tool to repair your custom IdP. For more
information, see Access the Identity Provider Administration Tool.

Configure Your Bundled SAP Cloud Identity Services Tenant


Provision and configure your bundled SAP Cloud Identity Services tenant to manage authentication to SAP Datasphere.

This topic contains the following sections:

Provision an SAP Cloud Identity Services Tenant

Configure Authentication

Modify Your Authentication Setup

Disable Your SAP Cloud Identity Services Tenant and Revert to Default IdP

Disable Your SAP Cloud Identity Services Tenant and Revert to Your Custom IdP

 Note
Bundling with SAP Cloud Identity Services tenants is being rolled out over the course of a number of versions. For more details,
see SAP Note 3619907.

 Note
If you currently use a custom IdP, we recommend that you migrate to an SAP Cloud Identity Services tenant and configure the
tenant to work with your corporate IdP (see Forward All SSO Requests to Corporate IdP). Using SAP Cloud Identity Services to
federate the identity of a custom IdP brings several benefits (see What Are Cloud Identity Services).

You provision and configure a bundled SAP Cloud Identity Services tenant for your SAP Datasphere tenant to allow users to sign in
via single sign-on (SSO) to SAP Datasphere and to other SAP products that use the same SAP Cloud Identity Services tenant. For
more information about bundles, see Bundles.

Prerequisites
To configure your bundled SAP Cloud Identity Services tenant for your SAP Datasphere tenant:

You must have the system owner role for your SAP Datasphere tenant and have multi-factor authentication enabled (see
Multi-Factor Authentication)

You must have an S-user with the same email address as the system owner of the SAP Datasphere tenant. If you do not have an
S-user, click the Register button and create a user with the email address used by the system owner.

We recommend that the S-user has multi-factor authentication enabled.

Provision an SAP Cloud Identity Services Tenant


You can provision up to two bundled SAP Cloud Identity Services tenants, one for each tenant role: test and production.

1. Sign in to the Identity Provider Administration tool (see Access the Identity Provider Administration Tool).

All tenants that you own will appear as cards on the My Tenants page. Under Current IdP, the name and type of
authentication method used will appear: default, bundled, or custom.

2. On the tenant card that you want to bundle, select  + Add SAP Cloud Identity Services.

All the SAP Cloud Identity Services tenants that you own will be listed.

3. Select an existing SAP Cloud Identity Services tenant or + Provision New SCI Tenant.

Option 1 - Provision a New SAP Cloud Identity Services Tenant

a. If you are provisioning a new SAP Cloud Identity Services tenant, enter information for the SAP Cloud Identity Services
tenant administrator's first name, last name, and email. By default, the fields will be populated with your information, but
the fields are editable in case you want to provide different information.

b. Click Step 3.

When you provision a new tenant, its tenant role will match the tenant type used by SAP Datasphere. For example, if your
SAP Datasphere tenant is a test system, the SAP Cloud Identity Services tenant will be assigned the tenant role: Test.

 Note
This option is only available if you do not have an existing SAP Cloud Identity Services tenant with a role that matches
your SAP Datasphere tenant type.

Option 2 - Use an Existing SAP Cloud Identity Services Tenant

a. Select the SAP Cloud Identity Services tenant you want to use with the authentication bundle.

b. Click Step 2. A summary page will appear containing the tenant name and information about the tenant administrator.

 Note
If you do not already have a user on the SAP Cloud Identity Services tenant, a standard user will be created for you. A
standard user does not have administration rights on the tenant, and you must request permissions from an existing
administrator if you want to make changes to the tenant in the future.

4. Select Finish.

A warning will appear before provisioning begins.

5. Select Yes to start the tenant provisioning.

Tenant provisioning can take up to 1 hour to complete. A progress bar will indicate your provisioning status. It will change to
a success message once the provisioning is complete.

If you select Close to leave the progress dialog before it is complete, tenant provisioning will continue in the background. If
you return to the My Tenants page, the tenant's provisioning status will appear on the tenant card. You can click the status
to return to the progress bar.

You then need to configure authentication to work with your SAP Cloud Identity Services tenant.

Configure Authentication
As a prerequisite, an SAP Cloud Identity Services tenant must be provisioned for the selected SAP Datasphere tenant.

1. Access the configuration dialog one of two ways:

a. From the SAP Cloud Identity Services tenant provisioning progress bar, select Continue.

b. From the My Tenants page, go to the tenant you are bundling, then select  Configure Authentication .

2. Select a User Attribute.

The attribute will be used to map users from your existing user list to SAP Datasphere.

Determine what your Subject Name Identifier maps to in your SAP Datasphere system. It should map to User ID,
Email or a custom attribute. You can view your SAP Datasphere user attributes in Security Users .

 Note

Subject Name Identifier is case sensitive. The User ID, Email, or Custom Authentication User Mapping must
match the values exactly. For example, if the Subject Name Identifier returned by your identity provider is
user@[Link] and the email you used in SAP Datasphere is User@[Link], the mapping will fail.

Choose one of the following options:

USER ID: To map to the SAP Datasphere User ID.

Email: To map to the SAP Datasphere Email address.

Custom Authentication User Mapping: To map to a custom value.

 Note
If you select this option, there will be a new column named SAML User Mapping in Security Users . After
switching to your SAML IdP, you must manually update this column for all existing users.

3. (Optional) Enable Dynamic User Creation.

When dynamic user creation is enabled, new users will be automatically created using the default role and will be able to
sign into SAP Datasphere. After users are created, you can set roles using SAML attributes. For more information, see
Assign Users to a Role Using SAML Attributes.

 Note
If this option is enabled, dynamic user creation still occurs in SAP Datasphere even when user attributes have not been
set for all SAP Cloud Identity Services tenant users. To prevent a user from being automatically created, your SAP Cloud
Identity Services tenant must deny the user access to SAP Datasphere.

4. Click Step 2.

The Validate Login page appears.

5. [Optional] Do additional configuration in SAP Cloud Identity Services if you don't want to use the default settings.

SAP Datasphere should appear in the Bundled Applications list on your SAP Cloud Identity Services tenant.

For example, you can change the subject name identifier used by SAP Cloud Identity Services to match the attribute you
selected in Step 2. You can also configure the tenant to forward authentication from SAP Cloud Identity Services to a
corporate IdP. For more information, see Forward All SSO Requests to Corporate IdP.

6. Verify that you can sign in to your SAP Cloud Identity Services tenant: in another browser, sign in to the URL provided in the
Verify Your Account dialog, using your SAP Cloud Identity Services tenant credentials.

 Note
You can copy the URL by selecting (Copy).

You must use a private session to sign into the URL; for example, Guest mode in Google Chrome. This ensures
that when you sign in to the dialog and select SAP Datasphere, you are prompted to sign in and do not reuse an
existing browser session.

If you can sign in successfully, the authentication setup is correct.

7. On the Validate Login page, select Validate Login, then click Step 3.

A summary page will appear with the system information.

8. Select Finish.

Configuring the bundled SAP Cloud Identity Services tenant can take up to 1 hour to complete. A progress bar will indicate
the configuration status. It will change to a success message once the configuration is complete.

9. If you select Close to leave the progress dialog before it is complete, tenant provisioning will continue in the background. If
you return to the My Tenants page, the tenant's provisioning status will appear on the tenant card. You can click the status to
return to the progress bar.

You are taken to the My Tenants page, and the status of your tenant will be updated.
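The case-sensitive Subject Name Identifier matching described in step 2 can be illustrated as follows (the user store and email addresses are hypothetical; SAP Datasphere performs this comparison internally):

```python
# Illustrative sketch of the exact-match rule from step 2. The user store
# and email addresses are hypothetical; SAP Datasphere performs this
# comparison internally.

datasphere_users = {"User@example.com": "USER1"}  # email as stored in Security > Users

def resolve(subject_name_identifier: str):
    # Exact, case-sensitive lookup: no case normalization is applied.
    return datasphere_users.get(subject_name_identifier)

# resolve("User@example.com") finds the user; resolve("user@example.com")
# returns None, i.e. the mapping fails.
```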

Modify Your Authentication Setup


Once you have created your authentication bundle, you can change the configuration if needed.

1. In the My Tenants page, go to the bundled tenant you want to modify, then select  Configure Authentication .

2. In the Select User Attribute Type step, you can change the user attribute type, or enable dynamic user creation.

3. Click Step 2.

4. Go through steps 5-10 of the Configure Authentication procedure above.

Disable Your SAP Cloud Identity Services Tenant and Revert to Default IdP
You can disable your bundled SAP Cloud Identity Services tenant and revert to using the default IdP.

 Note
Users who sign in to SAP Datasphere will not be able to sign in to other SAP products via SSO.

1. Sign in to the Identity Provider Administration tool (see Access the Identity Provider Administration Tool).

2. In the My Tenants page, go to the bundled tenant you want to modify, then select  Disable Bundling .

3. Select SAP Cloud Identity (Default) and click Step 2.

4. Select Finish to disable the bundle.

Disabling the bundled SAP Cloud Identity Services tenant can take up to 1 hour to complete. A progress bar will indicate your
provisioning status. It will change to a success message once the provisioning is complete.

If you select Close to leave the progress dialog before it is complete, tenant provisioning will continue in the background. If you
return to the My Tenants page, the tenant's provisioning status will appear on the tenant card. You can click the status to return to
the progress bar.

Disable Your SAP Cloud Identity Services Tenant and Revert to Your Custom IdP
You can disable your bundled SAP Cloud Identity Services tenant and revert to using your custom IdP only if your SAP Datasphere
tenant was provisioned before the bundling with SAP Cloud Identity Services tenants feature was enabled.

To establish a trust relationship between your SAML IdP and your SAP Datasphere tenant, you'll need to exchange their metadata.

 Note
Users who sign in to SAP Datasphere will not be able to sign in to other SAP products via SSO.

1. Sign in to the Identity Provider Administration tool (see Access the Identity Provider Administration Tool).

2. In the My Tenants page, go to the bundled tenant you want to disable, then select  Disable Bundling .

3. Select SAML Single Sign-On (SSO) and click Step 2.

4. Click Download and save the file, which contains the SAP Datasphere metadata.

5. In your custom IdP, upload the file containing the SAP Datasphere metadata.

6. Map your SAML IdP user attributes and roles.

Configure your SAML IdP to map user attributes to the following case-sensitive allowlisted assertion attributes. We
recommend that you map only the user attributes and roles that will be used in SAP Datasphere. Mapping additional user
attributes may result in a large SAML assertion, which could produce a login error.

Attribute Name Notes

email Required if your NameID is "email".

Groups Required. The value must be set to "sac", even in case of SAP Datasphere. The Groups attribute is a custom
attribute and must be added if it does not exist yet. You need to contact your administrator to get the path
where the mapping needs to be changed.

familyName Optional. familyName is the user's last name (surname).

displayName Optional.

functionalArea Optional.

givenName Optional. givenName is the user's first name.

preferredLanguage Optional.

custom1 Optional. For SAML role assignment.

custom2 Optional. For SAML role assignment.

custom3 Optional. For SAML role assignment.

custom4 Optional. For SAML role assignment.

custom5 Optional. For SAML role assignment.

Example of SAML assertion:

<AttributeStatement>
  <Attribute Name="email">
    <AttributeValue>[Link]@[Link]</AttributeValue>
  </Attribute>
  <Attribute Name="givenName">
    <AttributeValue>Abc</AttributeValue>
  </Attribute>
  <Attribute Name="familyName">
    <AttributeValue>Def</AttributeValue>
  </Attribute>
  <Attribute Name="displayName">
    <AttributeValue>Abc Def</AttributeValue>
  </Attribute>
  <Attribute Name="Groups">
    <AttributeValue>sac</AttributeValue>
  </Attribute>
  <Attribute Name="custom1">
    <AttributeValue>Domain Users</AttributeValue>
    <AttributeValue>Enterprise Admins</AttributeValue>
    <AttributeValue>Enterprise Key Admins</AttributeValue>
  </Attribute>
</AttributeStatement>

 Note

Map the Groups "sac" attribute under Default Attributes for your SAP Datasphere tenant. The remaining attributes
should be mapped under Assertion Attributes for your SAP Datasphere tenant.

7. In your custom IdP, download its SAML metadata in a file.

8. In the Identity Provider Administration tool, click Upload and select the file containing the SAML metadata of your IdP.

9. Click Step 3.

10. Select a User Attribute.

The attribute will be used to map users from your existing user list to SAP Datasphere.

Determine what your Subject Name Identifier maps to in your SAP Datasphere tenant. It should map to User ID,
Email or a custom attribute. You can view your SAP Datasphere user attributes in Security Users .

 Note
Subject Name Identifier is case sensitive. The User ID, Email, or Custom Authentication User Mapping must
match the values exactly. For example, if the Subject Name Identifier returned by your identity provider is
user@[Link] and the email you used in SAP Datasphere is User@[Link], the mapping will fail.

Choose one of the following options:

USER ID: To map to the SAP Datasphere User ID.

Email: To map to the SAP Datasphere Email address.

Custom Authentication User Mapping: To map to a custom value.

 Note
If you select this option, there will be a new column named SAML User Mapping in Security Users . After
switching to your SAP Cloud Identity Services tenant, you must manually update this column for all existing
users.

11. [Optional] Enable Dynamic User Creation.

When dynamic user creation is enabled, new users will be automatically created using the default role and will be able to
sign into SAP Datasphere. After users are created, you can set roles using SAML attributes. For more information, see
Assign Users to a Role Using SAML Attributes.

 Note
If this option is enabled, dynamic user creation still occurs in SAP Datasphere even when user attributes have not been
set for all SAP Cloud Identity Services tenant users. To prevent a user from being automatically created, your SAP Cloud
Identity Services tenant must deny the user access to SAP Datasphere.

12. Click Step 4.

The Validate Login page appears.

13. Verify that you can sign in to your custom IdP: in another browser, sign in to the URL provided in the Verify Your Account
dialog, using your custom IdP credentials.

 Note
You can copy the URL by selecting (Copy).

You must use a private session to sign into the URL; for example, Guest mode in Google Chrome. This ensures
that when you sign in to the dialog and select SAP Datasphere, you are prompted to sign in and do not reuse an
existing browser session.

If you can sign in successfully, the authentication setup is correct.

14. On the Validate Login page, select Validate Login, then click Step 5.

A summary page will appear with the system information.

15. Select Finish.

Disabling the bundled SAP Cloud Identity Services tenant can take up to 1 hour to complete. A progress bar will indicate
your provisioning status. It will change to a success message once the provisioning is complete.

16. If you select Close to leave the progress dialog before it is complete, tenant provisioning will continue in the background. If
you return to the My Tenants page, the tenant's provisioning status will appear on the tenant card. You can click the status to
return to the progress bar.

Access the Identity Provider Administration Tool


The Identity Provider Administration tool allows system owners to manage the identity provider configured with SAP Datasphere.
Through the tool, the system owner can create or modify an authentication bundle, choose to upload new metadata for a custom
identity provider, or revert to using the default identity provider.

Prerequisites

 Note
Bundling with SAP Cloud Identity Services tenants is being rolled out over the course of a number of versions. For more details,
see SAP Note 3619907.

To configure your bundled SAP Cloud Identity Services tenant for your SAP Datasphere tenant:

You must have the system owner role for your SAP Datasphere tenant and have multi-factor authentication enabled (see
Multi-Factor Authentication)

You must have a S-user with the same email as the system owner of the SAP Datasphere tenant. If you do not have an S-
user, click the Register button and create a user with email address used by the system owner.

We recommend that the S-user has multi-factor authentication enabled.

Procedure
1. Access the Identity Provider Administration tool using the following URL: [Link]
center>.[Link]/idp-admin/

For example, if your SAP Datasphere system is on eu10, then the URL is:

[Link]

If your SAP Datasphere system is on cn1, then the URL is:

[Link]

If your tenant is on EUDP:

[Link]

[Link]

2. Log in with an S-user that has the same email address as the system owner of SAP Datasphere tenant. If you don't yet have
such an S-user, you can click the Register button and create a P-user.

If you create a new P-user, you'll receive an email with an activation link that will let you set your password.

3. Once you're signed in, the list of SAP Datasphere tenants for which you are the system owner is displayed. Select the tenant
you want to work on by clicking on the card.

 Note
Your SAP Datasphere tenant is connected to the Identity Provider Administration tool by default. If you'd like to
disconnect your tenant from the console, you can do so in either of two places:

In SAP Datasphere, navigate to System Administration Security Optional: Configure Identity Provider
Administration Tool , click the Connected switch, and then save the changes.

Click Disconnect IdP Admin from your system after selecting your tenant in the Identity Provider Administration
tool.

Enabling a Custom SAML Identity Provider (Legacy Custom IdP)


By default, SAP Cloud Identity is used by SAP Datasphere. SAP Datasphere also supports single sign-on (SSO), using your custom
identity provider.

Prerequisites
SAP Datasphere can be hosted on non-SAP data centers.

You must have an IdP that supports SAML 2.0 protocol.

You must be able to configure your IdP.

You must be the system owner of the SAP Datasphere tenant. For more information see Transfer the System Owner Role.

If your users are connecting from Apple devices using the mobile app, the certificate used by your IdP must be compatible
with Apple's App Transport Security (ATS) feature.

Context
A custom identity provider is a separate solution, for example Azure AD, and is not part of SAP Analytics Cloud or SAP
Datasphere. Therefore the configuration change is applied directly in that solution, not within SAP Datasphere. No access to
SAP Datasphere is required to make the change, only access to the IdP.

 Note
Be aware that the SAML attributes for SAP Datasphere roles do not cover user assignment to spaces. A user who logs into an
SAP Datasphere tenant through SSO must be assigned to a space in order to access it. If you do not assign a user to
a space, the user will not have access to any space.

Procedure
1. From the side navigation, go to  (System) →  (Administration) → Security.

2. Select  (Edit).

3. In the Authentication Method area, select SAML Single Sign-On (SSO) if it is not already selected.

 Note
By default, SAP Cloud Identity is used for authentication.

4. In Step 1, select Download and save the metadata file.

A SAP Datasphere metadata file is saved.

5. Upload the SAP Datasphere metadata file to your SAML IdP.

If you are creating a new SAP Datasphere application on the Identity Authentication Service (IAS) side with the type
"unknown", set the type to "Unknown".

The file includes metadata for SAP Datasphere, and is used to create a trust relationship between your SAML Identity
Provider and your SAP Datasphere system.

6. Map your SAML IdP user attributes and roles.

Configure your SAML IdP to map user attributes to the following case-sensitive allowlisted assertion attributes. We
recommend that you map only the user attributes and roles that will be used in SAP Datasphere. Mapping additional
user attributes may result in a large SAML assertion, which could produce a login error.

Attribute Name Notes

email Required if your NameID is "email".

Groups Required. The value must be set to "sac", even in case of SAP Datasphere. The Groups attribute is a custom
attribute and must be added if it does not exist yet. You need to contact your administrator to get the path
where the mapping needs to be changed.

familyName Optional. familyName is the user's last name (surname).

displayName Optional.

functionalArea Optional.

givenName Optional. givenName is the user's first name.

preferredLanguage Optional.

custom1 Optional. For SAML role assignment.

custom2 Optional. For SAML role assignment.

custom3 Optional. For SAML role assignment.

custom4 Optional. For SAML role assignment.

custom5 Optional. For SAML role assignment.

Example of SAML assertion:

<AttributeStatement>
  <Attribute Name="email">
    <AttributeValue>[Link]@[Link]</AttributeValue>
  </Attribute>
  <Attribute Name="givenName">
    <AttributeValue>Abc</AttributeValue>
  </Attribute>
  <Attribute Name="familyName">
    <AttributeValue>Def</AttributeValue>
  </Attribute>
  <Attribute Name="displayName">
    <AttributeValue>Abc Def</AttributeValue>
  </Attribute>
  <Attribute Name="Groups">
    <AttributeValue>sac</AttributeValue>
  </Attribute>
  <Attribute Name="custom1">
    <AttributeValue>Domain Users</AttributeValue>
    <AttributeValue>Enterprise Admins</AttributeValue>
    <AttributeValue>Enterprise Key Admins</AttributeValue>
  </Attribute>
</AttributeStatement>

 Note
If you are using the SAP Cloud Identity Authentication service as your IdP, map the Groups "sac" attribute under
Default Attributes for your SAP Datasphere tenant. The remaining attributes should be mapped under Assertion
Attributes for your SAP Datasphere tenant.

7. Download metadata from your SAML IdP.

8. In Step 2, select Upload, and choose the metadata file you downloaded from your SAML IdP.

9. In Step 3, select a User Attribute.

The attribute will be used to map users from your existing SAML user list to SAP Datasphere. NameID is used in your custom
SAML assertion:

<NameID Format="urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified"><Your Unique Identifier></NameID>

Determine what your NameID maps to in your SAP Datasphere system. The user attribute you select must match the
User ID, Email, or a custom attribute. You can view your SAP Datasphere user attributes in Security → Users.

 Note
NameID is case-sensitive. The User ID, Email, or Custom SAML User Mapping must match the values in your SAML IdP
exactly. For example, if the NameID returned by your SAML IdP is user@[Link] and the email you used in SAP
Datasphere is User@[Link], the mapping will fail.
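
The exact-match behavior described in the note can be illustrated with a minimal sketch (the addresses are placeholders):

```python
# Sketch: why a mixed-case NameID fails against a stored email.
# SAP Datasphere compares the NameID to the mapped attribute exactly,
# so matching here is deliberately case-sensitive.
def name_id_matches(name_id: str, stored_value: str) -> bool:
    return name_id == stored_value

# Exact match succeeds; a casing difference alone is enough to fail.
print(name_id_matches("user@example.com", "user@example.com"))  # True
print(name_id_matches("user@example.com", "User@example.com"))  # False
```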

Choose one of the following options:

USER ID: If NameID maps to the SAP Datasphere User ID.

Email: If NameID maps to SAP Datasphere Email address.

 Note
If your NameID email is not case-sensitive and contains mixed-case, for example User@[Link],
consider choosing Custom SAML User Mapping instead.

Custom SAML User Mapping: If NameID maps to a custom value.

 Note
If you select this option, a new column named SAML User Mapping will appear in Security → Users. After
switching to your SAML IdP, you must manually update this column for all existing users.

 Note
If you are using a live connection to SAP S/4HANA Cloud Edition with OAuth 2.0 SAML Bearer Assertion, NameId must
be identical to the user name of the business user on your SAP S/4HANA system.

For example, if you want to map an SAP Datasphere user with the user ID SACUSER to your SAP S/4HANA Cloud user
with the user name S4HANAUSER, you must select Custom SAML User Mapping and use S4HANAUSER as the Login
Credential in Step 10.

If you are using SAP Cloud Identity as your SAML IdP, you can choose Login Name as the NameID attribute for SAP
Datasphere, then you can set the login name of your SAP Datasphere user as S4HANAUSER.

10. [Optional] Enable Dynamic User Creation.

When dynamic user creation is enabled, new users will be automatically created and assigned the default role and will be
able to use SAML SSO to log onto SAP Datasphere. Once the users are created, you can assign roles using SAML attributes
(see Assign Users to a Role Using SAML Attributes).

 Note
Automatic user deletion is not supported. If a user in SAP Datasphere is removed from your SAML IdP, you must go to
Security → Users and manually delete the user. For more information, see Delete Users.

If this option is enabled, dynamic user creation still occurs in SAP Datasphere even when SAML user attributes have not
been set for all IdP users. To prevent a user from being automatically created, your SAML IdP must deny the user access
to SAP Datasphere.

11. In Step 4, enter <Your Unique Identifier>.

This value must identify the SAP Datasphere system owner. The Login Credential provided here is automatically set for
your user.

 Note
The Login Credential depends on the User Attribute you selected under Step 3.

12. Test the SAML IdP setup by logging in with your IdP, and then clicking Verify Account to open a dialog for validation.

In another browser, log on to the URL provided in the Verify Your Account dialog, using your SAML IdP credentials. You can
copy the URL by selecting  (Copy).

You must use a private session to log onto the URL; for example, guest mode in Chrome. This ensures that when you log on
to the dialog and select SAP Datasphere, you are prompted to log in and do not reuse an existing browser session.

 Note
When starting the verification step, you will see a new screen when logging into SAP Datasphere. Two links will be
displayed on this page. One will link to your current IdP and the other will link to the new IdP you will switch to. To
perform the Verify Account step, use the link for the new IdP. Other SAP Datasphere users can continue logging on with
the current IdP. Once you have completed Step 16 and the IdP switch has completed, this screen will no longer appear.

If you can log on successfully, the SAML IdP setup is correct.

13. In the Verify Your Account dialog, select Check Verification.

If the verification was successful, a green border should appear around the Login Credential box.

14. [Optional] Enter a password management URL.

The URL should link to the password management page of your SAML IdP.

15. [Optional] Configure the logout by choosing one of the following logout options:

IdP Logout: Log out of your SAML IdP.

Application log out: Log out of SAP Datasphere and remain signed in to your IdP system.

 Note
By default, when users log out of SAP Datasphere, they are automatically logged out of their SAML IdP.

16. Select  (Save).

The Convert to SAML Single Sign-On confirmation dialog will appear.

17. Select Convert.

When the conversion is complete, you will be logged out and directed to the logon page of your SAML IdP.

18. Log on to SAP Datasphere with the credentials you used for the verification step.

19. From the side navigation, go to  (Security) →  (Users) and look for the column of the User Attribute you selected in
step 9.

The values in this column should be a case-sensitive match with the NameID sent by your IdP's SAML assertion.

 Note
If you selected Custom SAML User Mapping as User Attribute, you must manually update all fields in the SAML User
Mapping column.

Results
Users will be able to use SAML SSO to log onto SAP Datasphere.

 Note
You can also set up your IdP with your Public Key Infrastructure (PKI) so that you can automatically log in your users with a
client side X.509 certificate.

Next Steps
Switching to a Different Custom IdP: If SAML SSO is enabled and you would like to switch to a different SAML IdP, you can repeat
the above steps using the new SAML IdP metadata.

Update SAML Signing Certificates (Legacy Custom IdP)


You can update the SAML identity provider (IdP) signing certificate. If you are using a custom IdP for authentication, you may need
to update your SAML IdP signing certificate.

A common use case is to upload new metadata from your identity provider when a new signing certificate has been generated.

If you can't sign in to SAP Datasphere, you can use the Identity Provider Administration tool to upload new metadata or download
your tenant signing certificate.

Prerequisites
You must have the metadata file that contains the new certificate from your custom IdP, and you must be logged into SAP
Datasphere before your IdP switches over to using the new certificate.

You must be the system owner in SAP Datasphere.

Update the SAML IdP Signing Certificate Using the Security Page
Upload new metadata to reconfigure trust between your custom IdP and your SAP Datasphere system.

1. From the side navigation, go to  (System) →  (Administration) → Security.

2. Select  (Edit).

3. Under Step 2, select Update and provide the new metadata file.

4. Select  (Save) and confirm the change to complete the update.

The update will take effect within two minutes.

You do not have to redo Step 3 or Step 4 on the Security tab.

Update the SAML IdP Signing Certificate Using the Identity Provider Administration
Tool
Upload new metadata to reconfigure trust between your custom IdP and your SAP Datasphere system using the Identity Provider
Administration tool.

1. Sign in to the Identity Provider Administration tool (see Access the Identity Provider Administration Tool).

2. On the card for the tenant that you want to update, select  Repair IdP .

3. Select Upload new metadata for the current custom identity provider.

4. Click Browse to select the new metadata file for your current custom identity provider.

5. Click Upload File to upload the provided metadata file. After the upload is successful, it can take up to five minutes for the
new metadata file to be applied.

6. Click Step 3 to proceed to the validation step.

7. Click Log into SAP Datasphere to open a new tab and navigate to your SAP Datasphere system.

If you have any sign in problems related to the identity provider configuration, as opposed to a user-specific problem, you can
return to the Identity Provider Administration tool and either re-upload the metadata file or revert to the default identity provider.
For more information, see Revert to Default Authentication (Legacy Custom IdP).

Reacquire the SAP Datasphere SAML Signing Certificate Using the Identity Provider
Administration Tool
If you need to reacquire your system metadata, it can be downloaded from the Identity Provider Administration tool.

1. Sign in to the Identity Provider Administration tool (see Access the Identity Provider Administration Tool).

2. On the card for the tenant that you want to use, select  Repair IdP .

3. Select Download to get the SAP Datasphere metadata.

Revert to Default Authentication (Legacy Custom IdP)


You can revert your tenant to the default authentication method.

To revert your custom IdP authentication back to the default method, you can use the Security page in SAP Datasphere. If you are
having problems signing in, you can use the Identity Provider Administration tool.

Revert to the Default Authentication Method Using the Security Page


You can revert your custom IdP to the default authentication method directly in the Security page of SAP Datasphere.

1. From the side navigation, go to  (System)→  (Administration) → Security.

2. Select  (Edit) .

3. In the Authentication Method area, select SAP Cloud Identity (default).

The system will perform a check to see if all users also exist in SAP Cloud Identity.

If users do not exist in SAP Cloud Identity, you will be prompted to Synchronize Users.

A progress bar will track the status of the synchronization process.

 Note
A user validation error may occur if users do not have a valid email address, or if duplicate email addresses are found.
User validation errors must be corrected on the Users page before synchronization can be completed.

4. Select  (Save) .

When conversion is complete, you will be logged out and directed to the SAP Cloud Identity logon page.
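
The user validation checks mentioned in the note (missing or malformed email, duplicate email) can be approximated locally before starting the reversion. A sketch with illustrative rows:

```python
# Sketch: pre-check a user list for the validation errors described above:
# a missing/malformed email address, or the same address used twice.
# The user IDs and addresses are illustrative placeholders.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def find_email_problems(users):
    """Return user IDs with an invalid email or a duplicated one."""
    problems = {"invalid": [], "duplicate": []}
    seen = {}
    for user_id, email in users:
        if not email or not EMAIL_RE.match(email):
            problems["invalid"].append(user_id)
            continue
        seen.setdefault(email, []).append(user_id)
    for email, ids in seen.items():
        if len(ids) > 1:
            problems["duplicate"].extend(ids)
    return problems

users = [("ANNA", "anna@example.invalid"),
         ("BOB", "anna@example.invalid"),
         ("CARL", "not-an-email")]
print(find_email_problems(users))
```

Fixing such entries on the Users page first avoids the synchronization being blocked mid-way.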

Revert to the Default Authentication Method Using the Identity Provider


Administration Tool
You can use the Identity Provider Administration tool to revert your tenant to the default authentication method.

1. Sign in to the Identity Provider Administration tool. For more information, see Access the Identity Provider Administration
Tool.

2. On the card for the tenant that you want to revert, select  Repair IdP .

3. Select Revert to the Default Identity Provider.

4. Select the Yes radio button to revert to the default IdP.

5. Select Yes in the confirmation dialog to revert your authentication method back to the default IdP.

6. Click Step 2 to proceed to the user synchronization step.

7. Click Synchronize Users.

The system will perform a check to see if all custom SAML IdP users also exist in SAP Cloud Identity. If any users do not
exist in SAP Cloud Identity, you will be prompted to Synchronize Users. A progress bar will track the status of the
synchronization process.

 Note
A user validation error may occur if users do not have a valid email address, or if duplicate email addresses are found.
User validation errors must be corrected before synchronization can be completed.

8. Click Step 3 to proceed to the validation step.

9. Click Log into SAP Datasphere to open a new tab and navigate to your SAP Datasphere system. Log in with your default
identity provider credentials. If you get an error saying “Your profile is not configured”, please create a support ticket under
the component LOD-ANA-BI.

Once the reversion has finished, you can exit the Identity Provider Administration tool and do additional configuration in SAP
Datasphere.

Managing SAP Datasphere Users


You can create and modify users in SAP Datasphere in several different ways.

Prerequisites
To manage users, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

User (CRUD----) - To access the  (Users) area in the  (Security) tool and to create, update, and delete users.

User (-------M) - To assign users to roles.

The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.
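
The privilege masks shown above (for example CRUD---- and -------M) are fixed-width strings in which a letter marks a granted permission and a dash an absent one. A small decoder sketch, assuming only the letter meanings named in this guide (C, R, U, D, M); any other letters are returned verbatim:

```python
# Sketch: decode the permission masks used in this guide.
# Only C, R, U, D, and M are named in the surrounding text; other
# positions exist in the product but are not documented here.
PERMISSION_LETTERS = {
    "C": "Create", "R": "Read", "U": "Update", "D": "Delete", "M": "Manage",
}

def decode_mask(mask: str):
    """Return the granted permissions named by the non-dash letters."""
    return [PERMISSION_LETTERS.get(ch, ch) for ch in mask if ch != "-"]

print(decode_mask("CRUD----"))  # ['Create', 'Read', 'Update', 'Delete']
print(decode_mask("-------M"))  # ['Manage']
```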

Creating Users
You can create users in the following ways:

Method More Information

Create individual users in the Users list Create a User

Import multiple users from a CSV file Import or Modify Users from a File

Modifying Users
You can modify existing users in the following ways:

Modification More Information

Export user data to a CSV file, to synchronize with other systems Export Users

Update the email address a user logs on with Update a User Email Address

Delete users Delete Users

Create a User
You can create individual users in SAP Datasphere.

Prerequisites
You can select one or more roles while you're creating the user. Before creating users, you might want to become
familiar with global and scoped roles. You can still assign roles after you've created the users.

Type of Role Description More Information

Global Roles A role that enables users assigned to it to perform actions that are not space-related, typically a role for
administering the tenant. A standard or custom role is considered global when it includes global privileges.
See Managing Roles and Privileges.

Scoped Roles A role that inherits a set of scoped privileges from a standard or custom role and grants these privileges to
users for use in the assigned spaces. See Create a Scoped Role to Assign Privileges to Users in Spaces.


Context
The method described here assumes that SAP Datasphere is using its default authentication provider. If you are using a custom
SAML Identity Provider, you must provide slightly different information, depending upon how your SAML authentication is
configured.

Procedure
1. Go to  (Expand)  (Security)  (Users).

2. Select  (New) to add a new user to the user management table.

3. Enter a User ID.

Each user needs a unique ID. Only alphanumeric and underscore characters are allowed. The maximum length is 20
characters.

4. Enter the user name details.

Only Last Name is mandatory, but it is recommended that you provide a First Name, Last Name, and Display Name.
The Display Name is what appears on screens.

5. Enter an Email address.

A welcome email with logon information will be sent to this address.

 Note
The Manager column is not relevant for SAP Datasphere users.

6. In the Roles column, select the icon  and choose one or more roles from the list.

If one or more default roles have already been created, you can leave Roles empty. Default roles will be assigned to the user
when you click Save.

7. Select  (Save).

Results
A welcome email including an account activation URL will be sent to the user, so that the user can set an initial password
and access the system. Optionally, you can disable the welcome email notification (see Configure Notifications).

When you create a user, it is activated by default. You may want to deactivate a user in specific cases, for example when a
user is on long-term leave. To deactivate a user, select the relevant check box in the leftmost column of the table, click the
 (Deactivate Users) icon, and optionally select Email users to notify them that their accounts have been deactivated.
Deactivated users cannot log in to SAP Datasphere until they are activated again.

 Note
In addition to the standard workflows, you can also create users via the command line (see Manage Users via the Command
Line).

Import or Modify Users from a File


You can create users or batch-update existing users by importing user data that you have saved in a CSV file.

Prerequisites

The user data you want to import must be stored in a CSV file. At minimum, your CSV file needs columns for UserID, LastName,
and Email, but it is recommended that you also include FirstName and DisplayName.

If you want to assign new users different roles, include a Roles column in the CSV file. The role IDs used for role assignment are
outlined in Standard Roles Delivered with SAP Datasphere.

For existing users that you want to modify, you can create the CSV file by first exporting a CSV file from SAP Datasphere. For more
information, see Export Users.

 Note
The first name, last name, and display name are linked to the identity provider, and can't be changed in the User list page, or
when importing a CSV file. (In the User list page, those columns are grayed out.)

To edit those values, log in as the user and edit that user's profile.

Edit the downloaded CSV file to remove columns whose values you don't want to modify, and to remove rows for users whose
values you don't want to modify. Do not modify the USERID column. This ensures that entries can be matched to existing users
when you re-import the CSV.
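
A minimal import file with the required and recommended columns might be assembled as follows; the user data and role ID are placeholders (look up real role IDs in Standard Roles Delivered with SAP Datasphere):

```python
# Sketch: build a minimal user-import CSV with the required columns
# (UserID, LastName, Email) plus the recommended ones. All values,
# including the role ID, are illustrative placeholders.
import csv
import io

FIELDS = ["UserID", "FirstName", "LastName", "DisplayName", "Email", "Roles"]

rows = [
    {"UserID": "JDOE", "FirstName": "Jane", "LastName": "Doe",
     "DisplayName": "Jane Doe", "Email": "jane.doe@example.invalid",
     "Roles": "EXAMPLE_ROLE_ID"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()
print(csv_text)
```

When re-importing an exported file, the same column names are what the mapping step in the Import Users dialog matches against.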

These are the available mapping parameters when importing CSV user data:

Parameter Description

User ID

First Name

Last Name

Display Name

Email

Manager

Roles

Mobile

Phone

Office Location

Function Area Can be used to refer to a user's team or area within their organization.

Job Title

Clean up notifications older than Set in user settings: when to automatically delete notifications.

Email Notification Set in user settings.

Welcome message Message that is shown to the user on the home screen.

Page tips Enabled/disabled via the help center (deprecated).

Closed Page tips Closed page tips are tracked so that they are not shown again.

Closed Item Picker Tips Closed tooltips are tracked so that they won't be reopened again (for first-time users).

Current Banner Saves which banner is currently showing.

Last Banner The UUID of the last closed banner.

Last Maintenance Banner Version The version when the last maintenance banner was shown.

Marketing email opt in Set in user settings.

Homescreen content is initialized If default tiles have been set for the home screen.

Expand Story Toolbar Set in user settings.

Is user concurrent If the user has a concurrent license.

On the Edit Home Screen dialog, a user can override all the default preferences that have been
set by the administrator for the system (System → Administration → Default Appearance).
These are the preferences:

Override Background Option

Override Logo Option

Override Welcome Message

Override Home Search To Insight

Override Get Started

Override Recent Stories

Override Recent Presentations

Override Calendar Highlights

Procedure
1. Go to  (Expand)  (Security)  (Users).

2. Select  (Import Users) Import Users from File .

3. In the Import Users dialog, choose Select Source File to upload your CSV file.

4. Choose Create Mapping to assign the fields of your user data from the CSV file to the fields in user management.

5. Select the appropriate entries for the Header, Line Separator, Delimiter, and Text Qualifier.

6. Select OK when you've finished mapping.

7. In the Import Users dialog, choose Import to upload your CSV file according to the defined mapping.


Export Users
If you want to synchronize SAP Datasphere user data with other systems, you can export the data to a CSV file.

Procedure
On the Users page of the Security area, choose  (Export).

Results
The system exports all user data into a CSV file that is automatically downloaded to your browser's default download folder.

The CSV file contains these columns:

Column Description

USER_NAME

FIRST_NAME

LAST_NAME

DISPLAY_NAME

EMAIL

MANAGER

ROLES Roles assigned to the user.

SAML_USER_MAPPING SAML property for the user (if SAML enabled).

MOBILE Set in user preferences.

OFFICE_PHONE Set in user preferences.

OFFICE_ADDRESS Set in user preferences.

AGILE_BI_ENABLED_BY_DEFAULT Opt in for the agile data preparation feature.

JOB_TITLE Set in user preferences.

MARKETING_EMAIL_OPT_IN Set in user preferences.

IS_CONCURRENT Licensing attribute to indicate whether the user is consuming a named licensed user account
(0) or a concurrent licensed user account (1).

DEFAULT_APP The application that will launch when you access your SAP Datasphere URL. The default
application can be set in System → Administration → System Configuration or in the user settings.

On the Edit Home Screen dialog, a user can override all the default preferences that
have been set by the administrator for the system (System → Administration →
Default Appearance). These are the preferences:

OVERRIDE_BACKGROUND_OPTION


OVERRIDE_LOGO_OPTION

OVERRIDE_WELCOME_MESSAGE_FLAG

OVERRIDE_HOME_SEARCH_TO_INSIGHT_FLAG

OVERRIDE_GET_STARTED_FLAG

OVERRIDE_RECENT_FILES_FLAG

OVERRIDE_RECENT_STORIES_FLAG

OVERRIDE_RECENT_PRESENTATIONS_FLAG

OVERRIDE_RECENT_APPLICATIONS_FLAG

OVERRIDE_CALENDAR_FLAG

OVERRIDE_FEATURED_FILES_FLAG
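
When synchronizing with another system, often only a few of these columns are needed. Here is a sketch that reduces an export to a USER_NAME-to-EMAIL map; the CSV content is illustrative and shows only a subset of the exported columns:

```python
# Sketch: reduce an exported user CSV to a USER_NAME -> EMAIL mapping
# for synchronization with another system. The rows are placeholders
# and only a subset of the exported columns is shown.
import csv
import io

export = """USER_NAME,FIRST_NAME,LAST_NAME,EMAIL,ROLES
JDOE,Jane,Doe,jane.doe@example.invalid,EXAMPLE_ROLE_ID
BSMITH,Bob,Smith,bob.smith@example.invalid,
"""

def email_map(csv_text: str) -> dict:
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["USER_NAME"]: row["EMAIL"] for row in reader}

mapping = email_map(export)
print(mapping["JDOE"])  # jane.doe@example.invalid
```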

Update a User Email Address


You can update a user email address used for logon.

Context
When you create a user, you must add an email address. The email address is used to send logon information.

Procedure
1. In the side navigation area, click  (Security)  (Users).

2. Select the email address you want to modify, enter the new address, and press Enter or select another cell.

If the email address is already assigned to another user, a warning will appear and you must enter a new address as every
user must be assigned a unique email address.

A new logon email will be sent to the updated address. As long as a user has not logged on to the system with the new email
address, the email address will appear in a pending state in the Users list.

3. If the user has not received the logon email, you can resend it. To do so, select the checkbox corresponding to the
user, click the envelope icon, and click Resend in the dialog box that opens.

Related Information
Create a User
Import or Modify Users from a File

Delete Users
You can delete users.

Procedure
1. In the Users management table, select the user ID you want to delete by clicking the user number in the leftmost column of
the table.

The whole row is selected.

2. Choose  (Delete) from the toolbar.

3. Select OK to continue and remove the user from the system.

Related Information
Create a User
Import or Modify Users from a File
Update a User Email Address

Set a Password Policy for Database Users


Users with the DW Administrator role (administrators) can set a password policy to cause database user passwords to expire after
a specified number of days.

Context
Users with the DW Space Administrator role (space administrators) can create database users in their spaces to allow the
connection of ETL tools to write to and read from Open SQL schemas attached to the space schema (see Integrating Data via
Database Users/Open SQL Schemas).

Procedure
1. In the side navigation area, click  (System)  (Configuration) Security .

2. In the Password Policy Configuration section, enter the number of days after which a database user's password will expire.

After this period, the user will be prompted to set a new password.

 Note
The password policy applies only to database users where the Enable Password Policy property is selected, for both
existing and new users. If a user does not log on with their initial password during this period, they will be deactivated
until their password is reset.
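
The expiry rule amounts to a simple date calculation; a sketch with an illustrative 90-day policy (not a product default):

```python
# Sketch: compute when a database user's password expires under the policy.
# The 90-day interval and the change date are illustrative values only.
from datetime import date, timedelta

def password_expiry(last_changed: date, policy_days: int) -> date:
    """Passwords expire policy_days after they were last set."""
    return last_changed + timedelta(days=policy_days)

print(password_expiry(date(2025, 1, 1), 90))  # 2025-04-01
```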

Managing Roles and Privileges


Assigning roles to your users maintains access rights and secures your information in SAP Datasphere.

Prerequisites
To manage roles and privileges, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

Role (CRUD----) - To access the  (Roles) and  (Authorization Overview) areas in the  (Security) tool and to create,
update, and delete roles.

User (-------M) - To add users to roles.

Spaces (-------M) - To add spaces to scoped roles.

The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.

Introduction to Roles and Privileges


A role is a set of privileges and permissions.

SAP Datasphere delivers a set of standard roles and you can create your own custom roles:

Standard role - A role delivered with SAP Datasphere that includes a set of privileges. As a best practice, a tenant
administrator can use these roles as templates for creating custom roles for different business needs. See Standard Roles
Delivered with SAP Datasphere.

Custom role - A role that a tenant administrator creates to choose specific privileges as needed. See Create a Custom Role.

Each standard or custom role is either a global role or a template for scoped roles:

Global role - A role that enables users assigned to it to perform actions that are not space-related, typically a role for
administering the tenant. A standard or custom role is considered global when it includes global privileges. A
tenant administrator can assign a global role to the relevant users. See Assign Users to a Role.

Scoped role - A role that inherits a set of privileges from a standard or custom role and assigns them to one or more users
for one or more spaces. Users assigned to a scoped role can perform actions in the assigned spaces. A tenant
administrator can create a scoped role. See Create a Scoped Role to Assign Privileges to Users in Spaces.

For more information on global and scoped privileges, see Privileges and Permissions.


Users have relevant privileges depending on which actions they can do in the spaces.

Lisa administers the SAP Datasphere tenant.

Claret administers the SAP Datasphere tenant and also has modeler privileges in the two spaces Sales Europe and Sales
US.

Jorge has purchasing modeler privileges in the Purchasing space and has viewer privileges in the Worldwide Purchasing
space.

Maeve and Ahmed have modeler privileges in the two spaces Sales Europe and Sales US.

Lucia has modeler privileges in the Sales Europe space.

Standard Roles Delivered with SAP Datasphere


SAP Datasphere is delivered with several standard roles. A standard role includes a predefined set of privileges and permissions.

A DW Administrator can use standard roles as templates for creating custom roles with a different set of privileges (see Create a
Custom Role). You can also use the standard roles that include scoped privileges as templates for creating scoped roles (see
Create a Scoped Role to Assign Privileges to Users in Spaces). You can assign the standard roles that contain global privileges
(such as DW Administrator, Catalog Administrator and Catalog User) directly to users.

 Note
You cannot delete or edit standard roles.

In the side navigation area, click  (Security)  (Roles). The following standard roles are available:

Roles providing privileges to administer the SAP Datasphere tenant:

System Owner - Includes all user privileges to allow unrestricted access to all areas of the application. Exactly one
user must be assigned to this role.

DW Administrator - Can create users, roles and spaces and has other administration privileges across the SAP
Datasphere tenant. Cannot access any of the apps (such as the Data Builder).

Roles providing privileges to work in SAP Datasphere spaces:

DW Space Administrator (template) - Can manage all aspects of the spaces users are assigned to (except the
Space Storage and Workload Management properties) and can create data access controls.

DW Scoped Space Administrator - This predefined scoped role is based on the DW Space Administrator role
and inherits its privileges and permissions.

 Note
Users who are space administrators primarily need scoped permissions to work with spaces, but they
also need some global permissions (such as Lifecycle when transporting content packages). To provide
such users with the full set of permissions they need, they must be assigned to a scoped role (such as the
DW Scoped Space Administrator) to receive the necessary scoped privileges, but they also need to be
assigned directly to the DW Space Administrator role (or a custom role that is based on the DW Space
Administrator role) in order to receive the additional global privileges.

DW Integrator (template) - Can integrate data via connections and can manage and monitor data integration in a
space.

DW Scoped Integrator - This predefined scoped role is based on the DW Integrator role and inherits its
privileges and permissions.

DW Modeler (template) - Can create and edit objects in the Data Builder and Business Builder and view data in
objects.

DW Scoped Modeler - This predefined scoped role is based on the DW Modeler role and inherits its privileges
and permissions.

DW Viewer (template) - Can view objects and view data output by views that are exposed for consumption in
spaces.

DW Scoped Viewer - This predefined scoped role is based on the DW Viewer role and inherits its privileges
and permissions.

Roles providing privileges to consume the data exposed by SAP Datasphere spaces:

DW Consumer (template) - Can consume data exposed by SAP Datasphere spaces, using SAP Analytics Cloud, and
other clients, tools, and apps. Users with this role cannot log into SAP Datasphere. It is intended for business
analysts and other users who use SAP Datasphere data to drive their visualizations, but who have no need to access
the modeling environment.
the modeling environment.

DW Scoped Consumer - This predefined scoped role is based on the DW Consumer role and inherits its
privileges and permissions.

Roles providing privileges to work in the SAP Datasphere catalog:

Catalog Administrator - Can set up and implement data governance using the catalog. This includes connecting the
catalog to source systems for extracting metadata, building business glossaries, creating tags for classification, and
publishing enriched catalog assets so all catalog users can find and use them. Must be used in combination with
another role such as DW Viewer or DW Modeler for the user to have access to SAP Datasphere.

Catalog User - Can search and discover data and analytics content in the catalog for consumption. These users may
be modelers who want to build additional content based on official, governed assets in the catalog, or viewers who
just want to view these assets. Must be used in combination with another role such as DW Viewer or DW Modeler for
the user to have access to SAP Datasphere.

Role providing privileges to use AI features in SAP Datasphere:

DW AI Consumer - Can use SAP Business AI features.

 Note
To activate SAP Business AI features in your SAP Datasphere tenant, see Enable SAP Business AI for SAP Datasphere.

 Note
Please do not use the roles DW Support User and DW Scoped Support User as they are reserved for SAP Support.

Users are assigned roles in particular spaces via scoped roles. One user may have different roles in different spaces depending on
the scoped role they're assigned to. See Create a Scoped Role to Assign Privileges to Users in Spaces.

Roles and Licenses


The standard roles are grouped by the license type they consume and each user's license consumption is determined solely by the
roles that they've been assigned. For example, a user who has been assigned only the DW Administrator standard role consumes
only a SAP Datasphere license.

Planning Professional, Planning Standard, and Analytics Hub are SAP Analytics Cloud-specific license types. For more information, see Understand Licenses, Roles, and Permissions in the SAP Analytics Cloud documentation.

Privileges and Permissions


A privilege represents a task or an area in SAP Datasphere and can be assigned to a specific role. The actions that can be
performed in the area are determined by the permissions assigned to a privilege.

This topic contains the following sections:

Overview

Global Privileges and Permissions

Scoped Privileges and Permissions

Permissions

Overview

A role represents the main tasks that a user performs in SAP Datasphere. Each role has a set of privileges with appropriate levels
of permissions. The privileges represent areas of the application like the Space Management or the Business Builder and the files
or objects created in those areas.

The standard roles provide sets of privileges and permissions that are appropriate for that role. For example, the DW
Administrator role has all the Spaces permissions, while the DW Viewer role has none.

You can use the standard roles (see Standard Roles Delivered with SAP Datasphere) and create your own custom roles to group
together other sets of privileges and permissions (see Create a Custom Role).

Global versus scoped privileges - Global privileges are privileges that are used at the tenant level and are not space-related, and
can therefore be included in a global role, typically a tenant administrator role. Scoped privileges are privileges that are space-
related and can therefore be included in a scoped role.

Global Privileges and Permissions


The following table lists the privileges and their permissions that can be included in a global role.

Global Privileges and Permissions

(C=Create, R=Read, U=Update, D=Delete, E=Execute, M=Maintain, S=Share, M=Manage)

Privilege Permissions Description

Spaces (C------M) - Allows access to spaces in the Space Management tool.

Create - To create spaces and elastic compute nodes.

To perform actions on spaces, you need a combination of permissions for the privilege Spaces and for other privileges. See Roles and Privileges by App and Feature.

Manage - To read, update and delete all spaces and elastic compute nodes.

 Caution
The permission Manage should be granted only to tenant administrators.

 Note
The permissions Read, Update and Delete are scoped permissions and are described in the scoped privileges and permissions table (see Scoped Privileges and Permissions).

See Managing Your Space

Space Files (-------M) - Allows access to all objects inside a space, such as views and tables.

Manage - To view objects and data in all spaces.

 Note
To perform actions on spaces, you need a combination of permissions for the privilege Spaces and for other privileges. See Roles and Privileges by App and Feature.

 Caution
The permission Manage should be granted only to tenant administrators.

 Note
The permissions Create, Read, Update and Delete are scoped permissions and are described in the scoped privileges and permissions table (see Scoped Privileges and Permissions).

See Managing Your Space

Data Warehouse General (-R------) - Allows users to log into SAP Datasphere. Included in all standard roles except for DW Consumer.

Data Warehouse Runtime (-R--E---)

Read - Allows users of the View Analyzer to download the generated SQL analyzer plan file. See Exploring Views with View Analyzer.

Execute - Not in use.

Data Warehouse AI Consumption (----E---) - Allows users to use SAP Business AI features. See Enable SAP Business AI for SAP Datasphere.

 Note
To enable SAP Business AI in your SAP Datasphere tenant, go to SAP note 3522010.

Other Datasources (----E---) - Some connection types require this privilege. For more information, see Permissions in the SAP Analytics Cloud Help.

Role (CRUD----) - Allows access to Security Roles and Security Authorization Overview. See Managing Roles and Privileges and View Authorizations by User, Role, or Space.

User (CRUD---M) - Allows access to lists of users.

R (Read) - To see a list of users in a dialog, for example when choosing which users to share a story with, or when choosing users to add to a team.

To open the Security Users tool, you need all four permissions (CRUD----). If you have only the Read permission, you cannot see the list of users in Security Users.

 Note
The permissions are included in the DW Administrator role. When you create a custom role based on the DW Administrator role, the permissions are automatically included and you cannot edit them.

M (Manage) - To permit assigning users to roles, and approving role assignment requests from users.

See Managing SAP Datasphere Users

Activity Log (-R-D----) - Allows access to the Activities page in the Security tool.

Read - To view the activities in the Activities page and download the activity log for a specific time period.

Delete - To delete the activity log for a specific time period.

See Monitor Object Changes with Activities

Lifecycle (-R---MS-) - Allows users to import content from the Content Network and to import and export content via the Transport tool.

M (Maintain) - Allows importing packages from the Content Network and Transport Import.

M (Maintain) and S (Share) - Allows exporting packages via Transport Export.

 Note
The permissions -R---MS- are included in the DW Administrator role. When you create a custom role based on the DW Administrator role, the permissions are automatically included and you cannot edit them.

See Importing SAP and Partner Business Content from the Content Network and Transporting Content Between Tenants

System Information (-RU-----)

Read - To access the Configuration area in read-only mode in the System tool.

Update - To access the Administration and Configuration areas in the System tool.

Catalog Asset (CRUD---M)

Create: Add an asset.

Read: Access the catalog and view asset details; search assets, favorites, and recents; filter linked tags, terms, and KPIs.

Update: Edit the asset name and description.

Delete: Remove an asset from the catalog.

Manage: View published and unpublished assets; publish or unpublish assets.

Catalog Glossary (CRUD----)

Create: Use with the Update permission to create a glossary.

Read: Use with Catalog Glossary Object to view the term details and glossary list, create a category, and search for terms, favorites, and assets.

Update: Edit a glossary.

Delete: Delete a glossary.

Catalog Glossary Object (CRUD---M)

Create: Create a term.

Read: Use with Catalog Glossary to view the term details and glossary list, create a category, and search for terms, favorites, and assets.

Update: Edit a term.

Delete: Delete a term.

Manage: View published and unpublished terms; publish or unpublish a term.

Catalog Tag Hierarchy (CRUD----)

Create: Create a tag hierarchy.

Read: View tag hierarchies and search for tags.

Update: Edit tag hierarchies.

Delete: Delete a tag hierarchy.

Catalog System (CRUDE---)

Create: Create a system.

Read: View the systems overview.

Update: Configure and update a system.

Delete: Delete a system.

Execute: Synchronize source systems manually.

Catalog KPI Object (CRUD---M)

Create: Use with Catalog KPI Template with Read permission to create a KPI.

Read: Use with Catalog KPI Template with Read permission to view KPI details, search for KPIs, favorites, and recents, and filter KPIs on linked terms.

Update: Use with Catalog KPI Template with Read permission to update a KPI.

Delete: Delete a KPI.

Manage: View published and unpublished KPIs; publish or unpublish KPIs.

Catalog KPI Template (-RU-----)

Read: Use with Catalog KPI Object with Read permission to view KPI details, search for KPIs, favorites, and recents, and filter KPIs on linked terms.

Update: Edit the KPI template.

Catalog Log (-R------)

Read: View and search extraction logs for assets and batch job details.

Cloud Data Product (------S-)

Share: Use with Catalog Asset with Read permission to:

Share an SAP Business Data Cloud data product with external users.

View the sharing details of the data product.

Edit the authorized users who the data product is shared with.

Delete or remove the sharing access for the data product.

Scoped Privileges and Permissions


The following table lists the privileges and their permissions that can be included in a scoped role.

 Note
Some permissions require others and may automatically set them. For example, setting the Delete permission for the Data
Warehouse Data Builder privilege automatically sets the Read permission as well.

Scoped Privileges and Permissions

(C=Create, R=Read, U=Update, D=Delete, E=Execute, M=Maintain, S=Share, M=Manage)

Privilege Permissions Description

Spaces (-RUD----) - Allows access to spaces in the Space Management tool.

Read - To view the Space Management.

Update, Delete - To update or delete spaces.

 Note
The permissions Create and Manage are global permissions and are described in the global privileges and permissions table (see Global Privileges and Permissions).

See Managing Your Space

Space Files (CRUD----) - Allows access to all objects inside a space, such as views and tables.

Read - To view the objects in spaces.

Create, Update, Delete - To create, update or delete objects in spaces.

To view certain space properties or perform actions on spaces, you need a combination of permissions for the privilege Spaces and for other privileges. See Roles and Privileges by App and Feature.

 Note
The permission Manage is a global permission and is described in the global privileges and permissions table (see Global Privileges and Permissions).

See Managing Your Space

Data Warehouse Data Builder (CRUD--S-) - Allows access to all objects in the Data Builder app.

Users with the Share permission can share objects to other spaces.

Also allows access to the Data Sharing Cockpit app with the Create, Read and Update permissions.

See Acquiring Data in the Data Builder, Preparing Data in the Data Builder, Modeling Data in the Data Builder and Sharing Entities and Task Chains to Other Spaces.

Data Warehouse Connection (CRUD----) - Allows access to remote and run-time objects:

Read - To view remote tables in the Data Builder.

Create, Update and Delete - To create, update, or delete a connection in the Connections app, in addition to the corresponding Space Files permission.

See Integrating Data via Connections and Acquiring Data in the Data Builder

 Note
The following feature needs an additional permission:

Select a location ID - [Link]

The privilege is neither included in the DW Integrator nor in the DW Space Administrator role. If you need to select a location ID, ask your tenant administrator to either assign your user to a global role that is based on the DW Administrator role or to assign your user to a custom global role (with license type SAP Datasphere) that includes the required [Link] privilege.

Data Warehouse Data Integration (-RU-E---) - Allows access to the Data Integration Monitor app:

Read - To view the Data Integration Monitor.

Update - To perform any one-off data replication/persistence actions in the Data Integration Monitor or Data Builder, and to redeploy views in the Data Builder where data persistence is used (including in the view lineage).

Execute - To work with schedules.

 Note
In addition to these permissions, the following Data Integration Monitor actions require the Data Warehouse Data Builder (Read) privilege:

To set up or change partitioned data loading in the Remote Tables monitor or in the Views monitor.

To start the View Analyzer in the Views monitor.

See Managing and Monitoring Data Integration

 Note
To run and schedule flows, you must have the privilege Data Warehouse Data Integration with Read, Update and Execute permissions.

Data Warehouse Business Catalog - Not in use.

Data Warehouse Data Access Control (CRUD----) - Allows access to data access controls in the Data Builder app:

Create, Update and Delete - To create, update, or delete a data access control in the editor.

Read - To use a data access control to protect a view.

See Securing Data with Data Access Controls

Data Warehouse Business Builder (-R------) - Allows access to the Business Builder app. See Modeling Data in the Business Builder.

Data Warehouse Business Entity (CRUD----) - Allows access to business objects (dimensions and facts) defined in the Business Builder. See Creating a Business Entity.

Data Warehouse Authorization Scenario (CRUD----) - Allows access to authorization scenarios defined in the Business Builder. Authorization scenarios are modeling abstractions for Data Access Controls. See Authorization Scenario.

Data Warehouse Fact Model (CRUD----) - Allows access to fact models defined in the Business Builder. Fact models are shaped like consumption models but offer reusability in other consumption models. See Creating a Fact Model.

Data Warehouse Consumption Model (CRUD----) - Allows access to consumption models inside the Business Builder. Consumption models comprise perspectives, which are presented as DWC_CUBE objects in the file repository. See Creating a Consumption Model.

Data Warehouse Folder (CRUD----) - Allows access to folders defined in the Business Builder. Folders are used to organize objects inside the Business Builder. See Business Builder Start Page.

Data Warehouse Consumption (-RU-E---) - Allows access to data in modeling objects:

R (Read) - Read data output by Data Builder views that have the Expose for Consumption switch enabled, and data in Business Builder fact models and consumption models. Users with this permission may not preview data in local or remote tables, in views that are not exposed for consumption, in sources or intermediate nodes of graphical views (even if those views are exposed for consumption), or in Business Builder business entities. This permission is given to users with the standard DW Viewer role (who have read-only access to SAP Datasphere) and users with the DW Consumer role (who do not have access to SAP Datasphere and merely consume exposed data in SAP Analytics Cloud and other analytics clients).

U (Update) - Upload data from a CSV file to a local table. For local tables with delta capture enabled, updates are tracked in the "Change Type" column.

E (Execute) - Access data in all Data Builder and Business Builder objects and edit data in Data Builder local tables.

See Consuming Data Exposed by SAP Datasphere

Data Warehouse General (-R------) - Allows users to log into SAP Datasphere. Included in all standard roles except for DW Consumer.

Scoped Role User Assignment (-------M) - Allows managing user assignment in a space.

M (Manage):

To see the Users area in the spaces assigned to the scoped role, in addition to Spaces Read.

To edit the Users area in the spaces assigned to the scoped role, in addition to Spaces Update.

 Note
This privilege is displayed and available for selection only in a scoped role and is selected by default in the predefined scoped role DW Scoped Space Administrator.

See Create a Scoped Role to Assign Privileges to Users in Spaces

Translation (CR-D----) - Allows access to the Translation tool:

C (Create) - Lets you select objects to translate, translate manually, download and upload translations using XLIFF files, or review translations.

R (Read) - Lets you access the Translation tool.

D (Delete) - Lets you delete the translations.

 Note
Custom roles cannot be assigned this privilege.

Data Warehouse Graph Modeler - Not in use.

Permissions
The following table displays the available permissions and their definitions.

Permission Description

Create Permits creating new objects of this item type. Users need this permission to create spaces, views or tables,
upload data into a story, or upload other local files.

Read Permits opening and viewing an item and its content.

Update Permits editing and updating existing items. Compare this permission with the Maintain permission, which
doesn't allow changes to the data structure. Note: some object types need the Maintain permission to update
data. See the Maintain entry.

Delete Permits deletion of the item.

Execute Permits executing the item to run a process, such as schedules.

Maintain Permits the maintenance of data values, for example adding records to a model, without allowing changes to
the actual data structure. Compare this permission with the Update permission, which does allow changes to
the data structure.
When granted on the Lifecycle privilege, permits importing and exporting objects.

Share Permits the sharing of the selected item type.

Manage When granted on Spaces and Space Files, permits viewing all spaces and their content (including data), regardless of whether the user is assigned to the space or not.


To perform actions on spaces, you need the Manage permission in combination with other permissions for
Spaces and other privileges. See Roles and Privileges by App and Feature.

 Caution
This permission should be granted only to tenant administrators.

Roles and Privileges by App and Feature


Review the standard roles and the privileges needed to access apps, tools, and other features of SAP Datasphere.

This topic contains the following sections:

Granting Privileges via Global and Scoped Roles

Apps

Administration Tools

Space Management Privileges and Permissions

External Data Consumption

The Command Line Interface

Granting Privileges via Global and Scoped Roles

A user is granted a set of global privileges for the tenant via a global role. The global role can be:

A standard global role that is delivered with SAP Datasphere (such as DW Administrator).

A custom role that you create from a template (a standard global role or another custom role containing global privileges).

To assign a user to a global role, see Assign Users to a Role.

A user is granted a set of scoped privileges for one or more spaces via a scoped role. The scoped role inherits a role template, which can be:

A standard scoped role template that is delivered with SAP Datasphere (such as DW Space Administrator).

A custom role template that you create from another template (a standard scoped role or another custom role).

To assign a user to a scoped role, see Create a Scoped Role to Assign Privileges to Users in Spaces.

 Note
For complete lists of standard roles, privileges and permissions, see:

Standard Roles Delivered with SAP Datasphere

Privileges and Permissions
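The split described above — tenant-wide privileges from a global role plus space-specific privileges from scoped roles — can be sketched as a union per space. The following Python model is purely illustrative (not SAP code); the privilege subsets, user, and space names are invented for the example:

```python
# Illustrative model (not an SAP API): a user's effective privileges in a space
# are the union of tenant-wide global privileges and the scoped privileges
# granted for that space. The same user can hold different scoped roles, and
# therefore different privileges, in different spaces.
global_roles = {
    # Simplified subset of the privileges this guide lists for DW Administrator.
    "DW Administrator": {"Spaces", "User", "Role", "Lifecycle", "System Information"},
}
scoped_roles = {
    # Simplified, invented subsets for the example.
    "DW Scoped Modeler": {"Data Warehouse Data Builder", "Space Files"},
    "DW Scoped Viewer": {"Data Warehouse Consumption", "Space Files"},
}

# Hypothetical user: one global role, different scoped roles in two spaces.
user = {
    "global_role": "DW Administrator",
    "scoped_assignments": {"SALES": "DW Scoped Modeler", "HR": "DW Scoped Viewer"},
}

def effective_privileges(user: dict, space: str) -> set[str]:
    """Union of tenant-wide global privileges and the scoped privileges for one space."""
    privileges = set(global_roles.get(user["global_role"], set()))
    scoped_role = user["scoped_assignments"].get(space)
    if scoped_role:
        privileges |= scoped_roles[scoped_role]
    return privileges

print(sorted(effective_privileges(user, "SALES")))
```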

Apps
To access an app, tool, or editor, a user must have a global or scoped role inheriting from a role template which contains the listed
privileges:

App Requires Privileges (Permissions)… Granted by Role Template...

 (Home) - See The SAP Datasphere Homepage. Requires: Data Warehouse General (-R------). Granted by: all roles except DW Consumer; DW Viewer (read-only access).

 (Repository Explorer) - See Repository Explorer. Requires: Space Files (-R------). Granted by: all roles except DW Consumer; DW Viewer (read-only access).

 (Catalog & Marketplace) - See Governing and Publishing Data in the Catalog.

Requires: Catalog Asset (CRUD---M), Catalog Glossary (CRUD----), Catalog Glossary Object (CRUD---M), Catalog KPI Object (CRUD---M), Catalog KPI Template (-RU-----), Catalog Tag Hierarchy (CRUD----), Catalog System (CRUDE---), Catalog Log (-R------), Cloud Data Product (------S-).

Granted by: Catalog Administrator; Catalog User (read-only access for all privileges, except no access for Catalog System, Catalog Log, and Cloud Data Product).

In addition, the following sub-tools require the Catalog Administrator role: Tag Hierarchies, Monitoring.


 (Data Marketplace) - See Purchasing Data from Data Marketplace.

Requires: Spaces (-R------), Space Files (CRUD----), Data Warehouse Connection (CRUD--S-), Data Warehouse Data Integration (-RU-E---), Data Warehouse Data Builder (CRU-----).

Granted by: DW Integrator, DW Modeler; DW Administrator, DW Space Administrator and DW Viewer: read-only access.

 (Semantic Onboarding) - See Semantic Onboarding.

Requires: Data Warehouse General (-R------). Each section requires a specific permission:

SAP Systems: Data Warehouse Data Builder (CRU-----), Data Warehouse Business Entity (CRU-----), Data Warehouse Consumption Model (CRU-----)

Content Network: Lifecycle (-R---MS-)

Data Products: See Data Marketplace, above.

Granted by: DW Viewer (read-only access), DW Space Administrator (all sections), DW Modeler (SAP Systems and Data Products).

 (Business Builder) - Start page, Dimension editor, Fact editor, Fact model editor, Consumption model editor, Authorization scenario editor. See Modeling Data in the Business Builder.

Each page or editor requires a separate permission:

Start page: Data Warehouse Business Builder (-R------)

Dimension editor: Data Warehouse Business Entity (CRUD----)

Fact editor: Data Warehouse Business Entity (CRUD----)

Fact model editor: Data Warehouse Fact Model (CRUD----)

Consumption model editor: Data Warehouse Consumption Model (CRUD----)

Authorization scenario editor: Data Warehouse Authorization Scenario (CRUD----)

The following feature needs an additional permission (which is included in the DW Modeler role):

Preview data from any object in the Data Preview screen - Data Warehouse [Link]

 Note
The DW Viewer role includes Data Warehouse [Link], which allows these users to preview only data from fact models and consumption models.

Granted by: DW Space Administrator, DW Modeler, DW Viewer (read-only access).

 (Data Builder) - Start Page, Table editor, Graphical view editor, SQL view editor, Entity-relationship model editor, Data flow editor, Transformation flow editor, Replication flow editor, Analytic model editor, Intelligent lookup editor, Task chain editor, Data access control editor. See: Acquiring Data in the Data Builder, Preparing Data in the Data Builder, Modeling Data in the Data Builder and Securing Data with Data Access Controls.

All pages and editors share a single permission: Data Warehouse Data Builder (CRUD--S-).

The following features need additional permissions (which are included in the DW Modeler role):

Preview data from any object in the Data Preview panel - Data Warehouse [Link]

 Note
The DW Viewer role includes Data Warehouse [Link], which allows these users to preview only data output by views with the Expose for Consumption switch enabled.

Upload data in a local table - Data Warehouse [Link] or Data Warehouse Data [Link]

Access the local table Data Editor screen - Data Warehouse Data [Link]

See remote objects in Data Builder editors - Data Warehouse [Link]

The following features need additional permissions (which are included in the DW Integrator role):

Run an intelligent lookup - Data Warehouse Data [Link]

Run a task chain - Data Warehouse Data [Link]

Delete data in a local table - Data Warehouse Data [Link]

The following feature needs an additional permission (which is included in the DW Space Administrator role):

Create, update, and delete a data access control - Data Warehouse Data Access Control (CRUD----)

 Note
The DW Modeler role includes Data Warehouse Data Access [Link], which allows them to apply an existing data access control to a view.

Granted by: DW Space Administrator, DW Modeler, DW Viewer (read-only access).

 (Data Integration Monitor) - See Managing and Monitoring Data Integration.

Requires: Data Warehouse Data Integration (-RU-E---).

 Note
Data Warehouse Data [Link] allows you to do only manual integration tasks. The DW Integrator role includes Data Warehouse Data [Link], which also allows scheduling automated integration tasks.

The following features need additional permissions (which are included in the DW Space Administrator role):

Views monitor, Define partitions - Data [Link]

Views monitor, View Analyzer - Data [Link]

Views monitor, Generate SQL Analyzer Plan File - Data [Link]

Granted by: DW Space Administrator, DW Integrator, DW Modeler (manual tasks only), DW Viewer (read-only access).

 (Connections) - See Integrating Data via Connections.

Requires: Data Warehouse Connection (CRUD--S-).

The following feature needs an additional permission (which is included in the DW Administrator role):

Select a location ID - [Link]

Granted by: DW Space Administrator, DW Integrator, DW Modeler (read-only access), DW Viewer (read-only access).

Administration Tools

To access an app, tool, or editor, a user must have a global or scoped role inheriting from a role template which contains the listed
privileges:

Tool Requires Privileges (Permissions)… Granted by Role Template...

(Space Management) - See Preparing Your Space and Integrating Data.

Requires: Spaces (CRUD---M).

 Note
For detailed information on permissions for Spaces, see Space Management Privileges and Permissions.

Granted by: DW Administrator (can create spaces), DW Space Administrator; DW Integrator and DW Modeler have read-only access to the page for their space (though they cannot see all its properties).

 (System Monitor) - See Monitoring SAP Datasphere. Requires: System Information (-RU-----). Granted by: DW Administrator.

 (Translation) - See Translating Metadata for SAP Analytics Cloud. Requires: Translation (CR-D----). Granted by: DW Space Administrator, DW Modeler (read-only access).

 (Security) - Users (see Managing SAP Datasphere Users), Roles (see Managing Roles and Privileges), Authorization Overview (see View Authorizations by User, Role, or Space), Activities (see Monitor Object Changes with Activities).

The sub-tools require the following permissions:

Users: User (CRUD---M)

Roles: Role (CRUD----)

Authorization Overview: Role (CRUD----)

Activities: Activity Log

Granted by: DW Administrator (read-only access for the sub-tool Activities).

 (Transport) - See Transporting Content Between Tenants. Requires: Lifecycle (-R---MS-). Granted by: DW Administrator, DW Space Administrator.

 (Data Sharing Cockpit) - See Data Marketplace - Data Provider's Guide.

Requires: Data Warehouse Data Builder (CRU-----).

 Note
To create a new data provider profile, or edit an existing one, you must have the Spaces (Update) permission assigned to your role. See Maintaining your Data Provider Profile.

Granted by: DW Modeler, DW Space Administrator.

 (System) - Configuration, Administration, About. See Administering SAP Datasphere.

Requires: System Information (-RU-----). System Information (-R------) gives access to the Configuration area in read-only mode. System Information (--U-----) gives access to the Configuration and Administration areas.

Granted by: DW Administrator.

 Note
Users with any role can view the About dialog.

Space Management Privileges and Permissions


Users with different roles have different levels of access to the Space Management tool:

A user with a consumer role cannot log into SAP Datasphere.

A user with a viewer role can log into SAP Datasphere, but has no Spaces permissions and cannot see the Space
Management tool.

A user with a modeler or integrator role has Spaces (-R------) permission. They have read-only access to the page for
their space (though they cannot see all its properties).

A user with a space administrator role has Spaces (-RUD----) permissions. They can see all the space properties, and
edit those outside the General Settings and Workload Management sections.

A user with an administrator role has Spaces (CRUD---M) permissions. They can create spaces and edit some space
properties, including modifying the storage allocated and the space priority.

Various privileges and permissions are required to see and edit different parts of the Space Management tool:

 Note
In addition to all the privileges listed in the table below that are required to work with the Space Management tool, the
following privileges are required:

Data Warehouse General (-R------) (both global and scoped privilege) - To access SAP Datasphere.

Global privilege Space Files (-------M) or scoped privilege Space Files (-R------) - To view objects in your space.

The global privilege Spaces (-------M) enables users with a global role to perform the following actions in all the
spaces of the tenant: read, update and delete.

Action Requires Privilege (Permission) Granted by Role Template...

Create a Space - Requires global privileges Spaces (C------M) and User (-R------). Granted by: DW Administrator. See Create a Space.

View Space Properties - Requires global privilege Spaces (-------M) or scoped privilege Spaces (-R------). Granted by: DW Administrator and DW Space Administrator.
This is custom documentation. For more information, please visit SAP Help Portal. 89
7/9/25, 8:45 AM

    Note: In addition, you also need the following permissions to view these properties:
        Users: Global privilege Role (-R------) or scoped privilege Scoped Role User Assignment (-------M)
        Data Consumption and Database Users: Global privilege Spaces (-------M) or scoped privilege Spaces (-R------)
        HDI Containers: Scoped privileges Spaces (-R------) and Data Warehouse Connection (-R------) (a DW Administrator cannot see the HDI Containers area in a space)
        Time Data: Scoped privileges Spaces (-R------) and Data Builder (-R------) (a DW Administrator cannot see the Time Data area in a space)
        Auditing: Global privilege Spaces (-------M) or scoped privilege Spaces (-R------)
    Note: A user with a role based on the DW Modeler or DW Integrator role template has read-only access to the page for their space but cannot view all its properties.

Action: Modify General Settings (except for Space Storage)
    Requires: Global privilege Spaces (-------M) or scoped privilege Spaces (-RU-----)
    Granted by role template: DW Administrator and DW Space Administrator
    See Create a Space

Action: Modify Space Storage, Data Lake Access, Workload Management
    Requires: Global privilege Spaces (-------M)
    Granted by role template: DW Administrator
    See Create a Space, Allocate Storage to a Space and Set Priorities and Statement Limits for Spaces or Groups

Action: Modify Users
    Requires: Global privileges Spaces (-------M) and Role (-------M), or scoped privileges Spaces (--U-----) and Scoped Role User Assignment (-------M)
    Granted by role template: DW Administrator and DW Space Administrator
    See Control User Access to Your Space

Action: Modify Data Consumption and Database Users
    Requires: Global privilege Spaces (-------M) or scoped privilege Spaces (-RU-----)
    Granted by role template: DW Administrator, DW Space Administrator
    Note: A user with a role based on the DW Integrator role template needs in addition the privilege Spaces (--U-----) to create database users.
    See Create a Database User

Action: Modify HDI Containers
    Requires: Scoped privileges Spaces (--U-----) and Data Warehouse Connection (--U-----)
    Granted by role template: DW Space Administrator
    Note: A DW Administrator cannot access the HDI Containers area in a space.
    See Prepare Your HDI Project for Exchanging Data with Your Space

Action: Modify Time Data
    Requires: To update time data: scoped privileges Spaces (--U-----) and Data Builder (--U-----). To delete time data: scoped privileges Spaces (--U-----) and Data Builder (---D----)
    Granted by role template: DW Space Administrator
    Note: A DW Administrator cannot access the Time Data area in a space.
    See Create Time Data and Dimensions

Action: Modify Auditing
    Requires: Global privilege Spaces (-------M) or scoped privilege Spaces (-RU-----)
    Granted by role template: DW Administrator and DW Space Administrator
    See Logging Read and Change Actions for Audit

Action: Monitor a Space
    Requires: Global privilege Spaces (-------M) or scoped privilege Spaces (-R------)
    Granted by role template: DW Administrator, DW Space Administrator, DW Integrator and DW Modeler
    See Monitor Your Space Storage Consumption

Action: Lock or Unlock a Space
    Requires: Global privilege Spaces (-------M) or scoped privilege Spaces (--U-----)
    Granted by role template: DW Administrator and DW Space Administrator
    See Unlock a Locked Space

Action: Delete a Space
    Requires: Global privileges Spaces (-------M) and User (-------M), or scoped privileges Spaces (-RUD----) and Scoped Role User Assignment (-------M)
    Granted by role template: DW Administrator and DW Space Administrator
    Note: A user with a space administrator role can delete only the spaces they're assigned to via a scoped role. A user with a tenant administrator role can delete any space as Spaces (-------M) is included in the role.
    See Delete Your Space
Catalog Role Privilege Dependencies

When creating a custom role for using or administering the catalog, you must set the permissions for the privileges in certain ways
so that you can complete various tasks. Review the following table of tasks to see which permissions and privilege combinations
you need.

To be able to access the Catalog app from the side navigation, all custom catalog roles need the Read permission on Catalog
Asset.


 Note
All custom catalog roles need the SAP Datasphere read permission on Space Files to allow users to mark assets, terms, and
KPIs as their favorite.

Category: Assets
    Task: Search for an asset and view the detailed information for it.
    Required privileges: Catalog Asset: (-R------)
    See Searching for Data Products and Assets in the Catalog

Category: Assets
    Task: View detailed information for an asset, including the details for any term, tag, or KPI that is linked.
    Required privileges: Catalog Asset: (-R------); Catalog Glossary: (-R------); Catalog Glossary Object: (-R------); Tag Hierarchy: (-R------); Catalog KPI Object: (-R------); Catalog KPI Template: (-R------)
    See Evaluating and Accessing Catalog Assets

Category: Assets
    Task: Edit the name of the asset that appears in the catalog.
    Required privileges: Catalog Asset: (-RU-----); Catalog Tag Hierarchy: (-R------)
    See Enriching and Managing Catalog Assets

Category: Assets
    Task: Add a catalog description for the asset.
    Required privileges: Catalog Asset: (-RU-----); Catalog Tag Hierarchy: (-R------)
    See Enriching and Managing Catalog Assets

Category: Assets
    Task: Add a term, tag, or KPI relationship to the asset from the asset's detailed information page.
    Required privileges: Catalog Asset: (-RU-----); Catalog Tag Hierarchy: (-R------); Catalog Glossary Object: (-R------); Catalog KPI Object: (-R------)
    See Enriching and Managing Catalog Assets

Category: Assets
    Task: Create or delete a relationship between a tag and an asset.
    Required privileges: Catalog Asset: (-RU-----); Catalog Tag Hierarchy: (-R------)
    See Manage Tag Relationships for Assets

Category: Assets
    Task: Manage the relationship for a term and an asset.
    Required privileges: Catalog Glossary Object: (-R------); Catalog Asset: (-RU-----)
    See Create and Manage Glossary Terms

Category: Assets
    Task: Manage the relationship for a KPI and an asset.
    Required privileges: Catalog KPI Object: (-R------); Catalog Asset: (-RU-----)
    See Create and Manage Key Performance Indicators

Category: Assets
    Task: Publish/Unpublish assets to the catalog or exclude assets from being automatically published.
    Required privileges: Catalog Asset: (-R-----M)
    See Publishing Content to the Catalog

Category: Tags
    Task: Add a tag relationship to the asset from the asset's detailed information page.
    Required privileges: Catalog Asset: (-RU-----); Catalog Tag Hierarchy: (-R------)
    See Enriching and Managing Catalog Assets

Category: Tags
    Task: Create a tag hierarchy.
    Required privileges: Catalog Tag Hierarchy: (CRU-----)
    See Manage Hierarchies and Tags

Category: Tags
    Task: Edit a tag hierarchy.
    Required privileges: Catalog Tag Hierarchy: (-RU-----)
    See Manage Hierarchies and Tags

Category: Tags
    Task: Delete a tag hierarchy.
    Required privileges: Catalog Tag Hierarchy: (-R-D----)
    See Manage Hierarchies and Tags

Category: Tags
    Task: Create or delete a relationship between a tag and an asset.
    Required privileges: Catalog Asset: (-RU-----); Catalog Tag Hierarchy: (-R------)
    See Manage Tag Relationships for Assets

Category: Glossary
    Task: Create a glossary.
    Required privileges: Catalog Glossary: (C-------)
    See Create and Manage a Glossary

Category: Glossary
    Task: Edit a glossary.
    Required privileges: Catalog Glossary: (-RU-----)
    See Create and Manage a Glossary

Category: Glossary
    Task: Delete a glossary.
    Required privileges: Catalog Glossary: (-R-D----)
    See Create and Manage a Glossary

Category: Glossary
    Task: Create a glossary category.
    Required privileges: Catalog Glossary: (-R------); Catalog Glossary Object: (C-------)
    See Create and Manage a Glossary Category

Category: Glossary
    Task: Edit a glossary category.
    Required privileges: Catalog Glossary: (-R------); Catalog Glossary Object: (-RU-----)
    See Create and Manage a Glossary Category

Category: Glossary
    Task: Delete a glossary category.
    Required privileges: Catalog Glossary: (-R------); Catalog Glossary Object: (-R-D----)
    See Create and Manage a Glossary Category

Category: Terms
    Task: Create a glossary term.
    Required privileges: Catalog Glossary: (-R------); Catalog Glossary Object: (C-------)
    See Create and Manage Glossary Terms

Category: Terms
    Task: Edit a glossary term.
    Required privileges: Catalog Glossary: (-R------); Catalog Glossary Object: (-RU-----)
    See Create and Manage Glossary Terms

Category: Terms
    Task: Delete a glossary term.
    Required privileges: Catalog Glossary: (-R------); Catalog Glossary Object: (-R-D----)
    See Create and Manage Glossary Terms

Category: Terms
    Task: Publish or unpublish a glossary term.
    Required privileges: Catalog Glossary Object: (-R-----M)
    See Create and Manage Glossary Terms

Category: Terms
    Task: Manage the relationship for a term and an asset.
    Required privileges: Catalog Glossary Object: (-R------); Catalog Asset: (-RU-----)
    See Create and Manage Glossary Terms

Category: KPIs
    Task: Create a KPI.
    Required privileges: Catalog KPI Object: (C-------); Catalog KPI Template: (-R------)
    See Create and Manage Key Performance Indicators

Category: KPIs
    Task: Edit a KPI.
    Required privileges: Catalog KPI Object: (-RU-----); Catalog KPI Template: (-R------)
    See Create and Manage Key Performance Indicators

Category: KPIs
    Task: Delete a KPI.
    Required privileges: Catalog KPI Object: (-R-D----)
    See Create and Manage Key Performance Indicators

Category: KPIs
    Task: Publish/Unpublish a KPI.
    Required privileges: Catalog KPI Object: (-R-----M)
    See Create and Manage Key Performance Indicators

Category: KPIs
    Task: Manage the relationship for a KPI and an asset.
    Required privileges: Catalog KPI Object: (-R------); Catalog Asset: (-RU-----)
    See Create and Manage Key Performance Indicators

Category: KPIs
    Task: Create a KPI category.
    Required privileges: Catalog KPI Object: (C-------); Catalog KPI Template: (-R------)
    See Create and Manage Key Performance Indicator Categories

Category: KPIs
    Task: Edit a KPI category.
    Required privileges: Catalog KPI Object: (-RU-----); Catalog KPI Template: (-R------)
    See Create and Manage Key Performance Indicator Categories

Category: KPIs
    Task: Delete a KPI category.
    Required privileges: Catalog KPI Object: (-R-D----); Catalog KPI Template: (-R------)
    See Create and Manage Key Performance Indicator Categories

Category: KPIs
    Task: Edit the KPI template.
    Required privileges: Catalog KPI Template: (-RU-----)
    See Define the Key Performance Indicator Template

Category: Marketplace Data Products
    Task: Search for a data marketplace data product, view the detailed information for it, and install it to a space.
    Required privileges: Spaces: (-R------); Space Files: (CRUD----); Data Warehouse Connection: (CRUD----); Data Warehouse Data Integration: (-RU-----); Data Warehouse Data Builder: (CRU-----)
    See Searching for Data Products and Assets in the Catalog and Evaluating and Installing Marketplace Data Products

Category: SAP Business Data Cloud Data Products
    Task: Search for an SAP Business Data Cloud data product and view the detailed information for it.
    Required privileges: Catalog Asset: (-R------)
    See Searching for Data Products and Assets in the Catalog and Evaluating and Installing Data Products

Category: SAP Business Data Cloud Data Products
    Task: Search for an SAP Business Data Cloud data product, view the detailed information for it, and install it to a space and use it.
    Required privileges: Catalog Asset: (-R------); Spaces: (-R------); Space Files: (CRUD----); Data Warehouse Data Builder: (CRU-----)
    See Evaluating and Installing Data Products

Category: SAP Business Data Cloud Data Products
    Task: Search for an SAP Business Data Cloud data product, view the detailed information for it, and share it with external users.
    Required privileges: Catalog Asset: (-R------); Cloud Data Product: (------S)
    See Searching for Data Products and Assets in the Catalog and Evaluating and Installing Data Products
Here are a few examples of catalog roles and permissions.

For users who review assets: update asset names and descriptions, add tags, and publish assets.
    Include these privileges in the custom role: Catalog Asset: (-RU----M); Catalog Tag Hierarchy: (-RU-----)

For users who manage and publish glossaries, terms, and KPIs, and also add terms and KPI relationships to assets.
    Include these privileges in the custom role: Catalog Asset: (-RU-----); Catalog Glossary: (CRUD----); Catalog Glossary Object: (CRUD---M); Catalog KPI Object: (CRUD---M)

For users who manage terms within existing glossaries and manage tags, but do not add these relationships to assets.
    Include these privileges in the custom role: Catalog Asset: (-R------); Catalog Glossary: (-R------); Catalog Glossary Object: (CRUD---M); Catalog Tag Hierarchy: (CRUD----)
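The example roles above can be modeled as privilege-to-permission maps to check whether a role covers the privileges a task requires. This is an illustrative sketch only: the role and task data come from the tables above, while the structure and the `role_covers` helper are hypothetical, not an SAP API:

```python
# Illustrative sketch (not an SAP API): model the asset-reviewer example role
# from the table above as a privilege -> permission-set map.
asset_reviewer = {
    "Catalog Asset": {"Read", "Update", "Manage"},          # (-RU----M)
    "Catalog Tag Hierarchy": {"Read", "Update"},            # (-RU-----)
}

def role_covers(role: dict, required: dict) -> bool:
    """True if the role grants every required permission of every privilege."""
    return all(perms <= role.get(privilege, set())
               for privilege, perms in required.items())

# Task from the catalog table: "Edit the name of the asset" needs
# Catalog Asset (-RU-----) and Catalog Tag Hierarchy (-R------).
edit_asset_name = {
    "Catalog Asset": {"Read", "Update"},
    "Catalog Tag Hierarchy": {"Read"},
}

# Task: "Publish/Unpublish assets" needs Catalog Asset (-R-----M).
publish_assets = {"Catalog Asset": {"Read", "Manage"}}

# Task: "Edit a glossary" needs Catalog Glossary (-RU-----).
edit_glossary = {"Catalog Glossary": {"Read", "Update"}}

print(role_covers(asset_reviewer, edit_asset_name))  # True
print(role_covers(asset_reviewer, publish_assets))   # True
print(role_covers(asset_reviewer, edit_glossary))    # False
```

This mirrors the tables: the reviewer role can rename and publish assets but cannot edit glossaries, because it has no Catalog Glossary permissions.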

External Data Consumption


Users can consume data exposed by SAP Datasphere if they are assigned to a space via a scoped role and have the Space Files (-R------) permission.

Action: Consume data in SAP Analytics Cloud, Microsoft Excel, and other clients, tools, and apps
    Requires: Space Files (-R------)
    Granted by role template: All roles
    Note: If a user does not need to access SAP Datasphere itself, and only wants to consume data exposed by it, they should be granted the DW Consumer role.
    See Consuming Data Exposed by SAP Datasphere

The Command Line Interface

To use the command line interface (see Manage Spaces via the Command Line), a user must have the following standard role or a
custom role containing the listed privileges:

Command: datasphere dbusers
    Requires: Spaces (-RU-----)
    Contained in standard role: DW Administrator

Command: datasphere marketplace
    Requires: Data Builder (CRUD----)
    Contained in standard role: DW Modeler

Command: datasphere objects
    Requires: Data Builder (CRUD----); Data Warehouse Business Entity (CRUD----); Data Warehouse Fact Model (CRUD----); Data Warehouse Consumption Model (CRUD----); Data Warehouse Authorization Scenario (CRUD----)
    Contained in standard role: DW Modeler

Command: datasphere spaces (create a space and set storage, workload priority)
    Requires: Spaces (C------M); User (-R------)
    Contained in standard role: DW Administrator

Command: datasphere spaces (update/delete spaces, update space users)
    Requires: To update/delete spaces: Spaces (-RUD---M). To update space users: Team (-RUD---M) and Scoped Role User Assignment (-------M)
    Contained in standard role: DW Administrator and DW Space Administrator

Command: datasphere tasks
    Requires: Data Warehouse Data Integration (-RU-E---)
    Contained in standard role: DW Integrator

Command: datasphere users
    Requires: User (CRUD---M)
    Contained in standard role: DW Administrator

Command: datasphere configuration certificates
    Requires: System Information (-RU-----)
    Contained in standard role: DW Administrator

Command: datasphere spaces connections
    Requires: Data Warehouse Connection (CRUD----)
    Contained in standard role: DW Integrator

Create a Custom Role


You can create a custom role using either a blank template or a standard role template and choosing privileges and permissions as
needed.

Prerequisites
To create a custom role, you need the DW Administrator role.

Context

You can create a custom role to enable users to do either global actions on the tenant or actions that are specific to spaces.

If you create a custom role for global purposes, you should include only global privileges and permissions. You can then
assign the role to the relevant users.

If you create a custom role for space-related purposes, you should include only scoped privileges and permissions. As a
second step, you need to create a scoped role based on this custom role to assign users and spaces to the set of privileges
included. See Create a Scoped Role to Assign Privileges to Users in Spaces.

You should not mix global and scoped privileges in a custom role.

If you include a scoped privilege in a custom role that you create for global purposes, the privilege is ignored.

If you include a global privilege in a custom role that you want to use as a template for a scoped role, the privilege is ignored.

 Note
Some users, such as space administrators, primarily need scoped permissions to work with spaces, but they also need some
global permissions (such as Lifecycle when transporting content packages). To provide such users with the full set of
permissions they need, you can include both the relevant global privileges and scoped privileges in the custom role you will use
as a template for the scoped role. Each space administrator is then assigned to the scoped role to receive the necessary
scoped privileges, but they are also assigned directly to the custom role in order to receive the additional global privileges.

For more details about global and scoped privileges, see Privileges and Permissions.
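The "ignored privilege" rule above can be sketched as a simple filter. The classification of specific privileges as global or scoped below is illustrative only; see Privileges and Permissions for the actual lists:

```python
# Sketch of the rule above: privileges that don't match the role's purpose are
# silently ignored. The two sets below are illustrative assumptions, not the
# official classification - consult "Privileges and Permissions" for that.

GLOBAL_PRIVILEGES = {"User", "Role", "System Information"}         # assumed
SCOPED_PRIVILEGES = {"Data Builder", "Data Warehouse Connection"}  # assumed

def effective_privileges(privileges: set[str], purpose: str) -> set[str]:
    """Keep only privileges matching the role's purpose; the rest are ignored."""
    allowed = GLOBAL_PRIVILEGES if purpose == "global" else SCOPED_PRIVILEGES
    return privileges & allowed

mixed = {"User", "Data Builder"}
print(effective_privileges(mixed, "global"))  # {'User'}
print(effective_privileges(mixed, "scoped"))  # {'Data Builder'}
```

The mixed role above keeps only User when assigned directly, and only Data Builder when used as a template for a scoped role, which is why mixing the two kinds in one custom role is discouraged.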

Procedure
1. Go to  (Expand)  (Security)  (Roles).

2. To create a custom role, click  (Add Role) and select Create a Custom Role.

3. In the Create a New Role dialog, complete the following properties:

Name: Enter a unique name for the role. The name can only contain upper and lower case letters, numbers, and underscores and its maximum length is 20 characters.

Description: [optional] Enter a description, which can be changed at any time. The description can only contain upper and lower case letters, numbers, spaces, and dashes and its maximum length is 155 characters.

License Type: Select SAP Datasphere.

4. Click Create.

5. Select a role template.

The role templates are the predefined standard roles associated with the SAP Datasphere license type. If you wish to create
a role without extending a predefined standard role, choose the blank template. After you select a template, a page opens
showing you the individual permissions assigned to the privileges that have been defined for the role template you chose.

6. Select the permissions for your new role for every privilege type. The privileges represent an area, app, or tool in SAP Datasphere, while the permissions (create, read, update, delete, execute, maintain, share, and manage) represent the actions a user can perform. For more details about global and scoped privileges, see Privileges and Permissions.

7. [optional] If you want to change the role template that your new custom role will be based on, select  (Select Template),
and choose a role.

8. [optional] To define the custom role as a default role, which will be assigned to all new users when no other role is assigned
to them, select  (Role Configuration) and select the option Use as Default Role.

 Note
The option Enable Self-Service is not relevant for SAP Datasphere.

9. Save your new custom role.



 Note
You can assign the role to a user from the Users page or - only if you've created a custom role for global purposes (and
not for space-related purposes) - from the Roles page. Whether you create users first or roles first does not matter. See
Assign Users to a Role.

Create a Scoped Role to Assign Privileges to Users in Spaces


A scoped role inherits a set of scoped privileges from a standard or custom role and grants these privileges to users for use in the
assigned spaces.

This topic contains the following sections:

Introduction to Scoped Roles

Create a Scoped Role

Add Spaces to a Scoped Role

Remove Spaces from a Scoped Role

Add Users to a Scoped Role

Remove Users from a Scoped Role

Introduction to Scoped Roles


A user with the DW Administrator role can create scoped roles.

A DW Administrator can assign a role to multiple users in multiple spaces in a single scoped role. As a consequence, a user can have different roles in different spaces: for example, a modeler in the spaces Sales Germany and Sales France and a viewer in the space Europe Sales.

You can create a scoped role based on a standard role or on a custom role. In both cases, the scoped role inherits the privileges
from the standard or custom role. You cannot edit the privileges of a scoped role or of a standard role. You can edit the privileges of
a custom role. To create a scoped role with a different set of privileges, create a custom role with the set of privileges wanted and
then create the scoped role from the custom role. You can then change the privileges of the custom role as needed, which will also
change the privileges of all the scoped roles that are based on the custom role.

Users who are granted the DW Space Administrator role via a scoped role can add or remove users to or from their spaces and the
changes are reflected in the scoped roles. See Control User Access to Your Space.

We recommend that you create scoped roles by logical groups of spaces.

In the following example, the DW administrator begins assigning users to the three Sales spaces by creating the appropriate
scoped roles:

She creates three scoped roles based on standard and custom roles and assigns the users to the spaces as follows:

Scoped role: Sales Modeler
    Role (template): DW Modeler standard role
    Users and spaces: Sally (Sales Europe); Bob (Sales US)

Scoped role: Senior Sales Modeler
    Role (template): Custom role "Senior Modeler" based on the DW Modeler standard role, plus these privileges (and permissions): Data Warehouse Data Integration (Execute); Data Warehouse Connection (Create, Read, Update and Delete)
    Users and spaces: Jim (Sales Europe)

Scoped role: Sales Spaces Admin
    Role (template): DW Space Administrator standard role, plus this privilege (permission): Scoped Role User Assignment (Manage)
    Users and spaces: Joan (Sales US, Sales Asia)

If Bob no longer needs to work in the space Sales US, the DW administrator can unassign Bob from Sales US in the scoped role
Sales Modeler.

As Joan has the role of space administrator for the space Sales US, she can also unassign Bob from Sales US directly in the space
page (in the Space Management). The user assignment change is automatically reflected in the Sales Modeler scoped role.

Later on, Bob needs the space administration privileges for the space Sales Asia. From the page of the space Sales Asia, Joan assigns Bob to the space with the Sales Spaces Admin scoped role.

For more information on scoped roles, see the blog Preliminary Information SAP Datasphere – Scoped Roles (published in September 2023).
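The example above can be sketched as a small data model, showing how one user can hold different roles in different spaces and how unassigning a user from a space in a scoped role removes that access. The names come from the example; the structure itself is illustrative, not an SAP API:

```python
# Illustrative data model (not an SAP API) for the scoped-role example above:
# a scoped role bundles a role template with user-to-space assignments.

scoped_roles = {
    "Sales Modeler": {
        "template": "DW Modeler",
        "assignments": {"Sally": {"Sales Europe"}, "Bob": {"Sales US"}},
    },
    "Sales Spaces Admin": {
        "template": "DW Space Administrator",
        "assignments": {"Joan": {"Sales US", "Sales Asia"}},
    },
}

def roles_in_space(user: str, space: str) -> list[str]:
    """Role templates the user holds in the given space via scoped roles."""
    return [sr["template"] for sr in scoped_roles.values()
            if space in sr["assignments"].get(user, set())]

print(roles_in_space("Bob", "Sales US"))    # ['DW Modeler']

# Unassigning Bob from Sales US in the scoped role removes his access there:
scoped_roles["Sales Modeler"]["assignments"]["Bob"].discard("Sales US")
print(roles_in_space("Bob", "Sales US"))    # []
```

Whether the unassignment is done by the DW administrator in the scoped role or by Joan in the space page, the result is the same: the user-to-space assignment is removed from the scoped role.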

Create a Scoped Role

 Note
In addition to the standard workflows, you can also create scoped roles and assign scopes and users to them via the command
line (see Manage Scoped Roles via the Command Line).

1. In the side navigation area, click  (Security)  (Roles).

2. Click  (Add Role) and select Create a Scoped Role.

 Note
As an alternative to creating a scoped role, you can use one of the predefined scoped roles that are delivered with SAP
Datasphere in the Roles page and directly assign spaces and users to them.

3. In the Create a New Role dialog, complete the following properties:

Name: Enter a unique name for the role. The name can only contain upper and lower case letters, numbers, and underscores and its maximum length is 20 characters.

Description: [optional] Enter a description, which can be changed at any time. The description can only contain upper and lower case letters, numbers, spaces, and dashes and its maximum length is 155 characters.

License Type: [read-only] Shows SAP Datasphere.

4. Click Create.

5. Select the role template, which can either be a standard role template or a custom role and click Save.

6. As your scoped role inherits privileges from the template you've chosen, you cannot edit the privileges, except for the single privilege Scoped Role User Assignment (Manage). If you're creating a scoped role for space administration purposes, you should select this privilege, which allows users to manage user assignment in a space.

You can then assign spaces and users to the new scoped role. The spaces and users must be created beforehand and you must
assign spaces before assigning users to them.

 Note
If you're creating a scoped role to assign space administration privileges to certain users in certain spaces, you can do either of the following:

Create a scoped role based on the standard role template DW Space Administrator and, to allow user assignment, select the privilege (permission) Scoped Role User Assignment (Manage), which is the only privilege you can select, as the rest of the privileges are inherited from the template. Then, assign one or more spaces and one or more users to the spaces.

Open the predefined scoped role DW Scoped Space Administrator and assign one or more spaces and one or more
users to the spaces. Scoped Role User Assignment (Manage) is selected by default.

The users can manage the spaces they're assigned to.

Add Spaces to a Scoped Role

To add spaces to a scoped role, the spaces must be created beforehand.

1. In the side navigation area, click  (Security)  (Roles) and click your scoped role to open it.

2. Click [number] Scopes, select one or more spaces in the dialog Scopes and click Save.

 Note
By default, all users of the scoped role are automatically assigned to the spaces you've just added. You can change this
and assign only certain members to certain spaces in the Users page of the scoped role.

Remove Spaces from a Scoped Role


1. In the side navigation area, click  (Security)  (Roles) and click your scoped role to open it.

2. Click [number] Scopes.

3. In the Selected Scopes area of the dialog Scopes, click the cross icon for each space that you want to remove from the role,
then click Save.

All users that were assigned to the spaces you've just removed are automatically removed from the scoped role.


Add Users to a Scoped Role

To add users to a scoped role, the users must be created beforehand.

1. In the side navigation area, click  (Security)  (Roles) and click your scoped role to open it.

2. Click Users. All user assignments are displayed in the Users page.

To individually select users and assign them to spaces, click  (Add Users to Scopes), then Add New Users to
Scopes. Select one or more users in the wizard Add Users to Scopes and click Next Step.

 Note
By default, the added users are automatically assigned to all the spaces included in the scoped role. If you want
to modify this, select the one or more spaces to which you want to assign the users.

Click Next Step and Save.

 Note
You can also add a user to a scoped role from the  (Users) area. In such a case, the user is automatically
assigned to all the spaces included in the scoped role. See Assign Users to a Role.

To assign all users included in the scoped role to one or more spaces, click  (Add Users to Scopes), then Add All Current Users to Scopes. Select one or more spaces in the wizard Add Users to Scopes, then click Next Step and Save.

To assign all users of the tenant to one or more spaces, click  (Add Users to Scopes), then Add All Users to Scopes.
Select one or more spaces in the wizard Add Users to Scopes and click Next Step and Save.

 Restriction
A user can be assigned to a maximum of 100 spaces across all scoped roles.

 Note
In the Users page, you can filter users and spaces to see, for example, the spaces and roles to which a user is assigned.

Once you've assigned a user to a space with the DW Space Administrator role via a scoped role, this user can manage the users of their space directly in the space's page (in the Space Management). See Control User Access to Your Space.

Remove Users from a Scoped Role


1. In the side navigation area, click  (Security)  (Roles) and click your scoped role to open it.

2. Click Users. All user assignments are displayed in the Users page.

3. Select the relevant rows (each row corresponds to a combination of one user and one space) and click the garbage icon. The users can no longer access the spaces to which they were previously assigned in the scoped role.

Assign Users to a Role


You can assign an individual user to a role (global or scoped) in the Users page, and you can assign several users to a global role at the same time in the Roles page.


Prerequisites
Users with an administrator role can assign roles to users in the Users and Roles pages.

Assign an Individual User to a Role

You can assign an individual user to a role (global or scoped) in the Users page.

1. In the side navigation area, click  (Security)  (Users).

2. On the Users page, find the required user.

3. In the user's row, select the  icon in the Roles column. A list of Available Roles will appear.

The icon is not available if the user has the system owner role, which means that, from the Security Users page, you
cannot assign an additional role to a user who has the system owner role. You can do so from the Security Roles page
(see Create a Scoped Role to Assign Privileges to Users in Spaces).

4. Select one or more roles.

5. Select OK.

 Note
If you assign a user to a scoped role, be aware that the user is automatically assigned to all the spaces included in the scoped
role. You can change the user assignment in the scoped role. See Create a Scoped Role to Assign Privileges to Users in Spaces.

Assign Several Users to a Global Role

You can assign several users to a global role at the same time in the Roles page.

 Note
This is not relevant for scoped roles. For information about how to assign users to spaces in a scoped role, see Create a Scoped
Role to Assign Privileges to Users in Spaces.

1. In the side navigation area, click  (Security)  (Roles).

2. Find the role that you want to assign.

3. At the bottom of the role box, click the link Add Users.

4. Select one or more users from the Assign Role to User dialog.

5. Select OK.

Assign Users to a Role Using SAML Attributes


You can create a SAML role mapping to automatically assign users to a specific role based on their SAML attributes.

For example, you want to give a specific role to all employees that are assigned to a specific cost center. Once you've done the role
mapping, if new users are assigned to the cost center in the SAML identity provider (IdP), the users will be automatically assigned
to the role when logging onto SAP Datasphere via SAML authentication.

Prerequisites

Your custom SAML Identity Provider (IdP) must be configured and the authentication method selected must be SAML Single Sign-On (SSO) in  (System) →  (Administration) → Security. See Enabling a Custom SAML Identity Provider (Legacy Custom IdP).

Procedure
1. In the side navigation area, click  (Security)  (Roles).

2. Select a role (or open the role) and click (Open 'SAML Role Mapping').

3. Under Conditions, select a SAML Attribute, select a Condition, and enter a Value if required.

4. (Optional) Select + (New mapping definition) to add additional mappings to the role assignment.

For each additional mapping, under Conditions, select a SAML Attribute, select a Condition, and enter a Value if required.

Under Conditions Logic, select AND or OR.

If AND is selected, the conditions for all attributes must be met for the mapping to be applied. If OR is selected, the
conditions for only one of the attributes must be met for the mapping to be applied.

The selected role will be applied to all users who meet the specified conditions when logging onto SAP Datasphere via
SAML authentication. If the selected role was previously assigned to a user, but the user does not meet the specified
conditions, the role will be revoked when the user logs in.

 Note
If a user is assigned to a scoped role via SAML attributes, the user is automatically assigned to all the spaces included in
the scoped role.

In the Roles page, a dedicated icon in the role tile is displayed, indicating that the users are assigned to the role via SAML
attributes. When you hover over the icon, the conditions defined for the role are displayed.
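The condition logic described above can be sketched as follows. The attribute names and condition operators used here are illustrative assumptions, not the exact options offered in the SAML Role Mapping dialog:

```python
# Sketch of the mapping logic described above (illustrative, not SAP code):
# each condition tests a SAML attribute, and the conditions logic combines
# the results with AND or OR to decide whether the role is assigned.

def role_applies(attributes: dict, conditions: list, logic: str = "AND") -> bool:
    """Evaluate [(attribute, condition, value), ...] against SAML attributes."""
    def check(attr, cond, value):
        actual = attributes.get(attr, "")
        if cond == "equals":
            return actual == value
        if cond == "contains":
            return value in actual
        return False
    results = [check(*c) for c in conditions]
    return all(results) if logic == "AND" else any(results)

user = {"costCenter": "CC-1234", "department": "Finance"}
conditions = [("costCenter", "equals", "CC-1234"),
              ("department", "equals", "Sales")]

print(role_applies(user, conditions, "AND"))  # False (department differs)
print(role_applies(user, conditions, "OR"))   # True  (cost center matches)
```

With AND, every condition must hold; with OR, one match is enough, which is exactly how the Conditions Logic setting behaves.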

View Authorizations by User, Role, or Space


See all the users, roles, and spaces in the tenant and how they relate to each other.

In  (Security)  (Authorization Overview), a user with the DW Administrator global role can see all the users, roles, and
spaces in the tenant and how they relate to each other. You can filter by user, role, or space to see:

which users are assigned with which roles to which spaces,

which users are assigned to which global roles.

Enter a String to Search On


To display information related to one or more terms, enter one or more characters in the Search field and press Enter (or click
Search).

As you type, the field will begin proposing objects and search strings. Click on a string to trigger a search on it.

For example, to display all roles that are assigned to the user Lucia, enter "Lucia" in the Search.

Filter by Criteria

This is custom documentation. For more information, please visit SAP Help Portal. 104
You can filter the list by any of the categories listed in the Filter By area of the left panel: user (in User Name), space (in Scope
Name) and role (in Role Name).

You can select one or more values in each filter category in the Filter By section:

Each value selected in a category acts as an OR condition. For example, to display all roles that are assigned to the users
Lucia and Ahmed, select Lucia and Ahmed in the User Name category.

Values selected in separate categories act together as AND conditions. For example, to display all the scoped roles that
enable Lucia to access the Sales Asia space, select Lucia in the User Name category and Sales Asia in the Scope Name
category.

Create Users and Assign Them to Roles via the SCIM 2.0 API
You can create, read, modify and delete users and add them to roles via the SCIM 2.0 API.

This topic contains the following sections:

Introduction

Log in with an OAuth Client

Obtain a CSRF Token

List Users

Get a Specific User

Create a User

Modify a User

Delete a User

Optional User Properties

Bulk Operations

Get Information About the SCIM API

Introduction

This API allows you to programmatically manage users using a SCIM 2.0 compliant endpoint.

SAP Datasphere exposes a REST API based on the System for Cross-domain Identity Management (SCIM 2.0) specification. This
API allows you to keep your SAP Datasphere system synchronized with your preferred identity management solution.

Using this API, you can perform the following actions:

Create, read, modify and delete users.

Add users to existing scoped or global roles.

 Note
You cannot create new roles using this API.

List users.

Get information on the identity provider, available schemas, and resource types.

This API uses SCIM 2.0. For more information, see SCIM Core Schema.

Log in with an OAuth Client


Before you can log in with an OAuth client, a user with an administrator role must create an OAuth2.0 client in your SAP
Datasphere tenant and provide you with the OAuth client ID and secret parameters.

 Note
The OAuth client must be configured with the following properties:

Purpose: API Access

Access: User Provisioning

Authorization Grant: Client Credentials

See Create an OAuth2.0 Client with an API Access Purpose.

To log in to the OAuth client, send a GET (or POST) request with the following elements:

Request Component Setting Value

Parameter key ?grant_type=client_credentials

Authorization type Basic Auth

Authorization username <OAuth Client ID>

Authorization password <OAuth Client Secret>

Syntax of GET request:

[Link]

 Note
You can find the token URL in  (System) →  (Administration) → App Integration → OAuth Clients → Token URL.

The response body returns the access token, which you'll then use as the bearer token to obtain the CSRF token.
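The request described above can be sketched with Python's standard library. This is an illustrative sketch, not an official SAP client: the token URL, client ID, and secret below are placeholders, and the helper name is invented here.

```python
import base64
import urllib.parse
import urllib.request

def build_token_request(token_url, client_id, client_secret):
    """Build the POST request that exchanges the OAuth client credentials
    for an access token: Basic Auth with the client ID as username and the
    client secret as password, plus grant_type=client_credentials."""
    credentials = base64.b64encode(
        f"{client_id}:{client_secret}".encode("utf-8")
    ).decode("ascii")
    return urllib.request.Request(
        token_url,
        data=urllib.parse.urlencode({"grant_type": "client_credentials"}).encode(),
        headers={"Authorization": f"Basic {credentials}"},
        method="POST",
    )

# No network call is made here; the URL is a placeholder.
req = build_token_request("https://example.invalid/oauth/token",
                          "my-client-id", "my-secret")
```

Sending the request (for example with urllib.request.urlopen) returns a JSON body containing the access token.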

Obtain a CSRF Token


To obtain a CSRF token, send a GET request with the following elements:

Request Component Setting Value

Authorization type Bearer Token

Authorization token <access token retrieved when logging in with the OAuth
client>


Header key x-sap-sac-custom-auth=true

Header key x-csrf-token: fetch

Syntax of GET request:

<tenant_url>/api/v1/csrf

The CSRF token is returned in the x-csrf-token response header. This token can then be included in the POST, PUT, PATCH, or
DELETE request in the x-csrf-token:<token> header.
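Putting the two tokens together, the headers for subsequent write requests might be assembled as follows. This is a sketch: the function name is illustrative, and the Content-Type header is an assumption for JSON payloads.

```python
def write_request_headers(access_token, csrf_token):
    """Headers for POST/PUT/PATCH/DELETE calls to the SCIM endpoints,
    reusing the bearer token from the OAuth login and the CSRF token
    fetched from /api/v1/csrf."""
    return {
        "Authorization": f"Bearer {access_token}",
        "x-sap-sac-custom-auth": "true",
        "x-csrf-token": csrf_token,
        # Assumption: request bodies are sent as JSON.
        "Content-Type": "application/json",
    }

headers = write_request_headers("<access token>", "<csrf token>")
```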

List Users

To retrieve users, use the GET request with the /api/v1/scim2/Users endpoint and the following elements:

Request Component Setting Value

Authorization type Bearer Token

Authorization token <access token retrieved when logging in with the OAuth
client>

Header key x-sap-sac-custom-auth=true

To list all the users existing in the tenant, enter:

[Link]

You can control the list of users to retrieve by using one or more of the following optional URL parameters:

Parameter Description

sortBy Specifies the user attribute to sort the results by.


For example, to retrieve the list of users sorted by user name:

sortBy=userName

sortOrder Specifies the order in which items are returned, either ascending or descending. By
default, an ascending sort order is used.
To retrieve the list of users in descending order:

sortOrder=descending

startIndex Specifies the index of the first user to fetch.


For example, so that the tenth user is the first user retrieved:

startIndex=10

count Specifies the number of users to return on each page.


For example, to display a maximum of 8 users on a page:


count=8

filter=<attribute> Adds a filter to the request.


For example, to display the users whose user name include the letter K:

filter=userName co "K"

See the user schema for available attributes. All operators are supported.
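These parameters can be combined into a single query string, for example with a Python sketch; urlencode takes care of percent-encoding the spaces and quotes in the filter expression:

```python
from urllib.parse import urlencode

# Combine the optional URL parameters for GET /api/v1/scim2/Users.
params = {
    "filter": 'userName co "a"',
    "sortBy": "userName",
    "sortOrder": "descending",
    "startIndex": 1,
    "count": 8,
}
query = urlencode(params)
url = "<tenant_url>/api/v1/scim2/Users?" + query
```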

Example of a GET request with the various parameters:

[Link] co "a"&sortOrder=descending&sta

 Caution
GET requests send personal identifiable information as part of the URL, such as the user name in this case. Consider using the
POST request with the /api/v1/scim2/Users/.search endpoint instead for enhanced privacy of personal information.

Syntax of POST request:

[Link]

 Note
In the response body, if the users listed are assigned to roles, you can identify the roles as they are prefixed with PROFILE.

Get a Specific User

To retrieve a specific user based on its ID, use the GET request with the /api/v1/scim2/Users/<user ID> endpoint and the
following elements:

Request Component Setting Value

Authorization type Bearer Token

Authorization token <access token retrieved when logging in with the OAuth
client>

Header key x-sap-sac-custom-auth=true

To retrieve a specific user based on its ID, enter the GET request:

[Link] ID>

The user ID must be the UUID (universally unique identifier), which you can get by sending the GET request:

[Link]

 Note

In the response body, if the user is assigned to roles, you can identify them by their PROFILE prefix.

Create a User

To create a user, use the POST request with the /api/v1/scim2/Users/ endpoint and the following elements:

Request Component Setting Value

Authorization type Bearer Token

Authorization token <access token retrieved when logging in with the OAuth
client>

Header key x-sap-sac-custom-auth=true

Header key x-csrf-token: csrf token value

Syntax of POST request: [Link]

 Note
The following information is required: userName, name, and emails. Any other information that is not provided
will be either left empty or set to its default value.

If you are using SAML authentication, idpUserId should be set to the property you are using for your SAML mapping. For
example, the user's USER ID, EMAIL, or CUSTOM SAML MAPPING. If your SAML mapping is set to EMAIL, the email address
you add to idpUserId must match the email address you use for email.

To find this information, log on to SAP Datasphere and go to (Security) (Users) .

The userName attribute can only contain alphanumeric and underscore characters. The maximum length is 20 characters.
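The userName constraints above can be checked client-side before sending the request. This is a sketch; the pattern simply encodes the documented rules:

```python
import re

# Documented rules: alphanumeric and underscore characters only,
# maximum length 20.
USERNAME_PATTERN = re.compile(r"^[A-Za-z0-9_]{1,20}$")

def is_valid_username(user_name):
    """Return True if the userName satisfies the documented constraints."""
    return bool(USERNAME_PATTERN.match(user_name))

assert is_valid_username("LGARCIA")
assert not is_valid_username("lisa.garcia")  # dot not allowed
assert not is_valid_username("A" * 21)       # too long
```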

 Note
When creating or modifying a user, you can add optional properties to the user.

The following example shows how to create a new user:

{
"schemas": [
"urn:ietf:params:scim:schemas:core:2.0:User",
"urn:ietf:params:scim:schemas:extension:enterprise:2.0:User"
],
"userName": "LGARCIA",
"name": {
"familyName": "Garcia",
"givenName": "Lisa",
"formatted": "Lisa Garcia"
},
"displayName": "Lisa Garcia",
"preferredLanguage": "en",
"active": true,
"emails": [

{
"value": "[Link]@[Link]",
"type": "work",
"primary": true
}
],
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters": {
"idpUserId": "[Link]@[Link]"
}
}

The following example shows how to create a new user and assign it to a role:

{
"schemas": [
"urn:ietf:params:scim:schemas:core:2.0:User",
"urn:ietf:params:scim:schemas:extension:enterprise:2.0:User"
],
"userName": "LGARCIA",
"name": {
"familyName": "Garcia",
"givenName": "Lisa",
"formatted": "Lisa Garcia"
},
"displayName": "Lisa Garcia",
"preferredLanguage": "en",
"active": true,
"emails": [
{
"value": "[Link]@[Link]",
"type": "work",
"primary": true
}
],
"roles": [
{
"value": "PROFILE:t.V:Sales_Modeler",
"display": "Sales_Modeler",
"primary": true
} ],
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters": {
"idpUserId": "[Link]@[Link]"
}
}

The response body returns the ID of the user created, which is the user UUID (universally unique identifier).

 Note
When creating or modifying a user via the API, you can also assign the user to one or more roles - either global or scoped,
provided that the roles already exist in the tenant:

Before you can add one or more users to a scoped role, one space at least must be assigned to the scoped role.

When a user is added to a scoped role, the user is given access to all the spaces included in the scoped role.

All roles are prefixed with PROFILE. Custom and scoped roles have IDs in the following format: PROFILE:<t.#>:
<role_name>.
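A helper for building roles entries in this format might look like the following sketch; the function name is illustrative, and the tenant segment (such as t.V in the examples above) must match your tenant:

```python
def role_reference(tenant_segment, role_name, primary=True):
    """Build a 'roles' entry in the documented ID format
    PROFILE:<t.#>:<role_name>, e.g. PROFILE:t.V:Sales_Modeler."""
    return {
        "value": f"PROFILE:{tenant_segment}:{role_name}",
        "display": role_name,
        "primary": primary,
    }

entry = role_reference("t.V", "Sales_Modeler")
```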

Modify a User
You can modify a specific user in either of two ways:

To override all information related to a specific user, use a PUT request. The user properties are updated with the properties
you provide and all the properties that you do not provide are either left empty or set to their default value.

To update only some information related to a specific user, use a PATCH request. The user properties are updated with the
changes you provide and all properties that you do not provide remain unchanged.

You can use either the PUT (override) or PATCH (update) request with the /api/v1/scim2/Users/<user ID> endpoint and
the following elements:

Request Component Setting Value

Authorization type Bearer Token

Authorization token <access token retrieved when logging in with the OAuth
client>

Header key x-sap-sac-custom-auth=true

Header key x-csrf-token: csrf token value

Syntax of PUT or PATCH request:

[Link] ID>

 Note
If you are using SAML authentication, and you are using USER ID as your SAML mapping, you cannot change the userName
using this API. The userName you use in the request body must match the user <ID>.

You can use the active attribute to activate or deactivate users.
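For example, a minimal PatchOp body that deactivates a user by setting the active attribute to false could be built like this sketch; the helper name is illustrative:

```python
import json

def patch_op(path, value, op="replace"):
    """Build a SCIM PatchOp request body with a single operation,
    e.g. replacing the 'active' attribute to deactivate a user."""
    return {
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{"op": op, "path": path, "value": value}],
    }

# Serialized body for PATCH <tenant_url>/api/v1/scim2/Users/<user ID>
body = json.dumps(patch_op("active", False))
```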

 Note
When creating or modifying a user, you can add optional properties to the user.

The following example shows how to add a user to a role with a PUT request:

{
"schemas": [
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters",
"urn:ietf:params:scim:schemas:core:2.0:User"
],
"id": "userID-00001",
"meta": {

"resourceType": "User",
"location": "/api/v1/scim2/Users/userID-00001"
},
"userName": "LGARCIA",
"name": {
"familyName": "Garcia",
"givenName": "Lisa",
"formatted": "Lisa Garcia"
},
"displayName": "Lisa Garcia",
"preferredLanguage": "en",
"active": true,
"emails": [
{
"value": "[Link]@[Link]",
"type": "work",
"primary": true
}
],
"roles": [
{
"value": "PROFILE:t.V:Sales_Modeler",
"display": "Sales_Modeler",
"primary": true
} ],
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters": {
"idpUserId": "[Link]@[Link]"
}
}

The following example shows how to remove a user from a role and add it to another role with a PATCH request:

{
"schemas": [
"urn:ietf:params:scim:api:messages:2.0:PatchOp"
],
"Operations": [
{
"op": "replace",
"path": "roles",
"value": [
{
"value": "PROFILE:t.V:Sales_Modeler_US",
"display": "Sales_Modeler_US",
"primary": true
}
]
}
]
}

The following example shows how to make the following changes with a PATCH request: remove a user from one role and add
them to another role, and modify the user's email address and idpUserId.

{
"schemas": [
"urn:ietf:params:scim:api:messages:2.0:PatchOp"
],
"Operations": [
{
"op": "replace",
"path": "roles",
"value": [
{
"value": "PROFILE:t.V:Sales_Modeler_US",
"display": "Sales_Modeler_US",
"primary": true
}
]
},
{
"op": "replace",
"path": "[Link]",
"value": [Link]+1@[Link]
},
{
"op": "replace",
"path": "urn:sap:params:scim:schemas:extension:sac:2.0:[Link]",
"value": [Link]+1@[Link]
}
]
}

 Note
When creating or modifying a user via the API, you can also assign the user to one or more roles - either global or scoped,
provided that the roles already exist in the tenant:

Before you can add one or more users to a scoped role, one space at least must be assigned to the scoped role.

When a user is added to a scoped role, the user is given access to all the spaces included in the scoped role.

All roles are prefixed with PROFILE. Custom and scoped roles have IDs in the following format: PROFILE:<t.#>:
<role_name>.

Delete a User

To delete a user, use the DELETE request with the /api/v1/scim2/Users/<user ID> endpoint and the following elements:

Request Component Setting Value

Authorization type Bearer Token

Authorization token <access token retrieved when logging in with the OAuth
client>

Header key x-sap-sac-custom-auth=true

Header key x-csrf-token: csrf token value

To delete a specific user based on its ID, enter the DELETE request:

[Link] ID>

The user ID must be the UUID (universally unique identifier), which you can get by sending the GET request:

[Link]

Optional User Properties

You can add optional parameters to the user when creating or modifying a user, in addition to the required properties (userName,
name, and emails).

Parameter Description

preferredLanguage Specifies the language in which to view the SAP Datasphere interface.

Allowed values:

ISO 639-1 two-letter language code. For example, en.

Concatenation of an ISO 639-1 two-letter language code, a dash, and ISO 3166-1
two-letter country code. For example, en-us.

Default value: en

Example

"preferredLanguage": "en",

The following parameters must be included in an urn:ietf:params:scim:schemas:extension:sap:user-custom-parameters:1.0 block within the user schema.

dataAccessLanguage Specifies the default language in which to display text data in SAP Analytics Cloud.

Allowed values:

ISO 639-1 two-letter language code. For example, en.

Concatenation of an ISO 639-1 two-letter language code, a dash, and ISO 3166-1
two-letter country code. For example, en-us.

Default value: en

dateFormatting Specifies the date display format.


Allowed values:

MMM d, yyyy

MMM dd, yyyy

[Link]

[Link]

[Link]

yyyy/MM/dd

dd/MM/yyyy

MM/dd/yyyy

Default value: MMM d, yyyy

numberFormatting Specifies the number format.


Allowed values: 1,234.56, 1.234,56 or 1 234,56

Default value: 1,234.56

timeFormatting Specifies the time display format.


Allowed values:

H:mm:ss, h:mm:ss a or h:mm:ss A

 Note
H:mm:ss corresponds to 24-Hour Format. For example, [Link].

h:mm:ss a corresponds to 12-Hour Format. For example, [Link] p.m.

h:mm:ss A corresponds to 12-Hour Format. For example, [Link] PM.

Default value: H:mm:ss

Example:

"urn:ietf:params:scim:schemas:extension:sap:user-custom-parameters:1.0": {
"dataAccessLanguage": "en",
"dateFormatting": "MMM d, yyyy",
"timeFormatting": "H:mm:ss",
"numberFormatting": "1,234.56",
"cleanUpNotificationsNumberOfDays": 0,
"systemNotificationsEmailOptIn": true,
"marketingEmailOptIn": false
},

Bulk Operations

To create, modify or delete users in bulk, use the POST request with the /api/v1/scim2/Bulk/ endpoint and the following
elements:

Request Component Setting Value

Authorization type Bearer Token

Authorization token <access token retrieved when logging in with the OAuth
client>

Header key x-sap-sac-custom-auth=true

Header key x-csrf-token: csrf token value

The supported operations are POST, PUT, PATCH and DELETE.

Syntax of POST request: [Link]

 Note
A maximum of 30 operations per request can be processed.
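A client can split a larger batch into requests that respect this limit, for example with the following sketch; the names are illustrative:

```python
def chunk_operations(operations, max_per_request=30):
    """Split a list of bulk operations into SCIM BulkRequest bodies,
    each containing at most 30 operations per the documented limit."""
    return [
        {
            "schemas": ["urn:ietf:params:scim:api:messages:2.0:BulkRequest"],
            "Operations": operations[i:i + max_per_request],
        }
        for i in range(0, len(operations), max_per_request)
    ]

# 65 delete operations become three requests of 30, 30, and 5 operations.
ops = [{"method": "DELETE", "path": f"/Users/<userID_{n}>"} for n in range(65)]
request_bodies = chunk_operations(ops)
```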

The following example shows how to create two users:

{
"schemas": ["urn:ietf:params:scim:api:messages:2.0:BulkRequest"],
"Operations": [
{
"method": "POST",
"path": "/Users",
"bulkId": "bulkId1",
"data":{
"schemas": [
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters",
"urn:ietf:params:scim:schemas:core:2.0:User"
],
"userName": "LGARCIA",
"name": {
"familyName": "Garcia",
"givenName": "Lisa "
},
"displayName": "Lisa Garcia",
"emails": [
{
"value": "[Link]@[Link]"
}
],
"roles":[
{
"value": "PROFILE:t.V:Sales_Modeler",
"display": "Sales_Modeler",
"primary": true
}
],
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters": {
"dataAccessLanguage": "en",
"numberFormatting": "1,234.56",
"idpUserId": "[Link]@[Link]",
"timeFormatting": "H:mm:ss",
"dateFormatting": "MMM d, yyyy",

}
}

},
{
"method": "POST",
"path": "/Users",
"bulkId": "bulkId2",
"data": {
"schemas": [
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters",
"urn:ietf:params:scim:schemas:core:2.0:User"
],
"userName": "JOWEN",
"name": {
"familyName": "Owen",
"givenName": "Joe"
},
"displayName": "Joe Owen",
"emails": [
{
"value": "[Link]@[Link]"
}
],
"roles":[
{
"value": "PROFILE:t.V:Sales_Modeler",
"display": "Sales_Modeler",
"primary": true
}
],
"urn:sap:params:scim:schemas:extension:sac:2.0:user-custom-parameters": {
"dataAccessLanguage": "en",
"numberFormatting": "1,234.56",
"idpUserId": "[Link]@[Link]",
"timeFormatting": "H:mm:ss",
"dateFormatting": "MMM d, yyyy",
}
}
}
]
}

The following example shows how to delete two users using their IDs:

{
"schemas": ["urn:ietf:params:scim:api:messages:2.0:BulkRequest"],
"Operations": [
{
"method": "DELETE",
"path": "/Users/<userID_User1>"
},
{
"method": "DELETE",
"path": "/Users/<userID_User2>"
}
]
}

Get Information About the SCIM API


Using the GET request, you can obtain the following information about the SCIM API:

/scim2/ServiceProviderConfig - Gets information about the identity provider being used with your SAP
Datasphere tenant.

/scim2/Schemas - Gets information on the schemas used for user management.

/scim2/ResourceTypes - Gets information on all available resource types.

/scim2/ResourceTypes/<Type> - Gets information on a specific resource type.

Automated Conversion to Scoped Roles


For SAP Datasphere tenants that were created before version 2023.21, the roles and user assignment to spaces have been
converted so that users can continue to perform the same actions as before in their spaces.

This topic contains the following sections:

Introduction to Automated Conversion to Scoped Roles

Converted Roles

Example of a Converted Scoped Role

Adapting Converted Scoped Roles

Introduction to Automated Conversion to Scoped Roles


The way a DW Administrator gives privileges to users to do certain actions in spaces has changed.

Before Conversion

A DW Administrator assigned a role to a user and assigned the user as a member of a space.

As a consequence:

A user had the same one or more roles in all the spaces they were a member of.

A DW Administrator assigned users space by space, going into each space page.

After Conversion

A DW Administrator assigns a role to one or more users and one or more spaces within a new role: a scoped role.

As a consequence:

A user can have different roles in different spaces: for example, be a modeler in the Sales Germany and Sales France spaces and a viewer in the Europe Sales space.

A DW Administrator can give a role to many users in many spaces, all in one place in a scoped role. See Create a Scoped Role to Assign Privileges to Users in Spaces.

A DW Space Administrator can then manage users in their spaces and the changes are reflected in the scoped roles. See Control User Access to Your Space.

Converted Roles

You can now use global roles for tenant-wide actions and scoped roles for space-related actions.

The Roles page lists the same standard and custom roles as before the conversion, and in addition the scoped roles that have been
automatically created.

DW Administrator, Catalog Administrator and Catalog User: these standard roles are considered as global roles. They now
include only privileges that are global, which means privileges that apply to the tenant and are not space-related. For
example, the DW Administrator role no longer grants access to any of the modeling apps of SAP Datasphere (such as Data
Builder).

Users who previously had these roles are still assigned to them after conversion.

Users who previously had the DW Administrator role and were members of certain spaces are assigned to the new DW
Scoped Space Administrator role for those spaces they previously had access to.

The user who previously had the System Owner role and was member of certain spaces is assigned to the new DW Scoped
Space Administrator role for those spaces the user previously had access to.

A single scoped role is created for each standard role (other than DW Administrator, Catalog Administrator and Catalog
User) and each custom role, and all the users who previously had that standard or custom role are assigned to the new
scoped role, but only for those spaces they previously had access to.

 Note
All the spaces of the tenant are included in each scoped role created, but not all users are assigned to all spaces. See the
example of scoped role below.

For each standard or custom role, two roles are available after the conversion: the initial standard or custom role (which
acts as a template for the scoped role) and the scoped role created.

Each scoped role includes privileges which are now considered as scoped privileges.

Users who previously had the DW Space Administrator role are assigned to these two roles: the standard role DW Space
Administrator and the new scoped role DW Scoped Space Administrator. Users who manage spaces primarily need scoped
permissions to work with spaces, but they also need some global permissions (such as Lifecycle when transporting content
packages). To provide such users with the full set of permissions they need, each space administrator is assigned to the
scoped role DW Scoped Space Administrator to receive the necessary scoped privileges, and they are also assigned directly
to the DW Space Administrator role in order to receive the additional global privileges.

 Note

Specific case - no role assigned to a user: Before conversion, a DW Administrator assigned a user to certain spaces but
did not assign a role to the user. As no role was assigned to the user, the user-to-spaces assignment is not kept after
conversion.

Privileges and permissions are now either global or scoped. See Privileges and Permissions.

Example of a Converted Scoped Role

In this example, users assigned to a custom role called "Senior Modeler" were members of certain spaces before the conversion,
as shown below.

The custom role "Senior Modeler" has been converted to the scoped role "Custom Scoped Senior Modeler" and the users who
previously had the custom role "Senior Modeler" are assigned to the scoped role, but only for the spaces they previously had
access to.

Adapting Converted Scoped Roles


The scoped roles that are automatically created during the conversion ensure that users can continue to perform the same
actions as before the conversion. However, we recommend that you do not use the automatically created scoped roles and that
you create your own scoped roles by logical groups as soon as possible.

In this example, the following scoped roles have been automatically created during conversion:

DW Scoped Space Administrator

DW Scoped Modeler

DW Scoped Viewer

DW Scoped Consumer

There are 4 spaces: Sales US, Sales Europe, Finance US and Finance Europe, which can be logically organized in one Sales group
and one Finance group.

You should create a set of scoped roles for each logical group of spaces, add the relevant spaces and the relevant users and assign
the users to the spaces in the scoped roles. The users will have access to the spaces with the appropriate privileges.

Sales Spaces

Scoped Roles: DW Sales Space Administrator, DW Sales Modeler, DW Sales Viewer, DW Sales Consumer

Spaces: Sales US, Sales Europe

Finance Spaces

Scoped Roles: DW Finance Space Administrator, DW Finance Modeler, DW Finance Viewer, DW Finance Consumer

Spaces: Finance US, Finance Europe

For more information about creating a scoped role, see Create a Scoped Role to Assign Privileges to Users in Spaces.

 Note
In addition to the standard workflows, you can also create scoped roles and assign scopes and users to them via the command
line (see Manage Scoped Roles via the Command Line).

Transfer the System Owner Role


The individual who purchases SAP Datasphere is automatically designated as the system owner. If you, as the purchaser, are not
the right person to administer the system, you can transfer the system owner role to the appropriate person in your organization.

Prerequisites
You must be logged on as a user with the System Information Update privilege.

 Note
Transferring the system owner role is not possible if you only have one license for SAP Datasphere.

Procedure
1. On the Users page of the Security area, select the user you want to assign the system owner role to.

2. Select  (Assign as System Owner).

The Transfer System Owner Role dialog appears.

3. Under New Role, enter a new role for the previous system owner, or select  to open a list of available roles.

 Note
One or more roles may be selected.

4. Select OK.

Delete a Role
You can delete a custom or a scoped role when it is no longer needed.

Context

You can delete custom roles and scoped roles (except for the predefined scoped roles that are delivered with SAP Datasphere as
examples).

You can delete one or more roles at the same time.

Procedure
1. In the side navigation area, click  (Security)  (Roles).

2. Hover over the role and select the check box.

3. Choose  (Delete) from the toolbar.

4. In the warning dialog, select Delete.

Results
The selected roles are deleted. All users that were assigned the role will lose access to certain features depending on the privileges
and permissions that were included in the role.

Creating Spaces and Allocating Resources


Users with an administrator role can create spaces and allocate resources to them.

All data acquisition, preparation, and modeling in SAP Datasphere happens inside spaces. A space is a secure area - space data
cannot be accessed outside the space unless it is shared to another space or exposed for consumption.

You can create spaces with different types of storage:

SAP HANA Database (Disk and In-Memory) - You allocate disk and memory storage, set a priority, and can limit how much
memory and how many threads its statements can consume.

SAP HANA Data Lake Files (file spaces) - You allocate compute resources. File spaces are intended for loading and
preparing large quantities of data in an inexpensive inbound staging area and are stored in the SAP Datasphere object
store.

You can then assign one or more users to the space via scoped roles. The users can start acquiring and preparing data in the
space.

If you assign users to a space with a space administrator role, they can manage users, create connections to source systems,
secure data with data access controls, and manage other aspects of the space (see Managing Your Space).

Create a Space
Create a space, allocate storage, and set the space priority and statement limits.

Prerequisites

To create a space, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

Spaces (C-------) - To create spaces.

User (-R------) - To allow the creation of spaces.

Spaces (-------M) - To update all spaces and space properties.

The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.

Context

 Note
Only administrators can create spaces, allocate storage, and set the space priority and statement limits. The remaining space
properties can be managed by the space administrators that the administrator assigns to the space via a scoped role.

Procedure
1. In the side navigation area, click (Space Management), and click Create.

2. In the Create Space dialog, enter the following properties, and then click Create:

Property Description

Space Name Enter the business name of the space. Can contain a maximum of 30 characters, and can contain
spaces and special characters.

Space ID Enter the technical name of the space. Can contain a maximum of 20 uppercase letters or numbers and
must not contain spaces or special characters other than _ (underscore). Unless advised to do so, must
not contain prefix _SYS and should not contain prefixes: DWC_, SAP_ (See Rules for Technical Names).
As the technical name will be displayed in the Open SQL Schema and in monitoring tools, including SAP
internal tools, we recommend that you do not include sensitive business or personal data in the name.

Storage Type [Default] Select SAP HANA Database (Disk and In-Memory).

The space is created and its property sheet opens.
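The Space ID rules above can be checked with a small validator. This is a sketch: note that it rejects the DWC_ and SAP_ prefixes outright, although the documentation only discourages them ("should not"), while _SYS is strictly forbidden.

```python
import re

# Documented rules: maximum 20 uppercase letters, numbers, or underscores.
SPACE_ID_PATTERN = re.compile(r"^[A-Z0-9_]{1,20}$")
# _SYS must not be used; DWC_ and SAP_ should not be used
# (this sketch rejects all three).
RESERVED_PREFIXES = ("_SYS", "DWC_", "SAP_")

def check_space_id(space_id):
    """Return True if the Space ID satisfies the documented rules."""
    if not SPACE_ID_PATTERN.match(space_id):
        return False
    return not space_id.startswith(RESERVED_PREFIXES)

assert check_space_id("SALES_EUROPE")
assert not check_space_id("sales_europe")  # lowercase not allowed
assert not check_space_id("SAP_SALES")     # reserved prefix
```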

3. In the General Settings section, review the following properties:

Property Description

Space ID Enter the technical name of the space. Can contain a maximum of 20 uppercase letters or numbers and
must not contain spaces or special characters other than _ (underscore). Unless advised to do so, must
not contain prefix _SYS and should not contain prefixes: DWC_, SAP_ (See Rules for Technical Names).

Space Name Enter the business name of the space. Can contain a maximum of 30 characters, and can contain
spaces and special characters.

Space Status [read-only] Displays the status of the space. Newly-created spaces are always active.

Space Type [read-only] Displays the type of the space. You can only create spaces of type SAP Datasphere.

Created By [read-only] Displays the user that created the space.

Created On [read-only] Displays the date and time when the space was created.

Deployment Status [read-only] Displays the deployment status of the space. Newly-created spaces are deployed, but when
you make changes, you need to save and re-deploy them before they are available to space users.

This is custom documentation. For more information, please visit SAP Help Portal. 125

Deployed On [read-only] Displays the date and time when the space was last deployed.

Description [optional] Enter a description for the space. Can contain a maximum of 4 000 characters.

Storage Type [read-only] Displays where the space is stored.

 Note
Once the space is created, users with space administrator privileges can use the Translation area to choose the
language from which business textual information will be translated. For more information, see Translating Metadata for
SAP Analytics Cloud.

4. [optional] Use the Space Storage properties to allocate disk and memory storage to the space and to choose whether it
will have access to the SAP HANA data lake.

For more information, see Allocate Storage to a Space.

5. [optional] Use the remaining sections to further configure the space.

Data Access/Data Consumption: Modify the following property, if appropriate:

Property Description

Expose for Consumption by Default Choose the default setting for the Expose for Consumption
property for views created in this space.

Data Access/Database Users - Use the list in the Database Users section to create users who can connect external
tools and read from and write to the space. See Create a Database User.

Data Access/HDI Containers - Use the list in the HDI Containers section to associate HDI containers to the space.
See Prepare Your HDI Project for Exchanging Data with Your Space.

 Note
A user with the DW Administrator role only cannot see the HDI Containers area.

Time Data/Time Tables and Dimensions - Click the button in the Time Tables and Dimensions section to generate
time data in the space. See Create Time Data and Dimensions.

 Note
A user with the DW Administrator role only cannot see the Time Tables and Dimensions area.

Auditing/Space Audit Settings - Use the properties in the Space Audit Settings section to enable audit logging for
the space. See Logging Read and Change Actions for Audit.

6. Click Deploy to deploy your space to the run-time database.

7. Add your space to one or more scoped roles. You can:

Add your space to an existing scoped role (see Add Spaces to a Scoped Role).

Create a scoped role and add your space and at least one user to the scoped role (see Create a Scoped Role).

For more information, see Create a Scoped Role to Assign Privileges to Users in Spaces.

All users assigned to the space via the scoped roles are automatically displayed in the Users area of the space page. In this
area, you can add or remove users to/from scoped roles for your space (see Control User Access to Your Space). Either an administrator or a user with space administrator privileges can do so.

8. [optional] The properties in the Workload Management section are set with their default values. To change them, go in the
side navigation area and click  (System)  (Configuration) Workload Management (see Set Priorities and
Statement Limits for Spaces or Groups).

Create a File Space to Load Data in the Object Store


Create a file space and allocate compute resources to it. File spaces are intended for loading and preparing large quantities of data
in an inexpensive inbound staging area and are stored in the SAP Datasphere object store.

Prerequisites
To create a file space, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

Spaces (C-------) - To create spaces.

User (-R------) - To allow the creation of spaces.

Spaces (-------M) - To update all spaces and space properties.

The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.

Context

 Note
For additional information on working with data in the object store, see SAP note 3538038 .

 Note
You cannot create or manage a file space via the command line, add a file space to an elastic compute node, or choose a file
space as a monitoring space. You cannot monitor, lock, or unlock a file space. You cannot generate time data, enable audit
logging, create database users, or associate HDI containers in a file space.

You can create up to 5 file spaces in a tenant.

Users with an administrator role can create spaces, allocate compute resources and assign users. The remaining space properties
can be managed by users with a space administrator role.

Procedure

1. In the side navigation area, click (Space Management), and click Create.

2. In the Create Space dialog, complete the following properties:

Property Description

Space Name Enter the business name of the space. Can contain a maximum of 30 characters, and can contain
spaces and special characters.


Space ID Enter the technical name of the space. Can contain a maximum of 20 uppercase letters or numbers and
must not contain spaces or special characters other than _ (underscore). Unless advised to do so, must
not contain prefix _SYS and should not contain prefixes: DWC_, SAP_ (See Rules for Technical Names).
As the technical name will be displayed in the Open SQL Schema and in monitoring tools, including SAP
internal tools, we recommend that you do not include sensitive business or personal data in the name.

Storage Type Select SAP HANA Data Lake Files. The option is greyed out if no resources have been allocated to the
object store (see Configure the Size of Your SAP Datasphere Tenant).

3. Click Create. The space page opens. The creation and provisioning of a file space may take several minutes. You must wait
for the notification message indicating that the file space is deployed before you can start working with the file space.

4. Review the following properties in the General Settings section:

Property Description

Space ID Enter the technical name of the space. Can contain a maximum of 20 uppercase letters or numbers and
must not contain spaces or special characters other than _ (underscore). Unless advised to do so, must
not contain prefix _SYS and should not contain prefixes: DWC_, SAP_ (See Rules for Technical Names).

Space Name Enter the business name of the space. Can contain a maximum of 30 characters, and can contain
spaces and special characters.

Space Status [read-only] Displays the status of the space. Newly-created spaces are always active.

Space Type [read-only] Displays the type of the space. You can only create spaces of type SAP Datasphere.

Created By [read-only] Displays the user that created the space.

Created On [read-only] Displays the date and time when the space was created.

Deployment Status [read-only] Displays the deployment status of the space. Newly-created spaces are deployed, but when
you make changes, you need to save and re-deploy them before they are available to space users.

Deployed On [read-only] Displays the date and time when the space was last deployed.

Description [optional] Enter a description for the space. Can contain a maximum of 4 000 characters.

Storage Type [read-only] Displays where the space is stored.

5. Apache Spark section - The maximum amount of compute resources that the file space can consume when processing
statements is allocated to its Apache Spark instance. The resources allocated to the file space depend on the resources
allocated to the object store in the Tenant Configuration page (see Configure the Size of Your SAP Datasphere Tenant).

The following applications, which are listed in the Applications area, are available for the instance and are used to run the
tasks that are listed in the Task Assignment area.

Property Description

Application [read-only] Shows the name of the application.

Cluster Size [read-only] Qualifies the overall size of resources allocated to the application. For
example, micro or medium.

Driver [read-only] Shows the amount of resources allocated to the driver for the application.


Executor [read-only] Shows the amount of resources allocated to the executor for the
application.

Max. Used [read-only] Shows the maximum amount of resources that can be used for the
application.

You can view which applications are used by default to run which tasks in the Task Assignment area.

Object Type Activity Default Application Description

LOCAL_TABLE DELETE_DATA, MERGE_FILES, OPTIMIZE_FILES, TRUNCATE_FILES, VACUUM_FILES, FIND_AND_REPLACE 300 [read-only] Indicates the application that is used by default to run all the activities listed via a task chain or the Local Tables (File) monitor. See Creating a Task Chain and Monitoring Local Tables (File).

TRANSFORMATION_FLOWS EXECUTE 400 [read-only] Indicates the application that is used by default to run a transformation flow in a file space. See Creating a Transformation Flow in a File Space.

To modify the size of the instance at any time, change the amount of memory and click Update. Base the size of your instance
on the resource amounts displayed in the Max. Used column of the table. For example, the application used to run
transformation flows is allocated 168 vCPUs and 672 GB of memory. If you want four transformation flows to run in parallel,
enter 2688 in Memory (GB). The number of vCPUs is calculated automatically from the amount of memory at a ratio of 4 GB of
memory per vCPU (for example, 2688 GB of memory and 672 vCPUs). The minimum size for the instance is 1632 GB of memory
(408 vCPUs), and the maximum size is 8192 GB of memory (2048 vCPUs).
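The sizing arithmetic described above (a 4:1 GB-memory-to-vCPU ratio, clamped between the stated minimum and maximum instance sizes) can be sketched as follows; the function name is illustrative, not an SAP API:

```python
# Figures cited in this guide: 4 GB of memory per vCPU, minimum 1632 GB / 408
# vCPUs, maximum 8192 GB / 2048 vCPUs for a file space's Apache Spark instance.
RATIO_GB_PER_VCPU = 4
MIN_MEMORY_GB, MAX_MEMORY_GB = 1632, 8192

def spark_instance_size(per_run_memory_gb: int, parallel_runs: int):
    """Return (memory_gb, vcpus) needed for the requested parallelism,
    clamped to the instance limits."""
    requested = per_run_memory_gb * parallel_runs
    memory_gb = min(max(requested, MIN_MEMORY_GB), MAX_MEMORY_GB)
    return memory_gb, memory_gb // RATIO_GB_PER_VCPU

# The transformation-flow application in the example above uses 672 GB per run:
print(spark_instance_size(672, 4))  # (2688, 672)
```

A single run (672 GB) would be clamped up to the 1632 GB minimum, and more than twelve parallel runs would be capped at the 8192 GB maximum.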

6. Add your space to one or more scoped roles. You can:

Add your space to an existing scoped role (see Add Spaces to a Scoped Role).

Create a scoped role and add your space and at least one user to the scoped role (see Create a Scoped Role).

For more information, see Create a Scoped Role to Assign Privileges to Users in Spaces.

All users assigned to the space via the scoped roles are automatically displayed in the Users area of the space page. In this
area, you can add or remove users to/from scoped roles for your space (see Control User Access to Your Space). Either a
user with an administrator role or a user with a space administrator role can do so.

 Note

If you've made some changes in the General Settings area, such as changing the space name or entering a description,
click Save.

If your file space and its data lake instance or Apache Spark instance run into communication errors, click Deploy.

For more information about working with data in the object store, see Acquiring and Preparing Data in the Object Store.


Allocate Storage to a Space


Use the Space Storage properties to allocate disk and memory storage to the space and to choose whether it will have access to
the SAP HANA data lake.

Prerequisites
To allocate disk and memory storage to your space, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

Spaces (-------M) - To update all spaces and space properties.

The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.

 Note
Relevant only for spaces with a storage type SAP HANA Database (Disk and In-Memory), and not for SAP HANA Data Lake
Files spaces.

Context
SAP Datasphere supports data tiering using the features of SAP HANA Cloud:

Memory Storage (hot data) - Keep your most recent, frequently-accessed, and mission-critical data loaded constantly in
memory to maximize real-time processing and analytics speeds.

When you persist a view, the persisted data is stored in memory (see Persist Data in a Graphical or SQL View).

Disk (warm data) - Store master data and less recent transactional data on disk to reduce storage costs.

When you load data to a local table or replicate data to a remote table in SAP Datasphere, the data is stored on disk by
default, but you can load it in memory by activating the Store Table Data in Memory switch (see Accelerate Table Data
Access with In-Memory Storage).

Data Lake (cold data) - Store historical data that is infrequently accessed in the data lake. With its low cost and high
scalability, the data lake is also suitable for storing vast quantities of raw structured and unstructured data, including IoT
data. For more information, see Integrating Data to and From SAP HANA Cloud Data Lake.

You can allocate specific amounts of memory and disk storage to a space or disable the Enable Space Quota option, and allow the
space to consume all the storage it needs, up to the total amount available in your tenant.

Procedure
1. In the side navigation area, click (Space Management), locate your space tile, and click Edit to open it.

2. Use the Space Storage properties to allocate disk and memory storage to the space and to choose whether it will have
access to the SAP HANA data lake.

Property Description

Enable Space Quota Disable this option to allow the space to consume any amount of disk and memory storage up to the
total amounts available in your tenant.


If this option was disabled and then subsequently re-enabled, the Disk and Memory properties are
initialized to the minimum values required by the current contents of the space.

Default: Enabled

Disk (GB) Enter the amount of disk storage allocated in GB. You can use the buttons to change the amount by
whole GBs or enter fractional values in increments of 100 MB by hand.

Default: 2 GB

Memory (GB) Enter the amount of memory storage allocated in GB. This value cannot exceed the amount of disk
storage allocated. You can use the buttons to change the amount by whole GBs or enter fractional
values in increments of 100 MB by hand.

 Note
The memory allocated is used to store data and is not related to processing memory. For more
information on limiting processing memory in a space, see Set Priorities and Statement Limits for
Spaces or Groups.

Default: 1 GB

Use This Space to Access the Data Lake Enable access to the SAP HANA Cloud data lake. Only one space can connect to the data lake.
 Note
Even though the option is available for selection, you should check first that no other space already
has access to the data lake. To do so, you can choose the table layout in the Space Management
overview page and sort on the Data Lake Access column.

Default: Disabled

 Note
If a space exceeds its allocations of memory or disk storage, it will be locked until a user of the space deletes the excess
data or an administrator assigns additional storage. See Unlock a Locked Space.
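The constraints above (memory cannot exceed disk, and fractional values are entered in 100 MB increments) can be pre-checked when scripting space sizing. This is an illustrative sketch, not an SAP API:

```python
def validate_space_storage(disk_gb: float, memory_gb: float) -> list[str]:
    """Check a storage allocation against the constraints described above."""
    problems = []
    # Fractional values are entered in increments of 100 MB (0.1 GB); allow
    # a small epsilon to absorb floating-point representation error.
    for label, value in (("disk", disk_gb), ("memory", memory_gb)):
        if abs(value * 10 - round(value * 10)) > 1e-9:
            problems.append(f"{label} must be a multiple of 0.1 GB (100 MB)")
    # Memory storage cannot exceed the disk storage allocated.
    if memory_gb > disk_gb:
        problems.append("memory cannot exceed disk storage")
    return problems

print(validate_space_storage(2.0, 1.0))  # [] -- the default allocation
print(validate_space_storage(2.0, 3.0))  # ['memory cannot exceed disk storage']
```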

3. Click Save to save your changes to the space, or Deploy to save and immediately make the changes available to users
assigned to the space.

Results
To view the total storage available and the amount assigned to and used by all spaces, see Monitoring SAP Datasphere.

Set Priorities and Statement Limits for Spaces or Groups


Prioritize between spaces or groups for resource consumption and set limits to the amount of memory and threads that a space or
group can consume when processing statements.

This topic contains the following sections:

Prerequisites

Set Priorities and Statement Limits by Space

Set Priorities and Statement Limits by Group

Export Workload Management Settings

Import Workload Management Settings

Prerequisites
To set priorities and statement limits for spaces or groups, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.

The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.

 Note
Relevant only for spaces with a storage type SAP HANA Database (Disk and In-Memory), and not for SAP HANA Data Lake
Files spaces.

 Note
You can use the SAP Datasphere command line interface, datasphere, to set space or group priorities and statement limits.
See Manage Priorities and Statement Limits for Spaces or Groups via the Command Line.

Set Priorities and Statement Limits by Space


You can set priorities and statement limits by space (default option). Once you’ve created a space, a new row for the space is
added in the Workload Management page with a default priority and default statement limits.

1. In the side navigation area, click  (System)  (Configuration) Workload Management .

2. If Space is not already selected, select it and click Save. Be aware that if you've modified the default settings for the Group
option, once you confirm the reorganization by space, your settings will be deleted. To keep your settings, you can export
them (see Export Workload Management Settings).

3. In the confirmation message that opens, click Yes. The process may take some time but you can continue to work in other
areas of SAP Datasphere.

4. Click on the row of the space for which you want to edit the properties.

 Note
You can search for a space based on its ID by entering one or more characters in the Search field. Only the spaces
whose space ID includes the entered characters are displayed in the table.

5. To prioritize between spaces, specify in the Space Priority section the prioritization of this space when querying the
database. You can choose a value from 1 (lowest priority) to 8 (highest priority). The default value is 5. In situations where
spaces are competing for available threads, those with higher priorities have their statements run before those of spaces
with lower priorities.

6. To manage other workload parameters, you can select either of the following in the Configuration dropdown list:

Default. The default configuration provides generous resource limits, while preventing any single space from
overloading the system. The default configuration is applied by default to new spaces.

These statement limit and admission control parameters are taken into account in the default configuration and
cannot be changed:


Parameter Value

ADMISSION CONTROL QUEUE CPU THRESHOLD 90%

ADMISSION CONTROL REJECT CPU THRESHOLD 99%

TOTAL STATEMENT THREAD LIMIT 70%

Custom. These statement limit and admission control parameters are taken into account in the custom
configuration. You can specify only the value for statements limits to set maximum total thread and memory limits
that statements running concurrently in the space can consume:

 Caution
Be aware that changing the statement limits may cause performance issues.

Parameter Value

ADMISSION CONTROL QUEUE CPU THRESHOLD 90%

ADMISSION CONTROL REJECT CPU THRESHOLD 99%

TOTAL STATEMENT In the Total Statement Thread Limit area, enter the maximum number (or percentage) of
THREAD LIMIT threads that statements running concurrently in the space can consume. You can enter a
percentage between 1% and 100% (or the equivalent number) of the total number of
threads available in your tenant.

 Note
100% represents the maximum of 80% of CPU resources reserved for workload
generated by spaces, user group users and agent users. The remaining 20% of CPU
resources are reserved to ensure that the system can respond under heavy load.

Setting this limit prevents the space from consuming too many threads, and can help with
balancing resource consumption between competing spaces.

 Caution
Be aware that setting this limit too low may impact statement performance, while
excessively high values may impact the performance of statements in other spaces.

Default: 70%


TOTAL STATEMENT In the Total Statement Memory Limit area, enter the maximum number (or percentage) of
MEMORY LIMIT GBs of memory that statements running concurrently in the space can consume. You can
enter any value or percentage between 0 (no limit) and the total amount of memory
available in your tenant.

Setting this limit prevents the space from consuming all available memory, and can help
with balancing resource consumption between competing spaces.

 Caution
Be aware that setting this limit too low may cause out-of-memory issues, while
excessively high values or 0 may allow the space to consume all available system
memory.

Default: 80%
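The limits above are expressed as percentages of tenant capacity. The arithmetic can be sketched as follows, assuming a hypothetical tenant with 160 threads and 1000 GB of memory (made-up figures, not SAP sizing), and using the 80% workload-reserved share from the note above:

```python
# Per the note above, 100% of the statement thread limit corresponds to the
# 80% share of CPU resources reserved for workload generated by spaces,
# user group users, and agent users.
WORKLOAD_SHARE = 0.80

def max_statement_threads(tenant_threads: int, limit_percent: float) -> int:
    usable = tenant_threads * WORKLOAD_SHARE
    return int(usable * limit_percent / 100)

def max_statement_memory_gb(tenant_memory_gb: float, limit_percent: float) -> float:
    # Per the table above, a memory limit of 0 means "no limit".
    if limit_percent == 0:
        return tenant_memory_gb
    return tenant_memory_gb * limit_percent / 100

print(max_statement_threads(160, 70))  # default 70% limit -> 89
```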

7. Click Save. The changes are reflected in the space details page in read-only.

Set Priorities and Statement Limits by Group

You can set priorities and statement limits by group, where each group covers a set of related processes, and distribute the
workload among the 8 groups provided.

For example, you can choose a total thread and memory limit for the Analytic Consumption group that are higher than for the
Modeling or the Data Management group, which can typically run more slowly in the background.

You distribute your workload management with these groups:

Group Processes

Analytic Consumption Analytic data preview inside the Analytic Model editors.

Data consumption of SAP Datasphere models in SAP Analytics Cloud.

Consumption of SAP Datasphere data via OData.

Data Management Actions related to data integration, such as running task chains, persisting data
or importing data.

Modeling Data preview actions in the Business Builder and the Data Builder.

SAP Analytics Cloud Data Management Import and export of data in SAP Analytics Cloud.

SAP Analytics Cloud Interactive Operations Navigation in SAP Analytics Cloud, primarily for loading stories.

SAP Analytics Cloud Long-Running Operations Background jobs for planning workflows in SAP Analytics Cloud.

SAP Analytics Cloud System Operation Background operations in SAP Analytics Cloud, such as collecting statistics and
cleaning up jobs.

SQL Access All actions related to space database users.

1. In the side navigation area, click  (System)  (Configuration) Workload Management .

2. If Group is not already selected, select it and click Save. Be aware that if you've modified the default settings for the option
Space, once you confirm the reorganization by group, your settings will be deleted. To keep your settings, you can export
them (see Export Workload Management Settings).

3. In the confirmation message that opens, click Yes. The process may take some time but you can continue to work in other
areas of SAP Datasphere.

The groups are provided with the following default priorities and statement limits:

Group (Priority / Queue CPU Threshold / Reject CPU Threshold / Statement Timeout / Thread Limit / Memory Limit)

Analytic Consumption: 7 / 90% / 99% / - / 70% / -

Data Management: 5 / 90% / - / - / 50% / -

Modeling: 7 / 90% / 99% / - / 40% / -

SAP Analytics Cloud Data Management: 5 / 90% / - / - / 40% / 40%

SAP Analytics Cloud Interactive Operations: 5 / 90% / 99% / 185 seconds / 70% / 70%

SAP Analytics Cloud Long-Running Operations: 5 / 90% / 99% / - / 20% / 20%

SAP Analytics Cloud System Operation: 8 / 90% / 99% / - / 50% / 50%

SQL Access: 8 / 90% / 99% / - / 60% / -

The default configuration provides generous resource limits, while preventing any single group from overloading the
system.

4. To change the priority or statement limits of a group, click it and change the settings as follows:

 Note
The statement timeout and admission control parameters mentioned in the table of the previous step are also taken
into account in the custom configuration but you cannot change their values.

Parameter Description

Priority To prioritize between groups, specify the priority of this group when querying the
database. You can choose a value from 1 (lowest priority) to 8 (highest priority). The
default value depends on the group. In situations where groups are competing for
available threads, those with higher priorities have their statements run before those
of groups with lower priorities.

Total Statement Thread Limit Select Configuration Custom and enter the maximum number (or percentage)
of threads that statements running concurrently in the group can consume. You can
enter a percentage between 1% and 100% (or the equivalent number) of the total
number of threads available in your tenant.


Setting this limit prevents the group from consuming too many threads, and can help
with balancing resource consumption between competing groups.

 Caution
Be aware that setting this limit too low may impact statement performance, while
excessively high values may impact the performance of statements in other
groups.

The default value is 70% for all groups.

Total Statement Memory Limit Select Configuration Custom and enter the maximum number (or percentage)
of GBs of memory that statements running concurrently in the group can consume.
You can enter any value or percentage between 0 (no limit) and the total amount of
memory available in your tenant.

Setting this limit prevents the group from consuming all available memory, and can
help with balancing resource consumption between competing groups.

 Caution
Be aware that setting this limit too low may cause out-of-memory issues, while
excessively high values or 0 may allow the group to consume all available system
memory.

The default value is 80% for all groups.

5. Click Save.

Export Workload Management Settings


To keep the workload settings, for spaces or groups, that you have customized, you can export them in a .json file.

1. In the side navigation area, click  (System)  (Configuration) Workload Management .

2. Click the export button. The .json file is downloaded to your computer.

3. Save it.

Import Workload Management Settings

To apply the customized workload settings, for spaces or groups, that you have previously exported, you can import the saved
.json file.

1. In the side navigation area, click  (System)  (Configuration) Workload Management .

2. Click the import button.

3. Select the .json file that you have previously exported and click Import.

The workload management settings are applied.
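The guide does not document the exported file's schema, so any tooling should treat it opaquely. The following schema-agnostic sketch lists a settings file's top-level keys and keeps a verbatim backup before re-import; the helper names are ours:

```python
import json
from pathlib import Path

def inspect_settings(path: str) -> list[str]:
    """Return the sorted top-level keys of an exported settings file,
    without assuming any particular structure."""
    data = json.loads(Path(path).read_text(encoding="utf-8"))
    return sorted(data) if isinstance(data, dict) else []

def backup(path: str) -> str:
    """Write a verbatim .bak copy alongside the original and return its path."""
    copy = path + ".bak"
    Path(copy).write_text(Path(path).read_text(encoding="utf-8"), encoding="utf-8")
    return copy
```

Keeping a byte-for-byte backup means the file you later select in the Import dialog is exactly what the tenant exported.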

Copy a Space and its Contents


You can copy a space and all the Data Builder objects it contains into a new space.


Prerequisites

To copy a space and its contents, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

Spaces (C-------) - To create spaces.

Spaces (-------M) - To update all spaces and space properties.

The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and
Standard Roles Delivered with SAP Datasphere.

 Note
A space cannot be copied if it contains any Business Builder objects.

If the space is used as storage by an associated SAP Analytics Cloud tenant, then it cannot be copied if any SAP Analytics
Cloud objects are exposed (see Exposing Objects for Consumption in SAP Datasphere).

 Note
Relevant only for spaces with a storage type SAP HANA Database (Disk and In-Memory), and not for SAP HANA Data Lake
Files spaces.

Context
If you copy a space that contains objects protected by a namespace, the copied objects will be modified so that they are removed
from the namespace and become editable. Copying protected content in this way allows you to extend content delivered through
SAP Business Data Cloud (see Extending Intelligent Applications).

Procedure
1. In the side navigation area, click (Space Management), locate your space tile, and click Edit to open it.

2. Click  (More) Copy .

3. Enter the name of the new space that you want to copy to.

By default, the contents of the space will be copied to the new space, but will not be deployed. To have them deployed,
select Deploy Objects.

4. Click Copy.

The following actions are performed:

The new space is configured exactly as the original space but has a new Space ID and Space Name.

The following objects are copied:

All Data Builder objects.

All connections (credentials need to be re-entered unless the connection is a shared UCL connection).

 Note
Replication task schedules are not copied and must be recreated manually.

Any objects shared to the original space are shared to the new space.

The new space is added as a scope to all scoped roles that the original space belongs to, but no users are added to
the new space, by default.
For information about adding users to a space, see Create a Scoped Role to Assign Privileges to Users in Spaces.

You will receive a notification when the copy is complete.

Rules for Technical Names


Rules and restrictions apply to the technical names of objects that you create in SAP Datasphere. The technical name by default is
synchronized with the business name by using rules to automatically replace invalid characters.

When specifying the technical name of an object, bear in mind the following rules and restrictions:

Object Type Rule Maximum Length

Space The space ID can only contain uppercase letters, numbers, and underscores (_). 20
Reserved keywords, such as SYS, CREATE, or SYSTEM, must not be used. Unless
advised to do so, the ID must not contain prefix _SYS and should not contain
prefixes: DWC_, SAP_. The maximum length is 20 characters.

Reserved keywords: SYS, PUBLIC, CREATE, SYSTEM, DBADMIN, MONITORING, PAL_STEM_TFIDF, SAP_PA_APL, DWC_USER_OWNER, DWC_TENANT_OWNER, DWC_AUDIT_READER, DWC_GLOBAL, and DWC_GLOBAL_LOG.

Also, the keywords that are reserved for the SAP HANA database cannot be used
in a space ID. See Reserved Words in the SAP HANA SQL Reference Guide for SAP
HANA Platform.

Elastic Compute Node The elastic compute node technical name can only contain lowercase letters (a-z) 9
and numbers (0-9). It must contain the prefix: ds. The minimum length is 3 and
the maximum length is 9 characters.

SAP BW bridge instance, or a remote table generated during the import of analysis authorizations from an SAP BW or SAP BW∕4HANA system: The technical name can contain any characters except for the asterisk (*), colon (:), and hash sign (#). Also, tab, carriage return, and newline must not be used, and space must not be used at the start of the name. The maximum length is 50 characters.

Object created in the Data Builder (for example, a table, view, E/R model, flow, intelligent lookup, task chain, or data access control): The technical name can only contain alphanumeric characters and underscores (_). The maximum length is 50 characters.

Element in the Data The technical name can only contain alphanumeric characters and underscores 30
Builder, for example a (_). The maximum length is 30 characters.
column, or a join,
projection, or
aggregation node

Object created in the The technical name can only contain alphanumeric characters and underscores 30
Business Builder, for (_). The maximum length is 30 characters.
example a fact,
dimension, fact model,
consumption model, or
authorization scenario

This is custom documentation. For more information, please visit SAP Help Portal. 138
7/9/25, 8:45 AM

Object Type Rule Maximum Length

Association The technical name can only contain alphanumeric characters, underscores (_), 20
and dots (.). The maximum length is 20.

Input parameter The technical name can only contain uppercase letters, numbers, and 30
underscores (_). The maximum length is 30 characters.

Database analysis user The user name suffix can only contain uppercase letters, numbers, and 31 (40 minus prefix)
underscores (_). The maximum length is 41 characters. This suffix is added to the
default prefix DWCDBUSER# to create your full user name. Note that you cannot
change the prefix as it is a reserved prefix.

Database user group The user name suffix can only contain uppercase letters, numbers, and 30 (40 minus prefix)
user underscores (_). The maximum length is 41 characters. This suffix is added to the
default prefix DWCDBGROUP# to create your full user name. Note that you
cannot change the prefix as it is a reserved prefix.

Database user (Open The user name suffix can only contain uppercase letters, numbers, and 40 minus space name
SQL schema) underscores (_). The maximum length is 41 characters. This suffix is added to the (or 41 minus prefix)
default prefix <space ID># to create your full user name. Note that you cannot
change the prefix.

Connection The technical name can only contain alphanumeric characters and underscores 40
(_). Underscore (_) must not be used at the start or end of the name. The
maximum length is 40 characters.
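As an illustrative check of the space ID rules above, here is a minimal Python sketch. The keyword list is the one quoted above; the full SAP HANA reserved-word list is omitted for brevity:

```python
import re

# Reserved keywords quoted above (SAP HANA reserved words omitted).
RESERVED = {"SYS", "PUBLIC", "CREATE", "SYSTEM", "DBADMIN", "MONITORING",
            "PAL_STEM_TFIDF", "SAP_PA_APL", "DWC_USER_OWNER", "DWC_TENANT_OWNER",
            "DWC_AUDIT_READER", "DWC_GLOBAL", "DWC_GLOBAL_LOG"}

def is_valid_space_id(space_id: str) -> bool:
    """Check a space ID against the documented rules: uppercase letters,
    numbers, and underscores only; at most 20 characters; no reserved
    keywords; no _SYS, DWC_, or SAP_ prefix."""
    if not re.fullmatch(r"[A-Z0-9_]{1,20}", space_id):
        return False
    if space_id in RESERVED:
        return False
    return not space_id.startswith(("_SYS", "DWC_", "SAP_"))

print(is_valid_space_id("SALES_2025"))   # True
print(is_valid_space_id("DWC_SALES"))    # False: reserved prefix
print(is_valid_space_id("sales"))        # False: lowercase not allowed
```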

By default, the technical name is synchronized with the business name. As you enter the business name, invalid characters are replaced in the technical name as follows:

Reserved keywords, which are not allowed, are removed: "SYS" becomes ""

Leading underscores (_) are removed: "_NAME" becomes "NAME"

Leading and trailing whitespaces (" ") are removed: " NAME " becomes "NAME"

Whitespaces (" ") within a name are replaced with underscores (_): "NA ME" becomes "NA_ME"

Characters with diacritical signs are replaced with their basic character: "Namé" becomes "Name"

Non-alphanumeric characters are removed: "N$ME" becomes "NME"

Dots (.) and double quotes (") are replaced with underscores (_): "N.AM"E" becomes "N_AM_E"

Leading dots (.) are removed: ".NAME" becomes "NAME"
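The replacement rules above can be approximated in Python as follows. This is a simplified sketch: reserved-keyword removal is omitted, and the exact order in which SAP Datasphere applies the rules may differ:

```python
import re
import unicodedata

def to_technical_name(business_name: str) -> str:
    """Approximate the documented business-name to technical-name rules."""
    name = business_name.strip()                    # trim whitespace
    # Replace diacritics with their basic character.
    name = unicodedata.normalize("NFKD", name)
    name = "".join(c for c in name if not unicodedata.combining(c))
    # Whitespace, dots, and double quotes become underscores.
    name = re.sub(r'[\s."]', "_", name)
    # Other non-alphanumeric characters are removed.
    name = re.sub(r"[^A-Za-z0-9_]", "", name)
    # Leading underscores (including former leading dots) are removed.
    return name.lstrip("_")

print(to_technical_name(" NAME "))   # NAME
print(to_technical_name("NA ME"))    # NA_ME
print(to_technical_name("Namé"))     # Name
print(to_technical_name("N$ME"))     # NME
print(to_technical_name(".NAME"))    # NAME
```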


Create Spaces via the Command Line


You can use the SAP Datasphere command line interface, datasphere, to create, read, update, and delete spaces. You can set
space properties, assign (or remove) users, create database users, create or update objects (tables, views, and data access
controls), and associate HDI containers to a space.

 Note
Relevant only for spaces with the storage type SAP HANA Database (Disk and In-Memory), not for SAP HANA Data Lake
Files spaces.

To use datasphere to create spaces, you must have an SAP Datasphere user with the DW Administrator role or equivalent
permissions (see Roles and Privileges by App and Feature).

For more information, see Manage Spaces via the Command Line.
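For illustration only, a space definition file passed to the CLI generally has this shape. The property names below (spaceDefinition, label, assignedStorage, assignedRam, members) are assumptions based on typical definitions and may differ in your CLI version; see Manage Spaces via the Command Line for the authoritative format:

```json
{
  "MY_SPACE": {
    "spaceDefinition": {
      "label": "My Space",
      "assignedStorage": 1000000000,
      "assignedRam": 500000000,
      "members": []
    }
  }
}
```

A command of the shape datasphere spaces create --file-path my_space.json could then create the space; verify the exact command syntax against your CLI version.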

Restore Spaces from, or Empty the Recycle Bin


Restore spaces, or delete them from the Recycle Bin to recover the disk storage used by the data in spaces.

This topic contains the following sections:

Prerequisites

Context

Restore a Space

Delete a Space Permanently

Prerequisites
To restore spaces, or delete them from the Recycle Bin, you must have a global role with the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

Spaces (-------M) - To access the Recycle Bin in the Space Management tool.

The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.

Context
Once a space has been deleted and moved to the Recycle Bin (see Delete Your Space), you can either restore the space or
permanently delete the space from the database to recover the disk storage used by the data in the space.

Restore a Space
1. In the side navigation area, click (Space Management).

2. In the Recycle Bin area, locate and select your space and click the Restore button.

The space is moved to the All Spaces area and you can work with it.


 Note
Once a space is restored, its status is "active", regardless of the status the space had before deletion (active or locked).

The following actions are not automatically performed when a space is restored, so you may need to perform them manually:

Resume the schedules that were paused.

Run the replication for remote tables connected via SAP HANA smart data access, with real-time replication, and for
replication flows with the load type "Initial and Delta" (see Replicating Data and Monitoring Remote Tables).

Re-enable real-time replication for remote tables connected via SAP HANA smart data integration, with real-time
replication (see Replicating Data and Monitoring Remote Tables).

If replication flows were stopped before the space was deleted, ensure that they get started again (see Working With
Existing Replication Flow Runs).

Synchronize the source system with the catalog (see Manually Synchronizing a System).

Delete a Space Permanently

 Caution
This action cannot be undone.

Be aware that the following content will also be permanently deleted:

All objects and data contained in the space.

All connections defined in the space.

All objects and data contained in any Open SQL schema associated with the space.

All audit logs entries generated for the space, including audit log entries related to any Open SQL schema associated with
the space.

 Note
For spaces that have been deleted before version 2023.05, all related audit logs have been kept. A user with an
administrator role can decide to delete them (see Delete Audit Logs ).

1. In the side navigation area, click (Space Management).

2. In the Recycle Bin area, locate and select one or more spaces and click the Delete button.

3. In the confirmation message, enter DELETE if you are sure that you no longer need the spaces and any of their content or
data, then click the Delete button.

The spaces are permanently deleted from the database and cannot be recovered.

 Note
When you delete a file space, the deletion can take more than 5 minutes, and your browser might display a timeout message even though the space is being properly deleted. To make sure that your space has been permanently deleted, check later that it no longer appears in the Recycle Bin.


Preparing Connectivity for Connections


Users with an administrator role can prepare SAP Datasphere connectivity to allow the creation of connections to remote systems
in spaces.

The following overview lists the most common prerequisites per connection type and points to further information about what
needs to be prepared to connect and use a connection.

The table columns are:

Agent = Remote Tables: Data Provisioning Agent required?
JDBC Library = Remote Tables: installation of a third-party JDBC library required?
Cloud Connector = Data Flows and Replication Flows: Cloud Connector required for on-premise sources?
Driver Upload = Data Flows: third-party driver upload required?
IP in Source Allowlist = SAP Datasphere IP required in the source allowlist?
Source IP in Allowlist = Source IP required in the SAP Datasphere IP allowlist?

Connection Type | Agent | JDBC Library | Cloud Connector | Driver Upload | IP in Source Allowlist | Source IP in Allowlist | Additional Information and Prerequisites
Adverity Connections | no | no | no | no | no | yes | Prepare Connectivity to Adverity
Amazon Athena Connections | no | no | no | no | no | no | Prepare Connectivity to Amazon Athena
Amazon Redshift Connections | yes | yes | no | yes | yes (Outbound IP Address) | no | Prepare Connectivity to Amazon Redshift
Amazon Simple Storage Service Connections | no | no | no | no | no | no | n/a
Apache Kafka Connections | no | no | yes | no | no | no | Prepare Connectivity to Apache Kafka
Confluent Connections | no | no | yes | no | no | no | Prepare Connectivity to Confluent
Cloud Data Integration Connections | yes | no | yes (for data flows) | no | no | no | Prepare Connectivity for Cloud Data Integration
Generic JDBC Connections | yes (for on-premise) | yes | no | no | no | no | Prepare Connectivity for Generic JDBC
Generic OData Connections | no | no | yes (for data flows) | no | no | no | Prepare Connectivity for Generic OData
Generic SFTP Connections | no | no | yes (for on-premise) | no | no | no | Prepare Connectivity for Generic SFTP
Google BigQuery Connections | no | no | no | yes | no | no | Prepare Connectivity to Google BigQuery
Google Cloud Storage Connections | no | no | no | no | no | no | n/a
Hadoop Distributed File System Connections | no | no | no | no | no | no | n/a
Microsoft Azure Blob Storage Connections | no | no | no | no | no | no | n/a
Microsoft Azure Data Lake Store Gen2 Connections | no | no | no | no | yes (Microsoft Azure deployments only: Virtual Network Subnet ID) | no | Prepare Connectivity to Microsoft Azure Data Lake Store Gen2
Microsoft Azure SQL Database Connections | yes | yes | no | no | yes (Outbound IP Address) | no | Prepare Connectivity to Microsoft Azure SQL Database
Microsoft SQL Server Connections | yes | yes | yes (for data flows) | no (pre-bundled; no upload required) | no | no | Prepare Connectivity to Microsoft SQL Server
Open Connectors Connections | no | no | no | no | no | no | Prepare Connectivity to SAP Open Connectors
Oracle Connections | yes | yes | yes (for data flows) | yes | no | no | Prepare Connectivity to Oracle
Precog Connections | no | no | no | no | no | yes | Prepare Connectivity to Precog
SAP ABAP Connections | yes (for on-premise) | no | yes (for on-premise: for data flows and replication flows) | no | no | no | Prepare Connectivity to SAP ABAP Systems
SAP BW Connections | yes | no | yes (for data flows) | no | no | no | Prepare Connectivity to SAP BW
SAP BW∕4HANA Model Transfer Connections | yes (for model import: to connect to the SAP HANA database of SAP BW/4HANA) | no | yes (for model import: to make http requests to SAP BW/4HANA) | no | no | no | Preparing SAP BW/4HANA Model Transfer Connectivity
SAP ECC Connections | yes | no | yes (for data flows) | no | no | no | Prepare Connectivity to SAP ECC
SAP Fieldglass Connections | yes | no | no | no | no | no | Prepare Connectivity to SAP Fieldglass
SAP HANA Connections | yes (for on-premise) | no | yes (for on-premise: for data flows and replication flows, or when using Cloud Connector for the remote tables feature) | no | no | Cloud Connector IP (for on-premise when using Cloud Connector for the remote tables feature) | Prepare Connectivity to SAP HANA
SAP HANA Cloud, Data Lake Files Connections | no | no | no | no | no | no | no
SAP HANA Cloud, Data Lake Relational Engine Connections | no | no | no | no | no | no | n/a
SAP Marketing Cloud Connections | yes | no | no | no | no | no | Prepare Connectivity to SAP Marketing Cloud
SAP SuccessFactors Connections | no | no | no | no | yes (HANA IP Address) | no | Prepare Connectivity to SAP SuccessFactors
SAP S/4HANA Cloud Connections | yes | no | no | no | no | no | Prepare Connectivity to SAP S/4HANA Cloud
SAP S/4HANA On-Premise Connections | yes (for model import) | no | yes (for data flows, replication flows, and model import) | no | no | no | Prepare Connectivity to SAP S/4HANA On-Premise
 Note
For information about supported versions of sources that are connected via SAP HANA smart data integration and its Data
Provisioning Agent, see the SAP HANA smart data integration and all its patches Product Availability Matrix (PAM) for SAP
HANA SDI 2.0 .

For information about necessary JDBC libraries for connecting to sources from third-party vendors, see:

SAP HANA smart data integration and all its patches Product Availability Matrix (PAM) for SAP HANA SDI 2.0

Register Adapters with SAP Datasphere

Preparing Data Provisioning Agent Connectivity


Most connection types supporting remote tables use SAP HANA Smart Data Integration (SDI) and its Data Provisioning Agent.
Before using the connection, the agent requires an appropriate setup.

Context

The Data Provisioning Agent is a lightweight component running outside the SAP Datasphere environment. It hosts data
provisioning adapters for connectivity to remote sources, enabling data federation and replication scenarios. The Data
Provisioning Agent acts as a gateway to SAP Datasphere providing secure connectivity between the database of your SAP

Datasphere tenant and the adapter-based remote sources. The Data Provisioning Agent is managed by the Data Provisioning
Server. It is required for all connections with SAP HANA smart data integration.

Through the Data Provisioning Agent, the preinstalled data provisioning adapters communicate with the Data Provisioning Server
for connectivity, metadata browsing, and data access. The Data Provisioning Agent connects to SAP Datasphere using JDBC. It
needs to be installed on a local host in your network and needs to be configured for use with SAP Datasphere.

 Note
A given Data Provisioning Agent can only be connected to one SAP Datasphere tenant (see SAP Note 2445282 ).

For an overview of connection types that require a Data Provisioning Agent setup, see Preparing Connectivity for Connections.

 Note
See also the guide Best Practices and Sizing Guide for Smart Data Integration (When used in SAP Datasphere) (published
June 10, 2022) for information to consider when creating and using connections that are based on SDI and Data Provisioning
Agent.

Procedure

To prepare connectivity via Data Provisioning Agent, perform the following steps:

1. Download and install the latest Data Provisioning Agent version on a host in your local network.

 Note

We recommend always using the latest released version of the Data Provisioning Agent. For information on
supported and available versions of the Data Provisioning Agent, see the SAP HANA Smart Data Integration
Product Availability Matrix (PAM) .

Make sure that all agents that you want to connect to SAP Datasphere have the same latest version.

For more information, see Install the Data Provisioning Agent.

2. Add the external IPv4 address of the server on which your Data Provisioning Agent is running to the IP allowlist in SAP
Datasphere. When using a proxy, the proxy's address needs to be included in the IP allowlist as well.

 Note
For security reasons, all external connections to your SAP Datasphere instance are blocked by default. By adding
external IPv4 addresses or address ranges to the allowlist you can manage external client connections.

For more information, see Manage IP Allowlist.

3. Connect the Data Provisioning Agent to SAP Datasphere.

This includes configuring the agent and setting the user credentials in the agent.

For more information, see Connect and Configure the Data Provisioning Agent.

4. Register the adapters with SAP Datasphere.

 Note
For third-party adapters, you need to download and install any necessary JDBC libraries before registering the adapters.

For more information, see Register Adapters with SAP Datasphere.

Results
The registered adapters are available for creating connections to the supported remote sources and enabling these connections
for creating views and accessing or replicating data via remote tables.

Install the Data Provisioning Agent


Download the latest Data Provisioning Agent 2.0 version from SAP Software Download Center and install it as a standalone
installation on a Windows or Linux machine. If you have already installed an agent, check if you need to update to the latest version.
If you have more than one agent that you want to connect to SAP Datasphere, make sure to have the same latest version for all
agents.

Procedure
1. Plan and prepare the Data Provisioning Agent installation.

a. Plan your installation to ensure that it meets your system landscape's needs.

You can install the agent on any host system that has access to the sources you want to access, meets the minimum
system requirements, and has any middleware required for source access installed. The agent should be installed on
a host that you have full control over to view logs and restart, if necessary.

For more information, see:

Planning and Preparation in the SAP HANA Smart Data Integration and SAP HANA Smart Data Quality
documentation.

Supported Platforms and System Requirements in the SAP HANA Smart Data Integration and SAP HANA
Smart Data Quality documentation.

b. Download the latest Data Provisioning Agent HANA DP AGENT 2.0 from the SAP Software Download Center .

 Note

We recommend always using the latest released version of the Data Provisioning Agent.

Make sure that all agents that you want to connect to SAP Datasphere have the same latest version.

Select your operating system before downloading the agent.

For more information, see:

Software Download in the SAP HANA Smart Data Integration and SAP HANA Smart Data Quality
documentation

SAP HANA Smart Data Integration Product Availability Matrix (PAM) (for supported and available versions
for the Data Provisioning Agent and operating system support)

2. Install the Data Provisioning Agent on a host in your local network.

For more information, see Install from the Command Line in the SAP HANA Smart Data Integration and SAP HANA Smart
Data Quality documentation.

 Note
If you have upgraded your Data Provisioning Agent to version 2.5.1 and want to create an Amazon Redshift connection,
apply SAP note 2985825 .

Related Information
Install the Data Provisioning Agent
Update the Data Provisioning Agent

Connect and Configure the Data Provisioning Agent


Connect the Data Provisioning Agent to the SAP HANA database of SAP Datasphere. This includes configuring the agent and
setting the user credentials in the agent.

Procedure
1. In SAP Datasphere, register the Data Provisioning Agent.

a. In the side navigation area, click  (System)  (Configuration) Data Integration .

b. In the On-Premise Agents section, add a new tile to create a new agent registration in SAP Datasphere.

c. In the following dialog, enter a unique name for your new agent registration.

 Note
The registration name cannot be changed later.

d. Select Create.

The Agent Settings dialog opens and provides you with information required to configure the Data Provisioning
Agent on your local host:

Agent name

HANA server (host name)

HANA port

HANA user name for agent messaging

HANA user password for agent messaging

 Note
Either keep the Agent Settings dialog open, or note down the information before closing the dialog.

2. At the command line, connect the agent to SAP HANA using JDBC. Perform the following steps:

a. Navigate to <DPAgent_root>/bin/. <DPAgent_root> is the Data Provisioning Agent installation root location.
By default, on Windows, this is C:\usr\sap\dataprovagent, and on Linux it is /usr/sap/dataprovagent.

b. Start the agent using the following command:

On Linux: ./dpagent_servicedaemon.sh start

On Windows: dpagent_servicedaemon_start.bat

c. Start the command-line agent configuration tool using the following command:

On Linux:

<DPAgent_root>/bin/agentcli.sh --configAgent

On Windows:

<DPAgent_root>/bin/agentcli.bat --configAgent

d. Choose SAP HANA Connection.

e. Choose Connect to SAP Datasphere via JDBC.

f. Enter the name of the agent registration (agent name).

g. Enter true to use an encrypted connection over JDBC.

 Tip
An encrypted connection is always required when connecting to SAP HANA in a cloud-based environment.

h. Enter the host name (HANA server) and port number (HANA port) for the SAP Datasphere instance.

For example:

Host name: <instance_name>.[Link]

Port number: 443

i. If HTTPS traffic from your agent host is routed through a proxy, enter true and specify any required proxy
information as prompted.

i. Enter true to specify that the proxy is an HTTP proxy.

ii. Enter the proxy host and port.

iii. If you use proxy authentication, enter true and provide a proxy user name and password.

j. Enter the credentials for the HANA user for agent messaging.

The HANA user for agent messaging is used only for messaging between the agent and SAP Datasphere.

k. Confirm that you want to save the connection settings you have made by entering true.

 Note
Any existing agent connection settings will be overwritten.

l. Stop and restart the Data Provisioning Agent.

On Linux:

<DPAgent_root>/bin/agentcli.sh --configAgent

On Windows:

<DPAgent_root>/bin/agentcli.bat --configAgent

i. To stop the agent, choose Start or Stop Agent, and then choose Stop Agent.

ii. Choose Start Agent to restart the agent.

iii. Choose Agent Status to check the connection status. If the connection succeeded, you should see
Agent connected to HANA: Yes.

 Note
For agent version 2.7.4 and higher, if the agent status shows the message No connection established yet,
this can be ignored.

Alternatively, in  (System)  (Configuration) Data Integration On-Premise Agents , a green bar and status information on the agent tile indicate whether the agent is connected.

For more information about the agent/SAP HANA connection status in agent version 2.7.4 and higher, see
SAP Note 3487646 .

iv. Choose Quit to exit the script.

3. In SAP Datasphere, if you have kept the Agent Settings dialog open, you can now close it.

Results

The Data Provisioning Agent is now connected.

If the tile of the registered Data Provisioning Agent doesn’t display the updated connection status, select Refresh Agents.

Related Information
Troubleshooting the Data Provisioning Agent (SAP HANA Smart Data Integration)

Register Adapters with SAP Datasphere


After configuring the Data Provisioning Agent, in SAP Datasphere, register the Data Provisioning adapters that are needed to
connect to on-premise sources.

Prerequisites

For third-party adapters, ensure that you have downloaded and installed any necessary JDBC libraries. Place the files in the
<DPAgent_root>/lib folder before registering the adapters with SAP Datasphere. For connection types Amazon Redshift and
Generic JDBC, place the file in the <DPAgent_root>/camel/lib folder.

For information about the proper JDBC library for your source, see the SAP HANA smart data integration and all its patches
Product Availability Matrix (PAM) for SAP HANA SDI 2.0 . Search for the library on the internet and download it from an
appropriate web page.

Procedure
1. In the side navigation area, click  (System)  (Configuration) Data Integration .

2. In the On-Premise Agents section, click the Adapters button to display the agents with their adapter information.

3. Click  (menu) and then  Edit.

4. In the Agent Settings dialog, under Agent Adapters select the adapters.

5. Click Close to close the dialog and register the selected adapters with SAP Datasphere.

 Note
No separate save is required to update the agent settings.

The registered adapters are now available for creating connections to the supported on-premise sources.

Next Steps
To use new functionality of an already registered adapter or to update the adapter in case of issues that have been fixed in a new
agent version, you can refresh the adapter by clicking the  (menu) button and then choosing  Refresh.


Prerequisites for ABAP RFC Streaming


If you want to stream ABAP tables to load large amounts of data without running into memory issues, the following
requirements must be met.

You need to create an RFC destination in the ABAP source system. With the RFC destination you register the Data
Provisioning agent as server program in the source system.

Using transaction SM59, you create a TCP/IP connection with a user-defined name. The connection should be created with
“Registered Server Program” as “Activation Type”. Specify “IM_HANA_ABAPADAPTER_*” as a filter for the “Program ID”
field, or leave it empty.

 Note
You can ignore failing SM59 connection tests because the RFC connection is only built up when the replication is
running to query records from the SAP system. For more information, see SAP Note 3206908 .

Successful registration on an SAP Gateway requires that suitable security privileges are configured. For example:

Set up an Access Control List (ACL) that controls which hosts can connect to the gateway. That file should contain
something similar to the following syntax: <permit> <ip-address[/mask]> [tracelevel] [#
comment]. <ip-address> here is the IP address of the server on which the Data Provisioning Agent has been installed.

For more information, see the Gateway documentation in the SAP help for your source system version, for example
in the SAP NetWeaver 7.5 documentation:

Configuring Network-Based Access Control Lists (ACL)

Gateway ACL Editor

You may also want to configure a reginfo file to control permissions to register external programs.
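For illustration, an ACL entry permitting the Data Provisioning Agent host could look as follows. The IP address 10.0.0.42 is a placeholder; check the gateway ACL documentation for your source system release for the exact syntax:

```
# <permit> <ip-address[/mask]> [tracelevel] [# comment]
permit 10.0.0.42/32 1 # host running the Data Provisioning Agent
```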

Preparing Cloud Connector Connectivity


Connections to on-premise sources used for data flows, replication flows, and other use cases require Cloud Connector to act as
link between SAP Datasphere and the source. Before creating the connection, the Cloud Connector requires an appropriate setup.

Context
Cloud Connector serves as a link between SAP Datasphere and your on-premise sources and is required for connections that you
want to use for:

Data flows

Replication flows

Model import from:

SAP BW/4HANA Model Transfer connections (Cloud Connector is required for the live data connection of type
tunnel that you need to create the model import connection)

SAP S/4HANA On-Premise connections (Cloud Connector is required for the live data connection of type tunnel
that you need to search for the entities in the SAP S/4HANA system)

Remote tables (only for SAP HANA on-premise via SAP HANA Smart Data Access)

For an overview of connection types that require a Cloud Connector setup to be able to use any of these features, see Preparing
Connectivity for Connections.

Procedure
To prepare connectivity via Cloud Connector, perform the following steps:

1. Install the Cloud Connector in your on-premise network.

For more information, see Cloud Connector Installation in the SAP BTP Connectivity documentation.

2. Make sure to hold the SAP Datasphere subaccount information ready. You can find the information in  (System) 
(Administration) Data Source Configuration .

For more information, see Set Up Cloud Connector in SAP Datasphere.

3. In the Cloud Connector administration, set up and configure Cloud Connector according to your requirements.

For more information, see Configure Cloud Connector.

4. If you have connected multiple Cloud Connector instances to your subaccount and you want to use these locations for your
connections, add the location IDs in  (System)  (Administration) Data Source Configuration .

For more information, see Set Up Cloud Connector in SAP Datasphere.

5. If you want to create SAP BW/4HANA Model Transfer connections or SAP S/4HANA On-Premise connections for model
import, make sure you have switched on Allow live data to securely leave my network in  (System) 
(Administration) Data Source Configuration .

For more information, see Set Up Cloud Connector in SAP Datasphere.

Result

The Cloud Connector instance, or instances, are available for creating connections and enabling them for the supported features.

Related Links

Frequently Asked Questions (about the Cloud Connector) in the SAP BTP Connectivity documentation

Configure Cloud Connector


Configure Cloud Connector before connecting to on-premise sources and using them in various use cases. In the Cloud Connector
administration, connect the SAP Datasphere subaccount to your Cloud Connector, add a mapping to each relevant source system
in your network, and specify accessible resources for each source system.

Prerequisites

Before configuring the Cloud Connector, the following prerequisites must be fulfilled:

The Cloud Connector is installed in your on-premise network.

For more information, see Cloud Connector Installation in the SAP BTP Connectivity documentation.

If you are using egress firewalling, add the following domains (wildcard) to the firewall/proxy allowlist in your on-premise
network:

*.[Link]

*.[Link]

Before configuring the Cloud Connector, you or the owner of your organization will need an SAP Business Technology
Platform (SAP BTP) account. If you don't have an account yet, create one by clicking Register in the SAP BTP
cockpit.

During Cloud Connector configuration you will need information for your SAP Datasphere subaccount. Make sure that you
have the subaccount information available in System Administration Data Source Configuration SAP BTP Core
Account .

For more information, see Set Up Cloud Connector in SAP Datasphere.

 Note
If you have an account but cannot see the account information here, enter the SAP BTP user ID. This ID is typically the
email address you used to create your SAP BTP account. After you have entered the ID, you can see the Account
Information for SAP Datasphere.

Context
For more information about the supported use cases depending on the connection type, see Preparing Cloud Connector
Connectivity.

Procedure
1. Log on to the Cloud Connector Administration on [Link]

<hostname> refers to the machine on which the Cloud Connector is installed. If it is installed on your local machine, you can
simply enter localhost.

2. To connect the SAP Datasphere subaccount to your Cloud Connector, perform the following steps:

a. In the side navigation area of the Cloud Connector Administration, click Connector to open the Connector page
and click  Add Subaccount to open the Add Subaccount dialog.

b. Enter or select the following information to add the SAP Datasphere subaccount to the Cloud Connector.

 Note
You can find the subaccount, region, and subaccount user information in SAP Datasphere under System
Administration Data Source Configuration SAP BTP Core Account Account Information .

Property Description

Region Select your region host from the list.

Subaccount Add your SAP Datasphere subaccount name.

Display Name [optional] Add a name for the account.

Subaccount User Add your subaccount (S-User) username.

Password Add your S-User password for the SAP Business Technology Platform.


Property Description

Location ID [optional] Define a location ID that identifies the location of this Cloud
Connector for the subaccount.

 Note

Using location IDs, you can connect multiple Cloud Connector instances to your subaccount. If you don't
specify any value, the default is used. For more information, see Managing Subaccounts in the SAP BTP
Connectivity documentation.

Each Cloud Connector instance must use a different location, and an error will appear if you choose a
location that has already been used.

We recommend that you leave the Location ID empty if you don't plan to set up multiple Cloud Connectors
in your system landscape.

Description [optional] Add a description for the Cloud Connector.

c. Click Save.

In the Subaccount Dashboard section of the Connector page, you can see all subaccounts added to the Cloud
Connector at a glance. After you have added your subaccount, check the status to verify that the Cloud
Connector is connected to the subaccount.

3. To allow SAP Datasphere to access on-premise systems in your network, you must specify the systems and the accessible
resources in the Cloud Connector (URL paths or function module names, depending on the protocol used). Perform the
following steps for each system that you want to make available through the Cloud Connector:

a. In the side navigation area, under your subaccount menu, click Cloud To On-Premise and then  (Add) in the
Mapping Virtual To Internal System section of the Access Control tab to open the Add System Mapping dialog.

 Note
The side navigation area shows the display name of your subaccount. If the area shows another subaccount,
select your subaccount from the Subaccount field of the Cloud Connector Administration.

b. Add your system mapping information to configure access control and save your configuration.

The procedure to add your system mapping information is specific to the protocol that you are using for
communication. The relevant protocols are:

Connection Type (Feature used with the Connection): Protocol

SAP ABAP (data flows, replication flows): RFC

SAP BW (data flows): RFC

SAP ECC (data flows): RFC

SAP S/4HANA On-Premise (data flows, replication flows, model import): RFC

SAP BW/4HANA Model Transfer (model import): HTTPS

SAP S/4HANA On-Premise (model import): HTTPS

SAP S/4HANA On-Premise (remote tables via ABAP SQL service): HTTPS

SAP ABAP on-premise only (remote tables via ABAP SQL service): HTTPS

SAP HANA on-premise only (data flows, replication flows, remote tables via SAP HANA Smart Data Access and Cloud
Connector): TCP. For information about how to enable encrypted communication, see the Security properties in
Configuring Connection Properties (SAP HANA on-premise).

Generic OData (data flows): HTTPS

Microsoft SQL Server (data flows, replication flows): TCP

Oracle (data flows): TCP

Apache Kafka on-premise only (replication flows): TCP

Generic SFTP (data flows, replication flows): TCP

Confluent - Confluent Platform on-premise only (replication flows): TCP for the Kafka broker; HTTPS for the
Schema Registry

For more information, see Configure Access Control in the SAP BTP Connectivity documentation.

 Note

When adding the system mapping information, you enter internal and virtual system information. The
internal host and port specify the actual host and port under which the backend system can be reached
within the intranet. It must be an existing network address that can be resolved on the intranet and has
network visibility for the Cloud Connector. The Cloud Connector tries to forward the request to the
network address specified by the internal host and port, so this address needs to be real. The virtual host
name and port represent the fully qualified domain name of the related system in the cloud.

We recommend using a virtual (cloud-side) name that is different from the internal name.

For ABAP-based connection types: When using load balancing, make sure to directly specify the message
server port in the System ID field of the system mapping information.

For ABAP-based connection types: The Connection Type selected in the system mapping information
(load balancing logon or connecting to a specific application server) must match the SAP Logon
Connection Type selected in SAP Datasphere connection management (Message Server or Application
Server).

If encrypted communication using TLS/SSL is defined in the SAP Datasphere connection (to establish
end-to-end encryption), ensure that the associated system mapping in the Cloud Connector does not use
TLS.

For SAP S/4HANA On-Premise connections using the ABAP SQL service for data federation with remote
tables:

Using the model import feature is not supported with the same connection.

If you want to use the same connection for remote tables and flows, you need to create two
system mappings. For more information about what to consider when creating the required
system mappings, see Using ABAP SQL Services for Accessing Data from SAP S/4HANA.

For SAP ABAP (on-premise) connections using the ABAP SQL service for data federation with remote
tables: If you want to use the same connection for remote tables and flows, you need to create two
system mappings. For more information about what to consider when creating the required system
mappings, see Using ABAP SQL Services for Accessing Data from SAP S/4HANA.

c. To grant access only to the resources needed by SAP Datasphere, select the system host you just added from the
Mapping Virtual To Internal System list. For each resource that you want to allow to be invoked on that host,
click  (Add) in the Resources Of section to open the Add Resource dialog.

d. Depending on the connection type, protocol, and use case, add the required resources:


Connection Type / Resource Type (depending on protocol) / Resources

SAP BW/4HANA Model Import
Resource Type: URL Path (for HTTPS)
Resources:

/sap/opu/odata/sap/ESH_SEARCH_SRV/SearchQueries

/sap/bw4/v1/dwc/dbinfo

/sap/bw4/v1/dwc/metadata/queryviews – Path and all sub-paths

/sap/bw4/v1/dwc/metadata/treestructure – Path and all sub-paths

/sap/bw/ina – Path and all sub-paths

SAP S/4HANA On-Premise
Resource Type: URL Path (for HTTPS)
Resources:

For model import: /

For data federation with remote tables via the ABAP SQL service: Enter the service path of the SQL
service endpoint on the SAP S/4HANA system. For example:
/sap/bc/sql/sql1/sap/s_privileged – Path and all sub-paths

Select the Upgrade Allowed option in the Add Resources dialog.

 Note
In older Cloud Connector versions, the option might appear as WebSocket or WebSocket Upgrade.

SAP ABAP; SAP S/4HANA On-Premise
Resource Type: Function Name (name of the function module for RFC)
Resources:

For accessing data using CDS view extraction:

DHAMB_ – Prefix

DHAPE_ – Prefix

RFC_FUNCTION_SEARCH

For accessing data based on tables with SAP LT Replication Server:

LTAMB_ – Prefix

LTAPE_ – Prefix

RFC_FUNCTION_SEARCH

SAP BW; SAP ECC
Resource Type: Function Name (name of the function module for RFC)
Resources: For accessing data using ODP connectivity (for legacy systems that do not have the ABAP
Pipeline Engine extension or DMIS Addon installed):

/SAPDS/ – Prefix

RFC_FUNCTION_SEARCH

RODPS_REPL_ – Prefix

SAP Datasphere, SAP BW bridge (connectivity for ODP source systems in SAP BW bridge)
Resource Type: Function Name (name of the function module for RFC)
Resources: See Add Resources to Source System.

Confluent - Confluent Platform (for the Schema Registry)
Resource Type: URL Path (for HTTPS)
Resources: /

For more information, see Configure Access Control (HTTP) and Configure Access Control (RFC) in the SAP BTP
Connectivity documentation.

e. Choose Save.

4. [optional] To enable secure network communication (SNC) for data flows, configure SNC in the Cloud Connector.

For more information, see Initial Configuration (RFC) in the SAP BTP Connectivity documentation.

Next Steps

1. If you have defined a location ID in the Cloud Connector configuration and want to use it when creating connections, you
need to add the location ID in  (System)  (Administration) Data Source Configuration .

For more information, see Set Up Cloud Connector in SAP Datasphere.

2. If you want to create SAP BW/4HANA Model Transfer connections or SAP S/4HANA On-Premise connections for model
import, you need to switch on Allow live data to securely leave my network in  (System)  (Administration)
Data Source Configuration

For more information, see Set Up Cloud Connector in SAP Datasphere.

You can now create your connections in SAP Datasphere.

For answers to the most common questions about the Cloud Connector, see Frequently Asked Questions in the SAP BTP
Connectivity documentation.

Set Up Cloud Connector in SAP Datasphere


Receive SAP Datasphere subaccount information required for Cloud Connector configuration and complete Cloud Connector
setup for creating SAP BW/4HANA Model Transfer connections and for using multiple Cloud Connector instances.

Context

The Cloud Connector allows you to connect to on-premise data sources and use them in various use cases depending on the
connection type.

For more information, see Preparing Cloud Connector Connectivity.

Procedure


1. In the side navigation area, click  (System)  (Administration) Data Source Configuration .

2. Perform the required tasks:

Receive the SAP Datasphere subaccount information that is required during Cloud Connector configuration.

To receive the SAP Datasphere subaccount information, the subaccount needs to be linked to the user ID of your
SAP BTP account. In the SAP BTP Core Account section, you can check if this has been done and the information is
already available in Account Information.

During Cloud Connector configuration, you will then need to enter the following information from your SAP
Datasphere subaccount:

Subaccount

Region Host

Subaccount User

If you have an account but cannot see the Account Information, enter the SAP BTP user ID. This ID is typically the
email address you used to create your SAP BTP account. After you have entered the ID you can see the Account
Information for SAP Datasphere:

 Note
If you don't have an SAP Business Technology Platform (SAP BTP) user account yet, create an account in the
SAP BTP cockpit by clicking Register in the cockpit.

To be able to use the Cloud Connector for SAP BW/4HANA Model Transfer connections to import analytic queries
with the Model Transfer Wizard and for SAP S/4HANA On-Premise connections to import ABAP CDS Views with the
Import Entities wizard, switch on Allow live data to securely leave my network in the Live Data Sources section.

 Note
The Allow live data to securely leave my network switch is audited, so that administrators can see who switched
this feature on and off. To see the changes in the switch state, go to  (Security)  (Activities), and search
for ALLOW_LIVE_DATA_MOVEMENT.

If you have connected multiple Cloud Connector instances to your subaccount with different location IDs and you
want to offer them for selection when creating connections using a Cloud Connector, in the On-premise data
sources section, add the appropriate location IDs. If you don't add any location IDs here, the default location will be
used.

Cloud Connector location IDs identify Cloud Connector instances that are deployed in various locations of a
customer's premises and connected to the same subaccount. Starting with Cloud Connector 2.9.0, it is possible to
connect multiple Cloud Connectors to a subaccount as long as their location ID is different.

Manage IP Allowlist
Add IP addresses to the IP Allowlist by either directly entering them or importing them from a CSV file. You can also export the IP
Allowlist.

Add IP Address to IP Allowlist


Clients in your local network need an entry in the appropriate IP allowlist in SAP Datasphere. Cloud Connectors in your local
network only require an entry if you want to use them for federation and replication with remote tables from on-premise systems.


Context
To secure your environment, you can control the range of IPv4 addresses that get access to the database of your SAP Datasphere
by adding them to an allowlist.

You need to provide the external (public) IPv4 address (range) of the client directly connecting to the database of SAP
Datasphere. This client might be an SAP HANA smart data integration Data Provisioning Agent on a server, a third-party ETL or
analytics tool, or any other JDBC client. If you're using a network firewall with a proxy, you need to provide the public IPv4
address of your proxy.

Internet Protocol version 4 addresses (IPv4 addresses) have a size of 32 bits and are represented in dot-decimal notation,
[Link] for example. The external IPv4 address is the address that the internet and computers outside your local network
can use to identify your system.

The address can either be a single IPv4 address or a range specified with a Classless Inter-Domain Routing suffix (CIDR suffix). An
example for a CIDR suffix is /24 which represents 256 addresses and is typically used for a large local area network (LAN). The
CIDR notation for the IPv4 address above would be: [Link]/24 to denote the IP addresses between [Link] and
[Link] (the leftmost 24 bits of the address in binary notation are fixed). The external (public) IP address (range) to enter
into the allowlist will be outside of the range [Link]/16. You can find more information on Classless Inter-Domain Routing on
Wikipedia .
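The CIDR arithmetic described above can be checked with Python's standard ipaddress module. This is an illustrative sketch; the addresses used are placeholders from the documentation-only range 203.0.113.0/24, not values from this guide:

```python
import ipaddress

# A /24 suffix fixes the leftmost 24 bits and covers 256 addresses.
lan = ipaddress.ip_network("203.0.113.0/24")
print(lan.num_addresses)   # 256
print(lan[0], lan[-1])     # first and last address of the range

# Check whether a single client IP falls inside an allowlisted range.
client = ipaddress.ip_address("203.0.113.42")
print(client in lan)       # True

# Private ranges don't belong in the allowlist.
print(ipaddress.ip_address("192.168.1.10").is_private)   # True
```

Checking a candidate entry this way before adding it helps keep the limited number of allowlist entries down, since one correctly sized range can replace many single-address entries.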

 Note

Private IP ranges do not need to be included in the allowlist.

The number of entries in the allowlist is limited. Once the limit has been reached, you won't be able to add
entries. Therefore, consider which IP addresses need to be added and use ranges where possible to keep the
number of allowlist entries as low as possible.

Procedure
1. In the side navigation area, click  (System)  (Configuration) IP Allowlist .

2. From the IP Allowlist dropdown, select the appropriate list:

Trusted IPs: For clients such as a Data Provisioning Agent on a server, third-party ETL or analytics tools, or any other
JDBC client

Trusted Cloud Connector IPs: For Cloud Connectors that you want to use for federation and replication with remote
tables from on-premise systems such as SAP HANA

The selected list shows all IP addresses that are allowed to connect to the SAP Datasphere database.

3. Click Add to open the Allow IP Addresses dialog.

 Note
Once the number of entries in the allowlist has reached its limit, the Add button will be disabled.

4. In the CIDR field of the dialog, either provide a single IPv4 address or a range specified with a CIDR suffix.

 Note
Please make sure that you provide the external IPv4 address of your client, or of your proxy if you are using a network
firewall. The IP you enter needs to be your public internet IP address.

5. [optional] You can add a description of up to 120 characters to help identify your IP entries.

6. In the dialog, click Add to return to the list.

7. To save your newly added IP to the allowlist on the database, click Save in the pushbutton bar of your list.

 Note
Updating the allowlist in the database requires some time. To check if your changes have been applied, click Refresh.

Next Steps
You can also select and edit an entry from the list if an IP address has changed, or delete IPs that are no longer required
to prevent them from accessing the database of SAP Datasphere. To update the allowlist in the database with any change
you made, click Save. Note that the update in the database might take some time.

Import or Export IP Allowlist


Importing or exporting IP addresses from SAP Datasphere using a CSV file simplifies the process of managing large lists of IP
addresses. This method saves time and reduces the risk of errors compared to manual entry.

Context

You could find yourself in a situation where you need to add many IP addresses to your current list of IP addresses. Rather than
entering them manually, an easier way to move IP addresses is to import or export a list from SAP Datasphere. When importing,
the file should be a CSV file using a semicolon, comma, tab, or pipe as the delimiter that separates the IP addresses and their
descriptions. The column headings must include CIDR (Classless Inter-Domain Routing) and Description. Here is an example of a
basic comma-separated CSV file:

CIDR Description

[Link] Computer1

[Link]/1 Computer 2

Here is a more complex CSV:

CIDR From To Total Amount Description

[Link] [Link] [Link] [Link] Computer 1

[Link]/16 [Link] [Link] 65534 Range 1

You can use a file produced on the same or on a different SAP Datasphere tenant.
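The import format described above can also be generated programmatically. The following sketch uses Python's standard csv module to write a minimal comma-separated allowlist file with the required CIDR and Description headings; the addresses and descriptions are placeholders, not values from this guide:

```python
import csv
import io

# Hypothetical allowlist entries: a single IPv4 address and a CIDR range.
entries = [
    {"CIDR": "203.0.113.42", "Description": "Computer 1"},
    {"CIDR": "198.51.100.0/24", "Description": "Office range"},
]

# Build the CSV in memory; writing to a file with open(..., "w", newline="")
# works the same way.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["CIDR", "Description"])
writer.writeheader()
writer.writerows(entries)

print(buf.getvalue())
```

A file built this way can then be selected in the Import IP Allowlist dialog described below.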

Procedure
1. In the side navigation area, click  (System)  (Configuration) IP Allowlist .

2. From the IP Allowlist dropdown, select the appropriate list:

Trusted IPs: For clients such as a Data Provisioning Agent on a server, third-party ETL or analytics tools, or any other
JDBC client

Trusted Cloud Connector IPs: For Cloud Connectors that you want to use for federation and replication with remote
tables from on-premise systems such as SAP HANA

The selected list shows all IP addresses that are allowed to connect to the SAP Datasphere database.

3. Choose one of these options:



Option Action

Import IP Allowlist a. Click  (Import IP Allowlist).

b. Click Select Source File, and choose the allowlist file. Click Open.

c. Choose one of the following:

Append new IPs: Add the unique IP addresses to the current list or update
existing IP descriptions.

Overwrite existing IPs: Remove the old IP addresses and add only those
addresses in this file.

d. Click Import.

Export IP Allowlist a. Click  (Export IP Allowlist).

b. Choose a comma-separated value (CSV) from the list.

c. Click Export.

Finding SAP Datasphere IP addresses


Find externally facing IP addresses and IDs that must be added to allowlists in particular remote applications before you can use
connections to these remote applications.

Remote applications might restrict access to their instances. Whether an external client such as SAP Datasphere is allowed to
access the remote application is often decided by the remote application based on allowlisted IPs. Any external client trying to
access the remote application has to be made known to the remote application, by adding the external client's IP address(es)
to an allowlist in the remote application, before first trying to access the application. As an SAP Datasphere administrator or a
user with the System Information = Read privilege, you can find the necessary information in the About dialog.

Particular remote applications or sources that you might want to access with SAP Datasphere restrict access to their instances
and require external SAP Datasphere IP address information to be added to an allowlist in the remote application before first trying
to access the application.

Users with the DW Administrator role can open a More section to find more details.

Replication/Data Flow NAT IP (egress)


To allow SAP Datasphere to access a protected remote application and to use the corresponding connection with data flows or
replication flows, add the Replication/Data Flow NAT IP (egress) to the remote application's allowlist.

Administrators can find the Replication/Data Flow NAT IP (egress) from the side navigation area by clicking  (System) 
(About) More Replication/Data Flow NAT IP (egress).

Examples

The network for Amazon Redshift, Microsoft Azure SQL Database, or SAP SuccessFactors instances, for example, is protected by a
firewall that controls incoming traffic. To be able to use connections with these connection types for data flows or replication flows,
the connected sources require the relevant SAP Datasphere network address translation (NAT) IP address to be added to an
allowlist.

For Amazon Redshift and Microsoft Azure SQL Database, find the Replication/Data Flow NAT IP (egress) in the last step of the
connection creation wizard.

SAP HANA Cloud NAT IP Addresses (Egress)

(IP address of the SAP Datasphere's SAP HANA Cloud database instance)

If connecting a REST remote source to the HANA Cloud instance through SDI (for example, OData Adapter), then the REST remote
source is accessed using one of the NAT / egress IPs.

If connecting a remote source using SDA to the HANA Cloud instance, then the connection uses the NAT / egress IP in case the
Cloud Connector is not used in the scenario.

Administrators can find the NAT IPs from the side navigation area by clicking  (System)  (About) More SAP
HANA Cloud NAT IP (egress).

For more information, see Domains and IP Ranges in the SAP HANA Cloud documentation.

Example: SAP SuccessFactors

Access to SAP SuccessFactors instances is restricted. To be able to use an SAP SuccessFactors connection for remote tables and
view building, the connected source requires the externally facing IP addresses of the SAP Datasphere tenant to be added to an
allowlist.

For more information about adding the IP addresses in SAP SuccessFactors, see Adding an IP Restriction in the SAP
SuccessFactors platform documentation.

Microsoft Azure Deployments Only: Virtual Network Subnet ID

If you're using SAP Datasphere on Microsoft Azure and want to connect to an Azure storage service in a firewall-protected
Microsoft Azure storage account within the same Azure region, an administrator must allow SAP Datasphere's Virtual Network
Subnet ID in the Microsoft Azure storage account. This is required for connections to Azure storage services such as Microsoft
Azure Data Lake Store Gen2.

For more information, see SAP Note 3405081 .

Administrators can find the ID from the side navigation area by clicking  (System)  (About) More Virtual
Network Subnet ID (Microsoft Azure).

Related Links
SAP Note 3456052 (FAQ: About IP Addresses used in SAP Datasphere)

Manage Certificates for Connections


To import a certificate into the SAP Datasphere trust chain, obtain the certificate from the target endpoint and upload it to SAP
Datasphere.

Prerequisites

You have downloaded the required SSL/TLS certificate from an appropriate website. As one option for downloading, common
browsers provide functionality to export these certificates.

 Note

Only X.509 Base64-encoded certificates enclosed between "-----BEGIN CERTIFICATE-----" and "-----END CERTIFICATE-----"
are supported. The common filename extension for the certificates is .pem (privacy-enhanced mail). We also support the
filename extensions .crt and .cer.

A certificate used in one region might differ from those used in other regions. Also, some sources, such as Amazon
Athena, might require more than one certificate.

Remember that all certificates can expire.

If you have a problem with a certificate, please contact your cloud company for assistance.
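As a quick local sanity check before uploading, you can verify that a downloaded file looks like a supported Base64-encoded X.509 certificate. This is only an illustrative sketch using Python's standard library: it checks the filename extensions and PEM markers named above and that the Base64 body decodes, not the certificate's cryptographic validity or expiry.

```python
import pathlib
import ssl

SUPPORTED_EXTENSIONS = {".pem", ".crt", ".cer"}

def looks_like_supported_cert(path: str) -> bool:
    """Rough pre-upload check: extension, PEM markers, decodable Base64 body."""
    p = pathlib.Path(path)
    if p.suffix.lower() not in SUPPORTED_EXTENSIONS:
        return False
    text = p.read_text().strip()
    if not (text.startswith("-----BEGIN CERTIFICATE-----")
            and text.endswith("-----END CERTIFICATE-----")):
        return False
    try:
        # Raises ValueError if the content between the markers is not valid Base64.
        ssl.PEM_cert_to_DER_cert(text)
    except ValueError:
        return False
    return True
```

A file that fails this check will also fail the upload or the later connection validation, so checking locally saves a round trip.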

Context
For connections secured by leveraging HTTPS as the underlying transport protocol (using SSL/TLS transport encryption), the
server certificate must be trusted.

 Note
You can create connections to remote systems that require a certificate upload without having uploaded the necessary
certificate. Validating a connection without a valid server certificate will fail, however, and you won't be able to use the connection.

Procedure
1. In the side navigation area, click  (System)  (Configuration) Security .

2. Click  Add Certificate.

3. In the Upload Certificate dialog, browse your local directory and select the certificate.

4. Enter a description providing useful information about the certificate, for example, pointing out the connection type to
which the certificate applies.

5. Choose Upload.

Results
In the overview, you can see the certificate with its creation and expiry date. From the overview, you can delete certificates if
required.

Upload Third-Party ODBC Drivers (Required for Data Flows)


To enable access to a non-SAP database via ODBC to use it as a source for data flows, you need to upload the required ODBC
driver files to SAP Datasphere.

Prerequisites
Search for the required driver files on the internet, make sure you have selected the correct driver files (identified by their
SHA256-formatted fingerprints), and download them from an appropriate web page (see below).

Ensure you have a valid license for the driver files.



Context

Drivers are required for the following connection types (if several driver versions are supported, we recommend using the newest
supported version mentioned below):

Connection Type / Driver to be uploaded / Download Site

Amazon Redshift Connections
Drivers to be uploaded:

AmazonRedshiftODBC-64-bit-1.4.11.1000-1.x86_64.rpm (SHA256 fingerprint:
6d811e2f198a030274bf9f099d4c828b1b071b78e99432eee1531d4988768a22)

AmazonRedshiftODBC-64-bit-1.4.65.1000-1.x86_64.rpm (SHA256 fingerprint:
ee79a8d41760a90b6fa2e1a074e33b0518e3393afd305f0bee843b5393e10df0)

Download site: [Link]

Oracle Connections
Driver to be uploaded:

instantclient-basiclite-linux.x64-[Link].[Link] (SHA256 fingerprint:
ea4a9557c6355f5b56b648b7dff47db79a1403b7e9f7abeca9e1a0e952498e13)

 Note
Make sure to select the Basic Light package zip file. The package applies to all versions
supported by the Oracle connection type (Oracle 12c, Oracle 18c, and Oracle 19c).

Additional files are required if SSL is used:

[Link] (SHA256 fingerprint:
e408e7ae67650917dbce3ad263829bdc6c791d50d4db2fd59aeeb5503175499b)

osdt_cert.jar (SHA256 fingerprint:
6b152d4332bd39f258a88e58b9215a926048d740e148971fe1628b09060176a8)

osdt_core.jar (SHA256 fingerprint:
c25e30184bb94c6da1227c8256f0e1336acb97b29229edb4aacf27167b96075e)

Before uploading the files, you must rename them following the names already indicated:
[Link], osdt_cert.jar, osdt_core.jar.

Download sites: Driver: [Link] (basiclite-linux); additional files if SSL is used:
[Link], [Link], [Link], [Link], [Link]

Google BigQuery Connections
Drivers to be uploaded:

SimbaODBCDriverforGoogleBigQuery_2.[Link] (SHA256 fingerprint:
abf4551d621c26f4fa30539e7ece2a47daaf6e1d67c59e5b7e79c43a3335018f)

SimbaODBCDriverforGoogleBigQuery_3.[Link] (SHA256 fingerprint:
58d3c9acfb93f0d26c081a230ff664a16c8544d567792ebc5436beb31e9e28e4)

Download sites: [Link] (release/odbc/Si), [Link]

When uploading the drivers, they are identified by their SHA256-formatted fingerprint. You can verify the fingerprint with the
following command:

Windows 10: In PowerShell, run the following command:

Get-Filehash <driver file> -Algorithm SHA256

Linux/MacOS: In a unix-compliant shell, run the following command:

shasum -a 256 <driver file>
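The same fingerprint check can also be scripted. Here is a minimal sketch using Python's standard hashlib module, equivalent to the shasum and PowerShell commands above; the local driver file name is a hypothetical example, and the expected fingerprint is the one listed for the Amazon Redshift driver:

```python
import hashlib
from pathlib import Path

def sha256_fingerprint(path: str) -> str:
    """Return the SHA256 fingerprint of a file as a lowercase hex string."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large driver files don't have to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the downloaded file against the published fingerprint before uploading.
expected = "6d811e2f198a030274bf9f099d4c828b1b071b78e99432eee1531d4988768a22"
driver = Path("AmazonRedshiftODBC-64-bit-1.4.11.1000-1.x86_64.rpm")
if driver.exists():
    ok = sha256_fingerprint(str(driver)) == expected
    print("fingerprint matches" if ok else "MISMATCH - do not upload this file")
```

Only upload a driver whose computed fingerprint matches the one listed for it; the upload will otherwise be rejected.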

Upload a Driver

Perform the following steps before creating the first Amazon Redshift, Oracle, or Google BigQuery connection that you want to use
for data flows.

1. In the side navigation area, click  (System)  (Configuration) Data Integration .

2. Go to Third-Party Drivers and choose  Upload.

3. In the following dialog box, choose Browse to select the driver file from your download location.

 Note
The fingerprint of the driver file name to be uploaded must match the fingerprint mentioned above.

4. Choose Upload.

5. Choose  sync to synchronize the driver with the underlying component. Wait for about 5 to 10 minutes to finish
synchronization before you start creating connections or using data flows with the connection.

Remove (and Re-Upload) a Driver

You might need to remove a driver when you want to upload a new version of the driver or when your license agreement has terminated.

1. Select the driver and choose  Delete.

2. If you're using a connection that requires the removed driver for data flows, choose  Upload to re-upload the driver to
make sure that you can continue using the data flows.

3. Choose  sync to synchronize the driver changes with the underlying component. Once the synchronization has finished,
you can continue using data flows with the connection, or, if you haven't uploaded a new driver, you won't be able to use
data flows with the connection anymore unless you re-upload the driver.

Troubleshooting
If a data flow fails with an error message saying that the driver could not be found, check that the drivers are uploaded and start
synchronization.

Authorize Spaces to Install SAP Business Data Cloud Data Products


An SAP Datasphere administrator must choose the spaces to which SAP Business Data Cloud data products from an activated
data package can be installed.

 Note
This procedure only applies to manual data product installation. It doesn't apply to the installation of SAP Business Data Cloud
intelligent applications.

Context
SAP systems provide their SAP Business Data Cloud data products to SAP Datasphere via SAP Business Data Cloud formations
(see Integrate SAP Business Data Cloud Provisioned Systems in the SAP Business Data Cloud documentation). When your SAP
Datasphere tenant is added to an SAP Business Data Cloud formation, the connections to the source systems of the formation
become available in SAP Datasphere. Both systems and connections can be found under  (System)  (Configuration)
Business Data Products .

Before an SAP Datasphere modeler can install data products from an SAP system to any target spaces, an SAP Datasphere
administrator must authorize these spaces in  (System)  (Configuration) Business Data Products .


Procedure

1. In the side navigation area of SAP Datasphere, click  (System)  (Configuration) Business Data Products .

2. [optional] Select the system (with its connection) and choose Edit Business Name to provide a more meaningful business
name for the connection.

3. Select the system and click  (Details) to open the side panel.

4. In the side panel, choose Add to authorize one or more spaces to install data products from the system, and confirm your
selection.

5. To remove one or more spaces from your selection, choose Remove.

 Note
You can only remove a space if no data products are installed in this space (see the Data Products Installed column in
the list of selected spaces).

Results

An SAP Datasphere modeler installing the data products in the catalog can select authorized spaces as target spaces (in the
Import Entities wizard). When a data product is installed:

The connection is created in an ingestion space if it does not already exist.

The data product objects are created and deployed in the ingestion space and shared with the target spaces selected
during installation.

For more information, see:

Evaluating and Installing Data Products

Business Data Product Connections

Prepare Connectivity to Adverity


To be able to successfully validate and use a connection to Adverity for view building, certain preparations have to be made.

Before you can use the connection, the following is required:

In an Adverity workspace, you have prepared a datastream that connects to the data source for which you want to create
the connection.

In SAP Datasphere, you have added the necessary Adverity IP addresses to the IP allowlist. For more information, see
Manage IP Allowlist.

 Note
To get the relevant IP addresses, please contact your Adverity Account Manager or the Adverity Support team.

Related Information
Adverity Connections


Prepare Connectivity to Amazon Athena


To be able to successfully validate and use a connection to Amazon Athena for remote tables, certain preparations have to be
made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

A DW administrator has uploaded the server certificates to SAP Datasphere. Two certificates are required, one for Amazon
Athena and one for Amazon S3. Region-specific certificates might be required for Amazon Athena. Alternatively, if the
common root CA certificate contains trust for both endpoints, Amazon Athena and Amazon Simple Storage Service
(API/Athena and Data/S3), you can upload the root certificate.

For more information, see Manage Certificates for Connections.

Related Information
Amazon Athena Connections

Prepare Connectivity to Apache Kafka


To be able to successfully validate and use a connection to Apache Kafka (on-premise) for replication flows, certain preparations
have to be made.

Replication Flows
Before you can use the connection for replication flows, the following is required:

An administrator has installed and configured Cloud Connector to connect to the Apache Kafka on-premise
implementation.

Related Information
Apache Kafka Connections

Prepare Connectivity to Confluent


To be able to successfully validate and use a connection to Confluent Platform (on-premise) for replication flows, certain
preparations have to be made.

Replication Flows
Before you can use the connection for replication flows, the following is required:

An administrator has installed and configured Cloud Connector to connect to Confluent Platform (Kafka brokers) and to
the Schema Registry.

 Note

Separate Cloud Connector instances might be used for the two endpoints. The Schema Registry might be reachable through one
Cloud Connector location while the Kafka brokers are connected through another location.

For more information, see Configure Cloud Connector.

Related Information
Confluent Connections

Prepare Connectivity to Amazon Redshift


To be able to successfully validate and use a connection to an Amazon Redshift database for remote tables or data flows, certain
preparations have to be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the CamelJdbcAdapter.

For more information, see Preparing Data Provisioning Agent Connectivity.

An administrator has downloaded and installed the required JDBC library in the <DPAgent_root>/camel/lib folder
and restarted the Data Provisioning Agent before registering the adapter with SAP Datasphere.

Data Flows

Before you can use the connection for data flows, the following is required:

The outbound IP has been added to the source allowlist.

For information on where a DW administrator can find the IP address, see Finding SAP Datasphere IP addresses.

A DW administrator has uploaded the required ODBC driver file to SAP Datasphere.

For more information, see Upload Third-Party ODBC Drivers (Required for Data Flows).

Related Information
Amazon Redshift Connections

Prepare Connectivity for Cloud Data Integration


To be able to successfully validate and use a Cloud Data Integration connection for remote tables or data flows, certain
preparations have to be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the CloudDataIntegrationAdapter.

For more information, see Preparing Data Provisioning Agent Connectivity.

For ABAP-based cloud SAP systems such as SAP S/4HANA Cloud or SAP Marketing Cloud: A communication arrangement
has been created for communication scenario SAP_COM_0531 in the source system.

For more information, see Integrating CDI in the SAP S/4HANA Cloud documentation.

Data Flows

Before you can use the connection for data flows, the following is required:

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

For more information, see Configure Cloud Connector.

For ABAP-based cloud SAP systems such as SAP S/4HANA Cloud or SAP Marketing Cloud: A communication arrangement
has been created for communication scenario SAP_COM_0531 in the source system.

For more information, see Integrating CDI in the SAP S/4HANA Cloud documentation.

Related Information
Cloud Data Integration Connections

Prepare Connectivity for Generic JDBC


To be able to successfully validate and use a Generic JDBC connection for remote tables, certain preparations have to be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the CamelJdbcAdapter.

For more information, see Preparing Data Provisioning Agent Connectivity.

It has been checked that the data source is supported by the CamelJdbcAdapter.

For latest information about supported data sources and versions, see the SAP HANA Smart Data Integration Product
Availability Matrix (PAM) .

 Note
For information about unsupported data sources, see SAP Note 3130999 .

An administrator has downloaded and installed the required JDBC library in the <DPAgent_root>/camel/lib folder
and restarted the Data Provisioning Agent before registering the adapter with SAP Datasphere.

For more information, see Set up the Camel JDBC Adapter in the SAP HANA Smart Data Integration and SAP HANA Smart
Data Quality Installation and Configuration Guide.

For information about the proper JDBC library for your source, see the SAP HANA smart data integration Product
Availability Matrix (PAM).

Related Information
Generic JDBC Connections

Prepare Connectivity for Generic OData


To be able to successfully validate and use a connection to an OData service for remote tables or data flows, certain preparations
have to be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

The OData service URL needs to be publicly available.

A DW administrator has uploaded the server certificate to SAP Datasphere.

For more information, see Manage Certificates for Connections.

Data Flows
Before you can use the connection for data flows, the following is required:

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

For more information, see Configure Cloud Connector.

Related Information
Generic OData Connections

Prepare Connectivity for Generic SFTP


To connect to the SFTP server, a public host key is required to verify the server's identity. Additionally, to successfully validate and
use a Generic SFTP connection to an on-premise SFTP server, Cloud Connector is required.

Data Flows and Replication Flows


Before you can use the connection for data flows and replication flows, the following is required:

Expected Format for Host Key

The file provided as the host key is expected to contain one or more lines, each composed of the following elements:

<server host key algorithm> <base64-encoded public key> <optional comment>

 Example

The following is a valid file with two entries:

ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEMXBFYDfcYMW0dccgbJ/TfhpTQhc5oR06jKIg+WCarr myuser@myhost


ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDRqWbaMxSetrsAtTHFaxym4rVqV1yb4umqhDJbJ0H63T+wn8lm+Ev/i/u+
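As a sanity check, a short shell sketch (the file name and key contents below are illustrative, not real keys) can verify that each line follows the expected layout:

```shell
# Write a sample host key file in the expected layout:
# <algorithm> <base64-encoded key> <optional comment>
cat > host_key.txt <<'EOF'
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEMXBFYDfcYMW0dccgbJ/TfhpTQhc5oR06jKIg+WCarr myuser@myhost
ssh-rsa AAAAB3NzaC1yc2EAAAADAQAB myuser@myhost
EOF

# Check that every line starts with a plausible algorithm name
# and carries a non-empty key field.
ok=1
while read -r algo key comment; do
  case "$algo" in
    ssh-*|ecdsa-*) [ -n "$key" ] || ok=0 ;;
    *) ok=0 ;;
  esac
done < host_key.txt
[ "$ok" -eq 1 ] && echo "host_key.txt format ok"
```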

Obtain Host Key

Provide the host public key through a trusted channel. If your Windows 10, Linux, or macOS machine has a trusted channel,
perform the following steps, replacing these elements with the specified values:

$HOST with the host name value of your connection

$PORT with the port value of your connection

Use the resulting file host_key.[Link], created in the directory where you run the specified command, as the Host Key for your
connection. The specified commands are as follows:

Windows 10: In PowerShell, run the following command:

(ssh-keyscan -p $PORT $HOST 2>$null) -replace '^[^ ]* ','' > host_key.[Link]

Linux/macOS: In a Unix-compliant shell with both the ssh-keyscan and sed commands installed, obtain the key with the
following command:

ssh-keyscan -p $PORT $HOST 2>/dev/null | sed "s/^[^ ]* //" > host_key.[Link]

 Note
If your machine doesn't have a trusted channel, we recommend asking your administrator for the public host key to avoid man-
in-the-middle attacks.
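To see what the filter in the commands above does: ssh-keyscan prints the host name as the first field of each line, and the -replace/sed expression strips that field so only the algorithm, key, and comment remain (the key below is a shortened placeholder):

```shell
# Simulated ssh-keyscan output line: "<host> <algorithm> <key>"
echo "myhost.example.com ssh-ed25519 AAAAC3NzaExampleKey" | sed "s/^[^ ]* //"
# prints: ssh-ed25519 AAAAC3NzaExampleKey
```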

Cloud Connector for On-Premise SFTP Servers

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

For more information, see Configure Cloud Connector.

Related Information
Generic SFTP Connections

Prepare Connectivity to Google BigQuery


To be able to successfully validate and use a connection to a Google BigQuery data source for remote tables, certain preparations
have to be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

A DW administrator has uploaded the server certificate to SAP Datasphere.

 Note
The root certificates GTS Root R1 and GTS Root R4 (valid until 2036) are required. In your browser, open
[Link] (Google Trust Services Repository) to download the certificates (supported filename
extensions are .pem and .crt).

For more information, see SAP Notes 3424000 and 3567141 .

For more information, see Manage Certificates for Connections.

Data Flows and Replication Flows


Before you can use the connection for data flows and replication flows, the following is required:

A DW administrator has uploaded the required ODBC driver file to SAP Datasphere.

For more information, see Upload Third-Party ODBC Drivers (Required for Data Flows).

Related Information
Google BigQuery Connections

Prepare Connectivity to Microsoft Azure SQL Database


To be able to successfully validate and use a connection to Microsoft Azure SQL database for remote tables or data flows, certain
preparations have to be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the MssqlLogReaderAdapter.

For more information, see Preparing Data Provisioning Agent Connectivity.

An administrator has downloaded and installed the required JDBC library in the <DPAgent_root>/lib folder before
registering the adapter with SAP Datasphere.

To use Microsoft SQL Server trigger-based replication, the user entered in the connection credentials needs to have the
required privileges and permissions. For more information, see Required Permissions for SQL Server Trigger-Based
Replication in the Installation and Configuration Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data
Quality.

Data Flows and Replication Flows


Before you can use the connection for data flows and replication flows, the following is required:

The outbound IP has been added to the source allowlist.

For information on where a DW administrator can find the IP address, see Finding SAP Datasphere IP addresses.

Related Information
Microsoft Azure SQL Database Connections


Prepare Connectivity to Microsoft Azure Data Lake Store Gen2


To be able to successfully validate and use a connection to Microsoft Azure Data Lake Store Gen2, certain preparations have to be
made.

Data Flows and Replication Flows


Before you can use the connection for data flows and replication flows, the following is required:

If you're using SAP Datasphere on Microsoft Azure and want to connect to Microsoft Azure Data Lake Store Gen2 in a
firewall-protected Microsoft Azure storage account within the same Azure region: An Azure administrator must grant SAP
Datasphere access to the Microsoft Azure storage account.

For more information, see Finding SAP Datasphere IP addresses.

Related Information
Microsoft Azure Data Lake Store Gen2 Connections

Prepare Connectivity to Microsoft SQL Server


To be able to successfully validate and use a connection to a Microsoft SQL Server, certain preparations have to be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the MssqlLogReaderAdapter.

For more information, see Preparing Data Provisioning Agent Connectivity.

An administrator has downloaded and installed the required JDBC library in the <DPAgent_root>/lib folder before
registering the adapter with SAP Datasphere.

For more information, see Required Permissions for SQL Server Trigger-Based Replication in the SAP HANA Smart Data
Integration and SAP HANA Smart Data Quality Installation and Configuration Guide.

Data Flows and Replication Flows


Before you can use the connection for data flows and replication flows, the following is required:

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

For more information, see Configure Cloud Connector.

 Note
Cloud Connector is not required if your Microsoft SQL Server database is available on the public internet.

The required driver is pre-bundled and doesn't need to be uploaded by an administrator.

Related Information
Microsoft SQL Server Connections

Prepare Connectivity to SAP Open Connectors


Integrate SAP Open Connectors with SAP Datasphere to be able to connect to third-party data sources powered by SAP Open
Connectors.

Preparations in SAP BTP and SAP Open Connectors Account

1. Set up an SAP BTP account and enable the SAP Integration Suite service with the SAP Open Connectors capability.

 Note
You need to know your SAP BTP subaccount information (provider, region, environment, trial - yes/no) later to select the
appropriate SAP BTP subaccount region in SAP Datasphere when integrating the SAP Open Connectors account in your
space.

For information about setting up an SAP BTP trial version with the SAP Integration Suite service, see Set Up Integration
Suite Trial . To enable SAP Open Connectors, you need to activate the Extend Non-SAP Connectivity capability in the
Integration Suite.

For information about setting up SAP Integration Suite from a production SAP BTP account, see Initial Setup in the SAP
Integration Suite documentation.

For information about SAP Open Connectors availability in data centers, see SAP Note 2903776 .

2. In your SAP Open Connectors account, create connector instances for the sources that you want to connect to SAP
Datasphere.

For more information about creating an instance, see Authenticate a Connector Instance (UI) in the SAP Open Connectors
documentation.

For more information about connector-specific setup and connector-specific properties required to create an instance, see
Connectors Catalog in the SAP Open Connectors documentation. There, click the connector in question and then
<connector name> API Provider Setup or <connector name> Authenticate a Connector Instance.

3. In your SAP Open Connectors account, record the following information which you will require later in SAP Datasphere:

Organization secret and user secret - required when integrating the SAP Open Connectors account in your space.

Name of the connector instance - required when selecting the instance in the connection creation wizard.

Preparations in SAP Datasphere


1. In the side navigation area, click  (Connections), select a space if necessary, click the SAP Open Connectors tab, and then
click Integrate your SAP Open Connectors Account to open the Integrate your SAP Open Connectors Account dialog.

2. In the dialog, provide the following data:

a. In the SAP BTP Sub Account Region field, select the appropriate entry according to your SAP BTP subaccount
information (provider, region, environment, trial - yes/no).

b. Enter your SAP Open Connectors organization and user secret.

3. Click OK to integrate your SAP Open Connectors account with SAP Datasphere.

Results
With connection type Open Connectors you can now create connections to the third-party data sources available as connector
instances with your SAP Open Connectors account.

Related Information
Open Connectors Connections

Prepare Connectivity to Oracle


To be able to successfully validate and use a connection to an Oracle database for remote tables or data flows, certain
preparations have to be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the OracleLogReaderAdapter.

For more information, see Preparing Data Provisioning Agent Connectivity.

An administrator has downloaded and installed the required JDBC library in the <DPAgent_root>/lib folder before
registering the adapter with SAP Datasphere.

For more information about the supported JDBC libraries, see the Product Availability Matrix (PAM) for SAP HANA smart data
integration 2.0 and all its patches. Search the internet for the required library and download it from an appropriate web
page.

Required Permissions for Oracle Trigger-Based Replication in the SAP HANA Smart Data Integration and SAP HANA Smart
Data Quality Installation and Configuration Guide

If encrypted communication is used (connection is configured to use SSL), the server certificate must be uploaded to the
Data Provisioning Agent.

To retrieve the certificate, you can use for example the following command: openssl s_client -showcerts -
servername <host name of the Oracle database server> -connect <host name of the Oracle
database server>:<port number of the Oracle database server>

For more information about uploading the certificate to the Data Provisioning Agent, see:

Configure the Adapter Truststore and Keystore in the SAP HANA Smart Data Integration and SAP HANA Smart Data
Quality documentation

Configure SSL for the Oracle Log Reader Adapter in the SAP HANA Smart Data Integration and SAP HANA Smart
Data Quality documentation
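To save only the server certificate from the output of the openssl command above, the s_client output can be piped through openssl x509. The sketch below demonstrates the extraction step on a throwaway self-signed certificate instead of a live server (the host name and file names are illustrative):

```shell
# Generate a throwaway self-signed certificate to stand in for the
# chain returned by "openssl s_client -showcerts" (illustrative only).
openssl req -x509 -newkey rsa:2048 -keyout key.pem -out chain.pem \
  -days 1 -nodes -subj "/CN=oracle.example.com" 2>/dev/null

# "openssl x509 -outform PEM" keeps only the first certificate block;
# on a live server you would pipe the s_client output into it instead:
#   openssl s_client -showcerts -servername <host> -connect <host>:<port> \
#     </dev/null | openssl x509 -outform PEM > server_cert.pem
openssl x509 -outform PEM -in chain.pem > server_cert.pem
openssl x509 -in server_cert.pem -noout -subject
```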

Data Flows

Before you can use the connection for data flows, the following is required:

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

For more information, see Configure Cloud Connector.

 Note
Cloud Connector is not required if your Oracle database is available on the public internet.

A DW administrator has uploaded the required ODBC driver file to SAP Datasphere.

To use encrypted communication (connection is configured to use SSL), additional files are required to be uploaded.

For more information, see Upload Third-Party ODBC Drivers (Required for Data Flows).

A DW administrator has uploaded the server certificate to SAP Datasphere.

To retrieve the certificate, you can use for example the following command: openssl s_client -showcerts -
servername <host name of the Oracle database server> -connect <host name of the Oracle
database server>:<port number of the Oracle database server>

For more information, see Manage Certificates for Connections.

Related Information
Oracle Connections

Prepare Connectivity to Precog


To be able to successfully validate and use a connection to Precog for view building, certain preparations have to be made.

Before you can use the connection, the following is required:

In Precog, you have added the source for which you want to create the connection.

In SAP Datasphere, you have added the necessary Precog IP addresses to the IP allowlist. For more information, see
Manage IP Allowlist.

 Note
You can find and copy the relevant IP addresses in the final step of the connection creation wizard.

Related Information
Precog Connections

Prepare Connectivity to SAP ABAP Systems


To be able to successfully validate and use a connection to an SAP ABAP system for remote tables or data flows, certain
preparations have to be made.

Remote Tables

If you want to use federated access to CDS view entities using the ABAP SQL service exposure from SAP S/4HANA (on-premise),
see Using ABAP SQL Services for Accessing Data from SAP S/4HANA (recommended for federation scenarios).

If you want to use federated access to CDS view entities using the ABAP SQL service exposure from SAP S/4HANA Cloud, see
Using ABAP SQL Services for Accessing Data from SAP S/4HANA Cloud (recommended for federation scenarios).

If you want to federate and replicate data from ABAP-based on-premise systems using SAP HANA smart data integration, the
following is required before you can use the connection (legacy):

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the ABAPAdapter.

For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data
Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required.

For more information, see Preparing Data Provisioning Agent Connectivity.

The ABAP user specified in the credentials of the SAP ABAP connection needs to have a specific set of authorizations in the
SAP ABAP system. For more information, see: Authorizations in the SAP HANA Smart Data Integration and SAP HANA
Smart Data Quality documentation.

To access and copy data from SAP BW objects such as InfoProviders or characteristics, the appropriate authorization
objects like S_RS_ADSO or S_RS_IOBJA are required for the ABAP user. For more information, see Overview: Authorization
Objects in the SAP NetWeaver documentation.

If you want to access data from SAP BW Queries, make sure that the ABAP user has the required analysis authorizations to
read the data and that characteristic 0TCAIPROV (InfoProvider) in the authorization includes @Q, which is the prefix for
Queries as InfoProviders. For more information, see Defining Analysis Authorizations in the SAP NetWeaver documentation.

If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to
configure suitable security privileges for successful registration on an SAP Gateway and you need to create an RFC
destination of type TCP/IP in the ABAP source system. With the RFC destination you register the Data Provisioning Agent
as server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming.

To be able to use ABAP Dictionary tables from connections to an SAP BW∕4HANA system for remote tables and creating
views, make sure that SAP Note 2872997 has been applied to the system.

Data Flows

 Note
The availability of the data flow feature depends on the version and Support Package level of the ABAP-based SAP system
(SAP S/4HANA or the DMIS add-on in the source). Make sure your source systems meet the required minimum versions. We
recommend using the latest available version of SAP S/4HANA and the DMIS add-on where possible and having the latest SAP
Notes and TCI notes implemented in your systems.

For more information about required versions, recommended system landscape, considerations for the supported source
objects, and more, see SAP Note 2890171.

Before you can use the connection for data flows, the following is required:

If the connected system is an on-premise source, an administrator has installed and configured Cloud Connector.

In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.

For more information, see Configure Cloud Connector.

See also: SAP Note 2835207 (ABAP connection type for SAP Data Intelligence)

If you want to connect to SAP S/4HANA Cloud to replicate extraction-enabled, C1-released CDS views: Consider the
information about preparing an SAP S/4HANA Cloud connection for data flows.

For more information, see Prepare Connectivity to SAP S/4HANA Cloud.


Replication Flows

 Note
The availability of the replication flow feature depends on the version and Support Package level of the ABAP-based SAP
system (SAP S/4HANA or the DMIS add-on in the source). Make sure your source systems meet the required minimum
versions. We recommend using the latest available version of SAP S/4HANA and the DMIS add-on where possible and having the
latest SAP Notes and TCI notes implemented in your systems.

For more information about required versions, recommended system landscape, considerations for the supported source
objects, and more, see SAP Note 2890171.

Before you can use the connection for replication flows, the following is required:

If the connected system is an on-premise source, an administrator has installed and configured Cloud Connector.

In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.

For more information, see Configure Cloud Connector.

See also: SAP Note 2835207 (ABAP connection type for SAP Data Intelligence)

If you want to connect to SAP S/4HANA Cloud to replicate extraction-enabled, C1-released CDS views or you want to
replicate CDS view entities using the SQL service exposure: Consider the information about preparing an SAP S/4HANA
Cloud connection for replication flows.

For more information, see Prepare Connectivity to SAP S/4HANA Cloud.

Related Information
SAP ABAP Connections

Prepare Connectivity to SAP BW


To be able to successfully validate and use a connection to SAP BW for remote tables or data flows, certain preparations have to be
made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the ABAPAdapter.

For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data
Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required.

For more information, see Preparing Data Provisioning Agent Connectivity.

The ABAP user specified in the credentials of the SAP ABAP connection needs to have a specific set of authorizations in the
SAP ABAP system. For more information, see: Authorizations in the SAP HANA Smart Data Integration and SAP HANA
Smart Data Quality documentation.

To access and copy data from SAP BW objects such as InfoProviders or characteristics, the appropriate authorization
objects like S_RS_ADSO or S_RS_IOBJA are required for the ABAP user. For more information, see Overview: Authorization
Objects in the SAP NetWeaver documentation.

If you want to access data from SAP BW Queries, make sure that the ABAP user has the required analysis authorizations to
read the data and that characteristic 0TCAIPROV (InfoProvider) in the authorization includes @Q, which is the prefix for
Queries as InfoProviders. For more information, see Defining Analysis Authorizations in the SAP NetWeaver documentation.

If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to
configure suitable security privileges for successful registration on an SAP Gateway and you need to create an RFC
destination of type TCP/IP in the ABAP source system. With the RFC destination you register the Data Provisioning Agent
as server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming.

To be able to use ABAP Dictionary tables from connections to an SAP BW∕4HANA system for remote tables and creating
views, make sure that SAP Note 2872997 has been applied to the system.

Data Flows
Before you can use the connection for data flows, the following is required:

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.

For more information, see Configure Cloud Connector.

Related Information
SAP BW Connections

Preparing SAP BW/4HANA Model Transfer Connectivity


Accessing SAP BW/4HANA metadata and importing models into SAP Datasphere with an SAP BW/4HANA Model Transfer
connection requires two protocols (or endpoints): HTTP and SAP HANA Smart Data Integration based on the SAP HANA adapter.

For accessing SAP BW∕4HANA, HTTP is used to securely connect to the SAP BW∕4HANA system via Cloud Connector, and SAP
HANA SQL is used to connect to the SAP HANA database of SAP BW∕4HANA via the Data Provisioning Agent. Using Cloud Connector
to make HTTP requests to SAP BW∕4HANA requires a live data connection of type tunnel to SAP BW∕4HANA.

For information on supported SAP BW/4HANA source versions, see Supported Source Versions for SAP BW∕4HANA Model
Transfer Connections.

Before creating a connection for SAP BW/4HANA Model Transfer in SAP Datasphere, you need to prepare the following:

1. In SAP BW∕4HANA, make sure that the following services are active in transaction code SICF:

BW InA - BW Information Access Services:

/sap/bw/ina/GetCatalog

/sap/bw/ina/GetResponse

/sap/bw/ina/GetServerInfo

/sap/bw/ina/ValueHelp

/sap/bw/ina/BatchProcessing

/sap/bw/ina/Logoff

/sap/bw4

2. In SAP BW∕4HANA, activate OData service ESH_SEARCH_SRV in Customizing (transaction SPRO) under SAP
NetWeaver Gateway OData Channel Administration General Settings Activate and Maintain Services .

3. Install and configure Cloud Connector. For more information, see Configure Cloud Connector.

4. In the side navigation area of SAP Datasphere, click System Administration Data Source Configuration Live Data
Sources and switch on Allow live data to leave my network.

For more information, see Set Up Cloud Connector in SAP Datasphere.

5. In the side navigation area of SAP Datasphere, click System Administration Data Source Configuration On-premise
data sources and add the location ID of your Cloud Connector instance.

For more information, see Set Up Cloud Connector in SAP Datasphere.

6. In the side navigation area of SAP Datasphere, open System Configuration Data Integration Live Data Connections
(Tunnel) and create a live data connection of type tunnel to SAP BW∕4HANA.

For more information, see Create Live Data Connection of Type Tunnel.

7. Install and configure a Data Provisioning Agent and register the SAP HANA adapter with SAP Datasphere:

Install the latest Data Provisioning Agent version on a local host or update your agent to the latest version. For more
information, see Install the Data Provisioning Agent.

In SAP Datasphere, add the external IPv4 address of the server on which your Data Provisioning Agent is running, or
in case you are using a network firewall add the public proxy IP address to the IP allowlist. For more information, see
Manage IP Allowlist.

Connect the Data Provisioning Agent to SAP Datasphere. For more information, see Connect and Configure the Data
Provisioning Agent.

In SAP Datasphere, register the SAP HANA adapter with SAP Datasphere. For more information, see Register
Adapters with SAP Datasphere.
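As a quick sanity check before creating the connection, the InA service paths from step 1 can be assembled into URLs for the Cloud Connector virtual host and probed (for example with a browser or curl). The following sketch only builds the URLs; the host and port values are placeholders for your own virtual host mapping, not real endpoints.

```python
# SICF service paths that must be active in SAP BW/4HANA (see step 1).
INA_SERVICE_PATHS = [
    "/sap/bw/ina/GetCatalog",
    "/sap/bw/ina/GetResponse",
    "/sap/bw/ina/GetServerInfo",
    "/sap/bw/ina/ValueHelp",
    "/sap/bw/ina/BatchProcessing",
    "/sap/bw/ina/Logoff",
    "/sap/bw4",
]

def service_urls(virtual_host: str, https_port: int) -> list:
    """Build the HTTPS URL for each required service path."""
    return [
        "https://{}:{}{}".format(virtual_host, https_port, path)
        for path in INA_SERVICE_PATHS
    ]

# Placeholder values - replace with your Cloud Connector virtual host mapping.
for url in service_urls("virtual-bw-host", 44300):
    print(url)
```

Each URL should answer through the Cloud Connector once the services are active and the access control entries are in place.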

Related Information
SAP BW∕4HANA Model Transfer Connections

Create Live Data Connection of Type Tunnel


To securely connect and make HTTP requests to SAP BW∕4HANA, you need to connect via Cloud Connector. This requires that you
create a live data connection of type tunnel to the SAP BW∕4HANA system.

Prerequisites
See the prerequisites 1 to 5 in Preparing SAP BW/4HANA Model Transfer Connectivity.

Procedure
1. In the side navigation area, click  (System)  (Configuration) Data Integration .

2. In the Live Data Connections (Tunnel) section, click Manage Live Data Connections.

The Manage Live Data Connections dialog appears.

3. On the Connections tab, click  (Add Connection).

The Select a data source dialog appears.

4. Expand Connect to Live Data and select SAP BW.

The New BW Live Connection dialog appears.

5. Enter a name and description for your connection. Note that the connection name cannot be changed later.

6. Set the Connection Type to Tunnel.

By enabling tunneling, data from the connected source will always be transferred through the Cloud Connector.

7. Select the Location ID.

 Note
In the next step, you will need to specify the virtual host that is mapped to your on-premise system. This depends on the
settings in your selected Cloud Connector location.

8. Add your SAP BW∕4HANA host name, HTTPS port, and client.

Use the virtual host name and virtual port that were configured in the Cloud Connector.

9. Optional: Choose a Default Language from the list.

This language will always be used for this connection and cannot be changed by users without administrator privileges.

 Note
You must know which languages are installed on your SAP BW∕4HANA system before adding a language code. If the
language code you enter is invalid, SAP Datasphere will default to the language specified by your system metadata.

10. Under Authentication Method, select User Name and Password.

11. Enter user name (case sensitive) and password of the technical user for the connection.

The user needs the following authorizations:

Authorization object S_BW4_REST (authorization field: BW4_URI, value: /sap/bw4/v1/dwc*)

Authorization object SDDLVIEW (authorization field: DDLSRCNAME, value: RSDWC_SRCH_QV)

Read authorizations for SAP BW∕4HANA metadata (Queries, CompositeProviders and their InfoProviders)

Using authorizations for SAP BW∕4HANA metadata, you can restrict a model transfer connection to a designated
semantic SAP BW/4HANA area.

For more information, see Overview: Authorization Objects in the SAP BW∕4HANA documentation.

12. Select Save this credential for all users on this system.

13. Click OK.

 Note
While saving the connection, the system checks if it can access /sap/bc/ina/ services in SAP BW∕4HANA.

Results
The connection is saved and now available for selection in the SAP Datasphere connection creation wizard for the SAP BW∕4HANA
Model Transfer connection.
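The BW4_URI value /sap/bw4/v1/dwc* in the S_BW4_REST authorization (step 11) acts as a prefix pattern on the request URI. As a rough illustration only (SAP authorization wildcard semantics are simplified here to a glob match), a URI check might look like this:

```python
import fnmatch

# Value of the BW4_URI authorization field from step 11.
BW4_URI_PATTERN = "/sap/bw4/v1/dwc*"

def uri_covered(uri: str) -> bool:
    """Approximate check whether a request URI falls under the
    S_BW4_REST authorization (glob semantics, illustrative only)."""
    return fnmatch.fnmatch(uri, BW4_URI_PATTERN)
```

Requests to paths outside the pattern (for example the InA services) are governed by the other authorizations listed above, not by S_BW4_REST.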

Supported Source Versions for SAP BW∕4HANA Model Transfer Connections

To create a connection of type SAP BW/4HANA Model Transfer, the SAP BW∕4HANA system must have a specific version.

These versions of SAP BW∕4HANA are supported:

SAP BW∕4HANA 2.0 SPS07 or higher, with the following SAP Notes applied:

2989654 BW/4 - Enable DWC "Import from Connection" for BW/4 Query - Revision 1

2714624 Version Comparison False Result

2754328 Disable creation of HTTP Security Sessions per request

2840529 Sporadic HTTP 403 CSRF token validation errors

2976147 Import of query views in the BW/4 hybrid scenario: No search results of BW back ends with SAP_BASIS
Release 753

SAP BW∕4HANA 2.0 SPS01 to SPS06 after you have applied the following SAP Notes:

2943200 TCI for BW4HANA 2.0 Hybrid

2945277 BW/4 - Enable DWC "Import from Connection" for BW/4 Query

2989654 BW/4 - Enable DWC "Import from Connection" for BW/4 Query - Revision 1

2714624 Version Comparison False Result

2754328 Disable creation of HTTP Security Sessions per request

2840529 Sporadic HTTP 403 CSRF token validation errors

2976147 Import of query views in the BW/4 hybrid scenario: No search results of BW back ends with SAP_BASIS
Release 753
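The two version lists above can be summarized as follows: the five correction notes apply in all cases, while SPS01 to SPS06 additionally require the TCI and enablement notes. A hypothetical helper, for documentation purposes only (not an official compatibility check):

```python
# SAP Notes from the lists above; the first two are only needed for SPS01-SPS06.
SPS01_06_EXTRA_NOTES = ["2943200", "2945277"]
COMMON_NOTES = ["2989654", "2714624", "2754328", "2840529", "2976147"]

def required_notes(sps_level: int) -> list:
    """Return the SAP Notes to apply for SAP BW/4HANA 2.0 at the
    given Support Package Stack level (illustrative summary)."""
    if sps_level < 1:
        raise ValueError("SAP BW/4HANA 2.0 SPS01 or higher is required")
    if sps_level <= 6:
        return SPS01_06_EXTRA_NOTES + COMMON_NOTES
    return list(COMMON_NOTES)
```

Always check the notes themselves for the exact validity ranges, since corrections may already be contained in later Support Packages.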

Prepare Connectivity to SAP ECC


To be able to successfully validate and use a connection to SAP ECC for remote tables or data flows, certain preparations have to
be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the ABAPAdapter.

For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data
Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required.

For more information, see Preparing Data Provisioning Agent Connectivity.

The ABAP user specified in the credentials of the SAP ABAP connection needs to have a specific set of authorizations in the
SAP ABAP system. For more information, see: Authorizations in the SAP HANA Smart Data Integration and SAP HANA
Smart Data Quality documentation.

If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to
configure suitable security privileges for successful registration on an SAP Gateway and you need to create an RFC
destination of type TCP/IP in the ABAP source system. With the RFC destination you register the Data Provisioning Agent
as server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming.

Data Flows
Before you can use the connection for data flows, the following is required:

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.

For more information, see Configure Cloud Connector.

Related Information
SAP ECC Connections

Prepare Connectivity to SAP Fieldglass


To be able to successfully validate and use a connection to SAP Fieldglass for remote tables or data flows, certain preparations
have to be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the CloudDataIntegrationAdapter.

For more information, see Preparing Data Provisioning Agent Connectivity.

Related Information
SAP Fieldglass Connections

Prepare Connectivity to SAP HANA


To be able to successfully validate and use a connection to SAP HANA Cloud or SAP HANA (on-premise) for remote tables or data
flows, certain preparations have to be made.

SAP HANA Cloud


A DW administrator has uploaded the server certificate to SAP Datasphere.

A DW administrator has uploaded the TLS server certificate DigiCert Global Root CA
([Link]).

For more information, see Manage Certificates for Connections.

SAP HANA on-premise

Remote Tables

Before you can use the connection for remote tables, the following is required:

If you want to use SAP HANA Smart Data Integration:

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere
and registered the HanaAdapter.

For more information, see Preparing Data Provisioning Agent Connectivity.

If you use encrypted communication (see the Security properties in the connection creation wizard):

An administrator has already correctly configured Data Provisioning Agent for SSL support.

For more information, see Configure SSL for SAP HANA On-Premise [Manual Steps] in the SAP HANA Smart Data
Integration and SAP HANA Smart Data Quality documentation.

If you want to use SAP HANA Smart Data Access:

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

For more information, see Configure Cloud Connector.

An administrator has added the Cloud Connector IP address to the IP allowlist.

For more information, see Manage IP Allowlist.

If you use encrypted communication and the server certificate should be validated (see the Security properties in
the connection creation wizard):

A DW administrator has uploaded the server certificate to SAP Datasphere.

For more information, see Manage Certificates for Connections.

Data Flows and Replication Flows

For SAP HANA (on-premise), before you can use the connection for data flows and replication flows, the following is required:

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

For more information, see Configure Cloud Connector.

Related Information
SAP HANA Connections

Prepare Connectivity to SAP Marketing Cloud


To be able to successfully validate and use a connection to SAP Marketing Cloud for remote tables or data flows, certain
preparations have to be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the CloudDataIntegrationAdapter.

For more information, see Preparing Data Provisioning Agent Connectivity.

A communication arrangement has been created for communication scenario SAP_COM_0531 in the source system.

For more information, see Integrating CDI in the SAP Marketing Cloud documentation.

Data Flows
Before you can use the connection for data flows, the following is required:

A communication arrangement has been created for communication scenario SAP_COM_0531 in the source system.

For more information, see Integrating CDI in the SAP Marketing Cloud documentation.

Related Information
SAP Marketing Cloud Connections

Prepare Connectivity to SAP SuccessFactors


To be able to successfully validate and use a connection to SAP SuccessFactors for remote tables or data flows, certain
preparations have to be made.

Before you can use the connection, the following is required:

A DW administrator has uploaded the server certificate to SAP Datasphere.

Example for Certificate Download Site: [Link]

For more information, see Manage Certificates for Connections.

When using OAuth 2.0 for authentication:

SAP Datasphere must be registered in SAP SuccessFactors.

For more information, see Registering Your OAuth2 Client Application in the SAP SuccessFactors platform
documentation.

A SAML assertion needs to be generated to be able to provide it when creating or editing the connection.

For an overview of the available options to generate a SAML assertion, see Generating a SAML Assertion in the SAP
SuccessFactors platform documentation.

In SAP SuccessFactors IP restriction management, you have added the externally facing SAP HANA IP addresses and the
outbound IP address for SAP Datasphere to the list of IP restrictions. IP restrictions are a specified list of IP addresses from
which users can access your SAP SuccessFactors system.

For more information, see:

IP Restrictions in the SAP SuccessFactors platform documentation

Finding SAP Datasphere IP addresses

Related Information
SAP SuccessFactors Connections


Prepare Connectivity to SAP S/4HANA Cloud


To be able to successfully validate and use a connection to SAP S/4HANA Cloud, certain preparations have to be made.

Remote Tables
Before you can use the connection for remote tables, the following is required:

For federated access to CDS view entities using the ABAP SQL service exposure from SAP S/4HANA Cloud (recommended
for federation scenarios):

See Using ABAP SQL Services for Accessing Data from SAP S/4HANA Cloud.

For federated access to and replication of ABAP CDS Views exposed as OData services for data extraction using Cloud Data
Integration (legacy):

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere
and registered the CloudDataIntegrationAdapter.

For more information, see Preparing Data Provisioning Agent Connectivity.

A communication arrangement has been created for communication scenario SAP_COM_0531 in the source
system.

For more information, see Integrating CDI in the SAP S/4HANA Cloud documentation.

Data Flows
Before you can use the connection for data flows, the following is required:

If you want to replicate CDS views:

A communication arrangement has been created for communication scenario SAP_COM_0532 in the SAP S/4HANA Cloud
system.

For more information, see Integrating CDS Views Using SAP Datasphere in the SAP S/4HANA Cloud documentation.

Replication Flows
Before you can use the connection for replication flows, the following is required:

For replicating CDS view entities using the ABAP SQL service exposure from SAP S/4HANA Cloud (recommended for
replication scenarios):

See Using ABAP SQL Services for Accessing Data from SAP S/4HANA Cloud.

For both replicating CDS view entities using the ABAP SQL service exposure and replicating CDS views using the ABAP
Pipeline Engine:

A communication arrangement has been created for communication scenario SAP_COM_0532 in the SAP S/4HANA Cloud
system.

For more information, see:

replicating CDS view entities using the ABAP SQL service exposure:

Integrating SQL Services Using SAP Datasphere in the SAP S/4HANA Cloud documentation

Creating a Communication Arrangement to Enable Replication Flows in SAP Datasphere in the ABAP Cloud
documentation

replicating CDS views using the ABAP Pipeline Engine:

Integrating CDS Views Using SAP Datasphere in the SAP S/4HANA Cloud documentation

 Note
The same communication user must be added to all communication arrangements you're using for the connection.

If you want to use RFC fast serialization for your replication flows, see SAP Note 3486245 .

Model Import
Before you can use the connection for model import, the following is required:

A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered
CloudDataIntegrationAdapter.

For more information, see Preparing Data Provisioning Agent Connectivity.

In the SAP S/4HANA Cloud system, communication arrangements have been created for the following communication
scenarios:

SAP_COM_0532

For more information, see Integrating CDS Views Using SAP Datasphere in the SAP S/4HANA Cloud documentation.

SAP_COM_0531

For more information, see Integrating CDI in the SAP S/4HANA Cloud documentation.

SAP_COM_0722

For more information, see Integrating SAP Data Warehouse Cloud in the SAP S/4HANA Cloud documentation.

 Note
The same communication user must be added to all communication arrangements you're using for the connection.

Related Information
SAP S/4HANA Cloud Connections

Using ABAP SQL Services for Accessing Data from SAP S/4HANA Cloud

The ABAP SQL service provides SQL-level access to published CDS view entities for SAP Datasphere. You can use the service to
replicate data with replication flows or to federate data with remote tables.

For more information, see Accessing ABAP-Managed Data Using SQL Services for Data Integration Scenarios in the SAP S/4HANA
Cloud Public Edition documentation.

 Note
This feature requires developer extensibility in SAP S/4HANA Cloud (including ABAP development tools), which is only
available in a 3-system landscape. For more information, see the SAP S/4HANA Cloud Public Edition documentation:

Developer Extensibility

System Landscapes in SAP S/4HANA Cloud

For both consumption scenarios using the SQL service, data federation and data replication, privileged data access needs to be
enabled for communication users in SAP S/4HANA Cloud. For more information about the consumption scenarios and privileged
access, see Data Integration Patterns in the ABAP Cloud documentation for SAP S/4HANA Cloud Public Edition.

Data Federation With Remote Tables


In SAP S/4HANA Cloud, a business user and administrator must perform the following steps to prepare data federation with
remote tables:

There are some prerequisites and constraints that must be considered before using the SQL service.

For more information, see Prerequisites and Constraints in the ABAP Cloud documentation. Note that for SAP Datasphere
the ODBC driver installation is not required (the driver is pre-installed on the SAP HANA database).

To expose CDS view entities using the SQL service, an SAP S/4HANA Cloud business user has created a service definition
and a corresponding service binding of type SQL1 in the ABAP Development Tools. The service definition lists the set of
CDS view entities that shall be exposed, and a service binding of type SQL for that service definition enables their exposure
via the ABAP SQL Service.

In the Enabled Operations area of the service binding, the business user must select access type SELECT to enable
federated access.

For more information, see Creating a Service Definition and an SQL-Typed Service Binding in the ABAP Cloud
documentation.

To expose the SQL service to get privileged access to the CDS view entities with a communication user, a communication
arrangement is required. This involves the following steps:

1. An SAP S/4HANA Cloud business user has created a custom communication scenario in the ABAP Development
Tools.

When filling out the authorizations for authorization object S_SQL_VIEW in the communication scenario, note the
following:

On the Sources tab of the Data Builder view editors in SAP Datasphere, the service binding name from the
SQL_SCHEMA authorization field is visible as (virtual) schema.

In the SQL_VIEWOP authorization field, select the option SELECT to grant federated access.

2. An administrator has created a communication system and user in the SAP Fiori launchpad of the ABAP
environment.

 Note
The same communication user must be added to all communication arrangements you're using for the
connection.

3. An administrator has created a communication arrangement for exposing the SQL service in the SAP Fiori
launchpad of the ABAP environment.

For more information, see Exposing the SQL Service for Data Federation and Replication with Privileged Access in the ABAP
Cloud documentation.

You can now create a connection to consume the ABAP SQL service for data federation with remote tables using the ABAP SDA
adapter in SAP HANA.


Data Replication With Replication Flows

In SAP S/4HANA Cloud, a business user and administrator must perform the following steps to prepare data replication with
replication flows:

There are some prerequisites and constraints that must be considered before using the SQL service.

For more information, see Prerequisites and Constraints in the ABAP Cloud documentation. Note that for SAP Datasphere
the ODBC driver installation is not required (the driver is pre-installed on the SAP HANA database).

To expose CDS view entities using the SQL service, an SAP S/4HANA Cloud business user has created a service definition
and a corresponding service binding of type SQL1 in the ABAP Development Tools. The service definition lists the set of
CDS view entities that shall be exposed, and a service binding of type SQL for that service definition enables their exposure
via the ABAP SQL Service.

In the Enabled Operations area of the service binding, the business user must select access type REPLICATE to enable
data replication.

For more information, see Creating a Service Definition and an SQL-Typed Service Binding in the ABAP Cloud
documentation.

To expose the SQL service to get privileged access to the CDS view entities with a communication user, a communication
arrangement is required. This involves the following steps:

1. An SAP S/4HANA Cloud business user has created a custom communication scenario in the ABAP Development
Tools.

When filling out the authorizations for authorization object S_SQL_VIEW in the communication scenario, note the
following:

In the SQL_VIEWOP authorization field, select the option REPLICATE to allow replication on the specified
views.

2. An administrator has created a communication system and user in the SAP Fiori launchpad of the ABAP
environment.

 Note
The same communication user must be added to all communication arrangements you're using for the
connection.

3. An administrator has created a communication arrangement for exposing the SQL service in the SAP Fiori
launchpad of the ABAP environment.

For more information, see Exposing the SQL Service for Data Federation and Replication with Privileged Access in the ABAP
Cloud documentation.

An administrator has created a communication arrangement for communication scenario SAP_COM_0532 in the SAP Fiori
launchpad of the ABAP environment.

For more information, see Replication Flows.

You can now create a connection to consume the ABAP SQL service for data replication with replication flows using the ABAP
Pipeline Engine.
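In both walkthroughs above, the access type chosen under Enabled Operations on the SQL service binding determines which consumption scenario the connection supports. A small illustrative mapping (the scenario names are descriptive, not API identifiers):

```python
def allowed_scenarios(enabled_operations):
    """Map the access types enabled on an SQL-typed service binding to
    the SAP Datasphere consumption scenarios they unlock."""
    scenarios = []
    if "SELECT" in enabled_operations:
        # SELECT grants federated read access via remote tables.
        scenarios.append("data federation with remote tables")
    if "REPLICATE" in enabled_operations:
        # REPLICATE allows data replication via replication flows.
        scenarios.append("data replication with replication flows")
    return scenarios
```

The same access types must also be granted in the SQL_VIEWOP field of the S_SQL_VIEW authorization in the communication scenario, as described in the steps above.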

Prepare Connectivity to SAP S/4HANA On-Premise


To be able to successfully validate and use a connection to SAP S/4HANA, certain preparations have to be made.

This topic contains the following sections:

Remote Tables

Data Flows

Replication Flows

Model Import (Data Access: Remote Tables)

Model Import (Data Access: Replication Flow to Local Tables)

Remote Tables
If you want to use federated access to CDS view entities using the ABAP SQL service exposure from SAP S/4HANA, see Using
ABAP SQL Services for Accessing Data from SAP S/4HANA (recommended for federation scenarios).

If you want to federate and replicate data using SAP HANA smart data integration, the following is required before you can use the
connection (legacy):

An administrator has connected an SAP HANA smart data integration Data Provisioning Agent to SAP Datasphere and
registered the ABAPAdapter.

For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data
Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required.

For more information, see Preparing Data Provisioning Agent Connectivity.

The ABAP user specified in the credentials of the SAP ABAP connection needs to have a specific set of authorizations in the
SAP ABAP system. For more information, see: Authorizations in the SAP HANA Smart Data Integration and SAP HANA
Smart Data Quality documentation.

If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to
configure suitable security privileges for successful registration on an SAP Gateway and you need to create an RFC
destination of type TCP/IP in the ABAP source system. With the RFC destination you register the Data Provisioning Agent
as server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming.

Data Flows

 Note
The availability of the data flow feature depends on the version and Support Package level of SAP S/4HANA or the DMIS
add-on in the source. Make sure your source systems meet the required minimum versions. We recommend using the latest
available version of SAP S/4HANA and the DMIS add-on where possible and implementing the latest SAP Notes and TCI notes
in your systems.

For more information about required versions, recommended system landscape, considerations for the supported source
objects, and more, see SAP Note 2890171.

Before you can use the connection for data flows, the following is required:

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.

For more information, see Configure Cloud Connector.

See also: SAP Note 2835207 (ABAP connection type for SAP Data Intelligence)


Replication Flows

 Note
The availability of the replication flow feature depends on the version and Support Package level of SAP S/4HANA or the
DMIS add-on in the source. Make sure your source systems meet the required minimum versions. We recommend using the
latest available version of SAP S/4HANA and the DMIS add-on where possible and implementing the latest SAP Notes and TCI
notes in your systems.

For more information about required versions, recommended system landscape, considerations for the supported source
objects, and more, see SAP Note 2890171.

Before you can use the connection for replication flows, the following is required:

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

In the Cloud Connector configuration, an administrator has made sure that access to the required resources is granted.

For more information, see Configure Cloud Connector.

See also: SAP Note 2835207 (ABAP connection type for SAP Data Intelligence)

Making use of fast serialization requires the following prerequisites:

The endpoint is either RFC or RFCLB (for load balancing via message server). Fast serialization is not available for
endpoints WSRFC or SQL.

The SAP S/4HANA on-premise system needs to support the feature.

In the SAP S/4HANA on-premise system, the feature has not been disabled via parameter
APE_DISABLE_RUCKSACK in the DHBAS_RUNTIME configuration table.

For more information about using fast serialization in SAP Datasphere and its prerequisites, see SAP Note 3486245 .
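The fast-serialization prerequisites above boil down to a simple predicate; this sketch uses illustrative parameter names, not actual configuration keys:

```python
def fast_serialization_available(endpoint: str,
                                 system_supports_feature: bool,
                                 ape_disable_rucksack_set: bool) -> bool:
    """Check the fast serialization prerequisites described above.
    endpoint: "RFC", "RFCLB", "WSRFC", or "SQL"."""
    # Fast serialization only works for RFC and load-balanced RFC endpoints.
    if endpoint not in ("RFC", "RFCLB"):
        return False
    # The system must support the feature and must not have disabled it
    # via APE_DISABLE_RUCKSACK in the DHBAS_RUNTIME configuration table.
    return system_supports_feature and not ape_disable_rucksack_set
```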

Model Import (Data Access: Remote Tables)


Supported source versions: SAP S/4HANA 1809 or higher (SAP_BASIS 753 and higher)

Before you can use the connection to import entities with data access Remote Tables, the following is required:

In SAP S/4HANA

An administrator has followed the instructions from SAP Note 3081998 to properly set up the SAP S/4HANA system,
which includes:

1. SAP Note 3283282 has been implemented to provide the required infrastructure in the SAP S/4HANA system.

2. The required corrections have been implemented and checks have been performed to make sure that SAP Note
3283282 and subsequent corrections have been applied properly and all required objects to provide the
infrastructure are available and activated.

3. Report ESH_CSN_CDS_TO_CSN has been run to prepare the CDS Views for the import.

An administrator has created a technical user with the following authorizations:

Authorization object S_SERVICE - service authorizations for the Enterprise Search search service


Field Value

SRV_NAME EF608938F3EB18256CE851763C2952

SRV_TYPE HT

Authorization object SDDLVIEW - Search access authorization for the search view CSN_EXPOSURE_CDS

Field Value

DDLNAME <leave empty - this field is not used>

DDLSRCNAME CSN_EXPOSURE_CDS

ACTVT 03

Authorizations for remote table access via ODP

An administrator has checked that the required InA services are active in transaction code SICF:

/sap/bw/ina/GetCatalog

/sap/bw/ina/GetResponse

/sap/bw/ina/GetServerInfo

/sap/bw/ina/ValueHelp

/sap/bw/ina/BatchProcessing

/sap/bw/ina/Logoff

An administrator has activated OData service ESH_SEARCH_SRV in Customizing (transaction SPRO) under SAP
NetWeaver Gateway OData Channel Administration General Settings Activate and Maintain Services .

Cloud Connector

An administrator has installed and configured Cloud Connector to connect to your on-premise source.

For more information, see Configure Cloud Connector.

Data Provisioning Agent

For the remote tables that will be created during the import, the respective prerequisites have to be met including a Data
Provisioning Agent with the ABAPAdapter registered in SAP Datasphere.

For more information, see Remote Tables.

In SAP Datasphere

In System Administration Data Source Configuration Live Data Sources , you have switched on Allow live data to
securely leave my network.

For more information, see Set Up Cloud Connector in SAP Datasphere.

In System Administration Data Source Configuration On-premise data sources , you have added the location ID of your
Cloud Connector instance.

For more information, see Set Up Cloud Connector in SAP Datasphere.

In System Configuration Data Integration Live Data Connections (Tunnel) , you have created a live data
connection of type tunnel to SAP S/4HANA.

For more information, see Create SAP S/4HANA Live Data Connection of Type Tunnel.
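The technical-user authorization values from the tables above can be captured as data for review scripts. This is an illustrative representation only, not an SAP API, and the ODP access authorizations are omitted:

```python
# Authorization objects and field values required for the technical user
# (taken from the tables above).
REQUIRED_AUTHS = {
    "S_SERVICE": {
        "SRV_NAME": "EF608938F3EB18256CE851763C2952",
        "SRV_TYPE": "HT",
    },
    "SDDLVIEW": {
        "DDLNAME": "",  # leave empty - this field is not used
        "DDLSRCNAME": "CSN_EXPOSURE_CDS",
        "ACTVT": "03",
    },
}

def missing_authorizations(user_auths: dict) -> list:
    """Return the authorization objects whose field values do not
    match the required setup."""
    return [
        obj for obj, fields in REQUIRED_AUTHS.items()
        if user_auths.get(obj) != fields
    ]
```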

Model Import (Data Access: Replication Flow to Local Tables)


Supported source versions: SAP S/4HANA 2021 or higher (SAP_BASIS 756 and higher)

Before you can use the connection to import entities with data access Replication Flow to Local Tables, the following is required:

1. You have met all prerequisites mentioned in section Model Import (Data Access: Remote Tables).

2. You have met all prerequisites mentioned in SAP Note 3463326 .

Related Information
SAP S/4HANA On-Premise Connections

Create SAP S/4HANA Live Data Connection of Type Tunnel


To securely connect to SAP S/4HANA on-premise when searching for ABAP CDS Views to be imported with the Import Entities
wizard, you need to connect via Cloud Connector. This requires that you create a live data connection of type tunnel to the SAP
S/4HANA system.

Prerequisites
See: Model Import (Data Access: Remote Tables)

Procedure
1. In the side navigation area, click  (System)  (Configuration) Data Integration .

2. In the Live Data Connections (Tunnel) section, click Manage Live Data Connections.

The Manage Live Data Connections dialog appears.

3. On the Connections tab, click  (Add Connection).

The Select a data source dialog appears.

4. Expand Connect to Live Data and select SAP S/4HANA.

The New S/4HANA Live Connection dialog appears.

5. Enter a name and description for your connection. Note that the connection name cannot be changed later.

6. Set the Connection Type to Tunnel.

7. Select the Location ID.

 Note
In the next step, you will need to specify the virtual host that is mapped to your on-premise system. This depends on the
settings in your selected Cloud Connector location.

8. Add your SAP S/4HANA host name, HTTPS port, and client.

Use the virtual host name and virtual port that were configured in the Cloud Connector.

9. Optional: Choose a Default Language from the list.

This language will always be used for this connection and cannot be changed by users without administrator privileges.

 Note
You must know which languages are installed on your SAP S/4HANA system before adding a language code. If the
language code you enter is invalid, SAP Datasphere will default to the language specified by your system metadata.

10. Under Authentication Method select User Name and Password.

11. Enter user name (case sensitive) and password of the technical user for the connection.

12. Select Save this credential for all users on this system.

13. Click OK.

Results
The connection is saved and now available for selection in the SAP Datasphere connection creation wizard for the SAP S/4HANA
On-Premise connection.

Using ABAP SQL Services for Accessing Data from SAP S/4HANA
The ABAP SQL service provides SQL-level access to published CDS view entities for SAP Datasphere. You can use the service to
federate data with remote tables. Using the service requires Cloud Connector.

 Note

This feature requires developer extensibility in SAP S/4HANA (including ABAP development tools). For more
information, see Developer Extensibility in the ABAP Platform documentation for your SAP S/4HANA system.

For data federation using the SQL service, privileged data access needs to be enabled for communication users in SAP
S/4HANA. For more information, see Access Scenarios in the ABAP Platform documentation for your SAP S/4HANA
system.

Make sure the SAP S/4HANA system you want to connect is based on the ABAP platform 2021 FPS01 or higher where
the ABAP SQL service is available.

Data federation with remote tables using the ABAP SQL service is supported for SAP Logon connection type
Application Server and basic authentication with User Name and Password.

When a connection is configured for using the ABAP SQL service for data federation with remote tables, you can't use
the same connection for model import.

Perform the following steps to prepare data federation with remote tables:

Configure Cloud Connector to use the ABAP SQL service (see Configure Cloud Connector), paying particular attention to
the following configuration steps:

1. When adding the system mapping to the SAP S/4HANA system, select HTTPS protocol.

 Note
When you want to use a connection for both data or replication flows and remote tables, you need to create two
system mapping entries in the Cloud Connector considering the following:


Feature: Data flow and replication flow
Protocol: RFC
Host: Enter the same host for both system mapping entries.
Virtual Host: Enter the same virtual host for both system mapping entries. Also, the virtual host must be the same in the Cloud Connector system mapping and in the connection's Cloud Connector properties.
Virtual Port: The virtual port is derived from the instance number (system number) entered in the system mapping: sapgw<system number>.

Feature: Remote tables
Protocol: HTTPS
Host: Enter the same host for both system mapping entries.
Virtual Host: Enter the same virtual host for both system mapping entries.
Virtual Port: Enter the same virtual port in the Cloud Connector system mapping and in the connection's Cloud Connector properties.

In the SAP Datasphere Connections app, you must enter the virtual port for the HTTPS protocol and the virtual
host in separate fields. Deriving virtual host and port is not supported in the Connections app because of the
different virtual ports used in the two system mappings.
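The derivation of the RFC virtual port from the instance number can be sketched as follows. This is a hypothetical helper for illustration only, not part of any SAP tooling:

```python
def rfc_virtual_port(instance_number):
    """Derive the RFC virtual port name sapgw<system number> from a
    two-digit SAP instance (system) number."""
    if len(instance_number) != 2 or not instance_number.isdigit():
        raise ValueError("instance number must be two digits, e.g. '00'")
    return "sapgw" + instance_number
```

For instance number 00, the RFC system mapping would therefore use virtual port sapgw00.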

2. When adding resources, specify the URL path:

a. Enter the service path of the SQL service endpoint on the SAP S/4HANA system. For example:
/sap/bc/sql/sql1/sap/s_privileged.

b. Select the Upgrade Allowed option.

 Note
In older Cloud Connector versions, the option might appear as WebSocket or WebSocket Upgrade.

For more information, see:

Configure Cloud Connector

Configure Access Control (HTTP) in the SAP BTP Connectivity documentation
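The service path of the SQL service endpoint follows a fixed structure. The sketch below composes it from the service binding name; the structure is inferred from the documented example /sap/bc/sql/sql1/sap/s_privileged, and the helper name and the default of s_privileged for privileged access are illustrative assumptions that you should verify against your own system:

```python
def sql_service_path(binding_name, access="s_privileged"):
    """Compose the Cloud Connector resource path for an ABAP SQL service
    endpoint, following the structure of the documented example
    /sap/bc/sql/sql1/sap/s_privileged. Verify against your system."""
    return "/sap/bc/sql/{}/sap/{}".format(binding_name.lower(), access)
```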

In SAP S/4HANA, a business user and administrator must perform the following steps to prepare data federation with
remote tables:

1. Review the prerequisites and constraints that must be considered before using the SQL service.

For more information, see Prerequisites and Constraints in the ABAP Platform documentation for your SAP
S/4HANA system.

2. To expose CDS view entities using the SQL service, an SAP S/4HANA business user has created a service definition
and a corresponding service binding of type SQL1 in the ABAP Development Tools. The service definition lists the set
of CDS view entities that shall be exposed, and a service binding of type SQL for that service definition enables their
exposure via the ABAP SQL Service.

For more information, see Creating a Service Definition and an SQL-Typed Service Binding in the ABAP Platform
documentation for your SAP S/4HANA system.

3. To expose the SQL service to get privileged access to the CDS view entities with a communication user, a role is
required.

For more information, see Creating a Role for Privileged Access in the ABAP Platform documentation for your SAP
S/4HANA system.

Managing and Monitoring Connectivity for Data Integration


Users with an administrator role can monitor and troubleshoot Data Provisioning Agent and Cloud Connector connectivity.

Monitoring Data Provisioning Agent in SAP Datasphere


For connected Data Provisioning Agents, you can proactively become aware of resource shortages on the agent instance and find
more useful information.

In Configuration Data Integration On-Premise Agents choose the Monitor button to display the agents with the following:

Information about free and used physical memory and swap memory on the Data Provisioning Agent server.

Information about when the agent was connected the last time.

Information about the overall number of connections that use the agent and the number of connections that actively use
real-time replication, with active real-time replication meaning that the connection type supports real-time replication and
for the connection at least one table is replicated via real-time replication.

You can change to the Connections view to see the agents with a list of all connections they use and their real-time
replication status. You can pause real-time replication for an agent's connections while applying changes to the agent. For
more information, see Pause Real-Time Replication for an Agent.

Monitoring Data Provisioning Agent Logs


Access the Data Provisioning Agent adapter framework log and the adapter framework trace log directly in SAP Datasphere.

With the integrated log access, you don’t need to leave SAP Datasphere to monitor the agent and analyze agent issues. Accessing
the log data happens via the Data Provisioning Agent File adapter which reads the log files and saves them into the database of
SAP Datasphere.

The following logs are available:

Log File Name and Location on Data Provisioning Agent Server Description

<DPAgent_root>/log/framework_alert.trc Data Provisioning Agent adapter framework log. Use this file to
monitor data provisioning agent statistics.

<DPAgent_root>/log/framework.trc Data Provisioning Agent adapter framework trace log. Use this file
to trace and debug data provisioning agent issues.

You can review the logs in SAP Datasphere after log access has been enabled for the agent in question. We display the actual log
files as well as up to ten archived log files that follow the naming convention framework.trc.<x> respectively
framework_alert.trc.<x>, with <x> being a number between one and ten.
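The naming convention for the archived log files can be enumerated programmatically; a minimal sketch with a hypothetical helper:

```python
def archived_log_names(base_name, count=10):
    """List the archived log file names SAP Datasphere can display:
    <base>.1 through <base>.<count>."""
    return ["{}.{}".format(base_name, i) for i in range(1, count + 1)]
```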


Related Information
Enable Access to Data Provisioning Agent Logs
Review Data Provisioning Agent Logs

Enable Access to Data Provisioning Agent Logs


Enable access to an agent’s log files before you can view them in SAP Datasphere.

Prerequisites
A Data Provisioning Agent administrator has provided the necessary File adapter configuration with an access token that you need
for enabling the log access in SAP Datasphere.

To define the access token in the agent's secure storage, the administrator has performed the following steps in the agent
configuration tool in command-line interactive mode:

1. At the command line, navigate to <DPAgent_root>/bin.

2. Start the agent configuration tool with the setSecureProperty parameter.

On Windows: agentcli.bat --setSecureProperty

On Linux: ./agentcli.sh --setSecureProperty

3. Choose Set FileAdapter Access Token and define a new token: Under Enter File Adapter Access Token, enter the token,
make a note of it, confirm it, and press Enter to quit the configuration tool.

For more information, see SAP Note 2554427 .

For more information about the File adapter configuration, see File in the Installation and Configuration Guide of the SAP HANA
Smart Data Integration and SAP HANA Smart Data Quality documentation.

Procedure
1. From the  main menu, open Configuration Data Integration .

2. On the agent’s tile, click Edit.

3. In the Agent Settings dialog, set Enable Log Access to true.

4. In the FileAdapter Password field that appears, enter the File adapter access token.

5. Click Save to activate the log access.

Results

The Review Logs entry in the menu of the agent’s tile is enabled, and the framework_alert.trc and framework.trc logs are
written to the database of SAP Datasphere. You can now review the current and archived log files from the agent's tile.

Review Data Provisioning Agent Logs


Use the logs to monitor the agent and analyze issues with the agent.


Prerequisites
The logs are written to the database of SAP Datasphere. For more information, see Enable Access to Data Provisioning Agent Logs.

Procedure
1. From the  main menu, open Configuration Data Integration .

2. On the agent’s tile, click Review Logs.

The Review Agent Logs dialog initially shows 50 log entries. To load further chunks of 50 entries each, scroll down to the
bottom of the dialog and use the More button.

3. To show the complete message for a log entry, click More in the Message column.

4. You have the following options to restrict the results in the display of the logs:

Search: In the <agent name> field, enter a search string and click  (Search) to search in the messages of the logs.

Filters: You can filter based on time, message type and log file name. When you’ve made your selection, click Apply
Filters.

 Note
If your local time zone differs from the time zone used in the Data Provisioning Agent logs and you're applying a
time-based filter, you might get other filter results than expected.

5. Optional: Export the logs as a CSV file to your local system. Note that filters and search restrictions are applied to the
exported file.
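To avoid the surprising filter results mentioned in the note above, you can convert your local filter boundaries to the agent's time zone before filtering. A sketch using Python's standard zoneinfo module; the agent time zone string is an assumption that you must supply for your own landscape:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def to_agent_time(local_dt, agent_tz):
    """Convert a timezone-aware timestamp to the time zone used in the
    Data Provisioning Agent logs, so a time-based filter matches the
    timestamps actually written to the log."""
    return local_dt.astimezone(ZoneInfo(agent_tz))

# Example: a filter boundary given in UTC, agent logs written in Berlin time.
boundary = to_agent_time(datetime(2025, 7, 9, 8, 45, tzinfo=ZoneInfo("UTC")),
                         "Europe/Berlin")
```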

Receive Notifications About Data Provisioning Agent Status Changes

For a selected SAP HANA Smart Data Integration Data Provisioning Agent, you can configure notifications to be sent when the agent’s
status changes from connected to disconnected or the other way round.

Prerequisites
To run recurring scheduled tasks on your behalf, you need to authorize the job scheduling component of SAP Datasphere. In your
profile settings under Schedule Consent Settings, you can give and revoke your consent to SAP Datasphere to run your scheduled
tasks in the future. Note that if you don't give your consent or revoke it, tasks that you own won't be executed and
will fail.

For more information, see Changing SAP Datasphere Settings.

Context
A recurring task will check for any status changes according to the configured frequency and send the notifications to the user
who is the owner of the configuration. The initial owner is the user who created the configuration. Any user with the appropriate
administration privileges can take over the ownership for this task if required, for example in case of vacation replacement or when
the previous owner left the department or company.

Procedure
1. In the side navigation area, click  (System)  (Configuration) Data Integration .

2. Go to the On-Premise Agents section and click  (menu) Configure Sending Notifications.

3. If you haven't authorized SAP Datasphere yet to run your scheduled tasks for you, you will see a message at the top of the
Configure Sending Notifications dialog asking for your consent. Give your consent.

4. Switch on the Send Notifications toggle.

An additional field Owner appears that shows that you have been automatically assigned as the owner of the task.

5. Select the frequency in which the status of the Data Provisioning Agent should be checked.

6. Save your configuration.

This will start the first status check. After the first check, the status check will be performed according to the defined
frequency.

Results

If the status check finds any status change for the agent, a notification will be sent that you can find by clicking  (Notifications) on
the shell bar.

When you click on the notification, you’ll get to the On-Premise Agents section in  (System)  (Configuration) Data
Integration where you can start searching for the root cause in case the agent is disconnected.

Next Steps
If you need to take over the ownership and receive the notifications for an agent’s status changes, go to the Configure Sending
Notifications dialog as described above, click Assign to Me, and save the configuration. From then on you will receive
notifications about any status changes for the agent. If you haven’t done so yet, you need to provide your consent before you can
take over the ownership.

Pause Real-Time Replication for an Agent


For a selected SAP HANA Smart Data Integration Data Provisioning Agent, you can pause real-time replication for the connections
that use the agent while applying changes to it, such as configuration changes or applying patches. After you have finished your
agent changes, you can restart real-time replication.

Context

If you need to perform maintenance activities in a source system, you can pause real-time replication for the corresponding
connection. For more information, see Pause Real-Time Replication for a Connection.

Procedure
1. In SAP Datasphere, from the main menu, open Configuration Data Integration On-Premise Agents .

2. To show the Data Provisioning Agent tiles with a list of all connections they use, click the Connections button.

The real-time replication status of a connection shown here can be:

Real-Time Replication Status: When do we show the status?

Active: The connection type supports real-time replication and at least one table of the connection is replicated via real-time replication (even if the status in the Remote Table Monitor is Error).

Inactive: The connection type supports real-time replication and currently no table of the connection is replicating via real-time replication.

Paused: The connection type supports real-time replication and real-time replication is paused for at least one table of the connection.

3. To pause the agent's connections with replication status Active or Inactive, on the tile of the agent choose  (menu) and
then  Pause All Connections.

In the list of connections shown on the tile, the status for affected connections changes to Paused. You can also see the
status change for the connections in the Connections application.

In the Remote Table Monitor the status for affected tables changes to Paused and actions related to real-time replication
are not available for these tables. Also, you cannot start real-time replication for any table of a paused connection.

4. You can now apply the changes to your Data Provisioning Agent.

5. Once you're finished with the changes, restart real-time replication for the agent. Choose  (menu) and then  Restart All
Connections.

The status in the list of connections shown on the tile, in the Connections application as well as in the Remote Table
Monitor changes accordingly and you can again perform real-time related actions for the tables or start real-time
replication.

Troubleshooting the Data Provisioning Agent (SAP HANA Smart Data Integration)

If you encounter problems with the Data Provisioning Agent, you can perform various checks and take actions to troubleshoot the
problems.

The following sections provide information about checks, logs, and actions that you can take to troubleshoot problems with the
Data Provisioning Agent:

Initial Checks

Configuration Checks

Setting Log and JDBC Trace Levels

Performance

Validating the Connection from the Server Where the Agent Is Running to SAP Datasphere

Troubleshooting Connection Issues

Reviewing Data Provisioning Agent Logs

SAP Notes

Support Information

 Note
In the following sections, filepaths and screenshots are based on a Linux-based installation of the agent. If you have installed
the agent on a Microsoft Windows server, the slashes "/” must be replaced by backslashes “\”.

Initial Checks
A Data Provisioning Agent administrator can perform the following checks:

Firewall

For a successful connection, make sure that outbound connections from the Data Provisioning Agent to the target host and port,
which are provided in the Data Provisioning Agent registration information in SAP Datasphere, are not blocked by your firewall.

Agent version

Make sure to always use the latest released version of the Data Provisioning Agent. For information on supported and
available versions for the Data Provisioning Agent, see the SAP HANA Smart Data Integration Product Availability Matrix
(PAM) .

Make sure that all agents that you want to connect to SAP Datasphere have the same latest version.

Java Installation

Check whether a Java installation is available by running the command java -version. If you receive a response like
java: command not found, use the Java installation which is part of the agent installation. The Java executable can be
found in folder <DPAgent_root>/sapjvm/bin.

Configuration Checks

The agent configuration is stored in the <DPAgent_root>/dpagentconfig.ini file in the agent installation root location. A Data
Provisioning Agent administrator can double-check for the correct values (do not maintain the parameters directly in the
configuration file; the values are set with the command-line agent configuration tool):

dpagentconfig.ini file   Agent Settings in SAP Datasphere

agent.name=<Agent Name>   Agent Name (the name defined by the user who registered the
agent in SAP Datasphere; the name is case sensitive)

hana.port=<HANA Port>   HANA Port

hana.onCloud=false   n/a

hana.useSSL=true   HANA Use SSL

hana.server=<HANA Server>   HANA Server

jdbc.enabled=true   HANA via JDBC

jdbc.host=<HANA Server>   HANA Server

jdbc.port=<HANA Port>   HANA Port

jdbc.encrypt=true   n/a

If you use a proxy server in your landscape, additionally check for the following parameters:

dpagentconfig.ini file

proxyType=http

jdbc.useProxy=true

jdbc.proxyHost=<your proxy host>

jdbc.proxyPort=<your proxy port>

jdbc.proxyHttp=true (true in case of an HTTP proxy, false in case of a SOCKS proxy)

[if proxy authentication is required] jdbc.useProxyAuth=true

[if proxy authentication is required] jdbc.proxyUsername=<your proxy user name>

[if proxy authentication is required] jdbc.proxyPassword=<your proxy password>

For more information, see Agent Configuration Parameters in the SAP HANA Smart Data Integration and SAP HANA Smart Data
Quality documentation.
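A quick way to sanity-check the configuration is to parse the key=value lines of dpagentconfig.ini and report missing parameters. A minimal sketch, assuming the plain key=value layout of the file and parameter names such as agent.name and hana.server; the helper names are illustrative:

```python
def parse_agent_config(text):
    """Parse the plain key=value lines of a dpagentconfig.ini file
    (the file has no [section] headers, so configparser is not used)."""
    params = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            params[key.strip()] = value.strip()
    return params

def missing_keys(params, required):
    """Return the required parameters that are absent from the file."""
    return [key for key in required if key not in params]

# Parameter names assumed from this section; verify for your agent version.
REQUIRED = ["agent.name", "hana.server", "hana.port", "hana.useSSL"]
```

You would call it with the file contents, for example parse_agent_config(open("<DPAgent_root>/dpagentconfig.ini").read()), and review the reported keys together with the table above.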

Setting Log and JDBC Trace Levels


To troubleshoot connection issues, a Data Provisioning Agent administrator can enable logging and JDBC tracing for the Data
Provisioning Agent.

Agent Logs

Change the logging level to INFO (default), ALL, DEBUG, or TRACE according to your needs. For more information, see SAP
Note 2496051 - How to change "Logging Level" (Trace level) of a Data Provisioning Agent - SAP HANA Smart Data
Integration.

The parameters for the logging level in the <DPAgent_root>/dpagentconfig.ini file are:

framework.log.level

service.log.level

 Note
Changing the level to DEBUG or ALL will generate a large amount of data. We therefore recommend changing the
logging level to these values only for a short period of time while you are actively debugging, and lowering it
again after you have finished.

See also SAP Note 2461391 - Where to find Data Provisioning Agent Log Files

JDBC Trace

For information about activating JDBC tracing, see Trace a JDBC Connection in the SAP HANA Service for SAP BTP in AWS
and Google Cloud Regions documentation.

To set the trace level, execute the JDBC driver *.jar file from the <DPAgent_root>/plugins directory.

Performance
If you experience performance issues when replicating data via the Data Provisioning Agent, a Data Provisioning Agent
administrator can consider increasing the agent memory as described in SAP Note 2737656 - How to increase DP Agent
memory.

For general memory sizing recommendations for SAP HANA Smart Data Integration, see

Data Provisioning Agent - Best Practices and Sizing Guide in the SAP HANA Smart Data Integration and SAP HANA Smart
Data Quality documentation.

SAP Note 2688382 - SAP HANA Smart Data Integration Memory Sizing Guideline

Validating the Connection from the Server Where the Agent Is Running to SAP Datasphere

Ensure that your Data Provisioning Agent is connected to SAP HANA.

In SAP Datasphere

In  (System)  (Configuration) Data Integration On-Premise Agents a green bar and status information on the
agent tile indicates if the agent is connected.

In On-Premise Agents, click  Refresh Agents if the tile of a newly connected agent doesn’t display the updated connection status.

 Note
When you connect a new agent, it might take several minutes until it is connected.

Via Data Provisioning Agent Configuration Tool (for agent versions lower than 2.7.4)

1. Navigate to the command line and run <DPAgent_root>/bin/agentcli.sh --configAgent.

2. Choose Agent Status to check the connection status.

3. Make sure the output shows Agent connected to HANA: Yes.

4. If the output doesn't show that the agent is connected, it may show an error message. Resolve the error, and then select
option Start or Stop Agent, and then option Start Agent to start the agent.

 Note
For agent version 2.7.4 and higher, if in the agent status the message No connection established yet is shown, this can be
ignored. You can check the connection status in SAP Datasphere instead. For more information about the agent/SAP HANA
connection status in agent version 2.7.4 and higher, see SAP Note 3487646 .
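For agent versions below 2.7.4, the relevant line of the Agent Status output can be checked programmatically. A hedged sketch; the function is hypothetical and only parses the literal status line quoted above:

```python
def agent_connected(status_output):
    """Inspect the Agent Status output of the agent configuration tool and
    return True only if it reports 'Agent connected to HANA: Yes'."""
    for line in status_output.splitlines():
        if line.strip().startswith("Agent connected to HANA:"):
            value = line.split(":", 1)[1].strip()
            return value.lower() == "yes"
    return False
```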

Via Trace File

The Data Provisioning Agent framework trace file framework.trc in the <DPAgent_root>/log/ folder should contain entries
indicating that the agent has been successfully connected.

Via Command Line

To validate the connection, you can directly use the JDBC driver jar file from the command line interface. You must ensure that
you’re using the same JDBC driver as used by the Data Provisioning Agent. The JDBC driver jar file (com.sap.db.jdbc_*.jar)
is located in the <DPAgent_root>/plugins directory.

The pattern for the command line is:

java -jar <com.sap.db.jdbc_*.jar> -u <HANA User Name for Messaging Agent>,”<HANA User Password for

Navigate to the <DPAgent_root>/plugins/ directory and run one of the following commands by replacing the variables as
needed and depending on your landscape:

Without proxy:

../sapjvm/bin/java -jar <com.sap.db.jdbc_*.jar> -u <HANA User Name for Messaging Agent>,”<HAN

With proxy:

../sapjvm/bin/java -jar <com.sap.db.jdbc_*.jar> -u <HANA User Name for Messaging Agent>,”<HAN

With proxy with authentication required:

../sapjvm/bin/java -jar <com.sap.db.jdbc_*.jar> -u <HANA User Name for Messaging Agent>,”<HAN

If the connection works properly, the statement returns without errors and indicates a successful connection.


Troubleshooting Connection Issues


If you are unable to connect your Data Provisioning Agent to SAP Datasphere and have already validated the connection as
described in the previous section, open the agent framework trace file framework.trc in the <DPAgent_root>/log/ folder
and check whether the output matches any of the following issues.

An entry is missing in the SAP Datasphere IP Allowlist

A corresponding error entry is written to the framework.trc file. This kind of error is most likely related to a missing entry in the IP allowlist in SAP Datasphere.

Verify that the external (public) IPv4 address of the server where the agent is installed is in the IP allowlist. When using a proxy, the
proxy's address needs to be included in the IP allowlist as well.

For more information, see:

Manage IP Allowlist

SAP Note 2938870 - Errors when connecting DP Agent with DWC

Authentication failed

A corresponding error entry is written to the framework.trc file.

Authentication fails because of invalid HANA User for Agent Messaging credentials in the agent secure storage. To update the
credentials, use the agent configuration tool and then restart the agent.

For more information, see Manage the HANA User for Agent Messaging Credentials in the SAP HANA Smart Data Integration and
SAP HANA Smart Data Quality documentation.

Firewall/Proxy Issues

A corresponding error entry is written to the framework.trc file.

This issue typically indicates that the JDBC driver is not capable of resolving the SAP HANA server URL to connect to the SAP
Datasphere tenant and/or to establish a correct outbound call. Check your firewall/proxy settings and make sure to enable
outbound connections accordingly.

Encryption is missing: Only Secure Connections are Allowed

In case of missing encryption, the log contains the following statement: "only secure connections are allowed".

When testing the connectivity directly with the JDBC driver, add the parameter -o encrypt=true.
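When scanning framework.trc for the issues described in this section, a small lookup table can point to the likely cause. A sketch covering only the literal error text quoted above; the helper is hypothetical, and you can extend the table with error patterns from your own landscape:

```python
# Map literal error text from the framework trace log to a likely cause.
KNOWN_ISSUES = {
    "only secure connections are allowed": (
        "Encryption is missing: when testing with the JDBC driver, "
        "add the parameter -o encrypt=true."
    ),
}

def classify_trace_line(line):
    """Return a hint for a known error pattern, or None if unrecognized."""
    lowered = line.lower()
    for pattern, hint in KNOWN_ISSUES.items():
        if pattern in lowered:
            return hint
    return None
```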

Reviewing Data Provisioning Agent Logs

The logs are located in the <DPAgent_root>/log directory. For more information on the available log files, see SAP Note
2461391 .

If the agent is connected, you can review the framework log (framework_alert.trc) and the framework trace log
(framework.trc) directly in SAP Datasphere. For more information, see Monitoring Data Provisioning Agent Logs.

SAP Notes

SAP Note 2938870 - Errors when connecting DP Agent with DWC

SAP Note 2894588 - IP Allowlist in SAP Datasphere

SAP Note 2511196 - What ports are used by Smart Data Integration

SAP Note 2091095 - SAP HANA Smart Data Integration and SAP HANA Smart Data Quality

SAP Note 2400022 - FAQ: SAP HANA Smart Data Integration (SDI)

SAP Note 2477204 - FAQ: SAP HANA Services and Ports

SAP Note 2688382 - SAP HANA Smart Data Integration Memory Sizing Guideline

Support Information
Support Component: HAN-DP-SDI

Add and attach the following information:

Version of the Data Provisioning Agent

Framework trace log file (framework.trc)

Data Provisioning Agent configuration file (dpagentconfig.ini)

Troubleshooting Cloud Connector Related Issues


For information about troubleshooting Cloud Connector related issues when creating or using a connection in SAP Datasphere,
see SAP Note 3369433 .

Troubleshooting SAP HANA Smart Data Access via Cloud Connector
These are some of the most common issues that can occur when you use the Cloud Connector to connect to on-premise remote
sources via SAP HANA Smart Data Access.

1. The connectivity proxy is not enabled

The following error occurs if you try to connect to a remote source using the Cloud Connector, but the connectivity proxy hasn’t
been enabled:

[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89001]


Cannot resolve host name '<connectivity_proxy_host>' rc=-2:
Name or service not known (<virtual_host>:<virtual_port>))

SAP Datasphere takes care of enabling the connectivity proxy. This might take a while.

2. The connectivity proxy is enabled but not fully ready to serve requests

The following error occurs if the connectivity proxy has been enabled but is not yet ready to be used:

[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89006]


System call 'connect' failed, rc=111:
Connection refused {<connectivity_proxy_ip>:<connectivity_proxy_port>)} {ClientPort:<client_port>}

SAP Datasphere takes care of enabling the connectivity proxy. This might take a while.


3. The virtual host specified in the connection details includes an underscore

The following error occurs if you’ve used a virtual host name with an underscore, for example, hana_01:

[LIBODBCHDB SO][HDBODBC] General error;-10719 Connect failed (invalid SERVERNODE 'hana_01:<virtual_

Virtual host names must not contain underscores.
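The underscore restriction can be validated before saving the connection details. A conservative check with a hypothetical helper; it accepts only letters, digits, dots, and hyphens, which is stricter than DNS formally requires:

```python
import re

def valid_virtual_host(host):
    """Check a virtual host name for SAP HANA Smart Data Access via Cloud
    Connector: underscores are not allowed, so only letters, digits, dots,
    and hyphens are accepted here."""
    return re.fullmatch(r"[A-Za-z0-9][A-Za-z0-9.-]*", host) is not None
```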

4. The virtual host specified in the connection details is unreachable


The following error occurs if the specified virtual host cannot be reached:

[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89132]


Proxy server connect: connection not allowed by ruleset (<virtual_host>:<virtual_port>))

5. The selected location ID is invalid.


The following error occurs if an invalid location ID was specified in the Data Source Configuration of the SAP Datasphere
Administration:

[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89133]


Proxy server connect: Network unreachable (<virtual_host>:<virtual_port>))

6. The Cloud Connector's IP is missing or is incorrectly specified in the SAP Datasphere IP allowlist for trusted Cloud Connector IPs

The following error occurs when the Cloud Connector's IP is not included in the allowlist:

[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89133]


Proxy server connect: Network unreachable (<virtual_host>:<virtual_port>))

7. The Cloud Connector certificate has expired


The following error occurs when the subaccount certificate used in the Cloud Connector has expired:

[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89133]
Proxy server connect: Network unreachable (<virtual_host>:<virtual_port>))

You can find the related logs in the ljs_trace.log file in the Cloud Connector. For example:

2021-07-29 [Link],131 +0200#ERROR#[Link]


#Unable to handshake with notification server [Link]/<vi
[Link]: Received fatal alert: certificate_expired

For information about renewing a subaccount certificate, see Update the Certificate for a Subaccount in the SAP BTP Connectivity
documentation.

8. The on-premise backend system requires TCP SSL


The following error occurs if the on-premise backend system requires TCP SSL:

[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89008]
Socket closed by peer (<virtual_host>:<virtual_port>))

Related Information
Troubleshooting Connection Issues with the Cloud Connector (SAP HANA Cloud, SAP HANA Database documentation)

Creating a Database User Group


Users with an administrator role can create database user groups in SAP Datasphere to allow users to work in a sandboxed area in
the underlying SAP HANA Cloud database, unattached to any space. These users can transfer an existing data warehouse
implementation into the SAP Datasphere database or do any other work in SAP HANA Cloud and then make it available to one or
more spaces as appropriate.

Prerequisites
To create a database user group, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.

The DW Administrator global role, for example, grants these privileges. For more information, see Privileges and Permissions and Standard Roles Delivered with SAP Datasphere.

Context
When creating a database user group, an administrator is also created. This administrator can create other users, schemas, and
roles using SAP Datasphere stored procedures. The administrator and their users can create data entities (DDL) and ingest data
(DML) directly into their schemas and prepare them for consumption by spaces.

For detailed information about user groups, see User Groups in the SAP HANA Cloud documentation.

 Note
Users with the DW Space Administrator role can create database users, which are associated with their space (see Integrating
Data via Database Users/Open SQL Schemas).

Procedure
1. In the side navigation area, click  (System) →  (Configuration) → Database Access → Database User Groups.

2. On the Database User Group page, click Create.

3. Enter a suffix for your database user group and click Create.

The group is created and the connection details and administrator credentials are displayed.

If you want to work with the SAP HANA database explorer, you will need to enter your password to grant the explorer access
to the database user group schema. When connecting to SAP HANA Cloud with other tools, users will need the following
properties:

Database Group Administrator (name and password)

Host Name

Port

4. Click Close to close the dialog.

Create Users, Schemas, and Roles in a Database User Group


A database user group administrator can create users, schemas, and roles to organize and staff their group. Creating schemas and roles, and granting, revoking, and dropping roles, require the use of SAP Datasphere stored procedures.

This topic contains the following sections:

Prerequisites

Log In With Your Database User Group Administrator

Create a User

Create a Schema

Create a Role

Grant a Role to a User or to Another Role

Revoke a Role

Drop a Role

Prerequisites
To create users, schemas, and roles in a database user group, you must have a database user group administrator and a password
(Creating a Database User Group).

Log In With Your Database User Group Administrator


To connect to SAP HANA Cloud with the administrator, select your newly created user group in the list and click Open Database Explorer. Enter the password when requested, and click OK.

The SAP HANA database explorer opens with your database user group at the top level. You can now use the SQL editor to create
users, roles and schemas.

You can review your privileges with the following statement:

select * from effective_privileges where user_name = current_user;

Create a User
You can create a user in your user group with the following statement:

CREATE USER <user_name> PASSWORD <pwd> SET USERGROUP <DBgroup_name>

 Note

To avoid possible conflicts, we recommend that you use the prefix DWCDBGROUP#<DBgroup_name># when naming users,
schemas, and roles in your group (see Rules for Technical Names).

In our example, we create a new user, DWCDBGROUP#DWMIGRATE#BOB, in our DWMIGRATE group:

CREATE USER DWCDBGROUP#DWMIGRATE#BOB PASSWORD "Welcome1" SET USERGROUP "DWCDBGROUP#DWMIGRATE";
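The recommended prefix convention can be applied consistently with a small helper that renders the statement. A sketch only; the helper names are made up, and real passwords should of course not be hard-coded:

```python
def prefixed_name(group: str, name: str) -> str:
    """Apply the recommended DWCDBGROUP#<DBgroup_name># prefix."""
    return f"DWCDBGROUP#{group}#{name}"

def create_user_sql(group: str, user: str, password: str) -> str:
    """Render the CREATE USER statement shown above for a user group."""
    return (
        f"CREATE USER {prefixed_name(group, user)} "
        f'PASSWORD "{password}" '
        f'SET USERGROUP "DWCDBGROUP#{group}";'
    )

print(create_user_sql("DWMIGRATE", "BOB", "Welcome1"))
```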

Create a Schema

You can create a schema in your database user group by using the following SAP Datasphere stored procedure:

CALL "DWC_GLOBAL"."CREATE_USERGROUP_SCHEMA"
(
SCHEMA_NAME => '<schema_name>',
OWNER_NAME => '<user_name>'
);

 Note
To avoid possible conflicts, we recommend that you use the prefix DWCDBGROUP#<DBgroup_name># when naming users,
schemas, and roles in your group (see Rules for Technical Names).

The owner of the new schema must be a user of the database user group. If the owner name is set to null, then the database user
group administrator is set as the owner.

In our example, we create a new schema, DWCDBGROUP#DWMIGRATE#STAGING, and set BOB as the owner:

CALL "DWC_GLOBAL"."CREATE_USERGROUP_SCHEMA"
(
SCHEMA_NAME => 'DWCDBGROUP#DWMIGRATE#STAGING',
OWNER_NAME => 'DWCDBGROUP#DWMIGRATE#BOB'
);

Create a Role
You can create a role in your database user group by using the following SAP Datasphere stored procedure:

CALL "DWC_GLOBAL"."CREATE_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => '<schema_name>',
ROLE_NAME => '<role_name>'
);

 Note
To avoid possible conflicts, we recommend that you use the prefix DWCDBGROUP#<DBgroup_name># when naming users,
schemas, and roles in your group (see Rules for Technical Names).

Once the role is created, you can grant it to a user or to another role, revoke it, and drop it.

This is custom documentation. For more information, please visit SAP Help Portal. 212
7/9/25, 8:45 AM
In our example, we create a new role, DWCDBGROUP#DWMIGRATE#DWINTEGRATOR, in the schema DWCDBGROUP#DWMIGRATE#STAGING:

CALL "DWC_GLOBAL"."CREATE_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => 'DWCDBGROUP#DWMIGRATE#STAGING',
ROLE_NAME => 'DWCDBGROUP#DWMIGRATE#DWINTEGRATOR'
);

Grant a Role to a User or to Another Role

You can grant a role to a user or to another role in your database user group by using the following SAP Datasphere stored
procedure:

CALL "DWC_GLOBAL"."GRANT_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => '<schema_name>',
ROLE_NAME => '<role_name>',
GRANTEE => '<user_name>',
GRANTEE_ROLE_NAME => NULL,
WITH_ADMIN_OPTION => FALSE
);

The role schema, grantee, and grantee role must all be in the same database user group.

In our example, we grant the DWCDBGROUP#DWMIGRATE#DWINTEGRATOR role to our user, BOB:

CALL "DWC_GLOBAL"."GRANT_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => 'DWCDBGROUP#DWMIGRATE#STAGING',
ROLE_NAME => 'DWCDBGROUP#DWMIGRATE#DWINTEGRATOR',
GRANTEE => 'DWCDBGROUP#DWMIGRATE#BOB',
GRANTEE_ROLE_NAME => NULL,
WITH_ADMIN_OPTION => FALSE
);

Revoke a Role
You can revoke a role from a user by using the following SAP Datasphere stored procedure:

CALL "DWC_GLOBAL"."REVOKE_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => '<schema_name>',
ROLE_NAME => '<role_name>',
GRANTEE => '<user_name>',
GRANTEE_ROLE_NAME => NULL
);

In our example, we revoke the DWCDBGROUP#DWMIGRATE#DWINTEGRATOR role from BOB:


CALL "DWC_GLOBAL"."REVOKE_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => 'DWCDBGROUP#DWMIGRATE#STAGING',
ROLE_NAME => 'DWCDBGROUP#DWMIGRATE#DWINTEGRATOR',
GRANTEE => 'DWCDBGROUP#DWMIGRATE#BOB',
GRANTEE_ROLE_NAME => NULL
);

Drop a Role
You can drop a role by using the following SAP Datasphere stored procedure:

CALL "DWC_GLOBAL"."DROP_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => '<schema_name>',
ROLE_NAME => '<role_name>'
);

In our example, we drop the DWCDBGROUP#DWMIGRATE#DWINTEGRATOR role:

CALL "DWC_GLOBAL"."DROP_USERGROUP_ROLE"
(
ROLE_SCHEMA_NAME => 'DWCDBGROUP#DWMIGRATE#STAGING',
ROLE_NAME => 'DWCDBGROUP#DWMIGRATE#DWINTEGRATOR'
);
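The four role procedures above share one calling pattern, so the CALL statements can be generated rather than hand-written. A sketch; the generator function is an illustration, and only the DWC_GLOBAL procedure and parameter names come from the text above:

```python
def render_call(proc, **params):
    """Render a CALL statement for a DWC_GLOBAL user-group procedure.

    Value mapping: None -> NULL, bool -> TRUE/FALSE, str -> quoted literal.
    """
    def lit(v):
        if v is None:
            return "NULL"
        if isinstance(v, bool):
            return "TRUE" if v else "FALSE"
        return f"'{v}'"
    args = ",\n  ".join(f"{k.upper()} => {lit(v)}" for k, v in params.items())
    return f'CALL "DWC_GLOBAL"."{proc}"\n(\n  {args}\n);'

# Render the grant from the example above:
print(render_call(
    "GRANT_USERGROUP_ROLE",
    role_schema_name="DWCDBGROUP#DWMIGRATE#STAGING",
    role_name="DWCDBGROUP#DWMIGRATE#DWINTEGRATOR",
    grantee="DWCDBGROUP#DWMIGRATE#BOB",
    grantee_role_name=None,
    with_admin_option=False,
))
```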

Allow a Space to Read From the Database User Group Schema


By default, no SAP Datasphere space can access the database user group schema. To grant a space read privileges from the
database user group schema, use the GRANT_PRIVILEGE_TO_SPACE stored procedure.

Prerequisites
To allow a space to read from the database user group schema, you must have a database user group administrator and a
password (Creating a Database User Group).

Context
You can grant read privileges by running an SAP Datasphere specific stored procedure in the SQL console in the SAP HANA
Database Explorer.

Procedure
1. From the side navigation area, go to  (System) →  (Configuration) → Database Access → Database User Groups.

2. Select the database user group and click Open Database Explorer.

3. In the SQL console in SAP HANA Database Explorer, call the stored procedure to grant the 'SELECT' privilege to a space
using the following syntax:

CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => <operation>,
PRIVILEGE => <privilege>,
SCHEMA_NAME => <schema name>,
OBJECT_NAME => <object name>,
SPACE_ID => <space ID>);

Parameters are set as follows:

operation [required]: Enter 'GRANT' to give the read privilege to the space, or 'REVOKE' to remove it.

privilege [required]: Enter 'SELECT', the read privilege that you want to grant (or revoke) to the space.

schema_name [required]: Enter the name of the database user group schema that you want the space to be able to read from.

object_name [required]: You can grant the read privilege either at the schema level or at the object level. At the schema level (all objects in the schema), enter null or ''. At the object level, enter a valid table name.

space_id [required]: Enter the ID of the space you are granting the read privilege to.

To grant read access to all objects (tables) in the schema:

CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => 'GRANT',
PRIVILEGE => 'SELECT',
SCHEMA_NAME => 'SALE#ETL',
OBJECT_NAME => '',
SPACE_ID => 'SALES');

To grant read access to the table MY_TABLE:

CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => 'GRANT',
PRIVILEGE => 'SELECT',
SCHEMA_NAME => 'SALE#ETL',
OBJECT_NAME => 'MY_TABLE',
SPACE_ID => 'SALES');

4. Run the query by clicking  (Run) or press F8.

Results
If the run is successful, you receive a confirmation message in the Result pane. You can then open the Data Builder, create a data
flow, and select the tables as sources.
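The read-grant call above can be parameterized so that schema-level and table-level grants are rendered from the same template. A sketch; the helper name is invented, and only the procedure and parameter names come from the text:

```python
def grant_select_sql(schema, space_id, table=None, revoke=False):
    """Render GRANT_PRIVILEGE_TO_SPACE for the 'SELECT' privilege.

    table=None grants at the schema level (OBJECT_NAME => '').
    """
    operation = "REVOKE" if revoke else "GRANT"
    object_name = "" if table is None else table
    return (
        'CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (\n'
        f"  OPERATION => '{operation}',\n"
        "  PRIVILEGE => 'SELECT',\n"
        f"  SCHEMA_NAME => '{schema}',\n"
        f"  OBJECT_NAME => '{object_name}',\n"
        f"  SPACE_ID => '{space_id}');"
    )

# Schema-level grant, then a table-level revoke:
print(grant_select_sql("SALE#ETL", "SALES"))
print(grant_select_sql("SALE#ETL", "SALES", table="MY_TABLE", revoke=True))
```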


Allow a Space to Write to the Database User Group Schema


To grant a space write privileges in the database user group schema, use the GRANT_PRIVILEGE_TO_SPACE stored procedure.
Once this is done, data flows running in the space can select tables in the schema as targets and write data to them.

Prerequisites
To allow a space to write to the database user group schema, you must have a database user group administrator and a password
(Creating a Database User Group).

Context
You can grant write privileges by running an SAP Datasphere specific stored procedure in the SQL console in the SAP HANA
Database Explorer.

Procedure
1. From the side navigation area, go to  (System) →  (Configuration) → Database Access → Database User Groups.

2. Select the database user group and click Open Database Explorer.

3. In the SQL console in SAP HANA Database Explorer, call the stored procedure to grant the 'INSERT', 'UPDATE', or 'DELETE'
privilege to a space using the following syntax:

CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => <operation>,
PRIVILEGE => <privilege>,
SCHEMA_NAME => <schema name>,
OBJECT_NAME => <object name>,
SPACE_ID => <space ID>);

Parameters are set as follows:

operation [required]: Enter 'GRANT' to give the write privilege to the space, or 'REVOKE' to remove it.

privilege [required]: Enter 'INSERT', 'UPDATE', or 'DELETE', the write privilege that you want to grant (or revoke) to the space.

 Note
You can grant one privilege at a time.

schema_name [required]: Enter the name of the database user group schema that you want the space to be able to write to.

object_name [required]: You can grant the write privilege either at the schema level or at the object level. At the schema level (all objects in the schema), enter null or ''. At the object level, enter a valid table name.

space_id [required]: Enter the ID of the space you are granting the write privilege to.

To grant update write access to all objects (tables) in the schema:

CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => 'GRANT',
PRIVILEGE => 'UPDATE',
SCHEMA_NAME => 'SALE#ETL',
OBJECT_NAME => '',
SPACE_ID => 'SALES');

To grant update write access to the table MY_TABLE:

CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
OPERATION => 'GRANT',
PRIVILEGE => 'UPDATE',
SCHEMA_NAME => 'SALE#ETL',
OBJECT_NAME => 'MY_TABLE',
SPACE_ID => 'SALES');

4. Run the query by clicking  (Run) or press F8.

Results
If the run is successful, you receive a confirmation message in the Result pane. You can then open the Data Builder, create a data
flow, and select the tables as targets.
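Because the procedure accepts only one privilege per call, granting full write access means three separate calls. A sketch that emits them; the loop helper is illustrative, only the procedure and parameter names come from the text:

```python
WRITE_PRIVILEGES = ("INSERT", "UPDATE", "DELETE")

def grant_write_sql(schema, space_id, table=None):
    """Emit one GRANT_PRIVILEGE_TO_SPACE call per write privilege."""
    object_name = "" if table is None else table
    return [
        'CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (\n'
        "  OPERATION => 'GRANT',\n"
        f"  PRIVILEGE => '{privilege}',\n"
        f"  SCHEMA_NAME => '{schema}',\n"
        f"  OBJECT_NAME => '{object_name}',\n"
        f"  SPACE_ID => '{space_id}');"
        for privilege in WRITE_PRIVILEGES
    ]

for stmt in grant_write_sql("SALE#ETL", "SALES", table="MY_TABLE"):
    print(stmt)
```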

Monitoring SAP Datasphere


Users with an administrator role have access to various monitoring logs and views and can, if necessary, create database analysis
users to help troubleshoot issues.

This topic contains the following sections:

Monitor Disk and Memory Assignment

Monitor Tasks

Monitor Statements

Monitor Access Control Issues

Monitor Elastic Compute Nodes

Monitoring File Space Storage Consumption and Apache Spark Application Usage

Task Logs Tab

Statement Logs Tab

Show/Hide, Filter, Sort and Reorder Task and Statement Columns

Monitor Capacity Units

Click  (System Monitor) to access the main monitoring tool. The System Monitor allows to monitor the performance of your
system and identify storage, task, out-of-memory, and other issues across all spaces.

For example, you can see all the errors (such as failed tasks and out-of-memory errors) that occurred yesterday or the top five
statements with the highest peak memory consumption.

 Note
For optimal performance, consider staggering the scheduled run times of tasks such as data flows, or of task chains that contain such tasks, so that your workload is distributed. There is no specific numerical limit on how many tasks can be scheduled, but too many tasks running at once can cause resource distribution issues. Check the System Monitor to review your workload distribution. For more information, see Monitoring SAP Datasphere or Persisted Views and Memory Consumption.

 Note
SAP Datasphere is integrated into SAP Cloud ALM for health monitoring, which enables you to check the health of one or more
SAP Datasphere tenants from the Health Monitoring app in SAP Cloud ALM. See Health Monitoring in the SAP Cloud ALM -
Application Help.

Monitor Disk and Memory Assignment


1. In the side navigation area, click  (System Monitor).

You can monitor available disk and memory storage on your tenant with the following cards:

Disk Storage Used: Shows the total amount of disk storage used in all spaces, broken down between:

Data in Spaces: All data that is stored in spaces.

Audit Log Data: Data related to audit logs (see Audit Logging).

 Note
Audit logs can grow quickly and consume a great deal of disk storage (see Delete Audit Logs).

Other Data: Includes data stored in database user group schemas (see Creating a Database User Group) and SAP HANA data (such as statistics schemas).

Administrative Data: Data used to administer the tenant and all spaces (such as space quota and space version). Includes all information stored in the central schemas (DWC_GLOBAL, DWC_GLOBAL_LOG, DWC_TENANT_OWNER).

Disk Used by Spaces for Storage: Shows the amount of disk storage used by spaces out of the total amount of disk storage available. You can see a breakdown of this amount in the Disk Storage Used card.

Memory Used by Spaces for Storage: Shows the amount of memory used by spaces out of the total amount of memory storage available.

2. To investigate issues in particular spaces:

a. In the side navigation area, click (Space Management).

b. Display the list of spaces in the table layout and order by column. For example, you can display at the top of the table
the spaces that use the highest amount of storage by choosing the descending order for the column Used Storage.

c. Open a space and click Monitor in the space details page to see the storage amount assigned to and used by the
space (see Monitor Your Space Storage Consumption).

Monitor Tasks
For example, you can find out whether tasks need to be scheduled at different times so that high-memory-consuming tasks do not run simultaneously. If single tasks consume too much memory, additional views may need to be persisted, or view partitioning may need to be used, to lower memory consumption.

To investigate issues:

1. In the side navigation area, click  (System Monitor).

You can identify issues with tasks with the following cards:

Failed Tasks: Two cards provide information:

Shows the number of tasks that have failed in the last 24 hours, with a trend icon (up or down arrow) indicating whether there are more or fewer failed tasks than the day before.

Shows the number of failed tasks by day for the last 7 days.

Top 5 Tasks by Run Duration: Two cards provide information:

Shows the 5 tasks whose run duration was the longest in the last 24 hours.

Shows the 5 tasks whose run duration was the longest in the last 48 hours.

Top 5 Tasks by Processing Memory Consumption: Two cards provide information:

Shows the 5 tasks whose processing memory consumption was the highest in the last 24 hours.

Shows the 5 tasks whose processing memory consumption was the highest in the last 48 hours.

2. Click View Logs in a card to go to the Task Logs tab, which displays information filtered on the card criteria. For more
information on the Task Logs tab, see Task Logs Tab.

3. Click the links in the following columns:

Activity column - For the spaces you have access to (via scoped roles), a link opens the run in the Data Integration
Monitor (see Managing and Monitoring Data Integration).

Object Name column - For the spaces you have access to (via scoped roles), a link opens the editor of the object.

Monitor Statements

 Note
Expensive statement tracing is enabled by default. If disabled, statement information and errors are not traced and you cannot
see them in the System Monitor. For more information on enabling and configuring expensive statement tracing, see Configure
Monitoring.

1. In the side navigation area, click  (System Monitor).

You can monitor statements with the following cards:

Top 5 Statements by Processing Memory Consumption: Two cards provide information:

Shows the 5 statements whose processing memory consumption was the highest in the last 24 hours.

Shows the 5 statements whose processing memory consumption was the highest in the last 48 hours.

Out-of-Memory Errors: Two cards provide information:

Shows the number of out-of-memory errors that have occurred in tasks and statements in the last 24 hours.

Shows the number of out-of-memory errors that have occurred in tasks and statements, by day, for the last 7 days.

Top 5 MDS Requests by Processing Memory Consumption: Shows the 5 SAP HANA multi-dimensional services (MDS) requests (used, for example, in SAP Analytics Cloud consumption) whose processing memory consumption is the highest.

Out-of-Memory Errors (MDS Requests): Shows the out-of-memory errors that are related to SAP HANA multi-dimensional services (MDS) requests, which are used, for example, for SAP Analytics Cloud consumption.

Top 5 Out-of-Memory Errors (Workload Class) by Space: Shows the schemas in which out-of-memory errors have occurred in the last 7 days because the statement limits have been exceeded. To set the statement limits for spaces, see Set Priorities and Statement Limits for Spaces or Groups.

2. Click View Logs in a card to go to the Statement Logs, which displays information filtered on the card criteria. For more
information on the Statements tab, see Statement Logs Tab.

3. Click the links in the Statement Details column.

Monitor Access Control Issues


1. In the side navigation area, click  (System Monitor).

You can monitor statements that are rejected or queued with the following cards.

Top 5 Admission Control Rejection Events by Space: Shows the 5 spaces with the highest number of rejected statements in the last 7 days.

 Note
A space that has been deleted is prefixed with an asterisk character.

Admission Control Rejection Events: Two cards provide information:

Shows the number of statements that have been rejected in the last 24 hours because they've exceeded the threshold percentage of CPU usage. A trend icon (up or down arrow) indicates whether there are more or fewer rejected statements than the day before.

Shows the number of statements that have been rejected in the last 7 days because they've exceeded the threshold percentage of CPU usage.

Top 5 Admission Control Queuing Events by Space: Shows the 5 spaces with the highest number of queued statements in the last 7 days.

 Note
A space that has been deleted is prefixed with an asterisk character.

Admission Control Queuing Events: Two cards provide information:

Shows the number of statements that have been queued in the last 24 hours because they've exceeded the threshold percentage of CPU usage. A trend icon (up or down arrow) indicates whether there are more or fewer queued statements than the day before.

Shows the number of statements that have been queued in the last 7 days because they've exceeded the threshold percentage of CPU usage.

2. To investigate further, click Open SAP HANA Cockpit in a card.

If you've created a database analysis user, you're connected to the SAP HANA Cockpit without entering your credentials (see Create a Database Analysis User to Debug Database Issues).

For more information about admission control thresholds, see Set Priorities and Statement Limits for Spaces or Groups.

Monitor Elastic Compute Nodes


Once you’ve created an elastic compute node in the Space Management app (see Create an Elastic Compute Node), you can
monitor its key figures, such as the start and end time of the last run or the amount of memory used for data replication.

1. In the side navigation area, click  (System Monitor), then click the Elastic Compute Nodes tab.

2. From the dropdown list, select the elastic compute node that you want to monitor.

 Note
If one elastic compute node exists, related monitoring information is automatically displayed in the tab. If several elastic
compute nodes exist, you must select a node from the dropdown list to display monitoring information in the tab.

You can view elastic compute node key figures or identify issues with the following cards:

Configuration: Shows the following information about the current elastic compute node:

Technical name.

Status, such as Ready or Running (see Run an Elastic Compute Node).

The performance class and the resources allocated to the node: number of compute blocks, memory, disk storage, and number of vCPUs.

Run Details: Shows the following information about the latest or the previous run of the current elastic compute node:

The date and time at which the elastic compute node started and stopped.

The total run duration (uptime) from the starting to the stopping phase.

The number of block-hours, that is, the number of hours consumed by the run. The number of block-hours is the result of the run duration in hours multiplied by the number of compute blocks. If a node that includes 4 compute blocks runs for 5 hours, 20 block-hours have been consumed. In such a case, the uptime equals the block-hours. If a node that includes 8 compute blocks runs for 5 hours, 40 block-hours have been consumed.

Monthly Uptime: Shows the following information about the elastic compute node runs for the current month or the last month:

The total duration (uptime) of all runs in the current or last month.

The total number of block-hours consumed by all the runs in the current or last month.

Average CPU: Shows the average percentage of the number of vCPUs consumed during the latest or previous run of the elastic compute node. The trend icon (up or down arrow) indicates whether the percentage is higher or lower than in the previous run. To see the real-time average CPU utilization in percent for the elastic compute node, click Performance Monitor, which opens the Performance Monitor page in the SAP HANA Cockpit (see The Performance Monitor in the SAP HANA Cloud Database Administration with SAP HANA Cockpit).

Average Memory: Shows the average amount of memory consumed (in GiB) during the latest or previous run of the elastic compute node. The trend icon (up or down arrow) indicates whether the amount is higher or lower than in the previous run. To see the real-time average memory utilization in GB for the elastic compute node, click Performance Monitor, which opens the Performance Monitor page in the SAP HANA Cockpit (see The Performance Monitor in the SAP HANA Cloud Database Administration with SAP HANA Cockpit).

Total Uptime: Shows the total duration in hours of all runs of the elastic compute node.

Top 5 Statements by Processing Memory Consumption: Shows the 5 statements whose memory consumption was the highest during the last run of the elastic compute node. To see detailed information about the statements, you can click View Logs, which takes you to the Statement Logs tab. See Monitoring SAP Datasphere.

Out-of-Memory Errors: Shows the number of out-of-memory errors that have occurred in tasks and statements related to the elastic compute node during the last run. To see detailed information about the errors, you can click View Logs, which takes you to the Statement Logs tab. See Monitoring SAP Datasphere.

Memory Distribution: Shows the amount of memory allocated to the elastic compute node, if it is in a running state, broken down between:

Unused Memory: Shows the amount of memory available for the elastic compute node.

Memory Used for Data Replication: Shows the amount of memory used to store replicated data for the elastic compute node.

Memory Used for Processing: Shows the amount of memory used by the processes that are currently running for the elastic compute node, for example, the consumption of the queries running on the elastic compute node.

 Note
If the elastic compute node is not in a running state, no data is displayed.
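The block-hour figures shown in the Run Details and Monthly Uptime cards follow directly from the run duration and the block count; as a quick sketch of the arithmetic:

```python
def block_hours(compute_blocks: int, run_duration_hours: float) -> float:
    """Block-hours consumed by one run: duration (hours) x compute blocks."""
    return compute_blocks * run_duration_hours

# Examples from the Run Details description:
print(block_hours(4, 5))  # 20 block-hours
print(block_hours(8, 5))  # 40 block-hours
```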

3. To investigate further, you can do the following:

To view statement details, click View Logs in a card to go to the Statement Logs tab, which displays information
filtered on the card criteria. Then, click the links in the Statement Details column. For more information on the
Statement Logs tab, see Statement Logs Tab.

To view details on a run, click View Logs in a card to go to the Task Logs tab, which displays information filtered on
the card criteria. In the Activity column, click the link to open the run in the Data Integration Monitor (see Managing
and Monitoring Data Integration).

To navigate to the elastic compute node in the Space Management app, click Manage Elastic Compute Node (see
Create an Elastic Compute Node and Run an Elastic Compute Node).

To analyze the performance of the SAP HANA database, click Database Overview (SAP HANA Cockpit), which
opens the Database Overview page in the SAP HANA Cockpit (see The Database Overview Page in the SAP HANA
Cloud Database Administration with SAP HANA Cockpit).

Monitor Capacity Units


As a tenant administrator, you can view the consumption of capacity units for various features over time. This tool is useful for
optimizing resource allocation and ensuring efficient subscription management.

From the side navigation menu, click  (System Monitor) → Capacities.

Your daily consumption for the current month is shown, and you can track usage relative to your subscription. You can also download detailed hourly data. See Monitor Capacities.

Monitoring File Space Storage Consumption and Apache Spark Application Usage
You can monitor the storage consumption for file spaces (of storage type SAP HANA Data Lake Files) and their usage of the
Apache Spark application for task runs. This allows you to understand where storage consumption is the highest and to target
specific tasks.

1. In the side navigation area, click  (System Monitor), then click the Object Store tab.

2. Select the file space of your choice in the Spaces drop-down list. All file spaces of the tenant are listed.

3. You can monitor the storage utilization of your selected file space and of all file spaces with the two following cards:

SAP HANA Data Lake Files: Storage Utilization: Shows the amount of storage used in terabytes (TB) for the selected space.

 Note
As an administrator, you can see the storage of all spaces even if you aren't a member.

You can see the space usage of task runs in the Apache Spark: Tasks table below.

SAP HANA Data Lake Files: Storage Utilization of All Spaces: Shows the amount of storage used in terabytes (TB) for all spaces.

4. You can investigate the selected space's tasks further in the Apache Spark: Tasks table:

a. Select a time frame in the Date and Time Range options (Single Dates, Date Ranges, Weeks, Months, or Custom
Options).

b. The table shows the following information:

Applications: Shows the name of the application.

Number of Tasks: Shows the total number of tasks that ran in the application during the selected time range. Select the line or click  (Details) to show more information about the tasks, such as Application Configuration details (Executor CPU, Executor Memory, Driver CPU, Driver Memory, Maximum CPU, and Maximum Memory) and Tasks details (Object Type, Task Activity, and Number of Tasks). Sorting and filtering abilities are available.

Task Logs Tab


In Task Logs, the table shows the following information:

Property Description

Start Time Shows at what time (date and hour) the task has started to run.

Duration (sec) Shows how many seconds the task has run.

Object Type Shows the type of object that was run in the task. For example: view, remote table, data flow.

Activity Shows the action that was performed on the object. For example: persist, replicate, execute. You can
click on the activity name, which takes you to the Data Integration Monitor.

Space Name Shows the name of the space in which the task is run.

Object Name Shows the name of the object. You can click on the object name, which opens the object in the Data
Builder.

SAP HANA Peak Memory Shows the maximum amount of memory (in MiB) the task has used during the runtime in SAP HANA.

 Note
You can see this information:

If the option Enable Expensive Statement Tracing is enabled (it is enabled by default) and if the task
exceeds the thresholds specified in  (Configuration) → Monitoring.

And if the task is run for these objects (and activities): views (persist,
remove_persisted_data), remote tables (replicate, enable_realtime), data flows (execute)
and intelligent lookup (execute, delete_data).

Otherwise, no number is displayed.

SAP HANA CPU Time Shows the maximum amount of CPU time (in ms) the task has used in SAP HANA.

 Note
You can see this information:

If the option Enable Expensive Statement Tracing is enabled and if the task exceeds the
thresholds specified in  (Configuration) → Monitoring. See Configure Monitoring.

And if the task is run for these objects (and activities): views (persist,
remove_persisted_data), remote tables (replicate, enable_realtime), data flows (execute)
and intelligent lookup (execute, delete_data).

Otherwise, no number is displayed.

 Note
The CPU time indicates how much time is used for all threads. It means that if the CPU time is
significantly higher than the duration of the statement, then many threads are used. If many
threads are used for a long time, no other tasks should be scheduled at that point in time, or
resource bottlenecks may occur and tasks may even be canceled.

Records Shows the number of records of the target table after the task has finished running.

 Note
You can see this information only if the task is run for these objects (and activities): views (persist),
remote tables (replicate, enable_realtime), data flows (execute) and intelligent lookup (execute,
delete_data). Otherwise, no number is displayed.

SAP HANA Used Memory Shows the amount of memory (in MiB) that is used by the target table in SAP HANA after the task has
finished running.

SAP HANA Used Disk Shows the amount of disk space (in MiB) that is used by the target table in SAP HANA after the task
has finished running.

Status Shows the status of the task: completed, failed, running.

Substatus For tasks with the status “failed”, shows the substatus and a message describing the cause of failure.
For more information about failed task substatuses, see Understanding Statuses and Substatuses.

User Shows the user who has run the task.

Target Table Shows the SAP HANA database technical name of the target table.

Statements Shows a link you can click to view all the statements of the task in the Statements tab, if the
information is available.

 Note

You can see this information if the option Enable Expensive Statement Tracing is enabled
in  (Configuration) → Monitoring. See Configure Monitoring.

However, as statements are traced for a limited period, you may not be able to see the
statements used in the task.

Out-of-Memory Shows if the task has an out-of-memory error ("Yes" is then displayed) or not ("No" is then displayed).

Task Log ID Shows the identifier of the run task.

Start Date Shows at which date the task has started to run.
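As an illustration of how the columns above can be used together, here is a minimal Python sketch that scans task log rows (plain dictionaries with hypothetical values; the field names mirror the Task Logs columns described above) for failed or out-of-memory runs, largest memory consumer first:

```python
# Illustrative only: rows mimic the Task Logs columns described above;
# in practice you would export or query this data rather than hard-code it.
task_logs = [
    {"Object Name": "V_SALES", "Status": "completed", "Out-of-Memory": "No",
     "SAP HANA Peak Memory": 512.0, "Duration (sec)": 42},
    {"Object Name": "DF_LOAD", "Status": "failed", "Out-of-Memory": "Yes",
     "SAP HANA Peak Memory": 8192.0, "Duration (sec)": 310},
    {"Object Name": "RT_CRM", "Status": "failed", "Out-of-Memory": "No",
     "SAP HANA Peak Memory": 256.0, "Duration (sec)": 5},
]

# Runs worth investigating: failed or out-of-memory, sorted by peak memory.
suspects = sorted(
    (r for r in task_logs
     if r["Status"] == "failed" or r["Out-of-Memory"] == "Yes"),
    key=lambda r: r["SAP HANA Peak Memory"],
    reverse=True,
)
for r in suspects:
    print(r["Object Name"], r["Status"], r["SAP HANA Peak Memory"])
```

The same triage can of course be done directly in the table with the sorting and filtering options described below.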

You can cancel a task run by selecting one single task and clicking Cancel Task. You can cancel a task run on the following objects:

Transformation flow

Remote table

View

Data flow

Task chain

Cancelling a task run may be required when it takes too long or when it negatively impacts other runs by taking too many
resources away. Cancelling a task via the System Monitor is the most reliable option: its access isn't restricted when resource
consumption is too high (as it can be in the Data Integration Monitor), and it is the fastest way to cancel a task (compared to the
Database Explorer). The data is rolled back and restored to the state that existed before the task run was initially triggered.

 Note

Data on tasks are kept for the time specified in  (Configuration) → Tasks.

You may not be able to cancel a task via the Data Integration Monitor or the Database Explorer when resource
consumption is too high. You will always be able to cancel a task via the System Monitor.

Statement Logs Tab


In Statement Logs, the table shows the following information, depending on what you've specified in  (Configuration) →
Monitoring:

If the option Enable Expensive Statement Tracing is enabled (it is by default), you can see all the database statements that
exceed the specified thresholds.

If the option Enable Expensive Statement Tracing is disabled, then the Statements tab is disabled.

See Configure Monitoring.

Property Description

Start Time Shows at what time (date and hour) the statement has started to run.

Duration (ms) Shows how many milliseconds the statement has run.

Object Type
Shows the type of object that was run in the statement (for example: view, remote table, data
flow).

Or shows the area where the statement was run:

MDS - this is an SAP HANA multi-dimensional services (MDS) statement, which is
caused for example by stories when SAP Analytics Cloud queries SAP Datasphere.

Data Flow - the statement was run by a data flow.

Analysis - the statement was run by a database analysis user.

Space SQL - the statement was run by a database user of a space.

Business Layer Modeling - the statement was run in the Business Builder.

Data Layer Modeling - the statement was run in the data preview of the view editor in
the Data Builder.

DWC Space Management - the statement was run in the Space Management, for
example, when deploying an object.

DB Usergroup - the statement was run by a user of a database user group.

DWC Administration - the statement was run for an administration task such as writing
a task framework status.

System - any other SAP HANA system statement.

Activity Shows the action that was performed. For example: update, compile, select.

Object Name If the statement is related to a task, it shows the name of the object for which the statement was run.

Schema Name Shows the name of the schema in which the statement is run.

SAP HANA Peak Memory Shows the maximum amount of memory (in MiB) the statement has used during the runtime in SAP
HANA.

 Note
You can see the information if the option Enable Expensive Statement Tracing is enabled and if the
statement exceeds the thresholds specified in  (Configuration) → Monitoring. See Configure
Monitoring.

Otherwise, no number is displayed.

SAP HANA CPU Time Shows the amount of CPU time (in ms) the statement has used in SAP HANA.

 Note
You can see the information if the option Enable Expensive Statement Tracing is enabled and if the
statement exceeds the thresholds specified in  (Configuration) → Monitoring. See Configure
Monitoring.

Otherwise, no number is displayed.

 Note
The CPU time indicates how much time is used for all threads. It means that if the CPU time is
significantly higher than the duration of the statement, then many threads are used. If many
threads are used for a long time, no other tasks should be scheduled at that point in time, or
resource bottlenecks may occur and tasks may even be canceled.

Statement Details Shows the More link that you can click to view the complete SQL statement.

 Note
For MDS queries - If you’ve enabled the tracing of MDS information (see Configure Monitoring), the
payload of the MDS query that is run by SAP Analytics Cloud is displayed. If identified in the
payload, the following information is also displayed: story ID, story name and data sources. You can
copy or download the displayed information.

Parameters Shows the values of the parameters of the statement that are indicated by the character "?" in the
popup that opens when clicking More in the Statement Details column.

Out-of-memory Shows if the statement has an out-of-memory error. If there is one, a timestamp is displayed. If there is
none, NULL is then displayed.

Task Log ID If the statement is related to a task, it shows the identifier of the task within a link, which takes you to
the Tasks tab filtered on this task.

Elastic Compute Node If the statement exceeds the thresholds specified in the option Enable Expensive Statement Tracing
in  (Configuration) → Monitoring (see Configure Monitoring):

Shows the name of the elastic compute node if the statement is run on an elastic compute
node.

Shows a hyphen (-) if the statement is run on the main instance.

Error Code If the statement has failed, it shows the numeric code of the SQL error. See SQL Error Codes in the
SAP HANA SQL Reference Guide for SAP HANA Platform.

Error Message If the statement has failed, it shows a description of the SQL error.

Workload Class If the statement has an out-of-memory error, it shows the name of the workload class whose limit has
been exceeded.

Statement ID Shows the identifier of the statement.

Connection ID Shows the ID used to connect to the database.

Start Date Shows at which date the statement has started to run.

 Note
Data on statements is kept for a time that depends on the thresholds specified in  (Configuration) → Monitoring (see
Configure Monitoring). As only a certain number of statements are kept (30,000 by default), setting very low thresholds can
shorten the retention period considerably (for example, to only a few hours). To keep the statements for a longer time, set the
thresholds accordingly.
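The relationship between the record cap and the retention window can be sketched as simple arithmetic (an approximation assuming a constant rate of traced statements; the rates used here are hypothetical):

```python
def retention_hours(records_kept: int, traced_statements_per_hour: float) -> float:
    """Approximate retention window of the statement log: with a fixed record
    cap, the more statements exceed the thresholds, the shorter the history."""
    return records_kept / traced_statements_per_hour

# With the default cap of 30,000 records:
print(retention_hours(30_000, 250))     # 120.0 hours (about 5 days)
print(retention_hours(30_000, 10_000))  # 3.0 hours (very low thresholds)
```

This is why raising the thresholds (fewer traced statements) or raising the record cap both extend how far back the Statement Logs tab reaches.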

Show/Hide, Filter, Sort and Reorder Task and Statement Columns


You can control the tables in Task Logs and Statement Logs in the following ways:

Reorder the columns by drag and drop.

Sort on a column by clicking the column header and then clicking  (Sort Ascending) or  (Sort Descending).

Filter on a column by using the quick filtering or the advanced filtering options.

Quick Filters

Filter Description

Date and Time Range Enter one date and time range or click  to see the available options:

Single Dates - Today, Yesterday

Date Ranges - From/To, From/To (Date and Time), From, To, From (Date and
Time), To (Date and Time), Last X Minutes/Hours/Days/Weeks

Custom Options - Last 1 Hour, Last 6 Hours, Last 24 Hours

Spaces Select or enter the name of at least one space. All spaces of the tenant are available, even
the ones you are not added to.

Statuses Select or enter at least one status: Completed, Failed, or Running.

 Example
You are looking for all failed and running records that happened last week in the space ACME_TF. Define the time
range by clicking  to display the list of available options, selecting From/To in Date and Time Range, and
selecting the dates relevant to you in the calendar. Then, select the space ACME_TF in the Spaces filter drop-down
list. Finally, select Failed and Running in the Statuses drop-down list. The log list automatically updates after
each filter definition.

 Note
Defined quick filters are shown in the Define Filter dialog. If you add an advanced filter in the Define Filter
dialog, the quick filters fields will be cleared. The filters are not deleted.

Filters defined in the Define Filter dialog are not shown in the quick filters fields.

Advanced Filters

1. Click a column header, then click  (Filter). The Define Filter dialog opens and advanced filtering options are
available.

2. Choose the appropriate section for your filter. If your filter is meant to include data in the table (you could say
"I want my Data Preview to show"), add your filter in the Include section. If your filter is meant to exclude
data from the table (you could say "I want my Data Preview to hide"), add your filter in the Exclude section.
When in the appropriate section, click  (Add Filter) to add a filter.

3. Select a column to filter on, a filtering option, and a value. You can add several filters. Click OK to apply the
filter(s). The currently applied filters are displayed above the table.

 Example
To only see the tasks that have failed on remote tables, in the Include area, select the column Object
Type, then the filtering value contains and enter "REMOTE". Then, add a filter, select the column Status,
then the filtering value contains and enter "FAILED". Once applied, the filter is displayed above the table.

4. Click Clear Filter in the filter strip or  (Remove Filter) in the Define Filter dialog to remove the filter.

 Note
The filtering options available depend on the data type of the column you filter on.

Filters applied to text columns are case-sensitive.

You can enter filter or sort values in multiple columns.

To increase performance, only the first 1,000 rows are displayed. Use filters to find the data you are looking for.
Filters are applied to all rows, but only the first filtered 1,000 rows are displayed.

 Note
If you filter on one of the following columns and you enter a number, use the “.” (period) character as the decimal
separator, regardless of the decimal separator used in the number formatting that you’ve chosen in the general user
settings ( Settings Language & Region ): SAP HANA Peak Memory, SAP HANA CPU Time, SAP HANA Used
Memory and SAP HANA Used Disk.

Show or hide columns by clicking  (Columns Settings) to open the Columns Settings dialog, selecting columns as
appropriate. To return to the default preview columns, click Reset.

Refresh the table at any time by clicking Refresh.

Configure Monitoring
You can control which monitoring data is collected and also obtain independent access to the underlying SAP HANA monitoring
views that power the System Monitor.

Prerequisites

To control which monitoring data is collected, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.

The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.

Procedure
1. In the side navigation area, click  (System)  (Configuration) and then select the Monitoring tab.

2. To obtain independent access to the underlying SAP HANA monitoring views that power the System Monitor:

a. Select a space from the drop-down list and click Confirm Selected Space.

 Note
File spaces are not available in the list as they cannot be chosen as the monitoring space.

b. If you've created the <SAP_ADMIN> space and you want to enable it, click Enable access to SAP Monitoring
Content Space. If there isn't any space named <SAP_ADMIN> in your tenant, this is not available for selection.

For more information, see Working with SAP HANA Monitoring Views.

3. To analyze individual SQL queries whose execution exceeds one or more thresholds, select Enable Expensive Statement
Tracing. Keep the default settings or specify the following parameters to configure and filter the trace details, then save
your changes.

Property Description

In-Memory Tracing Records Specify the maximum number of records that are stored in the monitoring tables.
For example, if about 5 days are traced in the expensive statement tables and you don’t want to change
the thresholds, you can double the number of records in In-Memory Tracing Records so that about 10
days are traced. Be aware that increasing this number will also increase the used storage.

Default: 30,000

Maximum: 100,000

Threshold CPU Time Specify the threshold CPU time of statement execution.
When set to 0, all SQL statements are traced.

Default: 0

Threshold Memory Specify the threshold memory usage of statement execution.
When set to 0, all SQL statements are traced.

Default: 1,024 MB

Maximum: 1 GB

Threshold Duration Specify the threshold execution time.

When set to 0, all SQL statements are traced.

Default: 50,000 microseconds

Trace Parameter Values In SQL statements, field values may be specified as parameters (using a "?" in the syntax). If these
parameter values are not required, then do not select the option to reduce the amount of data traced.

Default: False

If expensive statement tracing is not enabled, then statement information and errors are not traced and you cannot see
them in the System Monitor (see Monitoring SAP Datasphere).

For more information about these parameters, see Expensive Statements Trace in the SAP HANA Cloud, SAP HANA
Database Administration Guide.
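The tracing rule described above can be modeled as a small predicate — a simplified sketch, not SAP code; the semantics follow the parameter table (a statement is traced when it exceeds any configured threshold, and a threshold of 0 traces all statements), and the statement values are hypothetical:

```python
def is_traced(stmt: dict, *, cpu_ms: int = 0, memory_mb: int = 1024,
              duration_us: int = 50_000) -> bool:
    """Return True if a statement would be captured by expensive statement
    tracing. Keyword defaults mirror the documented default thresholds."""
    checks = [
        (cpu_ms, stmt["cpu_ms"]),
        (memory_mb, stmt["memory_mb"]),
        (duration_us, stmt["duration_us"]),
    ]
    # A threshold of 0 traces everything; otherwise the value must exceed it.
    return any(t == 0 or v > t for t, v in checks)

fast = {"cpu_ms": 1, "memory_mb": 1, "duration_us": 1}
slow = {"cpu_ms": 1, "memory_mb": 1, "duration_us": 90_000}
print(is_traced(fast, cpu_ms=100))  # False: no threshold exceeded, none is 0
print(is_traced(slow, cpu_ms=100))  # True: duration exceeds 50,000 µs
print(is_traced(fast))              # True: default CPU threshold is 0
```

The third call shows why the default Threshold CPU Time of 0 effectively traces all statements unless you raise it.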

4. To analyze individual SAP HANA multi-dimensional services (MDS) queries, select Enable MDS Information Tracing and
save.

Property Description

MDS Tracing Records Specify the maximum number of records that are stored for MDS requests in the monitoring tables.
You can increase this number in order to trace more data in the System Monitor.

Default (max): 100,000

If the tracing is enabled, you can view information on MDS queries when clicking More in the column Statement Details of
the Statement Logs tab in the System Monitor (see Monitoring SAP Datasphere).

5. To trace elastic compute node data, select Enable Elastic Compute Node Data Tracing and save.

If the tracing is disabled, only the statements of currently running nodes are displayed in the System Monitor. If a
node is stopped, its information is deleted.

If the tracing is enabled and a node is started and stopped more than once, only the information about the previous
run is displayed. The information is kept for 10 days or is deleted if more than 100 individual elastic compute nodes
have run.

Working with SAP HANA Monitoring Views


You can obtain independent access to the underlying SAP HANA monitoring views that power the System Monitor to do additional
analysis on them and visualize them in SAP Analytics Cloud.

This topic contains the following sections:

Preparing Monitoring Spaces

Monitoring Views

SAP HANA DWC_GLOBAL Schema Monitoring Views

SAP Datasphere Monitoring Views (Delivered via the Content Network)

Preparing Monitoring Spaces


Monitoring information covers all spaces and views, so these monitoring views should not be made accessible to all SAP
Datasphere users. An administrator can select two spaces dedicated to monitoring information and assign users to these spaces
with modeling privileges so that they can work with the monitoring views in the Data Builder.

 Note
The data from these monitoring views is available directly in the System Monitor (see Monitoring SAP Datasphere). Working
with them independently is optional and allows you to do further analysis that is not supported in the standard monitor.

As the monitoring spaces you choose will provide unfiltered access to monitoring views, be aware that the users assigned to the
spaces will be able to see all metadata and object definitions of all spaces.

You can dedicate one or two spaces to monitoring:

Choose a space that you want to contain monitoring views.

 Note
If you have already selected a space for monitoring before version 2021.19, you need to select another space, then select
the initial space again so that you can access all the views.

<SAP_ADMIN> space - This space can contain the pre-configured monitoring views provided by SAP via the Content
Network. First create the space with the space ID <SAP_ADMIN> and the space name <Administration (SAP)>, enable
access to it, and import the package from the Content Network.

 Note
Do not create a space with the space ID <SAP_ADMIN> for another purpose.

Monitoring Views
The following monitoring views are available:

SAP HANA SYS Schema Monitoring Views - All SAP HANA monitoring views start with M_. For more information, see
Monitoring Views in the SAP HANA Cloud, SAP HANA Database SQL Reference Guide.

The views for monitoring expensive statements are M_EXPENSIVE_STATEMENTS and
M_EXPENSIVE_STATEMENT_EXECUTION_LOCATION_STATISTICS (see M_EXPENSIVE_STATEMENTS and
M_EXPENSIVE_STATEMENT_EXECUTION_LOCATION_STATISTICS).

The views M_MULTIDIMENSIONAL_STATEMENT_STATISTICS and M_MULTIDIMENSIONAL_STATEMENTS provide
extensive information about MDS queries. For more information, see M_MULTIDIMENSIONAL_STATEMENT_STATISTICS
System View and M_MULTIDIMENSIONAL_STATEMENTS System View in the SAP HANA Cloud, SAP HANA Database SQL
Reference Guide.

SAP HANA _SYS_STATISTICS Schema Statistics Service Views (see Embedded Statistics Service Views
(_SYS_STATISTICS schema)).

SAP HANA _SYS_BI Schema Tables and Views (see BIMC Tables and Views in the SAP HANA Cloud, SAP HANA Analytics
Catalog (BIMC Views) Reference).

SAP HANA DWC_GLOBAL Schema Monitoring Views (see Working with SAP HANA Monitoring Views).

SAP Datasphere Monitoring Views - Delivered via the Content Network in the <SAP_ADMIN> space (see SAP Datasphere
Monitoring Views (Delivered via the Content Network)).

SAP HANA DWC_GLOBAL Schema Monitoring Views

The following monitoring views have the suffix _V_EXT and are ready to use in the DWC_GLOBAL schema:

SPACE_SCHEMAS_V_EXT:

Column Description

SPACE_ID Identifier of the SAP Datasphere space. Note that one space can contain several schemas.

SCHEMA_NAME Name of the schema used to run the task.

SPACE_USERS_V_EXT:

Column Description

SPACE_ID Identifier of the SAP Datasphere space. Note that one space can contain several users.

USER_NAME Identifier of the user.

USER_TYPE Type of user, such as space technical user (for example database user for open SQL schemas) or global user.

TASK_SCHEDULES_V_EXT:

Column Description

SPACE_ID (Key) Identifier of the SAP Datasphere space which contains the object with the defined schedule.

OBJECT_ID (Key) Identifier of the SAP Datasphere object for which the schedule is defined.

APPLICATION_ID (Key) Identifier of the type of object.
For example: PERSIST (View), EXECUTE (Dataflow), REPLICATE (Remote Tables), RUN_CHAIN (Task
Chain).

ACTIVITY (Key) Identifier of the type of activity applied to the object.

 Note
For each application, you can have multiple activities (for example, replicating or deleting data).

For example: PERSIST (View), EXECUTE (Dataflow), REPLICATE (Remote Tables), RUN_CHAIN (Task
Chain)

OWNER Identifier of the user responsible for the schedule. The schedule is executed on this user's behalf and
consent is checked against this user (< DWC User ID >).

CRON Defines the recurrence of a schedule in CRON format, for example "0 */1 * * *" for hourly (see
Schedule a Data Integration Task (with Cron Expression)). NULL if no schedule is defined or a
SIMPLE schedule is defined.

FREQUENCY Defines the recurrence of a schedule in JSON format (simple format), for example Daily +
start date + time + duration (see Schedule a Data Integration Task (Simple Schedule)). NULL if no
schedule is defined or a CRON schedule is defined.

CHANGED_BY User who last changed the schedule configuration.

CHANGED_AT Timestamp (date and time) at which the schedule was last changed.
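To make the CRON example concrete, here is a toy matcher for the minute and hour fields of a five-field cron expression (illustrative only — a real scheduler uses a full cron implementation):

```python
def field_matches(field: str, value: int) -> bool:
    """Match one cron field: '*' matches anything, '*/n' matches steps of n,
    otherwise the field is treated as a literal number."""
    if field == "*":
        return True
    if field.startswith("*/"):
        return value % int(field[2:]) == 0
    return int(field) == value

def cron_matches(expr: str, minute: int, hour: int) -> bool:
    """Check only the minute and hour fields of a 5-field cron expression."""
    m, h, *_ = expr.split()
    return field_matches(m, minute) and field_matches(h, hour)

# "0 */1 * * *" fires at minute 0 of every hour:
print(cron_matches("0 */1 * * *", minute=0, hour=7))   # True
print(cron_matches("0 */1 * * *", minute=30, hour=7))  # False
```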

TASK_LOGS_V_EXT:

Column Description

TASK_LOG_ID (Key) Uniquely identifies an execution of a task.

SPACE_ID Identifier of the SAP Datasphere space which contains the object with the defined schedule.

APPLICATION_ID Identifier of the type of object.
For example: VIEWS, REMOTE_TABLES, DATA_FLOWS, TASK_CHAINS

OBJECT_ID Identifier of the SAP Datasphere object for which the schedule is defined.

ACTIVITY For each application, there can be multiple activities (for example, replicating or deleting data).
For example: PERSIST, EXECUTE, REPLICATE, RUN_CHAIN

PEAK_MEMORY Captures the highest peak memory consumption (in bytes). Not available for all applications. Requires Enable
Expensive Statement Tracing (see Configure Monitoring).

Returns NULL if not available for the application, if Enable Expensive Statement Tracing is not set, or if
the defined threshold is not reached; otherwise 0 or the value of the memory consumption.

PEAK_CPU Total CPU time (in microseconds) consumed by the task. Not available for all applications. Requires Enable
Expensive Statement Tracing (see Configure Monitoring).

Returns NULL if not available for the application, if Enable Expensive Statement Tracing is not set, or if
the defined threshold is not reached; otherwise 0 or the value of the CPU time consumption.

RECORDS Shows the number of records of the target table after the task has finished running.
Returns NULL (not applicable or not measured), 0, or the number of records.

START_TIME Timestamp (date and time) at which the scheduled task was started.

END_TIME Timestamp (date and time) at which the scheduled task was stopped.

STATUS Reports whether this task execution is still running, completed, or failed.

TRIGGERED_TYPE Indicates if task execution was triggered manually (DIRECT) or via schedule (SCHEDULED).

APPLICATION_USER The user on whose behalf the schedule was executed (the owner at this point in time).

DURATION Duration in seconds of the task execution (also works for ongoing execution).

START_DATE Date when the scheduled task was started.
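The DURATION semantics ("also works for ongoing execution") can be sketched in a few lines of Python — an illustrative reimplementation, not the view's actual SQL:

```python
from datetime import datetime, timezone

def task_duration_seconds(start_time, end_time=None, now=None):
    """Duration in seconds as exposed by the DURATION column: for an ongoing
    execution (END_TIME is NULL), the duration keeps growing until the task
    ends, measured against the current time."""
    now = now or datetime.now(timezone.utc)
    return ((end_time or now) - start_time).total_seconds()

start = datetime(2025, 7, 9, 8, 0, tzinfo=timezone.utc)
end = datetime(2025, 7, 9, 8, 5, tzinfo=timezone.utc)
print(task_duration_seconds(start, end))  # 300.0 (completed task)
print(task_duration_seconds(
    start, now=datetime(2025, 7, 9, 8, 1, tzinfo=timezone.utc)))  # 60.0 (still running)
```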

TASK_LOG_MESSAGES_V_EXT:

Column Description

TASK_LOG_ID (Key) Uniquely identifies an instance of a task.

MESSAGE_NO (Key) Order sequence of all messages belonging to a certain Tasklog ID.

SEVERITY Indicates if the message provides general information (INFO) or error information (ERROR).

TEXT The message itself.

DETAILS Technical additional information. For example, it can be an error stack or a correlation ID.

TASK_LOCKS_V_EXT:

Column Description

LOCK_KEY (Key) Flexible field that forms part of the lock identifier; usually set to WRITE or EXECUTE.

APPLICATION_ID (Key) Identifier of the type of object.

SPACE_ID (Key) Identifier of the SAP Datasphere space which contains the object with the defined schedule.

OBJECT_ID (Key) Identifier of the SAP Datasphere object for which the schedule is defined.

TASK_LOG_ID Uniquely identifies the task execution that set the lock.

CREATION_TIME Indicates when the lock has been set.

 Note
Cross-space sharing is active for all SAP HANA monitoring views. The row level access of shared views is bound to the space
read access privileges of the user who consumes the view.

SAP Datasphere Monitoring Views (Delivered via the Content Network)


These SAP Datasphere monitoring views help you monitor data integration tasks in a more flexible way. They are built on the
V_EXT views, and are enriched with further information as preparation for consumption in an SAP Analytics Cloud story.

See the blogs SAP Datasphere: Data Integration Monitoring – Sample Content for Reporting (published in October 2021) and
SAP Datasphere: Data Integration Monitoring – Running Task Overview (published in November 2021).

You must:

Create a space with the space ID <SAP_ADMIN> and the space name <Administration (SAP)> and configure it as a
monitoring space by enabling the toggle Enable Access to SAP Monitoring Content Space (see Configure Monitoring).

Import the Technical Content: Task Monitoring package from the Content Network (see Importing SAP and
Partner Business Content from the Content Network).

The following views are available:

SAP_TCT_TASK_LOGS_V_R_01: Monitoring: Task Execution Headers - Exposes:

Task properties, such as duration and execution status (e.g. failed, completed, ...).

Various measures for counting tasks (e.g. failed).

The schedule description.

Locking status

Uses the views TASK_LOCKS_V_EXT, TASK_SCHEDULES_V_EXT and TASK_LOGS_V_EXT.

Best Practice: To enable the navigation between SAP Datasphere and SAP Analytics Cloud, you must change the constant
for the url_host to your SAP Datasphere instance. Open the view in the view editor, and update the URL host constant.
SAP_TCT_TASK_SCHEDULE_V_R_01: Monitoring: Schedule Properties - Exposes the properties of a data integration
schedule.

Uses the view TASK_SCHEDULES_V_EXT and adds a row count to be compatible with OLAP reporting.

SAP_TCT_TASK_MSG_V_R_01: Monitoring: Task Execution Items - Exposes:

All messages occurring during data integration monitoring.

Error code, header line and first stack line parsed out from detailed message.

An indicator that the task_id has an error (facilitate filtering of messages).

Uses the views TASK_LOG_MESSAGES_V_EXT and TASK_LOGS_V_EXT.

Best Practice: To enable the navigation between SAP Datasphere and SAP Analytics Cloud, you must change the constant
for the url_host to your SAP Datasphere instance. Open the view in the view editor, and update the URL host constant.

Monitor Database Operations with Audit Logs


Monitor the read and change actions (policies) performed in the database with audit logs, and see who did what and when.

Prerequisites
To monitor database operations with audit logs, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.

The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.

Context
If Space Administrators have enabled audit logs to be created for their space (see Logging Read and Change Actions for Audit),
you can get an overview of these audit logs. You can do analytics on audit logs by assigning the audit views to a dedicated space
and then work with them in a view in the Data Builder.

 Note
Audit logs can consume a large amount of disk space in your database, especially when combined with long retention periods
(which are defined at the space level). You can delete audit logs when needed, which will free up disk space. For more
information, see Delete Audit Logs.

Procedure

1. Choose a space that will contain the audit logs.

Go to System Configuration Audit . To save and later display the audit logs directly in a certain space, choose a space
from the drop-down list. We recommend creating a dedicated space for audit logs, as you might not want
all users to view sensitive data.

2. Open the Data Builder, create a view, and add one or more of the following views from the DWC_AUDIT_READER schema
as sources:

DPP_AUDIT_LOG - Contains audit log entries.

AUDIT_LOG_OVERVIEW - Contains audit policies (read or change operations) and the number of audit log entries.

ANALYSIS_AUDIT_LOG - Contains audit log entries for database analysis users. For more information, see Create
a Database Analysis User to Debug Database Issues.
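Once the audit views are added as sources, a typical first step is counting entries per schema and policy. The following is a minimal Python sketch of that aggregation; the row structure and column names used here are illustrative, not the exact columns of DPP_AUDIT_LOG:

```python
from collections import Counter

def summarize_audit_entries(rows):
    """Count audit log entries per (schema, policy) pair.

    `rows` mimics entries read from an audit view; the keys
    "schema_name" and "audit_policy" are assumed for illustration.
    """
    counts = Counter((r["schema_name"], r["audit_policy"]) for r in rows)
    return dict(counts)

# Sample entries standing in for exported audit log rows
entries = [
    {"schema_name": "SALES", "audit_policy": "read"},
    {"schema_name": "SALES", "audit_policy": "change"},
    {"schema_name": "SALES", "audit_policy": "read"},
]
print(summarize_audit_entries(entries))
```

In practice you would build the equivalent aggregation as a view in the Data Builder on top of the audit views themselves.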


Delete Audit Logs


Delete audit logs and free up disk storage.

You can delete audit logs for:

Spaces for which auditing is enabled. For each space, you can delete separately all the audit log entries recorded for read
operations and all the audit log entries recorded for change operations. All the entries recorded before the date and time
you specify are deleted.

All read audit logs recorded for all database analysis users. They are grouped together into the audit policy
DWC_ANALYSIS_USERS_AUDIT_ALL.

1. Go to System Configuration Audit Audit Log Deletion .

2. Select the spaces (and the audit policy names - read or change) or the database analysis user audit policy
(DWC_ANALYSIS_USERS_AUDIT_ALL) for which you want to delete all audit log entries and click Delete.

3. Select a date and time and click Delete.

All entries that have been recorded before this date and time are deleted.

Deleting audit logs frees up disk storage, which you can see in the Disk Storage Used card in System Monitor
Dashboard .

 Note
Audit logs are automatically deleted when performing the following actions: deleting a space, deleting a database user (open
SQL schema), disabling an audit policy for a space, disabling an audit policy for a database user (open SQL schema),
unassigning an HDI container from a space. Before performing any of these actions, you may want to export the audit log
entries, for example by using SAP HANA Database Explorer (see Logging Read and Change Actions for Audit).
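The deletion rule described above (all entries recorded before a chosen date and time are removed) can be sketched as a simple filter. This is an illustrative Python sketch of the selection logic only, not the actual deletion mechanism; the entry structure is assumed:

```python
from datetime import datetime

def entries_to_delete(entries, cutoff):
    """Select audit log entries recorded before the chosen cutoff,
    mirroring the Audit Log Deletion dialog's date-and-time rule.
    The "timestamp" key is illustrative."""
    return [e for e in entries if e["timestamp"] < cutoff]

log = [
    {"timestamp": datetime(2025, 5, 1), "policy": "read"},
    {"timestamp": datetime(2025, 7, 1), "policy": "change"},
]
# Only the entry recorded before June 1 would be deleted
print(len(entries_to_delete(log, datetime(2025, 6, 1))))
```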

Monitor Object Changes with Activities


Monitor the changes that users perform on modeling objects (such as spaces and tables) as well as changes to the system
configuration (such as roles and users).

This topic contains the following sections:

Prerequisites

Context

View all Activities and Filter on Specific Activities

Download and Delete the Activity Log for a Specific Time Period

Prerequisites
To monitor object changes with activities, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

Activity Log (-R------) - To view and download activities. The DW Administrator role template, for example, grants this
privilege.

Activity Log (---D----) - To delete activity logs.

For more information, see Privileges and Permissions and Standard Roles Delivered with SAP Datasphere.

Context
Actions that are performed by users are logged in Security Activities .

For example:

Space creation and changes

Table changes

Role changes and assignments

Logged users

View all Activities and Filter on Specific Activities

To view activities, you must have a global role that grants you the privilege:

Activity Log (-R------) – Read activities

The DW Administrator global role, for example, grants this privilege.

1. In the side navigation area, click Security Activities .

2. To filter for specific types of activities, select  (Filter).

In the Set Filters dialog, you can select one or more parameters to filter in the Available Filters list. In the Active Filters list,
type or choose a filter value for each parameter that you select. When you click OK, the log is filtered according to your
selections.

If you apply filters to the log, the entries that you filter out are also excluded if you download the activity data.

Download and Delete the Activity Log for a Specific Time Period
To download and delete activity logs, you must have a global role that grants you the privilege:

Activity Log (-R-D----) – Read and delete activities

The DW Administrator global role, for example, grants this privilege.

When the size of the activity log approaches the limit, users who have the Delete permission for the Activity Log privilege will
receive an email and an alert notification. Further alerts will be sent if the log continues to grow closer to the limit.

When the activity log reaches its limit, final notifications are sent, and then the oldest rows will be deleted from the system to keep
the log size below the limit. To reduce the size of the log, you can first download part or all of the log as CSV files, and then delete
those log entries from the system.

The default limit for the activity log is 500,000 rows. You can request that this number be changed to a higher number, or changed
to a one-year rolling period, by entering a support ticket.

1. In the side navigation area, click Security Activities .

You can also open the Activities page directly from the link in the notification email.

2. If you want to filter the activities that you will download, select  (Filter).

 Tip
Filtering the activity log can be useful when collecting troubleshooting data, but is usually not necessary for archiving
activity log data.

In the Set Filters dialog, select the filters that you want to apply, and choose a value for each filter. Time Stamp filters will
be overridden by your settings in the Download Activities dialog.

3. Select Download Options.

4. In the Download Activities dialog, type a file name for the download in the Name field.

5. Select a Starting Date and an End Date, and select Download.

The rows within the dates and filters that you specified are downloaded as CSV files with up to 75,000 rows each.

6. To delete activity data, select the (Delete options) icon.

7. In the Delete Activities dialog, select a Time period.

8. If you choose Specific range, set a Starting Date and an End Date.

We recommend deleting the same range that you downloaded.

 Note
Filters applied in the Activities page don't apply to the delete operation.

9. Select Delete.

All activity rows in the specified time period are deleted from the system.
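Because downloads are split into files of up to 75,000 rows each, you may want to combine them into a single file for archiving. The following is a minimal Python sketch, assuming each chunk carries the same header row (the column names shown are illustrative):

```python
import csv
import io

def merge_activity_chunks(chunks):
    """Merge several downloaded activity-log CSV chunks into one CSV string.

    Each chunk is assumed to repeat the same header row; headers after
    the first chunk are skipped so the result has a single header.
    """
    out = io.StringIO()
    writer = None
    for chunk in chunks:
        reader = csv.reader(io.StringIO(chunk))
        header = next(reader)
        if writer is None:
            writer = csv.writer(out)
            writer.writerow(header)  # keep only the first header
        for row in reader:
            writer.writerow(row)
    return out.getvalue()

# Two sample chunks standing in for downloaded CSV files
a = "Timestamp,Activity\n2025-07-01,CREATE SPACE\n"
b = "Timestamp,Activity\n2025-07-02,DELETE TABLE\n"
print(merge_activity_chunks([a, b]))
```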

Delete Task Logs to Reduce Storage Consumption


In the Configuration area, you can check how much space the task logs are using on your tenant, and decide to delete
obsolete ones to reduce storage consumption.

Prerequisites

To delete task logs and reduce storage consumption, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.

The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.

Context
Each time an activity runs in SAP Datasphere (for example, replicating a remote table), task logs are created to allow you to
check if the activity is running smoothly or if there is an issue to solve. You access these detailed task logs by navigating to the
Data Integration Monitor - Details screen of the relevant object, for example, by clicking the button of the relevant remote table.
See Managing and Monitoring Data Integration.

However, task logs can consume a lot of space in a tenant. Deleting old task logs that are no longer needed can release
storage space. This is why SAP Datasphere has a log deletion schedule activated by default. You can change the schedule
by defining your own criteria, or decide to take immediate deletion actions.

Procedure
1. In the side navigation area, click  (Configuration) → Tasks.

2. Check how much size the task logs consume on your tenant. If needed, decide how you want to delete the task logs.

Schedule Task Log Deletion: SAP Datasphere automatically triggers log deletion using the following default criteria:

Deletion tasks will be run every 4 months

Task logs older than 200 days will be deleted

Go to the Schedule Task Log Deletion section, update the deletion criteria to suit your needs, and click Save.

Manual Deletion: To take immediate action, you can manually delete task logs. Go to the Manually Delete Task
Log section and determine how long you want to keep the logs. For example, delete the logs that are older than 100
days.

Click Delete.
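The "older than N days" criterion amounts to computing a cutoff timestamp and deleting everything recorded before it. A small Python sketch of that calculation (illustrative only, not part of SAP Datasphere):

```python
from datetime import datetime, timedelta, timezone

def retention_cutoff(days, now=None):
    """Return the cutoff timestamp: task logs recorded before this
    point are candidates for deletion under an 'older than N days' rule."""
    now = now or datetime.now(timezone.utc)
    return now - timedelta(days=days)

# With the default criterion of 200 days, evaluated on 2025-07-09:
now = datetime(2025, 7, 9, tzinfo=timezone.utc)
cutoff = retention_cutoff(200, now)
print(cutoff.date())
```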

Create a Database Analysis User to Debug Database Issues


Database analysis users are SAP HANA Cloud database users who have read-only access to all space schemas, and all their
activities are recorded in audit logs. You create a database analysis user to monitor, analyze, trace, or debug your SAP Datasphere
database, and resolve a specific database issue.

Prerequisites

To create a database user to monitor, analyze, trace, or debug your SAP Datasphere database, you must have a global role that
grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.

The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.

Context

A user with an administrator role can create a database analysis user.

 Note
You should only create a database analysis user to resolve a specific database issue and then delete it immediately after the
issue is resolved (see Manage Database Analysis Users). This user can access all SAP HANA Cloud monitoring views and all
SAP Datasphere data in all spaces, including any sensitive data stored there.

Procedure
1. In the side navigation area, click  (System)  (Configuration) Database Access Database Analysis Users .

2. Click Create and enter the following properties in the dialog:

Property Description

Database Analysis User Name Suffix - Enter the suffix, which is used to create the full name of the user. Can contain a
maximum of 31 uppercase letters or numbers and must not contain spaces or special characters other than _
(underscore). See Rules for Technical Names.

Enable Space Schema Access - Select only if you need to grant the user access to space data.

Database analysis user expires in - Select the number of days after which the user will be deactivated. We strongly
recommend creating this user with an automatic expiration date.

3. Click Create to create the user.

The host name and port, as well as the user password, are displayed. Note these for later use.
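The naming rule for the suffix can be sketched as a simple validation check. This is an illustrative helper, not an official SAP Datasphere API:

```python
import re

# Mirrors the stated rule: at most 31 characters, consisting only of
# uppercase letters, digits, or underscore (no spaces, no other specials).
SUFFIX_RULE = re.compile(r"^[A-Z0-9_]{1,31}$")

def is_valid_suffix(suffix):
    """Check a proposed database analysis user name suffix (illustrative)."""
    return bool(SUFFIX_RULE.match(suffix))

print(is_valid_suffix("DEBUG_2025"))
print(is_valid_suffix("lowercase"))
```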

4. Select your user in the list and then click one of the following and enter your credentials:

Open SAP HANA Cockpit - Open the Database Overview Monitoring page for the SAP Datasphere run-time
database, which offers various monitoring tools.

For more information, see Using the Database Overview Page to Manage a Database.

Open Database Explorer - Open an SQL Console for the SAP Datasphere run-time database.

For more information, see Getting Started With the SAP HANA Database Explorer.

A database analysis user can run a procedure in Database Explorer to stop running statements. For more
information, see Stop a Running Statement With a Database Analysis User.

 Note
All actions of the database analysis user are logged in the ANALYSIS_AUDIT_LOG view, which is stored in the space
that has been assigned to store audit logs (see Logging Read and Change Actions for Audit).

Audit logs can consume a large amount of disk space in your SAP Datasphere tenant database. The audit log entries for
database analysis users are kept for 180 days, after which they are automatically deleted. You can also manually delete
the audit logs to free up disk space (see Delete Audit Logs). Also, a database analysis user can be automatically
deactivated due to a large amount of disk storage consumed by audit logs (see Manage Database Analysis Users).

Manage Database Analysis Users


You should delete a database analysis user immediately after the issue is resolved to avoid misuse of sensitive data. For database
analysis users that are still needed, you can reactivate, unlock, or extend them.

This topic contains the following sections:

Introduction to Database Analysis User Management

Reactivate a Database Analysis User

Unlock a Database Analysis User

Extend a Database Analysis User

Delete a Database Analysis User

Introduction to Database Analysis User Management


The database analysis users can have one of the following statuses:

Status Description

Active The database analysis user is active and can be used.

Deactivated The database analysis user has been deactivated because its audit logs have exceeded the disk storage
threshold.

Locked The database analysis user has been locked after too many failed login attempts.

Expired The database analysis user has passed the expiration date that you've set when creating it.

Reactivate a Database Analysis User

A database analysis user is deactivated and its status set to Deactivated because its audit logs have exceeded the disk storage
threshold.

If the total size of all audit logs in the tenant has reached more than 40% of the tenant disk storage, the system automatically
deactivates any database analysis users, and locks any spaces, whose audit logs consume more than 30% of the total audit log
size.

You can reactivate a deactivated database analysis user by deleting its audit log entries so that they fall below the threshold (see
Delete Audit Logs). The database analysis user will be automatically reactivated after a few minutes.
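The two thresholds combine as follows. This Python sketch mirrors the documented rule for illustration only; the function and its parameters are hypothetical, not an SAP API:

```python
def should_deactivate(user_log_gb, total_audit_gb, tenant_disk_gb):
    """Mirror the documented deactivation rule: the tenant's audit logs
    must exceed 40% of disk storage AND this user's logs must exceed
    30% of the total audit log size. Illustrative only."""
    return (total_audit_gb > 0.4 * tenant_disk_gb
            and user_log_gb > 0.3 * total_audit_gb)

# 130 GB of audit logs on a 300 GB tenant (over 40%), and this user's
# 50 GB is over 30% of the 130 GB total, so deactivation would apply.
print(should_deactivate(user_log_gb=50, total_audit_gb=130, tenant_disk_gb=300))
```

Deleting enough of the user's audit log entries brings it back below the threshold, after which the user is reactivated automatically.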

Unlock a Database Analysis User


After too many failed login attempts, a database analysis user is locked and its status set to Locked.

You can unlock a locked database analysis user by requesting a new password for it.

1. In the side navigation area, click  (System)  (Configuration) Database Access Database Analysis Users .

2. Click the icon next to the Locked status of the database analysis user.

3. In the dialog box that opens, click Request New Password.

A new password is automatically generated.

Extend a Database Analysis User


If the expiration date of a database analysis user has been reached, the user is automatically deactivated and its status set to
Expired.

You can extend an expired database analysis user.

1. In the side navigation area, click  (System)  (Configuration) Database Access Database Analysis Users .

2. Click the icon next to the Expired status of the database analysis user.

3. In the dialog box that opens, select the number of days after which the user will expire and click Reactivate Analysis User.


Delete a Database Analysis User


Delete your database analysis user immediately after the issue is resolved to avoid misuse of sensitive data.

1. In the side navigation area, click  (System)  (Configuration) Database Access Database Analysis Users .

2. Select the user you want to delete and then click Delete.

Deleting a database analysis user does not delete its audit logs. The audit logs will be deleted after a retention period of 180 days.
As they can consume a large amount of disk storage, you may want to manually delete them before the end of the retention period
(see Delete Audit Logs).

Stop a Running Statement With a Database Analysis User


Using a database analysis user, you can stop a statement that is currently running.

You may for example want to stop a statement that has been running for a long time and is causing performance issues.

You can only stop a statement that has been run by space users, analysis users, user group users and Data Provisioning Agent
users.

In SAP HANA Database Explorer, run a database procedure using the following syntax:

CALL "DWC_GLOBAL"."STOP_RUNNING_STATEMENT"('<ACTION>', '<CONNECTION_ID>')

Complete the parameters as follows:

Complete the parameters as follows:

ACTION - Enter CANCEL to run the statement ALTER SYSTEM CANCEL [WORK IN] SESSION (see ALTER SYSTEM CANCEL
[WORK IN] SESSION Statement (System Management) in the SAP HANA Cloud, SAP HANA Database SQL Reference
Guide), or enter DISCONNECT to run the statement ALTER SYSTEM DISCONNECT SESSION (see ALTER SYSTEM
DISCONNECT SESSION Statement (System Management) in the SAP HANA Cloud, SAP HANA Database SQL Reference
Guide).

CONNECTION_ID - Enter the ID of the connection to the database, which corresponds to the statement that you want to
stop.

 Note
You can find the connection ID in  (System Monitor) Statement Logs , in the Connection ID column.

For more information on the database explorer, see Getting Started With the SAP HANA Database Explorer.
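Before running the procedure in an SQL console, it can help to assemble the call programmatically. The following hedged Python sketch only builds the statement string; the helper function is hypothetical, and actually executing it would require a database connection (for example via the SAP HANA Python client):

```python
def build_stop_statement(action, connection_id):
    """Build the documented DWC_GLOBAL.STOP_RUNNING_STATEMENT call.

    `action` must be CANCEL or DISCONNECT; `connection_id` comes from
    System Monitor > Statement Logs (Connection ID column).
    """
    if action not in ("CANCEL", "DISCONNECT"):
        raise ValueError("action must be CANCEL or DISCONNECT")
    return ('CALL "DWC_GLOBAL"."STOP_RUNNING_STATEMENT"'
            f"('{action}', '{connection_id}')")

# Example: cancel the statement running on connection 300042
print(build_stop_statement("CANCEL", "300042"))
```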

Configure Notifications
Configure notifications about system events and network connection issues, and define the SMTP server to be used for email
deliveries.


Prerequisites

To configure notifications, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.

The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.

Notify All Users about Network Connection Issues


When there are problems with a system, your users would like to know whether it is something that they control or if the issues are
related to the network. You can't create messages for all situations, but you can let them know when the network connection is
unstable.

To turn on the connection notification:

1. In the side navigation area, click System Administration Notifications .

2. To enable editing of all settings on the page, click Edit.

3. In the Connections Notifications section, change the toggle to ON.

4. Click Save to commit your changes.

When the notification is on, everyone who uses the application on that tenant will see the notification in the top right corner of
their application.

Notify Users When They are Added to the Tenant

By default, when users are added to your SAP Datasphere tenant, they receive a welcome email which contains a link to the tenant
so they can activate their account and log in for the first time. You can disable the welcome email from being sent to new users. You
may want to do so in the following cases:

When SAML single sign-on (SSO) is set up and it's not necessary for the users to activate their account.

When you want to set up single sign-on (SSO) before users are given the go-ahead to access the system.

When the custom SAML Identity Provider (IdP) is changed.

When you need to import users from a public tenant to a private tenant.

To disable the welcome email notification:

1. In the side navigation area, click System Administration Notifications .

2. To enable editing of all settings on the page, click Edit.

3. In the Welcome Email section, toggle the button to Off.

4. Click Save.

 Note
If you disable the welcome email and then add a user who doesn't have an activated account for SAP Datasphere, they will not
be able to access the system. The new user needs to go to the tenant log-on page and click "Forgot password?". They must
enter the email address associated with their account and follow the instructions of the received email to set up a password.

Configure Custom SMTP Server

Configuring an email server of your choice ensures greater security and flexibility while delivering email for your business.

1. In the side navigation area, click System Administration Notifications .

2. To enable editing of all settings on the page, click Edit.

3. In the Email Server Configuration section, select Custom, and complete the following properties.

4. Click Check Configuration or Save to validate the configuration details.

Check Consent Expirations


View a list of users whose authorization consent will expire in less than four weeks.

Prerequisites
To check consent expirations, you must have a global role that grants you the following privileges:

Data Warehouse General (-R------) - To access SAP Datasphere.

System Information (-RU-----) - To access the Administration and Configuration areas in the System tool.

The DW Administrator role template, for example, grants these privileges. For more information, see Privileges and Permissions
and Standard Roles Delivered with SAP Datasphere.

Procedure
1. To view a list of users whose authorization consent will expire within the next four weeks, click  (Configuration) Tasks
.

2. In the Consent Expiration section of the Tasks page, click the View Expiration List link. SAP Datasphere displays a
dialog in which you can view a list of users whose authorization consent will expire within a given timeframe.


By default, the dialog displays a list of users whose consent will expire within four weeks. You can change the default expiration
timeframe to anywhere between one and four weeks. In addition to displaying the list of users whose consent will soon expire, you
can also select a user in the list and click the Show Affected Tasks link to view the collection of tasks that user has scheduled.

Monitor Capacities
View the amount of capacity units you have used each month.

Context
Monitor Capacities provides insights into monthly and daily capacity unit consumption, allowing users to track usage relative to
their subscription and download detailed hourly data. This tool is useful for optimizing resource allocation and ensuring efficient
subscription management.

Card Description

Total CU Consumption Shows the number of capacity units consumed this month.

Total CU Consumption: Relative to Your Shows the percentage of your capacity unit subscription that is used this month.
Subscription

Total CU Consumption: Daily Displays a bar chart showing the amount of capacity units consumed each day of
this month.

You can also download a CSV file to view the hourly consumption of capacity units. The screenshot shows the rows that go
together for processing.

The following table explains the information in each column.

Column Heading Description

MEASUREMENT_PERIOD_START - Marks the beginning of the time period in yyyy-mm-dd hh:mm:ss format. The time data is
separated by hour.

CONSUMED_BLOCKS - Shows the approximate consumed block hours.

CONSUMED_CU - Shows the approximate consumed capacity units.

MEASUREMENT_PERIOD_END - Marks the end of the time period in yyyy-mm-dd hh:mm:ss format. The time data is
separated by hour.

MEASURE_NAME - Shows the type of measure, such as THRESHOLD_MEMORY or PREMIUM_OUTBOUND.

OBJECT_NAME - Shows the name of the object when the consumption is associated with a specific object.

SPACE_NAME - Shows the name of the space when the consumption is associated with a specific space.

The values shown for CONSUMED_CU and CONSUMED_BLOCKS are not final and can change. For metrics involving multiple tasks
that generate CU consumption, such as Premium Outbound Integration or ECN, the values are approximate due to the concurrent
nature of those tasks. When the values in these columns are empty, the cause could be:

The entry did not record any consumption. This might happen when there are multiple workflows present, as entries are
still displayed even though they were not running at the time.

The values are not available because the measurement has not been consolidated yet.
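As a sketch of working with the downloaded file, the hourly CONSUMED_CU values can be rolled up into daily totals. This is illustrative Python over the documented columns; only MEASUREMENT_PERIOD_START and CONSUMED_CU are used, and empty (not yet consolidated) values are skipped:

```python
import csv
import io
from collections import defaultdict

def daily_cu_totals(csv_text):
    """Aggregate hourly CONSUMED_CU values from the downloaded capacity
    CSV into per-day totals, skipping empty (unconsolidated) values."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        cu = row.get("CONSUMED_CU", "")
        if cu:
            day = row["MEASUREMENT_PERIOD_START"][:10]  # yyyy-mm-dd prefix
            totals[day] += float(cu)
    return dict(totals)

# Sample standing in for a downloaded capacity metrics CSV
sample = (
    "MEASUREMENT_PERIOD_START,CONSUMED_CU\n"
    "2025-07-01 00:00:00,1.5\n"
    "2025-07-01 01:00:00,2.0\n"
    "2025-07-02 00:00:00,\n"
)
print(daily_cu_totals(sample))
```

Keep in mind that such totals remain approximate, since the underlying values can still change before consolidation.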

Procedure
1. From the side navigation menu, click  (System Monitor) Capacities .

The Capacities page is shown.

2. To download a CSV file of the consumption, click  Download Capacity Metrics as CSV.

3. Click  and select the beginning and end dates for the report.

4. Click Download.

The CSV file is downloaded, and you can view it in your spreadsheet application.

 Note
The CSV file contains approximate data that may not reflect the finalized monthly total.
