
Salesforce.org
Architect Academy
for Partners (Nonprofit / Education)
Forward-Looking Statements
This presentation contains forward-looking statements about, among other things, trend analyses and future events, future financial performance, anticipated growth, industry
prospects, environmental, social and governance goals, and the anticipated benefits of acquired companies. The achievement or success of the matters covered by such
forward-looking statements involves risks, uncertainties and assumptions. If any such risks or uncertainties materialize or if any of the assumptions prove incorrect, Salesforce’s
results could differ materially from the results expressed or implied by these forward-looking statements. The risks and uncertainties referred to above include those factors
discussed in Salesforce’s reports filed from time to time with the Securities and Exchange Commission, including, but not limited to: the impact of, and actions we may take in
response to, the COVID-19 pandemic, related public health measures and resulting economic downturn and market volatility; our ability to maintain security levels and service
performance meeting the expectations of our customers, and the resources and costs required to avoid unanticipated downtime and prevent, detect and remediate performance
degradation and security breaches; the expenses associated with our data centers and third-party infrastructure providers; our ability to secure additional data center capacity; our
reliance on third-party hardware, software and platform providers; the effect of evolving domestic and foreign government regulations, including those related to the provision of
services on the Internet, those related to accessing the Internet, and those addressing data privacy, cross-border data transfers and import and export controls; current and
potential litigation involving us or our industry, including litigation involving acquired entities such as Tableau Software, Inc. and Slack Technologies, Inc., and the resolution or
settlement thereof; regulatory developments and regulatory investigations involving us or affecting our industry; our ability to successfully introduce new services and product
features, including any efforts to expand our services; the success of our strategy of acquiring or making investments in complementary businesses, joint ventures, services,
technologies and intellectual property rights; our ability to complete, on a timely basis or at all, announced transactions; our ability to realize the benefits from acquisitions,
strategic partnerships, joint ventures and investments, including our July 2021 acquisition of Slack Technologies, Inc., and successfully integrate acquired businesses and
technologies; our ability to compete in the markets in which we participate; the success of our business strategy and our plan to build our business, including our strategy to be a
leading provider of enterprise cloud computing applications and platforms; our ability to execute our business plans; our ability to continue to grow unearned revenue and
remaining performance obligation; the pace of change and innovation in enterprise cloud computing services; the seasonal nature of our sales cycles; our ability to limit customer
attrition and costs related to those efforts; the success of our international expansion strategy; the demands on our personnel and infrastructure resulting from significant growth
in our customer base and operations, including as a result of acquisitions; our ability to preserve our workplace culture, including as a result of our decisions regarding our current
and future office environments or work-from-home policies; our dependency on the development and maintenance of the infrastructure of the Internet; our real estate and office
facilities strategy and related costs and uncertainties; fluctuations in, and our ability to predict, our operating results and cash flows; the variability in our results arising from the
accounting for term license revenue products; the performance and fair value of our investments in complementary businesses through our strategic investment portfolio; the
impact of future gains or losses from our strategic investment portfolio, including gains or losses from overall market conditions that may affect the publicly traded companies
within our strategic investment portfolio; our ability to protect our intellectual property rights; our ability to develop our brands; the impact of foreign currency exchange rate and
interest rate fluctuations on our results; the valuation of our deferred tax assets and the release of related valuation allowances; the potential availability of additional tax assets in
the future; the impact of new accounting pronouncements and tax laws; uncertainties affecting our ability to estimate our tax rate; uncertainties regarding our tax obligations in
connection with potential jurisdictional transfers of intellectual property, including the tax rate, the timing of the transfer and the value of such transferred intellectual property;
uncertainties regarding the effect of general economic and market conditions; the impact of geopolitical events; uncertainties regarding the impact of expensing stock options and
other equity awards; the sufficiency of our capital resources; our ability to comply with our debt covenants and lease obligations; and the impact of climate change, natural
disasters and actual or threatened public health emergencies, including the ongoing COVID-19 pandemic.
About This Course
● ~2 hours of instruction each day (4 days)
● This is not an introductory course
● The content is most relevant for Advanced Administrators, Technical Consultants, Developers, and Architects with deep technical skills
● Attendees are also strongly encouraged to work through the CTA Trailmix to be fully prepared: Architect Journey: Prepare to Become a Certified Technical Architect (CTA)


Salesforce.org
Architect Academy
for Partners (Nonprofit / Education)
Day 1

Presenters:
● Claudio Moraes, Partner Success Director, Technical Lead
● Fabrice Pierre, Partner Success Manager, EMEA
Would you consider yourself?
● Salesforce Advanced Administrator
● Salesforce Developer
● Salesforce Architect
Learning Journey
Course topics (from the journey diagram):
● Nonprofit & Education Products; Salesforce for Nonprofit & Education Overview
● Salesforce Architecture; All About Data: Large Data Volume, queries, indexes, Salesforce optimizer, data import
● Salesforce.org Automation framework (TDTM) and Batch Apex
● Donations
● Recurring Donations (Legacy & Enhanced)
● Profiling, Performance and Debug
● Use Cases Study: Real Life examples
● Demo and Product access
● Bonus: Technology as a tool to deliver faster time to value
Schedule
Day 1:
● Quick review of our Multitenant Architecture
● Salesforce for Nonprofits & Education
○ Products and Data Model
● Thinking about data as an Architect
○ Large Data Volume, Data Storage, Data Skew, Skinny Tables, Indexes, and Salesforce Query Optimizer
Day 2:
● Thinking about data as an Architect (continuation…)
○ Query Plan and Data Import
● Salesforce.org Automation Framework
○ Order of Execution review
○ Table Driven Trigger Management (TDTM)
○ Batch Apex
■ Customizable Rollups, Batch Chunk Size,
Incremental Mode, and Batch Size
Configuration
Schedule
Day 3:
● Performance & Debug
○ Understanding Performance in Salesforce
○ Developer Console and Debug Logs
● Recurring Donations
○ Legacy vs Enhanced Recurring Donations
○ Migration or Upgrade?
● Use Cases Study - Real Life Examples
○ Common Issues - What, How and Why?
Day 4:
● Use Cases Study - Real Life Examples
○ Continuation…
● Demo Environments and Products access
● Technology as a tool to deliver faster time to value
○ CumulusCI
○ Building Templates as Lego blocks
How many years of Salesforce hands-on experience do you have?
● 1 year
● 2-4 years
● 5-6 years
● 7+ years
Multitenant Architecture
Quick Review
● What is Salesforce?
Rate your Salesforce Multitenant knowledge:
● Not sure what it is
● I have an idea what it is
● I completely understand it
Multitenant - Metadata Driven Architecture
Reliable, Customizable, Upgradeable, Secure, and Fast
● Runtime engine generates app components from metadata for tenant-specific customizations, configurations, and business logic
● Data tables and indexes are abstract constructs that exist merely as metadata in the platform's Universal Data Dictionary (UDD)
Resource: Platform Multitenant Architecture


Multitenant - Data Architecture
Metadata + Data + Pivot Tables work in unison
● Virtual segregation of data by customer using unique identifiers (GUID, OrgID)
● All data access operations include OrgID
Resource: Platform Multitenant Architecture


Multitenant - Data Structure
Simplified View of Key Metadata and Data Tables
● Metadata, data, and pivot table structures store data corresponding to virtual data structures (i.e. Accounts or Contacts)
● A set of specialized pivot tables maintains denormalized data that makes the combined data set extremely functional
○ For example, the UniqueFields pivot table facilitates uniqueness for custom fields
Resource: Platform Multitenant Architecture


Salesforce for Nonprofits & Education
Products and Data Model
Review:
● What products are part of:
○ Nonprofit
○ Education
● Data Model
Have you implemented Salesforce for Nonprofits or Education?
● Not yet
● Yes, only Nonprofits
● Yes, only Education
● Yes, both
Salesforce Customer 360 Platform
Salesforce for Nonprofit & Salesforce for Education
(Platform diagram: the Nonprofit & Education Managed Packages sit on the Customer 360 Platform alongside Your Custom Apps (Platform as a Service) and AppExchange Apps & Services. Labels in the diagram include: Data Lake on AWS; Language Runtimes (PHP, Java, Node…); Automatic Scaling; Unified Customer ID; Integrated with Salesforce data; Salesforce DX; Lightning Design System; UI Components (Salesforce Mobile, Messaging, Devices/Bots); Builders; Search; Security; Eventing; AI & ML; Identity; Workflow; Compute; Open APIs; Metadata; Data Models; Multi-tenancy; Storage; Connectivity; Relational DB; Big Data Services; External Data; External Services; Your Unlocked Data Services; Tools; Platform Environments; Release Management; Integration Platform (Integrate External Systems, Pre-Built Connectors, Full Lifecycle API Management, Data Governance & Monitoring); Your Ecosystem (Services, Data Stores, etc.).)
Salesforce for Nonprofit
Building Relationships, Building Change
● Build the relationships that achieve great missions
● Gain safer data, deeper insights, greater impact
● Support your mission with flexibility, integration and innovation
Products: Einstein for Nonprofits, Case Management, Accounting Subledger, Program Management, Fundraising, Outbound Funds, Insights Data Integrity, Grant Management, Marketing & Engagement, Volunteers for Salesforce, Nonprofit Operations (NPSP)
To learn more: Nonprofit Cloud Academy: Consultant Fundamentals
Salesforce.org Nonprofit Products: https://install.salesforce.org/products
NPSP Entity Relationship Diagram (ERD)
Foundation for constituent data and other Nonprofit products
Salesforce for Nonprofit ERDs
Nonprofit Products - Entity Relationship Diagrams
● The NPSP data dictionary provides field information
● Each product has its own entity relationship diagram
Nonprofit Success Pack - Automated Processes

NPSP Workflow Diagrams


Salesforce for Education
Purpose-built products leveraging EDA
● Get started faster with out-of-the-box configuration and processes
● Configure easily with a flexible architecture that grows with your institution
● Leverage best practices developed with and for the Education Community
Products (AppExchange Partner Apps and Salesforce.org purpose-built products): Recruitment, Accounting Subledger, Student Success Hub, Events, Admissions Connect, Payments, Education Data Architecture (EDA)
To learn more: Education Cloud Academy: Consultant Fundamentals
Salesforce.org Education Products: https://install.salesforce.org/products
EDA Entity Relationship Diagram (ERD)
Data Architecture tailored for Education Institutions
Data Considerations
Thinking about data as an Architect
● Large Data Volume
● Data Storage
● Data Skew
● Skinny Tables
● Indexes
● Salesforce Query Optimizer
● Query Plan
● Data Import
* Partner Learning Camp Course: Data Architecture and Management Designer: Large Data Volumes
Rate your knowledge about Large Data Volume:
● What is Large Data Volume?
● I understand it
● I'm an expert
Large Data Volume (LDV)
What does it mean?
A “large data volume” is an imprecise, elastic term. If your deployment has tens of thousands of users, tens of millions of records, or hundreds of gigabytes of total record storage, you have a large data volume. Even if you work with smaller deployments, you can still learn something from these best practices.
Examples where LDV may be an issue:
● Related lists, Visualforce pages, LWC, reports, and dashboards time out instead of displaying data, or perform poorly
● Reaching governor limits in SOQL or NPSP Batch Apex
● Row locks when updating records
● NPSP jobs taking too long to run or not completing
● Third-party payment processing jobs not completing
Source: Best Practices for Deployments with Large Data Volumes
Article - Large Data Volume in Nonprofit & Education
● Part I - Why is it important? LDV can impact the user experience, causing performance problems
● Part II - Preparing to work with LDV
● Part III - Loading data and best practices
How familiar are you with database concepts (tables, indexes, optimization, etc.)?
● What is a database?
● I understand how they work
● I'm an expert
Salesforce - Data Storage
Each record by default is 2 KB in size - helps in estimating data storage and LDV

Salesforce Edition | Data storage minimum per org | Data storage per user license | File storage per org | File storage per user license
Contact Manager, Group, Professional | 10 GB | 20 MB | 10 GB | 612 MB
Enterprise | 10 GB | 20 MB | 10 GB | 2 GB
Performance, Unlimited | 10 GB | 120 MB (20 MB for Lightning Platform Starter user licenses) | 10 GB | 2 GB
Developer | 5 MB | N/A | 20 MB | N/A
Personal | 20 MB (approximately 10,000 records) | N/A | 20 MB | N/A
Essentials | 10 GB | N/A | 1 GB | N/A

Source: Data and File Storage Allocations


Data Storage - Example
Overview of a newly created Org - each record by default is 2 KB in size
● Pay attention to objects like “Record Type” where there is only one record.
● Notice the used storage equal to 2 KB.
Salesforce record size overview


Data Storage - Salesforce for Nonprofit Example
Consider the specific Nonprofit Data Model
1 Donor:
● Account - 1 record (1 household)
● Contact - 1 record
● Address - 2 records (home and work)
● Relationship - 0 records (no family members included)
● Affiliation - 1 record (employer)
Total of 5 records x 2 KB = 10 KB
10,000 Donors (50,000 records x 2 KB) = 100,000 KB, or roughly 100 MB
1 Donation:
● Donation (Opportunity) - 1 record
● Hard Credit (Opportunity Contact Role) - 1 record
● Payment - 1 record
Total of 3 records x 2 KB = 6 KB
50,000 Donations (150,000 records x 2 KB) = 300,000 KB, or roughly 300 MB
(Diagram: Account, Contact, Address, Affiliation, Relationship, Opportunity, Opportunity Contact Role, Payment)
Data Storage - Salesforce for Education Example
Consider the specific Education Data Model
EDA - 1 Student may generate many records:
● Contact - 1
● Account - 1
● Address - 3 (home, dorm, off-campus housing)
● Affiliation - 5 (High School, University, Academic Program, Department)
● Relationship - 5 (family members)
● Program Enrollments - 1
● Course Enrollments - 12
SSH/SAL - 1 Student may generate many records:
● Advisee Case - 1
● Success Team - 4
● Alerts - 5
● Events - 5
● Tasks - 10
● Notes - 10
1 Full Time Student: 126 KB
● EDA - 28 records = 56 KB
● SSH (formerly SAL) - 35 records = 70 KB
(Diagram: Account, Term, Address, Contact, Affiliation, Program Plan, Program Enrollment, Course, Relationship, Course Connection, Plan Requirement, Course Offering)


Data Storage - Areas of Concern
Large Number of Records = Large Data Volume
Large number of records in objects:
● Reporting across multiple objects may be an issue
● Complex Role Hierarchy may cause issues with sharing records
● Full or Partial Sandbox refresh time
● Data Skew (more next)
● Automation performance impact or errors
Communities:
● Depending on the sharing model and the number of users, there may be a performance impact on Communities
Sharing:
● Sharing recalculation may cause row locking and ultimately impact loading or Salesforce usage
Data Loading:
● Indexing can be slow to “catch up”
● Bulk API limits
○ Max records updated per 24 hours
○ Max file size
○ etc.
Data Skew - What is it?
A large number of child records associated with the same parent record
Example: one Household (Account) with 10k Donations (Opportunity) and 10k Members (Contact)
Data Skew:
● Example: an Account with multiple Contacts or Opportunities (Donations)
● Updates on the Contacts or Opportunities may lock the Account record
● Implications: performance; application and process errors (i.e. time out, Too many DML rows for Apex)
Ownership Skew:
● Same object type owned by a single user (example: an Account ownership bucket with multiple contacts)
● Implications: performance; sharing (Role Hierarchy, Sharing Rules)
Lookup Skew:
● A large number of records are associated with a single record in the lookup object
● Implications: performance; application and process errors (i.e. time out, Too many DML rows for Apex execution)
Resource: Types of Salesforce Data Skew
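One way to spot existing skew before it causes errors is to count children per parent. A minimal SOQL sketch (the 10,000 cutoff mirrors the guidance that follows; swap in whichever child object and lookup field matter to you, for example Opportunity per Account or GAU Allocation per General Accounting Unit):

SELECT AccountId, COUNT(Id) childCount
FROM Contact
WHERE AccountId != null
GROUP BY AccountId
HAVING COUNT(Id) > 10000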
Data Skew - Salesforce for Nonprofit Error
What happens if a donation is created for a Household with 10k+ members?
Nonprofit Success Pack (NPSP, formerly known as Nonprofit Starter Pack) has automation (TDTM) to roll up or aggregate donations at the Contact and Account levels. Data Skew will cause an error when creating a new Donation, as illustrated on the right:
● The TDTM automation framework will try to roll up donations at the contact and then the account level
● With 10k+ members in a Household, the DML (query) will exceed the 10k records processed limit
Data Skew - Salesforce for Nonprofit Error
What happens if a new donation is associated with a GAU that has 10k+ donations?
As mentioned previously, Nonprofit Success Pack has automation (TDTM) to perform several tasks. Data Skew may cause an error when allocating the Donation to a GAU:
ERROR: npsp.TDTM_Allocation on Allocation trigger event BeforeInsert… System.DmlException: Insert failed. First exception on row 0; first error: UNABLE_TO_LOCK_ROW, unable to obtain exclusive access to this record or 1 records: …
To prevent it, additional GAU records should be created and used to keep the donation allocations to 10,000 or fewer for each GAU.
Data Skew - Salesforce for Education Error
What happens when upserting Student information?
Salesforce for Education has automation (TDTM) similar to the NPSP to roll up or aggregate information. As an example, Data Skew may cause an error when upserting Student data:
● The TDTM automation framework will try to aggregate and/or roll up student information
● An error may happen when the DML query tries to return more than 200k rows
ERROR: hed.TDTM_Contact: execution of AfterUpdate caused by: System.QueryException: Non-selective query against large object type (more than 200000 rows). Consider an indexed filter or contact salesforce.com about custom indexing. Even if a field is indexed a filter might still not be selective when: 1. The filter value includes null (for instance binding with a list that contains null) 2. Data skew exists whereby the number of matching rows is very large (for instance, filtering for a particular foreign key value that occurs many times)(hed)
Skinny Tables - Improving Performance
For complex joins in reports, list views and SOQL - not common with NPSP or EDA
● In the multitenant data model, the data in objects may be stored in separate tables at the database level. This separation may create a join when querying data from different tables.
● A skinny table is a custom table created by the Salesforce support team that contains a subset of fields from separate tables, to avoid joins and consequently speed up queries in reports, list views, and/or SOQL queries:
○ Reduced I/O and joins
○ Better cache utilization
○ Supported on the Account, Contact, Opportunity, Lead and Case objects
○ Additional overhead to keep in sync
○ Invisible to end users
○ Read-only
(The illustration shows an Account view retrieving data from a skinny table that combines standard and custom fields otherwise stored in separate database tables.)
Additional information and limitations: Skinny Tables


Salesforce Indexes
Indexed Fields, Custom Indexes, Index and Data tables
● By default, the following field types are indexed:
○ RecordTypeId
○ Division
○ CreatedDate
○ Systemmodstamp (LastModifiedDate)
○ Name
○ Email (for contacts and leads)
○ Foreign key relationships (lookups and master-detail)
○ Unique Salesforce record ID
○ External ID
● Custom indexes can be created by logging a case with Salesforce support, who will analyze the queries and evaluate whether the index is necessary
The Salesforce platform works with its own index table that contains a copy of the data, along with information about the data types. The platform then builds a standard database index on this index table. The index table places upper limits on the number of records that an indexed search can effectively return (more about this later).
Additional information about indexes: Indexes


Object Indexed Fields - Workbench
How to find which fields are indexed in a Salesforce Object
● The Tooling API exposes metadata that you can access through REST or SOAP, and its SOQL capabilities allow you to retrieve smaller pieces of metadata
● Workbench is a powerful, web-based suite of tools designed for administrators and developers to interact with Salesforce organizations via the Force.com APIs. It is free to use, but is not an official Salesforce product
Using the Tooling API it is possible to find which fields in a specific Salesforce Object are indexed. In Workbench, navigate to Query > SOQL Query. As an example, to find the Account indexed fields, run the SOQL query shown below:

SELECT QualifiedApiName FROM FieldDefinition
WHERE EntityDefinition.QualifiedApiName = 'Account' AND IsIndexed = true
Object Indexed Fields - Visual Studio Code
How to find which fields are indexed in a Salesforce Object
● Visual Studio Code, like Workbench, allows administrators and developers to interact with Salesforce organization APIs. It is free to use, but is not an official Salesforce product
With the SOQL Builder extension for Visual Studio Code it is possible to leverage the Tooling API to find which fields are indexed. To find the Household (Account) indexed fields, run the SOQL query shown below:

SELECT QualifiedApiName FROM FieldDefinition
WHERE EntityDefinition.QualifiedApiName = 'Account' AND IsIndexed = true
Query Optimization - What is it?
The process of choosing the most efficient means of executing a query
For example, a user may be looking for Household members with a Last Name starting with ‘A’. If the query optimizer statistics indicate that 80% of Household members’ Last Names start with ‘A’, then the query optimizer may decide that a full table scan is more efficient. However, as illustrated in the picture, if statistics indicate that very few household members have a Last Name starting with ‘A’, then reading an index may be more efficient than a full table scan.
Salesforce Query Optimizer (SQO)
Produce effective execution plans for Salesforce queries
The SQO maintains a table containing statistics about the distribution of data in each index. It uses this table to perform pre-queries to determine whether using the index can speed up the query. For example, household members are 68% Female and 32% Male.
NOTE: Sometimes indexing of newly loaded records can take time to “catch up”.
Execution Process:
● Run the pre-query engine (check user visibility and check filter selectivity against the shared multitenant indexes)
● Choose the most selective filter based on the WHERE clause
● Determine the best leading table/index to drive the query
● Write the query based on the results of the pre-queries
● Execute the query
(Diagram: flowchart Go → Run pre-queries → Write query based on the results of the pre-queries → Execute query → Stop, shown against sample shared multitenant data and index tables.)
Force.com Query Optimizer Secrets You Can Use Today


Salesforce Query Optimizer (SQO)
Options considered by the Optimizer
● If the number of records visible to the user is less than the number of records returned from the best index, then the query will be driven by the sharing tables.
● A skinny table helps when custom index optimizations don’t work; it’s still better than doing a full table scan. Any change in your query may void its use.
Standard Indexes
● Standard indexes are pre-created and available to every customer out of the box.
● An index will be considered only if the filter fetches less than 30% of the records for the first million records, and less than 15% of the records after the first million, up to 1M records (final threshold).
● Considering 750K accounts in Salesforce, you would look at the “# of records” in each row to know when the index is used: you could query up to 30% of your data (225K records) and use an index.
● If you have 4M accounts, you could query up to 750K records and use an index. The pre-query will stop querying data beyond 750K rows and report the index unusable in this example.
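Expressed as a formula, consistent with the worked numbers above:
standard index threshold = (30% of the first 1M records) + (15% of the records beyond 1M), capped at 1M
Example with 4M accounts: (0.30 x 1,000,000) + (0.15 x 3,000,000) = 300,000 + 450,000 = 750,000 records.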
Custom Indexes
● With custom indexes, the threshold is one third of what is allowed for standard indexes: an index will be considered only if the filter fetches less than 10% of the records for the first million records, and less than 5% of the records after the first million, up to 333,333 records (final threshold).
● Why? A custom index is stored separately, so the database needs to join another (helper) table, and this join has a cost.
● Taking the previous standard index example: with 750K accounts, you could query up to 10% of your data (75K records) and use a custom index.
● If you have 4M accounts, you could query up to 250K records and use a custom index. Beyond that it’s a drag on performance.
● If you have 3M accounts and the filter is on a custom indexed field, the pre-query will state that if you query more than 200K records, the index is unusable.
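The same formula with the custom index percentages:
custom index threshold = (10% of the first 1M records) + (5% of the records beyond 1M), capped at 333,333
Examples: 4M accounts gives 100,000 + 150,000 = 250,000 records; 3M accounts gives 100,000 + 100,000 = 200,000 records.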
Selective Filter - Examples
Standard Index & Formula Field
Threshold is 30% of the total, up to 1 million

Standard Index | Field Value | # Rows | Selective?
Status | Closed | 96,500 | No
Status | New | 3,500 | Yes

SELECT Id FROM MyCase__c WHERE Status__c = 'Closed'
● full table scan is more efficient
SELECT Id FROM MyCase__c WHERE Status__c = 'New'
● index use is more efficient

The CaseType__c formula field references a related object:
CaseType__c = CASE(MyUser__r.UserType__c, 1, "Gold", "Silver")
SELECT Id FROM MyCase__c WHERE CaseType__c = 1
● an index cannot be used or created for the formula field
Selective Filter - Examples
Custom Index & NULL Value
Threshold is 10% of the total, up to 1 million

Custom Index | Value | # Rows | Selective?
Priority | 1 | 6,000 | Yes
Priority | 2 | 3,500 | Yes
Priority | 3 | 90,500 | No

SELECT Id FROM MyCase__c WHERE Priority__c != 3
● full table scan - cannot use an index because of NOT EQUAL
SELECT Id FROM MyCase__c WHERE Priority__c IN (1,2)
● index use is more efficient

By default, index tables do not include records that are null (records with empty values).
SELECT Id FROM MyCase__c WHERE Priority__c = null
● full table scan - cannot use an index because of the NULL comparison
Salesforce Support can help analyze and create custom indexes that include null if necessary.
Selective Filter - Examples
AND Condition
● Composite Index Join (index intersection)
● Threshold: 20% of 100k = 20k
● The query returns 15,250 records, less than the threshold, so the indexes drive the query
OR Condition
● Composite Index Union (index addition)
● Threshold: 10% of 100k = 10k
● The query returns 15,250 records, more than the threshold, so a full table scan is used
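As a sketch of what such queries could look like, reusing the hypothetical MyCase__c fields from the previous slides (the 15,250-row result is the value shown in the screenshots these slides reference):

SELECT Id FROM MyCase__c WHERE Status__c = 'New' AND Priority__c IN (1, 2)
● AND intersects the two index results; 15,250 matching rows is under the 20k composite threshold, so the indexes drive the query

SELECT Id FROM MyCase__c WHERE Status__c = 'New' OR Priority__c = 2
● OR unions the index results; every branch must be indexed and the combined 15,250 rows exceeds the 10k threshold, so the query falls back to a full table scan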
Considerations
Keep in mind...
● Selectivity thresholds determine whether an index is considered
● Not Equals filters will not leverage indexes
● Be careful filtering on null
● AND conditions involve an INTERSECTION of indexes
● OR conditions involve an ADDITION of indexes


Salesforce.org
Architect Academy
for Partners (Nonprofit / Education)
Day 2

Presenters:
● Claudio Moraes, Partner Success Director, Technical Lead
● Fabrice Pierre, Partner Success Manager, EMEA
Links from Day 1
https://salesforce.quip.com/NaT7AWSaUEtv
Learning Journey
Course topics (from the journey diagram):
● Nonprofit & Education Products; Salesforce for Nonprofit & Education Overview
● Salesforce Architecture; All About Data: Large Data Volume, queries, indexes, Salesforce optimizer, data import
● Salesforce.org Automation framework (TDTM) and Batch Apex
● Donations
● Recurring Donations (Legacy & Enhanced)
● Profiling, Performance and Debug
● Use Cases Study: Real Life examples
● Demo and Product access
● Bonus: Technology as a tool to deliver faster time to value
Schedule
Day 1:
● Quick review of our Multitenant Architecture
● Salesforce for Nonprofits & Education
○ Products and Data Model
● Thinking about data as an Architect
○ Large Data Volume, Data Storage, Data Skew, Skinny Tables, Indexes, and Salesforce Query Optimizer
Day 2:
● Thinking about data as an Architect (continuation…)
○ Query Plan and Data Import
● Salesforce.org Automation Framework
○ Order of Execution review
○ Table Driven Trigger Management (TDTM)
○ Batch Apex
■ Customizable Rollups, Batch Chunk Size,
Incremental Mode, and Batch Size
Configuration
Understanding Queries - Cardinality & Cost
What does it mean?

● Cardinality is the number of elements in a set


○ For database indexes, it refers to the uniqueness of values stored in a specific column within
an index
○ For database relationships, it refers to the data between two tables which defines how the
data is related to each other (i.e. one-to-one, one-to-many, many-to-many)
● Cost of a Query in simple terms is the amount of work needed to retrieve the requested
data set
○ For the database, it is the amount of work the query plan will do
○ Fetching or retrieving more records for example during a full table scan to satisfy a query
request will require more resources and will take longer than if using an index as the driver to
retrieve the requested records
Query Plan - Analyzing Queries
How to enable it and what is it?

● Enable it in Developer Console > Help > Preferences


● It provides insights into the different plans for a given query

Developer Console Query Plan Tool FAQ


Which one will be used?
● Cost = 0.33333…
● Cost = 0.88333…
● I'm not sure
Query Plan - Indexes Utilization
Understanding the Query Plan Information

The Query Plan Tool displays the number of records returned, the cost of the query and if
using an index table or performing a full table scan.

The Query Plan tool will show a list of available plans the Query Optimizer can utilize
and will be arranged by cost ascending. Each Plan will contain information on
Cardinality, Operation Type, Cost, sObject Type, and more. Each plan has a “Leading
Operation Type”, for example, Field Index or Full Table Scan. The plan with the lowest
cost is the plan that is used for driving the query execution.

Developer Console Query Plan Tool FAQ
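The same plan information is available outside the Developer Console through the REST API query resource's explain parameter (a sketch; substitute your own API version and query):

GET /services/data/v55.0/query/?explain=SELECT+Id+FROM+Account+WHERE+Name+=+'Acme'

The response lists each candidate plan with its cardinality, sobjectCardinality, leadingOperationType, and relativeCost; a relative cost above 1 generally means the filter is not selective under the thresholds discussed earlier.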


Query Plan - Index Utilization
Adding Order By or Limit to the query will alter the Query Plan

● WITHOUT Order By or Limit

● WITH Order By and Limit 99999


Query & Search Optimization Cheat Sheet
Database indexes, index selectivity conditions and thresholds, and more
● When building queries, list views, and reports, it's best to create filter conditions that are selective, so the Salesforce platform scans only the rows necessary in the objects your query targets.
● Use the Optimization Cheat Sheet as a guide on how to write selective filter conditions, minimize query response times, and optimize overall database performance.
Query & Search Optimization Cheat Sheet


Considerations
Keep in mind...
● Database statistics - Modern databases gather statistics on the amount and types of data stored inside of them, and they use this information to execute queries efficiently. As a result, when large amounts of data are created, updated, or deleted via the API, the database must gather statistics before the application can efficiently access the data, meaning you may not see performance improvements until that process is complete.
● Deleted data - The Salesforce data deletion mechanism can have a profound effect on performance, especially with LDV. Salesforce uses a Recycle Bin metaphor for data that users delete. Instead of removing the data, Salesforce flags the data as deleted and makes it visible only through the Recycle Bin. This process is called soft deletion. While the data is soft deleted, it still affects database performance because the data is still resident, and deleted records have to be excluded from any queries (deleted data is purged after 15 days).
Data Import
● Process Overview
● Import Sequence
● Important Considerations
In Salesforce, can I load Contacts or Opportunities before Accounts?
● Yes
● No
Data Import - Process Overview
“Failing to plan is planning to fail” - quote attributed to Benjamin Franklin
Prep:
● Data Import Tool - scale, performance, pre/post processes; NPSP and other tools
● Data Scope - LDV? All records, last X years, or a mix
● Data Mapping - custom fields?
● Data Enrichment
● Data Cleaning
Import:
● NPSP Data Import - templates ready and populated; fix mapping if necessary
● NPSP & other tools - templates, or load directly into the NPSP Data Import object
Review:
● Dry run in a Sandbox and fix possible errors
● Run reports to review and validate the data import
● Correct any mistakes and check performance
● Repeat until you are satisfied
Execute:
● Run in production
Before you Start - Considerations
It is common to follow an order when loading data

What will impact the import process? Including NPSP and EDA specific functionality

● Import sequence
● Data volume
● Automation
● Limits & Storage
● Validation rules
● Sharing rules and calculation (computed vs deferred)
● NPSP/EDA specific configuration
○ Automation: Household or Opportunity naming rules, TDTM, Rollups, Soft Credit automation,
etc
Before you Start - Target Data Model
Understand the target data model

NPSP Entity Relationship Diagram EDA Entity Relationship Diagram

NPSP ERD & EDA ERD


What tool do you use to import data into Salesforce for Nonprofits or Education?
● I never imported data into Salesforce
● I use NPSP Data Import
● I use Data Mover
● I use Data Loader
● Other tools
NPSP Data Import - Ordered Sequence
What objects should be populated first?
1. User
2. Account (Household / Organization)
3. Address
4. Contact
5. Lead
6. Affiliation
7. Relationship
8. Campaign
9. Campaign Member
10. General Accounting Unit
11. Recurring Donation
12. Opportunity
13. Opportunity Contact Role
14. Partial Soft Credit
15. Payment
16. GAU Allocation
17. Deliverable
18. Engagement Plan Template
19. Engagement Plan Task
20. Level
21. Activity
Orange items trigger code that automates tasks or validates data
EDA Data Import - Ordered Sequence
School Information and Curriculum
1. User
2. School Accounts
- Education Institutions (including other colleges and high schools)
- Departments
- Academic Programs
- Sports Organizations and Clubs
- Businesses
3. Faculty Contacts
4. Facilities
5. Terms (past, current and future)
6. Courses
7. Course Offerings (past, current and future)
8. Course Offerings Schedules
9. Program Plans
10. Plan Requirements
Orange items trigger code that automates tasks or validates data
Recommended Data Import Sequence for EDA
EDA Data Import - Ordered Sequence
Student Information
11. Student Accounts
- Administrative
- Household
12. Contacts
- Students
- Family
13. Address (campus, home)
14. Affiliations
- Primary Education Institution
- Primary Department
- Primary Academic Program
15. Relationships
- Family Members
16. Program Enrollments
17. Course Connection
- Link to Course Offering
- Program Enrollment relationship
- Student Contact relationship
- Program Enrollment Affiliation relationship
Orange items trigger code that automates tasks or validates data
Recommended Data Import Sequence for EDA
Data Import Process - Testing
Testing allows you to achieve the best results
● Bulk API in Parallel Mode
○ Sorted by Parent or Foreign Key ID
○ Chunk the batches (more about it later)
● Allocated time (weekend, phases, extended days)
● Ensure no users are using the system
● Keep in mind the consideration items
● What tool should be used? (NPSP Data Import, Data Loader, ETL tool, etc.)
With NPSP Data Import:
● Ensure you understand how the process works
● It is possible to speed up the first step of the import process by loading data directly into the NPSP Data Import object (great for LDV)
(Flow: CSV → upload the formatted template, or load directly into the NPSP Data Import object)
Salesforce for Nonprofit Academy (NPSP Data Import): Integration & Data Management
NPSP - How the Import Process Works & Advanced Data Import Options
Have you used NPSP Data Import Advanced Mapping before?
● Yes
● No
NPSP Data Import - Advanced Mapping
Importing Custom Fields
NPSP Advanced Mapping allows you to import custom fields. One common example is relationships, as mentioned in the NPSP Advanced Mapping Recipes. The next slides review the steps to follow when importing relationships after enabling Advanced Mapping.
Advanced Mapping Recipes:
● Overview page with links
● Affiliation Recipes
● Engagement Plan Recipes
● Relationship Recipes
● Soft Credit Recipes
(Diagram: Account, Contact, Affiliation, Relationship)
NPSP Data Import - Advanced Mapping
Create Contact1 Relationship Custom Fields

1. Navigate to Setup > Object Manager tab


2. Type “NPSP Data Import” in the “Quick
Find” search box
3. Click on the “NPSP Data Import” object
name
4. Click on “Fields & Relationships”
5. Create the new custom field
NPSP Data Import - Advanced Mapping
Create Contact1 Relationship Custom Fields

6. At the end, three new custom fields should be created in the NPSP Data Import object
NPSP Data Import - Advanced Mapping
Create Contact1 Relationship Custom Fields

7. Navigate to NPSP Settings


8. Enable Advanced Mapping (if not already
enabled)
NPSP Data Import - Advanced Mapping
Create Contact1 Relationship Custom Fields

9. Navigate to NPSP Settings > System Tools > Advanced Mapping for Data Import & Gift Entry
10. Click on Configure Mapping
11. Create a New Object Group

Object Groups are collections of field mappings that tell the NPSP Data Import tool what types of records (e.g. Contacts, Opportunities, Payments, etc.) to create or update. NPSP is able to link related records within a single row of your import to one another because field mappings are bundled into Object Groups in this way.
NPSP Data Import - Advanced Mapping
Create Contact1 Relationship Custom Fields

12. In the new group, click on the down arrow at the end of the row and then click on “View Field Mappings”.
13. Click “Create New Field Mapping”.

NPSP Data Import - Advanced Mapping
Create Contact1 Relationship Custom Fields

14. Select “Contact2 Imported” and map it, or select “Related Contact”.
15. Repeat the previous step to create a new field mapping for “Relationship 1 Type”.
NPSP Data Import - Advanced Mapping
Create Contact1 Relationship Custom Fields

16. After completing these steps, this is what the field mapping will look like.
NPSP Data Import - Advanced Mapping
Create Contact1 Relationship Custom Fields

17. Add the custom fields to the NPSP Data Import template and test the import process.
Salesforce.org Automation Framework
● Order of Execution
○ Validations
○ Triggers
○ Workflows
○ Process Builder
○ Flow
○ Assignment & Escalation Rules
● Table Driven Trigger Management (TDTM)
○ Trigger Framework
● Batch Apex - Customizable Rollups
○ Donations
○ Hard and Soft Credits
○ Seasonal Address Updates
○ General Accounting Unit
○ and more...
* Partner Learning Camp Course: Table Driven Trigger Management (TDTM) with NPSP
Rate your Apex knowledge:
● What is Apex?
● I can play with it and break things
● I'm an expert
Salesforce Order of Execution
Saving a record - Insert, update, or upsert statement

Architecture Basics > Data Manipulation


Salesforce Order of Execution - simplified view
Saving a record - insert, update, or upsert statement
1. Load the original record (or initialize a new one)
2. Overwrite with the new record field values (request from UI)
3. System validations
4. Flow(s): Before Save Updates
5. Before trigger(s)
6. System (again) & custom validations
7. Duplication rules (stop if duplicate)
8. Save to database (but not committed)
9. After trigger(s)
10. Assignment rules
11. Auto-response rules
12. Workflow rules (if a workflow field update touches the same record, before + after triggers run again - recursion)
13. Escalation rules
14. Automation: Processes, Flows, Workflow rules (updates to the same record recurse as well)
15. Entitlement rules
16. Record-Triggered Flow(s)
17. Standard roll-up summary
18. Criteria sharing evaluation
19. Commit DML operations to database
20. Post-commit logic (email, async Apex/Flows)
RESOURCE: Triggers and Order of Execution
Table Driven Trigger Management (TDTM)
Salesforce.org Automation Framework - What is it?
● TDTM is a trigger management framework built to allow control over when, and in which order, the Apex code within triggers is executed.
● Implemented using Apex classes, with one trigger per object and multiple handlers.
● TDTM handler class counts across products: Accounting Subledger 10, NPSP 55, EDA 70+.
This course assumes you are familiar with TDTM and how it works. The intent is to provide additional information related to TDTM and automation. To learn more about TDTM, check the Resources slide.
Table-Driven Trigger Management (TDTM) Overview
Table Driven Trigger Management (TDTM)
Salesforce.org Automation Framework - Why TDTM?
Bad Practices:
● Multiple triggers per object
● Not bulkified
● Logic written directly within triggers
● Messy code and logic
● No expected and controlled order
Best Practices:
● Scale - manage a large number of trigger handlers
● Flexible & Extendable - allows control of what runs or not, and in which order, plus creation of new trigger handlers
● Performance - by organizing triggers, it is likely to reduce redundant Apex code execution and improve performance
● Maintenance - easier to maintain, specifically if custom trigger handlers are implemented
Table Driven Trigger Management (TDTM)
Deep Dive - A closer look at the NPSP Contact Object Trigger
● Only one main trigger - TDTM_Contact
● Several trigger handlers
○ The TDTM_Contact class controls the trigger handlers' execution
Apex Class Descriptions for NPSP & Apex Class Descriptions for EDA
Table Driven Trigger Management (TDTM)
How does it work? A single Apex trigger starts and TDTM takes over
The example below illustrates the after triggers when creating a new Contact:
1. The Contact trigger fires (After Insert) and TDTM sorts the trigger handlers by Order
2. ACCT_IndividualAccounts_TDTM - Action 1: Create Household Account
3. AFFL_Affiliations_TDTM - Action 2: Create Primary Affiliation
4. ADDR_Contact_TDTM - Action 3: Create Primary Address
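Custom logic is added as a handler class rather than a new trigger. A minimal sketch following the pattern NPSP documents for extending npsp.TDTM_Runnable (the class name and the Task-creation logic here are hypothetical):

global class CON_WelcomeTask_TDTM extends npsp.TDTM_Runnable {

    // TDTM calls run() instead of a trigger body; newList/oldList play the
    // role of Trigger.new/Trigger.old for the object the handler is registered on
    global override npsp.TDTM_Runnable.DmlWrapper run(
            List<SObject> newList, List<SObject> oldList,
            npsp.TDTM_Runnable.Action triggerAction,
            Schema.DescribeSObjectResult objResult) {

        npsp.TDTM_Runnable.DmlWrapper wrapper = new npsp.TDTM_Runnable.DmlWrapper();

        if (triggerAction == npsp.TDTM_Runnable.Action.AfterInsert) {
            for (SObject so : newList) {
                // Hypothetical action: create a welcome Task for each new Contact
                wrapper.objectsToInsert.add(new Task(
                    WhoId = so.Id,
                    Subject = 'Welcome new constituent'));
            }
        }
        // TDTM commits the returned DML together with the other handlers' work
        return wrapper;
    }
}

The handler only runs once it is registered with a Trigger Handler record (Object = Contact, Trigger Action = AfterInsert, Active checked, and a Load Order placing it relative to the packaged handlers).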
NPSP - Table Driven Trigger Management (TDTM)
Apex classes managed by Table-Driven Trigger Management (TDTM)
Object Class Name Description

Account ACCT_Accounts_TDTM For inserted or updated Accounts, this class checks the Account Model selected in NPSP Settings and sets these
system fields:
● npe01__SYSTEM_AccountType__c
● npe01__SYSTEMISINDIVIDUAL__c

Based on NPSP Household Naming Settings, the class also updates Household Names on any Household Accounts in
the transaction (if necessary).

Account ACCT_AccountMerge_TDTM This class supports data management during the Account Merge process. When Accounts are merged, the class:
● Updates the Account Name, and Formal and Informal Greeting fields
● Recalculates Household Soft Credits
● Recalculates Opportunity rollups
● Deletes duplicate Addresses and propagates the Default Address to the appropriate Account and Contacts

Account ACCT_CascadeDeleteLookups_TDTM When you delete an Account, this class deletes Recurring Donations, Allocations, and Relationships that are related to
the Account through various lookup fields.

and more… (NPSP includes 55 TDTM Apex classes)

Apex Class Descriptions for NPSP


EDA - Table Driven Trigger Management (TDTM)
Apex classes managed by Table-Driven Trigger Management (TDTM)
Object Class Name Description

Account ACCT_CannotDelete_TDTM This class prevents an Account from being deleted if it has any related Address, Affiliation, Attribute, Course,
Course Connection, Education History, Facility, Program Enrollment, Program Plan, Term, or Time Block
records.

Account ADDR_Account_TDTM When you update the values on Billing Address fields (Billing Street, Billing City, and so on), this class creates or
updates a related Address record for Accounts that are configured for Address management in EDA Settings.

If the Account is a Household Account, the class updates the Mailing Address fields for any Contacts associated
with the Household.

Account AFFL_AccRecordType_TDTM When you change the record type of an Account, and it's a Primary Affiliation for a Contact, this class updates
the Primary Affiliation to match the new record type. For example, if you change the Account's record type
from Academic Program to Sports Organization, this class updates the Contact's Primary Sports Organization
field.

This class does not create a new Affiliation record when the related Account type is changed. It updates the
existing Primary Affiliation record to reflect the new Account Type.

Account RenameLeadConvertedAccounts_TDTM When an Account is created during Lead conversion with a record type specified in Automatically Rename
Lead-Converted Accounts in EDA Settings, this class enforces the use of the Account naming format associated
with the record type in EDA.

and more… (EDA includes 70+ TDTM Apex classes)
Apex Class Descriptions for EDA
Resources
Learn more...
● TDTM Overview: NPSP | EDA

● Apex Class Descriptions: NPSP | EDA

● Manage Trigger Handlers: NPSP | EDA

● Disable Trigger Handlers: NPSP | EDA

● Add the User Managed Field to the Trigger Handler Page Layout

● GitHub: NPSP | EDA

● Partner Learning Camp: Table Driven Trigger Management (TDTM) with NPSP

● Trailhead: Automate Tasks with Table-Driven Trigger Management (TDTM)


Batch Apex - What is it?
For complex, long-running asynchronous processes handling large data volumes
An Apex class that implements the Database.Batchable interface. It has three methods:
● start - collects the records or objects for processing (QueryLocator)
● execute - processes each chunk or “batch” of records (200 records by default; the scope parameter can change it)
● finish - post-processing logic after all batches are executed (i.e. send an email)

public class MyBatchClass implements Database.Batchable<sObject> {

    public Database.QueryLocator start(Database.BatchableContext bc) {
        // collect the batches of records or objects to be passed to execute
        // (start may also be declared to return an Iterable<sObject> instead)
        return Database.getQueryLocator('SELECT Id FROM Account');
    }

    public void execute(Database.BatchableContext bc, List<sObject> records) {
        // process each batch of records
    }

    public void finish(Database.BatchableContext bc) {
        // execute any post-processing operations
    }
}
Additional Information: Using Batch Apex


Batch Apex - How it works?
High-level Overview

For example, there are 10,001 Account records to update which is more than can be
processed in a single transaction without some way of breaking it up.
1. Using Batch Apex, in the start() method you define the query to retrieve the collection of
records: SELECT Id FROM Account WHERE …
2. Then the execute() method runs, but only receives a relatively short list of records (default
200).
○ Within the execute(), everything runs in its own transactional context, which means almost all of
the governor limits apply to that block (i.e. each time execute() runs, you are allowed 150
queries and 50,000 DML rows and so on). When execute() is complete, a new instance is
created with the next group of 200 Account records, with a brand new set of governor limits.
3. Finally the finish() method wraps up any loose ends as necessary, like sending a status
email.
Batch Apex - Example
Apex Code Example

global class UpdateAccountFields implements Database.Batchable<sObject> {
    global final String Query;
    global final String Entity;
    global final String Field;
    global final String Value;

    global UpdateAccountFields(String q, String e, String f, String v) {
        Query = q; Entity = e; Field = f; Value = v;
    }

    global Database.QueryLocator start(Database.BatchableContext BC) {
        return Database.getQueryLocator(Query);
    }

    global void execute(Database.BatchableContext BC, List<sObject> scope) {
        for (sObject s : scope) {
            s.put(Field, Value);
        }
        update scope;
    }

    global void finish(Database.BatchableContext BC) {
    }
}

Invoking the batch:

// Query for 10 accounts
String q = 'SELECT Industry FROM Account LIMIT 10';
String e = 'Account';
String f = 'Industry';
String v = 'Consulting';
Id batchInstanceId = Database.executeBatch(new UpdateAccountFields(q, e, f, v), 5);

In this example, the Apex code above calls our own implementation of a Batch Apex class, which will execute:
1. start - collect all records to be processed
2. execute - receive the records from the QueryLocator and iterate through them, setting the field to be updated
3. finish - execute when the process finishes
Batch Apex in NPSP
87 Customizable Rollups
● Rolled up from Opportunity, Payment, and GAU Allocation records
● Examples: Best Gift Year, Total Soft Credits, Membership Join Date, Total Match Amount, Last Allocation Date
Customizable Rollup - Batch Job Apex Classes: NPSP Packaged Rollups - reference spreadsheet
Batch Apex & Governor Limits in NPSP
Batch Apex Jobs + Large Data Volume + Apex Governor Limits
Job Name
NPSP 00 - Error Processing
Watch out for Large Data Volume
NPSP 01 - Opportunity Account Rollups Orgs + NPSP Batch Apex Jobs +
NPSP 01A - Customizable Rollups - Account Hard Credit Custom configuration… (more later)
NPSP 01B - Customizable Rollups - Account Hard Credit Skew
NPSP 02 - Opportunity Contact Rollups
NPSP 02A - Customizable Rollups - Contact Hard Credit
NPSP 02B - Customizable Rollups - Contact Hard Credit Skew
Job Name
NPSP 03 - Opportunity Household Rollups
NPSP 05 - GAU Allocation Rollups
NPSP 03A - Customizable Rollups - Account-level Contact Soft Credit
NPSP 05 - Customizable Rollups - General Accounting Units
NPSP 03B - Customizable Rollups - Account Soft Credit
NPSP 06 - Recurring Donation Updates
NPSP 03C - Customizable Rollups - Account-level Contact Soft Credit Skew
NPSP 06A - Customizable Rollups - Recurring Donations
NPSP 03D - Customizable Rollups - Account Soft Credit Skew
NPSP 06B - Customizable Rollups - Recurring Donations Skew
NPSP 04 - Opportunity Soft Credit Rollups
NPSP 07 - Seasonal Address Updates
NPSP 04A - Customizable Rollups - Contact Soft Credit
NPSP 08 - Level Assignment Updates
NPSP 04B - Customizable Rollups - Contact Soft Credit Skew
NPSP 09 - Data Import Batch Processing

NPSP Default Jobs and Apex Classes


Customizable Rollup - Batch Apex Jobs
Calculated by a series of batch jobs - Non-Skew vs Skew vs Incremental Mode
Non-Skew:
● Default mode
● Used when there are few related records, usually less than 250
Skew:
● Used when a single record has more than 250 related records
● Breaks the rollups into groups of records (default is 300 records per group)
● Also used when “Customizable Rollups: Force Skew Mode” is selected on the Account or Contact
Incremental:
● Only evaluates Account and Contact records with Opportunities that were modified within a certain timeframe
○ By default, this timeframe is the value of the largest N-Day rollup. NPSP's largest N-Day rollup by default is 2 years, but it can be changed, or your own N-Day rollups can be created
● Only available for the Contact Hard Credit and Account Hard Credit Non-Skew batch jobs
● Only affects the nightly scheduled jobs. Running Bulk Data Processes still calculates rollups for all records
How Customizable Rollup Batch Jobs Work


Customizable Rollup - Batch Job Execution
Large Data Volume - Remember how Batch Apex works?
Execution:
● start - collect the records for processing
● execute - process each chunk or “batch” of records (default: 200 records)
● finish - post-processing logic after all batches are executed (i.e. send an email)
LDV & Chunk Size:
● If a job is handling LDV (i.e. 50k+ records) you may see an error: “REQUEST_RUNNING_TOO_LONG” or “Too many query rows: 50001”
How to resolve (explained in the next slides):
● Reduce the number of records by splitting the long-running job into multiple smaller jobs (Chunk Size)
● Switch to Incremental Mode (on by default in new Orgs)
● Change the Batch Size


Batch Chunk Size
How to enable it?

1. Go to Setup > Custom Settings > Customizable Rollup Settings > click on Manage
2. Click on Edit
3. Update the value for:
a. LDV: Account Hard Credit Chunk Size
b. LDV: Account Soft Credit Chunk Size
c. LDV: Contact Hard Credit Chunk Size
d. LDV: Contact Soft Credit Chunk Size
Incremental Mode
How to enable it? - On by default on new Orgs

1. Go to Setup > Custom Settings > Customizable Rollup Settings > click on Manage
2. Click on Edit
3. Ensure the following is checked:
a. Incremental Account Hard Credit Non-Skew
b. Incremental Contact Hard Credit Non-Skew
4. Also, review the Last N Days for:
a. Incremental: Last N Days Field Override
b. Incremental: Last N Days Value Override
Batch Size Configuration
By default each batch processes 200 records at a time
1. Go to NPSP Settings (NPSP Custom Settings)
2. Click on Edit
3. Update the desired batch size for the Rollup job
How Customizable Rollup Batch Jobs Work


Customizable Rollups - When do they run?
Real time vs Batch Job
Real Time + Batch Job:
● The Customizable Rollups Quip spreadsheet contains details on when each rollup is executed: Immediately & Nightly, or Nightly Only
Batch Job:
● By default, jobs run every night at 11 PM at staggered times. The following jobs are created when Customizable Rollups is enabled:
○ NPSP 01A - Customizable Rollups - Account Hard Credit
○ NPSP 01B - Customizable Rollups - Account Hard Credit Skew
○ NPSP 02A - Customizable Rollups - Contact Hard Credit
○ NPSP 02B - Customizable Rollups - Contact Hard Credit Skew
○ NPSP 03A - Customizable Rollups - Account-level Contact Soft Credit
○ NPSP 03B - Customizable Rollups - Account Soft Credit
○ NPSP 03C - Customizable Rollups - Account-level Contact Soft Credit Skew
○ NPSP 03D - Customizable Rollups - Account Soft Credit Skew
○ NPSP 04A - Customizable Rollups - Contact Soft Credit
○ NPSP 04B - Customizable Rollups - Contact Soft Credit Skew
○ NPSP 05 - Customizable Rollups - General Accounting Units
○ NPSP 06A - Customizable Rollups - Recurring Donations
○ NPSP 06B - Customizable Rollups - Recurring Donations Skew
How Customizable Rollup Batch Jobs Work


Considerations
Keep in mind...
● QueryLocator can retrieve up to 50 million records per batch job
● If a query retrieves more than 50 thousand records an error is displayed and the process terminated > System.LimitException: Too many query rows: 50001
● A batch job by default processes 200 records in each chunk (execute). It is possible to specify the number of records processed
● Up to 5 batch jobs can be queued or active concurrently
● Up to 100 “Holding” batch jobs can be held in the Apex flex queue
● The maximum number of batch Apex method executions per 24-hour period is 250,000, or the number of user licenses in your org multiplied by 200 - whichever is greater. Method executions include executions of the start, execute, and finish methods
Salesforce.org
Architect Academy
for Partners (Nonprofit / Education)
Day 3

Claudio Moraes Fabrice Pierre


Partner Success Director, Partner Success Manager,
Technical Lead EMEA
Links from Day 1 & 2
https://salesforce.quip.com/NaT7AWSaUEtv
Query Limits - 50 million vs 50 thousand
What is the difference?

Static Apex Limits
● Maximum number of records returned by Database.QueryLocator in Batch Apex - 50 million
● Maximum number of records retrieved by SOQL queries - 50 thousand

Per-Transaction Apex Limits
These limits count for each Apex transaction. For Batch Apex, these limits are reset for each execution of a batch of records in the execute method.

EXAMPLE
The Batch Apex job Database.QueryLocator may retrieve 49 million records in the start step. The execute step logic may handle records from related objects that exceed the Apex transaction limit of 50k.
Execution Governors and Limits
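A short anonymous Apex sketch of the distinction (queries illustrative):

// Batch Apex start(): a Database.QueryLocator can stream up to 50 million rows.
Database.QueryLocator bigLocator =
    Database.getQueryLocator('SELECT Id FROM Opportunity');

// Plain SOQL inside a single transaction: exceeding 50,000 total query rows
// throws System.LimitException: Too many query rows: 50001.
List<Opportunity> rows = [SELECT Id FROM Opportunity LIMIT 50000]; // at the cap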


Learning Journey

1. Nonprofit & Education Products - Salesforce for Nonprofit & Education Overview
2. All About Data - Large Data Volume, queries, indexes, Salesforce optimizer, data import
3. Salesforce Architecture, Automation - Salesforce.org Automation framework (TDTM) and Batch Apex
4. Donations
5. Recurring Donations (Legacy & Enhanced)
6. Performance and Debug - Profiling
7. Use Cases Study - Real Life examples
8. Product Access - Demo and Product access
Bonus: Technology as a tool to deliver faster time to value
Schedule
Day 2:
● Thinking about data as an Architect (continuation…)
○ Query Plan and Data Import
● Salesforce.org Automation Framework
○ Order of Execution review
○ Table Driven Trigger Management (TDTM)
○ Batch Apex
■ Customizable Rollups, Batch Chunk Size,
Incremental Mode, and Batch Size
Configuration
Day 3:
● Performance & Debug
○ Understanding Performance in Salesforce
○ Developer Console and Debug Logs
● Recurring Donations
○ Legacy vs Enhanced Recurring Donations
○ Migration or Upgrade?
● Use Cases Study - Real Life Examples
○ Common Issues - What, How and Why?
Performance & Debug

Performance
● Understand what impacts performance and how to be equipped and prepared to ensure your implementation is successful

Profiling
● Identify, reproduce, test and review

Tool Deep Dive

Improve Speed and Performance of Lightning Experience Pages
A Guide to Application Performance Profiling in Force.com

Have you used Debug logs
before?
● Yes

● No
Defining Performance
In Simple Terms

● End user point of view (focus here)


○ performance is the time it takes to display and
make available the entire content of a webpage in a
browser window
● Batch processing (more later)
○ performance is the time it takes to complete the
job
Salesforce Performance
Understanding Performance in Lightning

Comparing old Salesforce Classic with Salesforce Lightning


● Fundamentally different UI platforms: Classic is a server-side technology that renders an entire page at once and sends it to a browser, whereas Lightning is a rich client-side technology that progressively loads pages in a browser without requiring full round trips to the server

Experience Page Time (EPT)


● Performance metric Salesforce uses in Lightning to measure page load time

Lightning Usage App


● Monitor adoption metrics, such as daily active users, slowest desktop record pages and more

Lightning Experience Performance Optimization


The Performance Triangle
Understanding what impacts EPT
[Triangle diagram: Browser Performance · Network Performance · Page Complexity & Customization]

● Lightning UI is rendered client-side, so it is sensitive to browser performance - type, CPU. This can be measured through an Octane score.
● Complex pages with many custom fields and/or components are slower to render, and even slower with low browser performance.
● Lightning UI requires many XHRs (XMLHttpRequests) to render a page, so it is sensitive to network latency.
● Worst case scenario (slow network, slow browser, laptop running too many apps, complex page layout) can translate to 10+ second EPT.

What Is EPT?
Improving EPT
Complexity and Customization
[Triangle diagram: Browser Performance · Network Performance · Page Complexity & Customization]

● Reduce the complexity of the pages (e.g. number of components, query exec times, custom code complexity, etc)
● Use secondary tabs
● Reduce the number of fields on the layout (recommended < 50)
● Reduce related lists (recommended < 12) or use the Related List Single component to show only the most important related lists and then use secondary tabs
● Use Lightning Actions instead of custom components where appropriate
● Minimize the number of open tabs within Console Apps
● Tip: use Lightning Inspector to assess page complexity

Salesforce Lightning Inspector Chrome Extension
Improving EPT
Network Performance
[Triangle diagram: Browser Performance · Network Performance · Page Complexity & Customization]

● Use Lightning Experience CDN and Salesforce Edge, if required for global users (next slides)
● Reduce custom component XHRs (XMLHttpRequests) with caching, storable actions, and the Lightning Data Service
● Look out for the impact of VPN network topologies and consider split tunneling
● Follow Networking best practices
● If necessary, Troubleshoot networking issues
● Consider leveraging ThousandEyes if needed
Do you know what CDN
(Content Delivery Network) is?
● Yes

● No
CDN & Caching
Improving Network Performance

● Enable Content Delivery Network (CDN) in Salesforce to load applications faster - enabled by default in new Orgs after Winter ’19
● Enable CDN for Community Cloud (Digital Experiences) to load communities faster
● Enable secure and persistent browser caching to reduce the number of server calls
Have you heard about or
implemented Salesforce Edge
before?
● Yes

● No
What is Salesforce Edge?
Data driven and geo-distributed network product to deliver superior user experience

● TLS Termination - Provides the ability to complete the secure handshake in a shorter amount of time so that the application can begin requesting data faster
● Caching Static Content - Storing certain objects that are cacheable in locations closer to the customer in a compliant manner
● Route Optimization - Using network latency measurements and topology information to direct customers to the Edge that results in the best experience
● ML Driven Network Optimizations - Congestion control and other transport optimizations tailored to the user’s context

Salesforce Edge Network and Considerations

Improving EPT
Browser Performance
[Triangle diagram: Browser Performance · Network Performance · Page Complexity & Customization]

● Ensure hardware and browser meet the technical requirements
● Check for older or out-of-date browsers in the Lightning Usage App report
● Leverage the built-in speed test to review Octane scores
● Record Octane, CPU, RAM, OS, Browser, Geography data in a spreadsheet to identify issues and patterns

Lightning Reports and Dashboards Technical Requirements
Resources
Where to find additional information

Trailhead
● Lightning Experience Performance Optimization
● Lightning Web Components Basics

Knowledge Base Article


● Improve Speed & Performance in Lightning
● Improve listview speed and performance

Community
● *Lightning Now!* Trailblazer Group
● Lightning Speed Admin Podcast
Debug & Profiling
Methodology

● Identify the process and define the scope to be


analyzed
● Define the scenario to reproduce the performance
issue
● Test the scenario, usually in the full sandbox, to
have performance similar to production
● Use the Developer Console as a tool to identify the bottlenecks or code that is taking a long time to execute while reviewing the Debug Logs

A Guide to Application Performance Profiling in Force.com


Developer Console and Debug Logs
How To - Prepare

● In the Setup menu select Environment > Logs > Debug Logs

● In User Trace Flags panel, click New and select the user that reproduces the issue. Prefer an admin
user to execute the sharing rules:

● To see the level of details in the log file click on Filters


○ It is recommended that Apex Profiling, DB, and Apex Code are set to FINEST
○ The log Level can also be set using the Developer Console
○ In case you have timeout issues when executing your scenario try reducing the log level

Debug Log Levels
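A quick way to see the effect of the log level is a pair of System.debug statements (illustrative):

// With Apex Code at FINEST both lines appear in the log;
// at a coarser level (e.g. ERROR) only the first one is captured.
System.debug(LoggingLevel.ERROR, 'captured at ERROR level and above');
System.debug(LoggingLevel.FINEST, 'captured only when the level is FINEST');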


Developer Console and Debug Logs
How To - Reproduce

● First, in the Salesforce Org, click on the Gear Icon at the top
right and select Developer Console. This will open the
Developer Console where you will be able to see the log
information

● Run the scenario to reproduce the error: for example, in the


Donations/Opportunities tab select New, populate the form
with the right values and click Save
Developer Console and Debug Logs
How To - Review the generated log

● After executing the scenario, review the debug log files in the Developer Console > Logs tab. Check
the request duration and file size

For example, double click on the log file taking


longer to execute to open the debug log

Additional information: Debug Log Levels


Developer Console and Debug Logs
How To - Debug log fine tune

● With the debug log opened, change the view to select what is important in Debug menu > View Log Panels...

Select the important Panel like


Stack Tree, Execution Overview …
Developer Console and Debug Logs
How To - Analyze
Performance Tree > Stack Trace Panel & Execution Overview > Timeline

Most of the execution time is spent in the method getSLAMatchCount, fired by the trigger CaseBeforeInsert and executed 6,993 times

98% of execution time is in Apex code. See above in the Stack Trace the code that takes a long time to execute

By clicking on the execution timeline, the Execution Log is filtered by the corresponding event
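A hotspot like getSLAMatchCount firing thousands of times usually points to a query inside a per-record loop. A hedged sketch of the anti-pattern and its bulkified fix (SLA__c and Account__c are hypothetical names, not the actual customer code):

// Anti-pattern: one SOQL query per record - fires once per Case in the chunk
for (Case c : Trigger.new) {
    Integer n = [SELECT COUNT() FROM SLA__c WHERE Account__c = :c.AccountId];
}

// Bulkified: a single aggregate query for the whole chunk
Set<Id> accountIds = new Set<Id>();
for (Case c : Trigger.new) { accountIds.add(c.AccountId); }
Map<Id, Integer> countsByAccount = new Map<Id, Integer>();
for (AggregateResult ar : [SELECT Account__c acct, COUNT(Id) cnt
                           FROM SLA__c
                           WHERE Account__c IN :accountIds
                           GROUP BY Account__c]) {
    countsByAccount.put((Id) ar.get('acct'), (Integer) ar.get('cnt'));
}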
Developer Console and Debug Logs
How To - Check the Governor Limits
Execution Overview > Timeline

Close to the LIMIT for CPU and Heap Size. Think about what will happen when changes are made or added

Filter the Execution Log by LIMIT_USAGE
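For reference, the cumulative limits entry looks like the excerpt below when the Execution Log is filtered by LIMIT_USAGE (values illustrative):

LIMIT_USAGE_FOR_NS|(default)|
  Number of SOQL queries: 97 out of 100
  Number of query rows: 48200 out of 50000
  Maximum CPU time: 9400 out of 10000
  Maximum heap size: 5650000 out of 6000000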
Legacy and Enhanced Recurring Donations

Salesforce for Nonprofit Recurring Donations
Allow Nonprofit organizations to track gifts that donors have pledged over a period of time (Fixed-Length or Open-Ended)

● History
● Differences
● Migration from Legacy to Enhanced Recurring Donations
  ○ More than just a button click
  ○ Considerations
How familiar are you with
Legacy and Enhanced Recurring
Donations?
● Are there two types?

● I understand it
● I’m an expert
Salesforce for Nonprofit Recurring Donations
Recurring Donations has existed in Nonprofit Success Pack (NPSP) for many years
In an effort to improve the user experience, and in partnership with our nonprofit
community, the Salesforce.org team reimagined the Legacy Recurring Donations (RD1 or
LRD) and in 2020 launched the Enhanced Recurring Donations (RD2 or ERD).

Timeline: 1999 · 2008 (NPSP Version 2) · 2009 (First Recurring Donations feature) · 2010 · 2014 (NPSP Version 3) · 2016 · 05/2020 (Enhanced Recurring Donations) · 2022 (Continuous Innovation) - 64+ package versions
Household Account Model Donations
Process overview for single, multiple payments, and recurring donations
[Diagram: Household Account → Contacts → Opportunity Contact Roles → Opportunities → Payments, plus Recurring Donations]

● Contacts are always people who are part of a household Account. New Contacts automatically create and link to new Household Accounts, but can be merged into existing households.
● The first member of a household initially becomes its Primary Contact.
● The Opportunity Contact Role defines how credit is distributed. Each member’s role (Donor, HH Member) is automatically assigned. Everyone in a household gets hard or soft credit.
● The Opportunity Primary Contact is the donating household member if the donation is made at the Contact level, or the household’s Primary Contact if made at the Household level.
● Opportunities are always associated with Accounts. Multiple payment donations close with the last payment; Recurring Donation opportunities close as each payment is received.
● Recurring Donations are a series of single donations with identical contacts and roles, managed by the Recurring Donation object.
Recurring Donations - LRD vs ERD
Main Data Model changes
[Diagram: LRD and ERD data models - Account (Household), Contact, Opportunity Contact Role, Recurring Donations, Opportunity, Payment]

● Legacy Recurring Donations (LRD) are generated by a batch process or on-demand for future months based on the NPSP settings configuration (default 12 months)
● Enhanced Recurring Donations (ERD) create one pledge donation at a time. The next donation is created when the pledged donation is closed.
● ERD still uses the Recurring Donations object
● Option to track Recurring Donations changes (Recurring Donations Changes object)
● New Recurring Donations Schedule object (Protected/Private Object)
Enhanced Recurring Donations (ERD)
Page Layout

● New Page Layout
  ○ Active Schedules
  ○ Upcoming Installments - define the number of upcoming installments in the page layout component
● New, removed, or replaced fields
Enhanced Recurring Donations NPSP Settings
What is new?

● Installment Opportunity
Auto-Creation
○ Always Create Next Installment
(default)
○ Disable First Installment on Create
○ Disable All Installments
● Next Donation Date Match Range
● Use Fiscal Year for Recurring
Donations
● Recurring Donation Name Format
Enhanced Recurring Donations (ERD)
Upgrade or Migration?

Why move?
● New functionality
● Innovation and enhancements for ERD only
● Salesforce.org Products that require ERD (i.e. Elevate, Einstein for Nonprofits)
● NPSP Data Import and Gift Entry are pre-configured for ERD
● Improved performance for Large Data Volume

Upgrade
“Move to a newer version of the same product or functionality” - usually with minimal effort (i.e. iOS 14 to iOS 15)

Migration
“Change to a newer product or functionality” - entails significant effort (i.e. platform change: Windows to Linux)

Already using Legacy Recurring Donations?
Approach it as a Project: evaluate, plan, scope, test, refactor, migrate/downtime, re-test, enable
DO NOT TURN OFF Automation - there is a lot of automation running during the upgrade process
Considerations:
● Data Model Changes
● Large Data Volume to migrate
● Integrations
● Process changes
● Enablement

Enhanced Recurring Donations Upgrade Guide
Use Cases Study
Real Life Examples - What, How & Why

NPSP Data Load
● NPSP Data Import is too slow
Customizable Rollups
● Batch job failures after enabling Customizable Rollups
● Custom filters on Customizable Rollups cause the overnight batch to fail
Declarative Lookup Rollup Summaries (DLRS)
● When using Apsona to upload Contacts into Salesforce an error message is encountered and the process does not complete
Legacy Recurring Donations
● Changing NPSP Settings > Recurring Donations > Recurring Donations > Opportunity Forecast Months to a lower number causes an error
Volunteers for Salesforce
● Website not displaying available volunteer job shifts
Admissions Connect
● New applicant registration encounters an error and does not complete
Use Case #1 - Batch Job error during data loading

Large Data Volume Testing
Nonprofit organization running tests in UAT with additional data and configuration, seeing an error in the nightly Batch Apex job.

Reported Issue
During load testing in UAT, we noticed an error in the batch Apex jobs:
"Error: [REQUEST_RUNNING_TOO_LONG] Your request was running for too long, and has been stopped."
Context: npsp__CRLP_SkewDispatcher_BATCH

Summary
Large Data Volume testing in UAT causing batch Apex job errors.
Use Case #1 - Understanding the Error Message
What does “Your request was running for too long” mean?

Apex transaction limit
The Apex transaction exceeded the transaction limit and was killed.

Description | Synchronous Limit | Asynchronous Limit
Maximum CPU time on the Salesforce servers | 10,000 milliseconds | 60,000 milliseconds
Maximum execution time for each Apex transaction | 10 minutes | 10 minutes

Execution Governors and Limits

What is happening?
Key considerations

“Is this a large data volume Org?”
Yes, LDV testing in UAT

“What does “CRLP_SkewDispatcher_BATCH” do?”
By reviewing “How Customizable Rollup Batch Jobs Work”, we understand the Skew Mode Dispatcher class breaks the rollups into smaller groups of records (the default is 300) for execution

“Are there custom processes or configuration involved?”
In this specific scenario, the nonprofit had also inactivated the CRLP_Account_BATCH and CRLP_Contact_BATCH batch Apex jobs
#1 - Batch Job error during data load
Cause & Fix
What caused the issue and how to fix it?

Cause
Large Data Volume caused the issue.

Fix
1) Re-enabled CRLP_Account_BATCH & CRLP_Contact_BATCH
2) Enabled Incremental Mode to help reduce the overall data within the Customizable Rollup batch jobs and also reduce the occurrence of a long-running batch job:
   Setup > Custom Settings > Customizable Rollup Settings > click Manage > click Edit > enable the checkboxes:
   ● Incremental Contact Hard Credit Non-Skew
   ● Incremental Account Hard Credit Non-Skew
3) Specified which Opportunity field and value are used to determine which Contacts/Accounts are processed:
   Setup > Custom Settings > Customizable Rollup Settings > click Manage > click Edit > enable the checkboxes:
   ● Incremental: Last N Days Field Override
   ● Incremental: Last N Days Value Override
4) Enabled Batch chunking mode, which splits one long-running batch into smaller batch jobs (*** per Salesforce guidance only ***):
   Setup > Custom Settings > Customizable Rollup Settings > click Manage > click Edit > add LDV: Account Hard Credit Chunk Size as 100000

** Incremental Mode only affects nightly scheduled jobs. Running the Rollup Donations Batch (in NPSP Settings, under Bulk Data Processes) still calculates rollups for all records
Use Case #2 - Data Loader error

Loading Donations
Nonprofit organization preparing for go-live using Data Loader to load donations (Opportunities) is running into errors.

Reported Issue
Using Data Loader to load donations and getting an error:
npsp.TDTM_Opportunity: execution of AfterInsert
caused by: System.QueryException: Non-selective query against large object type (more than 200000 rows).

Summary
Loading donations causes an error during the Opportunity trigger execution.
Use Case #2 - Understanding NPSP Automation
What happens when a Donation is created?

NPSP TDTM Apex Class


During Opportunity creation, the Salesforce.org
Automation Framework (TDTM) executes the
OPP_OpportunityContactRoles_TDTM Apex class.
On Opportunity insert, the Apex class:
● Implements key aspects of automated soft credits
by creating Opportunity Contact Roles based on
related Relationships, Affiliations, and the
Contact Role as defined in NPSP Settings
● Updates the Opportunity Name based on
Opportunity Naming configuration defined in
NPSP Settings
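For orientation, custom TDTM classes follow the pattern documented for NPSP: extend npsp.TDTM_Runnable and return queued DML in a DmlWrapper. A minimal sketch (the class body is illustrative, not the actual OPP_OpportunityContactRoles_TDTM source):

global without sharing class MyOpportunityTDTM extends npsp.TDTM_Runnable {
    // Invoked by the TDTM trigger dispatcher instead of a hand-written trigger
    global override npsp.TDTM_Runnable.DmlWrapper run(
            List<SObject> newList, List<SObject> oldList,
            npsp.TDTM_Runnable.Action triggerAction,
            Schema.DescribeSObjectResult objResult) {
        npsp.TDTM_Runnable.DmlWrapper dmlWrapper = new npsp.TDTM_Runnable.DmlWrapper();
        if (triggerAction == npsp.TDTM_Runnable.Action.AfterInsert) {
            // inspect newList and queue records, e.g.
            // dmlWrapper.objectsToInsert.addAll(newContactRoles);
        }
        return dmlWrapper; // the framework commits all queued DML once
    }
}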
Use Case #2 - Understanding the Error Message
What does “Non-selective query against large object type” mean?

Salesforce Query Optimizer
First the optimizer evaluates the query filters; if no selective filter is found, the query is marked as non-selective.

Large Data Volume
Especially important in an LDV Org: when the query is non-selective and more than 200k records are returned, the result is:
System.QueryException: Non-selective query against large object type (more than 200000 rows).

SELECT fields FROM Account (returning more than 200k records)
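To illustrate, using the NPSP Affiliation object from this use case (the Id value is a placeholder):

// Non-selective in a trigger context once the object holds 200k+ rows:
List<npe5__Affiliation__c> allAffs =
    [SELECT Id FROM npe5__Affiliation__c];

// Selective: filtered on an indexed lookup field, returning a bounded subset
Id orgId = '001000000000001AAA'; // placeholder Account Id
List<npe5__Affiliation__c> orgAffs =
    [SELECT Id FROM npe5__Affiliation__c
     WHERE npe5__Organization__c = :orgId];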
What is happening?
Key considerations

“Is this a large data volume Org?”
Yes, LDV of opportunities

“What does “Non-selective query against large object type” mean?”
As mentioned on the previous slide, when a SOQL query returning more than 200k records runs as part of a trigger execution and no selective filter is found, the query is marked as non-selective and the QueryException error is returned.

“What else is happening during the donation load process?”
When loading donations, Affiliations are created between donors and organizations
#2 - Data Loader error
Cause & Fix
What caused the issue and how to fix it?

Cause
Non-selective query combined with Data Skew: 4 Accounts with more than 10k Affiliation records each.

npsp.TDTM_Opportunity: execution of AfterInsert
caused by: System.QueryException: Non-selective query against large object type (more than 200000 rows).

Fix
Distributing the records to avoid Data Skew (fewer than 10k related records per Account) solved the issue.
Use Case #3 - Customizable Rollups Error

NPSP Customizable Rollups
Nonprofit organization using NPSP for a long time decides to enable Customizable Rollups as part of their strategy to get better visibility of aggregated information.

Reported Issue
After enabling Customizable Rollups, the nonprofit started to see errors in their batch jobs. One example is:
CRLP_ContactSkew_SoftCredit_BATCH - (apex job id ...) error REQUEST_RUNNING_TOO_LONG

Summary
Customizable Rollups Batch Apex jobs not completing because of errors.
What is happening?
Key considerations

“Did something change, or is this a large data volume Org?”
Yes, Customizable Rollups enabled in an LDV Org:
● Accounts (2,191,343)
● Contacts (2,016,072)
● Opportunities (49,624,889)
● Campaign Members (22,291,122)

“What does “REQUEST_RUNNING_TOO_LONG” mean?”
It means the query executed in the Apex transaction is taking more than 10 minutes to complete (as explained earlier)

“Are there custom processes or configuration involved?”
There is custom Apex being executed on the OpportunityContactRole
#3 - Customizable Rollups Error
Cause & Fix
What caused the issue and how to fix it?

Cause
Custom Apex running in a loop querying the OpportunityContactRole object when processing Hard Credits. The query in the Apex code was being executed against all OpportunityContactRole records.

Fix
Removing the custom Apex code and custom logic solved the issue.
Use Case #4 - Customizable Rollups Custom Filters

NPSP Customizable Rollups
Nonprofit organization using a more restrictive filter on the Customizable Rollups configuration: NPSP Settings > Donations > Customizable Rollups > “Configure Customizable Rollups” button > “View Filter Groups” button > added conditions to the filter rules: Opps: Won (HC) and Opps: Won (SC)

Reported Issue
After adding additional conditions to the Customizable Rollups filters, the overnight batch jobs failed. One example:
[REQUEST_RUNNING_TOO_LONG] Your request was running for too long, and has been stopped.
Context: npsp__CRLP_SkewDispatcher_BATCH

Summary
Customizable Rollups custom filters caused errors on Batch Apex jobs.
[Illustration]
What is happening?
Key considerations

“Did something change, or is this a large data volume Org?”
Yes, new Customizable Rollups filters; no large data volume involved

“What does “CRLP_SkewDispatcher_BATCH” do?”
By reviewing “How Customizable Rollup Batch Jobs Work” it is possible to see the Skew Mode Dispatcher breaks the rollups into smaller groups of records (the default is 300) for execution

“Are there custom processes or configuration involved?”
Yes, the new custom filters are based on custom fields
#4 - Customizable Rollups Custom Filters
Cause & Fix
What caused the issue and how to fix it?

Cause
Custom Customizable Rollup filter based on a custom field causing a query timeout

Fix
Added an index to handle the custom filter definition, improving the query execution time and preventing the long-running query and the consequent timeout
Use Case #5 - Legacy Recurring Donations Errors

Opportunity Forecast Months
Nonprofit organization with millions of recurring donations (Opportunities) changed NPSP Settings > Recurring Donations > Recurring Donations > Opportunity Forecast Months from the value of 12 to 1, aiming to reduce the number of future created donation records.
● Option only available if using Legacy Recurring Donations

Reported Issue
After changing the “Opportunity Forecast Months” we started to get an error:
npsp:Too many query rows: 50001 error

Summary
In Legacy Recurring Donations, changing forecast months caused an error.

Configure Legacy Recurring Donations
What is happening?
Key considerations

“Is this a large data volume Org?”
Yes, the Salesforce Org has millions of Donations and related records

“What does “Too many query rows: 50001” mean?”
In simple terms, this error happens when the SOQL query retrieves more than 50,000 records

“Are there custom processes or configuration involved?”
In this specific scenario, the nonprofit also had a custom Trigger on Opportunities
#5 - Legacy Recurring Donations Errors
Cause & Fix
What caused the issue and how to fix it?

Cause
Changing the forecast value in NPSP is a computationally expensive process, particularly in an Org that has a large data volume, as it will either mass-create or delete Opportunities and related records. The change combined with the custom Trigger was causing the issue.

Fix
The following was tried but did not work:
1) Changing incrementally by 1 (i.e. 12 > 11)
2) Disabling the custom Trigger
The partner updated the Custom Settings manually through SOQL (not recommended), which did not trigger the processing of existing recurring donations
Use Case #6 - Customizable Rollups Batch Job Error

Rollups with Large Data Volume
Nonprofit organization with 300k+ Accounts and 250+ Donations (Opportunities) per Account started to notice errors in Batch Apex jobs.
[Diagram: Account related to Opportunity, Opportunity Contact Role, Payments, Address, Contact, Affiliation, Relationship]

Reported Issue
Since a few months back we started to get errors in our batch Apex jobs like the one below:
"First error: Apex CPU time limit exceeded"
Context: npsp__CRLP_Account_BATCH

Summary
Batch Apex job is failing with an error.
Use Case #6 - Understanding the Error Message
What does “Apex CPU time limit exceeded” mean?

Apex CPU time limit
It is calculated for the executing Apex code and any processes that are called from this code, such as package code, workflows, process builders and flows.
Salesforce has a limit for transactions based on CPU usage. If transactions consume too much CPU time, they are shut down as long-running transactions.

Not Counted
Specific database operations, e.g. the portion of execution time spent in the database for DML, SOQL, and SOSL, isn’t counted, nor is waiting time for Apex callouts.
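A small sketch of instrumenting long-running code against this limit with the Limits class:

// Limits.getCpuTime() returns the CPU milliseconds consumed so far in this
// transaction; Limits.getLimitCpuTime() returns the cap
// (10,000 ms synchronous, 60,000 ms asynchronous).
if (Limits.getCpuTime() > Limits.getLimitCpuTime() - 1000) {
    System.debug(LoggingLevel.WARN,
        'Approaching the CPU limit: ' + Limits.getCpuTime() + ' ms used');
}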
What is happening?
Key considerations

“Is this a large data volume Org?”
Yes, the Salesforce Org has 300k+ Accounts and 250+ Donations (Opportunities) per Account

“What does “npsp__CRLP_Account_BATCH” do?”
As mentioned earlier, this batch Apex job is responsible for “Customizable Rollups - Account Hard Credit”

“Are there custom processes or configuration involved?”
Not in this specific scenario
#6 - Customizable Rollups Batch Job Error
Cause & Fix
What caused the issue and how to fix it?

Cause
The error is caused by the large data volume to be processed by the Customizable Rollups batch Apex job.

Fix
Reducing the batch job size for Accounts fixed the issue: “NPSP Settings > Bulk Data Processes > Batch Process Settings > Account Hard Credit Batch Size”.
Use Case #7 - Volunteers don’t see available jobs

Volunteers for Salesforce
Nonprofit organization delivering baby essentials to homeless families is using the Nonprofit Success Pack (NPSP) and Volunteers for Salesforce (V4S) to make it easier for volunteers to sign up for upcoming job shifts through their website. They implemented a custom automated process to create upcoming volunteer job shifts for the next several months.

Reported Issue
We are using NPSP and V4S and our site is not displaying available jobs anymore; it was working up until now. We attempted to replicate it using a Sandbox but it worked as expected. If we delete the shift record then the job is posted on the web page.

Summary
Volunteers don’t see job shifts on the website.
[Screenshot: Define how job and shift repeat]

Volunteers-for-Salesforce Issues in GitHub
What is happening?
Key considerations

“Did something change, or is there a related issue in the GitHub repo?”
Nothing changed, but there is a related issue in GitHub: Some Jobs May Not Be Displayed if More than 999 Active Jobs

“What is the custom process doing?”
The custom process is currently creating 1k+ job shift records for the next several months

“Are there limits causing the issue?”
There are no limits on how far in the future you can schedule job shifts, as mentioned here
#7 - Volunteers don’t see available jobs
Cause & Fix
What caused the issue and how to fix it?

Cause
By reviewing the V4S GitHub repository issues list we found there is a limit that prevents more than 999 jobs from being displayed: Some Jobs May Not Be Displayed if More than 999 Active Jobs

Fix
Reducing the number of available active jobs to fewer than 999 fixed the issue
Salesforce.org
Architect Academy
for Partners (Nonprofit / Education)
Day 4

Claudio Moraes Fabrice Pierre


Partner Success Director, Partner Success Manager,
Technical Lead EMEA
Links from Day 1, 2 & 3
https://salesforce.quip.com/NaT7AWSaUEtv
Learning Journey

1. Nonprofit & Education Products - Salesforce for Nonprofit & Education Overview
2. All About Data - Large Data Volume, queries, indexes, Salesforce optimizer, data import
3. Salesforce Architecture, Automation - Salesforce.org Automation framework (TDTM) and Batch Apex
4. Donations
5. Recurring Donations (Legacy & Enhanced)
6. Performance and Debug - Profiling
7. Use Cases Study - Real Life examples
8. Product Access - Demo and Product access
Bonus: Technology as a tool to deliver faster time to value
Schedule
Day 3:
● Performance & Debug
○ Understanding Performance in Salesforce
○ Developer Console and Debug Logs
● Recurring Donations
○ Legacy vs Enhanced Recurring Donations
○ Migration or Upgrade?
● Use Cases Study - Real Life Examples
○ Common Issues - What, How and Why?
Day 4:
● Use Cases Study - Real Life Examples
○ Continuation…
● Demo Environments and Products access
● Technology as a tool to deliver faster time to value
○ CumulusCI
○ Building Templates as Lego blocks
Use Case #8 - Recurring Volunteers Job Scheduling

Volunteers for Salesforce
Nonprofit organization using the Salesforce Nonprofit Success Pack (NPSP) with over 1,000 volunteers running a large number of Volunteer Jobs, some with up to 15 Recurrence Schedules. Originally configured the scheduling of recurring volunteers to run on the first day of the month to extend volunteer recurrence schedules out for another 4 months.
● Custom Process Builder running during the Batch Apex job.

Reported Issue
Currently, the Batch Apex job "GW_Volunteers.VOL_BATCH_Recurrence" runs overnight and most nights has an error: "Apex CPU time limit exceeded". Volunteer job shifts are not being created correctly; some are duplicated in the same month, others are missed altogether.

Summary
Scheduling of recurring jobs does not complete or produces unexpected outcomes.
What is happening?
Key considerations

“What does the VOL_BATCH_Recurrence class do?”
Batchable and Schedulable class to find the list of active Job Recurrence Schedules and process them to add job shifts into the future, as seen here.

“Is there a custom configuration or process impacting the job?”
There is a custom Process Builder invoked while the job is running.

“Is the amount of data contributing to the issue?”
The Volunteer Job has:
● 4 active Recurrence Schedules
● 923 Volunteer Job Shifts
● 1,541 Volunteer Recurrence Schedules
● 36,271 Volunteer Hours

What is probably causing the problem?
● Large Data Volume
● Custom Process Builder
● Something else
#8 - Recurring Volunteers Job Scheduling
Cause & Fix
What caused the issue and how to fix it?

Cause
Custom Process Builder running during the scheduled job, plus a considerable number of volunteer records to be processed, caused the issue.

Fix
1. Changed the custom Process Builder to a before-update Flow to minimize DML execution
2. Lowered the “Recurring Job Future Months” value:
   ○ Setup > Custom Settings > Volunteers Settings
   ○ Click Manage > click Edit (if there are no Custom Settings values configured, click New)
   ○ Update the value of the Recurring Job Future Months field to a value lower than 4
Use Case #9 - Apsona Data Loading

Apsona & DLRS
Nonprofit organization using Apsona, frequently uploading constituent information. Also using Declarative Lookup Rollup Summaries for custom aggregation and rollup summaries.

Reported Issue
Recently, the nonprofit started to get an error when using Apsona:
"CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY: npsp.TDTM_Account: System.LimitException: Apex CPU time limit exceeded."

Summary
Using Apsona to upload 40+ contacts with 20+ field updates gives an error.
What is happening?
Key considerations

“Did something change?”
No configuration changed, but there are more records currently in Salesforce

“Is there a custom process or app involved?”
The nonprofit is using Declarative Lookup Rollup Summaries for custom aggregation and rollups

“What does Apex CPU Time Limit Exceeded mean?”
The Salesforce platform imposes a CPU usage governor limit on any given execution context, which is approximately 10 seconds. It means Apex code, declarative tools, or a combination in a transaction must not exceed a ~10-second limit. More info here

What is probably causing the problem?
● Large Data Volume
● Declarative Lookup Rollup Summaries
● Something else
#9 - Apsona Data Loading
Cause & Fix
What caused the issue and how to fix it?

Cause
Declarative Lookup Rollup Summaries (DLRS) introduced additional custom code execution which, combined with the large data volume, caused the issue.

Fix
Moved the configuration from DLRS to NPSP Customizable Rollups and uninstalled DLRS.
Use Case #10 - School Applicants Registration

Admissions Connect
Education institution implementing Admissions Connect and receiving a large number of applicants is getting errors when applicants try to register on the web site. Currently using the “Administrative Account” Model (more details on the next slide).

Reported Issue
New applicants are receiving the following error:
“Registration error in site Applicants. Your organization has most likely exceeded the portal role limit. Registration from Applicants is not able to create portal users.”

Summary
New applicants are unable to complete the registration.
Use Case #10 - School Applicants Registration
First, let’s review the EDA Account Models

Administrative Account Model
Think of an Administrative Account as the Account-level representation of a Contact. It is a 1:1 relationship between the two.
Example: Claudio Administrative Account - Student Contact: Claudio Moraes

Household Account Model
Just like its name suggests, it represents the household of a student and typically contains multiple Contacts under the same Account.
Example: Claudio Household Account - Student Contact: Claudio Moraes; Parent Contact: Antonio Moraes
What is happening?
Key considerations

“Is this a large data volume Org?”
Yes. In addition, several new applicants are trying to register

“What does “portal role limit” mean?”
As mentioned here, if the number of roles used in a Salesforce Org exceeds 50,000, an error is displayed

“What else is happening?”
Education institution using the EDA Contact + Administrative Account architecture
#10 - School Applicants Registration
Cause & Fix
What caused the issue and how to fix it?

Cause
When a new applicant User is created in Experience Cloud (Portal), a new Role for that User is also created that references the student's Account (Administrative or Household Account). As a large number of User Roles are created, this hits the Salesforce limit on the number of User Roles per org.
Without ARO: 4,000 single-user Accounts > 4,000 portal roles created
With ARO: 4,000 single-user Accounts > 1* person account role created
* With 10k+ individual applicants, think about using more than one Person Account Role to avoid data skew

Fix
Configure Account Role Optimization (ARO) and enable shared person account roles.
Shared Person Account Roles is designed for this scenario, where there are hundreds of community users with a 1:1 Account-to-Contact relationship (Administrative Accounts). The resulting sharing behavior is the same but uses only a fraction of the number of roles. An existing org can shift to Person Account Roles, although it requires some data manipulation with a tool like Data Loader to point existing Community/Portal users to the Person Account Roles.
Use Case #11 - Admissions Connect

Territory Management for Recruiting
University implementing Admissions Connect has a requirement to automatically assign territories to recruiters based on specific rules.

Reported Issue
Currently, the Admissions Connect Territory Management functionality only works when manually executed. There is no option to configure automated territory assignment.

Summary
Recruiting Territory does not run automated assignment.
[Illustration]
Use Case #11 - Admissions Connect
First, let’s review a few key points about Enterprise Territory Management

Access
Territory Management allows you to define access to Accounts, Opportunities, and Cases (not Contacts).

Assignment
Assignment is defined at the Account level; no option is available to define assignment for Contacts (Students), meaning if student information changes it will not trigger an automatic reassignment.
What is happening?
Key considerations

“How does Enterprise Territory Management (ETM) work?”
Enterprise Territory Management rules are based on Accounts, while Admissions Connect may be based on the Administrative Account Model with the main student information on the Contact object.

“Is there an option to still use ETM?”
It is possible to use a programmatic approach by leveraging the AssignmentRuleHeader Web Services API, but it may become a complex solution

“What else should we take into consideration?”
Think about the complexity of the current implementation and the territory requirements as you evaluate different options
#11 - Admissions Connect - Territory Management for Recruiting
Cause & Fix
What caused the issue and how to fix it?

Cause
Recruiting Territory Management does not trigger assignment based on Student (Contact) information updates.

Fix
One possible alternative is to use Flows to automate the territory assignment. Another alternative is to use the “Auto Run Territory Assignment Rules” app from Trigg:
● Run Territory Assignment Rules on any update
● Trigger Territory Assignment Rules from Flow
● Trigger Territory Assignment Rules from Process Builder
Ensure you test it against the education institution's requirements before deploying it to production.
Observed Patterns
Usually a combination of...

● Large Data Volume


● Customizations in the Data Model
○ Custom Objects
○ Custom Relationships
● Custom Apex Code and Trigger logic
● Custom Process Automation (Process Builder)
combined with custom trigger/apex
● Batch jobs handling LDV
● Multiple batch jobs running simultaneously
Industry Demo Orgs (IDO)

Demo & Products


Demo & Dev Station for Partners
Have you used an Industry
Demo Org (IDO) before?
● Yes

● No
Industry Demo Orgs (IDOs)
Highly configurable, pre-populated industry demos built on the SDO

KEY FEATURES
● Pre-Populated Industry Data & Use Cases
● Built-In Tools To Quickly Customize & Configure
● Customizable Demos
● DIY Cross-Cloud Integrations
● Nonprofit & Education Industry Clouds
● Ideal for Deep Dive Nonprofit and Education Demos
● Spin Up & Throw Away (Expires in 30 days)

PRODUCTS INCLUDED
Industry Clouds · Sales · Service · Commerce (B2B) · Experience · Platform · Analytics · myTrailhead
Nonprofit (NGO) IDO
Industry Demo Org (IDO)

CONSTITUENT ENGAGEMENT
● Volunteers
● Digital Experiences
PROGRAM MANAGEMENT
● Nonprofit Case Management
● Program Management Module
FUNDRAISING
● Elevate Giving Pages
● Accounting Subledger
GRANTS MANAGEMENT
● Grantmaking Portal
● Due Diligence Framework

Education (EDU) IDO
Industry Demo Org (IDO)

DATA ARCHITECTURES
● K-12 Architecture Kit
● Education Data Architecture
ADMISSIONS AND RECRUITING
● Admissions Connect
● Recruiting & Marketing
STUDENT SUCCESS
● Student Success Hub
● Advisor Link
ALUMNI RELATIONS
● Accounting Subledger
● Elevate Pages
Have you used the Partner
Learning Camp?
● Yes

● No
Step 1: Complete the SDO Fundamentals Course
Partners must first complete the
SDO Fundamentals course.

1. Access Partner Learning Camp


2. Search “Simple Demo Org
Fundamentals”
3. Hover over course Card to see
details
4. Click “Enroll” and complete
the course to access the
“Demo Org” Tab In this course, discover all the tips,
tricks, and tools unique to the
Salesforce Simple Demo Org (SDO)
and Everybody's Demo Org (EDO).
Identify when, where, why, and how
to use SDO and EDO.
Approx. 1 hour and 7 mins.
Step 2: Complete Nonprofit / Education Courses

Complete the Nonprofit or Education Industry


Fundamentals Course.

1. Access Partner Learning Camp


2. Search “Nonprofit Industry Demo Org Fundamentals”
or “Education Industry Demo Org Fundamentals”
3. Hover over course Card to see details
4. Click “Enroll” and complete the course
Step 3: Request the Nonprofit / Education Demo Org

Requesting the Nonprofit / Education


Industry Demo Org.

1. Access Partner Learning Camp


2. Navigate to the “Demo Org” Tab
3. Read the information provided in the left form
4. Select “NGO IDO” or “EDU IDO” in the Demo Type field
5. Complete and submit the request form
Step 4: Nonprofit / Education Demo Org Access

After the request has been submitted,


the Partner will receive an email with a
link to access the provisioned Demo
Org.

Auto-provisioning may take a few


hours to complete.
Step 5: Join the “Demo Orgs for Partners” Group
Do you have questions?

Join the “Demo Orgs for Partners” (https://sfdc.co/psdo-updates) group on the Partner Community to access additional resources such as the Nonprofit and Education Cloud demo scripts, which will be available soon.
Demo & Dev Station for Partners
Check it out: p.force.com/demo

Advice on how to get your


hands on a demo or dev
environment for every Cloud
available

See how Salesforce demos our


products, and learn how to extend
current Salesforce offerings
Salesforce.org Releases and Roadmap
Check it out: sfdc.co/sfdoroad

A comprehensive collection of resources available to Salesforce.org Network Partners:
● Release Readiness
● Early access initiatives (pilots, betas, previews, Success Insights, and more)
● Roadmap Resources
● Etc.
Salesforce Orgs
What types of Orgs can I access?

Org Type | Best Used For | Expires After | Creation/Access
Industry Demo Org (IDO) | Customer demos and learning | 1 month, unless you request an extension | Partner Learning Camp
Partner Developer Edition | Development and testing | Never | Environment Hub
Partner Enterprise Edition | Robust development, testing and customer demo | 1 year, unless you request an extension | Environment Hub
Scratch Org | Development and testing | 7 days by default; expiration can be defined at creation time with a maximum of 30 days | SFDX or CumulusCI

Partner Developer, Partner Enterprise and Scratch Orgs require the installation and set up of products, whereas the Nonprofit & Education Industry Demo Orgs come pre-configured with ready-to-show demo scenarios

What kinds of orgs can I create in the Environment Hub?
Technology as a tool CumulusCI (CCI) Platform
to deliver faster time
to value
Required Skills & Next Steps
Do you use a DevOps tool?
● What is DevOps?
● Yes

● No
Collaborate & Automate Project Delivery at Scale
Our new normal is changing customers’ and partners’ expectations of how projects are delivered

Challenges
● Addressing Markets at Scale
● Faster Time to Value
● Innovate and Deliver Faster
● High Pace of Change
● Project Collaboration
Paradigm Shift - From Org to Source-Driven Development
[Diagram: pipeline stages - Code (Develop + Unit Tests), QA (Code Merge + Functional), Test (UAT + Load + Staging), Release (Training + Deploy)]

Current Org-Driven Development:
Code is promoted through change sets, ant migration, or the Metadata API - change set promotions across Dev Sandboxes, Dev Pro Sandbox, and Partial and Full Sandboxes into Production.

Future Source-Driven Development:
Changes are tracked in a VCS (repo) with a CLI for source pull and push, and programmatic testing - CLI-driven changes flow from Scratch Orgs through Dev Pro Sandbox to Production.

New Tools:
CumulusCI and Salesforce DX deliver a suite of tools and processes to support modern development and VCS.
What is your development model?
● I use only persistent Orgs (sandbox, production, but no scratch Orgs)
● I use a version control system (i.e. GitHub) with scratch Orgs and persistent Orgs for UAT, training and integration testing
● I use scratch and persistent Orgs without a version control system
Source-Driven Development Process
[Diagram: Plan → Code → Merge & Test (1) → Merge & Test (2) → Release]

● Plan: ID/Req’t/Ticket in the work tracking tool of choice
● Code: Do Dev Change, Test Change, Commit, Peer Review - in a Dev Org and Review Org (Sandbox or Scratch Org)
● Merge & Test (1): Integrate, Test in Integration Env - in an Integration Org (Sandbox or Scratch Org)
● Merge & Test (2): Stage, Test in Staging Env - in a Staging Org (Sandbox)
● Release: Release to Prod
● Branches (Feature 1, Feature 2 → Integrate → Staging → Main) move changes via PRs; orgs are updated with git pull plus deploy or package install
Rate your experience with
Salesforce CLI/SFDX
● Never used it
● I played with it

● I’m an expert
Do you know what is
CumulusCI?
● Yes

● No
CumulusCI Suite
Proven, scalable orchestration tool to deliver faster time to value

CumulusCI enhances Salesforce DX by providing a robust framework for portable automation.

● MetaCI: Continuous Integration (Heroku)
● MetaDeploy: Configuration Delivery
● Metecho: Track Org Configuration
● cci: Command Line
● CumulusCI: Portable Automation Framework
● Snowfakery: Fake Relational Data Generation
● Salesforce DX
● Robot Framework: API & Selenium Test Automation

Product Delivery Model: Automate everything needed to create usable orgs including dependencies, managed package, post-install configuration, and data

Trailhead Trail: Building Applications with CumulusCI
Metéchō
Empowering users, admins and developers to easily collaborate

Metéchō handles the creation of fully configured scratch org environments, metadata change control, and GitHub feature branch management through an easy-to-use web interface.

100% Test Coverage
Competitive Differentiator: The only app for declarative contributions to open source
MetaCI
Continuous Integration at Scale

MetaCI is used by the Salesforce.org team to run builds of all managed packages and for creating beta and production releases.

398,000+ builds since 2017 · 195 Million+ automated test cases
Robot Framework
Maintainable API and Browser Test Automation

CumulusCI ships with Robot Framework and keyword libraries for interacting with CumulusCI's automation and Salesforce via the REST API and Selenium.

● Keywords updated with each Salesforce release
● Products build their own keyword libraries
● Test cases built using the CumulusCI and product keywords for easy-to-maintain test suites
MetaDeploy
Easily deliver portable projects

Create plans for projects for easy delivery. The cumulusci.yml file contains the plan definitions, and the metadeploy_publish task publishes them. The MetaDeploy web interface makes it easy to deploy configuration based on customer use cases.

install.salesforce.org
install.work.com
Smart Metadata Updates
CumulusCI handles situations where static metadata is dangerous

When is static metadata dangerous?
● Page Layouts
● Value Sets & Picklists
● Record Types
● Permission Sets
● Generally, any metadata the customer can modify

Smart Metadata Updates
● Retrieve the target metadata
● Transform the metadata locally
● Deploy the modified metadata
Data Sets and Snowfakery
Generate, Capture, and Load Relational Data

● Snowfakery: open source relational data generation using a YAML data shape syntax
● CumulusCI Bulk API Tasks: capture and load relational data in an org using the Bulk API and REST API
● CumulusCI + Snowfakery: generate sample data sets of any size, including LDV, from Snowfakery data shapes and load them using the Bulk and REST APIs
Skills & Requirements
What skills are needed to fully benefit from CumulusCI?

Package Development
● Understands package types and development models
● Has experience developing and implementing solutions using the Salesforce CLI and SFDX

Github
● Understands source-driven development
● Has experience with continuous integration, delivery and testing

CumulusCI
● Has knowledge of the CumulusCI Suite and how to orchestrate development and deployment using command line tools
Journey - Next Steps
Understand Packages and Development Models
01 Trail: Build Apps Together with Package Development

02 Development Process with Continuous Integration


Trailhead: Build Apps Together with Package Development

Hands on with CumulusCI


03 Trailhead: Build Applications with CumulusCI

Snowfakery
04 Getting started with Snowfakery
Snowfakery sample recipes

Additional Resources
05 Branch Strategies
Building Templates as
Lego Blocks

CumulusCI
How often do you perform the same or similar task in different projects?
● I find myself repeating a lot of tasks and configurations in different projects
● Each project is unique and I never repeat the same tasks or configurations
Process Overview
Simplified view of the journey

Tools
● Visual Studio Code
● Salesforce Dev Hub enabled
● CumulusCI (CCI)
● SFDX is authorized to connect to the Dev Hub Org
● Github account connected with CumulusCI
● Github Desktop installed and configured

Github & CCI
● Create the Recipe Template Github Repository
● Configure the CCI Project settings

CumulusCI.yml
● What are the managed packages or ISV Apps that my solution will depend on?
● What features and settings should the Salesforce Org have?

Configuration
● Create a new Scratch Org based on the defined Salesforce Org configuration
● Start implementing the configuration
● Capture and extract the metadata related to the performed configuration
● Build CCI automation (custom tasks and flows) used to deploy your configuration

Test
● Test your template deployment in different Salesforce Orgs
● If using Metadeploy, test your installation plan in different Salesforce Orgs

Build Applications with CumulusCI
Github & CumulusCI (CCI)
Journey: Github repository and CCI project setup

Create Github Repository CumulusCI Project Setup


1. Open GitHub Desktop 1. Open the repository in your terminal by choosing
2. Select File→New Repository from the menu Repository→Open in Visual Studio Code (if you don’t see it
3. Enter the information for the repository below: at the bottom, open the Terminal window by choosing
Terminal→New Terminal)
a. Name: Nonprofit-Template
2. In the Terminal window, type: cci project init
b. Description: Starting Template for Nonprofit Cloud
3. Answer the series of prompts for information about your
c. Local Path: Choose GitHub folder, or create a new
project as follows.
one if you see a warning
a. Use default Project Name (press Return)
d. Git Ignore: None
b. Use default Package Name (press Return)
e. License: None
c. Managed package project: n
4. Click Create Repository
d. Use default API Version (press Return)
e. Use default source format sfdx (press Return)
f. This project extends NPSP:
i. Answer y
ii. Then answer 2
g. Accept the defaults for all remaining questions (press
Return)

When you’re done, you will see the following: Your project is now initialized for use with CumulusCI
Cumulusci.yml File
Journey: Dependencies & Automation

Dependencies
If required, you can add additional dependencies to be installed when the scratch org is created.
1. In GitHub Desktop, open your Nonprofit-Template repository by choosing Repository→Open in Visual Studio Code
2. In the Sidebar, under the Nonprofit-Template header find cumulusci.yml and click it to open it in the Editor area of VS Code.
3. Find the dependencies: section and add a new line as follows: - github: 'https://github.com/SalesforceFoundation/NPSP'
4. For example, add Outbound Funds to the Scratch org:

Dependencies Automation
5. As dependencies are added, you may need to also add their related automation to your current project. That is done by adding sources to your cumulusci.yml file. The example below shows a dependency automation for the Outbound Funds module:
New Scratch Org & Configuration Changes
Journey: Capturing Configuration Changes - 1 of 2

Create new Scratch Org
1. In GitHub Desktop, open your repository by choosing Repository→Open in Visual Studio Code
2. Type this command: cci flow run dev_org --org dev
3. Once the flow has completed, open the scratch org in a browser by typing: cci org browser dev
4. Go to the Installed Packages page in Setup to verify the Nonprofit Success Pack is installed correctly

Configuration Changes
In the example below, we added the new Application Submitted picklist value to the Opportunity Stage field:
New Scratch Org & Configuration Changes
Journey: Capturing Configuration Changes - 2 of 2

List the Configuration Changes
CCI can show you the changes that are being captured by executing the following command: cci task run list_changes --org dev
The example below shows the added Opportunity Stage picklist value and its related changes in BusinessProcess and StandardValueSet. It also shows a new Contact custom field [Gender__c] and its related changes in Layouts and Profiles:

Retrieve/Extract the Changes
Once all the changes are completed, the following CCI command retrieves the changes and includes them in your local Github repository: cci task run retrieve_changes --org dev -o exclude "Profile:" (the command excludes profiles as these are best handled separately)
Next Steps
Journey: Test, Review, Approve and Collaborate

What is next?
After you retrieve all the configuration changes, it is time to collaborate with your team:
1. Push your changes to a new Github branch
2. Collaborate with your team to test, fix, review and approve the changes
3. Merge the changes into the Main Github branch to make them available to anyone that uses this GitHub repository

Metadeploy
You can make it easier for your team to use or distribute the template by creating Metadeploy installation plans as illustrated below: