Aindump2go SAA-C03 PDF Exam 2023-Dec-11 by Theobald 332q VCE

This document provides sample questions and answers for the Amazon Web Services SAA-C03 certification exam. It recommends purchasing practice exam questions from SurePassExam to help prepare for and pass the certification. The document includes sample questions from various exam topics and provides explanations for the answers.

Uploaded by Saksham kumar
© All Rights Reserved

Recommend!!

Get the Full SAA-C03 dumps in VCE and PDF From SurePassExam
https://www.surepassexam.com/SAA-C03-exam-dumps.html (551 New Questions)

Amazon-Web-Services
Exam Questions SAA-C03
AWS Certified Solutions Architect - Associate (SAA-C03)

Passing Certification Exams Made Easy visit - https://www.surepassexam.com



NEW QUESTION 1
- (Exam Topic 1)
A solutions architect is designing the cloud architecture for a new application being deployed on AWS. The process should run in parallel while adding and
removing application nodes as needed based on the number of jobs to be processed. The processor application is stateless. The solutions architect must ensure
that the application is loosely coupled and the job items are durably stored.
Which design should the solutions architect use?

A. Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch configuration that uses the AMI. Create an Auto Scaling group using the launch configuration. Set the scaling policy for the Auto Scaling group to add and remove nodes based on CPU usage.
B. Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch configuration that uses the AMI. Create an Auto Scaling group using the launch configuration. Set the scaling policy for the Auto Scaling group to add and remove nodes based on network usage.
C. Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch template that uses the AMI. Create an Auto Scaling group using the launch template. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of items in the SQS queue.
D. Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch template that uses the AMI. Create an Auto Scaling group using the launch template. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of messages published to the SNS topic.

Answer: C

Explanation:
"Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of items in the SQS queue."
In this case we need a durable and loosely coupled solution for storing jobs. Amazon SQS is ideal for this use case, and the Auto Scaling group can be configured to scale dynamically based on the number of jobs waiting in the queue. To configure this scaling you can use the backlog per instance metric, with the target value being the acceptable backlog per instance to maintain. You can calculate these numbers as follows: to calculate your backlog per instance, start with the ApproximateNumberOfMessages queue attribute to determine the length of the SQS queue, then divide by the number of running instances.
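The backlog-per-instance arithmetic described above can be sketched as a small calculation. The numbers below are hypothetical; in a real deployment, `ApproximateNumberOfMessages` would come from the SQS GetQueueAttributes API and the instance count from the Auto Scaling group:

```python
import math

def backlog_per_instance(approx_messages: int, running_instances: int) -> float:
    """Backlog per instance = queue length / number of active workers."""
    return approx_messages / running_instances

def desired_capacity(approx_messages: int, acceptable_backlog: float) -> int:
    """Instances needed so no worker's backlog exceeds the acceptable target."""
    return math.ceil(approx_messages / acceptable_backlog)

# Hypothetical numbers: 1,500 queued jobs, 10 running instances, and each
# instance can comfortably work through a backlog of 100 jobs.
print(backlog_per_instance(1500, 10))   # current backlog per instance: 150.0
print(desired_capacity(1500, 100))      # instances the policy should target: 15
```

This is the same math a target-tracking scaling policy performs when it keeps the custom backlog-per-instance metric at the target value.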

NEW QUESTION 2
- (Exam Topic 1)
A company runs a shopping application that uses Amazon DynamoDB to store customer information. In case of data corruption, a solutions architect needs to
design a solution that meets a recovery point objective (RPO) of 15 minutes and a recovery time objective (RTO) of 1 hour.
What should the solutions architect recommend to meet these requirements?

A. Configure DynamoDB global tables. For RPO recovery, point the application to a different AWS Region.
B. Configure DynamoDB point-in-time recovery. For RPO recovery, restore to the desired point in time.
C. Export the DynamoDB data to Amazon S3 Glacier on a daily basis. For RPO recovery, import the data from S3 Glacier to DynamoDB.
D. Schedule Amazon Elastic Block Store (Amazon EBS) snapshots for the DynamoDB table every 15 minutes. For RPO recovery, restore the DynamoDB table by using the EBS snapshot.

Answer: B

Explanation:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/PointInTimeRecovery.html
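Point-in-time recovery is enabled per table through the UpdateContinuousBackups API. A minimal boto3 sketch follows; the table name is hypothetical, and the client call is commented out because it requires AWS credentials:

```python
# Parameters for DynamoDB's UpdateContinuousBackups API, which turns on
# point-in-time recovery (PITR) for a single table.
params = {
    "TableName": "CustomerInfo",  # hypothetical table name
    "PointInTimeRecoverySpecification": {"PointInTimeRecoveryEnabled": True},
}

# import boto3
# dynamodb = boto3.client("dynamodb")
# dynamodb.update_continuous_backups(**params)
```

Once enabled, RestoreTableToPointInTime can restore to any second within the retention window, which comfortably meets a 15-minute RPO.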

NEW QUESTION 3
- (Exam Topic 1)
A company is hosting a web application on AWS using a single Amazon EC2 instance that stores user-uploaded documents in an Amazon EBS volume. For better scalability and availability, the company duplicated the architecture and created a second EC2 instance and EBS volume in another Availability Zone, placing both behind an Application Load Balancer. After completing this change, users reported that each time they refreshed the website, they could see one subset of their documents or the other, but never all of the documents at the same time.
What should a solutions architect propose to ensure users see all of their documents at once?

A. Copy the data so both EBS volumes contain all the documents.
B. Configure the Application Load Balancer to direct a user to the server with the documents
C. Copy the data from both EBS volumes to Amazon EFS Modify the application to save new documents to Amazon EFS
D. Configure the Application Load Balancer to send the request to both servers Return each document from the correct server.

Answer: C

Explanation:
Amazon EFS provides file storage in the AWS Cloud. With Amazon EFS, you can create a file system, mount the file system on an Amazon EC2 instance, and then read and write data to and from your file system. You can mount an Amazon EFS file system in your VPC through the Network File System versions 4.0 and 4.1 (NFSv4) protocol. We recommend using a current generation Linux NFSv4.1 client, such as those found in the latest Amazon Linux, Red Hat, and Ubuntu AMIs, in conjunction with the Amazon EFS mount helper. For instructions, see Using the amazon-efs-utils Tools.
For a list of Amazon EC2 Linux Amazon Machine Images (AMIs) that support this protocol, see NFS Support. For some AMIs, you'll need to install an NFS client to
mount your file system on your Amazon EC2 instance. For instructions, see Installing the NFS Client.
You can access your Amazon EFS file system concurrently from multiple NFS clients, so applications that scale beyond a single connection can access a file
system. Amazon EC2 instances running in multiple Availability Zones within the same AWS Region can access the file system, so that many users can access and
share a common data source.
https://docs.aws.amazon.com/efs/latest/ug/how-it-works.html#how-it-works-ec2

NEW QUESTION 4


- (Exam Topic 1)
A company is planning to use an Amazon DynamoDB table for data storage. The company is concerned about cost optimization. The table will not be used on
most mornings. In the evenings, the read and write traffic will often be unpredictable. When traffic spikes occur, they will happen very quickly.
What should a solutions architect recommend?

A. Create a DynamoDB table in on-demand capacity mode.


B. Create a DynamoDB table with a global secondary index.
C. Create a DynamoDB table with provisioned capacity and auto scaling.
D. Create a DynamoDB table in provisioned capacity mode, and configure it as a global table.

Answer: A
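On-demand capacity is selected at table creation via the BillingMode parameter. A hedged sketch of the CreateTable call (table and attribute names are hypothetical; the client call is commented out because it needs AWS credentials):

```python
# Parameters for DynamoDB's CreateTable API using on-demand capacity
# (BillingMode PAY_PER_REQUEST), so the table absorbs sudden, unpredictable
# spikes with no capacity planning and costs nothing when idle.
table_spec = {
    "TableName": "SpikyTrafficTable",  # hypothetical table name
    "AttributeDefinitions": [{"AttributeName": "pk", "AttributeType": "S"}],
    "KeySchema": [{"AttributeName": "pk", "KeyType": "HASH"}],
    "BillingMode": "PAY_PER_REQUEST",  # on-demand: pay per read/write request
}

# import boto3
# boto3.client("dynamodb").create_table(**table_spec)
```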

NEW QUESTION 5
- (Exam Topic 1)
An application allows users at a company's headquarters to access product data. The product data is stored in an Amazon RDS MySQL DB instance. The
operations team has isolated an application performance slowdown and wants to separate read traffic from write traffic. A solutions architect needs to optimize the
application's performance quickly.
What should the solutions architect recommend?

A. Change the existing database to a Multi-AZ deployment. Serve the read requests from the primary Availability Zone.
B. Change the existing database to a Multi-AZ deployment. Serve the read requests from the secondary Availability Zone.
C. Create read replicas for the database. Configure the read replicas with half of the compute and storage resources as the source database.
D. Create read replicas for the database. Configure the read replicas with the same compute and storage resources as the source database.

Answer: D

Explanation:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_MySQL.Replication.ReadReplicas.html

NEW QUESTION 6
- (Exam Topic 1)
A company hosts a containerized web application on a fleet of on-premises servers that process incoming requests. The number of requests is growing quickly.
The on-premises servers cannot handle the increased number of requests. The company wants to move the application to AWS with minimum code changes and
minimum development effort.
Which solution will meet these requirements with the LEAST operational overhead?

A. Use AWS Fargate on Amazon Elastic Container Service (Amazon ECS) to run the containerized web application with Service Auto Scaling. Use an Application Load Balancer to distribute the incoming requests.
B. Use two Amazon EC2 instances to host the containerized web application. Use an Application Load Balancer to distribute the incoming requests.
C. Use AWS Lambda with new code that uses one of the supported languages. Create multiple Lambda functions to support the load. Use Amazon API Gateway as an entry point to the Lambda functions.
D. Use a high performance computing (HPC) solution such as AWS ParallelCluster to establish an HPC cluster that can process the incoming requests at the appropriate scale.

Answer: A

NEW QUESTION 7
- (Exam Topic 1)
A company that hosts its web application on AWS wants to ensure all Amazon EC2 instances, Amazon RDS DB instances, and Amazon Redshift clusters are configured with tags. The company wants to minimize the effort of configuring and operating this check.
What should a solutions architect do to accomplish this?

A. Use AWS Config rules to define and detect resources that are not properly tagged.
B. Use Cost Explorer to display resources that are not properly tagged. Tag those resources manually.
C. Write API calls to check all resources for proper tag allocation. Periodically run the code on an EC2 instance.
D. Write API calls to check all resources for proper tag allocation. Schedule an AWS Lambda function through Amazon CloudWatch to periodically run the code.

Answer: A
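AWS Config ships a managed rule, REQUIRED_TAGS, for exactly this check. A hedged sketch of the PutConfigRule parameters; the rule name and tag key are hypothetical, and the client call is commented out because it needs AWS credentials:

```python
import json

# Parameters for AWS Config's PutConfigRule API using the AWS managed
# REQUIRED_TAGS rule, which flags resources that are missing the given
# tag keys. Scope limits the check to the resource types in the question.
rule = {
    "ConfigRuleName": "required-tags-check",  # hypothetical rule name
    "Source": {"Owner": "AWS", "SourceIdentifier": "REQUIRED_TAGS"},
    "InputParameters": json.dumps({"tag1Key": "CostCenter"}),  # hypothetical tag key
    "Scope": {
        "ComplianceResourceTypes": [
            "AWS::EC2::Instance",
            "AWS::RDS::DBInstance",
            "AWS::Redshift::Cluster",
        ]
    },
}

# import boto3
# boto3.client("config").put_config_rule(ConfigRule=rule)
```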

NEW QUESTION 8
- (Exam Topic 1)
A company recently signed a contract with an AWS Managed Service Provider (MSP) Partner for help with an application migration initiative. A solutions architect
needs to share an Amazon Machine Image (AMI) from an existing AWS account with the MSP Partner's AWS account. The AMI is backed by Amazon Elastic
Block Store (Amazon EBS) and uses a customer managed customer master key (CMK) to encrypt EBS volume snapshots.
What is the MOST secure way for the solutions architect to share the AMI with the MSP Partner's AWS account?

A. Make the encrypted AMI and snapshots publicly available. Modify the CMK's key policy to allow the MSP Partner's AWS account to use the key.
B. Modify the launchPermission property of the AMI. Share the AMI with the MSP Partner's AWS account only. Modify the CMK's key policy to allow the MSP Partner's AWS account to use the key.
C. Modify the launchPermission property of the AMI. Share the AMI with the MSP Partner's AWS account only. Modify the CMK's key policy to trust a new CMK that is owned by the MSP Partner for encryption.
D. Export the AMI from the source account to an Amazon S3 bucket in the MSP Partner's AWS account. Encrypt the S3 bucket with a CMK that is owned by the MSP Partner. Copy and launch the AMI in the MSP Partner's AWS account.

Answer: B

Explanation:
Share the existing KMS key with the MSP external account because it has already been used to encrypt the AMI snapshot.
https://docs.aws.amazon.com/kms/latest/developerguide/key-policy-modifying-external-accounts.html
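A cross-account grant in a KMS key policy is just an extra policy statement naming the external account as principal. A minimal sketch, assuming a hypothetical partner account ID; the exact action list would depend on whether the partner copies the snapshot or launches instances directly:

```python
import json

MSP_ACCOUNT_ID = "111122223333"  # hypothetical partner account ID

# Key-policy statement granting the partner account permission to use the
# CMK when copying or launching the shared encrypted AMI.
statement = {
    "Sid": "AllowMSPPartnerUseOfTheKey",
    "Effect": "Allow",
    "Principal": {"AWS": f"arn:aws:iam::{MSP_ACCOUNT_ID}:root"},
    "Action": [
        "kms:Decrypt",
        "kms:DescribeKey",
        "kms:CreateGrant",
        "kms:ReEncrypt*",
    ],
    "Resource": "*",  # in a key policy, "*" means this key
}
policy = {"Version": "2012-10-17", "Statement": [statement]}
print(json.dumps(policy, indent=2))
```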

NEW QUESTION 9
- (Exam Topic 1)
A company uses NFS to store large video files in on-premises network-attached storage. Each video file ranges in size from 1 MB to 500 GB. The total storage is 70 TB and is no longer growing. The company decides to migrate the video files to Amazon S3. The company must migrate the video files as soon as possible while using the least possible network bandwidth.
Which solution will meet these requirements?

A. Create an S3 bucket. Create an IAM role that has permissions to write to the S3 bucket. Use the AWS CLI to copy all files locally to the S3 bucket.
B. Create an AWS Snowball Edge job. Receive a Snowball Edge device on premises. Use the Snowball Edge client to transfer data to the device. Return the device so that AWS can import the data into Amazon S3.
C. Deploy an S3 File Gateway on premises. Create a public service endpoint to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.
D. Set up an AWS Direct Connect connection between the on-premises network and AWS. Deploy an S3 File Gateway on premises. Create a public virtual interface (VIF) to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.

Answer: B
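The bandwidth trade-off behind the Snowball answer can be checked with rough arithmetic. The 100 Mbps link speed and 80% utilization below are assumptions for illustration, not figures from the question:

```python
def days_to_transfer(total_tb: float, link_mbps: float, utilization: float = 0.8) -> float:
    """Rough number of days needed to push `total_tb` terabytes over a link."""
    total_bits = total_tb * 1e12 * 8                      # decimal TB -> bits
    seconds = total_bits / (link_mbps * 1e6 * utilization)
    return seconds / 86_400                               # seconds per day

# 70 TB over a hypothetical 100 Mbps line at 80% utilization:
print(round(days_to_transfer(70, 100), 1))  # roughly 81 days
```

At that rate the online options would take months and saturate the link, whereas a Snowball Edge device typically turns the whole job around in about a week while using almost no network bandwidth.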

NEW QUESTION 10
- (Exam Topic 1)
A development team needs to host a website that will be accessed by other teams. The website contents consist of HTML, CSS, client-side JavaScript, and images. Which method is the MOST cost-effective for hosting the website?

A. Containerize the website and host it in AWS Fargate.


B. Create an Amazon S3 bucket and host the website there
C. Deploy a web server on an Amazon EC2 instance to host the website.
D. Configure an Application Load Balancer with an AWS Lambda target that uses the Express.js framework.

Answer: B

Explanation:
In Static Websites, Web pages are returned by the server which are prebuilt. They use simple languages such as HTML, CSS, or JavaScript.
There is no processing of content on the server (according to the user) in Static Websites. Web pages are returned by the server with no change therefore, static
Websites are fast.
There is no interaction with databases.
Also, they are less costly as the host does not need to support server-side processing with different languages.
============
In Dynamic Websites, Web pages are returned by the server which are processed during runtime means they are not prebuilt web pages but they are built during
runtime according to the user’s demand.
These use server-side scripting languages such as PHP, Node.js, ASP.NET and many more supported by the server.
So, they are slower than static websites but updates and interaction with databases are possible.
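Since a static site is just objects in a bucket, enabling it is a pair of API calls. A hedged sketch, assuming a hypothetical bucket name and that the site is intended to be publicly readable; the client calls are commented out because they need AWS credentials:

```python
import json

BUCKET = "example-team-site"  # hypothetical bucket name

# Website configuration for S3's PutBucketWebsite API, plus the public-read
# bucket policy a public static site needs.
website_config = {
    "IndexDocument": {"Suffix": "index.html"},
    "ErrorDocument": {"Key": "error.html"},
}
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{BUCKET}/*",
    }],
}

# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_website(Bucket=BUCKET, WebsiteConfiguration=website_config)
# s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```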

NEW QUESTION 10
- (Exam Topic 1)
A company is launching a new application and will display application metrics on an Amazon CloudWatch dashboard. The company’s product manager needs to
access this dashboard periodically. The product manager does not have an AWS account. A solutions architect must provide access to the product manager by following the principle of least privilege.
Which solution will meet these requirements?

A. Share the dashboard from the CloudWatch console. Enter the product manager's email address, and complete the sharing steps. Provide a shareable link for the dashboard to the product manager.
B. Create an IAM user specifically for the product manager. Attach the CloudWatchReadOnlyAccess managed policy to the user. Share the new login credentials with the product manager. Share the browser URL of the correct dashboard with the product manager.
C. Create an IAM user for the company's employees. Attach the ViewOnlyAccess AWS managed policy to the IAM user. Share the new login credentials with the product manager. Ask the product manager to navigate to the CloudWatch console and locate the dashboard by name in the Dashboards section.
D. Deploy a bastion server in a public subnet. When the product manager requires access to the dashboard, start the server and share the RDP credentials. On the bastion server, ensure that the browser is configured to open the dashboard URL with cached AWS credentials that have appropriate permissions to view the dashboard.

Answer: B

NEW QUESTION 14
- (Exam Topic 1)
A company hosts an application on multiple Amazon EC2 instances. The application processes messages from an Amazon SQS queue, writes to an Amazon RDS table, and deletes the message from the queue. Occasional duplicate records are found in the RDS table. The SQS queue does not contain any duplicate messages.
What should a solutions architect do to ensure messages are being processed once only?

A. Use the CreateQueue API call to create a new queue


B. Use the Add Permission API call to add appropriate permissions
C. Use the ReceiveMessage API call to set an appropriate wait time
D. Use the ChangeMessageVisibility API call to increase the visibility timeout

Answer: D

Explanation:
The visibility timeout begins when Amazon SQS returns a message. During this time, the consumer processes and deletes the message. However, if the
consumer fails before deleting the message and your system doesn't call the DeleteMessage action for that message before the visibility timeout expires, the
message becomes visible to other consumers and the message is received again. If a message must be received only once, your consumer should delete it within
the duration of the visibility timeout. https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-visibility-timeout.html
Keyword: the SQS queue writes to an Amazon RDS table. From this, option D is the best fit and the other options are ruled out (option A: you cannot fix duplicate processing by introducing another queue; option B: only adds permissions; option C: only affects how messages are retrieved). FIFO queues are designed to never introduce duplicate messages. However, your message producer might introduce duplicates in certain scenarios: for example, if the producer sends a message, does not receive a response, and then resends the same message. Amazon SQS APIs provide deduplication functionality that prevents your message producer from sending duplicates. Any duplicates introduced by the message producer are removed within a 5-minute deduplication interval. For standard queues, you might occasionally receive a duplicate copy of a message (at-least-once delivery). If you use a standard queue, you must design your applications to be idempotent (that is, they must not be affected adversely when processing the same message more than once).
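Sizing the visibility timeout can be sketched as a small calculation plus the API call it feeds. The 2x safety factor is an assumption for illustration; the rule of thumb is that the timeout must exceed the worst-case time to process and delete a message:

```python
# Pick a visibility timeout longer than the worst-case processing time so a
# message is not redelivered while a healthy consumer is still working on it.
def safe_visibility_timeout(max_processing_seconds: int, safety_factor: float = 2.0) -> int:
    timeout = int(max_processing_seconds * safety_factor)
    return min(timeout, 43_200)  # SQS caps the visibility timeout at 12 hours

print(safe_visibility_timeout(90))  # 180 seconds for a 90-second worst case

# import boto3
# sqs = boto3.client("sqs")
# sqs.change_message_visibility(
#     QueueUrl=queue_url,            # hypothetical queue URL
#     ReceiptHandle=receipt_handle,  # from the ReceiveMessage response
#     VisibilityTimeout=safe_visibility_timeout(90),
# )
```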

NEW QUESTION 15
- (Exam Topic 1)
A company's containerized application runs on an Amazon EC2 instance. The application needs to download security certificates before it can communicate with
other business applications. The company wants a highly secure solution to encrypt and decrypt the certificates in near real time. The solution also needs to store
data in highly available storage after the data is encrypted.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create AWS Secrets Manager secrets for encrypted certificates. Manually update the certificates as needed. Control access to the data by using fine-grained IAM access.
B. Create an AWS Lambda function that uses the Python cryptography library to receive and perform encryption operations. Store the function in an Amazon S3 bucket.
C. Create an AWS Key Management Service (AWS KMS) customer managed key. Allow the EC2 role to use the KMS key for encryption operations. Store the encrypted data on Amazon S3.
D. Create an AWS Key Management Service (AWS KMS) customer managed key. Allow the EC2 role to use the KMS key for encryption operations. Store the encrypted data on Amazon Elastic Block Store (Amazon EBS) volumes.

Answer: D

NEW QUESTION 20
- (Exam Topic 1)
A company runs its infrastructure on AWS and has a registered base of 700,000 users for its document management application. The company intends to create a product that converts large .pdf files to .jpg image files. The .pdf files average 5 MB in size. The company needs to store the original files and the converted files. A solutions architect must design a scalable solution to accommodate demand that will grow rapidly over time.
Which solution meets these requirements MOST cost-effectively?

A. Save the .pdf files to Amazon S3. Configure an S3 PUT event to invoke an AWS Lambda function to convert the files to .jpg format and store them back in Amazon S3.
B. Save the .pdf files to Amazon DynamoDB. Use the DynamoDB Streams feature to invoke an AWS Lambda function to convert the files to .jpg format and store them back in DynamoDB.
C. Upload the .pdf files to an AWS Elastic Beanstalk application that includes Amazon EC2 instances, Amazon Elastic Block Store (Amazon EBS) storage, and an Auto Scaling group. Use a program in the EC2 instances to convert the files to .jpg format. Save the .pdf files and the .jpg files in the EBS store.
D. Upload the .pdf files to an AWS Elastic Beanstalk application that includes Amazon EC2 instances, Amazon Elastic File System (Amazon EFS) storage, and an Auto Scaling group. Use a program in the EC2 instances to convert the files to .jpg format. Save the .pdf files and the .jpg files in the EFS store.

Answer: A

Explanation:
Elastic Beanstalk is expensive, and DynamoDB has a 400 KB maximum item size, so it cannot store the 5 MB files. Lambda and S3 are the right fit.
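The event-driven design in option A comes down to a small Lambda handler. A minimal sketch: the bucket name, object key, and `convert_pdf_to_jpg` helper are hypothetical, and the actual PDF-to-JPG conversion and S3 calls are stubbed out as comments:

```python
# Sketch of the Lambda handler for option A: triggered by an S3 PUT event,
# it reads the uploaded .pdf and writes a converted .jpg back to S3.
def handler(event, context):
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]
    out_key = key.rsplit(".", 1)[0] + ".jpg"  # report.pdf -> report.jpg
    # s3 = boto3.client("s3")
    # pdf_bytes = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    # jpg_bytes = convert_pdf_to_jpg(pdf_bytes)   # hypothetical helper
    # s3.put_object(Bucket=bucket, Key=out_key, Body=jpg_bytes)
    return {"bucket": bucket, "source": key, "output": out_key}

# Shape of a real S3 PUT event notification (abridged):
sample_event = {"Records": [{"s3": {
    "bucket": {"name": "docs-bucket"},         # hypothetical bucket
    "object": {"key": "uploads/report.pdf"},
}}]}
print(handler(sample_event, None))
```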

NEW QUESTION 21


- (Exam Topic 1)
A solutions architect is developing a multiple-subnet VPC architecture. The solution will consist of six subnets in two Availability Zones. The subnets are defined as
public, private and dedicated for databases. Only the Amazon EC2 instances running in the private subnets should be able to access a database.
Which solution meets these requirements?

A. Create a new route table that excludes the route to the public subnets' CIDR blocks. Associate the route table with the database subnets.
B. Create a security group that denies ingress from the security group used by instances in the public subnets. Attach the security group to an Amazon RDS DB instance.
C. Create a security group that allows ingress from the security group used by instances in the private subnets. Attach the security group to an Amazon RDS DB instance.
D. Create a new peering connection between the public subnets and the private subnets. Create a different peering connection between the private subnets and the database subnets.

Answer: C

Explanation:
Security groups are stateful. All inbound traffic is blocked by default. If you create an inbound rule allowing traffic in, that traffic is automatically allowed back out again. You cannot block specific IP addresses using security groups (instead use network access control lists).
"You can specify allow rules, but not deny rules." "When you first create a security group, it has no inbound rules. Therefore, no inbound traffic originating from
another host to your instance is allowed until you add inbound rules to the security group." Source:
https://docs.aws.amazon.com/vpc/latest/userguide/VPC_SecurityGroups.html#VPCSecurityGroups
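An SG-to-SG allow rule like option C's is expressed through UserIdGroupPairs. A hedged sketch of the AuthorizeSecurityGroupIngress parameters; the group IDs are hypothetical, and port 3306 (MySQL) is an assumption since the question does not name the database engine:

```python
# Parameters for EC2's AuthorizeSecurityGroupIngress API: the database
# security group admits traffic only from the security group attached to
# the private-subnet application instances, not from any CIDR range.
params = {
    "GroupId": "sg-0db0000000000000d",  # security group on the RDS DB instance
    "IpPermissions": [{
        "IpProtocol": "tcp",
        "FromPort": 3306,   # assumed MySQL port; use your engine's port
        "ToPort": 3306,
        "UserIdGroupPairs": [
            {"GroupId": "sg-0app000000000000a"}  # private-subnet app instances
        ],
    }],
}

# import boto3
# boto3.client("ec2").authorize_security_group_ingress(**params)
```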

NEW QUESTION 22
- (Exam Topic 1)
A company is storing sensitive user information in an Amazon S3 bucket. The company wants to provide secure access to this bucket from the application tier running on Amazon EC2 instances inside a VPC.
Which combination of steps should a solutions architect take to accomplish this? (Select TWO.)

A. Configure a VPC gateway endpoint for Amazon S3 within the VPC


B. Create a bucket policy to make the objects in the S3 bucket public
C. Create a bucket policy that limits access to only the application tier running in the VPC
D. Create an IAM user with an S3 access policy and copy the IAM credentials to the EC2 instance
E. Create a NAT instance and have the EC2 instances use the NAT instance to access the S3 bucket

Answer: AC

Explanation:
https://aws.amazon.com/premiumsupport/knowledge-center/s3-private-connection-no-authentication/
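The two selected steps work together: the gateway endpoint gives the VPC a private path to S3, and a bucket policy restricts access to that path. A minimal sketch using the `aws:SourceVpce` condition key; the endpoint ID and bucket name are hypothetical:

```python
import json

VPCE_ID = "vpce-0abc123456789def0"  # hypothetical gateway endpoint ID

# Bucket policy that denies any S3 action on the sensitive bucket unless
# the request arrives through the VPC gateway endpoint.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyAccessOutsideVpcEndpoint",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": [
            "arn:aws:s3:::sensitive-user-data",      # hypothetical bucket
            "arn:aws:s3:::sensitive-user-data/*",
        ],
        "Condition": {"StringNotEquals": {"aws:SourceVpce": VPCE_ID}},
    }],
}
print(json.dumps(policy, indent=2))
```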

NEW QUESTION 23
- (Exam Topic 1)
A company is running a business-critical web application on Amazon EC2 instances behind an Application Load Balancer. The EC2 instances are in an Auto
Scaling group. The application uses an Amazon Aurora PostgreSQL database that is deployed in a single Availability Zone. The company wants the application to
be highly available with minimum downtime and minimum loss of data.
Which solution will meet these requirements with the LEAST operational effort?

A. Place the EC2 instances in different AWS Regions. Use Amazon Route 53 health checks to redirect traffic. Use Aurora PostgreSQL Cross-Region Replication.
B. Configure the Auto Scaling group to use multiple Availability Zones. Configure the database as Multi-AZ. Configure an Amazon RDS Proxy instance for the database.
C. Configure the Auto Scaling group to use one Availability Zone. Generate hourly snapshots of the database. Recover the database from the snapshots in the event of a failure.
D. Configure the Auto Scaling group to use multiple AWS Regions. Write the data from the application to Amazon S3. Use S3 Event Notifications to launch an AWS Lambda function to write the data to the database.

Answer: B

NEW QUESTION 26
- (Exam Topic 1)
A company is developing a two-tier web application on AWS. The company's developers have deployed the application on an Amazon EC2 instance that connects
directly to a backend Amazon RDS database. The company must not hardcode database credentials in the application. The company must also implement a
solution to automatically rotate the database credentials on a regular basis.
Which solution will meet these requirements with the LEAST operational overhead?

A. Store the database credentials in the instance metadata. Use Amazon EventBridge (Amazon CloudWatch Events) rules to run a scheduled AWS Lambda function that updates the RDS credentials and instance metadata at the same time.
B. Store the database credentials in a configuration file in an encrypted Amazon S3 bucket. Use Amazon EventBridge (Amazon CloudWatch Events) rules to run a scheduled AWS Lambda function that updates the RDS credentials and the credentials in the configuration file at the same time. Use S3 Versioning to ensure the ability to fall back to previous values.
C. Store the database credentials as a secret in AWS Secrets Manager. Turn on automatic rotation for the secret. Attach the required permission to the EC2 role to grant access to the secret.
D. Store the database credentials as encrypted parameters in AWS Systems Manager Parameter Store. Turn on automatic rotation for the encrypted parameters. Attach the required permission to the EC2 role to grant access to the encrypted parameters.

Answer: C

Explanation:
https://docs.aws.amazon.com/secretsmanager/latest/userguide/create_database_secret.html
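With Secrets Manager rotation on, the application fetches fresh credentials at runtime instead of hardcoding them. A minimal sketch; the secret name is hypothetical, and the `username`/`password` keys follow the JSON structure Secrets Manager uses for database secrets:

```python
import json

# Parse the SecretString returned by Secrets Manager's GetSecretValue API.
def parse_db_secret(secret_string: str) -> tuple:
    secret = json.loads(secret_string)
    return secret["username"], secret["password"]

# import boto3
# sm = boto3.client("secretsmanager")
# raw = sm.get_secret_value(SecretId="prod/app/mysql")["SecretString"]
sample = '{"username": "appuser", "password": "rotated-by-secrets-manager"}'
print(parse_db_secret(sample))
```

Because the secret is fetched by name on each connection (or cached briefly), rotation never requires redeploying the application.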

NEW QUESTION 27
- (Exam Topic 1)
A company wants to run its critical applications in containers to meet requirements for scalability and availability. The company prefers to focus on maintenance of the critical applications. The company does not want to be responsible for provisioning and managing the underlying infrastructure that runs the containerized workload.
What should a solutions architect do to meet those requirements?

A. Use Amazon EC2 instances, and install Docker on the instances
B. Use Amazon Elastic Container Service (Amazon ECS) on Amazon EC2 worker nodes
C. Use Amazon Elastic Container Service (Amazon ECS) on AWS Fargate
D. Use Amazon EC2 instances from an Amazon Elastic Container Service (Amazon ECS)-optimized Amazon Machine Image (AMI).

Answer: C

Explanation:
Use Amazon ECS on AWS Fargate, since the requirements are scalability and availability without having to provision and manage the underlying infrastructure that runs the containerized workload. https://docs.aws.amazon.com/AmazonECS/latest/userguide/what-is-fargate.html

NEW QUESTION 29
- (Exam Topic 1)
A company needs the ability to analyze the log files of its proprietary application. The logs are stored in JSON format in an Amazon S3 bucket. Queries will be simple and will run on demand. A solutions architect needs to perform the analysis with minimal changes to the existing architecture.

A. Use Amazon Redshift to load all the content into one place and run the SQL queries as needed
B. Use Amazon CloudWatch Logs to store the logs Run SQL queries as needed from the Amazon CloudWatch console
C. Use Amazon Athena directly with Amazon S3 to run the queries as needed
D. Use AWS Glue to catalog the logs Use a transient Apache Spark cluster on Amazon EMR to run the SQL queries as needed

Answer: C

Explanation:
Amazon Athena can query JSON data directly in Amazon S3 with standard SQL, on demand, with no infrastructure to manage and no changes to the existing architecture.
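As a sketch of option C, the parameters for an Athena `StartQueryExecution` call against the JSON logs might look like the following (the database, table, and bucket names here are hypothetical, not from the question):

```python
def build_athena_query_params(database, output_bucket):
    """Build parameters for athena.start_query_execution (boto3).

    Athena reads the JSON objects in place in S3; only an external
    table definition (e.g. created with a JSON SerDe) is needed --
    no servers and no data loading.
    """
    return {
        "QueryString": (
            "SELECT status, COUNT(*) AS hits "
            "FROM app_logs GROUP BY status"  # hypothetical table name
        ),
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {
            "OutputLocation": f"s3://{output_bucket}/athena-results/"
        },
    }

params = build_athena_query_params("logs_db", "example-results-bucket")
```

The same dictionary would be passed as keyword arguments to `boto3.client("athena").start_query_execution(**params)`.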

NEW QUESTION 34
- (Exam Topic 1)
A solutions architect is using Amazon S3 to design the storage architecture of a new digital media application. The media files must be resilient to the loss of an
Availability Zone Some files are accessed frequently while other files are rarely accessed in an unpredictable pattern. The solutions architect must minimize the
costs of storing and retrieving the media files.
Which storage option meets these requirements?

A. S3 Standard
B. S3 Intelligent-Tiering
C. S3 Standard-Infrequent Access (S3 Standard-IA)
D. S3 One Zone-Infrequent Access (S3 One Zone-IA)

Answer: B

Explanation:
S3 Intelligent-Tiering - Perfect use case when you don't know the frequency of access or irregular patterns of usage.
Amazon S3 offers a range of storage classes designed for different use cases. These include S3 Standard for general-purpose storage of frequently accessed
data; S3 Intelligent-Tiering for data with unknown or changing access patterns; S3 Standard-Infrequent Access (S3 Standard-IA) and S3 One Zone-Infrequent
Access (S3 One Zone-IA) for long-lived, but less frequently accessed data; and Amazon S3 Glacier (S3 Glacier) and Amazon S3 Glacier Deep Archive (S3 Glacier
Deep Archive) for long-term archive and digital preservation. If you have data residency requirements that can’t be met by an existing AWS Region, you can use
the S3 Outposts storage class to store your S3 data on-premises. Amazon S3 also offers capabilities to manage your data throughout its lifecycle. Once an S3
Lifecycle policy is set, your data will automatically transfer to a different storage class without any changes to your application.
https://aws.amazon.com/getting-started/hands-on/getting-started-using-amazon-s3-intelligent-tiering/?nc1=h_ls
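The storage class is chosen per object at upload time; a minimal sketch of storing a media file in S3 Intelligent-Tiering (the bucket and key names are hypothetical):

```python
def build_put_object_params(bucket, key, body):
    """Build parameters for s3.put_object (boto3) so the object lands
    in S3 Intelligent-Tiering, which then moves it between access
    tiers automatically based on observed access patterns."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "StorageClass": "INTELLIGENT_TIERING",
    }

params = build_put_object_params("example-media", "videos/clip01.mp4", b"...")
```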

NEW QUESTION 39
- (Exam Topic 1)
A company's application integrates with multiple software-as-a-service (SaaS) sources for data collection. The company runs Amazon EC2 instances to receive
the data and to upload the data to an Amazon S3 bucket for analysis. The same EC2 instance that receives and uploads the data also sends a notification to the
user when an upload is complete. The company has noticed slow application performance and wants to improve the performance as much as possible.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create an Auto Scaling group so that EC2 instances can scale out. Configure an S3 event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.
B. Create an Amazon AppFlow flow to transfer data between each SaaS source and the S3 bucket. Configure an S3 event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for each SaaS source to send output data. Configure the S3 bucket as the rule's target. Create a second EventBridge (CloudWatch Events) rule to send events when the upload to the S3 bucket is complete. Configure an Amazon Simple Notification Service (Amazon SNS) topic as the second rule's target.
D. Create a Docker container to use instead of an EC2 instance. Host the containerized application on Amazon Elastic Container Service (Amazon ECS). Configure Amazon CloudWatch Container Insights to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.

Answer: B

Explanation:
Amazon AppFlow is a fully managed integration service that enables you to securely transfer data between Software-as-a-Service (SaaS) applications like
Salesforce, SAP, Zendesk, Slack, and ServiceNow, and AWS services like Amazon S3 and Amazon Redshift, in just a few clicks.
https://aws.amazon.com/appflow/

NEW QUESTION 40
- (Exam Topic 1)
A company uses 50 TB of data for reporting. The company wants to move this data from on premises to AWS A custom application in the company's data center
runs a weekly data transformation job. The company plans to pause the application until the data transfer is complete and needs to begin the transfer process as
soon as possible.
The data center does not have any available network bandwidth for additional workloads A solutions architect must transfer the data and must configure the
transformation job to continue to run in the AWS Cloud
Which solution will meet these requirements with the LEAST operational overhead?

A. Use AWS DataSync to move the data. Create a custom transformation job by using AWS Glue.
B. Order an AWS Snowcone device to move the data. Deploy the transformation application to the device.
C. Order an AWS Snowball Edge Storage Optimized device. Copy the data to the device. Create a custom transformation job by using AWS Glue.
D. Order an AWS Snowball Edge Storage Optimized device that includes Amazon EC2 compute. Copy the data to the device. Create a new EC2 instance on AWS to run the transformation application.

Answer: C

NEW QUESTION 45
- (Exam Topic 1)
A company is implementing a new business application. The application runs on two Amazon EC2 instances and uses an Amazon S3 bucket for document
storage. A solutions architect needs to ensure that the EC2 instances can access the S3 bucket.
What should the solutions architect do to meet this requirement?

A. Create an IAM role that grants access to the S3 bucket. Attach the role to the EC2 instances.
B. Create an IAM policy that grants access to the S3 bucket. Attach the policy to the EC2 instances.
C. Create an IAM group that grants access to the S3 bucket. Attach the group to the EC2 instances.
D. Create an IAM user that grants access to the S3 bucket. Attach the user account to the EC2 instances.

Answer: A

Explanation:
https://aws.amazon.com/premiumsupport/knowledge-center/ec2-instance-access-s3-bucket/
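The role in option A needs two documents: a trust policy that lets EC2 assume the role, and a permissions policy for the bucket. A sketch, with a hypothetical role and bucket name:

```python
import json

def build_ec2_s3_role(role_name, bucket):
    """Return the trust policy and permissions policy for an IAM role
    that EC2 instances assume (via an instance profile) to reach S3."""
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }
    permissions_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }],
    }
    # The role would be created with:
    # iam.create_role(RoleName=role_name,
    #                 AssumeRolePolicyDocument=json.dumps(trust_policy))
    return trust_policy, permissions_policy

trust, perms = build_ec2_s3_role("app-doc-role", "example-doc-bucket")
```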

NEW QUESTION 47
- (Exam Topic 1)
A company is preparing to deploy a new serverless workload. A solutions architect must use the principle of least privilege to configure permissions that will be
used to run an AWS Lambda function. An Amazon EventBridge (Amazon CloudWatch Events) rule will invoke the function.
Which solution meets these requirements?

A. Add an execution role to the function with lambda:InvokeFunction as the action and * as the principal.
B. Add an execution role to the function with lambda:InvokeFunction as the action and Service:amazonaws.com as the principal.
C. Add a resource-based policy to the function with lambda:'* as the action and Service:events.amazonaws.com as the principal.
D. Add a resource-based policy to the function with lambda:InvokeFunction as the action and Service:events.amazonaws.com as the principal.

Answer: D

Explanation:
https://docs.aws.amazon.com/eventbridge/latest/userguide/resource-based-policies-eventbridge.html#lambda-pe
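Option D corresponds to a `lambda.add_permission` call; a sketch of its parameters (the function name and rule ARN are hypothetical):

```python
def build_invoke_permission(function_name, rule_arn):
    """Build parameters for lambda.add_permission (boto3): a
    resource-based policy statement that lets EventBridge -- and only
    the named rule, via SourceArn -- invoke the function, which is the
    least-privilege shape."""
    return {
        "FunctionName": function_name,
        "StatementId": "allow-eventbridge-rule",
        "Action": "lambda:InvokeFunction",
        "Principal": "events.amazonaws.com",
        "SourceArn": rule_arn,
    }

params = build_invoke_permission(
    "nightly-job",
    "arn:aws:events:us-east-1:123456789012:rule/nightly-trigger",
)
```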

NEW QUESTION 49
- (Exam Topic 1)
An image-processing company has a web application that users use to upload images. The application uploads the images into an Amazon S3 bucket. The
company has set up S3 event notifications to publish the object creation events to an Amazon Simple Queue Service (Amazon SQS) standard queue. The SQS
queue serves as the event source for an AWS Lambda function that processes the images and sends the results to users through email.
Users report that they are receiving multiple email messages for every uploaded image. A solutions architect determines that SQS messages are invoking the
Lambda function more than once, resulting in multiple email messages.
What should the solutions architect do to resolve this issue with the LEAST operational overhead?

A. Set up long polling in the SQS queue by increasing the ReceiveMessage wait time to 30 seconds.
B. Change the SQS standard queue to an SQS FIFO queue. Use the message deduplication ID to discard duplicate messages.
C. Increase the visibility timeout in the SQS queue to a value that is greater than the total of the function timeout and the batch window timeout.
D. Modify the Lambda function to delete each message from the SQS queue immediately after the message is read before processing.

Answer: C
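The visibility-timeout fix maps to an `sqs.set_queue_attributes` call: while a message is in flight to Lambda it stays hidden for the visibility timeout, so making that longer than the time Lambda may hold the message prevents redelivery. A sketch with hypothetical queue URL and timings:

```python
def build_visibility_params(queue_url, function_timeout_s, batch_window_s):
    """Build parameters for sqs.set_queue_attributes (boto3). If the
    visibility timeout is shorter than the time Lambda can hold a
    message (function timeout + batch window), SQS redelivers the
    message and the user gets a duplicate email."""
    visibility = function_timeout_s + batch_window_s + 30  # safety margin
    return {
        "QueueUrl": queue_url,
        # Attribute values are strings in the SQS API
        "Attributes": {"VisibilityTimeout": str(visibility)},
    }

params = build_visibility_params(
    "https://sqs.us-east-1.amazonaws.com/123456789012/image-events", 120, 60)
```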

NEW QUESTION 54
- (Exam Topic 1)
A company needs to keep user transaction data in an Amazon DynamoDB table. The company must retain the data for 7 years.
What is the MOST operationally efficient solution that meets these requirements?

A. Use DynamoDB point-in-time recovery to back up the table continuously.
B. Use AWS Backup to create backup schedules and retention policies for the table.
C. Create an on-demand backup of the table by using the DynamoDB console. Store the backup in an Amazon S3 bucket. Set an S3 Lifecycle configuration for the S3 bucket.
D. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to invoke an AWS Lambda function. Configure the Lambda function to back up the table and to store the backup in an Amazon S3 bucket. Set an S3 Lifecycle configuration for the S3 bucket.

Answer: B

Explanation:
AWS Backup natively supports DynamoDB and lets you define backup schedules and retention policies (including multi-year retention) in one place, with no custom automation to build or maintain.

NEW QUESTION 57
- (Exam Topic 1)
A company has applications that run on Amazon EC2 instances in a VPC. One of the applications needs to call the Amazon S3 API to store and read objects.
According to the company's security regulations, no traffic from the applications is allowed to travel across the internet.
Which solution will meet these requirements?

A. Configure an S3 interface endpoint.


B. Configure an S3 gateway endpoint.
C. Create an S3 bucket in a private subnet.
D. Create an S3 bucket in the same Region as the EC2 instance.

Answer: B

Explanation:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/privatelink-interface-endpoints.html#types-of-vpc-end
https://docs.aws.amazon.com/vpc/latest/userguide/vpc-endpoints-s3.html
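Option B maps to an `ec2.create_vpc_endpoint` call; a sketch of its parameters (the VPC ID, route table ID, and Region are hypothetical):

```python
def build_s3_gateway_endpoint(vpc_id, route_table_ids, region):
    """Build parameters for ec2.create_vpc_endpoint (boto3). A gateway
    endpoint adds S3 routes to the given route tables, so traffic to
    S3 stays on the AWS network with no internet gateway required."""
    return {
        "VpcId": vpc_id,
        "ServiceName": f"com.amazonaws.{region}.s3",
        "VpcEndpointType": "Gateway",
        "RouteTableIds": route_table_ids,
    }

params = build_s3_gateway_endpoint("vpc-0abc123", ["rtb-0def456"], "us-east-1")
```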

NEW QUESTION 62
- (Exam Topic 1)
A company needs to review its AWS Cloud deployment to ensure that its Amazon S3 buckets do not have unauthorized configuration changes.
What should a solutions architect do to accomplish this goal?

A. Turn on AWS Config with the appropriate rules.
B. Turn on AWS Trusted Advisor with the appropriate checks.
C. Turn on Amazon Inspector with the appropriate assessment template.
D. Turn on Amazon S3 server access logging. Configure Amazon EventBridge (Amazon CloudWatch Events).

Answer: A

Explanation:
AWS Config continuously records configuration changes to S3 buckets and evaluates them against managed rules, so unauthorized configuration changes are detected automatically. Server access logging records object requests, not configuration changes.
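A sketch of enabling one appropriate AWS Config managed rule via `config.put_config_rule` (the specific rule here is just one example of an "appropriate rule"; the rule name is hypothetical):

```python
def build_s3_config_rule():
    """Build parameters for config.put_config_rule (boto3) using the
    AWS managed rule S3_BUCKET_PUBLIC_READ_PROHIBITED, which flags any
    bucket whose configuration drifts to allow public reads."""
    return {
        "ConfigRule": {
            "ConfigRuleName": "s3-no-public-read",  # hypothetical name
            "Source": {
                "Owner": "AWS",
                "SourceIdentifier": "S3_BUCKET_PUBLIC_READ_PROHIBITED",
            },
            "Scope": {"ComplianceResourceTypes": ["AWS::S3::Bucket"]},
        }
    }

params = build_s3_config_rule()
```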

NEW QUESTION 63
- (Exam Topic 1)
A company is migrating a distributed application to AWS. The application serves variable workloads. The legacy platform consists of a primary server that coordinates jobs across multiple compute nodes. The company wants to modernize the application with a solution that maximizes resiliency and scalability.
How should a solutions architect design the architecture to meet these requirements?

A. Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure EC2 Auto Scaling to use scheduled scaling.
B. Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure EC2 Auto Scaling based on the size of the queue.
C. Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure AWS CloudTrail as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the primary server.
D. Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure Amazon EventBridge (Amazon CloudWatch Events) as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the compute nodes.

Answer: B
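Queue-based scaling can be sketched as a CloudWatch alarm on queue depth wired to a simple scaling policy (the thresholds, queue name, and group name are hypothetical):

```python
def build_queue_scaling(asg_name, queue_name):
    """Return (alarm, policy) parameter dicts: cloudwatch.put_metric_alarm
    watches ApproximateNumberOfMessagesVisible on the SQS queue, and the
    alarm's action would trigger the autoscaling.put_scaling_policy
    policy to add compute nodes when jobs back up."""
    policy = {
        "AutoScalingGroupName": asg_name,
        "PolicyName": "scale-out-on-backlog",
        "AdjustmentType": "ChangeInCapacity",
        "ScalingAdjustment": 2,  # add two workers per breach
    }
    alarm = {
        "AlarmName": "job-queue-backlog",
        "Namespace": "AWS/SQS",
        "MetricName": "ApproximateNumberOfMessagesVisible",
        "Dimensions": [{"Name": "QueueName", "Value": queue_name}],
        "Statistic": "Average",
        "Period": 60,
        "EvaluationPeriods": 2,
        "Threshold": 100,
        "ComparisonOperator": "GreaterThanThreshold",
    }
    return alarm, policy

alarm, policy = build_queue_scaling("compute-nodes", "job-queue")
```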

NEW QUESTION 64
- (Exam Topic 1)
A company is building an application in the AWS Cloud. The application will store data in Amazon S3 buckets in two AWS Regions. The company must use an
AWS Key Management Service (AWS KMS) customer managed key to encrypt all data that is stored in the S3 buckets. The data in both S3 buckets must be
encrypted and decrypted with the same KMS key. The data and the key must be stored in each of the two Regions.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Configure replication between the S3 buckets.
B. Create a customer managed multi-Region KMS key. Create an S3 bucket in each Region. Configure replication between the S3 buckets. Configure the application to use the KMS key with client-side encryption.
C. Create a customer managed KMS key and an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Configure replication between the S3 buckets.
D. Create a customer managed KMS key and an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with AWS KMS keys (SSE-KMS). Configure replication between the S3 buckets.

Answer: B

Explanation:
A multi-Region KMS key is a set of interoperable customer managed keys that share the same key material and key ID across Regions, so data encrypted in one Region can be decrypted with the related replica key in the other Region. Replicating the key and using it for client-side encryption keeps both the data and the key in each Region with the least operational overhead. A single-Region KMS key (options C and D) cannot be used in both Regions.
https://docs.aws.amazon.com/kms/latest/developerguide/multi-region-keys-overview.html
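The two KMS calls behind option B can be sketched as follows: create the primary multi-Region key, then replicate it into the second Region (the key ID and Regions are hypothetical):

```python
def build_multi_region_key_calls(replica_region):
    """Return parameter dicts for kms.create_key and kms.replicate_key
    (boto3). The replica shares key material and key ID with the
    primary, so either Region can decrypt data encrypted by the other."""
    create = {
        "Description": "app data key",
        "MultiRegion": True,  # makes the key replicable to other Regions
    }
    replicate = {
        "KeyId": "mrk-EXAMPLE1111",  # hypothetical multi-Region key ID
        "ReplicaRegion": replica_region,
    }
    return create, replicate

create, replicate = build_multi_region_key_calls("eu-west-1")
```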

NEW QUESTION 69
- (Exam Topic 1)
A bicycle sharing company is developing a multi-tier architecture to track the location of its bicycles during peak operating hours The company wants to use these
data points in its existing analytics platform A solutions architect must determine the most viable multi-tier option to support this architecture The data points must
be accessible from the REST API.
Which action meets these requirements for storing and retrieving location data?

A. Use Amazon Athena with Amazon S3


B. Use Amazon API Gateway with AWS Lambda
C. Use Amazon QuickSight with Amazon Redshift.
D. Use Amazon API Gateway with Amazon Kinesis Data Analytics

Answer: D

Explanation:
https://aws.amazon.com/solutions/implementations/aws-streaming-data-solution-for-amazon-kinesis/

NEW QUESTION 71
- (Exam Topic 1)
A company is developing an application that provides order shipping statistics for retrieval by a REST API. The company wants to extract the shipping statistics,
organize the data into an easy-to-read HTML format, and send the report to several email addresses at the same time every morning.
Which combination of steps should a solutions architect take to meet these requirements? (Choose two.)

A. Configure the application to send the data to Amazon Kinesis Data Firehose.
B. Use Amazon Simple Email Service (Amazon SES) to format the data and to send the report by email.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) scheduled event that invokes an AWS Glue job to query the application's API for the data.
D. Create an Amazon EventBridge (Amazon CloudWatch Events) scheduled event that invokes an AWS Lambda function to query the application's API for the
data.
E. Store the application data in Amazon S3. Create an Amazon Simple Notification Service (Amazon SNS) topic as an S3 event destination to send the report by email.

Answer: BD

NEW QUESTION 76
- (Exam Topic 1)
A company has a data ingestion workflow that consists of the following:
An Amazon Simple Notification Service (Amazon SNS) topic for notifications about new data deliveries
An AWS Lambda function to process the data and record metadata
The company observes that the ingestion workflow fails occasionally because of network connectivity issues. When such a failure occurs, the Lambda function
does not ingest the corresponding data unless the company manually reruns the job.
Which combination of actions should a solutions architect take to ensure that the Lambda function ingests all data in the future? (Select TWO.)

A. Configure the Lambda function In multiple Availability Zones.


B. Create an Amazon Simple Queue Service (Amazon SQS) queue, and subscribe it to the SNS topic.
C. Increase the CPU and memory that are allocated to the Lambda function.
D. Increase provisioned throughput for the Lambda function.
E. Modify the Lambda function to read from an Amazon Simple Queue Service (Amazon SQS) queue

Answer: BE
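The two selected actions can be sketched as an `sns.subscribe` call plus the Lambda event source mapping on the queue (the ARNs and function name are hypothetical):

```python
def build_durable_ingestion(topic_arn, queue_arn, function_name):
    """Return parameter dicts for sns.subscribe and
    lambda.create_event_source_mapping (boto3). The SQS queue buffers
    notifications during network issues, and Lambda retries them from
    the queue instead of losing them."""
    subscription = {
        "TopicArn": topic_arn,
        "Protocol": "sqs",
        "Endpoint": queue_arn,
    }
    mapping = {
        "EventSourceArn": queue_arn,
        "FunctionName": function_name,
        "BatchSize": 10,
    }
    return subscription, mapping

sub, mapping = build_durable_ingestion(
    "arn:aws:sns:us-east-1:123456789012:new-data",
    "arn:aws:sqs:us-east-1:123456789012:ingest-queue",
    "ingest-handler",
)
```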

NEW QUESTION 81
- (Exam Topic 1)
A company's website uses an Amazon EC2 instance store for its catalog of items. The company wants to make sure that the catalog is highly available and that
the catalog is stored in a durable location.
What should a solutions architect do to meet these requirements?

A. Move the catalog to Amazon ElastiCache for Redis.


B. Deploy a larger EC2 instance with a larger instance store.


C. Move the catalog from the instance store to Amazon S3 Glacier Deep Archive.
D. Move the catalog to an Amazon Elastic File System (Amazon EFS) file system.

Answer: D

Explanation:
EC2 instance store is ephemeral storage that is lost if the instance stops, terminates, or fails. Amazon EFS stores data redundantly across multiple Availability Zones and remains accessible if an instance or Availability Zone fails, which satisfies both the high availability and the durability requirements.

NEW QUESTION 82
- (Exam Topic 1)
A company stores call transcript files on a monthly basis. Users access the files randomly within 1 year of the call, but users access the files infrequently after 1
year. The company wants to optimize its solution by giving users the ability to query and retrieve files that are less than 1-year-old as quickly as possible. A delay
in retrieving older files is acceptable.
Which solution will meet these requirements MOST cost-effectively?

A. Store individual files with tags in Amazon S3 Glacier Instant Retrieval. Query the tags to retrieve the files from S3 Glacier Instant Retrieval.
B. Store individual files in Amazon S3 Intelligent-Tiering. Use S3 Lifecycle policies to move the files to S3 Glacier Flexible Retrieval after 1 year. Query and retrieve the files that are in Amazon S3 by using Amazon Athena. Query and retrieve the files that are in S3 Glacier by using S3 Glacier Select.
C. Store individual files with tags in Amazon S3 Standard storage. Store search metadata for each archive in Amazon S3 Standard storage. Use S3 Lifecycle policies to move the files to S3 Glacier Instant Retrieval after 1 year. Query and retrieve the files by searching for metadata from Amazon S3.
D. Store individual files in Amazon S3 Standard storage. Use S3 Lifecycle policies to move the files to S3 Glacier Deep Archive after 1 year. Store search metadata in Amazon RDS. Query the files from Amazon RDS. Retrieve the files from S3 Glacier Deep Archive.

Answer: B

Explanation:
"For archive data that needs immediate access, such as medical images, news media assets, or genomics data, choose the S3 Glacier Instant Retrieval storage
class, an archive storage class that delivers the lowest cost storage with milliseconds retrieval. For archive data that does not require immediate access but needs
the flexibility to retrieve large sets of data at no cost, such as backup or disaster recovery use cases, choose S3 Glacier Flexible Retrieval (formerly S3 Glacier),
with retrieval in minutes or free bulk retrievals in 5-12 hours."
https://aws.amazon.com/about-aws/whats-new/2021/11/amazon-s3-glacier-instant-retrieval-storage-class/
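The lifecycle part of option B can be sketched as parameters for `s3.put_bucket_lifecycle_configuration` (the bucket, prefix, and rule ID are hypothetical); note that `GLACIER` is the API name for S3 Glacier Flexible Retrieval:

```python
def build_transcript_lifecycle(bucket):
    """Build parameters for s3.put_bucket_lifecycle_configuration
    (boto3): after 365 days, move transcripts from S3 Intelligent-
    Tiering to S3 Glacier Flexible Retrieval, where a retrieval delay
    is acceptable."""
    return {
        "Bucket": bucket,
        "LifecycleConfiguration": {
            "Rules": [{
                "ID": "archive-after-1-year",
                "Status": "Enabled",
                "Filter": {"Prefix": "transcripts/"},
                "Transitions": [{"Days": 365, "StorageClass": "GLACIER"}],
            }]
        },
    }

params = build_transcript_lifecycle("example-transcripts")
```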

NEW QUESTION 86
- (Exam Topic 1)
An application runs on an Amazon EC2 instance in a VPC. The application processes logs that are stored in an Amazon S3 bucket. The EC2 instance needs to
access the S3 bucket without connectivity to the internet.
Which solution will provide private network connectivity to Amazon S3?

A. Create a gateway VPC endpoint to the S3 bucket.
B. Stream the logs to Amazon CloudWatch Logs. Export the logs to the S3 bucket.
C. Create an instance profile on Amazon EC2 to allow S3 access.
D. Create an Amazon API Gateway API with a private link to access the S3 endpoint.

Answer: A

Explanation:
A gateway VPC endpoint allows instances in the VPC to connect to Amazon S3 over the AWS private network instead of the public internet.

NEW QUESTION 87
- (Exam Topic 1)
A company wants to migrate its on-premises application to AWS. The application produces output files that vary in size from tens of gigabytes to hundreds of
terabytes The application data must be stored in a standard file system structure The company wants a solution that scales automatically, is highly available, and
requires minimum operational overhead.
Which solution will meet these requirements?

A. Migrate the application to run as containers on Amazon Elastic Container Service (Amazon ECS). Use Amazon S3 for storage.
B. Migrate the application to run as containers on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon Elastic Block Store (Amazon EBS) for storage.
C. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic File System (Amazon EFS) for storage.
D. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic Block Store (Amazon EBS) for storage.

Answer: C

Explanation:
Amazon EFS presents a standard file system structure, scales automatically as files are added, and stores data redundantly across multiple Availability Zones. S3 does not provide a file system structure, and EBS does not scale automatically or span Availability Zones.

NEW QUESTION 91
- (Exam Topic 1)
A company hosts more than 300 global websites and applications. The company requires a platform to analyze more than 30 TB of clickstream data each day.
What should a solutions architect do to transmit and process the clickstream data?

A. Design an AWS Data Pipeline to archive the data to an Amazon S3 bucket and run an Amazon EMR cluster with the data to generate analytics.
B. Create an Auto Scaling group of Amazon EC2 instances to process the data and send it to an Amazon S3 data lake for Amazon Redshift to use for analysis.
C. Cache the data to Amazon CloudFront. Store the data in an Amazon S3 bucket. When an object is added to the S3 bucket, run an AWS Lambda function to process the data for analysis.
D. Collect the data from Amazon Kinesis Data Streams. Use Amazon Kinesis Data Firehose to transmit the data to an Amazon S3 data lake. Load the data in Amazon Redshift for analysis.

Answer: D

Explanation:
https://aws.amazon.com/es/blogs/big-data/real-time-analytics-with-amazon-redshift-streaming-ingestion/
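Option D's delivery path can be sketched as parameters for `firehose.create_delivery_stream`, with the Kinesis data stream as source and the S3 data lake as destination (all ARNs and names are hypothetical):

```python
def build_clickstream_firehose(stream_arn, role_arn, bucket_arn):
    """Build parameters for firehose.create_delivery_stream (boto3):
    Firehose reads from the Kinesis data stream and batches records
    into the S3 data lake, from which Redshift loads them for analysis."""
    return {
        "DeliveryStreamName": "clickstream-to-s3",
        "DeliveryStreamType": "KinesisStreamAsSource",
        "KinesisStreamSourceConfiguration": {
            "KinesisStreamARN": stream_arn,
            "RoleARN": role_arn,
        },
        "S3DestinationConfiguration": {
            "RoleARN": role_arn,
            "BucketARN": bucket_arn,
        },
    }

params = build_clickstream_firehose(
    "arn:aws:kinesis:us-east-1:123456789012:stream/clicks",
    "arn:aws:iam::123456789012:role/firehose-delivery",
    "arn:aws:s3:::example-clickstream-lake",
)
```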

NEW QUESTION 92
- (Exam Topic 2)
A large media company hosts a web application on AWS. The company wants to start caching confidential media files so that users around the world will have
reliable access to the files. The content is stored in Amazon S3 buckets. The company must deliver the content quickly, regardless of where the requests originate
geographically.
Which solution will meet these requirements?

A. Use AWS DataSync to connect the S3 buckets to the web application.


B. Deploy AWS Global Accelerator to connect the S3 buckets to the web application.
C. Deploy Amazon CloudFront to connect the S3 buckets to CloudFront edge servers.
D. Use Amazon Simple Queue Service (Amazon SQS) to connect the S3 buckets to the web application.

Answer: C

Explanation:
Amazon CloudFront caches the S3 content at edge locations close to users worldwide, so delivery is fast regardless of where requests originate. AWS Global Accelerator only optimizes the network path; it does not cache content and still fetches from the origin on every request.
https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-restricting-access-to-s3

NEW QUESTION 94
- (Exam Topic 2)
A solutions architect needs to securely store a database user name and password that an application uses to access an Amazon RDS DB instance. The
application that accesses the database runs on an Amazon EC2 instance. The solutions architect wants to create a secure parameter in AWS Systems Manager
Parameter Store.
What should the solutions architect do to meet this requirement?

A. Create an IAM role that has read access to the Parameter Store parameter. Allow Decrypt access to an AWS Key Management Service (AWS KMS) key that is used to encrypt the parameter. Assign this IAM role to the EC2 instance.
B. Create an IAM policy that allows read access to the Parameter Store parameter. Allow Decrypt access to an AWS Key Management Service (AWS KMS) key that is used to encrypt the parameter. Assign this IAM policy to the EC2 instance.
C. Create an IAM trust relationship between the Parameter Store parameter and the EC2 instance. Specify Amazon RDS as a principal in the trust policy.
D. Create an IAM trust relationship between the DB instance and the EC2 instance. Specify Systems Manager as a principal in the trust policy.

Answer: A

Explanation:
An IAM policy cannot be attached directly to an EC2 instance; permissions are granted to applications on an instance through an IAM role assigned via an instance profile. The role needs both read access to the parameter and kms:Decrypt on the key that encrypts it.
https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_aws-services-that-work-with-iam.html
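With the role in place on the instance, the application reads the secure parameter roughly like this sketch (the parameter name is hypothetical):

```python
def build_get_parameter_params(name):
    """Build parameters for ssm.get_parameter (boto3). With
    WithDecryption=True, Parameter Store uses the KMS key to decrypt
    the SecureString before returning it -- which is why the instance
    role also needs kms:Decrypt on that key."""
    return {
        "Name": name,
        "WithDecryption": True,
    }

params = build_get_parameter_params("/prod/app/db-password")
```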

NEW QUESTION 98
- (Exam Topic 2)
A company wants to run a gaming application on Amazon EC2 instances that are part of an Auto Scaling group in the AWS Cloud. The application will transmit
data by using UDP packets. The company wants to ensure that the application can scale out and in as traffic increases and decreases.
What should a solutions architect do to meet these requirements?

A. Attach a Network Load Balancer to the Auto Scaling group


B. Attach an Application Load Balancer to the Auto Scaling group.
C. Deploy an Amazon Route 53 record set with a weighted policy to route traffic appropriately
D. Deploy a NAT instance that is configured with port forwarding to the EC2 instances in the Auto Scaling group.

Answer: A

Explanation:
An Application Load Balancer supports only HTTP and HTTPS (layer 7) traffic. A Network Load Balancer operates at layer 4 and supports TCP and UDP listeners, so it can distribute the game's UDP packets across the Auto Scaling group as it scales out and in.
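Handling UDP maps to a UDP target group and listener on a Network Load Balancer; a sketch of the `elbv2.create_listener` parameters (the ARNs and port are hypothetical):

```python
def build_udp_listener(load_balancer_arn, target_group_arn, port):
    """Build parameters for elbv2.create_listener (boto3) on a Network
    Load Balancer. Protocol 'UDP' is valid only for NLBs; an ALB
    listener accepts only HTTP or HTTPS."""
    return {
        "LoadBalancerArn": load_balancer_arn,
        "Protocol": "UDP",
        "Port": port,
        "DefaultActions": [{"Type": "forward",
                            "TargetGroupArn": target_group_arn}],
    }

params = build_udp_listener(
    "arn:aws:elasticloadbalancing:us-east-1:123456789012:loadbalancer/net/game/abc",
    "arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/game/def",
    7777,
)
```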

NEW QUESTION 100


- (Exam Topic 2)
An ecommerce company has an order-processing application that uses Amazon API Gateway and an AWS Lambda function. The application stores data in an
Amazon Aurora PostgreSQL database. During a recent sales event, a sudden surge in customer orders occurred. Some customers experienced timeouts and the
application did not process the orders of those customers A solutions architect determined that the CPU utilization and memory utilization were high on the
database because of a large number of open connections The solutions architect needs to prevent the timeout errors while making the least possible changes to
the application.
Which solution will meet these requirements?

A. Configure provisioned concurrency for the Lambda function Modify the database to be a global database in multiple AWS Regions
B. Use Amazon RDS Proxy to create a proxy for the database Modify the Lambda function to use the RDS Proxy endpoint instead of the database endpoint
C. Create a read replica for the database in a different AWS Region Use query string parameters in API Gateway to route traffic to the read replica
D. Migrate the data from Aurora PostgreSQL to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS). Modify the Lambda function to use the DynamoDB table.


Answer: B

Explanation:
Amazon RDS Proxy pools and shares database connections, which prevents the database from being overwhelmed by a large number of open connections during traffic surges. Only the connection endpoint in the Lambda function changes, so it is the least possible change to the application; migrating to DynamoDB would require rewriting the data layer.
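The only application change this approach requires is pointing the Lambda function's connection at the proxy endpoint; a sketch with hypothetical endpoint names:

```python
import os

def connection_kwargs():
    """Connection settings for the Lambda function. Swapping the host
    from the Aurora cluster endpoint to the RDS Proxy endpoint is the
    entire code change; the proxy pools connections on the database's
    behalf."""
    return {
        # was: "orders.cluster-abc.us-east-1.rds.amazonaws.com"
        "host": os.environ.get(
            "DB_HOST", "orders-proxy.proxy-abc.us-east-1.rds.amazonaws.com"),
        "port": 5432,
        "dbname": "orders",
    }

kwargs = connection_kwargs()
```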

NEW QUESTION 104


- (Exam Topic 2)
A company wants to use the AWS Cloud to make an existing application highly available and resilient. The current version of the application resides in the
company's data center. The application recently experienced data loss after a database server crashed because of an unexpected power outage.
The company needs a solution that avoids any single points of failure. The solution must give the application the ability to scale to meet user demand.
Which solution will meet these requirements?

A. Deploy the application servers by using Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones. Use an Amazon RDS DB instance in a Multi-AZ configuration.
B. Deploy the application servers by using Amazon EC2 instances in an Auto Scaling group in a single Availability Zone. Deploy the database on an EC2 instance. Enable EC2 Auto Recovery.
C. Deploy the application servers by using Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones. Use an Amazon RDS DB instance with a read replica in a single Availability Zone. Promote the read replica to replace the primary DB instance if the primary DB instance fails.
D. Deploy the application servers by using Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones. Deploy the primary and secondary database servers on EC2 instances across multiple Availability Zones. Use Amazon Elastic Block Store (Amazon EBS) Multi-Attach to create shared storage between the instances.

Answer: A

NEW QUESTION 105


- (Exam Topic 2)
A company is migrating an application from on-premises servers to Amazon EC2 instances. As part of the migration design requirements, a solutions architect
must implement infrastructure metric alarms. The company does not need to take action if CPU utilization increases to more than 50% for a short burst of time.
However, if the CPU utilization increases to more than 50% and read IOPS on the disk are high at the same time, the company needs to act as soon as possible.
The solutions architect also must reduce false alarms.
What should the solutions architect do to meet these requirements?

A. Create Amazon CloudWatch composite alarms where possible.


B. Create Amazon CloudWatch dashboards to visualize the metrics and react to issues quickly.
C. Create Amazon CloudWatch Synthetics canaries to monitor the application and raise an alarm.
D. Create single Amazon CloudWatch metric alarms with multiple metric thresholds where possible.

Answer: A
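Option A can be sketched with `cloudwatch.put_composite_alarm`: the composite fires only when both child alarms are in ALARM at the same time, which suppresses false alarms from short CPU bursts (the alarm names are hypothetical):

```python
def build_composite_alarm(cpu_alarm, iops_alarm):
    """Build parameters for cloudwatch.put_composite_alarm (boto3).
    AlarmRule combines existing metric alarms with AND, so a brief CPU
    spike alone never triggers a notification."""
    return {
        "AlarmName": "cpu-and-read-iops-high",
        "AlarmRule": f"ALARM({cpu_alarm}) AND ALARM({iops_alarm})",
    }

params = build_composite_alarm("cpu-over-50", "read-iops-high")
```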

NEW QUESTION 108


- (Exam Topic 2)
A company is building a web-based application running on Amazon EC2 instances in multiple Availability Zones. The web application will provide access to a
repository of text documents totaling about 900 TB in size. The company anticipates that the web application will experience periods of high demand. A solutions
architect must ensure that the storage component for the text documents can scale to meet the demand of the application at all times. The company is concerned
about the overall cost of the solution.
Which storage solution meets these requirements MOST cost-effectively?

A. Amazon Elastic Block Store (Amazon EBS)


B. Amazon Elastic File System (Amazon EFS)
C. Amazon Elasticsearch Service (Amazon ES)
D. Amazon S3

Answer: D

Explanation:
Amazon S3 is the most cost-effective of the options, scales automatically to meet demand, and can be accessed from the web application in any Availability Zone.

NEW QUESTION 112


- (Exam Topic 2)
A company is concerned about the security of its public web application due to recent web attacks. The application uses an Application Load Balancer (ALB). A
solutions architect must reduce the risk of DDoS attacks against the application.
What should the solutions architect do to meet this requirement?

A. Add an Amazon Inspector agent to the ALB.


B. Configure Amazon Macie to prevent attacks.
C. Enable AWS Shield Advanced to prevent attacks.
D. Configure Amazon GuardDuty to monitor the ALB.

Answer: C

NEW QUESTION 114


- (Exam Topic 2)
A company wants to migrate its MySQL database from on premises to AWS. The company recently experienced a database outage that significantly impacted the
business. To ensure this does not happen again, the company wants a reliable database solution on AWS that minimizes data loss and stores every transaction on
at least two nodes.
Which solution meets these requirements?

A. Create an Amazon RDS DB instance with synchronous replication to three nodes in three Availability Zones.
B. Create an Amazon RDS MySQL DB instance with Multi-AZ functionality enabled to synchronously replicate the data.

Passing Certification Exams Made Easy visit - https://www.surepassexam.com


Recommend!! Get the Full SAA-C03 dumps in VCE and PDF From SurePassExam
https://www.surepassexam.com/SAA-C03-exam-dumps.html (551 New Questions)

C. Create an Amazon RDS MySQL DB instance and then create a read replica in a separate AWS Region that synchronously replicates the data.
D. Create an Amazon EC2 instance with a MySQL engine installed that triggers an AWS Lambda function to synchronously replicate the data to an Amazon RDS
MySQL DB instance.

Answer: B

Explanation:
Q: What does Amazon RDS manage on my behalf?
Amazon RDS manages the work involved in setting up a relational database: from provisioning the infrastructure capacity you request to installing the database
software. Once your database is up and running, Amazon RDS automates common administrative tasks such as performing backups and patching the software
that powers your database. With optional Multi-AZ deployments, Amazon RDS also manages synchronous data replication across Availability Zones with
automatic failover.
https://aws.amazon.com/rds/faqs/
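A minimal sketch of answer B as boto3 `create_db_instance` keyword arguments. The identifier, instance class, and storage size are hypothetical; `MultiAZ=True` is the field that enables synchronous replication to a standby in another Availability Zone.

```python
def multi_az_db_params() -> dict:
    """Sketch of rds.create_db_instance(**params) for a Multi-AZ MySQL DB."""
    return {
        "DBInstanceIdentifier": "orders-db",   # hypothetical name
        "Engine": "mysql",
        "DBInstanceClass": "db.m5.large",      # hypothetical size
        "AllocatedStorage": 100,
        # Synchronous standby in a second AZ with automatic failover,
        # so every transaction is stored on at least two nodes.
        "MultiAZ": True,
    }

params = multi_az_db_params()
```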

NEW QUESTION 118


- (Exam Topic 2)
A company needs to save the results from a medical trial to an Amazon S3 repository. The repository must allow a few scientists to add new files and must restrict
all other users to read-only access. No users can have the ability to modify or delete any files in the repository. The company must keep every file in the repository
for a minimum of 1 year after its creation date.
Which solution will meet these requirements?

A. Use S3 Object Lock in governance mode with a legal hold of 1 year.


B. Use S3 Object Lock in compliance mode with a retention period of 365 days.
C. Use an IAM role to restrict all users from deleting or changing objects in the S3 bucket Use an S3 bucket policy to only allow the IAM role
D. Configure the S3 bucket to invoke an AWS Lambda function every tune an object is added Configure the function to track the hash of the saved object to that
modified objects can be marked accordingly

Answer: C
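A sketch in the spirit of answer C: a bucket policy that denies object deletion to everyone and allows uploads only to a "scientists" role. The bucket name and role ARN are hypothetical; with boto3 the JSON string would go to `s3.put_bucket_policy`.

```python
import json

SCIENTISTS_ROLE = "arn:aws:iam::111122223333:role/scientists"  # hypothetical
BUCKET_ARN = "arn:aws:s3:::medical-trial-results"              # hypothetical

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # nobody may delete objects in the repository
            "Sid": "DenyDelete",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:DeleteObject",
            "Resource": f"{BUCKET_ARN}/*",
        },
        {   # only the scientists role may add new files
            "Sid": "DenyPutExceptScientists",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": f"{BUCKET_ARN}/*",
            "Condition": {"ArnNotEquals": {"aws:PrincipalArn": SCIENTISTS_ROLE}},
        },
    ],
}
policy_json = json.dumps(policy)
```

Explicit denies override any allows, which is what enforces the read-only behavior for all other users.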

NEW QUESTION 122


- (Exam Topic 2)
A company is planning to move its data to an Amazon S3 bucket. The data must be encrypted when it is stored in the S3 bucket. Additionally, the encryption key
must be automatically rotated every year.
Which solution will meet these requirements with the LEAST operational overhead?

A. Move the data to the S3 bucket.


B. Use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Use the built-in key rotation behavior of SSE-S3 encryption keys.
C. Create an AWS Key Management Service (AWS KMS) customer managed key.
D. Enable automatic key rotation.
E. Set the S3 bucket's default encryption behavior to use the customer managed KMS key.
F. Move the data to the S3 bucket.
G. Create an AWS Key Management Service (AWS KMS) customer managed key.
H. Set the S3 bucket's default encryption behavior to use the customer managed KMS key.
I. Move the data to the S3 bucket.
J. Manually rotate the KMS key every year.
K. Encrypt the data with customer key material before moving the data to the S3 bucket.
L. Create an AWS Key Management Service (AWS KMS) key without key material.
M. Import the customer key material into the KMS key.
N. Enable automatic key rotation.

Answer: C
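The chosen approach (customer managed KMS key with automatic rotation, set as the bucket's default encryption) can be sketched as the parameter dicts below, assuming boto3. The bucket name and key ARN are hypothetical; the dicts would be passed to `kms.create_key`, `kms.enable_key_rotation`, and `s3.put_bucket_encryption`.

```python
# Step 1: create the customer managed key, then enable yearly rotation.
create_key_params = {"Description": "S3 data key", "KeyUsage": "ENCRYPT_DECRYPT"}

def bucket_encryption_params(bucket: str, kms_key_arn: str) -> dict:
    """Step 2: default bucket encryption using the customer managed key."""
    return {
        "Bucket": bucket,
        "ServerSideEncryptionConfiguration": {
            "Rules": [{
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": kms_key_arn,
                }
            }]
        },
    }

params = bucket_encryption_params(
    "my-data-bucket",  # hypothetical
    "arn:aws:kms:us-east-1:111122223333:key/example",  # hypothetical
)
```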

NEW QUESTION 125


- (Exam Topic 2)
A company runs a web-based portal that provides users with global breaking news, local alerts, and weather updates. The portal delivers each user a personalized
view by using a mixture of static and dynamic content. Content is served over HTTPS through an API server running on an Amazon EC2 instance behind an
Application Load Balancer (ALB). The company wants the portal to provide this content to its users across the world as quickly as possible.
How should a solutions architect design the application to ensure the LEAST amount of latency for all users?

A. Deploy the application stack in a single AWS Region.


B. Use Amazon CloudFront to serve all static and dynamic content by specifying the ALB as an origin.
C. Deploy the application stack in two AWS Regions.
D. Use an Amazon Route 53 latency routing policy to serve all content from the ALB in the closest Region.
E. Deploy the application stack in a single AWS Region.
F. Use Amazon CloudFront to serve the static content.
G. Serve the dynamic content directly from the ALB.
H. Deploy the application stack in two AWS Regions.
I. Use an Amazon Route 53 geolocation routing policy to serve all content from the ALB in the closest Region.

Answer: A

Explanation:
https://aws.amazon.com/blogs/networking-and-content-delivery/deliver-your-apps-dynamic-content-using-amaz
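A trimmed sketch of the origin-related fields of a CloudFront distribution config with the ALB as a custom origin, as answer A describes. The ALB DNS name is hypothetical, and only a subset of the fields that `cloudfront.create_distribution` requires is shown.

```python
def distribution_config(alb_dns: str) -> dict:
    """Partial DistributionConfig sketch: ALB as the single custom origin."""
    return {
        "Origins": {"Quantity": 1, "Items": [{
            "Id": "alb-origin",
            "DomainName": alb_dns,
            "CustomOriginConfig": {
                "HTTPPort": 80,
                "HTTPSPort": 443,
                # Keep traffic to the origin encrypted end to end.
                "OriginProtocolPolicy": "https-only",
            },
        }]},
        "DefaultCacheBehavior": {
            "TargetOriginId": "alb-origin",
            "ViewerProtocolPolicy": "redirect-to-https",
        },
        "Enabled": True,
        "Comment": "static + dynamic content via edge locations",
    }

cfg = distribution_config("my-alb-123.us-east-1.elb.amazonaws.com")
```

Even dynamic (non-cacheable) requests benefit because they ride CloudFront's edge network back to the origin.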

NEW QUESTION 130


- (Exam Topic 2)
A company wants to build a scalable key management infrastructure to support developers who need to encrypt data in their applications.
What should a solutions architect do to reduce the operational burden?

A. Use multifactor authentication (MFA) to protect the encryption keys.


B. Use AWS Key Management Service (AWS KMS) to protect the encryption keys
C. Use AWS Certificate Manager (ACM) to create, store, and assign the encryption keys
D. Use an IAM policy to limit the scope of users who have access permissions to protect the encryption keys

Answer: B

Explanation:
https://aws.amazon.com/kms/faqs/#:~:text=If%20you%20are%20a%20developer%20who%20needs%20to%20d

NEW QUESTION 133


- (Exam Topic 2)
A company runs its two-tier ecommerce website on AWS. The web tier consists of a load balancer that sends traffic to Amazon EC2 instances. The database tier
uses an Amazon RDS DB instance. The EC2 instances and the RDS DB instance should not be exposed to the public internet. The EC2 instances require internet
access to complete payment processing of orders through a third-party web service. The application must be highly available.
Which combination of configuration options will meet these requirements? (Choose two.)

A. Use an Auto Scaling group to launch the EC2 instances in private subnets.
B. Deploy an RDS Multi-AZ DB instance in private subnets.
C. Configure a VPC with two private subnets and two NAT gateways across two Availability Zones. Deploy an Application Load Balancer in the private subnets.
D. Use an Auto Scaling group to launch the EC2 instances in public subnets across two Availability Zones. Deploy an RDS Multi-AZ DB instance in private subnets.
E. Configure a VPC with one public subnet, one private subnet, and two NAT gateways across two Availability Zones.
F. Deploy an Application Load Balancer in the public subnet.
G. Configure a VPC with two public subnets, two private subnets, and two NAT gateways across two Availability Zones.
H. Deploy an Application Load Balancer in the public subnets.

Answer: AE

Explanation:
Before you begin: Decide which two Availability Zones you will use for your EC2 instances. Configure your virtual private cloud (VPC) with at least one public
subnet in each of these Availability Zones. These public subnets are used to configure the load balancer. You can launch your EC2 instances in other subnets of
these Availability Zones instead.

NEW QUESTION 137


- (Exam Topic 2)
A company has two applications: a sender application that sends messages with payloads to be processed and a processing application intended to receive the
messages with payloads. The company wants to implement an AWS service to handle messages between the two applications. The sender application can send
about 1,000 messages each hour. The messages may take up to 2 days to be processed. If the messages fail to process, they must be retained so that they do
not impact the processing of any remaining messages.
Which solution meets these requirements and is the MOST operationally efficient?

A. Set up an Amazon EC2 instance running a Redis database.


B. Configure both applications to use the instance.
C. Store, process, and delete the messages, respectively.
D. Use an Amazon Kinesis data stream to receive the messages from the sender application.
E. Integrate the processing application with the Kinesis Client Library (KCL).
F. Integrate the sender and processor applications with an Amazon Simple Queue Service (Amazon SQS) queue.
G. Configure a dead-letter queue to collect the messages that failed to process.
H. Subscribe the processing application to an Amazon Simple Notification Service (Amazon SNS) topic to receive notifications to process.
I. Integrate the sender application to write to the SNS topic.

Answer: C

Explanation:
https://aws.amazon.com/blogs/compute/building-loosely-coupled-scalable-c-applications-with-amazon-sqs-and-
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-dead-letter-queues.htm
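Answer C's dead-letter-queue setup can be sketched as SQS queue attributes, assuming boto3. The DLQ ARN is hypothetical; this dict would be the `Attributes` argument to `sqs.create_queue` or `sqs.set_queue_attributes` on the main work queue.

```python
import json

DLQ_ARN = "arn:aws:sqs:us-east-1:111122223333:payments-dlq"  # hypothetical

queue_attributes = {
    # Retention must cover the up-to-2-day processing window; 4 days gives
    # headroom (value is seconds, as a string per the SQS API).
    "MessageRetentionPeriod": str(4 * 24 * 3600),
    # After maxReceiveCount failed receives, SQS moves the message to the
    # DLQ so it does not block the remaining messages.
    "RedrivePolicy": json.dumps({
        "deadLetterTargetArn": DLQ_ARN,
        "maxReceiveCount": "5",
    }),
}
```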

NEW QUESTION 138


- (Exam Topic 2)
A company has a legacy data processing application that runs on Amazon EC2 instances. Data is processed sequentially, but the order of results does not matter.
The application uses a monolithic architecture. The only way that the company can scale the application to meet increased demand is to increase the size of the
instances.
The company's developers have decided to rewrite the application to use a microservices architecture on Amazon Elastic Container Service (Amazon ECS).
What should a solutions architect recommend for communication between the microservices?

A. Create an Amazon Simple Queue Service (Amazon SQS) queue.


B. Add code to the data producers, and send data to the queue.
C. Add code to the data consumers to process data from the queue.
D. Create an Amazon Simple Notification Service (Amazon SNS) topic.
E. Add code to the data producers, and publish notifications to the topic.
F. Add code to the data consumers to subscribe to the topic.
G. Create an AWS Lambda function to pass messages.
H. Add code to the data producers to call the Lambda function with a data object.
I. Add code to the data consumers to receive a data object that is passed from the Lambda function.
J. Create an Amazon DynamoDB table.
K. Enable DynamoDB Streams.
L. Add code to the data producers to insert data into the table.
M. Add code to the data consumers to use the DynamoDB Streams API to detect new table entries and retrieve the data.

Answer: A

Explanation:
A standard SQS queue decouples the producer and consumer microservices, and because the order of results does not matter here, FIFO ordering is not required. For comparison, FIFO queues have limited throughput (300 msg/s without batching; 3,000 msg/s with batching of up to 10 messages per operation), disallow duplicate messages (exactly-once delivery), preserve message order, and must have queue names ending in .fifo.

NEW QUESTION 140


- (Exam Topic 2)
A company is building a containerized application on premises and decides to move the application to AWS. The application will have thousands of users soon
after it is deployed. The company is unsure how to manage the deployment of containers at scale. The company needs to deploy the containerized application in a
highly available architecture that minimizes operational overhead.
Which solution will meet these requirements?

A. Store container images in an Amazon Elastic Container Registry (Amazon ECR) repository.
B. Use an Amazon Elastic Container Service (Amazon ECS) cluster with the AWS Fargate launch type to run the containers.
C. Use target tracking to scale automatically based on demand.
D. Store container images in an Amazon Elastic Container Registry (Amazon ECR) repository.
E. Use an Amazon Elastic Container Service (Amazon ECS) cluster with the Amazon EC2 launch type to run the containers.
F. Use target tracking to scale automatically based on demand.
G. Store container images in a repository that runs on an Amazon EC2 instance.
H. Run the containers on EC2 instances that are spread across multiple Availability Zones.
I. Monitor the average CPU utilization in Amazon CloudWatch.
J. Launch new EC2 instances as needed.
K. Create an Amazon EC2 Amazon Machine Image (AMI) that contains the container image. Launch EC2 instances in an Auto Scaling group across multiple Availability Zones.
L. Use an Amazon CloudWatch alarm to scale out EC2 instances when the average CPU utilization threshold is breached.

Answer: A

NEW QUESTION 142


- (Exam Topic 2)
A company is running a multi-tier web application on premises. The web application is containerized and runs on a number of Linux hosts connected to a
PostgreSQL database that contains user records. The operational overhead of maintaining the infrastructure and capacity planning is limiting the company's
growth. A solutions architect must improve the application's infrastructure.
Which combination of actions should the solutions architect take to accomplish this? (Choose two.)

A. Migrate the PostgreSQL database to Amazon Aurora


B. Migrate the web application to be hosted on Amazon EC2 instances.
C. Set up an Amazon CloudFront distribution for the web application content.
D. Set up Amazon ElastiCache between the web application and the PostgreSQL database.
E. Migrate the web application to be hosted on AWS Fargate with Amazon Elastic Container Service (Amazon ECS).

Answer: AE

NEW QUESTION 147


- (Exam Topic 2)
An ecommerce company hosts its analytics application in the AWS Cloud. The application generates about 300 MB of data each month. The data is stored in
JSON format. The company is evaluating a disaster recovery solution to back up the data. The data must be accessible in milliseconds if it is needed, and the data
must be kept for 30 days.
Which solution meets these requirements MOST cost-effectively?

A. Amazon OpenSearch Service (Amazon Elasticsearch Service)


B. Amazon S3 Glacier
C. Amazon S3 Standard
D. Amazon RDS for PostgreSQL

Answer: C

NEW QUESTION 148


- (Exam Topic 2)
A company is planning to build a high performance computing (HPC) workload as a service solution that is hosted on AWS. A group of 16 Amazon EC2 Linux
instances requires the lowest possible latency for node-to-node communication. The instances also need a shared block device volume for high-performing
storage.
Which solution will meet these requirements?

A. Use a cluster placement group.


B. Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using Amazon EBS Multi-Attach.
C. Use a cluster placement group.
D. Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).
E. Use a partition placement group.
F. Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).
G. Use a spread placement group.
H. Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using Amazon EBS Multi-Attach.

Answer: A

NEW QUESTION 153


- (Exam Topic 2)
A media company is evaluating the possibility of moving its systems to the AWS Cloud. The company needs at least 10 TB of storage with the maximum possible

I/O performance for video processing. 300 TB of very durable storage for storing media content, and 900 TB of storage to meet requirements for archival media
that is not in use anymore
Which set of services should a solutions architect recommend to meet these requirements?

A. Amazon EBS for maximum performance, Amazon S3 for durable data storage, and Amazon S3 Glacier for archival storage
B. Amazon EBS for maximum performance, Amazon EFS for durable data storage and Amazon S3 Glacier for archival storage
C. Amazon EC2 instance store for maximum performance.
D. Amazon EFS for durable data storage and Amazon S3 for archival storage
E. Amazon EC2 instance store for maximum performance.
F. Amazon S3 for durable data storage, and Amazon S3 Glacier for archival storage

Answer: A

Explanation:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/InstanceStorage.html
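Answer A's archival tier can be sketched as an S3 lifecycle rule that transitions archived media to S3 Glacier, assuming boto3. The prefix and rule ID are hypothetical; the dict would be the `LifecycleConfiguration` argument to `s3.put_bucket_lifecycle_configuration`.

```python
lifecycle = {
    "Rules": [{
        "ID": "archive-unused-media",          # hypothetical rule name
        "Filter": {"Prefix": "archive/"},      # hypothetical key prefix
        "Status": "Enabled",
        # Transition objects under archive/ to Glacier immediately; durable
        # media stays in S3 Standard, hot video data lives on EBS.
        "Transitions": [{"Days": 0, "StorageClass": "GLACIER"}],
    }]
}
```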

NEW QUESTION 156


- (Exam Topic 2)
A company's application is having performance issues. The application is stateful and needs to complete in-memory tasks on Amazon EC2 instances. The company
used AWS CloudFormation to deploy infrastructure and used the M5 EC2 instance family. As traffic increased, the application performance degraded. Users are
reporting delays when they attempt to access the application.
Which solution will resolve these issues in the MOST operationally efficient way?

A. Replace the EC2 instances with T3 EC2 instances that run in an Auto Scaling group.
B. Make the changes by using the AWS Management Console.
C. Modify the CloudFormation templates to run the EC2 instances in an Auto Scaling group.
D. Increase the desired capacity and the maximum capacity of the Auto Scaling group manually when an increase is necessary.
E. Modify the CloudFormation templates.
F. Replace the EC2 instances with R5 EC2 instances.
G. Use Amazon CloudWatch built-in EC2 memory metrics to track the application performance for future capacity planning.
H. Modify the CloudFormation templates.
I. Replace the EC2 instances with R5 EC2 instances.
J. Deploy the Amazon CloudWatch agent on the EC2 instances to generate custom application latency metrics for future capacity planning.

Answer: D

Explanation:
https://aws.amazon.com/premiumsupport/knowledge-center/cloudwatch-memory-metrics-ec2/
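A minimal sketch of the CloudWatch agent configuration that publishes memory metrics (which EC2 does not emit by default and answer D relies on the agent for). The namespace and interval are hypothetical; `mem_used_percent` is the agent's standard memory measurement name.

```python
import json

agent_config = {
    "metrics": {
        "namespace": "AppMemory",  # hypothetical custom namespace
        "metrics_collected": {
            "mem": {
                # Publish memory utilization every 60 seconds.
                "measurement": ["mem_used_percent"],
                "metrics_collection_interval": 60,
            }
        },
    }
}
config_json = json.dumps(agent_config)
```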

NEW QUESTION 159


- (Exam Topic 2)
A company wants to migrate its existing on-premises monolithic application to AWS.
The company wants to keep as much of the front-end code and the back-end code as possible. However, the company wants to break the application into smaller
applications. A different team will manage each application. The company needs a highly scalable solution that minimizes operational overhead.
Which solution will meet these requirements?

A. Host the application on AWS Lambda Integrate the application with Amazon API Gateway.
B. Host the application with AWS Amplify.
C. Connect the application to an Amazon API Gateway API that is integrated with AWS Lambda.
D. Host the application on Amazon EC2 instances.
E. Set up an Application Load Balancer with EC2 instances in an Auto Scaling group as targets.
F. Host the application on Amazon Elastic Container Service (Amazon ECS) Set up an Application Load Balancer with Amazon ECS as the target.

Answer: D

Explanation:
https://aws.amazon.com/blogs/compute/microservice-delivery-with-amazon-ecs-and-application-load-balancers/

NEW QUESTION 161


- (Exam Topic 2)
An entertainment company is using Amazon DynamoDB to store media metadata. The application is read-intensive and is experiencing delays. The company does
not have staff to handle additional operational overhead and needs to improve the performance efficiency of DynamoDB without reconfiguring the application.
What should a solutions architect recommend to meet this requirement?

A. Use Amazon ElastiCache for Redis.


B. Use Amazon DynamoDB Accelerator (DAX).
C. Replicate data by using DynamoDB global tables.
D. Use Amazon ElastiCache for Memcached with Auto Discovery enabled.

Answer: B

Explanation:
https://aws.amazon.com/dynamodb/dax/
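For reference, a sketch of the boto3 `dax.create_cluster` arguments answer B implies. The cluster name, node type, and role ARN are hypothetical. The key point for "without reconfiguring the application" is that DAX exposes a DynamoDB-compatible API, so only the client endpoint changes.

```python
dax_cluster_params = {
    "ClusterName": "media-metadata-cache",  # hypothetical
    "NodeType": "dax.r4.large",             # hypothetical
    # One primary plus two read replicas across AZs for read scaling.
    "ReplicationFactor": 3,
    "IamRoleArn": "arn:aws:iam::111122223333:role/DAXServiceRole",  # hypothetical
}
```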

NEW QUESTION 162


- (Exam Topic 2)
A company runs an Oracle database on premises. As part of the company’s migration to AWS, the company wants to upgrade the database to the most recent
available version. The company also wants to set up disaster recovery (DR) for the database. The company needs to minimize the operational overhead for normal

operations and DR setup. The company also needs to maintain access to the database's underlying operating system.
Which solution will meet these requirements?

A. Migrate the Oracle database to an Amazon EC2 instance.


B. Set up database replication to a different AWS Region.
C. Migrate the Oracle database to Amazon RDS for Oracle.
D. Activate Cross-Region automated backups to replicate the snapshots to another AWS Region.
E. Migrate the Oracle database to Amazon RDS Custom for Oracle.
F. Create a read replica for the database in another AWS Region.
G. Migrate the Oracle database to Amazon RDS for Oracle.
H. Create a standby database in another Availability Zone.
H. Create a standby database in another Availability Zone.

Answer: C

Explanation:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/rds-custom.html and https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/working-with-custom-oracle.html

NEW QUESTION 167


- (Exam Topic 2)
A company has a service that produces event data. The company wants to use AWS to process the event data as it is received. The data is written in a specific
order that must be maintained throughout processing. The company wants to implement a solution that minimizes operational overhead. How should a solutions
architect accomplish this?

A. Create an Amazon Simple Queue Service (Amazon SQS) FIFO queue to hold messages Set up an AWS Lambda function to process messages from the queue
B. Create an Amazon Simple Notification Service (Amazon SNS) topic to deliver notifications containing payloads to process. Configure an AWS Lambda function
as a subscriber.
C. Create an Amazon Simple Queue Service (Amazon SQS) standard queue to hold messages.
D. Set up an AWS Lambda function to process messages from the queue independently.
E. Create an Amazon Simple Notification Service (Amazon SNS) topic to deliver notifications containing payloads to process.
F. Configure an Amazon Simple Queue Service (Amazon SQS) queue as a subscriber.

Answer: A

Explanation:
Details: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/FIFO-queues.html
FIFO (First-In-First-Out) queues are designed to enhance messaging between applications when the order of operations and events is critical, or where duplicates
can't be tolerated. Examples of situations where you might use FIFO queues include the following: To make sure that user-entered commands are run in the right
order. To display the correct product price by sending price modifications in the right order. To prevent a student from enrolling in a course before registering for an
account.
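The FIFO producer call described above can be sketched as `sqs.send_message` keyword arguments, assuming boto3. The queue URL, body, and IDs are hypothetical; the queue name must end in `.fifo`, and `MessageGroupId` is what preserves ordering within a group.

```python
msg = {
    "QueueUrl": "https://sqs.us-east-1.amazonaws.com/111122223333/events.fifo",  # hypothetical
    "MessageBody": '{"event": "price-update"}',
    # Messages sharing a group ID are delivered in send order.
    "MessageGroupId": "product-123",
    # Deduplication ID gives exactly-once delivery within the 5-minute
    # deduplication interval.
    "MessageDeduplicationId": "evt-000042",
}
```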

NEW QUESTION 169


- (Exam Topic 2)
A company produces batch data that comes from different databases. The company also produces live stream data from network sensors and application APIs.
The company needs to consolidate all the data into one place for business analytics. The company needs to process the incoming data and then stage the data in
different Amazon S3 buckets. Teams will later run one-time queries and import the data into a business intelligence tool to show key performance indicators
(KPIs).
Which combination of steps will meet these requirements with the LEAST operational overhead? (Choose two.)

A. Use Amazon Athena for one-time queries. Use Amazon QuickSight to create dashboards for KPIs.
B. Use Amazon Kinesis Data Analytics for one-time queries. Use Amazon QuickSight to create dashboards for KPIs.
C. Create custom AWS Lambda functions to move the individual records from the databases to an Amazon Redshift cluster.
D. Use an AWS Glue extract, transform, and load (ETL) job to convert the data into JSON format. Load the data into multiple Amazon OpenSearch Service (Amazon Elasticsearch Service) clusters.
E. Use blueprints in AWS Lake Formation to identify the data that can be ingested into a data lake. Use AWS Glue to crawl the source, extract the data, and load the data into Amazon S3 in Apache Parquet format.

Answer: CE

NEW QUESTION 172


- (Exam Topic 3)
A company is migrating a Linux-based web server group to AWS. The web servers must access files in a shared file store for some content. The company must
not make any changes to the application.
What should a solutions architect do to meet these requirements?

A. Create an Amazon S3 Standard bucket with access to the web servers.


B. Configure an Amazon CloudFront distribution with an Amazon S3 bucket as the origin.
C. Create an Amazon Elastic File System (Amazon EFS) file system.
D. Mount the EFS file system on all web servers.
E. Configure a General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volume.
F. Mount the EBS volume to all web servers.

Answer: C

NEW QUESTION 173


- (Exam Topic 3)
An ecommerce company needs to run a scheduled daily job to aggregate and filter sales records for analytics. The company stores the sales records in an
Amazon S3 bucket. Each object can be up to 10 GB in size. Based on the number of sales events, the job can take up to an hour to complete. The CPU and
memory usage of the job are constant and are known in advance.


A solutions architect needs to minimize the amount of operational effort that is needed for the job to run. Which solution meets these requirements?

A. Create an AWS Lambda function that has an Amazon EventBridge notification. Schedule the EventBridge event to run once a day.
B. Create an AWS Lambda function. Create an Amazon API Gateway HTTP API, and integrate the API with the function. Create an Amazon EventBridge scheduled
event that calls the API and invokes the function.
C. Create an Amazon Elastic Container Service (Amazon ECS) cluster with an AWS Fargate launch type. Create an Amazon EventBridge scheduled event that
launches an ECS task on the cluster to run the job.
D. Create an Amazon Elastic Container Service (Amazon ECS) cluster with an Amazon EC2 launch type and an Auto Scaling group with at least one EC2 instance.
E. Create an Amazon EventBridge scheduled event that launches an ECS task on the cluster to run the job.

Answer: C

NEW QUESTION 177


- (Exam Topic 3)
A company uses a payment processing system that requires messages for a particular payment ID to be received in the same order that they were sent. Otherwise,
the payments might be processed incorrectly.
Which actions should a solutions architect take to meet this requirement? (Select TWO.)

A. Write the messages to an Amazon DynamoDB table with the payment ID as the partition key
B. Write the messages to an Amazon Kinesis data stream with the payment ID as the partition key.
C. Write the messages to an Amazon ElastiCache for Memcached cluster with the payment ID as the key
D. Write the messages to an Amazon Simple Queue Service (Amazon SQS) queue Set the message attribute to use the payment ID
E. Write the messages to an Amazon Simple Queue Service (Amazon SQS) FIFO queue.
F. Set the message group ID to the payment ID.

Answer: AE
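Option A (the DynamoDB half of the answer) can be sketched as a `dynamodb.put_item` parameter dict, assuming boto3. The table and attribute names are hypothetical: the payment ID is the partition key, and a sequence number as the sort key lets consumers read a payment's messages back in send order.

```python
def payment_item(payment_id: str, seq: int, payload: str) -> dict:
    """Sketch of dynamodb.put_item(**item) for ordered per-payment messages."""
    return {
        "TableName": "payment-messages",          # hypothetical
        "Item": {
            "payment_id": {"S": payment_id},      # partition key
            "seq": {"N": str(seq)},               # sort key preserves order
            "payload": {"S": payload},
        },
    }

item = payment_item("pay-001", 1, "authorize")
```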

NEW QUESTION 181


- (Exam Topic 3)
A company is planning to migrate a commercial off-the-shelf application from its on-premises data center to AWS. The software has a software licensing model
using sockets and cores with predictable capacity and uptime requirements. The company wants to use its existing licenses, which were purchased earlier this
year.
Which Amazon EC2 pricing option is the MOST cost-effective?

A. Dedicated Reserved Hosts


B. Dedicated On-Demand Hosts
C. Dedicated Reserved Instances
D. Dedicated On-Demand Instances

Answer: A

NEW QUESTION 183


- (Exam Topic 3)
A company needs to transfer 600 TB of data from its on-premises network-attached storage (NAS) system to the AWS Cloud. The data transfer must be complete
within 2 weeks. The data is sensitive and must be encrypted in transit. The company's internet connection can support an upload speed of 100 Mbps.
Which solution meets these requirements MOST cost-effectively?

A. Use Amazon S3 multi-part upload functionality to transfer the files over HTTPS.
B. Create a VPN connection between the on-premises NAS system and the nearest AWS Region. Transfer the data over the VPN connection.
C. Use the AWS Snow Family console to order several AWS Snowball Edge Storage Optimized devices. Use the devices to transfer the data to Amazon S3.
D. Set up a 10 Gbps AWS Direct Connect connection between the company location and the nearest AWS Region. Transfer the data over a VPN connection into
the Region to store the data in Amazon S3.

Answer: D

NEW QUESTION 188


- (Exam Topic 3)
A company manages its own Amazon EC2 instances that run MySQL databases. The company is manually managing replication and scaling as demand
increases or decreases. The company needs a new solution that simplifies the process of adding or removing compute capacity to or from its database tier as
needed. The solution also must offer improved performance, scaling, and durability with minimal effort from operations.
Which solution meets these requirements?

A. Migrate the databases to Amazon Aurora Serverless for Aurora MySQL.


B. Migrate the databases to Amazon Aurora Serverless for Aurora PostgreSQL.
C. Combine the databases into one larger MySQL database.
D. Run the larger database on larger EC2 instances.
E. Create an EC2 Auto Scaling group for the database tier.
F. Migrate the existing databases to the new environment.

Answer: A

Explanation:
https://aws.amazon.com/rds/aurora/serverless/

NEW QUESTION 193


- (Exam Topic 3)
An ecommerce company has noticed performance degradation of its Amazon RDS based web application. The performance degradation is attributed to an
increase in the number of read-only SQL queries triggered by business analysts. A solutions architect needs to solve the problem with minimal changes to the

Passing Certification Exams Made Easy visit - https://www.surepassexam.com


Recommend!! Get the Full SAA-C03 dumps in VCE and PDF From SurePassExam
https://www.surepassexam.com/SAA-C03-exam-dumps.html (551 New Questions)

existing web application.


What should the solutions architect recommend?

A. Export the data to Amazon DynamoDB and have the business analysts run their queries.
B. Load the data into Amazon ElastiCache and have the business analysts run their queries.
C. Create a read replica of the primary database and have the business analysts run their queries.
D. Copy the data into an Amazon Redshift cluster and have the business analysts run their queries

Answer: C

Explanation:
Creating a read replica of the primary RDS database will offload the read-only SQL queries from the primary database, which will help to improve the performance
of the web application. Read replicas are exact copies of the primary database that can be used to handle read-only traffic, which will reduce the load on the
primary database and improve the performance of the web application. This solution can be implemented with minimal changes to the existing web application, as
the business analysts can continue to run their queries on the read replica without modifying the code.
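The read-replica pattern above can be sketched as a tiny routing helper; the endpoint hostnames and the list of read-only statement keywords are illustrative assumptions, not values from the question:

```python
# Route read-only SQL to a read replica endpoint; all writes stay on the
# primary. Hostnames below are hypothetical placeholders.
PRIMARY = "mydb.cluster-xyz.us-east-1.rds.amazonaws.com"
REPLICA = "mydb-replica.xyz.us-east-1.rds.amazonaws.com"

READ_ONLY_PREFIXES = ("SELECT", "SHOW", "DESCRIBE", "EXPLAIN")

def endpoint_for(sql: str) -> str:
    """Pick the replica for read-only statements, the primary otherwise."""
    first_word = sql.lstrip().split(None, 1)[0].upper()
    return REPLICA if first_word in READ_ONLY_PREFIXES else PRIMARY

print(endpoint_for("SELECT * FROM orders"))        # replica endpoint
print(endpoint_for("UPDATE orders SET total = 1"))  # primary endpoint
```

In practice the business analysts' tools would simply be pointed at the replica endpoint, so no application code changes are needed at all.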

NEW QUESTION 195


- (Exam Topic 3)
An image-hosting company stores its objects in Amazon S3 buckets. The company wants to avoid accidental exposure of the objects in the S3 buckets to the
public. All S3 objects in the entire AWS account need to remain private
Which solution will meet these requirements?

A. Use Amazon GuardDuty to monitor S3 bucket policies. Create an automatic remediation action rule that uses an AWS Lambda function to remediate any change that makes the objects public.
B. Use AWS Trusted Advisor to find publicly accessible S3 buckets. Configure email notifications in Trusted Advisor when a change is detected. Manually change the S3 bucket policy if it allows public access.
C. Use AWS Resource Access Manager to find publicly accessible S3 buckets. Use Amazon Simple Notification Service (Amazon SNS) to invoke an AWS Lambda function when a change is detected. Deploy a Lambda function that programmatically remediates the change.
D. Use the S3 Block Public Access feature on the account level. Use AWS Organizations to create a service control policy (SCP) that prevents IAM users from changing the setting. Apply the SCP to the account.

Answer: D
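Answer D pairs account-level S3 Block Public Access with an SCP. A sketch of such a policy document follows; the two `s3:Put...PublicAccessBlock` action names are the real IAM actions, while the Sid and overall statement shape are illustrative:

```python
import json

# Deny anyone in the member account from changing the Block Public Access
# settings (account level and bucket level). Illustrative SCP body.
scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyPublicAccessBlockChanges",
        "Effect": "Deny",
        "Action": [
            "s3:PutAccountPublicAccessBlock",
            "s3:PutBucketPublicAccessBlock",
        ],
        "Resource": "*",
    }],
}
print(json.dumps(scp, indent=2))
```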

NEW QUESTION 199


- (Exam Topic 3)
A company has a Microsoft .NET application that runs on an on-premises Windows Server. The application stores data by using an Oracle Database Standard Edition server. The company is planning a migration to AWS and wants to minimize development changes while moving the application. The AWS application environment should be highly available.
Which combination of actions should the company take to meet these requirements? (Select TWO )

A. Refactor the application as serverless with AWS Lambda functions running .NET Core
B. Rehost the application in AWS Elastic Beanstalk with the .NET platform in a Multi-AZ deployment
C. Replatform the application to run on Amazon EC2 with the Amazon Linux Amazon Machine Image (AMI)
D. Use AWS Database Migration Service (AWS DMS) to migrate from the Oracle database to Amazon DynamoDB in a Multi-AZ deployment
E. Use AWS Database Migration Service (AWS DMS) to migrate from the Oracle database to Oracle on Amazon RDS in a Multi-AZ deployment

Answer: BE

NEW QUESTION 203


- (Exam Topic 3)
A company wants to migrate its 1 PB on-premises image repository to AWS. The images will be used by a serverless web application. Images stored in the repository are rarely accessed, but they must be immediately available. Additionally, the images must be encrypted at rest and protected from accidental deletion.
Which solution meets these requirements?

A. Implement client-side encryption and store the images in an Amazon S3 Glacier vault. Set a vault lock to prevent accidental deletion.
B. Store the images in an Amazon S3 bucket in the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Enable versioning, default encryption, and MFA Delete on the S3 bucket.
C. Store the images in an Amazon FSx for Windows File Server file share. Configure the Amazon FSx file share to use an AWS Key Management Service (AWS KMS) customer master key (CMK) to encrypt the images in the file share. Use NTFS permission sets on the images to prevent accidental deletion.
D. Store the images in an Amazon Elastic File System (Amazon EFS) file share in the Infrequent Access storage class. Configure the EFS file share to use an AWS Key Management Service (AWS KMS) customer master key (CMK) to encrypt the images in the file share. Use NFS permission sets on the images to prevent accidental deletion.

Answer: B

NEW QUESTION 207


- (Exam Topic 3)
A company is implementing new data retention policies for all databases that run on Amazon RDS DB instances. The company must retain daily backups for a
minimum period of 2 years. The backups must be consistent and restorable.
Which solution should a solutions architect recommend to meet these requirements?

A. Create a backup vault in AWS Backup to retain RDS backups. Create a new backup plan with a daily schedule and an expiration period of 2 years after creation. Assign the RDS DB instances to the backup plan.
B. Configure a backup window for the RDS DB instances for daily snapshots. Assign a snapshot retention policy of 2 years to each RDS DB instance. Use Amazon Data Lifecycle Manager (Amazon DLM) to schedule snapshot deletions.
C. Configure database transaction logs to be automatically backed up to Amazon CloudWatch Logs with an expiration period of 2 years.
D. Configure an AWS Database Migration Service (AWS DMS) replication task. Deploy a replication instance, and configure a change data capture (CDC) task to stream database changes to Amazon S3 as the target. Configure S3 Lifecycle policies to delete the snapshots after 2 years.

Answer: A
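A sketch of the AWS Backup plan that option A describes, shaped like a `create_backup_plan` request body; the plan name, vault name, and schedule are hypothetical:

```python
# Shape of the AWS Backup plan for option A: a daily rule whose lifecycle
# deletes recovery points after 2 years (names and times are illustrative).
backup_plan = {
    "BackupPlanName": "rds-daily-2yr",
    "Rules": [{
        "RuleName": "daily-0500-utc",
        "TargetBackupVaultName": "rds-backup-vault",
        "ScheduleExpression": "cron(0 5 * * ? *)",   # every day at 05:00 UTC
        "Lifecycle": {"DeleteAfterDays": 2 * 365},
    }],
}
print(backup_plan["Rules"][0]["Lifecycle"]["DeleteAfterDays"])  # 730
```

Assigning the RDS DB instances to this plan gives consistent, restorable daily backups with the 2-year retention enforced centrally.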

NEW QUESTION 208


- (Exam Topic 3)
A company hosts a multi-tier web application that uses an Amazon Aurora MySQL DB cluster for storage. The application tier is hosted on Amazon EC2 instances.
The company's IT security guidelines mandate that the database credentials be encrypted and rotated every 14 days
What should a solutions architect do to meet this requirement with the LEAST operational effort?

A. Create a new AWS Key Management Service (AWS KMS) encryption key. Use AWS Secrets Manager to create a new secret that uses the KMS key with the appropriate credentials. Associate the secret with the Aurora DB cluster. Configure a custom rotation period of 14 days.
B. Create two parameters in AWS Systems Manager Parameter Store: one for the user name as a string parameter and one that uses the SecureString type for the password. Select AWS Key Management Service (AWS KMS) encryption for the password parameter, and load these parameters in the application tier. Implement an AWS Lambda function that rotates the password every 14 days.
C. Store a file that contains the credentials in an AWS Key Management Service (AWS KMS) encrypted Amazon Elastic File System (Amazon EFS) file system. Mount the EFS file system in all EC2 instances of the application tier. Restrict the access to the file on the file system so that the application can read the file and only superusers can modify the file. Implement an AWS Lambda function that rotates the key in Aurora every 14 days and writes new credentials into the file.
D. Store a file that contains the credentials in an AWS Key Management Service (AWS KMS) encrypted Amazon S3 bucket that the application uses to load the credentials. Download the file to the application regularly to ensure that the correct credentials are used. Implement an AWS Lambda function that rotates the Aurora credentials every 14 days and uploads these credentials to the file in the S3 bucket.

Answer: A
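The 14-day custom rotation in option A boils down to one Secrets Manager setting. A sketch of the `rotate_secret` request parameters (the secret name and Lambda ARN are hypothetical placeholders):

```python
# Parameters for the Secrets Manager rotation that option A describes:
# a custom rotation period of 14 days. Secret ID and rotation Lambda ARN
# are hypothetical placeholders.
rotation_request = {
    "SecretId": "prod/aurora-mysql/app",
    "RotationLambdaARN": "arn:aws:lambda:us-east-1:123456789012:function:rotate-aurora",
    "RotationRules": {"AutomaticallyAfterDays": 14},
}
print("rotate every", rotation_request["RotationRules"]["AutomaticallyAfterDays"], "days")
```

Because Secrets Manager drives the rotation Lambda itself, no operator intervention is needed after setup, which is why this is the least operational effort.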

NEW QUESTION 210


- (Exam Topic 3)
As part of budget planning, management wants a report of AWS billed items listed by user. The data will be used to create department budgets. A solutions architect needs to determine the most efficient way to obtain this report information.
Which solution meets these requirements?

A. Run a query with Amazon Athena to generate the report.


B. Create a report in Cost Explorer and download the report
C. Access the bill details from the billing dashboard and download the bill.
D. Modify a cost budget in AWS Budgets to alert with Amazon Simple Email Service (Amazon SES).

Answer: B

NEW QUESTION 211


- (Exam Topic 3)
A company has an AWS Lambda function that needs read access to an Amazon S3 bucket that is located in the same AWS account. Which solution will meet
these requirements in the MOST secure manner?

A. Apply an S3 bucket policy that grants read access to the S3 bucket


B. Apply an IAM role to the Lambda function. Apply an IAM policy to the role to grant read access to the S3 bucket.
C. Embed an access key and a secret key in the Lambda function's code to grant the required IAM permissions for read access to the S3 bucket.
D. Apply an IAM role to the Lambda function. Apply an IAM policy to the role to grant read access to all S3 buckets in the account.

Answer: B
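Option B's least-privilege setup can be sketched as the IAM policy attached to the Lambda execution role; the bucket name below is a placeholder:

```python
import json

# Least-privilege policy for option B: the Lambda execution role may read
# only the one bucket the function needs (bucket name is a placeholder).
BUCKET = "example-input-bucket"
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject"],
        "Resource": [f"arn:aws:s3:::{BUCKET}/*"],
    }],
}
print(json.dumps(policy))
```

Scoping the Resource to one bucket is what makes B more secure than D, which grants read access to every bucket in the account.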

NEW QUESTION 213


- (Exam Topic 3)
A company uses a legacy application to produce data in CSV format. The legacy application stores the output data in Amazon S3. The company is deploying a new commercial off-the-shelf (COTS) application that can perform complex SQL queries to analyze data that is stored in Amazon Redshift and Amazon S3 only. However, the COTS application cannot process the .csv files that the legacy application produces. The company cannot update the legacy application to produce data in another format. The company needs to implement a solution so that the COTS application can use the data that the legacy application produces.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create an AWS Glue extract, transform, and load (ETL) job that runs on a schedule. Configure the ETL job to process the .csv files and store the processed data in Amazon Redshift.
B. Develop a Python script that runs on Amazon EC2 instances to convert the .csv files to .sql files. Invoke the Python script on a cron schedule to store the output files in Amazon S3.
C. Create an AWS Lambda function and an Amazon DynamoDB table. Use an S3 event to invoke the Lambda function. Configure the Lambda function to perform an extract, transform, and load (ETL) job to process the .csv files and store the processed data in the DynamoDB table.
D. Use Amazon EventBridge (Amazon CloudWatch Events) to launch an Amazon EMR cluster on a weekly schedule. Configure the EMR cluster to perform an extract, transform, and load (ETL) job to process the .csv files and store the processed data in an Amazon Redshift table.

Answer: A

Explanation:
The COTS application can query only Amazon Redshift and Amazon S3, so the data must land in one of those stores in a queryable format. A scheduled AWS Glue ETL job can read the .csv files from Amazon S3, transform them, and load them into Amazon Redshift with no servers to manage, which is the least operational overhead. The Lambda and DynamoDB option loads the data into DynamoDB, which the COTS application cannot query, and the EC2 and EMR options require managing compute infrastructure.

NEW QUESTION 218


- (Exam Topic 3)
A company has an Amazon S3 data lake that is governed by AWS Lake Formation. The company wants to create a visualization in Amazon QuickSight by joining the data in the data lake with operational data that is stored in an Amazon Aurora MySQL database. The company wants to enforce column-level authorization so that the company's marketing team can access only a subset of columns in the database.
Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon EMR to ingest the data directly from the database to the QuickSight SPICE engine. Include only the required columns.
B. Use AWS Glue Studio to ingest the data from the database to the S3 data lake. Attach an IAM policy to the QuickSight users to enforce column-level access control. Use Amazon S3 as the data source in QuickSight.
C. Use AWS Glue Elastic Views to create a materialized view for the database in Amazon S3. Create an S3 bucket policy to enforce column-level access control for the QuickSight users. Use Amazon S3 as the data source in QuickSight.
D. Use a Lake Formation blueprint to ingest the data from the database to the S3 data lake. Use Lake Formation to enforce column-level access control for the QuickSight users. Use Amazon Athena as the data source in QuickSight.

Answer: D
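The column-level control in option D maps to a Lake Formation permissions grant like the one sketched below (the principal ARN, database, table, and column names are hypothetical):

```python
# Shape of a Lake Formation grant that enforces column-level access:
# the marketing principal gets SELECT on only the listed columns.
# All identifiers are hypothetical placeholders.
grant = {
    "Principal": {
        "DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/marketing"
    },
    "Resource": {
        "TableWithColumns": {
            "DatabaseName": "sales",
            "Name": "orders",
            "ColumnNames": ["order_id", "region", "order_total"],
        }
    },
    "Permissions": ["SELECT"],
}
print(grant["Resource"]["TableWithColumns"]["ColumnNames"])
```

Because Athena honors Lake Formation permissions, QuickSight queries through Athena automatically see only the granted columns, with no per-user IAM policy maintenance.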

NEW QUESTION 220


- (Exam Topic 3)
A company recently deployed a new auditing system to centralize information about operating system versions, patching, and installed software for Amazon EC2 instances. A solutions architect must ensure all instances provisioned through EC2 Auto Scaling groups successfully send reports to the auditing system as soon as they are launched and terminated.
Which solution achieves these goals MOST efficiently?

A. Use a scheduled AWS Lambda function and run a script remotely on all EC2 instances to send data to the audit system.
B. Use EC2 Auto Scaling lifecycle hooks to run a custom script to send data to the audit system when instances are launched and terminated
C. Use an EC2 Auto Scaling launch configuration to run a custom script through user data to send data to the audit system when instances are launched and
terminated
D. Run a custom script on the instance operating system to send data to the audit system Configure the script to be invoked by the EC2 Auto Scaling group when
the instance starts and is terminated

Answer: B

NEW QUESTION 223


- (Exam Topic 3)
A company runs a public three-tier web application in a VPC. The application runs on Amazon EC2 instances across multiple Availability Zones. The EC2 instances that run in private subnets need to communicate with a license server over the internet. The company needs a managed solution that minimizes operational maintenance.
Which solution meets these requirements?

A. Provision a NAT instance in a public subnet Modify each private subnets route table with a default route that points to the NAT instance
B. Provision a NAT instance in a private subnet Modify each private subnet's route table with a default route that points to the NAT instance
C. Provision a NAT gateway in a public subnet Modify each private subnet's route table with a default route that points to the NAT gateway
D. Provision a NAT gateway in a private subnet Modify each private subnet's route table with a default route that points to the NAT gateway .

Answer: C
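The routing change in option C is a single default route per private subnet route table, sketched below (the NAT gateway ID is a hypothetical placeholder):

```python
# The route that option C adds to each private subnet's route table:
# all internet-bound traffic goes to a NAT gateway that lives in a
# public subnet. The gateway ID is a hypothetical placeholder.
private_subnet_route = {
    "DestinationCidrBlock": "0.0.0.0/0",
    "NatGatewayId": "nat-0abc123def456",
}
print(private_subnet_route)
```

A NAT gateway must sit in a public subnet (with a route to an internet gateway) so it can forward the private instances' traffic out, which is why option D's private-subnet placement fails.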

NEW QUESTION 228


- (Exam Topic 3)
A company has a three-tier application for image sharing. The application uses an Amazon EC2 instance for the front-end layer, another EC2 instance for the
application layer, and a third EC2 instance for a MySQL database. A solutions architect must design a scalable and highly available solution that requires the least
amount of change to the application.
Which solution meets these requirements?

A. Use Amazon S3 to host the front-end layer. Use AWS Lambda functions for the application layer. Move the database to an Amazon DynamoDB table. Use Amazon S3 to store and serve users’ images.
B. Use load-balanced Multi-AZ AWS Elastic Beanstalk environments for the front-end layer and the application layer. Move the database to an Amazon RDS DB instance with multiple read replicas to serve users’ images.
C. Use Amazon S3 to host the front-end layer. Use a fleet of EC2 instances in an Auto Scaling group for the application layer. Move the database to a memory optimized instance type to store and serve users’ images.
D. Use load-balanced Multi-AZ AWS Elastic Beanstalk environments for the front-end layer and the application layer. Move the database to an Amazon RDS Multi-AZ DB instance. Use Amazon S3 to store and serve users’ images.

Answer: D

Explanation:
For "highly available": Multi-AZ for both the Elastic Beanstalk environments and the RDS DB instance. For "least amount of change to the application": Elastic Beanstalk automatically handles deployment, capacity provisioning, load balancing, auto scaling, and application health monitoring, so the existing application tiers can be moved largely as-is.


NEW QUESTION 233


- (Exam Topic 3)
A company is using a centralized AWS account to store log data in various Amazon S3 buckets. A solutions architect needs to ensure that the data is encrypted at
rest before the data is uploaded to the S3 buckets. The data also must be encrypted in transit.
Which solution meets these requirements?

A. Use client-side encryption to encrypt the data that is being uploaded to the S3 buckets.
B. Use server-side encryption to encrypt the data that is being uploaded to the S3 buckets.
C. Create bucket policies that require the use of server-side encryption with S3 managed encryption keys (SSE-S3) for S3 uploads.
D. Enable the security option to encrypt the S3 buckets through the use of a default AWS Key Management Service (AWS KMS) key.

Answer: A

NEW QUESTION 237


- (Exam Topic 3)
A solutions architect has created two IAM policies: Policy1 and Policy2. Both policies are attached to an IAM group.

A cloud engineer is added as an IAM user to the IAM group. Which action will the cloud engineer be able to perform?

A. Deleting IAM users


B. Deleting directories
C. Deleting Amazon EC2 instances
D. Deleting logs from Amazon CloudWatch Logs

Answer: C

Explanation:
https://awscli.amazonaws.com/v2/documentation/api/latest/reference/ds/index.html

NEW QUESTION 242


- (Exam Topic 3)
A company is running a critical business application on Amazon EC2 instances behind an Application Load Balancer. The EC2 instances run in an Auto Scaling group and access an Amazon RDS DB instance.
The design did not pass an operational review because the EC2 instances and the DB instance are all located in a single Availability Zone. A solutions architect must update the design to use a second Availability Zone.
Which solution will make the application highly available?

A. Provision a subnet in each Availability Zone. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance with connections to each network.


B. Provision two subnets that extend across both Availability Zones. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance with connections to each network.
C. Provision a subnet in each Availability Zone. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance for Multi-AZ deployment.
D. Provision a subnet that extends across both Availability Zones. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance for Multi-AZ deployment.

Answer: C

NEW QUESTION 245


- (Exam Topic 3)
A company is building a new web-based customer relationship management application. The application will use several Amazon EC2 instances that are backed
by Amazon Elastic Block Store (Amazon EBS) volumes behind an Application Load Balancer (ALB). The application will also use an Amazon Aurora database. All
data for the application must be encrypted at rest and in transit.
Which solution will meet these requirements?

A. Use AWS Key Management Service (AWS KMS) certificates on the ALB to encrypt data in transit. Use AWS Certificate Manager (ACM) to encrypt the EBS volumes and Aurora database storage at rest.
B. Use the AWS root account to log in to the AWS Management Console. Upload the company’s encryption certificates. While in the root account, select the option to turn on encryption for all data at rest and in transit for the account.
C. Use AWS Key Management Service (AWS KMS) to encrypt the EBS volumes and Aurora database storage at rest. Attach an AWS Certificate Manager (ACM) certificate to the ALB to encrypt data in transit.
D. Use BitLocker to encrypt all data at rest. Import the company’s TLS certificate keys to AWS Key Management Service (AWS KMS). Attach the KMS keys to the ALB to encrypt data in transit.

Answer: C

NEW QUESTION 248


- (Exam Topic 3)
A company needs to provide its employees with secure access to confidential and sensitive files. The company wants to ensure that the files can be accessed only by authorized users. The files must be downloaded securely to the employees' devices.
The files are stored in an on-premises Windows file server. However, due to an increase in remote usage, the file server is running out of capacity.
Which solution will meet these requirements?

A. Migrate the file server to an Amazon EC2 instance in a public subnet. Configure the security group to limit inbound traffic to the employees' IP addresses.
B. Migrate the files to an Amazon FSx for Windows File Server file system. Integrate the Amazon FSx file system with the on-premises Active Directory. Configure AWS Client VPN.
C. Migrate the files to Amazon S3, and create a private VPC endpoint. Create a signed URL to allow download.
D. Migrate the files to Amazon S3, and create a public VPC endpoint. Allow employees to sign on with AWS IAM Identity Center (AWS Single Sign-On).

Answer: B

NEW QUESTION 252


- (Exam Topic 3)
A company wants to configure its Amazon CloudFront distribution to use SSL/TLS certificates. The company does not want to use the default domain name for the distribution. Instead, the company wants to use a different domain name for the distribution.
Which solution will deploy the certificate without incurring any additional costs?

A. Request an Amazon-issued private certificate from AWS Certificate Manager (ACM) in the us-east-1 Region.
B. Request an Amazon-issued private certificate from AWS Certificate Manager (ACM) in the us-west-1 Region.
C. Request an Amazon-issued public certificate from AWS Certificate Manager (ACM) in the us-east-1 Region.
D. Request an Amazon-issued public certificate from AWS Certificate Manager (ACM) in the us-west-1 Region.

Answer: C

Explanation:
ACM public certificates are free of charge, while private certificates require a paid ACM Private CA. A certificate that CloudFront uses for an alternate domain name must be requested in the us-east-1 Region.
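Two facts drive the choice: ACM public certificates cost nothing, and a certificate attached to a CloudFront distribution must live in us-east-1. A sketch of the request parameters (the domain name is a placeholder):

```python
# CloudFront only accepts ACM certificates from us-east-1, and ACM public
# certificates are free. Illustrative request_certificate parameters;
# the domain name is a hypothetical placeholder.
ACM_REGION_FOR_CLOUDFRONT = "us-east-1"
certificate_request = {
    "DomainName": "cdn.example.com",
    "ValidationMethod": "DNS",
}
print(ACM_REGION_FOR_CLOUDFRONT, certificate_request["DomainName"])
```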

NEW QUESTION 254


- (Exam Topic 3)
A solutions architect is designing the architecture for a software demonstration environment. The environment will run on Amazon EC2 instances in an Auto Scaling group behind an Application Load Balancer (ALB). The system will experience significant increases in traffic during working hours but is not required to operate on weekends.
Which combination of actions should the solutions architect take to ensure that the system can scale to meet demand? (Select TWO)

A. Use AWS Auto Scaling to adjust the ALB capacity based on request rate
B. Use AWS Auto Scaling to scale the capacity of the VPC internet gateway
C. Launch the EC2 instances in multiple AWS Regions to distribute the load across Regions
D. Use a target tracking scaling policy to scale the Auto Scaling group based on instance CPU utilization
E. Use scheduled scaling to change the Auto Scaling group minimum, maximum, and desired capacity to zero for weekends. Revert to the default values at the start of the week.

Answer: DE
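Option E's weekend shutdown maps to two scheduled actions. A sketch of the request parameters (the group name, cron times, and capacity values are illustrative assumptions):

```python
# Two scheduled actions for option E: scale to zero Friday evening,
# restore capacity Monday morning. All values are illustrative.
scale_down = {
    "AutoScalingGroupName": "demo-asg",
    "ScheduledActionName": "weekend-off",
    "Recurrence": "0 20 * * 5",   # Friday 20:00 UTC
    "MinSize": 0, "MaxSize": 0, "DesiredCapacity": 0,
}
scale_up = {
    "AutoScalingGroupName": "demo-asg",
    "ScheduledActionName": "weekday-on",
    "Recurrence": "0 6 * * 1",    # Monday 06:00 UTC
    "MinSize": 2, "MaxSize": 10, "DesiredCapacity": 2,
}
print(scale_down["DesiredCapacity"], scale_up["DesiredCapacity"])
```

The target tracking policy in option D then handles the working-hours traffic swings between those two scheduled boundaries.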

NEW QUESTION 259


- (Exam Topic 3)
A company is designing a shared storage solution for a gaming application that is hosted in the AWS Cloud. The company needs the ability to use SMB clients to access data. The solution must be fully managed.


Which AWS solution meets these requirements?

A. Create an AWS DataSync task that shares the data as a mountable file system. Mount the file system to the application server.
B. Create an Amazon EC2 Windows instance. Install and configure a Windows file share role on the instance. Connect the application server to the file share.
C. Create an Amazon FSx for Windows File Server file system. Attach the file system to the origin server. Connect the application server to the file system.
D. Create an Amazon S3 bucket. Assign an IAM role to the application to grant access to the S3 bucket. Mount the S3 bucket to the application server.

Answer: C

NEW QUESTION 264


- (Exam Topic 3)
A company is using AWS to design a web application that will process insurance quotes. Users will request quotes from the application. Quotes must be separated by quote type, must be responded to within 24 hours, and must not get lost. The solution must maximize operational efficiency and must minimize maintenance.
Which solution meets these requirements?

A. Create multiple Amazon Kinesis data streams based on the quote type. Configure the web application to send messages to the proper data stream. Configure each backend group of application servers to use the Kinesis Client Library (KCL) to poll messages from its own data stream.
B. Create an AWS Lambda function and an Amazon Simple Notification Service (Amazon SNS) topic for each quote type. Subscribe the Lambda function to its associated SNS topic. Configure the application to publish requests for quotes to the appropriate SNS topic.
C. Create a single Amazon Simple Notification Service (Amazon SNS) topic. Subscribe Amazon Simple Queue Service (Amazon SQS) queues to the SNS topic. Configure SNS message filtering to publish messages to the proper SQS queue based on the quote type. Configure each backend application server to use its own SQS queue.
D. Create multiple Amazon Kinesis Data Firehose delivery streams based on the quote type to deliver data streams to an Amazon Elasticsearch Service (Amazon ES) cluster. Configure the application to send messages to the proper delivery stream. Configure each backend group of application servers to search for the messages from Amazon ES and process them accordingly.

Answer: C
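Option C's fan-out can be sketched with a tiny stand-in for SNS attribute-based filtering; the queue names and the `quote_type` message attribute are hypothetical:

```python
# Minimal model of option C: one SNS topic, one SQS queue per quote type,
# and an SNS filter policy on each subscription. `deliver` mimics how SNS
# matches a message attribute against each subscription's filter policy.
filter_policies = {
    "auto-quotes-queue": {"quote_type": ["auto"]},
    "home-quotes-queue": {"quote_type": ["home"]},
}

def deliver(message_attributes: dict) -> list:
    """Return the queues whose filter policy matches the message."""
    return [
        queue
        for queue, policy in filter_policies.items()
        if message_attributes.get("quote_type") in policy["quote_type"]
    ]

print(deliver({"quote_type": "auto"}))   # ['auto-quotes-queue']
```

SQS provides the durability ("must not get lost") while the filter policies provide the separation by quote type, all without custom routing code to maintain.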

NEW QUESTION 267


- (Exam Topic 3)
An ecommerce company is running a multi-tier application on AWS. The front-end and backend tiers run on Amazon EC2, and the database runs on Amazon RDS for MySQL. The backend tier communicates with the RDS instance. There are frequent calls that return identical datasets from the database and are causing performance slowdowns.
Which action should be taken to improve the performance of the backend?

A. Implement Amazon SNS to store the database calls.


B. Implement Amazon ElastiCache to cache the large datasets.
C. Implement an RDS for MySQL read replica to cache database calls.
D. Implement Amazon Kinesis Data Firehose to stream the calls to the database.

Answer: B
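Option B is the cache-aside pattern. A minimal sketch in which an in-memory dict stands in for ElastiCache and `query_db` stubs the real RDS call:

```python
# Cache-aside: check the cache first, go to the database only on a miss.
# The dict stands in for ElastiCache; query_db stubs the RDS call.
cache: dict = {}
db_hits = 0

def query_db(sql: str) -> str:
    global db_hits
    db_hits += 1
    return f"rows for: {sql}"

def cached_query(sql: str) -> str:
    if sql not in cache:              # miss -> hit the database once
        cache[sql] = query_db(sql)
    return cache[sql]                 # hit -> served from memory

cached_query("SELECT * FROM products")
cached_query("SELECT * FROM products")   # identical call, served from cache
print(db_hits)   # 1
```

Because the repeated calls return identical datasets, nearly all of them become cache hits and the load on the RDS instance drops sharply.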

NEW QUESTION 268


- (Exam Topic 3)
A gaming company is moving its public scoreboard from a data center to the AWS Cloud. The company uses Amazon EC2 Windows Server instances behind an
Application Load Balancer to host its dynamic application. The company needs a highly available storage solution for the application. The application consists of
static files and dynamic server-side code.
Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

A. Store the static files on Amazon S3. Use Amazon CloudFront to cache objects at the edge.
B. Store the static files on Amazon S3. Use Amazon ElastiCache to cache objects at the edge.
C. Store the server-side code on Amazon Elastic File System (Amazon EFS). Mount the EFS volume on each EC2 instance to share the files.
D. Store the server-side code on Amazon FSx for Windows File Server. Mount the FSx for Windows File Server volume on each EC2 instance to share the files.
E. Store the server-side code on a General Purpose SSD (gp2) Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on each EC2 instance to share the files.

Answer: AD

NEW QUESTION 269


- (Exam Topic 3)
A company has deployed a database in Amazon RDS for MySQL. Due to increased transactions, the database support team is reporting slow reads against the
DB instance and recommends adding a read replica.
Which combination of actions should a solutions architect take before implementing this change? (Choose two.)

A. Enable binlog replication on the RDS primary node.


B. Choose a failover priority for the source DB instance.
C. Allow long-running transactions to complete on the source DB instance.
D. Create a global table and specify the AWS Regions where the table will be available.
E. Enable automatic backups on the source instance by setting the backup retention period to a value other than 0.

Answer: CE

Explanation:
"An active, long-running transaction can slow the process of creating the read replica. We recommend that you wait for long-running transactions to complete
before creating a read replica. If you create multiple read replicas in parallel from the same source DB instance, Amazon RDS takes only one snapshot at the start
of the first create action. When creating a read replica, there are a few things to consider. First, you must enable automatic backups on the source DB instance by
setting the backup retention period to a value other than 0. This requirement also applies to a read replica that is the source DB instance for another read replica"


https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ReadRepl.html

NEW QUESTION 272


- (Exam Topic 3)
A company needs to retain its AWS CloudTrail logs for 3 years. The company is enforcing CloudTrail across a set of AWS accounts by using AWS Organizations
from the parent account. The CloudTrail target S3 bucket is configured with S3 Versioning enabled. An S3 Lifecycle policy is in place to delete current objects after
3 years.
After the fourth year of use of the S3 bucket, the S3 bucket metrics show that the number of objects has continued to rise. However, the number of new CloudTrail
logs that are delivered to the S3 bucket has remained consistent.
Which solution will delete objects that are older than 3 years in the MOST cost-effective manner?

A. Configure the organization’s centralized CloudTrail trail to expire objects after 3 years.
B. Configure the S3 Lifecycle policy to delete previous versions as well as current versions.
C. Create an AWS Lambda function to enumerate and delete objects from Amazon S3 that are older than 3 years.
D. Configure the parent account as the owner of all objects that are delivered to the S3 bucket.

Answer: B

Explanation:
https://docs.aws.amazon.com/awscloudtrail/latest/userguide/best-practices-security.html#:~:text=The%20Cloud
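Answer B amounts to adding a NoncurrentVersionExpiration action next to the existing current-version Expiration: on a versioned bucket, expiring a current object only adds a delete marker, and the old versions pile up as noncurrent versions unless a separate rule removes them. The rule below is a sketch in the shape of the S3 PutBucketLifecycleConfiguration payload (as used by boto3); the rule ID and bucket name are placeholders.

```python
# Lifecycle configuration in the shape expected by the S3
# PutBucketLifecycleConfiguration API (e.g. via boto3). On a bucket with
# S3 Versioning enabled, "Expiration" only places a delete marker on the
# current version; "NoncurrentVersionExpiration" is what actually removes
# the accumulated old versions.
lifecycle_config = {
    "Rules": [
        {
            "ID": "expire-cloudtrail-logs-after-3-years",  # placeholder ID
            "Filter": {"Prefix": ""},  # apply to every object in the bucket
            "Status": "Enabled",
            "Expiration": {"Days": 1095},  # current versions: ~3 years
            "NoncurrentVersionExpiration": {"NoncurrentDays": 1095},
        }
    ]
}

# With boto3 this would be applied roughly as:
# s3.put_bucket_lifecycle_configuration(
#     Bucket="my-cloudtrail-bucket",  # placeholder bucket name
#     LifecycleConfiguration=lifecycle_config,
# )
```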

NEW QUESTION 276


- (Exam Topic 3)
A company that primarily runs its application servers on premises has decided to migrate to AWS. The company wants to minimize its need to scale its Internet
Small Computer Systems Interface (iSCSI) storage on premises. The company wants only its recently accessed data to remain stored locally.
Which AWS solution should the company use to meet these requirements?

A. Amazon S3 File Gateway
B. AWS Storage Gateway Tape Gateway
C. AWS Storage Gateway Volume Gateway stored volumes
D. AWS Storage Gateway Volume Gateway cached volumes

Answer: D
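The question turns on the difference between the two Volume Gateway modes: cached volumes keep the full dataset in Amazon S3 and retain only recently accessed data in an on-premises cache (minimizing local iSCSI storage), while stored volumes keep the full dataset on premises and asynchronously back it up to S3. A tiny illustrative helper (not an AWS API) that encodes this rule of thumb:

```python
def choose_volume_gateway_mode(minimize_on_prem_storage: bool) -> str:
    """Illustrative helper (not an AWS API). Cached volumes store the full
    dataset in Amazon S3 with only hot data cached locally; stored volumes
    keep the full dataset on premises and back it up to S3."""
    return "cached volumes" if minimize_on_prem_storage else "stored volumes"
```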

NEW QUESTION 278


- (Exam Topic 3)
A media company collects and analyzes user activity data on premises. The company wants to migrate this capability to AWS. The user activity data store will
continue to grow and will be petabytes in size. The company needs to build a highly available data ingestion solution that facilitates on-demand analytics of existing
data and new data with SQL.
Which solution will meet these requirements with the LEAST operational overhead?

A. Send activity data to an Amazon Kinesis data stream. Configure the stream to deliver the data to an Amazon S3 bucket.
B. Send activity data to an Amazon Kinesis Data Firehose delivery stream. Configure the stream to deliver the data to an Amazon Redshift cluster.
C. Place activity data in an Amazon S3 bucket. Configure Amazon S3 to run an AWS Lambda function on the data as the data arrives in the S3 bucket.
D. Create an ingestion service on Amazon EC2 instances that are spread across multiple Availability Zones. Configure the service to forward data to an Amazon RDS Multi-AZ database.

Answer: B

Explanation:
Amazon Redshift is a fully managed, petabyte-scale data warehouse service in the cloud. You can start with just a few hundred gigabytes of data and scale to a
petabyte or more. This allows you to use your data to gain new insights for your business and customers. The first step to create a data warehouse is to launch a
set of nodes, called an Amazon Redshift cluster. After you provision your cluster, you can upload your data set and then perform data analysis queries. Regardless
of the size of the data set, Amazon Redshift offers fast query performance using the same SQL-based tools and business intelligence applications that you use
today.
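The Firehose-to-Redshift pipeline from answer B can be sketched as the request parameters for the Kinesis Data Firehose CreateDeliveryStream API (e.g. boto3's firehose.create_delivery_stream). All ARNs, the JDBC URL, the table name, and the credentials below are placeholders; Firehose requires an S3 staging configuration because it loads Redshift via an intermediate S3 COPY.

```python
# Request parameters in the shape of the Kinesis Data Firehose
# CreateDeliveryStream API. Every ARN, URL, name, and credential here is a
# placeholder for illustration only.
params = {
    "DeliveryStreamName": "user-activity-to-redshift",
    "DeliveryStreamType": "DirectPut",  # producers call PutRecord directly
    "RedshiftDestinationConfiguration": {
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
        "ClusterJDBCURL": "jdbc:redshift://example-cluster:5439/analytics",
        # Firehose issues a Redshift COPY into this table.
        "CopyCommand": {"DataTableName": "user_activity"},
        "Username": "firehose_user",
        "Password": "REPLACE_ME",
        # Records are staged in S3 before the COPY into Redshift.
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
            "BucketARN": "arn:aws:s3:::example-staging-bucket",
        },
    },
}

# With boto3 this would be created roughly as:
# firehose.create_delivery_stream(**params)
```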

NEW QUESTION 281


......


Thank You for Trying Our Product

We offer two products:

1st - We have Practice Tests Software with Actual Exam Questions

2nd - Questions and Answers in PDF Format

SAA-C03 Practice Exam Features:

* SAA-C03 Questions and Answers Updated Frequently

* SAA-C03 Practice Questions Verified by Expert Senior Certified Staff

* SAA-C03 Most Realistic Questions that Guarantee you a Pass on Your First Try

* SAA-C03 Practice Test Questions in Multiple Choice Formats and Updates for 1 Year

100% Actual & Verified — Instant Download, Please Click


Order The SAA-C03 Practice Test Here
