GCP 5
A. Migrate the data from the on-premises data centre to Cloud Spanner by
using the upload files function.
B. Migrate the data from the on-premises data centre to Cloud SQL for
MySQL by using the upload files function.
C. Migrate the data from the on-premises data centre to Cloud Storage by
using a custom script with gsutil commands.
D. Migrate the data from the on-premises data centre to BigQuery by using a
custom script with bq commands.
Answer: C
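For reference, a transfer of this kind is usually scripted with gsutil along these lines; the bucket name, location and source path below are placeholders, not values from the question.

    # Create a destination bucket (name and location are illustrative)
    gsutil mb -l us-central1 gs://onprem-migration-bucket
    # Copy the on-premises export directory in parallel
    gsutil -m cp -r /data/exports gs://onprem-migration-bucket/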
A. Direct the traffic through a Global HTTP(S) Load Balancer to shield your
application from GCP zone failures.
B. Provision another compute engine instance in us-west1-b and balance the
traffic across both zones.
C. Ensure you have hourly snapshots of the disk in Google Cloud Storage. In
the unlikely event of a zonal outage, use the snapshots to provision a new
Compute Engine Instance in a different zone.
D. Replace the single instance with a Managed Instance Group (MIG) and
autoscaling enabled. Configure a health check to detect failures rapidly.
Answer: B
Answer: A
Answer: C
5. You want to optimize the storage costs for long-term archival of logs. Logs
are accessed frequently in the first 30 days and are only retrieved after that if
there is a special requirement in the annual audit. The auditors may need to
look into log entries of the previous three years. What should you do?
A. Store the logs in Nearline Storage Class and set up a lifecycle policy to
transition the files older than 30 days to Archive Storage Class.
B. Store the logs in Standard Storage Class and set up a lifecycle policy to
transition the files older than 30 days to Archive Storage Class.
C. Store the logs in Standard Storage Class and set up lifecycle policies to
transition the files older than 30 days to Coldline Storage Class, and files
older than 1 year to Archive Storage Class.
D. Store the logs in Nearline Storage Class and set up lifecycle policies to
transition the files older than 30 days to Coldline Storage Class, and files
older than 1 year to Archive Storage Class.
Answer: B
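As an illustration of the chosen option, a lifecycle configuration roughly like the following (bucket name hypothetical) transitions objects older than 30 days to the Archive class:

    cat > lifecycle.json <<'EOF'
    {
      "rule": [
        {
          "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
          "condition": {"age": 30}
        }
      ]
    }
    EOF
    gsutil lifecycle set lifecycle.json gs://audit-logs-bucket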
Answer: A
7. You run a business-critical application in a Google Cloud Compute Engine
instance, and you want to set up a cost-efficient solution for backing up the
data on the boot disk. You want a solution that:
– minimizes operational overhead
– backs up boot disks daily
– allows quick restore of the backups when needed, e.g. in disaster scenarios
– deletes backups older than a month automatically.
What should you do?
Answer: A
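The low-overhead pattern behind this answer is typically a snapshot schedule attached to the boot disk; a sketch, with the schedule name, region, zone, disk name and timings all placeholders:

    # Daily snapshot schedule that keeps snapshots for roughly a month
    gcloud compute resource-policies create snapshot-schedule daily-boot-backup \
        --region=us-central1 --max-retention-days=30 \
        --daily-schedule --start-time=03:00
    # Attach the schedule to the instance's boot disk
    gcloud compute disks add-resource-policies app-boot-disk \
        --resource-policies=daily-boot-backup --zone=us-central1-a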
8. The operations manager has asked you to identify the IAM users with
Project Editor role on the GCP production project. What should you do?
Answer: C
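One command-line way to list the members holding the Editor role on a project (the project ID here is a placeholder):

    gcloud projects get-iam-policy prod-project-id \
        --flatten="bindings[].members" \
        --filter="bindings.role:roles/editor" \
        --format="value(bindings.members)"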
Answer: B
10. You manage an overnight batch job that uses 20 VMs to transfer
customer information from a CRM system to a BigQuery dataset. The job can
tolerate some VMs going down. The current high cost of the VMs makes the
overnight job unviable, and you want to reduce the costs. What should you
do?
A. Use preemptible compute engine instances to reduce cost.
B. Use a fleet of f1-micro instances behind a Managed Instance Group (MIG)
with autoscaling. Set minimum and maximum nodes to 20.
C. Use tiny f1-micro instances to reduce cost.
D. Use a fleet of f1-micro instances behind a Managed Instance Group (MIG)
with autoscaling and minimum nodes set to 1.
Answer: A
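Preemptible VMs are requested with a single flag at creation time; for example (instance name, zone and machine type are illustrative):

    gcloud compute instances create batch-worker-1 \
        --zone=us-central1-a --machine-type=n1-standard-4 --preemptible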
A. Grant the necessary IAM roles to a service account, download the JSON
key file and package it with your application.
B. Grant the necessary IAM roles to the service account used by Google
Compute Engine instance.
C. Grant the necessary IAM roles to a service account and configure the
application running on Google Compute Engine instance to use this service
account.
D. Grant the necessary IAM roles to a service account, store its credentials in
a config file and package it with your application.
Answer: B
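In practice the chosen option means granting roles to the instance's service account instead of shipping key files; a sketch with placeholder project, account and role:

    # Grant the required role to the service account the instance runs as
    gcloud projects add-iam-policy-binding my-project \
        --member="serviceAccount:app-sa@my-project.iam.gserviceaccount.com" \
        --role="roles/storage.objectViewer"
    # Attach that service account when creating the instance
    gcloud compute instances create app-vm --zone=us-central1-a \
        --service-account=app-sa@my-project.iam.gserviceaccount.com \
        --scopes=cloud-platform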
12. You want to run an application in Google Compute Engine in the app-tier
GCP project and have it export data from Cloud Bigtable to daily-us-customer-
export Cloud Storage bucket in the data-warehousing project. You plan to run
a Cloud Dataflow job in the data-warehousing project to pick up data from this
bucket for further processing. How should you design the IAM access to
enable the Compute Engine instance to push objects to the daily-us-customer-export
Cloud Storage bucket in the data-warehousing project?
Answer: C
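A sketch of the usual cross-project setup: grant the instance's service account an object-write role directly on the destination bucket (the service account email is hypothetical):

    gsutil iam ch \
        serviceAccount:export-sa@app-tier.iam.gserviceaccount.com:roles/storage.objectCreator \
        gs://daily-us-customer-export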
13. You run a batch job every month in your on-premises data centre that
downloads clickstream logs from Google Cloud Storage bucket, enriches the
data and stores them in Cloud Bigtable. The job runs for 32 hours on average,
can be restarted if interrupted, and must complete. You want to migrate this
batch job onto a cost-efficient GCP compute service. How should you deploy
it?
A. Deploy the batch job in a GKE Cluster with preemptible VM node pool.
B. Deploy the batch job on a fleet of Google Cloud Compute Engine
preemptible VMs in a Managed Instance Group (MIG) with autoscaling.
C. Deploy the batch job on a Google Cloud Compute Engine Preemptible
VM.
D. Deploy the batch job on a Google Cloud Compute Engine non-
preemptible VM. Restart instances as required.
Answer: A
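For the GKE option, a preemptible node pool can be added with something like the following (cluster, pool name, zone and node count are placeholders):

    gcloud container node-pools create batch-pool \
        --cluster=batch-cluster --zone=us-central1-a \
        --preemptible --num-nodes=3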
14. Your gaming backend uses Cloud Spanner to store leaderboard and
player profile data. You want to scale the Spanner instance based on
predictable usage patterns. What should you do?
Answer: C
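Whatever drives the schedule, the underlying scaling operation is a node-count update on the instance; for example (instance ID and node count are placeholders):

    gcloud spanner instances update leaderboard-instance --nodes=5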
Answer: C
Answer: C
Answer: A
18. You deployed the Finance team's Payroll application to Google Compute
Engine, and this application is used by staff during regular business hours.
The operations team wants to back up the VMs daily outside business
hours and delete images older than 50 days to save costs. They need an
automated solution with the least operational overhead and the least number
of GCP services. What should they do?
Answer: A
19. Your compliance team wants to review the audit logs and data access
logs in the production GCP project. You want to follow Google recommended
practices. What should you do?
Answer: D
20. You want to migrate a legacy application from your on-premises data
centre to Google Cloud Platform. The application serves SSL encrypted traffic
from worldwide clients on TCP port 443. Which GCP load balancing service
should you use to minimize latency for all clients?
A. External HTTP(S) Load Balancer.
Answer: D
21. You are deploying an application on the Google Compute Engine, and you
want to minimize network egress costs. The organization has a policy that
requires you to block all but essential egress traffic. What should you do?
B. Enable a firewall rule at priority 100 to allow ingress and essential egress
traffic.
C. Enable a firewall rule at priority 100 to block all egress traffic, and another
firewall rule at priority 65534 to allow essential egress traffic.
D. Enable a firewall rule at priority 65534 to block all egress traffic, and
another firewall rule at priority 100 to allow essential egress traffic.
Answer: D
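The selected option translates roughly to a lowest-priority deny-all egress rule plus a higher-priority allow rule for the essential traffic; the network name and allowed ports below are placeholders:

    # Block all egress at the lowest priority
    gcloud compute firewall-rules create deny-all-egress \
        --network=prod-vpc --direction=EGRESS --action=DENY \
        --rules=all --destination-ranges=0.0.0.0/0 --priority=65534
    # Allow only essential egress (e.g. HTTPS) at a higher priority
    gcloud compute firewall-rules create allow-essential-egress \
        --network=prod-vpc --direction=EGRESS --action=ALLOW \
        --rules=tcp:443 --destination-ranges=0.0.0.0/0 --priority=100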
22. You work for a startup company where every developer has a dedicated
development GCP project linked to a central billing account. Your finance lead
is concerned that some developers may leave some services running
unnecessarily or may not understand the cost implications of turning on
specific services in Google Cloud Platform. They want to be alerted when a
developer spends more than $750 per month in their GCP project. What
should you do?
A. Export Billing data from each development GCP project to a separate
BigQuery dataset. On each dataset, use a Data Studio dashboard to plot the
spending.
B. Set up a budget for each development GCP project. For each budget,
trigger an email notification when the spending exceeds $750.
C. Export Billing data from all development GCP projects to a single BigQuery
dataset. Use a Data Studio dashboard to plot the spend.
D. Set up a single budget for all development GCP projects. Trigger an email
notification when the spending exceeds $750 in the budget.
Answer: B
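Per-project budgets can also be scripted; a sketch using the budgets command, where the billing account ID, project ID and display name are placeholders:

    gcloud billing budgets create \
        --billing-account=0X0X0X-0X0X0X-0X0X0X \
        --display-name="dev-alice-monthly" \
        --budget-amount=750USD \
        --filter-projects=projects/dev-alice \
        --threshold-rule=percent=1.0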
23. Your company has deployed all its production applications in a single
Google Cloud Project and uses several GCP projects for development and
test environments. The operations team requires access to all production
services in this project to debug live issues and deploy enhancements. Your
security team prevents the creation of IAM roles that automatically broaden to
include new permissions/services in the future. How should you design the IAM
role for the operations team?
A. Create a custom role with the necessary permissions and grant the role on
the production GCP project to all members of the operations team.
B. Grant the Project Editor role at the organization level to all members of the
operations team.
C. Grant the Project Editor role on the production GCP project to all members
of the operations team.
D. Create a custom role with the necessary permissions and grant the role at
the organization level to all members of the operations team.
Answer: A
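A sketch of the custom-role approach; the role ID, permission list and group are placeholders and not meant to be an exhaustive permission set:

    # Create a custom role with an explicit, non-expanding permission list
    gcloud iam roles create opsDebugger --project=prod-project \
        --title="Ops Debugger" \
        --permissions=compute.instances.get,compute.instances.list,logging.logEntries.list
    # Grant it to the operations team on the production project only
    gcloud projects add-iam-policy-binding prod-project \
        --member="group:ops-team@example.com" \
        --role="projects/prod-project/roles/opsDebugger"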
24. EU GDPR requires you to respond to a Subject Access Request (SAR)
within one month. To be compliant, your company deployed an application
that uses Apache WebServer to provide SAR archive (tar) files back to
customers requesting them. Your compliance team has asked you to send
them an email notification when the network egress charges for this server in
the GCP project exceed $250 per month. What should you do?
A. Export the logs from Apache server to Cloud Logging and deploy a Cloud
Function to parse the logs, extract and sum up the size of response payload
for all requests during the current month; and send an email notification when
spending exceeds $250.
B. Configure a budget with the scope set to the billing account, the amount set
to $250, threshold rule set to 100% of actual cost & trigger email notifications
when spending exceeds the threshold.
C. Export the project billing data to a BigQuery dataset and deploy a Cloud
Function to extract and sum up the network egress costs from the BigQuery
dataset for the Apache server for the current month, and send an email
notification when spending exceeds $250.
D. Configure a budget with the scope set to the project, the amount set to
$250, threshold rule set to 100% of actual cost & trigger email notifications
when spending exceeds the threshold.
Answer: C
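The Cloud Function in the chosen option would run a query along these lines against the billing export table; the dataset and table names are placeholders and the exact schema depends on the export version:

    bq query --use_legacy_sql=false '
      SELECT SUM(cost) AS egress_cost
      FROM `billing_ds.gcp_billing_export_v1_XXXXXX`
      WHERE project.id = "sar-project"
        AND sku.description LIKE "%Egress%"
        AND invoice.month = FORMAT_DATE("%Y%m", CURRENT_DATE())'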
Answer: D
26. You developed an application on App Engine Service to read data from a
BigQuery dataset and convert the data to PARQUET format. The application
is using the default app-engine service account in the app-tier GCP project.
The data team owns the BigQuery dataset in the data-warehousing project.
What IAM Access should you grant to the default app-engine service account
in app-tier GCP project?
A. Grant the default app-engine service account in the app-tier GCP project
roles/bigquery.dataViewer role on the data-warehousing project.
C. Grant the default app-engine service account in the app-tier GCP project
roles/bigquery.dataViewer role on the same project.
D. Grant the default app-engine service account in the app-tier GCP project
roles/bigquery.jobUser role on the data-warehousing project.
Answer: A
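In command form, the selected grant might look like this; note the default App Engine service account has the form PROJECT_ID@appspot.gserviceaccount.com (project IDs are taken from the question, the rest is illustrative):

    gcloud projects add-iam-policy-binding data-warehousing \
        --member="serviceAccount:app-tier@appspot.gserviceaccount.com" \
        --role="roles/bigquery.dataViewer"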
27. Your company updated its business operating model recently and no
longer needs the applications deployed in the data-analytics-v1 GCP project.
You want to turn off all GCP services and APIs in this project. You want to do
this efficiently using the least number of steps while following Google
recommended practices. What should you do?
A. Ask an engineer with Project Owner IAM role to identify all resources in the
project and delete them.
B. Ask an engineer with Project Owner IAM role to locate the project and shut
it down.
Answer: B
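Shutting the project down is a single operation; it disables its services and schedules the project for deletion after a grace period:

    gcloud projects delete data-analytics-v1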
Answer: B
29. You want to monitor resource utilization (RAM, Disk, Network, CPU, etc.)
for all applications in development, test and production GCP projects in a
single dashboard. What should you do?
Answer: B
30. You have an application in your on-premises data centre with an API that
is triggered when a new file is created or updated in a NAS share. You want
to migrate this solution to Google Cloud Platform and have identified Cloud
Storage as the replacement service for NAS. How should you deploy the API?
A. Deploy the API on a GKE cluster and use Cloud Scheduler to trigger the
API to look for files in Cloud Storage that were created or updated since the
last run.
B. Trigger a Cloud Function whenever files in Cloud Storage are created or
updated.
C. Trigger a Cloud Dataflow job whenever files in Cloud Storage are created
or updated.
D. Configure Cloud Pub/Sub to capture details of files created/modified in
Cloud Storage. Deploy the API in App Engine Standard and use Cloud
Scheduler to trigger the API to fetch information from Cloud Pub/Sub.
Answer: B
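A minimal sketch of the Cloud Function trigger (1st-gen style), with the function name, runtime, entry point and bucket all placeholders:

    gcloud functions deploy on-file-change \
        --runtime=python310 --entry-point=handle_file \
        --trigger-resource=nas-replacement-bucket \
        --trigger-event=google.storage.object.finalize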
A. 1. Ensure you don’t have any persistent disks with the same name as the
VM instance. 2. Ensure the disk autodelete property is turned on
(disks.autoDelete set to true). 3. Ensure instance template syntax is valid.
B. 1. Ensure instance template syntax is valid. 2. Ensure the instance
template, instance and the persistent disk names do not conflict.
C. 1. Ensure the instance template, instance and the persistent disk names
do not conflict. 2. Ensure the disk autodelete property is turned on
(disks.autoDelete set to true).
D. 1. Ensure you don’t have any persistent disks with the same name as the
VM instance. 2. Ensure instance template syntax is valid.
Answer: A
32. You work for a multinational car insurance company that specializes in
rewarding safer drivers with cheaper premiums. Your company does this by
installing black box IoT devices in its 2 million insured drivers' cars. These
devices capture driving behaviours such as acceleration/deceleration, speed
compared to speed limits, and types of driving, such as commuting on a
freeway compared to commuting on surface streets, etc. You expect to receive
hundreds of events per minute from every device. You need to store this data
and retrieve data consistently based on the event time, and both operations
should be atomic. How should you store this data?
A. Store the data in Cloud Storage. Have a file per IoT device and append
new data to the file.
B. Store the data in Cloud Datastore. Have an entity group per device.
C. Store the data in Cloud Bigtable. Have a row key based on the ingestion
timestamp.
D. Store the data in Cloud Filestore. Have a file per IoT device and append
new data to the file.
Answer: C
Answer: B
A. Have the mobile application use signed URLs to enable time-limited
uploads to Cloud Storage.
B. Use Cloud Scheduler to trigger a Cloud Function to check for objects older
than 50 days and delete them.
C. Enable lifecycle policy on the bucket to delete objects older than 50 days.
D. Write a cron script that checks for objects older than 50 days and deletes
them.
E. Have the mobile application send the images to an SFTP server.
Answer: A & C
A. 1. Set the custom IAM role lifecycle stage to ALPHA while you test the role
in the test GCP project. 2. Restrict the custom IAM role to use permissions
with TESTING support level.
B. 1. Set the custom IAM role lifecycle stage to BETA while you test the role in
the test GCP project. 2. Restrict the custom IAM role to use permissions with
SUPPORTED support level.
C. 1. Set the custom IAM role lifecycle stage to BETA while you test the role
in the test GCP project. 2. Restrict the custom IAM role to use permissions
with TESTING support level.
D. 1. Set the custom IAM role lifecycle stage to ALPHA while you test the role
in the test GCP project. 2. Restrict the custom IAM role to use permissions
with SUPPORTED support level.
Answer: D
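For reference, the launch stage of a custom role is set when the role is created or updated; a short sketch with placeholder IDs and permissions:

    gcloud iam roles create testRole --project=test-project \
        --title="Role under test" --stage=ALPHA \
        --permissions=storage.buckets.get,storage.objects.list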
36. Your company stores an export of its Customer PII data in a multi-regional
Google Cloud Storage bucket. Your legal and compliance department has
asked you to record all operations/requests on the data in this bucket. What
should you do?
A. Enable the default Cloud Storage Service account exclusive access to read
all operations and record them.
B. Use the Data Loss Prevention API to record this information.
C. Use the Identity Aware Proxy API to record this information.
D. Turn on data access audit logging in Cloud Storage to record this
information.
Answer: D
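Data Access audit logs for Cloud Storage are switched on through the project IAM policy's auditConfigs section; a sketch (the project ID is a placeholder):

    gcloud projects get-iam-policy pii-project --format=yaml > policy.yaml
    # Add (or extend) an auditConfigs section such as:
    #   auditConfigs:
    #   - service: storage.googleapis.com
    #     auditLogConfigs:
    #     - logType: DATA_READ
    #     - logType: DATA_WRITE
    gcloud projects set-iam-policy pii-project policy.yaml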
Answer: B
38. Your company wants to migrate all compute workloads from the on-
premises data centre to Google Cloud Compute Engine. A third-party team
provides operational support for your production applications outside business
hours. Everyone at your company has a G Suite account, but the support team
do not. How should you grant them access to the VMs?
A. Use Cloud Identity Aware Proxy (IAP) to enable SSH tunnels to the VMs
and add the third-party team as a tunnel user.
B. Set up a firewall rule to open SSH port (TCP:22) to the IP range of the
third-party team.
C. Set up a Cloud VPN tunnel between the third-party network and your
production GCP project.
D. Add all the third-party team's SSH keys to the production compute engine
instances.
Answer: A
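In practice the IAP option means granting the tunnel role to the external users and having them SSH through IAP; the member, project and instance below are placeholders (the third-party users would still need some Google identity, e.g. via Cloud Identity):

    gcloud projects add-iam-policy-binding prod-project \
        --member="user:support@thirdparty.example.com" \
        --role="roles/iap.tunnelResourceAccessor"
    gcloud compute ssh prod-vm-1 --zone=us-central1-a --tunnel-through-iap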
39. All departments at your company have their own Google Cloud Projects.
You got transferred into a new department that doesn’t have a project yet, and
you are ready to deploy a new application onto a Compute Engine Instance.
What should you do?
A. In the GCP Console, enable the Compute Engine API. When creating a
new instance in the console, select the checkbox to create the instance in a
new GCP project and provide the project name and ID.
B. Use gcloud commands first to create a new project, then to enable the
Compute Engine API and finally, to launch a new compute engine instance in
the project.
C. Run gcloud compute instances create with the --project flag to automatically
create the new project and a compute engine instance. When prompted to
enable the Compute Engine API, select Yes.
D. In the GCP Console, enable the Compute Engine API. Run gcloud
compute instances create with the --project flag to automatically create the new
project and a compute engine instance.
Answer: B
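The gcloud sequence behind this answer might look as follows; the project ID, zone and machine type are placeholders, and a billing account still has to be linked before resources can be created:

    gcloud projects create finance-dept-prod --name="Finance Department"
    gcloud services enable compute.googleapis.com --project=finance-dept-prod
    gcloud compute instances create app-vm-1 \
        --project=finance-dept-prod --zone=us-central1-a --machine-type=e2-medium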
40. You deployed an application using Apache Tomcat server on a single
Google Cloud VM. Users are complaining of intermittent issues accessing a
specific page in the application, and you want to look at the logs on the local
disk. What should you do?
A. Configure a health check on the instance to identify the issue and email
you the logs when the application experiences the issue.
B. Check logs in Cloud Logging.
C. Check logs in the Serial Console.
D. Install the Cloud Logging Agent on the VM and configure it to send logs to
Cloud Logging. Check logs in Cloud Logging.
Answer: D
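On a Debian/Ubuntu VM, installing the legacy Logging agent can be scripted roughly as below (the newer Ops Agent has an equivalent install script); Tomcat log paths would then be added to the agent configuration per its documentation:

    curl -sSO https://dl.google.com/cloudagents/add-logging-agent-repo.sh
    sudo bash add-logging-agent-repo.sh --also-install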
41. Your company plans to migrate all applications from the on-premises data
centre to Google Cloud Platform and requires a monthly estimate of the cost
of running these applications in GCP. How can you provide this estimate?
A. For all GCP services/APIs you are planning to use, use the GCP pricing
calculator to estimate the monthly costs.
B. For all GCP services/APIs you are planning to use, capture the pricing from
the product pricing pages and use an Excel sheet to estimate the monthly
costs.
C. Migrate all applications to GCP and run them for a week. Use the costs
from the Billing Report page for this week to extrapolate the monthly cost of
running all applications in GCP.
D. Migrate all applications to GCP and run them for a week. Use Cloud
Monitoring to identify the costs for this week and use it to derive the monthly
cost of running all applications in GCP.
Answer: A
42. Your company uses Google Cloud for all its compute workloads. One of
the applications that you developed has passed unit testing, and you want to
use Jenkins to deploy the application in User Acceptance Testing (UAT)
environment. Your manager has asked you to automate Jenkins installation
as quickly and efficiently as possible. What should you do?
Answer: C
43. Your company deployed its applications across hundreds of GCP projects
that use different billing accounts. The finance team is struggling to add up all
production Cloud Opex costs and has requested your assistance in providing
a single pane of glass for all costs incurred by all
applications in Google Cloud. You want to include new costs as soon as they
become available. What should you do?
A. Use the Google pricing calculator for all the services used in all GCP projects
and pass the estimated cost to the finance team every month.
B. Enable Billing Export from all GCP projects to BigQuery and ask the
finance team to use Google Data Studio to visualize the data.
C. Ask the finance team to check reports view section in Cloud Billing Console.
D. Use Cloud Scheduler to trigger a Cloud Function every hour. Have the
Cloud Function download the CSV from the Cost Table page and upload the
data to BigQuery. Ask the finance team to use Google Data Studio to
visualize the data.
Answer: B
44. Your production Compute workloads are running in a subnet with a range
192.168.20.128/25. A recent surge in traffic has seen the production VMs
struggle, and you want to add more VMs, but there are no free IP addresses
in the VPC. All new and old VMs need to communicate with each other. How
can you do this with the fewest steps?
Answer: B
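The least-disruptive fix in this situation is normally to widen the existing subnet's primary range in place; a sketch with a placeholder subnet name and region:

    # Expand 192.168.20.128/25 to a /24; existing VM addresses are unaffected
    gcloud compute networks subnets expand-ip-range prod-subnet \
        --region=us-west1 --prefix-length=24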
45. Your compliance team has asked you to set up an external auditor access
to logs from all GCP projects for the last 60 days. The auditor wants to
combine, explore and analyze the contents of the logs from all projects quickly
and efficiently. You want to follow Google Recommended practices. What
should you do?
A. Set up a Cloud Storage sink destination to export logs from all the projects
to a bucket. Configure a lifecycle rule to delete objects older than 60 days.
Ask the auditor to query logs from the bucket.
B. Set up a Cloud Scheduler job to trigger a Cloud Function that reads and
exports logs from all the projects to a BigQuery dataset. Configure the table
expiration on the dataset to 60 days. Ask the auditor to query logs from the
dataset.
C. Set up a BigQuery sink destination to export logs from all the projects to a
dataset. Configure the table expiration on the dataset to 60 days. Ask the
auditor to query logs from the dataset.
D. Ask the auditor to query logs from Cloud Logging.
Answer: C
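The BigQuery sink option is commonly implemented as an aggregated sink at the organization level; the organization ID, destination project and dataset below are placeholders:

    # Export logs from all projects under the organization to one dataset
    gcloud logging sinks create auditor-sink \
        bigquery.googleapis.com/projects/audit-project/datasets/auditor_logs \
        --organization=123456789012 --include-children
    # Grant the sink's writer identity (printed above) roles/bigquery.dataEditor
    # on the dataset, and age the tables out after 60 days:
    bq update --default_table_expiration 5184000 audit-project:auditor_logs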
46. You work for a multinational delivery services company that uses Apache
Cassandra DB as the backend store for its delivery track and trace system.
The existing on-premises data centre is out of space. To cope with an
anticipated increase in requests in the run-up to Christmas, you want to move
this application rapidly to Google Cloud with minimal effort whilst ensuring you
can spin up multiple stacks (development, test, production) and isolate them
from each other. How can you do this?
A. Download the installation guide for Cassandra on GCP and follow the
instructions to install the database.
B. Launch Cassandra DB from Cloud Marketplace.
C. Install an instance of Cassandra DB on Google Cloud Compute Engine,
take a snapshot of this instance and use the snapshot to spin up additional
instances of Cassandra DB.
D. Install an instance of Cassandra DB on Google Cloud Compute Engine,
take a snapshot of this instance and upload to Google Cloud Storage bucket.
Every time you need a new instance of Cassandra DB, spin up a new
compute engine instance from the snapshot.
Answer: B