
[2021.5] Valid Amazon AWS DBS-C01 Practice Questions Free Share From Pass4itsure

The Amazon AWS DBS-C01 exam is difficult, but with the Pass4itsure DBS-C01 dumps https://www.pass4itsure.com/aws-certified-database-specialty.html preparation material, candidates can pass it more easily. The DBS-C01 practice tests mirror the format of the actual exam. If you master the techniques you gain through practice, it will be easier to achieve your target score.

Amazon AWS DBS-C01 pdf free https://drive.google.com/file/d/16YqKaTSxNTW4PhDIrMlTPcFuL76zLbgg/view?usp=sharing

Latest Amazon DBS-C01 dumps Practice test video tutorial

Latest Amazon AWS DBS-C01 practice exam questions here:

QUESTION 1
A company just migrated to Amazon Aurora PostgreSQL from an on-premises Oracle database. After the migration, the
company discovered there is a period every day around 3:00 PM when the application's response time is
noticeably slower. The company has narrowed the cause of this issue down to the database rather than the application.
Which set of steps should the Database Specialist take to most efficiently find the problematic PostgreSQL query?
A. Create an Amazon CloudWatch dashboard to show the number of connections, CPU usage, and disk space
consumption. Watch these dashboards during the next slow period.
B. Launch an Amazon EC2 instance, and install and configure an open-source PostgreSQL monitoring tool that will run
reports based on the output error logs.
C. Modify the logging database parameter to log all the queries related to locking in the database and then check the
logs after the next slow period for this information.
D. Enable Amazon RDS Performance Insights on the PostgreSQL database. Use the metrics to identify any queries that
are related to spikes in the graph during the next slow period.
Correct Answer: D
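As a sketch of answer D, Performance Insights can be turned on for an existing instance through the RDS `ModifyDBInstance` API. The instance identifier below is hypothetical, and the boto3 call is commented out so the snippet runs without AWS credentials.

```python
# Sketch: enable Amazon RDS Performance Insights on an existing DB instance.
# The identifier is hypothetical; replace it with the real instance name.
params = {
    "DBInstanceIdentifier": "aurora-postgres-writer-1",  # hypothetical
    "EnablePerformanceInsights": True,
    "PerformanceInsightsRetentionPeriod": 7,  # days; 7 is the free retention tier
}

# import boto3
# rds = boto3.client("rds")
# rds.modify_db_instance(**params, ApplyImmediately=True)
```

Once enabled, the Performance Insights dashboard breaks database load down by SQL statement and wait event, which is what makes the problematic query visible during the 3:00 PM window.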


QUESTION 2
A company uses the Amazon DynamoDB table contractDB in us-east-1 for its contract system with the following
schema:
1. orderID (primary key)
2. timestamp (sort key)
3. contract (map)
4. createdBy (string)
5. customerEmail (string)
After a problem in production, the operations team has asked a database specialist to provide an IAM policy that allows reading
items from the database to debug the application. In addition, to stay compliant, the developer must not be able to access the value of the
customerEmail field.
Which IAM policy should the database specialist use to achieve these requirements?

DBS-C01 exam questions-q2

DBS-C01 exam questions-q2-2

DBS-C01 exam questions-q2-3

DBS-C01 exam questions-q2-4

A. Option A
B. Option B
C. Option C
D. Option D
Correct Answer: A
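The answer options are shown as images above, but the mechanism the correct option relies on is DynamoDB fine-grained access control: the `dynamodb:Attributes` condition key whitelists every attribute except customerEmail, and `dynamodb:Select` forces callers to name the attributes they read. A sketch of such a policy (account ID is hypothetical; the attribute list comes from the table schema in the question):

```python
import json

# Sketch of a fine-grained read policy that hides customerEmail.
# The account ID in the ARN is hypothetical.
read_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:Query", "dynamodb:Scan"],
        "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/contractDB",
        "Condition": {
            # Allow only these attributes to be requested or returned.
            "ForAllValues:StringEquals": {
                "dynamodb:Attributes": [
                    "orderID", "timestamp", "contract", "createdBy"
                ]
            },
            # Force requests to name specific attributes so the whitelist applies.
            "StringEqualsIfExists": {"dynamodb:Select": "SPECIFIC_ATTRIBUTES"}
        }
    }]
}
print(json.dumps(read_policy, indent=2))
```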

QUESTION 3
A company is moving its fraud detection application from on premises to the AWS Cloud and is using Amazon Neptune
for data storage. The company has set up a 1 Gbps AWS Direct Connect connection to migrate 25 TB of fraud detection
data from the on-premises data center to a Neptune DB instance. The company already has an Amazon S3 bucket and
an S3 VPC endpoint, and 80% of the company's network bandwidth is available.
How should the company perform this data load?
A. Use an AWS SDK with a multipart upload to transfer the data from on premises to the S3 bucket. Use the Copy
command for Neptune to move the data in bulk from the S3 bucket to the Neptune DB instance.
B. Use AWS Database Migration Service (AWS DMS) to transfer the data from on premises to the S3 bucket. Use the
Loader command for Neptune to move the data in bulk from the S3 bucket to the Neptune DB instance.
C. Use AWS DataSync to transfer the data from on premises to the S3 bucket. Use the Loader command for Neptune to
move the data in bulk from the S3 bucket to the Neptune DB instance.
D. Use the AWS CLI to transfer the data from on premises to the S3 bucket. Use the Copy command for Neptune to
move the data in bulk from the S3 bucket to the Neptune DB instance.
Correct Answer: C
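As a sketch of the second half of answer C: once AWS DataSync has landed the files in S3, the Neptune bulk loader is started with a POST to the cluster's `/loader` endpoint. The bucket, role ARN, and endpoint below are hypothetical; the HTTP call is commented out so the snippet runs offline.

```python
# Sketch of a Neptune Loader command request body. Bucket, role, and
# endpoint names are hypothetical.
loader_request = {
    "source": "s3://fraud-detection-bucket/neptune-load/",
    "format": "csv",  # other supported formats include ntriples, nquads, turtle
    "iamRoleArn": "arn:aws:iam::123456789012:role/NeptuneLoadFromS3",
    "region": "us-east-1",
    "failOnError": "FALSE",
}

# import requests
# endpoint = "https://my-neptune.cluster-xyz.us-east-1.neptune.amazonaws.com:8182/loader"
# requests.post(endpoint, json=loader_request)
```

The S3 VPC endpoint mentioned in the question is a prerequisite here: the loader reads from S3 over the VPC endpoint using the IAM role attached to the cluster.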


QUESTION 4
A company is building a new web platform where user requests trigger an AWS Lambda function that performs an insert
into an Amazon Aurora MySQL DB cluster. Initial tests with less than 10 users on the new platform yielded successful
execution and fast response times. However, upon more extensive tests with the actual target of 3,000 concurrent
users, Lambda functions are unable to connect to the DB cluster and receive "too many connections" errors. Which of the
following will resolve this issue?
A. Edit the my.cnf file for the DB cluster to increase max_connections
B. Increase the instance size of the DB cluster
C. Change the DB cluster to Multi-AZ
D. Increase the number of Aurora Replicas
Correct Answer: B

QUESTION 5
A large financial services company requires that all data be encrypted in transit. A Developer is attempting to connect to
an Amazon RDS DB instance using the company VPC for the first time with credentials provided by a Database
Specialist. Other members of the Development team can connect, but this user is consistently receiving an error
indicating a communications link failure. The Developer asked the Database Specialist to reset the password a number
of times, but the error persists. Which step should be taken to troubleshoot this issue?
A. Ensure that the database option group for the RDS DB instance allows ingress from the Developer machine's IP
address
B. Ensure that the RDS DB instance's subnet group includes a public subnet to allow the Developer to connect
C. Ensure that the RDS DB instance has not reached its maximum connections limit
D. Ensure that the connection is using SSL and is addressing the port where the RDS DB instance is listening for
encrypted connections
Correct Answer: B


QUESTION 6
A media company is using Amazon RDS for PostgreSQL to store user data. The RDS DB instance currently has a
publicly accessible setting enabled and is hosted in a public subnet. Following a recent AWS Well-Architected
Framework review, a Database Specialist was given new security requirements.
1. Only certain on-premises corporate network IPs should connect to the DB instance.
2. Connectivity is allowed from the corporate network only.
Which combination of steps does the Database Specialist need to take to meet these new requirements? (Choose
three.)
A. Modify the pg_hba.conf file. Add the required corporate network IPs and remove the unwanted IPs.
B. Modify the associated security group. Add the required corporate network IPs and remove the unwanted IPs.
C. Move the DB instance to a private subnet using AWS DMS.
D. Enable VPC peering between the application host running on the corporate network and the VPC associated with the
DB instance.
E. Disable the publicly accessible setting.
F. Connect to the DB instance using private IPs and a VPN.
Correct Answer: DEF
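Step E from the correct combination can be sketched with the RDS `ModifyDBInstance` API, which toggles the publicly accessible flag. The instance identifier is hypothetical and the boto3 call is commented out so the snippet runs offline.

```python
# Sketch: disable the publicly accessible setting on an RDS instance.
# The identifier is hypothetical.
params = {
    "DBInstanceIdentifier": "media-postgres-prod",  # hypothetical
    "PubliclyAccessible": False,
    "ApplyImmediately": True,
}

# import boto3
# boto3.client("rds").modify_db_instance(**params)
```

With the public endpoint removed, traffic reaches the instance only over private IPs, which is why the VPN (F) and VPC peering (D) steps complete the picture.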

QUESTION 7
A company is using Amazon Aurora PostgreSQL for the backend of its application. The system users are complaining
that the responses are slow. A database specialist has determined that the queries to Aurora take longer during peak
times. With the Amazon RDS Performance Insights dashboard, the load in the chart for average active sessions is often above the line that denotes maximum CPU usage and the wait state shows that most wait events are IO:XactSync.
What should the company do to resolve these performance issues?
A. Add an Aurora Replica to scale the read traffic.
B. Scale up the DB instance class.
C. Modify applications to commit transactions in batches.
D. Modify applications to avoid conflicts by taking locks.
Correct Answer: A


QUESTION 8
An electric utility company wants to store power plant sensor data in an Amazon DynamoDB table. The utility company
has over 100 power plants and each power plant has over 200 sensors that send data every 2 seconds. The sensor
data includes time with milliseconds precision, a value, and a fault attribute if the sensor is malfunctioning. Power plants
are identified by a globally unique identifier. Sensors are identified by a unique identifier within each power plant. A
database specialist needs to design the table to support an efficient method of finding all faulty sensors within a given
power plant.
Which schema should the database specialist use when creating the DynamoDB table to achieve the fastest query time
when looking for faulty sensors?
A. Use the plant identifier as the partition key and the measurement time as the sort key. Create a global secondary
index (GSI) with the plant identifier as the partition key and the fault attribute as the sort key.
B. Create a composite of the plant identifier and sensor identifier as the partition key. Use the measurement time as the
sort key. Create a local secondary index (LSI) on the fault attribute.
C. Create a composite of the plant identifier and sensor identifier as the partition key. Use the measurement time as the
sort key. Create a global secondary index (GSI) with the plant identifier as the partition key and the fault attribute as the
sort key.
D. Use the plant identifier as the partition key and the sensor identifier as the sort key. Create a local secondary index
(LSI) on the fault attribute.
Correct Answer: B
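The schema in answer B can be sketched as DynamoDB `CreateTable` parameters: a composite "plantId#sensorId" partition key, the measurement time as the sort key, and a local secondary index on the fault attribute. Because the fault attribute is present only on malfunctioning readings, the LSI is sparse and a query against it returns only faulty items. Table, index, and attribute names are hypothetical.

```python
# Sketch of the DynamoDB table from answer B. Names are hypothetical.
table_definition = {
    "TableName": "SensorReadings",
    "KeySchema": [
        {"AttributeName": "plantSensorId", "KeyType": "HASH"},    # "plantId#sensorId"
        {"AttributeName": "measurementTime", "KeyType": "RANGE"},
    ],
    "AttributeDefinitions": [
        {"AttributeName": "plantSensorId", "AttributeType": "S"},
        {"AttributeName": "measurementTime", "AttributeType": "S"},
        {"AttributeName": "fault", "AttributeType": "S"},
    ],
    "LocalSecondaryIndexes": [{
        "IndexName": "FaultIndex",
        "KeySchema": [
            {"AttributeName": "plantSensorId", "KeyType": "HASH"},
            {"AttributeName": "fault", "KeyType": "RANGE"},  # sparse: only faulty items
        ],
        "Projection": {"ProjectionType": "KEYS_ONLY"},
    }],
    "BillingMode": "PAY_PER_REQUEST",
}

# import boto3
# boto3.client("dynamodb").create_table(**table_definition)
```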

QUESTION 9
A company is looking to migrate a 1 TB Oracle database from on-premises to an Amazon Aurora PostgreSQL DB
cluster. The company's Database Specialist discovered that the Oracle database is storing 100 GB of large binary
objects (LOBs) across multiple tables. The Oracle database has a maximum LOB size of 500 MB with an average LOB
size of 350 MB. The Database Specialist has chosen AWS DMS to migrate the data with the largest replication
instances. How should the Database Specialist optimize the database migration using AWS DMS?
A. Create a single task using full LOB mode with a LOB chunk size of 500 MB to migrate the data and LOBs together
B. Create two tasks: task1 with LOB tables using full LOB mode with a LOB chunk size of 500 MB and task2 without
LOBs
C. Create two tasks: task1 with LOB tables using limited LOB mode with a maximum LOB size of 500 MB and task 2
without LOBs
D. Create a single task using limited LOB mode with a maximum LOB size of 500 MB to migrate data and LOBs
together
Correct Answer: C
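The LOB half of answer C maps to the `TargetMetadata` section of a DMS task-settings document. Limited LOB mode migrates each LOB up to `LobMaxSize` (expressed in KB) in one piece, which is faster than full LOB mode's chunked fetches. A sketch of task 1's settings, assuming the 500 MB ceiling from the question:

```python
# Sketch of DMS task settings for limited LOB mode (task 1 in answer C).
task_settings = {
    "TargetMetadata": {
        "SupportLobs": True,
        "FullLobMode": False,
        "LimitedSizeLobMode": True,
        "LobMaxSize": 512000,  # KB; 500 MB covers the largest LOB in the source
        "LobChunkSize": 0,     # only used in full LOB mode
    }
}
```

Splitting the LOB tables into their own task keeps the non-LOB task fast and lets the LOB task use the replication instance's memory without contention.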


QUESTION 10
An AWS CloudFormation stack that included an Amazon RDS DB instance was accidentally deleted and recent data
was lost. A Database Specialist needs to add RDS settings to the CloudFormation template to reduce the chance of
accidental instance data loss in the future.
Which settings will meet this requirement? (Choose three.)
A. Set DeletionProtection to True
B. Set MultiAZ to True
C. Set TerminationProtection to True
D. Set DeleteAutomatedBackups to False
E. Set DeletionPolicy to Delete
F. Set DeletionPolicy to Retain
Correct Answer: ACF
Reference: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-attributedeletionpolicy.html
https://aws.amazon.com/premiumsupport/knowledge-center/cloudformation-accidental-updates/
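Two of the three chosen settings live in the template itself; stack termination protection (answer C) is applied to the stack from outside. A sketch of a template fragment, with illustrative resource name and instance properties:

```yaml
Resources:
  AppDatabase:
    Type: AWS::RDS::DBInstance
    DeletionPolicy: Retain            # keep the instance if the stack is deleted
    Properties:
      DBInstanceClass: db.t3.medium   # illustrative
      Engine: mysql
      DeletionProtection: true        # block DeleteDBInstance calls

# Stack-level termination protection is enabled outside the template, e.g.:
#   aws cloudformation update-termination-protection \
#     --stack-name my-stack --enable-termination-protection
```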

QUESTION 11
A company is running Amazon RDS for MySQL for its workloads. There is downtime when AWS operating system
patches are applied during the Amazon RDS-specified maintenance window.
What is the MOST cost-effective action that should be taken to avoid downtime?
A. Migrate the workloads from Amazon RDS for MySQL to Amazon DynamoDB
B. Enable cross-Region read replicas and direct read traffic to them when Amazon RDS is down
C. Enable a read replica and direct read traffic to it when Amazon RDS is down
D. Enable an Amazon RDS for MySQL Multi-AZ configuration
Correct Answer: C
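Answer C can be sketched with the RDS `CreateDBInstanceReadReplica` API; the replica then serves read traffic while the primary is being patched. Both identifiers below are hypothetical, and the boto3 call is commented out so the snippet runs offline.

```python
# Sketch: create an in-Region read replica of the production MySQL instance.
# Identifiers are hypothetical.
replica_params = {
    "DBInstanceIdentifier": "workload-mysql-replica-1",
    "SourceDBInstanceIdentifier": "workload-mysql-prod",
}

# import boto3
# boto3.client("rds").create_db_instance_read_replica(**replica_params)
```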


QUESTION 12
A company runs a customer relationship management (CRM) system that is hosted on-premises with a MySQL
database as the backend. A custom stored procedure is used to send email notifications to another system when data is
inserted into a table. The company has noticed that the performance of the CRM system has decreased due to
database reporting applications used by various teams. The company requires an AWS solution that would reduce
maintenance, improve performance, and accommodate the email notification feature.
Which AWS solution meets these requirements?
A. Use MySQL running on an Amazon EC2 instance with Auto Scaling to accommodate the reporting applications.
Configure a stored procedure and an AWS Lambda function that uses Amazon SES to send email notifications to the
other system.
B. Use Amazon Aurora MySQL in a multi-master cluster to accommodate the reporting applications. Configure Amazon
RDS event subscriptions to publish a message to an Amazon SNS topic and subscribe the other system's email
address to the topic.
C. Use MySQL running on an Amazon EC2 instance with a read replica to accommodate the reporting applications.
Configure Amazon SES integration to send email notifications to the other system.
D. Use Amazon Aurora MySQL with a read replica for the reporting applications. Configure a stored procedure and an
AWS Lambda function to publish a message to an Amazon SNS topic. Subscribe the other system's email address to
the topic.
Correct Answer: D
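The notification path in answer D can be sketched as a Lambda function that publishes each inserted row to an SNS topic; the other system's email address is subscribed to that topic. The topic ARN is hypothetical, and the real `sns:Publish` call is commented out so the snippet runs without AWS credentials.

```python
import json

# Hypothetical SNS topic whose email subscription notifies the other system.
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:crm-insert-notifications"

def build_publish_args(record: dict) -> dict:
    """Shape the sns:Publish call for one inserted CRM row."""
    return {
        "TopicArn": TOPIC_ARN,
        "Subject": "New CRM record inserted",
        "Message": json.dumps(record),
    }

def lambda_handler(event, context):
    args = build_publish_args(event.get("record", {}))
    # boto3.client("sns").publish(**args)  # real call, needs AWS credentials
    return args
```

Invoking this Lambda from the stored procedure (Aurora MySQL supports `lambda_async` / native Lambda integration) preserves the existing insert-triggered workflow while moving email delivery off the database.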


QUESTION 13
A ride-hailing application uses an Amazon RDS for MySQL DB instance as persistent storage for bookings. This
application is very popular and the company expects a tenfold increase in the user base in the next few months. The
application experiences more traffic during the morning and evening hours.
This application has two parts:
1. An in-house booking component that accepts online bookings that directly correspond to simultaneous requests from users.
2. A third-party customer relationship management (CRM) component used by customer care representatives. The CRM uses queries to access booking data.
A database specialist needs to design a cost-effective database solution to handle this workload.
Which solution meets these requirements?
A. Use Amazon ElastiCache for Redis to accept the bookings. Associate an AWS Lambda function to capture changes
and push the booking data to the RDS for MySQL DB instance used by the CRM.
B. Use Amazon DynamoDB to accept the bookings. Enable DynamoDB Streams and associate an AWS Lambda
function to capture changes and push the booking data to an Amazon SQS queue. This triggers another Lambda
function that pulls data from Amazon SQS and writes it to the RDS for MySQL DB instance used by the CRM.
C. Use Amazon ElastiCache for Redis to accept the bookings. Associate an AWS Lambda function to capture changes
and push the booking data to an Amazon Redshift database used by the CRM.
D. Use Amazon DynamoDB to accept the bookings. Enable DynamoDB Streams and associate an AWS Lambda
function to capture changes and push the booking data to Amazon Athena, which is used by the CRM.
Correct Answer: A

Welcome to download the valid Pass4itsure DBS-C01 pdf

Free download: Google Drive
Amazon AWS DBS-C01 pdf https://drive.google.com/file/d/16YqKaTSxNTW4PhDIrMlTPcFuL76zLbgg/view?usp=sharing

Pass4itsure latest Amazon exam dumps coupon code free share

Summary:

New Amazon DBS-C01 exam questions from Pass4itsure DBS-C01 dumps! Welcome to download the newest Pass4itsure DBS-C01 dumps https://www.pass4itsure.com/aws-certified-database-specialty.html (145 Q&As), with verified, up-to-date DBS-C01 practice test questions and answers.

Amazon AWS DBS-C01 dumps pdf free share https://drive.google.com/file/d/16YqKaTSxNTW4PhDIrMlTPcFuL76zLbgg/view?usp=sharing