
Latest DBS-C01 Exam Questions


Question # 1



A gaming company recently purchased a popular iOS game that sees especially heavy use during the Christmas season. The company has decided to add a leaderboard to the game, powered by Amazon DynamoDB. The application's load is expected to increase significantly during the Christmas season.
Which solution meets these requirements at the lowest cost?

A.

DynamoDB Streams

B.

DynamoDB with DynamoDB Accelerator

C.

DynamoDB with on-demand capacity mode

D.

DynamoDB with provisioned capacity mode with Auto Scaling




D.
  

DynamoDB with provisioned capacity mode with Auto Scaling



Explanation: "On-demand is ideal for bursty, new, or unpredictable workloads whose
traffic can spike in seconds or minutes"
vs.
'DynamoDB released auto scaling to make it easier for you to manage capacity efficiently,
and auto scaling continues to help DynamoDB users lower the cost of workloads that have
a predictable traffic pattern."
https://aws.amazon.com/blogs/database/amazon-dynamodb-auto-scaling-performanceand-
cost-optimization-at-any-scale/
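
For reference, auto scaling for a provisioned-capacity table is configured through Application Auto Scaling rather than on the table itself. A minimal boto3 sketch, assuming a table named Leaderboard and illustrative capacity bounds:

import boto3

autoscaling = boto3.client("application-autoscaling")

# Register the table's write capacity as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="dynamodb",
    ResourceId="table/Leaderboard",          # assumed table name
    ScalableDimension="dynamodb:table:WriteCapacityUnits",
    MinCapacity=5,
    MaxCapacity=500,
)

# Target tracking keeps consumed capacity near 70% of what is provisioned,
# scaling up for the holiday peak and back down afterward.
autoscaling.put_scaling_policy(
    PolicyName="leaderboard-write-scaling",
    ServiceNamespace="dynamodb",
    ResourceId="table/Leaderboard",
    ScalableDimension="dynamodb:table:WriteCapacityUnits",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "DynamoDBWriteCapacityUtilization"
        },
    },
)

The same pair of calls with ScalableDimension "dynamodb:table:ReadCapacityUnits" covers read capacity.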





Question # 2



A company migrated one of its business-critical database workloads to an Amazon Aurora
Multi-AZ DB cluster. The company requires a very low RTO and needs to improve the
application recovery time after database failovers.
Which approach meets these requirements?

A.

Set the max_connections parameter to 16,000 in the instance-level parameter group.

B.

Modify the client connection timeout to 300 seconds.

C.

Create an Amazon RDS Proxy, and update client connections to point to the proxy endpoint.

D.

Enable the query cache at the instance level.




C.
  

Create an Amazon RDS Proxy, and update client connections to point to the proxy endpoint.



Explanation:
Amazon RDS Proxy allows applications to pool and share connections established with the
database, improving database efficiency and application scalability. With RDS Proxy,
failover times for Aurora and RDS databases are reduced by up to 66% and database
credentials, authentication, and access can be managed through integration with AWS
Secrets Manager and AWS Identity and Access Management (IAM).
https://aws.amazon.com/rds/proxy/
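
A minimal boto3 sketch of putting RDS Proxy in front of the Aurora cluster; all names, ARNs, and subnet IDs below are placeholder assumptions:

import boto3

rds = boto3.client("rds")

# Create the proxy. Credentials come from Secrets Manager, and the
# proxy assumes an IAM role to read them.
rds.create_db_proxy(
    DBProxyName="app-proxy",
    EngineFamily="MYSQL",
    Auth=[{
        "AuthScheme": "SECRETS",
        "SecretArn": "arn:aws:secretsmanager:us-east-1:123456789012:secret:db-creds",
        "IAMAuth": "DISABLED",
    }],
    RoleArn="arn:aws:iam::123456789012:role/rds-proxy-role",
    VpcSubnetIds=["subnet-0abc", "subnet-0def"],
)

# Point the proxy at the Aurora cluster. Clients then connect to the
# proxy endpoint instead of the cluster endpoint, so failovers are
# handled by the proxy's pooled connections.
rds.register_db_proxy_targets(
    DBProxyName="app-proxy",
    TargetGroupName="default",
    DBClusterIdentifiers=["aurora-prod-cluster"],
)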





Question # 3



A company is launching a new Amazon RDS for SQL Server DB instance and wants to enable auditing of the SQL Server database.
Which steps should a database specialist take in combination to meet this requirement? (Select two.)

A.

Create a service-linked role for Amazon RDS that grants permissions for Amazon RDS
to store audit logs on Amazon S3.

B.

Set up a parameter group to configure an IAM role and an Amazon S3 bucket for audit log storage. Associate the parameter group with the DB instance.

C.

Disable Multi-AZ on the DB instance, and then enable auditing. Enable Multi-AZ after
auditing is enabled.

D.

Disable automated backup on the DB instance, and then enable auditing. Enable automated backup after auditing is enabled.

E.

Set up an option group to configure an IAM role and an Amazon S3 bucket for audit log storage. Associate the option group with the DB instance.




A.
  

Create a service-linked role for Amazon RDS that grants permissions for Amazon RDS
to store audit logs on Amazon S3.




E.
  

Set up an option group to configure an IAM role and an Amazon S3 bucket for audit log storage. Associate the option group with the DB instance.



Explanation:
Auditing for RDS for SQL Server is enabled through the SQLSERVER_AUDIT option in an option group; the option settings reference an IAM role and an Amazon S3 bucket for audit log delivery.
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.SQLServer.Options.Audit.html
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/security_iam_service-with-iam.html
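
A boto3 sketch of the option-group setup; the engine edition, major version, names, and ARNs are assumptions, while IAM_ROLE_ARN and S3_BUCKET_ARN are the documented SQLSERVER_AUDIT option settings:

import boto3

rds = boto3.client("rds")

# Option groups are engine- and major-version-specific.
rds.create_option_group(
    OptionGroupName="sqlserver-audit-og",
    EngineName="sqlserver-se",            # assumed edition
    MajorEngineVersion="15.00",           # assumed major version
    OptionGroupDescription="SQL Server audit logs to S3",
)

# Attach the SQLSERVER_AUDIT option, pointing at the role and bucket.
rds.add_option_to_option_group(
    OptionGroupName="sqlserver-audit-og",
    OptionsToInclude=[{
        "OptionName": "SQLSERVER_AUDIT",
        "OptionSettings": [
            {"Name": "IAM_ROLE_ARN",
             "Value": "arn:aws:iam::123456789012:role/rds-audit-role"},
            {"Name": "S3_BUCKET_ARN",
             "Value": "arn:aws:s3:::audit-log-bucket"},
        ],
    }],
    ApplyImmediately=True,
)

# Associate the option group with the DB instance.
rds.modify_db_instance(
    DBInstanceIdentifier="prod-sqlserver",  # assumed identifier
    OptionGroupName="sqlserver-audit-og",
    ApplyImmediately=True,
)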





Question # 4



A database specialist was alerted that a production Amazon RDS MariaDB instance with
100 GB of storage was out of space. In response, the database specialist modified the DB
instance and added 50 GB of storage capacity. Three hours later, a new alert is generated
due to a lack of free space on the same DB instance. The database specialist decides to
modify the instance immediately to increase its storage capacity by 20 GB.
What will happen when the modification is submitted?

A.

The request will fail because this storage capacity is too large.

B.

The request will succeed only if the primary instance is in active status.

C.

The request will succeed only if CPU utilization is less than 10%.

D.

The request will fail as the most recent modification was too soon.




D.
  

The request will fail as the most recent modification was too soon.



Explanation:
After you modify the storage for a DB instance, the instance enters the storage-optimization state, and you cannot make further storage modifications for six hours or until storage optimization completes, whichever is longer. The second request therefore fails.
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_PIOPS.StorageTypes.html
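
A sketch of what the second modification attempt looks like with boto3; the instance identifier is a placeholder, and the exact error code returned may vary:

import boto3
from botocore.exceptions import ClientError

rds = boto3.client("rds")

# The first increase (100 GB -> 150 GB) triggered storage optimization.
# A second storage change within six hours is rejected.
try:
    rds.modify_db_instance(
        DBInstanceIdentifier="prod-mariadb",  # placeholder identifier
        AllocatedStorage=170,                 # 150 GB + 20 GB
        ApplyImmediately=True,
    )
except ClientError as err:
    # Expect an error explaining that storage was recently modified
    # and cannot be changed again yet.
    print(err.response["Error"]["Code"], err.response["Error"]["Message"])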





Question # 5



An internet advertising company stores its data in an Amazon DynamoDB table. DynamoDB Streams is enabled on the table, which also has a global secondary index on one of its keys. The table is encrypted using a customer managed AWS Key Management Service (AWS KMS) key.
The company has decided to expand worldwide and wants to replicate the table to a new AWS Region using DynamoDB global tables.
An administrator observes the following upon review:
  • No role with the dynamodb:CreateGlobalTable permission exists in the account.
  • An empty table with the same name exists in the new Region where replication is desired.
  • A global secondary index with the same partition key but a different sort key exists in the new Region where replication is desired.
Which settings will prevent the creation of a global table or replica in the new Region? (Choose two.)

A.

A global secondary index with the same partition key but a different sort key exists in the
new Region where replication is desired.

B.

An empty table with the same name exists in the Region where replication is desired.

C.

No role with the dynamodb:CreateGlobalTable permission exists in the account.

D.

DynamoDB Streams is enabled for the table.

E.

The table is encrypted using a KMS customer managed key.




A.
  

A global secondary index with the same partition key but a different sort key exists in the
new Region where replication is desired.




B.
  

An empty table with the same name exists in the Region where replication is desired.
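
With the current version of global tables (2019.11.21), a replica is added with UpdateTable, and creation fails if the target Region already contains a table with the same name or if index definitions do not match across replicas. A minimal boto3 sketch; the table name and Regions are placeholder assumptions:

import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Add a replica in the new Region. This fails if a same-named table
# already exists there or if index definitions differ.
dynamodb.update_table(
    TableName="AdsData",  # placeholder table name
    ReplicaUpdates=[
        {"Create": {"RegionName": "eu-west-1"}}
    ],
)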







Question # 6



A company is migrating a mission-critical 2-TB Oracle database from on premises to
Amazon Aurora. The cost for the database migration must be kept to a minimum, and both
the on-premises Oracle database and the Aurora DB cluster must remain open for write
traffic until the company is ready to completely cut over to Aurora.
Which combination of actions should a database specialist take to accomplish this
migration as quickly as possible? (Choose two.)

A.

Use the AWS Schema Conversion Tool (AWS SCT) to convert the source database
schema. Then restore the converted schema to the target Aurora DB cluster.

B.

Use Oracle’s Data Pump tool to export a copy of the source database schema and
manually edit the schema in a text editor to make it compatible with Aurora.

C.

Create an AWS DMS task to migrate data from the Oracle database to the Aurora DB
cluster. Select the migration type to replicate ongoing changes to keep the source and
target databases in sync until the company is ready to move all user traffic to the Aurora
DB cluster.

D.

Create an AWS DMS task to migrate data from the Oracle database to the Aurora DB cluster. Once the initial load is complete, create an Amazon Kinesis Data Firehose stream to perform change data capture (CDC) until the company is ready to move all user traffic to the Aurora DB cluster.

E.

Create an AWS Glue job and related resources to migrate data from the Oracle
database to the Aurora DB cluster. Once the initial load is complete, create an AWS DMS
task to perform change data capture (CDC) until the company is ready to move all user
traffic to the Aurora DB cluster.




A.
  

Use the AWS Schema Conversion Tool (AWS SCT) to convert the source database
schema. Then restore the converted schema to the target Aurora DB cluster.




C.
  

Create an AWS DMS task to migrate data from the Oracle database to the Aurora DB
cluster. Select the migration type to replicate ongoing changes to keep the source and
target databases in sync until the company is ready to move all user traffic to the Aurora
DB cluster.
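
After AWS SCT has converted and applied the schema, the DMS task handles both the bulk copy and ongoing replication. A minimal boto3 sketch; all ARNs and the include-everything table-selection rule are placeholder assumptions:

import boto3

dms = boto3.client("dms")

# "full-load-and-cdc" performs the initial copy and then streams ongoing
# changes, keeping source and target in sync until cutover.
dms.create_replication_task(
    ReplicationTaskIdentifier="oracle-to-aurora",
    SourceEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:SRC",
    TargetEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:TGT",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:INST",
    MigrationType="full-load-and-cdc",
    TableMappings=(
        '{"rules": [{"rule-type": "selection", "rule-id": "1", '
        '"rule-name": "1", "object-locator": {"schema-name": "%", '
        '"table-name": "%"}, "rule-action": "include"}]}'
    ),
)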







Question # 7



A company launched a mobile game in North America that quickly grew to 10 million daily active users. The game's backend is hosted on AWS and makes heavy use of an Amazon DynamoDB table with TTL configured.
When an item is added or updated, its TTL is set to the current epoch time plus 600 seconds. The game logic relies on expired data being purged so that reward points are computed correctly. At times, items are read from the table many hours after their TTL has expired.
How should a database administrator resolve this issue?

A.

Use a client library that supports the TTL functionality for DynamoDB.

B.

Include a query filter expression to ignore items with an expired TTL.

C.

Set the ConsistentRead parameter to true when querying the table

D.

Create a local secondary index on the TTL attribute.




B.
  

Include a query filter expression to ignore items with an expired TTL.



Explanation:
TTL deletes expired items asynchronously in the background, so an item can remain readable for some time after it expires. Applications must therefore filter expired items out of read results.
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/howitworks-ttl.html
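
A minimal filter-expression sketch with boto3; the table name, key name, and the TTL attribute name "ttl" are placeholder assumptions:

import time

import boto3
from boto3.dynamodb.conditions import Attr, Key

table = boto3.resource("dynamodb").Table("GameState")  # assumed name

# TTL deletion is asynchronous, so compare the TTL attribute to "now"
# and keep only items that have not yet expired.
now = int(time.time())
response = table.query(
    KeyConditionExpression=Key("player_id").eq("player-123"),  # assumed key
    FilterExpression=Attr("ttl").gt(now),
)
unexpired_items = response["Items"]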





Question # 8



A company is moving its on-premises database workloads to the AWS Cloud. A database specialist who is migrating an Oracle database with a very large table to Amazon RDS has chosen AWS DMS, and observes that the migration is taking considerable time.
Which actions would speed up the data migration? (Select three.)

A.

Create multiple AWS DMS tasks to migrate the large table.

B.

Configure the AWS DMS replication instance with Multi-AZ.

C.

Increase the capacity of the AWS DMS replication server.

D.

Establish an AWS Direct Connect connection between the on-premises data center and
AWS.

E.

Enable an Amazon RDS Multi-AZ configuration.




A.
  

Create multiple AWS DMS tasks to migrate the large table.




C.
  

Increase the capacity of the AWS DMS replication server.




D.
  

Establish an AWS Direct Connect connection between the on-premises data center and
AWS.
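
Splitting the migration across multiple DMS tasks is the selected approach; within a single task, DMS can also load segments of one large table in parallel via a table-settings rule in the task's table mappings. A sketch of such a mapping, where the schema, table, column, and boundary values are illustrative assumptions:

import json

# Segment the big table so DMS loads the ranges concurrently.
table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "1",
            "object-locator": {"schema-name": "APP", "table-name": "BIG_TABLE"},
            "rule-action": "include",
        },
        {
            "rule-type": "table-settings",
            "rule-id": "2",
            "rule-name": "2",
            "object-locator": {"schema-name": "APP", "table-name": "BIG_TABLE"},
            "parallel-load": {
                "type": "ranges",
                "columns": ["ORDER_ID"],
                "boundaries": [["1000000"], ["2000000"], ["3000000"]],
            },
        },
    ]
}

# Pass the JSON string as TableMappings when creating the DMS task.
print(json.dumps(table_mappings, indent=2))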







Question # 9



A large automotive manufacturer is moving a mission-critical finance application's database to Amazon DynamoDB. The company's risk and compliance policy requires that every update to the database be recorded as a log entry for auditing purposes. The system expects about 500,000 log entries per minute. Log entries should be stored in Apache Parquet files in batches of at least 100,000 records per file.
How should a database specialist meet these requirements with DynamoDB?

A.

Enable Amazon DynamoDB Streams on the table. Create an AWS Lambda function triggered by the stream. Write the log entries to an Amazon S3 object.

B.

Create a backup plan in AWS Backup to back up the DynamoDB table once a day.
Create an AWS Lambda function that restores the backup in another table and compares
both tables for changes. Generate the log entries and write them to an Amazon S3 object.

C.

Enable AWS CloudTrail logs on the table. Create an AWS Lambda function that reads
the log files once an hour and filters DynamoDB API actions. Write the filtered log files to
Amazon S3.

D.

Enable Amazon DynamoDB Streams on the table. Create an AWS Lambda function
triggered by the stream. Write the log entries to an Amazon Kinesis Data Firehose delivery
stream with buffering and Amazon S3 as the destination.




D.
  

Enable Amazon DynamoDB Streams on the table. Create an AWS Lambda function
triggered by the stream. Write the log entries to an Amazon Kinesis Data Firehose delivery
stream with buffering and Amazon S3 as the destination.
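
A sketch of the Lambda handler that forwards stream records to Firehose. The delivery stream name is a placeholder; Parquet record-format conversion and buffering are configured on the delivery stream itself, and this sketch assumes each Lambda batch stays within Firehose's 500-records-per-call limit:

import json

import boto3

firehose = boto3.client("firehose")

def handler(event, context):
    # Each stream record carries the item-level change to be audited.
    records = [
        {"Data": (json.dumps(record["dynamodb"]) + "\n").encode("utf-8")}
        for record in event["Records"]
    ]
    if records:
        # Firehose buffers these and delivers them to S3 in batches;
        # format conversion on the stream produces the Parquet files.
        firehose.put_record_batch(
            DeliveryStreamName="audit-log-stream",  # placeholder name
            Records=records,
        )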







Question # 10



A company needs a data warehouse system that stores data consistently and in a highly structured fashion. The company requires fast response times for end-user queries involving current-year data, and users must be able to access the full 15-year dataset when necessary. The solution must also handle a variable volume of incoming queries, and the cost of storing the 100 TB of data must be kept to a minimum.
Which solution meets these requirements?

A.

Leverage an Amazon Redshift data warehouse solution using a dense storage instance type while keeping all the data on local Amazon Redshift storage. Provision enough instances to support high demand.

B.

Leverage an Amazon Redshift data warehouse solution using a dense storage instance to store the most recent data. Keep historical data on Amazon S3 and access it using the Amazon Redshift Spectrum layer. Provision enough instances to support high demand.

C.

Leverage an Amazon Redshift data warehouse solution using a dense storage instance
to store the most recent data. Keep historical data on Amazon S3 and access it using the Amazon Redshift Spectrum layer. Enable Amazon Redshift Concurrency Scaling.

D.

Leverage an Amazon Redshift data warehouse solution using a dense storage instance
to store the most recent data. Keep historical data on Amazon S3 and access it using the
Amazon Redshift Spectrum layer. Leverage Amazon Redshift elastic resize.




C.
  

Leverage an Amazon Redshift data warehouse solution using a dense storage instance
to store the most recent data. Keep historical data on Amazon S3 and access it using the Amazon Redshift Spectrum layer. Enable Amazon Redshift Concurrency Scaling.



Explanation: https://docs.aws.amazon.com/redshift/latest/dg/concurrency-scaling.html
"With the Concurrency Scaling feature, you can support virtually unlimited concurrent users
and concurrent queries, with consistently fast query performance. When concurrency
scaling is enabled, Amazon Redshift automatically adds additional cluster capacity when
you need it to process an increase in concurrent read queries. Write operations continue as
normal on your main cluster. Users always see the most current data, whether the queries
run on the main cluster or on a concurrency scaling cluster. You're charged for concurrency
scaling clusters only for the time they're in use. For more information about pricing, see
Amazon Redshift pricing. You manage which queries are sent to the concurrency scaling
cluster by configuring WLM queues. When you enable concurrency scaling for a queue,
eligible queries are sent to the concurrency scaling cluster instead of waiting in line."
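
Concurrency scaling is enabled per WLM queue. A boto3 sketch of turning it on for a queue in a custom parameter group, where the parameter group name and the rest of the WLM configuration are placeholder assumptions:

import json

import boto3

redshift = boto3.client("redshift")

# Set "concurrency_scaling": "auto" on a WLM queue so eligible queries
# spill onto transient concurrency scaling clusters during bursts.
wlm_config = [
    {
        "query_group": [],
        "user_group": [],
        "query_concurrency": 5,
        "concurrency_scaling": "auto",
    }
]

redshift.modify_cluster_parameter_group(
    ParameterGroupName="custom-dw-params",  # placeholder name
    Parameters=[
        {
            "ParameterName": "wlm_json_configuration",
            "ParameterValue": json.dumps(wlm_config),
            "ApplyType": "dynamic",
        }
    ],
)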




Get access to 270 AWS Certified Database - Specialty questions for less than $0.15 per day.

Total Questions & Answers: 270
Last Updated: 3-Oct-2024
Available with 1-, 3-, 6-, and 12-month free update plans
  • PDF: $0.15 per day
  • Test Engine: $0.18 per day
  • PDF + Engine: $0.20 per day


Amazon Web Services DBS-C01 Dumps - Latest Questions


Exam Code: DBS-C01
Exam Name: AWS Certified Database - Specialty

  • 90 Days Free Updates
  • Amazon Web Services Experts Verified Answers
  • Printable PDF File Format
  • DBS-C01 Exam Passing Assurance

Get 100% real DBS-C01 exam dumps with verified answers, as seen in the real exam. AWS Certified Database - Specialty exam questions are updated frequently and reviewed by industry top experts, helping you pass the AWS-Certified-Database exam quickly and hassle-free.


Amazon Web Services DBS-C01 Exam Questions


Struggling with AWS Certified Database - Specialty prep? Get the edge you need!

Our carefully crafted DBS-C01 dumps give you the confidence to ace the exam. We offer:

  • Up-to-date AWS-Certified-Database practice questions: Stay current with the latest exam content.
  • PDF and test engine formats: Choose the study tools that work best for you.
  • Realistic Amazon Web Services DBS-C01 practice exams: Simulate the real exam experience and boost your readiness.
Pass your AWS-Certified-Database exam with ease. Try our study materials today!


Ace your AWS-Certified-Database exam with confidence!



We provide top-quality DBS-C01 exam prep materials that are:
  • Accurate and up-to-date: Reflect the latest Amazon Web Services exam changes and ensure you are studying the right content. 
  • Comprehensive: Cover all exam topics so you do not need to rely on multiple sources. 
  • Convenient formats: Choose between PDF files and online AWS Certified Database - Specialty practice tests for easy studying on any device.
Do not waste time on unreliable DBS-C01 practice exams. Choose our proven AWS-Certified-Database study materials and pass with flying colors.

Try Dumps4free AWS Certified Database - Specialty Exam 2024 PDFs today!

  • Assurance

    AWS Certified Database - Specialty practice exam has been updated to reflect the most recent questions from the Amazon Web Services DBS-C01 Exam.

  • Demo

    Try before you buy! Get a free demo of our AWS-Certified-Database exam dumps and see the quality for yourself. Need help? Chat with our support team.

  • Validity

    Our Amazon Web Services DBS-C01 PDF contains expert-verified questions and answers, ensuring you're studying the most accurate and relevant material.

  • Success

    Achieve DBS-C01 success! Our AWS Certified Database - Specialty exam questions give you the preparation edge.

If you have any questions, contact our customer support via live chat or email us at support@dumps4free.com.