A company is closing one of its remote data centers. This site runs a 100 TB on-premises data warehouse
solution. The company plans to use the AWS Schema Conversion Tool (AWS SCT) and AWS DMS for the migration to AWS. The site network bandwidth is 500 Mbps. A Database Specialist wants to migrate the
on-premises data using Amazon S3 as the data lake and Amazon Redshift as the data warehouse. This move
must take place during a 2-week period when source systems are shut down for maintenance. The data should
stay encrypted at rest and in transit.
Which approach has the least risk and the highest likelihood of a successful data transfer?
A.
Set up a VPN tunnel for encrypting data over the network from the data center to AWS. Leverage
AWS SCT and apply the converted schema to Amazon Redshift. Once complete, start an AWS DMS
task to move the data from the source to Amazon S3. Use AWS Glue to load the data from Amazon S3
to Amazon Redshift.
B.
Leverage AWS SCT and apply the converted schema to Amazon Redshift. Start an AWS DMS task
with two AWS Snowball Edge devices to copy data from on-premises to Amazon S3 with AWS KMS
encryption. Use AWS DMS to finish copying data to Amazon Redshift.
C.
Leverage AWS SCT and apply the converted schema to Amazon Redshift. Once complete, use a fleet
of 10 TB dedicated encrypted drives using the AWS Import/Export feature to copy data from
on-premises to Amazon S3 with AWS KMS encryption. Use AWS Glue to load the data to Amazon
Redshift.
D.
Set up a VPN tunnel for encrypting data over the network from the data center to AWS. Leverage a
native database export feature to export the data and compress the files. Use the aws s3 cp multipart
upload command to upload these files to Amazon S3 with AWS KMS encryption. Once complete, load
the data to Amazon Redshift using AWS Glue.
Answer: C
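For context on the time pressure in this scenario, the following back-of-the-envelope sketch (assuming decimal units, a fully saturated link, and no protocol overhead) estimates how long a purely network-based transfer of 100 TB over 500 Mbps would take, which is why the offline-device options are relevant:

```python
# Rough transfer-time check: 100 TB over a 500 Mbps link.
# Assumptions: decimal units, 100% sustained utilisation, no overhead.

DATA_TB = 100        # source data warehouse size
LINK_MBPS = 500      # site bandwidth
WINDOW_DAYS = 14     # maintenance window

data_bits = DATA_TB * 1e12 * 8            # 100 TB expressed in bits
seconds = data_bits / (LINK_MBPS * 1e6)   # transfer time at full line rate
days = seconds / 86_400

print(f"Network-only transfer: {days:.1f} days vs {WINDOW_DAYS}-day window")
# ~18.5 days, i.e. longer than the 2-week shutdown even under ideal conditions.
```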
A Database Specialist is designing a new database infrastructure for a ride-hailing application. The application
data includes a ride tracking system that stores GPS coordinates for all rides. Real-time statistics and metadata
lookups must be performed with high throughput and microsecond latency. The database should be fault
tolerant with minimal operational overhead and development effort.
Which solution meets these requirements in the MOST efficient way?
A.
Use Amazon RDS for MySQL as the database and use Amazon ElastiCache
B.
Use Amazon DynamoDB as the database and use DynamoDB Accelerator
C.
Use Amazon Aurora MySQL as the database and use Aurora’s buffer cache
D.
Use Amazon DynamoDB as the database and use Amazon API Gateway
Answer: D
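As a rough illustration of the DynamoDB pattern behind this question, here is a minimal boto3 sketch; the table name, key schema, and attribute names are hypothetical, and in a real deployment the DAX client library would replace the plain boto3 resource so repeated reads are served from an in-memory cache at microsecond latency:

```python
import boto3

# Hypothetical ride-tracking table: GPS samples keyed by ride_id + timestamp.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("RideTracking")   # assumed table name

# Write one GPS sample for a ride.
table.put_item(Item={
    "ride_id": "ride-1234",          # partition key (assumed)
    "ts": 1700000000123,             # sort key: epoch millis (assumed)
    "lat": "47.6062",
    "lon": "-122.3321",
})

# Point lookup that a DAX cluster would cache on subsequent reads.
resp = table.get_item(Key={"ride_id": "ride-1234", "ts": 1700000000123})
print(resp.get("Item"))
```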
A company has a web-based survey application that uses Amazon DynamoDB. During peak usage, when
survey responses are being collected, a Database Specialist sees the
ProvisionedThroughputExceededException error.
What can the Database Specialist do to resolve this error? (Choose two.)
A.
Change the table to use Amazon DynamoDB Streams
B.
Purchase DynamoDB reserved capacity in the affected Region
C.
Increase the write capacity units for the specific table
D.
Change the table capacity mode to on-demand
E.
Change the table type to throughput optimized
Answer: C, E
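To make the capacity options concrete, here is a minimal boto3 sketch, assuming a hypothetical SurveyResponses table and illustrative capacity numbers; it shows how raising write capacity units (and, commented out, switching the table to on-demand mode as option D suggests) maps to the UpdateTable API:

```python
import boto3

dynamodb = boto3.client("dynamodb")
TABLE = "SurveyResponses"   # hypothetical table name

# Raise the provisioned write capacity so peak survey traffic no longer
# triggers ProvisionedThroughputExceededException.
dynamodb.update_table(
    TableName=TABLE,
    ProvisionedThroughput={
        "ReadCapacityUnits": 100,    # illustrative values
        "WriteCapacityUnits": 500,
    },
)

# Alternative: let DynamoDB scale automatically with on-demand capacity.
# dynamodb.update_table(TableName=TABLE, BillingMode="PAY_PER_REQUEST")
```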
A company is deploying a solution in Amazon Aurora by migrating from an on-premises system. The IT
department has established an AWS Direct Connect link from the company’s data center. The company’s
Database Specialist has selected the option to require SSL/TLS for connectivity to prevent plaintext data from
being sent over the network. The migration appears to be working successfully, and the data can be queried
from a desktop machine.
Two Data Analysts have been asked to query and validate the data in the new Aurora DB cluster. Both
Analysts are unable to connect to Aurora. Their user names and passwords have been verified as valid and
the Database Specialist can connect to the DB cluster using their accounts. The Database Specialist also
verified that the security group configuration allows network traffic from all corporate IP addresses.
What should the Database Specialist do to correct the Data Analysts’ inability to connect?
A.
Restart the DB cluster to apply the SSL change.
B.
Instruct the Data Analysts to download the root certificate and use the SSL certificate on the connection string to connect.
C.
Add explicit mappings between the Data Analysts’ IP addresses and the instance in the security group
assigned to the DB cluster.
D.
Modify the Data Analysts’ local client firewall to allow network traffic to AWS.
Answer: D
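As a client-side illustration of the certificate step described in option B, here is a minimal sketch assuming an Aurora MySQL-compatible cluster, the PyMySQL driver, and placeholder endpoint, credentials, and file paths; the analyst downloads the RDS root certificate bundle and references it in the connection so the TLS handshake required by the cluster can complete:

```python
import pymysql  # assuming an Aurora MySQL-compatible cluster and this driver

# Endpoint, credentials, database, query, and cert path are placeholders.
conn = pymysql.connect(
    host="my-aurora-cluster.cluster-xxxxxxxx.us-east-1.rds.amazonaws.com",
    user="data_analyst",
    password="REPLACE_ME",
    database="analytics",
    ssl={"ca": "/home/analyst/global-bundle.pem"},  # downloaded root certificate bundle
)
with conn.cursor() as cur:
    cur.execute("SELECT COUNT(*) FROM rides")       # hypothetical validation query
    print(cur.fetchone())
conn.close()
```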
A financial company wants to store sensitive user data in an Amazon Aurora PostgreSQL DB cluster. The
database will be accessed by multiple applications across the company. The company has mandated that all
communications to the database be encrypted and the server identity must be validated. Any non-SSL-based
connections should be denied access to the database.
Which solution addresses these requirements?
A.
Set the rds.force_ssl=0 parameter in DB parameter groups. Download and use the Amazon RDS
certificate bundle and configure the PostgreSQL connection string with sslmode=allow.
B.
Set the rds.force_ssl=1 parameter in DB parameter groups. Download and use the Amazon RDS
certificate bundle and configure the PostgreSQL connection string with sslmode=disable.
C.
Set the rds.force_ssl=0 parameter in DB parameter groups. Download and use the Amazon RDS
certificate bundle and configure the PostgreSQL connection string with sslmode=verify-ca.
D.
Set the rds.force_ssl=1 parameter in DB parameter groups. Download and use the Amazon RDS
certificate bundle and configure the PostgreSQL connection string with sslmode=verify-full.
Answer: D
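On the client side, a minimal psycopg2 sketch (placeholder endpoint, credentials, and certificate path) shows what a verify-full connection looks like once rds.force_ssl=1 causes the cluster to reject non-SSL sessions:

```python
import psycopg2  # standard PostgreSQL client; libpq handles the TLS options

# sslmode=verify-full validates the server certificate against the downloaded
# RDS root bundle and checks that it matches the cluster hostname.
conn = psycopg2.connect(
    host="finance-cluster.cluster-xxxxxxxx.us-east-1.rds.amazonaws.com",
    port=5432,
    dbname="userdata",
    user="app_user",
    password="REPLACE_ME",
    sslmode="verify-full",
    sslrootcert="/etc/ssl/certs/global-bundle.pem",  # RDS certificate bundle
)
print(conn.get_dsn_parameters().get("sslmode"))
conn.close()
```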