Question # 1
A company wants to share information with a third party. The third party has an HTTP API endpoint that the company can use to share the information. The company has the required API key to access the HTTP API.
The company needs a way to manage the API key by using code. The integration of the API key with the application code cannot affect application performance.
Which solution will meet these requirements MOST securely?
A. Store the API credentials in AWS Secrets Manager. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
B. Store the API credentials in a local code variable. Push the code to a secure Git repository. Use the local code variable at runtime to make the API call.
C. Store the API credentials as an object in a private Amazon S3 bucket. Restrict access to the S3 object by using IAM policies. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
D. Store the API credentials in an Amazon DynamoDB table. Restrict access to the table by using resource-based policies. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
Answer: A. Store the API credentials in AWS Secrets Manager. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
Explanation:
AWS Secrets Manager is a service that helps securely store, rotate, and manage secrets such as API keys, passwords, and tokens. The developer can store the API credentials in AWS Secrets Manager and retrieve them at runtime by using the AWS SDK. This solution will meet the requirements of security, code management, and performance. Storing the API credentials in a local code variable or an S3 object is not secure, as it exposes the credentials to unauthorized access or leakage. Storing the API credentials in a DynamoDB table is also not secure, as it requires additional encryption and access control measures. Moreover, retrieving the credentials from S3 or DynamoDB may affect application performance due to network latency.
References:
• What Is AWS Secrets Manager? - AWS Secrets Manager
• Retrieving a Secret - AWS Secrets Manager
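For illustration, a minimal sketch of this pattern using the AWS SDK for Python (boto3); the secret name, JSON field, endpoint URL, and header name are all assumptions:

import json

import boto3
import requests  # third-party HTTP client, used only for the example call

SECRET_NAME = "third-party/api-key"  # hypothetical secret name

secrets_client = boto3.client("secretsmanager")

def get_api_key() -> str:
    """Fetch the API key from AWS Secrets Manager at runtime."""
    response = secrets_client.get_secret_value(SecretId=SECRET_NAME)
    return json.loads(response["SecretString"])["api_key"]

def share_with_third_party(payload: dict) -> requests.Response:
    # The endpoint URL and header name are placeholders for the third party's API.
    return requests.post(
        "https://api.example.com/share",
        json=payload,
        headers={"x-api-key": get_api_key()},
        timeout=10,
    )

To keep the runtime lookup from affecting performance, the secret can be cached in memory between calls, for example with the aws-secretsmanager-caching library.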
Question # 2
A company has a multi-node Windows legacy application that runs on premises. The application uses a network shared folder as a centralized configuration repository to store configuration files in .xml format. The company is migrating the application to Amazon EC2 instances. As part of the migration to AWS, a developer must identify a solution that provides high availability for the repository.
Which solution will meet this requirement MOST cost-effectively?
A. Mount an Amazon Elastic Block Store (Amazon EBS) volume onto one of the EC2 instances. Deploy a file system on the EBS volume. Use the host operating system to share a folder. Update the application code to read and write configuration files from the shared folder.
B. Deploy a micro EC2 instance with an instance store volume. Use the host operating system to share a folder. Update the application code to read and write configuration files from the shared folder.
C. Create an Amazon S3 bucket to host the repository. Migrate the existing .xml files to the S3 bucket. Update the application code to use the AWS SDK to read and write configuration files from Amazon S3.
D. Create an Amazon S3 bucket to host the repository. Migrate the existing .xml files to the S3 bucket. Mount the S3 bucket to the EC2 instances as a local volume. Update the application code to read and write configuration files from the disk.
Answer: C. Create an Amazon S3 bucket to host the repository. Migrate the existing .xml files to the S3 bucket. Update the application code to use the AWS SDK to read and write configuration files from Amazon S3.
Explanation:
Amazon S3 is a service that provides highly scalable, durable, and secure object storage. The developer can create an S3 bucket to host the repository and migrate the existing .xml files to the S3 bucket. The developer can update the application code to use the AWS SDK to read and write configuration files from S3. This solution will meet the requirement of high availability for the repository in a cost-effective way.
References:
• Amazon Simple Storage Service (S3)
• Using AWS SDKs with Amazon S3
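As a rough sketch of the updated application code, assuming the AWS SDK for Python (boto3) and a hypothetical bucket name:

import boto3

CONFIG_BUCKET = "example-app-config"  # hypothetical configuration repository bucket

s3 = boto3.client("s3")

def read_config(name: str) -> bytes:
    """Read a configuration file (e.g., 'app-settings.xml') from the S3 repository."""
    response = s3.get_object(Bucket=CONFIG_BUCKET, Key=name)
    return response["Body"].read()

def write_config(name: str, xml_content: bytes) -> None:
    """Write or update a configuration file in the S3 repository."""
    s3.put_object(Bucket=CONFIG_BUCKET, Key=name, Body=xml_content)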
Question # 3
A company has installed smart meters in all its customer locations. The smart meters measure power usage at 1-minute intervals and send the usage readings to a remote endpoint for collection. The company needs to create an endpoint that will receive the smart meter readings and store the readings in a database.
The company wants to store the location ID and timestamp information.
The company wants to give its customers low-latency access to their current usage and historical usage on demand. The company expects demand to increase significantly. The solution must not impact performance or require downtime while scaling.
Which solution will meet these requirements MOST cost-effectively?
A. Store the smart meter readings in an Amazon RDS database. Create an index on the location ID and timestamp columns. Use the columns to filter on the customers' data.
B. Store the smart meter readings in an Amazon DynamoDB table. Create a composite key by using the location ID and timestamp columns. Use the columns to filter on the customers' data.
C. Store the smart meter readings in Amazon ElastiCache for Redis. Create a sorted set key by using the location ID and timestamp columns. Use the columns to filter on the customers' data.
D. Store the smart meter readings in Amazon S3. Partition the data by using the location ID and timestamp columns. Use Amazon Athena to filter on the customers' data.
Answer: B. Store the smart meter readings in an Amazon DynamoDB table. Create a composite key by using the location ID and timestamp columns. Use the columns to filter on the customers' data.
Explanation:
The solution that will meet the requirements most cost-effectively is to store the smart meter readings in an Amazon DynamoDB table. Create a composite key by using the location ID and timestamp columns. Use the columns to filter on the customers’ data. This way, the company can leverage the scalability, performance, and low latency of DynamoDB to store and retrieve the smart meter readings.
The company can also use the composite key to query the data by location ID and timestamp efficiently. The other options either involve more expensive or less scalable services, or do not provide low-latency access to the current usage.
Reference: Working with Queries in DynamoDB
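A minimal sketch of this table design with boto3; the table name and attribute names (location_id as the partition key, reading_timestamp as the sort key) are assumptions:

from decimal import Decimal

import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("MeterReadings")  # hypothetical table name

def write_reading(location_id: str, timestamp: str, usage_kwh: Decimal) -> None:
    """Store one meter reading under the composite primary key."""
    table.put_item(
        Item={
            "location_id": location_id,      # partition key
            "reading_timestamp": timestamp,  # sort key (ISO 8601 string)
            "usage_kwh": usage_kwh,
        }
    )

def usage_for_period(location_id: str, start: str, end: str) -> list:
    """Query one customer's readings for a time range using the composite key."""
    response = table.query(
        KeyConditionExpression=Key("location_id").eq(location_id)
        & Key("reading_timestamp").between(start, end)
    )
    return response["Items"]

Because the partition key isolates each customer's data and the sort key orders readings by time, both "current usage" (latest item) and "historical usage" (range query) are single efficient Query operations.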
Question # 4
A developer has an application that stores data in an Amazon S3 bucket. The application uses an HTTP API to store and retrieve objects. When the PutObject API operation adds objects to the S3 bucket, the developer must encrypt these objects at rest by using server-side encryption with Amazon S3 managed keys (SSE-S3).
Which solution will meet this requirement?
A. Create an AWS Key Management Service (AWS KMS) key. Assign the KMS key to the S3 bucket.
B. Set the x-amz-server-side-encryption header when invoking the PutObject API operation.
C. Provide the encryption key in the HTTP header of every request.
D. Apply TLS to encrypt the traffic to the S3 bucket.
Answer: B. Set the x-amz-server-side-encryption header when invoking the PutObject API operation.
Explanation:
Amazon S3 supports server-side encryption, which encrypts data at rest on the server that stores the data. One of the encryption options is SSE-S3, which uses keys managed by S3. To use SSE-S3, the x-amz-server-side-encryption header must be set to AES256 when invoking the PutObject API operation. This instructs S3 to encrypt the object data with SSE-S3 before saving it on disks in its data centers and decrypt it when it is downloaded.
Reference:
Protecting data using server-side encryption with Amazon S3-managed encryption keys (SSE-S3)
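With boto3, for example, the header is set through the ServerSideEncryption parameter; the bucket and key names here are placeholders:

import boto3

s3 = boto3.client("s3")

s3.put_object(
    Bucket="example-bucket",
    Key="reports/data.json",
    Body=b'{"example": true}',
    ServerSideEncryption="AES256",  # sent as the x-amz-server-side-encryption header
)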
Question # 5
A developer has code that is stored in an Amazon S3 bucket. The code must be deployed as an AWS Lambda function across multiple accounts in the same AWS Region as the S3 bucket. An AWS CloudFormation template that runs in each account will deploy the Lambda function.
What is the MOST secure way to allow CloudFormation to access the Lambda code in the S3 bucket?
A. Grant the CloudFormation service role the S3 ListBucket and GetObject permissions. Add a bucket policy to Amazon S3 with the principal of "AWS" (account numbers).
B. Grant the CloudFormation service role the S3 GetObject permission. Add a bucket policy to Amazon S3 with the principal of "*".
C. Use a service-based link to grant the Lambda function the S3 ListBucket and GetObject permissions by explicitly adding the S3 bucket's account number in the resource.
D. Use a service-based link to grant the Lambda function the S3 GetObject permission. Add a resource of "*" to allow access to the S3 bucket.
Answer: B. Grant the CloudFormation service role the S3 GetObject permission. Add a bucket policy to Amazon S3 with the principal of "*".
Explanation:
This solution allows the CloudFormation service role to access the S3 bucket from any account, as long as it has the S3 GetObject permission. The bucket policy grants access to any principal with the GetObject permission, which is the least privilege needed to deploy the Lambda code. This is more secure than granting ListBucket permission, which is not required for deploying Lambda code, or using a service-based link, which is not supported for Lambda functions.
Reference: AWS CloudFormation Service Role; Using AWS Lambda with Amazon S3
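As a sketch of the bucket policy that option B describes, applied with boto3 (the bucket name is a placeholder):

import json

import boto3

s3 = boto3.client("s3")

bucket = "example-lambda-code-bucket"  # hypothetical bucket holding the Lambda package

# Mirrors option B: any principal may read objects (s3:GetObject only), which the
# CloudFormation service role in each account uses to fetch the deployment package.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }
    ],
}

s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))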
Question # 6
A company hosts a client-side web application for one of its subsidiaries on Amazon S3. The web application can be accessed through Amazon CloudFront from https://www.example.com. After a successful rollout, the company wants to host three more client-side web applications for its remaining subsidiaries on three separate S3 buckets.
To achieve this goal, a developer moves all the common JavaScript files and web fonts to a central S3 bucket that serves the web applications. However, during testing, the developer notices that the browser blocks the JavaScript files and web fonts.
What should the developer do to prevent the browser from blocking the JavaScript files and web fonts?
A. Create four access points that allow access to the central S3 bucket. Assign an access point to each web application bucket.
B. Create a bucket policy that allows access to the central S3 bucket. Attach the bucket policy to the central S3 bucket.
C. Create a cross-origin resource sharing (CORS) configuration that allows access to the central S3 bucket. Add the CORS configuration to the central S3 bucket.
D. Create a Content-MD5 header that provides a message integrity check for the central S3 bucket. Insert the Content-MD5 header for each web application request.
Answer: C. Create a cross-origin resource sharing (CORS) configuration that allows access to the central S3 bucket. Add the CORS configuration to the central S3 bucket.
Explanation:
This is a common issue: by default, browsers block a web application from loading resources that are served from a different origin. You must configure CORS on the resources to be accessed. https://docs.aws.amazon.com/AmazonS3/latest/userguide/cors.html
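A minimal sketch of such a CORS configuration applied with boto3; the bucket name and origins are placeholders:

import boto3

s3 = boto3.client("s3")

s3.put_bucket_cors(
    Bucket="example-central-assets",
    CORSConfiguration={
        "CORSRules": [
            {
                # One origin per subsidiary web application.
                "AllowedOrigins": [
                    "https://www.example.com",
                    "https://www.subsidiary-a.example.com",
                ],
                "AllowedMethods": ["GET"],
                "AllowedHeaders": ["*"],
                "MaxAgeSeconds": 3000,  # how long browsers may cache the preflight response
            }
        ]
    },
)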
Question # 7
An application runs on multiple EC2 instances behind an ELB. Where is the session data best written so that it can be served reliably across multiple requests?
A. Write data to Amazon ElastiCache
B. Write data to Amazon Elastic Block Store
C. Write data to Amazon EC2 instance store
D. Write data to the root filesystem
Answer: A. Write data to Amazon ElastiCache
Explanation:
The solution that will meet the requirements is to write data to Amazon ElastiCache. This way, the application writes session data to a fast, scalable, and reliable in-memory data store that every instance behind the load balancer can reach, so any instance can serve any request. The other options write to storage that is local to a single EC2 instance (an EBS volume, the instance store, or the root filesystem), so the session data would be unavailable whenever the load balancer routes a later request to a different instance.
Reference: Using ElastiCache for session management
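As an illustration of externalized session state, a sketch using the redis-py client against a hypothetical ElastiCache for Redis endpoint:

import json
import uuid

import redis

# Hypothetical ElastiCache for Redis endpoint; every EC2 instance behind the
# load balancer connects to the same cluster, so any instance can serve any session.
r = redis.Redis(host="sessions.example.cache.amazonaws.com", port=6379)

SESSION_TTL_SECONDS = 1800  # expire idle sessions after 30 minutes

def create_session(data: dict) -> str:
    """Store new session data under a random ID and return the ID."""
    session_id = str(uuid.uuid4())
    r.setex(f"session:{session_id}", SESSION_TTL_SECONDS, json.dumps(data))
    return session_id

def get_session(session_id: str) -> dict | None:
    """Load session data from the shared store, or None if expired/missing."""
    raw = r.get(f"session:{session_id}")
    return json.loads(raw) if raw else None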
Question # 8
A company has a web application that runs on Amazon EC2 instances with a custom Amazon Machine Image (AMI). The company uses AWS CloudFormation to provision the application. The application runs in the us-east-1 Region, and the company needs to deploy the application to the us-west-1 Region.
An attempt to create the AWS CloudFormation stack in us-west-1 fails. An error message states that the AMI ID does not exist. A developer must resolve this error with a solution that uses the least amount of operational overhead.
Which solution meets these requirements?
A. Change the AWS CloudFormation templates for us-east-1 and us-west-1 to use an AWS AMI. Relaunch the stack for both Regions.
B. Copy the custom AMI from us-east-1 to us-west-1. Update the AWS CloudFormation template for us-west-1 to refer to the AMI ID of the copied AMI. Relaunch the stack.
C. Build the custom AMI in us-west-1. Create a new AWS CloudFormation template to launch the stack in us-west-1 with the new AMI ID.
D. Manually deploy the application outside AWS CloudFormation in us-west-1.
Answer: B. Copy the custom AMI from us-east-1 to us-west-1. Update the AWS CloudFormation template for us-west-1 to refer to the AMI ID of the copied AMI. Relaunch the stack.
Explanation:
An AMI is a Regional resource: its ID is valid only in the Region where the AMI was created, which is why the stack in us-west-1 cannot find the us-east-1 AMI ID. Copying the AMI to us-west-1 and updating the template requires the least operational overhead. See https://aws.amazon.com/blogs/aws/ec2-ami-copy-between-regions/
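A sketch of the copy step with boto3; the AMI ID and name are placeholders, and the call is made against the destination Region:

import boto3

# CopyImage is invoked in the destination Region (us-west-1).
ec2_west = boto3.client("ec2", region_name="us-west-1")

response = ec2_west.copy_image(
    Name="my-app-ami",                       # hypothetical AMI name
    SourceImageId="ami-0123456789abcdef0",   # hypothetical us-east-1 AMI ID
    SourceRegion="us-east-1",
)

# The copy gets a new Region-specific ID; reference it in the us-west-1 template.
print(response["ImageId"])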
Question # 9
A company is running Amazon EC2 instances in multiple AWS accounts. A developer needs to implement an application that collects all the lifecycle events of the EC2 instances. The application needs to store the lifecycle events in a single Amazon Simple Queue Service (Amazon SQS) queue in the company's main AWS account for further processing.
Which solution will meet these requirements?
A. Configure Amazon EC2 to deliver the EC2 instance lifecycle events from all accounts to the Amazon EventBridge event bus of the main account. Add an EventBridge rule to the event bus of the main account that matches all EC2 instance lifecycle events. Add the SQS queue as a target of the rule.
B. Use the resource policies of the SQS queue in the main account to give each account permissions to write to that SQS queue. Add to the Amazon EventBridge event bus of each account an EventBridge rule that matches all EC2 instance lifecycle events. Add the SQS queue in the main account as a target of the rule.
C. Write an AWS Lambda function that scans through all EC2 instances in the company accounts to detect EC2 instance lifecycle changes. Configure the Lambda function to write a notification message to the SQS queue in the main account if the function detects an EC2 instance lifecycle change. Add an Amazon EventBridge scheduled rule that invokes the Lambda function every minute.
D. Configure the permissions on the main account event bus to receive events from all accounts. Create an Amazon EventBridge rule in each account to send all the EC2 instance lifecycle events to the main account event bus. Add an EventBridge rule to the main account event bus that matches all EC2 instance lifecycle events. Set the SQS queue as a target for the rule.
Answer: D. Configure the permissions on the main account event bus to receive events from all accounts. Create an Amazon EventBridge rule in each account to send all the EC2 instance lifecycle events to the main account event bus. Add an EventBridge rule to the main account event bus that matches all EC2 instance lifecycle events. Set the SQS queue as a target for the rule.
Explanation:
Amazon EC2 instances can send state-change notification events to Amazon EventBridge (https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/monitoring-instance-state-changes.html), and EventBridge can send and receive events between event buses in different AWS accounts (https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-cross-account.html).
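A sketch of the forwarding rule each member account would need, using boto3; the account IDs, rule name, and role ARN are placeholders:

import json

import boto3

events = boto3.client("events")

# Forward EC2 instance state-change events from this member account to the
# main account's default event bus.
MAIN_ACCOUNT_BUS_ARN = "arn:aws:events:us-east-1:111111111111:event-bus/default"

events.put_rule(
    Name="forward-ec2-lifecycle",
    EventPattern=json.dumps(
        {
            "source": ["aws.ec2"],
            "detail-type": ["EC2 Instance State-change Notification"],
        }
    ),
)

events.put_targets(
    Rule="forward-ec2-lifecycle",
    Targets=[
        {
            "Id": "main-account-bus",
            "Arn": MAIN_ACCOUNT_BUS_ARN,
            # IAM role that lets EventBridge in this account put events on the
            # cross-account bus (placeholder ARN).
            "RoleArn": "arn:aws:iam::222222222222:role/forward-events-role",
        }
    ],
)

In the main account, the event bus resource policy must allow PutEvents from the member accounts, and a second rule matching the same pattern sets the SQS queue as its target.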
Question # 10
A developer is creating a new REST API by using Amazon API Gateway and AWS Lambda. The development team tests the API and validates responses for the known use cases before deploying the API to the production environment.
The developer wants to make the REST API available for testing by simulating API Gateway locally.
Which AWS Serverless Application Model Command Line Interface (AWS SAM CLI) subcommand will meet these requirements?
A. sam local invoke
B. sam local generate-event
C. sam local start-lambda
D. sam local start-api
Answer: D. sam local start-api
Explanation:
The AWS Serverless Application Model Command Line Interface (AWS SAM CLI) is a command-line tool for local development and testing of serverless applications. The sam local start-api subcommand of the AWS SAM CLI simulates a REST API by starting a new local endpoint. Therefore, option D is correct.
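A typical local test session might look like the following; the port shown is the SAM CLI default and the route is a hypothetical example:

# Start a local endpoint that simulates API Gateway, using the project's template.yaml
sam local start-api

# In another terminal, exercise a hypothetical route (default address is 127.0.0.1:3000)
curl http://127.0.0.1:3000/hello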