MuleSoft-Integration-Architect-I Practice Test



What aspects of a CI/CD pipeline for Mule applications can be automated using MuleSoft-provided Maven plugins?


A. Compile, package, unit test, deploy, create associated API instances in API Manager


B. Import from API Designer, compile, package, unit test, deploy, publish to Anypoint Exchange


C. Compile, package, unit test, validate unit test coverage, deploy


D. Compile, package, unit test, deploy, integration test





D.
  Compile, package, unit test, deploy, integration test
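For illustration, here is a minimal sketch of how a CI script might drive these stages through Maven, assuming the Mule Maven plugin and MUnit Maven plugin are already configured in the project's pom.xml; the project path and the integration-test profile name are hypothetical, not values from the question.

```python
import subprocess

# Hypothetical CI pipeline steps for a Mule application, driven by Maven.
# Assumes the Mule Maven plugin and MUnit Maven plugin are configured in the
# project's pom.xml; the path and profile name below are illustrative only.

PROJECT_DIR = "./my-mule-app"  # assumed project location

def run(goal_args):
    """Run a Maven invocation and fail the pipeline on a non-zero exit code."""
    subprocess.run(["mvn", *goal_args], cwd=PROJECT_DIR, check=True)

# Compile, package, and run MUnit unit tests (bound to Maven's test phase).
run(["clean", "package"])

# Deploy the packaged application; the Mule Maven plugin's deploy goal reads
# the deployment target (for example CloudHub) from the plugin configuration.
run(["mule:deploy"])

# Integration tests can then be triggered against the deployed application,
# here via a hypothetical Maven profile named "integration-tests".
run(["verify", "-Pintegration-tests"])
```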

One of the back-end systems invoked by the API implementation enforces rate limits on the number of requests a particular client can make. Both the back-end system and the API implementation are deployed to several non-production environments, including the staging environment, and to a particular production environment. Rate limiting of the back-end system applies to all non-production environments; the production environment, however, does not have any rate limiting. What is the most cost-effective approach to conduct performance tests of the API implementation in the non-production staging environment?


A. Include logic within the API implementation that bypasses invocations of the back-end system in the staging environment and instead invokes a mocking service that replicates typical back-end system responses. Then conduct performance tests using this API implementation


B. Use MUnit to simulate standard responses from the back-end system. Then conduct performance tests to identify other bottlenecks in the system


C. Create a mocking service that replicates the back-end system's production performance characteristics. Then configure the API implementation to use the mocking service and conduct the performance test


D. Conduct scaled-down performance tests in the staging environment against the rate-limited back-end system. Then extrapolate the performance results to full production scale





C.
  Create a mocking service that replicates the back-end system's production performance characteristics. Then configure the API implementation to use the mocking service and conduct the performance test

Explanation:

To conduct performance testing in a non-production environment where rate limits are enforced, the most cost-effective approach is:

C. Create a mocking service that replicates the back-end system's production performance characteristics. Then configure the API implementation to use the mocking service and conduct the performance test.

Mocking Service: Develop a mock service that emulates the performance characteristics of the production back-end system. This service should mimic the response times, data formats, and any relevant behavior of the actual back-end system without imposing rate limits.

Configuration: Modify the API implementation to route requests to the mocking service instead of the actual back-end system. This ensures that the performance tests are not impacted by the rate limits imposed in the non-production environment.

Performance Testing: Conduct the performance tests using the API implementation configured with the mocking service. This approach allows you to assess the performance under expected production load conditions without being constrained by non-production rate limits.

This method ensures that performance testing is accurate and reflective of the production environment without additional costs or constraints due to rate limiting in staging environments.
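For illustration, below is a minimal sketch of such a mocking service, assuming a simple HTTP back-end; the port, simulated latency, and response payload are assumptions and would in practice be tuned to match the production back-end's observed latency profile and data formats.

```python
import json
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mock of the back-end system for performance testing.
# The 250 ms delay and the payload below are illustrative assumptions.
SIMULATED_LATENCY_SECONDS = 0.25

class MockBackendHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(SIMULATED_LATENCY_SECONDS)  # mimic production response time
        body = json.dumps({"status": "OK", "data": {"id": 42}}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep load-test output quiet

if __name__ == "__main__":
    # The API implementation under test would be configured (for example via
    # a property override) to point its back-end connector at this address.
    HTTPServer(("0.0.0.0", 8081), MockBackendHandler).serve_forever()
```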






References:

MuleSoft Documentation: Mocking Services

MuleSoft Documentation: Performance Testing

An organization is building a test suite for their applications using MUnit. The integration architect has recommended using the Test Recorder in Anypoint Studio to record the processing flows and then configure unit tests based on the captured events. What are two considerations that must be kept in mind while using the Test Recorder? (Choose two answers)


A. Tests for flows cannot be created with Mule errors raised inside the flow or already existing in the incoming event


B. The recorder supports mocking a message before or inside a ForEach processor


C. The recorder supports loops where the structure of the data being tested changes inside the iteration


D. A recorded flow execution ends successfully but the result does not reach its destination because the application is killed





A.
  Tests for flows cannot be created with Mule errors raised inside the flow or already existing in the incoming event

D.
  A recorded flow execution ends successfully but the result does not reach its destination because the application is killed

Explanation:

When using MUnit's test recorder in Anypoint Studio to create unit tests, consider the following points:

A. Tests for flows cannot be created with Mule errors raised inside the flow or already existing in the incoming event:

Explanation: The test recorder cannot record flows if Mule errors are raised during the flow execution or if the incoming event already contains errors. This limitation requires users to handle or clear errors before recording the flow to ensure accurate test creation.

D. A recorded flow execution ends successfully but the result does not reach its destination because the application is killed:

Explanation: If the application is killed before the recorded flow execution completes, the recorder captures the flow up to the point of termination. However, the final result may not be reached or recorded. This scenario should be avoided to ensure complete and reliable test recordings.

These considerations help ensure the accuracy and reliability of tests created using the Test Recorder.

References:

MUnit Documentation: https://docs.mulesoft.com/munit/2.2/

MUnit Test Recorder: https://blogs.mulesoft.com/dev/mule-dev/using-the-munit-test-recorder/

A stockbroking company uses CloudHub VPC to deploy Mule applications. A Mule application needs to connect to a database application in the customer's on-premises corporate data center and also to a Kafka cluster running in an AWS VPC. How is access enabled for the API to connect to the database application and the Kafka cluster securely?


A. Set up a transit gateway connecting the customer's on-premises corporate data center to the AWS VPC


B. Set up Anypoint VPN to the customer's on-premises corporate data center and VPC peering with the AWS VPC


C. Set up VPC peering with the AWS VPC and the customer's on-premises corporate data center


D. Set up VPC peering with the customer's on-premises corporate data center and Anypoint VPN to the AWS VPC





B.
  Set up Anypoint VPN to the customer's on-premises corporate data center and VPC peering with the AWS VPC

Explanation:

Requirement Analysis: The Mule application needs secure access to both an on-premises database and a Kafka cluster in AWS VPC.

Solution: Setting up Anypoint VPN for the on-premises corporate data center and VPC peering with AWS VPC ensures secure and seamless connectivity.

Implementation Steps: Configure Anypoint VPN to establish a secure tunnel between the CloudHub VPC and the customer's on-premises corporate data center, and set up VPC peering between the CloudHub VPC and the AWS VPC that hosts the Kafka cluster.

Advantages: Both connections keep traffic on private, secured networks, so the Mule application can reach the database and the Kafka cluster without exposing either system to the public internet.

References

MuleSoft Documentation on Anypoint VPN

AWS Documentation on VPC Peering

A company wants its users to log in to Anypoint Platform using the company's own internal user credentials. To achieve this, the company needs to integrate an external identity provider (IdP) with the company's Anypoint Platform master organization, but SAML 2.0 CANNOT be used. Besides SAML 2.0, what single-sign-on standard can the company use to integrate the IdP with their Anypoint Platform master organization?


A. SAML 1.0


B. OAuth 2.0


C. Basic Authentication


D. OpenID Connect





D.
  OpenID Connect

Explanation:

As the Anypoint Platform organization administrator, you can configure identity management in Anypoint Platform to set up users for single sign-on (SSO).

Configure identity management using one of the following single sign-on standards:

1) OpenID Connect: End user identity verification by an authorization server including SSO

2) SAML 2.0: Web-based authorization including cross-domain SSO
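For illustration, the sketch below outlines the OpenID Connect authorization code flow that underpins this kind of SSO, assuming a generic external IdP; the IdP URL, client credentials, and redirect URI are placeholders rather than values from Anypoint Platform or the question.

```python
import json
import urllib.parse
import urllib.request

# Conceptual sketch of the OpenID Connect authorization code flow an external
# identity provider (IdP) uses to verify an end user's identity for SSO.
# All endpoint and credential values below are hypothetical placeholders.

IDP_BASE = "https://idp.example.com"            # hypothetical IdP
CLIENT_ID = "anypoint-platform-client"          # hypothetical registered client
CLIENT_SECRET = "change-me"                     # hypothetical secret
REDIRECT_URI = "https://example.org/callback"   # hypothetical redirect URI

def discover_endpoints() -> dict:
    """Fetch the IdP's OpenID Connect discovery document."""
    url = f"{IDP_BASE}/.well-known/openid-configuration"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def exchange_code_for_tokens(auth_code: str) -> dict:
    """Exchange an authorization code for tokens, including the ID token
    that asserts the authenticated user's identity."""
    config = discover_endpoints()
    data = urllib.parse.urlencode({
        "grant_type": "authorization_code",
        "code": auth_code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }).encode()
    req = urllib.request.Request(config["token_endpoint"], data=data, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # contains id_token, access_token, ...

# In the SSO flow, the user is first redirected to the IdP's authorization
# endpoint to log in with the company's internal credentials; the IdP then
# redirects back with the authorization code that is exchanged above.
```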

