
Data-Cloud-Consultant Practice Test


Page 3 out of 33 Pages

A customer wants to use the transactional data from their data warehouse in Data Cloud. They are only able to export the data via an SFTP site. How should the file be brought into Data Cloud?


A. Ingest the file with the SFTP Connector.


B. Ingest the file through the Cloud Storage Connector.


C. Manually import the file using the Data Import Wizard.


D. Use Salesforce's Dataloader application to perform a bulk upload from a desktop.





A.
  Ingest the file with the SFTP Connector.

Explanation:

A. The SFTP Connector is a data source connector that allows Data Cloud to ingest data from an SFTP server. The customer can use the SFTP Connector to create a data stream from their exported file and bring it into Data Cloud as a data lake object. The other options are not the best ways to bring the file into Data Cloud because:

B. The Cloud Storage Connector is a data source connector that allows Data Cloud to ingest data from cloud storage services such as Amazon S3, Azure Storage, or Google Cloud Storage. The customer does not have their data in any of these services, but only on an SFTP site.

C. The Data Import Wizard is a tool that allows users to import data for many standard Salesforce objects, such as accounts, contacts, leads, solutions, and campaign members. It imports records into Salesforce CRM, not into Data Cloud, and it cannot pull files from an SFTP site.

D. Data Loader is an application that allows users to insert, update, delete, or export Salesforce records. It is not designed to ingest data from an SFTP site or into Data Cloud. References: SFTP Connector - Salesforce, Create Data Streams with the SFTP Connector in Data Cloud - Salesforce, Data Import Wizard - Salesforce, Salesforce Data Loader

A consultant wants to build a new audience in Data Cloud. Which three criteria can the consultant include when building a segment? Choose 3 answers


A. Direct attributes


B. Data stream attributes


C. Calculated Insights


D. Related attributes


E. Streaming insights





A.
  Direct attributes

C.
  Calculated Insights

D.
  Related attributes

Explanation:

A segment is a subset of individuals who meet certain criteria based on their attributes and behaviors. A consultant can use different types of criteria when building a segment in Data Cloud, such as:

Direct attributes: These are attributes that describe the characteristics of an individual, such as name, email, gender, age, etc. These attributes are stored in the Profile data model object (DMO) and can be used to filter individuals based on their profile data.

Calculated Insights: These are insights that perform calculations on data in a data space and store the results as metrics and dimensions in Data Cloud. They can be used to segment individuals based on scores derived from their data, such as customer lifetime value, churn risk, loyalty tier, etc.

Related attributes: These are attributes that describe the relationships of an individual with other DMOs, such as Email, Engagement, Order, Product, etc. These attributes can be used to segment individuals based on their interactions or transactions with different entities, such as email opens, clicks, purchases, etc.

The other two options are not valid criteria for building a segment in Data Cloud. Data stream attributes describe the streaming data that is ingested into Data Cloud from various sources, such as Marketing Cloud, Commerce Cloud, Service Cloud, etc. These attributes are not directly available for segmentation, but they can be transformed and stored in data lake objects using streaming data transforms. Streaming insights analyze streaming data in near real time and trigger actions based on predefined conditions. They are used for activation and personalization, not for segmentation.

References: Create a Segment in Data Cloud, Use Insights in Data Cloud, Data Cloud Data Model

A segment fails to refresh with the error "Segment references too many data lake objects (DLOS)". Which two troubleshooting tips should help remedy this issue? Choose 2 answers


A. Split the segment into smaller segments.


B. Use calculated insights in order to reduce the complexity of the segmentation query.


C. Refine segmentation criteria to limit up to five custom data model objects (DMOs).


D. Space out the segment schedules to reduce DLO load.





A.
  Split the segment into smaller segments.

B.
  Use calculated insights in order to reduce the complexity of the segmentation query.

Explanation:

The error “Segment references too many data lake objects (DLOs)” occurs when a segment query exceeds the limit of 50 DLOs that can be referenced in a single query. This can happen when the segment has too many filters, nested segments, or exclusion criteria that involve different DLOs. To remedy this issue, the consultant can try the following troubleshooting tips:

Split the segment into smaller segments. The consultant can divide the segment into multiple segments that have fewer filters, nested segments, or exclusion criteria. This can reduce the number of DLOs that are referenced in each segment query and avoid the error. The consultant can then use the smaller segments as nested segments in a larger segment, or activate them separately.

Use calculated insights in order to reduce the complexity of the segmentation query. The consultant can create calculated insights that are derived from existing data using formulas. Calculated insights can simplify the segmentation query by replacing multiple filters or nested segments with a single attribute. For example, instead of using multiple filters to segment individuals based on their purchase history, the consultant can create a calculated insight that calculates the lifetime value of each individual and use that as a filter.

The other options are not troubleshooting tips that can help remedy this issue. Refining segmentation criteria to limit up to five custom data model objects (DMOs) is not a valid option, as the limit of 50 DLOs applies to both standard and custom DMOs. Spacing out the segment schedules to reduce DLO load is not a valid option, as the error is not related to the DLO load, but to the segment query complexity.

References:

Troubleshoot Segment Errors

Create a Calculated Insight

Create a Segment in Data Cloud
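The "too many DLOs" check above can be anticipated before a refresh ever runs. As a minimal sketch, assuming a segment's criteria can be represented as a nested filter tree (the dict shape and DLO names below are illustrative, not a Data Cloud API), counting the distinct DLOs a segment touches looks like this:

```python
# Illustrative sketch (not a Data Cloud API): estimate how many distinct
# data lake objects (DLOs) a segment's filter tree references, to anticipate
# the "references too many data lake objects" error before scheduling it.
# The 50-DLO ceiling reflects the limit described in the explanation above.

DLO_LIMIT = 50

def referenced_dlos(criteria):
    """Walk a nested filter tree and collect the distinct DLO names it touches.

    `criteria` is shaped like:
      {"dlo": "UnifiedIndividual__dlm"}     # leaf filter on one DLO
      {"op": "AND", "children": [...]}      # boolean group of sub-criteria
    """
    if "dlo" in criteria:
        return {criteria["dlo"]}
    dlos = set()
    for child in criteria.get("children", []):
        dlos |= referenced_dlos(child)
    return dlos

def check_segment(criteria):
    """Return (within_limit, sorted list of distinct DLOs referenced)."""
    dlos = referenced_dlos(criteria)
    return len(dlos) <= DLO_LIMIT, sorted(dlos)

segment = {
    "op": "AND",
    "children": [
        {"dlo": "UnifiedIndividual__dlm"},
        {"dlo": "Email_Engagement__dlm"},
        {"op": "OR", "children": [
            {"dlo": "Sales_Order__dlm"},
            {"dlo": "Email_Engagement__dlm"},  # duplicates count only once
        ]},
    ],
}

ok, dlos = check_segment(segment)
print(ok, dlos)
```

Replacing the whole `OR` group with a single calculated-insight attribute is exactly the second troubleshooting tip: it collapses several DLO references into one.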

The Salesforce CRM Connector is configured and the Case object data stream is set up. Subsequently, a new custom field named Business Priority is created on the Case object in Salesforce CRM. However, the new field is not available when trying to add it to the data stream. Which statement addresses the cause of this issue?


A. The Salesforce Integration User is missing Read permission on the newly created field.


B. The Salesforce Data Loader application should be used to perform a bulk upload from a desktop.


C. Custom fields on the Case object are not supported for ingesting into Data Cloud.


D. After 24 hours when the data stream refreshes it will automatically include any new fields that were added to the Salesforce CRM.





A.
  The Salesforce Integration User is missing Read permission on the newly created field.

Explanation:

The Salesforce CRM Connector uses the Salesforce Integration User to access data from the Salesforce CRM org. The Integration User must have Read permission on every field included in the data stream. If the Integration User lacks Read permission on the newly created field, that field will not be available for selection in the data stream configuration. To resolve the issue, an administrator should grant Read access to the new field via the Integration User's profile or a permission set. References: Create a Salesforce CRM Data Stream, Edit a Data Stream, Salesforce Data Cloud Full Refresh for CRM, SFMC, or Ingestion API Data Streams
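Field-level security like this can be verified with a SOQL query against the standard FieldPermissions object. The helper below only builds the query string an admin could run (for example, in Developer Console); the `Business_Priority__c` API name is an assumption for the custom field in the question:

```python
# Illustrative sketch: build a SOQL query against the standard
# FieldPermissions object to check which permission sets/profiles grant
# Read access on a field. The field API name "Business_Priority__c" is an
# assumed name for the custom field described in the question.

def field_read_check_soql(object_name, field_api_name):
    # FieldPermissions stores the field as "Object.Field_API_Name".
    field = f"{object_name}.{field_api_name}"
    return (
        "SELECT Parent.Name, PermissionsRead, PermissionsEdit "
        "FROM FieldPermissions "
        f"WHERE SobjectType = '{object_name}' AND Field = '{field}'"
    )

print(field_read_check_soql("Case", "Business_Priority__c"))
```

If no row with `PermissionsRead = true` covers the Integration User, the field will stay hidden from the data stream configuration until access is granted.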

A customer requests that their personal data be deleted. Which action should the consultant take to accommodate this request in Data Cloud?


A. Use a streaming API call to delete the customer's information.


B. Use Profile Explorer to delete the customer data from Data Cloud.


C. Use Consent API to request deletion of the customer's information.


D. Use the Data Rights Subject Request tool to request deletion of the customer's information.





C.
  Use Consent API to request deletion of the customer's information.

Explanation:

In Data Cloud, requests to delete an individual's personal data (Right to Be Forgotten) are submitted through the Consent API. A deletion request made through the Consent API removes the individual's profile and related data from Data Cloud, and the request is reprocessed periodically so that any data reingested in the meantime is also removed. Profile Explorer is a tool for viewing unified profiles, not for deleting data, and streaming API calls are used to ingest data into Data Cloud, not to remove it. References: Consent API Developer Guide, Data Deletion for Data Cloud.
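A minimal sketch of what such a request could look like, assuming REST access to the org. The instance URL, API version, and endpoint path below are illustrative assumptions, not confirmed values; check the Salesforce Consent API documentation for the exact path and parameters before sending anything. The helper only composes the request and never sends it:

```python
# Illustrative sketch only: compose (but do not send) a Consent API data
# deletion request. The endpoint path, API version, and IDs below are
# ASSUMPTIONS for illustration; consult the Consent API documentation for
# the exact deletion endpoint supported by your org.

from urllib.parse import urlencode

def build_deletion_request(instance_url, access_token, individual_id,
                           api_version="v59.0"):
    # Hypothetical path for a Data Cloud right-to-be-forgotten request.
    path = f"/services/data/{api_version}/consent/dcc/delete"
    query = urlencode({"ids": individual_id})
    return {
        "method": "DELETE",
        "url": f"{instance_url}{path}?{query}",
        "headers": {"Authorization": f"Bearer {access_token}"},
    }

req = build_deletion_request(
    "https://example.my.salesforce.com", "00Dxx...token", "003xx0000012345"
)
print(req["method"], req["url"])
```

Keeping request construction separate from transmission makes the sketch easy to inspect and test without touching a live org.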

