Universal Containers (UC) has multiple Salesforce orgs distributed across its regional branches. Each branch stores local customer data in its org's Account and Contact objects, so UC is unable to view customers across all orgs. As part of an initiative to create a 360-degree view of the customer, UC would like to see Account and Contact data from all orgs in one place. What should a data architect suggest to achieve this 360-degree view of the customer?
A. Consolidate the data from each org into a centralized datastore.
B. Use Salesforce Connect's cross-org adapter.
C. Build a bidirectional integration between all orgs.
D. Use an ETL tool to migrate gap Accounts and Contacts into each org.
Answer: A. Consolidate the data from each org into a centralized datastore.
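As a rough illustration of the keyed answer, the sketch below pulls Account and Contact records from each regional org into one centralized datastore. It assumes Python with the third-party simple-salesforce library; the org credentials and the load_to_warehouse() helper are hypothetical placeholders, not part of the question.

```python
from simple_salesforce import Salesforce

# One credential set per regional org (placeholders, not real values).
ORG_CREDENTIALS = [
    {"username": "etl@emea.example.com", "password": "...", "security_token": "..."},
    {"username": "etl@apac.example.com", "password": "...", "security_token": "..."},
]

def load_to_warehouse(table, rows):
    """Hypothetical writer for the centralized datastore (e.g. a warehouse table)."""
    print(f"Loading {len(rows)} rows into {table}")

for creds in ORG_CREDENTIALS:
    sf = Salesforce(**creds)
    # Pull the customer objects from this org and land them centrally.
    accounts = sf.query_all("SELECT Id, Name, BillingCountry FROM Account")["records"]
    contacts = sf.query_all("SELECT Id, FirstName, LastName, AccountId FROM Contact")["records"]
    load_to_warehouse("account", accounts)
    load_to_warehouse("contact", contacts)
```

Deduplication and matching across orgs would happen in the datastore itself; the extract step above only makes all customer data visible in one place.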
Universal Containers is setting up an external Business Intelligence (BI) system and wants to extract 1,000,000 Contact records. What should be recommended to avoid timeouts during the export process?
A. Use the SOAP API to export the data.
B. Utilize the Bulk API to export the data.
C. Use GZIP compression to export the data.
D. Schedule a Batch Apex job to export the data.
Answer: C. Use GZIP compression to export the data.
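To make the keyed answer concrete, here is a minimal sketch of a REST query export with gzip compression requested on the response. The instance URL and session ID are placeholders; the query endpoint and nextRecordsUrl paging are standard Salesforce REST API behavior, and the Python requests library transparently decompresses gzip-encoded responses.

```python
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder org URL
SESSION_ID = "00D..."  # placeholder session ID / access token

headers = {
    "Authorization": f"Bearer {SESSION_ID}",
    "Accept-Encoding": "gzip",  # ask the server to compress the response body
}

url = f"{INSTANCE}/services/data/v59.0/query"
params = {"q": "SELECT Id, FirstName, LastName, Email FROM Contact"}

records = []
while url:
    resp = requests.get(url, headers=headers, params=params)
    resp.raise_for_status()
    body = resp.json()
    records.extend(body["records"])
    next_url = body.get("nextRecordsUrl")  # present until the query is done
    url = f"{INSTANCE}{next_url}" if next_url else None
    params = None  # the SOQL parameter only goes on the first request

print(f"Exported {len(records)} contacts")
```

Compressing the payload shrinks each page of results on the wire, which is what helps a long-running million-record extract avoid network timeouts.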
Northern Trail Outfitters (NTO) needs to implement an archive solution for its Salesforce data. The archive solution must help NTO do the following:
1. Remove outdated information that is not required on a day-to-day basis.
2. Improve Salesforce performance.
Which solution should be used to meet these requirements?
A. Identify a location to store archived data and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.
B. Identify a location to store archived data, and move the data to that location using a time-based workflow.
C. Use a formula field that shows true when a record reaches a defined age, use that field to run a report, and export the report to SharePoint.
D. Create a full copy sandbox and use it as a source for retaining archived data.
Answer: A. Identify a location to store archived data and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.
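As one way to realize the keyed answer, the sketch below is a nightly job (run from an external scheduler such as cron, or mirrored as a scheduled Batch Apex class inside Salesforce) that copies aged records to an archive location and then purges them. It assumes the simple-salesforce library; the Case retention rule and the archive_rows() helper are illustrative assumptions.

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

def archive_rows(rows):
    """Hypothetical writer for the chosen archive location (e.g. a data lake)."""
    print(f"Archived {len(rows)} rows")

# Select records older than the retention threshold; closed Cases older
# than two years are used here purely as an example.
aged = sf.query_all(
    "SELECT Id, Subject, ClosedDate FROM Case "
    "WHERE IsClosed = true AND ClosedDate < LAST_N_DAYS:730"
)["records"]

if aged:
    archive_rows(aged)
    # Purge from Salesforce only after the archive write succeeds.
    sf.bulk.Case.delete([{"Id": r["Id"]} for r in aged])
```

Purging the aged rows is what delivers the performance benefit; the nightly cadence keeps each migrate-and-purge run small.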
Universal Containers (UC) has a data model as shown in the image. The Project object has a private sharing model, and it has Roll-Up summary fields to calculate the number of resources assigned to the project, total hours for the project, and the number of work items associated with the project. What should the architect consider, knowing there will be a large number of time entry records loaded regularly from an external system into Salesforce.com?
A. Load all data using external IDs to link to parent records.
B. Use workflow to calculate summary values instead of Roll-Up summary fields.
C. Use triggers to calculate summary values instead of Roll-Up summary fields.
D. Load all data after deferring sharing calculations.
Answer: D. Load all data after deferring sharing calculations.
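The keyed answer refers to the Defer Sharing Calculations feature, which is enabled in Setup rather than in code, so the hedged sketch below shows only the surrounding bulk load of time entry records. It assumes simple-salesforce, and Time_Entry__c with its fields are hypothetical API names standing in for the objects in the image.

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

# Rows arriving from the external time-tracking system (illustrative shape).
time_entries = [
    {"Project__c": "a01XXXXXXXXXXXX", "Hours__c": 8, "Entry_Date__c": "2024-01-15"},
    # ... hundreds of thousands of rows in a real load
]

# Bulk API insert in large batches. With Defer Sharing Calculations
# enabled in Setup beforehand, the private-model Project shares are not
# recalculated batch by batch; sharing is recalculated once, after the
# full load completes.
results = sf.bulk.Time_Entry__c.insert(time_entries, batch_size=10000)
errors = [r for r in results if not r["success"]]
print(f"Loaded {len(results) - len(errors)} rows, {len(errors)} failures")
```

Note that the Roll-Up summary fields on Project still recalculate on each insert, which is why deferring the sharing work matters for throughput on a private sharing model.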
Universal Containers has a legacy system that captures Conferences and Venues. These Conferences can occur at any Venue. They create hundreds of thousands of Conferences per year. Historically, they have only used 20 Venues. Which two things should the data architect consider when denormalizing this data model into a single Conference object with a Venue picklist? Choose 2 answers
A. Limitations on master-detail relationships.
B. Org data storage limitations.
C. Bulk API limitations on picklist fields.
D. Standard list view in-line editing.
Answers: C. Bulk API limitations on picklist fields; D. Standard list view in-line editing.
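For illustration, here is a minimal migration sketch for the denormalized model: each legacy Conference row is written to a single Conference object whose Venue field is a picklist populated from the roughly 20 historical venue names. It assumes simple-salesforce; all object and field API names are hypothetical, and the picklist values themselves would be defined in Setup or via the Metadata API before the load runs.

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

# Legacy rows as extracted from the source system (illustrative shape).
legacy_conferences = [
    {"Name": "DevCon 2024", "VenueName": "Moscone Center"},
    {"Name": "Logistics Summit", "VenueName": "Javits Center"},
]

# Venue becomes a picklist value on Conference__c rather than a lookup
# to a separate Venue record; with only ~20 historical venues the
# picklist stays manageable.
records = [
    {"Name": c["Name"], "Venue__c": c["VenueName"]}
    for c in legacy_conferences
]
sf.bulk.Conference__c.insert(records, batch_size=10000)
```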