Title: Buy Latest Mar 02, 2023 Data-Architect Exam Q&A PDF - One Year Free Update [Q69-Q91]

Download the Latest Data-Architect Dump - 2023 Data-Architect Exam Questions

Salesforce Data-Architect Exam Syllabus Topics:

- Topic 1: Recommend techniques to ensure data is persisted in a consistent manner. Design a data model that scales considering large data volume and solution performance.
- Topic 2: Recommend a design to effectively consolidate and/or leverage data from multiple Salesforce instances. Compare and contrast various techniques, approaches and considerations.
- Topic 3: Recommend and use techniques for establishing a "golden record" or "system of truth". Recommend a data archiving and purging plan that is optimal for the customer's data storage management needs.
- Topic 4: Compare and contrast the different reasons for implementing Big Objects vs Standard/Custom objects within a production instance. Discuss the various options to identify, classify and protect personal and sensitive information.
- Topic 5: Compare and contrast various techniques and considerations for designing a data model. Compare and contrast the various techniques, approaches and considerations for implementing Master Data Management solutions.
- Topic 6: Compare and contrast various approaches and considerations for designing and implementing an enterprise data governance program. Discuss criteria and methodology for picking the winning attributes.
- Topic 7: Recommend appropriate approaches and techniques to capture and maintain customer reference and metadata. Compare and contrast various techniques for improving performance when migrating large data volumes into Salesforce.
- Topic 8: Recommend approaches and techniques for consolidating data attributes from multiple sources, given a scenario with multiple systems of interaction.

QUESTION 69
Universal Containers (UC) has several custom Visualforce applications in which users are able to edit Opportunity records. UC struggles with data completeness on its Opportunity records and has decided to make certain fields required that have not been in the past. The newly required fields are dependent on the Stage of the Opportunity, such that certain fields are only required once an Opportunity advances to later stages. There are two such fields. What is the simplest approach to handle this new requirement?

- Write an Apex trigger that checks each field when records are saved.
- Update the Opportunity page layout to mark these fields as required.
- Update these Opportunity field definitions in Setup to be required.
- Use a validation rule for each field that takes the Stage into consideration.

QUESTION 70
UC recently migrated 1 billion customer-related records from a legacy data store to Heroku Postgres. A subset of the data needs to be synchronized with Salesforce so that service agents are able to support customers directly within the service console. The remaining non-synchronized set of data will need to be accessible from Salesforce at any point in time, but UC management is concerned about storage limitations. What should a data architect recommend to meet these requirements with minimal effort?

- Use Heroku Connect to bi-directionally sync all data between systems.
- Virtualize the remaining set of data with Salesforce Connect and external objects.
- Migrate the data to big objects and leverage async SOQL with custom objects.
- As needed, make callouts into Heroku Postgres and persist the data in Salesforce.

QUESTION 71
A customer is migrating 10 million orders and 30 million order lines into Salesforce using the Bulk API.
The engineer is experiencing time-out errors or long delays querying parent order IDs in Salesforce before importing related order line items. What is the recommended solution?

- Leverage an External ID from source system orders to import related order lines.
- Leverage Batch Apex to update the order ID on related order lines after import.
- Query only indexed ID field values on the imported orders to import related order lines.
- Leverage a sequence of numbers on the imported orders to import related order lines.

QUESTION 72
A large retail B2C customer wants to build a 360-degree view of its customers for its call center agents. Customer interactions are currently maintained in the following systems:

1. Salesforce CRM
2. Custom billing solution
3. Customer Master Data Management (MDM)
4. Contract Management system
5. Marketing solution

What should a data architect recommend that would help uniquely identify a customer across multiple systems?

- Create a custom object that will serve as a cross reference for the customer ID.
- Create a customer database and use this ID in all systems.
- Create a custom field as an external ID to maintain the customer ID from the MDM solution.
- Store the Salesforce ID in all the solutions to identify the customer.

QUESTION 73
Universal Containers (UC) is implementing a Salesforce project with large volumes of data and daily transactions. The solution includes both real-time web service integrations and Visualforce mash-ups with back-end systems. The Salesforce Full sandbox used by the project integrates with full-scale back-end testing systems. Which two types of performance testing are appropriate for this project? Choose 2 answers

- Post-go-live automated page-load testing against the Salesforce Production org.
- Pre-go-live unit testing in the Salesforce Full sandbox.
- Pre-go-live automated page-load testing against the Salesforce Full sandbox.
- Stress testing against the web services hosted by the integration middleware.
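The External ID technique from Question 71 can be sketched briefly. When loading child records with the Bulk API, a CSV column of the form `Parent__r.ExternalIdField__c` lets Salesforce resolve the parent lookup by external ID, so no pre-query of Salesforce record IDs is needed. The object and field names below (`Order__r`, `Legacy_Order_Id__c`, `Product__c`, `Quantity__c`) are illustrative assumptions, not taken from the exam text:

```python
import csv
import io

def build_order_line_csv(order_lines):
    """Build a Bulk API-style CSV for order lines.

    Hypothetical field names: the "Order__r.Legacy_Order_Id__c" column asks
    Salesforce to resolve each line's parent Order by its external ID value
    instead of an 18-character record ID queried in advance.
    """
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(["Order__r.Legacy_Order_Id__c", "Product__c", "Quantity__c"])
    for line in order_lines:
        writer.writerow([line["legacy_order_id"], line["product"], line["qty"]])
    return buf.getvalue()

lines = [
    {"legacy_order_id": "ORD-0001", "product": "Widget", "qty": 3},
    {"legacy_order_id": "ORD-0002", "product": "Gadget", "qty": 1},
]
print(build_order_line_csv(lines))
```

The external ID field must be created on the parent object and flagged as External ID before the load; the Bulk API then matches each child row to its parent without any SOQL round trips from the migration tool.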
QUESTION 74
Universal Containers is looking to use Salesforce to manage its sales organization. It will be migrating legacy account data from two aging systems into Salesforce. Which two design considerations should an architect take to minimize data duplication? Choose 2 answers

- Use Salesforce matching and duplicate rules.
- Use a workflow to check and prevent duplicates.
- Import the data concurrently.
- Clean data before importing it to Salesforce.

QUESTION 75
UC is using Salesforce CRM. UC sales managers are complaining about data quality and would like to monitor and measure data quality. Which two solutions should a data architect recommend to monitor and measure data quality? Choose 2 answers

- Install and run a data quality analysis dashboard app.
- Review data quality reports and dashboards.
- Export data and check for data completeness outside of Salesforce.
- Use custom objects and fields to identify issues.

QUESTION 76
A data architect has been tasked with optimizing a data stewardship engagement for a Salesforce instance. Which three areas of Salesforce should the architect review before proposing any design recommendation? Choose 3 answers

- Run key reports to determine what fields should be required.
- Review the metadata XML files for redundant fields to consolidate.
- Review the sharing model to determine the impact on duplicate records.
- Determine if any integration points create records in Salesforce.
- Export the setup audit trail to review what fields are being used.

QUESTION 77
UC has built a B2C ecommerce site on Heroku that shares customer and order data with a Heroku Postgres database. UC is currently utilizing Postgres as the single source of truth for both customers and orders. UC has asked a data architect to replicate the data into Salesforce so that Salesforce can now act as the system of record. Which three considerations should the data architect weigh before implementing this requirement?
Choose 3 answers:

- The selection of the tool required to replicate the data.
- Ensure there is a tight relationship between order data and an enterprise resource planning (ERP) application.
- Ensure the data is CRM-centric and able to populate standard or custom objects.
- Determine if the data is a driver of key processes implemented within Salesforce.
- Consider whether the data is required for sales reports, dashboards and KPIs.

QUESTION 78
For a production cutover, a large number of Account records will be loaded into Salesforce from a legacy system. The legacy system does not have enough information to determine the ownership for these Accounts upon initial load. Which two recommended options assign Account ownership to mitigate potential performance problems?

- Let a "system user" own all the Account records and make this user part of the highest-level role in the Role Hierarchy.
- Let a "system user" own all the Account records without assigning any role to this user in the Role Hierarchy.
- Let a "system user" own the Account records and assign this user to the lowest-level role in the Role Hierarchy.
- Let the VP of the Sales department, who will report directly to the senior VP, own all the Account records.

QUESTION 79
Which three characteristics of a skinny table help improve report and query performance?

- Skinny tables are kept in sync with changes to data in the source tables.
- Skinny tables can contain frequently used fields and thereby help avoid joins.
- Skinny tables do not include records that are available in the recycle bin.
- Skinny tables can be used to create custom indexes on multi-select picklist fields.
- Skinny tables provide a view across multiple objects for easy access to combined data.

QUESTION 80
UC needs to load a large volume of leads into Salesforce on a weekly basis.
During this process the validation rules are disabled. What should a data architect recommend to ensure data quality is maintained in Salesforce?

- Develop a custom Apex batch process to improve quality once the load is completed.
- Ensure the lead data is preprocessed for quality before loading it into Salesforce.
- Allow validation rules to be activated during the load of leads into Salesforce.
- Activate validation rules once the leads are loaded into Salesforce to maintain quality.

QUESTION 81
UC has one Salesforce org (Org A) and recently acquired a secondary company with its own Salesforce org (Org B). UC has decided to keep the orgs running separately but would like to bidirectionally share opportunities between the orgs in near-real time. Which three options should a data architect recommend to share data between Org A and Org B? Choose 3 answers

- Install a third-party AppExchange tool to handle the data sharing.
- Leverage middleware tools to bidirectionally send Opportunity data across orgs.
- Leverage Heroku Connect and Heroku Postgres to bidirectionally sync Opportunities.
- Use Salesforce Connect and the cross-org adapter to visualize Opportunities in external objects.
- Develop an Apex class that pushes Opportunity data between orgs daily via the Apex scheduler.

QUESTION 82
Universal Containers (UC) would like to build a Human Resources application on Salesforce to manage employee details, payroll, and hiring efforts. To adequately capture and store the relevant data, the application will need to leverage 45 custom objects. In addition to this, UC expects roughly 20,000 API calls into Salesforce from an on-premises application daily. Which license type should a data architect recommend that best fits these requirements?

- Lightning Platform Starter
- Lightning External Apps Starter
- Lightning Platform Plus
- Service Cloud

QUESTION 83
Northern Trail Outfitters (NTO) uses Sales Cloud and Service Cloud to manage sales and support processes.
Some of NTO's teams are complaining that they see new fields on their pages and are unsure which values need to be input. NTO is concerned about a lack of governance in making changes to Salesforce. Which governance measure should a data architect recommend to solve this issue?

- Create reports to identify fields which users are leaving blank, and use external data sources to augment the missing data.
- Create and manage a data dictionary and set up a governance process for changes made to common objects.
- Add description fields to explain why the field is used, and mark the field as required.
- Create validation rules with error messages to explain why the field is used.

QUESTION 84
Universal Containers (UC) has 1,000 accounts and 50,000 opportunities. UC has an enterprise security requirement to export all sales data outside of Salesforce on a weekly basis. The security requirement also calls for exporting key operational data that includes events such as file downloads, logins, logouts, etc. Which two recommended approaches would address the above requirement?

- Use Event Monitoring to extract event data to on-premise systems.
- Use a custom-built extract job to extract operational data to on-premise systems.
- Use Weekly Export to extract transactional data to on-premise systems.
- Use Field Audit History to capture operational data and extract it to on-premise systems.

QUESTION 85
Northern Trail Outfitters (NTO) runs its entire business out of an enterprise data warehouse (EDW). NTO's sales team started using Salesforce after a recent implementation, but currently lacks the data required to advance an opportunity to the next stage. NTO's management has researched Salesforce Connect and would like to use it to virtualize and report on data from the EDW within Salesforce. NTO will be running thousands of reports per day across 10 to 15 external objects. What should a data architect consider before implementing Salesforce Connect for reporting?
- Maximum external objects per org
- Maximum page size for server-driven paging
- Maximum number of records returned
- OData callout limits per day

QUESTION 86
Universal Containers requires all customers to provide either a phone number or an email address when registering for an account. What should the data architect use to ensure this requirement is met?

- Apex class
- Required fields
- Validation rule
- Process Builder

QUESTION 87
Two million Opportunities need to be loaded in different batches into Salesforce using the Bulk API in parallel mode. What should an architect consider when loading the Opportunity records?

- Use the Name field values to sort batches.
- Group batches by the AccountId field.
- Create indexes on Opportunity object text fields.
- Order batches by Auto-number field.

QUESTION 88
Universal Containers (UC) has accumulated data over the years and has never deleted data from its Salesforce org. UC is now exceeding the storage allocations in the org and is looking for options to delete unused data from the org. Which three recommendations should a data architect make in order to reduce the number of records in the org? Choose 3 answers

- Archive the records in an enterprise data warehouse (EDW) before deleting them from Salesforce.
- Use the REST API to permanently delete records from the Salesforce org.
- Identify records in objects that have not been modified or used in the last 3 years.
- Use hard delete in Batch Apex to permanently delete records from Salesforce.
- Use hard delete in the Bulk API to permanently delete records from Salesforce.

QUESTION 89
Universal Containers has more than 10 million records in the Order__c object. The query has timed out when running a bulk query. What should be considered to resolve the query timeout?

- Streaming API
- Tooling API
- PK Chunking
- Metadata API

QUESTION 90
Universal Containers (UC) is planning to move away from a legacy CRM to Salesforce.
As part of one-time data migration, UC will need to keep the original date when a contact was created in the legacy system. How should an Architect design the data migration solution to meet this requirement?  After the data is migrated, perform an update on all records to set the original date in a standard CreatedDate field.  Create a new field on Contact object to capture the Created Date. Hide the standard CreatedDate field using Field -Level Security.  Enable “Set Audit Fields” and assign the permission to the user loading the data for the duration of the migration.  Write an Apex trigger on the Contact object, before insert event to set the original value in a standard CreatedDate field. QUESTION 91Universal Containers (UC) maintains a collection of several million Account records that represent business in the United Sates. As a logistics company, this list is one of the most valuable and important components of UC’s business, and the accuracy of shipping addresses is paramount. Recently it has been noticed that too many of the addresses of these businesses are inaccurate, or the businesses don’t exist. Which two scalable strategies should UC consider to improve the quality of their Account addresses?  Contact each business on the list and ask them to review and update their address information.  Build a team of employees that validate Accounts by searching the web and making phone calls.  Leverage Data.com Clean to clean up Account address fields with the D&B database.  Integrate with a third-party database or services for address validation and enrichment.  
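The batching consideration in Question 87 can be illustrated with a minimal sketch: when loading Opportunities with the Bulk API in parallel mode, sorting records by AccountId before splitting them into batches keeps each Account's children together, which reduces parent-record lock contention between batches running in parallel. The records here are plain dicts standing in for CSV rows; only the field names mirror Salesforce's:

```python
def batch_by_account(records, batch_size):
    """Sort records by AccountId, then split into fixed-size batches.

    Keeping all children of the same Account in adjacent batches (ideally
    the same batch) avoids two parallel Bulk API batches both trying to
    lock the same parent Account record.
    """
    ordered = sorted(records, key=lambda r: r["AccountId"])
    return [ordered[i:i + batch_size] for i in range(0, len(ordered), batch_size)]

opps = [
    {"Name": "Opp 1", "AccountId": "001B"},
    {"Name": "Opp 2", "AccountId": "001A"},
    {"Name": "Opp 3", "AccountId": "001B"},
    {"Name": "Opp 4", "AccountId": "001A"},
]
for batch in batch_by_account(opps, 2):
    print([o["AccountId"] for o in batch])
# First batch holds both 001A opportunities, second holds both 001B.
```

In a real migration the same grouping would be applied to the source extract before generating Bulk API batch files; batch size and parallelism would then be tuned against observed lock errors.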
Verified Data-Architect Dumps Q&As - 1 Year Free & Quickly Updates: https://www.validexam.com/Data-Architect-latest-dumps.html

Post date: 2023-03-02 11:21:14