
Qlik Replicate Certification PDF Questions and Answers

Download the Latest Qlik Replicate Certification PDF Questions and Answers – Verified by Experts. Get fully prepared for the exam with this comprehensive PDF from PassQuestion. It includes the most up-to-date exam questions and accurate answers, designed to help you pass the exam with confidence.


Presentation Transcript


QlikView QREP Exam – Qlik Replicate Certification Exam
https://www.passquestion.com/qrep.html
35% OFF on All, Including QREP Questions and Answers
Pass the QlikView QREP exam on your first attempt with PassQuestion QREP questions and answers.
https://www.passquestion.com/

1. Which is the path to add a new column to a single table in a task?
A. Table Selection -> Schemas -> Add Column
B. New Transformation -> Column -> Add Column
C. Select Table -> Transform -> Add New
D. Table Settings -> General -> Add New Column
Answer: D
Explanation: To add a new column to a single table in a Qlik Replicate task, the correct path is through Table Settings. The process you would typically follow: navigate to the Table Settings of the table you wish to modify within your task, go to the General section, and use the option to Add New Column. This adds a column directly to the table's schema as part of the task configuration. Note that this action is part of the task's design phase, where you specify the schema changes that should be applied to the data as it is replicated. The other options listed, such as New Transformation or Select Table -> Transform, are not the direct paths for adding a new column to a table's schema within a task; they relate to different aspects of task configuration and transformation.

2. Using Qlik Replicate, how can the timestamp shown be converted to Unix time (Unix epoch - the number of seconds since January 1st, 1970)?
A. SELECT datetime(1092941466, 'unixepoch', 'localtime');
B. SELECT datetime(482340664, 'localtime', 'unixepoch');
C. strftime('%s',SAR_H_COMMIT_TIMESTAMP) - datetime.datetime('%s','1970-01-01 00:00:00')
D. strftime('%s',SAR_H_COMMIT_TIMESTAMP) - strftime('%s','1970-01-01 00:00:00')
E. Time.now.strftime('%s','1970-01-01 00:00:00')
Answer: D
Explanation: The goal is to convert a timestamp to Unix time (seconds since January 1, 1970). The strftime function is used to format date and time values; with the %s format specifier it returns a value in seconds, so the expression strftime('%s',SAR_H_COMMIT_TIMESTAMP) - strftime('%s','1970-01-01 00:00:00') extracts the Unix time from the timestamp and subtracts the Unix epoch start time, giving the number of seconds between SAR_H_COMMIT_TIMESTAMP and January 1, 1970. This is consistent with the Qlik Replicate documentation and standard SQL date and time functions. Breaking the expression down: strftime('%s', SAR_H_COMMIT_TIMESTAMP) converts SAR_H_COMMIT_TIMESTAMP to Unix time, and strftime('%s','1970-01-01 00:00:00') gives the Unix time for the epoch start date, which is 0. Subtracting the second term is therefore effectively a no-op, since Unix epoch time is by definition the time elapsed since 1970-01-01 00:00:00; however, if the timestamp is in a different time zone or format, adjustments may be needed. The other options do not correctly represent the conversion to Unix time: options A and B use datetime instead of strftime, which is not the correct function for this operation; option C includes datetime.datetime, which is not a valid function in Qlik Replicate and mixes Python code with SQL; option E uses Time.now.strftime, which is Ruby code and not applicable in the context of Qlik Replicate. Therefore, the verified answer is D, as it correctly uses the strftime function to convert a timestamp to Unix time in Qlik Replicate.
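As a quick sanity check, the winning expression can be tried outside Replicate in a plain sqlite3 session, since Qlik Replicate transformation expressions broadly follow SQLite syntax. This is only a sketch: the literal timestamp below is a stand-in for the SAR_H_COMMIT_TIMESTAMP column, chosen so that the result matches the 1092941466 value that appears in option A.

-- Stand-alone check in sqlite3 (not inside Qlik Replicate); the literal
-- timestamp substitutes for the SAR_H_COMMIT_TIMESTAMP column.
SELECT strftime('%s', '2004-08-19 18:51:06')
     - strftime('%s', '1970-01-01 00:00:00') AS unix_epoch_seconds;
-- Returns 1092941466, the number of seconds since 1970-01-01 00:00:00 UTC;
-- the second strftime term evaluates to 0, so subtracting it changes nothing.

-- Converting back with datetime(), as in option A but without 'localtime',
-- round-trips the value:
SELECT datetime(1092941466, 'unixepoch');  -- 2004-08-19 18:51:06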

3. Which information in Qlik Replicate can be retrieved from the server logs?
A. Network and performance issues
B. Load status and performance of task
C. Specific task information
D. Qlik Replicate Server status
Answer: D
Explanation: The server logs in Qlik Replicate provide information about the Qlik Replicate Server instance rather than about individual tasks. The logs can include various levels of information, such as errors, warnings, info, trace, and verbose details. Specifically, the server logs can provide insights into:
Network and performance issues: these might be indicated by error or warning messages related to connectivity or performance bottlenecks.
Load status and performance of task: while the server logs focus on the server instance, they may contain information about overall load status and performance, especially if server-level issues are affecting tasks.
Specific task information: the server logs can include information about tasks, particularly if there are errors or warnings that pertain to task execution at the server level.
Qlik Replicate Server status: this includes general information about the server's health, status, and any significant events that affect the server's operation.
Therefore, while the server logs can potentially contain a range of information, their primary purpose is to provide details on the Qlik Replicate Server status (D), including any issues that may impact the server's ability to function properly and manage tasks.

4. Which two components are responsible for reading data from the source endpoint and writing it to the target endpoint in Full Load replication? (Select two.)
A. SOURCE_UNLOAD
B. TARGET_APPLY
C. TARGET_UNLOAD
D. SOURCE_CAPTURE
E. TARGET_LOAD

Answer: AE
Explanation: The SOURCE_UNLOAD component is responsible for reading data from the source endpoint, and the TARGET_LOAD component is responsible for writing the data to the target endpoint. These components work in tandem during the Full Load replication process to move data from the source to the target; according to the Qlik Replicate documentation, they handle the extraction and loading phases of Full Load replication:
SOURCE_UNLOAD: unloads data from the source endpoint, extracting the data that needs to be replicated to the target system.
TARGET_LOAD: loads the data into the target endpoint; after the data is extracted by SOURCE_UNLOAD, the TARGET_LOAD component ensures that it is properly inserted into the target system.
The other options do not align with the Full Load replication process: B. TARGET_APPLY and D. SOURCE_CAPTURE are typically associated with the Change Data Capture (CDC) process, not the Full Load process, and C. TARGET_UNLOAD is not a recognized component in the context of Qlik Replicate's Full Load replication. Therefore, the correct answers are A. SOURCE_UNLOAD and E. TARGET_LOAD, as they are the components that handle the reading and writing of data during the Full Load replication process.

5. Where are the three options in Qlik Replicate used to read the log files located? (Select three.)
A. In Windows Event log
B. In Diagnostic package
C. In External monitoring tool
D. In Data directory of Installation
E. In Monitor of Qlik Replicate
F. In Enterprise Manager
Answer: BDE
Explanation: In Qlik Replicate, the options to read the log files are located in the following places:
In Diagnostic package (B): the diagnostic package in Qlik Replicate includes various log files that can be used for troubleshooting and analysis purposes.
In Data directory of Installation (D): the log files are written to the log directory within the data directory. This is the primary location where Qlik Replicate writes its log files, and it is not possible to change this location.
In Monitor of Qlik Replicate (E): the Monitor feature of Qlik Replicate allows users to view and manage log files. Users can access the Log Viewer from the Server Logging Levels or File Transfer Service Logging Level sub-tabs.
The other options provided do not align with the locations where log files can be read in Qlik Replicate:

A. In Windows Event log: this is not a location where Qlik Replicate log files are stored.
C. In External monitoring tool: while external monitoring tools can be used to read log files, they are not a direct feature of Qlik Replicate for reading log files.
F. In Enterprise Manager: the Enterprise Manager is a separate component that can manage and monitor multiple Qlik Replicate instances, but it is not where log files are directly read.
Therefore, the verified answers are B, D, and E, as they represent the locations within Qlik Replicate where log files can be accessed and read.

6. In the CDC mode of a Qlik Replicate task, which option can be set for Batch optimized apply mode?
A. Source connection processes
B. Number of changed records
C. Time and/or volume
D. Maximum time to batch transactions
Answer: C
Explanation: In Change Data Capture (CDC) mode, Batch optimized apply mode can be tuned based on time and/or volume: the batching of transactions is controlled by specifying time intervals or the volume of data changes to be batched together. This optimization improves performance by reducing the frequency of writes to the target system and handling large volumes of changes efficiently; the Qlik Replicate documentation describes it as a way to enhance the efficiency of data replication in CDC mode by batching transactions based on specific criteria. Here is how the two criteria work:
Time: you can set intervals at which batched changes are applied, including a minimum amount of time to wait between each application of batch changes and a maximum time to wait before declaring a timeout.
Volume: the system can be configured to force-apply a batch when the processing memory exceeds a certain threshold. This allows operations on the same row to be consolidated, reducing the number of operations on the target to a single transaction.
The other options do not align with the settings for Batch optimized apply mode in CDC tasks: A. Source connection processes is not a setting related to the batch apply mode; B. Number of changed records might affect the batch size but is not a setting that can be directly configured in this context; D. Maximum time to batch transactions relates to the time aspect but does not capture the full setting, which includes both time and volume considerations. Therefore, the verified answer is C. Time and/or volume, as it accurately represents the options that can be set for Batch optimized apply mode in the CDC tasks of Qlik Replicate.

7. How should missing metadata be added in a Qlik Replicate task after the task has been stopped?
A. Drop tables or delete tables and data on target side, then run task from a certain timestamp

B. Under Advanced Run option choose reload target, stop task again, and then resume processing
C. Under Advanced Run option choose metadata only, stop task again, and then resume processing
D. Drop tables and data on the target side, run advanced option, create metadata, and then resume task
Answer: C
Explanation: If a task has missing metadata, first stop the task, navigate to the Advanced Run options, select the Metadata Only option, and start the task with this setting to process the missing metadata. Stop the task again after the metadata is added, then resume normal task processing. This procedure ensures that only the metadata is processed without affecting the existing data on the target side, and it is the method recommended in the Qlik Replicate documentation for handling missing metadata issues. In more detail: select the task that requires metadata to be added, go to the Advanced Run options for the task, and choose the Metadata Only option, which has two sub-options:
Recreate all tables and then stop: rebuilds metadata for all available tables in the task.
Create missing tables and then stop: rebuilds metadata only for the missing tables or the tables that were newly added to the task.
By selecting the Metadata Only option and choosing to create missing tables, you ensure that the metadata for the newly added tables is updated without affecting the existing tables and data. After this operation, you can stop the task again and then resume processing. The other options are not the recommended methods for adding missing metadata: A and D suggest dropping tables or data, which is not necessary for simply adding metadata, and B suggests reloading the target, which is not the same as updating metadata only. Therefore, the verified answer is C, as it accurately describes the process of adding missing metadata to a Qlik Replicate task using the Advanced Run options.

8. When running a task in Qlik Replicate (from Oracle to MS SQL), the following error message appears:
Failed adding supplemental logging for table "Table name"
What must be done to fix this error?
A. Contact the Oracle DBA
B. Check the permission on the target endpoint
C. Enable supplemental logging
D. Check the permission of the source endpoint
Answer: C
Explanation: The error message "Failed adding supplemental logging for table 'Table name'" indicates that supplemental logging is not enabled on the Oracle source. Supplemental logging must be enabled for Qlik Replicate to capture the changes in the Oracle database accurately, especially for Change Data Capture (CDC) operations. To fix this error, enable supplemental logging on the Oracle source database. At the database level, execute the following SQL command:
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
This command enables minimal supplemental logging, which is required for Qlik Replicate to function correctly. If you need to enable supplemental logging for all columns, you can use:
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
This ensures that all necessary column data is logged for replication purposes. After enabling supplemental logging, verify that it is active by querying the v$database view:
SELECT supplemental_log_data_min FROM v$database;
The return value should be 'YES', indicating that supplemental logging is enabled. Then retry the replication task; this solution aligns with the troubleshooting steps provided in the Qlik Replicate documentation for supplemental logging errors. The other options are not directly related to the issue: A. Contact the Oracle DBA might be helpful, but the specific action needed is to enable supplemental logging; B. Check the permission on the target endpoint is unrelated to the supplemental logging requirement on the source database; D. Check the permission of the source endpoint concerns source permissions, which matter, but the error message specifically refers to the need for supplemental logging. Therefore, the verified answer is C. Enable supplemental logging, as it directly addresses the requirement to fix this error in Qlik Replicate.
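For convenience, the Oracle statements quoted in this explanation can be combined into a single sequence. This is only a minimal sketch of the steps described above; it assumes you are connected to the Oracle source database with an account privileged enough to run ALTER DATABASE (typically a DBA account).

-- Run on the Oracle source database with sufficient privileges.
-- Enable minimal supplemental logging at the database level:
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
-- Optionally enable supplemental logging for all columns as well:
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
-- Verify that supplemental logging is now active; this should return 'YES':
SELECT supplemental_log_data_min FROM v$database;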

9. Which is the minimum level of permissions required for a user to delete tasks?
A. Operator
B. Viewer
C. Designer
D. Admin
Answer: C
