Verified 70-776 Dumps Questions 2019

70-776 Royal Pack Testengine pdf

100% Actual & Verified — 100% PASS

Unlimited access to the world's largest Dumps library! Try it Free Today!

https://www.exambible.com/70-776-exam/

Product Description:
Exam Number/Code: 70-776
Exam name: Perform Big Data Engineering on Microsoft Cloud Services (beta)
n questions with full explanations
Certification: Microsoft Certification
Last updated: synchronized continuously

Free Certification Real IT 70-776 Exam pdf Collection

We provide 70-776 free practice questions in two formats: a downloadable PDF and practice tests. The PDF format can be read and printed, so you can practice as many times as you like. With our 70-776 dumps product and material, you can pass the 70-776 exam quickly and easily.

Online Microsoft 70-776 free dumps demo Below:

NEW QUESTION 1
You have sensor devices that report data to Microsoft Azure Stream Analytics. Each sensor reports data several times per second.
You need to create a live dashboard in Microsoft Power BI that shows the performance of the sensor devices. The solution must minimize lag when visualizing the data.
Which function should you use for the time-series data element?

  • A. LAG
  • B. SlidingWindow
  • C. System.TimeStamp
  • D. TumblingWindow

Answer: D
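A tumbling window groups events into fixed, non-overlapping intervals, which keeps the Power BI output small and timely. A minimal Stream Analytics query along these lines, where the input, output, and column names (SensorInput, PowerBIOutput, SensorId, Reading, EventTime) are assumed for illustration:

```sql
-- Average each sensor's readings over non-overlapping 10-second windows
-- and emit one row per sensor per window to the Power BI output.
SELECT
    SensorId,
    AVG(Reading) AS AvgReading,
    System.Timestamp() AS WindowEnd
INTO PowerBIOutput
FROM SensorInput TIMESTAMP BY EventTime
GROUP BY SensorId, TumblingWindow(second, 10)
```

System.Timestamp() here returns the window end time, which is what the dashboard plots on its time axis.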

NEW QUESTION 2
DRAG DROP
You need to create a dataset in Microsoft Azure Data Factory that meets the following requirements: How should you complete the JSON code? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
[Exhibit omitted]

    Answer:

    Explanation:
    References:
    https://github.com/aelij/azure-content/blob/master/articles/data-factory/data-factory-create-pipelines.md

    NEW QUESTION 3
    You have a fact table named PowerUsage that has 10 billion rows. PowerUsage contains data about customer power usage during the last 12 months. The usage data is collected every minute. PowerUsage contains the columns configured as shown in the following table.
    [Exhibit omitted]
    LocationNumber has a default value of 1. The MinuteOfMonth column contains the relative minute within each month. The value resets at the beginning of each month.
    A sample of the fact table data is shown in the following table.
    [Exhibit omitted]
    There is a related table named Customer that joins to the PowerUsage table on the CustomerId column. Sixty percent of the rows in PowerUsage are associated to less than 10 percent of the rows in Customer. Most queries do not require the use of the Customer table. Many queries select on a specific month.
    You need to minimize how long it takes to find the records for a specific month. What should you do?

    • A. Implement partitioning by using the MonthKey column. Implement hash distribution by using the CustomerId column.
    • B. Implement partitioning by using the CustomerId column. Implement hash distribution by using the MonthKey column.
    • C. Implement partitioning by using the MonthKey column. Implement hash distribution by using the MeasurementId column.
    • D. Implement partitioning by using the MinuteOfMonth column. Implement hash distribution by using the MeasurementId column.

    Answer: C
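Partitioning on MonthKey lets month-filtered queries eliminate all other partitions, while hashing on the high-cardinality MeasurementId spreads rows evenly across distributions. An illustrative DDL sketch; the column types, the Usage measure column, and the partition boundary values are assumptions, not from the question:

```sql
-- Partition by MonthKey for month-level partition elimination;
-- hash-distribute on MeasurementId to avoid the skew that the
-- CustomerId column would cause (60% of rows map to <10% of customers).
CREATE TABLE dbo.PowerUsage
(
    MeasurementId  BIGINT NOT NULL,
    CustomerId     INT    NOT NULL,
    MonthKey       INT    NOT NULL,
    MinuteOfMonth  INT    NOT NULL,
    LocationNumber INT    NOT NULL,
    Usage          DECIMAL(18, 4)
)
WITH
(
    DISTRIBUTION = HASH(MeasurementId),
    PARTITION (MonthKey RANGE RIGHT FOR VALUES
        (201802, 201803, 201804, 201805, 201806, 201807,
         201808, 201809, 201810, 201811, 201812))
);
```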

    NEW QUESTION 4
    You have a Microsoft Azure SQL data warehouse that has a fact table named FactOrder. FactOrder contains three columns named CustomerId, OrderId, and OrderDateKey. FactOrder is hash distributed on CustomerId. OrderId is the unique identifier for FactOrder. FactOrder contains 3 million rows.
    Orders are distributed evenly among different customers from a table named dimCustomers that contains 2 million rows.
    You often run queries that join FactOrder and dimCustomers by selecting and grouping by the OrderDateKey column.
    You add 7 million rows to FactOrder. Most of the new records have a more recent OrderDateKey value than the previous records.
    You need to reduce the execution time of queries that group on OrderDateKey and that join dimCustomers and FactOrder.
    What should you do?

    • A. Change the distribution for the FactOrder table to round robin.
    • B. Update the statistics for the OrderDateKey column.
    • C. Change the distribution for the FactOrder table to be based on OrderId.
    • D. Change the distribution for the dimCustomers table to OrderDateKey.

    Answer: B

    Explanation:
    References:
    https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-tables-statistics
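After a large load skews the value distribution of a column, refreshing its statistics lets the optimizer plan joins and aggregations accurately. A minimal sketch; the statistics object name stat_OrderDateKey is assumed:

```sql
-- Create single-column statistics on OrderDateKey if none exist yet:
CREATE STATISTICS stat_OrderDateKey ON dbo.FactOrder (OrderDateKey);

-- After loading the 7 million new rows, refresh them so the optimizer
-- sees the more recent OrderDateKey value range:
UPDATE STATISTICS dbo.FactOrder (stat_OrderDateKey);
```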

    NEW QUESTION 5
    DRAG DROP
    You are designing a Microsoft Azure analytics solution. The solution requires that data be copied from Azure Blob storage to Azure Data Lake Store.
    The data will be copied on a recurring schedule. Occasionally, the data will be copied manually. You need to recommend a solution to copy the data.
    Which tools should you include in the recommendation? To answer, drag the appropriate tools to the correct requirements. Each tool may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
    NOTE: Each correct selection is worth one point.
    [Exhibit omitted]

      Answer:

      Explanation: [Exhibit omitted]

      NEW QUESTION 6
      You have a Microsoft Azure Data Lake Store and an Azure Active Directory tenant.
      You are developing an application that will access the Data Lake Store by using end-user credentials. You need to ensure that the application uses end-user authentication to access the Data Lake Store. What should you create?

      • A. a Native Active Directory app registration
      • B. a policy assignment that uses the Allowed resource types policy definition
      • C. a Web app/API Active Directory app registration
      • D. a policy assignment that uses the Allowed locations policy definition

      Answer: A

      Explanation:
      References:
      https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-end-user-authenticate-using-active-directory

      NEW QUESTION 7
      Note: This question is part of a series of questions that present the same scenario. Each question in
      the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
      After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
      You are monitoring user queries to a Microsoft Azure SQL data warehouse that has six compute nodes.
      You discover that compute node utilization is uneven. The rows_processed column from sys.dm_pdw_dms_workers shows a significant variation in the number of rows being moved among the distributions for the same table for the same query.
      You need to ensure that the load is distributed evenly across the compute nodes. Solution: You add a nonclustered columnstore index.
      Does this meet the goal?

      • A. Yes
      • B. No

      Answer: B

      NEW QUESTION 8
      You plan to use Microsoft Azure Event Hubs to ingest sensor data. You plan to use Azure Stream Analytics to analyze the data in real time and to send the output directly to Azure Data Lake Store.
      You need to write events to the Data Lake Store in batches. What should you use?

      • A. Apache Storm in Azure HDInsight
      • B. Stream Analytics
      • C. Microsoft SQL Server Integration Services (SSIS)
      • D. the Azure CLI

      Answer: B

      Explanation:
      References:
      https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-data-scenarios

      NEW QUESTION 9
      You have a Microsoft Azure Data Lake Analytics service.
      You need to store a list of multiple-character string values in a single column and to use a CROSS APPLY EXPLODE expression to output the values.
      Which type of data should you use in a U-SQL query?

      • A. SQL.MAP
      • B. SQL.ARRAY
      • C. string
      • D. byte [ ]

      Answer: B
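SQL.ARRAY<string> holds the list in one column, and CROSS APPLY EXPLODE fans each element out to its own row. A brief U-SQL sketch; the @rows rowset and its semicolon-delimited Tags column are assumed for illustration:

```sql
// Pack the delimited string into a SQL.ARRAY<string> column.
@input =
    SELECT new SQL.ARRAY<string>(Tags.Split(';')) AS TagArray
    FROM @rows;

// Flatten each array element onto its own output row.
@flattened =
    SELECT Tag
    FROM @input
         CROSS APPLY EXPLODE(TagArray) AS T(Tag);
```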

      NEW QUESTION 10
      Note: This question is part of a series of questions that use the same scenario. For your convenience, the scenario is repeated in each question. Each question presents a different goal and answer choices, but the text of the scenario is exactly the same in each question in this series.
      Start of repeated scenario
      You are migrating an existing on-premises data warehouse named LocalDW to Microsoft Azure. You will use an Azure SQL data warehouse named AzureDW for data storage and an Azure Data Factory named AzureDF for extract, transform, and load (ETL) functions.
      For each table in LocalDW, you create a table in AzureDW.

      • A. a dataset
      • B. a gateway
      • C. a pipeline
      • D. a linked service

      Answer: A

      NEW QUESTION 11
      DRAG DROP
      You have a Microsoft Azure SQL data warehouse.
      You plan to reference data from Azure Blob storage. The data is stored in the GZIP compressed format. The blob storage requires authentication.
      You create a master key for the data warehouse and a database schema.
      You need to reference the data without importing the data to the data warehouse.
      Which four statements should you execute in sequence? To answer, move the appropriate statements from the list of statements to the answer area and arrange them in the correct order.
       [Exhibit omitted]

        Answer:

        Explanation: [Exhibit omitted]
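With the master key and schema already in place, the usual PolyBase sequence is: scoped credential, external data source, external file format with GZIP compression, then the external table. A sketch in which every object name, the location, and the column list are placeholders:

```sql
-- 1. Credential for the authenticated blob storage account.
CREATE DATABASE SCOPED CREDENTIAL BlobCredential
WITH IDENTITY = 'user', SECRET = '<storage-account-key>';

-- 2. External data source pointing at the container.
CREATE EXTERNAL DATA SOURCE AzureBlob
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://container@account.blob.core.windows.net',
      CREDENTIAL = BlobCredential);

-- 3. File format declaring the GZIP codec.
CREATE EXTERNAL FILE FORMAT GzipCsv
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ','),
      DATA_COMPRESSION = 'org.apache.hadoop.io.compress.GzipCodec');

-- 4. External table: the data stays in blob storage, nothing is imported.
CREATE EXTERNAL TABLE dbo.ExternalSales
(Id INT, Amount DECIMAL(18, 2))
WITH (LOCATION = '/sales/',
      DATA_SOURCE = AzureBlob,
      FILE_FORMAT = GzipCsv);
```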

        NEW QUESTION 12
        HOTSPOT
        You have a Microsoft Azure Data Lake Analytics service.
        You have a tab-delimited file named UserActivity.tsv that contains logs of user sessions. The file does not have a header row.
        You need to create a table and to load the logs to the table. The solution must distribute the data by a column named SessionId.
        How should you complete the U-SQL statement? To answer, select the appropriate options in the answer area.
        NOTE: Each correct selection is worth one point.
        [Exhibit omitted]

          Answer:

          Explanation:
          References:
          https://msdn.microsoft.com/en-us/library/mt706197.aspx
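One way the completed statement might read; the non-key columns (UserId, EventTime) are assumed, since the exhibit with the real schema is not reproduced here:

```sql
// Distribute the managed table by SessionId as required.
CREATE TABLE dbo.UserActivity
(
    SessionId string,
    UserId    string,
    EventTime DateTime,
    INDEX idx_UserActivity CLUSTERED (SessionId ASC)
    DISTRIBUTED BY HASH (SessionId)
);

// The file has no header row, so the default Tsv extractor works as is.
@log =
    EXTRACT SessionId string,
            UserId    string,
            EventTime DateTime
    FROM "/logs/UserActivity.tsv"
    USING Extractors.Tsv();

INSERT INTO dbo.UserActivity
SELECT * FROM @log;
```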

          NEW QUESTION 13
          You have a Microsoft Azure Stream Analytics job.
          You are debugging event information manually.
          You need to view the event data that is being collected.
          Which monitoring data should you view for the Stream Analytics job?

          • A. query
          • B. outputs
          • C. scale
          • D. inputs

          Answer: D

          NEW QUESTION 14
          You plan to deploy a Microsoft Azure Stream Analytics job to filter multiple input streams from IoT devices that have a total data flow of 30 MB/s.
          You need to calculate how many streaming units you require for the job. The solution must prevent lag.
          What is the minimum number of streaming units required?

          • A. 3
          • B. 10
          • C. 30
          • D. 300

           Answer: C

           Explanation:
           Stream Analytics capacity guidance at the time sized roughly one streaming unit per MB/s of ingress, so a 30-MB/s flow requires a minimum of about 30 streaming units to avoid lag.

          NEW QUESTION 15
          You have a Microsoft Azure Data Lake Analytics service. You plan to configure diagnostic logging.
          You need to use Microsoft Operations Management Suite (OMS) to monitor the IP addresses that are used to access the Data Lake Store.
          What should you do?

          • A. Stream the request logs to an event hub.
          • B. Send the audit logs to Log Analytics.
          • C. Send the request logs to Log Analytics.
          • D. Stream the audit logs to an event hub.

          Answer: B

          Explanation:
          References:
          https://docs.microsoft.com/en-us/azure/data-lake-analytics/data-lake-analytics-diagnostic-logs https://docs.microsoft.com/en-us/azure/security/azure-log-audit

          NEW QUESTION 16
          DRAG DROP
           You have a Microsoft Azure SQL data warehouse named DW1. Data is loaded to DW1 once daily at 01:00.
          A user accidentally deletes data from a fact table in DW1 at 09:00.
          You need to recover the lost data. The solution must prevent the need to change any connection strings and must minimize downtime.
          Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
           [Exhibit omitted]

            Answer:

             Explanation: [Exhibit omitted]

            NEW QUESTION 17
            You ingest data into a Microsoft Azure event hub.
            You need to export the data from the event hub to Azure Storage and to prepare the data for batch processing tasks in Azure Data Lake Analytics.
            Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

            • A. Run the Avro extractor from a U-SQL script.
            • B. Create an Azure Storage account.
            • C. Add a shared access policy.
            • D. Enable Event Hubs Archive.
            • E. Run the CSV extractor from a U-SQL script.

            Answer: BD

            Recommend!! Get the Full 70-776 dumps in VCE and PDF From 2passeasy, Welcome to Download: https://www.2passeasy.com/dumps/70-776/ (New 91 Q&As Version)