Contact TalentLaunch Support: 1 (216) 750-8901

Refreshing PowerBI Model Data

Changing the date range and running the data factory

  1. Navigate to the Process Control page in the portal under Admin -> Process Control: https://portal.thetlnetwork.com/ProcessControl.aspx?linkId=1a79d083-b52e-4250-b631-9eb008e36017

  2. Each source system or module has five attributes: Process Flag, Start Date, End Date, Rolling Days, and Rolling Days Override. They are defined below:

    1. Process Flag: 1 = run this module, 0 = don’t run this module

    2. Start Date: earliest date to pull from date-ranged table extracts for this module

    3. End Date: latest date to pull from date-ranged table extracts for this module

    4. Rolling Days: default number of days to go back each time the job runs. If Rolling Days Override = N, the Start Date and End Date are set from this value at the beginning of each run

    5. Rolling Days Override: Y = use the explicit range in the Start Date and End Date fields for the next job run; N = set the default Start Date and End Date from Rolling Days for the next job run

  3. Make sure to click “Save Changes” if you edit anything on the Process Control page

  4. Run the Data Factory

    1. If you do not run the Data Factory manually, your date range changes will take effect during the next scheduled run

      1. As of this writing, the data factory runs every 30 minutes daily between 6 am and 7:30 pm EST, and once at midnight

    2. If you want to run the Data Factory manually, go to https://adf.azure.com/datafactories -> TL-AZ-PRD-RPT01-DataFatory

    3. The main pipeline is called V360_MasterPipeline. To run it manually (i.e. “trigger” it), click the pipeline name, then go to Trigger -> Trigger Now

      1. It will prompt you for the pipeline parameters; leave the default values

  5. Make sure the pipeline is running

    1. Click the orange gauge on the left side of the screen to see all historical and current pipeline runs

    2. Choose a date range and make sure that the current point in time is included. Also, make sure there are no unintended filters on the pipeline name or annotations that would prevent the pipeline you just kicked off from showing

    3. Confirm that “V360_MasterPipeline” shows an “In Progress” status. Because the master pipeline kicks off several child pipelines as it runs, other pipeline names will appear above it and may also be in progress or completed.

    4. The last step of the master pipeline is to refresh the Azure Analysis Services model. As of this writing, it will refresh the entire model since there are no partitions, so any historical changes that came through in SQL will be updated in the model
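The Rolling Days behavior described in step 2 can be sketched in Python. This is a minimal illustration of the date-window logic as described above, assuming the window ends on the run date; the function and parameter names are hypothetical, not the actual job code.

```python
from datetime import date, timedelta

def compute_window(rolling_days, override, start_date, end_date, today=None):
    """Return the (start, end) extract window for the next job run.

    If Rolling Days Override = "Y", the explicit Start Date / End Date
    from the Process Control page are used as-is. If it is "N", the
    window is reset at the beginning of the run to the last
    `rolling_days` days, ending on the run date. (Hypothetical sketch
    of the behavior described above, not the actual job code.)
    """
    if today is None:
        today = date.today()
    if override == "Y":
        return start_date, end_date
    return today - timedelta(days=rolling_days), today

# Example: default behavior (Override = N) with a 7-day rolling window
start, end = compute_window(7, "N", None, None, today=date(2024, 1, 10))
```

With Override = N, whatever was last saved in Start Date and End Date is ignored and overwritten at the start of the run, which is why step 2.5 matters when you want a one-off historical reload.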
