Controlling Concurrent Execution of Scenarios and Load Plans

1. Default Behavior:

By default, there is no restriction on concurrent execution. This means two or more instances of the same scenario or load plan can run at the same time.

2. Possible Scenarios of Concurrent Execution:

Concurrent execution can occur in several situations, including:

  • Multiple Load Plans: Two or more instances of a load plan containing a Run Scenario Step could run at the same time, causing the same scenario to execute concurrently.
  • Running a Scenario in Different Ways: A scenario started from the command line, from ODI Studio, or by a schedule on an agent could begin while another instance of the same scenario is already running (on the same agent, on a different agent, or in an ODI Studio session), as sketched below.
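These launch paths are independent of one another, so nothing prevents two of them from firing at once. The following minimal Java sketch (plain JDK concurrency, not ODI code; the scenario name LOAD_SALES and the launcher labels are hypothetical) shows how a scheduled run and a manual run of the same scenario simply overlap when no restriction is in place:

import java.time.LocalTime;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Illustrative only: the same "scenario" is started by a schedule and, at the
// same time, launched manually. With no concurrency restriction (the default),
// both instances simply run in parallel. Names are hypothetical, not ODI APIs.
public class OverlappingLaunchDemo {

    static void runScenario(String launchedBy) {
        System.out.println(LocalTime.now() + " " + launchedBy + " started LOAD_SALES");
        try { Thread.sleep(2000); } catch (InterruptedException ignored) { }
        System.out.println(LocalTime.now() + " " + launchedBy + " finished LOAD_SALES");
    }

    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService agentSchedule = Executors.newSingleThreadScheduledExecutor();

        // The agent's schedule fires almost immediately...
        agentSchedule.schedule(() -> runScenario("scheduled run"), 100, TimeUnit.MILLISECONDS);

        // ...while an operator launches the same scenario interactively.
        Thread.sleep(200);
        runScenario("manual run");

        agentSchedule.shutdown();
        agentSchedule.awaitTermination(5, TimeUnit.SECONDS);
        // The output shows the two runs overlapping: both "started" lines
        // appear before either "finished" line.
    }
}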

3. Scope of Concurrent Execution:

Concurrent execution can happen across all agents, both remote agents and the internal agent. It is therefore not limited to the current session; it extends across the entire system.

4. Potential Issues with Concurrent Execution:

Concurrent execution of scenarios or load plans can be problematic, especially when:

  • The job writes data: if multiple instances write to the same targets simultaneously, the result can be data corruption, inconsistent results, or overwritten records, as illustrated in the sketch below.
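To see why this matters, here is a minimal Java sketch (illustrative only, not ODI internals; the instance labels are hypothetical) in which two instances of the same "scenario" load rows into a shared target. Without a lock the two batches typically interleave; serializing them with a lock keeps each batch contiguous:

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.locks.ReentrantLock;

// Illustrative only: simulates two instances of the same "scenario" writing
// to one target (here a shared list). Names are hypothetical, not ODI APIs.
public class ConcurrentWriteDemo {

    private static final List<String> targetTable = new ArrayList<>();
    private static final ReentrantLock scenarioLock = new ReentrantLock();

    // One "scenario instance" loading a batch of rows into the shared target.
    private static void runScenarioInstance(String instanceId, boolean limitConcurrency) {
        if (limitConcurrency) {
            scenarioLock.lock();              // only one instance writes at a time
        }
        try {
            for (int row = 1; row <= 5; row++) {
                synchronized (targetTable) {   // keep the list itself thread-safe
                    targetTable.add(instanceId + " wrote row " + row);
                }
                try { Thread.sleep(10); } catch (InterruptedException ignored) { }
            }
        } finally {
            if (limitConcurrency) {
                scenarioLock.unlock();
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        boolean limitConcurrency = false;     // flip to true to serialize the two runs

        Thread first  = new Thread(() -> runScenarioInstance("INSTANCE-1", limitConcurrency));
        Thread second = new Thread(() -> runScenarioInstance("INSTANCE-2", limitConcurrency));
        first.start();
        second.start();
        first.join();
        second.join();

        // With limitConcurrency = false the two batches typically interleave
        // in the target; with limitConcurrency = true each batch lands as one
        // contiguous block.
        targetTable.forEach(System.out::println);
    }
}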

5. Controlling Concurrent Execution:

To prevent unwanted concurrency, you can use the Concurrent Execution Control options. These options let you restrict execution of the same scenario or load plan to only one instance at a time.

6. How ODI Identifies Scenarios and Load Plans:

ODI identifies a scenario or load plan by its internal ID, not its name or version number. This means:

  • Even if a scenario or load plan is regenerated or modified (while retaining the same name and version), it will still have the same internal ID and be treated as the same entity.
  • If a scenario is deleted and a new one is generated (with the same name and version), it will be treated as a different scenario because it receives a new internal ID (see the sketch below).
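One way to picture this is to key the concurrency check on the internal ID. The sketch below (plain Java with hypothetical IDs, not the ODI implementation) keeps one lock per internal ID: a regenerated scenario reuses the same ID and therefore the same lock, while a deleted and re-created scenario gets a new ID and its own lock:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Semaphore;

// Illustrative only: shows why keying concurrency control on the internal ID
// (rather than name + version) matters. Not the ODI implementation.
public class InternalIdKeyDemo {

    // One permit per internal ID: at most one running instance of that scenario.
    private static final Map<Long, Semaphore> locksByInternalId = new ConcurrentHashMap<>();

    static Semaphore lockFor(long internalId) {
        return locksByInternalId.computeIfAbsent(internalId, id -> new Semaphore(1));
    }

    public static void main(String[] args) {
        long originalId = 101L;       // hypothetical internal ID of LOAD_SALES v001

        // Regenerating the scenario keeps the name, version AND internal ID:
        // both lookups resolve to the same lock, so it is the same entity.
        System.out.println(lockFor(originalId) == lockFor(101L));              // true

        // Deleting and re-generating LOAD_SALES v001 assigns a new internal ID:
        // it gets its own lock and is treated as a different scenario.
        long regeneratedId = 205L;    // hypothetical new internal ID
        System.out.println(lockFor(originalId) == lockFor(regeneratedId));     // false
    }
}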

Conclusion:

Managing concurrent execution is critical, especially for jobs that write data. You can control it through ODI's Concurrent Execution Control options to ensure that jobs run sequentially when necessary, preventing potential data issues.

Here’s a step-by-step breakdown of the implications when enabling or disabling Concurrent Execution Control for a scenario or load plan:

1. Enabling Concurrent Execution Control (from Disabled):

  • Existing running jobs: When you switch from disabled to enabled, jobs that are already running or queued continue as executing jobs; they are not retroactively subject to the new Concurrent Execution Control settings.
  • New job submissions: Jobs submitted after the change are processed under the new Concurrent Execution Control settings. In other words, the control options are evaluated for newly invoked jobs, but not for jobs that were already running or queued.

2. Disabling Concurrent Execution Control (from Enabled):

  • Jobs already submitted and waiting: Jobs submitted before the change, including those in a waiting state, still respect the original Concurrent Execution Control settings; they are handled according to the rules in place when they were submitted.
  • Restarted jobs: Jobs restarted after Concurrent Execution Control is disabled also follow the original settings and are counted as executing jobs.

3. Impact on New Job Submissions (when Concurrent Execution Control is Disabled):

  • New jobs after disabling: Jobs submitted while Concurrent Execution Control is disabled are not subject to any restriction and may start ahead of jobs that are still waiting.
  • Waiting jobs may be delayed: When a waiting job polls for its turn, it may find jobs that were submitted after the change already executing, which further delays jobs that were queued earlier.
  • Concurrent execution impact: Even after a waiting job eventually starts, it may run alongside uncontrolled jobs submitted after Concurrent Execution Control was disabled, meaning jobs may run concurrently in ways they otherwise would not have.

4. Summary of Key Points:

  • Switching from disabled to enabled: The new Concurrent Execution Control settings apply to new jobs, but running or queued jobs are not affected.
  • Switching from enabled to disabled: Jobs already submitted (including those waiting) retain the original settings, while new jobs may execute out of order, potentially delaying queued jobs and causing concurrent execution.

This step-by-step approach highlights how changes to the Concurrent Execution Control settings can affect job execution behavior in ODI.
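Put differently, each job effectively keeps the Concurrent Execution Control setting that was in force when it was submitted. The following minimal Java sketch (illustrative only; the job names and fields are hypothetical, not ODI code) models that "snapshot at submission time" behavior:

import java.util.ArrayList;
import java.util.List;

// Illustrative only: models the idea that each job keeps the Concurrent
// Execution Control setting that was in force when it was submitted, so
// flipping the option later affects only new submissions.
public class SettingAtSubmissionDemo {

    static final class Job {
        final String name;
        final boolean limitConcurrentExecutions;   // snapshot taken at submit time
        Job(String name, boolean limit) {
            this.name = name;
            this.limitConcurrentExecutions = limit;
        }
    }

    private static boolean limitConcurrentExecutions = false;   // current setting
    private static final List<Job> submittedJobs = new ArrayList<>();

    static void submit(String name) {
        submittedJobs.add(new Job(name, limitConcurrentExecutions));
    }

    public static void main(String[] args) {
        submit("JOB_A");                       // submitted while control is disabled
        limitConcurrentExecutions = true;      // enable Concurrent Execution Control
        submit("JOB_B");                       // submitted under the new setting

        for (Job job : submittedJobs) {
            System.out.println(job.name + " limited=" + job.limitConcurrentExecutions);
        }
        // JOB_A limited=false  -> keeps running uncontrolled
        // JOB_B limited=true   -> subject to the new control settings
    }
}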

Here are the steps to limit concurrent execution of a scenario or load plan in ODI:

1. Open the Scenario or Load Plan:

  • In the Designer or Operator Navigator, right-click the scenario or load plan you want to modify.
  • Select Open from the context menu.

2. Navigate to the Definition Tab:

  • Once the scenario or load plan is open, go to the Definition tab.

3. Modify Concurrent Execution Controller Options:

  • In the Definition tab, find the Concurrent Execution Control options.

3.1 Enable the Limit Concurrent Executions Option:

  • Select the Limit Concurrent Executions checkbox if you want to prevent multiple instances of this scenario or load plan from running at the same time.
  • If the Limit Concurrent Executions checkbox is left unchecked, no restriction is imposed, and more than one instance of the scenario or load plan can run concurrently.

3.2 Set the Violation Behavior:

  • If you enabled Limit Concurrent Executions, choose the Violation Behavior to determine how the system handles additional executions when an instance is already running.

  • Raise Execution Error:

    • If an instance is already running, trying to run another instance causes an execution error. A session is created but fails immediately, with an error message identifying the running session that caused the conflict (see the sketch after this list).

  • Wait to Execute:

    • If an instance is already running, additional executions are placed in a waiting status, and each waiting session polls for its turn to run.
    • The waiting session's status is updated periodically to show:
      • The currently running session.
      • Any concurrent sessions waiting to run after the current session completes.
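As a rough illustration of the Raise Execution Error option, the Java sketch below (plain JDK locking, not ODI code; the session IDs are hypothetical) fails a second session immediately and reports which session is already running:

import java.util.concurrent.locks.ReentrantLock;

// Illustrative only: mimics the "Raise Execution Error" behavior. If another
// instance already holds the lock, the new session fails right away with a
// message pointing at the running session. Names are hypothetical, not ODI APIs.
public class RaiseErrorBehaviorDemo {

    private static final ReentrantLock scenarioLock = new ReentrantLock();
    private static volatile String runningSessionId = null;

    static void startSession(String sessionId) {
        if (!scenarioLock.tryLock()) {
            // The session is created but immediately fails, as described above.
            throw new IllegalStateException("Session " + sessionId
                    + " failed: scenario already running in session " + runningSessionId);
        }
        try {
            runningSessionId = sessionId;
            System.out.println("Session " + sessionId + " is running");
            try { Thread.sleep(200); } catch (InterruptedException ignored) { }
        } finally {
            runningSessionId = null;
            scenarioLock.unlock();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread first = new Thread(() -> startSession("SESSION-1"));
        first.start();
        Thread.sleep(50);                       // let the first session acquire the lock
        try {
            startSession("SESSION-2");          // violates the limit
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage()); // error names the running session
        }
        first.join();
    }
}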

3.3 Set the Wait Polling Interval (if Wait to Execute is chosen):

  • If you choose Wait to Execute, specify the Wait Polling Interval: how often, in seconds, the system checks whether the running instance has completed.
  • If no interval is specified, the default agent value is used:
    • In ODI 12.1.3, the default polling interval is 30 seconds.
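The Wait to Execute option combined with the polling interval can be pictured as a retry loop. The sketch below (illustrative only, not the ODI agent; it uses a 2-second interval in place of the 30-second agent default mentioned above, and hypothetical session IDs) keeps the second session in a waiting state and re-checks the lock at each polling interval:

import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

// Illustrative only: mimics "Wait to Execute" with a wait polling interval.
// The session stays in a waiting state and periodically re-checks whether
// the running instance has finished. Not the ODI implementation.
public class WaitToExecuteDemo {

    private static final ReentrantLock scenarioLock = new ReentrantLock();

    static void startSession(String sessionId, long pollingIntervalSeconds)
            throws InterruptedException {
        // Poll until the lock becomes free, reporting the waiting status each time.
        while (!scenarioLock.tryLock()) {
            System.out.println(sessionId + " is waiting; retry in "
                    + pollingIntervalSeconds + "s");
            TimeUnit.SECONDS.sleep(pollingIntervalSeconds);
        }
        try {
            System.out.println(sessionId + " is running");
            TimeUnit.SECONDS.sleep(3);          // simulated scenario work
        } finally {
            scenarioLock.unlock();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread first = new Thread(() -> {
            try { startSession("SESSION-1", 2); } catch (InterruptedException ignored) { }
        });
        first.start();
        Thread.sleep(500);                      // ensure SESSION-1 starts first
        startSession("SESSION-2", 2);           // waits, polling every 2 seconds
        first.join();
    }
}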

4. Save Your Changes:

  • Once you’ve made the necessary changes, click Save to apply the new settings to the scenario or load plan.

Summary of Options:

  • Limit Concurrent Executions: Enable to prevent multiple instances from running simultaneously.
  • Violation Behavior: Choose how to handle violations (Raise Error or Wait to Execute).
  • Wait Polling Interval: Specify how often to check if the running instance has finished (only applicable if "Wait to Execute" is selected).

These steps will help you manage concurrent executions in ODI, ensuring that scenarios or load plans run in a controlled manner.

 
