Scheduling rules

Job scheduling rules can be used to control when your jobs run in relation to other jobs. In particular, scheduling rules allow you to prevent multiple jobs from running concurrently in situations where concurrency can lead to inconsistent results. They also allow you to guarantee the execution order of a series of jobs. The power of scheduling rules is best illustrated by an example. Let's start by defining two jobs that are used to turn a light switch on and off concurrently:

   import org.eclipse.core.runtime.IProgressMonitor;
   import org.eclipse.core.runtime.IStatus;
   import org.eclipse.core.runtime.Status;
   import org.eclipse.core.runtime.jobs.Job;

   public class LightSwitch {
      private boolean isOn = false;
      public boolean isOn() {
         return isOn;
      }
      public void on() {
         // schedule() queues the job; it runs asynchronously on a worker thread
         new LightOn().schedule();
      }
      public void off() {
         new LightOff().schedule();
      }
      class LightOn extends Job {
         public LightOn() {
            super("Turning on the light");
         }
         public IStatus run(IProgressMonitor monitor) {
            System.out.println("Turning the light on");
            isOn = true;
            return Status.OK_STATUS;
         }
      }
      class LightOff extends Job {
         public LightOff() {
            super("Turning off the light");
         }
         public IStatus run(IProgressMonitor monitor) {
            System.out.println("Turning the light off");
            isOn = false;
            return Status.OK_STATUS;
         }
      }
   }
Now we create a simple program that creates a light switch and turns it on and off again:
   LightSwitch light = new LightSwitch();
   light.on();
   light.off();
   System.out.println("The light is on? " + switch.isOn());
If we run this little program enough times, we will eventually obtain the following output:
   Turning the light off
   Turning the light on
   The light is on? true
How can that be? We told the light to turn on and then off, so its final state should be off! The problem is that nothing prevented the LightOff job from running at the same time as the LightOn job. Even though the LightOn job was scheduled first, there is no way to predict the order in which two concurrently running jobs will execute, and if the LightOff job happens to run first, we get this invalid result. What we need is a way to prevent the two jobs from running concurrently, and that's where scheduling rules come in.

We can fix this example by creating a simple scheduling rule that acts as a mutex (also known as a binary semaphore):

   import org.eclipse.core.runtime.jobs.ISchedulingRule;

   class MutexRule implements ISchedulingRule {
      // Two jobs conflict, and therefore cannot run concurrently, exactly
      // when they share the same rule instance.
      public boolean isConflicting(ISchedulingRule rule) {
         return rule == this;
      }
      // A rule must always contain itself; this simple mutex contains
      // nothing else.
      public boolean contains(ISchedulingRule rule) {
         return rule == this;
      }
   }
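A note on the two methods: isConflicting decides whether two jobs may run at the same time, while contains is consulted when rules are nested, for example when a job acquires a more specific rule while it already holds a broader one. A mutex only needs to contain itself. To illustrate the difference, here is a hypothetical hierarchical rule, a sketch of our own rather than anything in the platform API, in which rules are keyed by slash-separated paths:
   class PathRule implements ISchedulingRule {
      private final String path;
      PathRule(String path) {
         this.path = path;
      }
      // This rule contains another rule when the other rule's path lies
      // below our own, so holding "/a" lets a job nest a rule for "/a/b".
      // (A real implementation would compare path segments rather than
      // raw string prefixes.)
      public boolean contains(ISchedulingRule rule) {
         return rule instanceof PathRule
               && ((PathRule) rule).path.startsWith(path);
      }
      // Two path rules conflict when either path is a prefix of the other,
      // because the two jobs would touch overlapping parts of the tree.
      public boolean isConflicting(ISchedulingRule rule) {
         if (!(rule instanceof PathRule))
            return false;
         String other = ((PathRule) rule).path;
         return other.startsWith(path) || path.startsWith(other);
      }
   }
With such a rule, jobs working on disjoint paths such as "/a" and "/b" remain free to run concurrently, which a single mutex could not express.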
Our mutex rule is then added to the two light switch jobs from our previous example:
   public class LightSwitch {
      final MutexRule rule = new MutexRule();
      ...
      class LightOn extends Job {
         public LightOn() {
            super("Turning on the light");
            setRule(rule);
         }
         ...
      }
      class LightOff extends Job {
         public LightOff() {
            super("Turning off the light");
            setRule(rule);
         }
         ...
      }
   }

Now, when the two light switch jobs are scheduled, the job infrastructure calls the isConflicting method to compare the scheduling rules of the two jobs. It notices that the two jobs have conflicting rules, and therefore runs them one at a time, in the order in which they were scheduled. If you run the example program a million times, you will always get the same result:

   Turning the light on
   Turning the light off
   The light is on? false
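One caveat: schedule() returns immediately and the jobs themselves run asynchronously, so a real program should wait for the jobs to finish before reading the switch state. Here is a minimal sketch of such a wait, assuming a hypothetical variant of on() and off() that returns the Job it schedules:
   LightSwitch light = new LightSwitch();
   Job on = light.on();    // hypothetical variant of on() that returns the job
   Job off = light.off();  // likewise for off()
   try {
      // join() blocks the calling thread until the job has finished running
      on.join();
      off.join();
   } catch (InterruptedException e) {
      // the wait was interrupted; the jobs may not have finished
   }
   System.out.println("The light is on? " + light.isOn());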

[Patience! This document is a work in progress.]

Copyright IBM Corporation and others 2000, 2003.