What is the Impact of a BPMS on Job Shop Operations?

The keys to success in job shop operations are the ability to price work, rapid setup, resource availability, adherence to order specifications including delivery time, quality, accommodation of on-the-fly changes in priority, and overall throughput.

Few BPMSs have the ability to respond to these needs, because of the lead time typically needed to map and deploy process templates and the need to accommodate changes in process-step logic.

Clearly, what we need is the ability to map out a process for a new order in, say, half an hour as opposed to hours or days, without ending up with one process per order. One approach is to offer a menu of process fragments that a user can thread together.

Once we have a specific master template for a particular class of work, there will always be a need for ad hoc insertion of steps into the template, skipping of steps, and re-visiting of completed steps. All of this needs to be accommodated at run time.

It’s reasonable to say that, given the capability to examine steps making up completed orders, we can identify ways and means of auto-updating process fragments.

Suppose I have a process made up of three steps (i.e. 1-2-3). If an analysis of completed orders shows that, at step 2, we consistently see an ad hoc step called 2.1, this makes step 2 a likely antecedent of step 2.1, which in turn makes step 2.1 a candidate for inclusion in an upgraded process.
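The frequency analysis described above can be sketched in a few lines. This is a minimal illustration, not a real process-mining implementation: the event-log format, the sample orders, and the 75% "consistently" threshold are all assumptions.

```python
from collections import Counter

# Hypothetical event log: each completed order is the sequence of step IDs
# actually executed; "2.1" is the ad hoc step inserted after step 2.
completed_orders = [
    ["1", "2", "2.1", "3"],
    ["1", "2", "2.1", "3"],
    ["1", "2", "3"],
    ["1", "2", "2.1", "3"],
]

def followers_of(step, orders):
    """Count which step immediately follows `step` across all orders."""
    counts = Counter()
    for order in orders:
        for a, b in zip(order, order[1:]):
            if a == step:
                counts[b] += 1
    return counts

counts = followers_of("2", completed_orders)
total = sum(counts.values())
for step, n in counts.items():
    if n / total >= 0.75:  # assumed threshold for "consistently"
        print(f"step {step} follows step 2 in {n} of {total} orders; "
              "candidate for the template")
```

Here step 2.1 follows step 2 in three of four orders, so it crosses the threshold and gets flagged as a template candidate.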

If we similarly observe that step 3 consistently starts only following completion of ad hoc step 2.1, then we may be able to conclude that step 2.1 is a constraint on the start of step 3.
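Checking such a precedence constraint against completed orders is equally simple. Again a sketch with an assumed log format and hypothetical sample data:

```python
def always_precedes(a, b, orders):
    """True if, in every completed order containing both steps, step `a`
    appears before step `b` in the executed sequence."""
    relevant = [o for o in orders if a in o and b in o]
    return bool(relevant) and all(o.index(a) < o.index(b) for o in relevant)

# Hypothetical log in which ad hoc step 2.1 always finishes before step 3 starts.
orders = [
    ["1", "2", "2.1", "3"],
    ["1", "2", "2.1", "3"],
    ["2", "2.1", "1", "3"],
]

print(always_precedes("2.1", "3", orders))  # True: 2.1 gates the start of 3
print(always_precedes("3", "2.1", orders))  # False
```

In a real system one would want a confidence measure rather than a strict "always", since a single mis-logged order would otherwise hide the constraint.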

All of this could be accomplished by auto-analysis of run-time data, followed by auto-updating of process maps with appropriate color coding of process improvement candidate steps. The process owner would then make the appropriate decisions regarding updating the process and roll out a new version of that process.

Once an organization is managing its processes 360 (i.e. map, improve, deploy, monitor, analyze run-time data, re-map, re-deploy), we can look to the data to help improve throughput. It's well known that order processing time is the sum of the time to perform the steps, which varies with interrupt time, plus the time gaps between steps.
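The decomposition of order processing time can be made concrete with a toy calculation. The per-step figures below are invented for illustration:

```python
# Assumed per-step records for one order, in minutes:
# (hands-on work time, time lost to interrupts), plus gaps between steps.
steps = [(30, 5), (45, 20), (25, 0)]
gaps = [10, 40]  # wait/queue time between steps 1-2 and 2-3

step_time = sum(work + interrupt for work, interrupt in steps)
gap_time = sum(gaps)
order_processing_time = step_time + gap_time

print(f"steps: {step_time} min, gaps: {gap_time} min, "
      f"order total: {order_processing_time} min")
```

Even in this made-up example the gaps account for more than a quarter of the total, which is where throughput improvements usually hide.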

There is not a lot we can do about interrupt times when priorities shift on-the-fly, other than to understand that each time there is an interrupt at a step, re-engagement of work at the step requires a "settling-in" time (i.e. the "S" curve phenomenon).
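A crude way to model the cost of interrupts is a flat settling-in penalty per re-engagement. This is a simplification of the "S" curve idea, with an assumed penalty value:

```python
def effective_step_time(work_time, interrupts, settle_in=5.0):
    """Time to complete a step when each interrupt costs an extra
    settling-in period on re-engagement (settle_in is an assumed, flat
    stand-in for the 'S' curve ramp-up, in the same units as work_time)."""
    return work_time + interrupts * settle_in

print(effective_step_time(60, 0))  # uninterrupted
print(effective_step_time(60, 3))  # three re-engagements
```

Three interrupts turn a 60-minute step into a 75-minute step here, which is why levelling priorities matters even when total hands-on time is unchanged.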

My experience is that a three-tier approach to scheduling works well in job shop settings – software auto-schedules, workers micro-schedule, and supervisors level and balance the workload.

We call this approach RALB (Resource Allocation, Levelling and Balancing).
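The first and third tiers can be sketched as follows. To be clear, this is not Civerex's actual RALB algorithm – the resources, task data, dispatch rule, and balancing rule are all assumptions for illustration:

```python
from itertools import cycle

tasks = [("T1", 6), ("T2", 1), ("T3", 5), ("T4", 1), ("T5", 4)]  # (id, hours)
queues = {"alice": [], "bob": []}

def load(worker):
    return sum(hours for _, hours in queues[worker])

# Tier 1: software auto-schedules - simple round-robin dispatch on arrival.
for task, worker in zip(tasks, cycle(queues)):
    queues[worker].append(task)

# (Tier 2, worker micro-scheduling, would locally reorder each queue.)

# Tier 3: supervisor levels and balances - move the smallest task from the
# heaviest queue to the lightest while that strictly reduces the load spread.
while True:
    hi = max(queues, key=load)
    lo = min(queues, key=load)
    candidate = min(queues[hi], key=lambda t: t[1])
    if load(hi) - load(lo) <= candidate[1]:
        break
    queues[hi].remove(candidate)
    queues[lo].append(candidate)

for worker, queue in queues.items():
    print(worker, [t for t, _ in queue], load(worker))
```

Round-robin dispatch leaves one worker with 15 hours and the other with 2; the balancing pass narrows that to 11 and 6 and then stops, because any further move would only flip the imbalance rather than reduce it.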


Courtesy of Walter Keirstead. This blog is also available on http://kwkeirstead.wordpress.com/


By Karl Walter Keirstead @ Civerex | August 15, 2013
