Wednesday, May 6, 2020
Scheduling Manufacturing Operations
string(73) " important to understand the distinction between planning and execution\." ABSTRACT Without true finite capacity scheduling, any implementation for manufacturing execution, whether it is ERP, SCM, or MES, cannot realize the goal of enterprise efficiency and agility. All aspects of OM for manufacturing execution fall behind the lead of FCS, which is the bridge between planning and execution. Real tangible return on assets rests with FCS. We will write a custom essay sample on Scheduling Manufacturing Operations or any similar topic only for you Order Now INTRODUCTION Integrating a diverse collection of resources to accomplish a goal is an issue that has faced humankind since the first city arose and food and services needed to be provided to the populace. The modern challenge for operations management (OM) is the speed and volume that data is presented to OM systems. This explosion of data holds the promise of efficiency and agility unrealized in the past, but it forces the attention of analysts and engineers to convert the flood of data into a useable form to move from planning to action. All the systems such as MES, SCM, and ERP are information hungry beasts that must be fed with the right information at the right time to direct enterprise resources. OM requires a well-coordinated dispatch of its resources to realize efficiency and agility. This paper addresses the need to look at OM from an information-centric perspective as a necessary complement to emerging process-centric views. This discussion moves to the execution systems, also treated from an information-centric perspective, and concludes with a discussion as to why finite capacity scheduling (FCS) is the key to OM for manufacturing execution. WHEN DATA BECOMES INFORMATION Despite the advances in information technology, notably object-oriented software, systems continue to be defined by functional decomposition. Functional decomposition creates complex definitions with fragile coupling and cohesion that are on one side of a great chasm from the reality of the methods that are used to build modern information systems. Information itself is an under designed component of modern systems. Information is a series of objects made from atoms of data. Data becomes information only through context and inferences derived from context. A good example is the use of spreadsheets to attempt to understand data rather than the use of application software designed to with the operational context in mind. Figure 1: Hierarchy of Data Fusion Inferences Figure 1 shows the hierarchy of inferences through a process called data fusion. Data fusion simulates the cognitive processes used by humans to continuously integrate data from their senses to make inferences about the external world. Information systems collect data though sensors and other assets, and in the hierarchy of data processing, multiple data sources are combined to approximate or estimate the condition of some aspect of the enterprise operation. This is the first translation of data to a level of inference. Parametric data is processed to begin specific identification of a situation. As more parametric data are collected, different aspects of the situation come together to allow a contextual analysis of an increasingly complex set of conditions. Once integrated, the situation can be compared to the goals or desired state of the system. Parallel to the types of data processing are the types of inference. With raw data an inference can be made of the general condition. 
Parallel to the types of data processing are the types of inference. With raw data, an inference can be made about the general condition. While this level of inference rarely points to a specific corrective action, it does begin to isolate which subsystems require attention. The next level of inference reveals a specific characteristic behavior of the system. With more integrated data, the identity of an operational system or process is revealed. The next inference is the behavior of a process, which then leads to an assessment of a situation. At the highest levels of inference, performance is assessed to determine the deviation from the performance goals, acceptable risks, or desired state. Data fusion is not a new concept, having its origins in simple scouting, but it has come into its own since WWII. The use of data fusion systems as an information springboard for systems design places the execution aspects of OM firmly within a modern framework of information systems engineering.

WHAT WAS OLD IS NEW AGAIN

As mentioned in the introduction, operations management has been, and remains, one of the greatest organizational challenges throughout history. OM arises from the need to coordinate diverse resources to meet the needs of a complex system. The concept of the plan-execute-control model, a "discovery" made by analysts in the late 1990s, appears in the historical records of systems management, with one of the earliest mentions circa the 4th century BC in China. One of the more versatile models in modern systems management appeared in 1977 as the result of a joint effort between Dr. J. S. Lawson of the Naval Electronic Systems Command and Dr. Paul Moore of the Naval Postgraduate School. Figure 2 shows the Lawson-Moore model, adapted by the author for general resource management.

SENSE is the collection of raw data or other collateral information about the observed environment. PROCESS takes the data through the inference hierarchy, integrating it within the context of the tasks required of the managed resources. The situation, as well as it can be determined with the available resources, is then compared to the DESIRED STATE. The DESIRED STATE is the result of planning, which drives the allocation of resources to tasks. The plan exists in generalities, except for enterprises where goals are achieved with simple tasks assigned to few or uncomplicated resources. DECIDE is the point where the comparison of the situation to the goals dictates what corrective actions are needed to bring the performance of the enterprise in line with the plan. ACT is the direct management of resources to alter enterprise performance and close the gap between the current state and the DESIRED STATE. The Lawson-Moore model is a closed-loop execution model, continuously integrating data, making inferences about the environment, and managing resources to meet the goals of the plan. The model does not address planning, but it does unite planning and execution. To develop an execution system, it is important to understand the distinction between planning and execution.

Figure 2: Lawson-Moore Model (aka Lawson Model)
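The closed loop can be sketched in a few lines of code. The example below is a minimal, hypothetical Python illustration: the target rate, the simulated sensor, and the simple proportional correction are invented for the example and are not part of the Lawson-Moore model itself.

# Hypothetical sketch of the SENSE -> PROCESS -> compare -> DECIDE -> ACT loop.
# The managed resource is simulated; all numbers are illustrative.
import random

DESIRED_STATE = 100.0            # goal set by planning (e.g. units per hour)

def sense() -> list[float]:
    """Collect raw observations from the environment (simulated here)."""
    return [random.gauss(92.0, 3.0) for _ in range(5)]

def process(samples: list[float]) -> float:
    """Turn raw data into an estimate of the current situation."""
    return sum(samples) / len(samples)

def decide(situation: float) -> float:
    """Compare the situation to the desired state and choose a correction."""
    return DESIRED_STATE - situation  # positive means we are behind plan

def act(correction: float) -> None:
    """Direct the resources; here we only report the adjustment."""
    print(f"adjust output rate by {correction:+.1f}")

for _ in range(3):                    # in practice the loop runs continuously
    act(decide(process(sense())))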
PLANNING AND EXECUTION

Planning and execution are related, but they are not one and the same. Planning does not occur during execution; the plan should be formulated to allow for variations and alternate execution strategies. Business (or manufacturing or service) processes are set in place, serving as doctrine that unites actions within the enterprise. Processes should be compiled for all resource management and serve as a set of procedures designed to achieve the best results from a united enterprise, while allowing for inspired actions and initiatives. The enterprise doctrine exists so that laborious planning for each individual operation need not be repeated with every new plan. The more complex or unstable the composition of enterprise resources, the greater the need for standardized procedures. This becomes the foundation of repeatable performance, reducing human variation to the smallest contributor to performance variation. Planning cannot deviate greatly from doctrine, and execution will fail without doctrine. It is possible that execution will look so different from the plan that the uninitiated will see no similarity, but if the goals of the plan are achieved, then the execution is successful. The next section unites the inference model with the Lawson-Moore model to develop an information-centric execution model.

DATA FUSION AS OPERATIONS MANAGEMENT SYSTEM

Figure 3 shows the execution system that arises from merging the inference hierarchy with the Lawson-Moore model. Four main components exist in this system: information collection, the execution environment, the human-machine interface (HMI), and evaluation. Information collection includes sensors and all other information gathering, and is a critical component of the resources managed by the OM system. The HMI is the primary means by which operators interact with the OM system. Evaluation is the component that applies performance measurements and other measures of effectiveness to determine the degree to which the execution system is meeting the goals of the plan. The execution system performs the data fusion, situation definition, and resource management.

Figure 3: Data Fusion as Execution Environment

Data flows from sensors contained in resources through data filtering to begin building inferences. Filtered data enters three levels of information processing. Level 1 processing aligns data in time, ensures consistent units of measure, and accounts for any other physical aspects of the data. Data from different sources are aligned or correlated in order to develop meaningful inferences (e.g., the color of a box has little to do with its volume, but its height, length, and width have a direct bearing on computing it). The final function of Level 1 is identifying the situation for further processing in Levels 2 and 3. Level 2 assesses the situation within the context of the fusion process in use and the information available from Level 1; it may require algorithms to augment sparse or missing data. Level 3 evaluates the situation and may direct actions to modify the use of resources to minimize deviations from plan goals. The communication between the three processing levels is continuous, forming an information loop within the execution environment that adapts to changes in the external environment. Short-term and long-term (historical) databases form the decision support system for the OM system. Corrective action can be automatic or require operator intervention, as dictated by operating procedures.
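A rough sketch of what Level 1 processing involves is shown below. It is a hypothetical Python example: the sensors, units, and one-second alignment window are invented for illustration. The code normalizes temperature units and time-aligns readings from two sources before any higher-level inference is attempted.

# Hypothetical sketch of Level 1 fusion: align readings in time and normalize
# units so that the later levels can reason over consistent data.
# Each source is a time-sorted list of (timestamp_seconds, value) pairs.
from bisect import bisect_left

def to_celsius(value: float, unit: str) -> float:
    """Normalize temperature readings to a single unit of measure."""
    return (value - 32.0) * 5.0 / 9.0 if unit == "F" else value

def align(reference: list[tuple[float, float]],
          other: list[tuple[float, float]],
          window: float = 1.0) -> list[tuple[float, float, float]]:
    """Pair each reference (time, value) with the nearest reading from the
    other source within `window` seconds; unmatched readings are dropped."""
    times = [t for t, _ in other]
    paired = []
    for t_ref, v_ref in reference:
        i = bisect_left(times, t_ref)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(other)]
        if not candidates:
            continue
        j = min(candidates, key=lambda k: abs(times[k] - t_ref))
        if abs(times[j] - t_ref) <= window:
            paired.append((t_ref, v_ref, other[j][1]))
    return paired

# One source reports temperature in Fahrenheit, another reports flow in L/min.
temp_c = [(0.0, to_celsius(160.0, "F")), (2.0, to_celsius(165.0, "F"))]
flow = [(0.1, 12.4), (2.3, 12.9), (7.0, 13.1)]
print(align(temp_c, flow))  # pairs of (time, temp_C, flow) for t = 0.0 and t = 2.0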
THE COMMON DENOMINATOR

The integrated systems view of the enterprise is emerging as analysts focus on process-centric models and move away from product- and information-centric models. Evidence is the REPAC model from AMR, shown in Figure 4. Recognizing the shortcomings of the function-intense MES and SCOR models, AMR developed a model that focuses on business processes while supporting component assembly. Comparing that process-centric model with the information-centric model, common elements emerge. The main theme in REPAC COORDINATE is the need to schedule detailed activities from PLAN, using feedback from EXECUTE and ANALYZE. These are the same themes addressed by the Lawson-Moore model. In both models, the key element is the ability to manage resources at the level of individual operations to achieve the goals set by the plan. This level of resource management is achieved by dynamic capacitated scheduling, supported by real-time data from the environment and comparisons to the desired state established by the plan.

Figure 4: AMR REPAC Model

FCS: THE KEY TO OPERATIONS MANAGEMENT

Whether OM is approached from a process- or an information-centric model, finite capacity scheduling drives how resources are deployed to perform the tasks required to achieve the goals of the plan. The sequence of operations, the materials and labor required for the operations, and the output of the operations all require supporting resources to act in sync with the business of implementing the plan. Finite capacity scheduling that can account for multiple resource constraints and complex scheduling goals will be scalable enough to schedule both the lowest level of operation and the supporting resources. Planning is at best an approximation of resource needs, because planning cannot develop a precise quantification of the labor, material, or time required to meet the goals. Execution cannot begin until a set of actions, well matched to the available resources, is developed to load the operations and establish a timeline for the actions. Execution cannot continue unless the scheduling component can receive feedback from the resources and develop alternative sets of actions that best meet the goals of the plan. Only true finite capacity scheduling, designed for real-time use, can integrate planning and execution to meet the enterprise objectives.

CONCLUSION

For manufacturing OM to achieve the goals of efficiency and agility, all aspects of planning, execution, and control are necessary to create an effective system. The bridge from the plan to the actions of the organization is dynamic resource management. For an organization with any degree of complexity, procedures need to be in place to establish the general guidelines of operations. In this imperfect world, the plan and the procedures must be flexible enough to adapt. The control side provides data and accepts corrective action, but a dynamic element must exist in the OM system that can accept a situation assessment and respond rapidly to degrading performance. The planning side requires feedback from the OM layer to create future plans. The baseline provided by planning drives the selection of enterprise operations, but the synchronization of these operations, and the alternative actions needed when exceptions arise, comes from the power of true finite capacity scheduling. FCS is the means by which OM for manufacturing execution becomes a reality.
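As a closing illustration of what even a rudimentary capacitated scheduler does, the sketch below loads operations onto machines that can each run only one job at a time. It is a deliberately simplified, hypothetical Python example: the jobs, routings, and greedy dispatch rule are invented for illustration and fall far short of a real FCS engine.

# Hypothetical sketch of finite capacity scheduling: operations are loaded onto
# machines one at a time, respecting machine capacity and each job's routing.
# Real FCS engines handle many more constraints (labor, tooling, setups, ...).
from collections import defaultdict

# Each job is an ordered routing of (machine, processing_hours).
jobs = {
    "J1": [("lathe", 2.0), ("mill", 1.5)],
    "J2": [("mill", 3.0), ("lathe", 1.0)],
    "J3": [("lathe", 1.0), ("mill", 2.0)],
}

machine_free = defaultdict(float)  # when each machine next becomes available
job_ready = defaultdict(float)     # when each job's next operation may start
pending = {j: 0 for j in jobs}     # index of the next operation for each job
schedule = []                      # (job, machine, start_hour, end_hour)

def earliest_start(j: str) -> float:
    machine, _ = jobs[j][pending[j]]
    return max(job_ready[j], machine_free[machine])

# Greedy forward loading: repeatedly dispatch the operation that can start earliest.
while pending:
    j = min(pending, key=earliest_start)
    machine, hours = jobs[j][pending[j]]
    start = earliest_start(j)
    end = start + hours
    schedule.append((j, machine, start, end))
    machine_free[machine] = end    # the machine is occupied until this operation ends
    job_ready[j] = end             # the job's next operation cannot start earlier
    pending[j] += 1
    if pending[j] == len(jobs[j]):
        del pending[j]             # all operations for this job are scheduled

for row in schedule:
    print(row)

Even this toy dispatcher shows the property the paper argues for: the timeline is not stated in the plan but emerges from the interaction of routings and machine availability, and it can be rebuilt whenever feedback from the shop floor changes either one.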