ICAOR’10 ABSTRACTS

2ND INTERNATIONAL CONFERENCE ON APPLIED OPERATIONAL RESEARCH

25-27 AUGUST 2010, TURKU, FINLAND


OPTIMIZATION OF DAILY SCHEDULING FOR EXTRAMURAL HEALTH CARE SERVICES

Andrea Trautsamwieser and Patrick Hirsch

Institute of Production and Logistics University of Natural Resources and Applied Life Sciences Vienna, Austria

Abstract. As demographic trends show, the demand for home health care services will rise in the future. Currently, the routing of the nurses is performed manually by the main service providers in Austria. This leads to a time-consuming process with a presumably suboptimal outcome. This paper presents a model formulation and a metaheuristic solution approach, based on Variable Neighbourhood Search, for optimizing the daily scheduling of the nurses. The objective is to minimize the time the nurses spend travelling, including driving and waiting. Additionally, the dissatisfaction level of nurses and clients is minimized. A feasible solution has to observe working time regulations, hard time windows, mandatory breaks, and a feasible assignment of nurses to clients. The proposed method finds the globally optimal solutions for small problem instances. Extensive numerical studies show that the algorithm is capable of solving real-life instances with up to 512 home visits and 75 nurses.
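The Variable Neighbourhood Search scheme underlying the solution approach — shaking the incumbent in increasingly large neighbourhoods and then applying local search — can be sketched on a toy travel-time minimisation problem. The swap neighbourhoods and the instance data below are illustrative only and do not reflect the paper's constraint handling (time windows, breaks, nurse qualifications).

```python
import random

def route_time(route, t):
    """Total travel time of a round trip over the visits in `route`,
    starting and ending at the depot (node 0)."""
    stops = [0] + list(route) + [0]
    return sum(t[a][b] for a, b in zip(stops, stops[1:]))

def shake(route, k, rng):
    """k random pairwise swaps -- the k-th neighbourhood."""
    r = list(route)
    for _ in range(k):
        i, j = rng.sample(range(len(r)), 2)
        r[i], r[j] = r[j], r[i]
    return r

def local_search(route, t):
    """First-improvement pairwise-swap descent."""
    r, cost = list(route), route_time(route, t)
    improved = True
    while improved:
        improved = False
        for i in range(len(r)):
            for j in range(i + 1, len(r)):
                r[i], r[j] = r[j], r[i]
                c = route_time(r, t)
                if c < cost:
                    cost, improved = c, True
                else:
                    r[i], r[j] = r[j], r[i]   # undo non-improving swap
    return r

def vns(route, t, k_max=3, iters=50, seed=0):
    """Basic VNS: shake in neighbourhood k, descend, recentre on improvement."""
    rng = random.Random(seed)
    best = local_search(route, t)
    for _ in range(iters):
        k = 1
        while k <= k_max:
            cand = local_search(shake(best, k, rng), t)
            if route_time(cand, t) < route_time(best, t):
                best, k = cand, 1   # success: restart from first neighbourhood
            else:
                k += 1
    return best
```

On a symmetric five-node travel-time matrix, `vns([1, 2, 3, 4], t)` returns a visiting order whose round-trip time is no worse than the initial order's.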


AUTOMATED MULTI-SKILL SHIFT DESIGN FOR HOSPITALS

Eivind J. Nilssen, Martin Stølevik, Erik Lien Johnsen and Tomas Eric Nordlander

SINTEF ICT, Department of Applied Mathematics Oslo, Norway

Abstract. For labour-intensive organizations, finding a good match between the predicted workload and the work capacity of the scheduled workforce is crucial; one important step in this matching process is shift design. In a multi-skill shift design problem, the model must reflect the skill attributes of the employees and the time-dependent demand for each skill type. For hospitals, a number of constraints and objectives complicate the picture, and in this paper we introduce models which reflect many of the challenges faced by planners at two reference hospitals in Norway. Experiments using mixed-integer programming solvers show promising results, and optimal solutions are in some cases found within seconds or a few minutes.


A TWO-PHASE APPROACH FOR THE OPERATING ROOM PLANNING AND SCHEDULING PROBLEM

John Fowler and Qing Li

Arizona State University Tempe, AZ, USA

Abstract. Hospitals nowadays face increasing pressure for efficient resource usage while providing timely patient care. Planning and scheduling the Operating Room (OR), which is one of the largest cost units, is thus important and has attracted the attention of many researchers. In this paper, we develop a two-phase approach consisting of cyclic block scheduling of the OR (Phase 1) and day-to-day patient scheduling (Phase 2). Each phase is formulated as a mixed integer program and exact solutions are obtained using real data. A random keys genetic algorithm is used in the second phase and compared with the optimal solutions.


FIVE SECTIONS ANALYSIS: THE OR/MS PROCESS

Heiner Müller-Merbach

Technische Universität Kaiserslautern Wirtschaftsinformatik und Operations Research Germany

Abstract. A five-section analysis is suggested for the OR/MS process. The author’s first (German) publication on this topic dates back to 1987 (Müller-Merbach, 1987), the first English publication to 2010 (Müller-Merbach, 2010). In this paper, OR/MS shall be understood according to an official definition of OR by the Operations Research Society of America (ORSA, 1977), i.e. prior to the merger of ORSA and TIMS into INFORMS on January 1, 1995: “Operations Research is concerned with scientifically deciding how to best design and operate man-machine systems, usually under conditions requiring the allocation of scarce resources.” In this definition: (i) “mathematics” is not explicitly mentioned; (ii) “scientifically deciding” is to be interpreted as “interdisciplinary”, i.e. including all the relevant knowledge available; (iii) “how to best design and operate” means aiming at optimality of the structures and the processes; (iv) “man-machine systems” are combined social and technical systems, such as enterprises, political states/nations, universities, hospitals, political parties etc. and their sections.


METAHEURISTICS FOR THE WASTE COLLECTION VEHICLE ROUTING PROBLEM WITH TIME WINDOWS

AM Benjamin and JE Beasley

Brunel University Uxbridge, UK

Abstract. In this problem there is a set of customers whose waste is collected by vehicles. Vehicles can visit waste disposal facilities during their working day to empty the collected waste and hence continue collecting from customers. The vehicles start and end their routes empty at a single depot. We take into consideration time windows associated with customers, disposal facilities and the depot, as well as a driver rest period. The problem is solved using a number of metaheuristic algorithms, namely tabu search (TS) and variable neighbourhood search (VNS). Moreover, we also present a combined metaheuristic algorithm based on variable neighbourhood tabu search (VNTS), where the variable neighbourhoods are searched via tabu search. Computational experiments on ten publicly available waste collection benchmark problems involving up to 2092 customers and 19 waste disposal facilities indicate that the proposed algorithms find better-quality solutions than previous work presented in the literature within reasonable computation times.


THE SOFTWARE SUPPORT FOR MULTIPLE-CRITERIA EVALUATION – VARIOUS TYPES OF PARTIAL EVALUATIONS AGGREGATION

Pavel Holeček and Jana Talašová

Palacký University Olomouc Olomouc, Czech Republic

Abstract. This paper presents the software product FuzzME, developed as a tool for creating fuzzy models of multiple-criteria evaluation and decision making. The partial evaluations with respect to the criteria express the (fuzzy) degrees of fulfilment of the corresponding goals. FuzzME allows the utilization of several aggregation methods – fuzzy weighted average, fuzzy OWA operator, fuzzified WOWA operator, fuzzified discrete Choquet integral, and fuzzy expert system. In this paper, all these methods are described and the conditions for their use are studied. The paper also describes an example from the area of banking, showing how different types of criteria interactions can be modelled in FuzzME.
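Two of the aggregation methods named above have simple crisp cores, sketched below (illustrative Python, not FuzzME code): the weighted average attaches weights to criteria, while the OWA operator attaches weights to the rank positions of the sorted partial evaluations. The fuzzy versions in FuzzME extend these to fuzzy numbers.

```python
def weighted_average(scores, weights):
    """Crisp core of the fuzzy weighted average: sum of w_i * x_i,
    with weights normalised to one."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * x for w, x in zip(weights, scores))

def owa(scores, weights):
    """OWA operator: the i-th weight multiplies the i-th largest score,
    so weights attach to ranks, not to criteria."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * x for w, x in zip(weights, sorted(scores, reverse=True)))
```

With weights (0.5, 0.3, 0.2) the two operators generally disagree on the same partial evaluations, which is exactly the point: OWA can model optimistic or pessimistic attitudes independent of which criterion delivered which score.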


CREDIBILITY MEASURES IN PORTFOLIO ANALYSIS

Irina Georgescu and Jani Kinnunen

Academy of Economic Studies Bucharest, Romania

Institute for Advanced Management Systems Research Åbo Akademi University Turku, Finland

Abstract. This paper treats risk based on the notions of credibility measure and credibility expected value. Firstly, the paper derives and discusses the credibility expected value. Secondly, it presents the definition and analysis of possibilistic portfolios. A probabilistic portfolio is canonically associated with each possibilistic portfolio. Risk evaluation in the context of the probabilistic portfolio leads to an understanding of risk for the possibilistic portfolio.


INTERACTIONS AMONG CRITERIA AS MODELLED BY FUZZIFIED CHOQUET INTEGRAL

Iveta Bebčáková and Jana Talašová

Palacký University Olomouc Olomouc, Czech Republic

Abstract. Within evaluation models, the integrand of the discrete Choquet integral represents the partial evaluations of an alternative with respect to given criteria, while the fuzzy measure, i.e. the generalised monotonic measure, stands for the weights of the sets of criteria. We focus on the fuzzified Choquet integral. We study the first-level fuzzified Choquet integral, which handles partial fuzzy evaluations. Then we employ the second-level fuzzified Choquet integral, where the weights of the sets of criteria are also in the form of fuzzy numbers.
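The crisp discrete Choquet integral that both fuzzified levels build on can be sketched as follows; the criteria, scores, and fuzzy measures in the example are illustrative. When the measure is additive, the integral collapses to the ordinary weighted average.

```python
def choquet(scores, mu):
    """Discrete Choquet integral of `scores` (criterion -> value in [0, 1])
    with respect to fuzzy measure `mu` (frozenset of criteria -> weight).
    Sorts criteria by ascending value and weights each increment by the
    measure of the coalition of criteria reaching at least that value."""
    items = sorted(scores, key=scores.get)        # criteria, ascending by score
    total, prev = 0.0, 0.0
    for k, c in enumerate(items):
        coalition = frozenset(items[k:])          # criteria scoring >= scores[c]
        total += (scores[c] - prev) * mu[coalition]
        prev = scores[c]
    return total
```

For two criteria with scores 0.3 and 0.7, a subadditive measure (mu({a}) + mu({b}) < mu({a, b})) pulls the result below the additive weighted average, modelling complementarity between criteria.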


THE REPRESENTATION OF UNCERTAINTY IN OPERATIONAL RESEARCH: CONSIDERATIONS ON THE USE OF POSSIBILITY, PROBABILITY, AND FUZZY NUMBERS

Matteo Brunelli and Mario Fedrizzi

IAMSR and Turku Centre for Computer Science Åbo Akademi University Turku, Finland

University of Trento Trento, Italy

Abstract. The judgment of the reliability, credibility, or adequacy of the available information plays a critical role when one or more individuals have to make decisions in the presence of uncertainty. In decision-making activities, most of the uncertainty comes from subjective judgments and is commonly transmitted through statements in natural language involving vague predicates; linguistic uncertainty is therefore generated, i.e., uncertainty about a precisely defined quantity that is produced by linguistic information. In this paper we discuss some issues in the application of possibility and probability theory in the domain of operational research. In doing so, we emphasize how, sometimes, justifications for the use of fuzzy numbers in the representation of uncertainty lack formality or empirical background.


EXPERIMENTS TO IMPROVE FORECASTING ACCURACY OF REGRESSION MODELS WITH MINIMAL ASSUMPTIONS

Magderie van der Westhuizen, Giel Hattingh and Hennie Kruger

North-West University, South Africa

Abstract. The forecasting accuracy of a regression model relies heavily on the applicability of the assumptions made by the model builder. In addition, the presence of outliers may also lead to models that are not reliable and thus less robust. In this paper a suggested regression model, based on minimal assumptions, is studied and extended in an effort to improve forecast accuracy. The approach is based on mathematical programming techniques combined with smoothing and piecewise linear techniques. Three cases from the literature are considered and presented as illustrative examples.


HYBRID GENETIC PATTERN SEARCH AUGMENTED LAGRANGIAN ALGORITHM: APPLICATION TO WWTP OPTIMIZATION

Isabel A.C.P. Espirito Santo, Lino Costa, Roman Denysiuk and Edite MGP Fernandes

University of Minho, Campus de Gualtar, Braga, Portugal

Algoritmi R&D Center, Portugal

Abstract. An augmented Lagrangian algorithm is presented to solve a global optimization problem that arises when modeling the activated sludge system in a Wastewater Treatment Plant, attempting to minimize both investment and operation costs. It is a heuristic-based algorithm that uses a genetic algorithm to explore the search space for a global optimum and a pattern search method for the local search refinement. The obtained results have physical meaning and show the effectiveness of the proposed method.


WHICH RATING AGENCY ACHIEVED RATING STABILITY, MOODY’S OR R&I? EMPIRICAL STUDY USING AN ARTIFICIAL NEURAL NETWORK

Motohiro Hagiwara, Katsuaki Tanaka, Hideki Katsuda and Susumu Saitou

Meiji University Japan

Setsunan University Japan

Kinki University Japan

Sophia University Japan

Abstract. The distribution of rating changes plays a crucial role in many credit risk models. As is well known, these distributions vary across time and across issuer types. Ignoring such dependencies may lead to inaccurate assessments of credit risk. We introduce a new approach to improve the performance of rating prediction models for multinational corporations. In the last decade, neural networks have emerged from an esoteric instrument of academic research into a rather common tool assisting auditors, investors, portfolio managers and investment advisors in making critical financial decisions. A better understanding of the networks' performance and limitations would help both researchers and practitioners in analysing real-world problems. The objectives of this research are to verify the effectiveness of artificial neural networks (ANNs) and to examine and compare the stability of the rating structures of rating agencies in the United States and Japan. A method to predict corporate ratings from public quantitative information in an inter-temporally stable manner would be useful from a cost-benefit perspective, especially in the recent rapidly changing economic situation. We find that ANNs have more explanatory power in many cases than models from previous research, and that R&I and Moody’s changed their rating structures significantly in 2006 and 2007, respectively.


A FLEXIBLE FORECASTING INTELLIGENT MODEL FOR NON-STATIONARY TIME SERIES

Iulian Nastac

Politehnica University of Bucharest Bucharest, Romania

Abstract. The paper presents a general adaptive model for dynamic systems that operate in continuously changing environments. This interdisciplinary model is tested in the financial, genetic and technical fields. The algorithm of the model establishes how a viable structure of an artificial neural network from a previous moment in time can be retrained in an efficient manner, in order to accommodate modifications in the complex input-output function of a real forecasting system. A “remembering process” from the previous learning phase is used to enhance the accuracy of the predictions. The advantage of the retraining procedure is that relevant aspects are preserved not only from the immediately preceding training phase, but also from the one before it, and so on. A kind of “slow forgetting process” also occurs; thus it is much easier for the model to remember specific aspects of recent training phases than of older ones.


A COMPUTATIONALLY INEXPENSIVE APPROACH IN MULTI-OBJECTIVE HEAT EXCHANGER NETWORK SYNTHESIS

Markus Hartikainen and Kaisa Miettinen

University of Jyväskylä Jyväskylä, Finland

Abstract. We consider a heat exchanger network synthesis problem formulated as a multi-objective optimization problem. The Pareto front of this problem is approximated with a new approximation approach and the preferred point on the approximation is found with the interactive multi-objective optimization method NIMBUS. Using the approximation makes the solution process computationally inexpensive. Finally, the preferred outcome on the Pareto front approximation is projected on the actual Pareto front.


ANALYZING COST STRUCTURES OF INVENTORY ROUTING: APPLICATION TO CASH SUPPLY CHAINS

Michael Wagner

Hanken School of Economics Helsinki, Finland

Abstract. This paper analyzes cost structures of inventory routing and investigates efficiency gains that result from combining vehicle routing and inventory management. Mixed-integer models are applied to determine the optimal replenishment schedule. Total costs of a sequential approach are contrasted with an integrated approach in order to capture the impact of the underlying set-dependent cost structure. The proposed methodology applies simulation and a factorial design to analyze the role of set-dependent cost structures as well as the impact of demand, cost factors, factor levels and interaction effects. Cost benefits are evaluated in a deterministic environment using nominal range sensitivity analysis and repeated measure analysis of variance (ANOVA). The approach is illustrated for a local network of Automated Teller Machines (ATMs) using empirical data of an international commercial bank. Results of the case study show that cost benefits of the integrated approach vary with factor levels and are primarily determined by routing costs with remaining factors having only limited impact.


SOLVING A COMPLEX PRODUCTION PLANNING PROBLEM WITH MATHEMATICAL PROGRAMMING TECHNIQUES

Mikael Nyberg and Kaj-Mikael Björk

Åbo Akademi University Turku, Finland

Abstract. This paper presents a new MILP (Mixed Integer Linear Programming) model for a complex production planning problem. The problem arises at a producer of food sweeteners with several different kinds of products; the case study plant is located in the USA. The process is fairly complex and the alternatives are vast. A discrete-time production planning model is presented and found suitable for solving the problem, together with a discussion of how to reduce complexity in large processes without losing good solutions.


NURSE ROSTERING IN A DANISH HOSPITAL

Jonas Bæklund

Aarhus University Aarhus, Denmark

Abstract. Nurse rostering is the complex scheduling problem of planning which shift each nurse should work over the next period. The problem confronted here is a nurse rostering problem in a ward at a Danish hospital; it includes several special regulations that have to be fulfilled in this ward. This presentation gives the first results of the ongoing work of my Ph.D. thesis project, together with an overview of the solution approach we are currently working on. The overall method is a branch-and-price algorithm based on a linear master problem and a sub-problem handled with heuristics and constraint programming.


MODELLING TO GENERATE ALTERNATIVE POLICIES IN HIGHLY UNCERTAIN ENVIRONMENTS

Julian Scott Yeomans

Schulich School of Business York University Toronto, ON, Canada

Abstract. Public policy formulation often proves to be an extremely complicated endeavour due to the considerable uncertainty within its various system components. The complexity of public sector decision-making is further compounded by competing performance design objectives and requirements that are difficult to specify, quantify and capture at the time ancillary decision models are constructed. Consequently, there are invariably unmodelled performance design issues, not apparent at the time of model construction, which can greatly impact the acceptability of its solutions. In particular, while a mathematically optimal solution may prove to be the best solution for the modelled problem, it is frequently not the best solution to the real problem. Therefore, in public policy formulation, it is generally preferable to create several quantifiably good alternatives that provide very different structural approaches and perspectives to the problem – an approach referred to as “modelling to generate alternatives”. The potentially unique performance features within these dissimilar alternatives are expected to result in them performing very differently with respect to the unmodelled issues, thus providing a means for capturing and incorporating the unmodelled issues into the solution process. This presentation reviews recent research in modelling to generate alternatives and shows how it can be used to generate numerous policy alternatives that satisfy required system performance criteria in highly uncertain environments and yet are maximally different in the decision space. Many of these techniques can be adapted to a wide variety of problem types and can be extended into many different types of operational and strategic planning applications.


A REAL TIME QUALITY DECISION SUPPORT MODEL FOR THE CONSTRUCTION INDUSTRY

Marios Charalambides, Michalis Menicou, Petros Christou and Vassos Vassiliou

Frederick University Nicosia, Cyprus

Abstract. The construction industry across the globe is at a crossroads, striving for survival after the recent economic crisis. Issues relating to product quality and cost minimization prove to be of the utmost importance for the sector’s successful future. Within this context, this paper presents a real-time quality-cost optimization model for the construction industry, applied to the residential housing sector. The model is based on a quality assurance tool, accompanied by the necessary methodology to identify the optimum path of improvements to be made in order to restore the desired quality of construction at minimum cost. A prototype optimization tool is also presented.


ASSESSING ACADEMIC STAFF PERFORMANCE USING MULTIPLE CRITERIA EVALUATION MODELS

Jana Talašová and Jan Stoklasa

Palacký University Olomouc Olomouc, Czech Republic

Abstract. Various academic staff evaluation models were subjected to detailed analysis. The routine use of the weighted mean as the sole aggregation operator proved inappropriate for aggregating evaluations from different academic areas (lecturing, R&D, management). Even more general aggregation operators (OWA, WOWA) still leave some room for improvement. To objectively assess the benefit of an individual staff member, the use of a fuzzy rule base for aggregating the partial evaluations proves optimal. Our proposed linguistic fuzzy evaluation model is currently being implemented at Palacký University Olomouc.


OPTIPROT – A SPREADSHEET-BASED DECISION SUPPORT SYSTEM FOR BLENDING PROBLEMS IN A FOOD-PROCESSING COMPANY

Veronika Skocdopolova and Josef Jablonsky

University of Economics Prague, Czech Republic

Abstract. The paper presents a model for production process optimisation in a large international milk-processing company. The model is used in the OPTIPROT application, which was created for this company and is used for operational planning. The application optimises the production process of so-called white masses and the purchase of raw milk and other milk ingredients. The optimisation criterion of the model is the minimisation of total production costs. The problem was formulated as a standard blending problem with many specific features. The OPTIPROT application uses an MS Excel interface written in VBA (Visual Basic for Applications) co-operating with the optimisation system LINGO. The optimal results offer recommendations for the purchase of raw materials, own production of materials, surplus sales of raw materials, and stock creation.


A CO-EVOLUTIONARY SIMULATION-OPTIMIZATION ALGORITHM FOR MODELLING TO GENERATE ALTERNATIVES IN MUNICIPAL SOLID WASTE MANAGEMENT PLANNING

Julian Scott Yeomans and Yavuz Gunalay

Schulich School of Business York University Toronto, ON, Canada.

Bahcesehir University Istanbul, Turkey

Abstract. Public sector decision-making typically involves complex problems that are riddled with competing performance objectives and possess design requirements which are difficult to quantify and capture at the time decision models are constructed. Environmental policy formulation can prove additionally complicated because the various system components often contain considerable stochastic uncertainty and frequently there are also numerous stakeholders holding incompatible perspectives. Consequently, there are invariably unmodelled performance design issues, not apparent at the time of the problem formulation, which can greatly impact the acceptability of any proposed solutions. While a mathematically optimal solution might provide the best solution to a modelled problem, normally this will not be the best solution to the real problem. Therefore, in public environmental policy formulation, it is generally preferable to be able to create several quantifiably good alternatives that provide very different approaches and perspectives to the problem. This study shows how simulation-optimization (SO) modelling can be combined into a co-evolutionary algorithm to efficiently generate multiple policy alternatives that satisfy required system performance criteria in highly uncertain environments and yet are maximally different in the decision space. The efficacy of this modelling-to-generate alternatives approach is specifically demonstrated on a municipal solid waste management planning case.


HYBRIDISATIONS OF HARMONY SEARCH ALGORITHM WITH BANDWIDTH AND VARIABLE NEIGHBOURHOOD SEARCH ELEMENTS FOR NOISY RESPONSE SURFACE OPTIMISATIONS

Pongchanun Luangpaiboon

Thammasat University Pathumthani, Thailand

Abstract. A response surface supposes that the yield of an engineering system depends on a number of influential variables, which are restricted to some region of safe operation. The expected value of the response is some unknown function of the influential variables, and the measured yields vary about their expected values because of random errors. These errors comprise natural variation in the process and measurement errors, which occur when monitoring the yield, and are assumed to have a mean of zero and to be uncorrelated with the values taken by the influential variables. Errors in measuring the values of the k influential variables are usually assumed to be negligible in comparison with the random errors associated with the yield. In this paper we examine variations of a harmony search algorithm (HSA) based on bandwidth and variable neighbourhood search elements, for different error standard deviations. All the algorithms are run until they converge. The requirements are that an algorithm converges to the optimum, and that it does so as quickly as possible. The objective of this paper is to investigate how the choice of the best algorithm for optimisation depends on the amount of random variation in process yields when the parameters are fixed. The results show that the HSA with the variable neighbourhood search elements performs better in terms of the mean and variance of the design points and yields.
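The basic HSA that these hybrids extend can be sketched on a noisy two-variable response surface. All parameter values, the objective, and the noise level below are illustrative, and the paper's bandwidth and variable neighbourhood search extensions are not reproduced.

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                   iters=500, seed=0):
    """Basic harmony search for minimisation.  Each new harmony takes every
    variable from the harmony memory with probability hmcr (optionally
    pitch-adjusted within +/- bw with probability par) and otherwise redraws
    it uniformly; it replaces the worst stored harmony if it is better."""
    rng = random.Random(seed)
    mem = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    vals = [f(h) for h in mem]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:
                x = rng.choice(mem)[d]               # memory consideration
                if rng.random() < par:
                    x += rng.uniform(-bw, bw)        # pitch adjustment
            else:
                x = rng.uniform(lo, hi)              # random selection
            new.append(min(max(x, lo), hi))
        v = f(new)
        worst = max(range(hms), key=vals.__getitem__)
        if v < vals[worst]:
            mem[worst], vals[worst] = new, v
    best = min(range(hms), key=vals.__getitem__)
    return mem[best], vals[best]
```

Because the stored objective values are themselves noisy, the returned "best" harmony is only approximately optimal; this is precisely the effect the paper studies as the error standard deviation grows.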


AN ALGORITHM TO FIND THE BEST APPROXIMATE SOLUTIONS FOR A PARTICULAR FUZZY RELATIONAL EQUATION WITH MAX-PRODUCT COMPOSITION

Yan-Kuen Wu and Cheng-Chuang Hon

Vanung University Taoyuan, Taiwan

Minghsin University of Science and Technology Xinfeng Hsinchu, Taiwan

Abstract. Fuzzy relational equations have played an important role in fuzzy modelling and have been applied to many practical problems. Most theoretical results on fuzzy relational equations are based on the premise that the solution set is nonempty. However, it is common for a system of fuzzy relational equations to be inconsistent. The inconsistent fuzzy relational equation is the so-called “inverse fuzzy relation” problem. Finding approximate solutions for the inverse fuzzy relation problem has been investigated by several authors. The algorithms proposed for solving it are usually based on genetic algorithms (GA) or other heuristics. These algorithms are expected to yield good results in most cases but are not guaranteed to yield the best approximate solution. To provide a precise solution procedure, this study presents an algorithm to find the best approximate solution of the inverse fuzzy relation problem with right-hand-side vector b = (1 0 … 0) and max-product composition. Numerical examples are also provided to illustrate how the solution algorithm can be applied to find the best approximate solution for the studied problem.
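For consistent systems, the greatest solution of A ∘ x = b under max-product composition has a well-known closed form, which also serves as a consistency test; the sketch below shows that standard construction on illustrative data. The paper's best-approximation algorithm for the inconsistent case is not reproduced here.

```python
def max_product(A, x):
    """Max-product composition: y_i = max_j A[i][j] * x[j]."""
    return [max(a * xj for a, xj in zip(row, x)) for row in A]

def greatest_candidate(A, b):
    """Greatest potential solution of A o x = b under max-product:
    x_j = min over i of (b_i / a_ij if a_ij > b_i else 1)."""
    x = []
    for j in range(len(A[0])):
        xj = 1.0
        for row, bi in zip(A, b):
            if row[j] > bi:
                xj = min(xj, bi / row[j])
        x.append(xj)
    return x

def is_consistent(A, b, tol=1e-9):
    """The system is consistent iff the greatest candidate solves it."""
    y = max_product(A, greatest_candidate(A, b))
    return all(abs(yi - bi) <= tol for yi, bi in zip(y, b))
```

When `is_consistent` returns False, the system is an inverse fuzzy relation problem and only approximate solutions of the kind studied in the paper exist.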


EFFICIENCY EVALUATION OF HYDROELECTRIC POWER PLANTS USING DATA ENVELOPMENT ANALYSIS

V Dedoussis, K Konstas, A Kassimis and S Sofianopoulou

University of Piraeus Piraeus, Greece

Abstract. The purpose of this paper is to evaluate the efficiency of a network of hydroelectric power plants using the Data Envelopment Analysis approach. The network is modelled as a linear system with multiple inputs and outputs. As inputs one could consider, for instance, the age of a plant, the total number of hours that a plant is in operation during each year, etc. As outputs the model considers the electrical energy delivered per year, the number of hours that the plant is not in operation, etc. The proposed approach does not only evaluate each plant relative to the other ones, but it also ‘produces’ policy making scenarios that would enable plant managers to improve the plant’s operational characteristics. Computational results based on real-world data are presented and discussed. Relationships between efficiency scores and various inputs/outputs are also investigated and some interesting trends are identified.


AN EXTENDED CASE-BASED DISTANCE APPROACH FOR ALTERNATIVES SCREENING

Li-Ching Ma and Pei-Pei Hsu

National United University MiaoLi City, Taiwan

I-Shou University Kaohsiung County, Taiwan

Abstract. Screening is a helpful process for reducing a larger set of alternatives to a smaller set that contains the best alternatives, so that decision makers can concentrate on evaluating the alternatives in the smaller set. Hence, how to assist decision makers in screening out poor alternatives is an important issue in multiple criteria decision making. This study develops a screening model incorporating the advantages of the case-based distance method and discriminant analysis. Building on the concept of the case-based distance method, the proposed approach can obtain criterion weights and screening rules by eliciting decision makers’ preferences from a set of test cases. Moreover, the proposed approach can increase hit rates and alleviate the multiple-solution problems of conventional case-based distance methods.


BI-CRITERIA HEURISTIC FOR SCHEDULING ON UNRELATED PARALLEL MACHINES

YK Lin, JW Fowler and ME Pfund

Feng Chia University Taichung, Taiwan

Arizona State University Tempe, AZ, USA

Abstract. In this research, a bi-criteria heuristic is proposed to find non-dominated solutions to the unrelated parallel machines scheduling problem that minimizes makespan and total weighted tardiness.
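A heuristic of this kind ultimately reports a set of non-dominated (makespan, total weighted tardiness) pairs. The sketch below, with an illustrative schedule encoding and data rather than the authors' heuristic, shows how a schedule on unrelated machines is scored on both criteria and how dominated outcomes are filtered.

```python
def evaluate(schedule, p, d, w):
    """schedule[m] is the ordered job list on machine m; p[m][j] is the
    processing time of job j on (unrelated) machine m; d and w are job
    due dates and weights.  Returns (makespan, total weighted tardiness)."""
    makespan, twt = 0.0, 0.0
    for m, jobs in enumerate(schedule):
        t = 0.0
        for j in jobs:
            t += p[m][j]                       # completion time of j on m
            twt += w[j] * max(0.0, t - d[j])   # weighted tardiness of j
        makespan = max(makespan, t)
    return makespan, twt

def non_dominated(points):
    """Keep points not strictly dominated by any other point (a dominating
    point is <= in both objectives and < in at least one)."""
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1]
                       and (q[0] < p[0] or q[1] < p[1]) for q in points)]
```

A bi-criteria heuristic would generate many candidate schedules, score each with `evaluate`, and present only the `non_dominated` frontier to the decision maker.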


A MIXED INTEGER PROGRAMMING FORMULATION FOR PARALLEL MACHINES SCHEDULING WITH A SINGLE SERVER

Mi-Yi Kim and Young Hoon Lee

Yonsei University Seoul, Korea

Abstract. This paper considers the identical parallel machines scheduling problem (PMSP) with a single server, which is in charge of job setups. A job can be processed on one of the machines only after a preceding setup by the server. Because of the single-server constraint, a setup can be in progress on only one machine at any time. In this paper, the problem P,S1|sj|Cmax with a general job set is formulated as a mixed integer program, developed by taking the characteristics of the single-server problem into account and modifying the description of server waiting time identified by Abdekhodaee and Wirth (2002).


A GENETIC ALGORITHM FOR DISTRIBUTED SCHEDULING IN SUPPLY NETWORKS

Anna Ławrynowicz

Warsaw School of Economics Warszawa, Poland

Abstract. In this paper, the author proposes a genetic algorithm for distributed scheduling in supply networks. The new genetic algorithm enables not only manufacturing scheduling but also aids planners in transport order planning. The algorithm is based on operation codes, where each chromosome is a set of 5-position genes. The new method was verified in several experiments. Finally, representative examples illustrate that the suggested method can improve distributed scheduling in supply networks. It can be applied in a dynamic setting where re-scheduling is initiated by unexpected changes.


A TWO-PHASE CONSTRUCTIVE HEURISTIC FOR PERMUTATION FLOW-SHOP SCHEDULING PROBLEM MINIMIZING TOTAL COMPLETION TIME

Kaveh Sheibani

Iran Telecommunication Research Centre (ITRC) Tehran, Iran

Abstract. This paper describes a polynomial-time heuristic for the permutation flow-shop scheduling problem with the criterion of minimizing total completion time. The proposed method consists of two phases: arranging the jobs in priority order and then constructing a sequence. A new ranking method is employed to prioritize the jobs for incorporation into the construction phase of the heuristic. Computational experiments using standard benchmark problems indicate that the proposed method is very effective.
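A two-phase constructive heuristic of the kind described — priority ordering followed by sequence construction — can be sketched as follows. The ascending-total-processing-time priority rule and NEH-style best-position insertion used here are common generic choices, not the paper's new ranking method.

```python
def completion_times(seq, p):
    """Completion time of each job (in sequence order) on the last machine
    of a permutation flow shop; p[j][m] is job j's time on machine m."""
    m = len(p[0])
    c = [0.0] * m          # current completion time on each machine
    out = []
    for j in seq:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
        out.append(c[-1])
    return out

def total_completion_time(seq, p):
    return sum(completion_times(seq, p))

def constructive_heuristic(p):
    """Phase 1: priority order (ascending total processing time, a common
    surrogate for total completion time).  Phase 2: insert each job at the
    position minimising total completion time of the partial sequence."""
    order = sorted(range(len(p)), key=lambda j: sum(p[j]))
    seq = []
    for j in order:
        seq = min((seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)),
                  key=lambda s: total_completion_time(s, p))
    return seq
```

On a three-job, two-machine instance the heuristic never does worse than the natural order, since the insertion phase evaluates that order among its candidates.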


OPTIMAL CONTROL OF A MEAN-REVERTING INVENTORY

Abel Cadenillas, Peter Lakner and Michael Pinedo

University of Alberta Edmonton, Canada

New York University New York, USA

Abstract. In this paper we consider the inventory level of a company and assume that it follows a mean-reverting process. The objective of the management is to keep this inventory level as close as possible to a given target; there is a running cost associated with the difference between the inventory level and the target. The management is allowed to intervene at times in the form of major purchases or sales of the goods. These interventions are subject to fixed as well as proportional costs. The objective of this paper is to find the inventory levels at which management should intervene, and the magnitudes of these interventions, so as to minimize the expected total discounted cost.
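A minimal simulation illustrates the setting: a mean-reverting inventory with a quadratic running cost around the target and impulse interventions carrying fixed plus proportional costs. The simple jump-to-target band policy and all parameter values below are illustrative; the paper determines the optimal intervention levels analytically, which this sketch does not attempt.

```python
import math
import random

def simulate_band_policy(x0, target, lower, upper, kappa=0.5, sigma=1.0,
                         holding=1.0, fixed=5.0, prop=0.5, discount=0.05,
                         dt=0.01, steps=5000, seed=0):
    """Euler simulation of a mean-reverting inventory
    dX = kappa*(target - X) dt + sigma dW, controlled by a band policy:
    whenever X exits [lower, upper], jump back to `target`, paying a fixed
    plus proportional intervention cost.  Returns total discounted cost."""
    rng = random.Random(seed)
    x, cost = x0, 0.0
    for n in range(steps):
        disc = math.exp(-discount * n * dt)
        cost += disc * holding * (x - target) ** 2 * dt      # running cost
        x += kappa * (target - x) * dt \
             + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        if x < lower or x > upper:
            cost += disc * (fixed + prop * abs(x - target))  # intervention
            x = target
    return cost
```

Sweeping the band width in such a simulation exhibits the trade-off the paper optimises: tight bands pay intervention costs often, wide bands accumulate running cost.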


DESIGN OF SINGLE-FEATURED EWMA-X CONTROL CHART FOR PROCESS MEAN SHIFT DETECTION

Chi-Shuan Liu and Fang-Chih Tien

National Taipei University of Technology Taipei, Taiwan

Abstract. The control chart is a useful process monitoring technique for statistical process control (SPC). Identifying both large and small process mean shifts is an important issue in this area. Currently, a combined EWMA-X chart is commonly used in practice, but two sets of statistics and control limits have to be calculated and plotted. The objective of this paper is to simplify the original combined chart so that process monitoring can be performed with a single set of statistics and control limits. Therefore, a new control chart, the so-called SFEWMA-X chart, is designed and proposed. An experiment shows that the proposed chart can identify both small and large process mean shifts with the same detection power.
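
For context, the classical EWMA statistic and its time-varying control limits, on which the combined EWMA-X chart builds, can be sketched as follows. This is the textbook EWMA chart, not the proposed SFEWMA-X design, and the data and parameters below are invented:

```python
import math

# Classical EWMA chart: z_t = lam*x_t + (1 - lam)*z_{t-1}, with
# exact (time-varying) control limits around the target mu0.

def ewma_chart(x, mu0, sigma, lam=0.2, L=3.0):
    """Return (z_t, LCL_t, UCL_t, out_of_control_t) per observation."""
    z, out = mu0, []
    for t, xt in enumerate(x, start=1):
        z = lam * xt + (1 - lam) * z
        hw = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        out.append((z, mu0 - hw, mu0 + hw, abs(z - mu0) > hw))
    return out

# Five in-control readings followed by a +2*sigma mean shift:
data = [0.0, 0.1, -0.2, 0.05, 0.0, 2.0, 2.2, 1.9, 2.1, 2.0]
results = ewma_chart(data, mu0=0.0, sigma=1.0)
print([round(z, 3) for z, _, _, _ in results])
```

The smoothed statistic accumulates the shift and eventually crosses the upper limit, which is how the EWMA side of a combined chart catches small shifts.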


ECONOMIC DESIGN OF X CONTROL CHARTS UNDER PREVENTIVE MAINTENANCE AND TAGUCHI LOSS FUNCTIONS

Fong-Jung Yu, Yu-Hua Lin and Ruey-Shiang Guh

Da-Yeh University, Taiwan

Hsiuping Institute of Technology, Taiwan

National Formosa University, Taiwan

Abstract. Economic designs of x control charts have been widely investigated and ensure that the economically designed control chart actually has a lower cost. Preventive maintenance can reduce the failure rate towards an out-of-control state by an amount proportional to the preventive maintenance level. This paper presents an integrated model that combines preventive maintenance and the economic design of x control charts using the Taguchi loss function. The maintenance activities are coordinated with the statistical characteristics of the sampling results. A numerical example demonstrates the model and the effect of preventive maintenance on quality control costs.


OPTIMIZING AN INVENTORY CONTROL SYSTEM IN MAZANDARAN WOOD AND PAPER INDUSTRIES IN IRAN

T Mojibi, R Tavakkoli-Moghaddam and VA Rezaei-Nosrati

Islamic Azad University - Firoozkuh Branch, Firoozkuh, Iran

College of Engineering, University of Tehran, Tehran, Iran

Mazandaran Wood and Paper Industries, Sari, Iran

Abstract. This paper considers an inventory control system in a real case study, namely Mazandaran Wood and Paper Industries in Iran. By optimizing the existing situation of such a system, we can find the ordering and holding costs that result in a reduction of the total inventory cost. We calculate the economic order quantity (EOQ) and re-order point (ROP). Six items of raw materials, which account for about 94% of the warehouse's inventories, are taken as the sample in our study. The results show that the total cost is reduced by 41% in the optimized inventory control system.
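
The EOQ and ROP calculations underlying such a study can be sketched as follows; all demand and cost figures below are invented placeholders, not the paper's case-study data:

```python
import math

def eoq(annual_demand, ordering_cost, unit_holding_cost):
    """Economic order quantity: sqrt(2*D*S / H)."""
    return math.sqrt(2 * annual_demand * ordering_cost / unit_holding_cost)

def rop(daily_demand, lead_time_days, safety_stock=0.0):
    """Re-order point: lead-time demand plus safety stock."""
    return daily_demand * lead_time_days + safety_stock

# Invented figures for one raw-material item:
D, S, H = 12000, 150, 4       # units/year, cost per order, cost/unit/year
q = eoq(D, S, H)
r = rop(D / 300, lead_time_days=10, safety_stock=50)   # 300 working days
annual_cost = D / q * S + q / 2 * H   # ordering + holding cost at the optimum
print(round(q), round(r), round(annual_cost))
```

At the EOQ the annual ordering cost equals the annual holding cost, which is what makes the total cost minimal.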


EXPLORING BENEFITS, OPPORTUNITIES, COSTS AND RISKS OF TERRITORIAL TRANSFORMATIONS: A COMBINED ANALYTIC NETWORK PROCESS (ANP) MODEL AND DRIVING FORCES-PRESSURES - STATE - IMPACTS - RESPONSES (DPSIR) FRAMEWORK FOR ENVIRONMENTAL ASSESSMENT

Marta Bottero and Valentina Ferretti

Politecnico di Torino Torino Italy

Abstract. Environmental Assessment of territorial transformation projects is an intrinsically complex multidimensional process, because it considers different elements, such as the physical-chemical, biological, cultural and socio-economic components. The use of decision support methods can therefore be beneficial for decision makers. With reference to Environmental Assessment and territorial transformations, a well-established approach for reporting information on the various aspects of a development is the use of indicators. Among these methods, it is worth mentioning a recent approach presented by the Organization for Economic Co-operation and Development, in which environmental indicators are organized according to the so-called Driving forces-Pressures-State-Impacts-Responses (DPSIR) framework. Unfortunately, this approach assumes linearity in the relationships between the actions of a project, the impacts on the environmental system and the interferences with human activities, and thus fails to study the system's complexity in depth. In order to overcome these limits, this paper proposes a combined decision support tool that employs the DPSIR environmental indicator framework to analyse the different environmental aspects of the problem, and the Analytic Network Process (ANP) method to manage the interdependencies among the factors, which can be organized into categories of Benefits, Opportunities, Costs and Risks (the BOCR structure). The paper illustrates the application of the combined DPSIR/ANP model according to the BOCR structure to assess three alternative projects for the requalification of a downgraded urban area in Northern Italy. The results show the most relevant environmental indicators describing the transformation and the ranking of the three considered options.


A COMBINATION OF QUALITY FUNCTION DEPLOYMENT (QFD) AND ANALYTIC NETWORK PROCESS (ANP) TO EVALUATE URBAN REDEVELOPMENT PROJECTS. AN APPLICATION TO THE “BELLE DE MAI – LA FRICHE” OF MARSEILLE (FRANCE)

Isabella M Lami and Elena L Vitti

Polytechnic of Turin Turin, Italy

Abstract. This paper proposes a methodological framework to integrate stakeholders’ requirements and different aspects of urban redevelopment projects. We propose the combined application of the multi-criteria decision analysis techniques Quality Function Deployment (QFD) and Analytic Network Process (ANP) in order to evaluate revitalization projects for areas whose industrial past has vanished from an economic point of view but still remains in the character of the buildings and the area. This work illustrates the application of the methodological framework to evaluate the principal aspects of the transformation of the site “Belle de Mai – La Friche” in Marseille, a former tobacco factory whose 45,000 square metres have been, since the early 1990s, a centre for contemporary cultural and artistic events. Through a combination of QFD and ANP we intend to verify whether this procedure can contribute to the decision-making process and to general consensus.


OPTIMIZATION OF SALES PROMOTION ACTIVITIES FOR CUSTOMERS UNDER THE CONDITION OF BUDGET CONSTRAINT

Susumu Saito and Hisashi Kikuchi

Tokyo University of Science Japan

Abstract. The purpose of this study is to create a model of sales promotion activities directed at each customer, formulated as a multi-dimensional knapsack problem in which the sum of expected profits under a budget constraint is maximized and the purchase probability of a customer for each channel is assumed to be known. In addition, we propose a metaheuristic algorithm for this problem. The problem is formulated using a defined expected cost, an expected profit, and a risk measure given by the square root of the semi-variance of risks. We obtain a tool to decide “to whom, through which channel, and what sort of products should be promoted” in order to maximize the expected total profit under a budget constraint and a risk tolerance. The relationship among expected total profit, budget, and risk tolerance is also derived.


FUZZY SUBSETHOOD MEASURE APPLIED IN WEIGHTS SETTING OF DECISION MAKERS

Cheng-Chuang Hon, Yan-Kuen Wu and Ling-Lang Tang

Ming Hsin University of Science and Technology, Taiwan

Vanung University, Taiwan

Yuan-Ze University, Taiwan

Abstract. In order to avoid deviations caused by autocratic and subjective attitudes in the decision-making process, bringing decision makers (DMs) together to reach group consensus is one of the best ways of achieving accurate evaluations. In the traditional performance evaluation model, the decision weight of each DM is set equally; in practice, it is more suitable to set these weights through mutual interactions within the decision group. This paper summarizes several methodologies for setting the weights of DMs: the Similarity Aggregation Method (SAM), the Optimal Aggregation Method (OAM), the Least Squares Distance Method (LSDM), and the Defuzzification-based Least Squares Method (DLSM). A newly developed method, called the Subsethood Aggregation Method (SbAM), is also provided for the same purpose and can be used to obtain a more objective group consensus.


AN APPROXIMATE REASONING APPROACH TO RANK THE RESULTS OF FUZZY QUERIES

Christer Carlsson, Robert Fuller and Jozsef Mezei

Abo Akademi University Turku, Finland

Abstract. In this paper we suggest the use of a context-dependent fuzzy aggregation method to rank the results of fuzzy queries over fuzzy ontologies. In our approach, the fuzzy aggregation rules are provided by experts; the coefficients of the consequence part of the rules are derived from the linguistic values used in the conditional part; and the rank of a search result is determined by the Takagi-Sugeno fuzzy reasoning scheme.
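
A minimal sketch of first-order Takagi-Sugeno-style inference for ranking follows. The membership functions and rule coefficients here are invented for illustration; in the paper the coefficients are derived from the linguistic values of the antecedents:

```python
# Two linguistic terms for a single input, "relevance" in [0, 1]:
def low(x):
    return max(0.0, min(1.0, (0.6 - x) / 0.6))

def high(x):
    return max(0.0, min(1.0, (x - 0.4) / 0.6))

# IF relevance IS <term> THEN score = coef * relevance
rules = [(low, 0.3), (high, 0.9)]

def ts_rank(relevance):
    """Firing-strength-weighted average of rule consequents."""
    weights = [mu(relevance) for mu, _ in rules]
    outputs = [coef * relevance for _, coef in rules]
    den = sum(weights)
    return sum(w * y for w, y in zip(weights, outputs)) / den if den else 0.0

# Rank three search results by their inferred score:
print(sorted([0.1, 0.5, 0.9], key=ts_rank, reverse=True))
```

Each rule fires to the degree its antecedent matches, and the final score is the normalized weighted sum of the rule outputs, which is then used to order the results.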


USING MULTIPLE CRITERIA DECISION-MAKING METHOD TO ANALYZE OPTIMUM PUSH/PULL JUNCTION POINT LOCATION FOR TFT-LCD MANUFACTURING

Taho Yang, Jiunn-chenn Lu and Ling-Hsiu Chen

Institute of Manufacturing Information and Systems National Cheng Kung University Tainan, Taiwan

Chaoyang University of Technology Taichung County, Taiwan

Abstract. The aim of this research is to implement a hybrid push/pull production system that can satisfy both high service levels and low inventory cost. At the same time, we consider sophisticated sources of variability, such as multiple products, random setups, random breakdowns, yield loss, batch processes, and other contingencies. The problem is solved by a multiple criteria decision-making (MCDM) method: the technique for order-preference by similarity-to-ideal solution (TOPSIS) is used to select a suitable option. The optimization involves evaluating stochastic performance measures of alternative scenarios among potential junction point locations using a discrete event simulation model. A practical thin film transistor-liquid crystal display (TFT-LCD) process case study illustrates the proposed method. Simulation results indicate that the inventory cost was reduced by over 46% after implementing the hybrid push/pull production strategy.
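
The TOPSIS selection step can be sketched generically as follows. The decision matrix, weights and criteria below are invented; in the paper, the scores come from discrete event simulation of the candidate junction points:

```python
import math

def topsis(matrix, weights, benefit):
    """matrix[i][j]: score of alternative i on criterion j;
    benefit[j]: True if larger is better on criterion j."""
    # Vector-normalize each column, then apply criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix))
             for j in range(len(weights))]
    v = [[weights[j] * row[j] / norms[j] for j in range(len(weights))]
         for row in matrix]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    # Closeness coefficient: nearer the ideal, farther from the anti-ideal.
    return [math.dist(row, anti) / (math.dist(row, ideal) + math.dist(row, anti))
            for row in v]

# Three junction-point scenarios scored on service level (benefit)
# and inventory cost (the lower the better):
m = [[0.95, 120.0], [0.90, 80.0], [0.85, 60.0]]
scores = topsis(m, weights=[0.5, 0.5], benefit=[True, False])
best = max(range(len(scores)), key=scores.__getitem__)
print(best, [round(s, 3) for s in scores])
```

The alternative with the highest closeness coefficient is the recommended junction point location under the chosen weights.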


ADOPTING MULTI-CLASS MTS TO PREDICT OBSTRUCTIVE SLEEP APNEA

Li-Fei Chen, Pa-Chun Wang, Chun-Chin Hsu and Chao-Ton Su

Fu-Jen Catholic University, Taiwan

Cathay General Hospital, School of Medicine, Fu Jen Catholic University, Taiwan

Chaoyang University of Technology, Taiwan

National Tsing Hua University, Taiwan

Abstract. This study aims to apply the multi-class Mahalanobis-Taguchi System (MMTS), based on anthropometric information and questionnaire data, to predict obstructive sleep apnea (OSA). We separated the collected OSA data into two parts: group I and group II. The group I data were used to establish the MMTS model, and the group II data were used to test it. The results show that MMTS achieves an accuracy of 0.8438 in OSA prediction. MMTS can therefore be applied to assist doctors in anticipating the diagnosis of OSA before running the polysomnography (PSG) test, so that medical resources can be used more effectively.


A FUZZY MODEL OF MEDICAL DISASTER RESPONSE: DECISION MAKING SUPPORT AND DISASTER MANAGEMENT TOOL FOR CZECH EMERGENCY MEDICAL RESCUE SERVICES

Jan Stoklasa

Palacky University Olomouc Olomouc, Czech Republic

Abstract. The decision-making process of the Emergency Medical Rescue Services operations centre during disasters involves a significant amount of uncertainty. Decisions need to be made quickly and no mistakes are tolerable, particularly in the case of disasters resulting in a large number of injured people. A multiphase linguistic fuzzy model is introduced to assist the operator during the initial phase of the medical disaster response. Based on uncertain input data, we can estimate the severity of the disaster, the number of injured people, and the amount of forces and resources needed to deal with the situation successfully. The need for reinforcements is also considered. Fuzzy numbers, linguistic variables and fuzzy rule bases are applied to deal with the uncertainty. Results derived by the model are available both as fuzzy sets and as linguistic terms.


REAL ASSET APPRAISAL BASED ON A MULTI-EXPERT APPROACH USING THE PAY-OFF METHOD FOR REAL OPTION VALUATION

Mikael Collan and Mario Fedrizzi

Institute for Advanced Management Systems Research Åbo Akademi University Turku, Finland

University of Trento Trento, Italy

Abstract. The pay-off method is a novel method designed for the valuation and analysis of real assets. The method is based on cash-flow scenarios that are used to create a pay-off distribution for a project, from which the real option value is calculated. The reliability of the cash-flow scenarios used influences the reliability of the analysis results. Evaluation of future cash-flows is a complicated issue, as it is often impossible to identify or specify in sufficient detail the relevant processes that underlie the real asset cash-flows. This means that often the best available information on the future cash-flows that the asset generates resides with the project managers and experts. Managers’ opinions about the future may not be in concert, thus calling for consensus building. In this paper we show how consensus on project dynamics can be modelled and how the resulting cash-flows can be used as a basis for an analysis of project profitability with the pay-off method.


CASH FLOW SIMULATION EMBEDDED REAL OPTIONS

Tero Haahtela

Aalto BIT Research Centre School of Science and Technology, Aalto University Aalto, Finland

Abstract. A cash flow simulation embedded option is an option whose value is based on choosing the optimal decision in each time step during a single cash flow simulation run. Cash flow simulation embedded options are mostly operative options with continuous, gradual and nearly immediate exercise and well-known payoffs or benefits. Typically these options are difficult to model with methods other than Monte Carlo simulation. However, cash flow simulation embedded options and the common, once-exercisable options can be applied simultaneously in an investment valuation by using the simulated cash flow with embedded options as the underlying asset for a lattice, which is then used to value other lattice-type options and their interactions.


DEFAULTABLE BONDS UNDER IMPRECISE INFORMATION

Elettra Agliardi and Rossella Agliardi

University of Bologna Italy

Abstract. This article develops a computational method to capture the effect of imperfect information on the value of defaultable bonds. A fuzzy modelling approach is adopted, and the numerical experiments show that an imprecise value of the stochastic underlying asset and/or of the barrier triggering default has a material impact on the qualitative shape of the term structures of credit spreads.


APPLYING FINANCIAL RIGOUR TO FUZZY REAL OPTIONS

Sebastian Jaimungal and Yuri Lawryshyn

University of Toronto Toronto, ON, Canada

Abstract. Fuzzy numbers have recently been introduced in the real options literature as a simple alternative approach to option valuation. However, so far the assumption that the project value is a fuzzy number has not been put on solid theoretical ground. Here, we consider two methods to value a real options project whose future expected cash flows are based on managerial triangular estimates. Both methods rely on correlating a geometric Brownian motion (GBM) process to a traded market security or index. By utilizing the minimal entropy martingale measure, we value the real option in a theoretically sound manner.


REAL TIME REDUNDANCY STUDY OF A TIME DEPENDENT FAILURE RATES MODEL

Mani Sharifi, Soroush Bafekr Mishamandani and Sara Harati Zadeh

Islamic Azad University, Qazvin branch, Qazvin, Iran

Islamic Azad University, Kashan branch, Kashan, Iran

Islamic Azad University, Karaj Branch Karaj, Iran

Abstract. In this paper we study a useful model in redundancy systems. The model has (n + 1) components, of which n are spare parts for the main component. The failure rate of the working component is time-dependent, of the form λ·t^δ, while the failure rates of the non-working components are zero. When a component stops working, one of the spare parts starts working immediately. The failed components are non-repairable. For this model we establish the differential equations between the system states and, by solving these equations, calculate parameters of the system such as the reliability and the MTTF as functions of time.
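
A Monte Carlo cross-check of such a cold-standby model can be sketched as follows, assuming each component's hazard clock starts when it begins working (an assumption of this sketch; the paper instead solves the state differential equations analytically):

```python
import random

def system_lifetime(lam, delta, n_spares, rng):
    # Hazard rate lam * t**delta gives a Weibull lifetime with shape
    # delta + 1 and scale ((delta + 1) / lam) ** (1 / (delta + 1)).
    shape = delta + 1.0
    scale = ((delta + 1.0) / lam) ** (1.0 / (delta + 1.0))
    # Main component plus n cold-standby spares, used one after another:
    return sum(rng.weibullvariate(scale, shape) for _ in range(n_spares + 1))

def estimate(lam, delta, n_spares, t, n_runs=20000, seed=1):
    """Estimate reliability R(t) and MTTF by simulation."""
    rng = random.Random(seed)
    lives = [system_lifetime(lam, delta, n_spares, rng) for _ in range(n_runs)]
    return sum(life > t for life in lives) / n_runs, sum(lives) / n_runs

r, mttf = estimate(lam=1.0, delta=0.0, n_spares=2, t=1.0)
print(round(r, 3), round(mttf, 3))  # delta=0: MTTF should be near (n+1)/lam = 3
```

With δ = 0 the lifetimes are exponential and the simulated MTTF can be checked against the known value (n + 1)/λ, which is a useful sanity check before trusting results for δ > 0.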


APPLICATIONS OF AGENT-BASED MODELS FOR OPTIMIZATION PROBLEMS: A LITERATURE REVIEW

M Barbati, G Bruno and A Genovese

University of Naples Naples, Italy

University of Sheffield Sheffield, UK

Abstract. This work is devoted to illustrating the application of Agent-Based Modelling (ABM) to optimization problems. Given their strengths in representing and simulating complex systems, ABMs have recently been applied (sometimes combined with other optimization techniques) to solve optimization problems whose domains present several interrelated components in a distributed and heterogeneous environment. A first comparison between agent-based approaches and classical optimization techniques is therefore provided, followed by an extensive review aimed at evaluating the impact of these methodologies in the Operational Research / Management Science (OR/MS) literature.


EFFICIENT LOWER BOUNDING SCHEMES FOR THE MULTI-COMMODITY CAPACITATED MULTI-FACILITY WEBER PROBLEM

M Hakan Akyüz, Temel Öncan and İ Kuban Altinel

Galatasaray University Istanbul, Turkey

Boğaziçi University Istanbul, Turkey

Abstract. The Multi-commodity Capacitated Multi-facility Weber Problem (MCMWP) is concerned with locating I capacitated facilities in the plane to satisfy the demands of J customers for K types of commodities, subject to bundle constraints on commodity flows between facilities and customers. We propose a Lagrangean relaxation (LR) scheme in which the subproblem is solved by a column generation procedure on an equivalent Set Covering problem. To enhance the subgradient optimization algorithm, two efficient acceleration strategies are applied. We also propose a block-norm-based lower bounding approach which employs approximate Mixed Integer Linear Programming formulations of the MCMWP. The proposed lower bounding schemes are tested on randomly generated instances.


SYSTEM PERFORMANCE UNDER COST RESTRICTIONS

Frank Beichelt

University of the Witwatersrand Johannesburg, Republic of South Africa

Abstract. This contribution deals with maximizing the availability of a technical system under a repair cost limit replacement policy: after a system failure occurring at system age t, the cost of repairing the system is estimated. If this cost exceeds a certain limit c(t), called the repair cost limit, the system is replaced by an equivalent new one; otherwise, the failure is removed by repairing the system. The corresponding system availability is determined and maximized with respect to special classes of decreasing repair cost limit functions c(t). The results give strong evidence in favour of the hypothesis that applying a decreasing repair cost limit is more efficient than applying a constant one.


AUCTIONING HETEROGENEOUS ITEMS WITH APPLICATIONS TO INTERNET ADVERTISEMENTS AND PRIVATIZATIONS

Yumiko Baba

Aoyamagakuin University Tokyo, Japan

Abstract. We analyze auctions with multiple similar, but heterogeneous, items. Bidders’ preferences among the items are common, but bidders value the same item differently; the model therefore has both a common value component and a private value component. We show that the sequential sealed-bid first-price auction achieves both efficiency and revenue maximization simultaneously. This mechanism is used by a Japanese advertisement company to sell internet keywords to sponsors and is very different from the mechanisms used by US search engines such as Google and Yahoo. The mechanism is also applicable to privatization problems.


DIVERSIFICATION EFFECTS OF ASSET PRICE PROCESS PARAMETERS: AN EMPIRICAL INVESTIGATION

Ursula Walther and Andrey Fetsun

Frankfurt School of Finance and Management, Germany

BHF Bank AG, Germany

Abstract. Higher moments of asset price distributions, especially skewness, have long been recognized as important characteristics in asset pricing and portfolio management. The three-moment capital asset pricing model (3M CAPM) considers skewness in addition to mean and variance, the final asset pricing equation containing the linear factor gamma, which describes standardized co-skewness. In an asset management context, the assumption of a general preference for positive skewness leads to different efficient sets and portfolio choices. A less well known fact is the diversification behavior of skewness when several assets are blended in a portfolio. It has repeatedly been found that skewness is not “diversified away” even in large portfolios. In some studies the market portfolio even shows a higher negative skewness than most of the individual assets. The expectation of a general diversification benefit for asset characteristics other than variance is therefore misleading; unwanted portfolio characteristics may even accumulate at the portfolio level. In our study we investigate the diversification effects of a broad set of asset characteristics in a systematic way. The starting point is the description of asset price behavior as parameterized stochastic processes using a rich set of parameters. We follow the two-step approach frequently used in financial risk management: first we estimate the time-varying volatility with a GARCH-type model; then we capture the remaining characteristics by fitting a parameterized probability distribution to the standardized returns. For the latter we use the family of NIG distributions. This provides a set of seven to eight parameters that describe the asset characteristics in detail yet systematically. The estimation approach is applied to a broad data set of daily returns of German large and small cap stocks.
In order to study diversification effects, we systematically select assets showing distinctive parameter levels and analyze the historical price processes of buy-and-hold portfolios. We also study the behavior of portfolios created by aggregating simulated single-asset portfolio returns. The results confirm the observation of non-diversification effects with respect to skewness. Additionally, we find interesting and systematic effects on other parameters that give new insights into the behavior of stock returns at an aggregated level.


TYPOLOGIES OF FOREST FIRES IN THE MEDITERRANEAN AREA ACCORDING TO HUMAN ACTIVITY

M Martínez-Gómez, M Marí-Benlloch, C Maroto and J Suárez

Universidad Politécnica de Valencia Valencia, Spain

Abstract. Important economic losses and deaths are caused every year by forest fires, which also deteriorate natural resources and increase pollution. Rigorous analysis of the factors that cause forest fires is needed in order to reduce their effects and to develop tools that minimize their consequences. First, we selected the main variables to characterize the human activity factors and the characteristics of the forest fires. Once the variables were defined, we used multivariate techniques and Geographical Information Systems (GIS) to classify the municipalities (using a land zoning of a Mediterranean area, the Autonomous Region of Valencia) into groups with homogeneous characteristics of incidence and causes of forest fires.


OPEN STATION MIXED-MODEL ASSEMBLY LINES: A CASE OF A NEWLY ESTABLISHED LINE

Babak H. Tabrizi and Reza Tavakkoli-Moghaddam

College of Engineering, University of Tehran Tehran, Iran

Abstract. This paper considers a special case of mixed-model assembly lines (MMALs). As each station's assembly time is one of the fundamental parameters in setting the number of stations and operators and in providing an accurate production schedule, we need to investigate circumstances in which operators are inexperienced and perform their tasks within a time interval rather than in an exact time, taking the corresponding learning function into account. The Value-at-Risk (VaR) strategy is applied to tackle this uncertainty as an efficient approach to handling potential risk. The objective is to minimize the utility and idle time costs in open station cases, incurred when the line is not balanced because of a lack of homogeneity in station throughput times. A numerical example is tested at the end to demonstrate the different results obtained from an ordinary problem compared with its stochastic analogue.


A POSSIBILISTIC PROGRAMMING SOLUTION METHOD FOR A BI-OBJECTIVE OPEN SHOP SCHEDULING PROBLEM WITH FUZZY PARAMETERS

Reza Tavakkoli-Moghaddam, Samaneh Noori Darvish and Nikbakhsh Javadian

College of Engineering, University of Tehran Tehran, Iran

Mazandaran University of Science and Technology Babol, Iran

Abstract. This paper presents a novel bi-objective possibilistic mixed-integer linear programming model for an open shop scheduling problem. Machine-dependent setup times, fuzzy processing times and fuzzy due dates with triangular possibility distributions are the main constraints of this model. The objectives are to minimize the total weighted tardiness and the total weighted completion time. An interactive fuzzy programming solution approach proposed by Torabi and Hassini (TH) is applied to convert the original model into an auxiliary single-objective crisp model. Using a classic approach from the literature, small-sized numerical instances are generated at random and solved in order to obtain the Pareto optimal solutions.


IS TIMING MONEY? THE RETURN SHAPING EFFECT OF TECHNICAL TRADING SYSTEMS

Peter Scholtz

Frankfurt School of Finance and Management Frankfurt, Germany

Abstract. The success of trading systems based on technical analysis still seems puzzling and is controversially discussed among experts. In practice, technical trading is widely accepted, whereas academics are traditionally rather skeptical. But in the late 1980s the picture started to change: the prominent study by Brock et al. (1992) analyzed the profitability of technical trading rules and found strong support for the predictability of stock returns from past returns. They concluded that the damnation of technical analysis might have been premature. Since then, technical analysis has become somewhat acceptable in academic circles, and more and more empirical studies have been conducted, providing an extensive body of analysis (for a literature review, see Park & Irwin 2007). However, previous research is predominantly based on historical backtests and bootstraps of different markets. This work investigates the hypothesis that trend-following systems should profit from autocorrelated returns, and contributes to the recent literature by trying to answer the question of why, i.e. under which circumstances, trading systems may work.


SCHEDULING PROBLEMS FOR LOGISTIC PLATFORMS WITH FIXED STAIRCASE COMPONENT ARRIVALS AND VARIOUS DELIVERIES HYPOTHESES

Susana Carrera, Wahiba Ramdane-Cherif and Marie-Claude Portmann

INPL-LORIA, ENSMN, Parc de Saurupt Nancy, France

Abstract. This paper studies the scheduling problem within a logistic platform node, the last node in the supply chain, located just before the final customers. In this node, customer orders are prepared using known quantities of components received from the suppliers. For upstream flows, trucks deliver known component quantities at fixed times. For downstream flows, optimized customer delivery tours are planned at fixed times, except the last one, which takes place at a flexible time corresponding to the end of the schedule. We consider a simplified case with one bottleneck assembling machine. Several subcases are considered, depending on the number of fixed and flexible deliveries and on the chosen criteria. For each subcase, upper and lower bounds are proposed and compared experimentally on heterogeneous instance families. All the ingredients are therefore available to design a branch-and-bound algorithm for these new scheduling problems.


© ORLab Analytics Inc