Excellence in Research and Innovation for Humanity

International Science Index

Commenced in January 1999. Frequency: Monthly. Edition: International. Abstract Count: 50776.

Industrial and Manufacturing Engineering

1277
87679
Industry 4.0: A Scenario for Innovation as an Opportunity for Industrial Designers
Abstract:
We are facing a situation of imminent change in industry, which is heading towards the complete digitalisation and automation of manufacturing processes. This transformation is built on the technological bases of the internet of things (IoT), cyber-physical systems, maker culture, manufacturing 4.0, the cloud, big data, and augmented reality, and it has people at the centre of its development. In many fields, a Fourth Industrial Revolution, or Industry 4.0, is now being spoken about. This new scenario requires the continual development of the industrial designer's work, and designers must continually adapt their knowledge, as traditional models of design are rendered obsolete in a changing and flexible environment that demands immediate response and action, employing design as a focal point in the creation of innovative solutions. In the Industry 4.0 scenario, communication is instantaneous, and limits on connections between a multitude of geographic regions are removed, allowing the exchange of knowledge. Since scholars have so far paid little attention to this specific understanding, the aim of the research presented in this paper is to conceptualize it by defining a new scenario for action. Relevant factors that iteratively integrate the perceptions of various industry case studies and literature research have been identified. This analysis is intended to understand the new reality of Industry 4.0 and its consequences, given the growth of new patterns of innovation to which the designer must adapt. In this situation, designers should use their ability to read signs in their environment to foresee both opportunities and errors, finding solutions that today's society needs and improving people's quality of life.
1276
87292
Factors Affecting Green Supply Chain Management of Lampang Ceramics Industry
Abstract:
This research aims to study the factors that affect the performance of green supply chain management in the Lampang ceramics industry. Data were collected through questionnaires gathered from 20 factories in the Lampang ceramics industry. The research factors are divided into five major groups: green design, green purchasing, green manufacturing, green logistics, and reverse logistics. The questionnaire consisted of four parts relating to green supply chain management factors and general information about the Lampang ceramics industry. The data were then analyzed using descriptive statistics, and the priority of each factor was determined using the analytic hierarchy process (AHP). The understanding of the factors affecting green supply chain management in the Lampang ceramics industry is presented in the summary of results, along with the weight of each factor. The results of this research could contribute to the development of indicators or performance evaluation in the future.
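The AHP prioritisation step described above can be sketched as follows. This is a minimal illustration, not the study's actual analysis: the pairwise comparison matrix below is a made-up example over the five factor groups, and the weights are the normalised principal eigenvector of that matrix.

```python
import numpy as np

# Hypothetical 5x5 pairwise comparison matrix for the five factor groups:
# green design, green purchasing, green manufacturing, green logistics,
# reverse logistics. Values are illustrative, not taken from the study.
A = np.array([
    [1,   3,   2,   4,   5],
    [1/3, 1,   1/2, 2,   3],
    [1/2, 2,   1,   3,   4],
    [1/4, 1/2, 1/3, 1,   2],
    [1/5, 1/3, 1/4, 1/2, 1],
], dtype=float)

# AHP priority weights: the principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio CR = CI / RI, with random index RI = 1.12 for n = 5.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 1.12
print(w.round(3), round(cr, 3))  # CR < 0.1 indicates acceptable consistency
```

In a real application, one such matrix would be elicited from each respondent and the resulting weight vectors aggregated across the 20 factories.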
1275
87114
Understanding the Qualitative Nature of Product Reviews by Integrating Text Processing Algorithm and Usability Feature Extraction
Abstract:
Usability has become a basic requirement from the consumer's perspective, and a product that fails this requirement ends up not being used. Identifying usability issues by analyzing the quantitative and qualitative data collected from usability testing and evaluation activities aids the product design process, yet the lack of studies on analysis methodologies for qualitative text data in the usability field limits the potential of these data for more useful applications. At the same time, analyzing qualitative text data has become feasible with the rapid development of data analysis fields such as natural language processing, which enables computers to understand human language, and machine learning, which provides predictive models and clustering tools. Therefore, this research aims to study the capability of text processing algorithms in the analysis of qualitative text data collected from usability activities. This research utilized datasets collected from an LG neckband headset usability experiment, consisting of headset survey text data, subject data, and product physical data. The analysis procedure, integrated with the text processing algorithm, includes training the comments into a vector space, labeling them with the subject and product physical feature data, and clustering to validate the result of the comment vector clustering. The results show 'volume and music control button' as the usability feature that matches best with the clusters of comment vectors: the centroid comments of one cluster emphasized button positions, while the centroid comments of the other cluster emphasized button interface issues. When the volume and music control buttons were designed separately, participants experienced less confusion, and thus the comments mentioned only the buttons' positions. When the volume and music control buttons were designed as a single button, participants experienced interface issues with the buttons, such as the operating methods of functions and confusion between function buttons. The relevance of the cluster centroid comments to the extracted feature demonstrates the capability of text processing algorithms in analyzing qualitative text data from usability testing and evaluation.
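The vectorise-cluster-inspect-centroids pipeline above can be sketched in miniature. The comments below are invented stand-ins for the survey text (two about button position, two about interface confusion), the vectors are plain bag-of-words counts rather than the paper's trained vector space, and the clustering is a simple 2-means with fixed initialisation:

```python
import numpy as np

# Hypothetical headset comments (not the study's actual data): one group
# about button position, one about button interface / function confusion.
comments = [
    "volume button position too far back",
    "music button position hard to reach",
    "button position awkward on the neckband",
    "confusing which function each button controls",
    "two functions on one button make a confusing interface",
    "operating the button functions is confusing",
]

# Bag-of-words vectors over the shared vocabulary.
vocab = sorted({w for c in comments for w in c.split()})
X = np.array([[c.split().count(w) for w in vocab] for c in comments], float)

# Plain 2-means clustering (fixed initial centroids for reproducibility).
centroids = X[[0, 3]].copy()
for _ in range(20):
    labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
    centroids = np.array([X[labels == k].mean(0) for k in (0, 1)])

# The comment closest to each centroid summarises its cluster's theme.
reps = [comments[np.argmin(((X - c) ** 2).sum(-1))] for c in centroids]
print(labels, reps)
```

On this toy data the position comments and the interface comments separate into the two clusters, mirroring the paper's finding that centroid comments expose the dominant usability theme of each cluster.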
1274
86251
The Role of Dialogue in Shared Leadership and Team Innovative Behavior Relationship
Abstract:
Purpose: The aim of this study was to investigate the impact that dialogue has on the relationship between shared leadership and innovative behavior, and the importance of dialogue in innovation. This study contributes to the literature by providing theorists and researchers a better understanding of how to move forward in the study of moderator variables in the relationship between shared leadership and team outcomes such as innovation. Methodology: A systematic review of the literature, originally adopted from the medical sciences but also used in management and leadership studies, was conducted to synthesize research in a systematic, transparent, and reproducible manner. A final sample of 48 empirical studies was scientifically synthesized. Findings: Shared leadership offers a better solution to team management challenges and goes beyond the classical, hierarchical, or vertical leadership models based on the individual leader approach. One of the outcomes that emerges from shared leadership is team innovative behavior. To intensify the relationship between shared leadership and team innovative behavior, and to understand when it is most effective, the moderating effects of other variables in this relationship should be examined. The synthesis of the empirical studies revealed that dialogue is a moderator variable that affects the relationship between shared leadership and team innovative behavior when leadership is understood as a relational process. Dialogue is an activity between at least two speech partners trying to fulfill a collective goal, and it is a way of living that is open to people and ideas through interaction. Dialogue is productive when team members engage relationally with one another. When this happens, participants are more likely to take responsibility for the tasks they are involved in and for the relationships they have with others.
In this relational engagement, participants are likely to establish high-quality connections with a high degree of generativity. This study suggests that organizations should facilitate dialogue among team members in shared leadership, as it has a positive impact on innovation and offers a more adaptive framework for the leadership needed in teams working on complex tasks. These results highlight the need for more research on the role dialogue plays in contributing to important organizational outcomes such as innovation. Case studies describing both best practices and obstacles to dialogue in team innovative behavior are necessary to gain more detailed insight into the field. It will be interesting to see how all these fields of research evolve and are implemented in dialogue practices in organizations that use team-based structures to deal with uncertainty, fast-changing environments, globalization, and increasingly complex work.
1273
86165
Optimization of Two Quality Characteristics in Injection Molding Processes via Taguchi Methodology
Abstract:
The main objective of this research is to optimize tensile strength and dimensional accuracy in injection molding processes using Taguchi Parameter Design. Taguchi methodology enables the user to identify one parameter setting that improves both quality characteristics of the injection molding process. An L16 orthogonal array (OA) is used in the Taguchi experimental design, with five control factors at four levels each and vibration as a non-controllable factor. A total of 32 experiments were designed to obtain the optimal parameter setting for the process. The optimal parameters identified for shrinkage are: shot volume, 1.7 cubic inches (A₄); mold temperature, 130 ºF (B₁); hold pressure, 3200 psi (C₄); injection speed, 0.61 inch³/sec (D₂); and hold time, 14 seconds (E₂). The optimal parameters identified for tensile strength are: shot volume, 1.7 cubic inches (A₄); mold temperature, 160 ºF (B₄); hold pressure, 3100 psi (C₃); injection speed, 0.69 inch³/sec (D₄); and hold time, 14 seconds (E₂). The Taguchi-based optimization framework was systematically and successfully implemented to obtain an adjusted optimal setting in this research. The mean shrinkage of the confirmation runs is 0.0031%, and the tensile strength was found to be 3148.1 psi. Both outcomes are far better than the baseline, and defects in the injection molding process have been further reduced.
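The two quality characteristics map onto the two standard Taguchi signal-to-noise (S/N) ratios: smaller-the-better for shrinkage and larger-the-better for tensile strength. The sketch below shows the formulas on illustrative replicate measurements (the raw per-run data are not given in the abstract); in a full analysis the S/N ratio would be computed per OA run and averaged per factor level.

```python
import math

def sn_smaller_the_better(y):
    """Smaller-the-better S/N (e.g., shrinkage): -10 * log10(mean(y_i^2))."""
    return -10 * math.log10(sum(v * v for v in y) / len(y))

def sn_larger_the_better(y):
    """Larger-the-better S/N (e.g., tensile strength): -10 * log10(mean(1/y_i^2))."""
    return -10 * math.log10(sum(1 / (v * v) for v in y) / len(y))

# Illustrative replicates for one parameter setting (not the paper's data).
shrinkage = [0.0030, 0.0032, 0.0031]   # percent
tensile = [3120.0, 3155.0, 3148.1]     # psi

print(sn_smaller_the_better(shrinkage))  # higher S/N means less shrinkage
print(sn_larger_the_better(tensile))     # higher S/N means stronger parts
```

In both cases a higher S/N ratio is better, which is what lets Taguchi analysis treat the two characteristics with one "maximize S/N per factor level" rule.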
1272
85979
Implications of Industry 4.0 to Supply Chain Management and Human Resources Management: The State of the Art
Abstract:
Industry 4.0 (I4.0) is a significant and promising research topic that is expected to gain importance due to its effects on key concepts such as cost, resource management, and accessibility. Instead of focusing on those effects in only one area, combining different departments and seeing the big picture helps to make more realistic predictions about the future. The aim of this paper is to identify the implications of Industry 4.0 for both supply chain management and human resources management by finding the topics that lie at their intersection. Another objective is to help readers realize the expected changes in these two areas due to I4.0, so that the necessary steps can be taken in advance, and to make recommendations for catching up with the latest trends. The expected changes are drawn from industry reports and related journal papers in the literature. Based on the literature review, this study is the first to combine Industry 4.0, supply chain management, and human resources management, and it aims to guide future work by identifying the intersections of these three areas. The benefits of I4.0, as well as the number, research areas, and publication years of papers on I4.0 in academic journals, are discussed in this paper. One of the main findings of this research is that a change in labor force qualifications is expected with advancements in technology: there will be a need for higher-level skills from workers, which will directly affect human resources management in terms of recruiting and managing those people. Another main finding, illustrated with an example in the article, is that technological advancements will change the place of production. For instance, 'dark factories', a popular topic of I4.0, will enable manufacturers to produce in places close to their markets. Supply chains are expected to be influenced by that change.
1271
84931
Increasing of Strength of FDM (FFF) 3D-Printed Parts by Influencing on Temperature-Related Parameters of the Process
Abstract:
An average desktop 3D printer based on FDM (FFF) technology is capable of forming a complex-shaped part with no special tooling in a few hours. A relatively low price and total cost of ownership (TCO) allow dozens of desktop printers to be combined into a productive array for the rapid manufacturing of highly customized parts. However, until now these devices have hardly been considered production tools. The limited use of desktop 3D printers for manufacturing functional parts capable of handling significant loads is largely caused by designers' and engineers' lack of trust in this new class of fabrication equipment. This lack of trust is backed by the widespread belief that the strength of 3D-printed parts is dramatically lower than that of parts manufactured using traditional technology. Understanding the peculiarities of 3D printing technology, optimizing parameters and, most importantly, designing parts with these peculiarities in mind allow the strength of monolithic polymer parts to be at least approached. The current work is part of a larger research effort to increase the strength of 3D-printed parts made from biodegradable polymers using affordable manufacturing equipment. It is aimed at promoting the use of 3D printing technology as a means of efficient, distributed, on-demand fabrication based on the use of renewable resources and targeted at local markets. The current work investigates how the strength of printed PLA (polylactic acid) parts depends on the user-controlled parameters that define the temperature conditions at the current layer of the part being fabricated. As a characteristic of part strength, we used the load at which the specimen breaks down in a three-point bend test (UFL). The maximum stresses appearing in the critical section (UFS) can be calculated from the UFL given the geometrical features of the sample.
During the printing process, parts are oriented with the long side along the Z axis; thus, in the bend tests, the maximum stress is applied orthogonally to the layers. Apart from UFL, the mass and density were recorded for each sample, and the sample mesostructure was analyzed using scanning electron microscopy (SEM). The influence of the extrusion temperature, the intensity of model cooling, and the time between printing individual layers is considered. An increase in the printing temperature has a significant effect on the resistance to extrusion, which results in a growth in the volume of extruded polymer. Along with the strength improvement due to the increased amount of material (the area of contact between the layers), a strength gain is also registered due to better cohesion between the layers. Reducing the cooling intensity results in a significant increase in the temperature of the plastic acting as a base for the deposition of the liquid polymer, which also contributes positively to interlayer cohesion. In the course of the experiments, samples with qualitative changes in the mesostructure of the interlayer boundaries were obtained solely by changing the cooling conditions. It is concluded that by influencing the temperature conditions under which a joint between layers is formed, it is possible to achieve a significant increase in the strength of the part.
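The UFL-to-UFS conversion mentioned above follows the standard three-point bend formula for a rectangular cross-section, sigma = 3FL / (2bh²). The dimensions below are illustrative placeholders, not the paper's actual specimen geometry:

```python
# Maximum bending stress at the critical mid-span section of a rectangular
# specimen in a three-point bend test: sigma = 3*F*L / (2*b*h^2).

def three_point_bend_stress(force_n, span_mm, width_mm, height_mm):
    """Return the maximum stress in MPa (N and mm inputs)."""
    return 3 * force_n * span_mm / (2 * width_mm * height_mm ** 2)

# Example: 120 N breaking load (UFL) on a 64 mm span, 10 mm x 4 mm section.
ufs = three_point_bend_stress(120, 64, 10, 4)
print(ufs)  # 72.0 (MPa)
```

Because the specimens are printed with layers stacked along the long axis, this stress acts directly across the interlayer joints, which is why interlayer cohesion dominates the measured strength.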
1270
83569
Modeling Intelligent Threats: Case of Continuous Attacks on a Specific Target
Abstract:
In this paper, we treat a model that falls in the area of protecting targeted systems from intelligent threats including terrorism. We introduce the concept of system survivability, in the context of continuous attacks, as the probability that a system under attack will continue operation up to some fixed time t. We define a constant attack rate (CAR) process as an attack on a targeted system that follows an exponential distribution. We consider the superposition of several CAR processes. From the attacker side, we determine the optimal attack strategy that minimizes the system survivability. We also determine the optimal strengthening strategy that maximizes the system survivability under limited defensive resources. We use operations research techniques to identify optimal strategies of each antagonist. Our results may be used as interesting starting points to develop realistic protection strategies against intentional attacks.
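The survivability measure defined above has a simple closed form under the stated assumptions: if each CAR process attacks with exponential rate lambda_i, the superposition is a Poisson process with rate sum(lambda_i), so the probability of surviving (no attack) up to time t is S(t) = exp(-(lambda_1 + ... + lambda_n) * t). The rates below are illustrative, not from the paper:

```python
import math

def survivability(rates, t):
    """S(t) = exp(-sum(rates) * t): probability of no attack from the
    superposition of constant attack rate (CAR) processes up to time t."""
    return math.exp(-sum(rates) * t)

# Three illustrative CAR processes (attacks per unit time).
rates = [0.02, 0.05, 0.01]
print(survivability(rates, 10.0))
```

The attacker's and defender's optimization problems in the paper then amount to shifting these rates (or the system's effective exposure to them) to minimize or maximize S(t) under resource constraints.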
1269
83520
Storage System Validation Study for Raw Cocoa Beans Using Minitab® 17 and R (R-3.3.1)
Abstract:
In this observational study, the performance of a known conventional storage system was tested and evaluated for fitness for its intended purpose. The system's scope was extended to the storage of dry cocoa beans; its sensitivity, reproducibility, and uncertainties are not known in detail. This study discusses the system's performance in the context of the existing literature on factors that influence the quality of cocoa beans during storage. Controlled conditions were defined precisely for the system to give a reliable baseline within specific established procedures. Minitab® 17 and the R statistical software (R-3.3.1) were used for the statistical analyses. The approach to testing the storage system was to observe and compare, through laboratory test methods, the quality of the cocoa bean samples before and after storage. The samples were kept in Kilner jars, and the temperature of the storage environment was controlled and monitored over a period of 408 days. Standard test methods used in the international cocoa trade, such as the cut test analysis, moisture determination with an Aqua Boy KAM III meter, and bean count determination, were used for quality assessment. The data analysis treated the entire population as a sample in order to establish a reliable baseline for the data collected. The study found statistically significant mean values at the 95% confidence interval (CI) for the performance data analysed before and after storage for all variables observed. Correlation graphs showed a strong positive correlation for all variables investigated, with the exception of All Other Defects (AOD). The weak relationship between the before and after data for AOD had an explained variability of 51.8%, with the unexplained variability attributable to the uncontrolled condition of hidden infestation before storage. The study concluded that the storage system meets a high performance criterion.
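The "explained variability" figure above is the squared Pearson correlation between the before- and after-storage measurements (r² = 0.518 implies r ≈ 0.72 for AOD). A minimal sketch of that computation, on placeholder data rather than the study's measurements:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Illustrative before/after readings for one quality variable.
before = [4.1, 5.0, 3.8, 6.2, 5.5]
after = [4.6, 5.4, 4.9, 6.0, 5.2]
r = pearson_r(before, after)
print(r, r * r)  # r^2 is the explained variability
```

Minitab and R report the same quantity as R-Sq in their correlation and regression output.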
1268
83261
Structural Equation Modelling Based Approach to Integrate Customers and Suppliers with Internal Practices for Lean Manufacturing Implementation in the Indian Context
Abstract:
Lean management is an integrated socio-technical system for bringing about a competitive state in an organization. The purpose of this paper is to explore and integrate the role of customers and suppliers with the internal practices of the Indian manufacturing industries towards the successful implementation of lean manufacturing (LM). An extensive literature survey is carried out, and an attempt is made to build an exhaustive list of all the input manifests related to customers, suppliers, and internal practices necessary for LM implementation, coupled with a similar exhaustive list of the benefits accrued from its successful implementation. A structural model is thus conceptualized, which is empirically validated based on data from the Indian manufacturing sector. With the current impetus on developing the industrial sector, the Government of India recently introduced the Lean Manufacturing Competitiveness Scheme, which aims to increase competitiveness with the help of lean concepts. There is huge scope to enrich Indian industries with the benefits of lean, as the implementation status is quite low, and hardly any survey-based empirical study in India has been found to integrate customers and suppliers with internal processes towards successful LM implementation. This empirical research is thus carried out in the Indian manufacturing industries. The basic steps of the research methodology are: identification of the input and output manifest variables and latent constructs; model proposition and hypotheses development; development of the survey instrument; sampling and data collection; and model validation (exploratory factor analysis, confirmatory factor analysis, and structural equation modeling). The analysis reveals six key input constructs and three output constructs, indicating that these constructs should act in unison to maximize the benefits of implementing lean.
The structural model presented in this paper may be treated as a guide to integrating customers and suppliers with internal practices in order to implement lean successfully. Integrating customers and suppliers with internal practices into a unified, coherent manufacturing system will lead to optimum utilization of resources. This work is one of the first studies to offer a survey-based empirical analysis of the role of customers, suppliers, and the internal practices of the Indian manufacturing sector in effective lean implementation.
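A common first screening step in the exploratory factor analysis mentioned in the methodology is to compute the correlation matrix of the manifest (survey item) variables and retain as many factors as there are eigenvalues above 1 (the Kaiser criterion). The sketch below uses random placeholder responses, not the study's survey data:

```python
import numpy as np

# 120 hypothetical respondents rating 9 manifest items on a 1-5 scale.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(120, 9)).astype(float)

# Correlation matrix of the items and its eigenvalues (descending).
corr = np.corrcoef(responses, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]

# Kaiser criterion: keep factors whose eigenvalue exceeds 1.
n_factors = int((eigvals > 1.0).sum())
print(eigvals.round(2), n_factors)
```

With real (correlated) survey data the retained factors would then be rotated and interpreted as the latent constructs, before confirmatory factor analysis and structural equation modeling fix and test the full model.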
1267
83028
Capacitated Multiple Allocation P-Hub Median Problem on a Cluster Based Network under Congestion
Abstract:
This paper considers a hub location problem in which the network service area is partitioned into predetermined zones (represented by given node clusters) and the capacity levels of potential hub nodes are determined a priori as a hub selection criterion, in order to investigate the effect of congestion on the network. The objective is to design the hub network by determining all required hub locations within the node clusters and allocating non-hub nodes to hubs such that the total cost, including transportation cost, hub opening cost, and a penalty cost for exceeding the capacity level at hubs, is minimized. A mixed integer linear programming model is developed by introducing additional constraints to the traditional capacitated multiple allocation hub location model, and it is tested empirically.
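The three-part objective (transportation cost + hub opening cost + capacity-exceed penalty) can be illustrated with a toy brute-force search. This is not the paper's MILP: the instance is made up, there is only one candidate hub per illustrative cluster pair, and each node is allocated to its single cheapest open hub rather than via the multiple allocation formulation. It only demonstrates how the penalty term trades off against opening more hubs:

```python
from itertools import combinations

# Toy instance: transport cost from each node to each candidate hub,
# hub opening costs, node capacities per hub, and a per-node penalty
# for exceeding a hub's capacity. All numbers are illustrative.
cost = {  # node -> {candidate hub: transport cost}
    0: {0: 0, 2: 4}, 1: {0: 3, 2: 5}, 2: {0: 4, 2: 0},
    3: {0: 6, 2: 2}, 4: {0: 5, 2: 3},
}
open_cost = {0: 10, 2: 8}
capacity = {0: 3, 2: 2}   # max nodes a hub can serve without congestion
penalty = 7               # cost per node above a hub's capacity

def total_cost(hubs):
    # Each node goes to its cheapest open hub (single-allocation simplification).
    alloc = {n: min(hubs, key=lambda h: cost[n][h]) for n in cost}
    transport = sum(cost[n][alloc[n]] for n in cost)
    over = sum(max(0, sum(1 for n in cost if alloc[n] == h) - capacity[h])
               for h in hubs)
    return transport + sum(open_cost[h] for h in hubs) + penalty * over

# Enumerate all non-empty hub sets and pick the cheapest design.
candidates = [hubs for r in (1, 2) for hubs in combinations((0, 2), r)]
best = min(candidates, key=total_cost)
print(best, total_cost(best))
```

In the actual model this enumeration is replaced by binary location and allocation variables in a MILP, with the congestion penalty expressed through additional capacity-level constraints.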
1266
82437
Understanding Ambivalent Behaviors of Social Media Users toward the 'Like' Function: A Social Capital Perspective
Abstract:
The 'Like' function in social media platforms represents the immediate responses of social media users to postings and other users. A large number of 'likes' is often attributed to fame, agreement, and support from others, which many users are proud of and happy with. However, what 'like' implies exactly in the social media context is still under discussion. Some argue that it is an accurate parameter of the preferences of social media users, whereas others counter that it is merely an instant reaction that is volatile and vague. To address this gap, this study investigates how social media users perceive the 'like' function and how they behave differently based on their perceptions. This study posits the following arguments. First, 'like' is interpreted as a quantified form of social capital residing in social media platforms. This incarnated social capital rationalizes people's attraction to social media and the belief that social media platforms bring benefits to their relationships with others. This social capital is then conceptualized into cognitive and emotive dimensions, where social capital in the cognitive dimension represents the quantitative awareness of 'likes', whereas social capital in the emotive dimension represents the qualitative reception of 'likes'. Finally, the ambivalent perspective of social media users on 'like' (i.e., social capital) is applied. This view explains why social media users appreciate receiving 'likes' from others yet are aware that those 'likes' can distort the actual responses of other users by sending erroneous signals. The rationale for this ambivalence is based on whether users perceive social media as a private or public sphere: when social media is more public, the ambivalence is more strongly observed. By combining the ambivalence and the dimensionality of social capital, four types of social media users with different liking-behavior mechanisms are identified.
To validate this work, a survey of 300 social media users was conducted. The analysis results support most of the hypotheses and confirm that people have ambivalent perceptions of 'like' as social capital and that these perceptions influence behavioral patterns. The implications of the study are clear. First, this study explains why social media users exhibit different behaviors toward 'likes' in social media. Although most people believe that the number of 'likes' is the simplest and most frank measure of support from other social media users, this study identifies users who do not trust 'likes' as a stable and reliable parameter of social media. In addition, this study links the concept of social media openness to explain the different behaviors of social media users. Social media openness has theoretical significance because it defines the psychological boundaries of social media from the users' perspective.
1265
82200
A Location Routing Model for the Logistic System in the Mining Collection Centers of the Northern Region of Boyacá-Colombia
Abstract:
The main objective of this study is to design a mathematical model for the logistics of the mining collection centers in the northern region of the department of Boyacá (Colombia), determining the structure that facilitates the flow of products along the supply chain. To achieve this, a suitable design of the distribution network must be defined, taking into account the products, the customers' characteristics, and the availability of information. Likewise, other aspects must be defined, such as the number and capacity of the collection centers to be established and the routes to be taken to deliver products to customers, among others. This research uses the Location Routing Problem (LRP), an operations research problem applied to the design of distribution networks.
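The defining feature of the LRP is that facility location and vehicle routing are decided jointly rather than sequentially. The toy sketch below illustrates this coupling with invented coordinates and costs (not Boyacá data): for each candidate collection center, the opening cost is added to the cost of the best single delivery route over all customers, and the cheapest combination wins.

```python
from itertools import permutations
import math

# Candidate collection centers: (location, opening cost). Illustrative only.
centers = {"C1": ((0, 0), 12.0), "C2": ((6, 5), 9.0)}
customers = [(2, 1), (4, 3), (5, 5)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def best_route_cost(depot):
    # Exhaustive search over visiting orders (fine for a handful of customers).
    return min(
        dist(depot, order[0])
        + sum(dist(order[i], order[i + 1]) for i in range(len(order) - 1))
        + dist(order[-1], depot)
        for order in permutations(customers)
    )

# Joint decision: opening cost plus routing cost, per candidate center.
total = {name: opening + best_route_cost(loc)
         for name, (loc, opening) in centers.items()}
best = min(total, key=total.get)
print(best, round(total[best], 2))
</nowiki>```

Note how C2's higher routing convenience can outweigh a lower opening cost elsewhere; solving location and routing separately can miss exactly this trade-off, which is the LRP's motivation.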
1264
81601
Organisational Factors and Total Quality Management Practice in Nigeria Manufacturing Industry: Evidence from Honeywell Flour Mills Plc
Abstract:
The Nigerian manufacturing industry, particularly its flour-producing firms, plays a vital role in the Nigerian economy. This sector's quality management practice has been given little attention, and the organizational factors that hinder the successful practice of total quality management need to be documented. Honeywell Flour Mills Plc operates in Nigeria with an appreciable number of products serving this sector of the economy. The internal and external disposition of the company and its total quality practice deserve elucidation. Hence, this study examined the influence of organizational factors on the total quality management practice of the Nigerian manufacturing industry, using Honeywell Flour Mills Plc as a case study. The study employed a correlational descriptive survey research design. The population consisted of 656 staff of Honeywell Flour Mills Plc, out of which 235 members were selected through the scientific sampling method developed by Paler-Calmorin and Calmorin. A total of 235 copies of a questionnaire titled 'Organisational Factors and Total Quality Management Practices (QF-TQM) Questionnaire' were administered, with 66 copies returned. The following variables were applied: internal organisational factors (IOFs), external organisational factors (EOFs), and total quality management (TQM). The data generated were analysed using frequency distribution and regression analysis at the 0.05 level. The findings revealed that IOFs were positively and significantly related to TQM (r = .147, N = 64, p < .01), while EOFs were negatively and significantly related to TQM (r = -.117, N = 64, p < .01). The findings also showed that internal and external organizational factors jointly influenced TQM practice (F(2, 61) = 22.250; R² = .629; Adj. R² = .603; p < .05). The study concluded that organizational factors are determinants of TQM practice in the Nigerian manufacturing industry.
It is recommended that both the internal and external organizational factors influencing TQM practice be considered in the development of TQM strategies.
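The joint-influence step above is a two-predictor multiple regression. A minimal sketch of the fit statistics it reports (R² and adjusted R², with n = 64 respondents and k = 2 predictors) on synthetic placeholder scores, not the survey responses:

```python
import numpy as np

# Synthetic IOF/EOF predictor scores and TQM outcome for n = 64, k = 2.
rng = np.random.default_rng(1)
n, k = 64, 2
X = rng.normal(size=(n, k))
y = 0.8 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.5, size=n)

# Ordinary least squares with an intercept column.
Xd = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)

# R^2 and adjusted R^2: Adj = 1 - (1 - R^2)(n - 1)/(n - k - 1).
resid = y - Xd @ beta
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(round(r2, 3), round(adj_r2, 3))
```

The degrees of freedom in the reported F statistic, F(2, 61), follow the same bookkeeping: k = 2 regression degrees of freedom and n - k - 1 = 61 residual degrees of freedom.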
1263
80922
A Real-Time Simulation Environment for Avionics Software Development and Qualification
Abstract:
The development of guidance, navigation, and control algorithms and avionic procedures requires the availability of suitable analysis and verification tools, such as simulation environments, which support the design process and allow potential problems to be detected prior to flight testing, so that new technologies can be made available at reduced cost, time, and risk. This paper presents a simulation environment for avionic software development and qualification, aimed especially at equipment for general aviation aircraft and unmanned aerial systems. The simulation environment includes models of short- and medium-range radio navigation aids, flight assistance systems, and ground control stations. All the software modules can simulate the modeled systems in both fast-time and real-time tests, and were implemented following component-oriented modeling techniques and a requirement-based approach. The paper describes the specific features of the models, the architectures of the implemented software systems, and their validation process. The validation tests performed highlighted the capability of the simulation environment to guarantee the required functionality and performance of the simulated avionics systems in real time, as well as to reproduce the interaction between these systems, thus permitting a realistic and reliable simulation of a complete mission scenario.
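The fast-time versus real-time distinction above comes down to whether the fixed-step update loop runs as fast as possible or paces itself against the wall clock. A minimal sketch, with the step size and the trivial model update as placeholders for an actual avionics component:

```python
import time

DT = 0.01  # 100 Hz simulation step (illustrative)

def step(state):
    # Placeholder for one model update (e.g., a radio navigation aid).
    return state + DT

def run(steps, real_time=False):
    state, t0 = 0.0, time.perf_counter()
    for i in range(steps):
        state = step(state)
        if real_time:
            # Sleep until the wall clock catches up with simulated time.
            lag = t0 + (i + 1) * DT - time.perf_counter()
            if lag > 0:
                time.sleep(lag)
    return state

print(run(100))                   # fast-time: completes immediately
print(run(100, real_time=True))   # real-time: takes about 1 s of wall clock
```

In a qualification setting the real-time mode additionally has to monitor overruns (steps where `lag` is negative), since a model that cannot keep up with its step size cannot guarantee the required real-time performance.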
1262
80587
Continuous Improvement as an Organizational Capability in the Industry 4.0 Era
Abstract:
Continuous improvement is increasingly becoming a prerequisite for manufacturing companies to remain competitive in a global market. In addition, future survival and success will depend on the ability to manage the forthcoming digitalization transformation in the Industry 4.0 era. Industry 4.0 promises substantially increased operational effectiveness, where all equipment is equipped with integrated processing and communication capabilities. Subsequently, the interplay of humans and technology will evolve and influence the range of worker tasks and demands. Taking these changes into account, the concept of continuous improvement must evolve accordingly. Based on a case study from the manufacturing industry, the purpose of this paper is to point out what the concept of continuous improvement will encounter and must take into consideration when entering the fourth industrial revolution. In the past, continuous improvement focused on a culture of sustained improvement targeting the elimination of waste in all systems and processes of an organization by involving everyone. Today, it must evolve with the forthcoming digital transformation and the increased interplay of humans and digital communication systems to reach its full potential. One of the main findings of this study is that digital communication systems will act as an enabler to strengthen the continuous improvement process by moving from collaboration within individual teams to the interconnection of teams along the product value chain. For academics and practitioners, this will help them identify and prioritize their steps towards an Industry 4.0 implementation with an integrated focus on continuous improvement.
Digital Article Identifier (DAI):
1261
79972
A Concept for Flexible Battery Cell Manufacturing from Low to Medium Volumes
Abstract:
The competitiveness and success of new electrical energy storage devices such as battery cells depend significantly on a short time-to-market. Producers who decide to supply new battery cells to the market need to adapt their manufacturing easily to early customers' needs in terms of cell size, materials, delivery time, and quantity. In the initial state, the required output rates neither justify a fully automated manufacturing line nor allow the supply of handmade battery cells. Until now, there has been no solution for manufacturing battery cells in low to medium volumes in a reproducible way. Thus, a concept for the flexible assembly of battery cells, in terms of cell format and output quantity, was developed by the Fraunhofer Institute for Manufacturing Engineering and Automation. Based on clustered processes, the modular system platform can be modified, enlarged, or retrofitted in a short time frame according to the ordered product. The paper shows the analysis of the production steps of a conventional battery cell assembly line. Process solutions were found by using I/O analysis, functional structures, and morphological boxes. The identified elementary functions were subsequently clustered by functional coherence into automation solutions, and thus the individual process clusters were generated. The result presented in this paper enables different cell products to be manufactured on the same production system using seven process clusters. The paper shows the solution for a batch-wise flexible battery cell production using advanced process control. Furthermore, the tests performed and the benefits of using the process clusters as cyber-physical systems for an integrated production and value chain are discussed. The solution lowers the hurdles for SMEs to launch innovative cell products on the global market.
Digital Article Identifier (DAI):
1260
79811
A Petri Net Model to Obtain the Throughput of Unreliable Production Lines in the Buffer Allocation Problem
Abstract:
A production line designer faces several challenges in manufacturing system design. One of them is the assignment of buffer slots between every pair of machines in the production line in order to maximize the throughput of the whole line, which is known as the Buffer Allocation Problem (BAP). The BAP is a combinatorial problem whose size depends on the number of machines and the total number of slots to be distributed along the production line. In this paper, we propose a Petri Net (PN) model to obtain the throughput of unreliable production lines, based on PN mathematical tools and the decomposition method. The results obtained with this methodology are similar to those presented in previous works, and the number of machines is not a hard restriction.
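As a rough illustration of why throughput depends on the buffer allocation, the sketch below simulates a two-machine line with geometric failures and repairs and one intermediate buffer. It is a plain slot-based simulation, not the Petri net decomposition used in the paper, and all parameters are invented.

```python
import random

def simulate_line(p_fail, p_repair, buffer_size, slots=200_000, seed=1):
    """Estimate the throughput (parts per slot) of a two-machine line
    with an intermediate buffer; each machine fails with prob. p_fail
    and is repaired with prob. p_repair per slot."""
    rng = random.Random(seed)
    up = [True, True]
    buf, produced = 0, 0
    for _ in range(slots):
        # Geometric up/down transitions for each machine
        for i in range(2):
            up[i] = rng.random() >= p_fail[i] if up[i] else rng.random() < p_repair[i]
        # M1 puts a part in the buffer unless blocked (buffer full)
        if up[0] and buf < buffer_size:
            buf += 1
        # M2 takes a part unless starved (buffer empty)
        if up[1] and buf > 0:
            buf -= 1
            produced += 1
    return produced / slots

throughput = simulate_line(p_fail=[0.01, 0.01], p_repair=[0.1, 0.1], buffer_size=5)
```

Re-running the scan over candidate buffer sizes is exactly the evaluation step the BAP repeats inside its combinatorial search, which is why fast analytical models such as the proposed PN are valuable.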
Digital Article Identifier (DAI):
1259
79616
Dimensional Accuracy of CNTs/PMMA Parts and Holes Produced by Laser Cutting
Abstract:
Laser cutting is a very common production method for cutting 2D polymeric parts. The development of polymer composites with nanofibers makes their other properties, such as laser workability, important as well. The aim of this research is to investigate the influence of different laser cutting conditions on the dimensional accuracy of parts and holes made of poly(methyl methacrylate) (PMMA)/carbon nanotube (CNT) material. Experiments were carried out considering CNT content (at four levels: 0, 0.5, 1, and 1.5 wt.%), laser power (60, 80, and 100 W), and cutting speed (20, 30, and 40 mm/s) as input variables. The results reveal that adding CNTs improves the laser workability of PMMA and that increasing the power has a significant effect on part and hole size. The findings also show that cutting speed is an effective parameter for size accuracy. Finally, a statistical analysis of the results was performed, and regression equations are presented for determining the relationship between the input and output factors.
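The regression step described can be sketched with ordinary least squares on the three input factors; the measurement values below are illustrative placeholders, not the paper's data.

```python
import numpy as np

# Hypothetical runs: CNT content (wt.%), laser power (W), cutting speed (mm/s)
X_raw = np.array([
    [0.0, 60, 20], [0.5, 80, 30], [1.0, 100, 40],
    [1.5, 60, 30], [0.0, 80, 40], [1.0, 60, 40],
    [0.5, 100, 20], [1.5, 100, 30], [1.0, 80, 20],
])
# Hypothetical response: dimensional error of the cut hole (mm)
y = np.array([0.30, 0.22, 0.25, 0.15, 0.35, 0.28, 0.18, 0.12, 0.20])

# Fit error = b0 + b1*CNT + b2*power + b3*speed by least squares
X = np.column_stack([np.ones(len(X_raw)), X_raw])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
```

The fitted `coef` vector is the kind of regression equation the paper reports for relating the input and output factors.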
Digital Article Identifier (DAI):
1258
79287
Timely Detection and Identification of Abnormalities for Process Monitoring
Abstract:
The detection and identification of abnormalities in multivariate manufacturing processes are quite important for maintaining good product quality. Unusual behaviors or events encountered during operation can have a serious impact on the process and product quality; thus, they should be detected and identified as soon as possible. This paper focuses on the efficient representation of process measurement data for detecting and identifying abnormalities. The qualitative method used is effective in representing the fault patterns of process data. In addition, it is insensitive to measurement noise, so reliable outcomes can be obtained. To evaluate its performance, a simulation process was utilized, and the effect of adopting linear and nonlinear methods in the detection and identification was tested with different simulation data. The use of a nonlinear technique produced more satisfactory and more robust results for the simulation data sets. This monitoring framework can help operating personnel detect the occurrence of process abnormalities and identify their assignable causes on an on-line or real-time basis.
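A common linear baseline for this kind of multivariate monitoring is PCA with Hotelling's T² statistic; the sketch below uses it on synthetic data for illustration only (the paper's specific qualitative representation and its nonlinear variant are not reproduced).

```python
import numpy as np

def fit_pca_monitor(X_train, n_comp=2):
    """Fit a PCA monitor on normal-operation data."""
    mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
    Z = (X_train - mu) / sd
    _, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                        # loading vectors
    lam = S[:n_comp] ** 2 / (len(Z) - 1)     # retained component variances
    return mu, sd, P, lam

def t2_statistic(x, mu, sd, P, lam):
    """Hotelling's T^2 of one new sample in the retained subspace."""
    t = ((x - mu) / sd) @ P                  # scores of the sample
    return float(np.sum(t ** 2 / lam))

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 4))          # synthetic normal operating data
mu, sd, P, lam = fit_pca_monitor(X_train)

x_normal = rng.normal(size=4)
x_fault = mu + 8 * sd * P[:, 0]              # large shift along the first PC
t2_ok, t2_bad = (t2_statistic(x, mu, sd, P, lam) for x in (x_normal, x_fault))
```

A T² value exceeding a control limit flags an abnormality; identification then proceeds by examining which variables contribute to the excess.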
Digital Article Identifier (DAI):
1257
79215
Third Party Logistics (3PL) Selection Criteria for an Indian Heavy Industry Using SEM
Abstract:
In the present paper, we propose an integrated approach for 3PL supplier selection that suits the distinctive strategic needs of an outsourcing organization in the southern part of India. Four fundamental criteria have been used, namely Performance, IT, Service, and Intangibles, which are further subdivided into fifteen sub-criteria. The proposed strategy combines Structural Equation Modeling (SEM) and non-additive fuzzy integral methods. The introduction of fuzziness deals with the vagueness of human judgments. The SEM approach has been used to validate the selection criteria for the proposed model, while the non-additive fuzzy integral approach uses the SEM model output to evaluate a supplier selection score. The case organization has an exclusive vertically integrated assembly that comprises several companies focusing on a narrow portion of the value chain. To ensure manufacturing and logistics proficiency, it relies significantly on 3PL suppliers to attain supply chain excellence. However, 3PL supplier selection is an intricate decision-making procedure involving multiple selection criteria. The goal of this work is to identify the crucial 3PL selection criteria by using the non-additive fuzzy integral approach. Unlike traditional multi-criteria decision-making (MCDM) methods, which frequently assume independence among criteria and additive importance weights, the non-additive fuzzy integral is an effective method to resolve the dependency among criteria, vague information, and the inherent fuzziness of human judgment. In this work, we validate an empirical case that employs the non-additive fuzzy integral to assess the importance weights of the selection criteria and indicate the most suitable 3PL supplier.
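The non-additive (Choquet) fuzzy integral at the heart of this approach can be computed directly once a fuzzy measure over coalitions of criteria is given; the two-criterion measure and scores below are hypothetical, not the paper's elicited values.

```python
def choquet_integral(scores, mu):
    """Discrete Choquet integral of criterion scores with respect to a
    fuzzy measure mu, given as a dict mapping frozensets of criteria to
    [0, 1] (mu must be monotone, with mu(empty) = 0 and mu(all) = 1)."""
    crits = sorted(scores, key=scores.get)       # criteria in ascending score order
    total, prev = 0.0, 0.0
    for i, c in enumerate(crits):
        coalition = frozenset(crits[i:])         # criteria scoring at least scores[c]
        total += (scores[c] - prev) * mu[coalition]
        prev = scores[c]
    return total

# Hypothetical 3PL example with two interacting criteria
mu = {frozenset(): 0.0,
      frozenset({"performance"}): 0.6,
      frozenset({"service"}): 0.5,
      frozenset({"performance", "service"}): 1.0}  # 0.6 + 0.5 > 1: non-additive
score = choquet_integral({"performance": 0.8, "service": 0.4}, mu)
```

Because the coalition weights need not add up, the integral can reward or penalize suppliers whose strengths overlap, which a simple weighted average cannot express.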
Digital Article Identifier (DAI):
1256
79172
Measuring Industrial Resiliency Using a Data Envelopment Analysis Approach
Abstract:
Given the several crises that have affected industrial sector performance in past decades, decision makers need a measurement application that enables them to measure industrial resiliency more precisely. This research provides not only a framework for the development of a resilience measurement application, but also theoretical building blocks for the concept, such as performance measurement management and resilience engineering in real-world environments. This research is a continuation of a previously published paper on performance measurement in the industrial sector. Finally, this paper contributes an alternative performance measurement method for the industrial sector based on the resilience concept. Moreover, this research demonstrates how applicable the concept of resilience engineering is, along with its method of measurement.
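In the simplest single-input, single-output case, the DEA (CCR) efficiency score reduces to each unit's output/input ratio normalized by the best observed ratio; the sketch below illustrates this special case with made-up resilience figures (a full multi-factor DEA requires solving one linear program per unit).

```python
def dea_single_ratio(inputs, outputs):
    """CCR efficiency for the single-input, single-output case: the LP
    collapses to normalizing each unit's output/input ratio by the best
    ratio. An illustrative special case, not a full multi-factor DEA."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical plants: input = recovery cost, output = restored capacity
eff = dea_single_ratio(inputs=[10, 8, 12], outputs=[20, 20, 18])
```

A unit with efficiency 1.0 lies on the empirical frontier; the others are benchmarked against it, which is what makes DEA attractive for comparing resiliency across plants without a prior weighting scheme.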
Digital Article Identifier (DAI):
1255
79137
Designing a Price Stability Model for Red Cayenne Pepper Prices in Wonogiri District, Central Java, Using the ARCH/GARCH Method
Abstract:
The food and agricultural sector is the biggest contributor to inflation in Indonesia. In Wonogiri district in particular, red cayenne pepper was the biggest contributor to inflation in 2016. National statistics show that over the last five years red cayenne pepper has had the highest average level of fluctuation among all commodities. Several factors, such as the supply chain, price disparity, production quantity, crop failure, and oil prices, are possible causes of the high volatility of the red cayenne pepper price. Therefore, this research tries to find the key factor causing the fluctuation of red cayenne pepper prices by using the ARCH/GARCH method, which can accommodate the presence of heteroscedasticity in time series data. The research finds that the second level of the supply chain is statistically the biggest contributor to inflation, with a coefficient of 3.35 in the fluctuation forecasting model of the red cayenne pepper price. This model can serve as a reference for the government in determining the appropriate policy to maintain the price stability of red cayenne pepper.
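A GARCH(1,1) process, the simplest member of the model family used here, can be simulated in a few lines to show the volatility clustering that motivates the method; the parameters are illustrative, not the estimates obtained for pepper prices.

```python
import numpy as np

def simulate_garch11(omega, alpha, beta, n, seed=0):
    """Simulate returns r_t = sigma_t * z_t with conditional variance
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2
    (requires alpha + beta < 1 for stationarity)."""
    rng = np.random.default_rng(seed)
    r = np.empty(n)
    var = omega / (1 - alpha - beta)     # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(var) * rng.standard_normal()
        var = omega + alpha * r[t] ** 2 + beta * var
    return r

returns = simulate_garch11(omega=0.1, alpha=0.1, beta=0.8, n=5000)
```

Large shocks raise the next period's variance, producing the bursts of volatility that ordinary constant-variance regression cannot capture; estimation then fits omega, alpha, and beta to the observed price returns.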
Digital Article Identifier (DAI):
1254
79065
Developing a Total Quality Management Model Using Structural Equation Modeling for Indonesian Healthcare Industry
Abstract:
This paper presents a Total Quality Management (TQM) model for the Indonesian healthcare industry. Currently, there are nine TQM practices in the healthcare industry, but these practices are not yet integrated. Therefore, this paper aims to integrate these practices into a single model by using Structural Equation Modeling (SEM). After administering about 210 questionnaires to various stakeholders of this industry, the LISREL program was used to evaluate the model's fitness. The result confirmed that the model fits, because the p-value was about 0.45, above the required 0.05. This signifies that the nine TQM practices mentioned above can be integrated into an Indonesian healthcare model.
Digital Article Identifier (DAI):
1253
79020
Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets
Abstract:
Maintaining the quality of manufactured products at a desired level depends on the stability of the process dispersion and location parameters and on detecting perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring to monitor product quality and control the process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of the process dispersion and location parameters, respectively, under the assumption of independent and normally distributed data. On the other hand, there is no guarantee that real-world data will be normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g., occasional outliers in the rational subgroups of the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in the detection of inferior products in Phase II. For a more efficient application of control charts, estimators that are robust against the contaminations that may exist in Phase I are required. In the current study, we present a simple approach to constructing robust Xbar control charts, using the average distance to the median, the Qn estimator of scale, and the M-estimator of scale with logistic psi-function to estimate the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator, and the M-estimator of location with Huber and logistic psi-functions to estimate the process location parameter.
The Phase I efficiency of the proposed estimators and the Phase II performance of the Xbar charts constructed from them are compared with the conventional mean and standard deviation statistics, both under normality and against diffuse-localized and symmetric-asymmetric contaminations, using 50,000 Monte Carlo simulations in MATLAB. It is found that the robust estimators yield parameter estimates with higher efficiency against all types of contamination, and that Xbar charts constructed using robust estimators have higher power in detecting disturbances than conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups and employing different combinations of dispersion and location estimators for subgroups and individual observations are found to improve the performance of Xbar charts.
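As one concrete pairing from the estimator families above, the sketch below combines the Hodges-Lehmann location estimate with the average distance to the median for dispersion, on synthetic Phase I data containing one gross outlier; the consistency factor and the data are illustrative, and the study evaluates several further combinations.

```python
import numpy as np
from itertools import combinations_with_replacement

def hodges_lehmann(x):
    """Hodges-Lehmann location estimate: median of all pairwise
    (Walsh) averages, including each point with itself."""
    avgs = [(a + b) / 2 for a, b in combinations_with_replacement(x, 2)]
    return float(np.median(avgs))

def robust_xbar_limits(subgroups, n):
    """Center line and 3-sigma-style limits from Phase I subgroups,
    using Hodges-Lehmann for location and the average distance to the
    median (ADM) for dispersion."""
    centers = [hodges_lehmann(g) for g in subgroups]
    cl = float(np.median(centers))
    adm = np.mean([np.mean(np.abs(g - np.median(g))) for g in subgroups])
    sigma_hat = adm * np.sqrt(np.pi / 2)   # approx. consistency factor under normality
    half = 3 * sigma_hat / np.sqrt(n)
    return cl - half, cl, cl + half

rng = np.random.default_rng(1)
phase1 = rng.normal(10, 1, size=(25, 5))   # 25 rational subgroups of size 5
phase1[0, 0] = 50.0                        # a single gross outlier
lcl, cl, ucl = robust_xbar_limits(phase1, n=5)
```

The single outlier barely moves the median-of-medians center line, whereas the conventional grand mean and pooled standard deviation would both be inflated by it.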
Digital Article Identifier (DAI):
1252
79000
An Anthropometric and Postural Risk Assessment of Students in Computer Laboratories of a State University
Abstract:
Ergonomics considers the capabilities and limitations of a person as they interact with tools, equipment, facilities, and tasks in their work environment. The workplace is one element of the physical work environment, be it a workbench or a desk. In school laboratories, sitting is the most common working posture of students, who maintain a static sitting posture as they perform different computer-aided activities. The College of Engineering and the College of Information and Communication Technology of a state university comprise twenty-two computer laboratories. Students are not usually aware of the importance of sustaining proper sitting posture during their long computer laboratory activities. The study evaluates the perceived discomfort and working postures of students as they are exposed to the current workplace design of the computer laboratories. It utilizes the Rapid Upper Limb Assessment (RULA), a Body Discomfort Chart using Borg's CR-10 rating scale, and the Quick Exposure Checklist in order to assess posture and the current working conditions. The results of the study may help minimize the body discomfort experienced by students. The researchers redesigned the individual workstations, including the working desk, sitting stool, and other workplace design components. The economic viability of each alternative was also considered, given that the study focused on improving the facilities of a state university.
Digital Article Identifier (DAI):
1251
78980
Controlling the Process of a Chicken Dressing Plant through Statistical Process Control
Abstract:
In a manufacturing firm, controlling the process ensures that optimum efficiency, productivity, and quality are achieved. An operation with no standardized procedure yields poor productivity, inefficiency, and an out-of-control process. This study focuses on controlling the small intestine processing of a chicken dressing plant through the use of Statistical Process Control (SPC). Since the operation employs no standard procedure and has no established standard time, the process, assessed through the observed times of the overall small intestine processing operation using an X-bar R control chart, is found to be out of control. To solve this problem, the researchers conducted a motion and time study aiming to establish a standard procedure for the operation. The normal operator was selected using the Westinghouse rating system. Instead of the traditional motion and time study alone, the researchers used the X-bar R control chart to determine the process average used for establishing the standard time. The observed times of the normal operator were noted and plotted on the X-bar R control chart. Out-of-control points due to assignable causes were removed, and the process average, i.e., the average time in which the normal operator performed the process, now in control and free from outliers, was obtained. The process average was then used to determine the standard time of small intestine processing. The researchers recommend implementing the established standard time together with the standard procedure adopted from the normal operator. With this recommendation, the whole operation is expected to achieve a 45.54% increase in productivity.
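The X-bar R computation described can be sketched as follows, using the standard Shewhart constants for subgroups of five (A2 = 0.577, D3 = 0, D4 = 2.114); the cycle times are synthetic stand-ins for the plant's observed times.

```python
import numpy as np

# Shewhart control chart constants for subgroup size n = 5
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """X-bar and R chart center lines and control limits from Phase I
    subgroups of size 5."""
    xbars = subgroups.mean(axis=1)
    ranges = subgroups.max(axis=1) - subgroups.min(axis=1)
    xbarbar, rbar = xbars.mean(), ranges.mean()
    return {
        "xbar": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),
        "r": (D3 * rbar, rbar, D4 * rbar),
    }

rng = np.random.default_rng(2)
times = rng.normal(30, 2, size=(20, 5))   # 20 subgroups of 5 observed times (s)
limits = xbar_r_limits(times)
# Subgroups falling outside the X-bar limits are candidates for removal
# once an assignable cause is confirmed, as the study describes.
flagged = [i for i, xb in enumerate(times.mean(axis=1))
           if not limits["xbar"][0] <= xb <= limits["xbar"][2]]
```

Recomputing the limits after each confirmed removal, until no point is flagged, yields the in-control process average that the standard time is based on.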
Digital Article Identifier (DAI):
1250
78800
Impact of a Contemporary Performance Measurement System and Organizational Justice on Academic Staff Work Performance
Abstract:
As part of Malaysian higher institutions' strategic plan to promote high-quality research and education, the Ministry of Higher Education has introduced various instruments to assess university performance. The aim is for universities to produce more commercially oriented research and to continue contributing a professional workforce for domestic and foreign needs. Yet the spirit of this success lies in the commitment of the university, particularly its academic staff, to translate the vision into reality. For that reason, fairness and justice in assessing individual academic staff performance are crucial to directly linking university and individual work goals. Focusing on public research universities (RUs) in Malaysia, this study examines the issue through the practice of contemporary university performance measurement systems. Management control theory conceptualizes contemporary performance measurement as consisting of three dimensions, namely strategic, comprehensive, and dynamic. Building upon equity theory, the relationships between the contemporary performance measurement system and organizational justice, and in turn their effect on academic staff work performance, are tested based on online survey data from 365 academic staff of public RUs, analyzed using SPSS and Structural Equation Modeling. The findings validated the presence of the strategic, comprehensive, and dynamic dimensions in the contemporary performance measurement system. The empirical evidence also indicated that contemporary performance measurement and procedural justice are significantly associated with work performance, but distributive justice is not. Furthermore, procedural justice mediates the relationship between contemporary performance measurement and academic staff work performance.
Evidently, this study provides evidence of the importance of perceptions of justice in influencing academic staff work performance. This finding may be a fruitful input in setting up academic staff performance assessment policy.
Digital Article Identifier (DAI):
1249
78506
The Temperature Effects on the Microstructure and Profile in Laser Cladding
Abstract:
In this study, a 50 W CO2 laser was used to clad 304L powder on a stainless steel substrate, with a temperature sensor and an image monitoring system. The laser power, cladding speed, and focal position were varied to achieve the required workpiece flatness and mechanical properties. A numerical calculation based on ANSYS was used to analyze the temperature change caused by the moving heat source at different surface positions when coating the workpiece, and the effect of the process parameters on the melt pool size was discussed. The temperature of the stainless steel powder reacting with the laser at the nozzle outlet was simulated as a process parameter. In the experiment, the difference in thermal conduction in three-dimensional space is compared between single-layer and multi-layer cladding: in single-layer cladding heat dissipates through the steel plate, while in multi-layer cladding it dissipates through the workpiece itself. The relationship between the multi-layer cladding temperature and the profile was analyzed using the temperature signal from an IR pyrometer.
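A quick analytical cross-check for such a thermal model is Rosenthal's quasi-steady solution for a moving point heat source on a semi-infinite body. The material values below are rough stainless-steel figures and the absorbed power is assumed equal to the nominal laser power, so this is a sanity check, not a substitute for the ANSYS analysis.

```python
import math

def rosenthal_temp(xi, y, z, Q, v, k, alpha, T0=25.0):
    """Quasi-steady temperature (deg C) around a moving point heat source
    on a semi-infinite body (Rosenthal's solution).
    xi: distance from the source along the travel direction, m
        (negative behind the source); Q: absorbed power, W;
    v: travel speed, m/s; k: conductivity, W/(m K);
    alpha: thermal diffusivity, m^2/s."""
    R = math.sqrt(xi ** 2 + y ** 2 + z ** 2)
    return T0 + Q / (2 * math.pi * k * R) * math.exp(-v * (xi + R) / (2 * alpha))

# Rough 304-type figures: k ~ 16 W/(m K), alpha ~ 4e-6 m^2/s, 5 mm/s travel
T_behind = rosenthal_temp(-1e-3, 0, 0, Q=50, v=5e-3, k=16, alpha=4e-6)
T_ahead = rosenthal_temp(+1e-3, 0, 0, Q=50, v=5e-3, k=16, alpha=4e-6)
```

The solution reproduces the expected asymmetry of the temperature field: at equal distance, the trailing side is much hotter than the leading side of the moving source, which is one qualitative feature a finite-element model should match.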
Digital Article Identifier (DAI):
1248
78504
A Simulative Approach for JIT Parts-Feeding Policies
Abstract:
The lean philosophy follows the simple principle of "creating more value with fewer resources". In accordance with this principle, material handling can be managed by means of Kanban, which regulates the flow of material in one of the most efficient ways by triggering a feeding tour only when needed. This paper focuses on the parameters of the Kanban supermarket and their optimization from a purely cost-based point of view. The number and size of forklifts, as well as the size of the containers they carry, are the variables of a cost function that includes handling costs, inventory costs, and shortage costs. With an innovative computational approach encoded in the industrial engineering software Tecnomatix and reproducing real-life conditions, a fictive assembly line is established and produces a random list of orders. Multiple scenarios are then run to study the impact of each parameter change and the resulting variation in costs. Lastly, the financially best scenarios are selected.
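The multi-scenario cost scan can be caricatured without Tecnomatix as follows; all cost rates, the demand distribution, and the replenishment logic are invented placeholders intended only to show the structure of the comparison.

```python
import random

def run_scenario(n_forklifts, container_size, n_orders=2000, seed=3):
    """Toy cost model for one supermarket configuration: unmet demand
    costs a shortage penalty, held parts cost inventory, and every
    Kanban-triggered feeding tour costs a handling fee."""
    rng = random.Random(seed)
    stock, cost = 0, 0.0
    reorder_point = 2
    capacity_per_cycle = n_forklifts * container_size
    for _ in range(n_orders):
        demand = rng.randint(0, 4)
        served = min(stock, demand)
        cost += 2.0 * (demand - served)      # shortage penalty
        stock -= served
        if stock < reorder_point:            # Kanban triggers a feeding tour
            stock += capacity_per_cycle
            cost += 15.0 * n_forklifts       # handling cost of the tour
        cost += 0.05 * stock                 # inventory holding cost
    return cost

# Scan a small grid of configurations and pick the cheapest
scenarios = {(f, c): run_scenario(f, c) for f in (1, 2) for c in (5, 10, 20)}
best = min(scenarios, key=scenarios.get)
```

Each configuration trades handling frequency against held inventory, which is exactly the tension the paper's simulation explores at full fidelity.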
Digital Article Identifier (DAI):