An Empirical Study of Evaluating the Effectiveness of Code Refactoring Techniques in a Mobile Operating System
Mobile applications have become a high-priority topic for software developers. Researchers and practitioners are working to improve and optimize mobile application energy efficiency and performance due to the limited capacity of mobile device processors and batteries. Refactoring techniques were introduced to improve the maintainability, extensibility, and understandability of application code. However, applying such techniques to mobile applications affects energy efficiency and performance. Several metrics have been introduced to evaluate and categorize the efficiency of software implementations and optimizations, such as the GPS-UP metrics (Greenup, Powerup, and Speedup). In this empirical study, we quantitatively evaluate the impact of selected refactoring techniques on mobile application energy efficiency and performance. In addition, to better categorize this impact, we introduce two new categories to the GPS-UP metrics. Moreover, we explain the interrelationship between energy efficiency and performance to provide more knowledge and insight for mobile application developers. The results of our work will allow them to understand the trade-offs between performance, energy efficiency, and maintainability when applying any of the refactoring techniques chosen in our study.
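As a concrete illustration of the metrics named above, the following sketch computes Greenup, Powerup, and Speedup from baseline and refactored measurements using their standard definitions; the variable names and the sample numbers are ours, not the study's data.

```python
# Illustrative sketch of the GPS-UP metrics (Greenup, Powerup, Speedup).
# Formulas follow the usual definitions; the numbers below are invented.

def gps_up(time_base, energy_base, time_opt, energy_opt):
    """Return (speedup, greenup, powerup) of a refactored run vs. a baseline."""
    speedup = time_base / time_opt          # > 1 means the refactored code is faster
    greenup = energy_base / energy_opt      # > 1 means it consumes less energy
    # Average power = energy / time, so the power ratio factors as speedup / greenup.
    powerup = speedup / greenup             # > 1 means it draws more average power
    return speedup, greenup, powerup

# Example: a refactoring that runs ~20% faster but saves only ~10% energy
s, g, p = gps_up(time_base=10.0, energy_base=50.0, time_opt=8.33, energy_opt=45.45)
print(f"Speedup={s:.2f}, Greenup={g:.2f}, Powerup={p:.2f}")
```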
Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction
In the software development lifecycle, quality prediction techniques are of prime importance for minimizing future design errors and expensive maintenance. Many techniques have been proposed by various researchers, but with the increasing complexity of the software lifecycle model, it is crucial to develop a flexible system that can account for the factors that ultimately affect the quality of the end product. These factors include properties of the software development process and the product, along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. Using this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving time and resources on the future elimination of design errors and costly maintenance. With successful training, this technique can be brought into practical use.
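To make the idea concrete, here is a minimal sketch of a perceptron-based quality predictor in the spirit of the paper; the three input metrics, the synthetic training data, and the learning parameters are illustrative assumptions only.

```python
import numpy as np

# Minimal perceptron sketch for quality prediction; the three input metrics
# (e.g. complexity, churn, review coverage) and the data are made up.
rng = np.random.default_rng(0)
X = rng.random((40, 3))                     # 40 modules, 3 normalized metrics
y = (X @ np.array([1.0, 1.0, -1.5]) > 0.4).astype(int)  # synthetic quality label

w, b, lr = np.zeros(3), 0.0, 0.1
for _ in range(100):                        # classic perceptron learning rule
    for xi, yi in zip(X, y):
        pred = int(w @ xi + b > 0)
        w += lr * (yi - pred) * xi
        b += lr * (yi - pred)

accuracy = np.mean([int(w @ xi + b > 0) == yi for xi, yi in zip(X, y)])
print(f"training accuracy: {accuracy:.2f}")
```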
Sensor And Sensor System Design, Selection and Data Fusion Using Non-Deterministic Multi-Attribute Tradespace Exploration
The conceptualization and design phases of a system lifecycle consume a significant amount of the lifecycle budget in the form of direct tasking and capital, as well as the implicit costs associated with unforeseeable design errors that are only realized during downstream phases. Ad hoc or iterative approaches to generating system requirements often fail to consider the full array of feasible system or product designs for a variety of reasons, including, but not limited to: initial conceptualization that incorporates a priori or legacy features; the inability to capture, communicate, and accommodate stakeholder preferences; inadequate technical designs and/or feasibility studies; and locally-, but not globally-, optimized subsystems and components. These design pitfalls can beget unanticipated developmental or system alterations with added costs, risks, and support activities, heightening the risk of suboptimal system performance, premature obsolescence, or forgone development. Supported by rapid advances in learning algorithms and hardware technology, sensors and sensor systems have become commonplace in both commercial and industrial products. The evolving array of hardware components (e.g. sensors, CPUs, modular/auxiliary access) as well as recognition, data fusion, and communication protocols have all become increasingly complex and critical for design engineers during both conceptualization and implementation. This work seeks to develop and utilize a non-deterministic approach for sensor system design within the multi-attribute tradespace exploration (MATE) paradigm, a technique that incorporates decision theory into model-based techniques in order to explore complex design environments and discover better system designs. Developed to address the inherent design constraints in complex aerospace systems, MATE techniques enable project engineers to examine all viable system designs, assess attribute utility and system performance, and better align with stakeholder requirements. Whereas previous work has focused on aerospace systems and been conducted in a deterministic fashion, this study addresses a wider array of system design elements by incorporating both traditional tradespace elements (e.g. hardware components) and popular multi-sensor data fusion models and techniques. Furthermore, adding statistical performance features to this model-based MATE approach will enable non-deterministic techniques for various commercial systems that range in application, complexity, and system behavior, demonstrating significant utility within the realm of formal systems decision-making.
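The following toy sketch illustrates the tradespace-exploration idea behind MATE: enumerate candidate sensor-system designs, score each with a weighted multi-attribute utility, and keep the Pareto-efficient set. All components, weights, and scores are invented for illustration and do not come from the study.

```python
from itertools import product

# Toy MATE-style sketch: enumerate a small sensor-system tradespace and keep
# the Pareto-efficient designs (utility vs. cost). All numbers are invented.
sensors = {"lidar": (0.9, 700), "radar": (0.7, 400), "camera": (0.5, 150)}  # (perf, cost)
cpus    = {"edge":  (0.6, 120), "server": (0.9, 500)}
fusion  = {"kalman": (0.7, 0), "bayes": (0.8, 0)}

designs = []
for (s, (sp, sc)), (c, (cp, cc)), (f, (fp, _)) in product(
        sensors.items(), cpus.items(), fusion.items()):
    utility = 0.5 * sp + 0.3 * cp + 0.2 * fp   # weighted multi-attribute utility
    designs.append(((s, c, f), utility, sc + cc))

pareto = [d for d in designs
          if not any(u2 >= d[1] and c2 <= d[2] and (u2, c2) != (d[1], d[2])
                     for _, u2, c2 in designs)]
for name, u, cost in sorted(pareto, key=lambda d: d[2]):
    print(name, f"utility={u:.2f}", f"cost={cost}")
```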
D-Care: Diabetes Care Application to Enhance Diabetics' Awareness of Diabetes in Indonesia
Diabetes is a common disease in Indonesia. One risk factor for diabetes is an unhealthy diet, i.e., consuming food that contains too much glucose; one source of glucose is food containing carbohydrates. The purpose of this study is to identify the glucose level in consumed food. The authors use literature study as the research method. As a result of this study, the authors expect diabetics to become more aware of diabetes by applying daily dietary regulation through D-Care. D-Care is an application that can enhance people's awareness of diabetes in Indonesia. D-Care provides two menus: nutrition calculation and healthy food. The nutrition calculation menu is used to estimate glucose intake by calculating the food consumed each day, whereas the healthy food menu provides combinations of healthy meals for diabetics. The conclusion is that D-Care is useful for reducing diabetes prevalence in Indonesia.
Programming without Code: An Approach and Environment to Conditions-On-Data Programming
This paper presents the concept of an object-based programming language where tests (if... then... else) and control structures (while, repeat, for...) disappear and are replaced by conditions on data. According to the object paradigm, data are still embedded inside objects, as variable-value couples, but object methods are expressed in the form of logical propositions (‘conditions on data’ or CODs). For instance: variable1 = value1 AND variable2 > value2 => variable3 = value3. Implementing this approach, a central inference engine turns and examines objects one after another, collecting all CODs of each object. CODs are treated as rules in a rule-based system: the left part of each proposition (left side of the ‘=>’ sign) is the premise and the right part is the conclusion. Premises are evaluated and conclusions are fired. Conclusions modify the variable-value couples of the object, and the engine goes on to examine the next object. The paper develops the principles of writing CODs instead of complex algorithms. Through examples, the paper also presents several hints for implementing a simple mechanism able to process this ‘COD language’. The proposed approach can be used within the context of simulation, process control, industrial systems validation, etc. By writing simple and rigorous conditions on data, instead of using classical languages with long learning curves, engineers and specialists can easily simulate and validate the functioning of complex systems.
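A minimal Python stand-in for the COD mechanism described above may help: objects hold variable-value couples, CODs are premise/conclusion pairs, and a central engine visits each object and fires the rules whose premises hold. The rule encoding is our illustration, not the paper's actual COD syntax.

```python
# Minimal sketch of a 'conditions on data' (COD) engine: each object holds
# variable-value couples, and its methods are rules (premise -> conclusion).

class CodObject:
    def __init__(self, name, data, rules):
        self.name, self.data, self.rules = name, data, rules  # rules: [(premise, conclusion)]

def run_engine(objects, cycles=3):
    for _ in range(cycles):                    # the engine "turns", visiting each object
        for obj in objects:
            for premise, conclusion in obj.rules:
                if premise(obj.data):          # evaluate the left side of '=>'
                    conclusion(obj.data)       # fire the right side, mutating the couples

# Example COD: temperature > 80 AND mode = 'auto' => fan = 'on'
tank = CodObject(
    "tank",
    {"temperature": 85, "mode": "auto", "fan": "off"},
    [(lambda d: d["temperature"] > 80 and d["mode"] == "auto",
      lambda d: d.update(fan="on"))],
)
run_engine([tank])
print(tank.data["fan"])   # -> 'on'
```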
Management Software for the Elaboration of an Electronic File in the Pharmaceutical Industry Following Mexican Regulations
For the certification of certain goods of public interest, such as medicines and food, the preparation and delivery of a dossier is required. Its elaboration requires legal and administrative knowledge, organization of the documents of the process, and an ordering that allows the file to be verified. Therefore, a virtual platform was developed to support the management and elaboration of the dossier, providing accessibility to the information and interfaces that allow the user to know the status of projects. Developing the dossier system in the cloud allows the inclusion of the technical requirements for software management, including validation and manufacturing in the field industry. The platform guides and facilitates the elaboration of the dossier (report, file, or history) in accordance with Mexican legislation and regulations, and it also provides auxiliary tools for its management. This technological alternative provides organizational support for documents and accessibility to the information required for the successful development of a dossier. The platform is divided into the following modules: system control, catalog, dossier, and enterprise management. The modules are designed according to the structure required in a dossier in those areas. However, the structure allows for flexibility, as its goal is to be a tool that facilitates rather than obstructs processes. The architecture and development of the software allow flexibility for future expansion to other fields, which would imply feeding the system with new regulations.
Usability and Biometric Authentication of Electronic Voting System
In this paper, a new voting system is developed and its usability is evaluated. The main feature of this system is the biometric verification of the voter, followed by a few easy steps to cast a vote. Compared to existing systems, e.g., dual vote, the new system requires no training in advance. Security is achieved via a multiple-key concept (another part of this project). More than 100 student voters participated in an election at the University of Malakand, Chakdara, Pakistan. To assess reliability, the voters cast their votes in two ways, i.e., paper-based voting and electronic voting using our new system. The results of the paper-based and electronic voting systems were compared, and it is concluded that the voters cast their votes for the intended candidates on the electronic voting system. The voters were asked to fill in a questionnaire, and the results of the questionnaire were carefully analyzed. The results show that the new system proposed in this paper is more secure and usable than other systems.
Use of Personal Rhythm to Authenticate Encrypted Messages
When communicating using private and secure keys, there is always doubt as to the identity of the message creator. We introduce an algorithm that uses the personal typing rhythm (keystroke dynamics) of the message originator to increase the recipient's trust in the authenticity of the message originator. The methodology proposes the use of a Rhythm Certificate Authority (RCA) to validate rhythm information. An illustrative example of the communication between Bob, Alice, and the RCA is included, and an algorithm for communicating with the RCA is presented. The RCA can be an independent authority or an enhanced Certificate Authority like the one used in a public key infrastructure (PKI).
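As a hedged sketch of the kind of check an RCA might perform, the following compares the inter-key timing profile sampled from a new message against an enrolled profile; the distance measure, the tolerance, and the timing data are illustrative assumptions, not the paper's protocol.

```python
# Sketch of an RCA-style rhythm check; threshold and distance are invented.

def inter_key_intervals(timestamps):
    """Milliseconds between successive keystrokes."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def rhythm_matches(enrolled, observed, tolerance=0.25):
    """Accept if the mean relative deviation of intervals is within tolerance."""
    if len(enrolled) != len(observed):
        return False
    deviation = sum(abs(o - e) / e for e, o in zip(enrolled, observed)) / len(enrolled)
    return deviation <= tolerance

enrolled = inter_key_intervals([0, 120, 260, 390, 540])   # Alice's stored profile
observed = inter_key_intervals([0, 130, 250, 400, 560])   # rhythm sampled from new message
print(rhythm_matches(enrolled, observed))                 # -> True: likely the same typist
```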
Installing Cloud Computing Model for E-Businesses in Small Organizations
Information technology developments have changed the way businesses work. Organizations are required to become visible online and stay connected to take advantage of cost reduction and improved use of existing resources. The acceptance and the application areas of cloud computing have increased significantly since it was presented by Google in 2007. Cloud computing has attracted the attention of IT enterprises, especially e-business enterprises. At present, enterprises applying e-business face considerable environmental costs, but with the advent of cloud computing most of these problems can be solved. Organizations around the world are faced with continued budget challenges and growth in the size of their computational data, so they need to find a way to deliver their services to clients as economically as possible without compromising the achievement of anticipated outcomes. E-business companies need to provide better services to satisfy their clients.
In this research, we propose a paradigm that uses and deploys a cloud computing technology environment for e-business in small enterprises. Cloud computing might be a suitable model for implementing e-business and e-commerce architecture to improve efficiency and user satisfaction.
Perception-Oriented Model Driven Development for Designing Data Acquisition Process in Wireless Sensor Networks
Wireless Sensor Networks (WSNs) have always been characterized by application-specific sensing, relaying, and collection of information for further analysis. However, software development has not been considered a separate entity in this process of data collection, which has posed severe limitations on software development for WSNs. Software development for WSNs is a complex process, since the components involved are data-driven, network-driven, and application-driven in nature. This implies a tremendous need for separation of concerns from the software development perspective. A layered approach to developing the data acquisition design based on Model Driven Development (MDD) is proposed, as the sensed data collection process itself varies depending upon the application under consideration. This work focuses on a layered view of the data acquisition process so as to ease software development. A metamodel is proposed that enables reusability and the realization of the software as an adaptable component for WSN systems. Further, observation of users' perception indicates that the proposed model helps improve programmers' productivity by realizing the collaborative system involved.
Metric Suite for Schema Evolution of a Relational Database
Stakeholders' requirements for adding more details to the database are the main cause of schema evolution in relational databases. Schema evolution, in turn, causes instability in the database. Hence, we aim to define a metric suite for the schema evolution of a relational database. The metric suite calculates metrics based on the features of the database, analyses the queries on the database, and measures the coupling, cohesion, and component dependencies of the schema for existing and evolved versions of the database. The metric suite also provides an indicator of problems related to the stability and usability of the evolved database. The degree of change in the schema is presented in the form of graphs that act as indicators and also show the relations between various parameters (metrics) of the database architecture. The acquired information is used to defend and improve the stability of the database architecture. The challenges that arise in incorporating these metrics with varying parameters to formulate a suitable metric suite are discussed. To validate the proposed metric suite, experiments have been performed on publicly available datasets.
Neighbor Caring Environment System (NCE) Using Parallel Replication Mechanism
Pertaining to a particular Marine interest, the process of data sampling can take years before a study can be concluded. Therefore, the need for a robust backup system for the data is invariably implicit. In recent advancements of Marine applications, more functionalities and tools are integrated to assist the work of researchers. It is anticipated that this modality will continue as research scope widens and intensifies, and at the same time follows suit with current technologies and lifestyles. The convenience of collecting and sharing information these days also applies to the work in Marine research. Therefore, Marine system designers should be aware that high availability is a necessary attribute in Marine repository applications, as is a robust backup system for the data. In this paper, the approach to high availability relates to both hardware and software, but the focus is more on software. We consider a NABTIC repository system that is primitively built on a single server and does not have replicated components. First, the system is decomposed into separate modules. The modules are placed on multiple servers to create a distributed system. Redundancy is added by placing copies of the modules on different servers using the Neighbor Caring Environment System (NCES) technique. NCES utilizes a parallel replication components mechanism. Background monitoring is established to check servers' heartbeats to confirm their aliveness. At the same time, a critical adaptive threshold is maintained to make sure a failure is detected in a timely manner using Adaptive Fault Detection (AFD). A confirmed failure sets the recovery mode, where a selection process is carried out before a fail-over server is instructed. In effect, the Marine repository service continues as the fail-over masks the recent failure. The performance of the new prototype has been tested and confirmed to provide higher availability. Furthermore, the downtime is not noticeable, as service is immediately restored automatically. The Marine repository system is said to have achieved fault tolerance.
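The following sketch illustrates the background monitoring idea: each server's heartbeats are tracked, and the failure threshold adapts to recent inter-arrival times as a simple stand-in for AFD. The smoothing constants and safety factor are invented for illustration.

```python
import time

# Sketch of heartbeat monitoring with an adaptive failure threshold; the
# exponential smoothing and safety factor are our stand-ins for the paper's AFD.

class HeartbeatMonitor:
    def __init__(self, safety_factor=3.0):
        self.last_seen, self.avg_interval = {}, {}
        self.safety_factor = safety_factor

    def heartbeat(self, server):
        now = time.monotonic()
        if server in self.last_seen:                 # exponential moving average
            interval = now - self.last_seen[server]
            prev = self.avg_interval.get(server, interval)
            self.avg_interval[server] = 0.8 * prev + 0.2 * interval
        self.last_seen[server] = now

    def failed_servers(self):
        """Servers silent for longer than their adaptive threshold."""
        now = time.monotonic()
        return [s for s, t in self.last_seen.items()
                if now - t > self.safety_factor * self.avg_interval.get(s, 1.0)]

mon = HeartbeatMonitor()
mon.heartbeat("server-1"); time.sleep(0.05); mon.heartbeat("server-1")
print(mon.failed_servers())   # -> [] while server-1 keeps beating
# On a confirmed failure, recovery mode would select a fail-over server that
# holds replicated copies of the affected modules and redirect requests to it.
```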
A Proposal for Systematic Mapping Study of Software Security Testing, Verification and Validation
Software vulnerabilities are increasing; they not only impact the availability of services and processes as well as the confidentiality, integrity, and privacy of information, but also cause changes that interfere with the development process. Security testing could be a solution to reduce vulnerabilities. However, the variety of test techniques, together with the lack of real case studies applying tests across the software development life cycle, compromises its effective use. This paper offers an overview of how a Systematic Mapping Study (MS) of security verification, validation, and testing (VVT) was performed, and presents general results of this study.
Teaching Students Collaborative Requirements Engineering: Case Study of Red:Wire
This paper discusses the use of a template-based approach for documenting high-quality requirements as part of course projects in an undergraduate Software Engineering course. In order to ease some of the Requirements Engineering activities that are performed when defining requirements by using the template, a new CASE tool, RED:WIRE, was first developed and later tested by students attending the course. Two questionnaires were conceived around a study that aims to analyze the new tool's learnability as well as other obtained results concerning its usability in particular and the Requirements Engineering skills developed by the students.
The Importance of Cultural Adaptation of B2C E-Services Design in Germany
This research gives introductory ideas for the cultural adaptation of B2C E-Service design in Germany. Given the intense competition in E-Service development, many companies have realized the importance of understanding the emotional and cultural characteristics of their customers. Ignoring customers' needs and requirements throughout E-Service design can lead to faults, mistakes, and gaps. The notion of E-Service usability has accordingly expanded: it no longer covers only the development of high-quality E-Services, but also includes customer satisfaction and making customers feel that the service is local.
[Keynote Talk]: The Challenges and Solutions for Developing Mobile Apps in a Small University
As computing technology advances, smartphone applications can assist student learning in a pervasive way. For example, using a mobile app for PA Common Trees, Pests, and Pathogens in the field as a reference tool allows middle school students to learn about trees and associated pests/pathogens without carrying a textbook. In the past, some studies have examined the Mobile Application Software Development Life Cycle (MADLC), including traditional models such as the waterfall model and more recent Agile methods. Others study issues related to the software development process. Very little research addresses the development of three heterogeneous mobile systems simultaneously in a small university where the availability of developers is an issue. In this paper, we propose a hybrid of the Waterfall Model and the Agile Model, known in practice as the Relay Race Methodology (RRM), to reflect the concept of racing and relaying for scheduling. Based on the development project, we observe that the modeling of the transition between any two phases is manifested naturally. Thus, we claim that the RRM model can provide a de facto rather than a de jure basis for the core concept in the MADLC. In this paper, the background of the project is introduced first. Then, the challenges are pointed out, followed by our solutions. Finally, the lessons learned and future work are presented.
Dynamic Store Procedures in Database
In recent years, different methods have been proposed to optimize query processing in databases. Although many such methods exist, most of them destroy the query execution plan after executing the query. This research attempts to solve that problem by combining methods of communicating with the database (queries embedded in the program code and stored procedures) and making query processing in the database adaptive, proposing a new approach to query processing optimization based on the idea of dynamic stored procedures. The approach creates dynamic stored procedures in the database according to the proposed algorithm. The method has been tested on production software, and the results show a significant improvement in query processing time and a reduced DBMS workload. Other advantages of this algorithm include unifying the programming environment, eliminating the parametric limitations of stored procedures in the database, and making the stored procedures in the database dynamic.
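A heavily simplified sketch of the dynamic-stored-procedure idea follows: the first time a query shape is seen, it is wrapped in a server-side procedure so later calls can reuse it. The connection object, the MySQL-style syntax, and the integer-only parameters are all our assumptions; this is not the paper's algorithm.

```python
import hashlib

# Illustrative sketch only: `conn` is a hypothetical MySQL-style DB-API
# connection; parameter typing (INT) and result fetching are simplified and
# driver-dependent.

_created = set()

def run_dynamic(conn, sql, params):
    name = "dyn_" + hashlib.md5(sql.encode()).hexdigest()[:12]
    cur = conn.cursor()
    if name not in _created:
        placeholders = ", ".join(f"IN p{i} INT" for i in range(len(params)))
        body = sql
        for i in range(len(params)):
            body = body.replace("%s", f"p{i}", 1)   # bind positional parameters
        cur.execute(f"CREATE PROCEDURE {name}({placeholders}) BEGIN {body}; END")
        _created.add(name)
    cur.callproc(name, params)                      # later calls reuse the procedure
    return cur.fetchall()
```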
An Improved VM Allocation Algorithm by Utilizing Combined Resource Allocation Mechanism and Released Resources in Cloud Environment
Utilization of resources is always a great challenge for any allocation problem, particularly when resource availability is dynamic in nature. In this work, the VM allocation mechanism has been augmented by providing resources in a combined manner. This approach has some inherent advantages: it reduces the wait state for users' pending jobs and, from the service providers' point of view, makes better use of unused resources. Moreover, the algorithm takes care of resources released by finished jobs as soon as they become available. The proposed algorithm is explained with a suitable example to make the work complete.
Information Technology for Business Process Management in Insurance Companies
Information technology plays an irreplaceable role in introducing and improving business process orientation in a company. It enables implementation of the theoretical concept, measurement of the results achieved, and the undertaking of corrective measures aimed at improvement. Information technology is a key concept in the development and implementation of business process management systems, as it establishes the connection to business operations. Both in the literature and in practice, insurance companies are often seen as highly process oriented due to the nature of their business and their focus on customers. They are also considered leaders in using information technology for business process management. The research conducted aimed to investigate whether this perceived leadership status is well deserved, i.e., to establish the level of process orientation and explore the practice of information technology use in insurance companies in the region. The main instrument for primary data collection within this research was an electronic survey questionnaire sent to the management of insurance companies in the Republic of Croatia, Bosnia and Herzegovina, Slovenia, Serbia, and Macedonia. The research has shown that insurance companies have a satisfactory level of process orientation, but also a huge potential for improvement, especially in the segment of information technology and its connection to business processes.
A Task Scheduling Algorithm in Cloud Computing
An efficient task scheduling method can meet users' requirements, improve resource utilization, and thus increase the overall performance of the cloud computing environment. Cloud computing has new features such as flexibility and virtualization. In this paper, we propose a two-level task scheduling method based on load balancing in cloud computing. Simulation results in the CloudSim simulator show that this task scheduling method meets users' requirements and achieves high resource utilization.
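A minimal sketch of a two-level, load-balancing scheduler in the spirit of the abstract: the first level picks the least-loaded host, the second the least-loaded VM on it. The data structures and numbers are illustrative assumptions, not the paper's algorithm.

```python
import heapq

# Two-level load-balancing sketch: level 1 chooses the least-loaded host,
# level 2 the least-loaded VM on that host. Hosts/VMs/loads are invented.
hosts = {
    "host-A": [("vm1", 0.0), ("vm2", 0.0)],   # (vm name, current load)
    "host-B": [("vm3", 0.0)],
}
vm_heaps = {h: [(load, vm) for vm, load in vms] for h, vms in hosts.items()}
for heap in vm_heaps.values():
    heapq.heapify(heap)
host_heap = [(sum(l for l, _ in vms), h) for h, vms in vm_heaps.items()]
heapq.heapify(host_heap)

def schedule(task_len):
    host_load, host = heapq.heappop(host_heap)        # level 1: least-loaded host
    vm_load, vm = heapq.heappop(vm_heaps[host])       # level 2: least-loaded VM
    heapq.heappush(vm_heaps[host], (vm_load + task_len, vm))
    heapq.heappush(host_heap, (host_load + task_len, host))
    return vm

for t in [5, 3, 8, 2]:
    print(f"task({t}) -> {schedule(t)}")
```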
Survey on Big Data Stream Classification by Decision Tree
Nowadays, developments in computer technology and its recent applications provide access to new types of data that were not considered by traditional data analysis. Two particularly interesting characteristics of such data sets are their huge size and streaming nature. Incremental learning techniques have been used extensively to address the data stream classification problem. This paper presents a concise survey of the obstacles and requirements of classifying data streams using decision trees. The most important issue is maintaining a balance between accuracy and efficiency: the algorithm should provide good classification performance with reasonable response time.
Detecting the Edge of Multiple Images in Parallel
An edge is a variation of brightness in an image. Edge detection is useful in many application areas, such as identifying forests and rivers in satellite images or detecting broken bones in medical images. The paper discusses finding the edges of multiple aerial images in parallel. The proposed work was tested on 38 images: 37 color images and one monochrome image. The time taken to process N images in parallel is equivalent to the time taken to process one image sequentially. The proposed method achieves pixel-level parallelism as well as image-level parallelism.
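The image-level parallelism described above can be sketched as one worker per image, each running an edge detector; here a plain Sobel filter over synthetic arrays stands in for the aerial images, which is an assumption of this example.

```python
import numpy as np
from multiprocessing import Pool

# Image-level parallelism sketch: each worker Sobel-filters one image.
# Synthetic arrays stand in for the 38 aerial images used in the paper.

def sobel_edges(img):
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
    ky = kx.T
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for y in range(h - 2):                 # simple (slow) explicit convolution
        for x in range(w - 2):
            win = img[y:y + 3, x:x + 3]
            out[y, x] = np.hypot((win * kx).sum(), (win * ky).sum())
    return out

if __name__ == "__main__":
    images = [np.random.rand(64, 64) for _ in range(8)]   # stand-in images
    with Pool() as pool:                                  # one image per worker
        edge_maps = pool.map(sobel_edges, images)
    print(len(edge_maps), edge_maps[0].shape)             # -> 8 (62, 62)
```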
A Survey on Internet of Things and Fog Computing as a Platform for Internet of Things
The Internet of Things (IOT) is a technological revolution that represents the future of computing and communications. IOT is the convergence of the Internet with RFID, NFC, sensors, and smart objects. Fog computing is the natural platform for IOT. At present, IOT as a new network communication technology has rapidly shifted from concept to application on fog computing virtual storage and computing platforms. In this paper, we give an overview of IOT and the differences between cloud computing and fog computing.
Rest API Based System-level Test Automation for Mobile Applications
Today’s mobile applications communicate with servers more and more in order to access external services or information. Also, server-side code changes are more frequent than client-side code changes in a mobile application. These frequent changes lead to an increase in testing cost.
To reduce costs, UI-based test automation can be one solution. It is a common automation technique in system-level testing. However, it can be unsuitable for mobile applications. When you automate tests based on UI elements for mobile applications, there are limitations such as the overhead of script maintenance and the difficulty of finding invisible defects that UI elements cannot represent. To overcome these limitations, we present a new automation technique based on Rest API. You can automate system-level tests through test scripts that you write. These scripts call a series of Rest APIs in a user's action sequence. This technique does not require testers to know the internal implementation details, only the inputs and expected outputs of the Rest APIs. You can easily modify test cases by modifying Rest API input values and also find problems that might not be evident at the UI level by validating output values. For example, when an application receives price information from a payment server and the user cannot see it at the UI level, Rest API based scripts can check whether the price information is correct. More than 10 mobile applications at our company are tested automatically with Rest API scripts whenever application source code, mostly server source code, is built. We find defects right away by setting a script as a build job in the CI server; the build job starts when application code builds are completed. This presentation will also include field cases from our company.
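A minimal sketch of such a Rest API based test script, with hypothetical endpoints and payloads, might look as follows; it replays a user's action sequence and validates the server-side price that never appears in the UI, mirroring the payment example above.

```python
import requests

# Sketch of a Rest API based system-level test; BASE, endpoints, and payloads
# are hypothetical, not the presenters' actual services.

BASE = "https://api.example.com"

def test_purchase_flow():
    s = requests.Session()
    r = s.post(f"{BASE}/login", json={"user": "tester", "password": "secret"})
    assert r.status_code == 200

    r = s.post(f"{BASE}/cart/items", json={"sku": "ABC-123", "qty": 2})
    assert r.status_code == 201

    r = s.get(f"{BASE}/cart/price")
    assert r.status_code == 200
    # Invisible-at-UI check: the server-side total must match the expected amount
    assert r.json()["total"] == 2 * 1999, f"unexpected total: {r.json()}"

if __name__ == "__main__":
    test_purchase_flow()
    print("purchase flow OK")
```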
Automatic MC/DC Test Data Generation from Software Module Description
Modified Condition/Decision Coverage (MC/DC) is a structural coverage criterion that is highly recommended or required for safety-critical software. Therefore, many testing standards include this criterion and require it to be satisfied at particular levels of testing (e.g. validation and unit levels). However, a significant amount of time is needed to meet those requirements. In this paper we propose to automate MC/DC test data generation. We present an approach to automatically generate MC/DC test data from a software module description written in a dedicated language. We introduce a new merging approach that provides high MC/DC coverage of the description with only a small number of test cases.
Practical Methods for Automatic MC/DC Test Cases Generation of Boolean Expressions
Modified Condition/Decision Coverage (MC/DC) is a structural coverage criterion that aims to prove that every condition involved in a Boolean expression can influence the result of that expression. In the automotive context, MC/DC is highly recommended and even required for testing most security- and safety-critical applications. However, due to the complex Boolean expressions often embedded in those applications, generating a set of MC/DC-compliant test cases for any of these expressions is a nontrivial task and can be time consuming for testers. In this paper we present an approach to automatically generate MC/DC test cases for any Boolean expression. We introduce novel techniques, essentially based on binary trees, to quickly and optimally generate MC/DC test cases for the expressions. Thus, the approach can be used to reduce the manual testing effort of testers.
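For illustration, the following brute-force search finds, for each condition, an MC/DC pair: two test vectors that differ only in that condition and change the decision's outcome. This is not the paper's binary-tree technique, only a baseline that shows what the generated test cases must demonstrate.

```python
from itertools import product

# Brute-force MC/DC pair search (baseline, not the paper's optimized method):
# for each condition, find two vectors differing only in that condition whose
# decision outcomes differ, proving the condition's independent influence.

def mcdc_pairs(expr, n):
    vectors = list(product([False, True], repeat=n))
    pairs = {}
    for i in range(n):
        for v in vectors:
            w = list(v); w[i] = not w[i]; w = tuple(w)
            if expr(*v) != expr(*w):
                pairs[i] = (v, w)
                break
    return pairs

# Example decision: (A and B) or C
expr = lambda a, b, c: (a and b) or c
for cond, (v, w) in mcdc_pairs(expr, 3).items():
    print(f"condition {cond}: {v} -> {expr(*v)}, {w} -> {expr(*w)}")
# A minimal MC/DC test set is then a smallest vector set covering one pair per
# condition; the tree-based technique targets that optimization.
```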
Event Monitoring Based On Web Services for Heterogeneous Event Sources
This article discusses event monitoring options for heterogeneous event sources as found in today's heterogeneous distributed information systems. It follows the central assumption that a fully generic event monitoring solution cannot provide complete support for event monitoring; instead, event-source-specific semantics, such as certain event types or support for certain event monitoring techniques, have to be taken into account. Following from this, the core result of the work presented here is the extension of a configurable event monitoring (Web) service for a variety of event sources. A service approach allows us to trade genericity for the exploitation of source-specific characteristics. It thus delivers results for the areas of SOA, Web services, CEP, and EDA.
Performance Comparison of AODV and Soft AODV Routing Protocol
A mobile ad hoc network (MANET) is a system of wireless mobile nodes that can self-organize freely and dynamically into arbitrary and temporary network topologies. Unlike in a wired network, a wireless network interface has a limited transmission range. Routing is the task of forwarding data packets from a source to a given destination. The Ad-hoc On-demand Distance Vector (AODV) routing protocol creates a path to a destination only when it is required. This paper describes the implementation of the AODV routing protocol using the MATLAB-based TrueTime simulator. In MANETs, node movements are not fixed; they are random in nature. Hence, intelligent techniques, i.e. fuzzy logic and ANFIS, are used to optimize the transmission range. In this paper, we compare the transmission range of AODV, fuzzy AODV, and ANFIS AODV. For soft-computing AODV, we take transmitted power and receiver threshold as inputs and transmission range as output. ANFIS gives better results than fuzzy AODV.
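A toy fuzzy-inference sketch of the soft-computing step may clarify the idea: transmitted power and receiver threshold are fuzzified, two hand-written rules fire, and a weighted average defuzzifies the transmission range. The membership functions, rule base, and output levels are invented, not the paper's tuned system.

```python
# Toy fuzzy-AODV sketch; all membership functions and levels are invented.

def falling(x, a, b):
    """1 below a, 0 above b, linear in between."""
    return max(0.0, min(1.0, (b - x) / (b - a)))

def rising(x, a, b):
    """0 below a, 1 above b, linear in between."""
    return max(0.0, min(1.0, (x - a) / (b - a)))

def fuzzy_range(tx_power, rx_threshold):
    power_low,  power_high  = falling(tx_power, 10, 50), rising(tx_power, 10, 50)
    thr_strict, thr_relaxed = falling(rx_threshold, 0.2, 0.8), rising(rx_threshold, 0.2, 0.8)
    # Rule 1: low power AND strict threshold   -> short range (100 m)
    # Rule 2: high power AND relaxed threshold -> long range (400 m)
    r1 = min(power_low, thr_strict)
    r2 = min(power_high, thr_relaxed)
    return (r1 * 100 + r2 * 400) / (r1 + r2)   # weighted-average defuzzification

print(f"{fuzzy_range(tx_power=35, rx_threshold=0.4):.0f} m")
```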
Model-Based Automotive Partitioning and Mapping for Embedded Multicore Systems
This paper introduces novel approaches to partitioning and mapping in model-based embedded multicore system engineering and further discusses benefits, industrial relevance, and features in common with existing approaches. In order to assess and evaluate the results, both approaches have been applied to a real industrial application as well as to various prototypical demonstrative applications that were developed and implemented for different purposes. Evaluations show that such applications improve significantly in performance, energy efficiency, meeting timing constraints, and coverage of maintenance issues when using the AMALTHEA platform and the implemented approaches. Furthermore, the model-based design provides an open, expandable, platform-independent, and scalable exchange format between OEMs, suppliers, and developers at different levels. Our proposed mechanisms provide meaningful multicore system utilization, since load balancing by means of partitioning and mapping is effectively performed with regard to the modeled systems, including hardware, software, operating system, scheduling, constraints, configuration, and more.
Automatic API Regression Analyzer and Executor
As a software product changes versions across releases, there are changes to the APIs and features, and upgrades become necessary. Hence, it becomes imperative to determine the impact of upgrading the dependent components. This tool finds API changes across two versions and their impact on other APIs, followed by the execution of the automated regression suites relevant to the updates and their impacted areas. The tool has a four-layer architecture, each layer with its own unique capability, performing its task and sending the required information to the next layer. The four layers are: 1) Comparator: compares the two versions of the API. 2) Analyzer: analyses the API doc and gives the modified class and its dependencies, along with implemented interface details. 3) Impact Filter: finds the impact of the modified class on the other API methods. 4) Auto Executor: based on the output given by the Impact Filter, the Executor runs the API regression suite. The tool reads the Java doc and extracts the required information about classes, interfaces, and enumerations. The extracted information is saved into a data structure that shows the class details and its dependencies, along with the interfaces and enumerations listed in the Java doc.
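As a small illustration of the Comparator layer, the sketch below diffs the public method signatures of two API versions and reports additions, removals, and modifications; the signature map stands in for information parsed from the Java doc and is our simplification.

```python
# Comparator-layer sketch: diff two API versions given as name -> signature
# maps (a simplification of the javadoc parsing described above).

def compare_api(old, new):
    added    = sorted(set(new) - set(old))
    removed  = sorted(set(old) - set(new))
    modified = sorted(m for m in set(old) & set(new) if old[m] != new[m])
    return added, removed, modified

v1 = {"Account.close": "()", "Account.open": "(String owner)"}
v2 = {"Account.open": "(String owner, int type)", "Account.freeze": "()"}

added, removed, modified = compare_api(v1, v2)
print("added:", added)       # feeds the Analyzer, which resolves dependent classes
print("removed:", removed)   # the Impact Filter then selects affected regression suites
print("modified:", modified) # and the Executor runs only those suites
```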