Detectability Analysis of Typical Aerial Targets from Space-Based Platforms
To achieve effective long-range detection of aerial targets from space-based platforms, the mechanism of interaction between the radiation characteristics of aerial targets and the complex scene environment, including sunlight conditions, underlying surfaces, and the atmosphere, is analyzed. A large simulated database of space-based radiance images is constructed, covering several typical aerial targets, target working modes (flight velocity and altitude), illumination and observation angles, background types (cloud, ocean, and urban areas), and sensor bands ranging from visible to thermal infrared. Target detectability is characterized by the signal-to-clutter ratio (SCR) extracted from the images, and its behavior is discussed under different detection bands and instantaneous fields of view (IFOV). Furthermore, the optimal center wavelengths and widths of the detection bands are suggested, and minimum IFOV requirements are proposed. The research can provide theoretical support and scientific guidance for the design of space-based detection systems and on-board information processing algorithms.
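The SCR metric used above can be sketched as follows (a minimal illustration; the mask-based interface and the use of the background standard deviation as the clutter term are assumptions, not the paper's exact definition):

```python
import numpy as np

def signal_to_clutter_ratio(image, target_mask, background_mask):
    """SCR = |mean(target) - mean(background)| / std(background clutter)."""
    target = image[target_mask]
    clutter = image[background_mask]
    return abs(target.mean() - clutter.mean()) / clutter.std()
```

A bright target over a varying background then yields a high SCR, while a target near the background mean is hard to detect regardless of sensor band.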
Electronic Payment Recording with Payment History Retrieval Module: A System Software
The Electronic Payment Recording with Payment History Retrieval Module was developed for the College of Science and Technology. The system replaces the department's manual process of recording payments with electronic payment recording software, shifting from a slow and time-consuming procedure to a quick, reliable, and accurate way of recording payments that immediately generates a receipt for every transaction. As an added feature, the generation of recorded-payment reports is integrated, replacing manual reporting with easier, consolidated reports, and all recorded student payments can be retrieved immediately, making the system a transparent and reliable payment recording tool. Taken as a whole, the system shifts the manual process to organized software technology, because information is stored in a logically correct and normalized database. Furthermore, the software is developed using a modern programming language and applies strict programming methods to validate all users accessing the system and to check all data entered into and retrieved from the system, ensuring data accuracy and reliability. In addition, the system identifies each user and limits their access privileges, establishing boundaries on which information they are allowed to store, modify, and update, thereby securing the information against unauthorized manipulation. As a result, the system replaces the manual procedure with a modern information technology solution, making the whole payment recording process fast, secure, accurate, and reliable.
Quantifying Mobility of Urban Inhabitant Based on Social Media Data
Check-in locations on social media provide information about an individual's location, and the millions of data points generated on these sites offer insight into human activity. In this research, we used a geolocation service and users' texts posted on the Twitter social media platform to analyze human mobility. Our research answers two questions: What are the movement patterns of a citizen? And how far do people travel in the city? We explore the trajectories of 201,118 check-ins by 22,318 users over a period of one month in Makassar city, Indonesia. To capture individual mobility, we analyze only users with more than 30 check-in activities, and we use a systematic sampling approach to assign the research sample. The study found that individual movement shows a high degree of regularity and intensity in certain places. We also found that an urban inhabitant travels on average as far as 9.6 km per day.
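The per-day travel distance reported above can be estimated from time-ordered check-in coordinates, e.g. with the haversine great-circle distance (a standard formula; the tuple-based interface here is illustrative, not the authors' implementation):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def daily_travel_km(checkins):
    """Sum of consecutive check-in distances; checkins is a time-ordered list of (lat, lon)."""
    return sum(
        haversine_km(lat1, lon1, lat2, lon2)
        for (lat1, lon1), (lat2, lon2) in zip(checkins, checkins[1:])
    )
```

Summing this per user per day and averaging over the sample gives the kind of figure reported in the abstract.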
Self-Disclosure of Location: Influences of Personality Traits, Intrinsic Motivations and Extrinsic Motivations
With the popularity of smartphones and the flourishing of social networks, many people have begun to use 'check-in' functions to share their location information and daily lives as a form of self-disclosure. To increase exposure and awareness, some stores provide discounts and other benefits to attract consumers to 'check in' at their stores. The purpose of this study was to investigate whether personality traits, intrinsic motivations, extrinsic motivations, and privacy concerns affect consumers' self-disclosure of location. Research data were collected from 407 individuals who have used Facebook check-in in Taiwan. This study used SmartPLS 2.0 structural equation modeling to validate the model. The results show that information sharing, information storage, enjoyment, self-presentation, getting feedback, economic reward, and keeping up with trends had significant positive effects on self-disclosure. While extroversion and openness to experience have significant positive effects on self-disclosure, conscientiousness and privacy concerns have significant negative effects on self-disclosure. The results of the study provide academic and practical implications for the future growth of location-based self-disclosure.
Community Structure Detection in Networks Based on Bee Colony
In this paper, we propose a new method to find the community structure in networks. Our method is based on a bee colony algorithm and the maximization of modularity. We use a bee colony algorithm to find an initial community structure with a good modularity value; to improve this structure, we then merge communities until we obtain a community structure with a high modularity value. We provide a general framework for implementing our approach. We tested our method on computer-generated and real-world networks, comparing it with well-known community detection methods. The obtained results show the effectiveness of our proposal.
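The modularity criterion and the merging phase described above can be sketched as follows (the bee colony search itself is not reproduced; this greedy best-merge loop only illustrates the improvement step, under the standard Newman modularity definition):

```python
def modularity(adj, communities):
    """Newman modularity Q of a partition.

    adj: dict mapping node -> set of neighbours (undirected graph)
    communities: list of sets of nodes
    """
    m = sum(len(nbrs) for nbrs in adj.values()) / 2  # number of edges
    q = 0.0
    for c in communities:
        internal = sum(1 for u in c for v in adj[u] if v in c) / 2  # edges inside c
        degree = sum(len(adj[u]) for u in c)  # total degree of c
        q += internal / m - (degree / (2 * m)) ** 2
    return q

def greedy_merge(adj, communities):
    """Repeatedly apply the single merge that most increases Q, until no merge helps."""
    best_q = modularity(adj, communities)
    while len(communities) > 1:
        best_pair, best_gain = None, 0.0
        for i in range(len(communities)):
            for j in range(i + 1, len(communities)):
                trial = [c for k, c in enumerate(communities) if k not in (i, j)]
                trial.append(communities[i] | communities[j])
                gain = modularity(adj, trial) - best_q
                if gain > best_gain:
                    best_pair, best_gain = (i, j), gain
        if best_pair is None:
            break
        i, j = best_pair
        merged = communities[i] | communities[j]
        communities = [c for k, c in enumerate(communities) if k not in (i, j)] + [merged]
        best_q += best_gain
    return communities, best_q
```

On a graph of two triangles joined by a bridge edge, starting from singleton communities, this merging loop recovers the two triangles as communities.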
Internet Impulse Buying: A Study Based on Stimulus-Organism-Response Theory
With the advance of e-commerce technologies, consumer buying behavior has changed. The focus on consumer buying behavior has shifted from physical space to cyberspace, in which impulse buying is a major issue of concern. This study examines the stimulus effect of the web environment on the consumer's emotional states and, in turn, on the urge for impulse buying, based on stimulus-organism-response (S-O-R) theory. Website ambiance and website service quality are the two stimulus variables. The study also explores the direct and moderating effects of contextual variables and individual characteristic variables on the web environment, the emotional states, and the urge for impulse buying. A total of 328 valid questionnaires were collected, and structural equation modeling was used to test the research hypotheses. This study found that both website ambiance and website service quality have a positive effect on consumer emotion, which in turn positively affects the urge for impulse buying. The consumer's trait of impulse buying has a positive effect on the urge for impulse buying, and the consumer's hedonic motivation has a positive effect on both the emotional state and the urge for impulse buying. On the other hand, the study found that the money available to the consumer positively affects the consumer's emotional state, while the time available to the consumer negatively moderates the relationship between website service quality and consumer emotion. The results validate Internet impulse buying behavior based on S-O-R theory and suggest that a good website atmosphere and service quality are important for influencing consumers' emotions and increasing the likelihood of purchase. The study can serve as a basis for future research on online consumer behavior.
Multi Tier Data Collection and Estimation, Utilizing Queue Model in Wireless Sensor Networks
In this paper, a target parameter is estimated with desirable precision in hierarchical wireless sensor networks (WSNs), while the proposed algorithm also tries to prolong the network lifetime as much as possible using an efficient data collecting algorithm. The target parameter's distribution function is considered unknown. Sensor nodes sense the environment and send the data to the base station, called the fusion center (FC), using a hierarchical data collecting algorithm, and the FC reconstructs the underlying phenomenon from the collected data. Considering the aggregation level x, the goal is to provide the essential infrastructure for finding the best value of the aggregation level so as to prolong the network lifetime as much as possible while guaranteeing the desired accuracy (the required sample size depends entirely on the desired precision). First, the sample size calculation algorithm is discussed; second, the average queue length is determined based on an M/M[x]/1/K queue model and used to calculate energy consumption. Nodes can decrease the transmission cost by aggregating incoming data. Furthermore, the performance of the new algorithm is evaluated in terms of lifetime and estimation accuracy.
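The average queue length step can be illustrated with the plain M/M/1/K special case (the paper's M/M[x]/1/K batch-service model is more involved; this sketch only shows the stationary mean for batch size 1, from the textbook truncated geometric distribution):

```python
def mm1k_queue_length(lam, mu, K):
    """Average number of customers in an M/M/1/K system with arrival rate lam,
    service rate mu, and capacity K (at most K customers in the system)."""
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:
        # at rho = 1 the stationary distribution is uniform over 0..K
        probs = [1.0 / (K + 1)] * (K + 1)
    else:
        norm = (1 - rho) / (1 - rho ** (K + 1))
        probs = [norm * rho ** n for n in range(K + 1)]
    return sum(n * p for n, p in enumerate(probs))
```

For large K and rho < 1 this converges to the familiar M/M/1 mean rho/(1-rho), which is the quantity fed into an energy consumption estimate per node.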
Identifying the Risks on Philippines’ Pre- and Post-Disaster Media Communication on Natural Hazards
The Philippines is a hotbed of disasters and a locus of natural hazards. With an average of 20 typhoons entering the Philippine Area of Responsibility (PAR) each year, seven to eight (7-8) of which make landfall, the country rather inevitably suffers from climate-related calamities. Given this vulnerability, the hazard-related issues that accompany the potential threat of a disaster often garner less media attention than an actual disaster does. Post-disaster news floods the content of news networks, focusing primarily on, but not limited to, the efforts of the national government in resolving post-disaster displacement, and all the more on community leaders' incompetence in disaster mitigation, even though the University of the Philippines' NOAH Center works hand in hand with different stakeholders on disaster mitigation communication efforts. Disaster risk communication is a perennial dilemma: despite many efforts to reach the grassroots level, emergency and disaster preparedness messages inevitably fall short. The Philippines is highly vulnerable to hazard risks and disasters, but social media posts and communication efforts mostly go unnoticed, if not argued over. This study illustrates the outcomes of research focusing on the role of print, broadcast, and social media in disaster communication involving the natural catastrophic events that took place in the Philippines from 2009 to the present. Considering the country's state of development, the study looks at rapid and reliable communication between the government and the relief/rescue workers in the affected regions, and at how effectively the media portray these efforts. Learning from the disasters that have occurred in the Philippines over the past decade, effective communication can ensure that efforts to prepare for and respond to disasters make a significant difference.
It can potentially either break or save lives. Recognizing that the role of communications lies not only in improving the coordination of vital post-disaster services, organizations have given priority to reexamining disaster preparedness mechanisms through Communication with Communities (CwC) programs. This study looks at the CwC efforts of Philippine media platforms. CwC, if properly utilized by the media, is an essential tool for ensuring the accountability and transparency that require an effective exchange of information between disaster survivors and responders. However, this study shows that the perennial dilemma of the Philippine media is that the country's Disaster Risk Reduction and Management (DRRM) efforts lie under the clouded judgment of political aims. This habit is a multiplier of the country's risk and insecurity. Efforts to urge the public to take action sometimes seem useless, because the challenge lies in how to achieve social, economic, and political unity using the tri-media platform.
Students' Perceptions of Communication Design in Media: Case Study of Portuguese and Spanish Communication Students
The proliferation of mobile devices in society enables the media to disseminate information and knowledge more rapidly. Higher education students access these contents and share them with each other on the most diverse platforms, allowing ubiquitous access to information. This article presents the results, and the respective quantitative analysis, of a survey applied to communication students at two higher education institutions: one in Portugal and another in Spain. The results show that, in this sample, higher education students regularly access news content and believe traditional news sources to be more credible. Regarding online access, it was confirmed that free access to news is still the most popular mode. This study intends to promote knowledge about the changes occurring in the relationship between higher education students and the media, characterizing the way these students consume news and considering the effects resulting from the evolution of digital media. It aims to present not only the news sources they use, but also some of their habits and their relationship with the news media.
Design of Collaborative Web System: Based on Case Study of PBL Support Systems
This paper describes the design and implementation of a web system for continuable and viable collaboration, and proposes improvements to the system based on the results of a concrete practice. As contemporary higher education information environments transform, this study highlights the significance of university identity and college identity, which are formed continuously through the independent activities of students. Based on these discussions, the present study proposes a practical media environment design that facilitates the processes of organizational identity formation according to a continuous and cyclical model. Even as users change, the communication system continues its operation and cooperation; the activity becomes an archive and produces new activity. Based on the results, this study elaborates a re-design plan for the system from the viewpoint of second-order cybernetics. Systems theory is the theoretical foundation of our study.
Lexicon-Based Method for Opinion Detection on TripAdvisor Collection
The massive development of online social networks allows users to post and share their opinions on various topics. With this huge volume of opinions, it is interesting to extract and interpret this information for different domains, e.g., product and service benchmarking, politics, and recommender systems. This is why opinion detection is one of the most important research tasks. It consists of differentiating between opinion data and factual data, and its difficulty lies in determining an approach that returns opinionated documents. Generally, two approaches are used for opinion detection: lexicon-based approaches and machine learning based approaches. Lexicon-based approaches use a dictionary of sentiment words in which words are associated with weights; the opinion score of a document is derived from the occurrences of words from this dictionary. In machine learning approaches, a classifier is usually trained on a set of annotated documents containing sentiment, with features such as word n-grams, part-of-speech tags, and logical forms. The majority of these works base the opinion score on the document text alone but do not take into account whether these texts are really correct. Thus, it is interesting to exploit other information to improve opinion detection. In our work, we develop a new way to compute the opinion score by introducing the notion of a trust score: we determine opinionated documents but also whether these opinions are really trustworthy information in relation to the topics. For that, we use the SentiWordNet lexicon to calculate opinion and trust scores, and we compute different features about users (number of their comments, number of their useful comments, average useful reviews). We then combine the opinion score and the trust score to obtain a final score. We applied our method to detect trusted opinions in the TripAdvisor collection. Our experimental results report that the combination of the opinion score and the trust score improves opinion detection.
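A minimal sketch of the score combination described above, with a toy lexicon standing in for SentiWordNet and an assumed linear weighting (the function names, lexicon entries, and the alpha weight are illustrative, not taken from the paper):

```python
# Toy sentiment lexicon standing in for SentiWordNet (positivity minus negativity);
# the real lexicon assigns scores per synset, simplified here to one score per word.
LEXICON = {"great": 0.8, "clean": 0.5, "terrible": -0.9, "dirty": -0.6}

def opinion_score(text):
    """Mean absolute lexicon weight of matched words: how opinionated the text is."""
    hits = [abs(LEXICON[w]) for w in text.lower().split() if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

def trust_score(n_comments, n_useful):
    """Hypothetical user-trust feature: fraction of the user's comments marked useful."""
    return n_useful / n_comments if n_comments else 0.0

def final_score(text, n_comments, n_useful, alpha=0.5):
    """Linear combination of opinion and trust scores (alpha is an assumed weight)."""
    return alpha * opinion_score(text) + (1 - alpha) * trust_score(n_comments, n_useful)
```

A strongly opinionated review from a user whose comments are rarely marked useful then scores lower than the same review from a consistently useful reviewer.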
An Earth Mover’s Distance Algorithm Based DDoS Detection Mechanism in SDN
Software-defined networking (SDN) provides a solution for a scalable network framework with decoupled control and data planes. However, this architecture also invites a particular distributed denial-of-service (DDoS) attack that can affect or even overwhelm the SDN network. The DDoS attack detection problem has to date mostly been researched as an entropy comparison problem; however, this formulation does not exploit the capabilities of SDN, and its results are not accurate. In this paper, we propose a DDoS attack detection method that interprets DDoS detection as a signature matching problem, formulated as an Earth Mover's Distance (EMD) model. Considering feasibility and accuracy, we further propose defining the cost function of the EMD as a generalized Kullback-Leibler divergence. Simulation results show that our proposed method can detect DDoS attacks by comparing EMD values with those computed in the attack-free case. Moreover, our method significantly increases the true positive rate of detection.
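The distance underlying the above can be sketched in its simplest form: the 1-D EMD between two traffic histograms with unit ground distance between adjacent bins (this simplified version does not reproduce the paper's generalized Kullback-Leibler cost function):

```python
def emd_1d(p, q):
    """Earth Mover's Distance between two 1-D histograms on the same bins,
    with unit ground distance between adjacent bins (equals the L1 distance
    between the cumulative distributions)."""
    assert len(p) == len(q)
    # normalize so both histograms carry equal total mass
    sp, sq = sum(p), sum(q)
    p = [x / sp for x in p]
    q = [x / sq for x in q]
    dist, carried = 0.0, 0.0
    for pi, qi in zip(p, q):
        carried += pi - qi  # mass that must still be moved past this bin
        dist += abs(carried)
    return dist
```

Comparing the EMD between a live traffic histogram and an attack-free baseline against a threshold is the kind of decision rule the detection scheme builds on.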
Information Technology Pattern for Traceability to Increase the Exporting Efficiency of Thailand’s Orchid
A traceability system is one of the tools that can ensure consumer confidence in a product, as it can trace the product back to its origin and reduce the operating cost of a recall. Nowadays, many technologies can be applied to a traceability system and increase its efficiency, such as QR codes, barcodes, GS1, and GTIN. Accordingly, this research aims to study and design the information technology pattern that suits the traceability of Thailand's orchids, since orchids are a popular Thai export to Japan, the USA, China, the Netherlands, and Italy. This study will enhance the value of Thailand's orchids and help prevent unexpected events involving defective or damaged products. The traceability pattern received an IOC test from 12 experts across four fields of study: traceability, information technology, information communication technology, and orchid export. The results of the in-depth interviews and questionnaire showed that the technology most compatible with the traceability system is the QR code, with a mean score of 4.25 and a standard deviation of 0.5, as the QR code is a new and user-friendly technology. The traceability system should run from the farm to the consumer in the consuming country, as it will enhance the quality level of the product and increase its value as well. The other outcomes of this research are a supply chain model for Thailand's orchids, along with the system architecture and a working system diagram.
Effective Validation Model and Use of Mobile-Health Apps for Elderly People
The controversy brought about by the increasing use of mHealth apps and questions about their effectiveness for disease prevention and diagnosis call for immediate control. Although this is a critical topic in research areas such as medicine, engineering, and economics, among others, it lacks reliable implementation models. However, projects such as the Open Web Application Security Project (OWASP) and various studies have helped to create useful and reliable apps. This research is conducted under a quality model to optimize two mHealth apps for older adults. The analysis of results on the use of two physical activity monitoring apps, AcTiv (physical activity) and SMCa (energy expenditure), is positive. Through theoretical and practical analysis, precision calculations and personal information control were performed for older adults for disease prevention and diagnosis. Finally, the apps were validated by a physician and, as a result, may be used as health monitoring tools in physical performance centers or for any other physical activity. The results obtained provide an effective validation model for this type of mobile app, which, in turn, may be applied by other software developers who, along with medical staff, would offer digital healthcare tools for elderly people.
Information Theoretic Approach for Beamforming in Wireless Communications
Beamforming is a signal processing technique extensively utilized in wireless communications and radar for intensifying desired signals and minimizing interference through spatial selectivity. In this paper, we present a method for calculating optimal weight vectors for a smart antenna array to achieve a directive pattern during transmission and selective reception in an interference-prone environment. In the proposed scheme, mutual information (MI) extrema are evaluated through an energy-constrained objective function based on a priori information about the interference source and the desired array factor. Signal-to-interference-plus-noise ratio (SINR) performance is evaluated for both transmission and reception. In our scheme, MI serves as an index for identifying the trade-off between information gain, SINR, illumination time, and spatial selectivity in an energy-constrained optimization problem. The employed method yields lower computational complexity, which is demonstrated through comparative analysis with conventional methods. MI-based beamforming enhances signal integrity in degraded environments while reducing computational intricacy and correlating key performance indicators.
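As a point of reference, the conventional delay-and-sum weight vector and array factor for a uniform linear array can be sketched as follows (this is the classical baseline against which such schemes are compared, not the paper's MI-based optimization):

```python
import cmath
import math

def steering_vector(n_elements, spacing_wl, theta_deg):
    """Steering vector of an N-element uniform linear array; spacing in wavelengths."""
    theta = math.radians(theta_deg)
    phase = 2 * math.pi * spacing_wl * math.sin(theta)
    return [cmath.exp(1j * k * phase) for k in range(n_elements)]

def array_factor(weights, spacing_wl, theta_deg):
    """|AF| in direction theta for a given complex weight vector."""
    sv = steering_vector(len(weights), spacing_wl, theta_deg)
    return abs(sum(w.conjugate() * s for w, s in zip(weights, sv)))
```

Setting the weights equal to the steering vector of the desired direction (delay-and-sum) gives the maximum array factor N at that angle and lower gain elsewhere; optimized schemes instead shape the weights to suppress interference directions.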
Phishing Detection: Comparison between Uniform Resource Locator and Content-Based Detection
Web applications are the most targeted by attackers because they are accessible to end users. This has become even more advantageous to attackers since not all end users are aware of what kind of sensitive data they have already leaked through the Internet, especially via social networks, for the sake of 'sharing'. Attackers can use such information, including personal details, favorite artists, actors or actresses, music, politics, and medical records, to customize a phishing attack and thereby trick the user into clicking on malware-laced attachments. The phishing attack is one of the most popular social engineering attacks against web applications. There are several methods to detect phishing websites, such as blacklist/whitelist-based detection, heuristic-based detection, and visual similarity-based detection. Based on papers reviewed from the past few years, this work compares the heuristic-based technique using features of the uniform resource locator (URL) with visual similarity-based detection techniques that compare the content of a suspected phishing page with the legitimate one in order to detect new phishing sites. The comparison focuses on three indicators: false positives and negatives, accuracy of the method, and time consumed to detect a phishing website.
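A few of the URL heuristics commonly used by the first family of detectors can be sketched as follows (the feature set here is illustrative, drawn from widely cited heuristics rather than from any single reviewed paper):

```python
import re
from urllib.parse import urlparse

def url_features(url):
    """Extract simple lexical features commonly used in URL-based phishing detection."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    return {
        "uses_https": parsed.scheme == "https",
        "has_ip_host": re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", host) is not None,
        "has_at_symbol": "@" in url,  # '@' can hide the real destination host
        "num_subdomains": max(host.count(".") - 1, 0),
        "url_length": len(url),
        "has_hyphen_in_host": "-" in host,
    }
```

Such features feed a rule set or classifier; content-based methods instead render the suspected page and compare it visually against the legitimate one, which tends to be slower but catches URLs these lexical cues miss.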
Hierarchical Filtering Method of Threat Alerts Based on Correlation Analysis
Nowadays, Internet threats are enormous and increasing; however, the classification of the huge number of alert messages generated in this environment is relatively monotonous. This affects the accuracy of network situation assessment and also makes it inconvenient for security managers to deal with emergencies. In order to deal with potential network threats effectively and provide better data for network situation awareness, it is essential to build a hierarchical filtering method to prevent threats. This paper establishes a model for data monitoring that filters the original data systematically to obtain the grade of each threat and stores the result for reuse. First, it filters on vulnerable resources, open ports of host devices, and services. Then, it uses entropy theory to calculate the performance changes of the host devices at the time a threat occurs and filters again. Finally, it sorts the changes in performance values at the time of threat occurrence. Alerts and performance data collected in a real network environment are used for evaluation and analysis. Comparative experimental analysis shows that the proposed method can filter threat alerts effectively.
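The entropy step can be illustrated with plain Shannon entropy over an empirical distribution (the paper's exact entropy formulation over host performance data is not specified, so this is a generic sketch over, e.g., per-host event counts):

```python
import math

def shannon_entropy(counts):
    """Shannon entropy in bits of an empirical distribution given as raw counts."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)
```

A sharp drop in entropy, e.g. when one host suddenly dominates the event counts, is the kind of performance change that triggers the second filtering stage.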
Multi-Dimension Threat Situation Assessment Based on Network Security Attributes
As increasing network attacks become more and more complex, network situation assessment based on log analysis cannot meet the requirements of ensuring network security because of the low quality of logs and alerts. This paper addresses the lack of consideration of the security attributes of hosts and attacks in the network; in particular, the identity and effectiveness of Distributed Denial of Service (DDoS) attacks are hard to establish in risk assessment based on alerts and flow matching. This paper proposes a multi-dimension threat situation assessment method based on network security attributes. First, the paper offers an improved Common Vulnerability Scoring System (CVSS) calculation, which includes confidentiality risk, integrity risk, availability risk, and a weighted risk. Second, the paper introduces the deterioration rate of properties collected by sensors in hosts and the network, aimed at assessing the timing and level of DDoS attacks. Third, the paper introduces the distribution of asset value across security attributes, considering the features of attacks and the network, aimed at assessing and showing the whole situation. Experiments demonstrate that the approach reflects the effectiveness and level of DDoS attacks, and the result can show the primary threat in the network and its security requirements. Through comparison and analysis, the method reflects the security requirements and security risk situation better than traditional methods based on alert and flow analysis.
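The weighted-risk combination over the three CVSS-style sub-scores can be sketched as follows (the weights and the 0-10 scale are assumptions for illustration, not the paper's calibrated values):

```python
def weighted_risk(c_risk, i_risk, a_risk, weights=(0.4, 0.3, 0.3)):
    """Weighted sum of confidentiality/integrity/availability sub-scores,
    each on a CVSS-like 0-10 scale; weights reflect asset-value distribution."""
    wc, wi, wa = weights
    assert abs(wc + wi + wa - 1.0) < 1e-9, "weights should sum to 1"
    return wc * c_risk + wi * i_risk + wa * a_risk
```

Distributing asset value across the three security attributes amounts to choosing these weights per host, so that, for example, an availability-critical server weighs a DDoS-induced availability risk more heavily.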
Web Proxy Detection via Bipartite Graphs and One-Mode Projections
With the Internet becoming the dominant channel for business and life, many IPs are increasingly masked by web proxies for illegal purposes such as propagating malware, impersonating phishing pages to steal sensitive data, or redirecting victims to other malicious targets. Moreover, as Internet traffic continues to grow in size and complexity, detecting proxy services has become an increasingly challenging task due to their dynamic updates and high anonymity. In this paper, we present an approach based on behavioral graph analysis to study the behavioral similarity of web proxy users. Specifically, we use bipartite graphs to model host communications from network traffic and build one-mode projections of these bipartite graphs to discover the social-behavior similarity of web proxy users. Based on the similarity matrices of end users derived from the one-mode projection graphs, we apply a simple yet effective spectral clustering algorithm to discover the inherent behavior clusters of web proxy users. A web proxy's URL may vary from time to time, but the underlying interest does not; based on this intuition, and by means of our private tools implemented with WebDriver, we examine whether the top URLs visited by web proxy users are web proxies. Our experimental results on real datasets show that the behavior clusters not only reduce the number of URLs to analyze but also provide an effective way to detect web proxies, especially unknown ones.
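The one-mode projection step can be sketched as follows: two hosts become connected, with a weight equal to the number of URLs they share (host and URL names here are made up for illustration):

```python
from collections import defaultdict
from itertools import combinations

def one_mode_projection(edges):
    """Project a bipartite host-URL graph onto hosts.

    edges: iterable of (host, url) pairs.
    Returns {(host_a, host_b): shared_url_count} for hosts sharing at least one URL.
    """
    urls_by_host = defaultdict(set)
    for host, url in edges:
        urls_by_host[host].add(url)
    proj = {}
    for a, b in combinations(sorted(urls_by_host), 2):
        shared = len(urls_by_host[a] & urls_by_host[b])
        if shared:
            proj[(a, b)] = shared
    return proj
```

The resulting weighted host-host graph yields the similarity matrix fed into spectral clustering: hosts that visit the same proxy front-ends end up in the same behavior cluster even when the proxy URLs themselves change.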
Requirement Engineering and Software Product Line Scoping Paradigm
Requirement engineering (RE) is the part of the software development lifecycle in which the structure of a program is defined. Software product line development is a newer topic area within the domain of software engineering; it also plays an important role in decision making and is ultimately helpful in a growing business environment for productive software development. Decisions are central to engineering processes and hold them together, and it is argued that better decisions lead to better engineering; achieving better decisions requires that they be understood in detail. In order to address these issues, companies are moving towards Software Product Line Engineering (SPLE), which helps provide large varieties of products with minimum development effort and cost. This paper proposes a new framework for software product lines and compares it with other models. The results can help to clarify the needs in SPL testing by identifying points that still require additional investigation. In future work, we will apply this model in a controlled environment with industrial SPL projects, which will open a new horizon for SPL process management testing strategies.
Optimized Approach for Secure Data Sharing in Distributed Database
In the current age of technology, information is the most precious asset of a company, and today companies hold large amounts of data. As data grow larger, access to particular pieces of information becomes slower day by day, and processing data fast enough to shape it into information is the biggest issue. The major problems in distributed databases are the efficiency of data distribution and the response time of data distribution; the security of data distribution is also a big issue. For these problems, we propose a strategy that can maximize the efficiency of data distribution and also improve its response time. The technique gives better results for secure data distribution from multiple heterogeneous sources and enables companies to share data securely, efficiently, and quickly.
IT and Security Experts' Innovation and Investment Front for IT-Entrepreneurship in Pakistan
This paper targets the rising factor of entrepreneurship innovation, which Pakistan lacks compared to other countries and regions such as China, India, and Malaysia. This is an exploratory and explanatory study. Major aspects have been identified as directions for policymakers, while highlighting the issues in their true spirit. IT needs to be considered not only as a technology but also as a growing community in itself. IT management processes are complex and broad, so they generally require extensive attention to the collective aspects of human variables, capital, and technology. In addition, projects tend to have a special set of critical success factors; if these are addressed and given attention, the chances of successful implementation improve. This is only possible with state-of-the-art intelligent decision support systems and, to some extent, the inclusion of IT staff in decision processes. This paper explores this issue carefully and discusses six issues in order to observe the implemented strengths and possible enhancements.
Formal Verification for Ethereum Smart Contract Using Coq
A smart contract in Ethereum is a unique program deployed on the Ethereum Virtual Machine (EVM) to help manage cryptocurrency. The security of smart contracts is critical to Ethereum's operation and highly sensitive. In this paper, we present a formal model for smart contracts, using the separated term-obligation (STO) strategy to formalize and verify them. We use the IBM smart sponsor contract (SSC) as an example to elaborate the details of the formalization process, propose a formal smart sponsor contract model (FSSCM), and verify the SSC's security properties with the interactive theorem prover Coq. Using our formal model and verification method, we found the 'Unchecked-Send' vulnerability in the SSC. Finally, we demonstrate how other smart contracts can be formalized and verified with this approach; our work indicates that formal verification can effectively verify the correctness and security of smart contracts.
Security Issues in Long Term Evolution-Based Vehicle-To-Everything Communication Networks
The ability of vehicles to communicate with other vehicles (V2V), with physical (V2I) and network (V2N) infrastructures, with pedestrians (V2P), and more, collectively known as V2X (Vehicle-to-Everything), will enable a broad and growing set of applications and services within the intelligent transport domain, improving road safety, alleviating traffic congestion, and supporting autonomous driving. The telecommunication research and industry communities and standardization bodies (notably 3GPP) have approved, in Release 14, cellular connectivity to support V2X communication (known as LTE-V2X). The LTE-V2X system combines simultaneous connectivity across existing LTE network infrastructure via the LTE-Uu interface with direct device-to-device (D2D) communication. For V2X services to function effectively, a robust security mechanism is needed to ensure legal and safe interaction among authenticated V2X entities in the LTE-based V2X architecture. The characteristics of vehicular networks and the safety-critical nature of most V2X applications make it essential to protect V2X messages from attacks that can result in catastrophically wrong decisions or actions, including ones affecting road safety. Attack vectors include impersonation, modification, masquerading, replay, man-in-the-middle (MitM), and Sybil attacks. In this paper, we focus on LTE-based V2X security and access control mechanisms. The current LTE-A security framework provides its own access authentication scheme, the AKA protocol, for mutual authentication and other essential cryptographic operations between UEs and the network. V2N systems can leverage this protocol to achieve mutual authentication between vehicles and the mobile core network.
However, this protocol faces technical challenges, such as high signaling overhead, lack of synchronization, handover delay, and potential control-plane signaling overloads, as well as privacy-preservation issues, and thus cannot satisfy the security requirements of the majority of LTE-based V2X services. This paper examines these challenges and points to possible ways in which they can be addressed. One possible solution is a distributed peer-to-peer LTE security mechanism based on the Bitcoin/Namecoin framework, which allows security operations with minimal overhead cost, as is desirable for V2X services. The proposed architecture can ensure fast, secure, and robust V2X services over LTE networks while meeting V2X security requirements.
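The AKA challenge-response structure referenced above can be sketched in a few lines. This is a simplified illustration only: HMAC-SHA-256 stands in for the standardized MILENAGE f-functions, and while the names RAND/AUTN/RES/XRES follow 3GPP terminology, nothing here reproduces the actual 3GPP procedure.

```python
import hmac, hashlib, os

# Simplified sketch of AKA-style mutual authentication between a UE (vehicle)
# and the core network. HMAC replaces the MILENAGE f-functions (assumption).

K = os.urandom(16)  # long-term key shared between the UE and the core network

def f_mac(key, data):
    # network-authentication function (stand-in for f1)
    return hmac.new(key, b"mac" + data, hashlib.sha256).digest()

def f_res(key, data):
    # response function (stand-in for f2)
    return hmac.new(key, b"res" + data, hashlib.sha256).digest()

# Network side: issue a challenge and precompute the expected response.
RAND = os.urandom(16)
AUTN = f_mac(K, RAND)   # lets the UE authenticate the network
XRES = f_res(K, RAND)   # expected UE response

# UE side: verify the network token, then answer the challenge.
def ue_respond(k, rand, autn):
    if not hmac.compare_digest(f_mac(k, rand), autn):
        raise ValueError("network authentication failed")
    return f_res(k, rand)

RES = ue_respond(K, RAND, AUTN)
mutual_auth_ok = hmac.compare_digest(RES, XRES)
print(mutual_auth_ok)  # True
```

Because both checks depend on the shared key K, the exchange authenticates the network to the UE (via AUTN) and the UE to the network (via RES); the signaling cost of running this over the LTE control plane is exactly the overhead the paper argues is too high for many V2X services.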
Evaluating Machine Learning Techniques for Activity Classification in Smart Home Environments
With the widespread adoption of Internet-connected devices and the prevalence of Internet of Things (IoT) applications, there is increased interest in machine learning techniques that can provide useful and interesting services in the smart home domain. The areas that machine learning techniques can help advance are varied and ever-evolving; classifying smart home inhabitants’ Activities of Daily Living (ADLs) is one prominent example. The ability of a machine learning technique to find meaningful spatio-temporal relations in high-dimensional data is an important requirement as well. This paper presents a comparative evaluation of state-of-the-art machine learning techniques for classifying ADLs in the smart home domain: AdaBoost, the Cortical Learning Algorithm (CLA), Decision Trees, Hidden Markov Models (HMM), Multi-layer Perceptrons (MLP), Structured Perceptrons, and Support Vector Machines (SVM). Forty-two synthetic datasets and two real-world datasets with multiple inhabitants are used to evaluate and compare the performance of these techniques. Our results show significant performance differences between the evaluated techniques; overall, neural-network-based techniques show superiority over the others.
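The shape of such an ADL-classification experiment can be sketched with the standard library alone. The activities, the synthetic features (hour of day, sensor zone, duration), and the simple k-NN classifier below are illustrative assumptions, not the paper's datasets or its evaluated techniques; the point is only to show the evaluation pattern of comparing a learner against a majority-class baseline.

```python
import random

# Hypothetical synthetic ADL data: each sample is (hour, sensor_zone, duration_min)
# labeled with an activity (illustrative assumption, not the paper's datasets).
def make_samples(n, rng):
    samples = []
    for _ in range(n):
        activity = rng.choice(["sleep", "cook", "watch_tv"])
        if activity == "sleep":
            x = (rng.gauss(2, 1), 0.0, rng.gauss(420, 30))   # bedroom, ~7 h
        elif activity == "cook":
            x = (rng.gauss(18, 1), 1.0, rng.gauss(35, 5))    # kitchen, ~35 min
        else:
            x = (rng.gauss(20, 1), 2.0, rng.gauss(90, 15))   # living room, ~90 min
        samples.append((x, activity))
    return samples

def knn_predict(train, x, k=3):
    # vote among the k nearest training samples by squared Euclidean distance
    nearest = sorted(train, key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], x)))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

def accuracy(predict, test):
    return sum(predict(x) == label for x, label in test) / len(test)

rng = random.Random(7)
train, test = make_samples(300, rng), make_samples(100, rng)

labels_train = [label for _, label in train]
majority = max(set(labels_train), key=labels_train.count)   # majority-class baseline
base_acc = accuracy(lambda x: majority, test)
knn_acc = accuracy(lambda x: knn_predict(train, x), test)
print(f"majority baseline: {base_acc:.2f}, 3-NN: {knn_acc:.2f}")
```

In a real evaluation, the `knn_predict` slot would be filled by each of the compared techniques (AdaBoost, HMM, MLP, SVM, ...) and the accuracies averaged over the 44 datasets.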
Deploying a Platform as a Service Cloud Solution to Support Student Learning
This presentation describes the design and implementation of PaaS (platform as a service) cloud-based labs used in database-related courses to teach students practical skills. Traditionally, all labs are implemented in a desktop-based environment in which students have to install heavy client software to access database servers. To release students from that burden, we have successfully deployed a cloud-based solution to support database-related courses, through which students and teachers can practice and learn database topics via cloud access. With its development environment, execution runtime, web server, database server, and collaboration capability, the platform offers a shared pool of configurable computing resources and a comprehensive environment that supports students’ needs without the complexity of maintaining the infrastructure.
Lightweight and Seamless Distributed Scheme for the Smart Home
Security of the smart home, in terms of behavior-activity pattern recognition, is a distinct issue from the security concerns of other scenarios. Sensor devices (of low and high capacity) interact and negotiate with each other by detecting individuals’ daily behavior activities in order to execute common tasks. Once a device (e.g., a surveillance camera, smartphone, or light-detection sensor) is compromised, an adversary can gain access to it and disrupt daily behavior activities by altering data and commands. In this scenario, a group of common instruction processes may become involved and generate deadlock. An effective security solution suited to the smart home architecture is therefore required. This paper proposes a seamless distributed scheme that fortifies low-computational-capacity wireless devices for secure communication. The proposed scheme is based on a lightweight session-key process that upholds a cryptographic link along the communication trajectory by recognizing individuals’ behavior-activity patterns. Every device and service-provider unit (low-capacity sensors (LCS) and high-capacity sensors (HCS)) uses an authentication token and establishes a secure trajectory connection in the network. Experimental analysis reveals that the proposed scheme strengthens devices against device-seizure attacks by recognizing daily behavior activities, minimizes memory utilization on the LCS, and keeps the network free of deadlock. Additionally, comparison with other schemes indicates that the proposed scheme is efficient in terms of computation and communication.
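A token-based session-key setup of the kind the abstract describes can be sketched as follows. The key-derivation construction and the LCS/HCS message names are illustrative assumptions about this class of scheme, not the paper's exact protocol.

```python
import hmac, hashlib, secrets

# Hedged sketch: lightweight session-key setup between a low-capacity sensor
# (LCS) and a high-capacity service unit (HCS) from a pre-provisioned
# authentication token (construction is an assumption, not the paper's scheme).

def derive_session_key(token, nonce_lcs, nonce_hcs):
    # Both sides hold the shared token and exchange fresh nonces, so they
    # derive the same per-session key without ever sending the token or the
    # key over the air.
    return hmac.new(token, nonce_lcs + nonce_hcs, hashlib.sha256).digest()

def mac(session_key, message):
    # lightweight integrity tag on each command message
    return hmac.new(session_key, message, hashlib.sha256).hexdigest()

token = secrets.token_bytes(16)                   # provisioned to both LCS and HCS
n_lcs, n_hcs = secrets.token_bytes(8), secrets.token_bytes(8)

k_lcs = derive_session_key(token, n_lcs, n_hcs)   # computed on the sensor
k_hcs = derive_session_key(token, n_lcs, n_hcs)   # computed on the service unit

msg = b"light:on"
tag = mac(k_lcs, msg)
print(hmac.compare_digest(tag, mac(k_hcs, msg)))  # True
```

A seized device that lacks the token cannot derive the session key, so altered commands fail the MAC check; only symmetric primitives are used, keeping the computation within reach of low-capacity sensors.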
Valuation of Cultural Heritage: A Hedonic Pricing Analysis of Housing via GIS-based Data
The hedonic pricing model has been widely applied to estimate the economic value of environmental amenities in urban housing, but results for cultural-heritage variables remain relatively ambiguous. In this paper, integrated variables, extended with GIS-based data and an existing typology of communities, are used to examine how cultural heritage, environmental amenities, and disamenities affect housing prices across urban communities in Tainan, Taiwan. The developed models suggest that, although a sophisticated variable for central services is selected, the centrality of location is not fully controlled in the price models and is thus picked up by correlated peripheral and central amenities such as cultural heritage, open space, and parks. Analysis of these correlations permits us to qualify the results and present a revised set of relatively reliable estimates. Positive effects on housing prices are identified for views, various types of recreational infrastructure, and proximity to national cultural sites and significant landscapes. Negative effects are found for several disamenities, including waste yards, refuse incinerators, petrol stations, and industrial facilities. The results suggest that systematic hypothesis testing and reporting of correlations may contribute to consistent explanatory patterns in hedonic pricing estimates for cultural heritage and landscape amenities in urban areas.
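The standard semi-log specification behind such hedonic models can be written as follows; the symbols here are generic placeholders, and the paper's exact variable set differs:

```latex
\ln P_i \;=\; \beta_0
  \;+\; \sum_{k} \beta_k S_{ik}
  \;+\; \sum_{m} \gamma_m N_{im}
  \;+\; \sum_{j} \delta_j H_{ij}
  \;+\; \varepsilon_i
```

where \(P_i\) is the transaction price of house \(i\), \(S_{ik}\) are structural attributes (floor area, age, ...), \(N_{im}\) are neighborhood amenities and disamenities, \(H_{ij}\) are GIS-derived cultural-heritage proximity variables, and \(\varepsilon_i\) is the error term. Each coefficient \(\delta_j\) is read as the implicit (percentage) price of proximity to heritage attribute \(j\), which is why uncontrolled correlation between \(H_{ij}\) and centrality can bias these estimates.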
Study on Construction of 3D Topography by UAV-Based Images
In this paper, a method for fast 3D topography modeling from high-resolution camera images is studied, based on the characteristics of an Unmanned Aerial Vehicle (UAV) system for low-altitude aerial photogrammetry and the need for three-dimensional (3D) urban landscape modeling. First, a special overlap design for the existing high-resolution digital camera is obtained by reconstructing and analyzing the auto-flight paths of the UAVs; the self-calibration function is improved in software to achieve high-precision imaging and further increase the resolution of the imaging system. Second, multi-angle images, including vertical and oblique images acquired by the UAV system, are used for detailed measurement of urban land surfaces and for texture extraction. Finally, aerial photography and 3D topography construction are carried out both on the campus of Chang-Jung University and in the Guerin district of Tainan, Taiwan, providing a validation model for constructing 3D topography from combined UAV-based camera images. The results demonstrate that the UAV system for low-altitude aerial photogrammetry can be used for 3D topography production, and that the technical solution offered in this paper provides a new, fast plan for 3D expression of the city landscape, fine modeling, and visualization.
Comparative Study of Ad Hoc Routing Protocols in Vehicular Ad-Hoc Networks for Smart City
In this paper, we investigate several routing protocols in the Vehicular Ad-Hoc Network (VANET) context. Specifically, we study the efficiency of Dynamic Source Routing (DSR), Ad hoc On-demand Distance Vector routing (AODV), Destination-Sequenced Distance Vector (DSDV), Optimized Link State Routing (OLSR), and the Vehicular Multi-hop Algorithm for Stable Clustering (VMASC) in terms of packet delivery ratio (PDR) and throughput. The performance evaluation and comparison of the studied protocols show that VMASC is the best protocol with respect to fast data transmission and link stability in VANETs. All results are validated with the NS-3 simulator.
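The two evaluation metrics used in the comparison are simple to state precisely. The counts in the example call below are illustrative placeholders, not the paper's NS-3 results.

```python
# Definitions of the two VANET evaluation metrics (example numbers are
# illustrative placeholders, not the paper's simulation results).

def packet_delivery_ratio(received, sent):
    # PDR = packets received at destinations / packets generated by sources
    return received / sent if sent else 0.0

def throughput_kbps(payload_bytes_received, sim_time_s):
    # throughput = payload bits successfully delivered per second of simulated time
    return payload_bytes_received * 8 / sim_time_s / 1000.0

pdr = packet_delivery_ratio(received=950, sent=1000)
tput = throughput_kbps(payload_bytes_received=1_000_000, sim_time_s=10.0)
print(f"PDR = {pdr:.2f}, throughput = {tput:.1f} kb/s")  # PDR = 0.95, throughput = 800.0 kb/s
```

In an NS-3 study these counts come from per-flow statistics collected during the simulation; a protocol like VMASC ranks best when it sustains both a higher PDR and a higher delivered-bit rate under vehicular mobility.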