Normalizing Scientometric Indicators of Individual Publications Using Local Cluster Detection Methods on Citation Networks
One of the major shortcomings of widely used scientometric indicators is that different disciplines cannot be compared with each other. The issue of cross-disciplinary normalization has long been discussed, but even the classification of publications into scientific domains poses problems. Structural properties of citation networks offer new possibilities; however, the large size and constant growth of these networks call for caution. Here we present a new tool that relies on the structural properties of citation networks to perform cross-field normalization of scientometric indicators of individual publications. Owing to the large size of the networks, a systematic procedure for identifying scientific domains based on a local community detection algorithm is proposed. The algorithm is tested on different benchmark and real-world networks. Using this algorithm, the mechanism of the indicator normalization process is then demonstrated for several indicators, such as the citation count, the P-index and a local version of the PageRank indicator. The fat-tailed shape of the article indicator distributions enables us to perform the normalization successfully.
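The normalization step described above can be illustrated with a minimal sketch: given a (hypothetical, pre-computed) assignment of publications to clusters, an indicator value is divided by the mean value within its cluster, so that a score of 1.0 means "average for its field". This illustrates only the idea, not the authors' implementation.

```python
from collections import defaultdict

def normalize_by_cluster(indicator, cluster_of):
    """Divide each paper's indicator value by the mean of its cluster.

    indicator : dict paper_id -> raw value (e.g. citation count)
    cluster_of: dict paper_id -> cluster label (assumed here to come
                from a local community detection pass)
    """
    totals, counts = defaultdict(float), defaultdict(int)
    for paper, value in indicator.items():
        c = cluster_of[paper]
        totals[c] += value
        counts[c] += 1
    means = {c: totals[c] / counts[c] for c in totals}
    return {p: v / means[cluster_of[p]] for p, v in indicator.items()}

# Toy values: a highly cited biology paper and a math paper with few
# citations can end up equally "above average" for their own fields.
citations = {"a": 40, "b": 10, "c": 4, "d": 1}
clusters  = {"a": "bio", "b": "bio", "c": "math", "d": "math"}
norm = normalize_by_cluster(citations, clusters)
```

Here paper "a" (40 citations in a field averaging 25) and paper "c" (4 citations in a field averaging 2.5) both score 1.6.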
Approach for Demonstrating Reliability Targets for Rail Transport during Low Mileage Accumulation in the Field: Methodology and Case Study
In the railway industry, train sets are designed to contractual requirements (the mission profile), where reliability targets are measured in terms of mean distance between failures (MDBF). However, at the beginning of revenue service, trains often do not achieve the designed mission-profile distance (mileage) within the specified timeframe because of infrastructure constraints, scarcity of commuters or other operational challenges, so the original design inputs are not respected. Since the trains do not run enough to achieve the designed mileage within the specified time, the car builder risks failing to meet the contractual MDBF target. This paper proposes a constant-failure-rate model for situations where the achieved mileage accumulation departs from the design mission profile. The model provides the appropriate MDBF target to be demonstrated based on the actually accumulated mileage. A case study of rolling stock running in the field is undertaken to analyze the failure data and the MDBF target demonstration during low mileage accumulation. The results of the case study show that, with the proposed method, reliability targets are achieved under low mileage accumulation.
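Under a constant failure rate, the bookkeeping behind such a prorated target is simple. The sketch below (toy numbers, not the paper's case-study data) shows the demonstrated MDBF and the largest number of failures still consistent with a target over the mileage actually accumulated.

```python
def demonstrated_mdbf(accumulated_km, failures):
    """Point estimate of MDBF from field data under a constant
    failure rate: total accumulated distance per observed failure."""
    if failures == 0:
        return float("inf")
    return accumulated_km / failures

def allowed_failures(accumulated_km, mdbf_target):
    """Largest whole number of failures still consistent with the
    MDBF target over the mileage actually accumulated so far."""
    return int(accumulated_km / mdbf_target)

# Toy fleet: 50,000 km MDBF target, only 600,000 km accumulated.
mdbf = demonstrated_mdbf(600_000, 10)       # 60,000 km per failure
budget = allowed_failures(600_000, 50_000)  # up to 12 failures
```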
Multidimensional Compromise Optimization for Development Ranking of the Gulf Cooperation Council Countries and Turkey
In this research, a multidimensional compromise optimization method is proposed for multidimensional decision-making analysis in the development ranking of the Gulf Cooperation Council countries and Turkey. The proposed approach reconciles the ranking solutions of different multicriteria decision analyses, which can yield different ranking orders for the same problem (a set of alternatives evaluated against numerous competing criteria) even when applied to the same numerical data. The multiobjective optimization decision-making problem is treated in three sequential steps. In the first step, five criteria related to development ranking are gathered from the research field. In the second step, the identified evaluation criteria are objectively weighted using the standard deviation procedure. In the third step, a country selection problem is illustrated with a numerical example as an application of the proposed multidimensional compromise optimization model. Finally, the approach is applied to rank the Gulf Cooperation Council countries and Turkey.
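The second step, objective weighting by standard deviation, together with a toy compromise ranking, can be sketched as follows. The min-max scoring stands in for whichever multicriteria analyses are being reconciled, and the data are invented.

```python
from statistics import pstdev

def sd_weights(matrix):
    """Objective criterion weights proportional to each criterion's
    standard deviation across the alternatives."""
    sds = [pstdev(col) for col in zip(*matrix)]
    total = sum(sds)
    return [s / total for s in sds]

def rank(matrix, names):
    """Toy compromise score: SD-weighted sum of min-max normalized
    values, assuming all criteria are benefit criteria."""
    cols = list(zip(*matrix))
    lo, hi = [min(c) for c in cols], [max(c) for c in cols]
    w = sd_weights(matrix)
    scores = [sum(wj * (row[j] - lo[j]) / (hi[j] - lo[j])
                  for j, wj in enumerate(w)) for row in matrix]
    return sorted(zip(names, scores), key=lambda t: -t[1])

data = [[3.0, 9.0], [1.0, 1.0], [2.0, 5.0]]  # invented criterion values
ranking = rank(data, ["A", "B", "C"])
```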
Pose Normalization Network for Object Classification
Convolutional Neural Networks (CNNs) have demonstrated their effectiveness in synthesizing 3D views of object instances at various viewpoints. Given the problem where one has only limited viewpoints of a particular object for classification, we present a pose normalization architecture that transforms the object to viewpoints existing in the training dataset before classification, yielding better classification performance. We demonstrate that this Pose Normalization Network (PNN) can capture the style of the target object and re-render it to a desired viewpoint. Moreover, we show that the PNN improves the classification results for the 3D chairs dataset and the ShapeNet airplanes dataset when given only images at limited viewpoints.
Normalizing Logarithms of Realized Volatility in an ARFIMA Model
Modelling realized volatility with high-frequency returns is popular because realized volatility is an unbiased and efficient estimator of return volatility. A computationally simple model fits the logarithms of the realized volatilities with a fractionally integrated long-memory Gaussian process. The Gaussianity assumption simplifies parameter estimation using the Whittle approximation. Nonetheless, this assumption may not be met in finite samples, and the financial series may need to be normalized. Based on the empirical indices S&P500 and DAX, this paper examines the performance of the linear volatility model pre-treated with normalization compared with its existing counterpart. The empirical results show that including normalization as a pre-treatment step improves forecast performance over the existing model in terms of both statistical and economic evaluations.
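As a concrete (and purely illustrative) example of such a pre-treatment, the rank-based normal-score transform maps a series toward Gaussianity; the paper's exact normalization may differ.

```python
from statistics import NormalDist

def normal_scores(x):
    """Map each observation to the standard-normal quantile of its
    rank, pushing the empirical distribution toward Gaussianity.
    Ties are broken arbitrarily in this minimal sketch."""
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])
    z = [0.0] * n
    for rank, i in enumerate(order, start=1):
        # rank/(n+1) keeps the argument strictly inside (0, 1)
        z[i] = NormalDist().inv_cdf(rank / (n + 1))
    return z
```

The transform is monotone, so it reshapes the marginal distribution of the log-volatility series without changing the ordering of observations.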
Image Retrieval Based on Multi-Feature Fusion for Heterogeneous Image Databases
Selecting an appropriate image representation is the most important factor in implementing an effective Content-Based Image Retrieval (CBIR) system. This paper presents a multi-feature fusion approach for efficient CBIR, based on the distance distributions of features and relative feature weights computed at query time. It is a simple yet effective approach that is free from the effects of feature dimensionality, ranges, internal feature normalization and the choice of distance measure, and it can easily be adopted in any feature combination to improve retrieval quality. The proposed approach is empirically evaluated using two benchmark image classification datasets (a subset of the Corel dataset, and Oliva and Torralba) and compared with existing approaches. Its performance is confirmed by significant improvements over the independently evaluated baselines of previously proposed feature fusion approaches.
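The distance-distribution idea can be sketched as follows: each feature's query-to-database distances are scaled by that feature's own mean distance, so features with very different ranges contribute comparably without normalizing the raw vectors. A hypothetical sketch, not the paper's exact formula.

```python
def fuse_distances(per_feature_distances, weights=None):
    """Combine per-feature distance lists into one ranking score.

    Each feature's distances to the database images are divided by
    that feature's mean distance, making features with different
    dimensionalities and ranges directly comparable.
    """
    k = len(per_feature_distances)
    weights = weights or [1.0 / k] * k
    n = len(per_feature_distances[0])
    fused = [0.0] * n
    for w, dists in zip(weights, per_feature_distances):
        mean = sum(dists) / len(dists)
        for i, d in enumerate(dists):
            fused[i] += w * d / mean
    return fused

# A color feature with distances ~1 and a texture feature with
# distances ~100 end up contributing equally after scaling.
fused = fuse_distances([[1.0, 2.0, 3.0], [100.0, 200.0, 300.0]])
```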
Applying Spanning Tree Graph Theory for Automatic Database Normalization
In the Knowledge and Data Engineering field, the relational database is the preferred repository for storing real-world data, and it has been in use around the world for decades. Normalization is the most important process in the analysis and design of relational databases. It aims at creating a set of relational tables with minimum data redundancy that preserve consistency and facilitate correct insertion, deletion, and modification. Despite its importance, very few algorithms have been developed for use in commercial automatic normalization tools, and normalization is still rarely done automatically rather than manually. Moreover, for the large and complex databases of today, doing it manually is harder still. This paper presents a new, fully automated relational database normalization method. It first produces the directed graph and its spanning tree, and then proceeds to generate the 2NF, 3NF and BCNF normal forms. The benefit of this new algorithm is that it can cope with a large set of complex functional dependencies.
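Whatever graph machinery is layered on top, automated normalization rests on reasoning over functional dependencies. A standard building block (not the spanning-tree algorithm itself) is the attribute-closure computation used to test candidate keys and spot violating dependencies:

```python
def closure(attrs, fds):
    """Closure of an attribute set under functional dependencies.

    fds: iterable of (lhs, rhs) pairs of attribute sets.
    Repeatedly fires any FD whose left side is already covered.
    """
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if set(lhs) <= result and not set(rhs) <= result:
                result |= set(rhs)
                changed = True
    return result

# For R(A, B, C) with A -> B and B -> C:
# {A}+ = {A, B, C}, so A is a key, but B -> C is a transitive
# dependency, meaning R is not in 3NF.
fds = [({"A"}, {"B"}), ({"B"}, {"C"})]
```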
Enhanced Gram-Schmidt Process for Improving the Stability in Signal and Image Processing
The Gram-Schmidt Process (GSP) is used to convert a non-orthogonal basis (a set of linearly independent vectors) into an orthonormal basis (a set of orthogonal, unit-length vectors). The process takes each vector in turn and subtracts from it its components along the previously processed vectors. This paper introduces an Enhanced version of the Gram-Schmidt Process (EGSP) with inverse, which is useful for signal and image processing applications.
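For context, the standard remedy for the numerical instability that motivates such enhancements is the modified ordering of the process, sketched below in plain Python; the EGSP itself is not reproduced here.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def modified_gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors.

    Modified ordering: every remaining vector is orthogonalized
    against each unit vector q as soon as q is produced, which is
    numerically more stable than the classical ordering.
    """
    vs = [list(v) for v in vectors]
    basis = []
    for i, v in enumerate(vs):
        norm = dot(v, v) ** 0.5
        q = [x / norm for x in v]
        basis.append(q)
        for w in vs[i + 1:]:
            proj = dot(q, w)
            for k in range(len(w)):
                w[k] -= proj * q[k]
    return basis
```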
Adaptive Score Normalization: A Novel Approach for Multimodal Biometric Systems
Multimodal biometric systems integrate the data presented by multiple biometric sources and hence offer better performance than systems based on a single biometric modality. Although biometric systems can be coupled at different levels, fusion at the score level is the most common, since it has proven more effective than fusion at the other levels. However, the scores from different modalities are generally heterogeneous, so a normalization step is needed to transform them into a common domain before combining them. In this paper, we study the performance of several normalization techniques with various fusion methods in the context of merging three unimodal systems based on the face, the palmprint and the fingerprint. We also propose a new adaptive normalization method that takes into account the distributions of client scores and impostor scores. Experiments conducted on a database of 100 people show that the performance of a multimodal system depends on the choice of the normalization method and the fusion technique. The proposed normalization method gives the best results.
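Three of the classical score normalization techniques typically compared in such studies, plus simple sum-rule fusion, can be sketched as follows (generic textbook forms, not the proposed adaptive method):

```python
from statistics import mean, pstdev
import math

def min_max(scores):
    """Map scores linearly onto [0, 1]."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def z_score(scores):
    """Center on the mean and scale by the standard deviation."""
    m, sd = mean(scores), pstdev(scores)
    return [(s - m) / sd for s in scores]

def tanh_norm(scores):
    """Tanh estimator: squashes scores into (0, 1), robust to
    outliers (the 0.01 factor is a conventional choice)."""
    m, sd = mean(scores), pstdev(scores)
    return [0.5 * (math.tanh(0.01 * (s - m) / sd) + 1) for s in scores]

def sum_fusion(*modalities):
    """Sum-rule fusion of already-normalized per-modality scores."""
    return [sum(col) / len(col) for col in zip(*modalities)]
```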
Photograph Based Pair-matching Recognition of Human Faces
In this paper, a novel system for pair-matching recognition of human faces from different color photographs is proposed. It mainly involves face detection, normalization and recognition. A face detection method combining Haar-like detection with region-based histogram stretching (RHST) segmentation is proposed to achieve more accurate performance than using Haar alone. Apart from an effective angle normalization, a side-face (pose) normalization, which is important and beneficial for preprocessing, is introduced. Then histogram-based and photometric normalization methods are investigated, and adaptive smoothing retinex (ASR) is selected for its satisfactory illumination normalization. Finally, a weighted multi-block local binary pattern with three distance measures is applied for pair-matching. Experimental results show its advantageous performance compared with PCA and multi-block LBP.
Normalization and Constrained Optimization of Measures of Fuzzy Entropy
In the information theory literature, there is a need to compare the different measures of fuzzy entropy, which in turn gives rise to the need for normalizing those measures. In this paper, we discuss this need and develop some normalized measures of fuzzy entropy. It is also desirable to maximize entropy and to minimize directed divergence or distance; with this in mind, we explain a method for optimizing different measures of fuzzy entropy under constraints.
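For instance, the classical De Luca-Termini measure can be normalized by its maximum value n·ln 2, attained when every membership equals 0.5; a minimal sketch of this normalization idea:

```python
import math

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of membership values in [0, 1]."""
    def h(m):
        if m in (0.0, 1.0):   # crisp membership contributes nothing
            return 0.0
        return -(m * math.log(m) + (1 - m) * math.log(1 - m))
    return sum(h(m) for m in mu)

def normalized_fuzzy_entropy(mu):
    """Divide by the maximum n*ln(2) so that fuzzy sets of different
    sizes become comparable on the scale [0, 1]."""
    return fuzzy_entropy(mu) / (len(mu) * math.log(2))
```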
A Normalization-based Robust Image Watermarking Scheme Using SVD and DCT
Digital watermarking is one of the techniques for
copyright protection. In this paper, a normalization-based robust
image watermarking scheme which encompasses singular value
decomposition (SVD) and discrete cosine transform (DCT)
techniques is proposed. For the proposed scheme, the host image is
first normalized to a standard form and divided into non-overlapping
image blocks. SVD is applied to each block. By concatenating the
first singular values (SV) of adjacent blocks of the normalized image,
an SV block is obtained. DCT is then carried out on the SV blocks to
produce SVD-DCT blocks. A watermark bit is embedded in the high-frequency band of an SVD-DCT block by imposing a particular
relationship between two pseudo-randomly selected DCT
coefficients. An adaptive frequency mask is used to adjust local
watermark embedding strength. Watermark extraction involves
mainly the inverse process. The watermark extracting method is blind
and efficient. Experimental results show that the quality degradation
of watermarked image caused by the embedded watermark is visually
transparent. Results also show that the proposed scheme is robust
against various image processing operations and geometric attacks.
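The coefficient-relationship idea can be sketched as follows; the margin, the symmetric adjustment and the coefficient pair are illustrative choices, not the paper's exact embedding rule.

```python
def embed_bit(c1, c2, bit, margin=2.0):
    """Force a relationship between two selected DCT coefficients:
    bit 1 -> c1 >= c2 + margin, bit 0 -> c1 <= c2 - margin.
    Both coefficients are moved symmetrically about their midpoint,
    keeping the overall distortion small."""
    mid = (c1 + c2) / 2
    if bit:
        return mid + margin / 2, mid - margin / 2
    return mid - margin / 2, mid + margin / 2

def extract_bit(c1, c2):
    """Blind extraction: only the sign of the difference is needed."""
    return 1 if c1 > c2 else 0
```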
A Normalization-based Robust Watermarking Scheme Using Zernike Moments
Digital watermarking has become an important technique for copyright protection, but its robustness against attacks remains a major problem. In this paper, we propose a normalization-based robust image watermarking scheme. In the proposed scheme, the original host image is first normalized to a standard form. The Zernike transform is then applied to the normalized image to calculate Zernike moments. Dither modulation is adopted to quantize the magnitudes of the Zernike moments according to the watermark bit stream. The watermark extraction method is blind. Security analysis and false-alarm analysis are then performed. The quality degradation of the watermarked image caused by the embedded watermark is visually transparent. Experimental results show that the proposed scheme is highly robust against various image processing operations and geometric attacks.
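Dither modulation quantizes each moment magnitude onto one of two interleaved lattices, and the decoder picks the bit whose lattice lies nearer. A minimal sketch (the step size is an assumed parameter, not taken from the paper):

```python
def qim_embed(value, bit, step=8.0):
    """Quantize a moment magnitude onto one of two interleaved
    lattices: multiples of `step` carry bit 0, and the lattice
    shifted by step/2 carries bit 1."""
    q = round((value - bit * step / 2) / step)
    return q * step + bit * step / 2

def qim_extract(value, step=8.0):
    """Decide the bit by which lattice is nearer to the value."""
    d0 = abs(value - round(value / step) * step)
    d1 = abs(value - step / 2 - round((value - step / 2) / step) * step)
    return 0 if d0 <= d1 else 1
```

As long as an attack perturbs a quantized magnitude by less than step/4, the embedded bit still decodes correctly, which is the source of the scheme's robustness.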
Fixed Point Equations Related to Motion Integrals in Renormalization Hopf Algebra
In this paper we consider quantum motion integrals that depend on the algebraic reconstruction of the BPHZ method for perturbative renormalization, in two different procedures. Then, based on the Bogoliubov character and the Baker-Campbell-Hausdorff (BCH) formula, we show how a motion-integral condition on the components of the Birkhoff factorization of a Feynman rules character on the Connes-Kreimer Hopf algebra of rooted trees can determine a family of fixed point equations.
Systholic Boolean Orthonormalizer Network in Wavelet Domain for Microarray Denoising
We describe a novel method for removing noise of unknown variance from microarrays in the wavelet domain. The method is based on the following procedure: we apply 1) the bidimensional Discrete Wavelet Transform (DWT-2D) to the noisy microarray, 2) scaling and rounding to the coefficients of the highest subbands (to obtain integer, positive coefficients), 3) bit-slicing to the new highest subbands (to obtain bit-planes), and 4) the Systholic Boolean Orthonormalizer Network (SBON) to the input bit-plane set, obtaining two orthonormal output bit-plane sets (in a Boolean sense), one of which is projected onto the other by means of an AND operation; we then apply 5) re-assembling and 6) rescaling. Finally, 7) we apply the inverse DWT-2D and reconstruct a microarray from the modified wavelet coefficients. Denoising results compare favorably to most methods currently in use.
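Steps 3 and 5 of the pipeline, bit-slicing the integer coefficients into bit-planes and re-assembling them afterwards, can be sketched as below; the SBON itself is not reproduced.

```python
def bit_planes(values, nbits=8):
    """Slice non-negative integer wavelet coefficients into
    bit-planes, least significant plane first (step 3)."""
    return [[(v >> b) & 1 for v in values] for b in range(nbits)]

def reassemble(planes):
    """Rebuild the integer coefficients from bit-planes (step 5)."""
    n = len(planes[0])
    return [sum(plane[i] << b for b, plane in enumerate(planes))
            for i in range(n)]
```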
A New High Speed Neural Model for Fast Character Recognition Using Cross Correlation and Matrix Decomposition
Neural processors have shown good results for detecting a given character in an input matrix. In this paper, a new idea to speed up the operation of neural processors for character detection is presented. Such processors are designed based on cross-correlation in the frequency domain between the input matrix and the weights of the neural networks. This approach is developed to reduce the computation steps required by these faster neural networks for the searching process. The principle of the divide-and-conquer strategy is applied through image decomposition: each image is divided into small sub-images, and each one is tested separately by a single faster neural processor. Furthermore, faster character detection is obtained by using parallel processing techniques to test the resulting sub-images at the same time with the same number of faster neural networks. Compared with using faster neural processors alone, the speed-up ratio increases with the size of the input image when faster neural processors are combined with image decomposition. Moreover, the problem of local sub-image normalization in the frequency domain is solved, and the effect of image normalization on the speed-up ratio of character detection is discussed. Simulation results show that local sub-image normalization through weight normalization is faster than sub-image normalization in the spatial domain, and the overall speed-up ratio of the detection process increases further because the normalization of weights is done offline.
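The frequency-domain operation at the heart of such fast detection is ordinary FFT-based cross-correlation. The sketch below replaces the neural weights with an arbitrary kernel and ignores the decomposition and normalization machinery.

```python
import numpy as np

def cross_correlate_fft(image, kernel):
    """Circular cross-correlation via the FFT: multiply the image
    spectrum by the conjugate spectrum of the zero-padded kernel.
    out[i, j] = sum_{u, v} image[i+u, j+v] * kernel[u, v] (indices
    taken modulo the image size)."""
    h = np.zeros_like(image, dtype=float)
    h[:kernel.shape[0], :kernel.shape[1]] = kernel
    spec = np.fft.fft2(image) * np.conj(np.fft.fft2(h))
    return np.real(np.fft.ifft2(spec))

img = np.arange(16.0).reshape(4, 4)
out = cross_correlate_fft(img, np.ones((2, 2)))
```

One pair of FFTs replaces a sliding-window dot product at every position, which is where the speed-up over spatial-domain testing comes from.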
Using Automated Database Reverse Engineering for Database Integration
An important problem in today's organizations is the existence of non-integrated information systems, inconsistency, and the lack of suitable correlations between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new-technology systems. In legacy systems, huge amounts of data are stored in legacy databases. These require particular attention, since they need extra effort to be normalized, reformatted and moved to modern database environments. Designing the new integrated (global) database architecture and applying reverse engineering require data normalization. This paper proposes the use of database reverse engineering to integrate legacy and modern databases in organizations. The suggested approach consists of methods and techniques for generating the data transformation rules needed for data structure normalization.
An Improved Illumination Normalization based on Anisotropic Smoothing for Face Recognition
Robust face recognition under various illumination environments is very difficult and must be achieved for successful commercialization. In this paper, we propose an improved illumination normalization method for face recognition. Illumination normalization based on anisotropic smoothing is known to be effective among illumination normalization methods, but it deteriorates the intensity contrast of the original image and produces less sharp edges. The method proposed in this paper improves the previous anisotropic smoothing-based illumination normalization so that it increases the intensity contrast and enhances the edges while diminishing the effect of illumination variations. As a result of these improvements, face images preprocessed by the proposed illumination normalization method have more distinctive Gabor feature vectors for face recognition. Through face recognition experiments based on Gabor feature vector similarity, the effectiveness of the proposed illumination normalization method is verified.
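The anisotropic smoothing that this family of methods builds on is, in its classical form, Perona-Malik diffusion: intensities are averaged within homogeneous regions while an edge-stopping function suppresses diffusion across strong gradients. A generic sketch with invented parameters, not the improved algorithm of the paper:

```python
import numpy as np

def anisotropic_smooth(img, iterations=10, kappa=20.0, lam=0.2):
    """Classical Perona-Malik diffusion with periodic boundaries."""
    def g(d):
        # edge-stopping conductance: ~1 in flat areas, ~0 across edges
        return np.exp(-(d / kappa) ** 2)

    u = img.astype(float).copy()
    for _ in range(iterations):
        dn = np.roll(u, 1, axis=0) - u   # differences to the
        ds = np.roll(u, -1, axis=0) - u  # four neighbours
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u = u + lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

img = np.zeros((8, 8))
img[4, 4] = 10.0                     # toy image: a single bright spot
out = anisotropic_smooth(img, iterations=5)
```

Diffusion conserves the mean intensity while reducing variance, which is why a contrast-restoring step such as the one proposed above is needed afterwards.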
A Content Based Image Watermarking Scheme Resilient to Geometric Attacks
Multimedia security is an incredibly significant area of concern. This paper discusses a robust image watermarking scheme that can withstand geometric attacks. The source image is first moment-normalized in order to make it withstand geometric attacks, and the moment-normalized image is then wavelet transformed. The first-level wavelet-transformed image is segmented into blocks of size 8x8, and the product of the mean and standard deviation of each block is computed. The second-level wavelet-transformed image is likewise divided into 8x8 blocks, and the product of the block mean and standard deviation is computed. The difference between the products at the two levels forms the watermark, which is inserted by modulating the mid-frequency coefficients. The modulated image is inverse wavelet transformed and inverse moment normalized to generate the watermarked image, which is then ready for transmission. The proposed scheme can be used to validate identification cards and financial instruments. Its performance has been evaluated using a set of parameters, and experimental results show the effectiveness of the scheme.
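Moment normalization starts from the geometric moments of the image. A minimal sketch of the first step, locating the centroid, which provides the translation invariance behind the scheme's geometric robustness; the scale and orientation normalization steps are omitted.

```python
def centroid(img):
    """Centroid from the geometric moments m00, m10, m01 of a
    grayscale image given as a list of rows."""
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            m00 += v
            m10 += x * v
            m01 += y * v
    return m10 / m00, m01 / m00
```

Translating the image so that its centroid sits at a fixed reference point makes the subsequent embedding, and the blind extraction, independent of where the image content was shifted to.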