Computación y Sistemas

Online ISSN 2007-9737, Print ISSN 1405-5546

Comp. y Sist. vol. 16, no. 4, Ciudad de México, Oct./Dec. 2012

 

Editorial

 

This issue comprises an assorted collection of papers reporting on excellent and interesting pieces of research. Overall, it contains one invited paper, eight regular papers, and one extended PhD thesis abstract. Of these regular papers, three are about robotics, two about digital signal processing, one about formal protocol specification and verification, and the other two about the foundations of computing. We briefly describe each of these articles below.

In the invited paper, Cross-Language Plagiarism Detection Using the BabelNet Statistical Dictionary,[1] Franco-Salvador et al. apply several techniques over BabelNet's statistical dictionary to translate documents suspected of plagiarism into a source language, and then compare each translated document against a collection of authenticated documents in order to detect this kind of dishonesty. The authors approach plagiarism detection using the Cross-Language Alignment-based Similarity Analysis (CL-ASA) model, which yields a list of candidate translations, weighted using a similarity function. To validate the benefits of using BabelNet's dictionary, the authors have set up an experiment, using the PAN-PC'11 corpus as a testbed, for the language pairs English-Spanish and English-German, and have successfully established that their proposed method outperforms a similar one based on a dictionary trained with the IBM M1 alignment model, especially in the English-Spanish test case.
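
The details of CL-ASA are in the paper itself; as a rough illustration of the underlying idea (weighting candidate translations with a probabilistic dictionary and scoring their overlap with a source document), the following minimal Python sketch uses a made-up toy dictionary. It is not the authors' model, nor real BabelNet data.

```python
from collections import Counter

# Toy statistical (probabilistic) bilingual dictionary: p(target_word | source_word).
# The entries and probabilities below are invented for illustration only.
STAT_DICT = {
    "casa":   {"house": 0.7, "home": 0.3},
    "perro":  {"dog": 0.9, "hound": 0.1},
    "grande": {"big": 0.6, "large": 0.4},
}

def cl_similarity(suspicious_es, source_en):
    """Score how well an English source document 'covers' a Spanish suspicious
    document, accumulating translation probabilities for every dictionary
    translation that actually occurs in the source."""
    source_counts = Counter(source_en.lower().split())
    words = suspicious_es.lower().split()
    score = 0.0
    for w in words:
        for translation, prob in STAT_DICT.get(w, {}).items():
            if source_counts[translation] > 0:
                score += prob
                break                       # count each suspicious word at most once
    return score / max(len(words), 1)       # normalize by suspicious-document length

print(cl_similarity("el perro grande", "the big dog runs"))      # high overlap
print(cl_similarity("el perro grande", "a quiet empty street"))  # no overlap
```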

Arenas-Mena et al. present a novel approach to automatically generate eye-believable motions for virtual characters navigating a cluttered 3D environment. Central to their approach is the use of an abstract model of a virtual (humanoid) skeleton, which helps both in representation (and, hence, motion classification), since it reduces the high dimensionality involved in specifying a character configuration, and in planning, since it still retains enough detail to yield non-trivial, eye-believable motions. Arenas-Mena et al. also suggest an abstract model, based on Principal Component Analysis (PCA), that provides a compact structure for storing motion-capture clips, which can be used to yield motions for, or learn new ones from, a planned path. Together, both models enable the construction of a planner, which has been used in a simulator to successfully output path plans for challenging environments with obstacles that can be jumped over or passed underneath.
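
The paper's exact PCA-based structure is not reproduced here; the sketch below only illustrates the general idea of storing a clip compactly as a small basis of principal components plus per-frame coefficients. The pose data are synthetic random numbers (real motion data are far more correlated, so far fewer components suffice).

```python
import numpy as np

# Hypothetical motion-capture clip: 200 frames, each a 60-dimensional pose
# vector (e.g., 20 joints x 3 angles). Random data stands in for real capture.
rng = np.random.default_rng(0)
clip = rng.normal(size=(200, 60))

# PCA via SVD of the mean-centred frames.
mean_pose = clip.mean(axis=0)
centred = clip - mean_pose
U, S, Vt = np.linalg.svd(centred, full_matrices=False)

k = 8                                  # keep only the first k principal components
basis = Vt[:k]                         # (k, 60) compact basis for the clip
coeffs = centred @ basis.T             # (200, k) low-dimensional representation

# Any frame can be approximately reconstructed from its k coefficients.
reconstructed = coeffs @ basis + mean_pose
print("storage ratio:", (coeffs.size + basis.size + mean_pose.size) / clip.size)
print("relative reconstruction error:",
      np.linalg.norm(reconstructed - clip) / np.linalg.norm(clip))
```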

Salazar-Silva et al. introduce a new methodology for modeling and controlling a manipulator arm mounted on a mobile robot, a so-called mobile manipulator. The mobile manipulator model transforms the stated problem into a simpler one, namely modeling a stationary manipulator with non-holonomic kinematic constraints on the joints. Since the task space associated with a mobile manipulator is larger than that of a stationary manipulator, Salazar-Silva et al. also present a suitable task-space control, which consists of an internal compensator for the robot dynamics together with the factory-installed Proportional-Derivative (PD) control, and another, external PD control, which operates with a feedforward of an estimate of the derivative of the posture kinematic model. Salazar-Silva et al. validate the robustness of their control using three sets of numerical experiments.
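
As a purely generic illustration of the PD-plus-feedforward structure mentioned above (not the authors' mobile-manipulator model, and with arbitrary gains), the following sketch tracks a sinusoidal reference with a toy double-integrator plant.

```python
import math

# Toy double-integrator "joint" (q_ddot = u), used only to illustrate PD
# feedback combined with a feedforward term; gains and plant are assumed.
dt, T = 0.001, 2.0
Kp, Kd = 100.0, 20.0

def reference(t):
    """Desired position, velocity and acceleration of a sinusoidal trajectory."""
    return math.sin(t), math.cos(t), -math.sin(t)

q, q_dot = 0.0, 0.0
for step in range(int(T / dt)):
    qd, qd_dot, qd_ddot = reference(step * dt)
    # Feedforward of the reference acceleration plus PD feedback on the error.
    u = qd_ddot + Kd * (qd_dot - q_dot) + Kp * (qd - q)
    q_dot += u * dt                      # explicit Euler integration
    q += q_dot * dt

print("final tracking error:", abs(reference(T)[0] - q))
```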

López-Juárez et al. report on a methodology, based on neural networks, for rapidly locating and accurately recognizing a part regardless of its scale, displacement or orientation, which can be applied to self-adapting industrial robots that perform assembly tasks. Key to their methodology is the use of pattern vector descriptors, computed from collections of 2D images, to quickly obtain feature data about an object. To achieve fast, on-line object learning, López-Juárez et al. use only two geometric patterns, a simple and a complex one, learning and capturing the most relevant aspects of a scene so that they can later be used to recognize the object, without needing to examine the object in full detail. The authors present an experimental setting through which they have conducted a series of tests. The results obtained throughout their experiments are used to argue for an outstanding overall recognition rate, with recognition times below 1 ms, suggesting the suitability of the approach for industrial-strength operations.
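
The authors' pattern vector descriptors are specific to their paper; as a classical stand-in for the idea of features that are insensitive to scale, displacement and orientation, the sketch below computes the first two Hu moment invariants of a binary image in plain NumPy and shows that a translated and rescaled shape yields nearly identical values.

```python
import numpy as np

def hu_first_two(img):
    """First two Hu moment invariants of a 2-D image: classical translation-,
    scale- and rotation-invariant shape features."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00
    def mu(p, q):                        # central moments
        return ((x - xc) ** p * (y - yc) ** q * img).sum()
    def eta(p, q):                       # scale-normalised central moments
        return mu(p, q) / m00 ** (1 + (p + q) / 2)
    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return phi1, phi2

# A small filled square, and the same square translated and scaled up 2x.
a = np.zeros((64, 64)); a[10:20, 10:20] = 1
b = np.zeros((64, 64)); b[30:50, 25:45] = 1
print(hu_first_two(a))
print(hu_first_two(b))   # near-identical values despite translation and scaling
```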

Villapol presents a formal model, based on Colored Petri Nets (CPN), of Baseband, a key protocol of the Bluetooth stack. Baseband is responsible for establishing a connection between a master and up to seven slave devices. Using CPN and other formalisms (such as set theory), Villapol formally specifies a number of properties, some of which are borrowed from the standard and others proposed by the author herself. To verify the model against these properties, she follows two methods: one in which verification is reduced to verifying properties of a special model graph, and another in which it is carried out using a temporal logic similar to CTL. The results show that Baseband satisfies the intended properties, at least for a one-master, one-slave configuration, and under some simplifying assumptions.
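
State-space verification of the kind used for the first method can be illustrated, in a deliberately simplified way, by exploring a reachability graph and checking properties over it. The toy master/slave handshake below is invented for illustration; it is not the CPN Baseband model.

```python
from collections import deque

# Toy state space for a master/slave connection handshake (made up for
# illustration). Each state is a (master, slave) pair.
TRANSITIONS = {
    ("idle", "idle"):           [("paging", "idle")],
    ("paging", "idle"):         [("paging", "page_resp"), ("idle", "idle")],
    ("paging", "page_resp"):    [("connected", "connected")],
    ("connected", "connected"): [("idle", "idle")],
}

def reachable(initial):
    """Breadth-first exploration of every reachable (master, slave) state."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        for nxt in TRANSITIONS.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

states = reachable(("idle", "idle"))
# Safety-style property: the slave is never connected while the master is not.
assert all(not (s == "connected" and m != "connected") for m, s in states)
# Reachability property: the fully connected state can actually be reached.
assert ("connected", "connected") in states
print(len(states), "reachable states; both properties hold")
```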

Aguilar-Torrentera introduces a coding method suitable for code division multiple access based on distributed transversal filtering, which gets around the need for data recovery circuits and operates at chip-rate speeds. Aguilar-Torrentera shows that his coding approach, based on spread time, enables the creation of orthogonal channels, and that the corresponding encoder can be implemented by means of a filter with a spread-in-time response function. The filter responses approximate short spreading waveforms and can also be used to attain desirable correlation properties. The coding method would allow three users to simultaneously share an available channel.
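
The paper's orthogonal channels are obtained from spread-time filter responses; as a plainly swapped-in, classical stand-in for what orthogonal channelisation buys, the sketch below spreads three users' bits with Walsh-Hadamard codes and recovers each bit by correlation.

```python
import numpy as np

# Walsh-Hadamard spreading codes as a generic illustration of orthogonal
# channels; these are not the spread-time waveforms designed in the paper.
H = np.array([[1]])
for _ in range(2):                       # build the 4x4 Hadamard matrix
    H = np.block([[H, H], [H, -H]])

codes = H[1:4]                           # three mutually orthogonal codes (rows)
bits = np.array([+1, -1, +1])            # one data bit per user

# Each user spreads its bit with its code; the channel adds the three signals.
received = (bits[:, None] * codes).sum(axis=0)

# Correlating with each user's own code recovers that user's bit exactly,
# because the cross-correlations between distinct codes are zero.
recovered = (received @ codes.T) / codes.shape[1]
print(recovered)                         # -> [ 1. -1.  1.]
```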

Pérez-Pérez et al. report on the results of an experimental study on applying Broder's algorithm, which randomly selects spanning trees of a graph via the construction of a random walk. Roughly, a random walk over a connected, undirected graph is one that departs from an arbitrarily chosen vertex and traverses the entire graph, thereby generating a Markov chain described by means of a transition matrix. At each step of the walk, the next vertex to visit is chosen from the neighborhood of the current vertex. Each edge in the graph is labeled with a probability, giving rise to a probability distribution associated with every vertex that indicates how likely each vertex in its neighborhood is to be the immediate successor. Pérez-Pérez et al. explain the results obtained from applying Broder's algorithm but, unlike previous research, they consider Markov chains with non-uniform probability distributions associated with vertex vicinities. For that purpose, they use a test set gathering a collection of randomly generated connected graphs together with others borrowed from the literature. Overall, the aim of Pérez-Pérez et al. is to provide efficient procedures to generate, by means of random walks, spanning trees with interesting, desirable properties, including tree diameter, inner vertex valence, etc. This is of paramount interest, as it can be used, for example, in the development of authentication protocols based on graph theory.
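
To make the construction above concrete, the following minimal sketch walks a small, made-up weighted graph and keeps the first-entry edges as a spanning tree, with transition probabilities proportional to arbitrary edge weights, mirroring the non-uniform distributions mentioned above. It is only an illustration, not the experimental code of the paper.

```python
import random

def random_walk_spanning_tree(adj, weights, start):
    """Broder-style construction: walk the graph at random and keep, for every
    vertex except the start, the edge used to enter it for the first time;
    those first-entry edges form a spanning tree. Transition probabilities
    are proportional to (possibly non-uniform) edge weights."""
    visited, tree, current = {start}, [], start
    while len(visited) < len(adj):
        neighbours = adj[current]
        w = [weights[(current, v)] for v in neighbours]
        nxt = random.choices(neighbours, weights=w, k=1)[0]
        if nxt not in visited:           # first entry into nxt
            visited.add(nxt)
            tree.append((current, nxt))
        current = nxt
    return tree

# Small example graph (adjacency lists) with non-uniform symmetric edge weights.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
weights = {}
for u in adj:
    for v in adj[u]:
        weights[(u, v)] = weights.get((v, u), random.uniform(0.5, 2.0))
        weights[(v, u)] = weights[(u, v)]

print(random_walk_spanning_tree(adj, weights, start=0))   # three tree edges
```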

Luna-Benoso and Yáñez-Márquez introduce an alpha-beta associative model built out of cellular automata constructs. Roughly, a cellular automaton is a synchronous model made out of interconnected components, called cells, forming a lattice. At each point in time, governed by a universal clock, each cell is in one of a finite number of states; as time goes by, the state of each cell changes following a function of its local neighborhood. An associative memory is a pattern recognition model, especially designed to correctly retrieve a full pattern from a collection of noisy input patterns. The authors' model, which they call alpha-beta cellular, bridges the conceptual gap between cellular automata and alpha-beta associative models, making it possible to apply methods from one realm to the other, and vice versa. In the paper, the authors show how to apply the alpha-beta cellular model to the classification of hand-written symbols, using the MNIST database as a testbed, and highlight key results.
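
The cellular automaton notions recalled above can be illustrated with a few lines of code: a lattice of binary cells, all updated at once by a rule over each cell's local neighborhood. The rule and lattice below are arbitrary; they are not the alpha-beta cellular model itself.

```python
import numpy as np

# Minimal synchronous cellular automaton on a 2-D lattice of binary cells.
rng = np.random.default_rng(1)
grid = rng.integers(0, 2, size=(16, 16))

def step(g):
    # Sum of the four von Neumann neighbours, with wrap-around boundaries.
    neighbours = (np.roll(g, 1, 0) + np.roll(g, -1, 0) +
                  np.roll(g, 1, 1) + np.roll(g, -1, 1))
    # Local rule (majority-like): a cell becomes 1 iff at least two neighbours are 1.
    return (neighbours >= 2).astype(int)

for _ in range(5):                       # five ticks of the global clock
    grid = step(grid)
print(grid.sum(), "cells active after 5 synchronous updates")
```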

Dolecek and Mitra present a method for designing a multiplier-less comb-based filter. Roughly, comb filters suffer from a high pass band droop and low folding-band attenuation, the first folding band being the most critical, as it has less attenuation than the subsequent ones. In this work, Dolecek and Mitra aim at extending and generalizing their own two-stage CIC-based decimator, which has proven to provide a good aliasing rejection characteristic and a low pass band droop for a second-stage decimation factor equal to 8. Here, the authors address to what extent it is possible to achieve a low pass band droop and a high stop band attenuation in the first folding band with a minimum value of the second-stage decimation factor (equal to two). Dolecek and Mitra report an improvement in the alias rejection of the first comb folding band; unlike rival techniques, the proposed filter is less complex, as it is multiplier-less. Their new filter compensates for the comb pass band droop and is based solely on comb filters, which makes it possible to meet power-consumption goals by appropriately applying a recursive or a non-recursive comb structure. Apart from the design process, Dolecek and Mitra present a comparison of the proposed design method against competing techniques.
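
Pass band droop and folding-band attenuation are easy to see from the standard comb (CIC) magnitude response |H(e^jw)| = |sin(Mw/2) / (M sin(w/2))|^N; the sketch below evaluates it numerically for illustrative values of M and N (not the parameters of the paper's design).

```python
import numpy as np

# Magnitude response of an N-stage comb (CIC) decimation filter with
# decimation factor M; M and N below are illustrative only.
M, N = 16, 3
w = np.linspace(1e-6, np.pi, 4096)
H_db = 20 * np.log10(np.abs(np.sin(M * w / 2) / (M * np.sin(w / 2))) ** N)

wp = np.pi / (2 * M)                     # a typical pass band edge after decimation
droop_db = H_db[np.argmin(abs(w - wp))]

# The first folding band is centred on 2*pi/M, where aliases land after decimation.
fold = (w > 2 * np.pi / M - wp) & (w < 2 * np.pi / M + wp)
worst_alias_db = H_db[fold].max()

print(f"pass band droop at wp: {droop_db:.2f} dB")
print(f"worst attenuation in first folding band: {worst_alias_db:.2f} dB")
```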

Finally, Mar-Ortiz et al. give a summary of the key results that Mar-Ortiz's PhD thesis contributes to the area of reverse logistics. Roughly, Mar-Ortiz aims to develop mathematical models to optimize the performance of systems for the collection of waste electrical and electronic equipment. His contributions are threefold. First, the design of collection networks, where he proposes a three-phase, hierarchical approach based on simulation, heuristics and integer programming, which has proven to be efficient and to provide an average deviation lower than 1.2% from optimal solutions. Second, the optimization of collection routes, where he proposes the use of a greedy randomized adaptive search procedure (GRASP), capable of handling tight constraints regarding the collection capacity of a fixed, heterogeneous vehicle fleet and a range of dates within which collection must take place. Third, the optimization of disassembly systems, where he provides a formal definition of the disassembly cell formation problem, so that the disassembly tasks are scheduled in a way that minimizes total costs, recovers all intended goods, and satisfies a number of constraints. For this problem, he proposes a heuristic-based Tabu search algorithm, which shows a 2.93% average deviation from optimal solutions.
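
For readers unfamiliar with GRASP, the generic skeleton is a loop of greedy randomized construction (picking each element from a restricted candidate list) followed by local search. The sketch below applies that skeleton to a toy collection-route problem with random points; it is not the procedure developed in the thesis.

```python
import random, math, itertools

random.seed(3)
points = [(random.random(), random.random()) for _ in range(12)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_length(route):
    return sum(dist(points[route[i]], points[route[i + 1]])
               for i in range(len(route) - 1))

def construct(alpha=3):
    """Greedy randomized construction: repeatedly pick the next stop at random
    from the `alpha` nearest unvisited points (the restricted candidate list)."""
    route, remaining = [0], set(range(1, len(points)))
    while remaining:
        rcl = sorted(remaining,
                     key=lambda j: dist(points[route[-1]], points[j]))[:alpha]
        nxt = random.choice(rcl)
        remaining.remove(nxt)
        route.append(nxt)
    return route

def local_search(route):
    """Swap pairs of stops while any swap shortens the route."""
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(1, len(route)), 2):
            cand = route[:]
            cand[i], cand[j] = cand[j], cand[i]
            if route_length(cand) < route_length(route):
                route, improved = cand, True
    return route

best = min((local_search(construct()) for _ in range(30)), key=route_length)
print("best route length:", round(route_length(best), 3))
```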

We are confident that this issue will benefit the community working in either practical or theoretical aspects of information technology and communication. Enjoy reading it!

 

Raúl Monroy

 

NOTE

[1] Original title: "Detección de plagio translingüe utilizando el diccionario estadístico de BabelNet".
