Scielo RSS <![CDATA[Computación y Sistemas]]> http://www.scielo.org.mx/rss.php?pid=1405-554620140004&lang=en vol. 18 num. 4 lang. en <![CDATA[SciELO Logo]]> http://www.scielo.org.mx/img/en/fbpelogp.gif http://www.scielo.org.mx <![CDATA[<b>Editorial</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400001&lng=en&nrm=iso&tlng=en <![CDATA[<b>Simulating and Visualizing Real-Time Crowds on GPU Clusters</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400002&lng=en&nrm=iso&tlng=en We present a set of algorithms for simulating and visualizing real-time crowds on GPU (Graphics Processing Unit) clusters. First, we present crowd simulation and rendering techniques that take advantage of single-GPU machines. Then, using a wandering crowd behavior simulation algorithm as an example, we explain how algorithms of this kind can be extended for use in GPU cluster environments. We also present a visualization architecture that renders the simulation results using detailed 3D virtual characters. This architecture is adaptable in order to support the Barcelona Supercomputing Center (BSC) infrastructure. The results show that our algorithms are scalable across different hardware platforms, including embedded systems, desktop GPUs, and GPU clusters, in particular the BSC's Minotauro cluster. <![CDATA[<b>Open Framework for Web Service Selection Using Multimodal and Configurable Techniques</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400003&lng=en&nrm=iso&tlng=en Services, as part of our daily life, represent an important means of delivering value to their consumers and have a great economic impact on organizations. The consumption of services and their exponential proliferation show their importance to and acceptance by customers. In this sense, it is possible to predict that the infrastructure of future cities will be supported by different kinds of services, such as smart city services, open data services, and common services (e.g., e-mail). Nowadays, a large percentage of services are provided on the web and are commonly called web services (WSs). This kind of service has become one of the most widely used technologies in software systems. Among the challenges of integrating web services into a given system, requirements-driven selection occupies a prominent place. A comprehensive selection process needs to check compliance with Non-Functional Requirements (NFRs), which can be assessed by analyzing the Quality of Service (QoS). In this paper, we describe a framework called WeSSQoS that aims at ranking available WSs based on the comparison of their QoS and the stated NFRs. The framework is designed as an open Service Oriented Architecture (SOA) that hosts a configurable portfolio of normalization procedures and ranking algorithms, which can be selected by users when starting a selection process. The QoS data of WSs can be obtained either from a static, WSDL-like description or dynamically through monitoring techniques. WeSSQoS is designed to work over multiple WS repositories and QoS sources. The impact of having a portfolio of different normalization and ranking algorithms is illustrated with an example.
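The abstract above describes ranking candidate web services by comparing their QoS against stated NFRs through a configurable portfolio of normalization procedures and ranking algorithms. The Python sketch below illustrates one such combination (min-max normalization plus weighted-sum ranking); it is not the WeSSQoS implementation, and the attribute names, weights, and sample QoS data are illustrative assumptions only.

# Minimal sketch of QoS-based web service ranking, in the spirit of the
# WeSSQoS framework described above. NOT the actual WeSSQoS code: the
# attributes, weights, and the min-max normalization / weighted-sum ranking
# shown here are assumptions for illustration.

# Hypothetical QoS measurements per candidate service.
candidates = {
    "ServiceA": {"response_ms": 120.0, "availability": 0.995},
    "ServiceB": {"response_ms": 480.0, "availability": 0.999},
    "ServiceC": {"response_ms": 250.0, "availability": 0.970},
}

# Assumed NFR weights: how much each QoS attribute matters to the consumer.
weights = {"response_ms": 0.6, "availability": 0.4}
lower_is_better = {"response_ms": True, "availability": False}

def min_max_normalize(values, invert):
    """Map raw QoS values to [0, 1]; invert when lower raw values are better."""
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0
    return {k: ((hi - v) if invert else (v - lo)) / span for k, v in values.items()}

def rank(candidates, weights, lower_is_better):
    """One possible portfolio entry: weighted sum over normalized attributes."""
    scores = {name: 0.0 for name in candidates}
    for attr, w in weights.items():
        raw = {name: qos[attr] for name, qos in candidates.items()}
        norm = min_max_normalize(raw, lower_is_better[attr])
        for name in scores:
            scores[name] += w * norm[name]
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    for name, score in rank(candidates, weights, lower_is_better):
        print(f"{name}: {score:.3f}")

A different normalization procedure or ranking algorithm could be swapped into the two functions above, which is the kind of configurability the framework's portfolio is described as providing.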
<![CDATA[<b>Fast and Efficient Palmprint Identification of a Small Sample within a Full Image</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400004&lng=en&nrm=iso&tlng=en In some fields, such as forensic research, experts need to match a found sample from an individual with its full counterpart stored in a database. The found sample may present several characteristics that make this matching more difficult to perform, such as distortion and, most importantly, a very small size. Several solutions intended to solve this problem have been presented; however, they either require a large computational effort or achieve a low recognition rate. In this paper, we present a fast, simple, and efficient method to relate a small sample of a partial palmprint to a full one using elementary optimization processes and a voting mechanism. Experimentation shows that our method achieves a higher recognition rate than the state-of-the-art method when identifying palmprint samples with a radius as small as 2.64 cm. <![CDATA[<b>Simulation of Baseball Gaming by Cooperation and Non-Cooperation Strategies</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400005&lng=en&nrm=iso&tlng=en Baseball is a highly strategic collective game that challenges the team manager's decision-making. The classic Nash equilibrium applies to non-cooperative games, while the Kantian equilibrium applies to cooperative ones. We use both the Nash equilibrium (NE) and the Kantian equilibrium (KE), separately or in combination, for a team's selection of strategies during a baseball match: as soon as the strategies selected by NE or KE leave a team losing the match, a switch to KE or NE is introduced. According to the results of computer simulations, with this variation in strategy selection the losing team tends to close the gap or overtake the team with the advantage. Hence, combining Nash selfish-gaming strategies with Kantian collaborative-gaming strategies strengthens a baseball team's performance. <![CDATA[<b>Security Enhancement on Li-Lee's Remote User Authentication Scheme Using Smart Card</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400006&lng=en&nrm=iso&tlng=en Recently, Li and Lee proposed a new remote user authentication scheme using smart cards. However, their scheme requires a verification table, and the user's identity is not protected. Moreover, users cannot change their passwords off-line. In order to overcome these security flaws, we propose a new scheme that provides more security without affecting the merits of the original scheme. <![CDATA[<b>A Heuristic Approach for Blind Source Separation of Instant Mixtures</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400007&lng=en&nrm=iso&tlng=en In this paper we present a methodology for blind source separation (BSS) based on a coherence function to solve the problem of linear instantaneous mixtures of signals. The proposed methodology consists of minimizing the coherence function using a heuristic algorithm based on the simulated annealing method. We also derive an analytical expression of the coherence for the BSS model, from which it is found that independent and identically distributed (iid) Gaussian components can be recovered. Our results show satisfactory performance in comparison with traditional methods.
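The BSS abstract above describes minimizing a coherence function with a simulated-annealing-based heuristic to unmix a linear instantaneous mixture. The Python sketch below illustrates that general idea under stated assumptions: the dependence measure used here (absolute correlation between the squared, whitened, rotated components) is a simple stand-in, not the authors' coherence function, and the sources, mixing matrix, and annealing parameters are hypothetical.

# Illustrative sketch: blind source separation of a 2x2 linear instantaneous
# mixture by simulated annealing over a rotation angle. The objective below is
# an assumed dependence proxy, NOT the coherence function derived in the paper.
import numpy as np

rng = np.random.default_rng(0)

# Two independent, non-Gaussian sources and an instantaneous mixture.
n = 5000
s = np.vstack([np.sign(rng.standard_normal(n)) * rng.standard_normal(n) ** 2,
               rng.uniform(-1, 1, n)])
A = np.array([[0.8, 0.6], [0.4, 0.9]])          # unknown mixing matrix
x = A @ s                                        # observed mixtures

# Whiten the observations so a single rotation angle parameterizes unmixing.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
V = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
z = V @ x

def dependence(theta):
    """Dependence proxy: |correlation| between squared components after rotation."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    y = R @ z
    return abs(np.corrcoef(y[0] ** 2, y[1] ** 2)[0, 1])

# Simulated annealing over the rotation angle (arbitrary schedule).
theta = rng.uniform(0, np.pi / 2)
best, best_cost = theta, dependence(theta)
T = 1.0
for _ in range(2000):
    cand = theta + rng.normal(scale=0.1)         # random neighbor
    delta = dependence(cand) - dependence(theta)
    if delta < 0 or rng.random() < np.exp(-delta / T):
        theta = cand                             # accept better or, sometimes, worse moves
    if dependence(theta) < best_cost:
        best, best_cost = theta, dependence(theta)
    T *= 0.995                                   # geometric cooling

print(f"estimated rotation angle: {best:.3f} rad, residual dependence: {best_cost:.4f}")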
<![CDATA[<b>Designing Minimal Sorting Networks Using a Bio-inspired Technique</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400008&lng=en&nrm=iso&tlng=en Sorting Networks (SNs) are efficient tools for sorting an input data sequence. They are composed of a set of comparison-exchange operations called comparators. The comparators are fixed a priori for a given input size and are independent of the input configuration. An SN with a minimal number of comparators is an optimal way to sort data; finding one is a classical NP-hard problem that has been studied for more than 50 years. In this paper we adapt a biologically inspired heuristic called Artificial Immune System to evolve candidate sets of SNs. In addition, a local strategy is proposed that considers the information regarding comparators and the sequences to be ordered at a given building stage. New optimal Sorting Network designs for input sizes from 9 to 15 are presented. <![CDATA[<b>Periodicity-Based Computation of Optical Flow</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400009&lng=en&nrm=iso&tlng=en The standard Brightness Constancy Equation states the spatiotemporal shift invariance of the input data along a local velocity of the optical flow. In turn, the shift invariance leads to a periodic function of a real argument. This allows a known test for periodicity to be applied to the computation of optical flow at random locations. The approach is also valid for higher dimensions: for example, it applies to a sequence of 3D tomography images. The proposed method has a reasonably high accuracy for continuous flow and is noise-tolerant. Special attention is paid to weak-signal input. It is shown that a drastic reduction in signal strength worsens the accuracy of the estimates only insignificantly. For a possible application to tomography, this would lead to an unprecedented reduction in harmful radiation exposure. <![CDATA[<b>Wikification of Learning Objects Using Metadata as an Alternative Context for Disambiguation</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400010&lng=en&nrm=iso&tlng=en We present a methodology to wikify learning objects. Our proposal focuses on two processes: word sense disambiguation and relevant phrase selection. The disambiguation process involves the use of the learning object's metadata as either additional or alternative context. This increases the probability of success when a learning object has a low-quality context. The selection of relevant phrases is performed by identifying the highest values of semantic relatedness between the main subject of a learning object and the phrases. This criterion is useful for achieving the didactic objectives of the learning object. <![CDATA[<b>Sliding Windows by Blocks for Online Wavelet Discrete Transform Implementation</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400011&lng=en&nrm=iso&tlng=en En este trabajo se propone un esquema para la implementación online de la Transformada Wavelet Discreta. Se introducen mejoras en cuanto a tiempo de ejecución respecto al método de ventanas deslizantes tradicional. En la propuesta se realiza una modificación a la definición de la ventana de datos propuesta en el esquema original. Las pruebas realizadas muestran que el algoritmo propuesto es más rápido que el de ventanas deslizantes tradicionales.<hr/>In this paper we propose a scheme for the online implementation of the Discrete Wavelet Transform.
Our proposal improves the execution time compared to the traditional sliding window method. We also modify the definition of the data window proposed in the original scheme. The experiments we performed show that the proposed algorithm runs faster than the traditional sliding window method. <![CDATA[<b>Study of the Global Dynamics for a Tumor Immune-Evasion System</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400012&lng=en&nrm=iso&tlng=en En este artículo se estudia la dinámica global del modelo de Evasión-Inmune presentado por Arciero, Jackson y Kirschner [1], el cual describe la interacción entre células efectoras, células cancerígenas y las citocinas IL-2 y TGF-β en el sitio del tumor. El sistema modela distintos comportamientos, como lo son: puntos de equilibrio, órbitas periódicas y ciclos límite estables. Utilizando el método de Localización de Conjuntos Compactos Invariantes se logra definir un dominio en el espacio de estados donde se localizan todas las dinámicas que exhibe el modelo de Evasión-inmune. La localización de dicho dominio es importante debido a que proporciona información sobre la salud del individuo en corto y largo plazo. Los límites de tal dominio representan los valores mínimos y máximos de las variables de estado y se expresan mediante desigualdades algebraicas dadas por una combinación de los parámetros del sistema. Adicionalmente, mediante una función candidata de Lyapunov, se demuestra que la región de localización es un dominio positivamente invariante, lo que permite asegurar que dada cualquier condición inicial, las trayectorias del sistema no divergen. Finalmente, se presentan simulaciones numéricas y se realiza un análisis de las posibles implicaciones biológicas de los resultados obtenidos.<hr/>In this paper we study the global dynamics of the Tumor Immune-Evasion model proposed by Arciero, Jackson, and Kirschner [1], which describes the interaction between effector cells, cancer cells, and the cytokines IL-2 and TGF-β at the tumor site. The system models different behaviors such as equilibrium points, periodic orbits, and stable limit cycles. By using the Localization of Compact Invariant Sets method, we define a domain in the state space where all the dynamics exhibited by the Immune-Evasion model are located. The localization of this domain is important because it provides information about the individual's health in the short and long term. The boundaries of this domain represent the minimum and maximum values of the state variables and are expressed by algebraic inequalities given by a combination of the system's parameters. Furthermore, by means of a Lyapunov candidate function, we demonstrate that the localizing region is a positively invariant domain, which ensures that, given any initial condition, the trajectories of the system do not diverge. Finally, we present numerical simulations and analyze the possible biological implications of our results. <![CDATA[<b>Support for Starting Collaboration in Distributed Software Development through Collaborative Working Spheres</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400013&lng=en&nrm=iso&tlng=en Este artículo estudia los entornos de trabajo donde grupos de personas interactúan de manera síncrona y remota (distribuida), con el propósito de crear y desarrollar software dentro del marco institucional de una organización, en lo que se conoce como desarrollo distribuido de software (DSD, por sus siglas en inglés).
En este tipo de esquemas colaborativos, los desarrolladores requieren trabajar en grupos que están geográficamente distribuidos y su interacción generalmente es realizada con el apoyo de tecnología de información y comunicación. Es común que las tecnologías de colaboración no estén diseñadas para apoyar lo que llamamos inicio de colaboración informado, es decir los escenarios donde el iniciador de la colaboración pueda contar con la información de la actividad que realiza la persona buscada, con la cual, el iniciador pueda inferir si el momento para iniciarla es óptimo y apropiado para ambos. Para lograr esto, es necesario conocer el contexto de la actividad de la persona buscada en un momento determinado. Para apoyar el inicio de colaboración informado se propone la conceptualización y caracterización tecnológica de esferas de trabajo colaborativas, la cual aporta ideas de diseño para el desarrollo de una herramienta prototipo. A esta herramienta le llamamos CWS-IM (Collaborative Working Spheres - Instant Messaging). La herramienta es un mensajero instantáneo extendido con soporte para inicios de colaboración informados, la cual se introdujo a las actividades reales de DSD en una fábrica de software con la finalidad de evaluarla mediante un estudio de caso. Los resultados de esta evaluación proporcionan evidencia que muestra la aceptación favorable de CWS-IM por parte de los participantes en términos de utilidad, facilidad de uso, apoyo al inicio de interacción y gestión del nivel de interrupción.<hr/>This paper studies work environments where groups of people interact synchronously and remotely (i.e., in a distributed manner) with the purpose of creating and developing software within the institutional framework of an organization, in what is known as distributed software development (DSD). In this type of collaborative scheme, developers are required to work in geographically distributed groups, and their interaction is usually conducted with the support of information and communication technologies. Collaboration technologies are commonly not designed to support what we call informed collaboration initiation, i.e., scenarios where the initiator of the collaboration has information about the activity being performed by the person he/she is looking for, from which the initiator can infer whether the moment to start the collaboration is optimal and appropriate for both of them. To achieve this, it is necessary to know the context of the activity of the sought person at a given moment. To support informed collaboration initiation, we propose the conceptualization and technological characterization of collaborative working spheres, which provides design ideas for the development of a prototype tool. We call this tool CWS-IM (Collaborative Working Spheres - Instant Messaging). The tool is an instant messenger extended with support for informed collaboration initiation; it was introduced into the real DSD activities of a software factory in order to evaluate it through a case study. The results of this evaluation provide evidence of the favorable acceptance of CWS-IM by the participants in terms of usefulness, ease of use, support for interaction initiation, and management of the interruption level.
<![CDATA[<b>Concurrent Real-Time Task Schedulers</b>: <b>A Classification Based on Functions and Set Theory</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400014&lng=en&nrm=iso&tlng=en Los Sistemas Operativos en Tiempo Real deben brindar soporte para concurrencia, para lograrlo se requieren de los Planificadores de tareas. Los planificadores operan sobre un Conjunto de Tareas en Tiempo Real Concurrentes donde sus instancias requieren ejecutarse hasta completarse dentro de sus plazos máximos; el planificador recibe un conjunto de Tiempos de Arribo y lo mapea hacia un Conjunto de Tiempos de Inicio para que las instancias empiecen su ejecución. En este contexto, un planificador se considera una función que mapea entre dos conjuntos que evolucionan en el tiempo, en este sentido, se presenta una clasificación de los planificadores basada en funciones y teoría de conjuntos siendo: críticos, no críticos, estáticos, adaptativos, predictivos, por desalojo de prioridades y óptimos. Esta propuesta de clasificación es novedosa ya que en el actual estado del arte solo se presentan clasificaciones verbales no formales y no aportan elementos que ayuden a su análisis, modelado y/o caracterización. Como resultado adicional, esta clasificación podrá ser utilizada para realizar futuros estudios cualitativos en optimalidad, estabilidad, controlabilidad, eficiencia, convergencia y predecibilidad desde el punto de vista computacional. Al final del documento se clasifican dos ejemplos de planificadores: RM (Rate Monotonic) y EDF (Earliest Deadline First).<hr/>Real-Time Operating Systems must provide support for concurrency; to achieve this, task schedulers are necessary. Schedulers operate on a set of concurrent real-time tasks whose instances must complete their execution within their respective deadlines; a scheduler receives a set of arrival times and maps it to a set of start times at which the instances begin their execution. In this context, a scheduler is considered to be a function that maps between two sets which evolve in time. Based on this view, we present a classification of schedulers, grounded in functions and set theory, into the following categories: critical, non-critical, static, adaptive, predictive, preemptive, and optimal. Our proposed classification is novel because the classifications in the current state of the art are only verbal and informal and do not provide elements that assist in the analysis, modeling, and/or characterization of schedulers. As an additional result, this classification can be used for future qualitative studies of optimality, stability, controllability, efficiency, convergence, and predictability from the computational point of view. The paper concludes by classifying two example schedulers: RM (Rate Monotonic) and EDF (Earliest Deadline First). <![CDATA[<b>Regression Models for Time Series with Increasing Seasonality</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400015&lng=en&nrm=iso&tlng=en Se compara el desempeño de tres modelos de regresión, en términos de su efectividad predictiva, para el caso de series temporales con estacionalidad creciente. Se emplearon 617 series en el cotejo así como tres modelos de los cuales, uno es propuesta original de este trabajo. Adicionalmente, se compararon estos modelos contra uno de raíces unitarias, típicamente empleado para el pronóstico de las series de interés.
Entre los resultados más importantes, se muestra que la efectividad de los modelos de regresión dependerá del horizonte de pronóstico así como del grado de su curvatura. A menor curvatura y mayor horizonte, mejor será su desempeño. Se mostrarán las condiciones bajo las cuales, los modelos de regresión pueden pronosticar tan bien o incluso mejor que la alternativa típica. Por último, se realiza un análisis de los intervalos de predicción y sobre cómo mejorar su efectividad.<hr/>In this paper, three regression models are compared according to their performance in terms of forecast accuracy for the case of time series with increasing seasonality. A total of 617 series are used in the comparison, and one of the three models is an original contribution of this work. In addition, the regression models are compared with a unit-root autoregressive model commonly used to forecast these series. The results indicate that the performance of the regression models depends on the forecast horizon and on the degree of curvature of the series: the lower the curvature and the longer the forecast horizon, the better their performance. We also show the conditions under which the regression models can forecast as well as, or even better than, the typical alternative. Finally, the prediction intervals are analyzed along with ways to improve their effectiveness. <![CDATA[<b>Filter Estimator by Deconvolution and Pseudoinverse</b>: <b>Recursive Description and Implementation</b>]]> http://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462014000400016&lng=en&nrm=iso&tlng=en En este trabajo se presenta un filtro estimador con base al modelo matricial de deconvolución utilizando a la pseudoinversa como proceso de filtrado, con el cual es posible conocer la dinámica interna del modelo tipo caja negra con respuesta lineal, y con evolución invariante en el tiempo. Se presenta la descripción recursiva del filtro por deconvolución y pseudoinversa considerando: a) la convolución en diferencias finitas, b) el filtro por deconvolución y pseudoinversa, c) la descripción recursiva del funcional del error y d) las condiciones de estabilidad a cubrir por el estimador tomado en cuenta los criterios de Lyapunov. De manera ilustrativa, se presenta una simulación utilizando MatLab®.<hr/>In this paper we present a filter estimator based on a matrix deconvolution model that uses the pseudoinverse as the filtering process, which makes it possible to determine the internal dynamics of a black-box model with a linear response and time-invariant evolution. We present a recursive description of the filter by deconvolution and pseudoinverse, considering: a) convolution in finite differences, b) the filter by deconvolution and pseudoinverse, c) a recursive description of the error functional, and d) the stability conditions to be satisfied by the estimator, taking into account the Lyapunov criteria. To illustrate the description, a MATLAB® simulation is presented.
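The last abstract describes recovering the internal signal of a linear, time-invariant black-box model by matrix deconvolution, with the pseudoinverse as the filtering step (illustrated by the authors with a MATLAB simulation). The short Python sketch below shows only the basic pseudoinverse-deconvolution idea, not the authors' recursive filter; the impulse response, input signal, and noise level are assumed for illustration.

# Minimal sketch of deconvolution via the Moore-Penrose pseudoinverse.
# The impulse response h, input u_true, and noise level are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

# Known impulse response of a linear time-invariant (black-box) system.
h = np.array([1.0, 0.6, 0.3, 0.1])
n = 50                                   # length of the unknown input signal
u_true = np.sin(0.2 * np.arange(n))      # internal signal we want to recover

# Build the (n + len(h) - 1) x n convolution (Toeplitz) matrix H so that y = H @ u.
m = n + len(h) - 1
H = np.zeros((m, n))
for j in range(n):
    H[j:j + len(h), j] = h

# Observed output: convolution of the input with h, plus small measurement noise.
y = H @ u_true + 0.01 * rng.standard_normal(m)

# Deconvolution / filtering step: estimate the input with the pseudoinverse of H.
u_hat = np.linalg.pinv(H) @ y

print("max reconstruction error:", np.max(np.abs(u_hat - u_true)))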