No. 4 (2019)
Full Issue
SECTION I. ARTIFICIAL INTELLIGENCE AND FUZZY SYSTEMS
FRACTAL ANALYSIS OF TECHNICAL PARAMETERS FOR FORECASTING TASKS
S.I. Klevtsov
Abstract. The main objective of the monitoring system of a technical object is to monitor the values of
its parameters at the current time and to predict possible changes in parameters for a given time
period. The assessment of parameter changes for the period of time for which the forecast is calculated
is based on the processing of previously obtained data. The horizon and forecasting accuracy
depend on the quality of the time series of data used for the calculations. The nature of the data
series also determines the choice of a specific forecasting model, as well as the need for and the choice of the method of their preliminary processing. The more unstable the trend of the parameter’s time series,
the fewer possibilities there are for estimating the parameter values outside the current time
frame. The trend stability of a series is determined using the normalized range method. The result of
using the method is an assessment of the nature of the time series as a whole. However, most methods of
forecasting time series use a limited part of the series directly adjacent to the time point of the
beginning of the forecasting process, and the nature of this section affects the accuracy of the
forecast. If this part of the series can be defined as persistent, then a forecast is possible. To analyze
the parameter values belonging to a small part of the series, the normalized range method is
not suitable, since it requires a large amount of data to be sampled. In this case, one can use the
fractality index, which allows local analysis. This paper presents a phased fractal analysis of the
time series data of a technical object parameter. At the first stage, a preliminary analysis is carried
out using the normalized range method. The nature of the series as a whole is determined.
If the series is persistent, then the next step is the local fractal analysis of a limited area of a series
of parameter values. The possibility of using this section of the series for the implementation of the
forecasting problem is estimated. A local fractal analysis can be performed on a segment limited by a
fixed time window that moves at each forecasting step, following the process. Such a forecasting
support scheme can improve the accuracy and reliability of the forecast.
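As an illustration of the normalized range (R/S) estimate underlying the persistence check described above, the following sketch computes a Hurst-exponent estimate for a synthetic series; the window sizes and the noise series are illustrative assumptions, not data from the paper.

```python
# Hedged sketch of a rescaled-range (normalized range, R/S) Hurst estimate:
# values above 0.5 indicate a persistent (trend-stable) series.
import numpy as np

def hurst_rs(series, window_sizes=(8, 16, 32, 64)):
    rs_values = []
    for w in window_sizes:
        rs = []
        for start in range(0, len(series) - w + 1, w):
            chunk = series[start:start + w]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviations from the window mean
            r = dev.max() - dev.min()               # range of the deviations
            s = chunk.std()                         # standard deviation of the window
            if s > 0:
                rs.append(r / s)
        rs_values.append(np.mean(rs))
    # slope of log(R/S) versus log(window size) approximates the Hurst exponent
    return np.polyfit(np.log(window_sizes), np.log(rs_values), 1)[0]

rng = np.random.default_rng(0)
noise = rng.normal(size=512)                        # uncorrelated series
# estimate near 0.5 for uncorrelated noise (small-sample bias pushes it slightly higher)
print(round(hurst_rs(noise), 2))
```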
FUZZY MODEL FOR THE MAXIMUM DYNAMIC FLOW FINDING FOR SOLVING THE TASK OF EVACUATION FROM BUILDINGS
E.M. Gerasimenko
Abstract. This article is devoted to solving the important task of evacuation from buildings, particularly,
evacuating the maximum number of victims from dangerous areas to safe ones within a given time interval.
The constructed model of the evacuated building is represented by a transportation network with a
dynamic structure, since the capacities and parameters of the flow time can vary in time. In addition, the nodes of the transportation network have weights that limit the maximum number of people who can be
arranged in a given vertex-room. The fuzzy and uncertain nature of the network parameters leads to the
problem statement in fuzzy conditions, which allows simulating a real evacuation situation, in which the
arc capacities in different time intervals are not exactly determined, however can be estimated approximately,
in a certain interval, etc. This leads to the necessity of specifying the arcs in fuzzy form. A feature
of the algorithm is the ability to take into account the weights of the vertices of the transportation
network; this is accomplished by replacing the capacitated vertex with two auxiliary vertices,
the arc between which has capacity equal to the initial capacity of the vertex. The numerical example
illustrating the operation of the proposed algorithm is solved. The results obtained in the course of solving
the evacuation task using the proposed algorithm can be applied in practice to solve the building
evacuation tasks in uncertain conditions, where the exact number of evacuees is not known and the
maximum possible number of evacuees must be transferred to safe zones, taking into account the time-varying
capacities and limitations on the capacity of the rooms.
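The vertex-splitting technique mentioned in the abstract (replacing a capacitated vertex-room with two auxiliary vertices joined by an arc of that capacity) can be sketched as follows; the tiny network, node names and capacities are illustrative assumptions, and networkx's standard crisp maximum flow stands in for the fuzzy dynamic flow of the paper.

```python
# Hedged sketch: handling vertex (room) capacities in a max-flow model by node splitting.
import networkx as nx

def split_capacitated_nodes(g, node_caps):
    """Replace each capacitated vertex v with v_in -> v_out carrying that capacity."""
    h = nx.DiGraph()
    for u, v, data in g.edges(data=True):
        h.add_edge(f"{u}_out", f"{v}_in", capacity=data["capacity"])
    for v in g.nodes:
        h.add_edge(f"{v}_in", f"{v}_out",
                   capacity=node_caps.get(v, float("inf")))
    return h

g = nx.DiGraph()
g.add_edge("room", "corridor", capacity=10)
g.add_edge("corridor", "exit", capacity=8)
h = split_capacitated_nodes(g, {"corridor": 5})       # corridor holds at most 5 people
flow_value, _ = nx.maximum_flow(h, "room_in", "exit_out")
print(flow_value)                                     # 5: limited by the vertex capacity
```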
EVOLUTIONARY DESIGN AS A TOOL FOR DEVELOPING MULTI-AGENT SYSTEMS
L.A. Gladkov, N.V. Gladkova
Abstract. The article proposes a methodology for the evolutionary design of autonomous agents and
multi-agent systems (MAS), on the basis of which the development and implementation of fuzzy
and hybrid models of the formation of agents is carried out. Existing approaches to the design of
information systems based on multi-agent organizations are considered. The features and
disadvantages of existing approaches are analyzed. It is noted that the use of the principles of the theory of evolutionary
development, the development of new approaches that use natural analogues, makes it possible
to increase the effectiveness of the existing methodology for designing multi-agent systems.
A model of agent interaction in a multi-agent system is described. Different approaches to the evolutionary
design of agents and multi-agent systems, which can be based on different models of evolution,
are considered. A formal formulation of the evolutionary design problem of artificial systems is
presented. The fundamental problems that arise when creating a general theory of the evolution of
agents and multi-agent systems are highlighted. The features of various models and levels of evolution
are considered. The concept of designing agents and multi-agent systems has been developed,
according to which the design process includes the basic components of self-organization, including
the processes of interaction, crossing, adaptation to the environment, etc. A model of forming a
descendant agent, based on the analysis of possible types of interaction of parent agents in the
process of evolutionary design, is proposed. A general methodology for the evolutionary design of
agents and a multi-agent system has been built. Various types of crossover operators were developed
and described, the idea of creating agencies (families) as units of evolving multi-agent systems
was formulated. A software system has been developed and implemented to support evolutionary
design of agents and multi-agent systems. A brief description of the performed computational
experiments confirming the effectiveness of the proposed method is presented.
MULTI-MEMETIC ANT DISCRETE OPTIMIZATION ALGORITHM
B. K. Lebedev, O.B. Lebedev, E.O. Lebedeva
Abstract. The work deals with the task of constructing a multi-meme ant algorithm (AA), which includes
the development of 5 memes and the choice of an expedient strategy for using one meme or another
from a swarm of available memes. The work of discrete optimization AA is illustrated by the example
of the problem of splitting a graph into subgraphs, which is widely used in solving problems such as clustering, coloring, clique selection, matching, partitioning of VLSI, etc. Given a graph G(X,U),
where X is the set of vertices, |X|=n, and U is the set of edges, it is necessary to divide the set X into subsets
X1 and X2 such that X1∪X2=X, X1∩X2=∅, Xi≠∅. To search for a solution to the problem, a complete solution
search graph (SSG) R (X,E) is used. In contrast to the canonical paradigm of the AA, the agent for the
SSG R(X,E) does not construct a route, but forms a subgraph. At the first stage of each iteration of the
MA, the constructive algorithm (meme) of each agent forms the set X1k. In all memes, the formation of
the set X1k is carried out sequentially (step by step). At each step t, the agent applies a probabilistic selection
rule for the next vertex to include it in the generated set X1k(t). In the first meme, the search for
solutions is carried out on the SSG R (X,E), and the pheromone is deposited on the set of edges E. The
main difference between the second meme and the first one is that the full search graph R(X,E) is excluded
from the search process by the decision agent. Pheromone is deposited only on the vertices of the
graph G(X,U). This leads to a sharp reduction in the number of pheromone points and, consequently, in
memory, since the number of edges of the graph R(X,E) is significantly greater than the number of vertices.
The peculiarity of the third meme M3 is that two indicators φi1k and φi2k are formed at the vertex xi of
the graph G(X,U): φi1k is the total level of pheromone deposited on xi only when xi was in X1k(t); φi2k
is the total level of pheromone deposited on xi only in cases when xi was not included in X1k(t). The indicator
φi1k characterizes for the vertex xi the preference for the set X1k(t). The fourth meme differs from
the third one in that the agent generates the two sets X1k(t) and X2k(t) in parallel. In the fifth meme,
an approach based on the decomposition of the data structure in the processes of forming the X1k⊆X
vertex set and the pheromone deposit is used to reduce the overall laboriousness of the search procedure.
In the modified approach, the pheromone is deposited on the graph R(X,E), and the formation of a
subset of vertices X1k⊆X is carried out on the oriented graph B(X,V). Two approaches to constructing
an adaptive multimeme algorithm using a swarm of memes are proposed. Multi-memetic hybridization
leads to an algorithm with self-adaptive properties, since in the search process memes compete with
each other, and the winner at this stage of the search is used in subsequent iterations. By increasing the
search efficiency, the hybrid algorithm makes it possible to reduce the number of agents in the system
and thereby increase its speed.
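A minimal sketch of the probabilistic vertex-selection rule an agent (meme) might apply when growing the subset X1, assuming a simple pheromone-proportional roulette-wheel choice; the pheromone values and the proportional rule are illustrative, not the paper's exact formula.

```python
# Hedged sketch: roulette-wheel selection of the next vertex to add to X1,
# with probability proportional to the pheromone deposited on each vertex.
import random

def select_next_vertex(candidates, pheromone):
    """Pick a candidate vertex with probability proportional to its pheromone level."""
    weights = [pheromone[v] for v in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

pheromone = {"x1": 0.9, "x2": 0.4, "x3": 0.1, "x4": 0.6}   # illustrative values
x1_subset, remaining = set(), ["x1", "x2", "x3", "x4"]
while len(x1_subset) < 2:                                   # grow X1 step by step
    v = select_next_vertex(remaining, pheromone)
    x1_subset.add(v)
    remaining.remove(v)
print(x1_subset)
```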
BIO-INSPIRED SCALABLE ALGORITHM IN PROBLEMS OF MULTIDIMENSIONAL OPTIMIZATION
S. I. Rodzin, I. G. Danilov, K.S. Kovaleva, O. N. Rodzina
Abstract. Bioinspired algorithms are mathematical transformations that convert the input flow
of information into the output flow according to rules based on the simulation of the mechanisms of evolution,
as well as on the statistical approach to the study of situations and iterative approximation to the
desired solution. Bioinspired algorithms have certain advantages over traditional deterministic optimization
methods, exploring the solution search space more thoroughly. However, in the era of big data, the suitability
of bioinspired algorithms for processing them, like most other methods, is questionable. Big data
poses new computational challenges, including due to its very high dimensionality and sparsity. Multidimensional
problems bring additional complexity to the study of the solution search space. An urgent
task is to develop a scalable bioinspired algorithm that can support the diversity of the population of solutions and find a balance between the rate of convergence of the algorithm and the diversification of
the search in the space of solutions. The paper presents a scalable bioinspired algorithm capable of
solving multidimensional optimization problems of high dimension using a hierarchical multipopulation
approach. The scalability of the algorithm assumes that its computational cost increases in direct proportion
to the increase in the amount of data processed, as well as the ability to give the best solution for
its current capabilities at any time of calculation, even if the calculation process is not completed by a
natural stop. Scalability also includes the ability to perform calculations within the limited memory of
the computer being used. The algorithm uses a special mutation operator to support the diversity of the
solution population, expanding the search for solutions at the expense of less promising solutions. Evaluation
of the effectiveness of the proposed algorithm is performed on the set of multidimensional optimization
functions benchmarks: Griewangk, Rastrigin, Rosenbrock, Schwefel. The indicators of the developed
algorithm are compared with the indicators of competing algorithms. The experiments speak in
favor of the scalable algorithm: the differences in the values of the optimized functions for the developed
algorithm are statistically significant compared to competing algorithms, especially with increasing
dimension of the problem.
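The following sketch shows one of the named benchmark functions (Rastrigin) together with a simple diversity-preserving mutation that perturbs the worse half of the population more strongly; the mutation rule is an illustrative assumption, not the operator proposed in the paper.

```python
# Hedged sketch: Rastrigin benchmark plus a diversity-preserving mutation.
import numpy as np

def rastrigin(x):
    """Standard Rastrigin function; global minimum 0 at x = 0."""
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def diversity_mutation(population, fitness, sigma=0.5):
    """Add Gaussian noise to the worse half of the population to widen the search."""
    order = np.argsort(fitness)                 # best individuals first (minimization)
    mutated = population.copy()
    worse = order[len(order) // 2:]
    mutated[worse] += np.random.normal(0.0, sigma, mutated[worse].shape)
    return mutated

rng = np.random.default_rng(0)
pop = rng.uniform(-5.12, 5.12, size=(20, 10))   # 20 individuals, dimension 10
fit = np.array([rastrigin(ind) for ind in pop])
pop = diversity_mutation(pop, fit)
```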
ALGORITHM FOR DETERMINING THE PARAMETERS OF BIO-INSPIRED SEARCH BASED ON BORROWING THE RESULTS OF PREVIOUSLY SOLVED PROBLEMS
Y. O. Chernyshev, N.N. Ventsov
Abstract. The problem of selecting the most effective hardware architecture in the implementation of
genetic algorithms that solve NP-complete problems is considered. The peculiarity of this kind of
problems is the unproven existence of algorithms capable of finding an exact solution in polynomial
time. Evolutionary algorithms used to solve such problems are stochastic in nature, which
makes it difficult to plan the computational process effectively. The aim of the work is to find ways
to reduce the search time by analyzing the results obtained in solving similar problems considered
earlier and choosing an effective architecture that implements this evolutionary algorithm.
The relevance of the work is due to the exponential growth of the dimensions of the optimization
problems to be solved. The scientific novelty lies in the use of knowledge about previously solved
problems to optimize the parameters of the genetic algorithm that solves the current problem, and
the choice of an effective hardware architecture on which this algorithm will be implemented.
The boundary of the most appropriate architecture, as a function of the number of individuals processed by the stochastic algorithm,
cannot be determined precisely. For this reason, the
boundary of the most efficient hardware architecture is defined as a fuzzy set. The problem statement
is to accelerate the process of solving the current task by using a priori parameter settings
of the genetic algorithm, based on the analysis of the results of previously solved tasks,
and by choosing an efficient hardware architecture. An approach to the selection of an effective hardware
architecture is proposed, consisting of three stages: analysis of the results of solving the previously
considered problems, determining the number of individuals that must be generated to obtain an
acceptable solution by the evolutionary algorithm, selection of hardware architecture on which the
evolutionary algorithm will be implemented.
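A minimal sketch of treating the boundary between candidate architectures as a fuzzy set over the expected number of processed individuals; the membership shape and thresholds are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: a fuzzy boundary between a sequential and a parallel architecture,
# expressed as a membership function of the number of individuals to be processed.
def membership_parallel(n_individuals, low=10_000, high=100_000):
    """Degree to which a parallel architecture is preferable (0 = not at all, 1 = fully)."""
    if n_individuals <= low:
        return 0.0
    if n_individuals >= high:
        return 1.0
    return (n_individuals - low) / (high - low)   # linear transition zone

for n in (5_000, 50_000, 200_000):
    mu = membership_parallel(n)
    choice = "parallel" if mu >= 0.5 else "sequential"
    print(f"{n} individuals -> membership {mu:.2f}, prefer {choice}")
```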
METHODS OF PROTECTION OF DISTRIBUTED COMPUTING IN A MULTI-AGENT SYSTEM
S. A. Khovanskov, V. S. Khovanskova, V. A. Litvinenko
Abstract. The organization and protection of distributed computing based on a multi-agent system for
solving problems of multivariate modeling is considered. When modeling, the choice of one of
many options may require a search over a huge set of parameters that is infeasible even for a high-speed computer.
Distributed computing is used to reduce the time required to solve such problems. There are
many different approaches to the organization of distributed computing in a computer network -
grid technology, metacomputing (BOINC, PVM and others). The main disadvantage of most existing
approaches is that they are designed to create centralized distributed computing systems. Distributed
computing is organized on the basis of a multi-agent system on the computing nodes of
any computer network. When a large-scale computer network is used as a computing environment,
threats to the security of distributed computing from intruders can arise. One of these
threats is the substitution of a false calculation result by an attacker. A false result can lead, in the
modeling process, to the adoption of a suboptimal or wrong decision. A method of protection of distributed
computing based on a multi-agent system from the threat of false results is developed. The
assessment of the degree of information security in multi-agent systems with different structures is carried out.
A comparison of information security in these two systems is provided.
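As a generic illustration of the threat model only (not the specific protection method proposed in the paper), replicating a task on several agents and accepting the majority result can be sketched as follows.

```python
# Hedged sketch: majority voting over replicated task results to mask a false result
# returned by a compromised computing agent. Agent names and values are illustrative.
from collections import Counter

def accept_result(replica_results):
    """Return the majority result and the agents that disagreed with it."""
    winner, _votes = Counter(replica_results.values()).most_common(1)[0]
    suspects = [agent for agent, r in replica_results.items() if r != winner]
    return winner, suspects

results = {"agent_1": 42, "agent_2": 42, "agent_3": 17}   # agent_3 returned a false result
value, suspects = accept_result(results)
print(value, suspects)                                    # 42 ['agent_3']
```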
SECTION II. DATA ANALYSIS AND KNOWLEDGE MANAGEMENT
-
INTELLECTUAL KNOWLEDGE MANAGEMENT SYSTEM IN THE PROCESS OF TRAINING
E. V. Kuliev, E.S. Tsyrulnikova, N. V. Kulieva, V. V. Markov
Abstract. In this work, an intelligent information system for supporting the quality of educational activities
(IIS SQEA) is proposed. IIS SQEA includes a module "Expert quality assessment" and a module
"Analysis and prediction of the results of the learning process." IIS SQEA solves the problem
of analyzing and forecasting the results of the learning process using an extended set of features:
data on attendance of training and test resources, data on participation in discussion forums, data
on viewing announcements, data on a training course, data on the level of student training and
expert evaluations of the quality of teaching and test materials. The set of informative features is determined
the results of the learning process. Expert evaluation of educational and methodological support
is carried out according to the criteria of completeness, availability and relevance. Expert
assessment of test materials is carried out according to the indicators of difficulty, validity and reliability of the
test. Expert assessments are made in special forms. The structure of the forms is arranged in such
a way that when filling out forms on the third level, forms of the second and first levels can be
filled in automatically, which simplifies the task of data entry. To build a forecast model, a classification
algorithm is used: boosting over decision trees. In the course of computational experiments,
the application of the algorithm for boosting over decision trees and an extended set of
features showed a good forecast quality compared with similar developments. The developed IIS
SQEA provides a qualitative forecast of the results of the learning process and the ability to effectively
manage the quality of the educational process, identify problem situations and improve information
and teaching materials. Experimental studies have been carried out.
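A minimal sketch of the forecasting step, boosting over decision trees as named in the abstract, here via scikit-learn's GradientBoostingClassifier; the feature columns and the synthetic data are illustrative assumptions, not the IIS SQEA feature set or dataset.

```python
# Hedged sketch: predicting learning-process results with gradient boosting over decision trees.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# illustrative columns: attendance, forum activity, announcements viewed, pre-test level
X = rng.uniform(0, 1, size=(300, 4))
y = (0.5 * X[:, 0] + 0.3 * X[:, 3] + rng.normal(0, 0.1, 300) > 0.4).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier(n_estimators=100, max_depth=3)
model.fit(X_tr, y_tr)
print("accuracy:", model.score(X_te, y_te))
```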
BIOINSPIRED APPROACH FOR SOLVING THE PROBLEM OF CLASSIFICATION IN USERS BEHAVIOR PROFILES IN INTELLIGENT INTERNET SERVICES
V.V. Bova, Y. A. Kravchenko
Abstract. The article deals with the problems of increasing the effectiveness of personal-oriented user interaction
organization in intelligent Internet services for developing a culture of safe behavior in the Internet
space. To solve them, we propose a method for constructing profiles of user behavior based on an analysis
of their information needs, which most closely match their preferences, including the implicit ones.
The relevance of the work is determined by the growing popularity of the idea of personalizing content
and information services in the Internet space. The user's behavior profile is considered by the authors as
a weakly formalized object in the feature space of internal and external characteristics describing its
interaction with an Internet resource. The proposed method is based on probabilistic EM-clustering
of the studied data on user characteristics and distributed Internet resources, which generates the
structure of the input parameters of the classifiers of the user profile generation model. The structure optimization
is realized by the mechanism of selection of informative characteristics of a profile based on the
idea of identifying hidden interests and preferences of users on the one hand, and the ability of a resource
to satisfy interested users of this feature set - on the other. To reduce the dimension of the source
data in the attribute space, the “cuckoo search” meta-heuristic optimization algorithm is proposed for the classification
problem; it is distinguished by scalability and high interpretability of the output data.
Optimization of the parameters of the classifiers consists in selecting the parameters of the membership
function and the class labels of the generalized profile so that the numerical criterion for the accuracy
of the classification of resource attributes and user preferences is maximized on real data.
To assess the effectiveness of the proposed algorithm, a computational experiment was conducted on test
datasets from the open UCI Machine Learning Repository. The results showed
that the classifier built on the test data possesses a higher level of interpretability of the profile formation
results while maintaining the accuracy of classification.
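A minimal sketch of the EM-clustering step on user-behavior feature vectors, using scikit-learn's Gaussian mixture implementation of EM; the features, cluster count and synthetic data are illustrative assumptions.

```python
# Hedged sketch: EM clustering of user-behavior feature vectors with a Gaussian mixture model.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# illustrative columns: session length, pages per visit, share of "safe content" views
users = np.vstack([
    rng.normal([5, 10, 0.8], 0.5, size=(100, 3)),    # cautious users
    rng.normal([20, 40, 0.3], 2.0, size=(100, 3)),   # heavy, less careful users
])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(users)          # cluster index per user profile
print(np.bincount(labels))               # cluster sizes
```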
BOOSTING OF BIOINSPIRED ALGORITHMS FOR SOLVING PROBLEM OF DATA INTEGRATION
Y. A. Kravchenko, D. V. Balabanov, A. V. Kovtun
Abstract. Currently, data integration is an urgent problem. Data integration can be considered at
various levels. The existing methods for solving the problem of data integration are quite complex
and still far from solving the problem of semantics. Analysis of the methods shows that when solving
the problem of heterogeneity at the semantic level, methods based on the use of
a single upper-level ontology are used. Due to the fact that data in information systems usually represent
information objects that model some parts of the domain, which in turn has its own ontology, in order
to solve the problem of semantic heterogeneity of data, it is necessary to reconcile the concepts of the
subject domains of the interacting objects. In this case, the interaction of information systems can
be built on the coordinated semantics of the subject area. The paper considers the semantic level at which data is
analyzed in terms of their semantic properties, as well as in the aspect of a unified ontology. To resolve
this class of problems, it is proposed to use evolutionary and, in particular, bioinspired algorithms.
The paper proposes to use boosting, the idea of which is to apply several algorithms sequentially.
The process of integration of information systems is reduced to solving the problem of estimating
the semantic proximity of objects of inhomogeneous ontologies, based on a hybrid measure of
similarity, including attribute, taxonomic and relational. The structure of boosting bioinspired algorithms
for searching for concepts and evaluating their semantic proximity is proposed. It is proposed
to use the dynamic probability of choosing algorithms based on the results obtained.
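A minimal sketch of sequentially "boosting" several bioinspired algorithms with a dynamically adapted selection probability; the update rule and the algorithm names are illustrative assumptions, not the scheme proposed in the paper.

```python
# Hedged sketch: adaptive selection of the next bioinspired algorithm, with
# selection probabilities that follow the quality of the results obtained so far.
import random

class AdaptiveSelector:
    def __init__(self, algorithms):
        self.scores = {name: 1.0 for name in algorithms}   # optimistic start

    def pick(self):
        names, weights = zip(*self.scores.items())
        return random.choices(names, weights=weights, k=1)[0]

    def update(self, name, quality):
        """Blend the running score toward the observed solution quality."""
        self.scores[name] = 0.8 * self.scores[name] + 0.2 * quality

selector = AdaptiveSelector(["genetic", "ant_colony", "cuckoo"])
for step in range(5):
    algo = selector.pick()
    quality = random.random()            # stand-in for a semantic-proximity score
    selector.update(algo, quality)
print(selector.scores)
```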
METHODOLOGY OF USING BIOINSPIRED METHODS FOR INTEGRATED PROCESSING OF BIG DATA ON THE EXAMPLE OF THE TRANSPORT TYPE PROBLEM SOLUTION
S. N. Scheglov
Abstract. This paper presents a methodology for using bioinspired methods for integrated processing
of big data using the example of solving the transport type problem. The main place among the
applied problems of the transport type is given to the tasks of building transport facilities, which make it possible to minimize vehicle traffic problems or to minimize
the cost of transportation. Routing is the most effective way to organize the flow of goods
from enterprises; it has a significant impact on the acceleration of traffic turnover with its rational
and efficient use. For this class of combinatorial problems, there are no effective classical
methods and algorithms for solving. These tasks are characterized by a finite, but very large
number of possible solutions. They can be formulated as integer programming tasks, but in this case
there are no efficient algorithms. Therefore, the development of methods and algorithms for solving
transport-type problems, which has been carried out for many years, is still an urgent problem.
The methodological analysis of the research problem has been carried out. The analysis
showed that the use of methods and algorithms of sequential and parallel bio-inspired search for
solving the considered class of transport-type problems is an actual scientific problem of practical
interest. The problem statement is given. An integrated search scheme is shown, which allows you
to parallelize the process of finding an acceptable solution for large-scale problems. The structural
scheme of the bio-inspired search for the problem of the extreme path is considered. The results
of computational experiments are presented. The research results allow us to conclude that the
time complexity of the considered bioinspired search algorithms does not exceed a
polynomial dependence and can be expressed by the formula O(N²), where N is the number of
graph vertices (the size of the problem to be solved).
IMPLEMENTATION OF THE EXISTING DATA TRANSFER PROTOCOL FOR THE DEVELOPED ARCHITECTURE OF THE RADIO TRANSMISSION SYSTEM FOR THE AUTOMATED COMMUNICATION COMPLEX
A.I. Rybakov
Abstract. The presented materials give a general description of the coding and decoding algorithms
used in the development of signal-code structures implemented in the layout of radio stations with
the equipment of a software-defined radio system (Software-Defined Radio – SDR). Frame formats
of broadcast and half-duplex protocols, as well as modulation/demodulation and subsequent digital signal
processing used in existing and future radio communication systems, are described. The authors rely
on the use of OFDM modulation together with absolute phase-shift keying (2PSK and 4PSK) in
subchannels. Purpose of work. Investigation of existing methods of modulation / demodulation and
subsequent digital signal processing, as well as the requirements imposed by them on the equipment
of network stations and algorithms of the system. The conducted research makes it possible to
determine the most expedient and energy-efficient way of development, including the creation of software
that allows building equipment capable of satisfying the maximum number of possible assignments
of radio access channels. To justify the reliability and performance of the proposed
algorithm and transmission protocol, appropriate software was developed. It can be
used to receive and transmit information through the use of ionospheric reflections. In addition,
existing standards, amateur systems such as WinLink and marine information systems (digital and
analog) are taken into account in terms of the "physical" and "channel" layers. Results. The structure
and functional description of the software-defined radio channel is given. The results of experimental
testing of technical solutions are shown. The software can use hardware and software
to control the transceiver module, including the sunsdr2 transceiver, which supports the operation
of the hardware in full duplex mode, and the antenna amplifier. As a result of considering the
structure and functional description of the developed software, it is concluded that the validity and performance of the proposed algorithm and transmission protocol are relevant to the development
of digital receivers for communication systems for various purposes. The presented experimental data
for verification of the proposed algorithm showed acceptable agreement of the decisions taken
on the quality of channel resource use, with a sufficient level of credibility and reliability in the
transmission of the information contained in the above code constructions.
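A minimal sketch of forming and demodulating one OFDM symbol with QPSK (4PSK) subcarriers via an inverse/forward FFT, in the spirit of the modulation described above; the subcarrier count, cyclic prefix length and bit source are illustrative assumptions, not the frame format of the paper.

```python
# Hedged sketch: one OFDM symbol with QPSK subcarriers, modulated with an IFFT
# and demodulated with an FFT over a noiseless channel.
import numpy as np

N_SUB, CP_LEN = 64, 16                        # subcarriers and cyclic prefix length
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=2 * N_SUB)     # 2 bits per QPSK subcarrier

# QPSK mapping: (b0, b1) -> complex point on the unit circle
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

time_signal = np.fft.ifft(symbols, N_SUB)                          # OFDM modulation
tx_symbol = np.concatenate([time_signal[-CP_LEN:], time_signal])   # add cyclic prefix

# receiver side: strip the prefix and demodulate with a forward FFT
rx_symbols = np.fft.fft(tx_symbol[CP_LEN:], N_SUB)
recovered = np.column_stack([(rx_symbols.real < 0),
                             (rx_symbols.imag < 0)]).astype(int).ravel()
print(np.array_equal(recovered, bits))        # True for a noiseless channel
```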
KNOWLEDGE MANAGEMENT TECHNOLOGY USING ONTOLOGIES, COGNITIVE MODELS AND PRODUCTION EXPERT SYSTEMS
M. L. Massel, A. G. Massel, D.V. Pesterev
Abstract. The article proposes a knowledge management technology based on the joint use of ontologies,
cognitive models and production expert systems. The core of the proposed technology is the
technique of cognitive modeling and the transformation of cognitive models into the production
rules of the expert system, as well as the tools to support the proposed technology, developed on
the basis of the agent approach. The methods and tools for the semantic modeling (primarily ontological
and cognitive) developed in the authors' team were applied. We consider a two-level technology
of multivariate research, integrating the levels of qualitative (using semantic modeling)
and quantitative analysis (using mathematical models and a computational experiment), as well as
an intelligent IT environment that integrates tools for supporting two-level technology. Based on
them, the modification and development of tools to support the proposed knowledge management
technology was performed. An illustration of the proposed technology is given, the seven stages of
which are described in the table. The development of a conversion agent for converting causal
relationships of cognitive maps into production rules of an expert system is considered, the algorithm
of the agent is presented. The conversion result - the resulting set of rules - is transferred to
the shell of the Clips production expert system. Using the Clips inference engine, the expert receives
conclusions about the degree to which the concepts of the cognitive map interact with each
other and on their basis interprets the cognitive map. Examples of ontologies, cognitive maps,
results of converting cognitive maps into production rules of an expert system illustrating the proposed
knowledge management technology are given. A computational experiment is described in which a cognitive threat map “Low rates of renewal of electric generating equipment” is used.
The effectiveness of the proposed technology to support the justification and adoption of strategic
decisions on the development of energy can be first of all evaluated qualitatively, since it ensures
the scientific validity of the recommended solutions and thereby increases their effectiveness. In
addition, the use of this technology is aimed at reducing the expert’s labor costs and, accordingly,
reducing the time for analysis and substantiation of solutions. The novelty of the proposed approach
consists, firstly, in the integration of various intelligent technologies (ontologies, cognitive
modeling, expert systems) within the framework of a single technology for knowledge management
in scientific research; secondly, in the automation of analysis and interpretation of cognitive maps
using production expert systems. As a result, we can talk about improving the quality of preparation
and substantiation of recommendations for decision-making. The knowledge management
technology has been tested in studies of energy security problems, but may have wider application.
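A minimal sketch of converting causal links of a cognitive map into production rules, in the spirit of the conversion agent described above; the map fragment and the CLIPS-like rule template are illustrative assumptions, not the authors' conversion algorithm.

```python
# Hedged sketch: turning (cause, effect, sign) links of a cognitive map into
# CLIPS-like production rule strings that could be passed to an expert system shell.
cognitive_map = [
    # (cause concept, effect concept, sign of influence) -- illustrative fragment
    ("equipment renewal rate", "generation reliability", "+"),
    ("generation reliability", "energy security threat", "-"),
]

def to_rules(links):
    rules = []
    for i, (cause, effect, sign) in enumerate(links, start=1):
        direction = "increases" if sign == "+" else "decreases"
        rules.append(
            f"(defrule link-{i} (concept (name \"{cause}\") (trend up)) =>\n"
            f"  (assert (concept (name \"{effect}\") "
            f"(trend {'up' if sign == '+' else 'down'}))))  ; {cause} {direction} {effect}"
        )
    return rules

for rule in to_rules(cognitive_map):
    print(rule)
```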
PROBLEMS AND ISSUES OF PROCESSING AND ORGANIZATION OF GRAPHICAL DATA IN VOLUME VISUALIZATION ON DISTRIBUTED SYSTEMS
N. I. Vitiska, N. A. Gulyaev, I.G. Danilov
Abstract. Most current tasks in the fields of science and technology are connected to visualization,
which brings clarity and informativeness to the management and design of natural and technical
objects and processes; it also solves a number of important tasks of the human-machine interface. The
key problem of visualization in most cases is to display all the necessary information in allotted
amount of time and system resources. In volume visualization, large amount of data is processed,
which may require an excessive amount of computing resources, or it can introduce a number of
problems in management of visualization process. Currently, research and study of approaches to
organization of data being processed during volume visualization is crucial. Any decrease in computational
costs during storing and processing the graphic data not only reduces the corresponding
costs, but also makes it possible to implement a number of modifications of the sampling and compositing procedures, which introduces a number of possibilities for additional optimization of the rendering
process. This paper discusses key tasks of organizing, storing and processing graphical data in
volume visualization paradigm in terms of distributed implementation. A general optimization
approach suitable for distributed implementation is considered. A multiparametric optimization
task is described, as well as the objective function. Next, the problem of organizing graphical data
based on spatial location and properties of volumetric phenomenon is reviewed. Key nuances are
discussed, a solution for distributed case based on a combination of well-known approaches is
described, theoretical and practical aspects are considered. An approach to a low-level implementation
is considered, a technique of structuring the initial data as a marked graph by a set of properties,
as well as a process of carrying out the rendering procedure on such structure is described.
SECTION III. DESIGN AUTOMATION
SOLUTION OF THE PROBLEMS OF DESIGN DECISIONS SEARCH AND OPTIMIZATION ON THE BASIS OF A HYBRID APPROACH
L.A. Gladkov, N. V. Gladkova, S. N. Leiba
Abstract. Integrated approaches to solving optimization problems of computer-aided design of digital
electronic computing equipment circuits are discussed. The relevance and importance of developing
new effective methods for solving such problems is emphasized. It is noted that an important direction in the development
of optimization methods is the development of hybrid methods and approaches that combine
the merits of various methods of computational intelligence. It has been suggested that hybridization
allows one to achieve a “synergistic effect” when the advantages of individual methods are mutually enhanced. The definition of mixed artificial systems and the conditional classification of
hybrid systems are given. Relationships and possibilities of mutual use of the theory of evolutionary
design and multi-agent systems are considered. A hybrid approach to solving optimization problems
based on a combination of evolutionary search methods, fuzzy control methods and possibilities of
parallel organization of the computational process is proposed. A modified migration operator for
the exchange of information between decision populations in the process of performing parallel computations
is proposed. The structure of the parallel hybrid algorithm has been developed. It is proposed
to use the island and buffer models of a parallel genetic algorithm to organize the parallel computing process.
To improve the quality of the results obtained, a fuzzy logic controller based on expert information,
which regulates the values of the parameters of the evolution process,
is included in the evolution loop. A block diagram of the developed hybrid algorithm is presented. A software application
is developed, a description of the architecture of the software application is given. The features of the
software implementation of the proposed hybrid algorithm are considered. A brief description of the
performed computational experiments confirming the effectiveness of the proposed method is presented.
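A minimal sketch of an island-model migration step, in which each island periodically passes copies of its best individuals to a neighbour and replaces its worst ones; the ring topology and migration size are illustrative assumptions, not the modified operator proposed in the paper.

```python
# Hedged sketch: migration between decision populations (islands) of a parallel GA.
import random

def migrate(islands, fitness, n_migrants=2):
    """islands: list of populations (lists of individuals); lower fitness is better."""
    best = [sorted(pop, key=fitness)[:n_migrants] for pop in islands]
    for i, pop in enumerate(islands):
        incoming = best[(i - 1) % len(islands)]           # ring topology
        pop.sort(key=fitness, reverse=True)               # worst individuals first
        pop[:n_migrants] = [ind[:] for ind in incoming]   # replace worst with copies of bests
    return islands

fitness = lambda ind: sum(ind)                            # toy objective: minimize the sum
islands = [[[random.randint(0, 9) for _ in range(5)] for _ in range(6)] for _ in range(3)]
islands = migrate(islands, fitness)
```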
PARALLEL AND CONSISTENT BIOINSPIRED ALGORITHM FOR CONSTRUCTION OF A STEINER MINIMAL TREE
B. K. Lebedev, O. B. Lebedev, E. O. Lebedeva
Abstract. The paper considers a parallel-sequential approach to constructing a minimal Steiner tree.
The developed algorithm is based on a general approach, consisting in the decomposition of a
connecting network and its presentation in the form of a combination of two-terminal connections.
For the chain ti on the set of vertices Xi, |Xi|=ni of the graph G, using the Prim algorithm, we construct
the minimal connecting tree Ri={rik | k=1, 2, …, ni−1}. For each edge rik∈Ri, a route sik is
formed that connects on the graph G a pair of vertices corresponding to the edge rik. Each route sik
corresponds to the set Г(sik) of edges of G. The problem of constructing a minimal Steiner tree
reduces to the problem of constructing and choosing an s-route sik on the graph G, for each edge
rik. For each agent located at the vertex pi, the coordinates of the vertex pj to which the s-route is
laid are determined and the possible directions of its movement along the edges of the orthogonal
graph G=(V,E) are determined, which ensure the construction of the s-route of minimum length.
The quality of the route built by the agent will be determined by the presence of common sections
with routes built by other agents of the cluster. The more common sections of the route with routes
constructed by other cluster agents, the smaller the total length of the Steiner tree edges. The decomposition
of the problem in the framework of a parallel-sequential approach allowed us to
avoid the problem of the sequence of routing and organize a collective adaptation system with a
high degree of appropriate behavior and convergence. A feature of the presented ant algorithm for
constructing a minimal Steiner tree is that the ant colony is divided into clusters and the search for
a specific solution to the problem is carried out by a population of ant clusters. The task of the ants of
each cluster Aσ constructing the Steiner tree is reduced to the problem of constructing and selecting
in the graph G the set of Mσ s-routes covering the minimal Steiner tree. In contrast to the
canonical paradigm of the ant colony, a modified greedy strategy is proposed for constructing an
oriented route on the solution representation model. The concept of collective adaptation (adaptive
behavior) of an agent cluster used in the above approach is as follows. The global goal of the
team (agent cluster) is to build a minimal Steiner tree. The local goal of each agent is to build a
route of maximum cost, that is, a route that maximally matches the routes built by other agents of
the cluster, which indirectly contributes to the achievement of the global goal of the team. The
state of each edge ej∈E of the solution search graph G is described by two parameters
hj and dj. The values of the indicators hj and dj are updated by increasing at each iteration after all
the clusters of agents build Steiner trees. For the first time, the composite structure of pheromone
and the differentiated method of its deposition are used. Each ant in the group marks the path (edges of the route) with two types of pheromone: pheromone-1 and pheromone-2. This provides a
higher probability of localization of the global extremum of the problem. The time complexity of
this algorithm at one iteration is defined as O(n²). After all the actions are completed, the agent
with the best solution in the iteration is found and remembered. Then the transition to the
next iteration is made.
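A minimal sketch of the Prim step mentioned in the abstract for building a minimal connecting tree over a set of vertices; the small weighted graph is an illustrative assumption, not an example from the paper.

```python
# Hedged sketch: Prim's algorithm over an adjacency list, returning the
# (weight, added vertex) pairs of the minimal connecting (spanning) tree.
import heapq

def prim_mst(adj, start):
    """adj: {vertex: [(weight, neighbor), ...]}; returns (weight, vertex) pairs added to the tree."""
    visited, tree = {start}, []
    heap = list(adj[start])
    heapq.heapify(heap)
    while heap and len(visited) < len(adj):
        w, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        tree.append((w, v))
        for edge in adj[v]:
            heapq.heappush(heap, edge)
    return tree

adj = {
    "a": [(2, "b"), (4, "c")],
    "b": [(2, "a"), (1, "c"), (5, "d")],
    "c": [(4, "a"), (1, "b"), (3, "d")],
    "d": [(5, "b"), (3, "c")],
}
print(prim_mst(adj, "a"))   # [(2, 'b'), (1, 'c'), (3, 'd')]
```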
A MODIFIED WOLF PACK ALGORITHM FOR SOLVING DESIGN TASKS
E.V. Kuliev, D. Y. Zaporozhets, D. Y. Tereshenko
Abstract. The paper deals with the main problem of artificial intelligence - the development of effective
new and modified heuristic mechanisms for finding the optimal solution. Currently, one of the
promising areas for the development of artificial intelligence is the use of methods and models of
the behavior of biological systems for solving NP-complete and NP-hard optimization problems.
The article considers one of the urgent problems of the design of ultra-large-scale integrated
circuits (VLSI) - the problem of placement. Presented is the formulation of the problem of placement
of elements of VLSI. The task of locating the VLSI elements is proposed to be solved on the
basis of the behavior of biological systems in nature, using the wolf pack algorithm as an example.
Wolves are typical social animals that have a clear division of social work. The actions
and rules of behavior of a wolf pack in the wild are presented. Based on the presented rules and actions of
wolves, a modified algorithm of the wolf pack is described. The advantage of the developed modified
algorithm is the possibility of improving each subsequent stage of solving the placement problem.
As part of the work, the wolf pack algorithm was implemented in Java. Famous IBM benchmarks
were used as test patterns. Comparisons were made with the results of the work of well-known
algorithms: Capo 8.6, Feng Shui 2.0, Dragon 2.23. Based on the research, the developed
wolf pack algorithm showed better results compared to its peers.
SYNTHESIS METHOD OF FAULT-TOLERANT COMBINATIONAL CIRCUITS WITH CED BASED ON LDPC CODE
A. L. Stempkovskiy, D. V. Telpukhov, T.D. Zhukova, A.N. Schelokov, A. D. Novikov, S. I. Gurov
Abstract. Ionizing radiation leads to the occurrence of short-term disturbances in the performance of
electronic equipment, so-called soft errors. This type of failure was mainly considered in the context
of storage devices and memory elements. However, the intensive development of the microelectronic
industry leads to an increase in the number of soft errors in combinational areas, and
may soon lead to the fact that the frequency of occurrence of soft errors in these areas will be
comparable to the frequency in unprotected memory elements. Today, there are many different
methods to deal with the consequences of soft errors: traditional methods of N-fold modular redundancy,
methods that enhance the masking properties of the circuit, the use of various control
tools based on the theory of noise-resistant coding, etc. However, basically, most of the methods
presented lead to the appearance of large structural redundancy. As a result, there is a need to
develop the new methods to deal with the consequences of soft errors. This article discusses the
use of special control tools – concurrent error detection (CED) circuits based on a low-density
code in order to increase the fault tolerance of combinational circuits. The use of such a CED allows
single error correction due to the majority decoding method. When compared with the triple
modular redundancy method in terms of parameters such as logical sensitivity and structural redundancy,
the application of the obtained CED can serve as a compromise solution to the problem
of circuit fault tolerance against soft errors.
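A minimal sketch of one-step majority (bit-flipping) decoding of a single error over a small low-density parity-check matrix (row and column parities of a 3×3 bit array); the matrix and codeword are illustrative stand-ins, not the CED code used in the paper.

```python
# Hedged sketch: majority decoding corrects a single soft error by flipping any bit
# for which a strict majority of its parity checks are violated.
import numpy as np

H = np.array([
    [1, 1, 1, 0, 0, 0, 0, 0, 0],   # row parities of a 3x3 bit grid
    [0, 0, 0, 1, 1, 1, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 1, 1, 1],
    [1, 0, 0, 1, 0, 0, 1, 0, 0],   # column parities
    [0, 1, 0, 0, 1, 0, 0, 1, 0],
    [0, 0, 1, 0, 0, 1, 0, 0, 1],
])

def majority_correct(word):
    """Flip each bit for which the majority of its parity checks fail."""
    syndrome = H.dot(word) % 2            # 1 marks a violated check
    fails = H.T.dot(syndrome)             # violated checks touching each bit
    involved = H.sum(axis=0)              # total checks touching each bit
    return (word + (2 * fails > involved)) % 2

codeword = np.array([1, 0, 1, 0, 1, 1, 1, 1, 0])   # all row/column parities even
received = codeword.copy()
received[4] ^= 1                                    # inject a single soft error
print(np.array_equal(majority_correct(received), codeword))   # True
```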
STUDY AND DESIGN OF AUTOMATED TOOLS FOR SIMULATION OF SOFT ERRORS IN MODERN COMBINATIONAL CMOS IC
D. V. Telpukhov, A. I. Demeneva, V. V. Nadolenko
Abstract. Circuits are becoming more sensitive to the effects of heavy charged particles due to changes
in design technologies: shrinking feature sizes and supply voltages, denser chips and increased
clock frequencies. The widespread use of modern CMOS integrated circuits in the space and military
industries, scientific research facilities, and medical therapy installations under conditions of radiation
exposure, where failures are unacceptable, makes this problem relevant. But it should also be borne in mind
that soft errors are possible in electronic systems at sea level as a result of exposure to neutrons
and alpha particles, which makes the use of radiation hardening measures necessary for terrestrial applications
as well. Today, methods are being developed to ensure fault tolerance at the circuit level. To
evaluate methods for designing radiation-hardened integrated circuits, fast and accurate methods of
automated simulation of soft errors are needed. But there are no automated tools for simulating
high-energy particles in commercial CAD systems. For 45nm technologies and below the majority
of observed soft errors will occur in combinational logic. In this paper, a method of automated
simulation of single event transients is proposed. The developed method is compared with existing
analogs. The paper shows the application of the method on a large set of combinational cells.
IMAGE SEGMENTATION BY SPIDER MONKEY ALGORITHM FOR AUTOMATED REVERSE DEVELOPMENT OF PRINTED CIRCUIT BOARDS
V. M. Kureichik, A.M. Shtuchny
Abstract. This article analyzes the current state of the problem of printed circuit boards reverse development
in the context of the development of modern society, science and technology. The relationship
between reverse engineering and the global development trend is seen in the context of
Industry 4.0 or the fourth scientific and technological revolution. A review of existing modern
segmentation algorithms based on clustering algorithms is carried out. Their advantages and disadvantages
are revealed. The aim of the article is to automate the process of reverse development
of printed circuit boards. The objective of the article is the development of a new image segmentation
algorithm for its use in an automated system for reverse development of printed circuit
boards. The article presents the theoretical development of the fuzzy algorithm of spider monkey
image segmentation based on the fuzzy c-means algorithm. The principle of operation is to use the
spider monkey algorithm to find the maxima of the probability distribution of finding
similar pixels in the segmented image; these maxima are then assigned as the centers of the segments
and the fuzzy c-means segmentation algorithm is applied. The advantage of this developed
algorithm is the automatic determination of the number of clusters and their centers. The theoretical
advantage of this approach is the use of a universal optimization algorithm, which surpasses
analogues in many optimization problems. The algorithm of actions and an automated system for reverse development of printed circuit boards are described. The article provides conclusions about
the prospects of research in this direction and suggests the possibility of using the results of the
work. The novelty is the automation of the process of reverse development of printed circuit
boards. The fundamental difference is the use of a new method of segmentation based on the fuzzy
c-means segmentation algorithm and the spider monkey algorithm.
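A minimal sketch of the fuzzy c-means step on pixel intensities, alternating membership and center updates; the cluster count, fuzziness parameter and toy data are illustrative assumptions, not the paper's parameters.

```python
# Hedged sketch: fuzzy c-means on a 1-D set of pixel intensities.
import numpy as np

def fuzzy_c_means(values, n_clusters=2, m=2.0, iters=50):
    rng = np.random.default_rng(0)
    u = rng.random((len(values), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                  # fuzzy memberships per pixel
    for _ in range(iters):
        w = u ** m
        centers = (w.T @ values) / w.sum(axis=0)       # membership-weighted cluster centers
        dist = np.abs(values[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (dist ** (2 / (m - 1)))              # update memberships from distances
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

pixels = np.concatenate([np.random.normal(50, 5, 200),    # dark board background
                         np.random.normal(200, 5, 200)])  # bright copper traces
centers, memberships = fuzzy_c_means(pixels)
print(np.sort(centers))    # approximately [50, 200]
```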
Report of retraction
V.N. Gridin
Abstract. According to the "Izvestiya SFedU. Engineering Sciences"
editorial decision, the given article is retracted due to the fact that it
coincides by 69.47 % with the article "Construction of the automated design
systems based on service-oriented architecture" by G.D. Dmitrievich and
D.A. Anisimov, Proceedings of Saint Petersburg Electrotechnical
University Journal No. 2/2014.