
#### Name

Orlov Alexander Ivanovich

Professor

#### Research interests

Statistical methods; organizational-economic modeling. Developed a new area of applied statistics: statistics of objects of non-numeric nature.


## Articles count: 139

• pdf  352.019kb doc 352.019kb Views: 854 Date: 30.10.2015
Description
The article is devoted to nonparametric point and interval estimation of characteristics of a probability distribution (expectation, median, variance, standard deviation, coefficient of variation) from sample data. The sample values are regarded as realizations of independent identically distributed random variables with an arbitrary distribution function possessing the required number of moments. The nonparametric procedures are compared with parametric procedures based on the assumption that the sample values follow a normal distribution. Point estimators are constructed in the obvious way, using sample analogues of the theoretical characteristics. Interval estimators are based on the asymptotic normality of sample moments and of functions of them. The nonparametric asymptotic confidence intervals are obtained through a special technology for deriving asymptotic relations in applied statistics. In the first step, this technology applies the multidimensional central limit theorem to sums of vectors whose coordinates are powers of the initial random variables. In the second step, the limiting multivariate normal vector is transformed into the vector of interest to the researcher; here linearization is used and infinitesimal quantities are discarded. The third step is a rigorous justification of the results at the standard level of mathematical-statistical reasoning; it typically requires necessary and sufficient conditions for the inheritance of convergence. The article contains 10 numerical examples. The initial data are observations of the operating time to the limit state of 50 cutting tools. Methods developed under the assumption of a normal distribution can lead to noticeably distorted conclusions when the normality hypothesis fails. The practical recommendation: for the analysis of real data, nonparametric confidence limits should be used
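As a minimal illustration of the interval estimators discussed above, the following sketch builds an asymptotic nonparametric confidence interval for the variance from sample moments, valid for any distribution with a finite fourth moment. The function name and the normal quantile 1.96 (a 95% level) are our illustrative choices, not taken from the article:

```python
import math

def nonparam_ci_variance(x, z=1.96):
    """Asymptotic nonparametric confidence interval for the variance.

    Uses the asymptotic normality of the sample variance:
    Var(s^2) ~ (m4 - s2^2) / n, where m4 is the fourth central
    sample moment. No normality assumption on the data is made.
    """
    n = len(x)
    mean = sum(x) / n
    s2 = sum((v - mean) ** 2 for v in x) / n          # sample variance
    m4 = sum((v - mean) ** 4 for v in x) / n          # fourth central moment
    half = z * math.sqrt(max(m4 - s2 ** 2, 0.0) / n)  # delta-method half-width
    return s2 - half, s2 + half
```

Unlike the chi-square interval derived under normality, this interval widens or narrows automatically with the kurtosis of the actual data.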
• pdf  457.229kb doc 457.229kb Views: 879 Date: 30.10.2015
Description
In various applications it is necessary to analyze expert orderings, i.e., clustered rankings of the objects under examination. Such areas include technical studies, ecology, management, economics, sociology, forecasting, etc. The objects may be product samples, technologies, mathematical models, projects, job applicants, and others. Clustered rankings can be obtained both with the help of experts and objectively, for example by comparing mathematical models with experimental data using a particular quality criterion. The method described in this article was developed in connection with problems of chemical safety and environmental security of the biosphere. We propose a new method for constructing a clustered ranking that is, in the sense discussed in this work, an average of all the clustered rankings under consideration. The contradictions between the individual initial rankings are then contained within the clusters of the average (coordinated) ranking. As a result, the ordering of the clusters reflects the general opinion of the experts, more precisely, the common part contained simultaneously in all the original rankings. The newly built clustered ranking is often called the matching (coordinated) ranking with respect to the original clustered rankings. The clusters enclose the objects about which some of the initial rankings contradict one another. For these objects new studies are necessary; they may be formally mathematical (calculation of the Kemeny median, ordering by means of averages and medians of ranks, etc.) or may require new information from the relevant application area, possibly including additional scientific research. In this article we introduce the necessary concepts, formulate in general terms a new algorithm for constructing a coordinated ranking from several clustered rankings, and discuss its properties
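The general idea, merging contradictorily ordered objects into common clusters and keeping the direction on which all rankings agree, can be sketched as follows. This is a simplified one-pass version under our own conventions (a ranking is a dict mapping each object to its cluster number, smaller meaning better), not the article's exact procedure:

```python
from itertools import combinations

def coordinated_ranking(rankings):
    """Sketch of a matching (coordinated) clustered ranking.

    Objects that some pair of rankings orders in opposite ways are
    merged into one cluster (transitively, via union-find); the
    resulting clusters are then ordered by average rank, which
    respects the direction common to all input rankings.
    """
    objects = sorted(rankings[0])
    parent = {o: o for o in objects}

    def find(o):
        while parent[o] != o:
            parent[o] = parent[parent[o]]
            o = parent[o]
        return o

    # Merge every pair on which two rankings strictly contradict.
    for a, b in combinations(objects, 2):
        signs = {(r[a] > r[b]) - (r[a] < r[b]) for r in rankings}
        if 1 in signs and -1 in signs:
            parent[find(a)] = find(b)

    clusters = {}
    for o in objects:
        clusters.setdefault(find(o), []).append(o)
    key = lambda c: sum(r[o] for r in rankings for o in c) / len(c)
    return sorted((sorted(c) for c in clusters.values()), key=key)
```

For example, two rankings that agree that `a` comes first but disagree on the order of `b` and `c` yield the coordinated ranking `[['a'], ['b', 'c']]`, with the contradictory pair enclosed in one cluster for further study.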
• pdf  352.708kb doc 352.708kb Views: 754 Date: 30.10.2015
Description
The basic ideas of the solidary information economy developed by us are analyzed (its original name was the non-formal information economy of the future). Its use as the basis of a modern organizational-economic theory, in place of "economics", is substantiated. The core of research in this field is forecasting the development of future society and its economy, and developing the organizational-economic methods and models needed in the future to increase the efficiency of management processes. Economics is the science of how to produce, not of how to divide profit. The basic kernel of the modern economic theory is engineering economics. As the economic component of the state ideology of Russia we offer the solidary information economy. According to it, modern information technology and decision theory make it possible, on the basis of an "open network society", to build an information and communication system designed to identify the needs of people and to organize production in order to meet them. Implementing this requires the political will of the leadership of an economic unit, aimed at transforming the management of that unit. In particular, as is already happening in all developed countries, the Russian state should become a major player in the economy
• pdf  172.883kb doc 172.883kb Views: 698 Date: 30.11.2015
Description
An objective analysis shows that the arsenal of managers, especially foreign ones, contains practically no fundamentally new methods and tools. However, promising mathematical and instrumental methods of controlling are being actively developed in our country. In the XXI century a new paradigm of mathematical methods of economics has been developed, and more than 10 books written in accordance with this paradigm have been produced. The new paradigm rests on the modern development of mathematics as a whole, on systemic interval fuzzy mathematics. It offers the tools of nonparametric statistics, which assume that the distribution functions are arbitrary. In 1979 one of the four main areas of modern applied statistics was identified: statistics of objects of non-numeric nature (statistics of non-numeric data, non-numeric statistics). The other three are statistics of random variables, multivariate statistical analysis, and statistics of random processes and time series. Statistics of objects of non-numeric nature is central to modern mathematical methods of economics. On the basis of modern information and communication technologies we have developed a new economic theory, the solidary information economy. New intellectual tools of controlling include automated system-cognitive analysis (ASA) and its software, the "Eidos" system. A systems approach to solving specific applied problems often requires going beyond economics. Procedures for introducing innovative methods and tools are very important
• pdf  273.905kb doc 273.905kb Views: 708 Date: 30.11.2015
Description
The facts presented in this article demonstrate the great importance in today's world of strategic management, of methods for analyzing innovations and investments, and of the role of decision theory in these economic disciplines. We give a retrospective analysis of the development of nuclear physics research. For the development of fundamental and applied science in the second half of the twentieth century, two events were of very great importance: the decision of US President Roosevelt to deploy the nuclear program (adopted in response to a letter from Einstein), and the coincidence in time of the completion of the nuclear bomb with the end of World War II. The nuclear bombing of Hiroshima and Nagasaki determined developments in science and technology for the entire second half of the twentieth century. For the first time in history, the leaders of the leading countries clearly saw that fundamental research can bring great practical benefit (from their point of view): namely, it can yield brand-new, super-powerful weapons. The consequence was broad organizational and financial support of fundamental research and of the applied research deriving from it. The influence of fundamental and applied research on the development and effective use of new technology and on technical progress is analyzed. We consider the development of mathematical methods of research and of information technology, in particular the myth of "artificial intelligence"
• pdf  283.889kb doc 283.889kb Views: 543 Date: 30.11.2015
Description
We are developing a new organizational-economic theory, the solidary information economy, based on the views of Aristotle. The name of this theory has changed over time: initially we used the term "non-formal information economy of the future" and then began to use "solidary information economy". In connection with Biocosmology and neo-Aristotelism, the more adequate term "functionalist organic information economy" is preferred. The further development of our theory is the subject of this article. We begin with a brief review of the economic views of Aristotle and the basic ideas of the solidary information economy. Then the withering away of the family, private property, and the state is substantiated. We discuss the evolution of money, from gold coins to IOUs and conventional units of circulation. We argue that the market economy remained in the XIX century, and that the mainstream of modern economic science amounts to a justification of the insolvency of a market economy and of the need to move to a planned system of economic management. We examine the impact of ICT on economic activity and develop approaches to decision-making in the functionalist organic information economy. On the basis of modern decision theory (especially expert procedures) and information and communication technologies, earthlings can rid themselves of chrematistics and will understand the term "economy" according to Aristotle
• pdf  262.321kb doc 262.321kb Views: 1075 Date: 30.12.2015
Description
When developing management decisions, ratings are widely used for the joint consideration and comparison of various factors and for the partial removal of uncertainty. In decision theory the terms "composite index" and "integrated indicator" are used in almost the same sense. The article is devoted to the mathematical theory of ratings as tools for studying socio-economic systems. We consider primarily linear ratings, i.e., linear functions of individual (partial) indicators (factors, criteria), constructed using coefficients of importance (weights). The study discusses the factors affecting the magnitude of the ratings. Three groups of causes affect the value of a linear rating: the ways of measuring the individual indicators; the choice of the set of indicators; and the values of the coefficients of importance. We also consider binary ratings, in which the rating takes two values; to compare such ratings we use a new indicator of the quality of diagnostics and of prognostic power. Significantly, in many managerial situations, substantial differences between objects are identified by any rating whatsoever. According to the fundamental results of stability theory, the same source data should be processed in several ways: conclusions that match across multiple methods likely reflect the properties of reality, while differences are the result of the subjective selection of a method. When comparing objects according to several indicators (criteria, ratings), including in dynamics, selection of the Pareto set is very useful. We discuss examples of the application of decision theory, expert evaluations, and ratings in the development of complex technical systems
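A minimal sketch of the two constructions mentioned above, a linear rating and Pareto-set selection, assuming larger indicator values are better (the function names and conventions are ours):

```python
def linear_rating(indicators, weights):
    """Linear rating: weighted sum of partial indicators."""
    return sum(w * x for w, x in zip(weights, indicators))

def pareto_set(objects):
    """Keep the objects (tuples of criteria values) that no other
    object dominates, i.e., is at least as good on every criterion
    and differs on at least one."""
    def dominated(a, b):
        return all(y >= x for x, y in zip(a, b)) and a != b
    return [a for a in objects
            if not any(dominated(a, b) for b in objects)]
```

Note that the Pareto set does not depend on any choice of weights, which is why it is a useful complement to a single linear rating.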
• pdf  183.129kb doc 183.129kb Views: 759 Date: 30.12.2015
Description
The purpose of mathematical statistics is the development of data-analysis methods intended to solve applied problems. Over time, approaches to the development of such methods have changed. A hundred years ago it was assumed that the data follow a distribution of a certain type, for example a normal distribution, and statistical theory was developed under that assumption. At the next stage, limit theorems came to the fore in theoretical studies. By a "small sample" we mean a sample to which conclusions based on limit theorems cannot be applied. In each statistical problem it is necessary to divide the finite sample sizes into two classes: those for which the limit theorems can be applied, and those for which they cannot because of the risk of incorrect conclusions. To solve this problem, the Monte Carlo method is often used. More complex problems arise when studying the effect of various deviations from the original assumptions on the properties of statistical data-analysis procedures; here, too, the Monte Carlo method is often used. The basic (and not generally solved) problem in studying the stability of conclusions under deviations from parametric families of distributions is the choice of the distributions to use in the modeling. We consider some examples of the application of the Monte Carlo method relating to the activities of our research team, and we formulate the basic unsolved problems
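A sketch of the kind of Monte Carlo study described above: estimating the actual coverage of the nominal 95% asymptotic interval for the mean under a chosen data-generating distribution. The sampler, sample size, and run count below are illustrative choices of ours:

```python
import math
import random

def mc_coverage(sampler, n, runs=2000, z=1.96, true_mean=0.0, seed=1):
    """Monte Carlo estimate of the actual coverage probability of the
    nominal 95% z-interval for the mean, for samples of size n drawn
    by sampler(rng)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(runs):
        x = [sampler(rng) for _ in range(n)]
        m = sum(x) / n
        s = math.sqrt(sum((v - m) ** 2 for v in x) / (n - 1))
        half = z * s / math.sqrt(n)
        hits += (m - half <= true_mean <= m + half)
    return hits / runs
```

Comparing, say, `lambda r: r.gauss(0, 1)` with `lambda r: r.expovariate(1.0) - 1.0` at the same small `n` shows how far the actual coverage can drift from the nominal 95% when the limit theorem has not yet "kicked in".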
• pdf  184.388kb doc 184.388kb Views: 781 Date: 30.12.2015
Description
In statistical hypothesis testing, critical values usually correspond to a priori fixed (nominal) significance levels. Typically the researcher uses one of the three values 0.01, 0.05, 0.1, to which a few further levels may be added: 0.001, 0.005, 0.02, and others. However, for statistics with discrete distribution functions, which in particular include all nonparametric statistical tests, the real significance levels may differ from the nominal ones, sometimes by several times. By the real significance level we mean the highest significance level attainable by the discrete statistic that does not exceed a given nominal significance level (i.e., the transition to the next attainable value of the discrete statistic corresponds to a significance level greater than the given nominal one). In the article we discuss the difference between nominal and real significance levels using the example of nonparametric tests for the homogeneity of two independent samples. We study the two-sample Wilcoxon test, the van der Waerden test, the two-sided two-sample Smirnov test, the sign test, and the runs test (Wolfowitz), and calculate the real significance levels of these tests for a nominal significance level of 0.05. The power of these statistical tests is studied by means of the Monte Carlo method. The main conclusion: for small sample sizes, using nominal significance levels instead of real ones for discrete statistics is inadmissible
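The notion of a real significance level is easy to illustrate on the sign test, whose statistic is Binomial(n, 1/2) under the null hypothesis. This sketch (our own, not the article's computation) enumerates the attainable two-sided levels and picks the largest one not exceeding the nominal level:

```python
from math import comb

def real_level(n, nominal=0.05):
    """Real significance level of the two-sided sign test: the
    largest attainable level of the Binomial(n, 1/2) statistic
    that does not exceed the nominal one."""
    def lower_tail(k):  # P(B <= k) for B ~ Binomial(n, 1/2)
        return sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    best = 0.0
    for k in range(n // 2 + 1):
        level = min(2 * lower_tail(k), 1.0)  # two-sided, symmetric
        if level <= nominal:
            best = max(best, level)
    return best
```

For n = 10 the real level is 22/1024 ≈ 0.0215, less than half the nominal 0.05: exactly the kind of discrepancy the abstract describes.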
• pdf  188.287kb doc 188.287kb Views: 358 Date: 27.01.2016
Description
In many applications we study a time series (or a random process) that is the sum of a periodic deterministic function of time and random errors distorting the periodic signal. It is required to estimate the length of the period and the periodic component. We do not assume that the periodic function belongs to any parametric family of functions, such as finite sums of sines and cosines. It is obvious that such an assumption does not reflect the characteristics of the real world, i.e., is a conditional, purely mathematical convenience (looking for the keys under the lamp because there is light, rather than in the bushes where they were lost, because it is dark there). For similar reasons, it cannot be assumed that the distribution function of the random errors belongs to any parametric family of distributions. In accordance with the new paradigm of mathematical statistics, this article studies the problem of nonparametric estimation of the (minimal) length of the period and of the periodic component of the signal. On the basis of natural indicators of variation and range, a new class of nonparametric estimators of the length of the period and of the periodic component of a time series is proposed. Based on general results of statistics of objects of non-numeric nature, the consistency of these estimators is proved. From the practical point of view, it is necessary to minimize numerically (over one parameter, the candidate length of the period) one or more of the 66 functionals described in the article
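One simple member of such a class of estimators can be sketched as follows: fold the series modulo each candidate integer period and minimize the within-phase variability. This particular functional is an illustration of ours, not one of the 66 functionals described in the article:

```python
def period_score(y, T):
    """Mean within-phase variance after folding the series modulo T;
    it is small when T is (a multiple of) the true period."""
    total = 0.0
    for phase in range(T):
        vals = y[phase::T]
        m = sum(vals) / len(vals)
        total += sum((v - m) ** 2 for v in vals) / len(vals)
    return total / T

def estimate_period(y, t_max):
    """Smallest candidate period minimizing the functional
    (min returns the first minimizer, hence the minimal period)."""
    return min(range(2, t_max + 1), key=lambda T: period_score(y, T))
```

Because multiples of the true period also make the functional small, taking the smallest minimizer corresponds to the minimal period length that the abstract mentions.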