Bauman Moscow State Technical University
-
DO WE ALWAYS NEED A SUPPLIER’S QUALITY CONTROL?
Description: The higher the quality level achieved, the larger the control sample must be - this is the paradox of the classical theory of statistical control. A possible way out is a technical policy based on economic characteristics: shifting control to the consumer may be economically advantageous. We consider two variants of such a technical policy - increasing the lot size and replacing defective product units at the consumer's site.
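The economic comparison behind the two policies can be sketched numerically. The following is a minimal illustration, not the article's actual model: all costs, rates, and lot parameters below are invented for the example.

```python
# Hypothetical cost comparison of two technical policies:
# (1) supplier-side sampling control of every lot,
# (2) shipping uninspected lots and replacing defective units at the consumer.
# All numeric values are illustrative assumptions, not data from the article.

def cost_supplier_control(lot_size, sample_size, unit_test_cost):
    """Cost of inspecting a sample from each lot at the supplier."""
    return sample_size * unit_test_cost

def cost_consumer_replacement(lot_size, defect_rate, unit_replace_cost):
    """Expected cost of replacing defective units found by the consumer."""
    return lot_size * defect_rate * unit_replace_cost

lot, n, c_test, q, c_repl = 1000, 125, 2.0, 0.001, 50.0
print(cost_supplier_control(lot, n, c_test))      # 250.0
print(cost_consumer_replacement(lot, q, c_repl))  # 50.0
```

At a high achieved quality level (small defect rate q), replacement at the consumer can be cheaper than sampling control, which is the economic point the abstract makes.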
-
MOVING FORWARD TO ARISTOTLE: WE MUST BE FREE FROM THE PERVERSIONS OF ECONOMIC THEORY
Description: Aristotle is the founder of economic theory, and the so-called "market economy" is a perversion of his views; these distortions must be eliminated. What can replace the "market economy"? We are developing a new organizational-economic theory - the solidary information economy - based on Aristotle's views. The name of this theory has changed over time: initially we used the term "non-formal information economy of the future", then "solidary information economy"; in connection with Biocosmology and neo-Aristotelism, the more adequate term is "functionalist-organic information economy". This article describes the main provisions of the solidary information economy, intended to replace the market economy as a management tool, and discusses the main problems to whose solution research on this basic organizational-economic theory is devoted. We discuss the positions of Aristotle on which economic theory, and in particular the solidary information economy, is based. We argue that the market economy remained in the XIX century, and that the mainstream of modern economic science is the justification of the insolvency of the market economy and of the need to move to a planned system of economic management. We examine the impact of information and communication technologies on economic activity and develop approaches to decision-making in the solidary information economy. On the basis of modern decision theory (especially expert procedures) and information and communication technologies, people can rid themselves of chrematistics and understand the term "economy" as Aristotle did.
-
RENEWAL DEPENDENCE METHOD OF LEAST SQUARES BASED NONPARAMETRIC MODEL WITH PERIODIC COMPONENT
01.00.00 Physical-mathematical sciences
Description: We consider the nonparametric problem of estimating a renewal dependence described by the sum of a linear trend and a periodic function with a known period. We obtain the asymptotic distribution of the estimates of the parameters and of the trend component, and give methods for estimating the periodic component and constructing an interval forecast. The conditions of applicability are justified in a model of observation points that is natural for applications. In particular, we prove that the estimate of the coefficient of the linear term is asymptotically unbiased.
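A trend-plus-periodic model of this kind can be fitted by least squares. The sketch below is an illustrative backfitting scheme under the stated model, not the authors' exact procedure: it alternates ordinary least squares for the linear trend with phase-wise averaging of the detrended series for the periodic component.

```python
# Illustrative fit of y_t = a + b*t + s(t), where s is periodic with known
# period P and sums to zero over one period (identification condition).
# Backfitting: alternately (1) fit the line to y minus the current periodic
# component, (2) re-estimate s as centered phase means of the detrended data.

def ols_line(t, y):
    """Ordinary least squares for y = a + b*t."""
    n = len(t)
    tm, ym = sum(t) / n, sum(y) / n
    b = sum((ti - tm) * (yi - ym) for ti, yi in zip(t, y)) \
        / sum((ti - tm) ** 2 for ti in t)
    return ym - b * tm, b

def fit_trend_plus_periodic(y, period, sweeps=30):
    t = list(range(len(y)))
    s = [0.0] * period
    for _ in range(sweeps):
        a, b = ols_line(t, [yi - s[ti % period] for ti, yi in zip(t, y)])
        detr = [yi - (a + b * ti) for ti, yi in zip(t, y)]
        s = [sum(detr[ph::period]) / len(detr[ph::period])
             for ph in range(period)]
        mean_s = sum(s) / period
        s = [v - mean_s for v in s]   # keep s centered
    return a, b, s

# noiseless check: y = 1 + 0.5*t + s(t) with s = (1, 1, -2), period 3
true_s = [1.0, 1.0, -2.0]
y = [1.0 + 0.5 * t + true_s[t % 3] for t in range(30)]
a, b, s = fit_trend_plus_periodic(y, 3)
print(round(a, 6), round(b, 6), [round(v, 6) for v in s])
```

On noiseless data the scheme recovers the trend coefficients and the periodic component exactly (up to machine precision), which makes it a convenient sanity check before adding noise.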
-
INTERCONNECTION LIMIT THEOREMS AND MONTE-CARLO METHOD
01.00.00 Physical-mathematical sciences
Description: The purpose of mathematical statistics is to develop data-analysis methods for solving applied problems. Approaches to developing such methods have changed over time. A hundred years ago it was assumed that the data follow a distribution of a certain type, for example a normal distribution, and statistical theory was built on that assumption. At the next stage, limit theorems came to the fore in theoretical studies. By a "small sample" we mean a sample to which conclusions based on limit theorems cannot be applied. In each statistical problem one has to divide the finite sample sizes into two classes - those for which the limit theorems can be applied, and those for which they cannot because of the risk of incorrect conclusions. To solve this problem we have often used the Monte Carlo method. More complex problems arise when studying how various deviations from the original assumptions affect the properties of statistical data-analysis procedures; to study such effects we have also often used the Monte Carlo method. The basic problem - not solved in a general way - in studying the stability of conclusions under deviations from parametric families of distributions is the choice of the distributions to use in the modeling. We consider some examples of the application of the Monte Carlo method drawn from the work of our research team, and formulate the basic unsolved problems.
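The question "for which sample sizes can a limit theorem be trusted?" can be answered by simulation. A minimal sketch (the exponential distribution and the nominal 95% normal-approximation interval are chosen here purely for illustration, not taken from the article): estimate the actual coverage of the CLT-based confidence interval for the mean at several sample sizes.

```python
import math
import random

# Monte Carlo estimate of the actual coverage of the nominal 95%
# normal-approximation confidence interval for the mean, when the data
# come from an exponential distribution with true mean 1 (a skewed case).

def clt_coverage(n, reps=5000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        x = [rng.expovariate(1.0) for _ in range(n)]   # true mean = 1
        m = sum(x) / n
        s = math.sqrt(sum((xi - m) ** 2 for xi in x) / (n - 1))
        half = 1.96 * s / math.sqrt(n)                 # CLT-based half-width
        hits += (m - half <= 1.0 <= m + half)
    return hits / reps

for n in (5, 30, 200):
    print(n, clt_coverage(n))
```

For small n the actual coverage falls visibly short of 95%, i.e. the limit theorem cannot yet be applied; as n grows, the coverage approaches the nominal level. This is exactly the division of sample sizes into two classes that the abstract describes.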
-
PROBABILITY MODELS FOR OBTAINING NON-NUMERICAL DATA
01.00.00 Physical-mathematical sciences
Description: The statistics of objects of non-numerical nature (statistics of non-numerical objects, non-numerical data statistics, non-numeric statistics) is the area of mathematical statistics devoted to methods of analyzing non-numeric data. The basis for applying the results of mathematical statistics is probabilistic-statistical models of real phenomena and processes, the most important (and often the only ones) of which are models for obtaining data. The simplest example of such a model is the model of a sample as a set of independent identically distributed random variables. In this article we consider the basic probabilistic models for obtaining non-numeric data: models of dichotomous data, results of paired comparisons, binary relations, ranks, and objects of a general nature. We discuss various versions of these probabilistic models and their practical use. For example, the basic probabilistic model of dichotomous data is the Bernoulli vector (Lucian), i.e. a finite sequence of independent Bernoulli trials for which the probabilities of success may differ.
The mathematical tools for solving various statistical problems associated with Bernoulli vectors are useful for the analysis of random tolerances and of random sets with independent elements; for processing the results of independent paired comparisons; in statistical methods for analyzing the accuracy and stability of technological processes; in the analysis and synthesis of statistical quality-control plans (for dichotomous characteristics); in processing marketing and sociological questionnaires (with closed "yes"/"no" questions); and in processing socio-psychological and medical data, in particular responses to psychological tests such as the MMPI (used, among other things, in problems of human resource management) and the analysis of topographic maps (applied to the analysis and prediction of affected areas after technological disasters, the spread of corrosion, of environmentally harmful pollutants, and of various diseases, including myocardial infarction), and in other situations.
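The Bernoulli-vector model is easy to make concrete. A minimal sketch (the probabilities below are arbitrary illustrative values): simulate one realization of a Lucian, and compute the exact distribution of the number of successes - the Poisson-binomial distribution - by dynamic programming over the trials.

```python
import random

# A Bernoulli vector ("Lucian") is a finite sequence of independent
# Bernoulli trials whose success probabilities p_1, ..., p_k may differ.

def simulate_lucian(p, rng):
    """One realization: a 0/1 vector with P(x_i = 1) = p[i]."""
    return [int(rng.random() < pi) for pi in p]

def successes_distribution(p):
    """Exact distribution of the number of successes (Poisson-binomial)."""
    dist = [1.0]                       # 0 trials: 0 successes w.p. 1
    for pi in p:
        new = [0.0] * (len(dist) + 1)
        for k, prob in enumerate(dist):
            new[k] += prob * (1 - pi)  # trial fails, count unchanged
            new[k + 1] += prob * pi    # trial succeeds, count + 1
        dist = new
    return dist

p = [0.9, 0.5, 0.1]                    # illustrative, unequal probabilities
print(simulate_lucian(p, random.Random(0)))
print(successes_distribution(p))
```

Because the success probabilities differ, the number of successes is not binomial; the dynamic program above gives its exact law with k + 1 support points.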
-
PROBABILISTIC-STATISTICAL MODELING THE INTERFERENCES FROM ELECTRIC LOCOMOTIVES
01.00.00 Physical-mathematical sciences
Description: Moving electric locomotives create interferences that affect wired communication links. Creating sufficiently effective and at the same time cost-effective means of protecting wire lines from interference generated by traction networks requires, as a preparatory stage, the development of mathematical models of the interference caused by electric locomotives. We have developed such a probabilistic-statistical model. The asymptotic distribution of the total interference is the distribution of the length of a two-dimensional random vector whose coordinates are independent normally distributed random variables with mean 0 and variance 1. A limit theorem is proved for the expectation of the total interference amplitude, and the Monte Carlo method is used to study the rate of convergence of this expectation to its limiting value. We used the MacLaren-Marsaglia mixing algorithm (M-algorithm). Five sets of amplitudes are analyzed, selected in accordance with the recommendations of experts in the field of AC traction networks. Convergence to the limit is fastest in the case of equal amplitudes. It was found that the maximum possible average value of the amplitude of the random noise is 7.4% less than the previously used value, which promises a significant economic effect.
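The limiting distribution named in the abstract - the length of a two-dimensional vector with independent N(0, 1) coordinates - is the Rayleigh distribution, whose expectation is sqrt(pi/2). A minimal Monte Carlo check of that limiting expectation (this sketch reproduces only the limiting model, not the authors' full interference simulation or their M-algorithm generator):

```python
import math
import random

# Monte Carlo estimate of E|Z|, where Z = (X, Y) with independent
# N(0, 1) coordinates; |Z| is Rayleigh-distributed with mean sqrt(pi/2).

def mean_amplitude(reps=200000, seed=7):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        x, y = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        total += math.hypot(x, y)      # length of the 2-D vector
    return total / reps

print(mean_amplitude(), math.sqrt(math.pi / 2))
```

The simulated mean settles near sqrt(pi/2) ≈ 1.2533, the limiting value against which the convergence of the finite-sum interference amplitude is studied.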
-
PROBABILITY-STATISTICAL MODELS OF CORRELATION AND REGRESSION
08.00.13 Mathematical and instrumental methods of Economics
Description: The correlation and determination coefficients are widely used in statistical data analysis. According to measurement theory, Pearson's linear paired correlation coefficient is applicable to variables measured on an interval scale and cannot be used in the analysis of ordinal data; the nonparametric Spearman and Kendall rank coefficients estimate the relationship between ordinal variables. The critical value for testing whether the correlation coefficient differs significantly from 0 depends on the sample size, so using the Chaddock scale is incorrect. In a passive experiment, correlation coefficients can reasonably be used for prediction but not for control; to obtain probabilistic-statistical models intended for control, an active experiment is required. The effect of outliers on the Pearson correlation coefficient is very large. As the number of analyzed sets of predictors grows, the maximum of the corresponding correlation coefficients - indicators of approximation quality - noticeably increases (the effect of "inflation" of the correlation coefficient). Four main regression-analysis models are considered. Least-squares models with a deterministic independent variable are distinguished: the distribution of deviations is arbitrary, but to obtain the limit distributions of parameter estimates and regression dependences we assume that the conditions of the central limit theorem are satisfied. The second type of model is based on a sample of random vectors: the dependence is nonparametric, and the distribution of the two-dimensional vector is arbitrary. The variance of the independent variable, as well as the determination coefficient as a criterion of model quality, can be discussed only in the model based on a sample of random vectors. Time-series smoothing is discussed, and methods of restoring dependences in spaces of a general nature are considered.
It is shown that the limiting distribution of the natural estimate of the model's dimensionality is geometric, and that the construction of an informative subset of features runs into the effect of "inflation" of the correlation coefficient. Various approaches to the regression analysis of interval data are discussed. Analysis of the variety of regression-analysis models leads to the conclusion that there is no single "standard model".
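The strong effect of outliers on the Pearson coefficient, contrasted with the robustness of rank correlation, is easy to demonstrate. A minimal sketch (the data are invented for the example; pure-Python implementations of the coefficients are used to keep it self-contained):

```python
# One extreme point changes the Pearson correlation drastically, while the
# Spearman rank correlation, which uses only the ordering, changes far less.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def spearman(x, y):
    # ranks via sorted position; assumes no ties among the values
    rank = lambda v: [sorted(v).index(t) + 1 for t in v]
    return pearson(rank(x), rank(y))

x = list(range(1, 21))
y = list(x)                       # perfect linear relationship
print(pearson(x, y), spearman(x, y))   # both 1.0

x2, y2 = x + [21], y + [-100]     # one gross outlier appended
print(pearson(x2, y2), spearman(x2, y2))
```

With the outlier, the Pearson coefficient collapses to a negative value (about -0.14 here), while the Spearman coefficient stays clearly positive (about 0.73), illustrating why rank coefficients are preferred for ordinal or contaminated data.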
-
PROBABILISTIC-STATISTICAL METHODS IN GNEDENKO’S RESEARCHES
01.00.00 Physical-mathematical sciences
Description: We analyze the probabilistic-statistical methods in the research of Boris Vladimirovich Gnedenko, academician of the Ukrainian Academy of Sciences, which remain very important for the XXI century. We discuss limit theorems of probability theory, mathematical statistics, reliability theory, statistical methods of quality control, and queuing theory, and give some information about the main stages of B.V. Gnedenko's scientific career and his views on the history of mathematics and on teaching.
-
PROBABILISTIC-STATISTICAL METHODS IN KOLMOGOROV’S RESEARCHES
01.00.00 Physical-mathematical sciences
Description: From a modern point of view we discuss Kolmogorov's research on the axiomatic approach to probability theory, the goodness-of-fit test of an empirical distribution against a theoretical one, the properties of the median as an estimate of the center of a distribution, the effect of "swelling" of the correlation coefficient, the theory of means, the statistical theory of crystallization of metals, the least-squares method, the properties of sums of a random number of random variables, statistical control, unbiased estimates, the axiomatic derivation of the logarithmic normal distribution in crushing, and methods of detecting differences in weather-modification experiments.
-
BASIC RESULTS OF THE MATHEMATICAL THEORY OF CLASSIFICATION
01.00.00 Physical-mathematical sciences
Description: The mathematical theory of classification contains a large number of approaches, models, methods, and algorithms; it is very diverse. We distinguish three basic results in it: the best method of diagnostics (discriminant analysis), an adequate quality indicator for a discriminant-analysis algorithm, and the statement that iterative cluster-analysis algorithms stop after a finite number of steps. Namely, on the basis of the Neyman-Pearson lemma we show that an optimal method of diagnostics exists and can be expressed through the probability densities corresponding to the classes; if the densities are unknown, one should use nonparametric estimates based on training samples. The quality of a diagnostic algorithm is often measured by "the probability (or share) of correct classification (diagnosis)" - the higher the indicator, the better the algorithm. We show that the widespread use of this indicator is unjustified and offer another - "predictive power", obtained by a transformation in the model of linear discriminant analysis. The stop of iterative cluster-analysis algorithms after a finite number of steps is demonstrated by the example of the k-means method. In our opinion, these results are fundamental to the theory of classification, and every specialist developing or applying it should be familiar with them.
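The finite-stop property of k-means can be seen directly in code. A minimal one-dimensional sketch (the data and initialization are invented for the example): since each step does not increase the within-cluster sum of squares and there are only finitely many partitions of the points, the assignment vector must stop changing after finitely many iterations.

```python
import random

# Minimal 1-D k-means illustrating termination after a finite number of
# steps: the loop exits as soon as the cluster assignments stop changing.

def kmeans(points, k, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)    # initialize at k distinct points
    assign = None
    steps = 0
    while True:
        # assignment step: each point goes to its nearest center
        new_assign = [min(range(k), key=lambda j: (p - centers[j]) ** 2)
                      for p in points]
        if new_assign == assign:       # fixed point reached: stop
            return centers, assign, steps
        assign = new_assign
        # update step: each center becomes the mean of its cluster
        for j in range(k):
            members = [p for p, a in zip(points, assign) if a == j]
            if members:
                centers[j] = sum(members) / len(members)
        steps += 1

pts = [0.9, 1.1, 1.0, 4.8, 5.2, 5.0]   # two well-separated groups
centers, assign, steps = kmeans(pts, 2)
print(sorted(centers), steps)
```

On this well-separated toy data the algorithm reaches its fixed point - centers at the two group means - in a handful of iterations, whatever pair of points the initialization picks.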