This article examines the problem of managing ore flow quality at mining enterprises from the perspective of applying big data to improve the efficiency of mineral quality management. It is noted that assessing the feasibility of collecting and processing big data for ore flow quality control requires an optimal, quantifiable weight parameter that determines the discreteness of data collection and the effectiveness of data processing. Currently, this parameter is the ore (or concentrate) batch. A scientific and practical approach to determining batch sizes at mining enterprises is proposed, based not on business-process conventions but on analysis of the distribution of quality parameters within the ore body, taking into account the subsequent methods of transporting the mineral raw material. Data from every technological process within the mining technical system were analyzed, and principles were established for calculating the minimum required data sample for each stage of the process. The applicability of the Kotelnikov theorem (the Nyquist–Shannon sampling theorem) for determining the optimal quantifiable weight parameter of a mineral raw material batch within quality control frameworks is considered. To obtain a high-quality model, the required volume of quarry operating statistics ranges from 16 to 52 months of excavator operation at the face, depending on the value of the mineral quality distribution coefficient at the mining enterprise. It was also established that building such a model must take this coefficient into account: the higher its value, the lower the sampling frequency should be when collecting data from the technological processing stages.
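To make the sampling-theorem argument concrete, the sketch below shows one plausible way the Kotelnikov (Nyquist–Shannon) criterion could be applied to batch sizing: treat the ore grade measured along the ore flow as a signal over tonnage, estimate the highest frequency that carries a significant share of its spectral energy, and require at least two batches per period of that fastest variation. The synthetic grade series, the tonnage step `DT`, and the 95% spectral-energy cutoff are all illustrative assumptions, not values or methods taken from the article.

```python
# Minimal sketch (illustrative, not the authors' implementation) of
# Nyquist-based ore batch sizing. All numeric parameters are assumed.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic ore grade (%) sampled every DT tonnes along the ore flow.
DT = 100.0                      # tonnage step between grade samples, t (assumed)
n = 4096
t = np.arange(n) * DT
grade = (1.5
         + 0.4 * np.sin(2 * np.pi * t / 25_000)   # slow geological variation
         + 0.05 * rng.standard_normal(n))          # measurement noise

# One-sided power spectrum of the detrended grade series.
g = grade - grade.mean()
spectrum = np.abs(np.fft.rfft(g)) ** 2
freqs = np.fft.rfftfreq(n, d=DT)                  # cycles per tonne

# Highest frequency needed to retain 95% of the spectral energy
# (the 95% share is an assumption; the article does not state one).
cum = np.cumsum(spectrum) / spectrum.sum()
f_max = freqs[np.searchsorted(cum, 0.95)]

# Nyquist criterion: form at least two batches per period of the
# fastest significant grade variation, so batch <= 1 / (2 * f_max).
batch_size = 1.0 / (2.0 * f_max)                  # tonnes per batch
print(f"f_max = {f_max:.2e} cycles/t -> batch size <= {batch_size:,.0f} t")
```

In this reading, a deposit with smoother grade variation (lower `f_max`, corresponding to a higher quality distribution coefficient in the article's terms) permits larger batches and a lower sampling frequency, which is consistent with the abstract's concluding observation.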