Features Of The Implementation Of Big Data Projects In 2022: Results Of A Study Of The Russian Market

According to a joint study, 62% of Russian companies are already working with Big Data, and 34% of companies have been using Big Data solutions for over 3 years.

But many companies implementing Big Data projects and working with Big Data analytics run into barriers and difficulties. We asked heads of IT departments, CIOs, CTOs, and CDTOs of large Russian companies about these obstacles and identified the typical challenges. In this article, we analyze the limitations and suggest solutions.

Respondents of the study “Technologies for working with Big Data: readiness for use and main barriers” shared their approaches, investment plans, and business expectations for working with big data. The full survey results and conclusions are available in the complete version of the report.

Lack Of Competencies In The Team

A shortage of skilled data specialists is one of the main problems for most Russian companies: this factor complicates work with Big Data for 22% of the study participants.

This situation is due to the shortage of senior-level specialists in the Big Data technology stack on the labor market. Some companies solve the problem by bringing in external experts to implement Big Data projects: integrators, agencies, and consulting firms. Another 21% of survey respondents rely on developing in-house competence centers; these are often companies where Big Data projects are actively evolving and data volumes are growing rapidly.

Additional difficulties in working with Big Data are created by the departure of foreign vendors from the Russian market in 2022. As a result, companies are forced to:

  • Review the stack of technologies on which solutions for Big Data are built and plan the implementation of new solutions for data processing based on other tools;
  • Look for ways to maintain and use systems built on software that is not supported;
  • Migrate to the cloud, where the provider is responsible for providing the technology stack.

There is growing interest in Big Data specialists who can work with open-source technologies. Many companies either need to build these new competencies within their own teams or hire experts from outside.

Poorly Cataloged Data

Data quality affects the accuracy and objectivity of Big Data analytics. It is essential to define procedures for collecting and storing information, develop internal regulations, and select sources and methods for working with Big Data. This takes time and resources, which is why 21% of respondents in the Arenadata study consider poorly cataloged data one of the main obstacles.

Systematic work helps prevent this problem:

  • Inventory of the collected data
  • Cataloging of Big Data
  • Elimination of factors that produce low-quality data
  • Introduction of a culture of working with Big Data at all stages
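The first two steps above, inventory and cataloging, often start with simple automated profiling of the data that already exists. As a hedged illustration (not part of the study, and the table and column names are hypothetical), a minimal sketch in Python could record each table's columns, row count, and share of empty values, a basic quality signal that helps spot candidates for cleanup:

```python
import csv
import io

def profile_table(name, rows):
    """Build a minimal catalog entry for a table: its columns, row count,
    and the per-column share of empty values (a simple quality signal)."""
    columns = list(rows[0].keys()) if rows else []
    empty = {c: 0 for c in columns}
    for row in rows:
        for c in columns:
            # Treat missing or whitespace-only values as empty.
            if not (row.get(c) or "").strip():
                empty[c] += 1
    n = len(rows)
    return {
        "table": name,
        "rows": n,
        "columns": columns,
        "empty_ratio": {c: (empty[c] / n if n else 0.0) for c in columns},
    }

# Hypothetical CSV extract used only to demonstrate the profiler.
raw = "user_id,email\n1,a@example.com\n2,\n3,c@example.com\n"
rows = list(csv.DictReader(io.StringIO(raw)))
entry = profile_table("users", rows)
```

Running such a profiler across all sources produces catalog entries that make gaps visible (here, a third of `email` values are empty) before any Big Data analytics is built on top of the data.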
