Wargaming is looking for a Big Data Integrator/DevOps to join the Maintenance team of its Data Warehouse (DWH) unit. In this position, you will help develop and support the data warehouse.
All the DWH units share a common mission—to deliver data that helps the Company make key business decisions.
The Maintenance team operates the existing infrastructure, which consists of Oracle Exadata, a 24-node Hadoop cluster with a total capacity of 2 petabytes, and a Snowflake cloud solution. The team's responsibilities include building stable data processing pipelines, monitoring the cluster, and troubleshooting issues.
In this role, you will learn and sharpen your skills with the following tools and technologies: Hive, Impala, Sqoop, Kafka, Kudu, Spark, Snowflake, and Airflow.
What will you do?
Develop and support scheduled upload processes
Develop and optimize internal services for processing large amounts of data
Develop and support automation processes
Develop solutions for collecting and uploading data sets
Implement and support industrial solutions for data processing
Process and resolve data-related incidents
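To give a flavor of the "scheduled upload processes" mentioned above, here is a minimal sketch in Python (the posting's required scripting language) of one upload run: rows are batched and each batch is pushed with retry on transient failures. The function and parameter names (`run_upload`, `send`, `batch_size`) are illustrative assumptions, not part of Wargaming's actual codebase; in production this logic would typically live inside an Airflow task.

```python
import time

def chunked(rows, size):
    """Yield fixed-size batches from an iterable of rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def upload_with_retry(batch, send, retries=3, delay=0.0):
    """Send one batch, retrying on transient connection failures."""
    for attempt in range(1, retries + 1):
        try:
            return send(batch)
        except ConnectionError:
            if attempt == retries:
                raise  # exhausted retries: surface the incident
            time.sleep(delay)

def run_upload(rows, send, batch_size=1000):
    """One scheduled-upload run: batch the rows, push each batch, return row count."""
    sent = 0
    for batch in chunked(rows, batch_size):
        upload_with_retry(batch, send)
        sent += len(batch)
    return sent
```

Wrapped in an Airflow DAG and scheduled, a run like this is exactly the kind of process the team develops, monitors, and troubleshoots.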
What are we looking for?
Working experience as a DevOps Engineer or similar software engineering role
2+ years of experience with relational databases: Oracle, MySQL, PostgreSQL
Professional work experience with Linux
Experience scripting in Python
Good knowledge of basic concepts and principles of building data warehouses
Basic knowledge of the DevOps stack: Bash, Git, CI/CD, Jenkins, and Airflow
Perfect written and verbal communication skills in Russian
Intermediate or higher level of English
What will help you stand out?
Experience with Hadoop ecosystem (Hive, Impala, Sqoop, Spark, Kudu)
Experience with cloud data platforms (Snowflake and others)