University degree in IT or a relevant discipline, combined with a minimum of 15 years of relevant working experience in IT;
Experience in migrating legacy data systems to a modern data platform;
Experience with ETL/ELT processes, API integration, data ingestion and transformation tools (dbt, Spark, Talend, Fabric, SAP DS);
Experience with DevOps practices and tools related to data pipelines, including CI/CD for data infrastructure;
Experience with data pipeline orchestration tools (Airflow, Dagster);
Experience with database systems, both relational (PostgreSQL, Oracle) and non-relational (Elasticsearch, MongoDB);
Excellent knowledge of the design of scalable and flexible modern data architectures;
Excellent knowledge of business intelligence reporting tools;
Strong understanding of cloud data architecture principles;
Good knowledge of data modelling tools;
Knowledge of data governance frameworks and tools (DataHub, OpenMetadata, Atlas, Collibra), data quality management, data security, access control and regulatory compliance;