Requirements
What will you be bringing to the team?
- Bachelor's degree in IT or a related field and 13 years of professional experience in IT.
- Excellent knowledge of data warehouse and/or data lakehouse design & architecture.
- Excellent knowledge of open-source, code-based data transformation tools such as dbt, Spark and Trino.
- Excellent knowledge of SQL.
- Good knowledge of Python.
- Good knowledge of open-source orchestration tools such as Airflow, Dagster or Luigi.
- Experience with AI-powered assistants, such as Amazon Q, that streamline data engineering workflows.
- Good knowledge of relational database systems.
- Good knowledge of event streaming platforms and message brokers like Kafka and RabbitMQ.
- Extensive experience creating end-to-end data pipelines and working with the ELT pattern.
- Understanding of the principles behind open table formats like Apache Iceberg or Delta Lake.
- Proficiency with Kubernetes and Docker/Podman.
- Good knowledge of data modelling tools.
- Good knowledge of online analytical processing (OLAP) and data mining tools.
- Fluency in English at C1 level or above.