This is a fully remote contract offering €1000 per day, forming part of a strategic five-year transformation programme.
Location: Fully Remote (Belgium-based client)
Contract: Long-term (5-year programme)
The Programme
You will join a large-scale enterprise data platform transformation, modernizing legacy systems into a scalable, cloud-native architecture. The programme is business-critical and backed at executive level.
Key Responsibilities
- Design and build scalable data pipelines using Snowflake and Databricks
- Develop robust ELT/ETL processes in a cloud-native environment
- Optimize data models for performance, scalability, and cost efficiency
- Work closely with Data Architects, Platform Engineers, and Business stakeholders
- Implement best practices in data governance, security, and quality
- Contribute to CI/CD automation and DevOps practices within data engineering
- Support regulatory and compliance-driven banking requirements
Required Experience
- 7+ years of Data Engineering experience
- Proven hands-on expertise with Snowflake (data modeling, optimization, performance tuning)
- Strong experience with Databricks (Spark, PySpark, Delta Lake)
- Experience building enterprise-grade data platforms in cloud environments (Azure or AWS)
- Strong SQL and Python skills
- Experience within banking or financial services (highly preferred)
- Familiarity with data governance, lineage, and regulatory environments
Nice to Have
- Experience with real-time streaming (Kafka, Event Hub)
- Experience with infrastructure-as-code (Terraform, ARM, etc.)
- Knowledge of data mesh or modern data architecture principles
This is an opportunity to contribute to one of Belgium's most significant banking data transformations, with long-term stability and high visibility.