We are seeking an experienced and ambitious Senior Data Engineer to join our core data team within MSC's cross-country IoT department. This is a unique opportunity for a hands-on engineer who is passionate about building robust data solutions and eager to grow into a broader role as Data Lead or Data Architect.
Our small, agile data team operates within the 50-person IoT department, which spans Antwerp, Turin, and Chennai. You will play a pivotal role in shaping our data landscape, from raw data ingestion to the delivery of actionable insights. While a significant portion of your time will be dedicated to hands-on engineering, you will also guide technical decisions, translate business needs into technical solutions, and mentor team members.
Key Responsibilities
- Data Pipeline Development: Build, optimize, and maintain scalable Big Data ETL/ELT pipelines using Azure Metastore, RabbitMQ, and Event Hub. You will own the quality and reliability of daily data ingests.
- DevOps & Automation: Manage the data solution lifecycle using Azure DevOps, including CI/CD, Git for version control, and YAML for build and release pipelines.
- Data Transformation & Modelling: Develop complex data transformations and build curated datasets in Azure Databricks using clean, production-ready PySpark code.
- Analytics & Insights: Independently handle ad-hoc analytical requests, from querying and discovering patterns to presenting conclusions clearly to diverse audiences.
- Functional & Architectural Leadership: Act as the bridge between business and tech. Translate functional requests into technical tasks and help evolve our overall data architecture.
- Team Collaboration & Mentorship: Actively contribute in team meetings, challenge assumptions to align with business objectives, and mentor team members as you grow in the role.
Technical Profile: Required Skills & Experience
- Cloud Platform: Proven experience within the Microsoft Azure data ecosystem.
- Core Data Services:
- Azure Databricks: Expertise in optimizing clusters for large-scale data processing.
- Azure Data Lake Storage / Metastore: Strong understanding of data lake principles.
- Azure Event Hub & RabbitMQ (RMQ): Experience with real-time data ingestion, stream processing, and event-driven architectures.
- Azure DevOps: Proficiency with CI/CD, Git workflows, and YAML pipelines.
- Technical Knowledge:
- Strong understanding of data modelling techniques and architectural best practices.
- Programming & Querying:
- Expert-level PySpark for data transformation and analysis.
- Strong command of SQL.
Who You Are: Personality & Mindset
- Analytically Minded: You have strong statistical intuition and a healthy skepticism to ensure data integrity.
- A Problem Solver: You excel at translating ambiguous business requests into concrete technical solutions and actionable plans.
- Proactive & Curious: You are a self-starter who seeks to understand the "why" behind requests and actively engages to align with business goals.
- An Excellent Communicator: You confidently present findings and technical decisions to both technical and non-technical audiences.
What do we offer you?
- The opportunity to be part of our MSC family, where a family atmosphere is combined with a professional approach and a no-nonsense mentality.
- A team of experienced colleagues who will immediately immerse you in our culture and guide you in your role.
- A healthy work-life balance thanks to our flexible hours.
- The occasional evening of afterwork activities or a drink with colleagues.
- A market-based salary, supplemented with fringe benefits (meal vouchers, eco vouchers, additional vacation days, ...) and the opportunity to continuously develop yourself through training.
We also offer a relocation package if you don't live in Belgium.
Any questions on the expectations of this vacancy? Feel free to reach out to
We don't work with recruitment agencies for this vacancy. Thank you for your understanding.
Apply