💼Job Title: Data Integrator - Senior
👨‍💻Job Type: Freelance
📍Location: Brussels, Belgium
💼Work regime: Hybrid
🔥Keywords: ETL, SQL, Database, and Data warehouse
Position Overview:
Based on the defined Data Architecture, the Data Integrator develops and documents ETL solutions, which translate complex business data into usable physical data models. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.
The Data Integrator uses ETL ("Extract, Transform, Load"): the process of extracting business data from source systems, transforming it, and loading it into a data warehousing environment, where it is tested for performance and troubleshot before going live.
The Data Integrator is responsible for developing the physical data models designed to improve efficiency and outputs: the Data Vault and Data Marts, the Operational Data Store (ODS), and the Data Lakes on the target platforms (SQL/NoSQL).
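For illustration only, below is a minimal sketch of the extract-transform-load flow described above, written in plain Python using the standard-library sqlite3 module. The table and column names (src_orders, dwh_orders, order_id, customer, amount_eur) are hypothetical placeholders; in this role the actual pipelines would be built with DataStage against DB2/Netezza rather than SQLite.

```python
# Minimal ETL sketch, for illustration only.
# Table/column names are hypothetical; SQLite stands in for the real platforms.
import sqlite3

def extract(conn: sqlite3.Connection) -> list[tuple]:
    """Extract: read raw business records from a (hypothetical) source table."""
    return conn.execute(
        "SELECT order_id, customer, amount_eur FROM src_orders"
    ).fetchall()

def transform(rows: list[tuple]) -> list[tuple]:
    """Transform: clean and standardize records before loading."""
    cleaned = []
    for order_id, customer, amount in rows:
        cleaned.append((order_id, customer.strip().upper(), round(float(amount), 2)))
    return cleaned

def load(conn: sqlite3.Connection, rows: list[tuple]) -> None:
    """Load: write the transformed records into a (hypothetical) warehouse table."""
    conn.executemany(
        "INSERT INTO dwh_orders (order_id, customer, amount_eur) VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()

if __name__ == "__main__":
    # An in-memory database stands in for both source and target.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE src_orders (order_id INTEGER, customer TEXT, amount_eur REAL)")
    conn.execute("CREATE TABLE dwh_orders (order_id INTEGER, customer TEXT, amount_eur REAL)")
    conn.execute("INSERT INTO src_orders VALUES (1, ' acme ', 199.9)")

    load(conn, transform(extract(conn)))
    print(conn.execute("SELECT * FROM dwh_orders").fetchall())  # [(1, 'ACME', 199.9)]
```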
What you'll do:
• Design, implement, maintain and extend physical data models and data pipelines
• Gather data model integration requirements and process knowledge so that the integrated data meets the needs of end users
• Ensure that the solution is scalable, performant, correct, high-quality, maintainable, and secure
• Propose standards, tools, and best practices
• Implement according to standards, tools, and best practices
• Maintain and improve existing processes
• Investigate potential issues, notify end users, and propose adequate solutions
• Prepare and maintain the documentation
What you bring:
Must have:
• Strong knowledge of ETL tools and development (DATASTAGE)
• Strong knowledge of Database Platforms (DB2 & Netezza)
• Strong knowledge of SQL Databases (DDL)
• Strong knowledge of SQL (Advanced querying, optimization of queries, creation of stored procedures)
• Relational databases (IBM DB2 LUW)
• Data warehouse appliances (NETEZZA)
• Atlassian Suite: JIRA / CONFLUENCE / BITBUCKET
• TWS (IBM Tivoli Workload Scheduler)
• UNIX Scripting (KSH / SH / PERL / PYTHON)
• Knowledge of AGILE principles
Nice to have:
• Knowledge of Data Modeling Principles / Methods, including Conceptual, Logical & Physical Data Models, Data Vault, and Dimensional Modeling
• Knowledge of Test Principles (Test Scenarios / Test Use Cases / Testing)
• Knowledge of ITIL
• Knowledge of OLAP
• Knowledge of NoSQL databases
• Knowledge of Hadoop Components: HDFS, Spark, HBase, Hive, Sqoop
• Knowledge of Big Data
• Knowledge of Data Science / Machine Learning / Artificial Intelligence
• Knowledge of Data Reporting Tools (TABLEAU)
A little about us:
Innova Solutions is a diverse and award-winning global technology services partner. We provide our clients with strategic technology, talent, and business transformation solutions, enabling them to be leaders in their field.
Awardee of prestigious recognitions including:
• Stevie International Business Awards, Denials Remediation Healthcare Technology Solutions, 2023