Job closed
Ref: KP709-2669
Job description / Role
A leading bank in the UAE is looking to hire an AVP Data Ingestion to lead a team of data practitioners – Senior Data Engineers and Data Engineers. The role is primarily operational: managing the data ingestion process, keeping delivery on track, and ensuring delivery commitments are met. The role is also required to ensure architectural standards and patterns are defined and adhered to, and to optimize the data ingestion process.
The AVP Data Ingestion is responsible and accountable for data integration, i.e. loading data from source systems into the data platforms – Hadoop, the HANA Financial Services Data Platform, or any other platform. The main tool in use is Informatica Big Data Management. Going forward, data ingestion will include real-time ingestion using Kafka and Informatica Big Data Streaming. The role requires working closely with Business and Technology stakeholders.
Job Contents:
Data Ingestion
• Data Integration
• Ensure data security policy and procedures have been implemented
• Implement data profiling
• Implement data lineage and metadata management
Data Transformation
• Transform data from Hadoop to SAP FSDP
• Ensure data consistency and integrity
Data Profiling and metadata management
• Create data lineage
• Capture metadata
• Create data associations and maps
Team management
• Resource planning
• Team skills matrix
• Manage resource utilization, demand, skills shortages
• Team development, empowerment
Financial management
• Create project codes
• Ensure resource costs are recovered
• Ensure vendor billing is accurate
• Ensure invoices are processed in a timely manner
Automation
• Adopt Continuous Integration and Continuous Deployment
• Test case automation
Requirements
• Bachelor of Computer Science or equivalent
• Relevant certification
• Minimum 10 years of professional experience in the Banking & Finance industry
• Minimum 5 years of professional experience in Data Warehousing, and 5 years in data ingestion
• Knowledge/exposure to Modern Data Warehouse Architecture
• 4+ years of working experience with an ETL tool, preferably Informatica
• Solid knowledge of Data Warehouse principles
• 2 years' experience working with Hadoop
• Experience working with Hadoop-related tools, e.g. Spark and Sqoop
• 2 years' experience working in Agile teams as part of a squad
• 1 year's experience with the technical aspects of Data Governance, including glossary, metadata, data quality, master data management, and data lifecycle management
About the Company
Building businesses, changing lives
At Tandem, we’ve built a vast network of top-tier professionals across the globe. We empower businesses to thrive by delivering unmatched talent solutions. With our global reach, collaborative approach, and unwavering commitment to excellence, we drive transformative growth and shape exceptional teams.
Who we are
We are a team of industry experts, driven by a passion for innovation and excellence. We are dedicated to providing unique talent solutions and nurturing collaborative partnerships that redefine success.