IT Officer, Data and Information Management II – Lead Data Engineer – Information Technology

International Finance Corporation

tendersglobal.net

Description

  • IFC — a member of the World Bank Group — is the largest global development institution focused on the private sector in emerging markets. We work in more than 100 countries, using our capital, expertise, and influence to create markets and opportunities. Our mission is to leverage the power of the private sector to end extreme poverty and boost shared prosperity on a livable planet.
  • Information Technology is an integral part of the strategic initiatives for all the core business functions at IFC. Within Corporate Information and Technologies (CIT), the Data Engineering, AI and Platforms (CITDE) organization provides comprehensive data and analytics support for IFC’s various organizations, such as Economics & Private Sector Development, Global Industries, Climate Business, Treasury and Mobilization, and many more. A robust engagement model with the business ensures prioritization of business requirements and focus on delivery. The solution landscape encompasses a wide variety of in-house and customized data applications, AI/ML solutions, and scaled capabilities that meet the business needs of IFC departments.
  • Data Engineering & Platforms (CITDE) is seeking a Lead Data Engineer with expertise in migrating transactional database systems to the Microsoft Azure Cloud Platform. You will understand technical requirements and assist in designing, planning, managing, and troubleshooting the data migration process and data pipelines. In addition to creating and maintaining an optimal pipeline architecture, this role is responsible for identifying, designing, and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes. The role also builds the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using AWS and SQL technologies, and builds analytical tools that utilize the data pipeline to provide actionable insight into key business performance metrics, including operational efficiency and customer acquisition. Finally, the role works with stakeholders across the Executive, Product, Data, and Design teams to support their data infrastructure needs and assist them with data-related technical issues.

Duties and Responsibilities:

  • Design standardized data architecture and best practices to support batch ingestion, real-time and near real-time integration, ETL, data quality rules, and structuring of data for analytic consumption by end users.
  • Lead the design and development of modern data solutions leveraging Azure Logic Apps, Azure Data Factory, Azure Data Lake, Azure SQL, Azure Synapse, and Databricks.
  • Conduct enterprise-wide data discovery and profiling to understand existing data sources, structures, and dependencies.
  • Architect secure and efficient data integration pipelines for ingesting, transforming, and loading data from various source systems.
  • Ensure data lineage and traceability across the organization’s data ecosystem.
  • Implement data security measures and data operations metrics, access controls, and authentication mechanisms to protect sensitive data and ensure compliance.
  • Develop and implement data retention, archiving, disaster recovery, and business continuity policies and plans.
  • Continuously monitor data quality, identify performance bottlenecks, and optimize data flows for efficient and reliable data delivery.
  • Design and architect end-to-end data solutions, ensuring alignment with business objectives and technical constraints.
  • Collaborate with stakeholders across the organization to define and articulate data strategy and objectives aligned with business goals.
  • Work closely with data engineers, business intelligence engineers, and stakeholders to ensure seamless collaboration and alignment across the data ecosystem.
  • Provide guidance, mentorship, and knowledge transfer to cross-functional teams on data architecture best practices and emerging technologies.

Selection Criteria

  • Master’s degree in Computer Science, Statistics, Applied Mathematics, Data Science, or Machine Learning with 6 years of experience, or an equivalent combination of education and experience.
  • Hands-on experience in designing, building, and optimizing data pipelines.
  • 5+ years of experience in the design and implementation of modern data architectures and concepts such as cloud services (e.g., AWS, Azure, GCP), real-time data distribution (e.g., Kafka, Kinesis, Dataflow, Airflow), NoSQL databases (e.g., MongoDB, DynamoDB, HBase, Cosmos DB), and modern data warehouse tools, including Snowflake and Databricks.
  • Experience in end-to-end architecture design and deployment of Azure data science environments using Azure ML, MLflow, Synapse Analytics, Azure OpenAI Service, Cognitive Services, and Azure Databricks.
  • Expertise in relational database concepts and experience across multiple database technologies and environments (e.g., SQL, NoSQL, Oracle, Hadoop, Postgres).
  • In-depth knowledge of data architecture principles and best practices, including data modeling, data warehousing, data lakes, and data integration.
  • Proficiency in data modeling techniques and the ability to choose appropriate architectural design patterns, including Data Fabric, Data Mesh, Lakehouse, or Delta Lake.
  • Experience with containerization technologies like Docker and container orchestration platforms like Kubernetes for managing data workloads. 
  • Proficiency in version control systems (e.g., Git) and knowledge of DevOps practices for automating data engineering workflows (DataOps).
  • Practical understanding of data encryption, access control, and security best practices to protect sensitive data. 
  • SAFe experience and the ability to work on cross-functional, self-organizing development teams.

Source: https://worldbankgroup.csod.com/ats/careersite/JobDetails.aspx?id=29091&site=1

To help us track our recruitment effort, please indicate in your cover/motivation letter where (tendersglobal.net) you saw this job posting.
