

Senior BI Developer

We are looking for a skilled Data Integration & Data Warehousing Engineer to design, build, and manage scalable data pipelines and enterprise data warehouse solutions using AWS Glue and Informatica.

This role will focus on enabling reliable data ingestion, transformation, modelling, and delivery across the organization while ensuring performance, quality, governance, and scalability.

The ideal candidate should have strong experience in ETL/ELT frameworks, data warehousing design, advanced SQL development, and cloud-based data engineering.

What will you do?

Key Responsibilities

Data Integration & Pipeline Development


* Design, develop, and maintain data pipelines using AWS Glue and Informatica (PowerCenter)
* Build scalable ETL/ELT workflows for structured data
* Implement data ingestion from multiple sources (databases, APIs, files, cloud storage, SaaS systems)

Data Warehousing & Data Modelling


* Design and implement enterprise data warehouse (EDW) structures
* Develop and maintain dimensional data models (Star / Snowflake schemas)
* Build staging, ODS, and presentation layers
* Optimize data models for reporting and analytics performance
* Support data mart design aligned to business domains
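To illustrate the kind of dimensional modelling this role involves, here is a minimal star-schema sketch in plain Python: one fact table joined to a conformed dimension for aggregation. All table and column names are hypothetical examples, not taken from this posting.

```python
# Minimal star-schema sketch: a fact table keyed to two dimensions.
# All table and column names below are illustrative only.

dim_customer = {
    1: {"name": "Acme Ltd", "region": "EMEA"},
    2: {"name": "Globex", "region": "AMER"},
}

dim_date = {
    20240101: {"year": 2024, "quarter": "Q1"},
}

fact_sales = [
    {"customer_key": 1, "date_key": 20240101, "amount": 120.0},
    {"customer_key": 2, "date_key": 20240101, "amount": 75.5},
]

def sales_by_region(facts, customers):
    """Aggregate the fact table over a dimension attribute (region)."""
    totals = {}
    for row in facts:
        region = customers[row["customer_key"]]["region"]
        totals[region] = totals.get(region, 0.0) + row["amount"]
    return totals

print(sales_by_region(fact_sales, dim_customer))
# {'EMEA': 120.0, 'AMER': 75.5}
```

In a real warehouse the same join-and-aggregate pattern runs in SQL against fact and dimension tables; the sketch only shows the shape of the model.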

SQL Development & Performance Optimization


* Write complex SQL queries for transformation, validation, and reporting
* Develop stored procedures, views, and data transformation logic
* Perform query tuning and performance optimization
* Implement incremental load logic and data reconciliation checks
* Ensure efficient handling of large-volume datasets
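The incremental-load and reconciliation responsibilities above can be sketched as a watermark-based load with a row-count check. Source and target here are plain Python lists standing in for database tables, and all field names are hypothetical.

```python
# Watermark-based incremental load with a row-count reconciliation check.
# The lists stand in for source and target tables; names are illustrative.

def incremental_load(source, target, last_watermark):
    """Copy only rows newer than the watermark, then reconcile counts."""
    new_rows = [r for r in source if r["updated_at"] > last_watermark]
    target.extend(new_rows)
    new_watermark = max((r["updated_at"] for r in source), default=last_watermark)
    # Reconciliation: every source row at or before the new watermark
    # should now exist in the target.
    expected = sum(1 for r in source if r["updated_at"] <= new_watermark)
    assert len(target) == expected, "reconciliation failed"
    return new_watermark

source = [
    {"id": 1, "updated_at": 10},
    {"id": 2, "updated_at": 20},
    {"id": 3, "updated_at": 30},
]
target = [{"id": 1, "updated_at": 10}]  # already loaded up to watermark 10

wm = incremental_load(source, target, last_watermark=10)
print(wm, len(target))  # 30 3
```

In SQL the same idea is a `WHERE updated_at > :watermark` predicate on the extract plus a post-load count comparison between source and target.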

Cloud Data Engineering

* Develop AWS Glue jobs using PySpark / Spark SQL
* Manage data workflows across the AWS ecosystem (S3, Redshift, IAM, Lambda, etc.)

What skills and capabilities will make you successful?


* Strong experience with AWS Glue (ETL jobs, workflows, triggers)
* Hands-on experience with Informatica PowerCenter
* Strong expertise in SQL development and performance tuning
* Experience designing and managing data warehouse architectures
* Proficiency in Python / PySpark for data processing

What's in it for you?


Warehousing Expertise

* Dimensional modelling (Star / Snowflake schema)
* Data warehouse lifecycle and architecture
* Data staging, transformation, and presentation layers
* Slowly Changing Dimensions (SCD)
* Fact and dimension design
* Data mart architecture
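The Slowly Changing Dimension item above refers to tracking attribute history in a dimension table. A minimal Type 2 sketch in plain Python, assuming hypothetical column names (`valid_from`, `valid_to`, `is_current`), looks like this:

```python
import datetime

# Sketch of a Type 2 Slowly Changing Dimension update: rather than
# overwriting a changed attribute, close the current row and append a
# new version. All column names are illustrative.

def scd2_upsert(dim_rows, key, new_attrs, today):
    """Close the active row for `key` if its attributes changed, append a new one."""
    current = next(
        (r for r in dim_rows if r["key"] == key and r["is_current"]), None
    )
    if current and current["attrs"] == new_attrs:
        return dim_rows  # no change, keep the existing current row
    if current:
        current["is_current"] = False
        current["valid_to"] = today
    dim_rows.append(
        {"key": key, "attrs": new_attrs, "valid_from": today,
         "valid_to": None, "is_current": True}
    )
    return dim_rows

dim = [{"key": "C1", "attrs": {"segment": "SMB"},
        "valid_from": datetime.date(2023, 1, 1),
        "valid_to": None, "is_current": True}]

scd2_upsert(dim, "C1", {"segment": "Enterprise"}, datetime.date(2024, 6, 1))
print(len(dim))  # 2: the closed historical row plus the new current version
```

In a warehouse this is typically implemented as a merge/upsert against the dimension table, with facts joining to the row whose validity window covers the transaction date.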

Who will you report to?


* General Manager

What qualifications will make you successful for this role?


* Bachelor's or Master's degree in Engineering, Computer Science, or a related field
* Typically 5+ years of experience in data engineering, ETL, or data warehousing roles
* Experience working in enterprise-scale data environments

Let us learn about you! Apply today.

You must submit an online application to be considered for any position with us.

This position will be posted until filled.



