Senior BI Developer
We are looking for a skilled Data Integration & Data Warehousing Engineer to design, build, and manage scalable data pipelines and enterprise data warehouse solutions using AWS Glue and Informatica.
This role will focus on enabling reliable data ingestion, transformation, modelling, and delivery across the organization while ensuring performance, quality, governance, and scalability.
The ideal candidate should have strong experience in ETL/ELT frameworks, data warehousing design, advanced SQL development, and cloud-based data engineering.
What will you do?
Key Responsibilities
Data Integration & Pipeline Development
* Design, develop, and maintain data pipelines using AWS Glue and Informatica (PowerCenter)
* Build scalable ETL/ELT workflows for structured data
* Implement data ingestion from multiple sources (databases, APIs, files, cloud storage, SaaS systems)
Data Warehousing & Data Modelling
* Design and implement enterprise data warehouse (EDW) structures
* Develop and maintain dimensional data models (Star / Snowflake schemas)
* Build staging, ODS, and presentation layers
* Optimize data models for reporting and analytics performance
* Support data mart design aligned to business domains
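The dimensional-modelling responsibilities above can be illustrated with a minimal star schema. This is a sketch only; all table and column names (`dim_date`, `dim_product`, `fact_sales`) are hypothetical examples, not systems named in this posting.

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimensions.
# All names here are illustrative, not from the posting.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (
    date_key    INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
    full_date   TEXT,
    year        INTEGER,
    month       INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,   -- surrogate key
    product_id  TEXT,                  -- natural/business key
    category    TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
);
""")
cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024, 1)")
cur.execute("INSERT INTO dim_product VALUES (1, 'SKU-001', 'Hardware')")
cur.execute("INSERT INTO fact_sales VALUES (20240115, 1, 3, 29.97)")

# Typical reporting query: aggregate the fact by dimension attributes.
rows = cur.execute("""
SELECT d.year, p.category, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d    ON f.date_key = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY d.year, p.category
""").fetchall()
print(rows)
```

Keeping surrogate keys on the dimensions (rather than joining on business keys) is what makes optimizations like SCD history and late-arriving data manageable in the presentation layer.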
SQL Development & Performance Optimization
* Write complex SQL queries for transformation, validation, and reporting
* Develop stored procedures, views, and data transformation logic
* Perform query tuning and performance optimization
* Implement incremental load logic and data reconciliation checks
* Ensure efficient handling of large-volume datasets
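The incremental-load and reconciliation bullets above can be sketched with a high-watermark pattern: only source rows newer than the last recorded load timestamp are upserted, then the watermark is advanced. All table names (`src_orders`, `stg_orders`, `etl_watermark`) are hypothetical.

```python
import sqlite3

# Incremental (delta) load sketch with a watermark table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_orders (order_id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
CREATE TABLE stg_orders (order_id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_loaded TEXT);
""")
cur.executemany("INSERT INTO src_orders VALUES (?, ?, ?)", [
    (1, 10.0, "2024-01-01"),
    (2, 20.0, "2024-01-02"),
    (3, 30.0, "2024-01-05"),
])
# Orders 1 and 2 were loaded in a previous run.
cur.execute("INSERT INTO etl_watermark VALUES ('stg_orders', '2024-01-02')")

# Incremental load: upsert only rows newer than the watermark.
cur.execute("""
INSERT INTO stg_orders
SELECT s.order_id, s.amount, s.updated_at
FROM src_orders s
WHERE s.updated_at > (SELECT last_loaded FROM etl_watermark
                      WHERE table_name = 'stg_orders')
ON CONFLICT(order_id) DO UPDATE SET
    amount = excluded.amount, updated_at = excluded.updated_at
""")
# Advance the watermark to the newest timestamp now staged.
cur.execute("""
UPDATE etl_watermark
SET last_loaded = (SELECT MAX(updated_at) FROM stg_orders)
WHERE table_name = 'stg_orders'
""")

# Reconciliation check: count what this run actually landed.
loaded = cur.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
print(loaded)  # only order_id 3 was newer than the watermark
```

In production the reconciliation step would compare source and target counts (or checksums) over the loaded window and alert on mismatch, rather than just counting.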
Cloud Data Engineering
* Develop AWS Glue jobs using PySpark / Spark SQL
* Manage data workflows across AWS ecosystem (S3, Redshift, IAM, Lambda, etc.)
What skills and capabilities will make you successful?
* Strong experience with AWS Glue (ETL jobs, workflows, triggers)
* Hands-on experience with Informatica PowerCenter
* Strong expertise in SQL development and performance tuning
* Experience designing and managing data warehouse architectures
* Proficiency in Python / PySpark for data processing
Warehousing Expertise
* Dimensional modelling (Star / Snowflake schema)
* Data warehouse lifecycle and architecture
* Data staging, transformation, and presentation layers
* Slowly Changing Dimensions (SCD)
* Fact and dimension design
* Data mart architecture
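Of the techniques listed above, Slowly Changing Dimensions (Type 2) is the one most worth sketching: when a tracked attribute changes, the current dimension row is expired and a new version is opened, preserving history. This is a minimal illustration in plain Python; the field names (`customer_id`, `city`, `valid_from`, `valid_to`, `is_current`) are hypothetical.

```python
from datetime import date

# SCD Type 2 sketch: expire the current row and insert a new version
# whenever a tracked attribute changes. Field names are illustrative.
def apply_scd2(dim_rows, incoming, today):
    """dim_rows: list of dicts keyed by customer_id, city,
    valid_from, valid_to, is_current."""
    current = next(
        (r for r in dim_rows
         if r["customer_id"] == incoming["customer_id"] and r["is_current"]),
        None,
    )
    if current is None:
        # New business key: open the first version of this row.
        dim_rows.append({**incoming, "valid_from": today,
                         "valid_to": None, "is_current": True})
    elif current["city"] != incoming["city"]:
        # Tracked attribute changed: close old version, open a new one.
        current["valid_to"] = today
        current["is_current"] = False
        dim_rows.append({**incoming, "valid_from": today,
                         "valid_to": None, "is_current": True})
    # Rows with no change are left untouched.
    return dim_rows

dim = []
apply_scd2(dim, {"customer_id": 42, "city": "Bangalore"}, date(2024, 1, 1))
apply_scd2(dim, {"customer_id": 42, "city": "Pune"}, date(2024, 6, 1))
print(len(dim), [r["is_current"] for r in dim])  # two versions, old one expired
```

In a warehouse this same logic is usually expressed as a MERGE (or an Informatica update-strategy mapping) against the dimension table, with a surrogate key generated for each new version.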
Who will you report to?
* General Manager
What qualifications will make you successful for this role?
* Bachelor's or Master's degree in Engineering, Computer Science, or related field
* Typically 5+ years of experience in data engineering, ETL, or data warehousing roles
* Experience working in enterprise-scale data environments
Let us learn about you! Apply today.
You must submit an online application to be considered for any position with us.
This position will be posted until filled.
- Rate: Not Specified
- Location: Bangalore, IN-KA
- Type: Permanent
- Industry: Finance
- Recruiter: Schneider Electric
- Contact: Not Specified
- Reference: 102807-en-us
- Posted: 2026-04-15 07:52:55