Job Summary
Kogta Financial Ltd. is seeking an experienced and highly skilled ETL & Data Warehouse Developer with strong expertise in AWS data services. As a key member of our data engineering team, you will design, develop, and optimize ETL pipelines and scalable data warehouse solutions on the AWS platform. The ideal candidate will have a solid background in data modeling, SQL performance tuning, and AWS-based data architectures.
Key Responsibilities
1. ETL Development
- Design, develop, and implement robust ETL processes using AWS EMR, AWS Data Pipeline, or custom scripts (see the sketch after this list).
- Ensure efficient extraction, transformation, and loading of data from multiple sources into the data warehouse.
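For illustration, a minimal sketch of one way such a process might be dispatched: a PySpark ETL script pushed to a running EMR cluster as a step via boto3. The cluster ID, step name, and S3 script path are hypothetical placeholders, not actual Kogta resources.

```python
# Hedged sketch: submit a Spark ETL script to an existing EMR cluster.
# The cluster ID and S3 script path below are placeholders.
import boto3

emr = boto3.client("emr")

def submit_etl_step(cluster_id: str) -> str:
    """Queue a spark-submit step on the cluster and return its step ID."""
    response = emr.add_job_flow_steps(
        JobFlowId=cluster_id,  # e.g. "j-XXXXXXXXXXXXX" (hypothetical)
        Steps=[
            {
                "Name": "nightly-payments-etl",
                "ActionOnFailure": "CONTINUE",
                "HadoopJarStep": {
                    "Jar": "command-runner.jar",  # EMR's generic step runner
                    "Args": [
                        "spark-submit",
                        "--deploy-mode", "cluster",
                        "s3://example-scripts-bucket/etl/payments_etl.py",
                    ],
                },
            }
        ],
    )
    return response["StepIds"][0]
```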
2. Data Warehousing
- Design and maintain scalable, high-performance data warehouse solutions on AWS.
- Implement optimized data models and structures for AWS Redshift or equivalent environments.
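As a hedged sketch of what an optimized Redshift structure can look like, the snippet below creates a fact table with explicit distribution and sort keys through the redshift_connector driver. The schema, columns, and connection details are illustrative assumptions, not an actual Kogta model.

```python
# Illustrative only: a Redshift fact table with DISTKEY/SORTKEY choices.
# Every identifier and credential here is a made-up placeholder.
import redshift_connector

FACT_DDL = """
CREATE TABLE IF NOT EXISTS analytics.fact_loan_payments (
    payment_id    BIGINT IDENTITY(0, 1),
    loan_id       BIGINT NOT NULL,
    payment_date  DATE NOT NULL,
    amount_paid   DECIMAL(12, 2)
)
DISTKEY (loan_id)       -- co-locate rows that join on loan_id
SORTKEY (payment_date); -- favor date-range scans
"""

conn = redshift_connector.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    database="dev",
    user="etl_user",
    password="...",  # in practice: Secrets Manager or IAM-based auth
)
cursor = conn.cursor()
cursor.execute(FACT_DDL)
conn.commit()
conn.close()
```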
3. AWS Service Utilization
- Leverage AWS services such as S3, Lambda, Redshift, EMR, Step Functions, and others to build end-to-end data solutions.
- Stay updated with AWS advancements and recommend new tools or services to enhance data infrastructure.
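One hedged example of how these services compose end to end: an S3-triggered Lambda handler that starts a Step Functions state machine, which in turn can orchestrate EMR and Redshift stages. The state machine ARN and event wiring are assumptions for illustration.

```python
# Hypothetical glue code: S3 object-created event -> Step Functions run.
# The STATE_MACHINE_ARN environment variable is an assumed configuration.
import json
import os

import boto3

sfn = boto3.client("stepfunctions")

def handler(event, context):
    # Standard S3 notification shape: one record per new object.
    record = event["Records"][0]
    payload = {
        "bucket": record["s3"]["bucket"]["name"],
        "key": record["s3"]["object"]["key"],
    }
    sfn.start_execution(
        stateMachineArn=os.environ["STATE_MACHINE_ARN"],
        input=json.dumps(payload),
    )
```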
4. SQL Expertise
- Develop and optimize complex SQL queries, stored procedures, and views for analytics and reporting.
- Troubleshoot and fine-tune SQL performance to ensure efficient data retrieval.
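As a hedged sketch of the tuning workflow, the snippet below reads a query plan with EXPLAIN before the query ships, reusing the hypothetical redshift_connector connection and schema from the earlier sketch. Redistribution steps such as DS_BCAST_INNER in the plan are a common sign that a join needs rework.

```python
# Illustrative tuning check: read the Redshift plan for a reporting query.
# Tables and columns reuse the hypothetical schema sketched above.
QUERY = """
SELECT l.branch_id,
       SUM(p.amount_paid) AS total_collected
FROM analytics.fact_loan_payments AS p
JOIN analytics.dim_loans AS l
  ON p.loan_id = l.loan_id
WHERE p.payment_date >= DATEADD(month, -3, CURRENT_DATE)
GROUP BY l.branch_id;
"""

def print_plan(conn) -> None:
    """Print each plan step; broadcast/redistribute steps merit a look."""
    cursor = conn.cursor()
    cursor.execute("EXPLAIN " + QUERY)
    for (step,) in cursor.fetchall():
        print(step)
```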
5. PySpark Expertise
- Design, develop, and optimize ETL pipelines using PySpark.
- Leverage optimization and data modeling capabilities to design data pipelines.
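A minimal PySpark sketch of such a pipeline, reading raw CSV from S3 and writing partitioned Parquet; the buckets, columns, and cleaning rules are hypothetical.

```python
# Minimal PySpark ETL sketch: raw CSV in S3 -> cleaned, partitioned Parquet.
# Paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("loan-payments-etl").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3://example-raw-bucket/payments/")           # extract
)

cleaned = (
    raw.withColumn("payment_date", F.to_date("payment_date"))
       .withColumn("amount_paid", F.col("amount_paid").cast("decimal(12,2)"))
       .dropDuplicates(["payment_id"])                  # transform
       .filter(F.col("amount_paid") > 0)
)

(
    cleaned.repartition("payment_date")                 # sane file sizes
    .write.mode("overwrite")
    .partitionBy("payment_date")
    .parquet("s3://example-curated-bucket/payments/")   # load
)
```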
6. Performance Optimization
- Continuously monitor and enhance ETL and query performance.
- Identify and resolve processing bottlenecks across data pipelines and warehouse layers.
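As one hedged example of this kind of tuning, the snippet below enables Spark's Adaptive Query Execution and prints a physical plan so skewed joins and oversized shuffles can be spotted before a run; the settings and paths shown are illustrative.

```python
# Hedged tuning sketch: turn on AQE and inspect the plan before executing.
# The input path is a placeholder from the earlier examples.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("tuning-check")
    # AQE re-plans joins and coalesces shuffle partitions at runtime.
    .config("spark.sql.adaptive.enabled", "true")
    .config("spark.sql.adaptive.skewJoin.enabled", "true")
    .getOrCreate()
)

df = spark.read.parquet("s3://example-curated-bucket/payments/")
agg = df.groupBy("payment_date").count()
agg.explain(mode="formatted")  # review exchanges/joins before running
```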
7. Data Integration & Collaboration
- Collaborate with cross-functional teams to integrate data from multiple systems.
- Partner with business stakeholders to understand and deliver on data requirements.
8. Security & Compliance
- Implement strong data security and governance measures.
- Ensure compliance with relevant industry standards and organizational policies.
9. Documentation
- Maintain detailed documentation for ETL processes, data models, and configurations.
- Ensure proper handover and knowledge transfer within the team.
Qualifications & Skills
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ETL / Data Warehouse Developer with expertise in AWS and SQL.
- Strong proficiency in SQL, including complex queries and performance tuning.
- Hands-on experience developing and maintaining ETL pipelines using PySpark.
- Excellent understanding of Spark architecture.
- Experience with Spark Streaming will be an advantage.
- Hands-on experience with AWS services such as S3, EMR, Redshift, Lambda, and Step Functions.
- Solid understanding of data modeling, data integration, and warehousing principles.
- Familiarity with data security, compliance, and governance best practices.
- Strong analytical, problem-solving, and communication skills.
- AWS Certification or equivalent credentials preferred.
Bachelor’s degree in Computer Science, Information Technology , or a related field.Experience with Spark Streaming will be an advantage.Proven experience as an ETL / Data Warehouse Developer with expertise in AWS and SQL.Strong proficiency in SQL , including complex queries and performance tuning.Hands on experience with developing and maintaining ETL pipeline using Pyspark .Excellent understanding of Spark architecture.Hands-on experience with AWS services such as S3, EMR, Redshift, Lambda, and Step Functions.Solid understanding of data modeling , data integration, and warehousing principles.Familiarity with data security, compliance, and governance best practices.Strong analytical, problem-solving, and communication skills.AWS Certification or equivalent credentials preferred.