Responsibilities:
Data Pipeline Development and Maintenance:
- Design, build, and optimize scalable ETL/ELT pipelines using AWS serverless services (Lambda, Glue, S3, Airflow) to ingest data from diverse sources such as APIs, cloud platforms, and databases.
- Ensure pipelines are robust, efficient, and capable of handling large volumes of data.
Data Integration and Harmonization:
- Aggregate and normalize marketing performance data from multiple platforms (e.g., Adobe Analytics, Salesforce Marketing Cloud, the Teradata data warehouse, ad platforms).
- Implement data transformation and enrichment processes to support analytics and reporting needs.
Data Quality and Monitoring:
- Develop and implement data validation and monitoring frameworks to ensure data accuracy and consistency.
- Troubleshoot and resolve issues related to data quality, latency, or performance.
Collaboration with Stakeholders:
- Partner with marketing teams, analysts, and data scientists to understand data requirements and translate them into technical solutions.
- Provide technical support and guidance on data-related issues or projects.
Tooling and Automation:
- Leverage cloud-based solutions and frameworks (e.g., AWS serverless services such as Lambda, Glue, S3, and Airflow) to streamline processes and enhance automation.
- Maintain and optimize existing workflows while continuously identifying opportunities for improvement.
Documentation and Best Practices:
- Document pipeline architecture, data workflows, and processes for both technical and non-technical audiences.
- Follow industry best practices for version control, security, and data governance.
Continuous Learning and Innovation:
- Stay current with industry trends, tools, and technologies in data engineering and marketing analytics.
- Recommend and implement innovative solutions to improve the scalability and efficiency of data systems.
- Lead and mentor junior members of the team.
What you need to succeed (minimum qualifications):
- Bachelor of Science degree in Computer Science or equivalent
- 6+ years of post-degree professional experience as a data engineer developing and maintaining data pipelines
- Extensive experience with databases and data platforms, including Teradata and AWS serverless services (Lambda, Glue, S3, Airflow)
- 5+ years of hands-on experience designing, implementing, and managing large-scale data and ETL solutions using AWS compute, storage, and database services (S3, Lambda, Redshift, Glue, Athena)
- 3+ years of experience as a tech lead, including mentorship and guidance, task planning and delegation, collaboration across teams, and technical communication
- Experience with architecture and design decisions, code quality and reviews, hands-on development, and tech stack and tooling management
- Good working knowledge of sprint planning and execution, risk management, and performance and stability focus
- Proficiency in Python, SQL, PySpark, Glue, Lambda, S3, and other AWS tools
- Good understanding of data warehouses, ETL, and AWS serverless services
- Strong analytical and programming skills with the ability to solve data-related challenges efficiently
- Ability to collaborate with product owners to understand requirements and translate them into technical specs
- Experience performing regular code reviews and advocating for best practices (testing, CI/CD, documentation)
- Proven ability to learn new data models quickly and apply them effectively in a fast-paced environment
- Excellent communication skills with the ability to present complex data findings to both technical and non-technical audiences
What will give you a competitive edge (preferred qualifications):
- Airline industry experience
- Experience working with marketing and media data
- Experience working with SAS (Statistical Analysis System) to develop data pipelines
- AWS certifications: Solutions Architect (SAA/SAP) or Data Analytics Specialty (DAS)
- Experience migrating data pipelines and systems to modern cloud-based solutions
- Familiarity with marketing data platforms
ID: DELYB03