Job Description:
Key Responsibilities:
1. Design, develop, and optimize ETL processes, with a focus on Snowflake integration.
2. Collaborate with stakeholders to gather and analyze requirements, translating them into technical specifications.
3. Architect and implement efficient data pipelines to support various business needs using Snowflake and dbt.
4. Perform data profiling, cleansing, and transformation to ensure data accuracy and consistency.
5. Monitor and troubleshoot ETL jobs, identifying and resolving performance issues and data anomalies.
6. Implement best practices for data integration, storage, and retrieval within the Snowflake environment.
7. Work closely with data engineers, analysts, and business users to understand data requirements and deliver solutions that meet their needs.
8. Stay updated with the latest trends and advancements in ETL technologies, Matillion features, and AWS services.
9. Design, develop, and optimize complex data pipelines within the Snowflake data warehouse environment.
10. Implement scalable ETL processes to ingest, transform, and load data from various sources into Snowflake.
11. Collaborate with data architects and analysts to design and implement efficient data models within Snowflake.
12. Optimize SQL queries, database configurations, and data pipeline performance for enhanced efficiency and scalability.
13. Set up and maintain GitHub repositories for version control of data engineering code, configurations, and scripts.
14. Establish and enforce branching strategies, pull request workflows, and code review processes to ensure code quality and collaboration.
15. Develop and implement robust data quality checks and validation processes to ensure the accuracy and integrity of data within Snowflake.
16. Monitor data pipelines for anomalies, errors, and discrepancies, and implement proactive measures to maintain data quality.
17. Automate deployment, monitoring, and management of data pipelines using orchestration tools like Airflow or custom automation scripts.
18. Continuously enhance automation processes to streamline data engineering workflows and minimize manual interventions.
19. Document data engineering processes, pipeline configurations, and troubleshooting steps for knowledge sharing and reference.
20. Provide mentorship and training to junior team members on Snowflake best practices, GitHub usage, and data engineering techniques.
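As a flavor of responsibilities 15 and 16 above, a pre-load data quality check might look like the minimal sketch below. The function name, fields, and check categories are illustrative assumptions, not part of this role description; in practice such checks would typically run inside dbt tests or an Airflow task before data lands in Snowflake.

```python
# Hypothetical pre-load validation sketch. The check names, fields, and
# sample rows are illustrative only.

def validate_rows(rows, key_field, required_fields):
    """Scan `rows` (list of dicts) and tally basic data quality issues."""
    issues = {"missing_required": 0, "duplicate_keys": 0}
    seen = set()
    for row in rows:
        # Count rows where any required field is null or absent.
        if any(row.get(f) is None for f in required_fields):
            issues["missing_required"] += 1
        # Count repeated primary-key values.
        key = row.get(key_field)
        if key in seen:
            issues["duplicate_keys"] += 1
        seen.add(key)
    return issues

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 1, "email": "b@example.com"},  # duplicate id
    {"id": 2, "email": None},             # missing email
]
print(validate_rows(rows, "id", ["email"]))
# → {'missing_required': 1, 'duplicate_keys': 1}
```

A check like this would feed responsibility 16: when the issue counts exceed a threshold, the pipeline can alert or halt the load rather than propagate bad data downstream.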
Required Skills and Qualifications:
Preferred Qualifications:
Data Engineer Snowflake • Hyderabad, Telangana, India