The Role :
We are seeking a seasoned Senior Python Data Engineer with substantial experience in cloud technologies.
As a pivotal member of our data engineering team, you will play a crucial role in designing, implementing, and optimizing data pipelines, ensuring seamless integration with cloud platforms.
The ideal candidate will possess a strong command of Python and data engineering principles, along with a proven track record of delivering scalable solutions in the cloud.

Responsibilities :

Data Pipeline Development :
- Design, develop, and maintain scalable and efficient data pipelines using Python and cloud-based technologies.
- Implement Extract, Transform, Load (ETL) processes to seamlessly move data from diverse sources into our cloud-based data warehouse.

Cloud Integration :
- Utilize cloud platforms (e.g., Google Cloud, AWS, Azure) to deploy, manage, and optimize data engineering solutions.
- Leverage cloud-native services for the storage, processing, and analysis of large datasets.

Data Modelling and Architecture :
- Collaborate with data scientists, analysts, and other stakeholders to design effective data models that align with business requirements.
- Ensure the scalability, reliability, and performance of the overall data infrastructure on cloud platforms.

Optimization and Performance :
- Continuously optimize data processes for improved performance, scalability, and cost-effectiveness in a cloud environment.
- Monitor and troubleshoot issues, ensuring timely resolution and minimal impact on data availability.

Quality Assurance :
- Implement data quality checks and validation processes to ensure the accuracy and completeness of data in the cloud-based data warehouse.
- Collaborate with cross-functional teams to identify and address data quality issues.

Collaboration and Communication :
- Work closely with data scientists, analysts, and other teams to understand data requirements and provide technical support.
- Collaborate with other engineering teams to seamlessly integrate data engineering solutions into larger cloud-based systems.

Documentation :
- Create and maintain comprehensive documentation for data engineering processes, cloud architecture, and best practices.

Required Skills :

Programming Languages : Proficiency in Python for data engineering tasks, scripting, and automation.

Data Engineering Technologies :
- Extensive experience with data engineering frameworks for distributed data processing.
- Understanding and hands-on experience with workflow management tools like Apache Airflow.

Cloud Platforms :
- In-depth knowledge and hands-on experience with at least one major cloud platform : AWS, Azure, or Google Cloud.
- Familiarity with cloud-native services for data processing, storage, and analytics.

ETL Processes : Proven expertise in designing and implementing Extract, Transform, Load (ETL) processes.

SQL and Databases : Proficient in SQL, with experience working with relational databases (e.g., PostgreSQL, MySQL) and cloud-based database services.

Data Modeling : Strong understanding of data modeling principles and experience in designing effective data models.

Version Control : Familiarity with version control systems, such as Git, for tracking changes in code and configurations.

Collaboration Tools : Experience using collaboration and project management tools for effective communication and project tracking.

Containerization and Orchestration : Familiarity with containerization technologies (e.g., Docker) and orchestration tools (e.g., Kubernetes).

Monitoring and Troubleshooting : Ability to implement monitoring solutions and troubleshoot issues in data pipelines.

Data Quality Assurance : Experience in implementing data quality checks and validation processes.

Agile Methodologies : Familiarity with agile development methodologies and practices.

Soft Skills :
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills, both written and verbal.
- Ability to work collaboratively in a cross-functional team environment.
- Attention to detail and commitment to delivering high-quality solutions.
(ref : hirist.tech)