Job Title: Data Engineer
Experience: 5–14 years
Location: Hyderabad
Face-to-Face Interview: Friday, Nov 8
Job Summary:
We are seeking a skilled and detail-oriented Data Engineer to design, build, and optimize data pipelines and architectures. The ideal candidate should have strong experience working across cloud platforms (GCP and Azure), with proficiency in Teradata SQL, database design, and programming in Java and Python.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Work with structured and unstructured data across GCP BigQuery, Azure Data Lake, and Teradata environments.
- Build, optimize, and manage data models and schemas for performance and scalability.
- Implement and monitor data quality, governance, and security standards.
- Collaborate with cross-functional teams (data scientists, analysts, and business stakeholders) to translate data requirements into technical solutions.
- Develop automation scripts and reusable data processing components in Python and Java.
- Troubleshoot and optimize SQL queries for high-performance data retrieval.
- Support data migration, integration, and transformation initiatives between on-premises and cloud environments.
Required Skills:
- Strong experience with Google Cloud Platform (BigQuery, Dataflow, Cloud Storage, Pub/Sub).
- Experience with Microsoft Azure Data Services (Data Factory, Synapse, Blob Storage).
- Proficiency in Teradata SQL and complex query optimization.
- Solid understanding of database design and data modeling concepts (OLTP/OLAP, star/snowflake schema).
- Programming skills in Python and Java.
- Hands-on experience with ETL tools and workflow orchestration (e.g., Airflow, Data Fusion, Azure Data Factory).
- Familiarity with version control (Git) and CI/CD pipelines.
- Understanding of data security, governance, and compliance frameworks.
Preferred Qualifications:
- Certification in Google Cloud Data Engineer or Azure Data Engineer.
- Experience with NoSQL databases (e.g., MongoDB, Cassandra).
- Exposure to data streaming (Kafka, Pub/Sub).
- Experience working in Agile/Scrum environments.