No. of positions: 04 (03 remote, 01 onsite in Bangalore)
Experience: 5+ years
Shift: General
Interview Rounds: 02
We are seeking a Data Integration Engineer with expertise in building and orchestrating pipelines using Apache Airflow to integrate data from diverse sources into Snowflake. The ideal candidate will have strong experience with JDBC and API-based integrations (REST/JSON), hands-on skills in Postman, and proficiency in SQL encryption/decryption, Python development, and pipeline monitoring.
Key Responsibilities
- Design, develop, and maintain Airflow DAGs to orchestrate end-to-end data workflows (illustrative sketches follow this list).
- Integrate structured and unstructured data from multiple systems into Snowflake using JDBC connectors, APIs, and flat-file ingestion.
- Work with Postman and other tools to test, validate, and automate API integrations.
- Implement SQL encryption/decryption techniques to secure sensitive datasets.
- Perform data quality checks, including row-level validation, hash-based reconciliation, and exception handling.
- Build transformation logic in Python and SQL, ensuring performance and maintainability.
- Implement detailed logging, monitoring, and alerting to guarantee pipeline reliability and compliance.
- Collaborate with stakeholders to understand requirements and deliver scalable, production-ready solutions.
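
To give a flavor of the first responsibility, here is a minimal sketch of an Airflow 2.x DAG that loads source rows into Snowflake with step-by-step logging. It uses snowflake-connector-python as a Python stand-in for a JDBC extract; the connection parameters, table names, and the stubbed extract are all hypothetical placeholders.

```python
import logging
from datetime import datetime

import snowflake.connector  # snowflake-connector-python
from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)


def extract_and_load():
    """Extract rows from the source system (stubbed) and load them into Snowflake."""
    # Stand-in for a JDBC/REST extract; a real task would page through the source.
    rows = [("2024-01-01", 100), ("2024-01-02", 200)]
    log.info("Extracted %d rows from source", len(rows))

    # Placeholder credentials; in practice these come from an Airflow
    # connection or a secrets backend, never hard-coded values.
    conn = snowflake.connector.connect(
        user="ETL_USER",
        password="***",
        account="my_account",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        cur.executemany(
            "INSERT INTO DAILY_TOTALS (load_date, amount) VALUES (%s, %s)", rows
        )
        log.info("Loaded %d rows into STAGING.DAILY_TOTALS", cur.rowcount)
    finally:
        conn.close()


with DAG(
    dag_id="source_to_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```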
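For the API-integration responsibility, a hedged Python example of the validation one would otherwise script as Postman tests. The endpoint path, bearer-token auth, and required fields are assumptions, not a real API.

```python
import requests


def fetch_orders(base_url: str, token: str) -> list[dict]:
    """Call a (hypothetical) orders endpoint and validate the JSON payload."""
    resp = requests.get(
        f"{base_url}/api/v1/orders",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()  # the scripted equivalent of a Postman status-code test

    payload = resp.json()
    required = {"order_id", "amount", "created_at"}
    for record in payload:
        # Row-level validation: every record must carry the fields the
        # downstream Snowflake load expects.
        missing = required - record.keys()
        if missing:
            raise ValueError(f"record {record.get('order_id')!r} missing {missing}")
    return payload
```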
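For SQL encryption/decryption, a sketch assuming Snowflake's passphrase-based ENCRYPT/DECRYPT functions and reusing the connection style from the first sketch. The table, columns, and passphrase handling are placeholders; a real pipeline would source the passphrase from a secrets manager.

```python
# SQL statements use the connector's pyformat paramstyle (%(name)s).
MASK_SENSITIVE = """
    UPDATE STAGING.CUSTOMERS
    SET ssn_encrypted = ENCRYPT(ssn_plain, %(passphrase)s),
        ssn_plain = NULL
"""

READ_ONE = """
    SELECT customer_id,
           TO_VARCHAR(DECRYPT(ssn_encrypted, %(passphrase)s), 'UTF-8') AS ssn
    FROM STAGING.CUSTOMERS
    LIMIT 1
"""


def encrypt_in_place(conn, passphrase: str) -> None:
    """Encrypt the plaintext column in place, then spot-check that one row decrypts."""
    cur = conn.cursor()
    cur.execute(MASK_SENSITIVE, {"passphrase": passphrase})
    cur.execute(READ_ONE, {"passphrase": passphrase})
    print(cur.fetchone())
```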
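For hash-based reconciliation, a sketch that compares per-row MD5 digests computed on the source extract against digests computed inside Snowflake. The table and the canonical "id|amount" string are assumptions; both sides must format values identically for the digests to match.

```python
import hashlib


def source_row_hashes(rows: list[tuple]) -> dict[str, str]:
    """MD5 of a canonical 'id|amount' string per source row."""
    return {
        str(order_id): hashlib.md5(f"{order_id}|{amount}".encode()).hexdigest()
        for order_id, amount in rows
    }


# Snowflake builds the same canonical string and digests it with its MD5 function.
TARGET_HASHES = """
    SELECT order_id,
           MD5(TO_VARCHAR(order_id) || '|' || TO_VARCHAR(amount))
    FROM STAGING.ORDERS
"""


def reconcile(conn, source_rows: list[tuple]) -> list[str]:
    """Return order_ids whose source and target digests disagree or are missing."""
    expected = source_row_hashes(source_rows)
    cur = conn.cursor()
    cur.execute(TARGET_HASHES)
    actual = {str(oid): digest for oid, digest in cur.fetchall()}
    # Mismatched or absent rows become exceptions for the pipeline to surface.
    return [oid for oid, digest in expected.items() if actual.get(oid) != digest]
```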
Required Skills
- Strong proficiency in Apache Airflow for workflow orchestration.
- Hands-on experience with Snowflake as a data warehouse.
- Proven ability to integrate data via JDBC drivers, REST APIs, and Postman-tested endpoints.
- Advanced knowledge of SQL, including encryption/decryption.
- Strong programming background in Python for ETL/ELT development.
- Experience with logging, monitoring, and data observability practices.
Skills Required
Apache Airflow, Snowflake, REST APIs, Postman, SQL, JDBC, Python, Monitoring, Logging