We are seeking an experienced data architect to lead full-lifecycle implementations, from secure platform setup and cloud integration to high-performance data pipeline development.
As a seasoned data professional, you will own the design and implementation of scalable, secure infrastructure-as-code using Terraform or Pulumi on AWS, Azure, or GCP. You will also build robust CI/CD pipelines and integrate with GitHub Actions, Jenkins, or Azure DevOps.
You will develop ETL/ELT pipelines using DBT, Airflow, or custom Python solutions, and ingest and transform data from APIs, SaaS apps, and event streams. Additionally, you will ensure data security, lineage tracking, and compliance with GDPR, HIPAA, and SOC 2.
Key responsibilities:
- Lead Snowflake platform setup: environment provisioning, RBAC, and resource optimization
- Architect scalable, secure infrastructure-as-code using Terraform or Pulumi on AWS, Azure, or GCP
- Build robust CI/CD pipelines and integrate with GitHub Actions, Jenkins, or Azure DevOps
- Develop ETL/ELT pipelines using DBT, Airflow, or custom Python solutions
- Ingest and transform data from APIs, SaaS apps, and event streams
- Ensure data security, lineage tracking, and compliance with GDPR, HIPAA, and SOC 2
Requirements:
- 5+ years in data engineering or platform operations
- 3+ years of deep hands-on Snowflake experience (performance, security, cost)
- Strong SQL, Python, Terraform/YAML, and Git workflows
- Proven CI/CD and DevOps knowledge
- Experience with at least one major cloud provider (AWS, Azure, or GCP)
Bonus points:
- SnowPro Advanced Architect or Data Engineer certification
- Experience with DBT, Fivetran, Informatica, or Matillion
- Familiarity with Kafka, Delta Lake, Iceberg, or Databricks
- Background in SaaS or regulated industries like finance or healthcare
Why join us?
- Remote-first flexibility
- Competitive pay and equity options
- Certifications and professional development support
- Health, dental, and wellness programs
- Opportunity to drive next-gen enterprise data and AI transformation