Key Responsibilities
o Serve as the technical lead for the Snowflake data platform, driving architectural decisions and best practices.
o Design and implement scalable, robust, and cost-effective data warehousing solutions using Snowflake.
o Develop and manage high-volume, real-time data ingestion pipelines using Snowpipe Streaming and Kafka.
o Configure and optimize the Kafka / Confluent-to-Snowflake integration for low-latency data availability (see the connector setup sketch after this list).
o Write, optimize, and review complex SQL queries and stored procedures within Snowflake.
o Tune the performance of complex queries, data loads, and overall Snowflake warehouse utilization to keep resource consumption efficient and query response times fast (see the tuning sketch after this list).
o Implement and maintain robust Data Quality checks and reconciliation processes to ensure data accuracy, completeness, and consistency (see the reconciliation sketch after this list).
o Implement stringent controls for Sensitive Data Handling using Snowflake features such as dynamic data masking, role-based access control (RBAC), and External Tokenization (see the masking policy sketch after this list).
o Ensure all data solutions comply with regulatory standards and internal governance policies.
o Design, develop, and maintain dimensional and normalized Data Models (e.g., Star Schema, Data Vault) optimized for analytical and reporting needs in Snowflake (see the star-schema sketch after this list).
o Leverage AWS services such as EC2 for pipeline orchestration and management, MSK (Managed Streaming for Apache Kafka) for robust streaming data management, and IAM for secure, least-privilege access to platform resources.
o Develop Infrastructure as Code (e.g., Terraform / CloudFormation) for managing AWS and Snowflake resources (a plus).
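Connector setup sketch. A minimal Snowflake-side sketch for the Kafka / Confluent integration, assuming hypothetical names (RAW_DB, KAFKA_LAND, KAFKA_CONNECTOR_ROLE, KAFKA_SVC_USER, ORDERS_RAW); the connector itself is configured in Kafka Connect, and with Snowpipe Streaming it inserts rows directly into a landing table like the one below.

    -- Role and grants for the connector's service user (hypothetical names).
    CREATE ROLE IF NOT EXISTS KAFKA_CONNECTOR_ROLE;
    GRANT USAGE ON DATABASE RAW_DB TO ROLE KAFKA_CONNECTOR_ROLE;
    GRANT USAGE, CREATE TABLE ON SCHEMA RAW_DB.KAFKA_LAND TO ROLE KAFKA_CONNECTOR_ROLE;
    GRANT ROLE KAFKA_CONNECTOR_ROLE TO USER KAFKA_SVC_USER;

    -- Landing table in the connector's default layout: Kafka metadata
    -- (topic, partition, offset, ...) plus the raw message payload.
    CREATE TABLE IF NOT EXISTS RAW_DB.KAFKA_LAND.ORDERS_RAW (
        RECORD_METADATA VARIANT,
        RECORD_CONTENT  VARIANT
    );
    GRANT INSERT, SELECT ON TABLE RAW_DB.KAFKA_LAND.ORDERS_RAW TO ROLE KAFKA_CONNECTOR_ROLE;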
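Tuning sketch. A sketch of typical diagnostics behind the performance-tuning work, using the standard SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view; the table sales.fact_orders and its clustering column are placeholders.

    -- Surface recent queries that scan many partitions or spill to disk.
    SELECT query_id,
           total_elapsed_time / 1000 AS elapsed_s,
           partitions_scanned,
           partitions_total,
           bytes_spilled_to_local_storage
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 20;

    -- Improve partition pruning on a large fact table (placeholder name)
    -- by clustering on a common filter column, then check clustering health.
    ALTER TABLE sales.fact_orders CLUSTER BY (order_date);
    SELECT SYSTEM$CLUSTERING_INFORMATION('sales.fact_orders', '(order_date)');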
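Reconciliation sketch. One way (of several) to implement the Data Quality reconciliation: compare row counts and an order-insensitive aggregate hash between a staging table and its target using Snowflake's HASH_AGG; all table and column names are placeholders.

    -- Row-count and content-hash reconciliation between staging and target.
    -- A mismatch in either pair of columns flags drift for investigation.
    SELECT
        (SELECT COUNT(*) FROM raw_db.staging.orders)   AS src_rows,
        (SELECT COUNT(*) FROM analytics.core.orders)   AS tgt_rows,
        (SELECT HASH_AGG(order_id, amount, status) FROM raw_db.staging.orders) AS src_hash,
        (SELECT HASH_AGG(order_id, amount, status) FROM analytics.core.orders) AS tgt_hash;

    -- Completeness check: source keys missing from the target.
    SELECT s.order_id
    FROM raw_db.staging.orders s
    LEFT JOIN analytics.core.orders t USING (order_id)
    WHERE t.order_id IS NULL;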
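Masking policy sketch. The sensitive-data controls map directly onto Snowflake DDL; a sketch assuming a hypothetical customers table with an email column, and hypothetical PII_ADMIN and ANALYST_ROLE roles.

    -- Mask email addresses for everyone except a privileged role.
    CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
        CASE
            WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
            ELSE REGEXP_REPLACE(val, '.+@', '*****@')
        END;

    ALTER TABLE analytics.core.customers
        MODIFY COLUMN email SET MASKING POLICY email_mask;

    -- RBAC: analysts can query the table but see only masked values.
    CREATE ROLE IF NOT EXISTS ANALYST_ROLE;
    GRANT USAGE ON DATABASE analytics TO ROLE ANALYST_ROLE;
    GRANT USAGE ON SCHEMA analytics.core TO ROLE ANALYST_ROLE;
    GRANT SELECT ON TABLE analytics.core.customers TO ROLE ANALYST_ROLE;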
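Star-schema sketch. A compact dimensional-modeling sketch with one hypothetical dimension and fact table; a Data Vault model would use hubs, links, and satellites instead. Note that Snowflake treats PRIMARY KEY and REFERENCES constraints as informational (not enforced), apart from NOT NULL.

    -- Dimension with a surrogate key; the business key is kept for lookups,
    -- and valid_from / valid_to support slowly changing dimensions (SCD2).
    CREATE TABLE analytics.core.dim_customer (
        customer_sk   NUMBER IDENTITY PRIMARY KEY,
        customer_id   STRING NOT NULL,      -- business key
        customer_name STRING,
        segment       STRING,
        valid_from    TIMESTAMP_NTZ,
        valid_to      TIMESTAMP_NTZ         -- NULL = current row
    );

    -- Fact table referencing the dimension's surrogate key.
    CREATE TABLE analytics.core.fact_orders (
        order_sk    NUMBER IDENTITY PRIMARY KEY,
        customer_sk NUMBER NOT NULL REFERENCES analytics.core.dim_customer (customer_sk),
        order_date  DATE   NOT NULL,
        amount      NUMBER(12,2)
    );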
Data Architect • Greater Bengaluru Area, India