Snowflake Development : Design, develop, and implement robust, scalable, and high-performance data warehousing solutions on the Snowflake cloud data platform.
ETL / ELT Pipeline Construction : Build and maintain efficient ETL / ELT (Extract, Transform, Load / Extract, Load, Transform) processes utilizing Snowflake features such as Snowpipe, Tasks, Streams, and stored procedures.
SQL Expertise : Write and optimize complex SQL queries for data extraction, transformation, loading, and analysis within Snowflake.
Data Modeling : Contribute to and implement dimensional and relational data models optimized for analytical reporting and performance on Snowflake.
Performance Tuning & Cost Optimization : Proactively identify and resolve performance bottlenecks within Snowflake, and optimize resource utilization to manage costs effectively.
Data Integration : Integrate Snowflake with various data sources (e.g., cloud storage, databases, APIs) and downstream applications (e.g., BI tools, data visualization platforms).
Data Governance & Security : Implement and maintain data governance, security, and access control policies within Snowflake.
Collaboration : Work closely with data architects, data scientists, business intelligence analysts, and business stakeholders to understand requirements and deliver data solutions.
Automation : Develop scripts (e.g., Python, Shell) to automate data processes, monitoring, and administrative tasks.
Skills & Qualifications :
Experience : 5+ years of professional experience in data warehousing and data engineering, with a strong focus on cloud data platforms.
Snowflake Expertise : Deep understanding and hands-on experience with the Snowflake cloud data platform, including its architecture, key features (Virtual Warehouses, Zero-Copy Cloning, Time Travel, Caching), and best practices.
SQL Proficiency : Expert-level proficiency in SQL for data manipulation, querying, and optimization.
ETL / ELT Methodologies : Proven experience in designing and implementing ETL / ELT processes and data pipelines.
Data Modeling : Strong understanding of data warehousing concepts and data modeling techniques (e.g., dimensional modeling, 3NF).
Scripting : Experience with scripting languages such as Python or Shell for data processing and automation.
Version Control : Proficient in using Git for version control and collaborative development.
Problem-Solving : Strong analytical and problem-solving skills, with the ability to troubleshoot complex data-related issues.
Communication : Excellent verbal and written communication skills to articulate technical concepts and collaborate effectively with diverse teams.
Preferred Skills (Nice-to-Have) :
Experience with dbt (data build tool) for data transformation.
Familiarity with other cloud platforms (AWS, Azure, or GCP) and their data services.
Experience with data visualization tools like Tableau, Power BI, or Looker.
Snowflake certification (e.g., SnowPro Core).
Experience with other ETL / data orchestration tools (e.g., Airflow, Matillion, Fivetran).
Knowledge of big data technologies (e.g., Spark, Hadoop).