Job purpose:
Design and manage database schemas, tune SQL queries, and review code.
Who you are:
3+ years of professional experience in data engineering, with knowledge of data platforms and DWH development
2+ years of hands-on experience with SSAS tabular models
Experience designing data ingestion and orchestration pipelines with Kafka and Python
Experience designing and developing DBT models and building data pipelines with DBT scripting
Experience with analytical tools such as SQL (Snowflake), ETL/ELT (Azure Data Factory, APIs), cloud services (Azure Functions and Storage), and BI tools (Power BI, ThoughtSpot)
Proficiency in SSAS DAX queries
Experience designing and developing data warehouse and analytics solutions using a variety of techniques
Experience with data lake concepts covering structured, semi-structured, and unstructured data
Ability to independently analyze and correct issues in real time, providing end-to-end problem resolution.
Refine and automate regular processes, track issues, and document changes.
High level of independence and responsibility
Excellent teamwork and communication skills
What will excite you:
Complete ownership and independence of execution
Experiment, fail, and learn (with one condition: fail fast)
High-pedigree, high-caliber team
Contribute to every area of our business and make a real impact with your work.
Location: Ahmedabad (work from office)