Key Responsibilities:
- Design, develop, and maintain data pipelines using Python and SQL (a brief illustrative sketch follows this list).
- Write efficient, optimized SQL queries for data extraction, transformation, and reporting.
- Automate data workflows and integrate APIs or third-party services.
- Collaborate with cross-functional teams to understand data requirements and deliver actionable insights.
- Perform data validation, cleansing, and quality checks.
- Develop dashboards or reports using BI tools, where applicable.
- Document processes, code, and data models for future reference.
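
As a rough illustration of the kind of work described above, the sketch below pulls a daily slice of data with SQL, transforms it with Pandas, runs basic quality checks, and loads the result back. It is a minimal sketch, not a prescribed implementation; the connection string, table names, and column names are hypothetical placeholders.

```python
# Minimal pipeline sketch: extract with SQL, transform with Pandas, validate, load.
# All names (connection string, tables, columns) are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine


def run_daily_orders_pipeline(db_url: str = "postgresql://user:pass@host/db") -> pd.DataFrame:
    engine = create_engine(db_url)

    # Extract: pull yesterday's orders from the source database.
    query = """
        SELECT order_id, customer_id, amount, created_at
        FROM orders
        WHERE created_at >= CURRENT_DATE - INTERVAL '1 day'
    """
    df = pd.read_sql(query, engine)

    # Transform: type coercion and a simple derived column.
    df["created_at"] = pd.to_datetime(df["created_at"])
    df["amount"] = df["amount"].astype(float)
    df["order_date"] = df["created_at"].dt.date

    # Validate: basic data-quality checks before loading downstream.
    assert df["order_id"].is_unique, "duplicate order_id values"
    assert df["amount"].ge(0).all(), "negative order amounts"

    # Load: append the cleaned slice to a reporting table.
    df.to_sql("orders_daily", engine, if_exists="append", index=False)
    return df
```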
Required Skills & Qualifications:
- Strong proficiency in Python (Pandas, NumPy, etc.).
- Advanced knowledge of SQL (joins, subqueries, CTEs, window functions); an example query in this vein follows this list.
- Experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Familiarity with version control systems like Git.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
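
The snippet below illustrates the SQL skills named above (a join, a CTE, and a window function) as they might be run from Python; it is an example only, and the table and column names are hypothetical.

```python
# Illustrative only: a join, CTE, and window function executed from Python.
# Table and column names are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

QUERY = """
    WITH customer_totals AS (
        SELECT c.customer_id,
               c.region,
               SUM(o.amount) AS total_spend
        FROM customers AS c
        JOIN orders AS o ON o.customer_id = c.customer_id
        GROUP BY c.customer_id, c.region
    )
    SELECT customer_id,
           region,
           total_spend,
           RANK() OVER (PARTITION BY region ORDER BY total_spend DESC) AS region_rank
    FROM customer_totals
"""


def top_customers_by_region(db_url: str = "postgresql://user:pass@host/db") -> pd.DataFrame:
    engine = create_engine(db_url)
    return pd.read_sql(QUERY, engine)
```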
Preferred Qualifications:
- Experience with cloud platforms (AWS, Azure, GCP).
- Familiarity with data visualization tools (e.g., Power BI, Tableau).
- Knowledge of ETL tools or frameworks (e.g., Airflow, dbt); a minimal DAG sketch follows this list.
- Background in data warehousing or big data technologies.
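
For context on the ETL tooling mentioned above, here is a minimal workflow sketch assuming Airflow 2.4+; the dag_id, task names, and callables are hypothetical placeholders, not an actual pipeline definition.

```python
# Minimal Airflow DAG sketch (assumes Airflow 2.4+).
# dag_id, task names, and the callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    """Pull the daily slice from the source database (placeholder)."""
    ...


def transform_orders():
    """Clean and enrich the extracted data with Pandas (placeholder)."""
    ...


with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform_orders", python_callable=transform_orders)

    extract >> transform  # run the transform step after extraction succeeds
```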
Education:
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
Skills Required:
Python, SQL, Pandas, PostgreSQL, Git, Airflow