Responsibilities:

Data Pipeline Development and Maintenance
- Design, build, and optimize robust ETL pipelines to support operational reporting requirements
- Ensure data quality, consistency, and integrity across sources and reporting outputs
- Automate data ingestion from various internal and external systems

Reporting and Dashboarding
- Develop and maintain dashboards and reports in BI tools (e.g., Power BI, Tableau, Looker)
- Collaborate with business stakeholders to translate requirements into effective visualizations
- Optimize dashboard performance and user experience through best practices

Data Modeling and Architecture
- Create logical and physical data models that support scalable reporting solutions
- Participate in the design and implementation of data marts and operational data stores
- Work closely with data architects to align with enterprise data strategy

Cross-Functional Collaboration
- Partner with analysts, product managers, and operations teams to define reporting KPIs
- Ensure consistent definitions and calculations across business units
- Support ad hoc analytical requests and provide technical insights when needed

Governance and Best Practices
- Implement and advocate for data governance practices, including data cataloging and lineage
- Define and enforce reporting standards and data documentation
- Participate in peer code and dashboard reviews

Qualifications:

Experience:
- 5–8 years of experience in data engineering or business intelligence engineering roles
- Proven track record of building scalable reporting systems and maintaining dashboards for operational use

Technical Skills:
- Solid experience with SQL: able to write complex queries and understand database structures across dialects (e.g., Oracle, MySQL, PostgreSQL)
- Strong experience with Python and modern ETL frameworks (e.g., dbt)
- Understanding of data orchestration concepts and hands-on experience with Apache Airflow or similar tools (e.g., Prefect, Dagster)
- Proficiency in at least one BI tool (Power BI, Tableau, or Looker) or similar technology for dashboard and report development
- Knowledge of cloud data platforms (AWS Redshift, Google BigQuery, Databricks, Snowflake, or Azure Synapse)
- Familiarity with version control and CI/CD pipelines for data
- Exposure to or understanding of streaming data concepts, ideally with Kafka

Soft Skills:
- Excellent communication and stakeholder management skills
- Strong problem-solving capabilities and attention to detail
- Ability to manage multiple projects and meet tight deadlines

Preferred Skills:
- Experience with real-time data processing frameworks (e.g., Kafka, Spark Streaming)
- Exposure to data observability and monitoring tools
- Understanding of data privacy and compliance requirements (e.g., GDPR, HIPAA)
Reporting Analyst • Chennai, India