Lead Engineer - Big Data
About us:
Gracenote is the content data business unit of Nielsen that powers innovative
entertainment experiences for the world's leading media companies. Our entertainment
metadata and connected IDs deliver advanced content navigation and discovery,
connecting consumers to the content they love and helping them discover new content.
With a global footprint, Gracenote provides global and local content solutions covering
regions across the Americas, Asia-Pacific, Europe, the Middle East and Africa.
We are looking for a talented Senior Software Engineer with strong programming skills to
join our Gracenote Tech team. The ideal candidate has a passion for Clean Code, scalable
architectures, Test-Driven Development and DevOps. If you are passionate about
technology and eager to work on challenging projects, we want to hear from you!
Job Purpose:
Develop and enhance our flagship Video, Audio, Automotive and Sports metadata
software solutions.
Design applications with a Platform-first mentality where scale, consistency and
reliability are at the core of every decision.
Job Description:
As a Senior Software Engineer, you will be responsible for designing, developing, and
maintaining high-quality software applications using Java. You will collaborate with
cross-functional teams to define, design, and ship new features, while also ensuring the
performance, quality, and responsiveness of applications.
Key Responsibilities:
Design, develop, and maintain scalable and robust Big Data pipelines and systems.
Architect and implement solutions for managing and processing large-scale datasets
with fast refresh cycles, ensuring high performance, scalability and accuracy.
Collaborate with cross-functional teams, including data scientists, engineers, and
product managers, to define and translate business requirements into technical
solutions.
Write clean, maintainable, and efficient code following best practices and coding
standards.
Conduct design and code reviews to ensure high-quality deliverables and
adherence to best practices.
Troubleshoot and resolve complex issues in systems, ensuring reliability,
availability, SLA compliance, observability, and minimal downtime.
Participate in the full software development lifecycle, including planning,
development, review, testing, and deployment.
Stay up-to-date with emerging technologies and industry trends to continuously
improve skills and knowledge.
Mentor and guide junior engineers, fostering a culture of learning and collaboration
within the team.
Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
6 to 10 years of professional experience in Big Data engineering, with hands-on
expertise in processing large-scale datasets.
Advanced programming skills in Python, Java, or Scala, with a focus on data
processing and stream analytics.
Experience working with distributed data systems such as Spark or Flink.
Deep understanding of distributed storage systems (HDFS, S3, or ADLS) and
modern file formats such as Parquet, ORC and Arrow.
Strong expertise in Lakehouse architectures and technologies such as Delta Lake and
Iceberg, and in data orchestration tools such as Airflow and Dagster.
Knowledge of database systems, including NoSQL stores (Cassandra, MongoDB),
relational databases (PostgreSQL, MySQL) and SQL.
Working proficiency with Agile development methodologies and CI/CD practices.
Strong problem-solving skills and the ability to work independently as well as in a
team environment.
Excellent communication and interpersonal skills.
Preferred Qualifications
Experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
Familiarity with containerization technologies (e.g., Docker, Kubernetes).
Knowledge of CI/CD tools and practices.
Big Data Engineer • Chennai, Tamil Nadu, India