About Netscribes:
Netscribes is a global market intelligence and content services provider that helps corporations achieve strategic objectives through a wide range of offerings.
Our solutions rely on a unique combination of qualitative and quantitative primary research, secondary / desk research, social media analytics, and IP research.
For more than 15 years, we have helped our clients across a range of industries, including technology, financial services, healthcare, retail, and CPG.
Fortune 500 companies, as well as small- to mid-size firms, have partnered with us for relevant market and competitive insights that drive higher growth, faster customer acquisition, and a sustainable edge in their business.
Netscribes is backed by Helix Investments, a US-based private equity fund.
Responsibilities:
- Design, build, and maintain scalable and reliable data pipelines to support growing data volume and complexity.
- Partner with analytics and business teams to optimize data models that power BI tools, enhancing data accessibility and driving informed decision-making.
- Implement robust systems for monitoring and ensuring data quality, accuracy, and availability across critical business functions.
- Conduct in-depth data analysis to identify and resolve issues related to data integrity and performance.
- Define and develop data assets and jobs using Spark, Spark SQL, and Hive SQL.
- Collaborate closely with backend / frontend engineers, product managers, and analysts to deliver end-to-end data solutions.
- Design and implement data integration and quality frameworks.
- Write unit and integration tests, document technical work, and contribute to internal knowledge bases.
Requirements:
- Bachelor's or Master's degree in Computer Science or a related technical field.
- 5+ years of hands-on experience in Python or Java development.
- 5+ years of strong SQL experience (NoSQL experience is a plus).
- 5+ years in schema design and dimensional data modeling.
- 5+ years working with Big Data technologies like Spark and Hive.
- 2+ years of experience with data engineering on Google Cloud Platform (GCP), especially with BigQuery.