Data Modeller

Fractal, Mumbai
30+ days ago
Job type
  • Full-time
Job description

It's fun to work in a company where people truly BELIEVE in what they are doing!

Responsibilities

  • Participate in requirements definition, analysis, and the design of logical and physical dimensional, NoSQL, or graph data models.
  • Lead data discovery discussions with business stakeholders in JAD sessions and map business requirements to logical and physical data modeling solutions.
  • Conduct data model reviews with project team members.
  • Capture technical metadata through data modeling tools.
  • Ensure database designs efficiently support BI and end user requirements.
  • Drive continual improvement and enhancement of existing systems.
  • Collaborate with ETL / Data Engineering teams to create data process pipelines for data ingestion and transformation.
  • Collaborate with Data Architects for data model management, documentation, and version control.
  • Maintain expertise and proficiency in the various application areas.
  • Maintain current knowledge of industry trends and standards.

Required Skills

  • Strong data analysis and data profiling skills.
  • Strong conceptual, logical, and physical data modeling skills for VLDB data warehouses and graph databases.
  • Hands-on experience with modeling tools such as ERWIN or another industry-standard tool.
  • Fluent in both normalized and dimensional modeling disciplines and techniques.
  • Minimum of 3 years' experience in Oracle Database.
  • Hands-on experience with Oracle SQL, PL/SQL, or Cypher.
  • Exposure to Databricks Spark, Delta Lake, Informatica ETL, or other industry-leading tools.
  • Good knowledge or experience with AWS Redshift and Graph DB design and management.
  • Working knowledge of AWS Cloud technologies, mainly on the services of VPC, EC2, S3, DMS, and Glue.
  • Bachelor's degree in Software Engineering, Computer Science, or Information Systems (or equivalent experience).
  • Excellent verbal and written communication skills, including the ability to describe complex technical concepts in relatable terms.
  • Ability to manage and prioritize multiple workstreams and make confident prioritization decisions.
  • Data-driven mentality. Self-motivated, responsible, conscientious, and detail-oriented.
  • Ability to learn and maintain knowledge of multiple application areas.
  • Understanding of industry best practices pertaining to Quality Assurance concepts and procedures.

Education / Experience Level

  • Bachelor's degree in Computer Science, Engineering, or a relevant field, with 3+ years of experience as a Data and Solution Architect supporting Enterprise Data and Integration Applications, or in a similar role for large-scale enterprise solutions.
  • 3+ years of experience in Big Data Infrastructure and tuning experience in Lakehouse Data Ecosystem, including Data Lake, Data Warehouses, and Graph DB.
  • AWS Solutions Architect Professional-level certification.
  • Extensive experience in data analysis on critical enterprise systems like SAP, E1, Mainframe ERP, SFDC, Adobe Platform, and eCommerce systems.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
