Hands-on experience with programming languages such as Python (mandatory).
Thorough understanding of AWS from a data engineering and tooling standpoint; experience with another cloud provider is also beneficial.
Experience with AWS Glue, Spark, and Python with Airflow for designing and developing data pipelines; expertise in Informatica Cloud is advantageous.
Data modeling: advanced/intermediate data modeling skills (Master / Ref / ODS / DW / DM) to enable analytics on the platform.
Traditional data warehousing and ETL skill set, including strong SQL and PL/SQL skills.
Experience with inbound and outbound integrations on the cloud platform.
Design and development of data APIs (Python, Flask / FastAPI) to expose data on the platform.
Partner with SAs to identify data inputs and related data sources, review sample data, identify gaps, and perform quality checks.
Experience loading and querying cloud-hosted databases such as Redshift, Snowflake, and BigQuery.
Preferred: knowledge of system-to-system integration, messaging/queuing, and managed file transfer.
Preferred: experience building and maintaining REST APIs, ensuring security and scalability.
Preferred: DevOps/DataOps experience with Infrastructure as Code and setting up CI/CD pipelines.
Preferred: experience building real-time streaming data ingestion.
Skills Required
AWS, Databricks, Python, PySpark, Spark, AWS Services