Talent.com
AWS Data Engineer

HireIT Consultants • Hyderabad
30+ days ago
Job description

Job Title : AWS Data Engineer IoT, ERP / MES, CRM & Generative AI Integration

Location : Remote / Onsite / Hybrid

Job Type : Full-Time

Experience Level : Mid to Senior Level

Industry : IoT, Industrial Automation, Enterprise Systems

Job Overview :

We are looking for a hands-on AWS Data Engineer with experience in IoT ecosystems, enterprise application integration, and emerging AI / ML capabilities. The ideal candidate will design and build modern data pipelines that process telemetry from IoT Core (MQTT) into Aurora PostgreSQL, while also integrating ERP / MES dumps, CRM field-service data, and cloud SaaS exports using tools like AWS Glue, AppFlow, and Lambda. Experience with LLM enablement and vector databases is a strong plus.

Key Responsibilities :

  • Develop scalable data pipelines to handle data ingestion from :

1. IoT Core MQTT topics, routed via AWS IoT Rules into Aurora PostgreSQL for telemetry storage.

2. ERP / MES data dumps (e.g., SAP, Oracle Manufacturing) and CRM service exports (e.g., Salesforce, ServiceNow).

3. S3-based data uploads from field gateways or system exports.

  • Implement and maintain AWS Glue Jobs, AppFlow, and AWS Lambda functions to automate extraction, transformation, and loading (ETL / ELT).
  • Create metadata-enriched, queryable datasets for use in analytics, dashboards, or feeding LLMs (Large Language Models) and vector databases for semantic search and intelligent applications.
  • Design and optimize Aurora PostgreSQL schemas to support structured IoT telemetry, event logs, and traceability data.
  • Integrate with SAP, Oracle, Salesforce, ServiceNow using AppFlow, REST APIs, or custom connectors.
  • Enable real-time monitoring and alerting of IoT data pipelines and ensure high availability and resilience.
  • Collaborate with AI / ML and data science teams to vectorize data using tools like Amazon OpenSearch, Pinecone, or FAISS, and build LLM pipelines for industrial search, chat, or recommendation use cases.
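The first responsibility above (IoT Core MQTT telemetry routed into Aurora PostgreSQL) can be sketched as a small Lambda-style transform. This is a minimal illustration, not the employer's actual pipeline: the table name, column layout, and payload fields (`deviceId`, `metric`, `value`, `ts`) are assumptions, and the actual database write (e.g., via psycopg2 or the RDS Data API) is left out so the transform stays self-contained.

```python
import json
from datetime import datetime, timezone

# Hypothetical target table:
#   telemetry(device_id, metric, value, recorded_at, payload)
INSERT_SQL = (
    "INSERT INTO telemetry (device_id, metric, value, recorded_at, payload) "
    "VALUES (%s, %s, %s, %s, %s)"
)


def telemetry_to_row(message: dict) -> tuple:
    """Flatten one MQTT telemetry message into a row for Aurora PostgreSQL.

    Assumes the IoT Rule forwards the raw JSON payload and that the
    device reports a millisecond epoch timestamp in `ts` -- both are
    illustrative conventions, not a real schema.
    """
    recorded_at = datetime.fromtimestamp(message["ts"] / 1000, tz=timezone.utc)
    return (
        message["deviceId"],
        message["metric"],
        float(message["value"]),
        recorded_at,
        json.dumps(message),  # keep the full payload for traceability
    )


def handler(event, context=None):
    """Lambda entry point, as invoked by an AWS IoT Rule action.

    In a real deployment the rows would be executed against Aurora with
    INSERT_SQL; here they are returned so the transform is testable.
    """
    messages = event if isinstance(event, list) else [event]
    return [telemetry_to_row(m) for m in messages]
```

Keeping the raw JSON alongside the flattened columns is a common traceability pattern: structured columns serve queries and dashboards, while the payload column preserves anything the schema doesn't yet model.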
Required Skills :

  • Strong hands-on expertise with AWS IoT Core, MQTT protocol, and IoT Rules Engine.
  • Proficiency in Aurora PostgreSQL schema design, query tuning, and time-series handling.
  • Proven experience with AWS Glue, Lambda, and AppFlow for orchestration and integration.
  • Solid programming experience in Python or Node.js, especially for Lambda functions.
  • Working knowledge of data lake architecture, S3, Glue Catalog, and Athena.
  • Experience with ERP / MES system exports (e.g., SAP IDocs, BAPIs, Oracle DB dumps).
  • Familiarity with Salesforce / ServiceNow APIs or AppFlow connectors.
  • Exposure to LLM frameworks (LangChain, Bedrock, etc.) and vector databases (OpenSearch, Pinecone, etc.).
  • Experience with CI / CD, Terraform / CloudFormation, and monitoring tools (e.g., CloudWatch, Prometheus).
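The vector-database bullet above (OpenSearch, Pinecone, FAISS) reduces to nearest-neighbor search over embedding vectors. Here is a toy pure-Python sketch of that retrieval step using cosine similarity; production systems would use an approximate-nearest-neighbor index and real embeddings from a model, both of which are omitted here.

```python
import math


def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def top_k(query_vec, docs, k=2):
    """Rank (doc_id, embedding) pairs by similarity to the query.

    `docs` stands in for a vector index; a real deployment would query
    OpenSearch k-NN, Pinecone, or FAISS instead of scanning a list.
    """
    ranked = sorted(
        docs,
        key=lambda d: cosine_similarity(query_vec, d[1]),
        reverse=True,
    )
    return [doc_id for doc_id, _ in ranked[:k]]
```

In an LLM pipeline, the IDs returned here would be used to fetch document text that is then stuffed into the model's prompt (retrieval-augmented generation).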
Preferred Qualifications :

  • AWS Certifications (e.g., AWS Certified Data Analytics, AWS Certified Machine Learning, Solutions Architect).
  • Experience working in smart factory, industrial IoT, or connected device ecosystems.
  • Exposure to Kafka or Kinesis for streaming pipeline implementations.
  • Understanding of data governance, data lineage, and compliance standards.
  • (ref : hirist.tech)
