- in developing sophisticated security use cases, will bridge the gap between security requirements and technical implementation.
- You will connect the dots by aligning Splunk configurations with overarching security objectives, ensuring that our log sources are effectively onboarded and monitored for potential threats.
You are a collaborator -
- Your expertise and insights as an Analytical Engineer will be instrumental in collaborating with the business to understand data requirements and in building dashboards that enable data-driven decision making
- You will work closely with Data Engineers, Data Scientists, and business stakeholders to strive for greater functionality and to define business goals and the parameters used to measure analytics outcomes
You are an innovator -
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
What you'll need (Required)
- Bachelor's degree required; a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field is a plus
- Minimum of 3 years of experience in an Analytical Engineer role
- Working SQL knowledge, including query authoring and hands-on experience with a variety of relational databases
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and find opportunities for improvement
- Strong analytical skills related to working with unstructured datasets
- Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management
- A successful history of harmonizing, processing, and extracting value from large, disconnected datasets
- Experience supporting and working with cross-functional teams in a dynamic environment
- Experience with object-oriented / functional scripting languages: Python, Java, etc.
- Experience with cloud services: GCP, AWS, Snowflake
- Experience with visualization tools such as Tableau, QlikView, or Domo
- Experience with data flow, data pipeline, and workflow management tools: Cloud Composer, Airflow, Luigi, dbt, etc.
- Ability to resolve issues independently.
- Excellent communication and collaboration skills.
What you'll need (Preferred)
- Experience with relational SQL and NoSQL databases: PostgreSQL, MongoDB, etc.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience building and optimizing big data pipelines, architectures, and data sets
Skills Required
Data Pipeline, Data Structures, Big Data, Python, SQL, Analytics