Senior Cloud Data Engineer
Job ID: 9866
Job Type: Full-time
Location:
Westlake Village, CA
Date Posted: Jan 11, 2023
Job Description
PURPOSE OF THE JOB
This is a highly visible, value-driven individual contributor role, providing data infrastructure support for fellow analysts and data scientists and operationalizing ML models at scale, integrated into customer-facing applications. The chosen candidate will be hands-on in developing deep learning models, Holt-Winters anomaly detection models, and classification and regression models, as well as operationalizing them as unsupervised continuous-learning models. Model outcomes will be pushed into time-series data stores like Apache Pinot for quick rendering and REST API endpoints. The candidate should be SQL-savvy, self-sufficient in handling large volumes of data, and have excellent data visualization capabilities using Tableau or similar solutions.
RESPONSIBILITIES/DUTIES
- Design and build reliable, scalable data infrastructure with leading privacy and security techniques to safeguard data using AWS technologies
- Design and build scalable, secure, low-latency, resilient, and cost-effective solutions for enabling predictive and prescriptive analytics across the organization
- Design/architect frameworks to operationalize ML models through serverless architecture and support unsupervised continuous-training models
- Take over and scale our data models (Tableau, DynamoDB, Kibana)
- Ship low-latency, massive-scale systems to production
- Communicate data-backed findings to a diverse constituency of internal and external stakeholders
- Build frameworks for data ingestion pipelines, both real-time and batch, using best practices in data modeling and ETL/ELT processes, and hand off to data engineers
- Participate in technical decisions and collaborate with talented peers.
- Review code and implementations, and give meaningful feedback that helps others build better solutions.
- Help drive technology direction and choices of technologies by making recommendations based on experience and research.
MINIMUM REQUIREMENTS & SPECIAL ATTRIBUTES
- 5 or more years of experience working directly with enterprise data solutions
- Hands-on experience working in a public cloud environment and with on-prem infrastructure.
- Specialization in columnar databases like Redshift Spectrum, time-series data stores like Apache Pinot, and AWS cloud infrastructure
- Experience with in-memory, serverless, and streaming technologies and orchestration tools such as Docker, Spark, Kafka, Airflow, and Kubernetes is required
- Excellent SQL skills and Python coding are a must
- Current hands-on implementation experience required, with 5 or more years of IT platform implementation experience.
- AWS Certified Big Data - Specialty certification is desirable
- Experience designing and implementing AWS big data and analytics solutions in large digital and retail environments is desirable
- Advanced knowledge of and experience with online transaction processing (OLTP) and online analytical processing (OLAP) databases, data lakes, and schemas.
- Experience with AWS cloud data lake technologies and operational experience with Kinesis/Kafka, S3, Glue, and Athena.
- Experience with any of the message/file formats: Parquet, Avro, ORC
- Design and development experience with subscribing to streaming services, EMS, MQ, Java, XSD, File Adapter, and ESB-based applications
- Experience in distributed architectures such as Microservices, SOA, RESTful APIs and data integration architectures is a plus
- Hands-on experience migrating on-prem data solutions to the cloud
- Prior experience managing On-prem Enterprise Data Warehouse solutions like Netezza is a plus
- Experience with a wide variety of modern data processing technologies, including:
  - Big data stack (Spark, Redshift Spectrum, Flume, Kafka, Kinesis, etc.)
  - Data streaming (Kafka, SQS/SNS queuing, etc.)
- Expert in columnar databases, primarily Redshift or similar technologies like Snowflake and Firebolt
- Expert in commonly used AWS services (S3, Lambda, Redshift, Glue, EC2, etc.)
- Expertise in Python, PySpark, or similar programming languages is a must-have
- Experience with BI tools (Tableau, Domo, MicroStrategy) is a plus
- Skilled in AWS Compute such as EC2, Lambda, Beanstalk, or ECS
- Skilled in AWS Management and Governance suite of products such as CloudTrail, CloudWatch, or Systems Manager
- Skilled in Amazon Web Services (AWS) offerings, development, and networking platforms
- Skilled in AWS Analytics such as Athena, EMR, or Glue
- Proficiency in Oracle, MySQL, and Microsoft SQL Server databases is a plus
- Understanding of Continuous Integration/Continuous Delivery (CI/CD), with experience in Jenkins
Why Guitar Center? Here are just some of the rewards:
For our employees who are musicians, we offer the unique opportunity of gig leave: take time off to share your music with the world and return to your job after your tour! Guitar Center offers robust benefits and perks, including medical, dental, and vision coverage, a 401(k) plus company match, mental health support, paid sick/holiday/vacation time, an employee discount program, and tuition reimbursement options.
Pay Rate: $116,200 - $145,300/yr depending on background and experience.
The job posting is not necessarily reflective of actual compensation that may be earned, nor a promise of any specific pay for any specific employee, which is always dependent on actual experience, education, and other factors. The pay range(s) listed are provided in compliance with state-specific laws. Pay ranges may be different in other locations.