**Responsibilities**:
- Design, build, and maintain data platform infrastructure in an AWS environment with automation tools such as Terraform, CloudFormation, CodePipeline, CodeBuild, Jenkins, and Bamboo.
- Build continuous integration/deployment (CI/CD) pipelines to accelerate development and improve team agility.
- Maintain and support Redshift clusters, schemas, and business data pipelines.
- Take ownership of the design, governance, performance tuning, capacity planning, data availability, and operational aspects of the data solutions.
- Develop appropriate instrumentation to collect metrics on system performance, cost, and data ingress/egress/storage processes.
- Work with information security and compliance teams on governance policies and procedures.
- Develop a clear understanding of the reports, analyses, and insights the data must drive, and build data-driven solutions that optimally support operational analytics needs.
- Coordinate infrastructure enhancements and maintenance with the system/network engineering teams.
- Perform technology evaluations and testing to introduce new technologies to be adopted by the data platform team.
- Participate in a 24x7 on-call rotation to handle issues that occur outside of business hours.
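By way of illustration only (not part of the role description), the instrumentation responsibility above — collecting performance metrics on pipeline steps — might be sketched in Python as below; the metric name and payload shape are hypothetical, loosely modeled on a CloudWatch-style metric record, and no AWS call is made:

```python
import time
from datetime import datetime, timezone

def timed_metric(name, fn, *args, **kwargs):
    """Run fn and return its result plus a CloudWatch-style metric record."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    metric = {
        "MetricName": name,  # hypothetical metric name
        "Value": elapsed_ms,
        "Unit": "Milliseconds",
        "Timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return result, metric

# Example: time a stand-in pipeline step.
result, metric = timed_metric("IngestLatency", sum, range(1000))
```

In practice such a record would be shipped to a metrics backend (e.g. CloudWatch or Splunk, both named in this posting) rather than returned to the caller.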
**Requirements**:
- At least 3 years of experience with building and maintaining distributed and high-performance production environments in large-scale consumer enterprises.
- Development and deployment experience with most, if not all, of the following AWS components:
- Management - CloudWatch (Events/Logs), IAM, CloudTrail, EC2 Systems Manager and Splunk
- IaaS - EC2, VPC, EBS, ELB, KMS, Config, SNS, SQS, SES, SWF, S3 and Glacier
- Data Management - DMS, Redshift, RDS, EMR, Databricks, AWS Batch, Managed Workflows for Apache Airflow (MWAA), Glue, and Lambda
- Other - Server Migration, Storage Gateway, CDN
- Strong hands-on experience developing and deploying the above infrastructure with the automation tools Terraform and CloudFormation.
- Experience designing and developing CI/CD pipelines with Bamboo, AWS CodePipeline, CodeBuild, and Lambda.
- Strong CS fundamentals, proficiency in at least one programming language (Java or Python), and the ability to write pipelines for automation, metrics/data collection, and system administration purposes.
- Understanding of automation and orchestration platforms such as Airflow and AWS Step Functions.
- Ability to learn quickly and think outside the box, with excellent written and oral communication skills.
- Bachelor's degree in Computer Science or equivalent.
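For candidates unfamiliar with the orchestration requirement above, the core idea behind platforms like Airflow and AWS Step Functions — executing tasks in dependency order — can be sketched with Python's standard library; the task names below are hypothetical, and this is only a sketch of the concept, not how either platform is implemented:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks and their upstream dependencies,
# mirroring how an Airflow DAG declares task ordering.
deps = {
    "extract": set(),
    "load_raw": {"extract"},
    "transform": {"load_raw"},
    "publish": {"transform"},
}

# Resolve a valid execution order: each task runs only after
# all of its upstream dependencies have completed.
order = list(TopologicalSorter(deps).static_order())
```

Airflow and Step Functions add scheduling, retries, and state tracking on top of this dependency-ordering idea.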