Senior Data DevOps Engineer
EPAM Systems
Description
We're seeking a remote Senior Data DevOps Engineer to join our dynamic team for a new project focused on developing and managing data infrastructure in the cloud, primarily using AWS, Azure, or GCP.
In this role, you will design, deploy, and manage data systems; develop automation scripts and workflows for infrastructure provisioning, deployment, and monitoring; and optimize the performance, scalability, and reliability of data platforms and systems.
You will work closely with the data engineering team to ensure efficient data pipelines and processes, automating data workflows using Python. You will also set up and maintain continuous integration and delivery (CI/CD) pipelines using tools such as Jenkins, GitHub Actions, or similar cloud-based CI/CD tools.
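As a rough, hypothetical illustration of the kind of infrastructure automation scripting this work involves (the bucket-encryption check, region, and function name are assumptions for the sketch, not taken from the job description), a small Python/boto3 script might look like this:

```python
import boto3
from botocore.exceptions import ClientError


def audit_bucket_encryption(region_name: str = "us-east-1") -> list[str]:
    """Return the names of S3 buckets that report no default encryption config."""
    s3 = boto3.client("s3", region_name=region_name)
    unencrypted = []
    for bucket in s3.list_buckets()["Buckets"]:
        try:
            s3.get_bucket_encryption(Bucket=bucket["Name"])
        except ClientError:
            # No server-side encryption configuration found for this bucket
            unencrypted.append(bucket["Name"])
    return unencrypted


if __name__ == "__main__":
    print(audit_bucket_encryption())
```

A script like this could run as a scheduled monitoring job or as a step in a CI/CD pipeline.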
Responsibilities
- Design, deploy, and manage data infrastructure in the cloud, primarily using AWS, Azure, or GCP
- Develop and implement automation scripts and workflows for infrastructure provisioning, deployment, and monitoring using tools like Terraform or similar Infrastructure as Code (IaC) tools
- Work closely with the data engineering team to ensure efficient data pipelines and processes, automating data workflows using Python (see the sketch after this list)
- Set up and maintain continuous integration and delivery (CI/CD) pipelines using tools such as Jenkins, GitHub Actions, or similar cloud-based CI/CD tools
- Collaborate with cross-functional teams to optimize the performance, scalability, and reliability of data platforms and systems
- Install, configure, and maintain data tools such as Apache Spark, Apache Kafka, ELK Stack, Apache NiFi, Apache Airflow, or similar tools in both on-premises and cloud environments
- Monitor and troubleshoot data systems, proactively identifying and resolving performance, scalability, and reliability issues
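As referenced above, here is a minimal sketch of Python-based workflow automation with Apache Airflow; the DAG id, schedule, tasks, and paths are hypothetical and only illustrate the kind of pipeline orchestration described in this listing:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_and_load() -> None:
    # Placeholder for the actual extraction/loading logic.
    print("Extracting source data and loading it into the warehouse...")


# Hypothetical daily pipeline (Airflow 2.4+ syntax for the schedule argument).
with DAG(
    dag_id="example_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=extract_and_load)
    run_spark_job = BashOperator(
        task_id="run_spark_job",
        bash_command="spark-submit /opt/jobs/transform.py",  # hypothetical path
    )
    ingest >> run_spark_job
```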
Requirements
- Minimum of 3 years of experience in data infrastructure management and DevOps
- Strong proficiency in Python and experience with Batch scripting
- Professional mastery of the Linux operating system
- Strong knowledge of Cloud technologies (AWS, GCP or Azure)
- Solid understanding of network protocols and mechanisms such as TCP, UDP, ICMP, DHCP, DNS, and NAT
- Hands-on experience using or setting up data tools such as Spark, Airflow, R
- Proficiency with SQL
- Experience with Infrastructure as Code (IaC) tools
- Proficiency with setting up and managing CI/CD pipelines using tools like Jenkins, Bamboo, TeamCity, GitLab CI, GitHub Actions, or similar cloud-based CI/CD tools
- Experience installing and configuring data tools such as Apache Spark, Apache Kafka, ELK Stack, Apache NiFi, Apache Airflow, or similar tools
- Good verbal and written communication skills in English at a B2+ level
Nice to have
- Expertise in AWS CloudFormation
- Knowledge of Terraform and Ansible
- Azure DevOps skills
We offer
- Learning Culture - We want you to be the best version of yourself, which is why we offer unlimited access to learning platforms, a wide range of internal courses, and all the knowledge you need to grow professionally
- Paid Holidays - We offer paid time off during all national holidays (working and non-working) for a total of 15 days. On top of that, you will also have vacation days to enjoy quality time with your family or just rest
- Professional Growth Opportunities - We have designed a highly competitive and complete development process, where you will have all the tools to get where you have always wanted to be, personally and professionally
- Stock Option Purchase Plan - As an EPAMer, you can be more than just an employee: you will also have the opportunity to purchase stock at a reduced price and become a part owner of our organization
- Additional Income - Besides your regular salary, you will also have the chance to earn extra income by referring talent, serving as a technical interviewer, and in many other ways
- Community Benefit - You will be part of a worldwide community of over 50,000 employees, where you can learn, challenge yourself, stand out, and share your knowledge and experience with multicultural teams!
- Are you open to relocation? - If you want to relocate to another country and we have the right project, we will assist you every step of the way to help you and your family reach your new home