Data Software Engineering
EPAM Systems
Software Engineering
Portugal · Remote
Posted on Nov 19, 2025
Responsibilities
- Develop PySpark ETL pipelines and configure Kafka connectors
- Build Jenkins pipelines for scheduled jobs and CI/CD
- Work with Kafka UI and the wider toolchain, including Python 3.7, PySpark SQL, Jenkins, Artifactory, Sonar, and Atlassian tools
- Collaborate with cross-functional teams to determine project requirements and deliverables
- Troubleshoot and debug applications as needed
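As a rough illustration of the "configure Kafka connectors" responsibility above, a connector is typically registered with Kafka Connect as a small JSON payload. This is a minimal sketch only; the connector name, topic, bucket, and region below are placeholders, not details from this posting (it assumes the Confluent S3 sink connector is installed on the Connect cluster).

```json
{
  "name": "orders-s3-sink",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "orders",
    "s3.bucket.name": "etl-landing-zone",
    "s3.region": "eu-west-1",
    "format.class": "io.confluent.connect.s3.format.parquet.ParquetFormat",
    "flush.size": "1000",
    "tasks.max": "2"
  }
}
```

A payload like this would be POSTed to the Connect cluster's REST API (`/connectors`), after which the sink streams records from the topic into object storage for downstream PySpark ETL jobs to pick up.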
Requirements
- 3+ years of relevant experience as a Big Data Developer or Data Engineer
- Strong skills in Snowflake, AWS, and Spark
- Knowledge of Python
- Hands-on experience with Kubernetes/Docker and Jenkins pipelines
- Knowledge of Kafka Connect Cluster and Atlassian tools (Bitbucket, Jira, Confluence)
- Good English communication skills (B2+)
Nice to have
- Familiarity with PyCharm IDE
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn