Data Integration
EPAM Systems
Colombia · Remote
Posted on Nov 19, 2025
Responsibilities
- Develop and support ETL pipelines written in Python and SQL (including HiveQL) and orchestrated by a third-party tool
- Build all data layers (landing, staging, warehouse, data marts) in S3, Qubole Hive, and AWS Redshift
- Collaborate with report developers to deliver dashboards on time and in compliance with business requirements
- Analyze business requirements and make technical decisions to drive the development process
- Negotiate with customers on requirements and technical constraints
- Act as a development stream coordinator (mentor junior team members, monitor progress, and troubleshoot issues)
- Be a proactive developer who can provide expertise on technical and business matters
Requirements
- 3+ years of experience as a Data Engineer
- Knowledge of Python (intermediate)
- Knowledge of OOP and decorators
- ETL/ELT development experience, preferably with Python
- SQL knowledge (intermediate+)
- Experience with data warehousing (Kimball methodology)
- Experience with AWS services (S3, Redshift, EC2)
- Self-driven and self-coordinating; able to work without direct supervision
- Able to drive the development process of a small team
- Able to negotiate and collaborate with customers and drive discussions
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn