FinTech Australia

Senior Data Integration Engineer

EPAM Systems

This job is no longer accepting applications

Other Engineering
Prague, Czechia
Posted 6+ months ago

Senior Data Integration Engineer Description

We are currently looking for a Senior Data Integration Engineer to join our Prague office.


Technologies

  • Cloud provider stack (AWS/Azure/GCP): storage, compute, networking, identity and security
  • Data warehousing and DB solutions (Redshift, Snowflake, BigQuery, Azure Synapse, etc.)
  • Experience with industry-standard Data Integration tools (Azure Data Factory, AWS Glue, GCP Dataflow, Talend, Informatica, Pentaho, Apache NiFi, KNIME, SSIS, etc.)
  • Experience coding in at least one data-oriented programming language: SQL, Python, SparkSQL, PySpark, R, Bash, Scala
  • Experience working with at least one relational database (RDBMS: MS SQL Server, Oracle, MySQL, PostgreSQL)
  • Dataflow orchestration tools, data replication tools, and data preparation tools
  • Version control systems (Git, SVN)
  • Testing: component/integration testing, reconciliation

Responsibilities

  • Design and implement Data Integration solutions, model databases, and contribute to building data platforms using classic data technologies and tools (databases, ETL/ELT technologies, MDM tools, etc.) as well as modern Cloud and hybrid data solutions
  • Work with product and engineering teams to understand data product requirements, and evaluate new features and architecture to help drive decisions
  • Build collaborative partnerships with architects and key individuals within other functional groups
  • Perform detailed analysis of business problems and technical environments, and use this analysis to design high-quality technical solutions
  • Actively participate in code review and testing of solutions to ensure they meet specification and quality requirements
  • Build and foster a high-performance engineering culture, supervise junior/middle team members, and provide them with technical leadership
  • Write project documentation
  • Be self-managing: implement functionality without supervision, test your work thoroughly using test cases, and/or supervise less experienced colleagues

Requirements

  • At least 3 years of relevant development experience and practice with data management, data storage, data modeling, data analytics, data migration, and database design
  • Practical hands-on experience in developing Data Solutions in at least one major public Cloud environment (AWS, Azure, GCP)
  • Practical knowledge of leading cloud data warehousing solutions (e.g. Redshift, Azure Synapse Analytics, Google BigQuery, Snowflake, etc.)
  • Production coding experience in one of the data-oriented programming languages
  • Solid background in developing Data Analytics & Visualization, Data Integration or DBA & Cloud Migration Solutions
  • Experienced and highly self-motivated professional with outstanding analytical and problem-solving skills
  • Able to play the role of a key developer and designer, or lead a team of 2-5 engineers, ensuring that delivered solutions meet business requirements and expectations
  • Able to read and understand project and requirement documentation; able to create design and technical documentation, including high-quality documentation of their code
  • Experienced in working with modern Agile development methodologies and tools
  • Able to work closely with customers and other stakeholders
  • Advanced knowledge of Data Integration tools (Azure Data Factory, AWS Glue, GCP Dataflow, Talend, Informatica, Pentaho, Apache NiFi, KNIME, SSIS, etc.)
  • Advanced knowledge of Relational Databases (SQL optimization, Relations, Stored Procedures, Transactions, Isolation Levels, Security)
  • Practical hands-on experience of development of Data Solutions in Cloud environments (AWS, Azure, GCP) - designing, implementing, deploying, and monitoring scalable and fault-tolerant data solutions
  • Solid understanding of core cloud technologies and approaches. Awareness of niche and case-specific cloud services
  • Able to troubleshoot outages of average complexity and to identify and trace performance issues
  • Pattern-driven solution design, choosing the best fit for particular business requirements and technical constraints
  • Advanced knowledge of Data Security (Row-level data security, audit, etc.)
  • Production experience of one of the data-oriented programming languages: SQL, Python, SparkSQL, PySpark, R, Bash
  • Production project experience in Data Management, Data Storage, Data Analytics, Data Visualization, Data Integration, MDM (for MDM profiles), Disaster Recovery, Availability, Operation, Security, etc.
  • Experience with data modeling (OLAP, OLTP, ETL, and DWH/Data Lake/Delta Lake/Data Mesh methodologies; Inmon vs Kimball; staging areas; SCD and other dimension types)
  • Good understanding of online and streaming integrations and micro-batching; understanding of CDC methods and delta extracts
  • General understanding of Housekeeping processes (archiving, purging, retention policies, hot/cold data, etc.)
  • Good understanding of CI/CD principles and best practices. Understanding of concepts of "Canary release", Blue-Green, Red-Black deployment models
  • Data-oriented focus and compliance awareness (e.g. PII, GDPR, HIPAA)
  • Experience in direct customer communications
  • Experience across different business domains
  • English proficiency

We offer

  • Opportunity to work in a fast-paced, agile, software engineering culture
  • Comfortable modern office in Prague 7, with support of hybrid or fully remote mode
  • Benefit program (5 weeks of vacation, paid sick days, paid days off for special occasions, meal vouchers, Flexi Pass, annual Prague public transport pass, MultiSport cards, optional pension fund contribution, health insurance for a family member)
  • EPAM Employee Stock Purchase Plan (ESPP) (subject to certain eligibility requirements)
  • English language courses
  • Czech language courses upon request
  • Referral bonuses for recommended candidates
  • Mobile phone tariff program for managerial-level candidates
  • Great learning and development opportunities, including in-house professional training, career advisory and coaching, sponsored professional certifications, well-being programs, LinkedIn Learning Solutions and much more

Certain benefits and perks may be subject to eligibility requirements and may be available only after you have passed your probationary period.

FINTECH AUSTRALIA

FinTech Australia exists to help our country become one of the world’s top markets for fintech innovation and investment.

© 2023 FinTech Australia