Senior Software Engineer (Big Data)
Intuit
Company Overview
Intuit is the global financial technology platform that powers prosperity for the people and communities we serve. With approximately 100 million customers worldwide using products such as TurboTax, Credit Karma, QuickBooks, and Mailchimp, we believe that everyone should have the opportunity to prosper. We never stop working to find new, innovative ways to make that possible.
Job Overview
Come join the SBSEG Data Engineering team at IDC as a Senior Software Engineer. We are leveraging big data technologies to gain new insights into our QuickBooks customer experience. The technologies leveraged by our team include Hadoop, Vertica, and AWS Data Services. We foster an open team environment where we value direct interactions and working code above working in a cave.
Responsibilities
- Apply a strong understanding of data engineering and dimensional design fundamentals, including SQL, data integration (ETL), and front-end analysis / data visualization; learn new technologies quickly.
- Good understanding of data warehouse schema design and granularity of the data.
- Good understanding of data federation techniques and aggregation of data at scale from multiple source systems.
- Designing and developing ETL pipelines across multiple platforms and tools including Spark, Hadoop and AWS Data Services.
- Gathering functional requirements, developing technical specifications, and project & test planning.
- Work with business users to develop and refine analytical requirements for quantitative data (view-through, clickstream, acquisition, product usage, transactions), qualitative data (survey, market research) and unstructured data (blog, social network).
- Designing and developing schema definitions and supporting data warehouses/marts to integrate disparate data sources from within and outside Intuit, aggregate the data, and make it available for analysis.
- Support large data volumes and accommodate flexible provisioning of new sources.
- As a key member of the team, drive adoption of new technologies, tools, and process improvements to build world-class analytical capabilities for web analytics, optimization, experimentation, and personalization.
- Resolve defects/bugs during QA testing, pre-production, production, and post-release patches.
- Work cross-functionally with various Intuit teams: Product Management, Project Management, Data Architects, Data Scientists, Data Analysts, Software Engineers, and other Data Engineers.
- Contribute to the design and architecture of projects across the data landscape.
- Apply Agile Development, Scrum, or Extreme Programming methodologies.
- Help align work to overall strategies and reconcile competing priorities across the organization.
Qualifications
- BS/MS in computer science or equivalent work experience.
- Experience in developing complex star/snowflake schemas, creating ETL pipelines with Spark, and familiarity with MPP/Hadoop systems.
- Experience using cloud services such as AWS, Azure, or GCP.
- Must have mastery of data warehousing technologies, including data modeling, ETL, and reporting. The ideal candidate has 6+ years of experience in end-to-end data warehouse implementations, including at least 2 projects with 4TB+ data volume.
- Extensive experience with databases (Vertica, Netezza, or Oracle, and AWS data services).
- Experience in handling real-time data applications and building pipelines using streaming data.
- Understanding of Data Mesh and microservices architectures.
- Extensive experience in handling complex orchestration of data pipelines.
- Able to collaborate with cross-functional teams to resolve issues.
- Good understanding of data security and data governance.
- Good knowledge of Operating Systems (Unix or Linux).
- Good understanding of data warehouse methodologies.
- Hands-on experience in one or more programming or scripting languages (shell scripting, Python, Java, etc.).
- Must have been through several full-lifecycle data warehousing implementations and been involved in scalability and performance-related design aspects of the database and ETL layers.
- Solid communication skills: demonstrated ability to explain complex technical issues to technical and non-technical audiences.
- Demonstrated understanding of the software design and architecture process.
- Experience with unit testing and automated data quality checks.
- Results-oriented, self-motivated, accountable, and able to work under minimal supervision.
- Excellent written and oral communication and presentation skills.
Good to have
- Knowledge of the big data ecosystem, such as Hadoop MapReduce, Pig, and Hive, is a strong plus.
- Foundational knowledge of building highly resilient, fault-tolerant data platforms that can support 1000+ data applications.
- Conceptual understanding of Machine Learning / LLM / GenAI usage is a plus.
- Good understanding of reporting tools such as Tableau or Qlik Sense.
- Experience in design, development, and deployment of one or more ETL tools (Informatica, OWB, ODI).