Tidepool
Lead Data Engineer
Job Description
Job Title: Lead Data Engineer
Location: Hyderabad/Bengaluru/Delhi NCR
About Tide
At Tide, we are building a finance platform designed to save small businesses time and money. We provide our members with business accounts and related banking services, but also a comprehensive set of connected administrative solutions from invoicing to accounting.
Launched in 2017, Tide is now used by over 1 million small businesses across the world and is available to UK, Indian and German SMEs. Headquartered in central London, with offices in Sofia, Hyderabad, Delhi, Berlin and Belgrade, Tide employs over 1,800 employees.
Tide is rapidly growing, expanding into new markets and always looking for passionate and driven people. Join us in our mission to empower small businesses and help them save time and money.
About the team:
As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions, reports and dashboards.
We rely heavily on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt and Tableau/Looker for our business intelligence, and embrace AWS with some GCP.
About the role
As a Lead Data Engineer you’ll be:
● Developing end-to-end ETL/ELT pipelines in collaboration with Data Analysts across business functions
● Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture
● Mentoring other junior engineers on the team
● Being the “go-to” expert for data technologies and solutions
● Providing on-the-ground troubleshooting and diagnosis of architecture and design challenges
● Troubleshooting and resolving technical issues as they arise
● Looking for ways of improving both what and how data pipelines are delivered by the department
● Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests and reports
● Owning the delivery of data models and reports end to end
● Performing exploratory data analysis to identify data quality issues early in the process and implementing tests to prevent them in the future
● Working with Data Analysts to ensure that all data feeds are optimised and available at the required times. This can include Change Data Capture (CDC) and other “delta loading” approaches
● Discovering, transforming, testing, deploying and documenting data sources
● Applying, helping define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review
● Building Looker dashboards for use cases as required
What we are looking for:
- 6+ years of extensive development experience with Snowflake or a similar data warehouse technology
- Working experience with dbt and other modern data stack technologies, such as Snowflake, Apache Airflow, Fivetran, AWS, Git and Looker
- Experience in agile processes, such as Scrum
- Extensive experience in writing advanced SQL statements and performance tuning them
- Experience in data ingestion techniques using custom or SaaS tools like Fivetran
- Experience in data modelling and the ability to optimise existing and new data models
- Experience in data mining, data warehouse solutions, and ETL, and in using databases in a business environment with large-scale, complex datasets
- Experience architecting analytical databases (in a Data Mesh architecture) is an added advantage
- You have experience working in agile cross-functional delivery teams
- You have high development standards, especially for code quality, code reviews, unit testing, continuous integration and deployment
- You have strong technical documentation skills and the ability to be clear and precise with business users
- You have business-level of English and good communication skills
- You have a basic understanding of various systems across the AWS platform (good to have)
- Preferably, you have worked in a digitally native company, ideally fintech
- Experience with Python, governance tools (e.g. Atlan, Alation, Collibra) or data quality tools (e.g. Great Expectations, Monte Carlo, Soda) will be an added advantage
Our Tech Stack:
- dbt
- Snowflake
- Airflow
- Fivetran
- SQL
- Looker
What you’ll get in return:
- Competitive salary
- Self & Family Health Insurance
- Term & Life Insurance
- OPD Benefits
- Mental wellbeing through Plumm
- Learning & Development Budget
- WFH Setup allowance
- 15 days of Privilege leaves
- 12 days of Casual leaves
- 12 days of Sick leaves
- 3 paid days off for volunteering or L&D activities
- Stock Options
Tidean Ways of Working
At Tide, we champion a flexible workplace model that supports both in-person and remote work to cater to the specific needs of our different teams.
While remote work is supported, we believe in the power of face-to-face interactions to foster team spirit and collaboration. Our offices are designed as hubs for innovation and team-building, where we encourage regular in-person gatherings to foster a strong sense of community.
Tide is a place for everyone
At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels. We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity or differently-abled status. We celebrate diversity in our workforce as a cornerstone of our success. Our commitment to a broad spectrum of ideas and backgrounds is what enables us to build products that resonate with our members’ diverse needs and lives.
We are One Team and foster a transparent and inclusive environment, where everyone’s voice is heard.
#LI-NN1