Nagarro

Staff Engineer – Senior Data Engineer

23 April 2024
£118,000 - £190,000 / year

Company Description

We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale, across all devices and digital media, and our people are everywhere in the world (19,000+ experts across 33 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

Job Description

  • Architect, design, document, and implement data pipelines that feed data models for consumption in Snowflake, using dbt and Apache Airflow (see the orchestration sketch after this list).
  • Ensure the correctness and completeness of data transformed through engineering pipelines for end consumption in analytical dashboards.
  • Actively monitor and triage technical challenges in critical situations that require immediate resolution. 
  • Evaluate viable technical solutions and share MVPs or PoCs in support of the research.
  • Develop relationships with external stakeholders to maintain awareness of data and security issues and trends.
  • Review work from other tech team members and provide feedback for growth.
  • Implement data performance and data security policies that align with governance objectives and regulatory requirements.
  • Collaborate closely with global functional product portfolio technical leaders (Finance, HR, Marketing, Legal, Risk, IT), product owners, and functional area teams at all levels.
  • Partner with Global Data Product Portfolio Management and its teams (Enterprise Data Model, Data Catalog, Master Data Management).
  • Communicate with consulting and internal Data Product Portfolio teams.
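
To make the day-to-day work concrete, here is a minimal sketch of the kind of orchestration described above: an Apache Airflow DAG that runs dbt to build and then test models in Snowflake. The DAG id, schedule, and project path are hypothetical placeholders for illustration, not our actual configuration; it assumes Airflow 2.4+.

```python
# Minimal Airflow DAG sketch: orchestrate dbt transformations in Snowflake.
# The dag_id, schedule, and project path are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_dbt_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # assumes Airflow 2.4+
    catchup=False,
) as dag:
    # Build the dbt models that feed the analytical dashboards.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # placeholder path
    )

    # Validate correctness and completeness with dbt tests before consumption.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test
```

Keeping the run and test tasks separate makes data quality an explicit gate before dashboards consume the models.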

Qualifications

  • Bachelor’s degree or equivalent combination of education and experience.
  • Bachelor’s degree in information science, data management, computer science or related field preferred.
  • 5+ years of IT experience, with a major focus on data warehouse/database projects.
  • Must have exposure to technologies such as dbt, Apache Airflow, Snowflake.
  • Experience in data platforms: Snowflake, Oracle, SQL Server, MDM.
  • Expertise in writing SQL and database objects: stored procedures, functions, and views.
  • Hands-on experience with ETL/ELT, data security, SQL performance optimization, and job orchestration tools and technologies (e.g., dbt, Attunity, Oracle GoldenGate, APIs, Apache Airflow).
  • Experience in data modeling and relational database design.
  • Well-versed in applying SCD, CDC, and DQ/DV frameworks (see the SCD sketch after this list).
  • Demonstrated ability to write new code that is well documented and stored in a version control system (we use GitHub and Bitbucket).
  • Experience analyzing and interpreting financial data to provide valuable insights and support strategic decision-making.
  • Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (Product Owners, Scrum Master, Architects, and data SMEs).
  • Excellent written and oral communication and presentation skills for presenting architecture, features, and solution recommendations.
  • Experience in data warehousing, data modeling, and the building of data engineering pipelines. 
  • Experience with data engineering methods, such as ETL and ELT techniques through scripting and/or tooling. 
  • Experience with analyzing performance bottlenecks and providing enhancement recommendations.
  • Passion for customer service and a desire to learn and grow as a professional and a technologist.
  • Strong analytical skills related to working with unstructured datasets.
  • Experience collaborating with product owners to identify requirements, define desired outcomes, and deliver trusted results.
  • Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.
  • This role is heavily SQL-focused: the ideal candidate must have hands-on experience with SQL database design. Python is a plus.
  • Demonstrated deep understanding of SQL (advanced level) and analytical data warehouses (Snowflake preferred).
  • Familiarity with JIRA and Confluence.
  • Desire to continually keep up with advancements in data engineering practices.
  • Knowledge of AWS cloud and Python is a plus.
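
Since SCD handling comes up in the qualifications above, here is a hedged sketch of one common approach: a Type 2 slowly changing dimension load into Snowflake, run from Python via the Snowflake connector. All table, column, and connection names (stg_customer, dim_customer, TRANSFORM_WH, and so on) are illustrative assumptions, not a prescribed implementation.

```python
# Sketch of an SCD Type 2 load into Snowflake. All table and column names
# are hypothetical; connection details are read from the environment.
import os

import snowflake.connector

CLOSE_CHANGED_ROWS = """
UPDATE dim_customer d
SET d.valid_to = CURRENT_TIMESTAMP(), d.is_current = FALSE
WHERE d.is_current
  AND EXISTS (
      SELECT 1 FROM stg_customer s
      WHERE s.customer_id = d.customer_id
        AND s.attributes_hash <> d.attributes_hash  -- attribute change detected
  );
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer (customer_id, name, segment, attributes_hash,
                          valid_from, valid_to, is_current)
SELECT s.customer_id, s.name, s.segment, s.attributes_hash,
       CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL;  -- brand-new keys or rows just closed above
"""

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",  # placeholder
    database="ANALYTICS",      # placeholder
    schema="CORE",             # placeholder
)
try:
    cur = conn.cursor()
    cur.execute(CLOSE_CHANGED_ROWS)   # step 1: expire changed current rows
    cur.execute(INSERT_NEW_VERSIONS)  # step 2: insert fresh current rows
    conn.commit()
finally:
    conn.close()
```

The two-step pattern (expire changed current rows, then insert fresh versions) preserves full history while keeping exactly one current row per business key.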

Additional Information

  • Good to have: experience with cloud platforms such as AWS, Azure, GCP, and Snowflake.
  • Good to have: strong programming/scripting skills (Python, PowerShell, etc.).
  • Good to have: experience developing financial models and forecasts to support financial planning and decision-making processes.