
Yash Technologies

Data Analytics Engineer Job

20 November 2024
£51,000 - £95,000 per year

Job Description

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation.

 

At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change to an increasingly virtual world. That purpose drives us beyond generational gaps and the disruptions of the future.

 

We are looking to hire SQL professionals in the following areas:

 

Job Description:

 

Experience: 5+ years

 

Role Definition
Employs data analytics to assess an organization’s technical performance and recommends system enhancements.

Responsibilities

  • Analyzing customer requirements based on an understanding of conformed data models, and delivering solutions that are implementable and maintainable.
  • Verifying data accuracy across large data sets in diverse formats.
  • Researching, designing, and documenting data specifications at all points in the production life cycle.
  • Understanding software development and having proficiency in SQL and a data programming language.
  • Value Realization: Knowledge of value realization methods; ability to plan, execute, monitor and manage business activities and resources to determine and achieve the actual value from a business initiative as estimated in an associated business case.
  • Communicating Complex Concepts: Knowledge of effective presentation tools and techniques to ensure clear understanding; ability to use summarization and simplification techniques to explain complex technical concepts in simple, clear language appropriate to the audience.
  • Agile Development: Knowledge of agile methodologies and the agile development lifecycle; ability to utilize formal agile methodologies, disciplines, practices and techniques for the delivery of new and enhanced applications.
  • Cloud Computing: Knowledge of the concepts, technologies and services of cloud computing; ability to design, deploy and implement cloud computing solutions in various business environments.
  • Carrying out tasks, under supervision, to increase capacity or add capabilities through cloud computing.
  • ETL Process: Knowledge of the extraction, transformation and loading (ETL) process; ability to develop a database through the ETL process (a minimal illustrative sketch follows this list).
  • Information Management: Knowledge of an organization’s existing and planned Information Architecture and Information Management (IM) methodology; ability to collect and manage information from different sources, and distribute this information to enhance operational efficiency.
  • Modeling: Data, Process, Events, Objects: Knowledge of data, process and events; ability to use tools and techniques for analyzing and documenting logical relationships among data, processes or events.
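
To make the ETL responsibility above more concrete, here is a minimal, purely illustrative Python sketch of an extract-transform-load step. The table names, columns, currency rates, and the in-memory SQLite connection are assumptions for demonstration only, not details of an actual YASH pipeline.

    import sqlite3

    def run_etl(conn):
        # Extract: read raw order rows from the (hypothetical) source table.
        rows = conn.execute("SELECT id, amount, currency FROM raw_orders").fetchall()

        # Transform: convert every amount to GBP using assumed example rates.
        gbp_rate = {"GBP": 1.0, "USD": 0.79, "EUR": 0.85}
        cleaned = [(order_id, round(amount * gbp_rate.get(currency, 1.0), 2))
                   for order_id, amount, currency in rows]

        # Load: write the conformed rows into the reporting table.
        conn.execute("CREATE TABLE IF NOT EXISTS orders_gbp (id INTEGER, amount_gbp REAL)")
        conn.executemany("INSERT INTO orders_gbp VALUES (?, ?)", cleaned)
        conn.commit()

    if __name__ == "__main__":
        # In-memory SQLite stands in for Snowflake/MySQL/PostgreSQL; seed sample data first.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, currency TEXT)")
        conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                         [(1, 100.0, "USD"), (2, 80.0, "GBP")])
        run_etl(conn)
        print(conn.execute("SELECT * FROM orders_gbp").fetchall())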

Basic Qualifications

  • Master’s or Bachelor’s degree in Computer Science or a related field
  • 5+ years of experience
  • Experience in data analysis: performing statistical analysis, data visualization, and predictive modelling to identify trends and patterns (see the illustrative sketch after this list).
  • Strong skills in SQL
  • 2+ years of recent programming experience, preferably in Python
  • Experience with relational databases such as Snowflake, MySQL or PostgreSQL
  • Capable of thriving in high-pressure situations and delivering results within tight time constraints.
  • Demonstrated passion for technology coupled with an eagerness to contribute to a collaborative team environment.
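
As a purely illustrative example of the SQL-plus-Python analysis skills listed above, the sketch below reads rows with a SQL query and summarises a simple monthly trend in pandas. The table, columns, and sample data are hypothetical, and in-memory SQLite stands in for engines such as Snowflake, MySQL, or PostgreSQL.

    import sqlite3
    import pandas as pd

    # In-memory SQLite stands in for a production database for this demonstration.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (sale_date TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)",
                     [("2024-01-15", 120.0), ("2024-01-20", 80.0), ("2024-02-05", 200.0)])

    # Pull the raw rows with SQL, then summarise the monthly trend with pandas.
    df = pd.read_sql_query("SELECT sale_date, amount FROM sales", conn)
    df["month"] = pd.to_datetime(df["sale_date"]).dt.to_period("M")
    monthly = df.groupby("month")["amount"].agg(["sum", "mean", "count"])
    print(monthly)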

Nice to have

  • A working knowledge of statistical methodologies and data management.
  • Proficient in working with diverse datastores, including Snowflake, Elasticsearch, MySQL, and Oracle.
  • Well-versed in developing Snowflake procedures, tasks, and other Snowflake components.
  • Proficient in utilizing batch or stream processing systems, including Apache Spark and AWS Glue.
  • Familiarity with scheduling tools such as Apache Airflow (a minimal DAG sketch follows this list).
  • Skilled in developing and working with RESTful APIs.
  • Hands-on experience with API tools like Swagger, Postman, and Assertible.
  • Advocate of Test-Driven Development (TDD) and Behaviour-Driven Development (BDD).
  • Extensive hands-on experience with testing tools like Selenium and Cucumber, with expertise in seamlessly integrating them into CI/CD pipelines.
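
For the Apache Airflow familiarity mentioned above, here is a minimal, assumed example of an Airflow 2.x DAG that chains a placeholder extract task and load task on a daily schedule. The dag_id, callables, and schedule are illustrative only and do not describe any real pipeline.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("extract step (placeholder)")

    def load():
        print("load step (placeholder)")

    # Hypothetical daily pipeline: the extract task must finish before the load task.
    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # use schedule_interval on older Airflow 2.x releases
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task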

 

 

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.

 

Our Hyperlearning workplace is grounded in four principles:

  • Flexible work arrangements, free spirit, and emotional positivity
  • Agile self-determination, trust, transparency, and open collaboration
  • All support needed for the realization of business goals
  • Stable employment with a great atmosphere and an ethical corporate culture