Nagarro
Associate Staff Engineer — Data Analyst
Company Description
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital media, and our people are everywhere in the world (19,500+ experts across 35+ countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!
Job Description
Must-have skills: Power BI (Capable), Python (Capable), SQL (Strong), SQL Server, Snowflake
- General Experience and Skills
- 4+ years’ experience in the data & analytics space
- Good communication skills
- Ability to articulate findings and challenges encountered during analysis and SQL development
- Ability to listen attentively and capture notes and feedback during requirements sessions
- Requirements Gathering & Business Acumen
- Experience collecting requirements from business stakeholders and translating them into technical data requirements
- Ability to relate and understand how data supports business objectives and strategy.
- Experience building consumer-facing data structures for analysts (tables, views, small data marts)
- Strong data analysis skills
- Experience with exploratory data analysis with new datasets.
- Foundational descriptive statistics experience (e.g., mean, median, mode, standard deviation, skewness)
- Experience unit testing newly developed tables and views
- Experience with identifying and resolving data quality issues.
- Technical Skills
- Strong general SQL experience & fluency
- CTEs, Window Functions, Joins, Group By, Where, Having, Stored Procedures / UDFs
- DDL to create/alter/drop tables and views; DML to insert/update/delete data
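As a rough illustration of the SQL fluency listed above (CTEs, window functions, GROUP BY/HAVING), the sketch below runs one such query against a small in-memory SQLite table; the table name and data are made up for this example:

```python
import sqlite3

# Illustrative in-memory table; names and values are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES ('East', 100), ('East', 200), ('West', 50);
""")

# One query combining a CTE, GROUP BY/HAVING, and a window function.
rows = conn.execute("""
    WITH regional AS (
        SELECT region, SUM(amount) AS total
        FROM sales
        GROUP BY region
        HAVING SUM(amount) > 0
    )
    SELECT region, total,
           RANK() OVER (ORDER BY total DESC) AS rnk
    FROM regional
    ORDER BY rnk
""").fetchall()

print(rows)  # [('East', 300, 1), ('West', 50, 2)]
```

The same pattern translates directly to SQL Server or Snowflake, which both support CTEs and window functions.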
- AWS Cloud Data Analyst Experience
- Athena, S3, Glue, EMR Studio (for Jupyter Notebooks)
- Foundational Data Modeling Knowledge & Concepts
- Star & Snowflake schema structures
- Slowly changing dimensions
- Normalization & Denormalization
- Data Warehouse / Data Lake / Data Lakehouse concepts
- Basic Python Experience
- Pandas, NumPy, SQLAlchemy, Requests (for API calls)
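A minimal sketch of the Pandas/NumPy usage and the foundational descriptive statistics mentioned earlier (mean, median, standard deviation); the dataset here is invented purely for illustration:

```python
import numpy as np
import pandas as pd

# Made-up sample data for this sketch.
df = pd.DataFrame({"region": ["East", "East", "West"],
                   "amount": [100, 200, 50]})

# Foundational descriptive statistics.
stats = {
    "mean": df["amount"].mean(),
    "median": df["amount"].median(),
    "std": float(np.std(df["amount"])),  # population std dev (ddof=0)
}

# Simple aggregation of the kind used in exploratory data analysis.
totals = df.groupby("region")["amount"].sum()
print(stats, dict(totals))
```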
- Basic Spark Experience & Concepts
- Comfort with SparkSQL and working with Spark DataFrames
- Foundational experience with creating and updating YAML-based configuration files
- Experience working in Jupyter Notebooks with SQL, Python, SparkSQL
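For the YAML-based configuration bullet above, a made-up example of the kind of file such a role might maintain (every key and value here is illustrative, not a real schema):

```yaml
# Illustrative pipeline config; all keys and values are invented.
source:
  database: sales_db
  table: daily_orders
output:
  format: parquet
  path: s3://example-bucket/curated/orders/
schedule:
  cron: "0 6 * * *"   # run daily at 06:00 UTC
```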
- Foundational API Experience
- Understanding of pagination and cursor concepts for retrieving multiple responses
- Experience using a tool like Postman, curl, and/or the Python requests library to issue GET/POST requests and interpret sample payloads from APIs
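The cursor/pagination concept above can be sketched without a live API. Here `fetch_page` is a made-up stand-in for a real HTTP GET (e.g., via the requests library): each response carries a `next_cursor` token, and the loop follows it until no pages remain. The field names are assumptions for illustration:

```python
# Hypothetical stand-in for an HTTP GET against a paginated API.
def fetch_page(cursor=None):
    pages = {
        None: {"items": [1, 2], "next_cursor": "abc"},
        "abc": {"items": [3], "next_cursor": None},
    }
    return pages[cursor]

def fetch_all():
    items, cursor = [], None
    while True:
        payload = fetch_page(cursor)
        items.extend(payload["items"])
        cursor = payload["next_cursor"]
        if cursor is None:   # no more pages to retrieve
            return items

print(fetch_all())  # [1, 2, 3]
```

With a real API, `fetch_page` would call something like `requests.get(url, params={"cursor": cursor})` and read the cursor field named in that API's documentation.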
- Basic Power BI Experience
- Connecting to basic data sources (e.g., SQL Server, Snowflake)
- Developing basic dashboard visualizations
- Relating tables from different data sources in the Power BI model view