PradeepIT Consulting Services Pvt Ltd
DataOps Engineer / Database Administrator | Permanent Remote
Job Description
We are looking for a talented individual with a minimum of 5 years of experience to join a team of passionate and highly skilled technology professionals building leading customer-centric software for the energy and finance sectors.
Roles & Responsibilities:
- Building and optimizing data pipelines that extract data from multiple sources and load it into data warehouses. A DataOps engineer must be familiar with extract, load, transform (ELT) and extract, transform, load (ETL) tools; a minimal pipeline sketch appears after this list.
- Using automation to streamline data processing. To reduce development time and increase data reliability, DataOps engineers automate manual processes, such as data extraction and testing.
- Managing data pipelines in production. A DataOps engineer provides the organization with access to structured datasets and analytics that can be further analyzed to derive insights.
- Designing data engineering assets. This involves developing frameworks to support an organization's data demands.
- Facilitating collaboration. DataOps engineers communicate and collaborate with other data and BI team members to enhance the quality of data products.
- Testing. This involves executing automated tests at every stage of a pipeline to increase productivity while reducing errors, including unit tests (testing separate components of a data pipeline), performance tests (testing responsiveness), and end-to-end tests (testing the whole pipeline).
- Adopting new solutions. This includes testing and adopting solutions and tools that adhere to DataOps best practices.
- Handling security. DataOps engineers ensure data security standards are applied across the data pipelines.
- Reducing waste and improving data flow. This involves continually striving to reduce wasted effort, identify gaps and correct them, and improve data development and deployment processes.
- Database Consolidations: Assist in consolidating multiple databases across different cloud providers into a unified, managed environment, ensuring consistency and efficient operations.
- Database Performance Improvements: Identify performance bottlenecks in MongoDB and other database systems, and implement optimizations such as indexing, query tuning, and database configuration enhancements (see the query-tuning sketch after this list).
- Big Data: Manage and maintain the data lake Bronze/Silver/Gold storage layers and quickly identify the right data repository for each workload.
- Test Environment Best Practices: Collaborate with the QA and development teams to establish best practices for test environments, including data seeding, cleansing sensitive data, and maintaining consistent data sets.
- CICD and IaC Integration: Work closely with the DevOps team to integrate database improvements into the CI/CD pipeline and Infrastructure as Code (IaC) workflows, using tools like Terraform.
- SQL Expertise: Utilize your strong SQL skills to create and optimize complex queries, stored procedures, and database views across various database platforms (Postgres, MySQL, MS SQL Server, MongoDB).
- Database Administration: Perform routine database administration tasks, including backup and recovery, monitoring, security, and user management for the supported database systems.
- Automation: Develop and maintain automation scripts and tools to streamline database administration tasks, such as provisioning, configuration management, and data migration (see the backup automation sketch after this list).
- Collaboration: Work closely with cross-functional teams, including developers, system administrators, QA engineers, and DevOps personnel, to ensure smooth database operations and support their requirements.
- Documentation: Create and maintain technical documentation, including database design specifications, standard operating procedures, and troubleshooting guides.
- Continuous Improvement: Stay updated with the latest trends and technologies in database management, DevOps, and cloud computing. Propose and implement innovative solutions to enhance the database systems’ performance, scalability, and security.
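For illustration only, below is a minimal ELT/ETL-style pipeline sketch in Python of the kind of work described above. The CSV file name, column names, and the SQLite file used as a stand-in warehouse are assumptions, not part of this posting.

```python
# Minimal pipeline sketch: extract rows from a CSV export, apply a small
# transformation, and load them into a local SQLite file standing in for
# a warehouse table. All file, table, and column names are illustrative.
import csv
import sqlite3

SOURCE_CSV = "meter_readings.csv"   # assumed source export
WAREHOUSE_DB = "warehouse.db"       # stand-in for a real warehouse

def extract(path):
    """Read raw rows from the CSV source."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Normalise types and drop rows with missing readings."""
    for row in rows:
        if not row.get("reading_kwh"):
            continue
        yield (row["meter_id"], row["read_at"], float(row["reading_kwh"]))

def load(records):
    """Create the target table if needed and bulk-insert the records."""
    con = sqlite3.connect(WAREHOUSE_DB)
    with con:
        con.execute(
            "CREATE TABLE IF NOT EXISTS readings "
            "(meter_id TEXT, read_at TEXT, reading_kwh REAL)"
        )
        con.executemany("INSERT INTO readings VALUES (?, ?, ?)", records)
    con.close()

if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)))
```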
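As a sketch of the indexing and query-tuning work mentioned above, the snippet below compares the query plan for a filtered lookup before and after adding an index. SQLite is used purely so the example is self-contained, and the table and column names are assumptions; the same idea applies to EXPLAIN on Postgres, MySQL, or MS SQL Server.

```python
# Query-tuning sketch: show the plan for a filtered lookup before and
# after adding an index on the filter column.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE invoices (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
con.executemany(
    "INSERT INTO invoices (customer_id, amount) VALUES (?, ?)",
    [(i % 500, i * 1.5) for i in range(10_000)],
)

QUERY = "SELECT SUM(amount) FROM invoices WHERE customer_id = ?"

def show_plan(label):
    # The last column of each EXPLAIN QUERY PLAN row is the plan detail.
    plan = con.execute("EXPLAIN QUERY PLAN " + QUERY, (42,)).fetchall()
    print(label, [row[-1] for row in plan])

show_plan("before index:")   # full table scan of invoices
con.execute("CREATE INDEX idx_invoices_customer ON invoices (customer_id)")
show_plan("after index:")    # indexed search using idx_invoices_customer
```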
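As an example of the routine administration and automation described above, here is a hedged sketch of a nightly backup-and-rotation script. The database name, backup directory, and 14-day retention window are assumptions, and pg_dump plus its connection settings (host, credentials, etc.) must already be configured on the host running it.

```python
# Backup automation sketch: dump a Postgres database with pg_dump and
# delete dumps older than the retention window. Names and paths are
# illustrative; pg_dump and its connection settings must be in place.
import subprocess
import time
from datetime import datetime
from pathlib import Path

DB_NAME = "app_db"                    # assumed database name
BACKUP_DIR = Path("/var/backups/pg")  # assumed backup location
RETENTION_DAYS = 14

def take_backup():
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    target = BACKUP_DIR / f"{DB_NAME}_{datetime.now():%Y%m%d_%H%M%S}.dump"
    # Custom-format dump so it can be restored selectively with pg_restore.
    subprocess.run(
        ["pg_dump", "--format=custom", f"--file={target}", DB_NAME],
        check=True,
    )
    return target

def rotate_backups():
    cutoff = time.time() - RETENTION_DAYS * 24 * 3600
    for dump in BACKUP_DIR.glob(f"{DB_NAME}_*.dump"):
        if dump.stat().st_mtime < cutoff:
            dump.unlink()

if __name__ == "__main__":
    take_backup()
    rotate_backups()
```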
Experience & key skills:
- Proven experience as a Database Administrator or DataOps Engineer.
- Strong knowledge of database management systems, such as Oracle, MySQL, PostgreSQL, MongoDB, and MS SQL Server.
- Experience managing a ClickHouse cluster or a similar data warehouse.
- Proven experience with big data storage technologies such as S3, HDFS, and ELK; Hadoop experience is a plus.
- Proficiency in database administration, performance tuning, and troubleshooting.
- Experience with infrastructure as code tools, such as Terraform, Ansible, or CloudFormation.
- Solid understanding of DevOps principles and practices, including CI/CD workflows.
- Strong SQL skills and ability to optimize complex queries across multiple database platforms.
- Experience with cloud platforms, such as AWS, Azure, or Google Cloud Platform.
- Familiarity with containerization technologies, such as Docker and Kubernetes.
- Prior experience with SSRS (SQL Server Reporting Services) is a plus.
- Strong problem-solving skills and the ability to work independently and as part of a team.
- Excellent communication and collaboration skills.
Immediate joiners or candidates with a notice period of less than 30 days are most welcome.
- Role: Cloud System Administration
- Industry Type: IT Services & Consulting