DP World
Group Cloud Data Architect
Job Description
KEY ACCOUNTABILITIES
- Define how the data will be stored, consumed, integrated, and managed by different data entities and IT systems, as well as any applications using or processing that data.
- Document data sources, data structures, data flows, and data infrastructure throughout the organization.
- Support the planning, scheduling, and completion of data archiving jobs to meet business needs.
- Perform necessary configurations, write detailed specifications for the development of custom programs, test, and implement automated solutions.
- Collaborate with internal customers to elicit their business concerns and translate them into system development requirements.
- Oversee programs to cleanse, consolidate, and correlate structured, semi-structured, and unstructured data.
- Participate in planning initiatives, feasibility studies, cost/benefit analyses, new systems design, and implementation timelines.
- Participate in planning initiatives for system testing and configuration
- Set standards for data use and manage standards for metadata, master data, reference data, and naming.
- Maintain the cloud architecture, develop cloud adoption plans, determine cloud application design, and create systems for managing, monitoring, and maintaining the cloud environment.
- Coordinate with different teams to gather requirements and define the data solution architecture.
- Incorporate and adapt to new cloud technologies/solutions in Azure as well as other cloud platforms.
- Ensure that data platform costs are optimal and within the defined budget, driving cost optimization activities.
- Act as an ambassador for DP World at all times when working; promoting and demonstrating positive behaviours in harmony with DP World’s Founder’s Principles, values and culture; ensuring the highest level of safety is applied in all activities; understanding and following DP World’s Code of Conduct and Ethics policies.
- Perform other related duties as assigned.
QUALIFICATIONS, EXPERIENCE AND SKILLS
Qualification:
- Bachelor's or Master's degree in Computer Science/IT or equivalent.
- Azure certifications are an added advantage (AZ-305, AZ-303/AZ-304, or DP-200 & DP-201).
Experience: 8-12 Years
- Minimum 8 to 12 years of experience in data applications, including architecture and design experience with different cloud providers, preferably Azure.
- 5+ years of DevOps / IT infrastructure experience
Must Have Skills:
- Experience architecting and building cloud data lakes and enterprise analytics solutions, specifically with Azure data analytics technologies and architecture, and optimizing 'big data' pipelines, architectures, and data sets.
- Hands-on experience architecting and delivering solutions on the Azure data analytics platform, including Azure Databricks, Azure Cosmos DB, Azure Data Factory, Azure Logic Apps, Azure Functions, Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, and Azure Search.
- Experience designing and implementing machine learning solutions as part of high-volume data ingestion and transformation pipelines.
- Experience working with structured and unstructured data.
- Experience working in the high-tech industry is a plus.
- Advanced hands-on knowledge of SQL, Spark, Python, Scala, PySpark (2+ of these) and Terraform, plus experience working with relational databases for data querying and retrieval.
- Experience with the design and architecture of Azure big data frameworks/tools: Azure Data Lake, Azure Data Factory, Azure Databricks, Azure ML, and SQL Data Warehouse.
- Experience with the design and architecture of relational SQL and NoSQL databases, including MS SQL Server, Cosmos DB, and Azure Synapse.
- Experience with the design and architecture of data security and Azure security (VMs, VNets).
- Leading development of data lake architectures from scratch.
- Experience with Azure DevOps/CI-CD: continuous integration and deployment.
#LI-VG1