GFT Technologies
Data Architect
Job Description
Hybrid work from Wrocław office – 2/3 office days per week
At GFT, you will design and implement state-of-the-art data processing systems for some of the biggest and most technologically advanced companies in the financial, IoT and retail sectors. Often working directly with stakeholders, up to C-level client representatives, our architects are experts in top-level system design and project scoping. You will also be involved in pre-sales support and technological research, providing direction for our tech community in Poland and representing GFT in the wider tech community.
Your tasks:
- Designing architecture (top level to low level) for data processing systems
- Cooperating with the client’s architects and senior technical management as an SME
- Scoping projects and providing high-level estimates
- Hands-on support for Data / Azure projects
- Design of Data Warehouses, Data Lakes, data models / flows for projects / systems
- Planning and executing migrations of databases and data warehouses to Azure or to cloud-agnostic modern warehouses
- Master Data Management strategies: design, discussion, optimization
- Data pipeline optimization
- Design / optimization of Data Governance strategies for projects / systems / organizations
- Pre-sales support: knowledge, know-how, presentations, etc.
- Technical guidance in the Data / Azure space, for our employees and clients alike
- Building a track record that establishes GFT as a data-savvy organization
Your skills:
- Experience with Machine Learning / MLOps
- Proficiency in Apache Spark
- Solid Python / Java hands-on programming skills
- Proficiency in SQL
- Experience with SQL and NoSQL databases
- Modelling data structures and pipelines
- Data Warehouse and Data Lake design
- Solid knowledge of Data Governance aspects (Data Catalogs, Data Lineage, Master Data Management etc.)
Nice to have:
- Experience in pre-sales support
- Infrastructure as Code tooling and best practices
- Stream processing with Apache Kafka / Azure Event Hubs and the Confluent Platform
- Databricks Platform
- Experience in cloud database / data warehouse migration
- Data pipeline orchestration with Azure Data Factory, Apache Airflow, etc.
- Azure familiarity, including core Azure data services (Data Factory, HDInsight, Synapse Analytics, Data Lake Storage)
We offer you:
- Working in a highly experienced and dedicated team
- Contract of employment or B2B contract
- Competitive salary and extra benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
- Online training and certifications fitted to your career path
- Online foreign language lessons
- Social events
- Access to e-learning platform
- Ergonomic and functional working space