Sertis
Data Engineer
Job Description
Who is Sertis?
Sertis is a leading Data and AI company based in the heart of Bangkok. We provide both off-the-shelf and customized solutions for our clients, ranging from data infrastructure, BI development, and data-driven business insights to forecasting, optimization, and computer vision. Our expert team of data and AI consultants works closely with clients across industries such as retail, manufacturing, banking, energy, airlines, agriculture, and healthcare to understand their business needs and deliver bespoke solutions, using cutting-edge technologies that are just right for them.
Our aim is to be one of the leading Data and AI companies globally, where a diverse mix of talent wants to come, stay, and do their best work. We pride ourselves on bringing in not only the best, but also the nicest, talent from around the world. We recognise that our company runs on our people's hard work and dedication, and we maintain a culture that encourages learning, growth opportunities, innovative contribution, and a sense of ownership.
For more information, please visit: https://www.sertiscorp.com/
Overview of the job
We are seeking a Data Engineer to join our growing team. The ideal candidate will be a highly motivated and experienced data professional with a strong technical background in data engineering, data architecture, and cloud. As a Data Engineer, you will play a crucial role in designing and implementing data solutions that drive business value, help us make data-driven decisions, and improve customer experiences.
In this role, you will get to:
- Collaborate with cross-functional teams to understand business requirements and design data solutions to meet those needs
- Design and build scalable data pipelines to ingest, process, and store large volumes of data
- Manipulate complex data from a variety of sources (e.g. API, SFTP, Databases, SAP, Google Analytics, etc.)
- Ensure data quality, accuracy, and completeness through data validation and error handling processes
- Maintain and monitor existing ETL pipelines and advise on necessary infrastructure changes
- Design and implement security and access control measures to protect sensitive data and comply with regulations such as the PDPA
- Participate in recruitment by evaluating and interviewing candidates, as well as improving our recruitment processes
- Gather customer requirements during pre-sales interactions, determine project scope, create technical diagrams, and estimate man-days and cloud costs
- Develop and maintain technical documentation
You’ll be successful if you have:
- 4+ years of experience in data engineering, designing, building, and maintaining data infrastructure in cloud environments such as AWS, GCP, or Azure
- Strong programming skills in languages such as Python, Java, or Scala (Python preferred)
- Strong experience with cloud-based data lake solutions, such as S3, GCS, or Data Lake Storage, and with designing and implementing data lake architectures
- Strong experience with data warehousing tools, such as Redshift, Synapse, or BigQuery, and with optimizing data warehousing performance for large-scale data sets
- In-depth knowledge and hands-on experience with big data technologies such as Hadoop, Spark, and Hive (Spark preferred)
- Expertise in designing and implementing efficient ETL pipelines drawing on a variety of data sources, and the ability to ensure data quality, with a strong understanding of data integration and transformation techniques
- Experience working with orchestration tools such as Airflow, AWS Step functions or Azure Data Factory
- Ability to write efficient SQL queries for data extraction and transformation, demonstrating proficiency in optimizing query performance, understanding distributed query execution models, and utilizing advanced SQL concepts (Spark SQL preferred)
- Knowledge of CI/CD and Infrastructure as Code concepts, including experience with tools such as Terraform, AWS CDK, GitHub Actions, and GitLab CI
- Excellent problem-solving skills and the ability to work well in a fast-paced, collaborative environment
- Ability to scope projects, define architectures, and choose technologies based on project requirements
- Leadership skills and the ability to mentor junior and mid-level data engineers
It’s a plus if you have:
- Certifications in data engineering from cloud providers, such as the GCP Professional Data Engineer
- Experience working on multiple projects simultaneously
- Experience with data streaming technologies and integrating streaming data sources
- Experience with Docker and with deploying and managing containerized data solutions
- Experience implementing PDPA compliance processes
- Experience optimizing cloud costs through various strategies
- Experience with DevOps and continuous integration/continuous delivery (CI/CD) practices
- Experience gathering customer requirements and estimating project scope during pre-sales interactions
- Experience with Agile methodologies (e.g. Scrum, Kanban)
What are some benefits working at Sertis?
- Hybrid working environment; up early or a slow starter in the morning? We have flexible office hours
- Get to work and learn from the best in the industry, and share your ideas with like-minded individuals
- We cultivate intelligence and learning so that our experts can become community leaders in their respective fields in the tech industry
- Amazing colleagues to enjoy company social outings, parties, and events with
- Results-oriented workplace; we provide direction, not orders, and give you the autonomy to deliver your best work
- We work at the frontier of innovation in the AI industry
- Work on meaningful solutions that solve and improve real-life problems and challenges
- We run like a startup, and embrace the adventure; we focus on getting things done, while still having a down-to-earth and informal culture
This is your chance to build your career in the growing data-driven AI industry.
APPLY NOW!