Syffer
Data Engineer
Job Description
Syffer is an all-inclusive consulting company focused on talent, tech and innovation. We exist to elevate companies and humans all around the world, driving change from the inside out.
We believe that technology combined with human kindness positively impacts every community around the world. Our approach is simple: we see a world without borders and believe in equal opportunities. We are guided by our core principles of spreading positivity and good energy, promoting equality, and caring for others.
Our hiring process is unique! People are selected for their value, education, talent and personality. We don't consider ethnicity, religion, national origin, age, gender, sexual orientation or identity.
It's time to burst the bubble, and we will do it together!
We are looking for a professional to join a client team.
What you will do
– Understand user problems and ensure the architecture provided by the Data Architect is clear;
– Communicate with the Data Architect, peers, and Project Manager about technical solutions;
– Write and update interface contracts with strong knowledge of data warehousing, data lakes, ETL/ELT, and data modeling;
– Develop data pipelines, apply best practices, and deploy infrastructure using Terraform;
– Conduct and request peer code reviews;
– Define, perform, and document tests based on pipeline requirements;
– Present work during Deployment Reviews and monitor for errors post-deployment;
– Ensure adherence to deployment processes, logging, and monitoring strategies.
Who you are
– Proficiency with PySpark and Spark SQL for data processing;
– Experience with Databricks, including Unity Catalog;
– Knowledge of Delta Live Tables (DLT) for automated ETL and workflow orchestration in Databricks;
– Familiarity with Azure Data Lake Storage;
– Experience with orchestration tools (e.g., Apache Airflow or similar) for building and scheduling ETL/ELT pipelines;
– Knowledge of data partitioning and data lifecycle management on cloud-based storage;
– Familiarity with implementing data security and data privacy practices in a cloud environment;
– At least one year of experience with Terraform and knowledge of GitOps best practices;
– Additional knowledge and experience that are a plus:
- Databricks Asset Bundles
- Kubernetes
- Apache Kafka
- Vault
What you’ll get
– An inspiring work environment;
– Health insurance from the start of employment;
– Work equipment suited to your role;
– A hybrid work arrangement whenever possible;
– Food allowance paid via meal card (exempt from legal deductions);
– And more.
Work with expert teams on long-term projects of large magnitude and intensity, together with our clients, all leaders in their industries. Are you ready to step into a diverse and inclusive world with us?
Together we will promote uniqueness!