YellowIpe

YellowIpe® is your consultancy for professional IT services. We offer the best customized solutions for the requirements and challenges of your technology project.

Data Engineer - Databricks

On-site


Date: July 1, 2025

About YellowIpe

Our mission is to inspire the connection between technology and people. We foster the best in our professionals through our expertise in finding and attracting the best talent for the best projects. Focus on People, Collaboration, and Commitment are the pillars that guide us on this trajectory.

Join the yellow team as our new Data Engineer - Databricks!

As a Data Engineer, you will be responsible for understanding business and technological challenges, developing data pipelines that tackle those challenges, and ensuring their smooth deployment.

You will also be responsible for applying industry-standard and company good practices, and for applying and evolving our various patterns.

Responsibilities:

Project Understanding and Communication:

- Understand problems from a user perspective and communicate clearly to ensure the issue is well understood.

- Ensure you clearly understand the architecture provided by the Data Architect.

- Communicate with the Data Architect and your peers on the technical solution you’re developing and communicate with the Project Manager in charge of the project you’re working on.

Development:

- Write and communicate on new or updated interface contracts.

- Apply a strong understanding of data warehousing concepts, data lakes, ETL/ELT processes, and data modeling.

- Develop data pipelines based on the defined architecture.

- Ensure the regular good practices are applied.

- Deploy requested infrastructure, particularly using Terraform.

- Perform peer reviews and ask your peers to review your code when merging a new version of the codebase.

Testing:

- Define tests with your project manager, based on the functional and technical requirements of the pipeline you’re developing.

- Perform those tests and communicate regularly on the results.

- Regularly summarize the results of your tests in a dedicated document.

Deployments:

- Present the development performed to the Data Architect in charge of the architecture, and to the Lead DataOps, through our Deployment Reviews.

- Track and communicate any potential errors throughout the active monitoring period that follows a deployment.

- Ensure diligent application of the deployment process, logging, and monitoring strategy.

Requirements:

- Proficiency with PySpark and Spark SQL for data processing.

- Experience with Databricks using Unity Catalog.

- Knowledge of Delta Live Tables (DLT) for automated ETL and workflow orchestration in Databricks.

- Familiarity with Azure Data Lake Storage.

- Experience with orchestration tools (e.g., Apache Airflow or similar) for building and scheduling ETL/ELT pipelines.

- Knowledge of data partitioning and data lifecycle management on cloud-based storage.

- Familiarity with implementing data security and data privacy practices in a cloud environment.

- Terraform: at least one year of experience with Terraform and knowledge of GitOps good practices.

- Additional Knowledge and Experience that are a Plus: Databricks Asset Bundles; Kubernetes; Apache Kafka; Vault.

Personal Traits:

- Ability to adapt to different contexts, teams and stakeholders.

- Proactive ownership of the projects delivered by your team.

- A continuous eye for improvement on existing processes.

- Clear communication and collaboration skills.

- Excellent analytical and problem-solving ability.

- Demonstrated ability to manage stress in an operational environment.

- Demonstrated ability to quickly understand the business and technical requirements of a data pipeline to be developed, and to challenge potential misconceptions.

- Experience working on a Data Platform at industrial scale.

Important information:

- Remote (1x/month in the office) - Fatima - Leiria.

- Candidates must be living in Portugal.

Apply for this opportunity on our website!
