
ITDS Portugal
ITDS is a leader in IT engineer outsourcing, working with a wide range of web and mobile technologies for more than 30 global clients. It has been named one of the 1000 fastest-growing companies in Europe for three consecutive years, certified as a Great Place to Work, and awarded the Forbes Diamond Award in 2023. ITDS currently employs more than 600 IT professionals across Portugal, Poland, and the Netherlands.
- Location: Porto
- Work model: Hybrid (partial remote)
- Salary: 24k–30k EUR
- Date: April 6, 2026
Mid-Level Data Engineer – Data Pipelines and Modern Data Stack
Porto-based opportunity with a hybrid work model (up to 3 days remote per week).
As a Mid-Level Data Engineer, you will work for our client, an innovative organization focused on building scalable data solutions that empower data-driven decision making. Join a team dedicated to building reliable data pipelines and supporting the analytics that drive business performance. This role offers an excellent opportunity to grow your technical expertise within a modern data stack environment.
Your main responsibilities:
- Build, maintain, and optimize data pipelines from source systems to analytics-ready datasets.
- Develop and maintain data transformations using dbt Core, following best practices for models, tests, documentation, and macros.
- Implement and manage data ingestion workflows using dbt and Python-based data pipelines.
- Contribute to data modeling activities across staging, intermediate, and data mart layers.
- Ensure data quality and reliability by implementing tests and monitoring data pipelines.
- Troubleshoot pipeline failures and resolve performance issues promptly.
- Collaborate closely with Analytics Engineers, BI teams, and stakeholders to gather data requirements.
- Participate in code reviews and help improve data engineering standards within the team.
You're ideal for this role if you have:
- 3–4 years of experience in Data Engineering or a related field.
- Hands-on experience with dbt Core, including CLI development and version control.
- Solid skills in building and maintaining data pipelines.
- Strong SQL knowledge and experience with analytical databases or data warehouses.
- Intermediate understanding of data modeling concepts (star schema, medallion architecture).
- Familiarity with Git-based workflows.
- Understanding of CI/CD principles as applied to data pipelines.
- Excellent communication skills in English (written and spoken).
- Experience with Python for data processing or orchestration tasks.
It is a strong plus if you have:
- Exposure to cloud platforms (AWS, Azure, GCP).
- Hands-on experience with Dataiku for data preparation and analytics workflows.
- Experience with Power BI (data modeling, DAX, visualization).
- Experience supporting machine learning models in Dataiku.
- Exposure to Generative AI or LLM-based use cases (e.g., embeddings, prompt engineering).
Language required for the role:
- Fluent English (written and spoken)
Eligibility for the role:
- Only candidates with an existing legal right to work in the European Union will be considered for this role.
#MAKEYourCareerBETTER
Interested? Apply now and include your CV (preferably in English) along with a statement confirming your consent to the processing and storage of your personal data.
Benefits
- ITDS Clubs
- Access to medical insurance
- Meal Card
- Access to Pluralsight & Udemy
- Integration Events