Job | Data Engineer Lead | Badenoch + Clark
Posted 27 June 2022


The IT Data Engineer Lead is responsible for designing data flows through ETL procedures and for supporting the technology behind load scheduling and execution planning.

From a governance perspective, he/she will design and implement data governance policies and controls and manage the data dictionary lifecycle.

He/she will be in charge of the data quality area, introducing quality rules for the common data repository, and will manage corporate data and metadata.

He/she will collaborate with the data team to model data for visualization and exploitation purposes, creating a single source of comprehensive, consistent, and high-quality corporate data.

He/she will act as a bridge between the reporting team and external cloud experts, vendors, or cloud architects, preparing cloud platform resources according to use-case or project needs. He/she will also identify use-case requirements for configuring the technical platform's resources.

As a senior expert, he/she will manage an internal resource and a pool of external experts, applying project management practices to track the evolution of data initiatives.

He/she will act as technical product owner of the ETL systems already implemented.

Roles & Responsibilities

•           Implement data flows connecting operational systems, analytics data, and business intelligence systems in a cloud-based architecture.

•           The hire will be responsible for expanding and optimizing our data flows and data pipeline architecture, both legacy and cloud-based, as well as optimizing data flow and collection for cross-functional teams.

•           Build processes supporting data transformation, data structures, metadata, dependency and workload management.

•           Design, re-engineer, document, construct, deploy, and manage the lifecycle of scalable source-to-target mappings.

•           Optimize code and tool configuration so that data flow processes perform optimally in terms of time, performance, and cost.

•           Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

•           Design systems for the collection, storage, and processing of massive data (Big Data), in real-time and batch modes.

•           Participate in and provide input to the data technology platform, and collaborate in the company's technology selection processes.

•           Provide direction for our data engineering platforms and support a legacy and hybrid cloud-based technology architecture.

•           Provide guidance on how to design and orchestrate data flow solutions on the cloud platform, hybrid platform or on-premises systems.

•           Effectively communicate understanding of the data, reporting requests and business needs to proactively research solutions and provide suggestions for improvement.

•           Research industry trends and standard methodologies, review current data flow processes, provide analysis to company staff, and recommend and implement potential improvements.

•           Define and implement data governance controls and ensure compliance with them; lead the tracking of the data dictionary lifecycle; implement metrics to ensure data accuracy and accessibility; and produce and enforce database development standards.

•           Manage the pool of experts and apply project management practices for tracking the evolution of the initiatives.



Education:

•           Degree in computer science, computer engineering, or equivalent experience.


Languages:

•           Spanish – Mother tongue.

•           English: Fluent (B2 minimum, C1 preferred).

Computer Skills & Experience:

•           Minimum of 5 years of professional experience in a related position required.

•           Advanced SQL knowledge and query-authoring experience with relational databases, as well as working familiarity with a variety of databases.

•           Solid working knowledge of ETL tools. Experience with data pipelines / ETL pipelines.

•           Experience working with Azure technologies:

o           Azure Databricks (Programming language: Python / Scala)

o           Azure Data Factory

o           Serverless and cloud technologies: Azure Functions

•           Experience with stream processing tools (Spark Structured Streaming, Azure Stream Analytics, Storm, etc.).

•           Experience with cloud architectures (AWS, Azure, Google Cloud, hybrid cloud) required.

•           Experience with the Microsoft data technologies suite: SSIS, SSRS, SSAS.

•           Experience developing MuleSoft integrations with systems such as Salesforce, Workday, Oracle ERP, and SAP Concur is a plus.

•           Experience in data modelling and database architectures.

•           Experience with relational, NoSQL, cloud, real-time, streaming, data mart, and data lake databases.

•           Experience with data integration between APIs (Application Programming Interfaces) required.

•           Experience with Azure DevOps

•           Familiar with Agile / Scrum methodologies.

•           Demonstrable ability to work independently and in a team setting

•           Technological leadership and IT vision on data platforms

•           Deep knowledge of Big Data architectures.

•           Must be capable of working under tight time constraints and with multiple priorities.

•           Ability to: Plan, organize, configure and document complex data platforms within corporate institutional policies/procedures; communicate technical/complex information both verbally and in writing; establish and maintain cooperation, understanding, trust and credibility; perform multiple tasks concurrently and respond to emergency situations


Desirable:

•           Master's degree in Big Data, data science, or a similar field.

•           Certification in any cloud data solution: Microsoft Azure, Google Cloud, etc.

•           MuleSoft Certified Professional.

•           Experience with data visualization tools: Power BI, QlikView, Reporting Services, etc.

•           Software architecture background

•           Experience with Cloudera, Hortonworks, Stratio, etc.

•           Experience with ETL tools

•           Experience with Agile, Scrum, and DevOps.

•           Highly organized and able to multi-task and prioritize effectively.

•           Excellent verbal, written, listening, and interpersonal skills.

•           Knowledge of programming languages, such as R or Python.

•           Data science or data analytical knowledge will be a plus.

•           Analytical Problem-Solving: Approaching high-level data challenges with a clear eye on what is important; employing the right approach/methods to make the maximum use of time and human resources.

•           Effective Communication: Carefully listening to management, data analysts and relevant staff to come up with the best data design; explaining complex concepts to non-technical colleagues.

•           Expert Management: Effectively directing and advising a team of data modelers, data engineers and database administrators.

•           Industry Knowledge: Understanding the way your chosen industry functions and how data are collected, analyzed, and utilized; maintaining flexibility in the face of big data developments.