Levy Professionals
(Multiple Openings)
We are looking for…
A driven Data Engineer who can bridge the gap between complex data management and robust software engineering. You are someone who not only executes tasks but brings new perspectives to the table, helping to evolve our large-scale application landscape that processes millions of records.
About the role
As a Data Engineer, you are responsible for the end-to-end development and implementation of data pipelines and streaming-based integrations. You will take a lead role in designing how we consume data from diverse sources, apply complex transformations, and ensure seamless delivery to target systems within the Azure cloud.
You will:
- Develop and perform peer reviews for data solutions deployed in the Azure cloud.
- Master and implement technologies including ADLS Gen2, Databricks, and Apache Kafka.
- Act as a core member of a diverse Agile/Scrum DevOps team, taking full ownership of managing and maintaining mission-critical functionalities.
Responsibilities
- Architecting and maintaining a state-of-the-art Data Lake solution.
- Integrating time-critical fraud detection applications with multiple downstream systems via real-time and streaming interfaces.
- Mentoring junior team members and catering to the high-level data needs of various business consumers.
Who are you?
You are a technical specialist who lives and breathes data engineering. You stay ahead of the curve regarding the latest technology trends and thrive in a hybrid, flexible working environment that rewards proactivity and a “challenge-accepted” mindset.
Experience
- Big Data Mastery: Proven experience with PySpark and Databricks in large-scale environments.
- Programming Excellence: Strong command of Python and SQL for complex data manipulation.
- Cloud Infrastructure: Professional experience with ADLS Gen2 and Azure-based components.
- Agile Leadership: Deep understanding of Scrum/Agile principles and DevOps mentalities.
Profile
- Cloud Ecosystem: Strong exposure to Microsoft Azure (DevOps, Boards, Git, Pipelines) or AWS.
- Data Architecture: Solid understanding of data management, including data retention, quality control, metadata management, and data lineage.
- Automation: Proficiency in, or a strong drive to implement, workflow automation via Azure Data Factory, Databricks Workflows, or Apache Airflow.
- Soft Skills: Professional English proficiency; a persuasive communicator who can align stakeholders and lead technical discussions.
About Levy Professionals
Since 2000, we have provided professional solutions to organizations ranging from tech start-ups to global players. From our offices in Amsterdam and London, we have built an international and local network of skilled employed professionals and contractors, fuelled by our passion for connecting skills with projects.
Over the years, we have fulfilled over 1,700 requirements, and today we consistently have 250+ professionals, recruited and relocated from 14 countries, allocated to various projects. Our strength lies in the way we see and treat people, and this will remain a key factor in our strategy for years to come.