
Data Quality Engineer, Data Reliability Platform

  • Belgrade
  • Wrocław
  • Cluj-Napoca
  • Kraków
  • Yerevan
  • Larnaca
  • Łódź
  • Lublin
  • Remote.Bulgaria
  • Remote.Georgia
  • Remote.Poland
  • Riga
  • Sofia
  • Tbilisi
  • Varna
  • Warsaw
Medium-sized team (10-20 people) · Hot vacancy

If you have received this job offer from one of our recruiters, please read our Privacy Notice.

Client

Our client is a leading global professional education organization focused on advancing investment knowledge and credentialing standards.

We invite you to join the company, not just a project

Team

You will join a cross-functional team of data engineers, analytics specialists, and quality experts. The team collaborates closely, communicates frequently, and works in an environment where data correctness and reliability are key priorities. Work is structured and iterative, with regular stakeholder interactions.

Position overview

You will work on a platform focused on improving data accuracy, trust, and reliability across analytical and operational environments. The project includes implementing monitoring rules, building automated validation workflows, and supporting data teams with insights that help maintain data quality at scale. The platform is actively evolving and provides space for ownership and continuous improvement.

Technology stack

Monte Carlo
Soda
Great Expectations
SQL
Python
Cloud data platforms
Data pipeline architectures
Metadata and data governance tooling

Responsibilities

  • Design and implement data quality monitoring rules using Monte Carlo, Soda, Great Expectations, or equivalent tools
  • Perform data profiling to assess completeness, consistency, and accuracy
  • Collaborate with data engineering and analytics teams to define data quality needs
  • Develop automated checks and alerts as part of data pipelines
  • Support remediation workflows and participate in root cause analysis
  • Document rules, processes, and metrics supporting data quality governance
  • Contribute to continuous improvement initiatives and propose enhancements
  • Communicate findings and recommendations to technical and non-technical stakeholders
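To give a feel for the kind of work the bullets above describe, here is a minimal, tool-agnostic sketch of a completeness check. In practice such rules would be expressed in Monte Carlo, Soda, or Great Expectations rather than hand-rolled; the field names and threshold below are purely illustrative assumptions.

```python
# Illustrative sketch only: a hand-rolled completeness rule of the kind
# that data quality tools (Monte Carlo, Soda, Great Expectations) express
# declaratively. Field names ("member_id", "email") are hypothetical.

def profile_completeness(rows, required_fields):
    """Return the fraction of non-null, non-empty values per required field."""
    if not rows:
        return {field: 0.0 for field in required_fields}
    stats = {}
    for field in required_fields:
        non_null = sum(1 for row in rows if row.get(field) not in (None, ""))
        stats[field] = non_null / len(rows)
    return stats

def check_completeness(rows, required_fields, threshold=0.95):
    """Flag fields whose completeness falls below the threshold,
    mimicking an automated rule that could raise a pipeline alert."""
    stats = profile_completeness(rows, required_fields)
    return {f: ratio for f, ratio in stats.items() if ratio < threshold}

records = [
    {"member_id": 1, "email": "a@example.com"},
    {"member_id": 2, "email": None},
    {"member_id": 3, "email": "c@example.com"},
]
failing = check_completeness(records, ["member_id", "email"])
print(failing)  # "email" is only ~67% complete, so it is flagged
```

A real monitoring setup would run rules like this on a schedule inside the pipeline and route failing fields to an alerting channel for remediation.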

Requirements

  • Experience working with data quality tools such as Monte Carlo, Soda, or Great Expectations
  • Hands on background in data profiling, validation, and monitoring
  • Experience using SQL for analysis and data validation tasks
  • Familiarity with Python or similar scripting languages for automation
  • Understanding of data pipeline principles and cloud data platforms
  • Knowledge of data governance concepts and quality frameworks
  • Clear communication skills and the ability to collaborate in a distributed team


We offer

Remote work

We offer great flexibility to work from different cities and countries

Days off to rest

All colleagues have days off to travel, rest, and spend time with their loved ones

National holidays

According to the official calendar of each country

Maternity and paternity days off

All colleagues enjoy days off to spend with their baby

Paid certifications

We support the professional development and certification of our colleagues

Internal e-learning platform

Unlimited access to courses and training

Language classes

Virtual English classes with highly qualified teachers

Professional communities

All colleagues can take part in international and regional professional communities based on their interests

The benefits package may vary by region and contract type.