Data Engineer (DBT, Snowflake), Investment Management Solution

  • Belgrade
  • Wrocław
  • Cluj-Napoca
  • Kraków
  • Dnipro
  • Yerevan
  • Kharkiv
  • Kyiv
  • Larnaca
  • Lviv
  • Łódź
  • Lublin
  • Odesa
  • Remote, Bulgaria
  • Remote, Georgia
  • Remote, Poland
  • Riga
  • Sofia
  • Tbilisi
  • Varna
  • Warsaw
Large team (20+ people) · Hot vacancy

If you received this job offer from one of our recruiters, please read our Privacy Notice.

Client

Our client is one of the world’s top 20 investment companies headquartered in Great Britain, with branch offices in the US, Asia, and Europe.

We invite you to the company, not just a project

Project overview

The company’s IT environment is constantly growing, with around 30 programs and more than 60 active projects. They are building a data marketplace that aggregates and analyzes data from multiple sources such as stock exchanges, news feeds, brokers, and internal quantitative systems.

As the company moves to a new data source, the main goal of this project is to create a golden source of data for all downstream systems and applications. The team is performing classic ELT/ETL: transforming raw data from multiple sources (third-party and internal) and creating a single interface for delivering data to downstream applications.

Position overview

We are looking for a Data Engineer with strong expertise in DBT, Snowflake, and modern data engineering practices. In this role, you will design and implement scalable data models, build robust ETL/ELT pipelines, and ensure high-quality data delivery for critical investment management applications.

Responsibilities

  • Design, build, and deploy DBT Cloud models.
  • Design, build, and deploy Airflow jobs (Astronomer).
  • Identify, test for, and resolve bugs and performance bottlenecks in the ELT/ETL solution.

Requirements

  • 5+ years of experience in software engineering (Git, CI/CD, shell scripting).
  • 3+ years of experience building scalable and robust Data Platforms (SQL, DWH, Distributed Data Processing).
  • 2+ years of experience developing in DBT Core/Cloud.
  • 2+ years of experience with Snowflake.
  • 2+ years of experience with Airflow.
  • 2+ years of experience with Python.
  • Good spoken English.

Nice to have

  • Proficiency in message queues (Kafka).
  • Experience with cloud services (Azure).
  • CI/CD knowledge (Jenkins, Groovy scripting).

We offer

Remote work

We offer great flexibility to work from different cities and countries

Days off to rest

All colleagues have days off to travel, rest, and spend time with their loved ones

National holidays

According to each country's official calendar

Maternity and paternity leave

All colleagues enjoy days off to spend with their baby

Paid certifications

We support the professional development and certification of our colleagues

Internal e-learning platform

Unlimited access to courses and training

Language classes

Virtual English classes with highly qualified teachers

Professional communities

All colleagues can take part in international and regional professional communities based on their interests

The benefits package may vary by region and contract type.