
Data Engineer (DBT, Snowflake), Investment Management Solution

  • Remote, Bulgaria
  • Remote, Georgia
  • Remote, Poland
  • Belgrade
  • Varna
  • Warsaw
  • Wrocław
  • Dnipro
  • Yerevan
  • Kyiv
  • Cluj-Napoca
  • Kraków
  • Larnaca
  • Lviv
  • Łódź
  • Lublin
  • Odesa
  • Riga
  • Sofia
  • Tbilisi
  • Kharkiv
Large team (20+ people) · Hot position

If you received information about this vacancy from our recruiters, please read our Personal Data Privacy Policy.

Client

Our client is one of the world’s top 20 investment companies headquartered in Great Britain, with branch offices in the US, Asia, and Europe.

We hire you into the company, not for a specific project

Project overview

The company’s IT environment is constantly growing, with around 30 programs and more than 60 active projects. They are building a data marketplace that aggregates and analyzes data from multiple sources such as stock exchanges, news feeds, brokers, and internal quantitative systems.

As the company moves to a new data source, the main goal of this project is to create a golden source of data for all downstream systems and applications. The team is performing classic ELT/ETL: transforming raw data from multiple sources (third-party and internal) and creating a single interface for delivering data to downstream applications.
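As a rough illustration of the "golden source" idea described above (all record names and fields are hypothetical), a minimal sketch in Python: raw records from several feeds are merged into one deduplicated view keyed by instrument ID, with the most recently updated record winning.

```python
from datetime import datetime

def build_golden_source(*sources):
    """Merge raw records from multiple feeds into one record per key,
    keeping the most recently updated version (last-write-wins)."""
    golden = {}
    for source in sources:
        for record in source:
            key = record["instrument_id"]
            current = golden.get(key)
            if current is None or record["updated_at"] > current["updated_at"]:
                golden[key] = record
    return golden

# Hypothetical raw feeds (e.g. an exchange and a broker).
exchange_feed = [
    {"instrument_id": "AAPL", "price": 189.5, "updated_at": datetime(2024, 1, 2, 10)},
]
broker_feed = [
    {"instrument_id": "AAPL", "price": 189.7, "updated_at": datetime(2024, 1, 2, 11)},
    {"instrument_id": "MSFT", "price": 402.1, "updated_at": datetime(2024, 1, 2, 9)},
]

golden = build_golden_source(exchange_feed, broker_feed)
```

In the real project this deduplication would happen inside Snowflake via DBT models rather than in application code; the sketch only shows the consolidation logic.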

Position overview

We are looking for a Data Engineer with strong expertise in DBT, Snowflake, and modern data engineering practices. In this role, you will design and implement scalable data models, build robust ETL/ELT pipelines, and ensure high-quality data delivery for critical investment management applications.

Responsibilities

  • Design, build, and deploy DBT Cloud models.
  • Design, build, and deploy Airflow jobs (Astronomer).
  • Identify and test for bugs and bottlenecks in the ELT/ETL solution.
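One common way an Airflow task orchestrates DBT Cloud work is by calling dbt Cloud's "trigger job run" REST endpoint. A minimal sketch of building such a request with the standard library (account ID, job ID, and token below are placeholders, not real values):

```python
import json
import urllib.request

def build_dbt_cloud_trigger_request(account_id, job_id, token, cause):
    """Build (but do not send) the HTTP request that triggers a dbt Cloud job run."""
    url = (f"https://cloud.getdbt.com/api/v2/accounts/"
           f"{account_id}/jobs/{job_id}/run/")
    payload = json.dumps({"cause": cause}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={
            "Authorization": f"Token {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder IDs and token for illustration only.
req = build_dbt_cloud_trigger_request(12345, 67890, "dbt-token", "Triggered by Airflow")
```

In practice an Astronomer-managed deployment would more likely use a provider operator for this, with the request details handled for you.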

Requirements

  • 5+ years of experience in software engineering (Git, CI/CD, shell scripting).
  • 3+ years of experience building scalable and robust Data Platforms (SQL, DWH, Distributed Data Processing).
  • 2+ years of experience developing in DBT Core/Cloud.
  • 2+ years of experience with Snowflake.
  • 2+ years of experience with Airflow.
  • 2+ years of experience with Python.
  • Good spoken English.

Nice to have

  • Proficiency in message queues (Kafka).
  • Experience with cloud services (Azure).
  • CI/CD knowledge (Jenkins, Groovy scripting).
