
Senior Analytics Engineer (DBT Core & Data Modeling)

  • Remote.AR
  • Remote.Brazil
  • Remote.Bulgaria
  • Remote.Colombia
  • Remote.Georgia
  • Remote.Poland
  • Rosario
  • Belgrade
  • Varna
  • Warsaw
  • Wrocław
  • Yerevan
  • Cluj-Napoca
  • Kraków
  • Larnaca
  • Łódź
  • Lublin
  • Montevideo
  • Monterrey
  • Riga
  • Sofia
  • Tbilisi
Hot position | Small team (1-10 people)

If you learned about this vacancy from our recruiters, please read our Personal Data Privacy Policy.

Client

Our client is a U.S.-based healthcare marketplace focused on connecting providers with patients to improve access to medical services. The company fosters a strong engineering culture and values proactive individuals who are eager to explore new technologies and implement ready-made solutions.

We hire you into the company, not onto a project

Position overview

The Data Analytics Engineer will be responsible for understanding business objectives, assessing the current data landscape, and writing SQL transformations in DBT to turn raw data into comprehensive Looker dashboards.

Effective collaboration requires at least a 3-hour overlap with Eastern Standard Time (EST).

Responsibilities

  • Build and maintain reliable, scalable data models and pipelines using SQL, DBT, and Dagster
  • Support self-service analytics by managing tools like Looker and Amplitude, driving adoption and training
  • Partner with analysts, product teams, and engineers to align data infrastructure with business needs
  • Improve data ingestion and integration in collaboration with data engineering
  • Define and promote best practices in data governance, quality, privacy, and security

Requirements

  • Solid experience with DBT Core to build models, run tests, and deploy changes
  • Solid experience with data modeling: Kimball dimensional modeling, slowly changing dimensions (SCDs), and event modeling
  • Expert-level SQL skills for building performant, production-grade transformations
  • Proficiency with a modern BI tool (e.g., Tableau, Power BI, Qlik) is mandatory; hands-on Looker experience is a plus
  • Looker (LookML) for semantic layer design, project architecture, and governance
  • Python scripting for automation, testing, and developer productivity
  • Orchestration and workflow management using Dagster (preferred) or Airflow
  • Experience with Redshift or BigQuery
  • Amplitude for event-based product analytics and user journey modeling
  • Implementation of data governance frameworks and KPI management tools to maintain quality, lineage, and ownership
  • Excellent communication skills for collaborating with cross-functional stakeholders to gather requirements and clarify data context and meaning
  • Use of AI-assisted development tools (e.g., GitHub Copilot, CodeWhisperer) to improve workflow efficiency
  • Strong understanding of data modeling and data warehousing principles
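
To make the slowly-changing-dimension requirement above concrete, here is a minimal pure-Python sketch of the Type 2 SCD pattern: when a tracked attribute changes, the current row is closed out and a new versioned row is opened. The table layout, column names, and helper function are hypothetical illustrations, not part of the client's actual schema (in practice this would typically be a DBT snapshot or merge in the warehouse).

```python
from datetime import date

# Sentinel "end of time" date used for the open (current) row version.
HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dimension, snapshot, today):
    """Apply a source snapshot to a Type 2 SCD table.

    dimension: list of dicts with keys id, value, valid_from, valid_to, current.
    snapshot:  dict mapping id -> latest source value.
    Returns a new list of dimension rows with history preserved.
    """
    out = []
    seen = set()
    for row in dimension:
        rid = row["id"]
        if row["current"] and rid in snapshot and snapshot[rid] != row["value"]:
            # Attribute changed: close the old version, open a new one.
            out.append(dict(row, valid_to=today, current=False))
            out.append({"id": rid, "value": snapshot[rid],
                        "valid_from": today, "valid_to": HIGH_DATE, "current": True})
        else:
            out.append(row)
        seen.add(rid)
    for rid, value in snapshot.items():
        if rid not in seen:
            # Brand-new dimension member: insert its first version.
            out.append({"id": rid, "value": value,
                        "valid_from": today, "valid_to": HIGH_DATE, "current": True})
    return out
```

The same close-and-reopen logic is what DBT's snapshot materialization automates with its `dbt_valid_from` / `dbt_valid_to` columns.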

Nice to have

  • Experience with Snowflake as the primary cloud data warehouse
