
GCP Data Architect

  • Belgrade
  • Cluj-Napoca
  • Kraków
  • Larnaca
  • Łódź
  • Lublin
  • Monterrey
  • Montevideo
  • Remote.AR
  • Remote.Brazil
  • Remote.Bulgaria
  • Remote.Colombia
  • Remote.Poland
  • Rosario
  • Riga
  • Sofia
  • Varna
  • Warsaw
  • Wrocław
Large team (20+ people) · Hot offer

If you received this job offer from our recruiters, please review our Privacy Policy.

Position overview

We are looking for an experienced GCP Data Architect to design, build, and optimize cloud‑native data platforms and advanced analytics solutions for various clients. In this role, you will work directly with client stakeholders, lead architectural decisions, define data models, and ensure scalability and reliability across the full data lifecycle.

This position requires deep, hands‑on expertise in the Google Cloud ecosystem and modern data engineering practices. Most of your time will be spent assessing client infrastructures, recommending improvements, and planning and delivering migrations and modernizations within GCP.

Technology stack

TBU

Responsibilities

  • Design end‑to‑end data architecture solutions on Google Cloud Platform.
  • Develop scalable data pipelines and data processing frameworks (batch and streaming).
  • Architect data models for analytics, machine learning, and operational reporting.
  • Define data governance, data quality, metadata management, and security standards.
  • Lead migration efforts from on‑prem or other cloud platforms to GCP.
  • Collaborate with cross‑functional teams (data engineers, analysts, ML teams, product owners).
  • Optimize data storage and processing for performance, reliability, and cost efficiency.
  • Evaluate and integrate third‑party tools into the GCP data ecosystem.
  • Participate in architectural reviews, provide best practices, and mentor teams.
  • Produce documentation, diagrams, and technical specifications.

Requirements

  • 5+ years of experience in Data Architecture, Data Engineering, or similar roles.
  • Strong hands‑on expertise with Google Cloud Platform, including BigQuery, Dataflow/Apache Beam, Cloud Composer/Airflow, Dataplex, Pub/Sub, and Cloud Storage.
  • Strong SQL and data modeling skills (3NF, dimensional modeling, data vault is a plus).
  • Experience designing large‑scale distributed data processing systems.
  • Strong understanding of ETL/ELT patterns and streaming data architectures.
  • Familiarity with CI/CD, IaC (Terraform), and automated deployment practices.
  • Experience with programming languages such as Python or Java/Scala.
  • Understanding of data governance, lineage, cataloging, and security (IAM).
  • Experience with ML pipelines or MLOps is a plus but not required.
  • Excellent communication skills and ability to work with non‑technical stakeholders.
  • GCP certifications, or the readiness and ability to obtain them quickly.

Nice to have

  • Deep knowledge of another modern data ecosystem (AWS, Azure, Databricks, Snowflake), and/or related certifications, is a plus.


What we offer

Days off

In accordance with local law

We care about your health

We provide a wide range of services under private medical insurance

Paid sick leave

In accordance with local law

Vacation and special days off

According to the official calendar, regardless of the client's calendar

Comfortable working conditions

Flexible working hours and help with setting up a comfortable workspace

Internal learning platform

Access to professional courses and training

Internal English language courses

Company-run training with highly qualified teachers
