
Software Engineer with Data and AI Engineering

  • Remote.AR
  • Remote.Brazil
  • Remote.Colombia
  • Rosario
  • Montevideo
  • Monterrey
Small team (1–10 people)

If you received information about this vacancy from our recruiters, please read our Personal Data Privacy Policy.

Client

Our client is a leading global corporate and investment bank, providing a wide range of financial services to large corporates, financial institutions, and institutional investors across international markets.

We hire you into the company, not into a project

Project overview

The project involves migrating production data pipelines from legacy environments into a modern, cloud‑native data platform. The new platform enables domain‑oriented data products, scalable analytics, and embedded governance, with AI‑based tools supporting data quality, anomaly detection, privacy, and compliance.

Team

You will be part of a cross-functional team of data engineers, software engineers, and AI specialists working in close collaboration. The team follows an agile delivery model with shared ownership of design, development, and production support.

Position overview

We are looking for Software Engineers with a strong background in data engineering and applied AI engineering. This role is part of a dual-skill track where deep data engineering expertise is essential and hands-on experience with AI enablement is expected.

Candidates from other countries may be considered if they are able to work US East Coast hours.

Technology stack

Python, SQL, NoSQL, Snowflake, Airflow, LLM/RAG, Flask, Streamlit, Azure/AWS/GCP

Responsibilities

  • Design and develop scalable ETL and ELT data pipelines
  • Build and maintain data orchestration workflows using Apache Airflow or similar tools
  • Collaborate with AI engineers to integrate LLMs into data‑driven applications
  • Develop RAG pipelines using embeddings and vector‑based search
  • Optimize Snowflake data models for performance and cost efficiency
  • Contribute to cloud-native application design and deployment
  • Support integration or development of MCP servers where applicable
  • Collaborate closely with product, data, and platform teams

Requirements

  • Mid-level candidates with 4+ years, or Senior candidates with 10+ years, of professional experience in software or data engineering
  • Strong experience building production-grade data pipelines
  • Hands-on experience with Apache Airflow or similar orchestration tools
  • Solid experience with Snowflake, including data modeling and performance tuning
  • Advanced SQL skills and working knowledge of NoSQL databases
  • Strong Python development experience
  • Experience working in cloud environments (Azure, AWS, or GCP)
  • Ability to work US East Coast hours

Nice to have

  • Hands-on experience with large language models
  • Experience with retrieval-augmented generation patterns
  • Experience with embeddings and vector databases
  • Experience using Streamlit or similar tools for GenAI interfaces
  • Exposure to MCP server development or integration
