Job Description
The Mission
Our client is launching a world-class Data Management & AI Center of Excellence in Prague. We aren't looking for someone to just "move data"—we need a pipeline artisan. As a founding member of this hub, you will build the high-velocity ETL/ELT engines that transform raw global data into a massive strategic advantage.
The Tech Stack & Responsibilities
🔧 Build the Engine – Design, develop, and scale robust ETL/ELT and data pipelining solutions from the ground up.
❄️ Master the Modern Stack – Deep-dive into Snowflake, DBT, and Microsoft SQL Server to architect high-performance data environments.
🐍 Polyglot Engineering – Switch seamlessly between Python and SQL to solve complex processing challenges.
🔌 API Integration – Build sophisticated ingestion layers from global sources like Salesforce, SAP, Google, and Facebook.
⚡ Real-Time & Event-Driven – Drive the transition toward real-time ingestion and event-driven architectures.
🚀 CI/CD & Governance – Deploy clean, maintainable code in an Azure/Cloud environment while ensuring GDPR and SOX compliance are built into the DNA of every pipeline.
Requirements
What We’re Looking For
Technical Versatility: A Master’s degree in Computer Science or Business Intelligence and a "tool-agnostic" mindset. You love getting your hands on the latest tech.
SQL Mastery: You don't just write queries; you optimize complex logic for massive scale.
Architecture Mindset: Experience in Data Modeling and translating business requirements into schemas that Data Scientists dream of.
Engineering Rigor: Familiarity with Git, REST APIs, and Data Profiling within an Agile framework.
Communication: Fluent English is essential. If you also speak French or Portuguese, that’s a major bonus for collaborating with this global team.
Apply for this position