Radiometer recruitment
Posted 15 January 2026

Senior Data Platform Engineer (m/f/d)

Radiometer Solutions

Location

Kraków, Bronowice

Working time

Full-time

Contract type

Employment contract

Description

Bring more to life.

Are you ready to accelerate your potential and make a real difference within life sciences, diagnostics and biotechnology?

At Radiometer, one of Danaher’s 15+ operating companies, our work saves lives—and we’re all united by a shared commitment to innovate for tangible impact.

You’ll thrive in a culture of belonging where you and your unique viewpoint matter. And by harnessing Danaher’s system of continuous improvement, you help turn ideas into impact – innovating at the speed of life.

At Radiometer, life comes first. Our vision is to improve global healthcare with reliable, fast, and easy patient diagnoses. We’re a team that celebrates diverse ideas and continuous improvement. Here, you’ll find a place to grow and make a real impact, with your unique perspective driving us forward in improving patient care.

Learn about the Danaher Business System, which makes everything possible.

The Senior Data Platform Engineer is a key development role in the Global IT organization responsible for implementing and governing the data architecture that supports both analytics and data platform initiatives.

This position is part of the Data, Analytics, and AI organization and plays a critical role in enabling scalable, secure, and high-performing data solutions across the enterprise.

As a Senior Data Platform Engineer, you will work closely with cross-functional teams including Business Application Specialists, BI Analysts, AI Engineers, Product Owners, Enterprise Architects and Project Managers to define and implement data architecture strategies that align with the company’s strategic objectives, business goals and technical standards.

The position is located in Kraków and is an on-site role.

In this role, you will have the opportunity to:

· Design and implement hub-and-spoke integration frameworks in Python, enabling seamless data movement between operational systems (ERP, CRM) and cloud platforms.

· Build and maintain data pipelines using Dagster for orchestration, dbt for transformation, and Snowflake/Databricks for storage and analytics, with CI/CD via Azure DevOps and DataOps best practices.

· Define and enforce integration, data modelling, and governance standards, ensuring consistency, scalability, security, and compliance with enterprise and regulatory requirements.

· Enable diverse ingestion patterns (incremental and full-refresh) through YAML-based configuration, change tracking, and robust data contracts to support Data Product delivery.

· Design and optimise cloud-native data platforms, including Data Lakes integrating Snowflake, Databricks, and Azure services, while monitoring reliability, cost, and performance using observability and lineage tools.

· Drive technology adoption and modernisation, recommending tools (Snowflake, Databricks, dbt, Dagster, Azure), migrating legacy integrations, and leveraging IaC for infrastructure provisioning and change management.

· Foster AI and data excellence, enabling AI/ML platforms, mentoring engineers and analysts on best practices, and initiating FinOps for cost analysis and visibility into major cost drivers.
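To illustrate the ingestion bullet above, the following is a minimal Python sketch of how a YAML-driven configuration might select between incremental and full-refresh extraction. All table names, field names, and the config shape are hypothetical illustrations, not Radiometer's actual setup:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SourceConfig:
    """One entry of a (hypothetical) YAML ingestion config, already parsed."""
    table: str
    load_mode: str                        # "incremental" or "full_refresh"
    cursor_column: Optional[str] = None   # change-tracking column for incremental loads

def build_extract_query(cfg: SourceConfig, last_cursor: Optional[str] = None) -> str:
    """Return the extract SQL implied by the config entry."""
    if cfg.load_mode == "full_refresh":
        return f"SELECT * FROM {cfg.table}"
    if cfg.load_mode == "incremental":
        if cfg.cursor_column is None:
            raise ValueError("incremental loads need a cursor_column")
        # Fall back to a floor watermark on the very first run.
        watermark = last_cursor or "1970-01-01"
        return (f"SELECT * FROM {cfg.table} "
                f"WHERE {cfg.cursor_column} > '{watermark}'")
    raise ValueError(f"unknown load_mode: {cfg.load_mode!r}")

# A parsed YAML entry for an ERP table might look like:
orders = SourceConfig(table="erp.orders", load_mode="incremental",
                      cursor_column="updated_at")
print(build_extract_query(orders, last_cursor="2026-01-14"))
# SELECT * FROM erp.orders WHERE updated_at > '2026-01-14'
```

In practice the watermark would be persisted between runs and the query handed to an orchestrator such as Dagster rather than printed; the point is only that one declarative config entry drives both ingestion patterns.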

The essential requirements of the job include:

· Bachelor’s or Master’s in Computer Science, Engineering, or related field, with 6+ years in data integration, platform engineering, or analytics.

· Proven Python development skills for data integrations and orchestration, plus hands-on experience with hub-and-spoke architectures and Infrastructure as Code (IaC).

· Deep knowledge of Snowflake, Databricks, Azure; strong skills in dbt for modelling, Dagster for orchestration, and Azure DevOps for CI/CD and DataOps.

· Expertise in data warehousing, ETL/ELT, cloud-native engineering, and integrating business systems (ERP, CRM, CPQ) as well as unstructured data (images, text, graphs) into cloud platforms.

· Experience designing Data Lakes integrating Snowflake, Databricks, and Azure; strong understanding of Data Contracts and delivering Data Products.

· Familiarity with ML/AI Ops platforms, ability to enable AI engineers, and develop monitoring solutions for proactive alerts on data workflows.

· Skilled in stakeholder management, balancing short-term deliverables with long-term strategy, and applying FinOps practices for cost analysis and visibility.

It would be a plus if you also have:

· Certifications in Azure, Snowflake, or Databricks.

· Familiarity with big data technologies (Spark, Kafka, Delta Lake).

· Experience with data governance, security, and compliance frameworks.

Join our winning team today. Together, we’ll accelerate the real-life impact of tomorrow’s science and technology. We partner with customers across the globe to help them solve their most complex challenges, architecting solutions that bring the power of science to life.

For more information, visit www.danaher.com.

Application via the advertiser's website

The advertiser accepts applications on its own website. After clicking Apply, you will be redirected there.

Apply now
ID: 1049608792