

Senior Data Engineer (m/f/d)
Location
Kraków, Bronowice
Working hours
Full-time
Contract type
Employment contract
Description
Bring more to life.
Are you ready to accelerate your potential and make a real difference within life sciences, diagnostics and biotechnology?
At Radiometer, one of Danaher’s 15+ operating companies, our work saves lives—and we’re all united by a shared commitment to innovate for tangible impact.
You’ll thrive in a culture of belonging where you and your unique viewpoint matter. And by harnessing Danaher’s system of continuous improvement, you help turn ideas into impact – innovating at the speed of life.
At Radiometer, life comes first. Our vision is to improve global healthcare with reliable, fast, and easy patient diagnoses. We’re a team that celebrates diverse ideas and continuous improvement. Here, you’ll find a place to grow and make a real impact, with your unique perspective driving us forward in improving patient care.
Learn about the Danaher Business System which makes everything possible.
The Senior Data Platform Engineer is a key development role in the Global IT organization, responsible for implementing and governing the data architecture that supports both analytics and data platform initiatives.
This position is part of the Data, Analytics, and AI organization and plays a critical role in enabling scalable, secure, and high-performing data solutions across the enterprise. You will work closely with business stakeholders, BI analysts, AI engineers, product owners, and application specialists to translate business needs—especially in finance, sales, supply chain, and operations—into innovative data models, analytics products, and intelligent applications. The position is located in Kraków.
In this role, you will have the opportunity to:
Lead the development of robust data models and analytics products that deliver actionable insights and predictive capabilities for business domains such as finance, sales, supply chain, and operations.
Design and implement advanced analytics workflows, enabling business users to make data-driven decisions.
Build end-to-end data pipelines for ingestion, transformation, and delivery, integrating diverse data sources (ERP, CRM, unstructured data, external APIs).
Collaborate with AI engineers to develop, deploy, and maintain machine learning models and intelligent automation within business applications.
Develop APIs, microservices, and application components that integrate analytics and AI into operational systems and user-facing products.
Champion best practices in analytics engineering, data modelling, and application development, mentoring junior team members.
Ensure data quality, security, and compliance throughout the application lifecycle.
Drive technology adoption, recommending new tools and frameworks for AI, analytics, and application development.
Engage with stakeholders to understand business requirements and deliver solutions that create tangible impact.
Essential requirements:
Bachelor’s or Master’s in Computer Science, Engineering, or related field, with 6+ years in data engineering, analytics, or application development.
Proven experience designing and delivering data models and analytics products for business domains such as finance, sales, supply chain, and operations.
Strong Python development skills, including experience with frameworks for data processing, machine learning, and application development (e.g., FastAPI, Flask, PySpark, TensorFlow, scikit-learn).
Hands-on experience with cloud data platforms (Snowflake, Databricks, Azure), and modern data pipeline tools (dbt, Dagster, Airflow).
Expertise in data modelling, ETL/ELT, and integrating business systems (ERP, CRM, CPQ) as well as unstructured data (images, text, graphs).
Experience deploying and maintaining machine learning models in production environments.
Skilled in stakeholder management, balancing short-term deliverables with long-term strategy.
Familiarity with DevOps, CI/CD, and agile development practices.
Preferred qualifications:
Certifications in Azure, Snowflake, Databricks, or relevant AI/ML technologies.
Experience with big data technologies (Spark, Kafka, Delta Lake).
Knowledge of data governance, security, and compliance frameworks.
Experience developing user-facing applications or dashboards (e.g., Streamlit, Dash, Power BI).
Join our winning team today. Together, we’ll accelerate the real-life impact of tomorrow’s science and technology. We partner with customers across the globe to help them solve their most complex challenges, architecting solutions that bring the power of science to life.
For more information, visit www.danaher.com.