Madrid, M, ES

Senior Data Engineer (Hybrid; 80-100%; m/f/x/d)

Are you an experienced Senior Data Engineer ready to contribute to one of the fastest-growing digital insurance platforms in the market? If yes, then we have an outstanding opportunity for YOU. Buckle up and read on.


About the Role


As a Senior Data Engineer, you will help us maintain and improve our existing data ingestion into the data lake and build up a data mesh in Tangram.


Your key responsibilities include:

  • Infrastructure Development and Optimization: Design, build, and maintain the infrastructure and baseline services required for implementing a data mesh architecture. Optimize existing infrastructure to ensure optimal performance, reliability, and cost-efficiency.

  • Data Architecture Transition: Lead the transition from classical data lake architecture to a data mesh, ensuring a smooth migration and minimizing data loss or downtime. Establish domain-driven decentralized data architectures, creating a robust and scalable data ecosystem.

  • Real-time Data Streaming Management: Design, implement, and manage real-time data streaming pipelines using Apache Kafka or other relevant technologies. Ensure the reliability, efficiency, and accuracy of real-time data pipelines.

  • Backend Development: Work closely with backend developers to ensure the seamless integration of data services with Java and SpringBoot applications. Ensure data availability and consistency across microservices and the broader application ecosystem.

  • Database Management: Administer and optimize the performance of AWS RDS/PostgreSQL, AWS DocumentDB/MongoDB databases. Ensure data integrity, availability, and security across all database systems.

  • CI/CD Pipeline Management: Enhance existing CI/CD pipelines to accommodate new data services, ensuring a smooth deployment and management process.

  • Data Governance and Compliance: Establish and maintain data governance practices ensuring data quality, data lineage, and compliance with GDPR or other regional data protection laws.

  • Team Collaboration and Mentoring: Collaborate with cross-functional teams to drive the successful execution of data projects. Mentor junior data engineers and other team members, promoting a culture of continuous learning and improvement.

  • Performance Monitoring and Troubleshooting: Monitor system performance, identify issues, and implement necessary optimizations to ensure optimal performance. Debug and troubleshoot data-related issues, ensuring the stability and reliability of the data infrastructure.

  • Emerging Technology Exploration: Stay updated with the latest industry trends and technologies, evaluating and recommending new tools and technologies that can enhance the platform's capabilities.

  • Documentation and Knowledge Sharing: Document system architectures, data models, and processes, ensuring that knowledge is shared across the organization. Facilitate training and knowledge sharing sessions to champion a deeper understanding of data infrastructure among the team.


Our Technology Stack:


Our cloud-native, multi-tenant SaaS platform operates on AWS, embodying a robust and scalable architecture. Here are the key technologies we use:


  • Backend: Utilizing Kotlin, Java, and SpringBoot, orchestrated via Docker and Kubernetes. Real-time data streaming is managed through Apache Kafka.

  • Database Management: We employ MongoDB and PostgreSQL for a balanced approach to NoSQL and relational data handling.

  • Frontend: Our user interfaces are built with ReactJS and TypeScript, ensuring a responsive and intuitive user experience.

  • Continuous Delivery: Our pipeline is streamlined with GitLab, facilitating a consistent and reliable delivery process.

  • Monitoring and Logging: Employing a suite comprising Prometheus, ElasticSearch, Kibana, Grafana and AWS X-Ray for comprehensive system monitoring and log management.

  • Data Lake: Built on AWS S3 and AWS Aurora, with AWS Lambda and DTB handling most data transformations, ensuring a well-structured and manageable data ecosystem.

  • Reporting: Tableau serves as our primary tool for data visualization and reporting, aiding in data-driven decision-making.

  • Data Mesh Initiative: We are in the preliminary stages of implementing a Data Mesh architecture to enhance data discoverability and decentralization.

  • AI & Real-time Analytics Exploration: Initiating investigations into AI and real-time analytics to potentially enhance our platform's capabilities.


About the Team


We are a diverse, international team of highly motivated individuals with a strong team spirit. We believe in mutual support to help each other succeed.


About You


If you are an experienced Data Engineer who is passionate about quality and scalability, ready to take ownership, and eager to expand your horizons in all aspects of data & software engineering, we'll be more than happy to meet you!


Essential Skills:

  • AWS Proficiency: In-depth experience with AWS services including EKS, MSK, S3, RDS, and DocumentDB, as they are core to our existing tech stack.

  • Data Lake and Data Mesh Experience: Demonstrated experience in transitioning from traditional data lake architectures to a data mesh is key, along with an understanding of the principles of domain-driven decentralized data architecture and the ability to implement baseline services for a data mesh.

  • Real-time Data Streaming: Practical experience with Apache Kafka, or similar real-time data streaming technologies, with the capability to set up, manage, and optimize streaming data pipelines.

  • Backend Development: Proficiency in Java and SpringBoot, as this is the backbone of our platform.

  • Infrastructure as Code (IaC): Experience with IaC tools such as Terraform and Terragrunt to automate the deployment and management of infrastructure.

  • Database Management: Strong understanding and experience in both relational and NoSQL databases, including performance tuning, schema design, and query optimization.

  • CI/CD: Experience with continuous integration and continuous deployment tools and practices to maintain a high standard of delivery quality.


Bonus Skills:

  • AI/ML Experience: Any experience with AI/ML for data analysis or predictive modelling is a plus, as it may come in handy when dealing with vast amounts of data or real-time analytics.

  • Project Management: Experience with Agile or Scrum methodologies and the ability to manage projects and coordinate with different teams.

  • Data Governance and Compliance: Knowledge of data governance practices, including data quality, data lineage, and data privacy, especially in regard to GDPR or other regional data protection laws.

  • Data Visualization Tools: Familiarity with data visualization tools like Tableau, Power BI, or similar to present data insights in a relevant way.

  • Industry Certifications: Certifications like AWS Certified Big Data Specialty or Certified Data Management Professional.


What We Offer:


  • A supportive and collaborative team environment

  • Opportunity to work on innovative technologies

  • Professional growth and learning opportunities


If you are looking for a breathtaking opportunity to grow your career in a dynamic environment as a Senior Data Engineer, we encourage you to apply. Together, let’s build and innovate!


We provide feedback to all candidates via email. If you have not heard back from us, please check your spam folder.




About iptiQ


At iptiQ, we partner with established brands around the world to create impactful digital insurance solutions. We make it easier for consumers to buy the insurance they need from the brands they trust. We provide life and non-life insurance through our end-to-end digital platform and build multi-channel customer experiences.

What we offer our employees is outstanding. Hybrid working in offices across the world, phenomenal learning & career opportunities and a culture that encourages new perspectives to challenge conventions and come up with innovative solutions. We believe in the power of inclusion. Drawing on our employees' broad range of perspectives, life experiences and backgrounds stimulates creativity and gives us a competitive edge. iptiQ embraces a workplace where everyone has equal opportunities to thrive regardless of their age, gender, gender identity and/or expression, sexual orientation, race, ethnicity, religion, physical or mental ability, or other characteristic and can be their authentic self. Ignite your curiosity to shape digital insurance.

iptiQ is part of Swiss Re, one of the world’s leading providers of re/insurance and risk transfer solutions. This means we’re backed by Swiss Re’s capital strength and more than 150 years of risk knowledge. Through our partnerships, we contribute to Swiss Re’s vision to make the world more resilient.

During the recruitment process, reasonable accommodations for special needs are available upon request. If contacted for an interview, please inform the Recruiter/HR Professional of the accommodation needed.



Reference Code: 127107 

