Data Engineer
PRODYNA (Schweiz) AG
Job description
Join PRODYNA, an expert in innovative software solutions. A dynamic working environment awaits you.
Tasks
• Design modern data platforms for large-scale data integration.
• Plan and implement backend services for data utilization.
• Optimize pipelines and ensure their integration into the cloud.
Skills
• Experience in software development and data-driven projects.
• Proficiency in SQL and a programming language such as Python.
• Knowledge of data modeling methodologies.
Data Engineer
At PRODYNA we design, implement, and operate custom software applications for mid- to large-sized enterprises. We're committed to offering our customers innovative and future-proof solutions through digitalization and cloud computing strategies. As a member of the Cloud Native Computing Foundation (CNCF), we promote speed, agility, and scalability in software development.
Your tasks
As a Data Engineer, you'll design and conceptualize modern data platforms capable of integrating and processing large volumes of data from systems at scale. Your responsibilities will include:
• Designing and conceptualizing modern data platforms for large-scale data integration and processing.
• Planning and implementing backend services using modern frameworks for data provisioning and utilization.
• Independently optimizing pipelines and integrating them into cloud platforms.
• Implementing tools and solutions to ensure data quality, data cleansing, and provision of aggregated data for analysis.
• Gathering and specifying requirements from our clients.
Your profile
• Several years of experience in software development, particularly in data-driven projects, data warehousing, or data platform construction.
• Knowledge in pipeline development and data integration (ELT/ETL, Azure Data Factory, DBT, Stored Procedures, or similar tools).
• Familiarity with data modeling and experience with Data Vault, Data Mesh, Kimball, or Inmon methodologies.
• Proficiency in SQL and at least one programming language (e.g., Python, Scala).
• Experience with cloud platforms (e.g., Azure, AWS, or GCP).
• Strong analytical and problem-solving skills.
Nice to have
• Experience with Microsoft Fabric or understanding of the Fabric ecosystem.
• Knowledge of Delta Lake, Lakehouse architecture, and data mesh principles.
• Familiarity with CI/CD for data pipelines.
• Exposure to data cataloging tools (e.g., Purview, Alation).
• Understanding of privacy regulations (e.g., GDPR, CCPA).
Your benefits
• Employee education
• 25 vacation days
• Free choice of hardware
• Private health insurance
• Health support and wellbeing
• Team events
• International network
• Fruit in the office
• Employee referral program