Who you are
• Expert PHP (Laravel): deep track record in designing and building complex e-commerce middleware.
• Architectural design: proven ability to document and implement scalable integration patterns for enterprise systems (ERP, PIM, PSP).
• AWS & infrastructure: expertise in core services (EC2, ECS/EKS, SQS, Kafka) and Infrastructure as Code (Terraform).
YOUR RESPONSIBILITIES
• Development of backend services in C#/.NET, including database logic on MS SQL Server (T-SQL, performance tuning, indexing)
• Design of architecture and interfaces (e.g., REST/gRPC) following Clean Architecture/SOLID and domain-driven design; reviews, refactoring, and technical debt management are part of the role
• Quality assurance through automated tests (unit/integration), code reviews, and static analysis; end-to-end tests for critical warehouse processes
• Observability and operational reliability: logging/tracing/metrics (OpenTelemetry), feature toggles, error budgets, and post-mortems
• Security by design following OWASP ASVS (e.g.
Responsibilities:
• Develop and continuously enhance the OneCompiler toolchain, including the MLIR/IREE front end, intermediate representation (IR) transformations, and backend code generation
• Integrate compiler components into software products such as eIQ and ML SDKs
• Design optimization passes and hardware-specific lowering flows for efficient model execution
• Contribute to exploratory compiler initiatives such as training-graph ingestion, auto-tuning, heterogeneous scheduling, and runtime bindings
• Build and maintain model ingestion pipelines for PyTorch, ONNX, TensorFlow, and TFLite
• Collaborate closely with internal engineering teams and external partners to drive innovation across the compiler ecosystem
Requirements:
• Strong background in compiler design and modern ML compiler frameworks (e.g., MLIR/LLVM, TVM, IREE, XLA)
• Familiarity with model export workflows for PyTorch, ONNX, and TensorFlow
• Deep understanding of AI/ML models, quantization techniques, and hardware-aware optimizations
• Proficiency in C++ and Python, with experience implementing compiler passes or device backends
• Experience working with embedded or heterogeneous compute architectures (e.g., Cortex CPUs, NPUs, DSPs)
• Hands-on experience with IREE AOT compilation or its runtime is beneficial
• Experience developing new MLIR dialects is beneficial
Benefits: fascinating, innovative environment in an international atmosphere
Your contact: reference number 860059/1, phone: +49 621 1788-4297, e-mail: positionen@hays.de. Employment type: freelance for a project.
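To make "implementing compiler passes" concrete, here is a minimal, language-agnostic sketch of a constant-folding pass over a toy expression IR. This is an illustration only, not MLIR itself (a real OneCompiler pass would use the MLIR/LLVM pass APIs); the tuple-based IR is an invented stand-in.

```python
# Toy IR: nested tuples ("add", lhs, rhs) / ("mul", lhs, rhs) with int leaves;
# strings stand in for unknown runtime values. A constant-folding pass rewrites
# any subtree whose operands are all constants into a single constant.

def fold(node):
    """Recursively fold constant subexpressions; returns a simplified node."""
    if not isinstance(node, tuple):    # leaf: constant int or symbolic name
        return node
    op, lhs, rhs = node
    lhs, rhs = fold(lhs), fold(rhs)    # fold operands first (post-order walk)
    if isinstance(lhs, int) and isinstance(rhs, int):
        return lhs + rhs if op == "add" else lhs * rhs
    return (op, lhs, rhs)              # leave non-constant subtrees intact

ir = ("add", ("mul", 2, 3), ("add", "x", ("mul", 4, 5)))
print(fold(ir))  # ('add', 6, ('add', 'x', 20))
```

The same post-order structure (simplify operands, then try to simplify the parent) underlies real folding passes; production compilers add overflow semantics, type checks, and pattern-rewrite infrastructure on top.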
Location: Mobile work (Germany-wide)
What you'll do
• Empower customers: identify, design, and implement data- and AI-driven use cases for external clients across diverse industries.
• Build the backbone: collaborate with other Data Engineers to enhance and integrate solutions into existing data and platform architectures.
Responsibilities:
• Responsible for Windows Server systems administration and large-scale infrastructure operations
• Design, modernization, and build-out of Microsoft server infrastructures
• Operation and monitoring of Microsoft environments
• Development of automation scripts and tools using PowerShell
• Contribution to development tasks in C++ and C#
• Maintenance, development, and troubleshooting of SQL databases
• Preparation of documentation describing experience, modernization concepts, and technical approaches
• Execution of hands-on operational, engineering, and development tasks within Windows Server and network components
Requirements:
• Proven experience in Microsoft Windows Server administration
• Strong knowledge of network operations in Microsoft environments
• Proficiency in PowerShell-based automation
• Development experience in C++ and C#
• Solid SQL skills for infrastructure-related development tasks
• Experience designing and operating large Microsoft infrastructures
• Experience in operating system development
Responsibilities:
• Development and maintenance of modern user interfaces for the software platform
• Implementation of customer requirements and handling of tickets in day-to-day operations
• Optimization of existing frontend components with regard to performance and usability
• Design and implementation of new features for various application areas
• Ensuring a consistent and intuitive user experience across all applications
• Close collaboration with backend developers to integrate interfaces and data flows
• Support in the strategic further development of the platform and introduction of new technologies
Requirements:
• 3 years of experience in frontend development
• Experience with Vue.js, React, or Angular is desirable
• Fluent English skills as well as good German skills are preferred
Benefits: 30 vacation days, flexible working hours, home office option
Salary information: budget of up to €80,000
Your contact: Silva Celine Rehberg, reference number 854809/1, e-mail: silva.rehberg@hays.de. Employment type: permanent employment with our client.
Give it a try and learn what the market has to offer – our services are free of charge, non-binding, and discreet! We look forward to hearing from you.
Responsibilities:
• Design and implement a SQL-based landing zone for regulatory data
• Develop stored procedures for transformation, enrichment, and aggregation
• Build and operate high-volume batch processing chains for monthly/quarterly cycles
• Implement SSIS-based ingestion flows and job orchestration
• Ensure data quality, technical lineage, and full traceability across layers
• Define and document integration patterns and mapping logic between landing-zone datasets and Tagetik-based reporting templates
• Perform operational monitoring, troubleshooting, and performance optimization
Requirements:
• Strong expertise in Microsoft SQL Server and T-SQL
• Hands-on experience with stored-procedure-driven ETL and complex data models
• Solid SSIS skills for orchestration and control of processing chains
• Experience with batch processing, logging, restartability, and performance tuning
• Knowledge of data lineage, reconciliation, and regulatory processing needs
• Experience with reporting platforms such as Tagetik is a plus
• Familiarity with Oracle source systems is advantageous
Benefits: renowned client, remote option
Your contact: Eliška Stejskalová, reference number 862801/1, e-mail: eliska.stejskalova@hays.at. Employment type: freelance for a project.
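The "restartability" requirement above has a standard shape: each batch step records its completion in a control table, so a rerun after a failure skips already-finished steps. A minimal sketch (using SQLite and invented table/step names for illustration; the actual project would use SQL Server and SSIS job control):

```python
# Restartable batch step: completion is logged in a control table, so a rerun
# skips steps that already succeeded. Table and step names are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE batch_log (step TEXT PRIMARY KEY, status TEXT)")

def run_step(step_name, work):
    """Run `work` unless this step already completed in a previous attempt."""
    done = con.execute(
        "SELECT 1 FROM batch_log WHERE step = ? AND status = 'OK'",
        (step_name,)).fetchone()
    if done:                      # restart case: skip completed work
        return "skipped"
    work()                       # the actual transformation / load
    con.execute("INSERT OR REPLACE INTO batch_log VALUES (?, 'OK')",
                (step_name,))
    return "ran"

calls = []
print(run_step("load_landing", lambda: calls.append("load")))  # ran
print(run_step("load_landing", lambda: calls.append("load")))  # skipped
```

The key design point is that the completion marker is written in the same transactional store as the data, so a crash between "work done" and "marker written" at worst repeats one idempotent step.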
Responsibilities:
• Manage and refine business and technical requirements in collaboration with stakeholders
• Coordinate data integration activities with various source systems
• Design and model data structures within a Data Warehouse environment, with a strong focus on the Data Vault methodology
• Develop and optimize data pipelines using SQL and Python
• Work with tools like Databricks and dbt to build scalable data transformation workflows
• Ensure data quality, consistency, and compliance, especially within banking-related use cases
Requirements:
• Experience in requirements management
• Experience in coordination with source systems
• Experience with data modeling in a Data Warehouse environment, with a focus on Data Vault
• Good German and English language skills
• Databricks experience is nice to have
• Experience with dbt (data build tool) is an advantage
• Experience with SQL (as a query language) and Python is an advantage
• Banking experience is an advantage
Benefits: renowned client, remote work
Your contact: Florian Pracher, reference number 863466/1, e-mail: florian.pracher@hays.at. Employment type: freelance for a project.
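A core mechanic behind the Data Vault modeling mentioned above is the hash key: business keys are normalized, concatenated with a delimiter, and hashed, so hubs and links get a deterministic surrogate key regardless of source formatting. A minimal sketch (MD5 and the `||` delimiter are common Data Vault conventions; the example key values are invented):

```python
# Data Vault style hash key: normalize business keys, join with a delimiter,
# hash the result. The same business key always yields the same hub key.
import hashlib

def hash_key(*business_keys, delimiter="||"):
    normalized = delimiter.join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Same business key, different source formatting -> same hub hash key
assert hash_key("  de-4711 ") == hash_key("DE-4711")
print(hash_key("DE-4711"))
```

The normalization step (trim, case-fold) is what makes loads from differently-formatted source systems land on the same hub row; multi-part keys for links simply pass several arguments.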
Responsibilities:
• Develop, maintain, and optimize data pipelines and ETL/ELT processes using Databricks
• Implement version control workflows and collaborate using GitHub
• Build and maintain CI/CD pipelines with GitHub Actions
• Design and implement scalable data transformations using Python/PySpark
• Write efficient and reliable SQL queries for data processing and analytics
Requirements:
• Strong hands-on experience with Databricks and strong SQL skills
• Proficiency with GitHub for version control and collaboration
• Experience building CI/CD pipelines, ideally with GitHub Actions, and solid knowledge of Python/PySpark
• Experience with Microsoft Azure and knowledge of Data Vault data modeling is an advantage
• Experience with Kafka or other streaming technologies is an advantage
• Understanding of Unity Catalog for data governance is an advantage
• Experience with Splunk for monitoring and troubleshooting is an advantage
Benefits: renowned client, remote work possible
Your contact: Florian Pracher, reference number 863468/1, e-mail: florian.pracher@hays.at. Employment type: freelance for a project.
Responsibilities:
• Development and implementation of pre- and post-trade automated controls, as well as monitoring systems for automated trading, in alignment with the product team
• Develop the frontend interface, allowing for more customization, data streaming, and more use cases (frontend development)
• Develop and integrate Python and Java applications with Azure services such as Azure Kubernetes Service, Postgres, Redis, Kafka, Azure Machine Learning, and Azure DevOps (backend development)
• Collaborate with architects and other team members to design and implement scalable, efficient, and secure software solutions
• Ensure code quality and adherence to software development best practices, including documentation, version control, and code reviews
Tech stack:
• Languages: Python, ReactJS, and Java
• Solutions: Docker, Kubernetes, Kafka
• Databases: PostgreSQL, Redis
• Cloud provider: Microsoft Azure
Benefits: possibility to work remotely, a very renowned company, an international working environment
Your contact: reference number 865059/1, phone: +49 621 1788-4297, e-mail: positionen@hays.de. Employment type: freelance for a project.
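A pre-trade control of the kind described above is, at its core, a deterministic gate that every order passes before routing. A minimal sketch, with invented limits and instrument list (real controls would pull limits from a risk system and also check per-account exposure, rate limits, and price bands):

```python
# Minimal pre-trade control: reject orders that reference an unknown
# instrument or breach a notional limit, before they reach the market.
# Limit values and the allowed-instrument set are illustrative only.
LIMITS = {"max_notional": 1_000_000, "allowed": {"AAPL", "SAP"}}

def pre_trade_check(order):
    """Return (accepted, reason); called synchronously before routing."""
    if order["symbol"] not in LIMITS["allowed"]:
        return False, "unknown instrument"
    notional = order["qty"] * order["price"]
    if notional > LIMITS["max_notional"]:
        return False, f"notional {notional} exceeds limit"
    return True, "ok"

print(pre_trade_check({"symbol": "SAP", "qty": 100, "price": 120.0}))
print(pre_trade_check({"symbol": "SAP", "qty": 100_000, "price": 120.0}))
```

Post-trade controls follow the same pattern but run asynchronously over executed fills, feeding the monitoring systems the posting mentions.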
Working day >> Personal support from our team >> Further training opportunities
As a UX/UI Designer, your focus will be on driving value and usability within our agile team by:
• Design execution: translating validated user needs into clean, functional UI designs, which includes implementing, maintaining, and extending our existing design system to ensure consistency.
• Discovery & validation: championing a discovery-first mindset by constantly questioning assumptions, asking "why" we are building something, and conducting necessary validation to identify and solve the root problem.
• Agile collaboration: working within SAFe and Scrum frameworks to plan, track, and communicate tasks and progress, using tools such as Azure DevOps.
• Team collaboration: collaborating closely with developers, engineers, and other stakeholders to ensure designs align with technical feasibility and business goals.
Your Responsibilities
• Act as the main contact for all AutoCAD-related topics within the internal team
• Communicate continuously with development teams to identify new requirements and features
• Design, introduce, and integrate new functions into the engineering design tool architecture
• Develop and implement new modules and features, including testing and documentation
• Support development and configuration in the AutoCAD environment
• Provide technical support and guidance to users regarding application functionality
Your Qualifications
• Degree in mechanical engineering, computer science, mathematics, electrical engineering, physics, or a comparable field
• Several years of professional experience as an AutoCAD Application Specialist or Developer
• Extensive experience working in agile development teams
• Very good knowledge of Autodesk products, especially AutoCAD and related products
• Experience in developing CAD interfaces (e.g.
Responsibilities:
• Development of new cloud-native web solutions and customer portals
• Continuous improvement of existing software products and adaptation to new requirements/standards
• Quality assurance measures such as code reviews and software tests
• Active participation in the development process of an agile team
Requirements:
• Solid professional experience in the field of frontend software engineering
• Extensive knowledge of web frameworks and web technologies; very strong JavaScript (React, TypeScript) knowledge, HTML, CSS3
• Hands-on experience with integrating design systems into code (Storybook) and visual testing
• Sound experience with cross-browser development, responsive design, web accessibility, and search engine optimization
• RESTful APIs (REST, XML, JSON)
• Good command of English
• Knowledge of relevant development tools and CI/CD pipelines (Jira, Confluence, Git)
• Experience with public cloud (Amazon AWS) and cloud services (CloudFront, Load Balancers, API Gateways, Lambda) is an advantage
• German language skills are an advantage
Benefits: international client, remote option
Your contact: Florian Pracher, reference number 861186/1, e-mail: florian.pracher@hays.at. Employment type: freelance for a project.
MID – Driving Continuous Transformation
For exciting project assignments with our customers, we are looking for experienced consultants with BI expertise. In your position as Senior Consultant Data Warehouse / Senior Data Analyst, varied and challenging tasks await you along the entire software development process in project assignments with our customers.
• You independently align the various requirement specifications with our customers and business departments, and you analyze and design the data preparation processes from the interface through to the KPI.
• You work on the development and build-out of relational and multidimensional databases with the goal of a unified data inventory.
• You optimize existing system solutions in the data warehouse context.
• You shape the architecture and design of the data warehouse and develop modern solutions for a data model.
Your profile:
• You have completed a degree in (business) informatics or another program with an IT and/or computer science focus; alternatively, you have a comparable qualification with corresponding IT affinity, ideally with a connection to controlling.
• You bring at least 5 years of professional experience in agile business intelligence or data warehouse projects and have been involved in implementing GDPR-compliant DWH solutions.
• You have professional experience in designing ETL processes and data structures, especially with very large data volumes, and are confident with SQL.
• Several years of experience in designing ETL/ELT processes and an awareness of DWH architectures set you apart.
• You bring experience in designing interfaces to the DWH and have ideally already designed event-based interfaces with Apache Kafka.
• You convince us with conceptual skills, process-oriented and analytical thinking, and business-fluent German skills.
Your focus is on one of the topic clusters IT demand management, IT architecture, IT service design, IT service transition, or IT service delivery. You operate professionally at the interface between business departments and IT.
Responsibilities:
• Work with consolidation units, FS items, subitems, versions, and the ACDOCU table structure
• Support both periodic and year-to-date reporting activities
• Create and configure GRDC forms using the Manage Forms application
• Maintain and optimize forms to ensure accurate and efficient data collection
• Create and manage packages via the Manage Package app
• Define package steps, assign forms or folders, and configure data-entry context for users
• Design and maintain validation rules, ensuring proper behavior in the Data Monitor
• Use the Reported Data Validation task to resolve reported-data issues
• Manage visual and backend controls to ensure consistent data quality
• Understand GRDC data integration with ACDOCU and Group Reporting
• Work with Data Monitor task sequences such as Calculate Net Income
Requirements:
• Strong understanding of Group Reporting concepts and ACDOCU structure
• Experience with periodic vs. YTD reporting
• Hands-on experience in GRDC form design (Manage Forms)
• Experience with Package Management (Manage Package app)
• Ability to design validation rules and manage controls in the Data Monitor
• Solid understanding of GRDC
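Conceptually, a validation rule of the kind run in a data monitor is a predicate evaluated per reporting unit over the reported figures, returning the units that fail. The sketch below is only an illustration of that idea in Python (GRDC rules are actually configured in the application, not coded; the rule name, fields, and units here are invented):

```python
# Sketch of a validation rule: check one condition over reported rows and
# return the failing units. Rule and field names are illustrative only.
def rule_assets_equal_liab_equity(rows):
    """Balance check: assets must equal liabilities + equity per unit."""
    failures = []
    for r in rows:
        if r["assets"] != r["liabilities"] + r["equity"]:
            failures.append(r["unit"])
    return failures

reported = [
    {"unit": "CU01", "assets": 500, "liabilities": 300, "equity": 200},
    {"unit": "CU02", "assets": 500, "liabilities": 350, "equity": 100},
]
print(rule_assets_equal_liab_equity(reported))  # ['CU02']
```

The monitor's job is then orchestration: run each configured rule, surface failures per unit, and block the package step until the reported data reconciles.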
MID – Driving Continuous Transformation
For exciting project assignments with our customers, we are looking for experienced software developers (m/f/d) with Python and AWS/Kubernetes skills. In your position as a software developer, varied and challenging tasks await you along the entire software development process in project assignments with our customers.
• You independently align the various requirement specifications with our customers, business departments, and external stakeholders.
• You contribute to the further development of platforms for resource provisioning and permission management in Artifactory (Python / Flask).
• You take care of code reviews and make sure no bugs make it into production.
• Maintaining, optimizing, and designing software architectures with Kubernetes / Helm in the context of AWS is also part of your role.
• You can work end to end, from local development through cloud deployment and monitoring.
• You also automate deployment processes with GitLab CI, following the principles of Infrastructure as Code and reproducible builds.
• You are involved in maintaining and developing base components using the Amazon Cloud Development Kit.
Your profile:
• You have completed a degree in (business) informatics or another program with an IT and/or computer science focus; alternatively, you have a comparable qualification with corresponding IT affinity.
• You stand out through several years of professional experience in a DevOps environment and bring at least 3 years of professional experience in Python or backend development in a comparable language.
• You also bring several years of expertise in REST API design, OpenAPI specifications, and authentication protocols such as OAuth2 or SAML.
• You have a deep understanding of service-to-service architectures, including requirements in the areas of high availability and scaling.
• You enrich the team with your experience in AWS cloud and Kubernetes / Helm; ideally, you have already built and operated complex cloud-native software.
• Working independently goes without saying for you; you enjoy working in an agile team, and agile processes such as Scrum or Kanban are familiar to you.
• You also score with analytical and conceptual skills as well as an independent and structured way of working.
• You have business-fluent German skills and can communicate in English when needed.
• Occasional business trips within Germany are no problem for you.
What we offer: the opportunity to focus on client projects based on your expertise and talents, or, alongside client projects, to help shape our future topics and drive their market penetration. We are currently around 170 MIDers, and we want to keep growing.
MID – Driving Continuous Transformation
With the growing importance of data processing in the cloud, we are looking for an experienced Data Vault consultant to support us in implementing and optimizing Data Vault solutions in a cloud environment.
• You design and implement scalable Data Vault 2.1 models.
• In doing so, you design, model, and develop data integration solutions based on Data Vault best practices.
• To ensure a smooth data pipeline, you work closely with the data engineering and BI teams.
• You participate in data quality checks, validations, and governance practices.
• Analyzing customer requirements and translating them into scalable and robust architectures is also part of your role.
• You support the migration of existing solutions to the cloud.
• You support business reporting and analysis through cleanly versioned data marts.
• You also train and coach teams in Data Vault methodologies and best practices in the respective cloud environments.
Your profile:
• You have completed a degree with a focus on computer science / business informatics, mathematics, or a STEM subject, or you have a comparable qualification.
• You have gained at least 5 years of experience in modeling and implementing Data Vault solutions in production environments and bring sound knowledge of working with cloud data platforms (e.g.
As part of our team, you will take on the following responsibilities:
• You are responsible for leading and developing the Product Development department, including staffing and promoting employees, as well as agreeing on targets for the department, such as personnel, budget, and investment planning, and controlling measures in case of deviations from these targets.
• Introducing, tracking, and evaluating department-relevant measures are also part of your tasks, as well as identifying and implementing necessary structural changes within the department and in collaboration with other departments and areas of DKV.
• You are responsible for developing and implementing strategies for creating and maintaining complex system architectures in the customer-centric web environment, as well as further developing customer-centric web technologies by evaluating and designing sophisticated solution architectures and ensuring availability and security in collaboration with other departments and areas of DKV.
• You are responsible for implementing and operating applications to defined SLAs at efficient costs, advising on the architecture and design of the system network, and ensuring the logical integrity and scalability of systems and teams.
• Additionally, you ensure effective collaboration with other departments and areas of DKV along the value chain, as well as uniform product delivery in close cooperation with the Framework Development department.
• Using agile methods such as Scrum, Kanban for IT, servant leadership, etc., is not a problem for you, nor is using future-proof cloud technology and phasing out legacy software towards a robust, modular, service-oriented software architecture.
• You maintain the scalability of the department and are a pioneer and shaper of change.
The project involves setting up and testing system interfaces between Salesforce and MS Project Operations using Azure Functions as middleware.
Key Responsibilities
1. Interface & Technical Setup
Design and implement six integration endpoints between Salesforce and MS Project Operations via Azure Functions:
• Four data transfers from Salesforce to MSPO
• Two data transfers from MSPO to Salesforce
Configure and manage integration for the following objects:
• Opportunity
• Account
• Portfolio (custom object)
2.
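The heart of middleware like this is a field-mapping transform: each endpoint takes a payload from one system and reshapes it for the other, failing loudly when required source fields are missing. A minimal sketch of one direction (field names on both sides are invented for illustration; a real integration would use the systems' actual schemas and the Azure Functions trigger/binding model):

```python
# Middleware transform sketch: map a Salesforce Opportunity payload onto the
# target system's field names. All field names here are illustrative.
FIELD_MAP = {           # source field -> target field
    "Name": "msdyn_subject",
    "Amount": "estimatedvalue",
    "AccountId": "customerid",
}

def to_target(sf_record):
    """Reshape one source record; reject records missing mapped fields."""
    missing = [f for f in FIELD_MAP if f not in sf_record]
    if missing:
        raise ValueError(f"missing source fields: {missing}")
    return {tgt: sf_record[src] for src, tgt in FIELD_MAP.items()}

print(to_target({"Name": "Deal A", "Amount": 50000, "AccountId": "001X"}))
```

Keeping the mapping as data (a dict) rather than code makes the six endpoints mostly configuration, with one shared transform engine and per-object maps.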
Your tasks
Key Responsibilities:
• Analyze business needs and design tailored SAP EWM solutions
• Configure and support end-to-end warehouse and logistics processes in SAP
• Integrate EWM with other SAP modules such as MM, WM, SD, and PP
• Draft functional and technical specifications for enhancements and developments
• Collaborate with ABAP developers on customizations
• Support testing phases and lead issue resolution throughout the project lifecycle
• Deliver user training and produce system documentation
• Advise on SAP best practices and process optimization
• Stay up to date with SAP innovations, especially within S/4HANA and embedded EWM
Your profile
Requirements:
• 3–5 years of experience with SAP EWM (embedded or decentralized)
• Deep understanding of warehouse and logistics processes
• Background in SAP MM, WM, SD, or PP is a strong plus
• Experience with technical development (e.g., debugging, basic ABAP) is an advantage
• Solid knowledge of SAP S/4HANA architecture and EWM integration
• Strong analytical, problem-solving, and communication skills
• Comfortable working independently and in multicultural teams
• Fluent English required; German (B2/C1) is a strong asset and preferred for certain client contexts
Why us?
As an IT Data Engineer, you will help discover insights about our customers and internal operations by designing and implementing data pipelines and models, as well as maintaining and improving existing ones, so that you and your team can accelerate business experimentation and influence data-driven, augmented decision making.
Tasks & Responsibilities
• Design and implement data pipelines to extract, transform, and load data from various sources, including databases, cloud storage, and APIs.
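The extract-transform-load pattern named above can be sketched end to end with only the standard library: pull rows from a source (here, CSV text standing in for a file or API response), apply type-casting and filtering, and load into a database. The data and table name are invented for illustration:

```python
# Minimal ETL sketch: extract rows from CSV text, transform (cast types,
# filter), and load into SQLite. Source data and schema are illustrative.
import csv, io, sqlite3

raw = "id,amount\n1,10.5\n2,-3.0\n3,7.25\n"   # stand-in for a source extract

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # keep only positive amounts, cast CSV strings to proper types
    return [(int(r["id"]), float(r["amount"]))
            for r in rows if float(r["amount"]) > 0]

def load(rows, con):
    con.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?)", rows)

con = sqlite3.connect(":memory:")
load(transform(extract(raw)), con)
print(con.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())
# (2, 17.75)
```

Production pipelines replace each stage with a scalable equivalent (cloud storage readers, Spark transforms, warehouse writers), but the three-stage contract stays the same.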
Responsibilities:
• Deliver backend development for a multi-agent evaluation platform within an agile setup
• Provide technical consulting to the project team and contribute to solution design
• Build and maintain backend services aligned with defined functional requirements
• Design and implement APIs for seamless integration across internal applications
• Collaborate with cross-functional teams to clarify and solve technical integration needs
• Apply modern data processing and analysis techniques to support platform evolution
• Improve system performance through testing strategies, code reviews, and tooling enhancements
• Support UI implementation and ensure smooth deployment in cloud and HPC environments
Requirements:
• Academic background or equivalent experience in a computing, engineering, or data-related field
• Strong expertise in Python programming
• Solid understanding of REST API design and implementation
• Hands-on experience with machine-learning engineering, ideally in scientific or technical domains
• Proficiency with CI/CD pipelines using GitLab
• Experience with GitOps workflows and Kubernetes-based service hosting
• Ability to implement or contribute to user interface components
• Knowledge of LLMs, multi-agent frameworks, and working in high-performance computing environments
Benefits:
• Opportunity to work on an award-winning, innovative scientific-technology project
• Highly collaborative, cross-functional environment with modern engineering practices
• Exposure to advanced ML, multi-agent systems, and cutting-edge data technologies
• Flexible hybrid working model with a dynamic team culture
Your contact: reference number 863129/1, phone: +41 44 225 50 00, e-mail: positionen@hays.ch. Employment type: freelance for a project.
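"REST API design and implementation" in a backend like this usually boils down to handlers that validate input and return an explicit status code plus body, kept independent of any web framework so they are easy to test. A hedged sketch (the endpoint, fields, and response shape are invented for illustration, not the actual platform's API):

```python
# REST handler sketch: validate a JSON body and return (status, response),
# decoupled from any framework. Endpoint and field names are illustrative.
import json

def handle_create_run(body_bytes):
    """POST /runs - create an evaluation run; returns (status, json_body)."""
    try:
        body = json.loads(body_bytes)
    except json.JSONDecodeError:
        return 400, {"error": "invalid JSON"}
    if "agent_id" not in body:
        return 422, {"error": "agent_id is required"}
    return 201, {"id": 1, "agent_id": body["agent_id"], "state": "queued"}

print(handle_create_run(b'{"agent_id": "planner"}'))
print(handle_create_run(b'not json'))
```

Returning 400 for malformed syntax and 422 for well-formed but invalid payloads is a common REST convention; a thin framework layer (Flask, FastAPI, or similar) would then just wire these handlers to routes.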
• Analyze and implement localization requirements for the Philippines within SAP S/4HANA
• Design, build, and deliver the E-Invoicing solution required by the BIR
• Collaborate closely with SAP Globalization Product Management
• Ensure continuous compliance by implementing legal and regulatory changes
• Work across the Finance and Logistics functional areas
• Create and deliver the E-Invoicing solution for the Philippines and work on continuous legal changes to keep the product compliant (S/4HANA Finance and Logistics)
Give it a try and learn what the market has to offer – our services are free of charge, non-binding, and discreet! We look forward to hearing from you.
Responsibilities:
• Design, build, and optimize batch data pipelines for internal tool use cases
• Develop efficient Spark SQL transformations for large-scale datasets
• Use Python for data processing, orchestration, and automation
• Create and maintain data models (facts, dimensions, aggregates) with clear grain and metric definitions
• Ensure data quality and correctness, including handling late data, duplicates, and adjustments
• Implement validation, data quality checks, and reconciliation logic
• Work with business stakeholders to gather requirements, define metrics, and translate needs into pipelines
• Collaborate with infrastructure teams on standards, performance tuning, and best practices
Requirements:
• Bachelor's or Master's degree in a technical field, or an equivalent qualification
• Experience in data engineering or a related field
• Strong proficiency in Spark SQL for large-scale data transformations
• Solid Python skills for data processing and pipeline development
• Strong understanding of data modeling (fact tables, dimensions, grain, SCDs)
• Hands-on experience building and maintaining batch pipelines in production
• High attention to detail with a strong focus on data quality and metric integrity
• Ability to communicate clearly with non-technical stakeholders and translate business needs into data solutions
Benefits:
• Remuneration under the most attractive collective agreement in the industry
• Annual leave entitlement of 30 days
• Generous working time account with the possibility to pay out overtime
• Subsidization of direct insurance (as a company pension scheme)
Your contact: Kristina Meng, reference number 863942/1, e-mail: kristina.meng@hays.de. Employment type: employment with Hays Professional Solutions GmbH.
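"Handling late data, duplicates, and adjustments" commonly reduces to one operation: deduplicate on the business key, keeping the latest version of each record, so a late-arriving correction replayed in a later run simply wins. A minimal sketch in pure Python (field names are illustrative; in Spark SQL the same logic is a `ROW_NUMBER() OVER (PARTITION BY key ORDER BY version DESC)` filter):

```python
# Deduplicate on a business key, keeping the latest record per key. Late
# corrections replayed in a later batch overwrite earlier versions.
def dedupe_latest(rows, key="order_id", version="updated_at"):
    latest = {}
    for r in rows:
        k = r[key]
        if k not in latest or r[version] > latest[k][version]:
            latest[k] = r
    return sorted(latest.values(), key=lambda r: r[key])

rows = [
    {"order_id": 1, "updated_at": "2024-01-01", "amount": 10},
    {"order_id": 1, "updated_at": "2024-01-03", "amount": 12},  # late correction
    {"order_id": 2, "updated_at": "2024-01-02", "amount": 99},
]
print(dedupe_latest(rows))
```

Note the string timestamps compare correctly only because they are ISO-8601 formatted; real pipelines would use proper timestamp types.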
Your Tasks
As a member of the OHB Simulator Software Team, you are responsible for the full life-cycle development of simulation software for a variety of spaceflight missions:
• Design, develop, and implement simulation models of satellite equipment and the space environment
• Continuously improve the simulation framework, user front end, and the test/debugging interfaces and tooling
• Perform tasks along the full software life cycle, including software requirements, architecture, design, code, test, integration, and documentation
• Coordinate with customers and internal software teams on the design, for reviews, and for the timely delivery of simulators
• Support the customer teams during mission preparation as well as during the maintenance phase of the simulators
Your Qualifications
• A degree in engineering with a major or PhD in Computer Science, Electrical Engineering, Aerospace Engineering, or a comparable qualification
• Proficiency in modern C++, experience with Python
• Familiarity with working in a Linux environment
• Professional experience as a Software Engineer or Simulator Software Engineer would be an asset
• Knowledge of federated simulation and co-simulation techniques would be an asset
• Experience with Qt, Git, DOORS, and JIRA, as well as embedded software, is a plus
• Experience with ECSS-SMP tools/environments and software standards (e.g.
As an IT Data Engineer, you will help discover insights about our customers and internal operations by designing and implementing data pipelines and models, as well as maintaining and improving existing ones, so that you and your team can accelerate business experimentation and support data-driven decision making.

Tasks & Responsibilities
- Design and implement data pipelines to extract, transform, and load data from various sources, including databases, cloud storage, and APIs.
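The extract-transform-load responsibility above can be sketched as three small, composable functions. This is a minimal illustration, not the team's actual pipeline: the schema (`id`, `country`) and the in-memory sink are assumptions; real pipelines would read from databases or APIs and write to a warehouse.

```python
import json

def extract(raw_json):
    """Extract: parse raw source records (here, a JSON string)."""
    return json.loads(raw_json)

def transform(records):
    """Transform: normalize to a flat schema, dropping incomplete rows."""
    out = []
    for r in records:
        if "id" not in r or "country" not in r:
            continue  # data-quality rule: skip records missing required fields
        out.append({"customer_id": r["id"], "country": r["country"].upper()})
    return out

def load(rows, sink):
    """Load: append transformed rows to the target (a list standing in for a table)."""
    sink.extend(rows)
    return len(rows)

sink = []
raw = '[{"id": 1, "country": "de"}, {"id": 2}, {"id": 3, "country": "fr"}]'
loaded = load(transform(extract(raw)), sink)
```

Keeping each stage a pure function makes the validation and reconciliation logic the posting mentions straightforward to unit-test in isolation.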
Tasks:
- Configure and customize SAP EHSM workflows according to business requirements
- Guide functional configuration with a deep understanding of EHS processes
- Own the entire project lifecycle from requirements gathering through deployment, go-live, and hypercare support
- Collaborate with IT and business teams to validate system behavior against EHS needs
- Assist in training content development to ensure domain accuracy
- Participate in testing and validation phases to ensure compliance and process correctness
- Document workflow designs and changes for knowledge management

Requirements:
- Strong knowledge of SAP EHSM and the SAP Workflow framework
- Experience with SAP S/4HANA and RISE with SAP environments
- Experience working with SAP EHSM or similar EHS software solutions
- Proven experience in end-to-end application implementation and delivery
- Strong skills in requirements gathering, business analysis, and creating functional specification documents
- Deep understanding of design approaches, including designing and documenting business processes
- Expertise in configuring specific modules, such as Incident/Accident Management
- Ability to conduct workshops for requirements gathering, clarification, and fit-gap analysis
- Familiarity with various testing phases including System Integration Testing (SIT), User Acceptance Testing (UAT), and deployment activities
- Capability to train key users and provide ongoing support

Benefits: interesting assignments at renowned national and international companies; possibility to work remotely.

Your contact: Nina van den Eynde, reference number 861303/1, email: nina.van.den.eynde@hays.de. Employment type: freelance for a project.
Tasks:
- Deliver backend development for a multi-agent evaluation platform within an agile setup
- Provide technical consulting to the project team and contribute to solution design
- Build and maintain backend services aligned with defined functional requirements
- Design and implement APIs for seamless integration across internal applications
- Collaborate with cross-functional teams to clarify and solve technical integration needs
- Apply modern data processing and analysis techniques to support platform evolution
- Improve system performance through testing strategies, code reviews, and tooling enhancements
- Support UI implementation and ensure smooth deployment in cloud and HPC environments

Requirements:
- Academic background or equivalent experience in a computing, engineering, or data-related field
- Strong expertise in Python programming
- Solid understanding of REST API design and implementation
- Hands-on experience with machine-learning engineering, ideally in scientific or technical domains
- Proficiency with CI/CD pipelines using GitLab
- Experience with GitOps workflows and Kubernetes-based service hosting
- Ability to implement or contribute to user interface components
- Knowledge of LLMs, multi-agent frameworks, and working in high-performance computing environments

Benefits:
- Opportunity to work on an award-winning, innovative scientific-technology project
- Highly collaborative, cross-functional environment with modern engineering practices
- Exposure to advanced ML, multi-agent systems, and cutting-edge data technologies
- Flexible hybrid working model with a dynamic team culture

Your contact: reference number 863129/1, phone: +41 44 225 50 00, email: positionen@hays.ch. Employment type: freelance for a project.
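One recurring piece of the REST API work described above is server-side validation of request payloads before they reach business logic. The sketch below is a hypothetical illustration for an endpoint that starts an evaluation run; the field names (`agents`, `dataset`, `max_steps`) are assumptions, not the platform's real contract.

```python
def validate_run_request(payload):
    """Validate a hypothetical POST /runs body; return (ok, list_of_errors)."""
    errors = []
    # agents: required, non-empty list of agent names
    if not isinstance(payload.get("agents"), list) or not payload["agents"]:
        errors.append("agents must be a non-empty list")
    # dataset: required string identifier
    if not isinstance(payload.get("dataset"), str):
        errors.append("dataset must be a string")
    # max_steps: optional, defaults to 100, must be a positive integer
    max_steps = payload.get("max_steps", 100)
    if not isinstance(max_steps, int) or max_steps <= 0:
        errors.append("max_steps must be a positive integer")
    return (len(errors) == 0, errors)

ok, errs = validate_run_request(
    {"agents": ["planner", "critic"], "dataset": "benchmark-v1"}
)
bad_ok, bad_errs = validate_run_request({})
```

Returning a full error list (rather than failing on the first problem) lets the API respond with one actionable 400 body, which is friendlier for the cross-team integrations the posting mentions.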
Where you can pitch in: As a software architect, you analyze our existing client-server landscape and shape a modern migration strategy that combines innovation with stable operations. By defining clear standards and design patterns, you give our teams valuable orientation without restricting their creative freedom. Together with product management, you identify and prioritize bottlenecks in order to continuously optimize our platform.
The project involves setting up and testing system interfaces between Salesforce and MS Project Operations using Azure Functions as middleware.

Key Responsibilities
1. Interface & Technical Setup
Design and implement six integration endpoints between Salesforce and MS Project Operations via Azure Functions:
- Four data transfers from Salesforce to MSPO
- Two data transfers from MSPO to Salesforce
Configure and manage integration for the following objects: Opportunity, Account, Portfolio (custom object)
2.
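At the heart of middleware like this is usually a declarative field mapping applied to each record before it is forwarded. The sketch below shows that pattern for a Salesforce Opportunity; the MSPO field names and the mapping table are entirely hypothetical, and a real Azure Function would wrap this in an HTTP/queue trigger and call the Dataverse API.

```python
# Hypothetical Salesforce -> MS Project Operations field mapping.
# Field names on the right are illustrative, not the real MSPO schema.
OPPORTUNITY_TO_MSPO = {
    "Name": "msdyn_subject",
    "Amount": "msdyn_estimatedvalue",
    "AccountId": "msdyn_customer",
}

def map_opportunity(sf_record):
    """Translate a Salesforce Opportunity dict into an MSPO-shaped payload.

    Unmapped source fields (e.g. StageName) are deliberately dropped;
    mapped fields absent from the record are simply omitted.
    """
    payload = {}
    for sf_field, mspo_field in OPPORTUNITY_TO_MSPO.items():
        if sf_field in sf_record:
            payload[mspo_field] = sf_record[sf_field]
    return payload

mapped = map_opportunity(
    {"Name": "New rollout", "Amount": 50000, "StageName": "Prospecting"}
)
```

Keeping the mapping in a data structure rather than in code makes the six endpoints easier to review and to extend per object (Opportunity, Account, Portfolio).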
Solutions are provided in a cost- and time-efficient manner and through strong collaboration with stakeholders and cross-functional departments.

Primary responsibilities:
a) Software development
• Development, implementation and verification of architecture, concepts and solutions according to product and design specifications
• Ability to coordinate software-related tasks within a cross-functional environment, align priorities with project leads, and effectively facilitate collaboration across different departments
• Coordination and cooperation with third-party suppliers and contractors
• Supporting project management in planning software-related tasks
• Collection and determination of technical requirements
• Support of troubleshooting in production related to software issues
• Sustaining of existing products/systems to ensure stable and reliable operations as part of the product lifecycle management process
b) Documentation
• Adherence to project goals and to national and international standards and regulations (CE, FDA, MDR…)
• Creation of documentation according to regulatory requirements in the strongly regulated field of medical devices, such as requirements, specifications, risk analyses and verification documents

Your profile
• Graduate in engineering or computer science, or technical education (e.g. staatl. gepr.
Responsibilities
- Gather, analyze, and structure complex business requirements independently
- Identify gaps, inconsistencies, and risks in requirements early on
- Map customer needs to existing platform features and drive feature adoption
- Create detailed business and integration concepts, including data imports and system flows
- Design scalable solutions and clearly explain trade-offs from both business and technical views
- Lead workshops and align diverse stakeholder groups toward clear decisions
- Own and maintain functional documentation throughout the project lifecycle until go-live

Requirements
- Bachelor’s or Master’s degree in Computer Science, Business Informatics, or comparable experience
- 5+ years of experience in enterprise e-commerce, SaaS platforms, or digital consulting
- Strong expertise in structured requirements engineering and problem-solving
- Solid understanding of e-commerce architectures (OMS, PIM, CMS) and API/data flows (REST/JSON, SQL)
- Ability to work with complex system landscapes and enterprise-level stakeholders
- Experience with platform migrations or replatforming projects is a plus
- Familiarity with platforms such as Salesforce Commerce Cloud, CommerceTools, SAP Commerce Cloud (Hybris), or Shopify Plus is nice to have

YOU ARE THE CORE OF OUR COMPANY
We take responsibility for creating an inclusive and exceptional environment where all genders, nationalities and ethnicities feel welcomed and accepted exactly as they are.
Give it a try and learn what the market has to offer – our services are free of charge, non-binding and discreet! We look forward to hearing from you.

Tasks:
- Design, build and optimize enterprise Tableau dashboards
- Develop reporting-friendly data models on Cloudera Data Platform (Hadoop) and Azure Databricks (Delta Lake, SQL Warehouses)
- Implement and tune SQL queries (Hive, Impala, Spark SQL, Databricks SQL) for performance, cost efficiency and concurrency
- Apply Tableau performance optimization strategies (extract vs. live, push-down optimization, query tuning)
- Design and implement secure enterprise Tableau configurations, including row-level security aligned with role concepts
- Ensure compliance with IT security, data governance and regulatory requirements
- Collaborate with data platform teams, DataOps, IT security/compliance and controlling
- Produce professional documentation: data models, dashboard specifications, data-source definitions, security concepts, test cases
- Conduct testing for accuracy, performance, access control and stability of dashboards and data models
- Provide knowledge transfer, coaching and a structured handover to internal teams

Profile:
- Strong hands-on experience with Tableau Desktop and Tableau Server/Cloud in enterprise environments
- Proven ability to build management-ready dashboards for finance/controlling or other regulated industries
- Practical experience with Cloudera Data Platform, Hadoop ecosystems, and Azure Databricks integrations
- Advanced SQL skills across Hive, Impala, Spark SQL, Databricks SQL
- Solid understanding of Delta Lake, Parquet/ORC formats, and BI-oriented data modeling principles
- Experience implementing row-level security and with enterprise BI solutions
- Strong knowledge of performance optimization in Tableau, Hadoop and Databricks environments
- Ability to operate in regulated financial environments with security, compliance and data governance constraints
- Excellent communication and documentation skills in English

Benefits: international client; remote option.
Your contact: Eliška Stejskalová, reference number 864486/1, email: eliska.stejskalova@hays.at. Employment type: freelance for a project.
As part of our team, you will take on the following responsibilities:
- You are responsible for the operation, continuous development, and hardening of our enterprise-wide PKI landscape (on-premises and cloud).
- You design, implement, and operate Azure-based PKI services, in particular Microsoft Cloud PKI and Microsoft Intune Certificate Lifecycle Management.
- You integrate certificate services into modern zero-trust architectures.
- You develop and implement secure certificate and key management processes.
- You support the modernization of our identity and security infrastructure, especially in hybrid Azure environments.
- You work closely with security and infrastructure teams to establish secure and automated certificate workflows.
- You actively contribute to IT security projects and create security and architecture concepts.
- In close collaboration with the Group Information Security Office, you ensure compliance with regulatory and internal security requirements.
- You drive continuous process optimization, automation, and documentation.
Tasks:
- Responsible for all aspects of the Material Master for the global template and for MDG-M
- Review all requirements for clarity and for support of the harmonized approach
- Drive, oversee, guide and assist the technical developers
- Test all requirement implementations
- Involved in all aspects of the PLM interfaces for each of the PLM systems
- With each rollout, the above topics must be addressed in different ways, including migration
- With regard to migration: manage clones, collisions and duplicates for every S4E-Unify and Agora rollout
- Provide demonstrations of MDG-M functionality; oversee the Data Quality module and its implementation relative to KPIs and the management of KPI results

Requirements:
- Bachelor’s degree in Computer Science, Business Management, Management Information Systems or equivalent experience
- Experience in business process design in material management and production planning
- Expert knowledge of PP, MM, MDG-M, processes and customizing in SAP S/4HANA
- Experience with and the ability to read ABAP code / debug are desirable
- Ability to meet business requirements through the standard solution used
- Excellent troubleshooting and analytical skills
- 10-20% business travel (international and domestic) is expected
- Languages: English, German (optional)

Benefits:
- Exciting global projects: be part of a major international SAP S/4HANA transformation program
- Hybrid work model: flexible working hours and remote work options depending on project needs
- Professional development: opportunity to work with cutting-edge technologies and gain experience in SAP S/4HANA, MDG-M, and PLM interfaces
- International environment: collaborate with global teams across different countries and cultures
- Modern workplace: access to Siemens Energy’s innovative and sustainable office infrastructure
- Networking opportunities: work alongside experts in energy, digitalization, and transformation

Salary information: 120,000. Your contact: Julian Hientz, reference number 847518/1, email: julian.hientz@hays.de. Employment type: employment with Hays Professional Solutions GmbH.
PP, MM, SD, EWM, FI/CO)
- Sound knowledge of SAP integration concepts (e.g. interfaces, APIs, middleware)
- Experience in designing end-to-end business processes in the ERP context
- Understanding of SAP extension strategies (customizing, add-ons, extensions within S/4)
- Ability to document complex technical matters in a structured way and communicate them clearly
- Very good understanding of the requirements of manufacturing companies and their value chains
- Confident manner when dealing with business departments, IT and external partners

YOUR EXPERIENCE
You bring solid experience in the SAP S/4HANA environment – as a Solution Architect, Lead Consultant, or from a development-oriented role with significant architectural responsibility.
WHAT YOU WILL DO
- Be a key influencer for our business colleagues in resolving significant or potential issues
- Understand your customer by gathering requirements, presenting concepts and proposals, providing updates on deliverables and negotiating issue resolution when needed
- Act as a coordinator across projects to monitor performance and ensure delivery of quality applications on time
- Work with external parties on assembling the system components from standardized modules
- Customize Commercial Off-The-Shelf (COTS) software based on pre-determined parameters, and execute customizing tests
- Ensure work is documented according to required standards, methods and tools
- Assist in the definition, initiation and design of architecture in projects/programs

WHAT YOU SHOULD HAVE
- 8 years of hands-on experience in designing and developing solutions with Java/JavaScript
- Working experience and knowledge in the Software Development Life Cycle (SDLC), with a relevant qualification in an IT field
- Passion for developing high-quality software
- Inquisitive and analytical mind
- Strong IT and business acumen
- Proven ability to work in a multi-cultural and multi-functional environment
- Customer and end-user focused
- Excellent oral and written communication skills

WHAT YOU WILL GET FROM US
- Great team of IT professionals with global working exposure
- Ongoing professional and technical training and certifications
- Global internal job opportunities available within DPDHL
- A multicultural environment in modern offices
- Meal card and flexible benefits – customized according to individual needs
- Choose any day for your vacation from earned public holidays (Saturday and ad hoc)
- Smart casual dress code every day
- Unlimited outpatient medical coverage
- Home office possibilities

Sounds good?
Create detailed 3D models, renderings, and visual presentations to effectively communicate design concepts. Stay up to date with industry trends and emerging technologies to enhance event design. Work closely with production teams to ensure designs are executed accurately and within budget.
As part of our team, you will take on the following responsibilities: As an Infrastructure Architect (gn), you analyze the existing system and infrastructure landscape and, based on this, develop technical design changes for new software and hardware solutions. You design, plan, and implement infrastructure technology solutions and ensure their sustainable integration into the existing architecture. You are responsible for managing and continuously developing the existing IT services and contribute your expertise in an advisory capacity to other internal IT units.
- You are responsible for the conceptual, logical, and structural integrity of our Core Data Model as well as the Gold Layer across Azure, Snowflake, and dbt.
- You ensure that fragmented data sources are transformed into consistent, reusable, and decision-relevant data products, actively preventing the platform from drifting into team-specific, incompatible models.
- You define and maintain central business objects, canonical dimensions, shared metrics, and facts, ensuring that the Core Data Model serves as a stable, business-oriented foundation across all domains.
- You develop modeling standards, naming conventions, layering concepts (Staging → Intermediate → Gold), reuse patterns, and dbt design guidelines, and you ensure their consistent implementation across all teams.
- You safeguard the semantic consistency of the entire data model, resolve domain conflicts, ensure that identical business terms are modeled only once, and review changes affecting core layers.
- You act as the technical design authority for model changes in Snowflake/dbt, balancing local requirements with long-term model coherence, and ensuring that all models remain performant, scalable, maintainable, and of high quality.
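Modeling standards like the Staging → Intermediate → Gold layering described above are commonly enforced mechanically, e.g. by a small CI check over dbt model names. The sketch below is a hypothetical guard: the `stg_`/`int_`/`gold_` prefixes and the lowercase-snake-case rule are assumed conventions for illustration, not this team's actual guidelines.

```python
import re

# Assumed layer prefixes for a Staging -> Intermediate -> Gold dbt layout.
LAYER_PREFIXES = ("stg_", "int_", "gold_")

def check_model_names(names):
    """Return the model names that violate the (hypothetical) conventions:
    a recognized layer prefix and lowercase snake_case throughout."""
    violations = []
    for name in names:
        if not name.startswith(LAYER_PREFIXES):
            violations.append(name)          # unknown or missing layer prefix
        elif not re.fullmatch(r"[a-z0-9_]+", name):
            violations.append(name)          # not lowercase snake_case
    return violations

bad = check_model_names(["stg_orders", "gold_revenue_daily", "Orders_Final"])
```

Running such a check in CI turns the "design authority" role from after-the-fact review into an automatic gate, so cross-team drift is caught before a model reaches the Gold layer.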
Job description
Join us on our journey to success. We are currently recruiting in Munich, Stuttgart, Nuremberg, and Augsburg.

This appeals to you:
- Design, develop, and implement embedded software solutions for various microcontrollers and processors
- Collaborate with hardware engineers to define software architecture for embedded systems, considering hardware constraints
- Write efficient, reliable code in C, C++, or assembly language for embedded systems
- Develop software for RTOS (Real-Time Operating System) environments to ensure timely and predictable task execution
- Create and optimize drivers for peripherals and interfaces like sensors, actuators, and communication modules
- Identify and resolve issues through debugging and testing to ensure system stability and performance
- Work with cross-functional teams, including hardware engineers and quality assurance, to deliver integrated solutions
- Implement security measures to protect embedded systems from vulnerabilities and cyber threats
- Optimize code and algorithms for resource-constrained environments, considering factors like power consumption and memory usage

This is you:
- Bachelor’s or higher degree in Computer Science, Electrical Engineering, or a related field
- Proficiency in programming languages such as C and C++
- Experience in embedded systems development with knowledge of microcontroller/microprocessor architectures
- Familiarity with real-time operating systems (RTOS) and embedded development tools
- Knowledge of communication protocols (e.g., SPI, I2C, UART) and networking protocols is a plus
- Understanding of hardware design principles is advantageous
- Strong problem-solving skills and attention to detail
- Excellent communication skills for effective collaboration and documentation
- Industry-specific certifications or relevant experience may be preferred
- Good German and English language skills

What we offer:
- Future prospects in an innovative, growing, and agile company
- An exciting work environment with diverse career opportunities
- Competitive, industry-standard or above-industry-standard salary
- Flexible working hours and the option for remote work
- 30 vacation days per year and special leave for significant occasions when employed at Xtended Engineering
- Bonus programs for referrals and employee recruitment
- Team events

Xtended Engineering GmbH connects skilled engineers with companies in the industrial and commercial sectors.
You collaborate closely with backend engineers, translate Figma designs into robust component-based implementations, and ensure consistency, performance, and maintainability across the ecosystem. This is a hands-on senior role requiring 5+ years of professional frontend engineering experience in modern web application environments.
Tech stack: PHP, Laravel, Vue.js, TypeScript

What you will do
- Own the end-to-end architecture of modular add-ons (backend, frontend, integrations) and ensure scalability, security, and maintainability
- Design and implement advanced backend logic (Laravel, queues, events) and complex database schemas optimized for high-traffic e-commerce environments
- Oversee frontend integration within a micro-frontend setup (SingleSpa), ensuring seamless performance and UX consistency
- Lead CI/CD, semantic versioning, Docker-based environments, and release processes with stable rollouts
- Ensure production reliability through monitoring, alerting, and proactive incident response (e.g., DataDog, AWS environments)
- Act as technical lead and final reviewer, mentoring engineers, enforcing high code quality standards, and driving architectural consistency

Who you are
- 10+ years of experience building, scaling, and operating complex fullstack web applications in production environments, ideally within e-commerce or SaaS ecosystems
- Expert proficiency in PHP (8.x) and Laravel (including Middleware, Queues, Events, Gates/Policies), along with strong experience in Vue.js 3, TypeScript, and modern state management (e.g., Pinia)
- Deep architectural expertise in designing secure, multi-tenant, and distributed systems, with a strong understanding of SOLID/DRY principles and long-term maintainability
- Advanced database and performance engineering skills (MySQL/RDS, complex schema design, indexing strategies, Redis caching) in high-traffic environments
- Proven ownership of DevOps-adjacent responsibilities, including Docker-based environments, CI/CD pipelines, semantic versioning, AWS infrastructure awareness, and production monitoring/alerting
- Strong technical leadership skills: mentoring senior engineers, conducting high-quality code reviews, decomposing complex problems, and translating technical decisions into clear business impact

Benefits
- Hybrid working
- Fresh fruit every day
- Sports courses
- Exclusive employee discounts
- Language courses
- Company parties
- Help in the relocation process
- Mobility subsidy
- State-of-the-art technology
- Flexible working hours
- Company pension
- Professional training
- Dog-friendly office
- AY Academy
- Feedback culture
- Job bikes

YOU ARE THE CORE OF SCAYLE.