Tasks:
- Develop, maintain, and optimize data pipelines and ETL/ELT processes using Databricks
- Implement version control workflows and collaborate using GitHub
- Build and maintain CI/CD pipelines with GitHub Actions
- Design and implement scalable data transformations using Python/PySpark
- Write efficient and reliable SQL queries for data processing and analytics

Profile:
- Strong hands-on experience with Databricks and strong SQL skills
- Proficiency with GitHub for version control and collaboration
- Experience building CI/CD pipelines, ideally with GitHub Actions, and solid knowledge of Python/PySpark
- Experience with Microsoft Azure and knowledge of Data Vault data modeling is an advantage
- Experience with Kafka or other streaming technologies is an advantage
- Understanding of Unity Catalog for data governance is an advantage
- Experience with Splunk for monitoring and troubleshooting is an advantage

Benefits:
- Renowned client
- Remote work possible

Your contact: Florian Pracher, reference number 863468/1. Get in touch by e-mail: florian.pracher@hays.at. Employment type: freelance, for a project.
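The SQL skills listed above typically cover patterns like window-function deduplication in ELT pipelines. A minimal, illustrative sketch (table and column names are assumptions, not from the posting; shown in portable SQL via Python's `sqlite3`):

```python
import sqlite3

# Hypothetical example: keep only the most recently loaded record per
# customer using a window function -- a common ELT deduplication step.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (customer_id INTEGER, amount REAL, loaded_at TEXT);
    INSERT INTO raw_orders VALUES
        (1, 10.0, '2024-01-01'),
        (1, 12.5, '2024-02-01'),
        (2, 99.0, '2024-01-15');
""")

# ROW_NUMBER() ranks rows within each customer by load time; rn = 1
# selects the latest row per customer.
latest = conn.execute("""
    SELECT customer_id, amount
    FROM (
        SELECT customer_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id
                   ORDER BY loaded_at DESC
               ) AS rn
        FROM raw_orders
    )
    WHERE rn = 1
    ORDER BY customer_id
""").fetchall()

print(latest)  # [(1, 12.5), (2, 99.0)]
```

The same pattern carries over to Databricks SQL or Spark SQL unchanged, since window functions are standard across these engines.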
Give it a try and learn what the market has to offer – our services are free of charge, non-binding and discreet! We look forward to hearing from you.

Tasks:
- Design, build and optimize enterprise Tableau dashboards
- Develop reporting-friendly data models on Cloudera Data Platform (Hadoop) and Azure Databricks (Delta Lake, SQL Warehouses)
- Implement and tune SQL queries (Hive, Impala, Spark SQL, Databricks SQL) for performance, cost efficiency and concurrency
- Apply Tableau performance optimization strategies (extract vs. live, push-down optimization, query tuning)
- Design and implement secure enterprise Tableau configurations, including row-level security aligned with role concepts
- Ensure compliance with IT security, data governance and regulatory requirements
- Collaborate with data platform teams, DataOps, IT security/compliance and controlling
- Produce professional documentation: data models, dashboard specifications, data-source definitions, security concepts, test cases
- Conduct testing for accuracy, performance, access control and stability of dashboards and data models
- Provide knowledge transfer, coaching and structured handover to internal teams

Profile:
- Strong hands-on experience with Tableau Desktop and Tableau Server/Cloud in enterprise environments
- Proven ability to build management-ready dashboards for finance/controlling or other regulated industries
- Practical experience with Cloudera Data Platform, Hadoop ecosystems, and Azure Databricks integrations
- Advanced SQL skills across Hive, Impala, Spark SQL and Databricks SQL
- Solid understanding of Delta Lake, Parquet/ORC formats, and BI-oriented data modeling principles
- Experience implementing row-level security and with enterprise BI solutions
- Strong knowledge of performance optimization in Tableau, Hadoop and Databricks environments
- Ability to operate in regulated financial environments with security, compliance and data governance constraints
- Excellent communication and documentation skills in English

Benefits:
- International client
- Remote option

Your contact: Eliška Stejskalová, reference number 864486/1. Get in touch by e-mail: eliska.stejskalova@hays.at. Employment type: freelance, for a project.
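Row-level security of the kind named above is commonly implemented with an entitlement table joined to the fact data, so each user sees only permitted rows. A hedged sketch of that pattern (all table, column and user names are illustrative; shown in portable SQL via Python's `sqlite3`):

```python
import sqlite3

# Illustrative entitlement-table pattern behind row-level security:
# fact rows are joined to a user/region mapping, so a user only sees
# rows for regions they are entitled to.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE controlling_facts (region TEXT, revenue REAL);
    INSERT INTO controlling_facts VALUES ('EMEA', 100.0), ('APAC', 200.0);

    CREATE TABLE entitlements (username TEXT, region TEXT);
    INSERT INTO entitlements VALUES ('alice', 'EMEA');
""")

def rows_for(user):
    """Return only the fact rows the given user is entitled to see."""
    return conn.execute("""
        SELECT f.region, f.revenue
        FROM controlling_facts f
        JOIN entitlements e ON e.region = f.region
        WHERE e.username = ?
    """, (user,)).fetchall()

print(rows_for("alice"))  # [('EMEA', 100.0)]
print(rows_for("bob"))    # []
```

In Tableau the same idea is typically expressed as a data-source filter or calculated filter against the entitlement table; the join shape stays the same.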
…measurement data, operations management, GIS, asset management) into a unified data basis for project-related analysis and usage scenarios.

- Design, implementation and further development of ETL/ELT processes based on modern Azure and Databricks components
- Analysis of source structures as well as setup and maintenance of project-wide data models
- Support in setting up, configuring and monitoring the data platform and the associated interface processes
- Preparation and provision of data for various technical and analytical usage scenarios in the project
- Development and maintenance of project-related reporting solutions (incl.
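Consolidating heterogeneous source systems into one data basis, as described above, amounts to mapping each source's records onto a shared target schema. A minimal sketch under assumed field names (all sources, fields and values are illustrative):

```python
# Illustrative consolidation of records from two hypothetical source
# systems (measurement data vs. asset management) into one shared schema.
measurements = [{"sensor": "S1", "val": 3.2, "ts": "2024-01-01T00:00"}]
assets = [{"asset_id": "A7", "location": "Vienna"}]

def to_unified(record, source):
    """Map a source-specific record onto the common target schema."""
    if source == "measurements":
        return {"source": source, "key": record["sensor"],
                "payload": {"value": record["val"], "timestamp": record["ts"]}}
    if source == "assets":
        return {"source": source, "key": record["asset_id"],
                "payload": {"location": record["location"]}}
    raise ValueError(f"unknown source: {source}")

unified = ([to_unified(r, "measurements") for r in measurements]
           + [to_unified(r, "assets") for r in assets])
print([r["key"] for r in unified])  # ['S1', 'A7']
```

In practice this mapping layer would live in the ETL/ELT processes on Azure/Databricks rather than in plain Python, but the schema-unification step is the same.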
Tasks:
- Manage and refine business and technical requirements in collaboration with stakeholders
- Coordinate data integration activities with various source systems
- Design and model data structures within a Data Warehouse environment, with a strong focus on the Data Vault methodology
- Develop and optimize data pipelines using SQL and Python
- Work with tools like Databricks and dbt to build scalable data transformation workflows
- Ensure data quality, consistency, and compliance, especially within banking-related use cases

Profile:
- Experience in requirements management
- Experience in coordination with source systems
- Experience with data modeling in a Data Warehouse environment, with a focus on Data Vault
- Good German and English language skills
- Databricks experience is nice to have
- Experience with dbt (data build tool) is an advantage
- Experience with SQL (as a query language) and Python is an advantage
- Banking experience is an advantage

Benefits:
- Renowned client
- Remote work

Your contact: Florian Pracher, reference number 863466/1. Get in touch by e-mail: florian.pracher@hays.at. Employment type: freelance, for a project.
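A central mechanic of the Data Vault methodology mentioned above is that hub keys are deterministic hashes of normalized business keys, so the same entity maps to the same key across source systems and loads. A hedged sketch of one common convention (the normalization rules and delimiter are assumptions, not a fixed standard):

```python
import hashlib

def hub_hash_key(*business_key_parts):
    """Derive a Data Vault-style hub hash key from a business key.

    Illustrative convention: trim and uppercase each part, join with a
    delimiter, then MD5-hash -- so key generation is deterministic and
    source-system quirks (casing, whitespace) do not split one entity
    into several hub rows.
    """
    normalized = "||".join(p.strip().upper() for p in business_key_parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# The same business key yields the same hash key regardless of casing
# or stray whitespace in the delivering source system.
k1 = hub_hash_key("  cust-42 ")
k2 = hub_hash_key("CUST-42")
print(k1 == k2)  # True
```

In a dbt/Databricks setup this logic usually lives in a shared macro or SQL expression rather than Python, but the determinism argument is identical.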
YOUR TASKS:
- Design, develop and deploy digital solutions, ensuring the software development life cycle in an agile setup
- Develop solutions on a leading-edge cloud-based platform for managing and analyzing large datasets
- Create technical documentation
- Analyze and decompose business requirements into technical functionalities
- Produce clean and efficient code based on business requirements and specifications
- Create notebooks, pipelines and workflows in Scala or Python to ingest, process and serve data in our platform
- Act as a technical lead for junior and external developers
- Be part of the continuous improvement of Nordex’ development processes by participating in retrospectives and proposing optimizations

YOUR PROFILE:
- Technical degree in Computer Science, Software Engineering or comparable
- Experience or certification in Databricks
- Fluent English
- At least 3 years of proven experience
- Availability to travel

YOUR BENEFITS: In addition to the opportunity to make our world a little more sustainable, we offer you:
*Some offers may vary by location. **Hybrid working in accordance with the company's internal policy.
You place great value on high quality – this includes a deep understanding of software architectures, an affinity for conceptual clarity, and the use of design patterns and code reviews. You have the ambition to develop yourself continuously – whether through internal tech talks, cross-team communities, conferences or research.
What you bring:
- Degree, data background: You have a completed degree in business informatics, computer science, data science or a comparable qualification.
- Professional experience, analytics know-how: You have at least 8 years of experience in data analytics, BI, DWH or AI projects with direct customer contact.
- Solution design, pre-sales expertise: You have solid experience in requirements management, in developing solution designs, and in pre-sales and proposal processes.
- Technology competence, cloud platforms: You bring a deep understanding of analytics and AI architectures as well as experience with MS Fabric, Snowflake or Databricks.
- Commercial acumen, multi-project management: You can assess and price projects commercially and steer several customer and project contexts in parallel.
- Communication strength, personality: You present confidently at management level, communicate business-fluently in German and well in English, and convince through analytical thinking, deal-closing strength and willingness to travel.
As Senior Data Engineer (m/w/d), you design and operate scalable data pipelines and architectures that support Nordex analytics, reporting and machine learning solutions. You work closely with data scientists and engineering teams to deliver robust, high‑quality datasets.
- You are responsible for the conceptual, logical, and structural integrity of our Core Data Model as well as the Gold Layer across Azure, Snowflake, and dbt.
- You ensure that fragmented data sources are transformed into consistent, reusable, and decision‑relevant data products, actively preventing the platform from drifting into team‑specific, incompatible models.
- You define and maintain central business objects, canonical dimensions, shared metrics, and facts, ensuring that the Core Data Model serves as a stable, business‑oriented foundation across all domains.
- You develop modeling standards, naming conventions, layering concepts (Staging → Intermediate → Gold), reuse patterns, and dbt design guidelines, and you ensure their consistent implementation across all teams.
- You safeguard the semantic consistency of the entire data model, resolve domain conflicts, ensure that identical business terms are modeled only once, and review changes affecting core layers.
- You act as the technical design authority for model changes in Snowflake/dbt, balancing local requirements with long‑term model coherence, and ensuring that all models remain performant, scalable, maintainable, and of high quality.
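The Staging → Intermediate → Gold layering named above can be sketched as stacked SQL views, where each layer reads only from the layer below – the same dependency discipline dbt models follow. A minimal, illustrative example (all table, view and column names are assumptions; shown via Python's `sqlite3`):

```python
import sqlite3

# Minimal three-layer sketch: staging holds raw rows, intermediate
# conforms them (canonical country codes), gold exposes the reusable
# business aggregate that downstream consumers share.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Staging: raw source rows, lightly renamed.
    CREATE TABLE stg_sales (country TEXT, amount REAL);
    INSERT INTO stg_sales VALUES ('de', 10.0), ('DE', 5.0), ('at', 7.0);

    -- Intermediate: cleaned and conformed, reads only from staging.
    CREATE VIEW int_sales AS
        SELECT UPPER(country) AS country, amount FROM stg_sales;

    -- Gold: business-ready aggregate, reads only from intermediate.
    CREATE VIEW gold_sales_by_country AS
        SELECT country, SUM(amount) AS total
        FROM int_sales
        GROUP BY country;
""")

rows = conn.execute(
    "SELECT country, total FROM gold_sales_by_country ORDER BY country"
).fetchall()
print(rows)  # [('AT', 7.0), ('DE', 15.0)]
```

Because the gold layer never touches raw staging tables directly, a conformance fix (here, the `UPPER(country)` normalization) is made once in the intermediate layer and every downstream model inherits it – the coherence property this role is meant to enforce.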