Overview
Who We Are
Axpo is driven by a single purpose: to enable a sustainable future through innovative energy solutions. As Switzerland's largest producer of renewable energy and a leading international energy trader, Axpo leverages cutting-edge technologies to serve customers in over 30 countries. We thrive on collaboration, innovation, and a passion for driving impactful change.
About the Team
You will report to the Head of Development and collaborate closely with the Chief Data & Analytics Office (CDAO) and business data teams. We share the goal of unlocking data and enabling self-service analytics across Axpo, within a cross-functional operating model that keeps data quality, semantics, and governance aligned with enterprise standards. Working with the CDAO, you will help define and provide data pipeline templates used to ingest source data as data products, aligned with business semantics, domain ownership, and the enterprise data governance strategy. Our decentralized approach means close collaboration with business hubs across Europe, ensuring local needs shape our global platform. You’ll find a mindset committed to innovation, collaboration, and excellence.
What You Will Do
As a Databricks Data Engineer, you will:
- Be a core contributor to Axpo’s data transformation journey, using Databricks as our primary data and analytics platform.
- Design, develop, and operate scalable data pipelines on Databricks, integrating data from a wide variety of sources (structured, semi-structured, unstructured).
- Leverage Apache Spark, Delta Lake, and Unity Catalog to ensure high-quality, secure, and reliable data operations.
- Apply best practices in CI/CD, DevOps, orchestration (e.g., Dagster, Airflow), and infrastructure-as-code (Terraform).
- Build reusable frameworks and libraries to accelerate ingestion, transformation, and data serving across the business.
- Work closely with data scientists, analysts, and product teams to create performant and cost-efficient analytics solutions.
- Drive the adoption of Databricks Lakehouse architecture and ensure that data pipelines conform to governance, access, and documentation standards defined by the CDAO office.
- Ensure compliance with data privacy and protection standards (e.g., GDPR).
- Actively contribute to the continuous improvement of our platform in terms of scalability, performance, and usability.
What You Bring & Who You Are
We’re looking for someone with:
- A university degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Strong experience with Databricks, Spark, Delta Lake, and SQL/Scala/Python.
- Proficiency in dbt, ideally with experience integrating it into Databricks workflows.
- Familiarity with Azure cloud services (Data Lake, Blob Storage, Synapse, etc.).
- Hands-on experience with Git-based workflows, CI/CD pipelines, and data orchestration tools like Dagster and Airflow.
- A deep understanding of data modeling, streaming & batch processing, and cost-efficient architecture.
- The ability to work with high-volume, heterogeneous data and APIs in production-grade environments.
- Experience working within enterprise data governance frameworks and implementing metadata management and observability practices in alignment with governance guidance.
- Strong interpersonal and communication skills, with a collaborative, solution-oriented mindset.
- Fluency in English.

Technologies You’ll Work With
- Core: Databricks, Spark, Delta Lake, Python, dbt, SQL
- Cloud: Microsoft Azure (Data Lake, Synapse, Storage)
- DevOps: Bitbucket/GitHub, Azure DevOps, CI/CD, Terraform
- Orchestration & Observability: Dagster, Airflow, Grafana, Datadog, New Relic
- Visualization: Power BI
- Other: Confluence, Docker, Linux

Nice to Have
- Experience with Unity Catalog and Databricks governance frameworks
- Exposure to machine learning workflows on Databricks (e.g., MLflow)
- Knowledge of Microsoft Fabric or Snowflake
- Experience with low-code analytics tools like Dataiku
- Familiarity with PostgreSQL or MongoDB
- Front-end development skills (e.g., for data product interfaces)

Benefits
- Working Hours: Flexible working hours, with 60% remote and 40% at our offices in Madrid, Torre Europa.
- Meal Allowances: 11 € net per day, with the option to use it for public transportation or childcare.
- Internet Compensation: Up to 48.40 € net per month for home internet.
- Microsoft ESI Certification: Access to the Enterprise Skills Initiative program for certification and hands-on training in Microsoft and Azure technologies.
- Training Courses: Industry-specific training and learning channels.
- Gym Coverage: 90% coverage for gym access.
- Health Insurance: Comprehensive health insurance, with the option to extend coverage to spouse and children.