
Databricks Data Engineer (f/m/d)
- Comunidad de Madrid
- Permanent
- Full-time
- Be a core contributor in Axpo’s data transformation journey by using Databricks as our primary data and analytics platform.
- Design, develop, and operate scalable data pipelines on Databricks, integrating data from a wide variety of sources (structured, semi-structured, unstructured).
- Leverage Apache Spark, Delta Lake, and Unity Catalog to ensure high-quality, secure, and reliable data operations.
- Apply best practices in CI/CD, DevOps, orchestration (e.g., Dagster, Airflow), and infrastructure-as-code (Terraform).
- Build reusable frameworks and libraries to accelerate ingestion, transformation, and data serving across the business.
- Work closely with data scientists, analysts, and product teams to create performant and cost-efficient analytics solutions.
- Drive the adoption of the Databricks Lakehouse architecture and ensure that data pipelines conform to the governance, access, and documentation standards defined by the CDAO office.
- Ensure compliance with data privacy and protection standards (e.g., GDPR).
- Actively contribute to the continuous improvement of our platform in terms of scalability, performance, and usability.
- A university degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Strong experience with Databricks, Spark, Delta Lake, and SQL/Scala/Python.
- Proficiency in dbt, ideally with experience integrating it into Databricks workflows.
- Familiarity with Azure cloud services (Data Lake, Blob Storage, Synapse, etc.).
- Hands-on experience with Git-based workflows, CI/CD pipelines, and data orchestration tools like Dagster and Airflow.
- Deep understanding of data modeling, streaming & batch processing, and cost-efficient architecture.
- Ability to work with high-volume, heterogeneous data and APIs in production-grade environments.
- Experience working within enterprise data governance frameworks, including implementing metadata management and observability practices in alignment with governance guidance.
- Strong interpersonal and communication skills, with a collaborative, solution-oriented mindset.
- Fluency in English.
- Core: Databricks, Spark, Delta Lake, Python, dbt, SQL
- Cloud: Microsoft Azure (Data Lake, Synapse, Storage)
- DevOps: Bitbucket/GitHub, Azure DevOps, CI/CD, Terraform
- Orchestration & Observability: Dagster, Airflow, Grafana, Datadog, New Relic
- Visualization: Power BI
- Other: Confluence, Docker, Linux
- Experience with Unity Catalog and Databricks Governance Frameworks
- Exposure to Machine Learning workflows on Databricks (e.g., MLflow)
- Knowledge of Microsoft Fabric or Snowflake
- Experience with low-code analytics tools like Dataiku
- Familiarity with PostgreSQL or MongoDB
- Front-end development skills (e.g., for data product interfaces)
- Working Hours
- Meal allowances
*Option to use it for public transportation or childcare instead.
- Internet Compensation
- Microsoft ESI Certifications
- Training courses
- Gym Coverage
- Health Insurance