Join Schwarz IT Barcelona - IT Hub of Europe's Largest Retail Group
At Schwarz IT Barcelona, we provide high-value IT services for the entire Schwarz Group, including Lidl, Kaufland, Schwarz Produktion, PreZero, Schwarz Digits, STACKIT, and XMCyber. As part of a top 5 global retail company, we serve 6 billion customers through 13,700 stores in 32 countries, supported by over 575,000 employees.
We are looking for open-minded colleagues with a passion for technology, eager to explore diverse and exciting career opportunities in a dynamic work environment that fosters development and progress. Elevate your career with us, where growth and innovation are at the core of our mission.
YOUR TASKS
- Collaborate with teams to build data services for ingestion, processing, and visualization of data insights.
- Integrate cloud providers and third-party tools to offer a comprehensive view of cloud costs, code quality, and software security.
- Provide essential platform services such as billing data ingestion, data configuration portal management, data contracts, and data pipeline definition.
- Design, develop, and implement data integration solutions supporting batch/ETL and API-led integrations that add business value.
- Assess current technology, identify gaps, and develop future state roadmaps.
- Develop policies on data quality, security, retention, and stewardship, and assess their impact on projects.
- Serve as an expert technical resource across multiple initiatives.
- Work in a team environment with a global workforce, vendors, and contractors.
- Translate business requirements into detailed technical specifications.
- Mentor peers and foster knowledge sharing.
- Evaluate and advocate for advanced tools and technologies.
- Participate actively in Agile development processes.
- Collaborate with cross-functional teams to understand data needs and deliver solutions.
- Monitor, troubleshoot, and optimize data pipelines.
- Assist in designing APIs for seamless data integration.
YOUR PROFILE
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, building complex data pipelines for big data workloads.
- 4+ years working with data and computation frameworks and systems.
- 4+ years programming in Python or Java.
- 5+ years of SQL experience, including query optimization.
- 3+ years working with cloud technologies, preferably GCP and Azure.
- 3+ years with Data Lake, Data Warehouse, and Data Lakehouse architectures.
- Strong communication, problem-solving, organizational, and analytical skills.
Preferred:
- Master’s degree in a related field.
- Experience with cloud environments such as GCP, Azure, or AWS.
- Practical experience with Pandas and PySpark.
- Knowledge of data platform technologies such as Databricks, Unity Catalog, and the Delta ecosystem.
- Experience with data streaming technologies such as Apache Spark, Flink, or Kafka.
- Familiarity with orchestration tools like Apache Airflow.
- Experience with Infrastructure as Code tools such as Terraform or Pulumi.
- Knowledge of Agile data engineering principles and ELT/ETL methodologies.
- Understanding of data warehousing concepts and strong troubleshooting skills.