Project Description:
We are looking for highly skilled Data Engineers to join our team in DBIZ. The Technik Value Stream teams are responsible for data ingest on ODE, development of the relevant data products on ODE, and operations of the data products on ODE.
Activity description and concrete tasks:
- Infrastructure Deployment & Management: Efficiently deploy and manage infrastructure on Google Cloud Platform (GCP), including network architectures (Shared VPC, hub-and-spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and load balancing.
- Data Processing & Transformation: Use a Hadoop cluster with Hive for querying data and PySpark for data transformations. Implement job orchestration using Airflow.
- Core GCP Services Management: Work extensively with services such as Google Kubernetes Engine (GKE), Cloud Run, BigQuery, Compute Engine, and Composer, all managed through Terraform.
- Application Implementation: Develop and implement Python applications for various GCP services.
- CI/CD Pipelines: Integrate and manage GitLab Magenta CI/CD pipelines to automate cloud deployment, testing, and configuration of diverse data pipelines.
- Security & Compliance: Implement comprehensive security measures, manage IAM policies and secrets using Secret Manager, and enforce identity-aware policies.
- Data Integration: Handle integration of data sources from CDI, Datendrehscheibe (FTP servers), TARDIS APIs, and Google Cloud Storage (GCS).
- Multi-environment Deployment: Create and deploy workloads across Development (DEV), Testing (TEST), and Production (PROD) environments.
- AI Solutions: Implement AI solutions using Google's Vertex AI for building and deploying machine learning models.
- Certification Desired: Must be a certified GCP Cloud Architect or Data Engineer.
Qualifications / Skills Required:
- Strong proficiency in Google Cloud Platform (GCP)
- Expertise in Terraform for infrastructure management
- Skilled in Python for application implementation
- Experience with GitLab CI/CD for automation
- Deep knowledge of network architectures, security implementations, and management of core GCP services
- Proficiency in data processing tools such as Hive and PySpark, and data orchestration tools such as Airflow
- Familiarity with managing and integrating diverse data sources
- Certified GCP Cloud Architect and Data Engineer
Additional Information:
What do we offer you?
- International, positive, dynamic, and motivated work environment.
- Hybrid work model (telecommuting / on-site).
- Flexible schedule.
- Continuous training: certification preparation, access to Coursera, weekly English and German classes...
- Flexible compensation plan: medical insurance, restaurant tickets, day care, transportation allowances...
- Life and accident insurance.
- More than 26 working days of vacation per year.
- Social fund.
- Free access to specialists (doctors, physiotherapists, nutritionists, psychologists, lawyers...).
- 100% of salary in case of medical leave.
- And many more advantages of being part of T-Systems!
If you are looking for a new challenge, do not hesitate to send us your CV! Please send your CV in English. Join our team!
T-Systems Iberia will only process the CVs of candidates who meet the requirements specified for each offer.
Remote Work:
Employment Type: Full-time
Key Skills
Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Experience: years
Vacancy: 1