Professional area: Information Technology
Contract type: Permanent
Professional level: Experienced
Location: Madrid, ES, 00000
At JTI we celebrate differences, and everyone truly belongs. 46,000 people from all over the world are continuously building their unique success story with us. 83% of employees feel happy working at JTI.
To make a difference with us, all you need to do is bring your human best.
The Data Platform Engineer Manager provides guidance on the design and management of data for data applications, formulates best practices and development standards, and organizes processes for data management, governance, and evolution. He/She builds processes and tools to maintain high data availability, quality, and maintainability, and will develop and maintain an architectural roadmap for data products and data services, ensuring alignment with the business and Enterprise Architecture strategies and standards.
The incumbent will determine technical solutions that further business goals and align with corporate technology strategies, keeping in mind performance, reliability, scalability, usability, security, flexibility, and cost.
He/She will explore new ways to implement and automate data quality and reliability processes to reduce development times and provide optimal, cost-effective data assets for Data Analysts and Data Scientists to consume.
What will you do - Responsibilities:
- Central Data Lake Platform re-engineering towards a new Data Mesh paradigm:
- Collaborate closely on the implementation of the Data Mesh concept by provisioning the Azure resources needed by the data domains or the data federation modules (a minimal provisioning sketch follows this list)
- Design and develop artifacts that will reduce the deployment time of new data domains inside the Data Mesh concept
- Participate in new data initiatives (e.g., Purview, Trident) to facilitate their implementation
- Design the data pipelines for the corresponding projects, ensuring they are highly available, scalable, reliable, secure, and cost-effective
- Collaborate with GTC, EA, and IT Security teams to ensure that new development fulfills security and technical requirements
- Design and document the application topology, technology recommendations, and delivery methodology
- Work with the project manager and external service providers to estimate resources, budget, and plan. Highlight known risks.
- Design unit / integration tests, contribute to the engineering wiki, and document work
- Supervise the development up to the UAT phase and coordinate bug fixing during UAT
- Support rollouts and deployments to Production, ensuring the cutover is defined and followed up
- Ensure data pipeline maintenance and performance testing
- Take responsibility for the CI/CD of developments
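To make the provisioning responsibility above concrete, here is a minimal sketch of standing up the core Azure resources for a new data domain with the Azure SDK for Python. It is an illustration only, not JTI's actual setup: it assumes the azure-identity, azure-mgmt-resource, and azure-mgmt-storage packages, a credential that DefaultAzureCredential can resolve, and placeholder values for the subscription id, domain name, region, and naming convention.

```python
# Minimal sketch (assumed setup): provision a resource group and an ADLS Gen2
# storage account for a new data domain.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
DOMAIN = "sales"                       # hypothetical data-domain name
LOCATION = "westeurope"                # example region

credential = DefaultAzureCredential()
resources = ResourceManagementClient(credential, SUBSCRIPTION_ID)
storage = StorageManagementClient(credential, SUBSCRIPTION_ID)

# One resource group per data domain keeps cost and access boundaries explicit.
rg_name = f"rg-datamesh-{DOMAIN}"
resources.resource_groups.create_or_update(rg_name, {"location": LOCATION})

# ADLS Gen2 account (hierarchical namespace enabled) as the domain's storage layer.
account_name = f"stdatamesh{DOMAIN}"   # must be globally unique in practice
poller = storage.storage_accounts.begin_create(
    rg_name,
    account_name,
    StorageAccountCreateParameters(
        location=LOCATION,
        kind="StorageV2",
        sku=Sku(name="Standard_LRS"),
        is_hns_enabled=True,
    ),
)
print("Provisioned:", poller.result().name)
```

Packaging this kind of script (or an equivalent template) as a reusable artifact is what shortens the deployment time of each new data domain.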
- Central Data Lake Platform Governance monitoring, automation, optimization, and support:
- Institute patterns that support data ingestion, data movement, transformations, aggregations, and more
- Create optimal data pipeline architecture, ensuring the data is in the most suitable format, from both a consumption and a cost perspective, at every stage of the data lake
- Conceptualize and generate infrastructure that allows big data to be accessed and analyzed
- Reformulate existing data frameworks to optimize their functioning
- Remain up-to-date with industry standards and technological advancements that will improve the operation and quality of the data platform
- Design and implement monitoring processes for the whole Central Data Lake Platform, including performance, cost, and availability indicators at the project/resource level (a monitoring query sketch follows this list)
- Provide, review, and confirm technical solutions aligned with the Data Platform Delivery Team Manager
- Identify and design internal process improvements : automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
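As an illustration of the monitoring point above, the following is a minimal sketch of a platform health check using the azure-monitor-query package. It assumes, purely for the example, that Data Factory diagnostics are routed to a Log Analytics workspace (exposing the ADFPipelineRun table); the workspace id is a placeholder and the alerting hook is left as a comment.

```python
# Minimal sketch (assumed setup, not JTI's actual monitoring): count failed
# pipeline runs per pipeline over the last 24 hours from Log Analytics.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<log-analytics-workspace-id>"  # placeholder

QUERY = """
ADFPipelineRun
| where Status == 'Failed'
| summarize failures = count() by PipelineName
| order by failures desc
"""

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(WORKSPACE_ID, QUERY, timespan=timedelta(days=1))

for table in response.tables:
    for pipeline, failures in table.rows:
        # In a real setup, feed this into an alert or dashboard instead of stdout.
        print(f"{pipeline}: {failures} failed runs in the last 24h")
```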
- DevOps service platform architecture:
- Infrastructure Management: Design, implement, and manage infrastructure using Infrastructure as Code (IaC) tools
- CI/CD Pipelines: Develop and maintain Continuous Integration and Continuous Delivery (CI/CD) pipelines to automate software and code releases
- Collaboration: Facilitate communication and collaboration between development, operations, and other stakeholders to improve productivity and efficiency
- Monitoring and Logging: Implement monitoring, logging, alerts, and dashboards to track the performance and health of applications and infrastructure
- Automation: Write and maintain scripts to automate tasks and DevOps processes (a small automation-script sketch follows this list)
- Support and Troubleshooting: Provide support and troubleshoot issues related to applications, systems, and infrastructure
- Cloud Management: Efficiently manage and monitor cloud resources, implementing autoscaling and other techniques to maintain optimal performance
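As a small illustration of the automation and CI/CD points above, here is a sketch of a post-deployment smoke test that a pipeline step could run. The endpoint URLs are hypothetical placeholders and the script uses only the Python standard library.

```python
# Minimal sketch: post-deployment smoke test for a CI/CD pipeline step.
# The endpoints are hypothetical placeholders; standard library only.
import sys
import urllib.request

ENDPOINTS = [
    "https://example-data-api.azurewebsites.net/health",
    "https://example-ingest-func.azurewebsites.net/health",
]

failed = []
for url in ENDPOINTS:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            if resp.status != 200:
                failed.append((url, f"HTTP {resp.status}"))
    except Exception as exc:  # timeouts, DNS errors, HTTP errors, ...
        failed.append((url, str(exc)))

for url, reason in failed:
    print(f"UNHEALTHY {url}: {reason}")

# A non-zero exit code fails the pipeline step, blocking the release.
sys.exit(1 if failed else 0)
```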
Skills and Qualifications:
- College or University degree
- 2+ years of strong experience with metadata-driven frameworks; CPG.ai is a must
- 10+ years of experience in Data and Analytics with high exposure to IT architecture and solutioning for data flows among systems and applications
- 5+ years delivering solutions on the Microsoft Azure platform with a special emphasis on data solutions and services
- 5+ years delivering solutions on the Power BI platform with a special emphasis on architecture, security, and maintenance
- 5+ years of DevOps and CI/CD experience
Are you ready to join us? Build your success story at JTI. Apply now!
Next Steps:
After applying, if selected, please anticipate the following within 1-3 weeks of the job posting closure: Phone screening with Talent Advisor > Assessment tests > Interviews > Offer. Each step is eliminatory and may vary by role type.
At JTI, we strive to create a diverse and inclusive work environment. As an equal-opportunity employer, we welcome applicants from all backgrounds. If you need any specific support, alternative formats, or have other access requirements, please let us know.