SKF has been around for more than a century, and today we are one of the world’s largest suppliers of bearings and supporting solutions for rotating equipment. Our products can be found virtually everywhere in society, which makes us an important part of the everyday lives of people and companies around the world.
In September 2024, SKF announced the separation of its Automotive business, with the objective of building two world-leading businesses. The role you are applying for will be part of the Automotive business, which means you will have the opportunity to help shape a new company aimed at meeting the needs of the transforming global automotive market. Would you like to join us in shaping the future of motion? We are now looking for a …
Senior Data Platform Architect - Automotive Business
Data Factory is the central data engine of the SKF Automotive organization. We own and manage our data platforms and the entire data engineering lifecycle, powering a self-service data platform and delivering end-to-end data services. We ensure the reliability of the data ecosystem and enable business units to use data effectively and confidently.
SKF Automotive's Data Factory department is seeking a Senior Data Platform Architect to define the future of our Data Factory ecosystem. In this pivotal role, you will be a principal architect and builder of the foundational technology and platform that underpin our entire Data Factory operation. You will be responsible for creating the architectural blueprint and technology roadmap that will guide our evolution. By proactively partnering with Data Product Managers and Data Domain Leads, you will ensure our data platform is not only powerful and scalable today but also prepared for the challenges and opportunities of tomorrow.
This is a role for a strategic thinker and influential leader who can design a cohesive, secure, and cost-effective data platform architecture. You will be the technical north star for our data engineering teams, empowering them to build with clarity, consistency, and a shared understanding of our long-term goals.
You will also oversee the daily management of IT systems and applications within the scope of your role, ensuring efficiency, effectiveness, and the adoption of best-in-class solutions.
Join our team to drive transformative initiatives and foster a culture of collaboration and innovation.
Key responsibilities
- Act as a senior technical expert for key data integration and transformation tools, including Informatica, Fivetran HVR, and dbt. Design and architect robust solutions, stay current with new features and roadmaps, and provide expert guidance to technical teams.
- Collaborate effectively with platform vendors and service delivery partners to ensure the successful delivery and integration of their services. You will be responsible for defining technical requirements, overseeing partner activities, and validating that their work meets our architectural standards and business objectives.
- Actively contribute to our data lakehouse architecture, leveraging our modern tech stack to build and refine data pipelines. You will enable the platform by building capabilities to process data seamlessly across raw, mastered, and application-ready layers, delivering high-quality, layered data solutions.
- Collaborate with data and integration architects, data engineers, analytics engineers, and data platform engineers to design scalable, reliable, and robust data ingestion processes and pipelines.
- Play a key role in empowering the organization with self-service capabilities by contributing to the development of our self-service data ingestion platform.
- Stay up to date with platform developments, best practices, and vendor updates, ensuring the platform remains efficient and secure while continuously enhancing its capabilities.
- Lead the design and implementation of robust, scalable data ingestion pipelines that pull data from our core SAP systems (e.g., SAP S/4HANA, BW, MDG, C4C) into our Snowflake data platform. This includes collaborating with SAP platform experts and functional experts to develop solutions for both bulk loads and incremental change data capture / near-real-time data capture.
- Enable stakeholders to effectively use the tools in the toolkit through training and proactive guidance.
- Identify and execute cost optimization opportunities, apply cost controls, and conduct capacity planning sessions for continuous improvement.
Requirements
- Bachelor’s or Master’s degree in Computer Science, Information Systems, Cloud Computing, Software Engineering, or a related field.
- Deep, hands-on expertise in architecting and implementing solutions with modern data integration tools such as Informatica and Fivetran HVR, and data transformation tools such as dbt.
- Proven experience in a data engineering role, with a strong understanding of data modeling, data warehousing, and ETL/ELT principles.
- Proven experience developing and optimizing data pipelines with dbt.
- Experience with version control systems (Git) and implementing CI/CD practices for dbt projects.
- Proven experience working on one or more of the following platforms: Snowflake, Databricks, or Microsoft Fabric.
- Prior experience building and maintaining data pipelines that extract and land data from SAP ERP systems into a modern data warehouse (Snowflake, Databricks or similar).
- Experience working with cloud platforms, preferably Microsoft Azure. Proficiency with Azure's native data services, e.g., Azure Data Factory (ADF), would be an advantage.
- Experience using GenAI, prompt engineering, and frameworks such as MCP (Model Context Protocol) to build faster, more reliable data pipelines is highly desirable.
- Strong communication, stakeholder management, and leadership skills with a proven ability to balance technical nuances with strategic vision.
- Ability to work collaboratively in a dynamic, fast-paced environment.
- A desire to learn and grow, with an interest in emerging and new technologies.
You will enjoy working here if you value
- A dynamic environment where you work closely with cross-functional teams to drive meaningful data solutions.
- Opportunities to lead transformative initiatives and contribute to operational excellence.
- A culture that values proactive problem-solving, autonomy, and ongoing professional development.
SKF is committed to creating a diverse environment, and we firmly believe that a diverse workforce is essential for our continued success. Therefore, we only focus on your experience, skills, and potential. Come as you are – just be yourself. #weareSKF
Some additional information
This role can be located at any of SKF Automotive's sites globally, and the preferred locations are Gothenburg - Sweden, Bangalore - India, and Cajamar, São Paulo - Brazil.
You will report to the Head of Data Factory. For questions regarding the recruitment process, please contact Stina Scheller, Automotive Recruitment Expert, at [email protected]. Please note that we cannot accept applications or CVs via email.
Is this you?
If the answer is yes, submit your application with your CV in English no later than August 18, 2025.
We will screen candidates continuously throughout the application period, so make sure to submit your application as soon as possible.
We are building a strong and diverse team with a broad palette of responsibilities. We invite you to explore all currently open positions in Automotive IT here: Automotive IT, Information Security and Process Excellence.