Working for Siemens Financial Services Information Technology (SFS IT), you will assume end-to-end responsibility for one or more of our business-critical finance IT applications. You will secure reliable operations and continuous enhancement while ensuring that regulatory and security requirements, as well as agreed service levels, are met. You will achieve this working hand in hand with our interdisciplinary and international team of IT experts.
There are more than 1,200 professionals working in Cybersecurity, Analytics & Business Intelligence, Application Lifecycle Management, IT Project & Service Management, and IT Infrastructure Management.
We are a global powerhouse focusing on the areas of electrification, automation, and digitalization. One of the world's largest producers of energy-efficient, resource-saving technologies, Siemens is a leading supplier of systems for power generation and transmission as well as medical diagnosis. The company also plays a pioneering role in infrastructure and industry solutions.
Looking for a chance to create a positive impact on our society? Join us!
What role will you play?
Data is at the heart of our business and one of our greatest assets, and we treat it as such. Who are we? We are the Data Management Chapter within Siemens Financial Services' Information Technology & Cybersecurity function: a global team of experts accountable for Data Warehouse architectures and development principles, shaping the data management of today and tomorrow.
- In collaboration with all relevant stakeholders (e.g., Chapter & Delivery Lead, Business) and based on predefined business requirements, you design the internal architecture of our Data Warehouse and Data Lake of today and tomorrow.
- This includes end-to-end responsibility, from taking on the business requirements for a specific use case or user story, to translating them into technical solution designs and ensuring that your technical concepts are implemented by your data engineering colleagues as specified.
- We work in a use-case- and project-oriented way, meaning that you take a leading role as Technical Lead within the squad(s) you are assigned to (e.g., for an upcoming migration and/or data integration project).
- In close collaboration with the Lead Data Architect, you further develop, govern, and communicate our Data Warehouse architecture and data modelling guidelines and principles, thereby ensuring standardized Data Warehouse delivery and the build of new features.
- You advise and support your data engineering colleagues in all their activities, proactively incorporate the requirements of other colleagues (e.g., DWH Automation, Compliance & Governance, GDPR, finance regulatory), and always keep an eye on the needs of your squads and customers.
- You support your key management stakeholders, e.g., through well-targeted technical consultancy, effort estimation for the implementation of new features, and preparation of senior- and C-level management meetings.
- You identify knowledge gaps in the squads and develop recommendations for closing them (e.g., through training, coaching, or workshops).
- As you are also familiar with at least one programming language (e.g., SQL, Python), you take on smaller software development tasks when necessary.
We are looking for:
- The basis of your success is a degree in Information Technology, Mathematics, Physics, Business Administration, or equivalent, with at least three years' experience. Knowledge of the finance and banking industry is a big plus.
- You are enthusiastic about data, data warehousing, and engineering, and you have gained several years of experience as a Data Warehouse Architect or DWH solution design expert in your current and/or previous positions, or you are a Data Warehouse Developer ready to make the next big move in your career.
- You have hands-on, practical experience with cloud data warehouse technologies (ideally Snowflake on Azure).
- You are familiar with innovative technologies and concepts of the Azure ecosystem (Azure Data Lake Storage Gen2, Azure Data Factory, Kafka/CDC, Docker/containerization, AKS/Kubernetes, Azure Active Directory (AD), Data Warehouse automation, and APIs).
- You have a proven track record in data modelling with Data Vault 2.0.
- You have a proven track record across the diverse fields of Data Warehouse architecture (Data Vault 2.0; multi-layer, Lambda, and Kappa architectures), data processing (load design patterns, CDC, streaming, batch, etc.), and/or the setup and roll-out of complex role- and access-rights concepts.
- Advanced programming experience, at least in SQL.
- Familiarity with design patterns and object-oriented thinking.
- High motivation to meet deadlines, a hands-on mentality, flexibility, and proactive support of your colleagues.
- A highly agile mindset, as well as particularly effective communication and presentation skills.
- English is necessary; German would be a plus.
- Proven project experience in migrating on-premises data warehouses (e.g., SAP BW/SAP HANA, Oracle, Microsoft SQL Server) to the cloud (ideally Snowflake on Azure) would be a great plus.
- Additional programming languages (e.g., Python, Java, C++) would be a plus.
Apply for job
To view the job application please visit jobs.siemens.com.