Skillset:
· Understand and translate business needs into data models supporting long-term solutions.
· Develop high-performing PL/SQL procedures for both OLTP and batch operations; must exhibit deep understanding of and experience in database and PL/SQL tuning at both the logical and system levels.
· Hands-on experience building complex, large-scale data management systems
· Expertise in at least one RDBMS (Oracle, SQL Server, etc.)
· Provide expertise in database design, development, data modelling and architecture
· Hands-on working experience with multiple database systems/technologies
· Hands-on experience writing complex queries, triggers, functions, packages and procedures (see the PL/SQL sketch after this list)
· Independently create and maintain application schema upgrade scripts
· Manage complex relationships and rules-based systems
· Understand how to leverage infrastructure to solve such large-scale problems.
· Develop tools and contribute to open source wherever possible.
· Adopt problem solving as a way of life: always go to the root cause, and support the code you write in production.
· Work with the Application Development team to implement data strategies, build data flows and develop conceptual data models.
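As a rough, hedged illustration of the PL/SQL work described above (not a prescribed implementation), the sketch below shows a small package with one procedure. The package, table and column names (order_maint, archive_old_orders, orders, orders_archive) are hypothetical placeholders.

    -- Minimal sketch, assuming hypothetical ORDERS and ORDERS_ARCHIVE tables
    -- with matching columns; transaction control is left to the caller.
    CREATE OR REPLACE PACKAGE order_maint AS
      PROCEDURE archive_old_orders(p_cutoff IN DATE);
    END order_maint;
    /

    CREATE OR REPLACE PACKAGE BODY order_maint AS
      PROCEDURE archive_old_orders(p_cutoff IN DATE) IS
      BEGIN
        -- Copy closed orders older than the cutoff into the archive table,
        -- then remove them from the hot OLTP table.
        INSERT INTO orders_archive
          SELECT *
            FROM orders
           WHERE order_status = 'CLOSED'
             AND order_date   < p_cutoff;

        DELETE FROM orders
         WHERE order_status = 'CLOSED'
           AND order_date   < p_cutoff;
      END archive_old_orders;
    END order_maint;
    /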
Job Responsibilities:
· Experience resolving performance issues, optimizing queries, tuning databases, interpreting query execution plans, resolving deadlocks, etc. (a short execution-plan example follows this list).
· You have previously worked on building data pipelines that ingest and transform large numbers of events per minute and terabytes of data per day.
· You have worked with a SQL interface before, have experimented with or read about NoSQL implementations, and understand when to use one over the other. Preferably, you have worked with a cloud-based big data processing platform such as Google Cloud DataProc.
· You are passionate about producing clean, maintainable and testable code as part of a real-time data pipeline.
· You can connect different services and processes together, even if you have not worked with them before, and follow the flow of data through various pipelines to debug data issues.
· You understand the issues involved in ingesting data from applications in multiple data centres across geographies, on-premises and in the cloud, and will find a way to solve them.
· Working knowledge of a Linux environment is a plus
· Previous experience working with Google tools and services is a big plus
· Strong leadership and management skills.
· Data Warehousing/Data Modelling: very strong SQL, data modelling and transformation skills, with the ability to manipulate large data sets, automate processes and debug scripts.
· Creation and monitoring of reports.
· Extracting, manipulating and analyzing data.
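As a hedged example of the execution-plan work mentioned above, inspecting a plan in Oracle can look like the following; the orders table and the :cust_id bind variable are hypothetical.

    -- Generate and display the estimated execution plan for a query
    -- (hypothetical ORDERS table and :cust_id bind variable).
    EXPLAIN PLAN FOR
      SELECT o.order_id, o.order_total
        FROM orders o
       WHERE o.customer_id = :cust_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);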
We welcome every application - your gender, age, origin, religion or sexuality do not matter. Networked togetherness. That is Tech Mahindra Germany.