The Future Begins Here
At Takeda, we are leading digital evolution and global transformation. By building innovative solutions and future-ready capabilities, we are meeting the needs of patients, our people, and the planet.
Bengaluru, India's epicenter of innovation, has been selected as home to Takeda's recently launched Innovation Capability Center. We invite you to join our digital transformation journey. In this role, you will have the opportunity to boost your skills and become the heart of an innovative engine contributing to global impact and improvement.
At Takeda's ICC we Unite in Diversity
Takeda is committed to creating an inclusive and collaborative workplace, where individuals are recognized for the backgrounds and abilities they bring to our company. We are continuously improving our collaborators' journey at Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team.
The Opportunity:
As a Data Platforms Engineer, you'll oversee the development, implementation, and maintenance of our enterprise data platforms and services. You'll play a critical role in optimizing and streamlining our platforms, ensuring efficient and effective processes across the organization. Reporting to the Engineering Lead in the Technology & Data organization, this pivotal role is responsible for managing important data platforms that fit into the larger collection of global data systems. The role supports an enterprise infrastructure where business and technology partners work together to manage digital product offerings and their supporting platforms. Each platform should work well with the others and be optimized within its niche to bring the greatest value to the organization.
The Data Platform team builds and operates systems to centralize all internal and third-party data, making it easy for teams across the company to access, process, and transform that data for analytics, machine learning, and powering end-user experiences. As an engineer on the team, you will contribute to the full spectrum of our systems, from managing foundational processing and data storage, to building and maintaining scalable pipelines, to developing frameworks, tools, analytics, and internal applications to make that data easily and efficiently available to other teams and systems.
Responsibilities
Architecting and designing data platform and product capabilities, including batch and streaming data processing engines, data lakes, analytics, in-memory computing systems, distributed data systems, and data governance and observability.
Developing and maintaining the data platform technical design, covering data ingestion, processing (stream and batch), storage (analytics databases, data lakes, graph databases), and data access (REST/GraphQL APIs).
Participating in design and code review sessions as appropriate to ensure the quality of work in accordance with development standards.
Managing a vendor team of technical experts and interacting directly with other teams for timely resolution of production application issues.
Leading the development and implementation of operational platforms and systems to support the organization's operations.
Collaborating with cross-functional teams to understand operational requirements and translate them into technical solutions.
Conducting thorough testing and quality assurance checks to identify and resolve any issues or bugs.
Monitoring platform performance and identifying opportunities for improvement, efficiency, and automation.
Developing and maintaining platform documentation, including user guides, standard operating procedures, and training materials.
Providing training and support to end-users to ensure proper understanding and utilization of the operational platforms.
Staying up to date with industry trends and best practices related to operational platforms and systems.
Collaborating with external vendors and partners to leverage their expertise and resources. Owning internal IT processes such as the Change Control Board, patch application testing, and coordination and communication for outages.
Conducting performance and operations reviews for leadership and stakeholders, using and owning established service delivery metrics.
Working across teams to ensure system availability and minimize planned and unplanned outages. Working with the IM Risk and Security team to ensure all SOX and controllership requirements are met, including keeping policies, standards, documents, and applications compliant.
Skills and Qualifications
Bachelor's degree or higher in Computer Science/Information Technology, or relevant work experience.
Hands-on experience with data engineering tools such as Talend (or Informatica or Ab Initio) and Databricks (or Spark).
Working knowledge of data build tools, Azure Data Factory, continuous integration and continuous delivery (CI/CD), automated testing, data lakes, data warehouses
Mid-level to advanced knowledge of building analytics applications using Power BI/Spotfire/Tableau.
Ability to implement automation using DevOps and Infrastructure as Code (IaC) tools such as Terraform, Ansible, Chef, or Puppet.
Experience with ETL/orchestration tools (e.g., Informatica, Airflow).
Industry experience working with public cloud environments (AWS, GCP, or Azure), with a deep understanding of failover, high availability, and high scalability.
Data ingestion using one or more modern ETL compute and orchestration frameworks (e.g., Apache Airflow, Luigi, Spark, Apache NiFi, Flink, Apache Beam).
3+ years of experience with SQL or NoSQL databases: PostgreSQL, SQL Server, Oracle, MySQL, Redis, MongoDB, Elasticsearch, Hive, HBase, Teradata, Cassandra, Amazon Redshift, Snowflake.
Advanced SQL knowledge, including experience authoring queries against relational databases, as well as working familiarity with a variety of database technologies.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
You thrive in ambiguity, working with minimal guidance, taking extreme ownership, and creating win-win situations through creative solutions.
WHAT TAKEDA CAN OFFER YOU: