We are currently looking for a Senior Software Engineer to join our APM Terminals Integration Platform Portfolio team, based in Bangalore (India).

Key Responsibilities

- Building software in accordance with the Maersk standards and guidelines
- Taking responsibility for the quality design and implementation (supportable, maintainable, scalable, performant, secure, and efficient) of data-driven applications delivering business value
- Providing design validation on key technologies
- Building data products in the form of APIs and derived events in a reusable way
- Using disparate data sources to deliver complex event-driven products, building a good understanding of the business by partnering with it
- Designing and engineering data products that are lean and end-to-end, with a value-oriented mindset, considering FinOps, Lean, and true MVPs when delivering products
- Supporting the onboarding of data from multiple terminals with different cultures and differing levels of technical debt
- Taking part in on-call rotations with the platform consumers (product development teams) and taking the lead in preventing incidents and maintaining platform SLAs through automation and blameless postmortems
- Ensuring builds are kept green and the code management (branching) strategy is closely followed
- Raising capability and standards within the team: pairing on tasks, peer-reviewing team members' code, and giving constructive, blame-free feedback to improve both the code base and team capability
- Contributing proactively to continual improvement within your team, both through active participation in retrospectives and through engagement with cross-team best-practice communities
- Advising Product Owners to identify and manage risks, technical debt, issues, and opportunities for technical improvement
- Supporting the recruitment of engineers across the department
- Providing technical support during cut-over activities

Who we are looking for:

We are looking for candidates with a proven performance track record and the following:

Technical Skills

- At least 4 years of relevant experience with Kafka
- Developing and implementing solutions using Kafka
- Administering and improving the use of Kafka across the organization, including Kafka Connect, Kafka Streams, and custom implementations
- Working with multiple teams to ensure the best use of Kafka and data-safe event streaming
- Understanding and applying event-driven architecture patterns and Kafka best practices, and enabling other development team members to do the same
- Strong knowledge of and experience with the Kafka Streams API, Kafka Connect, Kafka brokers, ZooKeeper, API frameworks, pub/sub patterns, Schema Registry, KSQL, REST Proxy, Replicator, ADB, Operator, and Kafka Control Center
- Hands-on experience with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, and the FileStream connector
- Working knowledge and expert-level understanding of data migration and change data capture (CDC) as they relate to Kafka, using Kafka Connect and Debezium
- Knowledge of source and sink connector technical details for a variety of platforms, including PostgreSQL, MS SQL Server, Oracle, and others as required
- Understanding the various Kafka-related metrics, reading dashboards, and providing support to ensure no downtime
- Experience with microservice application architecture and with developing Java Spring Boot container applications
- Strong experience in SQL and ETL development; Confluent KSQL is a plus
- Creating topics, setting up redundant clusters, deploying monitoring tools and alerts, with good knowledge of best practices
- Experience in building Kafka producer and consumer applications using Spring Boot

Must-Have Skills

- Adhering to SecDevOps best practices
- Working in a manner that drives cost efficiency (FinOps) through the removal of waste
- Driving reusability through well-designed data products
- Listening to the needs of technical and business stakeholders and interpreting them in partnership with the product owner
- Identifying areas of innovation in data tools and techniques, and recognizing the appropriate timing for adoption
- Understanding a range of techniques for data profiling and helping teams apply them
- Sourcing data analysis from a complex single source, and bringing multiple data sources together in a conformed model for analysis
- Working with open-source technology and introducing new technology to solve problems
- Demonstrably keeping up to date with the latest technology and industry trends
- A real passion for data and technology
- Extensive experience with database technologies and architecture
- Exceptional problem-solving and critical-thinking skills
- Experience of working with virtual teams and scrum teams
- The ability and initiative to see ambiguity as an opportunity, and to solve problems, innovate, and create
- Demonstrating a customer-first mentality (quick to market, understanding business outcomes) and striving to improve the product for the customer
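For illustration, the Kafka Connect and Debezium CDC work described above typically centres on connector configurations like the following minimal sketch, which registers a Debezium PostgreSQL source connector via the Kafka Connect REST API. The connector name, hostnames, credentials, database, and table names here are hypothetical placeholders, not details from this role:

```json
{
  "name": "terminal-inventory-cdc",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "tasks.max": "1",
    "database.hostname": "postgres.example.internal",
    "database.port": "5432",
    "database.user": "cdc_user",
    "database.password": "********",
    "database.dbname": "inventory",
    "topic.prefix": "terminal1",
    "table.include.list": "public.containers"
  }
}
```

Posted to a Kafka Connect worker (e.g. `POST /connectors`), a configuration of this shape streams row-level changes from the included tables into Kafka topics, which downstream sink connectors or Kafka Streams applications can then consume. Note that `topic.prefix` is the Debezium 2.x property; older 1.x deployments use `database.server.name` for the same purpose.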