8+ years of overall experience, including 5 to 8 years of experience in Hadoop, Airflow & Cloud.

1. Install, configure, and support Hadoop clusters on the Cloudera Distribution.
2. Design and manage the commencement of new Hadoop projects.
3. Execute system upgrades for existing Hadoop clusters using Cloudera.
4. Install, configure, and support HIVE, IMPALA, SOLR, SPARK, and KAFKA.
5. Handle high-severity incidents related to Hadoop.
6. Manage the MySQL databases that support the Hadoop clusters.
7. Set up and manage BDR (Backup and Disaster Recovery) jobs.
8. Handle Hadoop hardware procurement and licensing.
9. Implement SSL/TLS on clusters.
10. Work closely with clients as part of a cross-functional team to support operational, tactical, and strategic reporting, as well as other analytical business needs.
11. Ensure change management and incident management protocols are followed.
12. Collaborate with internal teams throughout the project delivery lifecycle.
13. Investigate, model, define, and document required processes, including business processes.
14. Leverage automation skills to minimize manual effort in daily support tasks.
15. Provide on-call support.
16. Review and analyse change requests to ensure adherence to protocols.
17. Manage code in GitHub; experience in managing Git repositories.
18. Engage the necessary teams via JIRA.
19. Install Hadoop patches, updates, and version upgrades.
20. Understand UDFs and Lambda Architecture.
21. Have a good understanding of Spark Streaming with Kafka for real-time processing (a minimal sketch follows this list).
22. Experience in Airflow DAG analysis (see the DAG sketch after the list).
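For illustration of item 21, a minimal PySpark sketch of consuming a Kafka topic with Spark Structured Streaming. The broker address, topic name, and console sink are hypothetical placeholders, and the spark-sql-kafka connector package is assumed to be available on the Spark classpath.

# Minimal sketch: Spark Structured Streaming reading from Kafka.
# Broker, topic, and sink are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-streaming-sketch")
    .getOrCreate()
)

# Subscribe to a Kafka topic; records arrive as binary key/value columns.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical broker
    .option("subscribe", "events")                      # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Write the decoded stream to the console for inspection; a production job
# would typically write to HDFS, Hive, or another Kafka topic instead.
query = (
    events.writeStream
    .format("console")
    .outputMode("append")
    .option("truncate", "false")
    .start()
)

query.awaitTermination()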
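Similarly, for item 22, a minimal sketch of an Airflow DAG (Airflow 2.x style). The DAG id, schedule, and bash commands are hypothetical placeholders chosen to reflect routine Hadoop administration checks, not part of the original posting.

# Minimal sketch: a daily Airflow DAG running two HDFS health checks.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "hadoop-admin",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_hdfs_health_check",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    # Report HDFS capacity and under-replicated blocks.
    hdfs_report = BashOperator(
        task_id="hdfs_report",
        bash_command="hdfs dfsadmin -report",
    )

    # Run a filesystem consistency check after the report completes.
    hdfs_fsck = BashOperator(
        task_id="hdfs_fsck",
        bash_command="hdfs fsck / -blocks > /tmp/fsck_$(date +%F).log",
    )

    hdfs_report >> hdfs_fsck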
Hadoop Administration
Key skills: Hadoop Admin, Airflow & Cloud, Spark Streaming with Kafka, Airflow DAG analysis
Category: Technology > Big Data - Hadoop > Hadoop Administration