Gurugram, Haryana, India
Job posted on
Location Gurgaon
Functional Administration
Department Data Engineering
Designation Big Data Admin
Entity Valiance Analytics
Business Unit N/A
Number of Position(s) 1
Openings Valid Till
Company Description
-----------------------
Valiance is a global AI & Data Analytics firm helping clients build cutting-edge technology solutions for digital transformation in the areas of Credit Risk, Fraud, Customer Engagement, Predictive Maintenance, Quality Inspection, Data Lakes, IoT analytics, Supply Chain analytics, and more. Our team comprises 100+ professionals with Machine Learning, Data Engineering, and Cloud expertise.
What Makes Us Different?
1) An experienced, cross-trained team of seasoned technology professionals
2) A 10+ year history of happy clients and successful engagements
3) Expertise across the entire data lifecycle: data engineering, deployment, AI/ML, and cloud computing
4) Expertise with solutions architecture, execution, and ongoing management
5) Flexible, technology-agnostic approach to designing solutions that best meet client needs
6) Ability to comprehensively review and understand business requirements, technology constraints, compliance, and regulatory issues through senior partner-level involvement
-------------------
Skill Set (Must Have): Hadoop, Linux, RHEL, Ambari/Cloudera deployment platforms, Big Data, HDFS
Responsibilities:
• 3+ years of experience with Hadoop.
• Installing HDP/CDP in a Linux environment.
• Experience with on-premise applications, including thorough working knowledge of Linux commands.
• Deploying and maintaining Hadoop clusters.
• Performing health checks on the Hadoop cluster and monitoring that it is up and running at all times.
• Analysing storage data volumes and allocating space in HDFS.
• Managing resources in a cluster environment, including adding new nodes and removing unused ones.
• Configuring the NameNode to ensure high availability.
• Implementing and administering Hadoop infrastructure on an ongoing basis.
• Deploying required hardware and software in the Hadoop environment, as well as expanding existing environments.
• Installing and configuring software.
• Understanding the networking and security layers of applications and the network.
• Monitoring performance and fine-tuning on a regular basis.
• Managing and optimizing disk space for data handling.
• Installing patches and upgrading software as and when needed.
• Automating manual tasks for faster turnaround.
• Creating Linux users for Hadoop and its ecosystem components, and setting up Kerberos principals as part of Hadoop administration.
• Monitoring connectivity and security of Hadoop cluster
• Managing and reviewing log files in Hadoop.
• Managing and monitoring the HDFS file system.
• Communicating with development, administration, and business teams, including infrastructure, application, network, database, DS, and business intelligence teams.
• Coordinating with application teams, and installing operating system and Hadoop-related updates as and when required.
Qualifications
------------------
Bachelor's degree in computer science, information systems, or a related field.
Additional Information
--------------------------
N/A