Python Spark Airflow Database / Senior Consultant Specialist / Hyderabad / Group Data Technology : 0000kc9w

Hyderabad, Telangana, India

Job Description


Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist. In this role, you will:

  • Design, develop, implement, and maintain robust data pipelines to support various data initiatives.
  • Utilize 5+ years of experience in data engineering to architect scalable solutions.
  • Apply proficiency in Python, Hive, Airflow, and Spark for efficient data processing and workflow management.
  • Champion best practices in data pipeline development from inception to deployment in production environments.
  • Innovate and drive new features and enhancements, ensuring alignment with business objectives.
  • Demonstrate strong data literacy, understanding structured and unstructured data, big data technologies, data models, and pipeline optimization.
  • Handle diverse data formats (JSON, ORC, Parquet, CSV) and manage varying data volumes effectively.
  • Implement DevOps principles and tools for data engineering, including data testing, code scanning, and adherence to coding standards.
  • Manage metadata effectively to ensure data lineage and governance.
  • Apply familiarity with Qlik for data visualization and analytics.
  • Prior experience with GCP (Google Cloud Platform) is advantageous, particularly with BigQuery, Dataproc, and Pub/Sub.
  • Experience with data observability tools to ensure data quality and integrity.
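An everyday instance of the "diverse data formats" responsibility above is converting between row-oriented text formats. A minimal, stdlib-only sketch follows; the sample data and column names are illustrative, not from the posting:

```python
import csv
import io
import json

def csv_to_json_lines(csv_text: str) -> list[str]:
    """Convert CSV rows into JSON-lines records (one JSON object per row)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [json.dumps(row, sort_keys=True) for row in reader]

# Hypothetical sample input; real pipelines would read from files or object storage.
sample = "id,city\n1,Hyderabad\n2,Pune\n"
for line in csv_to_json_lines(sample):
    print(line)  # one JSON object per CSV row
```

In production, libraries such as PyArrow or Spark would handle columnar formats like Parquet and ORC; the shape of the task, mapping rows between serializations, is the same.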
Requirements

To be successful in this role, you should meet the following requirements:
  • Proficient in Python (Pandas, PyArrow, BeautifulSoup4), Hive, and Spark.
  • Strong knowledge of Hadoop ecosystem and other big data technologies.
  • Experience with Apache Airflow for workflow orchestration.
  • Familiarity with Git for version control and code scanning tools.
  • Experience with data quality packages like Great Expectations is advantageous.
  • Proficiency in Linux and shell scripting.
  • Hands-on experience with GCP services such as BigQuery, Dataproc, and Pub/Sub.
  • Knowledge of containerization technologies like Docker and orchestration with Kubernetes (K8s).
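The requirements above mention data-quality packages like Great Expectations. A plain-Python sketch of the idea, declarative checks over pipeline output, is shown below; the function names and sample rows are illustrative and do not use Great Expectations' real API:

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool

def expect_column_values_not_null(rows: list[dict], column: str) -> CheckResult:
    """Pass if every row has a non-null value for `column`."""
    passed = all(row.get(column) is not None for row in rows)
    return CheckResult(f"not_null:{column}", passed)

def expect_column_values_between(rows: list[dict], column: str,
                                 lo: float, hi: float) -> CheckResult:
    """Pass if every value in `column` falls within [lo, hi]."""
    passed = all(lo <= row[column] <= hi for row in rows)
    return CheckResult(f"between:{column}", passed)

# Hypothetical pipeline output to validate.
rows = [{"id": 1, "amount": 40.0}, {"id": 2, "amount": 75.5}]
results = [
    expect_column_values_not_null(rows, "id"),
    expect_column_values_between(rows, "amount", 0, 100),
]
print(all(r.passed for r in results))  # both checks pass on this sample
```

Great Expectations packages the same pattern as a suite of named expectations with reporting and data docs; this sketch only shows the underlying check-and-report shape.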
You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSDI

HSBC



Job Detail

  • Job Id: JD3427797
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Full Time
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: Hyderabad, Telangana, India
  • Education: Not mentioned
  • Experience: Year