Data Engineer/Application Developer (AWS) Job

Bangalore, Karnataka, India

Job Description


YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation.

At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future.

We are looking forward to hiring Python professionals in the following areas:

Experience required: 3-5 years

Data Engineer / DevOps - Enterprise Big Data Platform

In this role, you will be part of a growing, global team of data engineers who collaborate in DevOps mode to enable the business with state-of-the-art technology to leverage data as an asset and to make better-informed decisions. The Enabling Functions Data Office Team is responsible for designing, developing, testing, and supporting automated end-to-end data pipelines and applications on Enabling Functions' data management and analytics platform (Palantir Foundry, AWS, and other components).

Developing pipelines and applications on Uptimize AWS requires:

  • Proficiency in Python
  • Proficiency in SQL
  • Proficiency in PySpark for distributed computation
  • Familiarity with AWS Services
  • Familiarity with common databases (e.g., Oracle, MySQL, Microsoft SQL Server); experience with every type is not required
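As an illustration of the Python-plus-SQL work these proficiencies imply, the sketch below loads records into a relational table and runs an aggregation query. It uses the standard-library sqlite3 module purely as a stand-in for the enterprise databases named above (Oracle, MySQL, SQL Server); the table and column names are hypothetical.

```python
import sqlite3

# In-memory database as a stand-in for an enterprise RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
)
rows = [(1, "EMEA", 120.0), (2, "APAC", 80.0), (3, "EMEA", 200.0)]
conn.executemany("INSERT INTO orders (id, region, amount) VALUES (?, ?, ?)", rows)

# A typical extraction/aggregation query: total order amount per region.
cursor = conn.execute(
    "SELECT region, SUM(amount) AS total FROM orders GROUP BY region ORDER BY region"
)
totals = dict(cursor.fetchall())
print(totals)  # {'APAC': 80.0, 'EMEA': 320.0}
```

The same parameterized-query pattern carries over to production drivers (e.g., for Oracle or MySQL); only the connection setup changes.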
This position is project-based; you may work across multiple smaller projects or a single large project using an agile methodology.

Roles & Responsibilities:
  • B.Tech/B.Sc./M.Sc. in Computer Science or a related field and 3+ years of overall industry experience
  • 2+ years in engineering with experience in data integration on big data and database platforms, analytics application development, data modeling and data visualization of structured and unstructured data sets
  • Design, develop, and maintain Python applications, ensuring high performance and responsiveness.
  • Use Pandas for data manipulation, transformation, and analysis.
  • Apply object-oriented programming (OOP) best practices for clean, modular, and scalable code.
  • Write efficient SQL queries for data extraction, transformation, and management across relational databases.
  • Collaborate with data teams to integrate AWS cloud ETL services into data pipelines.
  • Knowledge of ML model pipeline integration is a plus.
  • Knowledge of Agile/Scrum methodologies and exposure to tools such as Jira or Azure DevOps for task tracking and collaboration within development teams
  • Ability to work both individually and collaboratively across globally matrixed product teams.
  • Strong skills in structuring work packages, considering dependencies and coordinating deliverables in an agile environment.
  • Fluent in English with strong written and verbal communication
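The "clean, modular, and scalable" OOP style the responsibilities above call for can be sketched as a small, testable pipeline step. All class, field, and method names here are illustrative, not part of any actual YASH codebase:

```python
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class Record:
    """One raw input row; the field names are hypothetical."""
    name: str
    amount: float


class CleaningStep:
    """A self-contained transformation step: normalizes names, drops bad rows."""

    def transform(self, records: Iterable[Record]) -> List[Record]:
        return [
            Record(name=r.name.strip().title(), amount=r.amount)
            for r in records
            if r.amount >= 0  # discard negative amounts as invalid
        ]


raw = [Record("  alice ", 10.0), Record("BOB", -5.0), Record("carol", 7.5)]
clean = CleaningStep().transform(raw)
print([r.name for r in clean])  # ['Alice', 'Carol']
```

Keeping each step a small class with a single `transform` method makes pipeline stages easy to unit-test and recombine, which is the practical payoff of the OOP practices listed above.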
Education
  • Bachelor's (or higher) degree in Computer Science, Engineering, Mathematics, Physical Sciences, or related fields
Professional Experience
  • 3+ years of experience in system engineering or software development
  • 2+ years of engineering experience in ETL-type work with databases and Hadoop platforms
Skills:
  • Data management / data structures | Proficiency in technical data management tasks, i.e., writing code to read, transform, and store data; XML/JSON knowledge; experience working with REST APIs
  • SCC/Git | Experience using source code control systems such as Git
  • ETL | Experience developing ELT/ETL processes, including loading data from enterprise-scale RDBMS systems such as Oracle, DB2, MySQL, etc.
  • Authorization | Basic understanding of user authorization (Apache Ranger preferred)
  • Programming | Able to code in Python, or expert in at least one high-level language such as Java, C, or Scala; experience using REST APIs
  • SQL | Expert in manipulating database data using SQL; familiarity with views, functions, stored procedures, and exception handling
  • AWS | General knowledge of the AWS stack (EC2, S3, EBS, …)
  • IT Process Compliance | SDLC experience and formalized change controls
  • Working in DevOps teams, based on Agile principles (e.g., Scrum)
  • ITIL knowledge (especially incident, problem, and change management)
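The XML/JSON and REST API skills listed above amount, in practice, to parsing structured payloads like the one below. The sketch uses only the standard-library json module and an invented payload shape; no real endpoint is called:

```python
import json

# A response body as a REST API might return it; the schema is invented
# for illustration -- no live request is made here.
payload = '{"items": [{"id": 1, "status": "ok"}, {"id": 2, "status": "failed"}]}'

data = json.loads(payload)
failed_ids = [item["id"] for item in data["items"] if item["status"] == "failed"]
print(failed_ids)  # [2]
```

In a real pipeline the payload would come from an HTTP client such as `requests`, but the parsing and filtering logic is the same.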
Languages
  • Fluent English skills
Specific information related to the position
  • Physical presence in primary work location (Bangalore)
  • Flexible to work CEST and US EST time zones (according to team rotation plan)
  • Willingness to travel to Germany, the US, and potentially other locations (as per project demand)
At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
  • Flexible work arrangements, Free spirit, and emotional positivity
  • Agile self-determination, trust, transparency, and open collaboration
  • All support needed for the realization of business goals
  • Stable employment with a great atmosphere and ethical corporate culture

YASH Technologies




Job Detail

  • Job Id
    JD3467470
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Bangalore, Karnataka, India
  • Education
    Not mentioned
  • Experience
    Year