Data Engineer

Bangalore, Karnataka, India

Job Description


Location(s): Bengaluru, Karnataka
Company: Koch Global Services
Career Field: Information Systems & Technology
Job Number: 155596

The Data Engineer will be part of an international team that designs, develops, and delivers data pipelines and data analytics solutions for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. Koch Global Services India (KGSI) is being developed in India to extend Koch's IT operations and to act as a hub for innovation in the IT function. As KGSI rapidly scales up its operations in India, its employees will have opportunities to carve out career paths within the organization. This role offers the chance to join on the ground floor and will play a critical part in helping build out Koch Global Services (KGS) over the next several years. Working closely with global colleagues will provide employees with significant international exposure.

This position will work with the Georgia-Pacific Consumer Products IT organization as part of the KOLO/CPG Cloud Platform team and will report directly to the CPG Team Lead located in Bangalore, India. The team is responsible for building and monitoring the KOLO/CPG cloud platform for secure provisioning, ingestion, and reliable publishing, and creates long-term value through support of our connected-devices business partners. The selected candidate should have hands-on experience with AWS and ETL processes and with delivering data pipelines at scale, ensuring low latency, reliability, and extensibility of the platform. In addition, the candidate should demonstrate initiative, sound economic thinking, analytical skills, and a sense of urgency.

What You Will Do (Job Responsibilities):

  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS "big data" technologies.
  • Identify data quality issues and support data governance initiatives through data profiling, data mining, and cleanup, and implement automated alerts.
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • Provide estimates for development work needed to support break-fix work, new projects, and enhancements.
  • Provide after-hours support for mission-critical applications and platforms when needed.
  • Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
What You Will Need To Bring With You (Experience & Education Required):

  • 6–8 years of total IT industry experience
  • 5+ years of experience developing ETL packages, including scheduling and automation
  • 5+ years of experience in applied data warehousing methodology, analysis, and development
  • 5+ years of SQL and NoSQL coding and tuning experience
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift, Lambda, and Airflow
  • Strong knowledge of Python, PySpark, SQL, Redshift stored procedures, Kinesis, and the AWS Glue service
  • Working knowledge of data platforms such as Redshift and Tableau/Qlik
  • Data transformation with Kinesis Data Firehose and/or DynamoDB Streams
  • Solid experience transforming business requirements to technical solutions
  • Strong conceptual, analytical, and problem-solving skills
  • Proven ability to communicate effectively with both technical and non-technical staff
  • Bachelor's Degree in Computer Science, Data Science, Math, or another related field
What Will Put You Ahead (Experience & Education Preferred):

  • Experience supporting and working with cross-functional teams in a dynamic environment
  • Understanding of and experience in ETL performance tuning
  • Experience with iterative testing and design processes such as Agile
  • Experience with Jira
  • Experience with version control tools such as Git
  • Experience with Python and with data transformations using Kinesis and/or DynamoDB Streams
  • Experience implementing processes for continuous integration, test automation, and deployment (CI/CD pipelines)

Koch Global Services




Job Detail

  • Job Id
    JD3288428
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Bangalore, Karnataka, India
  • Education
    Not mentioned
  • Experience
    Year