Expert - Data Ingestion Operations, AWS Data Services, GTS Data Platforms
Schneider Electric is an Equal Opportunity Employer. It is our policy to provide equal employment and advancement opportunities in the areas of recruiting, hiring, training, transferring, and promoting all qualified individuals regardless of race, religion, color, gender, disability, national origin, ancestry, age, military status, sexual orientation, marital status, or any other legally protected characteristic or conduct.
Description:
================
Data Ingestion Technical Support
You will play an active role in accelerating the Schneider Electric Big Data and Analytics environment and will contribute to Schneider Digital initiatives for enhancing, automating and accelerating the implementation of master data management, the adoption of big data platforms, data excellence and data dictionary evolution, and data security, as well as the business intelligence and analytics built on top of these platforms to derive insights that drive strategic decisions across corporate functions.
To enable data transformation at SE, we are looking for a Data Ingestion Technical Support specialist to help the various Schneider Electric business units ingest their data into our big data platform to the expected quality standards. The role offers exposure to a global work culture, collaborating with different Schneider Electric business units and with team members distributed across Paris, Grenoble, Bangalore and Barcelona.
Responsibilities:
=====================
Provide technical support to project teams ingesting their data into the global Big Data platform's Raw Data Layer, working closely with the platform owner, big data engineers and cloud architects.
• Educate business project teams on the platform's ingestion rules, standards and best practices
• Communicate ingestion prerequisites
• Investigate technical issues and data ingestion failures
• Verify and validate that data are ingested according to requirements and quality standards (a minimal sketch of this kind of check follows this list)
• Enrich the ingestion standards documentation
• Manage change requests on existing ingestion flows
• Hand over newly delivered ingestion flows to the support team
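To illustrate the verification work described above, here is a minimal Python sketch of a post-ingestion check. The bucket name, prefix and file-count threshold are hypothetical placeholders, not actual platform conventions.

import boto3

RAW_BUCKET = "example-raw-data-layer"   # hypothetical raw-layer bucket
RAW_PREFIX = "sales/2024/06/15/"        # hypothetical ingestion prefix

def validate_ingestion(bucket: str, prefix: str, min_files: int = 1) -> bool:
    """Check that at least `min_files` non-empty objects landed under the prefix."""
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    files = [
        obj
        for page in paginator.paginate(Bucket=bucket, Prefix=prefix)
        for obj in page.get("Contents", [])
        if obj["Size"] > 0
    ]
    print(f"Found {len(files)} non-empty object(s) under s3://{bucket}/{prefix}")
    return len(files) >= min_files

if __name__ == "__main__":
    if not validate_ingestion(RAW_BUCKET, RAW_PREFIX):
        raise SystemExit("Ingestion validation failed: no data found in raw layer")

In practice a check like this would sit behind a Step Functions state or a scheduled Lambda, but the core idea is the same: confirm that non-empty data actually landed in the Raw Data Layer before handing the flow over.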
Qualifications: Education: B.E. / B.Tech in Computer Science or Electronics, or a relevant technical certification
=================================================================================================================
Technology Skills and Project Experience:
• Ability to engage with multiple technical teams with a supportive attitude to achieve a shared goal
• Ability to communicate effectively with technical and non-technical individuals.
• Technical skills:
o Good knowledge of, and 2-3 years of hands-on experience with, AWS technologies (S3, Lambda, Step Functions, Glue, DynamoDB, EMR with Hive, PySpark, AWS CLI)
o Project experience with Python, Terraform and GitHub
o Project experience with Redshift, Lake Formation and ETL tools
o Proficient knowledge of SQL and Linux
o AWS Redshift database
o AWS migration services
o AWS IAM roles/policies, privileges and console management
o Experience with Big Data technologies: data processing and data transformation flows
• Ability to multi-task, organize, prioritize, research and resolve issues in a timely and effective manner.
• Understanding of core AWS services like EC2, S3, IAM, and VPC.
• Familiarity with the AWS console and basic command-line tools (AWS CLI).
• Familiarity with data processing concepts like ETL/ELT pipelines. Basic experience with Python or Scala for data manipulation.
• Experience with data processing frameworks like Apache Spark or AWS Glue. Understanding of data warehousing concepts and data quality practices.
• In-depth knowledge of S3 for object storage and data lakes.
• Understanding of data partitioning and file formats such as Parquet; knowledge of open table formats (OTF) such as Iceberg is a plus (see the PySpark sketch after this list)
• Familiarity with data sharing mechanisms such as S3 cross-account sharing, Redshift data sharing, and Lake Formation sharing across accounts and regions
• Basic understanding of data warehousing concepts and querying languages like SQL.
• Basic understanding of AWS serverless services like Lambda, Step Functions, AWS Glue and Redshift Serverless.
• Experience with CI/CD pipelines for automating infrastructure and code deployment.
• Knowledge of cloud monitoring and logging services like CloudWatch.
• Understanding of data lineage and data quality tools.
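To illustrate the partitioning and Parquet points above, here is a minimal PySpark sketch that writes a raw dataset to partitioned Parquet. The S3 paths, column names and partition scheme are hypothetical examples, not actual platform details.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw-layer-partitioned-write").getOrCreate()

# Read a hypothetical CSV drop from a landing zone.
df = spark.read.option("header", "true").csv("s3://example-landing-zone/sales/")

# Derive partition columns from an assumed event-timestamp column.
df = (
    df.withColumn("event_ts", F.to_timestamp("event_ts"))
      .withColumn("year", F.year("event_ts"))
      .withColumn("month", F.month("event_ts"))
)

# Write to the raw layer as Parquet, partitioned so downstream queries can prune.
(
    df.write.mode("overwrite")
      .partitionBy("year", "month")
      .parquet("s3://example-raw-data-layer/sales/")
)

Partitioning by year and month keeps file listings small and lets engines such as Athena, Redshift Spectrum or Glue skip irrelevant partitions; an open table format such as Iceberg would add ACID semantics and schema evolution on top of the same layout.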
Business / Soft Skills:
• Must have solid presentation, communication (including on complex technical solutions) and interpersonal skills.
• Ability to work effectively with globally dispersed business stakeholders.
• Ability to manage multiple priorities in a fast-paced environment.
• A data-driven mindset and the ability to clearly communicate complex business problems and technical solutions.
• Ability to manage and make decisions about competing priorities and resources.
• Ability to delegate where appropriate
• Must be a strong team player/leader
Primary Location: IN-Karnataka-Bangalore
Schedule: Full-time
Unposting Date: Ongoing