Overview:
What You Will Do
====================
• Build data pipelines to assemble large, complex sets of data that meet non-functional and functional business requirements
• Build analytical tools that utilize the data pipeline, providing actionable insight into key business performance, including operational efficiency and other business metrics
• Work closely with data architects, SMEs, and other technology partners to develop and execute on the data architecture and product roadmap
• Work with stakeholders, including leadership, product, and customer teams, to support their data infrastructure needs and assist with data-related technical issues
• Act as a subject matter expert to leadership for technical guidance, solution design and best practices within the customer organization
• Keep current on big data and data visualization technology trends; evaluate cloud technologies, work on proofs of concept, and make recommendations
What You Have
=================
• 3+ years of data engineering experience working with large data sets
• Solid experience developing and implementing DW architectures, OLAP & OLTP technologies, and data modeling with star/snowflake schemas to enable self-service reporting and data lakes
• Experience building data solutions on any cloud platform using Postgres, Informatica, Redshift, Tableau, and other similar services and tools
• Advanced SQL and programming experience with Python and/or Spark
• Experience with, or a demonstrated understanding of, real-time data streaming tools such as Kafka, Kinesis, or similar
• 2+ years of experience designing, building and testing Java EE applications
• Strong problem-solving capabilities, with experience troubleshooting data issues and stabilizing big data systems
• Excellent communication and presentation skills, as you'll regularly interact with stakeholders and engineering leadership
• Bachelor's or master's degree in a quantitative discipline such as Computer Science, Computer Engineering, Analytics, Mathematics, Statistics, Information Systems, or another scientific field
• Nice to Have: Deep hands-on experience with SaaS data products, and experience working with query engines such as Presto, Druid, or Snowflake and multi-dimensional OLAP cubes
• Nice to Have: Certification in one of the cloud platforms (AWS/GCP/Azure)