Responsibilities
• Collaborate with data science and machine learning teams to build and maintain machine learning and data pipelines.
• Manage large datasets using data warehouses such as Snowflake and Redshift.
• Automate and schedule data workflows with orchestrators like Dagster and Airflow.
• Implement and manage data transformation workflows using dbt.
• Write efficient and scalable SQL queries for data extraction, transformation, and loading.
• Deliver high-quality data solutions by collaborating with globally distributed, cross-functional teams.
Must Haves
• Hands-on experience with data warehouses like Snowflake and Redshift.
• Proficiency in using orchestrators such as Dagster and Airflow.
• Experience with dbt for data transformations.
• Strong SQL skills and experience with complex queries.
• Strong verbal and written communication skills.
• Ability to work effectively with globally distributed, cross-functional teams.
• Experience working with data science and machine learning teams.
Nice-to-Have Skills
• Experience with Apache Beam and Spark.
• General software engineering and programming skills.
• Experience in the fintech enterprise domain.
Job Types: Full-time, Permanent
Pay: ₹2,500,000.00 - ₹4,000,000.00 per year
Schedule:
• Day shift
• Monday to Friday
Application Question(s):
• How soon can you join? What is your official notice period?
Experience:
• Snowflake: 2 years (Required)
• Redshift: 3 years (Required)
• Dagster & Airflow: 2 years (Preferred)
• SQL: 3 years (Required)
Work Location: Remote
MNCJobsIndia.com will not be responsible for any payment made to a third-party. All Terms of Use are applicable.