Snowflake Data Engineer

Delhi, India

Job Description


What We're Looking For (5 Years' Experience Required)

As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions, reports, and dashboards. We rely heavily on Snowflake, Airflow, Fivetran, dbt, and Looker for our business intelligence, and we embrace AWS as a key partner across our engineering teams.

As an Analytics Engineer You'll Be

  • Developing end-to-end ETL/ELT pipelines, working with the Data Analysts of each business function
  • Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture
  • Mentoring other junior engineers on the team
  • Being a go-to expert for data technologies and solutions
  • Providing on-the-ground troubleshooting and diagnosis for architecture and design challenges
  • Troubleshooting and resolving technical issues as they arise
  • Looking for ways to improve both what data pipelines the department delivers and how it delivers them
  • Translating business requirements into technical requirements (entities to be modelled, dbt models to be built, timings, tests, and reports) and owning the delivery of data models and reports end to end
  • Performing exploratory data analysis to identify data quality issues early in the process, and implementing tests to prevent them in the future
  • Working with Data Analysts to ensure that all data feeds are optimized and available at the required times; this can include Change Data Capture (CDC) and other delta-loading approaches
  • Discovering, transforming, testing, deploying, and documenting data sources
  • Applying, helping define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review
  • Building Looker dashboards for use cases as required
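The delta-loading approaches mentioned above can be sketched as a minimal high-watermark load. This is only an illustration of the general technique, not the team's actual pipeline; the row shape, column names, and timestamps are made up for the example:

```python
def delta_load(source_rows, target, watermark):
    """Copy only rows newer than the last-seen watermark into the target
    (a minimal high-watermark delta load; ISO timestamps compare lexically)."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    for row in new_rows:
        target[row["id"]] = row  # upsert keyed on the primary key
    # Advance the watermark so the next run skips rows already loaded.
    return max((r["updated_at"] for r in new_rows), default=watermark)

source = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00", "value": "a"},
    {"id": 2, "updated_at": "2024-01-02T00:00:00", "value": "b"},
]
target = {}
wm = delta_load(source, target, "2024-01-01T00:00:00")
# Only the row newer than the watermark is loaded; wm advances to its timestamp.
```

True CDC tools (or Fivetran connectors) read the database change log instead of polling a timestamp column, but the watermark pattern above is the simplest delta-loading baseline.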
What Makes You a Great Fit
  • You have 5+ years of extensive development experience using Snowflake or a similar data warehouse technology
  • You have working experience with dbt and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, Git, and Looker
  • You have experience with agile processes, such as Scrum
  • You have extensive experience writing advanced SQL statements and performance-tuning them
  • You have experience with data ingestion techniques using custom tooling or SaaS tools like Fivetran
  • You have experience in data modelling and can optimize existing and new data models
  • You have experience in data mining, data warehouse solutions, and ETL, and in using databases in a business environment with large-scale, complex datasets
  • Experience architecting analytical databases (in a Data Mesh architecture) is an added advantage
  • You have experience working in an agile, cross-functional delivery team
  • You have high development standards, especially for code quality, code reviews, unit testing, continuous integration, and deployment
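As a rough illustration of the "advanced SQL" the role asks for, the sketch below ranks rows within per-customer partitions using a window function. It runs against an in-memory SQLite database for self-containment; the orders table and its columns are invented for this example, though the RANK() OVER syntax carries over to Snowflake:

```python
import sqlite3

# Hypothetical orders table, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 10), ("a", 30), ("b", 20)])

# RANK() OVER a per-customer partition: the kind of analytic SQL
# that typically needs performance tuning on large warehouse tables.
rows = conn.execute("""
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
""").fetchall()
```

On a real warehouse, tuning such a query would also involve checking partition sizes, clustering keys, and the query profile, which this toy example cannot show.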
Email: contact@toolify.in
Contact: 9319037707

Expertia AI Technologies




Job Detail

  • Job Id
    JD3329218
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Rs. 14,00,000 per year
  • Employment Status
    Permanent
  • Job Location
    Delhi, India
  • Education
    Not mentioned
  • Experience
    Year