Software Developer

Location: TS, IN, India

Job Description

MS or BS in Computer Science or equivalent

4-6+ years of relevant experience

Key Responsibilities

Feature Engineering Pipelines

• Design and implement data pipelines for feature extraction, transformation, and enrichment at scale using tools like Apache Spark, Airflow, or Prefect (a brief sketch follows this list).
• Collaborate with data scientists to automate and optimize feature engineering workflows.
• Develop real-time and batch processing pipelines to support machine learning and analytics workloads.
• Ensure pipelines are resilient, maintainable, and scalable to handle large volumes of structured and unstructured data.
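
For illustration only (not part of the role description), here is a minimal PySpark sketch of the kind of batch feature-extraction step described above. The dataset paths and column names (events.parquet, user_id, amount, event_time) are hypothetical placeholders.

# Minimal batch feature-engineering sketch using PySpark.
# Paths and column names are placeholders, not an actual codebase.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("feature-extraction").getOrCreate()

# Load raw events (placeholder location).
events = spark.read.parquet("s3://example-bucket/raw/events.parquet")

# Derive per-user aggregate features: transaction count, spend totals, recency.
user_features = (
    events
    .withColumn("event_date", F.to_date("event_time"))
    .groupBy("user_id")
    .agg(
        F.count("*").alias("txn_count"),
        F.sum("amount").alias("total_spend"),
        F.avg("amount").alias("avg_spend"),
        F.max("event_date").alias("last_active_date"),
    )
)

# Write the feature table for downstream training or serving (placeholder location).
user_features.write.mode("overwrite").parquet("s3://example-bucket/features/user_features")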

Secure Data Storage & Management

• Architect and manage secure storage solutions leveraging relational (SQL) databases such as Oracle and non-relational (NoSQL) databases such as DynamoDB and MongoDB.
• Implement encryption, data masking, and role-based access control (RBAC) to safeguard sensitive data (see the sketch after this list).
• Establish data retention policies and backup strategies to ensure compliance with data privacy regulations.
• Track data lineage and manage metadata using tools like Apache Atlas or DataHub.
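
Purely as an illustration, a small Python sketch of the column-level masking and role-based access check mentioned above; the roles, permissions, and key handling are simplified assumptions, not a prescribed design.

import hashlib
import hmac

# Hypothetical role-to-permission mapping (a real system would back this
# with a directory service or policy engine).
ROLE_PERMISSIONS = {
    "analyst": {"read_masked"},
    "data_engineer": {"read_masked", "read_clear"},
}

def mask_value(value: str, secret_key: bytes) -> str:
    """Pseudonymize a sensitive value with a keyed hash (HMAC-SHA256)."""
    return hmac.new(secret_key, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def read_field(role: str, value: str, secret_key: bytes) -> str:
    """Return the clear value only if the role is allowed; otherwise return a masked token."""
    perms = ROLE_PERMISSIONS.get(role, set())
    if "read_clear" in perms:
        return value
    if "read_masked" in perms:
        return mask_value(value, secret_key)
    raise PermissionError(f"role '{role}' may not read this field")

# Example usage with placeholder data.
key = b"example-secret-key"  # in practice, fetched from a secrets manager
print(read_field("analyst", "jane.doe@example.com", key))        # masked token
print(read_field("data_engineer", "jane.doe@example.com", key))  # clear value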

Data Infrastructure & Tools

• Build and manage scalable ETL/ELT pipelines to streamline data ingestion and transformation processes.
• Leverage cloud-based services (AWS, GCP, Azure) for secure storage and data processing.
• Integrate distributed systems like Hadoop and Kafka for high-volume data handling (a minimal consumer sketch follows this list).
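
As a sketch only, assuming the kafka-python client and a hypothetical "events" topic, this is roughly how a small consumer might batch high-volume messages before loading them into storage; the broker address, batch size, and load step are placeholders.

import json
from kafka import KafkaConsumer  # assumes the kafka-python package

# Placeholder topic and broker address.
consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=False,
)

BATCH_SIZE = 500  # illustrative batch size
batch = []

def load_batch(records):
    """Placeholder load step: a real pipeline would write to a warehouse or data lake."""
    print(f"loading {len(records)} records")

for message in consumer:
    batch.append(message.value)
    if len(batch) >= BATCH_SIZE:
        load_batch(batch)
        consumer.commit()  # commit offsets only after a successful load
        batch = []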

Collaboration & Leadership

• Partner with data science and analytics teams to understand feature engineering needs and translate them into technical solutions.
• Lead efforts to enhance the security and reliability of data pipelines and storage systems.
• Mentor junior engineers on best practices in data engineering and secure software design.

Qualifications

• Strong background in data structures, algorithms, and system design.
• Hands-on experience with feature engineering pipelines, ETL/ELT tools, and secure data storage solutions.
• Proficiency in programming languages like Python, Java, or Scala.
• In-depth knowledge of data privacy regulations and compliance requirements.
• Familiarity with distributed systems, real-time data processing, and cloud platforms.

This role offers the chance to work on cutting-edge data engineering challenges, ensuring secure and efficient handling of large-scale data. If you are passionate about building feature-rich and secure platforms, we encourage you to apply.


Career Level - IC4

Job Detail

  • Job Id: JD3575743
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Contract
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: TS, IN, India
  • Education: Not mentioned
  • Experience: Year