
Big Data Developer

Talent Worx | No location specified | Full-time
Salary: $100,000 per year

Job Description

We’re hiring for one of the world’s leading professional services firms, renowned for its commitment to innovation, excellence, and global impact. With a presence in over 150 countries, this organization provides services across consulting, audit, tax, risk advisory, and financial advisory — helping Fortune 500 companies and governments navigate complex challenges.

Job Title: Big Data Developer
Employment Type: Full-Time Employee (FTE)
Location: PAN India
Experience: 6+ years

About the Role:

We are seeking a highly skilled Big Data Developer with strong expertise in Spark and Scala to join our dynamic team. The ideal candidate will have hands-on experience with cloud platforms such as AWS, Azure, or GCP for big data processing and storage solutions. You will play a critical role in designing, developing, and maintaining scalable data pipelines and backend services using modern big data technologies.

Key Responsibilities:

  • Develop, optimize, and maintain large-scale data processing pipelines using Apache Spark and Scala.
  • Implement and manage cloud-based big data storage and processing solutions on Azure Data Lake Storage (ADLS) and Azure Databricks.
  • Collaborate with cross-functional teams to understand data requirements and deliver scalable backend services using Java and the Spring Boot framework.
  • Ensure best practices in data security, performance optimization, and code quality.
  • Troubleshoot and resolve production issues related to big data workflows and backend services.
  • Continuously evaluate emerging technologies and propose enhancements to current systems.

Must-Have Qualifications:

  • 6+ years of experience in Big Data development.
  • Strong expertise in Apache Spark and Scala for data processing.
  • Hands-on experience with cloud platforms such as AWS, Azure, or GCP, with a strong focus on Azure Data Lake Storage (ADLS) and Azure Databricks.
  • Proficiency in backend development using Java and the Spring Boot framework.
  • Experience in designing and implementing scalable and fault-tolerant data pipelines.
  • Solid understanding of big data architectures, ETL processes, and data modeling.
  • Excellent problem-solving skills and ability to work in an agile environment.

Preferred Skills:

  • Familiarity with containerization and orchestration tools like Docker and Kubernetes.
  • Knowledge of streaming technologies such as Kafka.
  • Experience with CI/CD pipelines and automated testing frameworks.

What We Offer:

  • Competitive salary based on experience and skills.
  • Flexible working options with a PAN India presence.
  • Opportunity to work with cutting-edge big data technologies in a growing and innovative company.
  • Collaborative and supportive work culture with career growth opportunities.
