B2B SaaS Unicorn (Stealth Hiring Mode)

Data Engineer III

Bengaluru, Karnataka, IN
REMOTE
FULL TIME

Company: B2B SaaS Unicorn (Stealth Hiring Mode)

Category: Computer and Mathematical Occupations

Published on 2022-05-08 07:01

This opportunity is with a B2B SaaS unicorn. More details will be shared during the exploratory call. Job location: Pune | Remote option available.


We are looking for a Data Engineer III to join the data analytics team. As a Data Engineer, you’ll collaborate with software engineers, product management, and partner teams to design and implement solutions for the complex data needs of our platform. You enjoy working with complex systems, are customer-centric, translate ideas and plans into the product, have a strong sense of ownership and drive, are emotionally intelligent, have a product mindset, and thrive on the challenge of implementing solutions at scale.


You will work directly with software developers, program managers, and product managers to design and implement software on the data analytics platform. You are an engineer at heart who understands and highlights deep technical issues so they can be addressed in a timely manner.


Tech stack the team uses: Scala, Python, Node.js, React/Redux, Spark, Kafka, Databricks Delta, PostgreSQL, Apache Parquet, TIBCO Jaspersoft, Redis, AWS, Kubernetes


Current capabilities: 20 million records processed in a single query, terabytes of data delivered per week, and a multi-tenant platform serving 200+ customers globally


What you'll be doing:

  • Design and develop new features on the Analytics Data Platform.
  • Mentor team members.
  • Work with the engineering manager to develop strategies for effective data engineering and long-term architectural planning.
  • Partner with architects on problem-solving.
  • Continuously evaluate relevant technologies, and influence and drive architecture and design discussions.
  • Work with QA to define the QA strategy before a feature is released.
  • Collaborate with DevOps to define the deployment strategy for a release, delegating where appropriate.
  • Periodically optimize the data analytics platform.


Requirements:

  • Bachelor's degree in Computer Science.
  • 5-8 years of total software development experience.
  • 3-6 years of experience in data engineering.
  • Proficiency in Python programming.
  • Working knowledge of Scala.
  • Data streaming and batch processing using Apache Spark and Kafka.
  • Big data querying and analytics.
  • Ability to write efficient queries in SQL and PL/SQL.
  • Knowledge of architectural patterns for OLAP systems.
  • Data modeling for analytical (OLAP) systems.
  • Knowledge of database management systems for OLAP.
  • Familiarity with Kubernetes and containers.
  • Familiarity with DevOps tools and processes.
  • Excellent problem-solving skills.
  • Good written and verbal communication skills.
  • Ability to build strong relationships with people and drive them towards a common goal.


Desired skills and experience:

  • Demonstrated leadership in a cross-functional, highly collaborative environment.
  • Demonstrated focus on continuous improvement of systems, software, and processes through data analysis and metrics.
  • 4+ years of hands-on experience with Python, Scala, and SQL.
  • Strong experience with batch/stream processing systems (such as Spark), Kafka, ETL pipelines, and parallel and distributed query processing.
  • Experience with data modeling techniques for big data.
  • Knowledge of modern DevOps tools and techniques.