The role will require the team member to become thoroughly familiar with the CDM telemetry data architecture and instrumentation, as well as with our cloud infrastructure. This will enable them to put a process in place for migrating data assets to our common data platform, DataOS. The role also entails building exec-ready dashboards (using Looker as the visualization tool) and providing high-level analysis of the events data.

This is an IC (individual contributor) role, which means the candidate must be capable of working independently to understand the role's requirements and deliver against them. The person will be expected to work on complex problems and, in some cases, act as a consultant on other data engineering projects within the team.

Location(s): Bangalore, Karnataka, India
Job ID: 3063076

Responsibilities

  • Designs and establishes secure and performant data architectures, enhancements, updates, and programming changes for portions and subsystems of data pipelines, repositories or models for structured/unstructured data.
  • Analyzes design and determines coding, programming, and integration activities required based on general objectives and knowledge of overall architecture of product or solution.
  • Writes and executes complete testing plans, protocols, and documentation for the assigned portion of a data system or component; identifies and debugs issues with code and its integration into the data system architecture, and creates solutions for them.
  • Manages and executes end-to-end (E2E) data projects, from architecture inputs through pipeline construction and data processing to dashboard building.
  • Designs and develops business scorecards, KPI dashboards, reporting, and operational information for executive business reviews.
  • Conducts analysis on the data and reports performance against KPIs.
  • Collaborates and communicates with project team regarding project progress and issue resolution.
  • Provides guidance and mentoring to less experienced team members.

Education and Experience Required

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering or equivalent.
  • Typically 4-6 years of experience.

Knowledge and Skills

  • Using data engineering tools, languages, and frameworks to mine, cleanse, and explore data.
  • Ability to work with unstructured data.
  • Fluent in NoSQL and relational database systems.
  • Fluent in complex, distributed, and massively parallel systems.
  • Strong analytical and problem-solving skills with ability to represent complex algorithms in software.
  • Designing data systems/solutions to manage complex data.
  • Strong understanding of database technologies and management systems.
  • At least 1 year of exposure to working on AWS systems (Redshift and S3 required; EC2 and Lambda good to have).
  • Fluency in Python and Linux shell scripting, plus small-scale server maintenance.
  • Strong understanding of cloud-based systems/services.
  • Database architecture testing methodology, including execution of test plans, debugging, and test scripts and tools.
  • Ability to lay data pipelines across multiple systems (a key requirement).
  • Familiarity with BI visualization tools, e.g. Looker (preferred), Power BI, Tableau.
  • Ability to design analytics solutions that address business problems, i.e. triangulate different data points to solve business problems and needs.
  • Excellent written and verbal communication skills; mastery in English and local language.
  • Ability to effectively communicate product architectures, design proposals and negotiate options at management levels.


  • Collaborates with peers, junior engineers, data scientists, and the project team.
  • Typically interacts with high-level Individual Contributors, Managers and Program Teams.
  • Leads a project requiring data engineering solutions development.
