Join Us

Open Positions


Skills required

  • 3+ years of experience in backend development of enterprise-level software
  • Experience in developing and deploying cloud-based microservices; in-depth knowledge of AWS services and containerization
  • Experience with data stores such as MySQL, PostgreSQL, and Redis, and a solid understanding of relational and non-relational databases
  • Development experience with Kafka (message-broker architecture, producers, consumers, and streaming) in a distributed environment
  • Basic understanding of computer security: Kerberos, LDAP, SAML, OAuth2, and user access control design
  • Understanding of patterns and techniques for building scalable back-end infrastructure, including caching, rate limiting, and authentication and authorization schemes
  • Good knowledge of software design methodologies (OOD, UML)
  • Working knowledge of networking fundamentals: VPCs, subnets, gateways, etc.
  • Willingness to learn and pick up new technologies, along with the patience to mentor others


Responsibilities
  • Design and develop secure architecture and scalable microservices in an agile environment
  • Execute and deliver large, complex projects end-to-end, with or without the involvement of other team members
  • Ensure performance, scalability, and security in all new product features being developed
  • Quickly troubleshoot production issues and provide an appropriate root-cause analysis (RCA)
  • Evaluate new technologies and participate in decision-making, accounting for factors such as viability within Target’s technical environment, maintainability, and cost of ownership
  • Collaborate closely with our EV Analytics and Machine Learning team to jointly deliver products to our customers


Skills required
  • Passionate about software design and development, with a willingness to grow your analytical skills for implementing Big Data solutions
  • 3+ years of hands-on experience in data engineering and analysis
  • Working knowledge of Python, Node.js, and TypeScript; Java is a plus
  • Knowledge of Apache Spark is a plus
  • Basic understanding of large-scale distributed systems (theory and practical experience), design patterns, and object-oriented design principles
  • Experience with CI/CD and behaviour-driven development
  • Experience with Google Cloud Platform, BigQuery, Snowflake, or AWS services
  • Solid understanding of, and experience with, at least one relational database
  • You will be responsible for deploying, automating, maintaining, and managing cloud-based production systems for big data analytics, ensuring the availability, performance, scalability, and security of both production and delivery systems for the initial lean product

Daily business involvement

  • Design and implement ETL/ELT pipelines from the ground up
  • Build data infrastructure platforms, including data ingestion, data consumption, and stream processing, on the cloud
  • Deploy ML solutions using open-source frameworks
  • Research and assess the technology environment and trends for best practices to apply, especially no-code platforms
  • Devise the collection and analysis of key metrics and reporting dashboards to monitor enterprise data platform performance and reliability
  • Support the development and maintenance of data engineering guidelines, policies, standards, and process narratives for in-scope business functions


We are looking for students (3rd year and above) who have relevant experience and can take on the following responsibilities from the start of the internship:

  • Exceptionally skilled in data exploration and in designing and developing predictive algorithms using advanced ML techniques
  • Highly proficient with popular Python libraries for ML and DL
  • Superb understanding of multi-dimensional arrays, dataframes, and the computability, complexity, and efficiency of models
  • Prior experience implementing neural networks and tuning parameters
  • Excellent debugging skills to troubleshoot issues and implement fixes
  • Knowledge of advanced signal processing techniques is desirable
  • Basic understanding of probability and statistics
  • Strong inclination to learn and implement new DL technologies
  • Working knowledge of SQL