
AT&T Lead Data/AI Engineering
India, Karnataka, Bengaluru 
111577961

10.04.2025

About AT&T Chief Data Office

Candidates will:

  • Work on cutting-edge cloud technologies, AI/ML, and data-driven solutions as part of a dynamic and innovative team driving digital transformation.
  • Lead high-impact Agile initiatives with top talent in the industry.
  • Get the opportunity to grow and implement Agile at an enterprise level.
  • Be offered competitive compensation, a flexible work culture, and learning opportunities.

Shift timing (if any): 12:30 pm to 9:30 pm IST (Bangalore) / 1:00 pm to 10:00 pm IST (Hyderabad)

Hybrid (3 days mandatory in office)

Roles and Responsibilities

  • Create product roadmaps and project plans.
  • Design, develop, and maintain scalable ETL pipelines using Azure Services to process, transform, and load large datasets into Cloud platforms.
  • Collaborate with cross-functional teams, including data architects, analysts, and business stakeholders, to gather data requirements and deliver efficient data solutions.
  • Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
  • Partner with data scientists, architects, and analysts to understand data needs and create effective data workflows.
  • Exposure to the Snowflake data warehouse.
  • Solid Big Data engineering background in the broader Hadoop ecosystem and real-time analytics tools, including PySpark, Scala Spark, Hive, the Hadoop CLI, MapReduce, Storm, Kafka, and the Lambda Architecture.
  • Implement data validation and cleansing techniques.
  • Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
  • Experience designing and hands-on developing cloud-based analytics solutions.
  • Expert-level understanding of Azure Data Factory, Azure Data Lake, Snowflake, and PySpark is required.
  • A full-stack development background with Java and JavaScript/CSS/HTML is good to have.
  • Knowledge of ReactJS or Angular is a plus.
  • Design and build data pipelines using API ingestion and streaming ingestion methods.
  • Unix/Linux expertise; comfortable with Linux operating system and Shell Scripting.
  • Knowledge of Dev-Ops processes (including CI/CD) and Infrastructure as code is desirable.
  • PL/SQL and RDBMS background with Oracle/MySQL.
  • Comfortable with microservices, CI/CD, Docker, and Kubernetes.
  • Strong experience with common Data Vault data warehouse modelling principles.
  • Create/modify Docker images and deploy them via Kubernetes.
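To illustrate the data validation and cleansing responsibility above, here is a minimal sketch in plain Python. The record shape, field names (`customer_id`, `order_date`), and the source date format are hypothetical; in practice this logic would typically live inside a PySpark or Azure Data Factory transformation step.

```python
from datetime import datetime

def clean_records(records):
    """Drop records missing required fields, trim whitespace, and
    normalize dates from DD/MM/YYYY to ISO format (YYYY-MM-DD).
    Field names and formats here are illustrative assumptions."""
    cleaned = []
    for rec in records:
        cust_id = str(rec.get("customer_id", "")).strip()
        raw_date = str(rec.get("order_date", "")).strip()
        if not cust_id or not raw_date:
            continue  # validation: skip incomplete rows
        try:
            iso_date = datetime.strptime(raw_date, "%d/%m/%Y").date().isoformat()
        except ValueError:
            continue  # validation: skip unparseable dates
        cleaned.append({"customer_id": cust_id, "order_date": iso_date})
    return cleaned
```

The same pattern (filter invalid rows, normalize the rest) scales up naturally to PySpark DataFrame filters and column expressions on large datasets.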

Additional Skills Required:

The ideal candidate should have at least 14 years of experience in IT, in addition to the following:

  • 10+ years of extensive development experience using Snowflake or a similar data warehouse technology.
  • Working experience with dbt and other modern data stack technologies, such as Snowflake, Azure, Databricks, and Python.
  • Experience in agile processes, such as Scrum.
  • Extensive experience writing advanced SQL statements and performance tuning.
  • Experience in data ingestion techniques using custom or SaaS tools.
  • Experience in data modelling and the ability to optimize existing and new data models.
  • Experience in data mining, data warehouse solutions, and ETL, using databases in a business environment with large-scale, complex datasets.
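As a small, self-contained illustration of the SQL performance-tuning skill listed above, the following sketch uses Python's built-in `sqlite3` (the table, column names, and data are hypothetical) to show how an index changes a query plan, the kind of before/after check tuning work involves on any RDBMS.

```python
import sqlite3

# Hypothetical orders table used only for this illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT * FROM orders WHERE customer_id = 7"

# Without an index, filtering on customer_id scans the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Adding an index lets the engine seek directly to the matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
```

Comparing `plan_before` (a full-table scan) with `plan_after` (an index search) is the basic loop of performance tuning: measure the plan, change the schema or SQL, and measure again.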

Technical Qualifications:

Preferred:

  • Bachelor's degree in Computer Science, Information Systems, or a related field.
  • Experience in high-tech, software, or telecom industries is a plus.
  • Strong analytical skills to translate insights into impactful product initiatives.

Hyderabad, Andhra Pradesh, India

AT&T is a fair chance employer and does not initiate a background check until an offer is made.

04/09/2025