

WHO YOU’LL WORK WITH
As a Senior Data Engineer, you will play a key role in ensuring that our data products are robust and capable of supporting our Data Engineering and Business Intelligence initiatives.
A data engineer with 5+ years of experience working with cloud-native platforms.
Advanced skills in SQL, PySpark, Apache Airflow (or similar workflow management tools), Databricks, and Snowflake.
Deep understanding of Spark optimization, Delta Lake, and Medallion architecture.
Strong experience in data modeling and data quality practices.
Experience with Tableau for data validation and monitoring.
Exposure to DevOps practices, CI/CD, Git, and security best practices.
Effective mentorship and team collaboration skills.
Strong communication skills, able to explain technical concepts clearly.
Experience with Kafka or other real-time systems (a minimal streaming sketch follows this list).
Preferred:
Familiarity with ML/GenAI integration into pipelines.
Databricks Data Engineer certification.
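As a rough illustration of the real-time experience mentioned above, here is a minimal Spark Structured Streaming sketch that tails a Kafka topic; the broker address and topic name are hypothetical, and the Kafka connector package is assumed to be available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal sketch: consume a Kafka topic with Spark Structured Streaming.
# Broker address and topic name are hypothetical.
spark = SparkSession.builder.appName("kafka-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
    # Kafka delivers bytes; cast the payload to a string for downstream parsing.
    .select(F.col("value").cast("string").alias("payload"))
)

# Write to the console sink for demonstration; a real pipeline would write
# to a Delta table or another durable sink with checkpointing enabled.
query = events.writeStream.format("console").start()
query.awaitTermination()
```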
WHAT YOU’LL WORK ON
Own and optimize large-scale ETL/ELT pipelines and reusable frameworks (a minimal Medallion-style sketch follows this list).
Collaborate with cross-functional teams to translate business requirements into technical solutions.
Guide junior engineers through code reviews and design discussions.
Monitor data quality, availability, and system performance.
Lead CI/CD implementation and improve workflow automation.
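To make the Medallion pattern mentioned above concrete, here is a minimal PySpark/Delta Lake sketch that promotes raw records from a bronze table to a cleaned silver table; the paths and column names are hypothetical, and a Spark runtime with Delta Lake support is assumed.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal Medallion-style sketch: promote raw events from a bronze Delta
# table to a deduplicated, typed silver table. Paths and column names are
# hypothetical.
spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

bronze = spark.read.format("delta").load("/lake/bronze/events")

silver = (
    bronze
    .dropDuplicates(["event_id"])                         # drop replayed events
    .withColumn("event_ts", F.to_timestamp("event_ts"))   # enforce types
    .filter(F.col("event_id").isNotNull())                # basic quality gate
)

silver.write.format("delta").mode("overwrite").save("/lake/silver/events")
```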

We are looking for individuals who are highly driven and able to understand and translate business requirements into data needs. Candidates should be strong problem solvers with in-depth technical knowledge of SQL and big data; PySpark expertise is a plus. They need excellent verbal and written communication skills and should be willing to work with business consumers to understand their needs and requirements.
Role requirements include
A minimum of a bachelor's degree in computer science or information science engineering
5+ years of hands-on experience in the data and analytics space
Deep expertise in SQL, with the ability to work on platforms like Databricks, Hive, and Snowflake
Ability to integrate and communicate moderately complex information, sometimes to audiences who are not familiar with the subject matter. Acts as a resource to teammates.
Ability to integrate complex datasets and derive business value out of data
Independently utilizes knowledge, skills, and abilities to identify areas of opportunity, resolve complex problems & navigate solutions.
In this role you'll be working with a team of talented data engineers, product managers and data consumers who'll focus on the enterprise-wide data needs of Nike. You'll have a direct impact on the deliverables of the team, and you'll be guiding the team on solving complex business problems.
Some of your day-to-day activities will include -
Collaborating with engineers, product managers and business users for optimal usage of data
Understanding business use cases using data
Analysing data to inform business decisions
Troubleshooting complex data integration problems at a business level
Writing and enhancing complex queries in Databricks, Hive, and Snowflake (a minimal example follows this list)
Providing input to product management on growing the foundational data layers
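As a loose illustration of the query work described above, here is a minimal sketch of a windowed aggregation expressed through PySpark's SQL interface, as it might run on Databricks; the sales table and its columns are hypothetical.

```python
from pyspark.sql import SparkSession

# Minimal sketch: a windowed aggregation of the kind often written and tuned
# in Databricks/Snowflake workloads. Table and column names are hypothetical.
spark = SparkSession.builder.appName("query-sketch").getOrCreate()

top_stores = spark.sql("""
    SELECT store_id,
           order_month,
           monthly_revenue,
           RANK() OVER (PARTITION BY order_month
                        ORDER BY monthly_revenue DESC) AS revenue_rank
    FROM (
        SELECT store_id,
               DATE_TRUNC('month', order_date) AS order_month,
               SUM(order_total)                AS monthly_revenue
        FROM sales
        GROUP BY store_id, DATE_TRUNC('month', order_date)
    )
""").where("revenue_rank <= 10")

top_stores.show()
```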

The core competencies required for this role include -
• Bachelor’s degree in computer science engineering
• 5+ years of hands-on experience in data engineering field
• In-depth knowledge of the big data tech stack
• Expertise in PySpark, SQL, and data modelling
• Expertise in Databricks, Snowflake, EMR, and Airflow
• Pipeline design and architectural skillset
• Knowledge of performance tuning for big data pipelines
• Excellent written and verbal communication skills
• Mentoring and guiding skills
On a day-to-day basis, you'll focus on -
• Building, enhancing, and troubleshooting complex data pipelines
• Collaborating with product managers, engineers, analysts to build, enhance and troubleshoot data pipelines
• Contributing towards the design and architecture of the overall data landscape at Nike
• Collaborating with lead and principal engineers to define and implement quality standards across data pipelines
• Collaborating with engineering leadership to define and implement observability metrics across data pipelines (a minimal sketch follows this list)
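As a hedged sketch of what pipeline observability metrics can look like in practice, the snippet below computes a row count, a null rate, and a freshness timestamp with PySpark; the orders table and its columns are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal sketch of pipeline observability metrics: row count, null rate on a
# key column, and data freshness. Table and column names are hypothetical.
spark = SparkSession.builder.appName("observability-sketch").getOrCreate()

df = spark.table("orders")  # hypothetical table

metrics = df.agg(
    F.count("*").alias("row_count"),
    F.avg(F.col("customer_id").isNull().cast("int")).alias("null_rate"),
    F.max("updated_at").alias("latest_update"),
).first()

# In a real pipeline these values would be emitted to a metrics store or
# alerting system rather than printed.
print(
    f"rows={metrics['row_count']}, "
    f"null_rate={metrics['null_rate']:.2%}, "
    f"latest={metrics['latest_update']}"
)
```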


You will be part of the Digital Design & Merchandising, Product Creation, Planning, and Manufacturing Technology team at Converse. You will take direction from and work primarily with the Demand and Supply team, supporting the business planning space. You'll work with a talented team of engineers, data architects, and business stakeholders to design and implement scalable data integration solutions on cloud-based platforms to support our planning org. The successful candidate will be responsible for leading the integration of planning systems, processes, and data across the organization.
We're looking for a seasoned Cloud Integration Lead with expertise in Databricks, Apache Spark, and cloud-based data integration. You'll have a strong technical background, excellent collaboration skills, and a passion for delivering high-quality solutions.
The ideal candidate will have:
5+ years of experience with Databricks, Apache Spark, and cloud-based data integration.
Strong technical expertise with cloud-based platforms, including AWS and/or Azure.
Strong programming skills in languages like SQL, Python, Java, or Scala.
3+ years' experience with cloud-based data infrastructure and integration, leveraging tools like S3, Airflow, EC2, AWS Glue, DynamoDB, Lambda, Athena, AWS CodeDeploy, Azure Data Factory, or Google Cloud Dataflow.
Experience with Jenkins and other CI/CD tools like GitLab CI/CD, CircleCI, etc.
Experience with containerization using Docker and Kubernetes.
Experience with infrastructure as code using tools like Terraform or CloudFormation.
Experience with Agile development methodologies and version control systems like Git.
Experience with IT service management tools like ServiceNow, JIRA, etc.
Experience with data warehousing solutions such as Amazon Redshift, Azure Synapse Analytics, or Google BigQuery is a plus but not mandatory.
Familiarity with data science and machine learning frameworks such as TensorFlow, PyTorch, or scikit-learn is a plus but not mandatory.
Strong technical background in computer science, software engineering, or a related field.
Excellent collaboration, communication, and interpersonal skills.
Experience with data governance, data quality, and data security principles.
Ability to lead and mentor junior team members.
AWS Certified Solutions Architect, AWS Certified Developer Associate, or an equivalent Azure solutions architect certification.
WHAT YOU’LL WORK ON
Design and implement scalable data integration solutions using Databricks, Apache Spark, and cloud-based platforms.
Develop and implement cloud-based data pipelines using Databricks, NiFi, AWS Glue, Azure Data Factory, or Google Cloud Dataflow (a minimal orchestration sketch follows this list).
Collaborate with cross-functional teams to deliver high-quality solutions that meet business requirements.
Develop and maintain technical standards, best practices, and documentation.
Integrate various data sources, including on-premises and cloud-based systems, applications, and databases.
Ensure data quality, integrity, and security throughout the integration process.
Collaborate with data engineering, data science, and business stakeholders to understand requirements and deliver solutions.
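As a minimal sketch of how such a pipeline might be orchestrated, the Airflow DAG below chains hypothetical extract, transform, and load steps; all names are placeholders, and Airflow 2.4+ is assumed for the schedule parameter. In a real deployment each task would trigger a Databricks, Glue, or Data Factory job rather than print.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; real tasks would invoke cloud services.
def extract():
    print("pull source data, e.g. from S3 or an on-premises database")

def transform():
    print("run a Spark/Databricks job to clean and join the data")

def load():
    print("publish curated tables to the warehouse")

with DAG(
    dag_id="planning_integration_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```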

• 8+ years of experience in software development, with a focus on PLM (Product Lifecycle Management) and/or retail industry
• 5+ years of experience with FlexPLM. VibeIQ experience a plus
• Bachelor's degree in computer science, Information Technology, or related field
• PTC FlexPLM Certification is a plus
• Strong knowledge of Java, Spring, and Hibernate. Python experience a plus
• Experience with web services, RESTful APIs, and microservices architecture
• Familiarity with Agile development methodologies and version control systems (e.g., Git)
• Strong understanding of database design and data modelling
• Experience with cloud-based platforms (AWS and Azure) and data integration tools, such as AWS Glue, S3, DynamoDB, Lambda, or Azure Data Factory
• Experience with IT service management tools like ServiceNow, JIRA, etc.
• Experience with Jenkins and other CI/CD tools like GitLab CI/CD and CircleCI, as well as containerization using Docker and Kubernetes
• Strong problem-solving skills and attention to detail
• Excellent communication and collaboration skills
• Ability to work in a fast-paced environment and prioritize multiple tasks
• Strong analytical and problem-solving skills
• Experience working with cross-functional teams across different time zones
• Ability to lead and mentor junior team members
WHAT YOU’LL WORK ON
• Design and implement software solutions using FlexPLM and VibeIQ, ensuring alignment with business requirements and industry best practices (a minimal integration sketch follows this list)
• Collaborate with business stakeholders to understand requirements and develop solutions that meet their needs
• Lead the development team in the design, development, testing, and deployment of software solutions
• Ensure solutions are scalable, secure, and meet performance requirements
• Develop and maintain technical documentation, including architecture diagrams, design documents, and technical notes
• Collaborate with QA team to develop and execute testing plans
• Participate in code reviews and ensure adherence to coding standards and best practices
• Stay up-to-date with industry trends, new technologies, and emerging standards
• Provide technical guidance and mentorship to junior team members
• Collaborate with DevOps team to ensure smooth deployment and operation of software solutions
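As a hedged sketch of the web-service integration work described above, the snippet below calls a hypothetical PLM REST endpoint with Python's requests library; the base URL, path, fields, and token are illustrative only and do not reflect FlexPLM's actual API.

```python
import requests

# Minimal sketch of a REST integration of the kind used to connect PLM data
# to other systems. The base URL, endpoint, fields, and token are hypothetical.
BASE_URL = "https://plm.example.com/api/v1"
TOKEN = "REPLACE_ME"

def fetch_products(season: str) -> list[dict]:
    """Fetch product records for a season from a hypothetical PLM endpoint."""
    resp = requests.get(
        f"{BASE_URL}/products",
        params={"season": season},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["items"]

if __name__ == "__main__":
    for product in fetch_products("SP25"):
        print(product.get("styleNumber"), product.get("name"))
```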

The Technical Program Manager at ITC will partner closely with technology leads and Technical Directors in engineering, product, UX and architecture to ensure software capability execution across all functions of Global Technology. You will have close partnership with leads of other technology domains and business teams associated with the program. You will work closely to coordinate with other Technology teams on dependencies across the portfolio.
You will drive key programs within the Consumer Product & Innovation portfolio. This role will lead diverse cross-functional team members to execute complex and strategic programs. We are looking for a leader who has experience building program management functions and overseeing end-to-end execution across multiple parallel program workstreams, including risk management, issue/conflict resolution, and dependency management.
Bachelor’s degree or master’s (preferred) in Engineering, Computer Science or related technical field
Project Management Professional (PMP) preferred.
5+ years of Technical Program Management experience working directly with engineering teams
3+ years of software development experience
Relevant experience in Retail, Customer Experience, Innovation, eCommerce, Mobile Commerce, Digital marketing, Supply Chain, and Operations preferred.
Experience leading program management efforts, including establishing effective governance structures, cross-functional coordination, and executive stakeholder management and reporting.
Strong relationship-building skills and experience working closely with senior executives and cross-functional partners to deliver key strategic initiatives.
Experience running and executing across multiple complex technology platforms and business areas
Strong oral and written communication skills and ability to challenge the status quo
Ability to deal with ambiguity and work in a dynamic, results-oriented matrixed environment
Delivery experience using Agile and Waterfall methodologies
Deep experience with tools such as Aha!, JIRA, and Confluence
Proven experience in a technical program management or engineering delivery role, with a strong ability to drive technical projects to successful completion.
Technical understanding of service-oriented architecture solutions.
Broad technical understanding across a variety of platforms, systems, and engineering disciplines, such as front end, DevOps, and cloud services.
Excellent communication skills, with the ability to translate complex technical concepts to non-technical stakeholders and vice versa.
You will assess and understand product specifications and coordinate the teams and resources needed to execute those specifications. You will lead the program and have oversight of its execution and development. You will be responsible for the timely, on-budget, end-to-end execution of the technology program, actively engaging key business and technology portfolio partners to ensure opportunities, risks, and issues are understood and addressed. You will co-create and help develop program dashboards, portfolio reporting, and analysis activities.