
Apple Software Development Engineer - Applications 
United States, Washington, Seattle 
357741887

Apple Inc. has the following job opportunity available in Seattle, Washington.
  • Collaborate with the Data Science & Analytics team to analyze data, develop KPIs, and drive improvements to user-facing features.
  • Collaborate closely with the Data Science & Analytics team to collect and analyze new business requirements, translate them into technical specifications, and design solutions that meet stakeholders' analytics needs.
  • Understand the usage of, and configure, data ingestion technologies developed at Apple to ingest data from various source systems.
  • Design and develop data pipelines for Payments data products using the ASE ADE tech stack to meet business requirements.
  • Engage in design discussions and code reviews, and explore new technologies to apply creative solutions.
  • Develop code using Spark, Java, Scala, Kafka, Cassandra, SQL, Parquet, Apache Iceberg, and other Big Data technologies to build data pipelines for analytics (a brief illustrative sketch follows this list).
  • Integrate data from different source systems into the data pipeline.
  • Review source code to debug defects and performance issues.
  • Build quality, scalable data pipeline products that can handle existing and projected data volumes.
  • Deploy data pipelines to production, monitor performance, and tune for optimal resource utilization.
  • Address critical issues or bugs reported on production systems by reviewing, validating, and fixing bugs in the data pipeline code.
  • Plan and develop proofs of concept with emerging technologies to meet evolving business requirements.
40 hours/week. At Apple, base pay is one part of our total compensation package and is determined within a range. This provides the opportunity to progress as you grow and develop within a role. The base pay range for this role is between $137,259 and $204,000/yr, and your base pay will depend on your skills, qualifications, experience, and location.
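As a rough, non-authoritative illustration of the pipeline work described above, the following minimal Scala sketch reads payment events from Kafka with Spark Structured Streaming and writes date-partitioned Parquet. Every name in it (the Kafka topic, JSON fields such as transaction_id and merchant_id, and the output paths) is invented for the example, and it assumes the spark-sql-kafka connector is available.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal sketch only: topic, field, and path names are invented for illustration.
object PaymentsIngestSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("payments-ingest-sketch")
      .getOrCreate()

    // Read raw payment events from a Kafka topic.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "payments.events")
      .load()

    // Parse the JSON payload into typed columns for downstream analytics.
    val events = raw
      .selectExpr("CAST(value AS STRING) AS json")
      .select(
        get_json_object(col("json"), "$.transaction_id").as("transaction_id"),
        get_json_object(col("json"), "$.merchant_id").as("merchant_id"),
        get_json_object(col("json"), "$.amount").cast("decimal(18,2)").as("amount"),
        get_json_object(col("json"), "$.event_time").cast("timestamp").as("event_time")
      )
      .withColumn("event_date", to_date(col("event_time")))

    // Write date-partitioned Parquet with checkpointing so the stream can recover.
    events.writeStream
      .format("parquet")
      .option("path", "/data/payments/events")
      .option("checkpointLocation", "/data/payments/_checkpoints/events")
      .partitionBy("event_date")
      .start()
      .awaitTermination()
  }
}

Partitioning the output by event date keeps downstream analytics scans cheap; an Apache Iceberg table could serve as the sink instead, in line with the tech stack listed above.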
  • Master’s degree or foreign equivalent in Computer Science, Computer Engineering or related field and 1 year of experience in the job offered or related occupation.
  • 1 year of experience with each of the following skills is required:
  • Developing data pipelines using object-oriented design concepts in the programming language Scala.
  • Designing data pipelines to run on distributed systems, such as clusters running Hadoop, or externally on cloud providers such as Amazon AWS.
  • Using databases and SQL to perform data analysis work.
  • Developing tools that aid the development process, such as dataset documentation, using shell scripting.
  • Using Python to develop tools to aid the development process.
  • Using Cassandra as a data lookup to perform efficient data enrichments in the data pipeline.
  • Building efficient data pipelines to handle large-scale datasets using distributed computing concepts with Apache Spark (see the sketch after this list).
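To make the Cassandra-lookup and Spark skills above concrete, here is a minimal hypothetical sketch of a batch enrichment step. The keyspace, table, column names, and paths are invented for illustration, and it assumes the DataStax Spark Cassandra Connector is on the classpath.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.broadcast

// Hypothetical keyspace, table, and column names; assumes the DataStax
// Spark Cassandra Connector is available on the classpath.
object MerchantEnrichmentSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("merchant-enrichment-sketch")
      .getOrCreate()

    // Payment records written by an upstream ingestion stage (path is illustrative).
    val payments = spark.read.parquet("/data/payments/events")

    // Reference data held in Cassandra, used purely as a lookup table.
    val merchants = spark.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "reference", "table" -> "merchants"))
      .load()
      .select("merchant_id", "merchant_name", "category")

    // Broadcast the small lookup table so the enrichment avoids a full shuffle.
    val enriched = payments.join(broadcast(merchants), Seq("merchant_id"), "left")

    enriched.write
      .mode("overwrite")
      .parquet("/data/payments/enriched")

    spark.stop()
  }
}

Broadcasting the small reference table keeps the join shuffle-free, which is one common way to perform efficient enrichments against Cassandra-sourced lookup data.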