Apple Software Development Engineering 
United States, Washington, Seattle 
898383385

08.06.2025
APPLE INC has the following position available in Seattle, Washington:
  • Collaborate with cross-functional teams, including data scientists, software engineers, and system administrators, to define the architectural requirements of the data platform, facilitating discussions so that requirements accommodate diverse technical perspectives and align with overarching business objectives.
  • Set up and manage data storage solutions, including databases (SQL and NoSQL), data lakes, and distributed file systems, ensuring data integrity, availability, and scalability.
  • Develop and maintain efficient data ingestion pipelines, ETL (Extract, Transform, Load) processes, and data transformation workflows to enable timely and accurate data processing.
  • Utilize technologies including Apache Spark, Hadoop, Flink, and relevant data processing frameworks to perform complex data transformations and aggregations (a minimal sketch follows this list).
  • Continuously evaluate data processing performance, pinpoint bottlenecks, and drive enhancements to achieve superior system performance.
  • Develop and maintain deployment scripts, configuration templates, and infrastructure blueprints to ensure consistent and repeatable infrastructure setup.
  • Document infrastructure architecture, design decisions, deployment processes, troubleshooting procedures, and best practices for internal knowledge sharing.
  • Ensure stringent security and regulatory compliance by implementing and managing Apache Ranger, a comprehensive access control and security framework.
  • Develop strategies and implement initiatives to increase customer engagement through targeted communication and personalized solutions, optimizing client satisfaction and retention.
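
The posting itself contains no code, but as a rough illustration of the Spark-based transformation and aggregation work described above, a minimal PySpark sketch might look like the following; the storage paths, column names, and event schema are hypothetical and not taken from the posting:

```python
# Minimal PySpark sketch: read raw event logs from a data lake, clean them,
# and aggregate daily counts per user. All paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-event-aggregation").getOrCreate()

# Ingest raw events (hypothetical location and schema).
events = spark.read.parquet("s3://example-bucket/raw/events/")

# Basic cleaning: drop malformed rows, derive a date column from the timestamp.
cleaned = (
    events.dropna(subset=["user_id", "event_type", "event_ts"])
          .withColumn("event_date", F.to_date("event_ts"))
)

# Aggregate events per user, per day, per event type.
daily_counts = (
    cleaned.groupBy("event_date", "user_id", "event_type")
           .agg(F.count("*").alias("event_count"))
)

# Write the result back, partitioned by date for downstream consumers.
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)
```

In practice an orchestrator such as Airflow would typically run a job like this as one step of the larger ingestion pipeline described above.
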
  • Master’s degree or foreign equivalent in Computer Science or related field.
  • Experience and/or education with each of the following:
  • Develop RESTful APIs and microservices using Python for scalable and efficient backend systems (see the first sketch after this list).
  • Design interactive and responsive user interfaces using React, Redux, and Python for business intelligence portals.
  • Utilize Python, AWS Lambda, and API Gateway to process and analyze data within Snowflake, ensuring seamless data workflows.
  • Implement asynchronous I/O and multi-threading techniques to optimize system performance and responsiveness (see the second sketch after this list).
  • Apply Hadoop and MapReduce methodologies to conduct thorough analysis and extract insights from large datasets.
  • Perform data mining tasks and create engaging data visualizations using D3.js to facilitate understanding and decision-making.
  • Leverage NVIDIA CUDA and modern GPU technology for accelerated computations and enhanced processing capabilities.
  • Implement cloud computing solutions and work with distributed systems, utilizing Kubernetes scheduler algorithms for efficient resource management.
  • Lead agile development practices and establish robust DevOps pipelines using Kubernetes, Python, cloud platforms, and Docker containers.
  • Apply network security protocols, including authentication methods, cryptographic algorithms (RSA, DH, and DSA), firewalls, TLS, SSH, and IPsec, to ensure data integrity and confidentiality in communication channels.
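
As a hedged illustration of the "RESTful APIs and microservices using Python" item, the first sketch below uses FastAPI; the service name, the Item model, and the in-memory store are illustrative placeholders rather than anything specified in the posting:

```python
# Minimal FastAPI sketch: one resource with read and upsert endpoints,
# backed by an in-memory store. All names here are illustrative only.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="example-backend")

class Item(BaseModel):
    name: str
    quantity: int

_store: dict[int, Item] = {}

@app.put("/items/{item_id}")
def upsert_item(item_id: int, item: Item) -> Item:
    # Create or replace the item under the given id.
    _store[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    # Return the stored item, or a 404 if it does not exist.
    if item_id not in _store:
        raise HTTPException(status_code=404, detail="item not found")
    return _store[item_id]
```

Assuming the file is saved as service.py, it can be run locally with "uvicorn service:app --reload" and exercised with curl or any HTTP client.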
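
The second sketch illustrates the asynchronous I/O item: a standard-library-only example that bounds concurrency with a semaphore, with asyncio.sleep standing in for a real database or HTTP call:

```python
# Minimal asyncio sketch: run many I/O-bound "requests" concurrently while
# capping how many are in flight at once. The sleep simulates real I/O.
import asyncio
import random

async def fetch_record(record_id: int, limit: asyncio.Semaphore) -> str:
    # Acquire a slot, then await the (simulated) I/O without blocking the loop.
    async with limit:
        await asyncio.sleep(random.uniform(0.05, 0.2))
        return f"record-{record_id}"

async def main() -> None:
    limit = asyncio.Semaphore(10)  # at most 10 concurrent requests
    results = await asyncio.gather(*(fetch_record(i, limit) for i in range(100)))
    print(f"fetched {len(results)} records concurrently")

if __name__ == "__main__":
    asyncio.run(main())
```

asyncio pays off when tasks spend most of their time waiting on I/O; for CPU-bound work, multi-threading or multiprocessing is usually the better fit.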