

Minimum Qualifications:
• 7+ years of experience with a Bachelor's or Master's degree in Computer Science, Software Engineering, or a related technical field.
• Technical expertise including, but not limited to, C, C++, and Python
• Strong understanding of memory management, pointers, and system-level programming
• Experience with multi-threading, concurrency, and synchronization
• Knowledge of performance optimization techniques
Responsibilities:
• Design and implement telemetry collection agents and data processing pipelines
• Develop high-performance, low-latency telemetry systems using C/C++ for system-level components
• Build telemetry analysis tools and automation scripts using Python
• Optimize data collection mechanisms to minimize system overhead and resource consumption
• Implement real-time data streaming and batch processing solutions
• Design APIs and interfaces for telemetry data access and integration
• Debug complex system-level issues across distributed environments
• Ensure telemetry systems meet reliability, scalability, and security requirements
• Collaborate with global platform teams to instrument applications and services
• Maintain and enhance existing telemetry services
• Work effectively in cross-functional teams with product managers, SREs, and other engineers
• Communicate technical concepts clearly to both technical and non-technical stakeholders
• Provide constructive code reviews and technical feedback
• Mentor junior engineers and share knowledge proactively
• Participate actively in design discussions and technical decision-making
• Write clean, well-documented, and testable code

• The successful candidate will work closely with yield, process, and quality engineers to understand their business needs and analysis methodologies, and develop solutions that support data-driven decisions in their daily work.
• Help users perform quick and efficient signal detection for yield or process issues, improve yield and quality, and reduce cost to meet Quality/Output/Cost goals.
• Stay abreast of the latest advancements in data solution development and evaluate their applicability to the manufacturing processes and ecosystems.
Qualifications:
Education:
• Bachelor's degree in Computer Science/Computer Engineering or a related field with 5+ years of industry/post-university experience and solid software programming experience, OR
• Master's degree in Computer Science/Computer Engineering or a related field with 3+ years of strong software programming experience, OR
• PhD in Computer Science/Computer Engineering or a related field with strong software programming experience.
Skills:
• Data Analysis/Visualization: TIBCO Spotfire (preferred), Power BI, or Tableau. Ability to create custom dashboards.
• Scripting/Programming: Python or
• Web/API Integration: REST API, JSON, XML. JavaScript/Web/HTML scripting.
• Data Engineering and ETL Skills: Experience with data pipelines, data ingestion, and transformation. Familiarity with Apache NiFi, Kafka, or similar ETL frameworks. Shell scripting (e.g., Bash) for task automation.
• Database Querying: SQL (Oracle, MS SQL, or Postgres).
• Experience with NoSQL databases (MongoDB, Cassandra, HBase) is a plus
• Data Science/ML: Experience with Jupyter notebooks, MLOps platforms, and advanced Python is a plus.
• Mfg/Semiconductor process/data domain knowledge is a plus.
Experienced Hire
Shift 1 (India)
India, Bangalore
This role will require an on-site presence.
* Job posting details (such as work model, location or time type) are subject to change.