

What you'll be doing:
As a senior member of our team, you will work with pre-silicon and post-silicon data analytics: visualization, insights, and modeling.
Design and maintain robust data pipelines and ETL processes for ingesting and processing DFX Engineering data from diverse sources
Lead engineering efforts by collaborating with cross-functional teams (execution, analytics, data science, product) to define data requirements and ensure data quality and consistency
You will work on hard-to-solve problems in the Design for Test space, applying algorithm design, statistical tools to analyze and interpret complex datasets, and explorations using Applied AI methods.
In addition, you will help develop and deploy DFT methodologies for our next generation products using Gen AI solutions.
You will also help mentor junior engineers on test designs and trade-offs including cost and quality.
What we need to see:
BSEE (or equivalent experience) with 5+, MSEE with 3+, or PhD with 1+ years of experience in low-power DFT, Data Visualization, Applied Machine Learning or Database Management.
Experience with SQL, ETL, and data modeling is crucial
Hands-on experience with cloud platforms (AWS, Azure, GCP)
Design and implement highly scalable, fault tolerant distributed database solutions
Lead data modeling, performance tuning, and capacity planning for large-scale, mission-critical storage workloads
Excellent knowledge of statistical tools for data analysis and insights.
Strong programming and scripting skills in Perl, Python, C++, or Tcl are expected
Outstanding written and oral communication skills, with the curiosity to take on novel challenges.
Ways to stand out from the crowd:
Experience in data pipeline and database architecture for real-world systems
Experience in application of AI for EDA-related problem-solving
A good understanding of the technology and passion for what you do
Strong collaborative and interpersonal skills, specifically a proven ability to effectively guide and influence within a dynamic environment
You will also be eligible for equity.

What you'll be doing:
Develop and implement the business logic in the new End-to-End Data systems for our Planning, Logistics, Services, and Sourcing initiatives.
Lead discussions with Operations stakeholders and IT to identify and implement the right data strategy given data sources, data locations, and use cases.
Analyze and organize raw operational data including structured and unstructured data. Implement data validation checks to track and improve data completeness and data integrity.
Build data systems and data pipelines to transport data from a data source to the data lake ensuring that data sources, ingestion components, transformation functions, and destination are well understood for implementation.
Prepare data for AI/ML/LLM models by making sure that the data is complete, has been cleansed, and has the necessary rules in place.
Build/develop algorithms, prototypes, and analytical tools that enable the Ops teams to make critical business decisions.
Build data and analytics solutions for key initiatives to set up manufacturing plants in the US.
Support key strategic initiatives such as building scalable, cross-functional data lake solutions.
What we need to see:
Master's or Bachelor's degree in Computer Science or Information Systems, or equivalent experience
8+ years of relevant experience, including programming knowledge (e.g., SQL, Python, Java)
Highly independent, able to lead key technical decisions, influence project roadmap and work effectively with team members
Experience architecting, designing, developing, and maintaining data warehouses/data lakes for complex data ecosystems
Expertise in data and database management, including data pipeline responsibilities spanning replication, mass ingestion, streaming, and API, application, and data integration
Experience in developing required infrastructure for optimal extraction, transformation, and loading of data from various sources using Databricks, AWS, Azure, SQL or other technologies
Strong analytical skills with the ability to collect, organize, and disseminate significant amounts of information with attention to detail and accuracy
Knowledge of supply chain business processes for planning, procurement, shipping, and returns of chips, boards, systems, and networking.
Ways to stand out from the crowd:
Self-starter, collaborative, positive mindset, committed to growth with integrity and accountability, highly motivated, driven, and high-reaching
Solid ability to drive continuous improvement of systems and processes
A consistent record of working in a fast-paced environment where good interpersonal skills are crucial
You will also be eligible for equity.

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners worldwide whose common language is English.

Passion, excitement & global collaboration are all core to what it means to be a FlyMate. At Flywire, we're on a mission to deliver the world's most important and complex payments. We use our Flywire Advantage - the combination of our next-gen payments platform, proprietary payment network, and vertical-specific software - to help our clients get paid, and help their customers pay with ease, no matter where they are in the world.
What more do we need to truly be unstoppable? Perhaps, that is you!
Who we are:
Today we support more than 4,800 clients across the global education, healthcare, travel & B2B industries, with diverse payment methods across 240 countries & territories and more than 140 currencies.
With over 1,200 global FlyMates, representing more than 40 nationalities, and in 12 offices world-wide, we're looking for FlyMates to join the next stage of our journey as we continue to grow.
The Opportunity:
At Flywire, we are seeking an experienced Implementation Consultant to join our dynamic and growing Healthcare vertical.
Key Responsibilities:
What We Offer:
Submit today and get started!
We are excited to get to know you! Throughout our process you can expect to meet with different FlyMates, including the Hiring Manager, peers on the team, and the VP of the department, as well as complete a skills assessment. Your Talent Acquisition Partner will walk you through the steps and be your "go-to" person for any questions.
Flywire is an equal opportunity employer. With over 30 nationalities across 12 different offices, and diversity and inclusion at the core of our people agenda, we believe our FlyMates are our greatest asset, and we're excited to watch our unique culture evolve with each new hire.
#LI-remote



