
Microsoft Technical Solution Management 
Taiwan, Taoyuan City 
280714887

17.07.2025

Required:

  • Bachelor’s degree in Computer Science, MIS, Data Engineering, or equivalent work experience.
  • 5–8 years of experience building cloud-based data systems and ETL frameworks.
  • Demonstrated experience with relational databases, cloud-based data systems, and large-scale data pipelines and orchestration tools (e.g., Azure Data Factory, Azure Synapse, Azure Data Lake, SQL, Spark, PySpark, Python).
  • Proficiency in visualization tools such as Microsoft Power Platform, including Power Apps, Power Automate, Power BI, and Fabric.

Preferred:

  • Strong foundation in data modeling, warehousing, and data lake architecture.
  • Familiarity with ERP systems such as SAP and Dynamics 365.
  • Experience in Microsoft Power Platform – Power BI, Power Apps, Power Automate, and Fabric.
  • Proficient in modern development practices – version control (e.g., Git), CI/CD, Agile (Scrum).
  • Hands-on experience in implementing data security, compliance controls, and governance frameworks.
  • Understanding of SFI guidelines, cloud security and data access controls.
  • Ability to manage system upgrades, apply timely security patches, and proactively address vulnerabilities.
  • Knowledge of and experience implementing machine learning, an application of artificial intelligence (AI) that gives systems the ability to automatically learn and improve from experience without being explicitly programmed.

Key Competencies

  • Strong business acumen and ability to align data capabilities with strategic business outcomes.
  • Deep understanding of data privacy, regulatory compliance, and data lifecycle management.
  • Exceptional collaboration and communication skills—able to work across technical and non-technical teams globally.
  • Self-starter mindset with the ability to thrive in a fast-paced, evolving environment.
  • Strong analytical thinking, problem-solving skills, and a passion for continuous improvement.
  • Ability to drive change and promote a data-driven culture within the organization.

Responsibilities

The ideal candidate will blend strong technical expertise in data solutioning with sharp business acumen, supporting data analysts, scientists, and business partners. Key responsibilities include:

  • Design and develop scalable data ingestion pipelines from multiple structured/unstructured sources such as Azure Data Lake, SQL Server, Kusto, and flat files.
  • Implement data orchestration using Spark, PySpark, and Python (see the illustrative sketch after this list).
  • Implement ETL jobs to optimize data flow and reliability.
  • Model and optimize data architecture by designing logical and physical data models supporting near real-time analytics.
  • Perform data profiling and gap analysis to support migration from legacy BI platforms to next-gen platforms such as Microsoft Fabric and Keystone-based data sourcing.
  • Ensure models support future scalability, privacy, and data lifecycle governance.
  • Adhere to Microsoft’s SFI guidelines, data residency policies, and data privacy regulations.
  • Ensure Data Security, Privacy, and Compliance by implementing data masking and encryption at the required levels.
  • Collaborate with Engineering teams to ensure timely patches and system updates, and to incorporate audit trails and data lineage tracking mechanisms.
  • Define and implement robust data validation, anomaly detection, and reconciliation logic, and monitor and track data pipeline performance.
  • Enable Self-Service BI and Analytics by partnering with SMEs and business stakeholders to enable self-service capabilities using Power BI, Power Platform, and Azure Synapse.
  • Create reusable datasets, certified data models, and intuitive visualizations that align with business priorities.
  • Collaborate with Engineering and Business Stakeholders by translating business requirements into technical specifications and scalable data solutions.
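
For illustration only, below is a minimal PySpark sketch of the kind of ingestion, validation, and transformation work these responsibilities describe. The storage paths, dataset, and column names (examplelake, order_id, amount, and so on) are hypothetical placeholders, not details taken from this role.

# Illustrative sketch only; all paths, tables, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-ingestion-sketch").getOrCreate()

# Ingest raw CSV flat files from a (hypothetical) data lake container.
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/")
)

# Basic validation: drop rows missing key fields, quarantine negative amounts.
clean = raw.dropna(subset=["order_id", "order_date", "amount"])
anomalies = clean.filter(F.col("amount") < 0)

# Simple transformation: daily revenue aggregate for downstream BI models.
daily_revenue = (
    clean.filter(F.col("amount") >= 0)
    .groupBy(F.to_date("order_date").alias("order_day"))
    .agg(F.sum("amount").alias("revenue"), F.count("order_id").alias("orders"))
)

# Persist curated outputs as Parquet for analytics and reporting consumers.
daily_revenue.write.mode("overwrite").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/daily_revenue/"
)
anomalies.write.mode("overwrite").parquet(
    "abfss://quarantine@examplelake.dfs.core.windows.net/sales_anomalies/"
)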