Senior Software Engineer
Leads the work of small groups of four to six engineers, including offshore associates, on assigned Engineering projects by providing pertinent documents, direction, and examples; delivering short- and long-term solutions and resolutions; and leading cross-functional partnerships. Leads the discovery phase of medium to large projects by reviewing project requirements; translating requirements into technical solutions; gathering requested information (for example, design documents, product requirements, wireframes); writing and developing code; conducting unit testing; communicating status and issues to team members and stakeholders; collaborating with the project team and cross-functional teams; troubleshooting open issues and bug fixes; ensuring on-time delivery and hand-offs; interacting with the project manager to provide input on the project plan; and providing leadership to the project team.
Minimum education and experience: Bachelor's degree or the equivalent in Computer Science, Information Technology, Engineering, or a related field plus 5 years of experience in software engineering or related experience; OR Master's degree or the equivalent in Computer Science, Information Technology, Engineering, or a related field plus 2 years of experience in software engineering or related experience.
Skills required: Must have experience with:
- gathering requirements for applications that automate RDBMS database design and data modeling;
- coding and debugging complex SQL queries, including resolving performance bottlenecks and deadlocks;
- designing, implementing, and maintaining coding standards to enhance application development and maintenance for data pipelines running on Big Data technology;
- designing, developing, optimizing, and troubleshooting complex data pipelines and systems with data integrity, capturing structured and unstructured data into a Hadoop Big Data environment using MapReduce, Hive, Pig, and Sqoop scripts;
- building ETL data pipelines and applications to stream and process datasets at low latency into Hadoop Big Data processes using Python and Shell scripts;
- designing, developing, and enhancing a Cloud data platform to process high volumes of data;
- analyzing and implementing alerting systems to handle data integrity, data lineage, and data governance;
- utilizing scheduling tools such as Airflow to schedule data pipelines based on requirements (see the sketch after this list);
- supporting and optimizing production processes in the pre-production stage, preparing components and supplies, and verifying that all project documentation is thorough and complete;
- identifying and driving improvements in infrastructure and system reliability, performance, monitoring, and overall platform stability; and
- building tools and automations to eliminate repetitive tasks and prevent incidents using Python, SQL, and Shell scripts.
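To make the pipeline-scheduling requirement above concrete, here is a minimal, purely illustrative Airflow sketch: a daily DAG that runs a Sqoop import followed by a Hive transformation. It assumes Apache Airflow 2.x, and the DAG id, task names, connection string, table, and script path are hypothetical examples, not details from the posting.

```python
# Illustrative only: a minimal Airflow 2.x DAG sketch for a Sqoop-to-Hive pipeline.
# The DAG id, owner, connection string, table, and script path are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",          # hypothetical team name
    "retries": 2,                         # retry each failed task twice
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="example_etl_pipeline",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",           # run the pipeline once per day
    catchup=False,
    default_args=default_args,
) as dag:
    # Land raw data into the Hadoop environment via a Sqoop import (hypothetical source).
    ingest = BashOperator(
        task_id="sqoop_ingest",
        bash_command=(
            "sqoop import --connect jdbc:mysql://example-host/db "
            "--table orders --target-dir /data/raw/orders"
        ),
    )

    # Transform the landed data with a Hive script (hypothetical script path).
    transform = BashOperator(
        task_id="hive_transform",
        bash_command="hive -f /opt/etl/transform_orders.hql",
    )

    ingest >> transform  # ingestion must finish before transformation runs
```

The `ingest >> transform` line declares the task dependency; the Airflow scheduler then triggers the DAG on its daily schedule and retries failed tasks according to `default_args`.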