Job Description:
This job is responsible for developing and delivering complex requirements to accomplish business goals. Key responsibilities include ensuring that software is developed to meet functional, non-functional, and compliance requirements, and that solutions are well designed, with maintainability, ease of integration, and testing built in from the outset. Job expectations include strong knowledge of development and testing practices common to the industry, as well as design and architectural patterns.
Responsibilities:
Codes solutions and unit tests to deliver a requirement/story per the defined acceptance criteria and compliance requirements
Designs, develops, and modifies architecture components, application interfaces, and solution enablers while ensuring principal architecture integrity is maintained
Mentors other software engineers and coaches the team on Continuous Integration and Continuous Delivery (CI/CD) practices and automating the tool stack
Executes story refinement, defines requirements, and estimates the work necessary to realize a story through the delivery lifecycle
Performs spikes/proofs of concept as necessary to mitigate risk or implement new ideas
Automates manual release activities
Designs, develops, and maintains automated test suites (integration, regression, performance)
Required Qualifications:
A minimum of a Bachelor’s degree in a related discipline, or equivalent working experience
9-10 years of end-to-end experience in Java application development, including Spring Boot and Hibernate
Strong knowledge of big data technologies and frameworks, such as Apache Hadoop, Apache Spark, Apache Kafka, Apache Hive or Impala.
Experience building Hadoop-based data management applications
Strong core Java skills, including multithreading, the Collections API, Streams, and JDBC, as well as knowledge of Java profiling tools
Experience using frameworks such as Spring, Spring Boot, JPA, and Hibernate
Ability to design Hive/HBase distributed data warehouse and analytical solutions that deliver on multiple use cases
Experience designing, developing, and maintaining cross-platform ETL processes and MapReduce/Hive data processing workflows
Experience discovering, ingesting, and incorporating new sources of real-time, streaming, batch, and API-based data into the platform; loading and managing large data sets in Hadoop; and computing complex logic on Spark platforms
Ability to optimize data pipelines and queries for better performance and scalability
Strong problem-solving abilities and the capability to identify and resolve complex data engineering issues
Leadership and mentoring skills, as senior data engineers often lead and guide other members of the data engineering team
Strong computer science fundamentals in design, data structures, and algorithms
Knowledge of performance tuning for data-intensive applications; expertise in performance profiling, with the ability to identify performance improvements and memory optimizations
Expertise in SQL and NoSQL databases, as well as data integration and transformation tools
Excellent communication skills
Skills:
Application Development
Automation
Influence
Solution Design
Technical Strategy Development
Architecture
Business Acumen
DevOps Practices
Result Orientation
Solution Delivery Process
Analytical Thinking
Collaboration
Data Management
Risk Management
Test Engineering