Role: Data Engineer Architect

Experience: 8+ years

Location: Bangalore

Skills: SQL, Python, PySpark, and Cloud (AWS, Azure); strong in development and architecture


Roles & Responsibilities:
● Work closely with the client to understand the business use case and build the data architecture and data pipelines using scalable, highly available, and fault-tolerant big data services
● Provide thought leadership in all phases of a project, from discovery and planning through implementation and delivery
● Plan, organize, and oversee the completion of data workstream project tasks while ensuring the project stays on time, on budget, and within scope
● Take accountability for technical leadership of the delivery, ensuring a sound, future-proof architecture is planned and the implementation meets technical quality standards
● Lead design, creation, deployment, and review, and obtain final sign-off from the client by following SDLC / Agile best practices
● Analyze the latest big data technologies and their innovative applications, and bring these insights and best practices to the team
● Demonstrate excellent communication and presentation skills
● Lead and manage teams for customer implementations
● Provide thought leadership and mentoring to the data engineering team on how data can be stored and processed more efficiently and quickly at scale
● Apply a good understanding of data engineering challenges and proven experience with data platform engineering (batch and streaming, ingestion, storage, processing, data management, integration, consumption, scheduling, automation, quality control, migration, deployment) using various cloud technologies across AWS / Azure / GCP
● Work with Functional Analysts to understand the core functionalities of the solution and build data models and design documents
● Ensure adherence to security and compliance policies for the products
● Stay up to date with evolving cloud technologies and development best practices, including open-source software
● Apply proven problem-solving skills to anticipate roadblocks, diagnose problems, and generate effective solutions
● Identify and manage risks and issues related to deliverables and arrive at mitigation plans to resolve them
● Distribute workload across sub-teams as per business priority, ensuring an even and well-rounded distribution of work for all analysts
● Work in an Agile environment and provide optimized solutions to customers, using JIRA or similar tools for project management

● Bring strong experience in client-facing roles along with expertise in implementation planning, resource utilization tracking, project plan tracking, periodic project status reporting, and client presentations


Experience: 8+ years
Qualifications: Bachelor’s degree, preferably B.E. / B.Tech
Technical Skills:
● Excellent communication and problem-solving skills.
● Experience managing data and analytics projects
● Proficiency in team management and project management
● Strong experience in client-facing roles
● Highly proficient in project management principles, methods, techniques, and tools
● Strong understanding of big data solutions and analytical techniques
● Hands-on experience in ETL processes and performance optimization techniques is a must
● Prior participation in architecture design and discussions
● Ability to set clear goals and objectives and write reviews for team members
● Minimum of 7 years of experience working with batch processing / real-time systems
● Minimum of 7 years of experience working on data warehouse or data lake projects in a role beyond just data consumption
● Minimum of 4 years of extensive working knowledge of AWS building scalable solutions; an equivalent level of experience in Azure or Google Cloud is also acceptable
● Minimum of 4 years of experience in programming languages (preferably Python)
● Experience in the pharma domain is a big plus
● Familiarity with tools like Git, AWS CodeCommit, Jenkins, and AWS CodePipeline
● Familiarity with Unix/Linux and shell scripting
Additional Skills:
▪ Exposure to the pharma and life sciences domain would be an added advantage.
▪ Certification in any cloud technology such as AWS, GCP, or Azure.

About Quation

We specialize in delivering synergistic solutions that enable businesses to make informed decisions. With 200+ years of cumulative experience, our leaders strive to create customized solutions that empower Fortune companies to make better decisions. We deliver tangible and measurable benefits through advanced analytics solutions.
Quation is a forward-thinking organization committed to delivering high-quality services to its clients. We are known for our expertise in specialized fields, viz. Technology, Supply Chain Analytics, Data Engineering, Data Warehousing, and Marketing Analytics.
Our goal is to help businesses achieve their objectives by providing them with the tools, information, and resources they need to succeed. Quation provides exceptional customer service and builds long-term relationships.
Data is an asset! When harnessed effectively, it enhances performance, fosters efficiency, and accelerates growth in real time.
At Quation, we believe that the field of analytics is constantly evolving. To stay abreast of the latest trends and technologies, we make significant investments in learning and development.
With deep domain expertise and core technical knowledge, our products and services provide our customers with a vital competitive edge. We create a world where understanding the complex intricacies of businesses becomes easier.
Our USP is our ability to leverage artificial intelligence to help businesses solve complex problems and make data-driven decisions. We analyze vast amounts of data quickly and accurately, providing insights that would be difficult or impossible to obtain through traditional methods.
We help businesses automate processes, improve efficiency, reduce costs, and gain a competitive advantage in the marketplace.