Job Title: Data Engineer

Skills: Python, Databricks, Spark, GCP/GCS, DBT, SQL

Experience: 4+ years

Location: Bangalore, India

Job Description:

About Quation

We specialize in delivering synergistic solutions that enable businesses to make informed decisions. With 200+ years of cumulative experience, our leaders strive to create customized solutions that empower Fortune companies to make better decisions. We deliver tangible and measurable benefits through advanced analytics solutions.
Quation is a forward-thinking organization committed to delivering high-quality services to its clients. We are known for our expertise in specialized fields, namely Technology, Supply Chain Analytics, Data Engineering, Data Warehousing, and Marketing Analytics.
Our goal is to help businesses achieve their objectives by providing them with the tools, information, and resources they need to succeed. Quation provides exceptional customer service and builds long-term client relationships.
Data is an asset! When harnessed effectively, it enhances performance, fosters efficiency, and accelerates growth in real time.
At Quation, we believe the field of analytics is constantly evolving. To stay abreast of the latest trends and technologies, we make significant investments in learning and development.
With deep domain expertise and core technical knowledge, our products and services provide our customers with a vital competitive edge. We create a world where understanding the complex intricacies of businesses becomes easier.
Our USP is our ability to leverage artificial intelligence to help businesses solve complex problems and make data-driven decisions. We analyze vast amounts of data quickly and accurately, providing insights that would be difficult or impossible to obtain through traditional methods.
We help businesses automate processes, improve efficiency, reduce costs, and gain a competitive advantage in the marketplace.

Key Responsibilities:

  • Design, develop, and maintain robust data pipelines and ETL processes using Python, Databricks, Spark, and SQL.
  • Implement data integration solutions on Google Cloud Platform (GCP), specifically Google Cloud Storage (GCS).
  • Utilize DBT (Data Build Tool) to transform data in our data warehouse, ensuring data quality and consistency.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.
  • Monitor and optimize data pipelines for performance and scalability.
  • Ensure data security and compliance with relevant data protection regulations.

Requirements:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Data Engineer, with a strong focus on Python, Databricks, Spark, and SQL.
  • Hands-on experience with Google Cloud Platform (GCP) and Google Cloud Storage (GCS).
  • Proficiency with DBT (Data Build Tool) for data transformation.
  • Strong understanding of data warehousing concepts and ETL processes.
  • Experience with version control systems like Git.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication skills and the ability to work effectively in a team.

Preferred Qualifications:

  • Master’s degree in a relevant field.
  • Familiarity with additional cloud platforms and big data technologies.
  • Certification in GCP or relevant technologies.

Benefits:

  • Competitive salary and performance-based incentives.
  • Comprehensive health insurance.
  • Opportunities for professional development and career growth.
  • Flexible working hours and the possibility of remote work.
  • Collaborative and inclusive work environment.