Sr. Technical Data Engineer - Contract | India | Remote | Fintech


Bainbridge-India

Job Type: Contract, project-focused (6–18 months with potential for extension).

Compensation: ₹2.5L–₹3.5L per month. Benefits are not included.

Experience: 5–7 years. Experience in a start-up environment is highly preferred.

Position Location: This is a remote position based in India that requires close collaboration with our U.S. team. Candidates should expect at least 2–3 hours of overlap with U.S. business hours each day for effective communication and collaboration.

Start Date: Immediately

How to Apply: Interested candidates should submit a PDF version of their resume and a brief letter of interest.

About Us

Bainbridge is a leading financial services firm serving top private equity funds and corporate buyers. We have completed over $5 billion in small- to mid-cap acquisitions for PE funds and corporations across all market sectors, including technology, healthcare, automation, global supply chain, machine learning, and energy. We are experiencing exciting growth into the investment banking and fintech arenas.

Position Summary:

We are looking for a skilled and highly motivated Technical Data Engineer (contractor) to join us for a long-term project at a pivotal time. You will have the opportunity to build and shape critical components of our data infrastructure: not entirely from scratch, but close to it. This is a rare chance to design and implement systems that will power decision-making and product innovation across the company, with the potential to transform the way financial services work.

Responsibilities:

  • Gain a comprehensive understanding of current data sources, pipelines, and storage systems.
  • Design, build, and maintain scalable ETL/ELT pipelines to automate the movement of data from diverse sources to a centralized data warehouse.
  • Optimize data pipelines for performance, reliability, and maintainability.
  • Ensure data is validated, transformed, and stored in formats that meet analytical needs.
  • Analyze existing data sources (internal databases, third-party APIs, web-based systems) to assess structure, quality, and reliability.
  • Define and document data architecture, recommending improvements to support current and future data needs.
  • Collaborate with stakeholders to align technical solutions with business requirements.
  • Apply data wrangling techniques to prepare raw data for analysis, including handling missing values, deduplicating records, and standardizing schemas.
  • Ensure data integrity and implement logging, alerting, and monitoring for all data workflows.
  • Partner with data analysts and business stakeholders to support A/B testing frameworks and provide infrastructure for running experiments.
  • Enable self-service reporting and analysis by ensuring datasets are well documented and accessible.
  • Assist in the development of dashboards and reports using tools such as Tableau or Power BI.
  • Support the data team in presenting key metrics and insights in a visually compelling way.
  • Continuously identify and integrate new data sources, internal or external, to enhance business insights and competitive edge.
  • Deploy systems to proactively monitor data quality, pipeline health, and job failures.
  • Design and implement automated pipelines to ingest and process data from key sources.
  • Ensure data is clean, validated, and ready for use by analysts and stakeholders.
  • Document existing workflows and identify quick wins for optimization.
  • Take initiative and ownership of projects from concept to deployment, demonstrating a builder's mindset.

Qualifications:

  • Master’s degree in Computer Science, Engineering, Data Science, or a related field.
  • 5+ years of experience as a Data Engineer or in a similar role.
  • Proficient in SQL and at least one programming language (e.g., Python).
  • Experience with data pipeline tools (e.g., Airflow, dbt, Apache Beam, etc.).
  • Familiarity with cloud platforms (AWS, GCP, Azure) and data warehouses (e.g., Snowflake, BigQuery, Redshift).
  • Experience with web-scraping frameworks such as BeautifulSoup, Scrapy, or Selenium.
  • Knowledge of A/B testing frameworks and visualization tools (e.g., Tableau, Power BI).
  • Strong problem-solving skills and the ability to work in a fast-paced environment.
  • Experience in an entrepreneurial or start-up environment preferred.
  • Must be able to manage time and deliverables independently.
