Built for Mettler-Toledo DBS Team
Darshan
Senthil
Data Scientist  |  AI Expert  |  M.S. Computer Science

I build AI systems that turn complex, unstructured data into answers that non-technical teams can actually use. NLP pipelines, agentic Slack bots, LLM classification at scale. That is what you are hiring for. That is what I do.

4+
Years of
Experience
20+
Hours/Month
Saved via LLM Bot
100K+
Records via
LLM Pipeline
JD Match: Data Scientist, AI Expert, Columbus
Every requirement in your job description maps to real work I have shipped.
NLP and LLMs
GPT-4 powered Slack bot in production at WPTI, with live Redshift queries via plain English
Agentic Solutions + Knowledge Management
Guardrails AI enforcing SELECT-only validation, Airflow-orchestrated multi-model LLM classification at Rutgers
Data Quality + Coaching Business Partners
Automated SQL reconciliation at Vue.ai; trained 10+ non-technical staff to query live data independently
Python + SQL + Cloud
FastAPI, AWS Redshift, dbt, Airflow, PySpark on Databricks. All in production, not just coursework
Why Mettler-Toledo
Precision is not just a product line. It is a way of working.

Most candidates write about being drawn to MT's global scale. Here is something more specific.

01
The DBS team is doing the most interesting AI work in precision instruments
The Digital Business Services team is building AI that makes MT's internal operations smarter. Not a chatbot demo, not a dashboard refresh. Actual agentic systems that change how 17,000 people across 40 countries work. That is a problem worth spending a career on.
02
MT's precision culture means AI has to be right, not just fast
I built Guardrails AI into my Slack bot specifically because the output needed to be trustworthy, not just plausible. MT's standards of accuracy in instrumentation translate directly to how I think about AI system reliability. This is an environment where that mindset is valued.
03
Internal AI (not product AI) is where the highest leverage work lives
This role focuses on internal solutions first. That is where I have done my most impactful work: replacing 20+ hours of manual data pulling at WPTI, compressing weeks of research coding at Rutgers, cutting discrepancy detection from hours to 15 minutes at Vue.ai.
04
The Columbus team values people who ship, not just people who strategize
Lucas, MT's Data Analytics Program Manager, said: "Even as a newly hired employee, my work is being applied to real business decisions." That is the kind of culture I am looking for. Not two years of shadowing before you are trusted with real problems.
"
MT is moving from data generation to knowledge delivery. I have been building exactly that infrastructure for four years: systems that take raw data and turn it into answers that real people, with no technical background, can act on. That is not a coincidence. That is why I applied.
Darshan Senthil, on why this role specifically
Work That Maps to This Role
Three projects. Each one built for a problem MT is hiring to solve.

These are not side projects or coursework. Each one is in production or published.

Most Relevant to This Role
LLM-Powered NLP-to-SQL Slack Bot
Workforce Professionals Training Institute (WPTI)  |  New York, NY  |  2024 to Present

Program staff needed daily access to attendance and enrollment data but had no SQL skills. I built an agentic system that accepts plain-English questions in Slack, converts them to validated SQL using GPT-4 and Guardrails AI, runs them against live AWS Redshift, and returns the result in under three seconds. It is in production today, used by 10+ people every day.

20+
Hours saved per month across the team
10+
Non-technical staff using it daily
<3s
Query response time
How it works
  • GPT-4 parses natural language into parameterized SQL
  • Guardrails AI enforces SELECT-only with no write operations possible
  • FastAPI backend handles routing and Redshift connection pooling
  • Deployed into existing Slack workspace with zero new tools for users
  • Error handling returns clear messages when queries fall outside scope
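The safety layer above can be sketched in a few lines. This is an illustrative Python check of the SELECT-only policy, not the actual Guardrails AI API or the production code; the statement splitting is deliberately naive for clarity.

```python
import re

# Keywords that indicate a write or DDL operation; any match rejects the query.
FORBIDDEN = re.compile(
    r"\b(INSERT|UPDATE|DELETE|DROP|ALTER|CREATE|TRUNCATE|GRANT|MERGE)\b",
    re.IGNORECASE,
)

def is_select_only(sql: str) -> bool:
    """Accept a query only if it is a single read-only statement."""
    # Naive split on ";" (ignores semicolons inside string literals; a sketch).
    statements = [s.strip() for s in sql.strip().rstrip(";").split(";") if s.strip()]
    if len(statements) != 1:
        return False  # reject multi-statement payloads like "SELECT 1; DROP ..."
    stmt = statements[0]
    if not stmt.upper().startswith(("SELECT", "WITH")):
        return False
    return not FORBIDDEN.search(stmt)
```

In production this policy sits between the GPT-4 output and the Redshift connection, so a hallucinated or injected write can never reach the warehouse.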
Stack
OpenAI GPT-4 Guardrails AI FastAPI AWS Redshift Slack API Python
MT Relevance
The DBS job description says: "develop agentic solutions around knowledge management." This bot is exactly that: an agentic system that gives business partners direct access to structured knowledge without technical mediation. The architecture scales.
LLM Pipeline at Scale
Multi-Model LLM Classification Pipeline
Rutgers School of Public Health  |  New Brunswick, NJ  |  2023 to 2024

Researchers needed 100,000+ social media posts classified for a public health study. Manual coding was taking weeks and producing inconsistent labels. I designed an Airflow-orchestrated pipeline running GPT-4 and Mistral in parallel with ensemble voting to surface high-confidence labels, reducing classification from weeks to hours. The findings were published in a peer-reviewed journal.

100K+
Posts classified via LLM pipeline
Weeks
to Hours
Classification time reduction
Architecture
  • Apache Airflow orchestrates ingestion and model routing
  • GPT-4 and Mistral run in parallel with independent labels per post
  • Ensemble voting: agreement scores flag confidence level
  • Low-confidence posts routed to human review only
  • Full pipeline documented for reproducibility and future reuse
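The ensemble-voting step can be sketched as follows. The two-model setup mirrors the architecture above; the agreement threshold and return shape are illustrative assumptions, not the pipeline's actual code.

```python
from collections import Counter

def vote(labels_by_model: dict[str, str], min_agreement: float = 1.0):
    """Combine per-model labels into (label, confidence, needs_review).

    confidence is the fraction of models agreeing on the majority label;
    anything below min_agreement is flagged for human review.
    """
    counts = Counter(labels_by_model.values())
    label, n = counts.most_common(1)[0]
    confidence = n / len(labels_by_model)
    return label, confidence, confidence < min_agreement

# Agreement between models yields a high-confidence label; disagreement
# routes the post to the human-review queue.
label, conf, review = vote({"gpt-4": "health_claim", "mistral": "health_claim"})
```

Only the low-confidence slice goes to humans, which is what compressed weeks of manual coding into hours.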
Stack
GPT-4 Mistral Apache Airflow Python NLP Ensemble Voting
MT Relevance
MT's DBS team is building agentic solutions for knowledge management and data processing. This pipeline does exactly that: ingesting unstructured content, applying LLM intelligence at scale, and producing structured outputs with measurable confidence. The domain changes; the architecture applies directly.
Data Quality at Production Scale
Automated SQL Reconciliation and Alerting System
Vue.ai  |  Chennai, India  |  2021 to 2022

Data discrepancies across 50+ enterprise client accounts were being caught hours after dashboards had already gone out. I built an automated SQL reconciliation system that runs continuous validation checks, detects anomalies, and fires targeted Slack alerts within 15 minutes with the root cause logged in the alert itself.

Hours
to <15 min
Discrepancy detection time
50+
Enterprise clients covered automatically
What changed
  • Replaced hours of manual SQL checks with automated scheduled validation
  • Standardized the check framework across all 50+ client accounts
  • Slack alerts fire automatically with root cause included in the message
  • Zero manual effort required to maintain or run; fully automated
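The shape of one such check can be sketched like this. Client names, thresholds, and the alert wording are illustrative assumptions; the production system runs checks of this shape on a schedule across all client accounts.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CheckResult:
    """Row counts for one client, compared between source and warehouse."""
    client: str
    source_count: int
    warehouse_count: int

    @property
    def discrepancy(self) -> int:
        return abs(self.source_count - self.warehouse_count)

    def alert(self, tolerance: int = 0) -> Optional[str]:
        """Return a Slack-ready message with context, or None if counts match."""
        if self.discrepancy <= tolerance:
            return None  # no alert fires; dashboards are consistent
        return (
            f":warning: {self.client}: source={self.source_count}, "
            f"warehouse={self.warehouse_count} (diff={self.discrepancy}); "
            f"check latest load for late-arriving records"
        )
```

Because the alert carries the counts and a pointer to the likely cause, the on-call analyst starts from the diagnosis rather than from a blank SQL console.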
Stack
SQL Python PySpark Databricks Slack API
MT Relevance
The JD asks candidates to "assist in delivering and preparing datasets for analysis, ensuring adherence to quality standards and improving data cleaning processes." This system is the direct, production-scale answer to that requirement.
90-Day Plan
What I will build at Mettler-Toledo.

Not a vague "hit the ground running" promise. A concrete plan for the DBS AI team.

Days 1 to 30
Learn and Map
  • Audit the current AI platform: what is live, what is being prototyped, what is blocked
  • Shadow business partner teams to identify the top three knowledge management pain points
  • Map data sources, quality gaps, and existing LLM tooling
  • Talk to the users who will interact with the AI solutions, not just the stakeholders who commission them
Deliverable: Written assessment of highest-impact AI opportunity for the DBS team
Days 31 to 60
Build and Ship
  • Prototype an NLP or agentic solution for the top-priority pain point
  • Apply validation and guardrails to ensure outputs are trustworthy, not just plausible
  • Run real user testing with one or two internal business teams and iterate on actual feedback
  • Document the architecture clearly enough that any teammate can extend it
Deliverable: Working prototype with documented architecture, tested by real users
Days 61 to 90
Scale and Enable
  • Refine the solution based on user feedback: edge cases, UX, performance
  • Train the business partner teams on using the AI solution directly
  • Write the implementation guide so the framework replicates across other MT divisions globally
  • Identify the next highest-priority project and begin scoping
Deliverable: Production-ready AI solution and a reusable playbook for future MT AI projects
Get in Touch
Let's talk about what I'd build for your team.

I am actively interviewing for the Data Scientist, AI Expert role in Columbus. Happy to walk through any of these projects in more detail or discuss the 90-day plan.

Quick facts for the hiring team
Education: M.S. Computer Science, Rutgers University, GPA 3.96
Location: New York, NY, open to relocation to Columbus, OH
Core stack: Python, SQL, OpenAI API, LangChain, FastAPI, AWS, dbt, Airflow, PySpark
Certifications: Databricks Data Engineer Associate, AWS Cloud Practitioner, Google Data Analytics
What I bring: Production AI systems, not just demos. Every project in this portfolio is live or published.