Quantitative Chemical Analysis

Quantitative Analysis: Titration

Titration is a fundamental laboratory technique used to determine the unknown concentration of a solution (the analyte) by reacting it with a solution of known concentration (the titrant). In a clinical laboratory, this process ensures the precise measurement of active ingredients or metabolic markers.

The Titration Process

  1. Preparation: A precise volume of the unknown acid (analyte) is placed in the Erlenmeyer flask. A few drops of an indicator (like phenolphthalein) are added; the indicator remains colorless in acidic conditions.
  2. The Titrant: The burette is filled with a known concentration of base (NaOH). The initial volume reading is recorded.
  3. The Reaction: The base is added drop-by-drop while swirling the flask. As the $OH^-$ ions from the base neutralize the $H^+$ ions from the acid, the solution nears the "Equivalence Point."
  4. The Endpoint: The titration stops at the first sign of a persistent, faint pink color. This "endpoint" signals that the acid has been completely neutralized by the base.
  5. Calculation: By measuring the volume dispensed from the burette, you can calculate the exact number of moles of titrant used and, from the reaction stoichiometry, the concentration of the unknown sample (see the sketch after this list).
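
The same arithmetic can be written as a short Python sketch. The function names below are illustrative rather than from any chemistry library, and the example assumes a 1:1 acid-base mole ratio and a burette starting at 0.00 mL; the numbers are taken from the practice problem that follows.

```python
# Minimal sketch of the titration calculation (illustrative helper names).
# Assumes a 1:1 acid-base mole ratio, e.g. HCl + NaOH -> NaCl + H2O.

def moles_dispensed(titrant_molarity, initial_mL, final_mL):
    """Moles of titrant delivered, from the burette readings."""
    volume_L = (final_mL - initial_mL) / 1000.0  # convert mL to L
    return titrant_molarity * volume_L

def analyte_molarity(moles_titrant, analyte_volume_mL, mole_ratio=1.0):
    """Concentration of the analyte in mol/L."""
    moles_analyte = moles_titrant * mole_ratio  # 1:1 for a monoprotic acid
    return moles_analyte / (analyte_volume_mL / 1000.0)

# Example with the numbers used below; burette assumed to start at 0.00 mL.
n_naoh = moles_dispensed(0.100, initial_mL=0.00, final_mL=15.5)
print(f"{n_naoh:.5f} mol NaOH")                        # 0.00155 mol NaOH
print(f"{analyte_molarity(n_naoh, 10.0):.3f} M acid")  # 0.155 M acid
```
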
✏️ Two-Part Practice: Acid Analysis
Scenario: You are analyzing a 10.0 mL sample of an unknown acid ($H^+$) using a 0.100 M NaOH solution as your titrant.

Part A: Finding Moles of Titrant
The titration takes 15.5 mL of NaOH to reach the color change (endpoint). How many moles of NaOH were used?

Moles = Molarity × Volume (L)
0.100 mol/L × 0.0155 L = 0.00155 moles NaOH
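
As a quick check, the same Part A arithmetic in Python (variable names are just illustrative):

```python
# Part A: moles of NaOH dispensed at the endpoint.
molarity_naoh = 0.100          # mol/L, known titrant concentration
volume_naoh_L = 15.5 / 1000.0  # 15.5 mL converted to liters
moles_naoh = molarity_naoh * volume_naoh_L
print(f"{moles_naoh:.5f} mol NaOH")  # 0.00155 mol NaOH
```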


Part B: Calculating Unknown Molarity
Based on the 1:1 reaction ratio, there must also be 0.00155 moles of acid in the original 10.0 mL sample. What is the molarity of the acid?

Molarity = moles / Liters
0.00155 mol / 0.010 L = 0.155 M Acid
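
And the corresponding check for Part B, assuming the 1:1 mole ratio stated above (variable names are illustrative):

```python
# Part B: molarity of the unknown acid, using the 1:1 mole ratio.
moles_acid = 0.00155           # mol, equal to the moles of NaOH at the endpoint
volume_acid_L = 10.0 / 1000.0  # 10.0 mL sample converted to liters
molarity_acid = moles_acid / volume_acid_L
print(f"{molarity_acid:.3f} M acid")  # 0.155 M
```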