Friday, March 26, 2021

Measurement of Acidity in Water and Wastewater

Understanding Acidity in Water: A Practical Lab Guide to Accurate Measurement

Water quality sits at the heart of environmental health, public safety, and countless industrial processes. While parameters like pH and turbidity often steal the spotlight, acidity is another critical factor that deserves attention. Elevated acidity can corrode pipelines, disrupt aquatic ecosystems, and compromise drinking water safety.

In this post, we’ll walk through a standard laboratory method for measuring acidity in water and wastewater samples. This isn’t just textbook chemistry—it’s a practical, widely used approach that helps protect water resources and ensures regulatory compliance.


Why Measuring Acidity Matters

Acidity testing determines the concentration of acidic substances present in a water sample. These acids may exist as free hydrogen ions or as compounds that release hydrogen ions when dissolved or hydrolyzed.

By neutralizing these acidic components with a standard alkaline solution, laboratories can quantify acidity and express it as milligrams per liter (mg/L) of calcium carbonate (CaCO₃). This standardized unit allows results to be compared across different samples, locations, and regulatory frameworks.

Acidity measurements are essential for:

  • Monitoring environmental pollution
  • Designing and evaluating wastewater treatment processes
  • Ensuring compliance with environmental and industrial discharge standards

Roles and Responsibilities in the Laboratory

Accurate acidity measurement is a team effort. In a typical laboratory setup:

  • Laboratory Chemist: Conducts the titration and records observations.
  • Technical Manager: Reviews data for accuracy, consistency, and technical validity.
  • Quality Manager: Ensures procedures follow approved standards and quality protocols.

Each role contributes to producing reliable, defensible results.


The Science Behind Acidity Testing

Acidity is determined using acid–base titration, one of the most fundamental techniques in analytical chemistry. The acidic components in the sample react with a standardized base—commonly sodium hydroxide (NaOH).

As the base is added, it neutralizes the acid. The endpoint of the reaction is detected using a color indicator, signaling that all acidic components have been neutralized.


Equipment and Glassware Required

This method relies on standard laboratory glassware:

  • Conical (Erlenmeyer) flasks
  • Burette for accurate titrant delivery
  • Pipettes or measuring cylinders for sample handling

No advanced instrumentation is required, making this method accessible and cost-effective.


Reagent Preparation

High-quality reagents are critical for accurate results. Proper preparation and storage are essential.

1. 0.02 N Sodium Hydroxide (NaOH)

  • Dilute 200 mL of 0.1 N NaOH to 1 liter with distilled water.
  • This solution serves as the titrant.

2. Phenolphthalein Indicator

  • Dissolve 80 mg of phenolphthalein in 100 mL of 95% ethanol.
  • Isopropyl alcohol or methanol may be used as alternatives.
  • The indicator turns pink under basic conditions, marking the titration endpoint.

3. 0.05 N Potassium Hydrogen Phthalate (KHP)

  • Dry approximately 15 g of KHP at 120°C for two hours.
  • Cool in a desiccator.
  • Accurately weigh about 10 g (10.21 g gives exactly 0.05 N), record the exact mass, and dilute to 1 liter with distilled water; a sketch for converting the weighed mass to normality follows this list.
  • This primary standard is used to standardize the NaOH solution.
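
Since KHP is a primary standard, the working normality should come from the mass actually weighed rather than the nominal 10 g. Here is a minimal Python sketch of that conversion (the function name is illustrative; the constant is KHP's molar mass, which equals its equivalent weight because KHP is monoprotic):

    # Equivalent weight of KHP: 204.22 g/mol, with one acidic proton per molecule.
    KHP_EQ_WT = 204.22  # g per equivalent

    def khp_normality(mass_g, volume_L=1.0):
        """Normality (eq/L) of a KHP solution from the mass actually weighed."""
        return mass_g / (KHP_EQ_WT * volume_L)

    print(round(khp_normality(10.21), 4))  # 0.05 N for 10.21 g made up to 1 L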

Step-by-Step Acidity Testing Procedure

  1. Sample Preparation
    Measure 50–100 mL of the water or wastewater sample into a clean conical flask. The volume may be adjusted depending on the expected acidity.

  2. Indicator Addition
    Add 2–3 drops of phenolphthalein indicator to the sample.

  3. Titration
    Slowly titrate with 0.02 N NaOH from the burette while gently swirling the flask.

  4. Endpoint Detection
    Continue titration until a faint pink color appears and persists for at least 30 seconds.

  5. Recording Results
    Record the burette reading corresponding to the volume of NaOH used.

  6. Standardization
    Standardize the NaOH solution by titrating 10 mL of the 0.05 N KHP solution using the same procedure.


Calculation of Acidity

Acidity is calculated using the following formula:

Acidity (mg/L as CaCO₃) = (A × N × 50 × 1000) / V

Where:

  • A = Volume of NaOH used (mL)
  • N = Normality of NaOH
  • V = Volume of sample (mL)
  • 50 = Equivalent weight of CaCO₃ (mg per milliequivalent)
  • 1000 = Converts the per-milliliter result to a per-liter basis

This calculation converts titration data into a standardized and meaningful result.
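
To make the arithmetic concrete, here is a minimal Python sketch combining the standardization step (step 6 above, a simple N₁V₁ = N₂V₂ relation) with the acidity formula. The function names and example burette readings are illustrative assumptions, not values prescribed by the method:

    def naoh_normality(n_khp, v_khp_mL, v_naoh_mL):
        """Standardization: N(NaOH) = N(KHP) x V(KHP) / V(NaOH)."""
        return n_khp * v_khp_mL / v_naoh_mL

    def acidity_mg_per_L(a_mL, n_naoh, v_sample_mL):
        """Acidity (mg/L as CaCO3) = (A x N x 50 x 1000) / V."""
        return a_mL * n_naoh * 50 * 1000 / v_sample_mL

    # Example: 10 mL of 0.05 N KHP consumed 24.8 mL of NaOH, and a 100 mL
    # sample then consumed 6.5 mL of that standardized titrant.
    n = naoh_normality(0.05, 10.0, 24.8)              # about 0.0202 N
    print(round(acidity_mg_per_L(6.5, n, 100.0), 1))  # about 65.5 mg/L as CaCO3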


Practical Tips and Common Pitfalls

  • Rinse all glassware with distilled water before use.
  • Always standardize the NaOH solution before analysis.
  • Highly colored or turbid samples may require modified techniques or indicators.
  • Wear appropriate personal protective equipment (gloves and safety goggles).
  • Store reagents properly to avoid degradation or contamination.

Final Thoughts

Acidity testing is a simple yet powerful tool in water and wastewater analysis. By following this standardized titration method, laboratories can generate accurate, reliable data that support environmental protection, public health, and industrial compliance.

Whether you’re a student learning analytical chemistry or a professional working in environmental monitoring, mastering acidity measurement is a valuable skill that directly contributes to safer and more sustainable water systems.

Have questions or hands-on experiences to share? Join the conversation and stay tuned for more practical lab insights!


Sunday, March 21, 2021

Biological Oxygen Demand (BOD) Testing

 

Biological Oxygen Demand (BOD): A Complete Laboratory Guide

Biological Oxygen Demand (BOD) is one of the most widely used parameters for evaluating organic pollution in water and wastewater. It reflects the amount of oxygen required by microorganisms to biologically decompose organic matter under aerobic conditions. This post presents a clear, laboratory-oriented walkthrough of the BOD test procedure, including reagent preparation, dilution techniques, incubation, and calculation.



Purpose of BOD Analysis

The purpose of this procedure is to describe the laboratory method for measuring Biological Oxygen Demand (BOD) in water and wastewater samples.


Scope

This method is applicable to environmental laboratories involved in the analysis of:

  • Surface water
  • Groundwater
  • Treated and untreated wastewater
  • Industrial effluents

where BOD determination is required for monitoring, treatment efficiency, or regulatory compliance.


Roles and Responsibilities

  • Laboratory Chemist: Performs sample preparation, dilution, incubation, DO measurement, and calculation of BOD.
  • Technical Manager: Reviews analytical activities and validates results.
  • Quality Manager: Ensures SOP implementation and adherence to quality requirements.

Principle of the BOD Test

Biological Oxygen Demand is defined as the quantity of dissolved oxygen consumed by microorganisms while stabilizing biodegradable organic matter present in a water or wastewater sample under aerobic conditions. The reduction in dissolved oxygen over a fixed incubation period reflects the BOD of the sample.


Instruments and Equipment

  • BOD bottles (300 mL capacity)
  • BOD incubator maintained at 27 ± 1°C
  • Measuring cylinders and volumetric flasks
  • DO titration setup (as per the Winkler method)

Reagents and Their Preparation

1. Phosphate Buffer Solution

Reagents required:

  • Potassium dihydrogen phosphate (KH₂PO₄): 8.5 g
  • Dipotassium hydrogen phosphate (K₂HPO₄): 21.75 g
  • Disodium hydrogen phosphate heptahydrate (Na₂HPO₄·7H₂O): 33.4 g
  • Ammonium chloride (NH₄Cl): 1.7 g
  • Distilled water: up to 1000 mL

Preparation: Dissolve all salts in distilled water, make up to 1000 mL, and adjust the pH to 7.2.


2. Magnesium Sulphate Solution

Reagents required:

  • Magnesium sulphate heptahydrate (MgSO₄·7H₂O): 22.5 g
  • Distilled water: up to 1000 mL

Preparation: Dissolve MgSO₄·7H₂O in distilled water and dilute to 1000 mL.


3. Calcium Chloride Solution

Reagents required:

  • Calcium chloride (CaCl₂): 27.5 g
  • Distilled water: up to 1000 mL

Preparation: Dissolve CaCl₂ in distilled water and dilute to 1000 mL.


4. Ferric Chloride Solution

Reagents required:

  • Ferric chloride hexahydrate (FeCl₃·6H₂O): 0.25 g
  • Distilled water: up to 1000 mL

Preparation: Dissolve FeCl₃·6H₂O in distilled water and dilute to 1000 mL.


5. Sodium Thiosulfate Solution (0.025 N)

Reagents required:

  • Sodium thiosulfate pentahydrate (Na₂S₂O₃·5H₂O): 6.205 g
  • Distilled water: up to 1000 mL

Preparation: Dissolve sodium thiosulfate in distilled water and dilute to 1000 mL.


Test Method

Preparation of Dilution Water

  1. Aerate the required quantity of distilled water by bubbling compressed air for 1–2 days to achieve DO saturation.
  2. Add 1 mL each of phosphate buffer, magnesium sulphate, calcium chloride, and ferric chloride solutions per liter of dilution water.
  3. Mix thoroughly.
  4. For samples lacking sufficient microbial population, add seed—generally 2 mL of settled sewage per liter of dilution water.



Sample Preparation and Pretreatment

  • Adjust sample pH to approximately 7.0 if highly acidic or alkaline.
  • Ensure the sample is free from residual chlorine. If chlorine is present, remove it using sodium thiosulfate.

Removal of Residual Chlorine

  1. Take 50 mL of sample and acidify with 10 mL of 1+1 acetic acid.
  2. Add approximately 1 g potassium iodide (KI).
  3. Titrate with sodium thiosulfate using starch as an indicator.
  4. Calculate the amount of sodium thiosulfate required per milliliter of sample and dose the BOD sample accordingly; a minimal dosing sketch follows this list.
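
The dose found for the 50 mL test aliquot scales linearly to the portion of sample taken for BOD. A hypothetical dosing helper (the name and example values are illustrative):

    def thiosulfate_dose_mL(titrant_mL, aliquot_mL, sample_mL):
        """Thiosulfate volume needed to dechlorinate sample_mL of sample."""
        return titrant_mL / aliquot_mL * sample_mL

    # Example: the 50 mL aliquot consumed 1.2 mL, so a 300 mL BOD portion
    # needs about 7.2 mL of the same thiosulfate solution.
    print(thiosulfate_dose_mL(1.2, 50.0, 300.0))
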
Note: If the sample has unusually high DO (above 9 mg/L), reduce it by gentle aeration or agitation.

Sample Dilution

Prepare multiple dilutions to achieve:

  • At least 2 mg/L DO depletion
  • Residual DO not less than 1 mg/L after incubation
  • Approximately 50% DO depletion

Dilution is prepared by siphoning seeded dilution water, adding the required volume of sample, and making up to volume with dilution water.

Suggested Dilutions and BOD Ranges

% Dilution        Expected BOD Range (mg/L)
0.01              20,000 – 70,000
0.02              10,000 – 35,000
0.05              4,000 – 14,000
0.1               2,000 – 7,000
0.2               1,000 – 3,500
0.5               400 – 1,900
1                 200 – 700
2                 100 – 350
5                 40 – 140
10                20 – 70
20                10 – 35
50                Up to 14
100               1 – 7
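
When choosing dilutions, the table can also be encoded for a quick sanity check. The helper below is a hypothetical sketch, not part of the standard method; it simply returns the dilutions whose range brackets an estimated BOD:

    # The table above as (percent dilution, low, high); open-ended rows omitted.
    DILUTION_TABLE = [
        (0.01, 20000, 70000), (0.02, 10000, 35000), (0.05, 4000, 14000),
        (0.1, 2000, 7000), (0.2, 1000, 3500), (0.5, 400, 1900),
        (1, 200, 700), (2, 100, 350), (5, 40, 140),
        (10, 20, 70), (20, 10, 35),
    ]

    def candidate_dilutions(expected_bod):
        """Percent dilutions whose expected-BOD range brackets the estimate."""
        return [p for p, low, high in DILUTION_TABLE if low <= expected_bod <= high]

    print(candidate_dilutions(500))  # [0.5, 1] for an expected BOD near 500 mg/L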

Incubation and DO Measurement

  • Fill labeled BOD bottles with prepared dilutions and stopper immediately.
  • Measure initial DO (D₀) in one bottle.
  • Incubate three bottles at 27°C for 3 days with a proper water seal.
  • Prepare blank bottles using dilution water only.
  • Determine DO for samples and blanks on day 0 and after 3 days using the Winkler method.

Calculation of BOD

Let:

  • D₀ = DO of sample on day 0 (mg/L)
  • D₁ = DO of sample after 3 days (mg/L)
  • C₀ = DO of blank on day 0 (mg/L)
  • C₁ = DO of blank after 3 days (mg/L)

BOD (mg/L) = [(D₀ − D₁) − (C₀ − C₁)] / P

where P is the decimal fraction of sample used in the dilution (e.g., P = 0.05 for a 5% dilution).

If the sample is seeded, determine the BOD contribution of the seed separately and apply the appropriate correction.
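
A minimal sketch of this calculation, with the seed correction exposed as an optional argument (variable names mirror the definitions above; the example readings are illustrative):

    def bod_mg_per_L(d0, d1, c0, c1, p, seed_correction=0.0):
        """BOD = [(D0 - D1) - (C0 - C1) - seed correction] / P."""
        return ((d0 - d1) - (c0 - c1) - seed_correction) / p

    # Example: 5% dilution (P = 0.05); sample DO falls from 8.4 to 4.1 mg/L
    # while the blank falls from 8.5 to 8.3 mg/L, giving a BOD of ~82 mg/L.
    print(bod_mg_per_L(8.4, 4.1, 8.5, 8.3, 0.05))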


Final Remarks

BOD testing provides critical insight into the biodegradable organic load of water and wastewater. Accurate dilution, proper incubation, and careful DO measurement are essential for reliable results. When followed correctly, this method remains a cornerstone of environmental water quality assessment.

🌱 Healthy microbes tell the true story of water quality.


Monday, March 15, 2021

Iron (Fe) Analysis

Mastering Iron Detection in Water: The Phenanthroline Colorimetric Method

Iron is a common element in water sources and can cause aesthetic and operational issues, including rusty stains and metallic taste. Accurate measurement of iron in water and wastewater is essential for environmental monitoring, industrial processes, and safe water supply. The Phenanthroline colorimetric method is a reliable, sensitive, and easy-to-use technique for quantifying iron concentrations.


Why Measure Iron?

High iron levels in water are more than just cosmetic:

  • Cause orange-brown stains on plumbing, laundry, and utensils
  • Affect taste and color in food and beverages
  • Influence industrial processes and equipment longevity
  • Impact environmental quality for agriculture and ecosystems

Monitoring iron ensures compliance with regulatory limits and helps maintain water quality.


Principle of the Phenanthroline Method

This method measures iron by forming a colored complex:

  1. Iron in the sample is converted to the ferrous form (Fe²⁺) using acid and a reducing agent like hydroxylamine.
  2. At pH 3.2–3.3, 1,10-phenanthroline binds Fe²⁺, forming an orange-red tris-phenanthroline complex.
  3. The color intensity is proportional to iron concentration (Beer's Law) and can be measured spectrophotometrically at 510 nm.

The method is robust: the complex is stable across pH 3–9, and color develops rapidly between pH 2.9 and 3.5.


Roles in the Lab

  • Lab Chemist: Handles sample preparation, treatment, and measurement
  • Technical Manager: Reviews procedures and ensures accuracy
  • Quality Manager: Oversees SOP compliance and quality assurance

This method applies to water and wastewater samples where precise iron measurement is required.


Equipment Needed

  • Spectrophotometer (set at 510 nm)
  • Conical flasks, volumetric flasks, pipettes
  • Glass beads (for boiling)

Reagents

  • Hydroxylamine Solution: 10 g hydroxylamine hydrochloride in 100 mL distilled water
  • Ammonium Acetate Buffer: 250 g ammonium acetate in 150 mL water + 700 mL glacial acetic acid
  • Sodium Acetate Solution: 200 g in 800 mL water
  • Phenanthroline Solution: 100 mg 1,10-phenanthroline monohydrate in 100 mL water at 80°C
  • 0.1 N Potassium Permanganate: 0.316 g KMnO₄ in 100 mL water
  • Stock Iron Solution: 1.404 g ferrous ammonium sulfate + 20 mL concentrated H₂SO₄ in 1 L water
  • Standard Iron Solutions: Dilutions from stock solution (e.g., 10 µg/mL or 1 µg/mL)

Safety: Handle acids and reagents with gloves and fume hood protection.


Procedure

  1. Sample Preparation: Filter cloudy samples into clean flasks.
  2. Calibration Curve: Prepare iron standards (1–5 mg/L Fe). Zero the spectrophotometer with a blank, then measure the absorbance of each standard.
  3. Sample Treatment: Pipette 50 mL of sample into a conical flask. Add 2 mL HCl, 1 mL hydroxylamine solution, and a few glass beads. Boil until the volume is reduced to 15–20 mL.
  4. Cooling and Color Development: Cool, transfer to a 100 mL volumetric flask, add 10 mL ammonium acetate buffer and 4 mL phenanthroline solution, dilute to 100 mL with water, and wait 10 minutes.
  5. Measurement: Measure absorbance at 510 nm. Dilute samples if readings exceed the standard range.
  6. Blank Check: Run a blank with distilled water instead of sample.

Calculation

Fe (mg/L) = (µg Fe in 100 mL final solution) ÷ mL of sample

This formula gives the iron concentration directly in the sample.
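
As a worked sketch of the calibration arithmetic, the snippet below fits a straight line (Beer's law) to illustrative standard readings and converts a sample absorbance back to mg/L. The absorbance values and function name are assumed for demonstration, not measured data:

    import numpy as np

    std_conc = np.array([1.0, 2.0, 3.0, 4.0, 5.0])      # mg/L Fe standards
    std_abs = np.array([0.20, 0.40, 0.61, 0.79, 1.01])  # absorbance at 510 nm

    slope, intercept = np.polyfit(std_conc, std_abs, 1)  # linear calibration fit

    def fe_mg_per_L(absorbance, sample_mL, final_mL=100.0):
        """Fe in the original sample: curve reading scaled for the dilution."""
        conc_final = (absorbance - intercept) / slope  # mg/L in the 100 mL flask
        return conc_final * final_mL / sample_mL

    # Example: absorbance 0.45 from 50 mL of sample made up to 100 mL -> ~4.5 mg/L
    print(round(fe_mg_per_L(0.45, sample_mL=50.0), 2))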


Why Use This Method?

The Phenanthroline method is preferred because it is:

  • Sensitive and accurate
  • Easy to perform with minimal equipment
  • Applicable to a wide range of water and wastewater samples
  • Compliant with recognized standard methods (APHA, ISO)

Monitoring iron ensures safe water, prevents corrosion, and improves aesthetic quality. High levels can be treated using filtration or oxidation methods.

Pro Tip: Always follow lab safety protocols and verify your results against official standards.


Mastering this method allows scientists, students, and environmental technicians to measure iron accurately and maintain water quality effectively.


