Saturday, May 8, 2021

Measurement of Nitrite in Water and Wastewater

How to Measure Nitrite in Water and Wastewater: Methods, Health Risks, and Tips

Meta Description: Learn how to accurately measure nitrite in water and wastewater using the Griess method. Understand its environmental impact, health risks, and practical testing tips.


Introduction

Nitrite (NO₂⁻) might be a minor component in water chemistry, but it plays a critical role in water safety, environmental monitoring, and wastewater treatment. Elevated nitrite levels can be toxic to humans and aquatic life and may indicate issues in water treatment processes.

Measuring nitrite accurately is essential for public health, regulatory compliance, and ecosystem protection. This article explains how nitrite forms, why it matters, and the most effective methods for testing it in water and wastewater.


What is Nitrite and How Does it Form?

Nitrite is an intermediate in the nitrogen cycle, formed during the microbial conversion of ammonia to nitrate:

Ammonia → Nitrite → Nitrate

In water and wastewater:

  • Produced by ammonia-oxidizing bacteria during nitrification
  • Usually short-lived in healthy systems
  • Accumulation signals biological imbalance or incomplete treatment

Monitoring nitrite levels helps detect potential treatment failures and prevent environmental contamination.


Why Measuring Nitrite is Important

Nitrite is not just another chemical—it’s an indicator of water quality and safety. Key reasons to measure nitrite include:

  • Health Risks: Can cause methemoglobinemia (“blue baby syndrome”) in infants by interfering with oxygen transport in blood.
  • Aquatic Toxicity: Toxic to fish and other aquatic organisms, even at low concentrations.
  • Wastewater Monitoring: Indicates incomplete nitrification or oxygen deficiencies in treatment systems.
  • Regulatory Compliance: Drinking water standards limit nitrite (the US EPA MCL is 1 mg/L as nitrite‑nitrogen; the WHO guideline value is 3 mg/L as NO₂⁻).

Accurate nitrite measurement ensures safe water, efficient treatment, and environmental protection.


How to Measure Nitrite in Water and Wastewater

Several analytical techniques exist, but the colorimetric Griess method is the most widely used due to its accuracy, simplicity, and cost-effectiveness.

The Colorimetric (Griess) Method

Principle: Nitrite reacts with sulfanilamide in acidic conditions to form a diazonium salt. This intermediate reacts with N-(1-naphthyl)ethylenediamine (NED), producing a pink/red azo dye. The intensity of the color is directly proportional to the nitrite concentration.



Equipment & Reagents Needed:

  • Spectrophotometer (wavelength: 543 nm)
  • Sulfanilamide reagent
  • NED reagent
  • Phosphoric acid
  • Sodium nitrite (for standard solutions)

Detection Range: 0.001–1.0 mg/L


Step-by-Step Procedure

  1. Take a measured sample of water or wastewater.
  2. Add sulfanilamide reagent and mix thoroughly.
  3. Add NED reagent and allow ~10 minutes for color development.
  4. Measure absorbance using a spectrophotometer.
  5. Determine nitrite concentration using a calibration curve.

Conversion to Nitrogen:

mg/L as N = mg/L as NO₂⁻ × (14 / 46)
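
To make the arithmetic concrete, here is a minimal Python sketch that converts an absorbance reading into a nitrite concentration using a hypothetical linear calibration (the slope and intercept below are placeholders, not method constants) and then applies the NO₂⁻-to-N conversion:

```python
# Minimal sketch: Griess-method nitrite calculation (hypothetical calibration)

def nitrite_from_absorbance(absorbance, slope=0.85, intercept=0.002):
    """Convert absorbance at 543 nm to mg/L NO2- via a linear calibration.
    slope and intercept are placeholders; use your own calibration curve."""
    return (absorbance - intercept) / slope

def no2_as_nitrogen(no2_mg_per_l):
    """Express nitrite as nitrogen: multiply by 14/46 (molar masses of N and NO2-)."""
    return no2_mg_per_l * 14.0 / 46.0

no2 = nitrite_from_absorbance(0.425)  # mg/L as NO2-
print(f"NO2-: {no2:.3f} mg/L, as N: {no2_as_nitrogen(no2):.3f} mg/L")
```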

Alternative Nitrite Testing Methods

Other methods are available for specialized applications:

  • Ion Chromatography: High precision, simultaneous detection of multiple ions.
  • Electrochemical Sensors: Real-time monitoring in treatment plants.
  • UV Spectrophotometry: Suitable for clear water but prone to interference.

Applications of Nitrite Measurement

Nitrite testing is critical in many areas:

  • Drinking Water Safety: Ensures regulatory compliance and safe consumption.
  • Wastewater Treatment: Helps optimize nitrification and denitrification processes.
  • Environmental Monitoring: Detects nitrogen pollution in rivers, lakes, and groundwater.
  • Research: Provides insight into microbial nitrogen cycling.

Tips for Accurate Nitrite Measurement

  • Neutralize residual chlorine before analysis.
  • Filter turbid or colored samples to reduce interference.
  • Use freshly prepared reagents and calibration standards.
  • Regularly calibrate your spectrophotometer.
  • Follow standard methods (APHA, ISO, EPA) for reliable results.

FAQs About Nitrite in Water

Q1: What is a safe level of nitrite in drinking water?

  • The US EPA MCL is 1 mg/L as nitrite‑nitrogen; the WHO guideline value is 3 mg/L as NO₂⁻.

Q2: Can nitrite turn into nitrate in water?

  • Yes, nitrite is oxidized to nitrate by nitrite-oxidizing bacteria in natural and treatment systems.

Q3: How fast does the Griess method work?

  • Color develops in about 10 minutes, making it ideal for routine laboratory testing.

Q4: Does nitrite affect aquatic life?

  • Even low concentrations can be toxic, especially to fish and sensitive invertebrates.

Conclusion

Measuring nitrite in water and wastewater is essential for public health, environmental safety, and wastewater management. The Griess colorimetric method is the most widely used technique due to its reliability, sensitivity, and ease of use. Accurate nitrite monitoring helps detect water treatment issues, prevent ecological damage, and ensure safe drinking water for communities worldwide.


Friday, May 7, 2021

Measurement of Silica by the Molybdosilicate Method in Water and Wastewater Samples


Silica Analysis in Water and Wastewater: APHA Method

Silica (SiO₂) is a natural component of water, originating from the weathering of silicate minerals in rocks, soil, and sand. While generally not harmful to human health, high silica concentrations can cause scaling in boilers, fouling of membranes, and operational inefficiencies in water treatment systems. Reliable measurement of silica is essential for industrial water systems, wastewater reuse, and reverse osmosis (RO) processes.

The APHA molybdate blue method is a standardized and widely used procedure for silica determination, particularly for reactive silica.


Forms of Silica in Water

  • Reactive (Dissolved) Silica

    • Mainly monosilicic acid (H₄SiO₄)
    • Directly measurable by APHA methods
  • Polymeric or Colloidal Silica

    • Forms from condensation of dissolved silica
    • Reacts slowly and may require digestion
  • Particulate Silica

    • Suspended solids (sand, silt, clay)
    • Usually removed before analysis

Importance of Silica Analysis

Monitoring silica is critical for:

  • Preventing scale formation in boilers, cooling towers, and heat exchangers
  • Protecting RO membranes and industrial equipment
  • Optimizing demineralization and water reuse processes
  • Ensuring efficiency in wastewater treatment and zero liquid discharge systems
  • Extending equipment lifespan and reducing maintenance costs

Typical Silica Concentrations

Water Source                       Silica Concentration (mg/L as SiO₂)
Surface water                      1–30
Groundwater                        10–100
Industrial wastewater              Highly variable
RO permeate / high-purity water    < 1

APHA Method for Silica Determination

The Molybdate Blue Colorimetric Method measures reactive silica:

  • Silica reacts with ammonium molybdate in acidic conditions to form silicomolybdic acid
  • Reduction of this complex produces a blue color
  • The intensity of the blue color is proportional to silica concentration and is measured spectrophotometrically at ~815 nm

This method is widely used due to its simplicity, sensitivity, and cost-effectiveness.




Laboratory SOP (APHA Method)

Purpose

To determine reactive silica in water and wastewater samples using a standardized colorimetric procedure.

Scope

Applicable to drinking water, surface water, groundwater, industrial wastewater, and RO permeate.

Apparatus and Equipment

  • Spectrophotometer or colorimeter (~815 nm)
  • Plastic or polyethylene sample bottles
  • Volumetric flasks, pipettes, and test tubes

Reagents

  • Ammonium molybdate reagent
  • Acid reagent (e.g., sulfuric acid)
  • Reducing reagent (e.g., ascorbic acid solution)
  • Silica stock solution
  • Deionized, silica-free water

Sample Collection

  • Collect in plastic bottles (avoid glass)
  • Filter samples if particulate silica is not required
  • Analyze promptly at room temperature

Calibration Procedure

  1. Prepare silica standards (0, 5, 10, 20, 30, 50 mg/L as SiO₂)
  2. Add reagents to standards and blanks under identical conditions
  3. Allow color development according to the method
  4. Measure absorbance at ~815 nm against a reagent blank
  5. Plot absorbance vs. silica concentration to create the calibration curve

Calculating the Slope (Calibration Factor)

  • Use the equation of the line from the calibration curve: Absorbance = m × [SiO₂] + c
  • m is the slope, which represents the change in absorbance per unit concentration
  • c is the y-intercept (blank absorbance)

For example, if the calibration curve equation is:

Absorbance = 0.0085 × [SiO₂] + 0.005

Then the slope m = 0.0085 Abs/mg/L, which is used in sample calculations.
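
As an illustration, the short sketch below derives the slope by least-squares regression; the absorbance readings are invented for demonstration and in practice would come from your own standards:

```python
# Minimal sketch: least-squares calibration for the silica curve
# (absorbance values below are hypothetical, not reference data)
import numpy as np

conc = np.array([0, 5, 10, 20, 30, 50])                 # mg/L as SiO2
absorbance = np.array([0.005, 0.047, 0.090, 0.175, 0.260, 0.430])

m, c = np.polyfit(conc, absorbance, 1)                   # slope, intercept
print(f"slope m = {m:.4f} Abs per mg/L, intercept c = {c:.4f}")

# Solve a sample: [SiO2] = (Abs - c) / m
sample_abs = 0.420
print(f"[SiO2] = {(sample_abs - c) / m:.1f} mg/L")
```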



Sample Analysis Procedure

  1. Pipette a measured volume of sample into a clean reaction vessel
  2. Add ammonium molybdate reagent under acidic conditions and mix
  3. Add reducing reagent and allow full color development
  4. Measure absorbance against the reagent blank
  5. Determine silica concentration using the calibration curve and slope

Quality Control Measures

  • Include a reagent blank in each batch
  • Analyze duplicate samples to assess precision
  • Verify calibration with a mid-range standard
  • Recalibrate when instrument drift or new reagent batches occur

Example Calculation

Given:

  • Sample absorbance = 0.420
  • Calibration curve equation: Absorbance = 0.0085 × [SiO₂] + 0.005

Step 1: Solve for SiO₂ concentration

[SiO₂] = (0.420 − 0.005) ÷ 0.0085 ≈ 48.8 mg/L

Step 2: Apply dilution factor if used

Final Result: Reactive silica concentration = 48.8 mg/L as SiO₂


Reporting Guidelines

  • Report results in mg/L SiO₂
  • Specify that the analysis measures reactive silica
  • Include details of any filtration or dilution performed

Conclusion

The APHA molybdate blue method provides a reliable, sensitive, and standardized approach for reactive silica measurement in water and wastewater. Accurate silica analysis is essential for preventing scaling, protecting membranes, optimizing treatment processes, and ensuring sustainable water management. Using the calibration curve slope in calculations ensures consistent and reproducible results, which are critical for both industrial and municipal water systems.

Wednesday, April 28, 2021

Measuring Sulfur Dioxide (SO₂) in Ambient Air

Measuring Sulfur Dioxide (SO₂) in Ambient Air: A Practical Laboratory Guide.

Monitoring sulfur dioxide (SO₂) in ambient air is a key component of air‑quality assessment and public health protection. SO₂ is a major atmospheric pollutant generated primarily from fossil‑fuel combustion, power plants, refineries, and other industrial activities. Prolonged exposure can harm human health, damage vegetation, and contribute to acid rain formation.



This blog presents a practical, laboratory‑based guide to measuring ambient SO₂ using the widely accepted para‑rosaniline colorimetric method, explaining the principle, reagents, procedures, and calculations in a clear and user‑friendly manner.



Why Measure Ambient SO₂?

Accurate measurement of sulfur dioxide is essential because:

  • SO₂ irritates the respiratory system and aggravates asthma
  • It damages crops, forests, and building materials
  • It contributes to acid rain and secondary particulate formation
  • Regulatory agencies require routine monitoring for compliance

Reliable laboratory analysis supports environmental decision‑making and pollution‑control strategies.



Principle of the Method

Ambient air is drawn through an absorbing solution of potassium tetrachloromercurate (TCM). Sulfur dioxide reacts with TCM to form a stable dichlorosulphitomercurate complex, which is resistant to oxidation by oxygen, ozone, and nitrogen oxides. This stability allows samples to be stored prior to analysis without significant SO₂ loss.

For analysis, the complex reacts with para rosaniline in the presence of formaldehyde, forming a colored compound. The color intensity is directly proportional to the amount of SO₂ present and is measured spectrophotometrically at 560 nm.


Roles and Responsibilities

  • Laboratory Chemist: Conducts sampling, analysis, and calculations
  • Technical Manager: Reviews analytical procedures and results
  • Quality Manager: Ensures SOP implementation and quality control

Key Reagents Used

The following reagents are critical for accurate SO₂ analysis:

  • Distilled water (free from oxidizing agents)
  • Potassium tetrachloromercurate (0.04 M) – absorbing solution
  • Sulphamic acid (0.6%) – removes nitrogen oxide interference
  • Formaldehyde (0.2%) – supports color development
  • Para rosaniline dye – produces measurable color
  • Iodine and sodium thiosulphate solutions – for standardization
  • Standard sulphite solution – used to prepare calibration standards

All reagents must be freshly prepared or stored under specified conditions to maintain analytical accuracy.


Preparation of Standards and Calibration

Standard Sulphite Solution

A sulphite solution is prepared using sodium sulphite or sodium metabisulphite and standardized by iodine–thiosulphate titration. This step determines the exact SO₂ concentration in the standard solution.

Working Sulphite–TCM Solution

A measured volume of the standardized sulphite solution is diluted and mixed with TCM. This working solution is stable for up to 30 days when stored under refrigeration and is used for preparing calibration standards.

Calibration Curve

Different volumes of the working sulphite–TCM solution are added to volumetric flasks to prepare standards containing known amounts of SO₂. After reagent addition and color development, absorbance is measured at 560 nm.

A straight‑line plot of absorbance versus SO₂ mass (µg) confirms proper calibration. The slope of this line is used to calculate the calibration factor (B).


Sample Analysis Procedure

  1. Prepare a reagent blank, control, and sample solutions
  2. Add sulphamic acid to remove nitrite interference
  3. Add formaldehyde followed by para rosaniline
  4. Allow color to develop for 30 minutes
  5. Measure absorbance between 30–60 minutes at 560 nm using a 1 cm cuvette
  6. Use distilled water, not the reagent blank, as the spectrophotometer reference

Strict temperature control is essential, as color intensity is temperature‑dependent.


Handling High Absorbance Samples

  • If absorbance lies between 1.0 and 2.0, dilute the sample 1:1 with reagent blank
  • Highly concentrated samples may require dilution up to six times
  • Always apply the correct dilution factor (D) during calculations

Calculations

SO₂ Concentration in Air

SO₂ concentration is calculated using:

SO₂ (µg/m³) = (SA × B × D) ÷ V₁

Where:

  • SA = Sample absorbance
  • B = Calibration factor
  • D = Dilution factor
  • V₁ = Volume of air sampled at STP (m³)

Conversion to ppm

The calculated mass concentration can be converted to ppm using standard gas‑law relationships.
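
A minimal sketch of both calculations, assuming 25 °C and 1 atm (molar volume ≈ 24.45 L/mol) and a molecular weight of 64.07 g/mol for SO₂; all input values are illustrative:

```python
# Minimal sketch: SO2 concentration and ppm conversion (illustrative inputs)

def so2_ug_per_m3(sample_abs, calib_factor_B, dilution_D, air_volume_m3):
    """SO2 (ug/m3) = (SA x B x D) / V1, per the formula above."""
    return sample_abs * calib_factor_B * dilution_D / air_volume_m3

def ug_m3_to_ppm(ug_m3, mol_weight=64.07, molar_volume_l=24.45):
    """Convert ug/m3 to ppm, assuming 25 C and 1 atm."""
    return ug_m3 * molar_volume_l / (mol_weight * 1000.0)

c = so2_ug_per_m3(sample_abs=0.35, calib_factor_B=28.0, dilution_D=1.0,
                  air_volume_m3=1.5)
print(f"SO2: {c:.1f} ug/m3 = {ug_m3_to_ppm(c):.4f} ppm")
```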


Quality Control and Good Laboratory Practice

  • Analyze control samples with known SO₂ concentrations
  • Recalibrate if reagent blank absorbance deviates significantly
  • Clean cuvettes immediately after use
  • Maintain consistent temperature during calibration and analysis

Final Thoughts

The para rosaniline method remains a dependable and sensitive technique for measuring sulfur dioxide in ambient air. When performed with careful reagent preparation, calibration, and quality control, it provides accurate and reproducible results essential for air‑quality monitoring and regulatory compliance.

Consistent application of this method helps laboratories contribute reliable data toward protecting public health and the environment.


Friday, April 23, 2021

Methods of Phosphate Analysis in Water and Wastewater

Phosphate Analysis in Water and Wastewater

Introduction

Phosphates are essential nutrients for plant growth, but when present in excess in water and wastewater, they become a major environmental concern. Elevated phosphate levels are one of the primary causes of eutrophication, leading to algal blooms, oxygen depletion, and degradation of aquatic ecosystems. Because of these impacts, phosphate analysis is a critical component of water quality monitoring and wastewater treatment operations.

This blog explores why phosphate analysis matters, common forms of phosphates found in water, analytical methods used for measurement, and their significance in environmental management.


What Are Phosphates?

Phosphates are chemical compounds containing phosphorus combined with oxygen, commonly found as:

  • Orthophosphates – the simplest and most reactive form
  • Condensed phosphates – polyphosphates and metaphosphates
  • Organic phosphates – phosphorus bound to organic molecules

In water and wastewater analysis, results are often reported as mg/L of PO₄³⁻ or as phosphorus (P).
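
The two reporting conventions are related by the molar-mass ratio of P to PO₄³⁻ (≈ 30.97 / 94.97 ≈ 0.326). A quick sketch of the conversion:

```python
# Minimal sketch: convert between mg/L PO4(3-) and mg/L P
P_OVER_PO4 = 30.97 / 94.97   # molar masses of P and PO4(3-)

def po4_to_p(po4_mg_l):
    return po4_mg_l * P_OVER_PO4

def p_to_po4(p_mg_l):
    return p_mg_l / P_OVER_PO4

print(f"1.0 mg/L PO4 = {po4_to_p(1.0):.3f} mg/L P")  # ~0.326 mg/L P
```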


Sources of Phosphates in Water and Wastewater

Phosphates enter water bodies from both natural and anthropogenic sources:

Natural Sources

  • Weathering of phosphate-containing rocks
  • Decomposition of organic matter

Anthropogenic Sources

  • Domestic sewage and human waste
  • Detergents and cleaning agents
  • Agricultural runoff containing fertilizers
  • Industrial effluents (food processing, fertilizer, chemical industries)



Why Is Phosphate Analysis Important?

Phosphate monitoring is essential for several reasons:

  • Prevention of eutrophication: Excess phosphates promote uncontrolled algal growth.
  • Protection of aquatic life: Algal decay reduces dissolved oxygen, harming fish and other organisms.
  • Regulatory compliance: Environmental agencies set strict discharge limits for phosphates.
  • Process control: Wastewater treatment plants rely on phosphate measurements to optimize biological and chemical removal processes.

Analytical Methods for Phosphate Determination

Several methods are used for phosphate analysis depending on accuracy requirements, sample type, and available instrumentation.

1. Colorimetric (Spectrophotometric) Method

This is the most widely used method for phosphate analysis.

  • Based on the reaction of phosphate with ammonium molybdate under acidic conditions
  • Forms a blue-colored complex (molybdenum blue)
  • Intensity of color is measured using a spectrophotometer

Advantages:

  • Simple and cost-effective
  • Suitable for routine laboratory analysis

Limitations:

  • Interference from silica, arsenate, or turbidity if not properly controlled

2. Ascorbic Acid Method

A refined colorimetric method commonly recommended by standard methods.

  • Produces a stable blue color
  • High sensitivity and reproducibility

This method is widely used in environmental laboratories for both water and wastewater samples.


3. Ion Chromatography

  • Separates phosphate ions from other anions
  • Quantification based on conductivity or UV detection

Advantages:

  • High precision and selectivity
  • Capable of multi-ion analysis

Limitations:

  • High equipment and maintenance cost

4. Inductively Coupled Plasma (ICP) Techniques

  • Measures phosphorus directly as an element
  • Suitable for trace-level analysis

Advantages:

  • Very high sensitivity
  • Minimal chemical interference

Limitations:

  • Expensive instrumentation
  • Requires skilled operation

Sample Collection and Preservation

Proper sampling is critical for accurate phosphate analysis:

  • Use clean, phosphate-free containers
  • Analyze samples as soon as possible
  • Refrigerate samples at 4°C if analysis is delayed
  • Acid digestion may be required to convert all forms of phosphorus into orthophosphate for total phosphate analysis

Phosphate Removal and Control in Wastewater

Phosphate analysis supports treatment strategies such as:

  • Biological phosphorus removal (EBPR)
  • Chemical precipitation using alum, ferric chloride, or lime
  • Tertiary treatment and filtration

Accurate monitoring ensures effective removal and regulatory compliance.


Regulatory Standards

Many environmental authorities specify maximum allowable phosphate or phosphorus concentrations in effluents and surface waters. Typical discharge limits range from 0.1 to 1.0 mg/L as phosphorus, depending on local regulations and receiving water sensitivity.


Conclusion

Phosphate analysis plays a vital role in protecting water quality and maintaining ecological balance. By understanding phosphate sources, applying appropriate analytical methods, and interpreting results correctly, water and wastewater professionals can effectively control nutrient pollution and meet environmental standards.

Regular monitoring, combined with efficient treatment processes, is key to reducing phosphate-related environmental impacts and ensuring sustainable water management.



Friday, March 26, 2021

Measurement of Acidity in water and wastewater

Understanding Acidity in Water: A Practical Lab Guide to Accurate Measurement

Water quality sits at the heart of environmental health, public safety, and countless industrial processes. While parameters like pH and turbidity often steal the spotlight, acidity is another critical factor that deserves attention. Elevated acidity can corrode pipelines, disrupt aquatic ecosystems, and compromise drinking water safety.

In this post, we’ll walk through a standard laboratory method for measuring acidity in water and wastewater samples. This isn’t just textbook chemistry—it’s a practical, widely used approach that helps protect water resources and ensures regulatory compliance.




Why Measuring Acidity Matters

Acidity testing determines the concentration of acidic substances present in a water sample. These acids may exist as free hydrogen ions or as compounds that release hydrogen ions when dissolved or hydrolyzed.

By neutralizing these acidic components with a standard alkaline solution, laboratories can quantify acidity and express it as milligrams per liter (mg/L) of calcium carbonate (CaCO₃). This standardized unit allows results to be compared across different samples, locations, and regulatory frameworks.

Acidity measurements are essential for:

  • Monitoring environmental pollution
  • Designing and evaluating wastewater treatment processes
  • Ensuring compliance with environmental and industrial discharge standards

Roles and Responsibilities in the Laboratory

Accurate acidity measurement is a team effort. In a typical laboratory setup:

  • Laboratory Chemist: Conducts the titration and records observations.
  • Technical Manager: Reviews data for accuracy, consistency, and technical validity.
  • Quality Manager: Ensures procedures follow approved standards and quality protocols.

Each role contributes to producing reliable, defensible results.


The Science Behind Acidity Testing

Acidity is determined using acid–base titration, one of the most fundamental techniques in analytical chemistry. The acidic components in the sample react with a standardized base—commonly sodium hydroxide (NaOH).

As the base is added, it neutralizes the acid. The endpoint of the reaction is detected using a color indicator, signaling that all acidic components have been neutralized.


Equipment and Glassware Required

This method relies on standard laboratory glassware:

  • Conical (Erlenmeyer) flasks
  • Burette for accurate titrant delivery
  • Pipettes or measuring cylinders for sample handling

No advanced instrumentation is required, making this method accessible and cost-effective.


Reagent Preparation

High-quality reagents are critical for accurate results. Proper preparation and storage are essential.

1. 0.02 N Sodium Hydroxide (NaOH)

  • Dilute 200 mL of 0.1 N NaOH to 1 liter with distilled water.
  • This solution serves as the titrant.

2. Phenolphthalein Indicator

  • Dissolve 80 mg of phenolphthalein in 100 mL of 95% ethanol.
  • Isopropyl alcohol or methanol may be used as alternatives.
  • The indicator turns pink under basic conditions, marking the titration endpoint.

3. 0.05 N Potassium Hydrogen Phthalate (KHP)

  • Dry approximately 15 g of KHP at 120°C for two hours.
  • Cool in a desiccator.
  • Accurately weigh about 10 g and dilute to 1 liter with distilled water.
  • This primary standard is used to standardize the NaOH solution.

Step-by-Step Acidity Testing Procedure

  1. Sample Preparation
    Measure 50–100 mL of the water or wastewater sample into a clean conical flask. The volume may be adjusted depending on the expected acidity.

  2. Indicator Addition
    Add 2–3 drops of phenolphthalein indicator to the sample.

  3. Titration
    Slowly titrate with 0.02 N NaOH from the burette while gently swirling the flask.

  4. Endpoint Detection
    Continue titration until a faint pink color appears and persists for at least 30 seconds.

  5. Recording Results
    Record the burette reading corresponding to the volume of NaOH used.

  6. Standardization
    Standardize the NaOH solution by titrating 10 mL of the 0.05 N KHP solution using the same procedure.


Calculation of Acidity

Acidity is calculated using the following formula:

Acidity (mg/L as CaCO₃) = (A × N × 50 × 1000) / V

Where:

  • A = Volume of NaOH used (mL)
  • N = Normality of NaOH
  • V = Volume of sample (mL)
  • 50 = Equivalent weight of CaCO₃

This calculation converts titration data into a standardized and meaningful result.
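
As a quick check on the arithmetic, here is a minimal sketch using illustrative titration values (not measured data):

```python
# Minimal sketch: acidity as CaCO3 from titration data (illustrative values)

def acidity_mg_caco3(naoh_ml, normality, sample_ml):
    """Acidity (mg/L as CaCO3) = (A x N x 50 x 1000) / V."""
    return naoh_ml * normality * 50.0 * 1000.0 / sample_ml

print(f"{acidity_mg_caco3(naoh_ml=6.8, normality=0.02, sample_ml=100):.1f} mg/L")
# 68.0 mg/L as CaCO3
```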


Practical Tips and Common Pitfalls

  • Rinse all glassware with distilled water before use.
  • Always standardize the NaOH solution before analysis.
  • Highly colored or turbid samples may require modified techniques or indicators.
  • Wear appropriate personal protective equipment (gloves and safety goggles).
  • Store reagents properly to avoid degradation or contamination.

Final Thoughts

Acidity testing is a simple yet powerful tool in water and wastewater analysis. By following this standardized titration method, laboratories can generate accurate, reliable data that support environmental protection, public health, and industrial compliance.

Whether you’re a student learning analytical chemistry or a professional working in environmental monitoring, mastering acidity measurement is a valuable skill that directly contributes to safer and more sustainable water systems.

Have questions or hands-on experiences to share? Join the conversation and stay tuned for more practical lab insights!


Sunday, March 21, 2021

Biological Oxygen Demand (BOD) Testing

 

Biological Oxygen Demand (BOD): A Complete Laboratory Guide

Biological Oxygen Demand (BOD) is one of the most widely used parameters for evaluating organic pollution in water and wastewater. It reflects the amount of oxygen required by microorganisms to biologically decompose organic matter under aerobic conditions. This blog presents a clear, original, and laboratory-oriented explanation of the BOD test procedure, including reagent preparation, dilution techniques, incubation, and calculation.



Purpose of BOD Analysis

The purpose of this procedure is to describe the laboratory method for measuring Biological Oxygen Demand (BOD) in water and wastewater samples.


Scope

This method is applicable to environmental laboratories involved in the analysis of:

  • Surface water
  • Groundwater
  • Treated and untreated wastewater
  • Industrial effluents

where BOD determination is required for monitoring, treatment efficiency, or regulatory compliance.


Roles and Responsibilities

  • Laboratory Chemist: Performs sample preparation, dilution, incubation, DO measurement, and calculation of BOD.
  • Technical Manager: Reviews analytical activities and validates results.
  • Quality Manager: Ensures SOP implementation and adherence to quality requirements.

Principle of the BOD Test

Biological Oxygen Demand is defined as the quantity of dissolved oxygen consumed by microorganisms while stabilizing biodegradable organic matter present in a water or wastewater sample under aerobic conditions. The reduction in dissolved oxygen over a fixed incubation period reflects the BOD of the sample.


Instruments and Equipment

  • BOD bottles (300 mL capacity)
  • BOD incubator maintained at 27 ± 1°C
  • Measuring cylinders and volumetric flasks
  • DO titration setup (as per Winkler method)

Reagents and Their Preparation

1. Phosphate Buffer Solution

Reagents required:

  • Potassium dihydrogen phosphate (KH₂PO₄): 8.5 g
  • Dipotassium hydrogen phosphate (K₂HPO₄): 21.75 g
  • Disodium hydrogen phosphate heptahydrate (Na₂HPO₄·7H₂O): 33.4 g
  • Ammonium chloride (NH₄Cl): 1.7 g
  • Distilled water: up to 1000 mL

Preparation: Dissolve all salts in distilled water, make up to 1000 mL, and adjust the pH to 7.2.


2. Magnesium Sulphate Solution

Reagents required:

  • Magnesium sulphate heptahydrate (MgSO₄·7H₂O): 82.5 g
  • Distilled water: up to 1000 mL

Preparation: Dissolve MgSO₄·7H₂O in distilled water and dilute to 1000 mL.


3. Calcium Chloride Solution

Reagents required:

  • Calcium chloride (CaCl₂): 27.5 g
  • Distilled water: up to 1000 mL

Preparation: Dissolve CaCl₂ in distilled water and dilute to 1000 mL.


4. Ferric Chloride Solution

Reagents required:

  • Ferric chloride hexahydrate (FeCl₃·6H₂O): 0.25 g
  • Distilled water: up to 1000 mL

Preparation: Dissolve FeCl₃·6H₂O in distilled water and dilute to 1000 mL.


5. Sodium Thiosulfate Solution (0.025 N)

Reagents required:

  • Sodium thiosulfate (Na₂S₂O₃): 6.205 g
  • Distilled water: up to 1000 mL

Preparation: Dissolve sodium thiosulfate in distilled water and dilute to 1000 mL.


Test Method

Preparation of Dilution Water

  1. Aerate the required quantity of distilled water by bubbling compressed air for 1–2 days to achieve DO saturation.
  2. Add 1 mL each of phosphate buffer, magnesium sulphate, calcium chloride, and ferric chloride solutions per liter of dilution water.
  3. Mix thoroughly.
  4. For samples lacking sufficient microbial population, add seed—generally 2 mL of settled sewage per liter of dilution water.



Sample Preparation and Pretreatment

  • Adjust sample pH to approximately 7.0 if highly acidic or alkaline.
  • Ensure the sample is free from residual chlorine. If chlorine is present, remove it using sodium thiosulfate.

Removal of Residual Chlorine

  1. Take 50 mL of sample and acidify with 10 mL of 1+1 acetic acid.
  2. Add approximately 1 g potassium iodide (KI).
  3. Titrate with sodium thiosulfate using starch as an indicator.
  4. Calculate the amount of sodium thiosulfate required per milliliter of sample and treat the BOD sample accordingly.
Note: If the sample has unusually high DO (above 9 mg/L), reduce it by gentle aeration or agitation.

Sample Dilution

Prepare multiple dilutions to achieve:

  • At least 2 mg/L DO depletion
  • Residual DO not less than 1 mg/L after incubation
  • Approximately 50% DO depletion

Dilution is prepared by siphoning seeded dilution water, adding the required volume of sample, and making up to volume with dilution water.

Suggested Dilutions and BOD Ranges

% Dilution     Expected BOD (mg/L)
0.01           20,000 – 70,000
0.02           10,000 – 35,000
0.05           4,000 – 14,000
0.1            2,000 – 7,000
0.2            1,000 – 3,500
0.5            400 – 1,900
1              200 – 700
2              100 – 350
5              40 – 140
10             20 – 70
20             10 – 35
50             Up to 14
100            1 – 7

Incubation and DO Measurement

  • Fill labeled BOD bottles with prepared dilutions and stopper immediately.
  • Measure initial DO (D₀) in one bottle.
  • Incubate three bottles at 27°C for 3 days with a proper water seal.
  • Prepare blank bottles using dilution water only.
  • Determine DO for samples and blanks on day 0 and after 3 days using the Winkler method.

Calculation of BOD

Let:

  • D₀ = DO of sample on day 0 (mg/L)
  • D₁ = DO of sample after 3 days (mg/L)
  • C₀ = DO of blank on day 0 (mg/L)
  • C₁ = DO of blank after 3 days (mg/L)

BOD (mg/L) = [(D₀ − D₁) − (C₀ − C₁)] / (decimal fraction of sample used)

If the sample is seeded, determine the BOD contribution of the seed separately and apply the appropriate correction.
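
The sketch below applies this formula with illustrative DO readings; the optional seed correction is shown as a simple subtraction, assuming the seed contribution has already been determined separately:

```python
# Minimal sketch: BOD calculation with blank correction (illustrative values)

def bod_mg_l(d0, d1, c0, c1, sample_fraction, seed_correction=0.0):
    """BOD = [(D0 - D1) - (C0 - C1)] / P, minus any separately determined
    seed contribution. sample_fraction (P) is the decimal fraction of sample."""
    depletion = (d0 - d1) - (c0 - c1)
    return depletion / sample_fraction - seed_correction

# 5% dilution: P = 0.05
print(f"{bod_mg_l(d0=8.2, d1=3.9, c0=8.4, c1=8.2, sample_fraction=0.05):.1f} mg/L")
# ~82.0 mg/L
```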


Final Remarks

BOD testing provides critical insight into the biodegradable organic load of water and wastewater. Accurate dilution, proper incubation, and careful DO measurement are essential for reliable results. When followed correctly, this method remains a cornerstone of environmental water quality assessment.

🌱 Healthy microbes tell the true story of water quality.


Monday, March 15, 2021

Iron (Fe) Analysis

Mastering Iron Detection in Water: The Phenanthroline Colorimetric Method

Iron is a common element in water sources and can cause aesthetic and operational issues, including rusty stains and metallic taste. Accurate measurement of iron in water and wastewater is essential for environmental monitoring, industrial processes, and safe water supply. The Phenanthroline colorimetric method is a reliable, sensitive, and easy-to-use technique for quantifying iron concentrations.






Why Measure Iron?

High iron levels in water are more than just cosmetic:

  • Cause orange-brown stains on plumbing, laundry, and utensils
  • Affect taste and color in food and beverages
  • Influence industrial processes and equipment longevity
  • Impact environmental quality for agriculture and ecosystems

Monitoring iron ensures compliance with regulatory limits and helps maintain water quality.


Principle of the Phenanthroline Method

This method measures iron by forming a colored complex:

  1. Iron in the sample is converted to the ferrous form (Fe²⁺) using acid and a reducing agent like hydroxylamine.
  2. At pH 3.2–3.3, 1,10-phenanthroline binds Fe²⁺, forming an orange-red tris-phenanthroline complex.
  3. The color intensity is proportional to iron concentration (Beer's Law) and can be measured spectrophotometrically at 510 nm.

The method is robust, with stable complexes across pH 3–9 and rapid color development between pH 2.9–3.5.


Roles in the Lab

  • Lab Chemist: Handles sample preparation, treatment, and measurement
  • Technical Manager: Reviews procedures and ensures accuracy
  • Quality Manager: Oversees SOP compliance and quality assurance

This method applies to water and wastewater samples where precise iron measurement is required.


Equipment Needed

  • Spectrophotometer (set at 510 nm)
  • Conical flasks, volumetric flasks, pipettes
  • Glass beads (for boiling)

Reagents

  • Hydroxylamine Solution: 10 g hydroxylamine hydrochloride in 100 mL distilled water
  • Ammonium Acetate Buffer: 250 g ammonium acetate in 150 mL water + 700 mL glacial acetic acid
  • Sodium Acetate Solution: 200 g in 800 mL water
  • Phenanthroline Solution: 100 mg 1,10-phenanthroline monohydrate in 100 mL water at 80°C
  • 0.1M Potassium Permanganate: 0.316 g KMnO₄ in 100 mL water
  • Stock Iron Solution: 1.404 g ferrous ammonium sulfate + 20 mL concentrated H₂SO₄ in 1 L water
  • Standard Iron Solutions: Dilutions from stock solution (e.g., 10 µg/mL or 1 µg/mL)

Safety: Handle acids and reagents with gloves and fume hood protection.


Procedure

  1. Sample Preparation: Filter cloudy samples into clean flasks.
  2. Calibration Curve: Prepare iron standards (1–5 mg/L Fe). Zero spectrophotometer with blank and measure absorbance.
  3. Sample Treatment: Pipette 50 mL of sample into a 100 mL volumetric flask. Add 2 mL HCl, 1 mL hydroxylamine, and a few glass beads. Boil until reduced to 15–20 mL.
  4. Cooling and Color Development: Cool, add 10 mL ammonium acetate buffer and 4 mL phenanthroline solution. Dilute to 100 mL with water and wait 10 minutes.
  5. Measurement: Measure absorbance at 510 nm. Dilute samples if readings exceed the standard range.
  6. Blank Check: Run a blank with distilled water instead of sample.

Calculation

Fe (mg/L) = (µg Fe in 100 mL final solution) ÷ mL of sample

This formula gives the iron concentration directly in the sample.
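
A small worked sketch of this formula (the numbers are illustrative; the µg Fe value would come from your calibration curve):

```python
# Minimal sketch: iron concentration from the phenanthroline method
# (illustrative numbers; ug Fe comes from your calibration curve)

def fe_mg_per_l(ug_fe_in_final_100ml, sample_ml):
    """Fe (mg/L) = ug Fe in 100 mL final solution / mL of sample."""
    return ug_fe_in_final_100ml / sample_ml

print(f"{fe_mg_per_l(ug_fe_in_final_100ml=120.0, sample_ml=50.0):.2f} mg/L")
# 2.40 mg/L Fe
```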


Why Use This Method?

The Phenanthroline method is preferred because it is:

  • Sensitive and accurate
  • Easy to perform with minimal equipment
  • Applicable to a wide range of water and wastewater samples
  • Provides reliable results that comply with regulatory standards (APHA, ISO)

Monitoring iron ensures safe water, prevents corrosion, and improves aesthetic quality. High levels can be treated using filtration or oxidation methods.

Pro Tip: Always follow lab safety protocols and verify your results against official standards.


Mastering this method allows scientists, students, and environmental technicians to measure iron accurately and maintain water quality effectively.



Tuesday, February 2, 2021

Ammoniacal Nitrogen Testing in Wastewater

Monitoring Ammoniacal Nitrogen in Water and Wastewater

Monitoring ammoniacal nitrogen (NH₃‑N) in water and wastewater is essential for evaluating pollution levels, treatment efficiency, and compliance with environmental regulations. Elevated ammonia concentrations can be toxic to aquatic organisms and often indicate contamination from sewage, industrial discharges, or agricultural runoff.

This article presents a clear, laboratory‑based Standard Operating Procedure (SOP) for determining ammoniacal nitrogen using the distillation and titrimetric method, a widely accepted and reliable analytical technique.




Why Measure Ammoniacal Nitrogen?

Ammoniacal nitrogen represents ammonia and ammonium compounds present in water. High levels may:

  • Indicate contamination from domestic or industrial wastewater
  • Cause toxicity to fish and other aquatic life
  • Interfere with drinking water treatment processes
  • Signal incomplete biological treatment in wastewater plants

Accurate measurement of ammoniacal nitrogen is therefore critical for environmental monitoring, regulatory compliance, and treatment process control.


Objective of the Method

The objective of this method is to determine the concentration of ammoniacal nitrogen (NH₃‑N) in water and wastewater samples using a standardized laboratory procedure that ensures accuracy, reliability, and repeatability of results.


Principle of Analysis

The method is based on the alkaline distillation of ammonia from the sample. The sample is buffered to a pH of approximately 9.5 using a borate buffer, which minimizes the decomposition of cyanates and organic nitrogen compounds.

Under alkaline conditions, ammonia is liberated and distilled into a receiving solution of boric acid. The absorbed ammonia is then determined by titration with standard sulfuric acid using a mixed indicator. The volume of acid consumed is directly proportional to the ammonical nitrogen content of the sample.


Apparatus and Equipment

The following laboratory equipment is required:

  • Pipettes
  • Conical flasks
  • Nitrogen distillation assembly
  • Heating mantle

Reagents

All reagents should be of analytical reagent grade.

  • Sodium Tetraborate (0.025 M): Dissolve 9.5 g Na₂B₄O₇·10H₂O in 1 L distilled water
  • Borate Buffer: Mix 500 mL of 0.025 M sodium tetraborate with 88 mL of 0.1 N NaOH
  • Sodium Hydroxide (6 N): Dissolve 240 g NaOH in 1 L distilled water
  • Mixed Indicator Solution: Methyl red and methylene blue dissolved in ethanol or propanol
  • Indicating Boric Acid Solution: Dissolve 20 g H₃BO₃ and add 10 mL mixed indicator; dilute to 1 L
  • Sulfuric Acid (0.1 N and 0.02 N): Prepare and standardize as required
  • Sodium Carbonate (0.05 N): Dissolve 2.5 g in 1 L distilled water

Test Method

Distillation Procedure

  1. Take 250 mL of dechlorinated sample or dilute the sample to 250 mL.
  2. Adjust the pH to approximately 7.0, if required.
  3. Add 25 mL borate buffer and adjust the pH to 9.5 using 6 N NaOH.
  4. Assemble the distillation unit and begin distillation at a rate of 6–10 mL per minute.
  5. Collect the distillate in a 500 mL conical flask containing 50 mL indicating boric acid solution.
  6. Perform a reagent blank under identical conditions and apply blank correction.

Titration

Titrate the collected distillate with 0.02 N sulfuric acid until a pale lavender endpoint is observed.


Calculation

Ammoniacal Nitrogen (mg/L) is calculated using the formula:

Ammoniacal Nitrogen (mg/L) = ((A − B) × N × 14 × 1000) / Sample Volume (mL)

Where:

  • A = Volume of sulfuric acid used for the sample (mL)
  • B = Volume of sulfuric acid used for the blank (mL)
  • N = Normality of sulfuric acid
  • 14 = Atomic weight of nitrogen

Example Calculation

Given:

  • A = 12.5 mL
  • B = 0.5 mL
  • N = 0.02 N
  • Sample volume = 250 mL

Step 1: Blank correction 12.5 − 0.5 = 12.0 mL

Step 2: Multiply by normality 12.0 × 0.02 = 0.24

Step 3: Multiply by atomic weight of nitrogen 0.24 × 14 = 3.36

Step 4: Convert to mg/L 3.36 × 1000 / 250 = 13.44 mg/L
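
A minimal sketch that reproduces the worked example above:

```python
# Minimal sketch: ammoniacal nitrogen from distillation-titration data

def nh3_n_mg_l(acid_sample_ml, acid_blank_ml, normality, sample_ml):
    """NH3-N (mg/L) = ((A - B) x N x 14 x 1000) / sample volume."""
    return (acid_sample_ml - acid_blank_ml) * normality * 14.0 * 1000.0 / sample_ml

print(f"{nh3_n_mg_l(12.5, 0.5, 0.02, 250):.2f} mg/L")  # 13.44 mg/L, as in the example
```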


Conclusion

The distillation–titrimetric method is a reliable and widely accepted approach for determining ammoniacal nitrogen in water and wastewater. Strict adherence to this SOP ensures accurate results, supports regulatory compliance, and aids in effective water quality assessment. Consistent monitoring of ammoniacal nitrogen plays a vital role in protecting aquatic ecosystems and ensuring sustainable water management practices.


Wednesday, December 23, 2020

Total Hardness Analysis in Water

Total Hardness Determination (EDTA Titrimetric Method)

1. Purpose

This Standard Operating Procedure (SOP) describes a validated method for the determination of Total Hardness in water and wastewater samples using the EDTA titrimetric method. The procedure is designed to produce accurate, precise, and reproducible results suitable for routine laboratory analysis and regulatory monitoring.




2. Scope

This method applies to the analysis of drinking water, surface water, groundwater, and wastewater samples in which calcium (Ca²⁺) and magnesium (Mg²⁺) are the primary contributors to hardness.


3. Principle of the Method

Total hardness in water is caused mainly by dissolved calcium and magnesium salts. The determination is based on complexometric titration using Ethylenediaminetetraacetic Acid (EDTA) as the titrant.

At a controlled pH of 10.0 ± 0.1, calcium and magnesium ions react with Eriochrome Black T (EBT) indicator to form a wine‑red colored complex. During titration, EDTA preferentially complexes with calcium and magnesium ions. Once all metal ions are bound by EDTA, the indicator is released, producing a distinct color change from wine‑red to sky blue, which signifies the endpoint.


4. Responsibilities

  • Laboratory Analyst: Perform the analysis in accordance with this SOP and accurately record all observations and results.
  • Laboratory Supervisor: Ensure availability, calibration, and validation of equipment, reagents, and standards.

5. Apparatus and Equipment

The following equipment is required:

  • Calibrated pH meter
  • Burette (50 mL capacity)
  • Conical flasks (100 mL)
  • Volumetric flasks
  • Measuring cylinders
  • Analytical balance
  • Standard laboratory glassware

6. Reagents and Chemicals

All reagents shall be of Analytical Reagent (AR) grade.

6.1 Distilled Water

Used for reagent preparation and dilution.

6.2 Buffer Solution (pH 10)

Dissolve 16.9 g ammonium chloride (NH₄Cl) in 143 mL ammonium hydroxide, and dilute to 250 mL with distilled water.

6.3 Eriochrome Black T Indicator

Thoroughly mix 0.5 g Eriochrome Black T with 100 g sodium chloride (NaCl).

6.4 Inhibitor Solution (If Required)

For samples containing interfering ions, dissolve 4.5 g hydroxylamine hydrochloride in 100 mL of 95% ethyl alcohol or isopropyl alcohol.

6.5 Standard EDTA Solution (0.01 M)

Dissolve 3.723 g EDTA in distilled water and dilute to 1000 mL. Standardize the solution against 0.01 M zinc sulfate solution.

6.6 Standard Zinc Sulfate Solution (0.01 M)

Dissolve 2.8754 g zinc sulfate heptahydrate (ZnSO₄·7H₂O) in distilled water and make up to 1000 mL.

6.7 Standard Calcium Solution (1 mL = 1 mg as CaCO₃)

Dry analytical‑grade calcium carbonate at 180°C for 1 hour. Accurately weigh 1.000 g, dissolve using minimal concentrated hydrochloric acid, boil briefly, cool, add methyl red indicator, neutralize to an orange endpoint using 3N ammonium hydroxide, and dilute to 1000 mL with distilled water.


7. Standardization of EDTA Solution

  1. Pipette 50 mL of standard calcium solution into a conical flask.
  2. Add 1 mL buffer solution.
  3. Add 1–2 drops of Eriochrome Black T indicator.
  4. Titrate with EDTA solution until the color changes from purple to sky blue.
  5. Record the burette reading and calculate the exact normality of the EDTA solution.

8. Sample Analysis Procedure

  1. Transfer 50 mL of the water sample (or a suitable aliquot) into a conical flask.
  2. If interference is anticipated, add 1 mL hydroxylamine hydrochloride solution.
  3. Add 1–2 mL buffer solution to adjust the pH to 10.0–10.1.
  4. Add 2–3 drops of Eriochrome Black T indicator; the solution will turn wine‑red.
  5. Titrate with standardized EDTA solution, stirring rapidly initially and slowly near the endpoint.
  6. Note the endpoint indicated by a color change from wine‑red to sky blue.

9. Calculation

Total Hardness (as CaCO₃), mg/L


Total Hardness = (A × M × 100 × 1000) / V

Where:

  • A = Volume of EDTA used (mL)
  • M = Molarity of the EDTA solution (mol/L); 100 is the molar mass of CaCO₃
  • V = Volume of sample taken (mL)
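
A minimal sketch of the calculation, assuming 0.01 M EDTA (so 1 mL of titrant corresponds to about 1 mg CaCO₃) and an illustrative titration volume:

```python
# Minimal sketch: total hardness as CaCO3 (illustrative values)

def hardness_mg_caco3(edta_ml, molarity, sample_ml):
    """Hardness (mg/L as CaCO3) = (A x M x 100 x 1000) / V.
    100 is the molar mass of CaCO3 (g/mol)."""
    return edta_ml * molarity * 100.0 * 1000.0 / sample_ml

print(f"{hardness_mg_caco3(edta_ml=12.4, molarity=0.01, sample_ml=50):.1f} mg/L")
# 248.0 mg/L as CaCO3
```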

10. Process Flow (Summary)

  • Measure 50 mL of sample into a conical flask
  • Add buffer solution
  • Add Eriochrome Black T indicator (wine‑red color)
  • Titrate with 0.01 M EDTA solution
  • Observe endpoint color change to sky blue
  • Record titration volume
  • Calculate total hardness in mg/L

11. Precautions

  • Maintain pH strictly at 10 to ensure accurate endpoint detection.
  • Use freshly standardized EDTA solution.
  • Ensure all glassware is free from metal contamination.

12. Conclusion

The EDTA titrimetric method is a well‑established and reliable technique for determining total hardness in water and wastewater samples. When performed under controlled conditions, it provides precise and reproducible results essential for water quality assessment and effective treatment process control.










Tuesday, December 15, 2020

Standard Operating Procedure for the Measurement of Chemical Oxygen Demand

Chemical Oxygen Demand (COD): A Practical Laboratory Guide


Chemical Oxygen Demand (COD) is a key parameter used to assess the level of organic pollution in water and wastewater. It measures the amount of oxygen required to chemically oxidize organic and inorganic matter present in a sample. In this blog, we explain the COD test procedure, its principle, reagents, and calculations in a clear and laboratory-friendly way.


Purpose of COD Testing

The purpose of this standard operating procedure is to describe the laboratory method for determining Chemical Oxygen Demand in water and wastewater samples.


Scope

This procedure is applicable to laboratories engaged in the analysis of:

  • Drinking water
  • Surface water
  • Industrial effluents
  • Domestic and municipal wastewater

where COD measurement is required for monitoring, compliance, or research purposes.


Roles and Responsibilities

  • Laboratory Chemist: Responsible for sample preparation, digestion, titration, and calculation of COD values.
  • Technical Manager: Reviews the analytical procedure and verifies results.
  • Quality Manager: Ensures proper implementation of the SOP and compliance with quality standards.

Principle of COD Determination

The COD test is based on the oxidation of organic matter by potassium dichromate (K₂Cr₂O₇) in a strongly acidic medium:

  • The sample is refluxed with a known excess amount of potassium dichromate in the presence of concentrated sulfuric acid.
  • Organic matter present in the sample gets oxidized during digestion.
  • After refluxing, the remaining (unreduced) potassium dichromate is titrated with ferrous ammonium sulphate (FAS).
  • The amount of dichromate consumed is directly proportional to the oxygen required to oxidize the sample constituents and is expressed as mg/L of oxygen.

Instruments and Equipment Required

  • Heating mantle
  • Reflux condenser
  • Burette (50 mL)
  • Round-bottom flask (500 mL)
  • Pipettes (50 mL)
  • Measuring cylinders

Reagents Used

1. Potassium Dichromate Solution (0.25 N)

Reagents required:

  • Potassium dichromate (K₂Cr₂O₇): 12.259 g
  • Distilled water: up to 1000 mL

Preparation: Dissolve 12.259 g of potassium dichromate in distilled water, transfer to a 1000 mL volumetric flask, and make up to the mark with distilled water. Store in an amber-colored bottle.


2. Ferrous Ammonium Sulphate (FAS) Solution (0.25 N)

Reagents required:

  • Ferrous ammonium sulphate hexahydrate [Fe(NH₄)₂(SO₄)₂·6H₂O]: 98 g
  • Concentrated sulfuric acid: 20 mL
  • Distilled water: up to 1000 mL

Preparation: Dissolve 98 g of ferrous ammonium sulphate in distilled water. Add 20 mL of concentrated sulfuric acid carefully, cool the solution, and dilute to 1000 mL with distilled water. Standardize daily against 0.25 N potassium dichromate.


3. Ferroin Indicator

Reagents required:

  • 1,10-Phenanthroline monohydrate: 1.485 g
  • Ferrous sulphate heptahydrate (FeSO₄·7H₂O): 0.695 g
  • Distilled water: up to 100 mL

Preparation: Dissolve 1.485 g of 1,10-phenanthroline monohydrate and 0.695 g of ferrous sulphate in distilled water and dilute to 100 mL. Store in a dark bottle.


4. Concentrated Sulfuric Acid

Reagent required:

  • Concentrated H₂SO₄ (analytical grade)

Used to provide the acidic medium necessary for oxidation during reflux digestion.


5. Silver Sulphate

Reagent required:

  • Silver sulphate (Ag₂SO₄)

Used as a catalyst to enhance oxidation of certain organic compounds, especially straight-chain fatty acids.


6. Mercuric Sulphate

Reagent required:

  • Mercuric sulphate (HgSO₄)

Added to eliminate chloride interference during COD determination.


Step-by-Step Test Method

  1. Pipette 20 mL of the sample into a 500 mL round-bottom reflux flask.

    For samples with high COD, take a smaller volume and dilute it to 20 mL with distilled water to ensure proper digestion and clear endpoint.

  2. Add approximately 0.5 g of mercuric sulphate and a pinch of silver sulphate to the flask.

  3. Add 10 mL of 0.25 N potassium dichromate solution.

  4. Carefully add 30 mL of concentrated sulfuric acid with gentle swirling to mix the contents.

  5. Attach a reflux condenser and reflux the mixture on a heating mantle for a minimum of two hours.

  6. Allow the mixture to cool and wash down the condenser with about 80 mL of distilled water.

  7. Cool the solution to room temperature.

  8. Titrate the excess potassium dichromate with 0.25 N ferrous ammonium sulphate using 4–5 drops of ferroin indicator.

  9. Observe the color change from bluish-green to reddish-brown, indicating the endpoint.

  10. Record the burette reading for the sample.


Blank Determination

Perform a blank test using 20 mL of distilled water and the same quantities of reagents under identical conditions. Record the burette reading for the blank.


Calculation of Chemical Oxygen Demand


COD (mg/L) = ((B − A) × N × 8000) / Sample Volume (mL)

Where:

  • B = Volume of FAS used for the blank (mL)
  • A = Volume of FAS used for the sample (mL)
  • N = Normality of ferrous ammonium sulphate
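
A minimal sketch of the calculation (titration volumes are illustrative; 8000 is the equivalent weight of oxygen, 8, multiplied by 1000 mL/L):

```python
# Minimal sketch: COD from dichromate reflux titration (illustrative values)

def cod_mg_l(blank_fas_ml, sample_fas_ml, fas_normality, sample_ml):
    """COD (mg/L) = ((B - A) x N x 8000) / sample volume.
    8000 = equivalent weight of oxygen (8) x 1000 mL/L."""
    return (blank_fas_ml - sample_fas_ml) * fas_normality * 8000.0 / sample_ml

print(f"{cod_mg_l(blank_fas_ml=9.8, sample_fas_ml=6.3, fas_normality=0.25, sample_ml=20):.1f} mg/L")
# 350.0 mg/L
```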

Final Notes

COD analysis is a robust and widely accepted method for estimating organic pollution in water and wastewater. Accurate reagent preparation, proper refluxing time, and careful titration are essential for reliable results. When performed correctly, COD testing provides valuable insights into treatment efficiency and environmental impact.

🧪 Clean chemistry leads to cleaner water!



Wednesday, December 2, 2020

Measurement of Dissolved Oxygen

  

Measuring Dissolved Oxygen in Water and Wastewater: A Laboratory Guide

Dissolved Oxygen (DO) is one of the most important indicators of water quality. Whether it’s drinking water, surface water, or wastewater, DO levels tell us a lot about biological activity, pollution load, and overall ecosystem health. In this blog, we’ll walk through the laboratory procedure for determining Dissolved Oxygen using the classic iodometric (Winkler) titration method—explained clearly and practically.





Purpose of the Test

The aim of this procedure is to accurately determine the amount of dissolved oxygen present in water and wastewater samples using a standardized laboratory method.


Scope of Application

This method is applicable in environmental and water-testing laboratories where routine analysis of:

  • Drinking water
  • Surface water
  • Groundwater
  • Wastewater

is required for Dissolved Oxygen measurement.


Roles and Responsibilities

  • Laboratory Chemist: Performs the DO analysis as per the procedure.
  • Technical Manager: Reviews the analytical process and results.
  • Quality Manager: Ensures implementation of the SOP and compliance with quality standards.

Principle of the Method

The determination of dissolved oxygen is based on a redox reaction involving manganese salts and iodide ions:

  1. Oxygen present in the sample oxidizes divalent manganese (Mn²⁺) to a higher oxidation state under alkaline conditions.
  2. This forms a brown precipitate of manganese hydroxide.
  3. Upon acidification, the manganese compound releases iodine from potassium iodide.
  4. The amount of iodine liberated is directly proportional to the dissolved oxygen in the sample.
  5. The released iodine is titrated with standardized sodium thiosulfate using starch as an indicator.

Instruments and Equipment Required

  • BOD bottle (300 mL capacity)
  • Burette (25 mL)
  • Conical flask
  • Pipettes and standard laboratory glassware

Reagents Used

1. Manganese Sulfate Solution

Prepared by dissolving manganese sulfate in distilled water and making up to one liter. The solution must be free from any oxidizing impurities.

2. Alkali–Iodide–Azide Reagent

A mixture of sodium hydroxide, potassium iodide, and sodium azide. Sodium azide suppresses nitrite interference during the analysis.

3. Concentrated Sulfuric Acid

Used to acidify the sample and dissolve the precipitate.

4. Starch Indicator

Freshly prepared starch solution gives a sharp blue endpoint during titration.

5. Sodium Thiosulfate Solution

  • 0.1 N solution: Prepared and standardized using potassium dichromate.
  • 0.025 N solution: Obtained by diluting the 0.1 N solution and re-standardizing.

Step-by-Step Test Method

  1. Collect the water or wastewater sample carefully in a 300 mL BOD bottle, ensuring no air bubbles are trapped.
  2. Add 2 mL of manganese sulfate solution, followed by 2 mL of alkali–iodide–azide reagent. The pipette tip should remain below the liquid surface during addition.
  3. Stopper the bottle immediately and mix by gently inverting it 3–4 times.
  4. Allow the brown precipitate to settle completely.
  5. Add 2 mL of concentrated sulfuric acid, restopper, and mix until the precipitate dissolves, resulting in a clear yellow-brown solution.
  6. Transfer 203 mL of the treated sample into a conical flask.
  7. Titrate with 0.025 N sodium thiosulfate until the solution becomes pale yellow.
  8. Add a few drops of starch indicator; the solution turns blue.
  9. Continue titration until the blue color just disappears—this is the endpoint.

Why 203 mL Is Used for Titration

During reagent addition, 4 mL of reagents displace the original sample volume. To compensate:

  • Actual sample required: 200 mL
  • Adjusted volume taken for titration:

200 × 300 / (300 − 4) ≈ 203 mL

This correction ensures accurate DO calculation.


Calculation of Dissolved Oxygen

  • 1 mL of 0.025 N sodium thiosulfate corresponds to 0.2 mg of oxygen.
  • Since the effective sample volume is 200 mL:

DO (mg/L) = (mL of titrant × 0.2 × 1000) / 200

Therefore:

1 mL of 0.025 N sodium thiosulfate = 1 mg/L of Dissolved Oxygen

Multiply the burette reading by 1 to obtain DO in mg/L.
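
A minimal sketch tying the volume correction and the titration result together (the burette reading is illustrative):

```python
# Minimal sketch: Winkler DO calculation with the 203 mL volume correction

def corrected_titration_volume(bottle_ml=300, reagents_ml=4, effective_sample_ml=200):
    """Volume to titrate so it represents 200 mL of original sample."""
    return effective_sample_ml * bottle_ml / (bottle_ml - reagents_ml)

def do_mg_l(titrant_ml, mg_o2_per_ml=0.2, effective_sample_ml=200):
    """DO (mg/L) = (mL titrant x 0.2 mg O2/mL x 1000) / 200."""
    return titrant_ml * mg_o2_per_ml * 1000.0 / effective_sample_ml

print(f"Titrate {corrected_titration_volume():.0f} mL")  # ~203 mL
print(f"DO = {do_mg_l(titrant_ml=6.4):.1f} mg/L")        # 6.4 mg/L
```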


Final Thoughts

The Winkler titration method remains a reliable and widely accepted technique for Dissolved Oxygen analysis when performed carefully. Proper sample collection, fresh reagents, and precise titration are key to obtaining accurate and reproducible results.

Understanding each step—not just following it—helps ensure data quality and regulatory compliance in water and wastewater testing.

Happy testing! 🧪💧

