This article introduces a 96-channel high-speed anticoagulant drug screening platform based on the transmission turbidity method, using the 89C52 microcontroller as the core.
The instrument performs real-time detection and data acquisition of blood (plasma) coagulation time. Its data acquisition precision, speed, and sensitivity are significantly improved compared with traditional coagulation timers. A new approach to applying coagulation time measurement to drug screening is proposed.
Blood coagulation is a complex process. Under normal physiological conditions, the coagulation system and the anticoagulation system are in a self-regulating balance; if this balance is disrupted, coagulation disorders can occur. Clinically, heparin is the most commonly used anticoagulant. Although it has effective anticoagulant properties, it can cause side effects such as reductions in red blood cells and platelets, and it cannot be used in patients with conditions such as disseminated intravascular coagulation. Existing oral anticoagulants, such as coumarin derivatives, also have limitations in efficacy. Therefore, development of new anticoagulant drugs remains necessary.
The first step in developing anticoagulant drugs is testing their anticoagulant effect, i.e., measuring coagulation time. The commonly accepted clinical measures are Prothrombin Time (PT) and Activated Partial Thromboplastin Time (APTT). These two indicators were jointly established by the International Committee for Standardization in Haematology (ICSH), the International Committee on Thrombosis and Haemostasis (ICTH), and the National Committee for Clinical Laboratory Standards (NCCLS). PT and APTT can replace traditional methods such as Duke bleeding time and slide clotting time as clinical hemostasis function indicators, and they provide better monitoring metrics in the process of anticoagulant drug development.
Normal coagulation occurs in a short time; even with anticoagulants added, it typically does not exceed one minute. Under normal conditions, PT does not exceed 20 seconds, which makes manual measurement difficult. To seek more convenient detection methods, many companies in China and abroad have begun developing automated coagulation time analyzers and have brought products to market, such as TECO's TEChrom IV plus 4-channel semi-automatic thrombus/hemostasis analyzer and BIOCHEM's STAGO fully automated thrombus/hemostasis analyzer. Although these analyzers can test one or several samples simultaneously, they are designed around clinical testing workflows: throughput is limited, required sample volumes are relatively large, the fixed sample holders are inconvenient to clean, and the instruments are expensive, making them unsuitable for drug-development screening.
To improve coagulation measurement performance and meet high-throughput drug screening requirements, we designed a new automated coagulation time detector based on a microcontroller, aiming to facilitate quality control in new drug development. This compact coagulation time measurement device (dimensions 30 cm × 20 cm × 12 cm) supports 96-channel parallel real-time detection, uses small sample volumes (20 μl), and offers high sensitivity (0.1 s).
1 Measurement Principle
From a physical perspective, blood coagulation is the formation of insoluble fibrin, and the amount of insoluble fibrin increases rapidly within a short time. Consequently, the whole blood's transmittance drops quickly (turbidity increases) and then gradually levels off. The coagulation time typically refers to the initiation phase of insoluble fibrin formation, i.e., the time at which the turbidity change first exceeds a defined multiple of the baseline noise level.
The transmission turbidity method leverages the sudden turbidity increase during coagulation. With sufficiently sensitive detectors, it is possible to detect the coagulation time precisely.
2 Detector Design
2.1 Sample Holder Design
The detector uses a standard transparent flat-bottom 96-well plate as the sample holder. Photodiodes and light-emitting diodes (LEDs) are aligned respectively above and below the plate. The sample holder is designed as a drawer: the drawer is pulled out to place the 96-well plate, add coagulation reagent and blood (plasma), then start detection and data acquisition.
2.2 Circuit Design
The voltage signal received from the photodiode often has a small amplitude (several millivolts), so the selection of the light source and detector is critical. Prior to instrument design, several different LEDs and corresponding photodiodes were tested experimentally and analyzed theoretically for their transmitted light response during coagulation.
The current through the diode is given by:
I = Is (e^{VD/VT} - 1)
where I is the current through the photodiode, Is is the reverse saturation current, VD is the voltage across the diode, and VT = kT/q is the thermal voltage equivalent, with k the Boltzmann constant, T the absolute temperature, and q the electronic charge. At 300 K, VT ≈ 26 mV. Under reverse bias, when |VD| is several times larger than VT, I ≈ -Is, where the negative sign indicates reverse current.
Experimental results show that the photodiode reverse current is, within a certain range, proportional to the LED drive voltage. This is because the LED drive voltage is proportional to emitted light intensity during normal LED operation, and the photodiode reverse current is proportional to the absorbed light intensity. From this we can infer:
Assuming the absorbed light intensity at the photodiode is φ, then
|I| = Is = Cφ + m ≈ Cφ
where C and m are factors that vary only with temperature, and m ≈ 0; the magnitude of the reverse current is thus essentially proportional to the absorbed light intensity.
The resistor R in series with the photodiode has a voltage V = IR = CφR, and φ = φ0 + Δφ, V = V0 + ΔV, with V0 = CRφ0 and ΔV = CRΔφ. Here, V0 and φ0 are the voltage across R and the absorbed light intensity of the photodiode before the onset of the coagulation reaction.
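Because C and R are fixed during a measurement run, the relative change in absorbed light intensity can be read directly from the measured voltage: Δφ/φ0 = ΔV/V0, with no need to know C or R. A one-line illustrative helper (names assumed):

```c
/* Relative change in absorbed light intensity, from the voltage across R.
 * Since V = C*R*phi with C and R constant during a run,
 * (V - V0)/V0 = (phi - phi0)/phi0. */
double relative_intensity_change(double v0, double v)
{
    return (v - v0) / v0;
}
```

A drop in transmittance (coagulation onset) therefore appears as a negative value, which is what a thresholding routine would monitor.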