13 Aug 2010
Maximising transducer measurement accuracy
Claude Gudel, of LEM SA, Switzerland explains how to take transducer technology to the limits of current measurement accuracy.
The development of magnetic resonance imaging (MRI) has led to an increased ability to diagnose and, subsequently, treat a growing number of physical conditions at the cellular level, perhaps most notably cancer. As a diagnostic methodology, MRI continues to evolve, but that evolution has long been tied to developments in the underlying technology, not least the techniques used to acquire the images.
Although the profile of MRI scanning has been rising since the early 1970s, the phenomenon that enables it was first observed in the mid-1940s. It was around this time that two independent research groups, at Harvard and Stanford universities, both discovered what was to become known as Nuclear Magnetic Resonance. Shortly after, Dr Bernard Rollin, working at Oxford University in the UK, built what must be one of the earliest examples of an NMR spectrometer. Further discoveries made in the early 1950s led to the development of high resolution NMR spectroscopy, when it was recognised as a potentially useful tool in the fields of chemistry and biochemistry. Efforts to increase the resolution of images eventually saw its application in diagnostic medicine, and MRI scanning began its own development on a path parallel to NMR.
Beyond the iconic image of a horizontal platform large enough to carry a patient, sliding into a larger, circular machine resembling a huge inductor, it isn't immediately obvious just how MRI scans are carried out.
Detection of magnetic fields
A key element of NMR/MRI spectrometry is the detection of the small magnetic fields generated as hydrogen nuclei in soft tissue realign themselves. The nuclei are first knocked out of alignment by exposure to a stronger magnetic field; as they relax, the speed at which they realign depends on the structure and condition of the surrounding tissue, and the resolution with which the much smaller magnetic fields they generate can be detected determines the overall resolution of the instrument.
Crucial to any MRI scanner's efficacy is the level of excitation generated by the magnetic field, so controlling this field is as critical as detecting the resulting realignment. Many companies now develop MRI scanners, several of them household names, but interestingly they rely heavily on other, less well known specialists to develop and supply the sensors that enable these machines.
One such company is LEM, a leading provider of innovative and high quality solutions for measuring electrical parameters. As MRI scanners have become more widely used, a need to improve their resolution has also developed. This can only be achieved through careful and precise regulation of the magnetic fields, which in turn depends greatly on the ability to measure and control the currents used to generate them.
For some time, the technology used in this application was based on Hall Effect current transducers, but these have significant limitations here, particularly in their precision. LEM was approached by a customer in this field who needed a new kind of current transducer, one offering much greater precision than anything already available. It took LEM around seven months to adapt an existing technology to meet the customer's demand, and the current transducer it developed now offers the highest performance available on the open market.
The solution developed by LEM, known as the ITL 900, can be described as a double fluxgate closed loop transducer, but it may be more useful to compare its operation against the more commonly found Hall Effect technologies.
The Hall Effect was discovered in 1879 by the American physicist Edwin Herbert Hall at Johns Hopkins University in Baltimore. It arises from the Lorentz force, F = q(v × B), which acts on charges moving through a magnetic flux density. A control current flows through a very thin semiconductor plate placed in the field. The mobile charge carriers of the control current are deflected as the external magnetic flux density, B, generates a Lorentz force perpendicular to the direction of current flow. This deflection causes charge carriers to accumulate on one side of the conductor, creating a potential difference across it, referred to as the Hall voltage.
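As a rough illustration, the magnitude of the Hall voltage for a thin plate follows the standard textbook relation VH = I·B / (n·q·t). The sketch below uses invented, order-of-magnitude values for a doped semiconductor plate; none of the figures come from any LEM device.

```python
# Illustrative calculation of the Hall voltage across a thin plate.
# V_H = I * B / (n * q * t) is the standard textbook result; all the
# material and drive values below are assumed for illustration only.

q = 1.602e-19   # elementary charge (C)
n = 1.0e22      # carrier density (carriers per m^3), assumed
t = 0.1e-3      # plate thickness (m), assumed: 0.1 mm
I = 5.0e-3      # control current through the plate (A), assumed
B = 0.1         # external magnetic flux density (T), assumed

V_H = I * B / (n * q * t)  # Hall voltage across the plate (V)
print(f"Hall voltage: {V_H * 1e3:.3f} mV")
```

The low carrier density of a semiconductor is what makes the effect usable: with the same drive current, a metal plate (n around 10^28 per m^3) would produce a signal roughly a million times smaller.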
Certain elements of the Hall Effect, specifically the Hall constant and the offset voltage of the Hall element, are temperature dependent. Any current transducer using the Hall Effect therefore needs to provide temperature compensation.
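A minimal sketch of what such compensation involves is a first-order correction of both drifting quantities against a measured temperature. The coefficients and reference values below are assumed, illustrative numbers, not the characteristics of any specific device.

```python
# First-order temperature compensation of a Hall element (sketch).
# Both the sensitivity and the offset are modelled as drifting linearly
# with temperature; all constants are assumed for illustration.

T_REF = 25.0          # reference temperature (deg C)
SENS_REF = 50.0e-3    # sensitivity at T_REF (V per tesla), assumed
TC_SENS = -0.001      # fractional sensitivity drift per deg C, assumed
V_OFF_REF = 2.0e-3    # offset voltage at T_REF (V), assumed
TC_OFF = 10.0e-6      # offset drift (V per deg C), assumed

def compensated_flux_density(v_hall, temperature):
    """Correct a raw Hall voltage for temperature, then convert to flux density (T)."""
    offset = V_OFF_REF + TC_OFF * (temperature - T_REF)
    sensitivity = SENS_REF * (1.0 + TC_SENS * (temperature - T_REF))
    return (v_hall - offset) / sensitivity

# At the reference temperature the correction reduces to (v - offset) / sensitivity:
print(compensated_flux_density(7.0e-3, 25.0))  # (7 mV - 2 mV) / 50 mV/T = 0.1 T
```

In practice the compensation network is implemented in the transducer's analogue electronics rather than in software, but the underlying correction has the same linear form.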
Hall effect implementation
The simplest practical implementation of the Hall Effect is an open loop transducer, which provides the smallest, lightest and lowest-cost current measurement solution, while also having very low power consumption.
As shown above, the transducer is formed of a current-carrying conductor creating a magnetic field. The field is concentrated by a magnetic core, which is cut to create an air gap. Within the air gap, a Hall element senses the magnetic flux density. The control current and differential amplification are applied electronically, with the components normally integrated within the transducer. Within the linear region of the B-H loop of the material used for the magnetic circuit, the magnetic flux density, B, remains proportional to the primary current, Ip, and the Hall voltage, VH, is proportional to the flux density. The output of the Hall element is therefore proportional to the primary current, plus the offset Hall voltage, Vo.
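The linear transfer relation described above can be sketched as Vout = S·Ip + Vo, which the measuring system inverts to recover the primary current. The sensitivity S and offset Vo below are assumed calibration constants for illustration, not values from any datasheet.

```python
# Sketch of the open-loop transfer relation: within the linear region of
# the B-H loop, the amplified output is V_out = S * Ip + Vo, so the
# primary current is recovered by inverting that straight line.
# S and V_O are assumed calibration constants.

S = 0.04    # overall sensitivity (V per A), assumed
V_O = 0.01  # residual offset voltage Vo (V), assumed

def primary_current(v_out):
    """Invert the linear transfer function to estimate the primary current Ip (A)."""
    return (v_out - V_O) / S

print(primary_current(2.01))  # (2.01 V - 0.01 V) / 0.04 V/A = 50.0 A
```

The sketch also makes the open-loop topology's weakness visible: any uncompensated drift in Vo or S appears directly as an error in the recovered current, which is precisely the limitation that closed loop and fluxgate designs such as the ITL 900 address.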
About the author
Claude Gudel gained his Bachelor's degree from ISEA (Institut des Sciences Exactes et Appliquees) in Mulhouse, France, and another from the Faculte des Sciences et Techniques de Besancon, France, where he then gained a Masters degree. He has been with LEM since graduating and is now head of the Electronics, Magnetics and Prototypes section of LEM R&D.
LEM is the global leader in providing innovative and high quality solutions for measuring electrical parameters. Its current and voltage transducers are used in a broad range of applications in industrial, traction, energy & automation and automotive markets. LEM is a high-growth global company with approximately 1,000 employees worldwide. It has production plants in Geneva (Switzerland), Copenhagen (Denmark), Machida (Japan) and Beijing (China), and regional sales offices close to its customers' locations. LEM has been listed on the SIX Swiss Exchange since 1986; the company's ticker symbol is LEHN.