RTD vs Thermocouple vs Thermistor vs Infrared Thermometer

Which is the Best Temperature Sensor for You?

Getting the right temperature sensor is a must in any industry. Settling for the wrong one is like forcing a square peg into a round hole: however hard you try, you're bound to compromise your operation, if not grind it to a halt. When quality results depend on temperature readings, it's paramount you don't settle for less. It goes without saying, therefore, that your desired end result should define what kind of temperature sensor fits your application best.

Take note that operational efficiency, the kind touted by six-sigma black belts, only matters in relation to overall effectiveness. Getting the wrong temperature sensor is not only a blow to operational efficiency but also a point against effectiveness. It's a scenario best avoided, a bomb waiting to explode in your face.

To that end, acquiring fair technical knowledge of each temperature sensor type's characteristics should serve you well. Consider each sensor type a distinct tool: you need to know how each one functions to make the most of it. In the final analysis, you really can't compare a shovel and a hammer, right? Each tool serves a different function. In this light, operational efficiency dictates getting the right tool for the job.

Read on below to find out more about the four most widely used temperature sensors in the world: thermocouples, RTDs, thermistors, and infrared thermometers. And realize how each one can serve your purpose best.

First Principles

Before we do a comparison, let's take a look at how each of the four temperature sensors came to be, and how they work.

In the process, we can apply what the great philosopher Aristotle preached and what Elon Musk, one of the most celebrated engineers of our time, practices: first-principles thinking.

Infrared Thermometer

Also dubbed laser thermometers, infrared thermometers are non-contact sensors that measure the infrared energy emitted by an object, along with its emissivity, to infer its temperature. The "laser" in the name refers to the laser used to zero in on a spot.

Usually handheld, an infrared thermometer operates like a thermal camera but with limited capacity, measuring only a small spot. A thermal camera is more sophisticated and more powerful, measuring a wider area and thereby giving a more detailed temperature profile of a target object. Infrared technology is based on the work of Sir William Herschel, who discovered infrared radiation in 1800.


Thermocouple

A thermocouple is basically an electrical junction formed by two dissimilar metal conductors that produces a temperature-dependent voltage through the Seebeck effect (thermoelectric effect).

The phenomenon was first discovered by Thomas Johann Seebeck in 1821. The German physicist found that when two distinct metals are joined together at the ends and the junctions experience a temperature difference, a magnetic field appears. It was later established that this magnetic field is a product of the thermoelectric current that forms. Measuring the resulting voltage is the key to measuring the temperature. Take note that because different metal alloys generate different voltages, thermocouples are classified according to the various metal combinations.
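To make the voltage-to-temperature idea concrete, here is a minimal sketch, not calibration-grade code: it assumes a single nominal Seebeck coefficient, and the ~41 µV/°C figure is a commonly quoted type-K sensitivity. Real instruments use standardized polynomial reference tables instead of one constant.

```python
# Rough linear model of a thermocouple: V = S * (T_hot - T_cold).
# S is a nominal Seebeck sensitivity; ~41 uV/degC is a commonly quoted
# type-K figure (an approximation -- real devices rely on calibrated
# reference tables, not a single constant).

S_TYPE_K = 41e-6  # volts per degC (nominal, assumed)

def thermocouple_voltage(t_hot_c: float, t_cold_c: float) -> float:
    """Approximate output voltage (V) for a hot/cold junction pair."""
    return S_TYPE_K * (t_hot_c - t_cold_c)

def temperature_from_voltage(v: float, t_cold_c: float) -> float:
    """Invert the linear model: recover the hot-junction temperature."""
    return t_cold_c + v / S_TYPE_K

# A 75 degC difference yields roughly 3 mV of output:
v = thermocouple_voltage(100.0, 25.0)
```

Note that the cold-junction temperature must be known to recover the hot-junction reading, which is why real thermocouple instruments perform cold-junction compensation.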


RTD

An RTD, or resistance temperature detector, also dubbed a resistance thermometer, is a temperature sensor that measures the electrical resistance of a fine wire to determine the temperature of an object. In principle, it relies on the near-linear relationship between a metal's electrical resistance and its temperature.

As RTDs depend on the accuracy of the resistance/temperature relationship, the fine wire used in an RTD is made of a pure material, typically platinum, nickel, or copper. In 1821, the same year that Seebeck discovered the thermoelectric effect, Sir Humphry Davy, a renowned British chemist, announced that the resistance of a metal has a direct relationship with temperature. Half a century later, Sir William Siemens, a German-born British electrical engineer, proposed platinum as an ideal element for RTDs. Platinum RTDs are still used today, among other materials. Platinum RTDs can measure as high as 660°C, while nickel ones are best for temperatures below 300°C.


Thermistor

A thermistor is basically an RTD on steroids. Derived from the words THERMally sensitive resISTOR, it utilizes the same principle as the resistance temperature detector. It was the great English physicist Michael Faraday who, in 1833, discovered the semiconducting behavior of silver sulfide, laying the groundwork for thermistors. Unlike the RTD, which uses metals, a thermistor uses semiconductor materials. As a result, thermistors can detect the slightest change in temperature much better than RTDs.

Basically, there are two types of thermistors: PTC (Positive Temperature Coefficient) and NTC (Negative Temperature Coefficient). It's not unusual for NTC thermistors to display a -3% to -6% resistance change for every 1°C temperature change. The main drawback of thermistors is their temperature range, which is much more limited than that of RTDs.


Top Four Temperature Sensors General Overview

| Temperature Sensor | Discoveries Behind | Greatest Advantage | Common Uses |
| --- | --- | --- | --- |
| Infrared Thermometer | 1800: infrared discovered by Sir William Herschel | Non-contact measurement | Fever detection, car repair |
| Thermocouple | 1821: thermoelectric effect by Thomas Johann Seebeck | Widest temperature range | Thermostat sensors, gas turbine exhaust, flame sensors |
| RTD | 1821: resistance vs. temperature findings by Sir Humphry Davy | Accurate readings | Coffee machines, cell phones, fire detectors |
| Thermistor | 1833: semiconductor effect discovered by Michael Faraday | Most accurate readings | Digital thermometers for automotive |

Best Temperature Sensor Head-to-Head Comparison

As mentioned above, each temperature sensor has its own unique strengths and weaknesses. The best approach, therefore, is to consider your particular application and choose the temperature sensor that serves it best.

The Case for Spot Infrared Thermometer


As their basic operation is the same, infrared thermometers (also dubbed temperature guns, pyrometers, and laser thermometers) are the infrared camera's simpler younger brother. An infrared thermometer consists of a focusing lens, which captures the infrared radiation emitted by an object, and an electronic mechanism that converts that energy into a readable electrical signal. A laser is usually used to aim the thermometer at a particular spot.

Right off the bat, the biggest advantage of a spot infrared thermometer is its ability to get a temperature reading from a distance. Unlike probe-type sensors (e.g., thermocouple, thermistor), infrared thermometers can take a reading without establishing direct contact with the body under observation.

Take note, however, that certain considerations have to be factored in to arrive at a reliable temperature reading. For one, there's the emissivity of the object being studied. Simply put, emissivity is a measure of how effective an object is at emitting thermal radiation. In technical terms:

  • Emissivity is the ratio of the thermal radiation from a particular surface to the thermal radiation of an ideal black surface (a perfect emitter) at the same temperature, per the Stefan-Boltzmann law.
  • As such, emissivity ranges from 0 to 1, with 1 being a perfect emitter and 0 a surface that emits no thermal radiation at all.

An ideal black surface, or black body, is a theoretical object that absorbs all incident electromagnetic radiation, regardless of angle or frequency, and is likewise a perfect emitter of thermal radiation. On the other end of the spectrum is a white body, which reflects and scatters thermal energy in all directions.
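The role emissivity plays can be sketched with the Stefan-Boltzmann law itself. This is a deliberately simplified model: it treats total radiation over all wavelengths and ignores reflected ambient radiation, both of which real instruments account for. It only illustrates why the emissivity setting matters.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_power(t_kelvin: float, emissivity: float) -> float:
    """Total thermal radiation per unit area (W/m^2), Stefan-Boltzmann law."""
    return emissivity * SIGMA * t_kelvin ** 4

def apparent_temperature(power_w_m2: float, emissivity: float) -> float:
    """Invert the law: recover surface temperature from measured power."""
    return (power_w_m2 / (emissivity * SIGMA)) ** 0.25

# A surface at 350 K with emissivity 0.95:
p = radiated_power(350.0, 0.95)
t = apparent_temperature(p, 0.95)   # correct emissivity -> correct reading
```

If the instrument were wrongly set to an emissivity of 1.0 for the same measured power, the apparent temperature would come out lower than the true 350 K, which is exactly the kind of adjustment error the bullet list above warns about.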

Other key considerations in choosing a particular infrared thermometer include:

  • Field of view
  • Temperature range
  • Mounting limitations
  • Response time

Over time, a host of spot infrared thermometer types has surfaced commercially. Some of these are:

  • Handheld infrared thermometers
  • Pocket infrared thermometers
  • Infrared thermocouples

Another key advantage of an infrared thermometer is data integrity. Unlike surface probe sensors, the infrared device does not introduce its own temperature into the object being observed.

Small wonder laser thermometers are best for monitoring moving parts in manufacturing and other continuous-process temperature measurement jobs.

Infrared Thermometer Limitations

A major stumbling block when using an infrared thermometer is the accuracy of the temperature data gathered. Know that a host of factors can negatively affect the integrity of the measurement.

A key consideration in using an infrared thermometer in particular, and infrared thermography in general, is ambient temperature. A dramatic difference between the ambient temperature (the temperature around the measuring device) and the temperature of the object being measured can negatively impact the accuracy of the resulting data.

For instance, if you bring a laser thermometer into a refrigerated space, the readings may be off by as much as 5 to 6 degrees. To counter this and generate a more accurate reading, you will have to acclimatize the infrared thermometer to the new environment. That means leaving the device in that cold space for 20 to 30 minutes before using it, a delay most industrial settings can't afford. Alternatively, you can use an infrared thermometer with a special lens (e.g., a Mica lens) to get accurate results without having to acclimatize the device.

Moreover, another key limitation is that the infrared thermometer is fundamentally a surface temperature sensor. As such, it may not be the best choice if you want to measure the core temperature of an object, such as processed meat, or to measure various points inside a heated room.


Infrared Thermometer Characteristics in a Nutshell


| | IR Thermometer | Probe Type |
| --- | --- | --- |
| Temperature range | -73°C to 5538°C (-100°F to 10,000°F) | Much lower: -270°C to 1704°C (-454°F to 3,100°F) |
| Biggest advantage | Non-contact (best with moving parts) | Highly accurate |
| Biggest limitation | Only measures surface temperature; requires adjustment for surface emissivity; not as accurate as probe type; cannot measure inaccessible places | May interfere with the temperature of the object under study |
| Price consideration | More expensive per unit | |


The Case for Thermocouples


In the probe-type sensor world, thermocouples rule when it comes to temperature range. True, significant technological strides have been made with other sensor probes: RTDs, for one, can now handle temperatures above 400°C, even up to 1000°C. But you sacrifice a lot of accuracy when RTDs go beyond 400°C, which is why some 90% of RTDs today operate below 400°C. Thermistors operate far lower still.

Thermocouples, however, go way higher than RTDs, and thermistors for that matter. It's not uncommon for thermocouples to be used at temperatures up to 2500°C, which blows the rest of the probe-type sensors out of the water.

Another design advantage of thermocouples is single-point measurement. You can zero in on the measuring point down to the very spot where the two distinct metals converge; with exposed ("naked") tips, the temperature of this exact point, however small, can be read. RTDs don't have that luxury.

Then there are cost considerations. More often than not, an RTD will cost you twice or even three times as much as a thermocouple. Thermistors may be cheaper than RTDs, but not as cheap as thermocouples; for one, thermistors need an external power source to function, unlike thermocouples, which are self-powered.

When it comes to price tag, no probe-type sensor comes close to thermocouples. Quite simply, they're the cheapest of the bunch.

Thermocouple Limitations

As a general rule, RTDs are far more accurate than thermocouples. And as thermistors are generally more accurate still, they too trump thermocouples in this department. A good RTD can achieve an accuracy of 0.1°C, and most manage 1°C. Thermocouples, on the other end of the spectrum, typically have an accuracy of about 2°C, quite a large margin if your industry demands precision.

To note, there are two factors that affect probe sensor accuracy:

  • Linearity: As RTDs are based on electrical resistance, the output is essentially linear: a rise in temperature generates a corresponding rise in resistance, and thereby in the output reading, in almost direct proportion. Thermocouples, however, are not proportional; their output follows an "S"-shaped curve.
  • Drift (stability and repeatability): RTD output is repeatable and stays constant over long periods with minimal drift. Thermocouple wires, being inhomogeneous, suffer chemical changes (oxidation) that affect their readings. The overall effect: high drift over time for thermocouples.
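The linearity point can be checked numerically. A minimal sketch, using approximate type-K reference EMF values with a 0°C cold junction; treat the numbers as illustrative rather than calibration data:

```python
# Approximate type-K EMF values (mV) at a few temperatures (degC),
# 0 degC cold junction -- illustrative figures, not calibration data.
TYPE_K_MV = {0: 0.0, 100: 4.096, 500: 20.644, 1000: 41.276}

# If the output were perfectly linear, the 100 degC point would predict
# every other point by simple scaling:
slope = TYPE_K_MV[100] / 100.0          # mV per degC from the first segment

def linear_prediction(t_c: float) -> float:
    return slope * t_c

# Deviation from straight-line behaviour at 500 degC -- non-zero,
# which is the "S"-curve effect described above:
dev_500 = TYPE_K_MV[500] - linear_prediction(500)

# A platinum RTD, by contrast, follows R = R0 * (1 + alpha*T) to a good
# approximation (alpha ~ 0.00385 per degC), so the same check on an RTD
# would come out close to zero.
```

The residual is small in absolute terms but large enough that instruments must linearize thermocouple output with polynomial tables rather than a single slope.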

Thermocouples can also be adversely affected by a host of factors that push readings into large errors. Factors that can influence thermocouple output integrity include:

  • Surface oxidation
  • Ambient temperature of surrounding air
  • Loss of heat via conduction
  • Starting temperature of the probe (thermal inertia)
  • Applied pressure of the probe on the surface

These will have to be considered when your demand for reliable temperature output is high. A concrete example is a two-point thermocouple: being bulky and massive, it may fail because the heat flowing from the surface of the object being measured may not be enough to bring the probe to a temperature that reflects the object's. In short, the thermocouple itself can get in the way of a reliable reading.


The Case for RTDs


At their core, RTDs, also dubbed resistance thermometers, are probe-type sensors that use a metal's predictable resistance/temperature relationship to detect temperature. With their higher accuracy and repeatability, RTDs are steadily gaining traction in the market and replacing thermocouples.

RTDs are PTC (Positive Temperature Coefficient) sensors, meaning their resistance rises in direct proportion to temperature, unlike NTC thermistors. The top metals used in RTDs are nickel and platinum, and the most popular model is the PT100, a platinum element with a resistance of 100 Ω at 0°C.

The greatest advantage of RTDs over the more popular thermocouple is accuracy. Simply put, they are more accurate than mass-market thermocouples. To note, however, there are thermocouples that can go head-to-head with RTDs in terms of accuracy.

Another key benefit of RTDs is better predictability. Because an RTD stays stable over time, its output is highly repeatable compared to a thermocouple's. Even better, RTDs display a near-linear temperature-resistance relationship when detecting temperature, something thermocouples lack.
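That near-linear relationship is standardized for platinum elements: IEC 60751 models a PT100 above 0°C with the Callendar-Van Dusen equation. A minimal sketch:

```python
# Callendar-Van Dusen equation for a PT100 above 0 degC (IEC 60751):
#   R(T) = R0 * (1 + A*T + B*T^2)
R0 = 100.0          # ohms at 0 degC (the "100" in PT100)
A = 3.9083e-3       # standard IEC 60751 coefficient
B = -5.775e-7       # standard IEC 60751 coefficient

def pt100_resistance(t_c: float) -> float:
    """Resistance (ohms) of a PT100 at t_c degC (valid roughly 0..660)."""
    return R0 * (1.0 + A * t_c + B * t_c ** 2)

def pt100_temperature(r_ohms: float) -> float:
    """Invert the quadratic to recover temperature from resistance."""
    # Solve B*T^2 + A*T + (1 - r/R0) = 0, taking the physical root.
    c = 1.0 - r_ohms / R0
    return (-A + (A * A - 4.0 * B * c) ** 0.5) / (2.0 * B)
```

The tiny B term is what makes the curve only *nearly* linear: a PT100 at 100°C reads about 138.5 Ω instead of the 139.1 Ω a perfectly straight line through A would give.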

RTD Limitations

Temperature range is one drawback of RTDs. Compared to thermocouples, which can read as high as 2500°C, RTDs seem limited, reaching only about 650°C. With so vast a difference, small wonder thermocouples remain the undisputed leader among probe-type temperature sensors.

Another key limitation is price. It takes far less to produce a thermocouple than an RTD; as a rule, an RTD costs two to three times as much. As a whole, thermocouples carry a smaller price tag than RTDs.

Then there's response time. While both thermocouples and RTDs respond quickly to temperature changes, thermocouples do so faster.

Thermocouple-RTD Comparison Chart

| Parameter of Comparison | Thermocouple | RTD |
| --- | --- | --- |
| Temperature measuring range | -270°C to +2500°C | -240°C to +650°C |
| Undesirable self-heating | None (self-powered) | Present (excitation current required) |
| Response time | Fast | Medium to Excellent |
| Long-term stability | Prone to drift | Excellent |

The Case for Thermistors


Indeed, both thermistors and RTDs use the resistance-to-temperature relationship to produce a readable temperature output. The major difference lies in the material used. RTDs utilize a pure metal (platinum being the most common), protected inside a probe or sheath; the element can also be embedded in a ceramic substrate.

On the other hand, thermistors are made up of semiconductors or composite materials. Usually, these are metal oxides (e.g., nickel, copper, manganese) together with stabilizers and binding agents.

It's good to note that although both thermistors and RTDs read temperature through a conductor's resistance, there is a major difference in how that resistance behaves. RTDs use the direct relationship between resistance and temperature.

On the other hand, the more popular NTC (Negative Temperature Coefficient) thermistors rely on the inverse relationship of resistance to temperature. Meaning: electrical resistance decreases as temperature increases. And while the temperature-to-resistance relationship is linear for RTDs, thermistors follow an exponential curve. By design, the NTC thermistor is more efficient at detecting the smallest changes in temperature.
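That exponential curve is commonly written with the Beta-parameter model. A minimal sketch, assuming typical datasheet values (R25 = 10 kΩ, B = 3950 K) that are illustrative only:

```python
import math

# Beta-parameter model of an NTC thermistor:
#   R(T) = R25 * exp(B * (1/T - 1/T25)),  temperatures in kelvin.
R25 = 10_000.0   # ohms at 25 degC (typical datasheet value, assumed)
B = 3950.0       # beta constant in kelvin (typical value, assumed)
T25 = 298.15     # 25 degC in kelvin

def ntc_resistance(t_c: float) -> float:
    """Model resistance (ohms) at t_c degC."""
    t_k = t_c + 273.15
    return R25 * math.exp(B * (1.0 / t_k - 1.0 / T25))

def ntc_temperature(r_ohms: float) -> float:
    """Invert the model: temperature in degC from measured resistance."""
    t_k = 1.0 / (1.0 / T25 + math.log(r_ohms / R25) / B)
    return t_k - 273.15

# The steep exponential is what gives thermistors their sensitivity:
pct_per_deg = (ntc_resistance(26.0) / ntc_resistance(25.0) - 1.0) * 100
```

With these assumed values the resistance drops a little over 4% per degree near room temperature, squarely inside the -3% to -6% range quoted earlier for NTC parts.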

As such, the thermistor's greatest strength is its accuracy, which is greater than that of RTDs. Although recent technological advancements have brought the best RTDs close to thermistor accuracy, in general RTDs perform more poorly in this department.

This greater accuracy can be traced to greater sensitivity. RTD sensors show only a small resistance change, a fraction of an ohm, per degree of temperature, while thermistors display tens or even hundreds of ohms of change per degree, depending on the part.
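To put rough numbers on that sensitivity gap, a back-of-the-envelope sketch: the 0.00385/°C PT100 coefficient is the standard figure, while the 10 kΩ / B = 3950 K thermistor values are assumed typical datasheet figures.

```python
# Sensitivity near 25 degC, in ohms of change per degree.

# PT100 RTD: R = 100 * (1 + 0.00385*T), so the slope is constant:
rtd_ohms_per_deg = 100.0 * 0.00385          # ~0.385 ohm/degC

# 10 kOhm NTC thermistor, beta model R(T) = R25*exp(B*(1/T - 1/T25)):
# the derivative is dR/dT = -(B / T^2) * R; evaluate at 25 degC.
R25, B, T25 = 10_000.0, 3950.0, 298.15      # assumed datasheet values
ntc_ohms_per_deg = abs(-(B / T25 ** 2) * R25)

# The thermistor's slope is on the order of a thousand times steeper:
ratio = ntc_ohms_per_deg / rtd_ohms_per_deg
```

That steepness is exactly why a modest measurement circuit can resolve tiny temperature changes with a thermistor, and why its usable range is so narrow.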

Another strong point of thermistors is their price. As a rule, thermistors are way cheaper than RTDs.

Thermistor Limitations

The greatest drawback of thermistors is their very limited range. An RTD reaching 600°C is nothing unusual, but at their best, standard thermistors only reach about 130°C; beyond that, they are rendered ineffective. That said, there are newer thermistor models designed to go beyond the normal temperature limits.

The temperature limits of thermistors can be traced to their design. Thanks to the pronounced non-linearity of NTC resistors, getting beyond 130°C is a challenge; for one, the resulting drift is larger than that of RTDs.

As a rule, since it's cheaper and more sensitive, choosing a thermistor over an RTD is wise, provided its temperature range suits the job. In practice, thermistors are often used in home applications due to their limited range, while RTDs are fielded in industry for detecting higher temperatures.

One caveat, however: while thermistors are in general less expensive than RTDs, thermistors designed for an extended temperature range are usually pricier than run-of-the-mill RTDs.

Thermistor-RTD Comparison Chart

| Parameter for Comparison | Thermistor | RTD (Resistance Temperature Detector) |
| --- | --- | --- |
| Temperature range | -60°C to 130°C | -230°C to 660°C |
| Accuracy | More accurate | Less accurate |
| Response time | Fast | Slower |
| Material | Metal-oxide semiconductors | Pure metals (nickel, copper, platinum) |



In Conclusion: The Best Temperature Sensor


First things first: before you choose your temperature sensor, you need to assess the object to be measured and its environment. In other words, you need to do your due diligence. If not, you could waste a lot of time, not to mention precious dollars, on a temperature sensor that underdelivers. Or worse, over-delivers.

Again, there is no such thing as the best temperature sensor overall. As industrial and home applications vary, there is only the sensor most fitting for the job at hand.

As such, here are some of the key elements that should influence your choice of the best temperature sensor for your particular application.


    Infrared thermometers are best for spotting temperature abnormalities that signal a system breakdown or flaw. As such, they're best for moving parts, in manufacturing or in any industry: anomalies in machines can be detected even while they're in operation. Even better, you don't put anybody in harm's way, there's no risk of contamination or of any mechanical effect on the object's surface, and you don't interrupt the process just to get a temperature.

    In hindsight, this is what made the infrared laser thermometer the go-to tool for fever detection in light of the COVID-19 pandemic. Call it the IR thermometer's finest hour. And with a temperature range of -73°C to 5538°C (-100°F to 10,000°F), these non-contact sensors can definitely help a slew of industries. Indeed, they're a shot in the arm.

    But some jobs require physical contact with the surface of the object under observation. A classic example is probing the core of meat in food processing applications. Or, for that matter, measuring the heat in a hot oven.

    In those cases, probe-type sensors are vital; they have their place under the sun. For instance, bedbug extermination, which can require temperatures as high as 160°F, needs contact sensors. Continuous temperature monitoring is also possible with contact sensors, and you can even configure Wi-Fi or Bluetooth connectivity to record the needed data continuously.

    The question now is which probe-type sensor is best for your application. For your convenience, here's a comparison table of the top three contact sensors in the world.

    Thermocouple vs. RTD vs Thermistor Chart

    | Sensor type | Thermocouple | RTD | Thermistor |
    | --- | --- | --- | --- |
    | Accuracy (typical) | 0.5 to 5°C | 0.1 to 1°C | 0.05 to 1.5°C |
    | Long-term stability @ 100°C | Prone to drift | Excellent | Good |
    | Temperature range (typical) | -200 to 1750°C | -200 to 650°C | -100 to 325°C |
    | Power required | None (self-powered) | Constant voltage or current | Constant voltage or current |
    | Susceptibility to electrical noise | Susceptible; needs cold-junction compensation | Rarely susceptible | Rarely susceptible; high resistance only |
    | Response time | Fast: 0.10 to 10 s | Generally slow: 1 to 50 s | 0.12 to 10 s |
    | Linearity | Non-linear ("S" curve) | Fairly linear | Exponential |
    | Cost (typical) | Low | High | Low to moderate |

    The good thing about doing your due diligence is that it becomes easier to decide which temperature sensor fits your application best. There's no other way around it: when you do your homework, getting the best temperature sensor should be a walk in the park.
