What is mV on a Multimeter?

Multimeters are essential tools for anyone working with electrical equipment. They allow us to measure various electrical properties, including voltage, current, and resistance. But what does the “mV” setting on a multimeter actually mean? Let’s dive into it!

Understanding Multimeter Symbols

Before we get into millivolts (mV), let’s quickly review the main parts of a multimeter:

  1. Display: This is where you see the measurements displayed. Depending on your multimeter type, the display may show readings such as voltage, current, resistance, and more.
  2. Buttons and Sockets: These vary depending on the type of multimeter. Some have buttons for selecting different functions, while others have sockets for plugging in probes or accessories.
  3. Dial/Rotary Switch: Here, you choose the unit of measurement (volts, ohms, etc.) for accurate readings.
  4. Input Jacks/Ports: These are where you insert the test leads – insulated wires that connect the multimeter to the object you’re testing.
  5. Range Selector: On manual-ranging meters, the dial or a range button lets you step between volts and millivolts for finer readings.

Decoding Multimeter Symbols

Now, let’s focus on the “mV” symbol: it stands for millivolts, or one-thousandth of a volt. Most multimeters mark “mV~” for AC millivolts and “mV—” for DC millivolts. Selecting this range tells the meter to expect very small voltages and display them with finer resolution.

When to Use mV on a Multimeter

  1. Low AC Voltage Measurements: If you’re dealing with extremely low AC voltage, the “mV~” setting helps the multimeter recognize this and provides a more accurate reading. For example, when troubleshooting sensitive electronic components or measuring low-level signals in the millivolt range, this setting is handy.
  2. Diode Testing: Some multimeters use the “mV” setting for diode testing. Diodes allow current flow in one direction only, and this function helps check their functionality.
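To make the diode-test idea concrete, here is a minimal sketch in Python that classifies a forward-voltage reading taken in millivolts. The thresholds are typical values for silicon diodes (roughly 500–800 mV forward drop) and are illustrative assumptions, not specifications from any particular multimeter:

```python
def interpret_diode_reading(forward_mv: float) -> str:
    """Classify a diode-test reading (in mV) using typical silicon thresholds."""
    if forward_mv < 100:
        # Near-zero drop in both directions usually means a shorted diode.
        return "possibly shorted"
    if 500 <= forward_mv <= 800:
        # A healthy silicon diode drops roughly 0.5-0.8 V when forward biased.
        return "normal forward drop (silicon)"
    # Very high or out-of-range readings suggest reversed leads or an open diode.
    return "check lead polarity or an open diode"

print(interpret_diode_reading(650))  # normal forward drop (silicon)
```

A germanium or Schottky diode would need lower thresholds (around 200–400 mV), so adjust the bands for the part you are testing.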

How to Use the mV Setting

  1. Select the mV Range: Turn the dial to the “mV~” or “mV—” setting, depending on whether you’re measuring AC or DC millivolts.
  2. Connect the Test Leads: Insert the test leads into the appropriate input jacks. The red lead goes into the positive jack, and the black lead goes into the negative jack.
  3. Measure the Voltage: Touch the test leads to the points where you want to measure the voltage. Read the value displayed on the multimeter.

Difference Between Volts and Millivolts

  • Volt (V): The volt is the standard unit of electric potential difference or voltage. It represents the force that pushes electric charges through a circuit. One volt is defined as the potential difference across a conductor when one ampere of current flows through it, dissipating one watt of power. In short, it’s the “big” unit of voltage.
  • Millivolt (mV): The millivolt is a subunit of the volt. The prefix “milli” means one-thousandth. Therefore, one millivolt (mV) is equal to one-thousandth of a volt (1 mV = 0.001 V). It’s commonly used for measuring very small voltage differences, especially in sensitive electronic circuits.
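Since the two units differ only by a factor of 1,000, converting between them is simple arithmetic. A minimal sketch in Python (the function names are illustrative):

```python
def volts_to_millivolts(volts: float) -> float:
    """Convert volts to millivolts (1 V = 1000 mV)."""
    return volts * 1000.0

def millivolts_to_volts(millivolts: float) -> float:
    """Convert millivolts to volts (1 mV = 0.001 V)."""
    return millivolts / 1000.0

print(volts_to_millivolts(0.25))  # 250.0 (mV)
print(millivolts_to_volts(700))   # 0.7 (V)
```

This is the same conversion a multimeter performs internally when you switch between the V and mV ranges: only the displayed scale changes, not the measured quantity.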


Millivolts play a crucial role in precise measurements, especially when dealing with low-voltage circuits. Understanding the “mV” setting on your multimeter ensures accurate readings and safe usage. So next time you encounter tiny voltage levels, remember to switch to the “mV” range!
