📏 in to μm — Inch to Micrometer Converter

Convert length and distance units — meters, feet, inches, kilometers, miles, light years and more.

Formula: 1 in = 25400 μm

Unit | Value
0.001 in | 25.4 μm
0.01 in | 254 μm
0.1 in | 2540 μm
1 in | 25400 μm
5 in | 127000 μm
10 in | 254000 μm
50 in | 1.27e+06 μm
100 in | 2.54e+06 μm
1000 in | 2.54e+07 μm

How to convert Inch to Micrometer

Multiply the number of inches by 25400 to get micrometers. Formula: μm = in × 25400. Example: 10 in × 25400 = 254000 μm. To reverse, divide micrometers by 25400 to get inches.
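The formula is a single multiplication by an exact factor. A minimal sketch in Python (the function and constant names are illustrative, not part of any converter library):

```python
# Conversion factor: 1 inch = 25400 micrometers exactly, since the inch
# was fixed at 25.4 mm by the 1959 International Yard and Pound Agreement.
UM_PER_INCH = 25400

def inches_to_micrometers(inches: float) -> float:
    """Convert a length in inches to micrometers: um = in * 25400."""
    return inches * UM_PER_INCH

print(inches_to_micrometers(10))  # 10 in -> 254000 um
```

Because the factor is exact, no rounding is introduced by the forward conversion itself.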

Worked examples

Example 1
1 in × 25400 = 25400 μm
1 inch equals 25400 micrometers.
Example 2
5 in × 25400 = 127000 μm
5 inches equals 127000 micrometers.
Example 3
10 in × 25400 = 254000 μm
10 inches equals 254000 micrometers.
Example 4 — reverse
1 μm = 3.93701e-05 in
To convert back from micrometers to inches, divide by 25400, or use the swap button above.
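The reverse conversion in Example 4 is just the division. A small sketch (names are illustrative) that also checks the round trip:

```python
UM_PER_INch = None  # placeholder removed below
UM_PER_INCH = 25400

def micrometers_to_inches(um: float) -> float:
    """Convert micrometers back to inches: in = um / 25400."""
    return um / UM_PER_INCH

# 1 um is approximately 3.93701e-05 in, matching Example 4.
print(micrometers_to_inches(1))
# Round trip: 5 in -> 127000 um -> 5 in.
print(micrometers_to_inches(5 * UM_PER_INCH))  # 5.0
```

Note that the reciprocal factor 3.93701e-05 is rounded; dividing by 25400 avoids that rounding error.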

Inch to Micrometer — reference table

Inch (in) | Micrometer (μm)
0.001 in | 25.4 μm
0.01 in | 254 μm
0.1 in | 2540 μm
0.5 in | 12700 μm
1 in | 25400 μm
2 in | 50800 μm
5 in | 127000 μm
10 in | 254000 μm
20 in | 508000 μm
50 in | 1270000 μm
100 in | 2540000 μm
250 in | 6350000 μm
500 in | 12700000 μm
1000 in | 25400000 μm
10000 in | 254000000 μm

Quick conversion tips

1. Multiply by 25400
To convert inches to micrometers, multiply by 25400. Example: 10 in = 254000 μm.

2. Reverse: divide by 25400
To convert micrometers back to inches, divide by 25400 (or multiply by 3.93701e-05). Use the swap button above.

3. Round number check
Start with 100 inches = 2540000 μm as your reference point. Scale up or down from there.

Where inch to micrometer conversion is used

Precision machining tolerances

US engineering drawings specify part dimensions in inches while ISO tolerancing standards use micrometres. Machinists and quality engineers convert between inch dimensions and micrometre tolerances for every precision component.

Surface finish measurement

Surface roughness (Ra) is measured in microinches (μin) in the US and micrometres internationally. Metrology engineers convert between the two systems for every surface finish specification and quality control report.

Semiconductor packaging

IC package dimensions use inches for overall size while bond wire diameters and die attach layer thicknesses use micrometres — semiconductor packaging engineers convert between both scales in every package design document.

Medical device tolerances

US medical device blueprints specify overall device dimensions in inches while critical fit tolerances — catheter wall thickness, stent strut width, implant surface finish — are specified in micrometres.

Optical fibre manufacturing

Fibre optic cable outer diameters are specified in inches while core and cladding dimensions — single-mode core: 9 μm, multimode: 50–62.5 μm — use micrometres. Cable engineers convert between both scales in product specifications.

Aerosol and filter science

HEPA and ULPA filter media thicknesses are specified in inches for the filter housing while particle retention sizes use micrometres (HEPA: 0.3 μm). Filter engineers convert between the two scales in every filter specification.

Frequently asked questions

How many micrometers are in an inch?
1 inch equals 25400 micrometers. Multiply any inch value by 25400 to get micrometers.

How many micrometers are in 10 inches?
10 inches equals 254000 micrometers. (10 × 25400 = 254000)

How many micrometers are in 100 inches?
100 inches equals 2540000 micrometers. (100 × 25400 = 2540000)

How do I convert micrometers back to inches?
Divide micrometers by 25400, or multiply by 3.93701e-05. Use the swap button on the converter above for instant reverse conversion.

What is the formula for inch to micrometer conversion?
Formula: μm = in × 25400. Example: 5 in × 25400 = 127000 μm.

Is Unitafy free to use?
Yes, Unitafy is completely free. No signup, no ads, and no data sent to any server. All calculations run in your browser.

Does the converter work offline?
Yes. Once loaded, the converter works without internet. Install Unitafy to your home screen as a PWA for the best offline experience.

About Inch and Micrometer

Inch (in)

The inch is a unit of length (symbol: in), with 1 in = 25400 μm exactly. It is used in both scientific and everyday length measurement.

Micrometer (μm)

The micrometer, or micrometre (symbol: μm), is a unit of length equal to one millionth of a metre. It belongs to the SI, the internationally recognised measurement system, and is commonly used alongside the inch in precision work.

History & origin

The inch has one of the most colourful origin stories in measurement history. An English statute from 1324 under King Edward II defined it as 'three grains of barley, dry and round, placed end to end'. Before that, it was often defined as the width of a thumb — hence the word in many languages (French: 'pouce', Dutch: 'duim', both meaning thumb). The inch was standardised at exactly 25.4 mm in 1959 under the International Yard and Pound Agreement signed by the US, UK, Canada, Australia, and South Africa. It remains dominant in the US and is used worldwide for screen sizes.

The micrometre was named in 1879 by the International Committee for Weights and Measures — from the Greek 'mikros' (small) combined with 'metre'. The micrometer screw gauge was first described by William Gascoigne in the 1630s, though the modern calliper was developed in the 1840s by Jean-Louis Palmer in France. It became essential as precision engineering demanded a unit between the millimetre and nanometre.

Common use: Inch to Micrometer conversion is needed when working with international standards, scientific publications, or reference materials that use different unit systems for Length measurement.