Time | Metric (SI)

Microsecond

Symbol: μs (used worldwide)

1 μs = 0.000001 s = 0.001 ms = 1,000 ns

What is a Microsecond (μs)?

Formal Definition

The microsecond (symbol: μs) is a unit of time equal to one millionth (10⁻⁶) of a second, or one thousandth of a millisecond. The prefix "micro-" comes from Greek "mikros" (small). One second contains exactly 1,000,000 microseconds.
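The defining relations above can be checked with a few lines of plain arithmetic (a minimal sketch, no assumptions beyond the stated definitions):

```python
# Conversion constants straight from the definition of the microsecond.
US_PER_S = 1_000_000   # microseconds per second
US_PER_MS = 1_000      # microseconds per millisecond
NS_PER_US = 1_000      # nanoseconds per microsecond

def seconds_to_us(seconds: float) -> float:
    """Convert a duration in seconds to microseconds."""
    return seconds * US_PER_S

print(seconds_to_us(1))      # one second in microseconds
print(seconds_to_us(0.001))  # one millisecond in microseconds
```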

The microsecond is the timescale of modern computing operations, high-speed communications, and many physical processes. A modern CPU running at several gigahertz executes thousands of instructions in one microsecond. Radar pulses, ultrasonic echoes, and many chemical reactions occur on microsecond timescales.

Physical Scale

In one microsecond, light travels approximately 300 meters — about three football fields. Sound travels about 0.34 millimeters. An electrical signal in copper wire travels about 200 meters. These distances define the physical limits of computing and communication system design.
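The distances above follow directly from speed × time. A short sketch using the approximate speeds quoted in the text (the copper figure is the common ~2/3 c rule of thumb):

```python
# Distance covered in one microsecond at each propagation speed.
ONE_US = 1e-6  # one microsecond, in seconds

speeds_m_per_s = {
    "light in vacuum": 299_792_458,  # exact, by definition of the meter
    "signal in copper": 2e8,         # roughly 2/3 the speed of light
    "sound in air": 343,             # at about 20 °C
}

for medium, v in speeds_m_per_s.items():
    print(f"{medium}: {v * ONE_US:.4g} m per microsecond")
```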

Etymology

Greek Prefix

The prefix "micro-" comes from Greek "μικρός" (mikros), meaning small. It was adopted as an SI prefix in 1960, denoting a factor of 10⁻⁶. The symbol "μ" is the Greek letter mu.

History

Electronic Era

The microsecond became practically measurable with the development of electronic oscilloscopes in the 1940s and 1950s. Early digital computers of the 1950s-1960s (UNIVAC, IBM 704) had instruction cycle times measured in microseconds. By the 1970s, microprocessors pushed cycle times below the microsecond into nanoseconds, but the microsecond remained important as the timescale for memory access, I/O operations, and communication protocols.

Modern Relevance

Today, the microsecond is the critical timescale for data center operations, high-frequency trading, and real-time control systems. SSD storage access times are measured in microseconds. In-memory database lookups target single-digit microseconds. 5G cellular networks aim for air-interface latency of around one millisecond, a budget in which every microsecond counts.

Current Use

Computing

CPU operations take nanoseconds, but higher-level operations are measured in microseconds: a main-memory access on a cache miss takes about 0.1 μs, SSD reads take 10-100 μs, and network round trips take 100-1,000 μs. In-memory database lookups take 1-10 μs.
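A minimal sketch of how such latencies are measured in practice, using Python's nanosecond-resolution clock; the dictionary lookup here is just a stand-in for an in-memory operation:

```python
import time

# Time a batch of in-memory lookups and report the average in microseconds.
# Batching is necessary because a single lookup is far shorter than the
# timer's practical resolution.
table = {i: i * i for i in range(100_000)}

iterations = 1_000
t0 = time.perf_counter_ns()
for _ in range(iterations):
    _ = table[54_321]
elapsed_ns = time.perf_counter_ns() - t0

print(f"avg lookup: {elapsed_ns / iterations / 1_000:.3f} us")
```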

Financial Trading

High-frequency trading systems measure execution latency in microseconds. The fastest exchanges match orders in single-digit microseconds. Co-located trading servers are placed meters from exchange matching engines to minimize microsecond-level latency.
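The co-location argument reduces to propagation delay. A sketch, assuming the common rule of thumb that signals in optical fiber travel at roughly 2/3 the speed of light (the distances below are illustrative, not from the text):

```python
# One-way propagation delay over a fiber run of a given length.
C_FIBER_M_PER_S = 2.0e8  # approximate signal speed in fiber, ~2/3 c

def fiber_delay_us(distance_m: float) -> float:
    """One-way signal delay in microseconds for a fiber run in meters."""
    return distance_m / C_FIBER_M_PER_S * 1e6

print(f"{fiber_delay_us(100):.2f} us for 100 m (co-located)")
print(f"{fiber_delay_us(40_000):.0f} us for 40 km (across a metro area)")
```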

Radar and Sonar

Radar pulse widths are typically 1-100 μs. The round-trip time for a radar pulse to reach a target 150 m away and return is about 1 μs. Ultrasound imaging uses pulse-echo timing in microseconds to construct images of internal body structures.
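The 1 μs / 150 m figure comes from the pulse-echo relation range = c·t/2 (the echo delay covers the round trip), which can be checked directly:

```python
# Pulse-echo ranging: convert an echo delay into a target range.
C = 299_792_458.0  # speed of light in vacuum, m/s

def radar_range_m(echo_delay_us: float) -> float:
    """Target range in meters for a given round-trip echo delay."""
    return C * (echo_delay_us * 1e-6) / 2

print(f"{radar_range_m(1.0):.1f} m for a 1 us echo delay")
```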

Everyday Use

Camera Flash Duration

Photographic strobe lights have flash durations of 100-1000 μs, enabling the freezing of fast motion. Studio strobes at minimum power can produce flashes as short as 100 μs.

USB Communication

USB polling intervals operate at 125 μs for high-speed devices (8,000 polls per second). This determines how quickly USB peripherals can respond to commands.

Audio Sampling

CD-quality audio samples at 44,100 times per second, meaning each sample represents about 22.7 μs of audio. At 96 kHz high-resolution audio, each sample is about 10.4 μs apart.
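The sample periods quoted above are simply the reciprocal of the sample rate, converted to microseconds:

```python
# Sample period in microseconds for a given audio sample rate.
def sample_period_us(rate_hz: float) -> float:
    return 1e6 / rate_hz

print(f"44.1 kHz: {sample_period_us(44_100):.1f} us per sample")
print(f"96 kHz:   {sample_period_us(96_000):.1f} us per sample")
```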

In Science & Industry

Laser Physics

Q-switched lasers produce pulses lasting nanoseconds to microseconds, with peak powers reaching megawatts. These pulses are used in laser surgery, material processing, and LIDAR distance measurement.

Nuclear Physics

Many radioactive isotopes have half-lives measured in microseconds, making the microsecond important for nuclear timing circuits and particle detector readout systems. Muons, fundamental particles produced in cosmic ray interactions, have a mean lifetime of 2.2 μs.
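The 2.2 μs mean lifetime implies exponential decay: after time t, a fraction exp(−t/τ) of muons survives. A sketch of this, plus the relativistic time dilation that lets cosmic-ray muons reach the ground (the Lorentz factor below is an illustrative value, not from the text):

```python
import math

TAU_US = 2.2  # mean muon lifetime at rest, in microseconds

def surviving_fraction(t_us: float, tau_us: float = TAU_US) -> float:
    """Fraction of muons surviving after t_us microseconds."""
    return math.exp(-t_us / tau_us)

# After one mean lifetime, about 1/e (~37%) of muons remain.
print(f"{surviving_fraction(2.2):.3f} surviving after one mean lifetime")

# A relativistic muon's lifetime is dilated by the Lorentz factor gamma.
gamma = 20  # illustrative value for a fast cosmic-ray muon
print(f"dilated lifetime: {gamma * TAU_US:.0f} us in the Earth frame")
```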

Interesting Facts

1

In one microsecond, light travels about 300 meters — the length of three football fields. This physical limit constrains the design of high-speed computing systems.

2

The muon, a fundamental particle, has a mean lifetime of about 2.197 microseconds. Despite this brief existence, muon experiments have provided crucial tests of Einstein's special relativity.

3

High-frequency traders spend millions of dollars to reduce latency by microseconds. A 1-μs advantage on a high-volume trade can generate significant profit over millions of transactions.

4

The fastest individual transistors can switch in well under 0.001 μs (1 nanosecond), but complete real-world circuit operations typically take 0.1-10 μs once signal propagation and processing overhead are included.

5

A hummingbird's heart beats about once every 120,000 μs (500 beats per minute) at rest, and once every 50,000 μs (1,200 BPM) during flight.

6

Modern DRAM memory access takes about 0.05-0.1 μs (50-100 ns) for a random read, which is why CPU caches (which take 0.001-0.01 μs) are so important for performance.

Conversion Table

Unit | Value
Second (s) | 0.000001
Millisecond (ms) | 0.001
Nanosecond (ns) | 1,000


Frequently Asked Questions

How many microseconds are in a second?
Exactly 1,000,000. One microsecond = 10⁻⁶ seconds = 0.000001 seconds.
How many microseconds in a millisecond?
Exactly 1,000 microseconds in one millisecond.
What happens in one microsecond?
Light travels 300 m, a CPU executes thousands of instructions, a radar pulse travels to a 150 m target and back. It's far too fast for human perception.
Why are microseconds important in trading?
High-frequency trading systems execute millions of trades. A 1 μs speed advantage per trade accumulates to significant profits over time.
What is the symbol for microsecond?
μs (Greek mu + s). In plain text, 'us' is sometimes used. Never 'Ms' (which means megasecond).
How do microseconds relate to computer performance?
CPU cache: 0.001-0.01 μs. RAM: 0.05-0.1 μs. SSD: 10-100 μs. Network: 100-1000 μs. These latencies determine system performance.