
Bit per Second

Symbol: bps (also bit/s). Used worldwide.


What is a Bit per Second (bps)?

Formal Definition

The bit per second (symbol: bps or bit/s) is the fundamental unit of data transfer rate (also called bit rate or data rate) in digital communications and computing. It represents the number of bits — binary digits, each having a value of 0 or 1 — transmitted or processed per second. One bit per second means that exactly one binary digit is transferred in one second. The unit measures the capacity or throughput of a communication channel, network link, or data processing system.

The bit per second is derived from two base concepts: the bit, defined by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication" as the fundamental unit of information, and the second, the SI base unit of time. While the bit itself is not an SI unit, the bit per second is universally used in telecommunications, networking, and computer science as the standard measure of data transfer speed.

Relationship to Bandwidth and Throughput

In networking, it is important to distinguish between bandwidth (the maximum theoretical data rate of a channel), throughput (the actual achieved data rate), and goodput (the useful data rate excluding protocol overhead). All three are measured in bits per second or its multiples. A 100 Mbps Ethernet connection has a bandwidth of 100 million bits per second, but the actual throughput may be lower due to protocol overhead, collisions, and other factors. The goodput — the rate at which useful application data is delivered — is typically lower still, often 90-95% of throughput for TCP connections under favorable conditions.
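The gap between these three rates can be sketched numerically. The 94% link efficiency and 95% goodput ratio below are illustrative assumptions for a TCP connection under favorable conditions, not measured values:

```python
# Illustrative sketch: bandwidth vs. throughput vs. goodput.
# The efficiency figures are assumed ballpark values, not measurements.

def effective_rates(bandwidth_bps, link_efficiency=0.94, goodput_ratio=0.95):
    """Return (throughput, goodput) in bits per second."""
    throughput = bandwidth_bps * link_efficiency  # after protocol overhead
    goodput = throughput * goodput_ratio          # useful application data only
    return throughput, goodput

throughput, goodput = effective_rates(100_000_000)  # 100 Mbps link
print(f"throughput: {throughput / 1e6:.1f} Mbps")   # 94.0 Mbps
print(f"goodput:    {goodput / 1e6:.1f} Mbps")      # 89.3 Mbps
```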

Etymology

The Origin of "Bit"

The word "bit" is a portmanteau of "binary digit," coined by John Tukey at Bell Laboratories in 1947. Claude Shannon popularized the term in his groundbreaking 1948 paper, where he established the mathematical foundations of information theory. Shannon credited Tukey with the coinage. The concept of binary representation predates the term by centuries — Gottfried Wilhelm Leibniz described binary arithmetic in 1703 — but the formalization of the bit as a unit of information was Shannon's contribution.

From Baud to Bits Per Second

Before "bits per second" became the standard unit for data rate, the term "baud" (named after Émile Baudot, the French telegraph engineer who invented the Baudot code in 1870) was commonly used. Baud measures the number of symbol changes per second on a communication channel. In early telegraphy and modem communications, where each symbol carried exactly one bit, baud and bits per second were identical. However, as modulation techniques advanced and each symbol began carrying multiple bits, the distinction became critical: a 2,400-baud modem using 16-QAM modulation carries 4 bits per symbol, yielding 9,600 bits per second.

The shift from baud to bits per second as the preferred consumer-facing unit occurred in the 1990s as Internet access became mainstream. Modem speeds were marketed in bps (or kbps), and the term became embedded in public consciousness. Today, "baud" is used primarily by telecommunications engineers discussing physical-layer signaling, while "bits per second" and its multiples are the universal currency of data rate specification.

Precise Definition

Base Unit Definition

One bit per second is defined as the transmission or reception of one binary digit (bit) per second of time. The unit is straightforward and requires no physical standard or reference artifact — it is a count of discrete events (bit transmissions) per unit of time.

Decimal vs. Binary Prefixes

When combined with metric prefixes, bits per second uses decimal (SI) prefixes exclusively in networking and telecommunications:

- 1 kbps (kilobit per second) = 1,000 bps
- 1 Mbps (megabit per second) = 1,000,000 bps
- 1 Gbps (gigabit per second) = 1,000,000,000 bps
- 1 Tbps (terabit per second) = 1,000,000,000,000 bps

This is in contrast to units of data storage, where binary prefixes (kibi-, mebi-, gibi-) are sometimes used to denote powers of 1,024. In data transfer contexts, the prefixes always represent powers of 1,000. There is no ambiguity: 1 Mbps always means exactly 1,000,000 bits per second in networking.

Relationship to Bytes

One byte consists of 8 bits. Therefore, 8 bps = 1 byte per second (B/s). This 8:1 ratio is a frequent source of confusion: an Internet connection rated at 100 Mbps (megabits per second) has a theoretical maximum transfer rate of 12.5 MB/s (megabytes per second). The lowercase "b" denotes bits, while the uppercase "B" denotes bytes — a convention that is critically important but often overlooked in casual usage.
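A minimal sketch of the 8:1 conversion described above:

```python
def mbps_to_mbytes_per_s(mbps):
    """Convert megabits per second to megabytes per second (1 byte = 8 bits)."""
    return mbps / 8

def mbytes_to_mbits_per_s(mb_per_s):
    """Convert megabytes per second to megabits per second."""
    return mb_per_s * 8

print(mbps_to_mbytes_per_s(100))  # 100 Mbps connection -> 12.5 MB/s
```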

History

Telegraph and Early Communications

The concept of measuring information transfer rate predates the formal definition of the bit. In the 1830s and 1840s, telegraph operators and engineers implicitly measured data rates in words per minute or characters per minute. Samuel Morse's telegraph system of the 1840s could transmit roughly 10 to 15 words per minute, which translates to approximately 10 to 20 bits per second using modern encoding. The Baudot code, developed in 1870, transmitted 5-bit characters and operated at speeds up to 30 words per minute on well-maintained lines.

Shannon's Information Theory

The formalization of the bit as a unit of information by Claude Shannon in 1948 provided the theoretical framework for measuring data rates. Shannon's channel capacity theorem established the maximum rate at which information can be reliably transmitted over a noisy channel: C = B × log₂(1 + S/N), where C is the capacity in bits per second, B is the bandwidth in hertz, and S/N is the signal-to-noise ratio. This theorem, known as the Shannon-Hartley theorem, remains the fundamental limit governing all digital communications.
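The theorem can be evaluated for roughly telephone-grade parameters. The ~3.1 kHz bandwidth and 35 dB SNR used below are assumed, illustrative values; the resulting capacity of roughly 36 kbps is consistent with late-generation dial-up modems approaching the limit.

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley theorem: C = B * log2(1 + S/N), with S/N given in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed telephone-grade channel: ~3.1 kHz bandwidth, ~35 dB SNR.
capacity = shannon_capacity(3100, 35)
print(f"{capacity:.0f} bps")  # roughly 36 kbps
```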

The Modem Era

The practical importance of bits per second exploded with the development of modems for computer communications. The Bell 103 modem (1962) operated at 300 bps. Subsequent generations increased speeds dramatically: 1,200 bps (1977), 2,400 bps (1984), 9,600 bps (1990), 14,400 bps (1991), 28,800 bps (1994), 33,600 bps (1996), and finally 56,000 bps (1998) with V.90 modems. Each generation pushed closer to Shannon's theoretical limit for telephone-grade lines.

Broadband and Beyond

The transition from dial-up to broadband in the late 1990s and 2000s moved common data rates from kilobits to megabits per second. DSL and cable modems initially offered 256 kbps to 1.5 Mbps, then scaled to tens and hundreds of megabits. Fiber optic connections brought gigabit speeds to consumers starting around 2010. Today, backbone network links operate at 100 Gbps to 400 Gbps per wavelength, and research systems have demonstrated throughputs exceeding 1 petabit per second (10¹⁵ bps) over fiber optic cables.

Current Use

Internet and Broadband

Bits per second and its multiples are the universal language of Internet speed. Internet service providers (ISPs) advertise connection speeds in Mbps or Gbps: typical residential broadband ranges from 25 Mbps (the US FCC's minimum broadband definition as of 2024) to 1 Gbps or more for fiber connections. Speed test services like Ookla's Speedtest report results in Mbps. Streaming video services specify minimum requirements in Mbps: standard definition requires about 3-5 Mbps, HD requires 5-25 Mbps, and 4K UHD requires 25-50 Mbps depending on the codec.

Networking Equipment

All networking equipment is rated in bits per second. Ethernet standards define speeds at 10 Mbps (10BASE-T), 100 Mbps (Fast Ethernet), 1 Gbps (Gigabit Ethernet), 10 Gbps, 25 Gbps, 40 Gbps, 100 Gbps, and 400 Gbps. Wi-Fi generations are similarly characterized: Wi-Fi 4 (802.11n) supports up to 600 Mbps, Wi-Fi 5 (802.11ac) up to 3.5 Gbps, Wi-Fi 6 (802.11ax) up to 9.6 Gbps, and Wi-Fi 7 (802.11be) up to 46 Gbps — though real-world speeds are invariably lower.

Telecommunications

Mobile network generations are defined by their peak data rates. 3G (HSPA+) offered up to 42 Mbps, 4G LTE up to 300 Mbps (LTE-Advanced up to 1 Gbps), and 5G targets peak rates of 20 Gbps with typical user-experienced rates of 100-300 Mbps. Satellite internet services like Starlink deliver 25-200 Mbps to residential customers. Submarine fiber optic cables forming the Internet's backbone carry multiple terabits per second across ocean floors.

Media and Streaming

Digital media encoding relies on bit rates. Audio codecs specify quality in kbps: MP3 at 128 kbps (standard quality) to 320 kbps (high quality), AAC at 96-256 kbps. Lossless audio (FLAC, ALAC) typically runs 800-1,400 kbps. Video bit rates are higher: H.264 at 5-20 Mbps for HD, H.265/HEVC at 3-15 Mbps for equivalent quality, and AV1 achieving similar quality at 2-10 Mbps.

Everyday Use

Internet Speed Tests

The most common encounter with bits per second in daily life is the Internet speed test. When running a speed test on a smartphone or computer, results are reported in Mbps — download speed, upload speed, and sometimes latency. A result of "150 Mbps download / 20 Mbps upload" means the connection can receive 150 million bits per second and send 20 million bits per second. Understanding these numbers helps consumers evaluate their ISP's performance and choose appropriate plans for their usage patterns.

Streaming and Downloads

Bit rate directly affects streaming quality. When a streaming service like Netflix or YouTube automatically adjusts video quality, it is adapting the bit rate to match available bandwidth. Viewers might notice this as a resolution change: a sudden shift from crisp HD to blocky low resolution indicates the available bps has dropped below the required threshold. Music streaming services offer quality tiers defined by bit rate: Spotify's "Normal" is 96 kbps, "High" is 160 kbps, and "Very High" is 320 kbps.

File Transfer Times

Bits per second determine how long file transfers take. To estimate download time, divide the file size in bits by the connection speed in bps. A 4 GB movie (32 gigabits) on a 100 Mbps connection takes approximately 320 seconds (about 5.3 minutes) under ideal conditions. This calculation explains why a "fast" gigabit connection can download the same file in about 32 seconds. Understanding the bit-byte distinction is essential: file sizes are in bytes (B) while speeds are in bits (b), so you must multiply file size by 8 or divide speed by 8 when comparing.
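The arithmetic above can be wrapped in a small helper (decimal prefixes throughout; ideal conditions assumed, with no protocol overhead):

```python
def download_seconds(file_size_gb, speed_mbps):
    """Ideal transfer time for a file of the given size at the given speed."""
    file_size_megabits = file_size_gb * 8 * 1000  # GB -> gigabits -> megabits
    return file_size_megabits / speed_mbps

print(download_seconds(4, 100))   # 4 GB movie on 100 Mbps -> 320.0 s
print(download_seconds(4, 1000))  # same file on 1 Gbps -> 32.0 s
```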

Gaming and Video Calls

Online gaming requires relatively low bandwidth — typically 3-6 Mbps — but demands consistent, low-latency connections. Video conferencing is more bandwidth-intensive: a standard Zoom call uses 1.5-3 Mbps, while a group call can require 3-8 Mbps. When multiple household members stream, game, and video-call simultaneously, total bandwidth requirements add up, making connection speed in Mbps a practical household concern.

In Science & Industry

Information Theory

In information theory, the bit per second serves as the fundamental measure of channel capacity — the maximum rate at which information can be reliably communicated. Shannon's channel capacity theorem (C = B log₂(1 + S/N)) expresses the theoretical maximum in bits per second as a function of bandwidth and signal-to-noise ratio. This theorem has guided the design of every digital communication system since 1948. Modern coding techniques like turbo codes and LDPC (Low-Density Parity-Check) codes approach within fractions of a decibel of the Shannon limit, achieving near-theoretical-maximum bps over real channels.

Network Performance Research

Research in computer networking relies heavily on precise measurement of data rates in bits per second. Network researchers measure throughput, latency, jitter, and packet loss to characterize network performance. Tools like iperf3 measure TCP and UDP throughput in bps between endpoints. Academic papers on network protocols, congestion control algorithms, and routing optimizations report their results in bits per second or its multiples, allowing direct comparison across studies and implementations.

Signal Processing and Telecommunications

In digital signal processing, bit rate is linked to sampling rate and bit depth. The Nyquist theorem dictates that a signal must be sampled at twice its highest frequency to be accurately represented. CD-quality audio (44.1 kHz sampling, 16-bit depth, 2 channels) produces a raw data rate of 1,411,200 bps (1.4112 Mbps). Understanding the relationship between analog signal parameters and digital bit rates is fundamental to audio engineering, video compression, and telecommunications system design.
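The CD-audio figure is simply the product of sampling rate, bit depth, and channel count:

```python
def pcm_bit_rate(sample_rate_hz, bit_depth, channels):
    """Raw (uncompressed) PCM data rate in bits per second."""
    return sample_rate_hz * bit_depth * channels

print(pcm_bit_rate(44_100, 16, 2))  # CD audio -> 1,411,200 bps
```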

Multiples & Submultiples

Name                  Symbol   Factor
Bit per second        bps      1
Kilobit per second    kbps     1,000
Megabit per second    Mbps     1,000,000
Gigabit per second    Gbps     1,000,000,000
Terabit per second    Tbps     1,000,000,000,000

Interesting Facts

1. Claude Shannon's 1948 paper, which formalized the bit as a unit of information, is widely considered the founding document of the Information Age. Shannon proved that reliable communication is possible at any rate below the channel capacity in bps, no matter how noisy the channel — a result that surprised many contemporary engineers.

2. The first transatlantic telegraph cable, completed in 1858, operated at approximately 0.1 bits per second — so slow that Queen Victoria's 98-word congratulatory message to President Buchanan took over 16 hours to transmit. Modern transatlantic fiber optic cables carry over 200 terabits per second.

3. A single human neuron can fire at a maximum rate of about 1,000 times per second, carrying roughly 1,000 bits per second of information. The entire human optic nerve transmits approximately 10 million bits per second from each eye to the brain.

4. The fastest Internet speed ever recorded in a laboratory setting exceeded 1.8 petabits per second (1.8 × 10¹⁵ bps) over a single optical fiber, achieved by researchers in 2024 — enough to transfer the entire Netflix library in less than one second.

5. A 56K modem could not actually achieve 56 kbps in practice. FCC power regulations capped downstream rates at about 53 kbps, upstream was limited to 33.6 kbps, and the theoretical 56 kbps downstream was only possible under ideal line conditions. Most users experienced 40-53 kbps at best.

6. The Voyager 1 spacecraft, now over 15 billion miles from Earth, transmits data at approximately 160 bits per second — slower than a 1960s-era modem. At this rate, a single smartphone photo would take over 8 hours to transmit.

Regional Variations

Global Standard

The bit per second and its decimal multiples (kbps, Mbps, Gbps) are used universally worldwide. Unlike many physical units that have regional variants or competing systems, data transfer rates in bits per second represent a truly global standard. The International Telecommunication Union (ITU), IEEE, and all major standards bodies use bits per second as the base unit for data rate specification.

Notation Variations

While the unit itself is universal, notation conventions vary slightly. The IEEE and most international standards use "bit/s" as the formal symbol. In common usage, "bps" is the most widespread abbreviation. Some publications use "b/s." For multiples, the lowercase "b" is critical: "Mb/s" or "Mbps" means megabits per second, while "MB/s" means megabytes per second — an eightfold difference. Unfortunately, this distinction is frequently lost in consumer marketing and journalism, leading to widespread confusion.

Marketing and Advertising

Internet service providers worldwide advertise speeds in bits per second, though the prefixes and typical values vary by market. In South Korea and Japan, multi-gigabit residential connections are common. In the United States and Europe, 100 Mbps to 1 Gbps is typical for broadband. In developing regions, connections of 5-25 Mbps may be considered adequate. ISPs universally advertise the "up to" maximum speed in Mbps, which may differ substantially from the actual throughput experienced by users.

Bits vs. Bytes Confusion

The single greatest source of regional (and universal) confusion regarding data rates is the bit/byte distinction. File sizes are conventionally expressed in bytes (KB, MB, GB), while network speeds are in bits (kbps, Mbps, Gbps). A user with a 100 Mbps connection downloading a 1 GB file may expect it to take 10 seconds, when in fact it takes approximately 80 seconds (8 gigabits ÷ 100 Mbps). Some European ISPs have begun advertising speeds in MB/s to reduce this confusion, but the practice remains inconsistent.

Conversion Table

Unit                          Value
Kilobit per Second (kbps)     0.001
Byte per Second (B/s)         0.125
Kilobyte per Second (KB/s)    0.000125
Megabit per Second (Mbps)     0.000001


Frequently Asked Questions

What is the difference between bits per second and bytes per second?
One byte equals 8 bits. Therefore, 1 byte per second (B/s) equals 8 bits per second (bps). Network speeds are typically advertised in bits per second (Mbps), while file sizes are in bytes (MB). To convert Mbps to MB/s, divide by 8. A 100 Mbps connection delivers a maximum of 12.5 MB/s.
Why are Internet speeds measured in bits instead of bytes?
The convention comes from telecommunications, where data is transmitted serially — one bit at a time over a wire or radio wave. Network protocols, modems, and serial communications have always counted individual bits, making bits per second the natural unit. Bytes (groups of 8 bits) are used for storage because computer memory is organized in byte-addressable chunks.
How many bits per second do I need for streaming video?
It depends on quality: standard definition (480p) requires about 3-5 Mbps, HD (1080p) needs 5-25 Mbps depending on the codec, and 4K UHD requires 25-50 Mbps. Netflix recommends 5 Mbps for HD and 25 Mbps for Ultra HD. These are per-stream requirements — multiple simultaneous streams require proportionally more bandwidth.
What was the fastest Internet speed ever achieved?
Laboratory demonstrations have exceeded 1.8 petabits per second (1,800,000,000 Mbps) over a single optical fiber. Commercially deployed submarine cables carry hundreds of terabits per second. The fastest residential connections available to consumers are typically 10 Gbps (10,000 Mbps) in select cities.
How do I convert bps to Mbps?
Divide the value in bps by 1,000,000 (one million). For example, 50,000,000 bps = 50 Mbps. In networking, prefixes always use decimal (powers of 1,000): 1 kbps = 1,000 bps, 1 Mbps = 1,000,000 bps, 1 Gbps = 1,000,000,000 bps.
What is the difference between bandwidth and speed?
Bandwidth is the maximum theoretical data rate of a connection, measured in bps. 'Speed' in everyday usage typically refers to throughput — the actual data rate achieved, which is always lower than bandwidth due to protocol overhead, latency, congestion, and other factors. A 1 Gbps Ethernet link (bandwidth) might deliver 900 Mbps of actual throughput.
How long does it take to download a 1 GB file at various speeds?
Convert 1 GB to bits: 1 GB = 8 Gb = 8,000 Mb. At 10 Mbps: about 800 seconds (13.3 minutes). At 50 Mbps: about 160 seconds (2.7 minutes). At 100 Mbps: about 80 seconds. At 1 Gbps: about 8 seconds. Real-world times are typically 10-30% longer due to protocol overhead.
What is the difference between bps and baud?
Baud measures the number of signal changes (symbols) per second on a communication line, while bps measures the number of bits transferred per second. When each symbol carries exactly one bit, they are equal. With advanced modulation — such as 256-QAM used in cable modems — each symbol carries 8 bits, so the bps rate is 8 times the baud rate.