Big O Notation

What Does Big O Notation Mean?

Big O notation is a mathematical tool for assessing algorithm efficiency. It is commonly used to describe how a program's resource requirements, such as running time or memory, grow relative to the size of its input.
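As a minimal sketch (these function names are illustrative, not drawn from any particular library), the following Python functions take constant, linear, and quadratic time in the length of their input, which is how their efficiency would be stated in big O terms:

def first_element(items):
    # O(1): a single step, regardless of how long the list is
    return items[0]

def contains(items, target):
    # O(n): in the worst case, every element is examined once
    for item in items:
        if item == target:
            return True
    return False

def has_duplicate(items):
    # O(n^2): in the worst case, every pair of elements is compared
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False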


Big O notation is also known as Bachmann–Landau notation, after the mathematicians Paul Bachmann and Edmund Landau who introduced it, or as asymptotic notation.

Techopedia Explains Big O Notation

Essentially, big O notation helps engineers estimate resource needs as a program scales. By expressing running time and space requirements as a function of input size, it classifies algorithms by how their costs grow: O(n), for example, means the cost grows linearly with input size, while O(n²) means it grows quadratically. Plotting these growth curves against different input sizes gives engineers a visual picture of how an algorithm will behave as its inputs get larger.
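One rough way to gather data for such a graph (a sketch assuming the linear-time contains function shown earlier; exact timings will vary by machine) is to time the function at several input sizes and print the results:

import time

def contains(items, target):
    for item in items:
        if item == target:
            return True
    return False

for n in (1_000_000, 2_000_000, 4_000_000):
    data = list(range(n))
    start = time.perf_counter()
    contains(data, -1)  # worst case: the target is absent
    elapsed = time.perf_counter() - start
    print(f"n = {n:>9,}: {elapsed:.4f} s")

Because contains is O(n), doubling the input size should roughly double the measured time.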

Big O notation originated in mathematics and is also used in other fields to describe the asymptotic growth of functions. Despite its many applications, the notation rests on a simple formal definition, sketched below, that bounds the growth of one function by another.
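For reference, the standard textbook definition (included here as a sketch, not quoted from this article) can be written in LaTeX as:

% f is O(g) when some constant multiple of g eventually bounds f from above
f(n) = O\bigl(g(n)\bigr) \iff \exists\, c > 0,\ \exists\, n_0:\ |f(n)| \le c\,|g(n)| \text{ for all } n \ge n_0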


