Big O notation


sam_cit

Hi everyone,

I'm sorry that this is not specific to the C language as such, but since
I'm currently programming in C, I'm posting it here.

I've read a lot about Big O notation, which describes the complexity of
an algorithm, but I find it confusing and difficult to determine in some
cases. Can anyone post a good link or give suggestions on how to work it
out?
 

Malcolm

> Hi everyone,
>
> I'm sorry that this is not specific to the C language as such, but since
> I'm currently programming in C, I'm posting it here.
>
> I've read a lot about Big O notation, which describes the complexity of
> an algorithm, but I find it confusing and difficult to determine in some
> cases. Can anyone post a good link or give suggestions on how to work it
> out?

Try comp.programming.

Big O notation is not confusing, but it can be made to appear confusing if
you give a student tricky examples and ask him to determine the order of
the algorithm.
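
For most code, the order of a simple loop nest can be read straight off its
structure. A minimal illustrative sketch (the two functions below are
arbitrary examples, not anything discussed in this thread):

#include <stdio.h>
#include <stddef.h>

/* One pass over n elements: the body runs n times, so this is O(n). */
long sum(const int *a, size_t n)
{
    long total = 0;
    for (size_t i = 0; i < n; i++)
        total += a[i];
    return total;
}

/* Two nested passes: the inner body runs about n*n/2 times, so O(n^2). */
size_t count_equal_pairs(const int *a, size_t n)
{
    size_t count = 0;
    for (size_t i = 0; i < n; i++)
        for (size_t j = i + 1; j < n; j++)
            if (a[i] == a[j])
                count++;
    return count;
}

int main(void)
{
    int a[] = { 3, 1, 4, 1, 5, 9, 2, 6 };
    size_t n = sizeof a / sizeof a[0];

    printf("sum = %ld, equal pairs = %zu\n", sum(a, n), count_equal_pairs(a, n));
    return 0;
}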
 

Keith Thompson

> I'm sorry that this is not specific to the C language as such, but since
> I'm currently programming in C, I'm posting it here.

If you were currently riding motorcycles, would you post to
rec.motorcycles?

Try comp.programming.
 

David T. Ashley

> Hi everyone,
>
> I'm sorry that this is not specific to the C language as such, but since
> I'm currently programming in C, I'm posting it here.
>
> I've read a lot about Big O notation, which describes the complexity of
> an algorithm, but I find it confusing and difficult to determine in some
> cases. Can anyone post a good link or give suggestions on how to work it
> out?

This may help:

http://en.wikipedia.org/wiki/Big_O_Notation

In general, the notation gives an upper bound on how the number of steps or
operations required by an algorithm grows as some parameter of interest
(typically the number of data records) increases.

The mathematical notion will quickly identify algorithms (such as sort
algorithms) that don't scale up well. An algorithm that is O(N**2) may work
great on your computer when you use 100 records, but you may have a surprise
coming if you try to perform the operation on a billion records.

In order of decreasing desirability, we have:

O(log N)
O(N)
O(N log N)
O(N**2)
O(N**q), q>2
O(e**N)

The notions aren't that hard to apply for most algorithms. For example, for
the "rock" or "bubble" sort, it is easy to show that up to

(N) + (N-1) + (N-2) + ... + (N-N+2) + (N-N+1)

exchanges are required. I'm sure one can prove with a little algebra that a
tight upper bound on the number of exchanges is N**2.

Disclaimer: The above expression looks wrong somehow. I don't warrant the
math.
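
To make the scaling concrete, here is a rough sketch of a bubble sort
instrumented with an exchange counter (the counter and the test sizes are
illustrative, not part of the discussion above). Feeding it reverse-sorted
input, the worst case, shows the count growing roughly a hundredfold for
each tenfold increase in N, as O(N**2) predicts:

#include <stdio.h>
#include <stdlib.h>

/* Bubble sort that counts exchanges so the growth rate can be observed. */
static unsigned long bubble_sort(int *a, size_t n)
{
    unsigned long exchanges = 0;

    for (size_t i = 0; i + 1 < n; i++) {
        for (size_t j = 0; j + 1 < n - i; j++) {
            if (a[j] > a[j + 1]) {
                int tmp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = tmp;
                exchanges++;
            }
        }
    }
    return exchanges;
}

int main(void)
{
    size_t sizes[] = { 100, 1000, 10000 };

    for (size_t k = 0; k < sizeof sizes / sizeof sizes[0]; k++) {
        size_t n = sizes[k];
        int *a = malloc(n * sizeof *a);
        if (a == NULL)
            return EXIT_FAILURE;

        /* Reverse-sorted input: every comparison triggers an exchange. */
        for (size_t i = 0; i < n; i++)
            a[i] = (int)(n - i);

        printf("N = %5zu: %lu exchanges\n", n, bubble_sort(a, n));
        free(a);
    }
    return 0;
}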
 

Richard Heathfield

David T. Ashley said:

> (N) + (N-1) + (N-2) + ... + (N-N+2) + (N-N+1)
>
> exchanges are required. I'm sure one can prove with a little algebra that
> a tight upper bound on the number of exchanges is N**2.

Rearrange in two rows, each half as long, like so:

N + N-1 + N-2 + ...
N-N+1 + N-N+2 + N-N+3

Now add the columns:

N + N-1 + N-2 + ...
N-N+1 + N-N+2 + N-N+3
---------------------
N+1 + N+1 + N+1 + ...

There are N/2 columns, each summing to N+1, so the total number of
exchanges is N(N+1)/2, which is (N^2 + N) / 2.

So the worst case is O((N^2+N)/2).

We can get rid of the 2 by buying a computer twice as fast, which gives us
O(N^2+N). If N is really, really small, say 10 or less, we can ignore it as
being too small to bother with, and if it's really, really large, say, 11
or more, then it is so dwarfed by N^2 that we can ignore it as being too
small to bother with.

So yes, it's O(N^2).
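
A quick way to see the N being dwarfed is to print the exact count
(N^2 + N)/2 next to N^2 as N grows; the ratio settles toward the constant
factor 1/2, which Big O discards anyway. A rough sketch (the sizes below
are arbitrary):

#include <stdio.h>

int main(void)
{
    /* Compare the exact count (N^2 + N)/2 with the dominant N^2 term. */
    for (double n = 10.0; n <= 1000000.0; n *= 10.0) {
        double exact = (n * n + n) / 2.0;
        double dominant = n * n;

        printf("N = %9.0f: (N^2 + N)/2 = %16.0f  ratio to N^2 = %.6f\n",
               n, exact, exact / dominant);
    }
    return 0;
}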
 

onkar

There is an excellent book which explains the concepts.

Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford
Stein. Introduction to Algorithms, Second Edition. MIT Press and
McGraw-Hill, 2001. ISBN 0-262-03293-7. Section 3.1: Asymptotic
notation, pp.41-50.
 

onkar

It simply means that there is a function f(n) whose value never exceeds a
constant multiple of another function g(n) for all n greater than or equal
to some n0.

The above statement is written as f(n) = O(g(n));
note that '=' here is in fact "membership of a set".
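
Written out in full, the standard definition (essentially as given in the
Cormen et al. book recommended earlier in the thread) is:

  O(g(n)) = { f(n) : there exist constants c > 0 and n0 > 0 such that
              0 <= f(n) <= c*g(n) for all n >= n0 }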

hope that helps,
Onkar
 

Default User

onkar said:
> It simply means that there is a function f(n) whose value never exceeds a
> constant multiple of another function g(n) for all n greater than or equal
> to some n0.

Please don't top-post. Your replies belong following or interspersed
with properly trimmed quotes. See the majority of other posts in the
newsgroup, or:
<http://www.caliburn.nl/topposting.html>
 

David T. Ashley

onkar said:
> There is an excellent book which explains the concepts.
>
> Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford
> Stein. Introduction to Algorithms, Second Edition. MIT Press and
> McGraw-Hill, 2001. ISBN 0-262-03293-7. Section 3.1: Asymptotic
> notation, pp.41-50.

Ron Rivest ... doesn't he keep giving us those cryptographic hashes that
keep getting compromised (i.e. MD4, MD5)?
 

Richard Heathfield

David T. Ashley said:
> Ron Rivest ... doesn't he keep giving us those cryptographic hashes that
> keep getting compromised (i.e. MD4, MD5)?

<shrug> Can you explain to him how to fix them?
 
