[ANN] ncache 0.9

George Moschovitis

n/cache
=======

A collection of useful caching methods. This release introduces a
simple yet efficient LRU cache implementation. The following example
demonstrates the use of an LRU cache with a maximum of 10 items:


require "n/cache"

class Dummy
  include N::LRUCache::Item
end

item = Dummy.new
cache = N::LRUCache.new(10)  # a maximum of 10 items
cache["the-key"] = item
p cache["the-key"].class


For more examples consult the included test case.

If you've found a bug, or have questions, suggestions, or ideas
regarding this release, please contact the author at (e-mail address removed).
All valuable contributions will be integrated into a future release.

This code is licensed under the same license as Ruby. To get the latest
version and more open-source releases by Navel, please visit
http://www.navel.gr/open-source.



(c) 2004 Navel, all rights reserved.
 
Brian Schröder

George said:
> [full announcement snipped; see above]


Hello George,

I don't have the time to check it out now, but I would be interested in
why it is necessary to extend the cached classes. What functionality
could the cached data need in order to make the cache more efficient?

Regards,

Brian
 
George Moschovitis

The cached items are stored in a linked list to facilitate a
constant(*)-time LRU algorithm. I could use a separate wrapper object
to avoid 'polluting' the objects to be cached, but I decided that the
extra object creation (and the related GC strain) is not justified.

Please have a look at the (very clean) source, and if you have a better
idea let me know!

regards,
George Moschovitis



(*) if you consider a Ruby Hash lookup constant.
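[Editor's note: the scheme George describes might be sketched roughly as follows. This is a hypothetical simplification for illustration, not the actual n/cache source; the names MixinLRU and LRUItem are made up.]

```ruby
# Cached objects carry their own doubly-linked-list links via a mixin,
# so a cache hit can move an item to the front in O(1) with no wrapper
# allocation. Hypothetical sketch, not the real n/cache code.
module LRUItem
  attr_accessor :lru_next, :lru_prev, :lru_key
end

class MixinLRU
  def initialize(max_items)
    @max_items = max_items
    @map = {}            # key => item; a Hash lookup ("constant" time)
    @head = @tail = nil  # head = most recent, tail = least recent
  end

  def []=(key, item)
    item.lru_key = key
    @map[key] = item
    push_front(item)
    evict_tail if @map.size > @max_items
  end

  def [](key)
    item = @map[key] or return nil
    unlink(item)
    push_front(item)     # a hit makes the item most recently used
    item
  end

  private

  def push_front(item)
    item.lru_prev = nil
    item.lru_next = @head
    @head.lru_prev = item if @head
    @head = item
    @tail ||= item
  end

  def unlink(item)
    item.lru_prev.lru_next = item.lru_next if item.lru_prev
    item.lru_next.lru_prev = item.lru_prev if item.lru_next
    @head = item.lru_next if @head.equal?(item)
    @tail = item.lru_prev if @tail.equal?(item)
  end

  def evict_tail
    item = @tail
    unlink(item)
    @map.delete(item.lru_key)  # this is what lru_key is needed for
  end
end
```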
 
Robert Klemme

George Moschovitis said:
> The cached items are stored in a linked list to facilitate a
> constant(*)-time LRU algorithm. I could use a separate wrapper object
> to avoid 'polluting' the objects to be cached, but I decided that the
> extra object creation (and the related GC strain) is not justified.

I beg to differ. Modifying objects that are put into any kind of
container to implement the container's functionality is a major design
sin. All sorts of unforeseen problems can arise from this that are
difficult to track down, since nobody expects it (if it's done
automatically). You're likely to run into name clashes and the like. No
container implementation I know of does this.

If you want to avoid the object allocation (which is perfectly reasonable
for performance reasons), you had better look for other ways to do it.
Using an Array as the list storage comes to mind. It's not too difficult
to move elements around in an Array (look at Array#slice!), and I'd guess
it would be rather fast. Still, I recommend first implementing the LRU
cache straightforwardly and then measuring the performance of alternatives
against that "default" implementation.

Btw, another option is to recycle list elements in local storage.
Usually it's sufficient to cache a single spare linked-list element,
since the size of the LRU cache is bounded, and once you reach the limit
you only exchange elements in the cache.

Kind regards

robert
 
George Moschovitis

Robert,

first of all, thank you for your comments!

> difficult to track down since nobody expects this (if it's done
> automatically)

This is not done automatically; the programmer explicitly
includes a Mixin.

> You're likely to run into name clashes and such

Only lru_next/lru_prev/lru_key are added to the class to be cached;
I think name clashes are very unlikely.

> Using an Array as list storage comes to mind.

This was the original method used, but in order to make it efficient
I had to support lookup by pointer AND lookup by the caching key, so
I switched to a custom structure. If you can adapt the code to use an
Array and no mixin, I will be impressed :)

Your last idea is rather interesting, but the gain (no mixin) does not
justify the complexity. I use this cache in a simple fragment-caching
scheme for web applications, and it seems perfectly natural to
write code like:

class Fragment
  include N::LRUCache::Item
  attr_accessor :body, :last_modified, :expires
  ...
end

Hope this is useful to other people as well.
best regards,
George Moschovitis
 
R

Robert Klemme

George Moschovitis said:
> Robert,
>
> first of all, thank you for your comments!
>
> This is not done automatically; the programmer explicitly
> includes a Mixin.
>
> Only lru_next/lru_prev/lru_key are added to the class to be cached;
> I think name clashes are very unlikely.

I think otherwise. But this is a free country. :)
> This was the original method used, but in order to make it efficient
> I had to support lookup by pointer AND lookup by the caching key, so
> I switched to a custom structure. If you can adapt the code to use an
> Array and no mixin, I will be impressed :)

Yeah, true. Drop the array suggestion.
> Your last idea is rather interesting, but the gain (no mixin) does not
> justify the complexity.

What do you mean by complexity? Storing a single linked list element is
not really something I'd call "complex".
> I use this cache in a simple fragment-caching scheme for web
> applications, and it seems perfectly natural to write code like:
>
> class Fragment
>   include N::LRUCache::Item
>   attr_accessor :body, :last_modified, :expires
>   ...
> end

*I* would not find this natural. If I were using an LRU cache, I'd only
expect to have to provide keys that satisfy the same conditions as Hash
keys (immutable, proper #hash, #eql?, etc.) and nothing more. Everything
else makes things unnecessarily complicated.

A simple calculation might explain this: if you have the choice between
increasing the complexity of a framework or standard library and
increasing the complexity of that framework's or library's clients, the
library-side increase usually generates *less* overall cost than the
client-side increase, simply because there are usually multiple clients
using the same framework or library, and *each* of them pays the
complexity penalty (which consists of coding effort, testing, and
probably documentation). So if you want to make life easy for the people
you want to use your library or framework, do invest the extra effort;
this will increase the likelihood of your piece of software being chosen
over some other piece of software.

Regards

robert
 
George Moschovitis

> What do you mean by complexity? Storing a single linked list element
> is not really something I'd call "complex".

Hmm, I probably didn't understand your suggestion correctly; can you
elaborate, please? I am very interested in your idea.

> *I* would not find this natural. If I were using an LRU cache, I'd
> only expect to have to provide keys that satisfy the same conditions
> as Hash keys

Well, I find it natural that a Cache only accepts Cacheable objects
(i.e. objects that include LRUCache::Item), like a renderer that
accepts renderable objects. As you say, we are free to disagree :)
best regards,
George Moschovitis
 
Robert Klemme

George Moschovitis said:
> Hmm, I probably didn't understand your suggestion correctly; can you
> elaborate, please? I am very interested in your idea.

class LruCache
  ListItem = Struct.new(:obj, :prev, :next)

  private

  def create_list_item
    begin
      return @spare || ListItem.new
    ensure
      @spare = nil
    end
  end

  def return_list_item(li)
    @spare ||= li
  end
end

Then use #create_list_item and #return_list_item wherever you create or
release ListItems.

> Well, I find it natural that a Cache only accepts Cacheable objects
> (i.e. objects that include LRUCache::Item), like a renderer that
> accepts renderable objects. As you say, we are free to disagree :)

Although there is some truth in this, an LRU cache is something more
general than a renderer; in fact, it comes close to a Hash, IMHO. So
there should be fewer requirements on container instances than for a
renderer.

Regards

robert
 
George Moschovitis

> class LruCache
>   ListItem = Struct.new(:obj, :prev, :next)
>   ...
> end

This is a nice trick! Do you have any idea how I can get rid of the
lru_key field of the mixin too?

lru_key is used here:

124: delete(last.lru_key) if size() > @max_items
best regards,
George Moschovitis
 
George Moschovitis

Hmm, I can include the lru_key in this spare element too. Very nice
trick! Thank you.
 
Robert Klemme

George Moschovitis said:
> This is a nice trick! Do you have any idea how I can get rid of the
> lru_key field of the mixin too?

class LruCache
  ListItem = Struct.new(:key, :obj, :prev, :next)
  ...
end

Then put the ListItems into the hash.

> lru_key is used here:
> 124: delete(last.lru_key) if size() > @max_items

Of course.

robert
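[Editor's note: putting Robert's two suggestions together, a wrapper-based cache with no mixin might look something like the sketch below. This is a hedged illustration, not the actual n/cache implementation; re-inserting an existing key is deliberately not handled.]

```ruby
# The Hash stores ListItem wrappers that carry the key, so nothing is
# mixed into the cached objects, eviction can still find the least
# recently used key, and one spare ListItem is recycled to avoid an
# allocation per insert once the cache is full. Hypothetical sketch.
class LruCache
  ListItem = Struct.new(:key, :obj, :prev, :next)

  def initialize(max_items)
    @max_items = max_items
    @map = {}
    @head = @tail = nil   # head = most recent, tail = least recent
    @spare = nil
  end

  def []=(key, obj)
    li = create_list_item
    li.key = key
    li.obj = obj
    @map[key] = li
    push_front(li)
    if @map.size > @max_items
      last = @tail
      unlink(last)
      @map.delete(last.key)   # the key travels with the wrapper
      return_list_item(last)  # recycle the spare element
    end
    obj
  end

  def [](key)
    li = @map[key] or return nil
    unlink(li)
    push_front(li)            # a hit makes the item most recently used
    li.obj
  end

  private

  def create_list_item
    return @spare || ListItem.new
  ensure
    @spare = nil
  end

  def return_list_item(li)
    li.obj = nil              # don't keep the evicted object alive
    @spare ||= li
  end

  def push_front(li)
    li.prev = nil
    li.next = @head
    @head.prev = li if @head
    @head = li
    @tail ||= li
  end

  def unlink(li)
    li.prev.next = li.next if li.prev
    li.next.prev = li.prev if li.next
    @head = li.next if @head.equal?(li)
    @tail = li.prev if @tail.equal?(li)
  end
end
```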
 
