C++ vector algorithm


Pelle Beckman

Hello,

This is OT, but since everybody here seems to have
a great knowledge of C++ I thought I'd ask.

I have this vector with n^2 elements (where n usually
is somewhere between 100 and 1000...)
My goal is to loop through these values (this is not
a real-time computation...) 5000 times or so, select a
random element from the vector and do some simple
operations on the most nearby neighbors - meaning on top,
under, left and right in the vector.

However there is this problem - when at the edge
of the vector, not all nearby neighbors can be computed
(since they are not in the vector).

This algorithm is based on a mathematical one which
states that if there are no nearby neighbors, then go
with 0 (zero) instead.

But, this stuff needs to be fast even though it's not
in real-time.

Here's what I've got so far:

S - the vector in question
S_size - total number of elements in S

for (int T = 0; T < T_repeat; T += 10) {
    for (int l = 0; l < l_repeat; l += 10) {
        for (int q = 0; q < q_repeat; q++) {
            xrand = rand() % S_size + 1;
            yrand = rand() % S_size + 1;

            try {
                U1 = (-S_values.at(xrand).at(yrand)) *
                       S_values.at(xrand).at(yrand);
                U2 = S_values.at(xrand).at(yrand) *
                     S_values.at(xrand + 1).at(yrand);
            } catch (...) {
                cout << "borked" << endl;
            }

            /* Do some computing here */
            dU = J * (float)(U2 - U1);
        }
    }
}

See the try/catch stuff? There's my problem.
Can I handle this in some nifty way by letting
the exception tell me what went out of bounds and
at which elements?

It's late at night here and I really should get to bed
- if there's something that's not clear here
I'd be pleased to answer.

Cheers.

-- Pelle
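The zero-for-missing-neighbours rule in the post above can be handled without exceptions by a small bounds-checked accessor. This is a hedged sketch, not Pelle's actual code: the helper name `at_or_zero` and the `float` element type are assumptions.

```cpp
#include <vector>

// Illustrative helper: return the element at (x, y), or 0.0f when the
// index falls outside the grid -- matching the "use zero for missing
// neighbours" rule from the maths.
float at_or_zero(const std::vector<std::vector<float>>& S, int x, int y) {
    if (x < 0 || y < 0 ||
        x >= static_cast<int>(S.size()) ||
        y >= static_cast<int>(S[x].size()))
        return 0.0f;
    return S[x][y];
}
```

With this, the inner loop can read all four neighbours unconditionally, e.g. `at_or_zero(S, x - 1, y)`, and out-of-range reads simply contribute zero.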
 

Daniel Schüle

S - the vector in question
S_size - total number of elements in S

for (int T = 0; T < T_repeat; T += 10) {
    for (int l = 0; l < l_repeat; l += 10) {
        for (int q = 0; q < q_repeat; q++) {
            xrand = rand() % S_size + 1;
            yrand = rand() % S_size + 1;

given an array (or std::vector)
int field[] = {1, 2, 3, 4};
it has 4 members ... sizeof(field)/sizeof(field[0])

if you use rand() % 4, the possible numbers are in the range [0..3]; if
you then add one and use the result as an index into your array, at()
will throw an exception whenever rand() % 4 == 3

my 10 cents

--daniel
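Daniel's off-by-one can be seen in a minimal sketch (`random_index` is an illustrative name, not from the thread):

```cpp
#include <cstdlib>

// rand() % n yields indices in [0, n-1], which is exactly the valid
// index range for an n-element container.  Adding 1 shifts the range
// to [1, n], and index n is out of bounds -- vector::at(n) would
// throw std::out_of_range.
int random_index(int n) {
    return std::rand() % n;   // valid index: no "+ 1"
}
```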
 

Dana Cartwright

Pelle Beckman said:
I have this vector with n^2 elements (where n usually
is somewhere between 100 and 1000...)
My goal is to loop through these values (this is not
a real-time computation...) 5000 times or so, select a
random element from the vector and do some simple
operations on the most nearby neighbors - meaning on top,
under, left and right in the vector.

However there is this problem - when at the edge
of the vector, not all nearby neighbors can be computed
(since they are not in the vector).

Why not make your vector bigger, so if you want n = 100 you actually use n =
102, setting the extra elements to zero (as required by the math). Then as
you work with your 100 elements, they will all have defined neighbors. Then
you won't throw exceptions, and you won't have to waste time testing for
being near the edge.
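Dana's padding idea can be sketched as follows; this is a minimal illustration assuming a `float` element type (the function name is invented for the example):

```cpp
#include <vector>

// Allocate an (n+2) x (n+2) grid, zero-filled, and keep the real data
// in rows/columns 1..n.  Every interior element then has four
// in-bounds neighbours, so the neighbour reads need no checks.
std::vector<std::vector<float>> make_padded(int n) {
    return std::vector<std::vector<float>>(
        n + 2, std::vector<float>(n + 2, 0.0f));
}
```

Usage: write the real data into `S[1..n][1..n]`; for any such cell, `S[x-1][y]`, `S[x+1][y]`, `S[x][y-1]`, and `S[x][y+1]` are always valid, and the border cells supply the zeros the maths requires.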
 

Cy Edmunds

Pelle Beckman said:
Hello,

This is OT, but since everybody here seems to have
a great knowledge of C++ I thought I'd ask.

I have this vector with n^2 elements (where n usually
is somewhere between 100 and 1000...)
My goal is to loop through these values (this is not
a real-time computation...) 5000 times or so, select a
random element from the vector and do some simple
operations on the most nearby neighbors - meaning on top,
under, left and right in the vector.

However there is this problem - when at the edge
of the vector, not all nearby neighbors can be computed
(since they are not in the vector).

This algorithm is based on a mathematical one which
states that if there are no nearby neighbors, then go
with 0 (zero) instead.

But, this stuff needs to be fast even though it's not
in real-time.

Here's what I've got so far:

S - the vector in question
S_size - total number of elements in S

for (int T = 0; T < T_repeat; T += 10) {
    for (int l = 0; l < l_repeat; l += 10) {
        for (int q = 0; q < q_repeat; q++) {
            xrand = rand() % S_size + 1;
            yrand = rand() % S_size + 1;

            try {
                U1 = (-S_values.at(xrand).at(yrand)) *
                       S_values.at(xrand).at(yrand);
                U2 = S_values.at(xrand).at(yrand) *
                     S_values.at(xrand + 1).at(yrand);
            } catch (...) {
                cout << "borked" << endl;
            }

            /* Do some computing here */
            dU = J * (float)(U2 - U1);
        }
    }
}

See the try/catch stuff? There's my problem.
Can I handle this in some nifty way by letting
the exception tell me what went out of bounds and
at which elements?

It's late at night here and I really should get to bed
- if there's something that's not clear here
I'd be pleased to answer.

Cheers.

-- Pelle

If you want to go fast, don't use "at" and don't use exceptions for what
amounts to flow control. There are two reasonable solutions:

1) put in artificial borders filled with zeros, as someone else suggested, or
2) write special loops for use near the edges and corners.

I recently wrote some code which used method 2 and had to write 9 loops: top
left corner, top edge, top right corner, left edge, sweet spot in the
middle, right edge, lower left corner, bottom edge, and lower right corner.
It was a pain in the neck but turned out to be quite fast as well as
non-invasive. The loop for the middle part (where most of the data are) can
be coded without any runtime penalty for bounds checking. The code at the
corners does lots of checking but only covers a few elements.
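The key payoff of method 2 is the interior loop. Here is a hedged sketch of that "sweet spot" region only (illustrative names, not Cy's code), summing the four neighbours of every interior cell of a square grid with no bounds checks:

```cpp
#include <vector>

// Every neighbour of an interior cell is guaranteed to exist, so this
// loop needs no range checks at all; the eight edge/corner loops (not
// shown) would handle the boundary cells separately.
float sum_interior_neighbours(const std::vector<std::vector<float>>& S) {
    float total = 0.0f;
    int n = static_cast<int>(S.size());
    for (int x = 1; x < n - 1; ++x)         // skip first and last row
        for (int y = 1; y < n - 1; ++y)     // skip first and last column
            total += S[x - 1][y] + S[x + 1][y]
                   + S[x][y - 1] + S[x][y + 1];
    return total;
}
```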
 

Pelle Beckman

Dana Cartwright skrev:
Why not make your vector bigger, so if you want n = 100 you actually use n =
102, setting the extra elements to zero (as required by the math). Then as
you work with your 100 elements, they will all have defined neighbors. Then
you won't throw exceptions, and you won't have to waste time testing for
being near the edge.

It's a perfectly sane idea, but since we're talking about
100,000s of elements, a "border" would mean
1,000s of "unnecessary" elements, which could require
a few extra megabytes of memory depending on the
element type of the vector.

-- Pelle
 

Dana Cartwright

Pelle Beckman said:
Dana Cartwright skrev:

It's a perfectly sane idea, but since we're talking about
100,000s of elements, a "border" would mean
1,000s of "unnecessary" elements, which could require
a few extra megabytes of memory depending on the
element type of the vector.

-- Pelle

How big are your elements? If the maximum size is 1000 by 1000 (as I think
you said in your OP), that's only one million elements. Adding a border of
zeroes only adds about 4,000 elements, or 0.4% storage overhead.

You said this needed to be fast. I'm just thinking that 0.4% memory
overhead is reasonable if you gain a lot of speed.

If your elements are large, then you could put them in memory that's
separate from the vector. The vector could instead hold pointers to the
elements. Then you need only create a single 'zero' element, because all
the border pointers can point at it.
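Dana's pointer variant can be sketched like this; the `Cell` type and function name are assumptions for illustration (the thread never shows the real element type):

```cpp
#include <vector>

// Assumed large element type for the sake of the example.
struct Cell { float value; };

// Build an (n+2) x (n+2) grid of pointers: the border slots all point
// at one shared zero element, so the padding costs pointers, not full
// elements.  The grid holds raw pointers into `data`, so `data` and
// `zero` must outlive the returned grid.
std::vector<std::vector<const Cell*>> make_padded_ptrs(
        const std::vector<std::vector<Cell>>& data, const Cell& zero) {
    int n = static_cast<int>(data.size());
    std::vector<std::vector<const Cell*>> grid(
        n + 2, std::vector<const Cell*>(n + 2, &zero));
    for (int x = 0; x < n; ++x)
        for (int y = 0; y < n; ++y)
            grid[x + 1][y + 1] = &data[x][y];
    return grid;
}
```

For a 1000 x 1000 grid of large elements, the padding then costs roughly 4,000 pointers plus one zero element, rather than 4,000 full elements.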
 
