Segmentation fault (core dumped)

ekilada

Hi,
Using Perl, when I try to open a 1.8G file for reading, I get a
'Segmentation fault (core dumped)' error.
Please, is there any way to segment huge files for reading?

Thanks And Best Regards,
Eliyah
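For reference, one common way to "segment" a huge file is to read it in fixed-size chunks rather than whole lines; a rough sketch, with a hypothetical filename and an arbitrary chunk size:

  use strict;
  use warnings;

  my $file = 'huge.dat';                      # hypothetical name
  open my $fh, '<', $file or die "open $file: $!";
  binmode $fh;                                # raw bytes, no line semantics

  my $buf;
  while (read($fh, $buf, 8 * 1024 * 1024)) {  # 8 MB at a time
      # process $buf here
  }
  close $fh or warn "close: $!";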
 
usenet

ekilada said:
Using Perl, when I try to open a 1.8G file for reading, I get a
'Segmentation fault (core dumped)' error.

Oops. You have an error on line 42. You need to fix that.
Please, is there any way to segment huge files for reading?

perldoc perlop (/while)
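The relevant part of perlop covers the readline loop; the usual idiom, written out explicitly (hypothetical filename), is roughly:

  use strict;
  use warnings;

  open my $fh, '<', 'huge.dat' or die "open: $!";
  while (defined(my $line = <$fh>)) {   # what a bare while (<$fh>) does implicitly
      chomp $line;
      # process $line here
  }
  close $fh;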
 
Brian Wakem

ekilada said:
Hi,
Using Perl, when I try to open a 1.8G file for reading, I get a
'Segmentation fault (core dumped)' error.
Please, is there any way to segment huge files for reading?


If the error occurs when *opening* the file then the filesize is irrelevant.
 
Mumia W.

Hi,
Using Perl, when I try to open a 1.8G file for reading, I get a
'Segmentation fault (core dumped)' error.
Please, is there any way to segment huge files for reading?


Thanks And Best Regards,
Eliyah

You're welcome.
 
ekilada

Hi Brian,
No, it occurs while reading line number 13457941
FYI: I use while (<>) to read line by line.

Regards,
Eliyah
 
Dave

ekilada said:
Hi Brian,
No, it occurs while reading line number 13457941
FYI: I use while (<>) to read line by line.

Regards,
Eliyah

Brian said:
If the error occurs when *opening* the file then the filesize is
irrelevant.

Questions:
Which version of Perl?
On what platform?
Does it happen with any huge file, or is there something special about this one
(like a huge line)?
Can you post a short but complete script that demonstrates the problem (if
run on a huge file)?
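A minimal script of the sort being asked for might be no more than this (an untested sketch; it reads whatever file is named on the command line and reports progress, so a crash pinpoints the line and shows whether one "line" is abnormally long):

  #!/usr/bin/perl
  use strict;
  use warnings;

  while (my $line = <>) {
      # $. is the current input line number
      printf STDERR "line %d, length %d\n", $., length $line
          if $. % 1_000_000 == 0 or length($line) > 10_000_000;
  }
  print STDERR "finished after $. lines\n";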
 
Brian Wakem

ekilada said:
Hi Brian,
No, it occurs while reading line number 13457941
FYI: I use while (<>) to read line by line.

Is that the last line?

Perhaps the file is not the size the filesystem thinks it is.

Can you open the file and seek to the end with something like vi?
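Something along these lines (hypothetical filename) compares the size stat() reports with where a seek to the end actually lands; on a perl built without large-file support, either number can come out wrong or negative:

  use strict;
  use warnings;
  use Fcntl qw(:seek);

  my $file = 'huge.dat';
  printf "stat says %s bytes\n", -s $file;

  open my $fh, '<', $file or die "open: $!";
  seek $fh, 0, SEEK_END or die "seek: $!";
  printf "tell says %s bytes\n", tell $fh;
  close $fh;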
 
xhoster

Please don't top post! Re-arranged.
Hi Brian,
No, it occurs while reading line number 13457941
FYI: I use while (<>) to read line by line.

Well, in that case the problem is clearly on line 23, not 42.

Xho
 
Joe Smith

ekilada said:
Hi,
Using Perl, when I try to open a 1.8G file for reading, I get a
'Segmentation fault (core dumped)' error.

Are you using a version of perl that was compiled for large file support?

perl -V | egrep '64|large'
osname=cygwin, osvers=1.5.18(0.13242), archname=cygwin-thread-multi-64int
config_args='-de -Dmksymlinks -Duse64bitint -Dusethreads -Uusemymalloc -Doptimize=-O3 -Dman3ext=3pm -Dusesitecustomize'
useperlio=define d_sfio=undef uselargefiles=define usesocks=undef
use64bitint=define use64bitall=undef uselongdouble=undef
Compile-time options: MULTIPLICITY USE_ITHREADS USE_64_BIT_INT

Programs that are not aware of largefiles (such as 'wget') tend to
get a segmentation fault in the STDIO library after outputting
more than 2 or 4GB.
-Joe
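The same flags can be read straight out of the Config module on the machine where the crash happens; a quick sketch:

  use strict;
  use warnings;
  use Config;

  for my $key (qw(uselargefiles use64bitint use64bitall lseeksize)) {
      printf "%-14s %s\n", $key,
          defined $Config{$key} ? $Config{$key} : 'undef';
  }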
 
Ben Morrow

Quoth Joe Smith said:
perl -V | egrep '64|large'
osname=cygwin, osvers=1.5.18(0.13242), archname=cygwin-thread-multi-64int
config_args='-de -Dmksymlinks -Duse64bitint -Dusethreads -Uusemymalloc
-Doptimize=-O3 -Dman3ext=3pm -Dusesitecustomize'
useperlio=define d_sfio=undef uselargefiles=define usesocks=undef
use64bitint=define use64bitall=undef uselongdouble=undef
Compile-time options: MULTIPLICITY USE_ITHREADS USE_64_BIT_INT

Are you aware you can write that (arguably) more simply as

perl -V:'.*large.*|.*64.*'

? Also, I get 34 lines from that, rather than your 5, which is a little
weird... I guess you must be using 5.8.<8...

Ben
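For a single variable the same syntax prints name='value'; pairs; for example, on a build with large-file support one would expect something like:

  perl -V:uselargefiles
  uselargefiles='define';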
 
