I have the code below in a JNI interface routine. If I build it as posted (#if 0, so the malloc path is compiled), it does not leak memory.
If I change the #if 0 to #if 1, so that the GetByteArrayElements path is compiled instead, it leaks - the JVM just keeps on growing.
I think I am doing all the right things as far as releasing the byte array, etc. - at least according to the book.
Does anyone spot the problem?
Thanks,
void *str;

putt.opts    = env->GetIntField( obj, put_opts_fid );
putt.timeout = env->GetIntField( obj, put_timeout_fid );
putt.len     = env->GetIntField( obj, put_len_fid );

#if 0
/* JNI path: pin (or get a copy of) the Java byte[] elements. */
jb = (jbyteArray)env->GetObjectField( obj, put_data_fid );
putt.buf = env->GetByteArrayElements( jb, NULL );
#else
/* Plain C path: malloc'd buffer filled with dummy data. */
str = malloc( putt.len );
memset( str, 'a', putt.len );
putt.buf = str;
#endif

if ( putt.buf ) {
    rc = sendthedata( hh, &putt );
}

#if 0
/* Mode 0: copy the elements back to the Java array and free the JNI buffer. */
env->ReleaseByteArrayElements( jb, (jbyte *)putt.buf, 0 );
env->DeleteLocalRef( jb );
#else
free( str );
#endif

return( rc );
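
In case it helps, here is a stripped-down, self-contained version of the leaking (#if 1) path that compiles against just jni.h. The put_t struct, the dummy sendthedata(), the Putter class name, the field names, and the hh handle are stand-ins I made up for this post - only the sequence of JNI calls mirrors my real routine.

#include <jni.h>

/* Assumed stand-in for my real struct - only the fields used here. */
struct put_t {
    int   opts;
    int   timeout;
    int   len;
    void *buf;
};

/* Dummy stand-in for the real sendthedata(): just reads every byte. */
static int sendthedata( void *hh, put_t *putt )
{
    (void)hh;
    volatile unsigned char sink = 0;
    for ( int i = 0; i < putt->len; i++ )
        sink ^= ((unsigned char *)putt->buf)[i];
    (void)sink;
    return 0;   /* always "success" in this sketch */
}

/* Assumes a hypothetical Java class:
 *     class Putter { int opts; int timeout; int len; byte[] data; }
 * The method name is made up; the JNI call sequence matches the code above. */
extern "C" JNIEXPORT jint JNICALL
Java_Putter_send( JNIEnv *env, jobject obj )
{
    put_t  putt;
    int    rc = -1;
    void  *hh = NULL;   /* placeholder for the real connection handle */

    jclass   cls             = env->GetObjectClass( obj );
    jfieldID put_opts_fid    = env->GetFieldID( cls, "opts",    "I"  );
    jfieldID put_timeout_fid = env->GetFieldID( cls, "timeout", "I"  );
    jfieldID put_len_fid     = env->GetFieldID( cls, "len",     "I"  );
    jfieldID put_data_fid    = env->GetFieldID( cls, "data",    "[B" );

    putt.opts    = env->GetIntField( obj, put_opts_fid );
    putt.timeout = env->GetIntField( obj, put_timeout_fid );
    putt.len     = env->GetIntField( obj, put_len_fid );

    jbyteArray jb = (jbyteArray)env->GetObjectField( obj, put_data_fid );
    if ( jb == NULL )
        return rc;   /* other error checking omitted for brevity */

    putt.buf = env->GetByteArrayElements( jb, NULL );

    if ( putt.buf ) {
        rc = sendthedata( hh, &putt );
        /* Mode 0 copies the elements back and frees the JNI buffer. */
        env->ReleaseByteArrayElements( jb, (jbyte *)putt.buf, 0 );
    }

    env->DeleteLocalRef( jb );
    env->DeleteLocalRef( cls );

    return rc;
}

The field IDs are looked up inline here instead of using the cached fid globals my real code has, but the Get/Release pairing is identical.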
Thanks
Jim