compare two byte[]s

harryos

I need to write some code where I perform encryption/decryption of an
input byte[] and then compare the decrypted plaintext byte[] to the
input byte[]. How do I do this comparison?

byte[] input = new byte[]{0x00, 0x01, 0x02, ...};
byte[] cipherText = myEncrypt(input);
byte[] plainText = myDecrypt(cipherText);

if (input.equals(plainText))
    System.out.println("plaintext matches orig");
else
    System.out.println("plaintext DOES NOT match orig!!!!");

Even though the printed hex of both arrays is the same, I always get
the 'DOES NOT match' output message.

Is .equals() not the right way?
thanks
harry
 
Sigfried

harryos wrote:
I need to write some code where I perform encryption/decryption of an
input byte[] and then compare the decrypted plaintext byte[] to the
input byte[]. How do I do this comparison?

byte[] input = new byte[]{0x00, 0x01, 0x02, ...};
byte[] cipherText = myEncrypt(input);
byte[] plainText = myDecrypt(cipherText);

if (input.equals(plainText))
    System.out.println("plaintext matches orig");
else
    System.out.println("plaintext DOES NOT match orig!!!!");

Even though the printed hex of both arrays is the same, I always get
the 'DOES NOT match' output message.

Is .equals() not the right way?

No it's not, use Arrays.equals(byte[], byte[]).
 
Lew

harryos wrote:

I need to write some code where I perform encryption/decryption of an
input byte[] and then compare the decrypted plaintext byte[] to the
input byte[]. How do I do this comparison?

byte[] input = new byte[]{0x00, 0x01, 0x02, ...};
byte[] cipherText = myEncrypt(input);
byte[] plainText = myDecrypt(cipherText);

if (input.equals(plainText))
    System.out.println("plaintext matches orig");
else
    System.out.println("plaintext DOES NOT match orig!!!!");

Even though the printed hex of both arrays is the same, I always get
the 'DOES NOT match' output message.
Is .equals() not the right way?

No it's not, use Arrays.equals(byte[], byte[]).

The array 'equals()' method does not override 'Object#equals()'.
Consequently it does an identity comparison, not a value comparison.
The utility method that Sigfried mentioned covers that gap.

It is not necessarily, or even usually true that 'equals()' does value
comparison. Unless specifically overridden, it does identity
comparison only, as inherited.
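
To see the difference concretely, here is a minimal, self-contained sketch;
myEncrypt/myDecrypt from the original post are stood in for by a plain copy,
since their implementations were not shown.

import java.util.Arrays;

public class ByteArrayCompareDemo {
    public static void main(String[] args) {
        byte[] input = { 0x00, 0x01, 0x02 };
        // stand-in for the encrypt/decrypt round trip: same contents, different object
        byte[] plainText = input.clone();

        // inherited Object.equals() is an identity comparison, so this prints false
        System.out.println(input.equals(plainText));

        // value comparison of the contents, which is what the poster wants: true
        System.out.println(Arrays.equals(input, plainText));
    }
}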
 
Mark Space

Lew said:
No it's not, use Arrays.equals(byte[], byte[]).

The array 'equals()' method does not override 'Object#equals()'.


It bugs me that Java, an object oriented language, provides so few
methods for its array class and relies on static methods in a separate
class instead. Off the top of my head, I can only think of #clone()
that's overridden for arrays, and I think that does a shallow copy
(doesn't call #clone() on objects in the array even if they are cloneable).

Just saying, to no one in particular...
 
Daniel Pitts

Mark said:
Lew said:
No it's not, use Arrays.equals(byte[], byte[]).

The array 'equals()' method does not override 'Object#equals()'.


It bugs me that Java, an object oriented language, provides so few
methods for its array class and relies on static methods in a separate
class instead. Off the top of my head, I can only think of #clone()
that's overridden for arrays, and I think that does a shallow copy
(doesn't call #clone() on objects in the array even if they are cloneable).

Just saying, to no one in particular...
What is really interesting is that at the JVM level array.length is not
a member-access opcode, but a special opcode just for that purpose.
Arrays do not truly contain a final int length field.
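
For anyone who wants to see this first-hand: compile a one-line method and
disassemble it with javap -c; the length read shows up as the dedicated
arraylength instruction rather than a field access (a minimal sketch; the
exact javap output varies by JDK).

// ArrayLengthDemo.java
// Compile:      javac ArrayLengthDemo.java
// Disassemble:  javap -c ArrayLengthDemo
public class ArrayLengthDemo {
    static int len(int[] a) {
        // compiles to the 'arraylength' opcode, not a getfield on a length member
        return a.length;
    }
}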
 
Lew

Daniel said:
What is really interesting is that at the JVM level array.length is not
a member-access opcode, but a special opcode just for that purpose.
Arrays do not truly contain a final int length field.

No more interesting than that the JVM allows multiple methods with
compatible argument lists and incompatible return types, but Java
doesn't.

In general I am not surprised when a high-level language and its
target machine language differ in what they directly support, or that
a particular high-level language feature translates to a somewhat
different machine-language feature or combination of features.

If the machine and high-level languages were exactly the same, there'd
be no need to compile.
 
Mark Space

Daniel said:
What is really interesting is that at the JVM level array.length is not
a member-access opcode, but a special opcode just for that purpose.
Arrays do not truly contain a final int length field.

I think I agree with Lew that lower level details like this don't
concern me much. Although you're right, it is kinda wacky that they
didn't use member access but went with a distinct opcode.

What does bother me is the lack of utility built into arrays. Changing
#equals() or #clone() now would likely be too difficult, but I don't see
the harm in adding new methods. All of the very similar overloaded
methods in java.util.Arrays, for example, would fit neatly into instance
methods of an array class. I honestly can't figure out why this hasn't
been done.
 
Mark Space

Eric said:
As a thought experiment,

byte[] b = { 2, 4, 6 };
char[] c = { 2, 4, 6 };

That's a good point. I was thinking that b would only be equal to:

byte [] b2 = { 2, 4, 6 };

Other methods might compare arrays differently. For example, besides
#equals(), there could be a #deepEquals() for b and b2 above, as well as a
numericallyEquals() for arrays of the various integer types. As for comparing
integers to floats/doubles, I don't know the best way to do that.

But I still don't see why polymorphism isn't used for the method calls.
It's blessed inconvenient to have to sort through such a large number
of overloaded methods.
 
Eric Sosman

Mark said:
Eric said:
As a thought experiment,

byte[] b = { 2, 4, 6 };
char[] c = { 2, 4, 6 };

That's a good point. I was thinking that b would only be equal to:

byte [] b2 = { 2, 4, 6 };

Other methods might compare arrays differently. For example, besides
#equals(), there could be #deepEquals for b and b2 above, as well as
numericallyEquals() for arrays of various integers. Comparing integers
to floats/doubles, I don't know the best way to do that.

But I still don't see why polymorphism isn't used for the method calls.
It's blessed inconvenient to have to sort through such a large number
of overloaded methods.

The array "method" I'd really like to be able to override
is the store-into method, so I could get

int[] immutableArray = thing.getArray();
immutableArray[42] = 17;

... to throw UnsupportedOperationException on the second line.
(A compile-time error would be even better.) As things stand
I'm forced into making defensive copies or wrapping the array
in an unmodifiable List, both of which seem wasteful.
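
For reference, a minimal sketch of the two workarounds Eric mentions; the
class name Thing and the getter names are illustrative, not from any real API.

import java.util.Arrays;
import java.util.Collections;
import java.util.List;

class Thing {
    private final int[] data = { 1, 2, 3 };

    // Workaround 1: hand out a defensive copy, so callers never touch the original.
    public int[] getArray() {
        return data.clone();
    }

    // Workaround 2: box the elements and wrap them in an unmodifiable List;
    // any attempt to set() an element throws UnsupportedOperationException.
    public List<Integer> getArrayView() {
        Integer[] boxed = new Integer[data.length];
        for (int i = 0; i < data.length; i++) {
            boxed[i] = data[i];  // autoboxing
        }
        return Collections.unmodifiableList(Arrays.asList(boxed));
    }
}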

Ah, well. If wishes were horses, tinkers would ride.
 
Mark Space

Eric said:
The array "method" I'd really like to be able to override
is the store-into method, so I could get

int[] immutableArray = thing.getArray();
immutableArray[42] = 17;

... to throw UnsupportedOperationException on the second line.
(A compile-time error would be even better.) As things stand


I understand what you are saying, although I'm not a huge fan of
operator overloading. More kinds of array primitives would be useful,
such as an immutable array, as you point out. So would some way of
treating primitives polymorphically, and the ability to derive from (and
add to) an array type.


Ah, well. If wishes were horses, tinkers would ride.

I'm sure the designers of Java feel the same way, when they look back on
the development of Java....


Here's a weird thought (speaking of treating primitives
polymorphically). What would be the downside of using some sort of
template system for primitives? Let's say I've got a method like
Arrays.toString() (all nine of the darn overloads) and treat the
primitive element type as a kind of template parameter.

public void toString( ::int[] array ) {
    for( ::int i : array ) {
        System.out.println( i );
    }
}

The JVM generates compiled code anyway. Why couldn't the JVM expand the
bytecode of the method above for whatever primitive type is needed? The
expanded code could be cached so that it didn't need to be regenerated on
each invocation, and I don't think the overhead of testing the type of the
parameter on each invocation would really be that much.

The worst thing I can think of, and the best reason not to do this, would be
the case where several template type parameters are passed to a single
method. With n such parameters there would be 8^n different expansions to
store, which is probably impractical for more than two template parameters.

But if you limit template parameters to just the numeric primitives (which
are the only things I can't treat polymorphically in Java), then I think the
set of expansions the JVM would have to generate stays small enough.

I guess another good reason to not do this is bytecode compatibility.
Adding something that requires different treatment of the bytecodes by
the JVM is going to break older JVMs. Not fun.

Anyhoo, just thinking out loud....
 
Wesley MacIntosh

Eric said:
The array "method" I'd really like to be able to override
is the store-into method, so I could get

int[] immutableArray = thing.getArray();
immutableArray[42] = 17;

... to throw UnsupportedOperationException on the second line.
(A compile-time error would be even better.) As things stand
I'm forced into making defensive copies or wrapping the array
in an unmodifiable List, both of which seem wasteful.

Ah, well. If wishes were horses, tinkers would ride.

Isn't it generally better to use collections in place of arrays anyway?
I tend to do so myself, except in really performance-critical spots, when
implementing a collection backed by an array, or for the odd lookup-table
type of usage where a private static array makes sense.

Generally, I don't have arrays (other than because of varargs, or
implementing collection toArray methods and from-array constructors) in
my public APIs, or even protected ones.

(Other things I eschew: Vector; Enumeration; EventListenerList. I don't
know what the designers of the latter were thinking; a Set<FooListener>
seems to work fine and without all that nasty casting. OK, it's not
thread-safe, but Swing isn't thread-safe, and I document that adding
event listeners should only be done on the EDT.)
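
A minimal sketch of the Set<FooListener> approach described above; FooListener
and FooSource are hypothetical names, and as noted it is not thread-safe, so
listeners should only be added, removed and fired on the EDT.

import java.util.LinkedHashSet;
import java.util.Set;

interface FooListener {
    void fooHappened(String detail);
}

class FooSource {
    // a plain Set of listeners instead of EventListenerList: no casting needed
    private final Set<FooListener> listeners = new LinkedHashSet<FooListener>();

    public void addFooListener(FooListener l)    { listeners.add(l); }
    public void removeFooListener(FooListener l) { listeners.remove(l); }

    protected void fireFooHappened(String detail) {
        for (FooListener l : listeners) {
            l.fooHappened(detail);
        }
    }
}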
 
Tom Anderson

Eric said:
Possibly because there are so many different equivalence relations
that could be candidates for "equality."

Poppycock! There is exactly one relation that makes any sense - the one
specified in List.equals.
Eric said:
Java provides one such as an array instance method, and another as a
static method of Arrays. Common Lisp has four (five? it's been a while)
different built-in equality operators.

And if Java were also written by and for complete madmen, then perhaps it
would too, but it isn't, and shouldn't.

I think it's pretty poor that [].equals doesn't do proper equality
testing, but what I find absolutely appalling is that *Arrays.equals
doesn't either*. Arrays.equals only compares one level deep - for elements
that are themselves arrays it comes down to an identity test - fine for
arrays of primitives, but completely hopeless for nested arrays of objects.
There is a deepEquals which does the right thing, but I'd never even heard
of that until five minutes ago. What on earth were the authors thinking?
Perhaps I should take back the part of my remark above about Java not
being written by madmen ...
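
A quick sketch to make the Arrays.equals/Arrays.deepEquals distinction
concrete, using only the standard java.util.Arrays methods:

import java.util.Arrays;

public class DeepEqualsDemo {
    public static void main(String[] args) {
        String[][] a = { { "a", "b" }, { "c", "d" } };
        String[][] b = { { "a", "b" }, { "c", "d" } };

        // Arrays.equals compares top-level elements with equals(); for elements
        // that are themselves arrays, that amounts to an identity check: false
        System.out.println(Arrays.equals(a, b));

        // Arrays.deepEquals recurses into nested arrays: true
        System.out.println(Arrays.deepEquals(a, b));
    }
}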
Eric said:
As a thought experiment,

byte[] b = { 2, 4, 6 };
char[] c = { 2, 4, 6 };
short[] s = { 2, 4, 6 };
int[] i = { 2, 4, 6 };
long[] l = { 2, 4, 6 };
float[] f = { 2, 4, 6 };
double[] d = { 2, 4, 6 };
Integer[] oi = { new Integer(2), new Integer(4), new Integer(6) };
BigInteger[] bi = { new BigInteger("2"), new BigInteger("4"),
                    new BigInteger("6") };
Object[] o = { new Integer(2), new Long(4), new Double(6) };

Which pairs of arrays "should" be equal and which "should" be unequal?

The ones where the corresponding pair of Lists would be equal or unequal,
respectively. *

We can argue about whether new Integer(2), new Short(2) and new
BigInteger("2") should be equal, but that's got nothing to do with arrays.
If you want to be evil, include a BigDecimal("2.0") and a
BigDecimal("2.00") up there too - and if you want to be really evil,
arrays of arrays.
Eric said:
Get every programmer everywhere to agree with your answer. ;-)

I'd be very interested to hear from anyone who thinks my assertion above
(marked with the asterisk) is wrong.

tom

--
The Gospel is enlightened in interesting ways by reading Beowulf and The
Hobbit while listening to Radiohead's Hail to the Thief. To kill a dragon
(i.e. Serpent, Smaug, Wolf at the Door) you need 12 (disciples/dwarves)
plus one thief (burglar, Hail to the Thief/King/thief in the night),
making Christ/Bilbo the 13th Thief. -- Remy Wilkins
 
Joshua Cranmer

Mark said:
What does bother me is the lack of utility built into arrays. Changing
#equals() or #clone() now would likely be too difficult, but I don't see
the harm in adding new methods. All of the very similar overloaded
methods in java.util.Arrays, for example, would fit neatly into instance
methods of an array class. I honestly can't figure out why this hasn't
been done.

This is one possible reason:

Since each array is its own special class, the virtual machine has to
synthesize special class objects for arrays. The developers made a
choice to keep these objects as small as possible. One thing that could
really bloat these objects would be to have to define methods for
/every/ single array type.

java.util.Arrays' toString method has 75 bytes for Object[] and 72 for
<primitive>[]. The hashCode methods are between 45 and 56 bytes. The
equals methods are between 54 bytes and 78 bytes.

Even ignoring whether or not you want equals to be == or .equals
equality, that's 206 bytes for just the bytecode operations for an
Object[] class. I'm not including the overhead for method definitions.

I'm willing to bet that the code for Object.clone() special-cases array
cloning, just to avoid having to redefine all of these methods.

I'd also imagine that this setup makes the code generation easier:
there's only a few bytes that have to be modified, even across primitive
and reference type arrays.
 
Tom Anderson

Joshua said:
This is one possible reason:

Since each array is its own special class, the virtual machine has to
synthesize special class objects for arrays. The developers made a choice to
keep these objects as small as possible. One thing that could really bloat
these objects would be to have to define methods for /every/ single array
type.

Inheritance.

tom

--
The Gospel is enlightened in interesting ways by reading Beowulf and The
Hobbit while listening to Radiohead's Hail to the Thief. To kill a dragon
(i.e. Serpent, Smaug, Wolf at the Door) you need 12 (disciples/dwarves)
plus one thief (burglar, Hail to the Thief/King/thief in the night),
making Christ/Bilbo the 13th Thief. -- Remy Wilkins
 
Mark Space

Joshua said:
This is one possible reason:

Since each array is its own special class, the virtual machine has to
synthesize special class objects for arrays. The developers made a
choice to keep these objects as small as possible. One thing that could
really bloat these objects would be to have to define methods for
/every/ single array type.

That's an interesting point. I have to assume however that it would be
possible to play some tricks internally such that the JVM would not have
to bloat itself, but could just use an internal pointer back to a master
routine for each primitive. The worst case should be that an actual
class would not be created unless it had to be serialized or similarly
made external to the JVM.

Not that I think I'm an expert on the JVM internals or anything, just I
don't see how it could be much more complicated than that.

java.util.Arrays' toString method has 75 bytes for Object[] and 72 for
<primitive>[]. The hashCode methods are between 45 and 56 bytes. The
equals methods are between 54 bytes and 78 bytes.

Even ignoring whether or not you want equals to be == or .equals
equality, that's 206 bytes for just the bytecode operations for an
Object[] class. I'm not including the overhead for method definitions.

OK, 206 bytecodes which exist currently, and which would need to exist
under any system that I can think of.

I'm willing to bet that the code for Object.clone() special-cases array
cloning, just to avoid having to redefine all of these methods.

Fair enough, although I'm not following, from this paragraph or the two
before it, why this would discourage adding methods to an array class.

I'd also imagine that this setup makes the code generation easier:
there's only a few bytes that have to be modified, even across primitive
and reference type arrays.

Why modify bytes? You mean for generating new class types? Yes, I'd
assume that it's only necessary to modify a few bytes for a new class
type. Should be that way under any class system.

I'm not trying to be snarky or anything, just trying to understand where
you see a problem.

The JLS says that the type of any array can be reduced to its element
type and the number of dimensions:

<http://java.sun.com/docs/books/jls/third_edition/html/expressions.html#46168>

That's all you'd need to store in a new type, in any system. I imagine
that this information could be stored in the array itself. Since a new
type is needed for each array, there's no point in making a separate
object; the relationship is one-to-one.

extern "C" {
int array_length;
Object array_type;
int num_dimensions;
unsigned char []; // array data
}

Or something like that. Assume that "Objects" have a method
implementing array operations, if Object is a primitive. Not hard for
the low level JVM code to implement. Now you just need one actual copy
of the method per primitive, or 8 total. Ever. Multi-dimensional
arrays just do their multi-dimensional thing that eventually ends up
back at the primitive routine for that type of primitive. Other
Objects, which are full classes, obviously allow full method overriding
in whatever manner the JVM chooses to implement overriding.

Really all I've done here is add a virtual method table for array
primitives. Each array "points back" to its type, which has the needed
implementations of its methods. I don't see that part as a great
difficulty. (There are other aspects of this idea that I think would be
much harder.)
 
Eric Sosman

Tom said:
[...]
Possibly because there are so many different equivalence relations
that could be candidates for "equality."

Poppycock! There is exactly one relation that makes any sense - the one
specified in List.equals.

"There are nine and sixty ways of constructing tribal lays,
And every single one of them is right!" -- R.K.
 
Tom Anderson

In general, if you want an array of references to behave like a List,
consider Arrays.asList:

Arrays.asList(array1).equals(Arrays.asList(array2))

String[][] one = new String[][] {{"a", "b"}, {"c", "d"}} ;
String[][] two = new String[][] {{"a", "b"}, {"c", "d"}} ;
System.err.println(Arrays.asList(one).equals(Arrays.asList(two))) ;

As long as the array doesn't contain other arrays, your approach is
perfectly sound. But if it does, and you want more than just those two
top-level arrays to behave like lists, it doesn't work out.

tom

--
It not infrequently happens that something about the earth, about the sky,
about other elements of this world, about the motion and rotation or even
the magnitude and distances of the stars, about definite eclipses of the
sun and moon, about the passage of years and seasons, about the nature
of animals, of fruits, of stones, and of other such things, may be known
with the greatest certainty by reasoning or by experience. -- St Augustine
 
Tom Anderson

Eric said:
Tom said:
[...]
Possibly because there are so many different equivalence relations that
could be candidates for "equality."

Poppycock! There is exactly one relation that makes any sense - the one
specified in List.equals.

"There are nine and sixty ways of constructing tribal lays,
And every single one of them is right!" -- R.K.

A thorough, detailed, technically impeccable, and entirely convincing
refutation of my assertion. I take my hat off to you, sir.

tom

--
It not infrequently happens that something about the earth, about the sky,
about other elements of this world, about the motion and rotation or even
the magnitude and distances of the stars, about definite eclipses of the
sun and moon, about the passage of years and seasons, about the nature
of animals, of fruits, of stones, and of other such things, may be known
with the greatest certainty by reasoning or by experience. -- St Augustine
 
Arne Vajhøj

Mark said:
Lew said:
No it's not, use Arrays.equals(byte[], byte[]).

The array 'equals()' method does not override 'Object#equals()'.

It bugs me that Java, an object oriented language, provides so few
methods for its array class and relies on static methods in a separate
class instead. Off the top of my head, I can only think of #clone()
that's overridden for arrays, and I think that does a shallow copy
(doesn't call #clone() on objects in the array even if they are cloneable).

Just saying, to no one in particular...

C# has slightly more methods, but also seems to favor
static methods:
http://msdn.microsoft.com/en-us/library/system.array_members.aspx

Arne
 
